no error when lacking hugging face permissions #915

Open
dangbert opened this issue May 1, 2024 · 2 comments
Comments

dangbert commented May 1, 2024

I briefly changed my Hugging Face token (via huggingface-cli login) to one that accidentally lacked permissions to download the Llama 2 model. Model downloads stopped working, but a successful download was still falsely reported:

$ tune download meta-llama/Llama-2-7b-hf  --output-dir /tmp/checkpoints2
Ignoring files matching the following patterns: *.safetensors
Successfully downloaded model repo and wrote to the following locations:

That was the entire output (no locations were listed), and running tree /tmp/checkpoints2 showed the directory was still empty.

Expected behavior:
Report an error and exit with a non-zero status code when the download fails due to Hugging Face token permissions.
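For context, a minimal sketch (not torchtune's actual implementation) of the behavior being requested, assuming huggingface_hub raises GatedRepoError / RepositoryNotFoundError when the token lacks access:

import os
import sys

from huggingface_hub import snapshot_download
from huggingface_hub.utils import GatedRepoError, RepositoryNotFoundError

try:
    # Same repo and options as the tune download command above.
    local_path = snapshot_download(
        "meta-llama/Llama-2-7b-hf",
        local_dir="/tmp/checkpoints2",
        ignore_patterns="*.safetensors",
        token=os.getenv("HF_TOKEN", None),
    )
except (GatedRepoError, RepositoryNotFoundError) as err:
    # Surface the failure and exit non-zero instead of reporting success.
    print(f"Download failed, check your Hugging Face token permissions: {err}", file=sys.stderr)
    sys.exit(1)

print(f"Successfully downloaded model repo and wrote to: {local_path}")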

SalmanMohammadi (Contributor) commented May 2, 2024

It looks like Torchtune defers to huggingface_hub to download checkpoints and handles errors from huggingface_hub.snapshot_download. Would it be worth calling the method directly to see what happens? I've included an example below for the command you shared, which assumes you've run huggingface-cli login. Unfortunately, I haven't managed to reproduce your error.

from huggingface_hub import snapshot_download
import os

snapshot_download(
    "meta-llama/Llama-2-7b-hf",         # same repo as the tune download command above
    local_dir="/tmp/checkpoints2",      # same output directory
    local_dir_use_symlinks="auto",
    ignore_patterns="*.safetensors",    # tune download ignores safetensors by default
    token=os.getenv("HF_TOKEN", None),  # falls back to the huggingface-cli login token when unset
)

For me, trying to access meta-llama/Meta-Llama-3-8B, which I don't have access to, outputs:

Cannot access gated repo for url https://huggingface.co/api/models/meta-llama/Meta-Llama-3-8B/revision/main.
Access to model meta-llama/Meta-Llama-3-8B is restricted and you are not in the authorized list. Visit https://huggingface.co/meta-llama/Meta-Llama-3-8B to ask for access.

EDIT: One more thing worth trying is huggingface-cli logout, just in case something odd is going on there.

SalmanMohammadi (Contributor) commented:
@dangbert did you have any luck with this? I found a couple of things helped when I was having issues downloading models:

  • checking whether ~/.cache/huggingface/ already has a cache for the model you're trying to download, and deleting it if so (see the sketch after this list)
  • passing --hf-token explicitly to the tune download command
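If it's useful, here is a hedged sketch of the first check using huggingface_hub's scan_cache_dir API rather than poking around ~/.cache/huggingface/ by hand; the repo id below is the one from the original report:

from huggingface_hub import scan_cache_dir

# Scan the default Hugging Face cache (~/.cache/huggingface/hub) and report any
# cached copy of the repo that might be masking a failed download.
cache_info = scan_cache_dir()
for repo in cache_info.repos:
    if repo.repo_id == "meta-llama/Llama-2-7b-hf":
        print(f"Cached at {repo.repo_path} ({repo.size_on_disk_str}); consider deleting it before retrying")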
