
fix: Added dedicated error handling to load and get_model_path #775

Merged: 5 commits into ml-explore:main on May 20, 2024

Conversation

AtakanTekparmak (Contributor):

Added proper error handling to load and get_model_path by adding a dedicated exception class. When the local path is wrong, the code currently throws Hugging Face's RepositoryNotFoundError, which is not fully informative to the end user; they might have just mistyped their local path by one character.

Review thread on llms/mlx_lm/utils.py (outdated):
Comment on lines 92 to 93
except RepositoryNotFoundError:
    raise RepoNotFoundError(
        f"No local or Hugging Face repository found for path: {path_or_hf_repo}."
    )
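
(For context: a minimal sketch of how this except clause might sit inside get_model_path. Only the except clause itself comes from the PR diff; the local-path check and the snapshot_download fallback are assumptions for illustration.)

# Sketch only: the except clause mirrors the diff above; everything else
# (the local-path check and the Hub fallback) is assumed.
from pathlib import Path

from huggingface_hub import snapshot_download
from huggingface_hub.utils import RepositoryNotFoundError


class RepoNotFoundError(Exception):
    """Raised when neither a local path nor a Hugging Face repo matches."""


def get_model_path(path_or_hf_repo: str) -> Path:
    # Treat the argument as a local path first; fall back to the Hub.
    model_path = Path(path_or_hf_repo)
    if not model_path.exists():
        try:
            model_path = Path(snapshot_download(repo_id=path_or_hf_repo))
        except RepositoryNotFoundError:
            raise RepoNotFoundError(
                f"No local or Hugging Face repository found for path: {path_or_hf_repo}."
            )
    return model_path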
awni (Member):

Couldn't this fail for other reasons? TBH I'm not sure what the goal of this new exception is. Maybe you can clarify the purpose?

AtakanTekparmak (Contributor, Author):

In the case where the user tries to load a local model named example_model_name and makes a typo (for example, example_model-name), the code still throws Hugging Face's RepositoryNotFoundError, with the following error message:

Repository Not Found for url: https://huggingface.co/api/models/llama3-17b-instruc/revision/main.
Please make sure you specified the correct `repo_id` and `repo_type`.
If you are trying to access a private or gated repo, make sure you are authenticated.

for the following line of code:

model, tokenizer = load("llama3-17b-instruc")

which is not very informative, as the user is trying to access a local model. Not essential by any means, just something I encountered my first time using the package.

awni (Member):

I see. Maybe it is worth augmenting that error message instead, because it's possible they intended to download from HF, in which case the message is actually useful.

Something like:

Model Not Found for {model_path}.
Please make sure you specified the local path or Hugging Face repo id correctly.
If you are trying to access a private or gated Hugging Face repo, make sure you are authenticated.
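
(A sketch of that suggestion, reusing the imports and structure of the earlier sketch; the exception name ModelNotFoundError is an assumption, not taken from this thread.)

# Assumed shape of the augmented-message approach; only the message text
# comes from the suggestion above, the naming is hypothetical.
class ModelNotFoundError(Exception):
    """Raised when a model is found neither locally nor on the Hub."""


def get_model_path(path_or_hf_repo: str) -> Path:
    model_path = Path(path_or_hf_repo)
    if not model_path.exists():
        try:
            model_path = Path(snapshot_download(repo_id=path_or_hf_repo))
        except RepositoryNotFoundError:
            # Suppress the original traceback so the user sees only the
            # augmented message suggested above.
            raise ModelNotFoundError(
                f"Model not found for path or HF repo: {path_or_hf_repo}.\n"
                "Please make sure you specified the local path or Hugging Face"
                " repo id correctly.\n"
                "If you are trying to access a private or gated Hugging Face"
                " repo, make sure you are authenticated."
            ) from None
    return model_path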

AtakanTekparmak (Contributor, Author):

This looks good. Are you going to add the edits or should I?

awni (Member):

Would you mind making them?

@awni (Member) left a comment:

Thanks!!

@awni merged commit 199df9e into ml-explore:main on May 20, 2024
4 checks passed