Unable to convert CogVLM due to the model not existing. #695

Open
raoulritter opened this issue Apr 19, 2024 · 1 comment

Comments

@raoulritter

I am experiencing issues converting the CogVLM model (THUDM/cogvlm-chat-hf) from Hugging Face to MLX format. The conversion fails with an unsupported model type error. More detailed logging and documentation on adding new models would be beneficial.

Environment

  • Operating System:
  • Python Version:
  • MLX Version:

Steps to Reproduce

  1. Attempt to convert the cogvlm model with the following command:
    python -m mlx_lm.convert \
        --hf-path THUDM/cogvlm-chat-hf \
        -q \
        --upload-repo raoulritter/cogvlm-mlx
  2. Observe the error indicating that the model type is unsupported (a sketch of why this happens follows these steps).
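
For context, mlx_lm.convert selects an implementation module based on the model_type field in the checkpoint's config.json; when no matching module exists under mlx_lm.models, the conversion aborts. Below is a minimal sketch of that dispatch with simplified, illustrative names (the real logic lives in mlx_lm's utilities and also handles downloading from the Hub):

    import importlib
    import json
    from pathlib import Path

    def get_model_module(model_path: str):
        # Read the Hugging Face config shipped with the checkpoint.
        with open(Path(model_path) / "config.json") as f:
            config = json.load(f)
        # THUDM/cogvlm-chat-hf declares "cogvlm" here, and no module of
        # that name exists under mlx_lm.models, hence the failure.
        model_type = config["model_type"]
        try:
            return importlib.import_module(f"mlx_lm.models.{model_type}")
        except ImportError:
            raise ValueError(f"Model type {model_type} not supported.")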

Expected Behavior

The conversion process should identify the model type and either convert it successfully or report clearly why the conversion cannot be performed, for example by listing the supported model types.

Actual Behavior

The process terminates with an unsupported model type error, without detailed debugging information or guidance on how to resolve or work around the issue.

Possible Solutions

  • Verbose Logging: Implement more detailed logging during the conversion process to help users diagnose model compatibility issues.
  • Model Support Documentation: Provide guidelines or examples on how to add support for new models in mlx_lm.models, similar to how LLaVA support was added (referenced in issue LLaVA in MLX #461); a hypothetical skeleton follows this list.
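
To make the documentation request concrete, here is a hypothetical skeleton of a new mlx_lm/models/cogvlm.py, following the ModelArgs/Model pattern the existing model modules use. Every name and field below is an assumption drawn from that pattern, not a working port of CogVLM:

    from dataclasses import dataclass

    import mlx.core as mx
    import mlx.nn as nn

    @dataclass
    class ModelArgs:
        # Fields would mirror CogVLM's config.json; only a few are shown.
        hidden_size: int
        num_hidden_layers: int
        num_attention_heads: int
        vocab_size: int

    class Model(nn.Module):
        def __init__(self, args: ModelArgs):
            super().__init__()
            self.args = args
            # The vision tower, language-model layers, and output head
            # would be defined here, as in the other model modules.

        def __call__(self, inputs: mx.array, cache=None):
            # Forward pass returning logits; the cache argument mirrors
            # the interface of the supported text models.
            raise NotImplementedError("CogVLM port not implemented yet")

Once a module like this exists and its file name matches the model_type in config.json, the dispatch sketched earlier would pick it up automatically.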

Thank you for your help. I look forward to broader model compatibility in MLX.

@Blaizzy
Contributor

Blaizzy commented Apr 19, 2024

You can move this issue here:

https://github.com/Blaizzy/mlx-vlm

I'm building a VLM-specific (vision-language model) package for MLX models.
