Failed to import transformers.pipelines #1578

Open
llmwesee opened this issue Apr 23, 2024 · 6 comments

Comments

@llmwesee
I did a fresh installation of the latest h2ogpt. When I run the following command:
python generate.py --base_model=meta-llama/Llama-2-7b-chat-hf --score_model=None --langchain_mode='UserData' --user_path=user_path --use_auth_token=True --max_seq_len=4096 --max_max_new_tokens=2048

it starts the application window successfully, but when I make a simple query like "Hi", it generates the following error:

raise RuntimeError(
RuntimeError: Failed to import transformers.pipelines because of the following error (look up to see its traceback):
Descriptors cannot be created directly.
If this call came from a _pb2.py file, your generated code is out of date and must be regenerated with protoc >= 3.19.0.
If you cannot immediately regenerate your protos, some other possible workarounds are:
 1. Downgrade the protobuf package to 3.20.x or lower.
 2. Set PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python (but this will use pure-Python parsing and will be much slower).

[screenshot: protobuf_error]
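(For reference, the two workarounds suggested in that message amount to roughly the following in a pip-managed environment; the exact protobuf pin below is only illustrative:)

# workaround 1: downgrade protobuf (version pin is illustrative)
pip install "protobuf==3.20.3"
# workaround 2: force the pure-Python protobuf parser (works, but much slower)
export PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python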

I also tracked issue https://github.com/h2oai/h2ogpt/issues/965 and ran pip install protobuf==3.20.0,
but then it throws the following error:
File "/home/abx/miniconda3/envs/gpt230424_test/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1512, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.pipelines because of the following error (look up to see its traceback):
No module named 'gast'

[screenshot: gast_error]
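(As an untested guess, the missing module can probably be installed directly; the package name is assumed to match the module name:)

pip install gast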

@pseudotensor
Collaborator

Hi. I did a completely fresh install using the quick-install method in README_LINUX.md:

curl -fsSL https://h2o-release.s3.amazonaws.com/h2ogpt/linux_install_full.sh | bash

and entered the sudo password when required. Once the install was done, I ran:

conda activate h2ogpt

and then your exact run line:

(h2ogpt) jon@gpu:~/h2ogpt$ python generate.py --base_model=meta-llama/Llama-2-7b-chat-hf --score_model=None --langchain_mode='UserData' --user_path=user_path --use_auth_token=True --max_seq_len=4096 --max_max_new_tokens=2048

and I have no issues:

[screenshot]

I have hit that protobuf issue in the past, but the cause is unclear.

How did you install h2oGPT?

@llmwesee
Author

I installed it using the install instructions from https://github.com/h2oai/h2ogpt/blob/main/docs/README_LINUX.md for CUDA 11.8.

@llmwesee
Author

And when I do it your way, i.e. the quick install, the following happens:

[screenshot: proto_error]

@pseudotensor
Collaborator

pseudotensor commented Apr 24, 2024

Looks like you are missing /usr/local/cuda-12.1. Yes, that's the default for that install, but you can download and edit the script instead of piping it straight to bash, in order to switch to another CUDA version.
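For example (a sketch only; the exact CUDA-related lines inside the script are assumptions, so check the actual file before editing):

wget https://h2o-release.s3.amazonaws.com/h2ogpt/linux_install_full.sh
# inspect and edit any hard-coded CUDA settings (e.g. /usr/local/cuda-12.1 paths) to match your system
bash linux_install_full.sh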

I'd recommend moving to CUDA 12.1, since CUDA 11.8 has lost support from many packages, and the older packages that still support CUDA 11.8 are no longer compatible with newer required ones.

It's too challenging to support the numerous packages across multiple CUDA versions.

@llmwesee
Author

llmwesee commented Apr 25, 2024

I first ensured that the correct environment variables were set before following the manual steps:
export PIP_EXTRA_INDEX_URL="https://download.pytorch.org/whl/cu118 https://huggingface.github.io/autogptq-index/whl/cu118"

and then proceeded with the provided manual steps.

Note: Previously, the same procedure worked flawlessly on a server with 48GB VRAM. However, attempting the same steps on a laptop with 16GB VRAM results in the error.
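(One quick way to confirm which CUDA build of torch was actually pulled from those index URLs, not an h2oGPT step, just a sanity check:)

python -c "import torch; print(torch.__version__, torch.version.cuda, torch.cuda.is_available())"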

@pseudotensor
Collaborator

From what you said, I guess you are still trying to use CUDA 11.8. But I cannot tell from your screenshot which package is having issues. You should share more.
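For example, the output of something like the following would help (the package names to grep are just suggestions):

pip list | grep -iE "protobuf|transformers|torch"
nvidia-smi
ls /usr/local | grep -i cuda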
