
Please review: change chatdocs numpy version requirements to <1.23.0 for ROCM experimentation #75

williamblair333 opened this issue Sep 10, 2023 · 0 comments


Overview: I'm attempting to use Radeon GPUs with chatdocs. What I've done below seems to work up to the point where I ask a question; I now get a segmentation fault regardless of whether chatdocs.yml is configured for "cuda" or not. Thanks.

Request: Please change the chatdocs numpy version requirement to <1.23.0 for ROCm experimentation

Why: To create a ROCm environment using Docker for use with chatdocs

How: Using PyTorch Installation for ROCm (Option 1) in a Docker environment to install chatdocs

Behavior: The following errors were noticed:
root@docker-container:/workspace# pip install chatdocs
<stuff>
ERROR: onnxruntime 1.15.1 has requirement numpy>=1.21.6, but you'll have numpy 1.18.5 which is incompatible.
ERROR: filelock 3.12.3 has requirement typing-extensions>=4.7.1; python_version < "3.11", but you'll have typing-extensions 4.7.0 which is incompatible.
ERROR: pandas 2.0.3 has requirement numpy>=1.20.3; python_version < "3.10", but you'll have numpy 1.18.5 which is incompatible.
ERROR: chromadb 0.3.29 has requirement numpy>=1.21.6, but you'll have numpy 1.18.5 which is incompatible.
ERROR: transformers 4.33.1 has requirement tokenizers!=0.11.3,<0.14,>=0.11.1, but you'll have tokenizers 0.14.0 which is incompatible.
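These conflicts can be reproduced outside pip. A minimal sketch using the `packaging` library (the same version-specifier primitives pip builds on, assumed to be installed) shows why numpy 1.18.5 fails the onnxruntime/chromadb constraint while 1.22.0 satisfies both it and the proposed `<1.23.0` cap:

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# Constraints taken from the pip errors above, plus the proposed ROCm cap
onnx_req = SpecifierSet(">=1.21.6")  # onnxruntime / chromadb requirement
rocm_cap = SpecifierSet("<1.23.0")   # cap requested in this issue

print(Version("1.18.5") in onnx_req)                                    # False
print(Version("1.22.0") in onnx_req and Version("1.22.0") in rocm_cap)  # True
```

This is why 1.22.0 is the pin used in the commands below: it is the narrow band that satisfies every requirement at once.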

The 'change':

root@docker-container:/workspace# pip install numpy==1.22.0 && \
pip install "typing-extensions>=4.7.1" && \
pip install "tokenizers<0.14,>=0.11.1" --force-reinstall && \
pip install "pandas>=1.0.0,<2.0.0" --force-reinstall && \
pip install chatdocs && \
chatdocs download
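After the forced reinstalls above, it's worth confirming which versions actually ended up in the environment before running `chatdocs download`. A stdlib-only sketch (package names taken from the pip errors above):

```python
from importlib.metadata import version, PackageNotFoundError

# Report the versions left behind by the pin/reinstall sequence
for pkg in ("numpy", "typing_extensions", "tokenizers", "pandas", "chatdocs"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "not installed")
```

`pip check` runs the same kind of consistency validation that produced the errors above, so it is a quick second opinion.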
root@docker-container:/app# python3 -c 'import torch; print(torch.cuda.is_available())'
True

root@docker-container:/app# cat chatdocs.yml
embeddings:
  model_kwargs:
    device: cuda
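Because YAML is indentation-sensitive, a quick way to verify the file parses to the intended nesting is to load it and walk the keys. A sketch assuming PyYAML is available (a reasonable assumption, since chatdocs reads a YAML config):

```python
import yaml  # PyYAML; assumed installed alongside chatdocs

# The same structure as chatdocs.yml above
cfg = yaml.safe_load("""
embeddings:
  model_kwargs:
    device: cuda
""")
print(cfg["embeddings"]["model_kwargs"]["device"])  # cuda
```

If the indentation were flattened, `model_kwargs` would become a sibling of `embeddings` instead of a child, and chatdocs would silently ignore the device setting.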

root@docker-container:/app# chatdocs add /app/documents/
<normal output until>

Type your query below and press Enter.
Type 'exit' or 'quit' or 'q' to exit the application.

Q: Configure the network interfaces and hostname on a rhel server

A: Segmentation fault
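A segmentation fault kills the interpreter before Python can print a traceback, so the crash site is invisible. The stdlib `faulthandler` module dumps a Python-level stack trace to stderr on SIGSEGV, which should reveal which extension module (torch, tokenizers, onnxruntime, ...) is crashing. A minimal sketch:

```python
import faulthandler

# Dump a Python traceback to stderr if the process receives SIGSEGV
faulthandler.enable()
print(faulthandler.is_enabled())  # True
```

Equivalently, without changing any code, prefix the command that starts the chat session with `PYTHONFAULTHANDLER=1`; the resulting traceback would help the maintainers pinpoint whether the segfault comes from the ROCm torch build or one of the pinned dependencies.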
