I'm running on a machine with 24 GB of GPU memory and hit the following error:
```
Traceback (most recent call last):
  File "/home/work/PythonProjects/AirLLM/inference.py", line 9, in <module>
    mode = AutoModel.from_pretrained("/home/work/.cache/huggingface/hub/models--garage-bAInd--Platypus2-70B-instruct/snapshots/31389b50953688e4e542be53e6d2ab04d5c34e87")
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/work/micromamba/envs/airllm/lib/python3.11/site-packages/airllm/auto_model.py", line 54, in from_pretrained
    return class_(pretrained_model_name_or_path, *inputs, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/work/micromamba/envs/airllm/lib/python3.11/site-packages/airllm/airllm.py", line 9, in __init__
    super(AirLLMLlama2, self).__init__(*args, **kwargs)
  File "/home/work/micromamba/envs/airllm/lib/python3.11/site-packages/airllm/airllm_base.py", line 104, in __init__
    self.model_local_path, self.checkpoint_path = find_or_create_local_splitted_path(model_local_path_or_repo_id,
                                                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/work/micromamba/envs/airllm/lib/python3.11/site-packages/airllm/utils.py", line 351, in find_or_create_local_splitted_path
    return Path(model_local_path_or_repo_id), split_and_save_layers(model_local_path_or_repo_id, layer_shards_saving_path,
                                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/work/micromamba/envs/airllm/lib/python3.11/site-packages/airllm/utils.py", line 295, in split_and_save_layers
    state_dict.update(torch.load(to_load, map_location='cpu'))
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/work/micromamba/envs/airllm/lib/python3.11/site-packages/torch/serialization.py", line 1026, in load
    return _load(opened_zipfile,
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/work/micromamba/envs/airllm/lib/python3.11/site-packages/torch/serialization.py", line 1438, in _load
    result = unpickler.load()
             ^^^^^^^^^^^^^^^^
  File "/home/work/micromamba/envs/airllm/lib/python3.11/site-packages/torch/serialization.py", line 1408, in persistent_load
    typed_storage = load_tensor(dtype, nbytes, key, _maybe_decode_ascii(location))
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/work/micromamba/envs/airllm/lib/python3.11/site-packages/torch/serialization.py", line 1373, in load_tensor
    storage = zip_file.get_storage_from_record(name, numel, torch.UntypedStorage)._typed_storage()._untyped_storage
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
RuntimeError: PytorchStreamReader failed reading file data/17: invalid header or archive is corrupted
```