
Error using pth format model #36

Open
realalexsun opened this issue Jul 30, 2023 · 4 comments
@realalexsun

Please forgive my ignorance...
Problem:

Exception: tensor stored in unsupported format

Full error that popped up:

```
Traceback (most recent call last):
  File "/var/folders/dl/8yzjfxfn26sf347jggtzb30c0000gn/T/0D6EEEEF-7FD3-417C-AFC4-71D9A9E81116/convert-pth-to-ggml.py", line 11, in <module>
    convert.main(['--outtype', 'f16' if args.ftype == 1 else 'f32', '--', args.dir_model])
  File "/private/var/folders/dl/8yzjfxfn26sf347jggtzb30c0000gn/T/0D6EEEEF-7FD3-417C-AFC4-71D9A9E81116/convert.py", line 1144, in main
    OutputFile.write_all(outfile, params, model, vocab)
  File "/private/var/folders/dl/8yzjfxfn26sf347jggtzb30c0000gn/T/0D6EEEEF-7FD3-417C-AFC4-71D9A9E81116/convert.py", line 953, in write_all
    for i, ((name, lazy_tensor), ndarray) in enumerate(zip(model.items(), ndarrays)):
  File "/private/var/folders/dl/8yzjfxfn26sf347jggtzb30c0000gn/T/0D6EEEEF-7FD3-417C-AFC4-71D9A9E81116/convert.py", line 875, in bounded_parallel_map
    result = futures.pop(0).result()
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/concurrent/futures/_base.py", line 438, in result
    return self.__get_result()
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/concurrent/futures/_base.py", line 390, in __get_result
    raise self._exception
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/concurrent/futures/thread.py", line 52, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/private/var/folders/dl/8yzjfxfn26sf347jggtzb30c0000gn/T/0D6EEEEF-7FD3-417C-AFC4-71D9A9E81116/convert.py", line 950, in do_item
    return lazy_tensor.load().to_ggml().ndarray
  File "/private/var/folders/dl/8yzjfxfn26sf347jggtzb30c0000gn/T/0D6EEEEF-7FD3-417C-AFC4-71D9A9E81116/convert.py", line 489, in load
    ret = self._load()
  File "/private/var/folders/dl/8yzjfxfn26sf347jggtzb30c0000gn/T/0D6EEEEF-7FD3-417C-AFC4-71D9A9E81116/convert.py", line 497, in load
    return self.load().astype(data_type)
  File "/private/var/folders/dl/8yzjfxfn26sf347jggtzb30c0000gn/T/0D6EEEEF-7FD3-417C-AFC4-71D9A9E81116/convert.py", line 489, in load
    ret = self._load()
  File "/private/var/folders/dl/8yzjfxfn26sf347jggtzb30c0000gn/T/0D6EEEEF-7FD3-417C-AFC4-71D9A9E81116/convert.py", line 695, in load
    return UnquantizedTensor(storage.load(storage_offset, elm_count).reshape(size))
  File "/private/var/folders/dl/8yzjfxfn26sf347jggtzb30c0000gn/T/0D6EEEEF-7FD3-417C-AFC4-71D9A9E81116/convert.py", line 679, in load
    raise Exception("tensor stored in unsupported format")
Exception: tensor stored in unsupported format
```
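As a hedged aside not stated in the thread: one common reason an older conversion script rejects a `.pth` checkpoint is the serialization format change in PyTorch 1.6, which switched `torch.save` from the legacy pickle/tar layout to a zip archive. A converter that only parses one layout can fail on the other. A minimal check to see which layout a checkpoint uses:

```python
import zipfile

def checkpoint_format(path):
    """Classify a .pth file: PyTorch >=1.6 saves a zip archive,
    earlier versions use a legacy pickle-based layout."""
    return "zip" if zipfile.is_zipfile(path) else "legacy"
```

If the checkpoint reports `zip` but the bundled converter predates zip support (or vice versa), updating the conversion scripts, as suggested below in this thread, is the likely fix.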

@hyeonchang

Same issue.

@hyeonchang

@realalexsun

Download these files

https://github.com/ggerganov/llama.cpp/blob/master/convert-pth-to-ggml.py
https://github.com/ggerganov/llama.cpp/blob/master/convert.py

Copy the downloaded files into the folder below.

/Applications/LlamaChat.app/Contents/Resources/llama.swift_llama.bundle/Contents/Resources

RESOLVED!!!
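The two steps above can be scripted. This is a hypothetical sketch, not an official fix: the GitHub "blob" page URLs from the comment serve HTML, so they are rewritten to raw file URLs before downloading, and the `blob_to_raw` helper is an assumption introduced here for illustration.

```shell
blob_to_raw() {
  # Rewrite a GitHub blob page URL to the raw file URL curl can download.
  printf '%s\n' "$1" | sed -e 's#//github.com#//raw.githubusercontent.com#' -e 's#/blob/#/#'
}

# Destination from the comment above: LlamaChat's bundled resources.
DEST="/Applications/LlamaChat.app/Contents/Resources/llama.swift_llama.bundle/Contents/Resources"

if [ -d "$DEST" ]; then
  for url in \
    https://github.com/ggerganov/llama.cpp/blob/master/convert-pth-to-ggml.py \
    https://github.com/ggerganov/llama.cpp/blob/master/convert.py
  do
    curl -fsSL "$(blob_to_raw "$url")" -o "$DEST/$(basename "$url")"
  done
fi
```

The `-d "$DEST"` guard skips the copy on machines where LlamaChat is not installed at that path.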

@realalexsun
Author

> @realalexsun
>
> Download these files
>
> https://github.com/ggerganov/llama.cpp/blob/master/convert-pth-to-ggml.py
> https://github.com/ggerganov/llama.cpp/blob/master/convert.py
>
> Copy the downloaded files to the below folder.
>
> /Applications/LlamaChat.app/Contents/Resources/llama.swift_llama.bundle/Contents/Resources
>
> RESOLVED!!!

Thanks

@LucaColonnello

These files are no longer available in the llama.cpp codebase. I went back in the history and grabbed them, but maybe this needs an update on the LlamaChat side?
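Recovering the deleted scripts from git history can be done with a small helper. This is a sketch under assumptions: `recover_deleted` is a name introduced here, and it must be run inside a clone of the llama.cpp repository. It locates the commit that deleted a file and restores the file from that commit's parent.

```shell
recover_deleted() {
  # Find the most recent commit that deleted $1, then restore
  # the file into the working tree from that commit's parent.
  file="$1"
  del=$(git log --diff-filter=D --format=%H -1 -- "$file")
  [ -n "$del" ] || return 1
  git checkout "${del}~1" -- "$file"
}

# Inside a clone of https://github.com/ggerganov/llama.cpp:
# recover_deleted convert.py
# recover_deleted convert-pth-to-ggml.py
```

Note the two scripts may have been deleted in different commits, so each file is recovered independently.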
