
failed to load model config file from colab #63

Open
localoca5 opened this issue Jan 15, 2024 · 1 comment
@localoca5

I failed to load the model config file on Colab.

The code is as below:
#load model
model = create_model(model_config).cpu()
model.load_state_dict(load_state_dict(model_ckpt, location='cuda'))
model = model.cuda()
ddim_sampler = DDIMSampler(model)
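
For completeness, the helpers above come from the ControlNet-style modules that AnyDoor ships; the exact import paths below are my assumption from the repo layout, not copied from the notebook:

# Assumed imports (ControlNet-style layout that AnyDoor follows; paths may differ):
from cldm.model import create_model, load_state_dict
from cldm.ddim_hacked import DDIMSampler
# model_config and model_ckpt are defined earlier in the notebook
# (paths to the model YAML config and the .ckpt checkpoint).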

Got error:

OSError Traceback (most recent call last)
in <cell line: 2>()
1 #load model
----> 2 model = create_model(model_config).cpu()
3 model.load_state_dict(load_state_dict(model_ckpt, location='cuda'))
4 model = model.cuda()
5 ddim_sampler = DDIMSampler(model)

24 frames
/usr/lib/python3.10/ctypes/__init__.py in __init__(self, name, mode, handle, use_errno, use_last_error, winmode)
372
373 if handle is None:
--> 374 self._handle = _dlopen(self._name, mode)
375 else:
376 self._handle = handle

OSError: /usr/local/lib/python3.10/dist-packages/torchtext/lib/libtorchtext.so: undefined symbol: _ZN5torch3jit21setUTF8DecodingIgnoreEb

Has anyone run into this issue or had success running inference on Colab? I'm referring to the HF Space version of AnyDoor: https://huggingface.co/spaces/xichenhku/AnyDoor-online/tree/main.
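
In case it helps others who hit the same trace: the undefined symbol suggests the installed torchtext was built against a different torch release than the one in the Colab runtime. Below is a minimal sanity check that avoids importing torchtext (the import itself is what triggers the dlopen failure); the version pairing and fixes are assumptions about the environment, not something verified on this Space:

from importlib.metadata import version, PackageNotFoundError

# Read the installed versions without importing torchtext,
# since "import torchtext" is what fails with the undefined symbol.
for pkg in ("torch", "torchtext"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "not installed")

# torch and torchtext are released in matching pairs (e.g. torch 2.1.x with torchtext 0.16.x);
# a mismatched pair is the usual cause of this undefined-symbol OSError.
# Typical fixes in a Colab cell (exact versions depend on the runtime):
#   !pip uninstall -y torchtext            # if nothing in the pipeline actually needs torchtext
#   !pip install "torchtext==<matching>"   # or install the release built against the installed torch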

@AB00k commented Apr 26, 2024

I'm facing the same issue.
