Hi guys,
I tried to run inference on my Mac Studio, which has an M2 chip. However, I ran into a problem installing xformers:
```
  File "/Users/user/anaconda3/envs/opensora/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 1783, in _write_ninja_file_and_compile_objects
    _run_ninja_build(
  File "/Users/user/anaconda3/envs/opensora/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 2123, in _run_ninja_build
    raise RuntimeError(message) from e
RuntimeError: Error compiling objects for extension
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for xformers
Running setup.py clean for xformers
Failed to build xformers
ERROR: Could not build wheels for xformers, which is required to install pyproject.toml-based projects
```
Are there any other ways that I can perform inference on my device?
Thank you!
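For what it's worth, one possible workaround, assuming the model only needs xformers for its memory-efficient attention: PyTorch 2.x ships a built-in `F.scaled_dot_product_attention` that requires no compiled extension and runs on Apple's MPS backend (with a CPU fallback). This is just a sketch of the drop-in call, not the Open-Sora codebase's actual integration:

```python
# Sketch: use PyTorch's built-in scaled_dot_product_attention in place of
# xformers' memory_efficient_attention. Assumes PyTorch >= 2.0; the tensor
# shapes below are illustrative, not taken from any specific model.
import torch
import torch.nn.functional as F

# Prefer Apple's Metal (MPS) backend when available, otherwise fall back to CPU.
device = "mps" if torch.backends.mps.is_available() else "cpu"

# Attention inputs with shape (batch, heads, seq_len, head_dim).
q = torch.randn(1, 8, 16, 64, device=device)
k = torch.randn(1, 8, 16, 64, device=device)
v = torch.randn(1, 8, 16, 64, device=device)

# Fused attention without any custom CUDA extension; output shape matches q.
out = F.scaled_dot_product_attention(q, k, v)
print(out.shape)
```

Whether this works for a given repo depends on how deeply xformers is wired in; some projects guard the import and fall back automatically when xformers is absent.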