
Cannot run model with noexec /tmp #4229

Closed
jmbit opened this issue May 7, 2024 · 1 comment
Labels: bug (Something isn't working)

Comments

jmbit commented May 7, 2024

What is the issue?

For security reasons, I have set /tmp to nodev,nosuid,noexec.
However, this means I can't run Ollama models:

 ~ $ ollama run llama3
Error: error starting the external llama server: fork/exec /tmp/ollama2604787016/runners/cpu_avx2/ollama_llama_server: permission denied
The relevant entries from my /etc/fstab:

# /etc/fstab: static file system information.
...
# <file system>   <mount point>   <type>   <options>                      <dump>  <pass>
...
# shm, tmp
tmpfs             /dev/shm        tmpfs    defaults,nodev,nosuid,noexec   0       0
tmpfs             /tmp            tmpfs    defaults,nodev,nosuid,noexec   0       0
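
For reference, the mount options actually in effect on /tmp can be confirmed with findmnt (part of util-linux); the output shown is illustrative for the fstab above:

$ findmnt -no OPTIONS /tmp
rw,nosuid,nodev,noexec,relatime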

Is there any way to set the execution directory for Ollama?
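
A minimal workaround sketch, assuming this Ollama build honors the standard TMPDIR environment variable when extracting its runners (the directory path here is only an example; newer releases also document a dedicated OLLAMA_TMPDIR variable for the same purpose):

# Create a scratch directory on a filesystem mounted without noexec
mkdir -p /opt/ollama-tmp
# Start the server with temporary files redirected there
TMPDIR=/opt/ollama-tmp ollama serve

If Ollama runs as a systemd service, the same variable can be set persistently via a drop-in (sudo systemctl edit ollama.service):

[Service]
Environment="TMPDIR=/opt/ollama-tmp"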

OS: Linux
GPU: Intel
CPU: Intel
Ollama version: 0.1.33

jmbit added the bug label on May 7, 2024
pdevine (Contributor) commented May 7, 2024

I think this is essentially a dupe of #4105. I'll close this and we can track it in the other issue.

pdevine closed this as completed on May 7, 2024