[BUG]: Linux AppImage - Error on update! (solution in comments) #1077
Fully close the application and re-open it so that the migrations can run; that feature requires a migration. If the application was updated while running, the migrations will not run.
I've tried opening and closing the application several times, yet the issue persists.
Just modified the Linux version to help debug this issue. Can you see if this error reproduces with this AppImage? Let's see if it boots on your distro and allows you to update that workspace config. Definitely Prisma related, but local to the AppImage.
I tried to run this AppImage but ran into the problem described in #898. I tried to fix it but got the error below. I'm running Ubuntu 22.04.4; I didn't have this issue with the image I downloaded from the website.
Okay, honestly that was expected; that was the previous issue. I just reverted to the prior image, which was causing the original issue. While the provider-per-workspace functionality is not available in 1.4.2, are you able to even set the model per workspace in that version?
It looks like here, for some reason, the Prisma client is failing to generate after the migration. This blocks the client from using the updated schema, so updating workspaces that use the new fields then fails.
Based on what you said, I tried to run the same process as described in #898 with the AppImage I downloaded earlier from the website. This worked, using Node 20.12.2... I'm not familiar with JavaScript packaging, Prisma, etc., so I'm not sure how to fix this properly; maybe a CI step to create the AppImage?
That likely could be the fix. Asking people to unzip the AppImage is a bit crazy, so I wanted to hold off on recommending that, but patching the app post-install seems to be the most consistently reliable solution. :/ The main issue is that whatever CI we use to build the image will hit the same problem. Thinking we might just make the app a shell script at this point that can build the AppImage on-machine to avoid all of these compatibility quirks. Docker just works for Linux on all distros, but nobody wants to use that solution, which is why AppImage even exists.
I asked ChatGPT lol and it came back with the following:
I'm not sure how you fixed this previously... It looks like adding this to the `schema.prisma` generator client might do it:
I had a look. I'm not familiar with Prisma, but it might be an idea to put the generate command after the migrate command in the package.json file, e.g.:
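A sketch of what that ordering might look like in the app's package.json scripts, so `prisma generate` always runs immediately after the migration (the script name `prisma:setup` is hypothetical; the actual script names in the AnythingLLM repo may differ):

```json
{
  "scripts": {
    "prisma:setup": "npx prisma migrate deploy && npx prisma generate"
  }
}
```

With `&&`, the client is only regenerated if the migration succeeds, and the app never runs against a stale client after a schema change.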
@renegadephysicist you may have just solved a big headache. Will see if we can wrap a new Linux version and test whether this resolves everyone's issues. If it does, I would be happy to mark you as a contributor. Related: prisma/prisma#8112
I concur that Prisma is the root cause of the problem, yet I find myself unable to rectify the situation.
It fails on `chatProvider` when changing the Workspace LLM Provider.
For now, the solution to simplify this for all distros is to run the AppImage in the following way:

./AnythingLLMDesktop.AppImage --appimage-extract && \
cd squashfs-root && \
./AppRun

This will ensure on run that the app executes from a writable, extracted directory rather than the read-only AppImage mount, so the Prisma client can be regenerated.
So that's it!!
It works, thank you very much!
My system is Windows; is there any solution for this issue?
Me too, I encountered the same problem on Windows.
I ran into this too. Has it been solved?
How are you running AnythingLLM?
AnythingLLM desktop app
What happened?
System: Linux Ubuntu using AppImage
AnythingLLM Version: 1.4.4
When attempting to change the "Workspace LLM Provider" in "Chat Settings" I get the error below.
It seems to be the same error for both Anthropic and OpenAI. It's possible to change the LLM at the instance level by changing "LLM Provider", just not at the Workspace level.
I get an error when starting the AppImage:
EROFS: read-only file system, unlink '/tmp/.mount_Anythib4quPA/resources/backend/node_modules/.prisma/client/index.js'
Not sure if this is related or not.
Are there known steps to reproduce?
After clicking "Update Workspace", the following error appears if a value other than "System Default" is selected for "Workspace LLM Provider".