I'm getting the following message when deploying the Zep image using docker-compose, and the Zep container fails to start:
embeddings model needs to be provided when using Azure API
Do we have to use an embedding model when using Azure? Can I just use Azure OpenAI for the summarization? I have also tried using an Azure embedding model, but with it enabled my bot's responses become very slow.
Zep embeds all chat conversations so that they may be recalled via semantic search. Are you using Azure's OpenAI embedding service or Zep's local embedding service? More about Zep's embedding configuration here: https://docs.getzep.com/deployment/embeddings/
I want to use long-term memory with summarization. My message window is only about 5 messages, and since this is a real-time RAG application, I want to summarize the last few messages to manage the flow, so I don't need embeddings. However, I do want to use Azure OpenAI models for summarization. Zep won't let me disable embeddings when using an Azure OpenAI endpoint and LLM; I keep getting this message: embeddings model needs to be provided when using Azure API
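For anyone hitting the same validation error: Zep's config validation appears to require an embedding deployment whenever the Azure OpenAI service is selected. The sketch below is illustrative only; the exact key names vary by Zep version, and the deployment names (`gpt-35-turbo`, `text-embedding-ada-002`) are placeholders for your own Azure deployments — check the embeddings docs linked above for the keys your release actually uses.

```yaml
# Hypothetical config.yaml sketch for Zep with Azure OpenAI.
# Key names are assumptions based on the docs; verify against your Zep version.
llm:
  service: "openai"
  # Your Azure OpenAI resource endpoint (placeholder):
  azure_openai_endpoint: "https://YOUR-RESOURCE.openai.azure.com"
  azure_openai:
    # Deployment used for summarization (placeholder name):
    llm_deployment: "gpt-35-turbo"
    # Zep's validation demands this when the Azure API is used,
    # even if you only want summarization (placeholder name):
    embedding_deployment: "text-embedding-ada-002"
```

If embedding latency is the concern, one workaround suggested by the embeddings documentation is to keep the required Azure embedding deployment configured (so validation passes) but switch message embedding to Zep's local embedding service, which runs in the Zep container rather than making a network round-trip per message.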