class_obj = {
"class": "Question1",
"description": "Information from a Jeopardy! question", # description of the class
"vectorizer": "text2vec-ollama",
"moduleConfig": {
"generative-openai": {} # Set generative-openai as the generative module
},
"properties": [
{
"name": "question",
"dataType": ["text"],
"description": "The question",
"moduleConfig": {
"text2vec-ollama": { # this must match the vectorizer used
"vectorizePropertyName": True,
"tokenization": "lowercase"
}
}
},
{
"name": "answer",
"dataType": ["text"],
"description": "The answer",
"moduleConfig": {
"text2vec-ollama": { # this must match the vectorizer used
"vectorizePropertyName": False,
"tokenization": "whitespace"
}
}
},
],
}
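Note that every property's `moduleConfig` key must match the class-level `vectorizer` (here `text2vec-ollama`). A quick local sanity check, sketched below, can catch a mismatch before the schema is sent to the server (the helper name and the second, deliberately mismatched schema are illustrative, not part of the report):

```python
def check_module_config(class_obj):
    """Return the names of properties whose moduleConfig key
    does not match the class-level vectorizer."""
    vectorizer = class_obj["vectorizer"]
    mismatches = []
    for prop in class_obj.get("properties", []):
        if vectorizer not in prop.get("moduleConfig", {}):
            mismatches.append(prop["name"])
    return mismatches

# Minimal example: 'answer' is configured for a different module.
schema = {
    "vectorizer": "text2vec-ollama",
    "properties": [
        {"name": "question", "moduleConfig": {"text2vec-ollama": {}}},
        {"name": "answer", "moduleConfig": {"text2vec-contextionary": {}}},
    ],
}
print(check_module_config(schema))  # → ['answer']
```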
How to reproduce this bug?
Enable the text2vec-ollama module and try to create the class above.
What is the expected behavior?
The class can be imported.
What is the actual behavior?
The server skips the module.
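For context, modules are usually enabled through the `ENABLE_MODULES` environment variable when starting the server. A Docker-based sketch (the port mapping and image tag are assumptions, matching the reported server version):

```shell
# Start Weaviate with the text2vec-ollama and generative-openai modules enabled.
docker run -p 8080:8080 \
  -e ENABLE_MODULES="text2vec-ollama,generative-openai" \
  semitechnologies/weaviate:1.24.10
```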
Supporting information
No response
Server Version
1.24.10