add logs for OllamaClient #662
Conversation
Custom interceptors can be replaced by https://github.com/square/okhttp/tree/master/okhttp-logging-interceptor
Got it, but okhttp's default logging interceptor is not flexible enough for this purpose: the logging level cannot be changed or customized.
Thanks a lot! BTW, we will be replacing okhttp as the default http client, so it is better to have a custom logging component, ideally client-agnostic, which can be used with any http client.
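A client-agnostic logging component, as suggested above, might look roughly like the following sketch. All names here are hypothetical, not the PR's actual code: the idea is that the formatter only works on plain data (method, URL, headers, body), so it can be called from an okhttp interceptor today and from any replacement http client later.

```java
import java.util.Map;

// Hypothetical sketch of a client-agnostic HTTP logging helper: it formats
// plain request/response data into log messages and has no dependency on
// okhttp or any other http client library.
public final class HttpLoggingFormatter {

    private HttpLoggingFormatter() {
    }

    public static String formatRequest(String method, String url,
                                       Map<String, String> headers, String body) {
        StringBuilder sb = new StringBuilder();
        sb.append("Request: ").append(method).append(' ').append(url).append('\n');
        headers.forEach((name, value) ->
                sb.append("  ").append(name).append(": ").append(value).append('\n'));
        sb.append("Body: ").append(body == null ? "<empty>" : body);
        return sb.toString();
    }

    public static String formatResponse(int statusCode, String body) {
        return "Response: " + statusCode + "\nBody: " + (body == null ? "<empty>" : body);
    }
}
```

Each client-specific adapter (an okhttp interceptor, for example) would then only extract the data and delegate the formatting and log-level decisions to this shared component.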
@HashZhang thanks a lot!
Would be great to also add logging for streaming, as done in MistralAiClient
🙏
Thank you!
langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaClient.java
Review Status
Actionable comments generated: 5
Configuration used: CodeRabbit UI
Files selected for processing (3)
- langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaClient.java (1 hunks)
- langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaRequestLoggingInterceptor.java (1 hunks)
- langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaResponseLoggingInterceptor.java (1 hunks)
Additional comments: 2
langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaResponseLoggingInterceptor.java (1)
- 13-18: The implementation of the intercept method correctly captures, logs, and returns the HTTP response. Well done on following the interceptor pattern effectively.
langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaRequestLoggingInterceptor.java (1)
- 14-18: The implementation of the intercept method correctly captures, logs, and proceeds with the HTTP request. Nicely done on adhering to the interceptor pattern.
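The interceptor pattern these comments describe can be illustrated with a small, dependency-free sketch. This is a simplified stand-in for okhttp's Interceptor/Chain machinery, not the PR's actual classes: the request logger logs before delegating down the chain, while the response logger delegates first and logs the result on the way back out.

```java
import java.util.List;
import java.util.function.Function;

// Simplified, dependency-free sketch of the chained-interceptor pattern.
// Requests and responses are modeled as plain Strings for brevity.
public class InterceptorDemo {

    interface Interceptor {
        String intercept(String request, Function<String, String> proceed);
    }

    static final StringBuilder LOG = new StringBuilder();

    // Logs the request, then hands it to the rest of the chain.
    static final Interceptor REQUEST_LOGGER = (request, proceed) -> {
        LOG.append("request:").append(request).append(';');
        return proceed.apply(request);
    };

    // Proceeds first, then logs the response before returning it unchanged.
    static final Interceptor RESPONSE_LOGGER = (request, proceed) -> {
        String response = proceed.apply(request);
        LOG.append("response:").append(response).append(';');
        return response;
    };

    // Builds the chain back-to-front so interceptors run in list order,
    // with the terminal call (the actual HTTP exchange) at the end.
    static String execute(List<Interceptor> interceptors,
                          Function<String, String> call, String request) {
        Function<String, String> chain = call;
        for (int i = interceptors.size() - 1; i >= 0; i--) {
            Interceptor interceptor = interceptors.get(i);
            Function<String, String> next = chain;
            chain = req -> interceptor.intercept(req, next);
        }
        return chain.apply(request);
    }
}
```

The key property both review comments praise is preserved here: each interceptor observes the traffic but returns the request/response unchanged, so logging never alters the exchange.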
...in4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaResponseLoggingInterceptor.java
...ain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaRequestLoggingInterceptor.java
langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaClient.java
More code added for:
# Conflicts:
#	langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaClient.java
Conflict resolved, please review, thanks!
langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaChatModel.java
langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaStreamingChatModel.java
langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaStreamingLanguageModel.java
…OllamaChatModel.java Co-authored-by: LangChain4j <langchain4j@gmail.com>
…OllamaStreamingLanguageModel.java Co-authored-by: LangChain4j <langchain4j@gmail.com>
…OllamaStreamingChatModel.java Co-authored-by: LangChain4j <langchain4j@gmail.com>
Done
# Conflicts:
#	langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaChatModel.java
#	langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaClient.java
#	langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaEmbeddingModel.java
#	langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaLanguageModel.java
#	langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaStreamingChatModel.java
#	langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaStreamingLanguageModel.java
Conflict resolved @langchain4j
@HashJang thanks a lot! Just a few minor comments
this.client = OllamaClient.builder()
        .baseUrl(baseUrl)
        .timeout(getOrDefault(timeout, ofSeconds(60)))
        .customHeaders(customHeaders)
        .logRequests(getOrDefault(logRequests, false))
        .logResponses(logResponses)
Is it intended that there is no default for logResponses?
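A null-safe default helper in the style of the getOrDefault already applied to logRequests in the snippet above could address this. The following is a sketch of such a helper (hypothetical standalone class; langchain4j has its own utility for this), which would allow writing .logResponses(getOrDefault(logResponses, false)) as well:

```java
// Sketch of a null-safe default helper, mirroring the getOrDefault calls
// used for timeout and logRequests in the builder snippet above.
public final class Defaults {

    private Defaults() {
    }

    // Returns the value if present, otherwise the supplied default.
    public static <T> T getOrDefault(T value, T defaultValue) {
        return value != null ? value : defaultValue;
    }
}
```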
@Builder
public OllamaClient(String baseUrl,
                    Duration timeout,
                    Boolean logRequests, Boolean logResponses, Boolean logStreamingResponses,
Can't logResponses be re-used for streaming (instead of introducing logStreamingResponses)?
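The reviewer's suggestion could be sketched as follows. This is a hypothetical illustration, not the PR's code: a single logResponses flag also gates streaming-response logging, so no separate logStreamingResponses parameter is needed.

```java
// Hypothetical sketch: one logResponses flag covering both regular and
// streaming responses, instead of a separate logStreamingResponses flag.
public class LoggingConfig {

    public final boolean logRequests;
    public final boolean logResponses; // covers regular and streaming responses

    // Boolean parameters mirror the builder's nullable flags; null means "off".
    public LoggingConfig(Boolean logRequests, Boolean logResponses) {
        this.logRequests = Boolean.TRUE.equals(logRequests);
        this.logResponses = Boolean.TRUE.equals(logResponses);
    }

    // Streaming responses are logged under the same flag: no extra knob.
    public boolean shouldLogStreamingResponses() {
        return logResponses;
    }
}
```

The trade-off is that callers lose the ability to log only non-streaming responses, which may or may not matter for this client.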
@HashJang thanks a lot!
Summary by CodeRabbit
- Introduced request and response logging in the langchain4j-ollama module, providing better insights for debugging and monitoring.