
add logs for OllamaClient #662

Merged
merged 14 commits into from
May 22, 2024
Conversation

@HashJang (Contributor) commented Feb 23, 2024

Summary by CodeRabbit

  • New Features
    • Enhanced logging for both requests and responses in the langchain4j-ollama module, providing better insights for debugging and monitoring.

@mmanrai (Contributor) commented Feb 27, 2024

Custom interceptors can be replaced by https://github.com/square/okhttp/tree/master/okhttp-logging-interceptor

@HashJang (Contributor, Author) commented Mar 3, 2024

Custom interceptors can be replaced by https://github.com/square/okhttp/tree/master/okhttp-logging-interceptor

Got it, but okhttp's default logging interceptor is not flexible or well-suited for this: the logging level cannot be changed or customized.
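For illustration, here is a minimal sketch of the kind of custom request-logging interceptor being discussed. It uses a simplified plain-Java stand-in for okhttp's `Interceptor`/`Chain` pair (no okhttp dependency; the types and names here are illustrative, not the PR's actual code) purely to show the point: with a custom interceptor, the logger and level are injected by the caller rather than fixed.

```java
import java.util.function.Function;
import java.util.logging.Level;
import java.util.logging.Logger;

// Simplified stand-ins for an HTTP request/response, for illustration only.
record HttpRequest(String method, String url, String body) {}
record HttpResponse(int code, String body) {}

class RequestLoggingInterceptor {

    private final Logger logger;
    private final Level level;

    RequestLoggingInterceptor(Logger logger, Level level) {
        // Both the logger and the level are injected, so callers control
        // where the log lines go and how verbose they are.
        this.logger = logger;
        this.level = level;
    }

    HttpResponse intercept(HttpRequest request, Function<HttpRequest, HttpResponse> chain) {
        logger.log(level, () -> String.format(
                "Request: %s %s body=%s", request.method(), request.url(), request.body()));
        return chain.apply(request); // proceed with the call unchanged
    }
}
```

The same shape works for a response interceptor: log after calling `chain`, then return the response untouched.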

@langchain4j (Owner) commented:

Thanks a lot!

BTW, we will be replacing okhttp as the default HTTP client, so it is better to have a custom logging component, ideally client-agnostic, that can be used with any HTTP client.
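As a rough sketch of what such a client-agnostic component could look like (the class and method names here are illustrative assumptions, not the actual langchain4j API): the helper depends only on plain values (method, URL, headers, body), so okhttp, `java.net.http`, or any other client can feed it.

```java
import java.util.Map;
import java.util.logging.Logger;
import java.util.stream.Collectors;

// Illustrative sketch: a logging helper with no dependency on any
// particular HTTP client's request/response types.
final class HttpLoggingSupport {

    private static final Logger LOG = Logger.getLogger(HttpLoggingSupport.class.getName());

    static String formatRequest(String method, String url,
                                Map<String, String> headers, String body) {
        String headerLines = headers.entrySet().stream()
                .map(e -> e.getKey() + ": " + maskIfSensitive(e.getKey(), e.getValue()))
                .collect(Collectors.joining("\n"));
        return "HTTP request:\n" + method + " " + url + "\n" + headerLines + "\n\n" + body;
    }

    // Avoid leaking credentials into logs.
    static String maskIfSensitive(String name, String value) {
        return name.equalsIgnoreCase("Authorization") ? "***" : value;
    }

    static void logRequest(String method, String url,
                           Map<String, String> headers, String body) {
        LOG.info(() -> formatRequest(method, url, headers, body));
    }
}
```

Each client-specific integration then only needs a thin adapter that extracts those plain values from its own request type.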

@langchain4j (Owner) left a comment:

@HashJang thanks a lot!

Would be great to also add logging for streaming, as done in MistralAiClient 🙏
Thank you!

@coderabbitai (bot) left a comment:

Review Status

Actionable comments generated: 5

Configuration used: CodeRabbit UI

Commits: files that changed from the base of the PR, between e62ea2f and ef5c5d1.
Files selected for processing (3)
  • langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaClient.java (1 hunks)
  • langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaRequestLoggingInterceptor.java (1 hunks)
  • langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaResponseLoggingInterceptor.java (1 hunks)
Additional comments: 2
langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaResponseLoggingInterceptor.java (1)
  • 13-18: The implementation of the intercept method correctly captures, logs, and returns the HTTP response. Well done on following the interceptor pattern effectively.
langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaRequestLoggingInterceptor.java (1)
  • 14-18: The implementation of the intercept method correctly captures, logs, and proceeds with the HTTP request. Nicely done on adhering to the interceptor pattern.

@HashJang (Contributor, Author) commented:

More code added:

  1. added configuration for request and response logs, as in the OpenAI client and model
  2. added configuration for streaming response logs, as in the Mistral client and model

Repository owner deleted a comment from coderabbitai bot Apr 10, 2024
# Conflicts:
#	langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaClient.java

@HashJang (Contributor, Author) commented:

Conflict resolved, please review, thanks @langchain4j

@langchain4j langchain4j added the P3 Medium priority label Apr 23, 2024
HashJang and others added 5 commits April 24, 2024 18:55
…OllamaChatModel.java

Co-authored-by: LangChain4j <langchain4j@gmail.com>
…OllamaStreamingLanguageModel.java

Co-authored-by: LangChain4j <langchain4j@gmail.com>
…OllamaStreamingChatModel.java

Co-authored-by: LangChain4j <langchain4j@gmail.com>
@HashJang (Contributor, Author) left a comment:

Done

# Conflicts:
#	langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaChatModel.java
#	langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaClient.java
#	langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaEmbeddingModel.java
#	langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaLanguageModel.java
#	langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaStreamingChatModel.java
#	langchain4j-ollama/src/main/java/dev/langchain4j/model/ollama/OllamaStreamingLanguageModel.java
@HashJang (Contributor, Author) commented:

Conflict resolved @langchain4j

@langchain4j (Owner) left a comment:

@HashJang thanks a lot! Just a few minor comments

this.client = OllamaClient.builder()
        .baseUrl(baseUrl)
        .timeout(getOrDefault(timeout, ofSeconds(60)))
        .customHeaders(customHeaders)
        .logRequests(getOrDefault(logRequests, false))
        .logResponses(logResponses)
@langchain4j (Owner) commented:

is it intended that there is no default for logResponses?


@Builder
public OllamaClient(String baseUrl,
                    Duration timeout,
                    Boolean logRequests, Boolean logResponses, Boolean logStreamingResponses,
@langchain4j (Owner) commented:

can't logResponses be re-used for streaming (instead of introducing logStreamingResponses)?
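Both review questions come down to how the nullable `Boolean` flags are defaulted in the constructor. A hedged sketch (plain Java; `LoggingConfig` and this `getOrDefault` helper are illustrative stand-ins for the PR's actual code) of one way to give `logResponses` a default and let the streaming flag fall back to it instead of being a fully separate knob:

```java
// Illustrative sketch of defaulting nullable Boolean flags: logRequests and
// logResponses default to false, and logStreamingResponses falls back to
// logResponses when no streaming-specific value is given.
final class LoggingConfig {

    final boolean logRequests;
    final boolean logResponses;
    final boolean logStreamingResponses;

    LoggingConfig(Boolean logRequests, Boolean logResponses, Boolean logStreamingResponses) {
        this.logRequests = getOrDefault(logRequests, false);
        this.logResponses = getOrDefault(logResponses, false);
        // Reuse logResponses unless the caller explicitly sets the streaming flag.
        this.logStreamingResponses = getOrDefault(logStreamingResponses, this.logResponses);
    }

    static <T> T getOrDefault(T value, T defaultValue) {
        return value == null ? defaultValue : value;
    }
}
```

With this shape, `new LoggingConfig(null, true, null)` logs both blocking and streaming responses, while passing an explicit `false` for the third argument opts streaming out independently.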

@langchain4j (Owner) left a comment:

@HashJang thanks a lot!

@langchain4j langchain4j merged commit 15b58ad into langchain4j:main May 22, 2024
6 checks passed
langchain4j added a commit that referenced this pull request May 22, 2024
Labels: P3 Medium priority
Projects: none yet
Linked issues: none yet
3 participants