# java-ai-app-observation

**Repository Path**: other-open-source/java-ai-app-observation

## Basic Information

- **Project Name**: java-ai-app-observation
- **Description**: No description available
- **Primary Language**: Java
- **License**: Apache-2.0
- **Default Branch**: main
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2024-10-01
- **Last Updated**: 2024-10-01

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# Java AI App Observation

## Spring AI

### OpenTelemetry

See the example app in [examples/spring-ai-example](examples/spring-ai-example).

#### Metrics

| Name                                  | Type      | Description                           |
|---------------------------------------|-----------|---------------------------------------|
| `chat.client.call.duration`           | Histogram | Duration of `ChatClient` call requests |
| `chat.client.prompt.tokens.count`     | Counter   | Count of `ChatClient` prompt tokens   |
| `chat.client.generation.tokens.count` | Counter   | Count of `ChatClient` generation tokens |
| `chat.client.total.tokens.count`      | Counter   | Count of `ChatClient` total tokens    |

For `ChatClient`s, use `ChatClientTelemetry` to wrap an existing `ChatClient`:

```java
@Configuration
public class ApplicationConfiguration {

    @Bean
    @Primary
    public ChatClient chatClient(OllamaChatClient ollamaChatClient, OpenTelemetry openTelemetry) {
        return ChatClientTelemetry.builder(openTelemetry)
                .tracePromptContent(true)        // Trace prompt content
                .traceChatResponseContent(true)  // Trace chat response content
                .build()
                .newChatClient(ollamaChatClient);
    }

    @Bean
    public OpenTelemetry openTelemetry(ApplicationContext applicationContext) {
        return AutoConfiguredOpenTelemetrySdk.builder()
                .addResourceCustomizer(
                        (resource, configProperties) -> resource.toBuilder()
                                .put(AttributeKey.stringKey("service.name"),
                                        Optional.ofNullable(applicationContext.getId())
                                                .orElseGet(applicationContext::getDisplayName))
                                .build())
                .setResultAsGlobal()
                .build()
                .getOpenTelemetrySdk();
    }
}
```
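The README does not show how `ChatClientTelemetry` records these metrics internally. As a rough illustration only, the sketch below shows how instruments with the metric names from the table above could be created and updated with the standard OpenTelemetry Metrics API; the class name, meter name, and token values are hypothetical, and a no-op `OpenTelemetry` instance stands in for the SDK bean configured above.

```java
import io.opentelemetry.api.OpenTelemetry;
import io.opentelemetry.api.metrics.DoubleHistogram;
import io.opentelemetry.api.metrics.LongCounter;
import io.opentelemetry.api.metrics.Meter;

// Hypothetical sketch of recording the metrics listed in the table above.
public class ChatMetricsSketch {
    public static void main(String[] args) {
        // No-op instance for illustration; the real app would use the SDK bean.
        OpenTelemetry otel = OpenTelemetry.noop();
        Meter meter = otel.getMeter("chat-client-telemetry"); // meter name is an assumption

        // Instruments mirroring the metric names and types from the table.
        DoubleHistogram callDuration = meter.histogramBuilder("chat.client.call.duration")
                .setDescription("Duration of ChatClient call requests")
                .setUnit("s")
                .build();
        LongCounter promptTokens = meter.counterBuilder("chat.client.prompt.tokens.count").build();
        LongCounter generationTokens = meter.counterBuilder("chat.client.generation.tokens.count").build();
        LongCounter totalTokens = meter.counterBuilder("chat.client.total.tokens.count").build();

        // Illustrative values for one chat call.
        long prompt = 42;
        long generation = 128;
        callDuration.record(1.7);
        promptTokens.add(prompt);
        generationTokens.add(generation);
        totalTokens.add(prompt + generation); // total = prompt + generation tokens

        System.out.println(prompt + generation);
    }
}
```

With a real SDK (as configured by the `openTelemetry` bean above), these recordings would be exported to whatever metric exporter the autoconfigured SDK selects.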