Building LLM Applications in Java with Spring AI: A Developer's Guide
Introduction
In the rapidly evolving world of AI and Large Language Models (LLMs), Java developers often find themselves at a crossroads. While Python has been the dominant language for AI development, Java developers shouldn't feel left out. Spring AI brings the power of LLMs to the Java ecosystem, making it easier than ever to build AI-powered applications.
This article will guide you through building a Java application using Spring AI, focusing on key concepts like structured outputs and tool calling. We'll use a weather service application as our example, demonstrating how to integrate OpenAI and Anthropic models into your Java applications.
Why Spring AI for Java Developers?
```mermaid
graph TD
    A[Java Developer] --> B[Traditional AI Development]
    A --> C[Spring AI]
    B --> D[Python Ecosystem]
    B --> E[Complex Integration]
    C --> F[Familiar Spring Patterns]
    C --> G[Type-Safe APIs]
    C --> H[Enterprise-Ready]
```
For Java developers, working with AI traditionally meant:
- Learning Python
- Dealing with complex integration layers
- Managing multiple dependencies
- Handling type safety issues

Spring AI changes this by:
- Providing a familiar Spring-based development experience
- Offering type-safe APIs
- Supporting enterprise-grade features
- Integrating seamlessly with existing Java applications
Setting Up Your Project
First, let's look at the basic project structure and dependencies:
```xml
<properties>
    <java.version>17</java.version>
    <spring-ai.version>1.0.0-M7</spring-ai.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.ai</groupId>
        <artifactId>spring-ai-starter-model-anthropic</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.ai</groupId>
        <artifactId>spring-ai-starter-model-openai</artifactId>
    </dependency>
</dependencies>
```
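Because 1.0.0-M7 is a milestone release, the build typically also needs the Spring Milestones repository and the Spring AI BOM so that starter versions line up with the `spring-ai.version` property above. A sketch of the extra pom sections (verify the coordinates against the Spring AI documentation for your version):

```xml
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.springframework.ai</groupId>
            <artifactId>spring-ai-bom</artifactId>
            <version>${spring-ai.version}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

<repositories>
    <repository>
        <id>spring-milestones</id>
        <name>Spring Milestones</name>
        <url>https://repo.spring.io/milestone</url>
        <snapshots>
            <enabled>false</enabled>
        </snapshots>
    </repository>
</repositories>
```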
Configuring LLM Models
Spring AI supports multiple LLM providers. Here's how to configure them in your application.properties:
```properties
# OpenAI Configuration
spring.ai.openai.api-key=${OPENAI_API_KEY}
spring.ai.openai.chat.options.model=gpt-4-turbo-preview
spring.ai.openai.chat.options.temperature=0.1

# Anthropic Configuration
spring.ai.anthropic.api-key=${ANTHROPIC_API_KEY}
spring.ai.anthropic.chat.options.model=claude-3-5-sonnet-20241022
spring.ai.anthropic.chat.options.temperature=0.1
```
Structured Outputs: Making LLM Responses Type-Safe
One of the most powerful features of Spring AI is structured outputs. Instead of dealing with raw text responses, you can define Java classes that represent your expected output structure.
```java
// With Java 17, a record is the simplest option: it gives us an immutable
// carrier type with the accessor that JSON serialization and Spring AI's
// entity mapping need, without hand-written getters and setters.
public record Response(String content) {}
```
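Structured outputs are not limited to a single string field. A richer, hypothetical output type shows the idea: Spring AI derives a JSON schema from the record components, instructs the model to respond in that shape, and binds the reply back onto the record. The `WeatherReport` type and its field names below are illustrative, not part of the weather service above:

```java
import java.util.List;

// Hypothetical multi-field output type; the component names become keys in
// the JSON schema sent to the model, and the reply is bound back to them.
public record WeatherReport(String city,
                            double temperatureCelsius,
                            String conditions,
                            List<String> advisories) {}
```

It is consumed exactly like `Response`: `chatClient.prompt().user(query).call().entity(WeatherReport.class)`.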
Using structured outputs in your controller:
```java
@RestController
@RequestMapping("/api/chat")
public class ChatController {

    private final ChatClient chatClient;

    public ChatController(ChatClient chatClient) {
        this.chatClient = chatClient;
    }

    @GetMapping("/query")
    public Response getAnswer(@RequestParam String query) {
        return chatClient.prompt()
                .user(query)
                .call()
                .entity(Response.class);
    }
}
```
Tool Calling: Extending LLM Capabilities
Tool calling allows your LLM to interact with external services and APIs. Let's look at how to implement a weather service tool:
```java
@Service
public class WeatherService {

    @Tool(description = "Get the weather forecast for a given city")
    public Response getWeather(String city) {
        return new Response("Sunny with a chance of rain in " + city);
    }
}
```
Configuring the ChatClient with tools:
```java
@Configuration
public class ChatConfig {

    @Bean
    public ChatClient chatClient(ChatClient.Builder builder, WeatherService weatherService) {
        return builder
                .defaultTools(weatherService)
                .build();
    }
}
```
How It All Works Together
```mermaid
sequenceDiagram
    participant Client
    participant Controller
    participant ChatClient
    participant LLM
    participant WeatherService
    Client->>Controller: GET /api/chat/query?query=weather
    Controller->>ChatClient: prompt().user(query)
    ChatClient->>LLM: Send prompt
    LLM->>ChatClient: Request tool call
    ChatClient->>WeatherService: getWeather(city)
    WeatherService->>ChatClient: Return weather data
    ChatClient->>LLM: Send tool response
    LLM->>ChatClient: Return final response
    ChatClient->>Controller: Return Response object
    Controller->>Client: Return JSON response
```
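Stripped of the framework machinery, the tool-call step in the middle of that sequence is a lookup-and-invoke: the model names a tool and supplies arguments, and the client dispatches to the matching method and returns the result. A minimal plain-Java sketch of that idea (the names here are illustrative, not Spring AI API):

```java
import java.util.Map;
import java.util.function.Function;

public class ToolDispatch {
    // A toy registry standing in for the methods Spring AI discovers via
    // @Tool: the model names a tool, the client looks it up and runs it.
    static final Map<String, Function<String, String>> TOOLS = Map.of(
            "getWeather", city -> "Sunny with a chance of rain in " + city);

    static String dispatch(String toolName, String argument) {
        Function<String, String> tool = TOOLS.get(toolName);
        if (tool == null) {
            throw new IllegalArgumentException("Unknown tool: " + toolName);
        }
        return tool.apply(argument);
    }
}
```

Spring AI performs this dispatch for you and then feeds the result back to the model so it can compose the final answer.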
Best Practices and Tips
- Type Safety: Always use structured outputs to ensure type safety and better error handling
- Tool Descriptions: Write clear, descriptive tool documentation to help the LLM understand when to use them
- Error Handling: Implement proper error handling for tool calls and LLM responses
- Testing: Write comprehensive tests for both the LLM integration and tool implementations
- Monitoring: Use Spring's observability features to monitor LLM usage and performance
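The error-handling point deserves a concrete shape. Provider calls can fail transiently (rate limits, network timeouts), so it helps to wrap them in a small retry-with-fallback helper. This sketch is plain Java, independent of any Spring AI type; in practice you might use Spring Retry or Resilience4j instead:

```java
import java.util.function.Supplier;

public class RetryingCall {
    // Runs the supplier up to maxAttempts times, returning the fallback
    // value if every attempt throws (e.g. a transient provider error).
    public static <T> T callWithFallback(Supplier<T> call, int maxAttempts, T fallback) {
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return call.get();
            } catch (RuntimeException e) {
                // In a real application, log the failure here before retrying.
            }
        }
        return fallback;
    }
}
```

Wrapping the `chatClient...call()` chain in such a helper keeps a flaky provider from bubbling raw exceptions up to your API consumers.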
Conclusion
Spring AI brings the power of LLMs to Java developers in a familiar, type-safe way. By leveraging structured outputs and tool calling, you can build robust, enterprise-grade AI applications without leaving the Java ecosystem.
The example weather service application demonstrates how to:
- Configure multiple LLM providers
- Implement structured outputs
- Create and use tools
- Handle responses in a type-safe manner
With Spring AI, Java developers can now easily integrate AI capabilities into their applications while maintaining the benefits of the Java ecosystem.
Next Steps
- Explore more advanced features like:
  - Retrieval Augmented Generation (RAG)
  - Vector database integration
  - Multi-modal capabilities
- Check out the Spring AI documentation for more details
- Join the Spring AI community for support and updates