
Bedrock Converse API

The Amazon Bedrock Converse API provides a unified interface for conversational AI models, with enhanced capabilities including function/tool calling, multimodal inputs, and streaming responses.

The Bedrock Converse API has the following high-level features:

  • Tool/Function Calling: Support for function definitions and tool use during conversations

  • Multimodal Input: Ability to process both text and image inputs in conversations

  • Streaming Support: Real-time streaming of model responses

  • System Messages: Support for system-level instructions and context setting

The Bedrock Converse API provides a unified interface across multiple model providers while handling AWS-specific authentication and infrastructure concerns.

Following the Bedrock recommendations, Spring AI is transitioning to Amazon Bedrock’s Converse API for all chat conversation implementations. While the existing InvokeModel API supports conversational applications, we strongly recommend adopting the Converse API for several key benefits:

  • Unified Interface: Write your code once and use it with any supported Amazon Bedrock model

  • Model Flexibility: Seamlessly switch between different conversation models without code changes

  • Extended Functionality: Support for model-specific parameters through dedicated structures

  • Tool Support: Native integration with function calling and tool usage capabilities

  • Multimodal Capabilities: Built-in support for vision and other multimodal features

  • Future-Proof: Aligned with Amazon Bedrock’s recommended best practices

The Converse API does not support embedding operations, so these will remain in the current API; embedding model functionality in the existing InvokeModel API will continue to be maintained.

Prerequisites

Refer to Getting started with Amazon Bedrock for setting up API access.
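If you authenticate with static credentials, one common approach is to expose them as environment variables that the Spring configuration can reference. A sketch — the variable names follow the standard AWS SDK conventions; the placeholder values are, of course, yours to fill in:

```shell
# Standard AWS SDK environment variables, referenced later from application.properties
export AWS_ACCESS_KEY_ID="<your-access-key>"
export AWS_SECRET_ACCESS_KEY="<your-secret-key>"
# Only needed when using temporary credentials (e.g. STS or SSO sessions)
export AWS_SESSION_TOKEN="<your-session-token>"
```

IAM roles or AWS profiles are generally preferable to static keys in production.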

Auto-configuration

Add the spring-ai-bedrock-converse-spring-boot-starter dependency to your project’s Maven pom.xml or Gradle build.gradle build file:

Maven:

<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-bedrock-converse-spring-boot-starter</artifactId>
</dependency>

Gradle:

dependencies {
    implementation 'org.springframework.ai:spring-ai-bedrock-converse-spring-boot-starter'
}
Refer to the Dependency Management section to add the Spring AI BOM to your build file.
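With the BOM imported, the starter dependency above can omit its version. A Maven sketch, assuming the spring-ai-bom coordinates described in the Spring AI dependency-management documentation; substitute the version you are targeting for the `${spring-ai.version}` property:

```xml
<dependencyManagement>
    <dependencies>
        <!-- Pins versions for all org.springframework.ai artifacts -->
        <dependency>
            <groupId>org.springframework.ai</groupId>
            <artifactId>spring-ai-bom</artifactId>
            <version>${spring-ai.version}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
```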

Chat Properties

The prefix spring.ai.bedrock.aws is the property prefix to configure the connection to AWS Bedrock.

Property                            | Description                                  | Default
spring.ai.bedrock.aws.region        | AWS region to use.                           | us-east-1
spring.ai.bedrock.aws.timeout       | AWS timeout to use.                          | 5m
spring.ai.bedrock.aws.access-key    | AWS access key.                              | -
spring.ai.bedrock.aws.secret-key    | AWS secret key.                              | -
spring.ai.bedrock.aws.session-token | AWS session token for temporary credentials. | -

The prefix spring.ai.bedrock.converse.chat is the property prefix that configures the chat model implementation for the Converse API.

Property                                            | Description                                                              | Default
spring.ai.bedrock.converse.chat.enabled             | Enable the Bedrock Converse chat model.                                  | true
spring.ai.bedrock.converse.chat.options.model       | The model ID to use. See Supported models and model features.            | None. Select your modelId from the AWS Bedrock console.
spring.ai.bedrock.converse.chat.options.temperature | Controls the randomness of the output. Values can range over [0.0,1.0]. | 0.8
spring.ai.bedrock.converse.chat.options.top-p       | The maximum cumulative probability of tokens to consider when sampling.  | AWS Bedrock default
spring.ai.bedrock.converse.chat.options.top-k       | Number of token choices for generating the next token.                   | AWS Bedrock default
spring.ai.bedrock.converse.chat.options.max-tokens  | Maximum number of tokens in the generated response.                      | 500

Runtime Options

Use the portable ChatOptions or FunctionCallingOptions builders to create model configurations, such as temperature, maxTokens, topP, etc.

On start-up, the default options can be configured with the BedrockConverseProxyChatModel(api, options) constructor or the spring.ai.bedrock.converse.chat.options.* properties.

At run-time, you can override the default options by adding new, request-specific options to the Prompt call:

var options = FunctionCallingOptions.builder()
        .withModel("anthropic.claude-3-5-sonnet-20240620-v1:0")
        .withTemperature(0.6)
        .withMaxTokens(300)
        .withFunctionCallbacks(List.of(FunctionCallback.builder()
            .description("Get the weather in location. Return temperature in 36°F or 36°C format. Use multi-turn if needed.")
            .function("getCurrentWeather", new WeatherService())
            .inputType(WeatherService.Request.class)
            .build()))
        .build();

ChatResponse response = chatModel.call(new Prompt("What is current weather in Amsterdam?", options));
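For requests that do not involve tools, the portable ChatOptions builder alone is sufficient. A sketch, assuming the ChatOptionsBuilder helper available in the milestone releases — builder method names may differ in your Spring AI version:

```java
// Portable options: no Bedrock-specific types involved, so the same code
// keeps working if you later switch to a different chat model implementation.
ChatOptions runtimeOptions = ChatOptionsBuilder.builder()
        .withModel("anthropic.claude-3-5-sonnet-20240620-v1:0")
        .withTemperature(0.6)
        .withMaxTokens(300)
        .build();

ChatResponse response = chatModel.call(
        new Prompt("What is the current weather in Amsterdam?", runtimeOptions));
```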

Tool/Function Calling

The Bedrock Converse API supports function calling capabilities, allowing models to use tools during conversations. Here’s an example of how to define and use functions:

@Bean
@Description("Get the weather in location. Return temperature in 36°F or 36°C format.")
public Function<Request, Response> weatherFunction() {
    return new MockWeatherService();
}

String response = ChatClient.create(this.chatModel)
        .prompt("What's the weather like in Boston?")
        .function("weatherFunction")
        .call()
        .content();
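The example above assumes a MockWeatherService along with its Request and Response types, none of which are defined in this document. A minimal, hypothetical sketch of what they might look like — Spring AI derives the tool’s JSON input schema from the Request type and serializes the Response back to the model:

```java
import java.util.function.Function;

// Hypothetical backing implementation for the weatherFunction bean above.
class MockWeatherService
        implements Function<MockWeatherService.Request, MockWeatherService.Response> {

    // Field names become the tool's input/output JSON properties.
    public record Request(String location) {}
    public record Response(double temperature, String unit) {}

    @Override
    public Response apply(Request request) {
        // A real service would call a weather API; here we return a fixed value.
        return new Response(30.0, "C");
    }
}
```

When the model decides the tool is needed, Spring AI invokes the function with the model-generated Request and feeds the Response back into the conversation.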

Sample Controller

Create a new Spring Boot project and add the spring-ai-bedrock-converse-spring-boot-starter to your dependencies.

Add an application.properties file under src/main/resources:

spring.ai.bedrock.aws.region=eu-central-1
spring.ai.bedrock.aws.timeout=10m
spring.ai.bedrock.aws.access-key=${AWS_ACCESS_KEY_ID}
spring.ai.bedrock.aws.secret-key=${AWS_SECRET_ACCESS_KEY}
# session token is only required for temporary credentials
spring.ai.bedrock.aws.session-token=${AWS_SESSION_TOKEN}

spring.ai.bedrock.converse.chat.options.temperature=0.8
spring.ai.bedrock.converse.chat.options.top-k=15

Here’s an example controller using the chat model:

@RestController
public class ChatController {

    private final ChatClient chatClient;

    @Autowired
    public ChatController(ChatClient.Builder builder) {
        this.chatClient = builder.build();
    }

    @GetMapping("/ai/generate")
    public Map<String, String> generate(@RequestParam(value = "message", defaultValue = "Tell me a joke") String message) {
        return Map.of("generation", this.chatClient.prompt(message).call().content());
    }

    @GetMapping("/ai/generateStream")
    public Flux<String> generateStream(@RequestParam(value = "message", defaultValue = "Tell me a joke") String message) {
        return this.chatClient.prompt(message).stream().content();
    }
}