This is part of a series of articles introducing Generative AI and a step-by-step process, with code samples, to build a GenAI application: Part-1 (GenAI Introduction), Part-2 (Building a GenAI App using Spring Boot and Amazon Bedrock) and Part-3 (Building a GenAI App with Proprietary Data).
The previous article (Introduction to GenAI) explained important GenAI concepts, model architecture and vector databases. In this article, we are going to develop a simple GenAI-based chatbot using Spring Boot and Amazon Bedrock. We'll explore an AWS SDK and REST API based integration rather than the Spring AI framework. We'll also cover troubleshooting scenarios you may face while building this application.
Prerequisites
- Spring Boot v3, JDK 17
- AWS SDK v2.x
- AWS IAM account access and API credentials. Make sure the "AmazonBedrockFullAccess" permission is granted.
- Amazon Bedrock FM (Anthropic Claude 3 Haiku). This model supports image-to-text, chat and conversation features.
Enable Amazon Bedrock
Amazon Bedrock includes a wide range of models (aka Foundation Models, FMs) for various use cases, provided by Amazon and other vendors: Titan (by Amazon), Claude (by Anthropic), Llama (by Meta), etc. Access to each model needs to be requested (with your use case and company details) and is processed for approval (usually takes a few minutes).
- Access Amazon Bedrock Dashboard from your AWS Account
- Go to Bedrock Configuration - Model Access
- Select relevant model (Claude 3 Haiku) and place access request with necessary details
Amazon Bedrock Playground
Playground is a very helpful tool in the Bedrock Dashboard to experiment with models using prompts and request parameters. The "View API request" option shows the structure of the request parameters.
- Go to Playground
- Select Approved Model
- "Load Example" or Experiment with parameters / configuration like Max Response Length, Stop Sequence, Randomness etc.
- Select "View API request" option from the Top Menu
Make sure to refer to "Model inference parameters and response" for your selected FM version (this varies across versions, so be careful!). For this blog, refer to Anthropic Claude 3 Haiku - Request and Response.
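As an illustration (the query text and values below are placeholders, not copied from the console), the request body for Claude 3 Haiku follows the Anthropic Messages API format and looks roughly like this:
{
  "anthropic_version": "bedrock-2023-05-31",
  "max_tokens": 200,
  "temperature": 0.5,
  "messages": [
    {
      "role": "user",
      "content": [
        { "type": "text", "text": "Who is Mr. Bean?" }
      ]
    }
  ]
}
This is the same structure we will build programmatically in the service class later in this article.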
Spring Boot Application
Now, let's build our Spring Boot application (chatbot), which exposes a REST API to accept a user's query (e.g. "Who is Mr. Bean"). The query is sent to Amazon Bedrock as the FM prompt, and the returned response is sent back through the API (displayed in the browser).
Include dependencies for the AWS SDK, Amazon Bedrock Runtime and a JSON library in pom.xml, for example as sketched below.
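A minimal sketch of the relevant pom.xml entries follows; the version numbers are examples only, so check Maven Central and the AWS SDK BOM for current releases:
<!-- Illustrative snippet: manage the AWS SDK version via the BOM -->
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>software.amazon.awssdk</groupId>
            <artifactId>bom</artifactId>
            <version>2.25.31</version> <!-- example version; use the latest 2.x -->
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
<dependencies>
    <!-- Amazon Bedrock Runtime client (InvokeModel API) -->
    <dependency>
        <groupId>software.amazon.awssdk</groupId>
        <artifactId>bedrockruntime</artifactId>
    </dependency>
    <!-- org.json (JSONObject / JSONArray) used to build and parse payloads -->
    <dependency>
        <groupId>org.json</groupId>
        <artifactId>json</artifactId>
        <version>20240303</version> <!-- example version -->
    </dependency>
</dependencies>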
Set the AWS SDK API credentials in the application properties:
my.cloud.aws.credentials.access-key=YOUR_ACCESS_KEY
my.cloud.aws.credentials.secret-key=YOUR_SECRET
my.cloud.aws.region.static=YOUR_REGION (e.g. ap-south-1)
Create an AWS configuration class to initialize the Amazon Bedrock Runtime client with a credentials provider built from the properties defined above.
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.AwsCredentialsProvider;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.bedrockruntime.BedrockRuntimeClient;

@Configuration
public class AwsConfig {

    @Value("${my.cloud.aws.credentials.access-key}")
    private String accessKey;

    @Value("${my.cloud.aws.credentials.secret-key}")
    private String secretKey;

    @Value("${my.cloud.aws.region.static}")
    private String region;

    // Static credentials provider built from the properties above
    public AwsCredentialsProvider awsCredentialsProvider() {
        return StaticCredentialsProvider.create(
                AwsBasicCredentials.create(accessKey, secretKey));
    }

    @Bean
    public BedrockRuntimeClient bedrockClient() {
        return BedrockRuntimeClient.builder()
                .region(Region.of(region))
                .credentialsProvider(awsCredentialsProvider())
                .build();
    }
}
Create a ChatModelWrapper service which invokes the model using the Amazon Bedrock Runtime client. The invoke method sends a JSON payload (the encoded query plus configuration); the response JSON object is then parsed to retrieve the response message.
import java.util.List;

import org.json.JSONArray;
import org.json.JSONObject;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import software.amazon.awssdk.core.SdkBytes;
import software.amazon.awssdk.services.bedrockruntime.BedrockRuntimeClient;
import software.amazon.awssdk.services.bedrockruntime.model.InvokeModelRequest;
import software.amazon.awssdk.services.bedrockruntime.model.InvokeModelResponse;

@Service
public class ChatModelWrapper {

    // Amazon Bedrock - Base Models - select your model and copy its "Model ID"
    private final String MODEL_ID = "anthropic.claude-3-haiku-20240307-v1:0";
    private final String QUERY_PATTERN = "Human: %s\n\nAssistant:";

    @Autowired
    private BedrockRuntimeClient bedrockClient;

    public String process(String query) {
        String encodedQuery = String.format(QUERY_PATTERN, query);
        return invokeModel(encodedQuery);
    }

    private String invokeModel(String query) {
        // Create model payload (Anthropic Messages API format)
        JSONObject obj = new JSONObject();
        obj.put("anthropic_version", "bedrock-2023-05-31")     // Refer to the AWS document
                .put("max_tokens", 200)                        // Max response size (tokens)
                .put("temperature", 0.5)                       // Randomness of response (max 1.0)
                .put("stop_sequences", List.of("\n\nHuman:")); // Sequences that stop generation
        JSONObject prompt = new JSONObject().put("type", "text").put("text", query); // Query
        JSONObject message = new JSONObject()
                .put("role", "user")                           // "user" role
                .put("content", List.of(prompt));
        obj.put("messages", List.of(message));
        String payload = obj.toString();
        System.out.println("__payload: " + payload);

        // Invoke model
        InvokeModelRequest request = InvokeModelRequest.builder()
                .body(SdkBytes.fromUtf8String(payload))
                .modelId(MODEL_ID)
                .contentType("application/json")
                .accept("application/json")
                .build();
        InvokeModelResponse response = bedrockClient.invokeModel(request);
        JSONObject responseBody = new JSONObject(response.body().asUtf8String());
        System.out.println("__response: " + responseBody.toString());

        // Parse response object
        JSONArray contentArray = responseBody.getJSONArray("content");
        String responseMessage = null;
        if (!contentArray.isEmpty()) {
            responseMessage = contentArray.getJSONObject(0).getString("text");
        }
        return responseMessage;
    }
}
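For reference, the Claude 3 (Messages API) response body returned by InvokeModel looks roughly like the following (field values are illustrative), which is why the code above reads the text of the first element of the content array:
{
  "id": "msg_...",
  "type": "message",
  "role": "assistant",
  "content": [
    { "type": "text", "text": "Mr. Bean is a fictional character..." }
  ],
  "model": "anthropic.claude-3-haiku-20240307-v1:0",
  "stop_reason": "end_turn",
  "stop_sequence": null,
  "usage": { "input_tokens": 12, "output_tokens": 87 }
}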
Create a REST Controller with a GET endpoint (/ai/chat) to accept the query message from the user, and call the ChatModelWrapper service to invoke the model and retrieve the response.
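A minimal sketch of such a controller is shown below; the class name and the "query" request parameter name are illustrative assumptions, not taken from the original repository:
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ChatController {

    @Autowired
    private ChatModelWrapper chatModelWrapper;

    // e.g. GET /ai/chat?query=who%20is%20mr.%20bean (parameter name assumed here)
    @GetMapping("/ai/chat")
    public String chat(@RequestParam("query") String query) {
        return chatModelWrapper.process(query);
    }
}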
Now, go to the browser and hit the REST API (/ai/chat) with your query ("what is your name", "who is mr. bean" or any crazy thing!) and enjoy the magic!
The application source code and sample API request/response payloads are shared on GitHub.
Troubleshooting
While accessing the Amazon Bedrock SDK, you may face the following errors:
- AccessDeniedException: User: YOUR_IAM_USER is not authorized to perform: bedrock:InvokeModel... (Solution: check the IAM user's permissions and make sure the "AmazonBedrockFullAccess" permission is added/granted.)
- ValidationException: "YOUR_MODEL_ID" is not supported on this API. Please use the Messages API instead... (Solution: correct the request parameter format for the mentioned model. Refer to the AWS documentation on the Messages API or the request parameters for your model.)