Architecture to AWS CloudFormation code using Anthropic’s Claude 3 on Amazon Bedrock

Anthropic’s Claude 3 family of models, available on Amazon Bedrock, offers multimodal capabilities that enable the processing of images and text. This opens up innovative avenues for image understanding, wherein Anthropic’s Claude 3 models can analyze visual information in conjunction with textual data, facilitating more comprehensive and contextual interpretations. By taking advantage of this multimodal prowess, we can ask the model questions like “What objects are in the image, and how are they positioned relative to each other?” We can also gain an understanding of data presented in charts and graphs by asking questions related to business intelligence (BI) tasks, such as “What is the sales trend for 2023 for company A in the enterprise market?” These are just some examples of the additional richness Anthropic’s Claude 3 brings to generative artificial intelligence (AI) interactions.

Architecting specific AWS Cloud solutions involves creating diagrams that show relationships and interactions between different services. Instead of building the code manually, you can use Anthropic’s Claude 3’s image analysis capabilities to generate AWS CloudFormation templates by passing an architecture diagram as input.

In this post, we explore some ways you can use Anthropic’s Claude 3 Sonnet’s vision capabilities to accelerate the process of moving from architecture to the prototype stage of a solution.

Use cases for architecture to code

The following are relevant use cases for this solution:

  • Converting whiteboarding sessions to AWS infrastructure – To quickly prototype your designs, you can take the architecture diagrams created during whiteboarding sessions and generate a first draft of a CloudFormation template. You can then iterate on the CloudFormation template to develop a well-architected solution that meets all your requirements.
  • Fast deployment of architecture diagrams – You can generate boilerplate CloudFormation templates by using architecture diagrams you find on the web. This allows you to experiment quickly with new designs.
  • Streamlined AWS infrastructure design through collaborative diagramming – You might draw architecture diagrams on a diagramming tool during an all-hands meeting. These raw diagrams can generate boilerplate CloudFormation templates, quickly leading to actionable steps while speeding up collaboration and increasing meeting value.

Solution overview

To demonstrate the solution, we use Streamlit to provide an interface for diagrams and prompts. Amazon Bedrock invokes Anthropic’s Claude 3 Sonnet model, which provides multimodal capabilities. AWS Fargate is the compute engine for the web application. The following diagram illustrates the step-by-step process.

Architecture Overview

The workflow consists of the following steps:

  1. The user uploads an architecture image (JPEG or PNG) to the Streamlit application, which invokes the Amazon Bedrock API to generate a step-by-step explanation of the architecture using Anthropic’s Claude 3 Sonnet model (a sketch of this call follows the workflow notes).
  2. Anthropic’s Claude 3 Sonnet model is then invoked with the step-by-step explanation and few-shot learning examples to generate the initial CloudFormation code. The few-shot learning examples consist of three CloudFormation templates, which help the model understand the writing practices associated with CloudFormation code.
  3. The user manually provides instructions using the chat interface to update the initial CloudFormation code.

*Steps 1 and 2 run once, when the architecture diagram is uploaded. To trigger changes to the AWS CloudFormation code (step 3), provide update instructions from the Streamlit app.
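
The following is a minimal sketch of what the step 1 invocation might look like, assuming boto3 and the Anthropic Messages API request format on Amazon Bedrock. The helper name, prompt wording, and region are illustrative assumptions, not the application’s actual code.

import base64
import json
import boto3

# Minimal sketch of step 1: pass the uploaded diagram and a prompt to Anthropic's
# Claude 3 Sonnet through the Amazon Bedrock Runtime API. Helper name, prompt
# wording, and region are assumptions.
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

def explain_architecture(image_path: str) -> str:
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("utf-8")

    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 2048,
        "temperature": 0,
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "image",
                     "source": {"type": "base64",
                                "media_type": "image/png",
                                "data": image_b64}},
                    {"type": "text",
                     "text": "Explain this AWS architecture diagram step by step."},
                ],
            }
        ],
    }
    response = bedrock_runtime.invoke_model(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",
        body=json.dumps(body),
    )
    return json.loads(response["body"].read())["content"][0]["text"]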

The CloudFormation templates generated by the web application are intended for inspiration purposes and not for production-level applications. It is the responsibility of a developer to test and verify the CloudFormation template according to security guidelines.

Few-shot prompting

To help Anthropic’s Claude 3 Sonnet understand the practices of writing CloudFormation code, we use few-shot prompting by providing three CloudFormation templates as reference examples in the prompt. Exposing Anthropic’s Claude 3 Sonnet to multiple CloudFormation templates will allow it to analyze and learn from the structure, resource definitions, parameter configurations, and other essential elements consistently implemented across your organization’s templates. This enables Anthropic’s Claude 3 Sonnet to grasp your team’s coding conventions, naming conventions, and organizational patterns when generating CloudFormation templates. The following examples used for few-shot learning can be found in the GitHub repo.

Few-shot prompting example 1

Few-shot prompting example 2

Few-shot prompting example 3

Furthermore, Anthropic’s Claude 3 Sonnet can observe how different resources and services are configured and integrated within the CloudFormation templates through few-shot prompting. It will gain insights into how to automate the deployment and management of various AWS resources, such as Amazon Simple Storage Service (Amazon S3), AWS Lambda, Amazon DynamoDB, and AWS Step Functions.

Inference parameters are preset, but they can be changed from the web application if desired. We recommend experimenting with various combinations of these parameters. By default, we set the temperature to zero to reduce the variability of outputs and create focused, syntactically correct code.
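
As a rough sketch of how the few-shot examples and the default inference parameters might come together, consider the following; the tag format, prompt wording, and parameter values are assumptions rather than the repository’s exact prompt.

# Assumed sketch of how the three reference templates and the step-by-step
# explanation might be combined into a few-shot prompt.
def build_few_shot_prompt(explanation: str, example_templates: list[str]) -> str:
    examples = "\n\n".join(
        f"<example_template>\n{template}\n</example_template>"
        for template in example_templates
    )
    return (
        f"{examples}\n\n"
        "Following the structure and conventions of the example templates above, "
        "write a CloudFormation template for this architecture description:\n\n"
        f"{explanation}"
    )

# Default inference parameters; temperature 0 keeps the output focused and
# syntactically consistent, as noted above.
INFERENCE_PARAMS = {"temperature": 0, "top_p": 1, "max_tokens": 4096}

The resulting prompt and parameters can then be sent with the same invoke_model pattern shown earlier.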

Prerequisites

To access Anthropic’s Claude 3 Sonnet foundation model (FM), you must request access through the Amazon Bedrock console. For instructions, see Manage access to Amazon Bedrock foundation models. After requesting access to Anthropic’s Claude 3 Sonnet, you can deploy the following development.yaml CloudFormation template to provision the infrastructure for the demo. For instructions on how to deploy this sample, refer to the GitHub repo. Use the following table to launch the CloudFormation template and quickly deploy the sample in either us-east-1 or us-west-2.

Region Stack
us-east-1 development.yaml
us-west-2 development.yaml

When deploying the template, you can optionally specify the Amazon Bedrock model ID to use for inference, which lets you choose the model that best suits your needs. By default, the template uses Anthropic’s Claude 3 Sonnet model. If you prefer a different model, pass its Amazon Bedrock model ID as a parameter during deployment. Verify beforehand that you have requested access to the desired model and that it has the vision capabilities required for your use case.
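
For example, a programmatic deployment with an explicit model ID might look like the following boto3 sketch. The stack name and the BedrockModelId parameter key are hypothetical; check the Parameters section of development.yaml in the GitHub repo for the actual names.

# Assumed sketch of deploying the demo stack with an explicit model ID.
import boto3

cloudformation = boto3.client("cloudformation", region_name="us-east-1")

with open("development.yaml") as f:
    template_body = f.read()

cloudformation.create_stack(
    StackName="architecture-to-code-demo",      # arbitrary example name
    TemplateBody=template_body,
    Parameters=[
        {"ParameterKey": "BedrockModelId",      # hypothetical parameter key
         "ParameterValue": "anthropic.claude-3-sonnet-20240229-v1:0"},
    ],
    Capabilities=["CAPABILITY_IAM", "CAPABILITY_NAMED_IAM"],
)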

After you launch the CloudFormation stack, navigate to the stack’s Outputs tab on the AWS CloudFormation console and collect the Amazon CloudFront URL. Enter the URL in your browser to view the web application.

Web application screenshot

In this post, we discuss CloudFormation template generation for three different samples. You can find the sample architecture diagrams in the GitHub repo. These samples are similar to the few-shot learning examples, which is intentional. As an enhancement to this architecture, you can employ a Retrieval Augmented Generation (RAG)-based approach to retrieve relevant CloudFormation templates from a knowledge base to dynamically augment the prompt.

Due to the non-deterministic behavior of the large language model (LLM), you might not get the same response as shown in this post.

Let’s generate CloudFormation templates for the following sample architecture diagram.

Sample Architecture for CloudFormation generation

Uploading the preceding architecture diagram to the web application generates a step-by-step explanation of the diagram using Anthropic’s Claude 3 Sonnet’s vision capabilities.

Web application output screenshot

Let’s analyze the step-by-step explanation. The generated response is divided into three parts:

  1. The context explains what the architecture diagram depicts.
  2. The architecture diagram’s flow gives the order in which AWS services are invoked and their relationship with each other.
  3. The summary recaps the entire generated response.

In the following step-by-step explanation, we see a few highlighted errors.

Step-by-step explanation errors

The step-by-step explanation is augmented with few-shot learning examples to develop an initial CloudFormation template. Let’s analyze the initial CloudFormation template:

AWSTemplateFormatVersion: '2010-09-09'
Description: >
  This CloudFormation stack sets up a serverless data processing pipeline triggered by file uploads to an S3 bucket.
  It uses AWS Lambda to process the uploaded files, and Amazon SNS to send notifications upon successful processing.
  This template is not production ready and should only be used for inspiration
Parameters:
  S3BucketName:
    Type: String
    Description: Name of the S3 bucket for file uploads
    AllowedPattern: ^[a-z0-9][a-z0-9-]*[a-z0-9]$
    MinLength: 1
    MaxLength: 63

  EmailAddress:
    Type: String
    Description: Email address to receive notifications
    AllowedPattern: ^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$

Resources:

  # S3 Bucket
  S3Bucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Ref S3BucketName
      NotificationConfiguration:
        LambdaConfigurations:
          - Event: 's3:ObjectCreated:*'
            Function: !GetAtt ProcessingLambda.Arn

  # SNS Topic
  SNSTopic:
    Type: AWS::SNS::Topic
    Properties:
      Subscription:
        - Endpoint: !Ref EmailAddress
          Protocol: email

  # Lambda Function
  ProcessingLambda:
    Type: AWS::Lambda::Function
    Properties:
      FunctionName: ProcessingLambda
      Runtime: python3.9
      Handler: index.lambda_handler
      Role: !GetAtt LambdaRole.Arn
      Code:
        ZipFile: |
          import boto3

          def lambda_handler(event, context):
              s3 = boto3.client('s3')
              sns = boto3.client('sns')

              # Process the uploaded file
              for record in event['Records']:
                  bucket_name = record['s3']['bucket']['name']
                  object_key = record['s3']['object']['key']

                  # Process the file data
                  # ...

              # Send notification upon successful processing
              sns.publish(
                  TopicArn=!Ref SNSTopic,
                  Message='File processing completed successfully',
                  Subject='Data Processing Notification'
              )

              return {
                  'statusCode': 200,
                  'body': 'File processing completed successfully'
              }

  # Lambda Role
  LambdaRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Principal:
              Service: lambda.amazonaws.com
            Action: 'sts:AssumeRole'
      ManagedPolicyArns:
        - arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
      Policies:
        - PolicyName: S3Access
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Effect: Allow
                Action:
                  - 's3:GetObject'
                Resource: !Join ['', ['arn:aws:s3:::', !Ref S3BucketName, '/*']]
        - PolicyName: SNSPublish
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Effect: Allow
                Action:
                  - 'sns:Publish'
                Resource: !Ref SNSTopic

  # Lambda Permissions
  LambdaPermission:
    Type: AWS::Lambda::Permission
    Properties:
      FunctionName: !GetAtt ProcessingLambda.Arn
      Action: 'lambda:InvokeFunction'
      Principal: s3.amazonaws.com
      SourceAccount: !Ref AWS::AccountId
      SourceArn: !Join ['', ['arn:aws:s3:::', !Ref S3BucketName]]

Outputs:

  S3BucketName:
    Description: Name of the S3 bucket for file uploads
    Value: !Ref S3Bucket
    Export:
      Name: !Sub '${AWS::StackName}-S3BucketName'

  LambdaFunctionArn:
    Description: ARN of the Lambda function
    Value: !GetAtt ProcessingLambda.Arn
    Export:
      Name: !Sub '${AWS::StackName}-LambdaFunctionArn'

  SNSTopicArn:
    Description: ARN of the SNS topic for notifications
    Value: !Ref SNSTopic
    Export:
      Name: !Sub '${AWS::StackName}-SNSTopicArn'

After analyzing the CloudFormation template, we see that the Lambda function’s inline code refers to the Amazon Simple Notification Service (Amazon SNS) topic using !Ref SNSTopic, which is not valid: the !Ref shorthand isn’t resolved inside the inline code block, so the generated Python would fail. We also want to add functionality to the template. First, we want to filter the Amazon S3 notification configuration to invoke Lambda only when *.csv files are uploaded. Second, we want to add metadata to the CloudFormation template. To do this, we use the chat interface to give the following update instructions to the web application:

Make the following updates:

Use environment variables for AWS Lambda to access SNS Topic ARN.

Add filter to S3 notification configuration to only invoke AWS lambda when *.csv files are uploaded

Add metadata to CloudFormation template

Chat interface web application screenshot
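
Under the hood, such an update could be expressed as an additional user turn in the conversation with the model. The following is an assumed sketch; the function and variable names are illustrative, not the application’s actual code.

# Assumed sketch of sending update instructions as a new user turn so the model
# revises the template it returned earlier.
import json
import boto3

bedrock_runtime = boto3.client("bedrock-runtime")

def update_template(few_shot_prompt: str, current_template: str, instructions: str) -> str:
    messages = [
        # Original request that produced the first draft
        {"role": "user", "content": [{"type": "text", "text": few_shot_prompt}]},
        # The model's previous answer, replayed as the assistant turn
        {"role": "assistant", "content": [{"type": "text", "text": current_template}]},
        # The user's update instructions from the chat interface
        {"role": "user", "content": [{"type": "text", "text": instructions}]},
    ]
    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 4096,
        "temperature": 0,
        "messages": messages,
    }
    response = bedrock_runtime.invoke_model(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",
        body=json.dumps(body),
    )
    return json.loads(response["body"].read())["content"][0]["text"]

In this flow, the three update instructions above would be passed as the final user turn.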

The updated CloudFormation template is as follows:

AWSTemplateFormatVersion: '2010-09-09'
Description: >
  This CloudFormation stack sets up a serverless data processing pipeline triggered by file uploads to an S3 bucket.
  It uses AWS Lambda to process the uploaded files, and Amazon SNS to send notifications upon successful processing.
  This template is not production ready and should only be used for inspiration.
Metadata:
  'AWS::CloudFormation::Interface':
    ParameterGroups:
      - Label:
          default: 'S3 Bucket Configuration'
        Parameters:
          - S3BucketName
      - Label:
          default: 'Notification Configuration'
        Parameters:
          - EmailAddress

Parameters:
  S3BucketName:
    Type: String
    Description: Name of the S3 bucket for file uploads
    AllowedPattern: ^[a-z0-9][a-z0-9-]*[a-z0-9]$
    MinLength: 1
    MaxLength: 63

  EmailAddress:
    Type: String
    Description: Email address to receive notifications
    AllowedPattern: ^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$

Resources:

  # S3 Bucket
  S3Bucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Ref S3BucketName
      NotificationConfiguration:
        LambdaConfigurations:
          - Event: 's3:ObjectCreated:*'
            Function: !GetAtt ProcessingLambda.Arn
            Filter:
              S3Key:
                Rules:
                  - Name: suffix
                    Value: .csv

  # SNS Topic
  SNSTopic:
    Type: AWS::SNS::Topic
    Properties:
      Subscription:
        - Endpoint: !Ref EmailAddress
          Protocol: email

  # Lambda Function
  ProcessingLambda:
    Type: AWS::Lambda::Function
    Properties:
      FunctionName: ProcessingLambda
      Runtime: python3.9
      Handler: index.lambda_handler
      Role: !GetAtt LambdaRole.Arn
      Environment:
        Variables:
          SNS_TOPIC_ARN: !Ref SNSTopic
      Code:
        ZipFile: |
          import boto3
          import os

          def lambda_handler(event, context):
              s3 = boto3.client('s3')
              sns = boto3.client('sns')
              sns_topic_arn = os.environ['SNS_TOPIC_ARN']

              # Process the uploaded file
              for record in event['Records']:
                  bucket_name = record['s3']['bucket']['name']
                  object_key = record['s3']['object']['key']

                  # Process the file data
                  # ...

              # Send notification upon successful processing
              sns.publish(
                  TopicArn=sns_topic_arn,
                  Message='File processing completed successfully',
                  Subject='Data Processing Notification'
              )

              return {
                  'statusCode': 200,
                  'body': 'File processing completed successfully'
              }

  # Lambda Role
  LambdaRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Principal:
              Service: lambda.amazonaws.com
            Action: 'sts:AssumeRole'
      ManagedPolicyArns:
        - arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
      Policies:
        - PolicyName: S3Access
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Effect: Allow
                Action:
                  - 's3:GetObject'
                Resource: !Join ['', ['arn:aws:s3:::', !Ref S3BucketName, '/*']]
        - PolicyName: SNSPublish
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Effect: Allow
                Action:
                  - 'sns:Publish'
                Resource: !Ref SNSTopic

  # Lambda Permissions
  LambdaPermission:
    Type: AWS::Lambda::Permission
    Properties:
      FunctionName: !GetAtt ProcessingLambda.Arn
      Action: 'lambda:InvokeFunction'
      Principal: s3.amazonaws.com
      SourceAccount: !Ref AWS::AccountId
      SourceArn: !Join ['', ['arn:aws:s3:::', !Ref S3BucketName]]

Outputs:

  S3BucketName:
    Description: Name of the S3 bucket for file uploads
    Value: !Ref S3Bucket
    Export:
      Name: !Sub '${AWS::StackName}-S3BucketName'

  LambdaFunctionArn:
    Description: ARN of the Lambda function
    Value: !GetAtt ProcessingLambda.Arn
    Export:
      Name: !Sub '${AWS::StackName}-LambdaFunctionArn'

  SNSTopicArn:
    Description: ARN of the SNS topic for notifications
    Value: !Ref SNSTopic
    Export:
      Name: !Sub '${AWS::StackName}-SNSTopicArn'

Additional examples

We have provided two more sample diagrams, their associated CloudFormation code generated by Anthropic’s Claude 3 Sonnet, and the prompts used to create them. You can see how diagrams in various forms, from digital to hand-drawn, or some combination, can be used. The end-to-end analysis of these samples can be found at sample 2 and sample 3 on the GitHub repo.

Best practices for architecture to code

In the demonstrated use case, you can observe how well Anthropic’s Claude 3 Sonnet model pulls details and relationships between services from an architecture image. The following are some ways you can improve the performance of Anthropic’s Claude in this use case:

  • Implement a multimodal RAG approach to enhance the application’s ability to handle a wider variety of complex architecture diagrams, because the current implementation is limited to diagrams similar to the provided static examples.
  • Enhance the architecture diagrams by incorporating visual cues and features, such as labeling services, indicating orchestration hierarchy levels, grouping related services at the same level, enclosing services within clear boxes, and labeling arrows to represent the flow between services. These additions will aid in better understanding and interpreting the diagrams.
  • If the application generates an invalid CloudFormation template, provide the error as update instructions. This will help the model understand the mistake and make a correction (a sketch of this feedback loop follows this list).
  • Use Anthropic’s Claude 3 Opus or Anthropic’s Claude 3.5 Sonnet for greater performance on long contexts and near-perfect recall.
  • With careful design and management, orchestrate agentic workflows by using Agents for Amazon Bedrock. This enables you to incorporate self-reflection, tool use, and planning within your workflow to generate more relevant CloudFormation templates.
  • Use Amazon Bedrock Prompt Flows to accelerate the creation, testing, and deployment of workflows through an intuitive visual interface. This can reduce development effort and accelerate workflow testing.

Clean up

To clean up the resources used in this demo, complete the following steps:

  1. On the AWS CloudFormation console, choose Stacks in the navigation pane.
  2. Select the deployed development.yaml stack and choose Delete (or delete the stack programmatically, as sketched below).
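
The following is an assumed sketch of deleting the stack programmatically; the stack name matches the hypothetical one used in the earlier deployment sketch, so substitute your actual stack name.

# Assumed sketch: delete the demo stack without using the console.
import boto3

cloudformation = boto3.client("cloudformation", region_name="us-east-1")
cloudformation.delete_stack(StackName="architecture-to-code-demo")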

Conclusion

With the pattern demonstrated in this post, developers can translate their architectural visions into reality by simply sketching their desired cloud solutions. Anthropic’s Claude 3 Sonnet’s advanced image understanding capabilities analyze these diagrams and generate boilerplate CloudFormation code, minimizing the need for initial complex coding tasks. This visually driven approach empowers developers of all skill levels, fostering collaboration, rapid prototyping, and accelerated innovation.

You can investigate other patterns, such as including RAG and agentic workflows, to improve the accuracy of code generation. You can also explore customizing the LLM by fine-tuning it to write CloudFormation code with greater flexibility.

Now that you have seen Anthropic’s Claude 3 Sonnet in action, try designing your own architecture diagrams using some of the best practices to take your prototyping to the next level.

About the Authors

Eashan Kaushik is an Associate Solutions Architect at Amazon Web Services. He is driven by creating cutting-edge generative AI solutions while prioritizing a customer-centric approach to his work. Before this role, he obtained an MS in Computer Science from NYU Tandon School of Engineering. Outside of work, he enjoys sports, lifting, and running marathons.

Chris Pecora is a Generative AI Data Scientist at Amazon Web Services. He is passionate about building innovative products and solutions while also focusing on customer-obsessed science. When not running experiments and keeping up with the latest developments in generative AI, he loves spending time with his kids.
