Enabling generative AI self-service using Amazon Lex, Amazon Bedrock, and ServiceNow
In this post, we show how you can integrate Amazon Lex with Amazon Bedrock Knowledge Bases and ServiceNow to provide 24/7 automated support and self-service options.
Chat-based assistants have become an invaluable tool for providing automated customer service and support. This post builds on a previous post, Integrate QnABot on AWS with ServiceNow, and explores how to build an intelligent assistant using Amazon Lex, Amazon Bedrock Knowledge Bases, and a custom ServiceNow integration to create an automated incident management support experience.
Amazon Lex is powered by the same deep learning technologies used in Alexa. With it, developers can quickly build conversational interfaces that can understand natural language, engage in realistic dialogues, and fulfill customer requests. Amazon Lex can be configured to respond to customer questions using Amazon Bedrock foundation models (FMs) to search and summarize FAQ responses. Amazon Bedrock Knowledge Bases provides the capability of amassing data sources into a repository of information. Using knowledge bases, you can effortlessly create an application that uses Retrieval Augmented Generation (RAG), a technique where the retrieval of information from data sources enhances the generation of model responses.
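For example, once a knowledge base has been created, a RAG query can be issued through the Amazon Bedrock agent runtime API. The following is a minimal sketch using boto3; the knowledge base ID and model ARN are placeholders you would replace with values from your own account.

```python
import boto3

# Runtime client for querying an existing Amazon Bedrock knowledge base
bedrock_agent_runtime = boto3.client("bedrock-agent-runtime")

def ask_knowledge_base(question: str, kb_id: str, model_arn: str) -> str:
    """Retrieve relevant chunks from the knowledge base and let the
    foundation model generate a grounded answer (RAG)."""
    response = bedrock_agent_runtime.retrieve_and_generate(
        input={"text": question},
        retrieveAndGenerateConfiguration={
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": kb_id,   # placeholder: your knowledge base ID
                "modelArn": model_arn,      # placeholder: ARN of the model used for generation
            },
        },
    )
    return response["output"]["text"]
```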
ServiceNow is a cloud-based platform for IT workflow management and automation. With its robust capabilities for ticketing, knowledge management, human resources (HR) services, and more, ServiceNow is already powering many enterprise service desks.
By connecting an Amazon Lex chat assistant with Amazon Bedrock Knowledge Bases and ServiceNow, companies can provide 24/7 automated support and self-service options to customers and employees. In this post, we demonstrate how to integrate Amazon Lex with Amazon Bedrock Knowledge Bases and ServiceNow.
Solution overview
The following diagram illustrates the solution architecture.
The workflow includes the following steps:
- The ServiceNow knowledge bank is exported into Amazon Simple Storage Service (Amazon S3), which will be used as the data source for Amazon Bedrock Knowledge Bases. Data in Amazon S3 is encrypted by default. You can further enhance security by using server-side encryption with AWS KMS keys (SSE-KMS).
- Amazon AppFlow can be used to sync between ServiceNow and Amazon S3. Other alternatives like AWS Glue can also be used to ingest data from ServiceNow.
- Amazon Bedrock Knowledge Bases is created with Amazon S3 as the data source and Amazon Titan (or any other model of your choice) as the embedding model.
- When users of the Amazon Lex chat assistant ask queries, Amazon Lex fetches answers from Amazon Bedrock Knowledge Bases.
- If the user requests that a ServiceNow ticket be created, Amazon Lex invokes an AWS Lambda function.
- The Lambda function fetches the ServiceNow credentials from AWS Secrets Manager and makes an HTTP call to create a ServiceNow ticket (a minimal sketch of such a function follows this list).
- Application Auto Scaling is enabled on the Lambda function so that it scales automatically with user interactions.
- The solution conforms to responsible AI practices; Guardrails for Amazon Bedrock enforces your organization's responsible AI policies.
- The solution is monitored using Amazon CloudWatch, AWS CloudTrail, and Amazon GuardDuty.
Be sure to follow least privilege access policies while giving access to any system resources.
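The Lambda function that creates ServiceNow tickets could look like the following minimal sketch. It is not the exact function deployed by the CloudFormation template; the environment variable names, event fields, and incident payload are assumptions for illustration.

```python
import json
import os

import boto3
import urllib3

http = urllib3.PoolManager()

def lambda_handler(event, context):
    """Create a ServiceNow incident using credentials stored in Secrets Manager."""
    secret_name = os.environ["SERVICENOW_SECRET_NAME"]    # assumed environment variable
    instance_url = os.environ["SERVICENOW_INSTANCE_URL"]  # e.g. https://devXXXXX.service-now.com

    # Fetch the ServiceNow username and password stored in Secrets Manager
    secrets_client = boto3.client("secretsmanager")
    secret = json.loads(
        secrets_client.get_secret_value(SecretId=secret_name)["SecretString"]
    )

    # Call the ServiceNow Table API to create an incident
    headers = urllib3.make_headers(
        basic_auth=f"{secret['username']}:{secret['password']}"
    )
    headers["Content-Type"] = "application/json"
    headers["Accept"] = "application/json"

    description = event.get("ticket_description", "Ticket created by the Lex assistant")
    response = http.request(
        "POST",
        f"{instance_url}/api/now/table/incident",
        body=json.dumps({"short_description": description}),
        headers=headers,
    )
    incident = json.loads(response.data)["result"]
    return {"incident_number": incident["number"]}
```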
Prerequisites
The following prerequisites need to be completed before building the solution.
- On the Amazon Bedrock console, sign up for access to the Anthropic Claude model of your choice using the instructions at Manage access to Amazon Bedrock foundation models. For information about pricing for using Amazon Bedrock, see Amazon Bedrock pricing.
- Sign up for a ServiceNow account if you do not have one. Save your username and password. You will need to store them in AWS Secrets Manager later in this walkthrough.
- Create a ServiceNow instance following the instructions in Integrate QnABot on AWS with ServiceNow.
- Create a user with permissions to create incidents in ServiceNow using the instructions at Create a user. Make a note of these credentials for use later in this walkthrough.
The instructions provided in this walkthrough are for demonstration purposes. Follow the ServiceNow documentation to create community instances and adhere to their best practices.
Solution walkthrough
To integrate Amazon Lex with Amazon Bedrock Knowledge Bases and ServiceNow, follow the steps in the next sections.
Deployment with AWS CloudFormation console
In this step, you first create the solution architecture discussed in the solution overview, except for the Amazon Lex assistant, which you will create later in the walkthrough. Complete the following steps:
- On the CloudFormation console, verify that you are in the correct AWS Region and choose Create stack to create the CloudFormation stack.
- Download the CloudFormation template and upload it in the Specify template section. Choose Next.
- For Stack name, enter a name such as ServiceNowBedrockStack.
- In the Parameters section, for ServiceNow details, provide the values of the ServiceNow host and ServiceNow username you created earlier.
- Keep the other values as default. Under Capabilities on the last page, select I acknowledge that AWS CloudFormation might create IAM resources. Choose Submit to create the CloudFormation stack.
- After the successful deployment of the whole stack, from the Outputs tab, make a note of the output key value BedrockKnowledgeBaseId because you will need it later during creation of the Amazon Lex assistant.
Integration of Lambda with Application Auto Scaling is beyond the scope of this post. For guidance, refer to the instructions at AWS Lambda and Application Auto Scaling.
Store the secrets in AWS Secrets Manager
Follow these steps to store your ServiceNow username and password in AWS Secrets Manager:
- On the CloudFormation console, on the Resources tab, enter the word “secrets” to filter search results. Under Physical ID, select the console URL of the AWS Secrets Manager secret you created using the CloudFormation stack.
- On the AWS Secrets Manager console, on the Overview tab, under Secret value, choose Retrieve secret value.
- Select Edit and enter the username and password of the ServiceNow instance you created earlier. Make sure that both the username and password are correct.
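If you prefer to script this step instead of using the console, the secret value can also be updated with boto3, as in the following sketch. The secret name and credential values are placeholders.

```python
import json

import boto3

# Update the Secrets Manager secret created by the CloudFormation stack with the
# ServiceNow credentials. Use the physical ID from the stack's Resources tab.
secrets_client = boto3.client("secretsmanager")

secrets_client.put_secret_value(
    SecretId="ServiceNowBedrockStack-ServiceNowSecret",  # placeholder secret name or ARN
    SecretString=json.dumps(
        {
            "username": "servicenow_user",   # placeholder username
            "password": "your-password",     # placeholder password
        }
    ),
)
```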
Download knowledge articles
You need access to ServiceNow knowledge articles. Follow these steps:
- Create a knowledge base if you don’t have one. Periodically, you may need to sync your knowledge base to keep it up to date.
- Sync the data from ServiceNow to Amazon S3 using Amazon AppFlow by following the instructions at ServiceNow. Alternatively, you can use AWS Glue to ingest data from ServiceNow to Amazon S3 by following the instructions in the blog post Extract ServiceNow data using AWS Glue Studio in an Amazon S3 data lake and analyze using Amazon Athena.
- Download a sample article.
Sync Amazon Bedrock Knowledge Bases
This solution uses the fully managed Amazon Bedrock Knowledge Bases capability to seamlessly power a RAG workflow, eliminating the need for custom integrations and data flow management. The solution uses Amazon S3 as the data source for the knowledge base. The following steps outline uploading ServiceNow articles to the S3 bucket created by the CloudFormation template.
- On the CloudFormation console, on the Resources tab, enter “S3” to filter search results. Under Physical ID, select the URL for the S3 bucket created using the CloudFormation stack.
- Upload the previously downloaded knowledge articles to this S3 bucket.
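If you prefer to script the upload, a minimal boto3 sketch follows; the bucket name and file names are placeholders you would take from your own stack.

```python
import boto3

# Upload the downloaded knowledge articles to the S3 bucket created by the stack
s3 = boto3.client("s3")

s3.upload_file(
    Filename="knowledge-articles/benefits-article.pdf",  # placeholder local file
    Bucket="servicenowbedrockstack-knowledgebucket",     # placeholder bucket name
    Key="benefits-article.pdf",
)
```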
Next, you need to sync the data source.
- On the CloudFormation console, on the Outputs tab, enter “Knowledge” to filter search results. Under Value, select the console URL of the knowledge base that you created using the CloudFormation stack. Open that URL in a new browser tab.
- Scroll down to Data source and select the data source. Choose Sync.
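The sync can also be triggered programmatically using the Amazon Bedrock agent API, as in the following sketch; the knowledge base ID and data source ID are placeholders from your stack.

```python
import boto3

# Start an ingestion job to sync the S3 data source into the knowledge base
bedrock_agent = boto3.client("bedrock-agent")

job = bedrock_agent.start_ingestion_job(
    knowledgeBaseId="KBID12345",  # placeholder: BedrockKnowledgeBaseId output value
    dataSourceId="DSID67890",     # placeholder: the S3 data source ID
)

# Check the status of the ingestion job (for example, STARTING, IN_PROGRESS, COMPLETE)
status = bedrock_agent.get_ingestion_job(
    knowledgeBaseId="KBID12345",
    dataSourceId="DSID67890",
    ingestionJobId=job["ingestionJob"]["ingestionJobId"],
)["ingestionJob"]["status"]
print(status)
```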
You can test the knowledge base by choosing the model in the Test the knowledge base section and asking the model a question.
Responsible AI using Guardrails for Amazon Bedrock
Conversational AI applications require robust guardrails to safeguard sensitive user data, adhere to privacy regulations, enforce ethical principles, and mitigate hallucinations, fostering responsible development and deployment. Guardrails for Amazon Bedrock lets you configure your organizational policies for the knowledge bases. It helps keep your generative AI applications safe by evaluating both user inputs and model responses.
To set up guardrails, follow these steps:
- Follow the instructions at the Amazon Bedrock User Guide to create a guardrail.
You can reduce hallucinations in model responses by enabling the grounding check and relevance check and adjusting their thresholds.
- Create a version of the guardrail.
- Select the newly created guardrail and copy the guardrail ID. You will use this ID later in the intent creation.
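For reference, a guardrail with grounding and relevance checks can also be created programmatically, as in the following sketch. The name, thresholds, and blocked messages are illustrative assumptions, not prescribed values.

```python
import boto3

# Create a guardrail with contextual grounding and relevance checks enabled
bedrock = boto3.client("bedrock")

guardrail = bedrock.create_guardrail(
    name="servicenow-assistant-guardrail",  # placeholder name
    contextualGroundingPolicyConfig={
        "filtersConfig": [
            {"type": "GROUNDING", "threshold": 0.75},  # block poorly grounded answers
            {"type": "RELEVANCE", "threshold": 0.75},  # block off-topic answers
        ]
    },
    blockedInputMessaging="Sorry, I can't help with that request.",
    blockedOutputsMessaging="Sorry, I can't provide that information.",
)

# Publish a version; the guardrail ID and version are used later in the Lex intent
version = bedrock.create_guardrail_version(guardrailIdentifier=guardrail["guardrailId"])
print(guardrail["guardrailId"], version["version"])
```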
Amazon Lex setup
In this section, you configure your Amazon Lex chat assistant with intents to call Amazon Bedrock. This walkthrough uses Amazon Lex V2.
- On the CloudFormation console, on the Outputs tab, copy the value of BedrockKnowledgeBaseId. You will need this ID later in this section.
- On the Outputs tab, under Outputs, enter “bot” to filter search results. Choose the console URL of the Amazon Lex assistant you created using the CloudFormation stack. Open that URL in a new browser tab.
- On the Amazon Lex Intents page, choose Create another intent. On the Add intent dropdown menu, choose Use built-in intent.
- On the Use built-in intent screen, under Built-in intent, choose QnAIntent - GenAI feature.
- For Intent name, enter BedrockKb and select Add.
- In the QnA configuration section, under Select model, choose Anthropic and Claude 3 Haiku or a model of your choice.
- Expand Additional Model Settings and enter the guardrail ID of the guardrail you created earlier. Under Guardrail Version, enter the version number of that guardrail.
- Enter the knowledge base ID (BedrockKnowledgeBaseId) that you captured earlier in the CloudFormation outputs section. Choose Save intent at the bottom.
You can now add more QnAIntents pointing to different knowledge bases.
- Return to the intents list by choosing Back to intents list in the navigation pane.
- Select Build to build the assistant.
A green banner on the top of the page with the message Successfully built language English (US) in bot: servicenow-lex-bot indicates the Amazon Lex assistant is now ready.
Test the solution
To test the solution, follow these steps:
- In the navigation pane, choose Aliases. Under Aliases, select TestBotAlias.
- Under Languages, choose English (US). Choose Test.
- A new test window opens at the bottom of the screen.
- Enter the question “What benefits does AnyCompany offer to its employees?” Then press Enter.
The chat assistant generates a response based on the content in the knowledge base.
- To test whether Amazon Lex creates a ServiceNow ticket for information not present in the knowledge base, enter “Create a ticket for password reset” and press Enter.
The chat assistant generates a new ServiceNow ticket because this information is not available in the knowledge base.
To search for the incident, log in to the ServiceNow endpoint that you configured earlier.
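You can also exercise the assistant programmatically with the Amazon Lex V2 runtime API, which is useful for scripted regression tests. In the following sketch, the bot ID and alias ID are placeholders; the alias ID for TestBotAlias is shown on the Aliases page of the Lex console.

```python
import uuid

import boto3

# Send a test utterance to the Amazon Lex V2 assistant
lex_runtime = boto3.client("lexv2-runtime")

response = lex_runtime.recognize_text(
    botId="BOTID12345",           # placeholder bot ID
    botAliasId="TSTALIASID",      # placeholder alias ID for TestBotAlias
    localeId="en_US",
    sessionId=str(uuid.uuid4()),  # one session ID per conversation
    text="What benefits does AnyCompany offer to its employees?",
)

for message in response.get("messages", []):
    print(message["content"])
```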
Monitoring
You can use Amazon CloudWatch Logs to review the performance of the assistant and to troubleshoot issues with conversations. The CloudFormation stack that you deployed has already configured a CloudWatch log group for your Amazon Lex assistant with the appropriate permissions.
To view the conversation logs from the Amazon Lex assistant, follow these directions.
On the CloudFormation console, on the Outputs tab, enter “Log” to filter search results. Under Value, choose the console URL of the CloudWatch log group that you created using the CloudFormation stack. Open that URL in a new browser tab.
To protect sensitive data, Amazon Lex obscures slot values in conversation logs. As a security best practice, do not store any slot values in request or session attributes. Amazon Lex V2 doesn't obscure the slot value in audio. You can selectively capture only text using the instructions at Selective conversation log capture.
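To pull recent conversation log entries without opening the console, you can query the log group with boto3, as in the following sketch; the log group name is a placeholder taken from the stack outputs.

```python
import boto3

# Fetch recent Amazon Lex conversation log entries for troubleshooting
logs = boto3.client("logs")

events = logs.filter_log_events(
    logGroupName="/aws/lex/servicenow-lex-bot",  # placeholder log group name
    limit=20,
)

for event in events["events"]:
    print(event["message"])
```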
Enable logging for Amazon Bedrock ingestion jobs
You can monitor Amazon Bedrock ingestion jobs using CloudWatch. To configure logging for an ingestion job, follow the instructions at Knowledge bases logging.
AWS CloudTrail logs
AWS CloudTrail is an AWS service that tracks actions taken by a user, role, or an AWS service. CloudTrail is enabled on your AWS account when you create the account. When activity occurs in your AWS account, that activity is recorded as a CloudTrail event along with other AWS service events in Event history. You can view, search, and download recent events in your AWS account. For more information, see Working with CloudTrail Event history.
As a security best practice, you should monitor any access to your environment. You can configure Amazon GuardDuty to identify any unexpected and potentially unauthorized activity in your AWS environment.
Cleanup
To avoid incurring future charges, delete the resources you created. To clean up the AWS environment, use the following steps:
- Empty the contents of the S3 bucket you created as part of the CloudFormation stack.
- Delete the CloudFormation stack you created.
Conclusion
As customer expectations continue to evolve, embracing innovative technologies like conversational AI and knowledge management systems becomes essential for businesses to stay ahead of the curve. By implementing this integrated solution, companies can enhance operational efficiency and deliver superior service to both their customers and employees, while also adhering to their organization's responsible AI policies.
Stay up to date with the latest advancements in generative AI and start building on AWS. If you’re seeking assistance on how to begin, check out the Generative AI Innovation Center.
About the Authors
Marcelo Silva is an experienced tech professional who excels in designing, developing, and implementing cutting-edge products. Starting off his career at Cisco, Marcelo worked on various high-profile projects including deployments of the first ever carrier routing system and the successful rollout of ASR9000. His expertise extends to cloud technology, analytics, and product management, having served as senior manager for several companies such as Cisco, Cape Networks, and AWS before joining GenAI. Currently working as a Conversational AI/GenAI Product Manager, Marcelo continues to excel in delivering innovative solutions across industries.
Sujatha Dantuluri is a seasoned Senior Solutions Architect on the US federal civilian team at AWS, with over two decades of experience supporting commercial and federal government clients. Her expertise lies in architecting mission-critical solutions and working closely with customers to ensure their success. Sujatha is an accomplished public speaker, frequently sharing her insights and knowledge at industry events and conferences. She has contributed to IEEE standards and is passionate about empowering others through her engaging presentations and thought-provoking ideas.
NagaBharathi Challa is a solutions architect on the US federal civilian team at Amazon Web Services (AWS). She works closely with customers to effectively use AWS services for their mission use cases, providing architectural best practices and guidance on a wide range of services. Outside of work, she enjoys spending time with family and spreading the power of meditation.
Pranit Raje is a Cloud Architect on the AWS Professional Services India team. He specializes in DevOps, operational excellence, and automation using DevSecOps practices and infrastructure as code. Outside of work, he enjoys going on long drives with his beloved family, spending time with them, and watching movies.