Perficient recently concluded a 3-day hackathon to create contact center solutions addressing the COVID-19 (Coronavirus) crisis. Utilizing Amazon Connect Chat, my team and I implemented a solution for SMS deflection. We addressed the following scenario:
- Customer calls into a call center with high demand.
- Customer is told that their wait time is greater than 2 hours and is offered the option to switch to the SMS channel.
- Customer elects to use SMS and is placed into an SMS-dedicated queue.
- Customer can send SMS to a chatbot while waiting in queue.
- Customer connects and chats with an agent over SMS.
For those familiar with the Amazon Connect Chat APIs, you know that they are not designed for the SMS channel. See this blog post for a quick refresher on how the Chat APIs work.
The Chat APIs rely on WebSockets to relay messages from Connect to the customer. The APIs are intended for a web app where the WebSocket can be stored in the customer’s browser. To support SMS, we needed to create a service that maintained active WebSockets for all inbound customers.
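For readers who want the mechanics, here is a minimal sketch of the handshake the service performs for each customer (using boto3; the instance and contact-flow IDs are placeholders, and the clients are injectable so the function can be exercised without live AWS credentials). The connection token it returns is what the service must hold on to, alongside the WebSocket, for the life of the conversation:

```python
def open_chat_websocket(instance_id, contact_flow_id, display_name,
                        connect=None, participant=None):
    """Start a chat contact and return the WebSocket URL plus the
    connection token needed to send messages on the customer's behalf."""
    if connect is None or participant is None:
        import boto3  # only needed when real clients aren't injected
        connect = connect or boto3.client("connect")
        participant = participant or boto3.client("connectparticipant")

    # Kick off a chat contact; Connect returns a participant token.
    chat = connect.start_chat_contact(
        InstanceId=instance_id,
        ContactFlowId=contact_flow_id,
        ParticipantDetails={"DisplayName": display_name},
    )

    # Trade the participant token for a WebSocket URL and a connection
    # token, which is used with the send_message / send_event APIs.
    conn = participant.create_participant_connection(
        Type=["WEBSOCKET", "CONNECTION_CREDENTIALS"],
        ParticipantToken=chat["ParticipantToken"],
    )
    return (conn["Websocket"]["Url"],
            conn["ConnectionCredentials"]["ConnectionToken"])
```

In a browser chat widget this handshake happens client-side and the WebSocket lives in the browser; here, the service performs it server-side and keeps the socket open for every active customer.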
Here is what we created.
The customer connection is initiated when the customer calls into the IVR, and elects to switch to SMS. Amazon Connect then sends an opt-in text to the customer via AWS Lambda. Once the customer responds, the SMS service is responsible for two-way communication with Amazon Connect, Amazon Lex, and the customer. The service effectively proxies messages from Connect and Lex to SMS.
We integrated with Amazon Pinpoint to communicate with a customer via SMS. Outbound SMS is handled directly by the Pinpoint API. Inbound SMS is routed through an Amazon SQS queue. The SMS service polls the queue for incoming messages. Once a message is received, the service decides whether it should be routed to Amazon Connect or Amazon Lex.
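The two halves of that exchange can be sketched as follows (a simplified version under our assumptions: inbound two-way SMS events reach SQS via an SNS topic, so the queue body is JSON wrapped in the SNS envelope; the Pinpoint client is injectable for testing):

```python
import json

def send_sms(application_id, phone_number, body, pinpoint=None):
    """Send an outbound SMS directly through the Pinpoint API."""
    if pinpoint is None:
        import boto3  # only needed when a real client isn't injected
        pinpoint = boto3.client("pinpoint")
    pinpoint.send_messages(
        ApplicationId=application_id,
        MessageRequest={
            "Addresses": {phone_number: {"ChannelType": "SMS"}},
            "MessageConfiguration": {
                "SMSMessage": {"Body": body, "MessageType": "TRANSACTIONAL"}
            },
        },
    )

def parse_inbound(sqs_body):
    """Extract the sender and text from an inbound two-way SMS event.
    Pinpoint publishes the event to SNS, which wraps it in an envelope,
    so the SQS body is JSON-in-JSON."""
    event = json.loads(json.loads(sqs_body)["Message"])
    return event["originationNumber"], event["messageBody"]
```

The service's polling loop simply calls `receive_message` on the queue with long polling enabled, runs each body through `parse_inbound`, and hands the result to the routing logic.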
We relied on the Chat APIs to communicate with Amazon Connect. The SMS service stored an in-memory map of customer phone number to the WebSocket created by the Chat APIs. This allowed the service to both send messages to and receive messages from Amazon Connect. We also relied on the Chat APIs to determine whether a customer was in queue, based on 'marker' messages sent by our contact flows.
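That routing logic can be sketched as a small in-memory registry (the class and marker names here are our own illustration, not part of any AWS SDK). Each phone number maps to a session holding the chat connection state and a flag, flipped by the marker messages, that says whether the customer is still in queue:

```python
class SmsSessionRouter:
    """Maps customer phone numbers to chat sessions and decides whether
    an inbound SMS belongs to Lex (still in queue) or to Connect."""

    def __init__(self):
        self._sessions = {}  # phone number -> session state

    def register(self, phone, connection_token):
        """Record a new chat session once the customer opts in."""
        self._sessions[phone] = {
            "connection_token": connection_token,
            "in_queue": True,  # assumed until a marker says otherwise
        }

    def handle_marker(self, phone, marker):
        # Contact flows send marker messages over the chat WebSocket;
        # an agent-connected marker flips the routing flag.
        if marker == "AGENT_CONNECTED" and phone in self._sessions:
            self._sessions[phone]["in_queue"] = False

    def route(self, phone):
        """Decide where an inbound SMS from this number should go."""
        session = self._sessions.get(phone)
        if session is None:
            return "OPT_IN"  # unknown number: start the opt-in flow
        return "LEX" if session["in_queue"] else "CONNECT"
```

Because the map lives in process memory alongside the WebSockets, losing the instance means losing every active conversation, which is exactly the scaling concern we come back to below.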
While a customer is waiting in queue, the SMS service routes customer SMS to Lex. Lex provides a PostText API to handle two-way communication. I’ll cover how we created our Lex bot next.
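The Lex side of the relay is the simplest piece, since PostText is a synchronous request/response call. A sketch (bot name and alias are placeholders; note that Lex `userId` values cannot contain `+`, so the phone number needs light normalization):

```python
def ask_lex(bot_name, bot_alias, phone_number, text, lex=None):
    """Relay one inbound SMS to Lex and return the bot's reply."""
    if lex is None:
        import boto3  # only needed when a real client isn't injected
        lex = boto3.client("lex-runtime")
    response = lex.post_text(
        botName=bot_name,
        botAlias=bot_alias,
        userId=phone_number.lstrip("+"),  # userId keys the Lex session
        inputText=text,
    )
    return response.get("message", "")
```

The service sends the returned message back to the customer over Pinpoint, so from the customer's point of view the bot is just another SMS participant.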
An Aside: Scraping an FAQ to create a Lex Bot
To create a Lex bot that could answer basic questions about the COVID-19 crisis, we scraped the CDC's FAQ page. This idea was a last-minute addition to our project. We did the following:
- Scrape question-and-answer pairs from the CDC FAQ page and store each pair in a DynamoDB table with a unique identifier.
- Write and execute an ETL Lambda to create a new intent for each question and answer pair, where the intent name is the identifier and the question is the only utterance.
- Create a fulfillment Lambda that queries the DynamoDB table for the answer when an intent is fulfilled.
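The fulfillment Lambda ends up very small: Lex hands it the matched intent's name, which is the identifier we stored, so the handler just looks the answer up. A sketch (the table and attribute names are our illustration, and the table resource is injectable for testing; the event/response shapes follow the Lex V1 fulfillment contract):

```python
def lambda_handler(event, context, table=None):
    """Lex fulfillment hook: the intent name doubles as the DynamoDB key
    for the scraped question/answer pair."""
    if table is None:
        import boto3  # only needed when a real table isn't injected
        table = boto3.resource("dynamodb").Table("CovidFaq")  # assumed name

    intent_id = event["currentIntent"]["name"]
    item = table.get_item(Key={"id": intent_id}).get("Item", {})
    answer = item.get("answer", "Sorry, I don't have an answer for that.")

    # Close the dialog and deliver the answer as the bot's reply.
    return {
        "dialogAction": {
            "type": "Close",
            "fulfillmentState": "Fulfilled",
            "message": {"contentType": "PlainText", "content": answer},
        }
    }
```

Keeping the answers in DynamoDB rather than baking them into the intents also means the FAQ content can be re-scraped and updated without touching the bot definition.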
You might be wondering how this service scales. That's a good question!
Running a service that maintains WebSockets for all customers will require sophisticated scaling. We didn't solve the scalability problem during the hackathon, but we do have a few ideas. One option is to scale out with AWS Auto Scaling: we could add more EC2 instances if we implement a 'controller' instance to distribute customer sessions across the fleet.
Another idea is to switch to a serverless architecture. We want to research serverless options, including implementing a direct agent-to-customer SMS channel. Going serverless would allow the solution to scale natively, and we are continuing to look into this option.
Our team built a service that allows customers to communicate with Amazon Connect via SMS. We believe this solution can alleviate the high call volumes contact centers are experiencing during this global crisis and beyond.
We’d love to extend our solution to support additional channels such as Facebook Messenger. Look for this addition in a future blog post!