
AWS Contact Lens for Connect set to arrive in A/NZ

Mon 27 Jul 2020

Amazon Web Services (AWS) has announced the general availability of Contact Lens, a set of machine learning-driven capabilities for Amazon Connect that provide customer interaction analytics for contact centres.

Amazon Connect is a fully managed cloud contact centre service. 

With Contact Lens, contact centre supervisors can discover themes and trends from customer conversations, conduct a full-text search on call transcripts to troubleshoot customer issues, and improve contact centre agents’ performance with call analytics from within the Amazon Connect console. 

Coming in late 2020, Contact Lens will also alert supervisors to issues during in-progress calls, so they can intervene earlier when a customer is having a poor experience.

Contact Lens requires no technical expertise and can be activated through Amazon Connect.

It uses machine learning to transcribe calls and automatically indexes call transcripts so they can be searched from the Amazon Connect console. 

Machine learning is also used to make it easier for supervisors to search voice interactions based on call content (e.g. customers asking to cancel a subscription or return an item), customer sentiment (e.g. calls that ended with a negative customer sentiment score), and conversation characteristics (e.g. talk speed, long pauses, or customers and agents talking over one another). 

Clicking a search result opens a contact detail page showing the call transcript, customer and agent sentiment, and a visual illustration of conversation characteristics, which supervisors can use to share feedback with their agents and improve customer interactions.

Contact Lens also uses natural language processing to help supervisors uncover new issues (e.g. a price discrepancy between a website and an email promotion) on the contact detail page by visually identifying words and phrases in call transcripts that indicate reasons for customer outreach. 

Supervisors can automatically monitor all of their agents' interactions for customer experience, regulatory compliance, and adherence to script guidelines by defining custom categories on a new page in Amazon Connect. Categories organise customer contacts based on words or phrases said by the customer or agent (e.g. a customer mentioning a competitor, membership in a customer loyalty program, or certain regulatory disclosures).
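The phrase-based categorisation described above can be sketched as a simple matcher. The category names and trigger phrases below are invented for illustration and are not the Contact Lens rule schema; the real service applies its own matching logic:

```python
# Toy illustration of phrase-based contact categorisation; the category
# names and trigger phrases are hypothetical examples, not Contact Lens rules.
CATEGORY_RULES = {
    "competitor-mention": ["competitor x", "switching to"],
    "loyalty-program": ["loyalty program", "member points"],
    "regulatory-disclosure": ["this call may be recorded"],
}

def categorise(transcript_lines):
    """Return the sorted categories whose trigger phrases appear in the call."""
    text = " ".join(transcript_lines).lower()
    return sorted(cat for cat, phrases in CATEGORY_RULES.items()
                  if any(phrase in text for phrase in phrases))

calls = [
    "Hi, this call may be recorded for quality purposes.",
    "I'm thinking of switching to Competitor X next month.",
]
print(categorise(calls))  # → ['competitor-mention', 'regulatory-disclosure']
```

A real system would match against the transcript turns produced by speech-to-text rather than raw strings, but the tagging idea is the same.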

The machine learning capabilities can automatically detect and redact sensitive personally identifiable information (PII) such as names, addresses, and social security numbers from call recordings and transcripts, helping businesses protect customer data.
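Contact Lens performs this detection with machine learning; as a minimal sketch of the redaction concept only, a pattern-based substitution over a transcript might look like this (the placeholder text and SSN-only coverage are assumptions of the example):

```python
import re

# Illustrative transcript redaction. Contact Lens uses ML entity detection;
# this regex covers only US-style social security numbers and exists purely
# to show the redact-in-place idea.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def redact_ssn(text: str) -> str:
    """Replace anything shaped like an SSN with a [PII] placeholder."""
    return SSN_PATTERN.sub("[PII]", text)

line = "Sure, my social is 123-45-6789."
print(redact_ssn(line))  # → Sure, my social is [PII].
```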

Later this year, Contact Lens will introduce new features that provide supervisors with real-time assistance by offering a dashboard that shows the sentiment progression of live calls in a contact centre. 

This dashboard continuously updates as interactions progress and allows supervisors to look across live calls to spot opportunities to help their customers. Real-time alerting gives supervisors the ability to engage and de-escalate situations earlier.

Contact Lens capabilities are built into Amazon Connect and deliver metadata (such as transcriptions, sentiment, and categorisation tags) to customers' Amazon Simple Storage Service (Amazon S3) buckets in a well-defined schema.

Businesses can export this information and use additional tools like Amazon QuickSight or Tableau to do further analysis and combine it with data from other sources. 
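Once the analysis records land in S3 as JSON, downstream tooling can parse and summarise them before loading into a BI tool. The field names in the sample record below are assumptions for illustration and may not match the actual Contact Lens output schema:

```python
import json

# Hypothetical Contact Lens-style analysis record as it might be exported
# to S3; the field names here are assumed for the example, not the
# documented schema.
sample = json.dumps({
    "CustomerMetadata": {"ContactId": "abc-123"},
    "Categories": {"MatchedCategories": ["cancellation"]},
    "Transcript": [
        {"ParticipantId": "CUSTOMER", "Content": "I want to cancel.",
         "Sentiment": "NEGATIVE"},
        {"ParticipantId": "AGENT", "Content": "I can help with that.",
         "Sentiment": "NEUTRAL"},
        {"ParticipantId": "CUSTOMER", "Content": "Thanks, that was easy.",
         "Sentiment": "POSITIVE"},
    ],
})

def summarise(record_json: str) -> dict:
    """Extract matched categories and a per-participant sentiment tally."""
    record = json.loads(record_json)
    tally = {}
    for turn in record["Transcript"]:
        who = turn["ParticipantId"]
        tally.setdefault(who, {}).setdefault(turn["Sentiment"], 0)
        tally[who][turn["Sentiment"]] += 1
    return {
        "contact_id": record["CustomerMetadata"]["ContactId"],
        "categories": record["Categories"]["MatchedCategories"],
        "sentiment_by_participant": tally,
    }

summary = summarise(sample)
print(summary["sentiment_by_participant"]["CUSTOMER"])
# → {'NEGATIVE': 1, 'POSITIVE': 1}
```

A flattened summary like this is the sort of row-level data a dashboard in Amazon QuickSight or Tableau could aggregate across thousands of contacts.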
