Last December at its re:Invent 2019 conference in Las Vegas, Amazon unveiled Contact Lens, a digital call center product for Amazon Connect that transcribes calls while simultaneously analyzing them. After a monthslong preview, Contact Lens today reached general availability in the US East (N. Virginia), US West (Oregon), EU (Frankfurt), EU (London), Asia Pacific (Singapore), Asia Pacific (Sydney), and Asia Pacific (Tokyo) Amazon Web Services (AWS) regions, with rollouts in additional regions to come later this year.
As customer service representatives are increasingly ordered to work from home in Manila, the U.S., and elsewhere, companies including John Hancock, Capital One, Intuit, GE, Square, Fujitsu, and Dow Jones are turning to AI solutions like Contact Lens to bridge gaps in service. The solutions aren't perfect (there's always going to be a need for human teams, even where chatbots are deployed), but COVID-19 has accelerated the need for AI-powered contact center messaging.
Contact Lens, which Amazon says is based on the same technology that powers its own customer service centers, is a fully managed set of capabilities enabled by AI and machine learning. With it, companies can ostensibly understand the sentiment, trends, and compliance of customer conversations, discovering emerging themes while conducting full-text search on call transcripts. Supervisors can use Contact Lens to view agents' performance with detailed analytics, and in late 2020, the service will optionally alert supervisors to issues during in-progress calls, giving them the opportunity to intervene when a customer might be having a poor experience.
Contact Lens leverages deep learning to make it easier for supervisors to search voice interactions based on call content and conversation characteristics like talk speed, long pauses, and customer and agent interruptions. By clicking on the search results, supervisors can view a contact detail page to see the call transcript, customer and agent sentiment, and a visual representation of conversation characteristics. And thanks to natural language processing technology that highlights potentially problematic transcript words and phrases, they can uncover issues via Contact Lens that indicate reasons for customer outreach.
Furthermore, Contact Lens lets supervisors monitor agents' interactions for customer experience, regulatory compliance, and adherence to script guidelines by defining custom categories within Amazon Connect that organize contacts based on words or phrases. Contact Lens also includes AI capabilities to automatically detect and redact sensitive personally identifiable information (PII) like names, addresses, and Social Security numbers from call recordings and transcripts to help customers more easily protect their private data.
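To give a feel for how phrase-based categorization and PII redaction work conceptually, the sketch below tags transcript lines against keyword-defined categories and masks Social Security number patterns. It is illustrative only: the category names, trigger phrases, transcript format, and regex are assumptions for this example, not Contact Lens's actual implementation (which is configured in the Amazon Connect console and uses machine learning for PII detection, not a single regex).

```python
import re

# Hypothetical category rules: category name -> trigger phrases.
# Real Contact Lens categories are defined in the Amazon Connect console.
CATEGORY_RULES = {
    "cancellation_risk": ["cancel my account", "switch providers"],
    "script_compliance": ["thank you for calling", "is there anything else"],
}

# Simple SSN-like pattern; the actual service detects many more PII types.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def categorize(utterance: str) -> list[str]:
    """Return the categories whose trigger phrases appear in the utterance."""
    text = utterance.lower()
    return [name for name, phrases in CATEGORY_RULES.items()
            if any(phrase in text for phrase in phrases)]

def redact_pii(utterance: str) -> str:
    """Mask SSN-like tokens, mimicking transcript redaction."""
    return SSN_PATTERN.sub("[PII]", utterance)

line = "I want to cancel my account, my SSN is 123-45-6789."
print(categorize(line))   # → ['cancellation_risk']
print(redact_pii(line))   # → I want to cancel my account, my SSN is [PII].
```

The point of the sketch is the division of labor: categorization flags *which* contacts need review, while redaction controls *what* reviewers and downstream systems get to see.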
In the coming months, alongside the alerting feature, Amazon says Contact Lens will introduce a dashboard showing the sentiment trend of live calls. This dashboard will continuously update as interactions evolve and allow supervisors to search across live calls to spot opportunities to help customers.
Contact Lens provides metadata such as transcriptions, sentiment, and categorization tags in Amazon Simple Storage Service (Amazon S3) buckets in a well-defined schema, Amazon notes. If they so choose, businesses can export this information and use tools like Amazon QuickSight or Tableau to perform further analysis and combine it with data from other sources.
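As a rough illustration of consuming that S3 output, the sketch below parses a Contact Lens-style analysis record and tallies sentiment labels per participant. The field names (`Transcript`, `ParticipantId`, `Sentiment`) follow the general shape of Contact Lens output but should be verified against the published schema; in practice the JSON would be fetched from S3 (e.g., with boto3) rather than defined inline as it is here.

```python
import json
from collections import Counter

# Sample record in the general shape of a Contact Lens analysis file;
# field names are assumptions to be checked against the actual schema.
raw = json.dumps({
    "Transcript": [
        {"ParticipantId": "CUSTOMER", "Content": "My order never arrived.",
         "Sentiment": "NEGATIVE"},
        {"ParticipantId": "AGENT", "Content": "I'm sorry, let me fix that.",
         "Sentiment": "NEUTRAL"},
        {"ParticipantId": "CUSTOMER", "Content": "Thanks, that works.",
         "Sentiment": "POSITIVE"},
    ]
})

def sentiment_by_participant(record_json: str) -> dict[str, Counter]:
    """Tally sentiment labels for each participant in a transcript."""
    record = json.loads(record_json)
    tallies: dict[str, Counter] = {}
    for turn in record["Transcript"]:
        tallies.setdefault(turn["ParticipantId"], Counter())[turn["Sentiment"]] += 1
    return tallies

print(sentiment_by_participant(raw))
```

Because the output lands in S3 as structured JSON, the same parsing step feeds equally well into a QuickSight dataset, a Tableau extract, or an ad hoc script like this one.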