Have you ever wished you could leap into your favorite cartoon series and seamlessly interact with characters such as Bugs Bunny?
Thanks to 5G, augmented reality, artificial intelligence, and a Custom Neural Voice created with Microsoft Azure AI technology, Bugs Bunny can now follow your directions to navigate stores such as the AT&T Experience Store in Dallas, chatting with you in real time. The technology that makes such conversations flow naturally is neural text-to-speech, a capability of Speech, an Azure Cognitive Service, and it is now generally available.
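To give a sense of how developers drive neural text-to-speech in Azure, the service accepts Speech Synthesis Markup Language (SSML), where a voice and speaking style are selected declaratively. The sketch below uses a publicly documented built-in neural voice (`en-US-JennyNeural`) as a stand-in, since the Bugs Bunny Custom Neural Voice is a private, approval-gated model; the line of dialogue is illustrative only.

```xml
<!-- Minimal SSML sketch for Azure neural text-to-speech.
     The voice name is a built-in neural voice used here as an example;
     a Custom Neural Voice would be referenced by its own deployment name. -->
<speak version="1.0"
       xmlns="http://www.w3.org/2001/10/synthesis"
       xmlns:mstts="https://www.w3.org/2001/mstts"
       xml:lang="en-US">
  <voice name="en-US-JennyNeural">
    <mstts:express-as style="cheerful">
      Eh, what's up, doc?
    </mstts:express-as>
  </voice>
</speak>
```

A client sends markup like this to the Speech service (via the Speech SDK or REST API), and the service returns synthesized audio in the chosen voice; the `express-as` element is what lets a neural voice shift speaking style rather than sounding flat and robotic.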
“One of the things we hear from our customers is they like the idea of communicating with their customers through speech,” said Eric Boyd, corporate vice president for Azure AI Platform at Microsoft. “Speech has been very robotic over the years. Neural voice is a big leap forward to make it sound really natural.”
The immersive Bugs Bunny experience was an opportunity for AT&T to delight customers while demonstrating the capabilities of its 5G cellular network.
“We’re trying to prove to consumers that there is something to 5G that makes it different and better than a 4G network,” said Jay Cary, vice president of 5G product and mobility innovation for AT&T. “It has massive computing power, higher speeds, and lower latency. This felt like a really amazing way to bring the potential of the network and the technology to life.” “We love that idea of blending the physical environment and the virtual environment,” he said.
An approved Bugs Bunny voice actor came into the studio to create the custom voice, recording as many as 2,000 phrases and lines with guidance from the Microsoft team, Cary said.
“We require customers to make very clear it’s a synthetic voice or, when it’s not immediately obvious in context, that they explicitly disclose it’s synthetic in a way that’s perceivable by users and not buried in terms,” said Sarah Bird, Responsible AI lead for Cognitive Services within Azure AI.