Google’s Pixel Buds will find cultural complexities a challenge
Google recently unveiled its vision for the future of global communication — a pair of headphones that can translate over 40 languages in real time. But are we sure that we can rely on Google Translate to understand the nuances of spoken language and the complexities of culture?
The company has made enormous linguistic progress in the past 12 months. In late 2016, Google announced that it had made a breakthrough in translation and artificial intelligence: Google Translate had started using a neural network to translate some of its most popular languages.
Last week, the company unveiled its latest invention: Google Pixel Buds, headphones that can serve as a real-time personal translator. A live demonstration highlighted the simplicity of the new product as it translated a few sentences back and forth between Swedish and English.
“It’s like you’ve got your own personal translator with you everywhere you go,” said Adam Champy, the product manager behind Pixel Buds. “Say you’re in Little Italy, and you want to order your pasta like a pro. All you have to do is hold down on the right earbud and say, ‘Help me speak Italian.’ As you talk, your Pixel phone’s speaker will play the translation in Italian out loud. When the waiter responds in Italian, you’ll hear the translation through your Pixel Buds.”
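The press-and-speak exchange Champy describes can be sketched as a simple two-way loop. Everything below is hypothetical: the `translate` stub uses a canned phrasebook, whereas the real product would route speech through recognition and a machine-translation service whose APIs are not covered in this article.

```python
# A minimal sketch of the Pixel Buds conversation loop described above.
# PHRASEBOOK and translate() are stand-ins, not Google's actual API.

PHRASEBOOK = {
    ("en", "it"): {"help me speak italian": "aiutami a parlare italiano"},
    ("it", "en"): {"certo, come posso aiutare?": "of course, how can I help?"},
}

def translate(text: str, source: str, target: str) -> str:
    """Look up a canned translation; a real system would call an MT model."""
    return PHRASEBOOK.get((source, target), {}).get(text.lower(), text)

def conversation_turn(utterance: str, speaker_lang: str, listener_lang: str) -> str:
    """One half of the exchange: the speaker talks in their own language,
    and the listener hears the translation (on the phone speaker or in
    the earbuds, depending on direction)."""
    return translate(utterance, speaker_lang, listener_lang)
```

Note that the stub simply echoes any phrase it does not recognize, which is exactly the failure mode the rest of this article worries about: vocabulary lookup is the easy part, and cultural interpretation is where it breaks down.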
The product has been heralded as the first coming of the Babel fish, which is bound to have fans of The Hitchhiker’s Guide to the Galaxy (basically anyone who writes about tech) in a frenzy of excitement.
Unfortunately for the $600 billion company, there is one core difference between Douglas Adams’ Babel fish and the Google Pixel Buds. The Babel fish was “mindbogglingly useful” because it had the capacity not only to translate vocabulary but also to interpret the cultural nuances that came with it.
Likewise, multilingual humans attempt to make cultural sense of what they are translating. Bringing a computer into the equation has the potential to really mess with these subtleties.
In 2015, a town in Galicia, Spain, threatened to sue Google when the local word for the leafy green rapini was mistranslated as “clitoris.” The mistake left local websites, including the town’s official page, with a renamed annual event: The Clitoris Festival.
The syntax and conjugational differences between Asian languages and European languages add an additional level of complexity. Countless examples on your local Chinese or Thai restaurant menus highlight this difficulty in translation and point to the level of human fluency required to make an accurate interpretation. You can safely assume supermarkets were misled by machine translation when they boldly labeled an aisle “F**k Vegetables.”
And Google Translate is facing an uphill battle if even professional translators are struggling to interpret the sentence structure of Donald Trump’s ramblings.
“When the logic is not clear or a sentence is just left hanging in the air, then we have a problem,” said the Guardian’s Chikako Tsuruta, who regularly interprets broadcasts by U.S. networks such as CNN, ABC, and CBS. “We try to grasp the context and get at the core message, but in Trump’s case, it’s so incoherent. You’re interpreting, and then suddenly the sentence stops making sense, and we risk ending up sounding stupid.”
Herein lies the number one question that Google Translate product managers will be asking themselves: How do we get our technology to understand and interpret the subjective cultural aspects and highly variable grammatical complexities involved in spoken language?
Despite an unquestionably exciting year for Google’s deep learning and language department, it seems there is still a long way to go.
Gabe McCauley is a freelance journalist and growth marketer based out of Sydney, Australia.
AI-backed voice assistants make customer support efficient yet human
Enterprises are constantly evaluating ways to improve the speed and quality of their customer support services. These have traditionally been delivered over the phone, through a conversation with a human agent, but customer support has been changing in recent years as businesses have begun adopting AI-based alternatives.
These approaches have indeed introduced new efficiencies, but they have also been largely unpopular with customers, who view them as too impersonal. Customers expect to have their needs addressed quickly, but they also want empathy and to have their requests easily understood. Most importantly, they want to be quickly passed to a human agent if their concerns are too complex for the technology to process.
With a variety of time-saving technologies sweeping the consumer market, voice-based virtual assistants stand apart by pairing efficiency with a human touch.
Growing expectations
Customer support and service delivered in real time over a customer’s preferred device has become standard across many industries. And the higher expectations that result from this on-demand service aren’t contained to the consumer space. People now expect rapid, high-quality support from B2B companies, and even from their own employers.
Businesses have taken notice of this trend. According to Forrester, 72 percent of businesses say improving customer experience is their top priority.
Technology has fueled time-saving methods for delivering support, like automated phone systems and chatbots that manage frontline contact. The problem is that customers haven’t been happy with this shift, and these technologies have been widely criticized for dehumanizing the support process.
Voice-based technology — a friendlier approach
The success of voice-based virtual assistants has already been observed in the consumer space. Siri sets reminders, makes dinner reservations, pulls GPS directions, and enables a number of hands-free operations. Technologies like Alexa and Google Home can play requested media, read recipes, and manage shopping lists, all through voice commands.
Tens of millions of Americans regularly use these technologies — 41.4 million unique monthly users interact with Siri, and Amazon Alexa has more than tripled its usage in just one year to 2.6 million users in May 2017.
The popularity of these tools shows that users are comfortable engaging conversationally with technology and that people want to receive quick, personalized services in hands-free ways.
These insights are valuable for the enterprise because customers’ preferences aren’t compartmentalized to just one area of their life. A user who would rather make a dinner reservation using voice commands would likely prefer to submit a support ticket through voice commands, too.
Voice-based virtual assistants can play a crucial role within enterprises, both when customers expect a quick resolution to their support needs and when employees expect internal service delivery just as quickly.
Fueled by natural language processing, voice-based virtual assistants can field initial customer support requests and efficiently resolve simple tasks. Even better, they can manage this while providing a more personal experience through a human voice, empathy, and even sometimes humor.
These technologies can also be integrated into IT service desk solutions so that an employee seeking IT support communicates first with the artificial intelligence and is only directed to a human agent if necessary. Using the voice-based virtual assistant, the solution can offer self-help resources and resolve a number of level 1 support requests on its own, relieving human support agents and freeing them to focus on more strategic work.
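The triage flow above — the assistant resolves level 1 requests itself and hands everything else to a person — can be sketched as a simple router. This is an illustrative assumption, not any vendor's implementation: keyword matching stands in for the natural language processing layer a real product would use, and the playbook entries are invented.

```python
# Hypothetical sketch of tiered service-desk routing: the voice assistant
# answers simple "level 1" requests and escalates the rest to a human.

# Canned self-help responses for common level 1 topics (invented examples).
LEVEL1_PLAYBOOK = {
    "password reset": "I've sent a password-reset link to your email.",
    "vpn setup": "Here is the self-help guide for configuring the VPN.",
}

def route_request(transcript: str) -> tuple:
    """Return (handler, response) for a transcribed support request."""
    text = transcript.lower()
    for topic, reply in LEVEL1_PLAYBOOK.items():
        if topic in text:
            return ("assistant", reply)
    # Too complex for the assistant: escalate promptly, as customers expect.
    return ("human", "Connecting you to a support agent now.")
```

The design point is the fallback branch: as the next section argues, the smooth handoff to a human agent matters as much as what the assistant can resolve on its own.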
Humans and virtual assistants pair perfectly
Voice-based virtual assistants present a great opportunity, but they should only be one layer within enterprise service delivery and customer support.
An overwhelming 88 percent of consumers surveyed said they expect a natural transition between automated self-service technologies, like voice-based virtual assistants, and a human agent when purchasing a product, placing an order, or contacting customer service. It’s clear that humans will remain an essential part of service desk and support teams for the foreseeable future. The real challenge for enterprises is developing a strategy that combines these technologies with human agents while ensuring that support conversations transition smoothly from AI to human representatives.
As customer service and support expectations remain high and pressures continue to increase, delivering a great service experience will be crucial for businesses. New AI-backed voice-based virtual assistants provide an opportunity to make support efficient yet human — just the way we like it.