Everywhere you look these days, voice is becoming a dominant interface. Whether in a car, at home, in the office, or anywhere else, voice is connecting us with the technologies we use every day.
Juniper estimates that more than 3.25 billion voice-enabled devices are in circulation today, driving voice commerce to more than $80 billion by 2023. Companies across industries, incumbents and startups alike, are recognizing the opportunity and developing voice-enabled applications to create better, more efficient processes for businesses and consumers. According to PwC and CB Insights, venture capital funding of AI companies reached a record $9.3 billion, underscoring the momentum the speech-enablement and AI markets are gaining.
Of course, much of the hype has been driven, first, by Apple’s Siri, followed by Amazon’s Alexa, Microsoft’s Cortana, Google Assistant, and now Samsung’s Bixby. These are only the tip of the iceberg, but they have helped give validity to the AI market and created the demand for massive AI innovation.
Much of the growth can be attributed to advances in accuracy. If voice recognition engines didn't work, they wouldn't be useful. Today, most major platforms achieve accuracy well above 90%, making them much more enticing to users. In fact, 32% of American adults say they use voice search for the fun of it. That figure jumps to 51% for teenagers, who will soon be entering the workforce, and 55% of whom already use voice search daily.
What does this mean? comScore predicts that voice will be used for half of all searches by next year, and Gartner goes even further, saying that 30% of all searches will be done without a screen at all.
For businesses looking to engage their customers in the most efficient and desirable way, that means they had better get their AI development hats on quickly.
To help developers meet that demand, Adam Cheyer, co-founder of both Siri and Bixby, will be in Los Angeles tomorrow, June 15, to talk not only about Samsung's Bixby Developer program, but also about how AI can be combined with existing APIs and services to build rich conversational experiences for users.
It’s all about the experience, and Cheyer’s engagement at the Bixby Developer Session will let attendees experience firsthand, in an immersive, hands-on training environment, how new capabilities for voice interaction will help create new levels of interactive engagement for more than 500 million users through Bixby.
Details of tomorrow’s event:
Date: Saturday, June 15, 2019
Location: Cross Campus, 29 Colorado Avenue, Santa Monica, California 90401
If you’re on the East Coast, there will be a Bixby Developer Session in New York next weekend, Saturday, June 22. Details here.
The Future of Work Expo will take place February 12-14, 2020, in Ft. Lauderdale, Florida, featuring three days of discussion about how AI, chatbots, and automation are enabling businesses to reinvent themselves and become more agile and customer-centric.
Future of Work Contributor