Everywhere you look these days, voice is becoming a dominant interface. Whether in the car, at home, in the office, or anywhere else, voice connects us with the technologies we use every day.
Juniper estimates that more than 3.25 billion voice-enabled devices are in circulation today, a base expected to drive voice-driven commerce past $80 billion by 2023. Companies across industries – incumbents and startups alike – are recognizing this shift and developing voice-enabled applications to deliver better, more efficient processes for businesses and consumers. According to PwC and CB Insights, venture capital funding of AI companies reached a record $9.3 billion, underscoring the momentum the speech-enablement and AI markets are gaining.
Of course, much of the hype has been driven, first, by Apple’s Siri, followed by Amazon’s Alexa, Microsoft’s Cortana, Google Assistant, and now Samsung’s Bixby. These are only the tip of the iceberg, but they have helped give validity to the AI market and created the demand for massive AI innovation.
Much of the growth can be attributed to advances in accuracy. If voice recognition engines didn't work, they wouldn't be useful. Today, most major platforms are well above 90% accuracy, making them far more enticing to users. In fact, 32% of American adults say they use voice search for the fun of it. Among teenagers – who will soon be entering the workforce – that figure jumps to 51%, and 55% already use voice search daily.
What does this mean? comScore predicts that voice will account for half of all searches by next year, and Gartner goes further, predicting that 30% of all searches will be conducted without a screen at all.
For businesses looking to engage their customers in the most efficient and desirable way, that means they had better get their AI development hats on quickly.
To that end, Adam Cheyer, co-founder of both Siri and Bixby, will be in Los Angeles tomorrow, June 15, to talk not only about Samsung's Bixby Developer program, but also about how AI can be combined with existing APIs and services to build rich conversational experiences for users.
It's all about the experience. At the Bixby Developer Session, Cheyer will give attendees firsthand, immersive, hands-on training in how new voice-interaction capabilities can create new levels of interactive engagement for the more than 500 million users Bixby reaches.
Details of tomorrow’s event:
Date: Saturday, June 15, 2019
Location: Cross Campus, 29 Colorado Avenue, Santa Monica, California 90401
If you’re on the East Coast, there will be a Bixby Developer Session in New York next weekend, Saturday, June 22. Details here.
The Future of Work Expo will take place February 12-14, 2020, in Ft. Lauderdale, Florida, featuring three days of discussion about how AI, chatbots, and automation are enabling businesses to reinvent themselves and become more agile and customer-centric.
Future of Work Contributor