Future of Work News

NVIDIA Introduces 'Interactive Digital Human' James at Recent Event


The earliest iterations of digital agents in customer support weren’t very popular, and for good reason. They were inaccurate, they were often flat and emotionless, and they frequently required customers to shout their information repeatedly because the agents “didn’t quite get that.” It’s no wonder the technology generated a plethora of memes, jokes and widespread customer frustration.

Today, however, advanced technology has made inroads into making virtual agents more responsive, more accurate, more useful and more human.

At the recent SIGGRAPH event, NVIDIA introduced the world to “James,” the technology giant’s “interactive digital human” designed to make better connections with customers using emotions, humor and more. James is based on a customer-service workflow using NVIDIA ACE, a reference design for creating custom, hyperrealistic and interactive avatars.

In addition, NVIDIA introduced advancements to the NVIDIA Maxine AI platform, including Maxine 3D and Audio2Face-2D for an immersive telepresence experience. Developers can use Maxine and NVIDIA ACE digital human technologies to make customer interactions with digital interfaces more engaging and natural, according to NVIDIA. ACE technologies enable digital human development with AI models for speech and translation, vision, intelligence, lifelike animation and behavior, and realistic appearance. Companies across industries are using Maxine and ACE to deliver immersive virtual customer experiences.

“Built on top of NVIDIA NIM microservices, James is a virtual assistant that can provide contextually accurate responses,” wrote NVIDIA’s Ike Nnoli in a recent blog post. “Using retrieval-augmented generation (RAG), James can accurately tell users about the latest NVIDIA technologies. ACE allows developers to use their own data to create domain-specific avatars that can communicate relevant information to customers. James is powered by the latest NVIDIA RTX rendering technologies for advanced, lifelike animations. His natural-sounding voice is powered by ElevenLabs. NVIDIA ACE lets developers customize animation, voice and language when building avatars tailored for different use cases.”
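The retrieval-augmented generation (RAG) approach mentioned in the quote works by fetching relevant documents for a user's question and grounding the model's answer in them. The sketch below illustrates the general pattern only; the document store, bag-of-words similarity scoring, and prompt format are illustrative assumptions, not NVIDIA ACE or NIM APIs.

```python
# Minimal RAG sketch: retrieve the most relevant document for a query,
# then build a prompt that grounds the answer in that context.
# All documents and function names here are hypothetical examples.
from collections import Counter
import math

DOCS = [
    "NVIDIA ACE is a reference design for interactive digital human avatars.",
    "NVIDIA Maxine adds Maxine 3D and Audio2Face-2D for telepresence.",
    "RTX rendering technologies power lifelike avatar animation.",
]

def _vec(text):
    # Simple bag-of-words term counts; real systems use learned embeddings.
    return Counter(text.lower().split())

def _cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs=DOCS, k=1):
    """Return the k documents most similar to the query."""
    ranked = sorted(docs, key=lambda d: _cosine(_vec(query), _vec(d)), reverse=True)
    return ranked[:k]

def build_prompt(query):
    """Prepend retrieved context so the model answers from it, not memory."""
    context = "\n".join(retrieve(query))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer using only the context."
```

In a production avatar pipeline the prompt would be sent to a language model; grounding the answer in retrieved, domain-specific documents is what lets a single base model serve different customer-service deployments.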




Edited by Alex Passett
