Future of Work News

NVIDIA Introduces 'Interactive Digital Human' James at Recent Event


The earliest iterations of digital agents in customer support weren't very popular, and for good reason. They were inaccurate, they were often flat and emotionless, and they frequently forced customers to repeat their information over and over because the agents "didn't quite get that." It's no wonder the technology generated a plethora of memes, jokes and widespread customer frustration.

Today, however, advances in AI have made virtual agents more responsive, more accurate, more useful and, ultimately, more human.

At the recent SIGGRAPH event, NVIDIA introduced the world to “James,” the technology giant’s “interactive digital human” designed to make better connections with customers using emotions, humor and more. James is based on a customer-service workflow using NVIDIA ACE, a reference design for creating custom, hyperrealistic and interactive avatars.

In addition, NVIDIA introduced advancements to the NVIDIA Maxine AI platform, including Maxine 3D and Audio2Face-2D for an immersive telepresence experience. Developers can use Maxine and NVIDIA ACE digital human technologies to make customer interactions with digital interfaces more engaging and natural, according to NVIDIA. ACE technologies enable digital human development with AI models for speech and translation, vision, intelligence, lifelike animation and behavior, and realistic appearance. Companies across industries are using Maxine and ACE to deliver immersive virtual customer experiences.

“Built on top of NVIDIA NIM microservices, James is a virtual assistant that can provide contextually accurate responses,” wrote NVIDIA’s Ike Nnoli in a recent blog post. “Using retrieval-augmented generation (RAG), James can accurately tell users about the latest NVIDIA technologies. ACE allows developers to use their own data to create domain-specific avatars that can communicate relevant information to customers. James is powered by the latest NVIDIA RTX rendering technologies for advanced, lifelike animations. His natural-sounding voice is powered by ElevenLabs. NVIDIA ACE lets developers customize animation, voice and language when building avatars tailored for different use cases.”
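The retrieval-augmented generation approach Nnoli describes can be illustrated with a minimal sketch: retrieve the knowledge-base snippets most similar to the user's question, then prepend them as grounding context for the language model's prompt. Everything below is illustrative, not NVIDIA's ACE or NIM API; the toy bag-of-words similarity stands in for the learned embedding models a production system would use.

```python
# Minimal RAG sketch (illustrative only; not an NVIDIA API).
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; real systems use learned vector models.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    # Rank knowledge-base snippets by similarity to the user query.
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    # Ground the model's answer in the retrieved context.
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

# Hypothetical knowledge base drawn from claims in this article.
kb = [
    "NVIDIA ACE is a reference design for interactive digital human avatars.",
    "Maxine 3D converts 2D video into an immersive 3D telepresence experience.",
]
print(build_prompt("What is NVIDIA ACE?", kb))
```

The domain-specific grounding the article mentions comes from the `kb` step: swapping in a company's own documents changes what the avatar can accurately say without retraining the underlying model.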




Edited by Alex Passett

Future of Work Contributor
