Future of Work News

NVIDIA Introduces 'Interactive Digital Human' James at Recent Event


The earliest iterations of digital agents in customer support weren’t very popular, and for good reason. They were inaccurate, they were often flat and emotionless, and they frequently forced customers to repeat (or shout) their information because the agents “didn’t quite get that.” It’s no wonder the technology spawned a plethora of memes, jokes and general customer frustration.

Today, however, advances in AI have made virtual agents more responsive, more accurate, more useful and more human.

At the recent SIGGRAPH event, NVIDIA introduced the world to “James,” the technology giant’s “interactive digital human” designed to make better connections with customers using emotions, humor and more. James is based on a customer-service workflow using NVIDIA ACE, a reference design for creating custom, hyperrealistic and interactive avatars.

In addition, NVIDIA introduced advancements to the NVIDIA Maxine AI platform, including Maxine 3D and Audio2Face-2D for an immersive telepresence experience. Developers can use Maxine and NVIDIA ACE digital human technologies to make customer interactions with digital interfaces more engaging and natural, according to NVIDIA. ACE technologies enable digital human development with AI models for speech and translation, vision, intelligence, lifelike animation and behavior, and realistic appearance. Companies across industries are using Maxine and ACE to deliver immersive virtual customer experiences.

“Built on top of NVIDIA NIM microservices, James is a virtual assistant that can provide contextually accurate responses,” wrote NVIDIA’s Ike Nnoli in a recent blog post. “Using retrieval-augmented generation (RAG), James can accurately tell users about the latest NVIDIA technologies. ACE allows developers to use their own data to create domain-specific avatars that can communicate relevant information to customers. James is powered by the latest NVIDIA RTX rendering technologies for advanced, lifelike animations. His natural-sounding voice is powered by ElevenLabs. NVIDIA ACE lets developers customize animation, voice and language when building avatars tailored for different use cases.”
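The retrieval-augmented generation (RAG) pattern Nnoli describes can be sketched in miniature: retrieve the documents most relevant to a user's question, then ground the generated answer in them. The toy keyword retriever, sample documents and `answer` helper below are illustrative stand-ins, not NVIDIA's implementation; a production avatar would use an embedding model and an LLM (for example, served via NIM microservices) in place of the keyword match.

```python
# Minimal retrieval-augmented generation (RAG) sketch.
# Retrieval here is plain keyword overlap so the example runs
# with no dependencies; real systems use vector embeddings.

DOCS = [
    "NVIDIA ACE is a reference design for interactive digital humans.",
    "Maxine 3D converts 2D video into a 3D telepresence experience.",
    "Audio2Face-2D animates a portrait photo from an audio track.",
]

def tokens(text: str) -> set[str]:
    """Lowercase words with trailing punctuation stripped."""
    return {w.strip("?.!,").lower() for w in text.split()}

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Rank documents by how many words they share with the query."""
    q = tokens(query)
    ranked = sorted(docs, key=lambda d: len(q & tokens(d)), reverse=True)
    return ranked[:k]

def answer(query: str) -> str:
    """Ground the response in retrieved context (LLM call omitted)."""
    context = retrieve(query, DOCS)[0]
    return f"Based on our docs: {context}"

print(answer("What is NVIDIA ACE?"))
```

The key point, as in the quote above, is that the model's response is constrained to the developer's own domain data rather than whatever the base model happens to recall.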




Edited by Alex Passett

Future of Work Contributor


Related Articles

Inflection AI Expands Enterprise AI Capabilities with BoostKPI and Jelled.ai Acquisitions

By: Greg Tavarez    12/10/2024

Inflection AI deepened its commitment to enterprise AI with two new acquisitions: BoostKPI and Jelled.ai.


LTIMindtree Invests in, Partners with Voicing.AI to Advance Enterprise Automation

By: Alex Passett    12/9/2024

LTIMindtree officially announced its partnership with Voicing.AI, a startup that provides responsible, enterprise-grade AI to power new revenue growth…


Discuss Cybersecurity and Privacy in the Future of Work at Future of Work Expo 2025

By: Greg Tavarez    12/5/2024

"Cybersecurity and Privacy - Why So Central to Future of Work" is for those interested in exploring the latest trends and challenges in cybersecurity …


Front Gains Edge in AI-Driven Customer Service with Idiomatic

By: Greg Tavarez    12/4/2024

Front recently acquired Idiomatic, a pioneering AI-powered voice-of-customer intelligence platform previously used by various customer-centric brands.


LTIMindtree and Microsoft Combine Forces for Global Impact on AI Solutions

By: Greg Tavarez    12/2/2024

Microsoft and LTIMindtree will collaborate to create a joint go-to-market strategy and make joint investments in AI-powered solutions.
