
In customer service operations, using AI to deliver better experiences is becoming commonplace. One type of AI that strengthens personalized day-to-day interactions is conversational AI, which relies on natural language processing (NLP) to enable technologies to understand and process human language, and to respond to it in a more natural, and ideally human-like, way.
Enter Character.AI, a full-stack conversational AI platform that gives consumers access to deeply personalized superintelligence. Character.AI recently announced a strategic partnership with a tech household name, Google Cloud, with the intent to use its cloud infrastructure to build and train sophisticated AI models that offer more advanced reasoning and greater accuracy than other enterprise-grade AI solutions.
This Character.AI-Google Cloud handshake looks to be a win-win situation. The details:
- Character.AI will use Google Cloud’s generative AI and large language model (LLM) infrastructure to meet the needs of consumers and experience creators alike. According to a Character.AI representative, combining the two companies’ AI capabilities is “set to enhance customer experiences by inspiring imagination, discovery and understanding.”
- Character.AI has also agreed to use Google Cloud’s Tensor Processing Units (TPUs) to train and run inference on LLMs more efficiently. These purpose-built TPUs fast-track AI training and inference, “accelerating computationally-intensive workloads with more than 100 petaflops of performance in a single pod.”
- Remember, Google Cloud’s infrastructure is known for its reliable security and scalability; it’s what Google Search and YouTube run on. Google Cloud, for its part, benefits by aligning itself with companies pushing conversational AI forward (Character.AI, in this case), positioning itself to support whatever the next leap in the fields and applications of AI may be.
Additionally, it’s worth noting that Character.AI will employ Google Cloud’s new A3 VMs running on NVIDIA H100 Tensor Core GPUs. This gives AI workloads a boost; specifically, customers “can more confidently tackle AI workloads with speed and flexibility, and it provides customer optionality between TPUs and GPUs to better meet their needs.”
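For readers curious what that TPU/GPU optionality can look like in practice, here is a minimal, purely illustrative JAX sketch (not Character.AI’s or Google Cloud’s actual code): the same compiled function runs unchanged on whichever accelerator backend happens to be available, falling back to CPU otherwise.

```python
# Illustrative sketch only: shows how one codebase can target TPUs or GPUs
# interchangeably in JAX. Names and sizes here are arbitrary assumptions.
import jax
import jax.numpy as jnp


def model_step(a, b):
    # Stand-in for a compute-heavy model operation (e.g., a large matmul).
    return jnp.dot(a, b)


# Pick whichever accelerator backend is present; fall back to CPU.
try:
    device = jax.devices("tpu")[0]
except RuntimeError:
    try:
        device = jax.devices("gpu")[0]
    except RuntimeError:
        device = jax.devices("cpu")[0]

# Place example inputs on the chosen device and run the jitted step there.
a = jax.device_put(jnp.ones((1024, 1024)), device)
b = jax.device_put(jnp.ones((1024, 1024)), device)
result = jax.jit(model_step)(a, b)

print(f"Ran on {device.platform}: result shape {result.shape}")
```

The point of the sketch is simply that the workload code itself stays the same; only the device selection changes, which is the kind of flexibility the announcement describes.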
As Noam Shazeer, CEO of Character.AI, has shared, "We've recognized the power and strength of Google Cloud's technology, legitimately from day one. As we continue our growth trajectory, working with Google Cloud's AI technologies was the obvious choice, allowing us to rapidly expand our compute abilities so we can deliver new features and capabilities to millions of users."
And per Thomas Kurian, CEO of Google Cloud, “We're offering Google Cloud's industry-leading infrastructure, Google’s foundation models and our AI tooling to companies across industries so they can build, train and deploy the future of AI creatively and at scale. Character.AI's cutting-edge conversational AI technology will create entirely new opportunities that transform how we interact with AI systems, and we are thrilled to be their partner as they continue to scale their vision.”
If you’re interested in more Character.AI-related news, the platform’s new mobile app has launched for iOS and Android.
Learn more about Google Cloud’s TPUs, cloud and AI infrastructure here, as well.
Edited by Greg Tavarez