Nvidia's "Big Move": What Will Games Look Like When AI Is Truly Integrated?

Author: Ge Jiaming

Has Nvidia, already on an unstoppable run, just dropped another bombshell that could change the game industry?

On May 28, Nvidia announced on its official website Avatar Cloud Engine (ACE) for Games, a custom AI model foundry service that supports AI model development both in the cloud and on PCs. Developers can use the service to build and deploy custom speech, dialogue, and animation AI models in their software and games.

At Computex 2023, a game demo called Kairos made its debut, bringing the long-quiet "metaverse" back into the public eye.

Nvidia CEO Jensen Huang demonstrated Cyberpunk 2077 running real-time ray tracing live on stage, and the lifelike character was generated with a real-time AI model rendering tool: NVIDIA ACE.

Nvidia is calling this development "a glimpse into the future of gaming." Could Nvidia be about to realize the metaverse that Meta failed to deliver?

Is Nvidia changing the game industry?

For a long time, Nvidia has spared no effort building out the metaverse, hoping to become an infrastructure provider for the field.

At the SIGGRAPH conference in August 2022, Nvidia flexed its muscles, demonstrating its latest progress: the launch of Omniverse ACE and the hope that Universal Scene Description (USD) could serve as the HTML of the metaverse.

The demo Nvidia showed this time is set in a cyberpunk-style ramen shop. Players need only press a button to talk to Jin, the shop's owner, using their own voice.

Jin is an NPC, and his dialogue with the player is generated entirely in real time by generative AI from the player's voice input. Jin also has realistic facial expressions and a voice, and his conversation with the player stays consistent with his backstory.

Nvidia said the demo was built together with its partner Convai, a startup focused on conversational AI for virtual games, using Nvidia's upcoming ACE (Avatar Cloud Engine) for Games. This AI model foundry service includes the following components:

  • NVIDIA NeMo: for building, customizing, and deploying language models using proprietary data. Language models can be tailored to match game settings and character backstories, with NeMo Guardrails preventing inappropriate or unsafe dialogue.
  • NVIDIA Riva: for automatic speech recognition and text-to-speech, enabling real-time voice conversations.
  • NVIDIA Omniverse Audio2Face: for quickly generating expressive facial animation for game characters from voice audio. Audio2Face supports the Omniverse connector for Unreal Engine 5, letting developers add facial animation directly to MetaHuman characters.
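
Taken together, the three components form a speech-in, animation-out pipeline: recognize the player's voice, generate an in-character reply, and animate the NPC's face. The sketch below is a simplified stand-in for that flow under those assumptions; every function and class here is hypothetical, not the real Riva, NeMo, or Audio2Face APIs.

```python
from dataclasses import dataclass

@dataclass
class NpcTurn:
    transcript: str   # what the player said (Riva ASR stand-in)
    reply: str        # generated dialogue (NeMo LLM stand-in)
    visemes: list     # mouth shapes for animation (Audio2Face stand-in)

def speech_to_text(audio: bytes) -> str:
    # Placeholder for Riva ASR; here we simply pretend the audio is UTF-8 text.
    return audio.decode("utf-8")

def generate_reply(transcript: str, persona: str) -> str:
    # Placeholder for a NeMo-customized language model. A real model would
    # condition on game lore; we template a canned in-character answer.
    return f"{persona}: You asked about '{transcript}'. Try the spicy ramen."

def audio_to_visemes(reply: str) -> list:
    # Placeholder for Audio2Face: map each vowel to a crude mouth shape.
    shapes = {"a": "open", "e": "wide", "i": "wide", "o": "round", "u": "round"}
    return [shapes[c] for c in reply.lower() if c in shapes]

def npc_turn(audio: bytes, persona: str = "Jin") -> NpcTurn:
    # One full conversational turn: speech -> dialogue -> facial animation.
    transcript = speech_to_text(audio)
    reply = generate_reply(transcript, persona)
    return NpcTurn(transcript, reply, audio_to_visemes(reply))
```

Each stage could be swapped independently, which matches Nvidia's point that developers can adopt the full solution or only the parts they need.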

Game developers can use the full ACE for Games solution, or just the parts they need, Nvidia said.

ACE for Games also offers custom AI model services, helping developers optimize models for their games' needs and run real-time inference via NVIDIA DGX Cloud, on GeForce RTX PCs, or locally. The models are latency-optimized to meet the demands of immersive, responsive in-game interactions.

According to Nvidia, generative AI has the potential to fundamentally change how players interact with game characters and to deepen immersion. Some analysts note that with its expertise and years of experience in AI and game development, Nvidia is leading the application of generative AI in games.

And just three days earlier, Nvidia had released VOYAGER, the first large-language-model-powered game agent capable of lifelong learning, giving the AI community a jolt of "Nvidia shock."

When Nvidia hooked GPT-4 into Minecraft, VOYAGER quickly became a seasoned explorer, obtaining 3.3x more unique items, traveling 2.3x farther, and unlocking key tech-tree milestones 15.3x faster than prior methods.
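
Behind those numbers, as publicly described, is a loop in which an automatic curriculum proposes tasks, the LLM writes executable code to attempt them, and verified skills accumulate in a growing skill library. The Python below is a toy rendering of that loop; the task list and every function are invented stand-ins, not VOYAGER's actual implementation.

```python
def propose_task(completed: set) -> str:
    # Stand-in for the automatic curriculum: pick the next unattempted task.
    curriculum = ["mine wood", "craft table", "mine stone", "craft pickaxe"]
    for task in curriculum:
        if task not in completed:
            return task
    return "explore"

def write_skill(task: str) -> str:
    # Stand-in for the LLM writing executable code for the task.
    return f"def skill():\n    return 'did: {task}'"

def try_skill(code: str) -> bool:
    # Stand-in for running the code in the game and checking for success.
    namespace = {}
    exec(code, namespace)
    return namespace["skill"]().startswith("did:")

def voyager_loop(steps: int) -> dict:
    # Lifelong learning: successful skills are kept and reused forever.
    skill_library, completed = {}, set()
    for _ in range(steps):
        task = propose_task(completed)
        code = write_skill(task)
        if try_skill(code):  # the real agent retries with feedback on failure
            skill_library[task] = code
            completed.add(task)
    return skill_library
```

The key design idea is that skills are stored as code rather than model weights, so the agent's abilities compound without retraining.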

Netizens began picturing future games in which NPCs are driven by large models, a teeming world where every character acts on its own.

A red-hot Nvidia seizes the opportunity and keeps charging ahead

How could Jensen Huang's ambitions stop at upending the future of gaming?

At the COMPUTEX 2023 conference, Jensen Huang announced a series of new AI products and services. Among them, the "super GPU" GH200 chip stole the show, and Nvidia announced that Google Cloud, Meta, and Microsoft will be among the first to receive it.

According to Nvidia, the new DGX GH200 AI supercomputing platform is designed for large-scale generative AI workloads.

This "super GPU" uses the new NVLink-C2C interconnect, combining the energy-efficient Arm-based Grace CPU with the high-performance NVIDIA H100 Tensor Core GPU to deliver up to 900 GB/s of total bandwidth.

DGX GH200 is designed to deliver maximum throughput and scalability. By using Nvidia's custom NVLink Switch chips, it bypasses the limits of standard cluster interconnects such as InfiniBand and Ethernet, providing the throughput needed to scale the largest workloads.

This supercomputer, built from 256 Grace Hopper superchips, delivers up to 1 exaflop of AI performance and 144 TB of shared memory (nearly 500 times that of the previous-generation DGX A100). In GPT-3 training, it can be 2.2x faster than a previous-generation DGX H100 cluster. Nvidia CEO Jensen Huang said:

The GH200 chip is a "giant GPU". This is the first time Nvidia has used the NVLink Switch topology to build an entire supercomputer cluster, multiplying GPU-to-GPU bandwidth several times over. The system contains 150 miles of fiber optics and weighs 40,000 pounds, yet behaves like a single GPU.
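
The quoted figures are roughly self-consistent. A quick back-of-the-envelope check, assuming each Grace Hopper superchip carries 480 GB of CPU LPDDR5X plus 96 GB of GPU HBM3 (public spec figures, not stated in this article):

```python
chips = 256
per_chip_gb = 480 + 96              # assumed Grace LPDDR5X + Hopper HBM3 per superchip
total_tib = chips * per_chip_gb / 1024   # 147,456 GB, i.e. the 144 TB quoted above

dgx_a100_gb = 320                   # 8 x 40 GB A100 GPUs in the original DGX A100
ratio = total_tib * 1024 / dgx_a100_gb   # roughly 461x, matching "nearly 500 times"
```

So the 144 TB figure is the pooled CPU-plus-GPU memory of all 256 superchips, compared against the GPU memory of a single original DGX A100.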

Nvidia and SoftBank subsequently announced they will work together to bring generative AI and 5G/6G technology, powered by the Grace Hopper superchip, to SoftBank's next-generation data centers. SoftBank shares rose more than 9% intraday, the biggest gain since May 2022.

At the same time, Nvidia introduced NVIDIA MGX, a reference architecture that lets system manufacturers quickly and cost-effectively build more than 100 server variants.

The specification is said to cut development costs by as much as three-quarters and development time by two-thirds, to just six months.

With MGX, technology companies can start from a base system architecture optimized for accelerated computing, then choose their own GPUs, DPUs, and CPUs for their servers.

MGX can also be easily integrated into cloud and enterprise data centers.

Nvidia also announced a partnership with WPP, the world's largest advertising and communications group, to use artificial intelligence and the "metaverse" to sharply reduce advertising production costs and improve efficiency, bringing yet another major change to the ad industry.

Nvidia says it is working with WPP to develop a content engine that leverages NVIDIA Omniverse and artificial intelligence to help creative teams produce high-quality commercial content faster, more efficiently, and at scale, while staying consistent with each client's core brand. It could bring "disruptive" innovation to the advertising industry:

The new engine will integrate 3D imaging software, production tools, and creative supply-chain tools into one ecosystem, including Adobe and Getty Images, allowing WPP artists and designers to combine 3D content creation with generative AI. Brands can then engage consumers in highly personal, more engaging ways while preserving their brand image.

Take cars as an example: the platform's 3D imaging software can generate a fully accurate, photorealistic image, which an AI engine then turns into a video or 2D ad. The car can be placed in a desert or on a rainy street and adapt to its environment, for instance reflecting headlight glare off wet pavement or catching the sun. Traditional green-screen or on-location shoots would take days to achieve the same. Faster production means campaigns can be quickly adapted to different markets or countries, such as placing the car on the streets of Hong Kong or New York, and customized for different digital channels (such as YouTube or TikTok) and specific audience segments.

WPP CTO Stephan Pretorius said that with generative AI, the 10,000 ad versions a client needs can now be created in minutes:

Our clients started asking us to use generative AI, and now we are able to use it to personalize ads for them.

WPP chief executive Mark Read said the technology will be the "foundation" of everything WPP does:

Generative AI is changing the marketing landscape at breakneck speed, and clients are looking for ways to quickly reduce production costs to meet the demands of new channels. Our partnership with Nvidia will create a unique competitive advantage for WPP through AI. This new technology will change the way brands market.

AI's disruption of the advertising industry has also stirred "unemployment fears" among many media practitioners. Read responded that it is fairly clear which jobs AI will replace first: the parts of the business that require no creative skill:

“We use AI extensively in our media business, but we use very little AI in the creative parts of our business.”
