Nvidia's "Big Move": What Will Games Look Like When AI Is Truly Integrated?
Author: Ge Jiaming
Has Nvidia, already on an unstoppable run, dropped another bombshell that could change the game industry?
On May 28, Nvidia announced on its official website the Avatar Cloud Engine (ACE) for Games, a custom AI model foundry service for games that supports AI model development both in the cloud and on PCs. Developers can use the service to build and deploy custom speech, dialogue, and animation AI models in their software and games.
On stage, Nvidia CEO Jensen Huang demonstrated Cyberpunk 2077 running with real-time ray tracing, and the lifelike character shown was rendered using Nvidia ACE, a real-time AI model rendering tool.
Nvidia calls this development "a glimpse into the future of gaming." Could Nvidia deliver the metaverse that Meta failed to realize?
Is Nvidia changing the game industry?
For a long time, Nvidia has spared no effort building out the metaverse, hoping to become an infrastructure provider for the field.
At the SIGGRAPH conference in August 2022, Nvidia flexed its muscles by demonstrating its latest progress: the launch of Omniverse ACE and a proposal that Universal Scene Description (USD) serve as the HTML of the metaverse.
The demo Nvidia showed this time is set in a cyberpunk-style ramen shop. Players need only press a button and speak to converse with Jin, the shop's owner.
Jin is an NPC whose dialogue with the player is generated entirely in real time by generative AI from the player's voice input. Jin also has realistic facial expressions and a realistic voice, and his conversation stays consistent with his backstory.
Nvidia said the demo was built together with its partner Convai, a startup focused on conversational AI for virtual games, using Nvidia's upcoming ACE (Avatar Cloud Engine) for Games. This AI model foundry service includes the following components:
- NVIDIA NeMo, for building, customizing, and deploying language models;
- NVIDIA Riva, for automatic speech recognition and text-to-speech;
- NVIDIA Omniverse Audio2Face, for generating facial animation from an audio track.
ACE for Games also offers AI model customization, helping developers optimize models for their games' needs and run real-time inference via NVIDIA DGX Cloud, on GeForce RTX PCs, or locally. The models are latency-optimized to meet the demands of immersive, responsive in-game interactions.
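The pipeline described above (player speech in, generated dialogue and facial animation out) can be sketched as a simple chain of stages. This is a minimal illustration of the architecture only: every function below is a hypothetical stand-in, not an actual ACE, NeMo, Riva, or Audio2Face API.

```python
from dataclasses import dataclass

# Hypothetical stand-ins for the three stages an ACE-style pipeline chains
# together: speech recognition -> dialogue generation -> speech/animation.
def transcribe(audio: bytes) -> str:
    # Stub: a real pipeline would run an ASR model here.
    return audio.decode("utf-8")

def generate_reply(npc_backstory: str, player_text: str) -> str:
    # Stub: a real pipeline would prompt a latency-optimized LLM,
    # conditioning on the NPC's backstory so replies stay in character.
    return f"[{npc_backstory}] responding to: {player_text}"

def synthesize_and_animate(reply: str) -> dict:
    # Stub: a real pipeline would produce audio plus facial-animation data.
    return {"audio": reply.encode("utf-8"), "blendshapes": [len(reply)]}

@dataclass
class NPC:
    name: str
    backstory: str

    def respond(self, player_audio: bytes) -> dict:
        text = transcribe(player_audio)
        reply = generate_reply(self.backstory, text)
        return synthesize_and_animate(reply)

# The ramen-shop demo, in miniature: press a button, speak, get a reply.
jin = NPC(name="Jin", backstory="ramen shop owner")
out = jin.respond(b"What's good today?")
```

The point of the structure is the latency budget: each stage must run fast enough that the full round trip feels like conversation, which is why Nvidia emphasizes latency-optimized models and local RTX inference.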
According to Nvidia, generative AI has the potential to completely change how players interact with game characters and to deepen immersion. Some analysts note that with its expertise and years of experience in AI and game development, Nvidia is leading the application of generative AI in games.
And just three days earlier, Nvidia had released VOYAGER, the first large-model-driven game agent capable of lifelong learning, giving the AI community a distinctly "Nvidia" jolt.
When Nvidia plugged GPT-4 into Minecraft, VOYAGER quickly became a seasoned explorer, obtaining 3.3x more unique items, traveling 2.3x farther, and unlocking key tech-tree milestones 15.3x faster than previous methods.
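The core idea behind an agent like this is a loop: the language model proposes a task, writes code to attempt it, and code that succeeds is stored in a growing skill library the agent can reuse. The sketch below illustrates only that loop shape; all functions are simplified stand-ins, not the actual Voyager implementation or GPT-4 calls.

```python
# Illustrative sketch of a Voyager-style lifelong-learning loop.
def propose_task(inventory: set, skills: dict) -> str:
    # Stub: the real agent asks an LLM for the next exploration task,
    # given what it has collected and what it can already do.
    return f"task_{len(skills)}"

def write_skill(task: str):
    # Stub: the real agent asks an LLM to emit executable game code.
    # Here a "skill" just adds the task's item to the inventory.
    return lambda inv: inv | {task}

def execute(skill, inventory: set):
    # Stub: the real agent runs the code in Minecraft and self-verifies.
    return skill(inventory), True

def voyager_loop(steps: int = 3):
    inventory, skills = set(), {}
    for _ in range(steps):
        task = propose_task(inventory, skills)
        skill = write_skill(task)
        inventory, ok = execute(skill, inventory)
        if ok:
            skills[task] = skill  # verified skills accumulate for reuse
    return inventory, skills
```

The skill library is what makes the learning "lifelong": unlike a fixed policy, the agent's capabilities compound over time, which is how the reported gains in items collected and milestones reached arise.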
Netizens began imagining future games in which NPCs driven by large models bring a vivid, flourishing world of autonomous characters to life:
Nvidia seizes the moment and keeps charging ahead
How could Jensen Huang's ambition stop at upending the future of gaming?
According to Nvidia, the new DGX GH200 AI supercomputing platform is designed for large-scale generative AI workloads.
This "super GPU" uses the new NVLink-C2C interconnect, combining the energy-efficient Arm-based Grace CPU with the high-performance NVIDIA H100 Tensor Core GPU to deliver up to 900 GB/s of total bandwidth.
DGX GH200 is designed to give customers maximum throughput and scalability, bypassing the limits of standard cluster interconnects such as InfiniBand and Ethernet by using Nvidia's custom NVLink Switch chips to scale the largest workloads.
Composed of 256 Grace Hopper superchips, this supercomputer delivers up to 1 exaflop of AI performance and 144 TB of shared memory (nearly 500 times that of the previous-generation DGX A100). In GPT-3 training, it can be 2.2x faster than a previous-generation DGX H100 cluster. Nvidia CEO Jensen Huang said:
Nvidia and SoftBank subsequently announced that they will work together to power SoftBank's next-generation data centers with Grace Hopper superchips for generative AI and 5G/6G applications. SoftBank's stock rose more than 9% intraday, its biggest gain since May 2022.
Nvidia also unveiled the MGX modular server specification, which is said to cut development costs by as much as three-quarters and shorten development time by two-thirds, to just six months.
With MGX, technology companies start from a basic system architecture optimized for accelerated computing, then choose their own GPU, DPU, and CPU for their servers.
MGX can also be easily integrated into cloud and enterprise data centers.
Nvidia also announced a partnership with WPP, the world's largest advertising and communications group, to use AI and the "metaverse" to dramatically cut the cost of producing advertising and improve efficiency. The advertising industry is once again facing a major upheaval.
WPP CTO Stephan Pretorius said that with generative AI, the 10,000 ad versions a client requires can now be created in minutes.
WPP CEO Mark Read said the technology will be the "foundation" of everything WPP does.
AI's disruption of advertising has also stirred "unemployment anxiety" among many media practitioners. Read responded that he can increasingly distinguish the jobs AI is about to replace: work in the business that requires no creative skill.