
To stimulate metaverse development and make it more realistic, Nvidia has just unveiled new tools for virtual world creators. On the agenda: realistic, expressive avatars, embodied virtual assistants, physics simulations and a 3D Internet…

Since early summer 2022, the hype around the metaverse seems to be fading. The virtual real estate market is in freefall, and the creator of Ethereum recently said that Mark Zuckerberg’s project is doomed to failure.

However, one of the main players in the metaverse industry doesn’t see it that way. Nvidia, the graphics card manufacturer, has not said its last word and intends to rekindle enthusiasm around the metaverse.

On Tuesday, August 9, 2022, at the SIGGRAPH conference in Vancouver, Nvidia announced a series of new Omniverse tools for designers and developers of virtual worlds. The aim is to make metaverse environments more realistic.

In particular, the company unveiled new artificial intelligence and simulation tools. Users of the Omniverse Kit and tools such as Nucleus, Audio2Face and Machinima will be able to take advantage of new creative possibilities.

According to Nvidia, these tools will mainly improve the construction of “digital twins and realistic avatars”.

Omniverse Avatar Cloud Engine (ACE): a tool for creating realistic 3D avatars

The quality of interactions in the metaverse is a topic of debate. Developers and users are still looking for the right balance between quantity and quality of experience.

For example, in spring 2022, during the first Fashion Week organized in the metaverse, most participants complained about the poor quality of the digital environments, the virtual clothes and, above all, the avatars.

With the Avatar Cloud Engine (ACE) included in the new Omniverse toolkit, Nvidia intends to solve this problem. This engine will improve the conditions for creating “virtual assistants and digital humans”.

According to the company, “with Omniverse ACE, developers can build, configure and deploy avatar applications on almost any engine, on any public or private cloud”.

As Simon Yuen, senior director of avatar technology, explains, ACE is based on a 3D framework that models the human skeleton and muscles. Users simply upload a photo to create a 3D model of the person.

Audio2Face maps emotions onto avatars’ faces

The Audio2Face application also benefits from an update focused on digital identity. This technology makes it possible to match an avatar’s facial expressions to the words it utters.

In addition, the Audio2Emotion tool will now allow developers to change an avatar’s facial expression according to the emotion conveyed by the user’s words.

According to Nvidia’s official press release, users can now direct the emotions of digital avatars, including their facial animations. Developers will be able to accentuate or soften these expressions.
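To make the idea concrete, here is a minimal Python sketch of how an emotion intensity value could scale a set of facial blendshape weights. This is not Nvidia’s Audio2Emotion API; every name in it is a hypothetical placeholder chosen purely to illustrate the “accentuate or soften” control described above.

```python
# Hypothetical sketch, NOT Nvidia's Audio2Emotion API: illustrates how an
# emotion intensity value could accentuate or soften a facial expression
# expressed as blendshape weights in [0, 1]. All names are assumed.

NEUTRAL = {"mouth_smile": 0.0, "brow_raise": 0.0, "eye_squint": 0.0}
JOY = {"mouth_smile": 0.9, "brow_raise": 0.4, "eye_squint": 0.5}

def blend_expression(intensity: float) -> dict[str, float]:
    """Interpolate from the neutral face toward the target emotion.

    intensity < 1.0 softens the expression; intensity > 1.0 accentuates it.
    Weights are clamped to the valid [0, 1] range.
    """
    return {
        key: min(1.0, NEUTRAL[key] + intensity * (JOY[key] - NEUTRAL[key]))
        for key in NEUTRAL
    }

# An emotion score detected from speech could drive the weights per frame:
print(blend_expression(0.5))   # softened smile
print(blend_expression(1.2))   # accentuated smile
```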

Virtual assistants with artificial intelligence

The ACE engine will also make it possible to create digital assistants embodied by avatars in the metaverse. Nvidia CEO Jensen Huang wants to enable the creation of genuine robots capable of perceiving their environment, drawing on their knowledge and acting accordingly.

According to him, “avatars will populate virtual worlds to help us create and build things, will be brand ambassadors and customer service agents, and will help you find information on a website, order at the drive-thru, or recommend insurance”.

According to Technavio analysts, the metaverse market should exceed $50 billion by 2026. If these predictions hold, the technology should take off and attract new users.

Already today, the virtual world is developing, and we’re seeing the appearance of new events, offices and even classrooms. As it becomes more populated, this digital universe will attract more and more users.

These users seek to create a digital version of themselves. The development of tools to support this mass adoption of the metaverse is therefore essential.

Universal Scene Description: towards a 3D Internet?

In the eyes of Nvidia’s CEO, the metaverse is “the next evolution of the Internet”. He asserts that “the metaverse is the Internet in 3D, a network of persistent, connected virtual worlds. The metaverse will extend 2D pages into 3D spaces and worlds. Hyperlinking will evolve into hyperjumping between 3D worlds”.

However, for this prediction to come true, developers will need a standard for building metaverse worlds, much as HTML serves web pages. This is why Nvidia supports the development of the Universal Scene Description (USD) format, initially created by the Pixar animation studio.

Nvidia’s ambition is to help develop the USD format so that it can support complex, changing 3D worlds, not just animated film scenes.
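To give a sense of what the format looks like in practice, here is a minimal scene authored with USD’s open-source Python bindings (the `pxr` module that ships with Pixar’s USD distribution); the file path and prim names are arbitrary examples:

```python
# Minimal USD scene authored with Pixar's open-source Python bindings.
# Requires the USD distribution (the `pxr` module); names are arbitrary.
from pxr import Usd, UsdGeom

# Create a new stage; the .usda extension yields a human-readable text file.
stage = Usd.Stage.CreateNew("hello_metaverse.usda")

# Define a transform as the scene root and a sphere beneath it.
world = UsdGeom.Xform.Define(stage, "/World")
sphere = UsdGeom.Sphere.Define(stage, "/World/Sphere")
sphere.GetRadiusAttr().Set(2.0)

# Mark the root prim so other files can reference this scene, then save.
stage.SetDefaultPrim(world.GetPrim())
stage.GetRootLayer().Save()
```

USD’s layered composition, where many files can reference and override one another, is what makes it a candidate for describing large, collaboratively edited worlds rather than single film shots.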

Other companies supporting this new standard include Apple, Epic Games, Autodesk, BMW, Disney, Industrial Light & Magic and DreamWorks.

Another tool highlighted by Nvidia is PhysX, a realistic real-time physics simulation engine. Developers can use it to add physics-based reactions to the metaverse.
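Conceptually, a real-time physics engine advances rigid bodies by integrating forces once per frame. The standalone Python sketch below is not the PhysX API, only an assumed minimal model of the kind of per-frame update such an engine performs:

```python
# Conceptual sketch of a per-frame rigid-body update, NOT the PhysX API:
# a real engine layers collision detection, constraints and solvers on top.

GRAVITY = -9.81      # m/s^2, along the y axis
TIMESTEP = 1 / 60.0  # one 60 Hz frame

def step(position_y: float, velocity_y: float) -> tuple[float, float]:
    """Semi-implicit Euler integration for a body falling onto a ground plane."""
    velocity_y += GRAVITY * TIMESTEP
    position_y += velocity_y * TIMESTEP
    if position_y < 0.0:          # naive ground collision with restitution
        position_y = 0.0
        velocity_y = -0.5 * velocity_y
    return position_y, velocity_y

# Drop a body from 5 m and simulate two seconds of bouncing.
y, vy = 5.0, 0.0
for _ in range(120):
    y, vy = step(y, vy)
print(f"height after 2 s: {y:.2f} m")
```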

Since the emergence of the metaverse, Nvidia has been one of the forerunners of this new industry. Its artificial intelligence technology is widely used by designers, and these new developer tools will further strengthen the firm’s influence.