Artificial intelligence in video games took a leap forward with this video from CES 2024! What does the future hold for players?
Nvidia revealed a little more of its AI project, called ACE, with a tech demo during CES 2024.
CES 2024 is an opportunity for everyone in the industry to show off their technological prowess, and AI is no exception. Nvidia chose the show to hold a technical demonstration of ACE (Avatar Cloud Engine), the artificial intelligence designed to make non-player characters (NPCs) in video games more realistic. With this tool, players can speak into their microphone and chat naturally with characters they do not control. It is hardly a surprise: artificial intelligence is becoming more and more present in the everyday lives of the general public.
Nvidia's tool lets developers create more realistic NPCs. During the demonstration, the audience was able to talk to Nova and to the chef of the ramen restaurant. After a question is asked, it takes a few seconds to get an answer. Each character can be configured to speak several languages, such as English or Spanish, and it is the developers who supply the data to be woven into the dialogue. NPCs can also express different moods, which come through in their facial expressions and the way they speak. A few minor errors still slip in here and there, but it is a breakthrough that heralds a major future addition to video games.
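To make the idea more concrete, here is a purely illustrative sketch of how a studio might describe such an AI-driven character: a persona with its languages, its current mood, and developer-supplied background facts, assembled into the context a language model would receive. None of these names (NPCPersona, build_prompt, etc.) come from Nvidia's actual ACE SDK; they are hypothetical.

```python
# Hypothetical sketch of an AI-driven NPC description; not Nvidia's ACE API.
from dataclasses import dataclass, field

@dataclass
class NPCPersona:
    name: str                       # e.g. the ramen chef from the demo
    languages: list[str]            # languages the character may answer in
    mood: str                       # current mood, reflected in tone and delivery
    lore: list[str] = field(default_factory=list)  # developer-supplied background facts

def build_prompt(persona: NPCPersona, player_line: str) -> str:
    """Assemble the context an underlying language model would receive."""
    facts = " ".join(persona.lore)
    return (
        f"You are {persona.name}, currently feeling {persona.mood}. "
        f"Answer only in {persona.languages[0]}. Background: {facts} "
        f"Player says: {player_line}"
    )

chef = NPCPersona(
    name="Ramen chef",
    languages=["English", "Spanish"],
    mood="cheerful",
    lore=["Runs a small ramen shop.", "Hears all the rumours in the city."],
)

# In a real pipeline this prompt would feed a speech-to-text / language model /
# text-to-speech chain; here we simply print it to show the flow.
print(build_prompt(chef, "What's good on the menu tonight?"))
```

In a shipped game, the reply would then drive the character's voice, facial animation, and lip-sync rather than being shown as text.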
Worrying progress?
In this short joint video, the editorial team wondered how this technology could actually be used. Nvidia has already announced partnerships with several major players, including Ubisoft, miHoYo (the studio behind Genshin Impact and Honkai: Star Rail), NetEase Games, Inworld, and Chinese giant Tencent. ACE looks full of promise, but for now it seems unlikely to replace the expertise of real professionals.
Although studios have had access to it for a few months, getting to grips with this new technology appears to be a real headache. The priority is to combine human expertise with artificial intelligence and to train employees so that they can use it easily and naturally. Perhaps it is also up to programmers to set limits on their own technology, knowing that Nvidia's AI can itself generate animations and lip-sync from a spoken sentence. If you are wondering what limits should be placed on AI so that it does not take precedence over human expertise, Panthaa and Anagund offer the beginning of an answer in this short joint video.