Nvidia CEO Jensen Huang hopes AI will help gaming rather than hurt it | The DeanBeat


SOURCE: GAMESBEAT.COM
JAN 21, 2026

For much of Nvidia’s history, the company’s gaming technologies have developed symbiotically with the non-gaming technologies, including AI, that the company is now better known for.

But the current shortage of memory chips stems from the explosion of demand for Nvidia’s AI chips. And the consequent increase in memory chip prices (40% to 50% in Q4 alone) stands to hurt game companies that are trying to make affordable gaming PCs and low-cost game consoles. Gaming hardware prices are going up.

This situation could delay the launch of new game consoles, and it has already made the current products on the market less affordable. That could stall any natural recovery happening in the games market.

Jensen Huang, CEO of Nvidia, held a 90-minute press Q&A at CES 2026, a day after he gave a speech outlining Nvidia’s advances in AI and gaming. Nvidia has become the most valuable public company in the world at $4.55 trillion thanks to its advances in AI, not its recent gaming advances, which have been stalled in part by the slowing of Moore’s Law. But much of that growth in market value happened because of the symbiotic link between AI and games. Will that still be the case in the future?

The big questions for AI and gaming

Nvidia’s DLSS4 tech at CES 2026. Source: Nvidia

So the question remains. Will AI help gaming, or will it hurt it?

At Nvidia’s press Q&A at CES 2026 (which I have written about fully in this story), I asked Huang this question: “We know that a secondary effect of demand for AI is a shortage of memory. That’s making the price of game machines go up right now. Gamers are worried their graphics card prices are going up as well. I wonder if you’re concerned that gamers, who are early adopters, are going to resent AI because of this. And do you wish you’d bought a DRAM factory?”

Sadly, Huang answered the second part of my question, but not the first.

“You guys don’t know this, but Nvidia is probably the only chip company in the world that’s been a large purchaser of memory,” he said. “We’re one of the largest purchasers of memory, direct purchasing, in the world. We work with every single memory supplier. Some of them on HBM.”

He said Nvidia was the first consumer of HBM4 memory chips.

Nvidia continues to make advances with DLSS for gaming. Source: Nvidia

“All of those factories being made for HBM, thank goodness, we’re the only user of it,” he said. “We’re not expecting anybody else to use HBM4 for some time. We have the benefit of being the primary, the only consumer of HBM4. Because our demand is so high, every factory, every HBM supplier, is gearing up. They’re all doing great. Every single one of them is doing fantastically.”

He also said Nvidia purchases GDDR memory chips for graphics cards.

“We’ve been a very significant GDDR consumer for a very long time. We’ve planned that out with our suppliers for quite some time,” Huang said. “We’re also a purchaser of LPDDR5 for the Grace and Vera memories that we use for context memory.”

He said AI needs to have working memory and long-term memory.

“That’s where we store it. We’ve been a very large customer, a direct customer. We do a very good job planning with our supply chain. They’re doing a fantastic job supporting us. We’re grateful for that,” he said. “But I think with respect to–in the final analysis, the fact of the matter is, the world is going to need more fabs.”

He said that today’s data centers are best described as AI factories, a brand-new category of factory that has never existed before.

“We now believe that this segment of the world’s infrastructure called AI factories is going to be gigantic. There are going to be a lot of fabs built. It’s great to be a memory supplier, great to be a semiconductor manufacturer. TSMC is doing great. The demand out there is quite terrific,” he said.

It was clear to me from that answer that Huang is more concerned about keeping the cost of memory low so that AI can keep growing than about helping the game market stabilize with lower hardware costs.

Will Nvidia still make progress on GeForce gaming GPUs?

Nvidia’s GeForce RTX 50 Series graphics cards. Source: GamesBeat

Nvidia’s GeForce, which introduced programmable shaders to the world, helped gaming advance. That programmability led to the CUDA language, which enabled developers to program non-gaming applications on Nvidia’s graphics processing units (GPUs). That, in turn, led to deep learning, in which neural networks and AI models are trained by feeding tons of data into them.

This is how gaming technology gave birth to the AI revolution a decade ago, leading to advances like OpenAI’s ChatGPT, built on GPT-3.5, which kicked off the explosion of the last three years. It was symbiosis. But Huang noted that the reverse has now happened, with AI-driven software technology coming back to benefit gaming graphics.

DLSS is short for Deep Learning Super Sampling, an AI-driven technology that Nvidia uses to make games look sharper and run faster. With DLSS, the game renders at a lower internal resolution, and AI models trained on Nvidia’s supercomputers upscale the image to a higher resolution.

This is like using AI to upscale an image and reconstruct details based on what it believes the frame should look like, drawing on past motion and learned patterns. DLSS 3 and later also generate new frames: given the current frame, the previous frame, and motion vectors showing where objects are moving, the AI model synthesizes an intermediate frame between the two. The GPU renders fewer real frames, and DLSS fills in the rest. This is how AI can multiply frame rates. Some gamers aren’t happy with the quality of generated frames, but many accept that the performance gains are worth it.
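To make the frame-generation idea concrete, here is a minimal sketch of motion-vector frame interpolation in NumPy. It is a toy illustration of the concept, not Nvidia’s actual DLSS pipeline, which uses trained neural networks running on dedicated GPU hardware; the warp function, the half-step blend, and all the names below are my own simplifications.

```python
# Toy motion-vector frame interpolation: warp two rendered frames toward
# their midpoint and blend them. Illustrative only; real DLSS frame
# generation replaces the blend with a trained neural network.
import numpy as np

def warp(frame, motion, t):
    """Shift each pixel a fraction t along its motion vector
    (nearest-neighbor warp; a real implementation filters carefully)."""
    h, w, _ = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip(np.round(ys - t * motion[..., 1]).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xs - t * motion[..., 0]).astype(int), 0, w - 1)
    return frame[src_y, src_x]

def interpolate(prev_frame, curr_frame, motion):
    """Synthesize the frame halfway between two rendered frames."""
    fwd = warp(prev_frame, motion, 0.5)   # previous frame, half a step forward
    bwd = warp(curr_frame, -motion, 0.5)  # current frame, half a step back
    return 0.5 * fwd + 0.5 * bwd          # a network would weigh these per pixel

# Two rendered frames plus motion vectors yield three displayed frames.
h, w = 4, 4
prev_f = np.zeros((h, w, 3))              # dark frame
curr_f = np.ones((h, w, 3))               # bright frame
motion = np.zeros((h, w, 2))              # static scene for this demo
print(interpolate(prev_f, curr_f, motion)[0, 0])  # [0.5 0.5 0.5]
```

A real frame generator also has to handle disocclusions, regions visible in only one of the two frames, which is one reason a learned model beats a simple blend.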

At CES 2026, Nvidia showed how DLSS advances continue to help gaming.

Another questioner at the press event also brought up gaming. He noted that Nvidia continues to push DLSS to get better and faster at using AI to predict pixels that the graphics card would otherwise have to render.

Huang interrupted to note that it was pretty amazing how gaming and AI technology evolved together. GeForce brought CUDA to the world, which brought AI to the world. After that, Nvidia used AI to bring RTX and DLSS to gamers. Without GeForce there would be no AI today, he said, and without AI there would be no DLSS today. That tells you something.

Question: It’s harmonious.

Huang: Yeah, it’s incredible.

Gaming came up again in the Huang Q&A. A journalist asked, “Is the RTX 5090 the fastest GPU that gamers will ever see in traditional rasterization? What does an AI gaming GPU look like in the future?”

Huang said the answer is hard to predict.

“Maybe another way of saying it is that the future is neural rendering. It’s basically DLSS. That’s the way graphics ought to be,” Huang said. “You’re going to see more and more advances in DLSS. We’re working on things in the lab that are just utterly shocking and incredible. I would expect that the ability for us to generate imagery of almost any style, from photorealism, extreme photorealism, basically a photograph, interacting with you at 500 frames a second, all the way to cartoon shading if you want–all of that entire range is going to be quite sensible to expect.”

He said we should also expect future video games to have AI characters within them. It’s almost as if every character will have its own AI, and every character will be animated using AI, the way a robot is.

“The realism of these games is going to climb in the next several years. It’s going to be quite extraordinary. This is a great time to be in video games, frankly,” Huang said.

Nvidia’s headquarters in the digital realm. Source: Nvidia

Another writer asked about gaming, specifically about the prices of gaming GPUs getting so high. The writer asked, “We might be seeing restrictions on supply and production capacity. Do you think that maybe spinning up production on some of the older generation GPUs on older process nodes where there might be more available production capacity would help that? Or maybe also increasing supply of GPUs with lower amounts of DRAM? Are there steps that can be taken, any specific color you can give us on that?”

“We could possibly, depending on which generation, also bring the latest generation of AI technology to the previous generation GPUs,” Huang said. “That would require a fair amount of engineering, but it’s within the realm of possibility. I’ll go back and take a look at this. It’s a good idea.”

Another game-focused writer wanted to come back to gaming and GeForce. The writer said Nvidia proved with DLSS that AI can work magic with gaming framerates.

“But I think there’s still a fundamental bottleneck at the bottom, which is raw shader and raytracing power. We have six times multi frame gen now, but that doesn’t magically make a 15 frames per second game playable. Path tracing also requires a tremendous amount of GPU power at this base level. Are we really never going to see a substantial jump in GeForce shader performance again?” the writer said.
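To put rough numbers on the writer’s point (my own illustration with assumed figures, not numbers from the session): frame generation multiplies the frames shown on screen, but responsiveness is still tied to the frames the GPU actually renders, because player input is only sampled once per real frame.

```python
# Back-of-the-envelope math behind the "15 fps isn't playable" complaint.
# The base_fps and multiplier values are illustrative assumptions.
base_fps = 15                         # real frames the GPU renders per second
multiplier = 6                        # e.g. six-times multi frame generation
displayed_fps = base_fps * multiplier
input_latency_ms = 1000 / base_fps    # input sampled once per real frame

print(f"Displayed: {displayed_fps} fps")              # 90 fps on screen
print(f"Responsiveness: ~{input_latency_ms:.0f} ms")  # still feels like 15 fps
```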

“Moore’s Law has ended. We make these chips as large as we possibly can. If you look at a 5090, holy cow. It’s giant. You’re really at the limits, physical limits. That’s the reason why we invented DLSS in the first place,” Huang said. “For all of you who’ve been reviewing Nvidia’s technology for a long time, you probably remember when I first announced RTX. It was at SIGGRAPH. The only way to make it–raytracing requires so much computation, the framerate was incredibly low. In order to get the framerate high, we used uprezzing, essentially, AI, to generate samples that otherwise we didn’t compute at all. We use AI to do that. It’s called DLSS.”

When Nvidia first launched DLSS, Huang said he got a lot of criticism.

“Now people love it. The bottom line is that in the future, it’s very likely that we’ll do more and more computation on fewer and fewer pixels,” he said. “By doing so, the pixels that we do compute are insanely beautiful. Then we use AI to infer what must be around them.”

He said it was like generative AI, except Nvidia heavily conditions it with the rendered pixels.

“I think this likely will be the future. And then the fusion between rendering and generative AI, the generative AI we all know now, that fusion will likely also happen in the future,” he said.

And he said, “I’m very optimistic about the future of computer graphics. We’re about to see another inflection point coming.”
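As a rough illustration of what “more computation on fewer pixels” plus AI inference could look like, here is a minimal sketch. It is my own toy, not Nvidia’s method: a real system would use a trained neural network conditioned on the rendered pixels, while here a nearest-neighbor fill stands in for the learned model, and render_pixel is a hypothetical stand-in for an expensive, high-quality render.

```python
# Toy sketch of "compute fewer pixels, infer the rest." Illustrative only.
import numpy as np

def render_pixel(x, y):
    """Hypothetical stand-in for an expensive path-traced pixel (a gradient)."""
    return np.array([x / 63.0, y / 63.0, 0.5])

h = w = 64
rng = np.random.default_rng(0)
known = rng.random((h, w)) < 0.25       # fully render only ~25% of pixels
image = np.zeros((h, w, 3))

ky, kx = np.nonzero(known)              # the pixels we do compute...
for y, x in zip(ky, kx):
    image[y, x] = render_pixel(x, y)

# ...then infer what must be around them: copy each missing pixel from
# its nearest rendered neighbor (the crudest possible conditioning).
my, mx = np.nonzero(~known)
dist = (my[:, None] - ky[None, :]) ** 2 + (mx[:, None] - kx[None, :]) ** 2
nearest = dist.argmin(axis=1)
image[my, mx] = image[ky[nearest], kx[nearest]]
print(f"Rendered {known.sum()} of {h * w} pixels; inferred the rest.")
```

Swap the nearest-neighbor fill for a generative model conditioned on the rendered pixels and you have the fusion Huang described.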

Will AI advances come back to benefit gaming?

Nvidia’s Vera Rubin platform at CES 2026. Your gaming PC may not need this now, but it might someday. Source: GamesBeat/Dean Takahashi

Huang’s words made me wonder if the memory shortage will doom gaming. It will hurt during the current cycle, but it won’t kill off future cycles altogether. The shortage can be solved over time with an increase in memory chip production. That expansion takes tens of billions of dollars, but it will happen, driven by AI demand.

And it raises the question: Does AI still have something to give to gaming? A logical follow-up: Will gaming still benefit AI, or should Nvidia spin off its gaming division and focus only on AI?

“Nvidia is making record money in gaming, and the team there is bigger than ever,” said Ian Cutress, chief analyst at More Than Moore, in a message to GamesBeat. “While it’s a smaller part of their overall pie, I don’t see Nvidia slowing down on gaming development any time soon. Someone will always pay the prices for the gaming GPUs, memory or otherwise.”

I have to agree. There are changes AI can bring to game software, like making AI characters much more intelligent and game environments much more interactive. Those kinds of advances are made possible by generative AI, but we haven’t yet seen their impact on games, which take a long time to make.

I think that as long as the symbiotic relationship between a fundamental technology, AI, and its ideal application, gaming, still exists, Nvidia should keep its gaming division and invest heavily in it.