It’s been 25 years since Nvidia GeForce 3 — and I think gamers accidentally built the AI era

nvidia
(Image credit: Future)

Nvidia GeForce GPUs have been around since 1999, but it was GeForce 3 that changed the rules of the game forever. Today marks the 25th anniversary of that third-generation GPU, and you can trace the past quarter century of generational leaps in gaming back to this one pivotal moment.

And that change? Programmability. Instead of just feeding data to the chip, developers could run small programs directly on the GPU — kicking off the relentless march toward the incredible fidelity we see in PC games today. But in an ironic twist for gamers in 2026, it’s that very programmability that makes it impossible to find the best GPUs at MSRP.

Plus, you can see how the sheer size of GPUs has grown over time too… Yes, this is technically a GeForce 210 that I picked up at Guang Hua Digital Plaza last year, but it’s roughly the same dimensions as a GeForce 3 card back in the day (around 6.5 inches in length).

What Can $50 Buy at Taiwan's CRAZIEST Tech Mall?! - YouTube

‘Your world, programmed’

So let’s look back. In 2001, Nvidia CEO Jensen Huang pitched GeForce 3 as a way to “unleash cinematic realism,” all by moving away from “fixed-function” chips and giving creators a blank canvas to work with.

This came down to the introduction of Programmable Vertex and Pixel Shaders. Think of a 3D video game like a movie set. To get a character on screen, the computer has to figure out where things are and what they look like.

Before GeForce 3, these jobs were hardwired into the chip, and developers had no say in how they were done. If you wanted to draw water in a harbor scene, you got the water Nvidia’s engineers gave you.

Nvidia GeForce 3

(Image credit: Wikipedia)

Now, with this newfound programmability, the vertex shaders (the “where”) could be controlled by the coder — like a construction worker moving the scenery around. And on top of that, the programmable pixel shaders (the “what color”) could be controlled too.

You could start telling the lighting crew what to do, and decide whether that floor is supposed to be wet, dusty, bumpy, or whatever you want it to be!
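To make that concrete, here’s a toy sketch of what a pixel shader is: a small function that gets run once for every pixel on screen and decides that pixel’s color from its position (and, here, a time value for animation). This is plain Python purely for illustration — real shaders are GPU programs written in languages like HLSL or GLSL — and the “water ripple” math is a made-up example, not anything from an actual game.

```python
import math

def water_pixel(x, y, t):
    """Toy 'pixel shader': decide one pixel's color from its position and time.

    A real pixel shader runs on the GPU, once per pixel, every frame.
    This just illustrates the idea that the *developer* writes the rule.
    """
    # A fake wave pattern: two overlapping ripples, each in [-1, 1]
    ripple = math.sin(x * 0.3 + t) * math.cos(y * 0.3 + t)
    # Modulate the blue channel with the ripple, clamped to [0, 1]
    blue = max(0.0, min(1.0, 0.6 + 0.4 * ripple))
    return (0.0, 0.2, blue)  # (red, green, blue), each in 0..1

# "Render" a tiny 4x4 frame by calling the shader once per pixel
frame = [[water_pixel(x, y, t=0.0) for x in range(4)] for y in range(4)]
```

Swap out the body of `water_pixel` and you get dust, fog, or metal instead of water — that creative freedom, per pixel, is exactly what GeForce 3 handed to developers.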

And it was this that gave birth to those moments of cinematic realism. I stared at the water in “The Elder Scrolls III: Morrowind” for far too long, and “Doom 3” was utterly terrifying. Plus, with a GeForce 3-derived chip powering the original Xbox, there was a night-and-day difference between it and the PS2.

Paving the way

RTX 5070 vs RTX 5070 Ti

(Image credit: Future)

Fast forward to today and you can see how that programmability of GeForce 3 paved the way for CUDA and Tensor cores. This is the age of neural rendering, and Jensen said as much in my Q&A panel with him back at CES 2026.

“In the future, it is very likely that we'll do more and more computation on fewer and fewer pixels. By doing so, the pixels that we compute are insanely beautiful, and then we use AI to infer what must be around it,” Huang said.

And the aim seems to be moving beyond the architectural limits of GPU silicon, looking past cinematic realism toward extreme photorealism: “basically a photograph interacting with you at 500 frames per second,” in his own words.

You can see just how much neural work goes into something like “Resident Evil Requiem” — using a combination of DLSS 4.5 and Ray Reconstruction AI trained on millions of hours of gameplay to make a game run smoother and look prettier.

It’s the difference between a developer in 2001 saying “I will write a program to tell this pixel to look blue,” and one in 2026 saying “I’ll let an AI model look at the scene and decide what the pixel should look like, based on billions of previous examples.”
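That contrast can be sketched in a few lines of toy Python. The first function spells the color out by hand, 2001-style; the second stands in for the 2026 approach, inferring a missing pixel from the pixels around it. (A real system like DLSS uses a trained neural network for that inference step — the simple neighbor average below is just a hypothetical placeholder to show the shape of the idea.)

```python
def rule_based_pixel(x, y):
    """2001-style: the programmer explicitly tells this pixel to be blue."""
    return (0.0, 0.0, 1.0)  # (red, green, blue)

def inferred_pixel(neighbors):
    """2026-style stand-in: infer a missing pixel from its rendered neighbors.

    A real upscaler uses a neural network trained on huge amounts of imagery;
    averaging the neighbors here is only a toy placeholder for that inference.
    """
    n = len(neighbors)
    r = sum(p[0] for p in neighbors) / n
    g = sum(p[1] for p in neighbors) / n
    b = sum(p[2] for p in neighbors) / n
    return (r, g, b)

# Fill in a missing pixel surrounded by two blue and two green neighbors
guess = inferred_pixel([(0, 0, 1), (0, 0, 1), (0, 1, 0), (0, 1, 0)])
```

The point of the toy: in the first function the output is whatever the programmer decreed; in the second, the output depends on the surrounding data — which is why the modern approach can fill in pixels the GPU never actually rendered.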

And it’s that very breakthrough that has made Team Green the belle of the AI ball — though it’s come at some cost to gamers.

The cost of intelligence

AWS data center

(Image credit: Amazon)

You saw it in Nvidia’s recent earnings and the ominous warning that RTX 50-series GPUs were going to be hard to come by because of the RAM price crisis. While the company still makes billions of dollars on gaming tech, it makes hundreds of billions fueling the AI factories you see popping up all over the planet.

The silicon that once lived in our towers is being diverted to build the future of autonomy, and while I don’t fault the decision (I’d make the same one in Jensen’s leather jacket), you can’t help but feel that, now the GPU has figured out how to learn, the world wants it for everything except games.

The legacy

Nvidia

(Image credit: Future)

“Without GeForce, there would be no AI today. Without AI, there would be no DLSS today. It’s harmonious,” Huang said back in January.

While it’s a little tricky to see the harmony with GPUs far exceeding their MSRPs, I think it’s even more than that. In 2001, the programmability of GeForce 3 was initially thought to be a way to give us nicer water effects in games.

But Jensen’s long-term bet was that if you make a chip programmable enough, the world will find a use for it beyond games. And boy, did it ever — just look at the most valuable company on the planet.

We are now living in the world that GeForce 3 built. 25 years ago, it delivered on the promise of “infinite effects” so well that the technology outgrew the toybox and the world realized that the most powerful tool for the future of humanity was sitting inside our gaming rigs all along.

And with Vera Rubin being essentially a sneak peek at what the RTX 60-series architecture will be able to do, this train isn’t stopping. But we may have to wait a while for the stock alerts to catch up.




Jason England
Managing Editor — Computing

Jason brings a decade of tech and gaming journalism experience to his role as a Managing Editor of Computing at Tom's Guide. He has previously written for Laptop Mag, Tom's Hardware, Kotaku, Stuff and BBC Science Focus. In his spare time, you'll find Jason looking for good dogs to pet or thinking about eating pizza if he isn't already.
