The History of Gaming: The Evolution of GPUs

A powerful GPU is vital for running modern games - but how much do you know about its long and storied past?

A huge graphical upgrade. 

The humble graphics processing unit - or GPU, as it’s more commonly known - has evolved substantially over the last few decades. From beginning life as fixed-function hardware that served a single purpose, to powering 3D and high-resolution graphics, the GPU is now an indispensable part of any PC. 

Walk with us, then, as we take a trip down memory lane as part of our History of Gaming series, charting the evolution of GPUs from then to now. If you’d rather look to the future instead, check out our brand-new Roadmap.

1970s 

Today’s GPUs stemmed from a pastime many of us still love: arcade gaming. Except back then, graphics were handled by early display controllers - known as video shifters and video address generators - as opposed to the sleek and ultra-powerful graphics cards we know today. 

Change began when notable companies, including Motorola, RCA and LSI, started producing video chips and processors with far more advanced capabilities. RCA’s “Pixie” video chip came along in 1976 and was capable of outputting an NTSC-compatible video signal at 62x128 resolution (a mighty achievement at the time), while Motorola’s work on video address generators in the same era would provide the foundation for the Apple II. Meanwhile, LSI developed ANTIC - an interface controller that debuted in the Atari 400 and supported smooth scrolling and playfield graphics.

The Namco Galaxian arcade system, released in 1979, took things to the next level. Its custom video hardware supported advancements like RGB colour and tilemap backgrounds, and it became one of the most popular systems used during what’s now known as the “golden age of arcade video games”. Who knew that gaming was about to get a whole lot better?

1980s

The 1980s would usher in a massive variety of graphics display advancements for both video games and computers. These included the first fully integrated VLSI graphics display processor for PCs, the NEC µPD7220, and the first fully programmable graphics processor - the TMS34010 - which would become the basis of the TIGA Windows accelerator cards. 

Graphics, indeed, were on everyone’s mind - and other companies, including IBM, were experimenting with ways to make on-screen images clearer. In 1981, IBM began incorporating monochrome and colour display adapters into its hardware. Shortly after, Intel would release the iSBX 275 Video Graphics Controller Multimodule Board, which could display eight colours at a resolution of 256x256, or monochrome at 512x512. Elsewhere, three Hong Kong immigrants would found ATI Technologies in Ontario, which would go on to create revolutionary graphics boards and chips that’d lead the market for years. 

But what was happening with video game graphics, we hear you cry? Well, a couple of things. In 1987, a lovely little company called Sharp released the X68000 home computer. Boasting a 65,536-colour palette, multiple playfields and hardware support for sprites, it was later used as a development machine for Capcom’s CP System arcade board. Just two years later, Fujitsu would release their FM Towns computer, which came with a mind-boggling 16,777,216-colour palette of its own. Ooft.

1990s

An era beloved by all for its awesome TV shows, music and fashion also laid the groundwork for modern GPUs. By this point, real-time 3D graphics were in huge demand, and 3D graphics hardware was being mass-produced for the consumer market - with a particular focus on video games. Leading the innovation for most of the decade was 3dfx, whose initial 3D graphics card - the Voodoo 1 - quickly captured a whopping 85% of the market. Their follow-up, the Voodoo 2, would be one of the first video cards to support running two cards within a single computer… but their reign wouldn’t last forever. 

Towards the millennium, a little company called Nvidia would burst onto the scene with two revolutionary cards: the RIVA TNT2, which supported 32-bit colour, and the GeForce 256 DDR, which the company billed as the “world’s first graphics processing unit” (or GPU!). People quickly realised that it was leaps and bounds better than anything they’d seen before, and its quality 3D gaming performance wowed the world. Picture quality was incredible, it ran smoothly and it was affordable, too - making it a no-brainer for consumers to include in their home computers. After its release, it didn’t take long for the quality of video games to skyrocket.

2000s

The noughties may be best remembered for the now-laughable tech apocalypse known as the “Y2K bug”, but that didn’t stop Nvidia and ATI from pushing the boundaries once more. Nvidia introduced the first chip capable of programmable shading, which would lay the foundations for pixel shading - a process that could add texture to an object, making it look round or extruded, or appear shiny, dull or rough. ATI quickly followed suit, adding the technology to their second-generation Radeon cards. 
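
To give a rough idea of what that means in practice, here’s a minimal conceptual sketch in Python of the kind of per-pixel calculation a pixel shader performs - combining a surface’s colour, its orientation and the light direction to decide how bright and how shiny each pixel looks. It’s purely illustrative: the function names and values are our own, and real pixel shaders run on the GPU in dedicated shading languages such as HLSL or GLSL, not Python.

# Purely illustrative sketch of the idea behind per-pixel ("pixel") shading.
# Each pixel's final colour depends on the surface colour, its orientation
# (the normal) and where the light and viewer are.

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade_pixel(base_colour, normal, light_dir, view_dir, shininess):
    """Toy per-pixel lighting: the diffuse term makes a surface look matte and
    rounded, the specular term adds the highlight that makes it look shiny."""
    n = normalize(normal)
    l = normalize(light_dir)
    v = normalize(view_dir)

    # Diffuse (Lambert): brighter where the surface faces the light.
    diffuse = max(dot(n, l), 0.0)

    # Specular (Blinn-Phong style): a highlight where light bounces toward the viewer.
    half_vec = normalize(tuple(a + b for a, b in zip(l, v)))
    specular = max(dot(n, half_vec), 0.0) ** shininess

    return tuple(min(c * diffuse + specular, 1.0) for c in base_colour)

# Example: a red surface lit from slightly off to one side, viewed head-on.
print(shade_pixel((1.0, 0.2, 0.2), (0, 0, 1), (0, 0.5, 1), (0, 0, 1), shininess=32))

A real GPU runs a little program like this for every pixel on screen, every frame - which is exactly why dedicated, massively parallel hardware took over the job.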

Subsequent cards continued to offer incremental performance improvements over the next few years, while the aging AGP interface was eventually replaced by the much faster PCI Express (PCIe) interface, which is still in use today. The 2000s were also notable for the arrival of ATI’s CrossFire and Nvidia’s SLI support, allowing two or more graphics cards to be linked together for increased performance. 

The ATI name would eventually be phased out after a buyout by AMD, solidifying team red’s position as a worthy competitor to Nvidia. AMD would later bring its Eyefinity display tech to consumers, allowing a graphics card to spread a video game across three or more screens for a truly immersive experience. 

2010s

And that brings us to the present day - or decade, if you will. Nvidia and AMD have ushered in a slew of new technologies that have had a transformative effect on GPUs as a whole, with many innovations once reserved for enterprise hardware reaching consumer-grade cards for the first time. Consequently, graphics processing units are used in a wide range of fields today, with Nvidia helping to create AI-powered self-driving cars for the automotive industry, while AMD continues to power datacenters across the globe. 

For gamers, the advances in GPU technology have been significant, to say the least. Both companies have all but eliminated stutter and screen tearing thanks to their G-Sync and FreeSync display technologies, and new manufacturing processes have paved the way for smaller chips that draw less power - at least, that’s the aim - while also boosting performance. 

The biggest change on the horizon, however, is the introduction of real-time ray tracing and AI-assisted supersampling. Although AMD has yet to reveal its approach to ray tracing, Nvidia has invested heavily in the new lighting technology with its GeForce RTX cards, with CEO Jensen Huang dubbing ray tracing the “Holy Grail” of the graphics industry. Nvidia is also banking on its Deep Learning Super Sampling (DLSS) tech, which promises to boost framerates at higher resolutions. Not to be outdone, AMD has its own solution, Radeon Image Sharpening, which should make 4K gaming a more tangible prospect in the future. 

It’s clear, then, that Nvidia and AMD continue to trade blows for the benefit of gamers. Some innovations can seem superfluous rather than revolutionary (we’re looking at you, Nvidia HairWorks), but there’s no doubt that gaming benefits immensely when competition is alive and well.