
Tech Explainer: How do GPUs work?

by Kevin Jacoby on 01/11/2022
Blog Category: devices

The graphics processing unit (GPU) is tech’s ultimate nerd. For many years, it lurked in the shadow of its far sexier cousin, the central processing unit (CPU).

Now they’re reversing roles. Today it’s the GPU that’s powering sexy new tech like AI and machine learning. And the CPU that’s handling boring old text documents, spreadsheets and other mundane chores.

No, this isn’t the end of the CPU. But with Moore’s Law winding down and a mixed-reality revolution on the way, it’s now the GPU’s time to shine.

GPU: a brief history

The history of the GPU dates back only a little over 20 years. It was 1999 when NVIDIA introduced the industry’s first widely available GPU, the GeForce 256.

Back then it was all about graphics (hence the name). The best engineers, seeing the ascendance of the graphical user interface (GUI), surmised that computers would soon need much more power to process graphics.

NVIDIA’s first GPU: the GeForce 256 (1999)

They were right. And over time, GPUs have gotten faster and smarter, and graphics processing has become a powerful enabler. Better, faster and more powerful GPUs have let video-game designers offer hyper-realistic detail, super-fast frame rates and smoother action.

Multimedia creators fell in love with the GPU, too. New generations of 2D, 3D and video artists have used multiple GPU arrays to create modern art exhibits, animated films, and countless independent movies that now stream on Netflix and other popular services.

GPU vs. CPU: What’s the difference?

Though GPUs and CPUs are both processors, they operate differently.

The CPU, using a technique called serial processing, tackles a wide range of tasks with brute force and lightning speed. It’s called “serial” because the CPU processes one operation at a time, moving on to the next operation only after it has completed the previous one.

Also, a CPU has relatively few cores, typically no more than a dozen or so. However, each CPU core uses a large cache and broad instruction set. That’s how a CPU can manage a computer’s every input and output.
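To picture what serial processing looks like in code, here’s a minimal sketch with made-up numbers (plain C++ host code, which also compiles as CUDA; it’s an illustration, not any vendor’s actual implementation): a single CPU thread brightens an image one pixel at a time, finishing each pixel before touching the next.

#include <cstdint>
#include <vector>

// Serial processing: one CPU thread visits every pixel, one after another.
void brighten_serial(std::vector<uint8_t>& pixels, int amount) {
    for (size_t i = 0; i < pixels.size(); ++i) {
        int v = pixels[i] + amount;     // work on the current pixel...
        pixels[i] = v > 255 ? 255 : v;  // ...clamp it to the valid range...
    }                                   // ...then, and only then, move on
}

int main() {
    // A hypothetical Full HD frame: about 2 million pixels, all waiting their turn.
    std::vector<uint8_t> image(1920 * 1080, 100);
    brighten_serial(image, 40);
    return 0;
}

On a single core, those roughly 2 million pixels are handled strictly one after the other.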

CPU vs. GPU: more cores, more simultaneous operations

GPUs use way more cores — literally hundreds, even thousands of them. For an extreme example, one of NVIDIA’s current GPUs, the GeForce RTX 3070 Ti, contains 6,144 cores. 

This lets the GPU process thousands of operations simultaneously, a technique known as parallel processing. In this way, the GPU and its associated software can take a big workload — for example, rendering a 3D graphic — and divide it into smaller ones.

Then the GPU can process all those smaller pieces at once, completing the total workload in much less time than a CPU could.
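As a rough sketch of that idea (hypothetical CUDA C++, not NVIDIA’s actual rendering pipeline), here’s the same pixel-brightening job from the earlier example handed to the GPU: a kernel launch splits the frame across thousands of threads, and each thread handles exactly one pixel.

#include <cuda_runtime.h>
#include <cstdint>

// Parallel processing: each GPU thread brightens exactly one pixel.
__global__ void brighten_parallel(uint8_t* pixels, int n, int amount) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's pixel index
    if (i < n) {
        int v = pixels[i] + amount;
        pixels[i] = v > 255 ? 255 : v;
    }
}

int main() {
    const int n = 1920 * 1080;              // a hypothetical Full HD frame
    uint8_t* d_pixels;
    cudaMalloc((void**)&d_pixels, n);       // allocate the frame on the GPU
    cudaMemset(d_pixels, 100, n);           // fill it with a mid-gray value

    // Divide the big job into blocks of 256 threads; the GPU's thousands of
    // cores chew through those blocks side by side rather than one at a time.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    brighten_parallel<<<blocks, threads>>>(d_pixels, n, 40);
    cudaDeviceSynchronize();                // wait for every thread to finish

    cudaFree(d_pixels);
    return 0;
}

The <<<blocks, threads>>> launch is the hand-off point: it’s where one big workload gets carved into thousands of tiny ones that the GPU’s cores run simultaneously.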

Other advanced technologies can further enhance GPU performance. Intel’s Arc graphics processors, now shipping, offer hardware-accelerated ray tracing, AI-driven Xe Super Sampling (XeSS) upscaling, and Intel Deep Link technology.

Back to GPU future?

While today’s GPUs are cutting-edge, the parallel processing approach they use dates back to the 1970s. That’s when Seymour Cray and his colleagues built some of the earliest gigantic supercomputers, the Cray-1 and its successors.

Cray-1 supercomputer: parallel processing circa 1975

Back in the 1970s, artificial intelligence was just a glimmer in the eyes of sci-fi writers and comic book artists. But whether Seymour Cray knew it or not, his legacy — and that of his fellow engineers — is being realized as we speak.

Bleeding-edge technologies such as AI, machine learning and mixed reality are now being powered by the parallel processing of the latest GPUs.

For example, when you ask Alexa whether it’s going to snow tonight, it’s a GPU that parses your words, scrapes meteorological data from the web, and tells your smart speaker what to say.

Similarly, after your doctor orders up an MRI for your aching back, it’s a GPU that processes the images, recognizes the patterns, and assists with the diagnosis.

That’s today. Tomorrow, when your personal robot offers you a good movie, teaches your child to speak French, and then pours a chilled mug of your favorite beer, all that will be powered by a GPU. Talk about revenge of the nerds.

 