Opinion PC

A novice’s guide to gaming graphics computing


As the title suggests, this preface is slightly dumbed-down for the sake of understanding the basic gaming graphics concepts. This is not a technical white paper and some complex intricacies have been simplified. 

But let’s start at a point of departure to see how games have evolved.

Evolution of Video Games

Today’s computer games require some fairly sophisticated hardware to run effectively and have been at the forefront of cutting-edge technology for a good 50 years. As games have become more and more graphically demanding, the hardware used to power them has evolved almost as much as the games themselves. The tangible results of this evolution can be viewed in this fantastically displayed video:

So what were the major hardware components required for getting the games onto your display device? 


The hardware cycle of any application that you run on a computer goes through a bit of a process. Before the high-end, high-definition gaming that we have today, almost all of the computing work fell on the computer’s processor, or CPU. Let’s have a very basic look at the flow of information when launching a game to the display device.

  • The game or application is launched
  • The data loads from your Hard Drive into system RAM
  • The CPU processes the data
  • The images are sent to the display device
This was the very rudimentary process, with the CPU and RAM also being used to run the operating system’s instructions. With the added overhead of graphics processing, one can only imagine the demands that this put on the poor processors of old. As graphics demands increased, the poor little CPU struggled to process all these system and game instructions. The CPU, RAM and hard drive combination simply could not do the job effectively, which necessitated a dedicated graphics processor to relieve load from the primary CPU. Enter the GPU.
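The flow above can be sketched in a few lines of code. This is purely an illustration of the concept; every function name here is made up for the sake of the example, and none of them are real APIs.

```python
# A hypothetical sketch of the old CPU-only pipeline.
# All names are illustrative, not real system calls.

def load_from_hard_drive(path):
    """Stand-in for loading game data from the hard drive into system RAM."""
    return {"assets": path}

def cpu_process(data):
    """The CPU alone handles game logic AND graphics, on top of the
    operating system's own instructions; hence the heavy load."""
    return ["frame rendered from " + data["assets"]]

def send_to_display(frames):
    """Images are sent straight to the display device."""
    return frames

def run_game_cpu_only(game_file):
    data = load_from_hard_drive(game_file)   # Hard Drive -> system RAM
    frames = cpu_process(data)               # CPU does all the work
    return send_to_display(frames)           # images -> display device
```

Notice that every step funnels through `cpu_process`; that single bottleneck is exactly why a dedicated graphics chip became necessary.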


If you watched that awesome evolution video above, you may have noticed that one of the major giant leaps happened around the 1998/9 mark. This was because of the inclusion of the GPU: the Graphics Processing Unit.

More Background

A little side note, as I know there are some pedantic tech heads out there who will want to argue against the statement above. To keep them happy: the first GPUs were used in the early 1970s in arcade machines, and in home consoles like 1977’s Atari 2600 and, in a slightly different way, 1985’s Commodore Amiga. I am not going to get verbose on that for the sake of this article. These types of GPUs were not included in conventional desktop computers.

The giant leap

The first really popular GPUs in our trusty home computers, the ones we all really just wanted to play games on, were add-in cards that accelerated 3D graphics. They still relied on the CPU or integrated video to perform the basic 2D display duties, with the add-in card doing all the heavy graphics processing. The pioneers in this field were the likes of S3 Graphics, who released the S3 911 in June of 1991 and were responsible for the Virge and Savage cards that old gamers like us remember rather fondly. They were followed rather quickly by the Voodoo and Banshee ranges from 3dfx.
The decade of the 90’s had quite a bit of drama around 3D graphics, not only with the hardware we have been talking about, but also with the software being used hand in hand with these devices (OpenGL / Direct3D / Glide). Manufacturing continued to progress and evolve with the approach of Y2K, and video output, 2D acceleration and 3D functionality quickly became integrated into a single chip: the GPU that we know and adore today.

Let’s have a very elementary look at the flow of information when launching a game to the display device using a GPU.
  • The game or application is launched
  • The data loads from your Hard Drive into system RAM
  • The CPU processes the data
  • The graphics driver translates the data
  • The VRAM stores the data
  • The GPU processes the data
  • The data is sent to the Frame Buffer
  • The images are sent to the display device
As you can see, the major load of computing the graphics has been shifted from the CPU onto the GPU, which improves performance and frees up many other hardware resources.
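Extending the earlier sketch, the GPU-assisted flow adds a few extra stages between the CPU and the display. Again, every function name here is invented purely for illustration.

```python
# A hypothetical sketch of the GPU-assisted pipeline.
# All names are illustrative, not real driver or hardware APIs.

def load_from_hard_drive(path):
    return {"assets": path}                       # Hard Drive -> system RAM

def cpu_process(data):
    data["logic_done"] = True                     # CPU now handles game logic only
    return data

def driver_translate(data):
    return ["draw " + data["assets"]]             # driver turns data into GPU commands

def vram_store(commands):
    return list(commands)                         # commands/textures held in VRAM

def gpu_process(vram):
    return ["pixels for " + c for c in vram]      # GPU does the heavy graphics work

def frame_buffer(pixels):
    return pixels                                 # finished image waits in the frame buffer

def send_to_display(frame):
    return frame                                  # images -> display device

def run_game_with_gpu(game_file):
    data = load_from_hard_drive(game_file)
    data = cpu_process(data)
    commands = driver_translate(data)
    vram = vram_store(commands)
    pixels = gpu_process(vram)
    frame = frame_buffer(pixels)
    return send_to_display(frame)
```

Compare this with the CPU-only sketch: the rendering work now happens in `gpu_process`, leaving the CPU free for game and system logic.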

I did state this was a very basic overview. If you would like more detail, both on the technology timelines and on the history of GPUs, this slightly dated article makes for interesting and informative reading:


So what is an APU? Put as simply as possible, it’s a CPU and GPU combined into one piece of hardware.

Known as the AMD Accelerated Processing Unit (APU) – and formerly known as ‘Fusion’ – it is the marketing term for a series of 64-bit microprocessors from AMD designed to act as a CPU and GPU on a single chip (usually a multi-core chip). Development of the Fusion project kicked off in 2006, with the purchase of ATI being the key to realising the concept more completely.

Here is a great video that will explain it in more detail for those of a technical nature who would like to have a deeper understanding of it: 

So why should you consider an APU?

Given the options – and the traditional configuration being a CPU and GPU – why would you want to consider the APU option?

  • Budget constraints: Due to the APU containing both the CPU and the GPU it removes the need to purchase a GPU unless one is explicitly required. This makes it an ideal solution for office PCs that require moderate graphics capabilities.
  • Carbon footprint: If your concerns when buying a PC extend to power usage and the effect it has on your carbon footprint (and let’s be honest, you should be thinking about this, especially in South Africa), the APU presents a remarkable drop in power consumption over the CPU/GPU combination. The AMD A8-7670K APU uses 95W versus a comparable CPU/GPU combination coming in at 178W – 265W. This power-saving capability was showcased in 2013, when a group of university students won an energy-efficient supercomputer build-off by using APUs against others using traditional CPU/GPU combinations.
  • Added processing power: In situations where the GPU portion of an APU is under little load it can use its spare cycles to assist the CPU portion with calculations thus speeding up execution.
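To put the wattage figures above into perspective, a quick back-of-the-envelope calculation shows the relative saving (these numbers come straight from the carbon-footprint point; the comparison is only as good as those quoted figures):

```python
# Rough power-saving estimate from the wattages quoted above.
apu_watts = 95
combo_low, combo_high = 178, 265   # comparable CPU/GPU combination range

saving_vs_low = 1 - apu_watts / combo_low     # vs the cheapest combo
saving_vs_high = 1 - apu_watts / combo_high   # vs the hungriest combo

print(f"APU saves {saving_vs_low:.0%} to {saving_vs_high:.0%} of the power")
```

That works out to roughly half to two-thirds less power drawn at the wall, which is why the APU did so well in that energy-efficiency build-off.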

Why have I never heard about it before?

You’re probably wondering why you’re only hearing about an APU now. Well, chances are that you have heard about it but just not realised it. In fact, if you have an Xbox One or PlayStation 4 you are already using an APU, as both use semi-custom AMD APUs.

You may also have found that many friends will have advised you down the CPU/GPU path, but the truth is that if you are in the market for a gaming PC, but don’t require it to run on ‘ultra’ graphics settings, an APU may very well be an option worth considering – and you can always choose to upgrade to a standalone GPU at a later date.


Where do I get an APU?

If you are interested in investigating any of the APU products, either the standalone APU or a full upgrade kit, check the links below:
