
COMMENT: More power for more performance is the wrong direction for PC parts

Stock image of a gaming PC with LED case (Photo: Getty Images)

Technological advancement in the PC and gadget space has been at an all-time high over the past few years.

Just take a look at this simple list of what came out from 2019 to 2021:

  • AMD Ryzen 5000 CPUs

  • AMD Radeon 6000 GPUs

  • Intel 11th Gen CPUs

  • Intel 12th Gen CPUs

  • Nvidia RTX 3000 GPUs

  • Apple M1 SoC

  • Xbox Series S and X

  • PlayStation 5

I could go on, but these are just a few of the important ones that pertain to the topic of this piece.

These new products have pushed the tech space forward by a big margin, but one thing sticks out like a sore thumb for some of them. I would even dare say it is moving some things backward.

You might ask after seeing that list: What’s so “bad” about these parts? They are pushing tech forward.

Two words: Power Consumption

In an age when everyone is going digital and using far more electronic products, we are also using a lot more electricity than before.

There are a few ways electricity is produced, but the bulk of it still comes from burning coal, oil and natural gas. This in turn creates a lot of greenhouse gas (CO2) emissions, which means more pollution for the environment.

This article from CNA explains pretty thoroughly how electricity production affects the environment, and it will help you understand why some of these tech parts are a big issue.

Looking back

The two parts in your PC that use the most power are the central processing unit (CPU) and the graphics processing unit (GPU).

Let me start with Nvidia’s top-of-the-line GPU from the GTX 10 series. The GTX 1080 Ti, released in 2017, was Nvidia’s flagship gaming GPU prior to the release of the RTX 2000 series. It has a Thermal Design Power (TDP) of 250W.

This means it is rated to pull 250 watts of power from your power supply to function at 100 per cent of its rated clocks/performance.

You can also raise its power limit so it draws more power (for overclocking and stability, but this is by choice), and how far you can raise it depends on the model of card you have and its manufacturer.
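If you are curious where your own card’s limit sits, you can query the driver directly. Here is a minimal sketch, assuming an Nvidia card and the nvidia-smi tool that ships with the driver; treat it as an illustration rather than gospel:

```python
import subprocess

# Ask the Nvidia driver for the card's current draw and its power limits.
# nvidia-smi ships with the GeForce driver on Windows and Linux.
result = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=name,power.draw,power.limit,power.max_limit",
     "--format=csv"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
# Output: one CSV row per GPU with its name, current draw in watts,
# the configured power limit, and the highest limit the vendor allows.
```

Actually raising the limit is done through the driver or the manufacturer’s overclocking software, and the ceiling varies from card to card.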

The RTX 2080 Ti, the successor to the GTX 1080 Ti, has the same 250W TDP. It improved performance and introduced raytracing capabilities while using the same amount of power. It is only natural to expect a performance gain at the same power consumption if you want to call something an upgrade.

The RTX 3080 Ti, however, has a TDP of 350W. I believe you can see where I am going with this.

Is the RTX 3080 Ti much more powerful than its predecessor? Yes. Does the extra performance over the RTX 2080 Ti warrant a 100W TDP increase? Perhaps. Is it something to be proud of? No.

The fact that it uses more power to be “more powerful” only makes it look like it’s using brute force to overcome its predecessor’s limits.

Is it really a generational leap when the RTX 3070, which is supposed to be on a par with the RTX 2080 Ti in performance, still has a 220W TDP? While it consumes 30W less, the RTX 3070 also has 3GB less memory than the RTX 2080 Ti.

What this means for consumers

Now, let’s look at the bigger picture. If you are someone who uses the GTX 1080 Ti or RTX 2080 Ti and would like to upgrade to an RTX 3080 Ti, you have to factor in a few things.

First is your power supply.

If you’ve been using a 600W power supply for your system (250W for the GPU and roughly another 200-250W for the CPU and miscellaneous things like fans, which totals 450-500W of usage), you would have no choice but to swap your power supply unit (PSU) for something more powerful to accommodate the 100W increase.

NVIDIA computer graphic cards are shown for sale at a retail store in San Marcos, California. (Photo: Reuters/Mike Blake)

It is extremely important to leave some wattage headroom (the gap between how much your system needs and what your PSU is rated for). Without that headroom, a power spike from any of your parts can trip your PSU (especially if you run programmes that trigger these spikes), and in the worst case, cause a short.
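To make the maths concrete, here is a minimal sketch of the kind of sanity check I am describing. The part wattages and the 20 per cent headroom rule of thumb below are assumptions for illustration, not exact figures for any particular build:

```python
# Rough PSU sanity check. All part wattages below are hypothetical
# ballpark TDPs for illustration; look up your own parts' ratings.
PARTS_W = {
    "GPU (RTX 3080 Ti class)": 350,
    "CPU": 200,
    "Fans, drives, misc.": 50,
}
PSU_RATING_W = 600
HEADROOM = 1.2  # ~20% headroom is a common rule of thumb (assumption)

total_draw = sum(PARTS_W.values())
recommended = total_draw * HEADROOM

print(f"Estimated full-load draw: {total_draw}W")
print(f"Recommended PSU rating: {recommended:.0f}W or more")
if PSU_RATING_W < recommended:
    print(f"A {PSU_RATING_W}W PSU is cutting it too close for this build.")
```

With the GPU’s extra 100W factored in, the same 600W unit that was comfortable for a 250W card now fails the check.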

If you are only playing games on your system, you MAY be able to get away with the 600W power supply, because games don’t fully utilise the CPU, so it may not pull that much power.

But if you are someone who games, streams and does a lot of multitasking, you are playing with fire (maybe even literally) if you do not use a more powerful PSU.

While most PSUs are rated to last anywhere between five and 10 years, you will have no choice but to swap yours out before that time is up because of this power increase.

Now, any sensible person will sell the power supply on so that it can be used by someone else. But you may be surprised at how many people throw these parts away after replacing them with new ones, adding to the waste (I personally know a few).

If you need to dispose of your used power supply, make sure to do it responsibly. A lot of countries have dedicated e-waste disposal systems these days, so do use them whenever possible. (Here is an example of Singapore’s e-waste disposal efforts.)

Secondly, when you have an extra 100W of heat in your system, you will need more cooling to dissipate it.

This means you will need to ramp up the fans in your system or on your GPU cooler, which in turn increases your PC’s power consumption. This is pretty self-explanatory.

A single “upgrade” of your GPU can increase your power usage in several ways, which is why it is extremely important to highlight that we may be moving in the wrong direction with these parts.

It doesn’t help that there are rumours about the upcoming generation of GPUs needing far more power to operate, with one leaker asking: “Double performance, double power consumption, can you accept it?”

No. I cannot accept it. Even as a tech nerd, how am I supposed to recommend this to the average user if it increases e-waste and carbon emissions?

Even if you think this is a pretty “hippie” line of thought, there is something more immediate to consider: your power bill. I can do double performance, but not for double my power bill.
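If you want to put a rough number on that, a back-of-the-envelope calculation does the trick. The hours of use and the electricity tariff below are assumed figures; plug in your own:

```python
# Rough annual running cost of a GPU at full load.
# 4 hours/day of full load and S$0.30 per kWh are assumed figures.
HOURS_PER_DAY = 4
TARIFF = 0.30  # S$ per kWh (assumption)

def annual_cost(watts: float) -> float:
    kwh_per_year = watts / 1000 * HOURS_PER_DAY * 365
    return kwh_per_year * TARIFF

for watts in (250, 350):
    print(f"{watts}W GPU: ~S${annual_cost(watts):.2f} a year")
# -> about S$109.50 for a 250W card vs S$153.30 for a 350W card
#    under these assumptions; a doubled TDP would double that line.
```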

GPUs aside...

Things don’t look that much better on the CPU side either.

Intel recently released its 12th Gen processors, and tests from various media outlets show that a stock i9-12900K (Intel’s flagship CPU for this generation) pulls almost double the wattage of AMD’s Ryzen 9 5950X at 100 per cent usage.

While it does perform much better than its Intel predecessors, it is another example of brute force possibly being used to achieve those results.

Intel's 12th Gen Alder Lake i9 and i5 CPUs (Photo: Yahoo Gaming SEA)

As I mentioned earlier, if you are only using your PC for gaming, this shouldn’t matter much, as your CPU won’t be fully utilised enough to pull that much power.

However if you are constantly loading your PC with a lot of tasks to run, this may be something that you want to keep in mind, especially if you are looking to cut down on your power usage.

From my own personal testing with the 12th Gen CPUs, if you are running Windows 11, menial tasks are first handed to the Efficiency cores, keeping power consumption down.

It is when the Performance cores kick in at full force that power usage goes up. So, if you are constantly using your CPU to render or run heavily multithreaded tasks, you will be using a lot more power.

Intel, Nvidia and AMD

I personally use an all-AMD setup (CPU and GPU) for work and gaming. I chose this combo simply because my system uses less power overall while still performing considerably well under heavy load.

At the risk of sounding like a fanboy, the combo of the Ryzen 9 5950X and the Radeon RX 6900 XT has reduced my overall power consumption compared with what I had before my “upgrade”.

The 5950X pulls only about 130W at full load, whether I am running an intensive benchmark or rendering a video. I have never seen its power usage go beyond this except during a brief spike, and even then it barely lasts a second before settling back into the 100-130W range.

I could turn on its Precision Boost Overdrive (PBO) mode, which would give me extra performance at the cost of increased power consumption, but I have never had to, because the 16 cores at stock are more than enough for my use cases.

Prior to this, I used an Intel 8700K, then upgraded to the 10700K when I needed more cores. I remember seeing my power draw spike to about 150-200W whenever I ran intensive multitasking workloads on these CPUs, so I am glad I switched to the 5950X.

On the GPU side, I initially upgraded from a GTX 1080 Ti to an RTX 3090, since I managed to get one at MSRP during the first few weeks of its launch.

While it improved a lot of things for both my work and play, its 350W TDP and its consistently high power consumption even under light use made me rethink how much power I was wasting when I could achieve the same results with the 6900 XT.

The constant ramping up of the fans in my system also indicated that it was working extra hard to dissipate the excessive heat produced by the RTX 3090. I have also occasionally seen it spike to 400W, which made my heart jump a couple of times, albeit only for a split second.

I took the plunge and sold off my 3090, replacing it with the 6900 XT. While I admit that I miss the efficiency and speed of the 3090 for my GPU-based workloads, I am honestly okay with the way the 6900 XT performs for these tasks.

I also feel like I have not sacrificed any performance in games (I game at 1440p) while using less power, since the 6900 XT has a 300W TDP. It’s still not ideal (I would very much prefer my GPU to stay at 250W), but at least I am still using less power than the 3090 did.

Do we have a power efficiency winner?

The reasons above are why I am also pretty happy about Apple’s M1 System on a Chip (SoC), and a little jealous of Apple users.

When it comes to power efficiency in computers, Apple’s M1 probably takes the cake. An SoC combines the CPU and the GPU on a single chip, and while the GPU in the M1 is nothing to shout about (who really games on a Mac anyway?), the CPU in the M1 is the clear winner.

A customer is editing a movie with the M1 processor on a new iMac in an Apple store in Tianjin, China. (Photo: Zhang Peng/LightRocket via Getty Images)

While performing on a par with one of Intel’s top-of-the-line 11th Gen mobile processors, the M1 consumes half the wattage the Intel chip needs to achieve the same results.

It is really sad that the M1 is exclusive to the Apple ecosystem, but hopefully component makers in the PC space will follow suit in designing low-powered chips with strong performance.

At the end of the day, no one is able to (or going to) stop these corporations from creating more parts that will consume more power.

The responsibility of using less power ultimately falls on those of us who rely on our PCs and gadgets for our daily tasks. If you care about power consumption and saving the environment, support the products that are able to do more with less.

Right now, AMD has the upper hand in delivering more performance for less power in both the CPU and the GPU space. A simple Ryzen 5 5600X and Radeon RX 6600 XT could give you a decent entertainment and work system, consuming only about 250-300W at its peak.

What if I told you that a system from 2018 using an Intel i7-8700K and a GTX 1080 Ti performs about the same, while consuming a good 400-500W?

We have definitely come a long way since then. And I hope we are able to get RTX 3090 levels of performance at 200W soon.

But for now, it does feel like more power for more performance is the trend that the tech world is heading towards, and I don’t think it’s the right direction.

Dominic loves tech and games. When he is not busy being headshotted in VALORANT or watercooling anything he sees, he does some pro wrestling.

For more gaming news updates, visit https://yhoo.it/YahooGamingSEA. Also follow us on Twitter, as well as our Gaming channel on YouTube.