
Breaking Dad's Big Bad Gaming Discussion - For all things to do with PC Gaming!


Intel is throwing power into their chips at a very high rate as well, just like NVIDIA.
I find that trend worrying (of pushing performance through raw power increases). My personal take on the NVIDIA 12V connector issue isn't that the connector is wrong (it was actually confirmed to mostly be user error: a connector not being fully inserted). It's the fact that a regular Joe Average consumer product (a GPU), installed by said Average Joe, involves dealing with a connector made of a few wires that can carry up to 600W of power. I find that worrying, and dangerous.
 
I find that trend worrying (of pushing performance through raw power increases). My personal take on the NVIDIA 12V connector issue isn't that the connector is wrong (it was actually confirmed to mostly be user error: a connector not being fully inserted). It's the fact that a regular Joe Average consumer product (a GPU), installed by said Average Joe, involves dealing with a connector made of a few wires that can carry up to 600W of power. I find that worrying, and dangerous.
Definitely, I would say it’s a poor design in that it’s not easy for said average Joe to be sure it’s seated fully and correctly. If this trend persists they will also hit the ceiling of what a standard home outlet will allow you to draw. I can’t imagine every gamer buying a 1600W power supply. Not to mention the literal space heaters they will become. I can’t cool that kind of heat in the summer here.
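To make the outlet-ceiling point concrete, here is a back-of-the-envelope sketch. All the figures (breaker rating, component draws, PSU efficiency) are illustrative assumptions, not measurements:

```python
# Rough sketch (assumed figures): how close a high-end gaming PC gets to a
# standard North American 120 V / 15 A household circuit.

OUTLET_VOLTS = 120
BREAKER_AMPS = 15
# Common electrical rule of thumb: continuous loads should stay under 80%
# of the breaker rating.
continuous_watts = OUTLET_VOLTS * BREAKER_AMPS * 0.80  # 1440 W

# Hypothetical worst-case system draw (illustrative numbers, not measurements)
gpu_w, cpu_w, rest_w = 600, 250, 150
psu_efficiency = 0.90  # roughly 80 Plus Gold territory at this load
wall_draw = (gpu_w + cpu_w + rest_w) / psu_efficiency

print(f"Continuous outlet budget: {continuous_watts:.0f} W")
print(f"Estimated wall draw:      {wall_draw:.0f} W")
print(f"Headroom remaining:       {continuous_watts - wall_draw:.0f} W")
```

Even with generous assumptions, a 600W GPU plus the rest of the system already eats most of a single circuit's continuous budget.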
 
Not to mention the literal space heaters they will become. I can’t cool that kind of heat in the summer here.
I replaced my 1660 Ti with a 3070 this summer. I was definitely seeing the temperature difference in my room while running a game.

I spent some time undervolting, but not enough to get any satisfying results yet (though the undervolting resulted in higher performance at a lower voltage, with the final result being the same power usage). Next summer, I might end up defining another profile that caps the clock at a lower frequency in addition to the undervolt, to use during the hottest summer days.
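The "higher performance at the same power" outcome follows from the usual CMOS dynamic power approximation, P ≈ C·V²·f. A tiny sketch with hypothetical ratios (not measured values from any specific card):

```python
# Back-of-the-envelope CMOS dynamic power model: P ~ C * V^2 * f.
# Illustrates why undervolting at the same (or higher) clock can hold power flat.
def relative_power(v_ratio, f_ratio):
    """Power relative to stock, given voltage and frequency ratios."""
    return (v_ratio ** 2) * f_ratio

# Hypothetical undervolt: 5% less voltage, 5% higher sustained clock
p = relative_power(v_ratio=0.95, f_ratio=1.05)
print(f"Relative power: {p:.3f}")  # ≈ 0.948 — more performance at slightly less power
```

Because voltage enters squared, a small voltage cut buys a disproportionate power saving, which the boost algorithm then spends on clocks.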
 
I replaced my 1660 Ti with a 3070 this summer. I was definitely seeing the temperature difference in my room while running a game.

I spent some time undervolting, but not enough to get any satisfying results yet (though the undervolting resulted in higher performance at a lower voltage, with the final result being the same power usage). Next summer, I might end up defining another profile that caps the clock at a lower frequency in addition to the undervolt, to use during the hottest summer days.
I undervolted my launch 3070 after finding no gains with overclocking (but a lot more heat). It keeps it a couple degrees cooler, but definitely still warmer than my old 1060 and 2060 Super.

I think undervolting is becoming the new overclocking. I run my 5800x with PBO on and a negative voltage offset. Higher or equal clocks and performance with less power than stock.
 
I think undervolting is becoming the new overclocking.
Indeed. With manufacturers pushing the hardware to the limit by applying high voltages to it, the performance cap is most frequently the power limit, not the chip's speed limit. So unless you lost the silicon lottery, undervolting gives you more headroom to push the clocks even higher at the same power usage as before.

Same experience with my Ryzen 9 5900X and PBO2.
 
I replaced my 1660 Ti with a 3070 this summer. I was definitely seeing the temperature difference in my room while running a game.

Up or down on the temps?

I don't want to assume, as they're very different architectures and process nodes...

Both are decent cards, and it's nice to finally see prices and availability improve these days after the crypto meltdown (Ethereum moving from proof of work to proof of stake might also be playing into that).
 
Up or down on the temps?
Temperature in my room was going up, from around 21°C to 23°C after a few hours of play on the 3070. Mind you, that would include the heat generated by both the GPU and the CPU there.

I forgot the TDP of my 1660 Ti (probably a bit higher than the 120W official TDP since I was overclocking it), but the 3070 currently consumes around 230W during gaming. So roughly this new card would be dumping twice as much heat inside my room under gaming load. And I have no AC, so... gaming in the summer is getting uncomfortable :(

Thankfully, the game I mostly played this summer (Elden Ring) was being streamed to my living room through GameStream. Living room was staying cool while the bedroom with the PC was warming up :)
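A quick sketch of the heat math above. Essentially all the electrical power a PC draws ends up as heat in the room; the component draws below are illustrative, not measured:

```python
# Rough heat-output comparison (assumed draws): nearly all electrical power a
# gaming PC consumes is dumped into the room as heat.
HOURS = 3
old_gpu_w, new_gpu_w, cpu_w = 120, 230, 100  # illustrative, not measured

def session_heat_kwh(gpu_w, cpu_w, hours=HOURS):
    """Heat energy released into the room over a gaming session, in kWh."""
    return (gpu_w + cpu_w) * hours / 1000

old = session_heat_kwh(old_gpu_w, cpu_w)
new = session_heat_kwh(new_gpu_w, cpu_w)
print(f"Old setup: {old:.2f} kWh of heat per session")
print(f"New setup: {new:.2f} kWh of heat per session")
```

For scale, the difference per session is comparable to running a small space heater on its low setting for the same time, which matches the "room got warmer" experience.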
 
Same experience with my Ryzen 9 5900X and PBO2.

Same box as this?

 
I forgot the TDP of my 1660 Ti (probably a bit higher than the 120W official TDP since I was overclocking it), but the 3070 currently consumes around 230W during gaming. So roughly this new card would be dumping twice as much heat inside my room under gaming load. And I have no AC, so... gaming in the summer is getting uncomfortable :(

GPU power consumption is going through the roof - noticed it with NVIDIA's RTX 4000 series and even AMD on the high end - we'll all need 1000-1200W power supplies to keep them properly fed...
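Those 1000-1200W figures line up with the usual sizing rule of thumb. A sketch with hypothetical component draws (every number here is an assumption for illustration):

```python
# PSU sizing sketch (assumed component draws): a common rule of thumb is to
# size the supply so peak system draw sits around 50-60% of its rating,
# which is also where most units are at their most efficient.
def recommend_psu_watts(gpu_w, cpu_w, other_w=150, target_load=0.6):
    """Suggested PSU rating so peak draw sits at target_load of capacity."""
    peak = gpu_w + cpu_w + other_w
    return peak / target_load

# Hypothetical next-gen flagship build: 450W GPU, 250W CPU
print(f"Suggested PSU: {recommend_psu_watts(gpu_w=450, cpu_w=250):.0f} W")
```

Plug in a 600W GPU and a hot CPU and you land squarely in 1200W+ territory, hence the comment above.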
 
Same box as this?

Yes, except for the GPU upgrade this summer.
 
Word on the street is that the recent AMD RX 7000-series GPUs are very competitive with the RTX 4080...
 
Word on the street is that the recent AMD RX 7000-series GPUs are very competitive with the RTX 4080...
FPS per dollar, they are apparently better. Class against class, they might be slightly behind (and with RT workloads they are a fair bit behind). But if you end up paying 15% less for the loss of 5% FPS, I'd say it's a net win for Team Red and for customers.
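The "15% less for 5% fewer FPS" trade-off is easy to put in numbers. The cards, FPS figures, and prices below are made up purely to make the arithmetic concrete:

```python
# Value comparison sketch for the "15% cheaper, 5% slower" scenario
# (hypothetical FPS figures and prices, just to make the arithmetic concrete).
def fps_per_dollar(fps, price):
    return fps / price

baseline = fps_per_dollar(fps=100, price=1200)  # hypothetical green-team card
rival    = fps_per_dollar(fps=95,  price=1020)  # 5% fewer FPS, 15% lower price

print(f"Value advantage: {rival / baseline - 1:.1%}")  # ≈ 11.8% better FPS per dollar
```

A 15% price cut outweighs a 5% performance deficit by a comfortable margin on a pure value metric, which is the point being made above.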
 
Yeah, in RT I think they may be a gen behind Lovelace, extrapolating from AMD's own infographics, and FSR is visibly worse than DLSS, though it will probably catch up in time. But I think they are getting back on their feet, and pricing seems competitive for what they offer, assuming real-world testing holds up the presentation claims. We will find out soon enough by the 13th, when the reviews start to come in.

On the CPU side, waiting for the X3D variant of the 7950X for a possible upgrade.


Intel has stated that power usage optimization isn't a target for Intel 7 chips.

That will change in the next couple of years, if Intel delivers on the timeline it says it is on schedule for today.

Intel: 4nm, 3nm-Class Nodes on Track, 1.8nm Technology Pulled in | Tom's Hardware (tomshardware.com)

They've been struggling with, and improving, 10nm for a while, including the power aspect; Intel 7 is just the latest iteration of it, so that work never stopped. They abandoned the "nm" node naming, I think around the 10nm troubles, and started going by the "equivalent" naming scheme. Another reason for that is probably that Intel's higher-nm nodes used to be denser than some lower-nm processes from competing foundries; it's also nicer marketing-wise than repeatedly showing 10nm with some extension each time to denote an improvement.
 
On the CPU side, waiting for the X3D variant of the 7950X for a possible upgrade.
Why are you waiting for the X3D variant out of interest?

I am concerned about voltage and OC limitations, possible lack of support, potentially worse thermals, and the higher price of X3D. Is the extra cache really that beneficial?

Genuinely interested as I am having this debate in my head.
 
Why are you waiting for the X3D variant out of interest?

I am concerned about voltage and OC limitations, possible lack of support, potentially worse thermals, and the higher price of X3D. Is the extra cache really that beneficial?

Genuinely interested as I am having this debate in my head.
What sfx2000 said is true about overall value. The extra cache actually can give big gains, 20+% in some Windows gaming benchmarks I saw for the 5800X3D, even if it can't clock or OC as high as the standard variant; but for productivity and a lot of other stuff, I don't think the cache gives much of a boost. Other than that it's probably out of vanity; I can't really justify the need. I may settle for the 7700X3D instead. At least I didn't say Threadripper or Xeon Platinum, which would be an even bigger waste/overkill for my use, not to mention the idle power draw, putting affordability aside.
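A simple average-memory-access-time (AMAT) model shows why a big L3 helps games more than most productivity work. All latencies and miss rates below are illustrative assumptions, not measured Zen figures:

```python
# Why extra cache can pay off in games: average memory access time (AMAT).
# AMAT = hit_time + miss_rate * miss_penalty. Numbers are illustrative only.
def amat_ns(hit_ns, miss_rate, miss_penalty_ns):
    return hit_ns + miss_rate * miss_penalty_ns

# Hypothetical: a much larger L3 halves the miss rate on a pointer-chasing
# game workload, at the cost of slightly higher hit latency.
small_cache = amat_ns(hit_ns=10, miss_rate=0.20, miss_penalty_ns=80)  # 26 ns
big_cache   = amat_ns(hit_ns=12, miss_rate=0.10, miss_penalty_ns=80)  # 20 ns
print(f"Speedup on memory-bound code: {small_cache / big_cache:.2f}x")
```

Workloads that already fit in cache, or that stream data linearly, see little change in miss rate, which is why the X3D advantage shows up mainly in games and other latency-sensitive, pointer-heavy code.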
 
As I guessed before, the Intel equation is the more balanced one.

Hard to tell - Zen4 is fairly compelling in many ways.

That being said - the Core i5 has always been an interesting chip for gaming with the right motherboard that allows for overclocking; you can get some real speed out of the i5s...
 
[Image: Untitled.jpg]


The final scaled down build.

Not full AMD
 
