Something we hear a lot is that CPU performance doesn't matter for 4K gaming. Without proper context, CPU benchmark data can be misinterpreted. Let us explain what's what with some hard data.
In reply to: "Be nice if you included a 5800X and a 5600. I get that the 3600 was a popular CPU; I even still have two 1800Xs running in my server room, but a 5600 or even a 5700X is a cheap upgrade for someone running a 3600. Anyone running a 3600 could drop in a 5800X3D for less than the cost of upgrading to a new motherboard and a 7600. And from everything I've seen, the 5800X3D is better than the 7600 in 90% of games."

You didn't read the article...
In reply to: "You didn't read the article..."

I did read the article, and CPU can have massive implications in gameplay, which is part of the reason I said something. If high-fidelity games are all you play, then naturally. I play EvE and ESO; I can play those games maxed out at 4K. The problem is when there are lots of other players in large battles, and that's a problem at all resolutions. There are games that aren't graphically demanding but are incredibly CPU-heavy.
"Now, we can already envision comments claiming that the data here is misleading because the Ryzen 5 3600 is so much slower than the 7800X3D, and if that's how you feel, we're sorry to say it, but you've missed the point. Yes, we did use an extreme example to illustrate the point we're trying to make, but the point is, it's the frame rate that's important, not the resolution."
However, I don't agree with Steve, because the only way to fall into a CPU-bound situation at 2160p is either by using a 4090 or by using low or medium settings.
Like I was explaining just before, if you play at 2160p, it's because you crave image FIDELITY. That means you will likely use max settings AT ALL TIMES, which also means your CPU will have almost no impact on your framerate at 2160p unless you have a 4090.
First off, people playing at 2160p, like me, don't play at low or medium settings. They play at max settings. So your data is not representative of a REAL use case.
In reply to: "You didn't read the article..."

Neither did you, by the looks of it; they benchmarked everything using different quality settings, including High, Very High, Ultra, etc...
In reply to: "This review may be addressing some comments lately regarding 4K benchmarks, but I still feel it misses several points and artificially skews the results in the OP's favor by introducing a serious bottleneck in a situation that will likely never present itself. Sure, you slapped the R5 3600 community hard in the face with RTX 4090 results; however, I don't see them ever running 4090s to begin with. They'd more than likely have upgraded away from the 3600 long before dropping the absurd amount of money a 4090 costs. Given the data only contains two CPUs, it's really hard to draw any real conclusions for my use scenario. I've always purchased top-tier CPUs and kept them for as long as possible, pairing them with multiple generations of upper-mid-range GPUs, because at the resolution (4K) and settings I use, the CPU takes a really long time to become a bottleneck."

The point was to address the effect CPUs have at 4K, not how likely the build is. Why is this hard for people to understand? The 4K comments are all the same: "well, that's not a REALISTIC BUILD" or "why don't you test CPUs at 4K" BS.
In reply to: "First off, people playing at 2160p, like me, don't play at low or medium settings. They play at max settings. So your data is not representative of a REAL use case."

It's not quite true that if you're playing at 4K you'd be using max settings.
Second off, if you look at your data, the only way to be CPU-bound while using max settings is with a 4090.
Lastly, you've just proven that if you don't use a 4090, you will likely see barely any improvement at 2160p from a better CPU. So I don't understand the point of this article at all, since EVERY time you ran CPU benchmarks, you LITERALLY disregarded any feedback requesting 2160p results on the grounds that you would be GPU-bound. And now you're doing a 180?
I can't follow your logic anymore.
The fact of the matter is that if you play games at 2160p with a high-end GPU:
- You will most likely play with really close to maximum graphical settings
- The title you are going to play will be GPU bound
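Both sides of this argument are describing the same underlying mechanic: every frame has to pass through both a CPU stage (simulation, draw calls) and a GPU stage (rendering), and whichever stage is slower caps the frame rate. A minimal sketch of that model, with made-up illustrative numbers (none of these figures are from the article):

```python
# Toy bottleneck model: frame rate is capped by the slower of the two
# stages. All FPS caps below are hypothetical, for illustration only.
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The delivered frame rate is the minimum of the two stage caps."""
    return min(cpu_fps, gpu_fps)

cpu_caps = {"slow CPU": 90.0, "fast CPU": 200.0}       # frames the CPU can prepare per second
gpu_caps_4k = {"mid-range GPU": 60.0, "RTX 4090": 140.0}  # frames the GPU can render at 4K max settings

for cpu, cpu_cap in cpu_caps.items():
    for gpu, gpu_cap in gpu_caps_4k.items():
        print(f"{cpu} + {gpu}: {effective_fps(cpu_cap, gpu_cap):.0f} FPS")
```

With the mid-range GPU, both CPUs land on the same GPU-bound number, while the 4090 is fast enough to expose the gap between the CPUs; the thread's disagreement is really about how often that second case occurs at 4K max settings.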
In reply to: "The point was to address the effect CPUs have at 4K, not how likely the build is. [...]"

Why is it hard to understand that people WANT realistic data? Not a condescending attempt at proving a point that means next to nothing to most people?
In reply to: "Be nice if you included a 5800X and a 5600. [...]"

Exactly; everyone that I know has a 5600 and a 5800X. I wish they had included those, so it could clearly show the difference to these friends/people. Anyway, I've got a 7800X3D now and my whole PC is much smoother and faster. I came from a 4-core CPU that could barely hit 3.4 GHz. Quite the upgrade, I'd say, and totally worth it too; shame I had to pay 650 euros for it. Now it's down to 350.
In reply to: "I did read the article, and CPU can have massive implications in gameplay. [...]"

I literally used Steve's own statement. You are missing the point: this is not a review but just a demonstration that you can be CPU-bound at 2160p.
My biggest issue with these benchmarks is the lack of relevant CPU-heavy games at 4K and the absurdity of running a 4090 with a 3600.
In reply to: "It's not quite true that if you're playing at 4K you'd be using max settings."

You don't play games at 2160p at low settings with an enthusiast GPU...
One of the biggest benefits of 4K gaming is clarity: even when a game is at its lowest settings, just being viewed at 4K is a massive visual bump, and I know a lot of people who play at 4K but at low or medium settings because the clarity is an edge.
Compare a game like FF14 played at 1080p versus 4K. There's a lot of text on the screen, and the game isn't particularly heavy to run (I'm ignoring the upcoming graphics update); its lowest settings may not look the best on the characters and the world, but at 4K, as far as text, hotbars, the minimap, etc. go, it's a massive jump.
In reply to: "Neither did you, by the looks of it; they benchmarked everything using different quality settings, including High, Very High, Ultra, etc..."

Go back and look at the numbers, then look back at my comment.
Starfield and Hogwarts Legacy both clearly benefited from a better CPU at their highest quality settings.
In reply to: "I literally used Steve's own statement. [...]"

No, it seems that you are missing mine. There are gaming applications outside what was benchmarked that show CPUs are very important for things beyond graphical fidelity, and to ignore that is careless and unprofessional. Go play a city builder, a strategy game, or basically anything physics-heavy.
My point is that you don't play at 2160p just to use low or medium settings; you are better off dropping your resolution a tier instead. That means all the low and medium numbers at 2160p are biased toward a proof of concept rather than a REAL use case.
And when it comes to High/Ultra settings, unless you have a 4090, any other GPU is going to give you similar FPS regardless of the CPU.