
3700X or 2700X for 1440p gaming?

AdmiralSoup

1 month ago

Given that I'm going to stick strictly with 8c/16t for this particular thought exercise, which CPU would be the "smarter" choice? I can get the 3700X for $320 and the 2700X for $160, so half the price! I will be gaming 100% of the time, with AAA titles as the focus. The monitor will be a 1440p 144Hz VA FreeSync display. Since at this resolution I will definitely be more GPU bound, should I get the 2700X instead of the 3700X and save some money? Plus, I could spend a bit less on the mobo with 2nd gen. Or maybe the difference isn't worth saving, since I'll probably have to spend almost $2k for the entire system to be able to play 1440p AAA at high fps.

Comments

  • 1 month ago
  • 3 points

I would go with the 3700X, since the CPU is more important for a high FPS goal. Graphical settings are adjustable and can make up for a lesser GPU (within reason, of course, so no GTX 750 Ti or anything like that). However, there really isn't much you can do to make up for the CPU.

Since on this resolution I will be definitely more GPU bound

Sure, if we just stick to maxing out the game, but graphical settings are adjustable, and it would be foolish to leave them at max if you aren't getting the FPS you want. A GTX 1660 Ti, an RTX 2080 Ti, or anything in between would work for 1440p 144FPS; what differs between them is the graphical fidelity.

Also, FYI, you mean 144Hz; 144MHz is 144,000,000Hz.

  • 1 month ago
  • 1 point

Thanks for the response. Huh, I always thought that the higher the resolution, the more GPU dependent you are, especially at resolutions above 1080p. For example, when I look at 4K benchmarks there is little to no difference between an i5, i7, R5, R7, etc.

My mistake with the Hz. Just corrected that.

  • 1 month ago
  • 3 points

That's because benchmarks just keep the settings on max, which turns any "CPU" benchmark into a GPU one at 4K max, since most of the time you will be GPU bound. In reality you can get the same FPS at 4K as you do at 1440p or even 1080p if you adjust the settings. How much adjusting you need to do will vary from game to game and with the hardware in question. Sticking to max keeps the benchmark simple, but it's not a realistic example if you have a certain FPS goal, since you would be adjusting your settings to meet it.

  • 1 month ago
  • 1 point

I see. I think I get the angle you're approaching this from. I need to think on that.

  • 1 month ago
  • 3 points

Listen to what vegabond is saying here.

I'll phrase this another way to help with your pondering:

The amount of CPU "power" required to achieve a particular FPS goal does not change with resolution.

Increasing resolution does increase the amount of work for the GPU per frame rendered, but it does NOT decrease the amount of work the CPU has to do to deliver the wireframe data and draw calls to the GPU for that frame.
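One way to picture this (not from the thread, just a toy sketch with made-up numbers): treat the frame time as whichever of the CPU and GPU is slower, where only the GPU's cost scales with pixel count.

```python
# Toy bottleneck model: frame time is set by the slower of CPU and GPU.
# All numbers are made up purely for illustration, not real benchmarks.

def fps(cpu_ms, gpu_ms_per_mpix, mpix):
    """Achievable FPS when the CPU needs cpu_ms per frame and the GPU
    needs gpu_ms_per_mpix per megapixel; the slower component wins."""
    frame_ms = max(cpu_ms, gpu_ms_per_mpix * mpix)
    return 1000.0 / frame_ms

CPU_MS = 5.0           # hypothetical: this CPU caps you at 200 FPS everywhere
GPU_MS_PER_MPIX = 2.0  # hypothetical GPU cost per megapixel at max settings

for name, mpix in [("1080p", 2.07), ("1440p", 3.69), ("4K", 8.29)]:
    print(f"{name}: {fps(CPU_MS, GPU_MS_PER_MPIX, mpix):.1f} FPS")
```

With these placeholder numbers the rig is CPU bound at 1080p (200 FPS flat) and GPU bound at 1440p and 4K. Raising the resolution only grows the GPU term; the 5 ms CPU ceiling never moves, which is exactly the point.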

The "range" of acceptable FPS for "PC" gaming is very narrow compared to the range of acceptable visual quality. FPS is a requirement, visual quality isn't.

Go back 20 years. PC gaming was commonly played on CRT monitors at 75-150+ FPS (gamers often sought out high-end CRT monitors that could be driven to very high refresh rates at lower resolutions for first-person shooters and such). GPUs have increased in power by approximately 10,000x since 1999. Today's "PC gaming" FPS is ~60-180+. The frame rate hasn't really changed.

Think about that the next time you're looking at gaming benchmarks that use "FPS" as a yardstick to compare GPUs. They are just using "FPS" under fixed conditions to compare render throughput. In the real world, you're not limited to running any specific setting. You can adjust settings to balance performance vs visual quality within the hard limits set by the CPU and monitor refresh rate.

More CPU power means more headroom to "adjust" into. If that's important to you, pay for it; if not, don't ;)

  • 1 month ago
  • 2 points

TBH I think the GPU is going to be the most important choice here. If scrimping a little on the CPU means a letter-grade upgrade on the GPU, I would go for it. I have not heard or seen anything out there to suggest the 2700X is not at least capable of 144fps for the vast majority of titles; I think your monitor or GPU will hold back that CPU. So in all honesty, unless you have other goals/software uses, I would say the 2700X will hold you over.

I really hate to say this, but... the 3700X is slightly redundant at its price, with the 2700X half that and still freely available. Many i7-4790K owners felt the same when the Skylake i7-6700K and Kaby Lake i7-7700K released. The $350 7700K was a hard sell to that crowd in that it did not offer enough to make it worth the price or hassle. Conversely, when buying new, it is nice to have the latest and greatest. On the other hand, nothing wrong with being frugal when you have a choice between the 2700X and 3700X. They are both great CPUs; the 3700X is 10-20% faster across the board but double the price. How much is 10-20% worth? For gaming at a demanding resolution, not that much.

  • 1 month ago
  • 2 points

Thanks for the response and explanations. Given the examples you provided and the fact that 1440p is more GPU focused, I lean more strongly towards the 2700X. Instead of a $500 processor + mobo combo (3700X + X570), I can get a $300 combo (2700X + X470) and put the remaining $200 into a better GPU (going from an RTX 2070 Super to a 2080 Super). That should yield a much better performance improvement than the newer CPU, right?

  • 1 month ago
  • 2 points

Short answer: => Yes, a Ryzen 7 2700X with an RTX 2080 Super should do it for newer titles over a Ryzen 7 3700X with an RTX 2070 Super - at least at 1440p, at least if you want your graphics to look nice and you bump up settings. If you are going to mainly play eSports and do not care how the game looks, I would say faster CPU: forget the 3700X, go for an i9-9900K, overclock, and don't look back. Older games, i.e. DX11, ditto; performance favors the faster CPU since the graphics are not challenging for an RTX 2070 Super. Then again, the Ryzen 7 2700X does not struggle to keep pace with a 144Hz panel - at least for a great number of titles. The 3700X being faster is moot; what you do not see cannot be considered part of the observable. If you are going to play a range of titles, i.e. AAAs, eSports, you name it, I recommend the minimax approach - that is, go for the Ryzen 7 2700X with the RTX 2080 Super and maximize your worst-case performance.

Long answer: => More complicated, but still go for the Ryzen 7 2700X with the RTX 2080 Super.

Long-term, a better, more expensive CPU could end up the more frugal purchase, since the GPU is inevitably going to be upgraded at least once before retiring a chipset. However, despite people getting all excited about the 3700X, the reality is that it is not a major bump up from the 2700X. Then again, it's important to consider that everyone is different; one man's piece of trivia is another's ex cathedra.

In the GPU <-> CPU tradeoff there is no real right or wrong answer. When compromising it will always be a case of minimizing entropy.

There are some games that will flat out refuse to render at 144fps no matter what GPU you throw at the problem or what settings you run at. Even if you moved over to a 5GHz all-core 18-core CPU on liquid nitrogen, your frame rate may not reach that target for a particular title. Result: no CPU on the market will make that game run at 144fps. Solution? Treat the game as an anomaly and factor it out.

In general, CPU fps limits are not dependent on settings or graphical resolution. On the other hand, it is incorrect for people to think or advise that all a GPU does is write a few discrete voltages to the lovely little pixels on the screen at the resolution you set. The GPU does a heck of a lot more than just drive the pixels on a monitor. 1080p or 1440p does not mean easy picture, easy performance, easy meat.

In general, 1440p will be GPU limited for a number of CPUs. This is not a rule 100% across the board. Certainly some games just cannot run above 100fps no matter what CPU you throw at the problem, just as some games will slack off with 2080 Tis in SLI and refuse to budge past 90fps. One has to treat it as a random variable.

The issue is: how will compromising on the CPU affect throughput relative to upgrading the GPU? Do we see a net gain, a net deficit, or parity?

There is no absolute answer. Some games will run better with the 3700X and RTX 2070 Super, and others will run better with the 2700X and RTX 2080 Super. On average, the Ryzen 7 2700X with the RTX 2080 Super will outperform a rig with the Ryzen 7 3700X and RTX 2070 Super. Not every single game, certainly, but across a number of titles, yes - for enough of a majority to give the GPU precedence over the CPU.

Of course, ideally you have the cash to settle the debate in favor of the 3700X and RTX 2080 Super - the best of both worlds among the choices provided.

  • 1 month ago
  • 2 points

Thanks for your answer. I think this exhausts the topic. After all these responses, I know I should be more focused and decisive about what I actually want from my system. When making this topic initially, all I said I wanted was AAA games at 1440p, neglecting the type of game, desired frame rate, and desired visual fidelity. That leaves a lot of headroom for choosing the right parts. Thinking on it, now I know that what I specifically want is to play current, and up to two years into the future, AAA games at 1440p, at max down to high settings presets, with the only fps requirement being not dropping below 60 fps in the 1% lows. Those are pretty hefty requirements. I guess to meet them I would have to go with the 3700X and 2080 S.

  • 1 month ago
  • 2 points

One thing to factor in: we will be seeing a fairly big overhaul in the next couple of years. Intel will be releasing new chipsets, as will AMD, and both will be DDR5 compliant. Many people like to upgrade their PCs at these junctures. You may or may not be one of them. Nothing at all wrong with counting pennies on this rig and saving up for the next. No need to go for broke and outspend yourself. Worth a thought. Your monitor will last you years and years; PC hardware comes and goes. This is a tricky time to build a PC with future-proofing in mind - a state of flux, with a lot going down in the next year or two before things settle to a new plateau.

only fps requirement being not dropping below 60 fps in 1% low. And those are pretty hefty requirements. I guess to meet them I would have to go 3700X and 2080 S.

My i9-9900K and Titan RTX cannot meet that either for a lot of the newer games coming out. Even at 1080p I would not bet on it. Do the best you can. There are thousands of games; that a few do not play nice with hardware is too bad. We are under no obligation to buy those games. That said, as long as Red Dead Redemption 2 plays at a consistent 60fps plus, we will be good to go. All indications point to this being achievable with an RTX 2070 Super ;)

  • 1 month ago
  • 1 point

Thanks for pointing that out. I'm aware of the incoming changes in the next few years, such as DDR5 RAM and GPUs potentially fully utilizing PCIe 4.0. Also, the current socket support for both Intel and AMD will end pretty soon, closing off potential upgrade paths down the line. Those thoughts are creeping around in the back of my mind all the time. Maybe I really should just stick with my 75Hz 1080p monitor and build something around that. Maybe I should wait until I'm ready to make this generational leap and then just go all in. And, who knows, maybe by that time 1440p will just be the baseline standard, as 1080p is today, and high-refresh 4K will be my new go-to step? Thanks again for your insight.

  • 1 month ago
  • 2 points

Agree 100%. Think of it this way: there are three targets you can set: an fps goal, a graphics detail goal, and a resolution goal. The hardware you need to reach those goals will vary by game, but you can roughly generalize, and AAA titles need more hardware. Once you pick those goals and the hardware needed, are you over budget? If yes, then something has to give, and typically it's either fps or graphics detail, less often resolution. The choice will depend on the kind of gaming you do. Adjust the CPU for the fps target and the GPU for the graphics detail (or resolution) target.
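That "pick targets, price the hardware, check the budget" step could be sketched roughly like this (only the two CPU prices come from this thread; the GPU, rest-of-system, and budget figures are hypothetical placeholders):

```python
# Sketch of the budgeting check described above.
# GPU_COST, REST_COST, and BUDGET are made-up placeholders.

def over_budget(cpu_cost, gpu_cost, rest_cost, budget):
    """True if the CPU + GPU + everything-else total exceeds the budget."""
    return cpu_cost + gpu_cost + rest_cost > budget

BUDGET = 1500     # hypothetical total budget
GPU_COST = 500    # hypothetical GPU price
REST_COST = 700   # hypothetical mobo/RAM/PSU/storage/case total

for name, cpu_cost in [("2700X", 160), ("3700X", 320)]:
    verdict = ("something has to give"
               if over_budget(cpu_cost, GPU_COST, REST_COST, BUDGET)
               else "fits")
    print(f"{name}: {verdict}")
```

With these placeholders the 2700X build fits and the 3700X build does not, at which point the fps/detail/resolution trade-off above kicks in.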

I'm not saying anything different from vagabond or allan_m_systems, just restating.

And keep in the back of your mind that there's no point in spending on a computer that's faster than your monitor... that's more of an issue at 4K, but something to keep in mind nonetheless.

