gtotheunit91 said:
Really? Interesting. I currently have a Ryzen 7 2700X, but I'm planning on doing a new build for the beginning of 2023. I'm hoping to pair a 3080 with the next-gen Ryzen 7000 series. By that time I'm hoping GPU scarcity is all but alleviated and CPUs should be completely available. God of War did receive a lot of patches post-launch, so I'm wondering if Nixxes will continue to optimize the game to where older CPUs aren't quite as much of a bottleneck as they are at launch.
Depending on your motherboard's CPU support, you could get a generational leap just by replacing the CPU alone.
The 2700X was a fine chip, but you can get a good 40-100% performance increase by dropping in even a lowly 5600X, depending on the application/game of course... The 2700X only closes the gap (but never wins) when all 16 threads are loaded.
I personally believe the 5900X is the best price/performance at the moment though, and it will last years; prices on those have tumbled hard.
gtotheunit91 said: WOW! I did not realize the disparity in ray-traced reflections on console compared to PC. What kind of RTX cards were being used for testing? I'm sure my 2080 shouldn't have too much of a problem. Although I probably won't crank it up to Ultra High lol.
Sadly, AMD's ray tracing capabilities are pretty terrible... and since the consoles run AMD hardware, that's also the benchmark for ray tracing on console.
AMD just doesn't invest in the "side things" as heavily as nVidia... For example, AMD had tessellation in its GPUs back in the PlayStation 2 era... something that didn't become common until the Xbox One/PlayStation 4 era on console.
But when nVidia introduced the technology, despite being a decade late to the party, its implementation absolutely dominated AMD's, and it took AMD years to catch up.
AMD just spends more transistors on compute.
It also doesn't help that the consoles are using old Zen 2 CPU cores paired with only 16GB of RAM, which holds things back too.
--::{PC Gaming Master Race}::--