JEMC said:

Alright, since I was dumb enough to mention it, I went ahead and did it.

Raster (11 games):

  • 9070 vs 6900GRE: 1440p = 16.8% ahead | 4K = 18.5% ahead
  • 9070XT vs 6900GRE: 1440p = 32.7% ahead | 4K = 37.4% ahead

RT (9 games):

  • 9070 vs 6900GRE: 1440p = 25.7% ahead | 4K = 26.2% ahead
  • 9070XT vs 6900GRE: 1440p = 49.7% ahead | 4K = 53% ahead

Something worth noting: the 6900GRE has 5120 shaders while the 9070XT has 4096. It achieves those leaked numbers with 20% fewer shaders.
I couldn't find how many shaders the 9070 is supposed to have.
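That 20% figure checks out. A quick sanity check (shader counts are the ones reported in the leak, so treat them as assumptions):

```python
# Shader-count gap between the two cards, using the counts quoted above.
gre_shaders = 5120  # as reported for the GRE
xt_shaders = 4096   # as reported for the 9070XT

reduction = (gre_shaders - xt_shaders) / gre_shaders
print(f"{reduction:.0%} fewer shaders")  # → 20% fewer shaders
```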

The number of shaders is only part of the story. It's the clock speeds that are pushing the 9070 higher.
AMD is going for a smaller, faster, more balanced core so they can invest more transistors into things like RT and AI, which is what they needed to do.

The 9070XT has 4096 pipelines @ 2400MHz base, 2970MHz boost.
The 7900GRE (I assume you meant the 7900GRE, as the 6900GRE doesn't exist AFAIK) has 5120 pipelines @ 1270MHz base, 2245MHz boost.

Almost double the base clock and over a 30% increase in boost clock... And having all the ROPs, Texture Mapping Units, Geometry Units and more run at a higher rate also helps.
Add in the new RT functionality and the 9070XT should be a much more efficient platform.
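Plugging the spec-sheet numbers from above into a simple shaders × clock product shows why balance matters more than raw math here. This is only a rough sketch: boost clocks are spec-sheet values, sustained game clocks will be lower and workload-dependent.

```python
# Rough raw-throughput comparison from the specs quoted above.
# shaders * boost clock is proportional to peak FP32 rate, ignoring
# architectural differences (dual-issue, caches, front end, etc.).
cards = {
    "7900 GRE": {"shaders": 5120, "boost_mhz": 2245},
    "9070 XT":  {"shaders": 4096, "boost_mhz": 2970},
}

def raw_units(card):
    return card["shaders"] * card["boost_mhz"]

clock_ratio = cards["9070 XT"]["boost_mhz"] / cards["7900 GRE"]["boost_mhz"]
tput_ratio = raw_units(cards["9070 XT"]) / raw_units(cards["7900 GRE"])

print(f"Boost clock ratio:    {clock_ratio:.2f}x")  # → 1.32x
print(f"Raw throughput ratio: {tput_ratio:.3f}x")   # → 1.058x
```

So on paper the 9070XT has only about 6% more raw shader throughput than the GRE; the leaked 30%+ raster gains would have to come from the better-balanced core and the new RT hardware rather than brute shader math, which is exactly the point above.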

It's like when we went from VLIW5 to VLIW4: fewer shaders but higher clocks resulted in a more balanced chip.



--::{PC Gaming Master Race}::--