JEMC said:
caffeinade said: ~50% is what I predicted on Discord, so this is looking pretty good to me. |
You're just lucky.
|
vivster said:
caffeinade said: ~50% is what I predicted on Discord, so this is looking pretty good to me. |
Having seen the presentation I'm pretty sure these graphs are with RT enabled. Which means abysmal framerates for the 1080 and 50% better than abysmal for the new card. The new cards have about 20% more CUDA cores and about the same clock. Where do you expect the 50% to come from? I'm expecting 20% on average.
|
Um, no.
It's highly likely that these graphs aren't with RT enabled.
Pascal cannot keep pace with Turing in that aspect.
Volta can't keep up with Turing in that kind of workload.
If it takes four V100s to lag behind a single Turing card, then a single Pascal card won't be close.
https://s22.q4cdn.com/364334381/files/doc_presentations/2018/08/JHH_Gamescom_FINAL_PRESENTED.PDF
Go to page fourteen.
Pascal wishes it could keep up.
When Turing is getting like 30FPS in Tomb Raider, Pascal is getting something in the single digits.
Not even the most hardcore console players would consider that playable.
The new cards have a decent increase in bandwidth too, so let's not forget about that.
That's going to make a pretty big difference when trying to render at high resolutions and the like.
Turing seems to be quite the departure from Pascal.
Nvidia has apparently reworked their SMs for Turing, so core for core, Turing should probably come out ahead.
They seemed quite proud to talk about how their GPUs can now execute 32-bit integer and floating-point operations at the same time, so hopefully that results in a pretty decent performance boost compared to Pascal cards with the same FLOPS.
Whilst I'm not sure if we'll get support for it in the consumer cards, Turing does support something like AMD's Rapid Packed Math (where you do two FP16 operations in the time it takes to do one FP32); Intel GPUs support this, as do the Switch, the PS4 Pro, and Vega.
Assuming Turing does support it on the consumer side, these GPUs should be able to take advantage of the optimisations already done for the other devices.
Games such as Wolfenstein II and Far Cry 5 already support it, so maybe those games will see an extra few percent of performance.
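For anyone wondering what "packed" actually means here: two FP16 values occupy the same 32 bits as one FP32 value, so a single 32-bit-wide instruction can operate on both halves at once, doubling throughput. Here's a rough NumPy sketch of the idea (NumPy just emulates it in software; the packing shown is illustrative, not Nvidia's or AMD's actual hardware instruction):

```python
import numpy as np

# Two FP16 values fit in the storage of one FP32 value.
a = np.array([1.5, -2.25], dtype=np.float16)
b = np.array([0.5, 4.0], dtype=np.float16)
assert a.nbytes == np.dtype(np.float32).itemsize  # 4 bytes either way

# Reinterpret the pair of halves as a single 32-bit word -- this is the
# "packed" register layout a GPU lane would see.
packed_a = a.view(np.uint32)[0]
packed_b = b.view(np.uint32)[0]
print(hex(packed_a), hex(packed_b))

# "Rapid packed math" performs one add on the whole 32-bit lane, producing
# both FP16 results at once. NumPy loops instead, but the results match
# what a packed FP16 add would produce.
c = a + b
print(c)  # [2.0, 1.75]
```

That's also why the optimisation work done for the PS4 Pro and Vega carries over: a shader already written to use FP16 where precision allows gets the doubled rate for free on any hardware with the packed path.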
There are other changes too, but Turing seems to be more than just Pascal with some RT and Tensor cores strapped on.