vivster said:
I think people are overlooking the fact that they didn't actually compromise much space in the chip and instead just made it bigger to compensate. The 2080Ti still has a considerable upgrade in CUDA cores.
|
The point is, they could have had more CUDA cores for the same die space.
The 2080Ti is 18.6 billion transistors.
The 1080Ti is 12 billion transistors.
That is a 55% increase in transistor count.
The 2080Ti has 4352 CUDA cores.
The 1080Ti has 3584 CUDA cores.
That is an increase of only 21%.
Scale the 1080Ti's shader count by that same 55% and we should have been looking at closer to 5,500 CUDA cores.
See the problem now?
Combine that with a slight reduction in clocks between the 1080Ti and 2080Ti, and I don't expect the performance gains in typical rasterized scenarios to be generationally groundbreaking.
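Napkin math to make that concrete, assuming CUDA core count scales linearly with transistor budget (a simplification; RT cores, Tensor cores, cache, and so on all take their own share of the die):

```python
# Rough sketch of the scaling argument above. Linear scaling of
# shader count with transistor count is an assumption, not how
# dies are actually partitioned.
pascal_transistors = 12.0e9   # GP102 (1080Ti)
turing_transistors = 18.6e9   # TU102 (2080Ti)
pascal_cuda = 3584            # 1080Ti CUDA cores
turing_cuda = 4352            # 2080Ti CUDA cores

scale = turing_transistors / pascal_transistors
print(f"Transistor increase: {scale - 1:.0%}")                       # ~55%
print(f"CUDA core increase:  {turing_cuda / pascal_cuda - 1:.0%}")   # ~21%
print(f"All-shader hypothetical: ~{pascal_cuda * scale:.0f} cores")  # ~5555
```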
vivster said:
Even if AMD puts out a conservative GPU they won't eclipse the rasterization performance by much, if at all.
|
Well, if AMD were to take Vega to 18.6 billion transistors, we would be looking at a part with ~6000 GCN cores. Nothing to sneeze at.
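Same napkin math, taking Vega 64's 12.5 billion transistors and 4096 stream processors as the baseline, and again assuming linear scaling:

```python
# Hypothetical Vega scaled up to the 2080Ti's transistor budget.
# Baseline figures are Vega 10 (Vega 64); linear scaling is an assumption.
vega_transistors = 12.5e9     # Vega 10
vega_cores = 4096             # Vega 64 stream processors
turing_transistors = 18.6e9   # TU102 (2080Ti)

print(f"~{vega_cores * turing_transistors / vega_transistors:.0f} GCN cores")  # ~6095
```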
That said, AMD's performance is hindered in a ton of different areas, and we are only talking hypotheticals here. AMD isn't set to catch up to Nvidia with Navi; maybe next gen.
vivster said:
Also, the RT and Tensor cores aren't like PhysX, as in a very niche and proprietary feature. They work together with DX and they are very broadly applicable to games without much special input from the devs. RT is without a doubt the future of gaming and Tensor cores can help with a variety of tasks. For now it's just gimmicks because the hardware behind it is still too weak to actually accomplish properly what it promises but you have to start somewhere. The sooner devs get familiar with these new opportunities, the better.
Now we just have to hope that the RT and AI functionality that will eventually pop up in AMD GPUs are similar enough in framework to that in Nvidia chips.
|
We don't know if the GPU is going to be any good in ray-traced scenarios anyway. This is essentially the baseline for next-gen graphics.
Chicken and Egg and all that.
I would prefer hardware that works well in games today, not years later.
So until we actually have games and enough hardware on the market to prove these features out, they are all gimmicks at this stage.