| vivster said: Not sure if I want to call Turing weak. It's weaker than it could be, sure, but it still handily beats everything else while at the same time introducing new and future-oriented technology. In fact, I love that they're trying to start the RT revolution early. We might not be at a point where it can see widespread use or have the hardware to properly support it, but that's how new technology works. Compare it with the introduction of 4K and now 8K. It doesn't hurt to start things early when you know it will inevitably be the standard in the future. |
Right now, the RT cores don't do shit in the vast majority of benchmarks. Turing is literally a blunder, and the 2080 Ti or Titan RTX being silver linings won't change the fact that Nvidia has an over-engineered architecture ...
We don't even know if RT will become the standard, since not even D3D12 is standard among AAA game developers, and they NEED D3D12 if they want to do RT. So as far as it "inevitably" becoming the standard goes, with enough lobbying from outside parties (AMD, Intel, Microsoft and Sony) they could effectively kill adoption of RT altogether in the near future ...
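To spell out that dependency: DXR is only surfaced through a D3D12 device, so an engine that hasn't moved off D3D11 can't even query for it. A minimal C++ sketch of the capability check (assuming a Windows 10 1809+ SDK; the printed messages are just illustrative):

```cpp
// Build note: link against d3d12.lib (Windows 10 SDK 17763 or newer).
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    // DXR rides on a D3D12 device; there is no D3D11 path to it.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::puts("No D3D12 device -- and therefore no chance of DXR.");
        return 1;
    }

    // Raytracing support is reported via the OPTIONS5 feature struct.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    const bool dxr =
        SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5)))
        && opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;

    std::puts(dxr ? "DXR tier 1.0+ supported."
                  : "DXR not supported on this adapter/driver.");
    return 0;
}
```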
Nvidia has taken a big risk in baking RT into their silicon, almost as big as that time they were banking on G-Sync somehow becoming the standard ...