Chazore said:
You are? I mean, I'd want to be, but seeing where game design goes these days and how it has to run on consoles, I honestly don't see myself caring about tech I'll never be able to see, because it'd make the other closed-off boxes look bad if a PC had it designed for and used first. I'd be absolutely down for it if we saw many devs making games with real-time ray tracing in mind for PC versions, or for games made primarily for PC. Just not later, or after consoles get it first, because I'm honestly tired of the whole "avoid upsetting the special snowflake in the room" routine we've been having for years and years.
https://wccftech.com/shadow-of-the-tomb-raider-bfv-nvidia-rtx/
So, yeah.
"It just works"
It'll still take some time before a game could be designed with ray tracing in mind.
It would've been irresponsible to have been working on titles with it in mind before this year.
haxxiy said:
Mmm, I'm a bit skeptical of the long-term success of Nvidia turning their back on the hardware design that propelled them to greatness in the first place: unified compute engines and nothing else inside their GPUs. It feels like they've contracted IBM's disease, though IBM at least had Intel selling simpler, more efficient hardware to keep them in their place. AMD, on the other hand, designed what was possibly their worst GPU since buying ATI (and one subject to the same "disease", with all of Vega's useless bells and whistles).
Eh, well we'll just have to disagree then.
You can't just rely on the same tricks to work forever.
Look at Intel, and how AMD is currently clawing back market share through smart design and a somewhat risky approach.
If Nvidia didn't try to innovate now, then AMD, Intel, Imagination, or someone else would've tried to challenge them.
Eventually someone would've taken them down.
Plus it seems pretty clear to me that GPUs relying on compute alone would've gone about as well as cards dedicated solely to rasterisation, or media decoding, or something like that.