vivster said:
It's their technology and they can claim as much as they want. It's not that they're wrong, even. A shader isn't a firmly defined entity and the amount of shaders does not define performance. How would you feel if the shader count is correct but it turns out the shaders are actually only half as capable as previous shaders? That wouldn't be false advertising but it would have the same effect. The current facts are that the numbers do not add up, which means that either the advertised shaders are bad OR not as numerous but better. I opt for the latter.
It can also mean that the shaders aren't being fully used. Some years ago (I don't remember whether it was with Fury or the Vega cards) AMD had that problem: those cards had close to double the shaders of the regular, mainstream cards but didn't offer twice the performance, because the chip didn't scale well and not all the shaders could be kept busy. Something similar could have happened to Nvidia this time, only to a lesser extent.
Another option would be that the drivers still need to mature and can't yet make full use of the new hardware.
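Just to illustrate the scaling point with some quick napkin math (a minimal Python sketch; all shader counts, clocks and utilization figures below are made-up illustrative values, not real card specs):

# Rough back-of-the-envelope sketch of why "2x shaders" rarely means "2x performance".
# Every number here is a hypothetical assumption for illustration only.

def effective_tflops(shaders: int, clock_ghz: float, utilization: float) -> float:
    """Peak FP32 throughput (shaders * clock * 2 FLOPs per cycle),
    scaled by how many shaders the chip can actually keep busy."""
    peak = shaders * clock_ghz * 2 / 1000  # TFLOPS
    return peak * utilization

# Hypothetical mainstream card: fewer shaders, but almost all of them stay fed.
small = effective_tflops(shaders=2048, clock_ghz=1.0, utilization=0.95)

# Hypothetical big card: double the shaders, but the front end / scheduling
# only keeps ~70% of them doing useful work in typical games.
big = effective_tflops(shaders=4096, clock_ghz=1.0, utilization=0.70)

print(f"small card: {small:.2f} TFLOPS effective")
print(f"big card:   {big:.2f} TFLOPS effective")
print(f"speedup: {big / small:.2f}x instead of the 2x the shader count suggests")

With those made-up numbers the bigger chip only ends up roughly 1.5x faster despite having twice the shaders, which is the kind of gap people saw back then.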
Please excuse my bad English.
Currently gaming on a PC with an i5-4670K @ stock (for now), 16 GB RAM at 1600 MHz and a GTX 1070
Steam / Live / NNID: jonxiquet. Add me if you want, but I'm a single-player gamer.