So it begins again...
It looks so tiny compared to my current case.
If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.
EricHiggin said:
If the highest performing Big Navi SKU is thought by Nvidia to offer greater performance than the 3080, by a significant margin, that's why they would want to leave not just room but a huge price gap, so they can land wherever they need to when they respond with the 3080Ti. Probably sooner rather than later. Below that there isn't as much wiggle room in terms of pricing. Offer 3070Ti and 3080Ti performance for 3070 and 3080 pricing with Big Navi. That would certainly make things interesting. 3070 and 3080 like performance for $50-$100 less would be the next best thing. After multiple gens of overpricing, Nvidia didn't just all of a sudden decide to be generous for no reason this gen. These prices should scream that worthy competition is coming. Below the 3090, anyway. |
I'd be surprised if AMD manages not just to beat the 3080, but even to match it. I'm not saying it can't happen, but I'd certainly be cautious about it, especially given the latest rumors about Big Navi (from AMD being surprised by Ampere's performance jump, to Big Navi not being taped out until recently, which would mean all the previous rumors were untrue, to the latest kopite tweet comparing it to the GA104 of the 3070).
And when it comes to the price of the new cards, we also have to keep in mind that, because of COVID, the whole world is in the middle of an economic crisis, and as such Nvidia can't charge as much as it wants, because it would risk losing sales from people who can't afford the new cards.
Also, looks like I could be wrong about the 3070Ti... (see below)
vivster said: Let's talk about CUDA cores. So it looks like that seemingly massively increased number of shaders isn't the true story and neither are the TFLOPS. It has been noticed that performance of the new cards does not scale linearly with the core count as it usually does. |
Where did you get that info about the "fake" shader count? Just curious, I'd like to read more about it, because videocardz has an article about Lenovo spoiling the existence of a 3070Ti, and it says this:
NVIDIA GeForce RTX 3070 Ti spotted with 16GB GDDR6 memory https://videocardz.com/newz/nvidia-geforce-rtx-3070-ti-spotted-with-16gb-gddr6-memory
Interestingly, Lenovo also confirmed that their Legion T7 system will feature the GeForce RTX 3070 Ti model. This SKU has not been announced or even teased by NVIDIA in any form. Though, it aligns with the rumors that RTX 3070 series will be offered with both 8GB and 16GB memory. What remains unclear is whether the model is really called 3070 Ti or 3070 SUPER, we have heard both names in private conversations with AIBs.
(...)
There is, however, something to consider. NVIDIA clearly did not inform the partners with the full specifications until the very last moment. We have heard that the final BIOS for the Ampere series was provided only recently. The doubled FP32 SM (Cuda) count has also not been communicated clearly to partners until just a few days ago. Hence, some AIBs still list incorrect CUDA core counts (5248/4352/2944) on their websites. What this means is that Lenovo may still rely on old data, which could’ve changed over the past few days.
They seem to think that the shader core number is real.
Please excuse my bad English.
Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070
Steam / Live / NNID : jonxiquet Add me if you want, but I'm a single player gamer.
I got it from the presentation. He says that the new shaders are able to process 2 calculations per core. Coupled with the massive increase in shader count but the lackluster scaling, I deduced that they most likely just took that and pretended that a shader that does 2 calculations is now 2 shaders. It's really just how you look at it and how you want to define a shader.
Basically, I refuse to believe that the new shaders are less efficient than Turing's, which means they're probably not proper shaders, which would be the case if they split one shader into two.
It would also fit the rumors, and the fact that the shader counts magically doubled only very recently.
edit: computerbase.de sees it the same way.
https://www.computerbase.de/2020-09/geforce-rtx-3090-3080-3070-vorgestellt/
"What exactly has changed in the Ampere architecture remains a secret. Apparently, however, Nvidia has rebuilt the shader units significantly. Apparently an ALU at Ampere can no longer only perform one MAD calculation (Multiply ADD) with FP32 accuracy per cycle, but rather two. This would double the theoretical computing power per cycle for a single shader unit.
For this reason, Nvidia also calls Ampere twice as many CUDA cores as up to now. The GeForce RTX 3070 is specified with 5,888 CUDA cores, the GeForce RTX 3080 with 8,704 and the GeForce RTX 3090 with 10,496. In comparison, the GeForce RTX 2080 Ti has just 4,352 CUDA cores."
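To put some back-of-the-envelope numbers on it (the boost clocks here are approximate reference values I'm assuming, not something from the slides):

```python
# Rough sketch, not official numbers: theoretical peak FP32 throughput from
# the advertised CUDA core counts, assuming one FMA (counted as 2 FLOPs)
# per core per cycle. Boost clocks are approximate reference values.

def fp32_tflops(cuda_cores, boost_ghz):
    """Peak FP32 TFLOPS = cores * 2 FLOPs (FMA) * boost clock in GHz / 1000."""
    return cuda_cores * 2 * boost_ghz / 1000.0

cards = {
    "RTX 2080 Ti": (4352, 1.545),
    "RTX 3070": (5888, 1.73),
    "RTX 3080": (8704, 1.71),
    "RTX 3090": (10496, 1.70),
}

for name, (cores, clock) in cards.items():
    print(f"{name}: {fp32_tflops(cores, clock):.1f} TFLOPS")

# Output is roughly 13.4 / 20.4 / 29.8 / 35.7 TFLOPS. The Ampere figures only
# line up with Nvidia's quoted ~20/30/36 TFLOPS if you use the doubled core
# counts, i.e. if each "old" shader now counts as two because it can issue
# two FP32 ops per clock.
```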
Last edited by vivster - on 02 September 2020
If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.
Due to the weird shader stuff and other limiting factors, I expect the 3070 to be behind the 2080 Ti in some, if not most, scenarios.
If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.
I'll have to disagree with you and computerbase. If the official slides say 5,000, 8,000 and 10,000 shaders, then the cards have that many shaders. Otherwise, Nvidia could be sued for false advertising and lose millions. That's why Intel and AMD always say X cores/Y threads on their CPUs, and not just Y cores, because that wouldn't be true and could open the door to lawsuits.
They aren't dumb enough to make that mistake.
Please excuse my bad English.
Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070
Steam / Live / NNID : jonxiquet Add me if you want, but I'm a single player gamer.
JEMC said: I'll have to disagree with you and computerbase. If the official slides say 5,000, 8,000 and 10,000 shaders, then the cards have that many shaders. Otherwise, Nvidia could be sued for false advertising and lose millions. That's why Intel and AMD always say X cores/Y threads on their CPUs, and not just Y cores, because that wouldn't be true and could open the door to lawsuits. They aren't dumb enough to make that mistake. |
It's their technology and they can claim as much as they want. It's not that they're wrong, even. A shader isn't a firmly defined entity and the amount of shaders does not define performance. How would you feel if the shader count is correct but it turns out the shaders are actually only half as capable as previous shaders? That wouldn't be false advertising but it would have the same effect.
The current facts are that the numbers do not add up, which means that either the advertised shaders are bad OR not as numerous but better. I opt for the latter.
If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.
vivster said:
It's their technology and they can claim as much as they want. It's not that they're wrong, even. A shader isn't a firmly defined entity and the amount of shaders does not define performance. How would you feel if the shader count is correct but it turns out the shaders are actually only half as capable as previous shaders? That wouldn't be false advertising but it would have the same effect. The current facts are that the numbers do not add up, which means that either the advertised shaders are bad OR not as numerous but better. I opt for the latter. |
It can also mean that the shaders aren't fully used. Some years ago, I don't remember if it was with the Fury or the Vega cards, AMD had that problem. Those cards had something close to double the shaders of the regular, mainstream cards, but didn't offer twice the performance because the chips weren't well balanced and not all the shaders could be kept busy. Something similar could have happened to Nvidia this time, only to a lesser extent.
Another option would be that the drivers still need to mature and can't make full use of the new hardware.
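Just to illustrate what I mean with a toy example (the utilization percentages are completely made up, they're only there to show the shape of the problem):

```python
# Toy model: doubling the paper TFLOPS doesn't double game performance if the
# hardware/drivers can only keep a fraction of the shaders busy. The
# utilization numbers below are invented purely for illustration.

def effective_tflops(peak_tflops, utilization):
    """Usable throughput once you account for how busy the units really are."""
    return peak_tflops * utilization

mature_card = effective_tflops(13.4, 0.90)  # well-understood architecture, mature drivers
new_card = effective_tflops(29.8, 0.55)     # wide new chip, young drivers (hypothetical)

print(f"{new_card / mature_card:.2f}x real-world gain from a {29.8 / 13.4:.2f}x paper gain")
# With these made-up numbers, a ~2.2x paper advantage turns into roughly 1.4x
# in practice, which is the same kind of gap Fury/Vega showed between paper
# specs and actual games.
```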
Please excuse my bad English.
Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070
Steam / Live / NNID : jonxiquet Add me if you want, but I'm a single player gamer.
JEMC said:
It can also mean that the shaders aren't fully used. Some years ago, I don't remember if it was with the Fury or the Vega cards, AMD had that problem. Those cards had something close to double the shaders of the regular, mainstream cards, but didn't offer twice the performance because the chips weren't well balanced and not all the shaders could be kept busy. Something similar could have happened to Nvidia this time, only to a lesser extent. Another option would be that the drivers still need to mature and can't make full use of the new hardware. |
Yeah, but think about it. If you say the advertised shaders are full shaders and believe Nvidia when they say each shader can now do double the calculations, wouldn't that mean we should expect quadruple the power when the shader count is doubled? Even with bad optimization we'd still see something like 3x the performance. Yet, despite having more than double the shaders and a noticeably higher clock, we're not even at double the performance.
That can only mean that the Ampere shaders are not comparable to Turing shaders, yet both are called "shaders". Wouldn't that be false advertising?
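Putting rough numbers on my point (the real-world multiplier here is a placeholder from the rumors and marketing claims, not a benchmark):

```python
# Spec-sheet peak FP32 TFLOPS vs. the kind of gaming gains being talked about.
# The real-world multiplier is a placeholder, not a measured result.

peak_2080ti = 13.4           # TFLOPS, Turing
peak_3080 = 29.8             # TFLOPS, Ampere, using the doubled core count
rumored_gaming_gain = 1.4    # hypothetical 3080 vs 2080 Ti gain in games

paper_gain = peak_3080 / peak_2080ti
print(f"paper: {paper_gain:.2f}x, assumed in games: {rumored_gaming_gain:.2f}x")

# If the 8,704 advertised cores were full Turing-style shaders that ALSO each
# did two FP32 ops per clock, the paper gain would be ~4.4x, which is nowhere
# in sight. So either the new shaders are weaker per unit, or the advertised
# count is really the old count with dual-issue baked in.
```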
If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.
Same size (count) but roughly double the performance because of double the channels? That would explain why things aren't much more expensive.
"I think people should define the word crap" - Kirby007
Join the Prediction League http://www.vgchartz.com/predictions
Instead of seeking to convince others, we can be open to changing our own minds, and seek out information that contradicts our own steadfast point of view. Maybe it’ll turn out that those who disagree with you actually have a solid grasp of the facts. There’s a slight possibility that, after all, you’re the one who’s wrong.