Chazore said:
Now there's a thought: what if that's Nvidia's goal?
That would be absolutely evil as shit to do though, and I hope to god that isn't one of their plans, because if I were an AIB partner, I'd get the fuck outta there if that were to happen. Personally, even if consoles were made available again, I could never go back to them. I just cannot stand the lack of freedom, losing the software I currently use, the subscription plans for the crap that's on there (Netflix, Hulu, all that shit), paying for online gameplay, the lack of modding, etc. It all still feels so restrictive to me.
I don't think Nvidia is in that position, at least not yet. And proof of that is the 4080 12GB, an AIB-only card. If Nvidia wanted to hurt them, it would have also done an F.E. card with that GPU.
Now you could argue that giving the 12GB model to AIBs is a poisoned gift, because it will be very difficult for them to launch custom models close to the $900 MSRP; instead they will be closer to $1,000 or even a bit more, putting potential buyers in the mindset of "if I'm spending that much, why not go for the true 4080 instead?". But AIBs aren't run by stupid people, and if they think Nvidia is cutting the grass from under their feet, they'll do something about it.
Darc Requiem said: The RTX 40 series is actually worse than I thought. I saw a commenter saying the 4080 16GB is a rebranded 4070 and the 4080 12GB is a rebranded 4060 Ti. If you check the percentage of CUDA cores for a 3070 vs a 3090 and a 3060 Ti vs a 3090, and then compare them to the 4080 12GB and 4080 16GB vs the 4090, the dude isn't BSing. WTF Nvidia.
I don't really agree with that. Sure, the difference between the 4090 and the real 4080 is massive, but I think it has more to do with Nvidia leaving a gap big enough between the models to fit the Ti variants coming next year, so those bring enough of a performance improvement to look good.
After all, while Nvidia didn't say a thing, GALAX confirmed the dies used on each GPU, and they don't differ much from what Nvidia has been using these last few generations, the exception being the 4080 on AD103, when Nvidia rarely uses the x03 designation.
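To be clear, the core-count percentages in the quote do check out; it's the interpretation I disagree with. A minimal sketch of the arithmetic, using the publicly listed CUDA core counts for each card (just a sanity check, not anything official from Nvidia):

```python
# Quick check of the CUDA-core ratios from the quoted post.
# Core counts below are the publicly listed specs for each card.
cores = {
    "RTX 3090": 10496, "RTX 3070": 5888, "RTX 3060 Ti": 4864,
    "RTX 4090": 16384, "RTX 4080 16GB": 9728, "RTX 4080 12GB": 7680,
}

pairs = [
    ("RTX 3070", "RTX 3090"),
    ("RTX 3060 Ti", "RTX 3090"),
    ("RTX 4080 16GB", "RTX 4090"),
    ("RTX 4080 12GB", "RTX 4090"),
]

for card, flagship in pairs:
    pct = cores[card] / cores[flagship]
    print(f"{card}: {pct:.1%} of the {flagship}'s CUDA cores")

# Prints:
# RTX 3070: 56.1% of the RTX 3090's CUDA cores
# RTX 3060 Ti: 46.3% of the RTX 3090's CUDA cores
# RTX 4080 16GB: 59.4% of the RTX 4090's CUDA cores
# RTX 4080 12GB: 46.9% of the RTX 4090's CUDA cores
```

So the 4080 12GB sits roughly where the 3060 Ti sat relative to the flagship, and the 4080 16GB only slightly above where the 3070 sat, which is exactly what the quoted commenter was pointing at. My point is just that the gap is there to leave room for Ti models, not that the numbers are wrong.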
Captain_Yuri said:
Between low availability and ridiculous prices, the idea of Nvidia putting these cards out of reach to make the remaining Ampere cards more enticing sounds more plausible.
After all, we have to remember that Nvidia tried to renegotiate its contract with TSMC to reduce its order, and TSMC said no. So we know Nvidia has the chips; it's only a matter of whether it's willing to use them now or not.
Please excuse my bad English.
Currently gaming on a PC with an i5-4670K @ stock (for now), 16GB RAM at 1600 MHz, and a GTX 1070
Steam / Live / NNID: jonxiquet. Add me if you want, but I'm a single-player gamer.