Forums - PC Discussion - GeForce GTX '20' Series (Turing) Rumored For Summer 2018 Launch

 

The rumours of Nvidia’s next-generation graphics cards have taken another disappointing twist. Igor Wallossek of Tom's Hardware Germany claims that Nvidia’s plans are completely different from what had previously been mooted. Rather than launching in spring, Nvidia plans to release its Turing gaming graphics card architecture in July, pushing it all the way back to Q3 2018.

If you’re raising your eyebrows at Turing, that’s because of a bit of an about-turn on the naming. Nvidia has allegedly decided to name its successor to Pascal as Turing, the architecture we’d previously suspected would be aimed at cryptocurrency miners.

Source 1

Speculation suggests that when the GTX 2060 and GTX 2080 cards do launch, they will be packing Samsung-built 16 Gbps or 18 Gbps GDDR6 RAM. The real question is whether gamers will be able to get their hands on these new cards when they launch, or if the crypto miners will gobble them all up.
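For scale, those per-pin data rates translate into raw memory bandwidth with simple arithmetic. This is just a back-of-the-envelope sketch; the 256-bit bus width is my assumption (typical for an x80-class card), not something the rumour specifies:

```python
# Peak theoretical memory bandwidth:
#   GB/s = per-pin data rate (Gbps) * bus width (bits) / 8
# The 256-bit bus below is an assumed figure, not from the rumour.

def memory_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Return peak bandwidth in GB/s for a given data rate and bus width."""
    return data_rate_gbps * bus_width_bits / 8

for rate in (16, 18):
    bw = memory_bandwidth_gb_s(rate, 256)
    print(f"{rate} Gbps GDDR6 on a 256-bit bus -> {bw:.0f} GB/s")
```

On those assumptions, 16 Gbps works out to 512 GB/s and 18 Gbps to 576 GB/s, a healthy jump over the GTX 1080's 320 GB/s of GDDR5X.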

There is no clear indication of what the hold up is right now on Turing-based cards. Perhaps the issue is that there is no real pressure on NVIDIA to rush the cards out. AMD and its Radeon Vega graphics cards aren’t applying the pressure needed to force NVIDIA's hand to get Turing cards out more quickly. NVIDIA is expected to continue the rollout of the Volta architecture and word is that Ampere-based cards won't see consumer versions.

Source 2





Can't say that I have any hype for the launch. It will be impossible to get one thanks to miners. Sucks, because I'd really like to upgrade my 950 to a 2050 Ti or 2060.



Yeah, this year feels different from previous ones. The miners currently have the market in a stranglehold. I could technically skip Turing, but really, I may have to.




Can game developers even keep up at the rate they keep coming out with stronger cards?

IMHO, this card is for crypto mining. I might be wrong, but it seems like that's the only thing that can fully utilize its architecture.



I suspect Turing will be more of a refresh than a generational leap. Unless they made some big architectural improvements, I don't see 12 nm being a big enough jump to give that 50-70% performance gain; I suspect we will get about 30%.

Another issue is memory size: are they just gonna double it when memory is so expensive right now (for example, the 2080/2070 getting 16 GB of RAM)? I don't think so.

Another interesting aspect is 7 nm. Will they launch Turing this year and then another architecture next year on 7 nm?



6x Master League achiever in StarCraft 2

Beaten Sigrun on God of War mode

Beaten DOOM Ultra-Nightmare with no endless ammo rune, 2x super shotgun and no decoys on PS4 Pro.

1-0 against Grubby in WC3 Frozen Throne ladder!!


The supposed delay might not be all that bad. Pascal is probably the best architecture Nvidia ever put out, and it is still performing quite well two years after release.

We need memory prices to go down (Samsung investing in new factories could help), so a few more months delay can't hurt.

I do think it might be worth waiting for the 7nm for those sitting on a high end Pascal card.



Trumpstyle said:
I suspect Turing will be more of a refresh than a generational leap. Unless they made some big architectural improvements, I don't see 12 nm being a big enough jump to give that 50-70% performance gain; I suspect we will get about 30%.

Another issue is memory size: are they just gonna double it when memory is so expensive right now (for example, the 2080/2070 getting 16 GB of RAM)? I don't think so.

Another interesting aspect is 7 nm. Will they launch Turing this year and then another architecture next year on 7 nm?

12 nm is basically a slightly improved 16 nm, which in turn is really a 20 nm-class process, so just about all the performance gains will have to come from architecture. That is, assuming it is derived from Volta instead of just being a Pascal refresh. As for 7 nm, I guess that depends on when smartphone SoCs start using it, since new nodes make it to larger chips about a year afterwards. I used to expect it next year, but so far it's not looking like any smartphones will use it late this year or early next, so maybe that's a 2020 launch, which would keep things on a nice even two-year cadence, as it has been for a while now.

A wild card here is GlobalFoundries' 7 nm process, which is very close to Samsung's 10 nm and a 50% improvement over it, and which could be available earlier than the more commonly expected 7 nm nodes from TSMC and Samsung. It would mean a nice boost for AMD if they could release GPUs on it, maybe even a year before Nvidia gets its next process node, though sadly AMD is still so far behind Nvidia in architecture that it would only even things out at best.

That's how I understand it at least.




deskpro2k3 said:
Can game developers even keep up at the rate they keep coming out with stronger cards?

IMHO, this card is for crypto mining. I might be wrong, but it seems like that's the only thing that can fully utilize its architecture.

They haven't been able to keep up for close to a decade now; these cards are surely just for cashing in on the crypto-mining craze.



At the pace AMD is going, they might as well delay it to next year.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

Well, thanks to crypto miners, I'm not even remotely excited about it. Unless Nvidia puts in some kind of hardware restriction that makes the cards useless for miners, but I seriously doubt it. At least the current situation should slow graphical progress in gaming, as the number of potential customers for the most demanding games will shrink dramatically. That means if you're a 9- or 10-series owner, you should still be fine for the next few years.