
GeForce GTX '20' Series (Turing) Rumored For Summer 2018 Launch

Can game developers even keep up at the rate these stronger cards keep coming out?

IMHO, this card is for crypto mining. I might be wrong, but it seems like that's the only thing that can fully utilize its architecture.



CPU: Ryzen 7950X
GPU: MSI 4090 SUPRIM X 24G
Motherboard: MSI MEG X670E GODLIKE
RAM: CORSAIR DOMINATOR PLATINUM 32GB DDR5
SSD: Kingston FURY Renegade 4TB
Gaming Console: PLAYSTATION 5

I suspect Turing will be more of a refresh than a generational leap. Unless they made some big architectural improvements, I don't see 12nm being a big enough leap to give that 50-70% performance gain; I suspect we will get about 30%.

Another issue is memory size: are they just gonna double it when memory is so expensive right now (for example, the 2080/2070 getting 16 GB of RAM)? I don't think so.

Another interesting aspect is 7nm. Will they launch Turing this year and then another architecture next year on 7nm?



6x Master League achiever in StarCraft 2

Beat Sigrun on God of War mode

Beat DOOM on Ultra-Nightmare with NO endless-ammo rune, 2x super shotgun and no decoys on PS4 Pro.

1-0 against Grubby on the WC3 Frozen Throne ladder!!

The supposed delay might not be all that bad. Pascal is probably the best architecture Nvidia has ever put out, and it's still performing quite well two years after release.

We need memory prices to go down (Samsung investing in new factories could help), so a few more months' delay can't hurt.

I do think it might be worth waiting for 7nm for those sitting on a high-end Pascal card.



Trumpstyle said:
I suspect Turing will be more of a refresh than a generational leap. Unless they made some big architectural improvements, I don't see 12nm being a big enough leap to give that 50-70% performance gain; I suspect we will get about 30%.

Another issue is memory size: are they just gonna double it when memory is so expensive right now (for example, the 2080/2070 getting 16 GB of RAM)? I don't think so.

Another interesting aspect is 7nm. Will they launch Turing this year and then another architecture next year on 7nm?

12 nm is basically a slightly improved 16 nm, which in turn is really a 20 nm process under the hood, so just about all of the performance gains will have to come from the architecture. That is, assuming it is derived from Volta instead of just being a Pascal refresh. As for 7 nm, I guess that depends on when the smartphone SoCs start using it, since new nodes usually make it to larger chips about a year afterwards. I used to expect it next year, but so far it's not looking like any smartphones will be using it later this year or early next, so maybe that's a 2020 launch, which would keep things on a nice even two-year cadence, as it has been for a while now.

A wild card here is GlobalFoundries' 7 nm process, which is very close to Samsung's 10 nm and about a 50% improvement over their 10 nm, and which could be available earlier than the more commonly expected 7 nm from TSMC and Samsung. It would be a nice boost for AMD if they could release GPUs on it maybe even a year before Nvidia gets its next process node, though sadly AMD is still so far behind Nvidia in architecture that it would end up making things even at best.

That's how I understand it at least.
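
For what it's worth, here's a back-of-the-envelope sketch of how a small clock bump from the 12 nm tweak plus an architectural gain could stack up. The percentages are purely my own illustrative guesses, not measured or leaked figures:

```python
# Rough performance-scaling sketch. The clock and architecture gains below
# are illustrative guesses, not measured or leaked numbers.

def projected_gain(clock_gain: float, arch_gain: float) -> float:
    """Combine a clock gain and a per-clock (architecture) gain multiplicatively."""
    return (1 + clock_gain) * (1 + arch_gain) - 1

# Guess: 16 nm -> 12 nm is a minor tweak, so maybe ~5-10% extra clock at best.
# Guess: a Volta-derived architecture might add ~20-30% per clock.
for clock, arch in [(0.05, 0.20), (0.05, 0.25), (0.10, 0.30)]:
    total = projected_gain(clock, arch)
    print(f"clock +{clock:.0%}, architecture +{arch:.0%} -> total +{total:.0%}")
```

With guesses like those you land in a roughly 25-45% range rather than 50-70%, which is why the "about 30%" estimate above doesn't seem unreasonable to me.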


deskpro2k3 said:
Can game developers even keep up at the rate these stronger cards keep coming out?

IMHO, this card is for crypto mining. I might be wrong, but it seems like that's the only thing that can fully utilize its architecture.

They haven't been able to keep up for close to a decade now; these cards are surely just for cashing in on the crypto-mining craze.




At the pace AMD is going, they might as well delay it to next year.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

Well, thanks to crypto miners, I'm not even remotely excited about it. That is, unless Nvidia puts in some kind of hardware restriction that makes the cards useless to miners, but I seriously doubt it. At least the current situation should slow graphical progress in gaming, as the number of potential customers for the most demanding games will shrink dramatically. That means if you're a 9- or 10-series owner, you should still be fine for the next few years.



Kristof81 said:
Well, thanks to crypto miners, I'm not even remotely excited about it. That is, unless Nvidia puts in some kind of hardware restriction that makes the cards useless to miners, but I seriously doubt it. At least the current situation should slow graphical progress in gaming, as the number of potential customers for the most demanding games will shrink dramatically. That means if you're a 9- or 10-series owner, you should still be fine for the next few years.

The current generation is still way too weak to handle 4K properly. There is no such thing as "being fine" with GPUs, especially not for years. We can't even advance to the next step of imaging techniques because the GPUs can't keep up.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

vivster said:
Kristof81 said:
Well, thanks to crypto miners, I'm not even remotely excited about it. That is, unless Nvidia puts in some kind of hardware restriction that makes the cards useless to miners, but I seriously doubt it. At least the current situation should slow graphical progress in gaming, as the number of potential customers for the most demanding games will shrink dramatically. That means if you're a 9- or 10-series owner, you should still be fine for the next few years.

The current generation is still way too weak to handle 4K properly. There is no such thing as "being fine" with GPUs, especially not for years. We can't even advance to the next step of imaging techniques because the GPUs can't keep up.

I wouldn't go that far. Weak? Yes. Way too weak? Hmmm, no. Or perhaps you mean handling 144 fps at 4K? In that case I would agree.



Mordred11 said:
vivster said:

The current generation is still way too weak to handle 4K properly. There is no such thing as "being fine" with GPUs, especially not for years. We can't even advance to the next step of imaging techniques because the GPUs can't keep up.

I wouldn't go that far. Weak? Yes. Way too weak? Hmmm, no. Or perhaps you mean handling 144 fps at 4K? In that case I would agree.

Even a 1080 Ti can't handle the more demanding games at a stable 60 fps at 4K unless you turn other settings down. That's too weak. It barely averages over 60 fps in current games, and an average is not the same as stable. And considering the industry is slowly moving away from multi-GPU setups, single GPUs have to work harder than ever.
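
To put some rough numbers behind that: 4K is 2.25x the pixels of 1440p, and a locked 60 fps only leaves about 16.7 ms per frame. A quick sketch (treating GPU cost as proportional to pixel count is a simplification, since real games don't scale perfectly with resolution):

```python
# Frame-budget arithmetic behind the "too weak for 4K" point.
# Pixel counts are exact; "cost scales with pixel count" is a simplification.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

frame_budget_ms = 1000 / 60  # ~16.7 ms per frame for a locked 60 fps

base_pixels = 2560 * 1440
for name, (w, h) in RESOLUTIONS.items():
    px = w * h
    print(f"{name}: {px:,} pixels, ~{px / base_pixels:.2f}x the work of 1440p")

print(f"A locked 60 fps leaves about {frame_budget_ms:.1f} ms per frame")
```

So under that naive scaling, a card that does around 100 fps at 1440p ends up somewhere in the mid-40s at 4K, which is the kind of gap being described here.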



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.