
Forums - PC Discussion - Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

vivster said:
hinch said:

The RTX 3090 is targeted more towards the prosumers and enthusiasts. There's a reason why they showcased the RTX 3080 as their main consumer grade card instead of their flagship.

Nvidia can pretty much price those any way they want, since they offer the most CUDA cores and copious amounts of VRAM, which professionals need for work and are willing to pay for/invest in. But I should think that 10GB of ultra-fast GDDR6X will be enough for the next generation of 4K gaming.

For those on the fence, there are hints of future models with more RAM. Personally, I don't think it's a big deal. The 8GB on the 3070 will, however, somewhat limit that card in the near future.

It's still shitty for people like me who are just below the prosumer.

If I buy the 3090 I'm gonna hate myself when they release the 3080ti next year.

If I buy the 3080 I'm gonna hate myself when they release the 3080ti next year.

If I wait for the 3080ti I'm gonna hate myself because I still can't play all the cool games properly.

I just cannot win here.

And that's before considering the possibility of a 3080 with 20GB launching next year as well.

In any case, with the 3080 launching in two weeks, most stores will already be out of pre-order cards, so if you haven't secured one, you'll have to wait a bit to get one. But at least that will let you read the reviews and make a better-informed decision.

vivster said:
Captain_Yuri said:

Yeah, makes sense. The next-gen GPUs will come in 2.5 years, and by then most games probably won't use that much VRAM, as a lot will still be cross-gen. When the main next-gen games come around, they should start to use more VRAM.

I very much doubt it'll take that long for the next gen. My guess is early 2020 at the latest. 5nm is pretty much ready to go at TSMC. It will definitely be a smaller step, but there is no point in waiting. I can only imagine AMD will also release very soon after Big Navi eventually fails.

I know it's hard to believe, man, but we're already in the future, we're already in 2020. Maybe you meant 2022?



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.


Grrr, I want to get the Founders Edition cards but I feel like it's a solid fuck you to Air Cooler setups like mine.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:

Grrr, I want to get the Founders Edition cards but I feel like it's a solid fuck you to Air Cooler setups like mine.

Time for liquid cooling then.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

vivster said:
Captain_Yuri said:

Grrr, I want to get the Founders Edition cards but I feel like it's a solid fuck you to Air Cooler setups like mine.

Time for liquid cooling then.

Yea but Noctua has been so good to me.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:

Grrr, I want to get the Founders Edition cards but I feel like it's a solid fuck you to Air Cooler setups like mine.

I'm gonna wait till early next yr and see what Zotac has in store for water cooling. Fuck going with Aorus again. 



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

Captain_Yuri said:
vivster said:

Time for liquid cooling then.

Yea but Noctua has been so good to me.

You know you can put just the GPU in the loop, right?

Last edited by JEMC - on 03 September 2020

Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Chazore said:
Captain_Yuri said:

Grrr, I want to get the Founders Edition cards but I feel like it's a solid fuck you to Air Cooler setups like mine.

I'm gonna wait till early next yr and see what Zotac has in store for water cooling. Fuck going with Aorus again. 

Yea I might end up with Asus or Zotac. Too bad cause Founders looks hot damn.

And yeah, some of the AIBs have some wtf designs lol.

Here's KFA2 for the 3080:

It's like if you took the Founders Edition and got rid of all the elegance and engineering...



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

JEMC said:
EricHiggin said:

I wouldn't expect a Radeon 3080 (Ti) competitor to go blow for blow with its direct GeForce competition, but that's not to say it couldn't be better in some aspects while worse in others.

A 3070 Ti wouldn't be as warranted based on the pricing layout, but considering a Ti version is the norm, it's not entirely unexpected. Having a Ti/Super edition out sooner rather than later would make it even tougher on AMD, though. Perhaps this is a stronger indicator of where Nvidia thinks the top-tier Radeon cards will land: flood those tiers with reasonably priced models, price-gapped for upselling at that, to try to keep people from buying AMD. If correct, this would mean AMD mostly has to rely on cheaper pricing if it wants to gain market share.

RTG's silence really does make me wonder, as explained below.

So you're like haxxiy and think that AMD could potentially beat Nvidia's Ampere in pure rasterization but lose in RT and the like. We'll see.

In any case, in order to put up a fight, AMD needs to be competitive in both performance and price. The 5700 series was very competitive against Nvidia's cards but, given that they were priced very closely, Nvidia still managed to sell more units because of its brand name.

EricHiggin said:

Didn't AMD get sued over this not all that long ago, for marketing more CPU cores than those chips 'legitimately had'?

Yeah, they got sued for Bulldozer and had to pay 12 million dollars: https://www.anandtech.com/show/14804/amd-settlement

More along those lines, yes. I'd be highly surprised if Big Navi comes close to Ampere's higher tier RT capabilities.

That's something I've also worried about. The 5700 wasn't exactly cheap even without RT, and if Big Navi is supposed to be AMD going for the performance crown, should people really expect them to undercut Ampere by much, if at all? That's why I suggested 3080 Ti and 3070 Ti levels of performance, in general, for 3080 and 3070 pricing. The Navi card at the 3080 Ti level, for example, may only give you, say, 3060/3070-level RT, but that may be enough for AMD to gain some market share, considering these will be their first RT cards. It wouldn't lead to them crushing Ampere or anything like that, but it would be considered quite competitive overall. That assumes their drivers are also much improved, and it does seem they've been improving recently; it's even rumored to be part of the reason AMD is taking extra time before unveiling and releasing Big Navi, to make sure the drivers are worthy by launch.

Hmm. 12 million? That's it? Even if Nvidia got sued for Ampere and had to pay tens of millions, it would just be pocket change compared to their profits, so how much would they really care at that point?



vivster said:
HoloDust said:
From that video, Battlefield 1 uses 50%, while The Witcher 3 uses only 17-18% INT math (compared to FP).
So, using the games from that chart: in the worst-case scenario the 3080 is effectively a 22.5 TFLOPS FP32 card, in the best case it's around 27 TFLOPS... and of course, theoretically, it's a 30 TFLOPS FP32 card if no INT is used.

I guess that architecture actually makes a lot of sense in terms of cost and effectiveness, depending on the balance of FP vs INT math.

I'm not a chip designer, so for me the biggest question is why do it like that and not "just" introduce a third datapath, so all cores could be used simultaneously. It would undoubtedly make the chip more complex, and engines and interconnects would need to be adjusted, but is that the big issue? Is there a mechanical limitation? Is it a cost-saving measure? Are the controllers unable to handle more than two paths?

Considering these are huge chips, they probably didn't have the real estate to double both kinds of cores, so they just increased the important ones.

Space and power consumption constraints. Basically, what you propose is what will happen eventually; it's just that Ampere is already large and power hungry even on Samsung's 8nm node, so they had to make some compromises.

Core flexibility does cost transistors... And because fabrication improvements are slowing down, sometimes going with simpler core/pipeline designs and optimizing for projected workloads is the best approach to getting the best possible performance... And nVidia is shifting focus towards compute, as that is what projected future workloads demand (i.e. ray tracing is currently an inherently compute-constrained application, and Tesla is seeing massive growth).

AMD also recognized this years ago, hence why Graphics Core Next became a thing... Sadly, rasterization was still an important workload, hence why they decided to branch out their GPU designs: CDNA for compute, RDNA for rasterization.
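As an aside, the TFLOPS figures quoted above fall out of a simple back-of-the-envelope model (my own sketch, not an official NVIDIA formula): Ampere's FP32 units are split between a dedicated FP32 datapath and a shared FP32/INT32 datapath, so INT work only displaces FP32 work on the shared half.

```python
# Rough model of the RTX 3080's effective FP32 throughput (an assumption
# for illustration, not an official NVIDIA formula): half of the peak
# FP32 rate comes from a shared FP32/INT32 datapath, so any fraction of
# its cycles spent on INT32 is subtracted from that half only.

PEAK_TFLOPS = 30.0  # advertised RTX 3080 FP32 peak


def effective_fp32(int_fraction: float, peak: float = PEAK_TFLOPS) -> float:
    """Effective FP32 TFLOPS when `int_fraction` of the shared
    datapath's cycles go to INT32 instead of FP32."""
    dedicated = peak / 2                       # always runs FP32
    shared = (peak / 2) * (1.0 - int_fraction)  # loses cycles to INT32
    return dedicated + shared


print(effective_fp32(0.50))   # Battlefield 1 (~50% INT): 22.5 TFLOPS
print(effective_fp32(0.175))  # Witcher 3 (~17.5% INT): ~27.4 TFLOPS
print(effective_fp32(0.0))    # no INT at all: 30.0 TFLOPS
```

This reproduces HoloDust's 22.5 and ~27 TFLOPS figures; the real hardware scheduler is obviously more complicated, but the proportions match.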



--::{PC Gaming Master Race}::--

I love how he's warping words already and spreading dat misinformation.



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"