
Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

Pemalite said:
Bofferbrauer2 said:

Yep, at a starting price of less than 400€, the 6700XT is actually good value now, especially compared to the competition: it's in the same price bracket as the 3060, while the 3060 Ti (starting at 440€) sits between the 6750XT (starting at 410€) on one side and the 6800XT (560€; the base 6800 is actually more expensive than the XT models right now on Mindfactory) and 3070 (530€) on the other.

@bolded: This is why I won't buy a laptop with a 3060 or below in it: I won't go below 8GB of VRAM. Right now that leaves me with exactly two laptop options under 1500€ with an AZERTY keyboard layout to choose from:

  1. Lenovo Legion 5 Pro 16 (Ryzen 7 6800H, RTX 3070, a great 1600p 165Hz 500-nit screen, 1TB SSD), or
  2. ASUS TUF Gaming A16 Advantage Edition (Ryzen 7 7735HS (so basically the same CPU), Radeon 7600S (performance-wise, it should be between a desktop 6600 and 6600XT, leaning toward the latter), a less bright 1200p screen and just a 512GB SSD)

The Lenovo is right now almost 1000€ off (the original price is 2429€), so both cost around 1500€, but there's no guarantee this discount will last until the point where I can afford to buy that laptop. Pretty much all other laptops with similar specs available here are closer to or even past 2000€.

To be fair, my notebook with the Intel 11400H + GeForce 3060 6GB + 8GB RAM (since upgraded to 64GB) went on sale for about $950 AUD, so at that specific price point it was the best value you could buy at the time. Heck, it's still stupidly good value even today compared to other devices at this price point, which normally have integrated graphics or a GeForce 3050 4GB/Radeon RX 560 or something.

I would still make the same purchase if I were looking for a sub-$1,000 notebook even today, as it's still a bargain at that price... It's just a shit sandwich that the GPU is *more* capable and could fully utilise *more* than 6GB of VRAM, but it ended up with less.
If I had the option for an 8GB variant, I would have jumped all over it.

It's actually similar to an issue I had years ago with an old Pentium M Dothan @ 1.6GHz + Mobility Radeon 9700 laptop: it only came with 64MB of VRAM when it really needed 128MB or, better yet, 256MB... Then it would have handled Oblivion and Fallout 3 far, far better.

Yeah, for that price I would also have taken that one. For sub-1000€ I can find at best a 3050 Ti here (and most are even just a 3050 or some MX); laptops with a 3060 are generally in the same price bracket as those I mentioned above, and at that price I find the 3060 simply not worth it anymore.







PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

They deserve it.

Last edited by JEMC - on 08 April 2023

Please excuse my bad English.

Currently gaming on a PC with an i5-4670K @ stock (for now), 16GB RAM at 1600MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

NVIDIA GeForce RTX 50 “Blackwell” GPU Rumors: 3nm Monolithic, Faster Clocks, Over 2x Faster Than Ada RTX 40

https://wccftech.com/nvidia-geforce-rtx-50-blackwell-gpu-rumors-3nm-monolithic-faster-clocks-over-2x-faster-than-ada-rtx-40/

The usual suspects are likely on crack with the 2x-faster claim. As Kopite says, it's too early, but it would be interesting if it came with a 512-bit interface for their flagship. Blackwell being monolithic also makes sense because a) it's called Ampere Next Next, so it's likely not a complete redesign, and b) as we've seen with Radeon, MCM doesn't really have any benefit over monolithic just yet other than margins, which clearly aren't being passed down to the customer.
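For scale on that 512-bit rumor: peak memory bandwidth is just bus width times per-pin data rate. A quick sketch, where the 24 Gbps figure is purely an assumed rate for fast next-gen memory, not part of the rumor:

```python
# Peak memory bandwidth: pins * per-pin data rate / 8 bits per byte.
# The 24 Gbps rate is an assumption for fast next-gen memory,
# not something the rumor specifies.
def bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits * gbps_per_pin / 8

print(bandwidth_gbs(384, 21.0))  # 4090 today: ~1008 GB/s
print(bandwidth_gbs(512, 24.0))  # rumored 512-bit flagship: ~1536 GB/s
```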

Early Intel 14th Gen Meteor Lake-M Laptop CPU With 4+8 Core Configuration Spotted

https://wccftech.com/early-intel-14th-gen-meteor-lake-m-laptop-cpu-with-48-core-configuration-spotted/

Counter-Strike 2 is getting Nvidia’s latency-reducing Reflex tech

https://www.theverge.com/2023/4/6/23672408/counter-strike-2-nvidia-reflex-feature

Now this is what I have been waiting for!

A blast from the past. Steve going over 3dfx history for a bit is always interesting, considering how Nvidia feels as though it's in a similar state of building in-house GPUs and treating AIBs poorly. But Nvidia is also in a significantly different position than 3dfx was: 3dfx bet on niche PC users back then, whereas Nvidia bet on OEMs... and we know who came out on top.




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Doing some napkin math, assuming a 30-50% density increase, a 10-15% performance increase at the same power, and 25-30% power savings on 3 nm, to theoretically double the 4090 one would need a ~810 mm² die with a whopping ~700 W TDP.

This monolithic die, larger than any die Nvidia has ever released outside the HPC market, would also be at least 80% more expensive to manufacture than the 4090's.

That doesn't account for architectural improvements, but it also assumes perfect scaling per transistor and frequency, so take it as you will.
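For anyone wanting to check that estimate, here's the same napkin math as runnable Python, taking midpoints of the quoted node gains; the 4090 inputs are its public specs, everything else follows the assumptions above:

```python
# Napkin math from the post above, using midpoints of the quoted ranges.
die_4090_mm2 = 608.0       # AD102 die size (public spec)
tdp_4090_w   = 450.0       # 4090 TDP (public spec)

density_gain   = 1.40      # midpoint of the 30-50% density increase
iso_power_perf = 1.125     # midpoint of 10-15% faster at the same power
power_saving   = 0.275     # midpoint of the 25-30% power savings

# Doubling performance with ~1.125x per-unit speedup needs ~1.78x the units.
unit_scale = 2.0 / iso_power_perf

die_new = die_4090_mm2 * unit_scale / density_gain
tdp_new = tdp_4090_w * unit_scale * (1 - power_saving)

print(f"die: ~{die_new:.0f} mm^2")  # ~770 mm^2, in line with the ~810 above
print(f"TDP: ~{tdp_new:.0f} W")     # ~580 W before any clock bump; pushing
                                    # clocks is how you approach ~700 W
```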





Well, Nvidia still has to release half of its 4000 cards, so excuse me if I don't really care about how great Blackwell is going to be, even more so when we look at Ada and how Nvidia has handled it. Yeah, the top-of-the-line card (5090 or 5090 Ti?) may bring a 100% increase in performance, but look at this gen: the 4090 brought a 70% increase, yes, but that became 50% for the 4080 (for 50% more money), roughly 40% for the 4070 Ti and, given the latest rumors/leaks, the 4070 will be around 30% faster than its predecessor. Who knows what kind of improvement the 4060/Ti and 4050 will bring? 20-25%?

Meh.

Only one thing is worth mentioning: if haxxiy's numbers turn out to be true, it's all going to be even more expensive.
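To put numbers on that value complaint, here's a rough sketch dividing each uplift by the matching MSRP jump, using the figures above plus US launch MSRPs (the 4070 row is the rumored price and uplift, not a benchmark):

```python
# Gen-on-gen value: performance uplift divided by MSRP increase.
# Uplifts are the figures quoted above; prices are US launch MSRPs.
cards = {
    #          (perf vs. predecessor, new MSRP, predecessor MSRP)
    "4090":    (1.70, 1599, 1499),
    "4080":    (1.50, 1199,  699),   # $699 -> $1199 is actually +72%
    "4070 Ti": (1.40,  799,  599),
    "4070":    (1.30,  599,  499),   # rumored uplift and price
}
for name, (perf, new, old) in cards.items():
    value = perf / (new / old)
    print(f"{name}: {value:.2f}x perf per dollar vs. last gen")
# 4090 ~1.59x, 4080 ~0.87x (a value regression), 4070 Ti ~1.05x, 4070 ~1.08x
```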



Please excuse my bad English.

Currently gaming on a PC with an i5-4670K @ stock (for now), 16GB RAM at 1600MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

That's all straight from TSMC, even the wafer cost of +25% over 5 nm. So regardless of how good Blackwell is, you'll be paying more for a similarly sized chip.

Personally, I'd expect the flagship to be something like a 50% improvement on the 4090 for $2,000 and the same-ish TDP (although with higher actual power consumption).

Edit - if Nvidia is going with Samsung, then don't mind these estimates much. Samsung's 3GAE is less dense than TSMC's N3, and their nodes tend to be a little worse overall, but beyond that it's hard to compare the two in terms of cost and performance.
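As a sketch of how that +25% per wafer compounds with a bigger die, using the standard dies-per-wafer approximation (die sizes from the earlier napkin math; yield ignored):

```python
import math

# Standard dies-per-wafer approximation on a 300 mm wafer:
# usable area / die area, minus an edge-loss term.
def dies_per_wafer(die_area_mm2: float, wafer_mm: float = 300.0) -> int:
    r = wafer_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_mm / math.sqrt(2 * die_area_mm2))

ad102  = dies_per_wafer(608)   # ~89 gross dies
big_bw = dies_per_wafer(810)   # ~63 gross dies

# +25% wafer cost and fewer dies per wafer compound per die:
print(1.25 * ad102 / big_bw)   # ~1.77x before yield; a bigger die also
                               # yields worse, matching the "at least 80%
                               # more expensive" figure earlier
```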

Last edited by haxxiy - on 08 April 2023


I believe Nvidia is going Samsung this time around, so that's one thing to keep in mind while attempting to judge it. I do think that while flagship prices will be higher, the rest of the stack will either be priced lower or similarly to Lovelace but with more performance. After the failure of the 4080 in terms of sales, I highly doubt Nvidia is going to attempt the 80 series at $1,200 again. Mix that with lower demand for consumer GPUs due to high prices and the economic recession, plus Samsung supposedly being cheaper than TSMC, and I think there's a chance that some of the GPUs will get restructured accordingly.




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

haxxiy said:

That's all straight from TSMC, even the wafer cost of +25% over 5 nm. So regardless of how good Blackwell is, you'll be paying more for a similarly sized chip.

Personally, I'd expect the flagship to be something like a 50% improvement on the 4090 for $2,000 and the same-ish TDP (although with higher actual power consumption).

Edit - if Nvidia is going with Samsung, then don't mind these estimates much. Samsung's 3GAE is less dense than TSMC's N3, and their nodes tend to be a little worse overall, but beyond that it's hard to compare the two in terms of cost and performance.

My comment was in regard to your density and performance increase guesses, not the wafer costs. After all, the article claims that Nvidia will restructure the SM, which should influence both of those parameters.

Captain_Yuri said:

I believe Nvidia is going Samsung this time around, so that's one thing to keep in mind while attempting to judge it. I do think that while flagship prices will be higher, the rest of the stack will either be priced lower or similarly to Lovelace but with more performance. After the failure of the 4080 in terms of sales, I highly doubt Nvidia is going to attempt the 80 series at $1,200 again. Mix that with lower demand for consumer GPUs due to high prices and the economic recession, plus Samsung supposedly being cheaper than TSMC, and I think there's a chance that some of the GPUs will get restructured accordingly.

Given what's stated in the article, Mr. Leather Jacket already went to TSMC to secure 3nm wafers, so unless Nvidia has decided to split its chips between TSMC and Samsung (for example, going with the former for the bigger and more expensive parts and leaving the rest to Samsung), we have conflicting rumors. Which one to believe?

Also, the 4080 was a failure, true, but that doesn't mean much for the next generation. Nvidia could simply price the next 80-class card at $1,000 instead of $1,200 while leaving the rest untouched. It's unlikely, of course, because if the chips are more expensive to manufacture, Nvidia will charge more for them, plain and simple.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670K @ stock (for now), 16GB RAM at 1600MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

JEMC said:
haxxiy said:

That's all straight from TSMC, even the wafer cost of +25% over 5 nm. So regardless of how good Blackwell is, you'll be paying more for a similarly sized chip.

Personally, I'd expect the flagship to be something like a 50% improvement on the 4090 for $2,000 and the same-ish TDP (although with higher actual power consumption).

Edit - if Nvidia is going with Samsung, then don't mind these estimates much. Samsung's 3GAE is less dense than TSMC's N3, and their nodes tend to be a little worse overall, but beyond that it's hard to compare the two in terms of cost and performance.

My comment was in regard to your density and performance increase guesses, not the wafer costs. After all, the article claims that Nvidia will restructure the SM, which should influence both of those parameters.

Captain_Yuri said:

I believe Nvidia is going Samsung this time around, so that's one thing to keep in mind while attempting to judge it. I do think that while flagship prices will be higher, the rest of the stack will either be priced lower or similarly to Lovelace but with more performance. After the failure of the 4080 in terms of sales, I highly doubt Nvidia is going to attempt the 80 series at $1,200 again. Mix that with lower demand for consumer GPUs due to high prices and the economic recession, plus Samsung supposedly being cheaper than TSMC, and I think there's a chance that some of the GPUs will get restructured accordingly.

Given what's stated in the article, Mr. Leather Jacket already went to TSMC to secure 3nm wafers, so unless Nvidia has decided to split its chips between TSMC and Samsung (for example, going with the former for the bigger and more expensive parts and leaving the rest to Samsung), we have conflicting rumors. Which one to believe?

Also, the 4080 was a failure, true, but that doesn't mean much for the next generation. Nvidia could simply price the next 80-class card at $1,000 instead of $1,200 while leaving the rest untouched. It's unlikely, of course, because if the chips are more expensive to manufacture, Nvidia will charge more for them, plain and simple.

Yeah, I definitely don't see Nvidia going back to £600-700 for their 80-series cards like we had with the 1080/1080 Ti. If anything, like you said, it'll be around £1,000, which to me is still too expensive and not worth buying, even if I had to wait another three years.

I feel like Nvidia is just going to ride out a few more gens of this price-jacking shit instead of next gen being an immediate, consumer-friendly course correction. This gen and last gen showed us their hand, and they're in for the pot of greed; that's all I needed to know.



Step right up come on in, feel the buzz in your veins, I'm like a chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"