
Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

I don't know where you found those numbers, haxxiy, but thank you. They're very revealing.

I wonder how much bigger the Ada R&D costs are compared to RDNA 3's; that also plays a part. And the cost of developing DLSS 3, of course; I'm sure Nvidia has included that as well.

In any case, I agree that we'd better hope AMD prices their cards fairly. For our own sake.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670K@stock (for now), 16GB RAM at 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

haxxiy said:

By the way, does anyone have experience with curved VA monitors? Are they actually good?

I have the 32" Samsung Odyssey G7, the 1440p one, and it's great. For gaming and movies it's very immersive. Colors are good out of the box, though not up to IPS levels (great with some calibration), and motion, response times and contrast are about as good as you can get for an LCD without full local dimming. Great for gaming and media consumption, not that ideal for productivity, but workable.

It also depends on how aggressive the curve is. Samsung's are quite extreme at 1000R, though you can get gentler curves on both widescreens and ultrawides. You may or may not like the effect; on an ultrawide it's a necessity imo. Oh, and a plus side of the curve: you can sit closer to the monitor, as the screen wraps around your peripheral vision.

If unsure, try visiting a local computer store, or buy from a place where you can return it easily. Whether you're set on 16:9 or 21:9, VA or otherwise, I'd look at Samsung monitors personally.
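
In case anyone is wondering what the "1000R" figure means: it's the curve radius in millimetres, so a 1000R panel follows a circle 1 m in radius, and sitting about 1 m away puts every point of the screen at roughly the same distance from your eyes. Here's a rough sketch of the flat-vs-curved geometry (the panel size and viewing distance are assumptions for illustration, not hinch's exact setup):

```python
import math

# A flat 32" 16:9 panel viewed from 1 m: how much farther away are the
# edges than the center? (On a 1000R curved panel viewed from 1 m, the
# center and edges are all ~1000 mm away by construction.)
diag_in = 32
width_mm = diag_in * 25.4 * 16 / math.hypot(16, 9)  # ~708 mm wide

view_dist_mm = 1000  # the 1000R "sweet spot" distance, assumed

center = view_dist_mm
edge = math.hypot(view_dist_mm, width_mm / 2)
print(f"Flat panel: center at {center} mm, edges at {edge:.0f} mm "
      f"({edge - center:.0f} mm farther)")
# Flat panel: center at 1000 mm, edges at 1061 mm (61 mm farther)
```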

Last edited by hinch - on 24 September 2022




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

hinch said:

haxxiy said:

By the way, does anyone have experience with curved VA monitors? Are they actually good?

I have the 32" Samsung Odyssey G7, the 1440p one, and it's great. For gaming and movies it's very immersive. Colors are good out of the box, though not up to IPS levels (great with some calibration), and motion, response times and contrast are about as good as you can get for an LCD without full local dimming. Great for gaming and media consumption, not that ideal for productivity, but workable.

It also depends on how aggressive the curve is. Samsung's are quite extreme at 1000R, though you can get gentler curves on both widescreens and ultrawides. You may or may not like the effect; on an ultrawide it's a necessity imo. Oh, and a plus side of the curve: you can sit closer to the monitor, as the screen wraps around your peripheral vision.

If unsure, try visiting a local computer store, or buy from a place where you can return it easily. Whether you're set on 16:9 or 21:9, VA or otherwise, I'd look at Samsung monitors personally.

I see, thanks!

It sucks that around here stuff is not only expensive but also very hard to find :/ Once an import batch sells out, the product is gone for good from that retailer. The only decent monitors I can find anywhere online are either these curved QHD VAs or 43" monstrosities. But maybe I'll have some luck with the local stores.




haxxiy said:

I've seen that the estimated cost of a 300 mm 5 nm wafer from TSMC is about $14,000-$15,000 vs. ~$10,000 for a 7 nm process wafer. I don't know how much more expensive Nvidia's 'special' node is, but let's assume it's 10% more and they all have a similar rate of defects.

So there's a huge price increase there, more than 50% per mm². Using the die yield calculator:

$10,000 / 71 viable GA102 per wafer = $140.84 per chip

$15,950 / 75 viable AD102 per wafer = $212.66 per chip

That's for the big ones. The smallest (AD104) yields 170 chips per wafer in this estimate and would cost $93.82 per chip, around 66% of the GA102 Ampere chip's cost. If the profit margin were the same, the price of the RTX 4080 12 GB should've been $599-$699. On the other hand, the profit margin of the 4090 is lower than the 3090's, as everyone noticed eyeballing it.

RTG might have had a good idea before Nvidia for once with the chiplet design. Using standard 5 nm and assuming 250 mm² per chiplet, the cost would be $138.75 for the Navi 31 die vs. $113.63 for Navi 21 (a 22% increase per mm²).

But even with 100 mm² chiplets the cost should increase another 30% by 2026 over AD102, so yeah.
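
If anyone wants to fiddle with these assumptions, the per-chip arithmetic above fits in a few lines of Python. The viable-die counts are the die-yield-calculator figures from the post, and the 10% premium for Nvidia's custom node is the post's own assumption; the script rounds where the post truncates, hence the one-cent differences:

```python
# Per-chip cost = wafer cost / viable dies per wafer.
wafers = {
    # name: (wafer cost in USD, viable dies per wafer)
    "GA102": (10_000, 71),
    "AD102": (15_950, 75),   # $14,500 midpoint + assumed 10% custom-node premium
    "AD104": (15_950, 170),
}

for name, (cost, dies) in wafers.items():
    print(f"{name}: ${cost / dies:,.2f} per chip")

# GA102: $140.85 per chip
# AD102: $212.67 per chip
# AD104: $93.82 per chip
```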

I do wonder if Nvidia is actually paying more of a premium than the estimated wafer cost, from back when all companies were outbidding each other to order wafers years in advance amid the chip shortages. That would explain why the cost has been passed on to consumers, why Nvidia wanted to cancel some of their wafer orders, and why TSMC wouldn't allow it.



kirby007 said:

honestly, with gas prices at almost 3 euros vs. 80 eurocents last year, it's probably the cheapest way to heat the room

I doubt that gas prices have tripled since last year in your country.
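
The gas-vs-GPU question really comes down to price per kWh of delivered heat. A back-of-the-envelope sketch, where the gas energy content, boiler efficiency and electricity price are all my assumptions rather than figures from the thread:

```python
# Cost per kWh of heat: gas boiler vs. GPU (resistive) heating.
gas_eur_per_m3 = 3.00      # kirby007's figure
gas_kwh_per_m3 = 10.0      # typical energy content of natural gas, assumed
boiler_efficiency = 0.90   # assumed

electricity_eur_per_kwh = 0.40  # a guess; varies hugely by country and contract

gas_heat = gas_eur_per_m3 / (gas_kwh_per_m3 * boiler_efficiency)  # ~0.33 EUR/kWh
gpu_heat = electricity_eur_per_kwh  # a GPU turns essentially all its draw into heat

print(f"Gas: ~{gas_heat:.2f} EUR/kWh of heat, GPU: ~{gpu_heat:.2f} EUR/kWh")
# At these numbers the GPU only wins if electricity costs less than ~0.33 EUR/kWh.
```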



AMD Ryzen 7950X/7900X/7700X/7600X stock Cinebench scores leaked

https://videocardz.com/newz/amd-ryzen-7950x-7900x-7700x-7600x-stock-cinebench-scores-leaked

This certainly marks the end of the 5950X's dominance. That CPU held its own in multi-threaded runs against Intel's 10th, 11th and 12th gen while making a good portion of Intel's HEDT lineup irrelevant. While the jump to the 7950X certainly looks big enough, I am personally going to continue to wait.

NVIDIA RTX 4090 Boosts Up to 2.8 GHz at Stock Playing Cyberpunk 2077, Temperatures around 55 °C

https://www.techpowerup.com/299175/nvidia-rtx-4090-boosts-up-to-2-8-ghz-at-stock-playing-cyberpunk-2077-temperatures-around-55-c

"At native resolution, the RTX 4090 scores 59 FPS (49 FPS at 1% lows), with a frame-time of 72 to 75 ms. With 100% GPU utilization, the card barely breaks a sweat, with GPU temperatures reported in the region of 50 to 55 °C. With DLSS 3 enabled, the game nearly doubles in frame-rate, to 119 FPS (1% lows), and an average latency of 53 ms. This is a net 2X gain in frame-rate with latency reduced by a third."

This is what I was referring to yesterday when I said that DLSS 3 latency should be lower than running the game natively. It's certainly still not a 4K-native-with-max-RT card in this game, though. I suspect the 6000 or 7000 series might be the ones to hit 4K 60 FPS natively. 55°C does leave a lot of room for overclocking past 3 GHz, as long as Nvidia didn't cuck the power sliders.
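
A quick sanity check of the quoted TechPowerUp figures, taking the upper end of the 72-75 ms range:

```python
native_fps, dlss3_fps = 59, 119
native_latency_ms, dlss3_latency_ms = 75, 53

print(f"Frame-rate gain: {dlss3_fps / native_fps:.2f}x")               # ~2.02x
print(f"Latency cut: {1 - dlss3_latency_ms / native_latency_ms:.0%}")  # ~29%, about a third

# The two gains are decoupled: frame generation doubles the displayed
# frames, while the latency win comes from DLSS upscaling the internal
# render (plus Reflex), not from the generated frames themselves.
```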

AYANEO 2 and GEEK Ryzen 7 6800U gaming consoles to launch in December, pricing from $949 to $1549

https://videocardz.com/newz/ayaneo-2-and-geek-ryzen-7-6800u-gaming-consoles-to-launch-in-december-pricing-from-949-to-1549#disqus_thread

Very expensive considering what the GPD folks offer at a similar price.

Last edited by Jizz_Beard_thePirate - on 25 September 2022


PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

AMD Ryzen 7 7700X & Ryzen 5 7600X Are A Hit In Pre-Launch Reviews, Full Lineup Including Ryzen 9 7950X & 7900X Get Cinebench Benchmark

https://wccftech.com/amd-ryzen-7-7700x-ryzen-5-7600x-review-sisoftware-ryzen-9-7950x-7900x-cinebench-benchmarks/

These reviews are leaks, so take them with huge heaps of salt, as the sources are largely unknown as far as reputation goes:

(From wccftech's comment section)

Last edited by Jizz_Beard_thePirate - on 25 September 2022


PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

The gaming results seem a bit odd. I can understand losing to the 5800X3D or 12900K in some titles, but against the regular 5800X? Despite the higher IPC and clocks? Odd.

Oh, well. We'll see how it goes when we get more reviews to compare.

By the way, does anyone have an idea of when the reviews will roll out?



Please excuse my bad English.

Currently gaming on a PC with an i5-4670K@stock (for now), 16GB RAM at 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

JEMC said:

The gaming results seem a bit odd. I can understand losing to the 5800X3D or 12900K in some titles, but against the regular 5800X? Despite the higher IPC and clocks? Odd.

Oh, well. We'll see how it goes when we get more reviews to compare.

By the way, does anyone have an idea of when the reviews will roll out?

If it's legit, it could be due to the latency that DDR5 brings.

Supposedly the reviews go live at 15:00 CEST tomorrow.




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850