

Ok, time to post some hardware news. And because it's Friday, I'll end with a joke:

Suck it up, Valve's going to make us wait years for Steam Deck 2
https://www.pcgamer.com/suck-it-up-valves-going-to-make-us-wait-years-for-steam-deck-2/
The Steam Deck has been a significant hit. So, a sequel seems something of a certainty. But any Steam Deck 2 with a major performance upgrade remains at least a "few years" away.

So says Steam Deck designer Lawrence Yang. Speaking to ye olde Rock Paper Shotgun, what Yang said specifically was, "a true next-gen Deck with a significant bump in horsepower wouldn’t be for a few years."

Cadence Delivers Technical Details on GDDR7: 36 Gbps with PAM3 Encoding
https://www.anandtech.com/show/18759/cadence-derlivers-tech-details-on-gddr7-36gbps-pam3-encoding
When Samsung teased the ongoing development of GDDR7 memory last October, the company did not disclose any other technical details of the incoming specification. But Cadence recently introduced the industry's first verification solution for GDDR7 memory, and in the process has revealed a fair bit of additional detail about the technology. As it turns out, GDDR7 memory will use PAM3 as well as NRZ signaling and will support a number of other features, with a goal of hitting data rates as high as 36 Gbps per pin.
>> That's what Nvidia's 5000 and AMD's 8000 series GPUs may use.
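As a rough back-of-the-envelope sketch (my own numbers, not from the article), here's what 36 Gbps per pin would translate to in total memory bandwidth at a few hypothetical bus widths:

```python
# Back-of-envelope GDDR7 bandwidth math (my own numbers, not from the article).
# Peak bandwidth (GB/s) = per-pin data rate (Gbps) x bus width (bits) / 8 bits per byte.

def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

# Hypothetical bus widths, purely for illustration:
for bus_width in (128, 192, 256, 384):
    print(f"{bus_width}-bit bus @ 36 Gbps/pin: {memory_bandwidth_gbs(36, bus_width):,.0f} GB/s")
# 128-bit -> 576 GB/s, 192-bit -> 864 GB/s, 256-bit -> 1,152 GB/s, 384-bit -> 1,728 GB/s
```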

First AMD AM5 motherboard is now available below $125
https://videocardz.com/newz/first-amd-am5-motherboard-is-now-available-below-125

Cyberpunk 2077 To Showcase Truly Next-Gen RTX Path Tracing as part of RT: Overdrive Mode in GDC Presentation
https://wccftech.com/cyberpunk-2077-implement-truly-next-gen-rtx-path-tracing-utilizing-nvidia-rt-overdrive-tech/
In a session listed by NVIDIA, the company will be presenting the first real-time RTX Path Tracing demo within Cyberpunk 2077. The session titled "'Cyberpunk 2077' RT: Overdrive – Bringing Path Tracing into the Night City (Presented by NVIDIA)" will include NVIDIA's Senior Developer & Technology Engineer, Pawel Kozlowski, along with CD Projekt Red's Global Art Director, Jakub Knapik. The session will take place on the 22nd of March (...)

Backblaze Reveals 2022 SSD Life Expectancy Statistics: Temperatures Are Potential Factor
https://wccftech.com/backblaze-reveals-2022-ssd-life-expectancy-statistics-temperatures-are-potential-factor/
Backblaze has revealed its newest storage drive stats report for 2022, with a singular focus on the SSDs that the company utilizes as boot drives within its cloud storage systems.

And now, the joke:

AMD Says It Is Possible To Develop An NVIDIA RTX 4090 Competitor With RDNA 3 GPUs But They Decided Not To Due To Increased Cost & Power
https://wccftech.com/amd-says-it-is-possible-to-develop-an-nvidia-rtx-4090-competitor-with-rdna-3-gpus-but-they-decided-not-to-due-to-increased-cost-power/
During an interview with ITMedia, AMD's EVP, Rick Bergman, and AMD SVP, David Wang, sat down to discuss their goals with the RDNA 3 and CDNA 3 architectures. The most interesting question is asked right at the start: why didn't AMD release an RDNA 3 GPU under its Radeon RX 7000 lineup that competes in the ultra high-end enthusiast segment against the likes of NVIDIA's RTX 4090?

Last edited by JEMC - on 10 March 2023

Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16GB RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

WoodenPints said:

The 7900XT cards at £799 with TLoU as a freebie are starting to look like decent value, and they're currently cheaper than the 4070 Ti, so I've been a little tempted myself. I think Nvidia completely dropped the ball on VRAM, and it's looking like my next GPU will be AMD, my first Radeon card since the HD 6870.

That's the one I'm looking at too, on OcUK. With the game, it's closer to the £750 mark. The TLOU deal is kinda tipping me over the edge, as I was planning to get it on PC anyway. It is a reference card, but tbh I don't mind, as I plan on undervolting plus some OC on whatever I get.

I was humming and hawing over the 4070 Ti, but nah, 12GB isn't going to last long tbh. I have a feeling that buying one will be like getting an 8GB card like the 3070 in 2023: just enough for games right now, but when things kick into gear it'll be left behind, even by cards it currently outperforms. Say, I could see the upcoming 7800XT, which is rumored to have 16GB VRAM, aging better than the latter. Much like 6700XT > 3070/Ti in a lot of current titles.

Last edited by hinch - on 10 March 2023

hinch said:
Mummelmann said:

Absolutely. I'll be getting a 4090 (the Asus TUF version seems like the most balanced choice), most likely, which has 24GB VRAM. Should tide me over for a few years. I might upgrade my GPU 4-5 years down the line, and then keep the rest of the setup as is, more or less. I chose my parts with the future in mind the last time, and it will have served me for 8 years come July. Since I've already been playing at 1440p since 2015, 4K is the only logical option for me going forward. It hasn't really been possible until this gen though; the 30xx cards and AMD's last-gen offerings were simply not up to the task (wanting decent frame rates).

Yeah, we already see games going high on VRAM, and we're just scratching the surface of graphically demanding games this generation. With more engines using RT, we're going to need as much as we can get.

And true, Nvidia skimped on VRAM across the stack for Ampere, and even for Ada in the lower tiers. Outside the flagship end and the refreshes (plus the 3060 12GB lol), it didn't take long until people eventually had to start lowering settings to play new games. Definitely worth getting as much VRAM as you can for longevity. A 4090 will definitely last you a long, long time. With the latest DLSS tech and frame generation, even better.

I'm half tempted to get the 7900XT, with the recent price drop and free game. 20GB VRAM will last a while, one should hope. Since AMD and Nvidia have been playing games with the market and drawing out the mid-range launches, I really need an upgrade that's not old-gen hardware lol.

The 7900XT is perhaps the best value for money on the market right now, especially in sheer raster performance. The 4070 Ti only has a leg up when DLSS is used, which is likely used alongside RT (which it can't handle all that well at any rate). If I were sticking with 1440p for another cycle, I think I'd get either the 7900XT or the 7900XTX. I only had AMD cards before my current one (good ol' 980 Ti), and I have no brand loyalty or shame. :P I go where the performance and value are.

I first considered it ludicrous to even think of getting a 4090, but the rest of the 40xx lineup is simply worse value, all things considered. This goes double for 4K gaming.

As for VRAM: I remember maxing out mine on the 980 Ti when it was brand new. The game was GTA V. The developer approach, and indeed the GPU manufacturer approach, to visual fidelity right now reminds me of the automotive industry in the 90's and early 2000's. They tacked on huge turbos and bored out the cylinders rather than apply finesse and tuning to existing solutions. Without any sort of proper regulation, besides requiring ABS brakes and catalytic converters, it increased performance and fun factor across the board, but at the cost of constant breakdowns, high consumption, and expensive part replacements. Not to mention the toll on the environment.

I think we'll see a strict regime of regulations hit the electronics industry in the coming 5-7 years. They've already started with TVs here in the EU, enforcing a max wattage on new sets. Gaming rigs that require 1-1.2kW to power, crypto farms, and the constant use and charging of smart devices whose battery life relative to usage hasn't really improved in years will ensure this (capacity increases, but so do power requirements and the number of demanding features).

Last edited by Mummelmann - on 10 March 2023

hinch said:
WoodenPints said:

The 7900XT cards at £799 with TLoU as a freebie are starting to look like decent value, and they're currently cheaper than the 4070 Ti, so I've been a little tempted myself. I think Nvidia completely dropped the ball on VRAM, and it's looking like my next GPU will be AMD, my first Radeon card since the HD 6870.

That's the one I'm looking at too, on OcUK. With the game, it's closer to the £750 mark. The TLOU deal is kinda tipping me over the edge, as I was planning to get it on PC anyway. It is a reference card, but tbh I don't mind, as I plan on undervolting plus some OC on whatever I get.

I was humming and hawing over the 4070 Ti, but nah, 12GB isn't going to last long tbh. I have a feeling that buying one will be like getting an 8GB card like the 3070 in 2023: just enough for games right now, but when things kick into gear it'll be left behind, even by cards it currently outperforms. Say, I could see the upcoming 7800XT, which is rumored to have 16GB VRAM, aging better than the latter. Much like 6700XT > 3070/Ti in a lot of current titles.

I wouldn't say a 6700XT is aging better than a 3070 in a lot of current titles unless you have proof. Twitter drama has certainly been hammering on the VRAM limitations, but you need to be sure they aren't just trying to mislead people, so we need to make sure there are facts to back it up.

If we look at various recent titles, the 6700XT is in fact not aging better than the 3070/Ti:

https://www.techpowerup.com/review/hogwarts-legacy-benchmark-test-performance-analysis/6.html

https://www.techpowerup.com/review/forspoken-benchmark-test-performance-analysis/5.html

https://www.techpowerup.com/review/the-callisto-protocol-benchmark-test-performance-analysis/5.html

You can see that while VRAM usage is high, the 3070 is in fact not performing worse than the 6700XT. A lot of the drama I have seen surrounding the VRAM issue is that, when you enable ray tracing, some of these VRAM-limited GPUs don't perform as well as they should, which is true. But that doesn't mean RDNA 2 performs better in RT-enabled titles either, as RT performance on RDNA 2 is generally bad.

I do think a 7900XT for the same price as a 4070 Ti is a better buy, though I'd wait and see if Nvidia responds with price drops as well.




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

One thing people should remember is that VRAM isn't everything when it comes to buying a GPU. It's certainly one of the most important things, no doubt, but it's not the only factor. Nvidia has always been stingy when it comes to VRAM. The GTX 980, for example, had 4GB of VRAM while consoles had 8GB of unified memory, as did the R9 390.

But Nvidia always countered with more features, such as G-Sync, which was better than FreeSync at the time, along with GameWorks and better tessellation.

But one key area where Nvidia has always been better is driver support. While the Radeon 300 series was aging a bit better in games that required more VRAM, Radeon stopped supporting it with driver updates in June 2021. Meanwhile, a GTX 970 got driver updates faster than a 6900XT a few weeks ago and is still being supported. Not to mention the emulation compatibility issues with Radeon 7000 series GPUs that RTX 4000 series GPUs don't have.

It's why I said before that the 4090 is the only GPU worth buying this generation. Spending money on Radeon, with its lack of features and bad track record, is a non-option imo, even if the VRAM looks great. But Nvidia has had ludicrous pricing on its 40 series GPUs, so outside of the 4090, they aren't worth getting rn either.

Imo the best course of action for someone worried about VRAM is to either get a 4090, wait till the 4080 gets discounted, see if you can snag a 3090 for cheap, or wait for Blackwell. I wouldn't put faith back into Radeon after their recent incidents just yet.




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850


In Tom's Hardware 4K benchmarks, the 6700 XT had 91.9% of the 3070's performance in 2021 and now it has 92.1% while also getting ahead in the 99th percentile. Since the 3080 aged better than the 6800, this might be a VRAM handicap on the mid-range parts despite RTG's low bandwidth + cache solution.

By my estimates, this means the 6700 should in fact surpass the 3070 circa 2112, so hold on until then for confirmation :D
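
If anyone wants to check that tongue-in-cheek extrapolation, here's the back-of-envelope math, assuming a constant linear trend between those two data points (treating "now" as 2023 is my assumption):

```python
# Tongue-in-cheek linear extrapolation of the 6700 XT catching the 3070
# (assumes the two Tom's Hardware data points and that "now" means 2023).
ratio_2021, ratio_2023 = 91.9, 92.1  # 6700 XT as % of 3070 performance at 4K

slope = (ratio_2023 - ratio_2021) / (2023 - 2021)  # percentage points per year
crossover_year = 2023 + (100 - ratio_2023) / slope  # when the ratio hits 100%

print(round(crossover_year))  # ~2102, so "circa 2112" is the right ballpark
```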


haxxiy said:

In Tom's Hardware 4K benchmarks, the 6700 XT had 91.9% of the 3070's performance in 2021 and now it has 92.1% while also getting ahead in the 99th percentile. Since the 3080 aged better than the 6800, this might be a VRAM handicap on the mid-range parts despite RTG's low bandwidth + cache solution.

By my estimates, this means the 6700 should in fact surpass the 3070 circa 2112, so hold on until then for confirmation :D



Dat Fine Wine lul




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:
hinch said:

That's the one I'm looking at too, on OcUK. With the game, it's closer to the £750 mark. The TLOU deal is kinda tipping me over the edge, as I was planning to get it on PC anyway. It is a reference card, but tbh I don't mind, as I plan on undervolting plus some OC on whatever I get.

I was humming and hawing over the 4070 Ti, but nah, 12GB isn't going to last long tbh. I have a feeling that buying one will be like getting an 8GB card like the 3070 in 2023: just enough for games right now, but when things kick into gear it'll be left behind, even by cards it currently outperforms. Say, I could see the upcoming 7800XT, which is rumored to have 16GB VRAM, aging better than the latter. Much like 6700XT > 3070/Ti in a lot of current titles.

I wouldn't say a 6700XT is aging better than a 3070 in a lot of current titles unless you have proof. Twitter drama has certainly been hammering on the VRAM limitations, but you need to be sure they aren't just trying to mislead people, so we need to make sure there are facts to back it up.

If we look at various recent titles, the 6700XT is in fact not aging better than the 3070/Ti:

https://www.techpowerup.com/review/hogwarts-legacy-benchmark-test-performance-analysis/6.html

https://www.techpowerup.com/review/forspoken-benchmark-test-performance-analysis/5.html

https://www.techpowerup.com/review/the-callisto-protocol-benchmark-test-performance-analysis/5.html

You can see that while VRAM usage is high, the 3070 is in fact not performing worse than the 6700XT. A lot of the drama I have seen surrounding the VRAM issue is that, when you enable ray tracing, some of these VRAM-limited GPUs don't perform as well as they should, which is true. But that doesn't mean RDNA 2 performs better in RT-enabled titles either, as RT performance on RDNA 2 is generally bad.

I do think a 7900XT for the same price as a 4070 Ti is a better buy, though I'd wait and see if Nvidia responds with price drops as well.

I think "age better" isn't the right phrase here. What I'm saying is that having more VRAM is better than running out and taking huge performance penalties. With newer titles using more, higher quality textures, 8GB isn't going to cut it at 1440p. It was barely scraping minimum requirements in 2020, and consoles have way, way more RAM. We had games going over that on cross-gen stuff like RE7 with high quality textures. And it looks like the bar is going to keep rising as we progress with newer engines and games.

What I'm saying is that you don't want to be close to the edge with VRAM, because once you're at the limit and go over, you're going to have to reduce settings to make the game playable. Granted, it's not a magic bullet for performance, but having more is better than running out and getting single-digit FPS or massive frame drops due to a lack of VRAM.

Another example is the 3080 with its 10GB of VRAM, which was and is low for its performance tier, and is already problematic in a few select titles at higher quality settings and resolutions.

What we're saying is that Nvidia has skimped on VRAM for years on the lower end of the stack to get people to go higher end. It's market segmentation, but it also seems a little like planned obsolescence for mid-range buyers: offering as close to the bare minimum for each tier as possible every generation.
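
A rough sketch of why higher quality textures chew through VRAM so fast (illustrative numbers of my own, not from any benchmark):

```python
# Rough VRAM cost of uncompressed textures (illustrative numbers only).
# An uncompressed RGBA8 texture is width x height x 4 bytes; a full mip
# chain adds roughly a third on top (geometric series converging to ~1.33x).

def texture_mib(width: int, height: int, bytes_per_texel: int = 4,
                with_mips: bool = True) -> float:
    """Approximate size of a single texture in MiB."""
    size = width * height * bytes_per_texel
    if with_mips:
        size *= 4 / 3
    return size / (1024 ** 2)

one_4k = texture_mib(4096, 4096)
print(f"One 4K RGBA8 texture with mips: ~{one_4k:.0f} MiB")  # ~85 MiB
print(f"100 such textures: ~{100 * one_4k / 1024:.1f} GiB")  # ~8.3 GiB
# Real games use block compression (BCn formats) to cut this several times
# over, but the trend holds: bigger texture packs fill an 8GB card fast.
```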



Honestly, the fact that you can get a 6700XT for less than a 3060 is a bigger deal than whether the 6700XT is aging better than the 3070. The cards aren't in the same price class right now. You can get an RX 6800XT for the price of a 3070, and you don't have to worry about the 8GB frame buffer limiting your performance.



Arc prices have been reduced again. You can get an A750 shipped from Newegg for $229 and an A770 for $269. How do you guys see them as entry/mid-level options against the Nvidia and AMD offerings?