
Forums - PC Discussion - Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

Prime Gaming is giving away Dishonored 2 along with other games on the 27th

https://primegaming.blog/enjoy-even-more-great-games-and-content-this-holiday-from-prime-gaming-fddbdb0ec78



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:
               7900 XTX             RTX 4080
Memory         24 GB, 384-bit bus   16 GB, 256-bit bus
ROPs           192                  112
Bandwidth      960.0 GB/s           716.8 GB/s
Compute units  96 CU                76 SM
RT cores       96                   76
FP32           61.42 TFLOPs         48.74 TFLOPs
TDP            355 W                320 W
Design         Chiplet              Monolithic


Yet somehow, these two are within single-digit percent of each other in raster and around 30-40% apart in ray tracing in favor of the 4080. This is like the inverse of Ampere vs RDNA 2, except that Ampere had the ray tracing and DLSS advantage while RDNA 3 has no advantage other than price. But when a person is spending $1,000 and the competition costs $200 more while giving you superior ray tracing + DLSS and better power efficiency... AMD just might have sold the 4080...

Idk how a company can create a chiplet architecture that is less efficient than a monolithic one, but here we are... Even the transient spikes feel like Ampere's, but on TSMC.

Overall, both the 7900 XTX and the 4080 are terrible products for the price. Hopefully both will get a price cut sooner rather than later.

Chiplets by their very nature will always be less efficient than a monolithic die.

- You are moving caches and memory controllers further away from compute... and adding an interconnect/fabric, which consumes energy and increases latency.
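As an aside, the headline TFLOPs figures in the spec comparison above fall straight out of shader count × FP32 ops per clock × boost clock. A minimal sketch, assuming the commonly cited shader counts (6144 for the 7900 XTX, 9728 CUDA cores for the 4080) and boost clocks:

```python
def fp32_tflops(shaders: int, boost_ghz: float, ops_per_clock: int = 2) -> float:
    """Theoretical FP32 throughput in TFLOPs: shaders * FP32 ops per clock * clock (GHz)."""
    return shaders * ops_per_clock * boost_ghz / 1000.0

# 7900 XTX: 6144 shaders; RDNA 3's dual-issue doubles FP32 ops per clock (2 for FMA x 2 issue)
print(round(fp32_tflops(6144, 2.499, ops_per_clock=4), 2))  # 61.42
# RTX 4080: 9728 CUDA cores at ~2.505 GHz boost (FMA = 2 ops per clock)
print(round(fp32_tflops(9728, 2.505), 2))  # 48.74
```

Note the dual-issue factor: on paper it doubles RDNA 3's throughput, but as the benchmarks show, very little of that shows up in games.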

haxxiy said:

That TUF is more like what I had expected. The OC would even manage to just about match the 4080 in RT (on average; still behind in the more demanding titles).

This lends some credence to the idea that AMD found frequency-curve issues late in development, and that is why the clocks were much lower than expected. Even then, I wonder if they'd have been better off just adding another hundred watts to the cards. Could be bad PR trauma from the Vishera/Hawaii days, but now everyone's doing it, so yeah.

Too bad these won't retail for MSRP, at least for now.

I think the biggest issue currently is just the drivers.
In theory, in raster, RDNA3 should be beating the GeForce 4080, not just matching it.

In some benchmarks we see a glimpse of its potential and it looks great, but in other titles it falls short... which to me shows that the immature drivers are holding it back, which is par for the course for AMD and its launch GPUs.

But in saying that, we can only take the GPU on the basis of how it performs today, not how it may perform in six months' time.

I am personally not happy with it costing $1,800 AUD; hoping the 7800 series is a little more sensible... but I don't think they will come in under $1,000 AUD either.






--::{PC Gaming Master Race}::--

Pemalite said:

haxxiy said:

I think the biggest issue currently is just the drivers.
In theory, in raster, RDNA3 should be beating the GeForce 4080, not just matching it.

In some benchmarks we see a glimpse of its potential and it looks great, but in other titles it falls short... which to me shows that the immature drivers are holding it back, which is par for the course for AMD and its launch GPUs.

But in saying that, we can only take the GPU on the basis of how it performs today, not how it may perform in six months' time.

I am personally not happy with it costing $1,800 AUD; hoping the 7800 series is a little more sensible... but I don't think they will come in under $1,000 AUD either.

This might be the case, but I still think AMD should have brute-forced it to 3 GHz with an extra 100W. It'd be near the 4090 instead of the 4080 in raster for $600 less.

That seems much more marketable to me, and one would get 4080 levels of RT for $200 less too. I'm not sure people would care if it's rated 450W given the 4090 is there (despite it not consuming that much in gaming without RT).

Maybe AMD considered all of this but was just too late to change the reference design.
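To put the marketability argument in numbers, here is a quick performance-per-dollar sketch. The relative raster figures and the 3 GHz / +100 W card are hypothetical illustrations of the post's scenario, not measured results:

```python
def perf_per_kusd(relative_perf: float, price_usd: float) -> float:
    """Relative performance points delivered per $1000 spent."""
    return relative_perf / price_usd * 1000

# Hypothetical relative raster performance (4090 = 100) and US MSRPs;
# the "3 GHz, +100 W" 7900 XTX is the assumed brute-forced card, not a real SKU.
cards = {
    "RTX 4090": (100, 1599),
    "RTX 4080": (80, 1199),
    "7900 XTX (stock)": (82, 999),
    "7900 XTX (3 GHz, +100 W)": (95, 999),
}
for name, (perf, price) in cards.items():
    print(f"{name:26s} {perf_per_kusd(perf, price):5.1f} perf/$1000")
```

Under these assumptions, a near-4090 card at $999 would clearly lead the value chart, which is the core of the argument.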




haxxiy said:
Pemalite said:

haxxiy said:

I think the biggest issue currently is just the drivers.
In theory, in raster, RDNA3 should be beating the GeForce 4080, not just matching it.

In some benchmarks we see a glimpse of its potential and it looks great, but in other titles it falls short... which to me shows that the immature drivers are holding it back, which is par for the course for AMD and its launch GPUs.

But in saying that, we can only take the GPU on the basis of how it performs today, not how it may perform in six months' time.

I am personally not happy with it costing $1,800 AUD; hoping the 7800 series is a little more sensible... but I don't think they will come in under $1,000 AUD either.

This might be the case, but I still think AMD should have brute-forced it to 3 GHz with an extra 100W. It'd be near the 4090 instead of the 4080 in raster for $600 less.

That seems much more marketable to me, and one would get 4080 levels of RT for $200 less too. I'm not sure people would care if it's rated 450W given the 4090 is there (despite it not consuming that much in gaming without RT).

Maybe AMD considered all of this but was just too late to change the reference design.

I wouldn't be too hyped about the OC numbers until we see more. TechPowerUp needed to push the clocks up to 3.2 GHz with aggressive overclocking, undervolting and tuning to get near stock 4090 levels in Cyberpunk raster, and we have no idea how much power that took. Most review outlets only managed to OC the TUF to 3 GHz, and the performance ended up being only 5% higher on average, which you can easily match with a 4080 as well:

https://www.guru3d.com/articles_pages/asus_tuf_gaming_radeon_rx_7900_xtx_oc_review,30.html

Not to mention, 4080-level ray tracing would still be too far out of reach even with an OC:

At best, it will likely be at 3090 Ti levels.

So it's better to wait and see before claiming it will be near 4090 raster or 4080 RT levels, but the clocks certainly should have been higher than what the reference shipped with.



                  


Captain_Yuri said:

I wouldn't be too hyped about the OC numbers until we see more. TechPowerUp needed to push the clocks up to 3.2 GHz with aggressive overclocking, undervolting and tuning to get near stock 4090 levels in Cyberpunk raster, and we have no idea how much power that took. Most review outlets only managed to OC the TUF to 3 GHz, and the performance ended up being only 5% higher on average, which you can easily match with a 4080 as well:

https://www.guru3d.com/articles_pages/asus_tuf_gaming_radeon_rx_7900_xtx_oc_review,30.html

Not to mention, 4080-level ray tracing would still be too far out of reach even with an OC:

At best, it will likely be at 3090 Ti levels.

So it's better to wait and see before claiming it will be near 4090 raster or 4080 RT levels, but the clocks certainly should have been higher than what the reference shipped with.

TPU managed to hit these frequencies with two out of two cards, though (the XFX and the TUF), while Guru3D's TUF performed worse than TPU's reference card (much lower memory clocks and just 20 MHz more on the GPU clock).

Could be a bad sample, or it could be because G3D is using Afterburner instead. Another piece of bugged software wouldn't be a surprise here.

As for ray tracing, I'm thinking more of the average rather than the worst-case scenario (Steve from GN was told by AMD that Cyberpunk 2077's RT is bugged on Radeon). I mean, it's fair to take that into account since it's RTG's fault, but other games exist, too.




haxxiy said:
Captain_Yuri said:

I wouldn't be too hyped about the OC numbers until we see more. TechPowerUp needed to push the clocks up to 3.2 GHz with aggressive overclocking, undervolting and tuning to get near stock 4090 levels in Cyberpunk raster, and we have no idea how much power that took. Most review outlets only managed to OC the TUF to 3 GHz, and the performance ended up being only 5% higher on average, which you can easily match with a 4080 as well:

https://www.guru3d.com/articles_pages/asus_tuf_gaming_radeon_rx_7900_xtx_oc_review,30.html

Not to mention, 4080-level ray tracing would still be too far out of reach even with an OC:

At best, it will likely be at 3090 Ti levels.

So it's better to wait and see before claiming it will be near 4090 raster or 4080 RT levels, but the clocks certainly should have been higher than what the reference shipped with.

TPU managed to hit these frequencies with two out of two cards, though (the XFX and the TUF), while Guru3D's TUF performed worse than TPU's reference card (much lower memory clocks and just 20 MHz more on the GPU clock).

Could be a bad sample, or it could be because G3D is using Afterburner instead. Another piece of bugged software wouldn't be a surprise here.

As for ray tracing, I'm thinking more of the average rather than the worst-case scenario (Steve from GN was told by AMD that Cyberpunk 2077's RT is bugged on Radeon). I mean, it's fair to take that into account since it's RTG's fault, but other games exist, too.

They did, but the point is that even at 3 GHz, it doesn't actually mean it will get near a 4090 when measured across other games, as they all scale differently with RDNA 3. A 200 MHz memory OC rarely makes a difference between games.

You also can't cherry-pick the best-case scenario and use it to assume every other game will scale linearly. Remember that AMD said in their press event that Cyberpunk was one of the games with a 1.7x raster improvement while the rest sat at 1.5x, so if anything, that is the outlier. You can also pick other RT games that actually do have proper ray tracing, like Control, which shows similar 3090 Ti-tier RT performance if we assume the best-case scenario:

The other issue with the relative chart is that it includes games like Far Cry 6, which isn't actually all that RT-heavy, so it favors AMD. Claiming 4080-level RT while including games with light RT would be very misleading, because if a person buys a 7900 XTX thinking that OCing will allow 4080-level RT and then plays actual RT-heavy games, it won't perform as expected.

Just to be clear, I am not saying the 7900 XTX can't perform like TechPowerUp showed; I am saying that we need to wait for more evidence before assuming it can across the board.
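The sub-linear scaling described here (a roughly 15% clock bump for only ~5% average uplift) can be expressed as a scaling efficiency. The clock and frame-rate numbers below are hypothetical, chosen only to illustrate the shape of the argument:

```python
def scaling_efficiency(base_ghz: float, oc_ghz: float,
                       base_fps: float, oc_fps: float) -> float:
    """Fraction of the relative clock increase that shows up as frame-rate gain."""
    clock_gain = oc_ghz / base_ghz - 1.0
    perf_gain = oc_fps / base_fps - 1.0
    return perf_gain / clock_gain

# Hypothetical: ~2.6 GHz stock pushed to 3.0 GHz for a 5% average uplift.
# Only about a third of the clock increase shows up as performance.
print(round(scaling_efficiency(2.6, 3.0, 100.0, 105.0), 2))
```

An efficiency well below 1.0 is exactly the "bottleneck somewhere else" signature: extra clocks are being spent waiting on something other than compute.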



                  


Captain_Yuri said:

I wouldn't be too hyped about the OC numbers until we see more. TechPowerUp needed to push the clocks up to 3.2 GHz with aggressive overclocking, undervolting and tuning to get near stock 4090 levels in Cyberpunk raster, and we have no idea how much power that took. Most review outlets only managed to OC the TUF to 3 GHz, and the performance ended up being only 5% higher on average, which you can easily match with a 4080 as well:

https://www.guru3d.com/articles_pages/asus_tuf_gaming_radeon_rx_7900_xtx_oc_review,30.html

Not to mention, 4080-level ray tracing would still be too far out of reach even with an OC:

At best, it will likely be at 3090 Ti levels.

So it's better to wait and see before claiming it will be near 4090 raster or 4080 RT levels, but the clocks certainly should have been higher than what the reference shipped with.

That tells me there is a bottleneck somewhere, potentially bandwidth.

Now that they have doubled down on dual-issue pipelines, they might be running into a bandwidth wall... and it may not be DRAM bandwidth either.

haxxiy said:

This might be the case, but I still think AMD should have brute-forced it to 3 GHz with an extra 100W. It'd be near the 4090 instead of the 4080 in raster for $600 less.

That seems much more marketable to me, and one would get 4080 levels of RT for $200 less too. I'm not sure people would care if it's rated 450W given the 4090 is there (despite it not consuming that much in gaming without RT).

Maybe AMD considered all of this but was just too late to change the reference design.

Keep in mind they do have room to move with an RX 7950.




Pemalite said:
Captain_Yuri said:
               7900 XTX             RTX 4080
Memory         24 GB, 384-bit bus   16 GB, 256-bit bus
ROPs           192                  112
Bandwidth      960.0 GB/s           716.8 GB/s
Compute units  96 CU                76 SM
RT cores       96                   76
FP32           61.42 TFLOPs         48.74 TFLOPs
TDP            355 W                320 W
Design         Chiplet              Monolithic


Yet somehow, these two are within single-digit percent of each other in raster and around 30-40% apart in ray tracing in favor of the 4080. This is like the inverse of Ampere vs RDNA 2, except that Ampere had the ray tracing and DLSS advantage while RDNA 3 has no advantage other than price. But when a person is spending $1,000 and the competition costs $200 more while giving you superior ray tracing + DLSS and better power efficiency... AMD just might have sold the 4080...

Idk how a company can create a chiplet architecture that is less efficient than a monolithic one, but here we are... Even the transient spikes feel like Ampere's, but on TSMC.

Overall, both the 7900 XTX and the 4080 are terrible products for the price. Hopefully both will get a price cut sooner rather than later.

Chiplets by their very nature will always be less efficient than a monolithic die.

- You are moving caches and memory controllers further away from compute... and adding an interconnect/fabric, which consumes energy and increases latency.

haxxiy said:

That TUF is more like what I had expected. The OC would even manage to just about match the 4080 in RT (on average; still behind in the more demanding titles).

This lends some credence to the idea that AMD found frequency-curve issues late in development, and that is why the clocks were much lower than expected. Even then, I wonder if they'd have been better off just adding another hundred watts to the cards. Could be bad PR trauma from the Vishera/Hawaii days, but now everyone's doing it, so yeah.

Too bad these won't retail for MSRP, at least for now.

I think the biggest issue currently is just the drivers.
In theory, in raster, RDNA3 should be beating the GeForce 4080, not just matching it.

In some benchmarks we see a glimpse of its potential and it looks great, but in other titles it falls short... which to me shows that the immature drivers are holding it back, which is par for the course for AMD and its launch GPUs.

But in saying that, we can only take the GPU on the basis of how it performs today, not how it may perform in six months' time.

I am personally not happy with it costing $1,800 AUD; hoping the 7800 series is a little more sensible... but I don't think they will come in under $1,000 AUD either.

It was reported before launch that the drivers were not fully ready, with some bugs still needing work and optimization incomplete. So I'm pretty sure the RDNA3 cards will get better performance and have these power bugs ironed out over time (the transient spikes, maybe not).

For 1,000 Aussie dollars, I suspect you'll have to go with a Navi 32-based card. I expect the top-end 7800 to still be made from Navi 31, as otherwise there's too large a gap (from 84 CU down to just 60 CU) and an actual regression in CU count gen over gen. As such, I think Navi 32 will either become a 7700 XTX, or a 7800 XT with a 7800 XTX made from Navi 31 (probably 72 CU, so the CU counts step down linearly, which would mean the CU counts of the 7800 XTX and XT match those of the 6800 XT and 6800).

haxxiy said:
Pemalite said:

haxxiy said:

I think the biggest issue currently is just the drivers.
In theory, in raster, RDNA3 should be beating the GeForce 4080, not just matching it.

In some benchmarks we see a glimpse of its potential and it looks great, but in other titles it falls short... which to me shows that the immature drivers are holding it back, which is par for the course for AMD and its launch GPUs.

But in saying that, we can only take the GPU on the basis of how it performs today, not how it may perform in six months' time.

I am personally not happy with it costing $1,800 AUD; hoping the 7800 series is a little more sensible... but I don't think they will come in under $1,000 AUD either.

This might be the case, but I still think AMD should have brute-forced it to 3 GHz with an extra 100W. It'd be near the 4090 instead of the 4080 in raster for $600 less.

That seems much more marketable to me, and one would get 4080 levels of RT for $200 less too. I'm not sure people would care if it's rated 450W given the 4090 is there (despite it not consuming that much in gaming without RT).

Maybe AMD considered all of this but was just too late to change the reference design.

AMD probably needs a new stepping or even a full-on revision to achieve this.

I believe that RDNA3 wasn't ready yet, both on a hardware and a software level, but got pushed out the door ASAP because AMD needed something to counter Nvidia's new cards. As such, RDNA3 reminds me a bit of the first Ryzen chips, which were also pushed out the door ASAP while AMD was on the brink of bankruptcy, and which could only deploy their full potential a year later with the launch of Zen+, which contained all the fixes that couldn't make it into the original Zen chips.

If this proves correct, I expect AMD to come out with faster 7x50 versions later down the line, probably in the fall of next year or early 2024. Until then, the driver issues should be resolved and performance already improved by those fixes. Either way, I expect the gap to the 4080 to grow over time, not shrink, as the chips are clearly held back by their unfinished driver support.



The news:

SALES / PLAYER COUNT & DEALS

GOG has three new Deals of the Day, plus other stuff:

Steam has two new deals:

There are two new deals at Humble:

And Fanatical has a new Star Deal and a contest:

SOFTWARE & DRIVERS

INTEL Arc Graphics Driver 31.0.101.3975
https://videocardz.com/driver/intel-arc-graphics-driver-31-0-101-3975
Highlights
Intel® Game On Driver support on Intel® Arc™ A-series Graphics for:

  • The Witcher 3: Wild Hunt Next-Gen Update
  • High on Life
  • Conqueror’s Blade

>> It also brings an improvement of 4% at 1440p for PUBG.

MODS, EMULATORS & FAN PROJECTS

-Empty-

GAMING NEWS

Warcraft III: Reforged Update 1.35.0 released on PTR, full patch notes
https://www.dsogaming.com/patches/warcraft-iii-reforged-update-1-35-0-released-on-ptr-full-patch-notes/
Blizzard has released a brand new update for Warcraft III: Reforged on the Public Test Realm servers. According to the release notes, Patch 1.35.0 adds the ability to play Custom Campaigns. Moreover, it packs a number of balance tweaks and bug fixes.

CRISIS CORE –FINAL FANTASY VII– REUNION does not require a high-end PC, but has shader compilation stutters
https://www.dsogaming.com/articles/crisis-core-final-fantasy-vii-reunion-does-not-require-a-high-end-pc-but-has-shader-compilation-stutters/
Square Enix has just released the remaster of the Final Fantasy 7 PSP game, CRISIS CORE –FINAL FANTASY VII– REUNION, on PC. CRISIS CORE –FINAL FANTASY VII– REUNION uses Unreal Engine 4 and as the title suggests, it has shader compilation stutters on PC.
Now the good news here is that the game does not require a high-end PC system in order to be enjoyed. Furthermore, it fully supports Keyboard & Mouse, and plays wonderfully with them.
>> Oh, and someone didn't do their homework and left in images with watermarks from Getty Images XD.

The Witcher 3 Next-Gen is another “Cyberpunk 2077” buggy mess at launch
https://www.dsogaming.com/news/the-witcher-3-next-gen-is-another-cyberpunk-2077-buggy-mess-at-launch/
CD Projekt RED has just released the Next-Gen Update for The Witcher 3 and I honestly got some “Cyberpunk 2077 buggy” vibes from it. Right now, this Update is a buggy mess as it has numerous bugs and optimization issues.
>> It's like they don't learn.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Part two of the news:

High on Life lets you watch an entire 90-minute movie for some reason
https://www.pcgamer.com/high-on-life-lets-you-watch-an-entire-90-minute-movie-for-some-reason/
Ever catch yourself playing a game and thinking "Damn, I really wish I could watch the 1994 sci-fi horror comedy Tammy and the T-Rex in this game right now?" No? Well maybe High on Life can convince you to adopt that mindset.

Happy settings are beyond FromSoftware's 'capability or experiences,' says Miyazaki
https://www.pcgamer.com/happy-settings-are-beyond-fromsoftwares-capability-or-experiences-says-miyazaki/
Is FromSoftware okay? I don't mean to pry, but I can't help but notice that pretty much every single one of the company's recent games takes place in a blighted world of husks and ghosts slipping slowly into oblivion. Are things, like, alright over there? Have some of the most important games of the past decade been a sustained cry for help?
These are the questions—more or less—posed to FromSoft president Hidetaka Miyazaki in a recent IGN interview. The Dark Souls director was asked why it is the studio so often sets its games in dreary, apocalyptic settings. There's a couple of reasons, the first of which is that the devs just really like 'em.

PSA: Don't play Destiny 2's new Dawning event without equipping this item first
https://www.pcgamer.com/psa-dont-play-destiny-2s-new-dawning-event-without-equipping-this-item-first/
Tis the season to be grinding, guardians. Today kicks off the annual Dawning event in Destiny 2, which means cookies to bake, gifts to receive, and, most crucially, updated perk rolls on a suite of winter-themed weapons. But wait, before you begin, please take this word of advice from Santa Clarksmas. Go into your collections tab and check whether you previously unlocked the specific Dawning ship and sparrow listed below. Both of these confer actual in-game benefits during the event, which will be active from Dec 13 to Jan 3.

Like most Marvel movies, Midnight Suns is hiding a huge secret in a post-credits teaser
https://www.pcgamer.com/midnight-suns-post-after-credits-scene/
Midnight Suns is the first truly great superhero game since, well, Insomniac released several good Spiders-Men this year, so not that long, I guess. But it's been great to see modern Marvel superimposed on a new genre, the so-hot-right-now card game, and create something more Mass Effect-like than we've seen before. Our Midnight Suns review anointed it 88%.
True to its Marvel license, Midnight Suns has two short post-credits scenes waiting for players at the end of its long, 65+ hour campaign (it took me 75).
>> Be careful with spoilers.

PSA: Firaxis put a whole dang Midnight Suns cartoon on YouTube
https://www.pcgamer.com/psa-firaxis-put-a-whole-dang-midnight-suns-cartoon-on-youtube/
The way X-Men: The Animated Series distilled the essence of its sizeable cast of mutants down to a series of vignettes only seconds long remains a masterclass in cartoon introductions. Someone was definitely paying attention, because the intro of the official animated prequel to Marvel's Midnight Suns follows it to a tee.

This game copied Among Us so hard it became popular in almost exactly the same way
https://www.pcgamer.com/this-game-copied-among-us-so-hard-it-became-popular-in-almost-exactly-the-same-way/
Social deception game Among Us became a megahit during the 2020 lockdowns, but it actually released two years earlier in 2018, making it one of the most notable examples of delayed success in the videogame business. Weirdly, the creators of Goose Goose Duck, a free-to-play game which closely resembles Among Us, have managed to not only replicate its format, but also that delayed popularity.

World of Warcraft: Dragonflight Season 1 kicks off with a new raid, arena and lots of proto-drakes that need killing
https://www.pcgamer.com/world-of-warcraft-dragonflight-season-1-kicks-off-with-a-new-raid-arena-and-lots-of-proto-drakes-that-need-killing/
Now that Azeroth's many adventurers have had a couple of weeks to get to grips with World of Warcraft: Dragonflight, Blizzard has seen fit to throw some new challenges their way in Dragonflight Season 1. The expansion's villain, the impressively-named Raszageth the Storm-Eater, has breached the Vault of the Incarnates with her Primalist pals, which means heroes will need to jump in and put an end to their plans by fighting through eight boss encounters in a new raid.

Overwatch 2's Winter Wonderland event is back and actually has an earnable skin this time
https://www.pcgamer.com/overwatch-2s-winter-wonderland-event-is-back-and-actually-has-an-earnable-skin-this-time/
Overwatch 2's second seasonal event is here, and unfortunately it seems Blizzard is still struggling to capture fans with its culled rewards and recycled game modes.


