
Forums - PC Discussion - Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

AMD Radeon RX 7900 GRE 16 GB Launches, Cut Down Navi 31 GPU Starting At $649 US

https://wccftech.com/amd-radeon-rx-7900-gre-16-gb-gpu-launch-rtx-4070-competitor-649-usd-price/

Lol, slightly faster than a 4070 while having a 60W higher TDP, and it's positioned as a successor to the 6800XT with an 8-13% uplift gen-on-gen. You can still get an AIB 6800XT for $529 US + Starfield, and it will be a much better value than this trash.

Imagine if a cut-down AD102 die had similar performance to GA102, like how the cut-down N31 has similar performance to N21. Jensen would shoot all his engineers for bringing dishonour to Nvidia. RDNA 3 might be one of the worst architectures in history.

AMD Unleashes The Ryzen 9 7945HX3D, First 3D V-Cache CPU For Laptops & Powers ASUS ROG SCAR 17 X3D

https://wccftech.com/amd-unleashes-the-ryzen-9-7945hx3d-first-3d-v-cache-cpu-for-laptops/

Considering how close Zen 4 laptop CPUs already are to Raptor Lake laptop CPUs, and that you can't OC laptop RAM/CPUs to the moon, X3D should dominate the laptop CPU space in terms of performance.

Intel Says All Products Are On or Ahead of Schedule: Arrow Lake Running In Fabs, Client-Segment Recovers

https://wccftech.com/intel-all-products-ahead-of-schedule-arrow-lake-running-in-fabs-pc/

I'll believe it when I see it



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

TallSilhouette said:

I enjoyed it at least as much as the first game.

&

-Adonis- said:

I loved it. More than the first one. I played with the dope DLC War of the Chosen.

Thanks. I'm glad you enjoyed it.

m0ney said:
JEMC said:

I have a question for those of you that have played XCOM 2: is it me or does the game feel like a chore from time to time?

I don't know how to explain it, but I don't find it as engaging or fun as the first one, and I've found myself forcing myself to keep playing, especially when it comes to the timed/turn-limited missions, because they force me to play in a way that's not my style, and they happen way too often for my liking.

So, is it me, or did you feel the same way when you played it?

I quit it early. Many devs think 'bigger is better' when it comes to sequels - IMO, more often than not, it isn't.

Thank you. My biggest problem isn't the 'bigger is better' mindset - I actually like some of the changes - but the game just doesn't feel fun.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16GB RAM at 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Oof Navi 31 on the 7900 GRE really got gimped with that 256-bit bus configuration. Should be $600 max imo.

Shows how far AMD/RTG and RDNA 3 are behind Nvidia, with those specs vs their counterparts on Ada Lovelace. And they still chose to sell at over $650... what a joke. At $550 it would be a compelling buy vs a 4070, seeing as it has more VRAM and performs slightly better. But judging from this and the 7000 series so far, it doesn't bode well for the 7800/7700.

Last edited by hinch - on 28 July 2023

Everyone knows Intel and RTG's best products are Powerpoint Slides and disappointment, respectively, so nothing out of the ordinary in today's news.

Sold my 3060Ti and got a 4070 instead. Those rays won't trace themselves. I can play Witcher 3 NG close to 60fps some of the time with DLSS 2 enabled. DLSS 3 is still crashing every few minutes.

The future is here.*

* Alpha product version




That moment when XeSS is better than DLSS even though it uses the DP4a fallback path. Come on Huang, what did we pay you for when an open-source AI upscaler is better?

On the bright side, lots of issues got fixed on the latest patch that got released today.

Last edited by Jizz_Beard_thePirate - on 28 July 2023


Asus AXE7800 review - Read the fine print to make sure it works for your use case

Why do these routers have so many antennas?
     Wifi 6, and Wifi 6E especially, need antennas for all the additional bands and channels. In the past you only needed dual band, but with Wifi 6E you really need triple band because of the new 6GHz band. If you buy a dual band Wifi 6E router, you have to pick which 2 of the 3 bands (2.4GHz, 5GHz and 6GHz) to keep enabled. But at that point it's likely better to buy a dual band Wifi 6 router over a dual band Wifi 6E one, which I'll explain below.

What is Wifi 6E and is it worth it?
     Without getting too deep into it, Wifi 6 (non-E) essentially added more channels to the 5GHz band via the 160MHz channel width. A wider channel carries more data, so it gives higher bandwidth than the older 20/40/80MHz widths, but it is also more susceptible to interference than the smaller widths. Wifi 6E, on the other hand, adds an entirely new band, the 6GHz band, which also has 20/40/80/160MHz channel widths. This means clients connecting over Wifi 6E have an entire band of channels all to themselves, whereas older devices have to fight over congested space on the 2.4/5GHz bands. Wifi 6E is also super fast because of how many wide channels fit in the band: at 160MHz width, for example, Wifi 6 has only 2 channels while 6E has 7. The picture below should give you a better idea. There's also other stuff like MIMO/MU-MIMO, beamforming and so on, but no one has time for that.
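To put rough numbers on the channel-width math above, here's a back-of-the-envelope sketch of 802.11ax peak PHY rates. The subcarrier counts and timings come from the 802.11ax tone plans; the function name and structure are my own illustration, and it assumes best-case conditions (MCS 11, short guard interval) that you'll rarely see through a wall.

```python
# Rough 802.11ax (Wifi 6/6E) peak PHY rate, assuming best-case MCS 11
# (1024-QAM, 5/6 coding) and a 0.8us guard interval.
DATA_SUBCARRIERS = {20: 234, 40: 468, 80: 980, 160: 1960}  # per channel width (MHz)

BITS_PER_SYMBOL = 10   # 1024-QAM carries 10 bits per subcarrier
CODING_RATE = 5 / 6    # MCS 11 coding rate
SYMBOL_TIME = 13.6e-6  # 12.8us OFDM symbol + 0.8us guard interval

def peak_phy_rate_mbps(width_mhz: int, n_streams: int) -> float:
    """Best-case PHY rate in Mbps for a channel width and spatial stream count."""
    bits_per_symbol = DATA_SUBCARRIERS[width_mhz] * BITS_PER_SYMBOL * CODING_RATE
    return n_streams * bits_per_symbol / SYMBOL_TIME / 1e6

# A 2x2 client on a 160MHz channel (a typical Wifi 6E laptop):
print(round(peak_phy_rate_mbps(160, 2)))  # ~2402 Mbps
# The same 2x2 client limited to an 80MHz channel:
print(round(peak_phy_rate_mbps(80, 2)))   # ~1201 Mbps
```

Doubling the channel width or doubling the antenna streams each doubles the ceiling, which is why both the 160MHz channels and the antenna configuration discussed later matter.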

There is a kicker, however: Wifi 6E is pretty short range, largely due to much worse wall penetration. My room, where the router is, is on the top floor of my house; below that is the ground floor, and below that the basement where the network PC is. If I go to the washroom, which is right outside my room on the same floor, I lose half the wifi bars: on Wifi 6E the speed drops from 750Mbps in my room to 300-400Mbps in the washroom. Meanwhile, Wifi 6 on the 5GHz band reaches all the way to the basement where my network PC sits and still has full bars, although the speed there is 200-300Mbps. If I switch to 2.4GHz, I get about 50Mbps from the basement, while the Wifi 6E signal doesn't reach there at all. In fact, 6E doesn't even reach my living room from the router, whereas 5GHz reaches all the way to the yard.
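The range gap between bands has a physical basis even before walls get involved: free-space path loss grows with frequency. A minimal sketch using the standard Friis free-space formula (the distances are illustrative, and real walls add far more loss on top, also increasing with frequency):

```python
import math

C = 299_792_458  # speed of light, m/s

def fspl_db(freq_hz: float, dist_m: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / C)

# Even in open air, 6GHz starts roughly 8dB down on 2.4GHz at any distance:
for f in (2.4e9, 5e9, 6e9):
    print(f"{f / 1e9:.1f}GHz at 10m: {fspl_db(f, 10):.1f} dB")
```

That fixed ~8dB deficit, plus the extra wall attenuation at 6GHz, is why the 6E band dies off within a room or two while 5GHz still reaches the basement.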

So it's beginning to sound like Wifi 6E is a scam, but really it's all about congestion. Say you have 3 devices on the Wifi 6E band that are close enough to use it: those devices won't congest the 5GHz band, so the devices that remain on 5GHz will in turn have faster wifi.

The Fine Print
     Now, if you thought the range issue was the fine print, there is something more important to be aware of: the antenna configuration. My AC86U, for example, had 3x3 2.4GHz and 4x4 5GHz on Wifi 5, while my AXE7800 has 2x2 2.4GHz, 4x4 5GHz on Wifi 6 and 2x2 6GHz. This means clients on 2.4GHz that have 3-4 antennas would actually be slower on my AXE7800 than on my AC86U. In practice, though, the 5GHz band covers my entire house and I don't have any 2.4GHz-only devices. And while "4x4 5GHz" might sound the same on both routers, thanks to Wifi 6's 160MHz channels, MU-MIMO and other tech, the newer one is a lot faster both on the spec sheet and in practice.

Should you get Wifi 6E or buy a cheaper Wifi 6 router?
     I think it depends on the configuration. A lot of the benefits I stated can largely be achieved with a Wifi 6 router, because Wifi 6E has little to no benefit for Wifi 6 devices. Wifi 6E is much faster than Wifi 6, but the limited range cucks it hard, and even if you add antennas, the range won't improve much. The best placement for Wifi 6E is the living room or another open area, to get the most out of it. The important thing is to have the most antennas on the 5GHz band and, if you have devices with low bars, more antennas on 2.4GHz as well, though remember that 2.4GHz is very slow. In some cases a dual band Wifi 6 router can be better than a triple band Wifi 6E one, depending on the antenna configuration. So it depends on your use case, but I wouldn't buy a router solely for Wifi 6E unless you have a small house or live in an apartment.




hinch said:

Oof Navi 31 on the 7900 GRE really got gimped with that 256-bit bus configuration. Should be $600 max imo.

Shows how far AMD/RTG and RDNA 3 are behind Nvidia, with those specs vs their counterparts on Ada Lovelace. And they still chose to sell at over $650... what a joke. At $550 it would be a compelling buy vs a 4070, seeing as it has more VRAM and performs slightly better. But judging from this and the 7000 series so far, it doesn't bode well for the 7800/7700.

It's frustrating because nVidia sets the price and AMD tries to emulate it.

...When AMD could be undercutting nVidia, selling higher volume and taking more marketshare.



--::{PC Gaming Master Race}::--

Pemalite said:
hinch said:

Oof Navi 31 on the 7900 GRE really got gimped with that 256-bit bus configuration. Should be $600 max imo.

Shows how far AMD/RTG and RDNA 3 are behind Nvidia, with those specs vs their counterparts on Ada Lovelace. And they still chose to sell at over $650... what a joke. At $550 it would be a compelling buy vs a 4070, seeing as it has more VRAM and performs slightly better. But judging from this and the 7000 series so far, it doesn't bode well for the 7800/7700.

It's frustrating because nVidia sets the price and AMD tries to emulate it.

...When AMD could be undercutting nVidia, selling higher volume and taking more marketshare.

Yes, but we've seen that song and dance before, Pem. Nvidia just has such an insanely strong amount of mindshare that even when AMD were undercutting them, people still chose Nvidia regardless.

AMD also sucks at playing the extremely long game of undercutting Nvidia for more than a few generations in a row.



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

Chazore said:
Pemalite said:

It's frustrating because nVidia sets the price and AMD tries to emulate it.

...When AMD could be undercutting nVidia, selling higher volume and taking more marketshare.

Yes, but we've seen that song and dance before, Pem. Nvidia just has such an insanely strong amount of mindshare that even when AMD were undercutting them, people still chose Nvidia regardless.

AMD also sucks at playing the extremely long game of undercutting Nvidia for more than a few generations in a row.

But when they were competitive on price/performance, they took marketshare; they just never maintained that momentum.
For example, AMD's marketshare pretty much imploded when they kept rebadging the Radeon 7000 series into the 8000/R200/R300 series.

Some highlights of AMD's marketshare were the Radeon 9000 series, the X850 series and the X1950 series. They dropped a ton of marketshare with the Radeon 2900 series, deservedly so, clawed some back with the Radeon 5000 series, deservedly so, but then the 6000 series was just a refinement and didn't push boundaries...
The Radeon 7000 series was tarnished by frame-pacing driver issues and a focus on compute.

The Radeon RX 6000 series is at an all-time low because Ray Tracing and DLSS are overshadowing everything AMD has... deservedly so.
And even with nVidia dropping the ball with the 4000 series, AMD had the potential to release a very solid, much higher clocked 12GB-14GB-16GB Radeon 7600 and obliterate nVidia... but didn't.

So historically, when AMD releases a solid product lineup (i.e. a top-to-bottom stack), their marketshare increases; they just never kept that going for a second generation, as nVidia's leaps (i.e. Maxwell) have been significantly more impressive as of late.



--::{PC Gaming Master Race}::--