
Forums - PC Discussion - Carzy Zarx's PC gaming emporium - Catch up on all the latest PC Gaming related news


Zarx changed his avatar again. Thoughts?

Noice: 248 (61.23%)
So soon? I just got used to the last one: 14 (3.46%)
it sucks: 22 (5.43%)
Your cropping skills are lacking: 14 (3.46%)
Too noisy, can't tell WTF it even is: 14 (3.46%)
Meh: 32 (7.90%)
Total: 344
Captain_Yuri said:
QUAKECore89 said:

I gotta keep my plan to build a Ryzen 9 7900X with Radeon RX 7900 XT rig; I don't like turning back into a rival Intel/Nvidia user.. XD

What the actual fuck industry LOL.

Lol, well, both of those are great products (even if I perhaps went a bit overboard myself), so it's not like you can go wrong. The fact that RDNA 3 now has the feature set that Nvidia/Intel do, even if it's slower, means it's in a good spot. The 7900X is currently a bit overpriced imo, but that will come down soon enough. So you can still go with your build, it will still be great, and I doubt you will regret your purchase. Sure, it may not be first place, but it certainly won't cost you first-place money.

Bold: It should be early 2023, along with DDR5 RAM, so I hope they fix the prices.

Thanks, Yuri.




I'm currently building an entry gaming PC for my brother and went with a 3600 + 3060 Ti, a nice mix of similar numbers, and also somewhat of a throwback to my 2010 build with a Phenom and a 460. Back then I was just a wee freshman lad with a tight budget, but damn, the exchange rates and prices were great.

Native ATX 3.0 16-Pin Cable Melts Too When Connected To An NVIDIA GeForce RTX 4090

https://wccftech.com/native-atx-3-0-16-pin-cable-melts-too-when-connected-to-an-nvidia-geforce-rtx-4090/

A $1600+ GPU getting cucked by a 10-cent connector. While it hasn't happened to mine yet, it's embarrassing that Nvidia hasn't come out and said anything. And I have seen some people on Twitter defend this with percentages based on the 100,000 number, when in reality this shouldn't have happened to even a single consumer released product unless the consumer was doing something dumb like daisy-chaining Molex connectors.
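For anyone curious, the percentage defense works out like this. This is just back-of-the-envelope math: the 100,000 units figure is the rumored number from Twitter, and the report count here is a made-up placeholder, not a real tally.

```python
# Illustrative failure-rate math (all numbers are assumptions, not official):
units_sold = 100_000     # rumored number of 4090s shipped
reported_melts = 50      # placeholder count of melt reports, NOT a real figure

failure_rate = reported_melts / units_sold
print(f"{failure_rate:.4%}")  # prints "0.0500%"
```

Even a tiny percentage like that still means dozens of burned connectors in the wild, which is the point: the rate being "low" doesn't make the failure mode acceptable.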

AMD RDNA 3 “Navi 31” GPU Block Diagram Detailed: 1st Chiplet Gaming GPU, 54% Perf/Watt, Larger L0/L1/L2 Cache, 80% Better Ray Tracing

https://wccftech.com/amd-rdna-3-navi-31-gpu-block-diagram-detailed-1st-chiplet-gaming-gpu-perf-watt-larger-l0-l1-l2-cache-better-ray-tracing/

The key words are "up to".

More Video Cards Need This: AMD RX 7900 XTX Reference Design Hands-On

Very interesting temp sensors on the fans



PC Specs: CPU: 5950X || GPU: Strix 4090 || RAM: 32GB DDR4 3600 || Main SSD: WD 2TB SN850

So I found out what Hyper RX is. I rewatched the Hyper RX portion of AMD's press event and Cortex's video, and it's a one-click feature that combines Boost with FSR and Anti-Lag...

Which is to say, it's gonna be useless. All it really does is attempt to decrease input latency by increasing the frame rate, dynamically adjusting resolution with upscaling. Radeon Anti-Lag is similar to Nvidia's NULL, where you enable it at the driver level and it gives a minor input-latency decrease. So Hyper RX is certainly not a Reflex alternative, because Reflex is native to the game and gives a drastic input-latency decrease.

If anyone is interested in why Reflex is so good versus Boost, Anti-Lag and such, watch this video. Yes, it's 20 minutes, but since Reflex support is only going to increase now that it's part of DLSS 3, people should understand why it's so good.
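To see what "more frames = less latency" actually buys you, here's the rough frame-time arithmetic behind the Boost/upscaling approach. The 60 and 90 fps figures are just example numbers I picked, not anything AMD quoted:

```python
# Frame-time math: raising FPS shrinks the per-frame slice of input latency,
# which is essentially all a dynamic-resolution Boost can do. Reflex instead
# changes how the render queue works, which this math doesn't capture.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

before = frame_time_ms(60)  # ~16.7 ms per frame at 60 fps
after = frame_time_ms(90)   # ~11.1 ms per frame after a hypothetical boost to 90 fps
print(f"saved per frame: {before - after:.1f} ms")  # prints "saved per frame: 5.6 ms"
```

So the savings are real but small and capped by how much resolution you're willing to give up, which is why it's not in the same league as cutting queued frames at the engine level.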




I guess CableMod is the only hope for the 4090. How is Corsair's cable doing?



QUAKECore89 said:

I guess CableMod is the only hope for the 4090. How is Corsair's cable doing?

I wouldn't touch CableMod either until Nvidia finds out what's causing the issue, cause otherwise you aren't gonna have a good time with warranty. It's why I still have the shit adapter installed on my Strix 4090, cause I am not gonna let the warranty people claim it was due to a third-party adapter.




Captain_Yuri said:
QUAKECore89 said:

I guess CableMod is the only hope for the 4090. How is Corsair's cable doing?

I wouldn't touch CableMod either until Nvidia finds out what's causing the issue, cause otherwise you aren't gonna have a good time with warranty. It's why I still have the shit adapter installed on my Strix 4090, cause I am not gonna let the warranty people claim it was due to a third-party adapter.

Damn, this is not good...



Captain_Yuri said:

I think if Nvidia does the following, it will be an easy win this gen for them:

Lower the price of the 4080 to $900
Release the rumoured 4080 Ti with 14000 cuda cores for $1200 to compete more directly against the 7900XTX

Personally, I don't like the 4080/7900XT being priced at $900. The 4080 should be priced at $700, and the 7900XT, which is a 7800XT in disguise, should be around the same price. If Nvidia doesn't do those two things, I think the 4080 will be DOA.

Ok, now I'm a bit lost. I know the difference in shaders between the Ada GPUs is so big that we've made those kinds of comments, but is the difference between the XTX and the XT that big? I know one has 12 fewer CUs than the other, but the full Navi 32 GPU allegedly has 60 CUs, putting the 7900XT in the middle of both full chips.

Captain_Yuri said:
QUAKECore89 said:

Just in time, I was reading this article, and it sure looks like chaos is going on!

https://videocardz.com/newz/amd-confirms-rx-7900-xtx-is-rtx-4080-competitor-fsr3-may-be-supported-by-pre-rdna3-architectures

LOL

Zen 4 Flop
RDNA 3 Flop

It will certainly not set the world aflame, but it's too soon to know how RDNA3 will perform.

haxxiy said:

I'm currently building an entry gaming PC for my brother and went with a 3600 + 3060 Ti, a nice mix of similar numbers, and also somewhat of a throwback to my 2010 build with a Phenom and a 460. Back then I was just a wee freshman lad with a tight budget, but damn, the exchange rates and prices were great.

You only need some 3600MHz DDR4 memory to make the perfect combo.

I hope your brother thanks you as he should, because that will be a great gaming PC.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

JEMC said:
Captain_Yuri said:

I think if Nvidia does the following, it will be an easy win this gen for them:

Lower the price of the 4080 to $900
Release the rumoured 4080 Ti with 14000 cuda cores for $1200 to compete more directly against the 7900XTX

Personally, I don't like the 4080/7900XT being priced at $900. The 4080 should be priced at $700, and the 7900XT, which is a 7800XT in disguise, should be around the same price. If Nvidia doesn't do those two things, I think the 4080 will be DOA.

Ok, now I'm a bit lost. I know the difference in shaders between the Ada GPUs is so big that we've made those kinds of comments, but is the difference between the XTX and the XT that big? I know one has 12 fewer CUs than the other, but the full Navi 32 GPU allegedly has 60 CUs, putting the 7900XT in the middle of both full chips.

Well, the 6800XT is 72 CU while the 6900XT/XTX is 80 CU. That's why the 7900XT is a 7800XT in disguise: the 7900XT is 84 CU vs the 7900XTX's 96 CU.

The 4080 is a 4070 disguised as a 4080 in paper specs. But since AMD themselves are saying that the 7900XTX is a 4080 competitor, it is a "4080" in the sense that it will realistically compete against the 7900XT, if that makes any sense. Cause the 7900XTX will no doubt outperform the 4080, since the 4080 is just too slow. But the 7900XT will most likely be a close match against the 4080.

So both companies are doing shitty things. Nvidia outright said fu and priced the 4080 at $1200 vs the $700 of the previous 3080, while AMD is being slimy by increasing the price to $900 and calling it a 7900XT when the previous 6800XT was $650. If Nvidia doesn't lower the price of the 4080, it will certainly be the shittiest GPU of the year. It will be a worse buy than the Intel A770 imo.
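If you want to sanity-check that argument, here's a quick sketch of the ratios using the CU counts and US launch prices quoted in the posts above (nothing here beyond those figures):

```python
# CU counts as quoted in the thread: the cut-down card's share of the full
# chip shrank this gen, which is the "7800XT in disguise" argument.
rdna2 = {"6800XT": 72, "6900XT": 80}
rdna3 = {"7900XT": 84, "7900XTX": 96}

print(f"6800XT/6900XT:  {rdna2['6800XT'] / rdna2['6900XT']:.1%}")   # prints "90.0%"
print(f"7900XT/7900XTX: {rdna3['7900XT'] / rdna3['7900XTX']:.1%}")  # prints "87.5%"

# Generational price jumps quoted in the post:
print(f"3080 -> 4080:      {(1200 - 700) / 700:.0%}")  # prints "71%"
print(f"6800XT -> 7900XT:  {(900 - 650) / 650:.0%}")   # prints "38%"
```

So the second-tier card got a slightly smaller slice of the full chip while its price jumped 38%, and Nvidia's jump is even uglier at 71%, which is the whole "both companies are doing shitty things" point in two numbers.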




Captain_Yuri said:
JEMC said:

Ok, now I'm a bit lost. I know the difference in shaders between the Ada GPUs is so big that we've made those kinds of comments, but is the difference between the XTX and the XT that big? I know one has 12 fewer CUs than the other, but the full Navi 32 GPU allegedly has 60 CUs, putting the 7900XT in the middle of both full chips.

Well, the 6800XT is 72 CU while the 6900XT/XTX is 80 CU. That's why the 7900XT is a 7800XT in disguise: the 7900XT is 84 CU vs the 7900XTX's 96 CU.

The 4080 is a 4070 disguised as a 4080 in paper specs. But since AMD themselves are saying that the 7900XTX is a 4080 competitor, it is a "4080" in the sense that it will realistically compete against the 7900XT, if that makes any sense. Cause the 7900XTX will no doubt outperform the 4080, since the 4080 is just too slow. But the 7900XT will most likely be a close match against the 4080.

So both companies are doing shitty things. Nvidia outright said fu and priced the 4080 at $1200 vs the $700 of the previous 3080, while AMD is being slimy by increasing the price to $900 and calling it a 7900XT when the previous 6800XT was $650. If Nvidia doesn't lower the price of the 4080, it will certainly be the shittiest GPU of the year. It will be a worse buy than the Intel A770 imo.

Ok, so because the difference in CUs between the two 7900s is bigger than the difference between the 6900 and 6800s, you've deduced that the 7900XT should be a lower-tier card than its name says, and that's why it should be the 7800XT. That's it, right?

I don't think it's as easy as that, because the RDNA architecture only has three gens, and AMD has changed the naming and how they've used the different designs in each one of them, versus Nvidia, which has a more reliable history of which tier uses which chips. But I see your point.


