
Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

Radeon RX 6600 XT vs. GeForce RTX 3060, 50 Game Benchmark

This is with SAM enabled, etc.

Remember, the MSRP of the 6600 XT is $379 and the MSRP of the 3060 is $329. In Hardware Unboxed's initial 30-game review of the 6600 XT vs. the 3060, the 6600 XT was 6% faster at 1080p and 3% faster at 1440p. Now the 6600 XT is only 3% faster at 1080p and 2% faster at 1440p.
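To put those numbers in perspective, here's a quick back-of-the-envelope performance-per-dollar comparison at MSRP. Just a sketch: the relative-performance figures are the 50-game averages quoted above, normalized to the 3060.

```python
# Back-of-the-envelope value comparison at MSRP, using the 50-game averages above.
# Relative performance is normalized to the RTX 3060 = 1.00; prices are launch MSRPs.
cards = {
    "RTX 3060":   {"msrp": 329, "perf_1080p": 1.00, "perf_1440p": 1.00},
    "RX 6600 XT": {"msrp": 379, "perf_1080p": 1.03, "perf_1440p": 1.02},
}

for name, c in cards.items():
    # "Performance per dollar", scaled by 1000 just to make the numbers readable.
    value_1080p = c["perf_1080p"] / c["msrp"] * 1000
    value_1440p = c["perf_1440p"] / c["msrp"] * 1000
    print(f"{name}: {value_1080p:.2f} perf/$ (1080p), {value_1440p:.2f} perf/$ (1440p)")
```

At MSRP you're paying about 15% more for a roughly 3% lead, so on paper the 3060 is the better value, street pricing aside.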

It's amusing to see Ampere aging like fine wine while Radeon ages like milk. But muh console optimization!


Intel’s Alchemist desktop graphics cards not until late summer? Driver confusion and competitor reactions

https://www.igorslab.de/en/intels-alchemist-desktop-graphics-cards-but-first-in-summer-driver-clutter-and-reactions-of-competitors/

At least Radeon will have Arc to make them look good. Thank you, Raja.

NVIDIA GeForce RTX 30 Lite Hash Rate (LHR) has been fully unlocked

https://videocardz.com/newz/nvidia-geforce-rtx-30-lite-hash-rate-lhr-has-been-fully-unlocked

AMD Ryzen Threadripper 7000 series with Zen4 architecture rumored to feature up to 96 cores

https://videocardz.com/newz/amd-ryzen-threadripper-7000-series-with-zen4-architecture-rumored-to-feature-up-to-96-cores

Remember when Intel had HEDT?



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:

Intel’s Alchemist desktop graphics cards not until late summer? Driver confusion and competitor reactions

https://www.igorslab.de/en/intels-alchemist-desktop-graphics-cards-but-first-in-summer-driver-clutter-and-reactions-of-competitors/

At least Radeon will have Arc to make them look good. Thank you, Raja.

This is beyond embarrassing. At this point I don't even care about the performance or price of their cards; there's no way I trust them to release new drivers when needed.

Captain_Yuri said:

NVIDIA GeForce RTX 30 Lite Hash Rate (LHR) has been fully unlocked

https://videocardz.com/newz/nvidia-geforce-rtx-30-lite-hash-rate-lhr-has-been-fully-unlocked

Let's hope Ada has some hardware features to block, or at least limit, crypto mining. And if we're very lucky, AMD will do the same.

Captain_Yuri said:

AMD Ryzen Threadripper 7000 series with Zen4 architecture rumored to feature up to 96 cores

https://videocardz.com/newz/amd-ryzen-threadripper-7000-series-with-zen4-architecture-rumored-to-feature-up-to-96-cores

Remember when Intel had HEDT?

Keep in mind that so far we've only had word about Threadripper PRO, not the "regular" lineup, just like the last 5000 series, which was PRO-only.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

42" LG C2 vs Alienware QD-OLED Monitor - 19 Differences in 10 Minutes!

The conclusion is that if you are a PC gamer, you should get the Alienware QD-OLED, whereas if you are a console gamer, you should get the LG C2. The biggest issue with using the LG C2 as a PC gaming monitor is the auto-dimming feature. One of the best examples of this I have seen is in this video:

Skip to 7:58. Essentially, the TV will auto dim to very low brightness, and the moment you switch tabs, boom, it's back to the original brightness. You can somewhat mitigate it by lowering the overall brightness, but that's not ideal. You can disable it completely through the service menu, but you could lose your warranty and you need to buy a special service remote. Whereas the Alienware has none of that, but it does have shitty HDMI ports, so anything that isn't a PC is going to have a bad time.

But it is exciting to see a PC monitor going head to head with the best TVs out there, instead of the past few years where it was like, buy an OLED 'cause PC monitors in that price range are hot garbage in comparison. I think it's better to wait maybe another year and hope that Nvidia updates that G-Sync module to HDMI 2.1 and DP 2.0.


AMD Next-Gen 5nm Ryzen 7000 Desktop CPUs For AM5 Platform Allegedly Launch As Early As September 2022

https://wccftech.com/amd-ryzen-7000-5nm-zen-4-desktop-cpus-am5-platform-september-launch-rumor/

ASUS x Noctua GeForce RTX 3080 Graphics Card Leaks Out, Features A Quad-Slot Cooler

https://wccftech.com/asus-x-noctua-geforce-rtx-3080-oc-graphics-card-quad-slot-cooling/

They used the 10GB 3080 model instead of the 12GB one and it also costs $1,214. Seeing as how you can find a 3080 Ti for that price, sadly, there isn't much point in getting one.


AMD Radeon RX 6950XT Equips The Navi 21 KXTX GPU With Samsung & Hynix 18 Gbps Memory Support, Up To 400W TBP

https://wccftech.com/amd-radeon-rx-6950-xt-navi-21-kxtx-gpu-samsung-hynix-18-gbps-memory-400w-tbp/

Looks like AMD is also raising its TDP numbers.


Intel “Alder Lake-HX” specifications leaked, up to 16-cores with overclocking and PCIe 5.0 support for enthusiast-class laptops

https://videocardz.com/newz/intel-alder-lake-hx-specifications-leaked-up-to-16-cores-with-overclocking-and-pcie-5-0-support-for-enthusiast-class-laptops

157 Watt turbo you say...

Last edited by Jizz_Beard_thePirate - on 08 May 2022

                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Those processors don't make sense in a laptop, even less in a gaming one, because between that and the GPU, the battery will last next to nothing.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

>Intel “Alder Lake-HX” specifications leaked, up to 16-cores with overclocking and PCIe 5.0 support for enthusiast-class laptops

What the fuck are they doing? I can't imagine how thick the laptops designed around this CPU will be, and the battery will be interesting; I'm expecting it to be as big as a portable printer's.




Intel just wants the performance crown at any cost, even if the package itself is pretty unreasonable. And since AMD wants the performance crown as well, they are also raising their TDP figures, though granted, still much more efficiently than Intel. It's one of the few areas where Apple is showing how building an ARM CPU can truly revolutionize the laptop space.

But once you plug it in, Intel and AMD CPUs rofl-stomp Apple:

It's one of the reasons I hope Nvidia can bring their future ARM CPUs to the consumer laptop space, so that we don't have to deal with Apple and their nonsense if we want that efficiency. If Nvidia can make an ARM CPU that's competitive against x86 CPUs while delivering efficiency similar to Apple's M1, it would be insane!

But that's just a pipe dream for now, and x86 on 5nm should, in theory, close the gap with the M1 in the laptop CPU space.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Does it have so much more battery life because it can't run The Witcher?



 "I think people should define the word crap" - Kirby007

Join the Prediction League http://www.vgchartz.com/predictions

Instead of seeking to convince others, we can be open to changing our own minds, and seek out information that contradicts our own steadfast point of view. Maybe it’ll turn out that those who disagree with you actually have a solid grasp of the facts. There’s a slight possibility that, after all, you’re the one who’s wrong.

Nah, that's the YouTube test. It can technically run The Witcher through CrossOver, which is kind of like Proton for Mac. But he didn't test it.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Just think about it, it would explain the lower power draw when comparing running a game vs. running idle lmao



 "I think people should define the word crap" - Kirby007

Join the Prediction League http://www.vgchartz.com/predictions

Instead of seeking to convince others, we can be open to changing our own minds, and seek out information that contradicts our own steadfast point of view. Maybe it’ll turn out that those who disagree with you actually have a solid grasp of the facts. There’s a slight possibility that, after all, you’re the one who’s wrong.

Nah you can watch the videos and see:

YouTube/video playback tests don't really stress the CPU/GPU much, so it may as well be idle, and ARM CPUs are super efficient at idle and even under load compared to both Intel and AMD CPUs. However, once you start pushing the GPU, MacBooks get less and less efficient against Nvidia GPUs: most gaming tests put the M1 Max close to a 3060 (and the M1 Pro close to a 1650 Ti Max-Q), and when you run a gaming test with the M1 Max on battery, it only lasts about an hour under a full gaming load.

https://www.notebookcheck.net/Apple-MacBook-Pro-16-2021-M1-Max-Laptop-Review-Full-Performance-without-Throttling.581437.0.html
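For a rough sanity check on that one-hour figure, the arithmetic works out; the battery capacity and sustained draw below are my own assumed ballpark numbers, not measurements from the review:

```python
# Rough battery-life estimate for gaming on battery.
# Assumed ballpark values (not taken from the review):
battery_wh = 100.0       # 16" MacBook Pro battery is roughly 100 Wh
sustained_draw_w = 90.0  # assumed total system draw under a full gaming load, in watts

runtime_hours = battery_wh / sustained_draw_w
print(f"Estimated runtime: {runtime_hours:.1f} hours")  # ~1.1 hours, in line with the ~1 hour observed
```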

Essentially, the power efficiency when it comes to the M1 CPUs:

Vs Apple's attempt at their own GPUs against Nvidia GPUs:

Last edited by Jizz_Beard_thePirate - on 09 May 2022

                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850