
Forums - PC - Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

Tbh they should give Halo to id Software after they're done with Infinite. The franchise has been hanging on and had gotten stale even before Bungie left and 343 took over. On top of that, the recent games have clearly been mismanaged and mishandled. I'm not even sure why or how 343 is still in charge of the flagship franchise.


As for Cyberpunk, I actually enjoyed it lol. Could it have been better? Yeah. Was it a bad game? I don't think so. Unless you're playing on console.

Last edited by hinch - on 21 August 2021


Tbh I haven't really been following the campaign and have no interest in the franchise anymore. I played an hour of Halo 4's campaign and that was that. Reach was the last Halo game I played and completed, and its MP too. But yeah, co-op and Forge aren't needed at launch; they just need to release a decent game deserving of the Halo name.

I'd probably get into Halo again if it was rebooted by id. Otherwise, pass. Though I'm hoping Infinite is good and works out... and from the MP previews, it looks decent for fans of the series.



Imo 343 has been given enough chances already to prove themselves, and they've stumbled way too much. The problem isn't just that it doesn't have co-op or Forge; the problem is that it doesn't have those things even after being delayed for a year. What did they do this entire time? Spend their money on hookers? This game was supposed to be ready last year. They didn't even show any single-player gameplay this time around, which shows just how confident they are in their execution. These are the devs that thought it was acceptable to showcase a Halo game that looked early Xbox One in terms of graphics as a launch title for a next-gen console. Sony is doing cross-gen too, and Horizon looks incredible. Hell, even Spider-Man looks better. Why doesn't this flagship game look just as good?

If this were 343's first time, I could be like, sure, let's give them a chance. But this is like their 4th time. They goofed MCC hard during the initial Xbox One launch. They goofed Halo 5. They goofed Halo Infinite during its initial showing, and now, even after being delayed a whole year, they have barely anything to show for it. The multiplayer looks good, sure, but you would think that if the single player was up to snuff, they would showcase it.

We will see what happens but 343 has been given too many chances with Halo imo. This was once Microsoft's Mario. Imagine if Nintendo fucked up Mario for over a decade while Sega made Sonic games that were better. That would be insane! But that's what 343 has done with Halo.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Intel Core i9-12900K 16 Core Alder Lake CPU Benchmarked on ASUS ROG STRIX Z690-E Gaming WIFI Motherboard, Faster Than Core i9-11900K

https://wccftech.com/intel-core-i9-12900k-16-core-desktop-cpu-benchmarked-asus-rog-strix-z690-e-gaming-wifi-motherboard-faster-than-amd-ryzen-9-5950x/

Now these are early samples, but man, if this is true, Alder Lake S is gonna be a flop. The benchmark clearly isn't taking advantage of the 5950X's 16 cores / 32 threads, given how close it sits to an 11900K. So the fact that the i9-12900K is only this close means it's in big trouble! We'll see what happens when the CPU actually comes out, but if this is Intel's answer to AMD... this ain't it, chief!

Intel DG2-512 specs compared to nVidia GA104 and AMD Navi 22 (as well DG2-128 & DG2-256)

https://www.reddit.com/r/hardware/comments/p95nyq/intel_dg2512_specs_compared_to_nvidia_ga104_and/

Now I wouldn't compare TFLOPS between these GPUs, as cross-vendor TF comparisons are meaningless. But it does give us an interesting look at the spec differences. I do think RDNA 2 is lacking in ML performance, but we already knew that from DF. The ray tracing performance is something I'll be interested in seeing.
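For anyone wondering where those TF numbers even come from: peak FP32 TFLOPS is just shader count x clock x 2 (an FMA counts as two FLOPs), which is exactly why the figure is useless across architectures. A quick sketch using rough public boost-clock specs (approximate paper numbers, not measured performance):

```python
def tflops(shaders: int, clock_ghz: float, flops_per_clock: int = 2) -> float:
    """Peak FP32 TFLOPS = shaders x clock (GHz) x FLOPs per clock / 1000."""
    return shaders * clock_ghz * flops_per_clock / 1000

# Rough public specs (approximate boost clocks):
print(round(tflops(5888, 1.725), 1))  # GA104 / RTX 3070 -> ~20.3
print(round(tflops(2560, 2.581), 1))  # Navi 22 / RX 6700 XT -> ~13.2
```

Navi 22 "loses" by ~7 TF on paper yet trades blows with GA104 in raster, which is the whole point: the metric says nothing about how well an architecture turns peak FLOPS into frames.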

Last edited by Jizz_Beard_thePirate - on 22 August 2021

                  


Captain_Yuri said:
Cyran said:

Intel is 100% going to support AVX-512 on Sapphire Rapids. There's no other reason why Golden Cove would have AVX-512 on it. Why make a core with AVX-512 if you're not going to support it on anything?

The question is whether AMD will support AVX-512 on Ryzen or disable it on Zen 4, just like Intel is disabling it on Golden Cove for Alder Lake.

I'd be surprised if AMD disables it in Zen 4 as that would be a pretty easy selling point even if many consumers may not get much use out of it.

AVX-512 is certainly irrelevant today, but who knows in a couple of years?

For instance, Phenom-based processors couldn't launch some games after 2015 because those games required SSE4.1 and/or SSE4.2, while Phenom only had the AMD-exclusive SSE4a. Yet back in 2009, SSE4 was still barely used even in HPC.

Either way, it would be funny if in a couple of years some game made use of AVX-512 and then ran on Rocket Lake, but not on its immediate successors...
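To illustrate the point: a game that uses SSE4.1/4.2 without checking first simply crashes on a CPU that lacks them, which is why launchers check the flags up front. A minimal sketch of that check, parsing a /proc/cpuinfo-style flags line (Linux convention; the sample string below is made up, and real dumps repeat the flags line per core):

```python
def has_flags(cpuinfo_text: str, *wanted: str) -> bool:
    """Return True if every wanted ISA flag appears in the first
    'flags' line of a /proc/cpuinfo-style dump."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            present = set(line.split(":", 1)[1].split())
            return all(f in present for f in wanted)
    return False

# Hypothetical flags line from a Zen 3 box (no AVX-512):
sample = "flags\t: fpu sse sse2 sse4_1 sse4_2 avx avx2"
print(has_flags(sample, "sse4_1", "sse4_2"))  # True
print(has_flags(sample, "avx512f"))           # False
```

On a Phenom, the `sse4_1`/`sse4_2` check would fail (it only reports `sse4a`), which is exactly the post-2015 game situation described above.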

Captain_Yuri said:

Intel Core i9-12900K 16 Core Alder Lake CPU Benchmarked on ASUS ROG STRIX Z690-E Gaming WIFI Motherboard, Faster Than Core i9-11900K

https://wccftech.com/intel-core-i9-12900k-16-core-desktop-cpu-benchmarked-asus-rog-strix-z690-e-gaming-wifi-motherboard-faster-than-amd-ryzen-9-5950x/

Now these are early samples, but man, if this is true, Alder Lake S is gonna be a flop. The benchmark clearly isn't taking advantage of the 5950X's 16 cores / 32 threads, given how close it sits to an 11900K. So the fact that the i9-12900K is only this close means it's in big trouble! We'll see what happens when the CPU actually comes out, but if this is Intel's answer to AMD... this ain't it, chief!

We clearly need more test samples to see the full picture, but since Alder Lake is supposed to be released in October, I doubt much will change over what we have right now in the leaks.

So for the high end, I expect Intel to revive its HEDT platform with Sapphire Rapids and fight both the 5950X and the Threadripper 5960X and 5970X with that one, while Alder Lake will compete with the 5900X at best.

And that's all without the boost AMD is about to give their CPUs with the V-Cache they'll be getting for their refresh...

Captain_Yuri said:

Intel DG2-512 specs compared to nVidia GA104 and AMD Navi 22 (as well DG2-128 & DG2-256)

https://www.reddit.com/r/hardware/comments/p95nyq/intel_dg2512_specs_compared_to_nvidia_ga104_and/

Now I wouldn't compare TFLOPS between these GPUs, as cross-vendor TF comparisons are meaningless. But it does give us an interesting look at the spec differences. I do think RDNA 2 is lacking in ML performance, but we already knew that from DF. The ray tracing performance is something I'll be interested in seeing.

It would have been interesting to know the TGP or TBP of Arc, to see whether it can keep up in that regard or runs very hot.

Captain_Yuri said:
Cyran said:

Because if you're going to let people run AVX-512, you're also going to have to support the power consumption and heat it requires. Good article by AnandTech on just how much more power is required when running AVX-512 workloads:

https://www.anandtech.com/show/16495/intel-rocket-lake-14nm-review-11900k-11700k-11600k/5

That's true. Zen 4's 16-core part is rumoured to have a TDP of 170 watts on TSMC's 5nm, up from the 5950X's 105-watt TDP. If they end up disabling AVX-512, then they really need to make sure those big cores can beat Raptor Lake handily, which should be their direct competition. Especially as Intel is going DDR5 and PCIe Gen 5 while Zen 4 is rumoured to get DDR5 but only PCIe Gen 4.

It will be interesting to see how it all plays out.

I'm expecting that 170W chip to be something more special.

Either AMD starts rating TDP like on its server chips (where a part rated at 280W also doesn't consume more than that; a 5950X at full core utilization draws more like 145W, and adding the extra cache coming with the refresh already gets you close to 170W in real terms), or AMD is planning a chip that's actually a full 7950X (6950X for the upcoming refresh) plus a relatively powerful mobile GPU chiplet in the 16-32 CU range, as the big cache they're putting on top of the chips with the upcoming refresh would also work really well for an integrated GPU.

Last edited by Bofferbrauer2 - on 22 August 2021

Around the Network
Bofferbrauer2 said:
Captain_Yuri said:

I'd be surprised if AMD disables it in Zen 4 as that would be a pretty easy selling point even if many consumers may not get much use out of it.

AVX-512 is certainly irrelevant today, but who knows in a couple of years?

For instance, Phenom-based processors couldn't launch some games after 2015 because those games required SSE4.1 and/or SSE4.2, while Phenom only had the AMD-exclusive SSE4a. Yet back in 2009, SSE4 was still barely used even in HPC.

Either way, it would be funny if in a couple of years some game made use of AVX-512 and then ran on Rocket Lake, but not on its immediate successors...

Captain_Yuri said:

Intel Core i9-12900K 16 Core Alder Lake CPU Benchmarked on ASUS ROG STRIX Z690-E Gaming WIFI Motherboard, Faster Than Core i9-11900K

https://wccftech.com/intel-core-i9-12900k-16-core-desktop-cpu-benchmarked-asus-rog-strix-z690-e-gaming-wifi-motherboard-faster-than-amd-ryzen-9-5950x/

Now these are early samples, but man, if this is true, Alder Lake S is gonna be a flop. The benchmark clearly isn't taking advantage of the 5950X's 16 cores / 32 threads, given how close it sits to an 11900K. So the fact that the i9-12900K is only this close means it's in big trouble! We'll see what happens when the CPU actually comes out, but if this is Intel's answer to AMD... this ain't it, chief!

We clearly need more test samples to see the full picture, but since Alder Lake is supposed to be released in October, I doubt much will change over what we have right now in the leaks.

So for the high end, I expect Intel to revive its HEDT platform with Sapphire Rapids and fight both the 5950X and the Threadripper 5960X and 5970X with that one, while Alder Lake will compete with the 5900X at best.

And that's all without the boost AMD is about to give their CPUs with the V-Cache they'll be getting for their refresh...

Captain_Yuri said:

Intel DG2-512 specs compared to nVidia GA104 and AMD Navi 22 (as well DG2-128 & DG2-256)

https://www.reddit.com/r/hardware/comments/p95nyq/intel_dg2512_specs_compared_to_nvidia_ga104_and/

Now I wouldn't compare TFLOPS between these GPUs, as cross-vendor TF comparisons are meaningless. But it does give us an interesting look at the spec differences. I do think RDNA 2 is lacking in ML performance, but we already knew that from DF. The ray tracing performance is something I'll be interested in seeing.

It would have been interesting to know the TGP or TBP of Arc, to see whether it can keep up in that regard or runs very hot.

Another funny thing is that if Zen 4 does have AVX-512, all those benches Intel was ahead in thanks to AVX-512 support will be reversed.

I'm personally not feeling Alder Lake S anymore, as there were early leaks of Rocket Lake before it launched and the performance wasn't much different post-launch. Alder Lake could be a real hot mess when it comes out, maybe even worse than Zen 1: new memory, new PCIe, big.LITTLE scheduling, etc. I'm sure tons of Windows and BIOS updates are going to be needed to make it all viable. Meanwhile, Zen 3 gets heavily discounted, and Zen 3 V-Cache comes out on the tried-and-true formula of DDR4 and Gen 4 while giving people more performance. And by the time AMD comes out with its own big.LITTLE design, all these issues will be fixed. If Alder Lake's big cores aren't 15-20% faster than Zen 3, Intel is gonna be in big, big trouble.

Also, they do have TDP estimates in the bottom table, but I'd take those with a grain of salt.



                  


Captain_Yuri said:
Bofferbrauer2 said:

AVX-512 is certainly irrelevant today, but who knows in a couple of years?

For instance, Phenom-based processors couldn't launch some games after 2015 because those games required SSE4.1 and/or SSE4.2, while Phenom only had the AMD-exclusive SSE4a. Yet back in 2009, SSE4 was still barely used even in HPC.

Either way, it would be funny if in a couple of years some game made use of AVX-512 and then ran on Rocket Lake, but not on its immediate successors...

Captain_Yuri said:

Intel Core i9-12900K 16 Core Alder Lake CPU Benchmarked on ASUS ROG STRIX Z690-E Gaming WIFI Motherboard, Faster Than Core i9-11900K

https://wccftech.com/intel-core-i9-12900k-16-core-desktop-cpu-benchmarked-asus-rog-strix-z690-e-gaming-wifi-motherboard-faster-than-amd-ryzen-9-5950x/

Now these are early samples, but man, if this is true, Alder Lake S is gonna be a flop. The benchmark clearly isn't taking advantage of the 5950X's 16 cores / 32 threads, given how close it sits to an 11900K. So the fact that the i9-12900K is only this close means it's in big trouble! We'll see what happens when the CPU actually comes out, but if this is Intel's answer to AMD... this ain't it, chief!

We clearly need more test samples to see the full picture, but since Alder Lake is supposed to be released in October, I doubt much will change over what we have right now in the leaks.

So for the high end, I expect Intel to revive its HEDT platform with Sapphire Rapids and fight both the 5950X and the Threadripper 5960X and 5970X with that one, while Alder Lake will compete with the 5900X at best.

And that's all without the boost AMD is about to give their CPUs with the V-Cache they'll be getting for their refresh...

It would have been interesting to know the TGP or TBP of Arc, to see whether it can keep up in that regard or runs very hot.

Another funny thing is that if Zen 4 does have AVX-512, all those benches Intel was ahead in thanks to AVX-512 support will be reversed.

I'm personally not feeling Alder Lake S anymore, as there were early leaks of Rocket Lake before it launched and the performance wasn't much different post-launch. Alder Lake could be a real hot mess when it comes out, maybe even worse than Zen 1: new memory, new PCIe, big.LITTLE scheduling, etc. I'm sure tons of Windows and BIOS updates are going to be needed to make it all viable. Meanwhile, Zen 3 gets heavily discounted, and Zen 3 V-Cache comes out on the tried-and-true formula of DDR4 and Gen 4 while giving people more performance. And by the time AMD comes out with its own big.LITTLE design, all these issues will be fixed. If Alder Lake's big cores aren't 15-20% faster than Zen 3, Intel is gonna be in big, big trouble.

Also, they do have TDP estimates in the bottom table, but I'd take those with a grain of salt.

A reason why Alder Lake doesn't have AVX-512 while Sapphire Rapids, which uses the same CPU cores, does, could be that the Atom cores don't support the instruction set, and a workload would run into trouble if it got bounced from an AVX-512-capable core to one without it. This will probably get resolved on future chips, either by adding the instructions to the Atom cores or through an updated scheduler in Windows and Linux that ensures workloads using specific instructions don't bounce to cores that can't handle them.
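As a rough illustration of the workaround (not how the actual Windows scheduler fix would work): software can pin its thread to cores known to support the instruction set, so the OS can't migrate it onto cores that don't. A Linux-only Python sketch using the real `os.sched_setaffinity` API; which core IDs are the "big" cores is a hypothetical you'd have to look up per chip:

```python
import os

def run_pinned(core_ids, fn, *args):
    """Pin the calling process to the given cores, run fn, then restore
    the old affinity. Keeps e.g. AVX-512 work on cores that support it."""
    old = os.sched_getaffinity(0)
    os.sched_setaffinity(0, set(core_ids))
    try:
        return fn(*args)
    finally:
        os.sched_setaffinity(0, old)

# Demo with the cores we already have; on a real hybrid chip you'd pass
# only the (hypothetical) P-core IDs, e.g. range(8).
print(run_pinned(os.sched_getaffinity(0), sum, range(10)))  # 45
```

The try/finally matters: the affinity mask is process-wide, so you want it restored even if the pinned workload throws.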

Alder Lake's big cores are basically Tiger Lake, and as such only on par with Zen 3. The only way it gets significantly faster is through clock speeds in excess of 5.3 GHz on more than one core, which is probably going to happen more or less, considering its power draw...



Captain_Yuri said:

Alright, I'm going to post some observations from mining with my Strix 3080. Now, I know mining isn't very well liked, but if you're one of those people desperate for a new GPU and willing to pay the scalper price, mining may be the only way to recoup that cost. So here are some things I found interesting after mining for 3-4 months during workdays and making 2-3x what I paid for my Strix 3080 during the crypto boom before the June heatwave.

If you want to get started with mining, Linus Tech Tips has a good starter tutorial on it.

1) Do not start mining without software that manages your GPU configuration, such as NiceHash QuickMiner.

Now there are plenty of ways to mine: you can load up a mining script from a mining pool, use NiceHash, and so on. The issue with mining is that it puts a lot of load on your GPU's VRAM. Most GPUs, however, such as my Strix 3080, are not configured to spin their fans based on VRAM temps. So by default, if you aren't paying attention, your VRAM temps can shoot up to 110C and start throttling, or worse. If you use NiceHash QuickMiner with Lite optimization instead, it will configure your GPU to react to VRAM temps as well as core temps, so when your VRAM temps go up, the fans spin up to bring them down. The Lite optimization will also undervolt/underclock your GPU and overclock your VRAM automatically. You can change these manually, however.
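What QuickMiner-style software effectively does is key the fan curve to VRAM temperature instead of core temperature. A toy sketch of the idea — the breakpoints below are made up for illustration, not QuickMiner's actual curve:

```python
def fan_duty(vram_temp_c: float) -> int:
    """Map VRAM temperature to a fan duty cycle (%).
    Illustrative breakpoints only; tune for your card."""
    if vram_temp_c >= 100:   # throttle/danger zone per the post
        return 100
    if vram_temp_c >= 94:    # typical mining steady state
        return 80
    if vram_temp_c >= 80:
        return 60
    return 40                # idle-ish floor

print(fan_duty(96))   # 80
print(fan_duty(104))  # 100
```

The point of the curve is simply that a default core-temp-driven profile never sees a reason to spin up, since the core sits at a comfortable 50-60C while the VRAM cooks.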

2) Room temperature matters a lot

When you are mining, the VRAM temps will typically sit around 94C on my Strix 3080 with variable fan speeds. Now, technically I could lower that even further with the software, but 94C is within spec, and the fans would probably die sooner spinning at 100% than the VRAM would. It's at 100C or above that issues start to occur. In my experience, a room temp of 25C or lower is the best time to mine (at least with the Strix 3080). If your room temp is above 25C, the fans will be at 100% all the time. And once you go above 30C, the cooling can no longer keep up with the VRAM temps no matter how much airflow there may be. At that point, you would need to replace the thermal pads with more efficient ones, go liquid cooling, or get an AC for your room. But I wouldn't do any of those, as you are trying to save money.

3) Mine as much as you can until you hit the minimum payout

When you start mining, you should monitor your GPU to make sure it's performing as it should: core temps should be 50-60C, VRAM temps can be 94C, and the fans may start at 100% but eventually drop to 50-60% and then vary with the load. Once that happens, I suggest mining with your GPU as much as possible until you hit the minimum payout. For NiceHash, that's 0.001 BTC, which with a Strix 3080 takes about a week of running continuously. The reason is that the crypto market can change at any point, and the last thing you want is to not get paid. Once you hit that minimum payout, you can do whatever you want. Running at 100% load may not sound like a good idea, but remember that these GPUs have a 3-5 year warranty depending on the manufacturer, and most of them are scalping you anyway.
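The "about a week" figure is just the payout threshold divided by your daily earnings. A quick sanity check — the daily BTC rate below is a hypothetical number for a 3080 at the time, not a quote from NiceHash:

```python
def days_to_payout(threshold_btc: float, daily_btc: float) -> float:
    """Days of continuous mining needed to reach the minimum payout."""
    return threshold_btc / daily_btc

# 0.001 BTC NiceHash minimum, assumed ~0.00014 BTC/day for a 3080:
print(round(days_to_payout(0.001, 0.00014), 1))  # ~7.1 days
```

Worth rerunning with your own observed daily rate, since difficulty and coin price move that denominator around constantly.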

I may post some additional stuff some other time.

Mining back in Bogota (or Spain during Winter) would've been viable... here in Medellin... not so much... unless I leave my rig outdoors. Either that or using the AC and that kind of kills the point. 



Weren't there some rumors earlier this year that Zen 4 processors would feature an iGPU? That could explain the extra power of the new chips, with most of it going to that. The thing is, I don't remember whether those were the rumors that were proven false or not.

As for the leaked performance of Alder Lake... well, it's not great, but I think this kind of hybrid processor will need help from Microsoft, with a Windows update that improves thread scheduling and such. We'll have to wait until it launches to see whether Microsoft ships a Win10 update that increases the performance of those chips.



Please excuse my bad English.

Former gaming PC: i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Current gaming PC: R5-7600, 32GB RAM 6000MT/s (CL30) and a RX 9060XT 16GB

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.