PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850
The Asus ROG PG32UQX now has a product page. It's their 4K, 144 Hz, 32-inch HDR monitor.
https://rog.asus.com/monitors/32-to-34-inches/rog-swift-pg32uqx-model/
The kicker is that the HDMI ports are only HDMI 2.0 instead of 2.1. If you only play on PC you'll still get all the goodies through DisplayPort, but if you also have consoles it won't be a good experience. Since this is probably going to be very expensive, no thanks Asus...
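To put numbers on why HDMI 2.0 is a problem for consoles: a rough back-of-the-envelope check (ignoring blanking overhead, so real requirements are even higher) shows 4K 120 Hz doesn't fit in HDMI 2.0's effective bandwidth. The effective-rate constants below come from the link rates and coding schemes of each HDMI version:

```python
# Rough uncompressed video data-rate check (blanking intervals ignored,
# so real-world requirements are somewhat higher than this).
def raw_gbps(width, height, refresh_hz, bits_per_pixel):
    """Raw pixel data rate in Gbit/s, before blanking overhead."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# HDMI 2.0 carries ~14.4 Gbit/s of video data (18 Gbit/s link, 8b/10b coding);
# HDMI 2.1 carries up to ~42.7 Gbit/s (48 Gbit/s link, 16b/18b coding).
HDMI_2_0_EFFECTIVE = 14.4
HDMI_2_1_EFFECTIVE = 42.7

uhd120 = raw_gbps(3840, 2160, 120, 24)   # 4K 120 Hz, 8-bit RGB
print(f"4K120 needs at least {uhd120:.1f} Gbit/s raw")
print("fits HDMI 2.0:", uhd120 <= HDMI_2_0_EFFECTIVE)   # False
print("fits HDMI 2.1:", uhd120 <= HDMI_2_1_EFFECTIVE)   # True
```

So a PS5 or Series X targeting 4K120 simply can't feed this panel at full rate over HDMI 2.0.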
Apple Orders Initial 4nm Chip Production for Next-Generation Macs
https://www.macrumors.com/2021/03/30/4nm-chips-for-apple-silicon-macs/
Good, because as Apple moves to 4nm, AMD will get more 5nm supply for its next-gen products.
GeForce GPU Passthrough for Windows Virtual Machine (Beta) (with Driver 465 or later)
https://nvidia.custhelp.com/app/answers/detail/a_id/5173
"NVIDIA has enabled GPU passthrough beta support for a Windows virtual machine on GeForce GPUs. What does this mean?
With virtualization enabled, GeForce customers on a Linux host PC can now enable GeForce GPU passthrough on a virtual Windows guest OS. There are a few GeForce use cases where this functionality is beneficial such as:
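As an illustration (this is not from NVIDIA's note), passthrough on a Linux host is typically wired up by binding the GPU to vfio-pci and handing it to the guest, for example with a libvirt hostdev entry like this sketch. The PCI address here is a placeholder; you'd use the one from your own `lspci` output:

```xml
<!-- Hypothetical libvirt domain fragment: pass host PCI device 01:00.0
     (the GeForce GPU) through to the Windows guest. Adjust the address
     to match your own hardware. -->
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <address domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
  </source>
</hostdev>
```

With this driver change, the Windows guest's GeForce driver no longer refuses to load when it detects a VM, which is what made this setup painful before.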
[VideoCardz] AMD confirms Navi 23 has 32MB of Infinity Cache while VanGogh APU lacks it
Most likely for the 6600 series.
Rocket Lake doubles the power consumption while being less performant than the equivalent Ryzen SKU. Intel needs to release Alder Lake ASAP and leave behind the relic that is 14nm.
Performance-wise I can see Intel bringing the gaming crown back with higher clocks and IPC on a much better node. But Intel has been on a losing streak lately, so who knows.
So the summary of Intel's new chips is that the i7 and i9 are useless, and only the i5 11600 may be worth buying as long as it's cheaper than AMD's 5600, but you'll need to buy a better CPU cooler, which may offset the price difference between the CPUs.
Alder Lake can't come soon enough for Intel.
Please excuse my bad English.
Former gaming PC: i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070
Current gaming PC: R5-7600, 32GB RAM 6000MT/s (CL30) and a RX 9060XT 16GB
Steam / Live / NNID : jonxiquet Add me if you want, but I'm a single player gamer.
Yeah, in a lot of ways this is Intel's Bulldozer moment. Their i9s and i7s are slower than AMD's 5900X and 5800X. Their 8-core, 16-thread i9 is more expensive than the 12-core, 24-thread 5900X while consuming more power and generating more heat. And in some games and workloads, the i9-10900K can beat the i9-11900K thanks to its additional 2 cores and 4 threads.
The only one anyone can recommend is the i5, but when you consider that you'd need a more expensive Z-series motherboard to overclock it, while a B-series board on AMD can overclock the Ryzen 5 just fine, the value goes down. Not to mention the 10th-gen i5s, i7s and i9s have been going on crazy sales if you're looking for the best bang-for-buck gaming build.
Crazy times, guys. You know it's fucked for Intel when Nvidia is using AMD CPUs in their gaming benchmarks to show improvements! 2021 is the year of Ryzen, as Alder Lake won't be out until Q4.
Let's hope Alder Lake is a turning point for Intel and it's good enough to compete with Zen 4. Just like we need competition in the GPU market, we also need it in the CPU market.
DDR5 is being mass-produced in China. So it should be ready by Q3?
https://videocardz.com/newz/china-starts-mass-production-of-ddr5-memory
Probably. It needs to be ready by the time Alder Lake launches.
Yeah, that makes sense. It will be interesting to see whether there are any tangible performance increases. Usually the initial kits of a new generation come with huge latency costs versus more mature DDR.
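The latency cost is easy to quantify: true CAS latency in nanoseconds is CL cycles at the I/O clock, which runs at half the transfer rate. Using illustrative kits (a typical mature DDR4-3200 CL16 versus an early JEDEC-style DDR5-4800 CL40; the specific timings are example numbers, not from this thread):

```python
def cas_ns(cl, mts):
    """True CAS latency in nanoseconds: CL clock cycles at the I/O clock,
    which ticks at half the transfer rate (double data rate)."""
    return cl * 2000 / mts

# Illustrative kits: mature DDR4 vs. an early DDR5 launch part
ddr4 = cas_ns(16, 3200)   # 10.0 ns
ddr5 = cas_ns(40, 4800)   # ~16.7 ns
print(f"DDR4-3200 CL16: {ddr4:.1f} ns, DDR5-4800 CL40: {ddr5:.1f} ns")
```

So despite the 50% higher transfer rate, the early DDR5 part is actually slower to start returning data, which is exactly the pattern every new DDR generation has shown at launch.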
A Russian YouTuber shows off a modded 3070 with 16GB of VRAM and gets FPS gains at higher resolutions, on top of better 1% lows.
https://wccftech.com/nvidia-geforce-rtx-3070-16-gb-modded-graphics-card-benchmarked-in-games/
Nvidia should have given it at least 10GB imo.
Last edited by hinch - on 31 March 2021
The difference looks pretty minimal, but I do agree that the more VRAM the better. I think it all comes down to cost. If they added more VRAM, the 3070 would cost more than $500, maybe $550 at least, which would be approaching 6800 territory. My theory is that they're saving the 3070 Ti for that, with additional CUDA cores and more VRAM to really make a big difference. We'll see though.
Also, because of the memory interface they can't do 10GB: they would either have to give it a narrower bus, which would run into memory-bandwidth issues, or move to a wider one, which would increase the cost a lot.
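The constraint is just arithmetic: each GDDR6 chip has a 32-bit interface, so the bus width fixes the chip count, and capacity is chip count times per-chip density. A quick sketch (the 3070/3080 figures are the real configurations; the function itself is just illustrative):

```python
def vram_gb(bus_width_bits, chip_gb):
    """VRAM capacity from bus width and per-chip density: each GDDR6
    chip uses a 32-bit interface, so chips = bus width / 32."""
    chips = bus_width_bits // 32
    return chips * chip_gb

# RTX 3070: 256-bit bus populated with 1 GB (8 Gbit) chips -> 8 GB
print(vram_gb(256, 1))    # 8
# The 16 GB mod swaps in 2 GB chips on the same 256-bit bus
print(vram_gb(256, 2))    # 16
# 10 GB with uniform chips would need a 320-bit bus (like the 3080)
print(vram_gb(320, 1))    # 10
```

So with 1 GB and 2 GB as the available GDDR6 densities, a 256-bit card can only be 8 GB or 16 GB; 10 GB would mean redesigning around a 320-bit bus.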
Last edited by Jizz_Beard_thePirate - on 31 March 2021