NVIDIA GeForce RTX 4090 Graphics Card Allegedly Delivers Over 160 FPS In Control With RT & DLSS at 4K, 2x Performance Gain Over RTX 3090
Tbh, Control running at RT Ultra and 160 FPS with an unknown DLSS setting isn't the real story. The real story is the 4090 reportedly being able to run Control at 4K native at 90 FPS with ray tracing set to Ultra.
This should mean a 4080 will most likely manage 4K 60 FPS native with ray tracing set to Ultra, while the 4070 Ti and below will be similar to Ampere, where it's 4K DLSS or 1440p native for games with ray tracing set to Ultra. But I wouldn't be surprised if some games like Cyberpunk still require DLSS in Quality mode to get above 60 FPS when ray tracing is enabled.
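To make the guesswork above concrete, here's a quick back-of-the-napkin sketch. The 90 FPS figure is the rumored 4090 number from the article; the relative-performance scalers for the 4080 and 4070 Ti are pure assumptions of mine, not benchmarks.

```python
# Rumored Control 4K native, RT Ultra figure for the 4090 (from the article)
rtx_4090_native_fps = 90

# Guessed relative RT performance vs the 4090 -- assumptions, not measured data
relative_perf = {
    "RTX 4080": 0.75,     # assumption
    "RTX 4070 Ti": 0.60,  # assumption
}

# Scale the 4090 figure down and see which tiers would clear 60 FPS without DLSS
estimates = {}
for gpu, scale in relative_perf.items():
    est = rtx_4090_native_fps * scale
    estimates[gpu] = est
    verdict = "native OK" if est >= 60 else "likely needs DLSS"
    print(f"{gpu}: ~{est:.0f} FPS -> {verdict}")
```

Under those made-up scalers, the 4080 lands just above 60 FPS native while the 4070 Ti falls under it, which matches the tiering I'd expect.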
AIDA64 update brings AMD Zen 4 optimized AVX-512 and AVX2 benchmarks
https://videocardz.com/newz/aida64-update-brings-amd-zen-4-optimized-avx-512-and-avx2-benchmarks
Poor Intel, removing AVX-512 just when Ryzen started implementing it.
AMD EPYC 9654 “Genoa” 96 Core CPU Benchmarked In Cache & Memory Benchmark – Over 2x Inter-Cache Bandwidth Versus Milan-X & Up To 30 TB/s L1 Transfer Rates
AMD Ryzen 7000 6-core CPU with 4.4 GHz clock and Gigabyte X670E motherboard spotted together
One thing I do hope is that with Ryzen 7000, AMD lowers the price of their 6-core CPUs to sub-$250, assuming the 13600K costs around $300. The main reason is that the 13600K is going to give you 14 cores (6P + 8E), whereas Ryzen only gives you 6 cores / 12 threads. So imo, it's about time we go back to the 8-cores-16-threads-for-around-$300 era.
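Putting some rough numbers on that value argument: the $300 / 14-core (20-thread) 13600K figures come from the post, while the sub-$250 Ryzen 6-core is the hoped-for price, not an announced one.

```python
# Prices are the post's guesses, not MSRPs
cpus = {
    "Core i5-13600K": {"price": 300, "cores": 14, "threads": 20},  # 6P + 8E
    "Ryzen 7000 6-core": {"price": 250, "cores": 6, "threads": 12},  # hoped-for price
}

# Compare cost per core and per thread across the two hypothetical configs
for name, c in cpus.items():
    per_core = c["price"] / c["cores"]
    per_thread = c["price"] / c["threads"]
    print(f"{name}: ${per_core:.2f}/core, ${per_thread:.2f}/thread")
```

Even at $250, the Ryzen part comes out well behind on cost per core, which is the gap the post is complaining about.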
PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850