vivster said:
I think the jury is still out on that. What I've read says that a 2080 Ti barely saturates x8. I will certainly wait for Ampere benchmarks before I make a decision on that. Here is a nice test https://www.igorslab.de/en/pcie-4-0-and-pcie-3-0-different-between-x8-andx16-with-the-fast-fastest-cards-where-the-bottle-neck-begins/ Overall there is no noticeable difference between PCIe 3 x8 and x16 on the highest-end GPUs under heavy loads. Also, the required bandwidth seems to go down at higher resolutions; that's probably because of the lower fps.
If you think the high resolutions are the heavy loads, think again. All they do is bring you into a more CPU-limited scenario, which is why the gap shrinks.
The 720p results are thus more representative of what's to come, and there x8 already lags by 6.3% in average and 7.4% in minimum FPS.
But this is not just about FPS: a chip as huge as a 3080, 3090 or Big Navi (with 80 CUs) will also be fed tons of draw calls, which increases the overhead a lot. And if you look at the frametimes, you'll see that x8 has more and higher spikes than x16, resulting in microstutter that the raw FPS numbers won't show. Future games will make better use of more cores, which will only accentuate the problem further.
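To see why lower fps lowers the bandwidth requirement, a quick back-of-envelope sketch helps: bus traffic scales with frames per second, while the link capacity is fixed. The per-lane PCIe 3.0 throughput (~0.985 GB/s after 128b/130b encoding) is a real figure; the 30 MB/frame of traffic is a made-up illustrative number, not from Igor's test.

```python
PCIE3_LANE_GBPS = 0.985  # usable GB/s per PCIe 3.0 lane (8 GT/s, 128b/130b)

def link_bandwidth(lanes):
    """Theoretical usable bandwidth of a PCIe 3.0 link in GB/s."""
    return PCIE3_LANE_GBPS * lanes

def required_bandwidth(mb_per_frame, fps):
    """Bus traffic in GB/s if every frame moves mb_per_frame MB over PCIe."""
    return mb_per_frame * fps / 1000

print(f"x8 :  {link_bandwidth(8):.2f} GB/s")   # ~7.88 GB/s
print(f"x16: {link_bandwidth(16):.2f} GB/s")   # ~15.76 GB/s

# Same hypothetical 30 MB/frame of traffic at 144 fps (720p) vs 60 fps (4K):
print(f"144 fps: {required_bandwidth(30, 144):.2f} GB/s")  # 4.32 GB/s
print(f" 60 fps: {required_bandwidth(30, 60):.2f} GB/s")   # 1.80 GB/s
```

So a high-fps low-resolution run pushes far more traffic through the bus than a 4K run at half the frame rate, which is exactly why the x8 gap shows up at 720p first.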