ArchangelMadzz said:

Yeah, but obviously that requires more money. Plus you're talking about future tech that doesn't exist today, so you could make the argument "Well, a PS7 could run 8K, so there aren't constraints", but it doesn't exist yet, and we don't even have the display bandwidth standards to go higher than 8K 60fps.

Indeed.
However, multi-monitor gaming is a thing which allows you to take six displays and "stitch them together", getting around the single-display bandwidth issue for 16K and higher resolutions.
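To put rough numbers on why the stitching helps, here's a back-of-the-envelope Python sketch. It assumes 10-bit RGB (30 bits per pixel) and ignores blanking overhead, which adds another 10-20% in practice:

```python
# Raw, uncompressed pixel data rate for a given display mode.
# Assumes 10-bit RGB (30 bits/pixel); blanking overhead is ignored.

def gbps(width, height, fps, bits_per_pixel=30):
    """Pixel data rate in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

for name, w, h in [("4K", 3840, 2160), ("8K", 7680, 4320), ("16K", 15360, 8640)]:
    print(f"{name} @ 60Hz: {gbps(w, h, 60):6.1f} Gbit/s")

# 4K  @ 60Hz:  ~14.9 Gbit/s
# 8K  @ 60Hz:  ~59.7 Gbit/s  -> already needs compression (DSC) on today's links
# 16K @ 60Hz: ~238.9 Gbit/s  -> far beyond any single cable, hence splitting
#                               the image across multiple displays and links
```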

ArchangelMadzz said:

1440p is the best resolution for PC in a value sense, but now that it's possible to play games at max settings at 4K 60fps, the people who want to do that can.

Don't get me wrong, if that is what someone wants to do, go whole hog.

Random_Matt said:
Kristof81 said:
I don't understand why people attempt to make purchase decisions knowing so little about the products themselves. Don't fall into that marketing trap and just wait until everything's released and properly benchmarked, including Big Navi.

Unless they counter GDDR6X with HBM, I would simply skip it. They will probably release a card between the 3080 and 3070; AMD are MIA at the mo and probably crying.

Wish they'd sell the GPU business to Samsung or something; their division is so low on cash and it shows every time.

The big issue with GDDR6X is actually DRAM density... We only have 8 Gbit chips, which works out to 1GB per chip (8 bits to a byte).
So a GPU with 16GB of memory needs 16 of those memory chips and has to run them in clamshell mode, with two chips sharing each 32-bit bus interface.
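Here's a quick sketch of how the chip count falls out of bus width and die density. It assumes the standard 32-bit-per-chip GDDR6/6X interface and a hypothetical 256-bit card for the 16GB case:

```python
# Each GDDR6/6X chip has a 32-bit interface; clamshell mode hangs two chips
# off one 32-bit channel (16 bits each) to double capacity per channel.

CHIP_DENSITY_GB = 1  # 8 Gbit dies = 1 GB per chip (the only density shipping)

def capacity(bus_width_bits, clamshell=False):
    channels = bus_width_bits // 32
    chips = channels * (2 if clamshell else 1)
    return chips * CHIP_DENSITY_GB, chips

print(capacity(320))                  # (10, 10): RTX 3080, 10GB on 10 chips
print(capacity(256, clamshell=True))  # (16, 16): 16GB needs 16 chips, clamshelled
```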

It's not really an economical approach... 16 Gbit chips are in development, but I think those will take some time... So AMD will possibly stick with vanilla GDDR6 to take advantage of its capacity and cost.

It's going to be very interesting either way. AMD typically adopts newer DRAM technologies faster than nVidia in order to gain whatever advantage it can, but that isn't happening this time around.
Case in point... AMD adopted GDDR5 before nVidia, which gave the Radeon HD 4870 a performance edge over nVidia, who was still stuck on GDDR3 and thus needed twice the memory bus width to match the bandwidth... AMD could therefore undercut nVidia significantly on price, which made the Radeon HD 4870 the GPU to have.
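The bus-width trade-off is just bandwidth = bus width x per-pin data rate. A quick illustration with approximate 2008 launch figures (from memory, so treat them as ballpark):

```python
# Memory bandwidth in GB/s from bus width (bits) and per-pin data rate (Gbit/s).

def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

# Radeon HD 4870: 256-bit GDDR5 at ~3.6 Gbit/s per pin
print(bandwidth_gb_s(256, 3.6))   # ~115 GB/s
# GeForce GTX 260: 448-bit GDDR3 at ~2.0 Gbit/s per pin
print(bandwidth_gb_s(448, 2.0))   # ~112 GB/s
# Similar bandwidth, but the wider bus means a bigger die (more memory
# controllers), more PCB layers and more DRAM chips -> higher board cost.
```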

ViktorBKK said:
The RTX 3080 is a planned obsolescence card. 10GB VRAM is not enough for next gen. Even consoles have more than that.

I would not buy a GPU (AMD or NVIDIA) with anything less than 16GB VRAM this gen.

Not an apples to apples comparison.

A PC GPU is not a console.
A PC GPU can dedicate 100% of its memory to graphics duties... and it also has a heap of system memory to fall back on... Your average mid-range desktop PC sold in 2020 usually has 8GB of graphics memory and 16GB of system memory, or 24GB in total.

A console, on the other hand... a chunk of its memory needs to be used for background/OS tasks rather than games. The PS4/Xbox One, for example, had 8GB of RAM, but roughly 3GB of that (give or take, as it changed over time) was reserved for non-gaming duties... so 37.5% of total memory couldn't be used for gaming.

The remaining 62.5% of console memory then needed to be split between graphics duties and other game-related content...
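To make those percentages concrete (PS4-era figures; the ~3GB OS reservation is approximate and shifted over the generation):

```python
total_gb = 8.0        # PS4/Xbox One unified memory
os_reserved_gb = 3.0  # approximate OS/background reservation

game_gb = total_gb - os_reserved_gb       # 5.0 GB left for the game
os_pct = os_reserved_gb / total_gb * 100  # 37.5%
print(f"OS: {os_pct:.1f}%, game: {100 - os_pct:.1f}% ({game_gb} GB)")
# -> OS: 37.5%, game: 62.5% (5.0 GB)
```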

In the end, a PC with a 2GB GDDR5 GPU like the Radeon HD 7870 was pretty much able to provide an equivalent experience to the base PlayStation 4 all generation long... the console's full 8GB of GDDR5 was never a requirement, and most PCs with a 7870 would have had ~8GB of system memory as well.

The RTX 3080 will absolutely provide a better experience than the next-gen consoles... The next-gen consoles cannot use all of their 16GB of unified memory for graphics duties anyway... and they will certainly rely heavily on the SSD and I/O in order to make that small amount of memory go as far as possible.

ViktorBKK said:

10GB won't even be enough by the end of 2021.

Bit of an assertion... Got any evidence to back that up?


