
Forums - PC Discussion - Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

JEMC said:
WoodenPints said:

Not really sure what's going on with the future Super cards, but the 5080 only having 16GB of VRAM seems too low for a card that will likely come in at $1200+.

I expect between $1200 and $1400 for the 5080, but we'll see.

I agree that the amount of VRAM is surprisingly low. I read in another discussion that this could be an intentional move from Nvidia, because 24GB seems to be the sweet spot for AI training right now, and by limiting the 5080 to just 16GB, they could avoid getting the card trapped in the ongoing AI craze.

But I don't know how legit that may be.

I did take a look at the 5090 specs and thought for sure that card is aimed at the generative AI crowd, which is why it will be priced at a hefty premium. But I think a 16GB 5080 is the same situation as when the 3070 launched with its 8GB: whilst most games still fit in the 8-12GB VRAM range, more have been pushing to 14-16GB even without RT enabled.



WoodenPints said:
JEMC said:

I expect between $1200 and $1400 for the 5080, but we'll see.

I agree that the amount of VRAM is surprisingly low. I read in another discussion that this could be an intentional move from Nvidia, because 24GB seems to be the sweet spot for AI training right now, and by limiting the 5080 to just 16GB, they could avoid getting the card trapped in the ongoing AI craze.

But I don't know how legit that may be.

I did take a look at the 5090 specs and thought for sure that card is aimed at the generative AI crowd, which is why it will be priced at a hefty premium. But I think a 16GB 5080 is the same situation as when the 3070 launched with its 8GB: whilst most games still fit in the 8-12GB VRAM range, more have been pushing to 14-16GB even without RT enabled.

Indeed, the 5080 would have been a bit more appealing with 24GB of VRAM, but Nvidia always goes for the minimum amount possible. If there's one area where AMD is better than Nvidia, it's that one: the amount of VRAM in its cards (at least as of late). That's why the 6700XT/6500XT are still a better proposition than the 4060Ti.

With that said, I hope AMD ups their game a bit more with the next ones and the 8600 comes with 12GB, with 16GB for the 8700 and 24GB for the 8800.

In any case, and to be fair, looking at the game performance reviews TechPowerUp does, most games still work well with 16GB even at 4K, outliers like SW Outlaws apart.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16GB RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

JEMC said:
WoodenPints said:

I did take a look at the 5090 specs and thought for sure that card is aimed at the generative AI crowd, which is why it will be priced at a hefty premium. But I think a 16GB 5080 is the same situation as when the 3070 launched with its 8GB: whilst most games still fit in the 8-12GB VRAM range, more have been pushing to 14-16GB even without RT enabled.

Indeed, the 5080 would have been a bit more appealing with 24GB of VRAM, but Nvidia always goes for the minimum amount possible. If there's one area where AMD is better than Nvidia, it's that one: the amount of VRAM in its cards (at least as of late). That's why the 6700XT/6500XT are still a better proposition than the 4060Ti.

With that said, I hope AMD ups their game a bit more with the next ones and the 8600 comes with 12GB, with 16GB for the 8700 and 24GB for the 8800.

In any case, and to be fair, looking at the game performance reviews TechPowerUp does, most games still work well with 16GB even at 4K, outliers like SW Outlaws apart.

Yeah, I think if you upgrade to each new card, the VRAM on Nvidia cards is enough, but the general consumer wants to skip the following card after a purchase, so every 4 years is the real minimum lifespan, and VRAM usage ramps up greatly when you enable feature sets like RT, RR and DLSS + frame gen.

Whilst 24GB would be cool, I think if they had put 20GB on the 5080 it would offer the comfort of having headroom to enable more stuff over the next few years.



WoodenPints said:
JEMC said:

Indeed, the 5080 would have been a bit more appealing with 24GB of VRAM, but Nvidia always goes for the minimum amount possible. If there's one area where AMD is better than Nvidia, it's that one: the amount of VRAM in its cards (at least as of late). That's why the 6700XT/6500XT are still a better proposition than the 4060Ti.

With that said, I hope AMD ups their game a bit more with the next ones and the 8600 comes with 12GB, with 16GB for the 8700 and 24GB for the 8800.

In any case, and to be fair, looking at the game performance reviews TechPowerUp does, most games still work well with 16GB even at 4K, outliers like SW Outlaws apart.

Yeah, I think if you upgrade to each new card, the VRAM on Nvidia cards is enough, but the general consumer wants to skip the following card after a purchase, so every 4 years is the real minimum lifespan, and VRAM usage ramps up greatly when you enable feature sets like RT, RR and DLSS + frame gen.

Whilst 24GB would be cool, I think if they had put 20GB on the 5080 it would offer the comfort of having headroom to enable more stuff over the next few years.

20GB would have worked, yes, but it's an odd number. Just like the 3080 with its 10GB. It doesn't feel right.

I think 24 would have been a better goal as that's how much capacity the 4090 has, and offering the 5080 with the same amount would make some buyers think that both cards are at the same level... which they may or may not be. We'll see.




16GB seems low for a high-end GPU. I regularly see games taking 12GB, and that is only going to increase over time.




Dumb question, but is there a point where upgrading a GPU doesn't do anything because games are CPU limited?



Chrkeller said:

Dumb question, but is there a point where upgrading a GPU doesn't do anything because games are CPU limited?

When it comes to RTS/sim games, the GPU isn't going to carry you much, since those can be CPU-heavy (the newer ones, I mean).

That being said, from what I saw with Frostpunk 2, it looks like it taxes both (ACG tried to play it at 4K with DLSS and it still took some hits to performance).

Yeah, 16GB is kinda low with games taking up more VRAM as time goes on, especially at 4K. For me though, I think I'm going to stick with 1440p until that res becomes the new 1080p standard, and 4K eventually becomes more of a stabilised norm (where we don't get many hits to perf, I mean). 16GB should be enough for me at 1440p, but there's no way in hell I'm ever paying £1000-1500 for a GPU, let alone a 16GB version.



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

As Chazore said, there are times when the CPU limits the performance you get in games. That's why CPU reviews also look at their performance with games, and it's also the reason your choice of components needs to be balanced.

It used to be said, and I don't know if it's still the case, that the CPU is responsible for how many frames you get, while the GPU is responsible for how pretty those frames are. Obviously, the prettier the frames, the harder the GPU needs to work, which affects your framerate as well, but you get the idea.

Genres like strategy games and management/simulator games tend to be a bit more demanding on your CPU because there are many more variables to take into account when rendering a frame (number of units, resources, AI-controlled enemies, etc.).

With that said, CPUs are more likely to be a bottleneck at lower resolutions, which is why CPU reviews test them at 720p, while GPUs are the limiting factor at higher resolutions.
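That "whichever part is slower sets the pace" idea can be sketched with a toy model. The numbers below are made up for illustration (not real benchmarks): CPU time per frame stays roughly flat across resolutions, while GPU time grows with pixel count, so the CPU dominates at 720p and the GPU at 4K.

```python
# Toy bottleneck model: per-frame cost in milliseconds.
# CPU cost is roughly independent of resolution (game logic, draw calls),
# while GPU cost scales with the number of pixels shaded.

CPU_MS = 5.0              # hypothetical CPU cost per frame
GPU_MS_PER_MPIXEL = 4.0   # hypothetical GPU cost per million pixels

def fps(width: int, height: int) -> float:
    gpu_ms = GPU_MS_PER_MPIXEL * (width * height) / 1e6
    frame_ms = max(CPU_MS, gpu_ms)  # the slower component sets the pace
    return 1000.0 / frame_ms

for res in [(1280, 720), (1920, 1080), (2560, 1440), (3840, 2160)]:
    print(res, round(fps(*res), 1))
```

With these made-up numbers, 720p is CPU-bound (a faster GPU wouldn't raise the framerate at all), while 1440p and 4K are GPU-bound, which matches why CPU reviews test at 720p.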




JEMC said:

As Chazore said, there are times when the CPU limits the performance you get in games. That's why CPU reviews also look at their performance with games, and it's also the reason your choice of components needs to be balanced.

It used to be said, and I don't know if it's still the case, that the CPU is responsible for how many frames you get, while the GPU is responsible for how pretty those frames are. Obviously, the prettier the frames, the harder the GPU needs to work, which affects your framerate as well, but you get the idea.

Genres like strategy games and management/simulator games tend to be a bit more demanding on your CPU because there are many more variables to take into account when rendering a frame (number of units, resources, AI-controlled enemies, etc.).

With that said, CPUs are more likely to be a bottleneck at lower resolutions, which is why CPU reviews test them at 720p, while GPUs are the limiting factor at higher resolutions.

This is also why I've learned over time that I don't actually need to run my games at the full 144fps (or try to at least), instead opting for a cap of 72fps (if at 144Hz) or 60fps (if running at 120Hz), to put less stress on both my CPU and GPU (it also helps keep temps down for both and the framerate consistent).

I originally built this rig back in 2017, so my CPU of choice back then was the i7-6700k, paired with a GTX 1080 Ti; both acted as a decent balance for each other at the time. Today the CPU is obviously the bigger factor holding me back in those CPU-intensive sim games, whilst the GPU is still getting me by at 1440p (just not with any RT, though).

I could technically play games like Planet Coaster, but I realised early into my playthrough that I always had to keep my park visitors at a lower number, because more visitors require more threads, and I only have 4 cores and 8 threads. That meant turning down the CPU-intensive settings as well as limiting my number of park guests.

Next rig build is likely going to get me going with a Ryzen 7800X3D (since that so far seems like the gamer's choice of CPU, and the newest lineup doesn't sound all that compelling). GPU-wise, I'll probably stick with my 1080 Ti until I see what AMD comes out with, because I'm sure I won't be able to afford any of Nvidia's high-end, or even mid-range, GPUs next year.




Chazore said:

Next rig build is likely going to get me going with a Ryzen 7800X3D (since that so far seems like the gamer's choice of CPU, and the newest lineup doesn't sound all that compelling). GPU-wise, I'll probably stick with my 1080 Ti until I see what AMD comes out with, because I'm sure I won't be able to afford any of Nvidia's high-end, or even mid-range, GPUs next year.

AMD said they were going to improve their X3D chips, and in one of the news roundup videos, Steve (GN) said that he had heard some things from his contacts and it looked like AMD wasn't lying about it.

I've seen some speculate that you'll be able to overclock them a little, while others claim that, since the regular 9000-series CPUs are so focused on power efficiency, the X3D parts should be able to run at the same clocks as their regular counterparts. If this latter rumor is true, the 9800X3D could be close to 500MHz faster than the 7800X3D. That would help a lot with those games that still prefer higher clocks over more cache.

But we'll see what's going on with those chips, hopefully next month, in time to compete with Intel's new ones.


