Chazore said:
Captain_Yuri said:

It's a joke about the current trend of people blaming poor game performance on a lack of VRAM when in fact the games aren't optimized to begin with and realistically shouldn't require as much VRAM as they're asking for, since they don't do anything visually spectacular. Because if we go by that logic, every GPU outside of a 7900 XTX and a 4090/3090 is obsolete since they clearly weren't forward-thinking enough. But we all know that's not the case.

Like my 1080 Ti has 11GB of VRAM and I'm still able to play games at 1440p. People putting all their bets on it being VRAM is like the console wars all over again (remember when they were arguing about VRAM only a few years ago?).

What gets me is how this seems to happen every single console cycle: devs release PC ports in the worst of conditions (only now they're asking for higher prices and less content), then by the time consoles reach their middle-to-late life cycle, devs magically get better with PC ports, we all have a grand old time, rinse and repeat.

Where's the part where devs actually git gud at the start and learn from past mistakes? Because I'd pay real money to see that in real time. This seems to happen every cycle, and I'm getting mighty tired of seeing the same issues (this time it's DX12 and UE, which we may as well chalk up to Epic/MS, who could have released both far later on, when they were more refined and waaaaaay easier for devs to grasp).

The recent trend of games really makes it feel like 7th gen all over again. I remember how awful GTA IV was, where you couldn't stop those stutters regardless of how beastly your PC was. These days it feels the same, except the problems PC is having are also leaking into consoles to some extent. At least back then people could say PC didn't have the market share, or that console architecture was wildly different, so you got poor port jobs. These days it's all effectively the same: sure, there are some custom SoCs in consoles, but something like 90% of console hardware is the same as PC.

And what really gets me is this shader comp and UE4 stutter nonsense. Why is it that a ton of games, including Cyberpunk or Spider-Man, don't need a shader compilation pass and haven't needed one in the past, yet in modern gaming they all effectively need it? And even when games do have shader compilation, it doesn't fix traversal stutters like the ones Jedi Survivor is having on every platform. Does no one test games anymore, or are publishers/developers so content with ship-now-fix-later stupidity?
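To illustrate what I mean by a compilation pass, here's a rough toy sketch (not real engine code, every name in it is made up, and the driver compile cost is just faked with a sleep): compiling a shader the first time a material appears mid-gameplay is exactly what shows up as a hitch, while compiling the same set up front at a loading screen pays the cost where nobody feels it.

// Toy simulation, not real engine code: all names are hypothetical and the
// "driver compile" is faked with a sleep, just to show where the hitch lands.
#include <chrono>
#include <cstdio>
#include <string>
#include <thread>
#include <unordered_map>
#include <vector>

using Clock = std::chrono::steady_clock;

// Pretend shader/PSO cache keyed by material name.
std::unordered_map<std::string, bool> g_compiled;

// Fake compile: tens of milliseconds, i.e. several whole frames at 60 fps.
void compile_shader(const std::string& material) {
    std::this_thread::sleep_for(std::chrono::milliseconds(40));
    g_compiled[material] = true;
}

// Draw path of a "lazy" engine: compile the first time a material is seen.
void draw(const std::string& material) {
    if (!g_compiled.count(material))
        compile_shader(material);  // <- this is the hitch the player feels
    // (actual draw work omitted)
}

// Render a list of materials, one per frame, and report the worst frame time.
double worst_frame_ms(const std::vector<std::string>& materials) {
    double worst = 0.0;
    for (const auto& m : materials) {
        auto t0 = Clock::now();
        draw(m);
        std::this_thread::sleep_for(std::chrono::milliseconds(5)); // rest of the frame
        double ms = std::chrono::duration<double, std::milli>(Clock::now() - t0).count();
        if (ms > worst) worst = ms;
    }
    return worst;
}

int main() {
    std::vector<std::string> scene = {"rock", "water", "rock", "smoke", "water"};

    // Compile-on-first-use: the first appearance of each material stalls a frame.
    g_compiled.clear();
    std::printf("compile on first use, worst frame: %.1f ms\n", worst_frame_ms(scene));

    // Compilation pass: pay the same cost behind a loading screen instead.
    g_compiled.clear();
    for (const auto& m : scene) compile_shader(m);
    std::printf("precompiled at load,  worst frame: %.1f ms\n", worst_frame_ms(scene));
}

That's obviously a cartoon version, and it doesn't touch traversal stutter, which as far as I understand is more about streaming new chunks of the world in than about shaders.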




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850