
Forums - PC Discussion - Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

Wouldn't a more apt comparison with the XX90 series be the Titan series? I've always seen the XX90 series as more like the Titan series than the XX80 Ti (especially since the Ampere generation had an XX80 Ti alongside the 3090).

If you do the comparison using Titan GPUs, it makes sense why this was an incremental-improvement generation for the 5090.

| GPU | Process Node | Microarchitecture | Release Date (rounded to nearest month) | 3DMark (Time Spy) | Days Since Previous | Point Gain per Day | % Gain |
|---|---|---|---|---|---|---|---|
| Titan X (Maxwell) | 22N | Maxwell 2.0 | 3/2015 | 5721 | — | — | — |
| Titan X (Pascal) | 14N | Pascal | 8/2016 | 9565 | 519 | 7.41 | 67.19% |
| Titan Xp | 14N | Pascal | 4/2017 | 10213 | 243 | 2.67 | 6.77% |
| Titan V | 14N | Volta | 12/2017 | 12507 | 244 | 9.40 | 22.46% |
| Titan RTX | 14N | Turing | 12/2018 | 14306 | 365 | 4.93 | 14.38% |
| RTX 3090 | 10N | Ampere | 9/2020 | 18178 | 640 | 6.05 | 27.07% |
| RTX 4090 | 5N | Ada | 10/2022 | 30478 | 760 | 16.18 | 67.66% |
| RTX 5090 | 5N | Blackwell | 2/2025 | (projected) | 854 | 9.63 (projected) | 27% (projected) |

Edited: To include Titan X (Maxwell)

RTX 3090 -> RTX 4090 was a huge leap because 10N -> 5N was a huge jump. But as you can see, the Pascal -> Volta -> Turing leaps were much more moderate. A 27% increase is roughly in line with the Titan RTX -> 3090.  
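For what it's worth, the table's day counts and percentages can be reproduced exactly if you round each release to the 1st of its month. A quick sketch (scores are the poster's Time Spy numbers from the table, not independently verified benchmarks):

```python
from datetime import date

# Time Spy scores and month-rounded release dates, copied from the table above
gpus = [
    ("Titan X (Maxwell)", date(2015, 3, 1), 5721),
    ("Titan X (Pascal)",  date(2016, 8, 1), 9565),
    ("Titan Xp",          date(2017, 4, 1), 10213),
    ("Titan V",           date(2017, 12, 1), 12507),
    ("Titan RTX",         date(2018, 12, 1), 14306),
    ("RTX 3090",          date(2020, 9, 1), 18178),
    ("RTX 4090",          date(2022, 10, 1), 30478),
]

for (_, d0, s0), (name, d1, s1) in zip(gpus, gpus[1:]):
    days = (d1 - d0).days
    print(f"{name}: +{(s1 / s0 - 1) * 100:.2f}% in {days} days, "
          f"{(s1 - s0) / days:.2f} points/day")

# Projected 5090 row: the table assumes +27% over the 4090 and a Feb 2025 launch
proj_score = 30478 * 1.27
proj_days = (date(2025, 2, 1) - date(2022, 10, 1)).days  # 854, as in the table
print(f"RTX 5090 (projected): ~{proj_score:.0f} points, "
      f"{(proj_score - 30478) / proj_days:.1f} points/day")
```

The gaps (519, 243, 244, 365, 640, 760, 854 days) all come out matching the table, so the original poster evidently used month-start dates too.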

We're going to see things slow down like this because Moore's law is slowing down. That's why Nvidia is chasing low-hanging fruit (given that they also run a GPGPU compute business) like neural-rendering and ray-tracing acceleration. It's a lot easier than trying to innovate a paradigm-shifting technology that creates a new S-Curve in terms of hardware acceleration. 

I do agree that the pricing (below the xx90 series) is bad though, even adjusted for inflation/costs of production. 


Jizz_Beard_thePirate said:

Yea I don't think there's much winning no matter what you buy this gen. Nvidia may as well start at $750 for RTX 5000, and even if you have 2k to spend, you won't feel like you got something untouchable like you did with a 4090.

Getting Radeon feels like you are buying from a sleazy car salesman. Their leadership is so garbage that they have zero idea how to move forward until Nvidia shows them the way, so they can price their shat slightly under them. And it feels like they will always be behind Nvidia, or possibly even Intel, on feature sets.

And getting Intel still feels like you are getting into an experiment: driver issues, CPU overhead, and questionable support for old games and emulators. But at least their price and hardware are amazing.

Like I already know some folks are going to grab it regardless, but when the 4090 was out and new, it was the literal halo product of the whole line, and its metrics made sense as to why, despite the stupidly high price.

This 5090 though just ain't it in metrics. Maybe design? But I'm pretty sure the rest of the 5000 series will follow that cut-down look anyway, so that leaves the 5090 with nothing to boast about besides price.

The uplift is shit no matter how they'll try to spin it, not compared to the 1080 Ti and not even compared to the 4090. Nvidia basically did a slight 1080 Ti job with the 4090 when you look back on it now and compare it to the 5090. It's basically the 1080 Ti vs the 2080 all over again. So now this has me wondering: "is Nvidia going to just copy MS and make a good halo GPU one gen, a bad halo GPU the next, then a good one again?" Because it's starting to look that way, especially with the uplifts getting smaller and smaller each gen now too.

Yeah, Radeon can suck a big one. Their feature set is late, some of it still needs the okay from the games themselves, their RT is also not up to spec, and their AI chasing still leaves them 1-2 gens behind Nvidia. There is virtually no point in us waiting to watch AMD wait for Nvidia's price reveal just so AMD can charge $20-50 less; that still means nothing when you're behind Nvidia in multiple ways. That $50 ain't gonna pay itself off like that.

I'm sure once Intel gets past their experimental phase and reaches some good stability, we'll find out where they stand among Nvidia/AMD, and then we'll see who they are trying to be like (I pray they just be themselves and don't chase Nvidia, or copy AMD and wait their asses off and lag behind).




So this new GPU gen is definitely a pass for me (not that I wanted to be tempted to upgrade already anyway).

Any word on a die shrink or anything for the 6000 series?



I still plan to get a 5090; the math is a bit different for me since I'm coming from a 3090. I'd be getting a 250% increase over the 3090 based on the same math, so that's pretty huge and worth the price for me personally.

You can always play the "the next one will be a big increase" game, and maybe that's true, or maybe the 6090 will just be another ~30%. I'd rather just stick with my every-other-gen upgrades and not try to guess which gens will be the good ones.



There has been absolutely zero point in upgrading your GPU every year for a while now.

Every 2-3 years is where you see enough of a jump to make things worthwhile outside of professional use-cases.

I went from having 3-4 high-end GPUs in CrossFire, upgrading every year... to a single mid-range GPU that I replace every 2-3 years. Honestly, I haven't missed much.





Yeah, I upgrade my rig every 5 years now. No point in doing yearly upgrades.



Personally, if I only used GPUs for gaming I'd be happy with an RTX 3080 and probably wouldn't upgrade until the next console generation releases. I am content with 1440p at 60-120 fps depending on the title. DLSS Quality looks near-native, or even better than native, to me. Ray tracing is nice, but I could wait until the next console gen, when Nvidia has accumulated enough neural-rendering techniques in its feature set to give us real-time path tracing with a decent denoiser, and more games support these features.

The XX90 series targets the same demographic that the Titan cards did -- prosumers who use GPUs for compute tasks but don't need the Quadro-only features. 

There isn't competition in this space to bring prices down because AMD and Intel are way behind on getting traction for their compute APIs. If either of them had API parity with CUDA (or close-enough support), Nvidia would probably compete on things like VRAM capacity on consumer cards; there is a lot of hunger for VRAM in the prosumer space. But as long as they don't have to, they won't eat into the market for their actual professional cards and chipsets.

As for keeping the pace of prior decades in raw compute: that needs a paradigm shift, and a paradigm shift requires coordination across the computing industry as a whole, not just the GPU designers.