

Captain_Yuri said:

I think the main issue is going to be input latency, which is why Nvidia didn't put it on pre-4000 cards. The 4090 is already very close to native input latency in some games. Doing frame generation on Ampere, let alone Turing, might make the input latency unbearable: 80 fps frames with 15 fps-like input latency.

Yeah, that's true. It could be buggy and sub-optimal right now with the current drivers. And it may be a hardware limitation.

Still, I'm not convinced that it couldn't be done (well) on previous cards with some work, but we're going to have to take Nvidia's word for it.

JEMC said:

Oh, my, my Nvidia!

I don't know what's more embarrassing: that Nvidia decided to use a software lock to prevent this from happening (didn't they learn a thing from what happened to their software hash-rate limiter?), or that even a 2070, a card from two gens ago, can make use of it.

It's ridiculous.

It isn't the first time they've locked down feature sets to new products to make them more enticing for customers. It maybe would've been better to make Frame Generation available to all RTX users and see if it's worth the hassle/trade-offs. But then... they'd have a harder time getting people to upgrade from previous-gen cards.



hinch said:
JEMC said:

Oh, my, my Nvidia!

I don't know what's more embarrassing: that Nvidia decided to use a software lock to prevent this from happening (didn't they learn a thing from what happened to their software hash-rate limiter?), or that even a 2070, a card from two gens ago, can make use of it.

It's ridiculous.

It isn't the first time they've locked down feature sets to new products to make them more enticing for customers. It maybe would've been better to make Frame Generation available to all RTX users and see if it's worth the hassle/trade-offs. But then... they'd have a harder time getting people to upgrade from previous-gen cards.

But that's a problem they could have easily solved. The raw performance of the 4090 is so high that it can roughly deliver 3090 Ti + DLSS performance natively, something many users will appreciate, even more so when you add DLSS on top, making the 4090 capable of delivering stupidly high fps at 4K... which some users won't be able to enjoy because Nvidia decided to save a few bucks and not include DisplayPort 2.0.

Also, if they added this feature to the 3000 series, it could persuade customers not to wait for the launch of the rest of Ada's lineup and get an Ampere card instead, solving Nvidia's other problem along the way.




hinch said:
Captain_Yuri said:

I think the main issue is going to be input latency, which is why Nvidia didn't put it on pre-4000 cards. The 4090 is already very close to native input latency in some games. Doing frame generation on Ampere, let alone Turing, might make the input latency unbearable: 80 fps frames with 15 fps-like input latency.

Yeah, that's true. It could be buggy and sub-optimal right now with the current drivers. And it may be a hardware limitation.

Still, I'm not convinced that it couldn't be done (well) on previous cards with some work, but we're going to have to take Nvidia's word for it.

Yeah, and personally I'm not impressed with the current version of DLSS 3 anyway. Even though I'm getting the 4090, I'll stick with DLSS 2 for now until they iterate on DLSS 3 some more. Spider-Man's webs in DF's video were popping in and out, and there was some slight but weird artifacting, lol. While DLSS 3 will eventually be great, I think DLSS 2 will remain the go-to even for the 4000 series despite the frame-rate smoothness increases.

You also can't have V-sync on, otherwise you get a big input latency penalty of 100+ ms according to DF's video. So maybe in 6 months it may be worth using for 4000 series buyers.
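As a rough, back-of-the-envelope way to see why interpolated frames smooth the picture without making the game feel more responsive, here is a toy latency model (my own simplification with made-up numbers, not Nvidia's actual pipeline):

```python
# Toy model: DLSS-3-style frame generation doubles *displayed* fps, but input
# is only sampled on real frames, and the interpolator has to hold the newest
# real frame until the generated one has been shown. Illustrative numbers only.

def frame_gen_latency_ms(base_fps: float, held_frames: float = 1.0) -> dict:
    real_frame_ms = 1000.0 / base_fps
    displayed_fps = base_fps * 2                   # one generated frame per real frame
    # Input latency still tracks the real render rate, plus roughly one extra
    # real frame held back so the in-between frame can be interpolated.
    input_latency_ms = real_frame_ms * (1.0 + held_frames)
    return {"displayed_fps": displayed_fps, "input_latency_ms": round(input_latency_ms, 1)}

print(frame_gen_latency_ms(40))  # {'displayed_fps': 80, 'input_latency_ms': 50.0}
print(frame_gen_latency_ms(15))  # {'displayed_fps': 30, 'input_latency_ms': 133.3}
```

In other words, a low base framerate keeps its sluggish feel no matter how many frames get inserted on top, which is presumably why Nvidia bundles Reflex with DLSS 3 to claw some of that latency back.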



                  


hinch said:
Captain_Yuri said:

I think the main issue is going to be input latency, which is why Nvidia didn't put it on pre-4000 cards. The 4090 is already very close to native input latency in some games. Doing frame generation on Ampere, let alone Turing, might make the input latency unbearable: 80 fps frames with 15 fps-like input latency.

Yeah, that's true. It could be buggy and sub-optimal right now with the current drivers. And it may be a hardware limitation.

Still, I'm not convinced that it couldn't be done (well) on previous cards with some work, but we're going to have to take Nvidia's word for it.

JEMC said:

Oh, my, my Nvidia!

I don't know what's more embarrassing: that Nvidia decided to use a software lock to prevent this from happening (didn't they learn a thing from what happened to their software hash-rate limiter?), or that even a 2070, a card from two gens ago, can make use of it.

It's ridiculous.

It isn't the first time they've locked down feature sets to new products to make them more enticing for customers. It maybe would've been better to make Frame Generation available to all RTX users and see if it's worth the hassle/trade-offs. But then... they'd have a harder time getting people to upgrade from previous-gen cards.

We all know AMD is eventually gonna come out with something similar, then follow it up by allowing it to work on older cards, while Nvidia sits there in the corner giving everyone, even its own customers, the stink eye.

I'm really glad AMD released FSR, and that modders made it possible to mod 2.1 into games that allow for it, because Nvidia sure as fuck doesn't give a rat's arse about anyone before the 2000 series cards...

Feels like AMD cares that bit more about those Nvidia is leaving behind, while Nvidia only cares about those camping outside a Microcenter at 4 in the morning to fork over 2k.




I like how the latest Nvidia driver is all "look at the perf gains you're gonna get in Ass creed, Cyberpunk and other games!!111"

>perf boosts only for 3000 series GPUs

You really don't care about anything below the 3000 series, do ya, Nvidia? They're really making Red team more appealing with each stunt they pull at this point, for me at least. Pricing me out, dicking around with previous drivers (still mad that they fucked around with my perf in Supreme Commander three fucking times in a row), forgetting that the last series or three even exist, locking tech to the newest and most expensive cards, etc.




Chazore said:

I like how the latest Nvidia driver is all "look at the perf gains you're gonna get in Ass creed, Cyberpunk and other games!!111"

>perf boosts only for 3000 series GPUs

You really don't care about anything below the 3000 series, do ya, Nvidia? They're really making Red team more appealing with each stunt they pull at this point, for me at least. Pricing me out, dicking around with previous drivers (still mad that they fucked around with my perf in Supreme Commander three fucking times in a row), forgetting that the last series or three even exist, locking tech to the newest and most expensive cards, etc.

Well, the 20 series and prior didn't have such a CPU overhead issue at lower resolutions. The 30 series was surprisingly terrible at 1080p compared to 1440p and 4K in various games. So the driver largely increases performance at 1080p for the 30 series, hence Nvidia's examples being limited to 1080p.

While I do agree that AMD has certainly been doing things that are much more open... hell, even Intel is bringing XeSS to older GPUs that support DP4a, which includes Pascal, so you will get DLSS-like upscaling on Pascal GPUs even if the overhead might not be that great... I do think a notable gift that Nvidia has given to their old-gen users is making Reflex compatible with GPUs all the way back to the 900 series. The resolution FSR and XeSS are strong at is 4K, and both suffer significantly more than DLSS as you go down to 1440p and 1080p. But Reflex lets you basically halve the input latency, and with the number of games that support Reflex only set to increase now that it's a requirement for DLSS 3, it's a very good tech for gamers on older cards to have.

So if you are playing a game like Overwatch at 60 fps, with Reflex on you will have the input latency of playing at something like 120 fps (sometimes more, sometimes less), which is something that neither AMD nor Intel has.
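To put that "60 fps that feels like 120 fps" claim into rough numbers, here is a toy click-to-photon model (my own simplification with assumed values, not Nvidia's implementation). The idea: in a GPU-bound game the CPU runs ahead and queues a frame or two, so your input sits in that queue before the frame containing it even starts rendering; Reflex-style just-in-time submission keeps the queue close to empty.

```python
# Toy end-to-end latency model for a GPU-bound game. Assumes each queued frame
# adds one full frame-time of delay before the frame carrying your input starts
# rendering, plus a fixed display/scanout cost. Purely illustrative.

def click_to_photon_ms(fps: float, queued_frames: int, display_ms: float = 5.0) -> float:
    frame_ms = 1000.0 / fps
    queue_ms = queued_frames * frame_ms   # frames already waiting ahead of yours
    render_ms = frame_ms                  # rendering the frame with your input
    return round(queue_ms + render_ms + display_ms, 1)

print(click_to_photon_ms(60, queued_frames=2))    # ~55.0 ms: 60 fps with a 2-frame render queue
print(click_to_photon_ms(60, queued_frames=0))    # ~21.7 ms: 60 fps, queue kept empty (Reflex-like)
print(click_to_photon_ms(120, queued_frames=2))   # ~30.0 ms: roughly what unmanaged 120 fps feels like
```

The exact numbers depend on the game and how GPU-bound it is, but that is the gist of where the "roughly half" figure comes from.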



                  


Captain_Yuri said:

Well, the 20 series and prior didn't have such a CPU overhead issue at lower resolutions. The 30 series was surprisingly terrible at 1080p compared to 1440p and 4K in various games. So the driver largely increases performance at 1080p for the 30 series, hence Nvidia's examples being limited to 1080p.

While I do agree that AMD has certainly been doing things that are much more open... hell, even Intel is bringing XeSS to older GPUs that support DP4a, which includes Pascal, so you will get DLSS-like upscaling on Pascal GPUs even if the overhead might not be that great... I do think a notable gift that Nvidia has given to their old-gen users is making Reflex compatible with GPUs all the way back to the 900 series. The resolution FSR and XeSS are strong at is 4K, and both suffer significantly more than DLSS as you go down to 1440p and 1080p. But Reflex lets you basically halve the input latency, and with the number of games that support Reflex only set to increase now that it's a requirement for DLSS 3, it's a very good tech for gamers on older cards to have.

So if you are playing a game like Overwatch at 60 fps, with Reflex on you will have the input latency of playing at something like 120 fps (sometimes more, sometimes less), which is something that neither AMD nor Intel has.

Aye, but that one gift, you have to remember, is mostly geared towards the e-sports and competitive-type players who need that reduced latency. So far I'm not seeing Reflex in, say, Cyberpunk or Cult of the Lamb (a game that relies heavily on dodging and attacking at the right time, and that in general plays like a bullet hell title).

I know they have it in DRG, and that sorta benefits me a smidge, but at the same time DRG's DX12 mode is kinda ass, so that ends up negating Reflex for me, since I get nasty stutters here and there (it also doesn't help that the game operates on P2P, which kills half the point of playing co-op).

I feel like FSR is a more bountiful gift than Reflex, seeing as both Nvidia and AMD cards can run it, even older cards at that, while Reflex is for super selective titles and bound to Nvidia GPUs. And then you've got DLSS 3, which is, again, only found in an even tinier pool of titles and locked more tightly onto one line of GPUs.

Nvidia absolutely does baller tech for sure, but I'm getting super tired of us getting fewer gifts/benefits for sticking with them for a gen or three, because I'm now getting the glaring feeling that Nvidia is punishing me for not keeping up with their expectations. To be brutally honest, that's not how any corp should operate; it comes off as anti-consumer for them to expect me to sell my kidneys to afford an overpriced chungus GPU just to access one piece of tech for a handful of games, with no sign that they won't just drop that GPU or tech within the next decade (because we've seen this happening since the 900/1000 series now).

Also, this new driver update fucked over my 2077 mini-map, so not only am I not getting benefits from this new driver Nvidia wanted me to download, but one of my games is borked because of it. Their case really isn't being helped here.

I can see why people rag on Sony for being what they are atm, but Nvidia is coming off more or less the same, if not worse. Like, I get it from the reviews that the 4090 fucking kicks sweet arse, but it's no good to me if I and others cannot afford it or gain access to DLSS 3. I may as well go Red team this gen and stick it to Nvidia for the next decade if this is how things are going to go, because I've already come to terms with "ultra" settings no longer truly meaning legit "ultra", so the extra power on my end isn't really going to be needed all that much.




Chazore said:
Captain_Yuri said:

Well, the 20 series and prior didn't have such a CPU overhead issue at lower resolutions. The 30 series was surprisingly terrible at 1080p compared to 1440p and 4K in various games. So the driver largely increases performance at 1080p for the 30 series, hence Nvidia's examples being limited to 1080p.

While I do agree that AMD has certainly been doing things that are much more open... hell, even Intel is bringing XeSS to older GPUs that support DP4a, which includes Pascal, so you will get DLSS-like upscaling on Pascal GPUs even if the overhead might not be that great... I do think a notable gift that Nvidia has given to their old-gen users is making Reflex compatible with GPUs all the way back to the 900 series. The resolution FSR and XeSS are strong at is 4K, and both suffer significantly more than DLSS as you go down to 1440p and 1080p. But Reflex lets you basically halve the input latency, and with the number of games that support Reflex only set to increase now that it's a requirement for DLSS 3, it's a very good tech for gamers on older cards to have.

So if you are playing a game like Overwatch at 60 fps, with Reflex on you will have the input latency of playing at something like 120 fps (sometimes more, sometimes less), which is something that neither AMD nor Intel has.

Aye, but that one gift, you have to remember, is mostly geared towards the e-sports and competitive-type players who need that reduced latency. So far I'm not seeing Reflex in, say, Cyberpunk or Cult of the Lamb (a game that relies heavily on dodging and attacking at the right time, and that in general plays like a bullet hell title).

I know they have it in DRG, and that sorta benefits me a smidge, but at the same time DRG's DX12 mode is kinda ass, so that ends up negating Reflex for me, since I get nasty stutters here and there (it also doesn't help that the game operates on P2P, which kills half the point of playing co-op).

I feel like FSR is a more bountiful gift than Reflex, seeing as both Nvidia and AMD cards can run it, even older cards at that, while Reflex is for super selective titles and bound to Nvidia GPUs. And then you've got DLSS 3, which is, again, only found in an even tinier pool of titles and locked more tightly onto one line of GPUs.

Nvidia absolutely does baller tech for sure, but I'm getting super tired of us getting fewer gifts/benefits for sticking with them for a gen or three, because I'm now getting the glaring feeling that Nvidia is punishing me for not keeping up with their expectations. To be brutally honest, that's not how any corp should operate; it comes off as anti-consumer for them to expect me to sell my kidneys to afford an overpriced chungus GPU just to access one piece of tech for a handful of games, with no sign that they won't just drop that GPU or tech within the next decade (because we've seen this happening since the 900/1000 series now).

Also, this new driver update fucked over my 2077 mini-map, so not only am I not getting benefits from this new driver Nvidia wanted me to download, but one of my games is borked because of it. Their case really isn't being helped here.

I can see why people rag on Sony for being what they are atm, but Nvidia is coming off more or less the same, if not worse. Like, I get it from the reviews that the 4090 fucking kicks sweet arse, but it's no good to me if I and others cannot afford it or gain access to DLSS 3. I may as well go Red team this gen and stick it to Nvidia for the next decade if this is how things are going to go, because I've already come to terms with "ultra" settings no longer truly meaning legit "ultra", so the extra power on my end isn't really going to be needed all that much.

Well, Reflex will be coming to Cyberpunk because DLSS 3 is coming to Cyberpunk. Any game that gets DLSS 3 will have Reflex.

But yeah, overall I agree that Nvidia isn't treating its old customers to the best of its abilities. While their GPUs are supported longer than AMD's as far as drivers are concerned, the fact that AMD is giving FSR and Intel is giving XeSS to Pascal users while Nvidia is like "Oh, you want these things? Gotta upgrade, bro" is certainly not a good look. If a company like Intel can find a way to get AI upscaling working on Pascal, Nvidia certainly can.

But the biggest problem with Nvidia really is their pricing these days. When Ampere came out, that shit was amazing. The 3080 being 25-30% faster than the 2080 Ti while costing $700? Insane value! But then, boom, we got hit by scalpers and miners. After suffering through two painful years, what do we get? 4080s/rebranded 4070s at scalping prices. And the gap between the 4080 16GB and the 4090 is so big that if you are buying in that price range, you may as well just get the top dog. But this prices out 99% of the people who have been waiting for the past two years. No one really wants to buy a 2-year-old GPU at basically 2-year-old prices.

So I do hope that AMD or Intel can bring some much-needed competition. Because I do agree, Nvidia's actions are pretty shitty overall regardless of how good their tech is.
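On the XeSS-on-Pascal point above: DP4a, as I understand it, is a GPU instruction that computes a 4-wide int8 dot product accumulated into a 32-bit integer, which is what lets a quantized upscaling network run on cards without dedicated matrix/tensor units. A minimal sketch of what that operation computes (illustrative only, not Intel's actual XeSS kernel):

```python
import numpy as np

# Emulates what a DP4a-style instruction computes: the dot product of two
# packed 4-element int8 vectors, accumulated into a 32-bit integer. This kind
# of int8 multiply-accumulate is the workhorse of quantized neural-network
# inference when no tensor-core hardware is available.

def dp4a(a: np.ndarray, b: np.ndarray, acc: int = 0) -> int:
    assert a.dtype == np.int8 and b.dtype == np.int8 and a.size == 4 and b.size == 4
    # Widen to int32 before multiplying so nothing overflows, then accumulate.
    return int(acc + np.dot(a.astype(np.int32), b.astype(np.int32)))

a = np.array([12, -7, 33, 100], dtype=np.int8)
b = np.array([-5, 20, 4, 3], dtype=np.int8)
print(dp4a(a, b))  # (12*-5) + (-7*20) + (33*4) + (100*3) = 232
```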



                  


JEMC said:
hinch said:

It isn't the first time they've locked down feature sets to new products to make them more enticing for customers. It maybe would've been better to make Frame Generation available to all RTX users and see if it's worth the hassle/trade-offs. But then... they'd have a harder time getting people to upgrade from previous-gen cards.

But that's a problem they could have easily solved. The raw performance of the 4090 is so high that it can roughly deliver 3090 Ti + DLSS performance natively, something many users will appreciate, even more so when you add DLSS on top, making the 4090 capable of delivering stupidly high fps at 4K... which some users won't be able to enjoy because Nvidia decided to save a few bucks and not include DisplayPort 2.0.

Also, if they added this feature to the 3000 series, it could persuade customers not to wait for the launch of the rest of Ada's lineup and get an Ampere card instead, solving Nvidia's other problem along the way.

They probably don't want the 3000 series to have the same spark as the newfangled 4000 series does, seeing as the 4090, the 80, and so on are the new hot toys they want to sell.

Giving the 4000 series a USP and marketing crazy numbers (DLSS 3 vs 2 or off) is a way more impressive sell than having two platforms with identical feature sets. They just want to offload Ampere to those who don't want to spend, or can't afford, $900/$1,200/$1,600 on a graphics card, I feel. Hence why Nvidia priced the 4080 12GB *cough 4060 Ti* and the 4080 the way they did.

But yeah, including the feature on all RTX cards might entice more people to buy their old stock. Maybe they'll release Frame Generation in drivers for 2000/3000 series RTX GPUs in the near future.

Captain_Yuri said:
hinch said:

Yeah, that's true. It could be buggy and sub-optimal right now with the current drivers. And it may be a hardware limitation.

Still, I'm not convinced that it couldn't be done (well) on previous cards with some work, but we're going to have to take Nvidia's word for it.

Yeah, and personally I'm not impressed with the current version of DLSS 3 anyway. Even though I'm getting the 4090, I'll stick with DLSS 2 for now until they iterate on DLSS 3 some more. Spider-Man's webs in DF's video were popping in and out, and there was some slight but weird artifacting, lol. While DLSS 3 will eventually be great, I think DLSS 2 will remain the go-to even for the 4000 series despite the frame-rate smoothness increases.

You also can't have V-sync on, otherwise you get a big input latency penalty of 100+ ms according to DF's video. So maybe in 6 months it may be worth using for 4000 series buyers.

Same; it's like the first iteration of the tech. It's promising to see what it can do, but it's not all there yet, just like DLSS was in its infancy and first iteration. I watched a bit of the DF video too, and there appear to be frame pacing issues (not surprising considering the frames are AI-generated) and motion artifacts on particular objects, like one of the NPCs in Hitman with constant ghosting following the person. Not to mention the added latency.

But yeah, it's interesting tech that will evolve and get better. Something people may want to turn on in a ridiculously demanding game or, even better, to make a lower-end SKU feel way smoother and more bearable to play than it would be without the tech. And indeed, DLSS 2 is probably the way to go in most situations for image quality and performance.



Not sure why or how, but over the past few days I've been noticing that the TF2 forums have been quickly turning into /b/ (tard) levels of cancer.

People are posting literal copy-paste meme texts from 4chan as threads, making stupid topics like "how many genders are there" or "who's worse, furries or anime pfp users?", and all those threads are just full of "umadbro?" responses. It feels like old channers are invading the Steam forums for some shitty last hurrah or some stupid shit, because the forum was not this bad a few months back.

I wish Valve would actually moderate their own game forums, because I know damn well that indie devs and indie community admins monitor indie game threads on Steam like hawks, yet official Valve game forums remain mostly unmoderated.

Also, the threads asking for engi/spy nerfs are memes at this point, posted by people with hidden profiles whose names are just pure random numbers.


That thread where the OP claims friendlies are worse than hackers is like a thread welcoming hackers to come in and point the finger at friendly players lmao. Like, for real, there's almost no hiding the "I cheat, so what?" types in there. Hackers in games will forever be scum of the earth, no matter who they point the finger at.

