
Forums - PC Discussion - Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

3.72 GHz is nothing to be ashamed of. And we know some third party will develop a custom vBIOS to reach even higher clocks.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670K @ stock (for now), 16GB RAM at 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.


Looking back, it's interesting how Nvidia made the 3080 its main GPU launch trailer. There was no such trailer for the 3090/3070:

But this time around, their main GPU trailer is the 4090 as there is no trailer for the 4080 versions:

Kind of goes to show where their main focus was, 'cause they knew the 4080 was gonna be a shit show. Personally, having watched both trailers, I think the 3080 reveal trailer is better.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Alright, kind of big news from an Nvidia rep:

"DLSS 3 consists of 3 technologies – DLSS Frame Generation, DLSS Super Resolution, and NVIDIA Reflex.

DLSS Frame Generation uses RTX 40 Series high-speed Optical Flow Accelerator to calculate the motion flow that is used for the AI network, then executes the network on 4th Generation Tensor Cores. Support for previous GPU architectures would require further innovation in optical flow and AI model optimization.

DLSS Super Resolution and NVIDIA Reflex will of course remain supported on prior generation hardware, so a broader set of customers will continue to benefit from new DLSS 3 integrations. We continue to train the AI model for DLSS Super Resolution and will provide updates for all RTX GPUs as our research"

https://www.reddit.com/r/nvidia/comments/xje8et/comment/ip8d0d7/

Sounds like any games that have DLSS 3 will continue to support DLSS 2 for Ampere/Turing.
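For anyone curious what "calculate the motion flow" roughly means: frame generation warps an existing frame along a per-pixel motion field to synthesize a new one. This is NOT Nvidia's actual pipeline (the real thing feeds the Optical Flow Accelerator's output into an AI network on the Tensor Cores); it's just a toy numpy sketch of the basic warping idea:

```python
import numpy as np

def warp_frame(frame, flow):
    """Warp a frame by a per-pixel motion field (nearest-neighbour backward warp)."""
    h, w = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # For each output pixel, look up where it came from in the source frame,
    # clamped to the image bounds.
    src_y = np.clip(ys - np.round(flow[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(xs - np.round(flow[..., 1]).astype(int), 0, w - 1)
    return frame[src_y, src_x]

# Toy example: a single bright pixel moving 2 px to the right between frames.
# A generated in-between frame warps by half the motion (1 px).
frame0 = np.zeros((4, 8))
frame0[2, 2] = 1.0
half_flow = np.zeros((4, 8, 2))
half_flow[..., 1] = 1.0  # (dy, dx) = (0, 1): half of the 2 px motion
mid = warp_frame(frame0, half_flow)
print(int(np.argmax(mid[2])))  # the dot has moved from column 2 to column 3
```

The hard part in practice is estimating a good flow field in real time, which is presumably why the dedicated hardware matters.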



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:

Alright, kind of big news from an Nvidia rep:

"DLSS 3 consists of 3 technologies – DLSS Frame Generation, DLSS Super Resolution, and NVIDIA Reflex.

DLSS Frame Generation uses RTX 40 Series high-speed Optical Flow Accelerator to calculate the motion flow that is used for the AI network, then executes the network on 4th Generation Tensor Cores. Support for previous GPU architectures would require further innovation in optical flow and AI model optimization.

DLSS Super Resolution and NVIDIA Reflex will of course remain supported on prior generation hardware, so a broader set of customers will continue to benefit from new DLSS 3 integrations. We continue to train the AI model for DLSS Super Resolution and will provide updates for all RTX GPUs as our research"

https://www.reddit.com/r/nvidia/comments/xje8et/comment/ip8d0d7/

Sounds like any games that have DLSS 3 will continue to support DLSS 2 for Ampere/Turing.

That's good. At least it sounds like devs will implement 2.0 as the core, then bump it up to 3.0 after.



Official now

https://wccftech.com/nvidia-details-ada-lovelace-gpu-dlss-3-geforce-rtx-40-founders-edition-graphics-cards/



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850


Darc Requiem said:

The RTX 40 series is actually worse than I thought. I saw a commenter saying the 4080 16GB is a rebranded 4070 and the 4080 12GB is a rebranded 4060 Ti. If you check the percentage of CUDA cores for a 3070 vs a 3090 and a 3060 Ti vs a 3090, and then compare them to the 4080 12GB and 4080 16GB vs the 4090, the dude isn't BSing. WTF Nvidia.

4090 - 16384 Cuda Cores
4080 (16GB) - 9728 Cuda Cores (59.38%)
4080 (12GB) - 7680 Cuda Cores (46.88%)

3090 - 10496 Cuda Cores
3070 - 5888 Cuda Cores (56.1%)
3060ti - 4864 Cuda Cores (46.34%)

I don't really agree with that. Sure, the difference between the 4090 and the real 4080 is massive, but I think it has more to do with Nvidia leaving a gap big enough between the different models to fit the -Ti models that will come next year, so they bring enough performance improvements to make them look good enough.

After all, while Nvidia didn't say a thing, GALAX confirmed the chips used on each GPU, and they don't differ much from what they've been using these last generations, with the exception of the 4080 being AD103, when Nvidia rarely uses the x03 name.

I couldn't disagree more. This is about Nvidia trying to hide the fact that they are charging double the price for what should be the 4070 and 4060 Ti. There is a 6656 CUDA core gap between the "4080" 16GB and the 4090, but only a 1792 CUDA core difference between the 4090 and the RTX 6000, which has the CUDA core count that the eventual 4090 Ti will have, as it has the full AD102 die. Also, here is the RTX 20 series for further reference.

2080ti - 4352 Cuda Cores

2070 - 2304 Cuda Cores (52.9%)

2060 - 1920 Cuda Cores (44.1%)
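The percentages above are just each card's core count divided by the flagship's. A quick sanity check of the ratios (core counts copied from the post):

```python
# Core counts as listed in the post; percentages are relative to each flagship.
ada = {"4090": 16384, "4080 16GB": 9728, "4080 12GB": 7680}
ampere = {"3090": 10496, "3070": 5888, "3060 Ti": 4864}

def shares(cards: dict[str, int], flagship: str) -> dict[str, float]:
    """Each card's CUDA core count as a fraction of the flagship's."""
    return {name: cores / cards[flagship] for name, cores in cards.items()}

for lineup, flagship in ((ada, "4090"), (ampere, "3090")):
    for name, share in shares(lineup, flagship).items():
        print(f"{name}: {share:.2%}")
```

Running the numbers puts the 16GB 4080 at 59.38% of the 4090 and the 12GB model at 46.88%, which lines up almost exactly with where the 3070 (56.1%) and 3060 Ti (46.34%) sat relative to the 3090.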



Chazore said:
Bofferbrauer2 said:

Same for me, the last AAA games I bought were Dragon Age: Origins and Might & Magic X, some 10 years ago. I could feel even back then that the industry was moving in a direction that didn't fit well with me in the slightest.

That being said, I still game on a 1050 Ti, and it's getting limited even in indies now, so something new will have to come in. But something like a 3060 Ti/RX 6700 would be ample for me and wouldn't overtax my power bill or overheat my room.

Ngl, the last AAA I truly enjoyed and felt like I got my money's worth from was RDR 1, and that was 12 years ago. Ever since then it's just slowly declined, to the point where I now just don't like how gimmicky and flashy AAAs have gotten, and yet with all the tech they have, so little substance.

Take for instance Conan Exiles. Yes, it's another open world survival game, but it featured Nvidia's sand technology. Where is that tech years later? No one fucking knows, because no one is using it, and that malds me, because I thought it was so fucking cool to drag a corpse through the desert on a rope, see my deep footprints in the sand, and see the corpse leave a trail that spanned nearly a mile. I thought to myself "wow, this would be so fucking cool if AAA companies did this sort of interactivity", and here we are today and everyone and their execs are focused only on RT. No interactive water sources, sand, fire, dirt, etc, just how to make everything one big fucking shiny puddle (a puddle you can't even interact with to distort the reflection).

AAA is just all flash, little substance, and it sucks, because with all the money that goes into them you'd think it would mean higher player agency and interactivity, but no, it's wasted on a thin veil of pretend realism.

Honestly, if my previous co-worker hadn't been a douche and moved to the UK with the 980 Ti I lent him, I would gladly hand it over to you free of charge, because it lasted me from the days of MGSV up till 2017.

Also yeah, power bills are going up atm, and even though my fan curve cannot be altered (I had to remove the broken middle fan of my 1080 Ti, so now the GPU outright ignores all custom fan curve profiles and ramps the fans to max in any intense 3D game), I've still had to undervolt my GPU to not drain more energy.

This. Also, they're pretty much paint-by-numbers and interchangeable.

And thanks a lot for the offer with the GPU, would really have been nice and I would have been really grateful for it. But I'll get by with my 1050Ti until I get to upgrade. If the 7600XT won't be too neutered, I think that's gonna be my next GPU, should last me a good half dozen years at least.

I had a similar experience with a broken fan, though it was on my CPU at the time. I couldn't get the cooler off my Phenom II (I didn't know at the time that I should run the PC a bit to heat it up to ease removal of the cooler), so I basically had to cool my CPU passively. Undervolted and slightly underclocked (2.6 GHz instead of 3 GHz), it managed to run just fine passively in a well-ventilated big tower.



JEMC said:  
Captain_Yuri said:


Damn, sounds like even if you want to get one, it may not be easy.

Between low availability and ridiculous prices, the idea of Nvidia putting these cards out of reach to make the remaining Ampere cards more enticing sounds more plausible.

After all, we have to remember that Nvidia tried to renegotiate its contract with TSMC to reduce it, which TSMC refused. So we know that Nvidia has the chips; it's only a matter of being willing to use them now or not.

Also, artificial scarcity to inflate the value proposition of their new cards.

JEMC said:
Darc Requiem said:

The RTX 40 series is actually worse than I thought. I saw a commenter saying the 4080 16GB is a rebranded 4070 and the 4080 12GB is a rebranded 4060 Ti. If you check the percentage of CUDA cores for a 3070 vs a 3090 and a 3060 Ti vs a 3090, and then compare them to the 4080 12GB and 4080 16GB vs the 4090, the dude isn't BSing. WTF Nvidia.

4090 - 16384 Cuda Cores
4080 (16GB) - 9728 Cuda Cores (59.38%)
4080 (12GB) - 7680 Cuda Cores (46.88%)

3090 - 10496 Cuda Cores
3070 - 5888 Cuda Cores (56.1%)
3060ti - 4864 Cuda Cores (46.34%)

I don't really agree with that. Sure, the difference between the 4090 and the real 4080 is massive, but I think it has more to do with Nvidia leaving a gap big enough between the different models to fit the -Ti models that will come next year, so they bring enough performance improvements to make them look good enough.

After all, while Nvidia didn't say a thing, GALAX confirmed the chips used on each GPU, and they don't differ much from what they've been using these last generations, with the exception of the 4080 being AD103, when Nvidia rarely uses the x03 name.

The 12GB 4080 has fewer CUDA cores than either of the 3080 models (8704 for the 10GB version; 8960 for the 12GB version), while the 3080 Ti was almost a full 4090 with just half the VRAM. Also, both 3080s had higher memory bandwidth than either of the 4080 models.

While I can understand Nvidia wanting to leave a bigger gap between the 4090 and some 4080 Ti down the line, the gap between the 4080 and 4090 is just too big: you could fit three cards in it, whereas a single 3080 Ti was enough to fill up the gap between the 3080 and 3090.

Basically, the 16GB 4080 is more like a 4070 Ti with a 3080 Ti price tag.



Captain_Yuri said:

Official now

https://wccftech.com/nvidia-details-ada-lovelace-gpu-dlss-3-geforce-rtx-40-founders-edition-graphics-cards/

So it's actually compatible with the 3000 series; they just decided not to implement it.



The Splinter Cell remake will update the game’s story ‘for a modern audience’
https://www.videogameschronicle.com/news/the-splinter-cell-remake-will-update-the-games-story-for-a-modern-audience/
The upcoming Splinter Cell remake will include a rewritten story in order to appeal to a new generation of players.

Why not just make it a new game?