
Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

Bofferbrauer2 said:
Captain_Yuri said:

Yea that guy is one of the biggest AMD shills out there. Let's not go down to his level...

Possibly (I've never seen any of his videos before), but when Nvidia claims zero added latency and input lag, and he explains that that's impossible because frame generation has to wait for the next frame before it can calculate the frames in between, he's not exactly wrong there.

He may not be incorrect with this assessment, but the moment we start listening to blindly biased people like him, who have basically zero objectivity, we're bound for misinformation. You can just glance at his video library to see how AMD-favoured he really is.
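
To be fair, the latency point itself is easy to sketch out. A minimal illustration in Python, with purely made-up numbers:

# Rough sketch of why interpolation-based frame generation can't add literally
# zero latency: the in-between frame for frames N and N+1 can only be built
# once frame N+1 has already been rendered. Numbers are illustrative only.
native_fps = 60
native_frame_time_ms = 1000 / native_fps  # ~16.7 ms per rendered frame

# Frame N has to be held back until frame N+1 exists, so the pipeline buffers
# at least one native frame before anything interpolated can be shown.
min_added_latency_ms = native_frame_time_ms

print(f"At {native_fps} fps native, that's at least ~{min_added_latency_ms:.1f} ms of added delay.")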



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850


To both of you:

Darc Requiem said:

I don't really agree with that. Sure, the difference between the 4090 and the real 4080 is massive, but I think it has more to do with Nvidia leaving a gap big enough between the different models to fit the -Ti models that will come next year, so those bring enough of a performance improvement to look good.

After all, while Nvidia didn't say a thing, GALAX confirmed the chips used on each GPU, and they don't differ much from what they've been using these last generations, with the exception of the 4080 being AD103 when Nvidia rarely uses the x03 name.

I couldn't disagree more. This is about Nvidia trying to hide the fact that they are charging double the price for what should be the 4070 and 4060 Ti. There is a 6656 CUDA core gap between the "4080" 16GB and the 4090. There is only a 1792 CUDA core difference between the 4090 and the RTX 6000, which has the CUDA core count that the eventual 4090 Ti will have, since it uses the full AD102 die. Also, here is the RTX 20 series for further reference.

2080 Ti - 4352 CUDA cores

2070 - 2304 CUDA cores (52.9%)

2060 - 1920 CUDA cores (44.1%)

And

Bofferbrauer2 said:
JEMC said:

I don't really agree with that. Sure, the difference between the 4090 and the real 4080 is massive, but I think it has more to do with Nvidia leaving a gap big enough between the different models to fit the -Ti models that will come next year, so those bring enough of a performance improvement to look good.

After all, while Nvidia didn't say a thing, GALAX confirmed the chips used on each GPU, and they don't differ much from what they've been using these last generations, with the exception of the 4080 being AD103 when Nvidia rarely uses the x03 name.

The 12GB 4080 has fewer CUDA cores than either of the 3080 models (8704 for the 10GB version; 8960 for the 12GB version), while the 3080 Ti was almost a full 3090 with just half the VRAM. Also, both 3080s had higher memory bandwidth than either of the 4080 models.

While I can understand Nvidia wanting to leave a bigger gap between the 4090 and some 4080 Ti down the line, the gap between the 4080 and the 4090 is too big; there's enough space in it for three cards, each spaced out the way the 3080 Ti sat between the 3080 and the 3090.

Basically, the 16GB 4080 is more like a 4070 Ti with a 3080 Ti price tag.

I guess we'll have to agree to disagree here, because I don't think Nvidia would take a 4060-class chip and disguise it as a 4080 unless:

A) They've gone nuts and lost all connection with reality

B) They, somehow, know that AMD's RDNA3 sucks and can use a mid-range chip to compete with AMD's high-end stuff

Personally, for the sake of competitiveness and the market, I don't like either of those options.
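
For anyone who wants to double-check the core counts and bandwidth figures quoted above, here's a quick sketch in Python; the Ada numbers are the commonly reported launch specs and may not be final, so treat them as approximate:

# Sanity check of the core-count ratios cited above. The Turing figures are
# well established; the Ada figures are the commonly reported launch specs.
cuda_cores = {
    "RTX 2080 Ti": 4352,
    "RTX 2070": 2304,
    "RTX 2060": 1920,
    "RTX 4090": 16384,
    "RTX 4080 16GB": 9728,
    "RTX 4080 12GB": 7680,
}

def percent_of(card, flagship):
    return 100 * cuda_cores[card] / cuda_cores[flagship]

print(f"2070 vs 2080 Ti:   {percent_of('RTX 2070', 'RTX 2080 Ti'):.1f}%")    # ~52.9%
print(f"2060 vs 2080 Ti:   {percent_of('RTX 2060', 'RTX 2080 Ti'):.1f}%")    # ~44.1%
print(f"4080 16GB vs 4090: {percent_of('RTX 4080 16GB', 'RTX 4090'):.1f}%")  # ~59.4%
print(f"4080 12GB vs 4090: {percent_of('RTX 4080 12GB', 'RTX 4090'):.1f}%")  # ~46.9%

# Memory bandwidth follows from bus width and effective data rate:
#   GB/s = (bus width in bits / 8) * data rate in Gbps
def bandwidth_gbs(bus_bits, gbps):
    return bus_bits / 8 * gbps

print(f"3080 10GB (320-bit @ 19 Gbps):   {bandwidth_gbs(320, 19):.0f} GB/s")    # 760
print(f"4080 16GB (256-bit @ 22.4 Gbps): {bandwidth_gbs(256, 22.4):.0f} GB/s")  # ~717
print(f"4080 12GB (192-bit @ 21 Gbps):   {bandwidth_gbs(192, 21):.0f} GB/s")    # 504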



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Captain_Yuri said:

‘Moore’s Law’s dead,’ Nvidia CEO Jensen Huang says in justifying gaming-card price hike

https://www.marketwatch.com/story/moores-laws-dead-nvidia-ceo-jensen-says-in-justifying-gaming-card-price-hike-11663798618

“We are very, very specifically selling into the market a lot lower than is what’s selling out of the market, a significant amount lower than what’s selling out of the market,” Huang said. “And I’m hoping that by Q4 time frame, sometime in Q4, the channel would have normalized, and it would have made room for a great launch for Ada.” To critics, Huang said he feels the higher price is justified, especially since the cutting-edge Lovelace architecture is necessary to support Nvidia’s expansion into the so-called metaverse. “A 12-inch [silicon] wafer is a lot more expensive today than it was yesterday, and it’s not a little bit more expensive, it is a ton more expensive,” Huang said.

Personally, I don't think it's a good enough reason, but I'm increasingly convinced that RDNA 3 will be just as costly. We will see.

As I said the other day, while there's no doubt that the chips are more expensive to make (TSMC raising its prices to reap more profit from the chip shortage, and Nvidia paying a premium for TSMC's latest process node, which stands out all the more because they're coming from the cheaper Samsung 8nm fabs), AMD has two things going for it to minimize the price increase: using the not-so-expensive 5nm process, and going with chiplets, which gives them more chips per wafer.

If RDNA3 cards end up as expensive as Nvidia's, it will only be because AMD wants them to be, not because prices need to be that high.
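
Just to put rough numbers on the "more chips per wafer" point, here's a back-of-the-envelope sketch; the die sizes and defect density are made up for illustration, not real RDNA3 or Ada figures:

import math

# Rough illustration of why smaller chiplets give more usable silicon per
# wafer than one big monolithic die. All numbers below are assumptions.
WAFER_DIAMETER_MM = 300
DEFECT_DENSITY_PER_MM2 = 0.001  # assumed defects per mm^2

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=WAFER_DIAMETER_MM):
    # Classic approximation for how many rectangular dies fit on a round wafer.
    radius = wafer_diameter_mm / 2
    return (math.pi * radius**2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_fraction(die_area_mm2, d0=DEFECT_DENSITY_PER_MM2):
    # Simple Poisson yield model: a bigger die is more likely to catch a defect.
    return math.exp(-die_area_mm2 * d0)

for name, area in [("monolithic 600 mm^2 die", 600), ("300 mm^2 chiplet", 300)]:
    candidates = dies_per_wafer(area)
    good = candidates * yield_fraction(area)
    print(f"{name}: ~{candidates:.0f} dies per wafer, ~{good:.0f} good after yield")

In this toy model, halving the die size more than doubles the number of good dies per wafer, which is the basic economic argument for chiplets.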

Captain_Yuri said:

AMD Radeon RX 6000 “RDNA 2” GPUs Get Official Price Cuts Prior To Radeon RX 7000 “RDNA 3” Launch

https://wccftech.com/amd-radeon-rx-6000-rdna-2-gpus-official-price-cuts-prior-to-radeon-rx-7000-rdna-3-launch/

Around a 20% decrease for most GPUs, some higher, some lower.

I checked it and there are already some discounts. I think it's the first time I've seen 6700XT cards for less than 500€.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

zero129 said:
Captain_Yuri said:

I think the biggest question I have for Nvidia is that, going forward, they will obviously be pushing DLSS 3, but will they make sure DLSS 2 continues to be supported, or are they just gonna focus on DLSS 3? Because based on what they showed, "DLSS Super Resolution", which is what DLSS 2 is, is still an aspect of DLSS 3. So hopefully Nvidia doesn't give people the ultimate middle finger and actually continues to push developers to implement both instead of just one.

Of course, thanks to the .dll method that modders use to inject FSR into DLSS games, modders should be able to do the same when it comes to injecting DLSS 2 into DLSS 3 games. But I'd much rather have official support than go the jank route.

I hope this won't be the case. Already owning a 3060 Ti, I don't plan on getting a new card till the 5060 Ti, so it would be super sucky if Nvidia drops support for DLSS on the 30xx series GPUs after only one gen.

The good news is that Nvidia has confirmed that won't be the case. They said that while DLSS 3 is exclusive to the 40 series, DLSS 2 is part of DLSS 3, so any game that gets DLSS 3 will also get DLSS 2, which will continue to be supported on Ampere/Turing. They also said they are going to continue making improvements to DLSS 2, since improving it also improves DLSS 3, and those improvements will come to Ampere/Turing.
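
On the .dll point above, the whole trick modders rely on is just swapping the upscaler library the game loads. A minimal sketch of the idea in Python; the paths and file names here are hypothetical and only meant to show the mechanism, not any specific mod:

import shutil
from pathlib import Path

# Illustrative only: back up the game's shipped upscaler DLL and drop a
# wrapper DLL in its place, so the game loads the wrapper instead.
game_dir = Path(r"C:\Games\SomeGame")             # hypothetical install folder
original_dll = game_dir / "nvngx_dlss.dll"        # DLL the game loads for DLSS
wrapper_dll = Path(r"C:\Mods\wrapper_dlss.dll")   # hypothetical replacement

backup = original_dll.with_name(original_dll.name + ".bak")
if original_dll.exists() and not backup.exists():
    shutil.copy2(original_dll, backup)            # keep a copy to restore later
shutil.copy2(wrapper_dll, original_dll)           # game now loads the wrapper
print("Swapped; restore the .bak to go back to stock.")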



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Some more news...

Ada SM layout looks to be similar to Ampere:

The biggest changes are coming from the massive L2 cache increase as well as Ray Tracing + Tensor Core uplift:

https://wccftech.com/nvidia-details-ada-lovelace-gpu-dlss-3-geforce-rtx-40-founders-edition-graphics-cards/

NVIDIA GeForce RTX 40 Series PCIe Gen 5 Power Adapters Have a Limited Connect & Disconnect Life of 30 Cycles

https://wccftech.com/nvidia-geforce-rtx-40-series-pcie-gen-5-power-adapters-limited-connect-disconnect-life-of-30-cycles/

According to Zotac, at least, but it's still very weird and shitty. Granted, chances are you probably won't be unplugging and replugging it too often, but still, that seems rather low...



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:

Official now

https://wccftech.com/nvidia-details-ada-lovelace-gpu-dlss-3-geforce-rtx-40-founders-edition-graphics-cards/

The fact that they have to tell us they're going to keep supporting a marketed feature on last-gen cards going forward is sad, because that's the sort of support I would always expect when paying a premium price for hardware.



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850



Captain_Yuri said:
Bofferbrauer2 said:

DLSS 3 marketing dissected:

https://www.youtube.com/watch?v=qFMSgzJlzFI

 

Yea that guy is one of the biggest AMD shills out there. Let's not go down to his level...

He's a guy brought to you from wccftech's comment section, an absolute legend! lmao



Captain_Yuri said:

Some more news...

Ada SM layout looks to be similar to Ampere:

*pic1*

The biggest changes are coming from the massive L2 cache increase as well as Ray Tracing + Tensor Core uplift:

*pics 2-3*

https://wccftech.com/nvidia-details-ada-lovelace-gpu-dlss-3-geforce-rtx-40-founders-edition-graphics-cards/

There's also a new power management system to minimize the infamous transients. There's even a graph (from TechPowerUp).

Although, without knowing the scale, it's hard to tell if it only makes the transients "cleaner" or if it also makes them shorter/smaller.

Captain_Yuri said:

NVIDIA GeForce RTX 40 Series PCIe Gen 5 Power Adapters Have a Limited Connect & Disconnect Life of 30 Cycles

https://wccftech.com/nvidia-geforce-rtx-40-series-pcie-gen-5-power-adapters-limited-connect-disconnect-life-of-30-cycles/

According to Zotac, at least, but it's still very weird and shitty. Granted, chances are you probably won't be unplugging and replugging it too often, but still, that seems rather low...

The problem, as they said in a news recap video from GamersNexus, is that all the other cables in your PC are rated for thousands of connects and disconnects, which is a massive difference.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.