
Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

hinch said:
JEMC said:

Scalpers got burned with the 4080. It's not surprising that they didn't want to get burned again with the 4080 12GB-turned-4070 Ti.

Hopefully Nvidia will learn from this and lower the price once they realize the cards aren't selling.

Yeah, people are way more cautious with their money now and much more privy to information about GPUs. Unfortunately for Nvidia, they didn't get the memo that most customers aren't biting on the higher-end cards of the stack at over $700. And like you said, scalpers also learned this the hard way xP

Also realised that Epic has removed the Unreal games (all of them) from Steam. Nice anti-consumer move there, Tim.

The biggest problem with the price of the 4080/4070 Ti is that the vast majority of people are only willing to spend that much money if they get the best GPU possible. Paying $800 (and in reality more) to get the third-best option Nvidia offers just isn't attractive to us consumers, and it's a lesson that hopefully Nvidia will learn this gen. Unfortunately for them, the cost of learning that lesson may be quite high, which sucks for them, but the costlier it is, the better for us.

I didn't realize Epic had done that. It may have to do with Epic removing those games from the Epic Games Store because they can't add Fortnite's multiplayer system to them, and they won't let their games run perfectly fine on a competing platform while making their own store look bad.

But hey, thank God Epic is so pro-consumer and cares so much about our freedom to play games on whatever platform we want.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16GB RAM @ 1600MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.


Scalpers getting cucked this gen is probably the only good thing about these shit prices. With some luck, Nvidia and AMD will stop being dumb by summer and discount their GPUs to reasonable levels. And by that time, there should be so much stock that scalping would be impossible. Till then, it's best to either buy a 4090, buy a last-gen GPU, or wait.




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Even if those two GPUs from Nvidia turn out to be the 4070/4060 Ti, both are still going to be received less warmly, primarily because we've seen what the 4070 Ti is priced at, and because it's just a lesser, rather meh-worthy card.

I'm fully expecting those other two cards to do less and rely more heavily on DLSS 3 to pump up their stats, which means the cards are getting worse and now have to rely more on a band-aid to look semi-decent, and both will still likely cost $500+.

Also, what's with NVMe drives getting chunkier over time? Between the GPUs and NVMe drives, it feels like we're regressing in the compactness department of tech (I've seen this with phones over the past decade getting bigger, wider and chunkier in general).

Epic likely removed their games because they are either going to slightly "remaster" them and sell them exclusively on their store, or hand them to third parties to re-release on Steam, complete with Trojan-horse EOS, because everyone totally wants crossplay between Steam/EGS at the cost of losing the ability to play said games in Steam's offline mode and having to make EGS accounts.


Seriously, I wish Tim would fuck off this planet. He's been doing nothing but shit on the PC platform since he sold Gears and copied whatever Tencent told them to copy.

Last edited by Chazore - on 08 January 2023

Step right up come on in, feel the buzz in your veins, I'm like a chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

So YouTube randomly suggested this video and channel to me, and I thought "why not?" and gave it a watch.

Turns out Sega delisted all the OG Sonic games entirely from Steam (funny, since Epic did that with their UT games), and this collection of theirs is utterly half-baked and just feels like a jumbled mix of pros and cons.

I'm actually glad I own all the OG Sonic games on Steam, because this collection sounds like half-arsed shit compared to Mania. It's been six months since that video came out, and I doubt they've rectified all the negatives listed in that guy's vid.



Step right up come on in, feel the buzz in your veins, I'm like a chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

Chazore said:

Even if those two GPU's from Nvidia are going to be the 4070/4060Ti, it still means both are going to be received less warmly, primarily because we've seen what the 4070Ti is priced at, and the fact it is just lesser and rather a meh-worthy card.

I'm fully expecting those other two cards to do less and rely more heavily on DLSS 3 to pump up their stats, which means the cards are getting worse and now have to rely more on a band-aid to look semi decent, and both will still likely cost 500+

The 4070 Ti isn't a bad card per se. If you look at the performance it brings, it offers a 30% improvement over the 3070 Ti:

[Graph: relative performance chart, taken from TechPowerUp's review of the ASUS 4070 Ti TUF card]

If you look at previous gens (3070Ti FE review, 2070 SUPER FE), you can see that a 30% increase is the average improvement from an xx70Ti gen to the next one, so in this regard, the 4070Ti performs as expected.

The problem is, obviously, the price. They've gone from $600 (official Nvidia price, it's the value they gave to the card) to $800, a 33% increase. So, just like with the 4080 vs 3080, Nvidia is charging $10 more for every 1% performance increase, killing its value.
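Just to make that arithmetic explicit, here's a quick sanity check in Python (a minimal sketch; the MSRPs are the ones quoted above):

```python
# Quick sanity check of the price jump quoted above.
msrp_3070ti = 600  # Nvidia's official price for the 3070 Ti
msrp_4070ti = 800  # 4070 Ti launch MSRP

increase = (msrp_4070ti - msrp_3070ti) / msrp_3070ti
print(f"Price increase: {increase:.0%}")  # -> 33%
```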

Chazore said:


Also what's with NVME's getting chunkier over time?, between the GPU's and NVME's, it feels like we're regressing in the compact dept of tech (I've seen this with phones over the past decade getting bigger, wider and chunkier in general).

Heat. The push for faster drives pushes the temps of the controller and NAND chips higher, requiring more cooling with every gen update. All Gen4 drives need heatsinks; some controllers run cooler, so the motherboard's simple heatsink is enough, while others run far hotter and need beefier ones. With Gen5 it's worse: all drives will need bigger coolers, some even with fans.
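If you want to see how hot your own drive runs, here's a minimal Python sketch (assumes Linux and the kernel's nvme hwmon interface; the sysfs paths may differ on your setup):

```python
from pathlib import Path

# Walk the hwmon entries and print each NVMe drive's temperature sensors.
# Values in sysfs are reported in millidegrees Celsius.
for hwmon in Path("/sys/class/hwmon").iterdir():
    name_file = hwmon / "name"
    if not name_file.exists() or name_file.read_text().strip() != "nvme":
        continue
    for temp_file in sorted(hwmon.glob("temp*_input")):
        label_file = Path(str(temp_file).replace("_input", "_label"))
        label = label_file.read_text().strip() if label_file.exists() else temp_file.name
        millicelsius = int(temp_file.read_text())
        print(f"{hwmon.name} {label}: {millicelsius / 1000:.1f} °C")
```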



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16GB RAM @ 1600MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

JEMC said:

The 4070 Ti isn't a bad card per se. If you look at the performance it brings, it offers a 30% improvement over the 3070 Ti:

The problem is, obviously, the price. They've gone from $600 (official Nvidia price, it's the value they gave to the card) to $800, a 33% increase. So, just like with the 4080 vs 3080, Nvidia is charging $10 more for every 1% performance increase, killing its value.

From 70 to 100 is a 43% increase, not a 30% increase.



Conina said:
JEMC said:

The 4070 Ti isn't a bad card per se. If you look at the performance it brings, it offers a 30% improvement over the 3070 Ti:

The problem is, obviously, the price. They've gone from $600 (official Nvidia price, it's the value they gave to the card) to $800, a 33% increase. So, just like with the 4080 vs 3080, Nvidia is charging $10 more for every 1% performance increase, killing its value.

From 70 to 100 is a 43% increase, not a 30% increase.

Yeah... I knew something was off, but I was too tired to do the math. But you're right, it's not a 30% increase but a 43%, similar to the jumps from previous gens.

And, therefore, it's not $10 more for each 1% increase, but something more like $5. Still, you're paying more for the extra performance, which undermines the value of the new cards.
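For what it's worth, here's the corrected math spelled out (a rough sketch using the relative-performance and MSRP figures from this exchange):

```python
# Redoing the math with the corrected figure.
perf_old, perf_new = 70, 100  # relative performance: 3070 Ti = 70, 4070 Ti = 100
gain_pct = (perf_new - perf_old) / perf_old * 100
print(f"Performance gain: {gain_pct:.0f}%")  # -> 43%

price_old, price_new = 600, 800  # MSRPs quoted earlier in the thread
print(f"Extra $ per 1% of extra performance: "
      f"${(price_new - price_old) / gain_pct:.2f}")  # -> ~$4.67
```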



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16GB RAM @ 1600MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Summary from Reddit:

"

  • Der8auer respects Scott’s decision to make an official statement at CES during an interview instead of going “no comment”, but he doesn’t like the way the issue seems to be handled.

  • AMD is relying on users testing their cards and then contacting them if they can identify the issue, instead of recalling the problematic batches through serial-number identification. He finds it unacceptable to make users troubleshoot such an issue on an expensive GPU instead of AMD handling it themselves. He says that AMD either doesn’t care or just doesn’t know which batches are affected, and that’s just as worrisome.

  • He disagrees with the “small performance loss” statement, as the affected cards can lose ~70W of cooling capacity, which makes a significant impact on a 355W TGP card (quick numbers below the quote).

  • One affected 7900XTX happened to be defective as well. He doesn’t believe the issue is related at all, though; it was just a regular VRM failure.

  • He insists that if AMD, who claim to have identified the issue, want to make things right, they should be the ones contacting customers with cards from the affected batches to handle the RMA, instead of waiting for customers to test things themselves."
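To put that 70W figure in perspective, a quick back-of-the-envelope using the numbers from the summary (rough, since lost cooling capacity doesn't map one-to-one to lost board power):

```python
# der8auer's point in numbers: ~70 W of lost cooling on a 355 W TGP card.
tgp_watts = 355
lost_cooling_watts = 70
print(f"Cooling capacity lost, as a share of TGP: "
      f"{lost_cooling_watts / tgp_watts:.0%}")  # -> ~20%, hardly a "small" loss
```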


Spend $1000, get treated like you spent $200. Even the reference coolers that aren't broken aren't very good, as they are power-limited and loud. At least with Nvidia's reference coolers you're getting one that performs as well as many AIB coolers.

Intel’s Arc driver update shows shocking improvement

https://www.pcworld.com/article/1448060/checking-up-on-intels-arc-gpu-driver-performance.html

Intel is worth paying attention to because of their position: they are forced to offer good GPUs at low prices. If they can get their drivers sorted by Battlemage and Battlemage is reasonably priced, then perhaps people can start recommending them.

NVIDIA reportedly working on ‘AI-optimized drivers’ with improved performance

https://videocardz.com/newz/nvidia-reportedly-working-on-ai-optimized-drivers-with-improved-performance

Take that "up to 30% more performance" claim with massive amounts of salt.




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

So I have a few potential upgrade plans coming up. One issue with owning a 4090 is that the GPU is so powerful that when you have a bottleneck in the system, it really starts to show. There are instances in Spider-Man Remastered with RT maxed out where my PC actually drops below 60fps at 4K native. It doesn't happen frequently, but it does happen once in a while when swinging around the city. This is of course due to a CPU bottleneck, as the 5950X is not able to keep up with the 4090's capabilities.

So I have been eyeing an upgrade, and neither vanilla Zen 4 nor a 13900K seemed like a good deal. The 7950X, while a tad faster than the 5800X3D in gaming, is not much faster, and while the 13900K is 5-10% faster than a 7950X in gaming, it's again not much faster and has no upgrade path. Of course, for both the leap over a 5950X is more like 30-40% in gaming, but that still feels a bit meh considering how expensive the platform upgrade will be. And while upgrading to a 5800X3D may be the cheapest option, I would still like to keep my cores.

You can start to see why this is a bit of a dilemma. But with the 7950X3D, the future is bright. As my job anniversary will happen around May, it may be time to upgrade with the bonus that's coming up. If the numbers are legit and it can beat a 13900K by, let's say, 5-10%, then there's a good chance I'll be buying it. If it ends up being slower, though, that would be lame.
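Chaining those rough percentages together gives a ballpark for the jump (hand-wavy midpoints of the figures above, not benchmarks):

```python
# Rough chained estimate of gaming performance, 5950X = 1.0 baseline.
r_5950x   = 1.00
r_7950x   = r_5950x * 1.35     # "30-40% faster" midpoint
r_13900k  = r_7950x * 1.075    # "5-10% faster" midpoint
r_7950x3d = r_13900k * 1.075   # hoped-for 5-10% over the 13900K

print(f"7950X3D vs 5950X: +{r_7950x3d / r_5950x - 1:.0%}")  # -> ~+56%
```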

The other thing I'll likely upgrade this year is my monitor. Sadly, there are no 4K 32-inch OLED options with a high refresh rate.




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

How many cores does Spider-Man use while running? Given that the 3D cache will not be split between the two CCDs, if the game uses more than 8 cores (highly unlikely), you'd be better off going for something other than the 7950X3D/7900X3D.

And we'll have to see how well the new scheduler for those chips works in Windows. There were some issues with several games when Alder Lake launched, and it could happen again with these AMD processors.
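One way to answer the core-count question is to peek at the game's threads and, if needed, pin it to the V-cache CCD. A minimal psutil sketch; the executable name is hypothetical, the assumption that the V-cache CCD maps to logical CPUs 0-15 needs checking on a real system, and thread count is only a rough proxy for how many cores a game actually loads:

```python
import psutil  # pip install psutil

# Find the game's process ("Spider-Man.exe" is a hypothetical executable name).
game = next(p for p in psutil.process_iter(["name"])
            if p.info["name"] == "Spider-Man.exe")

print("Thread count:", game.num_threads())

# Pin the game to the first 8 cores (logical CPUs 0-15 with SMT), i.e. the CCD
# assumed to carry the 3D V-cache; verify the mapping on your own system.
game.cpu_affinity(list(range(16)))
```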



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16GB RAM @ 1600MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.