hinch said:
Cyran said:

While I'm not one to defend Jay's channel normally, in this case Jay had very little involvement in that video because he had recently had surgery, and clearly his team was not ready to do a review video without him. Jay has since released a video apologizing for it, and the original has been taken down. It was a very bad video that should never have been released, but I very much doubt he received a penny from Nvidia for it.

His apology video 

Saw that video. Pulling the video and putting up an apology seems more reactionary than anything. Though I do wish him well. It was more of a jest, but you have to admit he does act a little suspicious when it comes to Nvidia lol. Like half their tests and results were games with RT or benchmarks of RT performance, and then they had the 2060 in there, just like Nvidia's slides (funny that), putting big emphasis on how big of a jump it is versus that, calling it the best jump since the Pascal years and telling people to go out and buy it. $400, yo.. DLSS3, great power consumption.. it's the GTX 1060 of this generation. Like what?

I'd like to say it's a techtuber thing: when they get sent so much free stuff to review, it clouds their judgement. But I don't know how they can look at such a small gen-on-gen upgrade, and even a regression from the 3060Ti, while leaning on things like frame gen (which doesn't increase performance but makes games feel like they run smoother, and I'm saying this as someone who likes the tech), and still come away positive. That and their testing with RT.

The card is a piece of shit for the money and should've been called the 4050Ti and priced less. You'd have to be a little out of touch to test this gear impartially and come to the conclusion that this is a decent generational leap, a great card for $400 and something worthy of a full recommendation. I guess that's the reason why Jay took it down. Because it makes them look ridiculous.

Yeah. Definitely reactionary.

The 4060Ti isn't a bad card per se. It's just a shit price, positioned poorly, and should have been the 4050Ti.
If it was cheaper than the 3060Ti by a significant margin, then I don't think many would complain because performance per-dollar would have gone up... And I think we are all sick and tired of the high prices at this point.

nVidia is pivoting to sell us less silicon but more software; DLSS3 was meant to make up any performance shortfall relative to the 3060Ti.

episteme said:

A $249 7600 would have had positive reviews, I guess. I think it will be below that before the 4060 non-Ti releases.
But even at $199 it wouldn't fly off the shelves because people wanna upgrade from 8GB, not to 8GB.

What's worrying about the 7600 is how little it advances efficiency compared to the 6600 series.

What I find odd is that the RX 7600 isn't a chiplet design, which is meant to reduce costs... And allows chiplets to be used across an entire product stack, leveraging economies of scale.

AMD just needed to build the compute die, take those memory cache dies from the 7900, and bundle a couple fewer.

I understand why they didn't... Because chiplets are less efficient.

In saying that, expect to see some improvement in the 7600's performance over the next year or so... We know how AMD's driver game works these days... New designs take time for the drivers to catch up.

hinch said:

At bold, 100%. All the entry-level cards are intentionally handicapped with poor memory bandwidth and bus width this generation, which would be fine if they were 50-class cards and closer to $200. But here we are with $269-500 craptastic cards with performance that barely edges out the old 60-class cards of last generation. It's embarrassing.. from both AMD and Nvidia. Top Navi 33 barely outperforms Navi 23 on an enhanced node and a new architecture. It's laughably bad that they launched these cards. Utterly pointless.

AMD and nVidia are spending extra silicon to try and mitigate bandwidth bottlenecks... Because over the last several decades compute has very much outstripped memory bandwidth improvements.

Back in the early 2000s we started to see smarter culling with HyperZ, and ATI augmented that with Z-Compression to reduce bandwidth demands... nVidia followed suit with the Geforce 3 Ti and its "Lightspeed Memory Architecture".

nVidia introduced delta colour compression with the Geforce 5/FX.

Geforce 6 culled non-visible primitives.

The X800 brought us 3Dc texture compression... Which got refined into 3Dc+ a few generations later.

Radeon HD 5000 brought forth BC6H, BC7.

nVidia improved Delta Colour Compression significantly with Maxwell and also introduced tile-based rendering to improve bandwidth efficiency, as well as primitive culling.

AMD started to augment its GPUs with large, fat caches to mitigate bandwidth limitations, starting with its RDNA2 GPUs.
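
To make the compression idea a bit more concrete, here's a toy Python sketch of the general principle behind delta colour compression (just an illustration, not the actual hardware scheme): store one anchor pixel per tile and encode its neighbours as small differences, which need far fewer bits when the tile is smooth.

# Toy illustration of the idea behind delta colour compression (not the
# actual hardware scheme): store one anchor value per tile and encode the
# other pixels as differences, which fit in fewer bits when neighbours
# are similar.

def delta_encode_tile(tile):
    """tile: list of 8-bit channel values for one block of pixels."""
    anchor = tile[0]
    deltas = [p - anchor for p in tile[1:]]
    # Conservative signed bit-width for the deltas; small deltas mean
    # big bandwidth savings versus storing full 8-bit values.
    bits = max(abs(d).bit_length() + 1 for d in deltas) if any(deltas) else 0
    return anchor, deltas, bits

# A smooth gradient compresses well: 8 bits per pixel shrink to a few bits each.
tile = [120, 121, 121, 122, 123, 123, 124, 124]
anchor, deltas, bits = delta_encode_tile(tile)
print(anchor, deltas, f"{bits} bits per delta instead of 8")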


If we go back 10 years, bandwidth was at a respectable 288GB/s on the Radeon 7970...
But the Radeon 7900XTX has 960GB/s. About a 3x improvement.

But if we look at compute, we went from 8.2 Teraflops to 61 Teraflops, about an 8x improvement... This is a trend mimicked all through GPU history going back decades: compute has outstripped bandwidth.
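
If anyone wants to sanity-check the ratios, here's a quick back-of-the-envelope Python sketch using the figures quoted above (treat them as the numbers from this post, not re-verified spec-sheet values):

# Back-of-the-envelope check of the compute vs bandwidth gap, using the
# figures quoted in this post for the Radeon 7970 and 7900XTX.

old_bw_gbs, new_bw_gbs = 288, 960          # memory bandwidth, GB/s
old_tflops, new_tflops = 8.2, 61           # FP32 compute, TFLOPS

bw_growth = new_bw_gbs / old_bw_gbs        # ~3.3x
compute_growth = new_tflops / old_tflops   # ~7.4x

# FLOPs available per byte of memory traffic -- the gap that caches and
# compression have to paper over keeps widening.
old_flops_per_byte = old_tflops * 1e12 / (old_bw_gbs * 1e9)   # ~28 FLOPs/byte
new_flops_per_byte = new_tflops * 1e12 / (new_bw_gbs * 1e9)   # ~64 FLOPs/byte

print(f"bandwidth grew {bw_growth:.1f}x, compute grew {compute_growth:.1f}x")
print(f"FLOPs per byte of bandwidth: {old_flops_per_byte:.0f} -> {new_flops_per_byte:.0f}")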

Fast RAM is expensive, and AMD and nVidia can't really increase prices or the bill of materials, so they are working on finding the right balance... But in the end, that is their problem, not ours.



--::{PC Gaming Master Race}::--