
Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

green_sky said:

It is Apple's strategy. They implement one or two "catchy" features in the new phone, then say oh no, the old iSomething can't have this because the new hardware has this and that.

Meanwhile Apple: our old hardware is the strongggest hardware, look at these benchmarks. Bitch, 5 year old Androids have that feature you just implemented.

Nvidia is doing the same. The Appleization continues.

That Always On Display not coming to older iPhones is so dang funny. Like, the iPhone 13 Pro has an OLED LTPO display that can go down to 10Hz, vs the iPhone 14 Pro's OLED LTPO display that can go down to 1Hz. Yet Apple is like naw bro, you need the 14 Pro to get that Always On Display, hurr durr.

Meanwhile, every Samsung phone for like the last 5-6 years has had it as an option, with or without an OLED LTPO display, because you don't need one if you implement it right. Since OLED screens can turn off individual pixels, you can get rid of the wallpaper and show only the clock and app icons, and you barely lose any battery. But Apple be like, gotta keep that wallpaper bro... Funniest shat ever.
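For anyone curious, here's the back-of-the-envelope version of that pixel argument. A minimal Python sketch; every number in it is a made-up assumption for illustration, not a measured figure:

```python
# Rough estimate of why a clock-only always-on display is nearly free on OLED.
# All numbers are illustrative assumptions, not measurements.

PANEL_FULL_POWER_W = 1.2       # assumed panel draw with every pixel lit at AOD brightness
LIT_FRACTION_CLOCK = 0.03      # clock + a few icons light maybe ~3% of pixels (assumption)
LIT_FRACTION_WALLPAPER = 0.90  # keeping the wallpaper keeps most of the panel lit

def aod_power_watts(lit_fraction: float) -> float:
    """Unlit OLED pixels draw ~nothing, so panel power scales
    roughly with the fraction of pixels that are lit."""
    return PANEL_FULL_POWER_W * lit_fraction

print(f"clock-only AOD: {aod_power_watts(LIT_FRACTION_CLOCK):.3f} W")
print(f"wallpaper AOD:  {aod_power_watts(LIT_FRACTION_WALLPAPER):.3f} W")
```

Under those assumptions the clock-only mode draws a thirtieth of what the wallpaper version does, which is the whole point of turning the wallpaper off.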




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850



Damn, sounds like even if you want to get one, it may not be easy.





And here I had hoped I could run endgame DSP at 60+ fps before 2023.



 "I think people should define the word crap" - Kirby007

Join the Prediction League http://www.vgchartz.com/predictions

Instead of seeking to convince others, we can be open to changing our own minds, and seek out information that contradicts our own steadfast point of view. Maybe it’ll turn out that those who disagree with you actually have a solid grasp of the facts. There’s a slight possibility that, after all, you’re the one who’s wrong.

Chazore said:
JEMC said:

I really hope this whole pricing isn't just to please shareholders. They only care about the money, not the future of the company, and pricing your goods out of reach of the bulk of your consumers will do nothing to improve your profits in the mid and long run.

Also, pricing them high to make the Ampere cards appealing is a good way to reduce your stock of 3000 cards, but those who buy one of them aren't going to buy a 4000 card in the short term. You're trading one sale for another. And with the Ampere cards falling in price, that tactic actually hurts AIB partners, forcing them to sell at a loss. Not that Nvidia seems to care, but those AIB partners DO care, and will adjust their orders of Ada GPUs accordingly.

And yeah, most of us are very tired of the high GPU prices. The only saving grace for Nvidia right now is that consoles are available in such short quantities that people can't quit PC gaming and move to consoles.

Now there's a thought: what if that's Nvidia's goal?


Think about it: if they force AIB partners to sell the older cards at a loss, that's one less problem for Nvidia to care about or bring up at board meetings (they can just tell shareholders they sold the majority of them, without mentioning the AIB partners' collapsing margins). At the same time, it would leave the AIBs falling short and give Nvidia the forefront in selling the newer cards while the AIBs play catch-up or hesitate to sell their own. That would give Nvidia an insane amount of power and even more leverage to cut out the middle-men.

That would be absolutely evil as shit to do though, and I hope to god that isn't one of their plans, because if I were an AIB partner, I'd get the fuck outta there if that were to happen. 

Personally, even if consoles were made available again, I could never go back to them. I just cannot stand the lack of freedom, losing the software I currently use, and the subscription plans for the crap that's on there (Netflix, Hulu, all that shit, online gameplay, the lack of modding, etc.). It all still feels so restrictive to me.

I don't think Nvidia is in that position, at least not yet. And proof of that is the 4080 12GB, an AIB-only card. If Nvidia wanted to hurt them, they would have also done a F.E. card with that GPU.

Now you could argue that giving the 12GB model to the AIBs is a poisoned gift, because it will be very difficult for them to launch custom models close to the $900 MSRP; they will instead be closer to $1,000 or even a bit more, putting potential buyers in the mindset of "if I'm spending that much, why not go for the true 4080 instead?". But well, AIBs aren't run by stupid people, and if they think that Nvidia is cutting the grass from under their feet, they'll do something about it.

Darc Requiem said:

The RTX 40 series is actually worse than I thought. I saw a commenter saying the 4080 16GB is a rebranded 4070 and the 4080 12GB is a rebranded 4060 Ti. If you check the percentage of CUDA cores for a 3070 vs a 3090 and a 3060 Ti vs a 3090, and then compare them to the 4080 12GB and 4080 16GB vs the 4090, the dude isn't BSing. WTF Nvidia.

4090 - 16384 Cuda Cores
4080 (16GB) - 9728 Cuda Cores (59.38%)
4080 (12GB) - 7680 Cuda Cores (46.88%)

3090 - 10496 Cuda Cores
3070 - 5888 Cuda Cores (56.1%)
3060ti - 4864 Cuda Cores (46.34%)

I don't really agree with that. Sure, the difference between the 4090 and the real 4080 is massive, but I think it has more to do with Nvidia leaving a gap big enough between the different models to fit the Ti models coming next year, so that those bring enough of a performance improvement to look good.

After all, while Nvidia didn't say a thing, GALAX confirmed the chips used on each GPU, and they don't differ much from what Nvidia has been using these last generations, with the exception of the 4080 being AD103, when Nvidia rarely uses the x03 name.
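For what it's worth, Darc Requiem's ratios are easy to verify; a quick Python check with the core counts listed above reproduces them, with the 16GB figure coming out at 59.38%:

```python
# Sanity-check the CUDA-core ratios quoted above.
CUDA_CORES = {
    "4090": 16384, "4080 16GB": 9728, "4080 12GB": 7680,
    "3090": 10496, "3070": 5888, "3060 Ti": 4864,
}

PAIRS = [("4080 16GB", "4090"), ("4080 12GB", "4090"),
         ("3070", "3090"), ("3060 Ti", "3090")]

for card, flagship in PAIRS:
    pct = 100 * CUDA_CORES[card] / CUDA_CORES[flagship]
    print(f"{card}: {pct:.2f}% of the {flagship}'s cores")
```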

Captain_Yuri said:


Damn, sounds like even if you want to get one, it may not be easy.

Between low availability and ridiculous prices, the idea of Nvidia putting these cards out of reach to make the remaining Ampere cards more enticing sounds more plausible.

After all, we have to remember that Nvidia tried to renegotiate its contract with TSMC to reduce its orders, and TSMC said no. So we know that Nvidia has the chips; it's only a matter of whether it's willing to use them now or not.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Captain_Yuri said:

This GeForce RTX 4090 costs only 56 dollars, but it is made out of plastic bricks

https://videocardz.com/newz/this-geforce-rtx-4090-costs-only-56-dollars-but-it-is-made-out-of-plastic-bricks

Meanwhile at Colorful's HQ:

Gotta say that I'd love to have that LEGO set.

Captain_Yuri said:

Nintendo Switch Pro / Successor Rumored NVIDIA Tegra239 SoC Confirmed to Be Real; To Feature 8-Core CPU

https://wccftech.com/nintendo-switch-pro-successor-rumored-nvidia-tegra239-soc-confirmed-to-be-real-to-feature-8-core-cpu/

Kopite continues to be insane with his leaks

Going with a fairly recent architecture like Ampere would do wonders for a Switch console thanks to its DLSS capabilities.
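The napkin math on why DLSS matters so much for a hybrid console is simple. A small Python sketch below; the resolutions are illustrative assumptions, not confirmed hardware:

```python
# Rough pixel-count math for DLSS on a handheld: render low, upscale high.
# Resolutions are illustrative assumptions, not confirmed Switch specs.

def pixels(width: int, height: int) -> int:
    return width * height

native_1080p = pixels(1920, 1080)  # docked output target
internal_540p = pixels(960, 540)   # DLSS performance mode renders at half res per axis

ratio = native_1080p / internal_540p
print(f"native 1080p:  {native_1080p:,} pixels")
print(f"540p internal: {internal_540p:,} pixels ({ratio:.0f}x fewer pixels to shade)")
```

Shading a quarter of the pixels and letting the upscaler reconstruct the rest is exactly the kind of trade a power-constrained handheld wants.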





So to sum it up...





Bofferbrauer2 said:

Same for me. The last AAAs I bought were Dragon Age: Origins and Might & Magic X, some 10 years ago. I could feel even back then that the industry was moving in a direction that didn't sit well with me in the slightest.

That being said, I still game on a 1050 Ti, and it's getting limited even in indies now, so something new has to come in. But something like a 3060 Ti/RX 6700 would be ample for me and wouldn't overtax my power bill or overheat my room.

Ngl, the last AAA I truly enjoyed and felt I got my money's worth from was RDR 1, and that was 12 years ago. Ever since then it's just slowly declined, to the point where I just don't like how gimmicky and flashy AAAs have gotten: for all the tech they have, there's so little substance.

Take Conan Exiles, for instance. Yes, it's another open-world survival game, but it featured Nvidia's sand technology. Where is that tech years later? No one fucking knows, because no one is using it, and that malds me. I thought it was so fucking cool to drag a corpse through the desert on a rope, see my deep footprints in the sand, and watch the corpse leave a trail that spanned nearly a mile. I thought to myself, "wow, this would be so fucking cool if AAA companies did this sort of interactivity", and here we are today with everyone and their execs focused only on RT. No interactive water, sand, fire, dirt, etc., just how to make everything one big fucking shiny puddle (a puddle you can't even interact with to distort the reflection).

AAA is just all flash and little substance, and it sucks, because with all the money that goes into these games you'd think you'd get more player agency and interactivity, but no, it's wasted on a thin veil of pretend realism.

Honestly, if my previous co-worker hadn't been a douche and moved to the UK with the 980 Ti I lent him, I would gladly hand it over to you free of charge, because it lasted me from the days of MGSV up till 2017.

Also yeah, power bills are going up at the moment, and even though my fan curve can't be altered (I had to remove the broken middle fan from my 1080 Ti, so now the GPU outright ignores all custom fan curves and ramps the fans up to max in any intense 3D game), I've still had to undervolt my GPU so it doesn't drain more energy.
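If anyone else wants to trim GPU power draw from a script, the simplest lever I know of is the driver's power limit rather than a true undervolt (which needs MSI Afterburner or similar). A minimal sketch using nvidia-smi; the 180 W cap is an arbitrary example value, and the command needs admin/root:

```python
# Read the current GPU power draw/limit, then cap the board power.
# This is a power limit, not a real undervolt; 180 W is an arbitrary example.
import subprocess

def power_status() -> str:
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw,power.limit",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

print("before:", power_status())
subprocess.run(["nvidia-smi", "-pl", "180"], check=True)  # requires admin/root
print("after: ", power_status())
```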



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

If nothing else, impressive clock speeds





Captain_Yuri said:

I think the biggest question I have for Nvidia going forward is this: they will obviously be pushing DLSS 3, but will they make sure DLSS 2 continues to be supported, or are they just gonna focus on DLSS 3? Based on what they showed, "DLSS Super Resolution", which is what DLSS 2 is, is still an aspect of DLSS 3. So hopefully Nvidia doesn't give people the ultimate middle finger and actually continues to push developers to implement both instead of one.

Of course, thanks to the .dll method that modders use to inject FSR into DLSS games, modders should be able to do the same when it comes to injecting DLSS 2 into DLSS 3 games. But I'd much rather have official support than going the jank route.

I wouldn't hold out hope for them continuing to support the now-previous version of DLSS. I imagine Nvidia will treat it as legacy and push devs to use 3.0, meaning Nvidia will be forcing folks to go for the new lineup or face being shut out of perf gains on prev-gen cards.

Yes, that'd be evil shit, but what company these days hasn't been dreaming of gaining more pushing power and really telling customers how it sees them?
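For reference, the .dll swap Yuri mentioned really is as jank as it sounds. A minimal sketch of the idea; the paths here are hypothetical examples, and nvngx_dlss.dll is the library name DLSS titles typically ship with (always check the actual game folder first):

```python
# Back up a game's bundled DLSS library, then drop in a replacement DLL.
# GAME_DIR and REPLACEMENT are hypothetical example paths.
import shutil
from pathlib import Path

GAME_DIR = Path(r"C:\Games\SomeGame")          # hypothetical install folder
TARGET = GAME_DIR / "nvngx_dlss.dll"           # DLL that DLSS titles usually ship with
REPLACEMENT = Path(r"C:\Mods\nvngx_dlss.dll")  # modder-provided version

if TARGET.exists():
    shutil.copy2(TARGET, TARGET.with_name(TARGET.name + ".bak"))  # keep a backup first
    shutil.copy2(REPLACEMENT, TARGET)
    print(f"Swapped {TARGET.name}; backup saved next to it.")
else:
    print("No DLSS DLL found; this game may not bundle one.")
```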




Chazore said:
Captain_Yuri said:

I think the biggest question I have for Nvidia going forward is this: they will obviously be pushing DLSS 3, but will they make sure DLSS 2 continues to be supported, or are they just gonna focus on DLSS 3? Based on what they showed, "DLSS Super Resolution", which is what DLSS 2 is, is still an aspect of DLSS 3. So hopefully Nvidia doesn't give people the ultimate middle finger and actually continues to push developers to implement both instead of one.

Of course, thanks to the .dll method that modders use to inject FSR into DLSS games, modders should be able to do the same when it comes to injecting DLSS 2 into DLSS 3 games. But I'd much rather have official support than going the jank route.

I wouldn't hold out hope for them continuing to support the now-previous version of DLSS. I imagine Nvidia will treat it as legacy and push devs to use 3.0, meaning Nvidia will be forcing folks to go for the new lineup or face being shut out of perf gains on prev-gen cards.

Yes, that'd be evil shit, but what company these days hasn't been dreaming of gaining more pushing power and really telling customers how it sees them?

Yea, we will see. Nvidia has historically been great at supporting its older GPUs, especially compared to competitors. But I wouldn't be surprised if their tune has changed. They really need a loss, cause it feels like they're becoming blinded by their own success.



