
Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

Jizz_Beard_thePirate said:

NVIDIA GeForce RTX 5090 32 GB, RTX 5080 16 GB, RTX 5070 12 GB To Launch at CES, New PCB “Back Drill” Changes

https://wccftech.com/nvidia-geforce-rtx-5090-32-gb-rtx-5080-16-gb-rtx-5070-12-gb-ces-2025-launch/

If you thought Lovelace had a bad GPU stack... Wait till you see Blackwell

GDDR6 VRAM Prices Plummet: 8GB of Memory Now Costs $27

https://www.tomshardware.com/news/gddr6-vram-prices-plummet

Can we get more vram at low tier for next gen Nvidia? "No"

Of course we can't get more VRAM from we'll-skimp-on-the-memory Nvidia. The low memory was my main critique of the 30 series right from its announcement, especially on the 3070/3070 Ti. It looks like Nvidia is going to repeat that with the 5070, and maybe even the 5080, depending on how fast memory needs grow.
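For perspective, here's a quick back-of-the-envelope in Python using the $27-per-8GB figure from the Tom's Hardware piece. These are spot prices; whatever Nvidia actually pays under contract is an assumption:

```python
# Rough BOM math from the ~$27-per-8GB GDDR6 spot price reported by
# Tom's Hardware. Contract pricing differs; order-of-magnitude only.
SPOT_PRICE_8GB_USD = 27.0

price_per_gb = SPOT_PRICE_8GB_USD / 8  # ~$3.38/GB

for extra_gb in (4, 8):
    print(f"+{extra_gb}GB of GDDR6 adds roughly ${extra_gb * price_per_gb:.2f} to the BOM")
# -> +4GB ~= $13.50, +8GB ~= $27.00
```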

Let's just hope that GB205 is not going to be the 5070. The 4070 had 46 SMs, while GB205 comes with 50, which would be an absolutely tiny increase, probably not enough to even beat the 4070 Ti, and maybe not even the 4070 SUPER. On a side note, this is also the first time Nvidia has named a chip ending in 5; before, they always went from 3 or 4 straight to 6.
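To put a number on how tiny that increase would be (SM counts as given above; any clock or architectural gains would come on top):

```python
# Relative SM uplift from the RTX 4070 (46 SMs enabled on AD104) to
# the rumored full GB205 (50 SMs). Counts taken from the post above.
sms_4070, sms_gb205 = 46, 50
print(f"SM uplift: {(sms_gb205 - sms_4070) / sms_4070:.1%}")  # -> 8.7%
```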

Jizz_Beard_thePirate said:

AMD Preps Radeon RX 7650 GRE “RDNA 3” Graphics Card For CES 2025 Launch

https://wccftech.com/amd-preps-radeon-rx-7650-gre-rdna-3-graphics-card-ces-2025-launch/

Would have been nice to know some specs. My guess is that it will be based on the same chip as the 7800XT/7700XT, but with far fewer CUs. I'm expecting 40-48 CUs on this one, with a tendency more towards 40.

JEMC said:

Oof: Years before Steam, a Blizzard engineer wanted to turn Battle.net into a third-party game store, but was reportedly turned down
https://www.pcgamer.com/gaming-industry/oof-years-before-steam-a-blizzard-engineer-wanted-to-turn-battle-net-into-a-third-party-game-store-but-was-reportedly-turned-down/
The world of today is the result of every decision and accident that led to it, which is a disconcerting thought given how many things it's possible to almost do in a lifetime. Case in point, according to a new book on the history of Blizzard, the Warcraft studio rejected a proposal to turn Battle.net into a third-party game store years before Steam launched.
>> Dear God. Can you imagine Steam in the hands of Activision and Bobby Kotick? **shudders**

Blizzard canceled a roguelike version of Diablo 4 with permadeath and Batman: Arkham-style brawls
https://www.pcgamer.com/games/rpg/blizzard-canceled-a-roguelike-version-of-diablo-4-with-permadeath-and-batman-arkham-style-brawls/
If things had gone differently at Blizzard 10 years ago, we might've been playing a version of Diablo 4 with melee brawls akin to the Batman: Arkham series. I'm not even sure that idea sounds good on paper, but it didn't seem to work out in practice either because Blizzard eventually rebooted the project into the demon-slaying action RPG we have today.

News 1: If this really was years before Steam, then Blizzard most certainly was still part of Sierra On-line, itself a subsidiary of Vivendi. So, no Activision store with that one yet.

Besides, having their own storefront wouldn't necessarily have worked out nicely. Both Stardock and Paradox had their own stores (Impulse, later sold to GameStop, and GamersGate, later spun off) before divesting from them. In fact, I was on those stores (and GOG) first before switching to Steam.

News 2: Blizzard is coming full circle here; Diablo was originally conceived as a roguelike with graphics (most roguelikes at the time, and even today, still use ASCII characters) until someone removed the turns and ran the game in real time.

JEMC said:
Jizz_Beard_thePirate said:

NVIDIA GeForce RTX 5090 32 GB, RTX 5080 16 GB, RTX 5070 12 GB To Launch at CES, New PCB “Back Drill” Changes

https://wccftech.com/nvidia-geforce-rtx-5090-32-gb-rtx-5080-16-gb-rtx-5070-12-gb-ces-2025-launch/

If you thought Lovelace had a bad GPU stack... Wait till you see Blackwell

GDDR6 VRAM Prices Plummet: 8GB of Memory Now Costs $27

https://www.tomshardware.com/news/gddr6-vram-prices-plummet

Can we get more vram at low tier for next gen Nvidia? "No"

I just realized (yeah, I know, I'm slow) that if that memory configuration is true, the 5060 will again come with a paltry 128-bit bus and either an insufficient 8GB of VRAM or 16GB that won't do anything because the chip won't have enough memory bandwidth to take advantage of it. After all, we know Nvidia isn't going to use the newer and more expensive GDDR7 on such low-end parts.
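The bandwidth math behind that worry, sketched out. The 5060's actual memory speed is unknown, so the per-pin data rates below are typical GDDR6/GDDR7 figures, not confirmed specs:

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth: (bus width in bits / 8) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Assumed data rates; nothing about the 5060 is confirmed.
print(bandwidth_gb_s(128, 18))  # 128-bit GDDR6 @ 18 Gbps -> 288 GB/s
print(bandwidth_gb_s(128, 28))  # 128-bit GDDR7 @ 28 Gbps -> 448 GB/s
print(bandwidth_gb_s(192, 21))  # 192-bit GDDR6X @ 21 Gbps (4070) -> 504 GB/s
```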

So yeah, for those that want more memory, AMD may be the only one to bring it. And we'll see if they do or not. ***stares at the equally crippled 7600 8GB & 7600XT 16GB cards***

Jizz_Beard_thePirate said:

AMD Preps Radeon RX 7650 GRE “RDNA 3” Graphics Card For CES 2025 Launch

https://wccftech.com/amd-preps-radeon-rx-7650-gre-rdna-3-graphics-card-ces-2025-launch/

Does anyone think this card will be good? Because this late in the cycle, and likely aimed only at China, it doesn't look like it will bring much to the table.

My guess is that the 7650 GRE is made of Navi 32 chips that weren't good enough for the 7700XT. As such, it should fit right in between the 7600(XT) and the 7700XT in terms of performance.

As for how much memory this chip is going to have, that's an interesting question. After all, Navi 32 isn't monolithic, and AMD could put as many MCDs on it as they want. But I guess they're going to use 3 of them, just like on the 7700XT; 2 would simply be too little for the performance.
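A sketch of how the MCD count pins down the memory configuration. Each RDNA3 MCD carries a 64-bit GDDR6 controller, and the 2GB-per-module figure assumes the same 16Gb chips used on the 7700XT/7800XT:

```python
# Each Navi 3x MCD = one 64-bit GDDR6 controller (plus 16MB of
# Infinity Cache). One 16Gb (2GB) module hangs off every 32 bits of bus.
def memory_config(mcds: int, module_gb: int = 2) -> tuple[int, int]:
    bus_bits = mcds * 64
    vram_gb = (bus_bits // 32) * module_gb
    return bus_bits, vram_gb

for mcds in (2, 3, 4):
    bus, vram = memory_config(mcds)
    print(f"{mcds} MCDs -> {bus}-bit bus, {vram}GB VRAM")
# 2 -> 128-bit/8GB, 3 -> 192-bit/12GB (7700XT), 4 -> 256-bit/16GB (7800XT)
```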

Performance-wise, the 7650 GRE could be the perfect counter to the 4060 Ti: similar performance with fewer VRAM issues at a much cheaper price. Its biggest problem will be that price, though: the gap between the 7600 and 7700 models has been shrinking, so much so that there's not much room left between the 7600XT and 7700XT for a new card. Either it lands within $30 or so of one of those models, or AMD kills off the 7600XT and replaces it with the 7650 GRE.



Jizz_Beard_thePirate said:

I think I am at the point of caring less and less about hardware news. I am obviously still going to post it, but my perspective has shifted from "Oh boy, I can't wait to buy it" to "Thank fuck I don't have to think about buying it" lol

Back in the day, 80 class always felt like the flagship. You started with the "80" as the top dog and then "80 Ti" the year later for around the same price if you wanted "Titan tier" performance. Yea there were Titans and all that but those felt like the rich people toys since 80 class never felt very compromised. These days it feels like the 80 class is there just to keep competitive with Radeon while 90 class shows off the true capabilities and the generational uplift we want to see... For the kidney prices we don't want to spend. If the rumours end up being legit, the gulf between 5080 and 5090 is a generational leap in itself.

Feels like a situation where instead of upgrading every gen or every 2 gens, you buy the top dog and chill for 6-8 years.

You say this, and I just finished watching this a few hrs ago lol (GET OUT OF MY HEAD CHARLES!).

I'm still rocking my 1080ti atm, so all I have to do is avoid the stupidly expensive/janky "AAA" games while focusing on indies and my backlog. I'm still managing to get by at 1440p as well, and I'm happy reaching 60fps at this point.

I know the next Nvidia line-up will be out of my price range, and I'm still not a fan of the whole AI focus, let alone RT (I'm still happy with raster/baked in fx, I've been replaying Dying Light 1 and that game still looks fucking good to me at night time/sunset).

I remember when 80 and 80TI were considered the top of the line, and a bracket you could lock yourself into to skip another gen (which was originally what I was going to do with my 980ti, but then the 1080ti just caught my interest and I will never regret that purchase). I feel like them introducing the Titan line caused a disruption, because at that point you had folks like Total Biscuit rocking dual Titans, and using them as the new cards to bench with, which of course was beyond realistic expectations due to their og price point.

Fast forward to today and the 90 models are basically being visually treated like they are the new Titans, when spec-wise they are just the 80 models, and the general public still haven't caught onto this deception yet. 

Considering how far the 1080ti has taken me, I'm likely to just grab a card I know I can afford and coast for 6-7yrs, if not a lil more. I'm getting older and fine with tuning down settings. I'd rather tune down than rely on AI, and that's just how I'll be till the day I die (I will always see AI assistance as a crutch, not the solution).



Step right up come on in, feel the buzz in your veins, I'm like a chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

Bofferbrauer2 said:
JEMC said:

Oof: Years before Steam, a Blizzard engineer wanted to turn Battle.net into a third-party game store, but was reportedly turned down
https://www.pcgamer.com/gaming-industry/oof-years-before-steam-a-blizzard-engineer-wanted-to-turn-battle-net-into-a-third-party-game-store-but-was-reportedly-turned-down/
The world of today is the result of every decision and accident that led to it, which is a disconcerting thought given how many things it's possible to almost do in a lifetime. Case in point, according to a new book on the history of Blizzard, the Warcraft studio rejected a proposal to turn Battle.net into a third-party game store years before Steam launched.
>> Dear God. Can you imagine Steam in the hands of Activision and Bobby Kotick? **shudders**

Blizzard canceled a roguelike version of Diablo 4 with permadeath and Batman: Arkham-style brawls
https://www.pcgamer.com/games/rpg/blizzard-canceled-a-roguelike-version-of-diablo-4-with-permadeath-and-batman-arkham-style-brawls/
If things had gone differently at Blizzard 10 years ago, we might've been playing a version of Diablo 4 with melee brawls akin to the Batman: Arkham series. I'm not even sure that idea sounds good on paper, but it didn't seem to work out in practice either because Blizzard eventually rebooted the project into the demon-slaying action RPG we have today.

News 1: If this really was years before Steam, then Blizzard most certainly was still part of Sierra On-line, itself a subsidiary of Vivendi. So, no Activision store with that one yet.

Besides, having their own storefront wouldn't necessarily have worked out nicely. Both Stardock and Paradox had their own stores (Impulse, later sold to GameStop, and GamersGate, later spun off) before divesting from them. In fact, I was on those stores (and GOG) first before switching to Steam.

News 2: Blizzard is coming full circle here; Diablo was originally conceived as a roguelike with graphics (most roguelikes at the time, and even today, still use ASCII characters) until someone removed the turns and ran the game in real time.

I know that a successful Blizzard back then would have given Vivendi more money and fewer reasons to sell it, but telecom and entertainment were their main businesses. It wouldn't be too hard to believe that they would still have sold it to get the money they needed to keep those businesses afloat.

Still, we're lucky it didn't go anywhere.

Bofferbrauer2 said:
JEMC said:

I just realized (yeah, I know, I'm slow) that if that memory configuration is true, the 5060 will again come with a paltry 128-bit bus and either an insufficient 8GB of VRAM or 16GB that won't do anything because the chip won't have enough memory bandwidth to take advantage of it. After all, we know Nvidia isn't going to use the newer and more expensive GDDR7 on such low-end parts.

So yeah, for those that want more memory, AMD may be the only one to bring it. And we'll see if they do or not. ***stares at the equally crippled 7600 8GB & 7600XT 16GB cards***

Jizz_Beard_thePirate said:

AMD Preps Radeon RX 7650 GRE “RDNA 3” Graphics Card For CES 2025 Launch

https://wccftech.com/amd-preps-radeon-rx-7650-gre-rdna-3-graphics-card-ces-2025-launch/

Does anyone think this card will be good? Because this late in the cycle, and likely aimed only at China, it doesn't look like it will bring much to the table.

My guess is that the 7650 GRE is made of Navi 32 chips that weren't good enough for the 7700XT. As such, it should fit right in between the 7600(XT) and the 7700XT in terms of performance.

As for how much memory this chip is going to have, that's an interesting question. After all, Navi 32 isn't monolithic, and AMD could put as many MCDs on it as they want. But I guess they're going to use 3 of them, just like on the 7700XT; 2 would simply be too little for the performance.

Performance-wise, the 7650 GRE could be the perfect counter to the 4060 Ti: similar performance with fewer VRAM issues at a much cheaper price. Its biggest problem will be that price, though: the gap between the 7600 and 7700 models has been shrinking, so much so that there's not much room left between the 7600XT and 7700XT for a new card. Either it lands within $30 or so of one of those models, or AMD kills off the 7600XT and replaces it with the 7650 GRE.

Given how both Nvidia and AMD are lowering the prices of their current cards ahead of the upcoming ones, launching the 7650 GRE now to fill a price and performance gap is difficult to understand. It's not like the 7700XT is much more expensive than the 4060 Ti, at least over here, and the performance jump is quite big (in raster).

Memory-wise, AMD has options, yes, but they went with 256-bit for the 7800XT and 192-bit for the 7700XT, so it will likely be one of those.

Lastly, it's a GRE card. That points towards a China-only card, which may, or may not, be released in other markets later. But with RDNA4 around the corner, will AMD bother to do it?



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16GB RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Chazore said:
Jizz_Beard_thePirate said:

I think I am at the point of caring less and less about hardware news. I am obviously still going to post it, but my perspective has shifted from "Oh boy, I can't wait to buy it" to "Thank fuck I don't have to think about buying it" lol

Back in the day, 80 class always felt like the flagship. You started with the "80" as the top dog and then "80 Ti" the year later for around the same price if you wanted "Titan tier" performance. Yea there were Titans and all that but those felt like the rich people toys since 80 class never felt very compromised. These days it feels like the 80 class is there just to keep competitive with Radeon while 90 class shows off the true capabilities and the generational uplift we want to see... For the kidney prices we don't want to spend. If the rumours end up being legit, the gulf between 5080 and 5090 is a generational leap in itself.

Feels like a situation where instead of upgrading every gen or every 2 gens, you buy the top dog and chill for 6-8 years.

You say this, and I just finished watching this a few hrs ago lol (GET OUT OF MY HEAD CHARLES!).

I'm still rocking my 1080ti atm, so all I have to do is avoid the stupidly expensive/janky "AAA" games while focusing on indies and my backlog. I'm still managing to get by at 1440p as well, and I'm happy reaching 60fps at this point.

I know the next Nvidia line-up will be out of my price range, and I'm still not a fan of the whole AI focus, let alone RT (I'm still happy with raster/baked in fx, I've been replaying Dying Light 1 and that game still looks fucking good to me at night time/sunset).

I remember when 80 and 80TI were considered the top of the line, and a bracket you could lock yourself into to skip another gen (which was originally what I was going to do with my 980ti, but then the 1080ti just caught my interest and I will never regret that purchase). I feel like them introducing the Titan line caused a disruption, because at that point you had folks like Total Biscuit rocking dual Titans, and using them as the new cards to bench with, which of course was beyond realistic expectations due to their og price point.

Fast forward to today and the 90 models are basically being visually treated like they are the new Titans, when spec-wise they are just the 80 models, and the general public still haven't caught onto this deception yet. 

Considering how far the 1080ti has taken me, I'm likely to just grab a card I know I can afford and coast for 6-7yrs, if not a lil more. I'm getting older and fine with tuning down settings. I'd rather tune down than rely on AI, and that's just how I'll be till the day I die (I will always see AI assistance as a crutch, not the solution).

I think the domino effect started when they merged the datacenter architecture and the gaming architecture. Instead of Nvidia keeping a gaming line that's largely good at gaming and consumer workloads, and leaving the professional workloads to Titans and Quadros... they merged them, and now consumer GPUs are good at a lot of datacenter-oriented tasks. The fact that the US government had to block the 4090 from being exported to China shows how good these consumer cards are at doing AI work and other shat. And guess what limits datacenter workloads the most... VRAM!

Like do we need 1 Terabyte/s+ of memory bandwidth for gaming workloads? Fuck no.... Outside of a few frames, GPUs care about vram capacity more than speed. For gaming, having 16GB of GDDR6 is better than having 8GB of GDDR7. But Nvidia can't have that because muh datacenter cards.... (and obvious greed) Absolutely awful state of the GPU industry. And consoles aren't the saving grace either when you look at the circus that has been the PS5 Pro. Everything is just becoming too dang expensive.
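To put numbers on the capacity-over-speed point: once a game's working set spills past VRAM, assets stream over PCIe, which is an order of magnitude slower than any VRAM configuration. The 128-bit configs and data rates below are assumptions for illustration, not announced specs:

```python
# Hypothetical 128-bit configs; data rates are assumptions.
configs = {
    "8GB GDDR7 @ 28 Gbps":  (8,  128 / 8 * 28),  # 448 GB/s
    "16GB GDDR6 @ 18 Gbps": (16, 128 / 8 * 18),  # 288 GB/s
}
PCIE4_X16_GB_S = 32  # approximate peak of a PCIe 4.0 x16 link

for name, (capacity, bw) in configs.items():
    print(f"{name}: {capacity}GB at {bw:.0f} GB/s; spillover crawls at ~{PCIE4_X16_GB_S} GB/s")
# The GDDR7 config wins on paper, but the moment a game needs more
# than 8GB it drops to PCIe speed; the 16GB card simply never does.
```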

The worst excuse I have seen thus far to justify the prices is "but there's financing options now." Made me facepalm so hard.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Chazore said:

Vanillaware is such an odd case, in that I keep hearing their CEO doesn't like PC gaming because we can mod our games and such. The weird thing is that they seem unaware of PC gaming's nature and long-term history (we've had modding as far back as the 70s/80s, all the way to this day), which makes his refusal to budge look a bit ignorant.

We've seen plenty of Japanese games sell decently on PC, welcomed with open arms and even hyped. It's the ones that don't put the effort into their ports that tend to receive the negative treatment, but that's only because we want them to do better instead of settling for less.

Like I mentioned in my previous post, I'm hoping the new guard, or whoever replaces him, is more open-minded.



Chazore said:
Jizz_Beard_thePirate said:

I think I am at the point of caring less and less about hardware news. I am obviously still going to post it, but my perspective has shifted from "Oh boy, I can't wait to buy it" to "Thank fuck I don't have to think about buying it" lol

Back in the day, 80 class always felt like the flagship. You started with the "80" as the top dog and then "80 Ti" the year later for around the same price if you wanted "Titan tier" performance. Yea there were Titans and all that but those felt like the rich people toys since 80 class never felt very compromised. These days it feels like the 80 class is there just to keep competitive with Radeon while 90 class shows off the true capabilities and the generational uplift we want to see... For the kidney prices we don't want to spend. If the rumours end up being legit, the gulf between 5080 and 5090 is a generational leap in itself.

Feels like a situation where instead of upgrading every gen or every 2 gens, you buy the top dog and chill for 6-8 years.

You say this, and I just finished watching this a few hrs ago lol (GET OUT OF MY HEAD CHARLES!).

I'm still rocking my 1080ti atm, so all I have to do is avoid the stupidly expensive/janky "AAA" games while focusing on indies and my backlog. I'm still managing to get by at 1440p as well, and I'm happy reaching 60fps at this point.

I know the next Nvidia line-up will be out of my price range, and I'm still not a fan of the whole AI focus, let alone RT (I'm still happy with raster/baked in fx, I've been replaying Dying Light 1 and that game still looks fucking good to me at night time/sunset).

I remember when 80 and 80TI were considered the top of the line, and a bracket you could lock yourself into to skip another gen (which was originally what I was going to do with my 980ti, but then the 1080ti just caught my interest and I will never regret that purchase). I feel like them introducing the Titan line caused a disruption, because at that point you had folks like Total Biscuit rocking dual Titans, and using them as the new cards to bench with, which of course was beyond realistic expectations due to their og price point.

Fast forward to today and the 90 models are basically being visually treated like they are the new Titans, when spec-wise they are just the 80 models, and the general public still haven't caught onto this deception yet. 

Considering how far the 1080ti has taken me, I'm likely to just grab a card I know I can afford and coast for 6-7yrs, if not a lil more. I'm getting older and fine with tuning down settings. I'd rather tune down than rely on AI, and that's just how I'll be till the day I die (I will always see AI assistance as a crutch, not the solution).

You don't like DLSS? It seems to work really well. I hate the idea of frame generation, but 4K Quality is quite something, especially since it jumps fps by 30 to 40.
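The fps jump is mostly down to the internal render resolution. A quick sketch using Nvidia's published Quality-mode scale factor (the actual 30-40 fps gain varies per game):

```python
# DLSS Quality renders at ~0.667x per axis, then upscales to native.
native_w, native_h = 3840, 2160
scale = 2 / 3  # DLSS Quality mode scale factor

internal_w, internal_h = round(native_w * scale), round(native_h * scale)
pixel_ratio = (internal_w * internal_h) / (native_w * native_h)
print(f"{internal_w}x{internal_h}, {pixel_ratio:.0%} of native pixels shaded")
# -> 2560x1440, 44% of native pixels, hence the large fps gain
```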



i7-13700k

Vengeance 32 gb

RTX 4090 Ventus 3x E OC

Switch OLED

JEMC said:
HoloDust said:

Oh man, that gap between the 5090 and 5080... half the RAM, half the cores... as Daniel Owen said, I'm thinking Nvidia is trying to pull a 4080 12GB again.

It occurred to me the other day that this time Nvidia may have another reason to do this other than greed: China.

Remember that the US limited the hardware capabilities that could be exported to China for AI research, and the 4090 surpassed it, leading to the launch of the cut down 4090D.

Now, Nvidia could never launch a 5090 slower than the 4090 (can you imagine it?), but they could make the 5080 slightly slower than the 4090 to avoid going above that limit and be able to sell the card in China without problems.

And well, the card will still be 25-30% faster than the 4080, so it's not like anyone will be able to accuse Nvidia of not delivering a good generational jump.
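For reference, the export limit in question is the US "Total Processing Performance" metric. A hedged sketch using the commonly reported figures; the threshold and the 4090 number are as reported in coverage of the rule, and anything about the 5080 remains pure guesswork:

```python
# US export rule (ECCN 3A090): TPP = dense TOPS * bit length of the
# operation, controlled at TPP >= 4800. 4090 figure as widely reported.
TPP_LIMIT = 4800

def tpp(dense_tops: float, bit_width: int) -> float:
    return dense_tops * bit_width

rtx_4090_fp8_tops = 660.6  # dense FP8 TOPS (reported)
print(tpp(rtx_4090_fp8_tops, 8))  # ~5285, over the limit -> hence the 4090D
```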

Maybe... but I'm more willing to bet on the usual Green Goblin greed and shenanigans.

5070 with 12GB?!?! Yeah, fuck off. Even the 3060 had 12GB.

I think AMD, if they don't fuck up royally again, will have their big comeback in the mid-tier market.



No way I would upgrade to 12GB. When I push settings in today's games, I'm hitting 10 to 12GB.



i7-13700k

Vengeance 32 gb

RTX 4090 Ventus 3x E OC

Switch OLED

Chrkeller said:

You don't like DLSS? It seems to work really well. I hate the idea of frame generation, but 4K Quality is quite something, especially since it jumps fps by 30 to 40.

I'm just not a fan of relying on something that was originally designed as an "aid" tool, yet Nvidia knows some devs are using that "aid" as an actual crutch in their game development, which isn't what the AI was originally intended for. You see this across multiple industries now, where AI was originally designed to help the user, but big tech has been stating that it's meant to "replace" and make the "pipeline more streamlined", which in turn means layoffs, replacing human input, or reducing the human workload (and we've seen the results that's produced in other industries; they're not good).

Also, I don't like DLSS because it still doesn't look as sharp as native, and frame gen can introduce more latency (especially when using gamepads).

If you are using a mid-tier card that barely manages 60fps and you turn on frame gen, that frame-genned 60fps isn't going to feel as smooth as a real locked 60fps (plus there's the added latency).
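The latency intuition behind that, sketched with frame times only. Real end-to-end latency also involves input sampling, CPU time, Reflex, and the display, all simplified away here:

```python
# Frame generation interpolates between two rendered frames, so a
# finished frame is held back roughly one base frame time before
# display. Simplified: ignores Reflex, CPU time, and display lag.
def frame_time_ms(fps: float) -> float:
    return 1000 / fps

for base_fps in (30, 60):
    print(f"{base_fps} fps base -> ~{base_fps * 2} fps shown, "
          f"+~{frame_time_ms(base_fps):.0f} ms latency from the held frame")
# Frame-genned 60 still responds like native 30 (or worse) to input.
```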

I'm also not big on AI being used over human input, because I'm friends with a group of artists who share the same view and have explained to me the pitfalls of over-reliance on AI in different jobs (like how AI can't successfully pull off the artistic changes an artist wants, or handle a prompter asking for multiple corrections; the AI can't understand what is being asked of it).



Step right up come on in, feel the buzz in your veins, I'm like a chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

HoloDust said:

Maybe... but I'm more willing to bet on the usual Green Goblin greed and shenanigans.

5070 with 12GB?!?! Yeah, fuck off. Even the 3060 had 12GB.

I think AMD, if they don't fuck up royally again, will have their big comeback in the mid-tier market.

I'm hoping they do something for their mid-tier market, because if they don't, and Nvidia's entire line just crushes AMD again, we're really going to be fucked for yet another gen or two, and may as well give up at that point.

Really, I just think Nvidia doesn't care about gaming anymore and is practically trying to excuse themselves for dicking over an entire industry, just so they can go "well you guys can't afford our stuff and it means little to us anyway, so we're gonna focus on the data center market now".



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"