
Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

I might just skip my gaming monitor and go full TV gaming.
https://www.lg.com/us/tvs/lg-oled65cxpua-oled-4k-tv
OLED, 4K, 120Hz, G-Sync, HDR. If the response time is as good as they claim, it might even be good enough for Rocket League.

And the best part is I'm gonna buy that TV no matter what I do with my PC, because I wanted a new TV anyway and that one happens to be the best TV that currently exists.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

Chazore said:
Captain_Yuri said:

It's technically at "clock speeds similar to mobile gpus", so the actual clock speeds will probably be higher. AMD essentially needs a two-generation leap to be competitive with Nvidia's high end because, unlike Intel, Nvidia hasn't been sitting around...

It will all come down to the prices at the end of the day. You can have a 100 TF GPU, but if it costs as much as a house, I'd rather take the 14 TF...

That's what gets me though. They're managing to throw their weight around against Intel, but somehow they just cannot pick up the pace with Nvidia. By the time they even bother to, they'll be outpaced by Nvidia again, and again, and again, like they're stuck in some infinite temporal loop of never improving or never getting close to what NV do.

This is why I hate Adored on Youtube, because that smug Scot thinks AMD's won the entire war, despite the fact that AMD are pretty much just putting their focus on the low to mid end. Can you imagine if they supposedly "won" this war and the high end just vanished? Because that's what he makes it sound like: a "non-issue".

I mean, yeah, I getcha on the whole price ratio. It's why I'm not really seeing myself grabbing a 3080ti. Considering I own a 1080ti, it would be most logical for me to go for it, but I've had enough insight into how high-priced the 2000 series is to know a 3080ti will cost way more than my current card, meaning I'll have to go for the 3080 instead (which isn't that bad of a choice, since I'll still be sticking to 1440p until the day 4K OLED becomes cheap as chips, which is years and years away).

Well, the thing with AMD vs Intel is that for AMD to have won, Intel needed to have stopped progressing... and that is exactly what they did. This is speculation, but Intel was so focused on trying to get 10nm working, and FX left AMD so far out of the picture, that Intel never bothered to pursue anything else. They figured that if AMD made another CPU, it wouldn't be able to catch up to Intel's IPC lead, so it wouldn't matter. And if AMD went for high core counts, the yields would be similar if not worse compared to Intel's.

What they didn't account for is something like the Infinity Fabric, which gave AMD the ability to hit high yields and high core counts while still having really good performance. There is some performance penalty compared to Intel's monolithic approach, but it's mainly a non-issue.
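To put some rough numbers on the yield argument, here's a minimal sketch using the classic Poisson die-yield model. The defect density and die areas are illustrative assumptions, not real foundry figures:

```python
import math

def die_yield(area_mm2: float, defects_per_mm2: float) -> float:
    """Poisson yield model: probability that a die of the given area has zero defects."""
    return math.exp(-area_mm2 * defects_per_mm2)

D0 = 0.002          # assumed defect density (defects/mm^2), purely illustrative
monolithic = 700.0  # hypothetical large monolithic high-core-count die
chiplet = 80.0      # hypothetical small chiplet; several together match the core count

print(f"monolithic die yield: {die_yield(monolithic, D0):.1%}")  # ~24.7%
print(f"single chiplet yield: {die_yield(chiplet, D0):.1%}")     # ~85.2%
# Small dies yield far better, and defective chiplets are discarded individually,
# so high core counts no longer require one giant, low-yield die.
```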

Nvidia, however, has been progressing and hasn't stopped. Yeah, the value of the 2000 series is shit, but it was more of a holdover until 7nm was ready. And looking at the 5000 series by comparison, even from a pessimistic point of view AMD had everything going for them: Nvidia had shit pricing because the 2000 series shipped ray tracing and tensor cores before they were ready to do anything meaningful, and the gaming performance wasn't that different from Pascal outside of the 2080 Ti. Yet somehow, the 5000 series not only had similar pricing to the 2000 series, it couldn't compete against Nvidia's high end while having neither ray tracing nor tensor cores. And the biggest kicker is that the 1080 Ti is still more powerful on average than the 5700 XT, and the 5000 series still had driver problems.

It's hard for me to recommend the 5000 series to anyone. People say the 2000 series is gonna age like milk due to having ray tracing and tensor cores before they were ready... Shit, where does that leave the 5000 series, which doesn't even have them while costing the same?

Anyway, unless AMD can pull a magic trick out of its hat, and I do want them to, the only way they will catch up to Nvidia is if Nvidia stops progressing or makes a big whoopsie... a bigger one than the 2000 series...




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

AMD went from competing with the 1060 with their best mainstream cards, the 480/580, to competing with the 2070 with the replacements for those cards, so much so that Nvidia was forced to lower the prices of its cards to keep them attractive and launch the Super versions to beat AMD.

If AMD can bring up the competition once again and fight head to head with Nvidia's 3080, or even better, launch a card that sits between the 3080 and the 3080Ti, that will already be a big improvement and bring some sense to the pricing of both camps.

Just because they won't launch the fastest/most powerful card in the world doesn't mean that they suck and their products are shitty. Some of you need to stop thinking only about the high end and pay a bit of attention to the rest of the range which, coincidentally, is where most of the money is made.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16GB RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Pemalite said:
vivster said:

It's as if I predicted that years ago. This really deserves its own thread. Let's watch console people go through the stages of grief.

1. Blame it on Xbox. PS5 will surely run it at 60fps with the power of SSD!

2. Blame it on the lazy/inexperienced developer because dem technologies be brand new yo.

3. Say that it must just be the especially huge and demanding games, and everything else will run at 60fps.

4. Say that 30fps really isn't that bad and that PC people are stupid for even demanding it.

Just going to say... I fucking called it. Years in advance before these consoles were even unveiled.

I think that applies to most regulars here in this thread. Consoles simply have limitations that make it impossible to really compete with PC, especially outside of the launch window.

Chazore said:
Only 10% faster than the 2080ti? I really hope their high end matches NV's 3000 series instead of playing sodding catch-up once again. I'm so tired of AMD just caring about the low to middle end and not throwing their weight around in the high-end sector. Yeah, I get it, high end is 1%, blah blah, but fuck that for a second; let's not let Nvidia just keep the high end on lockdown till the end of time, okay? Just... super tired of seeing the same song and dance for years now. I just want AMD to wallop Nvidia like they did Intel, so Nvidia gets its big-brain cap on, prices more fairly and actually ramps up their shit, y'know, like really strong, healthy, pro-consumer competition?

During the FX days, AMD needed to sell whatever they produced, and in sufficient quantities. That meant concentrating on mainstream chips, and those kept AMD afloat at the time. Additionally, the development of Ryzen dried up much of the resources for the graphics department, hence why Vega was such a limited release and it took so long for Navi to finally arrive. And even then, they have only released 2 chips based upon Navi/RDNA1 so far, simply because they didn't have much money to develop more back when they were designed in 2016/2017.

But with Ryzen becoming the success that it is, AMD could finally start developing more, and more broadly, again, and I think RDNA2 will reflect this with broader GPU launches. The last time AMD had a full-stack launch for a GPU generation was all the way back in 2012 with the original GCN architecture, and that was also the last time AMD could really afford to develop so many different dies at once - until now.

However, I don't think RDNA2 will be a full-stack launch just yet; that will come with RDNA3. Instead, AMD will finally tackle the high end again with cards that fight the 2080Ti and 3080Ti while letting RDNA1 pull its weight below that - though possibly with an added chip to fill the gap between the 5700 and 5500, or a new entry-level chip to replace Baffin/Lexa (outside of OEMs) and thus finally put Polaris to rest.

Captain_Yuri said:

We will see how it goes, but I have my doubts as well. If the Series X has 52 RDNA2 CUs, which is already a lot more than the 5700 XT's 40, but performs similar to a 2080, how many more will they be able to add to Big Navi so that it beats a 3080 without ridiculous cooling/power requirements?

I do think that's a pretty faulty comparison.

The reason? Microsoft has asked for severely clocked-down GPU parts to address potential heat issues (RROD, anyone?). The GPU in the PS5, on the other hand, is much smaller but runs at a much higher clock. Since both are the same architecture, you should at the very least take the XSX GPU at the PS5's clock speed to estimate the potential raw performance of a big(ish) RDNA2. At that clock speed, for instance, the XSX GPU would be at 15 TF, which is nothing to sneeze at, and a full 64-CU GPU at that speed could certainly also threaten Ampere.
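For what it's worth, the arithmetic behind those figures checks out. A quick sketch, assuming RDNA's 64 shaders per CU and 2 FLOPs per clock from fused multiply-add:

```python
def peak_tflops(cus: int, clock_ghz: float) -> float:
    """Peak FP32 throughput: CUs * 64 shaders * 2 FLOPs/clock * clock rate."""
    return cus * 64 * 2 * clock_ghz / 1000

print(f"XSX as announced:       {peak_tflops(52, 1.825):.2f} TF")  # ~12.15 TF
print(f"XSX CUs at PS5 clocks:  {peak_tflops(52, 2.23):.2f} TF")   # ~14.84 TF, the '15 TF' above
print(f"Full 64 CUs at 2.23GHz: {peak_tflops(64, 2.23):.2f} TF")   # ~18.27 TF, hypothetical big RDNA2
```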

Last edited by Bofferbrauer2 - on 12 May 2020

JEMC said:

AMD went from competing with the 1060 with their best mainstream cards, the 480/580, to competing with the 2070 with the replacements for those cards, so much so that Nvidia was forced to lower the prices of its cards to keep them attractive and launch the Super versions to beat AMD.

If AMD can bring up the competition once again and fight head to head with Nvidia's 3080, or even better, launch a card that sits between the 3080 and the 3080Ti, that will already be a big improvement and bring some sense to the pricing of both camps.

Just because they won't launch the fastest/most powerful card in the world doesn't mean that they suck and their products are shitty. Some of you need to stop thinking only about the high end and pay a bit of attention to the rest of the range which, coincidentally, is where most of the money is made.

That'd be nice for sure, as it'd give me some options to look at.

I only pay attention to the high end because that's what I seek in a GPU. I don't want to settle for mid- to low-end upgrades time after time, and I also use my GPU for rendering, so there's that use for the high end as well.

Yeah, the low to mid range is always where the money is, but really, so is mobile gaming. Normies in any industry make the most money for anyone; that's just the way things are and always will be.



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

Bofferbrauer2 said:

At that clock speed, for instance, the XSX GPU would be at 15 TF, which is nothing to sneeze at, and a full 64-CU GPU at that speed could certainly also threaten Ampere.

Do you think the XSX is going to be a far bigger threat to an unreleased GPU line? I dunno, I find that a bit hard to believe. For the longest time, PCs have managed to outpace console performance, even right out of the gate, or within a year or two at most. I just don't expect the gap to suddenly open up in the consoles' favor over PC.



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

Bofferbrauer2 said:

Captain_Yuri said:

We will see how it goes, but I have my doubts as well. If the Series X has 52 RDNA2 CUs, which is already a lot more than the 5700 XT's 40, but performs similar to a 2080, how many more will they be able to add to Big Navi so that it beats a 3080 without ridiculous cooling/power requirements?

I do think that's a pretty faulty comparison.

The reason? Microsoft has asked for severely clocked-down GPU parts to address potential heat issues (RROD, anyone?). The GPU in the PS5, on the other hand, is much smaller but runs at a much higher clock. Since both are the same architecture, you should at the very least take the XSX GPU at the PS5's clock speed to estimate the potential raw performance of a big(ish) RDNA2. At that clock speed, for instance, the XSX GPU would be at 15 TF, which is nothing to sneeze at, and a full 64-CU GPU at that speed could certainly also threaten Ampere.

"severely clocked down GPU parts"

Do we know this for sure? Because I wouldn't call 1.825GHz severely clocked down. If anything, it's a pretty good base clock. But we will see how it performs, since increasing the frequency might sound great on paper, but the performance you get back isn't proportional.
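A rough way to see why the payoff isn't proportional: only the clock-bound share of a frame speeds up with frequency, while memory- and bandwidth-bound work doesn't. The clock-bound fractions below are made-up assumptions, just to show the shape of the curve:

```python
def clock_scaling(clock_ratio: float, clock_bound: float) -> float:
    """Amdahl-style estimate: only the clock-bound share of a frame scales with frequency."""
    return 1 / ((1 - clock_bound) + clock_bound / clock_ratio)

ratio = 2.23 / 1.825  # ~22% higher clock (PS5-like vs XSX-like)
for frac in (1.0, 0.7, 0.5):  # hypothetical clock-bound fractions
    print(f"{frac:.0%} clock-bound -> {clock_scaling(ratio, frac):.1%} of baseline performance")
# 100% -> ~122%, 70% -> ~115%, 50% -> ~110%: the higher clock buys less and less.
```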




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:
vivster said:

If the Ampere rumors are true, I have a hard time believing that Big Navi will beat a 3080, let alone a Ti. When was the last time AMD could go toe to toe with a non-Titan flagship?

OMG, I just watched that video. Please tell me you guys are NOT pronouncing it "Ampeer" and instead use the correct pronunciation "Ampair".

We will see how it goes, but I have my doubts as well. If the Series X has 52 RDNA2 CUs, which is already a lot more than the 5700 XT's 40, but performs similar to a 2080, how many more will they be able to add to Big Navi so that it beats a 3080 without ridiculous cooling/power requirements?

Keep in mind that the 5700XT is a mid-range GPU meant to replace Polaris, which was at 36 CUs...
And when comparing the 5700XT... it manages to match the RTX 2080, but nVidia is still stuck at 12nm (which is an enhanced 14/16nm process, which in turn is based on 20nm). Things will really get interesting when there is fabrication parity between AMD and nVidia.

AMD is enjoying some good profit margins on RDNA though.

Plus, RDNA1 is a hybrid GPU architecture; it's not purely RDNA, it's built on top of Graphics Core Next and comes with some of its pros and cons.
RDNA2 should be a significant departure from that and should bring forth a slew of efficiency enhancements... Touch wood.

I think it's clear at this point that it's going to take AMD a few years to do to nVidia what it did to Intel. nVidia hasn't stopped innovating and bringing in new features and enhancements, so it's a much harder task.

Chazore said:
Bofferbrauer2 said:

At that clock speed, for instance, the XSX GPU would be at 15 TF, which is nothing to sneeze at, and a full 64-CU GPU at that speed could certainly also threaten Ampere.

Do you think the XSX is going to be a far bigger threat to an unreleased GPU line? I dunno, I find that a bit hard to believe. For the longest time, PCs have managed to outpace console performance, even right out of the gate, or within a year or two at most. I just don't expect the gap to suddenly open up in the consoles' favor over PC.

RDNA2 and Ampere should start trickling onto the market in around the same time frame, putting the PC ahead of the consoles once again.

It's like when the PlayStation 3 launched using a "high-end" GeForce 7-class GPU: nVidia had dropped the GeForce 8000 series a month or two prior, which made the GeForce 7 series seem highly antiquated.



--::{PC Gaming Master Race}::--

Chazore said:
Captain_Yuri said:

Dat Tease

Dammit Jensen, you're not supposed to cook them

Looks like with the 3000 series they also upgraded Jensen's sense of humor.

I got a good laugh out of this, but as soon as the clip was over I thought to myself: if he had been wearing his leather jacket with an apron over top of it, now that would have been the icing on the cake!

Your Moe pic here is just so spot on!



Bofferbrauer2 said:
Pemalite said:

Just going to say... I fucking called it. Years in advance before these consoles were even unveiled.

I think that applies to most regulars here in this thread. Consoles simply have limitations that make it impossible to really compete with PC, especially outside of the launch window.

In this case it's not about the limitations. It's about console developers' unwillingness to sacrifice visual fidelity for performance. The consoles are plenty strong enough to deliver a solid 60fps at 4K, but developers refuse to go there or even offer the option of a performance mode. It's the same thing as the last two gens, where consoles had plenty of power to run all games at 1080p60, but developers opted not to. And why would they, when they have a rabid fanbase of people who don't even know what it's like to have more? Console developers live in luxury, and they won't give it up.

That's why it was so easy to predict that games wouldn't run at 60fps, no matter how fast the consoles would be.
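The frame-time budgets behind that trade-off are simple arithmetic; a quick sketch of what a performance mode actually buys:

```python
# Time budget per frame at each target frame rate.
for fps in (30, 60, 120):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")  # 33.3, 16.7, 8.3

# Pixel load of common render resolutions relative to native 4K,
# i.e. roughly what dropping resolution for a performance mode saves.
native_4k = 3840 * 2160
for w, h in ((2560, 1440), (1920, 1080)):
    print(f"{w}x{h} is {w * h / native_4k:.0%} of the pixels of native 4K")  # 44%, 25%
```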

Last edited by vivster - on 13 May 2020

If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.