shikamaru317 said:
DonFerrari said:

Sorry, but just from quick math you need more than 2x the power to generate 4K from 1080p (likely 4x; otherwise, why do you think the X1X can't achieve 4K in all the games that are near 1080p on X1? Let's not even talk about the extra RAM and buffers, plus all the assets).

Also, you said with certainty that MS would never allow Sony to have more powerful HW, and the only way to ascertain that is either launching much later or spying, because even if they decided to launch a console at 800 USD, nothing prevents Sony from going crazier and launching one at 1000 USD.

Plus, since performance scaling isn't linear, who would really pay 60% more to get just twice the processing power? That would be a very small crowd.

I know it seems like you would need 4x more GPU power to output 4x more pixels, but you actually don't. I could show you multiple PC benchmarks demonstrating that you only need about 2x more GPU power to hit native 4K. It obviously varies by game; some engines struggle with 4K more than others, so some games need more than 2x more power, but roughly 90% of games fall somewhere between 2x and 2.5x more power needed to hit 4K. Now, obviously, if you are running "4K" textures as well, then you definitely need more GPU power and RAM fast enough to stream those textures quickly.

Anaconda definitely won't hit native 4K/60 fps in all games; some sacrifices will need to be made to either resolution or framerate in the more demanding ones, and as the generation progresses and PC pulls further ahead, more and more games won't be able to hit 4K on Anaconda. But I'm sure MS has plans to release an even more powerful console a few years into the generation, much like the X this gen. I have a feeling that MS is planning to move towards shorter hardware cycles, with possibly three differently specced consoles being supported at any one time.
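To put rough numbers on that, here's a minimal back-of-the-envelope sketch in Python. The efficiency factor is an assumption fitted to the 2-2.5x figure above, not a measured constant:

```python
# Rough resolution-scaling arithmetic. The "efficiency" factor is an
# assumption: it models how per-pixel cost falls at higher resolutions
# (fixed per-frame work gets amortised), fitted to the 2-2.5x claim.

PIXELS_1080P = 1920 * 1080   # ~2.07 million pixels
PIXELS_4K    = 3840 * 2160   # ~8.29 million pixels

pixel_ratio = PIXELS_4K / PIXELS_1080P   # exactly 4.0

print(f"Naive expectation: {pixel_ratio:.0f}x more GPU power")

# Observed in many PC benchmarks: the real multiplier lands lower.
for efficiency in (0.5, 0.6):
    print(f"With efficiency factor {efficiency}: "
          f"~{pixel_ratio * efficiency:.1f}x more power needed")
```

Running it gives 2.0x-2.4x, which is where most of the benchmarks I've seen land.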

Most of the games that can't hit native 4K on XB1 X are games that run at 900p or 720p on XB1, btw. Only a few games that are 1080p on XB1 aren't 4K on XB1 X.

That was Soundwave who said that MS definitely wouldn't let Sony beat them on specs again, not me. All I said was that Phil hinted at MS planning to beat Sony again on specs next gen. 

On confusing you with Soundwave, my bad. But since you picked up the conversation with that as part of it, I didn't really notice. And sure, I believe MS would like to, and probably is aiming to, beat Sony on specs.

No console next gen will have only 4K/60fps games; 60 fps isn't the standard focus on consoles, as devs would rather increase IQ than hit 60 fps, except in specific genres. But since I'm no specialist on this, I won't assert that only a 2x difference in power would assure 4K (PS4 is 50% more powerful than X1, and that didn't guarantee a very big difference in pixel count in several games).

I can certainly see someone paying double for a 4K machine versus a Full HD one, but the power difference seems slim, plus the expected difference would mainly be just 4x the resolution without the other assets matching the jump.

Pemalite said:

Otter said:

I think this was always going to be the case with a mid-gen upgrade; at least on paper, I don't expect Anaconda to do much more than 2.5x the performance of the XB1X.

First AMD needs to actually invent a GPU that is 2.5x faster than the Xbox One X GPU.

Otter said:

Anaconda and Lockhart will both provide a new generation of graphical presentation that is not available on X1X: Lockhart will just do so at the expense of native 4k targets, but without any compromise to the CPU and overall performance.

Why couldn't there be any deviation in CPU performance? This generation certainly has every console with a different CPU performance profile.
E.g. Xbox One: 1.75GHz, Xbox One X: 2.3GHz (with some additional offloading), PlayStation 4: 1.6GHz, PlayStation 4 Pro: 2.13GHz.

Trumpstyle said:

Now, with 2 leaks on Microsoft's next-gen strategy, and assuming Sony does a $399 console, we pretty much know the specs of these consoles, as desktop Zen 2 and Navi have been leaked. Based on the leaks and a little speculation, the specs are looking like this:

We really don't though.

Trumpstyle said:

Xbox Two (Lockhart) $300
CPU: 6-core, 12-thread Zen 2, clocked at 2.4GHz
GPU: Navi with 32 CUs, 12GB GDDR6 RAM, 288GB/s bandwidth, 192-bit bus, AMD Radeon RX 590 performance
Storage: 1TB mechanical drive with 64GB SSD storage

PS5 $400
CPU: 8-core, 16-thread Zen 2, clocked at 2.6GHz
GPU: Navi with 48 CUs, 16GB GDDR6 RAM, 448GB/s bandwidth, 256-bit bus, GeForce 1080/Vega 64 performance
Storage: 1TB mechanical drive with 128GB SSD storage

Xbox Two+ (Anaconda) $500
CPU: 8-core, 16-thread Zen 2, clocked at 3GHz
GPU: Navi with 56 CUs, 24GB GDDR6 RAM, 672GB/s bandwidth, 384-bit bus, GeForce 2080 performance
Storage: 1TB mechanical drive with 128GB SSD storage

I am expecting single CCX on the CPU side for all devices. It offers the best price/performance... And allows for more of the transistor budget to be sunk into the GPU side of the equation.

Navi with 56 CUs being equivalent to a GeForce 2080 is a bit of a claim. Vega 64 has 64 CUs, which is 14% more... yet ends up being 40-60% slower than the 2080. I highly doubt AMD has made such significant strides in boosting Graphics Core Next efficiency. Grains of salt shall be had.
Granted, at 7nm they should be able to bolster clockrates, but... still a big gap to close.
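For reference, here's the raw-throughput arithmetic behind that CU comparison, as a sketch. The 64-shaders-per-CU layout for Navi is an assumption carried over from GCN, since Navi's CU structure wasn't confirmed at the time:

```python
# Theoretical FP32 throughput for a GCN-style GPU:
#   TFLOPs = CUs * 64 shaders per CU * 2 ops per clock (FMA) * clock (GHz) / 1000
# Assumes Navi keeps GCN's 64-shader CU layout (speculation here).

def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(f"Vega 64 (64 CUs @ 1.546 GHz boost): {tflops(64, 1.546):.1f} TFLOPs")  # ~12.7
print(f"Hypothetical 56 CU part @ same clock: {tflops(56, 1.546):.1f} TFLOPs")  # ~11.1
print(f"CU advantage of 64 over 56: {64/56 - 1:.1%}")  # ~14.3%
```

Raw FLOPs alone put 56 CUs within about 12% of Vega 64, which is exactly why closing a 40-60% gap to the 2080 would have to come from efficiency gains rather than unit count.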

Trumpstyle said:

Will have Vapor chamber cooling for extra high clocks :) and be quiet as a stone.

A vapor chamber doesn't guarantee high clocks and silent operation - you still need a fan.
It does allow for more efficient movement of heat, however.

Trumpstyle said:

We should get dev kit leaks within 7 months if history repeats itself, as in June 2012 there was already a forum post at Beyond3D saying the PS4 would have an SoC containing an 8-core Jaguar CPU and a GPU similar to the AMD Radeon 7850.

Indeed.
Some dev kits were a deviation though, as some were claimed to be using TeraScale at some point.

DonFerrari said:

PS3 and PS4 had better HW than X360 and X1.

Indeed. Although to be fair, the Xbox 360's GPU was the better chip at the end of the day.

Totally fair point; only a few games took advantage of the PS3's better HW, and with much more hassle than those using the X360's better GPU.

DonFerrari said:

And how would MS know for sure that their hardware isn't outpowered before Sony either announces or releases the HW? The only way they can be sure is if they always release their HW over 12 months later than Sony (because not every HW change can be made quickly, and worse, there is the time needed to develop SW for it)... or are you suggesting MS will have access to confidential development documents from Sony?

Precisely. They wouldn't. AMD and other hardware companies would be under a non-disclosure agreement and would not be allowed to disclose what the other company is doing.

Certainly, when MS or Sony shop around for the tech, since they will likely be using AMD or Nvidia (less likely) instead of developing their own chips, they will know what power to expect at each cost point. They may also make assumptions about how much the competitor will charge and the loss they are willing to take, plus estimates of which HW generation and specs will be the best bang for the buck in console form. But all of that can be useless if the competitor decides to be bold (or dumb).

DonFerrari said:

PS4 is the first console ever to win while being the most powerful at the start of a gen; in all previous gens, the most powerful one lost. So being the most powerful doesn't guarantee good sales (we have had plenty of overpowered duds in the past). Being less expensive but giving a below-standard experience doesn't solve it either, as shown by the Wii U and others.

Power doesn't guarantee success or failure; that is what history has ultimately told us.
What does determine it is the right performance/price ratio.

The PlayStation 3 was higher performing, but also overpriced at its launch, paving the way for the Xbox 360 to take a massive chunk of marketshare.
The PlayStation 4 was higher performing, but at a lower price... and that resonated with consumers across the spectrum.

Yes, performance/price is quite important, plus the games that will make use of it.

shikamaru317 said:

Actually, Lockhart shouldn't hold back Anaconda at all. The rumored specs for Lockhart are high enough that the only downgrade should be resolution; Lockhart should play next-gen games at 1080p that Anaconda plays at 4K, with no other graphical downgrades. It therefore shouldn't hold back graphics for exclusives or multiplats next gen.

Depends on how much of a cutback Lockhart has.
If it's got fewer ROPs, less bandwidth, fewer geometry units, CPU cores, texture mapping units, lower CPU clocks, less compute, and so on... that has a flow-on effect.

Developers do need to build their games with the lowest common denominator in mind.

DonFerrari said:

It's not out of the blue that several PC gamers blame consoles for holding them back. Consoles sell much more SW, so they are taken more into account.

It's true though.
It's no coincidence that around 2014, multiplat games took a relatively large leap forward in terms of fidelity, when consoles caught up to mid-range PCs.

shikamaru317 said:

So, Anaconda is actually more likely to hold back graphics than Lockhart is, simply because native 4k/60 fps is very demanding. 

Eh. Except you contradict yourself from the very outset of your post by stating that flops isn't an indicator for gauging complete system performance.

shikamaru317 said:

Also, based on current PC GPU benchmarks, 2x more GPU power is actually just about enough to hit 4K with the same graphics settings. For instance, the 6 TFLOP AMD RX 580 can achieve a locked 60 fps at 1080p, ultra settings, on Battlefield V, while the 10 TFLOP RX Vega 64 can hit 50 fps on ultra settings at 4K (if AMD already had a 12 TFLOP GPU, it very likely would be able to hit 60 fps at 4K, ultra settings).

The RX 580 is mid-range. It's not a GPU that is well suited for 4K with settings dialed up... and it's certainly not a GPU that is ideal for 1080p - its ideal resolution is actually somewhere in between, at 1440p.

At 4K, you need to start cutting back on visual effects... and at that point, you are better off at 1440p with settings pushed up.
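Running the quoted Battlefield V numbers through a quick script shows what they imply, taking the benchmark figures at face value (real scaling is rarely this clean):

```python
# Sanity check on the quoted BF V figures: RX 580 (6 TFLOPs) at a
# locked 1080p/60, RX Vega 64 (10 TFLOPs) at 4K/50, both ultra.

TFLOPS_580, FPS_580 = 6, 60    # RX 580, 1080p ultra
TFLOPS_V64, FPS_V64 = 10, 50   # RX Vega 64, 4K ultra

px_1080p = 1920 * 1080
px_4k    = 3840 * 2160

# Pixels pushed per second per TFLOP at each resolution:
rate_1080p = px_1080p * FPS_580 / TFLOPS_580   # ~20.7M px/s per TFLOP
rate_4k    = px_4k * FPS_V64 / TFLOPS_V64      # ~41.5M px/s per TFLOP

# Per-TFLOP throughput roughly doubles at 4K - the sub-linear
# scaling shikamaru317 is describing.
print(rate_4k / rate_1080p)     # 2.0

# TFLOPs needed for 4K/60 if the 4K rate holds:
print(px_4k * 60 / rate_4k)     # 12.0 TFLOPs, matching the extrapolation
```

Of course, a single game on reported averages doesn't generalise; bandwidth, geometry, and settings cutbacks all move these numbers.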

So you would agree that somewhere around a 4x difference in performance would be expected to keep a similar level of settings and balance going from 1080p to 4K?

shikamaru317 said:

I know it seems like you would need 4x more GPU power to output 4x more pixels, but you actually don't. I could show you multiple PC benchmarks demonstrating that you only need about 2x more GPU power to hit native 4K. It obviously varies by game; some engines struggle with 4K more than others, so some games need more than 2x more power, but roughly 90% of games fall somewhere between 2x and 2.5x more power needed to hit 4K.

It is all completely dependent on the GPU architecture, how well it scales to higher resolutions, and what the games/engines are trying to accomplish.

Back in the 90s, one of the biggest limitations to hitting higher resolutions was actually RAM capacity; if you had a GPU with a smaller amount of VRAM, you might have been limited to 640x480 rather than, say... 1024x768. Some manufacturers got around that issue by employing texture compression.
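A quick worked example of that VRAM ceiling, as an illustrative sketch; actual buffer layouts varied per card and game:

```python
# Why VRAM capped resolution in the 90s: the front and back colour
# buffers plus a z-buffer (all 16-bit here) must fit in VRAM alongside
# textures. Illustrative layout only; real setups varied.

def framebuffer_mb(width: int, height: int, bytes_per_pixel: int = 2,
                   buffers: int = 3) -> float:
    # buffers = front + back + z, each at bytes_per_pixel
    return width * height * bytes_per_pixel * buffers / (1024 * 1024)

for w, h in [(640, 480), (800, 600), (1024, 768)]:
    print(f"{w}x{h}: {framebuffer_mb(w, h):.2f} MB before any textures")
```

So a 2 MB card fits 640x480 (about 1.8 MB) but has no hope of double-buffered 1024x768 (about 4.5 MB), which is where texture compression and clever buffer tricks came in.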

Moving towards the dawn of TnL (fixed-function stuff), fillrate became a massive limitation... various GPU designs came out with strange multi-texturing set-ups, and if a game didn't leverage them appropriately, fillrate could be cut in half or down to a quarter of its full speed, making a massive impact on total performance and thus the resolution you could operate at. Some GPUs got around that issue by taking a tile-based approach to rendering.

Today, workloads are very much compute-heavy, but if a game is heavy on geometry or texturing, it can make a fairly large impact on an AMD GPU's ability to hit higher resolutions, as performance will suffer.

shikamaru317 said:

Now, obviously, if you are running "4K" textures as well, then you definitely need more GPU power and RAM fast enough to stream those textures quickly.

4K textures were used even during the 7th gen. Texture resolution and screen resolution aren't 1:1, pixel for pixel.
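To make the distinction concrete, here's a small sketch of what a texture's resolution costs in memory, independent of screen resolution; sizes are uncompressed RGBA8 and ignore mipmaps (roughly 33% extra):

```python
# Texture resolution vs screen resolution: a "4K" (4096x4096) texture
# can be sampled in a 720p game just as well as in a 4K one - the cost
# is memory and bandwidth, not output resolution.

def texture_mb(side: int, bytes_per_texel: int = 4) -> float:
    return side * side * bytes_per_texel / (1024 * 1024)

for side in (1024, 2048, 4096):
    print(f"{side}x{side}: {texture_mb(side):.0f} MB uncompressed")  # 4, 16, 64 MB
```

That 64 MB per uncompressed 4K texture is why streaming speed and RAM come up whenever "4K textures" do.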

 

Answers below

shikamaru317 said:
Pemalite said: 

The RX 580 is mid-range. It's not a GPU that is well suited for 4K with settings dialed up... and it's certainly not a GPU that is ideal for 1080p - its ideal resolution is actually somewhere in between, at 1440p.

At 4K, you need to start cutting back on visual effects... and at that point, you are better off at 1440p with settings pushed up.

4K textures were used even during the 7th gen. Texture resolution and screen resolution aren't 1:1, pixel for pixel.

I know, I was just using the 580 and RX Vega 64 to illustrate my point, since they are currently the two closest AMD PC GPUs to the rumored GPU specs for Lockhart and Anaconda.

I'm also aware that actual 4K textures aren't that impressive and that many games use them even at lower resolutions like 1080p. That's why I used quotation marks when I said "4K textures". The actual texture resolutions that people usually run at 4K screen resolution are more like 8K or 16K, but they are often called 4K by people writing articles and such, since they are paired with a 4K screen resolution.

More like textures for 4K games.


