shikamaru317 said:
I know it seems like you would need 4x more GPU power to output 4x more pixels, but you actually don't. I could show you multiple PC benchmarks proving that you only need about 2x more GPU power to hit native 4K. Obviously it varies by game; some engines struggle with 4K more than others, so some games need more than 2x the power, but 90% of games fall somewhere between 2-2.5x more power needed to hit 4K. Now, obviously, if you are running "4K" textures as well, then you definitely need more GPU power, plus RAM fast enough to stream those textures quickly. Anaconda definitely won't hit native 4K/60 fps in all games; sacrifices to either resolution or framerate will be needed for the more demanding games, and as the generation progresses and PC pulls further ahead, more and more games won't be able to hit 4K on Anaconda. But I'm sure MS has plans to release an even more powerful console a few years into the generation, much like the X this gen. I have a feeling MS is planning to move toward shorter hardware cycles, with possibly 3 differently specced consoles supported at any one time. Most of the games that can't hit native 4K on the XB1 X are games that run at 900p or 720p on the XB1, btw. Only a few games that are 1080p on the XB1 aren't 4K on the XB1 X. It was Soundwave who said that MS definitely wouldn't let Sony beat them on specs again, not me. All I said was that Phil hinted at MS planning to beat Sony on specs again next gen. |
On confusing you with Soundwave, my bad. But since you picked up the conversation with that as part of it, I didn't really notice. And sure, I believe MS would like, and probably is aiming, to beat Sony on specs.
No console next gen will have only 4K/60 fps games; 60 fps isn't the standard focus on consoles, since devs would rather increase IQ than hit 60 fps, except in specific genres. But since I'm no specialist on this, I won't assert that only a 2x difference in power would guarantee 4K (the PS4 is 50% more powerful than the X1, and that didn't guarantee a very big difference in pixel count in several games).
I can certainly see someone paying double for a 4K machine versus a Full HD one, but the power difference seems slim, and the expected difference would mainly be just 4x the resolution without other assets matching the jump.
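The resolution arithmetic being debated above can be sketched quickly. The pixel counts are exact; the ~2-2.5x power figure is the claim from the quoted post, not a measured benchmark:

```python
# Pixel-count scaling from 1080p to 4K UHD.
def pixels(width, height):
    return width * height

fhd = pixels(1920, 1080)   # 1080p (Full HD)
uhd = pixels(3840, 2160)   # 4K UHD

pixel_ratio = uhd / fhd
print(f"4K has {pixel_ratio:.0f}x the pixels of 1080p")  # prints 4x

# Claimed GPU-power ratio needed in practice (per the post): ~2-2.5x,
# the idea being that per-frame costs like geometry, simulation, and
# draw-call overhead do not scale with resolution.
for power_ratio in (2.0, 2.5):
    budget = power_ratio / pixel_ratio
    print(f"{power_ratio}x power = {budget:.0%} of the 1080p per-pixel budget at 4K")
```

This also shows why the two sides here aren't really in conflict: 4K is exactly 4x the pixels, but whether that demands 4x the GPU power depends on how much of the frame cost is resolution-independent.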
Pemalite said:
First AMD needs to actually invent a GPU that is 2.5x faster than the Xbox One X GPU.
Why couldn't there be any deviation in CPU performance? This generation certainly has every console with a different CPU performance profile.
We really don't though.
I am expecting single CCX on the CPU side for all devices. It offers the best price/performance... And allows for more of the transistor budget to be sunk into the GPU side of the equation.
Vapor chamber doesn't guarantee high clock and silent operation. - You still need a fan.
Indeed.
Indeed. Although to be fair, the Xbox 360's GPU was the better chip at the end of the day. Totally fair point, with only a few games taking advantage of the PS3's better hardware, and with much more hassle than those using the X360's better GPU.
Precisely. They wouldn't. AMD and other hardware companies would be under a non-disclosure agreement and would not be allowed to disclose what the other company is doing. Certainly, when MS and Sony shop around for the tech, since they will likely use AMD (or, less likely, Nvidia) instead of developing their own chips, they will know what power to expect at each price point. And they can make assumptions about how much the competitor will charge and the loss it is willing to take. They can also estimate which hardware generation and specs will be the best bang for the buck in console form. But all of that can be useless if the competitor decides to be bold (or dumb).
Power doesn't guarantee success/failure, that is what history has ultimately told us. Yes, performance/price is quite important, as are the games that will make use of it.
Depends on how much of a cutback Lockhart has.
It's true though.
Eh. Except you contradict yourself from the very outset of your post by stating that flops isn't an indicator for gauging complete system performance.
The RX 580 is mid-range. It's not a GPU that is well suited for 4k with settings dialed up... And it's certainly not a GPU that is ideal for 1080P. - It's ideal resolution is actually somewhere in between, 1440P. So you would agree that somewhere around a 4x difference in performance would be expected to deliver a similar level of balance at 4K as at 1080p?
It is all completely dependent on the GPU architecture and how well it would hit higher resolutions and what the games/engines are trying to accomplish.
4k textures were used even during the 7th gen. Texture resolution and screen resolution aren't 1:1 pixel to pixel.
|
Answers in bold
shikamaru317 said:
I know, I was just using the 580 and RX Vega 64 to illustrate my point, since they are currently the two AMD PC GPUs closest to the rumored GPU specs for Lockhart and Anaconda. I'm also aware that actual 4K textures aren't that impressive and that many games use them even at lower resolutions like 1080p. That's why I used quotation marks when I said "4K textures". The texture resolutions that people usually run at 4K screen resolution are more like 8K or 16K, but they are often called 4K by people writing articles and such, since they are paired with a 4K screen resolution. |
More like textures for 4K games.
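The texture-versus-screen-resolution point above can be made concrete. A "4K" texture is 4096x4096 texels regardless of display resolution; the sizes below assume uncompressed RGBA8 (4 bytes per texel) with no mipmaps, purely for illustration, since real games use block compression that cuts these figures several-fold:

```python
# Memory footprint of a square texture at various resolutions,
# assuming uncompressed RGBA8 (4 bytes per texel), no mipmaps.
def texture_mib(side, bytes_per_texel=4):
    return side * side * bytes_per_texel / (1024 ** 2)

for side in (1024, 2048, 4096, 8192):
    print(f"{side}x{side} texture: {texture_mib(side):.0f} MiB uncompressed")
# 1024 -> 4 MiB, 2048 -> 16 MiB, 4096 -> 64 MiB, 8192 -> 256 MiB
```

Each doubling of texture side length quadruples memory, which is why streaming speed and RAM capacity, not raw GPU compute, are the bottleneck shikamaru317 is pointing at for high-resolution texture packs.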
duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"
http://gamrconnect.vgchartz.com/post.php?id=8808363
Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"
http://gamrconnect.vgchartz.com/post.php?id=9008994
Azzanation: "PS5 wouldn't sold out at launch without scalpers."