Pemalite said:
No. You can't. It's inaccurate... Which my Radeon RX 5500 example proves without a doubt. |
I think Digital Foundry has answered your theory flawlessly. Also, tell me what metric you suggest.
What do you think?
I am excited for next gen | 22 | 61.11%
I cannot wait to play next gen consoles | 4 | 11.11%
I need to find another th... | 2 | 5.56%
I worried about next gen | 8 | 22.22%
Total: | 36
HollyGamer said: You can scale down a game engine, but you will possibly lose the benefit of what future and more powerful hardware can do. The tech will stagnate, like how CoD uses the same engine from the PS3 era, or Bethesda on every Fallout game. |
CoD has made some great strides in visuals over the years, despite being based on a derivative of the Quake III engine (id Tech 3).
It's just that they have different design goals; low latency and a high framerate are generally the priorities for those engines...
For example, with Modern Warfare 2 they introduced texture streaming, which meant that texturing wasn't limited to the tiny pool of DRAM the PlayStation 3 and Xbox 360 had, resulting in a substantial uptick in visual fidelity, especially on the texturing side.
https://community.callofduty.com/t5/Call-of-Duty-Modern-Warfare-2/Modern-Warfare-2-Texture-Streaming/ba-p/9900744
https://www.eurogamer.net/articles/digitalfoundry-modern-warfare-2-face-off
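For anyone unfamiliar with the technique: texture streaming keeps only the texture data the current scene needs resident in a fixed memory budget, loading and evicting as the player moves, instead of holding everything in the console's small DRAM pool. A toy sketch of the idea (all names and sizes are made up; this is not IW engine code):

```python
from collections import OrderedDict

class TextureStreamer:
    """Keeps resident textures within a fixed memory budget,
    evicting the least recently used ones when space runs out."""

    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.resident = OrderedDict()  # texture id -> size in bytes
        self.used = 0

    def request(self, tex_id, size):
        """Called when a texture is needed for the current frame."""
        if tex_id in self.resident:
            self.resident.move_to_end(tex_id)  # mark as recently used
            return
        # Evict least recently used textures until the new one fits.
        while self.used + size > self.budget and self.resident:
            _, freed = self.resident.popitem(last=False)
            self.used -= freed
        self.resident[tex_id] = size
        self.used += size

streamer = TextureStreamer(budget_bytes=64 * 2**20)   # 64 MiB pool
streamer.request("rock_albedo", 16 * 2**20)
streamer.request("wall_normal", 32 * 2**20)
streamer.request("sky_cubemap", 32 * 2**20)           # evicts rock_albedo
```

The win on PS3/360-class hardware was exactly this: the resident budget stays tiny, but the textures cycling through it can come from a far larger set on disc.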
We saw further gains with Advanced Warfare, which brought a new lighting model and a much-improved post-process pipeline.
https://www.eurogamer.net/articles/digitalfoundry-2014-call-of-duty-advanced-warfare-face-off
The Call of Duty: Modern Warfare reboot saw great strides in geometry, improved lighting, shadowing and more.
https://www.eurogamer.net/articles/digitalfoundry-2019-cod-modern-warfare-delivers-series-most-advanced-visuals-yet
Every time they updated the engine, they improved the core visuals.
https://en.wikipedia.org/wiki/IW_engine
Call of Duty 2 (IW 2.0 engine): Normal Maps, Bloom, Improved Shadowing.
Call of Duty 4 (IW 3.0 engine): Improved Lighting, Particles, Self Shadowing.
Call of Duty: Modern Warfare 2 (IW 4.0 engine): Texture and Mesh Streaming for better textures and models.
Call of Duty: Modern Warfare 3 (IW 5.0 engine): Improvements to streaming for larger environments, Improved Shadowing, Improved Reflections.
Call of Duty: Ghosts (IW 6.0 engine): Model geometry subdivision, HDR Lighting, Displacement Mapping, Tessellation, Specular Particles.
Call of Duty: Infinite Warfare (IW 7.0 engine): Physically based rendering for more accurate materials and lighting.
And I can keep going. The point is, it doesn't make sense to reinvent the wheel... because these days, developers do not rebuild engines from scratch anymore.
Take the Creation Engine for instance. Yes, it looks dated today... but when it debuted with Skyrim in 2011 it was a decent-looking engine that showcased what the 7th-gen consoles could do with an open-world environment... And that engine is based upon Oblivion's Gamebryo engine from 2006... which in turn is based on Morrowind's NetImmerse engine from 2002.
Again, that engine spans multiple console generations and has scaled across hardware really well. The Elder Scrolls 6 is likely to be built on those same engine foundations, and will look absolutely stunning doing it.
Here is the jump from Morrowind on Xbox to Skyrim on Xbox 360.
Generational leap, no?
Even Unreal Engine is based on the engines that came before it; Unreal Engine 3 and 4 still contain code from the original Unreal Engine from the late 90's, for example. Again: why reinvent the wheel?
https://en.wikipedia.org/wiki/Unreal_Engine
Essentially, that engine technology has scaled (in terms of hardware power) from the Dreamcast right up to the Xbox One X.
The latest Battlefield, a graphical powerhouse (especially on PC), is based on the Frostbite engine, the same engine which debuted with Battlefield: Bad Company on the Xbox 360 back in 2008. The engine has had some massive overhauls since, with Frostbite 1.5, 2.0, 3.0, 4.0... and I wouldn't be surprised if code from Refractor still lingered somewhere.
https://en.wikipedia.org/wiki/Frostbite_(game_engine)
I think the reason why there is this misconception that older game engines hold back newer platforms is because people do not fundamentally understand what a game engine is or what it actually does.
A game engine is NOT just the "thing" that draws the pretty pictures on your display. A game engine is a bunch of components working under a "framework", be it audio, physics, networking and more. Thus you can rewrite one part of the engine, like say the lighting and material shaders, and take advantage of a newer platform's graphics capabilities.
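That component view can be sketched in a few lines: each subsystem sits behind its own interface, so one of them (here, the renderer) can be swapped out without touching audio, physics or networking. Purely illustrative; these class names are not any real engine's API:

```python
class ForwardRenderer:
    def draw(self, scene):
        return f"forward-rendered {scene}"

class PBRRenderer:
    """A newer renderer slotted in later; the rest of the engine is untouched."""
    def draw(self, scene):
        return f"physically based render of {scene}"

class AudioSystem:
    def play(self, cue):
        return f"playing {cue}"

class Engine:
    """The 'framework': a bundle of swappable components."""
    def __init__(self, renderer, audio):
        self.renderer = renderer
        self.audio = audio

    def frame(self, scene):
        return self.renderer.draw(scene)

engine = Engine(ForwardRenderer(), AudioSystem())
engine.renderer = PBRRenderer()  # upgrade the graphics component only
```

This is exactly how an old engine family picks up a modern look: the rendering component gets rewritten while the surrounding framework carries over.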
But you can bet your ass that most game engines today are derived from technology of yesterday, because they scale.
HollyGamer said: 12 teraflops confirmed, LOL. In GCN numbers: a Navi teraflop is equal to 1.4 times GCN performance, 12 x 1.4 = 16.8 teraflops of GCN. From Xbox One: 16.8/1.3 = 12.9 times more powerful than Xbox One. |
God dammit.
A flop is a flop.
Navi doesn't have 1.4x more "theoretical flops" than Graphics Core Next. It just doesn't.
Flops are calculated as the number of Stream Processors * Instructions Per Clock * Clock Rate.
Navi gets 1.4x more performance not because of flops, but because of everything else in the GPU. If you were to throw a purely compute task at Graphics Core Next, it would achieve some impressive flop numbers, often higher than Navi. But when it comes to gaming, games need more than just flops, ergo Navi is able to pull ahead.
Thus a flop on Navi is the same as a flop on Graphics Core Next. Navi is just more efficient.
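To make that formula concrete, here it is in code, using approximate public boost clocks and the fact that each AMD stream processor does 2 FP32 operations per clock (a fused multiply-add counts as two):

```python
def theoretical_gflops(stream_processors, clock_ghz, ops_per_clock=2):
    """Peak FP32 throughput in GFLOPS: SPs x ops per clock (FMA = 2) x clock rate."""
    return stream_processors * ops_per_clock * clock_ghz

# RX 5500 XT (Navi):    1408 SPs at ~1.845 GHz boost -> ~5.2 TFLOPS
navi = theoretical_gflops(1408, 1.845)
# RX 590 (GCN/Polaris): 2304 SPs at ~1.545 GHz boost -> ~7.1 TFLOPS
gcn = theoretical_gflops(2304, 1.545)
# GCN has ~37% more theoretical flops here yet similar gaming performance,
# which is the point: a flop is a flop, and the rest of the GPU decides.
```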
Trumpstyle said: Dude, the 5500 has 5TF, where are you getting your numbers from? Navi is 50% faster than the GCN in Xbox One and PS4, and 20% faster than Polaris (590 = Polaris) |
My mistake. I was reading my database incorrectly.
My point still stands however.
Bofferbrauer2 said: You are aware that you are comparing half precision to single precision in that example, right? The RX 5500 XT has 4.7-5.2 Teraflops in single precision, significantly less than an RX 590, yet almost the same performance. The RX 5700 beats RX Vega despite having fewer TFlops. Which also shows that teraflops can only be compared within the same architecture (and even then, with limitations) and can't be used as a yardstick otherwise. I agree with the rest of what you're saying, just wanted to point out that inconsistency there. |
That just reinforces my point that more flops doesn't mean better performance. The RX 590 also has more theoretical bandwidth as well.
But you are right.
--::{PC Gaming Master Race}::--
Pemalite said:
CoD has made some great strides in visuals over the years, despite being based on a derivative of the Quake 2 engine. Generational leap, no?
God dammit. |
You brought up Bethesda as an example, which means you prove my point. Bethesda has never had a new engine; they have always been using the same engine from the 2001 era. Their engine is limited, so it performed badly on hardware that came out after 2001, and on new hardware many effects, graphics, gameplay, AI, NPCs etc. look and play very outdated.
Modders are the ones actively fixing Oblivion and Skyrim.
Yes, a flop is a flop, but how flops perform differs on every uarch; the effectiveness equation from one uarch to another is very different. The effectiveness of TFLOPS can be measured from one uarch to another. Navi is indeed 1.4 times GCN.
Pemalite said:
You have misconstrued my statements.
You can have identical flops with identical chips and still have half the gaming performance.
Well, they are irrelevant; it's a theoretical number, not a real-world one. The relevant "flop number" would be one based on actual, real-world capabilities that the chips can achieve.
With the Switch, we know exactly what its capabilities are because Nintendo is using off-the-shelf Tegra components; we know the clockspeed and how many functional units it has, thanks to homebrew efforts that cracked the console open.
|
I have 3 comments.
On the baseline: if you make a game with, let's say, PS4 as the baseline, tune it for the best performance there, and later develop for Switch, you are going to cut some stuff without affecting the PS4 version (probably making the Switch version look worse than if it had been the baseline, or giving it some performance issues). Now if you take Switch as the baseline, then considering how multiplats usually work, the PS4 version will only receive more resolution, slightly better textures, etc.; it will be held down (even in its design) by Switch.
On your comparison of GPUs, you used one with DDR4 and the other with GDDR5, which would already skew the comparison. We know that the core of your argument is that TFlops alone have almost no relevance (and after all your explanations, I think very few people here put much stock in TFlops alone), but what I said was ceteris paribus: if everything else on both GPUs is perfectly equal and just the flops differ (say, because one has a 20% higher clock rate), then the one with the 20% higher clock rate is the stronger GPU (granted, the rest of the system has to be built to use that advantage). Now if you mix in memory quantity, speed, bandwidth, the design of the APU itself and everything else, of course you will only get a real-life performance picture after they release. And even then you won't really have a very good measurement, because when the same game runs on two systems, the difference in performance may not be because one is worse than the other, but simply down to how proficient the devs are with that hardware.
We know the capabilities of Switch, sure, but since Nintendo hasn't given any specific numbers, I can't say we are 100% certain of a very precise figure. We have a range of what we suspect the docked and undocked performance to be; also, as you said yourself, there is a difference between theoretical and real world.
EricHiggin said:
No hard feelings, no forced assimilation, just worthwhile thoughts. While certain info isn't available to make a reliable conclusion, I don't focus as much on the past. It's certainly necessary and useful, but too much focus on what was, without enough consideration about what is, will make discerning the future less likely. I mean, who could have foreseen this? |
I really liked the design of Xbox Series X.
Considering the size PS4 and X1 had and their capabilities, and since I don't think PS5 will be smaller than the Series X, I would agree with the reply saying that even if the rumour of 9 vs 12 TFlops (40 vs 56 CUs) were true, the PS5's silicon budget would have been used on other stuff instead of just giving away over 33% in power (if everything else in the consoles gave the same 9 vs 12 advantage to Series X). That devkit was just too big to have so much less power.
Trumpstyle said:
We know very little about Sony's/Microsoft's ray-tracing solutions; the person who leaked Prospero first says PS5 and Xbox Series X use completely different ray-tracing solutions. I would assume Microsoft uses AMD's and Sony has their own. Yes, frame-rates for games could be all over the place because someone has better ray-tracing but weaker flop performance. Whoever has more TF will probably market their console as the WORLD'S MOST POWERFUL CONSOLE :) |
I think one of the reports with official information has MS using an RT solution they patented.
duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"
http://gamrconnect.vgchartz.com/post.php?id=8808363
Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"
http://gamrconnect.vgchartz.com/post.php?id=9008994
Azzanation: "PS5 wouldn't sold out at launch without scalpers."
DonFerrari said: .. if the rumor of 9 vs 12 Tflops is true (40 vs 56 CUs) were true than the silicon budget of PS5 would have been used in other stuff instead of just giving away over 33% in power (if everything else in the consoles would give the same 9 vs 12 advantage to Series X). Because that devkit was just too big to have so much less power. |
The current rumours are, I think, 40 CUs at 2GHz vs 56 CUs at 1.8GHz, so looking at CUs only, it would be a 25% difference (both are very optimistic clock rates imho).
Power comes at a price. Particularly at 7nm, every square mm adds $ to the chip cost. It's unlikely that Sony uses other stuff; it's just that Sony uses a smaller chip overall to save significant costs. Rumours have the PS5 at around 310-320mm^2, while the XSX is at 360-380mm^2. That is a significantly higher cost (probably around $30 per chip) for the latter chip. If the performance difference stays within 20%, people will barely notice the difference. Both will be constrained to a certain extent by memory. 16GByte is the rumoured size for both, but if XSX uses, say, 24GBytes (the 384bit bus rumour), that would mean a significant difference in performance. 16GByte is the very bottom you can get away with for something that has to live in the 4K world for even just a few years.
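Plugging the rumoured figures into the usual peak-FLOPS arithmetic (64 stream processors per CU, 2 FP32 ops per clock; every number below is a rumour from this thread, not a confirmed spec):

```python
def tflops(cus, clock_ghz, sp_per_cu=64, ops_per_clock=2):
    """Peak FP32 TFLOPS from CU count and clock rate."""
    return cus * sp_per_cu * ops_per_clock * clock_ghz / 1000

ps5_rumour = tflops(40, 2.0)        # 10.24 TFLOPS
xsx_rumour = tflops(56, 1.8)        # ~12.90 TFLOPS
gap = xsx_rumour / ps5_rumour - 1   # ~0.26, i.e. roughly the 25% quoted
```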
drkohler said:
The current rumours are I think 40cus at 2GHz vs 56cus at 1.8GHz, so looking at cus only, it would be a 25% difference (both are very optimistic clock rates imho). Power comes at a price. Particularly at 7nm, every square mm adds $ to the chip cost. It's unikely that Sony uses other stuff, it's just that Sony uses a smaller chip overall to save significant costs. Rumours have the PS5 at around 310-320mm^2, while the XSX is at 360-380mm^2. That is a significantly higher cost (probably around $30 per chip) for the latter chip. If the performance difference stays within 20%, people will barely notice the difference. Both will be constraint to a certain extent by memory. 16GByte is the rumoured size for both, but if XSX uses, say, 24GBytes (the 384bit bus rumour), that would mean a significant difference in performance. 16GByte is the very bottom you can get away with for something that has to live in the 4k world for even just a few years. |
Well, a $30 cost for 25% more performance would be an easy expense for me =p
But looking at the size of the XSX when the controller is near it (much smaller than the original Xbox and a little bigger than the X1S), I don't think PS5 will be much smaller (if so, the devkit is just much, much bigger than the retail version).
DonFerrari said: Well, a $30 cost for 25% performance would be an easy expense for me =p But looking at the size of XSX when the controller is near it (much smaller than the original Xbox and a little bigger than the X1S) I don't think PS5 will be much smaller (if so the devkit is just much, much bigger than the retail version). |
Well, if you intend to sell 100 million consoles, that's a pretty $3 billion in the end. If you think that is an easy expense for you, just send me a percent of that.
The volume of the XSX (is the power supply inside or external, btw?) is considerably bigger than the volume of the X1S. No one has seen Sony's idea of the PS5 end product. Given Sony's track record of building small(er) things with jet engines inside, I don't have much hope in that area.
drkohler said:
Well, if you intend to sell 100 million consoles, that's a pretty $3 billion in the end. If you think that is an easy expense for you, just send me a percent of that. The volume of the XSX (is the power supply inside or external, btw?) is considerably bigger than the volume of the X1S. No one has seen Sony's idea of the PS5 end product. Given Sony's track record of building small(er) things with jet engines inside, I don't have much hope in that area. |
I like that the PS fats have their power supply inside the console, so whenever I want to take the console somewhere to play for the weekend, I just need to pick it up from my cabinet and use a stock cord and HDMI, and it will do just fine.
So I'm all in for a PS5 jet engine in a cozy package =p
Still, I wouldn't expect a PS5 that is smaller than the original PS4. And sure, the devkit has no direct correlation to the final product's dimensions or even shape.
Pemalite said:
My mistake. I was reading my database incorrectly. |
Yeah, I also made a mistake: it turns out Proelite over at Beyond3D was a fake insider. It was he who said the devkit for PS5 was 40 CUs, but that looks false.
Oberon remains a mystery; maybe it just has to do with backwards compatibility and nothing else.
Oberon says 3 things: that the GPU is clocked at 2GHz and that it has 2 backwards-compatibility modes. One mode has 18 CUs active, which matches PS4, and another has 40 CUs, which doesn't match PS4 Pro, so you would assume this is a Boost mode with all CUs active; but the insider at Beyond3D is false.
6x master league achiever in starcraft2
Beaten Sigrun on God of war mode
Beaten DOOM ultra-nightmare with NO endless ammo-rune, 2x super shotgun and no decoys on ps4 pro.
1-0 against Grubby in Wc3 frozen throne ladder!!
Trumpstyle said:
Yeah, I also made a mistake: it turns out Proelite over at Beyond3D was a fake insider. It was he who said the devkit for PS5 was 40 CUs, but that looks false. Oberon remains a mystery; maybe it just has to do with backwards compatibility and nothing else. Oberon says 3 things: that the GPU is clocked at 2GHz and that it has 2 backwards-compatibility modes. One mode has 18 CUs active, which matches PS4, and another has 40 CUs, which doesn't match PS4 Pro, so you would assume this is a Boost mode with all CUs active; but the insider at Beyond3D is false. |
It makes almost zero sense that it would need 18 CUs to match PS4 for compatibility on a 40-CU part; that would mean the new console would have about PS4 Pro-level power (which is 2.25x stronger than the base PS4). With XSX being 4x more powerful than X1X (and its GPU seemingly about 2x as powerful), a PS5 at PS4 Pro level would be something so weak that it would need to sell at $199 to have a chance.
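The 2.25x figure lines up with the usual peak-FLOPS arithmetic using the known PS4 and PS4 Pro specs (18 CUs at 800MHz vs 36 CUs at 911MHz, 64 stream processors per CU, 2 FP32 ops per clock):

```python
def tflops(cus, clock_ghz, sp_per_cu=64, ops_per_clock=2):
    """Peak FP32 TFLOPS from CU count and clock rate."""
    return cus * sp_per_cu * ops_per_clock * clock_ghz / 1000

ps4 = tflops(18, 0.800)   # ~1.84 TFLOPS
pro = tflops(36, 0.911)   # ~4.20 TFLOPS
ratio = pro / ps4         # ~2.28x, the figure quoted above
```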