I don't know. I'm just someone who plays games. Not gonna pretend to be some kind of expert like a lot of people. I'll just keep buying whichever version of any game on the hardware that gives me the best experience.
walsufnir said:
|
That's the point of the way the whole APU has been designed: developers can easily queue up compute commands and have them waiting for when GPU hardware is free to process them. The GPU is never completely saturated dealing with graphical tasks; there's always tonnes of time when Stream Processors are sitting idle, so this is when AI and Physics are handled. Remember I said efficiency?
You can be sure that developers will use this to a great extent, because that's what they asked for when Sony did all of their research into what hardware to put in the PS4.
There's no need for a trade-off with this, because it's part of the system's design to use resources as efficiently as possible.
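The "queue compute for when the GPU is free" idea can be sketched as a toy scheduler. To be clear, everything below is made-up illustration, not a real GPU API — actual async compute on GCN hardware uses dedicated hardware queues (ACEs), not a Python loop:

```python
# Toy model of async compute: graphics work leaves idle gaps on the
# GPU each frame, and queued compute jobs (AI, physics) fill them.
# All numbers and names are invented for illustration only.

def fill_idle_gaps(graphics_busy_ms, frame_ms, compute_jobs_ms):
    """Greedily pack queued compute jobs into the idle time left over
    after graphics work in one frame. Returns (done, deferred)."""
    idle = frame_ms - graphics_busy_ms
    done, deferred = [], []
    for job in compute_jobs_ms:
        if job <= idle:
            idle -= job
            done.append(job)
        else:
            deferred.append(job)  # waits for a gap in a later frame
    return done, deferred

# A 16.7 ms frame where graphics only needs 12 ms leaves ~4.7 ms of
# idle stream-processor time for physics/AI kernels.
done, deferred = fill_idle_gaps(12.0, 16.7, [2.0, 1.5, 3.0])
```

This also shows walsufnir's objection below: the 3.0 ms job gets deferred, which is exactly the latency problem he raises.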
JustBeingReal said:
That's the point of the way the whole APU has been designed, developers can easily queue up compute commands, have them waiting for when GPU hardware is free to process it. The GPU is never completely saturated dealing with graphical tasks, there's always tonnes of times when Stream Processors are sitting idle, so this is when AI and Physics are handled. Remember I said efficiency? You should be sure that developers can use this to a great level of extent, because that's what they asked for when Sony did all of their research in what hardware to put in PS4. There's no need for a trade off with this because it's part of the system's design to use resources as efficiently as possible. |
Waiting for the GPU hardware to be free to process data? In a game where devs are fighting for milliseconds to get their stuff done (i.e. computed)? How would that work? You need your data fast and in a predictable way — what if you need the data now and can't wait another bunch of milliseconds?
I am not sure of anything until a dev talks freely about it and provides data on how and when the GPU was free enough to process data for a decent amount of time without limiting the render tasks, and what can be accomplished when doing so.
Btw, how would you know to what extent the PS4's GPU is currently utilized? There are clearly games that are not as stable as some wish, so why would devs not already be pushing the hardware to 100%? This makes no sense at all.
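For context on the milliseconds being fought over here: the per-frame budget is just 1000 ms divided by the target framerate (trivial arithmetic, not something from the thread):

```python
# Frame-time budgets developers fight over: at 60 fps, everything
# (graphics + AI + physics + audio) must fit in ~16.7 ms per frame;
# at 30 fps, ~33.3 ms. Missing the budget means a dropped frame.

def frame_budget_ms(fps):
    return 1000.0 / fps

print(round(frame_budget_ms(60), 1))  # 16.7
print(round(frame_budget_ms(30), 1))  # 33.3
```

So "waiting a few milliseconds" for a free gap can easily mean waiting into the next frame.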
Yep. Make a game that uses 10% of the systems' capacity, call them equal, and say devs who disagree are idiots. Seems good.
duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"
http://gamrconnect.vgchartz.com/post.php?id=8808363
Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"
http://gamrconnect.vgchartz.com/post.php?id=9008994
Azzanation: "PS5 wouldn't sold out at launch without scalpers."
WhiteEaglePL said: We knew about this for a while now... But I don't think there's a big difference between PS4 & XBO, or between XBO & Wii U. |
PS4 > XB1 >>> Wii U
It's in the specs. There is a decent difference between the PS4 and X1, and a bigger one to the Wii U. The Wii U suffers from a lack of memory bandwidth (it's lower than the PS3/360), a pretty weak CPU, less memory and a weaker GPU.
The lack of bandwidth results in cases like MK8: 720p with no AA. The weaker CPU will be an issue for NPC count, AI and physics. The CPU is just a tri-core version of the Wii's CPU, which was itself a beefed-up GameCube CPU. It can't reach the performance of the X360's Xenon, let alone the much more powerful and modern PS4/X1 AMD CPUs.
About the OP, it isn't misrepresenting anything. It's like saying that a GTX 770 isn't worse than a 980. It's in the specs, and the games just reflect that. The only game that didn't was AC Unity, but that was a disaster from the beginning and the publisher publicly stated that they would ruin the PS4 version. The CPU difference in the X1's favour is just an extra overclock, something Sony could match with a firmware update, and it's only about a 10% difference, compared with a 50% advantage in GPU.
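Those percentages roughly check out against the publicly reported specs (XB1 CPU at 1.75 GHz vs PS4 at 1.6 GHz; 18 vs 12 GPU compute units). A quick sanity check, not anything from the post itself:

```python
# Back-of-the-envelope check of the "10% CPU vs 50% GPU" claim,
# using publicly reported figures: XB1 CPU 1.75 GHz vs PS4 1.6 GHz,
# and PS4 18 GPU compute units vs XB1 12.

cpu_advantage = (1.75 / 1.6 - 1) * 100   # XB1's CPU clock edge
gpu_advantage = (18 / 12 - 1) * 100      # PS4's raw CU-count edge

print(f"{cpu_advantage:.1f}%")  # 9.4%
print(f"{gpu_advantage:.0f}%")  # 50%
```

Note the 50% is the raw compute-unit count; the PS4's lower GPU clock (800 MHz vs the XB1's 853 MHz) narrows the effective FLOPS gap to roughly 40%.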
torok said:
PS4 > XB1 >>> Wii U It's in the specs. There is a decent difference between the PS4 and X1, and a bigger one to the Wii U. The Wii U suffers from a lack of memory bandwidth (it's lower than the PS3/360), a pretty weak CPU, less memory and a weaker GPU. The lack of bandwidth results in cases like MK8: 720p with no AA. The weaker CPU will be an issue for NPC count, AI and physics. The CPU is just a tri-core version of the Wii's CPU, which was itself a beefed-up GameCube CPU. It can't reach the performance of the X360's Xenon, let alone the much more powerful and modern PS4/X1 AMD CPUs. About the OP, it isn't misrepresenting anything. It's like saying that a GTX 770 isn't worse than a 980. It's in the specs, and the games just reflect that. The only game that didn't was AC Unity, but that was a disaster from the beginning and the publisher publicly stated that they would ruin the PS4 version. The CPU difference in the X1's favour is just an extra overclock, something Sony could match with a firmware update, and it's only about a 10% difference, compared with a 50% advantage in GPU. |
This is not possible. The Wii U couldn't run the games it does with lower memory bandwidth than the PS3/360.
On the second point, I think people underestimate the importance of SDKs. Microsoft know their software and my understanding is that developers universally praise the Microsoft dev kits. Sony I'm not sure about but with the discrepancy in hardware specs, I would say they must be behind on the software side. I can't really see any other explanation. I can't see developers intentionally sabotaging their own work for parity's sake.
walsufnir said:
I am not sure of anything until a dev talks freely about it and provides data on how and when the GPU was free enough to process data for a decent amount of time without limiting the render tasks, and what can be accomplished when doing so. Btw, how would you know to what extent the PS4's GPU is currently utilized? There are clearly games that are not as stable as some wish, so why would devs not already be pushing the hardware to 100%? This makes no sense at all. |
It works because even in those instances where the GPU is under heavy load there are always Stream Processors free; the developers plan for this stuff during the development process.
It's part of the GPU hardware, not all of it in its entirety. What you have to remember is that it's never the whole of the GPU doing one task; the GPU is running multiple tasks simultaneously. Games are designed around all of their aspects, from graphics to compute. You're acting like it's a one-or-the-other situation and completely missing the point that graphics never take up all of the GPU's time. It's all a matter of proper resource management when the game's being developed.
You can be sure that this is a great thing; if you did some research about it, it would become self-evident.
This is simple: it's efficient use of resources, not wasting them when they could be put to use.
The reason I know this stuff is research and common sense. The reason games aren't as stable as they could be is a lack of optimization, both in the design of the games and in how optimized the development tools are at this point in time. The new SDK directly addresses the physics side of sharing compute load between the CPU and GPU.
MoHasanie said: No Goblin’s Roundabout is a game that never takes itself seriously and that is precisely the reason why it is turning out to be an extremely enjoyable title. For those who are living under a rock, Roundabout is a partial open world game where players would need to drive passengers in a rotating limousine while solving puzzles along the way.
GamingBolt recently chatted with No Goblin’s co-founder Dan Teasdale to learn more about how the game is shaping up. Roundabout will be released on the PlayStation 4 and Xbox One, and we found it interesting to hear Dan’s thoughts on the differences between the two consoles. Details such as the Xbox One’s lower GPU compute-unit count and memory issues are fairly common knowledge now, but according to Dan these are arbitrary numbers and not the ideal way to compare the consoles.
“Saying that the PS4 is “more powerful” than the Xbox One or vice versa as a blanket statement is really misrepresenting how games work,” Dan said to GamingBolt.
“If all you’re doing is rendering polygons, yes, the PS4 is a little faster at that. If all you’re doing is gameplay and simulation, the Xbox One is a little faster. Video games need both, so it’s not as simple as comparing core counts or memory speed to determine which platform is the “fastest” or “easiest”.”
He further states that both consoles are mostly similar, and that getting the game ready for certification is a bigger priority for developers than optimizing for any one platform.
“At the end of the day, it doesn’t matter. Both platforms are in the same relative ballpark, so the real work for releasing is getting your game certification-ready on each platform instead of crazy optimizations on one specific platform.”
Dan definitely has a point here. Instead of comparing the technical specs of the two consoles, players should enjoy the games that the developers are building for them. Roundabout is currently available on the Steam store and will hit the PS4 and Xbox One early next year. Stay tuned for our full interview with Dan Teasdale in the coming days.
|
Going "all focus on graphics" and therefore "well then let framerate suffer to get even better graphics" is also completely misinterpreting how games work.
So what do devs expect when they indoctrinate the consumers with nonsense?
Dear devs #DealWithIt
The best part is that Naughty Dog will develop for the most powerful one and will optimise like crazy.
dd if=/dev/brain | tail -f | grep games | nc -lnvvp 80
Hey Listen!
The way multiplatform games are developed nowadays, you won't see that much of an early difference in game performance. Games ship with long lists of known bugs and performance problems. The biggest problems are tackled first, so the console struggling the most gets the most attention.
For example, in GTA5, dips to 26fps aren't a priority on PS4. On Xbox One it probably dipped lower in the countryside areas, so grass was removed in places. Tight deadlines ensure parity more than any moneyhatting deals would.
Even if some nice optimization was made on one platform for the current game, it's unlikely it would be ported over to the other if that one is already doing 'good enough'. If it ain't broke, don't fix it; introducing new bugs under tight deadlines is always a big risk. That doesn't mean both consoles won't benefit from each other's optimizations later in life. For example, the new AA techniques developed on PS3 out of necessity made it to the 360 as well in later titles.
The gap will likely grow this gen as engines mature and get optimized for each platform. There is simply more to be gotten from the 6 extra compute units on ps4.
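For what those 6 extra compute units mean on paper: each GCN compute unit has 64 shader lanes doing 2 FLOPs per cycle (fused multiply-add), which is where the oft-quoted peak figures come from. A back-of-the-envelope sketch using the published clocks (800 MHz for PS4, 853 MHz for XB1):

```python
# Peak single-precision throughput of a GCN GPU:
#   CUs * 64 lanes * 2 FLOPs/cycle (FMA) * clock (GHz) -> GFLOPS
# Divide by 1000 for TFLOPS. Clocks are the publicly reported ones.

def peak_tflops(compute_units, clock_ghz):
    return compute_units * 64 * 2 * clock_ghz / 1000

ps4 = peak_tflops(18, 0.800)  # ~1.84 TFLOPS
xb1 = peak_tflops(12, 0.853)  # ~1.31 TFLOPS
```

These are theoretical peaks; real-world utilization is exactly what the async compute argument earlier in the thread is about.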