
Anonymous Assassin's Creed Unity Dev Says "PS4 Couldn't Handle 1080p"

generic-user-1 said:
mornelithe said:

Nope. The number of ACEs in the PS4's GPU is only matched by AMD's R9-290 series, which is less than a year old. And considering we're talking about devs who've been working on static hardware from 2007 for the past seven years, they didn't just wake up yesterday and learn how to use it properly.

it was totally different hardware... there just isn't that much new to learn

Yeah, it's totally different hardware that they had APIs written for... for that totally different hardware. APIs whose efficiency they improved over a seven-year period... for totally different hardware.

What, you think the PS4's API was miraculously birthed at the most efficient level possible, pre-launch, using an architecture that only one card at the time boasted? Seriously? That would basically run counter to every gen ever. You realize many game devs still don't properly use all the cores in multi-core CPUs (most just use 2, some use 4, few use all 8)? It's literally the same as when a new high-end GPU launches. Very few devs actually design games to utilize that power, because so few are in circulation for the first few years due to price. It's also why AMD/Nvidia release driver updates routinely: their drivers become more refined and utilize the hardware more effectively and efficiently.

For an industry of folks who've been working with 7-year-old hardware, this gen is no different from any other. It takes time for the software and the developer knowledge to catch up with what's available. This isn't rocket-science-level stuff; it's the 8th gen, we should be used to it by now.
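To make the multi-core point concrete, here's a minimal sketch (my own toy example with made-up entity counts and workload, not anything from a real engine) of the difference between updating everything on one thread and splitting the same work across every available core:

```cpp
// multicore_sketch.cpp -- hypothetical illustration; build: g++ -std=c++17 -pthread multicore_sketch.cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <thread>
#include <vector>

// Stand-in for per-entity work each frame (AI, animation, physics...).
void update_entity(float& state) {
    for (int i = 0; i < 1000; ++i)
        state = std::sin(state) + 1.0f;
}

int main() {
    std::vector<float> entities(100000, 0.5f);

    // Naive approach: everything on one core while the others idle.
    // for (auto& e : entities) update_entity(e);

    // Split the entity list into one slice per hardware thread.
    unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    std::size_t chunk = entities.size() / cores;
    std::vector<std::thread> workers;
    for (unsigned c = 0; c < cores; ++c) {
        std::size_t begin = c * chunk;
        std::size_t end = (c == cores - 1) ? entities.size() : begin + chunk;
        workers.emplace_back([&entities, begin, end] {
            for (std::size_t i = begin; i < end; ++i)
                update_entity(entities[i]);
        });
    }
    for (auto& w : workers) w.join();  // wait for the frame's work to finish

    std::printf("updated %zu entities on %u cores\n", entities.size(), cores);
    return 0;
}
```

Real engines use job/task schedulers rather than spawning raw threads every frame, but the idea is the same: work that isn't split like this leaves most of an 8-core Jaguar CPU sitting idle.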

 



Teeqoz said:
Dark_Feanor said:
Teeqoz said:
Sharpryno said:

The CPU is a bottleneck, that's how. Lighting physics takes more than just the GPU.


GPGPU?


The engine has to be tailored to work that way, and you'll only get marginally better results. Just look at all the 1080p vs 900p games we've had over the last year.

IT DOESN'T MAKE THAT MUCH OF A DIFFERENCE!


Ubisoft themselves recently published a test to show just how much you can actually get out of GPGPU, and it was a lot more than you can get out of the CPU.

 

If you check my posts in this thread, you'll see that it's not the fact that it's 900p that bothers me, it's that they (if this is in fact a Ubisoft dev) come up with this lame excuse. We know the PS4 could do this in 1080p if they optimised it for GPGPU. I'm not even saying they have to do that, but instead of this excuse, they should just say it's too much of a hassle and you only get marginal differences. That explanation would've been way better.
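And for anyone wondering what "optimising for GPGPU" even means in practice, here's a minimal OpenCL sketch. It's purely illustrative: the kernel, names and numbers are all made up by me, and a console title would use Sony's own compute APIs rather than OpenCL. The point is just moving a brute-force per-element lighting-style pass off the CPU and onto the GPU:

```cpp
// gpgpu_sketch.cpp -- hypothetical illustration; link with -lOpenCL
#include <CL/cl.h>
#include <cstdio>
#include <vector>

// Toy kernel: per-element "lighting" evaluation done on the GPU,
// freeing the CPU for gameplay/AI work.
static const char* kSource = R"(
__kernel void shade(__global const float* intensity,
                    __global float* out, float ambient) {
    size_t i = get_global_id(0);
    out[i] = ambient + intensity[i] * 0.8f;
}
)";

int main() {
    const size_t n = 1 << 20;
    std::vector<float> intensity(n, 0.5f), result(n);

    cl_platform_id platform; cl_device_id device;
    clGetPlatformIDs(1, &platform, nullptr);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr);

    cl_context ctx = clCreateContext(nullptr, 1, &device, nullptr, nullptr, nullptr);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, nullptr);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSource, nullptr, nullptr);
    clBuildProgram(prog, 1, &device, nullptr, nullptr, nullptr);
    cl_kernel kernel = clCreateKernel(prog, "shade", nullptr);

    cl_mem in = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               n * sizeof(float), intensity.data(), nullptr);
    cl_mem out = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, n * sizeof(float),
                                nullptr, nullptr);

    float ambient = 0.1f;
    clSetKernelArg(kernel, 0, sizeof(cl_mem), &in);
    clSetKernelArg(kernel, 1, sizeof(cl_mem), &out);
    clSetKernelArg(kernel, 2, sizeof(float), &ambient);

    // One work-item per element: the GPU chews through a million at once.
    clEnqueueNDRangeKernel(queue, kernel, 1, nullptr, &n, nullptr,
                           0, nullptr, nullptr);
    clEnqueueReadBuffer(queue, out, CL_TRUE, 0, n * sizeof(float),
                        result.data(), 0, nullptr, nullptr);

    std::printf("first shaded value: %f\n", result[0]);
    // (Error checking and clRelease* cleanup omitted for brevity.)
    return 0;
}
```

The PS4's extra ACEs exist precisely so that compute work like this can run alongside normal rendering instead of queueing behind it.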


Well, you said it yourself.

It was a test. The game ships in the next three weeks; it probably finished main coding in the last two months.

So, should Ubisoft delay the game until those techniques are finished? Maybe they could recall Watch Dogs and try again to reach 1080p. Let's also ask EA to do the same with Battlefield 4.

Because, you know, reaching 1080p is much more important than any other aspect of the game.

You are not a Ubisoft shareholder. No one here is in a position to "demand" how they voice their design decisions. Either you buy the game on a given platform or you don't.

This whole discussion is diminishing what gaming is about. We've reached a crazy standard where the number of pixels output by the GPU is what matters most.



Dark_Feanor said:

Well, you said it yourself.

It was a test. The game ships in the next three weeks; it probably finished main coding in the last two months.

So, should Ubisoft delay the game until those techniques are finished? Maybe they could recall Watch Dogs and try again to reach 1080p. Let's also ask EA to do the same with Battlefield 4.

Because, you know, reaching 1080p is much more important than any other aspect of the game.

If it produces a better product on the PS4 and XB1, absolutely. I know the PS4 has a much higher ceiling than the XB1, but this paltry 900p for the XB1 is a load of horseshit, and XB1 gamers should be just as pissed. Given how much less resource overhead the consoles have (compared to PCs), there is literally no excuse (other than poor APIs/engine efficiency) for the games to be that low. And I mean all games, not just AC Unity.

Maybe I'm in the minority, but I very strongly prefer companies taking their time over rushing a game to launch.
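Just to put numbers on the 900p vs 1080p gap:

1920 × 1080 = 2,073,600 pixels per frame
1600 × 900 = 1,440,000 pixels per frame
2,073,600 / 1,440,000 ≈ 1.44

So native 1080p means rendering roughly 44% more pixels every frame than 900p. At a fixed frame rate, that's the extra shading work a more efficient API/engine has to find room for.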



archer9234 said:
TheAdjustmentBureau said:

Gamers need to stfu. Gamers know absolute jack about what's involved. They think they know but they don't. Just enjoy the games on your platform.

OT: AC Unity looks gorgeous.

Actually, that is really not the whole point of this debate. We bought hardware that is supposed to be way better than the past generation's, and so far it's still failing to meet game makers' demands. I still think game makers need to stop throwing all this stuff at the consoles if nothing can run at gamers' expectations, which usually means res and FPS.

I'd like to see one of these games run at a real 1080/60, then see how much of a "downgrade" the other departments really get. I don't know if the game would look "bad" unless someone actually does this. It would be stupid if, say, AC could hit those levels just by cutting 20 AI people from the street, for example, or making the draw distance slightly lower. I seriously doubt any of those reductions would make it look like a launch Xbox/PS3 game. For all we know, the only reason the game can't run at 1080/60 is that they can't afford to pay the people to get it optimized. And all of this is pure BS.


Actually, I think it's stupid to assume more power just means visuals. It's good to see Ubisoft pushing AI etc., not just AC4 with a new story and better graphics.



Kane1389 said:
 

Also, you didn't actually address any of my points in the post; most of it is just the typical "PC is superior" rambling you're known for here. That's fine and all, but it doesn't disprove my statements about the next-gen consoles being built on developer feedback and the graphics standard being set by the consoles rather than by the most powerful GPUs on the market.


Don't waste your time. He has to post about PCs in every console thread to justify the money he spent. That's how it works. He's probably one of the most off-topic guys on this forum.

As for his opinion: PC gaming is a niche thing. It simply demands more technical knowledge than 99% of people have. Even looking at mobile devices, people are replacing PCs with other devices everywhere they can, just to avoid the hassle. Gaming on PC basically means knowing what a GPU, CPU, HDD and RAM are (most people don't have a clue), knowing which configuration is better, dealing with driver updates and games that sometimes won't run unless you edit a config file or apply a fix, and knowing what AA, anisotropic filtering and SSAO are so you can adjust the settings and make the game run.

I know how to do all that. I already work in software development and I'm completely sick of dealing with computing problems. I just want to get home, put a game disc in my console and play. How do I know it will work? The game box has PS4 written on it. Patches? The console handles them for me. I don't want the hassle, but some guys don't understand that not everyone wants to spend an hour making a game run.




Why would anyone trust this company?

And I don't understand how people enjoy AC games. I think they are shit, personally.



Dark_Feanor said:
Teeqoz said:
Dark_Feanor said:


The engine has to be tailored to work that way, and you'll only get marginally better results. Just look at all the 1080p vs 900p games we've had over the last year.

IT DOESN'T MAKE THAT MUCH OF A DIFFERENCE!


Ubisoft themselves recently published a test to show just how much you can actually get out of GPGPU, and it was a lot more than you can get out of the CPU.

 

If you check my posts in this thread, you'll see that it's not the fact that it's 900p that bothers me, it's that they (if this is in fact a Ubisoft dev) come up with this lame excuse. We know the PS4 could do this in 1080p if they optimised it for GPGPU. I'm not even saying they have to do that, but instead of this excuse, they should just say it's too much of a hassle and you only get marginal differences. That explanation would've been way better.


Well, you said it yourself.

It was a test. The game ships in the next three weeks; it probably finished main coding in the last two months.

So, should Ubisoft delay the game until those techniques are finished? Maybe they could recall Watch Dogs and try again to reach 1080p. Let's also ask EA to do the same with Battlefield 4.

Because, you know, reaching 1080p is much more important than any other aspect of the game.

You are not a Ubisoft shareholder. No one here is in a position to "demand" how they voice their design decisions. Either you buy the game on a given platform or you don't.

This whole discussion is diminishing what gaming is about. We've reached a crazy standard where the number of pixels output by the GPU is what matters most.


Did you read the second paragraph of the post you replied to? I'll bold it for you, to give you another chance:

Teeqoz said:

If you check my posts in this thread, you'll see that it's not the fact that it's 900p that bothers me, it's that they (if this is in fact a Ubisoft dev) come up with this lame excuse. We know the PS4 could do this in 1080p if they optimised it for GPGPU. I'm not even saying they have to do that, but instead of this excuse, they should just say it's too much of a hassle and you only get marginal differences. That explanation would've been way better.



Kane1389 said:

You didn't strike a nerve with me at all; where did I display any sign of annoyance in my post? Maybe it's your nerve that's been struck, given that you're bringing up such a defensive statement. Given that your console maker of choice is Nintendo, who makes not just underpowered consoles but hysterically underpowered consoles, it wouldn't surprise me that you bring up PCs in every thread related to the Xbone/PS4 to compensate for something. This is even more apparent since the majority of your post is about how PCs are better than next-gen consoles and not about the original points you or I made. So much for that.

People don't buy Titan GPUs not because they don't need them, but because they can't afford them, which is painfully obvious, because most people don't have $1000+ to spend on a graphics card alone. And you didn't need a high-end PC to play the majority of multiplats, certainly not at the same graphical fidelity as on PS360. However, this does bring up another major advantage consoles have over PC, and that is future-proofing (for lack of a better word): you buy a console and you're guaranteed to be able to play all multiplat games for the entire generation, whereas on PC you'd have to keep spending hundreds of dollars upgrading your GPU, CPU and RAM.

Also, you didn't actually address any of my points in the post; most of it is just the typical "PC is superior" rambling you're known for here. That's fine and all, but it doesn't disprove my statements about the next-gen consoles being built on developer feedback and the graphics standard being set by the consoles rather than by the most powerful GPUs on the market.

Just ignore him. Very rarely is he actually on topic and at that point, it's totally meaningless to reply back.

OT: Ubisoft just keeps shooting itself in the foot. They really should've just kept quiet. And after seeing how much more you can get out of the PS4's GPGPU compute, it's apparent (if it wasn't already obvious enough) that the PS4 is being held back.



TheAdjustmentBureau said:
archer9234 said:
TheAdjustmentBureau said:

Gamers need to stfu. Gamers know absolute jack about what's involved. They think they know but they don't. Just enjoy the games on your platform.

OT: AC Unity looks gorgeous.

Actually, that is really not the whole point of this debate. We bought hardware that is supposed to be way better than the past generation's, and so far it's still failing to meet game makers' demands. I still think game makers need to stop throwing all this stuff at the consoles if nothing can run at gamers' expectations, which usually means res and FPS.

I'd like to see one of these games run at a real 1080/60, then see how much of a "downgrade" the other departments really get. I don't know if the game would look "bad" unless someone actually does this. It would be stupid if, say, AC could hit those levels just by cutting 20 AI people from the street, for example, or making the draw distance slightly lower. I seriously doubt any of those reductions would make it look like a launch Xbox/PS3 game. For all we know, the only reason the game can't run at 1080/60 is that they can't afford to pay the people to get it optimized. And all of this is pure BS.


Actually, I think it's stupid to assume more power just means visuals. It's good to see Ubisoft pushing AI etc., not just AC4 with a new story and better graphics.

Then the companies shouldn't be advertising this aspect if the consoles actually can't do it. And I don't care about the visuals; res and FPS aren't really the visuals, the other areas are.



mornelithe said:
generic-user-1 said:
mornelithe said:

Nope. The number of ACEs in the PS4's GPU is only matched by AMD's R9-290 series, which is less than a year old. And considering we're talking about devs who've been working on static hardware from 2007 for the past seven years, they didn't just wake up yesterday and learn how to use it properly.

it was totally different hardware... there just isn't that much new to learn

Yeah, it's totally different hardware that they had APIs written for... for that totally different hardware. APIs whose efficiency they improved over a seven-year period... for totally different hardware.

What, you think the PS4's API was miraculously birthed at the most efficient level possible, pre-launch, using an architecture that only one card at the time boasted? Seriously? That would basically run counter to every gen ever. You realize many game devs still don't properly use all the cores in multi-core CPUs (most just use 2, some use 4, few use all 8)? It's literally the same as when a new high-end GPU launches. Very few devs actually design games to utilize that power, because so few are in circulation for the first few years due to price. It's also why AMD/Nvidia release driver updates routinely: their drivers become more refined and utilize the hardware more effectively and efficiently.

For an industry of folks who've been working with 7-year-old hardware, this gen is no different from any other. It takes time for the software and the developer knowledge to catch up with what's available. This isn't rocket-science-level stuff; it's the 8th gen, we should be used to it by now.

 

this gen IS different, because it's an x86 system with a PC-like GPU and CPU. the only big difference is the unified RAM, but that's not so new either. sure, we'll see some improvements, but far less than in the last gen. the Cell was such a strange piece of hardware; the new GPU isn't.