
Forums - Gaming - I don't want to be a Playstation fanboy anymore :(

 

The VGC forum is...

FUN!: 68 (27.76%)
A cesspool of fanboys: 152 (62.04%)
The only option after the...: 25 (10.20%)

Total: 245

>Unstoppable love machine

Bravo, bravo. You, sir, have made my day.
Spread your PK Love throughout the world ~☆

Now, if you'll excuse me, I'll be busy being what you are not: not a fanboy. I'll post my thoughts later.



 
I WON A BET AGAINST AZUREN! WOOOOOOOOOOAAAAAAAAHHHHHHHHHHHHHHHHHHH

:3

vivster said:

I am not denying that Ubi did a very stupid PR move. They do a lot of stupid things.

All I'm suggesting is that if there is a CPU bottleneck, and if what all the sources indicate is true, namely that the X1 CPU is stronger than the PS4 CPU, then the PS4 version is actually holding the game back, and the game could run slightly better on the X1. Which means the X1 fans should be outraged, and the PS fans should thank Ubi for not making them look even worse.

But in the end it's up to your own taste who and for what to blame.

You can hate Ubi for not properly programming the game or for straining the CPU with too many AIs in the first place.

Or you could hate the console manufacturers for going the cheap route with an APU.

Or you could hate yourself for not having enough money to build a powerful home PC.

But currently the hate is absolutely misdirected.

That's a really, really big if right there. But what you are ignoring is how little a CPU bottleneck tied to AI will affect the resolution of a game. Actually, if the guy says the GPUs are really powerful and they are limited on the CPU side of things, then the natural thing to do is move more of their code to the GPU. Actually, just read this. What the guy said is a lot of bullshit, plain and simple. If you know anything about tech you will instantly see that it doesn't make sense at all. And as Binaray said, and like I have said, it's not even that they did it that caused the entire problem, it's that they opened their stupid mouths to actually say they did it so people would be happy. It's really not that complicated.
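To make the point concrete, here's a toy model with made-up timings. It assumes GPU cost scales linearly with pixels drawn while the CPU cost (AI, simulation) doesn't depend on resolution at all; both assumptions are simplifications, not measured numbers from ACU.

```python
# Toy model with made-up timings: GPU cost scales with pixels drawn,
# CPU cost (AI, simulation) does not depend on resolution at all.

def fps(cpu_ms, gpu_ms_at_1080p, width, height):
    gpu_ms = gpu_ms_at_1080p * (width * height) / (1920 * 1080)
    return 1000.0 / max(cpu_ms, gpu_ms)  # the slower chip sets the frame time

# CPU-bound (heavy AI): dropping from 1080p to 900p gains nothing.
print(round(fps(33.0, 16.0, 1920, 1080)))  # 30
print(round(fps(33.0, 16.0, 1600, 900)))   # 30
# GPU-bound: the same resolution drop is a big win.
print(round(fps(10.0, 16.0, 1920, 1080)))  # 62
print(round(fps(10.0, 16.0, 1600, 900)))   # 90
```

Under this model, an AI-driven CPU bottleneck caps the framerate, but it gives no reason to lower the resolution, which is exactly why the 900p explanation smells off.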

Let's not make this about how much weaker consoles are... because if that is the basis of your argument, then it really means you have gobbled up the nonsense that stupid guy said. I will remind you of this when the next AC game from Ubisoft looks much better than this one and runs at 1080p. Or when the next GTA game is released and completely wipes the floor with ACU.

And the hate is misdirected? I won't lie... you are making it really hard to take what you are saying seriously, especially considering that I thought you were one of the most objective posters here.

  • You really think both versions of the game can't run at higher than 900p because of hardware limitations? Because ACU is the best we are going to see from these consoles in the 8th generation?
  • You really think it's wrong to be mad at Ubisoft for saying they made both versions the same to avoid debates? Is that acceptable on any kind of level?


That's a pretty bold step you took, but one for the best. Congratulations on your neutrality. That's the best way to go.



Just become a Wii or Xbox fanboy :)



vivster said:
binary solo said:

Just tell me straight, 'cause I haven't been bothered trawling through all the posts looking for the technological explanation. How does the AI/CPU thing gel with what I understand to be a GPU thing?

OT, it's always better to be a fan rather than a fanboy. It's a sign of immaturity that someone revels in wearing the fanboy label. It's a sign of maturity that someone renounces that characterisation and determines to be more levelheaded about their gaming preferences.

The short version is that the GPU needs the CPU. If the GPU has to wait for the CPU because the CPU is overtaxed, it will not be able to run at full capacity. If there is a CPU bottleneck, it doesn't matter whether your computer is running a weak APU or triple-SLI Titans: you can't produce faster if there is no raw material to work with, and that raw material is what the CPU provides.
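That waiting game can be sketched in a few lines. This is a minimal model (the timings are invented, and real engines pipeline CPU and GPU work rather than running them in strict lockstep), but it shows why a faster GPU can't buy anything past a CPU bottleneck.

```python
# Minimal sketch (hypothetical numbers): per-frame time is bounded by the
# slower of CPU and GPU work, so a faster GPU can't help past a CPU bottleneck.

def frame_rate(cpu_ms, gpu_ms):
    """Frames per second when CPU and GPU work in lockstep per frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# CPU takes 33 ms per frame (heavy AI), GPU only needs 16 ms.
print(frame_rate(33.0, 16.0))   # ~30 fps: CPU-bound
# Doubling GPU speed changes nothing while the CPU is the bottleneck.
print(frame_rate(33.0, 8.0))    # still ~30 fps
# Halving the CPU load finally lets the GPU set the pace.
print(frame_rate(16.0, 16.0))   # 62.5 fps: now GPU-bound
```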

The juicy thing about this whole mess is that the X1 CPU is supposedly stronger than the PS4 CPU, which could actually mean that the PS4 is currently the limiting factor. Now think about what would've happened if we'd seen the X1 performing better than the PS4 on a multiplat game. If you think about it that way, Ubi's comments actually make a lot of sense, since that debate would've definitely been a huge thing, and one best avoided.

If Ubi spent the time to either optimize or reduce the CPU load, the PS4 would no doubt perform better, because the bottleneck would switch to the GPU, where it belongs.

Ok I can understand that from a fps perspective. The CPU can only output so much timewise for the GPU to do its thing, so CPU demand is so great on ACU that it can only send content to the GPU for it to output 30 times per second. And it seems no one really complains much about fps parity. But why does CPU load and output rate affect resolution? What's the CPU's role in resolution, especially in squeezing that last bit of juice out to achieve 1080p?

Also, interesting DF article on the latest Aliens game: it seems that in order to hit 1080p for that game, both the PS4 and Xbox One suffer performance-wise (framerate dips, screen tear, brief pauses), and no doubt both versions would have benefitted from a 900p treatment. There is a good case to be made, IMO, especially with multiplat games, that 1080p is actually a distraction that forces compromises and reduces game quality because of lowered performance in other, more important areas of graphics. Maybe 1080p should be left to the exclusives, because those folks have the time to devote all their optimisation resources to one platform.



“The fundamental cause of the trouble is that in the modern world the stupid are cocksure while the intelligent are full of doubt.” - Bertrand Russell

"When the power of love overcomes the love of power, the world will know peace."

Jimi Hendrix

 


...

I applaud your conviction, it takes a lot of will power to stop being a fanboy. I am still trying to fight my inner fanboy... but your speech gave me hope for the future...

Thanks Vivster... *sheds a tear*



"I've Underestimated the Horse Power from Mario Kart 8, I'll Never Doubt the WiiU's Engine Again"

He didn't wanna be a "Playa" (slang for Playstation Fanboy, I'm sure) no more, either.  Wanna know how he's doing now?

 

 

 

 

 

 

 

He's dead.



vivster said:
binary solo said:
vivster said:

The ACU meltdown was the last straw. In itself it was the usual fanboy meltdown and nothing worth getting worked up about. But since my stronger involvement in everything IT, I've become allergic to anything that spreads lies, misinformation or plain ignorance about technology. This shouldn't affect me much, though, since I'm surrounded by technologically ignorant people all the time.

But what really got me this time was that the kind of people who willfully ignored, or weren't able to comprehend, simple technological concepts were the same people who can't stop talking about pixels, framerates and other graphical features. I was almost shaking with embarrassment, anger and my crumbling world view.

Just tell me straight, 'cause I haven't been bothered trawling through all the posts looking for the technological explanation. How does the AI/CPU thing gel with what I understand to be a GPU thing?

OT, it's always better to be a fan rather than a fanboy. It's a sign of immaturity that someone revels in wearing the fanboy label. It's a sign of maturity that someone renounces that characterisation and determines to be more levelheaded about their gaming preferences.

The short version is that the GPU needs the CPU. If the GPU has to wait for the CPU because the CPU is overtaxed, it will not be able to run at full capacity. If there is a CPU bottleneck, it doesn't matter whether your computer is running a weak APU or triple-SLI Titans: you can't produce faster if there is no raw material to work with, and that raw material is what the CPU provides.

The juicy thing about this whole mess is that the X1 CPU is supposedly stronger than the PS4 CPU, which could actually mean that the PS4 is currently the limiting factor. Now think about what would've happened if we'd seen the X1 performing better than the PS4 on a multiplat game. If you think about it that way, Ubi's comments actually make a lot of sense, since that debate would've definitely been a huge thing, and one best avoided.

If Ubi spent the time to either optimize or reduce the CPU load, the PS4 would no doubt perform better, because the bottleneck would switch to the GPU, where it belongs.

No, it is not. It is the same CPU, just overclocked due to the extra overhead of the OS. It even performs worse at the higher clock, if the benchmarks floating around are true; I didn't look into it. The PS4 CPU can be overclocked anytime if there is a need for it. The X1 CPU was overclocked at the later stages of the development cycle to keep up with the OS.

Also framerate is a function of what you are describing, not resolution. It is nice that you claim to know how things work, but just because you are saying things, it doesn't mean they are true. 

Stick with neutrality though.



michael_stutzer said:
vivster said:

The short version is that the GPU needs the CPU. If the GPU has to wait for the CPU because the CPU is overtaxed, it will not be able to run at full capacity. If there is a CPU bottleneck, it doesn't matter whether your computer is running a weak APU or triple-SLI Titans: you can't produce faster if there is no raw material to work with, and that raw material is what the CPU provides.

The juicy thing about this whole mess is that the X1 CPU is supposedly stronger than the PS4 CPU, which could actually mean that the PS4 is currently the limiting factor. Now think about what would've happened if we'd seen the X1 performing better than the PS4 on a multiplat game. If you think about it that way, Ubi's comments actually make a lot of sense, since that debate would've definitely been a huge thing, and one best avoided.

If Ubi spent the time to either optimize or reduce the CPU load, the PS4 would no doubt perform better, because the bottleneck would switch to the GPU, where it belongs.

No, it is not. It is the same CPU, just overclocked due to the extra overhead of the OS. It even performs worse at the higher clock, if the benchmarks floating around are true; I didn't look into it. The PS4 CPU can be overclocked anytime if there is a need for it. The X1 CPU was overclocked at the later stages of the development cycle to keep up with the OS.

Also framerate is a function of what you are describing, not resolution. It is nice that you claim to know how things work, but just because you are saying things, it doesn't mean they are true. 

Stick with neutrality though.


Why do you think it was overclocked because of the OS? Any source for this?



The ACU debacle, at least in my eyes, has nothing to do with which version got held up by which, or even whether any version got held up at all. The problem is Ubi admitted they didn't try to optimize either version, to "avoid debate and stuff". Maybe they could've improved the XOne version, maybe they could've improved the PS4 one, maybe they could've improved both, but nah, they just stuck with their build that's not optimized at all but runs on both consoles.

I'll admit I'm not a tech specialist, but I find the CPU excuse a bit dumb. AC doesn't really have some super advanced crazy AI, and the game will run on 8-year-old processors on PC, so I'm not really sure what they're on about. Also, the CPU is supposed to matter for framerates, not resolution, unless I'm mistaken (which I may be).