
Full PlayStation 4 specifications

ethomaz said:

Slimebeast said:

Yes, but as 99% of console games are GPU bound (meaning the GPU will be the bottleneck while the other specs will not significantly influence performance), the difference in GPU power is essentially the difference in console power.

I have another theory... the GPU in the Wii U is underutilized because of the bottlenecks in its slow CPU and RAM... the fast RAM of the PS4 will show the real power of its GPU... and even if the CPU can be a bottleneck for the PS4, it is still way better than the Wii U CPU.

The gap will be bigger than Wii to PS3... even the Epic developer said the same thing.

Oh yeah, I forgot about that, you are right about the GPU in the Wii U being underutilized, and therefore the PS4 is actually more than 5.5x the Wii U.
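For what it's worth, here is a rough sketch of where a figure in that range comes from, using the usual peak-FLOPS rule of thumb (shaders x 2 ops per clock x clock). The PS4 numbers below are from Sony's released spec sheet; the Wii U GPU was never officially documented, so its shader count and clock are only the commonly repeated teardown estimates, and the real ratio could easily differ.

```python
# Peak single-precision throughput: each ALU can do 2 FLOPs (one FMA) per cycle.
def peak_tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000.0

ps4 = peak_tflops(1152, 0.800)   # 18 GCN CUs x 64 ALUs, per Sony's spec sheet -> ~1.84
wiiu = peak_tflops(320, 0.550)   # unconfirmed teardown estimate -> ~0.35
print(f"PS4 ~{ps4:.2f} TFLOPS, Wii U ~{wiiu:.2f} TFLOPS, ratio ~{ps4 / wiiu:.1f}x")
```

With a lower Wii U estimate the ratio lands nearer the 5.5x quoted above; with the figures used here it is closer to 5x. Either way this is a theoretical ceiling, not a measured gap.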



ethomaz said:
HappySqurriel said:
As much as (some) people are drooling over these specs, I think it could be a huge mistake for Sony...

Nintendo loses money selling the Wii U for $349, and I think Sony will have difficulty selling the PS4 for less than $500 without taking a massive loss; and I doubt they can afford a loss, or that many people are willing to pay $500 (or more) for a system.

Nobody is buying the Wii U... gamers need more POWARRRRRRR.

I see a lot more success for the PS4 than the PS3... Sony is not making any mistakes this time... the devs are in love with Sony.

I remember people saying the same thing about the 3DS; I said that the 3DS struggling to sell at $250 was a bad sign for the PS Vita (and any handheld above $200)...

Honestly, the Wii U struggling at $350 is a very bad sign for any console selling for (significantly) more than $350.



Shinobi-san said:

Not sure what your reply has got to do with my initial post?

Everything... the HD 6000 series is VLIW5 and the HD 7000 series (like the PS4 GPU) is GCN... there is no way to compare CUs directly... so I did the best comparison I could for you.
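A quick illustration of why unit counts do not carry across these architectures: the "blocks" being counted contain different numbers of ALUs and are fed very differently. The per-block figures below are from AMD's public architecture descriptions.

```python
# ALUs per compute block in AMD's recent GPU architectures (public figures):
blocks = {
    "VLIW5 SIMD engine (HD 5000/6000)": 16 * 5,      # 16 lanes x 5-wide VLIW = 80 ALUs
    "VLIW4 SIMD engine (HD 6900 'Cayman')": 16 * 4,  # 64 ALUs
    "GCN compute unit (HD 7000 / PS4)": 4 * 16,      # 4 SIMDs x 16 lanes = 64 ALUs
}
for name, alus in blocks.items():
    print(f"{name}: {alus} ALUs per block")

# Even where the ALU count per block is similar, the VLIW designs only reach
# peak when the compiler can pack 4-5 independent ops per instruction, while
# GCN issues from independent wavefronts - so raw CU/SIMD counts say little
# about delivered performance across generations.
```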



Shinobi-san said:
Slimebeast said:
Shinobi-san said:
People are even dumb enough to do comparisons with an HD 6950... sigh. Please tell me: how many GCN CUs does a 6950 have?

Oh that's right... IT'S A COMPLETELY DIFFERENT ARCHITECTURE.

Wait, wasn't the Radeon 6000 series the first gen to have GCN cores?


I'm pretty sure it was the 7000 series :/

AMD did the GCN presentation in late 2011... didn't we already have the 6000 series in early 2011?

You are right... HD 6000 is VLIW5... HD 7000 is the first GCN generation.



Shinobi-san said:
Slimebeast said:
Shinobi-san said:
People are even dumb enough to do comparisons with an HD 6950... sigh. Please tell me: how many GCN CUs does a 6950 have?

Oh that's right... IT'S A COMPLETELY DIFFERENT ARCHITECTURE.

Wait, wasn't the Radeon 6000 series the first gen to have GCN cores?


I'm pretty sure it was the 7000 series :/

AMD did the GCN presentation in late 2011... didn't we already have the 6000 series in early 2011?

Not sure. Perhaps they're GCN1 and GCN2, as ethomaz put it?

From my memory, the Radeon 5000 series had early compute capabilities, but it was in the 6000 series that it was extended and pushed by AMD and named GCN, and then the 7000 series added even more compute power.



ethomaz said:

Slimebeast said:

Yes, but as 99% of console games are GPU bound (meaning the GPU will be the bottleneck while the other specs will not significantly influence performance), the difference in GPU power is essentially the difference in console power.

I have another theory... the GPU in the Wii U is underutilized because of the bottlenecks in its slow CPU and RAM... the fast RAM of the PS4 will show the real power of its GPU... and even if the CPU can be a bottleneck for the PS4, it is still way better than the Wii U CPU.

The gap will be bigger than Wii to PS3... even the Epic developer said the same thing.

In terms of numbers, probably, but in the eyes of the average consumer, not likely.



Slimebeast said:

Not sure. Perhaps they're GCN1 and GCN2, as ethomaz put it?

From my memory, the Radeon 5000 series had early compute capabilities, but it was in the 6000 series that it was extended and pushed by AMD and named GCN, and then the 7000 series added even more compute power.

HD 5000 = VLIW5
HD 6000 = VLIW5 (the HD 6900 "Cayman" parts are VLIW4)
HD 7000 = new GCN architecture
HD 8000 = GCN 2nd gen - to be released Q3 2013



Beast of a console. Blows away the (rumored) NextBox specs.



goddog said:
Shinobi-san said:
goddog said:
It's an odd mishmash of high end on the RAM, upper mid-range on the GPU, and lower mid-range on the CPU...

Using the GDDR5 will help hide issues in the CPU, but I still feel that towards the end of its life the "eight core" (4 module) will actually limit what the GPU can do. It will be an interesting switch from this gen, where the RAM and then the GPUs were the choke points. And due to the design of the cores, if a game is not programmed to take advantage of how the 4 modules operate, they could actually slow performance below that of a quad core, as has been shown on Ars in the past.

It will be interesting to see where they take this; at least programming support will be easier... though if they had used Intel it could have been much easier...

Stupid fake 8 cores... but then it's all about arguing what makes a core...


So I guess in summary: the RAM is a neat choice, the GPU is not as high end as I hoped but not a POS, and since it's custom it may have a trick up its sleeve,
and the CPU leaves a lot to be desired (are we 100% sure on its clock speed yet?).
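As a rough back-of-the-envelope for the bandwidth side of that argument: peak bandwidth is just bus width (in bytes) times the effective transfer rate. The PS4 figure matches Sony's stated 176 GB/s; the Wii U line is only the commonly cited teardown estimate, not an official number.

```python
def bandwidth_gb_s(bus_bits, gt_per_s):
    """Peak memory bandwidth: bus width in bytes x effective transfer rate."""
    return (bus_bits / 8) * gt_per_s

print(f"PS4 GDDR5 (256-bit @ 5.5 GT/s): {bandwidth_gb_s(256, 5.5):.1f} GB/s")          # 176.0
print(f"Wii U DDR3 (64-bit @ 1.6 GT/s, estimate): {bandwidth_gb_s(64, 1.6):.1f} GB/s")  # 12.8
```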


WOW.

This kind of post is the perfect example of spreading wrong information while presenting it as if it were true.

 

What's wrong there? The 8 AMD cores are 4 modules with some shared resources; I would suggest reading Ars Technica or Tom's Hardware for a good write-up on the family.

As far as I have seen, no clock speed has been released officially; that's why I asked.

I consider the area around the 7850-7870 (Pitcairn) to be upper mid-range among current GPUs.

The family of the GPU is only given as GCN and it runs at 1.8 TFLOPS; per Wikipedia the 7850 runs at 1.791 TFLOPS and the 7870 GHz runs at 2.5 TFLOPS.

And I expressed that I was pleasantly surprised at the RAM.
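Running those figures through the same shaders x 2 x clock estimate (desktop card numbers are AMD's public specs; the PS4 clock was not officially confirmed at this point, so 800 MHz is the widely reported value that lines up with Sony's stated 1.84 TFLOPS):

```python
cards = {
    "HD 7850": (1024, 0.860),            # shaders, clock in GHz (AMD spec)
    "HD 7870 GHz Edition": (1280, 1.000),
    "PS4 GPU (18 CUs)": (1152, 0.800),   # clock assumed, not officially confirmed
}
for name, (shaders, ghz) in cards.items():
    print(f"{name}: {shaders * 2 * ghz / 1000:.2f} TFLOPS")
# -> ~1.76, ~2.56, ~1.84: the PS4 slots in between the 7850 and the 7870.
```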

 

Where do you get 4 modules from?

Last I checked, Jaguar cores are based on the Bobcat cores. Bobcat cores have a different architecture compared to Bulldozer cores. 8 Jaguar cores means exactly that... 8 Jaguar cores. This is not 8 Bulldozer cores / 4 modules (which would have been more powerful anyway), where resources are shared between the two cores in each module.

Will address the rest of your post later.
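For anyone following along, a rough sketch of the topology difference being argued here, based on AMD's public descriptions of the two designs (the Jaguar details are the published ones, not PS4-specific confirmations):

```python
# Bulldozer: two integer cores per module share a front end, an FPU and the L2.
bulldozer_module = {
    "integer_cores": 2,
    "shared_by_both_cores": ["fetch/decode front end", "FPU", "L2 cache"],
}

# Jaguar: cores sit in 4-core clusters; each core has its own front end and
# FPU, and only the L2 cache is shared across the cluster.
jaguar_cluster = {
    "cores": 4,
    "per_core": ["own fetch/decode", "own FPU"],
    "shared_across_cluster": ["L2 cache"],
}

# So "8 Jaguar cores" means two 4-core clusters of full cores, not four
# Bulldozer-style modules where each pair of cores contends for one FPU.
print(f"PS4 CPU: 2 clusters x {jaguar_cluster['cores']} cores = 8 Jaguar cores")
```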



Intel Core i7 3770K [3.5GHz]|MSI Big Bang Z77 Mpower|Corsair Vengeance DDR3-1866 2 x 4GB|MSI GeForce GTX 560 ti Twin Frozr 2|OCZ Vertex 4 128GB|Corsair HX750|Cooler Master CM 690II Advanced|

Honestly, the Wii U struggling at $350 is a very bad sign for any console selling for (significantly) more than $350


Or it could be that people are just not interested, like they were not interested in the GC or N64, but especially like the GC; I mean, it didn't even sell at $149 six months after launch.