
Forums - Sony - Nvidia bitterness continues, compares PS4 specs to a 'low-end CPU'

BTW, for quick reference, nVidia vs AMD's cards:





Dgc1808 said:


I know, I just found that funny. Half of what a 680 can accomplish in a console would be pretty amazing considering the price tag of a 680. That, combined with the lower overhead compared to what PC games have to deal with, would make for a pretty compelling $400 (hopefully) console this year. lol

He is being bitter but that line would actually be good news for console gamers, assuming that PS4 does launch at an attractive price.


Around 65% of a 680 at the moment, actually, but it will probably shift in favour of AMD's GPUs as devs optimise for that architecture.



HoloDust said:

Actually, with HSA features, there is no need for the system to take so much memory:

- CPU and GPU now access the memory with the same address space. Pointers can now be freely passed between CPU and GPU.

- GPU can now access and cache data from coherent memory regions in the system memory, and also reference the data from CPU's cache. Cache coherency is maintained.

- GPU can take advantage of the shared virtual memory between CPU and GPU, and pageable system memory can now be referenced directly by the GPU, instead of being copied or pinned before accessing.

HSA is indeed a big leap for games... on PC, if the CPU needs some data from the GPU (or vice versa), the data has to be copied from VRAM to system RAM before the CPU can work on it... and then copied back from system RAM to VRAM before the GPU can output to the display.

With an HSA memory pool, the CPU and GPU can work on any region of memory without moving data between VRAM and system RAM.

I'm sure that will give a lot of advantage to developers.



superchunk said:
Nvidia is right.

The APU has by default a mobile based CPU and GPU and therefore they are low/mid products.

This is why I've been laughing at all the craziness people have gone on about over the 8GB of GDDR5 RAM.

MS/Sony are not pushing the limits at the start of this gen like they did previously. This is also why they are significantly closer to the Wii U than their predecessors were to the Wii. Wii to Wii U is a larger jump than PS3/360 to PS4/NXbox.

Now, after saying that... it's obvious Nvidia is in damage control.

Why? How exactly is the CPU going to affect the speed and size of the RAM?

It's definitely the weakest link, the opposite of last gen really. I wonder if it'll bottleneck games in the future.



czecherychestnut said:
fillet said:

Why does everyone take this so personally? This is a businessman defending business decisions that were made for business reasons, i.e. he's credible because Nvidia, like any other company, likes money.

His comments are a little exaggerated on the GPU side. To expect GTX 680 performance, or anywhere near it, in a console is a bit much and simply can't be done at a decent price point, or even within a sensible TDP on the heat/power side of things. On the CPU part he's likely pretty close, but the CPU is far less important than the GPU once it reaches a certain level.

Point is that he's not "bitter", and saying as much just makes you look like you don't really understand comments made by a rep from a business who deals with this stuff.

I'm sure the CEO of McDonald's franchise operations (if such a thing exists) would have similarly derogatory remarks about the direction Burger King is taking its business. You can be quite sure they wouldn't be "bitter" or actually lie, but you're going to get an opinion that is centred on McDonald's business direction and biased accordingly.

In fact, OP, you should be ashamed really for using the word "bitterness"; it's really not helpful and gives totally the wrong impression of what is actually being discussed here. You haven't explained why you used the word and it just riles up hate from the usual suspects.

Problem is he is a spokesman for nVidia. It doesn't matter what he personally feels about this; all that matters is how the average Joe Bloggs perceives his comments, and from the comments I've read on many tech sites he comes across as 'bitter', which then reflects back on nVidia. He would have been far better off not saying anything and not drawing attention to the issue. You don't hear the CEO of macca's slagging off their competition; in fact you don't hear them mention the competition at all. Why give them free advertising?

This whole comparing of the PS4 CPU to a PC is silly anyway. Yes, Jaguar is fairly low end; it's a Brazos replacement designed for low-power applications. However, the fact that it's directly attached to a powerful ~HD78XX-class GPU through an extremely high-bandwidth memory bus, and the fact that it has unified memory addressing so the CPU and GPU can work on the same data without shuffling it across a PCI-E connection into separate buckets of memory, marks a massive departure from PC architecture. Through OpenCL you have this seamless ability to balance compute across both the CPU and GPU using the same memory structures. The PS4 architecture (and presumably the 720 as well, if it uses the same) is just not comparable, even before you factor in the benefits of a fixed target platform for which you can heavily optimise code.


I agree with you basically, apart from the bitterness bit. I don't think most educated people believe this guy is bitter at all. It's PR, pure and simple; he's defending his company's business decision in an aggressive fashion. That's got nothing to do with being bitter and much more to do with strategy.

At the end of the day, they would likely have said the same thing about the original Xbox 360 if asked; after all, MS stopped working with nvidia due to the massive licensing costs of the GPU in the original Xbox, which didn't go down over time, so nvidia got a good deal. Maybe Sony had a similar stuffing with the PS3...

Who knows. At the end of the day it comes back to PR and business, not bitterness; bitterness implies it's personal, which is just silly. Sure, you can expect a journalist with an agenda to paint a picture like that; after all, it makes the story more interesting and damning.

Reality is much less interesting though.




All this talk about nvidia being bitter... some people here were saying that nvidia declined the partnership and the power was in nvidia's hands, but lots are forgetting the PS4 is intended (everything points to this being deliberate and not just a cost-effective option) to be x86 architecture, to be easy to code for. nvidia can't do x86 CPUs, and an SoC is what consoles are going for nowadays. Sony would need one party to do the CPU (only Intel and AMD can do x86 CPUs for now), nvidia to do the GPU, and another party to put them together and fab the chip itself. My honest guess is Sony went with AMD because AMD can do the CPU and the GPU, and has contracts in place with fabs to produce the chip wafers; nvidia would be more trouble/work and a lot less cost-effective.



Proudest Platinums - BF: Bad Company, Killzone 2 , Battlefield 3 and GTA4

HoloDust said:
keroncoward said:

It is a CONFIRMED customized tablet/mobile phone GPU. Go research what Bobcat is (basically, Jaguar is two Bobcats put together). The only RAM that would matter to help visuals is the VRAM on the video card itself. The RAM we are speaking about will only help draw distance and how fast the game loads, and they will set some aside to add features like cross-game chat and whatever else. People are treating 8GB RAM like it's the new Cell, lol. If both GPUs have the same modern DX11-like features then we would not see that much of a difference.


You obviously have no idea what you're talking about...

- Jaguar is a low-power CPU architecture, not a GPU. It is aimed at tablets, netbooks and ultra-thin notebooks. It has nothing to do with mobile phones. It is NOT two Bobcats put together

http://www.techpowerup.com/img/13-02-19/95a.jpg

- PS4 has a fully coherent GDDR5 memory pool with a unified address space for CPU and GPU - there is no separate system RAM and VRAM in the PS4

- more RAM is not a new Cell - more RAM allows for better textures, better lighting and, in potential future engines, better Sparse Voxel Octree rendering, to name just a few advantages.


I have no idea what I'm talking about? Jaguar is an APU, which is a CPU and GPU in one (Wii U's GPGPU comes to mind). AMD doesn't even make APUs for the public to buy any more, from what I know. You have no idea what an APU is and how it works with Crossfire. If the PS4 only has 8GB RAM as you are claiming, with no VRAM on the GPU, then what you're saying makes sense. You're right it's not two Bobcats put together, because two Bobcats put together would have performed better; as I explained, I was exaggerating when I said that. Jaguar only performs 30% better than Bobcat at best. You seriously think a tablet APU would not be able to run in a mobile phone? FYI, there are already mobile phones that have far better specs than current gen, but they just don't have the compelling software to show off the capabilities. Mobile phone devs are not going to invest countless time and money just to make a game comparable to consoles on a mobile phone.



He's speaking the truth though.



keroncoward said:
HoloDust said:
keroncoward said:

It is a CONFIRMED customized tablet/mobile phone GPU. Go research what Bobcat is (basically, Jaguar is two Bobcats put together). The only RAM that would matter to help visuals is the VRAM on the video card itself. The RAM we are speaking about will only help draw distance and how fast the game loads, and they will set some aside to add features like cross-game chat and whatever else. People are treating 8GB RAM like it's the new Cell, lol. If both GPUs have the same modern DX11-like features then we would not see that much of a difference.


You obviously have no idea what you're talking about...

- Jaguar is a low-power CPU architecture, not a GPU. It is aimed at tablets, netbooks and ultra-thin notebooks. It has nothing to do with mobile phones. It is NOT two Bobcats put together

http://www.techpowerup.com/img/13-02-19/95a.jpg

- PS4 has a fully coherent GDDR5 memory pool with a unified address space for CPU and GPU - there is no separate system RAM and VRAM in the PS4

- more RAM is not a new Cell - more RAM allows for better textures, better lighting and, in potential future engines, better Sparse Voxel Octree rendering, to name just a few advantages.


I have no idea what I'm talking about? Jaguar is an APU, which is a CPU and GPU in one (Wii U's GPGPU comes to mind). AMD doesn't even make APUs for the public to buy any more, from what I know. You have no idea what an APU is and how it works with Crossfire. If the PS4 only has 8GB RAM as you are claiming, with no VRAM on the GPU, then what you're saying makes sense. You're right it's not two Bobcats put together, because two Bobcats put together would have performed better; as I explained, I was exaggerating when I said that. Jaguar only performs 30% better than Bobcat at best. You seriously think a tablet APU would not be able to run in a mobile phone? FYI, there are already mobile phones that have far better specs than current gen, but they just don't have the compelling software to show off the capabilities. Mobile phone devs are not going to invest countless time and money just to make a game comparable to consoles on a mobile phone.

You should really stop, it's getting a bit embarrassing.

AMD APUs for public purchase:

http://www.scan.co.uk/shop/computer-hardware/all/cpus-amd/amd-a-series-multi-core-socket-fm2

Consumer Jaguar products will only be 4-core, so the chip in the PS4, with 8 identical Jaguar cores, will have twice the theoretical power.

The APU in the PS4 is an SoC. All the RAM will be available to both the CPU and GPU components. There is no Crossfire; it's just one big APU that hasn't been done at this scale before, or using the GCN architecture. This also means that the bottlenecks that exist on a normal desktop PC, with a separate dedicated GPU vs CPU and separate RAM pools of system RAM vs VRAM, don't exist in this setup.

An APU is not the same as GPGPU. GPGPU has been around for ages and is simply the transfer of traditionally CPU-bound tasks to a GPU. It's much harder to code for than traditional CPU development, but some tech/middleware is starting to make use of it (e.g. Havok, UE4).

It's highly unlikely you'll see these chips in mobile phones; the power consumption vs ARM is too great.



CGI-Quality said:
TheBardsSong said:
He's speaking the truth though.

I've heard some people say this already. Care to elaborate?



PCs are constantly evolving, while consoles are stuck with fixed specs that are usually outdated by the time they launch.