
Forums - Gaming Discussion - NVIDIA: PS4 GPU 3x less powerful than Titan, but more powerful than Xbox 720

fillet said:


Completely wrong.

7800GTX easily gobbled up 250w+ during peak load.


Seriously doubt that

http://www.techarp.com/showarticle.aspx?artno=88&pgno=8




Found this interesting; not sure if it was posted yet.

http://www.techradar.com/news/gaming/consoles/amd-on-the-ps4-we-gave-it-the-hardware-nvidia-couldn-t-1141607



Really, they are going with "Xbox 720" for the comparison? That is internet slang. Anyone claiming to be a professional who uses that term is suspect. You would SERIOUSLY expect to see Durango in there, not 720.



I'm not sure if anyone has mentioned this, but I've seen it claimed a couple of times that the Xbox 720 has a dual GPU.


http://www.expertreviews.co.uk/games/1296739/xbox-720-release-date-specs-news-and-rumours

PROCESSOR AND GRAPHICS - updated 26/03/13

As with the PS4, the Xbox 720 looks certain to use AMD graphics hardware. VG247 reports that multiple sources have confirmed to it that the console will use two GPUs in tandem, though it goes on to say that these won't be used in the usual 'CrossFire' configuration seen in PCs. We're not sure about all this, as complex hardware architectures have been the failing of many consoles in the past; a simple architecture is easier for developers to get to grips with quickly.

In any case, it's the RAM that really matters. If the PS4 is using 8GB of memory, you can bet the Xbox 720 will at least match that.



ethomaz said:
arminuk said:
Seriously, since when does Nvidia call the next Xbox the "Xbox 720"? To be honest, Sony loyalists do try to make the PS4 look more powerful than the new Xbox.

It is an official nVidia slide... so nVidia is upset with the new generation.


Yeah, they are, but no one knows anything about the new Xbox, so they can't compare anything based on rumours on their website. I call it unfair and unreasonable.



HoloDust said:
fillet said:


Completely wrong.

7800GTX easily gobbled up 250w+ during peak load.


Seriously doubt that

http://www.techarp.com/showarticle.aspx?artno=88&pgno=8


That chart doesn't show anything at all regarding power consumption.

EDIT - sorry, I was referring to total power consumption; I didn't realize the benchmarks that test was looking at were measuring the overall power consumption of the whole setup. I wasn't aware of this, thanks for the info.



fillet said:
AnthonyW86 said:
ethomaz said:

AnthonyW86 said:

I don't get why Nvidia is so upset by this, they couldn't offer a system on a chip design so it's their own fault.

I don't know too but I think it is because the money lost with each console sold... at least in this generation they have the PS3... now nothing.

That is the third direct attack from nVidia to the next-generation in less than one week.

I think they are getting scared. They must realize that since a lot of PC games are very similar to their console counterparts, AMD will have an advantage in the PC graphics card market as well.

Mostly though, I think the chart paints a very misleading picture, since top-end graphics cards these days are simply way too power hungry. Just to compare: a GeForce 7800 GTX had a peak power consumption of about 80W, and that was the top dog back in the day (and a similar design was used in the PS3). A Titan GPU has a peak consumption of 260W(!). It's like using three 7800 GTX cards in SLI. That's why I think AMD did an incredible job by creating such a powerful APU, since the GPU of the PS4 will likely have only a slightly higher power consumption than the RSX in the PS3.


Completely wrong.

7800GTX easily gobbled up 250w+ during peak load.

All top-end graphics cards for the past five years or so have had a power consumption of between 200-300W, give or take a few watts. The power requirements haven't gone up: the manufacturing process has shrunk, which reduces power consumption, but the transistor count has increased (obviously), which raises it again. Manufacturers work to a target peak, and the Titan is nowhere near abnormal relative to its power.

It basically offers a 50% performance increase for nowhere near a 50% increase in power consumption, versus say a CrossFire/SLI design, which eats 100% more power for basically a 60-65% increase in performance.

Of course, with CrossFire/SLI you get good old microstutter, making it worthless.

The GeForce Titan is nothing short of a marvel in terms of performance and power; to say otherwise is extremely ignorant and shows a complete lack of knowledge of where things currently stand. AMD doesn't have anything ready to compete with it and won't for about 6-12 months!

...Now the price, that's something entirely different; that's not good at all and stinks of a cash-in.

Also, the power consumption of the PS4 being similar to the PS3's is completely expected: as outlined already, the die process shrinks and the transistor count increases, each balancing out to reach a power envelope that is manageable. AMD have done well with their low-end to mid-range APUs, but don't be fooled, there's nothing special going on here; top-end parts still have massive power draws from both AMD and nVidia.

And rightly so, because if they didn't, they wouldn't be pushing the design to its limits, which is what the PC graphics card industry is all about. These days the console graphics solutions are the PC solutions: AMD low-power APUs = PS4/Xbox 720. These are not high-end parts. The important thing to remember as well is that performance versus power consumption doesn't scale linearly; at the top end, performance per watt decreases and plateaus off, as expected and as all components in the electronics industry do.

Low-power designs are optimized for low power consumption (obviously); high-power designs are optimized for performance with little regard for power consumption (obviously).

I think you are confusing it with total system power consumption; the 7800 GTX used about 80W:

The 7900 GT even lowered that to 50W while delivering high-end performance, and a 7950 GX2 used about 110-120W. That's a dual-GPU card using just over 100W! Power consumption really started to boom about two years later, though, and it was mostly Nvidia cards becoming very power hungry.
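The performance-per-watt comparison in the post above can be sketched with a few lines of arithmetic. The figures here are the rough numbers quoted in the thread (Titan at ~260W and ~50% faster than a single ~195W high-end card; SLI at double the power for ~65% more performance); they are ballpark assumptions, not measured benchmarks:

```python
def perf_per_watt(relative_perf, watts):
    """Relative performance delivered per watt of board power."""
    return relative_perf / watts

# Assumed figures from the discussion above (approximate, not measured):
baseline = perf_per_watt(1.0, 195)       # single high-end card, ~195W
titan    = perf_per_watt(1.5, 260)       # ~50% faster for well under 50% more power
sli      = perf_per_watt(1.65, 2 * 195)  # ~65% faster for ~100% more power

print(f"baseline: {baseline:.4f} perf/W")
print(f"titan:    {titan:.4f} perf/W")
print(f"sli:      {sli:.4f} perf/W")
```

With these assumed numbers, the single big die keeps noticeably more of its efficiency than the SLI pair does, which is the post's point about dual-card setups.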



AnthonyW86 said:
fillet said:
AnthonyW86 said:
ethomaz said:

AnthonyW86 said:

I don't get why Nvidia is so upset by this, they couldn't offer a system on a chip design so it's their own fault.

I don't know too but I think it is because the money lost with each console sold... at least in this generation they have the PS3... now nothing.

That is the third direct attack from nVidia to the next-generation in less than one week.

I think they are getting scared. They must realize that since a lot of PC games are very similar to their console counterparts, AMD will have an advantage in the PC graphics card market as well.

Mostly though, I think the chart paints a very misleading picture, since top-end graphics cards these days are simply way too power hungry. Just to compare: a GeForce 7800 GTX had a peak power consumption of about 80W, and that was the top dog back in the day (and a similar design was used in the PS3). A Titan GPU has a peak consumption of 260W(!). It's like using three 7800 GTX cards in SLI. That's why I think AMD did an incredible job by creating such a powerful APU, since the GPU of the PS4 will likely have only a slightly higher power consumption than the RSX in the PS3.


Completely wrong.

7800GTX easily gobbled up 250w+ during peak load.

All top-end graphics cards for the past five years or so have had a power consumption of between 200-300W, give or take a few watts. The power requirements haven't gone up: the manufacturing process has shrunk, which reduces power consumption, but the transistor count has increased (obviously), which raises it again. Manufacturers work to a target peak, and the Titan is nowhere near abnormal relative to its power.

It basically offers a 50% performance increase for nowhere near a 50% increase in power consumption, versus say a CrossFire/SLI design, which eats 100% more power for basically a 60-65% increase in performance.

Of course, with CrossFire/SLI you get good old microstutter, making it worthless.

The GeForce Titan is nothing short of a marvel in terms of performance and power; to say otherwise is extremely ignorant and shows a complete lack of knowledge of where things currently stand. AMD doesn't have anything ready to compete with it and won't for about 6-12 months!

...Now the price, that's something entirely different; that's not good at all and stinks of a cash-in.

Also, the power consumption of the PS4 being similar to the PS3's is completely expected: as outlined already, the die process shrinks and the transistor count increases, each balancing out to reach a power envelope that is manageable. AMD have done well with their low-end to mid-range APUs, but don't be fooled, there's nothing special going on here; top-end parts still have massive power draws from both AMD and nVidia.

And rightly so, because if they didn't, they wouldn't be pushing the design to its limits, which is what the PC graphics card industry is all about. These days the console graphics solutions are the PC solutions: AMD low-power APUs = PS4/Xbox 720. These are not high-end parts. The important thing to remember as well is that performance versus power consumption doesn't scale linearly; at the top end, performance per watt decreases and plateaus off, as expected and as all components in the electronics industry do.

Low-power designs are optimized for low power consumption (obviously); high-power designs are optimized for performance with little regard for power consumption (obviously).

I think you are confusing it with total system power consumption; the 7800 GTX used about 80W:

The 7900 GT even lowered that to 50W while delivering high-end performance, and a 7950 GX2 used about 110-120W. That's a dual-GPU card using just over 100W! Power consumption really started to boom about two years later, though.


You are entirely correct here. My apologies for talking nonsense about the power consumption bit.



fillet said:


You are entirely correct here. My apologies for talking nonsense about the power consumption bit.

No problem. Nvidia made an interesting step with Titan indeed, but it did end up using more power than the GTX 680. I think this is the reason why AMD has passed on introducing an entirely new card this time; there is simply not a big enough step to take without a rise in power consumption.



Jega said:

I'm not sure if anyone has mentioned this, but I've seen it claimed a couple of times that the Xbox 720 has a dual GPU.


http://www.expertreviews.co.uk/games/1296739/xbox-720-release-date-specs-news-and-rumours

PROCESSOR AND GRAPHICS - updated 26/03/13

As with the PS4, the Xbox 720 looks certain to use AMD graphics hardware. VG247 reports that multiple sources have confirmed to it that the console will use two GPUs in tandem, though it goes on to say that these won't be used in the usual 'CrossFire' configuration seen in PCs. We're not sure about all this, as complex hardware architectures have been the failing of many consoles in the past; a simple architecture is easier for developers to get to grips with quickly.

In any case, it's the RAM that really matters. If the PS4 is using 8GB of memory, you can bet the Xbox 720 will at least match that.


Hmm, this is not very likely. Dual GPUs would need higher memory bandwidth to make sense. An APU + GPU + 8GB of GDDR5 north of 200GB/s + Kinect + HDD etc. would make for a big and expensive console. It's not impossible, but I don't see it. The majority of rumours point to 1.2 TFLOPS with 8GB DDR3 + 32MB eSRAM.

It's not impossible to slap on another GPU or make a dual-APU system, but it would definitely not be cheap. Wasn't the dual-APU design called Yukon, and MS decided against it?
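The bandwidth argument above can be checked with simple arithmetic: peak memory bandwidth is the effective transfer rate times the bus width in bytes. The clocks and bus widths below are illustrative assumptions matching the rumoured configurations, not confirmed console specs:

```python
def peak_bandwidth_gbps(effective_mtps, bus_width_bits):
    """Theoretical peak bandwidth in GB/s: transfers/s times bytes per transfer."""
    return effective_mtps * 1e6 * (bus_width_bits / 8) / 1e9

# Assumed GDDR5 setup: 5500 MT/s effective on a 256-bit bus.
gddr5 = peak_bandwidth_gbps(5500, 256)  # -> 176.0 GB/s

# Assumed DDR3-2133 on a 256-bit bus, as in the DDR3 + eSRAM rumour.
ddr3 = peak_bandwidth_gbps(2133, 256)   # -> ~68.3 GB/s

print(f"GDDR5: {gddr5:.1f} GB/s, DDR3: {ddr3:.1f} GB/s")
```

Even with generous assumptions, plain DDR3 delivers well under half the GDDR5 figure, which is why the rumoured design pairs it with a small pool of fast eSRAM.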

 

Personally, I would love to see MS go all out again and make an unreasonably power-hungry beast, but I don't believe they will do it.