fillet said:
7800GTX easily gobbled up 250w+ during peak load. |
Seriously doubt that
http://www.techarp.com/showarticle.aspx?artno=88&pgno=8
Found this interesting. Not sure if it was posted yet.
Really, they are going with "Xbox 720" for comparison? That's internet slang. Anyone using that term professionally is suspect; one would SERIOUSLY expect to see Durango in there, not 720.
I'm not sure if anyone has mentioned this but I saw a couple times that the xbox 720 has a dual gpu.
http://www.expertreviews.co.uk/games/1296739/xbox-720-release-date-specs-news-and-rumours
ethomaz said:
It is a official nVidia slide... so nVidia is upset with the new generation. |
Yeah they are, but no one knows anything about the new Xbox, so they can't compare anything based on rumours on their website. I'd call it unfair and unreasonable.
HoloDust said:
|
That chart doesn't show anything at all regarding power consumption.
EDIT - sorry, I was referring to total power consumption; I didn't realize the benchmarks I was looking at were measuring overall power consumption of the whole setup. I wasn't aware of this, thanks for the info.
fillet said:
7800GTX easily gobbled up 250w+ during peak load. All top end graphics cards for the past 5 years or so have had a power consumption of between 200-300w give or take a few watts. The power requirments haven't gone up, the manufacturing process has shurnk which reduces power consumption but transistor count has increased (obviously) which increases power consumption, they work to a target peak and the Titan is nowhere abnormal relative to it's power. It basically offers 50% performance increase for nowhere near 50% increased power consumption Vs say a crossfire/SLI design which eats 100% more power for basically 60-65% increase in performance. Of course with crossfire/SLI you get good old microstutter making it worthless. The Gefore TiTAN is nothing short of a marvel in terms of performance and power, to say otherwise is extremely ignorant and shows a complete lack knowledge of where things are currently. AMD doesn't have anything ready to compete with it and won't do for about 6-12 months! ...Now the price, that's something entirely different, that's not good at all and stinks of a cash in. Also, the power consumption in the PS4 being similar to the PS3 is completely expected, as outlined already, the die process shrinks, the transistor count increases, each balancing out to reach a power envelope that is manaeable. AMD have done well with their low end - mid-range APUs, but don't be fooled there's nothing special going on here, top end parts still have massive power draws from both AMD and nVidia. And rightly so, because if they didn't, they wouldn't be maximizing the design to it's limits. Which is what the PC graphics card industry is all about and these days the console graphics card solutions - are the PC solutions. AMD low power APUs = PS4/Xbox 720. 
These are not high end parts, the important thing to remember as well is that performance Vs power consumption doesn't follow a linear increase, when you get to the top end the "performance per watt" decreases and plateaus off as expected and as all components in the electronic industry do. Low power designs are optimized for low power consumption (obviously), high power designs are optimized for high power with little regard for power consumption (obviously). |
I think you are confused with total system power consumption; the 7800 GTX used about 80W.
The 7900 GT even lowered that to 50W while delivering high-end performance, and a 7950 GX2 used about 110w-120w. That's a dual-GPU card using just over 100W! Power consumption really started to boom about two years later though, and it was mostly Nvidia cards becoming very power hungry.
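For what it's worth, the performance-per-watt argument in the quoted post can be sketched with a bit of arithmetic. The wattages and relative performance figures below are illustrative assumptions taken from the post's own claims (roughly 250 W baseline, +50% performance for modestly more power on a Titan-style part, +60-65% performance for double the power in SLI), not measurements:

```python
# Rough performance-per-watt comparison, using the claims from the post
# above. All numbers are illustrative assumptions, not measured figures.

def perf_per_watt(performance, watts):
    """Relative performance divided by power draw."""
    return performance / watts

# Baseline single high-end card: performance 1.0 at an assumed 250 W.
single = perf_per_watt(1.0, 250)

# Titan-style part per the post: ~50% more performance for well under
# 50% more power -- assume +20% power (300 W) for the sketch.
titan = perf_per_watt(1.5, 300)

# SLI/CrossFire per the post: 100% more power for ~62.5% more performance.
sli = perf_per_watt(1.625, 500)

print(f"single: {single:.5f}, titan: {titan:.5f}, sli: {sli:.5f}")

# Under these assumptions, the Titan-style part improves on the baseline,
# while the SLI setup falls well below it.
assert titan > single > sli
```

The exact ratios shift with the assumed wattages, but the shape of the argument (a big single die beating a dual-card setup on efficiency) holds as long as the dual setup doubles power for less-than-double performance.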
AnthonyW86 said:
I think you are confused with total system power consumption, 7800 gtx used about 80W: The 7900 GT even lowered that to 50W while delivering high-end performance, and a 7950 GX2 used about 110w-120w. That's a dual gpu card using just over 100W! Power consumption really started to boom about two years later though. |
You are entirely correct here. My apologies for talking nonsense about the power consumption bit.
fillet said:
|
No problem. Nvidia made an interesting step with the Titan indeed, but it did end up using more power than the GTX 680. I think this is the reason why AMD has passed on introducing an entirely new card this time; there is simply not a big enough step to take without a rise in power consumption.
Jega said: I'm not sure if anyone has mentioned this but I saw a couple times that the xbox 720 has a dual gpu.
PROCESSOR AND GRAPHICS - updated 26/03/13. As with the PS4, the Xbox 720 looks certain to use AMD graphics hardware. VG247 reports that multiple sources have confirmed to it that the console will use two GPUs in tandem; it goes on to say that these won't be used in the usual 'CrossFire' configuration seen in PCs. We're not sure about all this, as complex hardware architectures have been the failing of many consoles in the past; a simple architecture is easier for developers to get to grips with quickly. In any case, it's the RAM that really matters: if the PS4 is using 8 gigs of memory, you can bet the Xbox 720 will at least match that. |
Hmm, this is not very likely. Dual GPUs would need higher memory bandwidth to make sense. APU + GPU + 8GB GDDR5 north of 200GB/s, plus Kinect, HDD etc. would make for a big and expensive console. It's not impossible, but I don't see it. The majority of rumours point to 1.2 TFLOPs with 8GB DDR3 + 32MB eSRAM.
It's not impossible to slap on another GPU or make a dual-APU system, but it would definitely not be cheap. Wasn't the dual-APU design called Yukon, and MS decided against it?
Personally I would love to see MS go all out again and make an unreasonably power-hungry beast, but I don't believe they will do it.
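The bandwidth side of this can be sanity-checked with simple arithmetic: peak memory bandwidth is roughly bus width (in bytes) times per-pin data rate. The bus widths and data rates below are illustrative assumptions for the sketch, not confirmed console specs:

```python
# Back-of-envelope peak memory bandwidth, to show why "north of 200GB/s"
# points toward GDDR5 rather than plain DDR3. Bus widths and per-pin
# rates here are assumptions for illustration, not leaked specs.

def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak bandwidth in GB/s: (bus width in bytes) * per-pin rate in Gbit/s."""
    return (bus_width_bits / 8) * data_rate_gbps

# 256-bit GDDR5 at 5.5 Gbit/s per pin (a commonly rumoured PS4-style figure):
gddr5 = bandwidth_gbs(256, 5.5)   # 176.0 GB/s

# 256-bit DDR3-2133 at ~2.133 Gbit/s per pin:
ddr3 = bandwidth_gbs(256, 2.133)  # ~68.3 GB/s

print(f"GDDR5: {gddr5:.1f} GB/s, DDR3: {ddr3:.1f} GB/s")
```

Even under generous assumptions, a DDR3 main pool lands far below GDDR5, which is exactly why the DDR3 rumours pair it with a fast on-die eSRAM scratchpad, and why feeding two GPUs from it would be a stretch.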