vivster said:
Well, I checked my sources before saying this. A whole system with an overclocked 280X and a 4770K consumes about 350W. The 8350 consumes about 100W more than the i7. That's why 500W would be fine (if it is a good PSU it will deliver close to 500W) and 550W to be safe. Everything above that would be wasted unless he plans to overclock his CPU and get an overclocked 290X. I'm not used to calculating with AMD. My next system will be an i7 and big Maxwell, and I'm planning on getting a PSU around 400W and not bigger than 450W.
Here are some benchmarks for the 280X:
http://www.tomshardware.com/reviews/radeon-r9-280x-r9-270x-r7-260x,3635-18.html
In Metro: Last Light it averages 207 watts, which means it likely peaks near its TDP of 250 watts.
Here are some benchmarks for the FX-8350:
http://www.tomshardware.com/reviews/fx-8350-vishera-review,3328-16.html
It averages 182.21 watts.
207 + 182.21 = 389.21 watts on average from the GPU and CPU alone.
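Just to make the arithmetic explicit, here's a quick Python sketch. The 50 W allowance for the motherboard, RAM, drives, and fans is my own rough assumption, not a measured figure:

```python
# Rough average-draw estimate from the benchmark figures above.
gpu_avg_w = 207.0    # R9 280X average in Metro: Last Light (Tom's Hardware)
cpu_avg_w = 182.21   # FX-8350 average (Tom's Hardware)
other_w = 50.0       # ASSUMPTION: allowance for motherboard, RAM, drives, fans

system_avg_w = gpu_avg_w + cpu_avg_w + other_w
print(f"Estimated average system draw: {system_avg_w:.2f} W")  # 439.21 W
```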
Now, the peak power consumption can be higher than that, and it will be with more demanding games. How much power you actually get also depends on the model and brand of the power supply. There is an Antec supply that my friend is looking at that really can do 550 watts continuously but is rated for 500 watts. Meanwhile, certain Corsair supplies are rated at 500 watts and really only output 450 watts continuously. Both are highly regarded brands.

Nevertheless, if the CPU and GPU can peak within 50 watts of your power supply's rating, that isn't enough, because your PC will shut down at those peaks (there are other components drawing power too). And if you ever consider overclocking (even if you don't think you will now), then you are pretty screwed. And it's all to save $10-$15, which is usually the cost difference for 50-100 more watts in the same model.
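If you want to turn that headroom rule into numbers, here's a back-of-the-envelope sketch. The 250 W GPU figure is the TDP cited above; the CPU peak, the allowance for other components, and the 100 W margin are all assumptions for illustration, not measurements:

```python
# Back-of-the-envelope PSU sizing with headroom for peaks.
gpu_peak_w = 250.0   # R9 280X TDP (cited above)
cpu_peak_w = 200.0   # ASSUMPTION: FX-8350 peak, a bit above its 182 W average
other_w = 50.0       # ASSUMPTION: allowance for the rest of the system
headroom_w = 100.0   # ASSUMPTION: more than the 50 W margin called too tight above

peak_w = gpu_peak_w + cpu_peak_w + other_w
recommended_w = peak_w + headroom_w
print(f"Estimated peak: {peak_w:.0f} W; look for a PSU rated around {recommended_w:.0f} W or more")
```

With these numbers the estimated peak lands right at 500 W, which is exactly the "within 50 watts of the rating" situation described above, so the extra rated capacity is what makes the difference.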