
Question about SLI and power consumption

m0ney said:
Tecmo said:

Anyway, prob just going to get the R9 280X unless someone has a better suggestion for around the same price.

I don't want to be that guy, but from my own experience I would recommend going with Nvidia, and here is why:

- Nvidia used to have better frame latency, which is a very important part of your gaming experience. Notice I said used to, because I don't know the current situation; maybe AMD has improved in that department.

- Nvidia cards generate less heat and hence are usually quieter. Of course, it depends on the particular card, brand, and model.


1. The first part is no longer true. In fact, the 960 has horrible frame times against the 280X, and is WORLDS apart from a similarly priced 290.

http://www.techspot.com/review/948-geforce-gtx-960-sli-performance/page3.html
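Quick aside on frame times, since that's the metric in the TechSpot link: average FPS can look identical while one setup stutters. A toy sketch with made-up numbers (not the review's data) shows why the 99th-percentile frame time is the stat to watch:

```python
# Illustration of why frame times matter more than average FPS.
# Numbers are made up for the example, not taken from the linked review.

def percentile(samples, pct):
    """Nearest-rank percentile of a list of numbers."""
    ordered = sorted(samples)
    rank = max(1, min(len(ordered), round(pct / 100.0 * len(ordered))))
    return ordered[rank - 1]

# Frame times in milliseconds: nearly the same average, very different feel.
smooth = [16.7] * 100                 # steady ~60 FPS
stutter = [14.0] * 95 + [70.0] * 5    # "faster" on average, but it hitches

for name, frames in (("smooth", smooth), ("stutter", stutter)):
    avg_fps = 1000.0 / (sum(frames) / len(frames))
    print(f"{name}: avg {avg_fps:.0f} FPS, "
          f"99th percentile frame time {percentile(frames, 99):.1f} ms")
```

Both setups print ~60 FPS average, but the second spends its worst frames at 70 ms. That's the stutter you feel.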

It actually takes $400 worth of 960s in SLI to match a single $240-250 R9 290 in games:

http://www.techpowerup.com/mobile/reviews/NVIDIA/GeForce_GTX_960_SLI/23.html

That's shockingly bad. NV really screwed up with the 960's pricing. The card should be $149 with that level of performance.
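If you want to sanity-check the value math yourself, here's a back-of-envelope sketch (prices as quoted above; the relative-performance figures are rough approximations of what the reviews show, not measured data):

```python
# Back-of-envelope value math using the prices quoted above. rel_perf is
# normalized to a single R9 290 = 1.0 -- approximate figures per the linked
# reviews, for illustration only.

cards = {
    "R9 290":      {"price": 250, "rel_perf": 1.00},
    "GTX 960 SLI": {"price": 400, "rel_perf": 1.00},  # roughly ties a single 290
    "GTX 960":     {"price": 210, "rel_perf": 0.67},  # ~45-50% slower than a 290
}

for name, c in cards.items():
    print(f"{name}: ${c['price']} -> ${c['price'] / c['rel_perf']:.0f} "
          f"per unit of 290-level performance")

print(f"960 SLI price premium over one 290: {400 / 250 - 1:.0%}")
```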

2. An after-market 290 = reference 290X in performance. That means 45-50% more performance for $250 by going with a 290 over a 960:

http://www.techpowerup.com/mobile/reviews/Gigabyte/GTX_960_G1_Gaming/28.html

3. ^ Notice the 960's horrendous performance in AC Unity and Shadow of Mordor in that review? That's because 2GB of VRAM is no longer enough for 1080p PC gaming in 2015. Want more proof?

The 960 bombs against the 3GB R9 280/280X and 4GB 290X in Evolve, another VRAM-heavy title.

http://www.sweclockers.com/artikel/20031-snabbtest-grafikprestanda-i-evolve/3
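If you're wondering where 2GB actually goes at 1080p, here's a rough, assumption-heavy sketch. The buffer count and asset pool size are illustrative guesses, not figures measured from any game:

```python
# Rough sketch of where 2 GB goes at 1080p. All figures are illustrative
# assumptions -- real budgets vary a lot by game and engine.

width, height = 1920, 1080
bytes_per_pixel = 4                       # 32-bit RGBA

one_target_mb = width * height * bytes_per_pixel / 1024**2
render_targets_mb = one_target_mb * 8     # assume ~8 full-screen buffers
                                          # (G-buffer, depth, HDR, post FX)

# The bulk of VRAM goes to assets, not the framebuffer:
assets_mb = 2000                          # assumed texture/geometry pool for
                                          # a 2014/15 AAA title at Ultra

total_mb = render_targets_mb + assets_mb
print(f"render targets: ~{render_targets_mb:.0f} MB")
print(f"assets: ~{assets_mb} MB")
print(f"total: ~{total_mb:.0f} MB -> spills past a 2048 MB card, fits in 3-4 GB")
```

The point: the framebuffer itself is tiny; it's the texture and asset pools in these new console ports that blow past 2GB.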

4. Your last point is SKU-specific. You can easily find cool and quiet R9 290/290X cards:

MSI Gaming R9 290 is $240 on Newegg after rebate:

http://m.newegg.com/Product/index?itemnumber=14-127-774

Noise levels are top-notch on it:

http://www.kitguru.net/components/graphic-cards/zardon/msi-r9-290-oc-gaming-edition-review-1600p-ultra-hd-4k/29/

5. Games will only get more VRAM-intensive as we see the 2nd and 3rd waves of PS4 console ports. For that reason, going GTX 570 1.28GB SLI is a waste of time and money. Secondly, XDMA CrossFire is smoother than SLI (see HardOCP's reviews). This also means the 960 2GB and 960 SLI 2GB are already obsolete! What makes it worse is that a single AMD R9 290 card, with no need for SLI profiles, is as fast as 960 SLI. What happens in games where SLI doesn't scale? It's a walkover, and to make things worse, 960 SLI costs 60% more than the 290. Ouch.
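To put numbers on that walkover (assumed FPS figures, in line with the reviews linked above):

```python
# Illustration of the "walkover": what 960 SLI delivers depending on how
# well a game's SLI profile scales. Baseline numbers are assumptions in line
# with the reviews above (single 960 ~ 2/3 of a 290).

r9_290_fps = 60.0
single_960_fps = 40.0

scaling = {"good SLI profile": 1.5, "mediocre scaling": 1.2, "no SLI profile": 1.0}

for case, factor in scaling.items():
    print(f"{case}: 960 SLI ~{single_960_fps * factor:.0f} FPS "
          f"vs R9 290 ~{r9_290_fps:.0f} FPS, at ~60% higher price")
```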

6. On average the GTX 970 is only 5% faster than an after-market 290 but costs $330+. Since he has a stock i5 2500K CPU, while most reviewers use highly overclocked 4.2GHz+ i5/i7s, he would never see any of that 5%. What's more, in Mantle games he would get faster performance, since his CPU bottleneck would be lifted.

Given the price difference between the 970 and 290, a $240 290 is a clear winner for the OP. He can pocket $90 towards a 250GB SSD like the Samsung 850 EVO, save it for an after-market cooler to overclock his CPU, or put it aside towards 14nm GPUs in 2016/2017. The 960 at $200-250 is a horrible option by all accounts, 45-50% slower than a 290, and that disadvantage grows to 75-90% where 2GB of VRAM becomes a major bottleneck.

A Corsair 650W is easily powerful enough for a max-OC 290X and a Core i7 5960X @ 4.5GHz. Right now NV's lineup is all underperforming below $330, while the 980 is badly overpriced. The best value on the market is an after-market 290. Alternatively, one will have to wait for June/July, when the R9 300 series launches and pricing and performance get readjusted for the entire market.
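Here's the ballpark power math behind that claim (the wattages are my assumptions, not measurements; check reviews of your exact parts before copying this):

```python
# Ballpark power budget for a quality 650 W unit. Wattages are assumptions;
# check reviews of your exact parts before copying this math.

parts_w = {
    "R9 290X, heavy OC":   320,   # stock board power is ~290 W
    "i7 5960X @ 4.5 GHz":  190,   # 140 W TDP stock; the OC adds a lot
    "board/RAM/SSD/fans":   60,
}

total_w = sum(parts_w.values())
psu_w = 650
print(f"estimated peak draw: {total_w} W on a {psu_w} W unit "
      f"({total_w / psu_w:.0%} load, {psu_w - total_w} W of headroom)")
# GPU and CPU rarely peak at the same instant in games, so real-world
# gaming draw sits well below this worst case.
```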



m0ney said:

I don't want to be that guy, but from my own experience I would recommend going with Nvidia, and here is why:

- Nvidia used to have better frame latency, which is a very important part of your gaming experience. Notice I said used to, because I don't know the current situation; maybe AMD has improved in that department.

- Nvidia cards generate less heat and hence are usually quieter. Of course, it depends on the particular card, brand, and model.

Frame pacing has improved tremendously on AMD.

Nvidia weren't exactly cooler or quieter with Fermi ...



fatslob-:O said:
Tachikoma said:

Still a shame, because no sites actually cover it; for people wanting to do it, the market's a minefield of GPUs with unknown performance at the task.

Only this week I saw someone who had bought a pair of 780s for V-Ray rendering, when for the price of just one he could have gone with dual or triple 580 SLI, depending on new/used prices.

There are at least a few sites that DO cover compute benchmarks, and in your particular case the ones dealing with light transport simulation ...

As for mining performance, everything prior to Maxwell on Nvidia's side sucks badly because of bad integer or memory subsystem performance ... 

AMD hardware has a good history of mining performance, and the hashing benchmarks show it too ...

All of that is in the past now, since FPGAs and ASICs are threatening GPU mining in general ...

Luxmark isn't representative (AT ALL) of V-Ray render performance, and it does the exact opposite of informing budding 3D designers which hardware is best for actual 3D modelling and rendering, by providing results that are virtually the polar opposite of what they will get in actual 3D modelling applications, as opposed to broadly generic benchmark testing.

Why you brought up GPU mining, I cannot for the life of me fathom.



Tachikoma said:

Luxmark isn't representative (AT ALL) of V-Ray render performance, and it does the exact opposite of informing budding 3D designers which hardware is best for actual 3D modelling and rendering, by providing results that are virtually the polar opposite of what they will get in actual 3D modelling applications, as opposed to broadly generic benchmark testing.

Luxmark isn't supposed to be representative of V-Ray performance ... They're two different applications that focus on visualization ...

V-Ray itself isn't specifically a 3D modeling program either. It's just a rendering engine meant to be integrated into 3D modeling software, much like LuxRender ...



You can get away with an HX650 (I'm assuming that's the one you have?) with 2x 560 Tis, but 2x 570s is really pushing it. I'd just sell it and get a GTX 970. You could SLI two of those with your PSU. I wouldn't CrossFire any high-end AMD GPUs with that PSU, though. They're a bit thirsty.
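The rough math behind that, using typical published TDPs and an assumed platform draw (your cards' actual consumption depends on model and OC):

```python
# Quick dual-GPU budget check for an HX650. Board-power figures are typical
# published TDPs, not measurements of any specific card.

gpu_tdp_w = {"GTX 560 Ti": 170, "GTX 570": 220, "GTX 970": 145, "R9 290": 275}
cpu_and_rest_w = 180   # assumed CPU under load plus board/drives/fans

for gpu, w in gpu_tdp_w.items():
    total = 2 * w + cpu_and_rest_w
    if total <= 650 * 0.85:
        verdict = "comfortable"
    elif total <= 650:
        verdict = "really pushing it"
    else:
        verdict = "don't"
    print(f"2x {gpu}: ~{total} W -> {verdict}")
```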



fatslob-:O said:

Luxmark isn't supposed to be representative of V-Ray performance ... They're two different applications that focus on visualization ...

V-Ray itself isn't specifically a 3D modeling program either. It's just a rendering engine meant to be integrated into 3D modeling software, much like LuxRender ...

Then why in God's name did you chime in? I was, and have been, talking specifically about V-Ray the entire time.

Given that you went off on a tangent about bit mining because I used the word minefield, I'm going to guess that you skimmed my post and decided to be a smartypants.

Ask yourself this: if you wanted to know what card was best for using V-Ray, would ANY of the stuff you posted help whatsoever? And kindly don't point to Luxmark/LuxRender, because its performance is completely different from V-Ray's, almost the exact opposite card-wise.

You know what, never mind, can't be bothered arguing about it.



Tachikoma said:

Then why in God's name did you chime in? I was, and have been, talking specifically about V-Ray the entire time.

Given that you went off on a tangent about bit mining because I used the word minefield, I'm going to guess that you skimmed my post and decided to be a smartypants.

Ask yourself this: if you wanted to know what card was best for using V-Ray, would ANY of the stuff you posted help whatsoever? And kindly don't point to Luxmark/LuxRender, because its performance is completely different from V-Ray's, almost the exact opposite card-wise.

What's NOT representative about LuxRender? Both of them are based on similar global illumination schemes ...

How do you know it's not the early DRIVERS affecting performance? I find it hard to believe that a cut-down Titan should take twice the time on the same scene.

One thing's for certain: a 580 outperforms the 680 in Luxmark, much like in that Blender benchmark you showed, so not all is dissimilar ...



fatslob-:O said:

How do you know it's not the early DRIVERS affecting performance? I find it hard to believe that a cut-down Titan should take twice the time on the same scene.

Because I have a rig here with 3x 780 Tis, and Tamron has one with 3x 580s, and there is a large, notable difference in rendering speed between the two, with the three-way 580 setup outperforming the 780 Tis in V-Ray 3 and 3 SP1, just as it did in 2.0, and just as it has done with every driver release for these cards since they were installed.

Simply put, the 580 is an excellent card for V-Ray rendering and punches way above the newer cards. I just wish sites actually included V-Ray benchmarking, because the results aren't really comparable to other renderers or simulations; only the Titan and the newer 9xx cards actually outperform it. And when you can pick up used 580s relatively cheap now (since there's no price inflation from the mining junkies), for a low-cost, high-performance V-Ray rendering rig, the 580 kicks ass.
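For anyone who wants to run their own V-Ray-style comparisons since the sites won't: here's a minimal timing harness. The render command below is a PLACEHOLDER, not a real V-Ray invocation; swap in whatever command line your renderer actually uses on a fixed benchmark scene.

```python
# Minimal DIY render benchmark timer, for when review sites don't publish
# V-Ray numbers. RENDER_CMD is a PLACEHOLDER -- substitute the actual
# command line your renderer uses to render a fixed scene.

import subprocess
import time

RENDER_CMD = ["your_renderer", "--scene", "benchmark_scene"]  # hypothetical
RUNS = 3

times = []
for i in range(RUNS):
    start = time.perf_counter()
    subprocess.run(RENDER_CMD, check=True)
    times.append(time.perf_counter() - start)
    print(f"run {i + 1}: {times[-1]:.1f} s")

print(f"best of {RUNS}: {min(times):.1f} s -- use the same scene on every rig")
```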