
AMD R9 295X2 Specs Out: GTX Titan Z Killer!

TheJimbo1234 said:

 

EDIT:  Yep it destroys the competition.  Titan Z may be dead on arrival:

http://www.tomshardware.com/reviews/radeon-r9-295x2-review-benchmark-performance,3799-9.html


Sure it is an epic gaming card, but remember what the Titan Z was for? As stated, it is for heavy number crunching in simulation and rendering programs, not for gaming. The 790 will be for gaming ^^. However, will it beat 11 teraflops at a mere 65°C and ~465 W? Doubtful.

Yes that is what it is for.  It is called the GTX Titan Z.

As for the 790: yeah, I am guessing Nvidia is gonna gun for $1000 and try to make it almost as strong as 780 SLI. It will be very hard to, though, and honestly 3GB isn't enough...



Captain_Tom said:
Soleron said:

http://techreport.com/review/26279/amd-radeon-r9-295-x2-graphics-card-reviewed/13

It's worse than a single GTX 780 Ti on a better measure of perceptible image quality (99th percentile frame time, rather than gross framerate).

This is meant for 4K, and at 4K it absolutely shows a noticeable difference. It would be silly to buy even a 780 for 1080p.

 

P.S.  That is also 1 game, and a crappy one at that.  You are ignoring benchmarks where this thing stomps a 690 by over 50%:

http://www.techpowerup.com/reviews/AMD/R9_295_X2/17.html

Or doubles the 690's minimum framerate:

http://www.tomshardware.com/reviews/radeon-r9-295x2-review-benchmark-performance,3799-11.html

1. The review is done at 4K resolution.

2. It's not one game; that's an average.

3. I don't believe framerates or minimum framerates are the best measure of what you'd want from a card. I prefer when stuttering is taken into account, such as 99th percentile frame time (but it doesn't have to be that). So, yes, I'm ignoring benches showing huge raw frame leads when those don't translate into a better experience for the buyer.



Soleron said:
Captain_Tom said:
Soleron said:

http://techreport.com/review/26279/amd-radeon-r9-295-x2-graphics-card-reviewed/13

It's worse than a single GTX 780 Ti on a better measure of perceptible image quality (99th percentile frame time, rather than gross framerate).

This is meant for 4K, and at 4K it absolutely shows a noticeable difference. It would be silly to buy even a 780 for 1080p.

 

P.S.  That is also 1 game, and a crappy one at that.  You are ignoring benchmarks where this thing stomps a 690 by over 50%:

http://www.techpowerup.com/reviews/AMD/R9_295_X2/17.html

Or doubles the 690's minimum framerate:

http://www.tomshardware.com/reviews/radeon-r9-295x2-review-benchmark-performance,3799-11.html

1. The review is done at 4K resolution.

2. It's not one game; that's an average.

3. I don't believe framerates or minimum framerates are the best measure of what you'd want from a card. I prefer when stuttering is taken into account, such as 99th percentile frame time (but it doesn't have to be that). So, yes, I'm ignoring benches showing huge raw frame leads when those don't translate into a better experience for the buyer.

The 295X2 doesn't have frame variance problems. The 290 series was built from the ground up to avoid them, and they were fixed in the 7000 series months ago. Frankly, the link you provided is the only one I have seen complain about it, and I am inclined to say something in their testing caused problems, given that all the other reviewers show no frame variance problems...

EDIT: Here the 295X2 has lower frame times than a single 780 Ti. So the 780 Ti is now far worse based on your logic, right?



Fuck that price. I will just wait 2 years and buy it for $250-$300.



 

LiquorandGunFun said:
Fuck that price. I will just wait 2 years and buy it for $250-$300.

It'll probably still be $300+ in 2 years. You'd probably have to wait 3-4 years for it to go below $300.



LiquorandGunFun said:
Fuck that price. I will just wait 2 years and buy it for $250-$300.


This GTX 690 is 2 years old and it still costs $790:

http://www.ebay.com/itm/ASUS-GeForce-GTX690-4096MB-GDDR5-512bit-Graphics-Video-Card-GTX690-4GD5-/371037136829?pt=PCC_Video_TV_Cards&hash=item56638967bd

It's refurbished instead of new, and it started at $1000 instead of $1500. Seriously, some people who don't know PC hardware just need to be quiet.

P.S.  "F that price" all you want, but this $1500 card is 6 times stronger than a $400 PS4.  No, the rest of the system will not cost another $1500...



Soleron said:

Basically, multi-GPU setups can produce more frames, but they'll come out like this: 10ms, 10ms, 10ms, 50ms, 10ms, 10ms ...

And that gives a worse experience to the viewer than the following: 12ms, 12ms, 12ms, 12ms, 12ms, 12ms. Even though a setup like the first can post higher raw FPS, the second will look smoother to a human.

So the 99th percentile measure shows how long you can expect 99% of frames to take to render, rather than the average. For smooth frame delivery, that works out to the same ranking as raw FPS. But if things are more like the first situation I described, it's a better measure of the experience.

Video of actual cards: http://techreport.com/review/24051/geforce-versus-radeon-captured-on-high-speed-video

Full explanation (long): http://techreport.com/review/21516/inside-the-second-a-new-look-at-game-benchmarking

More testing on it: http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-Dissected-Full-Details-Capture-based-Graphics-Performance-Tes-12
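
For anyone who wants to play with the idea, here is a rough Python sketch (my own illustration, not from the reviews) of how the two measures are computed from a list of frame times:

```python
# Toy frame-time sequences echoing the post above: a bursty multi-GPU-style
# pattern with a periodic hitch, and a steady single-GPU-style pattern.
stuttery = [10, 10, 10, 50, 10, 10] * 100   # milliseconds per frame
smooth   = [12, 12, 12, 12, 12, 12] * 100

def average_fps(frame_times_ms):
    # Raw FPS: total frames divided by total seconds of rendering time.
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def percentile_99_ms(frame_times_ms):
    # Frame time that 99% of frames come in at or under (nearest-rank method).
    ordered = sorted(frame_times_ms)
    rank = max(0, int(round(0.99 * len(ordered))) - 1)
    return ordered[rank]

for name, times in [("stuttery", stuttery), ("smooth", smooth)]:
    print(f"{name}: {average_fps(times):.0f} FPS average, "
          f"99th percentile frame time {percentile_99_ms(times)} ms")
```

One hitch in six frames still leaves a respectable-looking average, but it completely dominates the 99th percentile figure, which is why the metric catches stutter that raw FPS hides.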


Wow, that was a complete answer, and the links and explanation were great. I never thought about latency problems, but it looks like they are actually an issue. I'm not surprised that Nvidia GPUs perform better on that than AMD ones, since the Nvidia driver is much more mature (they actually share more than 80% of the code between the Windows, Mac and Linux versions, minimizing maintenance effort and keeping performance closer across each OS).



sethnintendo said:

AMD is making a killing from the scrypt miners and gamers. Miners are buying AMD cards while shunning Nvidia cards. If I had some spare money I would probably invest in AMD stock.


Miners are moving away from GPUs. ASIC makers now sell devices with the entire Bitcoin mining algorithm implemented directly in hardware, delivering the performance of several GPUs while consuming around 100 W. Those miners are beasts for that specific application.

Of course, there isn't dedicated hardware for every digital currency, but the bigger ones are already being dominated by these devices.
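
For context on what those chips actually hardwire, here is a tiny illustrative Python sketch (dummy header values and a toy difficulty target, not a real miner) of the double-SHA-256 search loop that Bitcoin ASICs implement in silicon:

```python
import hashlib
import struct

def double_sha256(data: bytes) -> bytes:
    # Bitcoin's proof-of-work hash is SHA-256 applied twice.
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

# 80-byte block header = 76 bytes of version/prev-hash/merkle-root/time/bits
# (all dummy zeros here) plus a 4-byte nonce that the miner sweeps through.
header_prefix = b"\x00" * 76
toy_target = 2 ** 240   # far easier than the real network target

for nonce in range(2 ** 20):
    header = header_prefix + struct.pack("<I", nonce)
    # The hash is compared against the target as a little-endian integer.
    if int.from_bytes(double_sha256(header), "little") < toy_target:
        print("toy share found at nonce", nonce)
        break
```

An ASIC does nothing but this loop, massively in parallel, which is why it crushes a general-purpose GPU on hashes per watt.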



torok said:
sethnintendo said:

AMD is making a killing from the scrypt miners and gamers. Miners are buying AMD cards while shunning Nvidia cards. If I had some spare money I would probably invest in AMD stock.


Miners are moving away from GPUs. ASIC makers now sell devices with the entire Bitcoin mining algorithm implemented directly in hardware, delivering the performance of several GPUs while consuming around 100 W. Those miners are beasts for that specific application.

Of course, there isn't dedicated hardware for every digital currency, but the bigger ones are already being dominated by these devices.

The Litecoin ASICs aren't out yet, so people are still using GPUs for those. Plus, not everyone can afford them (KnC Miner ASICs go for around $10k). ASICs will eventually take over, but right now people are still buying GPUs to mine with, and there are so many different altcoins that GPU mining still exists. I'm sure most of these altcoins will die off, but there should be a few that make it.



sethnintendo said:

The Litecoin ASICs aren't out yet, so people are still using GPUs for those. Plus, not everyone can afford them (KnC Miner ASICs go for around $10k). ASICs will eventually take over, but right now people are still buying GPUs to mine with, and there are so many different altcoins that GPU mining still exists. I'm sure most of these altcoins will die off, but there should be a few that make it.


Yes, Litecoin uses a different algorithm (scrypt, which Dogecoin also uses); Bitcoin mining is pretty much all ASICs now. A top AMD GPU will give you around 1 GH/s, and ASICs aren't that expensive. USB miners are pretty cheap; I remember seeing a rig made with a Raspberry Pi and several USB miners giving 1.3 GH/s while consuming 4 or 5 watts. The high-end ones deliver around 1 TH/s, so per hash they aren't that expensive.

But the situation for altcoins is like you said: no dedicated hardware. A lot of people are just mining altcoins with GPUs and then buying Bitcoins with the money.
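
To put the numbers from this exchange side by side, a quick sketch (the GPU and high-end ASIC wattages are my assumptions for scale; the hashrates echo the posts above):

```python
# Rough hashes-per-watt comparison for SHA-256 (Bitcoin-style) mining.
devices = {
    "top AMD GPU":        (1.0,    290.0),   # ~1 GH/s, assumed ~290 W
    "Raspberry Pi + USB": (1.3,    5.0),     # ~1.3 GH/s at 4-5 W
    "high-end ASIC":      (1000.0, 1000.0),  # ~1 TH/s, assumed ~1 kW
}

for name, (ghs, watts) in devices.items():
    print(f"{name}: {ghs / watts:.3f} GH/s per watt")
```

Even with generous assumptions for the GPU, the gap is two to three orders of magnitude, which is why GPU Bitcoin mining died off while scrypt altcoins kept GPUs busy.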