
Forums - Gaming Discussion - Should I get the RX Vega 64 or GTX 1080?

Pemalite said:

1440P is not demanding enough to warrant the need for a Geforce 1080Ti.

A 1070 is still overkill for 1080P.

GTA V and Witcher 3 on ultra seem to be heavy hitters to me. I'm aware we can turn down settings but there are folk like me who don't want to, thus the fps hit is significant at 1440p.



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"


The current prices of GPUs suck, so I can only give advice based on the old prices.

1080p -> GTX 1060 / RX 580 for around 250€
1440p -> GTX 1070 for 400€

Then after 2-3 years you sell your old GPU for 100-120€ and get a current 250€ one.
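That rolling-upgrade idea is easy to put numbers on. A minimal sketch, using only the rough illustrative figures from this post (250€ purchase, ~110€ resale after ~2.5 years), not real market data:

```python
# Rough cost-per-year of a rolling GPU upgrade: buy a mid-range card,
# sell it after a few years, put the proceeds toward the next one.
# All figures are illustrative estimates, not actual prices.

def yearly_cost(buy_price, resale_price, years_kept):
    """Net cost per year of owning the card before reselling it."""
    return (buy_price - resale_price) / years_kept

# e.g. a 250€ card kept 2.5 years and resold for 110€
cost = yearly_cost(250, 110, 2.5)
print(f"~{cost:.0f}€ per year")  # ~56€ per year
```

So the mid-range treadmill works out to roughly 50-60€ a year of net GPU spend, which is the whole appeal over buying one expensive card and keeping it.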





Chazore said:
Pemalite said:

1440P is not demanding enough to warrant the need for a Geforce 1080Ti.

A 1070 is still overkill for 1080P.

GTA V and Witcher 3 on ultra seem to be heavy hitters to me. I'm aware we can turn down settings but there are folk like me who don't want to, thus the fps hit is significant at 1440p.

http://www.anandtech.com/bench/product/1771?vs=1731

GTA 5 Very High settings.
1440P Geforce 1070: 63.1fps.
1080P Geforce 1060: 68.2fps.

Witcher 3. Ultra settings.
1440P Geforce 1070: 62fps
1080P Geforce 1060: 58.5fps
 
Just disable HairWorks. Or throw a little overclock at the problem. (Pascal is a clockrate monster anyway.)
Witcher 3 is also not a twitch-shooter. Even a stable 30fps can be acceptable.

1080TI for 1440P is a waste, 1070 for 1080P is a waste.

I'm running 1440P on an RX 580 which is Geforce 1060 class at the moment. With plans for Vega which is Geforce 1080 class. I'm under no illusions that Vega is a waste for this display, same goes for the Geforce 1080 and 1080Ti.
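One crude way to sanity-check those "1070 at 1440P ≈ 1060 at 1080P" pairings is raw pixel count — performance doesn't scale perfectly linearly with resolution, so treat this as a back-of-envelope check, not a benchmark:

```python
# Pixel-count ratio between 1440p and 1080p: a rough proxy for how much
# extra GPU throughput a resolution bump demands.

def pixels(width, height):
    """Total pixels per frame at a given resolution."""
    return width * height

ratio = pixels(2560, 1440) / pixels(1920, 1080)
print(f"1440p pushes {ratio:.2f}x the pixels of 1080p")  # 1.78x
```

A ~1.78x pixel load lines up loosely with the benchmark pairings above: a card roughly one tier up absorbs the resolution bump and lands at similar framerates.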



--::{PC Gaming Master Race}::--

If people are bringing the 1070 to the table, Vega 56 is also a choice; it's rumored to be faster at a similar price. Also, TDP is not the same as power consumption: some 1080s can use over 300W in some workloads, so don't count on that 180W as the max of the GPU, it's more like an average. That being said, you could use one or the other as long as you don't OC; some undervolting could even make it more efficient and cooler for an ITX case. I would wait for the Vega reviews coming out in a few days to get all the info on power draw and performance. From that you will get a better picture.



Proudest Platinums - BF: Bad Company, Killzone 2 , Battlefield 3 and GTA4

Pemalite said:
Chazore said:

GTA V and Witcher 3 on ultra seem to be heavy hitters to me. I'm aware we can turn down settings but there are folk like me who don't want to, thus the fps hit is significant at 1440p.

http://www.anandtech.com/bench/product/1771?vs=1731

GTA 5 Very High settings.
1440P Geforce 1070: 63.1fps.
1080P Geforce 1060: 68.2fps.

Witcher 3. Ultra settings.
1440P Geforce 1070: 62fps
1080P Geforce 1060: 58.5fps
 
Just disable HairWorks. Or throw a little overclock at the problem. (Pascal is a clockrate monster anyway.)
Witcher 3 is also not a twitch-shooter. Even a stable 30fps can be acceptable.

1080TI for 1440P is a waste, 1070 for 1080P is a waste.

I'm running 1440P on an RX 580 which is Geforce 1060 class at the moment. With plans for Vega which is Geforce 1080 class. I'm under no illusions that Vega is a waste for this display, same goes for the Geforce 1080 and 1080Ti.

That's the thing though, for someone like me who has a 144hz monitor and likes having the bells and whistles, I'd like to have the extra legroom to do with what I want.

I know you think it's perfect and more than enough, but I'm not you; we are both different in how we perceive and approach game settings and what GPU we think we need.

The 1080ti hardly dominates 4K gaming at 60fps across the board, so by that logic the 1080ti may as well be a waste entirely.

I'm not a fan of playing games at 30fps these days, so with a 144hz display, no, it's not really acceptable to me, but I'm also not the only person in the world who thinks that either.




JRPGfan said:
AMD cards always.... ALWAYS age better.
I'd say get the RX Vega 64, if you can find it for the same price as a 1080.
It's supposed to be slightly faster too, right?

That's not really true. In particular, I've been really disappointed with AMD's long-term driver support. There's almost a myth of AMD performance aging better, born of how bad their 7xxx drivers were at release and how vastly they improved once brought up to par.

I constantly make super-budget gaming rigs (sub-$100 complete with monitors, some sub-$50, as well as ones I give away for free) out of old parts for local kids. Things like the GTX 470 and 570 work just fine and have solid driver support, whereas with the HD 3870 and 4850 I had anywhere from a little to a lot of trouble getting them to play nicely with modern games in Win10.

TO BE PERFECTLY FAIR, this is really only the case when dealing with VERY old cards, which is a fair amount of what I handle in my ultra-budget-conscious recycling and charity stuff. Anything newer than the HD 4xxx and GeForce 9xxx series is generally just fine with the right settings on a modern OS and modern games.

Remember the showdown between the overpriced GTX 480 and the 5870?

https://www.youtube.com/watch?v=g41owucTjgs

GTX480 wins 5 of 7 modern games, sometimes by a lot (1080P)

Game / 5870 / 480
Doom / 24 / 34
FO4 / 52 / 54
GTAV / 64 / 105
W3 / 37 / 42
RE7 / 77 / 79
Planet Coaster / 47 / 42
Battlefront / 32 / 42

How about the GTX 680 vs the 7970? This is a competition between the vanilla 680 and the GHz Edition 7970, which was, and always will be, better than the stock 680; against the OG 7970 the 680 looks better. Even so, the 680 still wins in some modern games:

https://www.youtube.com/watch?v=BfSi-Z8r12M

Game / 7970GE / 680 (Min-Avg)
F1 2016 / 31-46 / 33-43
ME:C / 63-73 / 56-64
Overwatch / 79-100 / 82-111
BF1 / 45-50 / 46-53
Gears4 / 49-61 / 50-64
Titanfall2 / 59-66 / 58-62
Sleeping Dogs / 70-101 / 65-105
TR2013 / 51-67 / 46-65

So, an even split on modern games between the 7970GE and the OG 680, with four wins favoring each. That's a far cry from what you'd expect given the constant narrative about old AMD cards aging so well and Nvidia's doing the opposite. I honestly don't know where the idea comes from; maybe the R9 290's initially awful legacy DX performance?

As for the Vega 64 vs 1080, jeez. Vega honestly looks like a dumpster fire to me, unless they really shock us. Vega FE doesn't overclock well, has shocking power consumption, and AFAIK is basically the same product as the gaming Vega 64. Its performance actually fell between the 1070 and 1080, while consuming vastly more power and creating a ton more heat.

AMD recently hit a home run with Ryzen CPUs, which is awesome, but it's like things have flipped. I had no issues recommending the HD 7xxx over Nvidia on price/performance, ditto the R9 290, while it was REALLY hard to recommend an FX CPU. Now I can comfortably recommend a 10xx card plus a Ryzen or i5/i7. If miners hadn't driven GPU prices insane, I'd say the 470/480/570/580 are pretty good, but the 1050 Ti is unbeatable for value gaming, and 6GB 1060s are way cheaper on average than even a 570 thanks to miners, so that sucks.

TLDR: Wait on Vega for whatever the final reviews show. The FE's performance raises real concerns that would be pretty surprising to see reversed in the 64. But anything's possible.



Take this with a grain of salt, but the leaked FireStrike benches seem the MOST reliable of the leaks so far, thanks both to the source (an AMD staffer account on 3DMark's system) and to the fact that AMD hasn't followed up with any denial or disavowal.

http://www.techradar.com/news/amd-radeon-rx-vega-could-match-gtx-1080s-speed-but-maybe-not-its-price

So, close to 1080 speed, but that's not a great thing. The pricing appears to be pretty terrible for the Vega 64, and we know the 10xx cards mostly overclock incredibly well, including the 1080s, many of which can get close to 1080Ti territory. The 10xx series has also been remarkably efficient.

And as for power consumption, if FE and the new cards really do use the same silicon and similar clock speeds :

https://ibb.co/k5uzRa

Woof. The air-cooled FE uses more power than a stock 1080Ti or Titan XP, and nearly as much as a fully overclocked and overvolted Titan XP.

DO NOT take this as final info, however. Even though the Vega 64 gaming model uses the same GPU as the Vega FE, who knows, the final product might have some tweaks that surprise us, some amazing driver magic, whatever. Until the final cards are in the hands of the big tech sites, it remains a mystery, albeit one with a lot of dark clouds circling. It reminds me of when Bulldozer was about to come out, or even Nvidia's terrible 'FX' generation, which culminated in the hilariously awful FX 5800. Man, ATI stomped that generation with the 9700 Pro and 9800 cards. LOL. Nvidia's later FX 5900 was improved, but the damage was done.



Chazore said:

That's the thing though, for someone like me who has a 144hz monitor and likes having the bells and whistles, I'd like to have the extra legroom to do with what I want.

I know you think it's perfect and more than enough, but I'm not you; we are both different in how we perceive and approach game settings and what GPU we think we need.

The 1080ti hardly dominates 4K gaming at 60fps across the board, so by that logic the 1080ti may as well be a waste entirely.

I'm not a fan of playing games at 30fps these days, so with a 144hz display, no, it's not really acceptable to me, but I'm also not the only person in the world who thinks that either.

144hz is an entirely different ball game and wasn't what this discussion originally entailed.
You are correct in that instance, but it doesn't mean I am incorrect either.

And don't get me wrong. I do buy high-end hardware that exceeds my perceived needs as I hate a compromised experience, be it framerate or image quality. - The RX 580 being the exception as I was desperate. I can't wait to flog it off.




Pemalite said:

144hz is an entirely different ball game and wasn't what this discussion originally entailed.
You are correct in that instance, but it doesn't mean I am incorrect either.

And don't get me wrong. I do buy high-end hardware that exceeds my perceived needs as I hate a compromised experience, be it framerate or image quality. - The RX 580 being the exception as I was desperate. I can't wait to flog it off.

I'm aware, which was why I mentioned my 144hz monitor in particular in relation to going for a 1080ti.

I prefer having the headroom to do what I want when I want, rather than bog-standard hardware that just tips the line to get the job done, because just tipping the line doesn't always guarantee the best results, especially 5-6 years down the line with how games and hardware requirements change (like how we used to not need 8GB of RAM, yet now we're seeing 16GB requirements cropping up; surely if those were wrong for good reason, developers would stop requesting 16GB?).

If you think the same as me then you'll understand why I suggested the 1080ti. The 1070 for 1440p won't always get the job done and would require toning down certain settings, at which point you'd be better off with a 1070/1080 at 1080p to keep them all on max. That's why I chose the 1080ti to go with my 1440p 144hz monitor.

I know some people claim you can do with less, but we live in a world where you can take their word, get screwed over for taking their advice, and you can't exactly offload the bill onto them, now can you? That's why I'm not a fan of "you only need less" advice, because the last thing I want when building something is to go as cheap as possible and then find myself gimped for doing so. In the case of going high to very high end, a shortfall becomes a developer issue or one with Nvidia/AMD/Intel, but going cheap as chips is all on you, because you went for less, or for what you thought could handle the same exact scenario 100% of the time (which is why I take those bench results with a tiny pinch of salt, as it's not the same across the board, definitely not with GTA V).




Chazore said:

If you think the same as me then you'll understand why I suggested the 1080ti. The 1070 for 1440p won't always get the job done and would require toning down certain settings, at which point you'd be better off with a 1070/1080 at 1080p to keep them all on max. That's why I chose the 1080ti to go with my 1440p 144hz monitor.

I know some people claim you can do with less, but we live in a world where you can take their word, get screwed over for taking their advice, and you can't exactly offload the bill onto them, now can you? That's why I'm not a fan of "you only need less" advice, because the last thing I want when building something is to go as cheap as possible and then find myself gimped for doing so. In the case of going high to very high end, a shortfall becomes a developer issue or one with Nvidia/AMD/Intel, but going cheap as chips is all on you, because you went for less, or for what you thought could handle the same exact scenario 100% of the time (which is why I take those bench results with a tiny pinch of salt, as it's not the same across the board, definitely not with GTA V).

We are exceptions to the rule. We are enthusiasts/high-end users.

A Geforce 1070 is ample for 1440P.
A Geforce 1060 is enough for 1080P.
At 60fps, which is what most people game at. The benchmarks/evidence I provided are a testament to that very fact.

Now, where pricing comes into it, things get a little more difficult. Mining has made the 1070 almost as expensive as the 1080, making the 1080 the better buy anyway.

With that said, you should always buy the best you can afford for your needs, but people should not be under the illusion that they need a Geforce 1080Ti for 1440P.


