
Should I get the RX Vega 64 or GTX 1080?

The RX Vega 56 may be worth looking into; it has been said to outperform the GTX 1070 (though we should wait for proper benchmarks).
It should hopefully beat the 1070 without drawing an obscene amount of power.
Or the RX Vega Nano.



Pemalite said:

We are exceptions to the rule. We are enthusiasts/high-end users.

A Geforce 1070 is ample enough for 1440P.
A Geforce 1060 is enough for 1080P.
That's at 60fps, which is what most people game at. The benchmarks/evidence I provided are a testament to that very fact.

Now where pricing comes into it, things get a little more difficult. Mining has made the 1070 almost as expensive as the 1080, making the 1080 the better buy anyway.

With that said, you always buy the best you can afford for your needs, but people should not be under the illusion that they need a Geforce 1080 Ti for 1440P, because that is silly.

Again, I am aware of the test results, but unfortunately the results don't always pan out 100% like they do on a console; such is the nature of PC gaming. What works in your rig for 1440p/1080p will not always work for someone else. My 980 should have been more than enough for GTA V, as should my i5-4670K, yet they were not able to give me a decent, constant 60fps, and that wasn't with everything maxed at 1080p either. And I know I'm not the only one of the 7 billion people on this rock who would have experienced that (because, you know, not everyone in PC gaming has the likes of a 980 either).

I know about the mining situation, but that is only a temporary issue, not one that will be around for years at a time, so the 1070, as you put it, would still be the de facto buy. Just do what the average person would do and wait; it won't kill you.

So it's not really a need then, it's just an illusion that I bought the 1080 Ti for my monitor and my needs? Because that's honestly how that sentence reads. I've watched benchmarks of multiple takes on the i7-6700K paired with a 1080 at 1440p max settings across various games, and I wasn't too happy with how far the frames dipped (because a 1080 is more than enough, right? So whose fault are the stupidly low performance dips, the devs'?), which is why I went for the 1080 Ti: I wanted to avoid those dips.



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

1080. I have not had one game on max settings in 4K that pushed this card, unless it's Star Citizen.



Just write a love poem, that's all you need to get a 1080



VGPolyglot said:
Just write a love poem, that's all you need to get a 1080

1080Ti



Chazore said:

Again, I am aware of the test results, but unfortunately the results don't always pan out 100% like they do on a console; such is the nature of PC gaming. What works in your rig for 1440p/1080p will not always work for someone else. My 980 should have been more than enough for GTA V, as should my i5-4670K, yet they were not able to give me a decent, constant 60fps, and that wasn't with everything maxed at 1080p either. And I know I'm not the only one of the 7 billion people on this rock who would have experienced that (because, you know, not everyone in PC gaming has the likes of a 980 either).

I know about the mining situation, but that is only a temporary issue, not one that will be around for years at a time, so the 1070, as you put it, would still be the de facto buy. Just do what the average person would do and wait; it won't kill you.

So it's not really a need then, it's just an illusion that I bought the 1080 Ti for my monitor and my needs? Because that's honestly how that sentence reads. I've watched benchmarks of multiple takes on the i7-6700K paired with a 1080 at 1440p max settings across various games, and I wasn't too happy with how far the frames dipped (because a 1080 is more than enough, right? So whose fault are the stupidly low performance dips, the devs'?), which is why I went for the 1080 Ti: I wanted to avoid those dips.

You are basing everything on something that is anecdotal. I cannot possibly adhere to such things.



--::{PC Gaming Master Race}::--

Arkaign said:
Take this with a grain of salt, but the leaked FireStrike benches seem the MOST reliable of the leaks so far, thanks both to the source (an AMD staffer account on 3DMark's system) and to the fact that AMD hasn't followed up with any denial/disavowing actions.

http://www.techradar.com/news/amd-radeon-rx-vega-could-match-gtx-1080s-speed-but-maybe-not-its-price

So, close to 1080 speed, but that's... not a great thing. The pricing appears to be pretty terrible for the Vega 64, and we know that the 10xx cards for the most part overclock incredibly well, including the 1080s, many of which can get close to 1080 Ti performance. The 10xx cards have also been remarkably efficient.

And as for power consumption, if the FE and the new cards really do use the same silicon and similar clock speeds:

https://ibb.co/k5uzRa

Woof. The air-cooled version of the FE uses more power than a stock 1080 Ti or Titan XP, and nearly as much as a fully overclocked Titan XP (overvolted).

DO NOT take this as final info, however. Even though the Vega 64 gaming model uses the same GPU as the Vega FE, who knows, the final product might just have some tweaks that surprise us, some amazing driver magic, whatever. Until the final cards are in the hands of the big-hitter tech sites, it remains a mystery, albeit with a lot of dark clouds circling. It reminds me so much of when Bulldozer was about to come out, or even that hilariously terrible Nvidia 'FX' generation, which culminated in the awful FX 5800. Man, ATI stomped that gen with the 9700 Pro and 9800 stuff. LOL. Nvidia's later FX 5900 was improved, but the damage was done.

I'm right there with ya, and I have an R9 Nano. Not that I was seriously looking to upgrade, though I was feeling the temptation before I found out about the rumored power consumption numbers. If those and the performance numbers are true, then WTF, AMD? Did they focus so hard on HBM2 that they forgot to actually upgrade over the Fury X?!

):<



The BuShA owns all!

Azzanation said:
1080. I have not had one game on max settings in 4K that pushed this card, unless it's Star Citizen.

The 1080 Ti doesn't even run every game on max settings at 4K/60, so I doubt its smaller brother does.




Twitter @CyberMalistix

malistix1985 said:
Azzanation said:
1080. I have not had one game on max settings in 4K that pushed this card, unless it's Star Citizen.

The 1080 Ti doesn't even run every game on max settings at 4K/60, so I doubt its smaller brother does.

I never said 4K at 60 frames; I was talking about what pushes this card to its limits. Playing The Witcher 3 on max settings at 4K at 50 to 55 frames is not pushing this card, nor do I think many would care.

The 1080 is a beast of a card and is well supported with Nvidia driver updates.



Azzanation said:
malistix1985 said:

The 1080 Ti doesn't even run every game on max settings at 4K/60, so I doubt its smaller brother does.

I never said 4K at 60 frames; I was talking about what pushes this card to its limits. Playing The Witcher 3 on max settings at 4K at 50 to 55 frames is not pushing this card, nor do I think many would care.

The 1080 is a beast of a card and is well supported with Nvidia driver updates.

Nier: Automata, Watch Dogs 2, and some other games don't even run properly on my system. Sure, the 1080 can do some 4K gaming on decent settings, but in The Witcher 3 you probably also have to turn down some settings to get a really solid framerate, same for Gears of War and the like. Even I made some compromises in The Witcher, getting around 70fps, but you always need to count the lowest frames, not the average: when you aim for 50-55 but have dips under 50, it's going to be noticeable, unless you have a G-Sync display, which would help a lot.

Either way, for this topic: definitely the 1080, and definitely at least 1440p resolution, otherwise it's kind of a waste, unless you just want to go for high frame rates.
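
To put some numbers on the lowest-frames-versus-average point above, here is a minimal Python sketch with made-up frame times (purely hypothetical values, not taken from any real benchmark run): an average that reads above 50fps can still hide dips down to around 30fps.

```python
# Hypothetical frame times in milliseconds -- illustrative only, not measured data.
# 900 frames at ~55fps (18 ms each) plus 100 stutter frames at ~30fps (33 ms each).
frame_times_ms = [18.0] * 900 + [33.0] * 100

def average_fps(times_ms):
    """Average FPS over the run: total frames divided by total time."""
    return len(times_ms) / (sum(times_ms) / 1000.0)

def one_percent_low_fps(times_ms):
    """1% low: the FPS implied by the average of the slowest 1% of frames."""
    slowest = sorted(times_ms, reverse=True)
    count = max(1, len(slowest) // 100)
    return 1000.0 / (sum(slowest[:count]) / count)

print(f"average fps: {average_fps(frame_times_ms):.1f}")      # ~51.3
print(f"1% low fps:  {one_percent_low_fps(frame_times_ms):.1f}")  # ~30.3
```

On a 60Hz screen those 33 ms frames show up as visible stutter even though the average still looks fine, which is why reviewers quote 1% and 0.1% lows alongside averages, and why adaptive sync (G-Sync/FreeSync) softens the blow.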




Twitter @CyberMalistix