
Red Dead Redemption is Native 4K on Xbox One X, looks stunning!

quickrick said:
Ganoncrotch said:
Should the best reason to own a new premium system be a game from 8 years ago?

The game is loved by many, and people have been begging for a remaster because RDR has such a gorgeous world that's being held back by low resolution and jaggies, and the game can't be played anywhere else in this form.

It is indeed a gorgeous world, and 4K just makes it look better (though all the flaws, like low-res ground textures, are even more apparent in 4K).

The game itself is, story-wise, a bit of an incoherent and disjointed mess unfortunately, especially in the second part.



Ganoncrotch said:
quickrick said:

Yeah, if you have a good PC there's no reason to buy an X.

Ryzen 7 1700, 16GB 3000MHz DDR4 Vengeance, teamed up with an MSI OC Twin Frozr 780 Ti @ 1085MHz and 6.25TB of storage (250GB SSD, 2x3TB 7200RPM high-performance drives).

The 780 Ti is a few years old, but it's on par with or better than a 1060, and this comparison uses the stock clock of the 780 Ti, which would be 875MHz.


http://gpu.userbenchmark.com/Compare/Nvidia-GTX-780-Ti-vs-Nvidia-GTX-1060-3GB/2165vs3646

Easily capable of pretty much any graphics task at 1080p at least.

Sorry for completely randomly jumping in, but that's the 1060 3GB version; it's slower than the regular 1060, which easily beats the 780 Ti.

https://www.anandtech.com/bench/product/1771?vs=1717



HoloDust said:
Ganoncrotch said:

Ryzen 7 1700, 16GB 3000MHz DDR4 Vengeance, teamed up with an MSI OC Twin Frozr 780 Ti @ 1085MHz and 6.25TB of storage (250GB SSD, 2x3TB 7200RPM high-performance drives).

The 780 Ti is a few years old, but it's on par with or better than a 1060, and this comparison uses the stock clock of the 780 Ti, which would be 875MHz.


http://gpu.userbenchmark.com/Compare/Nvidia-GTX-780-Ti-vs-Nvidia-GTX-1060-3GB/2165vs3646

Easily capable of pretty much any graphics task at 1080p at least.

Sorry for completely randomly jumping in, but that's the 1060 3GB version; it's slower than the regular 1060, which easily beats the 780 Ti.

https://www.anandtech.com/bench/product/1771?vs=1717

http://gpu.userbenchmark.com/Compare/Nvidia-GTX-780-Ti-vs-Nvidia-GTX-1060-6GB/2165vs3639

Huge difference alright.

 

Edit - also, it isn't slower; the 3GB version has 10% fewer cores than the 6GB version, and they have the same clock speeds (depending on the card versions, of course).

There are other tech differences between the cards which give the 1060 an advantage over the two-generation-old 780 Ti, and the 780 Ti's power consumption is huge to achieve what the far more efficient 1060 can do. That said, I'm not thinking about the power bill when gaming, so I couldn't give a rat's.
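For a rough sense of what "far more efficient" means here, a back-of-envelope sketch using Nvidia's official board TDPs (250W for the 780 Ti, 120W for the 1060); the average FPS figures are made-up placeholders standing in for "roughly comparable 1080p performance":

```python
# Back-of-envelope FPS-per-watt comparison.
# TDPs are Nvidia's official board power figures; the average FPS
# numbers are hypothetical placeholders for "roughly comparable
# 1080p performance".
cards = {
    "GTX 780 Ti": {"tdp_watts": 250, "avg_fps_1080p": 60},    # assumed FPS
    "GTX 1060 6GB": {"tdp_watts": 120, "avg_fps_1080p": 65},  # assumed FPS
}

for name, c in cards.items():
    print(f"{name}: {c['avg_fps_1080p'] / c['tdp_watts']:.2f} FPS per watt")

# GTX 780 Ti:   0.24 FPS per watt
# GTX 1060 6GB: 0.54 FPS per watt -> roughly twice the efficiency
```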




Ganoncrotch said:
HoloDust said:

Sorry for completely randomly jumping in, but that's 1060 3GB version, it's slower than regular 1060, which easily beats 780TI

https://www.anandtech.com/bench/product/1771?vs=1717

http://gpu.userbenchmark.com/Compare/Nvidia-GTX-780-Ti-vs-Nvidia-GTX-1060-6GB/2165vs3639

Huge difference alright.

 

Edit - also, it isn't slower; the 3GB version has 10% fewer cores than the 6GB version, and they have the same clock speeds (depending on the card versions, of course).

There are other tech differences between the cards which give the 1060 an advantage over the two-generation-old 780 Ti, and the 780 Ti's power consumption is huge to achieve what the far more efficient 1060 can do. That said, I'm not thinking about the power bill when gaming, so I couldn't give a rat's.

Uhm... which makes it slower in games... who said anything about clocks?

Anyway, the 780 Ti is still quite a valid 1080p card; no point in upgrading it unless you go for higher res. I put a 1060 3GB in my other rig last year to replace an ancient HD 7770... now that was a worthy upgrade.



Ninjablade approves this thread.





HoloDust said:
Ganoncrotch said:

http://gpu.userbenchmark.com/Compare/Nvidia-GTX-780-Ti-vs-Nvidia-GTX-1060-6GB/2165vs3639

Huge difference alright.

 

Edit - also, it isn't slower; the 3GB version has 10% fewer cores than the 6GB version, and they have the same clock speeds (depending on the card versions, of course).

There are other tech differences between the cards which give the 1060 an advantage over the two-generation-old 780 Ti, and the 780 Ti's power consumption is huge to achieve what the far more efficient 1060 can do. That said, I'm not thinking about the power bill when gaming, so I couldn't give a rat's.

Uhm... which makes it slower in games... who said anything about clocks?

Anyway, the 780 Ti is still quite a valid 1080p card; no point in upgrading it unless you go for higher res. I put a 1060 3GB in my other rig last year to replace an ancient HD 7770... now that was a worthy upgrade.

You did

"it's slower than regular 1060,"

When talking about clock speed you would refer to that as speed; or did you mean slower as in the card has less performance? You could get a 4GHz Pentium 4 processor that would be "faster" than my Ryzen 7, since the Ryzen is clocked at just 3.7GHz; the Pentium 4 runs more cycles per second than the Ryzen, but the obvious difference is that most of those cycles generate nothing but heat lol.

Perhaps the confusion is that you think the terms "clock speed" and "speed" are unrelated? But yeah, both versions of the 1060 run at the same speed; there are just 10% fewer cores in the 3GB version, which obviously ends up creating about 10% less performance (in a perfectly optimized scenario for the 6GB version).
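To put numbers on the cores-versus-clock point, here's a minimal sketch of the theoretical FP32 throughput math, using the published GTX 1060 specs (1280 CUDA cores in the 6GB card, 1152 in the 3GB, both at the same ~1708MHz reference boost clock):

```python
# Theoretical FP32 throughput = cores x clock x 2 (an FMA counts as
# 2 FLOPs per cycle). Both 1060 variants share the same reference
# clocks; only the core count differs.
BOOST_CLOCK_HZ = 1.708e9  # ~1708MHz reference boost, both variants

def tflops(cuda_cores, clock_hz=BOOST_CLOCK_HZ):
    return cuda_cores * clock_hz * 2 / 1e12

six_gb = tflops(1280)    # ~4.37 TFLOPS
three_gb = tflops(1152)  # ~3.94 TFLOPS (10% fewer cores)

print(f"1060 6GB: {six_gb:.2f} TFLOPS")
print(f"1060 3GB: {three_gb:.2f} TFLOPS")
print(f"Deficit:  {1 - three_gb / six_gb:.0%}")  # -> 10%
```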




Ganoncrotch said:
HoloDust said:

Uhm... which makes it slower in games... who said anything about clocks?

Anyway, the 780 Ti is still quite a valid 1080p card; no point in upgrading it unless you go for higher res. I put a 1060 3GB in my other rig last year to replace an ancient HD 7770... now that was a worthy upgrade.

You did

"it's slower than regular 1060,"

When talking about clock speed you would refer to that as speed; or did you mean slower as in the card has less performance? You could get a 4GHz Pentium 4 processor that would be "faster" than my Ryzen 7, since the Ryzen is clocked at just 3.7GHz; the Pentium 4 runs more cycles per second than the Ryzen, but the obvious difference is that most of those cycles generate nothing but heat lol.

Perhaps the confusion is that you think the terms "clock speed" and "speed" are unrelated? But yeah, both versions of the 1060 run at the same speed; there are just 10% fewer cores in the 3GB version, which obviously ends up creating about 10% less performance (in a perfectly optimized scenario for the 6GB version).

No, I really didn't - that has nothing to do with clock speed.

Being slower or faster is the usual term used when comparing CPUs/GPUs... unless one is a grammar nazi or has no grasp of how hardware works, since one card being slower or faster than another is never compared via clock speeds (unless they are exactly the same card clocked differently).



HoloDust said:
Ganoncrotch said:

You did

"it's slower than regular 1060,"

When talking about clock speed you would refer to that as speed; or did you mean slower as in the card has less performance? You could get a 4GHz Pentium 4 processor that would be "faster" than my Ryzen 7, since the Ryzen is clocked at just 3.7GHz; the Pentium 4 runs more cycles per second than the Ryzen, but the obvious difference is that most of those cycles generate nothing but heat lol.

Perhaps the confusion is that you think the terms "clock speed" and "speed" are unrelated? But yeah, both versions of the 1060 run at the same speed; there are just 10% fewer cores in the 3GB version, which obviously ends up creating about 10% less performance (in a perfectly optimized scenario for the 6GB version).

No, I really didn't - that has nothing to do with clock speed.

Being slower or faster is the usual term used when comparing CPUs/GPUs... unless one is a grammar nazi or has no grasp of how hardware works, since one card being slower or faster than another is never compared via clock speeds (unless they are exactly the same card clocked differently).

You are using the term "speed" in place of "performance"; it's wrong, and there's no "grammar nazi" about it.

The card isn't faster; the speed it's functioning at is exactly the same on both the 3GB and 6GB models. The 6GB provides more performance because it has more cores functioning at that speed.

If you would like to think I've no grasp on hardware functionality, though, and that the card does in fact work "faster", then by all means go for it if that makes you feel better. I'm done with the hardware lesson.




Ganoncrotch said:
HoloDust said:

No, I really didn't - that has nothing to do with clock speed.

Being slower or faster is the usual term used when comparing CPUs/GPUs... unless one is a grammar nazi or has no grasp of how hardware works, since one card being slower or faster than another is never compared via clock speeds (unless they are exactly the same card clocked differently).

You are using the term "speed" in place of "performance"; it's wrong, and there's no "grammar nazi" about it.

The card isn't faster; the speed it's functioning at is exactly the same on both the 3GB and 6GB models. The 6GB provides more performance because it has more cores functioning at that speed.

If you would like to think I've no grasp on hardware functionality, though, and that the card does in fact work "faster", then by all means go for it if that makes you feel better. I'm done with the hardware lesson.

Well, I certainly hope you're done with your hardware lesson, and that you've learned from it not only that GPUs being faster or slower than one another is the usual lingo, has been for as long as there have been GPUs, and has nothing to do with clocks (which is something only someone with very little knowledge of hardware would assume), but also that how a GPU performs is measured by the time it needs to render a frame, or the frames it can render per second - thus making it, by definition, faster or slower than other GPUs.
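That frame-time/frame-rate relationship is just a reciprocal; a minimal sketch, with made-up millisecond values for illustration:

```python
# FPS and frame time are reciprocals: fps = 1000 / frame_time_ms.
# The card that renders each frame in less time is, by definition,
# the faster one. Millisecond values below are made up.
frame_times_ms = {"GPU A": 16.7, "GPU B": 22.2}  # hypothetical cards

for gpu, ms in frame_times_ms.items():
    print(f"{gpu}: {ms:.1f} ms/frame = {1000 / ms:.0f} FPS")

# GPU A: 16.7 ms/frame = 60 FPS
# GPU B: 22.2 ms/frame = 45 FPS -> GPU A is the faster GPU
```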



JRPGfan said:
Looks great for a last-gen game.
Though visually it's still behind a lot of current-gen games.

I think it holds up pretty damn well.

https://youtu.be/LSj_8BEKHCg?t=1m5s