Digital Foundry: £100 Graphics Card vs Next-gen Consoles

Interesting. I remember when an 8800 GTX could hardly run Crysis at 30fps on medium settings. That game was built from the ground up on PC, and that was the best video card out at the time (it cost AU$1000 when it launched). Then the PS360 got a version that wasn't too far behind what the 8800 GTX could do.

It just goes to show what monsters the PS360 were for their time, but they both sold at a loss for a long time. My understanding is that the X1 and PS4 are pretty much breaking even for Microsoft and Sony, so the fact that a $150 video card on a 28nm manufacturing process can almost match them is pretty telling. I know it's early in the new console gen, but I can't see gaming performance making massive leaps from where it is now on the consoles. They're apparently pretty easy to develop for, so I'd imagine gradual improvements as new engines (Snowdrop, etc.) allow it.

Can't wait for Maxwell :)



Munkeh111 said:
So it's not quite a fair comparison, as your £100 graphics card will likely find itself in a £400+ system. You would also only expect this parity to last another few months, as teams get more time and more experience working on next-gen hardware.

The problem is that as a PC gamer you always want more. My 670 could provide nice 1080p gaming, but once I can't hit ultra and keep the frame rate up, I'm going to want more power rather than just lowering the settings. Star Citizen is going to ruin me.

Very true, but most PC gamers will already have the system and will just be looking for upgrades, as I did recently.

I purchased a GTX 770 for £250, with three games bundled in, as an upgrade for my PC (a five-year-old build). The GTX 770 will keep my system going for a good number of years, and I'm able to play everything on Ultra with no issues.

It's true that building a capable PC is going to be more expensive than a console up front, but in the long run, with the ability to just upgrade individual components every couple of years, it works out cheaper (and gets better performance) than buying a brand-new console every six or so years.
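For rough illustration (the GPU price is from this post, the console figures are UK launch RRPs, and the rest is assumed): if the base system is already owned, that's one or two ~£250 GPU upgrades over a six-year console generation (£250-£500 in hardware), against £349-£429 for the console at launch. On hardware alone it's close to a wash; the long-run saving mostly comes from cheaper PC games.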



Current Game Machines: 3DS, Wii U, PC.

Currently Playing: XCOM (PC), Smash Bros. (Wii U), Banner Saga (PC), Guild Wars 2 (PC), Project X Zone (3DS), Luigi's Mansion 2 (3DS), DayZ (PC)

Surprising results. Battlefield is extremely well optimized, though DICE games usually are, since they put real effort into optimizing the PC version too.

And this is even before Mantle has hit ^^



Locknuts said:
Interesting. I remember when an 8800 GTX could hardly run Crysis at 30fps on medium settings. That game was built from the ground up on PC, and that was the best video card out at the time (it cost AU$1000 when it launched). Then the PS360 got a version that wasn't too far behind what the 8800 GTX could do.

It just goes to show what monsters the PS360 were for their time, but they both sold at a loss for a long time. My understanding is that the X1 and PS4 are pretty much breaking even for Microsoft and Sony, so the fact that a $150 video card on a 28nm manufacturing process can almost match them is pretty telling. I know it's early in the new console gen, but I can't see gaming performance making massive leaps from where it is now on the consoles. They're apparently pretty easy to develop for, so I'd imagine gradual improvements as new engines (Snowdrop, etc.) allow it.

Can't wait for Maxwell :)

Actually, a GeForce 8800 GTX 768MB could do high settings at 1080p and still achieve 30fps, provided you had a beefy Core 2 Duo and did some overclocking.
The trick was to run Crysis in DirectX 9 mode rather than DirectX 10; with a couple of config tweaks you could get the DirectX 9 path looking similar to the DirectX 10 path, but with the added benefit of a lot more performance.
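For anyone curious, those tweaks typically went into an autoexec.cfg in the Crysis install folder. A rough sketch of the idea; the CVar names and values below come from memory of the tweak guides of the era and are illustrative only, as they varied between patches:

    con_restricted = 0
    -- unlock restricted console variables first (illustrative; verify against your patch's CVar list)
    sys_spec_Shading = 4
    sys_spec_Shadows = 4
    sys_spec_Water = 4
    -- 4 = the "Very High" CVar groups normally reserved for the DX10 path
    r_UsePOM = 1
    -- parallax occlusion mapping for extra surface detail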

Besides, the console version wasn't actually a "port" of the original Crysis per se.
Crytek actually moved the game to CryEngine 3 (the PC version was using CryEngine 2) and then ported it to consoles. But even with that move, a lot of foliage and other assets were missing.
It would have been impossible for the consoles to run the original Crysis with all the assets and effects intact on CryEngine 2.

The GeForce 7950 series is probably a more accurate point of comparison for what to expect with that game.
On the PC version (with tweaking, of course) you could do better than the 360/PS3 using the DX9 path if you sacrificed some resolution (the same goes for Crysis 2).

http://www.youtube.com/watch?v=0rcGb8nvWJM
http://www.youtube.com/watch?v=dyr9VR_GVZQ
http://www.youtube.com/watch?v=jHWPGmf_A_0
http://www.youtube.com/watch?v=-l1TnsDYE5k
http://www.youtube.com/watch?v=YZbLAtoZNOY
http://www.youtube.com/watch?v=Upw5wFOc0Ig
http://www.tweaktown.com/articles/1234/nvidia_geforce_8800gts_512mb_g92_tested/index18.html

I do concur, however, that relative to a high-end PC the Xbox 360 and PlayStation 3 were incredibly powerful, which is in stark contrast to this new generation of consoles, which are only equivalent to a mid-range PC. (Enthusiast-level PCs are always in an entirely different league.)




www.youtube.com/@Pemalite

NiKKoM said:
Euros or dollars, people... Euros or dollars... you crazy left-driving people

:p



Pemalite said:
Locknuts said:
Interesting. I remember when an 8800 GTX could hardly run Crysis at 30fps on medium settings. That game was built from the ground up on PC, and that was the best video card out at the time (it cost AU$1000 when it launched). Then the PS360 got a version that wasn't too far behind what the 8800 GTX could do.

It just goes to show what monsters the PS360 were for their time, but they both sold at a loss for a long time. My understanding is that the X1 and PS4 are pretty much breaking even for Microsoft and Sony, so the fact that a $150 video card on a 28nm manufacturing process can almost match them is pretty telling. I know it's early in the new console gen, but I can't see gaming performance making massive leaps from where it is now on the consoles. They're apparently pretty easy to develop for, so I'd imagine gradual improvements as new engines (Snowdrop, etc.) allow it.

Can't wait for Maxwell :)

Actually, a GeForce 8800 GTX 768MB could do high settings at 1080p and still achieve 30fps, provided you had a beefy Core 2 Duo and did some overclocking.
The trick was to run Crysis in DirectX 9 mode rather than DirectX 10; with a couple of config tweaks you could get the DirectX 9 path looking similar to the DirectX 10 path, but with the added benefit of a lot more performance.

Besides, the console version wasn't actually a "port" of the original Crysis per se.
Crytek actually moved the game to CryEngine 3 (the PC version was using CryEngine 2) and then ported it to consoles. But even with that move, a lot of foliage and other assets were missing.
It would have been impossible for the consoles to run the original Crysis with all the assets and effects intact on CryEngine 2.

The GeForce 7950 series is probably a more accurate point of comparison for what to expect with that game.
On the PC version (with tweaking, of course) you could do better than the 360/PS3 using the DX9 path if you sacrificed some resolution (the same goes for Crysis 2).

http://www.youtube.com/watch?v=0rcGb8nvWJM
http://www.youtube.com/watch?v=dyr9VR_GVZQ
http://www.youtube.com/watch?v=jHWPGmf_A_0
http://www.youtube.com/watch?v=-l1TnsDYE5k
http://www.youtube.com/watch?v=YZbLAtoZNOY
http://www.youtube.com/watch?v=Upw5wFOc0Ig
http://www.tweaktown.com/articles/1234/nvidia_geforce_8800gts_512mb_g92_tested/index18.html

I do concur, however, that relative to a high-end PC the Xbox 360 and PlayStation 3 were incredibly powerful, which is in stark contrast to this new generation of consoles, which are only equivalent to a mid-range PC. (Enthusiast-level PCs are always in an entirely different league.)

Ah yes. I forgot it was one of the first DX10 games. You're right, the DX9 version did run better.



Pemalite said:

Adaptive V-Sync is your friend with fluctuating framerates.
AMD users can obtain that functionality by using RadeonPro (which in turn gives you access to SweetFX to improve a game's graphics).

Also, feel free to overclock; it's free performance.
AMD and NVIDIA are both making great strides in dealing with framerate fluctuations: NVIDIA is throwing G-Sync out into the wild, and AMD is pushing for the more open FreeSync.

Adaptive V-Sync is a good tip; I will give it a run right now. But AC IV is just an extreme example because of its wildly fluctuating framerate. It's not about tearing when going 40-60fps (that would be exactly the sweet spot for Adaptive V-Sync or the future G-Sync); it's more that the game dips to 20-30fps in some areas and runs at more than 60fps in others, so I have to adjust my settings to get a good framerate in the most demanding areas, while in lighter areas I'll be way above 60fps (headroom that could be spent on more anti-aliasing/effects there). It's something they should optimize better. Of course, AC IV isn't a good example of a PC port: it has good graphics, surely an improvement over AC III, but it's more demanding than games with better visual results.

About overclocking, I would need to see how far my graphics card would go on its stock cooling. I do have an extra big fan in my PC that is turned off right now (disconnected from the motherboard); it doesn't help with the amount of dust accumulated, and I really need some dust filters.



Munkeh111 said:

Well, I think a performance difference that large is rare. I generally have V-Sync on, which means a rock-solid 60fps in most games. Far Cry 3 was at a solid 30fps, and my little playthrough of Crysis 3 found things mostly stable at about 45fps.

With PC, you always have to accept that you're going to have to "get your hands dirty" to a certain extent. With the variety in PCs, the number of issues is going to be higher, and higher still given the lack of care that is going to continue to be taken.


Well, no problem with getting my hands a little dirty. And I'm happy with my graphics card, especially after I grabbed some 360 gamepads to play with.



I would love to get into PC gaming; I just don't have the knowledge to build my own, and I don't know if a prebuilt (Alienware, etc.) is worth the money compared to my Xbox One: $500 versus $1500.



Cleary397 said:
Munkeh111 said:
So it's not quite a fair comparison, as your £100 graphics card will likely find itself in a £400+ system. You would also only expect this parity to last another few months, as teams get more time and more experience working on next-gen hardware.

The problem is that as a PC gamer you always want more. My 670 could provide nice 1080p gaming, but once I can't hit ultra and keep the frame rate up, I'm going to want more power rather than just lowering the settings. Star Citizen is going to ruin me.

Very true, but most PC gamers will already have the system and will just be looking for upgrades, as I did recently.

I purchased a GTX 770 for £250, with three games bundled in, as an upgrade for my PC (a five-year-old build). The GTX 770 will keep my system going for a good number of years, and I'm able to play everything on Ultra with no issues.

It's true that building a capable PC is going to be more expensive than a console up front, but in the long run, with the ability to just upgrade individual components every couple of years, it works out cheaper (and gets better performance) than buying a brand-new console every six or so years.

Must have been a good deal; my 670 was £350... Personally, I'm not convinced that it does end up cheaper in the long run; it just depends on what your normal PC usage is. I have a laptop and a gaming PC, so the gaming PC wouldn't exist if it weren't for gaming. But then again, I would probably have a more expensive laptop...

Anyway, I play on consoles more simply because they're less of a hassle and have great exclusives. I'm lucky that money isn't so much of a problem.