
Forums - PC Discussion - Crytek Say The PC Is A Generation Ahead

shio said:

I don't know about you, but that Crysis pic completely crushes the Uncharted 2 one. And that's nowhere near the best of the Crysis pics.

Though I don't agree with Slimebeast, that's not Uncharted 2.

Edit: And yes, obviously Crysis on max still beats out Uncharted 2.



Rockstar: Announce Bully 2 already and make gamers proud!

Kojima: Come out with Project S already!


This is a comparison between a card that is, in practice, roughly twice as fast as Xenos and the current top-of-the-range AMD card. Bear in mind that the next generation of cards will be at least ~30% faster than the one shown here.

http://www.anandtech.com/bench/Product/179?vs=162

Bear in mind as well that many of the results here are CPU-limited, judging by how Crysis scales compared with other titles.

 



Tease.

I blame pirates. There isn't any potential in PC gaming because no one wants to develop for the PC. They don't develop because the gamers who play PC games would much rather get the game for free (don't blame them) than buy it. So developers stop developing. Makes sense to me. It's a shame too.



Slimebeast said:
Foamer said:

You're still completely missing the point. One last time for the hard of reading, in bold seeing as it's not getting through to you- he's saying no one's taking advantage of the huge tech advantage of the PC. Here's the rest of his point, in italics this time in case you're still not getting it- that's why you're not seeing the quantum leaps in graphical fidelity you'd expect given the enormous gulf in power.

The pics you're posting are just reinforcing what he's saying and, along with your spectacularly ignorant comments on hardware, making yourself look very silly.

No, my comments on hardware are accurate.

A console generation is at minimum a 16-times increase in graphics rendering power compared to the previous gen (power doubles every 18 months -> x2 x2 x2 x2 = 16).

The Xbox 360 came out 5 years ago and had a GPU roughly equivalent to an Nvidia 7800 GT 256MB (or a Radeon X1800XL if you like).

But the current strongest graphics card on PC, the Nvidia GTX 580 1536MB, is not 16 times faster than a 7800 GT (it's not even 10 times faster).

Conclusion: PC is not a generation ahead of consoles yet.
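The 16x figure above is just four compounded 18-month doublings over a roughly six-year generation. A quick sketch of that arithmetic, using the thread's own assumptions rather than any measured numbers:

```python
# "Performance doubles every 18 months" compounded over a console generation.
months_per_doubling = 18
generation_years = 6  # e.g. Xbox 360 (2005) to a hypothetical 2011 successor

doublings = (generation_years * 12) / months_per_doubling  # 4 doublings
expected_multiplier = 2 ** doublings

print(expected_multiplier)  # 16.0 -- the "minimum 16 times" figure in the post
```

Whether real hardware actually tracks that doubling schedule is, of course, exactly what the rest of the thread disputes.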



Actually, that isn't accurate at all ...

There is not a uniform level of processing power which defines a generation and (as a result) there is not a uniform boost in processing power that represents a generational gap. No one would argue that the Dreamcast was a generation ahead of the N64 but the Dreamcast was only 4 to 6 times as powerful as the N64; and few people would argue that the XBox 360 or PS3 aren't a generation ahead of the XBox, but neither system is 16 times the processing power of the XBox.

In general, the jump between a system in one generation to its replacement in the next generation is roughly 10 times the performance; and some systems will see smaller or larger jumps because of changes in strategy (changing the price point, change in time between console releases, change in size or energy consumption, etc.).

 

Now, the underlying question is "what is the minimum processing-power jump that can define a generational jump on processing power alone?", and that is not a straightforward question to answer. First, diminishing returns are an issue: a couple of generations ago, if you released a system with twice the performance of another system, the difference in displayed results could be dramatic, while today it would be noticeable but not significant. Secondly, you have to ask whether the display resolution and framerate remain the same, because the results produced by modern hardware at 720p @ 30fps (like most HD console games) are dramatically different from what hardware can produce at 1080p @ 60fps.

With that said, if you released a system that targeted 720p and was 6 to 8 times the processing power of the HD consoles no one would question that it was a generation ahead; and on the other hand, if you're targeting 1080p (or potentially 3D displaying at 1080p) you would need 8 to 16 times the performance.
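The 720p @ 30fps versus 1080p @ 60fps gap is easy to put a rough number on: pixel throughput alone differs by a factor of 4.5. This ignores everything else that scales with resolution (bandwidth, shading cost per pixel), so treat it as a lower bound on the extra work:

```python
# Pixel throughput = resolution x framerate, for the two targets above.
pixels_720p = 1280 * 720
pixels_1080p = 1920 * 1080

throughput_720p30 = pixels_720p * 30     # pixels/second at 720p @ 30fps
throughput_1080p60 = pixels_1080p * 60   # pixels/second at 1080p @ 60fps

print(throughput_1080p60 / throughput_720p30)  # 4.5
```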

 

One last consideration is that Crytek is (potentially) talking in terms of games that are in development today and will be released in 2011/2012. If you're producing a game where your 1 year old low end graphics card is the minimum requirement and has 4 times the processing power of the HD console's cards, you're targeting a new mid range graphics card for the game to perform well, and you're including enhancements for the people who will be running a system with 2 graphics cards that won't be released for a year, the hardware you are working on would clearly be a generation ahead of current generation consoles.



greenmedic88 said:
HappySqurriel said:
greenmedic88 said:
shio said:
greenmedic88 said:

And again we go back to the single exclusive game on PC that demonstrates the (multi) generation gap between a console and a PC.

It's seriously getting old. Especially considering that only a tiny percentage of gaming PCs in circulation allow Crysis to look its best (the version that the PC pundit invariably references) at decent frame rates and resolution.

Considering there really hasn't been anything major since 2007 on PC, the gap isn't so extreme.

Maybe PC developers should be working on squeezing more performance out of current hardware (current average hardware) rather than simply relying upon future hardware to take up the slack. But of course, that's not the nature of the PC game.

There are loads of examples, but Crysis is a very fitting one.

Then list them. 

PC exclusive games that required hardware upgrades in the same vein as Crysis in 2007.

Just on your original comment that "only a tiny percentage of gaming PCs in circulation allow Crysis to look its best", a PC with a Radeon HD 4770 (a low end graphics card released in 2008) can run Crysis at high detail, above 720p, at over 30fps ...

The real reason we haven't seen the benefits of more powerful PC hardware is not that the technology isn't dramatically more advanced than the HD consoles; after all, new graphics cards have a theoretical processing performance of 2 Teraflops, which is around 10 times the theoretical performance of the XBox 360's or PS3's GPU. What is holding developers back is that most third-party publishers need to sell games for the XBox 360, PS3 and PC in order to come close to breaking even on the cost of developing HD games. While there are some benefits for PC gamers from this (generally higher-detailed models and textures, better draw distances, higher resolutions and better framerates), developers can't really take full advantage of modern PC hardware without making it difficult or impossible to release their games on the HD consoles.
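The "around 10 times" figure can be sanity-checked with rough arithmetic. The ~240 GFLOPS peak for Xenos is a commonly cited number, not something stated in this thread, and theoretical peaks ignore architectural differences entirely, so this is a ballpark at best:

```python
# Rough theoretical-FLOPS ratio between a then-new PC card and the Xbox 360 GPU.
pc_gpu_gflops = 2000   # ~2 TFLOPS, the figure used in the post above
xenos_gflops = 240     # commonly cited Xenos theoretical peak (assumption)

ratio = pc_gpu_gflops / xenos_gflops
print(round(ratio, 1))  # 8.3 -- in the "around 10 times" ballpark
```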

That's what I'm talking about. 1280x720 on 16:9 or 1280x800 on 16:10 at 30-plus FPS on "high" (not highest or ultra) is not Crysis at its best. It's definitely not what you'll find when looking for the best examples of Crysis frame grabs, which are invariably the ones used in graphics comparisons.

Your second point reiterates what a lot of people have already said. Any game that pushed hardware to the point where a VGA card upgrade was in order for decent performance (which is at minimum what people will want if they're buying a game for the advanced visuals) is going to have to sell a significant number of copies assuming the R&D that went into the engine as well as all the highly detailed game resources drove the budget well into 8 figures. Millions of units sold, not hundreds of thousands. 

Crysis was essentially the last such PC exclusive title to do so. 

Without the hardware busting exclusives that come around every generation or so, the main reasons for trying to build an optimal gaming PC is as you said: higher detailed textures, better draw distances, higher res and frame rates. These don't really represent huge "generational" gaps as suggested by the OP. 

Personally, sure: give me the PC version of a multiplatform game 90 plus % of the time. It will play better on my PC and it typically costs me less as a bonus. But in the end, they're STILL the same games I'd be playing on a console and sometimes do when a PC port has serious issues. 

When I asked for a list of PC exclusives that are "hardware busters" (requiring a VGA upgrade for good performance) that was a legit question as the last such PC exclusive I've bought was Crysis Warhead back in 2008. I buy more games on PC than any other platform, so it's not like I don't follow what's been released. 

Yes, but that is a 2 year old low end graphics card, and the point of bringing that up was that the assumption that Crysis needs a "monster PC" to run is not true at all anymore. The high end graphics cards from 2009 played Crysis at the highest detail levels at 1080p or above and saw framerates of above 60fps; and people with the top of the line systems today can run a game like Crysis at over 120 frames a second at far above 1080p ...




Developers don't take PC gaming seriously at the moment because of the terrible state of the market, and people aren't going to rush out and buy $3000 PCs when they can get something almost as good in a $200-300 console. In terms of hardware, yes, the PC is ahead, because this generation of consoles is probably almost over. But I would be hesitant to call the current state of the PC gaming industry a generation ahead; most PC games I have played recently only have graphics comparable to last generation's consoles, and console games almost always have a much more interesting art style, even those on the DS (this generation's weakest system in terms of power).

Only Minecraft has really impressed me lately on PC, and it is not because of the graphics or tech, this game could have been made in 1995. Top PC games like Starcraft 2, Civilization 5, SPORE, Lord of the Rings Online, etc... while fairly good, have been disappointing. 



I describe myself as a little dose of toxic masculinity.

HappySqurriel said:
Slimebeast said:
Foamer said:

[...]

One last consideration is that Crytek is (potentially) talking in terms of games that are in development today and will be released in 2011/2012. If you're producing a game where your 1 year old low end graphics card is the minimum requirement and has 4 times the processing power of the HD console's cards, you're targeting a new mid range graphics card for the game to perform well, and you're including enhancements for the people who will be running a system with 2 graphics cards that won't be released for a year, the hardware you are working on would clearly be a generation ahead of current generation consoles.

Great post as always Squirrel, and I'm ready to agree.

About the bolded part, though. That makes tons of sense and is probably what is happening at Crytek and in Yerli's head at the moment. But isn't it unfair to include future hardware and future games and then say the PC is easily a generation ahead?

Because the same could be said for consoles. Bethesda is (potentially) developing Fallout 4 for next gen consoles and I can bet my hat that the Xbox 720 will be released in late 2012 and have a GPU that is way faster than a Nvidia GTX 580.



I have little doubt my new PC could run GT5 with 8xAA, ultra shadows, etc. on triple screens better than the PS3 runs it on one screen.



Jumpin said:

people aren't going to rush out and buy loads of $3000 PCs when they can get something almost as good in a $2-300 console.

$3000? Pretty ignorant I'd say. You can get a terrific gaming PC for much less than $1000, and a decent, future-proof gaming PC for an even lower price.

Jumpin said:

most PC games I have played recently only have graphics comparable to last generation's consoles

You mean the sixth gen? If so, you clearly haven't had a glance at PC gaming since... the beginning of this millennium? No, seriously, most PC games haven't looked anything like sixth-gen games in a long time. What have you played, and how's your PC? Because either your games are from the sixth gen or your PC is.



HappySqurriel said:
 

Yes, but that is a 2 year old low end graphics card, and the point of bringing that up was that the assumption that Crysis needs a "monster PC" to run is not true at all anymore. The high end graphics cards from 2009 played Crysis at the highest detail levels at 1080p or above and saw framerates of above 60fps; and people with the top of the line systems today can run a game like Crysis at over 120 frames a second at far above 1080p ...

The whole point of this thread was how much further PC has progressed relative to consoles, the OP being that they are "a generation" ahead.

First, acknowledge there's a difference between being able to just play the game and being able to play it with everything set to max, i.e. at 1920x1080 with all settings on "ultra" or even "enthusiast," which is where you need to be to see those big visual differences.

When Crysis came out in 2007, there weren't any VGA cards that could play the game at max settings at non-slideshow framerates. The key word is "max," because there's little point in claiming how advanced the visuals are when no one actually gets to see them on their own PC. "Playable" isn't even the issue: dial the visuals down far enough to keep playable framerates on a non-optimized PC and just about any game starts to look pretty bad.

I haven't seen any top of the line systems that can run Crysis over 120 fps at 1920x1080. GTX580 SLI gave 93.6 fps at 1920x1200 (reducing that to 1920x1080 will not give you an extra 26 fps) at Gamer quality with Enthusiast shaders and 4xAA. The HD5970 from last year gave 65.7fps, same settings.

So I'm not sure where this "120 fps at far above 1080p" is coming from.
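Even granting perfectly linear scaling with pixel count (an optimistic upper bound; real framerates scale worse than this), the AnandTech number doesn't stretch to 120 fps:

```python
# Best-case framerate at 1920x1080 extrapolated from the 1920x1200 benchmark,
# assuming fps scales perfectly inversely with pixel count (it doesn't).
measured_fps = 93.6  # GTX 580 SLI at 1920x1200, per the AnandTech result above
scale = (1920 * 1200) / (1920 * 1080)  # ~1.11x fewer pixels to render

best_case_fps = measured_fps * scale
print(round(best_case_fps, 1))  # 104.0 -- still well short of 120 fps
```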

http://www.anandtech.com/show/4008/nvidias-geforce-gtx-580/6

"Even 2 years since the release of the original Crysis, “but can it run Crysis?” is still an important question, and the answer continues to be “no.” One of these years we’ll actually be able to run it with full Enthusiast settings…"