
Won't it be depressing if...

Zlejedi said:
 

Not really.

For me, the 6870 and 560 offer good enough graphics

They are fine today...for games which are mostly optimized for consoles. Give them something like Metro 2033 or the upcoming Battlefield 3 and they are pretty much borderline. Now imagine having devs optimize games for those GPUs until 2020 or whatever year the next console gen ends.




If they put in the current latest graphics cards, the amount of juice and overclocking headroom you could push out of those GPUs would be sufficient for several years. But what I want to see is a massive increase in RAM and CPU speed. I want to see a 4-5 GHz CPU and at least 4-8 GB of RAM, because the more RAM the better.



Of Course That's Just My Opinion, I Could Be Wrong

HappySqurriel said:

Even the rumoured "Low End" GPU in the Wii U, after modifications and optimization, will likely be able to play games on a similar level to Crysis 2 at close to its highest levels of detail at 1080p @ 60fps ... While graphics can improve beyond this point, the value in pushing graphics is becoming smaller and smaller over time.

On top of the lower value, until there is a means of managing development costs while producing dramatically higher quality graphics, it is unlikely that publishers and developers will push the high-end hardware that could be released with next generation systems. At a time when developers are going bankrupt if their $20 to $40 million game doesn't sell as well as expected, it isn't likely that these publishers/developers will want to start producing games with an average budget approaching $100 million.

Edit: Please note that I'm not saying the Wii U will be able to match the output of Crysis 2 at its highest settings at launch, but rather that the best looking games after 3 or 4 years will probably be very similar to what we see from Crysis 2 on a high-end PC today.


Console games benefit a little from being developed for specific hardware (vs. PC, which has to support a crapload of configurations). They also benefit from better upscaling when it comes to resolutions. PCs are more cut and dried: if you ask for a sub-720p res, you will clearly see a sub-720p res. Other than that, however, the hardware behaves the same and gives similar performance.

I mean, take a console port like BioShock 2 and run it on similar hardware on PC at mid/low detail with low AA and 720p resolution. You should be getting similar frame rates on a Radeon X1950 as on the 360.

So I think this "consoles optimize GPUs much better" idea is not very accurate. A console hides things a little better than a PC and sometimes benefits from in-house development for one hardware configuration...IMO.
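To put rough numbers on the upscaling point above, here is a quick back-of-envelope sketch in Python. The sub-HD resolution is a common illustrative figure, not a measured value for any particular game:

```python
# Back-of-envelope pixel counts: how much smaller a typical sub-HD
# console framebuffer is than native 720p. The 1024x600 figure is an
# illustrative example, not a measured value for any specific game.
resolutions = {
    "native 720p": (1280, 720),
    "sub-HD example (1024x600)": (1024, 600),
    "native 1080p": (1920, 1080),
}

base = 1280 * 720  # 720p pixel count used as the reference
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.0%} of 720p)")
```

Rendering roughly a third fewer pixels and upscaling the result is a big part of how a console version keeps frame rates up on fixed hardware, while a PC asked for native 720p has to shade every one of those pixels.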



HappySqurriel said:

Even the rumoured "Low End" GPU in the Wii U, after modifications and optimization, will likely be able to play games on a similar level to Crysis 2 at close to its highest levels of detail at 1080p @ 60fps ... While graphics can improve beyond this point, the value in pushing graphics is becoming smaller and smaller over time.

On top of the lower value, until there is a means of managing development costs while producing dramatically higher quality graphics, it is unlikely that publishers and developers will push the high-end hardware that could be released with next generation systems. At a time when developers are going bankrupt if their $20 to $40 million game doesn't sell as well as expected, it isn't likely that these publishers/developers will want to start producing games with an average budget approaching $100 million.

Edit: Please note that I'm not saying the Wii U will be able to match the output of Crysis 2 at its highest settings at launch, but rather that the best looking games after 3 or 4 years will probably be very similar to what we see from Crysis 2 on a high-end PC today.

I think you should just keep that paragraph in Notepad so you can copy/paste it into every graphics thread instead of having to type it every time.  The middle paragraph is probably the most compelling part of the argument.  If development costs were 2-4x higher for the next generation, then even more developers/publishers would go out of business compared to this generation.  We would probably be left with just EA, Activision, and a few others dictating that the smaller developers publish sequel after sequel (due to them not wanting to take risks on new ideas).  Basically, the gaming industry would fully adopt a Hollywood mentality.
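For a sense of scale on those budget numbers, here is a minimal break-even sketch. Every figure in it, especially the per-copy revenue, is an assumption chosen for illustration, not a sourced number:

```python
# Break-even copies at different budgets, under assumed numbers.
# The $25 net-to-publisher per $60 copy is a made-up illustrative
# figure, not an industry statistic.
def break_even_units(budget: float, revenue_per_copy: float) -> float:
    """Copies needed to recoup the development budget."""
    return budget / revenue_per_copy

REVENUE_PER_COPY = 25.0  # assumed publisher take per copy sold

for budget in (20e6, 40e6, 100e6):
    copies = break_even_units(budget, REVENUE_PER_COPY)
    print(f"${budget / 1e6:.0f}M budget -> ~{copies / 1e6:.1f}M copies to break even")
```

Under those assumptions a $100M game needs around 4 million sales just to break even, so doubling or quadrupling budgets directly doubles or quadruples the sales bar, which is exactly the squeeze being described.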



disolitude said:
Zlejedi said:
 

Not really.

For me, the 6870 and 560 offer good enough graphics

They are fine today...for games which are mostly optimized for consoles. Give them something like Metro 2033 or the upcoming Battlefield 3 and they are pretty much borderline. Now imagine having devs optimize games for those GPUs until 2020 or whatever year the next console gen ends.

Those cards can run The Witcher 2 at full HD. With the advantage of optimised code they should be fine for the next 5-6 years for most people. And at least they won't be painful for your eyes, unlike today's games :)



PROUD MEMBER OF THE PSP RPG FAN CLUB

sethnintendo said:

I think you should just keep that paragraph in Notepad so you can copy/paste it into every graphics thread instead of having to type it every time.  The middle paragraph is probably the most compelling part of the argument.  If development costs were 2-4x higher for the next generation, then even more developers/publishers would go out of business compared to this generation.  We would probably be left with just EA, Activision, and a few others dictating that the smaller developers publish sequel after sequel (due to them not wanting to take risks on new ideas).  Basically, the gaming industry would fully adopt a Hollywood mentality.


Actually no...this can only be a valid argument for console gamers.

Game development tools are advancing a great deal, and budgets, while still high, don't need to be so massive to achieve impressive visuals. Only when you have underpowered consoles, and developers set on delivering a visually impressive game have to push every ounce of horsepower, do you run into massive budgets due to graphics.

Otherwise, if you take something middle-of-the-road on consoles like Battlefield: Bad Company 2 and look at the PC version, using DX11, tessellation, and full HD resolution at 60 frames per second...it spanks Uncharted 2, Killzone 3 or whatever else consoles can muster in terms of visuals...yet it doesn't have to cost an arm and a leg.

What I am trying to say is -

Great hardware, game engine and dev tools = great graphics

and

great graphics != high budget costs



Zlejedi said:
disolitude said:
Zlejedi said:
 

Not really.

For me, the 6870 and 560 offer good enough graphics

They are fine today...for games which are mostly optimized for consoles. Give them something like Metro 2033 or the upcoming Battlefield 3 and they are pretty much borderline. Now imagine having devs optimize games for those GPUs until 2020 or whatever year the next console gen ends.

Those cards can run The Witcher 2 at full HD. With the advantage of optimised code they should be fine for the next 5-6 years for most people. And at least they won't be painful for your eyes, unlike today's games :)


True. If next-gen consoles settle for 1080p @ 30fps, 2xAA, 3D gaming at 720p per eye, and generally mid-range game details (especially later in the gen)...those graphics cards are enough.
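For scale, here is a rough sketch of the raw pixel throughput those targets imply. It treats 2xAA and the two 3D eyes as flat 2x multipliers, which is a simplification of how real GPUs handle antialiasing samples and stereo rendering:

```python
# Raw pixels-per-second for the render targets mentioned above.
# AA and per-eye 3D rendering are folded in as flat 2x multipliers,
# which oversimplifies what real GPUs actually do.
targets = {
    "1080p @ 30fps with 2xAA": 1920 * 1080 * 30 * 2,
    "3D, 720p per eye @ 30fps": 1280 * 720 * 30 * 2,
    "1080p @ 60fps, no AA": 1920 * 1080 * 60,
}

for name, pixels_per_second in targets.items():
    print(f"{name}: {pixels_per_second / 1e6:.0f} Mpixels/s")
```

Under those simplified multipliers, 3D at 720p per eye asks for less than half the throughput of 1080p @ 30 with 2xAA, which is why a mid-range card from today could plausibly hold up for those targets.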