
PS3 UT3 looking better than ever as a must buy.

twesterm said:
your mother said:
twesterm said:
your mother said:

30fps? What resolution?

Anyhow, doesn't matter, as even on the PC version the framerates will be "locked" at 60fps.

But the differences between the PS3 and PC versions don't stop there - have a look at the preliminary benchmarks for the PC:

http://www.pcper.com/article.php?aid=464&type=expert&pid=5

2560x1600@46.6fps average. Higher Definition indeed...


They aren't going to lock the FPS at 60; there's no good reason to have that hard lock on there. None of the other UT games did this and I doubt this one will either, though they all do have a console command to lock the FPS.


(and @ WEWdeadeye)

There may not be a good reason to lock the fps, and while none of the other UT games did so, that doesn't mean they aren't doing it this time:

http://www.pcper.com/article.php?aid=464&type=expert

From page 2, and I quote:

"There aren't a whole lot of options for graphics in the demo right now, but we did set the texture and world detail levels up to their peak at 5. V-Sync was disabled though the game does have a 60 FPS lock on it so scores on the higher end cards are going to look closer than they might otherwise be. "

And WEWdeadeye, lol all you want, but bear in mind you are having a good old lol at that site. That's all.


Thanks for highlighting that; now I'll highlight the part you missed. Isn't this fun!

I've already worked in Unreal 3 (not the demo) and the framerate wasn't locked.


You are right! This is fun!

You somehow are missing the entire point: These benchmarks are for what version of UT3 again?

Oh, that's right, it's the DEMO version, just like you pointed out!

If you even bothered to look at the charts from the PCPer link, you can see that, despite being "locked", the cards are still pushing higher than 60fps in best-case situations.

Maybe you can highlight the part where I specifically said that the retail version of UT3 would have the framerates capped at 60fps? Because certainly I haven't seen it - the retail version, that is.

And twesterm, you aren't the only person on this forum who works in gaming for a living...
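
Back on the frame-rate point: for anyone wondering what a cap like that actually is, as opposed to V-Sync (which PCPer disabled), here is a rough sketch of the general idea. This is illustrative only and not Epic's actual code; engines typically cap frame rate in software by idling out the rest of each frame's time slot, and in UE3-based games that sort of cap is usually exposed through the frame-rate smoothing settings in the engine ini (bSmoothFrameRate / MaxSmoothedFrameRate, if memory serves), separately from V-Sync.

```cpp
// Rough sketch of a software frame-rate cap ("60 FPS lock"), illustrative only.
// Unlike V-Sync, which ties presentation to the monitor's refresh, a limiter
// like this simply refuses to start the next frame before its time slot.
#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;

    const double max_fps = 60.0;  // the cap being argued about
    const auto frame_budget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / max_fps));

    auto next_frame = clock::now();
    for (int frame = 0; frame < 600; ++frame) {  // stand-in for the real game loop
        // UpdateGame(); RenderFrame();          // hypothetical per-frame work

        next_frame += frame_budget;                 // earliest allowed start of the next frame
        std::this_thread::sleep_until(next_frame);  // idle out the rest of the slot
    }
    return 0;
}
```

With a limiter like that in place, disabling V-Sync (as PCPer did) doesn't let the faster cards run away, which is exactly why their scores bunch up closer together than they otherwise would.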



tabsina said:
your mother said:
GranTurismo said:
By the end of the PS3 life cycle it will have games better looking than Crysis.

By the end of the PS3 life cycle the PC will have games that make Crysis look like Mario64.


I agree with this, although I wouldn't put it that way. I'll just say that of course PS3 games will look better as time goes on, as happens with all consoles, and PCs will likewise continue to improve. The difference is that PCs are without bounds, since they can constantly be upgraded.


True, but they are also limited by having to support everything from low to high textures. They have to make it so a game can run on hardware at least three generations behind when the game is made, which can limit the game.



ssj12 said:
your mother said:
ssj12 said:
your mother said:
fazz said:
Morgyn said:

"El Cheap" 8800 GTS 320MB is what... $300? That's most of the cost of the ps3/360 right there never mind the rest of the kit.


No, $279, that's about the same as a Wii. And the cheaper 8800 GT is coming in a week or two. And considering most people already have a PC for everything else, you only need to put the card in and voilà.

@ssj12: You got an 8400? As a stopgap, I guess? Is it any good? :P


Myself, I'm waiting for the RV670 to come out, because I'll be damned if I crossfire two dustbuster 2900s in my whisper-quiet rig while consuming more electricity than a hair dryer.


*pats on shoulder* ATI has become complete garbage, just like AMD. They were both in the lead with their tech, but now they suck, and since they are the same company now they can drag each other down. Intel QX9650 and Nvidia 9800GTX FTW!!!


You could've said the same about Intel when AMD were utterly destroying Intel with the Athlon series, or when ATI came out with the 9600.

This is technology, and they all have their ups and downs, but if the RV670 doesn't deliver, nor the R780, I will relegate my CrossFire motherboard to home theater duty and just go SLI.


AMD has no chance in hell with the Phenom-based Athlons, and ATI didn't deliver on a single promise with the HD2900XT and won't with the HD2900XTX or the R780, or whatever that will really be called.


That's what Intel said prior to AMD's Athlon, and we all remember how that turned out, eh?

You seem to be an Intel fanboy. Well, sorry to disappoint, but I am a fanboy of technology, and swear no allegiance to bullcrap like names. My current rig, as a matter of fact, is Intel-based, simply because currently it is superior to AMD, not because I wear Intel on my sleeve. Of course, if AMD came out on top again, I would not hesitate for a second to add whatever chip they have into my PC, not because I made some secret blood pact with the AMD fraternity, but because it would be the better product. Period. I hope you are like that as well, otherwise you may see yourself having to suck up bullcrap that AMD fanboys dish at you for choosing the losing side. 



Will it work with keyboard/mouse? I read that somewhere, but I'm not sure if it was confirmed or not.



stranne said:
Will it work with keyboard/mouse? I read that somewhere, but I'm not sure if it was confirmed or not.

So far mouse/keyboard are supported. 



your mother said:

That's what Intel said prior to AMD's Athlon, and we all remember how that turned out, eh?

You seem to be an Intel fanboy. Well, sorry to disappoint, but I am a fanboy of technology, and swear no allegiance to bullcrap like names. My current rig, as a matter of fact, is Intel-based, simply because currently it is superior to AMD, not because I wear Intel on my sleeve. Of course, if AMD came out on top again, I would not hesitate for a second to add whatever chip they have into my PC, not because I made some secret blood pact with the AMD fraternity, but because it would be the better product. Period. I hope you are like that as well, otherwise you may see yourself having to suck up bullcrap that AMD fanboys dish at you for choosing the losing side.


 *looks at desktop* AMD Athlon X2 4600+ in my desktop.. of course this desktop is made to be mid-grade. 



ssj12 said:


*looks at desktop* AMD Athlon X2 4600+ in my desktop.. of course this desktop is made to be mid-grade.


Good for you. You should know, then, that brands in technology, especially in PC hardware, don't mean jack. Things change at the drop of a hat, and there is no reason to think that Phenom, or whatever else AMD has down the road, will be a failure.



your mother said:


You are right! This is fun!

You somehow are missing the entire point: These benchmarks are for what version of UT3 again?

Oh, that's right, it's the DEMO version, just like you pointed out!

If you even bothered to look at the charts from the PCPer link, you can see that, despite being "locked", the cards are still pushing higher than 60fps in best-case situations.

Maybe you can highlight the part where I specifically said that the retail version of UT3 would have the framerates capped at 60fps? Because certainly I haven't seen it - the retail version, that is.

And twesterm, you aren't the only person on this forum who works in gaming for a living...


Just from reading this thread and the original link in your post, no distinction was drawn between the cap in the demo and in the release version. The way things were being presented here, everyone was just assuming that the PC version will be capped as well, and that just isn't true. All I am doing is clearing that up.

So, yes, I'm sorry that I can't highlight the part where you said the release version will be capped; I can only highlight the part where you said it will be capped, which can easily be read as meaning every version.

your mother said:

Anyhow, doesn't matter, as even on the PC version the framerates will be "locked" at 60fps.


Also, no one even mentioned the word demo (not even your first link) until after I posted.

And I'm well aware that I'm not the only person who works in gaming here; I really don't care, as it doesn't have much to do with this. I merely mentioned it because I have worked in Unreal 3 (I even work with a guy who has written quite a few articles on the Unreal Developers Network for Unreal 3), and I know it isn't capped.



your mother, I'm not really looking to talk crap about computer gaming. The reality for many people is that they need high-end PCs for various things not necessarily confined to gaming. For these people (especially those who use them for their jobs) the cost is justified, as it is dual use. However, my argument is that if you build a PC basically for the purpose of gaming (as many people do), it is expensive for what you get out of it, considering its fairly limited life span.

For someone who is a fan of UT but not a fan of the price tag of building a new computer from scratch, a PS3 version of UT seems like a fair compromise. It is obvious that the PC version will beat it in every way imaginable, but I'm sure the PS3 version will still look gorgeous and be a blast to play.




twesterm said:

Just from reading this thread and the original link in your post, no distinction was drawn between the cap in the demo and in the release version. The way things were being presented here, everyone was just assuming that the PC version will be capped as well, and that just isn't true. All I am doing is clearing that up.

So, yes, I'm sorry that I can't highlight the part where you said the release version will be capped; I can only highlight the part where you said it will be capped, which can easily be read as meaning every version.

your mother said:

Anyhow, doesn't matter, as even on the PC version the framerates will be "locked" at 60fps.


Also, no one even mentioned the word demo (not even your first link) until after I posted.

And I'm well aware that I'm not the only person who works in gaming here; I really don't care, as it doesn't have much to do with this. I merely mentioned it because I have worked in Unreal 3 (I even work with a guy who has written quite a few articles on the Unreal Developers Network for Unreal 3), and I know it isn't capped.

Sorry if that was so very confusing for you. Perhaps I should have specified that, just as PCPer's article states, those benchmarks are for the demo version, not the PC retail version; again, I reiterate, that comes from PCPer (whom I quoted), not from me stating it as my own fact.

Hope that clears things up for you.