
Tomb Raider.......WOW!

CGI-Quality said:
Hynad said:
CGI-Quality said:

HDMI can only output 1080p @ 60Hz on console (and most console games aren't even native 1080p), while PC can enjoy higher fps, in addition to higher resolutions, with DVI or DP (DisplayPort). For example, I have my 360 hooked up to a Samsung SA23700D 3D 120Hz monitor, yet the picture clarity is still trumped by my PC (which runs on another monitor of the exact same model). This is all before we even get to the individual, PC-exclusive traits.

Both (HDMI and dual-link DVI) allow for a resolution of 2560x1600 at 60Hz with 48-bit color depth.
 

Now, HDMI 1.4 allows resolutions of 3840x2160 @ 30Hz and 4096x2160 @ 24Hz, which puts HDMI over DVI when all is said and done, though those resolutions and refresh rates aren't really suitable for PC gaming. In the end, HDMI is superior tech to DVI, since it can achieve pretty much the same as DVI but also carries 8-channel audio. Both connections allow a resolution of 1920x1200 at 120Hz.

If you really want to go beyond the specs of HDMI, DVI isn't the answer at all. You have to go DisplayPort. That is, if you want to enjoy the resolutions HDMI tops out at, but at 60Hz instead of 30Hz. It has twice the bandwidth HDMI has. So for PC gaming enthusiasts, this is a given.

If I use HDMI on my PC, I will not get the full 120Hz of my monitor, for instance. I don't get it out of my console, either (same monitor). With dual-link DVI, those issues aren't there. The picture quality won't be different if you're running both (DVI-D and HDMI) @ 60Hz, no, but that was never in question.

Yeah, could it be that HDMI 1.4-ready monitors aren't out yet? It's been a while since I've shopped for a display.
Because HDMI 1.4 has more than enough bandwidth to allow it.

Also, what was in question, then? You mentioned "higher resolutions with DVI", yet DVI and HDMI have the same resolution output...
If all you meant was Hz, then yeah, refer to my first paragraph above.
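
To put rough numbers on the modes being thrown around here, here's a quick bandwidth sanity check (a sketch in Python; the link rates are the published effective data rates after 8b/10b coding, while the blanking overhead factor is an assumption, roughly CVT reduced-blanking):

```python
# Effective video data rates (after 8b/10b coding), published figures:
#   single-link DVI: 165 MHz pixel clock x 24 bit = 3.96 Gbit/s
#   dual-link DVI:   330 MHz pixel clock x 24 bit = 7.92 Gbit/s
#   HDMI 1.3/1.4:    340 MHz pixel clock x 24 bit = 8.16 Gbit/s
LINKS_GBPS = {"single-link DVI": 3.96, "dual-link DVI": 7.92, "HDMI 1.3/1.4": 8.16}

def required_gbps(width, height, hz, bpp=24, blanking=1.12):
    # blanking=1.12 is an assumed reduced-blanking overhead, not a spec value
    return width * height * hz * bpp * blanking / 1e9

for w, h, hz in [(1920, 1080, 60), (1920, 1200, 120), (2560, 1600, 60), (3840, 2160, 30)]:
    need = required_gbps(w, h, hz)
    fits = [name for name, cap in LINKS_GBPS.items() if cap >= need]
    print(f"{w}x{h} @ {hz}Hz needs ~{need:.1f} Gbit/s -> {', '.join(fits) or 'none'}")
```

Note this only checks raw bandwidth; what a given port actually accepts also depends on the spec's pixel-clock limits and the monitor's EDID, which is why 4K stays HDMI-only in practice.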



CGI-Quality said:
Hynad said:
CGI-Quality said:
Hynad said:

Both (HDMI and dual-link DVI) allow for a resolution of 2560x1600 at 60Hz with 48-bit color depth.
 

Now, HDMI 1.4 allows resolutions of 3840x2160 @ 30Hz and 4096x2160 @ 24Hz, which puts HDMI over DVI when all is said and done, though those resolutions and refresh rates aren't really suitable for PC gaming. In the end, HDMI is superior tech to DVI, since it can achieve pretty much the same as DVI but also carries 8-channel audio. Both connections allow a resolution of 1920x1200 at 120Hz.

If you really want to go beyond the specs of HDMI, DVI isn't the answer at all. You have to go DisplayPort. That is, if you want to enjoy the resolutions HDMI tops out at, but at 60Hz instead of 30Hz. It has twice the bandwidth HDMI has. So for PC gaming enthusiasts, this is a given.

If I use HDMI on my PC, I will not get the full 120Hz of my monitor, for instance. I don't get it out of my console, either (same monitor). With dual-link DVI, those issues aren't there. The picture quality won't be different if you're running both (DVI-D and HDMI) @ 60Hz, no, but that was never in question.

Yeah, could it be that HDMI 1.4-ready monitors aren't out yet? It's been a while since I've shopped for a display.
Because HDMI 1.4 has more than enough bandwidth to allow it.

I don't see why they wouldn't be able to. My impression was that even Sony's 3D Display was HDMI 1.4a-ready. I was also led to believe that HDMI was HDMI, regardless of the 1.3 vs 1.4a distinction (although it would be silly for them to call it something different if it were exactly the same).

On this bit, things are a little foggy for me, so any piece of knowledge would help. I was always under the impression that DisplayPort had them both beat (bandwidth and clarity), but that dual-link DVI was what most PC gamers used.


Even DVI has its differences: single link vs. dual link, for example. But the current standard, touch wood, is dual link. It's the same for HDMI: right now the standard is shifting to 1.4, but it will take some time before 1.3 is completely in the past. There are still many manufacturers making sets with HDMI 1.3 ports...

As for DisplayPort, no questions asked, it trumps HDMI. Double the bandwidth, same color depth, and roughly twice the frame rate at the same resolutions...
If you have the choice, and the port on your monitor, you go DisplayPort.
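
Putting numbers on the "double the bandwidth" point (a sketch; DP 1.2 figures, the current DisplayPort spec at the time, and both rates are the published effective data rates):

```python
# Effective video data rates in Gbit/s (after 8b/10b line coding):
hdmi_14 = 0.340 * 24        # 340 MHz TMDS clock x 24 bits/pixel = 8.16
dp_12 = 4 * 5.4 * (8 / 10)  # 4 lanes x 5.4 Gbit/s, 8b/10b coding = 17.28

print(f"HDMI 1.4: {hdmi_14:.2f} Gbit/s")
print(f"DP 1.2:   {dp_12:.2f} Gbit/s ({dp_12 / hdmi_14:.1f}x HDMI)")
# ~2.1x, which lines up with the claim above: enough headroom for
# 2560x1600 at ~120Hz, where HDMI 1.4 tops out around 60Hz.
```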



Damn, looks like the 600 series graphics cards have a bug. Seems like the game crashes every 20-30 minutes.

Well, just got my first deer, so I'm not very far into the game. Loving it so far. Found it funny that the first movie looked worse graphically than the game does when I'm actually playing. They apparently didn't render the in-game movies at max settings.

Got me 2/10 totems. Are they all in this starting area?



irstupid said:
Damn, looks like the 600 series graphics cards have a bug. Seems like the game crashes every 20-30 minutes.

Well, just got my first deer, so I'm not very far into the game. Loving it so far. Found it funny that the first movie looked worse graphically than the game does when I'm actually playing. They apparently didn't render the in-game movies at max settings.

Got me 2/10 totems. Are they all in this starting area?


Turn off tessellation and you probably won't crash so much.



Oh look..........graphics. lol



CGI-Quality said:

Steam just released Tomb Raider for us PC nerdies, right? Cool. So I go in thinking it will look pretty good, but not expecting to be floored... and then BAM! Not only does this game look good (of course, it's maxed out on a PC that can run everything on Ultra), it is the first game that truly looks like a next-gen console game. And it doesn't just stop there - TressFX, the lighting, and even some of the texture detail are a step above practically anything I've seen. Easily the best looking adventure game available.

Now, I'm sure many will take this as "hyperbole" or "me trying to show off the power of my rig", but no, the devs deserve all the credit here. I did NOT expect even the PC version to trump Uncharted 3, but it does so with little to no trouble at all. I can't stress it enough: if you can max this game out, you'll see what I mean! For the first time since God of War III, I'm in complete awe, and TressFX really helps bring that "Next Gen" feel to the character!

If you're picking it up, I think you'll be pleased with the results!

Ok, so I'm thinking of getting a Sager gaming laptop, so I can use it for Maya and UDK and such too. Anyway, the one I'm looking at would have a GeForce GTX 675MX with 4GB of GDDR5, 8 or 16GB of RAM, and a quad-core i7 processor. I take it these would be able to run this game, or even Crysis 3, on Ultra settings or almost all Ultra settings.

I don't want to derail the thread, but what's your opinion on these? I know the 675MX is pretty good, but I don't know how good.



CGI-Quality said:
darkknightkryta said:
CGI-Quality said:
darkknightkryta said:
You called me crazy when I said Tomb Raider looked great, now who's the crazy one?!!!!!!  

Would you mind linking me to that post where I called you crazy? 

You didn't; I'm just exaggerating. Someone, I think it was Nsanity, made a thread showing gameplay from Tomb Raider. Everyone in the thread was going on about how the game had terrible graphics, didn't look too good, etc., etc. I think you were disappointed with the graphics too.

Well, I was initially let down by the visuals, which were console pics, btw, yes. This was also before they said the PC version would have exclusive features, and prior to the beta. What was shown did not look very good, but what you currently get, maxed out on PC, is miles and miles better.

Console or not, that's the base of the game; you can clean up the corners with tessellation and run the game at high res, but that's what it is and that's what the final game is (though I'm not sure if they were arsed to improve the textures in the PC version - did they?). Still, it's pretty amazing how much better an image looks after it leaves the frame buffer and gets its post-processing. Plus, in motion it's gorgeous.



CGI-Quality said:
nnodley said:

Ok, so I'm thinking of getting a Sager gaming laptop, so I can use it for Maya and UDK and such too. Anyway, the one I'm looking at would have a GeForce GTX 675MX with 4GB of GDDR5, 8 or 16GB of RAM, and a quad-core i7 processor. I take it these would be able to run this game, or even Crysis 3, on Ultra settings or almost all Ultra settings.

I don't want to derail the thread, but what's your opinion on these? I know the 675MX is pretty good, but I don't know how good.

I imagine that card will have 2GB of VRAM, and the CPU, which will make a difference with Cry3, is plenty. So it should be able to max it with little to no trouble.

Thanks. I've been wanting to get into PC gaming, and I think now is the perfect time.



CGI-Quality said:
darkknightkryta said:
CGI-Quality said:
darkknightkryta said:
CGI-Quality said:
darkknightkryta said:
You called me crazy when I said Tomb Raider looked great, now who's the crazy one?!!!!!!  

Would you mind linking me to that post where I called you crazy? 

You didn't; I'm just exaggerating. Someone, I think it was Nsanity, made a thread showing gameplay from Tomb Raider. Everyone in the thread was going on about how the game had terrible graphics, didn't look too good, etc., etc. I think you were disappointed with the graphics too.

Well, I was initially let down by the visuals, which were console pics, btw, yes. This was also before they said the PC version would have exclusive features, and prior to the beta. What was shown did not look very good, but what you currently get, maxed out on PC, is miles and miles better.

Console or not, that's the base of the game; you can clean up the corners with tessellation and run the game at high res, but that's what it is and that's what the final game is (though I'm not sure if they were arsed to improve the textures in the PC version - did they?). Still, it's pretty amazing how much better an image looks after it leaves the frame buffer and gets its post-processing. Plus, in motion it's gorgeous.

It wasn't just a cleaned-up image. The specific details, the textures themselves, were lacking in those early pics. Besides, console pics vs. souped-up PC footage can make a HUGE difference. In Tomb Raider's case, the contrast couldn't be any more profound.

What I got on Tuesday and what I saw months ago were vastly different.

That's all in the post-processing. Even the PS3 version I'm playing looks way better, but the models weren't any better, nor were the backgrounds. Not too sure about the texture work, though; what I'm seeing is absolutely amazing, but I wasn't too focused on the little things, like that wheel everyone was complaining about.



CGI-Quality said:
darkknightkryta said:
CGI-Quality said:

It wasn't just a cleaned-up image. The specific details, the textures themselves, were lacking in those early pics. Besides, console pics vs. souped-up PC footage can make a HUGE difference. In Tomb Raider's case, the contrast couldn't be any more profound.

What I got on Tuesday and what I saw months ago were vastly different.

That's all in the post-processing. Even the PS3 version I'm playing looks way better, but the models weren't any better, nor were the backgrounds. Not too sure about the texture work, though; what I'm seeing is absolutely amazing, but I wasn't too focused on the little things, like that wheel everyone was complaining about.

Well, there you go! Some of us were, and what we saw was lackluster. Not the case today.

That's what I was trying to say: I'm not sure if they fixed the little things. But the trees, the mountain ranges, everything - the textures are top notch. I'm actually surprised they managed to do what they did on the consoles. The scale isn't sacrificed, and they've got that fancy lighting through particles in the air that Crysis 3 and Unreal 4 are showing (though not abused the way Killzone: Shadow Fall or Infamous 3 are doing it - then again, I'm not sure any 7-year-old video card could handle it to that extent). I know it ain't much, but there's a lot of local lighting going on while the environment is getting lit. That's very difficult to do.