
Forums - PC - World's fastest DX11 GPU? Also a very impressive tech demo; see the future

The glitches in the Black Ops demo were funny as well, lol.



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk, and The Witcher 3!


My laptop can run Call of Duty; I don't really know why they demoed that game of all others.

Btw, ATI is better.

Edit: I'd also like to note that, especially for PC graphics, you can't see what's so great about them unless they're right in front of you, and this is also off-screen cam footage, which is even worse.



I live for the burn...and the sting of pleasure...
I live for the sword, the steel, and the gun...

- Wasteland - The Mission.

I'm just curious about one question, and I'd like someone with some expertise on the matter to answer it, rather than someone who knows even less than I do just saying "no":

 

Any chance of the next Nintendo console having those visuals?



I think ATI put the fear into Nvidia, by the looks of it. I was always wondering how they managed to fall behind ATI the way they did for a while there. I'm thinking maybe they were serious about challenging Intel? I dunno, but this is extremely awesome.



Tag(thx fkusumot) - "Yet again I completely fail to see your point..."

HD vs Wii, PC vs HD: http://www.vgchartz.com/forum/thread.php?id=93374

Why Regenerating Health is a crap game mechanic: http://gamrconnect.vgchartz.com/post.php?id=3986420

gamrReview's broken review scores: http://gamrconnect.vgchartz.com/post.php?id=4170835

 

Mr.Metralha said:

I'm just curious about one question, and I'd like someone with some expertise on the matter to answer it, rather than someone who knows even less than I do just saying "no":

 

Any chance of the next Nintendo console having those visuals?


If they wanted to, yes. The main things stopping them are cost and power consumption (the Wii pulls about 17 W under load; this card alone pulls 200-250 W).
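To put that power gap in perspective, a quick back-of-the-envelope calculation with the figures quoted above (17 W for the whole Wii vs 200-250 W for the card by itself):

```python
# Rough ratio sketch using the wattage figures from the post above.
# Note the comparison is lopsided: 17 W is the entire Wii console,
# while 200-250 W is the graphics card alone.
wii_watts = 17
gpu_watts_low, gpu_watts_high = 200, 250

print(f"low:  {gpu_watts_low / wii_watts:.1f}x the whole Wii")
print(f"high: {gpu_watts_high / wii_watts:.1f}x the whole Wii")
```

So this one GPU draws roughly 12 to 15 times what the entire Wii does under load, which is why a small, quiet, cheap Nintendo box with these visuals is a hard sell.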



vlad321 said:

I think ATI put the fear into Nvidia, by the looks of it. I was always wondering how they managed to fall behind ATI the way they did for a while there. I'm thinking maybe they were serious about challenging Intel? I dunno, but this is extremely awesome.


It's a fixed-up Fermi (new stepping): what Fermi should have been a year ago, really. It's nice that they're competitive now, but AMD has yet to show its competing part (Cayman). Hold off judgement until that is released.

AMD still holds the single-card crown and will extend that with dual-Cayman, and if Cayman is any good they will have the single-GPU lead and a perf/mm^2 and perf/watt lead. But Nvidia won't go bankrupt which is a good thing for consumers.

The next parts from both companies are key. They're 28nm, and Nvidia's taken a year and a half more than AMD to produce a viable 40nm part (4770 vs this), so can Nvidia actually execute on a new process node on time?

In any case, Nvidia will take six months to roll the 5xx series out to the midrange and low end. AMD already has the midrange out, and its low end is due Jan/Feb. That's where the volume, and hence the revenue, is.



The future for PC gaming is bright



ǝןdɯıs ʇı dǝǝʞ oʇ ǝʞıן ı ʍouʞ noʎ 

Ask me about being an elitist jerk

Time for hype

Liking the looks of this.



Rockstar: Announce Bully 2 already and make gamers proud!

Kojima: Come out with Project S already!

vlad321 said:

I think ATI put the fear into Nvidia, by the looks of it. I was always wondering how they managed to fall behind ATI the way they did for a while there. I'm thinking maybe they were serious about challenging Intel? I dunno, but this is extremely awesome.


Yeah, ATI was just too good: lower price point, lower power drain, and they released sooner, thanks to focusing on efficiency and minimising die size. Nvidia went for the biggest, most powerful die possible while pushing CUDA and tessellation above all else, and the die on release was too hot, too expensive, and too power-hungry to be competitive (beyond brand loyalty and a performance advantage). They've improved with their midrange cards, though, and this refresh looks mighty impressive. I think Nvidia's plan is finally coming to fruition with this generation of cards; I just hope AMD can release something competitive or better to keep prices down.



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk, and The Witcher 3!

lol...It was pretty cool to see COD get chewed up by this card.

I will never understand why people even bring up power drain when discussing cards like this. This card is 500 bucks; anyone buying one is obviously running a high-end gaming machine.

What are the odds those people care about their electricity bills or the need to buy a better power supply? They don't...they go for broke.
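For what it's worth, that electricity-bill cost is easy to put a number on. A minimal sketch, assuming a 250 W card vs a 150 W one, 3 hours of gaming a day, and $0.12 per kWh (all made-up round figures for illustration, not measurements):

```python
# Back-of-the-envelope yearly electricity cost of a GPU under load.
# All inputs are illustrative assumptions, not real measurements.

def annual_cost(watts, hours_per_day, price_per_kwh=0.12):
    """Yearly electricity cost in dollars for a component drawing `watts`."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# Assumed: 250 W high-end card vs a 150 W midrange card, 3 h/day.
high = annual_cost(250, 3)   # ~$32.85/year
mid = annual_cost(150, 3)    # ~$19.71/year
print(f"250 W card: ${high:.2f}/year")
print(f"150 W card: ${mid:.2f}/year")
print(f"difference: ${high - mid:.2f}/year")
```

Under those assumptions the difference comes out to roughly $13 a year, which rather supports the point: for someone dropping $500 on a card, the power bill is noise. The power supply and cooling requirements matter more than the running cost.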

On topic... I am perfectly happy with my GTX 470 SLI setup and will most likely never have a reason to use this card, but it seems like the best single GPU you can buy, by a long shot. The only thing Nvidia needs to implement is the ability to connect 3 monitors to a single GPU. You still need SLI to do that, and this one card could easily power 3 monitors for most games.