
IF the rumors were true AND the PS4 were to use an HD 7670 and the Nextbox an HD 6670, who would come out on top?

DraconianAC said:
I'm just going to say one thing: optimization. Now that I've said it, I'm going to explain it to you. Software optimization is a beautiful thing; ask Apple, it makes their average hardware appear lightning fast. Sony can't really do it for shit in their software department, though I'll exclude some of their game developers.

A PC is a multi-task machine that is supposed to run many different applications. Most of the software is not tailor-made for one type of PC because everyone has their own setup with different components. Just think of it this way:
Think of all the games that look better on Nvidia cards compared to AMD cards; then find out whether Nvidia worked closely with the game developers to optimize the code for their hardware. I will bet a pretty penny that they did. It's gonna take a few years, and a much more powerful AMD card, to hit the same FPS, and it still won't look as good as it does on an Nvidia card.
Here is another example: PS2 emulation. It's still hard to emulate some PS2 games on high-end computers even though that's 10+ year old tech. There is a community trying to optimize the code for PC so that we don't have stutters or glitches here and there, but I still can't play a PS2 Saint Seiya game on my PC at 60 FPS. ARGGGG!!!

Lol what? Outside their game devs, what do you mean?

And it's obvious we can't compare a PC GPU with the one in a console. However, that doesn't mean a low-end PC GPU will turn into a monster card in a console. The PS3's RSX is comparable to the GeForce 7800 series, and those were high-end cards when they came out. One would expect the next-gen consoles to have derivatives of today's high-end GPUs, like the 8870 (rebranded 7870). The HD 6670 is comparable to the 8800 GT. Yes, faster than the RSX, but don't expect the kind of performance boost we've seen going PS1 > PS2 > PS3.




Well, usually the trend is:

Weaker system = winner.

But a head start could change everything, depending on who has it.



Xbox: Best hardware, Game Pass best value, best BC, more 1st party genres and multiplayer titles. 

 

It's the same GPU, and no console will use it in the next gen.



Haha, it's pretty funny to see everyone worried. If this turns out to be true, below is how powerful the next-gen consoles will be in terms of current graphics cards. Awesum Powa!

I, for one, hope this is true. It will silence all the annoying console power discussions on forums...

 

Graphics Card Hierarchy Chart
(Two columns, GeForce | Radeon, flattened here into tiers from fastest to slowest. The rumored Xbox 720/PS4 GPU sits in the tier marked below.)

GeForce Discrete: GTX 690

GeForce Discrete: GTX 590
Radeon Discrete: HD 6990, 7970 GHz Ed.

GeForce Discrete: GTX 680
Radeon Discrete: HD 7970

GeForce Discrete: GTX 670
Radeon Discrete: HD 5970, 7950

GeForce Discrete: GTX 580, 660 Ti
Radeon Discrete: HD 7870 LE, 7870

GeForce Discrete: GTX 295, 480, 570, 660 | Go (mobile): 680M
Radeon Discrete: HD 4870 X2, 6970, 7850 | Mobility: 7970M

GeForce Discrete: GTX 470, 560 Ti, 560 Ti 448 Core
Radeon Discrete: HD 4850 X2, HD 5870, HD 6950 | Mobility: 7950M

GeForce Discrete: GTX 560, 650 Ti | Go (mobile): 580M
Radeon Discrete: HD 5850, HD 6870 | Mobility: 6990M

GeForce Discrete: 9800 GX2, 285, 460 256-bit, 465 | Go (mobile): 675M
Radeon Discrete: HD 6850, HD 7770 | Mobility: 6900M

GeForce Discrete: GTX 260, 275, 280, 460 192-bit, 460 SE, 550 Ti, 560 SE, 650 | Go (mobile): 570M, 670M
Radeon Discrete: HD 4870, 5770, 4890, 5830, 6770, 6790, 7750 | Mobility: HD 5870, 6800M

GeForce Discrete: 8800 Ultra, 9800 GTX, 9800 GTX+, GTS 250, GTS 450 | Go (mobile): 560M, 660M
Radeon Discrete: HD 3870 X2, 4850, 5750, 6750 | Mobility: HD 4850, 5850, 7870M

GeForce Discrete: 8800 GTX, 8800 GTS 512 MB, GT 545 (GDDR5) | Go (mobile): GTX 280M, 285M, 555M (GDDR5)
Radeon Discrete: HD 4770 | Mobility: HD 4860, 7770M, 7850M

GeForce Discrete: 8800 GT 512 MB, 9800 GT, GT 545 (DDR3), GT 640 (DDR3) | Go (mobile): 9800M GTX, GTX 260M (112), GTS 360M (GDDR5), 555M (DDR3)
Radeon Discrete: HD 4830, HD 5670, HD 6670 (GDDR5) | Mobility: HD 5770, HD 5750, 6600M/6700M (GDDR5), 7750M
<<< NEXT GEN CONSOLE GAMING GPU! >>>

GeForce Discrete: 8800 GTS 640 MB, 9600 GT, GT 240 (GDDR5) | Go (mobile): 9800M GTS, GTX 160M
Radeon Discrete: HD 2900 XT, HD 3870, HD 5570 (GDDR5), HD 6570 (GDDR5) | Mobility: 6500M (GDDR5), 6600M/6700M (DDR3), 7730M

GeForce Discrete: 8800 GS, 9600 GSO, GT 240 (DDR3) | Go (mobile): GTX 260M (96), GTS 150M, GTS 360M (DDR3)
Radeon Discrete: HD 3850 512 MB, HD 4670, HD 5570 (DDR3), HD 6570 (DDR3), HD 6670 (DDR3) | Mobility: HD 3870, HD 5730, HD 5650, 6500M (DDR3)

GeForce Discrete: 8800 GT 256 MB, 8800 GTS 320 MB, GT 440 GDDR5, GT 630 GDDR5 | Go (mobile): 8800M
Radeon Discrete: HD 2900 PRO, HD 3850 256 MB, 5550 (GDDR5) | Mobility: HD 3850

GeForce Discrete: 7950 GX2, GT 440 DDR3, GT 630 DDR3
Radeon Discrete: X1950 XTX, HD 4650 (DDR3), 5550 (DDR3) | Integrated: HD 7660D

GeForce Discrete: 7800 GTX 512, 7900 GTO, 7900 GTX, GT 430, GT 530 | Go (mobile): 550M
Radeon Discrete: X1900 XT, X1950 XT, X1900 XTX

GeForce Discrete: 7800 GTX, 7900 GT, 7950 GT, GT 220 (DDR3) | Go (mobile): 525M, 540M
Radeon Discrete: X1800 XT, X1900 AIW, X1900 GT, X1950 Pro, HD 2900 GT, HD 5550 (DDR2) | Integrated: HD 7560D

GeForce Discrete: 7800 GT, 7900 GS, 8600 GTS, 9500 GT (GDDR3), GT 220 (DDR2) | Go (mobile): 7950 GTX
Radeon Discrete: X1800 XL, X1950 GT, HD 4650 (DDR2), HD 6450 | Mobility: X1800 XT, HD 4650, HD 5165, 6400M | Integrated: HD 6620G, 6550D, 7540D



Turkish said:
DraconianAC said:
[...]

Lol what? Outside their game devs, what do you mean?

And it's obvious we can't compare a PC GPU with the one in a console. However, that doesn't mean a low-end PC GPU will turn into a monster card in a console. The PS3's RSX is comparable to the GeForce 7800 series, and those were high-end cards when they came out. One would expect the next-gen consoles to have derivatives of today's high-end GPUs, like the 8870 (rebranded 7870). The HD 6670 is comparable to the 8800 GT. Yes, faster than the RSX, but don't expect the kind of performance boost we've seen going PS1 > PS2 > PS3.


The 8800 and HD 6670 are nowhere close to each other. For starters, no 8800 can do DX11; hell, they can't even run DX10.1.

The HD 6670 is much more powerful. However, I expect Durango to CrossFire two of them. Don't expect a GTX 680 type of card; it won't happen. Hell, imagine what the current gen could do with 2 gigs of RAM instead of 512! Common sense will tell us that more than RAM, CPU, and GPU are involved with consoles, especially with Kinect 2.0 and that room thingy Microsoft just showed off. People bash Nintendo for making an expensive shell of a gamepad that should have gone to more power, yet don't think about the same issues with Move and Kinect.



Demensha said:
[...]


The 8800 and HD 6670 are nowhere close to each other. For starters, no 8800 can do DX11; hell, they can't even run DX10.1.

The HD 6670 is much more powerful. However, I expect Durango to CrossFire two of them. Don't expect a GTX 680 type of card; it won't happen. Hell, imagine what the current gen could do with 2 gigs of RAM instead of 512! Common sense will tell us that more than RAM, CPU, and GPU are involved with consoles, especially with Kinect 2.0 and that room thingy Microsoft just showed off. People bash Nintendo for making an expensive shell of a gamepad that should have gone to more power, yet don't think about the same issues with Move and Kinect.


DX11 won't matter when you have such a weak GPU. The 8800 GT and HD 6670 are very close; look at this hierarchy chart: http://www.tomshardware.co.uk/gaming-graphics-card-review,review-32416-7.html

The 6670 will be a three-generation-old low-end card. The PS3 and 360 had derivatives of high-end GPUs when they came out. Unless Sony and MS bundle an expensive peripheral with their consoles, we'll have powerful, futureproof consoles.



Worst case scenario: 5x Wii U. Very disappointing if true.



The consoles haven't even launched yet and the graphics wars have already started...



Nintendo and PC gamer

DraconianAC said:

Think of all the games that look better on Nvidia cards compared to AMD cards; then find out whether Nvidia worked closely with the game developers to optimize the code for their hardware. I will bet a pretty penny that they did. It's gonna take a few years, and a much more powerful AMD card, to hit the same FPS, and it still won't look as good as it does on an Nvidia card.
Here is another example: PS2 emulation. It's still hard to emulate some PS2 games on high-end computers even though that's 10+ year old tech. There is a community trying to optimize the code for PC so that we don't have stutters or glitches here and there, but I still can't play a PS2 Saint Seiya game on my PC at 60 FPS. ARGGGG!!!


Prove that games look better on nVidia cards compared to AMD. (I.e., it simply isn't true.)
That has been debunked so many times in the PC space it's not funny.
Every now and then AMD might have better texture filtering and nVidia might get better anti-aliasing, but the difference is never "clear as day", and they catch up to each other eventually, even without you upgrading your hardware.

Sure, you get PhysX with nVidia cards, but you can also run that with AMD cards if you use hacked drivers and a secondary low-end nVidia GPU.
I have also yet to find a game I can't run on a single monitor with max graphics; triple monitor is a different story... but hell, that's a lot of goddamn pixels to render, and nVidia would fold too.

Also, the reason a super-powerful PC can't emulate a PS2 100% perfectly is simple: it's not a PS2.
Emulation works on the basis that, instead of the game code being modified to run optimised natively on x86 architectures, the PC uses software middle layers to "emulate" a PS2 environment, where instructions are intercepted and a JIT compiler re-compiles them to run on x86.
It's hardly an efficient process; the upside is compatibility and potentially superior image quality, the disadvantage is of course performance.
Look at any PC-exclusive game: it surely doesn't look like a PS2 game, because it can take full advantage of the hardware and doesn't need anything to translate the instructions.
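
To make the "middle layers" point concrete, here's a minimal interpreter-loop sketch in C++. The tiny guest ISA is completely made up for illustration (it is not PCSX2's code or the real R5900 instruction set); the point is the fetch/decode/dispatch overhead that every emulated instruction pays, which a JIT only partially removes and which native code never pays at all:

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// Hypothetical three-instruction guest ISA, purely illustrative.
enum class Op : uint8_t { ADD, PRINT, HALT };

struct Insn { Op op; uint8_t dst, src1, src2; };

int main() {
    uint32_t reg[8] = {0, 5, 7};            // guest register file
    std::vector<Insn> program = {
        {Op::ADD,   0, 1, 2},               // r0 = r1 + r2
        {Op::PRINT, 0, 0, 0},               // print r0
        {Op::HALT,  0, 0, 0},
    };

    // The emulator's core loop: fetch, decode, dispatch.
    // Every guest instruction pays this overhead; a JIT translates
    // whole blocks to host x86 so most of it disappears, while
    // native code skips the translation entirely.
    for (size_t pc = 0; ; ++pc) {
        const Insn& i = program[pc];
        switch (i.op) {
            case Op::ADD:   reg[i.dst] = reg[i.src1] + reg[i.src2]; break;
            case Op::PRINT: std::printf("%u\n", reg[i.dst]);        break;
            case Op::HALT:  return 0;
        }
    }
}
```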

Now, take a 5-year-old PC, something along the lines of a Core 2 Quad Q6600, and drop in something like a GeForce 8800 GT and a few gigabytes of RAM. You would be hard pressed not to run most games at console levels of image quality at 720p@30fps, provided it was a decent console port to begin with (i.e., not GTA IV).
Heck, you could run Crysis on that at at least 25-30fps. ;)

To put the Radeon 6670 in perspective for some, though...
A Radeon 6670/7670 has about 3x the compute power of the Xbox 360's or PS3's graphics chips, and it also brings image-quality features such as tessellation and all the other pretties/optimisations that came with the move to DirectX 10/10.1 and 11.
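
If you want to sanity-check that "about 3x" figure, the usual back-of-envelope is shader ALUs * clock * 2 ops (for a multiply-add). The figures below are the commonly quoted peak numbers, not measured throughput, so treat the result as a rough estimate only:

```cpp
#include <cstdio>

int main() {
    // Peak single-precision GFLOPS ~= shader ALUs * clock (GHz) * 2 (MADD).
    // All figures are the commonly quoted specs; real-world throughput differs.
    double hd6670 = 480 * 0.800 * 2;   // Radeon HD 6670: 480 SPs @ 800 MHz -> 768
    double xenos  = 240.0;             // Xbox 360 Xenos: usually cited ~240 GFLOPS
    double rsx    = 192.0;             // PS3 RSX programmable shading: often cited ~192

    std::printf("HD 6670: %.0f GFLOPS (%.1fx Xenos, %.1fx RSX)\n",
                hd6670, hd6670 / xenos, hd6670 / rsx);
    return 0;
}
// Prints roughly: HD 6670: 768 GFLOPS (3.2x Xenos, 4.0x RSX)
```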

The other thing to remember is that Sony and Microsoft wouldn't be using the slow GDDR3 memory that's bog standard on low-end cards, either; they'd probably opt for memory with vastly more bandwidth, so it would still beat a cheap desktop Radeon 6670/7670 at any rate.
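
On the bandwidth point, the rule of thumb is bus width (bits) / 8 * effective transfer rate. Using the commonly listed retail specs (approximate, for illustration only), the gap between the DDR3 and GDDR5 6670 variants, and last gen's GDDR3, is stark:

```cpp
#include <cstdio>

// Bandwidth (GB/s) = (bus width in bits / 8) * effective data rate (GT/s).
// Figures below are the commonly listed specs, used illustratively.
double bandwidth_gbps(int bus_bits, double gtps) {
    return bus_bits / 8.0 * gtps;
}

int main() {
    std::printf("HD 6670 DDR3:   %.1f GB/s\n", bandwidth_gbps(128, 1.8)); // ~28.8
    std::printf("HD 6670 GDDR5:  %.1f GB/s\n", bandwidth_gbps(128, 4.0)); // ~64.0
    std::printf("Xbox 360 GDDR3: %.1f GB/s\n", bandwidth_gbps(128, 1.4)); // ~22.4
    return 0;
}
```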

Also worth noting is that integrated graphics, from both Intel and AMD, have reached the point of near parity with a Radeon 6670/7670. A few more years and they will probably eclipse it.
With that said, I highly doubt Sony or Microsoft would choose such chips, especially aging ones based on the VLIW5 architecture.




www.youtube.com/@Pemalite

Wii U. Wii U would win.