
PS4 processing specs detailed - sources

Snesboy said:
zero129 said:
VGKing said:
Andrespetmonkey said:
VGKing said:
Snesboy said:
VGKing said:
Snesboy said:
se7en7thre3 said:
I just bought an HD 6670 low profile card, and I'm not impressed by it at all. Can't even do 1080p with all effects without major slowdown. So someone with more knowledge about PC cards, feel free to correct me, but if these specs hold true, then Wii U won't be too far off and will in fact be able to receive ports easily (if not be the main console that does the porting, a la PS2).

Bold: You are correct.

Unbolded: WiiU's GPU crushes the 6670.

No, it doesn't. Wii U is rumored to be using the R700 series, which is the 4xxx family of GPUs.

http://gamrconnect.vgchartz.com/thread.php?id=141992&page=1

Shut the fuck up.

1. You got some balls to say that. Hope it was worth it.

2. There is no rumor that says Wii U will be using the HD 4870. That is just wishful thinking, and if you saw how many watts it uses, you'd know there's no way ANY console would use it.

Here is a rumour that says it'll be using it 

The GPU will be custom built for the Wii U so power usage could easily change. There is no reason why the Wii U couldn't have a GPU based on the HD 4870.

That's from June 2011 which is about the time of E3. A LOT has changed since then.

Devs have said it's not much of an improvement over the 360 and in some cases it's actually LESS capable.

Which devs have said that? Do you know any of their names? I know devs such as Gearbox, Epic etc. have said it's more powerful.

Damn right. Proper devs have said it was more powerful. Anonymous devs have just said it was less powerful. I think I will side with CliffyB and Randy Pitchford over some anonymous troll.

Rob Fahey (a columnist at Gamesindustry.biz, the website that ran the article about some devs not being happy with the power of the Wii U, though he's not the one who wrote the offending article) wrote an interesting comment when asked about it. He basically said the contradictory comments come down to 3 things:

1. Some devs have genuine complaints because of what they want to deliver vs the hardware.

IMO this is either console development done wrong, or it's about multiplatform titles they're trying to port (with difficulty for one reason or another). If it's for an exclusive then they're really screwing up, considering they should be designing everything with the hardware in mind.

2. Developers are at different stages and still getting to grips with what they can do.

3. Developers have different skill levels (blunt but probably true).



pezus said:
Staude said:
pezus said:
VGKing said:
pezus said:
If it releases with specs like that it'll be kind of weak...I want to see new technology, not old PC tech that is already outdated.


LOL! Even if they took a 2008 GPU/CPU it would easily outperform the off-the-shelf PC counterparts. Why? Optimization.

These PS4 specs are a HUGE improvement over PS3. Easily. I'm sure no one wants another $600 console.

You're a PC gamer anyway so what do you care?

What a load of bollocks...

Oh people can't game on 2 systems now?

Actually the bolded part of his post is accurate. Imagine you have a development team. Now, if you're developing for the PC, you can pretty much do whatever you want and then enable the player to scale some settings that'll help it run on their system.
Now, if you, on the other hand, develop for a console, you have a fixed system where you know EXACTLY what you can do. Since graphics cards and processors are different, you can directly create assets for the game that run well on the system you are developing for. That allows you to get far more out of it, as you minimize the things that it's weak at while maximizing the things that it's good at.

Well that's the point of it anyways. That some developers choose not to .. :P minor detail.

It could be true if he picked a year like 2011 maybe...but 2008? Hell no. PC devs optimize for the most popular graphics cards you know.


Considering a PC's range can go from the best graphics card in the world with the best CPU, to the worst with the worst, to the best GPU with a shit CPU, vice versa, CrossFire/SLI, RAID hard drives, SSDs... etc etc. It is simply EXTREMELY unlikely that any given game is optimised fully to run on one person's given system. It can have a great scalable engine, but even so, if the developers have a specific set of hardware to develop for, they will be able to get more - the most possible - out of that given set of hardware, which can yield great results such as Uncharted :P Just to use a mainstream brand. But that's just the way it is. That's one of the reasons why console games look "so good"... Another reason is that they have a limited number of things running in the background, which leaves more resources for the game itself (though newer consoles have more stuff running, it's still much less than a PC would :P).
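
To make the scaling point concrete, here is a minimal, hypothetical sketch (in C++) of the kind of default-settings ladder a PC build has to carry; the QualityPreset names, the VRAM thresholds, and the detection stub are all made up for illustration, while a console build targets exactly one known machine and can bake its assets for it:

```cpp
#include <cstdint>
#include <iostream>

// Hypothetical quality tiers a PC build might expose to the player.
enum class QualityPreset { Low, Medium, High, Ultra };

// Pick a default preset from detected video memory. The thresholds are
// illustrative only; a real engine would also weigh GPU model, CPU, etc.
QualityPreset pickDefaultPreset(std::uint64_t vramMiB) {
    if (vramMiB >= 2048) return QualityPreset::Ultra;
    if (vramMiB >= 1024) return QualityPreset::High;
    if (vramMiB >= 512)  return QualityPreset::Medium;
    return QualityPreset::Low;
}

int main() {
    // On PC the engine probes the machine, guesses a sane default, and
    // still has to let the player override every individual setting.
    std::uint64_t detectedVramMiB = 1024;  // stand-in for a real hardware query
    QualityPreset preset = pickDefaultPreset(detectedVramMiB);
    std::cout << "Default preset index: " << static_cast<int>(preset) << "\n";

    // On a console there is exactly one hardware target, so this whole
    // ladder collapses to a single configuration chosen at authoring time.
    return 0;
}
```

Every branch in that ladder is work (and testing) that a single-target console team never has to do, which is the optimisation headroom being described above.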



Check out my game about moles ^

pezus said:
zero129 said:
pezus said:
zero129 said:
pezus said:
Staude said:
pezus said:

It could be true if he picked a year like 2011 maybe...but 2008? Hell no. PC devs optimize for the most popular graphics cards you know.


Considering a PC's range can go from the best graphics card in the world with the best CPU, to the worst with the worst, to the best GPU with a shit CPU, vice versa, CrossFire/SLI, RAID hard drives, SSDs... etc etc. It is simply EXTREMELY unlikely that any given game is optimised fully to run on one person's given system. It can have a great scalable engine, but even so, if the developers have a specific set of hardware to develop for, they will be able to get more - the most possible - out of that given set of hardware, which can yield great results such as Uncharted :P Just to use a mainstream brand. But that's just the way it is. That's one of the reasons why console games look "so good"... Another reason is that they have a limited number of things running in the background, which leaves more resources for the game itself (though newer consoles have more stuff running, it's still much less than a PC would :P).

Yeah but then again, the graphics intensive console games have much bigger budgets than most PC games do. But when we see a dev truly use PC power to its fullest (Crysis) we see a game which not even console games from 2012 and forward can beat in graphics. But we are talking gaming PCs here, and devs optimize for the most popular cards and CPUs like I said. It's not that much to optimize for.

Add The Witcher 2 as well :D, nothing even comes close on consoles (yes, including the 360 port).

Of course! I was just demonstrating that even a PC game from 2007 beats every single console game.

Good point :D. It sure does feel good to be playing the next gen already, pezus :D.

Sure does.

I never said they could match the best on the PC market, but a game on the console beats the living hell out of the same game on PC on a similar rig, due to the facts I stated.

I have a pretty badass PC myself (I run Battlefield 3 in 1080p on max), but comparatively, if you're able to optimize for specific hardware, you'll always be able to achieve more. That's simply not possible for PCs because of the scalability you have to achieve :P



Check out my game about moles ^

Rath said:
logic56 said:
Mnementh said:
logic56 said:
DirectX is what's killing these rumors for me. I think the people making this crap up should really find out exactly what it is.


A DirectX11-enabled card. That doesn't mean Sony have to offer a DirectX11-API to developers.

The DirectX platform is fully owned by MS; they couldn't offer it to developers even if they wanted to.

I dunno, Microsoft wants to drive the adoption of DirectX as the standard. They might just be happy for Sony to use it in their console.

Of course Sony probably wouldn't want to as most things I've read have stated that OpenGL is superior.

5 years ago that might have been true; it just isn't anymore. OpenGL does offer some marginal benefits, but without the backing of any of the graphics card manufacturers it's dead in the water, except perhaps for the mobile and Apple markets.
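
As a side note on Mnementh's point that a "DirectX11-enabled card" doesn't oblige Sony to expose a DirectX 11 API: one common way engines handle this is an abstraction layer, where the same engine code drives whatever native API each platform actually ships (the PS3, for instance, uses Sony's own libGCM/PSGL rather than Direct3D). The sketch below is purely illustrative; the interface and backend names are invented and the backends are stubs rather than real API calls:

```cpp
#include <iostream>
#include <memory>
#include <string>

// Hypothetical rendering interface the engine codes against. The GPU
// underneath can be "DX11-class" hardware either way; only the API used
// to drive it changes from platform to platform.
struct IRenderBackend {
    virtual ~IRenderBackend() = default;
    virtual std::string name() const = 0;
    virtual void drawFrame() = 0;
};

// Stub standing in for a Direct3D 11 path (Windows/Xbox-style).
struct D3D11Backend : IRenderBackend {
    std::string name() const override { return "Direct3D 11"; }
    void drawFrame() override { std::cout << "draw via D3D11 stub\n"; }
};

// Stub standing in for a console's platform-native graphics API.
struct ConsoleNativeBackend : IRenderBackend {
    std::string name() const override { return "platform-native API"; }
    void drawFrame() override { std::cout << "draw via native stub\n"; }
};

int main() {
    // Same engine code either way; the platform picks the backend.
    std::unique_ptr<IRenderBackend> backend = std::make_unique<ConsoleNativeBackend>();
    std::cout << "Rendering through: " << backend->name() << "\n";
    backend->drawFrame();
    return 0;
}
```

So a card being "DirectX 11 enabled" really describes its feature level; which API sits on top of it is a separate, platform-specific decision.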



slowmo said:
Rath said:
logic56 said:
Mnementh said:
logic56 said:
DirectX is what's killing these rumors for me. I think the people making this crap up should really find out exactly what it is.


A DirectX11-enabled card. That doesn't mean Sony have to offer a DirectX11-API to developers.

The DirectX platform is fully owned by MS; they couldn't offer it to developers even if they wanted to.

I dunno, Microsoft wants to drive the adoption of DirectX as the standard. They might just be happy for Sony to use it in their console.

Of course Sony probably wouldn't want to as most things I've read have stated that OpenGL is superior.

5 years ago that might have been true; it just isn't anymore. OpenGL does offer some marginal benefits, but without the backing of any of the graphics card manufacturers it's dead in the water, except perhaps for the mobile and Apple markets.


It's still supported by both Nvidia and ATI though isn't it? I mean major games like RAGE are still being built with it.




Have we not yet come to the conclusion that this is made-up BS? Because it's obviously made-up BS... why is this thread still going?



Rath said:
slowmo said:
Rath said:
logic56 said:
Mnementh said:
logic56 said:
DirectX is what's killing these rumors for me. I think the people making this crap up should really find out exactly what it is.


A DirectX11-enabled card. That doesn't mean Sony have to offer a DirectX11-API to developers.

The DirectX platform is fully owned by MS; they couldn't offer it to developers even if they wanted to.

I dunno, Microsoft wants to drive the adoption of DirectX as the standard. They might just be happy for Sony to use it in their console.

Of course Sony probably wouldn't want to as most things I've read have stated that OpenGL is superior.

5 years ago that might have been true; it just isn't anymore. OpenGL does offer some marginal benefits, but without the backing of any of the graphics card manufacturers it's dead in the water, except perhaps for the mobile and Apple markets.


It's still supported by both Nvidia and ATI though isn't it? I mean major games like RAGE are still being built with it.


We still support the running of 16-bit applications in Windows 7 through emulation; that doesn't mean it's a good thing to start using 16-bit apps. The OpenGL support is still there because hardware manufacturers are always slow to drop support. I suspect it is also used by 3D modelling software as a tool; it's just not as good as DirectX for games developers, as DirectX has so many better tricks up its sleeve.



slowmo said:
Rath said:
slowmo said:

5 years ago that might have been true; it just isn't anymore. OpenGL does offer some marginal benefits, but without the backing of any of the graphics card manufacturers it's dead in the water, except perhaps for the mobile and Apple markets.


It's still supported by both Nvidia and ATI though isn't it? I mean major games like RAGE are still being built with it.


We still support the running of 16-bit applications in Windows 7 through emulation; that doesn't mean it's a good thing to start using 16-bit apps. The OpenGL support is still there because hardware manufacturers are always slow to drop support. I suspect it is also used by 3D modelling software as a tool; it's just not as good as DirectX for games developers, as DirectX has so many better tricks up its sleeve.

???

I'm confused - first  you say that OpenGL offers benefits over DirectX but that it's weak because it doesn't have hardware support, then you say that DirectX offers benefits over OpenGL and both have hardware support.

Do you actually know what you're talking about?



zero129 said:
Staude said:
pezus said:
zero129 said:
pezus said:
zero129 said:
pezus said:
Staude said:
pezus said:

It could be true if he picked a year like 2011 maybe...but 2008? Hell no. PC devs optimize for the most popular graphics cards you know.


Considering a PC's range can go from the best graphics card in the world with the best CPU, to the worst with the worst, to the best GPU with a shit CPU, vice versa, CrossFire/SLI, RAID hard drives, SSDs... etc etc. It is simply EXTREMELY unlikely that any given game is optimised fully to run on one person's given system. It can have a great scalable engine, but even so, if the developers have a specific set of hardware to develop for, they will be able to get more - the most possible - out of that given set of hardware, which can yield great results such as Uncharted :P Just to use a mainstream brand. But that's just the way it is. That's one of the reasons why console games look "so good"... Another reason is that they have a limited number of things running in the background, which leaves more resources for the game itself (though newer consoles have more stuff running, it's still much less than a PC would :P).

Yeah but then again, the graphics intensive console games have much bigger budgets than most PC games do. But when we see a dev truly use PC power to its fullest (Crysis) we see a game which not even console games from 2012 and forward can beat in graphics. But we are talking gaming PCs here, and devs optimize for the most popular cards and CPUs like I said. It's not that much to optimize for.

Add The Witcher 2 as well :D, nothing even comes close on consoles (yes, including the 360 port).

Of course! I was just demonstrating that even a PC game from 2007 beats every single console game.

Good point :D. It sure does feel good to be playing the next gen already, pezus :D.

Sure does.

I never said they could match the best on the PC market, but a game on the console beats the living hell out of the same game on PC on a similar rig, due to the facts I stated.

I have a pretty badass PC myself (I run Battlefield 3 in 1080p on max), but comparatively, if you're able to optimize for specific hardware, you'll always be able to achieve more. That's simply not possible for PCs because of the scalability you have to achieve :P

Good to see you're also already in the next gen :D. Now hurry and release that Killer Moles game xD.

:P Trying !.. It's taking a while though.

And yes, I told myself I had to get an awesome PC for rendering... The fact that it was also for gaming is just a minor detail :P cough

 

@Rath, there was some talk that AMD was dropping OpenGL... I'm not sure if they actually did, but it is as competent as DirectX; they just have different things they do. It would be easier for them to drop it, though, because Microsoft directly supports and develops DirectX, so it would save them a lot in R&D.

 

OpenGL is an open standard though, so in theory, anyone can implement or contribute to it :P



Check out my game about moles ^