
Forums - Sony Discussion - Rumour: PS4 to feature AMD GPU a gen ahead of nextbox

I call BS, but if that's the case then that's the console I'm going with. Unless it's released a year later, in which case I don't think I can be bothered to wait.



Andrespetmonkey said:

Question for people who know their stuff: Let's say this rumour is right, and along with the GPU mentioned here it has 3-4 GB of RAM (2 GB VRAM, 1-2 GB main RAM) and a quad-core CPU at around 3.2 GHz. Will this be able to run BF3 on its highest settings at 1080p and 60fps with AA on? (You can determine how much AA.)

Hmmm interesting question...

Based on hardware specs alone I would say no, but a solid 30fps would be attainable (which is good enough, IMO).

There are also quite a few other things to consider; we simply don't have enough info to go on. What kind of tweaks will the GPU receive? What exact CPU would it have? Core count and clock speed mean little compared to the underlying architecture. Exact memory specs are a mystery as well. You also have to take into account that the game will be much better optimized for the hardware, which can yield massive performance boosts.

Only if we take all the variables into account and assume the most positive scenario would I say yes.
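For scale, the raw pixel-fill gap between the asked-for 1080p/60 target and a typical console 720p/30 target can be sketched in a few lines. This is illustrative arithmetic only, not a performance model; real frame cost also depends on shaders, memory bandwidth, and engine optimization:

```python
# Raw pixel output the GPU must produce per second at a given
# resolution and framerate (illustrative arithmetic only).

def pixels_per_second(width, height, fps):
    """Pixels rendered per second at the given resolution and framerate."""
    return width * height * fps

target = pixels_per_second(1920, 1080, 60)    # the asked-for 1080p/60 target
baseline = pixels_per_second(1280, 720, 30)   # a typical console 720p/30 target

print(f"1080p/60: {target:,} px/s")     # 124,416,000 px/s
print(f"720p/30:  {baseline:,} px/s")   # 27,648,000 px/s
print(f"Ratio:    {target / baseline}x")  # 4.5x raw fill
```

That's a 4.5x jump in raw output before any AA is applied; multisampled AA pushes the shading and bandwidth cost higher still, which is why a locked 30fps looks like the safer bet.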



Intel Core i7 3770K [3.5GHz]|MSI Big Bang Z77 Mpower|Corsair Vengeance DDR3-1866 2 x 4GB|MSI GeForce GTX 560 ti Twin Frozr 2|OCZ Vertex 4 128GB|Corsair HX750|Cooler Master CM 690II Advanced|

disolitude said:
That's great, but which GPUs are we talking about exactly? I'd take a 6970 over a 7750 any day :)

Yeah, if they said it was a 7450, everyone would still be foaming at the mouth even though the card is a piece of shit.



Snesboy said:
disolitude said:
That's great, but which GPUs are we talking about exactly? I'd take a 6970 over a 7750 any day :)

Yeah, if they said it was a 7450, everyone would still be foaming at the mouth even though the card is a piece of shit.

Well, it makes sense...

7450 > 6970... for some VGChartz users, that alone is bragging rights right there.



disolitude said:
Snesboy said:
disolitude said:
That's great, but which GPUs are we talking about exactly? I'd take a 6970 over a 7750 any day :)

Yeah, if they said it was a 7450, everyone would still be foaming at the mouth even though the card is a piece of shit.

Well, it makes sense...

7450 > 6970... for some VGChartz users, that alone is bragging rights right there.

And it makes me facepalm every time. I'd take a 4870 over a 7450 any day. Or even a 7770.



Jega said:

Not by 2012, but by 2016 it's possible. A developer has to make a game that will run at that resolution, and you will have to have a TV capable of 2160p to see it; by 2015 those TVs should be on the market.

Just like in 2006 when the PS3 was released, many people could not afford an HDTV, but by 2008-2009 more people had HDTVs.

@bolded:  Adoption rates span several years from market introduction... 

HDTVs (as well as the first HD broadcasts) were first available to consumers in the US in 1998, but it wasn't until several years later that they became widely adopted into households, which is why the Dreamcast, PS2, GC and Xbox weren't designed to support 720/1080 resolution.  It wasn't until 2005, when HDTVs were far more prevalent in homes, that the Xbox 360 supported HD resolution.

If 2160p TVs are introduced into the market in 2014/15, it will be at least 2020 before they are purchased en masse by consumers.  By that time we will be into/expecting the 9th generation of game consoles. 

And this is regarding movies and content viewing, not games, which are a far more taxing beast.  As it is, the difference between 720p and 1080p is over a million pixels (720p = 921,600 pxls / 1080p = 2,073,600 pxls).  There is a lot more rendering for that increase in resolution.  Even just a 2K HDTV (2,048 x 1,536p) = 3,145,728 pxls.  The res you're talking about (3,840 x 2,160p) is a whopping 8,294,400 pxls.  Needless to say, it would be absurd to expect games on consoles to run in that res anytime soon.
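The pixel counts quoted above check out, and a couple of lines make the scaling explicit (plain arithmetic, no performance claims):

```python
# Pixel counts for the resolutions discussed above.
resolutions = {
    "720p  (1280x720)":  1280 * 720,    # 921,600
    "1080p (1920x1080)": 1920 * 1080,   # 2,073,600
    "2K    (2048x1536)": 2048 * 1536,   # 3,145,728
    "2160p (3840x2160)": 3840 * 2160,   # 8,294,400
}

for name, count in resolutions.items():
    print(f"{name}: {count:,} pixels")

# 2160p means shading 4x the pixels of 1080p and 9x the pixels of 720p.
print(resolutions["2160p (3840x2160)"] // resolutions["1080p (1920x1080)"])  # 4
print(resolutions["2160p (3840x2160)"] // resolutions["720p  (1280x720)"])   # 9
```

So a native-2160p console game would have to shade four 1080p frames' worth of pixels every frame, which supports the point that expecting it anytime soon is unrealistic.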



Viper1 said:
crissindahouse said:
Viper1 said:

Again, you are missing the point I made above.  TDP for an HD 7950 is 180 watts.  TDP for the X1800 XL the Xenos is based on was 60 watts. 

The X1800 XL was also very mid-range.  It would be comparable to the HD 7750 of today.

Not really sure about that, but I believe I always read that the equivalent of the 360's GPU was the X1800 XT, with a few parts from the 1900? (But I really don't know, it's just what I've read a few times.)

BTW, the XL came out for 449€ and the XT for 549€ here at the end of 2005. Not speaking about wattage, only about price: people who think a card costing 500€ right now (like the GTX 680) means a console would cost above 1000€ at the end of 2013 should realize how expensive the cards were in 2005 when the Xbox 360 came out.

Look at the specs for the XL and Xenos again.  They match up much more closely than the XT does.   The other architecture it incorporated was from the R600 series (HD 2xxx), by way of the unified shader.

As for price, you'll notice I don't touch on price in my debate.  Only the TDP.

sergiodaly said:

The XL and XT had the same GPU chip, only clocked at different speeds... Xenos had 9 times more MOperations/s of processing power than the XL/XT chip due to the addition of extra processing units... it had a higher clock speed (700 against the 500 of the XL or the 625 of the XT), so the TDP was almost certainly around 100 watts because of those modifications. The XT had a 105 watt TDP.

If the 7850 has a 130 watt TDP, that could be the base for the Orbis GPU as well... but my guess is still a Tahiti-based GPU.

We will see soon enough!

Sergio, the Xenos is clocked at 500 MHz... just like the X1800 XL.  The Xenos has a 700 MHz memory clock, which might be what you are thinking of, but that's actually a downclock from the XL's 1,000 MHz memory clock.

And expecting any console to launch with a 250 watt GPU is nuts.   You don't get a 10-year life span at 250 watts.

Are you talking about the console as a whole? The PS3 launched with a 380 watt PSU:

http://en.wikipedia.org/wiki/PlayStation_3_hardware



disolitude said:
Snesboy said:
disolitude said:
That's great, but which GPUs are we talking about exactly? I'd take a 6970 over a 7750 any day :)

Yeah, if they said it was a 7450, everyone would still be foaming at the mouth even though the card is a piece of shit.

Well, it makes sense...

7450 > 6970... for some VGChartz users, that alone is bragging rights right there.

>_>

And you people wonder why PC gamers are known as the elitist pricks of gaming forums.

So some people don't understand GPU gens. Why must you mock them? WHY!?



Play4Fun said:
disolitude said:
Snesboy said:
disolitude said:
That's great, but which GPUs are we talking about exactly? I'd take a 6970 over a 7750 any day :)

Yeah, if they said it was a 7450, everyone would still be foaming at the mouth even though the card is a piece of shit.

Well, it makes sense...

7450 > 6970... for some VGChartz users, that alone is bragging rights right there.

>_>

And you people wonder why PC gamers are known as the elitist pricks of gaming forums.

So some people don't understand GPU gens. Why must you mock them? WHY!?


'Cause they come into threads like this without knowing any details and say "7000 series GPU? Impressive!"



Play4Fun said:
disolitude said:
Snesboy said:
disolitude said:
That's great, but which GPUs are we talking about exactly? I'd take a 6970 over a 7750 any day :)

Yeah, if they said it was a 7450, everyone would still be foaming at the mouth even though the card is a piece of shit.

Well, it makes sense...

7450 > 6970... for some VGChartz users, that alone is bragging rights right there.

>_>

And you people wonder why PC gamers are known as the elitist pricks of gaming forums.

So some people don't understand GPU gens. Why must you mock them? WHY!?

Because a Google search and a quick look at a table would educate them.

If they can't do that, then they're worth mocking!