
IGN: Xbox 720 Will Be Six Times as Powerful as Current Gen

greenmedic88 said:
zarx said:

Sorry to break it to you, but 1440p is not a TV resolution, so it won't be supported, and 4K, even for upscaling, is not possible: HDMI 1.4 only supports 4K video at up to 24fps, and I don't think many people will be happy if games are limited to 24fps... 1080p will be the max resolution next gen unless consoles have two HDMI ports, which is just not going to happen. There is currently no HDMI spec planned that supports higher frame rates, so a final spec is several years off.

Well, I guess Sony may try to push their 4K TVs via DisplayPort and a good internal upscaler, but I doubt it.

Nope, 1080p will have to do for consoles for the next 7-11 years.

Plus, the 1080p TV standard isn't going anywhere until the broadcast standards change. Seeing as they're currently set at 720p/1080i (and most people are keenly aware of how long that transition took), it's pretty hard to imagine broadcast standards changing again before that 7-11 years you estimated. 

Realistically, we'd probably see consoles that can push out 1080p video to 2 or 3 HDTVs (separate signals, not mirrored) before seeing a console that utilizes a 4K display. And even then, I don't know who would have such a setup in their living room, barring the same type of consumer currently gaming on a 2 or 3 monitor setup for PC games. 

Considering that the next generation of consoles probably won't even be able to render a stereoscopic 1080p signal at 60fps, talk of 4k display support seems even more pointless. 


I was implying that the gen after next will have to worry about 4k



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk, and The Witcher 3!
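
A quick back-of-the-envelope check on zarx's HDMI 1.4 point above. The numbers below are my own assumptions (24-bit colour, roughly 20% blanking overhead, and HDMI 1.4's ~10.2 Gbit/s aggregate TMDS rate with 8b/10b encoding), not exact spec timings, but they show roughly why 4K at 24-30fps squeezes through while 4K60 does not:

# Rough HDMI 1.4 bandwidth check (assumed figures, not exact spec timings).

LINK_GBPS = 10.2                       # HDMI 1.4 aggregate TMDS rate
EFFECTIVE_GBPS = LINK_GBPS * 8 / 10    # 8b/10b encoding leaves ~8.16 Gbit/s for data
BITS_PER_PIXEL = 24                    # 8 bits per RGB channel
BLANKING = 1.20                        # rough allowance for blanking intervals

def required_gbps(width, height, fps):
    """Approximate link bandwidth needed for a given video mode."""
    return width * height * fps * BITS_PER_PIXEL * BLANKING / 1e9

modes = [("1080p60", (1920, 1080, 60)),
         ("4K24",    (3840, 2160, 24)),
         ("4K30",    (3840, 2160, 30)),
         ("4K60",    (3840, 2160, 60))]

for name, mode in modes:
    need = required_gbps(*mode)
    verdict = "fits" if need <= EFFECTIVE_GBPS else "does NOT fit"
    print(f"{name}: ~{need:.1f} Gbit/s needed -> {verdict} in ~{EFFECTIVE_GBPS:.1f} Gbit/s")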

zarx said:

I was implying that the gen after next will have to worry about 4k

What, in 2020? Won't we all be flying around in anti-gravity saucers and playing cloud based video games on helium powered servers by then?



greenmedic88 said:
zarx said:

I was implying that the gen after next will have to worry about 4k

What, in 2020? Won't we all be flying around in anti-gravity saucers and playing cloud based video games on helium powered servers by then?


And judging by the current rate of phone resolution increases, we'll be playing our games streamed to 4K-capable phones/sunglasses.

Oh, and your phone will be embedded into your forearm. 



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk, and The Witcher 3!

PRICING PRICING.

Oh shit, what if the thing about the 720 being 20% more powerful than the Wii U is true, with the PS4 continuing to lead in terms of console power?

How much will they actually cost? I dare not imagine.



happydolphin said:
greenmedic88 said:
crissindahouse said:
Solid-Stark said:
Not sure why people are disappointed by "6x". That's perfect for games at native 1080p with decent settings, not to mention a price of $350-$400.


I will pay some thousand dollars for one generation; I would prefer to pay $100 more for much better AI, draw distance, blah blah.

And on a similar sarcastic note, there's a name for the minority niche consumer willing to pay $1,000 for hardware to play games the way developers intended; they're called PC gamers. 

I guess it's all a question of bang for your buck, and whether or not the big 3 will make the right decisions in offering both a good range of processing power given today's TV tech, and a competitive price. I think the cube was a good yardstick to compare to. It was high-end, and it was affordable. That's all we're asking for really.

Except that, while the Gamecube was a high-performance system, it was fairly outdated hardware that achieved high performance by being very efficient and by developers optimizing engines for it.

People (often) whine because the hardware some of these systems are made from isn't as high performance as the PC components they can buy, but they never seem to consider that the PC is designed to be a modular platform and sacrifices a lot of efficiency to be modular, and (due to the massive number of combinations of PC components) developers never optimize their engines to take advantage of particular hardware.

Beyond this, there are certain features that have been added to GPUs to make them more efficient since the PS3 and XBox 360 launched. Tessellation alone should translate into more detailed environments and longer draw distances while rendering the same number of polygons. When you then add twice the detail and twice the processing power for effects while rendering at 1080p@60fps, there should be a nice boost in quality.

 

I know a large portion of my post wasn't really directly related to your post, but I think most people are focused on the "PS3" or "XBox" design methodology and tend to ignore how the Gamecube was built; and making a system more efficient rather than more powerful benefits everyone.
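
For what it's worth, here is a rough sketch of how a "6x" budget gets eaten by resolution and frame rate alone, under the big simplifying assumption that GPU cost scales linearly with pixels per second; the 720p30 baseline and the other numbers are my own examples, not anything from the IGN piece:

# Rough split of a hypothetical "6x the 360" GPU budget, assuming cost
# scales linearly with pixels per second (ignores tessellation, memory
# bandwidth, CPU, etc.). Baseline numbers are my own examples.

def pixels_per_second(width, height, fps):
    return width * height * fps

POWER_MULTIPLIER = 6.0                        # the rumoured "6x"
baseline = pixels_per_second(1280, 720, 30)   # a common 360-era target
target_60 = pixels_per_second(1920, 1080, 60)
target_30 = pixels_per_second(1920, 1080, 30)

cost_1080p60 = target_60 / baseline           # 2.25x pixels * 2x fps = 4.5x
cost_1080p30 = target_30 / baseline           # 2.25x

print(f"720p30 -> 1080p60 costs ~{cost_1080p60:.2f}x by itself,")
print(f"leaving ~{POWER_MULTIPLIER / cost_1080p60:.2f}x for per-pixel detail and effects.")
print(f"At 1080p30 instead, ~{POWER_MULTIPLIER / cost_1080p30:.2f}x is left over.")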



dividePower said:
PRICING PRICING.

Oh shit, what if the thing about the 720 being 20% more powerful than the Wii U is true, with the PS4 continuing to lead in terms of console power?

How much will they actually cost? I dare not imagine.

1000 dalla!! Some pages have already listed the price, so that must mean it's true :O



    R.I.P Mr Iwata :'(

HappySqurriel said:
happydolphin said:
greenmedic88 said:
crissindahouse said:
Solid-Stark said:
Not sure why people are disappointed by "6x". That's perfect for games at native 1080p with decent settings, not to mention a price of $350-$400.


I will pay some thousand dollars for one generation; I would prefer to pay $100 more for much better AI, draw distance, blah blah.

And on a similar sarcastic note, there's a name for the minority niche consumer willing to pay $1,000 for hardware to play games the way developers intended; they're called PC gamers. 

I guess it's all a question of bang for your buck, and whether or not the big 3 will make the right decisions in offering both a good range of processing power given today's TV tech, and a competitive price. I think the cube was a good yardstick to compare to. It was high-end, and it was affordable. That's all we're asking for really.

Except that, while the Gamecube was a high-performance system, it was fairly outdated hardware that achieved high performance by being very efficient and by developers optimizing engines for it.

People (often) whine because the hardware some of these systems are made from isn't as high performance as the PC components they can buy, but they never seem to consider that the PC is designed to be a modular platform and sacrifices a lot of efficiency to be modular, and (due to the massive number of combinations of PC components) developers never optimize their engines to take advantage of particular hardware.

Beyond this, there are certain features that have been added to GPUs to make them more efficient since the PS3 and XBox 360 launched. Tessellation alone should translate into more detailed environments and longer draw distances while rendering the same number of polygons. When you then add twice the detail and twice the processing power for effects while rendering at 1080p@60fps, there should be a nice boost in quality.

 

I know a large portion of my post wasn't really directly related to your post, but I think most people are focused on the "PS3" or "XBox" design methodology and tend to ignore how the Gamecube was built; and making a system more efficient rather than more powerful benefits everyone.

I remember reading a lot about this back when Nintendo used the Gekko CPU and Flipper GPU. The architecture was heavily adapted to these tailored pieces of hardware to make the Cube as awesome as it was. Hopefully Nintendo is doing the same here, but it is still nice to compare baselines (say, one graphics chip's baseline spec over another), wouldn't you agree? Or is there even variability here, where a better GPU could end up less tailored than a weaker one?



happydolphin said:
HappySqurriel said:
happydolphin said:
greenmedic88 said:
crissindahouse said:
Solid-Stark said:
Not sure why people are disappointed by "6x". That's perfect for games at native 1080p with decent settings, not to mention a price of $350-$400.


I will pay some thousand dollars for one generation; I would prefer to pay $100 more for much better AI, draw distance, blah blah.

And on a similar sarcastic note, there's a name for the minority niche consumer willing to pay $1,000 for hardware to play games the way developers intended; they're called PC gamers. 

I guess it's all a question of bang for your buck, and whether or not the big 3 will make the right decisions in offering both a good range of processing power given today's TV tech, and a competitive price. I think the cube was a good yardstick to compare to. It was high-end, and it was affordable. That's all we're asking for really.

Except that, while the Gamecube was a high-performance system, it was fairly outdated hardware that achieved high performance by being very efficient and by developers optimizing engines for it.

People (often) whine because the hardware some of these systems are made from isn't as high performance as the PC components they can buy, but they never seem to consider that the PC is designed to be a modular platform and sacrifices a lot of efficiency to be modular, and (due to the massive number of combinations of PC components) developers never optimize their engines to take advantage of particular hardware.

Beyond this, there are certain features that have been added to GPUs to make them more efficient since the PS3 and XBox 360 launched. Tessellation alone should translate into more detailed environments and longer draw distances while rendering the same number of polygons. When you then add twice the detail and twice the processing power for effects while rendering at 1080p@60fps, there should be a nice boost in quality.

 

I know a large portion of my post wasn't really directly related to your post, but I think most people are focused on the "PS3" or "XBox" design methodology and tend to ignore how the Gamecube was built; and making a system more efficient rather than more powerful benefits everyone.

I remember reading a lot about this back when Nintendo used the Gekko CPU and Flipper GPU. The architecture was heavily adapted to these tailored pieces of hardware to make the Cube as awesome as it was. Hopefully Nintendo is doing the same here, but it is still nice to compare baselines (say, one graphics chip's baseline spec over another), wouldn't you agree? Or is there even variability here, where a better GPU could end up less tailored than a weaker one?

There could even be a lot of variability in the GPU ...

While it may not sound like a lot of memory, it is possible that these GPUs could have 32 to 128 MB of high-speed integrated memory, which could have a significant impact on performance. They can also replace the bus, add instructions, add stream processors, modify the clock rate, etc., all of which will have an impact on performance.
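
To put those 32 to 128 MB figures in perspective, here is some quick framebuffer arithmetic. The layout is my own assumption (32-bit colour plus 32-bit depth/stencil, no deferred-rendering G-buffer); for reference, the 360's eDRAM was 10 MB, which is why 720p with MSAA had to be tiled:

# Approximate render-target footprint at 1080p, assuming 4 bytes of
# colour and 4 bytes of depth/stencil per sample (my assumption, not a
# published spec for any next-gen console).

WIDTH, HEIGHT = 1920, 1080
BYTES_PER_SAMPLE = 4 + 4      # colour + depth/stencil
MB = 1024 * 1024

def framebuffer_mb(msaa_samples=1):
    """Colour + depth footprint at a given MSAA level, in MB."""
    return WIDTH * HEIGHT * msaa_samples * BYTES_PER_SAMPLE / MB

for samples in (1, 2, 4):
    print(f"1080p with {samples}x MSAA: ~{framebuffer_mb(samples):.1f} MB")
# ~15.8 MB, ~31.6 MB and ~63.3 MB respectively, so 32-128 MB of fast
# integrated memory would comfortably hold what the 360 had to tile.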



If it's six times as powerful as the 360, then shouldn't it be called the XBox 2160?

I'm not going to look through 22 pages to see if anyone else told this joke.



Proud member of the SONIC SUPPORT SQUAD

Tag "Sorry man. Someone pissed in my Wheaties."

"There are like ten games a year that sell over a million units."  High Voltage CEO -  Eric Nofsinger

Zappykins said:

Summary: “Avatar was rendered on supercomputers.” Yes, I realize that, but at 1080p a console would not be rendering at anywhere near the resolution of Avatar.

And 4K TVs are coming out, and 1440p monitors are already out, and have been for a couple of years. It would be sad to have a console that is already rather dated before it even hits its first major price break.

Not too many people had 720p TVs when the Xbox 360 came out. Yes, they were out, but they were more of an ‘elite’ TV, like 4K will be in a few years. And all the X360 games are developed at 720p.

Look, I'm not saying that the Xbox 720 will have 'Cameron's Avatar' level graphics. Although, with Avatar having been rendered several years ago, ‘Moore's Law’ means things have improved a lot since that time. And the Xbox 360 is the equivalent of a supercomputer from less than two decades ago. And if it comes out in 2013, that's nearly 5 years since Avatar was rendered.

I'm just pointing out that there is plenty of room to improve at 1080p, if that's all it can do. But there also seems to be a tradition that the target should be the new TVs coming out.

Have computers improved in the last five years?  Sure they have.  But I don't think you have any idea what it took to do the CGI in that movie.

"Cameron told audience each frame of finished film takes 30-50 hrs to render, then double that up for 3D."

http://www.slashfilm.com/early-buzz-24-minutes-of-james-camerons-avatar-screened/

"They had 10,000 quad processing machines just in the render farms alone — that's 40,000 processors"

http://www.thesun.co.uk/sol/homepage/showbiz/film/2760950/Avatar-week-in-The-Sun-3D-secrets-behind-the-300m-movie.html

We are going to be nowhere near being able to do that in real time for years to come.
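
Just to put a number on "nowhere near": taking the quoted 30-50 hours per frame at face value as wall-clock time (and ignoring how the render farm parallelised the work, which is a big simplification on my part), the gap to a 30fps game loop looks like this:

import math

# Gap between Avatar-style offline rendering and a real-time game loop,
# using the 30-50 hours per frame figure quoted above and my own rough
# assumptions everywhere else.

SECONDS_PER_FRAME_OFFLINE = 40 * 3600   # middle of the 30-50 hour range
SECONDS_PER_FRAME_GAME = 1 / 30         # a 30fps console target

gap = SECONDS_PER_FRAME_OFFLINE / SECONDS_PER_FRAME_GAME
doublings = math.log2(gap)

print(f"Speed-up needed: ~{gap:,.0f}x")                  # ~4.3 million times
print(f"At one doubling every 2 years, that is ~{doublings:.0f} doublings,")
print(f"i.e. roughly {2 * doublings:.0f} years of Moore's Law on comparable hardware.")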



Switch Code: SW-7377-9189-3397 -- Nintendo Network ID: theRepublic -- Steam ID: theRepublic

Now Playing
Switch - Super Mario Maker 2 (2019)
Switch - The Legend of Zelda: Link's Awakening (2019)
Switch - Bastion (2011/2018)
3DS - Star Fox 64 3D (2011)
3DS - Phoenix Wright: Ace Attorney (Trilogy) (2005/2014)
Wii U - Darksiders: Warmastered Edition (2010/2017)
Mobile - The Simpson's Tapped Out and Yugioh Duel Links
PC - Deep Rock Galactic (2020)