
Is 900p okay? Is Xbox One a worthy 8th gen console now?

John2290 said:
AnthonyW86 said:
H3ADShOt3 said:
This whole generation is underpowered; 1080p at 60fps should've been the minimum for these machines, and they should also have been capable of 4K resolution. I wasn't too happy when I saw the performance of these consoles at first, but I've learned to accept it to stay a console gamer.

This argument just isn't true. Although this generation started out with lower-end hardware, right now the gap between the PS4 (and even the Xbox One) and current PC hardware is actually smaller than the gap between the PS360 and PCs in, say, 2008.

In 2008 you could buy an HD48xx series card for between $200 and $300, around five times faster than the PS360. You could even buy an HD4870 X2 for $399, which could achieve 2,000 GFLOPS. That's PS4-level power, in 2008!

So what do we have right now? Even extremely high-end cards of over $600, which use more power than a complete PS4 and Xbox One combined, are only around four times faster than current-gen consoles. So even with much more expensive cards available, the gap has actually shrunk, while current-gen consoles are cheaper and actually profitable.
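To put rough numbers on that, here's a back-of-the-envelope sketch; the GFLOPS figures below are assumed ballpark values for illustration, not exact specs:

    # Assumed ballpark GFLOPS figures, for illustration only
    XENOS_360    = 240     # Xbox 360 GPU, 2005
    HD4870_2008  = 1200    # ~$300 PC card in 2008
    PS4_2013     = 1843    # PS4 GPU, 2013
    HIGHEND_2015 = 7000    # assumed ~$650 PC card in 2015

    print(HD4870_2008 / XENOS_360)   # ~5.0x console-to-PC gap in 2008
    print(HIGHEND_2015 / PS4_2013)   # ~3.8x gap now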

Are you comparing card prices to the consoles' cards or to the consoles themselves? It takes more than a card to make a machine. If you are making this argument it has to be, at the very most, a 100-euro card in 2008. Try finding the best card you can for the equivalent price of the PS4's card in 2013 and tell me if you find anything close to it, because you won't.

I was comparing the difference between console power and PC power across the two generations. The prices are only there to show that not only is the gap between consoles and high-end PCs smaller this gen, but high-end cards have become more expensive as well. And that while the PS4/XBone are cheaper to produce than the X360/PS3 were.



bdbdbd said:
captain carot said:

Yes and no.

 

That's about the right timeframe, but think about what you could actually achieve then. VGA came out in 1987, with games starting to support it only very slowly until 1990. Standardized SVGA arrived around 1989/1990.

Not to mention the practical capabilities of video cards back then. I remember my first two monitors topping out at 1024x768 when 2D games were usually still 320x200-320x240 and most 3D games' 'high-res modes' were 320x400. ^^

That slowly changed around 1994/95. There was a huge difference between theoretical capability, like SXGA offering 1280x1024 with 8-bit color depth, and what could actually be used for games.

If they had DisplayPort 1.3, current graphics cards could do 8K on paper. But they wouldn't have the power to run newer games at 8K.
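A quick sanity check of the 8K-on-paper claim; this sketch ignores blanking intervals and assumes uncompressed 24-bit RGB:

    # DisplayPort 1.3: 4 lanes x 8.1 Gbit/s, with 8b/10b line encoding
    effective_gbps = 4 * 8.1 * (8 / 10)        # ~25.9 Gbit/s usable

    def needed_gbps(w, h, hz, bits_per_px=24):
        return w * h * hz * bits_per_px / 1e9  # ignores blanking overhead

    print(needed_gbps(7680, 4320, 30))  # ~23.9 -> 8K30 fits
    print(needed_gbps(7680, 4320, 60))  # ~47.8 -> 8K60 needs subsampling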

People should remember that at the time the Voodoo 1 came out, 320x200/240 was still usual for 3D games, and fluid 640x480 with 16-bit color was a breakthrough. With the Voodoo 2 it was 800x600.

And at the time the Voodoo 2, the Riva TNT and so on came out, the Dreamcast was released in Japan.

That was a totally different thing from today.

 

@Pemalite:
The 386 came out in 1986 (production-wise), and VGA cards a bit later. So you won't find 1985 PCs that can run King's Quest, while the NES released in 1983, though Japan-only. :-p


While I agree with your point that you didn't get what was technically possible at the time, in the end this still boils down to what could be done with high-end hardware. You didn't really see games start taking advantage of PC hardware before the gaming-centric home computers started to disappear.

On consoles the cost-efficiency is on a whole different level, but on PC you're able to get more power if you're willing to pay for it. I think an editor at a PC magazine commenting on the Xbox 360 launch back in the day nailed it: he said he didn't understand the fuss about the 360 being able to draw hundreds of characters on screen that all look the same, when on a PC you can draw hundreds of characters that all look different, though a GPU capable of doing that costs as much as a 360.

What I really love about console hardware are the technical tweaks used to boost on-screen performance. The PS2 had insane VRAM bandwidth, the GC's CPU used its L2 cache as a buffer to eliminate empty clock cycles, the Dreamcast didn't draw off-screen (or occluded) polygons, the 360's CPU was designed for low internal latencies, the Megadrive's DMA controller was interesting enough to get its own marketing term, and the SNES had a number of cheap special-purpose processors to boost its weak hardware, to name a few.
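As a toy illustration of that Dreamcast trick, here's the idea behind PowerVR-style deferred shading: find the nearest surface per pixel first, then shade each pixel exactly once, so occluded polygons cost nothing. The covers, depth_at and shade helpers are hypothetical, not any real API:

    def shade_tile(tile_pixels, polygons):
        framebuffer = {}
        for px in tile_pixels:
            covering = [p for p in polygons if p.covers(px)]   # who touches this pixel?
            if covering:
                nearest = min(covering, key=lambda p: p.depth_at(px))
                framebuffer[px] = nearest.shade(px)            # shaded once, zero overdraw
        return framebuffer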

 

Yeah, the Dreamcast used PowerVR tech, which also turned up in PC graphics cards called Kyro. Don't know if you remember them, but a Kyro without hardware T&L could be faster than a GeForce. The Kyro was PowerVR Series 3, though.

The thing about the PC as a gaming platform is that it needed about 15 years to really get on top. It had its pros, like already-decent 3D performance, at least with an FPU. But back then it was expensive and outdated quickly. That has basically changed within the last ten years: you can still play every game with a $100/€100 graphics card from 2010/11.

On the other side, console hardware is about maxing out what you have. Always has been. I remember seeing screenshots, not even videos, of Donkey Kong Country and at first thinking that's a next-gen game. Like many did back then. At that time I already played on PC as well.

Now I've come to a point where watching what AMD or Nvidia are doing is interesting, but I just don't care anymore about the highest resolution or ultra-high quality settings.



captain carot said: Yeah, the Dreamcast used PowerVR tech, which also turned up in PC graphics cards called Kyro. Don't know if you remember them, but a Kyro without hardware T&L could be faster than a GeForce. The Kyro was PowerVR Series 3, though.

The thing about the PC as a gaming platform is that it needed about 15 years to really get on top. It had its pros, like already-decent 3D performance, at least with an FPU. But back then it was expensive and outdated quickly. That has basically changed within the last ten years: you can still play every game with a $100/€100 graphics card from 2010/11.

On the other side, console hardware is about maxing out what you have. Always has been. I remember seeing screenshots, not even videos, of Donkey Kong Country and at first thinking that's a next-gen game. Like many did back then. At that time I already played on PC as well.

Now I've come to a point where watching what AMD or Nvidia are doing is interesting, but I just don't care anymore about the highest resolution or ultra-high quality settings.


One of the most obvious problems with consoles is that when hardware design starts, it's largely guesswork what tech will be available by the time the console is out. Maybe the two best examples of missing the target are the N64 and the PS3: not only were their releases delayed, the performance also ended up lower than expected.

Yes, on consoles it's about maxing out what you have, but also about balancing cost against performance. For example, if you have an efficient GPU, you either take advantage of it with a smaller amount of faster memory or go for more memory at a slower clock speed; so it's either more stuff on screen in smaller areas, or bigger areas with less happening on screen. On PC you can have the same (or "same") GPU and buy the amount of memory you need at the speed you need, so you don't have to compromise.
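The trade-off in rough numbers, as a sketch with assumed example configurations rather than any specific console's specs: memory bandwidth is roughly bus width times data rate, so a small fast pool and a big slow pool can land at the same price but very different performance.

    def bandwidth_gbs(bus_bits, gigatransfers_per_s):
        return bus_bits / 8 * gigatransfers_per_s   # bytes per transfer x rate

    print(bandwidth_gbs(256, 5.5))  # e.g. a small pool of fast GDDR5 -> 176 GB/s
    print(bandwidth_gbs(128, 2.0))  # e.g. a big pool of slow DDR3   -> 32 GB/s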

I think the PC has always been a gaming platform to an extent; it's just that the games have been so different. You may remember back in the day when we had computer games and video games: computer games were more complex and slower-paced, and video games were simple, fast-paced arcade fare. From the mid-'90s to the mid-'00s you could say FPS and RTS were the two genres that kept the PC afloat as a gaming platform. The recent rise in PC gaming's popularity seems to be driven by Flash games and, now that PC games are on consoles too, the fact that you're able to play the same games at steady framerates and high resolutions. The downside is that consoles are holding PC games back.

DKC was pretty amazing. It used the same gimmick as Mortal Kombat, pre-rendered objects (though it was made possible by the number of on-screen colours the SNES was capable of). DKC 2 was much more polished, but I think DKC was still the better-looking of the two, maybe because it attempted a more realistic world (made of plastic models).



Not the Chinese way.

Nor the Japanese way.

But we'll kick up our heels in fine Korean style.

 

Nintendo games sell only on Nintendo systems.

All of the consoles (PS4, Xbone, and Wii U) are lacking for their intended purpose. Every game should be a consistent 60fps; that's what I was expecting this gen. I mean, some games on Xbone are 720p. That is ridiculous. I still like my Xbox One, but 720p is really crazy to see in 2015. I'm glad I mainly play on PC and only use my PS4, Xbone, and Wii U for exclusives, but even then, many games don't perform as well as they should in 2015.




iLikeEggs said:
All of the consoles (PS4, Xbone, and Wii U) are lacking for their intended purpose. Every game should be a consistent 60fps; that's what I was expecting this gen. I mean, some games on Xbone are 720p. That is ridiculous. I still like my Xbox One, but 720p is really crazy to see in 2015. I'm glad I mainly play on PC and only use my PS4, Xbone, and Wii U for exclusives, but even then, many games don't perform as well as they should in 2015.


That's more of a dev issue; they prioritize resolution over fps. There will never be 60fps across the board, because devs will always strive for visual parity with PCs that will forever dwarf consoles in power.
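The arithmetic behind that trade-off is simple: GPU load scales roughly with pixel count, so dropping resolution buys frame time.

    # Pixels per frame at common resolutions
    res = {"720p": 1280 * 720, "900p": 1600 * 900, "1080p": 1920 * 1080}
    for name, px in res.items():
        print(name, px, round(px / res["1080p"], 2))
    # 900p pushes ~0.69x the pixels of 1080p, freeing roughly 30% of the
    # per-frame pixel budget for effects or a steadier framerate.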



bdbdbd said:


One of the most obvious problems with consoles is that when hardware design starts, it's largely guesswork what tech will be available by the time the console is out. Maybe the two best examples of missing the target are the N64 and the PS3: not only were their releases delayed, the performance also ended up lower than expected.

Yes, on consoles it's about maxing out what you have, but also about balancing cost against performance. For example, if you have an efficient GPU, you either take advantage of it with a smaller amount of faster memory or go for more memory at a slower clock speed; so it's either more stuff on screen in smaller areas, or bigger areas with less happening on screen. On PC you can have the same (or "same") GPU and buy the amount of memory you need at the speed you need, so you don't have to compromise.

I think the PC has always been a gaming platform to an extent; it's just that the games have been so different. You may remember back in the day when we had computer games and video games: computer games were more complex and slower-paced, and video games were simple, fast-paced arcade fare. From the mid-'90s to the mid-'00s you could say FPS and RTS were the two genres that kept the PC afloat as a gaming platform. The recent rise in PC gaming's popularity seems to be driven by Flash games and, now that PC games are on consoles too, the fact that you're able to play the same games at steady framerates and high resolutions. The downside is that consoles are holding PC games back.

DKC was pretty amazing. It used the same gimmick as Mortal Kombat, pre-rendered objects (though it was made possible by the number of on-screen colours the SNES was capable of). DKC 2 was much more polished, but I think DKC was still the better-looking of the two, maybe because it attempted a more realistic world (made of plastic models).


Just a side note:
Mortal Kombat used digitized images of actors. Don't know if you remember Primal Rage, which used digitized stop-motion miniatures. What made DKC so special, aside from the great overall design, was the number of effects Rare used back then, yet not to make the effects stand out but to get a great-looking game. Like the massive parallax scrolling in the jungle giving a great feeling of depth and scale, or the snow just looking beautiful without making you think, whoa, those FX...
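For what it's worth, the core of that parallax effect is tiny. Here's a sketch with made-up layer factors: each background layer scrolls at a fraction of the camera speed, and the differing speeds are what sell the depth.

    LAYER_FACTORS = [0.2, 0.5, 0.8, 1.0]   # distant jungle ... foreground

    def layer_offsets(camera_x):
        return [camera_x * f for f in LAYER_FACTORS]

    print(layer_offsets(100))   # [20.0, 50.0, 80.0, 100.0]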

 

As for consoles vs. PC:
Leaving aside '80s home computers, PCs only slowly became a gaming platform. Given that they started out as office-only machines and were fucking expensive (I have an old price of 10,000 Deutsche Mark for an IBM 5150 in a book) while other computers were way cheaper and more gaming-friendly, that's no wonder.

The big plus from the very beginning was modularity, via the XT bus (later 8-bit ISA) and then the AT bus (16-bit ISA). RAM and floppy drives you could usually add to other computers as well.

That, plus the IBM name back then, and later MS-DOS.

PC games at the very beginning were ports; I know very few old 'IBM only' games. Even flight sims like F-15 Strike Eagle or the then-famous Falcon were released for other systems first. You can actually trace a clear path of PCs slowly becoming mainstream and becoming gaming systems, gaining better video cards, sound cards (Sound Blaster), accelerator cards...

Then there's the games side. Home computers and PCs had one big plus from the beginning: cheap, 'large' storage, floppies and later hard disks. That was perfect for larger games like RPGs or dungeon crawlers, stuff you might otherwise have had on a university PDP mainframe.

It was a perfect fit for playing text adventures and the like. Hell, originally you had to type your way through most graphic adventures; no point-and-click.

So there's a somewhat natural reason why, in the late '70s and early '80s, computers got the slower but more complex games, and a reason why IBM clones usually didn't get the fast games at first.

That changed in the late '80s and early '90s with things like VGA, home computers dying off, PCs getting cheaper, and so on.

Games changed as well. Not only FPS and RTS, but all kinds of sims, adventures, space operas like Wing Commander or X-Wing, and all that.

 

Hardware wise:
Yes, console makers have to start planning years before release, build a system that's not too expensive, and so on. Wrong planning was an issue with the Saturn: Sega built a 2D system at first and then made a ton of additions and changes, which made the Saturn difficult for developers as well as expensive.

At the same time, looking at Steam data, it's not like all PCs used for gaming are really high-end. The thing is, PC gamers decide for themselves, wallet permitting, what kind of gaming system they build and how often they upgrade.

For consoles, again, it's also about who you cater to. Resolution, for example: it wouldn't have made sense for consoles to go 1024x768 when nobody could actually use it.

It doesn't really make sense to go for 4K if most people don't benefit either. Really benefiting from 4K requires sitting close to your screen or having a really big one.
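You can put a rough number on "close or really big"; this sketch assumes the common one-arcminute-per-pixel rule of thumb for 20/20 vision:

    import math

    def max_useful_distance_m(diagonal_inches, horizontal_px):
        width_m = diagonal_inches * 0.0254 * 16 / math.hypot(16, 9)  # 16:9 panel
        pixel_m = width_m / horizontal_px
        return pixel_m / math.tan(math.radians(1 / 60))  # 1 arcminute per pixel

    print(max_useful_distance_m(55, 1920))  # ~2.2 m: 1080p already maxed out
    print(max_useful_distance_m(55, 3840))  # ~1.1 m: 4K only pays off this close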

Now, for the 8th gen, MS definitely made mistakes in planning their whole concept, and they've since changed it. What they can't change anymore is the lack of raw GPU power.



No, it isn't okay. The PS4 is the far superior console; the Xbone wouldn't even be able to run UC4 at 720p.



The PS4 has been consistently delivering 1080p at 30fps for a while now, with very few exceptions. That's one of the major reasons players have adopted it more.



ArchangelMadzz said:


Selling $600/$700 consoles kind of ruins the point of them.


I'd like to know how Sony was able to sell the PS2, which was ridiculously more powerful than the PS1, for only $299.99, and now they can't do the same thing without the price being $699.99. I mean that, too. I'd REALLY like someone to explain that to me, because it makes no sense imo.
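Part of the answer is plain inflation; a sketch with an assumed, approximate US CPI factor:

    PS2_LAUNCH_PRICE = 299.99
    CPI_2000_TO_2015 = 1.38   # assumed ballpark cumulative US inflation
    print(PS2_LAUNCH_PRICE * CPI_2000_TO_2015)   # ~$414: $299.99 in 2015 money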



AnthonyW86 said:

right now the gap between the PS4 (and even the Xbox One) and current PC hardware is actually smaller than the gap between the PS360 and PCs in, say, 2008.


Yes, but how about 2005, when the 360 launched? I'm no expert, but I remember hearing most people say it was much more capable than high-end PCs at launch.