MS will probably put $2,000 Nvidia super-PCs inside the consoles to pretend they are more powerful, and sell those at a loss.
Wait, no...
As much as I would like to agree with you, I don't see the XB1 having significant graphics or processing issues anytime soon. The 360 vs. PS3 architecture gap was dramatic: the PS3 drew all kinds of complaints from developers over the years, and it was having issues with cross-platform titles at launch. So far, while developers might point out potential issues and obstacles with XB1 development, it hasn't shown up in their titles yet. I agree that the PS4 is the better, more developer- and gamer-friendly platform, but tech specs and rundowns aren't what I would use to support the idea. The games look great on both PS4 and XB1; the difference-maker is that Sony is more games-focused than MS. Their focus and passion for games, as well as their support for developers, is what will make the difference.


| Daisuke72 said: The PS4 was built to provide a next generation experience that's cost efficient, not to be ahead of a PC.... My current PC has better hardware than the PS4 and it costed 800 dollars. |
Actually... a PC that's 4+ years old can be faster than the PS4.
The great thing about the PC? You're not limited to a single graphics card, fixed clock speeds (you can overclock) or a set amount of memory. For example, throw 4x Radeon 5870s into a machine and watch it obliterate a Radeon 7870 in benchmarks.
Granted, 4x 5870s aren't exactly efficient... but it's still faster. :)
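The multi-GPU claim above is easy to sanity-check with back-of-the-envelope peak-FLOPS arithmetic (using public spec-sheet figures; this assumes ideal scaling and ignores real-world multi-GPU losses, so treat it as a rough sketch):

```python
# Theoretical single-precision peak: shaders * 2 ops/cycle (FMA) * clock.
def peak_tflops(shaders: int, clock_mhz: int) -> float:
    return shaders * 2 * clock_mhz * 1e6 / 1e12

hd5870 = peak_tflops(1600, 850)     # ~2.72 TFLOPS per card
quad_5870 = 4 * hd5870              # ~10.88 TFLOPS combined, assuming perfect scaling
hd7870 = peak_tflops(1280, 1000)    # ~2.56 TFLOPS

print(f"4x HD 5870: {quad_5870:.2f} TFLOPS vs HD 7870: {hd7870:.2f} TFLOPS")
```

Even with the poor CrossFire scaling of the era, the raw-throughput gap is large enough that the quad setup comes out well ahead on paper.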
I also personally think cloud computing is more hassle than it's worth. For starters, it adds latency: you need servers that are physically near you, which means a majority of the world is going to have a poor experience. You can't change the laws of physics, after all.
Secondly, you must have a reliable internet connection with plenty of bandwidth; a 3G connection that only achieves 1Mbps isn't going to cut it.
Thirdly, the game's servers must be online at all times. Take it from those who played SimCity or Diablo 3 at or shortly after launch. Hint: it was not pretty.
In fact, I still get a lot of "rubber banding" issues in Diablo 3, as my data has to cross my continent and then the Pacific before reaching Blizzard's servers in the United States, then make that same trip all the way back again. And that's with a 15Mbps ADSL connection with interleaving turned off for the lowest possible latency.
If Microsoft's "cloud" servers are all in the USA, then forget it. I'm not interested in that technology if I might end up waiting a second or two on latency.
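The physics point above is easy to quantify. Light in optical fiber travels at roughly two-thirds of c (about 200,000 km/s), which puts a hard floor under round-trip latency no matter how fast the servers are (the 12,000 km cable distance below is an assumed figure for an Australia-to-US route, not a measured one):

```python
C_FIBER_KM_PER_S = 200_000  # light in fiber: roughly 2/3 of c

def min_rtt_ms(cable_km: float) -> float:
    """Best-case round trip: there and back at fiber speed, zero processing time."""
    return 2 * cable_km / C_FIBER_KM_PER_S * 1000

# Assumed ~12,000 km of cable from Australia to the US west coast:
print(f"Latency floor: {min_rtt_ms(12_000):.0f} ms")
```

That's a 120 ms floor before any routing, queuing or server processing is added on top, which is why server placement matters so much for anything interactive.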

www.youtube.com/@Pemalite


Captain_Tom said:
2) The difference is that of comparing a 7770 with 1GB to a 7870 with 2GB. Essentially double the power. After a year expect PS4 games to run at higher framerates with higher resolution textures. It will be very noticeable. |
I guess you did not read the X1 article if you are claiming there were no games running on the X1.
To your second point: on paper it may appear to be double the power, but in reality we are not seeing it. If it were double the power, Sony's games should be blazing compared to the X1's, and that was not on display at E3. It's not as if the hardware is brand new or presents a different learning curve for developers, so there must be other factors limiting the PS4 from achieving those highs. If we are not seeing those better textures now in first-party games, it's probably not something we will see as the hardware matures either. It's not as if GDDR5 is a magic bullet that makes everything work better; it has its ups and downsides as well. If right out of the gate we are seeing 1080p at 30FPS, then it's a good bet this will be the resolution that defines this gen, and with that, we will not see much of a difference between the two consoles.
Pemalite said:
|
What? I'm in Melbourne and I've very rarely experienced lag in Diablo 3, especially rubber banding. I get the expected amount of ping, but it's far better than trying to play Halo 4 on a US peer host.


Pemalite said:
|
And a whole lot of money: a platform built to support cloud computing, and the resources to make it happen. It's not just flipping a switch and suddenly you have a cloud computing platform, and it's not like you can pop up a couple of racks of servers and suddenly support millions of gamers. It takes software to make it happen. It takes billions of dollars in servers around the world. It takes an incredibly complex infrastructure to manage those resources. Very few companies in the world have that; really, the list is closer to three: Amazon, Google and MS.
People make crazy statements that Sony could do something like this, and it's just that: crazy. Sony isn't even on the playing field compared to where MS, Amazon and Google are. They have not put a $14 billion investment into such a system, and currently they only have Gaikai as their platform, which doesn't do cloud compute. Just turning that service around to do cloud compute would take a tremendous amount of money, and you would have to have people capable of building the infrastructure.
Last but not least, people forget that MS is offering this for FREE. Having Azure as a free resource for any game you make is huge. No, it's not gamer-huge, because gamers can only concentrate on what they can see. But as a developer, you have this vast resource that may or may not make your games better, and it's there and it's free. At the very least, every gamer will have a dedicated server when they play any multiplayer game, which is something neither Sony nor Nintendo can do.
CGI-Quality said:
I would advise you to keep the flaming at a minimum. You can get your point across with it (if its valid and strong enough, that is). Now to the topic, no, the cloud will not offer an "infinite amount of power." Besides, high-end PCs are already more powerful than next gen gaming consoles, so mentioning those is pointless. The rest of what you wrote is either completely wrong or very debatable. |
I agree with what you said, especially when people think that the cloud can do 50% of the rendering while the Xbox One does the other 50% locally... such thoughts just show a lack of knowledge about how graphics engines work.
I said this before and I'll say it again: the cloud will be used for dedicated servers in shooter-type games, which is awesome for multiplayer, as well as for AI simulation in a persistent game world, but only for the towns the player is not in. In other words, whichever town the player is in will be computed locally on the Xbox One, while the AI in the other towns runs on the cloud. There will also be MMO-like persistence for games that call for it, and the final thing is cloud game rendering, like Gaikai and OnLive.
So yes, in a way the cloud can be used to render graphics, but then it will do 100% of it, just like Gaikai and OnLive do. The issue then becomes your internet connection speed and the latency that comes with such services. So while in theory the cloud could render CGI-quality graphics, unless you have the best internet in the world with almost zero latency, what you see on your screen could in fact be worse than what the Xbox One can render locally... The fact is, while the cloud will have its benefits, it's not exactly the 'limitless powa' that Microsoft's PR machine keeps spitting out.
Captain_Tom said:
1) The ESRAM is there to make up for the 60% slower RAM, and even then it is still not as fast (theoretically) as GDDR5. Slightly more cores, much worse RAM, and a 20% slower clock (800MHz vs 1000MHz) = a 7770 at best, but probably closer to a 7750. 2) The GPU in the PS4 IS stronger than a vanilla 7850 with those enhancements as well. It should easily catch up to a 7870. 3) A 7850 is already over twice as strong as a 7750, and 50% stronger than the 7770. Then the 7870 is about twice as strong as a 7770, as seen here: http://www.techspot.com/review/661-nvidia-geforce-gtx-650-ti-boost-sli/page2.html
Anyone with a solid understanding of AMD's current GPU architecture will tell you the same as I have (and I am an engineer, so this is kind of MY JOB). But if you don't believe me, maybe some actual developers will convince you: http://www.gamechup.com/jonathon-blow-some-games-could-be-60fps-on-ps4-and-30fps-on-xbox-one/ |
Last time I checked, the XOne's GPU was a cut-down 7790 (so more geometry engines/command processors than a 7770) with a base memory bandwidth very similar to the 7770's (68.2 vs 72 GB/s). So you're telling me that with all the additional enhancements MS put in, it will perform at best like a 7770? Sure, whatever makes you tick...
Just to avoid unnecessary back and forth discussion, I think PS4's GPU will perform somewhere in between 7850 and 7870, and XOne's in between 7770 and 7790...so, depending on the game, some 40-60% difference in favor of PS4. You're certainly welcome to your opinion, since, as you put it, it's YOUR JOB.
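For what it's worth, that 40-60% range lines up with simple peak-FLOPS arithmetic using the commonly reported shader counts and the 800MHz clock quoted earlier in this thread (768 shaders for the XOne, 1152 for the PS4; these figures are assumptions, and the XOne's clock was reportedly raised later):

```python
def peak_tflops(shaders: int, clock_mhz: int) -> float:
    # Theoretical single-precision peak: shaders * 2 ops/cycle (FMA) * clock.
    return shaders * 2 * clock_mhz * 1e6 / 1e12

xone = peak_tflops(768, 800)    # ~1.23 TFLOPS
ps4 = peak_tflops(1152, 800)    # ~1.84 TFLOPS
print(f"PS4 paper advantage: {(ps4 / xone - 1) * 100:.0f}%")
```

On these numbers the PS4's paper advantage works out to about 50%, squarely inside that range, though real-game gaps will vary with memory bandwidth and engine bottlenecks.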
ironmanDX said:
|
As a resident of Sydney who has experienced 300ms ping in Diablo versus 80 ping in FPS games, trust me, the difference is like night and day. Right now you're living in the night and don't know the pain you're suffering, because it has always been night. Be glad... ~_^
