| NiKKoM said: euro's or dollars people... euro's or dollars.. you crazy left driving people |
http://www.xe.com/currencyconverter/convert/?Amount=100&From=GBP&To=USD
£100 = $165.79
| Munkeh111 said: So it's not quite a fair comparison as your £100 graphics card will likely find itself in a £400+ system. You would also only expect this parity to last another few months and teams get more time and more experience working on next-gen hardware The problem is as a PC gamer you always want more. My 670 could provide nice 1080p gaming, but once I can't hit ultra and keep the frame rate up, I'm going to want more power rather than just lowering the settings. Star Citizen is going to ruin me |
I actually see the bigger problem in fluctuating framerates. In AC IV, my GTX 650 Ti can achieve 60 fps in some areas, but when we reach locations with more vegetation, it would dip to 20 fps unless I lower my settings. In the end, you have to lower the visuals for the worst parts of the game, while the console version is automatically optimized; they will just remove what they have to in any area to achieve the performance they want.
| JEMC said: That's incredibly good for PC gamers (who most likely will have something better than a R7 260X), but as always there's a fly in that soup: That said, for all the advantages we found in our tests, we do notice a worrying trend in this recent slate of PC releases. Need for Speed: Rivals is capped to a console-standard 30fps, while Assassin's Creed 4 requires a lot of GPU horsepower to sustain 60fps, dropping down hard to 30fps on less capable hardware. Likewise, the PC version of Call of Duty: Ghosts is in a bit of a state, which despite several attempts to achieve parity with the console settings, was held back, seemingly due to poor optimisation. Unless developers and publishers start paying more attention to the PC version of their games, we'll still have to own cards a lot more powerful than what's inside the consoles to be able to play the same games. |
Isn't the ability to optimize for standardized hardware the entire reason consoles are competitive? Yes, PC has far better numbers, but you need them to overcome the overhead of being on a non-standardized platform.
| JEMC said: That's incredibly good for PC gamers (who most likely will have something better than a R7 260X), but as always there's a fly in that soup: That said, for all the advantages we found in our tests, we do notice a worrying trend in this recent slate of PC releases. Need for Speed: Rivals is capped to a console-standard 30fps, while Assassin's Creed 4 requires a lot of GPU horsepower to sustain 60fps, dropping down hard to 30fps on less capable hardware. Likewise, the PC version of Call of Duty: Ghosts is in a bit of a state, which despite several attempts to achieve parity with the console settings, was held back, seemingly due to poor optimisation. Unless developers and publishers start paying more attention to the PC version of their games, we'll still have to own cards a lot more powerful than what's inside the consoles to be able to play the same games. |
Too true, the consoles have a great advantage in unified hardware; most developers get a bit lazy with PC versions and make them really poorly optimized in comparison. The worst examples are games like Skyrim only utilizing 2 GB of RAM at release, which is amazingly stupid for an open-world title with such a massive and more or less constant texture load, and no transitional rendering besides entering and leaving buildings.
I really hope the 8th gen will be better in this regard, and I think it actually will be; the PS4 and Xbox One having hardware architecture so close to a pure PC setup should make the whole development and optimization process more natural and fluid. There is hope now!


Mummelmann said:
I really hope the 8th gen will be better in this regard, and I think it actually will be; the PS4 and Xbox One having hardware architecture so close to a pure PC setup should make the whole development and optimization process more natural and fluid. There is hope now! |
didn't Skyrim actually have a change in that within the last two weeks, though?
"I think people should define the word crap" - Kirby007
Join the Prediction League http://www.vgchartz.com/predictions
Instead of seeking to convince others, we can be open to changing our own minds, and seek out information that contradicts our own steadfast point of view. Maybe it’ll turn out that those who disagree with you actually have a solid grasp of the facts. There’s a slight possibility that, after all, you’re the one who’s wrong.


torok said:
I actually see the bigger problem in fluctuating framerates. In AC IV, my GTX 650 Ti can achieve 60 fps in some areas, but when we reach locations with more vegetation, it would dip to 20 fps unless I lower my settings. In the end, you have to lower the visuals for the worst parts of the game, while the console version is automatically optimized; they will just remove what they have to in any area to achieve the performance they want. |
Adaptive V-Sync is your friend with the fluctuating framerate.
AMD users can obtain that functionality by using RadeonPro (which in turn gives you access to SweetFX to improve a game's graphics).
Also, feel free to overclock, it's free performance.
AMD and nVidia are both making great strides in dealing with framerate fluctuations: nVidia is throwing G-Sync out into the wild, and AMD is pushing for the more open FreeSync.
Then you have Mantle being thrown into the mix, which a lot of game engines may end up supporting, giving PC gamers overheads similar to those of console gamers. (That's gonna ruffle some jimmies!)
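To make the V-Sync point a bit more concrete, here is a minimal sketch of the idea rather than anything from an actual driver; the 60 Hz refresh and the per-frame render times are made-up assumptions. It shows why standard double-buffered V-Sync turns a dip below 60 fps into a hard drop to 30 or 20 fps, while adaptive V-Sync just lets the late frame through:

```python
# Rough simulation of why standard v-sync turns a framerate dip into a hard
# drop to 30 or 20 fps, while adaptive v-sync just tears and stays closer to
# the real framerate. The render times below are made-up illustration values.

REFRESH = 1 / 60  # 60 Hz display: one refresh interval is ~16.7 ms

def displayed_interval(render_time, adaptive):
    """Seconds the finished frame actually occupies before the next one shows."""
    if adaptive and render_time > REFRESH:
        # Adaptive v-sync: if the frame is late, present immediately (tearing).
        return render_time
    # Standard v-sync: wait for the next vertical blank, i.e. round the frame
    # time up to a whole number of refresh intervals.
    intervals = max(1, -(-render_time // REFRESH))
    return intervals * REFRESH

for ms in (10, 15, 18, 25, 35):  # hypothetical per-frame GPU render times
    rt = ms / 1000
    vsync_fps = 1 / displayed_interval(rt, adaptive=False)
    adaptive_fps = 1 / displayed_interval(rt, adaptive=True)
    print(f"render {ms:>2} ms -> v-sync {vsync_fps:5.1f} fps, adaptive {adaptive_fps:5.1f} fps")
```

With those assumed numbers, a frame that takes 18 ms drops straight to 30 fps under standard V-Sync but stays around 55 fps with adaptive V-Sync, which is exactly the kind of dip torok described.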
| kitler53 said: isn't the ability to optimize due to standardized hardware the entire reason consoles are competitive. yes, PC has far better numbers but you need them to overcome the overhead of being on a non-standardized platform. |
Most console developers don't optimise for a console's specific hardware to any great degree.
Most developers purchase or lease a 3rd-party engine such as Unreal Engine, Gamebryo, CryEngine etc. and then use those engines through a low-level or high-level API, never even dealing with the hardware.
It's usually only the console exclusive developers who take advantage of a specific console and optimise for every little nuance to extract as much performance as possible, but that is more or less the exception rather than the norm.
The goal of Mantle is to give developers an environment similar to the one consoles have enjoyed for years, pretty much since the 3dfx Glide days. Standardised or not, APIs were invented to hide such things and make making games easier; it's just that DirectX is old and slow at handling it, OpenGL is a little better, and Mantle may solve a lot of the issues.
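As a rough illustration of the "old and slow at handling it" point, the sketch below is only back-of-the-envelope arithmetic with hypothetical per-draw-call costs, not measured DirectX or Mantle figures; it just shows how per-call API overhead caps what the CPU can submit each frame:

```python
# Back-of-the-envelope arithmetic on API draw-call overhead. The per-call costs
# below are hypothetical placeholders, not measured DirectX or Mantle numbers;
# the point is simply that CPU time scales with (draw calls x per-call cost).

FRAME_BUDGET_MS = 1000 / 60  # ~16.7 ms of CPU time per frame at 60 fps

def max_draw_calls(per_call_overhead_us):
    """Draw calls that fit in one frame's CPU budget at the given overhead."""
    return int(FRAME_BUDGET_MS * 1000 / per_call_overhead_us)

for label, overhead_us in [("high-overhead API (assumed 10 us/call)", 10),
                           ("low-overhead API  (assumed  2 us/call)", 2)]:
    print(f"{label}: ~{max_draw_calls(overhead_us):,} draw calls per frame")
```

Halving the per-call cost roughly doubles the draw calls a developer can issue in the same frame budget, which is the kind of headroom a thinner API is meant to buy.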

www.youtube.com/@Pemalite
kirby007 said:
didn't Skyrim actually have a change in that within the last two weeks, though? |
Possibly, I haven't really played it for a long time. There was an early fix that allowed it to use more than the initial 2 GB, but it was still limited to 4 GB since it basically ran as a 32-bit application, and 32-bit applications are, as we know, incapable of addressing memory above a fairly small (by today's standards) cap.
There might have been a fix; if so, about time! But it should never have launched with that limitation to begin with; that's my whole point about PC versions often getting the shaft.
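For anyone curious about the mechanics behind that 2 GB/4 GB cap: on 64-bit Windows, a 32-bit process only gets the full 4 GB of address space if its executable carries the "Large Address Aware" flag, which is what the community fix toggled. Below is a minimal sketch (the Skyrim install path is just a placeholder assumption) that reads the flag straight from an executable's PE header:

```python
# Minimal sketch of checking whether a 32-bit Windows executable carries the
# "Large Address Aware" flag in its PE header; with the flag set, a 32-bit
# process can address up to 4 GB on 64-bit Windows instead of the default 2 GB.
# The Skyrim path at the bottom is just a placeholder example.
import struct

IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020  # COFF Characteristics bit

def is_large_address_aware(exe_path):
    with open(exe_path, "rb") as f:
        header = f.read(4096)  # DOS + PE headers comfortably fit in 4 KB
    assert header[:2] == b"MZ", "not a Windows executable"
    e_lfanew = struct.unpack_from("<I", header, 0x3C)[0]  # offset of "PE\0\0"
    assert header[e_lfanew:e_lfanew + 4] == b"PE\x00\x00", "missing PE signature"
    # The COFF file header follows the signature; Characteristics is its last
    # field, 18 bytes in (after Machine, NumberOfSections, TimeDateStamp, ...).
    characteristics = struct.unpack_from("<H", header, e_lfanew + 4 + 18)[0]
    return bool(characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE)

print(is_large_address_aware(r"C:\Games\Skyrim\TESV.exe"))  # hypothetical path
```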


Mummelmann said:
Possibly, I haven't really played it for a long time. There was an early fix that allowed it to use more than the initial 2 GB, but it was still limited to 4 GB since it basically ran as a 32-bit application, and 32-bit applications are, as we know, incapable of addressing memory above a fairly small (by today's standards) cap. There might have been a fix; if so, about time! But it should never have launched with that limitation to begin with; that's my whole point about PC versions often getting the shaft. |
Heard that if you weren't careful, tampering with it could actually break certain hardware (don't know what, though), so I didn't bother with it myself.
"I think people should define the word crap" - Kirby007
Join the Prediction League http://www.vgchartz.com/predictions
Instead of seeking to convince others, we can be open to changing our own minds, and seek out information that contradicts our own steadfast point of view. Maybe it’ll turn out that those who disagree with you actually have a solid grasp of the facts. There’s a slight possibility that, after all, you’re the one who’s wrong.
torok said:
I actually see the bigger problem in flutuating framerate. In AC IV, my GTX650TI can achieve 60 fps in some areas, but when we reach locations with more vegetation, it woud dip to 20 fps unless I lower my settings. In the end, you have to lower the visuals for the worse part on the game, while the console version is automatically optimized, they will just remkve what they have to in any area to achieve the performance they want. |
Well, I think a performance difference that large is rare. I generally have v-sync on, which means a rock-solid 60 fps in most games. Far Cry 3 was at a solid 30 fps, and my little playthrough of Crysis 3 found things mostly stable at about 45 fps.
With PC, you're always going to have to accept that you'll need to "get your hands dirty" to a certain extent. With the variety in PCs, the number of issues is going to be higher, and higher still given the lack of care that developers will likely continue to take.
Once 20 nm GPUs hit, you should get a lot better performance for the money.
@TheVoxelman on twitter