
You will need ~$650 to match next gen consoles on PC

dahuman said:
ethomaz said:
disolitude said:

If Intel slaps 8 cores on an affordable CPU it will destroy anything AMD has. Even 6-core Intels beat AMD's 8 cores easily in multithreaded tasks.

AMD is good value, plain and simple...and that is why its in our next gen console.

No.

Even Intel's 8 real cores are weaker than AMD's 8 real cores when you talk about multithreaded tasks... SMP is better on AMD CPUs.

Some example: http://www.cpu-world.com/Compare/444/AMD_FX-Series_FX-8350_vs_Intel_Core_i5_i5-3570K.html

You can research it if you want... the problem is that Intel destroys AMD in single-threaded tasks... that difference is way bigger than AMD's advantage in multi-threaded performance... and desktop PCs run more single-threaded tasks than multi-threaded ones.
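A rough way to see why that mix matters, with made-up numbers for illustration: by Amdahl's law, the speedup from n cores is

\[
S(n) = \frac{1}{(1 - p) + p/n}
\]

where p is the parallel fraction of the work. If a desktop workload is only p = 0.3 parallel, 8 cores give S(8) = 1/(0.7 + 0.3/8) ≈ 1.36, and even infinite cores cap out at 1/0.7 ≈ 1.43, while a faster single core speeds up all of it, serial part included.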

You pretty much supported Dis with that link though; the i5 runs on 4 cores and at a lower clock speed, yet it competes with a processor that came out later, draws more power, and can't run above 60C for long without damage. The really sad part is that if you OC the 8350 to, say, 4.8 GHz (the general max) and the i5 to about 4.2-4.5 GHz, the 8350 would get destroyed.

For the record, I got 5.2 GHz stable out of my 8350. :)

I had to scale back to 4.8 because it's summer and my place gets really hot lol. Also the whole rig sucks close to 400 watts at idle... lol



disolitude said:
dahuman said:
Kasz216 said:
Additionally, every time I see these threads I think... "Why do people argue tech specs with Disolitude?"

Outside of Trash, there's nobody I know who seems to know more about this stuff.

Sadly, Trash won't ever post in these threads, precisely because people are more interested in arguing about tech than in studying and learning it.


That an i3 beats the other CPUs is pretty crazy considering I'm posting right now from an i5 laptop... Is there really no need for an i5 (let alone an i7) on a desktop?


He made a build that would rival console performance with overhead, not the most optimal setup for high performance. It all depends on what level of performance you're looking for when building a PC. An i3 will introduce bottlenecks in different areas (say you don't have an Nvidia card but want to run PhysX; the i3 will choke and die on higher settings) or in CPU-heavy games (TERA being a prime example, among many others). The current optimum is an i5, or an i7 without hyperthreading, if you're purely gaming; an i5 will destroy what's in the PS4 and Xbox One without breaking a sweat. But if you live stream, the game changes: an i7 (with hyperthreading and a decent OC) or an 8-core AMD CPU (FX-8350 level preferred) will beat an i5 for streaming HD video at good quality. The i7 or the high-end FX series would also be better if you run tasks in the background while playing games, just because of thread allocation.
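A minimal sketch of what "thread allocation" means here, with purely illustrative numbers rather than any engine's real policy: budget the logical cores between the game's workers and the stream encoder so they don't fight each other.

// Purely illustrative thread budget, not any engine's real policy.
#include <iostream>
#include <thread>

int main() {
    // Logical cores: on an i7 with Hyper-Threading this is twice the
    // physical count; an FX-8350 reports its 8 integer cores directly.
    unsigned n = std::thread::hardware_concurrency();
    if (n == 0) n = 4;  // the call may return 0 if it can't detect

    // Reserve threads for the stream encoder so the game's worker
    // pool doesn't fight the encode for CPU time.
    unsigned encoder = (n >= 8) ? 2 : 1;
    unsigned game    = n - encoder;

    std::cout << "logical cores: " << n
              << ", game workers: " << game
              << ", encoder threads: " << encoder << "\n";
}

On a 4-core i5 that leaves 3 game workers plus 1 encoder thread; on an i7 or FX-8350 the encoder gets 2 threads and the game still keeps 6, which is the gist of why more threads help streamers.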

I personally think it's not worth it to build something that'd destroy consoles with a good price until next year, and the situation this time is not the same as the 7th gen since high end PCs are already more powerful by a quiet margin with a cost. It also doesn't help that you have to pay for PS+ to play online now, that cost will add up as well. PC's value will be much greater when new tech comes out in 2014 and beyond IMO.
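Back-of-envelope on the PS+ point, assuming the roughly $50/year launch price holds over a typical generation:

\[
\$50/\text{yr} \times 7\,\text{yr} \approx \$350
\]

which is more than half of the ~$650 build in the thread title.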


Truth be told, you could use an FX-4300 or even a Phenom II 965 and have happy PC gaming for many years, if people fear 2-core CPU bottlenecks in the future. Those CPUs made no sense for an mITX build though.

If you're not looking to penny-pinch you can build a ridiculously good PC today. I've seen an FX-8320 for $154.99 and have seen it run at 5 GHz with my own eyes.

5 GHz would require some pretty good cooling, but an aftermarket cooling solution is pretty much required on the 8-core FX anyway, unless you want to hear a jet engine taking off into space under load. It all depends on what you're trying to accomplish with your build. I always want more threads at a high clock rate for my needs, but if you build a PC purely for gaming (which is mighty silly IMO; you want a good all-around main PC for all your needs, unless you're building a streaming PC or a server) then you wouldn't need much other than a decent GPU. I still wouldn't build a new one until later next year anyway; all I'd need is a new GPU or two for my setup and I'm done for another few years.



disolitude said:
dahuman said:
ethomaz said:

No.

Even Intel's 8 real cores are weaker than AMD's 8 real cores when you talk about multithreaded tasks... SMP is better on AMD CPUs.

Some example: http://www.cpu-world.com/Compare/444/AMD_FX-Series_FX-8350_vs_Intel_Core_i5_i5-3570K.html

You can research it if you want... the problem is that Intel destroys AMD in single-threaded tasks... that difference is way bigger than AMD's advantage in multi-threaded performance... and desktop PCs run more single-threaded tasks than multi-threaded ones.

You pretty much supported Dis with that link though; the i5 runs on 4 cores and at a lower clock speed, yet it competes with a processor that came out later, draws more power, and can't run above 60C for long without damage. The really sad part is that if you OC the 8350 to, say, 4.8 GHz (the general max) and the i5 to about 4.2-4.5 GHz, the 8350 would get destroyed.

For the record, I got 5.2 GHz stable out of my 8350. :)

I had to scale back to 4.8 because it's summer and my place gets really hot lol. Also the whole rig sucks close to 400 watts at idle... lol

Yeah... I wouldn't want to go over 4.8 (I'd just keep the 8350 at stock; it's a good value CPU right out of the box as it is, for now). The wattage requirement jumps like a motherfucker once you reach a certain point, even on Intel CPUs.
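The jump is roughly what the dynamic power relation predicts, since pushing clocks past a point also means raising voltage (the voltages here are illustrative, not measured):

\[
P_{\text{dyn}} \approx C\,V^2 f, \qquad \frac{P_{4.8}}{P_{4.0}} = \frac{4.8}{4.0}\left(\frac{1.45}{1.20}\right)^2 \approx 1.75
\]

So a 20% clock bump that needs a 1.20 V to 1.45 V vcore increase costs about 75% more power.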



disolitude said:

You're probably talking about the low-level access to the CPU/GPU that consoles have. Yeah, consoles can utilize hardware better than PCs, that's no secret... but these advances in game development usually make it to PC as well and push games forward on both platforms.

PCs also have the ability to customize game performance. If you value high-res textures but don't mind 30 fps, there's a setting for that. Lower-res textures/resolution but 60 fps... etc. On consoles you're stuck with what the developers think is best.


Yes, I'm talking about low-level access, but another important factor is knowing exactly which GPU and CPU are in use. That can allow a massive jump in performance. You can't optimize code for a generic approach; because of that you get overhead.

An example, talking about the CPU. Let's suppose you know the exact model used. The distribution of tasks between processor cores can impact performance considerably because of the distribution of cache memory. Say you have an 8-core processor (like AMD's) with one big L3 cache and 4 L2 caches, each shared by 2 cores, and you have to run a task on 2 cores. With knowledge of the cache layout of that specific architecture, you can run the parallel task on two cores that share the same cache, so they can share the same data and find it right in the cache instead of both going out to RAM. If we look at the GPU side, there are a lot of things to watch too.
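A minimal sketch of that pairing trick on Linux/glibc (compile with -pthread). The assumption that cores 0 and 1 sit on the same module and share an L2 is illustrative; the real numbering depends on the CPU and kernel, so check /proc/cpuinfo on real hardware.

#include <pthread.h>
#include <sched.h>
#include <thread>

static void pin_to_core(std::thread& t, int core) {
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(core, &set);
    // Restrict the thread to a single core so the scheduler can't
    // migrate it away from its cache-sharing partner.
    pthread_setaffinity_np(t.native_handle(), sizeof(set), &set);
}

int main() {
    // Two threads that touch the same data: one warms the shared L2,
    // the other finds the data there instead of going out to RAM.
    std::thread producer([] { /* fill shared buffer */ });
    std::thread consumer([] { /* process shared buffer */ });
    pin_to_core(producer, 0);
    pin_to_core(consumer, 1);
    producer.join();
    consumer.join();
}

On a console the developer knows the topology up front and can bake this in; on PC you'd have to query it at runtime, which is exactly the overhead torok is describing.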

This is something that can't be reproduced on a PC. The solution is to use brute force, which is what PCs are doing these days. Of course, the new gen will improve things on the PC side too. PC games are not in a state that I like right now; I'm seeing a lot of badly optimized games (Hitman: Absolution is a good example). With new consoles and better graphics, devs will have to rely less on brute force and more on using the features of new GPUs in PCs (leaving some legacy GPUs out of the games) to improve the results.

Edit: I've seen you and ethomaz talking about Intel vs. AMD in the next gen consoles. Actually, the choice wasn't just about price. I've read an article about it (link below) showing that the choice was motivated by the fact that Sony and MS wanted an APU/SoC design.

They first looked at the possible architectures (ARM, x86 and MIPS; Power was discarded because the architecture was evolving more slowly). MIPS was discarded because neither of them wanted to create an APU from scratch and MIPS wasn't considered developer-friendly. ARM would mean Nvidia and x86 would mean AMD (because Intel doesn't have APUs with the necessary GPU power). Tests revealed that even with the latest improvements ARM couldn't match x86 (they noted that could change soon, but only after the launch of the next gen). So AMD was the choice because it was the only one that could offer that product.

http://www.forbes.com/sites/patrickmoorhead/2013/06/26/the-real-reasons-microsoft-and-sony-chose-amd-for-consoles/



disolitude said:
JerCotter7 said:
Is 2GB really enough for the next few years?

The only reason I'm not going to SLI my build now is that I'm worried about the 2GB of VRAM I have, although I may need it when my 3D monitor arrives.

Looking at the PC benchmarks I've personally run, I wouldn't hesitate to build a GTX 770 SLI 2GB rig and use it for the next 3-4 years.

Unlike ethomaz, though, I don't see consoles pushing PCs to the point that VRAM would be limiting. He seems to think all PC games are made with limitations because of current gen consoles, and that next gen consoles will let PC game programmers unleash the true power of game development.
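Back-of-envelope on where 2 GB goes, ignoring compression and the roughly 33% that mipmaps add: render targets are cheap, textures are what eat VRAM.

\[
1920 \times 1080 \times 4\,\text{B} \approx 8.3\,\text{MB}, \qquad 4096 \times 4096 \times 4\,\text{B} = 64\,\text{MB}
\]

A full set of 1080p render targets stays under 100 MB, while a few dozen big uncompressed textures would fill 2 GB on their own, which is exactly what texture quality settings exist to manage.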

It would definitely help with 3D gaming though. I think that's part of what they want to push, maybe to sell more TVs, like an extension of the PS3 plan.



dahuman said:
disolitude said:

For the record, I got 5.2 GHz stable out of my 8350. :)

I had to scale back to 4.8 because it's summer and my place gets really hot lol. Also the whole rig sucks close to 400 watts at idle... lol

Yeah... I wouldn't want to go over 4.8 (I'd just keep the 8350 at stock; it's a good value CPU right out of the box as it is, for now). The wattage requirement jumps like a motherfucker once you reach a certain point, even on Intel CPUs.

In my current household arrangement, I pay the rent, but my gf pays electricity, parking and utilities.

Let's just say that a lot of breakers get tripped in my house... :)



disolitude said:
dahuman said:
disolitude said:

For the record, I got 5.2 GHz stable out of my 8350. :)

I had to scale back to 4.8 because it's summer and my place gets really hot lol. Also the whole rig sucks close to 400 watts at idle... lol

Yeah... I wouldn't want to go over 4.8 (I'd just keep the 8350 at stock; it's a good value CPU right out of the box as it is, for now). The wattage requirement jumps like a motherfucker once you reach a certain point, even on Intel CPUs.

In my current household arrangement, I pay the rent, but my gf pays electricity, parking and utilities.

Let's just say that a lot of breakers get tripped in my house... :)


Yeah, I hate that... that's why I don't OC like a crazy person these days unless it's an Intel chip. I have too many machines near me and I've tripped the breaker a few times in the past =_=;



torok said:
disolitude said:

You're probably talking about the low-level access to the CPU/GPU that consoles have. Yeah, consoles can utilize hardware better than PCs, that's no secret... but these advances in game development usually make it to PC as well and push games forward on both platforms.

PCs also have the ability to customize game performance. If you value high-res textures but don't mind 30 fps, there's a setting for that. Lower-res textures/resolution but 60 fps... etc. On consoles you're stuck with what the developers think is best.


Yes, I'm talking about low-level access, but another important factor is knowing exactly which GPU and CPU are in use. That can allow a massive jump in performance. You can't optimize code for a generic approach; because of that you get overhead.

An example, talking about the CPU. Let's suppose you know the exact model used. The distribution of tasks between processor cores can impact performance considerably because of the distribution of cache memory. Say you have an 8-core processor (like AMD's) with one big L3 cache and 4 L2 caches, each shared by 2 cores, and you have to run a task on 2 cores. With knowledge of the cache layout of that specific architecture, you can run the parallel task on two cores that share the same cache, so they can share the same data and find it right in the cache instead of both going out to RAM. If we look at the GPU side, there are a lot of things to watch too.

This is something that can't be reproduced on a PC. The solution is to use brute force, which is what PCs are doing these days. Of course, the new gen will improve things on the PC side too. PC games are not in a state that I like right now; I'm seeing a lot of badly optimized games (Hitman: Absolution is a good example). With new consoles and better graphics, devs will have to rely less on brute force and more on using the features of new GPUs in PCs (leaving some legacy GPUs out of the games) to improve the results.

The thing is that the poorly optimized PC games are 99% of the time console ports. They also usually get patched... hell, Sonic Generations ran like shit when it first came out for no apparent reason, but now it's a smooth 60 fps non-stop. PC-first devs tend to have well-optimized games on PC, which then transfer to consoles very well too. Look at The Witcher 2: stellar looking on PC, pretty good looking on the X360 too. It takes a significant amount of extra power to show the slightest increase in visual fidelity, which is why it's easy to downscale a game to consoles.

I'm glad you brought up multi-core allocation... I think the aspect of PC gaming that will improve once next gen console development ramps up is core usage and allocation. Console ports like Skyrim tend to use 2-3 cores (and run like crap on AMD hardware if you max them out). Now that we have 8-core AMD CPUs in the consoles, we should see proper multicore utilization, which PCs will benefit from as well.
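A sketch of the pattern that shift pushes engines toward (the names and workload here are illustrative, not any real engine's code): split a frame's work into chunks and fan it out across however many cores exist, instead of hardcoding 2-3 threads.

#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

// Illustrative per-chunk game work; name and body are hypothetical.
void update_entities(std::size_t begin, std::size_t end) {
    for (std::size_t i = begin; i < end; ++i) { /* simulate one entity */ }
}

int main() {
    const std::size_t entities = 10000;
    // Use every core the machine reports -- 8 on the new console CPUs.
    unsigned workers = std::max(1u, std::thread::hardware_concurrency());

    std::vector<std::thread> pool;
    const std::size_t chunk = entities / workers;
    for (unsigned i = 0; i < workers; ++i) {
        std::size_t b = static_cast<std::size_t>(i) * chunk;
        std::size_t e = (i + 1 == workers) ? entities : b + chunk;
        pool.emplace_back(update_entities, b, e);
    }
    for (auto& t : pool) t.join();
}

The same binary then scales from a dual-core i3 to an FX-8350 without changes, which is the payoff disolitude is describing.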



Agreed.



You're forgetting the OS. Tack on $100 for an OEM copy.



Back from the dead, I'm afraid.