
Core i5 2500K is a gaming beast

Hey guys/girls,

I built my first gaming PC a while back, after reading about the insane performance per dollar of the now-famous Core i5 2500K CPU. I overclocked it to 4.5GHz with a closed-loop water cooler and coupled it with 2 x Nvidia GTX 560 Tis in SLI. I absolutely loved my rig. It monstered everything at 1080p at max settings for years.

Since then I have upgraded a few things: I got a bigger SSD, and I sold my 560 Tis and bought a GTX 970, but I have never really been tempted to upgrade my 2500K. Looking at the gaming benchmarks, it just doesn't seem worth the cost.

I remember playing Far Cry 3 at max settings, 1080p 60fps and thinking to myself 'This feels next gen'.

And just today it hit me: the 2500K came out over five years ago! It was released in January 2011, and it wasn't long after that I built my PC.

I have been into PC gaming since 1995 and I have never seen a CPU remain current for this long. Yes, the focus has moved to more powerful GPUs, but the CPU still plays a vital role in gaming, and apart from the bigger VRAM buffer I'm really only getting around a 20% performance improvement with the GTX 970.

Overall I feel like I picked the right parts at the right point in time to build a gaming PC and think the 2500k will go down in history as one of the greatest gaming CPUs of all time. I love it and when it comes time for me to upgrade it, I think it will be genuinely hard for me to let go.

Anybody else have a Sandy Bridge CPU and feel the same way?




Good stuff man. Yeah, I remember back in the mid 90s a PC was obsolete for new games two years after release.
Buy a P90 in 1995 for what, $2000 in 1995 dollars, and by 1997 you couldn't run the big games of that year like Myth or Quake 2 without serious framerate issues. Plus you could notice a huge difference when you upgraded, for things like web browsing or even using an MP3 player. Not anymore.

My machine is almost 6 years old now. Can't play new AAAs like The Witcher 3, but it's still good for everything else. I am waiting for the new graphene technology chips before upgrading again. That is the next huge leap and it's not far off.



It's definitely a high point for Intel meeting expectations considering Nehalem itself raised them pretty high ...

But I'm happier with my Skylake build since it gives a decent improvement in IPC compared to Sandy Bridge, plus there are tons of x86 extension goodies too ...

I expect the gap to grow in favour of Skylake as time goes on ...



Frankly, Intel's IPC improvements after Sandy Bridge are disappointing. The only really good thing about the newer CPUs is the newer features, and even those aren't spectacular.



I had a 2500K machine, sold it to my brother a couple of years back, and have always regretted it.

Just built a 6700K/980 Ti machine and of course it's amazing, but I definitely could have got better value for the money.



Nettles said:
Good stuff man. Yeah, I remember back in the mid 90s a PC was obsolete for new games two years after release.
Buy a P90 in 1995 for what, $2000 in 1995 dollars, and by 1997 you couldn't run the big games of that year like Myth or Quake 2 without serious framerate issues. Plus you could notice a huge difference when you upgraded, for things like web browsing or even using an MP3 player. Not anymore.

My machine is almost 6 years old now. Can't play new AAAs like The Witcher 3, but it's still good for everything else. I am waiting for the new graphene technology chips before upgrading again. That is the next huge leap and it's not far off.

Yeah my Dad had a P100 back in '95. In around '99 I put a Voodoo2 in it. That gave it a few more years :)



With that kind of overclock, it still can't be beaten by AMD's current flagship at 4.7GHz ^^

Well, good for you. I've been running the direct successor (3570K) for years now and it has served me well even without overclocking. Though this year I'm going to splurge and finally get an i7 for my new rig.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

Yes, the 2500K has been one of Intel's great CPUs, like the i7-920 or the Q6600: all of them good processors that became great because of how well they overclocked.

It's a shame that Intel went for better thermals after that line of CPUs, with only marginal improvements on the performance side.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670K @ stock (for now), 16GB of 1600MHz RAM and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Same story with Nehalem... If you have a Core i7 970/980/990... You still have a beast of a chip... And they came out 6 years ago, overclocking one of them can put you ahead of even the Core i7 6700K in many situations.

The Core 2 Extreme QX9775 @ 3.6 - 3.8ghz is still capable of pretty much running every game out there... And that is 8 years old.

I still have a Sandy Bridge-E 3930K in my secondary rig; it still out-benches Intel's latest quad-core chips, especially overclocked (where it will go farther than Haswell-E).

The CPU is just not that important for gaming anymore once you get a decent quaddy or better; the real gains are in the GPU.

What I love about the Sandy Bridge chips is how much fun they were to overclock and how well they overclocked.



--::{PC Gaming Master Race}::--

Nettles said:
My machine is almost 6 years old now. Can't play new AAAs like The Witcher 3, but it's still good for everything else. I am waiting for the new graphene technology chips before upgrading again. That is the next huge leap and it's not far off.

You will be waiting for decades, if not forever. I don't know where you get the idea it's not far off.

Graphene can't even properly do the "on-off" switching you need for a transistor to work, not to mention all that bullshit about graphene running at hundreds of gigahertz. Yeah, a single silicon transistor can also do that under test conditions. Hell, even inside a Skylake processor some of them switch at a few dozen gigahertz. Not to mention that chips that fast would theoretically be limited by the speed of light, which doesn't get you very far when you are working through hundreds of billions of cycles per second. Instructions wouldn't be properly pipelined, latency and errors would increase exponentially as pipeline delays pile up and... you get the picture.
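To put a rough number on that speed-of-light point (a back-of-the-envelope sketch in Python; the frequencies below are illustrative assumptions, not figures from this thread): the distance a signal can cover per cycle shrinks to a few millimetres once clocks reach hundreds of gigahertz, which is roughly the size of a die.

```python
# Back-of-the-envelope: how far light travels in one clock cycle.
# The frequencies here are picked purely for illustration.
C = 299_792_458  # speed of light, metres per second

for freq_ghz in (4.5, 100, 300):
    mm_per_cycle = C / (freq_ghz * 1e9) * 1000
    print(f"{freq_ghz:>5} GHz -> light covers ~{mm_per_cycle:.1f} mm per cycle")
```

At a familiar 4.5GHz that works out to roughly 67mm per cycle; at 300GHz it is about 1mm, so even an ideal signal couldn't cross a whole chip within one cycle.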

I'll echo Gordon Moore's own words on alternatives to modern chips: it will be very hard to beat tens of billions of silicon (or, likely, silicon-plus-something) transistors on a single chip. They are basically the combustion engine of computing, except you don't feel compelled to switch to electric because of fossil fuels.