
AMD Ryzen Full Lineup Prices, Specs & Clock Speeds Leaked

Werix357 said:

Thought that might have been the case. The only AMD CPU I've had was the Athlon 3DNow!, which was good, but yeah, Intel has been pretty good the last 10-15 years.

I get what you're saying, but 3DNow! is an x86 instruction set extension, not a CPU model.
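For what it's worth, 3DNow! still shows up as a CPU feature flag rather than a model name; here's a minimal, Linux-only Python sketch that just checks /proc/cpuinfo:

```python
# Minimal Linux-only check: 3DNow! is reported as a CPU feature flag,
# not a model name. Old AMD chips expose "3dnow" / "3dnowext" here.
def has_3dnow(path="/proc/cpuinfo"):
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                flags = line.split(":", 1)[1].split()
                return "3dnow" in flags or "3dnowext" in flags
    return False

if __name__ == "__main__":
    print("3DNow! supported:", has_3dnow())
```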



Captain_Yuri said:
SvennoJ said:
Interesting. Will games start using 8 cores? Do they already use 6 cores atm?
Or is 4 cores still plenty for games? It seems the i5 without hyper-threading is still more than capable of keeping up with any console game running on 6 Jaguar cores.

i5s are plenty for 60fps+ gaming, but PCs have been stuck in the quad-core region for too long. Yes, these are high-performance cores, but really, we should have had 6-8 high-performance cores at affordable prices a while ago. Thanks to Intel's monopoly and AMD failing to deliver on CPUs, though, it has been zzz-worthy.

Hopefully Ryzen will change that and usher in an era of affordable high-performance 8-core CPUs.

After the amazing Core 2 series... Sandy Bridge was the last great Intel CPU that ushered in amazing price/performance/power ratios; since then it's been relatively small increases whilst AMD fumbled with Bulldozer, Piledriver, Steamroller and Excavator.

AMD did have a few good chips. The FX 8120 overclocked like a champ and was cheap on release, whilst the Phenom II X6 1090T also gave Nehalem a run for its money when you pushed the NB clock up.

Muffin31190 said:
Also, does anyone know if the threading is better with Intel on their cores? I could be wrong, but this looks like Intel has better threading than this Ryzen line of CPUs.

Ryzen segments its L3 cache between its quad-core complexes, so there should be a little bit of a penalty there.
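Once chips are in hand, that split is easy to see for yourself on Linux; here's a rough sketch that just reads the standard sysfs cache topology (nothing Ryzen-specific assumed):

```python
# Rough Linux-only sketch: list which logical CPUs share each L3 slice.
# On a chip with split L3 you'd expect to see cores grouped into
# separate L3 domains rather than one big shared cache.
import glob

def l3_domains():
    domains = set()
    for index in glob.glob("/sys/devices/system/cpu/cpu*/cache/index*"):
        try:
            with open(index + "/level") as f:
                if f.read().strip() != "3":
                    continue
            with open(index + "/shared_cpu_list") as f:
                domains.add(f.read().strip())
        except OSError:
            continue
    return sorted(domains)

if __name__ == "__main__":
    for cpus in l3_domains():
        print("L3 shared by CPUs:", cpus)
```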

Captain_Yuri said:
Muffin31190 said:
Also, does anyone know if the threading is better with Intel on their cores? I could be wrong, but this looks like Intel has better threading than this Ryzen line of CPUs.

We don't know yet. According to AMD's own testing, their top-of-the-line Ryzen CPU, which costs $499 according to the leak, is slightly slower than Intel's 6900K, which costs $1,200.

The issue is... they're AMD's benchmarks... Granted, we can test them for ourselves though, since AMD has provided us the files.

Engineering samples should be a good baseline of what to expect; they were accurate for Phenom, Phenom II and Bulldozer, so there is precedent.

And the Ryzen engineering sample on a per-core basis was trending around a Core i5.
Energy efficiency was also worse than Intel's.

shikamaru317 said:

Yeah, I can't wait to see those benchmarks, that's where my budget is at. If the 1200X and the 1300 beat the similarly priced high-end i3s and low-end i5s, I will most likely use one or the other in my next upgrade, paired with an RX 480 or the lowest-end Vega GPU (depending on pricing later this year).

Yeah. Dual cores need to take a hike already. Ditch the Core i3s. A quad-core Ryzen could make a pretty potent gaming rig at a low price.

Captain_Yuri said:

Mhmm, what I am most curious to see is what Intel's response will be, provided everything is legit. Will they finally lower their 8-core prices to within $100 or so of the Ryzen prices? Because if Intel, say, lowers the price of the 6900K to $600, I might go for that instead, since some of the features are better, such as quad-channel RAM, Thunderbolt and so on.

Even when Intel was losing to AMD's K7 and K8 architectures, it didn't really respond with price cuts to any extreme extent.
Intel was still out-selling AMD significantly thanks to agreements it had with OEMs.

Whether Intel responds in kind with price cuts really depends on how much market share AMD can claw away. AMD could win over enthusiasts and gamers, a segment that has been seeing significant growth over the past few years.

eva01beserk said:

Isn't the 6900K the one that costs over $1k? If it is, do you really think they would cut the price so much? It would be like saying in the PR, "Yep, they beat us; we had no choice or they will push us off the market."

The 6900K has a ton of features that Ryzen doesn't, which is excuse enough for Intel to maintain its high profit margins on those chips.

Captain_Yuri said:

Well, the question has always been: how much markup is Intel making on that CPU?

According to Intel, their profit margin is around 60-70%.
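Quick back-of-the-envelope on what that margin implies (the $1,200 street price and the 60-70% figure are just the numbers from this thread, not confirmed):

```python
# Rough arithmetic only: estimate what a chip costs to produce
# at a given street price and gross margin.
def unit_cost(price, margin):
    return price * (1 - margin)

price = 1200  # approximate 6900K street price quoted above
for margin in (0.60, 0.70):
    print(f"At {margin:.0%} margin, cost ≈ ${unit_cost(price, margin):.0f}")
# => roughly $360-$480 per chip, so even a $600 price would still be profitable.
```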

eva01beserk said:

I believe they have been charging so much because they had no competition. I think they will all of a sudden announce a successor with the same performance as Kaby Lake but claim they "found new tech and/or a new process" to lower the price. We will know it's bullshit, but they need to save face. It's not like we will ever find out the profit on each CPU they make.

These chips take years to design.

Intel, however, has a counter to AMD's Ryzen dropping next refresh: Coffee Lake will introduce Intel's first 6-core processor on the mainstream socket (likely Socket 1151).

That has been common knowledge since Intel unveiled its roadmap a couple of years ago.

alrightiwill said:
Even if Intel reduces prices, consumers should still buy AMD (if these numbers are accurate). Otherwise it rewards Intel for the years it spent charging more while doing less than it could.

Intel needs to be punished by the consumer for resting on its laurels... if AMD produces the goods, of course.

Disagree. Buy the best you can afford, regardless of whether it is Intel or AMD. Do not reward a company for less-than-optimal products.

thismeintiel said:

Well, you should be happy to have AMD.  Considering the Ryzen 7 1700X offers similar performance to the i7 6900K, but at less than half the price, it's going to force Intel to drop their prices real quick.

Probably should wait for someone like Anandtech to do a proper review on Ryzen before you make that assumption. ;)

NATO said:

Not far off.

I was AMD all the way through to the 1100T: first a K6-2 500, then an Athlon 900, Athlon XP 2200+, Athlon 64 2800+, Phenom 940 BE, Phenom 965, and lastly the Phenom X6 1100T. All of them performed worse than their Intel counterparts, with the only caveat being that they were cheaper, and most of them ran much hotter than their Intel counterparts. The 1100T, coupled with the HD 6990, decided it didn't enjoy life, committed suicide mid-rendering session and destroyed itself and the motherboard.

Been with Intel ever since; never had any issues. It costs a lot more, but I haven't regretted the switch to Intel, not even once.

Funny. I had the 1090T overclocked to 4GHz with the NB clock at 3GHz on 1.45v, and two Radeon 6950s unlocked into 6970s, overclocked and overvolted in CrossFire, powering three 1920x1080 screens (5760x1080), and it lasted years.
Later I had a 1055T and a 1035T, which are still running in their respective systems today.

In fact, I gave my grandmother the old Phenom II X6. Because, you know, emails and stuff. It's still solid after all these years.

The Phenoms only really came into their own once you overclocked them; pushing the NB clock up to 3GHz can boost IPC by a good 10-15% in some instances, which gave Nehalem a run for its money.

But you are right that thermals were shit. But honestly, if you have a Radeon 6990, you obviously don't care about thermals or power consumption anyway.

Sadly, you likely just had a dud. That can happen with Intel as well.

SvennoJ said:
Interesting. Will games start using 8 cores? Do they already use 6 cores atm?
Or is 4 cores still plenty for games? It seems the i5 without hyper-threading is still more than capable of keeping up with any console game running on 6 Jaguar cores.

Some games do use 6 and 8 cores, but it's not like it's needed to any great degree.

4 cores is still more than enough for the majority of games.
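If you want to see how many cores a given game actually loads on your own rig, here's a quick sketch with psutil (assuming it's installed) that samples per-core load while the game is running:

```python
# Quick-and-dirty check of per-core load while a game is running.
# Requires: pip install psutil
import psutil

def sample_load(seconds=10):
    for _ in range(seconds):
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        busy = sum(1 for load in per_core if load > 50)
        print(f"cores above 50% load: {busy}  |  {per_core}")

if __name__ == "__main__":
    sample_load()
```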



--::{PC Gaming Master Race}::--

NATO said:
Werix357 said:

Thought that might have been the case. The only AMD CPU I've had was the Athlon 3DNow!, which was good, but yeah, Intel has been pretty good the last 10-15 years.

I get what you're saying, but 3DNow! is an x86 instruction set extension, not a CPU model.

Yeah, it was a while ago and I couldn't remember if it was a feature or something to do with the model, but that was the only thing I could remember about that CPU.



I am officially erect. If these are only slightly under par compared to Kaby Lake, I might be able to afford to build a new rig. But the best news is that I'm hearing they're performing better than Kaby Lake dollar for dollar.



If all these leaks turn out to be true, AMD should build Jim Keller a statue at their HQ and always have an office ready and waiting for him whenever he wants to come back.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670K @ stock (for now), 16GB RAM at 1600MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

shikamaru317 said:
thismeintiel said:

But what is the TDP of the chips in the Pro?  I couldn't find the info myself, but it has to be quite a bit higher than the launch PS4.  I do know it is rated for ~300 watts, I believe.  If they are going to keep up with PCs, to some extent, they are going to have to aim for a higher TDP.  Granted, I do expect them to use semi-custom chips, again, so they are going to be slightly weaker variants.

I'm guessing you meant 20 in your first sentence.  And, I know it's probably unlikely, since it's not guaranteed to launch in 2018.  If they can make it, the chances are greater, but I'm still leaning more towards a Vega 10.  Also, I think Sony is going to want to be around double the Scorpio to make the gap seem even larger, so ~11-12 Tflops.  If they use an underclocked Vega 10, which will also help with heat, they should achieve that. 

I seriously doubt we are going to see anything as low as 8 Tflops.  Even though not every flop is the same, it would just be mocked as barely a jump worthy of a new gen, especially with reasonably priced 15+ Tflops cards launching by then.  It would also only be ~4x more powerful than the OG PS4 and less than twice that of the Pro.  At least with ~12 Tflops, you're looking at ~6-6.5x the OG PS4 and ~3x that of the Pro.  I think the vast majority of console gamers would be fine with that.  If they only went with 8/9 Tflops, it would probably also give MS an easy way to win by just taking a slight loss on a ~12 Tflops XB2.  And I don't think Sony wants to risk that.

The TDP of PS4 Pro is 300 watts, but testing shows that it actually uses less power than the launch PS4 did in some scenarios, and only slightly more power when playing a Pro enhanced game (155 watts for Pro vs 148 watts for the launch PS4): https://www.youtube.com/watch?v=0wNoCnPxTp4

No, I meant Vega 10. Maybe I'm not up to date with the latest rumors, but the last rumor I saw was that Vega 10 will be rated at about 225 watts while using around 200 watts in typical real world scenarios, which makes sense to me, considering it produces twice the flops of the 150 watt rated RX 480. According to that same rumor, Vega 20 will be a 7nm die shrink of Vega 10 releasing in 2018 that is clocked considerably higher and uses less power (rated at 150 watts).

I can't see PS5 being double Scorpio in 2018 and probably not even in 2019, not unless Sony is willing to either sell at a loss early on or launch at $500+. Bear in mind that the desktop GPU that PS4's GPU is closest to in specs cost about $180 when PS4 released; last I heard, Vega 10 is expected to launch around the $350-400 mark later this year, and I can't see it dropping to <$250 before Holiday 2018 or even Holiday 2019. The price/performance sweet spot Sony will most likely be going for will probably be Vega 11 (~7-8 tflops) in 2018 and either a cut-down Vega 10 or mid-high range Navi chip in 2019 (~10 tflops). I wouldn't be expecting as big of a boost over previous hardware as we saw this gen, that was a 7 year boost in power (2006-2013), we're most likely looking at a 5-6 year boost this gen (2013-2018/19). PS4 was roughly a 6x boost over PS3, I'd be expecting about a 4.5-5x boost for PS5 (8.1-9 tflop), taking into account the shorter cycle.

Sorry, meant Vega 20 in the first sentence of your 2nd paragraph.  You said Vega 10 wasn't supposed to launch until 2018/19, when it launches this year. And while the PS4 Pro only uses ~160W+ in probably the more extreme cases, it's obvious it was built to handle more.

Just to be clear, I don't see the PS5 launching until late 2019.  That's two years to see price drops in the components that are out this year, and a year if the 20 in fact launches in 2018.  Again, I'm leaning more towards an underclocked 10.  I just do not see them in any case aiming so low as to go with a Vega 11.  Even if MS was toying with the idea of abandoning the console market, this would 100% change their mind, and give them an easy win if they target the 10.  An ~8 Tflops vs ~12 Tflops gap cannot be ignored.

As far as prices go, they do drop quite quickly in the GPU world.  The R9 Fury, for example, came out in July of 2015 for $549 (more for an actual graphics card using the chip), but you can get a Fury graphics card for less than $280, now.  Sometime this year, that will probably drop even more.  Sony will also pay a much cheaper price for the chip itself, since it buys in bulk.  Also, keep in mind that Sony has ALWAYS launched at a loss.  It is usually only a slight loss, but it still allows them to put in better chips and make up for the loss with SW sales.  And now with how much money they are raking in with PS Plus, I wouldn't doubt they stretch it a little more if they have to.
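For anyone trying to follow the TFLOP figures being thrown around in this exchange, they mostly come from the usual shaders × clock × 2 rule of thumb; here's a rough Python sketch (the shader counts and clocks are approximate public specs, and the PS5 multipliers are just the guesses above, nothing confirmed):

```python
# Rough arithmetic: single-precision TFLOPs = shaders * clock(GHz) * 2 ops / 1000.
def tflops(shaders, clock_ghz):
    return shaders * clock_ghz * 2 / 1000

print("RX 480  ≈", round(tflops(2304, 1.266), 2), "TFLOPs")   # ~5.8
print("PS4     ≈", round(tflops(1152, 0.800), 2), "TFLOPs")   # ~1.84
print("PS4 Pro ≈", round(tflops(2304, 0.911), 2), "TFLOPs")   # ~4.2
# The 4.5-5x guess above: 1.84 * 4.5 ≈ 8.3, 1.84 * 5 ≈ 9.2 TFLOPs.
print("PS5 guess:", round(1.84 * 4.5, 1), "-", round(1.84 * 5, 1), "TFLOPs")
```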



Depending on Intel's head-to-head gaming performance with Coffee Lake, I could be in the market for a 1600X or 1700/X as a replacement for my aging 3770K @ 4.6GHz / 2400MHz DDR3 build. I've been finding reasons to upgrade difficult the past few years due to mostly everything being GPU-bound at 1440p-4K.



PC I i7 3770K @4.5Ghz I 16GB 2400Mhz I GTX 980Ti FTW

Consoles I PS4 Pro I Xbox One S 2TB I Wii U I Xbox 360 S

Pemalite said:
thismeintiel said:

Well, you should be happy to have AMD.  Considering the Ryzen 7 1700X offers similar performance to the i7 6900K, but at less than half the price, it's going to force Intel to drop their prices real quick.

Probably should wait for someone like Anandtech to do a proper review on Ryzen before you make that assumption. ;)

True.  Though, even if it is a little below it, that ~$800 difference really can't be ignored.  But, like you said, it depends on others' willingness to switch.



Reviews coming Feb 28th



Given how long it's taking for Ryzen to come out, while I don't question its value across the spectrum, I am unsure about the legs it will have coming to market just matching the Broadwell architecture.

Everyone knows Intel's consumer-level i7s have always tended to lead the 6-8 core Extreme variants in gaming performance time and time again, so while Intel's ticks have been unimpressive as major upgrades, the truth is Coffee Lake will arrive come fall with another 15% IPC improvement.

Right now Ryzen at launch has a 14-15% single-threaded performance deficit to Kaby Lake, and that will roughly double with Coffee Lake.


I guess I need to see scenarios where an 8-core/16-thread Ryzen will truly outshine your latest i7 in games, considering it will occupy that same $300 range. More threads/cores sounds cool, but I would need it to prove a difference-maker. IPC is still very important in a multi-threaded environment, since the first few threads still handle a lot of the heavy lifting.

...then you've got to consider what difference it makes at 1440-2160p resolutions, where games are less CPU-dependent.
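As a quick sanity check on the "deficit will roughly double" arithmetic a few paragraphs up (using this thread's 14-15% figures, which aren't confirmed benchmarks):

```python
# Rough arithmetic: if Ryzen trails Kaby Lake by ~15% in single-thread,
# and Coffee Lake adds another ~15% IPC on top of Kaby Lake, the gap
# compounds multiplicatively.
kaby_lead = 1.15     # Kaby Lake ~15% faster than Ryzen (the thread's claim)
coffee_gain = 1.15   # claimed Coffee Lake IPC gain over Kaby Lake
ryzen_vs_kaby = 1 - 1 / kaby_lead                    # ≈ 13% behind
ryzen_vs_coffee = 1 - 1 / (kaby_lead * coffee_gain)  # ≈ 24% behind
print(f"vs Kaby Lake:   {ryzen_vs_kaby:.0%} behind")
print(f"vs Coffee Lake: {ryzen_vs_coffee:.0%} behind (roughly double)")
```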



PC I i7 3770K @4.5Ghz I 16GB 2400Mhz I GTX 980Ti FTW

Consoles I PS4 Pro I Xbox One S 2TB I Wii U I Xbox 360 S