
Forums - PC Discussion - Intel and AMD Team Up on PC Chips to Battle Nvidia

 

Has Hell Frozen over?

Yes: 27 (65.85%)
No: 5 (12.20%)
I saw this coming, long ago: 9 (21.95%)

Total: 41
Chazore said:
JRPGfan said:

https://www.anandtech.com/show/12003/intel-to-create-new-8th-generation-cpus-with-amd-radeon-graphics-with-hbm2-using-emib

"With Intel buying chips from AMD, it stands to reason they could be buying more than one configuration, depending on how Intel wanted to arrange the product stack. Intel could pair a smaller 10 CU design with a dual core, and a bigger 20+ CU design with a quad-core mobile processor. A couple of benchmark sources seem to believe that there is at least two configurations in Polaris-like configurations, with up to 24 CUs in the high-end model."

24 CUs = 1536 cores.

For comparison with the RX 400 mobile series, the M470X has 896 cores.

AMD Polaris 11 supposedly has 1024 cores (stream processors) and does ~2 TFLOPS of GPU compute.
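A quick sanity check on those numbers (a sketch: GCN's 64 stream processors per CU is the architectural constant; the ~1 GHz clock used for the Polaris 11 figure is an assumption here):

```python
# GCN packs 64 stream processors into each Compute Unit (CU)
SP_PER_CU = 64

sp_24cu = 24 * SP_PER_CU       # high-end model from the article
sp_polaris11 = 16 * SP_PER_CU  # Polaris 11 has 16 CUs

# Peak FP32 = SPs * clock (GHz) * 2 FLOPs/clock (fused multiply-add)
tflops_polaris11 = sp_polaris11 * 1.0 * 2 / 1000

print(sp_24cu)           # 1536
print(sp_polaris11)      # 1024
print(tflops_polaris11)  # ~2.0 TFLOPS
```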

If I had stock in Nvidia this would worry me.

Nvidoomed?

I don't think it's going to be anything that will revolutionise the world, especially when the deal is for laptops.

Gaming is like 50% of their revenue, and its R&D is what allows a lot of the other parts to function.

Laptops with Nvidia cards in them are probably (I couldn't find quick numbers on it) a large part of that.

At worst it'll probably mean Nvidia losing like 25% of their revenue stream, so it's too early for Nvidoomed :p

 

Imagine if Intel suddenly decides to ship most of their "Core" series with AMD GPUs?

Last edited by JRPGfan - on 07 November 2017


It's quite funny that two companies which each have both CPU and GPU lines are partnering to pair one's CPU with the other's GPU, instead of Intel partnering with Nvidia for this.




JRPGfan said:

Gaming is like 50% of their revenue, and its R&D is what allows a lot of the other parts to function.

Laptops with Nvidia cards in them are probably (I couldn't find quick numbers on it) a large part of that.

At worst it'll probably mean Nvidia losing like 25% of their revenue stream, so it's too early for Nvidoomed :p

They also have a rather large base that buys into their GPU side, as well as their investment into AI. They aren't solely in just the GPU market, btw.




Not sure if this is ultimately good or bad for consumers, hmm. On one hand, I think that this partnership will help push the envelope for semiconductor-based technology in that market, but ... hopefully this isn't a further consolidation of corporate power that will result in an even tighter oligopoly. We shall see!



Now Intel is using Glue





Apple is going to be all over this for the MBPs. They prefer to go with integrated graphics for their laptops, and this is going to allow them to do that.
Although it looks to be Kaby Lake for now, so a quad-core configuration might be some time off.



SuperNova said:
Apple is going to be all over this for the MBPs. They prefer to go with integrated graphics for their laptops, and this is going to allow them to do that.
Although it looks to be Kaby Lake for now, so a quad-core configuration might be some time off.

Kaby Lake-R is basically Coffee Lake. The H-series chips have, as far as I know, always been quad-core.



JRPGfan said:
vivster said:

"High-end laptops" is a bit misleading. This technology will be in small-form-factor laptops like Ultrabooks. The performance will increase because so far you had to decide whether you want a fast CPU or a fast GPU. Now you can have both. But it's still around 15 W TDP, so you can't expect anything close to what actual high-end laptops deliver.

In big gaming laptops you will have dedicated GPUs anyway, so the problem of having to choose between one or the other doesn't even come up.

It's using HBM2 as onboard memory (still expensive, faster than GDDR5/6)... this won't just be for ultra-thins.

It'll be thinner than throwing in a discrete Nvidia mobile GPU, but basically able to perform like one.

This is more than just the low end of the performance spectrum; it could eat away at Nvidia's laptop GPU sales.

 

I see this as Intel wanting to deliver something like Nvidia GTX 1060+ levels of graphics in laptops.

And in future Intel laptops below that level, you won't find an Nvidia GeForce inside.

Naturally AMD will do their own thing, so... yeah, it's gonna eat away at Nvidia's laptop market.

Which basically means the only laptops with Nvidia GPUs in them might end up being the really high-end gaming laptops.

It's using just one single stack of HBM2, limiting the bandwidth to something around 250 GB/s, so about the same as an RX 580. However, due to the size of those chips, that means it has 4-8 GiB of HBM2 RAM as LLC, which should be more than enough to fuel the graphics part of the combo chip. If they had added a second stack, they could actually have used it as unified memory without the need for any additional DDR4 DIMMs (though I'm not sure if Windows would accept such a configuration out of the box).
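The ~250 GB/s figure follows directly from the HBM2 interface: one stack exposes a 1024-bit bus, and a common speed grade runs at 2 Gbps per pin (the pin rate is an assumption here; slower 1.6 Gbps grades also exist):

```python
bus_width_bits = 1024  # bits per HBM2 stack interface
pin_rate_gbps = 2.0    # Gbps per pin (assumed speed grade)

# total bits per second across the bus, converted to bytes
bandwidth_gbs = bus_width_bits * pin_rate_gbps / 8
print(bandwidth_gbs)   # 256.0 GB/s, in line with "around 250 GB/s"
```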



Bofferbrauer2 said:
JRPGfan said:

It's using just one single stack of HBM2, limiting the bandwidth to something around 250 GB/s, so about the same as an RX 580. However, due to the size of those chips, that means it has 4-8 GiB of HBM2 RAM as LLC, which should be more than enough to fuel the graphics part of the combo chip. If they had added a second stack, they could actually have used it as unified memory without the need for any additional DDR4 DIMMs (though I'm not sure if Windows would accept such a configuration out of the box).

It's going to be pretty decent for a laptop.

1536 stream processors × 1190 MHz × 2 FLOPs/clock ≈ 3.66 TFLOPS (at max clocks).

https://www.techpowerup.com/238542/intel-amd-mcm-core-i7-design-specs-benchmarks-leaked

 

This should be like having a desktop-class GeForce GTX 1060, imo.

Which again isn't bad, and in a laptop format.

That + FreeSync = kickass 1080p gaming laptop.
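The TFLOPS figure quoted above can be reproduced with the standard peak-FP32 formula (stream processors × clock × 2 FLOPs per clock, since a fused multiply-add counts as two operations):

```python
stream_processors = 1536
clock_mhz = 1190

# MHz -> 1e6 cycles/s; 2 FLOPs per cycle per SP; divide by 1e6 for TFLOPS
tflops = stream_processors * clock_mhz * 2 / 1_000_000
print(round(tflops, 3))  # 3.656
```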



DonFerrari said:
It's quite funny that both companies that have CPU and GPU lines are partnering to have one's CPU and other's GPU instead of Intel partner with NVidia for this.

A quote from an AnandTech user said it well:

"IMHO, this is a great move from AMD + RTG.

Intel sees Nvidia as the real threat, because deep learning and data centres are where the money is. Nvidia has huge revenue and can leverage more financial power compared to AMD, and as such is a more threatening adversary. Besides the friction between Intel and Nvidia, I think the Intel of the last decade has a soft spot for AMD.

Funny thing is that because Intel has higher prices due to their margins, they indirectly also kept AMD afloat. If Intel had sold their CPUs for the same prices as AMD, everybody on a smaller budget would also have gone for Intel, depriving AMD of selling any CPUs.

Intel has a better relationship with AMD, and this is actually a win-win situation for AMD/RTG, because they will sell more GPUs and gain Intel + AMD brand recognition. This is very good for AMD/RTG in the long run because it means financial income in the long run. The laptop market is very big, so this makes sense.

For AMD this is another custom design, but if it is anything GCN- or NCU-like, it could also mean more HSA adoption and more use of GCN GPU derivatives as a general compute source. Nvidia has been very smart with promoting CUDA, and now it is a household name in the PC programming world for various disciplines. AMD could very much use the same effect."