
Forums - PC Discussion - Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

Ka-pi96 said:
Hey guys, I've seen 'Laptop versions of graphics cards may work but are NOT officially supported.' on a few Steam game pages. What does that mean? I'm planning on getting a new gaming laptop at some point to replace my PC so that sounds a bit concerning to me.

Show us and maybe then we can help you ...



JEMC said:

 

AMD 14nm FinFET POLARIS GPU SIZE LEAKS OUT – 232mm² LARGE DIE

http://wccftech.com/amd-polaris-gpu-die-size-232-mm-2/

Details of AMD’s upcoming 14nm GPU have finally started leaking out. Some good detective work by the user AnarchX over at the Beyond3D/3DCenter forums has revealed what appears to be the LinkedIn profile of a senior engineer at AMD. Interestingly, the engineer lists multiple projects, one of which is the Polaris die. The size of the chip will be 232mm² and (assuming the information is accurate) will constitute one tier of the Polaris architecture.

 

"According to the information we have about the 14nm LPP process, and based on transistor density increase, a 232mm² GPU would be roughly equivalent to a 464mm² 28nm processor – at the same TDP levels. Since we already know that AMD is going to be focusing not just on performance but power efficiency as well – this number could be much higher, in fact we will discuss the number AMD is using below. We can however safely say that this die is more than capable of meeting the ‘minimum VR spec’ that AMD promises."

 

>>As a reference, AMD's 290/390 chips are 438 mm², so if that assumption is true and we add the performance increases from the new architecture and other improvements, we get a chip that will bring a nice improvement in performance and power consumption while also leaving room for bigger chips down the road (either for the Fury brand or a 5x0 series)

I would like to add that Polaris is not using TSMC's 20/16nm transistors ... 

AMD is using Samsung/GlobalFoundries' 14nm LPP process node, with a known die area scaling advantage of up to 15% smaller dies, which works out to roughly 17% more transistors to play with in the same die space ... 

Without accounting for blocks that don't contribute to performance, such as the display engine, video codecs, etc., a 232mm² 14nm part can match a 546mm² 28nm part in performance and get in range of 600mm² GPUs ...
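The arithmetic above can be sketched in a few lines. This is only a back-of-envelope model using the numbers from this thread: it assumes the 28nm-to-14nm shrink roughly doubles transistor density, and treats the LPP node's ~15% smaller die as a straight area divisor. None of these are official figures.

```python
# Back-of-envelope die-area scaling for the leaked 232mm² Polaris die.
# Assumptions (not official data): ~2x density from 28nm -> 14nm FinFET,
# plus the ~15% die-area advantage claimed for Samsung/GF 14nm LPP.

POLARIS_DIE_MM2 = 232          # leaked die size
DENSITY_GAIN_28_TO_14 = 2.0    # ~2x transistor density from the node shrink
LPP_AREA_SAVING = 0.15         # LPP dies ~15% smaller for the same transistors

# Equivalent 28nm die area for the same transistor budget
equiv_28nm = POLARIS_DIE_MM2 * DENSITY_GAIN_28_TO_14       # ~464 mm²

# A 15% smaller die means dividing by 0.85 to recover the equivalent area,
# which is where the "~17% more transistors" figure comes from (1/0.85 ≈ 1.17)
equiv_28nm_lpp = equiv_28nm / (1 - LPP_AREA_SAVING)        # ~546 mm²

print(f"~{equiv_28nm:.0f} mm² at 28nm, ~{equiv_28nm_lpp:.0f} mm² counting the LPP advantage")
```

Which lands right on the 464mm² and 546mm² figures quoted above.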



Ka-pi96 said:

The laptop? Haven't decided on one yet, or how much money I'm willing to spend. Probably won't be getting it until June at the earliest anyways.

And @Chaz I'd prefer a new desktop, but I'll be moving into dorms in September, so a laptop would just be all around easier.

Show us the page ... 

I also wouldn't get a laptop until June, once the new GPUs hit the shelves ...



fatslob-:O said:
JEMC said:

 

AMD 14nm FinFET POLARIS GPU SIZE LEAKS OUT – 232mm² LARGE DIE

http://wccftech.com/amd-polaris-gpu-die-size-232-mm-2/

Details of AMD’s upcoming 14nm GPU have finally started leaking out. Some good detective work by the user AnarchX over at the Beyond3D/3DCenter forums has revealed what appears to be the LinkedIn profile of a senior engineer at AMD. Interestingly, the engineer lists multiple projects, one of which is the Polaris die. The size of the chip will be 232mm² and (assuming the information is accurate) will constitute one tier of the Polaris architecture.

*pic*

"According to the information we have about the 14nm LPP process, and based on transistor density increase, a 232mm² GPU would be roughly equivalent to a 464mm² 28nm processor – at the same TDP levels. Since we already know that AMD is going to be focusing not just on performance but power efficiency as well – this number could be much higher, in fact we will discuss the number AMD is using below. We can however safely say that this die is more than capable of meeting the ‘minimum VR spec’ that AMD promises."

 

>>As a reference, AMD's 290/390 chips are 438 mm², so if that assumption is true and we add the performance increases from the new architecture and other improvements, we get a chip that will bring a nice improvement in performance and power consumption while also leaving room for bigger chips down the road (either for the Fury brand or a 5x0 series)

I would like to add that Polaris is not using TSMC's 20/16nm transistors ... 

AMD is using Samsung/GlobalFoundries' 14nm LPP process node, with a known die area scaling advantage of up to 15% smaller dies, which works out to roughly 17% more transistors to play with in the same die space ... 

Without accounting for blocks that don't contribute to performance, such as the display engine, video codecs, etc., a 232mm² 14nm part can match a 546mm² 28nm part in performance and get in range of 600mm² GPUs ...

If AMD can manage to get a 232mm² chip beating or coming close to the monsters that are their Fiji chips (at least the Fury), it would be awesome! But that could bring some trouble with cooling and heat density.

 

Oh, and it's a few days old but somehow I missed this piece of news:

THE PC GAMING SHOW IS BACK FOR 2016

http://www.pcgamer.com/the-pc-gaming-show-is-back-in-2016/

https://www.youtube.com/watch?v=dX_inDdvsxE



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

JEMC said:

If AMD can manage to get a 232mm² chip beating or coming close to the monsters that are their Fiji chips (at least the Fury), it would be awesome! But that could bring some trouble with cooling and heat density.

 

Oh, and it's a few days old but somehow I missed this piece of news:

THE PC GAMING SHOW IS BACK FOR 2016

http://www.pcgamer.com/the-pc-gaming-show-is-back-in-2016/

https://www.youtube.com/watch?v=dX_inDdvsxE

It would be nice, for their sake, if AMD came close to those monsters in such a small space, and I don't think heat density should be much of an issue at the clocks GPUs run at ... 

The Fury X was a missed opportunity, with AMD betting on more compute and bandwidth. Their DX11 drivers aren't bad, but they would have gotten a better deal with a bigger front end to lower submission overhead, 5 geometry processing units and maybe 60 CUs? 

As for the PC Gaming Show, it was pretty bad from what I heard and based on last year's announcements, so I'm not particularly excited for this year's showing ...



Ka-pi96 said:

Oh, the games? It was these 2
http://store.steampowered.com/app/378120/
http://store.steampowered.com/app/239140/

I could see some issues with Football Manager, provided you're running it on something ancient, likely with an Intel GMA series ... 

As for Dying Light, any DX11 GPU will do, since Microsoft is stricter with the conformance tests, provided you also have the required video memory. The only laptop equivalents you should avoid are the Mobility Radeon HD 5165 and below, since AMD recycled some DX10.1 parts in that product line ...



fatslob-:O said:
JEMC said:

If AMD can manage to get a 232mm² chip beating or coming close to the monsters that are their Fiji chips (at least the Fury), it would be awesome! But that could bring some trouble with cooling and heat density.

 

Oh, and it's a few days old but somehow I missed this piece of news:

THE PC GAMING SHOW IS BACK FOR 2016

http://www.pcgamer.com/the-pc-gaming-show-is-back-in-2016/

https://www.youtube.com/watch?v=dX_inDdvsxE

It would be nice, for their sake, if AMD came close to those monsters in such a small space, and I don't think heat density should be much of an issue at the clocks GPUs run at ... 

The Fury X was a missed opportunity, with AMD betting on more compute and bandwidth. Their DX11 drivers aren't bad, but they would have gotten a better deal with a bigger front end to lower submission overhead, 5 geometry processing units and maybe 60 CUs? 

As for the PC Gaming Show, it was pretty bad from what I heard and based on last year's announcements, so I'm not particularly excited for this year's showing ...

I do hope heat doesn't become a problem. After all, these cards were the "Arctic Islands" ones because of their focus on efficiency, but AMD will still have to come up with a good cooler and not the thing they used with the 290 series.

What I found odd about Fury was its performance disparity. At 1080p they are kind of bad performers, with the 980 coming close in many benches, and yet once resolution goes up their performance doesn't drop as much as the other cards'. In any case, AMD says that GCN 4.0 brings a lot of changes, so we'll have to wait a bit more to see what they've done.

 

About the PC Gaming Show, it was something completely different from what we're used to seeing at E3, and at some points it was more like a late-night show kind of conference: one host interviewing many people in the industry, with some videos of their work and some jokes.

It wasn't great or mind-blowing, but it wasn't horrible either. That said, there are lots of ways to improve it even without changing the format: less talk about how great PC gaming is and more game demos running, even if only in the background, would improve it a lot. Also, it would be great for everyone if both AMD and Nvidia took part in it and used the show to reveal some new hardware, but I can't see them both attending the same show without being the central point of it.

All I know is that I'll watch it again (not live as it will be too late over here).




Ka-pi96 said:

Thanks, shouldn't have any problems with a new GPU then

Speaking of AMD though, are they any good? I mean, usually I just completely ignore any AMD cards and only look at Nvidia. Are they worth looking at or am I doing the right thing?

You should figure that out for your own use cases ... 

I personally just buy for performance. If I had the money right now I would be looking to get a GTX 980 Ti, but if AMD wins the next round of GPUs, with DX12 to back them up a bit, I might stay with the red team, seeing as I'm currently fine ... 

Basically it all boils down to this: look at benchmarks for the most performance-intensive or poorly optimized games/software, however you define that, and then pick your brand based on those ... 
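That "buy for the worst case" rule is easy to make concrete. A minimal sketch, using entirely made-up benchmark numbers (average FPS in the heaviest titles tested), purely to illustrate the selection logic:

```python
# Hypothetical benchmark results (avg FPS) in the most demanding games tested.
# The card names are real products but these numbers are invented for the example.
benchmarks = {
    "GTX 980 Ti": {"demanding_game_a": 52, "demanding_game_b": 47},
    "Fury X":     {"demanding_game_a": 49, "demanding_game_b": 51},
}

# Rank each card by its WORST result across the heavy titles,
# then pick the card whose worst case is best.
best = max(benchmarks, key=lambda card: min(benchmarks[card].values()))
print(best)  # the card with the strongest worst-case showing
```

The point of ranking by the minimum rather than the average is that the most demanding title is the one you'll actually feel.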



JEMC said:

I do hope heat doesn't become a problem. After all, these cards were the "Arctic Islands" ones because of their focus on efficiency, but AMD will still have to come up with a good cooler and not the thing they used with the 290 series.

What I found odd about Fury was its performance disparity. At 1080p they are kind of bad performers, with the 980 coming close in many benches, and yet once resolution goes up their performance doesn't drop as much as the other cards'. In any case, AMD says that GCN 4.0 brings a lot of changes, so we'll have to wait a bit more to see what they've done.

All I know is that I'll watch it again (not live as it will be too late over here).

AMD hopes that more games will become compute-limited, giving them the advantage in the long run, and that's exactly what they're aiming for with support for asynchronous compute shaders, just like how console APIs expose that kind of feature ... 

I'm curious as to how much of an improvement it will be, with the changes they've made to the command processor, the geometry-related fixed-function units, and the shader instruction pre-fetching ... 
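The intuition behind the async compute point above can be shown with a toy timing model: if graphics work leaves part of the shader array idle during a frame, an asynchronous compute queue can fill those gaps instead of waiting for the frame to finish. The utilization and timing numbers below are invented purely for illustration, not measurements of any real GPU.

```python
# Toy model: why overlapping compute with graphics can be "free".
# All numbers are hypothetical, chosen only to illustrate the idea.

frame_time_ms = 16.0            # one frame of graphics work
graphics_utilization = 0.70     # fraction of shader capacity graphics actually uses
compute_work_ms = 4.0           # compute workload, measured at full shader capacity

# Serial submission: compute only starts after the graphics frame finishes
serial_ms = frame_time_ms + compute_work_ms

# Async submission: compute soaks up the idle 30% of capacity during the frame
idle_capacity_ms = frame_time_ms * (1 - graphics_utilization)  # ~4.8 ms of spare capacity
overlap_ms = min(compute_work_ms, idle_capacity_ms)            # how much fits in the gaps
async_ms = frame_time_ms + (compute_work_ms - overlap_ms)      # leftover runs after

print(f"serial: {serial_ms:.1f} ms, async: {async_ms:.1f} ms")
```

In this sketch the compute work fits entirely into the idle shader capacity, so the async frame costs no extra time. The real win depends on how much of the array the graphics workload leaves idle, which is exactly why compute-heavy GCN parts stand to benefit.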

JEMC said:

About the PC Gaming Show, it was something completely different from what we're used to seeing at E3, and at some points it was more like a late-night show kind of conference: one host interviewing many people in the industry, with some videos of their work and some jokes.

It wasn't great or mind-blowing, but it wasn't horrible either. That said, there are lots of ways to improve it even without changing the format: less talk about how great PC gaming is and more game demos running, even if only in the background, would improve it a lot. Also, it would be great for everyone if both AMD and Nvidia took part in it and used the show to reveal some new hardware, but I can't see them both attending the same show without being the central point of it.

All I know is that I'll watch it again (not live as it will be too late over here).

That would be great ... 

As for me, I don't do E3, so I base my judgment on what gets announced and what content they show for it ...



fatslob-:O said:
JEMC said:

I do hope heat doesn't become a problem. After all, these cards were the "Arctic Islands" ones because of their focus on efficiency, but AMD will still have to come up with a good cooler and not the thing they used with the 290 series.

What I found odd about Fury was its performance disparity. At 1080p they are kind of bad performers, with the 980 coming close in many benches, and yet once resolution goes up their performance doesn't drop as much as the other cards'. In any case, AMD says that GCN 4.0 brings a lot of changes, so we'll have to wait a bit more to see what they've done.

All I know is that I'll watch it again (not live as it will be too late over here).

AMD hopes that more games will become compute-limited, giving them the advantage in the long run, and that's exactly what they're aiming for with support for asynchronous compute shaders, just like how console APIs expose that kind of feature ... 

I'm curious as to how much of an improvement it will be, with the changes they've made to the command processor, the geometry-related fixed-function units, and the shader instruction pre-fetching ... 

That's one of the reasons why I like AMD: they launch products that not only cover today's necessities, but also the ones that will come in the future. Maybe that's why their 7970/280X is still a very capable card today even though it launched three or four years ago.

But, because everything has its cons, that forward thinking can also backfire on them. The best example I can think of is how they had a tessellator unit years before any game used it and, when engines and games finally started using tessellation, they did so with a different method that made those old cards useless for it.

fatslob-:O said:
JEMC said:

About the PC Gaming Show, it was something completely different from what we're used to seeing at E3, and at some points it was more like a late-night show kind of conference: one host interviewing many people in the industry, with some videos of their work and some jokes.

It wasn't great or mind-blowing, but it wasn't horrible either. That said, there are lots of ways to improve it even without changing the format: less talk about how great PC gaming is and more game demos running, even if only in the background, would improve it a lot. Also, it would be great for everyone if both AMD and Nvidia took part in it and used the show to reveal some new hardware, but I can't see them both attending the same show without being the central point of it.

All I know is that I'll watch it again (not live as it will be too late over here).

That would be great ... 

As for me, I don't do E3, so I base my judgment on what gets announced and what content they show for it ...

I always find E3 to be an interesting time because, if nothing else, it's a clear sign of how healthy the industry is.

And well, this year could be fun to watch, with Nintendo having new hardware and AMD (and hopefully Nvidia) using the conference to reveal at least part of their cards and plans.


