
Forums - PC - AMD Volcanic Islands "Hawaii" GPU confirmed

Pemalite said:
riderz13371 said:
This is kind of a random question but...my computer has 2 graphics cards (Intel HD 4000 and Nvidia GeForce GT 630M 1 Gb Gddr5) and whenever I play games on my laptop it gets pretty loud. It switches out which graphics card it uses depending on what I'm doing so for instance if I play a game it uses Nvidia but if I'm browsing the internet it uses the Intel one. I was just wondering is it normal for it to be loud when I play games? I assume it's the fans that are hard at work but I get worried and I try not to play for a long time because I feel as if I am harming my computer when doing so.


It's called "Optimus" technology.

Essentially when you are not doing anything demanding it switches to the Intel graphics to save on battery life.

As for the noise, it's perfectly normal. You are not going to reduce the notebook's life; it was designed to handle it.
Heck, if I had it, I would have overclocked the nVidia GPU on top of it. :P

Don't even know how to do that but I read on some sites that it's harmful to overclock on laptops. Gonna try to do a little more research before attempting something like that.



riderz13371 said:

Don't even know how to do that but I read on some sites that it's harmful to overclock on laptops. Gonna try to do a little more research before attempting something like that.


It's not harmful if you don't go stupidly overboard and you know what you are doing.

Years ago I had a laptop with a Mobility Radeon 9700 Pro and a single-core Pentium M, and I ran it overclocked for years without a single issue; it even ran Oblivion on relatively high settings. :)




www.youtube.com/@Pemalite

Pemalite said:
JEMC said:
With the chips being bigger they will also be more expensive.

AMD, you better have something to compete with the GTX 7xx series, otherwise...


Not always more expensive.
28nm is old and mature now, so yields should be high; hence they can get away with larger chips and still end up cheaper than when the 7000 series launched.

I'm interested to see what uArch changes they bring in. When the 6000 series was stuck at 40nm, AMD had to increase die sizes and improve efficiency (i.e. move from VLIW5 to VLIW4).
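The yield argument can be put in rough numbers with a standard negative-binomial die-yield model. All figures below are hypothetical, purely to illustrate how a mature process (lower defect density) can make an even bigger die more likely to come out defect-free:

```python
def die_yield(area_mm2, defects_per_mm2, alpha=3.0):
    """Negative-binomial yield model: fraction of good dies
    as a function of die area and process defect density."""
    return (1 + area_mm2 * defects_per_mm2 / alpha) ** (-alpha)

# Hypothetical numbers for illustration only:
early_28nm  = die_yield(365, 0.005)  # immature process, high defect density
mature_28nm = die_yield(430, 0.002)  # bigger die, but far fewer defects

print(f"early: {early_28nm:.0%}, mature: {mature_28nm:.0%}")  # → early: 24%, mature: 47%
```

In this toy example the bigger die on the mature process still yields roughly twice as well, which is the sense in which 28nm maturity could offset Hawaii's larger size.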

Bold: that's the problem.

Yes, the process is now more mature, but AMD (and Nvidia) are currently enjoying that: it's what has allowed them to reduce their prices to current levels. A bigger chip will therefore be more expensive than their current ones. Sure, they will still be cheaper than the HD 7xxx series was at launch, but the HD 9950 will cost about the same as the 7970 GHz does now, and the 9970 even more.

And then comes the GTX 770: it's faster than the GTX 680 and the HD 7970 GHz, and it costs only a little more than the latter despite having no competition. That means Nvidia could lower its price to 7970 levels if they wanted, and do the same with the 780 down to a more (but not completely) logical price point of 500 €/$, maybe even 450.

Where does that leave AMD? In a bad situation, unless the 9950 is almost as fast as the (probably) similarly priced 770, and the same holds for the 9970 and the 780.

That's why I said that they better have something to compete.



Please excuse my bad English.

Former gaming PC: i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Current gaming PC: R5-7600, 32GB RAM 6000MT/s (CL30) and a RX 9060XT 16GB

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

I wonder how they will perform.

Perhaps a 9970 will be 30% faster than a 7970 but also draw slightly more power.

I need a new GPU and PC soon. Or do I? Maybe I can survive with the PS4 until Dark Souls 2 comes out in spring of 2014 and I won't need to buy a new PC until autumn 2014.

Will AMD's 20nm cards be available in autumn 2014? Perhaps. By then a decently priced PC should clearly outrun a PS4, and that would be a logical time to start playing multiplats with enhanced graphics on PC.



Finally. Some competition.



Slimebeast said:

I wonder how they will perform.

Perhaps a 9970 will be 30% faster than a 7970 but also draw slightly more power.

I need a new GPU and PC soon. Or do I? Maybe I can survive with the PS4 until Dark Souls 2 comes out in spring of 2014 and I won't need to buy a new PC until autumn 2014.

Will AMD's 20nm cards be available in autumn 2014? Perhaps. By then a decently priced PC should clearly outrun a PS4, and that would be a logical time to start playing multiplats with enhanced graphics on PC.

The 7970 GHz Edition is already the most power-hungry chip available; if the 9970 uses even more (which it will), it will need one hell of a cooling system.

About your second part: I also need to replace my PC (faulty mobo), but I decided to wait on the GPU until AMD and Nvidia show their new architectures and cards at 20nm. I hope they launch by mid-2014.




JEMC said:

The 7970 GHz Edition is already the most power-hungry chip available; if the 9970 uses even more (which it will), it will need one hell of a cooling system.

 


Are you absolutely 100% positive it will use more power? Got any facts to back up that claim? Links even?

AMD has a good idea of the transistor characteristics at that fabrication process, so they will be able to use optimal transistors at that node.
There is a reason the 6900 series is both faster and more power efficient (die size + performance per watt) than the 5870, even though they were both at 40nm. :P
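That point about optimal transistors can be sketched with the classic CMOS dynamic-power relation, P ≈ C·V²·f. The numbers below are made up purely to show how a lower operating voltage can offset both more transistors and a higher clock:

```python
def dynamic_power(switched_capacitance, voltage, frequency_mhz):
    # Classic CMOS dynamic (switching) power relation: P ~ C * V^2 * f
    return switched_capacitance * voltage ** 2 * frequency_mhz

# Hypothetical figures for illustration only:
older = dynamic_power(1.00, 1.15, 850)  # baseline chip
newer = dynamic_power(1.25, 1.00, 880)  # 25% more transistors, better-tuned (lower) voltage

print(newer < older)  # → True; the voltage-squared term dominates
```

Because voltage enters squared, a modest voltage reduction on a mature node can more than pay for extra transistors and a slightly higher clock.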





Pemalite said:
JEMC said:

The 7970 GHz Edition is already the most power-hungry chip available; if the 9970 uses even more (which it will), it will need one hell of a cooling system.

 


Are you absolutely 100% positive it will use more power? Got any facts to back up that claim? Links even?

AMD has a good idea of the transistor characteristics at that fabrication process, so they will be able to use optimal transistors at that node.
There is a reason the 6900 series is both faster and more power efficient (die size + performance per watt) than the 5870, even though they were both at 40nm. :P

No, I don't have any proof.

In fact, I remember leaked specs from six months ago, or even older, that talked about it having higher frequencies and using less power, which was a bit hard to believe. If they are also using bigger chips, then it's even harder to believe, don't you think?

But hey, I won't complain if they do it!




JEMC said:

No, I don't have any proof.

In fact, I remember leaked specs from six months ago, or even older, that talked about it having higher frequencies and using less power, which was a bit hard to believe. If they are also using bigger chips, then it's even harder to believe, don't you think?

But hey, I won't complain if they do it!


Nope. History says otherwise.

Radeon 5870: 2.1 billion transistors, 850 MHz core clock, 40nm.
Radeon 6970: 2.6 billion transistors, 880 MHz core clock, 40nm.

By those metrics, the 6970 should use significantly more power because it's larger and has a higher clock speed, but reality played out differently.
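To put a number on "by those metrics": a naive model where power scales with transistor count times clock speed predicts roughly 28% more draw for the 6970.

```python
# Spec figures quoted above
transistors_5870, clock_5870_mhz = 2.1e9, 850
transistors_6970, clock_6970_mhz = 2.6e9, 880

# Naive expectation: power scales with transistor count * clock speed
naive_power_ratio = (transistors_6970 / transistors_5870) * (clock_6970_mhz / clock_5870_mhz)
print(round(naive_power_ratio, 2))  # → 1.28, i.e. ~28% more power predicted
```

Actual draw also depends on operating voltage and uArch efficiency, which is why reality can diverge from this naive estimate.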

But we won't know for sure until we have them in our hands. I'll be upgrading to it regardless.





Pemalite said:
JEMC said:

No, I don't have any proof.

In fact, I remember leaked specs from six months ago, or even older, that talked about it having higher frequencies and using less power, which was a bit hard to believe. If they are also using bigger chips, then it's even harder to believe, don't you think?

But hey, I won't complain if they do it!


Nope. History says otherwise.

Radeon 5870: 2.1 billion transistors, 850 MHz core clock, 40nm.
Radeon 6970: 2.6 billion transistors, 880 MHz core clock, 40nm.

By those metrics, the 6970 should use significantly more power because it's larger and has a higher clock speed, but reality played out differently.

But we won't know for sure until we have them in our hands. I'll be upgrading to it regardless.

I hope they can do it.

And you'll upgrade? Why don't you wait for their new architecture? (The 9xxx cards will use GCN 2.0, which is "only" a revision, if I'm not mistaken.)


