
Will sales of the Scorpio take a hit?

tokilamockingbrd said:
Lawlight said:

Weird, Google says it's $329+.

He said stock. The base 4 GB is $199, the base 8 GB is $249. I got a Black Edition 8 GB with a nice OC and cooling upgrades for $299. They're really popular right now, so supply and demand has pulled prices up some, but they launched at $199/$249.

I need a link, mate. I don't know the difference between a stock and a non-stock graphics card. He said $200 - I cannot find any link to a $200 one.



Lawlight said:
tokilamockingbrd said:

He said stock. The base 4 GB is $199, the base 8 GB is $249. I got a Black Edition 8 GB with a nice OC and cooling upgrades for $299. They're really popular right now, so supply and demand has pulled prices up some, but they launched at $199/$249.

I need a link, mate. I don't know the difference between a stock and a non-stock graphics card. He said $200 - I cannot find any link to a $200 one.

https://www.nowinstock.net/computers/videocards/amd/rx480/

It seems its popularity has driven up the price, but look at the Best Buy price (out of stock, of course): there is your $199.



psn- tokila

add me, the more the merrier.

tokilamockingbrd said:
Lawlight said:

I need a link, mate. I don't know the difference between a stock and a non-stock graphics card. He said $200 - I cannot find any link to a $200 one.

https://www.nowinstock.net/computers/videocards/amd/rx480/

It seems its popularity has driven up the price, but look at the Best Buy price (out of stock, of course): there is your $199.

Probably a one-time deal. The cheapest on the Best Buy site is $250. I tried the Newegg one and it added $25 as tax?! What's up with that?



UltimateGamer1982 said:
There's the fact that the NX could be a hit, further taking sales away from Scorpio, so who knows what will happen.

God, I love facts.



binary solo said:

Question: Are there any advantages to running a GPU at minimum or average clock vs maximum/overclock? I assume a GPU needs to work with all other components of the system, which means its operating speeds need to be in sync with the CPU etc. So is the PS4P going with the (probable) minimum clock more about how the GPU needs to operate in sync with the CPU than there being any inherent benefit to operating at minimum? I assume a higher clock means more heat, but that shouldn't matter if you have decent cooling/venting.

In the PS4Pro, the CPU and GPU seem to be using different clock generators (like the XBox, unlike the PS4). The problem with AMD designs is they use a "pack as much as possible as tightly as possible" philosophy, which tends to limit viable speeds rapidly, unlike NVidia, which uses more of a "keep it simple and leave space to breathe" design philosophy.

Generally, the higher the clock rate, the more current you must provide (and remove!). At school, we learned that P = I^2*R is the heat dissipated in a resistor, and you can see it goes with the square of the current I. AMD designs have a tendency to "totally lose it" above a certain current. A stock RX480 card already reaches 150W, and a 6TF card would easily pass 200W. Then you have to add an 8-core "better than Jaguar" CPU, which adds 60-90W depending on speed.
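
To make that square relationship concrete, here's a rough back-of-the-envelope sketch in Python (the current and resistance figures are made-up illustrative values, not measured RX480 numbers):

    # Rough illustration of why heat rises sharply with current: P = I^2 * R.
    # The numbers below are hypothetical, not real GPU measurements.
    def resistive_power(current_a, resistance_ohm):
        """Heat dissipated in a resistive element, P = I^2 * R (watts)."""
        return current_a ** 2 * resistance_ohm

    R = 0.01  # ohms, hypothetical effective resistance of the power-delivery path
    for current in (100.0, 130.0, 160.0):  # amps, hypothetical load currents
        print(f"{current:.0f} A -> {resistive_power(current, R):.0f} W lost as heat")
    # 100 A -> 100 W, 130 A -> 169 W, 160 A -> 256 W: a 60% bump in current means
    # roughly 2.5x the resistive losses, which is why pushing clocks (and therefore
    # current) past a sweet spot gets expensive fast.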

For a console, a SoC that draws more than 200W is no longer viable economically. The PS4Pro's GPU clock was chosen to sit in a safe thermal spot, which seems to be the purported 911MHz. There is a common myth that "you just use a better cooler", but that is a dangerous train of thought. Because chips get smaller faster than current decreases, at some point you can no longer dissipate the generated heat fast enough economically. For consoles, the price budget for cooling is probably in the $8-$12 range.
With consoles, a fixed clock rate is mandatory. Using temporary boost clocks (of whatever kind) would make game performance unpredictable. The one mantra of consoles is that every console must deliver the same performance at all times.
With consoles, it is always price first, speed second. The price requirement pretty much limits what you can do inside a console.


drkohler said:

In the PS4Pro, the CPU and GPU seem to be using different clock generators (like the XBox, unlike the PS4). The problem with AMD designs is they use a "pack as much as possible as tightly as possible" philosophy, which tends to limit viable speeds rapidly, unlike NVidia, which uses more of a "keep it simple and leave space to breathe" design philosophy.

You have made many assumptions.

What Microsoft did with the Xbox One was boost its "base clock" and not just the individual multipliers, which meant that everything tied to that clock was boosted. You can bet your hen's teeth the PS4 Pro is the same; it's a general consensus both AMD and Intel have come to in their chip designs as a good approach.
Heck, even nVidia reverted back to it with the removal of their separate shader clock (which is now tied to the core clock).
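
As a rough sketch of that distinction, here's a toy model in Python (the multiplier values are hypothetical, not actual Xbox One or PS4 Pro figures):

    # Toy model: each clock domain runs at the shared base clock times its own multiplier.
    # Multiplier values are made up for illustration.
    base_clock_mhz = 100.0
    multipliers = {"cpu": 17.5, "gpu": 8.5, "memory": 21.3}  # hypothetical ratios

    def effective_clocks(base_mhz, mults):
        return {name: base_mhz * mult for name, mult in mults.items()}

    print("Stock:         ", effective_clocks(base_clock_mhz, multipliers))
    # Raising only the GPU multiplier speeds up that one domain...
    print("GPU mult +0.5: ", effective_clocks(base_clock_mhz, {**multipliers, "gpu": 9.0}))
    # ...while raising the base clock speeds up everything tied to it at once.
    print("Base +6%:      ", effective_clocks(base_clock_mhz * 1.06, multipliers))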

As for AMD and nVidia: nVidia modified chunks of its architecture to be able to drive up the clock rate on Pascal; AMD did not.
AMD and nVidia do an equal amount of work in trying to pack in as many transistors as they can at every node, by using different types of transistors, optimizing layout to reduce leakage, etc.
This was most evident during the prolonged 28nm cycle.

Polaris isn't clock-rate limited because of how AMD packed its transistors; nVidia specifically set out to make Pascal a high-clocking part.


drkohler said:

Generally, the higher the clock rate, the more current you must provide (and remove!).

There is a ton more to it than that.
For instance...
Higher clock rates don't always mean more current; there are different types of transistors, each with their own efficiency curves, and some transistors that operate at higher frequencies are at their most efficient at a high clock rate.

Of course, you have the layout and technologies like Resonant Clock Mesh that play a role as well.
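
One way to see why the relationship isn't simple is the standard first-order CMOS switching-power model, P ≈ C * V^2 * f. Here's a quick sketch (the capacitance, voltages and clocks are hypothetical, picked only to land in a GPU-like wattage range, not measured Polaris numbers):

    # First-order CMOS switching-power model: P = C * V^2 * f.
    # All figures are hypothetical illustrations, not real chip measurements.
    def switching_power(cap_f, volts, freq_hz):
        """Dynamic power of a switching CMOS load, in watts."""
        return cap_f * volts ** 2 * freq_hz

    C = 9e-8  # farads, hypothetical aggregate switched capacitance of the chip

    low = switching_power(C, 1.00, 0.9e9)    # modest clock at a modest voltage
    high = switching_power(C, 1.15, 1.27e9)  # ~40% higher clock, but it needs more voltage

    print(f"0.90 GHz @ 1.00 V -> {low:.0f} W")   # ~81 W
    print(f"1.27 GHz @ 1.15 V -> {high:.0f} W")  # ~151 W
    # The frequency term is only linear; it's the extra voltage needed to reach
    # that frequency, entering as V^2, that produces the big jumps in power.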

drkohler said:

At school, we learned that P = I^2*R is the heat dissipated in a resistor, and you can see it goes with the square of the current I. AMD designs have a tendency to "totally lose it" above a certain current. A stock RX480 card already reaches 150W, and a 6TF card would easily pass 200W.

Resistors aren't the same as transistors, and thus that equation can't be applied here.

The RX 480, in a nutshell, was a disaster in terms of performance per watt. With that said, teraflops also have no direct correlation with energy consumption.
Remember, it is Shaders * 2 * Clock Rate.
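
For concreteness, here's that formula plugged into Python with the commonly cited shader counts and clocks for the RX 480 and PS4 Pro (treat them as the usual ballpark figures):

    # Peak FP32 throughput from the formula above:
    # FLOPS = shaders * 2 (one fused multiply-add per cycle) * clock in Hz.
    def teraflops(shaders, clock_mhz):
        return shaders * 2 * clock_mhz * 1e6 / 1e12

    print(f"RX 480  (2304 shaders @ 1266 MHz): {teraflops(2304, 1266):.2f} TFLOPS")  # ~5.83
    print(f"PS4 Pro (2304 shaders @  911 MHz): {teraflops(2304, 911):.2f} TFLOPS")   # ~4.20
    # The clock term is linear: the same shader array at a lower clock simply yields
    # proportionally fewer TFLOPS, and the number says nothing by itself about how
    # many watts it takes to get there.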

Typically a GPU will be more conservative with its energy demands the wider it is (up to a point).
To think Polaris is going to be any kind of representation of what Scorpio is going to be is pretty silly, considering we have no idea what chip it's going to be derived from or how the chip is going to be fed.

drkohler said:

Then you have to add an 8-core "better than Jaguar" CPU, which adds 60-90W depending on speed.


As for Jaguar and its successors, keep in mind they are low-clocked, low-powered parts. Jaguar's successors, Puma and Puma+, have tangible benefits: they run at the same clocks and have reduced power consumption at the same node.
Carrizo-L, based on Puma+, will happily use 12-25W of power in a quad-core configuration, but that's not just for the CPU cores themselves; it includes the graphics units, memory controllers and various other pieces of logic, and is thus not representative of how much energy the CPU cores themselves consume.

drkohler said:

For a console, a SoC that draws more than 200W is no longer viable economically.


Sure it is. The PC space has introduced multiple technologies to aid in more efficient heat removal, such as vapor chambers.
Power supplies have also been able to provide more and more power at a lower price over the years.

The original Xbox 360, for example, had a 203W PSU and drew around 198W, but that was also over a decade ago; things have gotten far better since then. ;)

drkohler said:

The PS4Pro's GPU clock was chosen to sit in a safe thermal spot, which seems to be the purported 911MHz. There is a common myth that "you just use a better cooler", but that is a dangerous train of thought. Because chips get smaller faster than current decreases, at some point you can no longer dissipate the generated heat fast enough economically. For consoles, the price budget for cooling is probably in the $8-$12 range.


That used to be true, but we have gotten far better at removing heat; TDPs of 300W and beyond are pretty viable now.
Vapor chambers, heat pipes, phase change, big 140mm fans, closed-loop water coolers... we aren't just dropping a chunk of aluminum or copper with some thermal compound on it anymore.


drkohler said:
With consoles, a fixed clock rate is mandatory. Using temporary boost clocks (of whatever kind) would make game performance unpredictable. The one mantra of consoles is that every console must deliver the same performance at all times.
With consoles, it is always price first, speed second. The price requirement pretty much limits what you can do inside a console.

While I agree that boost clocks are silly for a console...

...the price argument isn't as relevant this generation, with the introduction of higher-priced, higher-performing, more "premium" tiers of hardware.



--::{PC Gaming Master Race}::--