Otter said:
Can you explain this math?
He's making a wrong assumption.
He's assuming the "varies" mentioned about the clocks works like an Intel CPU that boosts up for a short while when it's doing things.
Mark Cerny explained that wasn't the case.
The boosted speed is the normal speed, but it will downclock a bit if it draws too much power or gets too hot.
I think Cerny mentioned in the talks that it was only a few percent.
So if it drops like 5% in CPU or GPU speed to stay within a certain power/heat budget, that's not enough to make the difference between the two that big.
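To put rough numbers on that, here's a quick sketch using the publicly stated figures (PS5: 36 CUs at up to 2.23 GHz; Series X: 52 CUs at a fixed 1.825 GHz) and the standard FLOPS formula for these GPUs. The 5% drop is just the hypothetical from above, not a measured number:

```python
# Theoretical TFLOPS for an RDNA 2 GPU:
# 2 ops per cycle (FMA) * 64 shaders per CU * number of CUs * clock in Hz.
def tflops(cus: int, ghz: float) -> float:
    return 2 * cus * 64 * ghz * 1e9 / 1e12

ps5_peak = tflops(36, 2.23)          # ~10.28 TF at the peak clock
ps5_drop = tflops(36, 2.23 * 0.95)   # ~9.76 TF with a hypothetical 5% downclock
xsx      = tflops(52, 1.825)         # ~12.15 TF, fixed clock

print(round(ps5_peak, 2), round(ps5_drop, 2), round(xsx, 2))
```

So even in that worst case the gap only widens from about 1.9 TF to about 2.4 TF, and since the clock is the same across all units, the relative loss stays around 5% everywhere.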







