By: David Hess (davidwhess.delete@this.gmail.com), April 29, 2015 8:38 am
Room: Moderated Discussions
Pierre (Boutoukoat.delete@this.yahoo.fr) on April 29, 2015 4:38 am wrote:
>
> Dissipated power is O(frequency) and O(voltage squared). The article mentions the voltage could be reduced
> by 200 millivolts, e.g. going from 0.9 to 0.7 volts. That means a power reduction of about 40% and, in layman's
> terms, 40% more battery life on a processor 4 times cheaper. Reasoning at constant power and ignoring
> a lot of details, frequency could increase by 40% if voltage goes from 0.9 V to 0.7 V. However, most of
> the perceived performance of a processor comes from memory speed (unchanged ...), parallelism,
> integration of co-processors like GPUs and NICs (more transistors), and software benchmarks.
Power density is also becoming a limiting factor. Heat pipes helped here, but they have their own hard power density limits and now require heat spreaders between the CPU and the evaporator. If you reduce the power by 40% but pack the same number of transistors into 1/4 of the area, the power density increases by 140% (0.6 / 0.25 = a factor of 2.4). Raising the frequency by 40% while keeping the power constant would increase the power density by 300%. Either the junction-to-case thermal resistance has to be lowered (diamond?) or a higher junction temperature is needed, which adversely affects reliability and leakage.
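For concreteness, here is a quick back-of-the-envelope sketch in Python of those scaling factors, assuming dynamic power goes as f * V^2 and ignoring leakage entirely; the numbers are illustrative, not measured:

    # Back-of-the-envelope dynamic power scaling, assuming P ~ f * V^2
    # (leakage ignored; all numbers illustrative)

    V_old, V_new = 0.9, 0.7          # supply voltages (volts)
    area_scale = 1.0 / 4.0           # same transistor count in 1/4 the area

    # Constant frequency: power scales with V^2
    p_scale = (V_new / V_old) ** 2   # ~0.60, i.e. ~40% less power
    print(f"power scale at constant f: {p_scale:.2f}")

    # Power density = power / area
    d_lowv = p_scale / area_scale    # ~2.4x, i.e. a 140% increase
    print(f"power density at lower voltage: {d_lowv:.2f}x")

    # Constant power (frequency raised instead): density scales with 1/area
    d_constp = 1.0 / area_scale      # 4x, i.e. a 300% increase
    print(f"power density at constant power: {d_constp:.1f}x")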
> For the last 10 years, the frequency of desktops for gamers has not increased (peaking around 4 GHz on
> the most expensive Intel CPUs). That looks unlikely to change in the near future, and the trend
> of adding more cores will continue. I hope this helps to answer your obsolete question "how much
> faster?", typical of the 90's. The right questions now are "how much cheaper?", "how much longer?" ...
AMD's current CPUs top out in the 4 GHz range as well. Adding more cores to take advantage of higher-density processes is going to run up against power density limits, and most applications, including games, cannot take advantage of more cores anyway. Do any games make use of more than 2 cores?