By: Mark Roulo (nothanks.delete@this.xxx.com), November 13, 2014 10:45 am
Room: Moderated Discussions
Paul A. Clayton (paaronclayton.delete@this.gmail.com) on November 11, 2014 8:21 am wrote:
> I am curious why you thought Netburst was a really bad idea. (I can understand "won't last", that is reasonable
> for any microarchitecture.) Obviously, clock frequency at (almost) any cost is a bad idea for performance
> (much less performance per watt), but it is not obvious to me that in 2000 a speed demon design was a
> particularly bad idea. The optimism that frequency could scale to 10 GHz may have been unreasonable, but
> even that might be attributed to having faith in Intel's process development genius.
>
> (Perhaps I am asking what particular aspects of Netburst seemed especially ill conceived.)
>
> Willamette seems to have suffered from die area issues as well as power issues.
> (A smaller L2 cache with higher memory latency is not a good combination.)
I suspect that a number of folks can provide lists of reasons (and some have). I can, too. But the Pentium-IV actually did fairly well for a few years (ignoring the laptop market ... which back then was a much smaller percentage of the market than today).
*MY* belief about why it didn't succeed (as opposed to being a bad idea) is that the Pentium-IV traded good branchy/scalar performance away in an attempt to get good streaming performance. I have this dim recollection that Intel's marketing position was that branchy/scalar performance of contemporary CPUs was "good enough" and that it made sense to maintain that while mostly optimizing for streaming/multi-media loads(*). Thus the deep pipeline and the over-the-top drive for clock speed.
But ... Intel was *wrong* about scalar/branchy x86 performance being good enough to no longer be a battlefield, and AMD had the better/faster product. Even if Intel had been able to hit 10 GHz, I'm not sure that things would have worked out much better for them. Too many loads, I fear, would have been bottlenecked on cache misses and DRAM latency. Streaming loads would have been fine if DRAM bandwidth had kept up, but the performance of the branchy stuff would have flatlined.
I'd love to know how much of Intel being wrong was also because GPUs got powerful enough to handle much of the multi-media load (I don't game ... and didn't game back then either, so I have no feel for this). I observe that nVidia's "GeForce 256" product came out in late 1999. If we hadn't gotten GPUs like this from nVidia and ATI, and the CPU had been required to do more of the GPU's work, would the Pentium-IV have made more sense? Or would an Athlon design still have won (Athlon + GPU in 2004 seemed to win ... I'm asking about the alternate universe in which we didn't get those GPUs)? In other words: Was the Pentium-IV screw-up that GeForce-class GPUs (and their competitors) showed up (plus, of course, failing to anticipate the thermal problems as clock speeds went up)?
A meta-suggestion is that you look for reviews and complaints from the time that the Pentium-IV came out.
One I remember is this:
http://www.emulators.com/docs/pentium_1.htm (especially the "why the Pentium 4 fails to deliver" section)
Another contemporary review is: http://www.anandtech.com/show/661/24
(*) With claims like this there is always the question of whether the folks believed what they were saying or were merely spinning a story to defend the position in which they found themselves ("talking their book" is the Wall Street expression). I dunno.