By: hobold (hobold.delete@this.vectorizer.org), January 24, 2014 3:02 am
Room: Moderated Discussions
Doug S (foo.delete@this.bar.bar) on January 23, 2014 8:54 pm wrote:
[...]
> The same is true for application software. There hasn't been anything new people
> want to do that requires more power
Is this another way of saying that people never wanted computer performance itself, but only the new functionalities that were enabled by it? That is a very direct way of obsoleting old boxes.
But if that is the case, then I see a chicken-and-egg problem here. Now that general-purpose performance has stopped growing exponentially, there are no machines on which the next killer app could be developed. Traditionally, most new stuff began as bloatware; optimization almost always came later. Initially, "too fast" computers were needed, until eventually the new stuff was well tuned for deployment on consumer machines.
Tuning takes time and money; an investment only spent on applications that have already seen some demand by early adopters. A few "too fast" computers in the hands of enthusiast consumers are needed as well, to provide an initial market.
My pet theory about obsolescence through speedups is certainly not the whole truth. I remember that back in the 1GHz Athlon days, there was a perception that maybe there existed such an inconceivable thing as a "fast enough computer". It turned out not to be true quite yet in general - but apparently segments of the market were satisfied and stopped upgrading. That portion of the market has probably grown steadily since then.
But for the rest of us, only now that we simply cannot get faster machines do we begin to reluctantly accept the ones we have as fast enough, don't we?