Article: Parallelism at HotPar 2010
By: Vincent Diepeveen (diep.delete@this.xs4all.nl), July 31, 2010 5:50 pm
Room: Moderated Discussions
Mark Christiansen (aliasundercover@nospam.net) on 7/30/10 wrote:
---------------------------
>David Kanter (dkanter@realworldtech.com) on 7/29/10 wrote:
>
>>I see the GPU as a relatively new platform, one that holds a good deal of promise
>>for certain highly structured and HPC-like workloads that are free from dependencies.
>>It's fundamentally different from a CPU in that it's really a bandwidth optimized
>>device, and there are certain trade-offs that implies which make it unsuitable for many workloads/algorithms.
>>
>>Ultimately, the right balance is a combination of the CPU and GPU. ...
>>
>>David
>
>This seems like sense to me.
>
>But then I remember the parade of vector accelerators and similar coprocessor like
>things intended to give high performance for jobs with heavy computation needs and
>regular data. They all came with a nice performance advantage and they all faded away.
>
>It seems to me they died of software. Save for the tasks with the most pressing need
>and the readiest access to custom software, the mainstream processors beat them
>soon enough; it just wasn't worth the development effort. Since they were all different,
>software written for them soon lost its value.
>
>Software on CPUs has to survive and be upgraded through multiple generations of new
>hardware. It isn't enough that it goes on working; it has to get more performance with
>the new. Processor compatibility from generation to generation is vital.
>
>Will GPUs get this? Or will GPU software have a life span of 2 years at best?
You address a few important issues here, Mark.
I would add that, while the GPU obviously has huge potential, a critical problem so far is that you can't really make money writing optimized code for GPUs. Sometimes there is a contract here or there for one specific company, but that is not systematic.
If you write a program for a CPU, you can produce it and try to sell it. Your market is big.
If you put in a really big effort to get some calculations done on a GPU, there is simply not a big market. Maybe one or two clients will be interested, if you are lucky enough to have such companies or organisations as clients.
So it is currently more of a custom market - apart from games, of course.
But games were already very dependent upon GPUs - nothing new there.
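To put David's point about workloads that are "free from dependencies" in concrete terms, here is a rough sketch of my own (the names, sizes and launch parameters are just placeholders, not anything from the posts above) contrasting a kernel that maps well onto a GPU with a loop that does not:

#include <cuda_runtime.h>
#include <stdio.h>
#include <stdlib.h>

// Data-parallel SAXPY: every element is independent, so the GPU can
// spread the work over thousands of threads and keep its bandwidth busy.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

// Contrast: a naive prefix sum has a loop-carried dependency. Each
// iteration needs the previous result, so there is nothing for
// thousands of threads to chew on without redesigning the algorithm.
void prefix_sum_serial(int n, const float *x, float *out)
{
    float acc = 0.0f;
    for (int i = 0; i < n; i++) {
        acc += x[i];
        out[i] = acc;
    }
}

int main(void)
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    float *hx = (float *)malloc(bytes);
    float *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; i++) { hx[i] = 1.0f; hy[i] = 2.0f; }

    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    saxpy<<<blocks, threads>>>(n, 2.0f, dx, dy);
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %f\n", hy[0]);  // expect 4.0

    cudaFree(dx); cudaFree(dy);
    free(hx); free(hy);
    return 0;
}

The first routine hands every element to its own thread; the second cannot be split up that way without rethinking the algorithm (a parallel scan, extra passes, and so on), which is exactly the kind of extra work you only do if there is a market to pay for it.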
>It seems blindingly obvious a computation engine built for tasks like graphics
>with large data sets and less random control can outperform a CPU which is built
>to optimize control and flexibility. All that flexibility is vital to performance
>on most general purpose jobs but it costs dear in silicon and power. A steel I-beam is stronger than a robot arm.
>
>How much performance can the GPU give while allowing software to go on working
>and go on gaining performance with new generations for 15 years? Meet that spec
>and give 5x performance for the same power on problems people care about and I predict
>GPUs move in next to the CPUs and find a lasting home.
>
>Fail the software lifespan test and the wheel turns again with GPUs forgotten but
>maybe some new scheme attempted in a new decade.
>