By: Patrick Chase (patrickjchase.delete@this.gmail.com), December 9, 2014 2:00 pm
Room: Moderated Discussions
Salvatore De Dominicis (no.thanks.delete@this.example.com) on December 9, 2014 8:51 am wrote:
> There is one aspect I think is not apparent from your post regarding computer vision.
> Designing/building/testing new hardware is expensive, while (comparatively)
> powerful GPUs are becoming more and more common in the mobile market.
> A GPU implementation of some computer vision algorithm could be "Good Enough"©
> to run on existing hardware, avoiding the need for specialized hardware, since you already
> need to design (or license) and validate the GPU and everything related to it.
As someone who did image processing and vision work on GPUs and defined ASICs to do the same, I feel qualified to address this:
It all depends on volume and rate of change. When volumes are low and/or the algorithm set is unstable you use commodity HW, of which the flavor of the decade is the GPU. When the volumes get higher and the algorithms more stable it pays to do custom HW (the fixed costs of a moderately complex ASIC on a modern process are O($10M) or higher when all is said and done, so you can do the math and figure out when it makes sense; a rough break-even sketch is below).
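To make that "do the math" point concrete, here's a back-of-envelope break-even calculation. Every number is a hypothetical placeholder, not a real program's figures; plug in your own NRE estimate, incremental silicon cost, and the per-unit cost of the commodity GPU/SoC you'd otherwise buy:

# Rough ASIC-vs-commodity break-even sketch (Python). All numbers below are
# made-up placeholders; substitute your own program's estimates.

asic_nre = 10_000_000.0   # fixed cost of the ASIC: masks, design, validation (~O($10M))
asic_unit_cost = 2.0      # incremental per-unit silicon cost of the custom part
gpu_unit_cost = 8.0       # per-unit cost of the commodity GPU/SoC doing the same job

# Break-even volume: units at which per-unit savings pay back the fixed NRE.
savings_per_unit = gpu_unit_cost - asic_unit_cost
break_even_units = asic_nre / savings_per_unit
print(f"Break-even at ~{break_even_units:,.0f} units")  # ~1.7M units with these numbers

Below that volume the commodity GPU wins on economics alone; above it, custom silicon starts paying for itself, assuming the algorithms hold still long enough for the part to ship.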
If the volumes are high but the algorithms are only partially stable, then it can make sense to do an ASIC that incorporates both fixed-function HW and programmability in the form of DSPs/GPUs/whatever. That's why Qualcomm sprinkles those 'Hexagon' doohickeys in all of their Snapdragons.