Probably IEEE fp16 (blah)

Article: IBM's Machine Learning Accelerator at VLSI 2018
By: dmcq (dmcq.delete@this.fano.co.uk), October 12, 2018 1:07 pm
Room: Moderated Discussions
dmcq (dmcq.delete@this.fano.co.uk) on October 12, 2018 12:56 pm wrote:
> wumpus (lost.delete@this.in.a.cave) on October 12, 2018 6:55 am wrote:
> > David Kanter (dkanter.delete@this.realworldtech.com) on October 11, 2018 10:53 pm wrote:
> > > Apparently they are working on an 8b FP format as well!
> > >
> > > David
> > >
> >
> > At this point just use a logarithmic representation (might work for 16 bits, but that probably requires too
> > many weird circuits. But once you get people to work with log8, they will probably want log16 as well).
> >
> > Yes, it might need a scaling factor if you don't fit the exact same "law" as the
> > hardware. But float8 is only going to be less forgiving about scaling factors.
>
> There have been a number of schemes described over the years, besides pure logarithmic, that can cut
> the complexity a bit, but with that few bits practically anything can be done pretty efficiently.
> Compared to IEEE, pure logarithmic multiplication is just addition; addition is more complicated and is
> best done by converting to something more like IEEE. In AI one normally wants the sum of a
> number of multiplies, and the total can be converted back at the end with rounding.

Yes, there would be problems with scaling factors. In pure logarithmic, if one wanted a range from 1/512 to 512 one would want a base of 2^(1/7), whereas if the range were 1/8 to 8 one would want 2^(1/21) to keep the maximum precision (with a sign bit and seven exponent bits, the 63 positive exponent steps must span the log2 range, so the base is 2^(9/63) or 2^(3/63) respectively). This would mainly affect conversion before and after addition, and yes, catering for that would be ... interesting, but possible I think.
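To make the trade-off concrete, here is a minimal sketch of such a hypothetical 8-bit logarithmic format: a sign bit plus a 7-bit exponent e, with value = base**e. All function names and the encoding details are my own assumptions for illustration, not any actual IBM format. Multiplication is just exponent addition; the sum of products is accumulated after converting back to linear, as described above.

```python
import math

# Hypothetical log8 format (illustrative only): 7-bit signed exponent e in
# [-64, 63], value = base**e. base = 2**(1/7) spans roughly 1/512..512;
# base = 2**(1/21) spans roughly 1/8..8 with finer steps between values.

def encode(x, base):
    """Round a positive real to the nearest representable exponent."""
    e = round(math.log(x, base))
    return max(-64, min(63, e))  # saturate at the format's range limits

def decode(e, base):
    """Convert an exponent back to its (approximate) real value."""
    return base ** e

def log_mul(e1, e2):
    """Multiplication in the log domain is just exponent addition."""
    return max(-64, min(63, e1 + e2))

def dot(xs, ws, base):
    """Sum of products: multiply in the log domain, accumulate in linear."""
    total = 0.0
    for x, w in zip(xs, ws):
        total += decode(log_mul(encode(x, base), encode(w, base)), base)
    return total
```

With base = 2**(1/7), encode(2.0) gives exponent 7 and encode(4.0) gives 14, so their product comes out as exponent 21, i.e. 8.0; switching to base = 2**(1/21) triples every exponent, which is exactly the scaling-factor conversion issue mentioned above.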