By: Brett (ggtgp.delete@this.yahoo.com), August 9, 2014 11:51 am
Room: Moderated Discussions
David Kanter (dkanter.delete@this.realworldtech.com) on August 9, 2014 10:50 am wrote:
> juanrga (nospam.delete@this.juanrga.com) on August 9, 2014 6:38 am wrote:
> > David Kanter (dkanter.delete@this.realworldtech.com) on August 8, 2014 11:36 pm wrote:
> > > ARM is full of legacy crap as well. Not to mention the fact that an ARMv8 requires
> > > 3-4 different decoders. I know a few people who have had the pleasure of designing
> > > custom ARM cores, and according to them 'ARMv8 decode is just as terrible as x86'.
> >
> > By ARM64 I am referring to AArch64 exclusively. ARMv8 can be A or T and it includes AArch32 for legacy.
>
> Can you name a design which is ARM64 only with no 32-bit support? I can't.
> That means that decoders are needed for ARMv8, v7, and Thumb.
For those designing 40-watt ARM64 chips there are no legacy 32-bit apps to support in the laptop/desktop/server space. So there is no reason to waste expensive die area on legacy ARM32/Thumb opcode decoding, or worse, to pay the time-to-market cost of designing and validating those legacy decoders with your limited engineering resources.
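To make the decode point concrete, here is a toy C sketch (nothing like real RTL; every name here is made up for illustration) of why a full ARMv8-A front end carries three or four decode paths while an AArch64-only part keeps just one:

/* Toy sketch: why a full ARMv8-A front end needs 3-4 decoders,
 * and what an AArch64-only design gets to drop.
 * All decode_* bodies are stand-in stubs, purely for illustration. */
#include <stdint.h>
#include <stdbool.h>
#include <stdio.h>

typedef enum { ISA_A64, ISA_A32, ISA_T32 } isa_state_t;
typedef struct { const char *unit; } uop_t;   /* hypothetical decoded op */

static uop_t decode_a64(uint32_t insn) { (void)insn; return (uop_t){"A64"}; }
static uop_t decode_a32(uint32_t insn) { (void)insn; return (uop_t){"A32"}; }
static uop_t decode_t16(uint16_t insn) { (void)insn; return (uop_t){"T16"}; }
static uop_t decode_t32(uint16_t hi, uint16_t lo) { (void)hi; (void)lo; return (uop_t){"T32"}; }

/* Thumb halfwords whose top five bits are 0b11101/0b11110/0b11111
 * begin a 32-bit Thumb-2 encoding. */
static bool is_t32_prefix(uint16_t hw) { return (hw >> 11) >= 0x1D; }

/* A full ARMv8-A core must steer fetched bytes by execution state:
 * three or four distinct decode paths live side by side. */
static uop_t decode_armv8(isa_state_t state, const uint16_t *fetch) {
    uint32_t word = (uint32_t)fetch[0] | ((uint32_t)fetch[1] << 16);
    switch (state) {
    case ISA_A32: return decode_a32(word);
    case ISA_T32: return is_t32_prefix(fetch[0])
                         ? decode_t32(fetch[0], fetch[1])
                         : decode_t16(fetch[0]);
    default:      return decode_a64(word);   /* ISA_A64 */
    }
}

/* An AArch64-only core collapses to one fixed-width decoder. */
static uop_t decode_a64_only(uint32_t insn) { return decode_a64(insn); }

int main(void) {
    uint16_t thumb[2] = { 0xF000, 0xB800 };  /* a 32-bit Thumb-2 pattern */
    printf("%s\n", decode_armv8(ISA_T32, thumb).unit);   /* prints T32 */
    printf("%s\n", decode_a64_only(0xD503201F).unit);    /* A64 NOP */
    return 0;
}

The validation argument follows directly: every case in that switch, and every encoding behind it, is state an AArch64-only design never has to verify.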
If someone really wants to run native Android tablet apps on a Chromebook, the 40-watt ARM64 CPU can emulate a 2-watt ARM chip faster than the real chip runs, so performance will look fine to the user. (The Apple A7 uses about 1.1 watts: http://en.wikipedia.org/wiki/Apple_A7 )
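For a sense of what that emulation path could look like, here is a minimal, purely illustrative C interpreter loop over a tiny made-up subset of ARM32 (register-form ADD/SUB only; a shipping product would likely use dynamic binary translation instead, but the point stands that even naive interpretation is affordable when the host has roughly 20x the power budget):

/* Minimal sketch of handling AArch32 in software on an AArch64-only
 * part: a plain interpreter over a toy subset of ARM32 encodings. */
#include <stdint.h>
#include <stdio.h>

typedef struct {
    uint32_t r[16];          /* AArch32 general registers, r15 = PC */
} guest_cpu_t;

/* Decode + execute one guest instruction (toy subset: ADD/SUB
 * register-register, unconditional only). Returns 0 to stop. */
static int step(guest_cpu_t *cpu, const uint32_t *mem) {
    uint32_t insn = mem[cpu->r[15] / 4];
    cpu->r[15] += 4;
    uint32_t rd = (insn >> 12) & 0xF, rn = (insn >> 16) & 0xF, rm = insn & 0xF;
    switch ((insn >> 21) & 0xF) {          /* data-processing opcode field */
    case 0x4: cpu->r[rd] = cpu->r[rn] + cpu->r[rm]; return 1;  /* ADD */
    case 0x2: cpu->r[rd] = cpu->r[rn] - cpu->r[rm]; return 1;  /* SUB */
    default:  return 0;                                        /* halt */
    }
}

int main(void) {
    /* ADD r2, r0, r1 in ARM encoding (cond=AL): 0xE0802001, then halt. */
    uint32_t program[] = { 0xE0802001, 0x00000000 };
    guest_cpu_t cpu = { .r = { 7, 35 } };   /* r0=7, r1=35, PC=0 */
    while (step(&cpu, program)) {}
    printf("r2 = %u\n", (unsigned)cpu.r[2]);   /* prints r2 = 42 */
    return 0;
}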
> > The designs that I am commenting on are pure AArch64 implementations;
> > legacy 32-bit mode is not needed for HPC, for instance.
Maybe I am wrong, but I will be shocked if any of the 10+ watt ARM64 designs bothers with legacy opcode support.