By: Stunk (no.delete@this.no.no), December 3, 2014 1:45 pm
Room: Moderated Discussions
Patrick Chase (patrickjchase.delete@this.gmail.com) on December 2, 2014 1:54 pm wrote:
> Jouni Osmala (josmala.delete@this.cc.hut.fi) on December 1, 2014 11:19 pm wrote:
> > > > > Yet another form of "worse is better" ?
> > > >
> > > > More like "idle hands do the Devil's work".
> > > >
> > > > IMO there are two fundamental truths at work here:
> > > >
> > > > 1. It is very often harder to debug and verify something than it is to design it. This has been noted
> > > > by many people in many different ways (another oft-heard variation is "if you design something
> > > > that is as clever as you can possibly make it, then somebody smarter than you will be required to debug it").
> > >
> > > Ho ho ho. Too right. The first program I wrote was a short one of about thirty lines that evaluated
> > > statements in propositional logic. It was a masterpiece in conciseness and I was very proud of myself.
> > > However when I went to look at it a few weeks afterwards to gloat I found I couldn't figure out exactly
> > > how it worked! So it was a valuable lesson to me on the importance of simplicity and clarity.
> >
> > My first really serious program was such that after a 6-month break from coding, I spent my
> > free time for a week on it, and after that the hard drive broke with everything I had on it, and
> > all I could really feel was relief that I didn't have to see that code anymore.
> >
> > I didn't know any compiler theory, so I invented my own way of compiling to
> > bytecode and interpreting bytecode. It was a graphics calculator that had a bytecode
> > interpreter and compiled expressions to bytecode so that I could execute those
> > expressions fast enough to draw graphs, since textual interpreting was too slow for the
> > task. All the variables were three-letter acronyms to make typing code faster.
>
> Oh my. Seriously, we've all done it (particularly those of us who started out self-taught).
>
> In most reasonably strong professional teams that sort of thing would be perceived as unacceptable
> [*] and would be self-regulated without resorting to architectural fiat or management escalation.
> Where you typically get into trouble and need a firm architectural hand is with the borderline
> stuff that's slightly more complicated than it needs to be and/or that addresses a hypothetical
> requirement that we may or may not have at some unspecified point in the future.
>
> There have been times where 80% of my job was to be the person accountable
> for saying "no" (i.e. I'd be the one answering the tough questions in the unlikely
> event that said hypothetical requirement actually did materialize).
>
> [*] If you get me inebriated enough I might bring myself to relive the time that a bunch of EEs
> who also didn't know anything about compilers decided that a self-designed DSP was The Answer (tm).
Maybe that depends on what the question was? :-)
I mostly agree though, with the following caveat:
In my experience, the compiler team will always want vast amounts of fully orthogonal resources. They may have little grasp of the cost trade-offs involved in reaching such an ideal compiler target - IMO lacking a view of the whole picture (and sometimes not caring about it). I'm not saying that's wrong - it is great to aid the compiler when you can. I'm saying that sometimes you have to work with what you have: a less-than-ideal compiler target that is nevertheless a very good fit in its niche. In our case it's a (very) specialized VLIW DSP with many adaptations to the system infrastructure.
It's all a game of trade-offs. It turns out that - no surprise - the interconnect infrastructure is what mostly limits you anyway, and that is what you have to weigh most heavily.
Stunk
> At one point I tallied up how much money they'd wasted between fixed development cost, the recurring
> cost of the unused chip area, and the effort we expended working around the fact that we were left
> with insufficient general-purpose compute horsepower. The number had 8 digits.
>
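As an aside, the expression-to-bytecode trick Jouni describes is easy to sketch. The following is a hypothetical illustration (not his calculator code): compile an arithmetic expression once into a flat postfix bytecode list via the shunting-yard algorithm, then evaluate that bytecode many times with a small stack machine - far cheaper than re-parsing the text for every plotted point. For simplicity it handles only single-digit constants, one variable `x`, and the four basic operators.

```python
import operator

OPS = {"+": operator.add, "-": operator.sub,
       "*": operator.mul, "/": operator.truediv}
PREC = {"+": 1, "-": 1, "*": 2, "/": 2}

def compile_expr(text):
    """Shunting-yard: infix text -> postfix bytecode (hypothetical format)."""
    out, stack = [], []
    for tok in text.replace(" ", ""):
        if tok.isdigit():
            out.append(("push", float(tok)))   # single-digit constant
        elif tok == "x":
            out.append(("load", "x"))          # the plot variable
        elif tok in OPS:
            # pop higher-or-equal precedence operators first
            while stack and PREC[stack[-1]] >= PREC[tok]:
                out.append(("op", stack.pop()))
            stack.append(tok)
    while stack:
        out.append(("op", stack.pop()))
    return out

def run(bytecode, x):
    """Evaluate compiled bytecode with a simple operand stack."""
    stack = []
    for kind, arg in bytecode:
        if kind == "push":
            stack.append(arg)
        elif kind == "load":
            stack.append(x)
        else:  # "op"
            b, a = stack.pop(), stack.pop()
            stack.append(OPS[arg](a, b))
    return stack[0]

code = compile_expr("x*x+1")            # compile once...
ys = [run(code, x) for x in range(4)]   # ...evaluate many times
```

The point of the split is exactly the one in the anecdote: `compile_expr` pays the parsing cost once, and the per-point inner loop in `run` touches only a flat list of tuples.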