Stockholm Syndrome – human psychological adaptation in which a kidnap victim will over time start to identify with, appease, and even show affection for the entity holding absolute power over him or her.
Humble Beginnings
In 1981 IBM introduced a personal computer based on the Intel 8088 processor, a low cost, low performance variant of the 8086. The official reasons the 8088 was chosen over its closest competitor, the Motorola MC68000, were its earlier availability and the fact that the 8088’s multiplexed address bus and 8 bit data bus reduced system costs. There were no integrated chipsets back then, and board level buses required SSI TTL “jelly bean” components to buffer and latch data. These devices were 8 bits wide each, so a 68000 system, with its 16 bit data bus, required more of them, which raised board area, power consumption, and cost. Some observers also noted that a PC line built on the more capable 68000 could have effectively challenged IBM’s high margin, low end proprietary minicomputers. IBM had long understood that internal competition among its various overlapping product lines was often a bigger problem than its rivals’ products. Minimizing internal competition, IBM’s “don’t eat your own children” principle, was likely the third, unspoken factor in its decision to adopt the 8088.
From the software developer’s perspective the choice was a disaster. The 808x family could only directly address code and data in 64 KB aggregations. Programs that needed more than 64 KB of code or data forced programmers to manage a baroque segmentation scheme in which a 16 bit segment value was combined with a 16 bit offset to derive a 20 bit physical address [1]. Even the growing use of high level languages couldn’t shield programmers from the segmentation madness. Compilers offered a bewildering assortment of memory models (tiny, small, medium, compact, large, and huge). Integrating assembly language code with compiled code meant understanding the difference between “near” calls and “far” calls. For programmers who migrated to PCs from mainframes and minicomputers, the choice of the 8088 over the MC68000, which in many ways resembled the PDP-11 extended to 32 bits, was particularly egregious.
But money talks, and the runaway success of the IBM PC family and its growing set of compatible imitators meant that the 8088 and the DOS operating system that ran on it were a guaranteed meal ticket that thousands of software developers couldn’t afford to ignore out of personal preference. The success of the PC caught even IBM by surprise, and the failure of its last ditch attempt to rein in the clone market by imposing its proprietary Micro Channel Architecture (MCA) hardware standard was a sign that the x86 based PC had outgrown the ability of even the biggest computer vendor to control it. Instead the destiny of the PC lay jointly in the hands of Intel and Microsoft, and both grew tremendously wealthy and influential as a result.
The painful infancy of x86 ended with the introduction of the 32 bit 80386 (“386”) in late 1985. But for most programmers the segmentation horror lived on for many years, until Microsoft finally got around to building a 32 bit operating system for mainstream users, namely Windows 95. Besides allowing 32 bit flat addressing and the manipulation of 32 bit scalar data, the 32 bit operating mode of the 386 used its 7 general purpose registers (GPRs) in a more regular fashion than the original 16 bit version of the ISA, which often dedicated specific registers to particular instructions, thus limiting programmers’ and compilers’ freedom to optimize code sequences.