Ken Shirriff

In 1981, Intel released the iAPX 432, calling this "micro-mainframe" one of the most important advances in computing since the 1950s. But it was a flop, costing Intel $100 million. An unexpected side-effect, though, was the 8086 processor. 1/n

Ken Shirriff

The 432 processor put object-oriented programming and storage allocation in hardware. This ambitious processor was split across two chips: the Instruction Decoding Unit decoded instructions into micro-instructions. The Microinstruction Execution Unit executed them.

Ken Shirriff

I took die photos of the first chip, the 43201. This chonky half-a-processor is twice the size of the 8086 processor and doesn't even execute instructions. It has 3.8× the transistors (110,000 vs 29,000) and has 6× the microcode (64 Kb vs 11 Kb).

sxpert
@kenshirriff I can guess microcode top-right, what's top-left? 2 large blocks of cache?
Ken Shirriff

@sxpert My labeled die photo is later in the thread. The top left block is microcode. The top right blocks are PLAs (Programmable Logic Arrays) for the decoding state machine.
oldbytes.space/@kenshirriff/11

Ken Shirriff

Why a separate chip just to decode instructions? The 432's instructions are absurdly complicated. An instruction is from 6 to 321 bits long and can start anywhere in a byte. Decoding instructions needed a complex state machine complete with subroutines.
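A hypothetical sketch (my own field layout, not Intel's actual encoding) of what bit-aligned decoding forces on the hardware: every field read has to track a bit position that crosses byte boundaries.

```python
# Toy bit-aligned field reader. The field widths and names below are
# made up for illustration; the 432's real encoding is far more complex.

def read_bits(data: bytes, bit_pos: int, width: int) -> tuple[int, int]:
    """Return (value, new_bit_pos) for a width-bit field starting at bit_pos."""
    value = 0
    for i in range(width):
        byte_index, bit_index = divmod(bit_pos + i, 8)
        bit = (data[byte_index] >> bit_index) & 1
        value |= bit << i
    return value, bit_pos + width

# Two back-to-back fields; the second one straddles a byte boundary.
stream = bytes([0b10110101, 0b00000110])
op, pos = read_bits(stream, 0, 6)      # 6-bit field starting at bit 0
mode, pos = read_bits(stream, pos, 5)  # 5-bit field starting mid-byte
```

A byte-aligned instruction set skips all of this bookkeeping, which is part of why the 432 needed a dedicated decoder chip.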

Ken Shirriff

This complicated block diagram from a 432 patent shows what's inside the decoder chip. To summarize: instructions enter at the left (ACD) and micro-instructions exit at the right (µI bus). Microprogram ROM is in the center. The Composer Matrix extracts chunks of instructions.

Ken Shirriff

I partially reverse-engineered the die photo to label it with approximate functional blocks. The top half is the microcode ROM and the state-machine PLAs (programmable logic arrays). The bottom half disassembles the instruction stream and shuffles pieces around.

Ken Shirriff

This closeup of the microcode ROM shows the vertical select and output data lines and the zig-zag polysilicon select lines. Bits are stored by putting a transistor at each zig-zag, or not. Changing the focus shows the underlying transistor pattern and thus the microcode bits.

Ken Shirriff

Why so much microcode? The basic operations and addressing modes took 250 micro-instructions; the other 3.7K implemented floating point and the "sophisticated object-oriented functions" of the system. The 432 was one of the first to use IEEE-754 floating point, still used today.

Ken Shirriff

Binary decoders select rows and columns in the ROM. Each column matches a binary number: 0000, 0001, 0010, etc. The boxes indicate transistors, attached to a 0 or 1 line. The low-bit transistors (red) alternate every column, orange alternate every two columns, etc.
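The transistor pattern described above can be modeled in software (my own construction, not Intel's circuit): each column has a transistor on either the 0 line or the 1 line of every address bit, so exactly one column matches any given address, and the alternation pattern falls out of counting in binary.

```python
# Software model of a binary column decoder.

def build_decoder(n_bits: int) -> list[list[int]]:
    """pattern[col][bit] = which line (0 or 1) carries that column's transistor.
    Bit 0 alternates every column, bit 1 every two columns, and so on."""
    return [[(col >> bit) & 1 for bit in range(n_bits)]
            for col in range(2 ** n_bits)]

def select(pattern: list[list[int]], address_bits: list[int]) -> list[bool]:
    """A column is selected only when every transistor sees its matching line."""
    return [all(t == a for t, a in zip(col, address_bits)) for col in pattern]

pattern = build_decoder(4)               # 16 columns: 0000, 0001, 0010, ...
outputs = select(pattern, [1, 0, 1, 0])  # address 0b0101 = 5, LSB first
assert outputs.index(True) == 5          # exactly one column fires
```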

Ken Shirriff

Since instructions aren't aligned with bytes, a 32-bit shifter called the "Composer Matrix" shifts the word to extract each instruction field. Diagonal control lines energize transistors to select an arbitrary shift.
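A toy model of such a crossbar shifter (my simplification, not the Composer Matrix's actual wiring): selecting one "diagonal" routes every input bit to its shifted output position in a single step, instead of shifting one bit per clock.

```python
# Toy single-step shifter: output bit j is taken from input bit j + shift,
# as if one diagonal control line had energized a row of pass transistors.

def barrel_extract(word: int, shift: int, width: int = 32) -> int:
    """Extract a width-bit window starting `shift` bits into `word`."""
    out = 0
    for j in range(width):
        bit = (word >> (j + shift)) & 1
        out |= bit << j
    return out

# Pull out a field that starts 5 bits into the instruction buffer.
buffer = 0b1101 << 5
assert barrel_extract(buffer, 5) == 0b1101
```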

Ken Shirriff

A PLA-based state machine steps through the chunks of the instruction, running microcode routines as needed. The bit instruction pointer keeps track of the location in a byte. (A jump instruction can end up in the middle of a byte.)

Ken Shirriff replied to Ken

An interesting circuit is the charge pump, an analog circuit on this digital chip. It has an oscillator and capacitor to generate a negative voltage. This negative bias on the silicon improves performance. The charge pumps are almost identical to the ones in the 8086 processor.

Ken Shirriff replied to Ken

As the 432 project fell behind schedule, Intel realized they urgently needed something to sell. Intel quickly threw together a short-lived stopgap processor to sell until the 432 was ready: the 8086 (1978) was a huge success and is still around in the x86 architecture.

Ken Shirriff replied to Ken

The iAPX 432 was finally released in 1981 to great fanfare. But its performance was dismal, 1/4 the speed of the 8086, making the 432 a flop.

Ken Shirriff replied to Ken

The paper that killed the 432 was "A Performance Evaluation of the Intel iAPX 432". I recently realized that one of the paper's co-authors was my former officemate twitter.com/Bob_Mayo.
archive.org/details/Performanc

Ken Shirriff replied to Ken

The big computer architecture debate of the 1980s was RISC vs CISC, pitting Reduced Instruction Set Computers against Complex Instruction Set Computers. RISC processors were simple but fast with lots of registers, moving complexity to software. Instructions were easy to decode.

Ken Shirriff replied to Ken

Built just before RISC, the 432 took CISC to the extreme, putting everything possible into hardware rather than software: objects, garbage collection, etc. Intel called it the Silicon Operating System. With no user-visible registers, instructions were stack and memory-based.

Ken Shirriff replied to Ken

Minimizing the "semantic gap" between high-level languages and assembly language was a big thing back then. The 432 was designed for the Ada language with instructions to perform high-level operations. The Ada compiler was $30,000; we're spoiled now by open-source compilers.

Ken Shirriff replied to Ken

What if the 432 had won? Computing would be very different. Many security problems wouldn't exist. You can't have a buffer overflow because every data structure is a separate object with memory segment size enforced in hardware. You can't smash the stack or make bad pointers.
The 432 was designed around fault-tolerant multiprocessing. One chip could validate another and fail over if necessary. Computers would be much more reliable if the 432 had won.

Ken Shirriff replied to Ken

There aren't many die photos of the 432 chipset, but I made this summary from various sources. The 43201 and 43202 form the "General Data Processor". The 43203 was an attached I/O co-processor. The Bus Interface and Memory Control were for fault-tolerant multiprocessor systems.

Robert Hollingshead :donor: replied to Ken

@kenshirriff wow. I didn't know iNTEL made this and it fits neatly into my fantasy world where one could increase processing power by adding hardware to the existing hardware. You could upgrade without having to throw out the old stuff immediately. I know the nuance would kill such a thing but a person can dream.

I'm wondering if this was inspired by LISP?

Thanks for sharing

Ken Shirriff replied to Robert Hollingshead :donor:

@0xF21D I don't think there was any Lisp influence on the iAPX 432. It was inspired by object-oriented languages, specifically Ada.

Kristofer Younger replied to Ken

@kenshirriff @0xF21D my 432 had the Ada compiler. it took hours to compile "hello.adb". had access to it at Purdue. it was 1983 i think. my summer job was learning Green Ada, at Honeywell SRC.

Ken Shirriff replied to Ken

Taking the Instruction Decoding Unit die photos was a bit tricky because the die is encased in a paperweight. Thanks to moylecroft for loaning me the paperweight.
Chips-on-board photo by @brouhaha (CC BY-SA 2.0) commons.wikimedia.org/wiki/Fil
Die photos of 43204/43205 from Intel/CHM.

Jari Komppa 🇫🇮 replied to Ken

@kenshirriff Knowing the ebb and flow of generic vs specific computing hardware, I'm pretty sure the model would have been dropped into a more generic, largely software driven system eventually and we'd have all the security problems in any case.. =)

Tom Forsyth

@kenshirriff In 1985 the ARM1 was revolutionary in having a barrel-shifter "for free" in the instruction set, which cost a huge amount of die area but was impressively flexible compared to the 1-bit-per-clock shifts of 68k and x86.

Here they have the same structure and area cost, but its utility is almost completely invisible to the programmer! The 432 was wild...

🇺🇦 haxadecimal

@kenshirriff The 43201 microcode can be dumped electrically without decap, and I've done that for a release 1 C43201. Unfortunately the 43201 and 43202 contain many PLAs which (AFAIK) can't be dumped other than by decap and photomicrograph.

Ken Shirriff

@brouhaha Have you decoded any of the 43201 microcode?

🇺🇦 haxadecimal

@kenshirriff Only a tiny amount. I'll try to find time to put what I've got into a GitHub repo.
Most of the microcode implements the high-level instructions, which are quite complex. I haven't been able to figure out any of those yet.

Lalufu

@kenshirriff I assume there's some sort of incredible advantage to having bit aligned instructions that offsets all the additional complexity on top of byte aligned ones? You save some bits of storage, sure, but...

Ken Shirriff

@Lalufu Bit-alignment was supposed to improve instruction density so you could get more instructions with fewer memory accesses. But it turned out that the 432's instruction density wasn't as good as regular processors in most cases, so it was a bad idea.

Johnny Cache

@kenshirriff

I have a few of these I've been meaning to decap. Learning that the instruction size 'varies from 6 to 321' bits long is making me reconsider. It feels like breaking the seal on some sort of cursed tomb.

Yeah, it's *probably* not haunted. But why take the risk? 🤔

Poul-Henning Kamp

@kenshirriff

The Rational R1000 Ada computer we have in Datamuseum.dk is the same basic idea, but it worked out; IBM bought Rational for a couple of billion in the 1990s.

Some really amazing software probably made all the difference.

Dogzilla

@kenshirriff A late friend of mine was on the 432 team at Intel. He never put it on his resume. It's a shame that it had such a stigma. I think Intel struggles with anything that's not x86, like the 432 and Itanium.

lopta

@kenshirriff This might be the first time I've actually seen a photo of iAPX 432!

Diane Bruce VA3DB

@kenshirriff I saw this brochure (I think?) the first time. It was heavily Ada oriented with built-in blocking message-passing operations. Crazy. I *think* we may have been evaluating it for a project but as you say, it was a flop.

Dr James Howard

@kenshirriff I have a delightful collection of old microchips...but an iAPX 432 continues to elude me. Are there any of these out in the wild?

🌈 Andrew ☄️

@kenshirriff I guess if it led to 8086 it was an important advance in computing even if it itself was a flop

dontdothisathome

@kenshirriff interesting, didn't know this system before, thanks for the explanations

Chris Shaw

@kenshirriff thanks for putting this together Ken ... Really interesting history
