97 posts total
Ken Shirriff

Here's a silicon wafer for Intel's iAPX 432 processor (1981), a failed "micro-mainframe". Each rectangle on the wafer is one processor chip. But what are those five unusual rectangles? Those are test circuits... 🧡

Ken Shirriff

Creating chips on a silicon wafer is complicated and lots can go wrong. A few test circuits were placed on the die so Intel could check the important characteristics and make sure everything was okay. The white squares are test pads. Tiny probes contact the pads for measurements.

Dana Myers

@kenshirriff I asked Google Bard to write a poem about the i432 🙂

The Intel i432

A chip ahead of its time,
An architecture too complex,
Too expensive to produce.

A dream that never came true,
A vision of the future,
That was lost to time.

A relic of the past,
A forgotten footnote,
In the history of computing.

But still, it is remembered,
By those who know its story,
And who appreciate its potential.

The Intel i432,
A forgotten masterpiece,
That could have changed the world.


Ken Shirriff

Intel introduced the iAPX 432 "micromainframe" processor in 1981. Like every good mainframe, it had a separate channel processor to perform input/output (I/O). Here's my die photo of the 43203 Interface Processor chip. 🧡

Ken Shirriff

The 432 was a strange system. Everything was an object, implemented in hardware with access control. You couldn't just move I/O data to memory because that would bypass the object protection. The Interface Processor translated between the object world and the I/O byte world.

Kent Borg

@kenshirriff Very intriguing thread, as in "These might be good ideas!", until I saw a memory structure diagram and I realized all that walking would be done in hardware, and how slow are they trying to be?

And then I asked myself: What makes me think hardware can't walk data structures? I mean if software does it, the hardware still does it, right? What makes that slow?

Ken Shirriff

Today's die photo: the Intel iAPX 432 processor (1981). This "micro-mainframe" was so complex, the processor needed to be split across two chips: this is the 43202, which executes microinstructions. The 432 was slow, late, and a failure.

Ken Shirriff

The iAPX 432 was a follow-on to the Intel 8008 and 8080, originally called the 8800. It was supposed to be Intel's flagship processor but that didn't work out. This chip has the label 8802 since it is the second chip.

bitsavers.org

@kenshirriff

are you going to ask John McMaster to do a high resolution image of the whole die?

Ken Shirriff

Before WiFi, Ethernet was the best way to connect your computer to the network. This Fujitsu Ethernet chip from the 1980s was used in network cards. 🧡

penguin42

@kenshirriff Nice pic; any idea what's going on the left of the 10th row up where a few cells seem to be below the line of the others? I admit to not understanding why - even for a gate array - they have all the gaps between the cell rows; I think it might be because they didn't have many metal layers so all the custom horizontal wires are in the gap?

Stylus 🦉

@kenshirriff some may say that ethernet is still the best way to connect a computer to a network

Rick

@kenshirriff I would argue that Ethernet is still the best way to connect your devices to the network.

Ken Shirriff

This Central Air Data Computer (CADC) was introduced in 1955. It computed airspeed, altitude, etc. for fighter planes. But instead of a processor, it was an analog computer that used tiny gears for its computations. Let's look at how one of its modules works. 🧡

AngryAndrew

@kenshirriff

"... The missile knows where it is because it knows where it isn't, by subtracting where it isn't from where it is…"

Well now it all makes sense...

ĸurth

@kenshirriff awesome, thanks :)

Also (imho ;)) highly entertaining on the subject, for its crackly sound and distinct narration: the US Navy training film series, e.g.:

youtube.com/watch?v=_8aH-M3PzM

furicle

@kenshirriff Now do the air/hydraulic one out of an FW190 circa 1939....

Ken Shirriff

In 1981, Intel released the iAPX 432, calling this "micro-mainframe" one of the most important advances in computing since the 1950s. But it was a flop, costing Intel $100 million. An unexpected side-effect, though, was the 8086 processor. 1/n

🌈 Andrew ☄️

@kenshirriff I guess if it led to 8086 it was an important advance in computing even if it itself was a flop

dontdothisathome

@kenshirriff interesting, didn't know this system before, thanks for the explanations

Chris Shaw

@kenshirriff thanks for putting this together Ken ... Really interesting history

Ken Shirriff

The most popular way of wiring computers into a network is Ethernet, invented at Xerox PARC in 1973. I looked inside a Fujitsu Ethernet chip from 1984. The silicon die has rows of transistors and metal wiring. Around the edges, bond wires connect the chip to the external pins.

Juleselt

@kenshirriff Thank you for such an interesting thread. My father worked for a company called General Automation in the 60s-early 70s. He designed motherboards. These illustrations reminded me of the large vellum sheets he would bring home to work on that looked just like these.

Of course as a kid, I really didn't get what he did until I started using computers in the 80s.

it's B! Cavello 🐝

@kenshirriff it's incredible to me that back in 1984, things were already this small. I think I truly have underestimated how unfathomably tiny things are now 🤯

Ken Shirriff

You'd think every computer should be able to divide two numbers, but early microprocessors didn't have division instructions. The Intel 8086 (1978) was one of the first with division. Let's look at how it implemented division and why division is so hard.

Ken Shirriff

Computers can divide by performing long division, just like grade school except using binary. This needs a subtract-and-shift loop. For early microprocessors, you'd implement the loop in assembly code. The 8086 implemented the loop in microcode, much faster and more convenient.
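The subtract-and-shift loop can be sketched in a few lines of Python. This is a minimal illustration of the algorithm only, not the 8086's actual microcode:

```python
def divide(dividend: int, divisor: int, bits: int = 16) -> tuple[int, int]:
    """Unsigned binary long division via a subtract-and-shift loop."""
    if divisor == 0:
        raise ZeroDivisionError("divide by zero")
    remainder = 0
    quotient = 0
    for i in range(bits - 1, -1, -1):
        # Shift the next dividend bit into the remainder.
        remainder = (remainder << 1) | ((dividend >> i) & 1)
        quotient <<= 1
        # If the divisor fits, subtract it and set this quotient bit.
        if remainder >= divisor:
            remainder -= divisor
            quotient |= 1
    return quotient, remainder
```

One iteration per result bit, each with at most one subtraction, which is why a 16-bit divide takes so many cycles even in microcode.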

Dave Bittner

@kenshirriff I have a vague recollection of the 6809 having an advantage over the 6502 when it came to being able to divide more quickly. Interesting stuff!

Ken Shirriff

Another look inside the Intel 8086 microprocessor from 1978. This chip (and today's x86 computers) has "string" operations to efficiently copy or scan blocks of memory. They are implemented with microcode and a bit of special hardware. Let's see how this works. 🧡

Ken Shirriff

Most people think of machine instructions as the lowest level of software, but many processors contain microcode. An instruction is broken down into micro-instructions. The 8086 has 21-bit micro-instructions; each executes a register move (source→dest) and an action in parallel.
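As a rough illustration of the move-plus-action idea, a micro-instruction interpreter could look like the sketch below. The field positions and widths here are invented for illustration; the real 21-bit layout that Ken describes assigns its bits differently:

```python
def run_microinstruction(uinst: int, regs: dict, actions: dict) -> None:
    """Toy model: one micro-instruction performs a register move
    (source -> dest) and an action 'in parallel'."""
    src = (uinst >> 16) & 0x1F    # hypothetical 5-bit source field
    dest = (uinst >> 11) & 0x1F   # hypothetical 5-bit destination field
    action = uinst & 0x7FF        # hypothetical action field
    regs[dest] = regs[src]        # the move...
    if action in actions:
        actions[action]()         # ...and the action alongside it
```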

JoJaSciPo

@kenshirriff

Fascinating! Thanks!!

reminds me horribly of that book The Soul of a New Machine:

en.wikipedia.org/wiki/The_Soul

That book confirmed all my worst suspicions of working in the USA and permanently killed any ambition to do so! 😂

Steve

@kenshirriff
I don't suppose you know of a teardown of the Intersil 6100 chip (also made by Harris)? Fond memories of using it in medical instrumentation (battery powered). Also, one of my fave books is Prosser/Winkel's 'The Art of Digital Design', which builds a PDP-8 from scratch; it would be great to see how the chip did this.

archive.org/details/bitsavers_

Ken Shirriff

Early microprocessors could add and subtract but didn't have multiply or divide instructions. The Intel 8086 processor (1978) fixed this with multiply and divide instructions, making life simpler for programmers. Multiplication used shift-and-add loops written in microcode.🧡

Ken Shirriff

Binary multiplication is much like long grade-school multiplication, but simpler because each partial product is either 0 or the multiplicand. A processor can implement this by shifting the number and adding it in each cycle of a loop. On early processors such as the 6502, this was done in assembly code.
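The shift-and-add loop described above can be sketched in Python; again, this illustrates the algorithm, not the 8086's microcode itself:

```python
def multiply(a: int, b: int, bits: int = 8) -> int:
    """Unsigned shift-and-add multiplication: for each multiplier bit,
    the partial product is either the shifted multiplicand (bit = 1)
    or zero (bit = 0), just like grade-school long multiplication."""
    product = 0
    for i in range(bits):
        if (b >> i) & 1:
            product += a << i  # add the multiplicand, shifted into place
    return product
```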

Ken Shirriff

Intel introduced the 8086 microprocessor in 1978 and it still has a huge influence through the modern x86 architecture used today. This 16-bit processor contains a bunch of registers, some of them hidden. I reverse-engineered the 5-bit code that it uses to select registers. 🧡

Ken Shirriff

Instructions for the 8086 processor specify registers through 3 bits in the opcode or following byte. This is expanded to a 5-bit code to support 16-bit registers (red), 8-bit registers (blue), segment registers (green), and special internal registers.
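The 3-bit architectural codes are documented in Intel's manuals; the internal 5-bit codes are what Ken reverse-engineered. Here's a sketch of the documented part only (the internal 5-bit expansion is deliberately not invented here):

```python
# Architectural 3-bit register codes from the 8086 instruction set.
# Internally the chip widens these to 5-bit codes that also distinguish
# width, segment registers, and hidden registers; those exact values
# are the reverse-engineered part and aren't reproduced here.
REG16 = ["AX", "CX", "DX", "BX", "SP", "BP", "SI", "DI"]
REG8 = ["AL", "CL", "DL", "BL", "AH", "CH", "DH", "BH"]
SEG = ["ES", "CS", "SS", "DS"]

def decode_reg(code3: int, is_16bit: bool) -> str:
    """Decode the 3-bit reg field from an opcode or ModR/M byte."""
    return (REG16 if is_16bit else REG8)[code3 & 0b111]
```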

chris

@kenshirriff My first PC used the 8088. Boy was that fast 🤗.

Ken Shirriff

The rotating globe of the Globus INK (1967) showed Soviet cosmonauts their location in orbit. I reverse-engineered this analog gear-driven computer. We tested the "landing position" feature that rapidly spins the globe to show where they would land. Then it spins back to orbit.🧡

Ken Shirriff

The Globus advances its position every second. That's the annoying clicking sound, solenoids ratcheting the gears forward. For the landing position, a separate motor (upper right) spins the globe, stopping when an arm hits the limit switch. The relay board controls the motor.

Teixi

@kenshirriff

Inertial navigation systems, gyroscopes, gimbals, etc.: one of the best examples of military research contemporaneously contributing to science research, and then also to the aerospace industry!

A history book about all this still remains to be written, with a central chapter for Draper Laboratory ;)

You should log/vlog this!!

http :verified:

@kenshirriff It's amazing what was done with the newest technology that was available at that time. Probably nobody could imagine then that all that gets replaced by a generic screen and some super fast calculations done by billions of tiny transistors controlling the screen. But it's the same today; we just use the latest technology at hand and nobody knows what the future brings.

Ken Shirriff

The Intel 8086 processor (1978) has a complex instruction set with instructions from 1 to 6 bytes long. How does the processor determine the instruction length? It turns out that there is no explicit length. A ROM says if 1 or 2 bytes, then the microcode fetches bytes until done. 🧡

Ken Shirriff

Instruction processing starts with the Group Decode ROM, which classifies instructions: 1 byte implemented in logic, a prefix, 1+ byte using microcode, or 2 bytes+ (including ModR/M byte) using microcode. A circuit called the loader gets 1 or 2 bytes from the prefetch queue.
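The loader's role can be sketched as below. The classification values and names are illustrative stand-ins for the Group Decode ROM's outputs, not its real signals:

```python
from enum import Enum, auto

class Group(Enum):
    ONE_BYTE_LOGIC = auto()  # complete 1-byte instruction, done in logic
    PREFIX = auto()          # prefix byte; keep classifying
    MICROCODE_1 = auto()     # 1+ bytes, handled by microcode
    MICROCODE_2 = auto()     # opcode + ModR/M byte, then microcode

def fetch_instruction(queue: list, classify) -> list:
    """Toy 'loader': pull 1 or 2 bytes from the prefetch queue based on
    the Group Decode ROM's classification. Any further bytes
    (immediates, displacements) would be fetched later by microcode."""
    inst = [queue.pop(0)]
    group = classify(inst[0])
    while group is Group.PREFIX:   # prefixes are consumed one at a time
        inst.append(queue.pop(0))
        group = classify(inst[-1])
    if group is Group.MICROCODE_2:  # grab the ModR/M byte as well
        inst.append(queue.pop(0))
    return inst
```

There is never an explicit "length" field; the total length emerges from this 1-or-2-byte start plus however many bytes the microcode then decides to fetch.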

Ken Shirriff

The Intel 8086 microprocessor (1978) led to the x86 architecture that your computer probably runs today. The 8086 provided a complicated set of memory access modes to get values from memory. Let's take a close look at how microcode and hardware work together to implement them. 🧡

thinkberg

@kenshirriff Do you only decode HW? I still have a 255-byte .COM that does stuff you could identify :)

Neko May

@kenshirriff Very much loving these microcode deep dives!

j2i.net

Thinking about this, I'm thinking about the movie "Inside Out." Only instead of people controlling a person it's a program controlling programs.

@kenshirriff

Ken Shirriff

How did Russian cosmonauts know where they were? The Globus INK (1967) showed the position of their Soyuz spacecraft on a rotating globe. It is an analog computer built from tiny gears. I reverse-engineered the wiring (which inconveniently had been cut) and we powered it up. 🧡

Ken Shirriff

The iconic Intel 8086 processor (1978) set the path for modern computers. Like most CPUs, it supports interrupts, a mechanism to interrupt the regular flow of processing. I've been reverse-engineering it from the silicon die, and I can explain how its interrupt circuitry works.

Ken Shirriff

The interrupt circuitry is implemented both in microcode and hardware. Microcode is a layer between machine instructions and the hardware, executing low-level 21-bit micro-instructions. These perform moves, combined with several types of actions: ALU, memory, jumps, etc.

Ken Shirriff

The Bendix Central Air Data Computer (CADC) is an amazing electromechanical device that computed airspeed, altitude, and other "air data" for fighter planes such as the F-104 and F-111. Digital computers weren't good enough in 1955, so the CADC used gears, cams and synchros.🧡

Ken Shirriff

What's inside the famous 8086 processor from 1978? I opened up a chip, took microscope photos, and I'm reverse-engineering it. One of the 8086's instructions is HLT, which halts the processor. Seems simple, but there's a lot of circuitry to make the halt instruction work... 🧡

Ken Shirriff

This diagram shows the main parts of the 8086 chip; dark labels are affected by HALT. The 8086 is partitioned into an Execution Unit which executes instructions, and a Bus Interface Unit which performs memory operations and preloads instructions into the prefetch queue.

Henry H

@kenshirriff

Amazing. I can only imagine what programming in assembler with this level of understanding would be like.

Joe Wein

@kenshirriff

The CP/M 2.2 command processor for the 8080 checked the Digital Research copyright notice and executed a HLT instruction to lock up the machine if it had been tampered with (e.g. pirated copy of the OS).

The 8086 version of CP/M used the same machine translated code, but typical 8086/88 machines were interrupt driven, waking the machine again at the next keystroke and thus rendering the anti-piracy check toothless.

Ken Shirriff

The Globus INK (1967) is a remarkable piece of Soviet spacecraft equipment. Its rotating globe showed cosmonauts the position of their Soyuz spacecraft. An electromechanical analog computer, it used gears, cams, and differentials to compute the position. Let's look inside 🧡

racoonmedia

@kenshirriff awesome technology. reminds me a bit of the KSP navball. Is there a way to replicate such a tech with modern off the shelf thingies?

Jona Joachim

@kenshirriff
We can't see the actual mechanism that makes the globe spin. Rubber wheels? Like a computer mouse in reverse mode?
@jaj

Ken Shirriff

Everyone says the Intel 8086 processor has 29,000 tiny transistors. But I counted and found 19,618. Why the difference? It turns out that most counts include "potential" transistors as well as real ones. Let's take a look and find out why.🧡

DELETED

@kenshirriff I am guessing this got used in the 80186 variant, which included a lot more on board support circuitry.
