Ken Shirriff

The ancestor of the 8086 processor is the Datapoint 2200, a desktop minicomputer used as an intelligent terminal. Made before the microprocessor, the Datapoint built its processor from a board of chips. The Intel 8008 cloned the Datapoint, the first step to the x86 architecture. 🧵

Ken Shirriff

Here's the Datapoint 2200's processor board, crammed with simple TTL chips. Datapoint asked Texas Instruments and Intel if they could replace the board with a single chip. Texas Instruments created the TMX 1795, the first 8-bit microprocessor. Intel created the 8008 processor.

Ken Shirriff

The newfangled microprocessors were too slow, so Datapoint rejected them (bad move). Texas Instruments advertised their "CPU on a chip" but couldn't find a customer for the TMX 1795 and abandoned it (also bad move). Intel marketed the 8008, creating the microprocessor industry.

Ken Shirriff

Because the Datapoint 2200 used low-cost shift-register memory instead of RAM, it operated serially and needed to be little-endian. The 8008 copied this and that's why Intel processors are little-endian today.
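A minimal C sketch of what that byte order means in practice: on a little-endian machine such as x86, the least significant byte of a multi-byte value sits at the lowest address, the same order in which a serial, bit-at-a-time ALU has to consume the value so that carries can propagate.

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* On x86 (little-endian), the low byte of a multi-byte value is
       stored at the lowest address, matching the order a serial ALU
       processes it: least significant bits first. */
    uint32_t value = 0x11223344;
    const uint8_t *bytes = (const uint8_t *)&value;

    for (int i = 0; i < 4; i++)
        printf("byte %d: 0x%02x\n", i, bytes[i]);
    /* Expected output on x86: 0x44 0x33 0x22 0x11 (low byte first). */
    return 0;
}
```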

Ken Shirriff

The Datapoint 2200 had a parity flag, very useful for a terminal. It had I/O instructions for its hardware. That's why x86 has a parity flag and uses I/O instructions.
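For context, x86's parity flag (PF) is set when the low 8 bits of a result contain an even number of 1 bits, the kind of check a terminal receiving 7-bit ASCII plus a parity bit needs to make constantly, which is presumably why it was so useful. A small C sketch of the equivalent software check:

```c
#include <stdint.h>
#include <stdio.h>

/* Software equivalent of x86's parity flag: report whether a byte
   contains an even number of 1 bits. */
static int even_parity(uint8_t byte) {
    int ones = 0;
    for (int i = 0; i < 8; i++)
        ones += (byte >> i) & 1;
    return (ones % 2) == 0;   /* 1 = even parity, like PF being set */
}

int main(void) {
    printf("0x41 even parity? %d\n", even_parity(0x41)); /* 'A' = 0b01000001, two 1 bits  -> even */
    printf("0x43 even parity? %d\n", even_parity(0x43)); /* 'C' = 0b01000011, three 1 bits -> odd */
    return 0;
}
```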

Ken Shirriff

Intel improved the 8008 to create the 8080 processor, which was popular in embedded systems. The first generation of home computers (Altair, IMSAI) used the 8080. Because of backward compatibility, the 8080 still had the Datapoint instructions and features.

Ken Shirriff

The 8086 was a big improvement over the 8080, a 16-bit processor instead of 8. The 8086's register names originally matched the Datapoint ones: A, B, C, D, E, H, L, as shown in this 8086 patent diagram. But these were renamed AX, BX, CX, and DX just before release.

Ken Shirriff

The 8086 was designed to be backward compatible with the 8080 through a conversion program called CONV86, so it inherited the Datapoint features. The 8086 was extended to the modern x86 architecture used in most laptops and servers today.

Ken Shirriff

So that's how the modern x86 architecture developed from an obscure desktop computer called the Datapoint 2200. For lots of details and a close look at the instruction sets, see my blog post: righto.com/2023/08/datapoint-t

Jyrgen N

@kenshirriff I read your earlier (but in substance identical) account of the Datapoint/Intel history a while ago. It is mind-blowing how the design decisions of a rather obscure intelligent terminal in 1970 still shape a large part of computing today, more than 50 years later — and probably for decades to come. No one could have ever imagined that at the time, and I even have a hard time grasping it now. Thanks a lot for sharing this!

Gerard van Oel

@kenshirriff @schotanus I have never worked on it, but in the mid-80s Neddata, the IT part of Nedlloyd Shipping Company, had one small but important system running on a Datapoint. The rest ran on IBM mainframe(s) and something new called DEC. We had one PC for 125 IT employees. 😂

Alex Rosenberg

@kenshirriff I’d heard that the parity bit was brought forward from the 4004 and its use in operating traffic lights.

Ken Shirriff

@alexr Unfortunately, there are two problems with that theory. The 4004 does not have parity and the 8008 is unrelated to the 4004.

Zorin =^o.o^=

@kenshirriff The fact that a design decision in the 1970s led to an architecture trait that has endured for over 50 years now is pretty amazing.

Ken Shirriff

@zorinlynx The IBM System/360 architecture from 1964 is similarly amazing, since IBM's mainframes are still compatible with it.

mytwobits01

@kenshirriff @zorinlynx
Fascinating.
I imagine there was a mixture of "we thought it was a great design feature and had no reason to know it would turn out to be so troublesome" and "it's a great design feature because there was so little prior art constraining us and the possibilities were wide open".

Bread80

@kenshirriff BTW that looks like the video board - which used 14 shift-register memory ICs (7-bit ASCII, two chips per bit).

This is a memory card from the d2200 recreation I’m currently working on. It’s not engineered for the power-hungry 1405s, which I’ll need to emulate with a daughter board.

Decoder board is already done, processor board is at the layout stage. I’m adding as many blinkenlights as I can fit on 🙂

Bruce Elrick

@kenshirriff The sins of the father are revisited upon the sons.

Rupert Reynolds

@kenshirriff That's the first time an explanation of why x86 is LE has made sense to me!

William D. Jones

@kenshirriff AFAIK, the TMX 1795 was never released to the public. But at least one chip _does_ exist, and it _does_ work.

Dani (:cxx: modules addict)

@kenshirriff This looks kind of similar to the bit-slice machine that I've been working with in my last university year (and later on). The most complex chip on that board was a 16x16 multiplier. Hence it was used as a DSP.

Ken Shirriff

@nwp The Intel 4004 and the Intel 8008 processors are unrelated apart from the part numbers. They have completely different architectures. Many of the same people worked on both chips, though.

Kalman Reti

@kenshirriff I owned one of these in the early 1970s while living in a Philadelphia apartment. It was extraordinarily heavy, uppercase only (you could buy an extra board for lowercase), and communicated over a 110-baud acoustic coupler.

CynBlogger™️

@kenshirriff
OMG! OMG! That was the first mini I worked on. My company went on to become one of Datapoint’s best/biggest customers — they used me in one of their national ads. 😃

CynBlogger™️

@kenshirriff
The OS ran from one of the cassette decks - the other was used for custom application software.

Ricardo Bánffy

@kenshirriff I love how it was widescreen before widescreen was cool.

Ken Shirriff

@piikkiniska Your Datapoint looks older because it has the CTC logo instead of the Datapoint logo. Is it a Version I?

Ken Shirriff

@piikkiniska Very nice! The Datapoint 2200 Version I is pretty rare, at least I don't know of anyone else with one.


@kenshirriff I didn't know of such a relationship. Does the 4004 fit into the story?

Anna Nicholson

@kenshirriff Before I read your (fascinating) thread, I was a little surprised to hear that the 8008 wasn’t just an 8-bit development of the groundbreaking 4004 (as I’d assumed, despite having worked at Intel in the ’80s – clearly the marketing ploy worked on me!)

So I popped over to Wikipedia, where I found that the 8008 was indeed logically unrelated (and was originally going to be called the 1201), though unsurprisingly it used the same PMOS technology and 10-micron fabrication process as the 4004, which was only released a few months earlier

One thing I noticed is that your chronology differs slightly from that of the Wikipedia authors, and I’m curious to know which is right 🤔

The ‘History’ section of the Intel 8008 article [1] leans heavily on a 2008 article in Computerworld [2] (and incidentally also cites an interview with you for the naming of the chip!)

According to the Computerworld article, CTC’s first model was the Datapoint 3300, which used a TTL CPU board (and ran hot)

Its successor was supposed to use a CPU-on-a-chip, which CTC commissioned from Intel and Texas Instruments in early 1970

However, CTC clearly expected a very rapid turnaround (!) and decided to develop a TTL version of the Datapoint 2200, which sold its first units months later, in May 1970 [2]

Both Intel and TI demonstrated working 8-bit processors in 1971: TI got there first, in February [3], but its TMX 1795 was unreliable, and never made it into production; Intel delivered its 1201 to CTC towards the end of the year [2]

CTC by this time, though, was developing the TTL-based Datapoint 2200 II, which rendered Intel’s 1201 obsolete, so they ceded IP rights to Intel in lieu of paying them for the work [2]

Intel then marketed the ‘1201’ as the 8008 in April 1972

Whichever version of the story is correct, the Intel 8008 and the Datapoint 2200 clearly have a common heritage, and CTC’s commissioning of Intel to produce a processor that was never used by them ultimately led, by an accident of history, to the 8-bit Intel 8080 (and Zilog Z80) and the 16-bit Intel 8086, which formed the heart of the IBM PC

[1] en.wikipedia.org/wiki/Intel_80

[2] computerworld.com/article/2532

[3] en.wikipedia.org/wiki/Micropro

📷 CPU Collection Konstantin Lanzet, CC BY-SA 4.0
