77 posts total
Ken Shirriff

Ethernet was invented at Xerox PARC in 1973 and became the most popular way to wire computers into a network. AMD made the LANCE chip in 1982, containing most of the interface circuitry. I made a die photo of the chip. 1/11

Ken Shirriff

I was puzzled by the mysterious yellow stripes on the chip. Most of the circuitry was standard NMOS: the purple-gray silicon with reddish polysilicon on top providing wiring and creating the transistor gates. But the yellow stripes were something new. 2/11

Ken Shirriff

A transparent IC? That's what we found inside a vintage HP floppy drive! The PHI chip (1977) is constructed from silicon-on-sapphire and you can see the gold "X" right through the die's sapphire substrate. 1/9

Ken Shirriff

Silicon-on-sapphire got its start in 1963. Instead of starting with a silicon wafer, silicon circuits are built on top of a synthetic sapphire base. The sapphire provides insulation between the transistors, improving performance and making the chip resistant to radiation. 2/9

isaiah

@kenshirriff in the 90s, when i was working in chip design, SoS chips were primarily used for military and space applications as they are much more resilient to radiation than silicon substrates. however that seems a tad overkill for a floppy drive. ;-)

Ken Shirriff

Intel's 386 processor (1985) was an important milestone, moving Intel to a 32-bit architecture. It's a complicated chip, but fundamentally it is built from logic gates. I found that it uses two completely different circuits to implement the XOR function... 1/9

Ken Shirriff

Zooming way in on the 386 shows the transistors for two XOR gates. In red, a shift register for the chip's self-test feature contains XOR gates built from pass transistors. Yellow: prefetch queue control circuitry uses a completely different standard-cell XOR circuit. 2/9
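The two circuit styles can be sketched in Python (a logic-level toy, not a transistor model; the function names are mine, not Intel's):

```python
def xor_pass_transistor(a: int, b: int) -> int:
    """XOR built like a pass-transistor multiplexer:
    b selects between a and NOT a."""
    not_a = 1 - a
    return a if b == 0 else not_a  # b "passes through" a or its complement

def xor_standard_cell(a: int, b: int) -> int:
    """XOR built from AND/OR/NOT gates, as a standard cell might:
    (a AND NOT b) OR (NOT a AND b)."""
    return (a & (1 - b)) | ((1 - a) & b)

# Both styles compute the same truth table.
for a in (0, 1):
    for b in (0, 1):
        assert xor_pass_transistor(a, b) == xor_standard_cell(a, b) == a ^ b
```

The pass-transistor version needs fewer transistors but has weaker drive; the standard-cell version is easier to place automatically, which matches why the two show up in different parts of the chip.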

Robert Hollingshead :donor:

@kenshirriff I love playing “Where’s Waldo” with designer initials! I’m wondering what that symbol is between ET and CK above the segment unit. Almost looks like a telephone.

Ken Shirriff

The Intel 386 processor (1985) is a complicated chip. One of its features is a "barrel shifter" (red) that can shift binary values by 0 to 32 bits in one step, much faster than shifting one bit at a time. Let's see how it works. 1/11
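The trick behind a barrel shifter is to decompose the shift amount into powers of two, with one mux stage per bit. A minimal Python sketch of the idea (this models shifts of 0-31; a shifter that also handles a full 32-bit shift needs one more stage):

```python
def barrel_shift_left(value: int, amount: int, width: int = 32) -> int:
    """Shift `value` left by `amount` (0-31) in log2(width) stages, the
    way a barrel shifter does in hardware: each stage either shifts by
    a fixed power of two or passes the value through unchanged."""
    mask = (1 << width) - 1
    for stage in range(5):            # stages shift by 1, 2, 4, 8, 16
        if amount & (1 << stage):     # this bit of `amount` enables the stage
            value = (value << (1 << stage)) & mask
    return value

assert barrel_shift_left(1, 31) == 0x80000000
assert barrel_shift_left(0x12345678, 0) == 0x12345678
```

Five stages handle any shift from 0 to 31, so the whole shift completes in constant time instead of looping once per bit.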

John Gordon ⚡️

@kenshirriff A barrel shifter was the first thing we had to design in our silicon design class back in the early 1990s.

While doing that, my own home computer was running a very much simpler chip (an early ARM). You can see the barrel shifter and ALU in the ARM1 in the link below. Mine was running an ARM 2 initially (and latterly was upgraded to an ARM 3), but very similar to this design.

hackaday.com/2015/12/21/revers

James Trickle uP

@kenshirriff The first microprocessor I worked with that had a barrel shifter was at my first job at #Tektronix - port the #Magnolia #workstation (#Motorola 68000) V7 #UNIX first to 68010 but a few years later to 68EC040. The '40 had the barrel shifter. Man, rotations were zippy. Later for #WindRiver I ported #VxWorks RTOS to #MIPS and #PowerPC chips. PPC bitfield ops were efficiently done with a bunch of rotate with mask instructions - IMO the pinnacle, the kitchen sink of embedded instructions.

Ken Shirriff

The 386 processor (1985) was pivotal for Intel, moving the x86 architecture to 32 bits and moving the design to CMOS. Everything in the processor is controlled by the clock. Some tricky circuitry on the chip (red) generates the clock; let's take a look inside the die... 1/11

Ken Shirriff

Early microprocessors such as the 8080 used a two-phase clock: when one phase is high, the other is low, with a gap between the high part of each phase. Processing moves step-by-step through the circuitry, one phase at a time. 2/11
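The key property is that the two phases are never high at the same time. A toy Python generator illustrating the waveform (the four-sample pattern is a simplification; real non-overlap gaps are set by analog delays):

```python
def two_phase_clock(cycles: int):
    """Yield (phi1, phi2) samples for a two-phase, non-overlapping
    clock: phi1 high, gap (both low), phi2 high, gap -- then repeat."""
    pattern = [(1, 0), (0, 0), (0, 1), (0, 0)]
    for _ in range(cycles):
        yield from pattern

samples = list(two_phase_clock(2))
# The defining property: the phases never overlap.
assert all(not (p1 and p2) for p1, p2 in samples)
```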

Alolan Yoda

@kenshirriff I found your account through your blog post and I'm very happy to have found it. Great write-up, super interesting. Thank you for taking the time to write this!

Ken Shirriff

The Intel 386 processor (1985) was a key member of the x86 line, moving to 32 bits. It has a bunch of on-chip registers, implemented with compact, highly-optimized circuitry. Let's look at the circuit, called T8, that implements some of these registers. 1/11

Ken Shirriff

The registers are implemented as static RAM, using a circuit called T8 because it has 8 transistors. The basic concept is to put two inverters in a loop, so they can stay in either the 0 or the 1 state. Each inverter provides the input to the other. 2/11
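The inverter loop can be modeled in a few lines of Python (a behavioral toy of the storage loop only; the other transistors in the real T8 cell handle the two read/write access ports):

```python
class StaticCell:
    """Toy model of the cross-coupled-inverter core of a static RAM
    cell: each inverter's output drives the other's input, so the pair
    settles into one of two stable states, storing a 0 or a 1."""
    def __init__(self, bit: int = 0):
        self.q = bit          # output of inverter A
        self.q_bar = 1 - bit  # output of inverter B

    def settle(self) -> None:
        # Each inverter recomputes its output from the other's output.
        self.q, self.q_bar = 1 - self.q_bar, 1 - self.q

    def write(self, bit: int) -> None:
        # Access transistors force both sides; the loop then holds it.
        self.q, self.q_bar = bit, 1 - bit

cell = StaticCell()
cell.write(1)
for _ in range(10):
    cell.settle()     # the stored value is self-reinforcing
assert cell.q == 1 and cell.q_bar == 0
```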

Tim Bray

@kenshirriff I’m old and, looking back, I think this was the biggest CPU news in my whole lifetime. Developers' productivity got maybe its biggest boost ever from not having to wear 16-bit shackles.

Ken Shirriff

Introduced in 1973, Ethernet is the best way to wire your computer to a network. But it was very expensive, costing thousands of dollars for an interface board. Intel's 82586 chip (1982) simplified the interface, dropping network prices to "just" $1000. Let's look inside... 1/14

Ken Shirriff

At the time, Ethernet chips usually just translated data packets into a stream of network bits, computing checksums. Intel's chip went much further: it included a coprocessor to move data between the network and memory independently of the main processor (i.e., DMA). 2/14
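The checksum in question is Ethernet's 32-bit frame check sequence. Python's `zlib.crc32` happens to use the same CRC-32 polynomial, so the computation can be sketched directly (the function name is mine):

```python
import zlib

def ethernet_fcs(frame: bytes) -> int:
    """Compute a 32-bit frame check sequence over a frame's bytes.
    zlib.crc32 uses the same reflected CRC-32 polynomial (0x04C11DB7)
    that Ethernet appends to every frame."""
    return zlib.crc32(frame) & 0xFFFFFFFF  # mask keeps it unsigned

# A corrupted frame almost certainly produces a different FCS.
good = ethernet_fcs(b"some packet payload")
bad = ethernet_fcs(b"some packet paylaod")
assert good != bad
```

In hardware this is a shift register with a few XOR taps, updated one bit at a time as the frame streams out, which is why even the simple chips of the era could afford it.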

Ken Shirriff

The Intel 386 processor (1985) was the first 32-bit processor in the x86 line. Let's take a close look at the processor dies, seeing how Intel shrunk the chip, created new versions, and why the 386 SL jumped from 285,000 transistors to 855,000 transistors. 1/9

MHowell

@kenshirriff One aerospace project I worked on used obsolete x386 uP's that were running out of stock. Rather than upgrade and recertify, they bought the mask(s) and had enough fabricated to support it to EOL.

lopta

@kenshirriff I haven't skipped ahead but I'm guessing cache and Service Mode added transistors. #i386

Ken Shirriff

Intel's Israel site was opened in 1974, Intel's first design and development center outside the US. In this thread, I'll look at some of the important vintage chips designed at Intel Israel, such as the 8088 processor. 1/9

Ken Shirriff

The 8088 processor was designed at Intel Israel. This is indicated on the die as "i/IL". The Intel 8088 was selected for the IBM PC (1981), ensuring the success of the x86 architecture. The 8088 is a 16-bit chip internally but uses an 8-bit bus to reduce system cost. The 8088 is mostly the same as the 8086 but the bus control is redesigned, the prefetch queue is smaller, and there are numerous changes throughout the chip. 2/9

Peter Jakobs ⛵

@kenshirriff I never knew there was a Hebrew version of the intel logo

Ken Shirriff

How did fighter planes in the 1950s perform calculations before compact digital computers were available? With the Bendix Central Air Data Computer! This electromechanical analog computer used gears and cams to compute "air data" for fighter planes such as the F-101. 1/13

Jim Flanagan

@kenshirriff One of these may have fallen out of a plane off the coast of the Greek island Antikythera, landed on a shipwreck, and gotten all rusty.

Giacomo Amoroso

@kenshirriff This is sooo mind boggling, but so ingenious. Thank you for sharing this

Ken Shirriff

Flip-flops are a key component of a microprocessor, holding a 0 or a 1 value. When the "clock" signal triggers, the flip-flop loads a new value. I examined the silicon die of the Intel 8086 processor (1978) and found 184 flip-flops (colored dots). 1/7

Ken Shirriff

Flip-flops have many roles in the 8086. They hold the instruction and various fields. They store condition flags such as carry. They hold the microcode address and the 21-bit micro-instruction. They implement state machines for reads and writes. They manage the prefetch queue.
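The behavior all those flip-flops share can be sketched in a few lines of Python (a behavioral model of an edge-triggered D flip-flop, not the 8086's transistor-level circuit):

```python
class DFlipFlop:
    """Edge-triggered D flip-flop: the output takes on the input's
    value only at the moment the clock goes from low to high."""
    def __init__(self):
        self.q = 0
        self._last_clk = 0

    def tick(self, clk: int, d: int) -> int:
        if clk == 1 and self._last_clk == 0:  # rising clock edge
            self.q = d
        self._last_clk = clk
        return self.q

ff = DFlipFlop()
ff.tick(0, 1)   # input high, but no clock edge yet
assert ff.q == 0
ff.tick(1, 1)   # rising edge: capture the 1
assert ff.q == 1
ff.tick(1, 0)   # input changes without an edge: value is held
assert ff.q == 1
```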

lopta

@kenshirriff The die shot is an aerial photo of your home town. #8086cpu

Ken Shirriff

The Bryant 4000C disk drive (1965) was absurdly large. The drive was 52" tall, weighed 3551 pounds, and held 205 megabytes. Now you can get a 1TB flash drive: 5000 times the storage for 1/50,000 the weight.
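The ratios check out in rough arithmetic (the drive's weight is from the post; the ~30 g flash-drive weight is my assumption for illustration):

```python
# Storage: 1 TB flash vs. 205 MB Bryant 4000C.
bryant_bytes = 205e6
flash_bytes = 1e12
storage_ratio = flash_bytes / bryant_bytes
assert 4500 < storage_ratio < 5500        # ~5000x the storage

# Weight: 3551 lb drive vs. an assumed ~30 g flash drive.
bryant_grams = 3551 * 453.6               # pounds to grams
flash_grams = 30
weight_ratio = bryant_grams / flash_grams
assert 45_000 < weight_ratio < 60_000     # ~1/50,000 the weight
```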

Killa Koala

@kenshirriff Big, heavy, slow and power hungry, but your data was safer than on a modern SanDisk SSD... 😉

nsfw :donor:

@kenshirriff And it was "as low as" 0.06 cents per byte (0.58 cents in today's dollars) and had an MTBF of 2,000 to 3,000 hours.

balkongast

@kenshirriff

Smaller and less resource consuming is better?
Whoa, don't tell economists, the car industry or republican politicians?

Ken Shirriff

The ancestor of the 8086 processor is the Datapoint 2200, a desktop minicomputer used as an intelligent terminal. Built before the microprocessor existed, the Datapoint implemented its processor with a board of chips. The Intel 8008 cloned the Datapoint, the first step toward the x86 architecture. 🧵

Anna Nicholson

@kenshirriff Before I read your (fascinating) thread, I was a little surprised to hear that the 8008 wasn’t just an 8-bit development of the groundbreaking 4004 (as I’d assumed, despite having worked at Intel in the ’80s – clearly the marketing ploy worked on me!)

So I popped over to Wikipedia, where I found that the 8008 was indeed logically unrelated (and was originally going to be called the 1201), though unsurprisingly it used the same PMOS technology and 10-micron fabrication process as the 4004, which was only released a few months earlier

One thing I noticed is that your chronology differs slightly from that of the Wikipedia authors, and I’m curious to know which is right 🤔

The ‘History’ section of the Intel 8008 article [1] leans heavily on a 2008 article in Computerworld [2] (and incidentally also cites an interview with you for the naming of the chip!)

According to the Computerworld article, CTC’s first model was the Datapoint 3300, which used a TTL CPU board (and ran hot)

Its successor was supposed to use a CPU-on-a-chip, which CTC commissioned from Intel and Texas Instruments in early 1970

However, CTC clearly expected a very rapid turnaround (!) and decided to develop a TTL version of the Datapoint 2200, which sold its first units months later, in May 1970 [2]

Both Intel and TI demonstrated working 8-bit processors in 1971: TI got there first, in February [3], but its TMX 1795 was unreliable, and never made it into production; Intel delivered its 1201 to CTC towards the end of the year [2]

CTC by this time, though, was developing the TTL-based Datapoint 2200 II, which rendered Intel’s 1201 obsolete, so they ceded IP rights to Intel in lieu of paying them for the work [2]

Intel then marketed the ‘1201’ as the 8008 in April 1972

Whichever version of the story is correct, the Intel 8008 and the Datapoint 2200 clearly have a common heritage, and CTC’s commissioning of Intel to produce a processor that was never used by them ultimately led, by an accident of history, to the 8-bit Intel 8080 (and Zilog Z80) and the 16-bit Intel 8086, which formed the heart of the IBM PC

[1] en.wikipedia.org/wiki/Intel_80

[2] computerworld.com/article/2532

[3] en.wikipedia.org/wiki/Micropro

📷 CPU Collection Konstantin Lanzet, CC BY-SA 4.0

Ken Shirriff

A simple repair at twitter.com/ComputerHistory. The IBM 729 tape drive provided bulk storage on a 2400 ft reel of magnetic tape. But one of the lights on the control panel burned out. It uses an obscure 55-volt telco bulb in an unusual package.

bls

@kenshirriff OMG! 729 tape drives! We had 4 of them on our DEC-10 at U Illinois Physics when I was there. I'm amazed that they're still around 50 years later 😮

Alex Lindsay

@kenshirriff I loaded tapes on something like this, back in the 80s. It was fun watching them thread the tape across to the other reel by themselves, and watching the two loops in the vertical channels when it was spinning madly...

http :verified:

@kenshirriff I remember these lamps from my childhood, but don't remember the device. Certainly no computer or telephone.

Ken Shirriff

I found a hidden name in the Intel 8088 processor. The 8088 was a derivative of the 8086 processor introduced in 1979 and best known as the processor in the IBM PC. I dissolved the chip's metal layer and found "רפי", the name in Hebrew of Rafi Retter, the chip's engineer.

Ken Shirriff

The Intel 8086 microprocessor was introduced in 1978 and led to the x86 architecture in use today. One of its obscure features is the "bus hold", allowing another device to temporarily take over communication with memory. This circuitry is in the upper left corner of the chip.🧵

Ken Shirriff

During a bus hold, the 8086 processor stops using the memory bus and electrically disconnects from the bus by going into "tri-state" mode. This lets an external device take control of the bus and access memory directly. This can be used for high-speed input/output (DMA).

Ken Shirriff

The Intel 8086 processor (1978) started the x86 architecture still used today. This chip only had 40 pins, so the address pins needed to be reused for data or status. (The pins are connected to the pads around the edge.) It took some tricky circuitry to make this work. 🧵

Ken Shirriff

The output pins need high current, so each pin has multiple large transistors in parallel. This diagram shows how long, parallel transistors are created from polysilicon and silicon, and then wired together by the metal layer on top.

Rozzychan

@kenshirriff
That brown photo looks like a sepia print of a really well organized area with crops that people could live in.

Ken Shirriff

The Intel i960 was the most popular RISC chip of the mid-1990s. This powerful 32 (or 33!) bit processor was used in embedded applications including the F-22 fighter plane. The i960CA was the world's first superscalar microprocessor, running two instructions every clock cycle. 🧵

Ken Shirriff

The i960's roots lie in the iAPX 432, a "micro-mainframe" processor that Intel started in 1976 to revolutionize microprocessors with 32-bit mainframe power in an object-oriented chip. Spoiler: it did not. With the 432 far behind schedule, Intel introduced the 8086 processor as a temporary stopgap.

Ivor Hewitt

@kenshirriff
Wow fascinating, was there any relation between the i860 and the i960? I remember we were accelerating video decoding in the 90s with a hugely expensive (I thought at the time) dual i860 board.

ivar

@kenshirriff very interesting, reading your posts is always a joy :)

keep it up!

Ken Shirriff

Fighter planes in the 1950s used the Bendix Central Air Data Computer to determine airspeed, Mach number, altitude, and so forth from pressure. It is electromechanical, using gears and synchros for its computations. Amazingly, it is modular and can be easily disassembled.

We separated the top layer from the rest for testing. The "interface" between the layers is two gears and an electrical connection. The electronic servo amplifier blocks come off too.
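The pressure-to-Mach relation the CADC solves with cams and gears can be sketched numerically. This is the standard subsonic compressible-flow formula for gamma = 1.4 (supersonic flight needs the Rayleigh pitot formula instead, which I omit here):

```python
import math

def mach_subsonic(p_total: float, p_static: float) -> float:
    """Subsonic Mach number from pitot (total) and static pressure:
        M = sqrt(5 * ((p_total / p_static)**(2/7) - 1))
    Gears and cams in the CADC evaluate this relation mechanically."""
    return math.sqrt(5.0 * ((p_total / p_static) ** (2.0 / 7.0) - 1.0))

# Equal pressures means no airspeed: Mach 0.
assert mach_subsonic(101325.0, 101325.0) == 0.0
# A pressure ratio of ~1.893 corresponds to Mach 1.
assert abs(mach_subsonic(1.8929, 1.0) - 1.0) < 0.01
```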

Ken Shirriff

We powered up the block that converts a temperature reading to a gear rotation, but nothing happened. So I'm reverse-engineering the circuitry to figure out what's gone wrong. Perhaps a bad capacitor: some of the old capacitors show fungus or corrosion.

http :verified:

@kenshirriff I imagine being a caveman and finding this. Sophisticated alien technology from that point of view.

Ken Shirriff

The Intel 8086 processor (1978) started the PC era and most desktop computers still use the x86 architecture. Its instruction set is complicated with a variety of formats. This made decoding each instruction a challenge. The Group Decode ROM was a key part. 🧵

Ken Shirriff

Most 8086 instructions are implemented in microcode, a level of instructions below the familiar machine instructions. But before microcode can run, something called the Group Decode ROM categorizes instructions according to their structure, shown as colors below.
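The idea is a ROM indexed by opcode that outputs a set of category signals. A hedged Python sketch of the concept follows; the categories and groupings below are illustrative, not Intel's actual ROM contents:

```python
# Toy "Group Decode ROM": opcode byte -> category signals that steer
# later decoding. Entries are illustrative examples only.
GROUP_ROM = {
    0x90: {"one_byte"},               # NOP: complete in a single byte
    0x88: {"has_modrm", "byte_op"},   # MOV r/m8, r8: needs a ModR/M byte
    0x05: {"has_immediate"},          # ADD AX, imm16: immediate follows
    0xF4: {"one_byte"},               # HLT
}

def decode_groups(opcode: int) -> set:
    """Look up an opcode's category signals (empty set if unlisted)."""
    return GROUP_ROM.get(opcode, set())

assert "has_modrm" in decode_groups(0x88)
assert decode_groups(0x90) == {"one_byte"}
```

The payoff is speed: these coarse categories are available immediately from the first byte, before the microcode engine has even started, so the prefetch and decode machinery knows how many more bytes to expect.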

David Gerhart

@kenshirriff

Fond memories... When Al Gore invented the internet? I was there.

Consumed the MS DOS 1.1 desk reference cover to cover. Boot stacks for the University coax ethernet. Ugh.

I blew tuition $ on an 8087 (had to get the math chip.)

Fortran! The code and prob data were uploaded by 56kbps modem to the CP6 for calculation. Sometimes those jobs took the whole weekend.

Switched to C.

Ran a Grad student collaborative BBS. 4 telephone lines served about 70 peers.

#history #geek

Johnny Cache

@kenshirriff
Every time I read one of your pieces I have to re-assess how much of single chip one individual can hold in their head. 😂
