Ethernet was invented at Xerox PARC in 1973 and became the most popular way to wire computers into a network. AMD made the LANCE chip in 1982, containing most of the interface circuitry. I made a die photo of the chip. 1/11
A transparent IC? That's what we found inside a vintage HP floppy drive! The PHI chip (1977) is constructed from silicon-on-sapphire and you can see the gold "X" right through the die's sapphire substrate. 1/9

Silicon-on-sapphire got its start in 1963. Instead of starting with a silicon wafer, silicon circuits are built on top of a synthetic sapphire base. The sapphire provides insulation between the transistors, improving performance and making the chip resistant to radiation. 2/9

@kenshirriff In the 90s, when I was working in chip design, SoS chips were primarily used for military and space applications as they are much more resilient to radiation than silicon substrates. However, that seems a tad overkill for a floppy drive. ;-)

Intel's 386 processor (1985) was an important milestone, moving Intel to a 32-bit architecture. It's a complicated chip, but fundamentally it is built from logic gates. I found that it uses two completely different circuits to implement the XOR function... 1/9

Zooming way in on the 386 shows the transistors for two XOR gates. In red, a shift register for the chip's self-test feature contains XOR gates built from pass transistors. Yellow: prefetch queue control circuitry uses a completely different standard-cell XOR circuit. 2/9

@kenshirriff I love playing "Where's Waldo" with designer initials! I'm wondering what that symbol is between ET and CK above the segment unit. Almost looks like a telephone.

The Intel 386 processor (1985) is a complicated chip. One of its features is a "barrel shifter" (red) that can shift binary values by 0 to 32 bits in one step, much faster than shifting one bit at a time. Let's see how it works. 1/11
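For intuition, here's a minimal software sketch of one common barrel-shifter construction: a logarithmic shifter built from layers of 2-to-1 multiplexers, each layer shifting by a fixed power of two. One bit of the shift amount selects whether each layer shifts or passes through, so any shift from 0 to 31 takes the same fixed number of steps. This illustrates the general technique, not necessarily the 386's exact circuit.

```python
# A logarithmic barrel shifter sketch: five mux layers shifting by
# 1, 2, 4, 8, 16 bits. Each bit of the shift amount selects whether
# the corresponding layer shifts or passes the value through.

def barrel_shift_left(value, amount, width=32):
    mask = (1 << width) - 1
    for stage in range(5):                       # stage shifts by 2**stage
        if amount & (1 << stage):                # mux select: shifted input
            value = (value << (1 << stage)) & mask
        # else: the mux passes the value through unchanged
    return value

assert barrel_shift_left(1, 20) == 1 << 20
assert barrel_shift_left(0xFFFFFFFF, 31) == 0x80000000
```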
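Looping back to the two XOR circuits a few posts up: the point is that the same truth table can be realized by structurally different circuits. A small sketch of that idea, using two logically equivalent formulations (these expressions are illustrative only and don't model the actual transistors):

```python
# Two equivalent ways to form XOR, echoing the 386's two circuit styles.

def xor_gates(a, b):
    # "standard cell" style: built from AND/OR/NOT logic
    return (a & ~b | ~a & b) & 1

def xor_mux(a, b):
    # pass-transistor style: `a` selects between b and NOT b, like a mux
    return (b if a == 0 else ~b) & 1

for a in (0, 1):
    for b in (0, 1):
        assert xor_gates(a, b) == xor_mux(a, b) == a ^ b
```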
@kenshirriff A barrel shifter was the first thing we had to design in our silicon design class back in the early 1990s. While doing that, my own home computer was running a very much simpler chip (an early ARM). You can see the barrel shifter and ALU in the ARM1 in the link below. Mine was running an ARM 2 initially (and latterly was upgraded to an ARM 3), but very similar to this design. https://hackaday.com/2015/12/21/reverse-engineering-the-arm-alu/

@kenshirriff The first microprocessor I worked with that had a barrel shifter was at my first job at #Tektronix - port the #Magnolia #workstation (#Motorola 68000) V7 #UNIX first to 68010 but a few years later to 68EC040. The '40 had the barrel shifter. Man, rotations were zippy. Later for #WindRiver I ported #VxWorks RTOS to #MIPS and #PowerPC chips. PPC bitfield ops were efficiently done with a bunch of rotate-with-mask instructions - IMO the pinnacle, the kitchen sink of embedded instructions.

The 386 processor (1985) was pivotal for Intel, moving the x86 architecture to 32 bits and moving the design to CMOS. Everything in the processor is controlled by the clock. Some tricky circuitry on the chip (red) generates the clock; let's take a look inside the die... 1/11

Early microprocessors such as the 8080 used a two-phase clock: when one phase is high, the other is low, with a gap between the high part of each phase. Processing moves step-by-step through the circuitry, one phase at a time. 2/11

@kenshirriff I found your account through your blog post and I'm very happy to have found it. Great write-up, super interesting. Thank you for taking the time to write this!

The Intel 386 processor (1985) was a key member of the x86 line, moving to 32 bits. It has a bunch of on-chip registers, implemented with compact, highly optimized circuitry. Let's look at the circuit, called T8, that implements some of these registers. 1/11

The registers are implemented as static RAM, using a circuit called T8 because it has 8 transistors. The basic concept is to put two inverters in a loop, so they can stay in either the 0 or the 1 state. Each inverter provides the input to the other. 2/11

@kenshirriff I'm old and, looking back, I think this was the biggest CPU news in my whole lifetime. Developers' productivity got maybe its biggest boost ever from not having to wear 16-bit shackles.

Introduced in 1973, Ethernet is the best way to wire your computer to a network. But it was very expensive, costing thousands of dollars for an interface board. Intel's 82586 chip (1982) simplified the interface, dropping network prices to "just" $1000. Let's look inside... 1/14

At the time, Ethernet chips usually just translated data packets into a stream of network bits, computing checksums. Intel's chip went much further: it included a coprocessor to move data between the network and memory independent of the main processor (i.e. DMA). 2/14

The Intel 386 processor (1985) was the first 32-bit processor in the x86 line. Let's take a close look at the processor dies, seeing how Intel shrunk the chip, created new versions, and why the 386 SL jumped from 285,000 transistors to 855,000 transistors. 1/9
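A minimal sketch of the two-phase, non-overlapping clock described in the 8080 post above: phase 1 and phase 2 are never high at the same time, with a dead gap between them. The timings here are arbitrary, chosen just for illustration.

```python
# Generate samples of a non-overlapping two-phase clock.

def two_phase_clock(cycles, high=3, gap=1):
    """Yield (phi1, phi2) samples; the phases never overlap."""
    for _ in range(cycles):
        for _ in range(high):
            yield (1, 0)   # phase 1 high, phase 2 low
        for _ in range(gap):
            yield (0, 0)   # gap: both phases low
        for _ in range(high):
            yield (0, 1)   # phase 2 high, phase 1 low
        for _ in range(gap):
            yield (0, 0)   # gap before the next cycle

assert all(not (p1 and p2) for p1, p2 in two_phase_clock(4))
```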
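And a sketch of the bistable core of the T8 cell described above: two inverters in a loop, so whatever value the loop holds, each inverter's output re-drives the other's input and the state is self-reinforcing. This is a pure logic illustration; the real cell is 8 transistors, with pass transistors connecting it to the bitlines for reads and writes.

```python
# Two cross-coupled inverters: the storage element of a static RAM cell.

class InverterLoop:
    def __init__(self, bit):
        self.q = bit           # one inverter's output
        self.q_bar = 1 - bit   # the other inverter's output

    def settle(self):
        # Each inverter inverts the other's output; a stored value is stable.
        self.q, self.q_bar = 1 - self.q_bar, 1 - self.q

    def write(self, bit):
        # Writing overpowers the loop, forcing it into the new state.
        self.q, self.q_bar = bit, 1 - bit

cell = InverterLoop(0)
cell.settle()
assert cell.q == 0        # holds its value indefinitely
cell.write(1)
cell.settle()
assert cell.q == 1        # holds the new value
```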
@kenshirriff One aerospace project I worked on used obsolete x386 uPs that were running out of stock. Rather than upgrade and recertify, they bought the mask(s) and had enough fabricated to support it to EOL.

@kenshirriff I haven't skipped ahead but I'm guessing cache and Service Mode added transistors. #i386

Intel's Israel site was opened in 1974, Intel's first design and development center outside the US. In this thread, I'll look at some of the important vintage chips designed at Intel Israel, such as the 8088 processor. 1/9

The 8088 processor was designed at Intel Israel. This is indicated on the die as "i/IL". The Intel 8088 was selected for the IBM PC (1981), ensuring the success of the x86 architecture. The 8088 is a 16-bit chip internally but uses an 8-bit bus to reduce system cost. The 8088 is mostly the same as the 8086 but the bus control is redesigned, the prefetch queue is smaller, and there are numerous changes throughout the chip. 2/9

How did fighter planes in the 1950s perform calculations before compact digital computers were available? With the Bendix Central Air Data Computer! This electromechanical analog computer used gears and cams to compute "air data" for fighter planes such as the F-101. 1/13
@kenshirriff One of these may have fallen out of a plane off the coast of the Greek island Antikythera, landed on a shipwreck, and gotten all rusty.

@kenshirriff This is sooo mind-boggling, but so ingenious. Thank you for sharing this.

Flip-flops are a key component of a microprocessor, holding a 0 or a 1 value. When the "clock" signal triggers, the flip-flop loads a new value. I examined the silicon die of the Intel 8086 processor (1978) and found 184 flip-flops (colored dots). 1/7

Flip-flops have many roles in the 8086. They hold the instruction and various fields. They store condition flags such as carry. They hold the microcode address and the 21-bit micro-instruction. They implement state machines for reads and writes. They manage the prefetch queue.

The Bryant 4000C disk drive (1965) was absurdly large. The drive was 52" tall, weighed 3551 pounds, and held 205 megabytes. Now you can get a 1TB flash drive: 5000 times the storage for 1/50,000 the weight.
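A behavioral sketch of what the flip-flop post above describes: an edge-triggered D flip-flop captures its input only on the clock's rising edge and holds its value otherwise. This models the behavior only, not the 8086's transistor-level circuit.

```python
# An edge-triggered D flip-flop: loads on the rising clock edge, else holds.

class DFlipFlop:
    def __init__(self):
        self.q = 0            # stored value
        self._last_clk = 0    # previous clock level, for edge detection

    def tick(self, clk, d):
        if clk == 1 and self._last_clk == 0:   # rising edge: load new value
            self.q = d
        self._last_clk = clk                   # otherwise: hold
        return self.q

ff = DFlipFlop()
ff.tick(0, 1)             # clock low: input ignored
assert ff.q == 0
ff.tick(1, 1)             # rising edge: captures the 1
assert ff.q == 1
ff.tick(1, 0)             # clock still high: holds
assert ff.q == 1
```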
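Sanity-checking the ratios in the Bryant post above; the flash-drive weight here is my assumption (roughly 0.07 lb / 32 g for a typical drive), not a quoted figure.

```python
# Rough check of the Bryant 4000C vs. 1 TB flash drive comparison.
storage_ratio = 1e12 / 205e6       # 1 TB vs. 205 MB
print(round(storage_ratio))        # ~4878, i.e. roughly 5000x the storage

weight_ratio = 3551 / 0.07         # 3551 lb drive vs. ~0.07 lb flash drive
print(round(weight_ratio))         # ~50,700, i.e. roughly 1/50,000 the weight
```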
@kenshirriff Big, heavy, slow and power hungry, but your data was safer than on a modern SanDisk SSD... 😉

@kenshirriff And it was "as low as" 0.06 cents per byte (0.58 cents in today's dollars) and had an MTBF of 2,000 to 3,000 hours. Smaller and less resource-consuming is better?

The ancestor of the 8086 processor is the Datapoint 2200, a desktop minicomputer used as an intelligent terminal. Made before the microprocessor, the Datapoint built its processor from a board of chips. The Intel 8008 cloned the Datapoint, the first step toward the x86 architecture. 🧵
A simple repair at https://twitter.com/ComputerHistory. The IBM 729 tape drive provided bulk storage on a 2400 ft reel of magnetic tape. But one of the lights on the control panel burned out. It uses an obscure 55-volt telco bulb in an unusual package.
@kenshirriff OMG! 729 tape drives! We had 4 of them on our DEC-10 at U Illinois Physics when I was there. I'm amazed that they're still around 50 years later 😮

@kenshirriff I loaded tapes on something like this, back in the 80s. It was fun watching them thread the tape across to the other reel by themselves, and watching the two loops in the vertical channels when it was spinning madly...

@kenshirriff I remember these lamps from my childhood, but I don't remember the device. Certainly no computer or telephone.

I found a hidden name in the Intel 8088 processor. The 8088 was a derivative of the 8086 processor, introduced in 1979 and best known as the processor in the IBM PC. I dissolved the chip's metal layer and found "רפי", the name in Hebrew of Rafi Retter, the chip's engineer.
The Intel 8086 microprocessor was introduced in 1978 and led to the x86 architecture in use today. One of its obscure features is the "bus hold", allowing another device to temporarily take over communication with memory. This circuitry is in the upper left corner of the chip. 🧵

During a bus hold, the 8086 processor stops using the memory bus and electrically disconnects from the bus by going into "tri-state" mode. This lets an external device take control of the bus and access memory directly. This can be used for high-speed input/output (DMA).

The Intel 8086 processor (1978) started the x86 architecture still used today. This chip only had 40 pins, so the address pins needed to be reused for data or status. (The pins are connected to the pads around the edge.) It took some tricky circuitry to make this work. 🧵

The output pins need high current, so each pin has multiple large transistors in parallel. This diagram shows how long, parallel transistors are created from polysilicon and silicon, and then wired together by the metal layer on top.

The Intel i960 was the most popular RISC chip of the mid-1990s. This powerful 32 (or 33!) bit processor was used in embedded applications including the F-22 fighter plane. The i960CA was the world's first superscalar microprocessor, running two instructions every clock cycle. 🧵

The i960's roots are the iAPX 432, a "micro-mainframe" processor that Intel started in 1976 to revolutionize microprocessors with 32-bit mainframe power in an object-oriented chip. Spoiler: it did not. Far behind schedule, Intel introduced the 8086 processor as a temporary stopgap.

Fighter planes in the 1950s used the Bendix Central Air Data Computer to determine air speed, Mach number, altitude, and so forth from pressure. It is electromechanical, using gears and synchros for its computations. Amazingly, it is modular and can be easily disassembled.

We separated the top layer from the rest for testing. The "interface" between the layers is two gears and an electrical connection. The electronic servo amplifier blocks come off too.

We powered up the block that converts a temperature reading to a gear rotation, but nothing happened. So I'm reverse-engineering the circuitry to figure out what's gone wrong. Maybe an old capacitor, since some of the old capacitors have fungus or corrosion.

@kenshirriff I imagine being a caveman and finding this. Sophisticated alien technology from that point of view.

The Intel 8086 processor (1978) started the PC era and most desktop computers still use the x86 architecture. Its instruction set is complicated, with a variety of formats. This made decoding each instruction a challenge. The Group Decode ROM was a key part. 🧵

Most 8086 instructions are implemented in microcode, a level of instructions below the familiar machine instructions. But before microcode can run, something called the Group Decode ROM categorizes instructions according to their structure, shown as colors below.

Fond memories... When Al Gore invented the internet? I was there. Consumed the MS-DOS 1.1 desk reference cover to cover. Boot stacks for the university coax Ethernet. Ugh. I blew tuition $ on an 8087 (had to get the math chip.) Fortran! The code and prob data were uploaded by 56kbps modem to the CP6 for calculation. Sometimes those jobs took the whole weekend. Switched to C. Ran a grad student collaborative BBS. 4 telephone lines served about 70 peers.
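A toy model of the tri-state bus sharing described in the bus-hold posts above: each device either drives a value onto the shared bus or presents high impedance (modeled as None), and only one driver may be active at a time. This illustrates the concept only, not the 8086's actual HOLD/HLDA protocol.

```python
# Tri-state bus sketch: devices drive the bus or disconnect from it.

class BusDevice:
    def __init__(self, name):
        self.name = name
        self.output = None   # None = tri-stated (electrically disconnected)

    def drive(self, value):
        self.output = value

    def release(self):
        self.output = None   # let go of the bus

def bus_value(devices):
    drivers = [d for d in devices if d.output is not None]
    assert len(drivers) <= 1, "bus contention!"
    return drivers[0].output if drivers else None

cpu, dma = BusDevice("8086"), BusDevice("DMA controller")
cpu.drive(0x1234)                       # CPU owns the bus
assert bus_value([cpu, dma]) == 0x1234
cpu.release()                           # bus hold: CPU tri-states its pins
dma.drive(0xABCD)                       # external device takes over
assert bus_value([cpu, dma]) == 0xABCD
```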
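And a sketch of the Group Decode ROM's job as described a couple of posts up: before microcode runs, the first instruction byte is classified into structural groups. The categories and opcode ranges below are illustrative examples of the idea, not the ROM's actual output signals.

```python
# Classify an 8086 opcode byte into illustrative structural groups.

def group_decode(opcode):
    groups = set()
    if opcode in (0x26, 0x2E, 0x36, 0x3E, 0xF0, 0xF2, 0xF3):
        groups.add("prefix")                     # segment override, LOCK, REP
    if 0x00 <= opcode <= 0x3F and opcode & 0x07 < 4:
        groups.add("has ModR/M byte")            # ALU ops like ADD, SUB, CMP
    if 0x40 <= opcode <= 0x5F:
        groups.add("one-byte, register operand") # INC/DEC/PUSH/POP reg
    return groups

assert "prefix" in group_decode(0xF3)            # REP prefix
assert "has ModR/M byte" in group_decode(0x00)   # ADD r/m8, r8
```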
I was puzzled by the mysterious yellow stripes on the chip. Most of the circuitry was standard NMOS: the purple-gray silicon with reddish polysilicon on top providing wiring and creating the transistor gates. But the yellow stripes were something new. 2/11