Tag: 4004

Related posts
  • Popularity 12
    2012-2-1 19:05
    1553 reads
    0 comments
    The first bit of strangeness was the package, an enormous 50-pin DIP (old-timers will remember the first 68000, which came in an equally huge 64-pin DIP). That allowed for separate instruction and data buses, and some claim that this was the first micro with a Harvard architecture. Instructions were 16 bits, with 13 address bits, and the data bus was just a byte wide.
    It gets odder. The data bus was called the "Interface Vector bus," or IV for short, and bit 7 was the LSB! IV0 to IV7 were multiplexed with both address and data, suggesting that 256 bytes of data were supported. But no: Left Bank and Right Bank signals selected one of two data sources, doubling the space. The nomenclature reminds me of Paris. The data space included both RAM and I/O, and the latter was memory mapped.
    The 8X300 contained a variety of registers, numbered R1 to R6 and R11. What about the missing ones? The numbers were the octal address of each register, and R11, for unknown reasons, was at octal 11. R10, or at least the "register" at octal 10, was a one-bit overflow flag, the only conditional bit. Only the ADD instruction could set or clear that bit. And R7 didn't really exist; it was a trick used to access the IV bus.
    To read or write in data space or a register, one, well, the process is rather mind-boggling; there isn't enough room to describe it here. But the MOVE, ADD, AND, and XOR instructions included fields to specify how many bits to rotate an argument. Except that sometimes those bits selected a field length. There were four other instructions, including XEC, which executed the instruction pointed to by a bit of convoluted math. The PC pointed to the location after the XEC, unless the instruction executed was a jump. Strange, but it did make handling jump tables easy.
    Note what's missing: there were no interrupts. No stack, no calls. And no subtract. But that was easy enough to simulate by taking the one's complement of the subtrahend, adding one to make the two's complement, and then adding the minuend; the idea is sketched in code below. Each subtract consumed five instructions.
    And many more! A number of microprocessors had quirky features that could be a curse or a blessing. For example, the 4004 had a hardware stack on board the chip. It was three levels deep, which doesn't permit much subroutine nesting. Intel improved this on the 8008, making the stack seven levels deep. Even in those early days we tried to practice modular programming, so it was common to blow the stack via too many calls. The PIC1650 is often credited as being the first microcontroller (although I suspect the Scottish PICO1 probably deserves that honor). That part's stack was two deep, mirroring the similar PIC10 parts still selling today.
    Interrupt cycles on Intel's 8080 were exactly like any fetch, except the CPU asserted an interrupt-acknowledge pin. The hardware was then supposed to jam an instruction onto the bus to vector to a handler. Eight one-byte calls (RST0-7) were provided for this purpose, but there were plenty of designs that used crafty logic to jam a three-byte call. Where I worked, we had severe time constraints, and none of those approaches were fast enough due to the stack pushes involved. Instead, when awaiting a particular event, we'd branch to the handler and execute a halt instruction. The interrupt hardware would jam a NOP, which was the fastest instruction the CPU could execute, and the code following the halt, the interrupt handler, would resume.
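    To make that missing-subtract trick concrete, here is a minimal C sketch of the complement-and-add sequence described above. It is only an illustration of the arithmetic, not 8X300 code, and the function name is mine:

        #include <stdint.h>
        #include <stdio.h>

        /* Simulate minuend - subtrahend on a machine with no subtract:
           take the one's complement of the subtrahend, add one to form
           the two's complement, then add the minuend. All math is modulo 256. */
        static uint8_t sub_via_add(uint8_t minuend, uint8_t subtrahend)
        {
            uint8_t ones = (uint8_t)~subtrahend;     /* one's complement         */
            uint8_t twos = (uint8_t)(ones + 1u);     /* now the two's complement */
            return (uint8_t)(minuend + twos);        /* add the minuend          */
        }

        int main(void)
        {
            printf("%u\n", sub_via_add(200u, 55u));  /* prints 145 */
            return 0;
        }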
    For a while bit-slice processors were the rage for high-performance systems. The most popular was AMD's 2900 series. These 4-bit slices could be cascaded to build a system of any word length. Each chip had an ALU (arithmetic logic unit), a decoder for the eight instructions, and 16 nibbles of dual-ported RAM that acted as registers. Things like a program counter and the other components of a computer had to be added externally. Intel was in that game, too, with its 3000 series of 2-bit slices. Needless to say, a processor with a wide word consumed a lot of parts.
    Then there was Fairchild's 8-bit F8, which had no address bus. Designers used the CPU chip in conjunction with a number of other unique Fairchild devices, and all worked in lockstep. Each device included data pointers, which changed values by watching control signals emitted from the CPU. Even the memory controllers had to know when a jump or other instruction caused a change in control flow.
    The F8 did have a whopping 64 registers. Sixteen could be addressed directly from an instruction; the rest were accessed via a pointer register. Some instructions could cause that register to increment or decrement, making it easy to run through tables (a small sketch of the idea appears below). This "autoincrement" idea harked back to the PDP-11. The F8's datasheet can be found at http://datasheets.chipdb.org/Fairchild/F8/fairchild-3850.pdf . Strangely, it uses hex, octal, and decimal rather interchangeably. In the early micro days, octal was often used instead of hex, as a lot of developers had problems grokking the notion of letters as numbers.
    By the time the microprocessor became a successful product, computer instruction set architectures were well understood and often beautifully orthogonal. But designers did awful things to cram a CPU into a chip and often had to take what today seem like strange shortcuts. In other cases, the new world of micros caused a flurry of creativity that resulted in some marvelously quirky, and sometimes cool, features. Many of those decisions live on today, frozen in time and silicon by the weight of legacy compatibility.
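    As an aside on that F8 pointer-register trick: in C terms the effect is essentially a post-incremented index walking a table. The sketch below is a loose, hypothetical model (the array, names, and sizes are mine, and the real part only auto-increments within part of the pointer), meant only to show why table walks got easier:

        #include <stdint.h>
        #include <stdio.h>

        /* Loose model: a 64-entry scratchpad and a pointer register that
           bumps itself after every indirect access, so a loop needs no
           separate address arithmetic. */
        static uint8_t scratchpad[64];
        static uint8_t pointer_reg;                    /* hypothetical ISAR-like register */

        static uint8_t load_autoincrement(void)
        {
            return scratchpad[pointer_reg++ & 0x3Fu];  /* read, then advance the pointer */
        }

        int main(void)
        {
            for (unsigned i = 0; i < 64; i++)
                scratchpad[i] = (uint8_t)i;            /* fill a table */

            pointer_reg = 16;                          /* start of the indirectly addressed registers */
            unsigned sum = 0;
            for (unsigned i = 0; i < 8; i++)
                sum += load_autoincrement();           /* walk eight entries */

            printf("sum = %u\n", sum);                 /* 16+17+...+23 = 156 */
            return 0;
        }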
  • Popularity 20
    2011-8-31 22:29
    1846 reads
    0 comments
    Bill Gates once said: "We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next ten."
    If one human generation represents 20 years, as many sources suggest, two entire generations have been born into a world that has always had microprocessors. Two generations never knew a world where computers were rare and so expensive that only large corporations or governments owned them. These same billions of people have no experience of a world where the fabric of electronics was terribly expensive and bulky, where a hand-held device could do little more than tune in AM radio stations.
    In November 1971, 40 years ago, Intel placed an ad in Electronic News introducing the 4004, the first microprocessor. "A micro-programmable computer on a chip!" the headline shouted. At the time, in my first year of college, I was fortunate to snag a job as an electronics technician. None of the engineers I worked with believed the hype. Intel's best effort at the time had resulted in the 1103 DRAM, which stored just 1 kilobit of data. The leap to a computer on a chip seemed impossible. And so it turned out, as the 4004 needed a variety of extra components before it could actually do anything. But the 4004 heralded a new day in both computers and electronics.
    The 4004's legacy wasn't that of a single-chip computer; that came within a few years. Rather, it spawned the age of ubiquitous and cheap computing. Yes, the era of the personal computer came a decade later and entirely as a result of the microprocessor, but the 4004 immediately ushered in the age of embedded systems. In the decade between the micro's invention and the first IBM PC, thousands, perhaps millions, of products hit the market with embedded intelligence. Forty years ago few people had actually seen a computer; today, no one can see one, to a first approximation, as the devices have become so small.
    Embedded Systems Design magazine and the entire embedded systems industry that employs so many of us couldn't exist without the microprocessor. In the four decades since its birth, everything we know about electronics has changed. And so, for this and the next three issues of this magazine, I will devote this column to a look back at the story of this astonishing invention.
    The history of the micro is really the story of electronics, which is the use of active elements (such as transistors, tubes, and diodes) to transform signals. And the microcomputer is all about using massive quantities of active elements. But electrical devices—even radios and TV—existed long before electronics.
    Mother Nature was the original progenitor of electrical systems. Lightning is merely a return path in a circuit composed of clouds and the atmosphere. Some think that bit of natural wiring may have created life on this planet: Miller and Urey created amino acids in 1952 using simulated high-energy discharges. But it took four billion years after Earth formed before Homo sapiens arrived, and then a little longer until Ben Franklin and others in France found, in 1752, that lightning and sparks are the same stuff. Hundreds of years later kids repeat this fundamental experiment when they shuffle across a carpet and zap their unsuspecting friends and parents (the latter usually holding something expensive and fragile).
    Other natural circuits include the electrocytes found in electric eels. Somewhat battery-like, they're composed of thousands of individual "cells," each of which produces 0.15V.
    It's striking how the word "cell" is shared by biology and electronics, unified with particular emphasis in the electrocyte. Alessandro Volta was probably the first to understand that these organic circuits used electricity. Others, notably Luigi Galvani (after whom the galvanic cell is named), mistakenly thought some sort of biological fluid was involved. Volta produced the first artificial battery, although some scholars think the Persians may have invented one thousands of years earlier.
    About the same time, others had built Leyden jars—early capacitors. A Leyden jar is a glass bottle with foil on the surface and an inner rod. I suspect it wasn't long before natural philosophers (proto-scientists) learned to charge the jar and zap their kids. Polymath Ben Franklin, before he got busy with forming a new country and all that, wired jars in series and called the result a "battery," from the military term; this was the first use of that word in the electrical arena.
    Many others contributed to the understanding of the strange effects of electricity. Joseph Henry showed that wire coiled tightly around an iron core greatly improved the electromagnet. That required insulated wire long before Digikey existed, so he reputedly wrapped silk ripped from his long-suffering wife's wedding dress around the bare copper. This led directly to the invention of the telegraph.
    Wives weren't the only ones to suffer in the long quest to understand electricity. In 1746 Jean-Antoine Nollet wired 200 monks in a mile-long circle and zapped them with a battery of Leyden jars. One can only imagine the reaction of the circuit of clerics, but their simultaneous jerking and no doubt not-terribly-pious exclamations demonstrated that electricity moved very quickly indeed.