Tag: Neumann

Related Posts
  • Popularity: 22
    2014-1-29 16:57
    1845 reads
    0 comments
    What would Hiram Percy Maxim make of the ham scene if he were alive today? This is what Kay Craigie considers in the January issue of QST, a magazine (still in print!) for ham radio enthusiasts. Maxim was the first president of the American Radio Relay League when that organisation was founded in 1914. Radios of that era were spark gap transmitters that needed no active elements, and were extraordinarily primitive by today's standards.

    That started me wondering how one of the computer industry's early luminaries, say, John von Neumann, would view today's embedded space. Von Neumann (called "Johnny" in his day) died in 1957, when the vacuum tube still reigned nearly supreme.

    His first thought might be "Where's the computer?" He helped design the biggest computer of all time, the SAGE, which occupied a half acre of floor space and weighed 250 tons. He'd be surprised by devices like the PIC10, which is available in a six-pin 2 x 3 mm DFN package. He could hold dozens of even the biggest processors around, like Intel's high-end devices, in one hand.

    [Figure: A couple of Pentiums with a 1 GB disc.]

    The RAMAC, the hard disc of von Neumann's era, held 5 MB and weighed more than a ton. If he looked at a DSP he might ask "Harvard architecture? What happened?" We'd have to reassure him that, yes, the von Neumann architecture still dominates, but isn't always the most appropriate choice.

    I'm sure he'd want a computer of his own, and would immediately start writing a grant application to get the government funds needed. After all, that SAGE system cost about $90 billion in today's dollars (a figure that includes 24 two-computer systems). He'd probably be too stunned to crack one of his famous ribald jokes when told that a 32-bit MCU could be had for half a buck or less.

    Von Neumann might squint and wonder how many tubes were packed into a modern processor. In his era nearly all computers were based on tubes, thousands of them, which is why machines were so huge.

    [Figure: A triode and a 32-bit computer.]

    But Johnny was surely aware of the transistor, and would have realised that semiconductors were much smaller than tubes. He'd probably wonder, though, what clever design had reduced the needed transistor count so much that a computer could exist in a chip. Imagine his jaw dropping when told that, no, actually, some of these parts contain billions of active elements.

    We complain when a machine crashes; we're surprised when one fails. It's common for a PC to run for years between hardware failures, which would probably surprise von Neumann. The IAS computer, built under his supervision, reportedly had an MTBF of 10 minutes, and there was a raging debate about just how much useful computation could be done within this limit. (Assuming random failures, a 30-minute run on a machine with a 10-minute MTBF has only about an e^-3, or 5%, chance of finishing cleanly.)

    He'd also be surprised at our I/O; the notion of a GUI just didn't exist in his lifetime. His 1953 Johnniac initially had just a card reader and punch... plus a loudspeaker to give a sense of what was going on. It wasn't long before card decks were produced to play popular tunes. The stereo and full-motion video we take for granted would probably astound him. (He objected to the computer's name, but the lead engineer let him know there were plenty of other Johns in the world; what made von Neumann think it was named after him?)

    At least one thing would be familiar: his computers dissipated heat. A lot of it. That hasn't changed at the high end of computing today.

    [Figure: There may not be many tubes in that processor, but it sure does get hot.]
  • Popularity: 24
    2014-1-16 18:00
    1732 reads
    0 comments
    John von Neumann's 100th birthday was December 28, 2003. Since I missed writing about that anniversary, here's to the 110 years since that prodigy entered the world.

    A lot of brilliant people contributed to the birth and explosive growth of computers, but few were such polymaths. Von Neumann contributed to as many fields as he brushed against. Born with a silver spoon in Budapest when that city was wealthy and growing fast, he quickly made a name for himself in mathematics, publishing important papers at a prodigious rate. By the early '30s he was here in the USA, working at the Institute for Advanced Study in Princeton alongside Einstein and other luminaries.

    He was prescient about most things, including war, predicting in 1935 a major European conflict within a decade, and with it a genocide of the European Jews. He also said that if England were imperilled the USA would come to its aid. When the war did come he was an enthusiastic supporter of US involvement and had no doubt the Allies would win.

    During the war von Neumann worked on ballistics and the theory of explosions and shockwaves, as early as 1941 exploring the nature of shaped charges. Ballistics would lead to his involvement in computers, and his explosion research proved important to the soon-to-come Manhattan Project. It seems that aiming a big gun required executing some 750 multiplications to compensate for all of the variables involved. The Ballistics Research Laboratory in Aberdeen, MD had been trying to simplify this since the end of WWI. Johnny, as he was called, served as a consultant to the Laboratory. But his expertise was widely known and he worked on these problems for other organisations. The Navy eventually nicked him; he said he preferred admirals to generals because the latter drank iced tea at lunch, while Navy brass, when ashore, went for the hard liquor, and Johnny prided himself on his ability to imbibe.

    By 1943 he had travelled to England and met Turing, and was thinking about mechanical or electronic solutions for the physics of explosions. Many of his papers of the time referred to "kilotonnage" and "computers." After September 1943, 30% of his time was spent at Los Alamos. He didn't invent the explosive lens needed to detonate a plutonium weapon, but made important improvements to it. Now that the press is full of stories of uranium enrichment, it's worth distinguishing the two types of bombs: a uranium device is easy to make (it's basically a cannon), but purifying the uranium is extremely difficult. Plutonium is much easier to get, but must be imploded with microsecond accuracy, so those gadgets use an explosive lens to shape the charge, rapidly compressing the Pu into a perfect sphere. This turned out to be much harder than anticipated.

    On his one foray into economics he casually cranked out a 640-page book about game theory which is still considered important, as it gave the subject a mathematical foundation for the first time.

    According to John von Neumann by Norman Macrae, between August 21 and September 2, 1944, Johnny hit on the idea of a stored program computer after seeing ENIAC. The history is pretty muddied. It is known he wrote the "First Draft of a Report on the EDVAC" in March 1945. EDVAC was proposed by ENIAC's designers (John Mauchly and J. Presper Eckert) before ENIAC was operational. It's not clear if the report was ever finished, or if he wanted it circulated. However, that paper documented the stored program concept and resulted in his getting credit for the idea.
    Eckert and Mauchly felt ripped off, creating a bitterness (on their part) that never went away. But today we don't talk about the Eckert-Mauchly architecture vs. a Harvard architecture.

    Von Neumann was certainly one of the most visionary early computer proponents, and he wrote with an eloquence and persuasiveness that helped boot the industry. He did start a computer project at the Institute for Advanced Study, and in the contract with the Army insisted that periodic public reports be issued, with the authors assigning any patentable ideas to the public domain. It's hard to imagine that happening today.

    Sadly, he died young, at 53, in 1957, of cancer, which some feel was a result of his atomic work, especially observing the Bikini test in 1946.

    He had a vast repertoire of off-colour jokes he used to defuse hostile confrontations, and was extremely engaging, the opposite of the stereotype of the introverted scientist. Always gracious, never one to offend, he frequently hosted huge parties. Unlike many of his colleagues, who were overtly or partially Marxist, Johnny was aghast at communism and was a big proponent of atomic weapons and a strong defence.

    Interestingly, on-line searches for his work "The Computer and the Brain" mostly get hits to ebooks. No doubt he'd be pleased. "John von Neumann Interview", a grainy, undated black-and-white video on YouTube, shows him talking about the need for more science and technology education in the secondary schools, at least 60 years before the rest of us started saying the same thing. His Hungarian accent is strong, but his command of English is excellent.

    Some of the major figures in science sound like dull dinner companions. Not Johnny; I bet a couple of beers with him would have been the experience of a lifetime.
  • Popularity: 14
    2012-12-6 18:24
    1589 reads
    0 comments
    A few weeks ago I had a conversation with a software developer about the proper education of programmers, prompted by several recent articles on the topic published on Embedded.com: "Students need to learn multiple programming languages," by Greg Gicca, and "The education of embedded software engineers," by Robert Dewar.

    In the course of our conversation, he referred to an article in the November 2012 issue of IEEE Computer magazine, "Debugging on the Shoulders of Giants," as an example of how computer programming should be taught: not driven by the specifics of learning a particular programming language, but with the aim of teaching general principles that can be applied in any computing environment, where no one language is viewed as better or worse than another, but simply as a tool created to accomplish specific tasks.

    Much of the IEEE article was about the effort at the U.S. Air Force Academy to duplicate the original IAS computer (a.k.a. the von Neumann architecture) designed, and programmed, 70 years ago by a team of scientists and engineers led by John von Neumann at the Institute for Advanced Study in Princeton, N.J. The USAF project was part of an effort to create materials and tools for a course at the Academy on "Great Ideas in Computing." This included IASSim, an emulator for use by college freshmen to help them program in IAS assembly language.

    In the process of creating the course building blocks, the authors, Barry Fagin and Dale Skrien, had to go back to the original documentation for the IAS computer, convert it into machine-readable form for the emulator, recreate the original programs written for the IAS computer, and then execute them. The article chronicles the steps they went through, the programs they ran, and what they learned about how von Neumann and his IAS project team created the architecture, the instruction set they designed, and the programs they wanted to run. In the process they discovered how the original team stumbled upon many of the tools, procedures and methodologies that are commonplace now, or, in their absence, how they debugged the code manually, the errors they made, and why and how they found them.

    Of all the many scientists and engineers of the last hundred years, John von Neumann is the one whom I hold in the most awe, not just for his sheer intellect, but for the many areas of science and technology he was interested in and to which he made significant contributions: not just computer science, but mathematics, quantum mechanics, fluid dynamics, economics, game theory, genetics and the structure of DNA, self-replicating machines, and statistics. Beyond the breadth of his interests and the influence of his ideas was his ability to move back and forth between the abstract arena of scientific investigation and the hands-on aspects of applying those ideas in the world of engineering.

    That facility with both theory and application was brought home to me as I read about the problems von Neumann and the Princeton team faced and how they solved them: I/O limitations, memory and instruction format, number representation, self-modifying code, the pros and cons of formal methods, instruction set design, and orthogonality. (The sketch after this paragraph illustrates the stored-program idea at the heart of it all, the property that makes self-modifying code possible.)
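    To make the stored-program idea concrete: in a von Neumann machine, instructions and data live in one memory, so a running program can rewrite its own instructions. Below is a toy machine in C that sums an array by patching the address field of its own ADD instruction, the way early IAS-style programs stepped through arrays before index registers existed. To be clear, the opcodes, word format, and memory size here are invented for illustration; this is a sketch of the concept, not the real IAS instruction set.

        #include <stdio.h>

        /* Toy stored-program machine: code and data share one memory of
           plain integers, and an instruction is just opcode*100 + address.
           The encoding is invented for illustration only. */

        enum { HALT = 0, LOAD, ADD, STORE, JUMP, BRZ, SUB };

        int main(void)
        {
            int mem[100] = {
                /* Code: sum a 4-element array by bumping the address field
                   of the ADD instruction in cell 1 on every pass. */
                [0]  = LOAD  * 100 + 60,   /* acc = sum                       */
                [1]  = ADD   * 100 + 50,   /* acc += array element (patched!) */
                [2]  = STORE * 100 + 60,   /* sum = acc                       */
                [3]  = LOAD  * 100 + 1,    /* fetch the ADD instruction       */
                [4]  = ADD   * 100 + 61,   /* add 1: bumps its address field  */
                [5]  = STORE * 100 + 1,    /* write it back over the code     */
                [6]  = LOAD  * 100 + 62,   /* acc = loop counter              */
                [7]  = SUB   * 100 + 61,   /* counter - 1                     */
                [8]  = STORE * 100 + 62,
                [9]  = BRZ   * 100 + 11,   /* counter == 0? then halt         */
                [10] = JUMP  * 100 + 0,    /* else do the next element        */
                [11] = HALT  * 100 + 0,
                /* Data */
                [50] = 3, [51] = 1, [52] = 4, [53] = 1,   /* the array  */
                [60] = 0,                                 /* the sum    */
                [61] = 1,                                 /* constant 1 */
                [62] = 4,                                 /* loop count */
            };

            int acc = 0, pc = 0;
            for (;;) {
                int instr = mem[pc++], op = instr / 100, addr = instr % 100;
                switch (op) {
                case LOAD:  acc = mem[addr];         break;
                case ADD:   acc += mem[addr];        break;
                case SUB:   acc -= mem[addr];        break;
                case STORE: mem[addr] = acc;         break;
                case JUMP:  pc = addr;               break;
                case BRZ:   if (acc == 0) pc = addr; break;
                case HALT:  printf("sum = %d\n", mem[60]);   /* prints 9 */
                            return 0;
                }
            }
        }

    Run it and the ADD in cell 1 ends up pointing four cells past where it started, having read each element exactly once. That address-patching trick was everyday practice until index registers made it unnecessary.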
    But what most impressed me were the efforts he and his team made at debugging the code they developed, despite the problems they faced and the fact that they were creating the tools and procedures they needed on the fly, as situations developed, all without the tools programmers have at hand today.

    Von Neumann and his team wrote 15 programming problems to run on the IAS computer, ranging from relatively simple ones involving algebraic expressions, parameters and subroutining, iteration, BCD-to-binary conversion, sorting and merging lists, and double-precision sums, to more complex ones such as Newton's method for calculating square roots and Lagrangian interpolation. (A minimal modern sketch of that square-root iteration appears at the end of this post.)

    Given that they were breaking new ground, and given some of the tough mathematical problems they attempted, the number of errors that Fagin and Skrien found was surprisingly low: seven programs were error-free, and in several others the errors were typographical in nature. The errors found in a few of them owed as much to the mathematics involved, and how to represent it in the code, as to actual coding errors. Because the target machine implementing the IAS architecture was not built until the early 1950s, some five years later, the authors note that "the relatively small number of errors in the code is quite remarkable."

    Aside from giving me yet another reason to admire von Neumann, what impressed me as I read the article was how closely the USAF "Great Ideas in Computing" course seems to reflect the approach espoused by Gicca in his article: "Understanding just a single language promotes solutions that only approach a problem from a single perspective," he writes. "Knowing multiple languages allows the problem to be looked at from a variety of perspectives so that multiple solutions can be compared and the most natural solution for the problem can be selected."

    The closing paragraph of the IEEE Computer article is also worth thinking about: "Our exploration into the IAS machine makes us wonder if some sort of exposure to older machines makes sense for future computer designers. After all, those who do not learn from computer history are condemned to repeat it."
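    Of the problems on that list, Newton's method for square roots translates directly into a few lines of modern code. The following is a minimal C sketch of the same iteration, x = (x + a/x)/2; it is a present-day illustration of the algorithm, of course, not a reconstruction of the IAS program.

        #include <stdio.h>

        /* Newton's method for the square root of a > 0: repeatedly replace
           x with the average of x and a/x. Each step roughly doubles the
           number of correct digits. */
        double newton_sqrt(double a)
        {
            double x = a > 1.0 ? a : 1.0;   /* crude initial guess */
            for (int i = 0; i < 60; i++) {
                double next = 0.5 * (x + a / x);
                if (next == x)              /* converged to machine precision */
                    break;
                x = next;
            }
            return x;
        }

        int main(void)
        {
            printf("sqrt(2)  ~= %.15f\n", newton_sqrt(2.0));
            printf("sqrt(10) ~= %.15f\n", newton_sqrt(10.0));
            return 0;
        }

    The IAS machine worked in fixed point, so the original version also had to manage scaling by hand, one reason number representation features so prominently in the authors' list of challenges.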
Related Resources