Tag: John

Related blog posts
  • Heat 22
    2014-1-29 16:57
    1845 reads
    0 comments
    What would Hiram Percy Maxim make of the ham scene if he were alive today? This is what Kay Craigie considers in the January issue of QST, a magazine (still in print!) for ham radio enthusiasts. Maxim was the first president of the American Radio Relay League when that organisation was founded in 1914. Radios of that era were spark gap transmitters that needed no active elements, and were extraordinarily primitive by today's standards.
    That started me wondering how one of the computer industry's early luminaries, say, John von Neumann, would view today's embedded space. Von Neumann (called "Johnny" in his day) died in 1957, when the vacuum tube still reigned nearly supreme.
    His first thought might be "Where's the computer?" He helped design the biggest computer of all time, the SAGE, which occupied a half acre of floor space and weighed 250 tons. He'd be surprised by devices like the PIC10, which is available in a six-pin 2 x 3 mm DFN package. His hand could even hold dozens of the biggest processors around, like Intel's high-end devices.
    [Image: A couple of Pentiums with a 1 GB disc.]
    The RAMAC, the hard disc of von Neumann's era, held 5 MB and weighed more than a ton.
    If he looked at a DSP he might ask "Harvard architecture? What happened?" We'd have to reassure him that, yes, the von Neumann architecture still dominates, but it isn't always the most appropriate choice.
    I'm sure he'd want a computer of his own, and would immediately start writing a grant application to get the government funds needed. After all, that SAGE system cost about $90 billion in today's inflation-adjusted dollars (a figure that includes 24 two-computer systems). He'd probably be too stunned to crack one of his famous ribald jokes when told that a 32-bit MCU could be had for half a buck or less.
    Von Neumann might squint and wonder how many tubes were packed into a modern processor. In his era nearly all computers were based on tubes, thousands of them, which is why machines were so huge.
    [Image: A triode and a 32-bit computer.]
    But Johnny was surely aware of the transistor, and would have realised that semiconductors were much smaller than tubes. He'd probably wonder, though, what clever design had reduced the needed transistor count so much that a computer could exist on a chip. Imagine his jaw dropping when told that, no, actually, some of these parts contain billions of active elements.
    We complain when a machine crashes; we're surprised when one fails. It's common for a PC to run for years between hardware failures, which would probably astonish von Neumann. The IAS computer, built under his supervision, reportedly had an MTBF of 10 minutes, and there was a raging debate about just how much useful computation could be done within that limit.
    He'd also be surprised at our I/O; the notion of a GUI just didn't exist in his lifetime. His 1953 Johnniac initially had just a card reader and punch... plus a loudspeaker to give a sense of what was going on. It wasn't long before card decks were produced to play popular tunes. The stereo and full-motion video we take for granted would probably astound him. (He objected to the computer's name, but the lead engineer let him know there were plenty of other Johns in the world; what made von Neumann think it was named after him?)
    At least one thing would be familiar. His computers dissipated heat. A lot of it. That hasn't changed at the high end of computing today.
    [Image: There may not be many tubes in that processor, but it sure does get hot.]
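    To get a feel for why that reliability debate raged, here is a back-of-the-envelope sketch of my own. It assumes a simple exponential failure model, which is my assumption, not anything from the era; only the 10-minute MTBF figure comes from the post above:

```python
import math

# Under an exponential failure model (an assumption for illustration),
# the probability that a machine with a given MTBF survives a run of
# length t without failing is e^(-t/MTBF).
MTBF_MINUTES = 10.0  # the IAS figure quoted above

for run_minutes in (1, 5, 10, 30):
    p_success = math.exp(-run_minutes / MTBF_MINUTES)
    print(f"{run_minutes:>2} min run: {p_success:.0%} chance of finishing")
```

    Under that model a ten-minute job on a ten-minute-MTBF machine finishes only about 37% of the time, so keeping individual runs short was a genuine engineering concern.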
  • Heat 24
    2014-1-16 18:00
    1732 reads
    0 comments
    John von Neumann's 100th birthday was December 28, 2003. Since I missed writing about that anniversary, here's to the 110 years since that prodigy entered the world.
    A lot of brilliant people contributed to the birth and explosive growth of computers, but few were such polymaths. Von Neumann contributed to as many fields as he brushed against. Born with a silver spoon in Budapest when that city was wealthy and growing fast, he quickly made a name for himself in mathematics, publishing important papers at a prodigious rate. By the early '30s he was here in the USA, working at the Institute for Advanced Study in Princeton alongside Einstein and other luminaries.
    He was prescient about most things, including war: in 1935 he predicted a major European conflict within a decade, with a resulting genocide of the European Jews. He also said that if England were imperilled the USA would come to its aid. When the war did come he was an enthusiastic supporter of US involvement and had no doubt the Allies would win.
    During the war von Neumann worked on ballistics and the theory of explosions and shockwaves, as early as 1941 exploring the nature of shaped charges. Ballistics would lead to his involvement in computers, and his explosion research proved important to the soon-to-come Manhattan Project. It seems that aiming a big gun required executing some 750 multiplications to compensate for all of the variables involved. The Ballistic Research Laboratory in Aberdeen, MD had been trying to simplify this since the end of WWI. Johnny, as he was called, served as a consultant to the Laboratory. But his expertise was widely known and he worked on these problems for other organisations. The Navy eventually nicked him; he said he preferred Admirals to Generals because the latter drank iced tea at lunch, while Navy brass, when ashore, went for the hard liquor, and Johnny prided himself on his ability to imbibe.
    By 1943 he had travelled to England and met Turing, and was thinking about mechanical or electronic solutions for the physics of explosions. Many of his papers of the time referred to "kilotonnage" and "computers." After September 1943 he spent 30% of his time at Los Alamos. He didn't invent the explosive lens needed to detonate a plutonium weapon, but he made important improvements to it.
    Now that the press is full of stories of uranium enrichment, it's worth distinguishing the two types of bombs: a uranium device is easy to make (it's basically a cannon) but purifying the uranium is extremely difficult. Plutonium is much easier to get but must be imploded with microsecond accuracy, so those gadgets use an explosive lens to shape the charge, rapidly compressing the Pu into a perfect sphere. This turned out to be much harder than anticipated.
    On his one foray into economics he casually cranked out a 640-page book about game theory which is still considered important, as it gave the subject a mathematical foundation for the first time.
    According to John von Neumann by Norman Macrae, between August 21 and September 2, 1944, Johnny hit on the idea of a stored program computer after seeing ENIAC. The history is pretty muddied. It is known he wrote the "First Draft of a Report on the EDVAC" in March 1945. EDVAC was proposed by ENIAC's designers (John Mauchly and J. Presper Eckert) before ENIAC was operational. It's not clear if the report was ever finished, or if he wanted it circulated. However, that paper documented the stored program concept and resulted in his getting credit for the idea.
    Eckert and Mauchly felt ripped off, creating a bitterness (on their part) that never went away. But today we don't talk about the Eckert-Mauchly architecture vs. a Harvard architecture. Von Neumann was certainly one of the most visionary early computer proponents, and he wrote with an eloquence and persuasiveness that helped bootstrap the industry. He did start a computer project at the Institute for Advanced Study, and in the contract with the Army insisted that periodic public reports be issued, with the authors assigning any patentable ideas to the public domain. It's hard to imagine that happening today.
    Sadly, he died young, at 53, in 1957, of cancer, which some feel was a result of his atomic work, especially observing the Bikini test in 1946.
    He had a vast repertoire of off-colour jokes he used to defuse hostile confrontations, and was extremely engaging, the opposite of the stereotypical introverted scientist. Always gracious, never one to offend, he frequently hosted huge parties. Unlike many of his colleagues, who were overtly or partially Marxist, Johnny was aghast at communism and was a big proponent of atomic weapons and a strong defence.
    Interestingly, on-line searches for his work "The Computer and the Brain" mostly get hits to ebooks. No doubt he'd be pleased.
    "John von Neumann Interview", a grainy, undated black-and-white video on YouTube, shows him talking about the need for more science and technology education in the secondary schools, at least 60 years before the rest of us started saying the same thing. His Hungarian accent is strong, but his command of English is excellent. Some of the major figures in science sound like dull dinner companions. Not Johnny; I bet a couple of beers with him would have been the experience of a lifetime.
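    The stored-program concept that report documented, a single memory holding both instructions and data, is easy to show in miniature. Here's a toy sketch of my own; the three-instruction machine and its encoding are invented purely for illustration and bear no relation to the EDVAC design:

```python
# Toy stored-program machine: instructions and data share one memory,
# which is the heart of what the EDVAC report described. Opcodes and
# their encoding here are invented for illustration only.
memory = [
    ("LOAD", 6),    # acc = mem[6]
    ("ADD", 7),     # acc += mem[7]
    ("STORE", 8),   # mem[8] = acc
    ("HALT", 0),
    0, 0,           # padding
    2, 3,           # data: the two addends
    0,              # data: result lands here
]

pc, acc = 0, 0
while True:
    op, addr = memory[pc]   # fetch comes from the same memory as data
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[8])  # -> 5
```

    A Harvard machine, by contrast, would keep the instruction list in a separate memory from the operands.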
  • Heat 13
    2012-5-20 22:15
    1272 reads
    0 comments
    John Q is a good film about the love and responsibility between a father and his son. John Q is a factory worker; although he works hard at his job, his family is far from well off and his marriage is under constant strain. "Happy families are all alike; every unhappy family is unhappy in its own way." During a school baseball game his son Michael suddenly collapses on the field, and after tests at the hospital the family is told he has congenital heart disease and will soon die unless they pay for an expensive heart transplant. John Q, who has insurance, assumes it will cover the enormous cost, but misfortunes never come singly: he is told his policy covers only a small fraction. To save Michael he sells off what little property he has and applies for assistance, but the money he raises is a drop in the bucket. Because he cannot pay for the surgery, the hospital moves to discharge Michael. Driven to desperation, John Q takes the attending surgeon hostage at gunpoint in the emergency room and demands that Michael be operated on. The police arrive, and in the standoff John Q shows both the helplessness of an ordinary worker facing reality and the depth of his feeling for his wife and son: he is prepared to trade his own life for his son's. The ending is a happy one, his son's life bought at the price of a prison term, but what this simple plot reveals is anything but simple. Beneath the father's love and sense of responsibility, it is not hard to see that insurance, hospitals, education, politics and the rest belong to the rich. It may only be a movie, but isn't the same thing played out in real life every day?
    The country is richer now, but are ordinary people really living better than before? How much of the wealth being created rests on genuine science and technology? How many companies fly a high-tech flag while doing contract manufacturing? Many community clinics and hospitals have been built, but can people actually afford to see a doctor or stay in one? Education is more widespread and more people attend university, but has quality really improved? Much of what is taught at university is outdated. With so many graduates every year, how many will win a Nobel Prize? Can we produce a Bill Gates, a Zuckerberg, a Jobs? We have Jack Ma, Pony Ma, Ding Lei and the rest, but were the ideas behind Taobao, QQ and NetEase really their own? We always speak of a long and glorious civilisation, yet over those several thousand years, apart from the Four Great Inventions, how many other scientific and technical inventions are there? Of everything since the first industrial revolution, the steam engine, the telephone, the computer, the car, the aeroplane, which came from the "Celestial Empire"? There are pensions, but as the population ages, will they really be there for us in old age? Can we avoid ending up in Japan's situation?
    In the coming decade, to become a strong nation or a superpower, a true number one, perhaps change really is needed: change to parts of the system.
  • Heat 18
    2012-4-13 21:16
    2495 reads
    2 comments
    I must admit that I am overwhelmed with admiration for the way in which John MacCormick tackled his book Nine Algorithms That Changed the Future: The Ingenious Ideas That Drive Today's Computers.
    Of course, it's no surprise that computer science is replete with algorithms for all sorts of things. Consider a relatively simple undertaking like sorting things, for example... perhaps we wish to sort a list of names into alphabetical order, or a list of numerical values into ascending size. In this case, there are textbooks that will present you with so many different types and permutations of algorithms that it will make your eyes water. A quick glance at the Wikipedia page on Sorting Algorithms, for example, immediately presents us with a bunch of popular contenders:
    - Bubble Sort
    - Selection Sort
    - Insertion Sort
    - Shell Sort
    - Comb Sort
    - Merge Sort
    - Heapsort
    - Quicksort
    - Counting Sort
    - Bucket Sort
    - Radix Sort
    - Distribution Sort
    - Timsort
    And don't even get me started on some of the more esoteric entries like the Cocktail Sort, Gnome Sort, and Patience Sorting. Or how about the Bogosort, which is based on luck (randomly permute the array and check to see if it's sorted), or the Slowsort, which provides a remarkably inefficient variant of selection sort. Actually, I would like to mention that the Wikipedia page on the Bubble Sort provides the best visual representation of how this form of sort works that I've ever seen (for the curious, there's a minimal code sketch at the end of this review).
    Arrggghhh... as usual it's a case of "wind me up and watch me go", because none of the above is in any way relevant or related to Nine Algorithms That Changed the Future. What the author has done is to focus on a small number of revolutionary algorithms that the vast majority of computer users come into contact with every day without even knowing or thinking about it.
    When we perform a web search using Google, for example, the search engine returns a handful of relevant results culled from the billions of pages on the web. If you were to instigate a search on "The evolution of color vision", for example, you would discover that my paper Color Vision: One of Nature's Wonders appears on the first page. How is it possible for the Google search engine to recognize the genius behind my humble offering and to return such amazingly relevant (and, in this case, incredibly self-serving) results? This book tells the tale.
    As another simple example, uploading a photo to Facebook involves millions of pieces of information being transmitted over numerous error-prone network links, yet somehow a perfect copy of the photo arrives intact. How can this be? Once again, this book explains all.
    One of the best things about Nine Algorithms That Changed the Future is that it is of interest to computer professionals and innocent bystanders (non-professionals) alike. The author doesn't attempt to "baffle us with science" or blow us away with his mathematical prowess. Instead, he employs simple analogies that we can all understand. His use of mixing colored paints to explain the machinations of public key cryptography is, frankly, brilliant.
    For myself, I have to say that I learned a lot of stuff I didn't know, including some of the algorithmic tricks associated with error-correcting codes, data compression, and pattern recognition. In the discussions on databases, for example, the author explains concepts like two-phase commit, rollbacks, and transaction logging, all of which are vital to ensuring data integrity and maintaining the functionality of the database.
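    For a taste of that last idea, here's a minimal two-phase commit sketch of my own; the Participant class and its vote rule are invented for illustration and bear no relation to any code in the book:

```python
# Toy two-phase commit: the coordinator gathers a vote from every
# participant and commits only on a unanimous "yes"; otherwise all
# participants roll back.
class Participant:
    def __init__(self, name, can_commit):
        self.name = name
        self.can_commit = can_commit

    def prepare(self):      # phase 1: vote yes/no
        return self.can_commit

    def commit(self):       # phase 2, on unanimous yes
        print(f"{self.name}: committed")

    def rollback(self):     # phase 2, if anyone voted no
        print(f"{self.name}: rolled back")


def two_phase_commit(participants):
    votes = [p.prepare() for p in participants]   # phase 1
    if all(votes):                                # phase 2
        for p in participants:
            p.commit()
        return True
    for p in participants:
        p.rollback()
    return False


two_phase_commit([Participant("db1", True), Participant("db2", False)])
```

    In a real system each participant would first write its vote to a transaction log so the decision survives a crash, which is where the logging and rollback machinery the author describes comes in.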
    Although I was vaguely aware of a lot of this stuff, I didn't really understand the nitty-gritty details (I probably still don't, but I know a lot more than I did before reading this book). Should I happen to fall through a time warp and appear in the 1950s, all I have to do is remember these database tricks and techniques to ensure that I become rich beyond my wildest dreams. As usual, I could waffle on for hours, but the bottom line is that I highly recommend this book as a very enjoyable read that will be of interest to anyone who would like to understand more about the way in which the computer systems we use every day perform their magic.
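    And, as promised above, a minimal bubble sort sketch (mine, purely for illustration): each pass swaps adjacent out-of-order neighbours, which is exactly the "bubbling" the Wikipedia animation visualises.

```python
def bubble_sort(items):
    """Repeatedly swap adjacent out-of-order neighbours; after each
    pass the largest remaining element has bubbled to the end."""
    items = list(items)                 # sort a copy, leave input alone
    for end in range(len(items) - 1, 0, -1):
        swapped = False
        for i in range(end):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
        if not swapped:                 # no swaps: already sorted
            break
    return items

print(bubble_sort([5, 1, 4, 2, 8]))    # -> [1, 2, 4, 5, 8]
```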
Related resources