Tag: measurement

Related blog posts
  • Popularity 23
    2015-9-18 18:15
    1613 reads
    0 comments
    Power supply efficiency is a "hot" topic these days, whether it's an AC/DC or DC/DC unit (sorry about that unavoidable pun). Everyone is concerned about it for reasons related to one or more of run time, thermal load, operating cost, or regulatory mandates.

    Of course, talking about it is one thing, but measuring it, and doing so properly, is another. In the "old days," when efficiencies were in the range of 40% to 80%, a measurement error of a few percentage points might not have been a serious problem. Now that efficiencies are in the 85% to 90%+ zone, however, and every percentage-point increase is judged critical, any inadvertent or unintentional error in the test set-up or execution can be a major concern.

    A recent blog post by Josh Mandelcorn at Texas Instruments' Power House site clearly showed how easy it is to let apparently inconsequential or overlooked issues seriously affect the validity of the results. He points out that the potential errors in the final result start with a basic fact: there are four simultaneous readings (input and output current and voltage) from four meters, and their individual errors can conveniently cancel out, or just as easily add up.

    In addition, the effect of temperature on wire and connector resistivity—not an issue with very low-power supplies, of course—can easily introduce other small errors as well. Furthermore, there are issues associated with forced cooling versus convection cooling, thermal time constants of the supply's circuitry, and other factors that can affect the legitimacy of the numbers, even if they are measured with high accuracy.

    It's an old story: making a fairly good measurement is often easy; making a really good one is much harder. As most engineers know, the difference between the two can become a serious problem. Even if the results are tagged as "preliminary, subject to review," they get repeated in reports and often develop a life and credibility of their own, at least in management's mind.

    Pretty soon, the numbers are accepted as solid fact rather than as tentative and subject to verification; for supplies, the rough number soon becomes the lower-bound threshold of presumed efficiency. It's tough to go back to the boss and say, "You know that 94% number I gave you last week? After careful review, it's really only 90%."

    Have you had to make power supply efficiency measurements? What problems did you encounter, or not foresee? Did they come back and "bite you" when you had to revisit them?
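    To see how quickly those four meter errors can stack up, here is a minimal Python sketch. It is an illustration only: the converter readings and the 0.5% per-meter accuracy are assumed values, not figures from the TI post.

```python
# Minimal sketch (illustrative assumptions only): how four meter tolerances
# combine in a power-supply efficiency measurement.

def efficiency(v_in, i_in, v_out, i_out):
    """Efficiency = output power / input power."""
    return (v_out * i_out) / (v_in * i_in)

# Nominal readings for a hypothetical DC/DC converter that is 90% efficient.
v_in, i_in = 12.0, 1.0      # 12 W in
v_out, i_out = 5.0, 2.16    # 10.8 W out

tol = 0.005                 # assume each meter is accurate to 0.5%

nominal = efficiency(v_in, i_in, v_out, i_out)

# Worst case one way: output readings read high, input readings read low.
high = efficiency(v_in * (1 - tol), i_in * (1 - tol),
                  v_out * (1 + tol), i_out * (1 + tol))

# Worst case the other way: output readings read low, input readings read high.
low = efficiency(v_in * (1 + tol), i_in * (1 + tol),
                 v_out * (1 - tol), i_out * (1 - tol))

print(f"nominal: {nominal:.2%}  range: {low:.2%} to {high:.2%}")
# Four 0.5% meters can move a nominal 90% result by nearly +/-2 percentage
# points in the worst case; independent errors usually partially cancel,
# but the worst case sets the bound you may have to defend.
```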
  • Popularity 31
    2015-3-26 19:54
    1570 reads
    0 comments
    In the extraordinary year of 1905, Einstein published five brilliant papers, including his best-known one on special relativity. One of the other papers explained the mechanics of Brownian motion, the random motion of small particles in fluid first studied by Robert Brown in 1827 and easily observed with a microscope (Brown used spores).

    Einstein combined diffusion analysis and thermodynamics to explain and then quantify this noise-like motion of particles (figure), which had previously been measured only in the aggregate while the underlying physics remained a mystery. Although his analysis of Brownian motion has been experimentally confirmed, he noted that one parameter of his analysis—the instantaneous velocity of particles needed to verify their group-velocity distribution—would never be measurable, due to the tiny physical and time scales.

    Figure: Brownian motion of particles is a random process like electrical noise, but Einstein showed how this motion could be analyzed in the aggregate, using thermodynamic and diffusion principles, to characterize observable factors such as mean travel distances and drift motion over time (from Florida State University, Department of Scientific Computation).

    Fast-forward to the 21st century, and it looks as if the "it can't be done" measurement is being done. In a brief and fascinating article, "The measurement Einstein deemed impossible," in the January 2015 edition of Physics Today (always an interesting and readable publication), two professors describe how they combined an optical tweezer with a pulsed laser for nanosecond data collection to actually make that measurement.

    Note: Einstein's five papers of 1905 were on the kinetic theory of gases, Brownian motion, special relativity, the relationship between mass and energy, and the quantum nature of the photoelectric effect (due to its "practicality," the last was the accomplishment cited in the official rationale for his Nobel Prize award). Einstein's Miraculous Year by John Stachel is an excellent book with complete translations of these five papers, as well as detailed technical explanations and in-depth historical perspective for each.

    Developed in the 1970s, the optical tweezer is a focused laser beam, now a standard tool in physics and biology experiments, used to trap and then move particles. The tweezer and a pulsed laser alone are, however, not enough to make the measurement. The article discussed how the particles under observation—here, tiny glass beads rather than the pollen that Brown and others used—and the overall test bed were configured. The velocity measurement used a split-beam photodetector, with half of the emitted beam going to the test site and returning to the detector, while the other half goes directly to the detector via an equivalent optical-path length. This type of differential measurement cancels any intensity fluctuations in the laser.

    The ability to extract meaningful data in difficult situations is one of the many attributes of the test and measurement domain. Sometimes, as in the Brownian-motion case here, the measurement challenge is inherent in the nature of the object under test and the subtleties of instantaneous velocity on this microscopic time and physical scale.

    In many cases, however, the parameter is simple to measure in principle, but the reality of making the measurement is hard.
    Think of all the challenging places we need to measure temperature despite its apparent simplicity, and all the creative, often ingenious contact and non-contact solutions used.

    Or consider weight: the late Jim Williams, in his 1976 feature article for EDN, "This 30-ppm scale proves that analog designs aren't dead yet," discussed in detail his design for an infant-weighing scale for the MIT nutrition lab. While a basic weigh scale is an almost trivial design, Williams faced some tough objectives: the scale had to be small and portable, offer absolute accuracy within 0.02% along with resolution to 0.01 pound over a 300-pound range, use only standard components, and never need calibration once put into use. To do this, he looked at every subtle source of error, including thermal drift, component aging, and stray EM fields. In a thorough tour of engineering excellence, he worked out how to minimize, negate, or self-cancel their degrading effects.

    Have you ever helped devise a solution to a measurement that "couldn't be done" or was deemed very difficult? How did you verify the validity of your approach?
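    As a footnote to the Brownian-motion discussion above: the aggregate behavior Einstein quantified, mean-squared displacement growing linearly with time, is easy to reproduce numerically even though the instantaneous velocity is the hard part to observe. Below is a minimal Python sketch of a plain one-dimensional random walk; the diffusion coefficient, step size, and particle count are arbitrary illustration values, not taken from the Physics Today article.

```python
# Minimal sketch (arbitrary parameters, illustration only): a 1-D random
# walk reproducing Einstein's aggregate result for Brownian motion,
# mean-squared displacement <x^2> = 2*D*t, even though each individual
# step looks like pure noise.
import random

D = 1.0            # assumed diffusion coefficient (arbitrary units)
dt = 0.01          # time step
n_steps = 1000
n_particles = 2000

sigma = (2 * D * dt) ** 0.5   # Gaussian step size consistent with <x^2> = 2*D*t

positions = [0.0] * n_particles
for step in range(1, n_steps + 1):
    for i in range(n_particles):
        positions[i] += random.gauss(0.0, sigma)
    if step % 250 == 0:
        t = step * dt
        msd = sum(x * x for x in positions) / n_particles
        print(f"t = {t:5.2f}   measured <x^2> = {msd:.3f}   predicted 2*D*t = {2 * D * t:.3f}")
```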
  • Popularity 21
    2013-3-15 18:36
    8641 reads
    0 comments
    Assessing the thermal effects on general-purpose PC boards is challenging, but it's another story to control board reliability in mission-critical applications where a device failure is not an option.

    In a recent discussion, Andy Burkhardt, technical marketing specialist at Polar Instruments, acknowledged the reliability issues for PC boards, adding: "Certainly moisture ingression, humidity and temperature all have effects on PCBs. More generally on reliability as a catalyst contributing towards other failure mechanisms, but particularly on insertion loss of surface microstrip traces and slightly less so for strip-line transmission line structures."

    Superior reliability is the pivotal issue for mission-critical applications. Devices have to continue to function correctly and reliably under harsh, and sometimes severe, conditions, including high moisture and humidity, high heat, freezing cold, electromagnetic interference and destructive electromagnetic pulses.

    Think about the Mars Science Laboratory rover, Curiosity, which landed at Gale Crater on Aug. 6, 2012. It integrates the highest-end sensors and measuring systems to conduct unprecedented experiments on the Red Planet. Curiosity has a number of PC boards communicating, interfacing, regulating and controlling various sensors and other scientific gear.

    Take, for instance, Curiosity's Mars Hand Lens Imager (MAHLI), which uses a 2-megapixel colour camera with a focusing macro lens to observe textural, mineralogical, structural and morphological details in geologic materials. The MAHLI camera head electronics are laid out as a single rigid-flex substrate with three rigid sections, sandwiched between housings that provide mechanical support and radiation shielding.

    Location of MAHLI hardware aboard the Mars Science Laboratory rover, Curiosity, in the Spacecraft Assembly Facility (Source: NASA)

    The aerospace industry requires PC boards with high reliability in extreme conditions. The MAHLI camera head is usually operated at temperatures of -40°C to +40°C on Mars. Beyond that, NASA said it has been verified, through testing on a non-flight unit, to survive nearly three Mars years of diurnal temperature cycles (down to -130°C) without any heating.

    The graph shows the rise and fall of air and ground temperatures on Mars obtained by NASA's Curiosity rover. The data cover Aug. 16 to Aug. 17, 2012, and were obtained by the Rover Environmental Monitoring Station. Ground temperatures vary from as high as 37 degrees F (3 degrees C) to as low as minus 131.8 degrees F (minus 91 degrees C), showing large temperature oscillations from day to night. (Source: NASA)

    The imperatives are different, though, when human lives are at stake. If Golden Spike, a private company whose board includes former NASA engineers and spaceflight experts, delivers on its promises, it will offer rides to the moon by 2020, for the modest sum of $1.5 billion for two people. Advancing to new frontiers is exhilarating, but as lives depend on the products we provide, it is fundamental never to compromise on safety. Therefore, the aerospace electronics industry must continue to design products that meet the absolute highest standards.

    Anne-Francoise Pele, EE Times
  • Popularity 41
    2011-4-30 10:05
    7587 reads
    16 comments
    In March, a magnitude-9 earthquake struck Japan. It shocked the whole world and shook the entire industry supply chain. For a while, everyone was busy handling the crisis, urgently building inventory and searching everywhere for alternative suppliers. Now, more than a month later, the impact of the event is gradually receding from view, and media attention is slowly falling off, from the earlier round-the-clock coverage to only the occasional mention now. I can finally step back from the scramble and write a short summary. Calm, uneventful days really are more comfortable; but if you spend enough time in this business, you will eventually take a hit. Risk management is an indispensable part of supply chain management, and staying alert to danger in times of safety is an essential mindset.

    There are many lessons to draw from this crisis in Japan, for example, don't put all your eggs in one basket, and keep information transparent when handling a public crisis. From a supply chain management perspective, the crisis has made the importance of supplier risk management and crisis response even more prominent. As I mentioned in my previous article, risk management is homework that every buyer must do.

    How do you do risk management well? In my view, the first step is to have a complete picture of the risk management (crisis management or risk management) process. I divide it roughly into four parts: risk assessment and identification -> contingency planning -> tracking and review -> improvement and updating.

    The main purpose of risk evaluation is to identify risk factors and understand how severe and how far-reaching the consequences would be if a risk actually occurred. A risk assessment should cover strategy, quality, supply, and safety (the last further broken down into production safety and environmental safety). Supplier risk assessments should be led by purchasing but carried out by a cross-functional team, because they involve a great deal of specialized technology and knowledge that no single buyer can cover.

    On the basis of a thorough assessment of the possible risks, a contingency plan should then be drawn up. Concretely, it includes quantified control metrics and monitoring ranges, a list of the factors that could trigger abnormal situations, the measures to be taken, the people responsible for implementation, the people to be notified, and so on.

    Regarding contingency plans and measures, some people argue that the more specific and detailed, the better. I believe there has to be a balance between risk and benefit. A plan, after all, is today's projection of the future; nobody can be certain that what is planned now will actually happen that way, or that the future situation will match what is planned today. So when working out specific response measures I follow two principles. First, look for evidence in historical data to see how likely each scenario is, and then decide how specific the measures need to be. Second, consider the personalities of the team members: if most are risk-averse, the plan will be painstakingly detailed; if optimists dominate, it will be much simpler. But whether the version is complex or simplified, a few basic elements must be present, such as risk points, key control points, control methods, emergency response plans, and the risk-reporting path.

    Next, the contingency plan needs a process of continuous review and updating. Even the most perfect plan is still only a plan; whether it matches reality and keeps up with events requires constant review. In practice, this can be done by requiring suppliers to submit regular reports, holding review meetings at the supplier's factory, and running improvement proposals.

    Also, from personal experience, when you hold review meetings with a supplier, always inspect the shop floor; don't let the plan fool you. Some suppliers are very good at window dressing: their PowerPoint decks look beautiful and they are well practiced at satisfying customer requests. Whether anything has actually been implemented, you only find out on site. I once had a case like this: during a risk assessment of a supplier, we felt that access control for people entering and leaving its production area was inadequate and posed a potential risk to product information security, and we asked the supplier to fix it. The supplier appeared very cooperative and quickly submitted an improvement plan, a report of a dozen or more pages that spelled out in detail how personnel clothing would be identified, how entry and exit would be controlled, and how the partitioning and enclosure work for the production area would be carried out, even attaching AutoCAD drawings. It looked very professional and very specific. But when we revisited the supplier some time later, the production area still had no controls at all: people came and went freely, and not a single item of the earlier plan had been implemented.

    So on-site management matters. You must make sure responsibilities are actually assigned and quantitatively assessed, and there must be follow-up on effectiveness (effectiveness measurement).

    This is a hurried and rather brief summary. Finally, I want to say that in supplier risk management, staying alert to danger in times of safety is very important; as the saying goes, caution keeps the boat safe for ten thousand years, and it helps you avoid losses. But if a risk does materialize, keeping a calm state of mind is just as important; at that point the task is to minimize the loss.
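    To make the "basic elements" named above concrete, here is a small, entirely hypothetical Python sketch of one supplier risk-register entry. The field names and example values are my own illustration; they are not taken from the original post or from any particular company's process.

```python
# Hypothetical sketch of a single supplier risk-register entry, capturing
# the basic elements named in the post: risk point, key control point,
# control method, contingency plan, and risk-reporting path.
# All field names and example values are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class RiskEntry:
    risk_point: str                  # what could go wrong
    key_control_point: str           # where it is monitored
    control_method: str              # how it is monitored / quantified
    contingency_plan: str            # what to do when the trigger fires
    reporting_path: List[str] = field(default_factory=list)  # who is notified, in order
    owner: str = ""                  # person responsible for implementation

example = RiskEntry(
    risk_point="Sole-source supplier plant shut down by a natural disaster",
    key_control_point="Buffer stock at the regional warehouse",
    control_method="Weekly stock report; alarm if cover drops below 4 weeks",
    contingency_plan="Release qualified second source; expedite air freight",
    reporting_path=["Buyer", "Commodity manager", "Supply chain director"],
    owner="Lead buyer",
)

print(example)
```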
Related resources