In the extraordinary year of 1905, Einstein published five brilliant papers, including his best-known one on special relativity. One of the other papers explained the mechanics of Brownian motion, the random motion of small particles in a fluid first studied by Robert Brown in 1827 and easily observed with a microscope (Brown used spores). Einstein combined diffusion analysis and thermodynamics to explain and then quantify this noise-like motion of particles (figure), which had previously been measured only collectively, while the underlying physics remained a mystery. Although his analysis of Brownian motion has since been experimentally confirmed, he noted that one parameter of his analysis, the instantaneous velocity of the particles needed to verify their velocity distribution, would never be measurable due to the tiny physical and time scales involved.

Figure: Brownian motion of particles is a random process like electrical noise, but Einstein showed how this motion could be analyzed in the aggregate, using thermodynamic and diffusion principles to characterize observable factors such as mean travel distances and drift motion over time (from Florida State University, Department of Scientific Computation).

Fast-forward to the 21st century, and it looks as if the "it can't be done" measurement is being done. In a brief and fascinating article, "The measurement Einstein deemed impossible," in the January 2015 issue of Physics Today (always an interesting and readable publication), two professors describe how they combined an optical tweezer with a pulsed laser for nanosecond data collection to actually make that measurement.

Note: Einstein's five papers of 1905 were on the kinetic theory of gases, Brownian motion, special relativity, the relationship between mass and energy, and the quantum nature of the photoelectric effect (due to its "practicality," the last of these was the accomplishment cited in the official rationale for his Nobel Prize). Einstein's Miraculous Year by John Stachel is an excellent book with complete translations of these five papers, as well as detailed technical explanations and in-depth historical perspective for each.

Developed in the 1970s, the optical tweezer is a focused laser beam that is now a standard tool in physics and biology experiments; it is used to trap and then move particles. The tweezer and a pulsed laser are not enough to make the measurement, however. The article discusses how the particles under observation (here, tiny glass beads rather than the pollen that Brown and others used) and the overall test bed were configured. The velocity measurement used a split-beam photodetector, with half of the emitted beam going to the test site and returning to the detector, while the other half goes directly to the detector via an equivalent optical-path length. This type of differential measurement enables cancellation of any intensity fluctuations in the laser.

The ability to extract meaningful data in difficult situations is one of the many attributes of the test-and-measurement domain. Sometimes, as in the Brownian-motion case here, the measurement challenge is inherent in the nature of the object under test and the subtleties of instantaneous velocity at this microscopic time and physical scale. In many cases, however, the parameter is simple to measure in principle, but the reality of making the measurement is hard.
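For readers who like to see the statistics in action, here is a minimal simulation sketch of the aggregate behavior described above; it is my own illustration, not taken from Einstein's paper or the Physics Today article, and every parameter value is an arbitrary assumption. Individual Brownian trajectories look like pure noise, yet the ensemble mean-squared displacement grows linearly with time, which is exactly the kind of observable Einstein's analysis quantified.

```python
# Minimal sketch (illustrative only): 1-D Brownian trajectories simulated as
# Gaussian random walks. Any single trajectory is noise-like, but the ensemble
# mean-squared displacement follows Einstein's aggregate relation <x^2> = 2*D*t.
# The diffusion coefficient and time step below are arbitrary assumed values.
import numpy as np

rng = np.random.default_rng(0)

D = 1.0e-12          # assumed diffusion coefficient, m^2/s
dt = 1.0e-3          # observation time step, s
n_steps = 2000       # samples per trajectory
n_particles = 5000   # ensemble size

# Each displacement is Gaussian with variance 2*D*dt for 1-D diffusion.
steps = rng.normal(0.0, np.sqrt(2.0 * D * dt), size=(n_particles, n_steps))
x = np.cumsum(steps, axis=1)                 # particle positions over time

t = dt * np.arange(1, n_steps + 1)
msd = np.mean(x**2, axis=0)                  # ensemble mean-squared displacement

slope = np.polyfit(t, msd, 1)[0]             # should come out close to 2*D
print(f"fitted MSD slope = {slope:.3e}, expected 2*D = {2.0 * D:.3e}")

# The coarse-grained "velocity" step/dt has a spread that scales as 1/sqrt(dt),
# so it only gets noisier as the observation interval shrinks.
print(f"apparent velocity spread at dt = {dt} s: {np.sqrt(2.0 * D / dt):.3e} m/s")
```

In this idealized diffusive model, the velocity estimate grows noisier the finer you sample, which is one way to appreciate why Einstein considered the instantaneous velocity out of reach at the time scales then observable.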
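The benefit of the split-beam, differential detection scheme mentioned above can likewise be illustrated with a rough numerical sketch. This is my own construction with made-up numbers, not the authors' actual signal chain: laser intensity fluctuations are common to both arms and largely cancel in the subtraction, leaving the tiny modulation imprinted by the trapped particle.

```python
# Rough illustration (assumed, made-up numbers): a 50/50 split-beam differential
# measurement cancels common-mode laser intensity fluctuations, exposing a tiny
# signal that a single-ended measurement would bury in the noise.
import numpy as np

rng = np.random.default_rng(1)

n = 100_000
I0 = 1.0                                        # nominal laser intensity (arbitrary units)
laser = I0 * (1.0 + 0.01 * rng.normal(size=n))  # ~1% common-mode intensity noise

# Hypothetical tiny modulation imprinted on the probe arm by the trapped particle.
signal = 1.0e-4 * np.sin(2.0 * np.pi * np.arange(n) / 500.0)

probe = 0.5 * laser * (1.0 + signal)   # arm that passes through the test site
reference = 0.5 * laser                # arm routed directly to the detector
differential = probe - reference       # ~0.5 * I0 * signal; laser noise cancels

# Correlation with the true signal: near zero single-ended, near one differentially.
print("single-ended correlation with signal:", np.corrcoef(probe, signal)[0, 1])
print("differential correlation with signal:", np.corrcoef(differential, signal)[0, 1])
```

The same differential principle shows up throughout precision instrumentation, from bridge circuits to balanced photodetectors: let the unwanted variation appear identically in two paths, then subtract.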
Think of all the challenging places where we need to measure temperature despite its apparent simplicity, and all the creative, often ingenious contact and non-contact solutions used. Or consider weight: the late Jim Williams, in his 1976 feature article for EDN, "This 30-ppm scale proves that analog designs aren't dead yet," discussed in detail his design for an infant-weighing scale for the MIT nutrition lab. While a basic weighing scale is an almost trivial design, Williams faced some tough objectives: the scale had to be small and portable, offer absolute accuracy within 0.02% along with resolution to 0.01 pound over a 300-pound range (0.01 pound in 300 pounds is roughly 33 ppm, hence the title), use only standard components, and never need calibration once put into use. To do this, he looked at every subtle source of error, including thermal drift, component aging, and stray EM fields. In a thorough tour of engineering excellence, he worked out how to minimize, negate, or self-cancel their degrading effects.

Have you ever helped devise a solution to a measurement that "couldn't be done" or was deemed very difficult? How did you verify the validity of your approach?