2016-3-24 11:52
Measuring physical variables is a major challenge along two dimensions: when ever-greater precision and accuracy are required, or when the variable (such as distance, time, or temperature) is at an extreme end of the range. I always enjoy reading about new ways of measuring "common" parameters that take advantage of new techniques and technology advances. That's why the recent story "Breakthrough Spectrometer Defines Kelvin Temperature" in Laser Focus World was so fascinating, and why it raised some questions of a philosophical nature for me.

In brief, the article explains in readable detail how researchers from a trio of Australian universities have developed an ultrasensitive absorption spectrometer (Figure 1) that can measure Boltzmann's constant, a fundamental parameter of physics that relates energy to temperature. If you can measure the former, you can then determine the latter. The approach is especially attractive because it uses instrumentation that can be duplicated anywhere (given money and expertise, of course) rather than depending on a transfer standard or other less-accurate secondary-standard setups.

Figure 1. The laser-based absorption spectrometer developed by three Australian universities allows precise measurement of Boltzmann's constant via an atomic transition in trapped cesium gas, leading to a measurement of absolute temperature with instrumentation that can be duplicated anywhere rather than relying on a transfer standard. (Courtesy University of Adelaide)

I won't claim to fully understand the physics and optics of this new approach. Still, reading these descriptions of how measuring variable A leads to a measurement of variable B always makes me a little uncomfortable, as in "what are we really doing here?" Advanced metrology can sometimes send your head in circles, even when the principles are valid and vetted. I don't doubt the validity of the Australian technique; it's only my personal queasiness we're talking about.
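To make the energy-to-temperature link concrete, here is a minimal sketch (mine, not from the article) of the relation the spectrometer exploits, E = k_B·T. Note that since the 2019 SI redefinition, k_B is fixed exactly by definition; at the time of the Australian work it was still an experimentally measured quantity, which is exactly why measuring it well mattered.

```python
# Boltzmann's constant relates characteristic thermal energy to absolute
# temperature: E = k_B * T. Measure one side accurately and you have the other.
K_B = 1.380649e-23  # J/K (exact since the 2019 SI redefinition)

def thermal_energy_joules(temp_kelvin: float) -> float:
    """Characteristic thermal energy k_B*T at a given absolute temperature."""
    return K_B * temp_kelvin

def temperature_from_energy(energy_joules: float) -> float:
    """Invert the relation: determine absolute temperature from an energy."""
    return energy_joules / K_B

room_temp = 300.0  # K
e = thermal_energy_joules(room_temp)
print(f"k_B*T at {room_temp} K = {e:.3e} J")  # ~4.14e-21 J
print(f"Recovered temperature: {temperature_from_energy(e):.1f} K")
```

Trivial arithmetic, of course; the hard part the researchers solved is measuring the energy side (via an atomic absorption line) precisely enough for the inversion to be meaningful.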
After all, measuring temperature by any means, whether a thermocouple, fluid expansion, or even the speed of sound, is inherently an indirect way of measuring this most common of parameters. (For a fascinating transducer and circuit which measures temperature via the change in the speed of sound, see the last EDN article by the sorely missed Jim Williams, "An introduction to acoustic thermometry.")

I have even greater discomfort when I read about experiments at a fraction of a degree above absolute zero (0 K). The report will call out 0.01 K, and I wonder: how do you measure that low a temperature with both accuracy and precision? I am sure the experts at places such as NIST have thought these issues through and analyzed them, but still, I wonder: how do you know that the physics of whatever you are doing remains consistent at those extremes? After all, we know that apparently strange things (which we can now model and explain) do happen as the temperature drops, such as water turning to ice or superconductivity.

I recall once reading that the brilliant physicist Richard Feynman pointed out (sorry, I can't find the source or citation) that it's naive to assume the laws of physics as we know them hold at extremes (he was referring to the moment before the Big Bang, when all the mass of the universe was supposedly compressed into the volume of a coffee cup). So measuring temperature at a fraction of a degree Kelvin may be a "what does this really mean?" question as much as anything else. It's the same with special and general relativity: we know the equations and theory, and we know they have been verified by countless independent experiments from many different perspectives.
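The acoustic-thermometry idea mentioned above can be sketched from first principles. This is not Jim Williams' circuit, just the ideal-gas relation c = sqrt(γRT/M) it ultimately rests on, with standard constants for dry air: the speed of sound depends only on absolute temperature (for a given gas), so measuring c yields T.

```python
import math

# Ideal-gas speed of sound: c = sqrt(gamma * R * T / M).
# Constants below are standard values for dry air.
GAMMA = 1.4          # adiabatic index of (mostly diatomic) air
R = 8.314462618      # J/(mol*K), molar gas constant
M_AIR = 0.0289647    # kg/mol, molar mass of dry air

def speed_of_sound(temp_kelvin: float) -> float:
    """Speed of sound in dry air at a given absolute temperature."""
    return math.sqrt(GAMMA * R * temp_kelvin / M_AIR)

def temperature_from_speed(c_mps: float) -> float:
    """Invert the relation: T = c^2 * M / (gamma * R)."""
    return c_mps**2 * M_AIR / (GAMMA * R)

c = speed_of_sound(293.15)  # ~343 m/s at 20 degrees C
print(f"c at 20 C = {c:.1f} m/s")
print(f"Recovered T: {temperature_from_speed(c):.2f} K")
```

Which is the point: even this "direct" acoustic measurement is really measuring a transit time and inferring temperature through a model of the gas.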
Still, after thinking about time dilation and the bending of the space-time fabric by gravity, I think of the key message and paradox about "time" discussed in the excellent book From Eternity to Here: The Quest for the Ultimate Theory of Time by Sean Carroll: everyone knows what time is (Figure 2), and yet no one really knows what it is, and the implications of "time travel" only compound that confusion. Thinking about these things too much can put your head in a "race" condition from which there may be no escape.

Figure 2. Take a break from your test-and-measurement tasks, and delve into the meanings of "time" with this excellent book, which is casual, sophisticated, and technical at the same time.

Do you ever get involved in test-and-measurement situations which make you wonder what you are really measuring, and about the real meaning of the parameter itself?