How do you build and verify the instrumentation which allows you to do leading-edge work, and advance the state of the art? This has always been an engineering and science dilemma. How do you confirm vital metrology, which must itself be better than what you are now trying to measure? How do you actually know that what you think you are measuring is really what is there, anyway? Or, to cite the Roman poet Juvenal, "who will guard the guards themselves?"
Obviously, there is no single or simple answer to this, but we do know one thing: it isn't easy. Yet vendors such as Agilent Technologies continuously work the problem, both by developing radically new instrumentation and by looking to squeeze that last bit of performance perfection out of what we already have.
I saw a clear demonstration of this when Agilent showed two new test/calibration products, for different situations but with a common underlying theme. One is the N2809A PrecisionProbe software for their Infiniium 90000 X-Series and 90000A Series of oscilloscopes; the other is a "system impulse response correction" (SIRC) calibration product for multimode optical receivers with bandwidths up to and beyond 25GHz, and single-mode receivers nearing 100GHz bandwidth.
The PrecisionProbe product lets users fully characterize the signal path's passive components (cables or fixtures), and then compensates automatically during signal acquisition and analysis. For example, it allows you to take your GHz cables—which were not absolutely perfect to begin with, and have likely endured a hard life of bending, twisting, and maybe even some kinks and crushing—and develop a profile of that specific item, stored as a downloadable file and associated with it for all future measurements. As a result, the scope assesses only the device under test (DUT) in its displays and readouts, while the cable or fixture is taken "out of the picture", so to speak.
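To make the idea concrete, here is a minimal sketch of the general technique—characterizing a cable's complex frequency response by comparing a known fast edge measured directly against the same edge measured through the cable, then saving that per-cable profile for later use. The function names and file format are hypothetical; this is not PrecisionProbe's actual implementation.

```python
import numpy as np

def characterize_cable(reference_step, through_cable_step, dt, fname):
    """Estimate a cable's complex frequency response H(f) and store it as a
    per-cable profile (a sketch of the general de-embedding idea, not the
    vendor's algorithm).

    reference_step, through_cable_step : sampled step waveforms, same length
    dt    : sample interval in seconds
    fname : file in which to store the profile (.npz)
    """
    # Differentiate the steps to get impulse-like responses, then transform
    ref_spec = np.fft.rfft(np.gradient(reference_step))
    dut_spec = np.fft.rfft(np.gradient(through_cable_step))

    # Cable transfer function; tiny epsilon guards against divide-by-zero
    H = dut_spec / (ref_spec + 1e-12)
    freqs = np.fft.rfftfreq(len(reference_step), dt)

    np.savez(fname, freqs=freqs, H=H)   # the "profile" tied to this cable
    return freqs, H
```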
It's always been possible, of course, to work around these imperfections to some extent, and engineers have done so for many years. But it's a complex process, often involving a vector network analyzer (VNA) and maintenance of a paper trail of the numerous calibration factors and points for that setup, as well as physically disconnecting and reconnecting each item to be checked. That's labor-intensive and can introduce new errors of its own, since things never go back quite the way they were.
While the idea of measurement and compensation is certainly not new, making it practical at the bandwidths of these scopes required using the scope's internal 15ps rise-time, indium phosphide (InP) pulser IC as a signal source to the outside.
Having this source is only part of the story. The Agilent scopes embed a DSP hardware accelerator to perform the compensation calculations (frequency response, loss, linearity, skew, artifacts, and phase shift) in real time, so the scope incurs a throughput loss of only a few percent—practically invisible to the user.
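As a rough illustration of the kind of correction involved—not Agilent's implementation, which runs in dedicated DSP hardware—a frequency-domain de-embedding step using the stored profile might look like the following regularized inverse filter (all names here are assumptions for the sketch):

```python
import numpy as np

def de_embed(acquired, freqs_profile, H_profile, dt, noise_floor=0.05):
    """Remove a characterized cable/fixture response from an acquired
    waveform by regularized inverse filtering (a Wiener-style sketch,
    not the scope's actual algorithm).

    acquired      : waveform measured through the characterized path
    freqs_profile : frequency axis from the stored cable profile
    H_profile     : complex response from the stored cable profile
    dt            : sample interval of 'acquired'
    noise_floor   : limits gain where |H| is small, to avoid boosting noise
    """
    n = len(acquired)
    freqs = np.fft.rfftfreq(n, dt)

    # Interpolate the stored profile onto this acquisition's frequency grid
    H = (np.interp(freqs, freqs_profile, H_profile.real)
         + 1j * np.interp(freqs, freqs_profile, H_profile.imag))

    # Regularized inverse: divide out H only where it is well above the floor
    H_inv = np.conj(H) / (np.abs(H) ** 2 + noise_floor ** 2)

    corrected = np.fft.irfft(np.fft.rfft(acquired) * H_inv, n=n)
    return corrected
```

In practice the regularization term is what keeps the correction from amplifying noise at frequencies where the cable has rolled off severely; the scope's hardware-accelerated version handles this, plus the skew and linearity terms, fast enough to keep the throughput penalty to a few percent.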
[The SIRC calibration for optical is conceptually similar but very different in implementation: it provides calibration and compensation of the optical channel within the 86100D digital communication analyzer, to allow more accurate, standards-compliant measurements to higher bandwidths.]
Advancing the state of the art in instrumentation has always been among the toughest challenges engineers face. You have to determine the sources of error, then figure out ways to minimize them, cancel them, or compensate for them—and also how to verify that what you are seeing is reality, and not a figment of the test and/or your own mental "projection". The late analog expert Jim Williams often explored this, and one of his first published articles, "This 30-ppm scale proves that analog designs aren't dead yet", is still a remarkable exposition on this subject.
Do you address these sorts of challenges? Have you ever had to? Has your equipment, fixturing, overall setup, or even your own assumptions ever led you astray—and for how long?