The continuing trend toward faster, cheaper, more accurate analog-to-digital converters will top the list of advances in instrumentation in the 1990s. Digitizing oscilloscopes may be the most significant product, combining the speed of analog oscilloscopes with the ability to simultaneously examine high-speed transient events on multiple channels. By enabling engineers to get a clearer picture of what is going on in high-speed digital systems, digitizing oscilloscopes will speed design implementation. A side effect of the oscilloscope changeover will be the gradual disappearance of logic analyzers, which are essentially one-bit digitizing oscilloscopes. Advances in a-d converters will also spur development of the instrument-on-a-card concept, which envisions medium-performance instrumentation in the form of cards that fit in a VXI bus-based mainframe.
The most exciting technological advance in instrumentation in the ’90s will be the continuing improvement in analog-to-digital converters, which will keep getting faster, less expensive, and more precise. As a result, a whole host of changes in instrumentation can be anticipated. Probably the most important will be the changeover in oscilloscopes from analog to digital in all but the lowest-cost, lowest-performance areas.
The new breed of digitizing oscilloscopes will have all of the speed that was formerly available only from analog instruments. But they’ll also offer a capability that average designers could never access before: the ability to examine high-speed transient events on multiple channels simultaneously.
Not surprisingly, that capability is exactly what’s needed to analyze the behavior of the types of circuits that average designers will be building. I say “not surprisingly” because those new circuits will be based on the same advances in semiconductors that will make the new oscilloscopes possible. To be specific, those circuits will be very fast digital designs, such as systems based on 50-MHz microprocessors. These systems will have sub-nanosecond rise times and will require timing resolutions of tens of picoseconds.
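The connection between rise time and required measurement bandwidth can be made concrete with a standard rule of thumb (my addition, not stated in the article): for an instrument with a Gaussian-style response, bandwidth is roughly 0.35 divided by rise time. A quick sketch:

```python
# Rule-of-thumb sketch (assumption: Gaussian-response instrument):
# -3 dB bandwidth ~= 0.35 / rise time. So capturing sub-nanosecond
# edges demands hundreds of megahertz of bandwidth, and resolving
# tens of picoseconds of timing implies correspondingly fine
# (real or equivalent-time) sample spacing.

def required_bandwidth_hz(rise_time_s: float) -> float:
    """Approximate bandwidth needed to preserve a given rise time."""
    return 0.35 / rise_time_s

for tr in (1e-9, 0.7e-9, 0.35e-9):
    print(f"rise time {tr * 1e9:.2f} ns -> ~{required_bandwidth_hz(tr) / 1e6:.0f} MHz")
```

A 1-ns edge already calls for roughly 350 MHz of bandwidth, which is why the systems described here push past what conventional instruments of the day could show.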
Today, when engineers troubleshoot a design of that type, much of what they do consists of guesswork. They use their knowledge and experience to guess what the problem might be, fix it based on that assumption, and check to see whether the problem went away. Eventually they fix the problem, but they rarely know whether it was what they originally thought. The high-speed single-shot events that cause the problems simply can’t be seen on today’s conventional instruments.
The new breed of reasonably priced oscilloscopes will give average engineers the ability to really understand what’s happening with their high-speed digital systems. I can’t say precisely what effect that capability will have on design methodologies, but it’s sure to be considerable. It will certainly enable engineers to implement designs more quickly. In other words, it’s a productivity tool.
It can also be a learning aid. When you truly understand what went wrong and why your fix worked, you may have learned something that will give you a hint of what to avoid in the future.
Another interesting outcome of the oscilloscope changeover will be the disappearance of the logic analyzer as a separate piece of instrumentation. A logic analyzer, after all, is merely a one-bit digitizing oscilloscope. As the price of a-d converters continues to drop, a point will be reached where it makes sense to build, say, 80-channel digitizing oscilloscopes. With such instruments, there’d be no need for a simple logic analyzer.
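The point that a logic analyzer is a one-bit digitizing oscilloscope can be sketched in a few lines (an illustration of mine, with a threshold value chosen arbitrarily for the example): reducing a scope channel's multi-bit samples to a single bit per sample yields exactly what a logic analyzer would have captured.

```python
def to_logic_levels(samples_v, threshold_v=1.4):
    """Collapse multi-bit digitized samples to one-bit logic values,
    which is all a logic analyzer retains per channel.
    (Threshold of 1.4 V is an illustrative, TTL-style choice.)"""
    return [1 if v >= threshold_v else 0 for v in samples_v]

# One channel of digitized waveform data (volts):
channel = [0.2, 0.3, 3.1, 3.3, 3.2, 0.4, 0.1, 3.0]
print(to_logic_levels(channel))  # prints [0, 0, 1, 1, 1, 0, 0, 1]
```

An 80-channel digitizing oscilloscope that can perform this reduction subsumes the logic analyzer while keeping the full analog detail available when it is needed.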
As a result of advances in oscilloscopes, I expect substantial changes in microwave engineering design methodologies. Today, most microwave design work is done in the frequency domain because the dominant measurement tools available to microwave designers (spectrum analyzers and network analyzers) work in this domain.
But given a choice, a majority of engineers would prefer to work mostly in the time domain. At lower frequencies, where there has long been a choice, both design and analysis are done in the time domain because it’s easier to spot most problems there. For example, if your amplifier is clipping a sine wave, it can easily be spotted on an oscilloscope. In the frequency domain, however, all you’d see are some second and third harmonics, which may also be caused by crossover distortion or some other nonlinearity. With a spectrum analyzer, you know there’s a problem and you know when it’s been solved, but you don’t necessarily know what you did to fix it.
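The clipping example above can be demonstrated numerically (my sketch, not the author's): clip the positive peaks of a sine wave, as a stage running out of headroom on one supply rail might, and the flattened tops are obvious in the time domain, while the frequency domain shows only the resulting second and third harmonics.

```python
import numpy as np

n, cycles = 1024, 8
t = np.arange(n)
signal = np.sin(2 * np.pi * cycles * t / n)
clipped = np.minimum(signal, 0.7)   # flatten the positive peaks only

# In the frequency domain, the asymmetric clipping appears only as
# harmonic content -- the same signature other nonlinearities produce.
spectrum = np.abs(np.fft.rfft(clipped))
fund = spectrum[cycles]
print(f"2nd harmonic: {spectrum[2 * cycles] / fund:.1%} of fundamental")
print(f"3rd harmonic: {spectrum[3 * cycles] / fund:.1%} of fundamental")
```

Both harmonics come out at a few percent of the fundamental, so a spectrum analyzer reveals that distortion exists but not that clipping, specifically, caused it.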
Aside from fomenting what amounts to a revolution in oscilloscopes, advances in a-d converters will also give a powerful boost to the instrument-on-a-card concept. I expect that much of the medium-performance instrumentation produced toward the end of the decade will be in the form of cards that will fit into an instrumentation mainframe based on the VXI bus. This type of packaging, however, will be of more interest to manufacturing and test engineers (to whom size and configurability are very important) than to designers. But wherever it’s applied, the instrumentation card cage will offer lots of very neat solutions.