Circuit models are the heart of worst-case circuit analysis (WCCA). For simulations to be valid, you must gather and vet models or create them yourself; you can't usually rely on the manufacturer's model. Your models must correlate to datasheets or test data, and then you need to add tolerances to those models. Your goal is to create an accurate model with the "proper" fidelity. Too much fidelity results in high costs, but models that are too simple or inaccurate can result in bogus outcomes.

Modeling was my first assignment out of college and much of the material I published while at Intusoft is still relevant today. Reference 1 includes a link for a free download I wrote some years ago on modeling diodes and BJTs.

The WCCA for analog functional blocks (power systems, linear circuits) is best performed using simulation (usually SPICE or a board-level simulator such as ADS). In some cases, we have reviewed WCCAs that used 100% math (Mathcad) or 100% simulation. Neither approach is optimal for all types of analyses, and some analyses are outright impractical in one or the other. You can't easily use math to perform many nonlinear analyses like transients or frequency-domain stability. On the other hand, SPICE is overkill for many steady-state assessments. Most analog WCCAs are about 50-50 math vs. simulation from a methodology standpoint. Digital WCCA is a different story: there, it is commonplace for all the analysis to be simulation based, and you are largely dependent on manufacturer-provided IBIS models.

SPICE is a powerful tool, but you can easily get yourself into trouble and not know it. All it takes is one incorrect parameter in a sea of models, each with its own subcircuit or model parameter set, to invalidate the simulation’s results (Ref. 2).

For most SPICE-based analyses, more than half the work scope is taken up deriving a believable, supportable, and correlated nominal model. Correlation is critical. Part and circuit models must be anchored to something known, at least nominally. How can we expect to perform parametric extreme value analysis (EVA) or Monte Carlo analysis using worst-case tolerances if the nominal model isn't even within the band of initial tolerances? You can't just take a nominal model from a vendor, slap tolerances onto it, and assume the results are valid for all circuit configurations and operating ranges.
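
To make the EVA idea concrete, here is a minimal sketch using a hypothetical resistor divider (all values are illustrative, not from the article): every tolerance corner of the toleranced parameters is evaluated and the extremes reported. Real WCCA applies the same idea across all toleranced parameters of a correlated model.

```python
# A minimal EVA sketch (hypothetical divider): sweep every tolerance corner
# of a resistor divider and report the extreme outputs.
from itertools import product

def divider_vout(vin, r1, r2):
    """Output of a simple resistive divider."""
    return vin * r2 / (r1 + r2)

tol = 0.01                       # assumed +/-1% resistors
R1, R2, VIN = 10e3, 10e3, 5.0    # hypothetical nominal values

# Evaluate the output at all four tolerance corners (EVA).
corners = [divider_vout(VIN, R1 * (1 + s1 * tol), R2 * (1 + s2 * tol))
           for s1, s2 in product((-1, +1), repeat=2)]
print(f"EVA min/max Vout: {min(corners):.4f} V / {max(corners):.4f} V")
```

The corner count grows as 2^N with N toleranced parameters, which is one reason Monte Carlo is used alongside EVA for larger circuits.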

**Figure 1** shows a simple test circuit for simulating MOSFET transconductance (gFS). L1 and C2 are used to “open the loop” allowing you to measure the transfer function from the gate to the output, while maintaining a closed DC loop (Ref. 3).

**Figure 1. This circuit model lets you simulate a MOSFET's transconductance (gFS)**.
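
The reason the L/C trick works can be sanity-checked outside SPICE. Below is a sketch using typical "loop-opening" element values (the 1 kH/1 kF values are assumptions, not read from Figure 1): the huge series inductor and huge injection capacitor keep the loop closed at DC but open it at every AC analysis frequency.

```python
# Sketch: the series inductor is a short at DC but a huge impedance at AC,
# while the injection capacitor is open at DC but a short at AC.
import math

L1, C2 = 1e3, 1e3   # hypothetical loop-opening values: 1 kH and 1 kF

def zl(f):
    """Magnitude of the series inductor's impedance at frequency f."""
    return 2 * math.pi * f * L1

def zc(f):
    """Magnitude of the injection capacitor's impedance at frequency f."""
    return 1 / (2 * math.pi * f * C2)

for f in (1.0, 1e3, 1e6):   # at all AC frequencies the loop is effectively open
    print(f"{f:>9.0f} Hz: |Z_L| = {zl(f):.3e} ohm, |Z_C| = {zc(f):.3e} ohm")
```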

**Figure 2a** (left) shows the circuit simulation of a fitted model made by AEi Systems, while **Figure 2b** (right) shows the breadboard measured at load currents of 30 µA, 250 µA, 1 mA, 10 mA, and 50 mA for comparison to the model performance. The motivation here is that most SPICE models for FETs are not accurate for linear operation, so we created one. The SPICE models are generally set up for hard-switching applications, and VGS isn't accurate at low operating currents. The data usually isn't in the datasheet, but that's another story.

**Figure 2. We created our own simulation (a) because many MOSFET models don't cover low operating current. (b) shows the measured performance.**

**Figure 3** shows the same gFS data from a vendor-supplied SPICE model. The difference between Fig. 3 and Fig. 2a (our model) is clear.

**Figure 3. The same simulation using the SPICE model for the IRF230 from IRF.com. Note that in this case, the model roughly portrays the gFS characteristic but doesn’t get the actual performance quite right. The MOSFET manufacturer didn’t prioritize or evaluate the gFS performance at low currents, and their subcircuit topology did not model it well.**

Transients (whether for part stress assessments or for circuit startup/EMC performance), AC analyses like stability, and any analysis whose outcome is not monotonic with respect to tolerances usually require some sort of simulation model.

So herein lies the problem. Most analysts rely on the component manufacturers to supply the part models, often without checking the validity of the model in their circuit application. A model needs checking for both the characteristics needed and the operating range over which the characteristics need to be accurate.

This may come as a surprise, but vendor models often lack the fidelity you need. Important characteristics aren't modeled, or are modeled only at certain specific operating conditions. This is not to say that the models are wrong, but that they are often not accurate under the conditions you need. In most cases, documentation is scant, buried in the netlist, or nonexistent. This is a huge problem. Models need documentation, and it's often inexplicably not available. Without documentation, you don't know over which operating conditions the models are good or even what characteristics the models portray. SPICE models, by their very nature, have limitations. The trick is to know them and adapt the models accordingly.

How do you know if a model is any good? You must build test circuits that emulate the datasheet's test circuits and correlate all the parameters that must be right for your simulation. Then, you must correlate the entire application circuit model to test data or practical theory of some kind. Only at that point can you apply tolerances and run worst-case scenarios. In **Figure 4a**, you can see in the top-left plot how the vendor voltage-reference model did not model the output impedance at all. Therefore, it could not be used in transient or AC analyses. In **Figure 4b**, the vendor model is first order only and not very accurate.

**Figure 4. There are two aspects to SPICE models that need to be verified: does the model exhibit the characteristic of interest, and how accurately is the characteristic modeled over the operating conditions needed? In (a), the model didn't portray the output impedance at all. In (b), the model varied the forward voltage with current, but not very accurately.**

Knowing how to do this requires knowledge of how to model each individual part using the simulator's syntax and available constructs, what characteristics are important, and how to spot when the circuit model doesn't behave properly. The debugging can be time-consuming and frustrating. It's simply not likely you will "luck into" a usable model without extensive experience in both the application and the parts involved. This is particularly true if you are trying to perform a WCCA on a circuit that has yet to be built and you have no test data.

End-of-life or worst-case tolerance models are normally not provided, so if the model isn't encrypted, you will have to learn where the parameter "knobs" are in the netlist so that you can apply tolerances to the characteristics to which the circuit is sensitive. This is a bit of a learned art, as subcircuits from different manufacturers for the same part type (e.g., FETs) are different.

In addition, most SPICE parameters don't directly relate to a datasheet counterpart. For instance, in a diode there are three SPICE parameters used to fit the diode’s forward V-I response: N, IS, and RS. So, if you're only matching a few data points, there are multiple combinations that will get you there, though some can be physically unrealizable. More importantly, if you choose the wrong set, the model may not operate correctly outside of those data points.
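
This ambiguity is easy to demonstrate with the standard SPICE DC diode equation and illustrative (not measured) values: two different (N, IS, RS) sets are forced through the same two fit points, yet disagree by roughly 100 mV at a current outside the fit range.

```python
# Sketch: two (N, IS, RS) sets that fit the same two V-I points but diverge
# elsewhere. All parameter values are invented for illustration.
import math

VT = 0.02585  # thermal voltage near 27 C, volts

def vf(i, n, i_s, rs):
    """Diode forward voltage from the SPICE DC model (assuming I >> IS)."""
    return n * VT * math.log(i / i_s) + rs * i

# Set A: the "true" parameters, used to generate two fit points.
set_a = dict(n=1.8, i_s=1e-9, rs=0.05)
i1, i2 = 1e-3, 10e-3
v1, v2 = vf(i1, **set_a), vf(i2, **set_a)

# Set B: force a different N, then solve RS and IS so both points still fit.
n_b = 1.6
rs_b = (v2 - v1 - n_b * VT * math.log(i2 / i1)) / (i2 - i1)
i_s_b = i1 / math.exp((v1 - rs_b * i1) / (n_b * VT))
set_b = dict(n=n_b, i_s=i_s_b, rs=rs_b)

# Both sets match at the two fit points...
print(abs(vf(i1, **set_b) - v1), abs(vf(i2, **set_b) - v2))
# ...but diverge at 100 mA, well outside the fitted range.
print(vf(100e-3, **set_a), vf(100e-3, **set_b))
```

Note that set B's RS comes out near 1.4 Ω, far from set A's 0.05 Ω, which is exactly the kind of physically questionable fit the text warns about.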

**Nominal isn't enough**

For WCCA, it's not enough to get a nominal model from the part manufacturer. You must be able to apply tolerances to it. Many datasheet characteristics are modeled with groups of components, so applying datasheet tolerances may not be simple or obvious. You must get into the netlist and figure it out. If you build your own models, you can configure them to be easily toleranced.

In this case, a capacitor is modeled with a ladder subcircuit (**Figure 5**), and its impedance/ESR can be scaled between its minimum and spec maximum using a voltage-source multiplier. The model is tested to see its tolerance distribution so that when it is used in a Monte Carlo analysis (**Figure 6**), it's clear how it will perform.
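
A simplified stand-in for this scaling idea (a single series R-L-C in place of the actual ladder subcircuit of Figure 5; all component values are assumed): one multiplier k slides the model's ESR between its minimum and spec-maximum values, exactly as the voltage-source multiplier does in the SPICE netlist.

```python
# Sketch: a series R-L-C capacitor model whose ESR is set by a single
# scaling knob k. Values are illustrative, not from a real part.
import math

C, ESL = 100e-6, 2e-9            # hypothetical 100 uF capacitor, 2 nH ESL
ESR_MIN, ESR_MAX = 0.005, 0.020  # assumed ESR band, ohms

def z_cap(f, k):
    """Complex impedance with ESR interpolated by k in [0, 1] (0=min, 1=max)."""
    esr = ESR_MIN + k * (ESR_MAX - ESR_MIN)
    w = 2 * math.pi * f
    return esr + 1j * (w * ESL - 1 / (w * C))

# At series resonance the reactance cancels, so |Z| equals the scaled ESR.
f_res = 1 / (2 * math.pi * math.sqrt(ESL * C))
print(abs(z_cap(f_res, 0.0)), abs(z_cap(f_res, 1.0)))
```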

**Figure 5. A ladder subcircuit lets you model a capacitor.**

**Figure 6. To truly know how a part will perform during a Monte Carlo analysis, you can perform a simplified Monte Carlo analysis of just the part’s characteristic of interest--equivalent series resistance (ESR) in this case. This way, the statistical performance of the model can be proven before use in a full circuit simulation. A distribution-free Monte Carlo analysis using 1440 runs shows the worst end points just matching the spec maximum and assumed worst-case minimum.** (My next article will cover Monte Carlo analysis.)
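
The endpoint check of Figure 6 can be sketched in a few lines. The ESR limits and seed below are assumptions; the 1440-run count comes from the caption. Sampling the ESR knob uniformly ("distribution-free") and confirming the extremes land at the spec limits proves the model's statistical behavior before it goes into a full-circuit Monte Carlo.

```python
# Sketch: a single-part Monte Carlo on ESR alone, verifying the sampled
# extremes approach the spec limits. Limits are hypothetical.
import random

ESR_MIN, ESR_MAX = 0.005, 0.020   # assumed worst-case min / spec max, ohms
random.seed(7)                    # fixed seed for repeatability

runs = [ESR_MIN + random.random() * (ESR_MAX - ESR_MIN) for _ in range(1440)]

print(f"sampled min = {min(runs):.5f} ohm (limit {ESR_MIN})")
print(f"sampled max = {max(runs):.5f} ohm (limit {ESR_MAX})")
```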

If the model is encrypted (meaning you can’t read its underlying netlist), you’re at a dead end. Applying tolerances to all the parts around the fixed part may be useful, but it won’t be worst case. For instance, many power IC models are encrypted. That means their control-loop response (bandwidth, poles/zeros, etc.) is fixed. Over life, bandwidths can vary substantially. Stability, for instance, may vary widely depending on the gain/phase response.
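
As an illustration of why a frozen loop model hides risk, here is a hedged sketch: a hypothetical integrator-plus-pole loop gain, showing how phase margin erodes as the crossover frequency drifts toward a fixed parasitic pole. Both frequencies are invented for the example, and the crossover approximation assumes the integrator's unity-gain frequency sits at or below the pole.

```python
# Sketch: phase margin of T(s) = (2*pi*f_cross/s) / (1 + s/(2*pi*F_POLE)),
# evaluated as the crossover drifts over life. All values are hypothetical.
import math

F_POLE = 50e3  # assumed fixed parasitic pole, Hz

def phase_margin_deg(f_cross):
    """PM = 180 + phase(T) at crossover; the integrator contributes -90 deg,
    the pole contributes -atan(f_cross / F_POLE)."""
    return 90.0 - math.degrees(math.atan(f_cross / F_POLE))

for fc in (10e3, 20e3, 40e3):  # nominal and end-of-life crossover shifts
    print(f"crossover {fc/1e3:.0f} kHz -> PM = {phase_margin_deg(fc):.1f} deg")
```

An encrypted model frozen at the nominal crossover would report only the first, healthiest number.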

Inevitably, you'll have to vet and create your own part models. Whether it's passive parts, semiconductors, or op-amps, you'll run into cases where models are suspect and must be altered, or where they don't exist at all. For power supply modeling, you will have to learn and understand state-space averaging techniques; there are many books and papers on this topic by Christophe Basso and Steve Sandler (Refs. 4 and 5).

It just takes a fair amount of time and experience to learn how to tweak a model because each part type is different and for most ICs there is no standard template. Two tools that are quite useful in this regard are the OrCAD PSpice Model Editor and Intusoft’s SpiceMod program.

Finally, you must create the model for the entire circuit portion you want to simulate. That means assessing whether interconnects and PCB parasitics play a material role in the fidelity of the results, as well as correlating the circuit to any test data that might (hopefully) be available. Again, the idea is to build confidence in the model’s fidelity and realism before you estimate the worst behavior.

**Invalid assumption**

“The effects of the PCB are assumed not to impact the analysis.” Over the years, much of the analog WCCA performed hasn't included PCB effects. This was an assumption, and for the most part a reasonable one. That assumption is becoming less and less accurate. The bandwidth of power ICs and the edge speeds of the dynamic load currents they supply are increasing, while voltage regulation margins are dropping. In an increasing number of cases, you must account for the impact of the power distribution network (PDN) impedance, distributed interconnect impedances, power rail planes, interconnects, connectors, backplanes, and decoupling. That means basic power integrity analyses like stability, ripple, startup, and load step will require much more advanced (i.e., expensive) finite element analysis tools that can incorporate the PCB, component interconnect parasitics, and the voltage regulator module's output impedance into the simulation (Ref. 6).
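
A first-order flavor of the PDN check can be sketched without field-solver tools (all component values and the target impedance below are assumptions; real power integrity analysis uses extracted plane and interconnect models): compare the parallel impedance of a decoupling bank against a target impedance derived from allowed droop over load step.

```python
# Sketch: |Z| of a hypothetical decoupling bank vs. a target impedance
# Z_target = allowed ripple / load-current step. Values are illustrative.
import math

def z_cap(f, c, esr, esl):
    """Series R-L-C impedance of one decoupling capacitor."""
    w = 2 * math.pi * f
    return esr + 1j * (w * esl - 1 / (w * c))

# Hypothetical bank: (C, ESR, ESL) for bulk plus two ceramic decouplers.
bank = [(100e-6, 0.010, 3e-9), (10e-6, 0.005, 1e-9), (1e-6, 0.010, 0.5e-9)]

def z_pdn(f):
    """Parallel combination of the bank (admittances sum)."""
    return 1 / sum(1 / z_cap(f, *cap) for cap in bank)

z_target = 0.030 / 2.0   # 30 mV allowed droop on a 2 A step -> 15 mohm

for f in (10e3, 100e3, 1e6):
    flag = "OK" if abs(z_pdn(f)) <= z_target else "VIOLATION"
    print(f"{f/1e3:>7.0f} kHz: |Z_pdn| = {abs(z_pdn(f))*1e3:.2f} mohm ({flag})")
```

This lumped sketch ignores plane spreading inductance and via parasitics, which is precisely what the more advanced tools add.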

With SPICE, it’s garbage in, garbage out. But with good correlation and high-fidelity models, the results can be dead-on accurate. In this simulation of the power bus in the Space Station (**Figure 7**), AEi Systems predicted that a bus dropout would trigger a master computer reset every time the spacecraft went through an eclipse. The analysis results were ignored and the hardware was built anyway; two years later, the picture on the right showed up with a little yellow sticky note.

**Figure 7. The simulation showed a glitch in a spacecraft's master reset that occurred whenever the spacecraft passed through an eclipse.**

In summary, model development, correlation, and tolerancing are the hardest and most time-consuming parts of WCCA. It can take a long time to achieve a viable model, but it is necessary to complete much of the analysis.

**References**

1. Hymowitz, Ober, Robson, Horita, “Definitive Handbook of Transistor Modeling.”
2. Ho, Sandler, Hymowitz, “SPICE models need correlation to measurements,” EDN, June 2014.
3. Sandler, Hymowitz, “SPICE Model Supports LDO Regulator Designs,” Power Electronics, 2005.
4. Basso, “Switch-Mode Power Supplies: SPICE Simulations and Practical Designs,” ASIN: B012HU9XIU, 2010.
5. Sandler, “Switched-Mode Power Supply Simulation with SPICE: The Faraday Press Edition,” ISBN-13: 978-1941071847, 2018.
6. Sandler, “Power Integrity: Measuring, Optimizing, and Troubleshooting Power Related Parameters in Electronic Systems,” ISBN-13: 978-0071830997, 2014.
