Tag: CR2032

Related blog posts
  • Popularity 20
    2014-9-26 12:35
    2229 reads
    0 comments
    Now that I have discharged about 100 CR2032 coin cells under a number of power profiles, I have collected millions of data points and have shared the results here over the past months. These articles include:
      • How much energy can we generate from a coin cell?
      • Implementing reverse battery protection
      • UL coin cell requirements, and why you can't parallel two batteries to get more mAh
    I finished gathering more data some months ago but was too backlogged to reduce it to useful information. Finally that's done! The question I had was: suppose one applies a fixed load to a coin cell for a short period of time. Does the battery voltage change? That's a special case of a broader question: everyone uses internal resistance (IR) to characterize these cells. Is IR really an accurate way to model their behavior?
    For this experiment I discharged 9 CR2032s. Most of the time there was only a 0.5 mA background load to run the batteries down, but every two hours the test jig applied either a 10 mA or a 30 mA load for one second. That is, at the 2 hour mark the cells saw a 10 mA load; at four hours it was 30 mA. The system read voltages immediately after applying the load, every 10 ms until 100 ms had elapsed, and then at 100 ms intervals.
    Here are the results for 30 mA. The horizontal axis is time, but I left it unlabeled as it could be weeks, months or years depending on the discharge profile. The blue line is the battery's loaded voltage; the other lines are the internal resistances during the one-second interval. Note that the bottom red line is the mean IR for 9 batteries at 0 ms, immediately after slamming on the load. All of the other data points are tightly grouped. In other words, the IR goes up significantly (about 10%) within milliseconds of the load appearing, but there's little change after that.
    So IR is not an accurate model of coin cell behavior. It's not bad; for an engineering analysis it's probably close enough. But there is some other effect from the battery chemistry going on.
    The results are clearer with less data. Here the red line is the IR at 0 ms; grey is at 10 ms, and yellow at 1 second. Especially near the end of life we see a big increase in IR from 0 to 10 ms, but not much more from 10 ms to 1000 ms. Yes, the effect is different when the battery hasn't run down much, but then the voltage is higher and the IR is so low that the increase isn't particularly important. With a 10 mA load the results are about the same.
    The bottom line is that the voltage the MCU sees when it first wakes up is not what will be there mere milliseconds later. Figure on another 10% drop, on top of all of the serious losses I've detailed.
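    The IR figures above fall out of simple arithmetic on the raw samples: at each sample time, IR is the drop from the unloaded voltage to the loaded voltage, divided by the load current. A minimal sketch of that calculation; the voltage values here are hypothetical placeholders for illustration, not my measured data:

    ```python
    # Estimate internal resistance from a step-load test.
    # IR(t) = (V_unloaded - V_loaded(t)) / I_load
    # Sample voltages below are made up for illustration only.

    I_LOAD = 0.030  # 30 mA test load, in amps

    v_unloaded = 2.90  # volts, read just before the load is applied
    # (time in ms, loaded voltage) samples during the one-second pulse
    samples = [(0, 2.60), (10, 2.57), (100, 2.565), (1000, 2.56)]

    for t_ms, v in samples:
        ir = (v_unloaded - v) / I_LOAD
        print(f"{t_ms:5d} ms: IR = {ir:.1f} ohms")
    ```

    With these illustrative numbers the IR at 10 ms comes out about 10% higher than at 0 ms, which is the shape of the effect described above.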
  • Popularity 17
    2014-5-23 20:24
    1533 reads
    0 comments
    A few readers have written about how they defend against a dying battery using a brown-out reset (BOR) circuit, a feature pretty much every low-power microcontroller has. Unfortunately, most are poorly designed and/or lightly documented, and should not be used.
    One popular ultra-low-power MCU, from a vendor who noisily advertises two to three decades of coin cell operation, has a BOR which is off by default. On, it requires somewhere between 8 uA (typical) and 16 uA (worst case). Remember that, for a decade of life from a CR2032, the system's average power draw cannot exceed 2.5 uA. Even if one were to foolishly use the typical number, the BOR by itself will deplete the battery in just a couple of years.
    Another vendor, pushing extremely-low-power ARM parts, rates the BOR at 0.49 to 1.26 uA – typical. There's no max listed; there's not the slightest guidance about what statistical distribution one can expect when making thousands of systems. 1.26 uA eats half the 2.5 uA budget. In another case the datasheet says of the BOR: "In the deeper sleep modes, this will contribute significantly to the total current consumption." As Deming said: "In God we trust; all others bring data." They didn't, and it strikes me as risky to expect divine intervention instead of doing careful analysis. I'd be wary of using these uncharacterized parts in long-lived applications.
    As always, read the datasheets pessimistically and carefully. When does the BOR interrupt occur? In some cases the voltage range is very wide; an example is an MCU where it is listed as 2.05 (min), 2.2 (typ), and 2.35 volts (max). Doing worst-case design you'd have to assume that the BOR may occur at 2.35 volts.
    The following graph is one I have used before; it shows the voltage a CR2032 can deliver as it discharges, as a function of load. The 10 mA load line is pretty common for an awake MCU with enabled peripherals; 30 mA is not unexpected if there's any sort of wireless comm going on.
    The heavy red line is at 2.35 volts, the max value at which we may see a BOR interrupt. Note that in this case the BOR circuit will signal a dead battery, if there's a 30 mA load, shortly after you ship the product. With a 10 mA load the battery-dead indication happens when there's still 31% of capacity left. In other words, the BOR circuit is useless.
    A much better alternative is to leave the BOR off and use the A/D. Wake up from time to time, apply a load similar to what the system normally needs when awake, and read Vcc. If it's near 2.0, signal a battery failure (or take whatever action your system needs in this situation). If such a reading is taken once every ten hours (surely more often than needed), and 10 mA were pulled for 1 ms, over a decade this eats only 0.03 mAh, or about 0.01% of a CR2032's capacity. And, by manually reading the A/D, a more nuanced approach to battery management is possible than by relying on a single, poorly-controlled BOR voltage.
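    The energy cost of that A/D approach is easy to verify. A sketch of the arithmetic, using the numbers from the text above:

    ```python
    # Decade-long cost of a periodic loaded-battery A/D check:
    # a 10 mA load for 1 ms, once every ten hours, for ten years.
    pulse_ma = 10.0                    # test load, mA
    pulse_hours = 0.001 / 3600.0       # 1 ms expressed in hours
    checks_per_decade = 10 * 365.25 * 24 / 10  # one check per ten hours

    used_mah = pulse_ma * pulse_hours * checks_per_decade
    print(f"capacity used: {used_mah:.3f} mAh")
    print(f"fraction of 225 mAh: {used_mah / 225 * 100:.3f} %")
    ```

    The result is roughly 0.02–0.03 mAh over the decade, around a hundredth of a percent of the cell's 225 mAh: negligible, exactly as claimed.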
  • Popularity 14
    2014-4-24 19:14
    1529 reads
    0 comments
    I worked on some experiments to determine how coin cells behave. This was motivated by what I consider outrageous claims, made by a number of MCU vendors, that their processors can run for several decades from a single CR2032 cell. Some vendors divide their MCU's sleep current into the battery's 225 mAh capacity to get these figures. Of course, no battery vendor I've found specifies a shelf life longer than a decade (at least one was unable to define "shelf life"), so it's folly, or worse, to suggest to engineers that their systems can run far longer than the components they're based on will last. Conservative design means recognizing that ten years is the maximum life one can expect from a coin cell. In practice, even that will not be achievable.
    There's also a war raging about which MCUs have the lowest sleep currents. Sleep current is, to a first approximation, irrelevant. But how do coin cells really behave in these low-power applications? I've been discharging CR2032s with complex loads applied for short periods of time and have acquired millions of data points.
    My CR2032 experiment: a small ARM controller applies various loads to batteries being discharged and logs the results.
    The following results are for 42 batteries from Duracell, Energizer, and Panasonic. For each vendor I ran two groups of cells, each group purchased months apart from distributors located in distant states, in hopes that these represent different batches. (The devices are not marked with any sort of serial or batch numbers.)
    First, the weird part. Our local grocery store sells these cells for up to $5 each. Yet Digi-Key wants only $0.28 for a Panasonic and $0.40 for an Energizer – in singles. Duracells are harder to find from commercial distributors, but I paid about a buck each from on-line sources (e.g., Amazon).
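    The sleep-current-into-capacity arithmetic shows where the decades claims come from, and why the shelf-life ceiling makes them meaningless. A sketch; the 50 nA deep-sleep figure is a typical vendor-datasheet number used for illustration, not one of my measurements:

    ```python
    # Naive battery-life math vs. the shelf-life ceiling.
    CAPACITY_MAH = 225.0      # conservative rated CR2032 capacity
    SHELF_LIFE_YEARS = 10.0   # longest shelf life any battery vendor specifies

    def naive_life_years(sleep_ua):
        """Life obtained by dividing sleep current into rated capacity."""
        hours = CAPACITY_MAH / (sleep_ua / 1000.0)  # uA -> mA
        return hours / (365.25 * 24)

    sleep_ua = 0.05  # hypothetical 50 nA deep-sleep current
    print(f"naive life: {naive_life_years(sleep_ua):.0f} years")  # absurd
    print(f"honest ceiling: {min(naive_life_years(sleep_ua), SHELF_LIFE_YEARS):.0f} years")
    ```

    The naive division yields centuries of "life"; capping it at the battery's own shelf life is the first dose of realism, before IR losses shrink it further.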
    I found little battery-to-battery variability (other than one obviously bad Panasonic and one bad Duracell), little vendor-to-vendor difference, and likewise different batches gave about the same results.
    What parameters matter? Chiefly capacity (how many milliamp-hours one can really get from a cell) and internal resistance, which varies with the capacity used. All of the vendors say "dead" is at 2.0 volts. The following graph shows the average voltage for the batteries from each vendor, as well as the worst-case voltage from each vendor, as they discharge at a 0.5 mA rate. The curve ascending from left to right is the cumulative capacity used. By the time 2.0 volts is reached the capacity variation is in the noise. I found it averaged 233 mAh, with a standard deviation across all results of 5 mAh. Energizer's and Duracell's datasheets are, uh, a bit optimistic; Panasonic says we can expect 225 mAh from a cell, which, given this data, seems a good conservative value to use.
    Battery discharge data
    But in practice you won't get anything near that 225 mAh. As cells discharge, their internal resistance (IR) goes up. Actually, this is not quite correct, despite the claims of all the published literature I have found. Other results, which I'll report on in a later column, show that something more complex than simple resistance is going on, but for now IR is close enough. The next chart shows the average IR for each vendor's products, plus the IR's standard deviation.
    Internal resistance and its standard deviation
    So what does this all mean to a cautious engineer? The IR grows so quickly that much of the battery's capacity can't be used! First, the average IR is not useful. Conservative design means using worst-case figures, which we can estimate from the measured standard deviation. Using three sigma, our odds of being "right" are 0.997. The following graph combines the IR plus three-sigma IR to show what voltage the battery will deliver, depending on load.
    Voltage delivered from battery depending on load
    If a system, when awake, draws just 10 mA, 88% of the battery's capacity is available before the voltage delivered to the load drops to 2.0. And it's pretty hard to build a useful system that needs only 10 mA. Some ultra-low-power processors are rated at 200 uA/MHz with a 48 MHz max – almost 10 mA just for the CPU. With higher loads, like any sort of communications, things get much worse. Bluetooth could take 80 mA, and even Bluetooth LE can suck nearly 30 mA. At 30 mA only 39% of the battery's rated capacity can be used. An optimist might use two sigma and suffer 5% of his systems not working to spec, but that only increases the useful capacity to 44%.
    The battery will not be able to power the system long before it is really "dead," long before the system's design lifetime, and long before the time MCU vendors cite in their white papers. (Some MCUs will run down to 1.8 volts, so vendors might say my cutoff at 2.0 is unfair. Since battery vendors say 2.0 is "dead," I disagree. And even running to 1.8 V gains less than 5% in useful capacity.)
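    The worst-case curves come from subtracting the three-sigma IR drop from the cell's open-circuit voltage. A minimal sketch of that calculation; the open-circuit voltage, IR mean, and sigma below are illustrative placeholders, not my measured data:

    ```python
    # Worst-case loaded voltage: V = Voc - I * (IR_mean + n*sigma)
    def loaded_voltage(v_oc, i_load_a, ir_mean, ir_sigma, n_sigma=3):
        """Voltage delivered to the load, using an n-sigma worst-case IR."""
        return v_oc - i_load_a * (ir_mean + n_sigma * ir_sigma)

    # Hypothetical numbers for a partly-discharged cell, illustration only
    v_oc, ir_mean, ir_sigma = 2.8, 20.0, 3.0  # volts, ohms, ohms

    for i_ma in (10, 30):
        v = loaded_voltage(v_oc, i_ma / 1000.0, ir_mean, ir_sigma)
        status = "dead" if v < 2.0 else "ok"
        print(f"{i_ma} mA load -> {v:.2f} V ({status})")
    ```

    With these placeholder numbers the cell still looks healthy at 10 mA but already reads "dead" at 30 mA, which is exactly the mechanism that strands so much rated capacity under heavier loads.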
  • Popularity 15
    2012-11-5 18:09
    1681 reads
    1 comment
    There's a battle going on over super-ultra-extremely-insanely-low-power microcontrollers. Vendors claim their parts will operate on little more than the promise of a photon's worth of energy. Some pledge 20 years of operation on a single coin cell. But there is a lot of fairy dust in these claims.
    First, despite lots of marketing blather to the contrary, to my knowledge all of the 32-bit devices are completely out of the picture when it comes to these extremely low power levels. Second, the low-power MCUs do offer incredibly-low-power options. Truly astonishing. But one must look at system-wide issues. For instance, that two-decade dream assumes the processor will be in a deep sleep mode most (like 99.9%) of the time, because in deep sleep modes these parts consume just tens of nanoamps. Applications that require a usually-alert CPU will kill the battery much faster.
    Then there's the battery. The common CR2032 is usually referenced. Panasonic's version is good for 200 mAh with a 100 Kohm load. That's 30 microamps, enough to run some of these MCUs at a reduced frequency, generally with various features and peripherals disabled. If the processor is awake, 200 mAh at 30 microamps is less than a year of operation. And by that time the battery will be down to 2 V, so the system had better be designed to work at that level.
    But there are plenty of other issues. The microcontroller doesn't live in a vacuum; it's part of the system. Other components draw power, and it's the design engineer's job to evaluate those drains. Some may be surprising. I have a lot of evaluation boards here; many are designed to show off the long-battery-life aspects of a particular processor. It's interesting to examine the designs. I put one of the boards under a microscope and looked at some of the ancillary parts. There's a 22 uF decoupling capacitor. No real surprise there. It appears to be one of those nice Kemet polymer tantalums designed for high-density SMT applications.
    The datasheet pegs the cap's internal leakage at 22 uA. That means the capacitor draws a thousand times more current than the dozing CPU. In fact, the cap alone will drain the battery in under a year (typical values, which are undocumented, are probably better, but who designs for anything other than worst case?).
    A capacitor has parasitics, and in simplified form looks like this:
    Usually the ESR (equivalent series resistance) is pretty low; for caps that are not wound the ESL, too, is low. One would hope that Rleak would be infinite, but for the parts we usually use on power and ground that's just not the case. Some datasheets proudly proclaim Rleak values of 50 million ohms, which sounds great. But at 3 V that's much more of a load on the battery than the sleeping MCU.
    Leakage is often specified in ohm-farads, or megohm-microfarads. Many of AVX's ceramic MLCC offerings are specified at 1000 Ω-farads. A 22 uF part is therefore about 50 Mohm, completely out of bounds for decades-long power from a small battery. Also note that dielectrics can be very sensitive to temperature. Going from 20C to 80C on some devices will increase leakage by an order of magnitude.
    Perusing various sites, it's rather astonishing how few capacitor vendors specify leakage. A lot of them claim "low leakage" but don't give numbers. That's about as useless as a campaign pledge. Kemet's excellent "Application Notes for Tantalum Capacitors" does show how derating the voltage can help. Operate the cap at a third of its rated voltage and the leakage will typically decrease by an order of magnitude. Go to ten percent of the rated level and leakage dives by another factor of five. But in a three-volt system that means a 30 V capacitor – bigger and pricier. These deratings are "typical" numbers. Use with caution.
    But do battery-operated systems need those big bulk caps? Aren't they only for filtering power supplies? The answer is: it depends.
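    An ohm-farad spec converts to a leakage current in two steps: Rleak = (Ω·F rating) / C, then I = V / Rleak. A sketch using the 1000 Ω-farad figure mentioned above:

    ```python
    # Leakage of an MLCC specified in ohm-farads.
    OHM_FARADS = 1000.0  # insulation-resistance spec, in ohm-farads
    C = 22e-6            # 22 uF bulk capacitor
    V = 3.0              # coin-cell supply voltage

    r_leak = OHM_FARADS / C          # ~45 Mohm ("about 50 Mohm")
    i_leak_na = V / r_leak * 1e9     # leakage current in nanoamps
    print(f"Rleak = {r_leak / 1e6:.1f} Mohm")
    print(f"leakage at {V} V = {i_leak_na:.0f} nA")
    # A sleeping MCU at ~20 nA: the cap leaks several times more.
    ```

    So even a "50 million ohm" part leaks on the order of 60-plus nanoamps at 3 V, dwarfing a tens-of-nanoamps sleeping MCU.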
    When the CPU wakes up, the entire system comes to life. The processor and all of the required peripherals and real-world interfaces will suddenly make serious demands on the battery. A CR2032 has a series resistance on the order of tens of ohms, but that increases as its capacity is used up. Your system may require tens of mA for a short time. Near the battery's end of life, when it is struggling to put out a bit over 2 V, its series resistance could be enough to cause the system to be under-powered. A capacitor has a very low series resistance (aka ESR), and can provide the needed power.
    TI has a white paper about this. They claim a CR2032 can exhibit as much as 1000 Ω of series resistance as it becomes seriously discharged! One mA at 1000 Ω is a 1 V drop, which will certainly crash the system. Even if you discount TI's number by an order of magnitude, a not-unreasonable 10 mA load will have the same effect.
    So what kind of cap has very low leakage and a decent capacitance rating? I can't find any above 10 uF that will allow a system to run for two decades on a coin cell. Probably the only option is to exploit the 10–50X improvement that comes from derating a high-voltage capacitor. But even that will swamp the MCU's consumption.
    Capacitors are the tip of the iceberg. If the battery is installed by a worker who isn't wearing gloves, what is the effect of the finger oils left on it? If a sleeping CPU uses, say, 20 nA, then it seems logical to keep other drains no larger. At 3 V a 150-million-ohm resistor draws 20 nA. It's not unreasonable to imagine finger oils in the 150-meg range. Maybe it's much worse – I can't find any empirical data. FR4 PCB material is somewhat hygroscopic. Here in Maryland in the US the summertime humidity soars seemingly to infinity. Must boards be conformally coated? What is the conductivity of the solder flux residue? Does the board cleaner leave low-impedance swarf behind?
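    Whether a bulk cap can carry a wake-up surge is a quick back-of-the-envelope calculation: compare the battery's IR drop with the droop the cap suffers over the pulse (dV = I·t/C). A sketch with illustrative numbers; the 1 ms pulse width is my assumption, not from the text:

    ```python
    # Battery IR drop vs. capacitor droop during a wake-up current pulse.
    i_pulse = 0.010   # 10 mA wake-up surge
    r_batt = 100.0    # ohms: TI's 1000-ohm end-of-life figure, discounted 10x
    c_bulk = 22e-6    # 22 uF bulk capacitor
    t_pulse = 0.001   # hypothetical 1 ms pulse

    drop_batt = i_pulse * r_batt             # 1.0 V: crashes a ~2 V system
    droop_cap = i_pulse * t_pulse / c_bulk   # dV = I*t/C, about 0.45 V
    print(f"battery IR drop: {drop_batt:.2f} V")
    print(f"cap droop over {t_pulse * 1e3:.0f} ms: {droop_cap:.2f} V")
    ```

    Note that even the 22 uF cap droops almost half a volt over a 1 ms pulse with these numbers; the cap only wins for short pulses, and the required capacitance scales with pulse length.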
    There are a lot of ways nanoamp leaks can accumulate, and it won't take much to seriously impact decades-long battery life. The nanoamps consumed by the MCU are just part of the equation – perhaps a small part. There is good news for designers: if the batteries start failing at the five- or ten-year mark, probably no one will remember who is at fault.