Tag: voltage

Related blog posts
  • Popularity: 27
    2016-4-29 18:04
    1550 reads
    0 comments
    With all the attention given to low-power design and products, we tend to forget that there's another world out there with power levels many orders of magnitude greater, and working with them is radically different in every respect. A recent article in IEEE Spectrum, "Inside the Lab That Pushes Supergrid Circuit Breakers to the Limit," was a dramatic reminder of the engineering challenges of anything having to do with the electric grid's high-tension power lines in the megavolt/kiloamp regime. The next time you hear engineers moan that their design has to "sip electrons," suggest they read this article; it should put their complaints in perspective. Forget everything you think you know about "electricity" when you are at these levels.

The article focuses on testing circuit breakers for these power lines, and it's truly another world. There's no need for me to reiterate the article; you can read it yourself. The author's detailed description of what a circuit breaker must do and withstand at these levels is astonishing: we're in the world of plasmas, gas quenching, switching of kiloamps in milliseconds, mechanical contact issues you won't be able to anticipate, and more. Every aspect of the test setup had to be custom built, as there are no off-the-shelf fixtures for this kind of work. The test facility even had to build a special generating and storage system to supply the power for the tests. Every decision and action requires careful, deliberate thinking and risk assessment.

I'm fascinated by engineers who are not only at the extremes of design along one or more performance or operational parameters, but must also devise and build ways to test their designs.
Sometimes, as in the case of the power-grid circuit breakers, a set of operational features works in the testers' favor: the tests are reasonably close to final conditions, they can be repeated as needed under carefully controlled conditions, and the devices can be modified and retested. However, not all tests of high-power systems (whether electrical, chemical, or mechanical) have this characteristic. Often, the test process is so complex and difficult to set up or execute that the test/modify/retest cycle is too expensive or time-consuming.

The implications of this point were clearly explained in the excellent book "Apollo: The Race to the Moon," where an "interlude chapter" steps back and provides a big-picture overview of the differences between aircraft test and big-rocket test. The authors paraphrase Joe Shea, Apollo Program Manager, as saying it didn't matter whether they tested the Saturn V rocket six times instead of four, or eight instead of six: statistically, the extra successes (if they were successes) would be meaningless; all they would do is use up precious, costly hardware, time, and dollars that could have been used for real missions.

There are other cases where a project can't be fully tested. The recently published book "The Right Kind of Crazy," about the Mars Curiosity rover mission, discussed the implications of the obvious: many aspects of space-exploration systems must be simulated, assessed, and analyzed to an extreme, because there is no way to replicate some of the real operating conditions. For the Mars mission, one such topic was the parachute that slowed the lander so the "sky crane" could hover and lower the rover to the Martian surface. You can replicate the extreme cold and vacuum of space, but how do you test deploying that parachute at hundreds of miles per hour in the Martian atmosphere and then using retrorockets to stabilize the platform in a low-g environment?
The answer is that you can't. What's your experience with higher voltages and currents? What's the highest power level for which you have had to design? What was your biggest surprise or most memorable example of culture shock?
  • Popularity: 24
    2015-2-5 18:35
    1582 reads
    0 comments
    Low-voltage design gets most of the attention these days, but there are many applications that require very high voltages even though they do not deliver significant current to the load.

It's easy to think that almost "everyone" is doing low-voltage designs with power-stingy, battery-operated circuits, but that's a simplistic and myopic perspective. There are well-known exceptions to the low-power world in applications that must deliver significant power to a load, such as a heater or motor. In those situations, using a higher voltage allows a lower current for a given power rating (P = V × I), minimizing IR loss and I²R heating. In these cases, current ratings in the tens or hundreds of amps are common.

But high voltage isn't always about delivering high power: there are many unavoidably high-voltage situations that are also fairly low current, often under 100 mA. This became evident when I walked the exhibit floor at the recent fall meeting of the Materials Research Society, the world's leading scientific organization for researching, developing, and applying new and existing materials. Approximately 6,800 attendees explored nanomaterials, ultrapure materials, cryogenics and ultravacuum chambers, high-temperature furnaces, sputtering, vapor deposition, specialized instrumentation, and much more.

We routinely and somewhat casually rely on many materials-related technologies that enable our high-tech advances and innovations; it's a synergistic relationship, of course, as these advances in turn drive new materials. Beyond the development of these materials, there are major issues in determining their electrical, physical, and chemical properties. After all, once you have created that amazing ultrapure nanomaterial, how do you measure its hardness?
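The P = V × I trade-off described above can be sketched numerically. This is a minimal illustration with assumed numbers (a 1 kW load and 0.1 Ω of wiring resistance, neither of which comes from the text): for a fixed delivered power, raising the voltage lowers the current and shrinks the I²R loss in the wiring quadratically.

```python
def line_loss(power_w, voltage_v, wire_resistance_ohm):
    """I^2 * R dissipation in the wiring for a given delivered power and voltage."""
    current_a = power_w / voltage_v          # I = P / V
    return current_a ** 2 * wire_resistance_ohm

POWER = 1000.0   # 1 kW load (assumed, for illustration)
R_WIRE = 0.1     # 0.1 ohm round-trip wiring resistance (assumed)

for v in (12.0, 48.0, 230.0):
    i = POWER / v
    loss = line_loss(POWER, v, R_WIRE)
    print(f"{v:6.0f} V -> {i:7.2f} A, wiring loss {loss:8.2f} W")
```

At 12 V the wiring eats hundreds of watts; at 230 V the same load loses only a couple of watts in the same wire, which is why high-power delivery favors higher voltages.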
Why the need for high voltage at low current, as seen at many exhibits at the MRS event? These systems and their instrumentation are not "power devices" in the conventional sense, and minimizing IR loss and I²R dissipation is not the primary concern. The need is simple physics: these systems require high voltages to steer electron beams, attract and accelerate particles, and change the energy state of atoms. I saw many specialty vendors whose supply product lines began at 10 kV, as well as many high-voltage supplies embedded within highly specialized analysis, fabrication, and measurement systems. Consumers also need voltages in the 10 kV range to power the magnetron in a microwave oven and, in past times, the venerable CRT of now-obsolete television displays.

The standard home microwave oven has a high-voltage supply, usually between 10 and 20 kV, to energize the 2.45 GHz magnetron inside; would knowing that detail scare the consumer? SOURCE: Wikipedia

For designers who have little or no exposure to high-voltage/low-current design, it's a very different world. There's little need to minimize IR loss with heftier connectors and conductors, since voltage drop is not a primary concern at these low currents. Instead, it's a world of thin conductors, thick insulation, safety interlocks, and mandated minimum physical spacing between conductors and anything nearby. It's also an unforgiving world where marginal design, inadequate attention to tiny details, and microscopic cracks in insulation can have dangerous consequences for equipment and users. Nothing is done quickly, easily, or casually: voltage/current monitoring, probing with test equipment during debug or repair, and designing for user access must all take into account high voltage and its tendency to go through any breach in the system's physical integrity.
Seeing a 25 kV rail cause spark-over in a poorly placed capacitor is an experience you won't forget (who, me?). Every component must have appropriate ratings and certifications, and even something as normally mundane as insulation material, thickness, and breakdown rating is a concern. Further, any tiny nicks in the insulation, whether they occur during the prototyping and debug stages or in production manufacturing, are also a concern. Routine electrical insulation and standard electronic signal-isolation techniques and components are neither routine nor standard.

Designers who work in the low-voltage, low-current domain have the luxury of routing power as they want and making changes as needed to satisfy PC-board layout and cable-harness priorities; you can route a 3.3 V rail (on the PC board or in a cable) pretty much wherever you wish. In high-voltage designs, however, that degree of freedom doesn't exist, and any change in routing must be carefully assessed to make sure it doesn't violate design guidelines or the numerous regulatory standards for placement, creepage, clearance, and safety. Low-voltage designers don't have to worry that the basic design, or any change to it, will trigger safety-related regulatory testing and approval cycles.

Have you ever been directly involved with high-voltage/low-current design? Have you ever worked on a project that had that aspect? If so, what intrigued, impressed, worried, or scared you the most?
  • Popularity: 18
    2015-1-20 11:18
    1407 reads
    0 comments
    High-voltage tolerance (e.g., 20 V) on VIN is frequently required, since cable inductance in a (USB) connection induces voltage overshoot during hot plugging. The IC may be disabled when an overvoltage (OV, e.g., 6 V) is detected at VIN, so that no high voltage appears at the VBAT side. If high-voltage charging is required, the A2 has to be redesigned to handle the high voltage.
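The lockout behavior described above can be sketched as a simple model: the input path is disabled whenever VIN exceeds the OV threshold, so the downstream battery node never sees the hot-plug transient. The 6 V trip point and ~20 V overshoot match the examples in the text; the function name and the specific VIN values are illustrative assumptions.

```python
OV_THRESHOLD_V = 6.0   # over-voltage trip point at VIN (example value from the text)

def input_path_enabled(vin_v):
    """Return True if the pass element from VIN to VBAT may stay on."""
    return vin_v <= OV_THRESHOLD_V

# A hot-plug event: cable inductance rings VIN well above the 5 V nominal,
# and the input path must open before the overshoot reaches VBAT.
for vin in (5.0, 12.0, 20.0, 5.2):
    state = "ON " if input_path_enabled(vin) else "OFF"
    print(f"VIN = {vin:4.1f} V -> input path {state}")
```

A real OVP circuit would also add deglitch timing and hysteresis around the threshold; this sketch only shows the threshold decision itself.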
  • Popularity: 21
    2013-12-9 21:29
    1697 reads
    0 comments
    I usually hear reporters and commentators use "energy" and "power" interchangeably, as if they were two words for the same physical parameter. I can understand that confusion to some extent: you have to know what you are talking about to use them properly, and energy and power are vague concepts to most of these folks. It's somewhat similar to the voltage/current confusion we discussed previously. But it also worries me when I hear engineers use "energy" and "power" almost interchangeably. I assume that, in most of these cases, the engineer knows what the words really mean, so it's just a matter of verbal casualness.

Let's be clear. Energy is the ability to do work. Power is the rate at which energy is being used and work is being done. To look at it another way, energy is the time integral of power. The standard unit of power is the watt; the standard units of energy are joules, watt-hours, or some dimensional scaling of those basic units.

A battery and a lightning bolt represent very different poles of the power/energy continuum.

My concern is that, by being a little careless in the use of these ubiquitous and very necessary engineering terms, we risk getting fuzzy in our thinking about them. In many designs, the energy storage element (a battery or supercapacitor, for example) is filled by available energy at one rate, yet it is usually called on to deliver power at a much higher rate. Of course, there are reverse situations, where a burst of energy fills the reservoir and is then drawn down slowly over time. Regardless, you have two processes that are linked together but somewhat independent, though they must add up in the aggregate. Consider an energy-harvesting subsystem used to charge the battery for a data logger. The harvested energy trickles into the battery over time, but that battery must deliver bursts of power to the logger when it is acquiring or transmitting data.
If the energy accumulated over time doesn't cover the energy consumed by those power bursts, it's an unsustainable situation. Or look at an AC line charger for a smartphone or laptop. It has to supply only a modest amount of power to charge the device within a reasonable time when the device is not in use, but it needs much more robust capability if it is to charge the device while it is in use. The former situation is determined more by the energy the charger can deliver over some broad time; the latter is an operational scenario, and thus driven more by power.

That's why it is important to use these two closely related parameters carefully and appropriately. In most situations, we accumulate energy, but we spend it as power in order to get useful work done through a combination of mechanical, chemical, or electrical means. This blurring is also what worries me about many green-energy plans, however well intentioned. When you go through the energy-capture/power-spend reckoning, there's a real imbalance that must be acknowledged and addressed.

Have you ever been in a situation where the energy and power numbers were way out of balance, or where misunderstanding and miscommunication about these two critical parameters led to design problems?
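The energy-vs-power bookkeeping for the data-logger example above can be sketched with a quick budget check. The harvester supplies power two orders of magnitude below the burst power, yet the design can still be sustainable as long as the *energy* books balance over a cycle. All the numbers here are illustrative assumptions, not values from the text.

```python
HARVEST_POWER_W = 0.005   # 5 mW average harvested power (assumed)
BURST_POWER_W = 0.5       # 0.5 W transmit burst (assumed)
BURST_DURATION_S = 2.0    # each burst lasts 2 s (assumed)
BURSTS_PER_HOUR = 10      # duty cycle of the logger (assumed)

# Energy is the time integral of power; over one hour:
harvested_j = HARVEST_POWER_W * 3600.0                        # energy in
spent_j = BURST_POWER_W * BURST_DURATION_S * BURSTS_PER_HOUR  # energy out

print(f"harvested: {harvested_j:.1f} J/h, spent: {spent_j:.1f} J/h")
print("sustainable" if harvested_j >= spent_j else "unsustainable")
```

Here the battery sees 100× more power going out during a burst than coming in from the harvester, but 18 J trickles in per hour against 10 J spent, so the energy balance holds; double the burst rate and it no longer does.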
  • Popularity: 25
    2013-12-5 22:07
    1605 reads
    0 comments
    We have many choices of energy-harvesting sources, such as localized heat or vibration, but one of the most pervasive possibilities is to grab some of the stray RF field energy that is all around us, from low frequencies into the gigahertz range. Hey, if you don't use it, it truly will go to waste: it will be absorbed by any materials in its path (causing imperceptible but widespread heating) or dissipated into space (perhaps continuing forever on its journey at 3 × 10⁸ meters/second).

That's why a recent development at Duke University looks interesting. It involves the use of highly engineered metamaterials to build a 900 MHz-to-DC transducer, shown in the figure below. The researchers say their device achieves 37% efficiency. That's very impressive, especially when you consider that good solar cells reach around 10% and (for obvious reasons) are not usable 24/7.

Duke engineering students Alexander Katko (left) and Allen Hawkes show a waveguide containing a single power-harvesting metamaterial cell, which provides enough energy to power the attached green LED.

I have a quibble with the Duke University press release about this development. In trying to make the concept tangible to the audience, the writer says that, by using the metamaterial cells in series (see image below), the device was able to produce an output of 7.3 V, which is higher than that of a standard USB port.

This five-cell metamaterial array developed by Duke engineers converts stray microwave energy, as from a WiFi hub, into more than 7 V with an efficiency of 36.8%—comparable to that of a solar cell.

Even though that is factually correct, there's the implication that this metamaterial panel can act as a USB charger or similar power source. That would be nice, but anyone reading this column knows that, even though it could deliver that voltage, the current level would be low. There isn't that much RF energy passing through the capture field.
The full technical paper in Applied Physics Letters shows that the researchers produced about 100 mA into a 70-Ω load, which is very impressive. The 7.3 V tag made me think about the love/hate relationship engineers have with voltage and current, and thus with energy and power (the rate at which energy is delivered).

Sometimes the specific value needed is determined by the laws of physics. If you want to ionise a gas (as in a neon tube) or jump a spark gap, you'll need several thousand volts but little current. When you want to do real work, such as driving a motor, you'll want more current to deliver the power—at a higher voltage to reduce I²R losses and increase overall efficiency. By contrast, the voltage and current needed for a smartphone are dictated by ICs that were designed for very low voltage, due to the imperative of low power consumption. In general, low-single-digit voltages are tough to work with efficiently—not because of resistive losses, but because unavoidable diode drops of 0.6 to 0.8 V can take a big bite out of your available source voltage.

How do you choose the voltage and current values to use? As in most engineering situations, the answer is clear: it depends. In some situations, such as gas ionisation or the smartphone, you have little choice; the numbers are dictated by physics, available components, or industry standards. In other cases, the engineer has the flexibility to choose (within limits). It's a matter of finding the voltage/current combination that works best in terms of power delivery, system efficiency, available components, safety requirements (which kick in at different levels), and cost.

Have you had to analyse and select operating voltage and current levels? How did you come to a decision on balancing the unavoidable trade-offs?
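The point above about diode drops taking a big bite out of low rails can be quantified: a fixed series drop is negligible on a high-voltage rail but consumes a large fraction of a low one. The 0.7 V figure sits inside the 0.6-0.8 V range mentioned in the text; the rail voltages are illustrative assumptions.

```python
DIODE_DROP_V = 0.7   # typical silicon diode forward drop (within the 0.6-0.8 V range in the text)

def drop_fraction(rail_v, drop_v=DIODE_DROP_V):
    """Fraction of the source voltage lost across one series diode."""
    return drop_v / rail_v

for rail in (3.3, 5.0, 12.0, 48.0):
    print(f"{rail:5.1f} V rail -> {100 * drop_fraction(rail):5.1f} % lost to one diode drop")
```

On a 3.3 V rail a single diode eats over 20% of the available voltage, while on a 48 V rail it costs under 2%, which is one reason low-single-digit rails are hard to work with efficiently.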