A PIC based Battery Management System

Can you explain how this is notionally supposed to work? I'm having a hard time wrapping my brain around this. I'm still on my first latte... :D

Also, how big are these inductors, physically? I'm trying to get an idea how much real estate this would take.

-- Gary
 
OK, then there are switched capacitor charge pumps. These come in off-the-shelf varieties with all the switches built into a small chip. The LM2662 is one example:
http://cache.national.com/ds/LM/LM2662.pdf
Some of these switched capacitor charge pumps are rated for up to 800 mA, though those are designed for driving LEDs.

If you put one of these across each pair of cells, cascaded like the switched inductor setup, all the cells would be sharing charge up and down the string. If it worked ideally, it would be just like charging one big parallel mess of cells. All the individual cells would be forced to the same voltage. The pumps could just run all the time during charge or discharge, and sleep during standby.

The 200 mA factory-built units might be too small, but I see no reason why the same concept can't be scaled up. Even if you put a bunch of the 200 mA ones in parallel, it should scale.
LM2662 switched capacitor circuit.jpg

You might be able to simplify things by cutting out half the switches, and just gang them together like below. Again, the switches would run all the time during charge, at 50% duty cycle. By choosing the right switches and capacitors, it should be able to move amps.
Switched Capacitor Charge Pump 1.jpg

To get fancier, one could use a common bus and capacitor, and selectively move charge from one cell to another, not necessarily next to each other. This might be more limited in power transfer, since one capacitor has to do all the work.
Switched Capacitor Charge Pump 2.jpg
 
GGoodrum said:
Can you explain how this is notionally supposed to work? I'm having a hard time wrapping my brain around this. I'm still on my first latte... :D

Also, how big are these inductors, physically? I'm trying to get an idea how much real estate this would take.

-- Gary

Right. It might take several cups of latte to properly analyze the current paths, especially in the simplified version.

The size of the inductors would depend on how much current you want to push. It would be like any boost regulator that's doubling voltage (same basic concept). For a modest 1-2 amps, I think they make surface mount inductors that are fairly small, but still expensive.

Here's a more simplified diagram of the basic principle:
Inductive Charge Pump principle.jpg

When switch 1 closes, it draws current from cell 1 to build the field in the inductor. When switch 1 opens, the collapsing field will circulate the current through the diode into cell 2. Likewise, when switch 2 toggles, it will pump current from cell 2 to cell 1.
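To put rough numbers on this principle (the inductor value and switching frequency below are assumptions for illustration, not from the actual circuit): while the switch is closed the current ramps at V/L, and the energy stored in the field, ½LI², is dumped into the neighboring cell each cycle.

```python
# Back-of-envelope for the inductive charge pump principle.
# All component values are assumptions for illustration.
V_cell = 3.7       # volts across the inductor while switch 1 is closed
L = 10e-6          # assumed 10 uH inductor
f = 100e3          # assumed 100 kHz switching, 50% duty cycle
t_on = 0.5 / f     # switch closed for half of each cycle

# Current ramps linearly: di/dt = V/L
i_peak = V_cell * t_on / L

# Energy stored in the field each cycle, released into cell 2
E_cycle = 0.5 * L * i_peak ** 2

# Ideal average power moved between cells (ignores diode drop and losses)
P_avg = E_cycle * f

print(f"peak current {i_peak:.2f} A, power moved {P_avg:.2f} W")
```

With these made-up numbers the peak current is about 1.85 A for roughly 1.7 W moved, which gives a feel for why the inductor and diode ratings matter more than the average current.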

In the diagrams showing FETs, keep in mind that most big enhancement mode FETs already have the diode built in.
 
Okay, got it now, thanks (well into my second triple shot latte...). Brain firing on all cylinders now...

The cap thing looks like it would be a bit easier to do. If it can be active for discharge and charge, we don't need amps; 200 mA might be fine. A sleep mode would definitely be good for times when the pack has to sit on the shelf for a while, but fairly healthy cells, once balanced like this, aren't likely to self-discharge very much, and if they do, it will be at close to the same rate, so not much current needs to be shuffled between cells.

If we wanted to do a version that could transfer at, say, 400-500 mA, is there an equivalent part that could handle this much? Also, how big a cap would be needed?

Would these things get hot enough to need a heatsink?

-- Gary
 
Elevated Voltage?



If I understand this correctly, you switch all the switches at the same time in order to build one elevated voltage across the bottommost capacitor.

:arrow: Is this correct?

So if you combined this with PWM you get (for a DC motor) the actual pulse that you want. You've more or less distributed the controller circuitry (FETs, capacitors) from a central location (the controller) to the cells. The controller could be limited to being a driver for the PWM, and that's about it.


The question I have is: if you have switches that can do this, then why even use capacitors? If you have a three-way switch, you can do the "On State" and "Off State" behavior that the Smart Battery Data Specification uses.

:arrow: FETs are either on or off... these switches need to be three-way... any ideas about what type of switch you would use?
 
All the switches toggle at the same time. The voltage on all the caps is the same as one cell.

The app note for the LM2662 shows somewhere around 5 to 20 µF for the capacitor. Since it only needs to run at 3.7 V, it could be pretty small. The switching chip itself is very tiny and does not need any heatsink. For higher currents, I think the capacitor needs to have very low ESR. Most of them switch at 1-2 MHz. I think even a scaled-up version of this would not need any heatsink.
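One way to see why the ESR and switch resistance matter: a switched-capacitor pump looks roughly like a resistor between the two cells, with a value of about 1/(f·C) plus the total switch on-resistance. A sketch with assumed numbers (these are illustrative, not from the LM2662 datasheet):

```python
def charge_pump_resistance(f_sw, c_fly, r_switch_total):
    """Rough equivalent resistance of a switched-capacitor charge pump.

    The cell-to-cell balancing current is then about delta_V / R_eq.
    Model: the 1/(f*C) "slow switching" term plus the on-resistance
    of the switches in the charge/discharge path.
    """
    return 1.0 / (f_sw * c_fly) + r_switch_total

# Assumed numbers: 150 kHz switching, 47 uF flying cap,
# 3 ohms of total switch resistance (small pump chips are in this range)
r_eq = charge_pump_resistance(150e3, 47e-6, 3.0)

# With a 100 mV imbalance between the two cells:
i_balance = 0.100 / r_eq

print(f"R_eq = {r_eq:.2f} ohm, balancing current = {i_balance * 1000:.0f} mA")
```

Note that with these numbers the switch resistance dominates, which is why a scaled-up version with better FETs could move much more current for the same capacitor.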

Safe, I have no clue what you are talking about and it's not directly relevant to this discussion. Switched capacitor charge pumps are not suitable replacements for a PWM (at least not in this configuration).
 
fechter said:
Safe, I have no clue what you are talking about and it's not directly relevant to this discussion.
You need to read the "Smart Battery Data Specification" thread. There's an actual standard out there for the "Smart Battery" and we seem to be trying to reinvent the wheel all the time. In the spec they use the concept of "On State" or "Off State" to describe a cell that is either in series or out of the loop.

:idea: Somehow people aren't making the big "wow I get it" about the "Smart Battery Data Specification" and I think people need to take a look.

What I'm saying is that the "pro's" have given us a rich resource of information to draw upon... we should examine it before going further...

P.S: I still like the idea of PIC based BMS and all that I'm just saying we ought to all be educated in the standards out there.
 
fechter said:

An interesting topology. I'm not too sure it'd work out well in practice. Specifically, when one of the inductors is at a high current and has its driving FET shut off, it is then forced to drive this current into one of the other inductors, which are all at zero current. This just won't happen instantly, so high voltages will be generated any time a FET switches. The switched capacitor topologies shown above have a complementary issue: with just the capacitors, the initial current through the FET is only limited by the parasitic resistances in the circuit.

How about a variation on the Ćuk converter (http://en.wikipedia.org/wiki/Ćuk_converter)? As a single unit it likely has no advantages over a single inverting DC-DC converter. Chained up like Fechter's "inductive charge pump", it shouldn't suffer from voltage and current spikes in the FETs, while retaining the ability to pass charge directly to non-adjacent cells.

Marty
 
In the circuit above, I can see that when a cell in the middle switches open, the current in two inductors will be circulated through the body diodes in the FETs to the ends of the string. This would be OK, in fact ideal. The inductors along the way to the ends would suck up a bit of the pulse, but since they're inductors, they present a high impedance to the fast edge of the pulse, so I think most of it would make it to the ends of the string.

Potential problems: when a switch in the middle goes open, the voltage at the ends of the switch will try to go all the way to the ends of the stack (minus diode forward voltages). This would require careful handling of the gate drives to prevent them from getting zapped. Something like a high-side gate driver chip might take care of this. Another problem is what happens if more than one fires at the same time. When one switch opens, it will send a current impulse through the other cells, which might just bump them over the threshold and cause them to fire. This could set up a feedback or resonance that could be destructive.

If you overlapped two cell units, like a PowerCheq, I don't think you would have either of these problems. (see the inductive charge pump 3 diagram). The inverting switcher or Cuk would be good topologies to look at to do the same thing. I was just trying to keep the part count minimized.

Question for "real EE's":
In a Cuk, inverter, or my basic two-cell inductive charge pump circuit, how do you size the inductor? Say I wanted to pump 1 amp at 3.6 V? Inductor size will depend on switching frequency, but how do you pick the optimum values? In this application, efficiency is not really important as long as none of the parts get too hot. Cost and inductor size are more important.
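Not a definitive answer, but one common back-of-envelope rule is to size the inductor so it runs at the boundary of continuous conduction at full load: with a triangular current ramping from zero, the peak is 2·I_avg/D and L follows from di/dt = V/L. A sketch with an assumed switching frequency:

```python
# Rough inductor sizing at the continuous-conduction boundary.
# The switching frequency is an assumption; the rest comes from
# the question above (1 A at 3.6 V, 50% duty).
V = 3.6        # cell voltage
I_avg = 1.0    # desired average transfer current, amps
D = 0.5        # duty cycle
f = 100e3      # assumed switching frequency, Hz

# Triangular current from zero: average over the period is i_pk * D / 2,
# so the peak has to be 2 * I_avg / D
i_pk = 2 * I_avg / D

# From di/dt = V/L over the on-time D/f
L = V * D / (f * i_pk)

print(f"peak current {i_pk:.1f} A, inductor {L * 1e6:.1f} uH")
```

With these numbers that's a 4 A peak and about 4.5 µH; a higher frequency shrinks the inductor but raises switching losses, so the optimum is mostly a cost/size trade.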

I don't think the switched capacitor thing will have a problem on startup. All of them work the same way. It will only take about 3 cycles to get the caps up to voltage. The FETs can handle that. Once they get up to voltage, they only have a few mV of ripple during operation.
One cool thing about this approach is the voltage balancing will take place during the entire charge cycle, not just at the end.

The downside appears to be that switched capacitor charge pumps have quite a bit of 'sag' at higher currents. I still think if the parts are properly sized, it can work at higher currents. The size of the capacitor may be the limiting factor. The off-the-shelf charge pump chips like the LM2662 are not made for high currents, but I don't see why the same basic idea can't be scaled up. Some of them are really tiny, so paralleling a few might be an option.
 
MOSFETs vs IGBTs

One of the things that I find annoying about the idea of using power MOSFETs is that they don't do a good job of preventing reverse current flow.

That's a real problem.

But I think that the IGBT might be the way to fix the situation:

http://en.wikipedia.org/wiki/IGBT

"The additional PN junction blocks reverse current flow. This means that IGBTs cannot conduct in the reverse direction, unlike a MOSFET."

:arrow: Maybe the answer is to switch from MOSFETs to IGBTs?

If you could get affordable IGBTs for switching, then you could avoid all the problems and simply use the Smart Battery Data Specification approach and do the "On State" / "Off State" philosophy.

This also immediately solves the voltage measurement issue.

IGBTs are often mentioned in articles about electric vehicles as the "Holy Grail" of the industry's progress. Should we be paying attention to this?


http://digikey.com/scripts/DkSearch/dksus.dll?Detail?name=FGA25N120ANTDTU-ND

Fairchild Semiconductor
FGA25N120ANTDTU-ND
Voltage - Rated 1200V (1.2kV)
Current Rating 50A

$3.51

http://digikey.com/scripts/DkSearch/dksus.dll?Detail?name=FGPF30N30-ND

Fairchild Semiconductor
FGPF30N30-ND
Voltage - Rated 300V
Current Rating 80A

$1.60
 
With the cap-based approach working during charging and discharging, there is no reason you need to move a lot of current. I would think even 200mA might be enough. The only reason you need an amp or more is if you are trying to correct a fairly large imbalance in a relatively short period.

There are two techniques for doing battery management. One is that you don't worry about balancing at all and, in essence, individually charge each cell to whatever its max level is, and then monitor each cell to make sure it doesn't discharge too far. To charge, you either use individual chargers on each cell, or you use a shunt-based regulator on each cell that bypasses whatever current the cell can't absorb during the final CV part of the charge cycle, so that the next cell in series can have all the current available if it is trying to "catch up" with the rest of the cells.

The second type of battery management is to try and end up with all the cells to be at the same voltage level/SOC at the end of the charge process. In many of these designs, shunts are also used, but in a different fashion. Instead of waiting until the cell is at the cutoff voltage, the shunts will bleed off current all the time for the cells that have higher voltage, which has the effect of slowing down the charging of those cells, in order to let the lower voltage cells catch up. The trick here is that you have to pick a current that is high enough to overcome whatever the worst case imbalance is likely to be, over the charge cycle. Most of the BMS units that come with the cheap "duct tape" packs only use about 100mA of bleedoff current and what the charger does is just keep supplying a "trickle" charge until the cells finally balance. Their own instructions say that this process can take up to 10 hours for packs with significant imbalances.

One thing that really needs to be factored in is just what kind of imbalances are considered "normal". I haven't used any of the lower C-rated Chinese LiFePO4 cells, but I have quite a bit of experience with a123 cells and now with LiFeBATTs. Packs made with healthy a123 cells will rarely have imbalances of more than 1-2% under normal use. I have a couple of packs like this and they stay so close that for the most part, I just bulk charge them using a Zivan NG1. Occasionally, I will use my VoltPhreaks individual cell chargers on these packs, but this doesn't really get them any more balanced. I also have a couple of packs that have cells that have been "stressed", mixed in with healthy cells. The stressed cells were in two packs that got over-discharged. Some cells went all the way to zero volts, and are truly dead. Some got down under 2V, between 1.6V and 1.8V, and these recovered, but now are the stressed ones, with slightly reduced capacities (about 10%, near as I can tell...). The rest of the cells never got below about 2.5V and they are now still as strong as ever. Anyway, these packs with the stressed cells typically have imbalances as high as 4-5%, but with these I don't think the answer is to simply drag down the level of the healthy cells so that they always equal the lower level of the weak cells. Instead, I always just individually charge each cell in these packs, just to make sure each one gets a full charge, whatever that happens to be.

What I'm seeing with the LiFeBATT cells is that I have yet to see an imbalance more than about 2%. This tracks with what the factory in Taiwan keeps telling us, which is that by using very tight quality control during the manufacturing process, and by extensive burn-in testing and cell matching, they claim cells will stay within 2%, period. They do capacity and IR-based cell matching when they build their packs, and they do the same with loose cells, which come eight to a box. I've got one 16-cell pack that has cells from two boxes, and all the cells from one box stay within about 1% of each other, as do the ones from the 2nd box, but the difference between the two batches means that the total imbalance is just under about 2%, worst case. Now I haven't used this pack more than a few cycles yet, but I will keep a close eye on it to see how balance might change over time. With my other LiFeBATT packs, I didn't pay much attention to which cell came from which batch, but overall, the cells in all the packs never have had an imbalance greater than about 2%.

Anyway, getting back to how much current is really needed for a cap-based approach, I think at least three factors need to be considered. First, how much imbalance should this be designed to overcome? I'd say 3% is probably a good starting point. Next, in how short a time does this imbalance need to be corrected? Finally, closely related to the second factor, what is the max charge current being used? If you are trying to charge a pack at a 1C rate or better, you will need to switch more current between cells to keep up than if you are only charging at a 1/2C rate. Also, if you are only trying to correct imbalances during the actual charge process, you will need more switched current than if the balancing is allowed to happen all the time, or at least during discharging and charging. I haven't done the math, but my rather generous gut tells me that 200-300mA might be plenty if the balancing is allowed to happen during discharging and charging, and maybe even as low as 100mA if allowed to work all the time.
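That gut estimate is easy to sanity-check: the required balancing current is just the charge to be shuffled divided by the time available. The pack capacity and time windows below are assumptions, picked for illustration:

```python
def balance_current_a(capacity_ah, imbalance_frac, hours_available):
    """Average cell-to-cell current needed to correct an imbalance of
    imbalance_frac * capacity within hours_available."""
    return capacity_ah * imbalance_frac / hours_available

# Assumed: 10 Ah pack with a 3% worst-case imbalance to correct
i_always = balance_current_a(10.0, 0.03, 15.0)  # balancing runs ~all the time
i_charge = balance_current_a(10.0, 0.03, 2.0)   # only during a 2 h charge

print(f"always-on: {i_always * 1000:.0f} mA, charge-only: {i_charge * 1000:.0f} mA")
```

With these assumptions the answer lands at about 20 mA if balancing runs continuously and 150 mA if it only runs during a 2-hour charge, which brackets the 100-300 mA gut figure.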

--Gary
 
GGoodrum said:
There are two techniques for doing battery management.

:arrow: One is that you don't worry about balancing at all, and in essence, individually charge each cell to whatever its max level is, and then monitor each cell to make sure it doesn't discharge too far. To charge, you either use individual chargers on each cell, or you use a shunt-based regulator on each cell that will bypass whatever current the cell can't absorb during the final CV part of the charge cycle so that the next cell in series can have all the current available, if it is trying to "catch up" with the rest of the cells.

:arrow: The second type of battery management is to try and end up with all the cells to be at the same voltage level/SOC at the end of the charge process. In many of these designs, shunts are also used, but in a different fashion. Instead of waiting until the cell is at the cutoff voltage, the shunts will bleed off current all the time for the cells that have higher voltage, which has the effect of slowing down the charging of those cells, in order to let the lower voltage cells catch up.
:arrow: The third technique is the way the "Smart Battery" does things. The "Smart Battery" is either in the "On State" or the "Off State". When the battery is off, it acts like a wire and the series string simply ignores it. Unlike shunting, this completely removes the cell from the system. In the "Smart Battery" configuration, the battery itself issues the commands, and the external devices that use the battery must act as "slaves" to the battery and react to its requests.

I'm trying really hard to get people to see this third path that seems kind of crazy given the old ways, but it makes sense if you think about it enough. By removing completely the cells that are unable to function (for whatever reason) you change the thought process significantly.

Don't blame me for distracting things... the "Smart Battery" is being taken very seriously in many circles and I think we need to start rethinking our thinking. We need a thinking rethinking... or a re-re-thinking thought... err... whatever. :lol:


It just seems wrong to be talking about PIC based BMS solutions and not be at least accepting the existence of the "Smart Battery". I'd like to see us move towards a more cooperative relationship with the "powers that be" ("Smart Battery", Intel, SMBus, etc) and move towards solving problems in a way that has some sort of future path to it. A low priced "Smart Battery" done as an Open Source project would be great. (the major corporations will hate it, but we would love it :wink: )
 
200 mA might be good for healthy cells. If you back off on the charging current when any cell gets too high, the pack will balance itself out over time. This might be worth trying.

Something like a MAX889 might work for pairs of cells, then you gang them in the overlapping fashion to do the whole string. http://www.maxim-ic.com/quick_view2.cfm/qv_pk/2326/t/al You'd need some way to make them go to sleep when they weren't needed.

If you have a cell that really goes bad, it would be good to identify it and make the overall charging scheme tolerate it.

Safe, show me how you can use any of those smart battery things in a 16s string, and for LiFePO4. "They" could design a chip that does this, but I have not found any you can buy that will work. Perhaps in the future.
Also, IGBTs suck for losses. They will have about a 1.7 V drop across them when on. They are great for systems over 200 V, as long as you have water cooling. IRFB4110s will have about a 0.2 V drop at 50 amps. IGBTs would be close to 2 V at 50 amps (dissipating a sizzling 100 watts each).
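The loss comparison is just the on-state drop times the current; a quick check of the figures above:

```python
def conduction_loss_w(v_drop, current_a):
    """Dissipation in a switch with a roughly fixed on-state voltage drop."""
    return v_drop * current_a

# Numbers from the post above: MOSFET ~0.2 V and IGBT ~2 V, both at 50 A
p_fet = conduction_loss_w(0.2, 50)
p_igbt = conduction_loss_w(2.0, 50)

print(f"MOSFET: {p_fet:.0f} W, IGBT: {p_igbt:.0f} W")
```

A 10 W vs 100 W difference per switch is the whole argument for MOSFETs at pack voltages this low.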
 
Whew, lots of things to ponder about here! :) It will take me a while...

If I understand correctly, many of those ideas need high-voltage switches. How would that be done in practice? A few ways I've seen in this thread were:
- Opto-isolator + FET: two components per switch
- Solid-state relay: looks expensive, maybe there are cheaper ones
- Low-side switch + voltage dividers: I made this one up; I don't know if it would actually work

And I found this one:
IR2131 http://www.irf.com/product-info/datasheets/data/irs2110.pdf

It's a high/low-side gate driver that can switch gates up to 600V high. But it also looks like it can sink/source 2A of pulse current without external FETs. So I'm thinking this chip alone could switch a capacitor from a high-voltage cell down to ground.
 
fechter said:
IGBTs suck for losses. They will have about a 1.7 V drop across them when on. They are great for systems over 200 V, as long as you have water cooling. IRFB4110s will have about a 0.2 V drop at 50 amps. IGBTs would be close to 2 V at 50 amps (dissipating a sizzling 100 watts each).
You make a good point... :oops:

From wikipedia:

http://en.wikipedia.org/wiki/IGBT

"The additional PN junction adds a diode-like voltage drop to the device. At lower blocking voltage ratings, this additional drop means that an IGBT would have a higher on-state voltage drop. As the voltage rating of the device increases, the advantage of the reduced N- drift region resistance overcomes the penalty of this diode drop and the overall on-state voltage drop is lower (the crossover is around 400 V blocking rating). Thus IGBTs are rarely used where the blocking voltage requirement is below 600 V."

So maybe the MOSFET combined with a diode in the bypass circuit is the way to go. I was trying to find a way to eliminate the backwards flow in the bypass part of an "On State" and "Off State" type of circuit.

Finally... the "Smart Battery" is a good place to get ideas. They've done a great job thinking it all the way through, and it might be a good idea to simply steal some of the concepts from it. Maybe even steal the API. You're going to have to write software for the relationship of a cell to its charger or controller anyway, so it's always a good idea to find previously existing software to imitate, even if you don't have immediate plans to use the SMBus part. It's more of a philosophical thing... try to follow the general tone of the "Smart Battery" concept instead of the "Command and Control" philosophy that previous systems used.

The key ideas of the "Smart Battery" are:

:arrow: "On State" vs "Off State" behavior - Good :)

:arrow: "Safety Signal" - Yeah, we have LVC and HVC to manage :)

:arrow: The cell drives the charger by making requests for services - This is more abstract, but it's a guideline to follow in how you set up communications. It really comes down to which device talks first and which responds.
 
Right, tomv, you need a way to switch these FETs. Dedicated gate driver chips like the IR21xx series would work for many of these applications. They are isolated up to 600 V, so there's no practical size limit on the pack.

The switched capacitor scheme could be done two ways (at least). One would be to use the IR21xx drivers to fire a bunch of good FETs and caps, with an oscillator driving the inputs to all the gate drivers in sync. This would make powering off the unit easier.

Plan B would be to use the factory off-the-shelf chips on pairs of cells cascaded. Each chip has its own gate drivers and FETs, so all you need to add is a capacitor. The FETs in these chips have a fairly high on resistance which limits the current they can pump. Keeps 'em from blowing up too. One problem here is how to put all of them into standby when you don't want it running. There might be a way to do it with just resistors and diodes.

Safe: I'm really getting tired of this half-baked battery switching scheme you keep talking about. You've been bouncing that idea around for over a year now and it's not anymore practical now than it was then. I've thought about it enough to not waste any more resources on it. If somebody else can do it, fine. It's never going to work for any EV I'm going to be riding unless somebody invents a new kind of lossless FET or something.
Will you *please* stop or at least keep it to your own threads!
 
On a lighter note, one "nice to have" feature that could be added to an MCU BMS :D

http://www.ladyada.net/make/spokepov/index.html
 

Attachments

  • spokepov.jpg
I just found another paper on cell balancing. Great info:
http://www.americansolarchallenge.org/tech/resources/SAE_2001-01-0959.pdf

In summary you can balance cells:
1. Dissipative. A typical setup of 10 mA per Ah will balance at 1% per hour. Simplest.
2. Shunt resistors (end of charge)
3. Switched capacitor
4. NEW: Switched transformer
5. NEW: Shared transformer. Amazingly simple! Needs a custom transformer.

Something that didn't cross my mind before :idea: :
- With methods 1/2, your pack capacity is Ncells * Ah_of_weakest_cell.
- With active balancing, if the current is high enough, then pack capacity is Ncells * Average_Ah - Balancing_losses.

Depending on how well the cells are matched, this could be significant.
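A quick sketch of that capacity difference (the cell capacities and the balancing efficiency below are made-up numbers, just to show the effect):

```python
def pack_capacity_dissipative(cells_ah):
    """Methods 1/2: the weakest cell limits the whole series string."""
    return min(cells_ah)

def pack_capacity_active(cells_ah, transfer_efficiency=0.9):
    """Active balancing, idealized: charge is shuffled between cells,
    so usable capacity approaches the average minus transfer losses."""
    return sum(cells_ah) / len(cells_ah) * transfer_efficiency

cells = [2.3, 2.3, 2.2, 1.8]  # hypothetical 4s string, Ah per cell
print(pack_capacity_dissipative(cells), pack_capacity_active(cells))
```

Even with a pessimistic 90% transfer efficiency, the active approach recovers capacity in this mismatched example (1.935 Ah vs 1.8 Ah usable); with well-matched cells the gap shrinks toward zero and the losses can make it a wash.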
 
The more I think about this project, the more I see that the very first idea proposed by Fechter is the only one that seems reasonably buildable so far.

It does measurement over a 40V range, and to scale it I'm thinking of cascading those via isolated I2C. For the attached circuit, all that is missing is an ATtiny, a 5V voltage regulator, and an I2C isolator chip.

Now this could do balancing too with small modifications. Change C1 to a FET and you have a dissipative balancer (it can either A/D convert or balance at any one time). Current is Vcell / (2 × mux Ron) ≈ 40 mA, so approx 2% per hour for an A123.
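Plugging in rough numbers to check that rate (the mux on-resistance here is an assumption, chosen to line up with the ~40 mA figure; the cell capacity is the nominal A123 26650):

```python
# Dissipative balancing through the mux, two switches in the current path.
# The on-resistance is an assumed value, not from a datasheet.
v_cell = 3.3          # LiFePO4 cell near full, volts
r_on = 41.0           # assumed on-resistance of one mux switch, ohms
i_bleed = v_cell / (2 * r_on)

capacity_ah = 2.3     # nominal A123 26650 capacity
rate_per_hour = i_bleed / capacity_ah

print(f"{i_bleed * 1000:.0f} mA bleed, {rate_per_hour * 100:.1f}% per hour")
```

That works out to roughly 40 mA and a bit under 2% per hour, consistent with the estimate above.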

Keep C1 but make it big and you have a charge pump. Would slow down A/D sampling rate a lot. Could change it to C1 + FET so you can either balance or measure. Low current and big losses due to high Mux Ron.

Could also do an inductor/transformer/etc. there. Ideal would be to find low-resistance/high-voltage/high-current muxes. Building them from discrete gates just becomes something that I don't want to physically construct...
 

Attachments

  • muxes.png
The paper: http://www.americansolarchallenge.org/tech/resources/SAE_2001-01-0959.pdf
is a great find! They pretty much covered most of the ideas here. Bottom line: for a car-sized system, dissipative balancers are the most cost-effective. They did mention that for smaller systems (like bike or MC sized), the other topologies may be usable.

The real drawback with all the off the shelf switched capacitor pumps I've found so far is the high on resistance of the FETs. Seems like this could be improved with better switching FETs, but there goes the parts count and cost again.

The balancing issue remains regardless of which approach is used to monitor the cell voltages.

The transformer approach would be OK if transformers weren't so darn expensive. If the transformers get big enough, then you might as well just do single cell chargers.

I wonder if there are any cheap, tiny 1:1 switching transformers?....
 
OK, here's the most basic configuration to do a capacitor charge pump balancer using LM2662's:
Cascaded charge pumps.jpg

I've spent many hours sifting through the datasheets for charge pump ICs. There are hundreds of them. There is likely a better one than the LM2662, but these are available in a leaded 8-pin DIP package. The MAX889 is another similar unit that may have better performance (surface mount).

The flying capacitor needs to have a very low ESR. The LM2662 datasheet calls for 47 µF, but the MAX889 application shows 1-2 µF. I think some kind of multilayer ceramic is best here, even if the value is small.
 
Wow, is that all that is necessary? If I wanted to do a board to test this concept, as an external balancer, what else would be needed? Is there a way, for instance, to indicate with an LED that current is actually being moved around? If the cells are balanced, theoretically there wouldn't be any current being transferred.

Anyway, I really want to try this, and see how long it takes to balance cells with 200 mA of transfer current, for cells in various states of imbalance. This has the potential for really reducing BMS complexity and cost. With whatever voltage measurement scheme is ultimately employed, you could have the microprocessor only enable the charge pump balancer circuits when the cells are outside a certain tolerance. I also think that as an external balancer, this would be a great adjunct to use with LVC-only based packs. Just connect the charge pump-based balancer to the pack periodically, and let it do its thing, however long it takes. :)

-- Gary
 
There should be a way to use diodes/resistors to disable all of them for standby mode. This won't be necessary if you unplug it when it's not needed.

The balancing will behave like a bunch of cells in parallel, connected by resistors. The larger the voltage difference, the greater the current. Cells at one end of the pack will take a long time to send charge to the opposite end of the pack.
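That "cells in parallel, connected by resistors" behavior can be modeled as charge diffusion along the string. Every value below is an illustrative assumption, just to show the shape of the process:

```python
# Toy model: each charge pump acts like a fixed resistor between
# adjacent cells; each cell looks like a big capacitor near one SOC.
def diffuse(voltages, r_link=3.0, c_eff=1000.0, dt=1.0):
    """One explicit time step of cell-to-cell charge transfer.
    r_link: assumed equivalent resistance of one pump, ohms
    c_eff:  assumed effective capacitance of a cell near one SOC, farads"""
    v = list(voltages)
    out = list(voltages)
    for i in range(len(v) - 1):
        i_link = (v[i] - v[i + 1]) / r_link   # current through one pump
        out[i] -= i_link * dt / c_eff
        out[i + 1] += i_link * dt / c_eff
    return out

v = [3.40, 3.30, 3.30, 3.30, 3.20]  # high cell and low cell at opposite ends
for _ in range(5000):                # ~83 minutes of simulated time
    v = diffuse(v)

print(f"spread after balancing: {max(v) - min(v):.3f} V")
```

The end-to-end case is the slow one, exactly as described: charge has to leak through every intermediate pump, so the spread shrinks but takes far longer than it would between adjacent cells.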

I can't think of a simple way to monitor current flow, but measuring all the cell voltages will tell you how well balanced things are. You could add an op-amp window comparator to each charge pump to indicate when the two sides are within a close voltage.
 
Okay, so for an external test unit, we don't care about enabling or disabling them, so you just need the chips wired as shown, plus the caps? Do you have a particular cap in mind?

What if you only sensed the current for one channel? All we would need to know is whether or not any amount of current, above a certain threshold, was moving in either direction. This could simply drive a bi-color LED. It could be red if current above the threshold is flowing, and green if it is below. Once the cells are balanced, pretty much no current would flow.

-- Gary
 
OK, how do you feed this beast when charging? Let's say you want the end of charge to be 3.65V per cell; do you simply feed it with n × 3.65V in a CC/CV scheme?

Also, where do pins 5 & 6 from the lowest cell go?

For a string of cells, do you need the same number of chips, or one less?
 