Bypass balancing...

Kodin

I've been researching various BMS designs, and most, if not all, of them rely on resistive loads to "bleed off" extra power when balance-charging lithium cells. This seems extremely wasteful to me, and I wonder: why is no one pursuing other options?

Here's my idea: a BMS that stops charging a cell when it reaches HVC for the charge cycle, but instead of dumping that current into a resistive load, it bypasses the cell completely. Clearly this only works if you can vary the charge input dynamically based on feedback from the BMS, but in typical charge scenarios that's theoretically possible. The only time I see this not being an advantage is when you can't vary the input current, such as during heavy/prolonged regen.

Crude cell diagram:

[1] [2] [3] [4]

[BMS - Modular or distributed]

(Power and data line(s) go between these two)

[smart charger - (Variable voltage/current)]

Say you're charging through all four cells, and cell #2 hits HVC first. It's done charging. What stops you from disconnecting it completely from the pack, passing the charge current from cell 1 straight to cell 3, and at the same time commanding the charger to reduce its voltage and current to reflect the new "pack" configuration?
 
Don't forget you must disconnect the cell completely from the circuit, or else you are shorting across it when you "bypass" it.

That means every cell or cell group would need relays or some other switch to cut it out of the battery as it fills up, and then to reconnect it after the charge is complete so you can use the battery.
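
To make the idea concrete, here's a rough sketch of the control decision being described: switch out any cell that has hit HVC and tell the charger to match the shorter string. It's purely illustrative; the thresholds, setpoints, and function names are my own assumptions, not any real BMS firmware.

```python
# Hypothetical sketch of bypass charging: all values and names are
# illustrative assumptions, not real BMS firmware.

HVC = 4.20            # per-cell high-voltage cutoff (V), typical Li-ion
CV_SETPOINT = 4.20    # charger CV target per active cell (V)
CHARGE_CURRENT = 2.0  # requested charge current (A)

def plan_charge_step(cell_voltages):
    """Decide which cells to bypass and what to ask of the charger
    for the remaining series string."""
    bypass = [v >= HVC for v in cell_voltages]   # full cells get switched out
    active = bypass.count(False)                 # cells still in the string
    if active == 0:
        return bypass, 0.0, 0.0                  # everything full: stop charging
    target_voltage = active * CV_SETPOINT        # charger must match the shorter string
    return bypass, target_voltage, CHARGE_CURRENT

# Example: cell #2 reaches HVC first in a 4s pack
voltages = [4.11, 4.20, 4.08, 4.05]
bypass, v_set, i_set = plan_charge_step(voltages)
print(bypass)                            # [False, True, False, False]
print(f"{v_set:.1f} V, {i_set:.1f} A")   # 12.6 V across the remaining 3 cells
```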
 
TI has an inductive system that works by transferring the bleed current from full cells directly to lower cells.

You can get the system, but it is bloody expensive either way, as every cell string needs an individual flyback transformer board and the pricey chip from TI.
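
For what it's worth, the shuttle idea is easy to model. Here's a toy simulation with made-up numbers; it is not TI's parts or reference design, just an illustration of charge moving from the fullest cell to the emptiest through a lossy flyback stage.

```python
# Toy model of inductive "shuttle" balancing: charge leaves the fullest cell
# and a smaller amount arrives at the emptiest cell; the rest is lost as heat.
# The numbers are made up for illustration; this is not TI's design.

SHUTTLE_CURRENT = 1.0   # A drawn from the donor cell
EFFICIENCY = 0.85       # fraction of the moved charge that arrives
STEP_HOURS = 0.01       # simulation time step (36 seconds)

def shuttle_step(charge_ah):
    """One balancing step between the fullest and emptiest cell groups."""
    donor = charge_ah.index(max(charge_ah))
    recipient = charge_ah.index(min(charge_ah))
    moved = SHUTTLE_CURRENT * STEP_HOURS
    charge_ah[donor] -= moved
    charge_ah[recipient] += moved * EFFICIENCY
    return charge_ah

cells = [9.8, 10.0, 9.5, 9.7]            # Ah stored in each cell group
while max(cells) - min(cells) > 0.02:    # stop when the spread is ~20 mAh
    cells = shuttle_step(cells)
print([round(c, 3) for c in cells])
```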
 
A series string maintains perfect coulombic balance as it's charged and discharged.
Cells unbalance due to differing rates of self-discharge and, to a much lesser extent, differing rates of charge-storage efficiency. Both of the processes that cause imbalance also cause micro-gassing in the cell from unintended side reactions.

Pouch and cylinder cells both contain a defined finite interior sealed volume.

In a perfect world with a perfect battery, the string remains in perfect balance with no balancing needed. In the real world with cells made by humans, tiny differences in self-discharge current cause cells to slowly drift apart in voltage. Healthy cell self-discharge is in the <0.1-10uA range. If you find yourself needing an average bleed current of even 1mA, unless this is a very large pack where that value is expected, you can see it as some cell running a tiny gas generator at a hazardously accelerated rate inside a sealed, finite-volume enclosure, and it's only a matter of time before failure.
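
To put rough numbers on that drift (the leakage currents below are example assumptions, not measurements):

```python
# Back-of-envelope illustration of how fast a small difference in
# self-discharge currents pulls two cells apart. Example figures only.

HOURS_PER_MONTH = 24 * 30

def drift_mah_per_month(leak_a, leak_b):
    """Capacity drift (mAh/month) between two cells with different
    self-discharge currents (in amps)."""
    return abs(leak_a - leak_b) * HOURS_PER_MONTH * 1000

# Healthy pair: both leaking single-digit microamps
print(drift_mah_per_month(5e-6, 8e-6))   # ~2 mAh/month, negligible
# One defective cell leaking 1 mA
print(drift_mah_per_month(5e-6, 1e-3))   # ~716 mAh/month, bled off continuously by the BMS
```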

Until a BMS can actually repair the defect causing the self-discharge inside a cell, and hence stop the abnormally high rate of gas production, the more efficient or higher-power bleed/shuttle current a BMS offers, the better it can hide defective cells, steering the pack towards a more serious type of failure than simply having the self-discharging cell drift to 0V without event, which is what a weak BMS permits.

Only when a BMS can open up the defective cell, fix the impurity issue or coating imperfection or whatever is causing the imbalance, and then de-gas the cell after repairing it will higher-power balancing make sense as a feature. Until that point (which is unlikely to ever occur), a tiny 10mA 1/4W bleed resistor is an intrinsically better BMS solution than designs that can shuttle around high bleed currents, which only delay pack failure and push it towards a more critical type (cell envelope failure rather than just drifting to 0V).
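
For scale, that little bleed resistor really is a trivial part (example numbers, assuming a LiFePO4 cell near end of charge):

```python
# Rough sizing of a passive bleed resistor like the one described above.
# The cell voltage and bleed current are example assumptions.

V_CELL = 3.6      # V, roughly a LiFePO4 cell near end of charge
I_BLEED = 0.010   # A, a 10mA bleed current

resistance = V_CELL / I_BLEED   # Ohm's law: R = V / I
power = V_CELL * I_BLEED        # heat dissipated in the resistor

print(f"{resistance:.0f} ohm, {power * 1000:.0f} mW")   # 360 ohm, 36 mW
# A 1/4W (250 mW) part runs far below its rating while bleeding
# roughly 10 mAh off a full cell per hour.
```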

Also, the last Tesla BMS I saw used resistive bleed balancing, and only 80mA max bleed IIRC.

ATB,
-Luke
 
Could you combine a charger that charges each cell individually with a typical bms?

You'd never need the top balancing part of the bms that way, but would still get the other features: LVC, max amps, etc. You'd just use the regular bms because they're economical to buy.

The charge current would of course need its own set of balance wires, but the input current would simply bypass the bms. Or would it? Would you need switches on the balance wires to turn off during charging?
 
dogman dan said:
Could you combine a charger that charges each cell individually with a typical bms?

You'd never need the top balancing part of the bms that way, but would still get the other features: LVC, max amps, etc. You'd just use the regular bms because they're economical to buy.

The charge current would of course need its own set of balance wires, but the input current would simply bypass the bms. Or would it? Would you need switches on the balance wires to turn off during charging?


This also doesn't repair a defective high-self-discharge cell; it just keeps topping that cell up, so it never safely bleeds down and disables the pack before enough gas has been generated to fail the pouch/can seal.

IMHO, if you have a defective cell, it's best to stop pack operation until you can remove it rather than help it limp along towards a more significant failure mode.
 
dogman dan said:
Could you combine a charger that charges each cell individually with a typical bms?

You'd never need the top balancing part of the bms that way, but would still get the other features: LVC, max amps, etc. You'd just use the regular bms because they're economical to buy.

The charge current would of course need its own set of balance wires, but the input current would simply bypass the bms. Or would it? Would you need switches on the balance wires to turn off during charging?
Yeah, that would be a cool thing to have, but it sounds a bit like wishful thinking.
It would be cool to have a super magic complex BMS that could, say, use regen energy from DD motor braking and feed that power into your lowest cells to keep everything in balance.

But I can see LFP's point of view that it's all too complex and is kind of a band-aid masking the underlying deterioration of the battery pack.
But I can't help thinking that if we had BMSes that smart, we would also have full information fed live to our Google Glass heads-up display, and maybe an Apple Siri voice saying, "Sir, you really should get cell S13 looked at by a battery technician, I can't keep feeding it regen energy directly forever...". And that would be an argument for having cell topping-up BMS technology.

I think, considering there seems to be a decent percentage of BMS board failures out there, we ought to get the basics solid first.
The OP can sit there and wish for this stuff all he wants, but it's not even on the drawing board.
 
Absolutely, no bms or balancer can turn a shit cell into shinola. I just saw the advantage of single cell-group charging as being faster than bulk charge, bleed, charge, bleed going on for hours. If you do have that going on for hours, then you do have a shit cell in there that needs to go.

I just wondered what might happen if you did combine a typical cheesy bike battery bms with a charger like the Thunder Power 21s that charges down the balance wires. In such a case, you really would not need the bleed-down circuits, just the LVC to shut off the whole pack when one cell group got low.
 
I agree, it would be useful on a large pack if the BMS provided early warning of a faulty cell string rather than just blindly balancing it.
 
From what I've read, some of the open distributed BMS designs do data logging so you can identify a faulty cell easily. Mostly in the world of EV cars though, not so much on ebikes.
 
An isolated charger for each cell (or parallel string) removes the need for balancing as each cell takes only the charge it needs to top it off. When series charging you need to shunt power from the already-full cells so the slower ones can catch up.
 
On bike and scooter batteries, I'd just like to see a diagnostic port. Open a panel, and a shop tech could plug in a cellog 8 or two and see if a cell has died.

Some of these bike shop mechanics really are not ready for taking the thing apart, and poking around with a voltmeter.
 
Punx0r said:
An isolated charger for each cell (or parallel string) removes the need for balancing as each cell takes only the charge it needs to top it off. When series charging you need to shunt power from the already-full cells so the slower ones can catch up.


Yep, there are many tricks to end up with a pack that has all matched cell voltages.

Sadly, none of them repair or replace a defective cell for you.
 
Is there no source of imbalance that doesn't indicate a faulty cell? Otherwise it seems you're saying that with quality cells (I figure 18650s must be about the best) there should be no need to ever balance?

What about thermal gradients across a large pack resulting in different charge/discharge efficiency?
 
Regarding the OP's question...

Resistive balancing isn't really that wasteful once the pack is initially balanced. Take my 16S/10Ah LiFePO4 pack as an example, and forgive the rough numbers I'm using. The BMS bypasses about 0.5A once a cell hits 3.5V, and the balancing stage takes no more than a few minutes before all the cells are charged. Worst case, that's about 30W being dissipated for 5 minutes or so. Given that the total pack capacity is ~500Wh and 30W for 5 minutes is 2.5Wh, less than 1% of the energy delivered by the charger is wasted compared with an "active" balancing scheme that shuttles charge around or removes charged cells from the string. If the pack is only 50% discharged, then I may waste 1% of the energy delivered per charge. Inefficiencies in the charger cause more energy loss than bleeding off a little energy at the end of charge.
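
If you want to sanity-check those rough numbers yourself (same assumptions as above, with 3.2V nominal cells for the pack energy):

```python
# Re-running the rough numbers above (16s / 10Ah LiFePO4, ~0.5A bleed per
# cell at ~3.5V during the balance stage; all figures are approximations).

CELLS = 16
V_BALANCE = 3.5        # V per cell where bleeding starts
I_BLEED = 0.5          # A bypassed per cell
BALANCE_MINUTES = 5    # worst-case time spent bleeding
PACK_WH = CELLS * 3.2 * 10   # ~512 Wh nominal pack energy

bleed_power = CELLS * V_BALANCE * I_BLEED        # ~28 W dissipated pack-wide
wasted_wh = bleed_power * BALANCE_MINUTES / 60   # ~2.3 Wh per full charge

print(f"{bleed_power:.0f} W for {BALANCE_MINUTES} min = {wasted_wh:.1f} Wh")
print(f"{wasted_wh / PACK_WH:.1%} of a full charge")          # ~0.5%
print(f"{wasted_wh / (PACK_WH / 2):.1%} of a 50% recharge")   # ~0.9%
```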
 