# My theory of how to max-out the lifecycle of Li-Ion batteries

#### thunderstorm80

##### 1 kW
Hi,
Now that I've started working with mortal Li-Ion batteries (non-A123s - my 7-year-old A123 has lost only 20% of its capacity, but sadly genuine ones are no longer available), I have learned a lot about them and how you can greatly extend their cycle life, beyond the well-known rule of topping up to less than 4.2V per cell.
One of the methods Tesla uses to increase the cycle life of their cells is a declining charging C rate during the bulk stage, so that by the time the battery is close to full (still in the bulk phase) the C rate is quite small.
I was reading that the Li ions are actually competing with each other, and the closer you get to either extreme (whether fully charged or fully discharged), the harsher that "competition" on the respective electrode becomes. This is reflected as an increase in IR as you move the chemistry in that direction, so if you discharge the cell at a constant current, you can feel it getting warmer toward the very end. (The bigger voltage drop tells you that, too.)
My theory is that if you have a programmed policy of a variable discharge/regen limit vs the current %SOC (State of Charge), you can avoid those issues and take care of the greatest enemy: Heat.
Here is a graph which explains it. Sorry for the simple Windows-Paint job:

For example, at the very beginning of charging an empty battery, the IR (Internal Resistance) for charging should be lower than the spec IR, so even huge surges of continuous regen current wouldn't harm it. (This is what Tesla does - I read that the first half "tank" of the battery fills up quite quickly, and once you reach 80% it already takes a very long time to reach 100%.)
On the same note, you wouldn't want to keep using the same battery current limit when discharging a battery that's almost empty.
Note that until you cross 50% SOC from either direction, you can actually draw/charge more than the "stock" battery current limit recommended by the manufacturer. I believe the spec'ed values (including IR) are for 50% SOC, where the chemistry acts symmetrically in either direction.
Min-Current and Max-Current can be programmed (for example, Min-Current can be set to 0A if someone wants to fully optimize cycle life at the cost of some usable energy range), as well as the low and high %SOC values at which the limit changes between the variable linear current limit and the constant current.

A device like the Cycle Analyst, which knows the %SOC, could reduce the current limit linearly to zero (or to a defined low current limit) when approaching either end of the %SOC range. The user could program its maximum battery current when starting to discharge a 100% SOC battery, or when regen'ing one that is at 0% SOC. It could also accept either zero current at the edges or a defined current (which would be much lower than the "standard" battery current).
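As a rough sketch of that policy in Python (the breakpoints, parameter names and defaults here are all made up for illustration - this is not anything the Cycle Analyst actually implements):

```python
def current_limit(soc, i_max, i_min=0.0, soc_lo=0.1, soc_hi=0.9, charging=True):
    """Linearly taper the allowed current from i_max down to i_min as SOC
    approaches the 'stressed' end: 100% when charging, 0% when discharging.
    Between soc_lo and soc_hi the full i_max applies."""
    if charging:
        if soc <= soc_hi:
            return i_max
        # linear ramp from i_max at soc_hi down to i_min at 100% SOC
        frac = (1.0 - soc) / (1.0 - soc_hi)
    else:
        if soc >= soc_lo:
            return i_max
        # linear ramp from i_max at soc_lo down to i_min at 0% SOC
        frac = soc / soc_lo
    return i_min + (i_max - i_min) * frac
```

So with a 30 A limit and the defaults above, you'd still get the full 30 A at 50% SOC, but only about half of it at 95% SOC when charging (or at 5% SOC when discharging), tapering toward zero at the edge.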

What do you think?
I find myself already doing this manually by limiting my new non-A123 battery to no more than 70% charge, so I can still feed it short-duration peaks of 1 kW of regen without fearing rapid degradation.
(of course I can charge it to 80% and even 100% if I need the range)

Perhaps the graph should follow an exponential decay rather than a straight line - I don't know for sure. But I do believe such an approach, even if only empirical, is correct.

A caveman approach to the max charging rate might be to add a couple of temp sensors to the charging system. The changing IR can be tracked fairly accurately by monitoring the heat coming off the pack. Program the charger to start charging at "X" amps, and as the pack warms, the amps slowly decline in steps?
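That step-down could be sketched like this (a toy example - the step size, ambient reference and minimum current are arbitrary illustrative numbers, not tuned values):

```python
def stepped_charge_current(pack_temp_c, start_amps=10.0, ambient_c=25.0,
                           deg_per_step=3.0, step_amps=2.0, min_amps=2.0):
    """Temperature-feedback current limit: start at start_amps and drop
    step_amps for every deg_per_step degrees the pack warms above ambient.
    Rising pack temperature serves as a rough proxy for rising IR."""
    rise = max(0.0, pack_temp_c - ambient_c)
    steps = int(rise // deg_per_step)
    return max(min_amps, start_amps - steps * step_amps)
```

With these numbers a pack at ambient gets the full 10 A, a pack 6 °C warm gets 6 A, and a hot pack is clamped at the 2 A floor.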

I guess we need to figure out Tesla's charge current versus SOC curve and mimic it.

It is quite interesting what you are saying. I have read a lot on extending the life of LiPo and LiFePO4 batteries. I bought a drone about two years ago, and the LiPo batteries just do not last if they get too hot. My general rules are to make sure they cool off before recharging and to charge at half the specified current.

I bought a couple of vpower packs in 2009 and kept them fully charged at all times, always charging them immediately after the ride. They are 48V 20Ah and still have about 95 percent capacity. I suspect the reason they are in decent shape is that I ordered them in quite a long configuration to allow cooling. I also use a three-amp charger, since I give them lots of time to charge. I have never seen much discussion on heat-sinking batteries to dissipate their heat, but I am sure that some heat dissipation between cells would extend their life the most.

thunderstorm80 said:
For example, at the very beginning of charging an empty battery, the IR (Internal Resistance) for charging should be lower than the spec IR, so even huge surges of continuous regen current wouldn't harm it.
At the very beginning of charging an empty battery it will tend to be cold, and therefore have a higher ESR than normal. As it warms up due to I²R heating from charging, it will be able to take higher and higher currents without exceeding cell voltage limits. Since chargers generally do not get to currents this high, it's often not an issue. You do see this in EVs with "odd" regen limits (i.e. very little) when you first start driving.

(This is what Tesla does - I read that the first half "tank" of the battery fills up quite quickly, and once you reach 80% it already takes a very long time to reach 100%.)
That's a very different phenomenon, and has to do with the difference between constant current charging (where you can go fast) and constant voltage charging (where the current starts to limit so as not to exceed cell voltage.)
A device like the Cycle Analyst, which knows the %SOC, could reduce the current limit linearly to zero (or to a defined low current limit) when approaching either end of the %SOC range.
Chargers DO reduce the current to zero as the battery approaches 100% SOC, due to that CC/CV thing I mentioned earlier. In addition, some EVs reduce maximum discharge rates as the battery nears empty, although this is more to prevent cell undervoltage than anything to do with an electrochemical problem with drawing current.

Overall the best things you can do for li-ions are -
- Store them at less than full charge (50% is great) and store them cold.
- Charge them at the lowest rate you can live with.
- Cycle them less often. (Obviously not practical much of the time.)
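The CC/CV behavior mentioned above reduces to a one-line current limit. A simplified sketch: it treats the cell as a resting (open-circuit) voltage behind a fixed internal resistance, and the 4.2 V / 0.05 Ω values are made-up illustrative numbers:

```python
def cc_cv_charge_current(cell_v, i_cc, v_cv=4.2, r_internal=0.05):
    """Hold the constant current i_cc until the terminal voltage
    (cell_v + I * r_internal) would exceed v_cv, then limit current so
    the terminal voltage stays pinned at v_cv (the CV taper)."""
    i_cv = max(0.0, (v_cv - cell_v) / r_internal)  # current that pins terminal at v_cv
    return min(i_cc, i_cv)
```

With these numbers a cell resting at 3.7 V gets the full CC current, while at 4.15 V the limit has already tapered to 1 A, and it reaches zero at 4.2 V - which is exactly the "current falls off near full" effect, with no SOC lookup involved.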

It's something I've noticed with new mobile phones as well. When the battery is completely depleted and I plug the USB in, it doesn't fast-charge and is very slow to charge.

Then once it charges to some point, it charges faster.

cwah said:
It's something I've noticed with new mobile phones as well. When the battery is completely depleted and I plug the USB in, it doesn't fast-charge and is very slow to charge.
Yes. You are seeing the battery's protection circuit limit charge because the cell voltage is below the safe threshold for charging. So it trickle charges until the cell voltage goes above the limit, then resumes normal charge.

Needless to say it would be better to not let the cell voltage get that low (bad for the battery) - but phone manufacturers want to be able to advertise long talk and standby times, so they let it drop that low.

It's simple: just charge as slowly as you can. You can't charge a cell too slowly - slower is better. Even Tesla nerfs battery charge rates if you use the Supercharger too much.

i am working on this right now:

https://endless-sphere.com/forums/viewtopic.php?f=14&t=93989

Charging takes about 4~5 hours and a discharge usually 1~1.5 hours. We will see how long cells can live with more relaxed charging rates.

How long will this 7S 144P Panasonic NCR18650PF battery last? Max charge 20 A, max discharge 2 A :lol:
1008 cells - heavy to lift and a hell of a job.

why?

Why not :wink: The SLA 24V 400Ah was too big and heavy, with poor energy density.

A compact 60 kg, 10.5 kWh Li-ion battery will work and fit. The mobile device can run 9 days.

This fits in the back of a Smart car and will power cameras and other gear like mobile listening equipment (private detective work).

The watt meter shows 60 W max, so this will work perfectly.

A proper big charger (29.4 V, 100 A+) plus a BMS that supports it was way too expensive, and charge time was no issue.
A complete 25 V 400 Ah lithium battery like a Victron would cost €16K including the charger... so the guy saves money and this will work fine.

A 7S 45A BMS and a good waterproof 20 A charger will do the trick... The cells are all new and a good brand, so I think there will be no issues or cells out of balance... very low max discharge amps :lol:

billvon said:
thunderstorm80 said:
For example, at the very beginning of charging an empty battery, the IR (Internal Resistance) for charging should be lower than the spec IR, so even huge surges of continuous regen current wouldn't harm it.
At the very beginning of charging an empty battery it will tend to be cold, and therefore have a higher ESR than normal. As it warms up due to I²R heating from charging, it will be able to take higher and higher currents without exceeding cell voltage limits. Since chargers generally do not get to currents this high, it's often not an issue. You do see this in EVs with "odd" regen limits (i.e. very little) when you first start driving.

(This is what Tesla does - I read that the first half "tank" of the battery fills up quite quickly, and once you reach 80% it already takes a very long time to reach 100%.)
That's a very different phenomenon, and has to do with the difference between constant current charging (where you can go fast) and constant voltage charging (where the current starts to limit so as not to exceed cell voltage.)
A device like the Cycle Analyst, which knows the %SOC, could reduce the current limit linearly to zero (or to a defined low current limit) when approaching either end of the %SOC range.
Chargers DO reduce the current to zero as the battery approaches 100% SOC, due to that CC/CV thing I mentioned earlier. In addition, some EVs reduce maximum discharge rates as the battery nears empty, although this is more to prevent cell undervoltage than anything to do with an electrochemical problem with drawing current.

Overall the best things you can do for li-ions are -
- Store them at less than full charge (50% is great) and store them cold.
- Charge them at the lowest rate you can live with.
- Cycle them less often. (Obviously not practical much of the time.)

I didn't superimpose the IR vs the temperature component into my theory.
I'm speaking purely about the remaining state of charge vs. the recommended declining current (though in practice the temperature should be factored in as well when you superimpose everything into your equation).
And you didn't fully follow what I was talking about:
I know that on normal charging, the turnover point from CC to CV happens at around the 85%-90% mark, and the standard LVC at the end of discharge works on a similarly simple closed-loop feedback.
I'm not talking about this standard limit/behavior, which happens just because the charger already sees the target voltage (or the controller sees the LVC voltage).
I'm talking about limiting the current vs. the %SOC remaining (depending on the direction you move the chemistry). For example, the allowable charging current at 30% would be higher than at 40%. (And again, this is NOT a CV stage.)
Look at my graph again.
This of course requires a smart charger, and a software addition to the Cycle Analyst.

thunderstorm80 said:
I know that on normal charging, the turnover point from CC to CV happens at around the 85%-90% mark, and the standard LVC at the end of discharge works on a similarly simple closed-loop feedback.
Sometimes. The point at which you switch from CC to CV depends also on temperature, charge rates and ESR (which in turn depends on life.) Towards the end of the life of my Leaf battery, for example, the CV/CC switchover (when fast charging) happened at the 25% mark.
I'm not talking about this standard limit/behavior, which happens just because the charger already sees the target voltage (or the controller sees the LVC voltage).
I'm talking about limiting the current vs. the %SOC remaining (depending on the direction you move the chemistry). For example, the allowable charging current at 30% would be higher than at 40%. (And again, this is NOT a CV stage.)
That would depend on the charge rate (whether it was CV or CC.)

It would be interesting to see what actual charge rates would look like with your method vs. the several standard charging methods for lithium ion. I have a feeling that at slow charge rates it wouldn't matter (since charge rates are so far below 1C anyway) but at very high charge rates there might be an improvement at the cost of longer charge times.
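A toy coulomb-counting comparison (no thermal or ESR model; the 2 Ah capacity, 2C rate, 80% knee and 0.4 A floor are all invented for illustration) of plain constant current vs. the proposed linear taper:

```python
def charge_minutes(limit_amps, capacity_ah=2.0, dt_s=1.0, target_soc=0.99):
    """Integrate SOC under a current-limit policy limit_amps(soc).
    Pure coulomb counting: no losses, no temperature, no CV tail."""
    soc, t = 0.0, 0.0
    while soc < target_soc:
        soc += limit_amps(soc) * dt_s / (capacity_ah * 3600.0)
        t += dt_s
    return t / 60.0  # minutes

def cc(soc):
    # plain constant current at 2C on a 2 Ah cell
    return 4.0

def taper(soc):
    # full current below 80% SOC, then a linear ramp toward zero at 100%,
    # with a 0.4 A floor so the charge actually finishes
    return 4.0 if soc < 0.8 else max(0.4, 4.0 * (1.0 - soc) / 0.2)
```

Running both to 99% SOC, the taper finishes noticeably later than the plain CC charge, which matches the guess above: at high rates the gentler end-of-charge current trades charge time for (presumably) less stress, while at rates far below 1C the policies barely differ.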
