Does capacity degradation affect max safe charge/discharge current?

Jan-Erik-86 · Joined Jan 23, 2019 · Messages: 110
Hi,

I'm currently putting together a battery pack for my e-bike to extend the capacity of my existing battery and reduce its rather high voltage sag.

I'm using older/used 18650 laptop cells, mainly from Samsung, LG and Panasonic, so finding the datasheet is not difficult, and neither is finding the max rated charge and (more importantly) discharge current. Therefore it should be rather simple to calculate how many cells in parallel I need to stay inside the safe discharge zone.
However, since charge and discharge current is often given as a C-rating, one question popped up in my head, and I've been unable to find a clear answer after searching a bit online.

Take for example this cell: ICR18650-28A. Capacity: 2.8Ah, max charge: 2.8A, max discharge: 5.6A.

Now these ratings equal exactly 1C charge and 2C discharge, which is all fine while the cell is new and actually has 2.8Ah of capacity, but what about my used cell that only has 2.1Ah of capacity?
Is it still safe to charge it at 2.8A, which would then equal 1.33C, and discharge it at 5.6A, which would then equal 2.67C? Or should I follow the current capacity of the cell and set the max safe charge to 2.1A and the max safe discharge to 4.2A, to maintain the 1C charge and 2C discharge rating?

I know common sense tells me "when in doubt, add extra parallel row(s) to be sure", and I will probably do that anyway, but I'm asking in general to also learn the theory behind how rated specifications are affected by a degraded battery. :)
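To put numbers on the question, here is a minimal Python sketch of the arithmetic. The capacities and currents are the ICR18650-28A figures already quoted above; nothing else is assumed:

```python
# C-rate is just current divided by capacity, so the same absolute
# current becomes a higher C-rate as the cell degrades.
RATED_AH = 2.8         # datasheet capacity of the ICR18650-28A
DEGRADED_AH = 2.1      # measured capacity of the used cell
MAX_CHARGE_A = 2.8     # datasheet max charge current
MAX_DISCHARGE_A = 5.6  # datasheet max discharge current

def c_rate(current_a, capacity_ah):
    """C-rate = current (A) / capacity (Ah)."""
    return current_a / capacity_ah

print(f"Charge at {MAX_CHARGE_A} A:    "
      f"{c_rate(MAX_CHARGE_A, RATED_AH):.2f}C new, "
      f"{c_rate(MAX_CHARGE_A, DEGRADED_AH):.2f}C degraded")
print(f"Discharge at {MAX_DISCHARGE_A} A: "
      f"{c_rate(MAX_DISCHARGE_A, RATED_AH):.2f}C new, "
      f"{c_rate(MAX_DISCHARGE_A, DEGRADED_AH):.2f}C degraded")
```

Run as-is, this prints 1.33C and 2.67C for the degraded cell, matching the numbers in the question.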
 
Capacity degradation either causes increased internal resistance, or is caused by it. Bear in mind, I'm not an engineer, but the way I think of it is that as resistance increases, much of your capacity is lost because you are making the cells hotter than before.

So yeah, if you are making the pack hot now, when it used to get just a bit warm, you really need to add more in parallel. If your sag is high, so is your internal resistance relative to your discharge rate. Increasing pack size will change that resistance/discharge-rate fraction.

As I have worn out RC cells, a strong 10Ah pack needs to first be paralleled with another 10Ah pack of the same age to continue a 40A discharge without losing capacity to heating. Lower that discharge rate enough, though, and lots of that capacity returns. But by the end of the cell lifespan, you have to drop to 0.5C or less to see 75% of the original 500Wh from a 48V 10Ah pack. At the very end, even fairly weak discharge rates are impossible without losing more than half the original capacity. So at that point, I only want to pull 400W or less from a big 20Ah bundle of paralleled packs.
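The paralleling logic above can be sketched numerically: heat dissipated inside a pack scales with the square of the current through it, so splitting the same total draw across more parallel packs cuts per-pack heating sharply. The 40 A total draw comes from the RC example above; the 50 mΩ per-pack internal resistance is purely an illustrative assumption, not a measured value:

```python
# Heat inside a pack is roughly I^2 * R, so splitting a fixed total
# current across more parallel packs reduces per-pack heating with
# the square of the split.
PACK_CURRENT_A = 40.0  # total discharge, as in the RC example above
PACK_IR_OHM = 0.050    # ASSUMED per-pack internal resistance (illustrative)

def per_pack_heat_w(total_a, n_parallel, ir_ohm):
    """Watts dissipated in each of n identical parallel packs."""
    per_pack_a = total_a / n_parallel
    return per_pack_a ** 2 * ir_ohm

for n in (1, 2, 4):
    print(f"{n} pack(s) in parallel: {PACK_CURRENT_A / n:.0f} A each, "
          f"{per_pack_heat_w(PACK_CURRENT_A, n, PACK_IR_OHM):.0f} W heat per pack")
```

With these assumed numbers, going from one pack to two drops the heat in each from 80 W to 20 W, which is why a worn pack that sags badly alone can still be usable when paralleled.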

Clearly there is still another reason capacity is gone, since the pack now won't hold a full charge overnight. But heating can double that loss, if you hit it with more amps than weak, old, tired cells can take.

Bottom line: if you are running scavenged cells, you are going to need to pack 20, 30, or even 40Ah of them to have any ability to carry a higher-amp motor/controller system at full gallop.

Re safety, I don't know why my 18650 pack burned down my garage while charging. But I am certain that somehow one cell got hot, melted down, and then the whole pack went off like a string of firecrackers. The pack was not that old, and 20Ah, but it was on a moderately fast 5A charger (0.25C on a 20Ah pack). Most likely my BMS failed to prevent overcharging, but nobody will ever know for sure. Did the 5A charger fry the BMS? Who knows?

All I can say about safety now is three words. Outside, outside, and outside.
 
Industry standard is to retire cells at 80% SoH.

I might push EoL past 70% with lead or LFP, but not LiPo.

Using cells until they actually **fail** multiplies the odds a thousandfold that you **will** see "sudden destructive failure modes" that may be very dangerous.

My goal in battery usage is No Surprises.

Yes that means spending a little more.
 
What's 1C on a 3Ah cell? 3 amps.
What's 1C on a 1.5Ah cell? 1.5 amps.

As your battery capacity decreases with wear, so should the expectation of how much current it outputs.

Always better to have a battery that can output more amps than you need.
 
neptronix said:
As your battery capacity decreases with wear, so should the expectation of how much current it outputs.
As well as how much current it can take during a charge. ;)
 
amberwolf said:
neptronix said:
As your battery capacity decreases with wear, so should the expectation of how much current it outputs.
As well as how much current it can take during a charge. ;)

Indeed!
 
Modern cells' internal resistance isn't affected much by capacity degradation.

What mostly affects internal resistance is charge rate.

Faster charging stresses the cell more, and over a long period of time, can affect a cell's internal resistance significantly.
 
Thank you for all your replies!

Dan, I previously read about your battery fire, and it's something I keep in mind whenever I charge batteries. I'm also sure your battery accident was a real eye-opener for many who would normally charge larger packs indoors, unattended, and at night.

I'm aware that internal resistance increases with use and age, and that this in turn causes higher voltage sag under load, less ability to absorb charge, and the excess energy being turned into heat. However, I've yet to find a clear answer on how to determine a safe current draw for a degraded cell.
I know I cannot expect "as new" performance from a used cell, but mainly what I'm wondering is: at some point a manufacturer had to determine whether the cell they're making is safe for a 2A discharge, or a 20A discharge. I know internal resistance plays a huge role, as it directly affects the battery temperature under load, but is this decision mainly based on battery temperature at a selected load, or are there other critical factors as well? If the answer is mainly just heat caused by internal resistance, then I have my answer... :)
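The chain described here (internal resistance, then sag, then heat) is just Ohm's law, and can be sketched in a few lines. The resistance figures below are illustrative assumptions, not measurements of any real cell; the 5.6 A draw is the example cell's rated max discharge:

```python
# Sag = I * R, heat = I^2 * R. The mOhm values are ASSUMED for
# illustration, not taken from any datasheet or measurement.
def sag_and_heat(current_a, ir_ohm):
    """Return (voltage sag in V, heat in W) at a given load current."""
    return current_a * ir_ohm, current_a ** 2 * ir_ohm

for label, ir_ohm in (("newer cell, 35 mOhm", 0.035),
                      ("worn cell, 100 mOhm", 0.100)):
    sag_v, heat_w = sag_and_heat(5.6, ir_ohm)  # example cell's max discharge
    print(f"{label}: {sag_v:.2f} V sag, {heat_w:.2f} W of heat")
```

The point of the sketch: at the same 5.6 A, roughly tripling the internal resistance roughly triples both the sag and the heat, which is why a fixed ampere limit gets harder on the cell as it wears.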


John, I've never heard of an industry standard claiming that cells should be replaced when they have 80% capacity left. Naturally, in many cases battery-dependent systems would benefit from the batteries being replaced at that point to achieve the required runtime, but that's a different story. I know cycle count is based on how many repeated 0-100% cycles a cell can handle before it only has 80% capacity left, but I've never read in any datasheet that a cell is recommended for replacement at that point, and/or that the cell's safety is compromised if kept in service after it has lost 20% or more of its rated capacity. If what you say is indeed true, then it's a very interesting subject, and I would be really happy if you could show some references.
I see you mention other battery technologies like LiPo and LFP, but I'm asking about li-ion (regardless of chemistry).


neptronix, this I am already aware of, but it's not really what I asked; it's more like my actual reason for asking what I did... :)

What I meant by my question is more like this, since some datasheets specify a safe current by C rating, and others by an amp rating:
If a new 3Ah cell is rated for a 1C discharge in the datasheet (not by an A rating), it's safe to discharge it at 3A.
If the cell has lost capacity and is down to 2Ah, is it then safe to discharge at 2A, following the 1C rating?
If the cell has lost even more capacity and is down to 1Ah, is it then safe to discharge at 1A, following the 1C rating?

If a new 3Ah cell is rated for a 3A discharge in the datasheet (not by a C rating), it's safe to discharge it at 3A.
If the cell has lost capacity and is down to 2Ah, is it still safe to discharge at 3A, following the 3A current rating?
If the cell has lost even more capacity and is down to 1Ah, is it still safe to discharge at 3A, following the 3A current rating?

I doubt the answer for the 3A discharge is "yes" (as the cell will naturally get hotter and hotter as internal resistance increases, if the discharge current remains the same), and so this leads me back to the original question: is the answer for the C-rated discharge "yes", or are there other factors to consider? And if so, which factors, and how do you determine/calculate them?
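The two datasheet conventions in the lists above can be put side by side in a short sketch. Under a C-rating, the safe current shrinks along with capacity; under a fixed ampere rating, the current stays put and the effective C-rate climbs as the cell fades, which is exactly the case being questioned:

```python
# Two ways a datasheet can state the discharge limit, applied to a
# fading cell (the 3 Ah / 2 Ah / 1 Ah figures from the example above).
def max_current_c_rated(c_rating, capacity_ah):
    """Safe current when the datasheet gives a C-rating: tracks capacity."""
    return c_rating * capacity_ah

def effective_c_rate(fixed_current_a, capacity_ah):
    """Effective C-rate when the datasheet gives a fixed current in amps."""
    return fixed_current_a / capacity_ah

for capacity_ah in (3.0, 2.0, 1.0):
    print(f"{capacity_ah:.0f} Ah cell: 1C limit = "
          f"{max_current_c_rated(1.0, capacity_ah):.1f} A; "
          f"a fixed 3 A draw = {effective_c_rate(3.0, capacity_ah):.1f}C")
```

This only formalizes the question, of course; whether the fixed 3 A draw remains safe at 1 Ah remaining capacity is the open issue the thread is debating.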


BlueSwordM, that's an interesting statement. Is this due to different chemistries used to build the battery, or simply better production quality (cleaner materials etc.)?
You only mention high charge rates, but I'm guessing a high discharge rate would also cause it, even on a modern battery?


Again, please note I'm not in any way going to be pushing my new battery pack in either charge or discharge. My cells will not see much more than 0.5C continuous discharge, with peaks maybe up to 1.5C. I'm simply asking to learn at this stage, and sorry for the long post. :)
 
I have actual and current practical experience with this... I've built my pack mostly out of old cells. You could consider the "new" pack that I'm building to be old... however, all the cells have been tested to be exactly within the spec sheet for the cells. They've sat around for years without ever being discharged.

What you need to do beyond just looking up datasheets is to actually test the cells. If you plan on discharging them around 4A, then you need to test each cell at 4A and see how they respond. One of my favorite dischargers is the 60W discharger that you can get for around $15. I can easily test cells up to 10A, no problem!
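One way to turn that per-cell load test into a comparable number is the usual DC internal-resistance estimate: resting voltage minus loaded voltage, divided by the test current. The voltage readings below are made-up illustration values, not measurements:

```python
# DC internal resistance from a load test: R = (V_rest - V_loaded) / I.
def dc_internal_resistance(v_rest, v_loaded, load_a):
    """Estimate DC IR in ohms from resting vs. loaded voltage at a known current."""
    return (v_rest - v_loaded) / load_a

# Made-up example readings: 4.10 V at rest, 3.90 V while pulling 4 A.
r_ohm = dc_internal_resistance(v_rest=4.10, v_loaded=3.90, load_a=4.0)
print(f"Estimated DC IR: {r_ohm * 1000:.0f} mOhm")  # 0.20 V / 4 A = 50 mOhm
```

Cells that show a much higher number than their siblings at the intended load are the ones to cull or to back up with extra parallel rows.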

Now my current pack is made with cells that definitely can't discharge at the level I'm pulling from the pack, but there's a lot more leeway on discharge rates than on charge rates, as has already been said. Fortunately, I'll be retiring all the old cells and building a new NCR18650 pack that'll easily be able to deliver the watts.

I charge at a super slow rate, 1.4A at 48V, and it takes hours to charge, but I'm OK with that.

At this point I'm not concerned about a fire from charging or discharging. I've already accidentally charged the cells up to 4.25V once and they were fine... I'm more concerned about cell damage or accidentally dropping the pack...
 
I did not mean any particular industry, nor any particular chemistry, with the 80% EoL guideline.

Lithium-ion is just an umbrella term covering the dozens of related chemistries, and I've never seen any indication other than that we should be a lot **more** cautious wrt the high-risk chemistries like LCO/LMN/NCM/NMC/NCA or the car makers' formulations.

Obviously hobbyist consumers do push the risk envelope, but for someone living on their boat, or in any small enclosed space on land, I would not allow any non-lead chemistry other than LFP, even if new, except maybe a future OTS Tesla system after approval by government agencies and insurance companies.

So yes, 70% is IMO really pushing things. If the capacity of a 150Ah-rated pack has declined below 120Ah, the whole lot should be scrapped and replacement cells purchased new.

Not just for safety's sake, but functionality and in the interest of not wasting time.

And all that's in the context of gently cycled house-bank usage, not the far more violent propulsion duty.

Thousands of banks are out there now well past a decade old and 3000 cycles, and show **zero** degradation of capacity. Well treated like that, it is IMO likely that calendar life will turn out to be the limiting factor.

But none of that apparently applies to the wild west of hobbyist EV, IMO lots of people just can't afford to play safely in that sandbox.

All this is just my opinion, obviously won't change the culture.
 