Bottom balancing?

AndyH said:
I guess I failed to explain the graphs to your satisfaction. I'll try again.

First - the 0, 5.0, 10.0, 15.0, 20.0, and 25.0 numbers across the bottom of the charts are hours. I'm pretty sure that 24 hours meets your '180 minute' instruction.

Second - you suggest I should run the charts again on a "NEW" cell. I'm very sorry for my communication error. What I meant to convey when I typed "brand-new Thunder Sky, Sky Energy, and Shandong HiPower 100Ah prismatics" in the post, and later "New 100Ah Thunder Sky LiFeYP04 - from 4.2V" in the comments for the images, is that the data came from new (unused) cells.

No. The charts I posted are correct. The voltage bleed was from a trio of brand new cells. The data was collected by a calibrated, automated device at 20ºC ambient. The cells were charged on a CC/CV power supply to within 1V of their specified charge voltage, then put on a LaMantia BA402 battery analyzer programmed to finish the charge and then log cell voltage for 24 hours.

I'd be happy to re-run the charts on video if you'd like.

Then we DO have a mystery worth investigating Andy. Because I ran this myself quite manually last night. I would suggest we have TWO possibilities here:

1. Test equipment setup.
2. We're testing two entirely different cell chemistries.

When you say "brand new" are you referring to the LiFeYPo4 cells with the Yttrium addition. Or the standard LFP cells?

I'm testing the previous chemistry, but in a new unused battery cell just removed from the box. Though I found NO difference in this among a new cell, a bench queen, and an actual known damaged cell. So I'm not quite guessing here. But they are NOT the LiFeYPo4 cells.

It's like watching paint dry, but if we're doing the same cells, I would suggest forgetting the video, and going to a manual test using a known good DVM. I've been burned by the automated equipment myself several times.

If we are indeed testing the same cells, I would suggest bailing on the pretty graphs and video, and going back to a basic measurement at the cell terminals with a DVM and a stop watch. Disconnect EVERYTHING from the cell after the charge completion. Then connect a DVM to the terminals.

Whenever I see an anomaly like this, I start throwing out test equipment until I get to the simplest form I can.

I measured every minute for the first 10 minutes, then at the hour mark, and then every 15 minutes after that. In 180 minutes it was over - 3.400 straight up and down.

I kind of did them one at a time which is what took 5 hours. The whole thing can be done in a little over three.

BTW. Since you do have some equipment, perhaps you can help explain something I HAVE been noticing on both charge and discharge. I call it the S curve, but that doesn't describe it. It is a little reversal, occurring mostly at pretty good current levels (100-200 amps), on both charge and discharge. On charge, for example, the terminal voltage RISES to 4.16, then REVERSES and falls to 3.92v, then REVERSES AGAIN and rises to 4.2. On discharge this looks like a drop during current pulses to some low value, like 2.9v for the first five 30-second pulse cycles, then a recovery to 3.10 during pulses, gradually declining as expected from there.

Have you seen this? I can pretty much duplicate it at will.


Jack Rickard
http://evtv.me
 
My apologies. I STILL missed it. You ARE using the Yttrium cells?

Then we are testing two different chemistries. But how would this explain the SE cells? They haven't changed, and I DO have the latest of those....

Jack Rickard
 
That's good to know. I don't keep up with all the latest lithium stuff. I only remembered that A123 had some patent on a nano-based chemistry allowing more storage and current flow.
 

Attachments

  • Gate Resistor Power Dissipation.pdf
    160.5 KB · Views: 4
If you take a brand new cell and you overcharge it to 4.2v, then it's now a damaged cell. It's been 0.6v overcharged. Try that with many other lithium chemistries and they burst into a fireball. Just because LiFePO4 has excellent manners and doesn't cry out when it's damaged doesn't mean it's not damaged.

I just ran a test with a Headway cell. I took a cell that always held a 3.65V charge, and charged it well beyond the proper stopping point, all the way up to 4.2v. This is the first time this cell has been damaged by overcharging like this, and I'm now able to watch the mV's dropping away; it's down to 3.5V now on a cell that always held itself at 3.65V before I overcharged it.

If you look at the cycle life degradation charts for lithium batteries, you will find the biggest factor impacting cycle life in testing was the HVC point (aside from extreme high temperature).

For example, in Kokam's Lithium Polymer LiMn cell tests, 4.1V/cell is fully charged. (Note, LiMn is a slightly lower-voltage chemistry than LiCo at 4.2v.)
In their charts, charging the cells to 4.2v on each cycle (just a 0.1V overvolt) resulted in roughly 100 cycles to 80% capacity. Charging the cell to 4.1V resulted in over 1,000 cycles to 80% capacity. For lithium-ion LiCo chemistries (a 4.2v fully-charged chemistry), we've seen testing data showing a similar 500-1000% increase in cycle life from undercharging the cell to 4.1v, at a cost of something like 6-8% capacity.


Lastly, Ri is without a doubt the single biggest predictor of cell performance. Testing it needs to happen in proper conditions to get repeatable results. I take Ri's of cells at 32F in an ice bath, at room temp, and at 130F, and plot the lines, which gives you a trend line for all the temps in between, since the slope between Ri and temperature is linear (yes, I know this means only 2 points would be needed; I take 3 points just to ensure the slope between each 2-point set matches). Knowing the Ri of the pack enables you to precisely know cell voltage drop under any current load at any temperature. Why does knowing voltage drop matter so much? It tells you exactly how much of that cell's energy is being wasted resistively heating it internally.
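Luke's three-point Ri-versus-temperature procedure lends itself to a simple linear fit. The sketch below is only an illustration of the idea: the Ri numbers for the ice-bath, room-temp, and 130F points are invented placeholders, not measured data.

```python
# Hypothetical sketch of the three-point Ri-vs-temperature method.
# The Ri values below are made-up illustrative numbers, not measurements.

def fit_line(points):
    """Least-squares fit of y = a*x + b through (x, y) points."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# (temp_F, Ri_milliohms) at the three test points: ice bath, room temp, hot
ri_points = [(32.0, 4.8), (72.0, 2.9), (130.0, 0.9)]  # assumed values

slope, intercept = fit_line(ri_points)

def ri_at(temp_f):
    """Trend-line Ri (milliohms) at any temperature in between."""
    return slope * temp_f + intercept

def voltage_sag(temp_f, current_a):
    """Predicted cell voltage drop (V) under a given load at a given temp."""
    return current_a * ri_at(temp_f) / 1000.0
```

With the trend line in hand, `voltage_sag(temp, amps)` is the "voltage drop under any current load at any temperature" Luke describes, and the wasted power is simply `I² * Ri`.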

Ri and the health of the cell trend together exactly. If you want to re-use a cell, you can't tear it apart and unwrap the cathode to look for all the places the carbon has damaged regions, or non-contact areas between separator and cathode from tiny off-gas events from overcharge, or crystal formation from the ions in the gel onto the surface of the anode from under-voltage, or drying out of the ion gel from extended over-temperature operation, etc. But what you can do is measure Ri, which reflects the cumulative sum of all these damages by measuring the cell's ability to do its job, which is to hold a voltage under load.

Ri varies between different temps and different SOCs (as you've noticed). On an ICE engine, oil pressure also varies between different temps and different RPMs, but that doesn't mean it's not the best indicator of bearing wear/health of the engine, short of tearing it down and mic'ing the bearing shells. Like so many non-constant things in science/life, you have to pick some consistent point to take measurements from; then the data is consistent and repeatable, and most importantly extremely useful for charting the degradation of the cells (as well as at least 10 other important things).


-Luke
 
Both Luke and Jack have very valid points. The problem lies in real-world application. Knowing the Ri isn't always practical since cells are usually paralleled together, so a false sense of reliability seems to be Jack's argument against it. It doesn't seem as if Jack is discounting the stuff we all agree on as much as how the details of it get applied.

If single-cell monitoring is feasible, then one could extrapolate the cutoff based on Ri and temperature. This can be easily handled by a microprocessor and some formula or lookup table. The problem lies in the practicality of it. What Jack seems to propose is to monitor the cells as they run in the system, compare the cells with others (as banks, or individually if possible), and if there are differences, prune out the weaker ones. This seems to be a useful approach if it is really quantifiable.
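The "formula or lookup table" idea might look something like this minimal sketch. The Ri table entries and the 2.8V resting-LVC figure are assumptions for illustration only, not recommendations for any particular cell:

```python
# Hypothetical sketch of a lookup-table LVC: lower the effective trip point
# when high current and low temperature make a deep voltage sag normal.
# The Ri table values and resting LVC are invented for illustration.

RI_TABLE_MOHM = {0: 6.0, 10: 4.5, 20: 3.0, 30: 2.2, 40: 1.8}  # degC -> mOhm (assumed)

def ri_lookup(temp_c):
    """Nearest-entry lookup; a real BMS would interpolate."""
    key = min(RI_TABLE_MOHM, key=lambda t: abs(t - temp_c))
    return RI_TABLE_MOHM[key]

def lvc_tripped(cell_v, current_a, temp_c, resting_lvc=2.8):
    """Trip only if the load-compensated voltage is below the resting LVC."""
    compensated = cell_v + current_a * ri_lookup(temp_c) / 1000.0
    return compensated < resting_lvc
```

For example, 2.7V at 200 amps and 0 degC compensates back above 2.8V and does not trip, while the same 2.7V at rest does; that is one way a small microprocessor could suppress the load-induced false alarms discussed later in the thread.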
 

Attachments

  • front_drive_scott.jpg
    front_drive_scott.jpg
    57.4 KB · Views: 5
I think that since these LiFePO4 cells all have a very sharp cliff at the end of discharge, it doesn't really matter which cell "NEEDS" to reach the LVC first...

In EVERY LiFePO4 pack you will ALWAYS have one cell reaching the LVC first!

I don't know of any pack you can monitor and protect with an LVC circuit where ALL the cells in series reach the LVC within a second of each other!!
There WILL always be one cell that reaches it first.

That said, does it really matter which cell reaches the LVC first... My answer is NO.

So having the LOWEST... or "WEAKEST" cell in a pack reach the LVC first doesn't really matter...

What I know is that having a cell that is WEAK, damaged, or lower by a major difference in capacity (call it whatever you want) IS NOT GOOD... that cell will continue to lose capacity, and its internal resistance will keep increasing.

What that creates is a cell that will always run warmer than the others... and since this cell sits very close to one or two neighbouring cells, the heat it creates will affect their IR too... over time this can create a chain reaction of damage that negatively affects more cells...

I've tested a lot of lithium battery packs for ebikes and power tools, and I can say it is very common to find more than one weak cell grouped together, with the one in the middle of the group being the worst...

So... what should we remember from all that?

It's simple: avoid keeping a "WEAK, damaged, etc." cell in your pack...

Now what does that mean...

Even after replacing it with a GOOD cell that has capacity and IR similar to the rest of the pack,

YOU WILL STILL HAVE ONE CELL THAT IS LOWER THAN THE REST !!..

And if you remove that one too... you will still have another that is lower than the others...

So... to me, there is no really good reason to prefer balancing at the bottom instead of at the top...

Now let's talk about one risk of balancing at the bottom instead of at the top:

What I know is that you have less chance of a cell reversal if you balance at the top.

Why? It's simple: by balancing at the bottom, every cell reaches the LVC within a shorter window of time than if you balance at the top.
Now let's look at the potential risk of a balance-wire disconnect problem...

I have heard about that problem A LOT of times here from people using their LiFePO4... REMEMBER?

Let's say your BMS has one balance/monitor wire accidentally disconnected:
If you balance at the top, you have less chance of damaging a cell than if you balance at the bottom... because you can be pretty sure that in this situation the cell with the disconnected wire WILL go under the LVC limit, due to the fact that every cell is matched to reach that LVC at close to the same time... by the time the other, lowest cell reaches the LVC, it will be too late!

Now if you balance at the top, that same cell will still have enough Ah left that the lowest cell reaches the LVC limit BEFORE it does.

OK, let me explain:

If the lowest cell in the pack is, let's say, 2Ah different from the rest (100Ah), YOU CAN BE SURE that if you balance at the top, that cell will probably always be the one most likely to reach the LVC first... the rest of the cells are 2Ah away from that (all cells at 100Ah, the lowest at 98Ah).

Now if you balance at the bottom, you have ALL CELLS CHARGED TO THE LOWEST CELL'S CAPACITY... which forces them all to reach the LVC at close to the same time... and you lose the protection of that 2Ah offset (all cells at 98Ah, including the lowest at 98Ah).
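The 2Ah argument above can be put into a toy model. The capacities and the "disconnected wire" scenario below are illustrative assumptions, not data from any real pack:

```python
# Toy model of the top-vs-bottom balance argument, with assumed capacities
# (Ah). One cell's monitor wire is "disconnected"; discharge stops when the
# first *monitored* cell empties. A cell that empties while unmonitored
# keeps being driven and risks reversal.

def discharge_margin(charged_ah, unmonitored_idx):
    """Ah of margin between the first monitored cell emptying and the
    unmonitored cell emptying. Zero or negative => the unmonitored cell
    is at risk of reversal before anything trips."""
    monitored = [ah for i, ah in enumerate(charged_ah) if i != unmonitored_idx]
    return charged_ah[unmonitored_idx] - min(monitored)

capacities = [100, 100, 98, 100]          # cell 2 is the weak one

top_balanced = capacities[:]              # every cell charged to its own full
bottom_balanced = [min(capacities)] * 4   # every cell charged to the weakest

# Wire falls off a *good* cell (index 0):
print(discharge_margin(top_balanced, 0))     # 2 -> weak cell trips LVC first, 2 Ah of safety
print(discharge_margin(bottom_balanced, 0))  # 0 -> no margin; cells empty together
```

In this toy, top balancing leaves the 2Ah offset as a safety margin for the unmonitored cell, while bottom balancing spends it, which is exactly the risk being described.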

So forcing every cell to reach the LVC, or get very close to it, at the same time increases the risk of cell reversal in cases where the BMS has trouble monitoring cells...

And on any list of potential lithium battery problems, that one occurs more often than you think.

Doc
 
jrickard said:
Then we DO have a mystery worth investigating Andy. Because I ran this myself quite manually last night. I would suggest we have TWO possibilities here:

1. Test equipment setup.
2. We're testing two entirely different cell chemistries.

When you say "brand new" are you referring to the LiFeYPo4 cells with the Yttrium addition. Or the standard LFP cells?

I say 'brand new' because the cells were fresh from EV Components. If they were cycled at all, it was from manufacturer testing or possibly EVC's receiving inspection. I looked at bleed before I pulled a single electron from the cells.

I expect that the SE and HiPower cells are regular LiFePO4. The TS cell should be YPO4 - it has different terminals than the earlier cells I have; EVC makes monthly orders, and I received the cells earlier this month. I don't know exactly when TS changed the chemistry, so I cannot say with 100% certainty that this is a YPO4 cell, but I believe it to be. The serial for the TS100 is TS-LFP100AHA 090514-F02342. The earlier cells I have have aluminum terminals and serials in the 080810xxx and 081018 range.

View attachment tssehp100_ts40_psihw10.jpg
 
jrickard said:
markcycle said:
I've seen this relax back to 3.4 volts with TS cells every time i charge since I monitor every cell all the time.

My take on all this top charging

The first cell in a pack to reach 4 volts is most often the weakest cell. If you stop charging after the first cell in the pack is at 4 volts, then all the other cells have the same energy capacity; that will be the capacity of the weakest cell, but that's OK, because then all the cells will reach the low-voltage cutoff at the same time. Nothing is going to improve the weakest cell. So why have it reach the LVC first and potentially be destroyed?

Now if you top balance, you just bring the strongest cells in the pack to full capacity, higher than the weakest cell. You still have to shut down the pack based on the weakest cell. There are really two choices:
1) replace the weakest cell with a better one, even though the difference between weakest and strongest may be 1 or 2 Ah's
2) don't top charge, and stop charging once the weakest cell reaches 4 volts; this way all the cells will reach the LVC point at about the same time, making a better bottom-balanced pack.

In all cases Low voltage cutoff monitored at each and every cell is essential

Mark

And I would take exception to every point you make here. They do NOT match anything I've learned and are totally wrong. There's no nice way to put that.

First, you do NOT stop charging when the first cell in the pack reaches 4v. The charge curve varies depending on the current applied, but a significant portion of the charge happens in the CV stage. If you charge until the first cell reaches 4v and then cut it off, you are potentially leaving a lot on the table. Rather, pick a lower average voltage, I would suggest 3.65, and use that as the charge voltage, the point where the charger switches from constant current to constant voltage. Charge to this voltage and HOLD IT THERE; don't cut it off. Some cells WILL reach 4v. No harm. No foul. Keep charging. When the current declines to 3 or 4 amps, then shut it off.

You misunderstood me, or you just want to be confrontational. When the first cell reaches 4 volts I am already in the CV stage (I guess I didn't state this), and the overall current has tapered down to 10% or less of the CC stage. The CV level is based on 3.7 volts per cell, but due to IR differences that's not where the cells end up; some will be higher than others, all will be over 3.4 volts, and none over 4 volts.

LV cell monitoring is NOT only not essential, it is essentially nonsense. I keep asking this question, and while no one has a fixed answer, they continue to chant LV cell monitoring, LV cell monitoring, like a political rally or something. As I described in the Jan 18 video, MONITOR FOR WHAT? The voltage is all over the place. It depends on what load you put the cell under. And THAT changes depending on where you're at on the discharge curve, what temperature it is, etc. In cold weather the voltage under a 400 or 500 amp load drops to very low values with the cell fully charged and operating just fine. At other times, a small drop could be meaningful. It is a VERY dynamic area. Most of the LV cell monitor guys are plagued by constant false alarms such that they basically ignore them and miss the one that was real. The boy who cried wolf.

OK, I do get false positives; for now I'll take that over damaged cells. I only get false positives under high load, so it's not hard to ignore an alarm under high current when I know I have used less than 30AH of my 40AH pack. If I clear an alarm at low load and it comes back, then I know I have a low cell. A small computer looking at load and voltage can eliminate most if not all false positives. I'm not looking for a bad cell in this scheme, just for when my pack is low, based on the lowest cell under low to no load.

I don't at this point really know how it will work out. But I am attracted to the notion of the Cell Log 8S alarm on DIFFERENCE voltages. It takes some rather complicated test procedures to work this out. But it is my assumption that since all cells are under identical current load, the voltage reactions should be quite uniform across the pack. If a cell varies much from the other cells under IDENTICAL load conditions, at ANY level, that might be a pretty good indicator of a cell going bad, and better, it might be a pretty EARLY indicator of a cell going bad. That MIGHT have some significant value.
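The difference-voltage idea sketches out simply: since every series cell carries the same current, flag any cell that strays too far from the pack mean. The 0.10V threshold and voltages below are invented placeholders, not tested values:

```python
# Sketch of alarming on DIFFERENCE voltages rather than absolute voltage.
# Threshold and sample voltages are assumed for illustration.

def deviant_cells(cell_voltages, threshold=0.10):
    """Indices of cells more than `threshold` volts from the pack mean."""
    mean = sum(cell_voltages) / len(cell_voltages)
    return [i for i, v in enumerate(cell_voltages)
            if abs(v - mean) > threshold]

# Under a heavy load every cell sags, but they sag *together*:
sagging_pack = [2.95, 2.93, 2.96, 2.94]
print(deviant_cells(sagging_pack))        # [] -> uniform sag, no alarm

# One cell sagging much harder than its neighbours is the early warning:
weak_cell_pack = [3.10, 3.08, 2.80, 3.09]
print(deviant_cells(weak_cell_pack))      # [2]
```

Because the comparison is relative, the uniform sag that triggers false alarms in absolute-LVC schemes produces no alarm here, while a single outlier cell still does.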

But LV cell monitoring is NOT only not essential, it is essentially nonsense.

Jack Rickard
http://evtv.me
 
jrickard said:
BTW. Since you do have some equipment, perhaps you can help explain something I HAVE been noticing on both charge and discharge. I call it the S curve, but that doesn't describe it. It is a little reversal, occurring mostly at pretty good current levels (100-200 amps), on both charge and discharge. On charge, for example, the terminal voltage RISES to 4.16, then REVERSES and falls to 3.92v, then REVERSES AGAIN and rises to 4.2. On discharge this looks like a drop during current pulses to some low value, like 2.9v for the first five 30-second pulse cycles, then a recovery to 3.10 during pulses, gradually declining as expected from there.

Have you seen this? I can pretty much duplicate it at will.

Jack Rickard
http://evtv.me

Hi Jack!

I think you have answered this yourself already. It is the IR that varies with cell core temperature.
The cell core heats up during the first part of the charge/discharge, which lowers the IR for a while, until the SOC raises the IR again at both ends of the SOC curve.

At least that is what I think.
Take a look at the high-rate discharge curves from the LIFEBATT cells here on the forum; you can see it pretty clearly at 20C discharge.
http://endless-sphere.com/forums/viewtopic.php?f=14&t=6586&hilit=lifebatt&start=240

Best Regards
/Per Eklund
 
Gentlemen - since I'm clearly lacking in some basic knowledge of how a CC/CV charger works, can someone point me in the direction of some clear explanations?

I believe I've begun to scratch the surface in discovering the existence of voltage-controlled current sources (via op-amps)... but I have the feeling that there's got to be a great online reference on the basics of battery charging theory that someone can point me and any other novices to. (probably here on the sphere somewhere)

I'm beginning to suspect that if the Ri (or IR, or whatever) is so variable, and depends on temperature, load and SOC, then a charger that takes these things into account (which AFAIK none currently do) would be ideal. Does the charger's output not depend on the impedance it sees?? :shock:

Imagine being able to just plug your car, bike, truck, whatever into the wall and walk away - I know it's already reality, but understanding how it all works makes it so much better - it makes it possible to "do it yourself" as much as possible.

What a great thread!

Thanks again to all here!

Best,

- Mike
 
northernmike said:
Gentlemen - since I'm clearly lacking in some basic knowledge of how a CC/CV charger works, can someone point me in the direction of some clear explanations?

Essentially it will start in constant current mode, where it will either apply a specific voltage across the cell (or pack) and limit the current to a specific maximum that it has been "told" is safe for that cell (whatever limit you set), or it will apply a voltage to the cell and ramp it up until the current reaches a specific maximum, and then increase the voltage only as the current begins to drop, to hold the current constant.

Once it reaches a certain voltage level, it will stay at that voltage and never rise further, and then simply let the current trickle off to zero, or end with a timer shutting it off after staying at that level for some preset amount of time.

The exact methods that these are done with vary, and can be as simple as transistor control, op-amp/comparator control, or as complex as dedicated chip control or microprocessor control based on a program and preset parameters plus feedback from the cell (or pack).

The curve that results from graphing the above constant current / constant voltage modes' output is the Charging Curve. This ought to be specified by the cell manufacturer, and it can vary by temperature or other environmental conditions, so the graphs usually show more than one curve based on those conditions. This curve can then be programmed into the charger if it is microprocessor-based, to be more exacting than just the analog feedback-loop types might be.
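The two stages above can be condensed into a toy simulation. The cell model here (a linear open-circuit curve in series with a fixed 1 milliohm internal resistance) and all the numbers are simplifying assumptions, not any manufacturer's curve:

```python
# Toy CC/CV charge simulation against a simple cell model
# (ideal EMF in series with internal resistance). All numbers assumed.

def simulate_ccv(capacity_ah=100.0, ri_ohm=0.001,
                 cc_amps=50.0, cv_volts=3.65, cutoff_amps=3.0, dt_h=0.01):
    """Returns (hours, amp_hours_in) for one CC/CV charge cycle."""
    def emf(soc):  # crude linear open-circuit voltage curve (assumed shape)
        return 3.2 + 0.5 * soc

    soc, t, ah_in = 0.2, 0.0, 0.0   # start 20% charged
    while True:
        # The CV stage falls out naturally: current is whatever keeps the
        # terminal voltage at or below cv_volts across the internal
        # resistance, capped at cc_amps during the CC stage.
        i = min(cc_amps, max(0.0, (cv_volts - emf(soc)) / ri_ohm))
        if i < cutoff_amps:          # taper done -> charge complete
            return t, ah_in
        soc = min(1.0, soc + i * dt_h / capacity_ah)
        ah_in += i * dt_h
        t += dt_h
```

Plotting current and terminal voltage against `t` from this loop reproduces the classic charging curve: a flat current plateau, then the taper once the voltage ceiling is reached.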


I believe I've begun to scratch the surface in discovering the existence of voltage-controlled current sources (via op-amps)... but I have the feeling that there's got to be a great online reference on the basics of battery charging theory that someone can point me and any other novices to. (probably here on the sphere somewhere)
Battery University is a good site to start at, but I don't know how up to date that information is.

There are also some threads like the Care and Feeding of Lipo and similar ones, that have a ton of useful information, but are pretty long.

The catch with batteries is that, as you can see, different people have different experiences with them, depending on their usage, test methodologies, end results, and sometimes even which brands or revisions of the same brands of batteries they use. There is also a lot of anecdotal reporting with little hard data but vivid descriptions that stick in the minds of others, which then gets re-reported by them, perhaps missing some of the original information as well.

I'm beginning to suspect that if the Ri (or IR, or whatever) is so variable, and depends on temperature, load and SOC, then a charger that takes these things into account (which AFAIK none currently do) would be ideal. Does the charger's output not depend on the impedance it sees?? :shock:
In a way it would, in that the greater the impedance, the less current will flow into it at a specific voltage. So if it is a per-cell charger, it could increase the voltage (up to the maximum CV value) to force the CC part of the cycle to continue at that current, if it wanted to.
 
The ideal charging method uses a wimpy power supply regulated to 3.65V. Done. It starts out with the lower voltage of the cell pulling the power supply down, so it puts out as much current as it can. The cell voltage creeps up until it reaches 3.65v. At that point the charger is in CV mode, and the cell can stay on this charger for years; it can't be overcharged, it just stays perfectly charged.

It's very simple to perfectly charge 1 cell.

The trouble comes from when you try to get this charge function to happen across a large string of cells in series.
 
liveforphysics said:
The ideal charging method uses a wimpy power supply regulated to 3.65V. Done. It starts out with the lower voltage of the cell pulling the power supply down, so it puts out as much current as it can. The cell voltage creeps up until it reaches 3.65v. At that point the charger is in CV mode, and the cell can stay on this charger for years; it can't be overcharged, it just stays perfectly charged.

It's very simple to perfectly charge 1 cell.

The trouble comes from when you try to get this charge function to happen across a large string of cells in series.


Trouble... Huh?.. :lol: I've never had any trouble with that... :wink:

Hmmm... I suggested that idea 2 years ago, and I also built my own 12s 20A charger that is selectable between 3.600 and 4.200 V.

I also found a way to hack the DC-DCs to fall into constant-current mode at a lower current value, even though they are not current limited :wink:

http://endless-sphere.com/forums/viewtopic.php?f=14&t=2824&hilit=Artesyn&start=30

some pics:



The very first hacked and reverse-engineered DC-DC!!

The only battery-charger-type DC-DCs I know of are made by a company named Gaia... but they are more than $175 each for 100W!!!
So I found this solution using $10 150W DC-DC units from ebay.






This charger now works perfectly and balances the cells each time you charge... :mrgreen:

15A... 4.2V or 3.6V for every one of the 12 channels. I supply it with my PFC 48V 35A power supply.

What is beautiful is that these DC-DCs have an input isolated from the output, so you can link their inputs in parallel and their outputs in series!.. I added a Schottky diode to prevent reverse polarity on each DC-DC, and a 20A fuse.

I remember I measured up to 157A in a short-circuit test on these 30A 150W $10 DC-DCs... They are very strong!!

Lucent and Tyco make some great ones too.



Doc
 

Attachments

  • DSCN4797_1024x768_800x600.jpg
    DSCN4797_1024x768_800x600.jpg
    71.5 KB · Views: 3,657
  • DSCN4801_1024x768_800x600.jpg
    DSCN4801_1024x768_800x600.jpg
    89 KB · Views: 3,657
Who knows why TS changed to LiFeYPo4 cells with the Yttrium addition...???

What is the benefit of the Yttrium?

@Jack
If you don't need the A123s... I'll take them... :)
It's a shame when such good cells just spend their time unused in boxes...
 
liveforphysics said:
If you take a brand new cell and you overcharge it to 4.2v, then it's now a damaged cell. It's been 0.6v overcharged. Try that with many other lithium chemistries and they burst into a fireball. Just because LiFePO4 has excellent manners and doesn't cry out when it's damaged doesn't mean it's not damaged.

I just ran a test with a Headway cell. I took a cell that always held a 3.65V charge, and charged it well beyond the proper stopping point, all the way up to 4.2v. This is the first time this cell has been damaged by overcharging like this, and I'm now able to watch the mV's dropping away; it's down to 3.5V now on a cell that always held itself at 3.65V before I overcharged it.

If you look at the cycle life degradation charts for lithium batteries, you will find the biggest factor impacting cycle life in testing was the HVC point (aside from extreme high temperature).

For example, in Kokam's Lithium Polymer LiMn cell tests, 4.1V/cell is fully charged. (Note, LiMn is a slightly lower-voltage chemistry than LiCo at 4.2v.)
In their charts, charging the cells to 4.2v on each cycle (just a 0.1V overvolt) resulted in roughly 100 cycles to 80% capacity. Charging the cell to 4.1V resulted in over 1,000 cycles to 80% capacity. For lithium-ion LiCo chemistries (a 4.2v fully-charged chemistry), we've seen testing data showing a similar 500-1000% increase in cycle life from undercharging the cell to 4.1v, at a cost of something like 6-8% capacity.


Lastly, Ri is without a doubt the single biggest predictor of cell performance. Testing it needs to happen in proper conditions to get repeatable results. I take Ri's of cells at 32F in an ice bath, at room temp, and at 130F, and plot the lines, which gives you a trend line for all the temps in between, since the slope between Ri and temperature is linear (yes, I know this means only 2 points would be needed; I take 3 points just to ensure the slope between each 2-point set matches). Knowing the Ri of the pack enables you to precisely know cell voltage drop under any current load at any temperature. Why does knowing voltage drop matter so much? It tells you exactly how much of that cell's energy is being wasted resistively heating it internally.

Ri and the health of the cell trend together exactly. If you want to re-use a cell, you can't tear it apart and unwrap the cathode to look for all the places the carbon has damaged regions, or non-contact areas between separator and cathode from tiny off-gas events from overcharge, or crystal formation from the ions in the gel onto the surface of the anode from under-voltage, or drying out of the ion gel from extended over-temperature operation, etc. But what you can do is measure Ri, which reflects the cumulative sum of all these damages by measuring the cell's ability to do its job, which is to hold a voltage under load.

Ri varies between different temps and different SOCs (as you've noticed). On an ICE engine, oil pressure also varies between different temps and different RPMs, but that doesn't mean it's not the best indicator of bearing wear/health of the engine, short of tearing it down and mic'ing the bearing shells. Like so many non-constant things in science/life, you have to pick some consistent point to take measurements from; then the data is consistent and repeatable, and most importantly extremely useful for charting the degradation of the cells (as well as at least 10 other important things).


-Luke

You may live for physics, but you clearly know squat about batteries. Different cells have different charge voltages, and Headway does not precisely define the genre. TS cells charge to 4.2v by spec, and SE's charge to 3.6. Ri and an ice bath will tell you buttkiss. There isn't any damaged carbon on the cathode because the carbon is on the anode. There is no magic on the HVC any different from the magic on the low end, except for the fact that you're probably driving 400 amps through the cell at the low end. There are four or five other egregious mistakes in one single forum post, but I don't have time to go through them. In fact, I don't have time to go through these.

Jack Rickard
 
AndyH said:
jrickard said:
Then we DO have a mystery worth investigating Andy. Because I ran this myself quite manually last night. I would suggest we have TWO possibilities here:

1. Test equipment setup.
2. We're testing two entirely different cell chemistries.

When you say "brand new" are you referring to the LiFeYPo4 cells with the Yttrium addition. Or the standard LFP cells?

I say 'brand new' because the cells were fresh from EV Components. If they were cycled at all, it was from manufacturer testing or possibly EVC's receiving inspection. I looked at bleed before I pulled a single electron from the cells.

I expect that the SE and Hipower cells are regular LiFePO4. The TS cell should be YPO4 - different terminals than the earlier cells I have, EVC makes monthly orders, I received the cells earlier this month. I don't know exactly when TS changed the chemistry so cannot say to 100% if this is a YPO4 cell but I believe it to be. The serial for the TS100 is TS-LFP100AHA 090514-F02342 The earlier cells I have have aluminum terminals and serials in the 080810xxx and 081018 range.


My TS cells all fell further, to 3.38v, with another day of rest, though oddly the damaged TS cell did not. It stayed at 3.41, perhaps a sign of its earlier overcharge adventure. I'm actually capacity testing it now and it is looking strangely good. I'm about 83 AH out of the cell now at 100 amps and it's holding nicely at 2.82v under load, with a rather high internal resistance of 5.2 milliohms initially. Keep in mind this was a cell that was intentionally overcharged to 5.5v for an hour AFTER a full charge, swollen like a football, then recompressed for 5 days in a bench vise between two steel plates, and then left to sit for 2 weeks. I think I've got a shot at getting the full 160Ah rating out of it. Unlikely, but it sure is putting out the juice.

Ok. I have plenty of SE cells. I would suggest let's change to that chemistry just to get on the same page. I'll re-run on an SE, and you do a test manually without all the equipment and let's compare notes then. We just can't keep getting completely different results doing the same test.

Jack Rickard
 
You're right Jack. :)

Internal resistance doesn't mean a thing for a battery, and there's no reason to pay any attention to it. The ability to understand and calculate cell voltage drop and cell heating and inefficiency is totally useless. I suppose it has no impact on C-rates for cell in charge or discharge for you either. :)

I don't understand what causes the self-discharge in TS cells. I've personally only seen this with cells that have been damaged, and I can reproduce the effect by taking a good cell and damaging it by overcharging.
 
liveforphysics said:
Lastly, Ri is without a doubt the single biggest predictor of cell performance. Testing it needs to happen in proper conditions to get repeatable results. I take the Ri of cells at 32F in an ice bath, at room temp, and at 130F, and plot the lines, which gives you a trend line for all the temps in between, as the relationship between Ri and temperature is linear (yes, I know this means only 2 points would be needed; I take 3 points just to ensure the slope between each 2-point set matches). Knowing the Ri of the pack enables you to precisely know cell voltage drop under any current load at any temperature. Why does knowing voltage drop matter so much? It tells you exactly how much of that cell's energy is being wasted resistively heating it internally.

Ri and the health of the cell trend together exactly. If you want to re-use the cell, you can't tear the cell apart, unwrap the cathode, and look for all the places the carbon has damaged regions, or non-contact areas between separator and cathode from tiny off-gas events from overcharge, or crystal formation from the ions in the gel onto the surface of the anode from under-voltage, or drying out of the ion gel from extended over-temperature operation, etc. But what you can do is measure Ri, which reflects the cumulative sum of all these damages by measuring the cell's ability to do its job, which is to hold a voltage under load.

Ri varies between different temps and different SOCs (as you've noticed). On an ICE engine, oil pressure also varies between different temps and different RPMs, but that doesn't mean it's not the best indicator of bearing wear and engine health, short of tearing the engine down and mic'ing the bearing shells. Like so many non-constant things in science and life, you have to pick some consistent point to take measurements from, and then the data is consistent, repeatable, and most importantly extremely useful for charting the degradation of the cells (as well as at least 10 other important things).


-Luke
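Luke's procedure can be put in concrete terms with a short sketch. The temperatures and Ri readings below are hypothetical illustrations, not measurements from this thread; the point is only how a linear Ri-vs-temperature line yields per-cell voltage sag and waste heat at any operating point.

```python
# Illustrative sketch of the Ri-vs-temperature trend line described
# above. Numbers are made up, not real cell data. If Ri is linear in
# temperature, two points define the line (a third just checks slope).

T1, RI1 = 32.0, 9.0       # ice bath: temp (deg F), Ri (milliohms)
T2, RI2 = 130.0, 2.0      # hot soak: temp (deg F), Ri (milliohms)

SLOPE = (RI2 - RI1) / (T2 - T1)      # milliohms per deg F

def ri_milliohm(temp_f):
    """Interpolated Ri (milliohms) at a given temperature."""
    return RI1 + SLOPE * (temp_f - T1)

def sag_and_heat(temp_f, amps):
    """Per-cell voltage drop (V) and waste heat (W): V = I*R, P = I^2*R."""
    r_ohm = ri_milliohm(temp_f) / 1000.0
    return amps * r_ohm, amps * amps * r_ohm

drop_v, heat_w = sag_and_heat(70.0, 100.0)   # room temp, 1C on a 100Ah cell
```

With these assumed numbers, a 100A load at room temperature costs roughly 0.6V of sag and dumps about 60W of heat into the cell, which is the "wasted energy" argument in the post above.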

IF this is so, then why has no one bothered to produce an internal resistance meter for the dashboard? Yes, it indicates a LOT of things, and that's the problem: you cannot easily determine whether what you're seeing is damage, temperature, or SOC. Yes, I can measure it easily, but it doesn't tell me anything USEFUL. Oil pressure doesn't either, by the way. It MOSTLY serves as magic dust that people with NO practical knowledge of these cells use to cloud any useful discussion of the cells. THAT'S why I avoid talking about it at all. I've got failed cells that HAD great Ri, and cells with really bad Ri that I can't bludgeon into silence with a stick. I kill them and they come BACK with even WORSE Ri and still want to throw off 300 amps at the first excuse I give 'em.

It is not meaningless. It is just not useful for anything I can determine. I take the car out (TSLFP90Ah) in the cold and I'm getting apparent voltage drops of 20% at 35 mph, but the car runs fine and I get very good, if not maximum range. In warmer weather the same cells drop 4% under the same conditions and I get a little better performance. It just isn't useful information.

Jack Rickard


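Jack's sag observations can themselves be turned into a rough Ri estimate, which is one way the two views connect. Only the 20%/4% sag figures come from the post; the resting voltage and current below are assumed for illustration.

```python
# Back-of-envelope effective Ri implied by the observed pack sag.
# Only the sag fractions (20% cold, 4% warm) come from the post; the
# resting voltage and load current are assumed example values.

V_REST = 3.3        # assumed per-cell resting voltage (V)
AMPS = 200.0        # assumed load current at 35 mph (A)

def ri_from_sag(sag_fraction, amps=AMPS):
    """Effective per-cell Ri (milliohms) implied by a sag fraction."""
    drop_per_cell = V_REST * sag_fraction     # volts lost per cell
    return drop_per_cell / amps * 1000.0

cold_ri = ri_from_sag(0.20)   # ~20% sag in the cold
warm_ri = ri_from_sag(0.04)   # ~4% sag in warm weather
```

Under these assumptions the cold-weather effective Ri is five times the warm-weather value, which is consistent with Jack's point that the same healthy cells can show wildly different Ri depending on temperature alone.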
 
liveforphysics said:
You're right Jack. :)

Internal resistance doesn't mean a thing for a battery, and there's no reason to pay any attention to it. The ability to understand and calculate cell voltage drop and cell heating and inefficiency is totally useless. I suppose it has no impact on C-rates for cell in charge or discharge for you either. :)

I don't understand what causes the self-discharge in TS cells. I've personally only seen this with cells that have been damaged, and I can reproduce the effect by taking a good cell and damaging it by overcharging.

You don't understand it because it has nothing to do with self-discharge. It's just a bleed-off of surface charge accumulated during the charging process. The fully charged voltage for these cells is 3.4v - period. It always was. These are larger cells than you are testing; they have a lot more plate area, and ramming coulombs into them is the cause of this. If you remove the charger, with no circuit to reabsorb it, the surface charge will bleed off eventually anyway. If you hook up the slightest load, they drop instantly to their fully charged voltage of 3.4v.

Jack Rickard
http://EVTV.me
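The bleed-off Jack describes can be sketched as a simple relaxation toward the resting voltage. The 3.4V resting and 4.2V termination figures come from the thread; the exponential form and the time constant are purely an illustrative model, not measured cell behavior.

```python
# Toy model of surface-charge bleed: terminal voltage relaxes
# exponentially from the charge-termination voltage toward the 3.4V
# resting voltage. The time constant is assumed for illustration.
import math

V_REST = 3.4     # fully charged resting voltage (V), per the thread
V_TERM = 4.2     # TS charge-termination voltage (V), per the thread
TAU_H = 6.0      # assumed open-circuit bleed time constant (hours)

def open_circuit_v(hours):
    """Terminal voltage after resting open-circuit for `hours`."""
    return V_REST + (V_TERM - V_REST) * math.exp(-hours / TAU_H)

v_after_24h = open_circuit_v(24.0)   # settles close to 3.4V in a day
```

This also matches the claim about a load: any current path discharges the surface charge far faster than open-circuit bleed, so under even a slight load the terminal voltage collapses to near V_REST almost immediately.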
 
the question for me now is:

if it is only surface charge...why do we need it?

Jack, can you run a test: charge one battery to 4.2V and check the capacity, then charge it to 3.6V and check the capacity, doing the charge/discharge at 1C. Will there be any difference in the capacity of the two cells?
 
RoughRider said:
the question for me now is:

if it is only surface charge...why do we need it?

Jack, can you run a test: charge one battery to 4.2V and check the capacity, then charge it to 3.6V and check the capacity, doing the charge/discharge at 1C. Will there be any difference in the capacity of the two cells?


We don't. We never did. We just charged to the higher voltage on a CC/CV charge curve to get the cells fully charged. This is where the conversation started. TS and SE cells fully charged are 3.4v. To get there, charge TS CC/CV at 4.2v and charge SE CC/CV at 3.6v.

In series packs, I use 3.65 and 3.50 respectively.

Jack Rickard
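The setpoints Jack quotes fit in a small lookup: the resting "fully charged" voltage is 3.4V either way, and only the CC/CV termination setpoint differs by chemistry and by whether the cell is charged alone or as part of a series pack. The function below is just a convenience sketch around those numbers.

```python
# Charge setpoints quoted in the thread, as a lookup table.
# Resting fully-charged voltage is 3.4V for both chemistries.

FULLY_CHARGED_V = 3.4

CV_SETPOINT = {
    ("TS", "single"): 4.2,    # one cell on the bench
    ("SE", "single"): 3.6,
    ("TS", "pack"): 3.65,     # per-cell setpoint in a series pack
    ("SE", "pack"): 3.50,
}

def charge_voltage(chemistry, config="pack", cells=1):
    """Total CC/CV supply setting for `cells` cells in series."""
    return CV_SETPOINT[(chemistry, config)] * cells

v_36s_ts = charge_voltage("TS", "pack", 36)   # about 131.4V for 36 cells
```

The pack setpoints being well below the single-cell ones reflects the point made throughout the thread: in a series string you charge each cell less aggressively, since the high single-cell voltage only buys transient surface charge anyway.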
 
RoughRider said:
the question for me now is:

if it is only surface charge...why do we need it?

Jack, can you run a test: charge one battery to 4.2V and check the capacity, then charge it to 3.6V and check the capacity, doing the charge/discharge at 1C. Will there be any difference in the capacity of the two cells?

There will be. But it will be less than 5Ah on a 160Ah cell. That's been my point. You seriously extend the life of the cell at the lower voltage, at a very small sacrifice in range - a couple of miles. And you can throw away that $2000 and all that crap in your car representing someone's BMS-theory blue elephant gun.

Jack Rickard
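Jack's tradeoff is easy to put in numbers. The 5Ah and 160Ah figures come from his post; the full-charge range is an assumed example value, since no range figure is given here.

```python
# The lower-voltage tradeoff in numbers. Cell figures are from the
# post; the full-charge range is a hypothetical example.

CELL_AH = 160.0           # rated capacity (Ah), per the post
AH_GIVEN_UP = 5.0         # capacity sacrificed at the lower voltage
ASSUMED_RANGE_MI = 80.0   # assumed full-charge range (miles)

capacity_loss = AH_GIVEN_UP / CELL_AH             # fraction of capacity
range_loss_mi = ASSUMED_RANGE_MI * capacity_loss  # miles given up
```

Under these assumptions the sacrifice is about 3% of capacity, or roughly 2.5 miles of an 80-mile range, which is the "couple of miles" Jack weighs against cell life.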
 
jrickard said:
RoughRider said:
the question for me now is:

if it is only surface charge...why do we need it?

Jack, can you run a test: charge one battery to 4.2V and check the capacity, then charge it to 3.6V and check the capacity, doing the charge/discharge at 1C. Will there be any difference in the capacity of the two cells?

There will be. But it will be less than 5Ah on a 160Ah cell. That's been my point. You seriously extend the life of the cell at the lower voltage, at a very small sacrifice in range - a couple of miles. And you can throw away that $2000 and all that crap in your car representing someone's BMS-theory blue elephant gun.

Jack Rickard


It is laughable that you are so anti-BMS. I have destroyed enough cells following 80% DOD and 80% balance-charge cycles to know that a BMS will at least let me see the problem before it happens. I still don't run a BMS, and yet I know that on packs with more than 6 batteries in series it is in my best interest, especially with more volatile chemistries.


Let us pretend a pack has one cell that has decided to give up the ghost after an accident or internal failure. Many times these cells will charge up just fine (on voltage), but will not have the capacity on the discharge cycle. Even worse, when charging to lower voltages the damaged cell may not even be out of line with the rest at all. Voltage is fine under no load, but when that cell gets run down before the rest, it will get hot and maybe catch fire, depending on chemistry. If this cell were isolated it wouldn't be too bad, but packs are generally built with all the cells together. Just the heat generated from this one cell can damage surrounding cells, not even considering a fire. The only way to catch this bad cell is a BMS, because we don't always have the luxury of the "bad" cell showing itself until it is too late. It would be great if it did, but "accidents" are not called happy-fun-time for a reason.
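The weak-cell scenario above is essentially an outlier-detection problem: the failing cell looks normal at rest but sags away from its neighbors under load. A minimal sketch of that check, with hypothetical voltages and an assumed sag threshold:

```python
# Sketch of the weak-cell check a BMS performs: flag any cell whose
# loaded voltage sags more than a threshold below the pack median.
# Pack voltages and the 0.15V threshold are hypothetical examples.
import statistics

def weak_cells(loaded_voltages, threshold=0.15):
    """Indices of cells sagging `threshold` volts below the median."""
    median = statistics.median(loaded_voltages)
    return [i for i, v in enumerate(loaded_voltages)
            if median - v > threshold]

# Hypothetical 8-cell pack under load: cell index 5 is the failing one.
pack = [3.18, 3.20, 3.19, 3.21, 3.17, 2.65, 3.19, 3.20]
flagged = weak_cells(pack)   # -> [5]
```

Comparing against the pack median rather than an absolute voltage is what lets this catch the cell even when, as noted above, its no-load voltage is indistinguishable from the rest.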
 
Jack, when I invited you visit, I pointed out that some folks have names with different colors. I promise you that's for a very good reason. :wink:

(no disrespect to my fellow monochromes. :) )
 