Why Measure Capacity by Discharge and not by Charge?

rg12

Measuring in discharge is affected by sag, which changes with different currents, so you reach the cutoff from different voltages and get a different result for the same battery.
Why not discharge to about 2.8V at whatever discharge rate is needed to get there, then charge it at whatever current all the way to the top, and count the charge instead?
 
Bahaha.

If you ain't cycling @ 0.5C, or 1C, and/or 2C, you're just dumb, or happy with inaccurate results. Some bring out the old 0.2C rate too... but that's just wimpy.

LOL, of course you check the charge efficiency. I don't know what datalogger you use, but I absolutely cycle for accuracy. How would I know the bad cells without it? How do you know if you put more in than you took out? You don't, if you just count discharge mAh... You must count both charge mAh and discharge mAh.

Reliability, reproducibility, and reciprocity... RG.

Or go buy an Opus and hope the numbers are correct, or good enough... Lol. Plus you cannot see the input taper current's capacity without it. You would be surprised if you saw how much, and how differently, some cells act and fill on an adjusted taper... quite surprised.

Why wouldn't you cycle? I can set my lil charger on 20 cycles and just wait. Fill it up 20 times, empty it 20 times, then the charger stores the cell for me. The datalog is saved, and a 100-hour timeout on the charge can be set for large cells. You get a graph with 40 lines zigzagging across the X (time) horizontal and the Y (mAh) vertical trace that you can compare to know the cell's nature.

Who doesn't cycle? Not the people that buy the charger I use. We cycle for accuracy. Good luck, Opie.



Also, there are things like constant power vs. constant current discharge, or taper discharge, that you can use to get different capacities from different cells... reliably... for your data. Balance discharging, for one, if you are doing more than one string of cells. Gets those numbers all kinds of shiny.

Look at this: Headway 8Ah (USED CELL), a failing cell, more in than out (green is mAh in, blue is mAh out)... This cell is a heater. It burned almost 1.2Ah on the charge that never comes out of the cell... Got 6Ah out, put 7.2Ah in... repeatably... and the other Headways I had did better...
Headway capacity.jpg

Headwaycap.jpg




Here is a 35Ah lithium (NEW CELL). Better... much better. Put 37.8Ah in, got 36.8Ah out.

35Ah lith.jpg

Here is a 60Ah LiFePO4... so good that there is next to ZERO wasted mAh on charge/discharge... @ this rate...

60Ah.jpg
Random 5Ah lithium cell that was doing well and lasting a long time: hit softly at a 1C charge/discharge rate with a C/10 taper current, to 4.175V... reliably produced 4.25Ah in, and out.

5Ah LGX.jpg
 
Yeah, short answer is do both, so you have some idea of Coulombic efficiency. It should be at least 99.9% efficient.
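
A rough sketch of what "do both" buys you, using the in/out figures quoted above (the function is just the ratio; nothing here is any charger's actual API):

```python
# Coulombic efficiency: the charge you get back out vs. the charge you put in.
# The numbers below are the ones quoted earlier in this thread.

def coulombic_efficiency(ah_in: float, ah_out: float) -> float:
    """Fraction of the charged Ah that the cell delivered back."""
    return ah_out / ah_in

cells = {
    "Headway 8Ah (used, failing)": (7.2, 6.0),    # 7.2Ah in, 6Ah out
    "35Ah lithium (new)":          (37.8, 36.8),  # 37.8Ah in, 36.8Ah out
}

for name, (ah_in, ah_out) in cells.items():
    print(f"{name}: CE = {coulombic_efficiency(ah_in, ah_out):.1%}")
# ~83% vs ~97% -- the failing "heater" cell stands out immediately.
```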
 
Always keep a constant current either way; pick a rate appropriate for your usage.

The Ah going in will always be more than what it can actually deliver back, and the latter is what counts.
 
Well, thinking about consumed energy, I'm more baffled about which measure I should use to calculate kilometers per pack.

For data monitoring I have a G.T. Power watt meter.

After a ride I get 7.4Ah consumed (with a 6s li-ion battery), but in terms of watt-hours, I get 134Wh consumed.

If I multiply 7.4Ah × 21.6V (nominal), I get ~160Wh.

If I take the average voltage during the ride, let's say 22.5V, then the Wh count is even higher (~166Wh).

So I'm not sure at this moment which measure I can really calculate my range from.

If I did it with Ah, I would have 'burnt' an extra kilometer for sure, since 160Wh would be an 8km ride distance while 134Wh would be closer to 7km (even below that).

Energy consumption is ~18-20 Wh/km usually.

So it turns out that the Ah-based number is higher than the measured Wh count.

===

So yes, in hindsight I might try to hook up a DaVeGa telemetry screen for the VESC and see what values it gives.

It might be that the wattmeter just has a weird way of calculating these energy values.
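
One plausible explanation for the gap, sketched with made-up numbers: a wattmeter integrates V × I sample by sample, so the volts get weighted by the amps flowing at that instant, and sag is deepest exactly when the amps are highest. That drags the current-weighted average voltage below both nominal and a simple time-average:

```python
# Hypothetical ride samples (volts, amps) at 1-second intervals.
# The high-current moments coincide with the deepest voltage sag.
samples = [(23.5, 2.0), (21.0, 25.0), (23.0, 3.0), (19.8, 30.0), (22.5, 5.0)]

dt_h = 1.0 / 3600.0  # one second, in hours
ah = sum(i * dt_h for _, i in samples)
wh = sum(v * i * dt_h for v, i in samples)          # what the wattmeter accumulates

print(f"Ah x 21.6V nominal : {ah * 21.6:.4f} Wh")   # the hand calculation
print(f"Integrated Wh      : {wh:.4f} Wh")          # lower, like the meter shows
print(f"Current-weighted V : {wh / ah:.2f} V")      # well below nominal
```

So a meter that shows fewer Wh than Ah × nominal volts isn't necessarily broken; it may just be doing the integration properly.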
 
Always use Ah; it measures the way batteries actually work, with far fewer inaccuracies from the many finer-point variables.

Wh are derived from Ah,

really only useful for comparing systems that work at different **nominal** voltages, say 10S vs 20S.
 
Yeah, it starts to seem that way.

Funny how I relied on such a theoretical Wh estimate to calculate range before.

It sure is convenient to know a battery's Wh amount right away, rather than figure out voltage and amp-hours for one pack versus a different-voltage pack and its amp-hours.

Though when charging, the charger does show a consistent amount of Wh pumped into the battery versus amp-hours put in...

So I still have to use a different measuring device to really conclude that the Ah number is greater than the Wh one when the battery gets discharged.

It might just be that the wattmeter doesn't count at high enough volts or lags somewhere else.
 
In response to the title: because the most commonly useful reason to measure capacity is as a gas gauge WHILE YOU'RE DISCHARGING THE BATTERY. Jesus, some people just can't help themselves. Also, as noted above, the charge capacity should be virtually the same.
 
I believe the most important use for precise CC load testing, which is the only accurate way to measure capacity, is benchmarking a cell's declining State of Health over its lifespan, and determining EoL so you can replace cells proactively rather than waiting for obvious failure symptoms.

A cheap/rough instrument commonly used for an in-use gas-gauge display is rarely that accurate.

Really it is the CC load regulation that is most important: increasing the power demand to keep amps precisely level as the voltage drops.

In which case a stopwatch timer is all you need, much more accurate than most coulomb counters.

It is true that CEF is "pretty close" to 1.00 for many batteries, but input will **always** be greater than output.
 
Okami said:
It might just be that the wattmeter doesn't count at high enough volts or lags somewhere else.
the wh reading is more accurate for a discharge test at varying amps (like when riding), because it accounts for the varying voltage and current during the ride.

ah does not.

however, as long as you are only comparing capacities of *your own pack* to *other instances of your own measurements* of that same pack, and are always using *the same unit* (either wh or ah, but not switching between them), then either one is equally valid for comparing change in capability over time.

(but as wh takes into account the varying loading during the ride, different rides are easier to compare than with ah, which only counts the current used. if the rides were exactly the same each time (which doesn't really happen outdoors, and not even indoors except under carefully controlled conditions), then ah would account for enough.)


if you are comparing one pack to another, for measurements taken while riding, whether under the same conditions or different ones, wh will be more accurate than ah because it takes into account more of the variable nature of the rides and packs.


if you are comparing "static" bench tests at a constant current, with identical conditions for everything else between tests, then ah would work fine (as would wh).
 
Some good points about the Wh / amp-hour difference.

Well, lately I've been wondering how to include ambient temperature's effect on battery capacity as well.

I forgot to mention I was riding in 10°C (~50°F) or below conditions, so a bit far from the ideal 25°C.

Maybe temperature also impacts why the Ah and Wh numbers are off from each other, when I calculate the Wh count from consumed Ah times voltage.

Though it doesn't really change the wattage needed, the demand, so even if volts are lower, amps would be higher to compensate.




 
Okami said:
Maybe temperature also impacts why the Ah and Wh numbers are off from each other, when I calculate the Wh count from consumed Ah times voltage.
you can only really "calculate" wh from ah (and voltage) if both amps and volts stayed constant during the charge or discharge. if either changes, the calculation is at best an estimate, and if they constantly change, like on a ride, it is a rough estimate.

if you were just doing a normal charge, or a benchtest discharge with a fixed load, then because the voltage and current curves are close to opposite, a calculation is likely close enough.

but if you were riding around with the load constantly changing, so that current rises and falls across essentially the full range of allowable system currents from zero to max, while the voltage both sags during higher currents and drops as the battery discharges, it isn't necessarily going to be a very good calculation. if you based it on the average current (assuming zero as minimum and the max current detected as maximum) and the average voltage (assuming full charge voltage as maximum and the lowest voltage sag detected as minimum), it is not likely to come out the same as fully-measured wh tracking would. it might be significantly different (probably would be).

it may still be close enough for range estimations, pack health tracking, etc., but if you want accuracy... you'll have to measure everything as you go. :)


Though it doesn't really change the wattage needed, the demand, so even if volts are lower, amps would be higher to compensate.
that isn't necessarily true. it depends on the electrical situation.

for instance:

in the phase wires, this is nearly guaranteed to be true, because the resistance of the motor itself is so low.

in the controller-to-battery wires, it will depend on the situation, load, and controller design / software (and settings, if any).

if, as is common for generic cheap ebike controllers, the controller is not set up for monitoring and controlling power draw or motor phase current, but rather battery current, it is going to attempt to keep battery current constant regardless of battery voltage, assuming a specific load on the motor (especially one that is at the max battery current allowed by the controller). especially if it's a common "pwm" or "speed" throttle, rather than a "torque" or "current" throttle, or the less common "power" throttle.

that means that measurements taken on the battery/controller wires won't show higher currents as voltage lowers, because the loading doesn't stay the same; the controller is not attempting to keep the motor outputting the same power. instead, the battery current is kept the same, so total power in the system drops as the battery discharges and its voltage decreases, moreso as the battery sags more for the same current load as it gets closer to empty.

if you're never anywhere near the max battery current limit of the controller, then you will see current increase as voltage drops in various situations, assuming the load on the motor stays the same so it draws the same amount of power... but this doesn't typically happen over a whole ride, or even more than short stretches of one.


if you have a controller that measures phase currents, and is controlled by a "current" or "torque" or "power" throttle, it is going to try to keep the *motor* current constant for a given throttle setting, meaning it tries to keep the loading constant, and that *will* increase the battery current as voltage drops, up until you've maxed out the current limit of the controller (if it monitors battery current in addition to phase currents; if not, it may not actually stop increasing the battery current draw, so there won't be an artificial plateau).
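
A toy model of the two regimes described above, with made-up pack and controller numbers, just to show the shapes (not any real controller's firmware):

```python
# Pack sagging from 54V to 42V; controller battery-current limit of 30A.
I_LIMIT = 30.0     # controller's max battery amps
I_CONST = 20.0     # amps a battery-current-limited ("speed" throttle) setup holds
P_TARGET = 1400.0  # watts a "power"/"torque"-style control tries to hold

for v in (54, 50, 46, 42):
    p_speed = v * I_CONST                 # amps held flat: power falls with voltage
    i_power = min(P_TARGET / v, I_LIMIT)  # amps rise to hold power, until clipped
    print(f"{v}V: speed-throttle {p_speed:.0f}W @ {I_CONST:.0f}A | "
          f"power-throttle {i_power:.1f}A -> {v * i_power:.0f}W")
```

The first column's power falls steadily as the pack sags, while the second holds its watts until the 30A ceiling produces the plateau described above.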
 
77°F / 25°C is the usual standard for apples-to-apples benchmarking of capacity.

Labs use circulating fluid baths; the rest of us use a thermostat-controlled space and wait at least 24 hours after putting the batteries in there.
 
Thanks for all the info, guys (aside from the two of you who insisted on contributing your information in the lowest form possible, "LOL, DUHH, JESUS!!", what a bunch of dumb amateurs).
Anyway, I get that the Ah put in isn't perfectly aligned with the Ah put out, but what I meant to say is that this difference is very minor compared to a pack's capacity measured at 1C, 4C, or 0.2C, which makes a much bigger difference due to voltage sag reaching the cutoff sooner with higher C or later with lower C.
 
Yes, for lead the standard is the 20-hour rate, 0.05C; some vendors publish a dozen different rates. Talking discharge of course; CEF is a bit higher than with LI.

If you want a standard for LI, 0.2C or lower will align better with vendor specs than faster rates.

Anything over 0.4C will reduce cycle lifetime for repeated automated tests.

But of course IRL many use cases need faster rates, so it's up to the tester to choose whether they want to match that.

 
rg12 said:
Anyway, I get that the Ah put in isn't perfectly aligned with the Ah put out, but what I meant to say is that this difference is very minor compared to a pack's capacity measured at 1C, 4C, or 0.2C, which makes a much bigger difference due to voltage sag reaching the cutoff sooner with higher C or later with lower C.

"...which makes a much bigger difference due to voltage sag reaching the cutoff sooner with higher C or later with lower C"

That is where you are wrong.

Bigger difference in what, exactly? Wh? Power?
Charge (aka Coulombic) efficiency?

If it is the latter... look at your charge efficiency and tell me the same energy didn't come out as went in. That is the loss. The same energy output... it matters on the input... If you are hitting LVC you should lower the power out (amps).

Because your cells CANNOT hack it. I raise the power out of my cells because my cells CAN take it. There will always be a point where the cell (any cell: lipo, 18650, any cell) cannot take it and goes thermal (short, much?). Just because the cells cannot hack it and heat up on discharge doesn't mean they didn't put out the same amount of ENERGY as the lower-load discharge did...

the ENERGY just went into HEAT (in watts, a measure of heat) (I don't know what you measure heat in, but I measure it in watts... some use joules) (going thermal)... not power. Voltage sag means heat creation is induced. The power of the cell resists this change. (The energy out is still the same from a high C-rate and a low C-rate... you only look at the mAh number; that is not the total energy out... that is the capacity of the battery/cell.)


Bahahaha. Heat should be measured in joules? The units can be converted back and forth. Thermal engineering is a professional application of thermodynamics. I work for a thermal engineering company. Heat is something we know very, very well... the use of "entropy" pays our bills.
...and a joule is... by definition...

"...equal to the energy transferred to (or work done on) an object when a force of one newton acts on that object in the direction of the force's motion through a distance of one metre (1 newton-metre or N⋅m). It is also the energy dissipated as heat when an electric current of one ampere passes through a resistance of one ohm for one second. It is named after the English physicist James Prescott Joule." - Amateur Electrician



"An ampere hour or amp hour (symbol: A⋅h or A h; sometimes also unofficially denoted as Ah) is a unit of electric charge, having dimensions of electric current multiplied by time, equal to the charge transferred by a steady current of one ampere flowing for one hour, or 3,600 coulombs.[1] The commonly seen milliampere hour (symbol: mA⋅h, mA h, or unofficially mAh) is one-thousandth of an ampere hour (3.6 coulombs)."

The mAh capacity rating refers to the storage capacity (available energy, in whatever form, heat or work) available for a particular battery. Waste heat, or "work"... one or the other; a cell's discharge, based on its rate, still outputs the same energy (1C vs 10C).

Maybe I am wrong, or maybe right; who knows. Rg.

"ENTROPY" - a thermodynamic quantity representing the unavailability of a system's thermal energy for conversion into mechanical work, often interpreted as the degree of disorder or randomness in the system.
 
john61ct said:
77°F / 25°C is the usual standard for apples-to-apples benchmarking of capacity.

59°F!!!! Lol. Behooves the people who make the standards. Lol. 15°C.

john61ct said:
Always use Ah; it measures the way batteries actually work, with far fewer inaccuracies from the many finer-point variables.

Wh are derived from Ah,

really only useful for comparing systems that work at different **nominal** voltages, say 10S vs 20S.
 
Okami said:
After a ride I get 7.4Ah consumed (with a 6s li-ion battery), but in terms of watt-hours, I get 134Wh consumed. [...] It might be that the wattmeter just has a weird way of calculating these energy values.

I was thinking what DogDipstick said: the resistance in the pack heats the pack, so the battery is consuming watt-hours; the more amps you pull from the pack, the higher the temperature and the more watts the pack consumes?
 
One reason for low-rate testing is to minimize temperature as a variable.

Outside of propulsion use cases where density is critical, normal discharging should not cause any significant heat rise at all.
 
john61ct said:
Yes, for lead the standard is the 20-hour rate, 0.05C; some vendors publish a dozen different rates. Talking discharge of course; CEF is a bit higher than with LI.

If you want a standard for LI, 0.2C or lower will align better with vendor specs than faster rates.

Anything over 0.4C will reduce cycle lifetime for repeated automated tests.

But of course IRL many use cases need faster rates, so it's up to the tester to choose whether they want to match that.

What if I start at, let's say, 4C and avoid hitting the cutoff by lowering the rate down to 0.2C, so the discharge ends at 0.2C?
Will that produce the same amount of Ah?
Let's assume that there is no heat produced...
 
DogDipstick said:
Bahaha.

If you ain't cycling @ 0.5C, or 1C, and/or 2C, you're just dumb, or happy with inaccurate results. [...]

Why wouldn't you cycle? I can set my lil charger on 20 cycles and just wait. Fill it up 20 times, empty it 20 times, then the charger stores the cell for me. The datalog is saved, and a 100-hour timeout on the charge can be set for large cells. [...]

btw, what equipment do you use to create those charts, run the actual tests, etc.?
 
rg12 said:
What if I start at, let's say, 4C and avoid hitting the cutoff by lowering the rate down to 0.2C, so the discharge ends at 0.2C?
Will that produce the same amount of Ah?
Let's assume that there is no heat produced...
Sorry I wasn't more clear.

If accuracy in your benchmarking is desired, pick one rate and stick to it.

CC load testing, by definition, requires precision in adjusting the power draw so amps stay **constant** over the whole test, from 100% SoC through the LVC.

See my stopwatch comment above.

 
john61ct said:
rg12 said:
What if I start at, let's say, 4C and avoid hitting the cutoff by lowering the rate down to 0.2C, so the discharge ends at 0.2C? [...]
Sorry I wasn't more clear.

If accuracy in your benchmarking is desired, pick one rate and stick to it. [...]

See my stopwatch comment above.

I just read it again, but I still don't understand why Ah can't be calculated with fluctuating current.
One hour at 4A equals 4Ah and one hour at 2A equals 2Ah, so discharging for two hours, at 4A for an hour and 2A for another hour, will produce 6Ah.
If I discharge constantly at 3A for two hours, I will get the same result.
 
No you won't.

A battery literally holds a different amount of energy depending on the discharge rate.

Kind of like testing MPG driving at different speeds, sometimes with an extra 1000 lbs and sometimes not.

Look up Peukert's Law.
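
A rough sketch of what Peukert's law predicts; the capacity, rated hours, and exponents below are illustrative only (lithium exponents are typically much closer to 1.0 than lead's):

```python
# Peukert: effective capacity shrinks as the discharge rate rises.
# c_rated = rated capacity (Ah) at the rated discharge time h_rated (hours),
# k = Peukert exponent (1.0 would be an ideal battery).

def effective_capacity(c_rated: float, h_rated: float, i: float, k: float) -> float:
    """Ah actually deliverable at a constant current i, per Peukert's law."""
    t = h_rated * (c_rated / (i * h_rated)) ** k  # runtime at current i
    return i * t

C, H = 10.0, 5.0                # hypothetical 10Ah cell rated at the 5-hour rate
for k in (1.05, 1.2):           # roughly lithium-ish vs. lead-ish exponents
    for i in (2.0, 4.0, 8.0):   # 0.2C, 0.4C, 0.8C
        print(f"k={k}: {i:.0f}A -> {effective_capacity(C, H, i, k):.2f} Ah")
```

Same cell, same cutoff, different Ah depending on the amps, which is why the 4A-then-2A test and the constant 3A test above don't land on the same number.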


One more time.

An accurate battery capacity test requires a precisely held CONSTANT CURRENT load.

Start at a precisely defined Full point, e.g. 4.19V held until current drops to 0.020C.

Stop at, say, 2.99V.

Use a precise timer and multiply the amps rate by the elapsed hours to get your Ah.

Do not rely on a coulometer if you want accuracy.

Do it **exactly** the same way every time: besides the exact same CONSTANT amps rate, use the exact same temperature, ideally the exact same cell holder, dummy load, etc., every time, if you want your benchmark to accurately reveal SoH% as it declines with age and to document the battery wearing out.

Or to confirm a set of identical batteries is equally worn in the same way.

The other "dimension" is ESR, but that is a lot more difficult to establish consistency, very sensitive to any change in temperature, needs to be done at the same exact SoC%, etc
 
john61ct said:
No you won't.

A battery literally holds a different amount of energy depending on the discharge rate. [...] Look up Peukert's Law. [...]

Do not rely on a coulometer if you want accuracy. [...]

The other "dimension" is ESR, but it is a lot more difficult to establish consistency there: it is very sensitive to any change in temperature, needs to be done at the exact same SoC%, etc.

Isn't the capacity "changing" with different currents happening because the voltage sags to the cutoff sooner?
Because many people ride a bike, hit LVC because of the sag rather than the resting voltage, and then limp it back home at 1/4 throttle so the voltage doesn't sag to LVC, using up the remaining capacity.

About internal resistance: I once tested a cell at 4.2V and then at 3.7V and it showed the exact same number.
 