Questions about CC charging

Username1

So I learned that the constant current method varies the voltage applied to the battery in order to maintain a constant current.

My question is: what does constant current mean? Does it mean constant amps or constant watts?

Also, regardless of the answer to the above question, why would you need to vary the voltage to achieve this? Wouldn't applying constant amps and volts result in a "constant current"?

I know these may be very basic questions, but I'm having trouble understanding even after researching it.
 
Constant current means constant amps.

For a charger, current flows into the battery when the voltage of the charger is greater than the battery voltage. The amount of current is determined by the voltage difference between battery and charger, and the resistance of the circuit (resistance of the charger, wiring and battery). This is Ohm's law, I = V / R.

You apply a voltage and depending on the resistance, you get a certain amount of current flowing. Chargers use CC regulation to limit the amount of current going into the battery. You wouldn't want to stress the battery with high current flow, or pull too many amps from the charger and overload it.

The resistance is fairly constant. The battery's resistance changes depending on how charged it is, but not by a huge amount. Chargers have to vary their output voltage, to make sure the current stays below the limit.
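A minimal numeric sketch of that idea, with purely illustrative values for pack voltage and circuit resistance (not from any real charger):

```python
# Ohm's law for the charge circuit: I = (V_charger - V_battery) / R.
# To hold the current constant as the battery voltage rises with state
# of charge, the charger must raise its output voltage by the same amount.

def charger_voltage(target_current_a, battery_voltage_v, circuit_resistance_ohm):
    """Output voltage needed to push target_current_a into the battery."""
    return battery_voltage_v + target_current_a * circuit_resistance_ohm

R = 0.05      # total circuit resistance, ohms (illustrative)
I_CC = 10.0   # constant-current setpoint, amps

for v_batt in (48.0, 50.0, 52.0):   # battery voltage climbing with SoC
    v_out = charger_voltage(I_CC, v_batt, R)
    print(f"battery {v_batt:.1f} V -> charger outputs {v_out:.2f} V")
```

The charger's output tracks the battery voltage plus a fixed I × R offset; that offset is what keeps the current pinned at the setpoint.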
 
Current means amps.

Watts measure power.

Every mainstream battery chemistry uses both CC and CV charging, from lead-acid to the various lithium types.

The "Absorb" voltage setpoint is a maximum, and during the early Bulk stage (aka CC) the charger and battery together "negotiate" the voltage on the circuit in between.

The voltage starts out low and rises with SoC; how quickly depends on C-rate and chemistry / resistance. The charger is usually feeding all the amps it can (CC is a bit of a misnomer: the current is not really fixed, and starts to decline).

When the V setpoint is reached, the charger's regulation circuitry prevents it going any higher, hence CV aka Absorb stage.

The big question is, "How long to hold Absorb ?" aka AHT.

Some chargers just stop on hitting the setpoint, aka CC-only. Since LI does not like to get too Full, at low C-rates that is plenty.

Can use a dumb PSU or DCDC converter for this, coupled with a good HVC.

Other chargers use a time-based AHT, which might be a sophisticated algorithm and/or user adjustable.

With lead banks, which **do** need to get to Full for longevity, this is an important feature.

With LI you want to stop earlier rather than later for longevity.

When precise benchmarking is required, e.g. capacity benchmarking, then "trailing amps" are used, endAmps being the spec for a second setpoint.

As soon as the CC-CV transition is reached, amps start to drop; with LI very quickly, depending on the original C-rate. With lead, it may require 5-7 hours longer to get to 100% Full.

A vendor may spec endAmps = 0.01C; IMO that is too stressful for longevity, but for only occasional testing / maintenance protocols, NBD.

For normal cycling, if the C-rate during Bulk is set very high, then I'd use something like 0.05C for endAmps, with CV set to 4.05 to 4.15Vpc depending if full capacity is required for maximum range.
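The "trailing amps" termination rule above can be sketched like this; the 0.05C figure and the 100 Ah pack size are illustrative assumptions, not a vendor spec:

```python
# endAmps: during the CV/Absorb stage the current tapers on its own;
# charging is terminated once it falls below a second setpoint,
# expressed as a fraction of capacity (a C-rate).

def end_amps(capacity_ah, end_c_rate=0.05):
    """Termination current for a given pack capacity and endAmps C-rate."""
    return capacity_ah * end_c_rate

def should_terminate(measured_current_a, capacity_ah, end_c_rate=0.05):
    """True once the tapering CV-stage current has fallen to endAmps."""
    return measured_current_a <= end_amps(capacity_ah, end_c_rate)

pack_ah = 100.0                        # hypothetical 100 Ah pack
print(end_amps(pack_ah))               # 5.0 A at 0.05C
print(should_terminate(8.0, pack_ah))  # still tapering, keep charging
print(should_terminate(4.5, pack_ah))  # below endAmps, stop
```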

Hope this helps.
 
Addy said:
Chargers use CC regulation to limit the amount of current going into the battery.
Very rarely.

The Grin Satiator can, as can lab-style PSUs; the feature is a user-defined maximum current level.

Current level cannot be "pushed" out from source to load, it is pulled by the latter.

A charge source needs to be **current limiting** with LI chemistries because their high CAR (charge acceptance rate) makes them very thirsty, which can burn the charge source circuitry.

But during the charge cycles, the current is not dynamically regulated by the charger.

The changing current is the result of battery chemistry, SoC / resistance climbing together.



Addy said:
The resistance is fairly constant. The battery's resistance changes depending on how charged it is, but not by a huge amount.
Also inaccurate.

When a 300A source is reduced to half an amp at 99.9% Full, that's a significant change.

Just that with LI, that only happens in the last few minutes.




 
There are true CC sources, e.g. for driving LEDs, motor control, etc.

But normal battery chargers are "CC/CV" and operate as I described above.

Automatic charge termination is the other key feature to call it a "battery charger" as opposed to a dumb PSU or DC-DC converter.

Some also have a third stage, Float, but not usually used for LI charging.

Current limiting is also key, though in some scenarios it is not required.
 
john61ct said:
Addy said:
Chargers use CC regulation to limit the amount of current going into the battery.
Very rarely.

Grin Satiator can, and lab-style PSUs, feature aka user-defined maximum current level.

Just because you don't see many chargers with adjustable current limits doesn't mean they aren't CC regulated.


john61ct said:
But during the charge cycles, the current is not dynamically regulated by the charger.
Current is regulated by varying the charger output voltage. Current is being dynamically regulated.


john61ct said:
Addy said:
The resistance is fairly constant. The battery's resistance changes depending on how charged it is, but not by a huge amount.
Also inaccurate.

When a 300A source is reduced to half an amp at 99.9% Full, that's a significant change.

Just that with LI, that only happens in the last few minutes.

Not inaccurate, you're just misunderstanding. 300A? Half an amp? You're talking about current here; I wrote about resistance. Current goes down because the charger is in CV mode and it's not raising its output anymore, not because the battery resistance is changing.
 
Yes, any source that lets the user adjust both the voltage and the current levels,

in fact just adjustments of the **maximum** level

certainly has more sophisticated control circuitry.

What is going on internally is really not important though, the account I gave from a user POV is accurate for mainstream chargers, even the likes of the Satiator.

There are some more dynamically current-adjusting regulators, for example, scaling an alternator's current down when greater propulsion HP is required from the ICE.

Or doing so when overtemp conditions arise, but still maintaining the voltage level and as much current as is safe.

But such charge sources nowadays are programmable MCU controlled, and often 3-5x pricier than the usual, less sophisticated ones.
 
Username1 said:
So is the CC charger PROVIDING a set amount of amps at all times?
You will not be using a "CC charger".

Just call it a charger. All the ones we use are CC/CV, by definition.

A good way to look at it is, the charge source "makes available" a certain current.

The load (here, a battery) may pull less.

If it tries to pull more, that is why chargers usually are "current limiting".

Sucky ones are just "current protected", latch off like a circuit breaker.

A little better is "hiccup mode", keeps trying after temp falls.

The worst just burn.

But 99% of devices sold as chargers are properly current **limiting** , so a big LI pack won't release the magic smoke.
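The "makes available" idea can be put as a toy model of a current-limited CV source: the load's demand is clamped to a maximum, and when the clamp engages the terminal voltage sags below the setpoint (that sagging region is the CC/Bulk stage). All values are illustrative:

```python
# Toy model of a CC/CV source: the battery would pull
# (V_set - V_batt) / R at the full setpoint; the source clamps
# that demand to I_max by letting its output voltage sag.

def source_output(v_set, i_max, v_batt, r_circuit):
    """Return (current, source terminal voltage) for a CC/CV source."""
    i_demand = (v_set - v_batt) / r_circuit
    if i_demand <= i_max:
        return i_demand, v_set                 # CV mode: setpoint held
    return i_max, v_batt + i_max * r_circuit   # CC mode: voltage sags

# Deeply discharged battery: demand exceeds the limit -> current limiting
print(source_output(v_set=54.6, i_max=10.0, v_batt=44.0, r_circuit=0.1))
# Nearly full battery: demand below the limit -> true CV operation
print(source_output(v_set=54.6, i_max=10.0, v_batt=54.0, r_circuit=0.1))
```

Note the source never "pushes" a current; it only caps what the load pulls, which is why a big LI pack can't draw more than the limit.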

> but then some of the power gets turned to heat by resistance (by varying amounts depending on state of charge), so the charger increases the voltage so the battery is actually RECEIVING the same amount of overall power?

No, and never mind about power here.

Power is watts, just derived from volts times amps.

And no significant heating need be involved in the process I described above.

The battery voltage is usually too low at first for the CV / Absorb setpoint to be reached;

also the C-rate is too low (Ah capacity too high relative to the current), and the chemistry has a certain CAR (charge acceptance rate) based on internal resistance; lead's is way higher than any LI's.

This is the "Bulk" or CC **stage** of the usual CC/CV charge cycle.

______
Think of the charger **striving** to get the combined circuit voltage up to the setpoint. In effect that's all it "knows",

V < setpoint: keep pushing until the **combined circuit** gets there, then hold that setpoint as max V.

And then after the CC-CV transition, usually keep going a bit

as current **naturally falls** (trailing amps, taper down)

until the algorithm says Stop charging - that may mean shut Off, or

maybe drop to Float (for a 3-stage "smart" charger designed for lead, RVs, boat House banks, etc.)

______
At a low Bulk-stage C-rate, SoC is much higher at the CC-CV transition; the Absorb Hold Time for the CV **stage** afterward is much shorter, and may get to 100% Full in just a few minutes. Careful, overcharging can be dangerous!

At a high Bulk C-rate, the CV setpoint is reached more quickly, but SoC is much farther below Full, so a longer AHT is needed to get to 100%.

But remember, staying away from Full is a Good Thing; so long as your use case does not require every mAh of capacity for maximum range, stopping sooner will increase cell / pack longevity.
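The whole cycle described above can be tied together in a toy simulation: Bulk/CC while the limiter is engaged, CV/Absorb once the setpoint is reached, then trailing-amps termination. The battery model here is a crude assumption (EMF rising linearly with SoC plus a fixed resistance); real chemistry is nonlinear, and every number is illustrative:

```python
# Toy CC/CV charge cycle: CC (voltage rising), CV (current tapering),
# then stop at endAmps. Single-cell values, all assumed for illustration.

CAP_AH = 10.0                  # pack capacity, Ah
V_EMPTY, V_FULL = 3.0, 4.2     # EMF at 0% / 100% SoC (crude linear model)
R_INT = 0.05                   # internal + wiring resistance, ohms
V_SET = 4.2                    # CV / Absorb setpoint
I_MAX = 5.0                    # CC limit (0.5C)
END_AMPS = 0.5                 # 0.05C termination current
DT_H = 0.01                    # timestep, hours

soc, mode, t = 0.2, "CC", 0.0
while True:
    emf = V_EMPTY + (V_FULL - V_EMPTY) * soc
    i_cv = (V_SET - emf) / R_INT      # current if the setpoint were applied
    if i_cv >= I_MAX:
        i, mode = I_MAX, "CC"         # Bulk: limiter engaged, voltage sags
    else:
        i, mode = i_cv, "CV"          # Absorb: setpoint held, amps taper
    if mode == "CV" and i <= END_AMPS:
        break                         # trailing-amps termination
    soc = min(1.0, soc + i * DT_H / CAP_AH)
    t += DT_H

print(f"terminated in {mode} mode at SoC {soc:.3f} after {t:.2f} h")
```

With these numbers the CC stage covers most of the charge, and the CV stage is a tail where the current decays toward endAmps, matching the low-C-rate behavior described above.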






 
I just don't understand the purpose of the CC phase. Why not just use CV mode, aka set the voltage to maximum battery voltage, provide a constant amount of amps, and let the battery draw what it wants? Then either disconnect when maximum voltage is reached (for fast charging), or let it continue charging (slowly) to reach full charge.

From what I understand, the charger can't provide more than the maximum voltage at ANY time in the charge without damaging the battery.
 
Username1 said:
I just don't understand the purpose of the CC phase.
There is no "purpose" for it; that's just how the chemistry and physics work while charging.

If you inflict a 5C charge rate, the voltage will jump to the CV transition point more quickly and the whole cycle might only take 8-11min.

That's how to "avoid", or rather minimize, the CC stage. But there is no reason to do so; such a high rate is harmful, reduces longevity, and is more likely to start a fire.

Much healthier to use a low rate, under 0.4C if you aren't in a hurry, CC stage might be 95% of the way to Full, only 5% time spent in CV stage.

So if you like, that is the purpose of the so-called CC stage.

Which is still not literally CC charging, strictly speaking. That requires very specialized, unusual hardware, and is not used much with our batteries.

So in that sense you're also correct.


> Why not just use CV mode, aka set the voltage to maximum battery voltage… and let the battery draw what it wants?

That is exactly what we do.

But for "provide a constant amount of amps", a **maximum** setting makes that amount of current available.

Both the circuit voltage and the actual current drawn are not actually under the control of any circuitry but "negotiated" between the battery and source. The circuitry just sets the parameters and "tries its best" to get to Full as quickly as possible within those boundaries.

Note the regulation may be performed by a device separate from the actual source.

A DC-DC charger with a PSU or alternator upstream. An alt and an external voltage regulator. Solar panels vs the controller. A BMS sending CAN signals to an EV charger, or an EV charger interacting with an EVSE.

> Then either disconnect when maximum voltage is reached (for fast charging)

That is "CC-only", skipping CV stage.

Usually "fast charging" refers to high currents / C-rates available, or with hobby chargers, skipping the cell balancing stage.

> From what i understand, the charger can't provide more than the maximum voltage at ANY time in the charge without damaging the battery

Exactly, an overvoltage means a failed charger, and only a redundant, fail-safe, separate HVC can prevent disaster. That is one of the failure modes a BMS should protect against, for example.

But different chemistries are more robust than others. FLA vs GEL for example, the latter much more easily murdered by overcharging.

Holding AHT too long can also do damage (reduce longevity), not just overvoltage.


 
Username1 said:
I just don't understand the purpose of the CC phase. Why not just use CV mode, aka set the voltage to maximum battery voltage, provide a constant amount of amps, and let the battery draw what it wants? Then either disconnect when maximum voltage is reached (for fast charging), or let it continue charging (slowly) to reach full charge.

From what I understand, the charger can't provide more than the maximum voltage at ANY time in the charge without damaging the battery.
I guess the purpose of the CC phase is to have the shortest time to charge the battery without any harm to it.
 
It just is part of the process, not possible to bypass it completely.

At high C-rates combined with a high-CAR chemistry the CC stage is just shortened, blink and you might miss it.
 
Addy said:
Constant current means constant amps.

For a charger, current flows into the battery when the voltage of the charger is greater than the battery voltage. The amount of current is determined by the voltage difference between battery and charger, and the resistance of the circuit (resistance of the charger, wiring and battery). This is Ohm's law, I = V / R.

You apply a voltage and depending on the resistance, you get a certain amount of current flowing. Chargers use CC regulation to limit the amount of current going into the battery. You wouldn't want to stress the battery with high current flow, or pull too many amps from the charger and overload it.

The resistance is fairly constant. The battery's resistance changes depending on how charged it is, but not by a huge amount. Chargers have to vary their output voltage, to make sure the current stays below the limit.
Everything you said is 100% correct.
 