'On board' 1.5kW charger. Which Alibaba supplier?

fatty said:
How do you have 2000 posts and not know this?
From this very thread, if you cared to read it:

now explain what a "psu" like a meanwell ELG/HLG does differently than a charger.

and i know it. you dont. i actually went to school for this shit. you didnt. i just want you to learn you something so this BS argument between chargers and "psu's" can finally stop.
 
jamiejackherer said:
...and will be able to do some cool stuff, like being able to use a 72v battery charger on a 48v battery by monitoring the battery's voltage and cutting the charge at a preset voltage of say 54v.

As above, any decent charger would detect the battery as overdischarged and not initiate charging.
Also, chargers are limited by power, so the higher voltage charger (if it negotiated low enough voltage) would have a lower current output -- so you would actually be charging slower, not faster. Not cool.
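To put rough numbers on the power-limit point, here is a sketch with hypothetical figures, assuming a 1.5kW charger whose current limit is fixed at rated power divided by rated full-charge voltage (typical for simple CC/CV supplies; the specific voltages are illustrative, not from any datasheet):

```python
P_RATED = 1500.0  # watts; hypothetical 1.5 kW charger

def current_limit(v_rated):
    """Fixed current limit of a power-limited charger, set at its rated voltage."""
    return P_RATED / v_rated

i_72v_charger = current_limit(84.0)   # 72 V class (84 V full): ~17.9 A limit
i_48v_charger = current_limit(54.6)   # 48 V class (54.6 V full): ~27.5 A limit

# Repurposing the 72 V charger onto a 48 V pack keeps its ~17.9 A limit,
# so only ~975 W of the 1500 W rating is actually delivered:
p_delivered = i_72v_charger * 54.6
```

So under these assumptions the higher-voltage charger on the lower-voltage pack charges at roughly two-thirds the current of a native charger of the same power rating.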
 
flippy said:
fatty said:
How do you have 2000 posts and not know this?
From this very thread, if you cared to read it:

now explain what a "psu" like a meanwell ELG/HLG does differently than a charger.

and i know it. you dont. i actually went to school for this shit. you didnt. i just want you to learn you something so this BS argument between chargers and "psu's" can finally stop.

I'm very, very certain you didn't.

If this doesn't illustrate it, I don't think anything will:
psu vs charger.png
Obviously, I've truncated the decreasing PSU current at nominal pack voltage for clarity.

This place is unreal.
 
fatty said:
I'm very, very certain you didn't.
If this doesn't illustrate it, I don't think anything will:
psu vs charger.png
Obviously, I've truncated the decreasing PSU current at nominal pack voltage for clarity.
This place is unreal.

ah, there lies the problem. you have seen a pretty picture but have no clue what you are actually defending or saying.
lets have you dig this hole a little deeper so we can get to the bottom of where your information goes wrong.

please explain to me how a regular lithium battery is charged.
specifically: what controls what, i.e. what controls the current and what controls the voltage at each stage. please explain this for each stage. (spoiler alert: it should not matter in your explanation if it's a charger or a "psu")

for ease of numbers lets assume a 10S 100Ah battery that is fully charged at 42V and both devices are capable of delivering 10A.

added points if you explain what would happen if i were to connect something like a lightbulb to either device: one rated BELOW the current limit and one rated ABOVE the current limit of both devices.

fun fact: if you explain the bonus part correctly you also find out why your pretty picture makes no sense in charging batteries.

note, we are talking about chargers and "psu's" like the ELG/HLG from meanwell with CC/CV capability.
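For what it's worth, the bonus question has a clean answer on paper: an ideal CC/CV source is just two limits, and a resistive load settles in whichever limit binds first. A sketch using the 42V / 10A numbers from the post, with made-up bulb resistances:

```python
def operating_point(v_set, i_limit, r_load):
    """Where an ideal CC/CV source settles when a plain resistor is attached."""
    i_at_full_v = v_set / r_load        # current the load would draw at full voltage
    if i_at_full_v <= i_limit:
        return ("CV", v_set, i_at_full_v)       # voltage limit binds
    return ("CC", i_limit * r_load, i_limit)    # current limit binds, voltage sags

# 42 V / 10 A source, hypothetical bulb resistances:
small_bulb = operating_point(42.0, 10.0, 10.0)  # ("CV", 42.0, 4.2): bulb below the limit
big_bulb = operating_point(42.0, 10.0, 1.0)     # ("CC", 10.0, 10.0): bulb above the limit
```

Note the answer is the same whichever label is on the box: an ideal CC/CV stage has no third behaviour to offer a resistor.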
 
flippy said:
john61ct said:
The sealed Meanwells would be great
if you either, like flippy, don't care about overcharging (not voltage necessarily, just for too long)
or are OK with a CC-only profile and know how to create an HVC circuit to enable automated charge termination
Oh, and figure out the issues around hooking them up in series to get to your desired target voltage.
So maybe not so straightforward. . .

again for clarity:

YOU CANT OVERCHARGE BY FLOATING THE CHARGER AT THE SET VOLTAGE.

(unless you set the voltage too high, but that applies to everything)

if you dont know how charging works, stop commenting or ask for clarification, but stop spreading disinformation.

And again, yes you can, you are wrong and should stop spreading misinformation.

The key word is "overcharge".

Is not defined as "can cause a fire"

Is not defined as "will cause immediately detectable damage"

Is not defined as "voltage too high".

If the desired charge profile is

> Charge to 3.47V and stop, do not hold for CV stage

then holding at 3.47V for any length of time is overcharging.

If the desired profile is

> Charge to 3.43, then hold CV until current tapers to 0.01C

Then allowing CV to continue until current has tapered to 0.005C

is by definition overcharging.

It is harmful to cycling longevity to keep applying voltage to the point that there is no current flowing.

"harmful to cycling longevity" is what I call damage, even if you won't see evidence for decades

Lithium batteries should not be "floated" even at a voltage lower than their resting Full voltage.

And yes, the risk of thermal runaway is also much higher when you fail to terminate charge at a point long before the vendor data sheet's cutoff, especially for 3.6-3.7Vnom cells, and most especially as they wear past the 70-75% SoH EoL point.

Relying on human attention and memory to terminate charging is foolish and dangerous.
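In concrete numbers, the two termination points in the profiles above differ by a factor of two in trailing current. A trivial conversion for the thread's 100Ah example pack (the helper name is made up for illustration):

```python
def end_amps(capacity_ah, c_rate):
    """Convert a C-rate termination threshold into amps for a given pack size."""
    return capacity_ah * c_rate

# 100 Ah pack; the stated profile says "hold CV until current tapers to 0.01C":
terminate_at = end_amps(100.0, 0.01)    # 1.0 A -> stop here
# Letting it taper to 0.005C instead means another stretch held at full voltage:
overshoot_at = end_amps(100.0, 0.005)   # 0.5 A
```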
 
fatty said:
No, it's exactly the charging curve and charge termination "features" that differentiate a charger from a PSU.
No, the charger refusing to charge again after overdischarge is likewise another feature that differentiates a charger from a PSU, because the damaged battery shouldn't just be bulk charged again, without warning and potentially unattended.

Playing with fire is dangerous enough. You shouldn't be encouraging other people to do so.

Exactly right, and thank you!

flippy said:
there is no difference between a mean well CC/CV power supply and a device sold as a charger. they operate in EXACTLY the same way. the only difference is a little light that indicates that the current has dropped below a certain level.
Poppycock.

A proper charge source of whatever type - mains AC, solar, DCDC

actually STOPS charging once its algorithm decides the battery is Full. Nothing to do with stupid blinky lights!

Better ones let the user make adjustments to that based on Ah size, desired Absorb Hold Time (CV stage)

The best allow specified endAmps measured by a shunt at the battery, to directly trigger charge termination.

Others are based on a ratio of the time spent getting to the CV transition, or even a simple egg-timer approach that needs to be calibrated by the user measuring trailing amps

ideally without concurrent loads present sharing the charge current.

 
fatty said:
How do you have 2000 posts and not know this?
He's a bit off in the head I think, not too smart, and a very obnoxious personality to boot.

Really best to just ignore most of the time.

 
john61ct said:
A proper charge source of whatever type - mains AC, solar, DCDC actually STOPS charging once its algorithm decides the battery is Full. Nothing to do with stupid blinky lights!
hardly any charger on the market actually does that, so it's a moot argument. and lithium chargers have no "algorithm" to speak of.
john61ct said:
Better ones let the user make adjustments to that based on Ah size, desired Absorb Hold Time (CV stage)
the charger has NO influence on the duration of the CV part provided it does not actually cut off the output like every charger on the market.
john61ct said:
The best allow specified endAmps measured by a shunt at the battery, to directly trigger charge termination.
Others are based of a ratio of the time spent getting to the CV transition, or even a simple eggtimer approach, needs to be calibrated by the user measuring trailing amps
ideally without concurrent loads present sharing the charge current.
explain how a battery is charged first, then hopefully i can make sense of what this means.

john61ct said:
And again, yes you can, you are wrong and should stop spreading misinformation.
The key word is "overcharge".
Is not defined as "can cause a fire"
Is not defined as "will cause immediately detectable damage"
Is not defined as "voltage too high".
If the desired charge profile is
> Charge to 3.47V and stop, do not hold for CV stage
then holding at 3.47V for any length of time, is overcharging,
If the desired profile is
> Charge to 3.43, then hold CV until current tapers to 0.01C
Then allowing CV to continue until current has tapered to 0.005C
is by definition overcharging.
It is harmful to cycling longevity to keep applying voltage to the point that there is no current flowing.
"harmful to cycling longevity" is what I call damage, even if you won't see evidence for decades
Lithium batteries should not be "floated" even at a voltage lower than their resting Full voltage.
And yes the risk of thermal runaway is also much higher failing to terminate charge at a point long before the vendor data sheet, especially for 3.6-3.7Vnom cells, and most especially as they wear past the 70-75% SoH EoL point.
Relying on human attention and memory to terminate charging is foolish and dangerous.

who/what gave you the idea that a battery is "full" when the current drops below a certain level and should then stop ->ALLOWING<- current to continue to pass to the cell?

that HAVING a battery fully charged shortens its lifespan is not a discussion, that is fact.

but tell me how much MORE damage is done by KEEPING the battery floating at 4.2V, or even at 4.05V for that matter, where it obviously would stay on its own anyway provided the cell is not self discharging. and be VERY specific here. i'd LOVE to see actual lifespan data on that.

do keep in mind in your answer that 99% of "chargers" sold on the market today do NOT shut off their outputs. the only thing they do is turn a light from red to green. (fun fact: in many cheap chargers you can set the red/green switch point yourself with a simple trim pot inside)
 
right, time for a little lesson in battery charging basics. that is more helpful than you guys trying to be high school bullies to the guy with glasses that actually studied for this crap.

a few basics:
a battery is electrically just a resistor, like a capacitor or anything else that can store a charge.
for charging purposes that resistance can be treated as near zero when empty and near infinite when full.

if we were to put an energy source on an empty battery it would "see" basically a dead short: the resistance is extremely low, the amount of current that CAN flow is simply huge, the voltage would not even matter, and any fuse or protection would trip and nothing would happen. this issue is especially present in (super)capacitors.
so to combat this problem you add a current limiting circuit to the energy source. this caps the current to the set limit, usually whatever the energy source can deliver without failing. as this happens the battery gains charge and its voltage naturally rises. the energy source has to do nothing for this to happen; the battery voltage is decided by the chemistry inside the battery and its charge level. this is the CC part of battery charging.
after a while the battery (not the energy source!) reaches a point where the chemistry becomes saturated and can't take in as much energy. at this point the current will naturally fall while the voltage continues to rise, until the energy source reaches its set voltage limit and does not allow the voltage to rise further. this is the CV part of charging.
after an even longer while the chemistry is fully saturated, the voltages of the energy source and the battery are in perfect equilibrium, and no current flows. at this point the battery is charged up to exactly the voltage the energy source was set to.
so apart from limiting the current and the voltage, the energy source does nothing: the battery is in control.
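The three stages described here can be reproduced with a toy simulation. Everything below is invented for illustration (a linear OCV curve and a fixed 50mΩ pack resistance are gross simplifications of real chemistry), but note the source only ever enforces its two limits:

```python
# Toy 10S / 100 Ah pack: OCV rises linearly from 30 V to 42 V with state of charge.
V_SET, I_CC = 42.0, 10.0       # the only two things the source controls
CAP_AH, R_INT = 100.0, 0.05    # invented pack capacity and internal resistance
DT_H = 0.05                    # 3-minute time steps

def ocv(soc):
    """Open-circuit pack voltage for a given state of charge (toy linear model)."""
    return 30.0 + 12.0 * soc

soc, currents = 0.0, []
while True:
    # Whichever limit binds: the CC cap, or what the CV cap allows through R_INT.
    i = min(I_CC, (V_SET - ocv(soc)) / R_INT)
    if i < 0.01:               # tapered to ~nothing: source and battery in equilibrium
        break
    soc += i * DT_H / CAP_AH
    currents.append(round(i, 2))

# currents starts as a flat 10 A plateau (CC), then tapers off (CV); soc ends ~1.0
```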

so the only things you need in your energy source are a CC and a CV circuit. and guess what: that is exactly what BOTH a simple meanwell ELG/HLG AND a charger have. the difference is that most chargers have fancy lights or screens on them to tell you what is happening, something you can add with VERY simple parts yourself if you are really bored. just like many people on this very forum have already done by modding the chinese knockoff power supplies, changing the current protection to act as a current limiter to turn them into chargers....

of course you can brute force the CC/CV point to a higher state of charge and just dump huge amounts of current into the battery, as some people on this forum advise, but that only kills it faster. naturally you can also lower the current, stretch out the CC part of the charge cycle, and increase lifespan.

added bonus lesson: the datasheets of cells do NOT say at what current charging should be terminated. they only tell you at what current the cells were tested to make the datasheet. nowhere does a datasheet say you must also stop charging at that point. they just mention their cutoff so that manufacturers that use their product have a way to verify the specs on the datasheet. nothing more.

this concludes the lesson for today and i hope you learned something.
 
flippy said:
added bonus lesson: the datasheets of cells do NOT say at what current charging should be terminated. they only tell you at what current the cells were tested to make the datasheet. nowhere does a datasheet say you must also stop charging at that point. they just mention their cutoff so that manufacturers that use their product have a way to verify the specs on the datasheet. nothing more.

"Charge cut-off Voltage 4.2V±0.05V CC/CV"
"Charge cut-off Voltage 3.8V CC/CV"
"End current(Cut off) 50mA"
"Cut-off current 0.02C, charge time is less than 2 hour."
 

Attachments

  • 3.7V 10Ah 10C.pdf (377.4 KB)
  • LFP 3.2V 10Ah 5C.pdf (161.5 KB)
  • LG 21700.pdf (368.7 KB)
  • QB26800 DATASHEET.pdf (252.3 KB)
fatty said:
flippy said:
added bonus lesson: the datasheets of cells do NOT say at what current charging should be terminated. they only tell you at what current the cells were tested to make the datasheet. nowhere does a datasheet say you must also stop charging at that point. they just mention their cutoff so that manufacturers that use their product have a way to verify the specs on the datasheet. nothing more.

"Charge cut-off Voltage 4.2V±0.05V CC/CV"
"Charge cut-off Voltage 3.8V CC/CV"
"End current(Cut off) 50mA"
"Cut-off current 0.02C, charge time is less than 2 hour."


read better.

you are referencing the TESTING/PERFORMANCE part of the datasheet. the actual SPECIFICATIONS dont mention anything about a current cutoff. if they do mention it, it's for the capacity test, not that you MUST cut charging. the cutoff is only useful for getting the official rated capacity.
nowhere in any datasheet do they say you should not float charge them, nor what energy source you should use to charge them.

reading datasheets is a skill, but it's not hard if you first separate the sections. but to be fair, some datasheets are shit.


ps: i am not arguing that it would not be better to cut the charging off. but virtually no charger you can buy, or that is sold with a device, actually does this, as cutting off also kills any chance the bms has to balance the pack. "better to cut off" is actually false in real life: you kill the battery with the imbalance you create by not letting it float so the bms can balance the pack.

you might not want to float the battery but you simply have to. especially when the battery ages.

fun fact: every phone on the planet with a lithium cell does not cut off its charging, it also floats as long as the power supply is connected in order to squeeze out every last Wh from the cell.

so unless 99% of the billions of devices in and above this planet are designed wrong i would not worry too much about the current cutoff. the voltage is WAY more important.
 
I know most true chargers for lithium in the hobby sphere use an endAmps spec proportional to how much current the battery pulls in the early CC / Bulk stage.

There are many 12/24V chargers for house-bank use, and inverter/chargers, that combine with coulomb-counting BMs to let you specify a fixed endAmps termination point

but of course they are pricier than those that use simpler algorithms.

Here's an example commentary on Victron's AHT algorithm, which works well in practice but is very different from most.

https://www.cruisersforum.com/forums/f14/victron-mppt-battery-charging-algorithm-202526.html

The Victron community forums have hundreds of posts going into a lot more detail. The company makes great products, great flexible pricing scheme on their solar gear.

Not so relevant to high-voltage use cases -- I think they top out at 48V -- but excellent documentation for those past the Dunning–Kruger stage and open to learning more.

I'd be very curious to learn the details on Justin's Satiator's AHT algorithm.

 
miro13car said:
waterproof charger - that is weird.
Why would that be weird, for an on-board charger?

FWIW, I've been using the Meanwell HLG-600H-54A bolted to the bottom of the SB Cruiser trike for 2-3 years now (long enough I lost track). It's been submerged at least twice, maybe three times, during flash floods on my commute home from work, with no problems.


I have a smaller 12V HLG running an LED panel on the top of a pole in the middle of my yard for a very bright yard light, and though it doesn't rain much here, it rains a lot for a while whenever it does. No problems with that, either.
 
john61ct said:
I know most true chargers for lithium in the hobby sphere use an endAmps spec proportional to how much current the battery pulls in the early CC / Bulk stage.
There are many 12/24V chargers for House banks use, and inverter/chargers, that combine with coulomb counting BM's to let you specify a fixed endAmps termination point
but of course they are pricier than those that use simpler algorithms.
Here's an example commentary on Victron's AHT algorithm, works well in practice but very different from most.
https://www.cruisersforum.com/forums/f1 ... 02526.html
The Victron community forums have hundreds of posts going into a lot more detail. The company makes great products, great flexible pricing scheme on their solar gear.
Not so relevant to high-voltage use cases, I think top out at 48V, but excellent documentation for those past the Dunning–Kruger stage and open to learning more.
I'd be very curious to learn the details on Justin's Satiator's AHT algorithm.
it's the charger that dictates how much current is allowed into the battery in the CC part, not the battery itself. the internal resistance is too low to self regulate until you reach CV. it's like trying to fill a colander with water: you need to plug a lot of holes (charge) before it will retain any water, even if you use a firehose.

for someone that tries to sound so superior you dont seem to understand the basics. your comment implies that it's the battery that limits the current during the CC part. if i understand your statement correctly, that is.

please prove me wrong that a battery can safely control its incoming current by connecting a small, empty hobbyking 3S pouch pack to a fully charged car battery with some fat cables and recording what happens with an amp and voltage meter. also do this indoors please. :lol:

the correct current that should be used to charge a battery is stated on the datasheet. the rest happens naturally with the battery chemistry. the charger decides nothing, you do, according to the datasheet.
and using Ah for charge termination is dumb. it should only be set so high that you never reach that limit unless there is some weird issue that the bms does not see and trigger a shutoff for. it's the job of the bms to keep an eye on the battery, not the charger.

and victron has nice stuff and they understand the engineering. but they are very much not aimed at the consumer grade; they mostly supply OEMs that use their stuff in their own products. still, not even their solar chargers cut off charging.
 
flippy said:
[good information, but irrelevant to the question]

I don't know if you're genuinely misunderstanding the debate or just obfuscating. Nobody is debating that you can charge with a PSU:
fatty said:
Yes, you can charge like this, but it is a fail-deadly approach when a fault/mistake/accident happens.

And of course it can't be CC AND CV AT THE SAME TIME. I drew a 2-second diagram in MSPaint (no curves) to visually illustrate the SAFETY FEATURES I already posted that differentiate a PSU from a proper charger:
fatty said:
No, it's exactly the charging curve and charge termination "features" that differentiate a charger from a PSU.
No, the charger refusing to charge again after overdischarge is likewise another feature that differentiates a charger from a PSU, because the damaged battery shouldn't just be bulk charged again, without warning and potentially unattended.
fatty said:
As above, any decent charger would detect the battery as overdischarged and not initiate charging.

Do you need me to circle it and write it out?
charger.png
1) PSUs don't refuse to charge damaged overdischarged batteries (< U1)
2) PSUs don't safe slow-charge deep-discharged batteries (between U1 and U2)
3) PSUs don't correctly terminate charging (U3)
None of these differences are even remotely up for debate.
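Those three claimed differences amount to a small decision table keyed on pack voltage. A sketch with invented thresholds for a 10S pack (U1/U2/U3 follow the diagram's labels; the exact volts are assumptions, not from any datasheet):

```python
U1, U2, U3 = 25.0, 30.0, 42.0   # hypothetical: damage floor, precharge ceiling, full

def charger_action(pack_v):
    """What a safety-oriented charger does in each voltage band (per the list above)."""
    if pack_v < U1:
        return "refuse"     # 1) overdischarged/damaged: do not start bulk charging
    if pack_v < U2:
        return "trickle"    # 2) deep discharge: slow precharge current only
    if pack_v < U3:
        return "cc_cv"      # normal CC/CV charging
    return "terminate"      # 3) full: actually stop the output

# A bare CC/CV PSU, by contrast, effectively returns "cc_cv" for every input.
```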
 
fatty said:
As above, any decent charger would detect the battery as overdischarged and not initiate charging.

its the job of the bms to stop power flow when something is wrong, not the charger.

and no commercially sold power supply for anything more than hobbyist batteries actually acts like that.

and how will you EVER get out of a low voltage situation if the charger or bms wont allow you to charge anyway? unless you are charging at 1C or more, putting in 0.2~0.5C does not damage the battery in any meaningful way beyond what it already suffered from being overdischarged.

you guys read something somewhere and hang onto it without knowing if it's actually something to worry about in real life.

fatty said:
1) PSUs don't refuse to charge damaged overdischarged batteries (< U1)
2) PSUs don't safe slow-charge deep-discharged batteries (between U1 and U2)
3) PSUs don't correctly terminate charging (U3)
None of these differences are even remotely up for debate.

1: its the job of the bms to do that. 99% of chargers dont give a frock and will charge anyway.
2: its the job of the bms to prevent that. most bms simply turn off below 2.0v per cell and then you have to bypass it anyway to get it back.
3: they dont have to. the battery does that on its own by simply not accepting any more current. that is how 99% of every charger (cheap or expensive) on the market works. it's not the charger that decides when the battery is full, the battery does.


it looks like you are trying to put the job of the bms on the charger. that is not how it works.
 
flippy said:
read better.

you are referencing the TESTING/PERFORMANCE part of the datasheet. the actual SPECIFICATIONS dont mention anything about a current cutoff. if they do mention it, it's for the capacity test, not that you MUST cut charging. the cutoff is only useful for getting the official rated capacity.
nowhere in any datasheet do they say you should not float charge them, nor what energy source you should use to charge them.

reading datasheets is a skill, but it's not hard if you first separate the sections. but to be fair, some datasheets are shit.

Christ, I even attached the datasheets -- all you had to do was open them. None of my examples, which cover Li-Ion to LFP, cylindrical to pouch, tier 1 to generic, are from "TESTING/PERFORMANCE" or standard charging sections -- they are from "the actual SPECIFICATIONS":
1.png
2.png
3.png
4.png
I don't even have any other datasheets available that say otherwise.

Those charging specifications must be adhered to in order to guarantee the other specifications of capacity, max discharge, cycle life, etc.

What, do you think the 4.2V max charge voltage is just a test condition? Or max discharge is just a suggestion?
Do you think your car owner's manual spec for oil capacity is just a test condition?
 
i see that you can read, you just have trouble understanding.

the first page of the datasheet tells you the standard for the official rating they give. it's what you need to do to get the rated lifespan or capacity. you are totally free to change these specifications to suit your case. so you can go higher on the voltage, to say 4.3V, and get fuckall in lifespan but more capacity (look at your phone, it goes to 4.35V in new phones), or go lower and get more, as they do in cars. same goes with current. you can charge or discharge faster and kill the lifespan, or go slower and extend the lifespan.

on your first picture i dont see a mandatory termination current ANYWHERE.
neither on the second. only a voltage is mentioned, as that is the only metric for SoC.
on the third pic you see the testing parameters needed to get the previously stated specifications.
same applies to the fourth pic.

as i stated: please learn to read and understand datasheets before you go off. there is a difference between the actual specifications of the cell and the TESTING PARAMETERS.
 
flippy said:
its the job of the bms to stop power flow when something is wrong, not the charger.
1: its the job of the bms to do that. 99% of chargers dont give a frock and will charge anyway.
2: its the job of the bms to prevent that. most bms simply turn off below 2.0v per cell and then you have to bypass it anyway to get it back.
it looks like you are trying to put the job of the bms on the charger. that is not how it works.

No, you're trying to put the job of the charger on the (possibly non-existent or likely non-functional) BMS. A good and safe charger, as mentioned in this thread and detailed in my posts, will operate safely in the absence of a BMS. You would expect this to be the case, since the charger manufacturer can't foresee that the charger will only be used with a BMS, and the liability and reputation damage of acting otherwise (unsafely) would end the company.


flippy said:
and how will you EVER get out of a low voltage situation if the charger or bms wont allow you to charge anyway? unless you are charging at 1C or more, putting in 0.2~0.5C does not damage the battery in any meaningful way beyond what it already suffered from being overdischarged.

It isn't a low voltage situation -- it's an unsafe overdischarge that has resulted in cell damage. Cells are unbalanced and some could have reversed polarity. You get out of that situation by individually balance charging with the right tool -- which is not a bulk charger, let alone a dumb PSU.
 
flippy said:
i see that you can read, you just have trouble understanding.

the first page of the datasheet tells you the standard for the official rating they give. it's what you need to do to get the rated lifespan or capacity. you are totally free to change these specifications to suit your case.
Yes, I just said that:
fatty said:
Those charging specifications must be adhered to in order to guarantee the other specifications of capacity, max discharge, cycle life, etc.
And no, you can't change the SPECIFICATIONS to "suit your case". You can misuse the cells outside of their specifications, but you can't CHANGE the specifications. Unbelievable.


flippy said:
(look at your phone, it goes to 4.35V in new phones)
You've completely lost it. This is a different chemistry with a higher 3.8V nominal, with "Surface coating and electrolyte additives" -- not standard 3.7V nominal Li-Ion. The manufacturer certainly isn't just "changing the specifications to suit their case."


I think we're done here. You're willfully ignoring reality and spouting dangerous nonsense.
 
john61ct said:
It is harmful to cycling longevity to keep applying voltage to the point that there is no current flowing.
In the context of this discussion (charger cutoff @ XmA vs. leaving a charger connected at CV for Y hours, for example overnight), you're wrong.

I'm sure you will agree that connecting 2 equal cells in parallel is not harmful to cycling longevity, right?

If you connect a cell @ 4.2V to a charger (voltage source) @ 4.2V, no current flows. It's exactly like connecting a cell @ 4.2V to another cell (voltage source) @ 4.2V, no current flows.

By definition, when no current flows, NOTHING IS HAPPENING.
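The claim above is just Ohm's law around the source-to-cell loop. A one-function sketch (the 50mΩ loop resistance is an arbitrary illustrative value):

```python
def loop_current(v_source, v_cell, r_loop=0.05):
    """Current driven from a voltage source into a cell through loop resistance r_loop."""
    return (v_source - v_cell) / r_loop

loop_current(4.2, 4.2)   # 0.0 A: equal voltages, nothing flows
loop_current(4.2, 4.1)   # ~2 A flows until the voltages equalize again
```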
 
serious_sam said:
In the context of this discussion (charger cutoff @ XmA vs. leaving a charger connected at CV for Y hours, for example overnight), you're wrong.

I'm sure you will agree that connecting 2 equal cells in parallel is not harmful to cycling longevity, right?

If you connect a cell @ 4.2V to a charger (voltage source) @ 4.2V, no current flows. It's exactly like connecting a cell @ 4.2V to another cell (voltage source) @ 4.2V, no current flows.

By definition, when no current flows, NOTHING IS HAPPENING.

This is sort of pedantic. Two equal cells in parallel will still both rest together to a lower voltage.
That's quite different from constantly topping off a cell to 4.2V, even if there are periods of no current flow.
 
serious_sam said:
In the context of this discussion (charger cutoff @ XmA vs. leaving a charger connected at CV for Y hours, for example overnight), you're wrong.
I'm sure you will agree that connecting 2 equal cells in parallel is not harmful to cycling longevity, right?
If you connect a cell @ 4.2V to a charger (voltage source) @ 4.2V, no current flows. It's exactly like connecting a cell @ 4.2V to another cell (voltage source) @ 4.2V, no current flows.
By definition, when no current flows, NOTHING IS HAPPENING.

it seems like such a simple concept, right?

fatty said:
This is sort of pedantic. Two equal cells in parallel will still both rest together to a lower voltage.
That's quite different from constantly topping off a cell to 4.2V, even if there are periods of no current flow.

no it's not. the cells dont care if the current comes from the cell next to them or an external source; everything is an external source. and what if you just hold the resting voltage with an external source at, say, 4.1V?
 
flippy said:
no it's not. the cells dont care if the current comes from the cell next to them or an external source.
No current is flowing in the paralleled cells example, since they're at equal (4.2V) voltage.

Since you edited:
flippy said:
and what if you just hold the resting voltage with an external source at, say, 4.1V?
This is known to be less harmful than constantly topping off the cells to 4.2V.
What is the question here?
 