adjust amps on charger

goatman

I have an EMC-600 battery charger. I can adjust the voltage down to a 63V minimum, but it will go to about 120V at 3A, 72V at 5A, or 60V at 8A. I'm adjusting it for a 17s4p pack of 25R cells, so 71.4V. I don't know if that's considered a 63V battery.

Would the charger basically change the amps to around 7A because the voltage changed, or do I need to adjust these other 2 pots?

[attached: three photos of the charger's circuit board and trim pots]
 
You should use an ammeter to check, but a little more or less current won't affect charge time much.

Slower is better for longevity, unless very warm ambients.

And 70V is enough for 17S, also better for longevity.
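To put rough numbers on that suggestion, here's a quick per-cell sketch (my own illustration; the candidate cutoff voltages are just example values for a 17S pack):

```python
# Per-cell arithmetic for a 17S pack; candidate pack cutoffs are illustrative only.
SERIES_CELLS = 17

for pack_cutoff_v in (69.7, 70.0, 71.4):
    per_cell = pack_cutoff_v / SERIES_CELLS
    print(f"{pack_cutoff_v:5.1f} V pack cutoff -> {per_cell:.2f} V per cell")
# 69.7 V -> 4.10 V/cell, 70.0 V -> 4.12 V/cell, 71.4 V -> 4.20 V/cell
```

Stopping around 4.1 V/cell instead of 4.2 V/cell is the usual longevity tradeoff being referred to.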

 
What do the other 2 trim pots adjust? Is one to trim current and one to adjust the minimum current before cutoff once the charger goes into Constant Voltage mode... or do both need adjusting to properly trim the current?

On a related topic, why do the identical chargers set for different cutoff voltages require lower current settings when they are used for higher voltage? I thought chargers became more efficient when they produce a charge voltage closer to the input voltage. When I switched my standard pack voltage to 21s I did notice that they seem to run warmer after adjusting the cutoff voltage 4V higher, so knowing the answer to the first question is important to me. I'd gladly give up an amp of charge current for a cooler running charger, because that directly increases life and reliability AFAIC. Plus it should save a bit on my electric bill, a tiny amount but real, and my ebikes do have a noticeable impact on my electric bill since they provide 99% of our family's transportation needs.
 
John in CR said:
On a related topic, why do the identical chargers set for different cutoff voltages require lower current settings when they are used for higher voltage?

because there is a wattage limit for total power conversion within them. a given size /etc unit can only dissipate so much power (heat), and to preserve that limit current must go down if voltage goes up.

some "identical" units that run at different currents and voltages from each other are not actually identical inside, but overall still have the same watt limit as each other due to size/components/etc, so the higher voltage units use less current, and the lower voltage units more current.

if you don't lower the one when raising the other, eventually you exceed the heat dissipation capability of at least one component, and it fails. before that happens, the extra heat everywhere is aging all of the heated parts faster than if you used them within their limits.
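As a rough illustration of that tradeoff (the 360 W budget below is only an assumption read off the 120V/3A and 72V/5A settings in the first post, not a published spec for this charger):

```python
# Sketch: with a fixed power-conversion budget, raising the output voltage
# forces the allowable charge current down. 360 W is an assumed budget here.
ASSUMED_BUDGET_W = 360.0

def max_current(voltage_v: float, budget_w: float = ASSUMED_BUDGET_W) -> float:
    """Largest output current that keeps V * I within the assumed watt budget."""
    return budget_w / voltage_v

for v in (60.0, 71.4, 120.0):
    print(f"{v:6.1f} V -> {max_current(v):.1f} A max")
# 60.0 V -> 6.0 A, 71.4 V -> 5.0 A, 120.0 V -> 3.0 A
```

By that reasoning a 71.4 V pack would top out nearer 5 A than the 7 A guessed earlier, but an inline ammeter or watt meter is the only way to know what this particular unit actually does.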
 
John in CR said:
What do the other 2 trim pots adjust? Is one to trim current and one to adjust the minimum current before cutoff once the charger goes into Constant Voltage mode... or do both need adjusting to properly trim the current?

On a related topic, why do the identical chargers set for different cutoff voltages require lower current settings when they are used for higher voltage? I thought chargers became more efficient when they produce a charge voltage closer to the input voltage. When I switched my standard pack voltage to 21s I did notice that they seem to run warmer after adjusting the cutoff voltage 4V higher, so knowing the answer to the first question is important to me. I'd gladly give up an amp of charge current for a cooler running charger, because that directly increases life and reliability AFAIC. Plus it should save a bit on my electric bill, a tiny amount but real, and my ebikes do have a noticeable impact on my electric bill since they provide 99% of our family's transportation needs.
I need to go to YouTube; Vortecks showed how to adjust the other 2 pots. I think he did it by measuring the watts. I just need to put a couple of XT60s on my watt meter so I can have it inline between the battery and charger.
Hopefully one pot controls the final charging amps so the BMS isn't getting 8A going into it for balancing.
 
amberwolf said:
John in CR said:
On a related topic, why do the identical chargers set for different cutoff voltages require lower current settings when they are used for higher voltage?

because there is a wattage limit for total power conversion within them. a given size /etc unit can only dissipate so much power (heat), and to preserve that limit current must go down if voltage goes up.

When it comes to heat, a similar view explains why so many people have problems with overheating motors on their ebikes. Heat is the product of current flowing through a resistance; it increases linearly with resistance but with the square of current. Voltage doesn't play a direct role. Too bad I didn't measure current as well when adjusting the cutoff voltage, because based on the extra heat my guess is that current went up a bit, since resistance didn't change but voltage did.

If you want to talk about voltage when it comes to heat, then the voltage to use is the voltage drop, because it results from current and resistance. What voltage drop should we look at though? In this case, increasing the charge voltage reduces the voltage drop between the AC input supply and the charger output, which is why to me the charger should become more efficient.

I realize that this is an overly simplistic view of what goes on in a charger, but my question remains. If you maintain the same output current, what things inside the charger get hotter just because you adjust output voltage higher?
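For reference, here is the formula being invoked, in sketch form (a generic resistive element with made-up example values, not any specific part of the charger):

```python
# I^2 * R heating of a single resistive element: doubling current quadruples heat,
# doubling resistance only doubles it. Values are illustrative only.
def heat_w(current_a: float, resistance_ohm: float) -> float:
    return current_a ** 2 * resistance_ohm

print(heat_w(5, 0.1))   # 2.5 W
print(heat_w(10, 0.1))  # 10.0 W  (2x current -> 4x heat)
print(heat_w(5, 0.2))   # 5.0 W   (2x resistance -> 2x heat)
```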
 
You might see "0-60V 0-30A" but in reality the max total watts might be 1000W.

Cheap chargers / PSUs / DC-DC converters have only rudimentary regulation, so it can be easy to release the magic smoke.

If you want good controls, lab-style units like HP, TDK, or Sorensen are worth the money.

Otherwise, with cheap Chinese units, buy spares, beef up the heat sinking, add fans, and stay under 50% of their inflated ratings.
 
That style of charger will have a CC point that is controlled by one of the pots. Changing the voltage won't change the current setting. As others have pointed out, lowering the current will reduce the power which will reduce heating and make your batteries happier.

On my cheap Meanwell clone power supplies, I generally run a "400W" rated supply around 300W and it still gets pretty warm but they have held up well in the long run.
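A tiny helper along the lines of that practice (the 75% figure just mirrors running a "400W" supply at about 300W; the 50% factor echoes the earlier advice for inflated ratings, and both are rules of thumb rather than specs):

```python
# Sketch of a derating rule of thumb for cheaply rated supplies/chargers.
def derated_target_w(nameplate_w: float, derate: float = 0.75) -> float:
    """Suggested continuous operating power given a nameplate rating."""
    return nameplate_w * derate

print(derated_target_w(400))        # 300.0 W, per the example above
print(derated_target_w(600, 0.5))   # 300.0 W, using a 50% factor for inflated ratings
```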
 
I've found this thread valuable for understanding charger operation and setpoints. It mentions charger wattage, and keeping the amperage matched to the voltage output. (That's if you want to use the maximum output... less is never a problem, just a bit slower.) It may also give you hints on which pot is which to adjust (PCB labeling). A good, accurate, and reliable meter and solid connections are a must with these settings, testing on a good and balanced battery.

Good Charger Read...


Regards,
T.C.
 
John in CR said:
I realize that this is an overly simplistic view of what goes on in a charger, but my question remains. If you maintain the same output current, what things inside the charger get hotter just because you adjust output voltage higher?
which things get hotter depends on the design of the charger; you'd have to measure the temperatures of each thing under each condition to find out. but the things that get hotter might not be the parts that eventually fail from the heat.


but it still happens at the root because watts is volts x amps. the resistances and whatnot all come into play within individual components, but as a system, as a whole charger, those are irrelevant to determining one limit based on the other two values. they may make a difference as to *which* parts get hotter in whatever situation, but they generally don't as far as the entire system as a whole.

if you are trying to improve an existing charger so that it can do more output of one or the other, then which parts are generating the heat is relevant. if you're just trying to understand why you must decrease charger amps to increase charger volts, and vice-versa, it's not relevant.

the system (charger) as a whole has a general efficiency, so that is the ratio of how many watts of input will make how many watts of heat.

as you increase the watts it processes, regardless of whether it is by current increase or voltage increase, you increase the amount of heat it produces.

if you are using a certain amount of volts, and increase the amount of amps processed without decreasing the amount of volts processed, then the amount of watts processed, and thus the total amount of watts wasted as heat, increases.

the same is true if you increase the volts without decreasing the amps in a way that results in the same amount of watts processed.


make the charger process enough extra watts in total, and the waste heat due to its overall efficiency will, if not removed quickly enough, cause parts inside the charger to fail (and the system as a whole will age faster because of the higher heat). which specific parts fail depends on the specific design and the environment outside the charger that is absorbing the waste heat; you would have to monitor the charger's individual components in the specific situation where it is occurring to find out which specific components are generating that heat, but they may not be the components that fail, if they are not as sensitive to that heat as other components being heated by them.
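A small sketch of that relationship (the 85% efficiency number is only an assumption for illustration, not a measured figure for these chargers):

```python
# Waste heat grows with processed watts for a fixed overall efficiency.
ASSUMED_EFFICIENCY = 0.85  # illustrative only

def waste_heat_w(output_w: float, efficiency: float = ASSUMED_EFFICIENCY) -> float:
    """Heat dissipated inside the charger to deliver a given output power."""
    return output_w * (1.0 / efficiency - 1.0)

for out_w in (300, 360, 480):
    print(f"{out_w} W out -> {waste_heat_w(out_w):.0f} W of heat to get rid of")
# 300 W -> ~53 W, 360 W -> ~64 W, 480 W -> ~85 W
```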
 
Again, it's amps that create heat, with voltage often completely irrelevant, that is until you need to change components to higher-voltage ones that have higher resistance, like we see in mosfets. Take our ebike motors for example: we can often have them make less heat while running higher power by running at higher voltage and lowering current only slightly. Because the losses in our motors are so heavily skewed toward resistive losses, we can run them at higher power than rated by running them at higher voltage than they are rated for. With chargers we don't even have the nagging core-loss increase like with motors, because we aren't changing the operating frequency. What especially doesn't make sense to me regarding these chargers is that voltage converters commonly have greater efficiency when the output voltage is closer to the input voltage, which is the case when you increase the output voltage on a charger. When it comes to heat you simply can't just look at power instead of current and voltage separately. That's why I ask where the heat is coming from. They have resistors (heat based on current), coils (heat based on current), capacitors (? causes more heat) and what else?
 
Most of the heat is generated from the switching transistor(s) and the diodes. Additional significant heat comes from the transformer, inductor, output capacitors and NTC inrush current limiter. But overheating is not the only failure mode. During switching, you can get some large transients on the switching transistor, diodes and caps. These transients will be larger at higher output voltages and you can have failures caused by overvoltage transients.

Empirically, we have observed these things tend to fail if either the voltage or current is too high on the output. I have a fairly large collection of failed ones.
 
John in CR said:
Again, it's amps that create heat with voltage often completely irrelevant,
voltage isn't irrelevant: if you increase the voltage in a particular branch of a circuit, and the resistance in that branch doesn't change, the current is then higher, and then so is power dissipation. that's just part of ohm's law.

test it out in this calculator
https://www.rapidtables.com/calc/electric/ohms-law-calculator.html
by entering a single resistance, say 10 ohms, and then a single voltage, say 20v, and hit calculate. that gives 2a of current, and 40w of power dissipation.

then change the voltage to 40v, and hit calculate. you just doubled the amps in that simple circuit branch, which quadrupled its power dissipation to 160w.
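the same numbers from that calculator, as a quick sketch:

```python
# Ohm's law on a fixed 10-ohm branch: doubling the voltage doubles the current
# and quadruples the dissipated power (P = V^2 / R).
R_OHMS = 10.0

for v in (20.0, 40.0):
    i = v / R_OHMS
    p = v * i
    print(f"{v:4.0f} V -> {i:.0f} A, {p:.0f} W")
# 20 V -> 2 A, 40 W;  40 V -> 4 A, 160 W
```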

so assuming that simple branch is part of a larger complicated circuit like a charger, the voltage or current somewhere else in it has to drop to bring the power dissipation of the complete circuit back down, so that the unit overall doesn't overheat.

there are limits for each component before failure, so there's going to be some overall limitation to voltage and current in the system as a whole to prevent any component failures, but there's also a general watt limit because of the total power dissipation that the unit as a whole can handle, under as-designed conditions. (changing those conditions by fan cooling or potting or submersion in liquid, etc., can change a lot of individual properties, so those have to be tested for separately).


anyway, the various parts of a charger will have higher heat from higher power dissipation simply because the voltage across those parts is higher.

which parts those are depends on the specific design and conditions at that time.

if you have access to a thermal imaging camera you can use it to see this in realtime. or you can use a laser IR thermometer to make specific measurements of each part and put them in a graphical layout, with different conditions over multiple tests, to see this on paper.
 
Exactly. Like I said before I guessed that the extra heat in mine was due to a slight current increase after I raised the cutoff voltage, but that was shot down with someone saying, no that current stays the same, thus my question "if current stays the same, where does the extra heat come from?".
 
it does vary by model

If the total watts limit is less than (maxAmps * maxVolts), and you are at that limit then as one goes up the other has to come down.

Some devices do not automate that; it's the responsibility of the user to avoid letting the smoke out.

Complicated by the fact that with cheap gear, the **actual** limits may be well below the manufacturer's claimed ratings, or only reachable with beefed-up heat sinking or active cooling.

Pay for a nice HP or Sorensen unit and you don't have to worry, but they can cost 100x more.
 
John in CR said:
Exactly. Like I said before I guessed that the extra heat in mine was due to a slight current increase after I raised the cutoff voltage, but that was shot down with someone saying, no that current stays the same, thus my question "if current stays the same, where does the extra heat come from?".

In order for the supply to reach a higher output voltage, the FET switch needs to be on longer. FET heating will be a function of how long it stays on and the current.
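A back-of-the-envelope sketch of that effect (the duty cycles, current, and Rds(on) below are made-up illustrative numbers, not measurements of this charger):

```python
# Conduction loss in the switching FET grows with on-time (duty cycle) at a given current.
def fet_conduction_loss_w(duty: float, current_a: float, rds_on_ohm: float) -> float:
    """Approximate average conduction loss: duty * I^2 * Rds(on)."""
    return duty * current_a ** 2 * rds_on_ohm

# Same current and same FET, longer on-time to reach a higher output voltage:
print(fet_conduction_loss_w(0.40, 5.0, 0.5))  # 5.0 W
print(fet_conduction_loss_w(0.48, 5.0, 0.5))  # 6.0 W -> hotter for the same current
```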
 
fechter said:
John in CR said:
Exactly. Like I said before I guessed that the extra heat in mine was due to a slight current increase after I raised the cutoff voltage, but that was shot down with someone saying, no that current stays the same, thus my question "if current stays the same, where does the extra heat come from?".

In order for the supply to reach a higher output voltage, the FET switch needs to be on longer. FET heating will be a function of how long it stays on and the current.

Thanks Fechter. That explains some of the increase in controller heat with increased voltage on our bikes as well. Since I have plenty of extra real 4110's, should I consider changing the mosfets on chargers to 4110's next time I have one open...and change the insulator too that I've noticed is often a foamy rubber? I'm sure the factories use the cheapest chinese stuff they can get.
 
4110s won't have a high enough voltage rating. I don't have one handy here but if you look up the stock part number you can see what it's rated at. I'd guess something like 400v. You should look for a part with the same or higher voltage rating and a lower Rds.

You probably can't make much improvement on the diodes as their forward drop is going to be about the same regardless.
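If anyone does go shopping for a substitute, a rough screening check along the lines fechter describes might look like this (400 V is his guess for the stock part; the ~100 V and ~150 V figures are my assumption for the usual 4110/4115 part classes being discussed):

```python
# Quick screen: a replacement FET needs at least the stock part's voltage rating,
# and ideally a lower Rds(on). All figures here are guesses/assumptions, not datasheet pulls.
def usable(candidate_vds_v: float, required_vds_v: float = 400.0) -> bool:
    return candidate_vds_v >= required_vds_v

for name, vds in (("4110-class", 100.0), ("4115-class", 150.0), ("500 V part", 500.0)):
    print(f"{name}: {'usable' if usable(vds) else 'not enough voltage rating'}")
```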
 
fwiw, it's not just the rdson that matters for the heating of the fet in a switching application

because all of the time the fet is not completely off or completely on is time it generates more heat than when fully off or on, these two things also matter:

the gate charge (Qg) spec for the fet determines how long it takes to reach the full-on state (lowest resistance) or full-off state, vs the actual circuit's gate drive design and the current that gate drive can produce.

then the actual fet's rise/fall times when receiving a specific amount of gate voltage and current will also matter for how long it takes to turn off and on,

whether it matters for the charger design in question i couldn't say,


but some of the controller power-stage design stuff i've seen (probably here on es) has shown how a fet with slightly higher rdson might actually make less overall heat than one with lower rdson, if the gate drive isn't sufficient to fully turn the lower-rdson fet on or off quickly enough.
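a toy comparison of that tradeoff (every number here is invented for illustration; real values depend on the actual FET datasheet and gate drive):

```python
# Total FET loss = conduction loss + switching loss. A FET with higher Rds(on) but
# faster transitions (better matched to the gate drive) can end up cooler overall.
def total_loss_w(duty, current_a, rds_on_ohm, bus_v, f_sw_hz, t_transition_s):
    conduction = duty * current_a ** 2 * rds_on_ohm
    # Rough switching-loss estimate: 0.5 * V * I * (t_rise + t_fall) * f_sw
    switching = 0.5 * bus_v * current_a * (2 * t_transition_s) * f_sw_hz
    return conduction + switching

# "Low Rds(on)" FET the gate drive switches slowly vs a higher-Rds(on) FET it switches fast:
print(total_loss_w(0.45, 4.0, 0.3, 340.0, 65e3, 200e-9))  # ~2.2 + 17.7 = ~19.8 W
print(total_loss_w(0.45, 4.0, 0.6, 340.0, 65e3, 50e-9))   # ~4.3 + 4.4  = ~8.7 W
```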
 
Ok, so the mosfet sees the incoming voltage. Since mine all run at 110V but are built to handle 220V AC, then a better matched mosfet should make a nice difference... and maybe even add an extra one in parallel. I have plenty of 4115's too, which should be fine since my AC input runs at 110-117V.

Now that I know what the other 2 trim pots do, I'll start with adjusting my charge current down about half an amp to see if they run noticeably cooler. Add in that my new location at higher altitude gives me consistently 5°F cooler ambient temps, and hopefully hot-charger worries will become a thing of the past.
 
John in CR said:
I have plenty of 4115's too, which should be fine since my AC input runs at 110-117V.
ac input isn't the same as the rectified dc inside the charger.

the dc voltage by the time it reaches the fets is much higher, almost certainly well over the 150v max those can handle. you should measure it, but it could be 300-400vdc depending on input stage design.

that also doesn't account for voltage spikes during switching, since these fets are probably on the coils of a transformer that creates inductive kickback during switching.

i think you're going to want to measure everything involved before you start swapping out parts. it will require an oscilloscope to see the spikes; they won't show up on a multimeter.
 
With a 120vac input, the main input caps will be running around 170v, so 4115s won't handle it. Most primary side transistors I see used are at least 350v rated.

One other thought is that these power supplies are NOT buck converters. They have transformers. The transformer is designed for a particular operating point and if you deviate too much from that point, efficiency will drop.
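The 170V figure is just the rectified peak of the mains; a quick check:

```python
import math

# Peak of a rectified sine: Vac(rms) * sqrt(2). The main input caps sit near this peak.
for vac_rms in (110, 117, 120, 220):
    print(f"{vac_rms} Vac -> ~{vac_rms * math.sqrt(2):.0f} V dc on the input caps")
# 110 -> ~156 V, 117 -> ~165 V, 120 -> ~170 V, 220 -> ~311 V
```

Either way, a 150V-rated 4115 has essentially no margin even at 110Vac input, before counting any switching transients.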
 