Formula to solve for Time

1) Is there a formula that will give the TIME to charge an x-volt, y Ah LiFePO4 bike battery from a 12-volt, z Ah deep-cycle battery? Can I complete this charge without converting to AC?

2) Is there a formula that will give me the TIME to charge a 12-volt, z Ah deep-cycle battery from a solar panel of a watts and b volts?

Thanks.
 
overtonmath said:
1) Is there a formula that will give the TIME to charge an x-volt, y Ah LiFePO4 bike battery from a 12-volt, z Ah deep-cycle battery? Can I complete this charge without converting to AC?

2) Is there a formula that will give me the TIME to charge a 12-volt, z Ah deep-cycle battery from a solar panel of a watts and b volts?

You can use Energy and Power to figure Time. Energy is the ability to do work; Power is the rate at which work is done. Time is then Energy divided by Power.

So let's use Watts (W) for the units of Power, Hours (h) for Time, and Watt hours (Wh) for the units of Energy. So convert everything into these units and then simple math can be used. Remember that Power = Volts * Amps and that Energy = Volts * Charge (Ah).

So say the bike battery is 24V and 10Ah. It has a nominal Energy of 240Wh. If you have a 5A charger, then it has a Power of 24V * 5A = 120W. So the Time it takes to charge is the Energy divided by the Power, or 240Wh / 120W = 2h. 2 hours. This is regardless of the source, as long as the source has enough energy (greater than 240Wh). Also, this calculation does not account for any losses (inefficiency) in the system, which would lengthen the time required and/or demand more power, and therefore more energy, from the source. Obviously a charger would be required between the 12V battery (source) and the bike battery.
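That arithmetic can be sketched in a few lines of Python, using the example figures above (the function name is just for illustration):

```python
def charge_time_h(capacity_ah, battery_v, charger_a):
    """Ideal (lossless) charge time in hours: Energy / Power."""
    energy_wh = battery_v * capacity_ah   # Energy = Volts * Amp-hours
    power_w = battery_v * charger_a       # Power  = Volts * Amps
    return energy_wh / power_w            # Hours  = Wh / W

# 24V, 10Ah bike battery charged at 5A:
print(charge_time_h(10, 24, 5))  # → 2.0 hours
```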

Use the same formula for the solar charging. If your 12V battery is 150Ah, then it has an Energy of 1800Wh. If your solar array has a Power of 200W, then it would take 9 hours. 1800Wh / 200W = 9h. Again, the calculation is for a perfect world where the sun shines bright all the time and there are no losses whatsoever and the battery was completely empty to start and completely full at the end. In the real world, Pb-Acid batteries are inefficient to charge, the sunshine is unpredictable and the charging system will have loss. So it will take longer.

Using the simple power and energy equation can give you a best case time calculation. From there it depends on how good your system is as to how close to that time calculation you can come. You'll never achieve it, but might get to 80 or even 90%, especially charging Lithium batteries which are very efficient. However with the solar charging of Pb-Acid, it might take several times longer to charge than your calculation.
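As a rough sketch, the derating can be folded into the same formula; the 0.8 below is just an assumed overall efficiency in the 80-90% range mentioned above, not a measured figure:

```python
def charge_time_h(energy_wh, power_w, efficiency=1.0):
    """Charge time in hours, derated by overall system efficiency (0-1]."""
    return energy_wh / (power_w * efficiency)

ideal = charge_time_h(1800, 200)        # perfect world: 9.0 h
real = charge_time_h(1800, 200, 0.8)    # 80% efficient system: 11.25 h
```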
 
There's nothing I enjoy more than sitting down to read ES with a paper and pencil... I love it :D. When I look at solar panels they always seem to be listed with watts and volts together, for example 100W 48V or 100W 12V. I would think that the voltage of the panel would have an effect on the time to charge a deep-cycle battery, such as a 12V 150Ah deep cycle.

You stated: "So the Time it takes to charge is the Energy divided by the Power."

So Wh/W = h. That seems self-evident. Also, W = V * A, from which A = W/V follows, and Wh = V * Ah. Here is where I'm a bit confused with the units. The deep-cycle battery has 1800Wh. This is then divided by the watts of the panel, 100W, yielding 18 hours regardless of the volts of the panel. If I try to include the panel volts I get Wh/(V*A), but Amps, or A = W/V. So I get Wh/(V*W/V) and the volts cancel out. There must be something in this system I'm missing, like a converter, inverter, or some other device whose process I don't understand, that adds volts to the numerator or somewhere else. What am I missing? Or is it true that solar panel volts don't fit into this equation?
 
overtonmath said:
There's nothing I enjoy more than sitting down to read ES with a paper and pencil... I love it :D. When I look at solar panels they always seem to be listed with watts and volts together, for example 100W 48V or 100W 12V. I would think that the voltage of the panel would have an effect on the time to charge a deep-cycle battery, such as a 12V 150Ah deep cycle.

You stated: "So the Time it takes to charge is the Energy divided by the Power."

So Wh/W = h. That seems self-evident. Also, W = V * A, from which A = W/V follows, and Wh = V * Ah. Here is where I'm a bit confused with the units. The deep-cycle battery has 1800Wh. This is then divided by the watts of the panel, 100W, yielding 18 hours regardless of the volts of the panel. If I try to include the panel volts I get Wh/(V*A), but Amps, or A = W/V. So I get Wh/(V*W/V) and the volts cancel out. There must be something in this system I'm missing, like a converter, inverter, or some other device whose process I don't understand, that adds volts to the numerator or somewhere else. What am I missing? Or is it true that solar panel volts don't fit into this equation?

The solar panels do fit the equation. Listen to what the equations are telling you. Voltage doesn't matter when you are using power and energy. You can have the same power from infinite combinations of voltage and current, P=V*I, or Watts = Volts * Amps. So your 100W panel could be 10V@10A, or 20V@5A, or 100V@1A, or 25V@4A, etc. As long as the product of V and I equals 100 Watts, the power and energy equation is unaffected.
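The cancellation can be checked numerically; the V/I combinations below are the ones listed above, all multiplying out to the same 100W:

```python
# A 100W panel can be any V*I combination; the charge time is unaffected.
battery_wh = 1800  # 12V * 150Ah deep-cycle battery

combos = [(10, 10.0), (20, 5.0), (100, 1.0), (25, 4.0)]  # (volts, amps)
for v, a in combos:
    power_w = v * a                     # always 100W
    print(v, a, battery_wh / power_w)   # always 18.0 hours
```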

And yes, some type of converter will be needed between the solar panel and the battery. And the specification for this converter will depend on the voltage of the panel(s), as well as the power.

If you were to choose your solar panel for a voltage of 14.8V, then you could hook it directly to the 12V Pb-Acid battery and have it charge the battery... maybe. That's because solar panels do not output a constant voltage. The voltage varies with the amount of light collected and also with the load (current). The battery's charging characteristics also affect the current, depending on its voltage, state of charge, temperature, and other factors.

Typically a converter is used to control the load on the solar cell to operate it at maximum efficiency. Also a converter is used to charge the battery properly; usually called a battery charger. You may be able to find a single converter to combine these functions.
 
The Faraday efficiency (http://en.wikipedia.org/wiki/Faraday_efficiency) of most batteries is greater than 95%, meaning very little of the charge passing through is lost to side reactions. So taking X amp-hours from a battery means something like 1.02X amp-hours will recharge it.
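That bookkeeping amounts to dividing the amp-hours drawn by the coulombic (Faraday) efficiency; the 0.98 default below is an assumed figure consistent with the "about 1.02X" estimate above:

```python
def ah_to_return(ah_drawn, coulombic_eff=0.98):
    """Amp-hours that must be put back after drawing ah_drawn."""
    return ah_drawn / coulombic_eff

print(ah_to_return(100))  # ~102 Ah to recharge after a 100 Ah draw
```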

Charging current starts to flow once the applied voltage exceeds the battery voltage, and it increases as the voltage is raised. The excess voltage is wasted as heat and side reactions, so a faster charge uses more power and energy.

A PV cell is basically a silicon diode constructed so that light drives electrons through the junction to the positive end. If the cell is not connected to anything the voltage builds up until the net electron flow across the junction stops. That is the open circuit voltage. If the cell is externally shorted the electrons flow through the short back to the negative end, that is the short circuit current. Neither of those conditions generate any external power; the maximum power point is at the maximum V*I where the most electrons lose the most voltage when flowing through the external load.
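The maximum power point can be found by sweeping the I-V curve and taking the largest V*I product; the points below are made-up numbers for a hypothetical small panel, not real panel data:

```python
# Illustrative I-V points: from short circuit (0V, max current)
# up to open circuit (max voltage, 0A). Neither endpoint delivers power.
iv_points = [(0.0, 5.0), (14.0, 4.8), (17.0, 4.4), (19.0, 3.0), (21.0, 0.0)]

# The maximum power point is wherever V*I peaks.
v_mpp, i_mpp = max(iv_points, key=lambda vi: vi[0] * vi[1])
print(v_mpp, i_mpp, v_mpp * i_mpp)  # ~74.8 W at 17V
```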

So when a PV panel is connected to a battery some current will flow if the voltage is high enough. It may not be the maximum power but it will be the maximum current at that particular voltage. The early PWM chargers (e.g. Trace C40) used a low-resistance FET switch to connect the panel to the battery; switching that at a high speed allowed tapering the current near the end of the charge before a final cutoff. The bulk of the charging was still a direct connection of the panel to the battery.

But put an inductor in series between panel and charger (making a buck converter) and a wonderful thing happens. When the FET is switched on any excess voltage appears across the inductor rather than the battery. Then when the FET turns off the magnetic field in the inductor collapses and continues to drive current through the battery. Switching at 10s of kilohertz allows the panel to operate at a greater voltage and reduced current output, while the battery sees a smaller voltage and correspondingly higher current.

MPPT chargers operate this way, but if the panel voltage is greater than the battery voltage all you need is a cheap buck converter with current feedback into the PWM duty cycle. Maximum current through the battery is in either case the maximum charging rate.
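The ideal buck relationship is just power conservation: the battery sees a higher current at its lower voltage. A minimal sketch, with hypothetical panel and battery voltages (a real converter also has switching and conduction losses):

```python
def buck_output_current(v_panel, i_panel, v_battery):
    """Ideal (lossless) buck converter: power in = power out,
    so I_out = I_in * V_in / V_out. Duty cycle is ~V_out / V_in."""
    return (v_panel * i_panel) / v_battery

# 34V panel at 3A stepped down to a battery charging at ~13V:
print(buck_output_current(34, 3.0, 13))  # ~7.85 A into the battery
```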

But in both cases, the faster you charge the more energy you lose.
 
Once you get used to thinking in Watts and Watt-hours, the math gets easy. Even with my usual failure to add, subtract, and multiply right, a calculator makes it something even I can do.

Since you are getting into this, you might add a Kill A Watt meter to your bag of tools. It's a fun device that measures the AC power draw of any 110V item in your house, like a Cycle Analyst (CA) for your refrigerator. It's a must-have for deciding which home appliances to keep and which to dump when you are getting your house lean and mean for economy.

Before I got a CA, I used to use mine to tell when my battery was getting close to full: the current draw shown on the Kill A Watt would drop.
 