The most intractable EV engineering issue? State of charge.

Lapwing

http://green.autoblog.com/2010/01/2...-charge-of-lithium-batteries-problematic-for/

It would be great to brainstorm an ideal battery "fuel" gauge.

A few meters have been useful for me - the Heart Interface/Xantrex E-Meters and the Cycle Analyst. I'm not sure they are a good starting point for the perfect fuel gauge, though.

Let me define what I want.

I want to use up to 80% of my battery capacity on the road. I want to leave between 5% and 10% untouched at the top of the pack when charging, and 10-15% at the bottom; that will ensure long battery life. However, I want to charge opportunistically, at any time, for arbitrary lengths of time, without fuel gauge drift. :mrgreen:

In between, I want to know exactly how much juice I have left. I would prefer not to fully charge the battery more than once every 100 cycles just to recalibrate the fuel gauge. This implies a very accurate solution!

I want to know when a cell in the pack shows signs of being "unhealthy" compared to others in the pack.
I want to know about any significant temperature imbalances between cells in a pack.


So what's the best approach?
 
So, here's what I think you gotta do: build a model of battery capacity and track all current flowing into and out of the battery pack. You need to keep track of both charging and regen braking on the inflow side, and power used by the motor and accessories on the output side. A full recharge resets the model to 'full', which helps deal with drift between the model and the actual battery state.
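
A minimal sketch of that bookkeeping (the struct name and the 40 Ah figure are made up for illustration, not from any real BMS):

[code]
// A minimal sketch of the bookkeeping described above; the names and the
// 40 Ah figure are made up for illustration, not from any real BMS.
struct PackModel {
    double capacity_ah  = 40.0;   // assumed nominal pack capacity
    double remaining_ah = 40.0;   // model state, starts at "full"

    // amps > 0 : discharge (motor, accessories)
    // amps < 0 : charge (wall charger, regen braking)
    void integrate(double amps, double seconds) {
        remaining_ah -= amps * seconds / 3600.0;
        if (remaining_ah > capacity_ah) remaining_ah = capacity_ah;
        if (remaining_ah < 0.0)         remaining_ah = 0.0;
    }

    // Call when the charger signals a completed full charge: this is the
    // "reset to full" that wipes out accumulated drift.
    void resetToFull() { remaining_ah = capacity_ah; }

    double stateOfCharge() const { return remaining_ah / capacity_ah; }
};
[/code]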

Now, the hard part is adding various 'corrections' to the model. Your battery has less capacity when it's cold, batteries lose capacity as they age, etc. So, ideally, there is a voltage-based correction added in from time to time, but it has to be carefully controlled for battery temperature and amperage load. It might be easiest to apply this correction only when the motor is idle, so it can sense 'resting' voltage, which is why the fuel gauge is model-driven first rather than voltage-driven first.
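
The resting-voltage correction could be as simple as a lookup table, something like the sketch below; the table values are placeholders, not real LiFePO4 data:

[code]
// Illustration only: a resting-voltage correction applied when the pack has
// been idle long enough to be at open-circuit voltage. The table values are
// placeholders, not measured LiFePO4 data; a usable table would be measured
// for the specific cells at a known temperature.
struct OcvPoint { double volts_per_cell; double soc; };

const OcvPoint OCV_TABLE[] = {
    {3.40, 1.00}, {3.33, 0.90}, {3.30, 0.70}, {3.28, 0.50},
    {3.25, 0.30}, {3.20, 0.15}, {3.00, 0.05}, {2.80, 0.00},
};

// Linear interpolation into the table. Only meaningful after the motor has
// been idle for a while and at a moderate, known temperature.
double socFromRestingVoltage(double voltsPerCell) {
    const int n = sizeof(OCV_TABLE) / sizeof(OCV_TABLE[0]);
    if (voltsPerCell >= OCV_TABLE[0].volts_per_cell)     return OCV_TABLE[0].soc;
    if (voltsPerCell <= OCV_TABLE[n - 1].volts_per_cell) return OCV_TABLE[n - 1].soc;
    for (int i = 1; i < n; ++i) {
        if (voltsPerCell >= OCV_TABLE[i].volts_per_cell) {
            double span = OCV_TABLE[i - 1].volts_per_cell - OCV_TABLE[i].volts_per_cell;
            double frac = (voltsPerCell - OCV_TABLE[i].volts_per_cell) / span;
            return OCV_TABLE[i].soc + frac * (OCV_TABLE[i - 1].soc - OCV_TABLE[i].soc);
        }
    }
    return OCV_TABLE[n - 1].soc;  // unreachable
}
[/code]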

The good news is that I bet you don't have to be that accurate in general. Being within 5% of the actual state of charge is probably good enough for most vehicles. So much of the model can be approximated and set up for average conditions.

Of course, I suspect you're already thinking along these lines, given your requirements.
 
Once you start thinking about the problem, as you clearly have, you begin to realize just how tough it is.

MikeB said:
.........

The good news is that I bet you don't have to be that accurate in general. Being within 5% of the actual state of charge is probably good enough for most vehicles. So much of the model can be approximated and set up for average conditions.

Of course, I suspect you're already thinking along these lines, given your requirements.

You might be right, but I suspect that accuracy needs to be better than 0.1 percent on the current measurement side just to achieve the "no drift" requirement for 100 cycles (or part thereof). In fact, high-precision current measurement may be the key to getting this sorted. The E-Meter has pretty high accuracy, but after a few cycles the "amp hours left" reading would drift way off unless a calibrating full charge happened. No amount of tweaking the Peukert exponent and temperature calibration in the E-Meter solved this to my satisfaction in the marine AGM installs I typically do. Even after my best tweaking, within 5-7 cycles it would be out by 10-20%.
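
Here's a back-of-the-envelope check of why I think the 0.1% figure matters, using my own assumed numbers (a 40 Ah pack, 80% DoD per cycle, and a 0.1% mismatch between charge and discharge measurements that never cancels):

[code]
// Back-of-the-envelope drift estimate (my own assumptions: a 40 Ah pack,
// 80% depth of discharge per cycle, and a 0.1% mismatch between the charge
// and discharge measurements that never cancels out).
#include <cstdio>

int main() {
    const double pack_ah      = 40.0;   // assumed pack capacity
    const double dod          = 0.80;   // fraction of capacity cycled each time
    const double net_gain_err = 0.001;  // 0.1% uncancelled measurement error
    const int    cycles       = 100;

    double drift_ah = pack_ah * dod * net_gain_err * cycles;
    printf("Worst-case drift after %d cycles: %.1f Ah (%.0f%% of pack)\n",
           cycles, drift_ah, 100.0 * drift_ah / pack_ah);
    // prints 3.2 Ah, i.e. 8% of the pack - so 0.1% really is about the limit
    return 0;
}
[/code]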

What would be the best technology currently available for measuring, say, 100 A DC to high precision, at any voltage from say 12 V to 360 V?

http://www.lem.com/hq/en/content/view/25/101/
I'm leaning towards a high-bandwidth, high-accuracy, closed-loop Hall-effect sensor:
http://en.wikipedia.org/wiki/Hall_effect
 
After quite a bit of searching I found that highly accurate non-shunt DC current sensors do exist. Almost all the Hall-effect ones are at the 1% accuracy level, but I found one type that's way better. It's not Hall effect.

0.0096% for the 150 A version. Now we're talking!


http://www.lem.com/hq/en/content/view/163/152/

http://www.lem.com/hq/en/component/...serie,IT 150-600 S/output_type,instantaneous/

http://www.lem.com/docs/products/it 150-s.pdf
http://www.gmw.com/electric_current/Danfysik/866_867/documents/866-600_Installation_Package.pdf

Is the output of something like this, feeding a microprocessor, the start of an accurate juice meter? (After searching for ULTRASTAB prices I came up with nothing; the company was previously Danfysik before LEM bought them.)

How would you use a sensor like this to keep track of the battery state of charge?

A/D conversion? Microprocessor suggestions? Logging/storage software strategies to maximize accuracy?
 
I've built a "charge state" meter that works OK. It looks rather like a fuel gauge and displays battery capacity on an LCD bargraph. To overcome the drift problem, I just reset the meter at every full charge (it resets to full charge when disconnected from the battery). It measures current using a bidirectional Hall current sensor (so that charge and discharge can be measured in use) and uses a simple Picaxe 08M microcontroller. There is a photo of it working here: http://endless-sphere.com/forums/viewtopic.php?f=2&t=14498#p219880

I've not used it in anger yet, but the final version of the code seems to be working OK.

Jeremy
 
Jeremy Harris said:
I've built a "charge state" meter that works OK. It looks rather like a fuel gauge and displays battery capacity on an LCD bargraph. To overcome the drift problem, I just reset the meter at every full charge (it resets to full charge when disconnected from the battery). It measures current using a bidirectional Hall current sensor (so that charge and discharge can be measured in use) and uses a simple Picaxe 08M microcontroller. There is a photo of it working here: http://endless-sphere.com/forums/viewtopic.php?f=2&t=14498#p219880

I've not used it in anger yet, but the final version of the code seems to be working OK.

Jeremy


You're the man Jeremy. :)
 
Jeremy Harris said:
I've built a "charge state" meter that works OK. It looks rather like a fuel gauge and displays battery capacity on an LCD bargraph. To overcome the drift problem, I just reset the meter at every full charge (it resets to full charge when disconnected from the battery). It measures current using a bidirectional Hall current sensor (so that charge and discharge can be measured in use) and uses a simple Picaxe 08M microcontroller. There is a photo of it working here: http://endless-sphere.com/forums/viewtopic.php?f=2&t=14498#p219880

I've not used it in anger yet, but the final version of the code seems to be working OK.

Jeremy
Your effort is partly what inspired me to start this thread.

Calculating the amount of charge left in a battery is complex. It's not just a matter of tallying what you put in and take out over time.

The full charge reset is the most direct way to reduce complexity, and I love your KISS approach.

On the other hand, this doesn't reflect how my unsophisticated, Oprah-watching sister would use an EV. She would want to plug in for however long she has, then get in and drive. The "gas gauge" must keep track of all the complexity and reflect what's left in the "tank" given the temperature outside, the weight of her lead foot, etc. Being the "oops, I forgot to plug it in last night" type, she needs to charge the car randomly, wherever she can, all the while knowing exactly how much is in the tank.

I would very much like to come to grips, at least in general terms, with the complexity and nonlinearity of LiFePO4 types in use and with regen and battery charging. Batteries like the Headway, A123 and Thundersky/Sky Energy cells show very predictable behavior.

Factors like:
Cell temperature
Cell state of charge
Rate of current in or out
Internal impedance
Lots of variables, but for a given pack of a given cell type it should be possible to come within a few tenths of a percent of the true state of charge, and use that to determine the top and bottom of a pack, recharging to 100% only very occasionally. I'm convinced that this middle-80%-of-the-pack approach will be the way to ensure great pack life.

The current sensors I found range from 60 A to 1000 A and have a 9-pin D-sub connector in common, so the electronics to use them could be standardized. A software interface could provide the needed opportunity to tweak and try algorithms for various packs and cells.

I do wish I had an electronic engineering background. I don't! :?
 
Lapwing said:
Lots of variables, but for a given pack of a given cell type it should be possible to come within a few tenths of a percent of the true state of charge, and use that to determine the top and bottom of a pack, recharging to 100% only very occasionally. I'm convinced that this middle-80%-of-the-pack approach will be the way to ensure great pack life.
It may be achievable, but for the meager gains found at the ends of the range, it might be better to simply add more cells to a pack (a la Prius management).
 
Lapwing said:
http://green.autoblog.com/2010/01/2...-charge-of-lithium-batteries-problematic-for/

I want to use up to 80% of my battery capacity on the road. I want to leave between 5% and 10% untouched at the top of the pack when charging, and 10-15% at the bottom; that will ensure long battery life. However, I want to charge opportunistically, at any time, for arbitrary lengths of time, without fuel gauge drift. :mrgreen:

In between, I want to know exactly how much juice I have left. I would prefer not to fully charge the battery more than once every 100 cycles just to recalibrate the fuel gauge. This implies a very accurate solution!

I want to know when a cell in the pack shows signs of being "unhealthy" compared to others in the pack.
I want to know about any significant temperature imbalances between cells in a pack.


So what's the best approach?

I would compromise on the 100 cycles with no drift and make it 10 cycles between full charges, then just use LVC in the Cycle Analyst and charge through the same shunt as is used for discharge (easy for the standalone CA; it maybe takes a bit more checking for the ones using the controller shunt). One approach might be going backwards through the controller, i.e. disconnect the motor, plug in the charger, and account for FET voltage losses.

A different approach is to build a Fechter/Goodrum BMS with components selected so LVC and HVC stay within your desired range. This will of course not let you charge/discharge fully without swapping the BMS.
 
A question...

What if you measure individual cell voltages down to the millivolt, take the lowest one, and use that as the state of charge?
I know the discharge curve is pretty flat and nonlinear, but there is still variation along the way.

I never did elaborate data collection, but if my Ping sits at 39.0 V resting, I know that I have more left than when it sits at 38.5 V.
And when I do many short trips, I see the resting voltage slowly drop 0.1 V at a time.
 
Velocipede said:
A question...

What if you measure individual cell voltages down to the millivolt, take the lowest one, and use that as the state of charge?
I know the discharge curve is pretty flat and nonlinear, but there is still variation along the way.

I never did elaborate data collection, but if my Ping sits at 39.0 V resting, I know that I have more left than when it sits at 38.5 V.
And when I do many short trips, I see the resting voltage slowly drop 0.1 V at a time.

Too simplistic an approach given all the variables. It can work if you have the right static situation. See this patent:

http://www.wikipatents.com/US-Paten...-determining-the-state-of-charge-of-a-lithium

Some background to the problem. (Lithium discussion starts about 1/2 way down this link.) http://www.mpoweruk.com/soc.htm

There is also a good outline of the issues here: http://pdfserv.maxim-ic.com/en/an/AN3958.pdf I guess what I'm looking for is a coulomb counter with environmental compensation.

Using AC impedance for state of charge - tried this; it doesn't work. http://www.springerlink.com/content/b5bg12kkdpkecbxu/

EDIT: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.80.7974&rep=rep1&type=pdf
 
Lapwing said:
After quite a bit of searching I found that highly accurate non-shunt DC current sensors do exist. Almost all the Hall-effect ones are at the 1% accuracy level, but I found one type that's way better. It's not Hall effect.

0.0096% for the 150 A version. Now we're talking!


http://www.lem.com/hq/en/content/view/163/152/

http://www.lem.com/hq/en/component/...serie,IT 150-600 S/output_type,instantaneous/

http://www.lem.com/docs/products/it 150-s.pdf
http://www.gmw.com/electric_current/Danfysik/866_867/documents/866-600_Installation_Package.pdf

Is the output of something like this, feeding a microprocessor, the start of an accurate juice meter? (After searching for ULTRASTAB prices I came up with nothing; the company was previously Danfysik before LEM bought them.)

How would you use a sensor like this to keep track of the battery state of charge?

A/D conversion? Microprocessor suggestions? Logging/storage software strategies to maximize accuracy?

I'm working on something quite similar. What you've found is interesting, but there are a couple of design issues.

First, these closed-loop sensors typically require an additional power supply, in this case both positive and negative 15 VDC. A little DC-DC converter will do it, but that further complicates the problem.

Second is the voltage level at which you measure this. You have to run it into an A/D converter at some point. The devices you cite put out 200 mA typical into a maximum of 5 ohms, or in the case of the 600 amp unit, 4 mA into 2.5. That's basically a current output into a measurement resistor that winds up as a 1 volt signal at the rated current.

I'm using an Arduino, but I would say the majority of microcontrollers have a single-polarity 0-5 V A/D converter. And it's not terribly great resolution. A 10-bit A/D gives you 1024 discrete levels to represent that 5 volts; with a 1 volt input, you wind up using only about 200 of those. Amplify it? Yes, but then you have whatever THAT circuit does to it.

Finally, the sensor output varies plus or minus 1 volt around zero for current in the two directions, while my A/D only goes from 0 to 5 volts.

I'm actually using a LEM device, a LEM HASS 200-S. It puts out about 2.5 volts at 0 amps, and varies up to 5 volts on the high side and down to zero on the low side with a 10K load resistor. You have a little more flexibility with the load resistor too: if your microcontroller is a 3.3 V system, you simply plug in a different load resistor value.
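
The scaling itself is simple once the zero-current offset is known; a minimal sketch with illustrative constants rather than actual HASS 200-S datasheet values:

[code]
// A minimal sketch of the scaling, with illustrative constants rather than
// actual HASS 200-S datasheet values: the sensor idles at ~2.5 V for 0 A and
// swings toward 5 V / 0 V for current in the two directions, read by a
// 10-bit, 0-5 V ADC (on an Arduino the raw count would come from analogRead).
const float ADC_VOLTS_PER_COUNT = 5.0f / 1023.0f;  // 10-bit ADC, 5 V reference
const float ZERO_CURRENT_VOLTS  = 2.5f;            // sensor output at 0 A
const float AMPS_PER_VOLT       = 200.0f / 2.5f;   // assumed: +200 A at 5 V

// Convert a raw ADC count (0..1023) into signed battery current.
// Positive = discharge, negative = charge/regen (the sign is a convention).
float countsToAmps(int adcCounts) {
    float volts = adcCounts * ADC_VOLTS_PER_COUNT;
    return (volts - ZERO_CURRENT_VOLTS) * AMPS_PER_VOLT;
}
[/code]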

I've found it remarkably accurate, to about an amp, compared with a power supply and load setup putting out between 0 and 200 amps. From 200 to 600 it will be easily accurate enough. My concern is at the low end. My thought was to use a 50 amp version alongside the 200 amp one: if below a certain value, go with the 50 amp reading; if above, go with the 200 amp. But the 200 amp is fairly accurate. I'm getting about 1.5 amps per ADC increment. I could do a bit better than that with the 50 amp at the low end, but it looks good enough.

My take on it is that the granularity and output scaling to an A/D are the overriding issues. Oh, and the LEM HASS 200-S uses a SINGLE 5 VDC power supply, which is just a wire from the Arduino's 5 V supply. The current draw is very low.

I think that AH counting is about the best we can do for battery SOC. By using a microcontroller, I can take various actions. For example, drive a relay to switch the throttle output across a voltage divider to cut its value to 25% of normal (full throttle 25 mph). At another level, I can send a signal to the controller shutting it down.

The reset is going to be pretty simple - a doorbell button. I'll "fully" charge the pack to whatever definition suits me, and reset the microcontroller when I do that. That's basically how the current EVision we use works, and while it is a manual step, it's an easy one. Our Brusa charger also lights a green LED on successful completion of a full charge. I could just as easily use that signal to reset the AH count, so that it only resets on a "complete" recharge. I'll not likely bother.

I'm also going to interface a small GPS module to the Arduino. I think what most Oprah watchers want to know is HOW MANY MILES CAN I GO. A lot of cars have this now, and it varies in accuracy. I've left town in the Cadillac showing a range of 192 miles left; on a drive to St. Louis, 75 miles later it was showing 262 miles left. The RATE of burn changed on the highway relative to the miles. So it is, I suppose, "inaccurate", but I find it useful anyway. By calculating distance travelled against AH used, you can project miles to empty based on what's left. Sure, it's a WAG. But right now we don't even really have a WAG.
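
The projection itself is just a ratio; a minimal sketch with made-up names:

[code]
// A minimal sketch of the "miles to empty" projection, with made-up names:
// divide the amp-hours remaining by the Ah-per-mile burn rate seen so far.
float milesToEmpty(float ahRemaining, float ahUsedThisTrip, float milesThisTrip) {
    if (ahUsedThisTrip <= 0.0f || milesThisTrip <= 0.0f) {
        return -1.0f;  // not enough data yet to make even a WAG
    }
    float ahPerMile = ahUsedThisTrip / milesThisTrip;  // recent burn rate
    return ahRemaining / ahPerMile;                    // projected range
}
[/code]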

Most of it is working pretty well at this point, spitting text out a USB port. The next phase will be displaying it in some sort of graphical format. I'm leaning toward a Linux Pico-ITX box hooked up to one of those motorized 7-inch-screen car stereos with a touch screen and VGA input. But I've also been toying with the idea of linking it via Bluetooth to one of the new Motorola Droid Android phones. It's got a nice display, a great SDK/API for programming, and good Bluetooth. I think I can get them to talk to each other and show some little gauges on the screen - maybe several screens.

Jack Rickard
http://evtv.me

The LEM HASS is available here http://search.digikey.com/scripts/DkSearch/dksus.dll?lang=en&site=US&WT.z_homepage_link=hp_go_button&KeyWords=HASS+200-S&x=0&y=0 at $26. I used a piece of copper bar as the conductor and mounted it in a box with the Arduino, with two heavy terminal bolts on it to hook cables up to.
 
In regard to the digital resolution of the 10-bit ADCs on typical microcontrollers, it's possible to get external ADCs that provide much greater granularity. 16-bit ADCs are fairly obtainable at low frequencies, and dividing 200 amps into 2^16 divisions gives an interval of about 0.003 amps. Errors still accumulate, but they will be very small if you charge or discharge to a known SOC/voltage level at least semi-periodically (i.e., a full charge).
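
For comparison, the per-count resolution at 10 and 16 bits over a 200 A full scale (the 200 A figure is just the example from this thread, not a particular part's spec):

[code]
// Per-count resolution at 10 and 16 bits over a 200 A full scale (the 200 A
// figure is just the example from this thread, not a particular part's spec).
#include <cstdio>

int main() {
    const double full_scale_amps = 200.0;
    printf("10-bit: %.3f A per count\n", full_scale_amps / (1 << 10));  // ~0.195 A
    printf("16-bit: %.5f A per count\n", full_scale_amps / (1 << 16));  // ~0.00305 A
    return 0;
}
[/code]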

It's true the extra supply "complicates" things, but it's no big deal. Something like that is pretty easy.
 
jrickard said:
I'm working on something quite similar. What you've found is interesting, but there are a couple of design issues.

First, these closed-loop sensors typically require an additional power supply, in this case both positive and negative 15 VDC. A little DC-DC converter will do it, but that further complicates the problem.

Second is the voltage level at which you measure this. You have to run it into an A/D converter at some point. The devices you cite put out 200 mA typical into a maximum of 5 ohms, or in the case of the 600 amp unit, 4 mA into 2.5. That's basically a current output into a measurement resistor that winds up as a 1 volt signal at the rated current.

.........................
I knew you would be up to something interesting, Jack. Thanks for sharing.

I found this accessory module which gives a ±10V output that is proportional to the primary current.
http://www.gmw.com/electric_current/Danfysik/866_867/VOM.html

Unfortunately I also uncovered the cost of perfection - OUCH :cry: - all up, around $750 for this kind of precision.

When I have time, I have promised myself an education on how to use microprocessors to do this kind of stuff. Just got to get through the next 3 weeks!
http://www.gmw.com/electric_current/Danfysik/866_867/866.html
 
My really simple state-of-charge indicator uses a cheap LEM sensor too. It seems good enough for the job.

I think that, instead of getting hung up on accuracy, we need to look at what people are already used to. The fuel gauge on a car is probably around 5 to 10% accurate at best, with a repeatability of maybe a couple of percent and a resolution of maybe 5% to 10%. This gives us an aiming point, as that's the cumulative accuracy that users would be happy with.

If we underestimate the charge efficiency by, say, one or two percent, then we're going to get a gauge that will, over time, start to read slightly low (one or two percent on the charge efficiency estimate equates to much less than this in total accuracy). This is OK; all it means is that the user will charge the battery a bit more. If we build in a simple reset that detects the "all shunts on" signal from the BMS and uses it to set the fuel gauge's capacity-remaining variable back to "full", then we've probably achieved the goal.

In use, the gauge would slightly under-estimate the actual remaining battery capacity, pretty much exactly as a car fuel gauge does. It will remain on the safe side of the actual remaining capacity through partial charge/discharge cycles, but with a slightly pessimistic view of what's left. Sooner or later the user will find time to give the battery a full charge, whereupon all the drift induced errors get removed.
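
A minimal sketch of that strategy, with placeholder numbers rather than my actual Picaxe code:

[code]
// A minimal sketch of the strategy (placeholder numbers, not the actual
// Picaxe code): count discharge at face value, credit charge at slightly
// less than its true efficiency, and snap back to full whenever the BMS
// reports every cell shunt active.
const float PACK_AH          = 20.0f;  // assumed nominal pack capacity
const float CHARGE_EFF_GUESS = 0.97f;  // deliberately a little pessimistic

float ahRemaining = PACK_AH;

// Call every sample period. Positive amps = discharge, negative = charge.
void updateGauge(float amps, float seconds, bool allShuntsOn) {
    float ah = amps * seconds / 3600.0f;
    if (ah > 0.0f) {
        ahRemaining -= ah;                     // discharge counted in full
    } else {
        ahRemaining -= ah * CHARGE_EFF_GUESS;  // charge credited pessimistically
    }
    if (allShuntsOn || ahRemaining > PACK_AH) {
        ahRemaining = PACK_AH;                 // full charge wipes out the drift
    }
    if (ahRemaining < 0.0f) ahRemaining = 0.0f;
}
[/code]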

It doesn't need super-duper sensors and A/Ds to do this. The biggest error in my simple gauge is time, not current or voltage measurement resolution. It's impractical to measure the current and voltage too frequently, as the controller has to do a fair bit of work driving the display, and you don't want the display updated too often or it starts to flicker. Mine updates once a second, which seems about right. I chose to count amp-seconds to a resolution of 16 bits, but with 10-bit voltage and current measurements, primarily because I wanted a really simple system. The timing errors are almost certainly greater than the measurement errors on my system, although I could sort of fix that with an external real-time clock. The reasons I didn't go down that route are complexity and the fact that it slowed the thing down. An external real-time clock would mean a sample time of one second minimum, and, because you've added more load on the controller by making it read the external clock, the processing cycle time will probably go up. You then end up with a time reference that is more accurate, but with voltage and current readings that may be less accurate.

To understand this, you need to think about the nature of sampled systems. When the controller measures the voltage and current it does so at a fixed point in time. The software has to assume two things: that the voltage and current are stable at the time of the sampled measurement, and that the two were sampled at exactly the same instant. The latter is wrong, because all practical "hobby" microcontrollers like this read the A/Ds sequentially, so there will be a small time difference (maybe only a millisecond or two) between the two readings. As the first assumption is that nothing changes between sample periods, this doesn't seem to matter too much. However, in the real world, the current and voltage will be changing all the time. It's quite easy to get a few percent difference in power used over a period of one second, which means that the base accuracy of your system, even with really good sensors, will be limited unless you up the sample rate.

Upping the sample rate is the obvious way to get the accuracy up, but this then means getting away from sequential A/D readings and having a couple of flash A/Ds synced to read at exactly the same instant. Over-sampling during the measurement period (say a few hundred samples in a second) and then averaging to get amp-seconds would be the way to reduce the short-period voltage and current measurement variations. However, you've then added a fair bit more cost and complexity, just to get a system that would be far more accurate than most users need. Add in the largely unknown variation in cell charge/discharge efficiency over time, with temperature, instantaneous current etc., and I think that a simple approach that periodically resets on full charge is the most practical option.
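
Purely as an illustration (again, not my actual Picaxe code), the over-sampling and averaging might look like this:

[code]
// Purely illustrative (not the actual Picaxe code): over-sample the current
// many times within the one-second period, average, then accumulate
// amp-seconds once per second.
float readCurrentAmps() {
    // Stand-in for the real sensor read (an ADC conversion in practice);
    // returns a fixed value here only so the sketch is self-contained.
    return 10.0f;
}

float ampSecondsUsed = 0.0f;

void oneSecondUpdate() {
    const int SAMPLES = 256;   // a few hundred samples spread across the second
    float sum = 0.0f;
    for (int i = 0; i < SAMPLES; ++i) {
        sum += readCurrentAmps();
        // a real controller would pace these reads evenly across the second
    }
    float averageAmps = sum / SAMPLES;
    ampSecondsUsed += averageAmps * 1.0f;   // one-second measurement period
}
[/code]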

Jeremy
 
If you reset each time you fully charge, an SOC gauge with ±5% accuracy would make no practical difference in use, IMHO.

With LiCo-based cells, the slope of voltage vs. capacity is such that a very accurate SOC can be determined just by knowing the cell temperature and the resting voltage. With LiFePO4-based cells the slope is very flat, which reduces the resolution, or perhaps makes a reading like this impossible unless the cell temperature is entirely uniform at the time of taking it, which rules out quick and simple live readings. The current-sampling method can fix this problem, but when internal cell temperatures, among other factors, are such a difficult-to-quantify variable in usable cell capacity as well as charging efficiency, I can't understand the need for extremely high resolution.
 
I think you're spot on, Luke. Who actually needs a fuel gauge that reads to better than about 5 to 10%, as long as it always reads in the safe direction?

Ideally, always fully charging the pack and resetting the gauge at full charge would be the way to go, but in my case, because the boat has solar charge, I know that it's going to go for up to a week without getting a full mains charge. From the estimates I've made, my meter might be reading as much as about 7 or 8% low after around 6 or 7 days use without a full charge, with a mix of several hours discharge and several hours slow charge every day. If the charge ever reaches the point where all the BMS shunts come on (indicating full charge) it automatically resets the gauge to full, removing all the accumulated errors. No operator intervention is needed, so the worst that can happen is that the gauge will read a bit lower than the true state of charge, leading to the operator charging it slightly earlier than might be strictly necessary if you wanted to use close to 100% of the battery capacity. For all practical purposes I think this will be just fine.

As soon as I'm happy that my gauge works reliably over a few dozen partial charge/discharge cycles, I'll publish the schematic and code on here. It could easily be scaled to any voltage or current you wanted, just by changing the LEM sensor and altering a resistor value. The parts cost is dominated by the display (around $40), but the code could be changed to allow the use of a cheaper serial display (around $20), as long as you don't want the bar graph. You could opt to change the Picaxe for a different controller; then a cheaper display could be used and you could get the bar graph function back, the downside being rewriting the code. I think that the Cat's Whisker TextStar module is the best bet, because it's small and easy to fit into a limited space, plus it takes a lot of code overhead away from the controller.

Jeremy
 
There are any number of analog amp-hour meters; see e.g. http://encyclopedia.jrank.org/MEC_MIC/METER_ELECTRIC.html. The electrolytic ones (traditionally mercurous nitrate, though copper sulphate works too) reversibly transfer metal from one end of a transparent tube to the other; viewed from the side, the column of metal looks like an amp-hour pointer as it moves along the tube. For extreme accuracy you could run the reading through a microprocessor to account for charge efficiency, shunt temperature, etc.
 
Consider that not every EV driver is a commuter, coming home after the round trip, plugging in and fully charging overnight.

I would think a significant portion of drivers, particularly city dwellers, will be opportunistic, possibly even time-limited, rechargers. Urban dwellers with limited travel times and distances are the very people best suited to EVs, and the worst polluters when using internal combustion (the catalytic converter hardly gets up to temperature).

While I agree that fuel gauge accuracy within 5% is sufficient as a display - 10-20 LEDs in a row will do it - I disagree that the underlying process can afford to be inaccurate.

In a non-regen vehicle (a boat for instance), where you drive and then fully charge the pack, you can get away with the reset and relaxed accuracy (no accumulation of errors).

What about that city dweller, parking mostly on the street perhaps? The vehicle is AC-powered with good regen, and does a significant portion of opportunistic, time-limited public charging (no full-tank reset).

By my estimate, two factors are very important:
1) There needs to be underlying accuracy in counting coulombs (high sample rate, averaging, data sifting for spurious 'noise' values, fuzzy logic, etc.).
2) Also needed is some kind of cell performance model to predict coulomb conversion efficiencies into and out of the batteries under different conditions (a rough sketch of what I mean follows below).
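
Something along these lines, with entirely made-up coefficients that would have to be fitted to bench data for the actual cells:

[code]
// Coulomb counting with a temperature- and rate-dependent efficiency model.
// The coefficients are placeholders; a real model would be fitted to bench
// data for the specific cell type.
#include <cmath>
#include <algorithm>

double pack_ah_remaining = 40.0;   // assumed 40 Ah pack, starting full

// Hypothetical empirical model: efficiency falls when cold and at high rates.
double coulombEfficiency(double cellTempC, double amps) {
    double eff = 0.995;                                 // near unity for LiFePO4
    if (cellTempC < 10.0) eff -= 0.002 * (10.0 - cellTempC);
    eff -= 0.0001 * std::fabs(amps);                    // small rate penalty
    return std::max(eff, 0.90);
}

// Positive amps = discharge, negative = charge (regen or plug-in).
void accumulate(double amps, double seconds, double cellTempC) {
    double ah  = amps * seconds / 3600.0;
    double eff = coulombEfficiency(cellTempC, amps);
    if (ah > 0.0) pack_ah_remaining -= ah / eff;   // drawing out costs a bit extra
    else          pack_ah_remaining -= ah * eff;   // charge credited at less than 100%
}
[/code]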

The purpose of this way more complex solution is to know what's left in the tank at the end of a week of partial plug-in charges and short, stop/start, traffic-congested trips, with a hill or two along the way.

Small cumulative errors add up over time to something misleading, unless there is underlying accuracy and an accurate model interpreting what is measured. The awesome thing about LiFePO4 types is that cell performance is very predictable over thousands of cycles. Modeling it empirically should be easy (if boring). Heck, I wouldn't be surprised if a good software model on an environmental bench could teach itself about the cells :wink: .

EDIT: I'm almost hesitant to voice my real reason for wanting this kind of precision.

I think a very accurate determination of state of charge is the best way I can think of to protect cells from over-discharge. I've played with simple LVC at 2.1 V and 2.3 V and even 2.8 V, and under adverse conditions (elevated temperatures for instance) it's not always enough to keep cells from reversing and being damaged. When it's cold, false tripping under load is an issue, even with near-full cells. No, I don't see a way around eventually needing to crack this "true state of charge" problem.

What does Tesla do?
 
dak664 said:
There are any number of analog amp-hour meters; see e.g. http://encyclopedia.jrank.org/MEC_MIC/METER_ELECTRIC.html. The electrolytic ones (traditionally mercurous nitrate, though copper sulphate works too) reversibly transfer metal from one end of a transparent tube to the other; viewed from the side, the column of metal looks like an amp-hour pointer as it moves along the tube. For extreme accuracy you could run the reading through a microprocessor to account for charge efficiency, shunt temperature, etc.

Nice find! :D But I'm trying to kick the mercury before I'm as "mad as a hatter". Anyway, a regular lead/acid habit best explains the state of my grey matter.
 
My case is pretty close to the commuter EV one, albeit at a much lower charge and discharge rate. The boat will go for up to a week of daily use, with no access to grid charging. Charging will be from the solar panels and will be highly variable, both from hour to hour and day to day. Similarly, discharging will be variable, as the boat will have differing run times each day.

Consumers are used to having to fill cars up when the gauge gets low, so what's different about an EV? I guess the obvious difference is the long charge time, but if the gauge is good enough to show the charge consumed during a trip (say, dropping from 3/4 full to 1/2 full), then as long as the user puts back enough charge (as shown on the gauge) to meet the next day's anticipated use, the accuracy doesn't matter too much.

With the strategy I've outlined above, the deliberate, built-in gauge under-reading will result in a point being reached, maybe after many days or weeks of use, when the battery actually gets a full charge. When this happens (and assuming that the pack is fitted with an indicating BMS with some form of "all cells charged" signal, as mine is) the gauge will reset to full, removing the cumulative error built up over time.

It is an interesting intellectual exercise looking at ways of modelling all of the variables that affect the actual battery charge state, but I can't help feeling that it's probably overkill for real-world use.

IIRC, there's at least one production EV that uses a learning system to log battery capacity, which might be another way to look at this problem; in effect it looks at things from the discharge end rather than the charge end. The system measures current and voltage and logs amp-hours used, but changes its internal calibration factor every time the pack voltage drops to a particular level under high current demand. In effect, it's measuring the capacity indirectly, by monitoring battery pack internal resistance and voltage. Increasing Ri can be a reasonable measure of state of charge for some battery chemistries. This sort of system can work well, as long as the user understands that it isn't accurate until the car has been run a few times and the system has been allowed to learn the true capacity of the pack.

Those electrochemical charge state devices have given me the idea for another way to crack this.

What if you had a single cell that wasn't part of the main battery pack electrically (but was physically), and that you used as an electrical equivalent to the devices linked above? You could feed this cell a precise, measured percentage of the charge current, and similarly feed it a precise, measured percentage of the discharge current. Periodically, you could discharge (or charge) this cell fully (during a period when the vehicle was on charge) to measure the true capacity remaining, as against the capacity the gauge was indicating. You could use the difference between the predicted charge remaining and the actual charge remaining to correct the gauge drift error. As long as you automatically recharged it afterwards to the same SOC it was at when you started this calibration process, you'd have a pretty foolproof way of periodically recalibrating the gauge and getting rid of the drift problem altogether. It'd be heaps simpler than trying to model all the other variables, I think.
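
As a sketch of the arithmetic, assuming the pilot cell's capacity is the same fraction of the pack's capacity as the share of current it is fed (all numbers invented):

[code]
// A sketch of the arithmetic (my assumption: the pilot cell's capacity is the
// same fraction of the pack's capacity as the share of current it is fed, so
// its state of charge mirrors the pack's).
const double PILOT_FRACTION = 0.01;   // pilot cell carries 1% of pack current
const double PACK_AH        = 40.0;   // assumed pack capacity

// pilotAhMeasured: amp-hours actually extracted from the pilot cell during its
// calibration discharge, while the vehicle is on charge.
// Returns the corrected pack amp-hours remaining to load back into the gauge.
double recalibratedPackAh(double pilotAhMeasured) {
    double trueAhRemaining = pilotAhMeasured / PILOT_FRACTION;  // scale back up
    // Replace the gauge's value outright rather than averaging, since the
    // pilot measurement is the better reference.
    return (trueAhRemaining > PACK_AH) ? PACK_AH : trueAhRemaining;
}
[/code]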

Jeremy
 