A PIC based Battery Management System

Yes, I can see how you could use one set of voltage monitor connections and let the PIC do both, but another thing to consider is power drain. The TC54 is an extremely low-power chip, drawing microamps. I would think you would need to keep the draw for whatever is used down to under a few milliamps, in order not to exceed the self-discharge rate of the cells themselves by too much.

Like Jeff used in his a123 shunts, we are using an LM431 shunt controller with a variable voltage divider on the input. The output of the LM431 drives a TIP105 Darlington pair power transistor, which is what is used to bypass current. That is basically all you need, but we also have some "extra" parts on each channel in order to drive an LED when the shunt is active, and to turn on an opto output. We are now using a two-channel opto, with the other channel being driven by the LVC TC54 detector.

The shunt opto outputs are used with two diodes on each channel in order to create two signals: one that is low if any of the shunts are active, and a second that is low if all the shunts are active. The "ANY SHUNT ACTIVE" signal is used to activate an op-amp that then controls a FET that the charger negative lead is connected through. The purpose of this logic is to throttle back the current that is allowed to pass, to some adjustable level between about 1/2A and about 10A. The reason for this is so that the board can be used with a variety of heat sink solutions for the TIP105s. The "ALL SHUNTS ACTIVE" signal is used to trigger logic that will shut the current limit FET off completely, once the cells are full.

I could see where the PIC could replace the TC54 and LM431 functions for each channel, as well as eliminating the need for the diode network, and it could also drive a more intelligent display, eliminating the need for individual LEDs and the optos, but you still have the power consumption issue. Although we have active cutoff FETs (two 4110s...) for the LVC function, the rest of our charger control logic is not powered at all unless the charger is connected.

There are certainly lots of really cool things that you could add, like a more robust display capability, and/or incorporating typical Watt's Up/Cycle Analyst functions like Ah used, peak readings, etc. That alone would make this a desirable thing to do.

What LiFeBatt is doing with their big EV hard packs is to use a small BMS board with a microprocessor for every four cells in series. Each of these boards has input and output UARTs so that the boards can be connected in serial strings. Each hard shell case also has two DB9 connectors in between the main terminals so that all the batteries can be daisy-chained together. One end of these can then be connected to a PC and/or vehicle management system to look at data. Each cell is kept track of and peaks are recorded. Low-voltage, over-voltage and over-current signals are also generated, and the PC/VMS can identify the problem down to the cell level.

These 4-cell BMS boards don't do active cutoff for error conditions, and they don't use shunts on each cell either. Instead, they use 100mA per-cell balancer circuits during the charge process. I've questioned this, but their engineers in Taiwan swear up and down that the cells never get more than 1% out of balance. We'll see, I guess.

Anyway, looks like a fun project. :)
 
It seems Atmel makes a few chips in its family specifically for battery charging. They have interesting reference designs: AVR450 (ATtiny- and ATmega8-based chargers) and AVR453, based on the ATmega406.

The ATmega406 comes in a hard-to-use package, but it does everything in one chip. From the datasheet:

Battery Management Features
– Two, Three, or Four Cells in Series
– Deep Under-voltage Protection
– Over-current Protection (Charge and Discharge)
– Short-circuit Protection (Discharge)
– Integrated Cell Balancing FETs
– High Voltage Outputs to Drive Charge/Precharge/Discharge FETs

Peripheral Features
– One 8-bit Timer/Counter with Separate Prescaler, Compare Mode, and PWM
– One 16-bit Timer/Counter with Separate Prescaler and Compare Mode
– 12-bit Voltage ADC, Eight External and Two Internal ADC Inputs
– High Resolution Coulomb Counter ADC for Current Measurements
– TWI Serial Interface for SM-Bus
– Programmable Wake-up Timer
– Programmable Watchdog Timer

So basically all I need is this chip plus an optoisolator ($2) for the SMBus.

I'll dig into the ATtiny chips some more too. They are ultra-low power, can take a wide supply voltage range, can measure big voltages without external components, and have serial communications built in.

Also, I read somewhere that DeWalt uses an Atmel chip in its BMS. Any more info about how theirs is set up?
 
The interesting thing is that a microcontroller with 8 ADC channels costs a tenth of what a 1:16 mux does. For a 12S battery setup you need two 1:16 muxes ($20) plus a differential amp.

Is it at all possible to read voltage from a voltage divider without an op-amp buffer? I know this is an obvious question to any EE, but I can't seem to find any info on that.

So the options are:
1 MCU + 1 opto per cell - no voltage dividers and no buffers, cost ~$3 per cell, a pain to assemble

One 8-input MCU + 16 resistors to divide the voltage + 1 opto - cost $6 per 4 cells
(I suspect this can't be done without op-amps)

One 8-input MCU + 4 two-channel high-voltage differential amplifiers + opto - cost $??? -
but all on one board for up to 8 cells.

What would be a good, cheap, unity-gain (no external resistors), low-power differential amp?

I'm thinking about this one:
http://cache.national.com/ds/LM/LMC6462.pdf

- 20 microAmp power consumption
- rail to rail
- can go to 16V
- two amps in DIP package for $1

The only problem is that it requires external resistors to set it up as unity gain differential.

I'd love something like INA105:
- No external components needed
- Can handle 36V supply
- Too expensive: $5
- Uses 2mA of power
 
tomv said:
Is it at all possible to read voltage from a voltage divider without an op-amp buffer? I know this is an obvious question to any EE, but I can't seem to find any info on that.
:idea: This is the same idea I had. Yes and no. Yes, you can do it, but no, it will not give a "simple" answer. The reason is that if the cells are perfectly balanced then the correct voltage will result on each voltage divider, but if they are imbalanced then it shifts the voltage around IN A PREDICTABLE MANNER.

My thought was that you create an abstract software model that reflects the behavior of the voltage dividers. When the cells become imbalanced the software can figure out what is going on and true values can be calculated.

It's a more abstract (difficult) software model, but a much less expensive and simpler circuit design. Thinking is free (software) but circuits cost $$$.
 
tomv said:
Is it at all possible to read voltage from a voltage divider without an op-amp buffer? I know this is an obvious question to any EE, but I can't seem to find any info on that.

Yes you can, but it depends on the value of the resistors and the ADC. On my AVR the output impedance of whatever you're measuring should be 10K or less; that's because it does a sample and hold by charging a capacitor, and it has to be charged within 1.5 ADC clock cycles, so if the output impedance of what you are measuring is too high then the capacitor won't have time to fully charge.
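To put numbers on that 10K rule, here is a small C sketch (compilable on a PC) that computes the Thevenin source impedance of a divider and the settling time into an assumed sample-and-hold capacitor. The 14 pF capacitance, 1 MHz ADC clock, and resistor values are placeholder assumptions, not figures from any particular datasheet.

/* Sketch: check whether a voltage divider is stiff enough for an ADC
   sample-and-hold input. The S/H capacitance and ADC clock are assumed
   placeholder values -- substitute the numbers from your MCU's datasheet. */
#include <stdio.h>

int main(void)
{
    double r_top = 100e3;   /* upper divider resistor (ohms), example value */
    double r_bot = 12e3;    /* lower divider resistor (ohms), example value */
    double c_sh  = 14e-12;  /* assumed sample-and-hold capacitance (F) */
    double f_adc = 1e6;     /* assumed ADC clock (Hz) */

    /* The ADC "sees" the divider's Thevenin (parallel) resistance. */
    double r_src = (r_top * r_bot) / (r_top + r_bot);

    /* Time available: 1.5 ADC clock cycles, as described above. */
    double t_avail = 1.5 / f_adc;

    /* Time needed for the S/H cap to settle to ~0.1% (about 7 time constants). */
    double t_need = 7.0 * r_src * c_sh;

    printf("source impedance: %.1f kohm (want <= 10 kohm)\n", r_src / 1e3);
    printf("settling needed: %.2f us, available: %.2f us -> %s\n",
           t_need * 1e6, t_avail * 1e6,
           (t_need <= t_avail && r_src <= 10e3) ? "OK" : "marginal");
    return 0;
}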

This mux chip looks pretty cool: dual/differential, 16 channels, $9 from Digi-Key: http://www.analog.com/UploadedFiles/Data_Sheets/292612942ADG726_732_0.pdf
 
dirty_d said:
yes you can but it depends on the value of the resistors



What I found by running a simple simulation was that balanced cells, if lowered in voltage at the exact same rate, tend to produce a voltage output that exactly matches the cell voltage. However, if the cells are lowered at different rates (how we expect real-world cells to discharge), the output becomes disconnected from the cell voltage. So in order to actually use the measured voltage you need the abstract model within the software that deals with the resistors as an overall system.

It's possible to do, but it involves abstract thought... you know that "brains" thing... :lol:


:arrow: Cell Voltage: 3.2, 3.2, 3.2 --- Measured: 3.2, 3.2, 3.2

:arrow: Cell Voltage: 2.5, 2.5, 2.5 --- Measured: 2.5, 2.5, 2.5

:arrow: Cell Voltage: 3.2, 2.5, 2.5 --- Measured: 3.2, 2.85, 2.73

:arrow: Cell Voltage: 3.2, 3.2, 2.5 --- Measured: 3.2, 3.2, 2.96
 
Let's back up a step here and define what a good BMS needs to do:

1. Measure the voltage of each cell and provide a signal to cut the power when the voltage gets too low (LVC).
2. Measure the voltage of each cell and throttle the charging current when the voltage gets too high.
3. Engage a shunt resistor across any cell that reaches the max. voltage threshold. This would not be needed if you are going with single cell chargers, but a single big charger is much handier.
4. Being software based, the upper and lower cell voltage thresholds can be programmable to accommodate different chemistries.
5. Not drain the batteries when they are in standby. This might require some kind of 'sleep mode'. When charging or discharging, the circuit needs to be activated, otherwise it needs a low current standby mode.
6. Easily scalable to any number of cells.
7. Needs to accomplish the above tasks at the least overall cost.
8. Optional features like 'fuel gauge' function and individual cell health displays. If a cell starts to go bad, it would be nice if the circuit would identify which one it is.

I'm sure there are more, but that's a start.

Using a PIC with 4 analog inputs to measure 4 cells might be cheaper than a MUX and one PIC. Using this approach, there would need to be a way for each 4 cell sub-pack to talk to the low voltage cutout and charger. Let's assume that we want to do something like 16 series cells.

With this approach, there would be a PIC for each 4 cells, and a 'Master' board somewhere that collects the signals from each cluster to handle the LVC and charge current.
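As a sketch of how each 4-cell PIC might report to the master board, here is some hypothetical C. The frame layout and the uart_send_byte() and adc_read_cell() helpers are made up for illustration; nothing here is an agreed protocol.

/* Hypothetical per-4-cell module reporting to the master board.
   uart_send_byte() stands in for whatever UART routine the chosen PIC
   provides; the frame layout is an assumption, not an agreed design. */
#include <stdint.h>

extern void uart_send_byte(uint8_t b);           /* assumed, MCU-specific */
extern uint16_t adc_read_cell(uint8_t channel);  /* assumed, returns raw ADC counts */

#define CELLS_PER_MODULE 4

void report_module(uint8_t module_id)
{
    uint8_t i;
    uint8_t checksum = module_id;

    uart_send_byte(0xAA);           /* start-of-frame marker */
    uart_send_byte(module_id);

    for (i = 0; i < CELLS_PER_MODULE; i++) {
        uint16_t raw = adc_read_cell(i);
        uart_send_byte(raw >> 8);
        uart_send_byte(raw & 0xFF);
        checksum ^= (raw >> 8) ^ (raw & 0xFF);
    }
    uart_send_byte(checksum);       /* lets the master spot a corrupted frame */
}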

If there are voltage divider resistors on each tap, then the resistors will be constantly draining the batteries. One possible way around this would be to use small FETs on the divider that go open when in standby mode. The on resistance of the FETs will be very low compared to the rest of the divider circuit.

The MUX chip is good because it will open circuit all the taps when it's off.

How to make the PICs go to sleep and wake up is sort of a challenge. When the charger is connected, they can run off a separate wire powered from the charger, so that part is fairly easy. It's a bit tougher to detect discharge and turn things on. One way to manage that is to take a voltage from the controller (like the throttle supply) and use that to wake it up. Since I'd like to tie into the controller to kill the throttle when I hit the LVC, one more wire is not a big deal.
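A rough C sketch of that wake/sleep logic, with all the helpers as placeholders for the real PIC sleep and pin-change interrupt setup:

/* Stay awake while either the charger-fed wire or the controller's throttle
   supply is present; otherwise sleep until a pin change wakes the part.
   Every helper below is an assumed placeholder, not a real PIC API. */
#include <stdbool.h>

extern bool charger_present(void);            /* senses the charger-fed wire */
extern bool throttle_supply_present(void);    /* senses the controller's throttle supply */
extern void enable_wake_on_pin_change(void);
extern void mcu_sleep(void);                  /* enters low-power sleep */
extern void run_bms_loop_once(void);          /* measure cells, drive shunts, LVC, etc. */

void main_loop(void)
{
    for (;;) {
        if (charger_present() || throttle_supply_present()) {
            run_bms_loop_once();
        } else {
            enable_wake_on_pin_change();      /* either wire going high wakes us */
            mcu_sleep();
        }
    }
}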

I've seen a large variety of purpose made BMS chips from various manufacturers, but most of them are made for Li-Co chemistry. Some of them have serial communication built in to transmit data. There might be something that sits on each cell and sends a voltage measurement via serial when interrogated.

Here's the Maxim MUX: http://www.maxim-ic.com/quick_view2.cfm/qv_pk/1085

Here's the TI MUX: http://focus.ti.com/lit/ds/symlink/mpc506.pdf
 
You could do things like that; that's originally how I was going to do it in my battery charger. The software part is easy: you just multiply each tap reading in software by however much that tap divided the voltage, then subtract the previous tap's post-multiplied voltage from the one you're checking. That's easy enough, but the problem is that an ADC is not perfect. For the lower cells, where the voltage is only divided by 2, the error is only multiplied by 2, but for, say, the 30th cell in the pack, the error will be multiplied by 30.

I think the approach fechter made in the first post of this thread is the best way to do it: you divide every cell in the pack by the same amount, say 10, send it through a mux with a pot adjusted for each cell to get the exact division, then to a differential amplifier with a gain of 10. You can divide and multiply voltages more accurately with analog components in some situations like this.

An example of how it works: take cell zero, 3.9V to ground.
3.9V / 10 = 0.39V
The output of the mux would be the negative terminal of cell zero (0V) and the positive terminal after division (0.39V).
The op-amp outputs the difference multiplied by a gain, (0.39 - 0) * 10 = 3.9V, to the ADC. Since there is no need to do any math in the MCU, the ADC error won't be exaggerated.

For, say, cell 30 at 3.5V with the previous cell voltages adding up to 98.6V:
resistor stage:
98.6 / 10 = 9.86V
(98.6 + 3.5) / 10 = 10.21V
op-amp:
(10.21 - 9.86) * 10 = 3.5V

Dividing each cell tap and doing the multiplication in software is like the difference between using a 10-bit ADC that has a range of 0-5V and another 10-bit ADC that has a range of 0-100V; with the latter you lose a whole lot of resolution.
 
fechter, do you think the mux I posted on page 2 would be OK? It has a dual-output model with 16 channels. The max voltage is 5.5V though, so you would need to divide the voltage more before it gets to the mux. It's about the same price as the MPC506, and you could switch 32 cells with just one 8-bit port on the MCU.
 

I Have the Solution!

First of all, the reason for using a multiplexer to cycle through all the cells instead of using individual components is because you want consistency in your voltage measurements. We all know that there is often significant variation in electronic components, so variations in voltages are likely due to variations in optos, resistors, or other components (that are heating up). The voltage divider solution would be a nightmare in my opinion because you'd never get the voltages standardized.

What I want to do is cycle through each cell and send the voltage through the same resistor and voltage measurement device so they are all consistently measured.

And I found a way to measure isolated voltages:

http://www.parallax.com/Portals/0/Downloads/docs/prod/appkit/ltc1298.pdf
http://www.parallax.com/Store/Micro...efault.aspx?SortField=ProductName,ProductName

The LTC1298 can be configured in two modes ... In two-channel mode, the selected channel's voltage is measured relative to ground and returned as a value between 0 and 4095. In differential mode, the voltage difference between the two inputs is measured and returned as a value between 0 and 4095.

How perfect is that???? So with that resolution each step is 0.0012v. So our voltages should be accurate to plus or minus 0.005v at worst!

So here's what needs to be done: You set up two multiplexers with each of their input pins connected to each cell's terminal. The two channels of the chip above are connected to the output pins of the two multiplexers. So one multiplexer's output will be the negative voltage and the other will be the positive. So when you start the process, MUX1 = cell1- ; MUX2 = cell1+. At this point the PIC will take a reading of the ADC chip (above) and record the voltage. Then the PIC will send a pulse to both "increment" pins of the multiplexers. Then their outputs will "point" to cell2: MUX1 = cell2- ; MUX2 = cell2+. Then the PIC reads the voltage for cell2. The PIC pulses the increment pins again, etc.

There you have it: a PIC that can read 20 cells' voltages to within 0.005v. And it's all a perfectly natural interface for the PIC. Excellent.
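A rough C sketch of that scan loop. The ltc1298_read_differential() and mux helpers are placeholders for whatever SPI/GPIO code the chosen PIC ends up using; the 12-bit scale and 5V reference follow the LTC1298 description above.

/* Sketch of the scan idea: two multiplexers step together through the cell
   taps and the LTC1298 reads each cell differentially. The three helper
   functions are assumed placeholders, not a real driver. */
#include <stdint.h>

#define NUM_CELLS      20
#define ADC_FULL_SCALE 4095.0   /* 12-bit LTC1298 */
#define ADC_VREF       5.0      /* volts, assumed reference */

extern void mux_reset(void);                      /* point both muxes at cell 1 */
extern void mux_pulse_increment(void);            /* advance both muxes one tap */
extern uint16_t ltc1298_read_differential(void);  /* assumed SPI read, 0..4095 */

void scan_pack(double cell_volts[NUM_CELLS])
{
    uint8_t i;

    mux_reset();
    for (i = 0; i < NUM_CELLS; i++) {
        uint16_t raw = ltc1298_read_differential();
        cell_volts[i] = (raw / ADC_FULL_SCALE) * ADC_VREF;
        mux_pulse_increment();    /* both muxes now point at the next cell */
    }
}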

 
Beagle123 said:
First of all, the reason for using a multiplexer to cycle through all the cells instead of using individual components is because you want consistency in your voltage measurements. We all know that there is often significant variation in electronic components, so variations in voltages are likely due to variations in optos, resistors, or other components (that are heating up). The voltage divider solution would be a nightmare in my opinion because you'd never get the voltages standardized.
The Voltage Divider Solution would require that you simulate a circuit in the software, because if you do not, the slightest imbalance in the cells will throw off the results. Basically you could do it, but it would require building a software model that is rather complex. I like using abstract models, but my observation is that electrical engineers prefer more direct solutions. (they prefer the literal over the abstract... "hardcoded" vs "software") The measurement accuracy should be good after you correct the values by running the data through your abstract model. (it's necessary to correct the data before you use it)

The PICs also only have a limited number of analog-to-digital conversion channels inside them. So while they might have up to 33 I/O options, from what I've been able to tell they usually have at most 8 analog-to-digital conversion channels. Based on this you are pretty much forced to go with a multiplexer.

:?: But if analog-to-digital units exist within the PICs (but in limited numbers), why use an external analog-to-digital converter?
 
Beagle, you'd still have a problem unless I'm not getting something. The ADC blows up if it sees more than 5v on the inputs, so you still need a divider. The MUX can't take more than that either.

dirty_d said:
fechter, do you think the mux I posted on page 2 would be OK? It has a dual-output model with 16 channels. The max voltage is 5.5V though, so you would need to divide the voltage more before it gets to the mux. It's about the same price as the MPC506, and you could switch 32 cells with just one 8-bit port on the MCU.

Since the analog input on the PIC only goes up to 5v, then if you fed the output of the MUX straight to the PIC, you'd need to be under 5v anyway.

I say screw the consistency, use the same divider on all channels, and put a "calibration factor" for each channel into the PIC that gets adjusted later. I'm just not sure how hard this would be in practice. You would need to measure each cell with a good multimeter and somehow compare the actual voltages to what the PIC sees to determine the individual cal factors, then input them into the PIC. This would allow you to use crappy cheap 10% resistors and correct everything later in software.

If you just measured each channel through the divider, multiplied by the 'cal factor', then put the voltage for each tap into some registers and subtracted them to get the individual cell voltages, I think the accuracy might be good enough. It would be nice to avoid needing two channels of MUX to do differential measurements.
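A minimal C sketch of that single-ended approach: read every tap through the same divider, scale by a per-channel cal factor, then subtract adjacent taps. The 10:1 ratio, 10-bit ADC, and adc_read_tap() helper are assumptions for illustration; note the quantization error still grows with the tap number, as dirty_d pointed out earlier.

/* Single-ended taps through identical dividers, a per-channel cal factor
   to absorb resistor tolerance, then adjacent-tap subtraction.
   adc_read_tap() and the constants are assumptions for illustration. */
#include <stdint.h>

#define NUM_CELLS     16
#define DIVIDER_RATIO 10.0     /* every tap divided by the same nominal 10:1 */
#define ADC_COUNTS    1024.0   /* 10-bit PIC ADC */
#define ADC_VREF      5.0

extern uint16_t adc_read_tap(uint8_t tap);   /* assumed: raw counts for tap 1..N */

/* One correction factor per tap, found by comparing a good multimeter
   reading of each tap against what the ADC reported. */
static double cal_factor[NUM_CELLS + 1];

void init_default_cal(void)
{
    uint8_t i;
    for (i = 0; i <= NUM_CELLS; i++)
        cal_factor[i] = 1.0;   /* replace with measured values after calibration */
}

void read_cells(double cell_volts[NUM_CELLS])
{
    double tap_volts[NUM_CELLS + 1];
    uint8_t i;

    tap_volts[0] = 0.0;        /* tap 0 is the pack negative */
    for (i = 1; i <= NUM_CELLS; i++) {
        double at_pin = (adc_read_tap(i) / ADC_COUNTS) * ADC_VREF;
        tap_volts[i] = at_pin * DIVIDER_RATIO * cal_factor[i];
    }
    for (i = 0; i < NUM_CELLS; i++)
        cell_volts[i] = tap_volts[i + 1] - tap_volts[i];   /* individual cells */
}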

Another thought: the ideal voltages might depend on temperature. This is true with most chemistries, but I don't know if it would be worth messing with for LiFePO4.

Also, when the cell voltages are below a certain point, the charger should start out in the low current mode and switch to high current when the voltage gets high enough. Something to add to the software.
 
fechter said:
This would allow you to use crappy cheap 10% resistors and correct everything later in software.
:arrow: Bingo!

That's the idea I was going after. It all depends on the amount of RAM that you can get for the software. Ideally you could have a startup routine when you power up the system that would do something like test the system and find out how the resistors are behaving. Once this little trial run is complete you store those values to represent the resistors for that session. When the voltage measurements vary as the cells discharge (as they inevitably will) you correct them using the data you got from your initial testing.

As a former software guy I'm pretty used to "abstracting" ideas... in a sense that's what software does... you take literal reality and you make mental models of it. You have to have a natural love of symmetries and pure forms and have an active imagination. The higher the language level (machine language is low abstraction, object oriented programming is high abstraction) the easier it is to manipulate symbols in abstract ways. The trends in software have been to get more and more abstract... which creates a more pure model.

Anyway... from what I can see what would be GREAT is if you could find a PIC that had all the analog to digital pins you needed and then you could skip the Multiplexer entirely, but I don't think that's going to happen.

Are there multiplexers that can handle high common-mode voltages (like 100 volts)? If you could multiplex the raw cell voltage and then run it through a single INA117 difference amplifier ($5), then that's cost effective (the INA117 can handle a full 200V). But I doubt that the multiplexers can handle it... :cry:
 
I agree with fechter on the software error correction. You could do the same with potentiometers, but at about $1 each, the software way beats it because it's free :D. There's probably still a way to do that with a 2-channel mux and a differential amp though. I was wrong about needing 8 bits for two 16-channel muxes: you can connect four I/O pins on the MCU to the four channel-select pins on both muxes, then use a 5th pin on the MCU to turn the enable pin on each mux on and off. For 2 muxes you could just do it with a couple of transistors, but if you wanted to use a lot of them you would use another mux to switch the enable pins.
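A tiny C sketch of that select-line scheme. The gpio_write() helper and pin names are placeholders for whatever port access the chosen MCU provides.

/* Four shared address bits feed both 16-channel muxes, and a fifth pin
   picks which mux is enabled. gpio_write() is an assumed helper, not a
   real PIC/AVR register name. */
#include <stdint.h>

extern void gpio_write(uint8_t pin, uint8_t level);   /* assumed */

enum { PIN_A0, PIN_A1, PIN_A2, PIN_A3, PIN_MUX_SELECT };

void select_tap(uint8_t tap)    /* tap 0..31 across the two muxes */
{
    gpio_write(PIN_A0, tap & 0x01);
    gpio_write(PIN_A1, (tap >> 1) & 0x01);
    gpio_write(PIN_A2, (tap >> 2) & 0x01);
    gpio_write(PIN_A3, (tap >> 3) & 0x01);
    /* bit 4 picks mux A or mux B; a transistor (or a second pin) drives the
       other mux's enable with the complement so only one mux is active */
    gpio_write(PIN_MUX_SELECT, (tap >> 4) & 0x01);
}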

After some thought:
Yes indeedy, you can do it like that with cheap resistors and a dual-channel mux while maintaining high accuracy, at the expense of time. I don't think any of these MCUs have a floating-point math unit, so it would just take longer to cycle through all the cells, do the math, and take action, but that doesn't really matter as it will probably still happen many times a second.

Here's how I see it going down:

1) A 10:1 divider on each cell going to a dual-channel mux followed by a differential amplifier.
2) Measure the actual voltage of each cell with reference to ground, then measure the output of the mux going to the differential amplifier for that cell, with that cell selected on the mux.
3) Divide the actual voltage by the voltage seen (at the amp * 10), e.g. actual 3.5V, voltage at amp 0.36V, 3.5 / (0.36 * 10) = 0.972, and store that in an array called ECV (error correction value) with one item in the array for each cell, i.e. cell 0 is ECV[0], cell 1 is ECV[1] (cell 0 wouldn't really be a cell though, that would just be ground, and the value in ECV[0] would be 1.0 since there won't be any error there).
4...) Now I gotta break this down; I'll just go along with the first two cells:
cell 1: voltage 3.5V, from ground: 3.5V, measured at amp: 0.34V, ECV[1] = 1.029
cell 2: voltage 3.5V, from ground: 7.0V, measured at amp: 0.714V, ECV[2] = 0.980
4a) Starting from the beginning, the output from channel A of the mux is 0V because it's connected to ground, and channel B will be 0.34V. The amp multiplies the difference by 10, so in our MCU we get 3.4V.
Here's what happens in software: we have a variable CELL that holds the ADC-calculated voltage (3.4V), an array called CELLS with one item per cell, which holds the actual calculated voltage of the cells after we correct the error (CELLS[0] will be 0V since it represents ground), and a variable TMP for storing temporary values.
4b) CELL=3.4; CELLS[0]=0; ECV[0]=1.0; ECV[1]=1.029
TMP = CELL + CELLS[0] / ECV[0]; (3.4)
CELLS[1] = TMP * ECV[1] - CELLS[0]; (3.5)
and done for the first cell.
4c) CELL= 3.74; CELLS[1]=3.5; ECV[1]=1.029; ECV[2]=0.980
TMP = CELL + CELLS[1] / ECV[1]; (7.14)
CELLS[2] = TMP * ECV[2] - CELLS[1]; (3.5)
and done for the second cell

That seems to work fine, and you don't lose any precision.
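For reference, here is the same chain written out as a small C routine. The only change from the steps above is that the running value is kept as the corrected tap-to-ground voltage, so the recursion extends past the second cell; read_cell_diff() is an assumed helper returning the differential amp output in volts, and the two ECV entries are just the example numbers from this post.

/* The ECV correction chain. Two cells here to match the worked example
   above; extend ECV and NUM_CELLS for a real pack. */
#include <stdint.h>

#define NUM_CELLS 2

extern double read_cell_diff(uint8_t n);   /* assumed: select cell n on the mux, read the amp, volts */

/* ECV[n] = actual tap n voltage / (voltage seen at the amp * 10),
   from the one-time multimeter calibration. ECV[0] is ground, so 1.0. */
static const double ECV[NUM_CELLS + 1] = { 1.0, 1.029, 0.980 };

void correct_cells(double cells[NUM_CELLS])
{
    double tap_prev = 0.0;   /* corrected tap-to-ground voltage, starts at ground */
    uint8_t n;

    for (n = 1; n <= NUM_CELLS; n++) {
        double meas = read_cell_diff(n);                 /* "CELL" in the steps above */
        double tap_meas = meas + tap_prev / ECV[n - 1];  /* rebuild the measured tap n */
        double tap_actual = tap_meas * ECV[n];           /* corrected tap n */
        cells[n - 1] = tap_actual - tap_prev;            /* individual cell voltage */
        tap_prev = tap_actual;
    }
}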
 
I suppose you could also use a differential amp like safe is talking about on each cell, then MUX the outputs of those. Op amps are cheap. With an op amp on the front end, it can be set up to handle just about any voltage with the right configuration (without resorting to expensive high voltage ones). The output of the differential amps could be the actual cell voltage, which is in the under 5v range, so no problems with the MUX or ADC input.

I think you would still want the error correction value thing for each cell in the software. There could be lots of error in the amp circuits with high voltage divider ratios.

What is the resolution of the built-in ADC? With a super resolution one like Beagle is talking about, you might get away without any differential amps and just use single ended measurements of the taps through dividers. I was thinking you might get away with using the built in ADC, but we'd need to know what kind of voltage resolution it would provide.

I'm still not sure just how accurate it needs to be anyway. Seems like +/- 10mv should be good enough, but who knows. +/- 1mv would be nice, but that might cost a lot more.

How do you actually enter the error correction values into the PIC? I think you'd want to run it somehow first to figure out what the error is, then go back and reprogram the correction values into it. Is that practical?

If there's enough program space and a display, you might be able to program it with push buttons like a CA or a cycle computer. That could be down the road, but it would be nice if the end user didn't need a programmer. I think you'd want to measure the actual tap voltages and compare that to the output of the ADC rather than the input, to correct for any error in the ADC too.

Another dumb question: how could we activate shunts on the cells for balancing? Are there enough digital outputs to handle that? Each shunt needs to be switched on and off, and when it's on, it should stay on through the entire program loop until it gets a command to turn off. These could be fed by opto couplers, but there might be a cheaper way to couple them.

There also needs to be a way to throttle the charging current when the voltage is too low or when any of the shunts gets activated. It could be a simple high/low thing, but I was thinking you might use a PWM to vary it. If you used a PWM, then you could use a FET in series with the charger to throttle the current and not have a lot of heat dissipation.
I'm not sure what this does to the batteries, but if the PWM frequency is high enough, the peak voltage on the cells will be very close to the average.
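A rough C sketch of how that PWM throttle might be updated each loop. pwm_set_duty(), the thresholds, and the taper shape are placeholder assumptions, not part of any agreed design.

/* Back off the charge current as the highest cell approaches the shunt
   threshold, and drop to a trickle once a shunt should be active so the
   shunts can balance without overheating. All numbers are placeholders. */
#include <stdint.h>

#define NUM_CELLS  16
#define V_SHUNT_ON 3.65    /* volts, example LiFePO4 shunt threshold */
#define V_TAPER    3.50    /* start tapering the current above this */

extern void pwm_set_duty(uint8_t percent);   /* assumed: drives the series FET */

void update_charge_pwm(const double cells[NUM_CELLS])
{
    double vmax = cells[0];
    uint8_t i;

    for (i = 1; i < NUM_CELLS; i++)
        if (cells[i] > vmax)
            vmax = cells[i];

    if (vmax >= V_SHUNT_ON) {
        pwm_set_duty(10);    /* trickle so the active shunts can keep up */
    } else if (vmax >= V_TAPER) {
        /* linear taper from 100% at V_TAPER down toward the trickle level */
        double frac = (V_SHUNT_ON - vmax) / (V_SHUNT_ON - V_TAPER);
        pwm_set_duty((uint8_t)(10.0 + frac * 90.0));
    } else {
        pwm_set_duty(100);   /* bulk charge at full current */
    }
}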
 
This is all sounding roughly similar to my "Distributed Pulse Width Modulation" idea. At some point, if you are adding FETs to control things, you might as well go all the way and isolate each cell behind a FET or a FET pair. You could still focus the control of everything using the PIC, but the advantage of the direct cell-level approach is that you could use the FETs as a control mechanism for both charging and discharging. I think I went overboard on that other thread when I suggested a single chip to do everything, since that more or less took the argument out of the hands of hobbyists. Using things like a multiplexer could drop the price enough to make the general idea practical. :)
 
Calibrating in software is easy. Atmel does that in their reference designs. You actually don't multiply anything when running, as that is expensive. Instead you store limit values for each ADC channel. Say 4.2V on ADC0 reads as 4300; ADC1 might be 4000, and so on. Recording those values is also not too hard. When everything is hooked up and soldered on, you just tell the PIC to run in calibration mode (jumper switch, or via a PC on the USB/RS-232 port). Then just apply a known voltage on the ports where the cells would go; 5V from a lab supply would be fine. Then store the reading of each ADC.
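A minimal C sketch of that no-multiply scheme: calibration mode stores each channel's raw reading (scaled once to the wanted limit), and the run-time loop only compares. The ADC/EEPROM helpers, the 5V reference, and the 4.2V limit are assumptions for illustration.

/* Calibration: apply a known voltage to every input, store the raw count
   that corresponds to the per-cell limit. Run time: plain compares, no
   multiplication. All helpers are assumed placeholders. */
#include <stdint.h>
#include <stdbool.h>

#define NUM_CHANNELS 8
#define V_KNOWN 5.0   /* volts applied during calibration (lab supply) */
#define V_LIMIT 4.2   /* per-cell limit we want to detect later */

extern uint16_t adc_read_raw(uint8_t ch);                    /* assumed */
extern void     eeprom_write_word(uint8_t idx, uint16_t v);  /* assumed */
extern uint16_t eeprom_read_word(uint8_t idx);               /* assumed */

/* Run once, in calibration mode, with the known reference on every input. */
void calibrate_limits(void)
{
    uint8_t ch;
    for (ch = 0; ch < NUM_CHANNELS; ch++) {
        uint16_t at_known = adc_read_raw(ch);
        /* scale once here so the main loop never has to multiply */
        eeprom_write_word(ch, (uint16_t)(at_known * (V_LIMIT / V_KNOWN)));
    }
}

/* Run-time check: one compare per channel. */
bool any_channel_over_limit(void)
{
    uint8_t ch;
    for (ch = 0; ch < NUM_CHANNELS; ch++)
        if (adc_read_raw(ch) >= eeprom_read_word(ch))
            return true;
    return false;
}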

My worry is that resistors (and maybe other parts) change their spec depending on ambient temperature, humidity, load cycling, vibration. It would be interesting to test how much calibration would drift with different operating conditions. Are some 5% resistors better than others in terms of being stable in different conditions?
 
tomv said:
My worry is that resistors (and maybe other parts) change their spec depending on ambient temperature, humidity, load cycling, vibration. It would be interesting to test how much calibration would drift with different operating conditions.
I had suggested a startup calibration routine that might send out some kind of pulse that would come back and give a measurement of actual conditions. If that test could be done really fast then maybe every 1000 cycles of the regular analysis routine you perform the startup calibration routine again. That way as conditions change you are always getting true values of what you are measuring. "Self correcting" is the basic idea... as conditions change you simply go along with it. Doing things this way could give precision that would be so good that it might be overkill. Just how much precision is needed? Does a small drift over the course of a ride amount to enough error to make this "self correction" needed?

We've also proposed this "self test" without really thinking out fully how it might be done. What information can we discover? Do we send a pulse, a sine wave, a constant voltage or current? We've sort of skipped past how the "self test" is done without even proposing the mechanics of doing it. There will likely be one of those "Eureka" moments by someone who can imagine how to do it.

I know there was a programmable controller somewhere that had the ability to sense the internal capabilities of a brushless motor through some sort of pre-ride routine, so the idea does seem to have promise. Think, folks... how does one figure out the value of resistors from an external location?
 
I'm not sure what the problem is with the single differential amp idea; if there is any error it will be easy to null out, and even if you don't, the error is consistent because all cells go through it and you can just compensate in software. I'm also not sure what you mean by putting an op-amp on each cell: the inputs need to be between the supplies of the op-amp, so if the supply is 30V you can't put it on any cell that is higher than 30V from ground without using higher supply voltages and putting a divider on the output, and it's just way more complicated than the single amp. As far as switching the cell shunts, I think an opto and Darlington with a resistor on each cell would be the cheapest; just about any opto would be fine. I see a bunch for $0.25 on Digi-Key and 1A Darlingtons for less than that.

For the error correction I was thinking I would make a separate program: you just measure all the cells and the output of the mux, and enter each cell in a text file, like this:
3.61 0.364
7.20 0.718
...
The program would read that text file and generate a C header file with an array declaration that holds all the error correction values, and when you compile the MCU code that header file is included.
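A small PC-side C sketch of that generator. The file names and array name are just examples; the ECV formula (actual tap voltage divided by ten times the voltage seen at the amp) follows the earlier post.

/* Reads "actual_tap_voltage amp_voltage" pairs from cells.txt and writes
   ecv.h with the error correction array. Names are examples only. */
#include <stdio.h>

int main(void)
{
    FILE *in  = fopen("cells.txt", "r");
    FILE *out = fopen("ecv.h", "w");
    double actual, at_amp;
    int n = 1;

    if (!in || !out) {
        fprintf(stderr, "couldn't open cells.txt or ecv.h\n");
        return 1;
    }

    fprintf(out, "/* generated from cells.txt -- do not edit by hand */\n");
    fprintf(out, "static const double ECV[] = {\n");
    fprintf(out, "    1.0, /* tap 0 is ground, no error */\n");

    /* ECV[n] = actual tap voltage / (voltage at the amp * 10) */
    while (fscanf(in, "%lf %lf", &actual, &at_amp) == 2) {
        fprintf(out, "    %.6f, /* tap %d */\n", actual / (at_amp * 10.0), n);
        n++;
    }
    fprintf(out, "};\n");

    fclose(in);
    fclose(out);
    return 0;
}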
 
dirty_d said:
...as far as switching the cell shunts i think an opto and darlington with a resistor on each cell would be the cheapest, just about any opto would be fine, i see a bunch for $0.25 on digikey and 1A darlingtons for less than that.
Optocouplers will wear out within about a year (they will degrade in performance) if left on all the time. So you can't practically place them on the cell directly. Bob Mcree gets around this problem with a Voltage Detector BEFORE the optocoupler so that the optocoupler stays off most of the time. It only comes on when the voltage triggers the need for it being on.

The choices seem to be:

:arrow: Place Difference Amplifiers on all the cells. (at $5 each that's expensive)

:arrow: Somehow route the cell voltage through a Multiplexer and then use the Difference Amplifier.

Problem A: High Voltage Multiplexers don't exist. (so scaling must come first)

Problem B: Trimming the Voltage with a Voltage Divider introduces errors that need to be compensated for with software.

No matter what you do you have to wrestle the large common mode voltage down to under 5v and the only cheap way is with simple resistors. I just don't see an exit for this.
 
dirty_d said:
the opto will only be on when a cell is out of balance and the pack is charging...
I obviously don't understand your design yet. The issue we were working to solve was how to measure the voltage of a cell. If the optos don't do anything for that measurement, then what other part is doing it?

A picture of the basic elements might be nice. :p

As I see it the Difference Amplifier is the most direct and easiest way to measure cell voltage, but the price is too high. We are trying to find a way to cut costs, and to do that (it seems to me) the cheapest thing out there is the resistor. The PIC is then used to correct the distortions that the simple voltage divider introduces. But we are forced to go down this multiplexer path instead of doing it directly because the PICs don't come with enough analog-to-digital conversion units built in. (they normally have at most 8 ) So it's sort of a forced line of reasoning... each step forces you to fall into a trap and then you try to dig yourself out of each trap.
 
I've just found the PIC16F884 and PIC16F887, which have 14-channel 10-bit A/D converters in PDIP packages! So 10 voltage measurements + temperature + current + 2 to spare. All for <$5.

Below I've drawn two ways to hook up the ADC to the cells that I found in this thread. I believe I missed a few other ideas here, and will add them once I understand them. :)

PIC ADCs want 10 kOhm source impedance or less, and that seems to be standard, so the first way would be this:

Simple ADC:

Pro:
- 20 resistors + PIC is all that's needed for 10S pack monitoring. Total cost <$10
Con:
- 0.3 mA power draw at all times. Would discharge the pack to 0V in one year or so. Adding FETs negates the simplicity advantage
- Precision could be low (mostly can compensate in software)

Proper ADC:
Pro:
- Can pick amps to have ultra low power draw (micro amps)
Con:
- Four resistors + op-amp per cell. 10S pack needs 40 resistors, 10 op-amps. More complicated but in the long run probably the right way to do it.
 

[Attachment: tCad2.png]

I'll try to quantify the precision of the two designs above. Let's say the pack is 10S, 33V.

Simple ADC drive: 33V
- Maximum precision is 33/4096 = 8mV
- 1% drift in resistors away from calibrated values introduces max 33mV error.
- Any ideas how much the resistors would actually drift? If it's 5% then it's a problem; the error would be ~0.17V.

Proper ADC drive:
- Everything becomes 10 times better, so a 5% resistor drift only gives a 0.02V error.
 
Thanks for the input tomv.

I suspect the reason you want low-impedance inputs to the ADC is because it will load the measurement otherwise. If we fix this in software, then you might be able to use much higher resistance values for the dividers. In fact, the ADC might already have some fixed resistance to ground that could make up half the divider. This could be tested.

It would be really nice to avoid the op amps, but they don't cost that much and you can get quad packages.

Here's the datasheet:
http://ww1.microchip.com/downloads/en/DeviceDoc/41291D.pdf ($3.45 from DigiKey)
 