So I have a Gen 1 Ford Escape Hybrid with a weak HV battery. It works fine once the computers catch up with the real capacity, but I'd like to get the real electric performance back one way or another.
I'm down for replacing the cells, since I doubt any 20-year-old battery has many miles left in it, but the cells appear to be special NiMH D cells capable of 30C peak discharge. There's one place that sells new premade sticks, but they're giving me the cold shoulder for whatever reason. In the meantime, I'm left alone with my thoughts.
My day job is as an electrical engineer for an automotive manufacturer, but we haven't dipped our toes into high-voltage systems much yet, so my expertise is somewhat limited when it comes to EV high-voltage powertrains.
Anyway, the stupid idea is to replace the 250s1p high-discharge NiMH pack with a 100s2p LiFePO4 arrangement.
Ideally, I'd spend time reverse engineering the comms for the factory BMS to see how it tells the transmission how much charging current it wants, how much discharge current it's allowed to use, when it demands the AC compressor, and so on. But given how much the thing controls (the cooling fans, vents to outside, the evap coil, and I think even the fuel pump and sender), I have to think there's an easier way to go about it. Say, leaving the factory BMS on board and emulating the NiMH pack with a sketchy resistor divider, so it lives its life thinking the pack is always perfectly balanced and only has to watch the overall state of charge from the main pack voltage, then using an external BMS to actually manage the LiFePO4 cells. A 100s string seems to roughly match the voltage window the 250s NiMH pack would be expected to operate in, but I'm open to being corrected on that.
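For what it's worth, my back-of-the-envelope voltage check, assuming the usual ~1.2 V nominal per NiMH cell and 3.2 V per LiFePO4 cell:

250s NiMH: 250 × 1.0 to 1.45 V/cell ≈ 250 V empty, ~300 V nominal, ~360 V peak while charging
100s LiFePO4: 100 × 2.5 to 3.65 V/cell = 250 V floor, 320 V nominal, 365 V ceiling

The endpoints line up surprisingly well; the main caveat is that the LiFePO4 string would probably sit 15 to 20 V higher than the NiMH through the middle of the charge curve, thanks to its flat plateau.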
The main issue I haven't figured out yet is the discharge rate. At 30C peak discharge, the original 6000 mAh NiMH cells can in theory hit 180 amps for a few seconds. The 32800 LiFePO4 cells I can find are given a 3C or 5C burst rating in the marketing. At 2p, each cell only needs to supply about 90 amps to get near the original capability, but with an average capacity of 7500 mAh that works out to 12C, so off-the-shelf cells fall a fair bit short.
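Spelling the current math out, and taking the 5C burst figure at face value:

Original pack peak: 6.0 Ah × 30C = 180 A
Per-cell demand at 2p: 180 A ÷ 2 = 90 A, and 90 A ÷ 7.5 Ah = 12C
Rated 5C burst: 7.5 Ah × 5 = 37.5 A per cell, so 75 A for a 2p pack
Strings needed to stay within 5C at 180 A: 180 ÷ 37.5 = 4.8, call it 5p

A 100s5p pack of those cells would be 37.5 Ah, over six times the original 6 Ah, which is a lot of battery just to stay inside the burst rating. Hence the next idea: limit what the car asks for instead.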
Now, if I could intercept the max charge and discharge messages on the CAN bus and man-in-the-middle some lower limits into the network, that might get around the problem. I don't need to spin the tires; I just don't want the car hammering the cells. It's a possible remedy, assuming the BMS doesn't completely check out when it sees the clamped values. I'm kinda banking on Ford never having thought far enough ahead to require a handshake or message authentication.
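The plumbing for that man-in-the-middle is pretty mundane; something like this python-can sketch running on a gateway spliced in so the BMS is alone on one bus segment and the rest of the car is on the other. To be clear, this is just a sketch of the idea: the arbitration ID, byte layout, and scaling are placeholders I made up, and the real frame definitions would have to come from logging the actual bus first.

```python
# Minimal CAN man-in-the-middle sketch using python-can on Linux/SocketCAN.
# Gateway spliced so the BMS is alone on can0, rest of the car on can1.
# The ID, byte layout, and scaling below are PLACEHOLDERS, not real frames.
import can

BMS_SIDE = can.interface.Bus(channel="can0", interface="socketcan")
CAR_SIDE = can.interface.Bus(channel="can1", interface="socketcan")

LIMIT_MSG_ID = 0x230     # hypothetical "allowed charge/discharge" frame
MAX_DISCHARGE_A = 75     # 2p x 7.5 Ah x 5C burst = 75 A pack limit
MAX_CHARGE_A = 15        # ~1C pack charge, conservative for LiFePO4

def clamp_limits(msg: can.Message) -> can.Message:
    """Rewrite the (hypothetical) current-limit bytes before forwarding."""
    data = bytearray(msg.data)
    data[0] = min(data[0], MAX_DISCHARGE_A)   # assume byte 0 = discharge amps
    data[1] = min(data[1], MAX_CHARGE_A)      # assume byte 1 = charge amps
    return can.Message(arbitration_id=msg.arbitration_id, data=bytes(data),
                       is_extended_id=msg.is_extended_id)

while True:
    # BMS -> car: clamp the limit frame, pass everything else through.
    msg = BMS_SIDE.recv(timeout=0.005)
    if msg is not None:
        CAR_SIDE.send(clamp_limits(msg) if msg.arbitration_id == LIMIT_MSG_ID else msg)
    # Car -> BMS: forward untouched.
    msg = CAR_SIDE.recv(timeout=0.005)
    if msg is not None:
        BMS_SIDE.send(msg)
```

The real gotchas would be forwarding latency and whatever rolling counters or checksums might live in those frames; if the limit message carries one, the gateway would have to recompute it or the receiving module will flag the frame as bad.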
Of course, I might be overthinking it and putting way too much thought into a car that doesn't deserve the effort, but eh, it's a thought experiment. So mostly what I need to know is: how hard can you hammer LiFePO4 cells? Is 15C a reasonable thing to demand a few times a month, a few seconds at a time? Or is the pack size completely wrong for the NiMH retrofit, and what configuration would match the voltages better?
Thanks for reading!