My new 18 FET TO-247 layout riding video

In response to HighHopes' comments about bootstrap supplies:

I want high performance and high reliability, so I'm looking for some isolated supplies now 8)

Fairchild has a gate driver which seems to have the same features as the TD350, but with a built-in opto, lower propagation delay, and higher gate drive current capability:

http://www.fairchildsemi.com/ds/FO/FOD8332.pdf

But it seems to require a minimum 15V supply. Is that a no-go for MOSFETs with a max gate voltage of 20V? The ACPL-333J also seems to require 15V; what gate drive voltage should be used when the MOSFETs have a max gate-source voltage of 20V?

By the way, why do we have to use isolated supplies for all 6 gate drivers? Why not supply the low-side drivers non-isolated?
 
Use whatever you want. I posted my setup and it's been proven to work; it might be too big for your application, but my driver boards are pretty small and I can make them even smaller if I need to. Now that I understand how to do the design work correctly I can use any of the integrated drivers and create a new gate driver in a few weeks. You'll have to start from the beginning and learn all the details, which is not easy. Desat is not straightforward and is counterintuitive in its operation.

I use a 12-15V supply in my designs; either works fine for FETs with a +/-20V gate rating. I mainly went with 15V because I was testing out my boost stages and I lose 1.2V through them, so my max gate output is around 13.5V, not 15V. Even my current 18 FET setup does not really need a boost stage since I'm using a 10 ohm gate resistor, which limits me to a max of 1.5A, which the TD350 can do all by itself.
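Just to show where that 1.5A number comes from, here's a back-of-the-envelope check (a sketch only; it ignores the driver's output impedance and the FET's internal gate resistance, so the real peak is a bit lower):

```python
# Rough peak gate current estimate (sketch, not a full gate-drive design).
# Assumes the only limit is the external gate resistor.

v_drive = 15.0   # gate supply voltage (V)
r_gate = 10.0    # external gate resistor (ohms)

i_peak = v_drive / r_gate
print(f"Peak gate current ~ {i_peak:.1f} A")   # ~1.5 A, within the TD350's drive capability
```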

I can use a TD350E up to at least 25 kHz switching with the 2-level turn-off feature and around 33 kHz without it. You are doing lower power than I am so I doubt this feature is critical. If you can work with up to 30 kHz fsw, my TD350 design will work; if not, a lower propagation delay driver will be needed. But reality and testing show me that high frequency switching isn't important in most cases. I have hub motors with 30uH inductance and they run just fine at 15 kHz and 15kW.
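The fsw vs. inductance trade-off can be sanity-checked with the usual half-bridge ripple formula. A sketch using the 30uH figure from above; the 100V bus and 50% duty are illustrative assumptions, not values from the post:

```python
# Phase current ripple estimate for a half-bridge PWM drive (sketch).
# V_bus and duty are assumptions for illustration only.

def ripple_pp(v_bus, duty, l_phase, f_sw):
    """Peak-to-peak current ripple in an inductive load under PWM."""
    return v_bus * duty * (1 - duty) / (l_phase * f_sw)

for f_sw in (15e3, 25e3, 33e3):
    di = ripple_pp(v_bus=100.0, duty=0.5, l_phase=30e-6, f_sw=f_sw)
    print(f"fsw = {f_sw/1e3:.0f} kHz -> ripple ~ {di:.0f} A pk-pk")
```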
 
don't waste your time with fairchild gate drivers, stick to the list i gave you. there are a lot of reasons for that and also i have a lot of experience in this field so i hope you can just trust me blindly on that.

one thing to keep in mind, gate driver design & layout is the most critical aspect of a motor drive system. you have to get this absolutely correct. the higher the power level, the less margin for error you have.

i agree with zombiess, if your application can make use of zombiess' design you should go that route. it is hard work to design one of these successfully and once you have, you never change it unless you have a new application that demands change.
 
Futterama said:
So basically, if I want desat detection, miller clamp and built-in opto isolation, I would need to use the ACPL-333J as my only option? That's $70 for the 6 gate drivers alone, holy crap :shock:

$70 expensive? Not at all. Assuming you are an experienced designer and can make it work, that can save a lot of expensive silicon that could otherwise blow up. I remember seeing a giant pile of dead MOSFETs in one of Arlo1's videos; that was more than $70 worth. I personally have yet to kill a MOSFET using my driver and I've purposely tried many times, like a 100V + 250A (way off the current sensor) max reading shoot-through event. I should do a video showing how good desat is, one with it and one without.

And yet another reason to use the same TD350E I'm using: they are cheap in small quantity, like $2 each, so about $30 for a complete set with shipping. If one looks around you can find them for about $1.50 ea. You do need an external opto, but it's really small and only costs $1 I think. You probably don't need the fault output in your application, so that's more savings because it's one less opto.

HighHopes is dead serious about gate driver design being complicated. I think he spent 4-5 months helping me get it all acceptable, and when you look at the schematic, it's so simple looking, not many components involved. I/We had to decide on what specs we were going to target (he helped me figure out which were the most important and why). It's very detailed and a bit math intensive. Then you have to design a good layout; that took a long time, 2 months I think. Remember that I was working on learning this 20-30 hours per week for almost an entire year to get to this point. There really is that much detail in designing a good drive setup. If I had been able to find a known working design out there with the features I wanted, I would not know half of what I do now, but I also would not have needed to know it or why it worked, just that it met my application requirements.

I'm a quick study and usually do very well with app notes, examples, etc., but there is no complete source of info available on these topics. I "might" have been able to get something to spin a motor, but it would not be robust. Without HH generously taking time to tutor me, I would have never been able to achieve this level of success.

I'm giving away the design to anyone. It's not fully documented yet, but there is more than enough posted for someone to modify it to fit their needs (tweaking the desat level is probably the biggest one and that's easy). I'm intimately familiar with it so I can assist with modding it to fit your application. I can probably shrink a single gate drive down to 20x20mm, maybe smaller, since I can eliminate a lot of components if the fault output (it still has fault protection, just no output signal) and 2-level turn-off are not needed.

FWIW, now that I have experience I could probably come up with a pretty good ACPL-333J design in about 1 month if I dedicated 20-30hrs a week to the project, but I don't see much need for a super high fsw in my applications. I have my design and I'm sticking with it until I have a reason that requires me to change it.
 
My take on this... what desat detection does is check that the voltage across a FET is close to 0 when the FET is on. If not, an error signal is generated to switch the whole controller off. When the voltage across a FET that is on is not 0, it means the opposite FET is shorted due to burnout, the wiring to the motor is shorted out, or there is a short from the controller output to the supply due to a wiring error.

Once the controller is correctly connected to the motor and everything is done properly (no wires rubbing on sharp edges etc.), I think the only realistic thing that can trip the desat is a burned out FET. The desat does not prevent a FET from burning out due to voltage spikes or overheating or anything; it only shuts down the controller to prevent more FETs burning out and the damage spreading. It limits the size of the explosion, so to say. This is a good thing to have when you run FETs on the limit like Zombiess is doing. If you're more like me and are running 150V FETs only up to 100V or so and well within their current limit, blowing a FET in normal use is not likely. I run normal IRS2186 gate drivers without desat detection. On the off chance that a FET blows up, it'll be a coin toss whether the opposite FET will blow or the fuse goes. Either way, I accept this extra cost as the controller was cheaper to build. The main 'cost' for me is that either way, the bike is dead in the water with or without desat; this I find a much bigger issue than whether it'll cost $5 or $10 to fix. Knock on wood, but I've done 9000km so far without blowing up.
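To make that concrete, the check boils down to something like this (a sketch only; the threshold, blanking time and function names are illustrative, not from any particular driver, and real desat circuits do this in hardware):

```python
# Illustrative desat check, i.e. what the comparator inside a desat-capable
# gate driver effectively does. Values are made-up example numbers.

V_DESAT_THRESHOLD = 7.0      # Vds above which the FET is considered desaturated (V)
BLANKING_TIME_US  = 2.0      # ignore Vds right after turn-on while it is still falling

def desat_check(gate_commanded_on, time_since_turn_on_us, v_ds):
    """Return True if a desat fault should be latched."""
    if not gate_commanded_on:
        return False                        # only meaningful while the FET is commanded on
    if time_since_turn_on_us < BLANKING_TIME_US:
        return False                        # still inside the blanking window
    return v_ds > V_DESAT_THRESHOLD         # FET is on but Vds never came down: short somewhere

print(desat_check(True, 5.0, 0.3))   # False - normal conduction, Vds ~ I * Rds(on)
print(desat_check(True, 5.0, 12.0))  # True  - shoot-through or shorted opposite FET
```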
 
that's pretty much the truth, if you don't have any faults then there is no need to have fault protection. but let me give you a bit more info

desat protects only from short-circuit current as you said. there are other methods to protect from over-voltage transients including lightning strikes, thermal overload, the list goes on and on. you just have to decide which are important to you, maybe none. as you go up in power levels, the more cost effective fault protection becomes.

it is not true that if desat shuts down the drive you are dead in the water anyway. desat prevents catastrophic failure from short-circuit and yes it shuts down the drive, but it does NOT have to shut down the drive permanently. most shoot-through events are one-time anomalies, so if you waited 200ms, you could restart the drive safely. that is FAST by human standards and you wouldn't even notice because the motor coasted during that time. without desat, under the same fault, you will probably have a complete failure, maybe a fire.
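The restart behaviour described there is simple supervision logic. A sketch; the 200ms wait comes from the post, the retry limit and names are assumptions:

```python
# Illustrative auto-retry after a desat trip. The hardware has already turned
# the gates off fast; this only decides whether to restart or stay shut down.

import time

MAX_RETRIES = 3          # assumption: give up after a few faults in a row
RESTART_DELAY_S = 0.2    # ~200 ms: most shoot-through events are one-time anomalies

def handle_desat_fault(retries_so_far, restart_drive):
    """Decide what to do after the gate driver latched a desat fault."""
    if retries_so_far >= MAX_RETRIES:
        return "permanent shutdown"          # repeated faults: something is really broken
    time.sleep(RESTART_DELAY_S)              # motor just coasts during this wait
    restart_drive()
    return "restarted"

# usage: print(handle_desat_fault(0, restart_drive=lambda: None))
```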

assuming you have a gate driver with the desat function built in, you need two diodes, a resistor and a capacitor (50 cents?). probably also a way to get the fault signal out of the gate driver to the brain board (does not have to be with a $1.50 opto coupler; the opto couplers in zombiess' gate driver are there for another reason). each gate driver needs this. you do not need an isolated power supply for the desat function (the isolated power supply is there for another reason).
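For reference, the capacitor in that little network sets the desat blanking time (how long after turn-on the fault check is masked). A rough sizing sketch, assuming a driver whose desat pin charges the blanking cap from an internal current source; the current and threshold below are placeholder "typical" values, not TD350 datasheet numbers:

```python
# Rough desat blanking-time estimate (sketch). Check your driver's datasheet
# for the real charge current and comparator threshold.

c_blank    = 100e-12   # blanking capacitor on the desat pin (F) - assumption
i_charge   = 250e-6    # internal current source charging it (A) - assumption
v_desat_th = 7.0       # desat comparator threshold (V) - assumption

t_blank = c_blank * v_desat_th / i_charge
print(f"Blanking time ~ {t_blank*1e6:.1f} us")   # time the fault check is masked after turn-on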

i say "probably a way to get fault signal to brain board" because it is possible for brain board to realize a mosfet is shut-down WITHOUT a dedicated fault signal, a solution managed entirely in software.
 
Lebowski said:
My take on this... what desat detection does is check that the voltage across a FET is close to 0 when the FET is on. If not, an error signal is generated to switch the whole controller off. When the voltage across a FET that is on is not 0, it means the opposite FET is shorted due to burnout, the wiring to the motor is shorted out, or there is a short from the controller output to the supply due to a wiring error.

Once the controller is correctly connected to the motor and everything is done properly (no wires rubbing on sharp edges etc.), I think the only realistic thing that can trip the desat is a burned out FET. The desat does not prevent a FET from burning out due to voltage spikes or overheating or anything; it only shuts down the controller to prevent more FETs burning out and the damage spreading. It limits the size of the explosion, so to say. This is a good thing to have when you run FETs on the limit like Zombiess is doing. If you're more like me and are running 150V FETs only up to 100V or so and well within their current limit, blowing a FET in normal use is not likely. I run normal IRS2186 gate drivers without desat detection. On the off chance that a FET blows up, it'll be a coin toss whether the opposite FET will blow or the fuse goes. Either way, I accept this extra cost as the controller was cheaper to build. The main 'cost' for me is that either way, the bike is dead in the water with or without desat; this I find a much bigger issue than whether it'll cost $5 or $10 to fix. Knock on wood, but I've done 9000km so far without blowing up.
You are close, but as HighHopes points out, there will be other times when a problem sets off the desat and the controller will still work afterwards.
For one, it could be a wire rubbing or even water where it's not supposed to be, but also things like a false turn-on on the wrong side of the H-bridge causing shoot-through; the desat can shut it down fast enough to save the whole controller. If this happens a lot you have to change something. Another place it helps is if you set it with tight tolerances: it can help stop overcurrent in any set of FETs, and it can help protect against temperature to a small extent. As the FETs get hotter they have a higher voltage drop (resistance), so the desat will be closer to shutting down, and if you get it "tuned" or set up just right you might be able to have it shut down the FETs when high currents are reached with hot FETs. This can react faster than any temperature sensor will know your FETs are too hot.

FET drivers with desat detection are the last piece of the puzzle I needed to find in my journey. I feel 100% confident I can get this all to work very reliably now, and that's what matters to me.
 
Excellent discussion folks.

Desat protection seems like a good feature to have in a high quality controller. But it does cost a bit more, and a conservatively designed controller can get by without it.

It is apparently common on ebikes to develop a short in the motor or the associated wiring, and desat protection could save the controller from destruction in that case. How many times have we read about motor shorts or spun axles blowing the controller up?

If the components are chosen correctly it can also protect against FET high current overheats as the voltage drop rises at increased temperature. Essentially it is a direct test of FET die temperature.

Desat protection might also be triggered in a parallel FET setup if the FETs are not properly sharing high current, and fewer FETs are carrying the load. I've had several failures on a 24 FET controller, and I suspect the FETs aren't sharing current properly. A properly set desat limit could trip this off before massive failure.
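To put rough numbers on that: all FETs in one paralleled leg see the same Vds, and desat compares that Vds to a threshold, so if some FETs stop carrying their share the effective on-resistance of the leg rises and the threshold is crossed at a lower total current. A sketch with illustrative values (DC approximation; it ignores switching transients and layout inductance, which matter just as much):

```python
# Why a properly set desat level can catch poor current sharing (sketch).

def leg_vds(i_total, rds_on, n_fets_sharing):
    """Vds of a parallel FET leg when only n_fets_sharing devices carry the current."""
    return i_total * rds_on / n_fets_sharing

V_DESAT_TH = 0.35   # example effective trip level after the sense network - assumption

for n in (8, 6, 4):                        # FETs actually sharing, out of 8 per leg
    v = leg_vds(i_total=400.0, rds_on=0.004, n_fets_sharing=n)
    print(f"{n} FETs sharing: Vds = {v:.2f} V, trips: {v > V_DESAT_TH}")
```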
 
It is a critical feature for any controller you want to hotrod without buying new FET stages after each control glitch.
 
HighHopes said:
i say "probably a way to get fault signal to brain board" because it is possible for brain board to realize a mosfet is shut-down WITHOUT a dedicated fault signal, a solution managed entirely in software.
This is basically what I have. The controller IC can detect problems with the output stage. With a three-phase motor the controller IC knows what the currents are supposed to be; too big a difference between expected and measured and it shuts down the output stages by turning all FETs off. But this is a matter of milliseconds, not microseconds as you get with desat-detecting gate drivers. In my 9000km controller I had a shutdown once where it didn't want to fire right back up; it was pouring down and water had gotten into the controller. 2 hours later however it started right back up. So here the software-based fault detection worked nicely...
The disadvantage of a software solution is that momentary spikes can make the software think it should shut down, especially when what I call the error currents are set too low, even though nothing is actually wrong. The latest version, at zombiess' request, now shuts down (all FETs off) but then immediately tries to restart the still-running motor. The solution I have should not overstress the controller if something is really wrong, and it goes for a full shutdown if it cannot restart within a preset amount of time...
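Roughly, that software check amounts to something like the sketch below: compare expected vs measured phase currents, kill the gates on a big mismatch, retry immediately, and only latch a full shutdown if the fault persists. The threshold, timeout and function names are illustrative assumptions, not Lebowski's actual firmware:

```python
# Sketch of a brain-board (software) fault check along the lines described above.

ERROR_CURRENT_A  = 50.0    # allowed mismatch before we call it a fault - assumption
RESTART_WINDOW_S = 1.0     # give up if we can't restart within this time - assumption

def check_phase_currents(expected, measured):
    """True if any phase current deviates too far from what the controller expects."""
    return any(abs(e - m) > ERROR_CURRENT_A for e, m in zip(expected, measured))

def on_fault(all_fets_off, try_restart, seconds_since_first_fault):
    all_fets_off()                                   # immediate, but still milliseconds,
                                                     # not the microseconds hardware desat gives
    if seconds_since_first_fault > RESTART_WINDOW_S:
        return "full shutdown"
    return "restarted" if try_restart() else "full shutdown"
```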
 
The desat detection will catch problems that cause current to flow through the MOSFETs without that current flowing through the current sensors the brain uses; for that reason, desat will detect problems the brain can't.
 
i've never heard of someone trying to use a desat circuit to protect against overheating. one thing to know about desat is it is not very accurate. it has a HUGE range of operation, like it will trip on fault currents between 150% and 200%. that's a wide range. it's OK for desat protection to have a wide range because it is either a super huge fast moving fault current or it is not; we don't care at all if it trips at 157% or 182%. but for over-temperature, which is a slow moving fault in comparison, it might matter to be more accurate. so it is something for you to think about if you want to use desat for over-temperature protection.
 
But it will fault easier as temp increases.
 
Arlo1 said:
But it will fault easier as temp increases.

It faults at a lower current when Tj is hotter due to RDS(on) going up with temp. I always calculate the desat trip current at 0, 25, 75 and 100C to ensure I know what is going to occur. I should probably also calc it at -25C because the trip current becomes much higher.
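That calculation is just the desat threshold divided by the on-resistance at each junction temperature. A sketch; the RDS(on) values and the effective trip voltage below are made-up examples, so substitute the datasheet curve for your FET and your actual desat network:

```python
# Desat trip current vs junction temperature (sketch).

V_DESAT_TRIP = 0.9   # effective Vds at which the desat network trips (V) - assumption

rds_on_vs_tj = {      # junction temp (C) -> Rds(on) of the leg (ohms) - example values
    -25: 0.0032,
      0: 0.0036,
     25: 0.0040,
     75: 0.0052,
    100: 0.0060,
}

for tj, rds in rds_on_vs_tj.items():
    print(f"Tj = {tj:>4} C: desat trips at ~ {V_DESAT_TRIP / rds:.0f} A")
```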
 
Desat protection seems like a good feature to have in a high quality controller. But it does cost a bit more, and a conservatively designed controller can get by without it.
all controllers can get by without it because it is just an option for the designer to consider, not a key function like the capability to turn a MOSFET on, which is obviously key. deciding to put in this feature depends a great deal on what power level you are talking about. if you are talking about an ebike <5kW, i agree with you to just leave it out. but what zombiess and arlo are after is much higher power.

at higher power the probability of a shoot-through event goes up a lot due to a significant noise increase and much higher component stress. the cost of the inverter goes up because there is way more silicon, larger thermal management, and noise management features, so you have more investment to protect.
 
HighHopes said:
Desat protection seems like a good feature to have in a high quality controller. But it does cost a bit more, and a conservatively designed controller can get by without it.
all controllers can get by without it because it is just an option for the designer to consider, not a key function like the capability to turn a MOSFET on, which is obviously key. deciding to put in this feature depends a great deal on what power level you are talking about. if you are talking about an ebike <5kW, i agree with you to just leave it out. but what zombiess and arlo are after is much higher power.

at higher power the probability of a shoot-through event goes up a lot due to a significant noise increase and much higher component stress. the cost of the inverter goes up because there is way more silicon, larger thermal management, and noise management features, so you have more investment to protect.
But what if you build a 10 kW controller by putting three 3 kW controllers in parallel?
 
hmmm.... the cost of 3 parallel 3kW would be more than a single 9kW, so in the end it is probably a wash.

like all self respecting engineers, i have an answer for everything :mrgreen:
 
Isolation gives the advantage of less noise, isolates the gate drive from the traction pack, improves safety, and eliminates ground loop issues. HH will explain it in more detail as I know I am missing details.

Isolation also gives the option to implement a negative voltage bias for turn off.

The only major negatives to isolation I can think of right now are size and cost. I would say complexity, but every time I draw up a gate driver I find that isolated supplies make the design easier, with fewer issues to worry about and an easier board layout. I am not trying to design low power setups so I have fewer size restrictions.
 
i can understand why you would think, what's the problem with tying brain ground and lower mosfet "ground" together? in a way it makes sense to think only the high side needs an isolated ground because only that reference bounces between DC+ and DC-. at least, that's how it appears at first glance.

in reality there are two reasons why you need to be wary of tying brain board ground to the lower mosfet(s) reference.
1. there is a lot of high energy noise over a wide frequency range from 60Hz to 1MHz on the high power DC+ and DC- that could easily corrupt the operation of digital/analog circuitry. when you picture it in your mind you may see that DC- is "ground" and DC+ has all the noise. that's how it appears in the textbook anyway. but in reality, noise is voltage & current based, and as you know voltage is everywhere in the circuit simultaneously and any current that leaves the DC+ returns on the DC-. so any voltage noise that is on DC+ is also on DC-, and any current noise that is on DC+ is also on DC-. you could prove this to yourself by using differential probes on the DC bus. put the + probe to DC+ and the - probe to DC- and look at the noisy waveform. then swap the probes around... guess what, same noise!

2. the body diode of the lower mosfet(s) will conduct current during some portion of the fundamental frequency's cycle. how can this body diode conduct current if DC- is "ground"? wouldn't the ground ALWAYS be the lowest potential? how could the lower mosfet's body diode conduct current anode to cathode if DC- was ALWAYS the lowest potential? obviously it is NOT always the lowest potential, it does actually move. so if you tied the brain board ground to the lower mosfet(s) reference, be aware that you just tied brain board ground to a MOVING reference. digital devices do not appreciate that.

having said that... during development i have tied all lower mosfet references together and then tied them to the brain board. i did this for troubleshooting purposes; it did function, it is possible. but don't expect this to work reliably at power levels >1kW and basically not at all at >20kW.
 
again, opinions vary... voltages are always between two nodes, meaning you're always measuring whether one node is higher or lower than another and by how much. A result of this is that you can always say: 'this node is what I call 0V and I measure all the others with respect to this one'. Back in the day of the first transistor schematics the highest voltage in a circuit was typically called '0V' and all voltages were negative.

For me, PCB ground is 0V, and then indeed, as HH mentions, motor terminal voltages can go below 0 (with respect to PCB ground), but as I defined PCB ground to be 0 it stays 0. Motor terminal voltages go below 0, by the way, because the inductors act as current sources and will do anything to keep the current flowing. But this does not change the ground or 0 voltage.

The nasty thing about power electronics though, and I think HH is referring to this, is that there is no such thing as a simple wire or PCB trace. dI/dt's are so large that any wire must be seen as an inductor, and any wire will have a signal across it because of sharply rising and falling currents. For every current you have to think about which path it takes as it runs around in a circle. A current will take the path of least resistance / inductance, so you as a designer must make sure that these high current paths are kept away from the sensitive low voltage electronics (the u-processor, current sensors, gate drivers). Take my low inductance output stage for instance. Yes, the sources of the low side FETs are connected to the PCB ground, as this is where the gate drivers are. BUT these connections are made with relatively long wires forming large area loops. These long wires give a much higher inductance for the power currents than the bus bar directly across the FETs. So the motor currents will go through the bus bar (the electron superhighway) and will not go to the PCB and back (two or three idiot electrons will always choose to take the twisty single lane scenic route, but hey, what can you do).
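It is easy to put numbers on the "no such thing as a simple wire" point: even tens of nanohenries develops serious voltage at power-stage dI/dt. A sketch with illustrative values; the ~10 nH/cm figure is a common rule of thumb and the real loop inductance depends heavily on geometry, so treat this as order-of-magnitude only:

```python
# Voltage developed across a "simple wire" at power-stage slew rates (sketch).

wire_length_cm = 5.0
l_per_cm = 10e-9            # H per cm, rule of thumb for a wire far from its return path
l_wire = wire_length_cm * l_per_cm

delta_i = 100.0             # current step being switched (A) - illustrative
switch_time = 100e-9        # switching time (s) - illustrative

v_spike = l_wire * delta_i / switch_time
print(f"L ~ {l_wire*1e9:.0f} nH -> V = L*dI/dt ~ {v_spike:.0f} V across that 'wire'")
```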

Anyways, I had a very nice trip this weekend on my 3kW 6FET recumbent (which uses simple bootstrapped gate drivers). Very hot though (35C in the shade), so I was a bit worried about my 25s lipo bricks overheating in the sun (as I lay in the grass by the lake :D).
 
It is possible to do clever cheap and noise robust boot-strapping circuits that can perform perfectly adequately.

However, doing that takes months of development time and tuning, and then changing things in the system may cause them to require re-tuning.


Using 6 separate floating isolated supplies and 6 isolated-input de-sat protection gate drivers is the method to use if you are done f*cking around spending 95% of your time dealing with issues caused by poor gate drive circuit noise and/or tuning.

It is also the method you use if you simply want the best possible way to do it and don't care if that means it adds maybe ~$80 in parts cost to your board. For hot-rod controller builders, it's a total no-brainer because it's already saving them money if it prevents a single power stage failure from a shoot-through glitch or noise issue etc.

If I were just going to buy an off-the-shelf controller, I wouldn't care if the company had taken the time to get perfectly dialed in boot-strap circuits and noise filtering and found solutions to control the FETs effectively for like $10 in total component cost. If I am building a controller from scratch, or choosing a controller I want to hack into a monster, having 6 isolated supplies feeding 6 isolated input de-sat protected gate drivers would be the only option I would even consider. It's the best $80 you can spend in components for a controller in my $0.02.
 