John in CR said:
Running a controller at a higher voltage but at the same speed (assuming the same motor) is less efficient. That's because running at partial duty is less efficient for the same power, primarily in the controller itself. I see it first hand all the time, because I run a high enough voltage that I don't ride around at WOT, and when I ran lower voltage at the same speeds my controllers ran cooler. Motor temps seem unaffected in long-term comparisons, so it's not a matter of using the greater performance available at higher voltage; the effects of partial throttle on a controller are significant enough to show up as a hotter controller and some increase in consumption in Wh/mile or Wh/km.
You can make a pretty good estimate of the switching losses in a controller. Each PWM cycle the FETs pass from off to on, and then from on to off, so they switch twice per cycle. During each transition their power dissipation goes from near zero to a maximum and back to near zero. The worst moment is at the midpoint of the transition, where the FET sees the full battery voltage while still carrying the full motor current. The power dissipation is basically a triangle with its peak at that midpoint, and the area under a triangle is half of peak times width, so the average power dissipated during the switching event is half the peak power, applied for the duration of the transition.
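That estimate is easy to put in a few lines. A minimal sketch of the triangle model (function and variable names are mine, not from any controller firmware):

```python
def switching_loss_w(v_batt, i_motor, t_switch_s, f_pwm_hz):
    """Average FET switching loss in watts, per the triangle model.

    Peak dissipation at the transition midpoint is V * I; the triangle
    shape makes the average over the transition half of that, and each
    PWM cycle has two transitions (turn-on and turn-off).
    """
    avg_power_during_switch = 0.5 * v_batt * i_motor   # half the V*I peak
    fraction_of_time_switching = 2 * t_switch_s * f_pwm_hz
    return avg_power_during_switch * fraction_of_time_switching
```

With the numbers from the example that follows, `switching_loss_w(100, 20, 1e-6, 10_000)` comes out to 20 W.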
Better FETs don't help much here; the problem is the time to switch. Better gate drivers that switch the FETs faster do help, but faster switching also causes larger transients, which require better capacitors, more engineering, more expensive parts, and better drive circuitry.
So let's take a simple case. Assume 1000W continuously to the motor, a 100V battery, 1 microsecond switching time, 10 kHz switching frequency, and half throttle.
The motor sees 50V effective because we're at half throttle (100V * 50% = 50V), and we're traveling at about half of max speed.
We know the motor power is 1000W so the motor current is 1000/50 = 20A.
The peak power dissipated during switching is 100V times 20A or 2kW. The average during this switching time is half that, or 1kW.
This loss occurs for 1 microsecond, twice per PWM cycle, at 10 kHz: 1 us * 2 * 10,000 per second means the FETs spend 2% of the time switching. So the switching loss is 0.02 times the 1kW average, or 20W.
So the switching loss in the controller is 2% of the 1kW we deliver to the motor, which doesn't have much effect on range.
Dropping the system voltage to 75V means we'd run at about 67% throttle; the motor voltage and current would of course be the same, since we're traveling at the same speed with the same power. The switching loss would drop by the ratio 75/100, to about 15W, for a savings of 5W in the controller from switching.
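Plugging both pack voltages into the same triangle-model arithmetic shows the comparison directly (a sketch using only the numbers above):

```python
def switching_loss_w(v_batt, i_motor, t_switch_s, f_pwm_hz):
    # Average power during a transition is half the V*I peak,
    # and there are two transitions (on and off) per PWM cycle.
    return 0.5 * v_batt * i_motor * 2 * t_switch_s * f_pwm_hz

i_motor = 20        # A, unchanged between packs for the same speed and power
t_switch = 1e-6     # s
f_pwm = 10_000      # Hz

loss_100v = switching_loss_w(100, i_motor, t_switch, f_pwm)  # ~20 W
loss_75v = switching_loss_w(75, i_motor, t_switch, f_pwm)    # ~15 W
print(loss_100v - loss_75v)                                  # ~5 W saved
```

Only the battery voltage changes between the two calls, which is why the loss scales linearly with pack voltage here.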
This is switching loss only; the I squared R loss in the FETs from motor current is not related to switching and is not included. It also doesn't change if the same controller is used, since the motor current is the same, and motor current dominates the FETs' I squared R loss.
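For completeness, here's what that conduction term looks like; the on-resistance is a hypothetical figure I've picked for illustration, since the post doesn't give one:

```python
i_motor = 20      # A, the same in both the 100V and 75V cases above
r_ds_on = 0.004   # ohms, hypothetical FET on-resistance (not from the post)

# Conduction loss per conducting FET depends only on current, not pack
# voltage, which is why it drops out of the 100V-vs-75V comparison.
p_conduction = i_motor**2 * r_ds_on   # ~1.6 W
```

With real controllers several FETs share the current in parallel, which lowers the effective resistance further, but the point stands: this term is set by motor current, not battery voltage.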
If you lower the switching frequency or reduce the switching time, the loss will drop. Well-engineered controllers can reduce these losses, but it does come at a cost.