12V or 24V for a standstill (mostly) application


Since I am mostly interested in max torque output and basically standstill operation (I would guess max RPM will be in the <100 range), what does higher voltage give me in this regard?

Since P = U * I, I was thinking that lowering the voltage could mean I could increase max amperage and thus get a higher max torque?

But this may also apply: P = I * I * R. And in that scenario the actual power loss in the windings is NOT dependent on voltage, but on amperage and resistance. This may actually answer my question, but I’d like to get it confirmed.

Since I have access to a server power supply that would provide 12V, 56A it would be nice if that supply would in fact suffice for a 6374 motor for instance. I am NOT interested in high revs, just holding torque. The alternative is to take two 12V supplies in series and thus get 24V, but it implies a bunch of problems with grounding etc. What will 24V give me that 12V doesn’t?


Hi @Barsk,

Keep in mind that the voltage supplied to the ODrive has little to do with the voltage applied to the motor phases, as the latter is determined through switching so as to maintain the desired current. I think the main effects of increasing the power supply voltage are (i) greater headroom for torque production and (ii) a different switching duty cycle (which should be unnoticeable for the most part due to closed-loop control).


Hmm. Well, good point. But, it doesn’t answer the question fully.
Will I get the same holding torque from a 12V 56A supply as from a 24V 56A supply?


Torque is entirely dependent on the phase current, where \tau = I * K_t

Your power requirement at holding torque is P = I^2 * R, where R is your phase resistance and I is your phase current. This power has to come from the power supply. If the power supply is 24V 56A, it can supply about 1.34kW. A 12V 56A supply will give half that. So, as long as the 12V 56A supply can hit the required motor power, there is no difference in torque between the two. However, a 24V supply will heat up less and voltage drop will be less of an issue.
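The check above can be sketched in a few lines of Python (the 60 A / 0.04 Ω numbers are purely illustrative, not specs from any real motor):

```python
def holding_power(phase_current_a: float, phase_resistance_ohm: float) -> float:
    """Copper loss at stall: P = I^2 * R. This power must come from the PSU."""
    return phase_current_a ** 2 * phase_resistance_ohm

def psu_sufficient(psu_volts: float, psu_amps: float,
                   phase_current_a: float, phase_resistance_ohm: float) -> bool:
    """The PSU suffices if its V * I rating covers the stall power."""
    return psu_volts * psu_amps >= holding_power(phase_current_a, phase_resistance_ohm)

# 60 A into a 0.04-ohm winding dissipates ~144 W, which both a
# 12 V / 56 A (672 W) and a 24 V / 56 A (1344 W) supply can cover.
print(round(holding_power(60, 0.04)))     # 144
print(psu_sufficient(12, 56, 60, 0.04))   # True
print(psu_sufficient(24, 56, 60, 0.04))   # True
```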


Sorry, I still don’t get what the connection is to voltage, hence my question. Math says P = I^2 * R; voltage is not a concern. And then you say “as long as the 12V 56A supply can hit the required motor power, there is no difference in torque between the two”, disregarding the math you just cited. Since both supplies have the same amp rating they SHOULD be equal. According to math…

Don’t get me wrong, I am not being disrespectful. It just seems we are missing some important factor here that will explain it. As soon as the wheel starts to spin up, voltage will become a factor in fighting back-EMF, and higher voltage will produce higher revs and thus higher motor power. But there will never be any high revs in my application.


I found a nice link that explains it.

Summarized, the torque delivered by the motor can be defined in terms of current:

\tau = k_t*I

But the current requested by the motor is a function of the voltage. This is important to understand.

I = \frac{V_s - k_i \omega}{R}

Where V_s is the supply voltage, k_i is a constant, \omega is the rotation velocity (0 for holding torque), and R is the electrical resistance of the motor.

So basically, the current determines the torque, but the higher your supply voltage, the more current your motor can demand (at any rpm).

So at constant rpm, the motor determines how much current it requests based on the voltage level. Then your power supply must determine whether it can deliver this.
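That relationship can be sketched like this (the k_i value and the 0.1 Ω resistance are illustrative assumptions, not real motor constants):

```python
def motor_current(v_supply: float, k_i: float, omega: float, r: float) -> float:
    """I = (V_s - k_i * omega) / R; at stall (omega = 0) this reduces to V_s / R."""
    return (v_supply - k_i * omega) / r

# At standstill, a hypothetical 0.1-ohm motor can demand twice the
# current from a 24 V supply as from a 12 V one:
print(round(motor_current(12, 0.05, 0, 0.1), 1))  # 120.0 A
print(round(motor_current(24, 0.05, 0, 0.1), 1))  # 240.0 A
```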

Keep in mind that higher current means more heating in your motor and controller. Carefully monitor your motor temperature in stall conditions, because motors will usually overheat (or the magnets will reach their Curie temperature) at maximum current.

If your motor can request 56A at 12V, then 24V will allow it to request more amps. But since your 24V power supply can only supply 56A, the torque will remain the same. But maybe I understood it wrong, haha.


The answer to your second question is that the two PSUs will have different effects depending on the motor used.

Let’s imagine a motor has 0.5 Ohm phase resistance (quite high). Then your 12V psu will give out at most 24A, and your 24V psu will give out at most 48A. So with this particular motor the 24V psu will allow the motor to produce double the torque at standstill (and more heat etc…).

Now imagine you have a motor with 0.01 Ohm phase resistance. Both psus will max out at 56A (And the voltage across the psu poles will drop to the same level which is 0.01 * 56 = 0.56V). In this case you have no benefit from a higher psu voltage in terms of torque produced.
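A minimal sketch of that comparison, using the same illustrative resistances and the 56 A PSU rating:

```python
def stall_current(v_psu: float, r_phase: float, i_psu_max: float) -> float:
    """Stall current is limited by phase resistance (V/R) or by the PSU, whichever is lower."""
    return min(v_psu / r_phase, i_psu_max)

# High-resistance motor (0.5 ohm): the PSU voltage is the bottleneck.
print(stall_current(12, 0.5, 56.0))   # 24.0 A from the 12 V PSU
print(stall_current(24, 0.5, 56.0))   # 48.0 A from the 24 V PSU

# Low-resistance motor (0.01 ohm): both PSUs hit their 56 A current limit.
print(stall_current(12, 0.01, 56.0))  # 56.0 A
print(stall_current(24, 0.01, 56.0))  # 56.0 A
```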


Ok, got it. The missing factor was phase resistance, which causes voltage drop (Ohm’s law, basically).
So, for low-voltage PSUs a low-resistance motor is the key.

However, this also applies: (from the stackexchange.com quote above):
“For the same motor, ideally if you apply double the voltage you’ll double the no-load speed, double the torque, and quadruple the power.”

This assumes a PSU without an amperage limit, of course. So, in essence, increasing voltage also increases torque, as long as your PSU keeps up.


No, the only thing that matters for torque is phase current. Phase current is simply phase voltage / phase resistance.

Most of these motors are in the ~0.1 ohm range. If you want 100A, you only need

100A * 0.1\Omega = 10V

This is 1kW. Now that you have the power demand, you can calculate the current demand from the power supply given the DC bus voltage.

P = IV = 1000W
1000W / 12V = 83.33A
1000W / 24V = 41.66A

If the PSU has no current limit, both will supply the same power, with the same phase voltage, and the same phase current, giving you the same torque. This is the whole idea behind current control… It decouples the DC bus voltage from the output velocity and torque (up to the limits of the PSU).
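The bookkeeping above can be sketched like this (assuming the ~0.1 Ω phase resistance from the example):

```python
# Phase side: what the motor needs at stall (illustrative 0.1-ohm winding).
phase_current = 100.0                             # A, requested by the controller
phase_resistance = 0.1                            # ohm
phase_voltage = phase_current * phase_resistance  # 10 V across the winding
power = phase_voltage * phase_current             # 1000 W of stall power

# Supply side: same power, different DC bus voltages.
for v_bus in (12.0, 24.0):
    i_bus = power / v_bus
    print(f"{v_bus:.0f} V bus supplies {i_bus:.2f} A for the same 100 A phase current")
# 12 V bus supplies 83.33 A ...
# 24 V bus supplies 41.67 A ...
```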


Thanks Wetmelon,

from a real-world perspective with non-perfect PSUs your example makes sense. But I’m still not sure how the conversion from 10V/100A to 12V/83.3A (equal electrical power) would produce equal torque.

If that applies at the same time, it would mean that the PSU amp rating is decoupled from what the driver is feeding the motor? It is the actual Power rating of the PSU that matters?

If that applies, it would then mean I get the same 2X power by connecting the two 12V/56A PSUs in parallel. That is an electrically safer approach, since I do not have to disconnect earth ground from the “upper” PSU. But it would (probably) need Schottky diodes (voltage drop = 0.55V) to prevent back-current between the supplies, which lowers the voltage to about 11.45V.

11.45V/112A or 24V/56A, which is the better… :slight_smile:
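A quick sketch of that comparison, using the 0.55 V Schottky drop and PSU ratings mentioned above:

```python
def parallel_power(v_psu: float, i_psu: float, n: int, diode_drop: float) -> float:
    """n identical PSUs in parallel behind Schottky diodes: currents add, voltage sags."""
    return (v_psu - diode_drop) * (i_psu * n)

def series_power(v_psu: float, i_psu: float, n: int) -> float:
    """n identical PSUs in series: voltages add, the current rating stays."""
    return v_psu * n * i_psu

print(f"{parallel_power(12, 56, 2, 0.55):.1f} W")  # 1282.4 W at 11.45 V / 112 A
print(f"{series_power(12, 56, 2):.1f} W")          # 1344.0 W at 24 V / 56 A
```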


I think you have come to a correct conclusion.

In general, in the name of electrical safety, I would advocate for the parallel configuration. But one thing we haven’t talked about is the transient response of the power supplies. The ODrive will brown out at 8V, so a 24V supply gives you a lot more headroom if you need it.

By the way, the ODrive docs link to a blog post by @Richard_Parsons explaining why the power rating matters and not the current rating https://things-in-motion.blogspot.com/2018/12/how-to-select-right-power-source-for.html?m=1


Alright, taking a real-world example.

This HobbyKing motor for instance:

It has a resistance of 0.013 ohm, and a max rating of 60A. Following your example that gives:
60 x 0.013 = 0.78V
0.78 x 60 = 46.8W (electrical power needed)

46.8W / 12 V = 3.9A

So this motor at max holding torque would, according to specs, draw 3.9A from the supply, and the ODrive would convert that to 60A fed to the motor. The torque output would be (theoretically, using K_t ≈ 8.26 / K_v with K_v = 280 rpm/V):
8.26 * 60 / 280 = 1.77 Nm
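That worked example can be reproduced in a few lines (all values are theoretical datasheet numbers from the post above, not measurements):

```python
KV = 280          # motor speed constant, rpm/V
R_PHASE = 0.013   # phase resistance, ohm
I_MAX = 60.0      # rated max current, A
V_BUS = 12.0      # supply voltage, V

phase_voltage = I_MAX * R_PHASE       # voltage needed across the winding at stall
power = phase_voltage * I_MAX         # electrical power dissipated at stall
supply_current = power / V_BUS        # what the 12 V supply actually delivers
torque = 8.26 / KV * I_MAX            # K_t ~= 8.26 / K_v, result in Nm

print(f"{phase_voltage:.2f} V, {power:.1f} W, {supply_current:.2f} A, {torque:.2f} Nm")
# 0.78 V, 46.8 W, 3.90 A, 1.77 Nm
```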