ODrive ASCII protocol reports different state to odrivetool's Native protocol

I have the ODrive plugged into one USB port and my Arduino connected on another. The Arduino is talking to the ODrive over UART using the ASCII protocol.

When I send r axis0.current_state over the UART I most often get back 0, but sometimes 256. Very occasionally I was getting 1 (IDLE), but never for very long. I have asked about the 256 in another question.

(I am sending the message every 100 ms, which seems very generous to me. Ideally I want to be polling the ODrive at a much faster rate than that, once all these errors are sorted out.)

With odrivetool open in another window, I can just do odrv0.axis0.current_state and I get 1 every time without issue.

I also believe 1 is the correct value, meaning odrivetool and the Native protocol are reporting correctly, and the ASCII reply going to the Arduino is wrong.

What I want to know is what might be happening here to make it go wrong.

Make sure you have a ground wire in common between the ODrive and the Arduino. Otherwise, it sounds like maybe it's off-by-one on the Arduino for some reason.

Yep, there's definitely a ground wire there. What could cause an off-by-one error?

I'm not sure, to be honest. Noise maybe, or the wrong baud rate… are you using software serial or hardware? The software serial is crap, definitely only use hardware.

Can confirm I'm using HardwareSerial.

Here is the code that's running on the Arduino side:

int readODriveInt(String msg){
  Serial_ODrive.print("r ");
  Serial_ODrive.println(msg);
  delayMicroseconds(50);
  int pos = Serial_ODrive.parseInt(); // timeout is 0 so this doesn't wait.
  Serial.print("RCVD: r "+msg+":");
  Serial.println(pos);
  return pos;
}

EDIT: I'm sending two requests -

odrive_state = readODriveInt(String(motor_axis) + ".current_state");
int is_calibrated = readODriveInt(String(motor_axis) + ".motor.is_calibrated");

is_calibrated is coming in as 0 as well, when in odrivetool I get

In [18]: odrv0.axis0.motor.is_calibrated
Out[18]: True

I have just gone and changed .pre_calibrated to False, and is_calibrated continues to come in as 0.

If it's an off-by-one error I would have expected that to be -1 (or 256).

Also, both odrive_state and is_calibrated are plain ints on the Arduino. So that makes me think this 256 value must be coming from the ODrive itself; otherwise it would show up as -1. No?

On further investigation, the value of r axis0.motor.is_calibrated seems to change randomly between 0 and 1.

This could indicate noise, but I'm confused by something:

The ASCII protocol is presumably sending '0' and '1' as the ASCII values for those characters, so 0x30 and 0x31 respectively, sent serially over the two UART wires. Why would the noise only ever affect one single bit? As in, if noise is to blame, why aren't I seeing a whole bunch of random characters coming in?

EDIT: Also, I notice that the value of r axis0.current_state doesn't flicker. It's steady at 0, while odrivetool reports a value of 1.

Possibly a timing issue - maybe it is only affecting the last bit if there is clock skew?
What baud rate are you running at, and are you running from a crystal oscillator or an RC oscillator?


Baud rate is 115200. I don't have anything like a crystal oscillator to hand. Is that a likely problem, and if so, is there any way to test it?

Try running at a lower baud rate. If that fixes it, we will be a step closer to working out what is going on.
As for the oscillator, I meant that most Arduinos have a crystal oscillator on the board, but some (probably) don't. If so, they might not work reliably at high baud rates.

TLDR, it was the clock skew thing and lowering the baud rate fixed it. Thanks for the suggestion.

My problems don't end there, though, since I'm using two versions of the firmware: a "factory default" and a version I've made lots of changes to.

The factory default starts working at a lower baud rate, but doesn't work with SPI or have any of the other stuff I wanted in there.

My altered version doesn't work, and I don't know why, because I didn't think I'd touched anything that should affect the ASCII comms.

What really confuses me is that this all used to work fine - I've had the Arduino controlling the ODrive over the UART interface before, and there were no problems at all. And that was 115200 all the way. I have no idea what might have changed.

Is it possible that you think it is using HardwareSerial but actually it is using SoftwareSerial?
Or is it possible that the board config is wrong?

Also, is there any capacitance and/or resistance on the UART pins? If so, it will be slowing the edges with the RC time constant, so that it doesn't work at high baud rates. If you have an oscilloscope, you can check that the edges look square.

Although the off-by-one issue where it only seems to be affecting the last bit (LSB?) and not the others really seems to point to a timing issue rather than capacitance.
Again if you have the scope, you can check that the length of 1 bit in microseconds is exactly the same for TX (from the Arduino) and RX (from the ODrive). If they are off by more than 5%, you are in trouble.

My Serial interface is declared as

HardwareSerial Serial_ODrive(2); 

What about the board could I have got configured wrong?

I unfortunately don't have a scope so can't check the trace.

However, I've found a few software problems in my own code (Arduino-side) which can be blamed for a lot of the trouble. My plan is to finish fixing those and see what errors remain.