Gcode interpreter

Ok, thanks for that. I think this would be a good starting point, and since many of the known firmwares are based on grbl and almost all use line segments (except g2core/TinyG), it should also allow for adaptation. In any case, a later implementation of higher-level segments seems like a good idea, especially for applications beyond 3D printing.

I agree that at the entry level people expect a plug-and-play solution.
Beyond that point, however, I would guess this is no longer the case.

On the one hand, we’re talking not only about a single application (e.g. 3D printing, where there is also a huge demand in numbers justifying a specialised, highly sophisticated monolithic all-in-one board), but also about things like building robot arms with 3 to n axes and possibly different effectors.
On the other hand, and this is different from the very recent past, the number of options for motor control has grown. For the last few years, a new 3D printer control board simply meant a more sophisticated stepper driver (higher microstepping, more current, direct current control, …). But recently, and ODrive is only one example, although the most promising one from my point of view, different concepts have been arising that aim to overcome the limitations of steppers. That is a sign that something has changed, either on the demand side or on the opportunity side. The point here is that these modules will not be integrated into a big controller board any more. Therefore, at the moment, step/dir is the only existing interface, but that keeps one of the most severe limitations of the stepper concept alive.

To be sure, I didn’t want to imply any duty on Oskar to implement such a concept. The great thing is that it is open source and anyone can add a further option. It would just help to define a concept that many people agree makes sense.

On the other hand, ODrive already is, and has to be, both of the systems you mentioned:
a 2-axis driver by itself can’t be a complete motion controller for more complex projects, but for simple things it may work on its own, and with step/dir it will already be a great servo controller.

I’ll point out that my original post wasn’t that odrive needed to be the gcode interpreter, but rather that it needed to be able to be controlled by one.

If LinuxCNC can be taught to send the appropriate commands to the odrive, then there is little need for a g-code interpreter on the odrive itself.

Another project that needs a similar interface is the pthat (http://pthat.com/). It drives stepper motors, but has a text interface that’s neither g-code nor step/direction. It would be nice if projects that are going in this direction could get together and agree on a protocol standard so that the interface doesn’t need to be re-invented for every board.

They have published a command set at http://pthat.com/index.php/command-set/. It’s a bit limiting in that it’s defined to support 4 axes, but it would not take a lot of effort to extend it to more (some thought would have to be given to how to sync multiple boards).

Unfortunately, they have so far taken the attitude that they don’t want to be limited to g-code, so they don’t plan on trying to make their hardware supported by any CNC software :frowning:

I hope my comments didn’t sound too impolite.

For the moment I think G-code is, for most applications, still the best and most standard way to interface with machine control, so there is clearly a need for some sort of G-code interface. My basic idea is that for simple projects it could also run on the ODrive board itself, but taking your comment from the top into account, it is true that this will also introduce some problems (especially where to draw the line on supported commands).

My first comment was just pointing out the problem that G-code is not a suitable standard for interfacing multiple synced boards, and that therefore step/dir is usually used instead, but at a high price.

However, looking at LinuxCNC (as far as I understand it), they seem to use step/dir in most cases (Gecko drives, for example, are also interfaced this way), but for that they often employ dedicated FPGA boards as step generators, which looks a bit anachronistic next to all the specialised 3D printer boards available nowadays (though step rates on CNCs are probably much higher when trying to achieve fast moves on leadscrew axes).

From the discussions on the Duet WiFi forum, David Crocker stated that when driving 4 axes the Duet board can output a step rate of at most ~100 kHz (~300 kHz if only one axis is moving). With very fine movement steps this is a severe limit on the maximum speed. (Normally on 32-bit boards this is no real issue with a stepper setup, unless you decide to go for the full 128-microstep mode; the dynamics of a decent BLDC servo, however, should be much higher.)
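As a sanity check, the speed ceiling implied by such a step rate is easy to work out. Here is a quick back-of-envelope sketch; all hardware figures in it are assumed for illustration (a 0.9° stepper at 128 microsteps on a 20-tooth GT2 belt), not taken from the Duet specs:

```python
# Back-of-envelope check of the step-rate speed limit discussed above.
# All hardware figures here are assumed for illustration, not Duet specs.

FULL_STEPS_PER_REV = 400        # 0.9 degree motor
MICROSTEPS = 128                # the fine-microstepping mode in question
MM_PER_REV = 40.0               # 20T GT2 pulley, 2 mm pitch
MAX_STEP_RATE_HZ = 100_000      # the ~100 kHz 4-axis figure quoted above

steps_per_mm = FULL_STEPS_PER_REV * MICROSTEPS / MM_PER_REV
max_speed_mm_s = MAX_STEP_RATE_HZ / steps_per_mm

print(f"{steps_per_mm:.0f} steps/mm -> max {max_speed_mm_s:.1f} mm/s")
# -> 1280 steps/mm -> max 78.1 mm/s
```

At these assumed numbers the axis tops out around 78 mm/s, which is exactly the kind of limit being discussed.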

My question about commercial CNCs was more about whether there is a standardized protocol for this that could be used as a starting point, but so far I haven’t come across anything.

I assume every manufacturer implements something proprietary (there are some servos with an RS485 interface, but I think they always use a specific command set). At work I have one machine that actually does use step/dir to control a stepper axis, but nothing there is synchronized. All the multi-axis machines have some kind of CNC control where, to start a process, you usually do a reset with the override at 0 and then ramp up to full speed. That was the starting point for my thought: if the pot controls the takt frequency, it would initiate synchronized movement with the first takt, and the motion segments would imply a certain time reference relative to that takt line.

I didn’t plan to be suggesting anything when I initially posted, I just wanted to know what direction to go.

but when the idea of rolling a new g-code interpreter came up, I felt I needed to suggest better options.

I also suspect that the big CNC machines all have proprietary stuff inside; not one of them allows any other controller to be used, so compatibility with anything is a non-issue for them.

I don’t know what odrive had in mind for syncing multiple boards to do a coordinated move across boards, but I’d like to see it be something that could actually work with other boards (for example, use odrive for the big motors to move a gantry and then a pthat to do the Z axis, 4th axis, extruder or similar)

I think it’s fair to say that the thing that issues the commands should take into account the speed needed for each axis so that coordinated motion will still happen across boards. It’s just a matter of having a way to tell all boards to start at the same time.

the pthat command structure already supports defining coordinated moves within one board. We could extend that command set to include a ‘delayed start’ command.

Define a GPIO on each board to be part of a bus that has a pull-up resistor on it. Have every board (possibly including the master computer) pull the pin low under normal conditions.

Once they get a ‘delayed start’, they would switch from pulling the pin low to it being an input (possibly an interrupt). Once all boards stop pulling it low, the line will go high and all boards would start at the same time (and after a very short window, pull it low again)

the same command could be used in the middle of a sequence of commands to make sure that all the boards are in sync (modern clocks are pretty accurate, but we want to deal with what drift does happen)
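The release-the-line behaviour described above can be checked with a few lines of simulation. This is only a sketch of the open-drain wired-AND idea in Python (board names invented), not firmware for any real hardware:

```python
# Software simulation of the proposed open-drain "delayed start" bus.
# Each board holds the shared line low until it is ready; the pull-up
# only lets the line float high once every board has released it, and
# then all boards start together. Sketch only, not real firmware.

class Board:
    def __init__(self, name):
        self.name = name
        self.pulling_low = True   # normal condition: drive the pin low
        self.started = False

    def arm(self):
        """Received 'delayed start': release the line and wait for it to rise."""
        self.pulling_low = False

def bus_level(boards):
    # Open-drain wired-AND: any board pulling low wins, else pull-up -> high
    return 0 if any(b.pulling_low for b in boards) else 1

def tick(boards):
    """One poll cycle: if the line is high, every board starts at once."""
    if bus_level(boards) == 1:
        for b in boards:
            b.started = True
            b.pulling_low = True  # re-assert low for the next sync window

boards = [Board("odrive0"), Board("odrive1"), Board("pthat")]
boards[0].arm(); tick(boards)       # one board ready: line still low
boards[1].arm(); tick(boards)       # still one holdout
boards[2].arm(); tick(boards)       # last board releases: all start
print([b.started for b in boards])  # -> [True, True, True]
```

The last line of `tick` also models the “after a very short window, pull it low again” step, so the same bus is immediately ready for the next sync point.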

Yes, that could work quite well for the start, it would be like having every command (or every n-th) synced by that pulse.

I also agree that this should not be focused on ODrive alone; it should be simple to implement for different boards, so making at least the most basic layers easy to implement will be key.

I thought about something similar in the beginning, but I see two main disadvantages:

  • The reaction time and look-ahead on the motor controller is limited. If the pulse does not arrive at the expected point, the controller has to issue an abrupt stop, which will jerk the machine; and since the “master” does not know this happened, it will also not take the acceleration needed to get back in sync into consideration. With the takt line, the servo controller can self-correct a certain position mismatch by deviating from its expected position, without a hard cut. Also, with one or two takts of look-ahead, all controllers can adapt to upcoming changes. (The frequency should not be so low that it becomes visible to the human eye, but also not so high that it burdens the controllers, so a few hundred Hz up to 1 kHz is what I have in mind; a decent controller then has plenty of time to react within two periods.)

  • The “master” would have to segment all moves for all axes into identical lengths, even when the motion on different axes has different scales (e.g. sinusoidal machining may have one axis moving ahead at constant speed over a long run, whereas a second axis alternates the whole way). With a takt line, this reduces to sharing the same time base, which each controller can handle itself.
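To illustrate the second point: with a shared time base, each controller can sample its own motion profile at the common ticks, so the master never has to slice the move into per-axis segments of matching length. A small sketch (the frequency and both profiles are made-up numbers, chosen inside the few-hundred-Hz-to-1-kHz band mentioned above):

```python
# Sketch of the shared-time-base idea: X moves at constant speed while
# Y follows a sine, yet neither axis needs the master to cut the move
# into matching segments -- each controller just evaluates its own
# profile at the common "takt" ticks. All numbers are illustrative.

import math

TAKT_HZ = 500                  # shared clock
DT = 1.0 / TAKT_HZ

def x_profile(t):              # constant-speed axis, 50 mm/s
    return 50.0 * t

def y_profile(t):              # oscillating axis, 10 mm amplitude at 2 Hz
    return 10.0 * math.sin(2 * math.pi * 2.0 * t)

# Each controller samples only its own profile at the shared ticks:
ticks = [n * DT for n in range(5)]
x_setpoints = [x_profile(t) for t in ticks]
y_setpoints = [y_profile(t) for t in ticks]
```

The two set-point streams stay coordinated purely because they share the tick times, not because anyone segmented them identically.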

On the other hand, I do agree with the concept of a master planner that knows which axis has which capability.
I think the LinuxCNC approach of doing the configuration/scaling at the lowest possible level is quite good: the position commands could be in “natural” units and the servo controller does its own scaling (the configuration could, however, be distributed from the main controller at startup).

I also agree with the general idea of different servo controllers for different purposes on the same machine. E.g. as I am building a CoreXY printer, the main XY motion stage is where I want to apply ODrive, but the slow Z axis requires a different driver, where maybe additional motors could be added for motorized bed levelling. Also, in the long term, the extruders should be driven by cheaper brushed DC motors, with feedback from an actual filament feed sensor…

Again a long post … I apologize :grin:

We’re having a good discussion here; until someone else jumps in and tells us to shut up, let’s keep going :slight_smile:

As for the disadvantages you note:

If you are doing a multi-axis move that involves multiple controller boards, you cannot have the look-ahead done on the controller; it needs to be done at the higher level that works across controllers. If one controller can’t do what’s been asked, the result is not going to be good (just slowing down the axis managed by that one board won’t help); the most an individual board can do is report that it was unable to comply.

I’m not worried about the need for something to break things down into chunks that take the same amount of time on all axes. This is a very similar problem to how to draw curves. Every CNC machine I know already has to break any curve down into a series of straight line segments. It’s either done at the CAM stage and shows up in the g-code as a ton of tiny line segments, or if the g-code uses G2/G3, it’s done in the g-code interpreter. In either case, it’s not a problem unless you have pauses between steps.
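For reference, the math behind that arc-to-segments step is simple: a chord spanning angle dθ on radius r deviates from the arc by the sagitta r·(1 − cos(dθ/2)), which gives a minimum segment count for any chord tolerance. A sketch (the function name is mine, not from any particular interpreter):

```python
# How many line segments a G2/G3 arc needs for a given chord tolerance.
# Classic sagitta bound: a chord spanning angle dtheta on radius r
# deviates from the arc by r * (1 - cos(dtheta / 2)), so pick the
# largest dtheta whose sagitta stays within tolerance.

import math

def segments_for_arc(radius, sweep_angle, chord_tol):
    """Minimum segment count so the chord error stays <= chord_tol."""
    max_dtheta = 2.0 * math.acos(1.0 - chord_tol / radius)
    return max(1, math.ceil(abs(sweep_angle) / max_dtheta))

# A full circle of 10 mm radius at 0.01 mm tolerance:
n = segments_for_arc(radius=10.0, sweep_angle=2 * math.pi, chord_tol=0.01)
dtheta = 2 * math.pi / n
sagitta = 10.0 * (1.0 - math.cos(dtheta / 2.0))
print(n, sagitta)   # the resulting sagitta stays below the 0.01 mm tolerance
```

It also shows why this is "a ton of tiny line segments": even a modest tolerance on a small circle already needs dozens of them.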

As long as you can buffer commands, so that you can send the next movement while the board is executing a previous movement (and the communications and processor are fast enough to keep up), the mechanics of the machine should not be able to tell when one command ends and the next starts.

Since some software is going to generate lots of line segments anyway, this is something we are going to have to handle in any case.

by the way, I’m unfamiliar with the term “takt-line”. I think I have a fair idea of what it is from the context, but could you point me at the definition you are using?

Yes, I’m afraid that term is not really correct. In industrial planning, “Takt” is also used in English, but presumably in electronics it’s not…
So “clock line” would have been more correct, but that also lacks part of the meaning here: it is really an additional time reference, serving neither the operation of the processor nor the serial line (although in theory one of those could perhaps be mis-used for this purpose).

As I said, I would assume a good frequency for normal operation would be around or below 1 kHz.

[quote=“dlang, post:19, topic:83”]
I’m not worried about the need for something to break things down into chunks that take the same amount of time on all axes. This is a very similar problem to how to draw curves. Every CNC machine I know already has to break any curve down into a series of straight line segments. It’s either done at the CAM stage and shows up in the g-code as a ton of tiny line segments, or if the g-code uses G2/G3, it’s done in the g-code interpreter. In either case, it’s not a problem unless you have pauses between steps.
[/quote]

You are right about this. The reasons I would still try to avoid it are the following. First (and more obviously), the number of commands to be issued increases drastically, especially if you implement higher-order acceleration schemes (which I find an appealing concept, although their usefulness is for me neither proven nor disproven yet). More importantly, I think it makes sense to have the transformation from arcs to line segments at the lowest possible level, to preserve as much accuracy as possible. For controllers unable to do this, it could be done at the master, but on the ODrive, for example, it should be done on the drive itself, because I assume the controller knows best how accurately it can follow.

On this point I may have been a bit imprecise again. You are totally right that the actual planning cannot happen at two different levels, and it should be at the highest possible one. Actually, I lack a bit of experience in how effectively this could be achieved with only a serial interface and without the mentioned external time reference. The idea of look-ahead on the controller side (you can also call it a buffer) is more about the layout of the position or speed regulator, allowing it to compensate for errors differently depending on future conditions. (This probably doesn’t really work with a single-layer standard PID scheme, but it could help when higher-level or multi-layer regulators are applied.)

Some time ago I was following discussions on the Mechaduino forum, though not recently, so maybe they have overcome these issues by now. But it was quite obvious that the PID tuning was extremely hard, and from what I understood there were mainly two causes:

  1. The AMS encoder they use seems to have quite high latency, at least depending on the output mode chosen (this is only my impression; I never tested or compared them, although they look very promising on the specs).

  2. The change of nominal position via the step/dir input. Since the I-term in particular evolves over time and is quite helpful for long-term stable positioning, it is often reset when the nominal changes, as part of the anti-windup strategy. If a known rate of change for the nominal exists, the strategy would change on this point, allowing more effective regulation and use of all terms. With step/dir, however, you can only derive the rate of change from the past, which may be completely wrong.
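A toy controller makes the second point concrete: when the commanded rate is streamed alongside the position, a feedforward term can account for the moving target instead of the integrator winding up, whereas step/dir leaves only past pulses to estimate that rate from. All gains and numbers below are invented for illustration:

```python
# Toy position controllers comparing a streamed setpoint rate (known
# feedforward) against step/dir (rate unknown). Gains, dt, and all
# numbers are invented for illustration only.

class FeedforwardPI:
    def __init__(self, kp, ki, kff, dt):
        self.kp, self.ki, self.kff, self.dt = kp, ki, kff, dt
        self.integral = 0.0

    def update(self, setpoint, setpoint_rate, measured):
        err = setpoint - measured
        self.integral += err * self.dt       # integrator need not be reset:
        u_ff = self.kff * setpoint_rate      # the moving target is carried
        return self.kp * err + self.ki * self.integral + u_ff  # by the FF term

ctrl_ff = FeedforwardPI(kp=2.0, ki=0.5, kff=1.0, dt=0.001)
ctrl_sd = FeedforwardPI(kp=2.0, ki=0.5, kff=1.0, dt=0.001)

# Streamed trajectory: the commanded rate (100 units/s) is known.
u_ff = ctrl_ff.update(setpoint=1.0, setpoint_rate=100.0, measured=0.99)
# Step/dir: the rate must be guessed from past pulses (here: unknown, 0).
u_sd = ctrl_sd.update(setpoint=1.0, setpoint_rate=0.0, measured=0.99)
# The FF controller already commands the cruise effort; without the
# known rate, error and integral must build up first.
```

The two outputs differ by exactly the feedforward contribution, which is the effort a step/dir-fed controller can only accumulate through lagging error.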

I also agree on this :wink: It’s a good technical discussion and I am very glad to go into its details; it also helps to refine the ideas by bringing them from thoughts into words :slight_smile:


well, remember that perfect is the enemy of good enough.

I’d rather start with a conceptually simpler system, even if it has performance issues, and then optimize from there, than try to design a system that provides the best possible performance and becomes too complicated to build reliably.

or another way to put it, “premature optimization is the root of all evil”


Realistically, we have three common cases here.

  1. odrive running solo that wants to consume g-code.

  2. odrive running as the master, with something else as the slave (could be odrive, could be other)

  3. external controller managing one or more odrive and/or one or more other boards

(plus the case where we don’t care about g-code, but in that case, something issues native commands directly)

I’m thinking that it makes sense to distill this into two categories.

A. the odrive is a slave of another system and is fed line segments to execute

In this case, it needs to understand a command set (ideally one common with other boards) to set the axis speed, acceleration, and distance movements, including some synchronizing primitives.
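Purely as a strawman (none of this is the pthat or odrive protocol; every command name here is invented), such a command set could be as simple as one ASCII verb plus per-axis parameters, which is trivial to parse on a small MCU:

```python
# Hypothetical text command set for case A: per-axis values on a MOVE,
# plus a parameterless SYNC barrier. Verb and field names are invented
# for illustration; this is not the pthat or ODrive command set.

def parse_command(line):
    """Parse e.g. 'MOVE X10.5 Y-3 F200' into (verb, {axis_letter: value})."""
    verb, *fields = line.split()
    params = {f[0]: float(f[1:]) for f in fields}
    return verb, params

verb, params = parse_command("MOVE X10.5 Y-3 F200")
print(verb, params)   # -> MOVE {'X': 10.5, 'Y': -3.0, 'F': 200.0}

# A sync barrier carries no parameters at all:
print(parse_command("SYNC"))   # -> ('SYNC', {})
```

Speed/acceleration limits and the synchronizing primitives from the post above would just be more verbs in the same shape, which is what makes a common format across boards plausible.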

B. the odrive is the master and controls axes on itself, possibly dispatching some axis movements to another board.

In this case, running something like grbl on it, internally calling the axis movements, with the possibility of sending axis movements to other boards (with sync), would work.

I haven’t spotted the documentation on the existing serial protocol that the odrive implements, but if it were possible to use the same commands as the pthat, it would make it easier to justify adding a driver that sends commands in this serial format to upstream projects like grbl and linuxcnc (anything that supports multiple projects is going to be more readily accepted than something that only supports one board)

does this sound reasonable to the odrive folks?

Actually, after reading Oskar’s latest mail on the log, I was curious about the profile pic of the guy with the AnanasStepper board, and in the discussion on his side Oskar referred to the CANopen CiA 402 protocol.

I only did a quick browse of it, but it seems to be a standardised serial control protocol (I had come across that term before with commercial servos, but thought it only referred to the transmission protocol, not the actual command set).

So, even before fully getting into the details, I have a gut feeling that this could be the right thing; in any case, an existing standard is preferable to something new.

Hey guys thanks for the very interesting discussion!

On replacing step/dir:
I completely agree that step/dir is garbage and needs to be replaced yesterday. Still, I need to support it for connecting to legacy motion controllers.

There are many ways the ODrives could speak to the outside world and to each other, but indeed I have been keeping my eye on CiA 402, as I know that is what’s used in industrial servo drives. Hence, many software stacks meant to interface with industrial servo drives speak it, such as ROS.

For reference, the introductory paragraph of the chapter titled “Interpolated position mode” on page 105 of the CiA 402 spec reads:
“The interpolated position mode is used to control multiple coordinated axles or a single axle with the need for time-interpolation of set-point data. The interpolated position mode normally uses time synchronization mechanisms like the sync object defined in /3/ for a time coordination of the related drive units.”

Further, the spec says you must implement linear interpolation of position over time. However, it allows for manufacturer-specific modes, which is good if we want to have higher-order planning.
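A minimal sketch of what “linear interpolation of position over time” means in practice: the master streams position set-points at the sync period, and the drive fills in its servo-loop samples between them. This is heavily simplified relative to the real CiA 402 buffering and sync machinery:

```python
# Minimal sketch of linear set-point interpolation: the master streams
# position set-points at the sync period and the drive evaluates its
# servo loop in between. Simplified; the real CiA 402 mode has buffer
# management, sync objects, and error handling on top of this.

def interpolate(p_prev, p_next, sync_period, t):
    """Position t time-units after p_prev was issued (0 <= t <= sync_period)."""
    alpha = t / sync_period
    return p_prev + alpha * (p_next - p_prev)

# Set-points every 10 ms; the servo loop samples 2.5 ms into the window:
pos = interpolate(p_prev=100.0, p_next=104.0, sync_period=10.0, t=2.5)
print(pos)   # -> 101.0
```

Manufacturer-specific modes could swap this linear fill-in for a higher-order one without changing how the set-points are delivered.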

They also specify a standard mechanism for checking for tracking error (i.e. the ability to keep up), how to signal it, and what should be done when one occurs.
By the way @Giornoxx, it is not standard to slow down or take any sort of intermediate recovery action at the master: suppose you are decelerating and one of your axes goes into torque limit, what do you do? You can’t slow down; that makes it worse. Changing the trajectory plan to mitigate is too complicated for most applications. Usually the approach taken is to plan the trajectory with sufficient margin that a limiting condition should “never” occur, and any small deviation/error is servoed out by the individual axis.

On the G-Code interpreter on ODrive:
Conceptually, in a robotics/motion system (whether using ODrive only, or otherwise) we have: high-level motion commands -> motion planner -> motion controller -> motion primitives -> motion execution.
While there are many other options too, for the purposes of this discussion the high-level motion commands could be original (not split into 1000 lines) G-Code, and the motion primitives could be CiA 402 “interpolation points”, or step/dir signals from an existing controller.

As both @dlang and @wetmelon mentioned, it is unclear whether the ODrive is trying to be just a servo drive, or both a motion controller and a servo drive, and if so, how much effort will be spent on the motion controller and planning part. I am intentionally leaving this unspecified.
I will put in place a simple cubic trajectory planner and motion controller, and something that can read just the G1 command. The main reason to do just that is that it is required for very simple point-to-point movements for standalone applications and demos.
Beyond that, I think we can use the step/dir interface (for now) to talk to existing motion controllers and get working systems up and running quickly. Meanwhile, we can spend our time making the servo drive part of the project really good: stuff like automatic tuning, ease of use, etc.
Then we can come back to this topic once machines are running.
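For anyone curious what such a cubic point-to-point planner might look like, here is a guess at the simplest version: the classic rest-to-rest cubic blend, which starts and ends with zero velocity. This is a sketch, not ODrive’s actual code:

```python
# Guess at a minimal cubic point-to-point trajectory: the rest-to-rest
# cubic blend 3s^2 - 2s^3, whose derivative is zero at both endpoints,
# so the move starts and stops smoothly. Not ODrive's actual planner.

def cubic_point_to_point(p0, p1, duration, t):
    """Position along a rest-to-rest cubic from p0 to p1 over 'duration'."""
    s = max(0.0, min(1.0, t / duration))    # normalized time in [0, 1]
    blend = 3 * s**2 - 2 * s**3             # cubic with zero end velocities
    return p0 + blend * (p1 - p0)

# Endpoints are hit exactly and the midpoint is exactly halfway:
print(cubic_point_to_point(0.0, 10.0, 2.0, 1.0))   # -> 5.0
```

Feeding this evaluator from a parsed G1 target would be enough for the standalone point-to-point demos described above.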

That said, if there is any of you who are really interested in starting to work on any of the following this early anyway, let me know, and I can provide any direction you may need.

  • Porting any G-Code interpreter
  • Running an interpreter on the host, and converting to other primitives, to be sent over USB
  • Motion planning, in general
  • Implementing/porting CiA 402

Thanks again for the well thought out and civil discussion. This is how I always hoped the forum would be ;D


complying with any spec is good, I was just picking one I ran across that seemed well defined. If there’s an actual industry spec that’s available to be implemented (no fees for access to the spec or licensing costs to implement it), that’s even better.

Thanks for the pointer.

by the way, do you know of any open-source code that implements CiA 402?

I started gathering the relevant portions of code together to get an overview and try to dig in, but I won’t make any promises at the moment (with respect to my coding skills and available time).
My personal goal is to manage connecting an existing firmware to the ODrive without step/dir, and I’m willing to contribute somehow. In the end it should attach to the Duet boards, but I’ll start with plain grbl, as that’s probably easier for understanding how the system works and for setting up some demo stuff.

Ok, let’s say the basic idea was to allow more options for the closed-loop control based on the trajectory. Additionally, I could imagine that for some as-fast-as-possible moves it could be good to let the master know when, say, 80% of the achievable torque is reached (but it’s certainly not a current issue, and any reaction to such information should be handled thoughtfully, based on validated strategies and scenarios) :smirk:
Which also implies it’s probably not a standard scheme…


I think adding motion control is too much to ask of the ODrive, because besides controlling the motors you need more I/O to do anything. My vote is to make the drive do what it is designed for really well. Adding motion control would add a lot of complexity, and it would never be able to handle all the I/O needed for even just a mill: a mill would need 2 ODrives and at least 3 endstops, and that is the bare minimum.

Well, first off, here’s hoping that a later version uses a larger chip with more motor controls :slight_smile:

but even this limited version could be used for a laser cutter (2D + PWM signal for the laser), which is what I have in mind.

but the real issue is the I/O rate: step/direction pulses put a very low limit on how fast you can give direction to the odrive, so there needs to be some higher-level protocol in play (including the ability to sync movements across multiple boards)

4 posts were merged into an existing topic: Step / Direction

Hi everyone, resurrecting this old thread to let you know that, thanks to @Hello1024, we now have the G-Code parsing and motion planning features of GRBL stripped out and half-integrated into the ODrive firmware here: ODrive PR 47.
As it says in the PR, this hasn’t been tested on hardware, and is more a proof of concept than a finished feature. Nevertheless, if someone wants to work on finishing it off, you are more than welcome!
Let me know, and I can coordinate.

Has there been any update since the end of 2017? Or are there any other implementations that allow multi-axis trajectory planning inside the ODrive, based on a list of coordinates coming from an external device?