Robot following trajectory path (Ideas needed)

Hi all,
I love ODrive.
A while back I started developing this 6-DOF articulating arm.



After some time, I came back to it and started working on it again. A while back I posted a hard-stop homing feature here

I imported the 3D model into RoboDK for simulations and trajectory path calculations:

I like RoboDK because it is Python based and I LOVE Python.
Anyway, I gave up on the project for a while because I got sick of the unreliable, crash-prone USB connection.
Then I put my big boy pants on and started modifying the firmware to use UART with custom short commands at a 921600 baud rate.
After a lot of work I got it going, and I developed a simple GUI to control it.

Now that I have a GUI, I can calibrate all joints and home them using the AWESOME hard-stop homing, see here: https://youtu.be/nYdeXFth3EI

Then I figured I’d add a “park” position function that lets me park the robot before I click IDLE and close. See here: https://youtu.be/tBGBzsRpWfc

On to the issues and where I need ideas and help.
I can use RoboDK to program a path, see here: https://youtu.be/0HJygEd8gEE
The simulation sends to my python script a list of joint angles, see here: https://youtu.be/kXAtio8DXW0
The issue, as I described it, is basically that I would like to know the best way to drive the robot through the programmed path. I can get all the path joint angles, convert them to “encoder counts”, save them to a file, and then read the file back and extract ALL the path joint angles from it. That part is not my issue. The issue is: what is the best or correct way to send these values to the ODrives and have them run the robot through this path?

I am currently using position control and trajectory control, on a modified Razor’s Edge firmware. So I am using:
axis*.controller.input_pos = value followed by axis*.controller.update_pos()
This uses the trap_traj velocity, accel, and decel. But it is not as simple as just sending those counts to the ODrive.
If I just send the counts to each joint and run them, the motion looks like this: https://youtu.be/2A2Qs4Drmso

It is not even close :stuck_out_tongue:

Thank you fellas in advance


Hi Jamil84,

Trajectory planning is quite complex, as you need to consider the dynamic loads on each individual joint, such as inertia, acceleration, and torque.

What I think could help your path planning is to calculate the acceleration of each individual DOF, meaning each joint gets a different acceleration from the others. I am not sure if you have already considered this, but it might be a start?
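One concrete way to do that (a sketch of the idea, not something from the thread): for a trapezoidal profile, scaling a joint's velocity limit by k and its acceleration limit by k² scales the move time by 1/k, so you can stretch every joint's move to finish at the same time as the slowest joint. A minimal Python sketch, with hypothetical function names and limits in encoder counts:

```python
import math

def trap_time(d, v, a):
    """Minimum time of a trapezoidal move of distance d with limits v, a."""
    d = abs(d)
    if d == 0:
        return 0.0
    if d >= v * v / a:                 # full trapezoid: a cruise phase exists
        return d / v + v / a
    return 2.0 * math.sqrt(d / a)      # triangular profile: never reaches v

def synchronized_limits(deltas, v_max, a_max):
    """Per-joint (vel, accel) so all joints finish together.
    Scaling v by k and a by k**2 scales the move time by 1/k."""
    times = [trap_time(d, v_max, a_max) for d in deltas]
    t_total = max(times)
    limits = []
    for t in times:
        k = (t / t_total) if t_total > 0 else 1.0
        limits.append((v_max * k, a_max * k * k))
    return limits
```

The per-joint results would go into each axis's trap_traj settings before the move. This only synchronizes a single point-to-point move, though; it doesn't make the robot follow the path between waypoints.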

You can take a look at this for more details (chapter 9): academia.edu/39668978/MODERN_ROBOTICS_MECHANICS_PLANNING_AND_CONTROL_Modern_Robotics_Mechanics_Planning_and_Control_c

@uniquenospacesshort Thank you,
Currently, all joints are programmed with the same accel and decel in the trap_traj settings, scaled by their reduction ratios.
That being said, I believe velocity, accel, and decel should not be fixed; they should depend on the number of counts each joint will travel.
But then again, it’s a learning curve.

Don’t use the Trajectory Planner for this, it’s not designed to synchronize the axes. My recommendation is to also parse the joint velocities with your program and send both input_pos and input_vel for each joint. In other words, with UART, p <axis> <pos> <vel>. If you can convert acceleration to current, then send the current also.
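A rough sketch of what streaming those commands could look like in Python, assuming the p &lt;axis&gt; &lt;pos&gt; &lt;vel&gt; ASCII line format suggested above (the function names are made up, and in practice you'd wire port_write to pyserial's Serial.write):

```python
import time

def format_pos_vel(axis, pos, vel):
    """One UART line per axis: 'p <axis> <pos> <vel>' (assumed format)."""
    return f"p {axis} {pos:.1f} {vel:.1f}\n".encode("ascii")

def stream_trajectory(port_write, waypoints, dt):
    """Send one batch of per-axis (pos, vel) commands every dt seconds.

    port_write: a write() callable, e.g. serial.Serial(port, 921600).write
    waypoints:  list of ticks; each tick is a list of (pos, vel) per axis
    """
    next_t = time.monotonic()
    for tick in waypoints:
        for axis, (pos, vel) in enumerate(tick):
            port_write(format_pos_vel(axis, pos, vel))
        # pace the stream so commands arrive at the profile's sample rate
        next_t += dt
        time.sleep(max(0.0, next_t - time.monotonic()))
```

Sending the velocity feed-forward alongside each position is what keeps the joints moving smoothly between the streamed points instead of stop-starting at each one.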


Hi Jamil84, firstly, your GUI is amazing! Secondly, do you need RoboDK to use it?

Regards Jerry :thinking:

Hi Jamil84,

I’m not sure how you make the robot play back the motion profile, but as you said, it seems the commanded positions are being sent much faster than the hardware can react. Here are some key points to check against what you’ve done:

  1. Check the max speed and acceleration settings in your trajectory generator and make sure they are achievable with your motor settings.
  2. Generate the motion profile with timestamps (converted into joint-angle positions).
  3. Interpolate the motion profile based on the timestamp at which the controller sends each position to the ODrive board.
  4. Run the motors in position control mode.
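Steps 2 and 3 above can be sketched in plain Python roughly like this (hypothetical helper names; linear interpolation of a timestamped joint profile, resampled at the rate the controller sends positions to the ODrive):

```python
from bisect import bisect_right

def interpolate_profile(times, positions, t):
    """Linearly interpolate one joint's position at time t.
    times: ascending timestamps; positions: joint position at each timestamp."""
    if t <= times[0]:
        return positions[0]
    if t >= times[-1]:
        return positions[-1]
    i = bisect_right(times, t)
    t0, t1 = times[i - 1], times[i]
    p0, p1 = positions[i - 1], positions[i]
    return p0 + (p1 - p0) * (t - t0) / (t1 - t0)

def resample(times, positions, dt):
    """Positions to send to the ODrive, one every dt seconds."""
    n = int(times[-1] / dt) + 1
    return [interpolate_profile(times, positions, k * dt) for k in range(n)]
```

The point is that the ODrive receives positions at a steady rate the hardware can follow, instead of the raw waypoint list being dumped as fast as the link allows.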

Hopefully this helps. Good luck :wink:

Thanks Jerry.
No, you do not need RoboDK, unless you click the buttons in the RoboDK frame. :slight_smile:

  1. Interpolate the motion profile based on the timestamp at which the controller sends each position to the ODrive board.
  2. Run the motors in position control mode.

Interesting. I’ve got to look more into RoboDK and see what else the API offers besides joint angles.
Thank you!

Hi Jamil
I saw your posts on the Discord. I’m working on a 6-DOF robot arm with ODrive too. It will be fully ROS and MoveIt integrated. You asked for some documentation; this is all I needed to get started with ROS. If you have specific questions, don’t hesitate to contact me! :slight_smile:
https://kyrofa.com/posts/your-first-robot-introduction-to-the-robot-operating-system-2-5
https://industrial-training-master.readthedocs.io/en/melodic/index.html
http://wiki.ros.org/urdf/Tutorials/Building%20a%20Visual%20Robot%20Model%20with%20URDF%20from%20Scratch
https://answers.ros.org/question/9613/how-to-import-stl-files-into-urdf-files/

@Noothless
Thank you so much. Converting 3D models to URDF has always been a dead end for me with ROS.
You should share your project as well.

@Jamil84
You’re welcome. Yes, this script for Fusion 360 works great. You need to convert the ASCII STL files to binary STL files afterward though, otherwise MoveIt won’t be able to load them. I had to learn this the hard way… :slight_smile:
This one worked for me: https://www.meshconvert.com/en.html
I am working on sharing my project. Unfortunately, I’m quite busy at the moment.
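For what it’s worth, the ASCII-to-binary STL conversion can also be scripted offline with nothing but the standard library. A rough sketch (function name made up; it assumes a well-formed ASCII STL, binary layout per the STL format: 80-byte header, uint32 triangle count, then 50 bytes per triangle):

```python
import struct

def ascii_to_binary_stl(ascii_text):
    """Convert an ASCII STL string to binary STL bytes."""
    facets = []
    normal = None
    verts = []
    for line in ascii_text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "facet" and parts[1] == "normal":
            normal = tuple(float(v) for v in parts[2:5])
            verts = []
        elif parts[0] == "vertex":
            verts.append(tuple(float(v) for v in parts[1:4]))
        elif parts[0] == "endfacet":
            facets.append((normal, verts))
    out = bytearray(b"converted".ljust(80, b"\0"))  # 80-byte header
    out += struct.pack("<I", len(facets))           # triangle count
    for normal, verts in facets:
        out += struct.pack("<3f", *normal)          # facet normal
        for v in verts:
            out += struct.pack("<3f", *v)           # three vertices
        out += struct.pack("<H", 0)                 # attribute byte count
    return bytes(out)
```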