Drawing SVG paths with the ZA6 robot arm

This example shows how to get the ZA6 robot arm to draw SVG paths with the help of the svgpathtools Python library, which provides a collection of tools for working with SVG paths.

The goal is to hand a script an SVG file and have it command the robot arm to trace the paths in that file.

The links below will take you to the videos that show the robot in action. I added some images to act as thumbnails. :smile:


https://photos.app.goo.gl/ukW7ZBhzKSNFpdPAA


https://photos.app.goo.gl/UE5WHc9uC15VSS1FA

To get started you will need some SVG test files; I have some ready for you.
The forum text editor wouldn’t let me upload SVG files, so I created shareable links for them instead. The images you see are just PNG previews that I uploaded (that’s what the forum text editor accepted) to show what’s in each SVG file; below each one is the link to the actual SVG file.

text.svg SVG file link

b.svg file link
If you plan on creating your own SVG files, I highly recommend Inkscape. Also, use a 600x600 px canvas in Inkscape or whatever software you use. This keeps the paths bounded so the robot arm won’t accidentally swing too far and harm you or anyone nearby.
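
Before sending anything to the robot, it can also help to check how big the drawing actually is. Here is a minimal sketch, using the same svgpathtools library as the main script, that prints the overall bounding box of an SVG so you can confirm it fits the 600x600 canvas and your safe working area (the file name and the 600 limit are just the examples from above):

from svgpathtools import svg2paths

paths, attributes = svg2paths('b.svg')  # any of the test files above works here

# Track the extremes across every path in the file
xmin, xmax = float('inf'), float('-inf')
ymin, ymax = float('inf'), float('-inf')
for path in paths:
    pxmin, pxmax, pymin, pymax = path.bbox()  # bounding box of this single path
    xmin, xmax = min(xmin, pxmin), max(xmax, pxmax)
    ymin, ymax = min(ymin, pymin), max(ymax, pymax)

print("width :", xmax - xmin)
print("height:", ymax - ymin)
# If the width or height is much larger than ~600, shrink the canvas in Inkscape
# or lower the `scale` value in the drawing script before running the robot.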

Once you have the file assets, let’s get into the code.
I might have gone overboard with the code comments, but they are mainly for those who aren’t familiar with Python or coding in general.

'''
SVG Drawer
This example shows how to get the ZA6 robot arm to draw SVG paths.
To get this to work we will be using svgpathtools library. It's a great collection of tools that make it easy to play with SVG files.

NOTE: For this project, make sure you set up a user frame that supports the tool you choose to use. Here is a link on how to get this set up: [https://www.youtube.com/watch?v=i_zQoZG7DYQ]

Libraries used:
svgpathtools: https://github.com/mathandy/svgpathtools
Reference:
Gerry W Jenkins has a great tutorial on how to play with SVG paths in Python: https://gjenkinsedu.com/post/drawing_py_svg_path_0001/
'''
from robot_command.rpl import *

try:
    # Import svgpathtools, which helps us read paths from an SVG file
    from svgpathtools import Path, Line, QuadraticBezier, CubicBezier, Arc, svg2paths

except ImportError:
    notify("Installing package")
    # If the library isn't installed yet, we install it. (NOTE: Make sure the computer has access to the internet)
    import sh
    with sh.sudo:
        sh.pip3.install("svgpathtools")
    from svgpathtools import Path, Line, QuadraticBezier, CubicBezier, Arc, svg2paths


set_units("mm", "deg")
fname = 'b.svg' 
paths, attributes = svg2paths(fname)
#Adjust path_steps or scale if something looks off
path_steps = 100 # Number of steps to take along each path. Increase it if paths look jagged or incomplete.
scale = 0.3  # Each path point is multiplied by this value. Adjust it if the figure comes out too small or too large.


c_paths = [] # We will store our paths here
for path in paths:
    path_points = []
    for i in range(0, path_steps+1):
        f = i/path_steps  # Remapping 0 - path_steps to 0.0 - 1.0
        complex_point = path.point(f)  # path.point(t) returns the point at 0.0 <= t <= 1.0
        path_points.append((complex_point.real*scale, complex_point.imag*scale)) # Add the extracted point to the path_points[] list
    c_paths.append(path_points) # Add extracted points in path_points list to c_paths[] list

safeHeight = -50
def start():
    movel(p[0, 0, safeHeight, 0, 0, 0]) # Before drawing, move up to the origin at a safe height
    '''
    Since each SVG path is sampled into many small moves, the robot arm would otherwise draw it in a jerky, incremental manner.
    To make the robot arm draw a path smoothly, we blend the moves between path points using the set_path_blending() function.
    '''
    set_path_blending(True, 0.0) # We enable path blending by passing in True for the first parameter. The next parameter is the blend radius between moves (for now it's 0.0)
    
    for c_path in c_paths: # For each path in c_paths
        # Create variables to hold the x, y coordinate points
        lx = 0
        ly = 0
        
        for pt in c_path: # for each point in c_path
            #Get points
            lx = pt[0]
            ly = pt[1]
            movel(p[lx, ly, 0, 0, 0, 0]) # Move pen to (lx, ly) point using the movel() function
        # After each path 
        movel(p[lx, ly, safeHeight, 0, 0, 0]) #At the last point of the path move up to a safe height
    sync() # To force the execution of queued move commands
    set_path_blending(False) # Disable path blending


def main():
    change_user_frame("svg_frame") # Just to make sure that the robot uses the right user frame (NOTE: Don't forget to set up your user frame, if you don't know how to here is a great video that will get you up to speed [https://www.youtube.com/watch?v=i_zQoZG7DYQ])
    start()
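
One tuning note: the second argument to set_path_blending() is the blend radius between moves, as described in the comment in the code. I left it at 0.0, but if the motion looks choppy on your setup it may be worth experimenting with a small non-zero radius, for example:

set_path_blending(True, 1.0)  # 1.0 mm is just an example value, not something I have tuned; adjust for your setup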

Sometimes you might want to preview the SVG path pattern using the same logic the robot arm will run, but without needing the robot arm itself. So I wrote a separate piece of code that you can run right away on your own computer before you transfer your files to the robot arm computer.

This code is similar to the one above; the only difference is that instead of the robot arm, we use the pygame Python library to create a canvas to draw our SVG paths on.

'''
SVG Drawer
This example shows how to draw SVG paths.
To get this to work we will be using svgpathtools library. It's a great collection of tools that make it easy to deal with SVG files.

NOTE: For this project, make sure you set up a user frame that supports the tool you choose to attach.

Libraries used:
svgpathtools (https://github.com/mathandy/svgpathtools)
pygame (https://www.pygame.org/docs/ref/draw.html)

Reference: 
Gerry W Jenkins has a great tutorial on how to play with SVG paths in Python: https://gjenkinsedu.com/post/drawing_py_svg_path_0001/
'''

import pygame
from svgpathtools import svg2paths  # Other tools you can import from svgpathtools: Path, Line, QuadraticBezier, CubicBezier, Arc

''' 
Step 1) Setting up svgpathtools to deal with SVG paths 
'''

paths, attributes = svg2paths('assets/mrT_text.svg') # Getting paths and attributes
path_steps = 100 # Number of steps to take in a path
scale = 1
c_paths = []

for path in paths:
    path_points = []
    for i in range(0, path_steps + 1): # loop from 0 through path_steps (inclusive)
        p = i / path_steps             # Remapping 0 - path_steps(100) to 0.0 - 1.0
        complex_point = path.point(p)  # path.point(p) returns the point at 0.0 <= p <= 1.0
        path_points.append((complex_point.real * scale, complex_point.imag * scale))    # Adding the extracted point to the path_points[] list
    c_paths.append(path_points)     # Adding collected path points to c_paths[] list

''' 
Step 2) Setting up pygame to display our paths 
'''

pygame.init()                                  # init pygame
surface = pygame.display.set_mode((600, 600))   # get surface to draw on (width = 600, height = 600)
surface.fill(pygame.Color('white'))            # set background to white

for c_path in c_paths: # for every path  in c_paths[] list
    pygame.draw.aalines(surface, pygame.Color('blue'), False, c_path)  # False means not closed (a closing line won't be drawn from the last point back to the first)
    pygame.display.update() # copy surface to display


while True:  # loop to wait till window closes
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            exit()

This project was fun to do, and there are a few things I plan to improve in the future: a better pen holder, and a camera hooked up directly to the robot arm computer.

  • For those that get a chance to try this project out: you will notice that without a good pen holder, the pressure between the robot arm and the paper is strong enough to punch holes in the paper. The wiggle of the pen also leads to imperfect lines. So making a pen holder with a nice soft touch on paper is on my to-do list.

  • It’s very tedious to move back and forth between Inkscape, the script, and the robot arm. So it would be nice to have a camera take images and feed them directly to the script, converting them into custom paths or SVG paths that the robot can follow (a rough sketch of this idea is just below).
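
I haven’t built that part yet, so the following is only a rough sketch of the idea, using OpenCV (the cv2 package) to pull contours out of a camera image and turn them into lists of (x, y) points in the same format the drawing loop above consumes. The OpenCV calls are real, but the camera index, threshold value, and scale factor are placeholders you would have to tune:

import cv2

# Grab one frame from a camera (index 0 is a placeholder; use your camera's index)
cam = cv2.VideoCapture(0)
ok, frame = cam.read()
cam.release()

# Threshold the image so the drawing stands out from the background
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
_, thresh = cv2.threshold(gray, 127, 255, cv2.THRESH_BINARY_INV)

# Each contour becomes one "path": a list of (x, y) points, like c_paths above (OpenCV 4 return signature)
contours, _ = cv2.findContours(thresh, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
c_paths = []
scale = 0.3  # same idea as the scale factor in the drawing script
for contour in contours:
    path_points = [(float(pt[0][0]) * scale, float(pt[0][1]) * scale) for pt in contour]
    if len(path_points) > 1:
        c_paths.append(path_points)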

If anyone beats me to these goals, please let me know; I would like to see your approach.


@Sadiq_Wanyaka WOW! This is really neat - thank you so much for posting the project and the example code.

If I understand your last post correctly, I think you are proposing a workflow like this:

  1. Take a photo
  2. Convert the jpg to svg
  3. Use the robot with a pen attachment to plot out the SVG on paper

All in an automated script? That’s really neat. I can’t wait to see it in action (video would be awesome :slight_smile: )

Thanks.
And yes, that’s the workflow I plan to use.

Hey @Sadiq_Wanyaka,

I just came across this - super cool project. A friend did something similar with an earlier-generation robot prototype in a shared space. We ended up using a spring-loaded pen. It was one of those click-to-open pens that has a built-in spring. We used a couple of set screws to hold the spring in the correct location, and relied on the rest of the pen mechanism to apply the appropriate pressure. It took a little fiddling the first time, but once it was dialed in, it worked great!

Here’s a little photo of the hack:

[photo of the spring-loaded pen hack]

-KS

Wow, that’s really neat. I had a similar idea when I was starting out, but then I dropped it thinking the spring in the pen was too small and short. If you guys got it to work, that gives me hope.

I have some questions about this.

  1. For the pen shell, did you slice it to expose the pen tip to make it easy to bounce?
  2. How did you handle the wiggle of the ink tube?

  1. The pen shell wasn’t modified - other than the 2 screws. I’ll attach a couple extra photos.
  2. The wiggle was no worse than when using the same style of pen by hand…at least that’s what I recall.

Typically, the spring is used to return the ink tube into the pen body. For our application, the spring was used to push the pen tube out. The ink tube has 2 formed pieces of plastic that rest against the spring. Typically, the spring is between the pen tip and the formed plastic pieces. We moved the spring to the other side of the ink tube, the full ‘silver tip’ of the pen is exposed in this orientation. We placed the silver tips such that the spring was applying a little pressure in the fully open position. When the robot arm put pressure on the pen tip, it would depress the spring just a little bit, and the pressure was maintained by the preloaded spring.


-KS


@Sadiq_Wanyaka great work! I checked with our dev team and they have updated the forums to allow SVG uploads. I downloaded your text.svg and am uploading it here:
text
Thanks.

Thanks a lot, I will also update this post to have them embedded directly.

One issue I noticed when I set up a user frame on the robot is that the frame has an incline along the y-axis.
Close to zero on the y-axis the height is normal, but as I move away from zero into positive y, the z height creeps up slightly. Because of this issue I have crashed a lot of pens. If I get this figured out, the idea works.

I used a mechanical pencil for this, and after trying it I immediately learned how fragile the pencil casing was :smile:. I definitely need a few things figured out:

  1. I need a pen that has similar qualities to the one you used.
  2. I need a proper gripper just like the one you have in the images you sent a while back.
  3. I need to come up with more ideas on how to get rid of the wobble in case I want to draw really straight lines.

@Sadiq_Wanyaka A couple of thoughts:

I happen to know that the robot you are using at Sector 67 may not be properly mastered (it used to live in my garage at home :slight_smile: ). Mastering is the process by which the angles of the joints are properly set so that software matches the physical hardware. It’s kind of like referencing a CNC machine or 3D printer when you move the machine to its reference or limit switches - it establishes a home position. With the ZA6, you can get close by aligning the physical notches in the joints, but when we master robots at Tormach we use a digital inclinometer and a touch probe to correctly and accurately master the robot. A robot that’s not mastered well will exhibit the behavior you describe. Remastering the robot would likely fix the Z height change with respect to Y.

Alternatively, you could create a user frame using the three- or four-point method (Tormach ZA6 Robot | Training a User Frame - YouTube). Done properly, you might still see a little Z drift (because of the mastering), but likely not so much that it would cause issues.


Ooh, that makes sense. I mainly use the 3-point method to create my user frame; next time I will experiment with the 4-point method.

I’m trying to use your code on my ZA6 and it won’t let me install the svgpathtools package. It gives me an error when I run it saying: “sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper”. Does anyone have any suggestions on how to fix this? I haven’t had any luck with downloading the package from terminal.

Hi @Alex_Janis, sorry in advance: I am currently at school and don’t have access to the ZA6, so I might not be of much help. I will be back to mess with the ZA6 again sometime next month.

It seems there are some permission issues and the system is expecting a password.
To test this theory, you could try running a sudo command in the terminal, for example:

sudo pip install svgpathtools

Also, since PathPilot runs in a Docker container, you should do this test inside that same container as well. Here is a link showing how to get into a Docker container: Docker Exec: How to Enter Into a Docker Container's Shell?
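
If it helps, the usual way in (assuming standard Docker tooling; the container name on your controller will differ, so the name below is just a placeholder) looks something like:

docker ps                                      # list running containers and their names
docker exec -it <container_name> /bin/bash     # open a shell inside that container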

There is a way to bypass that password prompt; here is a forum thread that has that discussion:
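
Another option, purely as a sketch (assuming the container user has a writable home directory), is to skip sudo entirely and install into the user site-packages from inside the script. You may need to restart the program afterwards for the import to pick the package up:

import sys
import subprocess

# --user installs into the current user's site-packages, so no sudo password is needed
subprocess.check_call([sys.executable, "-m", "pip", "install", "--user", "svgpathtools"])
from svgpathtools import svg2paths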

SIDE NOTE: The above “drawer” script is a basic way to get the ZA6 to draw stuff. I wrote a completely different program that also gets the ZA6 to draw, but this time you provide an image. You could also provide an SVG, but if the SVG is really large you start running into some issues. You can find this program here: example_robot_programs/trpl_examples/projects/image_path_extractor_pathpilot at main · tormach/example_robot_programs · GitHub
The README file in the project folder will tell you what to do to get things going. Your permission issue needs to be fixed first, though.

1 Like