Merging modern software development with electrons and metal

Robot Primer 12: Moving In Work Coordinates

While researching for this post, I reviewed the relevant Denso manual (the Setup Guide).  Although its dry text is no match for my scintillating style, I have to say it gives a good, illustrated explanation of the various coordinate systems.  I am not going to try to compete with it; instead, I will give my own summary with some videos and, in an effort to get your programming juices flowing, concentrate on why and where you might want to use these features.

I will be using a simulated robot recorded in WinCaps III simulation mode (kudos to Denso for providing a 90-day WinCaps III trial version, available to everyone).  I chose a 6-axis articulated robot because it can do motions that are impossible using a SCARA or Delta robot. I am using Denso in my examples, primarily because I can use the simulator and am familiar with their robots.

As I’ve noted before, the basics should apply to other robots, but the details will vary between robot controllers.  Of course, the robot type determines what poses the robot can do (for example, a 4-axis SCARA can only roll about the Z axis (Rz), not the X or Y axis).


Work Coordinates

Work coordinates are rectangular coordinates fixed relative to the base of the robot.  Work coordinate systems are defined relative to the Base coordinates by specifying:

  • the coordinate origin (X, Y, Z) defined in base coordinates
  • the angles of rotation (Rx, Ry, Rz) around the corresponding base coordinate axes (X, Y, Z); see the sketch after this list.
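
To make this concrete, here is a minimal Python sketch (my own illustration, not Denso code) that turns a work coordinate definition into a 4x4 homogeneous transform.  It assumes one common roll-pitch-yaw convention (the combined rotation is Rz * Ry * Rx); check your controller's manual for the exact convention it uses.

import numpy as np

def work_frame(x, y, z, rx, ry, rz):
    """Build the 4x4 transform for a work coordinate system defined in
    base coordinates by its origin (x, y, z) and rotations (rx, ry, rz),
    all angles in degrees.  Assumes the combined rotation Rz * Ry * Rx;
    verify the convention against your controller's manual."""
    rx, ry, rz = np.radians([rx, ry, rz])
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx      # combined rotation
    T[:3, 3] = [x, y, z]          # origin of the work frame in base coordinates
    return T

def work_to_base(T_work, point_in_work):
    """Convert a point expressed in work coordinates to base coordinates."""
    p = np.append(point_in_work, 1.0)   # homogeneous coordinates
    return (T_work @ p)[:3]

These two helpers come back in the fixture example further down.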

Base Coordinates

Base coordinates are work coordinates with the origin at the base of the robot.  In Denso terminology, the base coordinates are “3-dimensional Cartesian coordinates whose origin is at the center of the robot basement”.

Example Base And Work Coordinates

Base, Work Top View

I set up my robot workspace with a few objects:

  • Denso robot with my simple end effector
  • A table with two of my simple fixtures.  The second fixture is rotated 180 degrees from the first fixture.
  • Two work coordinate systems, Work1 for Fixture 1 and Work2 for Fixture 2.  When the robot is using the matching work coordinate system, the two fixtures’ positions are exactly the same.

The picture above shows Base coordinates (Work0), Work1, and Work2.  The lines show the direction of the positive axes (+X, +Y, and +Z).  The bottom window shows the definition of Work1 and Work2 relative to Base coordinates.  Note that Work2 has an Rz value of 180 degrees, so the directions of Work2’s X and Y axes are exactly opposite Work1’s.
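
Using the work_frame() and work_to_base() helpers from the sketch above, you can check the 180 degree case numerically.  The numbers below are made up for illustration (they are not my actual Work1/Work2 definitions), but they show the key point: the same point expressed in work coordinates lands on two different, mirrored locations in base coordinates, which is what lets you teach fixture positions once and reuse them for both fixtures.

# Hypothetical work coordinate definitions (mm and degrees), for illustration only.
T_work1 = work_frame(300, -200, 0, 0, 0, 0)      # Fixture 1 frame
T_work2 = work_frame(300,  200, 0, 0, 0, 180)    # Fixture 2 frame, rotated 180 degrees about Z

p_fixture = np.array([25.0, 10.0, 5.0])   # the same point on each fixture, in its own work frame

print(work_to_base(T_work1, p_fixture))   # approximately [325., -190., 5.]
print(work_to_base(T_work2, p_fixture))   # approximately [275., 190., 5.] (X and Y offsets reversed)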

Coordinate Axes and Angles of Rotation

Work1 Front View

Work2 Front View

The first picture above shows the Work1 coordinate axes (X, Y, Z) and angles of rotation (Rx, Ry, Rz), and the second picture shows the Work2 coordinate axes and angles of rotation.  The lines and arrows point in the direction of positive movement.

The Base coordinate axis directions (X, Y, Z) are the same as Work1’s axis directions, but the origin is different.  The angles of rotation are the same for both.

The coordinate axis directions are different between Work1 and Work2: because Work2 is rotated 180 degrees about the Z axis, its X and Y axes point in the opposite direction from Work1’s X and Y axes.

The angles of rotation define the attitude of the robot flange, and are also called yaw, pitch, and roll.  Their origin is always at the center of the robot flange surface (you can see that the origin is the same for both Work1 and Work2), but their directions follow the Work coordinate’s X, Y, and Z axes (so Work2’s Rx and Ry directions are reversed compared to Work1’s).  When you rotate about Rx, Ry, or Rz, the origin (the center of the flange surface) stays at the same X, Y, Z position, but the rest of the robot rotates around that axis.

Putting It All Together: Robot Movements in Base World Coordinates

My YouTube video shows some basic movements in Base World Coordinates, moving in CP (straight line) mode.  Try to match my descriptions above with what the robot is doing.  Making this video took a lot of time, so I hope it makes my prose a lot clearer.

What’s Next

More on World Coordinates, of course, including potential applications.

September 12, 2014

End Effector Notes: STL, VRML, and Colors

I’ve been having major problems importing my end effector into Denso WINCAPS III and maintaining my desired colors.

WINCAPS III can only import Direct-3D (*.X) or VRML Ver 2 (*.WRL) files.  On the other hand, most MCAD software won’t export VRML files.

I used DesignSpark Mechanical (DSM) to create my design.  DSM can export five 3D file formats: STL, OBJ, 3D PDF, SKP (SketchUp), and XAML.  I was frustrated trying to set the colors I wanted in DSM; help (including blogs and forums) is still very limited, and I couldn’t figure out how to change the color of imported STEP files.  I was able to get to this:

DesignSpark Colored End Effector

Since I chose to export to STL, the next step was to convert from STL to VRML using meshconv, but when I imported the resulting VRML file into WINCAPS III I got this:

Initial Result in WINCAPS III

Yuck!  All my color is gone, and my part is white hot and glowing purple.  I’m pretty sure part of the problem is that the WINCAPS simulator has a bright light, which as far as I can tell can’t be adjusted; when the part is rotated, the bright spots change.  But the major problem, which took me a while to figure out, is that STL files normally do not retain any color information.  After all, it’s not needed by most 3D printers, and STL was invented for 3D printers.
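
For the curious, you can see the problem right in the file layout.  A binary STL file is just an 80-byte header, a triangle count, and then 50 bytes per facet (a normal, three vertices, and a 2-byte attribute field); there is no standard place for color.  Here is a small Python sketch of mine (not part of any of the tools mentioned here) that dumps the facet count and the first facet, just to show how little else is in the file:

import struct

def binary_stl_summary(path):
    """Print the triangle count and first facet of a binary STL file.
    Layout: 80-byte header, uint32 triangle count, then per facet
    3 floats (normal) + 9 floats (vertices) + uint16 attribute = 50 bytes.
    There is no standard field for color anywhere in the format."""
    with open(path, "rb") as f:
        f.read(80)                                # free-form header, usually ignored
        (count,) = struct.unpack("<I", f.read(4))
        print(f"{count} triangles, {count * 50} bytes of facet data")
        if count:
            facet = struct.unpack("<12fH", f.read(50))
            print("normal:", facet[0:3], "attribute:", facet[12])

# binary_stl_summary("end_effector.stl")   # hypothetical file name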

I did a little more research on the DSM export formats.  I am using two conversion tools, meshconv (a command-line converter) and MeshLab (which includes a viewer and much more).  Of the five DSM 3D export formats, meshconv and MeshLab can only import STL and OBJ.  While OBJ can carry color information (via a companion MTL material file), it wasn’t retained when I tested exporting from DSM to OBJ and then importing into meshconv or MeshLab.

I tried using SketchUp.  I was able to color the parts with a bit of effort (see below for an example) and export to VRML using an add-on, but WINCAPS III didn’t like the resulting VRML file.

End Effector In SketchUp

So I ended up using MeshLab: I exported from DSM to an STL file, imported the STL file into MeshLab, colored it in MeshLab (pretty easy), exported from MeshLab to VRML, and finally imported the VRML file into WINCAPS III.  The colors in WINCAPS are quite different from MeshLab’s colors, but they’re much better than my first attempt.
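
If you want to sanity-check what colors actually made it into the exported .wrl before importing it into WINCAPS III, VRML 2.0 stores them as diffuseColor values (three 0-1 RGB numbers) inside Material nodes.  Here is a rough Python sketch of mine that just scans a .wrl file and lists them; it is a quick diagnostic, not a substitute for doing the coloring in MeshLab:

import re

def list_vrml_colors(path):
    """List every diffuseColor found in a VRML 2.0 (.wrl) file.
    diffuseColor is three floats between 0 and 1 inside a Material node."""
    text = open(path, encoding="utf-8", errors="replace").read()
    pattern = re.compile(r"diffuseColor\s+([-\d.eE+]+)\s+([-\d.eE+]+)\s+([-\d.eE+]+)")
    for i, match in enumerate(pattern.finditer(text), start=1):
        r, g, b = (float(v) for v in match.groups())
        print(f"material {i}: R={r:.2f} G={g:.2f} B={b:.2f}")

# list_vrml_colors("end_effector.wrl")   # hypothetical file name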

End Effector in MeshLab

The Final Result

March 25, 2014

Robot Primer 10: The Core Is Coordinates And Kinematics

In my last post, I talked about the development time advantage the robot’s integrated system brings.  However, I think the core robot advantage is coordinate points, transforms, and kinematics, which all go together.

After all, I can buy integrated non-robot systems ranging from pre-wired motors and drives to integrated motors (such as SmartMotors and MDrives) to integrated stages (like IAI’s stages).

However, a robot still offers much faster development because its controller works in real-world coordinates.

Terminology And Capabilities

I am using Denso Robotics’ terminology and capabilities as a rough basis for my posts, instead of continually saying “most robot controllers do X, some do Y, and a few do Z”.  Most robot controllers should have similar capabilities and equivalent terms.

Coordinates Again

A point in space is represented using a coordinate system, such as cartesian (XYZ or rectangular), spherical, or cylindrical.  Using the coordinate system that best fits the problem can really help when you’re doing physics or geometry, but in the robot world rectangular coordinates are the usual choice.

However, most controllers provide a choice of coordinate origins, including work (fixed relative to the robot base) and tool (fixed relative to the end of the robot’s arm).

The orientation of the end effector can be represented using systems such as Rx (roll about the X axis), Ry (roll about the Y axis), and Rz (roll about the Z axis) or roll, pitch, and yaw.

Kinematics Yet Again

The robot moves its joints, not coordinates.  Kinematics (the science of motion) and inverse kinematics are how the robot figures out how to move its joints to get to the desired coordinate position and orientation.

The robot controller knows the position of each of its joints (using encoder feedback) and knows their relationships (the length of each link, which joint is connected to which, etc.), so by doing a little fancy math it can always know where the end effector is in a particular coordinate system (the forward kinematics part) or figure out how to move the joints to get to a desired position (the inverse kinematics part).
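
To make that concrete, here is a minimal sketch for a two-joint planar arm (a toy version of the first two joints of a SCARA); it is my own illustration, not controller code, and it ignores joint limits beyond an elbow-up/elbow-down flag:

import math

L1, L2 = 200.0, 150.0   # link lengths in mm (made-up values)

def forward(theta1, theta2):
    """Joint angles (radians) -> end effector X, Y (forward kinematics)."""
    x = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
    y = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    return x, y

def inverse(x, y, elbow_up=True):
    """End effector X, Y -> joint angles (inverse kinematics).
    Raises ValueError if the point is out of reach."""
    cos_t2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    if not -1.0 <= cos_t2 <= 1.0:
        raise ValueError("point out of reach")
    theta2 = math.acos(cos_t2) if elbow_up else -math.acos(cos_t2)
    theta1 = math.atan2(y, x) - math.atan2(L2 * math.sin(theta2),
                                           L1 + L2 * math.cos(theta2))
    return theta1, theta2

A real 6-axis arm needs much more involved math, but the idea is the same: forward() is the kinematics direction, inverse() is the inverse kinematics direction, and the controller does both for you.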

Let’s look at a very simple example: suppose we want to lay down glue on the path shown below at a constant velocity from P1 to P4.

Example Robot Path

It’s pretty simple if you are using a cartesian robot.  For example, if you are using a Galil controller, the core code could be something like:

'Move to the absolute start position (0,0) and wait for the move to finish
PA 0,0
BG
AM
'Linear interpolation mode on axes A,B with vector speed, acceleration, and deceleration
LMAB
VS 10000
VA 50000
VD 50000
'The four straight segments of the path (relative moves)
LI 0,5000
LI 10000,0
LI 0,-5000
LI -10000,0
LE
'Begin the coordinated sequence and wait for the path to complete
BGS
AM

But suppose we’re using a SCARA robot.  Now it’s tough to use a normal motion controller, because the joints are rotary, so every time we move a joint to change the X position we also change the Y position (and vice versa).  To get straight lines, we have to move multiple joints at just the right relative speeds.
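
You can see what the controller has to do for one straight segment by using the toy two-link arm sketched earlier: sample points along the line, run inverse kinematics on each, and drive the joints through that sequence at coordinated speeds.  A rough sketch:

# Move in a straight line from (250, 0) to (250, 100) by converting
# evenly spaced points along the line into joint angles.
steps = 5
for i in range(steps + 1):
    x, y = 250.0, 100.0 * i / steps
    t1, t2 = inverse(x, y)
    print(f"x={x:6.1f} y={y:6.1f} -> "
          f"joint1={math.degrees(t1):7.2f} deg  joint2={math.degrees(t2):7.2f} deg")

Note how the joint angles change by different, non-constant amounts between equally spaced points on the line; that coordination is exactly what the robot controller handles for you.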

But it’s easy with a robot controller:

MOVE L, @E P[1 TO 4], S50

which moves the robot through positions P1, P2, P3, and P4 at 50% speed with square corners.

The bottom line: the robot controller makes using complex robots (such as articulated, SCARA, or delta) as easy as using a cartesian robot.

Coordinate Transforms

Coordinate transforms are very useful; here are a few examples:

  • Moving using the teach pendant in Tool mode (the robot has to do coordinate transforms between the Tool coordinates and its base coordinates)
  • Easy use of multiple end effectors, such as dual grippers and a camera.  For example, you can teach one location and then move any of the end effector tools over that location simply by changing the Tool mode.
  • Getting machine vision information into a form usable by the robot (calibrate the camera, get its positions, and then transform them into robot coordinates); a rough sketch of this follows below
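
To give a flavor of the machine vision case, here is a minimal sketch (my own, with made-up calibration numbers) of the 2D transform step: once calibration gives you the camera’s rotation, scale, and offset relative to the robot, every vision result can be mapped into a robot position.

import math

# Hypothetical calibration results: camera frame relative to the robot base.
CAM_ANGLE = math.radians(1.8)    # camera rotation relative to the robot X axis
CAM_SCALE = 0.052                # mm per camera pixel
CAM_OFFSET = (412.5, -87.3)      # camera origin in robot coordinates (mm)

def camera_to_robot(px, py):
    """Map a camera pixel coordinate to robot base X, Y in mm."""
    x_mm, y_mm = px * CAM_SCALE, py * CAM_SCALE          # pixels -> mm
    c, s = math.cos(CAM_ANGLE), math.sin(CAM_ANGLE)
    x_rot = c * x_mm - s * y_mm                          # rotate into the robot frame
    y_rot = s * x_mm + c * y_mm
    return x_rot + CAM_OFFSET[0], y_rot + CAM_OFFSET[1]  # translate to the robot base

print(camera_to_robot(640, 480))   # a part location reported by the vision system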

What’s Next?

I will dig deeper into coordinate systems, transforms, and their uses.


November 1, 2013

Robot Primer 1: Introduction

A Programmer’s Introduction To Industrial Robots

Since many books and articles on industrial robotics have already been written, why am I spending time writing this? Because I am frustrated by the existing material. The books are typically textbooks or academic works, going into the nuances of robot control theory or painting a broad overview. In the magazines and on-line, I’ve seen marketing white papers and application stories, which can be useful but don’t go into depth.

I’ve sub-titled this series A Programmer’s Introduction To Industrial Robots because I am writing from a software developer’s point of view and my goal is to give some idea of what industrial robots can do, not to write an authoritative text.  (I’m using Robot Primer for the title because it’s short but still gives the basic idea.)

My robotics experience has been using Adept and Denso robots for precision assembly and test. I don’t claim to be a robot expert, but I did have to do some uncommon operations and thus became familiar with Adept and Denso tech support. I had good experiences with both companies’ robots and technical support, and would use them again, but I’ve also heard good things about other vendors.

I view robots as an alternative to other automation options such as ball screw stages, pneumatic cylinders, and the like, not as a replacement for people. This view probably reflects my Silicon Valley background.

Since I am most familiar with Denso robots I might be a little biased towards them, but most of what I cover should be applicable to other manufacturers.

I am only going to cover traditional robotic arms with controllers (such as SCARA and articulated robots), not autonomous robots, mobile robots, etc.

The goal of this series is to provide a basic understanding of industrial robots, with an emphasis on the controller’s programming and capabilities, so that by the end you should have an idea of when a robot might be a good choice, and then can do more in-depth research on your own.

June 27, 2013