Since Work coordinates are transformed into Base coordinates in the end, why bother? Why not just use base coordinates? Here are some possible reasons; I’m sure creative programmers have come up with others.
- Using Work coordinates can be more natural.
- Using Work coordinates can save re-teaching points.
- Using dynamically updated Work coordinates makes complex situations, such as picking and placing from a moving conveyor, easy.
Next, I’ll look at some examples in more detail.
Using Work Coordinates To Save Re-Teaching
If we use base coordinates and the robot’s base coordinates change, then all the points have to be re-taught. However, if we’ve used Work coordinates, all we have to do is add the offset between the old and new base coordinates, and we’re done.
Some reasons why the robot’s base coordinates could change:
- Robot needs to be re-calibrated
- Robot needs to be replaced by another robot of the same or possibly different type
- Moving the whole base plate to a different robot cell
- Producing multiple identical cells, if base plate fabrication is precise enough
Let’s look at a simple example using a simple assembly robot.
Simple Assembly Cell
This imaginary work cell uses a robot to:
- Pick up the circular yellow bases from the top left pallet (defined by points P1 to P4)
- Place the base in the assembly fixture (P13) and add glue
- Pick up the blue part from the bottom left pallet (defined by P5 to P8) and place it onto the base.
- Move the completed part to the inspection fixture (P14)
- Finally move the inspected part to the output pallet (defined by P9 to P12).
The base plate is the big blue rectangle, and the base coordinates are represented by the even larger gold rectangle, with the coordinates for P1 shown (54.5 mm and 85.0 mm).
My example is simple and easy; there’s no need for additional complications such as work coordinates, right?
Work Coordinates to the Rescue
But now suppose the robot breaks down and is replaced by a new robot, with slightly different base coordinates (represented by the bold red rectangle).
As you can see, the position of P1 has shifted quite considerably on the base plate. So we will need to re-teach all 14 points.
No big deal, right? But now suppose the pallets and fixtures are interchangeable so we can assemble 10 different types of parts. Now we have to re-teach 140 positions: ouch!
Unless, of course, we used work coordinates – then all we have to do is change the offset so that the work coordinates for the new robot match the work coordinates for the old robot. Now a little bit of extra work pays off: we only have to teach 1 work coordinate instead of re-teaching 140 positions.
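To make this concrete, here’s a minimal sketch in plain Python (the frame numbers are made up, and I’m using 2D points to keep it short). The taught points, stored in work coordinates, never change; only the work-frame definition does:

```python
import math

def work_to_base(p, frame):
    """Transform a point (x, y) defined in work coordinates into
    base coordinates, given a work frame (ox, oy, rz_degrees)."""
    ox, oy, rz = frame
    a = math.radians(rz)
    x, y = p
    return (ox + x * math.cos(a) - y * math.sin(a),
            oy + x * math.sin(a) + y * math.cos(a))

# Taught once, in work coordinates -- these never change:
P1_work = (54.5, 85.0)

# Old robot: work frame measured relative to the old base (made-up numbers).
old_frame = (100.0, 200.0, 0.0)
# New robot: only this one definition is re-measured and re-taught.
new_frame = (103.2, 197.5, 1.5)

print(work_to_base(P1_work, old_frame))  # base coordinates for the old robot
print(work_to_base(P1_work, new_frame))  # base coordinates for the new robot
```

Re-teaching the single frame definition updates every taught point at once.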
I actually saw this situation; a robotic system I serviced had its robot re-calibrated, and the program used base coordinates, so all the points had to be re-taught.
January 24, 2015 1 Comment
While researching for this post, I reviewed the relevant Denso manual (the Setup Guide). Although its dry text is no match for my scintillating style, I have to say it gives a good, illustrated explanation of the various coordinate systems. I am not going to try to compete with it; instead, I will give my own summary with some videos and, in an effort to get your programming juices flowing, concentrate on why and where you might want to use these features.
I will be using a simulated robot recorded in WinCaps III simulation mode (kudos to Denso for providing a 90-day WinCaps III trial version, available to everyone). I chose a 6-axis articulated robot because it can do motions that are impossible using a SCARA or Delta robot. I am using Denso in my examples, primarily because I can use the simulator and am familiar with their robots.
As I’ve noted before, the basics should apply to other robots, but the details will vary for different robot controllers. Of course, the robot type determines what poses the robot can do (for example, a 4-axis SCARA can only roll about the Z axis (Rz), not the X or Y axis).
Note that you can click on the pictures to see a bigger version.
Work coordinates are rectangular coordinates fixed relative to the base of the robot. Work coordinate systems are defined relative to the Base coordinates by specifying:
- the coordinate origin (X, Y, Z) defined in base coordinates
- the angles of rotation (Rx, Ry, Rz) around the corresponding base coordinate axes
Base coordinates are work coordinates with the origin at the base of the robot. In Denso terminology, the base coordinates are “3-dimensional Cartesian coordinates whose origin is at the center of the robot basement”.
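Here’s a rough Python sketch of what the controller does with such a definition. The numbers are invented, and the rotation order (Rz·Ry·Rx) is my assumption; check your controller’s manual for its actual convention:

```python
import math

def rot(axis, deg):
    """3x3 rotation matrix for a rotation of `deg` degrees about one axis."""
    a = math.radians(deg)
    c, s = math.cos(a), math.sin(a)
    if axis == 'x':
        return [[1, 0, 0], [0, c, -s], [0, s, c]]
    if axis == 'y':
        return [[c, 0, s], [0, 1, 0], [-s, 0, c]]
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def work_to_base(p, origin, rx, ry, rz):
    """Map a point given in work coordinates into base coordinates."""
    # Assumed composition order: Rz * Ry * Rx -- controller conventions vary.
    R = matmul(rot('z', rz), matmul(rot('y', ry), rot('x', rx)))
    return tuple(origin[i] + sum(R[i][j] * p[j] for j in range(3))
                 for i in range(3))

# A frame like Work2 in the example: Rz = 180 degrees, so the work frame's
# X and Y axes point opposite to the base frame's.
print(work_to_base((10.0, 5.0, 0.0), (300.0, 0.0, 0.0), 0, 0, 180))
```

With Rz = 180 the X and Y components of the work-frame point are simply negated before the origin offset is added, which matches the flipped-axes picture.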
Example Base And Work Coordinates
I set up my robot work space with a few objects:
- Denso robot with my simple end effector
- A table with two of my simple fixtures. The second fixture is rotated 180 degrees from the first fixture.
- Two work coordinate systems, Work1 for Fixture 1 and Work2 for Fixture 2. When the robot is in the appropriate Work coordinates, the fixtures’ positions are exactly the same.
The picture, above, shows Base coordinates (Work0), Work1, and Work2. The lines show the direction of the positive axes (+X, +Y, and +Z). The bottom window shows the definition of Work1 and Work2 relative to Base coordinates. Note that Work2 has an Rz value of 180 degrees, and you can see that the directions of Work2’s X and Y axes are exactly opposite Work1’s.
Coordinate Axes and Angles of Rotation
The first picture above shows the Work1 coordinate axes (X, Y, Z) and angles of rotation (Rx, Ry, Rz), and the second picture shows the Work2 coordinate axes and angles of rotation. The lines and arrows point in the direction of positive movement.
The Base coordinate axis directions (X, Y, Z) are the same as Work1’s axis directions, but the origin is different.Â The angles of rotation are the same for both.
The coordinate axis directions are different between Work1 and Work2: because Work2 is rotated 180 degrees about the Z axis, its X and Y axes point in the opposite direction from Work1’s X and Y axes.
The angles of rotation define the attitude of the robot flange, and are also called yaw, pitch, and roll. Their origin is always at the center of the robot flange surface (you can see that the origin is the same for both Work1 and Work2), but the directions are the same as the Work coordinate’s X, Y, and Z axes (so Work2’s Rx and Ry directions are reversed compared to Work1’s). When you rotate along Rx, Ry, or Rz, the origin (the center of the flange surface) stays in the same X, Y, Z position, but the rest of the robot rotates around that axis.
Putting It All Together: Robot Movements in Base World Coordinates
My YouTube video shows some basic movements in Base World Coordinates, moving in CP (straight line) mode. Try to match my descriptions above with what the robot is doing: making this video took a lot of time, so I hope it helps make my prose a lot clearer.
More on World Coordinates to come, of course, including potential applications.
September 12, 2014 6 Comments
I’ve been having major problems importing my end effector into Denso WINCAPS III and maintaining my desired colors.
WINCAPS III can only import Direct-3D (*.X) or VRML Ver 2 (*.WRL) files. On the other hand, most MCAD software won’t export VRML files.
I used DesignSpark Mechanical (DSM) to create my design. DSM can export five 3D file formats: STL, OBJ, 3D PDF, SKP (SketchUp), and XAML. I was frustrated trying to set the colors I wanted in DSM; help (including blogs and forums) is still very limited, and I couldn’t figure out how to change the color of imported STEP files. I was able to get to this:
Since I chose to export to STL, the next step was to convert from STL to VRML using meshconv, but when I imported the resulting VRML file into WINCAPS III I got this:
Yuck! All my color is gone, and my part is white hot and glowing purple. I’m pretty sure part of the problem is that the WINCAPS simulator has a bright light, which as far as I can tell can’t be adjusted; when the part is rotated, the bright spots change. But the major problem, which took me a while to figure out, is that STL files normally do not retain any color information. After all, it’s not needed by most 3D printers, and STL was invented for 3D printers.
I did a little more research on the DSM export formats. I am using two conversion tools, meshconv (a command line converter) and MeshLab (which includes a viewer and much more). Of the 5 DSM 3D export formats, meshconv and MeshLab are only able to import STL and OBJ. While OBJ may be able to contain color information, it wasn’t retained when I tested exporting from DSM to OBJ and then importing in meshconv or MeshLab.
I tried using SketchUp. I was able to color the parts with a bit of effort (see below for an example), and export to VRML using an add-on, but WINCAPS III didn’t like the resulting VRML file.
So I ended up using MeshLab: I exported from DSM to an STL file, imported the STL file into MeshLab, colored it using MeshLab (pretty easy), exported from MeshLab to VRML, and finally imported the VRML file into WINCAPS III. The colors in WINCAPS are pretty different from MeshLab’s colors, but they’re much better than my first attempt.
March 25, 2014 No Comments
I created my end effector mostly using “Spaceclaim Light”, officially known as DesignSpark Mechanical (DSM). Since my goal was to create something I could use as quickly as possible, I have not spent the time to become an expert user.
The DesignSpark Mechanical Background
Electrocomponents (parent company of RS Components and Allied Electronics) offers a variety of free tools and other design resources (such as forums) on their DesignSpark website. The most impressive tools are:
- DesignSpark PCB, based on Easy-PC from Number One Systems
- DesignSpark Mechanical, a carefully cut-down version of SpaceClaim. DSM is a very powerful program, but lacks key features (such as useful import and export formats; assembly constraints also appear to be missing) needed to replace SpaceClaim, SolidWorks, SolidEdge, and such for hard-core mechanical design. On the other hand, with features such as IDF import, DSM appears to be a good match for creating 3D PCB designs.
Electrocomponents is betting their costs will be more than covered by increased component sales and much better awareness (in other words, I’d say the cost of DesignSpark.com is a much better use of marketing money than direct advertisements).
Creating parts in DSM reminds me of creating parts in SketchUp, except that SketchUp is really oriented towards architecture, while it’s clear DSM is meant for mechanical design. I like being able to easily input exact dimensions. It’s neat being able to push and pull 3D parts.
I didn’t have much difficulty creating my simple parts.Â The hardest was figuring out how to create the cones for the vacuum grippers (I created a triangular sketch, then revolved it 360 degrees around the center axis – it did take a few experiments to figure out the exact sequence of mouse clicks).
I’m still not a fan of the Microsoft ribbon interface.Â I don’t care for it in MS Office, and I don’t like it any better in DSM.
Well, my parts aren’t really assembled.Â I got so frustrated trying to assemble them I was tempted to go back to Alibre Design (now Geomagic Design), but since this isn’t a real design, I just moved them by eye until I was happy with the layout.
At least you can move components (groups of parts) by selecting the top level component; if you’re not careful, you’ll end up moving just a part of the component.
Since DSM is so new, there is very little community support, and the documentation is pretty skimpy. All I could find on making assemblies is this brief tutorial (see Section 3; I’m pretty sure all that’s happening is the part is moved, with no constraints) and in the FAQ (see How Do I Make Assembly Models?).
What I want to do is set assembly constraints such as making planes parallel, aligning axes, and such.Â From my searching, it appears SpaceClaim has an assembly constraints toolbar; I couldn’t find anything similar in DSM.
Getting The Result Into WinCAPS III
I will be using my model in the Denso WinCAPS III robot simulator, which can only import DirectX and VRML files. As is typical of most MCAD software, DSM does not export to VRML. However, it does export to STL, and fortunately there are a number of STL to VRML converters.
I used meshconv to convert to VRML; its documentation isn’t great, but it’s not too hard to use. For example, to convert fixture.stl to fixture.wrl I used this command line:
meshconv.exe fixture.stl -c wrl -o fixture -vrmlver 2
Based on my small project, I’d say that if you’re hoping for a free replacement for the professional MCAD programs, you’ll be disappointed in DesignSpark Mechanical. But if you’re looking for a SketchUp-style program oriented towards mechanical, especially electro-mechanical, design then check it out.
I may have to tweak my model a bit for different simulation situations; I think that will be pretty easy to do.
Sometime I need to write an update about the low cost MCAD market, since there have been a lot of changes, including the introduction of DSM and Autodesk buying Delcam (which may lead to changes with the free PowerShape-e MCAD software).
February 4, 2014 No Comments
I’ve put together a simple and unrealistic end effector that I will be using in the rest of this series to help illustrate my topics. It’s unrealistic because it can’t be manufactured as shown (for example, no pneumatic tubing).
Now for a quick look at what I’ve created:
The major parts are:
- Mounting to robot arm
- Smart Camera (a Microscan Vision MINI)
- Gripper with pneumatic suction cup (mounted on a Misumi MPPU10 Air Linear Guide)
- Second gripper mechanism
- My sample part
I chose to use vacuum grippers for simplicity. Using two grippers adds flexibility, since the robot can exchange parts at the fixture.
It’s a simple part; I made it non-symmetric so it’s easier to see the effects of certain robot sequences.
This fixture is a very simple place to put the part. Most likely, a real fixture would have a clamp and more connections, but this one is good enough for my planned demonstrations.
January 31, 2014 No Comments
In my last post, I talked about the development time advantage the robot’s integrated system brings. However, I think the core robot advantage is coordinate points, transforms, and kinematics, which all go together.
A robot also has much faster development time because it deals with real-world coordinates.
Terminology And Capabilities
I am using Denso Robotics’ terminology and capabilities as a rough basis for my posts, instead of continually saying “most robot controllers do X, some do Y, and a few do Z”. Most robot controllers should have similar capabilities and equivalent terms.
A point in space is represented using a coordinate system, such as cartesian (XYZ or rectangular), spherical, or cylindrical. Using the coordinate system that best fits the problem can really help when you’re doing physics or geometry, but in the robot world rectangular coordinates are the usual choice.
However, most controllers provide a choice of coordinate origins, including work (fixed relative to the robot base) and tool (fixed relative to the end of the robot’s arm).
The orientation of the end effector can be represented using Rx (rotation about the X axis), Ry (rotation about the Y axis), and Rz (rotation about the Z axis), also known as roll, pitch, and yaw.
Kinematics Yet Again
The robot moves its joints, not coordinates. Kinematics (the science of motion) and inverse kinematics are how the robot figures out how to move its joints to get to the desired coordinate position and orientation.
The robot controller knows the position of each of its joints (using encoder feedback) and knows their relationships (length of each joint segment, which joint is connected to which, etc), so by doing a little fancy math it can always know where the end effector is in a particular coordinate system (the kinematics part) or figure out how to move the joints to get to the desired position (the inverse kinematics part).
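For a feel of the math, here’s a toy 2-joint planar arm in Python (the link lengths are made up, and a real controller handles many more joints, singularities, and multiple solutions):

```python
import math

L1, L2 = 200.0, 150.0  # link lengths in mm (invented values)

def forward(theta1, theta2):
    """Joint angles (radians) -> end-effector (x, y): the kinematics part."""
    x = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
    y = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    return x, y

def inverse(x, y):
    """(x, y) -> joint angles: the inverse kinematics part (one solution)."""
    d2 = x * x + y * y
    cos_t2 = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    theta2 = math.acos(cos_t2)  # raises ValueError if the point is out of reach
    theta1 = math.atan2(y, x) - math.atan2(L2 * math.sin(theta2),
                                           L1 + L2 * math.cos(theta2))
    return theta1, theta2

t1, t2 = inverse(250.0, 100.0)
print(forward(t1, t2))  # lands back on (250.0, 100.0)
```

Note that there are two solutions for most reachable points (elbow up or down); real controllers let you pick the arm configuration.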
Let’s look at a very simple example: suppose we want to lay down glue on the path shown below at a constant velocity from P1 to P4.
It’s pretty simple if you are using a cartesian robot. For example, if you are using a Galil controller, the core code could be something like:
PA 0,0
BG
AM
LMAB
VS 10000
VA 50000
VD 50000
LI 0,5000
LI 10000,0
LI 0,-5000
LI -10000,0
LE
BGS
AM
But suppose we’re using a SCARA robot. Now it’s tough to use a normal motion controller, because the joints are rotary: every time we move a single joint, both the X and Y positions change. To get straight lines, we have to move multiple joints at just the right relative speeds.
But it’s easy with a robot controller:
MOVE L, @E P[1 TO 4], S50
which moves the robot through positions P1, P2, P3, and P4 at 50% speed with square corners.
The bottom line: the robot controller makes using complex robots (such as articulated, SCARA, or delta) as easy as using a cartesian robot.
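Under the hood, a CP move amounts to generating closely spaced waypoints along the straight segments and running each one through inverse kinematics. A rough Python sketch of the waypoint part (no corner blending or accel/decel ramps, which a real controller adds):

```python
import math

def cp_path(points, step=1.0):
    """Generate evenly spaced waypoints along straight segments through the
    taught points -- roughly what a CP move does in Cartesian space before
    inverse kinematics converts each waypoint to joint angles."""
    out = [points[0]]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dist = math.hypot(x1 - x0, y1 - y0)
        n = max(1, int(round(dist / step)))
        for k in range(1, n + 1):
            t = k / n
            out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out

# Four hypothetical taught points forming the glue rectangle:
path = cp_path([(0, 0), (0, 5), (10, 5), (10, 0)], step=1.0)
print(len(path), path[0], path[-1])
```

Feeding waypoints like these through inverse kinematics at a fixed rate is what produces the coordinated joint speeds that a plain motion controller makes you compute by hand.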
Coordinate transforms are very useful; here are a few examples:
- Moving using the teach pendant in Tool mode (the robot has to do coordinate transforms between the Tool coordinates and its base coordinates)
- Easy use of multiple end effectors, such as dual grippers and a camera. For example, you can teach one location and then move any of the end effector tools over that location simply by changing the Tool mode.
- Getting machine vision information into a form usable by the robot (calibrate camera, get its positions, and then transform them into robot coordinates)
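The last item deserves a sketch. If the camera is mounted so that a 2D similarity transform (scale, rotation, offset) is enough, with no lens distortion correction, a two-point calibration looks roughly like this in Python (all numbers are hypothetical):

```python
import math

def fit_similarity(cam_pts, rob_pts):
    """Fit scale, rotation, and offset mapping camera pixels to robot mm,
    from two reference points seen in both frames."""
    (cx1, cy1), (cx2, cy2) = cam_pts
    (rx1, ry1), (rx2, ry2) = rob_pts
    # Compare the vector between the two points in each frame.
    cvx, cvy = cx2 - cx1, cy2 - cy1
    rvx, rvy = rx2 - rx1, ry2 - ry1
    scale = math.hypot(rvx, rvy) / math.hypot(cvx, cvy)
    angle = math.atan2(rvy, rvx) - math.atan2(cvy, cvx)
    c, s = math.cos(angle), math.sin(angle)
    tx = rx1 - scale * (c * cx1 - s * cy1)
    ty = ry1 - scale * (s * cx1 + c * cy1)
    return scale, angle, tx, ty

def cam_to_robot(p, cal):
    """Transform a camera point into robot coordinates."""
    scale, angle, tx, ty = cal
    c, s = math.cos(angle), math.sin(angle)
    x, y = p
    return (scale * (c * x - s * y) + tx, scale * (s * x + c * y) + ty)

# Two fiducials taught on the robot and located by the camera (invented):
cal = fit_similarity([(120.0, 80.0), (620.0, 90.0)],
                     [(10.0, 200.0), (260.0, 195.0)])
pick = cam_to_robot((400.0, 300.0), cal)  # a part location reported by vision
```

Real vision systems usually fit more points (and correct distortion), but the idea is the same: calibrate once, then transform every camera result into robot coordinates.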
I will dig deeper into coordinate systems, transforms, and their uses.
November 1, 2013 No Comments
Most robots are integrated systems. Combined with robot controller features such as kinematics and teaching points, this makes it much faster to get a robot up and running.
When we were evaluating a Denso robot, I set it up in one day, from unpacking to running a simple demo. The task went like this:
- Unpack the robot and controller
- Place it on a solid bench and mount it (with help from our techs)
- Connect the robot power cable to the controller
- Connect air to the robot (for the Z axis)
- Connect the teach pendant to the controller
- Have our techs connect a simple end effector (it’s nice having a machine shop)
- Connect AC power to the controller
- Start the system
- Teach a few points using the teach pendant
- Create a simple move routine going through several points, using the teach pendant and searching through the manual for the appropriate commands
- Test it
Now when I’m setting up a motion controller, it starts with:
- Unpack the motion controller, motors, and stages
- Find the motion controller documentation
- Connect electrical power to the motion controller via the appropriate terminal blocks
- Find the motor documentation
- Use a break-out board to connect the motors to the motion controller
- Configure the motion controller for the motor, and try to spin the motor, verifying encoder, hall sensors, etc.
- Do some initial tuning.
- Connect the motor to the stage.
- Verify the limit sensors.
- Then I have to repeat steps 4-9 for all the other axes, and we still just have a bunch of unconnected stages.
Of course, if you’re using a robot controller with custom cartesian stages, setup time will be longer. And to be fair, many (most?) robot applications will take considerable programming time.
October 6, 2013 1 Comment
In this post, I will take a quick look at industrial robot types. I know there are more robot types (such as cylindrical and polar) and variations in each type, but these are the most common. The manufacturers’ web sites and other resources such as books and system integrators have more opinions about when to use what robot type.
Robot terminology can vary between manufacturers. I will use this post to define my terminology.
A Few Words About Coordinates and Planes
Since I will be using reference axes, the picture above shows my reference system. The X axis is toward the viewer, the Y axis is left to right, and the Z axis is down to up. The XY plane, highlighted, is formed by the X and Y axes.
Note that while three positions (such as X, Y, Z position in rectangular coordinates) uniquely define a point in space, it takes more to define the position and orientation of a real object in space. Robot controllers often borrow from aviation and use yaw (nose left/right), pitch (nose up/down), and roll (rotation about the principal axis) to define the robot end effector’s orientation.
The end effector is the tooling attached to the end of the robot’s arm. End effectors consist of whatever is needed to get the job done, such as vacuum cups, grippers, cameras, glue dispensers, and welding equipment.
The articulated robot is constructed from a series of interconnected rotary joints or axes, typically 4 to 6 in total. Its biggest advantage is flexibility; a 6-axis model should have full 6 DOF (degrees of freedom), and thus can approach a given XYZ point with any desired yaw, pitch, and roll (within the robot’s mechanical limitations). All those joints make it easy for the robot to reach around obstacles. A 5-axis articulated robot will still be more flexible in its moves than a 4-axis SCARA.
If you want to visualize this flexibility, consider performing automatic screwing at several locations on the surface of a sphere.Â The robot’s screwdriver needs to be perpendicular to the sphere’s surface at each location: a 6-axis articulated robot can do this, but a 4-axis SCARA can’t.
Like SCARA robots, articulated robots have a large work area and a small base. Very large articulated robots are available. They are often slower than SCARA or delta robots, and less rigid in the Z axis than SCARA robots.
I number the robot axes starting with the joint closest to the base as Axis 1 and work out from there. The picture above shows my numbering.
The SCARA robot has all three rotary axes in the XY plane, with only one axis that can move up and down. This configuration gives the robot more rigidity, or less compliance, along the Z axis, and thus the name: Selective Compliance Assembly (or Articulated) Robot Arm.
Like articulated robots, SCARA robots have a large work area and a small base.
Since SCARA robots are fast and rigid, they are often used for assembly (especially when downwards force is required), pick and place, dispensing, and palletizing.
The picture shows my axis numbering. The quill combines Axis 3 (Z, up/down) and Axis 4 (rotation about the quill’s own axis). The quill is typically hollow to allow passing cabling (pneumatic and electrical) through to the end effector.
Cartesian robots are made from a combination of linear stages, typically stacked.Â Using linear stages offers a potentially wide range of possible characteristics, including:
- Very heavy load capability, especially with gantry (parallel) stages.
- Very high acceleration and velocity, for example with linear motor based stages
- Very high precision, for example with stages using air bearings and flexures
- And more with options such as piezo motor stages and belt-driven stages.
If a cartesian robot makes sense, then there are several possible approaches. I’ll use Adept’s lineup as an example, since I’m familiar with it and they offer cartesian robots:
- Use an Adept robot controller with an Adept cartesian robot. This is by far the easiest approach, with very little integration work: basically plug the robot into the controller, and go.
- Use an Adept robot controller with Adept servo drives and motors and third-party stages. This will take considerably more integration, including defining the kinematics.
- Use an Adept robot controller with standard servo amps and motors and third-party stages. Again, this approach requires much more integration, including defining the kinematics, but provides the most flexibility.
- Or if the application doesn’t require the extra capabilities of a robot controller, use an appropriate motion controller. For example, the ACS SPiiPlus is oriented towards precision motion (think semiconductor) while the Schneider LMC-20 is targeted towards packaging. This will require substantially more integration work than using an integrated robot and controller package.
Given the variety of possible cartesian robots, it’s hard to give definite comparisons. In general, they are going to take up more area (larger base) than a SCARA or articulated robot, and will not have the flexibility of the 6-axis articulated robot.
The delta robot is a parallel robot with various arms joined together at the end effector.Â Typically it has 3 degrees of freedom (XYZ), although some models have an additional rotary axis.
The delta’s strong points are speed and stiffness: since the arms are very light, it can accelerate and move very quickly. The multiple connected arms add stiffness, but they also reduce the work envelope.
The delta robot is typically mounted above the work area.
Delta robots are popular for packaging and other high speed pick and place type operations.
September 28, 2013 2 Comments
The typical industrial robot is an integrated system consisting of:
- The robot arm. Common types include articulated, SCARA, cartesian, and delta.
- The robot controller, which typically includes the servo amplifiers, controllers, interfaces, and I/O.
- Teach pendant – for maintenance, teaching points, debugging, and limited development.
- Development software. OK, you can develop on the teach pendant, but for anything serious you need to use PC-based software.
The end effector is the equipment mounted to the end of the robot arm. Typically the system integrator develops a custom end effector for the specific application with devices such as suction cups, grippers, welder equipment, or cameras.
The robot needs to be supplied with electrical power and often compressed air. Very small robots can use 1 phase 120VAC or 240VAC power; however, most robots require 3 phase 240VAC or higher electrical power. Compressed air isn’t always required; some robots need it to balance their Z axis (to counteract gravity), and it’s often used by end effectors such as pneumatic grippers and pneumatic vacuum generators for suction cups.
Some small robots can be mounted upside down. The advantage is that the robot has a larger clear area; however, the mounting will be more difficult.
Another way to add flexibility is to mount the robot on rails so it can move from station to station.
August 26, 2013 No Comments
In this post, I take a quick look at how some common automation controllers handle motion. All of the controllers easily control pneumatics using digital I/O.
Traditional Ladder Logic (PLC and many PACs)
I’m grouping PLCs and PACs together because they are often quite similar (the definition of a PAC is nebulous; many PACs are simply PLCs based on x86 CPUs, but are still running ladder logic).
PLCs can easily handle pneumatics, although handling event sequences in ladder logic isn’t as straightforward as it is in programming languages such as BASIC, C, or C#.
Even many low end PLCs such as the Panasonic FP0R and Siemens S7-1200 support motion control via step and direction outputs, which can control stepper drivers or servo drives that accept step and direction. More capable motion control is available through dedicated modules, such as Panasonic’s FPG-PP11 module.
The PLCOpen TC2 Standard makes PLC motion control much better by adding a large number of standard motion control function blocks.
PLC motion programming varies. For example:
- The Panasonic FPG-PP approach requires setting up a block of memory with all the desired values, then copying it to a special memory location to start the operation.
- The PLCOpen approach is simpler: just set the values of the function block.
Simple Serial Controller
Simple serial or USB controllers are quite common; examples include Schneider (IMS) MDrive integrated stepper motors, Moog Animatics SmartMotors, AllMotion stepper and servo drives, and many others. These controllers are OK for simple tasks, but are quite limited in their capabilities (the best you’ll find is simple coordinated motion, gearing, and camming). Although the hardware is often cute, they are almost always a pain to program.
Here is an example AllMotion program that homes or steps through a sequence of 4 positions, based on the state of two inputs:
/1s0n2f1J1V150000L10000Z150000A5000V20000Z10000A10000z0J0H12gS12e0S11e1GR
/1s1J1V5000000A18000J0H11gS12e0S11e2GR
/1s2J1V5000000A36000J0H11gS12e0S11e3GR
/1s3J1V5000000A18000J0H11gS12e0S11e4GR
/1s4J1V5000000A0J0H11gS12e0S11e1GR
By the way, this protocol is common in certain industries and is used by many companies besides AllMotion.
A fieldbus drive is a servo amplifier or stepper drive integrated with a motion controller and a fieldbus interface. Sometimes the fieldbus drive is integrated onto the motor. There are way too many vendors to list; a short list would include AMC, Copley Controls, and Elmo Motion Control.
Standard real time fieldbuses such as CANOpen, EtherCAT, and Ethernet PowerLink support standard motion profiles including torque control, velocity control, homing, profile moves, and PVT (position-velocity-time) moves.
Using the raw motion profile is a bit tedious; moves are set up by writing to the appropriate objects in the drive’s object dictionary, and you have to deal directly with the protocol (CANOpen, etc). Copley Controls provides an easier to use interface for C/C++ (CML) or COM (older versions) or .NET (current version) with their CMO library. I’m surprised that very few other vendors provide comparable software.
Just sending commands to drives works fine if you’re doing basic motion. For more complex motions, you can either buy a hardware motion controller that uses fieldbus drives (from Parker, ACS, and others), buy a soft controller (from ACS or others), or write your own motion control software.
Here is a sample Python script that performs an absolute move using Copley CMO V2.18:
from win32com.client import Dispatch

# Open the CANopen network (Kvaser adapter, 500 kbit/s)
cmo = Dispatch('CMLCOM.CANOpenObj')
cmo.BitRate = 500000
cmo.PortName = 'kvaser0'
cmo.Initialize()

# Attach to the drive at node ID 9
drive = Dispatch('CMLCOM.AmpObj')
drive.Initialize(cmo, 9)

# Set the motion profile, then command an absolute move
profile = drive.ProfileSettings
profile.ProfileVel = 20000
profile.ProfileAccel = 100000
profile.ProfileDecel = 100000
drive.ProfileSettings = profile
drive.MoveAbs(100000)
Motion controllers can be software only, plug-in cards (ISA, PCI, PCIe, VME, etc), or stand-alone (serial, Ethernet, USB, Firewire, etc).
Typical motion control capabilities include coordinated motion, interpolated motion, camming, gearing, and triggers. Motion inputs include dedicated axis inputs such as limit sensors and encoder inputs (one or two per axis; two allows for separate position (e.g. linear encoder) and velocity (e.g. rotary encoder) feedback). Motion outputs include servo command (normally +/- 10V analog) and/or stepper command (step and direction) and/or digital fieldbus. Most controllers also have some general purpose I/O.
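Camming, for example, ties a slave axis to a master axis through a lookup table. A toy Python version of the idea (the table values are invented; real controllers run this in firmware, often with spline interpolation):

```python
import bisect

# A hypothetical cam table: (master position, slave position) pairs.
CAM = [(0, 0.0), (1000, 250.0), (2000, 600.0), (3000, 600.0), (4000, 0.0)]

def cam_slave(master, table=CAM):
    """Look up the slave position for a master position by linear
    interpolation between the surrounding cam table entries."""
    masters = [m for m, _ in table]
    i = bisect.bisect_right(masters, master) - 1
    i = max(0, min(i, len(table) - 2))  # clamp to the table's segments
    (m0, s0), (m1, s1) = table[i], table[i + 1]
    return s0 + (s1 - s0) * (master - m0) / (m1 - m0)

print(cam_slave(500))   # halfway up the first ramp
print(cam_slave(2500))  # on the dwell segment
```

Electronic gearing is the degenerate case: a cam table with a single straight segment, i.e. slave = ratio × master.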
Programming methods vary; the four main approaches are the simple command approach (used by Galil and many others; they call it “simple and intuitive”, which might be true if you’re only doing motion), using a BASIC variant (Aerotech, Baldor, and many others), ladder logic (especially IEC-61131 and PLCOpen), or using a PC programming language (the controller provides a library; this approach was popularized by MEI). Also, many controllers can use either the PC library approach or their proprietary language.
The boundaries can blur a bit; when is a controller running ladder logic a PLC or a motion controller? I’d say when it’s intended for general use (for example a Panasonic FP0R), it’s a PLC, and when it’s intended for motion control with special motion control features (such as a Schneider Lexium Motion Controller), it’s a motion controller. If it’s intended for both, maybe it’s a PAC (such as the Omron NJ series).
Here’s some sample Galil code that sets up and moves 3 axes:
SP 20000,20000,50000
AC 100000,200000,500000
DC 100000,200000,500000
PA 50000,100000,-25000
BG
AM
The typical motion controller falls short when it comes to dealing with motion on linked axes. I know a company that had a prototype SCARA robot, using an MEI controller, that was never sold because programming it would have been too difficult: if you wanted to move the end effector to point XYZT, you had to figure out the correct positions for each of the axes.
What you need for robots and other machines with mechanically linked mechanisms is inverse kinematics, which means determining how to move the mechanically connected mechanisms to get the end of the mechanisms where you want it to go.
In the past, pretty much only the dedicated robot controllers supported kinematics and inverse kinematics. Now, I’m happy to say, it’s a lot more common (my Robot Resources page has some links), especially in controllers targeted for packaging automation.
The PLCOpen standard has optional blocks for coordinated motion that include standard blocks for kinematic transformations. These transformations have to be supplied by the vendor, so if a PLCOpen motion controller doesn’t support a transformation, you can’t add it. Still, this is a big step forward.
I took a quick glance at the PLCOpen standard and a couple of kinematic controllers; my impression is that they do not yet replicate the capabilities of a dedicated robot controller.
From what I’ve seen, robot controllers still have the best motion capabilities but they may not be best in terms of communications, integration, or programming power.
Many robot controllers use languages based on either BASIC or VAL; for example, Denso uses a BASIC variant and Adept’s V+ is a VAL variant. The robot vendors have been adding more options, too, such as:
- PLC-based control such as Adept’s ePLC and Denso’s b-CAP.
- Higher level overlays and program generators such as Adept’s AIM and ACE.
- PC-based control such as Denso’s ORiN2 and LabView For Industrial Robots (currently supports Denso robots)
Robots almost always use brushless servo motors. Often, the controllers can also control a few extra servo motors, but not stepper motors. The controllers normally have some built-in digital I/O.
To kind of tie everything together, here’s a summary by motion technology:
- Pneumatics are easily controlled by all controllers.
- Stepper motors can be controlled by some PLCs, all simple serial stepper controllers, all fieldbus stepper drives, and most motion controllers.
- Servo motors can be controlled by some PLCs, all simple serial servo controllers, all fieldbus servo drives, most motion controllers, and most robot controllers.
- Robots can be easily controlled by kinematic-capable motion controllers and dedicated robot controllers.
July 31, 2013 2 Comments