Haptics in Manufacturing – Paint-Spraying Example
The Barrett WAM arm
Its combination of extraordinarily dexterous trajectory control with force/torque control made it the first robot arm capable of haptics in 1988, and no other robotic arm has matched the WAM’s haptic capabilities since.
In the paint-spraying example, the WAM is used to train paint-spraying trajectories. Simple trajectories can be generated offline on a computer with excellent results, but the challenges in paint spraying lie with the more difficult surface forms present in many modern products, ranging from appliances to bicycle frames to automobiles.
Consider the example of a truckload of 500 wheelbarrows. Today, the most cost-effective solution is to hire a tradesperson to spray each side of each wheelbarrow, one by one. Wealthier countries provide special rooms with high-capacity air filtration as well as protective breathing equipment. The tradesperson applies substantial experience (difficult to match in any expert computer system) to give each wheelbarrow the best coating.
The WAM alternative is to let the WAM arm and the tradesperson guide the spray gun collaboratively and simultaneously. First, though, the geometry of the wheelbarrow is gathered simply by sliding the end of the WAM across its surface. Once done, the tradesperson enters the coating type and sprayer model so that the WAM system can select and fix the best operating distance (too close and the paint will run; too distant and the paint will begin to dry before reaching the surface). The WAM then locks the tip of the sprayer onto the resulting virtual haptic surface, offset from the real surface and held perpendicular to it, thereby constraining three of the six possible degrees of freedom. The remaining three degrees of freedom, (1) up-down, (2) left-right, and (3) rotation, are chosen intuitively by the tradesperson, drawing on experience.
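The geometry of the standoff constraint can be sketched in a few lines. This is not Barrett's implementation, just a minimal illustration, assuming the scanned surface is available as points with normals (the function name and signature are hypothetical): the tip's motion along the surface normal is locked to the standoff distance, while motion tangent to the surface stays free for the tradesperson.

```python
import numpy as np

def constrain_tip(tip, surface_point, normal, standoff):
    """Slide the sprayer tip onto a virtual surface offset from the real
    surface by `standoff` along the surface normal.  Only the normal
    component of the tip position is corrected; tangential motion (the
    tradesperson's up-down / left-right freedom) is left untouched.
    Locking the tool axis along -normal would constrain the remaining
    two orientation DOF (not shown here)."""
    n = normal / np.linalg.norm(normal)
    target = surface_point + standoff * n     # point on the virtual offset surface
    normal_error = np.dot(tip - target, n) * n
    return tip - normal_error                 # tip projected onto the offset surface

# Example: flat surface at the origin with +z normal, 0.25 m standoff
tip = np.array([0.10, -0.05, 0.40])
locked = constrain_tip(tip, np.zeros(3), np.array([0.0, 0.0, 1.0]), 0.25)
```

In a real controller this projection would run inside the haptic servo loop, rendering a restoring force proportional to `normal_error` rather than teleporting the tip.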
Once the tradesperson accepts a trajectory as suitable for the remaining wheelbarrows, the rest are sent in by conveyor, one at a time. Each new wheelbarrow is registered either with alignment jigs or by lightly touching points on its real surface, after which the haptic surface is rotated into alignment automatically.
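Re-posing the haptic surface from a few touched points is a classic rigid-registration problem. As a hedged sketch (not Barrett's code; the function name is an assumption), a Kabsch-style fit recovers the rotation and translation mapping the reference touch points from the taught wheelbarrow onto the points touched on the newly arrived part:

```python
import numpy as np

def rigid_align(reference, touched):
    """Find rotation R and translation t such that R @ ref_i + t ~ touched_i,
    using the SVD-based Kabsch method.  The same (R, t) can then re-pose the
    stored haptic surface and spray trajectory without re-teaching."""
    rc, tc = reference.mean(axis=0), touched.mean(axis=0)
    H = (reference - rc).T @ (touched - tc)       # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tc - R @ rc
    return R, t

# Example: the new part sits rotated 90 degrees about z and shifted on the conveyor
ref = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
Rz = np.array([[0.0, -1, 0], [1, 0, 0], [0, 0, 1]])
new = ref @ Rz.T + np.array([0.5, 0.2, 0.0])
R, t = rigid_align(ref, new)
```

At least three non-collinear touch points are needed to pin down the pose uniquely; more points average out touch noise.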
One special aspect of this technique is that the level of haptic forces never exceeds the capability of the tradesperson. The tradesperson therefore need not break intuitive focus to override the robot when necessary for special boundary conditions, high-curvature regions, odd topologies, or concavities.
Teach & Play in Manufacturing – Simple Concept, Elusive to Achieve
The value of Teach & Play is so obvious to anyone unfamiliar with robotics that the general lack (or poor implementation) of this capability in conventional robots is shocking. However, the WAM™ arm’s nearly magical combination of high backdrivability and redundant kinematics enables the intuitive notion of teaching a robot trajectory hands-on for later playback.
Unfortunately, effective Teach & Play has eluded controls engineers for decades. The secret is in understanding a fundamental hardware limit: you cannot control away the crippling effects of friction in robotic joints, regardless of the quality of joint-torque sensing. Rather, a robot design must begin with robotic joints that have virtually zero friction. This revelation (part of the PhD thesis of Barrett’s founder at MIT) is far from obvious, and attempts to mask the friction effects with active torque sensing tend to increase an arm’s perceived inertia which makes the Teach part slow, fatiguing, and fundamentally less safe.
Effective Teach & Play opens a world of applications where a tradesperson can teach the robot arm directly, eliminating a programmer’s attempts at encoding the tradesperson’s intuition – a task which is never fully realizable.
With Teach & Play, a tradesperson guides the end tip of a WAM with one hand, controlling the pose with the other as needed. The playback trajectory or velocity profile can be edited later, or portions of the path re-taught. A physical offset tool may be attached at the end of the WAM if an offset distance is desired during Teach. The user then slides the WAM along a surface, an edge, or other geometric features.
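Editing the velocity profile after teaching typically amounts to re-timing the recorded waypoints. As a minimal sketch (the function name and constant-speed policy are assumptions, not Barrett's published method), a taught path can be resampled by arc length so playback runs at a chosen tool speed regardless of how fast the tradesperson moved while teaching:

```python
import numpy as np

def retime_constant_speed(waypoints, speed, dt):
    """Resample a hand-taught path (N x 3 tip positions) to samples spaced
    `speed * dt` apart along the path, giving constant-speed playback."""
    seg = np.linalg.norm(np.diff(waypoints, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])       # arc length at each waypoint
    s_new = np.arange(0.0, s[-1], speed * dt)         # uniform arc-length stations
    return np.column_stack(
        [np.interp(s_new, s, waypoints[:, k]) for k in range(3)]
    )

# Example: unevenly taught straight-line path, replayed at 1 m/s, 0.5 s timestep
taught = np.array([[0.0, 0, 0], [0.5, 0, 0], [2.0, 0, 0]])
replay = retime_constant_speed(taught, speed=1.0, dt=0.5)
```

The same arc-length parameterization makes it easy to splice in a re-taught segment: replace the waypoints over that arc-length interval and resample again.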
With haptics, the user can even create edges for sliding along that did not exist before. For example, a vertical plane can intersect a large, mostly horizontal automotive body-panel die used for heat-treatment hardening. The user slides along the virtual edge generated at the intersection of the real die surface and the virtual vertical plane to produce a path. The vertical path may then be indexed any number of times across the die surface.
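The indexing step is simple post-processing of the one taught pass. As an illustrative sketch (function name and parameters are hypothetical), each subsequent pass is the taught path shifted by a fixed pitch along an indexing direction across the die:

```python
import numpy as np

def index_paths(base_path, direction, pitch, count):
    """Replicate one taught pass across the die: shift `base_path`
    (N x 3 points) by `pitch` along the unit `direction` for each of
    `count` passes, returning the list of shifted paths."""
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    base = np.asarray(base_path, dtype=float)
    return [base + i * pitch * d for i in range(count)]

# Example: one vertical pass, indexed across the die in x at 0.1 m pitch
base = np.array([[0.0, 0, 0], [0.0, 1, 0]])
passes = index_paths(base, direction=[1, 0, 0], pitch=0.1, count=3)
```

In practice each shifted pass would still be projected back onto the real die surface by the haptic constraint, since the die is only mostly horizontal.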
In this example, the entire path-creation process can be fully automated. However, the deeper value lies in the human’s capability to handle exceptions intuitively. The WAM’s maximum force is designed never to exceed human capability, so the tradesperson need never press a button to override the robot; overriding happens naturally, as required, without distraction from the task.
Finally, if the taught path will remain static over a production run, then many capable trajectory-controlled robots are available to repeat the playbacks.
In yesterday’s factory, robots are anchor-bolted to fixed floor locations, entrenched behind safety fences. Much of yesterday’s factory is organized around getting work-in-progress to and from these robot workcells via systems of conveyors, bowl feeders, etc. The name of the game is keeping these workcells fed and busy. Downtime is avoided at any cost, since exchanging one of these anchor-bolted robots would take days and great expense.
Now consider an improved factory where robotic manipulators on wheeled platforms guided by machine vision are free to move among islands of flexible manufacturing. Robots are deployed and frequently re-deployed from place to place according to dynamic load demands. If customers are ordering more of a particular blend of product this week, then the robots simply migrate to meet that peak demand. Conventional anchor-bolted robots are still part of this future factory, for example, to handle dangerously heavy payloads; but the lighter mobile manipulators (safe enough to move about the factory) balance the output and add robustness in case a workstation suffers unexpected downtime.
Mobile manipulators are battery powered when moving between workstations. Precision docking features such as tapered jig pins on the mobile platforms can lock into matching features at the various stations around the islands of flexible manufacturing. Then machine vision leverages the ultra-dexterity of these robots to perform vision-guided tasks such as 3D bin picking to obtain materiel.