DESIGN WORLD - ROBOTICS HANDBOOK 2020


November 2020

www.therobotreport.com



REACH FOR HIGHER QUALITY

Linear Motion Guides • Cross Roller Rings

World-class THK components for robotic applications. THK's leading linear motion products are now available in pre-assembled robot components. Our robot arm components feature a human-hand-shaped three-finger linkage system, allowing for grasping of objects of various sizes and shapes. An all-in-one, space-saving design integrates the fingers, actuator, and driver controller. THK offers a complete line of linear motion products for the robotics industry.

To learn more, call us at 1-800-763-5459 or visit www.thk.com. See us at Semicon West, July 9–11, 2019, Booth #S–427.

Sponsor of MVPvets

MVPvets assists veterans with meaningful employment in life science companies.


Model: TRX-S/L Grippers

Human-hand-shaped three-finger linkage system

Made in the USA (Hebron, Ohio)



FHA-C Mini Actuator with Integrated Servo Drive

The Servo Drive is Inside!

The FHA-C Mini Series is a family of extremely compact actuators that deliver high torque with exceptional accuracy and repeatability. An integrated servo drive version utilizing CANopen® communication is now available as part of the FHA-C Mini family. This evolutionary product eliminates the need for an external drive and greatly simplifies wiring while retaining high positional accuracy and torsional stiffness in a compact housing.

• Actuator + integrated servo drive utilizing CANopen communication
• 24 VDC nominal, 7–28 VDC supply voltage range
• Single cable with only 4 conductors needed: CANH, CANL, +24VDC, 0VDC
• Zero backlash
• Dual absolute encoders
• Panel-mount connectors with 4 exit options
• Output sensing encoder: 14-bit (16,384 cpr) resolution
• Input sensing encoder: 15-bit (32,768 cpr) resolution
• Control modes including torque, velocity, and position control, plus CSP, CSV, and CST
• Harmonic Drive HDL software
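For a rough sense of what those encoder resolutions mean in practice, here is a small Python sketch (not vendor code; `counts_to_degrees` is a hypothetical helper) converting an absolute encoder reading to an output angle:

```python
def counts_to_degrees(counts: int, bits: int) -> float:
    """Convert an absolute encoder reading to an angle in degrees.

    A 14-bit encoder (16384 counts/rev, as stated for the output
    sensing encoder) resolves 360/16384, or about 0.022 deg per count.
    """
    counts_per_rev = 2 ** bits
    return 360.0 * counts / counts_per_rev

# The stated resolutions check out:
assert counts_to_degrees(16384, 14) == 360.0   # full revolution, 14-bit output encoder
assert counts_to_degrees(32768, 15) == 360.0   # full revolution, 15-bit input encoder
print(round(counts_to_degrees(4096, 14), 1))   # quarter turn -> 90.0
```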

42 Dunham Ridge, Beverly, MA 01915 | 800.921.3332 | www.HarmonicDrive.net Harmonic Drive is a registered trademark of Harmonic Drive LLC. CANopen is a registered trademark of CAN in Automation.



Contents

2020 • therobotreport.com

ON THE COVER: Courtesy Toyota Research Institute

DESIGN & DEVELOPMENT

06 _ Toyota Research Institute demonstrates household robot prototypes

14 _ Testing a PR2 in a simulated hospital world

COBOTS

48 _ eKAMI and READY Robotics train former coal miners to program multiple robots

52 _ MT Solar automates small-batch welding with UR10e, Vectis cobot tool

MOBILE ROBOTS

18 _ 3 AMRs help ICM handle demand for protective gear

24 _ Mini antenna enables robots to team up in complex environments

28 _ How to get started with designing a cost-effective UGV for security and surveillance

SOFTWARE

34 _ Future connectivity requirements

40 _ Where hybrid robotics is heading

46 _ How to develop software for safety in medical robotics

MOTION CONTROL

56 _ Interdisciplinary approaches are essential to drive systems, robotics success

60 _ How Realtime Robotics is helping robots avoid collisions

VISION

64 _ How Siemens automated maritime battery production

68 _ Eliminating bias from visual datasets used to train AI models

MANIPULATION

72 _ Hello Robot’s Stretch aims to democratize mobile manipulation

76 _ Visual transfer learning helps robots manipulate objects



PUT DIRTY WORK INTO HANDS THAT NEVER GET TIRED.

Free your workers to do more productive jobs, while improving your system’s performance. Find the best ways to automate tedious tasks at www.intelligrated.com/ready.

© 2020 Honeywell Intelligrated. All rights reserved.



DESIGN WORLD

Follow the whole team on twitter @DesignWorld

EDITORIAL

VP, Editorial Director Paul J. Heney pheney@wtwhmedia.com @wtwh_paulheney
Senior Contributing Editor Leslie Langnau llangnau@wtwhmedia.com @dw_3dprinting
Executive Editor Leland Teschler lteschler@wtwhmedia.com @dw_leeteschler
Executive Editor Lisa Eitel leitel@wtwhmedia.com @dw_lisaeitel
Senior Editor Miles Budimir mbudimir@wtwhmedia.com @dw_motion
Senior Editor Mary Gannon mgannon@wtwhmedia.com @dw_marygannon
Associate Editor Mike Santora msantora@wtwhmedia.com @dw_mikesantora
VP, Robotics and Intelligent Systems Dan Kara dkara@wtwhmedia.com @RobotReportKara
Editor Steve Crowe scrowe@wtwhmedia.com @SteveCrowe
Senior Editor Eugene Demaitre edemaitre@wtwhmedia.com @GeneD5

CREATIVE SERVICES

VP, Creative Services Mark Rook mrook@wtwhmedia.com @wtwh_graphics
Art Director Matthew Claney mclaney@wtwhmedia.com @wtwh_designer
Graphic Designer Allison Washko awashko@wtwhmedia.com @wtwh_allison
Graphic Designer Mariel Evans mevans@wtwhmedia.com @wtwh_mariel

DIGITAL MARKETING

VP, Digital Marketing Virginia Goulding vgoulding@wtwhmedia.com @wtwh_virginia
Digital Marketing Specialist Sean Kwiatkowski skwiatkowski@wtwhmedia.com
Webinars/Virtual Events Senior Manager Lisa Rosen lrosen@wtwhmedia.com
Webinar Coordinator Halle Kirsh hkirsh@wtwhmedia.com
Webinar Coordinator Kim Dorsey kdorsey@wtwhmedia.com
Director, Audience Development Bruce Sprague bsprague@wtwhmedia.com

WEB DEV/DIGITAL OPERATIONS

Web Development Manager B. David Miyares dmiyares@wtwhmedia.com @wtwh_webdave
Senior Digital Media Manager Patrick Curran pcurran@wtwhmedia.com @wtwhseopatrick
Front End Developer Melissa Annand mannand@wtwhmedia.com
Software Engineer David Bozentka dbozentka@wtwhmedia.com

PRODUCTION SERVICES

Customer Service Manager Stephanie Hulett shulett@wtwhmedia.com
Customer Service Representative Tracy Powers tpowers@wtwhmedia.com
Customer Service Representative JoAnn Martin jmartin@wtwhmedia.com
Customer Service Representative Renee Massey-Linston renee@wtwhmedia.com
Digital Production Manager Reggie Hall rhall@wtwhmedia.com
Digital Production Marketing Designer Samantha King sking@wtwhmedia.com
Digital Production Specialist Elise Ondak eondak@wtwhmedia.com

EVENTS

Events Manager Jen Osborne jkolasky@wtwhmedia.com @wtwh_jen
Event Marketing Specialist Olivia Zemanek ozemanek@wtwhmedia.com

VIDEO SERVICES

Video Manager Bradley Voyten bvoyten@wtwhmedia.com @bv10wtwh
Videographer Derek Little dlittle@wtwhmedia.com @wtwh_derek

FINANCE

Controller Brian Korsberg bkorsberg@wtwhmedia.com
Accounts Receivable Specialist Jamila Milton jmilton@wtwhmedia.com


WTWH Media, LLC 1111 Superior Ave., Suite 2600, Cleveland, OH 44114 Ph: 888.543.2447 | FAX: 888.543.2447





Design & Development


Toyota Research Institute demonstrates household robot prototypes

Virtual media tour shows improving manipulation, machine learning with simulation, and a gantry kitchen robot.

By Eugene Demaitre • Senior Editor, The Robot Report

While most automakers have focused on robotics for manufacturing or on developing autonomous vehicles, Toyota Motor Corp. has also been investigating service robots for household use. Last month, Toyota Research Institute conducted a virtual open house of its research and development facilities in Los Altos, Calif., and Cambridge, Mass.

Toyota Research Institute (TRI) demonstrated how robots can learn and conduct tasks such as wiping down surfaces, picking up varied items and loading a dishwasher, and operating in dynamic environments designed for humans rather than robots. TRI also showed manipulation research conducted with both physical robots and simulation, as well as a mockup home built inside its California laboratory.

TRI's assistive robots are roughly humanoid, with two arms, a wheeled base, and multiple degrees of freedom. The company is also developing algorithms, grippers, and other service robots with partners.

Toyota Research Institute invests in human amplification

Toyota is investing in robotics to amplify human capabilities as a logical extension of how the car amplifies mobility, said Gill Pratt, CEO of Toyota Research Institute. TRI's research into robotics, assisted driving, accelerated materials design and discovery, and machine-assisted cognition is guided by the concept of "Ikigai," which means that each person's life should have purpose, he said.

"Studies of Ikigai teach us that we feel most fulfilled when our lives incorporate work that we love and that helps society," said Pratt. "To enable more people to achieve their Ikigai, TRI is pursuing new forms of 'automation with a human touch,' known as 'Jidoka' in the Toyota Production System, to develop capabilities that amplify rather than replace human ability."



Toyota Research Institute is testing mobile service robots. Source: TRI

The global population over the age of 65 is expected to double by 2050, according to the United Nations. As the populations of developed nations such as Japan age, the need for robot-assisted care will only grow, said Steffi Paepcke, senior user experience designer at TRI.

"TRI robotics research is focused on the home because it is in that environment that robots can provide the greatest assistance in achieving human fulfillment," said Max Bajracharya, vice president of robotics at TRI. "It is also one of the most complex environments for robots to master."

Robots to learn from simulation, fleets

"Getting large amounts of data is not really practical," Bajracharya said. "We're working on how to learn from less data."

"Our work is focused on two key challenges: teaching robots from human behavior and using simulation to both train and validate robot behaviors," he explained. "We think


of this idea as fleet learning, where when one machine learns something, they all learn something. We believe this is going to be the key to making robots in human environments practical.” TRI instructs robots with a telepresence and virtual reality system, in which a staffer models a task for a robot. The combination of human instruction and the ability to run numerous robots in parallel simulations should make it easier for robots to learn and share how to handle new objects, said Russ Tedrake, vice president of robotics research at TRI. “We’ve used our dish-loading robot and clutter-clearing experiments to automatically improve our behaviors in simulation and have that result in improved performance on the real robots,” he said.
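The fleet-learning idea can be sketched in a few lines of Python. This is purely illustrative (TRI has not published this interface; `FleetModel` and its methods are hypothetical): each robot, physical or simulated, logs its grasp outcomes locally, and a shared model pools everyone's data so that what one robot learns benefits the whole fleet.

```python
# Toy sketch of "fleet learning": pool per-robot grasp outcomes into one
# shared model. All names are illustrative, not TRI's actual system.
from collections import defaultdict

class FleetModel:
    def __init__(self):
        # per-object [successes, attempts], pooled across the fleet
        self.stats = defaultdict(lambda: [0, 0])

    def merge(self, local_log):
        """Fold one robot's grasp log (object, success) into the shared model."""
        for obj, success in local_log:
            s = self.stats[obj]
            s[0] += int(success)
            s[1] += 1

    def success_rate(self, obj):
        s, n = self.stats[obj]
        return s / n if n else None

fleet = FleetModel()
fleet.merge([("mug", True), ("mug", False)])   # robot A, physical trials
fleet.merge([("mug", True), ("plate", True)])  # robot B, simulated trials
print(fleet.success_rate("mug"))               # 2 successes over 3 attempts
```

Real systems would merge learned policies or model weights rather than simple counts, but the principle is the same: one robot's experience updates a model that every robot reads from.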




VR is used to instruct robots. Source: TRI

Household robots must learn to handle varied objects. Source: TRI

TRI has already learned some valuable lessons about manipulation, Tedrake added. He demonstrated a soft bubble gripper similar to human finger pads that includes cameras inside to record how different objects deform the bubbles. The company is also working on how to build trust with user engagement through a "principled" human-robot interaction (HRI) approach, said Paepcke.

TRI robot hangs out in the kitchen

Another new piece of hardware that Toyota Research Institute showed at 4x speed was its gantry kitchen robot, similar to Miso Robotics Inc.'s Flippy Robot-on-a-Rail system. Rather than

Robots are learning tasks from humans and simulation. Source: TRI


have a mobile robot take up precious floor space in a domestic kitchen, TRI is experimenting with a robotic assistant hanging from the ceiling.

"We rely heavily on observational research techniques such as contextual inquiries," said Paepcke. "Before COVID-19, we went to Japan to work with our research partners to visit the homes of older adults and observe them going about their daily lives, making note of friction points, challenges, and opportunities."

"We observed that cooking is a beloved activity for many, though it can get more strenuous over time," she said. "Sharing meals and feeding loved ones also can serve as a focal point for social connection, so giving elderly people a fully automated cooking robot or pre-cooked meals might be physically beneficial but emotionally detrimental."

In the case of the gantry robot, there is a tradeoff between changing the environment and providing assistance, Bajracharya said. Both reliability and cost need to be considered as early as possible in its design process, added Pratt.

"We're not just aiming at people over 65 but [also] their children and grandchildren, who will be getting these robots for them," Pratt said. TRI is thinking of individual early adopters and health insurers, as well as builders, which could incorporate robots such as the kitchen gantry into their designs, he said.

TRI looks to the future

TRI said its goal is for robots to enable people to "age in place," prolonging their





Your worldwide source for high-performance tapes, films, fabrics, and silicone

Improving automation design with performance polymers

Acetal POM • Delrin® • Nylon 6/6 • PEEK® • PTFE • Polycarbonate • UHMW
With adhesive backing

Industry solutions:
• Surface protection for end-of-arm tooling (EOAT)
• Custom wear strips or sheets for high-impact areas
• Friction-reducing washers for fastening motion components
• Non-stick polymer solutions for reducing friction on conveyor guide rails or gantry arm systems

Need custom cut parts? Send us a drawing!
800-461-4161 | sales@cshyde.com | www.cshyde.com

Gantry kitchen robot. Source: TRI

independence rather than have them be sent to expensive nursing homes.

"The difference between Toyota and other companies … what we're really trying to do is build a time machine," Pratt said. Toyota's robots are not just a convenience or replacing fulfilling jobs but are intended to restore people's capabilities to work and relate to one another as they did when they were younger, he said.

The Toyota unit is working on other proofs of concept with Toyota Research Institute-Advanced Development (TRI-AD), soon to be rebranded as Woven Planet Holdings Group. It includes Woven City, a "living laboratory" being built in Japan, and Woven Capital.

"Woven Planet is working incredibly hard in Japan," Pratt told The Robot Report. "Woven City is human-centered and will tie all of us together. TRI [regularly and virtually] meets with the Woven City team, and we're really excited to see this in the future."

Woven Capital will collaborate with Toyota AI Ventures. That unit invests in startups such as Intuition Robotics Ltd., which is testing ElliQ, a companion robot for older adults. In the case of telepresence and social robots, their ability to help people connect more easily can have psychological benefits, Pratt added.

However, consumers waiting for The Jetsons' Rosie the Robot will have to be


patient because of the complexity of tasks, environments, and objects to be handled, from toys and dishes to laundry. The robots shown in the virtual open house are research models, not production ones, Pratt said.

Commercialization for a household robot is still a ways off, acknowledged Kelly Kay, executive vice president and chief financial officer at TRI. "Robots aren't ready yet for unstructured environments in the home, which are different from those in a factory, where they're doing repetitive tasks," she said. "We know how to hand off promising applied research concepts to partners for actual development." RR

TRI CEO Gill Pratt. Source: Toyota Research Institute



Soft Bubble Gripper a step toward household robots, says Toyota

Bubble Gripper and mug. Source: TRI

For decades, visions of the future have featured robots serving as domestic assistants. But despite many years of development in the robotics field, the dream of the domestic robot remains unfulfilled except for a few elementary tasks like floor vacuuming. Toyota Research Institute said it has developed a Soft Bubble Gripper toward that goal.

Robotic assistance can not only help clean homes and offices, but it can also assist older people or those with disabilities or age-related challenges so they can live with greater independence, dignity, and joy, said the Toyota Research Institute (TRI).

As part of its efforts toward a domestic robot, TRI has focused on improving robotic manipulation. Such grippers must be capable of stable grasps, precise placement, and safe interactions during inadvertent contact. They must also be low cost to be part of an affordable and commercially viable household robot. TRI said its engineers have designed a manipulator with all these capabilities, called the Soft Bubble Gripper.

New gripper builds on past advances

The Soft Bubble Gripper builds on past work by the TRI Manipulation Team. Previously, TRI's robots gripped objects, sorted them, and even correctly placed items in a dishwasher using conventional, two-fingered grippers guided by external cameras. The grippers relied on these cameras to do their work; in other words, they had no sense of touch. Now, TRI said it has found a way to improve robotic perception and manipulation with soft grippers that both passively hold objects better and


Top: Highly compliant soft-bubble parallel gripper on a robot arm grasping an object. Bottom left: in-hand pose estimate during this interaction. Bottom right: shear-induced displacements tracked on the interior of the sensor. Source: TRI

actively sense how much force is applied. The soft grippers also accurately measure lateral force, which indicates when an object is about to slip out of a gripper's grasp.

The gripper's development team is led by Alex Alspach, who tapped into his soft robot development background to ideate the new gripper design, and Naveen Kuppuswamy, who formulated algorithms to use it. Together with TRI's Tactile Team, they developed technology that uses the robust and compliant nature of air-filled, elastic bubbles for gripping, with sensing from cameras on the inside. The cameras show what's happening from a new perspective inside the grasp, including forces that are usually invisible.

Alspach and Kuppuswamy iterated their design, building on the work of other research organizations. Early designs used a robot arm with a single, large, round soft bubble on its end effector, which then graduated to two arms — a.k.a. "big fingers" — for dexterous, tactile-driven tasks like blindly sorting objects and threading a nut onto a giant bolt.

TRI said its researchers now use a single arm with two smaller fingers, each using a soft bubble, which combines all the advantages of compliant gripping with real-time, real-world tactile sensing. The bubbles feel shapes and forces and recognize the object they are gripping, as well as the forces between the object and the fingers. In addition, the team designed the Soft Bubble Gripper with inexpensive materials to eliminate a potential obstacle to domestic robot adoption.



TRI gripper capabilities

The combination of force and shape data as visualized by these sensors allows robots equipped with Soft Bubble Grippers to perform a range of tasks that would be extremely difficult for rigid grippers to accomplish, according to the TRI researchers. Thus far, lab-validated capabilities of the Soft Bubble Gripper include the following innovations:

Robust and passively compliant grasping. The Soft Bubble Gripper uses the inherently grippy texture and durable elastic properties of latex. The latex is inflated to a degree of softness that optimizes compliance to the shapes of held objects, maximizing grasp stability. Given its physical properties — air-filled bubbles and the gummy, high-friction texture of latex — the Soft Bubble Gripper is very reliable when it comes to grabbing and holding onto objects. "Passive compliance" refers to grippers controlled by the laws of material physics rather than a motor and compliance

to whatever shape is in the end effector. However, as Alspach noted, "compliance alone doesn't let you do creative things with the object. This is where the gripper's other capabilities come into play."

Recognizing objects by touch. Inside the bubble is a low-cost, off-the-shelf time-of-flight (ToF) depth sensor/infrared camera that uses vision to "feel" what the gripper is holding. It enables the system to recognize objects by their shape and other physical properties and understand what to do with them within about a second.

This is analogous to human fingers searching for house keys buried deep in a purse or backpack. The fingers can't technically "see," but they provide tactile information people use to build a mental model of what they are touching. This capability lets the robot perform realistic, useful tasks typically found in a home, because it understands how an object is oriented in its "hand" and what it must do to complete the task.

This is not much different from a toddler playing with a shape-sorter toy. When children are presented with a pile of differently shaped blocks, they grab one, feel its shape and all of its facets, and make their selection. The Soft Bubble Gripper uses a similar process to sort a sink full of objects. It measures and recognizes geometric features, then either precisely sets mugs in the dishwasher or drops plastic bottles into the recycling bin. It sorts the shapes not visually, but through touch, according to TRI.

Shear force detection and interpretation. The Soft Bubble Gripper can also sense when

Some of the grasped objects from the dish-loading task and their corresponding soft-bubble IR images. In each case, the system correctly identified the object classes: a PET tea-bottle, a square plastic alcohol bottle, a plastic mug, and an alternative grasp utilized on a bottle similar to that in the second case. Source: TRI


some outside force is trying to take, twist, pull, or push an object, said the researchers. It uses the camera inside the bubbles to measure how a dense dot pattern inside the latex membrane is moving and distorting. It then infers the magnitudes and directions of the forces causing this distortion. This lets it swiftly sense when an object it is lowering has landed on the counter, if it has accidentally bumped into something, or when an object it is handing over has been received. The end-of-arm tooling gives the robot tactile awareness of the outside world and its cohabitants. Depending on the controller, the robot could gently hand someone a full wine glass or set it on the table without spilling.

"To achieve a handover, the robot doesn't need to know it is holding a wine glass, only that something is pulling on it … exerting an external force that tells it to let go," explained Kuppuswamy.

In another exercise, the robot stacks transparent wine glasses. This would be incredibly difficult for a robot with conventional vision and hard grippers, because it is difficult to perceive transparent objects, and wine glasses are fragile. But it is relatively easy with the Soft Bubble Gripper, and the robot blindly stacks stemware with no knowledge of the object height or table position. Shear force — read as directional changes in the dot pattern inside the latex membrane — indicates that one glass has successfully landed on top of another and tells the robot to let go.

"This kind of tactile sensing is absolutely necessary for the uncertainty that you see in people's homes," said Alspach. "We're not in a controlled factory environment anymore. Our goal is to be helpful to people in their homes, and to do so we have to build a domestic robot with a sense of what it is touching."

In addition, the robot is able to perceive and track changes in the rotation of an object in its grasp.
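The dot-pattern idea can be sketched in a few lines of Python. This is a toy illustration under stated assumptions, not TRI's published code: dots are simplified to (x, y) points already tracked between two frames, whereas the real gripper extracts them from internal camera images, and the release threshold is arbitrary.

```python
# Illustrative sketch: read the mean displacement of dots tracked inside
# the membrane as a shear direction and magnitude, and use a crude
# threshold as a "let go" trigger. Names and threshold are hypothetical.
import math

def mean_shear(ref_dots, cur_dots):
    """Average 2D displacement of tracked dots -> (dx, dy, magnitude)."""
    n = len(ref_dots)
    dx = sum(c[0] - r[0] for r, c in zip(ref_dots, cur_dots)) / n
    dy = sum(c[1] - r[1] for r, c in zip(ref_dots, cur_dots)) / n
    return dx, dy, math.hypot(dx, dy)

def object_released(ref_dots, cur_dots, threshold=1.5):
    """Shear above a threshold suggests an external pull: time to let go."""
    return mean_shear(ref_dots, cur_dots)[2] > threshold

ref    = [(0, 0), (10, 0), (0, 10), (10, 10)]
pulled = [(2, 1), (12, 1), (2, 11), (12, 11)]  # uniform pull to the right
print(mean_shear(ref, pulled))                 # (2.0, 1.0, ~2.24)
print(object_released(ref, pulled))            # True
```

A production system would also separate uniform translation (whole-object slip) from rotation and local deformation of the dot field, but the principle is the same: displacement of the pattern is the force signal.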
The robot can push the bottom of a mug against the kitchen counter to actually flip it into a better position for loading into the dishwasher. "The 'a-ha' moment here is that instead of putting all our computing energy into how the robot approaches and picks up an object — which is a requirement for blind robots — we fast-forward past that to the actual task,"



said Kuppuswamy. "The robot quickly grabs anything in the sink. Then we can focus on and adjust to what's happening in its hand. This is quite different from a traditionally slow process needed for a factory robot."

Operating blind. Most robots rely heavily on cameras to create a sense of vision, which means light is needed for them to perceive. Traditionally, robots also have difficulty working around clutter or in confined spaces, where there is no clear line of sight, a problem known as "visual occlusion." Transparent, shiny, or dark objects are also quite hard for robots to see.

However, because a robot equipped with the Soft Bubble Gripper relies on sensors inside the bubble rather than an external camera to recognize and manipulate objects, it works equally well in lit or darkened rooms. The technology is suited to situations in which the robot must reach into cluttered places (such as a sink full of miscellaneous objects and/or water) or has to manipulate in a way that would otherwise block its own view, said TRI.

"We're not trying to replace external vision sensors — they're still important," Alspach noted. "We're adding to these systems with even more useful sensory information, resulting in increased capabilities and added precision."

Training through self-annotation. Traditionally, a robotic system guided by computer vision is trained to recognize objects by repeatedly showing it images to establish defined categories, a time-consuming process known as supervised training. For example, somebody has to show the robot many photos of coffee mugs and tell it, "These are all coffee mugs."

The TRI team said it had a breakthrough when it set up the robot to self-annotate by filling a sink with one type of object and letting the robot repeatedly grasp and drop these objects, reducing the amount of time it took to "learn" a new object. This recognition process is repeated for every new object the team wants the Soft Bubble Gripper to learn. As a result, tactile sensors must be designed to withstand the many cycles needed for this type of training.

"By cutting the human out of the loop completely, the grasp data was a better match to the variability the robot would experience in real-world situations," added Alspach.

Use of low-cost materials. Toyota said its team used simple fabrication techniques and affordable materials so that the Soft Bubble Gripper would be inexpensive to build, operate, maintain, and repair.

"An insight we gained is finding a way to make an inexpensive camera deliver usable depth data at an extremely close range,"


Alspach said. "It's how we angled the camera that lets us maximize the quality of the depth measurements — so that the camera can view and identify the 'tactile' imprint of the gripped object on the soft latex bubble."

Goals for future TRI exploration

A long-term goal of the TRI robotics research team is to break the traditional robot controller paradigm and accommodate an ever-changing environment.

Improving reflexes. The team said it wants to replicate real-time, course-correcting behavior in robots. It is striving for distributed robotic sensing, where information is processed and reacted to "locally" instead of having to refer to a centralized "brain" for every decision. This enables robots to execute their actions much more quickly with less forethought. "It's more like what humans or animals do," Alspach explained. "In a fraction of a second, we react and catch the thing that fell off the table or slipped out of our hand almost without thinking."

Expanding softness. Currently, only the robot gripper is soft. But what if more of its surface could be made of such materials? That could bring great advantages for human-robot interaction, including working in the home with children, pets, or people who may already have physical challenges.

"We aim to build robots that can interact with the world around them comfortably," said Alspach. "Softness makes the application of robots more reasonable for homes. Taking it further, we can begin to incorporate lightweight materials, air, smaller electronics, and an optimized design to help us reduce weight. It's the combination of soft and lightweight, along with the ability to sense and react, that contributes to safety."

"It's actually a really tricky problem," added Kuppuswamy. "In one sense, it's extremely hard to build because you are working with frontier materials…. It's not just classical methods and materials like machining or welding." Further complicating this is the fact that anything inflatable introduces the factor of continuously dynamic shape-shifting, which destabilizes the usual equations. "Mathematically, soft is hard," Kuppuswamy said.

Publication, recognition on the way to more useful robots

More information about the TRI Soft Bubble Gripper research can be found in the published paper, "Soft Bubble grippers for robust and perceptive manipulation." The team's research was also recognized at ICRA 2020, the annual IEEE International Conference on Robotics and Automation, with a 2019 Robotics and Automation Letters Best Paper award for another publication about the soft gripper project.

TRI welcomes other scientists researching robotic manipulation to build on its work by bringing their own insights and discoveries. To inspire this collaboration and further the research on soft bubble grippers, the team hosted VisuoTactile 2020, a workshop of thought leaders in visuo-tactile technology at the Robotics: Science and Systems (RSS) conference.

The company said the Soft Bubble Gripper builds on its manipulation research toward making human-assist robots reliable and robust. Even without the sensing capability, the stretchy material and low stiffness make for a superior gripper in comparison with standard soft grippers, said the researchers. It can conform to a wide variety of shapes and get a stable grasp.
The gripper combines strength and smarts to handle objects that require a delicate touch and a steady hand, said the team. Through sensing geometry, camera images, and shear forces, robots can now perceive the objects they are holding by estimating the pose of objects and sensing forces on the surface. Because the Soft Bubble Gripper uses relatively inexpensive cameras internally, it moves manipulation for applications such as domestic robots a step closer to reality, said TRI. RR



Design & Development

Testing a PR2 in a simulated hospital world

As the need for robots in healthcare grows, better tools are needed to build, test, and deploy robotics applications quickly and safely. By Matt Hansen • AWS

Nowadays, risk of spreading disease is a key concern in hospitals, where doctors, nurses, and other caregivers are on the front lines helping patients. Hospitals have started using robots in daily operations such as contactless delivery and room disinfection to reduce the risk of spreading disease. As the need for robots in healthcare grows, better tools are needed to build, test, and deploy robotics applications quickly and safely.

Introducing Hospital World
Testing robots in a physical environment is time consuming, and testing new code in a hospital environment can pose a safety risk due to the unpredictability of untested code. Testing in a virtual environment, or a simulated world, increases test coverage, reduces safety risk, and decreases development time. However, creating virtual worlds for simulation is costly, time consuming, and requires specialized skills in 3D modeling. For that reason, Amazon Web Services (AWS) has developed a Gazebo simulated Hospital World and published it as open source so that robotics companies within the healthcare industry can more easily test their robots in a simulated hospital environment. In this


The PR2 robot in the lobby with front desk and waiting area of Hospital World, a simulated environment from AWS. Credit: AWS



article, I will provide an overview of the hospital and share my experience using it to test a PR2 robot, including software failures I encountered that you may find helpful.

In order to test and demonstrate the features of the Hospital World, I wanted to use a robot that moves and looks like I would expect a hospital-based robot to move and look if I were to develop one myself. I researched available options from the open-source community, searching for robots that had existing simulation packages, and the one that met these needs was the PR2 robot. The PR2 was developed by Willow Garage in conjunction with the Robot Operating System, or ROS, as a development and research platform. It’s great for simulation, running navigation, and simultaneous localization and mapping (SLAM), and it’s fun to play with.

Getting started with simulation
Now that we have background on why this matters and what robot was chosen for testing, we will dive into these three areas:
1. A quick tour of the Hospital World
2. Testing the PR2 robot in the Hospital World
3. Getting started with the Hospital World

A top-down view of Hospital World. Credit: AWS

The first thing to point out is the lobby of the Hospital World, where there’s a front desk with a waiting area, patients, and medical staffers nearby. The second thing to note is the size and layout of the hospital. The hospital is much larger than the other AWS open-source worlds, such as the Bookstore World and the Small Warehouse World. It also has more rooms, including exam rooms, patient rooms, storage, and a staff break room. Larger, complex worlds are great for testing navigation and especially SLAM algorithms. In the rooms, there are hospital beds, chairs, and furniture, which makes testing obstacle avoidance algorithms realistic. In addition, there are stairways, a ramp, steps into the lobby, and elevators,

A patient room and hospital staff. Credit: AWS


which are all excellent for testing robot perception and navigation behaviors.

Testing the PR2 in the Hospital World
I decided to spawn my PR2 in the lobby of the hospital, which I accomplished by using the pr2_no_arms.launch file from the pr2_gazebo ROS package. After spawning the robot, I wanted it to navigate autonomously from point to point. In order to do that, I needed a map, or I needed to create a map dynamically by running SLAM. The PR2 has an existing ROS package to run navigation and SLAM together, pr2_2dnav_slam, which made it simple to get those algorithms running. With those packages running, I opened RViz, the ROS visualization tool, to see the map as it was being generated. Once I had a map showing in RViz, I had the ability to send the PR2 navigation goals via the ‘2D Nav Goal’ tool in the graphical user interface.

Initially, I tried to navigate the PR2 into a small exam room directly off the lobby area, as a simple test to validate that the navigation and SLAM components were running correctly. The robot made it to the doorway but wasn’t able to navigate through it. Was the doorway too narrow for the robot, or was the software not working correctly? I wasn’t sure yet, so I tried navigating to a few other locations to see if those would work. After navigating back to the starting point in the lobby, I decided to see if the robot was able to navigate behind the desk to map the area. It created a plan



and attempted to navigate there, but the robot got stuck trying to turn the corner to go behind the desk. Now I had a second failure, which was making me suspicious of the navigation software.

After tele-operating the robot using the ROS pr2_teleop package to get it unstuck, I decided to send it down a hallway to explore and extend the map. What I hadn’t noticed was that there was a wheeled trolley bed in the hallway, and I watched as the PR2 tried to avoid it. Halfway around the bed, the robot hit the edge and began pushing the trolley bed. It pushed the bed down the hall and around the corner before losing balance and falling over.

With all of these errors, I was glad to be testing in a simulation environment and not with a physical robot in a real hospital. The expensive robot could have been damaged if this failure had happened in reality. With multiple failures, I was becoming confident something was wrong with the navigation software.
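For context on what obstacle avoidance depends on here: in the ROS navigation stack, the local planner consults a local costmap that marks obstacles and “inflates” them, assigning decaying costs to nearby cells so planned paths keep clearance. A minimal sketch of that inflation idea (hypothetical grid and cost constants for illustration, not the actual costmap_2d implementation):

```python
import math

# Hypothetical sketch of local costmap obstacle inflation (not the
# actual ROS costmap_2d implementation). Obstacle cells get a lethal
# cost, cells within the robot's radius get a near-lethal cost, and
# cells farther out get an exponentially decaying cost, so the planner
# prefers paths with clearance around walls and furniture.

LETHAL = 254      # cell holds an obstacle
INSCRIBED = 253   # cell is within the robot's footprint of an obstacle

def inflate(obstacles, inflation_radius, robot_radius):
    """Map (x, y) grid cells near each obstacle to an inflated cost."""
    costs = {}
    reach = int(inflation_radius)
    for ox, oy in obstacles:
        for dx in range(-reach, reach + 1):
            for dy in range(-reach, reach + 1):
                d = math.hypot(dx, dy)
                if d > inflation_radius:
                    continue
                if d == 0:
                    cost = LETHAL
                elif d <= robot_radius:
                    cost = INSCRIBED
                else:
                    cost = int(INSCRIBED * math.exp(robot_radius - d))
                cell = (ox + dx, oy + dy)
                costs[cell] = max(cost, costs.get(cell, 0))
    return costs

# A single obstacle at cell (5, 5): the cell itself is lethal, and
# neighboring cells carry decaying costs out to the inflation radius.
local_costmap = inflate([(5, 5)], inflation_radius=3.0, robot_radius=1.0)
```

A real costmap maintains this on a rolling grid at sensor rate; the sketch simply shows why a costmap that is being cleared leaves the planner blind to nearby obstacles.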

A hospital staff break room. Credit: AWS

The Gazebo and RViz views of PR2 in the hospital lobby. Credit: AWS

The PR2 robot fell over after running into a hospital bed. Credit: AWS



If you know how ROS navigation works, the obstacle avoidance portion of the stack is the local costmap. I was highly suspicious that the local costmap wasn’t working, so I tried to visualize it in RViz. Sure enough, the local costmap wasn’t showing any obstacles. I continued testing to see if visualization was the only problem or if the robot really couldn’t navigate around local obstacles. I ended up crashing the robot again, this time in one of the patient rooms. In RViz, the robot’s local costmap region (the square highlighted on the map) wasn’t inflating the obstacles like it should. The local costmap showed up as clear instead of inflating the obstacles using color gradients.

After a little more investigation, by checking the parameters for the local costmap, I found the problem. One of the local costmap inputs was clearing the costmap data. After a parameter change, I was able to get the local costmap working and inflating obstacles, which now showed colorfully in RViz. The PR2 could now navigate around the corners and beds without crashing.

Once I had everything working correctly, I wanted to generate a map of the entire hospital and save it to a map file. Mapping the entire hospital by manually sending new goal locations to the PR2 would be time-consuming. Instead, I wanted the PR2 to map the Hospital World autonomously. To do this, I added the ROS explore_lite package alongside the existing navigation and SLAM stack I was already running. With the

explore_lite package running, the PR2 was driving around exploring on its own. As I watched it drive around creating the map, it was fun to guess where it would try to go next.

While the robot was exploring and mapping, I did encounter several more failure modes. First, it fell down the stairs. That was bad, obviously. The sensors didn’t detect the drop-off, and the PR2 mistakenly thought the stairway was a hallway. Testing robots near stairways and ramps is a great use of simulation, so I tested it on the ramp to the elevators and found that it got stuck on the transition from the top of the ramp to the elevator platform. I believe this failure indicates a mechanical problem, as the wheels couldn’t climb over the final transition. Finally, and most common of all, the PR2 would get stuck oscillating between a doorway it needed to pass through and a goal pose that was outside the door in the other direction. For example, if the goal was in a room to the east but the doorway to exit the current room was on the west side, it got stuck oscillating forever. I believe more parameter tuning could alleviate this oscillation.

Getting started with Hospital World
If you want to try testing the PR2 in the Hospital World yourself, a sample application is available for download on GitHub. You can get the source code and follow the README.md document for instructions on how to run it. The

sample application makes it easy to get started whether you’re working in AWS RoboMaker or on a local system.

I hope you found my tour of the Hospital World simulation and my learnings while testing interesting. It shows how useful an environment like this can be for finding different types of software problems and even physical and mechanical design issues before testing on hardware. If you want to know more about testing robots in simulation, read our blog titled “Introduction to Automatic Testing of Robotics Applications.” If you have further questions about the Hospital World, the sample application, or AWS RoboMaker, reach out to AWS for further information. RR
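For readers curious how explore_lite decides where to drive: it steers the robot toward frontiers, the free cells on the boundary of still-unmapped space, so each goal it sends extends the map. A simplified frontier finder over an occupancy grid (assumed cell conventions for illustration, not the package’s actual code):

```python
# Simplified frontier detection in the spirit of explore_lite (assumed
# cell conventions, not the package's actual code). A frontier cell is
# a known-free cell adjacent to at least one unknown cell; driving to
# frontiers gradually expands the map until none remain.

FREE, OCCUPIED, UNKNOWN = 0, 100, -1

def find_frontiers(grid):
    """grid: 2D list of cell states. Returns (row, col) frontier cells."""
    rows, cols = len(grid), len(grid[0])
    frontiers = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != FREE:
                continue
            neighbors = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            if any(0 <= nr < rows and 0 <= nc < cols
                   and grid[nr][nc] == UNKNOWN
                   for nr, nc in neighbors):
                frontiers.append((r, c))
    return frontiers

# A tiny map: free on the left, a wall in the middle, unknown on the
# right. The frontier cells are the free cells touching unknown space.
demo_grid = [
    [FREE, FREE,     UNKNOWN],
    [FREE, OCCUPIED, UNKNOWN],
    [FREE, FREE,     FREE],
]
```

When the frontier list is empty, the environment is fully mapped, which is effectively explore_lite’s stopping condition.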

About the author: Matt Hansen is a principal solutions architect specializing in robotics at Amazon Web Services (AWS). Prior to joining AWS, Matt spent five years working with ROS and ROS 2 on Intel’s Open Source Robotics team, where he led the ROS 2 Navigation2 project and was an original member of the ROS 2 Technical Steering Committee. He is an Oregon native and has an MS in electrical engineering from Portland State University.

Gazebo and RViz views with the local costmap inflating obstacles. Credit: AWS



Mobile Robots

AMRs help ICM handle demand for protective gear

ICM, one of the largest wholesalers of personal protective equipment in Scandinavia, turned to MiR1000 heavy-duty mobile robots for help.

By The Robot Report Staff

While every warehouse or distribution center is different, they all need to efficiently and safely move materials to meet accelerating demand. Danish wholesale company ICM A/S has deployed mobile robots from Mobile Industrial Robots (MiR) to augment its human workers, who can now focus on higher-value activities.

ICM was founded in 1946 and is one of Scandinavia’s leading suppliers of technical equipment, work environment products, and personal protective equipment. It has warned of shortages because of the COVID-19 pandemic.

Challenges
ICM runs pallet transport operations every day from 7:00 a.m. to 10:00 p.m., and it moves a total of 100,000 orders per year relatively long distances on 31,000 pallets. Most of its orders are for same-day delivery. The internal logistics flow at its high-rise warehouse in Odense, Denmark, mainly uses human-operated trucks in narrow aisles with 40-ft.-high racks. Space is limited, so ICM must optimize its use of time, personnel, and space. ICM has four human-driven, high-reach trucks, plus 10 manual stackers and 26 employees. Devising the optimal workflow for the entire flow of traffic and transport of goods in the logistics center has been a learning process, said Søren Jepsen, supply chain director at ICM.



Case Study Breakdown
Company: ICM A/S
Location: Odense, Denmark
Industry: Wholesale protective equipment
Challenge: Keeping up with same-day orders, competitive pressures, and workforce scarcity
Partner: Mobile Industrial Robots
Robots: 3 MiR1000 heavy-duty mobile robots and MiR fleet management software
Tasks: Moving pallets of goods from receiving to positioning
Value drivers: Coordinating and optimizing materials movement in tight aisles
Results: Saving 40 hours per week in staff time, freeing up personnel for higher-value tasks, improving job satisfaction

“Our warehouse uses the chaotic storage principle, managed by a warehouse management system,” he said. “We must be geared to be able to drop everything in order to be able to deliver within 24 hours to our customers in Denmark. This means it’s about using our resources shrewdly. We’re investing in new technology in order to safeguard our staff and to attract new, talented people.”


ICM runs pallet transport operations every day from 7:00 a.m. to 10:00 p.m., moving 100,000 orders per year relatively long distances on 31,000 pallets. Most of its orders are for same-day delivery. Credit: Mobile Industrial Robots

Solution
Fortunately for ICM, Odense is home to one of the most prominent robotics clusters in Europe, if not the world. One of its neighbors is MiR. ICM acquired three heavy-duty MiR1000 robots, so named because of their ability to move 1,000 kg (2,204.6 lb.). “Thanks to an investment in three mobile MiR1000 robots, three employees now save several hours each on daily activities,” said



Enabling 3-D perception. Reimagining LiDAR.
R2300 3-D LiDAR Sensor
■ Four scan planes in a single sensor enable environment perception for cliff detection, collision avoidance, and bay occupancy checks
■ 100° horizontal field of view and 0.1° angular resolution facilitate object tracking and follow-me
■ Precise light spot and high sampling rate offer enhanced perception and small object detection
For more information, visit pepperl-fuchs.com



ICM made a dedicated route for the mobile robots, freeing space for other traffic in the logistics center. Previously, space was very cramped because of the many operations with manual stackers on the main traffic routes. Credit: Mobile Industrial Robots

Jesper Lorenzen, a warehouse assistant who is responsible for goods reception at ICM. “They no longer have to spend time manually moving pallets from a stacker to the aisles in the high-rise warehouse.”

Instead, the workers can place the pallets on special MiR racks, from which the robots collect the pallets and transport them to the aisles inside the high-rise warehouse. The MiR robots leave the pallets at the end of the aisles to be collected by high-reach trucks that place them in the relevant racks.

“The high-reach truck operators automatically report when they have taken a pallet from a rack, so I can just press a button on the tablet screen and send one more MiR robot on a mission,” Lorenzen said. “This way, the robots ensure the high-reach trucks are always supplied with pallets.”

The operators can use a map on tablets in their trucks to see where the mobile robots are at all times. In the busiest areas, the MiR1000s use audio signals and lights. This enables close collaboration between the trucks and the robots. In an environment with constant traffic, communication among vehicles is vital to avoid different machines blocking one another’s paths.
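Fleet software such as MiR Fleet assigns each transport task to the robot expected to finish it soonest. A toy dispatcher illustrating that selection rule (hypothetical data model for illustration, not MiR’s actual software):

```python
# Toy fleet dispatcher illustrating the "shortest time to complete"
# assignment rule (hypothetical data model, not MiR Fleet's code).
# Each robot's estimated finish time is its already-queued work plus
# straight-line travel time to the task location.

def pick_robot(robots, task_location):
    """robots: list of dicts with 'name', 'position' (x, y) in meters,
    'speed' in m/s, and 'busy_until' in seconds of queued work.
    Returns the name of the robot with the earliest estimated finish."""
    def eta(robot):
        rx, ry = robot["position"]
        tx, ty = task_location
        travel = ((tx - rx) ** 2 + (ty - ry) ** 2) ** 0.5 / robot["speed"]
        return robot["busy_until"] + travel
    return min(robots, key=eta)["name"]

fleet = [
    {"name": "MiR-1", "position": (0.0, 0.0), "speed": 1.0, "busy_until": 0.0},
    {"name": "MiR-2", "position": (3.0, 4.0), "speed": 1.0, "busy_until": 0.0},
]
```

A real dispatcher would use route distances from the shared map rather than straight lines, and would also weigh battery state, which is why the charging-between-tasks behavior matters.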


ICM has made a dedicated route for the mobile robots, freeing space for other traffic in the logistics center. Previously, space was very cramped because of the many operations with manual stackers on the main traffic routes, which have now been replaced by the MiR robots.

The MiR Fleet management software chooses the mobile robot that can carry out a task in the shortest time. It also ensures that the three mobile robots automatically move to a charging station between tasks to minimize downtime.

Results
MiR’s mobile robots have saved about 40 hours a week at ICM, time the staff previously spent on moving goods between the receiving and positioning areas. These employees can now focus on higher-value activities, such as planning and optimization. Assessing, handling, and prioritizing the pallets and their contents is a complicated task that requires insight and experience because many parameters must be taken into consideration. Therefore, people are best suited for such tasks.

“The robots have saved time, which we can now use to optimize the warehouse




The mobile robots have saved about 40 hours a week at ICM, time the staff previously spent on moving goods. These employees now focus on higher-value activities, such as planning and optimization.


Credit: Mobile Industrial Robots

and fine-tune flow,” said Lorenzen. “We have become used to the new technology and have learned to work in a completely different way. The more we apply it, the more time we save through automation using mobile robots.”

Not only have MiR’s mobile robots increased efficiency, but they have also improved the working environment, according to Brian Brandt, warehouse manager at ICM. Job satisfaction is important in warehouses, where recruitment and retention are often challenging.

“It’s just so much fun working with mobile robots. Being able to move something from A to B without even touching it, that’s really cool,” he said. “The design of the MiR robots is so simple and user-friendly that I could take a new colleague in from off the street, and they would also think they’re logical to use.”

Brandt smiled as he observed a MiR1000 robot moving past, carrying a 600-kg (1,322-lb.) load of cleaning cloths on a pallet. While mobile robots currently aid the flow of goods from the receiving area to the storage aisles, ICM’s management sees the potential in automating more supply chain processes. In the long term, it could benefit from the use of robots, from picking to the delivery of goods. RR





The prototype miniature antenna is integrated on a UGV with a software-defined radio and other robotic sensors. The system streams video between the UGV and a second node. | Source: U.S. Army

Mini antenna enables robots to team up in complex environments

U.S. Army and University of Michigan researchers have developed an antenna technique enabling small robots to effectively network.

By The Robot Report Staff

In addition to the intelligence for coordination, swarm robotics requires reliable communications among small robots. A new miniature, low-frequency antenna with enhanced bandwidth is intended to enable robust networking among compact, mobile robots in complex environments. It is the result of a collaboration between the University of Michigan and the Army Research Laboratory.

The Army Research Laboratory (ARL) is an element of the U.S. Army Combat Capabilities Development Command (CCDC). The CCDC is a major subordinate command of the Army Futures Command.

New antenna maintains performance
The teams developed a new design approach they said improves upon the limitations of conventional antennas operating at low frequencies. They demonstrated smaller antennas that maintain performance.



Impedance matching is a key aspect of antenna design, ensuring that the radio transmits power through the antenna with minimal reflections while in transmit mode. It also ensures that when the antenna is in receive mode, it captures power that couples efficiently to the radio over all frequencies within the operational bandwidth.

“Conventional impedance matching techniques with passive components -- such as resistors, inductors and capacitors -- have a fundamental limit, known as the Chu-Wheeler limit, which defines a bound for the maximum achievable bandwidth-efficiency product for a given antenna size,” said Army researcher Dr. Fikadu Dagefu. “In general, low-frequency antennas are physically large, or their miniaturized counterparts have very limited bandwidth and efficiency, resulting in higher power requirements.”

With those challenges in mind, the researchers developed a way to improve bandwidth and efficiency without increasing the size or changing the topology of the antenna.

“The proposed impedance matching approach applies a modular active circuit to a highly miniaturized, efficient, lightweight antenna, overcoming the aforementioned Chu-Wheeler performance limit,” said Army postdoctoral researcher Dr. Jihun Choi. “This miniature, actively matched antenna enables the integration of power-efficient, low-frequency radio systems on compact mobile agents such as unmanned ground and aerial vehicles.”

Heterogeneous networking opportunities
The researchers said this approach could create new opportunities for networking in the Army. The ability to integrate low-frequency radio systems with low size, weight, and power -- or SWAP -- opens the door to exploiting this underutilized and under-explored frequency band as part of the heterogeneous autonomous networking paradigm, said the researchers.

In this paradigm, agents equipped with complementary communications modalities must adapt their approaches based on the challenges of the environment for a specific mission. Specifically, the lower frequencies are suitable for reliable communications in complex propagation environments and terrain due to their improved penetration and reduced multipath.

“We integrated the developed antenna on small, unmanned ground vehicles [UGVs] and demonstrated reliable, real-time digital video streaming between UGVs, which has not been done before with such compact low-frequency radio systems,” Dagefu said.

Detecting environmental changes in real time
U.S. Army researchers recently demonstrated in a real-world environment


a human-robot team in which the robot detects physical changes in 3D and shares that information with a human in real time through augmented reality. The human is then able to evaluate the information received and decide on follow-on action.

Even small changes in your surroundings could indicate danger. Imagine a robot could detect those changes, and a warning could immediately alert you through a display in your eyeglasses. That is what U.S. Army scientists are developing with sensors, robots, real-time change detection, and augmented reality wearables.

“This could let robots inform their soldier teammates of changes in the environment that might be overlooked by or not perceptible to the soldier, giving them increased situational awareness and offset from potential adversaries,” said Dr. Christopher Reardon, a researcher at the Army Research Laboratory. “This could detect anything from camouflaged enemy soldiers to IEDs.”

The research paired a small autonomous ground mobile robot from Clearpath Robotics, equipped with LiDAR sensors to build a representation of the environment, with a human teammate wearing augmented reality glasses. As the robot patrolled the environment, it compared its current and previous readings to detect changes. Those changes were then instantly displayed in the human’s eyewear to determine whether the human could interpret them.

The researchers tested different-resolution LiDAR sensors on the robot to collect measurements of the environment and detect changes. When those changes were shared with the human through augmented reality, the researchers found that human teammates could interpret changes that even the lower-resolution LiDARs detected. This indicates that, depending on the size of the changes expected to occur, lighter, smaller, and less expensive sensors could perform just as well, and run faster in the process. RR
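The comparison of current and previous readings described above can be sketched as voxel differencing over point clouds: bin each scan’s points into coarse 3D cells, then flag cells that are occupied in one scan but not the other (a hypothetical data layout, far simpler than the Army system’s actual 3D registration pipeline):

```python
# Minimal 3D change detection via voxel differencing (illustrative
# sketch, not the Army system's code). Points from each LiDAR scan are
# binned into voxels; voxels occupied in one scan but not the other
# are flagged as changes to highlight for the human teammate.

def voxelize(points, voxel_size=0.5):
    """Map a list of (x, y, z) points to a set of occupied voxel indices."""
    return {(int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
            for x, y, z in points}

def detect_changes(previous_scan, current_scan, voxel_size=0.5):
    """Return (appeared, disappeared) voxel index sets between two scans."""
    prev = voxelize(previous_scan, voxel_size)
    curr = voxelize(current_scan, voxel_size)
    return curr - prev, prev - curr

# A new object shows up about 1.2 m from the origin in the second scan.
appeared, disappeared = detect_changes(
    [(0.1, 0.1, 0.1)],
    [(0.1, 0.1, 0.1), (1.2, 0.1, 0.1)],
)
```

The voxel size acts like the sensor-resolution knob the researchers varied: coarser voxels tolerate cheaper, noisier LiDARs but only reveal larger changes.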

The U.S. Army has developed a robot that can detect physical changes in 3D and share that information with a human in real time. | Source: U.S. Army



Researchers have developed an active-matching technique to equip robotic ground vehicles with powerful, miniature antennas. | Source: U.S. Army

“By exploiting this technology, the robotic agents could coordinate and form teams, enabling unique capabilities such as distributed on-demand beamforming for directional and secure battlefield networking.”

With more than 80% of the world’s population expected to live in dense urban environments by 2050, innovative Army networking capabilities are necessary to create and maintain transformational overmatch, the researchers said. The lack of fixed infrastructure and the increasing need for a competitive advantage over near-peer adversaries impose further challenges on military networks, a top modernization priority for multi-domain operations.

Maximizing tradeoffs among bandwidth, efficiency, and stability
While previous experimental studies demonstrated bandwidth enhancement with active matching applied to a small non-resonant antenna (e.g., a short metallic wire), no previous work simultaneously ensured bandwidth and radiation efficiency enhancement compared to small, resonant antennas with performance near the Chu-Wheeler limit. The Army-led active matching design approach addresses these key challenges


stemming from the trade-off among bandwidth, efficiency, and stability. The researchers built a 15-cm prototype (2% of the operating wavelength) and demonstrated that the new design achieves more than threefold bandwidth enhancement compared with the same antenna without active matching. It also improved transmission efficiency 10 times compared with state-of-the-art actively matched antennas of the same size.

“In the design, a highly accurate model captures the sharp impedance variation of the highly miniaturized resonant antenna,” Choi said. “Based on the model, we develop an active matching circuit that enhances bandwidth and efficiency simultaneously while ensuring the circuit is fully stable.”

“This technology is ripe for future development and transition to our various partners within the Army,” Dagefu said. “We are optimistic that with the integration of aspects of our heterogeneous networking research, this technology will further develop and will be integrated into future Army communications systems.” RR
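To put the prototype’s figures in perspective, a back-of-envelope calculation: an antenna that is 2% of its operating wavelength at 15 cm implies roughly a 40 MHz operating frequency, and the Chu limit caps how much bandwidth a passively matched antenna of that size can ever offer. The arithmetic below is my own inference from the article’s numbers, using the common lossless, single-mode form of the bound, not published specifications:

```python
import math

# Rough numbers inferred from the article: 15 cm antenna, 2% of its
# operating wavelength (illustrative arithmetic, not published specs).
length_m = 0.15
wavelength_m = length_m / 0.02        # 7.5 m
frequency_hz = 3.0e8 / wavelength_m   # about 40 MHz, low-VHF band

# Chu lower bound on radiation Q for an antenna enclosed in a sphere
# of radius a (here, half the length), lossless single-mode form:
#   Q_min = 1/(ka)^3 + 1/(ka),  k = 2*pi/wavelength
ka = 2 * math.pi * (length_m / 2) / wavelength_m
q_min = 1 / ka**3 + 1 / ka

# For a high-Q resonance, fractional bandwidth is roughly 1/Q, so a
# passively matched antenna this small is limited to a sliver of band.
fbw_max = 1 / q_min
```

With Q_min in the thousands, the passive fractional-bandwidth ceiling is on the order of hundredths of a percent, which is why an active matching circuit that sidesteps the passive bound matters so much at these sizes.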



RELY ON US TO CARRY THE LOAD
Get up and running with less hassle than ever before. Build gantries faster, design heavy-duty robot transfer units with ease, and maximize reliability in your automated motion systems with complete solutions from Bishop-Wisecarver. With 70 years of engineering expertise, we understand our customers’ design and application requirements, which enables us to develop unique robotic solutions.
LEARN MORE AT BWC.COM/ROBOTICS
EXPERTLY DESIGNED, DELIVERED TO PERFORM
925.439.8272 | BWC.COM



How to get started with designing a cost-effective UGV for security and surveillance

When building an autonomous vehicle, start with mission requirements and basic components.

Autonomous navigation vehicle. Source: Mistral Solutions

By Raja Subramanian N and Pramod Ramachandra • Mistral Solutions

Mobility is hot, from the mobile robots in warehouses to the self-driving cars and trucks now being tested. By 2040, three out of every four vehicles will be autonomous, according to the Institute of Electrical and Electronics Engineers. The unmanned vehicle industry is becoming very competitive, making it difficult for startups to gain the necessary investment. However, there are ways to design and build an autonomous system for a commercial application without burning a hole in your pocket. While most of the attention has been on semi-autonomous or fully autonomous passenger vehicles, there is a huge void in developing these technologies for security and surveillance applications.

What is an unmanned ground vehicle, or UGV?
A simple definition is that it is a ground vehicle that can run independently of a human operator. It uses sensors and machine vision to perceive and comprehend its environment, while drive-by-wire actuators and motors perform operations. Various open-source tools can come in handy when designing a cost-effective, safe, and reliable UGV based on an electric platform. Let’s focus on building a multi-terrain vehicle rather than one suited only for roads. We will also discuss a surveillance payload that can be integrated into this vehicle.



NEW IDEA. NEW DRIVE. BETTER ROBOTICS
Building a new robotic application? You need a new drive! The Motus Labs M-DRIVE precision drives offer a higher torque density than competing strain wave gearing with no compromise in performance. The patented design utilizes a series of cam-driven blocks that engage over 80% of the output ring surface area at all times. This design distributes load stresses over a much larger surface area, allowing the drive to deliver a much greater torque per unit size and volume than other technologies.
Get better price to performance! HIGH PRECISION | LESS WEIGHT | UP TO 2X TORQUE DENSITY
Need some help with your application? Contact us! MOTUS-LABS.COM


Mobile Robots

Using navsat_transform_node to integrate GPS data. Source: ROS.org
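The navsat_transform_node shown in the figure ships with the ROS robot_localization package. As a rough sketch only, a minimal ROS launch file wiring it to IMU, GPS, and filtered odometry topics might look like the following; the parameter values and topic names here are illustrative assumptions, not a prescription:

```xml
<launch>
  <node pkg="robot_localization" type="navsat_transform_node"
        name="navsat_transform" respawn="true">
    <!-- Illustrative values; tune for your vehicle and location -->
    <param name="frequency" value="30"/>
    <param name="magnetic_declination_radians" value="0.0"/>
    <param name="yaw_offset" value="0.0"/>
    <param name="zero_altitude" value="true"/>
    <param name="broadcast_utm_transform" value="true"/>
    <param name="publish_filtered_gps" value="true"/>
    <!-- Wire the node to your sensor and odometry topics -->
    <remap from="imu/data" to="imu/data"/>
    <remap from="gps/fix" to="gps/fix"/>
    <remap from="odometry/filtered" to="odometry/filtered"/>
  </node>
</launch>
```

The node fuses the GPS fix with the filtered odometry to produce an odometry message in the robot's world frame, which the state-estimation node can then consume.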

Software for UGVs

ROS: It is possible to build a high-quality software package with the open-source tools currently on the market. The Robot Operating System, or ROS, is a flexible, open-source platform for software development. ROS was developed to save time on design and engineering and let product developers build on solved problems. The ROS community provides several tools, as well as support for various sensors, algorithms, visualization, and simulation, for developing robust software. ROS allows software nodes to run on distributed computational units, depending on the hardware architecture. By choosing ROS, developers can reuse various modules and build application use cases in Python, C++, and Java. With minimal customization based on project requirements, they can optimize the modules for the autonomous vehicle's intended function.

Maps and navigation: Open-source maps and navigation tools can help developers incorporate a dynamic map in a Web application. These tools are completely free, and they offer developers the ease and flexibility of a cutting-edge, mobile-ready solution. It is easy to customize these platforms by accessing application programming interfaces (APIs) or using third-party libraries. Developers can also build a Web-based geographic information system (GIS) and integrate it with ROS, along with various algorithms to identify and define the path of the vehicle and provide turn-by-turn navigation support. The edit support in maps can be
used when in-road networks are not available. There are also a few open-source tools that provide free geospatial data and permit relevant modifications. These tools can help developers generate and define a path for the vehicle. The geospatial data includes detailed information about streets, landmarks, railways, major institutions, and much more. It may also include the name, type, and width of a street and even the applicable speed limits. These tools support offline maps and allow for the addition of custom road networks.

LAMP: Standing for the open-source components Linux, Apache, MySQL, and PHP/Python, LAMP is a reliable platform for developing UGV Web applications. LAMP makes the developer's life easier by minimizing the programming effort that a complex autonomous platform or robot demands.

Autonomous navigation algorithms: UGVs rely on navigation technologies, sensors, and cameras to navigate across terrain. Now, consider an environment that is unknown or not updated in the navigation platform. This is where autonomous navigation algorithms play a crucial role. These algorithms help identify obstacles, avoid them, calculate the best routes, and define a new path for the vehicle by understanding the surroundings based on data from various sensors.

There are three important algorithms in autonomous navigation: geolocalization, path planning, and navigation. The system uses the Global Positioning System/inertial navigation system (GPS/INS) for accurate vehicle positioning, the map to choose the best route, and navigation algorithms to provide steering-angle commands to the vehicle.

Hardware for UGVs

Software is increasingly important, but the vehicle platform itself is still the basis of any UGV. An internal combustion (IC) engine poses several challenges, such as noise, pollution, and complexity of integration, and most are not designed for autonomous operation. By contrast, in electric vehicles, many of the mechanical customization challenges can be surmounted. Developers can also easily access the vehicle network to obtain data such as engine status, fuel status, gear level, clutch, and so on.

Drive by wire, also known as "X by wire," is transforming the way people drive and commute. Drive-by-wire systems rely on electronics and various sensor inputs to control operations such as steering, acceleration, and braking, while conventional vehicles use mechanical and hydraulic technologies. Let's look at the various components, such as the drive (AC) motor, motor controllers, hub motors, and steering controls.

AC motor: Among the prime concerns of UGV designers when considering AC
motors are torque, power, and efficiency. The motor and other motion systems should provide superior quality, reliability, and performance at the lowest possible cost. While choosing the motor and deciding on the power, the developer should consider the terrain where the UGV will be deployed. Acceleration of the vehicle is controlled by sending electrical signals of defined voltage directly to the motor controller. For this, one can use an existing AC motor controller available on the open market. Building a custom AC motor controller may not be a viable decision because of expense and potential project delays.
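The path-planning algorithm mentioned above can be sketched with a minimal grid-based A* planner. This is an illustrative example only, not the article's implementation: the occupancy grid, unit step costs, and 4-connected motion model are all simplifying assumptions.

```python
import heapq
from itertools import count

def astar(grid, start, goal):
    """A* search on a 4-connected occupancy grid (0 = free, 1 = obstacle).

    Returns a list of (row, col) cells from start to goal, or None.
    """
    def h(a, b):
        # Manhattan distance: admissible heuristic for 4-connected motion
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    rows, cols = len(grid), len(grid[0])
    tie = count()  # tie-breaker so the heap never compares cells/parents
    open_set = [(h(start, goal), next(tie), 0, start, None)]
    came_from, best_g = {}, {start: 0}
    while open_set:
        _, _, g, cell, parent = heapq.heappop(open_set)
        if cell in came_from:          # already expanded with a better cost
            continue
        came_from[cell] = parent
        if cell == goal:               # reconstruct path by walking parents
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get(nxt, float("inf")):
                    best_g[nxt] = ng
                    heapq.heappush(
                        open_set, (ng + h(nxt, goal), next(tie), ng, nxt, cell))
    return None

# A wall across the middle row forces the planner around the right side
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 2))
```

In a real UGV, the grid would come from the costmap built by the sensors, and the planned path would be handed to a local controller for smoothing and tracking.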

In-wheel motor. Source: Elaphe

Wheel hub motor: In electric vehicles, hub motors power each wheel. The more powerful the integrated hub motors are, the more capable the UGV will be in demanding environments.

Steering control: Another crucial component of autonomous vehicles is automatic lateral control. The electric power-assisted steering (EPAS) popular in modern vehicles has various advantages over a conventional steering system. In an autonomous vehicle, EPAS consists of a controller integrated into the steering column, in addition to the electric motor and a torque sensor. The EPAS controller unit receives inputs from the computer, which in turn controls the steering.

Braking: For service brakes, one can use a linear actuator-based control system. In an electric vehicle, regenerative brakes can also help because they capture the kinetic energy generated while braking and convert it back into stored energy in the vehicle battery. Designers can also consider ACME-based linear actuators for parking brakes. There are several off-the-shelf parking-brake systems that a developer can use, depending on the terrain and the ruggedness the system demands.

Vehicle communication networks: The vehicle communication network connects the in-vehicle electronics and devices, as well as the vehicle itself to the external world, using technologies such as Controller Area Network (CAN), Ethernet, Wi-Fi, and mesh networks. The selection of a reliable and redundant network structure is key to avoiding failures. The vehicle network must be able to handle the huge amount of high-speed data generated by the various onboard sensors and support the sophisticated vehicle electronics.
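To make the drive-by-wire messaging concrete, here is a small Python sketch of packing and unpacking a drive command into a CAN-style payload. The message layout (field widths, scaling, and flag bits) is entirely hypothetical and for illustration only; a real vehicle would follow its own DBC definition.

```python
import struct

# Hypothetical 4-byte CAN payload for a drive-by-wire command:
#   bytes 0-1: target speed in units of 0.01 m/s (unsigned, big-endian)
#   byte  2:   steering angle in degrees, offset by +90 (0..180)
#   byte  3:   flags (bit 0 = e-stop, bit 1 = regenerative braking enabled)

def pack_drive_cmd(speed_mps, steer_deg, estop=False, regen=True):
    flags = (1 if estop else 0) | ((1 if regen else 0) << 1)
    return struct.pack(">HBB",
                       int(round(speed_mps * 100)),
                       int(steer_deg) + 90,
                       flags)

def unpack_drive_cmd(payload):
    raw_speed, raw_steer, flags = struct.unpack(">HBB", payload)
    return {
        "speed_mps": raw_speed / 100.0,
        "steer_deg": raw_steer - 90,
        "estop": bool(flags & 1),
        "regen": bool(flags & 2),
    }

frame = pack_drive_cmd(2.5, -15)   # 2.5 m/s, 15 degrees left
cmd = unpack_drive_cmd(frame)
```

Fixed-point scaling like this keeps each command inside a single 8-byte CAN frame, which is why CAN payloads are defined bit-by-bit rather than as free-form text.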

Vehicle communications network with different V2X communication nodes. Source: “Vehicular Communication Networks in Automated Driving Era,” by IEEE members Shan Zhang and Nan Cheng; IEEE student members Jiayin Chen, Feng Lyu, and Weisen Shi, and IEEE Fellow Xuemin (Sherman) Shen



A gigabit-speed network is recommended so that sensors and various processors can take advantage of its high-bandwidth, low-latency, high-reliability links, paving the way to real-time autonomous operations.

Autonomous vehicle sensors

An autonomous vehicle relies on GPS and inertial measurement units (IMUs) for localization and navigation, while it relies on sensors to perceive the environment, obstacles, and moving objects such as other vehicles or pedestrians. While the GPS and IMU provide vehicle position, speed, and direction, it is the cameras, radar, and lidar that facilitate safe decision making. These sensors must work in extreme weather, ranging from bright sunshine to dark, rainy, foggy, or dusty conditions, not to mention freezing or high temperatures. We recommend IP65- or IP67-rated sensors.

Millimeter-wave radars provide crucial data for autonomous vehicle and UGV functions, including obstacle detection, proximity warnings, collision avoidance, lane-departure warnings, and adaptive cruise control. One big advantage of radar over other sensors is that it works accurately in any weather. Radar has evolved significantly over the past decade. Advances in antenna design, signal transmission, processing power, digital signal processing, silicon technologies, and machine vision algorithms have transformed radar into a system on a chip (SoC). In automotive applications, the range requirement is only a few hundred meters, which is where mmWave radar comes into the picture. 77-GHz radar modules are gaining popularity, as they deliver better object resolution and greater accuracy in velocity measurement. These modules come in ultra-compact form factors and provide superior processing power. Off-the-shelf radar modules include Texas Instruments' ultra-high-resolution AWR sensors, which the company says are built with high-accuracy, low-power 45-nm RF CMOS technology.
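Two basic FMCW radar figures of merit follow directly from the chirp parameters. The Python sketch below applies the standard textbook relations; the 4-GHz sweep, chirp time, and ADC bandwidth are illustrative numbers, not values from any particular AWR datasheet.

```python
C = 299_792_458.0  # speed of light, m/s

def range_resolution_m(sweep_bandwidth_hz):
    """Theoretical FMCW range resolution: dR = c / (2 * B)."""
    return C / (2.0 * sweep_bandwidth_hz)

def max_beat_range_m(sweep_bandwidth_hz, chirp_time_s, adc_bandwidth_hz):
    """Maximum unambiguous range set by the receiver's IF/ADC bandwidth.

    The beat frequency for a target at range R is f_b = 2 * slope * R / c,
    so the largest measurable range is R = c * f_adc / (2 * slope).
    """
    slope = sweep_bandwidth_hz / chirp_time_s  # Hz per second
    return C * adc_bandwidth_hz / (2.0 * slope)

# A 4-GHz sweep (e.g., across the 77-81 GHz band) resolves targets
# separated by only a few centimeters in range.
res = range_resolution_m(4e9)
```

This is why 77-GHz modules with wide sweeps can distinguish closely spaced objects that a narrowband 24-GHz radar would merge into one return.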
For advanced autonomous driving applications, developers can consider the Sensor Fusion Kit (camera vision + mmWave radar) from Mistral Solutions, powered by Texas Instruments' TDA3 SoC and AWR1443 FMCW radar sensor.

An integrated camera and mmWave radar sensor platform based on the Texas Instruments TDA3 SoC and AWR1443 FMCW radar sensor, for ADAS applications. Source: Mistral Solutions

Cameras were among the first sensors deployed in vehicles for driver-assistance applications. The technology is mature, and miniaturization has made cameras indispensable components in every modern vehicle. With the introduction of advanced image-processing technologies and vision analytics, cameras have become one of the key sensors in advanced driver-assist systems (ADAS), self-driving cars, and mobile robots. In addition, cameras enable object identification and classification, as well as depth perception, including the position, distance, and speed of objects. In an autonomous surveillance platform, two layers of cameras need to be implemented: the first set of camera sensors helps the vehicle gather data for autonomous operation, and the second gathers intelligence. Surveillance camera specifications are addressed below.

Ultrasonic sensors have fallen in cost, allowing autonomous systems developers to consider them for new applications. Ultrasonic sensors can play a major role in obstacle avoidance. The latest sensors can detect the distance to obstacles and assist in safe maneuvering and automatic parking. Ultrasonic sensors are comparatively economical and work well in bad weather.

Lidar sensors can generate high-resolution 3D maps with detailed information about road features, vehicles, and other obstacles in the terrain. Lidar provides quick scans of objects and helps an autonomous vehicle or UGV perceive its surroundings.

GPS-INS: GPS aids in accurately identifying the vehicle's location on the ground. Traditional GPS receivers may have position error in the range of meters, depending on satellite visibility, signal strength, weather conditions, and surroundings such as buildings, trees, or hills. The safety demands of autonomous vehicles require position errors at the sub-centimeter or millimeter level. Accuracy can be improved by combining GPS with an INS, a device that uses accelerometers, gyroscopes, a GNSS receiver, a microprocessor, and a data logger to continuously calculate the position, orientation, and velocity of the vehicle using dead reckoning. Position errors can be further reduced by deploying a Real-Time Kinematic (RTK) base station, which periodically sends position-correction messages to the GPS receivers.
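The dead reckoning an INS performs between GPS fixes can be sketched in a few lines. This planar, constant-rate-per-step model is a deliberate simplification for illustration; a production filter would fuse full 3D IMU data and GPS corrections (e.g., in an extended Kalman filter).

```python
import math

def dead_reckon(x, y, heading_rad, speed_mps, yaw_rate_rps, dt):
    """One planar dead-reckoning step from wheel speed and gyro yaw rate.

    Between GPS fixes, the INS integrates velocity and yaw rate to keep
    a position estimate; GPS/RTK corrections then reset accumulated drift.
    """
    heading = heading_rad + yaw_rate_rps * dt
    x += speed_mps * math.cos(heading) * dt
    y += speed_mps * math.sin(heading) * dt
    return x, y, heading

# Drive straight east at 2 m/s for 10 steps of 0.1 s: ~2 m traveled
state = (0.0, 0.0, 0.0)
for _ in range(10):
    state = dead_reckon(*state, speed_mps=2.0, yaw_rate_rps=0.0, dt=0.1)
```

Because each step integrates sensor noise, the error grows without bound over time, which is exactly why the periodic GPS/RTK corrections described above are essential.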

Surveillance payloads

For developers, identifying the surveillance needs is the first and most important step in building an application-specific UGV. Advanced electronics modules may be costly, so the developer needs to be clear on the system requirements. Developers and integrators can consider infrared cameras, which use thermal imaging regardless of lighting conditions; pan-tilt-zoom (PTZ) cameras for covering large areas and seeing small details; and 360-degree cameras, which provide a bird's-eye view of the surroundings.



Command-and-control vehicles. Source: Mistral Solutions

In addition, standard surveillance equipment, including communications systems and control consoles, is available on the market. The communications network is another aspect to be considered. Video feeds from the various cameras are streamed live to the teleoperator console or the command center. For this, one can use either an existing Wi-Fi network or a wireless mesh network; the latter enables a wireless ad hoc network to communicate faster and more safely in a demanding environment.

Teleoperator console: Building an autonomous surveillance vehicle does not mean that security is independent of human intelligence. There is always a human eye in the loop, monitoring the vehicle's movement and keeping a check on the live feeds received. This is where the teleoperator console comes into the picture. The operator console acts as a command center and facilitates the gathering of surveillance intelligence. The console should be equipped with reliable communication and control systems. The operator should be able to take remote control of a vehicle at any time, control the surveillance payload, make an emergency announcement, start or stop the vehicle, and so on.

Developers can integrate the teleoperator console into an existing central command-and-control center, or it can be designed as a portable kit. The portable console should be lightweight and easily deployable. A standard teleoperator console consists of a touch control panel that runs an intuitively designed Web app for configuring missions and teleoperating the vehicle. Consider a metro-style user interface, such as one built with a grid layout library like Gridstack, which lets the user define what to focus on in each mode of operation.

e-Stop and telemetry data: Consider a situation in which the user wants to stop the UGV immediately -- that's when an emergency stop, or e-Stop, feature comes in handy. With the press of a button, the vehicle and its surveillance systems shut down completely. This can safeguard the vehicle from being tampered with or protect it from an unexpected malfunction. In addition to the vehicle network, a dedicated wireless network is required to implement the e-Stop functionality. There are third-party, off-the-shelf products available in ISM (Industrial, Scientific, and Medical) and non-ISM frequencies, but they tend to be costly.
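A common way to implement the e-Stop link in software is a latching heartbeat watchdog: the console radio sends periodic heartbeats, and losing them, or receiving an explicit stop, forces the vehicle into a safe state. The timeout value and interface below are illustrative assumptions, not a specification.

```python
import time

class EStopWatchdog:
    """Latching e-Stop: trips when console heartbeats stop arriving.

    The dedicated e-Stop radio link sends periodic heartbeats; if they
    cease, or the operator presses the button, the vehicle must stop.
    """

    def __init__(self, timeout_s=0.5, now=time.monotonic):
        self._timeout = timeout_s
        self._now = now              # injectable clock, eases testing
        self._last_beat = now()
        self._tripped = False

    def heartbeat(self):
        """Called whenever a heartbeat arrives over the e-Stop radio."""
        self._last_beat = self._now()

    def trigger(self):
        """Operator pressed the e-Stop button: latch until reset."""
        self._tripped = True

    def vehicle_may_move(self):
        if self._tripped:
            return False
        return (self._now() - self._last_beat) <= self._timeout

wd = EStopWatchdog(timeout_s=0.5)
moving_allowed = wd.vehicle_may_move()
```

Note the fail-safe bias: silence on the link is treated the same as a stop command, so a jammed or failed radio halts the vehicle rather than leaving it uncontrolled.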

As a cost-effective alternative, developers can consider long-range wireless networks for the e-Stop functionality.

Security UGVs require multiple disciplines

Using the technologies discussed above, developers can build a state-of-the-art autonomous surveillance vehicle. They will need a customized electric vehicle platform, drive-by-wire actuators and sensors, and ROS. Builders will also need a graphics processor that runs compute-intensive navigational and visual algorithms, a surveillance payload, and Web-based GIS systems. The resulting rugged, all-terrain UGV can be deployed for a wide range of applications, such as perimeter surveillance, 24x7 patrolling of critical infrastructure, first response to fires or other disasters, and more.

To realize a surveillance UGV, product developers need experience in mechanical design, development, and integration of electrical components, as well as power management, various sensors and actuators, and software. Select mechanical parts and components that ensure high protection from shock and vibration. Knowledge of tools such as HTML5, CSS3, Python, MySQL, Bootstrap, jQuery, and ROS Kinetic and Melodic will help developers kickstart such projects immediately. Knowledge of drive-by-wire (DBW) algorithms, navigation, obstacle avoidance, path planning, and visualization provides an added advantage. RR

About the authors: Raja Subramanian N is senior technical architect - software design, and Pramod Ramachandra is technical architect - software design, at Mistral Solutions. Mistral is a certified product design and systems engineering company focused on product engineering services, aerospace and defense, and homeland security.

mmWave technologies. Source: Mistral Solutions



Software

•• ••

Future connectivity requirements for unmanned aerial vehicles

For many applications, traditional cellular communications, including 5G, are suboptimal for data-intense UAVs. What is required are new cellular communications solutions that provide seamless connectivity and unlimited range for vehicles and ensure that all data is transmitted and received to and from related parties from any location.

By Deniz Kalaslıoğlu • Founder, Soar Robotics

Drones and unmanned aerial vehicles (UAVs) have become invaluable assets for end users, enterprises, and governments in the past decade. They continue to prove their value as propulsion, sensor, and avionics technologies evolve. As drones become ever more capable and affordable, new applications are opening up in different industries.

We are witnessing an incredible increase in autonomous aerial operations in urban and industrial areas. Thus, the global markets for commercial drones, urban air mobility, and drone delivery maintain strong growth. Various companies have already started testing autonomous aerial transportation of goods and people. Meanwhile, others are creating value by collecting precious aerial data and intelligently processing it with the help of artificial intelligence. These new applications could transform industries ranging from agriculture and mining to construction.




Two fundamental leaps in the technology centered on vehicle autonomy and flight range. As a result, these vehicles are no longer limited to an operator's manual inputs or to a few hundred feet in any direction; they have gradually gained the ability to fly beyond the visual line of sight (BVLOS). These advancements make a whole range of new applications possible, such as drone delivery, aerial inspection and monitoring, and urban air mobility.

Connectivity challenge

As futuristic as all of this may sound, a substantial obstacle stands in the way of UAVs becoming commonplace: connectivity. Innovators are designing and developing vehicle-side technologies and advancing the capabilities of aerial vehicles.

However, without a unified infrastructure to meet these connectivity needs, it is not possible to ensure safe flights and manage air traffic. Many of the proposed applications for UAVs won't take off without seamless, robust, range-independent connectivity -- the ultimate building block for safe, intelligent, and autonomous aerial operations.

Drone and UAV connectivity

Drone and UAV connectivity has traditionally been thought of as direct, point-to-point communication between a ground station/operator and the vehicle. Although



Hyundai's flying taxi concept for a planned Uber aerial taxi service. Credit: Uber

communication is a broad topic, we can classify aerial vehicle connectivity under two types: control and non-payload communication (CNPC) and payload communication (PC).

CNPC is the dedicated two-way link that transfers critical mission data between the vehicle and a ground station/operator. It ensures safe operation by allowing the aircraft to be remotely monitored and controlled, with intervention if necessary. Payload communication, on the other hand, is often implemented as a one-way link that transmits application-specific payload data from the drone to the ground. Since PC carries high-quality, large payload data such as real-time HD, thermal, infrared, or multispectral video, it requires much higher throughput than CNPC. In contrast, CNPC is more time-sensitive and requires much lower latency.

UAVs come in sizes ranging from a few centimeters to several tens of meters and in a large variety of configurations, such as rotary-wing, fixed-wing, multirotor, and many others, with an equally wide range of applications. In addition to the differing requirements of PC and CNPC, numerous individual combinations of connectivity parameter settings must be available to fulfill these different use cases.


Innovation marches on

Since it is of utmost importance to safely manage air traffic, technologies and protocols to support critical communications of piloted aircraft have existed for decades. Non-military use of pilotless aircraft is relatively recent, going back years rather than decades. Nevertheless, there is an astonishing level of focus, and much work has been done by industry-leading stakeholders, to enable new aircraft, air taxis, and delivery drones to enter and share the skies of our future safely. The U.S. Federal Aviation Administration (FAA), the main regulatory body that has been

indisputably leading the aviation industry throughout the world, has decided to separate piloted and pilotless traffic management -- NextGen and Unmanned Traffic Management (UTM), respectively -- to better fulfill the different requirements of each. This separation is an excellent start for scaling aerial operations. Especially with the ever-increasing amount of onboard data from sensors, avionics, and payloads on both types of aerial vehicles, current communication architectures that enable collision avoidance, voice data, and telemetry will not suffice for the next generation of piloted vehicles. Besides, when you consider that

An Amazon Prime Air Delivery Drone. Credit: Amazon

www.therobotreport.com

THE ROBOT REPORT

11/3/20 8:27 AM


estimations predict demand for millions of drones in the coming years, this will become a massive problem.

Different challenges

Some might point out that the apparent solution to the drone connectivity problem is to use cellular networks, since they stand ready nationwide, providing wide-area, high-quality, secure connectivity for people. 5G has already begun enabling more reliable, low-latency, high-throughput cellular communications for many of its users, so it might seem to be the solution to the emerging problem of aerial vehicle connectivity. However, autonomous aerial operations impose entirely different challenges in capacity, coverage, cost, user data rates, and latency.

Limitations of cellular network infrastructure

Current cellular network infrastructure and protocols are designed to support terrestrial users. The antenna and channel models are designed for two-dimensional ground coverage and cannot adapt to a 3D space with constantly moving users. The uplink/downlink bandwidth allocation is optimized for human use, meaning the physical bandwidth allocated and the algorithms are designed to support much higher download speeds.

Due to altitude, a high probability of line-of-sight channels with a vast number of base stations causes interference and association problems. A UAV flying at a 100-foot altitude can easily receive downlink signals from more than 100 ground base stations in dense urban settings -- a multitude of weak signals from many cells at once. That is precisely the opposite of how cellular networks and current protocols are designed to work. The UAV also causes interference to these ground base stations, meaning an individual channel is jammed in almost 100 cells and unable to serve even the terrestrial users. For these reasons, current cellular networks cannot adequately support aerial vehicles, vertical take-off and landing (VTOL) aircraft, or passenger drones.
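A quick free-space path-loss calculation illustrates why an airborne UAV hears so many cells at comparable strength: with line of sight, loss grows only slowly with distance, unlike the heavily obstructed propagation a terrestrial handset sees. The frequency and distances below are illustrative choices.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / C)

# At 2.6 GHz under free-space conditions, going from 1 km to 5 km
# (5x the distance) adds only about 14 dB of loss, so dozens of
# base stations can arrive at the UAV within a narrow power window.
loss_1km = fspl_db(1_000, 2.6e9)
loss_5km = fspl_db(5_000, 2.6e9)
```

Since received powers from many cells cluster so closely, the handset-style "attach to the strongest cell" logic breaks down, which is the association problem described above.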
These networks need to evolve to be able to support autonomous aerial operations. Companies like Uber and Ehang have already started testing their aerial vehicles with the existing infrastructure. However, the infrastructure needs, at the very least, some significant upgrades if not built from scratch.





Connectivity requirements

A protocol stack for autonomous aerial operations built on existing 5G specifications might enable millions of continuously moving, connected devices that constantly exchange information. The new wireless network should enable ultra-reliable, low-latency, high-throughput communications for vehicles traveling at high velocity. Vehicles in the autonomous world will operate in countless types of applications, in many different locations and conditions. Each vehicle will be unique, and so will its connectivity requirements. This calls for extremely flexible network settings that continually adapt to meet those requirements. Built-in capabilities to delegate crucial and time-sensitive computation to the device edge and cloud edge will help optimize the load and immensely improve overall network performance and economics.

New wireless networks

This new cellular communications network will provide seamless connectivity and unlimited range for drones and aerial vehicles and ensure that all data is transmitted and received to and from related parties from any location. Facilitating greater data exchange will offer extensive cloud computing capabilities, enabling drones to scale on the cloud through data processing and management services. Examples of these services include UTM, AI-powered data analysis, and over-the-air software updates, to name a few.

Traditionally, cellular communication networks are thought of as one big architecture that serves all. However, advancements in autonomous vehicles, UAVs, and robotics have been forcing the nature of telecom infrastructure to change. While a great deal of attention has been given to IoT connectivity, connectivity for data-intense vehicles is a rather recent phenomenon. Given that millions of autonomous vehicles will be added in the near future, unified approaches to telecom infrastructure need iteration.
So deploying partitioned networks tailored to specific needs might be the solution. Indeed, these developments may lead to a next-generation telecommunications company that solely serves the autonomous world. RR





The future of hybrid robotics

There are clear reasons to choose either the cloud or the edge, but it's equally evident this will continue to shift and evolve.

By Jeff Linnell • Formant

Robotics began at the edge. Early robots were massive, immobile machines operating on factory floors, with plenty of space for storing what little data they required locally. Over recent years, however, robots have left the factory floor and are moving around in an increasing number and variety of environments. These robots are no longer refrigerator-sized automata punching out widgets. Now, rather than worrying about workers bumping into robots, we have to worry about robots bumping into workers.

The new unstructured environments that autonomous systems are venturing into are invariably fraught with obstacles and challenges. Humans can assist, but we need data, lots of it, and in real time. Companies like Formant have used cloud technology to meet those needs. We've enabled companies to observe, operate, and analyze this new wave of robotic fleets remotely and intuitively through our cloud platform.

This notion of everything being pushed to the cloud all of the time, however, is beginning to come into question. Using the integrated GPU cores in NVIDIA's Jetson platform, we can swing the pendulum back toward the edge and reap its advantages. The combination of GPU-optimized edge data processing paired with

40

November 2020

Hybrid_Robotics_Vs3_ed.indd 40

www.therobotreport.com

THE ROBOT REPORT


Formant uses NVIDIA’s Jetson with its cloud platform for next-generation robotics controls. Source: Formant

Formant’s observability and teleoperation platform can create an efficient command-and-control center right out of the box. When using Jetson devices, Formant users can now enable real-time video and image analytics in the cloud and perform PII scrubbing at the edge, ultimately sustaining more reliable connections and better privacy protections. This also allows for much greater cloud/edge portability, as the very same algorithms can run in both places. This hybridized model allows one to do away with “one-size-fits-all” solutions and opt for balance between in-cloud and on-device operations.

Find balance with cutting-edge encoding

An optimal teleoperation experience requires the proper balance between latency, quality, and computational availability. In the past, striking such a balance wasn’t an easy task. In essence, for

each quality one sought to prioritize would come at the expense of the other two. By applying Formant’s tooling and the portability of NVIDIA’s DeepStream SDK, users can readjust those balances as needed to optimize data management for their specific use case. The most immediately useful capability we gain by using Jetson is hardware-accelerated video encoding. When Formant detects that you are using a Jetson or compatible device, it unlocks the option to automatically perform H.264 encoding at the edge. This enables high-quality transmission of full-motion video with substantially diminished bandwidth requirements, lower latency, and lower storage requirements if buffering the data for later retrieval. When it comes to measuring the performance of a teleoperation system, latency is one of the most important criteria to consider. This is even more so the case when
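The hardware-versus-software encoder choice can be sketched as a simple capability check. The GStreamer element names below are real plugins (`nvv4l2h264enc` is Jetson's hardware H.264 encoder, `x264enc` the common CPU fallback), but the selection logic is an illustrative sketch, not Formant's actual implementation:

```python
def build_h264_pipeline(has_jetson_encoder: bool, bitrate_kbps: int = 2000) -> str:
    """Build a GStreamer pipeline description for H.264 video streaming.

    Prefers the Jetson hardware encoder when available; otherwise falls
    back to CPU-based x264 tuned for low latency.
    """
    if has_jetson_encoder:
        # nvv4l2h264enc runs on the Jetson's dedicated encode block,
        # leaving the CPU cores free for other robot workloads.
        encoder = f"nvv4l2h264enc bitrate={bitrate_kbps * 1000}"
    else:
        # Software fallback: trade some quality for CPU headroom.
        encoder = (f"x264enc bitrate={bitrate_kbps} "
                   "speed-preset=ultrafast tune=zerolatency")
    return f"v4l2src ! videoconvert ! {encoder} ! h264parse ! rtph264pay ! udpsink"

print(build_h264_pipeline(True))
```

The same pipeline string runs on either class of device, which is the cloud/edge portability point made above: only the encoder element changes.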




When measuring the performance of a teleoperation system, latency is one of the criteria to consider. This is even more so the case when dealing with video encoding, an exceedingly resource-intensive process that can easily introduce latency. Source: Formant

dealing with video encoding, an exceedingly resource-intensive process that can easily introduce latency if pushed to the limit. In our tests, 1080p resolution at 30 frames per second pegged all six of our cores at 100% utilization when not using hardware acceleration, which introduced a fair amount of latency into the pipeline. When our Jetson implementation is activated, however, the average CPU utilization for the system drops to below 25%. This not only improved latency significantly, it also freed up the CPU for other activities.

The future of hybrid robotics

With Formant, one has the ability to fine-tune and balance what operations occur

at the edge and in the cloud. We think this flexibility is a huge milestone in the business of robotics. Just imagine a chief financial officer saying that your LTE bill is way too high, your engineering team deciding that it needs to use cheaper devices and smaller batteries, or new data transmission and sovereignty regulations being passed. With the ability to determine what is done at the edge and what is done in the cloud, these otherwise heavy lifts become as simple as checking a box and swinging the pendulum. At the moment, these are crucial decisions, and the striking of the edge/cloud balance must be decided by engineers. Formant provides these interfaces to let you tune the system

easily. Looking ahead, we envision automated, dynamic “load balancing” between the edge and the cloud. You essentially will define rules and budgets to optimize around. For example, when your robot is connected to Wi-Fi, plugged into power, and idle, you could automatically use this spare time and power to leverage the GPU for semantic labeling and data enrichment, then upload the data to the cloud while the bandwidth is cheap. There are clear reasons to choose either the cloud or the edge for your computation. It’s equally evident that this line will continue to shift and evolve. RR
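That rule-and-budget idea reduces to a small predicate. The sketch below, with invented field names and no connection to Formant's product, schedules the GPU enrichment job only when connectivity, power, and compute are all "free":

```python
from dataclasses import dataclass

@dataclass
class RobotState:
    on_wifi: bool   # connected to a cheap, high-bandwidth network
    on_power: bool  # external power, so the battery budget is irrelevant
    idle: bool      # no active task competing for the GPU

def should_run_edge_enrichment(state: RobotState) -> bool:
    """Rule: spend spare GPU time on semantic labeling and upload
    only when the robot is on Wi-Fi, on power, and idle."""
    return state.on_wifi and state.on_power and state.idle

# A robot docked overnight qualifies; one on LTE mid-mission does not.
print(should_run_edge_enrichment(RobotState(True, True, True)))    # True
print(should_run_edge_enrichment(RobotState(False, True, False)))  # False
```

A production scheduler would weigh budgets (bandwidth cost, battery margin) rather than booleans, but the shape is the same: declarative rules deciding which side of the edge/cloud line a workload runs on.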

In our tests, 1080p resolution at 30 frames per second pegged all 6 of our cores at 100% utilization when not using hardware acceleration. This then caused a fair amount of latency to be introduced to the pipeline. However, when our Jetson implementation is activated, the average CPU utilization for the system drops to below 25%. Source: Formant






We tele-operated Spot from 3,000 miles away

Steve Crowe, Editor of The Robot Report, tele-operating Boston Dynamics’ Spot quadruped around Golden Gate Park in San Francisco. | Credit: Steve Crowe

Dashboard view of the data collected by Formant’s software during our Spot walk. | Credit: Formant

Senior Editor Eugene Demaitre and I recently tele-operated Boston Dynamics’ Spot robot. We took the quadruped for a walk around Golden Gate Park in San Francisco, controlling it from our homes near Boston. The opportunity to tele-operate Spot is part of a program Formant ran to familiarize the public with robots. Spot is impressive. It does all the hard work, such as walking up and down stairs and handling different terrain. All we did was point the robot in the proper direction. I controlled Spot with a PS3 controller plugged into my laptop, while Eugene used his laptop’s touchpad. Neither Eugene nor I crashed Spot, but I did drive it into the brush once. At one point, Formant tried to get me to walk Spot down a steep, muddy hill. Even they were somewhat unsure about how the robot would handle the terrain, so I decided against putting a $74,500 machine into harm’s way. Formant said walking in tall grass can still throw the robot for a loop. “The new adage would be, if the robot apocalypse happens, run into a field of tall grass,” Formant joked. We walked Spot for about one hour, taking a couple of minutes to switch out the battery once. I used the right stick on my controller to move Spot forward or backward. The farther you push the stick in a direction, the faster Spot moves. The left stick moves Spot sideways, which came in handy when it was in the shrubs. Eugene walked Spot up and down stairs. He simply put Spot into position in front of the stairs, switched Spot into “stairs mode,” and the robot flawlessly did the rest. From an engineering standpoint, Formant founder and CEO Jeff Linnell said this is “an 11 out of 10.” “That is testament to 20 years of development. [Boston Dynamics] is the one company on Earth that really got that right and had the nerve to do it,” said Linnell. “[The reason] I’m not afraid to walk it up stairs is that [Boston Dynamics] is all about trying to break the machine, so they can make it better.
They’ve spent 20 years doing that, and this is the result of it, and every other robotics company is going to benefit.” “I think it’s going to be the kind of thing we take for granted moving forward, because they just proved that this type of mobility is finally possible,” he said. The peripheral vision wasn’t great, so it was tough to get a sense of what was on either side of Spot. Perhaps that was because we were first-time operators. But that could also be easily corrected by making another camera view available. Using Formant’s software, Spot collects a ton of data as it moves, but the interface for Eugene and me was quite clean. The main view is a forward-looking perspective from one of Spot’s cameras, but we also had a GPS view and a wide-angle view captured by a Formant employee who acted as Spot’s handler in case something went wrong. At the bottom of our screens, we could also see metrics such as battery life, memory, ping rate, and what mode Spot was in. Formant can customize this for robotics engineers who might want to see different metrics. “What do you wanna do with [Spot]?” Linnell said. “I think that’s the real question. You’ve got a really incredible mobility platform; now you can start dreaming up applications.” RR

Steve Crowe • Editor, The Robot Report



SMALL MOBILE ROBOTS JUST GOT SAFER.

Now streaming: On Air with SICK. Available on your favorite podcast streaming app.

SICK’s new ultra-compact safety laser scanner, nanoScan3, equips small AGVs and mobile robots with leading-edge safety technology. This space-saving sensor can be used wherever machines and vehicles require maximum safety performance, but have minimal mounting space. It adds the required security without sacrificing weight or size. We think that’s intelligent. www.sick.com




How to develop software for safety in medical robotics As healthcare robotics continues to evolve, well-designed software will be paramount for safety and consistent performance.

Susan Jones • MedAcuity

The use of robotics in medicine continues to grow. Whether it’s a collaborative robot working alongside humans in manufacturing or a surgical robot in the operating room, a single point of failure can cause serious harm. The incorporated software systems must take safety into account. IEC 61508-3 describes several techniques for developing software for safety-related systems. The medical device software development community can draw on them when designing and implementing risk-control measures as required by ISO 14971. Developing “safe” software begins with establishing a software coding standard. IEC 61508-3 promotes using well-known techniques, including:
• Using modular code.
• Using preferred design patterns.
• Avoiding re-entrance and recursion.
• Avoiding dynamic memory allocations and global data objects.
• Minimizing the use of interrupt service routines and locking mechanisms.
• Avoiding dead wait loops.
• Using deterministic timing patterns.

Keep safety simple

There are other suggestions under the “keep it simple” principle around limiting the use of pointers, unions, and type casting, and not using automatic type conversions while



da Vinci robotic surgical system. | Credit: Intuitive Surgical

encouraging the use of parentheses and brackets to clarify intended syntax. A hazard analysis might identify that your code or data spaces can become corrupted. There are well-known risk-control measures for maintaining code and memory integrity that can be easily adopted. You can run code from read-only memory, protected with a cyclic redundancy check (CRC-32) that is verified at boot time and periodically during runtime. This can prevent errant changes to the code space and provide a mechanism to detect such failures. Data can likewise be segregated into different memory regions protected through the virtual memory space, with a CRC-32 over blocks of memory regions, or even a checksum added to each item stored in memory, so that these CRCs/checksums can be checked periodically. CRCs/checksums can be verified on each read access to a stored item and updated atomically on every write access to these protected items. Building tests into the software is an important tool as well. It’s a good idea to perform a power-on self-test (POST) at power-up to make sure the hardware is working and to check that your code and data spaces are consistent and not corrupt.
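The per-item checksum scheme described above boils down to verify-on-read, update-on-write. The sketch below is an illustrative model only (safety-related device code would typically be embedded C, and the class name here is invented), using the standard CRC-32:

```python
import zlib

class ProtectedItem:
    """A stored value guarded by a CRC-32: the checksum is updated
    atomically on every write and verified on every read."""

    def __init__(self, value: bytes):
        self._value = value
        self._crc = zlib.crc32(value)

    def write(self, value: bytes) -> None:
        # Update value and checksum together (the atomic write access).
        self._value = value
        self._crc = zlib.crc32(value)

    def read(self) -> bytes:
        # Verify integrity on each read; a mismatch means the memory
        # was corrupted between writes.
        if zlib.crc32(self._value) != self._crc:
            raise RuntimeError("data-space corruption detected")
        return self._value

item = ProtectedItem(b"setpoint=42")
assert item.read() == b"setpoint=42"
item._value = b"setpoint=99"  # simulate corruption that bypasses write()
# item.read() would now raise RuntimeError
```

The same check, run over whole memory regions at boot, is the data-space half of the POST described above.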

What else can happen?

Another hazardous situation arises when controlling and monitoring are performed on the same processor or in the same process. What happens to your safety system if your process gets hung up in a loop? Techniques that separate the monitor from the controlling function introduce some complexity to the software system, but this complexity can be offset by ensuring the controlling function implements the minimum safety requirements while the monitor handles fault and error recovery. Fault-detection systems and error-recovery mechanisms are much easier to implement when designed in from the start. Poorly designed software can exhibit unexpected, inconsistent timing, which results in unexpected failures. It’s possible to avoid these failures by controlling latency in the software. State machines, software watchdogs, and timer-driven events are common design elements for controlling this.

Keep an eye on communications

Inter-device and inter-process communications are another area of concern for safety-related systems. The
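A software watchdog of the kind mentioned above reduces to a heartbeat and a deadline check. The sketch below is a simplified, hypothetical illustration of separating the monitor from the controlling function, not code from any medical device:

```python
import time

class SoftwareWatchdog:
    """Monitor-side timer: trips if the controlling function stops
    'kicking' it within the timeout, e.g. because the control loop
    is hung."""

    def __init__(self, timeout_s: float):
        self.timeout_s = timeout_s
        self.last_kick = time.monotonic()

    def kick(self) -> None:
        # Called by the controller at the end of each healthy cycle.
        self.last_kick = time.monotonic()

    def expired(self) -> bool:
        # Polled periodically by the separate monitor function.
        return (time.monotonic() - self.last_kick) > self.timeout_s

wd = SoftwareWatchdog(timeout_s=0.05)
wd.kick()
assert not wd.expired()   # controller just checked in
time.sleep(0.06)          # simulate a hung control loop
assert wd.expired()       # monitor would now trigger error recovery
```

Running `expired()` in a separate process (or a hardware watchdog) keeps the monitor alive even when the controller itself is the thing that failed.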

integrity of these communications must be monitored to ensure they are robust. Using a CRC-32 on any protocol between two entities is recommended. Separate CRC-32 values on the headers and the payload help to detect corruption of these messages. Protocols should be designed with the idea that, at any time, your system could reboot due to some fault; building in retry attempts and stateless protocols is therefore recommended. Safe operational software verifies the ranges of all inputs at the interface where they are encountered; checks internal variables for consistency; and defines default settings to help recover from an inconsistent setting or to support a factory reset. Software watchdog processes can be put in place to watch the watcher and ensure that processes are running as they should. By taking these techniques into account, software developers working on medical robotics can better address the concerns of safety-related systems. RR
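The separate header and payload CRCs can be shown with a minimal message frame. The frame layout below is an invented example for illustration, not a standard protocol:

```python
import struct
import zlib

def pack_message(msg_type: int, payload: bytes) -> bytes:
    """Frame: [type:u16][len:u16][header_crc:u32][payload][payload_crc:u32].
    Separate CRC-32s let the receiver distinguish header corruption
    from payload corruption."""
    header = struct.pack("<HH", msg_type, len(payload))
    header_crc = struct.pack("<I", zlib.crc32(header))
    payload_crc = struct.pack("<I", zlib.crc32(payload))
    return header + header_crc + payload + payload_crc

def unpack_message(frame: bytes) -> tuple[int, bytes]:
    header, header_crc = frame[:4], frame[4:8]
    if zlib.crc32(header) != struct.unpack("<I", header_crc)[0]:
        raise ValueError("header corrupted")
    msg_type, length = struct.unpack("<HH", header)
    payload = frame[8:8 + length]
    payload_crc = struct.unpack("<I", frame[8 + length:12 + length])[0]
    if zlib.crc32(payload) != payload_crc:
        raise ValueError("payload corrupted")
    return msg_type, payload

frame = pack_message(7, b"velocity=0.5")
assert unpack_message(frame) == (7, b"velocity=0.5")
```

Because each message is fully self-verifying and carries no session state, a receiver that reboots mid-stream can resume by simply discarding any frame that fails either check and asking for a retry.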



COBOTS

eKAMI and READY Robotics train former coal miners to program multiple robots

A Kentucky organization and a robotics software provider have teamed up to retrain workers, providing a model for U.S. industry.

By The Robot Report staff

Automation is key to the U.S. competing on a global manufacturing stage, but the nation must train its workforce to be fluent in advanced skills such as robot programming. It is imperative that developers make automation easier to deploy to enable broader adoption and power the future of manufacturing, according to READY Robotics Corp. However, user-friendly technology isn’t enough on its own; workers need to be rapidly trained in the skills necessary to implement automation. The future of U.S. manufacturing requires the democratization of automation, with a combination of technology and worker training/upskilling, said the company. One example of success is in Paintsville, Ky. In the middle of coal country, Kathy Walker has built a model for the future at the Eastern Kentucky Advanced Manufacturing Institute (eKAMI). eKAMI is teaching unemployed coal miners the advanced manufacturing skills that U.S. manufacturing sorely needs. “We are re-skilling the region’s people for jobs in advanced manufacturing,” said Walker. “This community needs it, and U.S. manufacturing needs it.” Proof that the skills eKAMI is teaching are in high demand: eKAMI has trained over 100 students with a 100% job placement rate. Most students accept job offers before they graduate from the 16-week course. eKAMI was a 2020 RBR50 innovation award winner for its work with mobile robot maker AutoGuide on reskilling workers.



The Eastern Kentucky Advanced Manufacturing Institute is educating robotics technicians. Source: READY Robotics

While the organization’s advanced manufacturing course teaches valuable skills, READY Robotics said it was excited by the opportunity to help eKAMI expand its curriculum to include robotics skills that enable eKAMI students to multiply their impact on the manufacturing floor.

READY Robotics brings Forge/OS to eKAMI curriculum

“Robots are far too hard to program. So despite falling robot prices, the programming barrier has kept automation out of reach for many manufacturers,” said Kel Guerin, Ph.D., co-founder and chief technology officer of READY Robotics. “That’s why we built Forge/OS — to enable any manufacturer to deploy and own their automation.” “We’re seeing incredible results from our customers who are powering their automation with Forge/OS, but we realize that the more we can train up workers skilled in automation, the more we can magnify the impact of Forge/OS,” he said. Creating a reasonable path for workers to learn the skills

necessary to design, program, deploy, manage, and troubleshoot automation is critical to the health of U.S. manufacturing, according to READY Robotics. The skilled labor shortage requires industry to make better use of the existing workforce by re-skilling workers and training the next generation with the skills that are in demand. Factories need workers who can program a CNC (computer numerical control) machine, and then design, program, and deploy the tools necessary to automate that workcell, said the Columbus, Ohio-based company. READY Robotics said its three-week add-on to eKAMI’s 16-week CNC course teaches these skills. The course goes beyond just programming the robot. It also includes cell design, robot and hardware evaluation, parts presentation, machine tool operation, programming peripherals, and more. Using Forge/OS, the eKAMI students were able to program Yaskawa, FANUC, and UR robots in one day, said



Forge/OS is designed to be hardware-agnostic. Source: READY Robotics

READY Robotics. It provided a single platform that enabled them to quickly program multiple brands of robots. As a result, the students were able to spend the bulk of their three-week curriculum learning the details of automation and getting robots and machine tools to work seamlessly together. This allowed all the students to program a lights-out manufacturing task in just 2.5 weeks, when none had ever touched a robot before.

Training to create a ‘superhuman workforce’

“We can’t hire our way out of our skilled labor shortage,” said Aaron Prather, R&D evangelist at FedEx Express. “But we can upskill our workers and create a superhuman workforce with skills that make them orders of magnitude more productive than they otherwise would have been. I believe this has the potential to both increase manufacturing output and create tens of thousands of high-quality jobs!” It’s not just thought leaders like Prather who are recognizing the importance of workers skilled in automation, but political leaders as well. On Aug. 27, Sen. Mitch McConnell visited eKAMI to tour the facility and see its efforts to produce personal protective equipment (PPE) for front-line medical professionals during the COVID-19 pandemic.


“We were excited to share with Sen. McConnell what we are doing to help train a new generation of manufacturing professionals skilled in robotics,” said Ben Gibbs, co-founder and CEO of READY Robotics. “Having a workforce trained in robotics brings a host of benefits, including higher productivity, greater cost competitiveness, and more resiliency.” On Aug. 21, eKAMI’s most recent class of 14 graduated with the skills to design, program, and deploy robotic automation. READY Robotics asserted that this knowledge should be accessible to everyone in the industry, but most manufacturers don’t have access to an advanced training facility like eKAMI. With Forge/OS and access to automation training, American

manufacturing is ready to take the next step forward into an Industry 4.0 future, said READY Robotics. It added that it is actively working to make its curriculum available online to everyone. READY.academy extends beyond robot programming to include all aspects of automation. “We’re excited about our robotics contribution to eKAMI’s curriculum, and we’re excited to build on these efforts in a way that can solve the skilled labor shortage at scale and enable every U.S. manufacturer to easily deploy automation,” said Gibbs. RR

eKAMI is using Forge/OS as part of its curriculum. Source: READY Robotics



BETTER PARTS | BETTER PROCESS Additive Manufacturing Experts

FAST DELIVERY

REDUCE INVENTORY

Turn 6-8 week lead times into days

Manufacture Parts on Demand

IMPROVE DESIGN Optimize Material and Weight

MANUFACTURING TO BLUEPRINT

Connect today for a free analysis of your parts and learn more about how we can be your partner in manufacturing innovation. Better match materials for your design and come out ahead in delivery times and overall quality improvement.

Specializing in manufacturing complex functional parts that demand a quality-driven process.

www.azoth3d.com Manufacturing On-Demand




MT Solar automates small-batch welding with UR10e, Vectis cobot tool

A Montana-based solar module maker turned to collaborative welding robots to meet a seasonal surge in demand.

By The Robot Report staff

Of all the tasks within manufacturing, welding is one that could benefit the most from automation because of skilled labor shortages, as well as the need for precision and flexibility. MT Solar designs and makes mounting structures for solar modules of all sizes. The Charlo, Mont.-based company faced a 300% jump in demand every summer and found that conventional robots in safety cages could not meet its requirements. MT Solar ultimately chose Vectis Automation LLC’s DIY Cobot Welding Tool, which includes a UR10e collaborative robot arm from Universal Robots A/S. The system now handles a wide range of welds, enabling quick changeovers and optimized production, said the manufacturer.

MT Solar faces staffing, flexibility challenges

MT Solar experiences a threefold jump in demand for its solar mount products every summer, but it has been unable to find skilled welders to handle the seasonal uptick. Travis Jordan, owner and president of MT Solar, was in his office one day, “just scrambling” to deal with the labor shortage, when an employee handed him an article on welding robots. “He said, ‘I really think you should look into this; it would be a good solution for our team,’” Jordan recalled. “And I’m like, ‘Well, if you’ve got one of the operators saying you need to look into robotics, you’ve got a reason you should be doing something here.’”



The DIY Cobot Welding Tool does not require a lot of space at MT Solar. Source: Universal Robots

At that point, MT Solar’s lead times were two to three times what they were supposed to be. “It’s hard to find good, skilled people that are willing to come up here and work,” explained Jordan. The company looked into conventional welding robots but found them best suited to huge batches of the same item and lacking in flexibility. MT Solar’s hiring woes reflect a national trend: The American Welding Society predicts a potential shortfall of 400,000 welders by 2025. Finding a flexible automation solution was crucial for MT Solar, as the company makes many different types of mounting parts, often in high-mix/low-volume batches. “Think of us as a ‘solar Ikea,’ if you will, where all the pieces have to go out to the customer to be assembled in the field,” Jordan said. “If I don’t have all the other parts that go with it, I can’t ship anything.” The cost of conventional automation was compounded by the hassle associated with programming and setup, he added.

“At first, it might look like a good idea to use traditional robots, but when you look at the time and resources to get them up and running and programmed, it was not the route we wanted to take,” said Jordan. “Conventional robots aren’t very flexible. They can’t handle a mixed bag. The envelope is too small.” “Or, the fixturing can’t be manipulated properly, and the cost would have gone through the roof,” he added. MT Solar preferred a system that wouldn’t require safety guarding and that existing operators could handle.

Cobots offer a solution

Jordan said the paradigm changed when MT Solar discovered Universal Robots. “The big difference is the collaborative robot approach of being able to work with the robot, and it being so teachable and so easy to run,” he noted. While traditional welding robots require safety cages and can look “kind of like a nightmare,” cobots offer safe human-




“Cobots are meant for industrial environments,” says Josh Pawley (left), co-founder of Vectis Automation. Source: Universal Robots

robot collaboration in close proximity without fences, which further increased the appeal of cobot-powered welding, said Jordan. “When I zoomed in and discovered Vectis Automation’s Cobot Welding Tool powered by Universal Robots, it became obvious that this combination was the right way to go,” he said. Operators set up the cell with jigs, supply the parts, and program the system through an intuitive 3D interface integrated directly on the cobot’s teach pendant through Vectis’ URCap software plug-in. The pendant includes a full weld library developed by Vectis Automation, providing standard settings for common weld jobs, including Pattern and Tack tools. “I am by no means a certified welder,” said Jordan. “I’ll just grab any of the guys out of the shop that are welders, and I’ll say, ‘OK, I’ll run the pendant. You go ahead and run the torch and put it where you want it. Where do you want the weld to start; where do you want it to stop; what angle do you want it to do?’ And we’ve thought through some very advanced welds.” Once the DIY programming is complete, the robot autonomously runs a full MIG welding cycle. The UR10e welds six to eight parts in each cycle. Typically, these are small parts, including lock collars, beam clamps, and weld nuts. The cobot welds these in a predefined order and completes four to 12 parts per run with no batching. When it has completed its welding tasks, humans can reload parts and restart the system,


if required, or quickly program a new welding job with fresh parts. The cobot is responsible for welding specific lists of parts every 38-minute shift. Operators work collaboratively with the cobot, loading and unloading during cycles and maximizing the build reach of the machine to cover multiple different fixtures.

Attracted by ease of deployment, ROI

Enthusiasm had been building among MT Solar workers prior to the arrival of the cobot, which made for a “very interesting” first morning together, Jordan explained. “The robot shows up on the truck, and of course I’m all excited about it,” he said. “I walk out of the office, and I’ve got employees already cutting shrink wrap off the robot and getting ready to set something up. I was like, ‘Hang on! I

want to play!’ We had production parts running that afternoon.” The best thing about the system is that one doesn’t have to be a rocket scientist to use it, said Mike Gillin, a certified welder and operations manager at MT Solar. “I’m a welder by trade, but I didn’t know anything about robots, and I’m not very computer-savvy,” he said. “Curiosity attracted me to the robot, and I was really surprised at how easy it was to figure it out.” The flexible financing and rent-to-own options offered through Vectis Automation enabled MT Solar to test out the entire system without a large financial commitment at the outset. “This is a system that I can rent or lease for a very short period. I can afford that,” Jordan said. “If it doesn’t work, I’m not saddled with the thing. It made it really

The Vectis Cobot Welder is a UR+ Application Kit powered by the UR10e cobot. Source: Universal Robots



“You can work right beside the cobot as it is moving and welding,” says Travis Jordan, owner and president of MT Solar. Source: Universal Robots

easy for us to get started.” The system provides a clear return on investment (ROI) by adding another welder to the team, he said. “[It’s] a real clean one-and-a-half, two-year ROI, just on hard-number labor savings,” noted Jordan. The intangible ROI is “probably even more valuable,” he added, because the cobot has had “such a far-reaching impact on both customers and employee morale.”

Results include enhanced repeatability, quality, competitiveness

Human welding can accommodate, and introduce, a lot of variance, especially when it comes to monotonous work and relying on the “calibrated thumb,” said Gillin. “We’ve built up to 7,500 small parts over a winter. For an operator to sit and do that; you can tell where they’ve gotten sick of it, and some of those parts end up being scrapped,” said the operations manager. The cobot welder now performs that repetitive work. The Vectis/UR system offers the repeatability that manufacturers crave and helps MT Solar maintain consistent product quality, said Jordan. “Our customers are always excited when we tell them, ‘Hey, you know, that part was welded on a robot.’ They go to put it on, and it looks exactly the same as the last one, and they look high-quality,” he said. “I think Universal Robots’ system with Vectis is the best one I’ve run across.”


“Furthermore, the consistency and flexibility of the system enables MT Solar to compete with large corporations,” said Jordan. “It allows us to combine industrial quality and scalability with the innovation and the nimbleness of a small company, which we think is an extremely powerful combination.” MT Solar plans to acquire more cobots soon for new tasks such as manipulation of heavy and larger parts, which will also alleviate some of the lifting currently performed by employees. “We’ve got a shop full of people that will come up with ideas just based on their exposure to it,” said Gillin. RR

Case study at a glance

Company: MT Solar
Location: Charlo, Mont.
Industry: Solar power module manufacturer
Challenge: Meeting a tripling in demand each summer amid staffing shortages
Partners: Vectis Automation LLC, Universal Robots A/S
Robots: DIY Cobot Welding Tool, including UR10e arm
Tasks: Small parts welding and handling
Value drivers: Need for flexibility, ease of deployment and use
Results: Improved quality and scalability
Return on investment: 1.5 to 2 years



Motion Control

Source: maxon motor ag


Interdisciplinary approaches are essential to drive systems, robotics success

There are a number of approaches to evaluating and adding drives to designs, says a maxon motion control specialist.

By Juergen Wagenbach

Drive technology is a basic building block for the performance of machines, robots, and handheld devices. However, there is much more to consider than just the design and the motor selection. Interdisciplinary thinking, specific engineering knowledge, depth of experience, and a clear understanding of the requirements are other critical factors. Any kind of technological progress should focus on the application. At some point, improved performance characteristics or new technologies have to yield results in the form of better quality and/or reduced cost. With regard to drive technology for machines and handheld devices, this means:
• Better dynamics improve the production output
• Faster drive control improves precision and product quality
• A more efficient drive system improves the energy efficiency of the overall system
In order to achieve these goals, drive systems must be selected in the overall context of the application and its requirements.


Look at the big picture
The first step in the specification and optimization of a drive system is to understand and correctly prioritize the technical and commercial requirements of the final system. It’s easy to miss the bigger picture and focus unilaterally on the considerations relevant for one’s own field of expertise. As a rule, the drive selection happens in engineering. However, the performance, the cost, and the limitations of a drive solution are influenced by a multitude of factors, as well as other system components. It is therefore critical to harness the know-how of experts from different fields during the idea and conceptualization phase.



Pools of expertise
A systems approach and interdisciplinarity are two key success factors that should be considered from the concept design all the way to the implementation in mass production. Frequently, it is not possible to cover all competences in-house at the same high level. External partners with a wide scope of experience provide an opportunity for a broader, interdisciplinary exchange of ideas. Ideally, the partner is also able to take responsibility for the development and production of partial systems, to reduce development risks and speed up the time to market.

maxon is more than just drives
With over 50 years of experience and more than 2,800 employees worldwide, maxon’s scope of knowledge extends far beyond the “pure” drive motor. With in-house development and production, the maxon portfolio covers brushless and brushed DC motors, gearheads, spindles, encoders, motor controllers, master controllers, and battery management systems. maxon components and customer-specific drive systems are used in robotics, medical and laboratory engineering, industrial automation, the automotive industry, and in aerospace applications from Earth to Mars. With many projects, the key factor isn’t just the broad product portfolio but also the interdisciplinarity and depth of experience of the maxon application teams, as well as the possibility of developing completely new drive solutions. maxon experts for motors, gearheads, electronics, and control are available in the early stages of discussing an idea and are familiar with the requirements of specific fields of application. Why covering various fields of expertise can be so decisive for a drive system becomes clear when we take a look at the individual components and their influencing factors.

Top down: Focus on the master
A top-level “intelligence,” sometimes referred to as the master, transmits motion commands to the motor controller and queries process data such as torque, speed, position, and status.


Source: maxon motor

• How the tasks are shared between the master and the motor controller is critical for assessing the required performance and selection of the master, the motor controller, and the communication interface.
• If fast cyclic data exchange -- i.e., every millisecond -- is needed in the machines, then a master with a real-time operating system (e.g., a PLC) and a fast interface (e.g., CAN, EtherCAT) is required.
• If complex motion sequences can be preconfigured and executed autonomously in the motor controller, then it is sufficient to use a PC for common lab automation. One can also choose a microcontroller, as used in handheld devices such as screwdrivers and drills in industrial automation or medical technology.

Focus on the motor controller
The motor controller is the link between the top-level master and the motors, as well as feedback devices like encoders. Controllers and the power stage are used to convert current, speed, and position commands into voltages and currents in the motor phases.

• Fast controller cycles and complex algorithms enable precise and dynamic drive motions.
• State-of-the-art power stages deliver the peak currents required for fast acceleration and have a high energy efficiency. Integrated motor and signal filters improve the electromagnetic compatibility (EMC) and noise immunity.

Focus on the encoder
Speed and positioning control require feedback devices (so-called encoders or linear scales) for returning information about the actual position of the motor and/or output shaft.
• The resolution and location of these encoders determine the theoretical precision limits of the positioning.
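As a rough illustration of how resolution and mounting location set those limits, here is a minimal sketch. The 16,384-count (14-bit) encoder and 100:1 reduction are example figures, not tied to any specific maxon product:

```python
def angular_resolution_deg(counts_per_rev: int, gear_ratio: float = 1.0) -> float:
    """Smallest output-shaft step a controller can distinguish when the
    encoder sits on the motor shaft, ahead of a reduction gearhead."""
    return 360.0 / (counts_per_rev * gear_ratio)

# 16,384 cpr (14-bit) mounted directly on the output shaft:
direct = angular_resolution_deg(16384)             # ~0.022 deg per count
# Same encoder on the motor side of a 100:1 gearhead:
through_gear = angular_resolution_deg(16384, 100)  # ~0.00022 deg per count
```

The motor-side mounting looks far finer on paper, but backlash and elasticity in the gearhead sit between that encoder and the load -- which is why the encoder’s location matters as much as its count.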

Source: maxon motor



Focus on the motor
The motor converts electrical energy into mechanical energy -- motion and torque.
• DC or BLDC motors with high overload capacity can have a compact design, because high torques are available for dynamic acceleration phases in the short term.
• Motors with a low rotor inertia reduce the torque required to accelerate the rotor, thereby increasing energy efficiency and dynamics.
• High motor efficiency improves the overall energy efficiency and reduces the heat generation, which is an important factor especially for handheld devices.
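The rotor-inertia point can be made concrete with a little rigid-body arithmetic. All numbers below are invented example values, not data for any real motor:

```python
def acceleration_torque(rotor_inertia, load_inertia, target_speed, ramp_time):
    """Torque (N*m) needed to ramp a motor plus a directly coupled load
    to target_speed (rad/s) within ramp_time (s); friction is ignored."""
    alpha = target_speed / ramp_time          # angular acceleration, rad/s^2
    return (rotor_inertia + load_inertia) * alpha

# Two candidate motors driving the same 2e-5 kg*m^2 load to ~5,000 rpm
# (523.6 rad/s) in 50 ms:
t_standard = acceleration_torque(1.0e-5, 2.0e-5, 523.6, 0.05)
t_low_inertia = acceleration_torque(0.5e-5, 2.0e-5, 523.6, 0.05)
# The low-inertia rotor needs ~17% less torque for the same move,
# which translates directly into lower current and less heat.
```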

Focus on the gearhead
The precision, backlash, elasticity, and efficiency of the gearhead and the mechanical assemblies do the following:
• Determine output positioning accuracy
• Affect the dynamics, such as the time after which the target position has been reached and stabilized
• Play a role in energy efficiency

Challenge: Integration
In order to integrate all components into a unit that is as compact as possible, it is necessary to consider the thermal aspects of mutual heating under load. To avoid costly redesigns, operating point calculations and thermal evaluations for the electric motor and the electronics should be done early on, during the proof-of-concept stage.

maxon Door Drive: BLDC motor, gearhead, encoder, and positioning controller integrated in a compact unit. Source: maxon motor

Challenge: Battery operation
Battery-powered applications require competence in energy efficiency optimization and battery management.

Example: Integrated door drive system
Billions of people ride in elevators every day. Besides the main drive, elevators require compact door drives that are installed in the tight space above the door. The “smart” drives receive commands to open and close via a bus system. The drives need to execute these commands reliably and fulfill strict safety requirements. Together with a leading elevator manufacturer, maxon has developed the Door Drive, which is capable of moving doors weighing up to 400 kg (881.8 lb.). This low-noise and energy-efficient drive system integrates a high-torque maxon EC 90 flat motor with an encoder, an optional belt transmission, as well as an EPOS positioning controller in one compact unit. The data exchange is implemented via CAN. Specific control and force-monitoring algorithms ensure smooth but dynamic door movement, precise positioning, and squeeze protection. maxon designed and developed the system in collaboration with a customer, bringing together the interdisciplinary know-how of experts in application and safety technology, construction, controllers, electronics development, thermodynamics, and software. This kind of solution cannot be achieved simply by combining individual components. It requires the ability to design a new drive solution for a specific application case. maxon is the ideal partner for drive systems, controllers, and battery management. RR

VIONiC™ digital encoder series: Designed for the designer
The VIONiC digital incremental encoder series has been specifically designed with the machine builder in mind. Its enhanced ease of use, superior metrology capability and multiple configuration options all ensure optimal machine performance. VIONiC really has been designed for the designer.
For more information visit www.renishaw.com/vionic
Renishaw, Inc. 1001 Wesemann Drive, West Dundee, Illinois, 60118 United States T +1 847 286 9953 F +1 847 286 9974 E usa@renishaw.com
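Returning to the Door Drive example above: in a CANopen drive, a master typically writes setpoints into standardized object-dictionary entries. The sketch below builds the 8-byte expedited SDO download frame for a target-position write (object 0x607A in the CiA 402 drive profile). This is a generic CANopen illustration, not maxon’s actual protocol implementation:

```python
import struct

def sdo_expedited_download(node_id: int, index: int, subindex: int, value: int):
    """Build a CANopen expedited SDO download carrying 4 data bytes.

    Returns (cob_id, payload): command specifier 0x23 = expedited, 4 bytes,
    followed by the object index (little-endian), subindex, and the value."""
    cob_id = 0x600 + node_id                        # client->server SDO COB-ID
    payload = struct.pack("<BHBi", 0x23, index, subindex, value)
    return cob_id, payload

# Ask node 1 to set its target position to 100,000 increments:
cob_id, frame = sdo_expedited_download(1, 0x607A, 0x00, 100_000)
```

On the wire this yields COB-ID 0x601 with payload `23 7A 60 00 A0 86 01 00` -- the kind of compact, low-conductor-count exchange that makes a single CAN pair sufficient for command and status traffic.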

THE NEW MUST HAVE FOR ROBOTS! POWERFUL EMBEDDED 3D CAMERA SYSTEM

Light and cost-optimized - the new Ensenso N40 series

About the author: Juergen Wagenbach is a motion control specialist at maxon motor AG. He studied physics and joined maxon in 1989. Wagenbach was initially involved in the development of firmware for maxon motion controllers. At the same time, he studied business engineering and later on software engineering to broaden his knowledge about technical and commercial aspects of industrial systems. Wagenbach left the company in 2000 and worked for a software consultancy and a motion control firm before returning to maxon.


www.ids-imaging.com


A robot equipped with real-time motion planning can operate safely in an environment with humans, and can be deployed in relatively unstructured factories and adjust to imprecise object locations and orientations. Credit: Realtime Robotics

How Realtime Robotics is helping robots avoid collisions
A unique circuit design, when combined with proprietary software, acts like a plug-in motor cortex for robots.

By Zach Winn, MIT


George Konidaris still remembers his disheartening introduction to robotics. “When you’re a young student and you want to program a robot, the first thing that hits you is this immense disappointment at how much you can’t do with that robot,” he said. Most new roboticists want to program their robots to solve interesting, complex tasks -- but it turns out that just moving them through space without colliding with objects is more difficult than it sounds. Fortunately, Konidaris is hopeful that future roboticists will have a more exciting start in the field. That’s because roughly four years ago, he co-founded Realtime Robotics, a startup that’s solving the “motion planning problem” for robots. The company has invented a solution that gives robots the ability to quickly adjust their path to avoid objects as they move to a target. The Realtime controller is a box that can be connected to a variety of robots and deployed in dynamic environments.

“Our box simply runs the robot according to the customer’s program,” explained Konidaris, who currently serves as Realtime’s chief roboticist. “It takes care of the movement, the speed of the robot, detecting obstacles, collision detection. All [our customers] need to say is, ‘I want this robot to move here.’” Realtime’s key enabling technology is a unique circuit design that, when combined with proprietary software, has the effect of a plug-in motor cortex for robots. In addition to helping to fulfill the expectations of starry-eyed roboticists, the technology also represents a fundamental advance toward robots that can work effectively in changing environments.



Helping robots get around
Konidaris was not the first person to get discouraged about the motion planning problem in robotics. Researchers in the field have been working on it for 40 years. During a four-year postdoc at MIT, Konidaris worked with School of Engineering Professor in Teaching Excellence Tomas Lozano-Perez, a pioneer in the field who was publishing papers on motion planning before Konidaris was born. Humans take collision avoidance for granted. Konidaris pointed out that the simple act of grabbing a beer from the fridge actually requires a series of tasks such as opening the fridge, positioning your body to reach in, avoiding other objects in the fridge, and deciding where to grab the beer can. “You actually need to compute more than one plan,” Konidaris said. “You might need to compute hundreds of plans to get the action you want. … It’s weird how the simplest things humans do hundreds of times a day actually require immense computation.”

In robotics, the motion planning problem revolves around the computational power required to carry out frequent tests as robots move through space. At each stage of a planned path, the tests help determine if various tiny movements will make the robot collide with objects around it.

Bin picking type                   | Average time | Seconds per pick | Performance rate
1 robot                            | 31.67        | 3.96             | --
2 robots with interference zones   | 27.64        | 3.46             | 14%
2 robots with Realtime Controller  | 18.24        | 2.28             | 74%
(Performance rate is the improvement in seconds per pick relative to the single-robot baseline.)

Such tests have inspired researchers to think up ever more complicated algorithms in recent years, but Konidaris believes that’s the wrong approach. “People were trying to make algorithms smarter and more complex, but usually that’s a sign that you’re going down the wrong path,” Konidaris said. “It’s actually not that common that super technically sophisticated techniques solve problems like that.” Konidaris left MIT in 2014 to join the faculty at Duke University, but he continued to collaborate with researchers at MIT’s Computer Science and Artificial Intelligence Laboratory. Duke is also where Konidaris met Realtime co-founders Sean Murray, Dan Sorin, and Will Floyd-Jones. In 2015, the co-founders collaborated to make a new type of computer chip with circuits specifically designed to perform the frequent collision tests required to move a robot safely through space. The custom circuits could perform operations in parallel to more efficiently test short motion collisions. “When I left MIT for Duke, one thing bugging me was this motion planning thing should really be solved by now,” Konidaris said. “It really did come directly out of a lot of experiences at MIT. I wouldn’t have been able to write a single paper on motion planning before I got to MIT.” The researchers founded Realtime in 2016 and quickly brought on robotics industry veteran Peter Howard, who currently serves as Realtime’s CEO and is also considered a co-founder. “I wanted to start the company in Boston

In robotics, the motion planning problem revolves around the computational power required to carry out frequent tests as robots move through space. At each stage of a planned path, the tests help determine if various tiny movements will make the robot collide with objects around it. Credit: Realtime Robotics
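A naive software version of those repeated micro-tests looks like the sketch below. Realtime’s contribution is running enormous numbers of such checks in parallel on dedicated circuits; this 2D point robot with circular obstacles only illustrates the workload, not the company’s method:

```python
import math

def motion_collides(start, goal, obstacles, clearance=0.05, steps=100):
    """Sample poses along a straight-line motion and test each one
    against circular obstacles given as (x, y, radius). A planner must
    run this kind of check for every candidate edge it considers."""
    for i in range(steps + 1):
        t = i / steps
        x = start[0] + t * (goal[0] - start[0])
        y = start[1] + t * (goal[1] - start[1])
        for ox, oy, r in obstacles:
            if math.hypot(x - ox, y - oy) < r + clearance:
                return True   # this candidate motion would collide
    return False

bin_edge = [(0.5, 0.0, 0.2)]
blocked = motion_collides((0.0, 0.0), (1.0, 0.0), bin_edge)  # passes through
clear = motion_collides((0.0, 0.5), (1.0, 0.5), bin_edge)    # skirts around
```

Multiply this inner loop by hundreds of candidate plans and thousands of path steps, and the “immense computation” Konidaris describes becomes clear.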



GEARING FOR ROBOTIC APPLICATIONS Articulated Robot Joints and 7th Axis

Automated Guided Vehicles Wheel Hub Drives GCL Cycloidal withstands shock loads of 5x nominal torque

Custom Gearbox Integral hub design directly drives wheel

GCL Cycloidal with integrated pre-stage

GSL Strain Wave Zero-backlash in a compact design

GSL Strain Wave with harmonic gearing for precise motion

Trunnion Headstock Drive GCL Cycloidal

Precise point-to-point motion

GPL Planetary

Vibration-free, continuous coordinated motion between trunnion and robot

7th Axis: Dyna Series + Helical Rack & Pinion Designed for optimal system performance

From zero-backlash gearboxes to rack & pinion, GAM has the flexibility and broad product range for all your robotic applications. As a U.S. manufacturer with U.S. manufacturing and one of the broadest product offerings in the gearbox industry, as well as the engineering expertise and manufacturing capabilities to develop customized solutions, GAM can help with your application.

GAM Can. www.gamweb.com/robotics | info@gamweb.com | 888.GAM.7117

Servo Gearboxes Precision inline or right angle gearboxes for motion control applications

Zero-Backlash Planetary The new standard in zero-backlash gearboxes with precise, smooth path control, positioning, and repeatability

Cycloidal Gearbox Zero-backlash with impact resistance 5x nominal torque and precise point-to-point motion

GAM Enterprises, Inc. | 901 E. Business Center Dr. | Mount Prospect IL 60056


Strain Wave Gearbox Uses harmonic gearing for high precision in a compact design

Precision Helical Rack & Pinion Rack & Pinion paired with GAM Gearboxes for optimized system performance

Servo Couplings Zero-backlash bellows, elastomer, safety, and distance couplings custom bored to order

© 2020 GAM. ALL RIGHTS RESERVED




because I knew MIT and a lot of robotics work was happening there,” said Konidaris, who moved to Brown University in 2016. “Boston is a hub for robotics. There’s a ton of local talent, and I think a lot of that is because MIT is here -- Ph.D.s from MIT became faculty at local schools, and those people started robotics programs. That network effect is very strong.”

Removing robot restraints
Today, the majority of Realtime’s customers are in the automotive, manufacturing, and logistics industries. The robots using Realtime’s solution are doing everything from spot welding to making inspections to picking items from bins. After customers purchase Realtime’s control box, they load in a file describing the configuration of the robot’s work cell, information about the robot such as its end-of-arm tool, and the task the robot is completing. Realtime can also help optimally place the robot and its accompanying sensors around a work area. Konidaris said Realtime can shorten the process of deploying robots from an average of 15 weeks to one week. Once the robot is up and running, Realtime’s box controls its movement, giving it instant collision-avoidance capabilities. “You can use it for any robot,” Konidaris said. “You tell it where it needs to go and we’ll handle the rest.” Realtime is part of MIT’s Industrial Liaison Program (ILP), which helps companies make connections with larger industrial partners, and it recently joined ILP’s STEX25 startup accelerator. With a few large rollouts planned for the coming months, the Realtime team’s excitement is driven by the belief that solving a problem as fundamental as motion planning unlocks a slew of new applications for the robotics field. “What I find most exciting about Realtime is that we are a true technology company,” said Konidaris.
“The vast majority of startups are aimed at finding a new application for existing technology; often, there’s no real pushing of the technical boundaries with a new app or website, or even a new robotics ‘vertical.’ But we really did invent something new, and that edge and that energy is what drives us. All of that feels very MIT to me.” RR


Easy Engineering!

Finding the right handling system couldn’t be quicker or easier: Configure and order your standard handling system in just three steps with the Handling Guide Online. All systems are delivered fully tested and assembled. Try out the new software tool today! For more information Call: 1-800-Go-Festo 1-800-463-3786 www.festo.us/hgo



Vision


How Siemens automated maritime battery production
Siemens’ maritime battery production line in Norway has eight configurable robot cells and seven AGVs for handling inter-cell logistics.

Steve Crowe • Editor, The Robot Report

As a part of forward-thinking environmental legislation, the Norwegian Parliament set out in 2018 to outlaw harmful emissions from the ferries and cruisers operating in its fjords by no later than 2026. This covers both CO2 and NOx gases, as well as noise pollution on the water. The response from local shipping companies and ferry operators has been to dramatically accelerate the development and introduction of electrical propulsion systems. And the performance figures of Norway’s first all-electric ferry to enter service are, to say the least, more than impressive. The MF Ampere, a fully battery-powered ferry, operates between the ports of Lavik and Oppedal, where shore-based charging stations recharge its batteries. Compared with diesel-powered alternatives, Norwegian organizations claim to have obtained a 95% reduction in emissions and an 80% reduction in operating costs with the vessel. This is an attractive proposition for ferry fleets worldwide. Traveling relatively short distances and staying quay-side at the same ports for long periods of time, car and passenger ferries have proven to be the ideal vessels to begin the inexorable shift from traditional diesel to battery and hybrid diesel-electric power in the global maritime transport sector.


A critical enabler in this radical transformation is the availability of the right types of batteries, in the right kinds of volumes to power the new all-electric and hybrid powertrains. In opening a flexible, highly automated battery factory in Trondheim, Norway, Siemens has invested NOK 100 million ($11.36 million U.S.) to help address the future demand. It will develop and manufacture energy storage solutions for both marine and offshore oil and gas applications. To achieve the high levels of automation required from its new maritime battery production line, Siemens appointed Raufoss-based Intek to provide the robots, 3D machine vision and automated system integration it needed to help achieve demanding productivity goals.



Designed and engineered over a 12-month period, the line handles everything automatically, from the initial picking of component parts through final battery testing and documentation. Source: Siemens

Challenges
In the case of an all-electric ferry, the battery pack generally needs to have a battery capacity of around 2 MWh. A typical configuration has 34 battery cabinets. Within each there are nine battery modules, with each comprising 28 battery cells. Even in a hybrid diesel-electric power system, a battery capacity of at least 500 kWh is needed. As well as car and passenger ferries of course, there are thousands of fishing boats, cruisers, multi-purpose vessels and offshore units that can also benefit from full or part electrification. Demand for marine battery sets, and the production capacity to support it, is therefore expected to be high and set to rise exponentially. “All car and passenger ferries in Norway will ultimately rely on some form of energy storage solution,” said Torstein Sole-Gärtner, head of Siemens’ offshore and marine centre in Trondheim. “We estimate that there’ll be around 60 hybrid or all-battery powered ferries operating here.” Siemens said it expects a doubling of the global marine battery market by 2024 and predicts that nearly 80% of all new ships up to 150 meters in length will be equipped with either all-battery or hybrid diesel-electric configurations. To keep pace with future demand, the new Trondheim production line systems needed to be as efficient and flexible as possible. The number of people involved hands-on in the production process would need to be minimized, while the amount of robotization needed to be maximized.
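The pack arithmetic in the Challenges section is easy to sanity-check. The per-cell figure below is a derived average, not a published Siemens specification:

```python
# Typical all-electric ferry pack, per the configuration described above.
cabinets, modules_per_cabinet, cells_per_module = 34, 9, 28
pack_capacity_wh = 2_000_000              # ~2 MWh target capacity

total_modules = cabinets * modules_per_cabinet       # 306 modules
total_cells = total_modules * cells_per_module       # 8,568 cells
avg_wh_per_cell = pack_capacity_wh / total_cells     # ~233 Wh per cell
```

Handling thousands of cells per vessel is exactly why the depalletizing and assembly steps had to be automated.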

Olaf Pedersen, project manager at system integrator Intek, outlined the challenge that he faced. “Maintaining rapid, error-free production throughput was a vital consideration in developing the overall solution,” he said. “Whether the production task was transporting components, product assembly or test, assuring high productivity was key.” Handling battery component parts in the production line’s depalletizing section presented a particular set of challenges. “Unlike the other production cells, the first cell needed to be able to automatically handle a wide and unpredictable range of components – battery cells, frames, connectors and so on,” explained Pedersen. “The handling of so many different kinds of in-coming goods, arriving randomly placed on pallets, on cardboard trays and in plastic blister packs for example, and to do it at some speed can be a tricky task to automate.”

A Siemens programmable logic controller and a high-speed industrial PC provided control and processing power. Custom algorithms were designed to manipulate the camera’s 3D point cloud data and maximize production line throughput. Source: Siemens

Solution
Siemens’ Trondheim maritime battery production line is equipped with eight independently configurable robot cells, and seven automatic guided vehicles (AGVs) for handling inter-cell logistics. Designed and engineered by Intek over a 12-month period, the line handles everything automatically, from the initial picking of component parts through to final battery testing and documentation.

For the depalletizing cell, Intek chose to use the Zivid One real-time 3D machine vision camera mounted on a KUKA KR9 robot arm equipped with a custom-designed vacuum gripper. A Siemens programmable logic controller and a high-speed industrial PC provided control and processing power. Intek applied its own custom algorithms to manipulate the camera’s 3D point cloud data and maximize production line throughput. “By harnessing the Zivid One camera’s high-quality 3D point cloud, we were able to easily pinpoint the outline of the pallet or tray, very accurately determine the outer dimensions of the component within, and then pick and place accordingly with the highest degree of accuracy,” said Pedersen. “And being able to rely on a single camera snapshot meant it was very fast, too.” “By taking this more pragmatic approach we could resolve the issue of random component alignment without the need for any mechanical ‘steering’ systems or operator intervention to straighten up the components and shift them to predefined positions. What’s more, it also avoided the need for intensive programming of complex component CAD files. It’s a very flexible and reliable solution as a result.” “Because the robot arm is automatically stacking component parts onto a relatively small AGV, it’s essential that the system takes into account distribution of load – the AGV needs to stay perfectly balanced in transit,” said Pedersen. “Furthermore, when the AGV arrives at its destination production cell, the component parts need to be picked by another robot from predefined positions. And so accurate component placement onto the AGV was also essential.”

A Zivid One real-time 3D machine vision camera mounted on a KUKA KR9 robot arm. Source: Siemens

Results
The advanced cellular nature of Siemens’ maritime battery production line enables it to flex its capabilities and increase its production capacity in response to the expected upsurge in worldwide market demand and rapid technological developments. Battery design and production are more easily tailored to match a particular vessel’s function and duty cycle. By harnessing state-of-the-art machine vision, robotics and AGVs, the Trondheim production has achieved a high level of automation, requiring only three people to work in the production area. It makes it possible to produce sustainable energy solutions more efficiently and cost-effectively. With its highly efficient, end-to-end automated production line, Siemens is expected to be able to supply batteries for 150-200 ferries annually, equating to a battery module capacity in the order of 400 MWh. The factory can produce the battery modules needed for an all-electric ferry in less than four days. The positive environmental impact of the switch from diesel-powered vessels towards all-battery or hybrid vessels cannot be overstated, said Siemens. Reductions in CO2 and NOx gas emissions and water-borne noise pollution will be felt globally as well as locally. Emission-free, near-silent maritime operations are a worthy goal to aim for. RR
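The “outer dimensions from a single snapshot” step Pedersen describes reduces, in its simplest form, to a bounding box over the segmented point cloud. Intek’s production algorithms are proprietary; this minimal sketch only illustrates the idea:

```python
def outer_dimensions(points):
    """Axis-aligned bounding box of an (x, y, z) point cloud: returns the
    component's extents and its center, which together suggest a pick
    pose for a vacuum gripper -- no CAD model of the part required."""
    xs, ys, zs = zip(*points)
    mins = (min(xs), min(ys), min(zs))
    maxs = (max(xs), max(ys), max(zs))
    dims = tuple(hi - lo for lo, hi in zip(mins, maxs))
    center = tuple((lo + hi) / 2 for lo, hi in zip(mins, maxs))
    return dims, center

# A synthetic "component": sample points on a 0.30 x 0.20 x 0.10 m box.
cloud = [(0.0, 0.0, 0.0), (0.30, 0.0, 0.0), (0.0, 0.20, 0.0),
         (0.15, 0.10, 0.10), (0.30, 0.20, 0.10)]
dims, center = outer_dimensions(cloud)
```

Real pipelines first segment the pallet or tray outline from the scene and must tolerate sensor noise, but the core measurement is this simple.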

THE ROBOT REPORT



YOUR CUSTOM SOLUTIONS ARE CGI STANDARD PRODUCTS

Advanced Products for Robotic Applications CGI Motion standard products are designed with customization in mind. Our team of experts will work with you on selecting the optimal base product and craft a unique solution to help differentiate your product or application. So when you think customization, think standard CGI assemblies. Connect with us today to explore what CGI Motion can do for you.

800.568.GEAR (4327) • www.cgimotion.com

copyright©2019 cgi inc. all rights reserved. 025rbt



Vision

Eliminating bias from visual datasets used to train AI models
Tool takes stock of a dataset’s content using existing image annotations and measurements such as object counts, the co-occurrence of objects and people, and images’ countries of origin.

By Molly Sharlach | Princeton

Researchers at Princeton University have developed a tool that flags potential biases in sets of images used to train artificial intelligence systems. The work is part of a larger effort to remedy and prevent the biases that have crept into AI systems that influence everything from credit services to courtroom sentencing programs.

Although the sources of bias in AI systems are varied, one major cause is stereotypical images contained in large sets of images collected from online sources that engineers use to develop computer vision, a branch of AI that allows computers to recognize people, objects, and actions. Because the foundation of computer vision is built on these datasets, images that reflect societal stereotypes and biases can unintentionally influence computer vision models.

To help stem this problem at its source, researchers in the Princeton Visual AI Lab have developed an open-source tool that automatically uncovers potential biases in visual datasets. The tool allows dataset creators and users to correct issues of underrepresentation or stereotypical portrayals before image collections are used to train computer vision


models. In related work, members of the Visual AI Lab published a comparison of existing methods for preventing biases in computer vision models themselves, and proposed a new, more effective approach to bias mitigation.

The first tool, called REVISE (REvealing VIsual biaSEs), uses statistical methods to inspect a dataset for potential biases or issues of underrepresentation along three dimensions: object-based, gender-based and geography-based. A fully automated tool, REVISE builds on earlier work that involved filtering and balancing a dataset’s images in a way that required more direction from the user. REVISE takes stock of a dataset’s content using existing image annotations and measurements such



as object counts, the co-occurrence of objects and people, and images’ countries of origin. Among these measurements, the tool exposes patterns that differ from median distributions. For example, in one of the tested datasets, REVISE showed that images including both people and flowers differed between males and females: Males more often appeared with flowers in ceremonies or meetings, while females tended to appear in staged settings or paintings. (The analysis was limited to annotations reflecting the perceived binary gender of people appearing in images.)

Once the tool reveals these sorts of discrepancies, “then there’s the question of whether this is a totally innocuous fact, or if something deeper is happening,


and that’s very hard to automate,” said Olga Russakovsky, an assistant professor of computer science and principal investigator of the Visual AI Lab. Russakovsky co-authored the paper with graduate student Angelina Wang and Arvind Narayanan, an associate professor of computer science.

For example, REVISE revealed that objects including airplanes, beds and pizzas were more likely to be large in the images including them than a typical object in one of the datasets. Such an issue might not perpetuate societal stereotypes, but could be problematic for training computer vision models. As a remedy, the researchers suggest collecting images of airplanes that also include the labels mountain, desert or sky.
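A toy version of the co-occurrence measurement described above: count how often an object co-occurs with each annotated group, then flag groups whose rate deviates strongly from the median rate. REVISE's actual statistics are richer; the function names and the 2x threshold here are assumptions for illustration.

```python
from collections import Counter

def cooccurrence_rates(annotations, obj):
    """Fraction of each group's images that also contain `obj`.
    `annotations` maps image id -> {"group": str, "objects": set}."""
    totals, hits = Counter(), Counter()
    for ann in annotations.values():
        g = ann["group"]
        totals[g] += 1
        if obj in ann["objects"]:
            hits[g] += 1
    return {g: hits[g] / totals[g] for g in totals}

def flag_skew(rates, threshold=2.0, eps=1e-9):
    """Flag groups whose rate differs from the median rate by >= threshold x."""
    vals = sorted(rates.values())
    med = vals[len(vals) // 2]
    return {g: r for g, r in rates.items()
            if max(r, med) / max(min(r, med), eps) >= threshold}
```

On the flowers example, the rate at which "flower" co-occurs with each perceived-gender annotation would be computed per group, and a large departure from the median would surface it for human review.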

Princeton computer scientists developed a tool that flags potential biases in sets of images used to train AI systems. Source: Princeton University




Versatile. Flexible. Modular. Flat Top Chain Conveyors

www.mkversaflex.com/rr
The perfect solution for complex layouts:
• Pallet or product handling
• Inclines, curves and loops
• Diverts and merges

better products. better solutions. (860) 769-5500 | info@mknorthamerica.com


The underrepresentation of regions of the globe in computer vision datasets, however, is likely to lead to biases in AI algorithms. Consistent with previous analyses, the researchers found that for images’ countries of origin (normalized by population), the U.S. and European countries were vastly overrepresented in datasets. Beyond this, REVISE showed that for images from other parts of the world, image captions were often not in the local language, suggesting that many of them were captured by tourists and potentially leading to a skewed view of a country.

Researchers who focus on object detection may overlook issues of fairness in computer vision, said Russakovsky. “However, this geography analysis shows that object recognition can still be quite biased and exclusionary, and can affect different regions and people unequally,” she said.

“Dataset collection practices in computer science haven’t been scrutinized that thoroughly until recently,” said Wang. She added that images are mostly “scraped from the internet, and people don’t always realize that their images are being used [in datasets]. We should collect images from more diverse groups of people, but when we do, we should be careful that we’re getting the images in a way that is respectful.”

“Tools and benchmarks are an important step … they allow us to capture these biases earlier in the pipeline and rethink our problem setup and assumptions as well as data collection practices,” said Vicente Ordonez-Roman, an assistant professor of computer science at the University of Virginia who was not involved in the studies. “In computer vision there are some specific challenges regarding representation and the propagation of stereotypes. Works such as those by the Princeton Visual AI Lab help elucidate and bring to the attention of the computer vision community some of these issues and offer strategies to mitigate them.”

A related study from the Visual AI Lab examined approaches to prevent



In one data set, REVISE uncovered a potential gender bias in images containing people (red boxes) and organs (blue boxes). Analyzing the distribution of inferred 3D distances between the person and the organ showed that males tended to be featured as actually playing the instrument, whereas females were often merely in the same space as the instrument. | Source: Princeton University

computer vision models from learning spurious correlations that may reflect biases, such as over-predicting activities like cooking in images of women, or computer programming in images of men. Visual cues such as the fact that zebras are black and white, or basketball players often wear jerseys, contribute to the accuracy of the models, so developing effective models while avoiding problematic correlations is a significant challenge in the field.

In research presented at the virtual International Conference on Computer Vision and Pattern Recognition, electrical engineering graduate student Zeyu Wang and colleagues compared four different techniques for mitigating biases in computer vision models. They found that a popular technique known as adversarial training, or “fairness through blindness,” harmed the overall performance of image recognition models. In adversarial training, the model cannot consider information about the protected variable — in the study, the researchers used gender as a test case. A different approach, known as domain-independent training, or “fairness through awareness,” performed much better in the team’s analysis.

“Essentially, this says we’re going to have different frequencies of activities for different genders, and yes, this prediction is going to be gender-dependent, so we’re just going to embrace that,” said Russakovsky. The technique outlined in the paper mitigates potential biases by considering the protected attribute separately from other visual cues.

“How we really address the bias issue is a deeper problem, because of course we can see it’s in the data itself,” said Zeyu Wang. “But in the real world, humans can still make good judgments while being aware of our biases” — and computer vision models can be set up to work in a similar way, he said. RR
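A minimal sketch of the inference-time idea behind domain-independent training: one classifier head per protected-group value sits on top of shared features, and the per-class scores are summed across heads at prediction time, so no single head's group-conditional statistics dominate. The linear heads and names here are illustrative assumptions, not the paper's implementation.

```python
def domain_independent_predict(features, heads):
    """Sum class scores across per-group heads (one head per
    protected-group value), then take the arg-max class.
    `heads` maps group -> list of per-class weight vectors."""
    n_classes = len(next(iter(heads.values())))
    scores = [0.0] * n_classes
    for weights in heads.values():          # one linear head per group
        for c, w in enumerate(weights):     # dot product = class score
            scores[c] += sum(wi * fi for wi, fi in zip(w, features))
    return max(range(n_classes), key=scores.__getitem__)
```

Each head is free to learn group-dependent frequencies during training, and the combination step "embraces" that dependence instead of hiding it, which is the contrast with adversarial "blindness."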




Manipulation

Hello Robot’s Stretch aims to democratize mobile manipulation
After three years in stealth mode, Hello Robot unveiled its Stretch mobile manipulator for robotics researchers.

Steve Crowe • Editor, The Robot Report

He just couldn’t stay away. Aaron Edsinger, the former director of robotics at Google (2013-17), is back with another mobile manipulator. After three years in stealth mode, Hello Robot co-founders Edsinger and Charlie Kemp recently unveiled Stretch, a slender robot looking to flip the script on what we’ve come to expect from mobile manipulators.

Founded in 2017, Hello Robot has offices in Atlanta and Martinez, Calif. Kemp is a professor at Georgia Tech and in 2007 founded the university’s Healthcare Robotics Lab. Edsinger sold, for undisclosed amounts, two robotics startups to Google in 2013: Meka Robotics and Redwood Robotics. Meka was developing mobile manipulators, and Redwood was working on robotic arms.

Stretch Research Edition
The first version of the robot, the Stretch Research Edition (Stretch RE1), targets academic and corporate researchers. Weighing in at just over 50 lb., the Stretch RE1 features a Roomba-like mobile base that moves at a maximum speed of 0.6 m/s, an Intel RealSense D435i depth camera with an IMU,


a telescoping arm and lift, and a custom gripper that offers a 3.3-lb. payload. The arm can reach up 43.3 in. high and extends outward 20.5 in. The Stretch RE1 also features torque sensing in all its joints to detect contact, as well as a laser range finder, an onboard computer, and differential two-wheel drive. The robot has various mount points and expansion ports that allow researchers to extend the robot with their own hardware.

The software for the Stretch RE1 is all open source. It includes a ROS interface with a calibrated model of the robot. Stretch RE1 includes an open-hardware library of accessories that researchers can 3D print, such as a tray with a cup holder for delivering objects and a phone holder that can be used to take pictures. For those who don’t want ROS,



Stretch is very much a research platform at the moment. But Hello Robot co-founder and CEO Aaron Edsinger (pictured) said he hopes Stretch will one day be deployed in homes for human assistance applications. Source: Hello Robot

there’s also a low-level Python layer, which handles the robot’s local teleoperation skills via the included Xbox controller. Kemp shared a great story about how he remotely teleoperated an earlier version of Stretch to take care of the family cat while on vacation. He used web-based teleoperation code they’ve released as open source.

“You may think Stretch is just for hardcore roboticists, but we’re hoping to change who uses robots,” said Edsinger. “We have one customer who might only teleoperate Stretch for human factors studies.” Stretch can operate fully autonomously, too.

Priced at $17,950, the Stretch RE1 is significantly more affordable than other mobile manipulators currently available. Three- and six-packs of Stretch RE1s


are available for $49,950 and $98,950, respectively. For the sake of comparison, Edsinger’s M1 mobile manipulator cost about $340,000 in 2011.

“Once grad students got laptops and could work from home, productivity went up. We hope Stretch will pave the way to one robot per researcher,” said Kemp. “I’ve always wanted one robot per student. If you’re time-sharing a robot, your productivity goes down.”

Think of the Stretch RE1 as a means to democratize the advancement of mobile manipulators. “We see Stretch as an invitation for a much larger community to become involved with mobile manipulation. We’re excited to see the new applications it unlocks,” said Edsinger.
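The low-level Python teleoperation layer maps gamepad input to base motion. A minimal sketch of such a mapping for a differential-drive base capped at Stretch's 0.6 m/s top speed; this is a hypothetical helper for illustration, not Hello Robot's actual API.

```python
def stick_to_wheel_speeds(fwd, turn, v_max=0.6, deadband=0.1):
    """Map gamepad stick axes in [-1, 1] to (left, right) wheel speeds
    in m/s for a differential-drive base, capped at v_max."""
    def shape(axis):
        # Ignore small stick noise around center.
        return 0.0 if abs(axis) < deadband else axis

    v = shape(fwd) * v_max       # forward command
    w = shape(turn) * v_max      # turning command
    left, right = v - w, v + w
    # Rescale so neither wheel exceeds v_max.
    scale = max(1.0, abs(left) / v_max, abs(right) / v_max)
    return left / scale, right / scale
```

Pushing the stick fully forward drives both wheels at 0.6 m/s; a combined forward-and-turn command is rescaled so the faster wheel saturates at the cap rather than exceeding it.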




NEW HOLLOW-SHAFT KIT ENCODERS

Designed for Drives, Robot Joints
• Open-center form factor fits around central shaft, cables or structures
• Precision rotation measurement with multiturn range
• Rotation counter powered by Wiegand energy-harvesting technology – no batteries needed!
• SSI and BiSS-C interfaces
• Dust and moisture tolerant
• Factory-friendly assembly tolerances

www.posital.com

Thanks to its 3D perception and compliant gripper, Hello Robot in its launch video showed Stretch autonomously navigating a home and picking up a variety of objects, including an egg, spoon and its own Xbox controller. The gripper’s springs conform to the object it’s trying to grasp, while rubber fingertips achieve high-friction contact. The gripper’s Python interface provides motor current and position feedback as approximations of grip pose and grip force. Source: Hello Robot

A glimpse into the future
Thanks to its 3D perception and compliant gripper, Hello Robot in its launch video shows the Stretch RE1 autonomously navigating a home and picking up a variety of objects, including an egg, spoon, and its own Xbox controller, off a variety of surfaces. It also shows the robot autonomously opening a drawer and cleaning a table. The gripper’s springs conform to the object it’s trying to grasp, while rubber fingertips achieve high-friction contact. The gripper’s Python interface provides motor current and position feedback as approximations of grip pose and grip force. The launch video also shows the robot being tele-operated to play with a dog and vacuum a couch.

At the moment, the robot comes with


a proprietary gripper, but there is an open hardware tool-share option that allows customers to explore different attachments. The current version of Stretch is “very much a robot for researchers,” but Hello Robot has more lofty long-term goals in mind. The video provides a glimpse at how Hello Robot hopes future versions of Stretch will be used: at home, by everyday people, to perform everyday tasks.

“Human assistance is a longer-term goal, but we started in the research segment pragmatically,” said Kemp, who has conducted extensive research throughout his career on the use of assistive robots. “I have a model of a person in a wheelchair, so we used that when we came up with the dimensions and the design of Stretch.”

Hello Robot wouldn’t name any



of its early customers, but it said the first people it contacted are leading researchers in assistive robotics and home robotics. Hello Robot is licensing the design of the Stretch RE1 from Georgia Tech.

“We’re excited for researchers to help us make the future of mobile manipulation fun, useful and inclusive,” said Kemp. “This is a robot for everyone.” RR
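The gripper's current-based force feedback mentioned earlier rests on a simple motor model: grip force is roughly efficiency × torque constant × motor current ÷ effective moment arm. The constants below are illustrative placeholders to be calibrated against known loads, not Hello Robot's values.

```python
def grip_force_newtons(current_a, torque_constant=0.05,
                       effective_radius=0.01, efficiency=0.7):
    """Rough grip-force estimate from motor current: F ~ eta * k_t * I / r.
    k_t in N·m/A, effective_radius in m; all constants are placeholders
    that a real gripper would calibrate against known loads."""
    return efficiency * torque_constant * current_a / effective_radius
```

Because spring compliance, friction and gearing all perturb this relationship, such an estimate is an approximation of grip force, which matches how the article characterizes the feedback.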

Stretch’s arm consists of five telescoping links made of custom carbon fiber. They are driven by a single motor, which is attached to the robot’s vertical pole to minimize weight and inertia. The arm can reach up 43.3 inches high, extend outward 20.5 inches, and lift up to 3.3 lbs. Source: Hello Robot



Bootstrapping Hello Robot
While they did take meetings with potential investors, Edsinger and Kemp said they decided to self-fund Hello Robot. “The things we would’ve had to commit to, particularly given the cost of bringing hardware to market, would’ve almost guaranteed our failure,” said Edsinger. “We just recognized it was the wrong model for our early stage hardware startup. It’s worked out very well for us so far, and it has allowed us to retain control of the company.”

Despite Hello Robot likely being better positioned than other startups to take the self-funded approach, both of its founders have a bootstrap mentality. “There are dead bodies of venture-backed robot companies everywhere,” said Kemp. “We’re doing small production runs because it allows us to scale in a smart manner. We are able to be profitable and driven by customer demand from the beginning. That’s a great place to be.”

Edsinger said the initial plan was for Hello Robot to be larger scale and sell directly to consumers. But the team received sage advice. “Generally, we found there’s a mismatch between an investor’s business model and where the technology is today,” said Edsinger.

Edsinger and Kemp have known each other for years, going back to their days studying under Rodney Brooks at the Massachusetts Institute of Technology (MIT). The two continued to collaborate throughout the years – Kemp’s lab purchased one of the first M1 mobile manipulators from Meka Robotics – and plan to keep the team small, for the time being.

“Between Charlie and me, we’re pretty full stack. So we don’t have to hire a giant team,” said Edsinger. “It makes problem-solving and decision-making very efficient. At Google, I remember one meeting where we had to make a big decision with 12 people in the room. The outcome of the meeting was a plan to have another meeting.”

“At Hello Robot, we’re on generation eight of Stretch’s design. At one point, we knew the sixth-generation design wasn’t going to work.
And within one day, we re-routed the whole plan and didn’t get bogged down by the structure of the organization.” RR



Manipulation

Visual transfer learning helps robots manipulate objects
Pre-training deep learning models enabled robots to learn to pick and grasp arbitrary objects in unstructured settings in less than 10 minutes of trial and error.
By Yen-Chen Lin and Andy Zeng • Google

The idea that robots can learn to directly perceive the affordances of actions on objects (i.e., what the robot can or can’t do with an object) is called affordance-based manipulation. In these systems, affordances are represented as dense pixel-wise action-value maps that estimate how good it is for the robot to execute one of several predefined motions at each location. For example, given an RGB-D image, an affordance-based grasping model might infer grasping affordances per pixel with a convolutional neural network. The grasping affordance value at each pixel would represent the success rate of performing a corresponding motion primitive (e.g., a grasping action), which would then be executed by the robot at the position with the highest value.

For methods such as this, the ability to do more with less data is incredibly important, since data collection through physical trial and error can be both time-consuming and expensive. However, recent discoveries in transfer learning have shown that visual feature representations learned from large-scale computer vision datasets can be reused for deep learning agents, enabling them to learn faster and generalize better in video games and simulated environments. If end-to-end, affordance-based robot learning models that map from pixels to actions could similarly benefit from these visual


representations, one could begin to leverage the vast amounts of labeled visual data that are now available to more efficiently learn useful skills for real-world interaction with less training.

We investigated whether existing pretrained deep learning visual feature representations can improve the efficiency of learning robotic manipulation tasks, like grasping objects. By studying how we can intelligently transfer neural network weights between vision models and affordance-based manipulation models, we can evaluate how different visual feature representations benefit the exploration process and enable robots to quickly acquire manipulation skills using different grippers. We present practical techniques to pretrain deep learning models, which enable robots to learn to pick and grasp arbitrary objects in unstructured settings in less than 10 minutes of trial and error.
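The execute-at-the-highest-value policy described above reduces to an arg-max over the dense affordance map. The sketch below assumes the per-pixel grasp values have already been predicted by a network; it only shows the selection step.

```python
def best_grasp_pixel(affordance_map):
    """Return (row, col, value) for the highest-scoring pixel in a dense
    per-pixel grasp-affordance map (higher value = higher predicted
    success rate for executing the grasp primitive at that location)."""
    best = (0, 0, affordance_map[0][0])
    for r, row in enumerate(affordance_map):
        for c, value in enumerate(row):
            if value > best[2]:
                best = (r, c, value)
    return best
```

The robot would then execute the predefined grasp primitive at the camera ray corresponding to the winning pixel, record success or failure, and add the outcome to its training data.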



CONTROLFLEX COUPLINGS
WHAT IS A CONTROLFLEX COUPLING?

WHY RULAND?

• Lightweight, low inertia, and balanced design for speeds up to 25,000 RPM.

• Widest variety of standard Controlflex sizes in the world – over 550 hubs that make over 11,000 combinations.

• Ideal for encoders or other light duty applications.
• Single insert style for fit in compact spaces and high misalignment.
• Double insert style for added torque.
• Consistent zero-backlash operation.
• Manufactured by Schmidt-Kupplung in Wolfenbüttel, Germany.

• Large stock on hand in our Marlborough, MA USA factory.
• In-house rebore allows for quick turnaround on out of stock items.
• Easy to navigate website with full product data, CAD, and install videos.

Want to try Controlflex for yourself? Go to ruland.com/samples and we will send you up to 4 couplings to test in your next design.


CONTROLFLEX COUPLINGS

New

www.ruland.com



An overview of affordance-based manipulation. Credit: Google

Transfer learning for affordance-based manipulation
Affordance-based manipulation is essentially a way to reframe a manipulation task as a computer vision task, but rather than referencing pixels to object labels, we instead associate pixels to the value of actions. Since the structures of computer vision models and affordance models are so similar, one can leverage techniques from transfer learning in computer vision to enable affordance models to learn faster with less data. This approach repurposes pre-trained neural network weights (i.e., feature representations) learned from large-scale vision datasets to initialize the network weights of affordance models for robotic grasping.

In computer vision, many deep model architectures are composed of two parts: a “backbone” and a “head.” The backbone consists of weights that are responsible for early-stage image processing, such as filtering edges, detecting corners, and distinguishing between colors. The head consists of network weights that are used in latter-stage processing, such as identifying high-level features, recognizing contextual cues, and executing spatial reasoning. The head is often much smaller than the backbone and is also more task specific. Hence, it is common practice in transfer learning to pre-train (e.g., a ResNet on ImageNet) and share backbone weights between tasks, while randomly initializing the weights of the model head for each new task.
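The backbone/head recipe amounts to a selective weight-copying step: reuse pretrained backbone parameters, re-initialize the task-specific head. The dict-of-weights representation below is a stand-in for real model state dicts, not the authors' actual code.

```python
import random

def transfer_weights(pretrained, affordance, backbone_keys):
    """Copy pretrained backbone weights into the affordance model and
    randomly re-initialize everything else (the task-specific head).
    Models are dicts of layer name -> list of floats for illustration."""
    out = {}
    for name, weights in affordance.items():
        if name in backbone_keys and name in pretrained:
            out[name] = list(pretrained[name])        # reuse backbone
        else:
            out[name] = [random.gauss(0.0, 0.01)      # fresh head weights
                         for _ in weights]
    return out
```

Transferring the head as well (when its task is related enough, as with COCO segmentation) is the same operation with the head's layer names added to the copied set, which is the variation the next paragraphs evaluate.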

Following this recipe, we initialized our affordance-based manipulation models with backbones based on the ResNet-50 architecture and pre-trained on different vision tasks, including a classification model from ImageNet and a segmentation model from COCO. With different initializations, the robot was then tasked with learning to grasp a diverse set of objects through trial and error.

Initially, we did not see any significant gains in performance compared with training from scratch – grasping success rates on training objects were only able to rise to 77% after 1,000 trial-and-error grasp attempts, outperforming training from scratch by 2%. However, upon transferring network weights from both the backbone and

A NEW VIRTUAL EXPERIENCE FROM ROBOTICS BUSINESS REVIEW!

Fostering Innovation, Expanding Opportunities, Building a Community

FEATURING:

Speaker Presentations

Live Discussions

Follow-up Q&As w/ presenters

robobusiness.com






the head of the pretrained COCO vision model, we saw a substantial improvement in training speed – grasping success rates reached 73% in just 500 trial-and-error grasp attempts, and jumped to 86% by 1,000 attempts. In addition, we tested our model on new objects unseen during training and found that models with the pretrained backbone from COCO generalize better. The grasping success rates reach 83% with the pre-trained backbone alone and further improve to 90% with both pretrained backbone and head, outperforming the 46% reached by a model trained from scratch.

Transfer learning can improve exploration
In our experiments with the grasping robot, we observed that the distribution of successful grasps versus failures in the generated datasets was far more balanced when network weights from both the backbone and head of pretrained vision models were transferred to the affordance models, as opposed to only transferring the backbone. These results suggest that reusing network weights from vision tasks that require object localization (e.g., instance segmentation, like COCO) has the potential to significantly improve

Does first learning to see improve the speed at which a robot can learn to act? Google AI recently studied ways in which to transfer knowledge learned from computer vision tasks (left) to robot manipulation tasks (right). Credit: Google

the exploration process when learning manipulation tasks. Pretrained weights from these tasks encourage the robot to sample actions on things that look more like objects, thereby quickly generating a more balanced dataset from which the system can learn the differences between good and bad grasps. In contrast, pretrained weights from vision tasks that potentially discard objects’ spatial information (e.g., image classification, like ImageNet) can only improve the performance slightly compared to random initialization.

To better understand this, we visualize the neural activations that are triggered by different pre-trained models and a converged affordance model trained from scratch using a suction gripper. Interestingly, we find that the intermediate network representations learned from the head of vision models used for segmentation from the COCO dataset activate on objects in ways that are similar to the converged affordance model. This aligns with the idea that transferring as much of the vision model as possible (both backbone and head) can lead to more object-centric

exploration by leveraging model weights that are better at picking up visual features and localizing objects.

Limitations and Future Work
Many of the methods we use today for end-to-end robot learning are effectively the same as those being used for computer vision tasks. Our work here on visual pre-training illuminates this connection and demonstrates that it is possible to leverage techniques from visual pre-training to improve the learning efficiency of affordance-based manipulation applied to robotic grasping tasks.

While our experiments point to a better understanding of deep learning for robots, there are still many interesting questions that have yet to be explored. For example, how do we leverage large-scale pre-training for additional modes of sensing (e.g., force-torque or tactile)? How do we extend these pre-training techniques toward more complex manipulation tasks that may not be as object-centric as grasping? These areas are promising directions for future research. RR

Affordances predicted by different models from images of cluttered objects (a). (b) Random refers to a randomly initialized model. (c) ImageNet is a model with a backbone pre-trained on ImageNet and a randomly initialized head. (d) Normal refers to a model pre-trained to detect pixels with surface normals close to the anti-gravity axis. (e) COCO is the modified segmentation model (Mask R-CNN) trained on the COCO dataset. (f) Suction is a converged model learned from robot-environment interactions using the suction gripper. Credit: Google



SEARCH MILLIONS OF PARTS FROM THOUSANDS OF SUPPLIERS



PRICING & AVAILABILITY

DATA SHEETS & SPECS

SOURCE & PURCHASE

Get real-time pricing and stock info from authorized distributors and manufacturers.

View and download product data sheets and technical specifications.

Compare options from suppliers and buy direct from distributors and manufacturers.

ABOUT DESIGNFAST

DesignFast is a search engine for finding engineering components and products. With DesignFast, engineers and sourcing professionals can quickly search for products, compare prices, check stock, view data sheets and go direct to the supplier for purchase.

HOW DOES IT WORK?

DesignFast aggregates product data from thousands of suppliers and distributors and makes it available for searching. DesignFast provides pricing, availability and product data sheets for free download.

designfast.com DESIGN FAST HOUSE AD 11-20.indd 81

11/3/20 10:46 AM


Robotics

CGI Inc. Advanced Products for Robotics and Automation At CGI we serve a wide array of industries including medical, robotics, aerospace, defense, semiconductor, industrial automation, motion control, and many others. Our core business is manufacturing precision motion control solutions. CGI’s diverse customer base and wide range of applications have earned us a reputation for quality, reliability, and flexibility. One of the distinct competitive advantages we are able to provide our customers is an engineering team that is knowledgeable and easy to work with. CGI is certified to ISO 9001 and ISO 13485 quality management systems. In addition, we are FDA and AS9100 compliant. Our unique quality control environment is woven into the fabric of our manufacturing facility. We work daily with customers who demand both precision and rapid turnarounds.

ISO QUALITY MANAGEMENT SYSTEMS: ISO 9001• ISO 13485 • AS9100 • ITAR SIX SIGMA AND LEAN PRACTICES ARE EMBRACED DAILY WITHIN THE CULTURE

CGI Inc. 3400 Arrowhead Drive Carson City, NV 89706 Toll Free: 1.800.568.4327 Ph: 1.775.882.3422 Fx: 1.775.882.9599 WWW.CGIMOTION.COM

CS HYDE COMPANY Eliminating friction unleashes a gantry robot's full potential


Applying UHMW tape is a performance upgrade for all robotic rail systems. UHMW (ultra-high-molecular-weight polyethylene) is an abrasion-resistant material with anti-friction performance similar to PTFE. This plastic can be used on conveyor or guide rail systems across many industries. The tape is extremely abrasion- and impact-resistant, which enables it to withstand the repeated cycling of robotic gantry systems. Its low-friction, non-stick surface allows gantry robots to slide across rail systems freely. Eliminate drag and protect your rails from potential wear and tear. UHMW tape is available in slit-to-width rolls, sheets, strips, or custom die-cut parts. UHMW is supplied in sheet stock for mechanical fastening or as tape with a PSA adhesive for easy peel-and-stick application.

CS Hyde Company

1351 N. Milwaukee Ave. Lake Villa, IL 60046 Toll Free: 800.461.4161

www.cshyde.com




AGILUS KR 10 R1100-2 Robot Kit The KUKA Robotics AGILUS KR 10 R1100-2 is a 6-axis articulated arm with a rated payload of 5 kg and a maximum payload of up to 10.9 kg at reduced load center distances. With a reach of 1101 mm and a footprint of 208 mm x 208 mm, the AGILUS KR 10 robotics kit comes pre-plumbed and includes a KR C4 compact controller and a KUKA smartPAD teach pendant. The KR 10 is IP65 protected, has an operating temperature range of 0°C to 45°C, and is mountable in a variety of positions, including floor, ceiling, and wall mount. KUKA’s AGILUS series compact robots are unparalleled in performance and capability in their payload class.

Digi-Key Electronics 701 Brooks Ave. S. Thief River Falls, MN 56701 1-800-344-4539 www.digikey.com

Fully Integrated Speed Controller within 6.2 mm The FAULHABER BXT flat brushless DC servo motor family has grown and is now available in all sizes with a diameter-compliant, integrated speed controller. With an additional attachment length of just 6.2 mm, the combination of the BXT H motors with the integrated speed controller is the ideal solution for space-confined applications, particularly if speeds need to be controlled precisely and high torques are also required. The default factory pre-configuration, along with the Motion Manager software, allows for quick and easy commissioning of the system. Typical applications are medical devices, pumps, hand-held instruments, optics systems, and robotics and end-effectors.

FAULHABER MICROMO www.faulhaber.com 14881 Evergreen Ave Clearwater, FL 33760 USA

800-807-9166




Get the Right Handling System in Just Minutes The Handling Guide Online (HGO) is a unique sizing and selection web tool for Cartesian gantry systems. It offers you the following benefits:
• Efficiency — cuts your engineering time and effort to a minimum
• Intuitiveness — very easy to use, with structured prompts for data input
Just enter your application-specific data and within 20 minutes, HGO will generate several reliable and suitable solutions with the associated 3D CAD models, which can be downloaded immediately. You can then request a quote online and receive it within one to two business days. In three simple steps, you will be able to find the right standard handling system quickly and easily.

Phone: 1-800-993-3786 Web: www.festo.com e-mail: customer.service.us@festo.com

Safety and Security Solutions for Smart Machines. FORT Robotics empowers companies building and working with robotics and autonomy systems to scale quickly, reduce risk, and keep people safe from harm. FORT is the leading standard of safety and security solutions for any smart machine. The FORT Platform offers flexible configurations for safety-rated wireless controls, hardware-based security, and cloud-based machine access control. Our solutions extend to a variety of applications including material handling and warehousing, construction, manufacturing, and agriculture. Integrate with nearly any electrified machine from AMRs, automated fork trucks, and construction and agriculture equipment, to last-mile autonomous delivery vehicles. FORT is on the frontier of autonomy, building the foundation for the autonomous future by solving the challenges and complexities of safety and security requirements for robotics.


Festo Corporation 1377 Motor Parkway Suite 310 Islandia, NY 11749

fortrobotics.com contact@fortrobotics.com 267–515–5880




FUTEK We make innovation possible FUTEK Advanced Sensor Technology specializes in creating inventive sensor solutions for today’s leading tech innovators:
• Load cells
• Torque sensors
• Pressure sensors
• Multi-axis sensors
• Instruments
• Software
Our end-to-end measurement products and services include sensors, amplifiers, and calibration, allowing you to streamline and optimize your system and achieve better results at a lower cost than legacy solutions. All our products are made in the USA. To learn more, visit www.futek.com.

FUTEK Advanced Sensor Technology, Inc. 10 Thomas Irvine, CA 92618 USA www.futek.com +1 (949) 465-0900

GAM GAM provides a full range of zero-backlash gearboxes. GAM’s extensive product offering includes three different zero-backlash gearboxes: strain wave (harmonic), cycloidal, and the revolutionary zero-backlash planetary. The GAM GPL zero-backlash planetary gearbox features a unique design ensuring backlash of ≤ 0.1 arcmin for the life of the gearbox. The GPL provides vibration-free motion and high positional accuracy for precise, smooth path control and repeatability, with an impressive life of 20,000 hours. The GCL cycloidal gearbox provides precise point-to-point motion and high impact resistance of 5x nominal torque, with the option of an integral pre-stage. The GSL strain wave gearbox uses harmonic-type gearing for high accuracy and is a drop-in replacement for popular competitor gearboxes. With three options, GAM can provide the zero-backlash gearbox for your precision application.


GAM 901 E Business Center Drive Mount Prospect, IL 60056 888.GAM.7117 | 847.649.2500 www.gamweb.com info@gamweb.com




Harmonic Drive FHA Actuator with Integrated Servo Drive Just Released: FHA mini actuator with an integrated servo drive utilizing CANopen® communication. Eliminating the need for an external servo drive, it features a single cable connection with only 4 wires needed: CANH, CANL, +24VDC, 0VDC. A single-turn 14bit (16384 cpr) gear output sensing encoder has been integrated along with a single-turn 15bit (32768 cpr) motor input sensing encoder providing a true absolute encoder that does not require a battery within 360° of rotation of the output. The FHA-C mini Series is a family of extremely compact actuators that deliver high torque with exceptional accuracy and repeatability.

42 Dunham Ridge Beverly, MA 01915 United States www.harmonicdrive.net

Harmonic Drive is a registered trademark of Harmonic Drive Systems

Honeywell Intelligrated® Robotic Solutions By leveraging advanced robotic technology with extensive material handling experience, warehouse automation solutions from Honeywell Robotics provide the speed, accuracy and efficiency to satisfy a broad and growing range of operational requirements. Innovative designs, simulation tools that predict performance before installation, application expertise and committed support ensure maximum dependability and round-the-clock productivity. Robotic solutions also relieve workers of some of the most arduous, repetitive and injury-prone tasks, freeing up limited labor for more rewarding, higher-value jobs. Best of all, Honeywell Robotics solutions can be leveraged as part of larger integrated systems, backed by proven integration and support capabilities. Honeywell Intelligrated is recognized by the Robotic Industries Association (RIA) as a Certified Robot Integrator, with more than a quarter-century of experience providing single-source robotic solutions for high-performance distribution and manufacturing operations. From system concepting, simulation, fabrication and integration to installation and commissioning, training and ongoing support, each solution is approached with a comprehensive lifecycle view to maximize the value of your system.


Honeywell Intelligrated 1.866.936.7300

www.intelligrated.com




IDS Imaging Development Systems Inc. 3D camera with onboard data processing When 3D image processing is used in compute-intensive applications, interfaces and CPU power quickly become a bottleneck. Wouldn’t it be convenient if the 3D camera already did some of the computing? Ensenso XR is the first stereo vision camera offered by IDS that processes 3D data directly in the FPGA. The camera family initially consists of XR30 and XR36 models. They are very robust thanks to their IP65/67 protection class, feature 1.6 MP Sony sensors, and can detect objects at working distances of up to 5 m. Since the camera calculates 3D data itself, a high-performance industrial PC is no longer required. The transfer of result data instead of raw data also reduces the load on the network considerably.

IDS Imaging Development Systems Inc. 92 Montvale Ave, Suite 2800 Stoneham, MA 02180, USA Phone: +1 (781) 787-0048 Email: usasales@ids-imaging.us Web: www.ids-imaging.com

Keystone Electronics Corp. A world-class manufacturer of precision electronic components & hardware for over 70 years. Keystone’s design and engineering experts are fully integrated with their in-house precision tool & die division, supported by advanced manufacturing systems, to produce close-tolerance Stamping, Machining, Assembly, CNC and Injection Molded parts. Keystone utilizes state-of-the-art software to support the thousands of standard products found in their Product Design Guide M70 and Keystone’s Dynamic Catalog online. Product Overview: Battery Clips, Contacts & Holders; Fuse Clips & Holders; Terminals & Test Points; Spacers & Standoffs; Panel Hardware; Pins, Plugs, Jacks & Sockets; Multi-Purpose Hardware. As an ISO 9001:2015 certified manufacturer, Keystone’s quality control system, responsive customer service and custom manufacturing division can meet your challenges with a standard or custom design solution.


DESIGNERS & MANUFACTURERS


Keystone Electronics 55 S. Denton Ave. New Hyde Park, NY 11040 Tel: 1.800.221.5510 www.keyelco.com




Connectors 4 Robots LEMO connectors are used on collaborative robots for industrial applications, as well as in control, articulated-manipulator, and automation systems. As robots become more complex, LEMO connectors enable connecting sensors, motors, and actuators efficiently, even when the cabling layout is very dense. Thanks to the Push-Pull system, the connector can be easily mated and unmated, reducing maintenance and installation time. LEMO connectors are used extensively on quadrupedal and other legged robots, as well as on wheeled robots. LEMO’s high-speed circular connector (CAT6A signals) can be built into the 2K/2T/2B series, and offers IP68 watertightness and full EMC protection. Learn more at https://www.lemo.com/en/application/robotic-connector LEMO USA, Inc. 635 Park Court Rohnert Park, CA 94928 www.lemo.com info-us@lemo.com Tel: 707.206.3700

maxon Drive Systems for Robotics Reliable, Powerful, Efficient A complete joint actuation unit. Includes a brushless DC motor, an internal high-resolution encoder, a planetary gearhead with absolute encoder, and a position controller with CAN and RS232 interface. Exoskeleton Joint Actuator:
• Compact Housing
• Integrated Controller
• Reduced Weight and Cost
• For Use in Hip and Knee Exoskeletons
maxon is your single source for motion solutions. When you choose maxon, you can expect outstanding service, creative options, and quality without question. Want to get your ideas moving? Contact maxon today. Learn more about maxon solutions at www.maxongroup.us

maxon precision motors, inc. 125 Dever Drive Taunton, MA 02780 Phone: 508.677.0520 www.maxongroup.us info.us@maxongroup.com




VersaFlex Conveyors Handle Complex Layouts VersaFlex flat top chain conveyors, by mk North America, are the ideal conveyors for complex layouts, elevation changes, and small spaces. These conveyors are capable of vertical conveying, alpine configurations, and side-gripper applications, in addition to conveying product in any number of conventional layouts. What sets VersaFlex conveyors apart from the rest of the flat top chain conveyors in the marketplace is their ability to solve a variety of manufacturing challenges, including capacity issues, space constraints, and workforce shortages. Visit our website to learn more.


mk North America, Inc. 105-125 Highland Park Drive Bloomfield, CT 06002


www.mkversaflex.com/rr (860) 769-5500 info@mknorthamerica.com

JUST RELEASED: Motus Labs ML1000 Series precision M-DRIVES offer a higher torque density than competing strain wave gearing with no compromise in performance. The ML1000 family of hollow-shaft drives includes standard gear drive sizes ranging from 17 to 40. Motus Labs’ patented design utilizes a series of cam-driven blocks, instead of traditional gear teeth, that engage over 80% of the output ring surface area at all times, resulting in a more rigid drive at a lower weight. The design distributes load stresses over a much larger surface area, permitting the M-DRIVE transmission to deliver up to twice the torque per unit size and volume and 15% greater efficiency compared to strain wave drives.

Motus Labs 17815 Davenport Road Suite 130 Dallas, Texas 75252 www.motus-labs.com




New England Wire Technologies Advancing innovation for over 100 years Why accept a standard product for your custom application? NEWT is committed to being the premier manufacturer of choice for customers requiring specialty wire, cable, and extruded tubing to meet existing and emerging worldwide markets. Our custom products and solutions are not only engineered to the exacting specifications of our customers, but designed to perform under the harsh conditions of today’s advanced manufacturing processes. We specialize in litz wire, multi-conductor cables, hybrid configurations, coaxial, twinaxial, miniature and micro-miniature coaxial cables, ultra-flexible, high-flex-life, low/high-temperature cables, braids, and a variety of proprietary cable designs. Contact us today and let us help you dream beyond today’s technology and achieve the impossible.


New England Wire Technologies www.newenglandwire.com 603.838.6624

Pepperl+Fuchs, Inc. AGV Collision Avoidance and Cliff Detection: 4 Scanning Layers in Just One Device Pepperl+Fuchs‘ R2300 is a cost-effective and versatile multi-layer LiDAR sensor for object perception in 3D space. The sensor, powered by Pulse Ranging Technology (PRT), ensures high accuracy, noise immunity, and cross-talk protection. The high sampling rate and precise light spot make it ideal for positioning, object classification, and navigation-support tasks. The R2300 is also equipped with an integrated visible-red pilot laser that can be switched on to simplify installation and commissioning and switched off during operation. The R2300 is made with solid-state electronics, ensuring durability, efficiency, and longevity. www.pepperl-fuchs.com/usa/en/R2300_photoelectric_sensors.htm


Contact info:

Pepperl+Fuchs, Inc. 1600 Enterprise Parkway Twinsburg, OH 44087 330-425-3555 sales@us.pepperl-fuchs.com www.pepperl-fuchs.com




Posital-Fraba Upgrade Your Motor Feedback with POSITAL ABSOLUTE Kit Encoders POSITAL ABSOLUTE kit encoders offer a great upgrade path from the traditional incremental kit encoders used for servomotors. Compact, rugged, and cost-effective, they provide accurate position feedback for precision motion control in robots, production machinery, autonomous vehicles, and other motion and position control applications. They can also be used to provide closed-loop feedback control for stepper motors. Rotational resolution is up to 17 bits (one part in 130,000) with a multi-turn range of more than 8 million revolutions. Standardized compact form factors make POSITAL absolute kit encoders a straightforward replacement for US Digital or Broadcom incremental kit encoders in existing machinery or in new designs.

POSITAL-FRABA Inc. 1 N Johnston Ave, Suite C238 Hamilton, NJ 08609 Website: www.posital.com Email: info@fraba.com Phone: +1 609.750.8705
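As a rough illustration of what those resolution figures mean in software (our own hypothetical helper, not POSITAL's API), a 17-bit single-turn count plus a multi-turn count can be converted into a shaft position like this:

```python
# A 17-bit single-turn encoder resolves one revolution into
# 2**17 = 131,072 counts ("one part in 130,000" in round numbers).
SINGLE_TURN_BITS = 17
COUNTS_PER_REV = 2 ** SINGLE_TURN_BITS  # 131072

def encoder_to_position(turns: int, count: int) -> float:
    """Return shaft position in revolutions from multi-turn and single-turn counts."""
    if not 0 <= count < COUNTS_PER_REV:
        raise ValueError("single-turn count out of range")
    return turns + count / COUNTS_PER_REV

def position_to_degrees(position_rev: float) -> float:
    """Convert a position in revolutions to degrees."""
    return position_rev * 360.0

# Half a revolution into the third turn:
pos = encoder_to_position(turns=2, count=COUNTS_PER_REV // 2)
print(position_to_degrees(pos))  # 900.0 degrees (2.5 revolutions)
```

The function names and interface here are illustrative only; a real kit encoder reports its counts over the drive's feedback interface, and the multi-turn range simply bounds how large `turns` can grow.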

NEW AksIM-2™ rotary absolute kit encoders offer outstanding performance: up to 20 bits with no hysteresis RLS d.o.o., a Renishaw associate company, introduces an improved second generation of its AksIM™ absolute rotary encoders, widely used in many humanoid, medical, and collaborative robot (cobot) applications where hysteresis, large through holes, low profile, reliability, and repeatability are fundamental. The additional benefits of AksIM-2™ encoders are: • Full range of sizes • Onboard eccentricity calibration • Multiturn capability • Extended operating temperature and pressure ranges



Contact Info: 1001 Wesemann Drive West Dundee, IL 60118 Website: www.renishaw.com Phone: 847.286.9953 Email: usa@renishaw.com




Ruland Manufacturing Zero-Backlash Couplings for Robotic Systems Ruland Manufacturing offers a variety of zero-backlash servo couplings designed for use in high-precision applications like automation and robotics. Ruland offers beam, bellows, disc, oldham, jaw, and newly released Controlflex couplings in thousands of off-the-shelf combinations and sizes to help designers optimize their systems. Robotic vision systems, material handling robots, and automated guided vehicles have notoriously strict requirements that force engineers to balance torque, weight, damping, and more, all while retaining extremely precise power transmission. Ruland servo couplings excel in demanding applications and can be selected based on a wide variety of performance characteristics. Visit Ruland.com for access to everything you need to make a coupling design decision, including full technical product data, 3D CAD models, installation videos, and eCommerce to make prototyping easy.

Ruland Manufacturing 6 Hayes Memorial Dr. Marlborough, MA 01752 508-485-1000 www.ruland.com email: sales@ruland.com

SICK, Inc.

New Ultra-Compact Safety Laser Scanner – nanoScan3

A new ultra-compact safety laser scanner is now available from SICK that revolutionizes safe navigation for small AGVs or mobile robots. With an overall height of just over three inches, the nanoScan3 is a space-saving sensor that can be used where machines and vehicles require maximum performance but have minimal mounting space. Product benefits:
• Small housing, measuring only 3.15 inches in height
• Two pairs of OSSD safety outputs
• Up to 128 freely configurable fields and monitoring cases
• Direct static and encoder inputs for flexible monitoring case switching
• Protective field range of three meters with a scanning angle of 275 degrees
• High-precision measurement data output for navigation support via Ethernet interface
• Maximum detection reliability even when subject to challenging ambient conditions


SICK, Inc. 6900 West 110th St. Minneapolis, MN 55438 USA www.sick.com info@sick.com




Looking for compact, rugged motion sensing with premium performance? In operation on the factory floor, in the fields, in the air and under water, Silicon Sensing’s DMU11 inertial measurement unit (IMU) delivers complete motion sensing in three-dimensional space. This is a compact, precise, six-degrees-of-freedom (6-DOF) device ideal for any motion control or stabilisation role. Low cost and able to fit in the smallest space, it delivers market-leading performance that is calibrated over its full rated temperature range. All Silicon Sensing MEMS gyroscopes, accelerometers & inertial systems deliver precise, rugged, ultra-reliable inertial sensing. Silicon Sensing www.siliconsensing.com Clittaford Road Southway Plymouth Devon PL6 6DE England Ph: 01752 723330

THK Micro Cross-Roller Ring Series RAU The THK micro-size Cross-Roller Ring RAU features a 10mm inner diameter and a 21mm outer diameter. It is more compact, lightweight and rigid than a double row angular contact ball bearing type. RAU rollers travel on V-shaped raceways ground into the inner and outer rings. Alternating rollers are arrayed orthogonally so that one bearing can support loads and moments in any direction. Spacer retainers enable smooth movement and high rotation accuracy. https://www.thkstore.com/products/rotation/cross-roller-rings/rau.html

THK America, Inc. 200 East Commerce Drive Schaumburg, IL 60173 Phone: 847-310-1111 www.THK.com




See more. Do more. The Zivid Two 3D color camera brings human-like vision to pick-and-place robotics. This second-generation color 3D camera breaks speed, image quality, and trueness barriers for fast cycle times, better object detection, and accurate manipulation of parts.

zivid.com

It’s not a web page, it’s an industry information site So much happens between issues of R&D World that even another issue would not be enough to keep up. That’s why it makes sense to visit rdworldonline.com and stay on Twitter, Facebook, and LinkedIn. It’s updated regularly with relevant technical information and other news significant to the design engineering community.

rdworldonline.com



WEBINAR SERIES

CUSTOM CONTENT IN A LIVE, INTERACTIVE OR ON-DEMAND FORMAT. CHECK OUT OUR WEBINARS TODAY:

designworldonline.com/category/webinars

fluidpowerworld.com/category/webinars

therobotreport.com/category/robotic-webinars

eeworldonline.com/category/webinars

solarpowerworldonline.com/category/featured/webinars

windpowerengineering.com/category/featured/webinars

medicaldesignandoutsourcing.com/webinars

WTWH MEDIA’S WEBINARS OFFER:
• Coverage of a wide range of topics
• Help for engineers to better understand technology- or product-related issues and challenges
• Educational material related to specific topics

Medical Design & OUTSOURCING



AD INDEX Azoth .................................................................................... 51

SALES

LEADERSHIP TEAM

Jami Brownlee

Publisher Mike Emich

jbrownlee@wtwhmedia.com 224-760-1055

Bishop Wisecarver .......................................................... 27

Mike Caruso

Bodine Electric Company ....................................... 37,39

mcaruso@wtwhmedia.com 469.855.7344

CGI Inc. ................................................................................ 67 CS Hyde Company ........................................................... 10

FAULHABER MICROMO ................................................IBC

smccafferty@wtwhmedia.com 310.279.3844 @SMMcCafferty

Jim Dempsey

EVP Marshall Matheson

jdempsey@wtwhmedia.com 216.387.1916

Festo .................................................................................... 63

Michael Ference

Formant .............................................................................. 38

Managing Director Scott McCafferty

Bill Crowley

bcrowley@wtwhmedia.com 610-420-2433

Digi-Key ............................................................................... 23

memich@wtwhmedia.com 508.446.1823 @wtwh_memich

mmatheson@wtwhmedia.com 805.895.3609 @mmatheson

mference@wtwhmedia.com 408.769.1188 @mrference

Fort Robotics .................................................................... 22

Mike Francesconi

FUTEK Advanced Sensor Technology, Inc. ...........BC

mfrancesconi@wtwhmedia.com 630.488.9029

GAM ...................................................................................... 62

Neel Gleason

Harmonic Drive .....................................................................1 Honeywell Intelligrated ....................................................3

ngleason@wtwhmedia.com 312.882.9867 @wtwh_ngleason

IDS Imaging Development Systems GmbH ........... 59

Jim Powers

jpowers@wtwhmedia.com 312.925.7793 @jpowers_media

Keystone Electronics Corp. ............................................5 LEMO USA ...........................................................................71

Courtney Nagle

maxon ........................................................................ Cover,9

cseel@wtwhmedia.com 440.523.1685 @wtwh_CSeel

mk North America, Inc. .................................................. 70 Motus Labs ........................................................................ 29 New England Wire Technologies & New England Tubing Technologies ................... 79 Pepperl+Fuchs, Inc. ........................................................ 20 POSITAL-FRABA ................................................................74 Renishaw ............................................................................ 58 Ruland Manufacturing Co., Inc. ...................................77 SICK, Inc. ............................................................................. 45 Silicon Sensing Systems ............................................... 21 THK America, Inc ............................................................. IFC Zivid ...................................................................................... 43

FOLLOW US ON

Follow the whole team on Twitter @DesignWorld





Giving robots a sense of touch FUTEK's miniaturized sensor technology allows surgeons to perform as if they had virtual fingertips. The sensors’ precise measurement and feedback allow the machine to emulate the dexterity and haptics of human hands.

Conceptual rendering of the multi-jointed robotic arm of a surgical system, with numbered sensor callouts:

1. QTA143 Micro Reaction Torque Sensor. Dimensions: 14 mm × 10 mm × 26 mm. Provides closed-loop feedback on torque measurement.
2. LSB205 Miniature S-Beam Jr. Load Cell. Dimensions: 19 mm × 18 mm × 6.6 mm. Provides critical force feedback.
3. QLA401 Load Cell Built for Autoclave. Dimensions: Ø 14 mm × 3.28 mm. Designed to withstand the autoclave sterilization process.
4. QLA414 Nano Force Sensor. Dimensions: 4 mm × 5 mm. Enables direct measurement that eliminates any drift in the output.

go.futek.com/medtech
Certifications: ANSI Z540-1, ISO 17025, ISO 9001, ISO 13485
U.S. Manufacturer

