The Robot Report
A Supplement to Design World | October 2020 | www.designworldonline.com

CropHopper robot takes novel approach to field monitoring (page 62)

INSIDE:
• AI in agriculture: How PyTorch enables Blue River’s robots (page 54)
• Augean Robotics mechanizes food production, from farm to table (page 66)
• Building service network for 10,000 autonomous combines (page 70)



The Robot Report

AI in agriculture: How PyTorch enables Blue River’s robots
The PyTorch open-source machine learning library has helped guide weeding robots.

Chris Padwick | Blue River Technology

How did farming affect your day today? If you live in a city, you might feel disconnected from the farms and fields that produce your food. Agriculture is a core piece of our lives, but we often take it for granted.

Farmers today face a huge challenge: feeding a growing global population with less available land. The world’s population is expected to grow to nearly 10 billion by 2050, increasing global food demand by 50%. As this demand for food grows, land, water, and other resources will come under even more pressure. The variability inherent in farming, like changing weather conditions, and threats like weeds and pests also have consequential effects on a farmer’s ability to produce food. The only way to produce more food while using fewer resources is through agricultural robots that can help farmers with difficult jobs, offering more consistency, precision, and efficiency.

Agricultural robots use PyTorch
At Blue River Technology, we are building the next generation of smart machines. Farmers use our tools to control weeds and reduce costs in a way that promotes agricultural sustainability. Our weeding robot integrates cameras, computer vision, machine learning, and robotics. The intelligent sprayer drives through fields using AutoTrac to minimize the load on the driver, and it quickly targets and sprays weeds, leaving the crops intact.



A 2017 prototype of the See & Spray precision weed-control machine. | Source: Blue River Technology

The machine needs to make real-time decisions about what is a crop and what is a weed. As it drives through the field, high-resolution cameras collect imagery at a high frame rate. We developed a convolutional neural network (CNN) using PyTorch to analyze each frame and produce a pixel-accurate map of where the crops and weeds are. Once the plants are all identified, each weed and crop is mapped to a field location, and the robot sprays only the weeds. This entire process happens in milliseconds, allowing the farmer to cover as much ground as possible, since efficiency matters.

To support the machine learning (ML) and robotics stack, we built an impressive compute unit based on the NVIDIA Jetson AGX Xavier system on module (SOM), an edge AI computer. Since all our inference happens in real time, uploading to the cloud would take too long, so


we bring the server farms to the field. The total compute power onboard the robot dedicated just to visual inference and spray robotics is on par with IBM’s Blue Gene supercomputer (2007). That gives it some of the highest compute capacity of any moving machinery in the world!

Building weed-detection models
My team of researchers and engineers is responsible for training the neural network model that identifies crops and weeds. This is a challenging problem because many weeds look just like crops. Professional agronomists and weed scientists train our labeling workforce to label the images correctly.
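The article doesn’t include the production code, but the per-frame flow described above (frame in, pixel-accurate class map out, spray decision from the weed pixels) can be sketched roughly as follows. The class indices and function boundaries here are assumptions, not Blue River’s actual interfaces:

import torch

CROP, WEED = 1, 2  # assumed class indices; the real label map isn't published

@torch.no_grad()
def weed_mask_for_frame(model, frame):
    # frame: float tensor of shape (3, H, W), already normalized for the network
    logits = model(frame.unsqueeze(0))   # (1, num_classes, H, W)
    labels = logits.argmax(dim=1)[0]     # pixel-accurate class map, shape (H, W)
    return labels == WEED                # spray decision: True where weeds are

On the machine, this decision has to fit inside a millisecond-scale frame budget, which is why the deployment section later in this article focuses on TensorRT.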



Field operations specialist Alex Marsh with a large self-propelled sprayer. | Source: Blue River Technology

In the image below, the cotton plants are in green, and the weeds are in red.

Machine learning stack uses PyTorch for training
On the machine learning front, we have a sophisticated stack. We use PyTorch for training all our models, and we have built a set of internal libraries on top of PyTorch that lets us perform repeatable machine learning experiments. The responsibilities of my team fall into three categories:
• Build production models to deploy onto the robots
• Perform machine learning experiments and research with the goal of continually improving model performance
• Data analysis/data science related to machine learning, A/B testing, process improvement, and software engineering

We chose PyTorch because it’s very flexible and easy to debug. New team members can quickly get up to speed, and the documentation is thorough. Before working with PyTorch, our team used Caffe and TensorFlow extensively. In 2019, we made the decision to switch to PyTorch, and the transition was seamless. The framework gives us the ability to support production model workflows and research workflows simultaneously.

For example, we use the torchvision library for image transforms and tensor transformations. It contains some basic functionality, and it integrates really nicely with sophisticated augmentation packages like imgaug. The transforms object in torchvision is a piece of cake to integrate with imgaug. Let’s look at a code example using the Fashion MNIST dataset. A class called CustomAugmentor initializes the iaa.Sequential object in the constructor, then calls augment_image() in the __call__ method. CustomAugmentor() is then added to the call to transforms.Compose(), prior to ToTensor(). Now the train and val data loaders will apply the augmentations defined in CustomAugmentor() when the batches are loaded for training and validation. In addition, PyTorch has emerged as a favorite tool in the computer vision ecosystem. (Looking at Papers With Code, PyTorch is a common submission.)
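The code listing itself didn’t survive reproduction here, but the pattern just described is small. A minimal sketch, with the specific augmenters inside iaa.Sequential chosen as placeholders (the article doesn’t list the ones used):

import numpy as np
import torch
from torchvision import datasets, transforms
from imgaug import augmenters as iaa

class CustomAugmentor:
    """Wraps an imgaug pipeline so it fits inside transforms.Compose()."""
    def __init__(self):
        # Example augmenters only; swap in whatever the experiment needs.
        self.aug = iaa.Sequential([
            iaa.Fliplr(0.5),
            iaa.GaussianBlur(sigma=(0.0, 1.0)),
        ])

    def __call__(self, img):
        # imgaug operates on NumPy arrays, so convert from PIL first.
        return self.aug.augment_image(np.array(img))

transform = transforms.Compose([
    CustomAugmentor(),       # runs prior to ToTensor(), as described above
    transforms.ToTensor(),
])

train_set = datasets.FashionMNIST("data", train=True, download=True,
                                  transform=transform)
train_loader = torch.utils.data.DataLoader(train_set, batch_size=64,
                                           shuffle=True)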

Would you have distinguished cotton plants from weeds at a glance? | Source: Blue River Technology


This makes it easy for us to try out new techniques like Debiased Contrastive Learning for semi-supervised training.

On the model-training front, we have two normal workflows: production and research. For research applications, our team runs PyTorch on an internal, on-premise compute cluster. Jobs executed on the on-premise cluster are managed by Slurm, an HPC batch-job scheduler. It is free, easy to set up and maintain, and provides all the functionality our group needs for running thousands of machine learning jobs. For our production-based workflows, we utilize Argo workflows on top of a Kubernetes (K8s) cluster hosted in AWS. Our PyTorch training code is deployed to the cloud using Docker.

Deploying models on field robots
For production deployment, one of our top priorities is high-speed inference on the edge computing device. If the robot needs to drive more slowly to wait for inferences, it can’t be as efficient in the fields. To this end, we use TensorRT to convert the network to an Xavier-optimized model. TensorRT doesn’t accept JIT models as input, so we use ONNX to convert from JIT to ONNX format. From there, we use TensorRT to produce a TensorRT engine file that we deploy directly to the device. As the tool stack evolves, we expect this process to improve as well. Our models are deployed to Artifactory using a Jenkins build process, and they are deployed to remote machines in the field by pulling from Artifactory.
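A hedged sketch of that export path: the stand-in network, input resolution, and trtexec flags below are illustrative, not Blue River’s actual configuration.

import torch
import torch.nn as nn

# Stand-in segmentation network; the production crop/weed CNN is not public.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 3, 1),  # per-pixel logits for background/crop/weed (assumed)
).eval()

dummy = torch.randn(1, 3, 512, 512)  # input resolution is an assumption

# Step 1: PyTorch -> ONNX (TensorRT consumes ONNX, not TorchScript/JIT).
torch.onnx.export(model, dummy, "model.onnx", opset_version=11,
                  input_names=["image"], output_names=["logits"])

# Step 2: ONNX -> TensorRT engine, typically built on the Xavier itself:
#   trtexec --onnx=model.onnx --saveEngine=model.engine --fp16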



To monitor and evaluate our machine learning runs, we have found the Weights & Biases (W&B) platform to be the best solution. Its API makes it fast to integrate W&B logging into an existing codebase. We use W&B to monitor training runs in progress, including live curves of the training and validation loss.

SGD vs. Adam
As an example of using PyTorch and W&B, I will run an experiment and compare the results of using different solvers in PyTorch. There are a number of different solvers in PyTorch, and the obvious question is: which one should you pick? A popular choice is Adam. It often gives good results without needing to set any parameters and is our usual choice for our models. In PyTorch, this solver is available as torch.optim.Adam. Another popular choice of solver


for machine learning researchers is stochastic gradient descent (SGD). This solver is available in PyTorch as torch.optim.SGD. Momentum is an important concept in machine learning, as it can help the solver find better solutions by avoiding getting stuck in local minima in the optimization space. Using SGD with momentum, the question is this: can I find a momentum setting for SGD that beats Adam?

The experimental setup is as follows. I use the same training data for each run and evaluate the results on the same test set. I’m going to compare the

F1 score for plants between different runs. I set up a number of runs with SGD as the solver, sweeping through momentum values from 0 to 0.99. (When using momentum, anything greater than 1.0 causes the solver to diverge.) I set up 10 runs with momentum values from 0 to 0.9 in increments of 0.1. Following that, I performed another set of 10 runs, this time with momentum values between 0.90 and 0.99, in increments of 0.01. After looking at these results, I also ran a set of experiments at momentum values of 0.999 and 0.9999. Each run was done with a different random seed and was given the tag “SGD Sweep” in W&B.
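Reproducing that sweep takes only a few lines wrapped around an existing training loop. In this sketch, build_model(), train(), and evaluate() are hypothetical stand-ins for code the article doesn’t show, and the project name is an assumption:

import random
import torch
import wandb

# Momentum values described above: 0-0.9 by 0.1, 0.90-0.99 by 0.01,
# plus the two extra runs at 0.999 and 0.9999.
momenta = ([i / 10 for i in range(10)]
           + [0.90 + i / 100 for i in range(10)]
           + [0.999, 0.9999])

for m in momenta:
    seed = random.randrange(2**31)            # a different seed per run
    torch.manual_seed(seed)
    wandb.init(project="plant-segmentation",  # project name is an assumption
               tags=["SGD Sweep"],
               config={"optimizer": "SGD", "momentum": m, "seed": seed},
               reinit=True)
    model = build_model()                     # hypothetical helper
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=m)
    train(model, optimizer)                   # would call wandb.log() per epoch
    wandb.log({"f1_plants": evaluate(model)}) # hypothetical helper
    wandb.finish()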



Figure 1: On the left side, the F1 score for crops is shown on the x axis, and the run name is shown on the y axis. On the right side, the F1 score for plants is shown as a function of momentum value. | Source: Blue River Technology

The results are shown in Figure 1. It is clear from Figure 1 that larger momentum values increase the F1 score. The best value, 0.9447, occurs at a momentum of 0.999 and drops off to 0.9394 at a momentum of 0.9999. The values are shown in the table below.

How do these results compare to Adam? To test this, I ran 10 identical runs using torch.optim.Adam with just the default parameters. I used the tag “Adam runs” in W&B to identify these runs.


I also tagged each set of SGD runs for comparison. Since a different random seed is used for each run, the solver will initialize differently each time and will end up with different weights at the last epoch. This gives slightly different results on the test set for each run. To compare them, I needed to measure the spread of values for the Adam and SGD runs. This is easy to do with a box plot grouped by tag in W&B. The results are shown in graph form in Figure 2, and in tabular form in Table 2. The full report is available online, too.

You can see that I haven’t been able to beat the results for Adam just by adjusting momentum values with SGD. The momentum setting of 0.999 gives very comparable results, but the variance on the Adam runs is tighter, and the average value is higher as well. So Adam appears to be a good choice of solver for our plant segmentation problem!

PyTorch visualizations
With the PyTorch integration, W&B picks up the gradients at each layer, letting us inspect the network during training. W&B experiment tracking also makes it easy to visualize PyTorch models during training, so you can see the loss curves in real time in a central dashboard. We use these visualizations in our team meetings to discuss the latest results and share updates. As the images pass through our PyTorch model, we seamlessly log predictions to Weights & Biases to visualize the results of model training. Here we can see the predictions, ground truth, and labels. This makes it easy to identify scenarios where model performance isn’t meeting our expectations. Here we can quickly browse the ground truth and predictions and the difference between the two. We’ve labeled the crops in green and the weeds in red. As you can see, the model is doing a pretty reasonable job of identifying the crops and the weeds in the image.
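Both behaviors map onto standard W&B calls. A sketch, with the class labels and array shapes assumed:

import wandb

class_labels = {0: "background", 1: "weed", 2: "crop"}  # assumed indices

# wandb.watch() hooks the model so gradients and parameters are logged
# automatically during training:
#   wandb.watch(model, log="all")

def log_predictions(image, pred_mask, gt_mask):
    """image: H x W x 3 array; pred_mask/gt_mask: H x W integer class maps."""
    wandb.log({
        "segmentation": wandb.Image(image, masks={
            "prediction":   {"mask_data": pred_mask, "class_labels": class_labels},
            "ground_truth": {"mask_data": gt_mask,   "class_labels": class_labels},
        })
    })

The masks argument renders the predicted and ground-truth class maps as toggleable overlays in the W&B dashboard, which is what makes the side-by-side browsing described above practical.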

Table 1: Each run is shown as a row. The last column is the momentum setting for the run. The F1 score, precision, and recall for Class 2 (crops) are shown. | Source: Blue River Technology


Reproducible models
Reproducibility and traceability are key features of any ML system, and they are hard to get right. When comparing different network architectures and hyperparameters, the input data needs to be the same to make the runs comparable.




Table 2: Run table showing F1 score, optimizer, and momentum value for each run. | Source: Blue River Technology


Often, individual practitioners on ML teams save YAML or JSON config files. It’s excruciating to find a team member’s run and wade through their config file to find out what training set and hyperparameters were used. We’ve all done it, and we all hate it.

A feature that W&B recently released solves this problem. Artifacts allow us to track the inputs and outputs of our training and evaluation runs, which helps a lot with reproducibility and traceability. By inspecting the Artifacts section of a run in W&B, I can tell what datasets were used to train the model, what models were produced from multiple runs, and the results of the model evaluation.

A typical use case is the following. A data-staging process downloads the latest and greatest data and stages it to disk for training and test, with separate datasets for each. These datasets are specified as artifacts. A training run takes the training set artifact as input and outputs a trained model as an output artifact. The evaluation process takes the test set artifact as input, along with the trained model artifact, and outputs an evaluation that might include a set of metrics or images. A directed acyclic graph (DAG) is formed and visualized within W&B, as shown below. This is helpful since it is very important to track the artifacts that are involved with releasing a machine learning model into production.
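A sketch of that staging/training flow using the W&B Artifacts API; the project, artifact names, and file paths are placeholders:

import wandb

# Training run: consume the dataset artifact, produce a model artifact.
run = wandb.init(project="plant-segmentation", job_type="train")
dataset = run.use_artifact("field-images:latest")   # input artifact
data_dir = dataset.download()                        # stage the data to disk

# ... train the model and save weights to model.pt (not shown) ...

model_art = wandb.Artifact("crop-weed-model", type="model")
model_art.add_file("model.pt")
# For large datasets, add_reference() stores only a pointer (e.g., to S3),
# so the data itself never leaves our control:
#   model_art.add_reference("s3://bucket/datasets/field-images/")
run.log_artifact(model_art)                          # output artifact
run.finish()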

Figure 2: The spread of F-score values for the Adam and SGD runs. The Adam runs are shown on the left of the graph in green. The SGD runs are shown in brown (0.999), teal (0–0.99), blue (0.9999), and yellow (0.95). | Source: Blue River Technology



The artifacts DAG visualized in W&B, linking producer (1), dataset (4), train (10), model (4), and eval (3) nodes. | Source: Blue River Technology

A DAG like this can be formed easily. One of the big advantages of the Artifacts feature is that you can choose to upload all the artifacts (datasets, models, evaluations), or you can choose to upload only references to the artifacts. This is a nice feature, because moving lots of data around is time-consuming and slow. With the dataset artifacts, we simply store a reference to those artifacts in W&B. That allows us to maintain control of our data (and avoid long transfer times) and still get traceability and reproducibility in machine learning.

Leading ML teams
Looking back on the years I’ve spent leading teams of machine learning engineers, I’ve seen some common challenges:

• Efficiency: As we develop new models, we need to experiment quickly and share results. PyTorch makes it easy for us to add new features fast, and Weights & Biases gives us the visibility we need to debug and improve our models.
• Flexibility: Working with our customers in the fields, every day can bring a new challenge. Our team needs tools that can keep up with our constantly evolving needs, which is why we chose PyTorch for its thriving ecosystem and W&B for its lightweight, modular integrations.
• Performance: At the end of the day, we need to build the most accurate and fastest models for our field machines. PyTorch enables us to iterate quickly, then productionize our models and deploy them in the field. We have full visibility and transparency in the development process with W&B, making it easy to identify the most performant models.


The ground truth, predictions, and the difference between the two. Crops are shown in green, while weeds are shown in red. | Source: Blue River Technology

I hope you have enjoyed this short tour of how my team uses PyTorch and Weights & Biases to enable the next generation of intelligent agricultural machines! RR

About the author: Chris Padwick is the director of computer vision and machine learning at Blue River Technology. The company was acquired by John Deere in 2017 for $305 million.





The Robot Report

CropHopper robot takes novel approach to field monitoring
Part drone, part legged robot, CropHopper jumps across fields to scan crops and identify weeds and diseases.

Eugene Demaitre | Senior Editor, The Robot Report

Robots can perform many functions in agriculture, from monitoring crops and livestock, to applying fertilizer and pesticide, to harvesting fruits and vegetables. While most farm automation involves wheeled robots or aerial drones, HayBeeSee has taken a different approach with its CropHopper robot. Part drone, part legged robot, CropHopper is designed to move across fields and monitor crops, identify weeds and insects, and watch for signs of disease. It can also conduct mechanical weeding and spot spraying, said the London-based company. CropHopper can provide better information for farmer decision-making than periodic aircraft surveys or slower ground-based robots, claimed HayBeeSee.


Understanding agricultural needs
“I studied aerospace engineering and was looking for the next big thing in 2013,” recalled Fred Miller, founder and CEO of HayBeeSee. “Both led to agriculture and robotics, and in 2015, we looked at quadcopters in agriculture.”
“People didn’t understand early on what problems in agriculture to solve,” Miller told The Robot Report.



HayBeeSee is currently working on making its CropHopper robot more rugged, sealing it for dust, and testing different foot designs. | Credit: HayBeeSee

“When you talk to agronomists, they view farming as a system. You could use a waterfall chart to show the contribution of things like cultivation, the choice of crop-rotation strategy, and different applications of herbicides. A lot comes down to timing.”
“The problem is that robotics startups were going with a stand-alone model, but it’s not like washing a car. You have to stay on it,” he said. “If you only go out once in a while, you have to waste a lot of time mapping, and with no data to start, you don’t know how effective a spray is. Farmers need something that visits the field very frequently, not simply technology to make one or two scans at the end of the season.”
“A tractor can’t scan every few days, a UAV can’t scan close to the ground, and a satellite can’t provide enough data for decisions,” said Miller. “At 14 kph [8.6 mph], a tractor for spraying or


harvesting moves too quickly to detect weeds. CropHopper is more than a remote-control airplane or a pre-existing vehicle base for cameras.”

CropHopper designed for daily monitoring
HayBeeSee said its lightweight robot is designed to quickly and efficiently leap as part of an integrated, full-season management strategy. Two carbon fiber arms are key to the hybrid design. The arms are tightened using a winding mechanism and then released, bouncing the robot into the air. Its four propellers then engage to propel CropHopper forward as its camera captures field images. To extend drone flight times, Miller’s team considered using solar cells, but that was not




To extend its flight times, HayBeeSee considered using solar cells, but that was not practical. CropHopper needs to recharge every three hours, and it can cover 29.6 acres per hour. The goal is for it to cover 172.9 acres per day. | Credit: HayBeeSee

practical, he said. CropHopper needs to recharge every three hours, and it can cover 12 hectares (29.6 acres) per hour. The goal is for it to cover 70 hectares (172.9 acres) per day.

“I already knew from aerospace that it’s hard to optimize for lift and weight, so a jumping mechanism is more efficient,” said Miller. “We hired a Ph.D. student who was a jumping robotics expert. We then put flex legs on a quadcopter.”
“Depending on where you are in the world, the average field is 15 hectares [37 acres],” he said. “CropHopper currently works best in smaller fields, but since it is designed to run for three years without replacing parts, you could have a different strategy with multiple robots for larger fields.”
“When we thought about vehicle missions, we started out thinking about the field as a rectangle, but we soon realized that farmers just need representative samples,” Miller explained. “CropHopper can accelerate from point to point faster than a wheeled robot. Also, it weighs only 3.5 kg [7.7 lb.] and can move across terrain and over obstacles more like four-legged robots for the military, which provides some flexibility and doesn’t compact soil.”
“At the beginning of 2018, we built a device that could jump, control itself, and land,” added Tomasz Wierzchowski, chief technology officer at HayBeeSee. “We’ve


been polishing it up with computer vision and conducting trials.”

More data could yield better farm returns
Last year, CropHopper mapped 30 hectares (74 acres) of fields for an estimated 60% in chemical savings. HayBeeSee is currently working on making its robot more rugged, sealing it for dust, and testing different foot designs. While CropHopper’s current height of 70 cm (27.5 in.) is suitable for wheat and barley, foot extensions could make it useful for taller crops such as soybeans and corn, Wierzchowski said. Most legged robots have been designed for more open terrain, he noted.
“We initially focused on weed killing, because CropHopper can cover a lot of land and fly between patches of weeds and crops, but it has gotten much bigger than that,” Miller said. “Agronomists like that it can work all season long, and no other robots can go to the field every day and do the job of killing bugs or applying fertilizer.”
“Image quality, accounting for the robot’s motion, and understanding what you want to measure and at what height are more things that people need to understand,” he said. “There has been a bottleneck of data and technology.”
CropHopper sends data to a Web-based portal where farmers can plan

modifications. HayBeeSee said it could help increase yields by 10% to 30% and triple profits. In addition, the company said its hopping robot can reduce environmental impacts by cutting carbon emissions and improve sustainability with a lower impact on soils. HayBeeSee said that all of its test farms, plus 20 more, have requested the product when it is ready.

Agbotics ready for growth
HayBeeSee has received angel funding from ESA Business Incubation Centre UK, Innovate UK, the London Co-Investment Fund, and Newable. The company is looking for seed funding.
“In agriculture, there has been some resistance to investing in innovation, but European policies are now driving change,” Miller said. “There is now a lot of interest in farm management platforms, but there’s still a need for new hardware. We need to go from Ph.D. specialists to the wider knowledge base of industry and engineers.” RR





The Robot Report

Augean Robotics mechanizes food production, from farm to table
Recent automation demonstrations show both the promise and challenge of the agricultural market.

Oliver Mitchell

Back at CES in January, many people may have wondered why a tractor was taking up more than 120 feet inside the already overpacked robotics section of the Las Vegas Convention Center. Ever since its acquisition of artificial intelligence startup Blue River Technology for $305 million, John Deere has been betting its future on data-driven agriculture. Hence the enormous green combine on the show floor. “It’s a great chance for those in the tech industry to visit with them and to learn more about how their food is produced and the important role technology plays and will continue to play in putting food on their tables,” explained Laurel Caes, public relations manager at John Deere.

I hosted Charles Andersen, CEO of Augean Robotics, at RobotLab to dig into the agritech market. He grew up with the industry in a family with generations of farmers. After business school, Andersen worked at John Deere’s largest competitor, Case New Holland (CNH). After analyzing Blue River and the wider unmanned marketplace for CNH, Andersen concluded that “autonomy is a force for new market disruption within agriculture, meaning that it is a force best commercialized by startups, so I decided to start a robotics company focused on agriculture.”

Augean Robotics, one of The Robot Report’s 2019 Startups to Watch, has one of the few systems actually working in the fields, while other upstarts are still tinkering indoors. “Roughly 2 million U.S. farms produce about $400 billion in revenue annually,” he noted. “On a revenue basis, half of output is crops, and half is livestock.”



The Burro agricultural robot. | Source: Augean Robotics

Specialty crops ripe for disruption
Livestock and grain production are already on track to becoming fully automated, Andersen claimed. “Livestock production is often fairly mechanized and in some cases automated -- robotic milking parlors, for example,” he said. “Meanwhile, about one-quarter of U.S. farm output is grains -- corn, soybeans, wheat, and other field crops like cotton. These crops are very mechanized, with little in the way of labor in their production. This is where Deere, CNH, Kubota, AGCO, and others focus their marketing and R&D dollars, building bigger and better tractors, combines, sprayers, etc.”
This leaves specialty crop production such as berries, orchards, and vegetables,


which account for 88% of farm labor, as the literal low-hanging fruit for disruption. Andersen painted a portrait of aging farmers struggling with increasing overhead and razor-thin margins, forcing many owners to sell their family estates and move production to Central and South America.
“Overall, there is rising demand for food with a growing global population. The irony of rising population is that as we have more people on the planet, we have fewer farmers and fewer people looking to work for farmers,” he said. “Thus, inputs across the board, from labor to water to fertilizer to machinery, are increasingly expensive and scarce, and generally speaking, growers are looking to do more with less.”




A data-driven tractor at CES 2020. | Source: John Deere

Based on Andersen’s remarks, robotics is more than the newest equipment; it could be the savior of the U.S.’s agrarian economy. While many financial analysts have projected outsized growth for agritech, the present reality is stymied by long sales cycles and difficult operating environments. “On the challenges side, the average age of a U.S. farmer is 58, and these rising ages correlate with consolidation and an ever-smaller number of larger operators,” Andersen said. “Simultaneously, the conditions

are often very challenging for autonomy, with the lighting, weather, field variability, and harshness that robots must face and handle consistently over and over again.” At the same time, the opportunities could be larger than in other areas of autonomy, since unmanned farm vehicles can immediately navigate around workers without the regulations, pedestrians, and other obstacles that road vehicles face.

Augmenting farming with mobile robots
Rather than replacing humans, Augean Robotics’ approach is to alleviate today’s agronomy inefficiencies by augmenting

farmhands with mechanical donkeys called Burros. “We are doing something different,” boasted Andersen, “taking a stepped or phased approach towards full autonomy, beginning with a collaborative robotic platform called Burro that helps people work more productively today, collects tons of data, over time can be modularly expanded towards fully autonomous farming in a variety of different settings, and which we can get into the market today, not five years from now.”
After observing how table grapes were picked and collected, Andersen launched a self-driving wheelbarrow to autonomously steer through vineyard rows as a shopping cart for harvesters. “We’ve found that, like Kiva systems in Amazon warehouses, if you automate in-field transit, you can enable people doing high-value, high-dexterity work like picking to be much more productive,” Andersen said. “A crew of 10 people harvesting table grapes with one of our robots running them back and forth can pick 40% more fruit per day, and the payback on one of our robots is accordingly just 30 to 40 days.”
In the long term, Andersen said he hopes to translate his success in table grapes to other labor-intensive crops such as berries and orchard fruits. In fact, his biggest worry for Augean Robotics as a startup is scaling his

The Burro robot can use a machine learning detection and tracking model to follow a designated person, remote-free, through any setting. | Source: Augean Robotics



A self-driving wheelbarrow for table grapes. | Source: Augean Robotics

team to keep up with demand. “Every grower that buys our robots starts asking about five other use cases, often in different crops, that we didn’t imagine our Burros going into, and we have to ensure that our autonomy functions consistently and reliably everywhere,” he said.
Andersen said he imagines a robotic herd eventually forming a complete farming logistics platform. “In five years, I see our Burro robots forming the core API for many of the future autonomous tasks people would like in specialty crops,” he said. “By mastering the process of moving from A to B in complex farming settings with a powerful and modular autonomous platform, I believe we are building a tool-carrying platform that can enable autonomous picking, pruning, weeding, spot spraying, and a host of other tasks.” RR

A preview of foodservice automation
The vision of Augean Robotics’ Charles Andersen is embraced by many other roboticists who see a convoy of logistical solutions from the farm to the table. Also at CES 2020, I was introduced to RoboJuice, a tasty invention by juice bar proprietor Mikalai Sakhno. “I realized that the automation of food is an inevitable future, and I wanted to participate in the change,” said Igor Nefedov, CEO of RoboJuice. “Our smoothies will be cheaper, higher-quality, and [require] little wait.”
In light of the recent spate of robo-downturns, including Zume Pizza, Creator, and CaféX, society might not yet be ready to turn over the kitchen to the bots. Nefedov disagreed, arguing in favor of more humanoid robots. “We’re using human-like robots because it’s scientifically proven that people prefer robots that look like them,” he said. “People will eventually create an emotional connection -- that will drive repeat customers.”
RoboJuice is still in building mode and planned to open a first kiosk to showcase its franchise concept later this year. In the meantime, on a busy evening at CES, I passed a completely empty automated bar, Tipsy Robot. I asked the hostess what was good, and she directed me to the casino’s nightclub, “as the bartender there makes a mean mojito.”

About the author: Oliver Mitchell is a Venture Partner at ff Venture Capital. Augean Robotics is a portfolio company of ff Venture Capital.

Tipsy Robot’s robot-run bar at CES 2020. | Source: Oliver Mitchell



The Robot Report

Building service network for 10,000 autonomous combines
Russian firms to sell, install, and maintain autonomy systems for harvesters, tractors, and sprayers.

Eugene Demaitre | Senior Editor, The Robot Report

For agricultural robotics to be more widely adopted, farmers will need services to help maintain and repair intelligent systems. EkoNiva Holding Co. and Cognitive Pilot have signed a three-year agreement to outfit farm machinery with the C-Pilot autonomous driving system and create a service network across Russia.
Moscow-based Cognitive Pilot is a technology joint venture of Sberbank and Cognitive Technologies Group. The company’s customers include Hyundai Mobis, Russian Railways, Transport Systems PC, and major Russian and international vehicle makers. It recently launched a pilot project to install the Cognitive Agro Pilot software and hardware in 242 combine harvesters used by Rusagro Group LLC.


EkoNiva is the largest partner of farm equipment manufacturer John Deere in Russia. Under the agreement, the Russian-German agricultural holding company will sell the Cognitive Agro Pilot system, as well as install, set up, maintain, and provide engineering support for it. The operations will cover 35 regions of Russia and more than 10 climate zones.



Supporting Russian agbotics development
Cognitive Agro Pilot uses a convolutional neural network to analyze data from a single video camera to understand the types and positions of objects, build trajectories, and send commands to perform maneuvers. Many other agricultural systems use more expensive suites of sensors, according to the company. C-Pilot is intended to provide autonomy to equipment such as grain harvesters, tractors, and sprayers, freeing operators to focus on the quality of harvesting, claimed Cognitive Pilot. The system can operate safely in harsh weather conditions, with any light intensity, and without GPS, it added. The partnership between EkoNiva and Cognitive


Pilot also includes the creation of new smart farming systems. The project is a part of the digital ecosystem of Sberbank, a multinational institution that is working to build up the Russian agricultural technology sector. “The combination of the unique technological solutions of Sberbank’s Cognitive Pilot subsidiary and EkoNiva’s expertise in sales and services in the regions will position us to improve the quality of customer interactions using the Cognitive Agro Pilot system and help enhance the service level,” said Anatoly Popov, the deputy chairman of Sberbank’s executive board.


EkoNiva, the largest partner of farm equipment manufacturer John Deere in Russia, will outfit farm machinery with the C-Pilot autonomous driving system. | EkoNiva



The Robot Report

C-Pilot uses AI to analyze camera data. | Cognitive Pilot

Training and nationwide support
Cognitive Pilot said it is already training EkoNiva engineers on how to install and set up its autonomy systems. The company plans to train its partner’s entire service department, from the Leningrad region to Novosibirsk.
“The large-scale work to service combine harvesters across such a vast territory is expected to enable EkoNiva and Cognitive Pilot to amass the world’s most comprehensive video image database for further training of neural networks that combine harvesters already use in Russia, as well as in the U.S., Latin America, China, and other countries,” stated Olga Uskova, the CEO of Cognitive Pilot. “The cooperation between the companies will not only allow us to expand Cognitive Agro Pilot’s sales network in Russia, but also provide our customers with high-quality and fast local services.”
“EkoNiva’s service network will provide a full range of work, including installation, adjustment, repair services and maintenance, as well as consulting of agricultural enterprises about the Cognitive Agro Pilot system,” she told The Robot Report. “All the measures and work meet the highest level of the ‘basic dealer standard’ requirements introduced by John Deere and supported by Cognitive Pilot.”
“Service engineers will monitor the condition of the Cognitive Agro Pilot hardware and will provide the necessary technical advice,” Uskova said. “In case of any nonstandard technical situations,


a specialized technical support team will come to the farms to solve complex technical issues in the shortest time. The satellite navigation system allows the dispatcher to identify the engineer nearest the breakdown site and send him to the client.”

Autonomy for 10,000 combines
Under the terms of the three-year contract, technicians will install C-Pilot on up to 10,000 combines from different farm equipment manufacturers. “Covering the introduction and maintenance of artificial intelligence systems for agricultural enterprises, this agreement is among the largest in the world,” said Bjorne Drechsler, first deputy CEO of EkoNivaTekhnika-Holding. “Fitting our customers’ equipment with autonomous motion systems should improve the efficiency of harvesting and cut the cost of grain for them by 3% to 5%.”
EkoNiva’s “vast geographic footprint and an extensive network of modern service centers will let it quickly scale up the use of artificial intelligence technology in Russia and reinforce its position as a smart agriculture leader,” he said. RR





