UNIVERSITÀ DEGLI STUDI DI VERONA
Dipartimento di Informatica
Master's Degree in
Computer Engineering for Robotics and Smart Industry
Master's Thesis
DEVELOPMENT AND CHARACTERIZATION OF A
3D PRINTED HEMISPHERICAL SOFT TACTILE SENSOR FOR AGRICULTURAL APPLICATIONS
Supervisor: Prof. RICCARDO MURADORE
Co-supervisor: FRANCESCO VISENTIN, PhD
Candidate: FABIO CASTELLINI
Student ID: VR464639
ACADEMIC YEAR 2021-2022
List of Figures

1.1 On the left a sculpture depicting an ancient Roman harvester called Gallic Vallus; on the right a modern combine harvester designed for grain, potatoes, carrots and beets.
1.2 Soft Robotics' mGrip: a modular gripping system that enables reliable, high-speed picking of traditionally hard-to-grasp single items.
2.1 On the left an "integrated linkage-driven dexterous anthropomorphic robotic hand" [38], while on the right a soft anthropomorphic hand [7].
2.2 From left to right: ROBOTIQ Hand-E Adaptive Gripper, 2F-85 Gripper and 3-Finger Adaptive Robot Gripper.
2.3 From left to right: Universal Robots ZXP7*01 Vacuum Unit, OnRobot VG10 Vacuum gripper and Joulin Foam Gripper.
2.4 A brief timeline of milestones in the development of soft gripper technologies as presented in [63].
2.5 Soft Robotics' mGrip modular soft gripper.
2.6 On the left Festo FlexShapeGripper and on the right Festo TentacleGripper.
2.7 Festo BionicSoftArm equipped with a flexible MultiChoiceGripper.
2.8 Cambridge Consultants' Hank soft gripper.
2.9 From left to right: a soft fabric gripper with gecko adhesion; a bioinspired soft gripper made of Dragon Skin 30 silicone; a hybrid/soft robotic gripper made of urethane rubber.
2.10 Classification of automatic harvesting methods, according to [47].
2.11 A simplified scheme of basic picking techniques, according to [47].
2.12 On the left the Raussendorf autonomous system for agricultural purposes, Cäsar; on the right the DJI AGRAS MG-1P Series agriculture drone.
2.13 On the left the AscTec Falcon 8 flying drone; on the right the Tevel Aerobotics automated fruit picker.
2.14 Autonomous robots proposed by Naïo Technologies, from left to right: Oz, Dino and Ted.
2.15 On the left the completely autonomous mobile robot for precisely controlling soil grassing proposed by VITIROVER Solutions; on the right the Tertill Weeding Robot.
2.16 Agrobot E-Series stainless steel and military-grade aluminum robot.
2.17 Harvest CROO Robotics Berry 5 fruit picking robot.
2.18 On the left Augean Robotics' Burro self-driving robot; on the right Harvest Automation's HV-100 robot.
2.19 On the left ANYmal proposed by ANYbotics; on the right Unitree Go1 proposed by Unitree Robotics.
2.20 On the left the 3D printed ChromaTouch tactile sensor and the forces transduced into marker appearance changes; on the right a render of the fingertip assembly.
2.21 On the left the rendered structure of the Universal Gripper; on the right the manufactured prototype.
2.22 On the left the dimensioned (mm units) sensor assembly; on the right the soft-bubble mounted on a KUKA iiwa robot.
2.23 On the left the soft-bubble parallel gripper that estimates in-hand pose and tracks shear-induced displacements; on the right the dimensioned scheme where the ToF depth sensor is depicted in blue.
2.24 The GelSight Fin Ray gripper.
2.25 The HiVTac tactile sensor prototype.
2.26 Design of the FingerVision and its prototype installed on the Baxter gripper.
2.27 A rendering of the Visiflex [18] tactile sensor and its exploded view.
2.28 The Visiflex sensor contacted in multiple points; red LEDs represent the fiducial markers, while green LEDs are the contact points that can be seen by the camera, because of the waveguide.
2.29 As reported in [75]: Open TacTip (left): the original version of the sensor comprises a 3D-printed camera mount and base and a cast silicone skin. Improved TacTip (center): the redesigned base houses a webcam, and modular tips with 3D-printed rubber skin. Modular tips (right): separate modular tips with a nodular fingerprint (above) and flat tip (below).
2.30 On the left the DenseTact sensor mounted on the Allegro hand and its 3D reconstruction results; on the right a visualization of the ray casting algorithm, used to determine the radial depth from the 3D calibration surface, which is then projected into the image plane.
2.31 As reported in [79]: (a) basic principle of the GelSight design, which consists of a sensing elastomer piece with the opaque reflective membrane on top, supporting plate, LEDs and a camera to capture the shaded images with different lightings; (b) picture of the sensor; (c) arrangement of the LEDs and camera when viewing from the top.
3.1 The Franka Emika Panda robotic arm equipped with the Franka Hand 2-fingers gripper.
3.2 From left to right: the Formlabs Form 2 stereolithography 3D printer; the Elastic 50A Resin; a 3D printed sample as shown on the website [22].
3.3 From left to right: the just printed dome; the washed dome; how the washed dome presents on the inside.
3.4 Shore Hardness scale of general purpose items.
3.5 Chart of the most common soft materials' properties and curing times, according to [33].
3.6 On the left the hemispherical dome before being cured; on the right the hemispherical dome after being cured.
3.7 On the left the initial "double cross" design consisting of 0.9 mm diameter markers with a spacing of 1.5 mm; on the right the superimposed detected blobs, which are clearly noisy.
3.8 On the left the four designed patterns; on the right the chosen "double cross" pattern.
3.9 On the left a photograph of an exploded view of the initial prototype; on the right a render of the prototype.
3.10 Multiple rendered views of the proposed 3D printed modular design.
3.11 Overall view of the testing experimental setup.
3.12 Closer views of the experimental setup.
3.13 On the left an example of a Cartesian robot; on the right the ATI Nano Series 6-axis force/torque sensor.
3.14 On the left a side view of the arm's workspace, on the right a top view of the arm's workspace.
3.15 Original Franka Emika Hand gripper.
3.16 Renders of the definitive setup mounted on the Franka Emika Panda robot.
3.17 Photographs showing the Franka Emika Panda robot with the custom gripper installed on the Franka Hand.
3.18 The Intel RealSense D435i RGBD camera.
3.19 SimpleBlobDetector's thresholding options.
3.20 From left to right: the raw frame when no forces are applied; the raw frame after applying a force; the frame after marker detection.
3.21 Left: the raw frame when a normal displacement of 10 mm is caused by the external force. Center: the frame after marker detection, fitting ellipses. Right: the frame after marker detection, fitting circles.
3.22 Circular markers' displacements (horizontal, vertical and radius) in pixel units. Each of the 29 markers is represented by a different hue.
3.23 Elliptic markers' displacements (horizontal, vertical and radius) in pixel units.
3.24 Plot of the subsequent markers' coordinates in pixel units during deformation.
3.25 From left to right: the raw frame in rest position; the superimposed arrows that show the displacement's direction of each marker (the arrows were lengthened by a factor of 8 to make them more visible); the raw frame under a force of around 4 N.
3.26 On the left a representation of how circular markers embedded in the membrane are searched as ellipses; on the right the developed viewer tool illustrating the gripper's deformation [57].
3.27 From left to right: top and front views of the obtained 3D coordinates of the markers detected as circles, using the formulas described in [57].
3.28 Left: side view of the obtained 3D coordinates of the markers detected as circles, using the formulas described in [57]. Right: 3D mesh obtained using the "pyvista" library.
3.29 Top and front views of the obtained 3D coordinates of the markers detected as ellipses, using the formulas described in [57].
3.30 On the left a front view of the obtained 3D coordinates of the markers detected as circles; on the right a front view of the obtained 3D coordinates of the markers detected as ellipses.
3.31 On the left a radius displacement heatmap superimposed on the deformed frame; on the right the before (coinciding with the CAD ground truths) and after deformation meshes obtained by triangulating the 3D markers' coordinates.
3.32 Top and front views of the 3D markers' coordinates obtained with the proposed method.
3.33 From left to right: a top and a bottom view of the 3D mesh obtained through boolean difference between dome and triangulated markers' coordinates; a side view of the 3D mesh obtained through boolean difference between rest position and deformed position triangulated markers' coordinates.
3.34 Force and torque components measured by the ATI Nano sensor.
3.35 Figure showing, from top to bottom: the 3 force components' ground truths, the u, v and radius displacements of each marker.
3.36 Plot showing the first segment of the synchronized ground truths and markers' displacements.
3.37 Plot showing the second segment of the synchronized ground truths and markers' displacements.
3.38 Plot showing the third segment of the synchronized ground truths and markers' displacements.
4.1 The main differences between Machine Learning and Deep Learning and an example of Deep-CNN [42].
4.2 The typical architecture of a Deep-CNN [71].
4.3 On the left the detected markers' movement using SimpleBlobDetector; on the right an example of marker movements when a normal force is applied [78].
4.4 Mode, mean and median in 3 different data distribution scenarios [45].
4.5 Histograms of the computed Cx, Cy, Cz coefficients during each iteration.
4.6 Sorting order of the 29 fiducial markers.
4.7 A sequence of images showing the estimated application point (green) and area (blue), depending on the applied force's magnitude.
4.8 Graphical representation of the KNN algorithm [13].
4.9 The effect of the "weights" parameter on the estimates. The default value is "uniform" and assigns equal weights to all points; "distance" assigns weights proportional to the inverse of the distance from the query point [59].
4.10 Application of SVM in the case of linearly distributed data [50].
4.11 Application of SVM in the case of non-linearly distributed data [50].
4.12 Left: a graphical representation of the chosen Sequential model using the plot_model() function. Right: a generic example of a 2-hidden-layer Neural Network [12].
4.13 On the left the raw image sensed by the fish-eye camera; on the right the binarized image.
4.14 Sorting order of the 29 fiducial markers.
4.15 Histogram plot showing the Fx estimation Mean Squared Error depending on the feature type/scaling combination (negative bars correspond to errors above 100%).
4.16 Histogram plot showing the Fy estimation Mean Squared Error depending on the feature type/scaling combination (negative bars correspond to errors above 100%).
4.17 Histogram plot showing the Fz estimation Mean Squared Error depending on the feature type/scaling combination (negative bars correspond to errors above 100%).
4.18 Comparison of the estimated force components, considering feature option number 2 (Option 2. in the list) with feature scaling.
4.19 Comparison of the estimated force components, considering feature option number 5 (Option 5. in the list) with feature scaling.
4.20 Comparison of the estimated force components, considering feature option number 6 (Option 6. in the list) with feature scaling.
4.21 Comparison of the estimated force components, considering feature option number 7 (Option 7. in the list) without feature scaling.
4.22 Comparison of the estimated force components, considering feature option number 5 (Option 5. in the list) with feature scaling and without the shuffle option (to see the actual forces' trends).
4.23 ResNet50 train and validation loss considering raw RGB images as input.
4.24 ResNet50 train and validation loss considering binarized images as input.
4.25 ResNet50 predictions on the test set considering raw RGB images as input.
4.26 ResNet50 predictions on the test set considering binarized images as input.
4.27 Photographs of the force estimation validation setup.
4.28 Estimated forces against ground truths during the validation phase using the ATI Nano sensor and pressing with the developed gripper (1).
4.29 Estimated forces against ground truths during the validation phase using the ATI Nano sensor and pressing with the developed gripper (2).
4.30 Estimated forces against ground truths during the validation phase using the ATI Nano sensor and pressing with the developed gripper (3).
4.31 Photograph showing how strawberry plants were set up (emulating a hydroponic culture) to perform the picking task.
4.32 Qualitative result of the fine-tuned CNN tested on one of the plants used for the setup shown in 4.31.
4.33 Phase 1: detection and localization of the ripe fruit.
4.34 Phase 2: approach of the ripe strawberry given the 3D target point.
4.35 Phase 3: application of the picking pattern to harvest the strawberry.
4.36 Image sensed by the fish-eye camera when the dome is deformed, with the superimposed detected markers.
4.37 Online force feedbacks while grasping the strawberry (we used the KNN model to estimate normal force and determine if the 1.75 force threshold has been reached).
5.1 Photographs of the developed sensing device holding a strawberry without squeezing it.
List of Tables
2.1 A brief summary of Table 4 reported in "Soft Grippers for Automatic Crop Harvesting: A Review" [47] as a literature review of food soft grippers.
2.2 A summary of the main materials' characteristics used in the soft grippers mentioned in [47].
4.1 Summary of the best performance achieved by every model evaluated on the test set (MSE values refer to the Fz component).
Preface
In this space, I'd like to thank all the people who supported my university studies. Starting from Professors Riccardo Muradore and Francesco Visentin: they both provided valuable feedback as supervisors of this Master's Thesis. They kindly shared their knowledge and expertise, contributing in a significant way to this work. In particular, Francesco dealt with the design and 3D printing of the characterized sensor and of the cases to hold it in place. I'd also like to thank Post-Doc Researchers Giacomo De Rossi and Nicola Piccinelli, who helped us with the robot's motion planning during the final evaluation phase.

Last but not least, thanks go to all my friends and family who supported my journey from all perspectives. In particular, I'd like to show gratitude to my parents Mariangela and Livio, my girlfriend Lucia, my brother Matteo and my closest friends.
Abstract
Soft robotics, and particularly soft gripping technology, faces many challenges. Due to several strict requirements, such as small dimensions, low cost, an efficient manufacturing process and accurate sensing capabilities, the development of soft grippers is still an open research problem. In this work a hemispherical, deformable and cheap-to-manufacture tactile sensor is proposed and characterized. The device is 3D printed using a stereolithography (SLA) 3D printer and is made of a semi-transparent elastic polymer resin that is properly cured afterwards. The overall aim is to sense normal and tangential forces applied to the gripper. The gripper is designed for agricultural applications such as grasping delicate fruits and vegetables.
Contents

List of Figures
List of Tables
Preface
Abstract

1 Introduction
  1.1 Problem statement
    1.1.1 How to cope with the increasing food demand
    1.1.2 Agriculture 4.0 and the role of soft robotics
    1.1.3 Do all have the means to face a transition?
  1.2 Thesis structure

2 Overview of soft robotics in agriculture
  2.1 Soft grippers
    2.1.1 A brief introduction to robotic grippers
    2.1.2 Soft gripping technologies for agriculture
    2.1.3 State-of-the-art of soft grippers
    2.1.4 Soft grippers' materials and manufacturing process
    2.1.5 Controlling a soft gripper
  2.2 Agricultural practices' automation
    2.2.1 Harvesting process classification
    2.2.2 Harvesting picking patterns
    2.2.3 Automation level of agricultural processes
    2.2.4 Open issues in agricultural automation and proposed solutions
  2.3 Open issues in soft robotics
    2.3.1 Soft grippers' limitations and required improvements
    2.3.2 Case study-related state-of-the-art

3 The design of the prototype
  3.1 Specifications and goals
  3.2 The manufacturing process
    3.2.1 Prototyping and manufacturing the hemispherical dome
    3.2.2 Design and manufacturing of the case
  3.3 Experimental setups
    3.3.1 Testing setup used during software implementation
    3.3.2 Temporary setup for data acquisition during sensor's calibration
    3.3.3 Definitive setup mounted on the robot
  3.4 Implemented software algorithms and approaches
    3.4.1 Marker detection and tracking
    3.4.2 From pixel to metric units with a monocular setup
    3.4.3 Raw data acquisition for sensor calibration
    3.4.4 Offline semi-automated dataset creation pipeline

4 Experimental results
  4.1 Force estimation approaches
    4.1.1 Machine Learning vs Deep Learning
    4.1.2 Linear estimation
    4.1.3 Non-linearly compensated and marker-location based estimation
    4.1.4 Linear Regression model
    4.1.5 K-Neighbors Regressor model
    4.1.6 Support Vector Regression model
    4.1.7 Neural Network Sequential model
    4.1.8 Deep Convolutional Neural Network model
    4.1.9 Feature extraction
  4.2 Comparison of the results
    4.2.1 Evaluation on the test set
    4.2.2 Evaluation with the robotic gripper
    4.2.3 Real-time force feedback and strawberry detection

5 Conclusions

Bibliography
Chapter 1

Introduction
In this chapter the Master's Thesis' work is motivated. Agriculture 4.0, industrial automation and soft robotics can help the transition towards a more sustainable and efficient harvesting process. At the end of Chapter 1, how the rest of the work is structured is briefly explained.
1.1 Problem statement
1.1.1 How to cope with the increasing food demand
As stated by several papers [7, 47, 65, 30] and well-respected institutions, agriculture is required to significantly grow its productivity to keep up with the rising global food demand. The United Nations Food and Agricultural Organization (FAO) foresees that "food and feed production will need to increase by 70% by 2050 in order to meet the world's food needs" [65]. Making it harder to accomplish such a result is the shortage of workers in this field, due to the time-consuming and labour-intensive activities they're asked to stand up to, while often being exploited and underpaid [11, 8]. The main reasons for the declining interest in agriculture are: high land, real estate, machinery and agrotechnology prices; unequal work-life balance; lack of government incentives and, in most cases, poor working conditions. Also, the agricultural industry is having trouble competing with corporate jobs that offer higher pay and smart-working options. All of this translates into fewer young farmers coming in to fill the shoes of retired ones. In addition, the global COVID-19 pandemic has "increased the need of industrial automation for relieving workforce challenges and increasing operational and food safety in factory environments" [30].
1.1.2 Agriculture 4.0 and the role of soft robotics
It's known that since Roman and Greek times, if not even before, humans have been trying to automate laborious and repetitive tasks as much as possible, making use of the available technology. As an example, the Romans used a so-called "Gallo-Roman harvester", shown in Figure 1.1, to speed up the grain harvesting process.
Nowadays automation plays a big role in the farming industry, minimizing waste of products and time and optimizing the crop production cycle. An increasing number of companies are working on robotics innovation to develop drones, autonomous tractors, robotic harvesters, automatic watering, and seeding robots [30]. The main goal is to address, or at least help with, the previously mentioned issues that affect the agriculture sector.

Figure 1.1: On the left a sculpture depicting an ancient Roman harvester called Gallic Vallus; on the right a modern combine harvester designed for grain, potatoes, carrots and beets.

Agriculture 4.0, also called precision agriculture, refers to the use of the Internet of Things (IoT), big data, Artificial Intelligence (AI) and robotics to make the entire production chain more efficient. Technological innovation is exploited to collect, transmit and precisely analyze data from the field. Data gathered from sensors is then elaborated with the aim of supporting farmers in the decision-making process related to their activities. The ultimate goals are "increasing economic, environmental, and social sustainability - as well as profitability - of agricultural processes" [2].
The main benefits of Industry 4.0 in the context of agriculture are:
• avoiding unnecessary waste (e.g. computing the exact water requirements of the crop);

• minimizing costs by planning and predicting all stages of cultivation, from land preparation and sowing to harvesting;

• improving the traceability of the supply chain, and thus the food quality, in a sustainable manner.
Although more and more companies are now using high-tech devices and sensors to ease the farmers' work, making them able to concentrate on higher-level tasks, there's still plenty to be discovered in this field. In this regard, a relatively new research branch is focusing on soft robotics and in particular soft grippers. Soft robotics is a subfield of robotics concerning the design, control, and fabrication of robots composed of compliant materials, instead of rigid links. Figure 1.2 shows two examples of soft grippers designed by the U.S. company Soft Robotics [66]. The depicted mGrip modular gripping system is fully configurable and is intended for safely and efficiently picking and packing delicate products such as food.
In the last decades the ongoing trend in robotics has pointed towards collaborative robots that can operate outside of a cage, interacting with humans and the surrounding environment. Soft robots not only aim at being safer for humans but may solve many open issues in robotics. In fact, being compliant, they're flexible, harder to break or damage, and adaptable to unstructured and dynamic environments. They can meet hygiene and strict manipulation requirements while operating with delicate or fragile products, making them desirable in the agriculture and food industries. Also, in a pick-and-place task they don't require an in-depth characterization of the object to handle, ensuring good performance even without any force feedback. In general, soft robotic grippers allow a simpler control architecture than traditional rigid robots. In fact, most soft grippers are underactuated, meaning that the control inputs are fewer than the achievable degrees of freedom. By replacing the intricate rigid-body joint mechanics with simple compliant mechanisms, the number of parts required is significantly reduced, leading to lower costs for maintenance and assembly.
Figure 1.2: Soft Robotics' mGrip: a modular gripping system that enables reliable, high-speed picking of traditionally hard-to-grasp single items.
1.1.3 Do all have the means to face a transition?
Global agriculture has been constrained by many factors, such as socioeconomic issues, climate change, desertification and diminished crop yields, attributed to the decrease of vital nutrients in agricultural lands [39]. Despite the compelling evidence supporting agricultural automation, critics have argued that developing countries, including those in Sub-Saharan Africa, are less equipped for the transition to Agriculture 4.0 [39]. In fact, it's not clear how those countries would be supplied with new technology, especially soft robots. As mentioned, Europe is experiencing an intense labour shortage in this field, thus soft robotics and seeding/planting equipment would be a valid solution. On the other hand, developing nations in Africa have a critical mass of unemployed youth [39], but this alone is not enough to transition to a more automated harvesting process. According to the Food and Agriculture Organization of the United Nations (FAO) [73], "digital innovations in mechanization technologies can make agriculture more attractive to rural youth, especially in developing countries". Such a large pool of unemployed youth offers the opportunity to create new and more attractive jobs, leaving behind rudimentary hand tools. According to FAO, governments should be provided with the necessary technical support to transform agriculture in a sustainable way; the initiative is aligned with the Framework for Sustainable Agricultural Mechanization in Africa (SAMA) and Asia (SAM).
Over the years, agricultural mechanization has evolved from basic hand tools and animal-drawn implements to engine-powered equipment, but not uniformly all over the world. In fact, manual tools and animal power are still commonly used in developing countries, negatively affecting the livelihoods of small-scale farmers and their productivity [73].
Economic incentives to help with the transition to Agriculture 4.0 have also been in place in Italy for several years. As reported by a recent article on this topic [51], it's important not only to offer incentives but also to communicate them effectively in order to reach most of the farmers and small-medium enterprises. Supporting this statement, 87% of big companies know and use some of the current concessions, but only 59% of the smaller companies know about them. To be able to adopt new technology such as agrobots (robots for agriculture), a certain level of understanding of the robotic device is required: a good farmer is not necessarily an expert in digital technologies and automation. Some of the farmers' reluctance is due to: a not so straightforward process, lack of continuous support and training, and absence of external incentives (e.g. policies or market prices) [39]. Also, to achieve the best results, the farm system and the farmers' workflow itself must adapt to the robots. For instance, spacing between crops and crop structures needs to match the operational parameters of the agrobot as it moves among the cultivated crops. Of course, the purchase price of the device has to be taken into account, as it could be unsustainable for medium-small farms. This can become less of a problem for large commercially oriented farms, where high labour costs during harvest season can be attenuated through automation. FAO presents the need to find profitable business models where the farmer does not necessarily own the robot but can benefit from the technology. Two possible solutions, already in place in many farming systems, are service provision and cooperative ownership. To conclude, a recent study [73] summarizes the main benefits of Agriculture 4.0 in developing countries and particularly in Vietnam. It reports that at this stage "the agriculture of Vietnam is still dominated by individual households with small scale production and low skill techniques. However, there is a growing trend of private investment in agriculture, which apply modern techniques, from both foreign and domestic investors. More interestingly, there are companies now specializing in technical solutions for agriculture".

1.2 Thesis structure

This Master's Thesis focuses on the research, development and characterization of a marker-based hemispherical soft gripper capable of sensing forces when in contact with the external environment.

The work is structured as follows:

• Chapter 2 presents an overview of soft robotics and soft grippers as a possible solution to the problems the agricultural industry faces nowadays. The main state-of-the-art soft gripping approaches are discussed and an overview of the open issues in this field is reported;

• Chapter 3 addresses the prototyping and manufacturing phase, explaining the main steps taken to develop the sensing device. The exploited experimental setups are briefly presented, while the final part of the Chapter explains the implemented offline pipeline to create the dataset. Also, some approaches for 3D shape reconstruction, inspired by state-of-the-art similar or somehow related work, that turned out to be not very accurate are nonetheless shown;

• Chapter 4 is dedicated to the experimental results. The developed offline and online pipelines' outputs are shown and commented. In this Chapter the main problems faced and their solutions are discussed. Moreover, all the exploited algorithms and the Machine Learning and Deep Learning estimation approaches are cited and briefly explained. Finally, using the developed setup mounted on the Franka Emika Panda robot, a simple picking task is attempted and the qualitative results are reported.
Chapter 2

Overview of soft robotics in agriculture
In this Chapter the soft-gripping technology is introduced and presented as a possible solution to many problems the agricultural industry is currently facing. An overview of the technologies available on the market is presented and the publications related to the case study are briefly summarized.
2.1 Soft grippers
2.1.1 A brief introduction to robotic grippers
Grasping and manipulation are fundamental functions that require interaction with the surrounding environment. Grasping can be described as the "ability to pick up and hold an object against external disturbances" [63], while manipulation is the ability to exert forces on an object, causing its rotation and displacement with respect to the manipulator's reference frame. A robotic gripper is a robotic end-effector that can be mounted on a robotic arm, acting like a tool or, specifically, a hand for grasping, picking and placing objects. Traditionally, robotic grippers are made of rigid joints and links and are actuated through electric motors inside the structure. Alternatively, they can be actuated through cables or tendons, as shown in Figure 2.1. Gripper designs range from two-fingered grippers to anthropomorphic hands with articulated fingers and palm (Figure 2.1) [63]. In fact, their design is often inspired by human or animal features to achieve dexterity (the ability to perform non-trivial actions quickly and skilfully with the hands) and compliance (flexibility and elastic deformability). In Figure 2.2 three grippers from the ROBOTIQ company [56] are shown. They're all designed to be mounted on collaborative robots for precision assembly tasks. Moreover, in Figure 2.3 there are vacuum grippers produced by different manufacturers, such as Universal Robots [72], OnRobot [49] and Joulin [35]. The latter proposes "The Foam Gripper", which is characterized by a foam suction cup that is insensitive to porosities.
One of the main challenges of rigid anthropomorphic grippers is handling soft and deformable objects, like fruits or vegetables, that require additional care during manipulation. The so-called soft grippers are trying to address some of those problems, avoiding the high mechanical and control complexity of classical grippers required to achieve "software compliance". In fact, they allow a simpler controllability and adaptability to dynamic environments, while being robust, durable, versatile and inherently compliant thanks to the soft materials. Underactuation, which denotes a lower number of actuators than degrees of freedom, is fundamental to achieve a simpler controllability: as an example, human fingers can be seen as the composition of one tendon and three links, meaning multiple degrees of freedom given a single control input.

Figure 2.1: On the left an "integrated linkage-driven dexterous anthropomorphic robotic hand" [38], while on the right a soft anthropomorphic hand [7].
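As a toy illustration of the underactuation idea above, a single tendon displacement (one control input) can be mapped to several coupled joint angles (multiple degrees of freedom). This is a minimal sketch, not code from the thesis; the coupling ratios are invented purely for illustration.

```python
# Hypothetical sketch of underactuation: ONE tendon input drives TWO joint
# angles through fixed coupling ratios. The ratios and the 10 deg/mm gain
# are illustrative assumptions, not values measured in this work.

def finger_joint_angles(tendon_displacement_mm, ratios=(0.8, 0.5)):
    """Map a single tendon displacement (the control input) to the angles,
    in degrees, of the coupled joints: 1 input, len(ratios) DOF."""
    return [r * tendon_displacement_mm * 10.0 for r in ratios]

angles = finger_joint_angles(5.0)  # one actuator command...
print(angles)                      # ...two joint angles: [40.0, 25.0]
```

The point of the sketch is that the joint configuration is fully determined by a single scalar command, which is why underactuated soft fingers need far simpler controllers than fully actuated rigid hands.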
Also, robotic grippers can be equipped with sensors to estimate position and velocity of the gripper elements (e.g. with Hall-effect sensors, encoders, torque sensors) and with sensors to retrieve information about the in-contact objects or applied external forces (e.g. pressure, force, torque sensors, optical sensors, resistive, conductive and electromagnetic sensors).

Figure 2.2: From left to right: ROBOTIQ Hand-E Adaptive Gripper, 2F-85 Gripper and 3-Finger Adaptive Robot Gripper.

Figure 2.3: From left to right: Universal Robots ZXP7*01 Vacuum Unit, OnRobot VG10 Vacuum gripper and Joulin Foam Gripper.
2.1.2 Soft gripping technologies for agriculture
According to [63], soft gripping technologies can be classified into three macro categories, even though they're not exclusive and many devices make use of combinations of two technology classes to reach higher performance:
• Actuation: passive structure with external motors, fluidic elastomer actuators (FEAs), electroactive polymers, shape memory alloys (SMAs);

• Controlled stiffness: granular jamming, low melting point alloys (LMPAs), electro-rheological (ER) and magneto-rheological (MR) fluids, shape memory polymers (SMPs);

• Controlled adhesion: electro-adhesion, gecko adhesion (dry adhesion).
Gripping by actuation consists of bending gripper fingers or elements around the object, as we do with our fingers when picking up an egg or a glass of water. The bending shape can be actively controlled; otherwise, contact with the object can be exploited to induce deformation [63].
Gripping using controlled stiffness exploits the large change in rigidity of some materials to hold the target object. An actuator is needed to envelop the object with part of the gripper and, while it's soft, the applied force can be very low, allowing the manipulation of delicate objects. Such grippers are fast and allow tuning of the stiffness to a desired level, but their stiffness range can be limiting.
Gripping using controlled adhesion, similarly to variable stiffness, requires an actuation method to partially envelop the object. Controlled adhesion relies on surface forces at the interface between gripper and object.
This operating principle is a major advantage when manipulating very delicate objects, as it avoids the high compression forces required in gripping by actuation. It is also an ideal method for flat objects or objects that cannot be enveloped, but it requires clean, relatively smooth and dry surfaces.
As described in [47], additional criteria for choosing the gripper technology include: target object size, gripper size, lifting capability and the ratio between the gripper's and the object's masses, as well as power consumption, ease of (open-loop) control, scalability, modularity and adaptability to various target objects.
Response time, surface-related requirements, bio-compatibility, robustness in unstructured environments, compliance and lifetime can all affect the efficiency of the agricultural task. Figure 2.4 shows a brief timeline of milestones in the development of soft gripper technologies according to [63], from the tendon-driven grippers of the late 1970s to the 2017 FEAs using thermo-reversible Diels-Alder polymers.
Given the mentioned requirements, the most suitable and commonly used technologies are:
• Granular jamming: reacting to external variables such as chemical concentration, humidity or light, these grippers achieve a good lifting ratio and response time, and are able to lift medium-size fruits;
• Passive structures with external motors and FEA actuators: ideal for fruit harvesting grippers, offering a high lifting ratio, a wide object size range, a good response time and the ability to grasp any object.
Soft components typically used in the grippers' actuators include urethanes, hydrogels (invisible in aqueous environments), hydraulic fluids and polymers such as silicone elastomers [47, 63]. Actuators based on silicone elastomers have attracted strong interest due to their low cost and ease of manufacture;
Figure 2.4: A brief timeline of milestones in the development of soft gripper technologies, as presented in [63].
they do not require the use of complex machinery or skilled labour. In addition, these compliant materials are also advantageous when considering the safety of interaction with biological products, making them appropriate candidates for agricultural applications.
2.1.3 State-of-the-art of soft grippers
In the field of soft robotics, there is still plenty of room for improving the soft actuators designed for picking, placing and harvesting fruits and vegetables. Handling this type of product requires precise control of the gripper to successfully follow the picking pattern's movements without causing any damage to the fruit. According to the literature [47], the main capabilities of an ideal picking robot
would be:
• 3D localization of fruits inside the plant;
• path and trajectory planning;
• application of the suited fruit detachment method;
• adequate storage of the fruit.
All of this should be carried out with the aim of increasing the harvest ratio between robotic and manual picking and the quality of the harvested fruit, while remaining economically justified. End-effectors are required to handle fruits appropriately so as to preserve their quality, and therefore their value on the market.
Soft grippers are considered one of the best solutions for harvesting crops, thanks to their adaptability and delicacy when grasping and manipulating the target products. By using materials with a modulus of elasticity similar to that of biological materials, soft grippers ensure safe interaction with humans and the working environment. Table 2.1 summarizes the most recent proposals for food soft grippers [47].
Regarding commercially available soft grippers, in 2015 the company Soft Robotics [66] introduced mGrip: a pneumatically powered gripper made of soft elastomers. As shown in Figure 2.5, it consists of a network of parallel air chambers embedded in the elastomer, thanks to which a single pneumatic source can control the device. Compliance is thus achieved without hard linkages, additional sensors or a vision system. This modular gripper can be set up as two opposing fingers or as multiple fingers placed in a circular pattern.
Festo [19] is an industrial automation company that produces collaborative robots with soft grippers attached (examples in Figures 2.6 and 2.7). The BionicSoftArm is a robotic arm that, in its largest version, has seven pneumatic
Soft technology and year | Grasped object | Object size or weight | Gripper type | Controllability
FEAs (2020) | Lettuce | 50 x 250 mm | 2 pneumatic actuators and a blade (8 kg, 450 x 450 x 300 mm) | Closed-loop with force feedback
FEAs (2010) | Apple, tomato, strawberry, carrot | 69 mm; 5-150 g | Magnetorheological gripper (finger size: 82 x 16 x 15 mm) | PID
FEAs (2017) | Cupcake | 75.2 g | Soft fingers (finger length: 97 mm) | Open-loop
FEAs (2020) | Orange | 1 kg | Soft fingers (finger size: 95 x 20 x 18 mm) | Open-loop
FEAs (2020) | Tomato, kiwifruit | 46-76 mm | 4 soft chambers in a circular shell (diameter: 46 mm; height: 30 mm) | Open-loop
Tendon-driven (2020) | Tomato | 500 g | 3-soft-finger design | Pre-programmed motor rotation
FEA-tendon-driven (2019) | Banana, apple, grapes | 2.7 kg | 3-soft-finger design with a suction cup (390 g) | Teleoperation
Topology-optimized soft actuators | Apple, grapefruit, guava, orange, kiwi | 1.4 kg | 2 compliant fingers | Open-loop

Table 2.1: A brief summary of Table 4 reported in "Soft Grippers for Automatic Crop Harvesting: A Review" [47], a literature review of food soft grippers.
actuators and as many degrees of freedom. It can be equipped with various adaptive grippers for pick-and-place tasks, such as the FlexShapeGripper, inspired by the behaviour of a chameleon; the MultiChoiceGripper, an adaptive, flexible handling system inspired by the opposable thumb; and the TentacleGripper, an octopus-inspired gripper which wraps around objects like an octopus's arm and then uses vacuum suction cups to hold them firmly in place.
The Hank soft gripper from Cambridge Consultants [10] attempts to emulate the human hand's four fingers and opposable thumb, achieving a sophisticated sense of touch and slip using sensors embedded in its individual pneumatic fingers (Figure 2.8 shows the Hank soft gripper). These sensors are embedded during the molding process inside its hollow silicone fingers, which are actuated pneumatically. Based on the deformation of the fingers, the applied force is measured and force-feedback closure is provided.
Finally, the increasing pressure for environmentally friendly technologies
has induced researchers to explore soft grippers made of biodegradable, and even edible, materials [63].
Figure 2.5: Soft Robotics' mGrip modular soft gripper.
Figure 2.6: On the left the Festo FlexShapeGripper and on the right the Festo TentacleGripper.
Figure 2.7: Festo BionicSoftArm equipped with a flexible MultiChoiceGripper.
2.1.4 Soft grippers' materials and manufacturing process
According to [63], soft grippers are made of urethanes, hydrogels, braided fabrics, hydraulic fluids and polymers such as silicone elastomers, which became very desirable thanks to their low cost and manufacturing simplicity. The most commonly used soft materials in the silicone elastomer category are: Dragon Skin, Ecoflex, polydimethylsiloxane (PDMS), Elastosil M4601 and Smooth-Sil. Other polymers are Agilus30/VeroClear, ultra-high-molecular-weight polyethylene, electrostatic discharge (ESD) plastic sheet, thermoplastic elastomers (TPEs) and thermoplastic polyurethane (TPU). An important aspect for the suitability of soft grippers in the agricultural sector, as suggested by [63], is that the materials they are made of must not contaminate the food. This topic should be investigated further, to understand whether the soft grippers' degradation may leave particles on the manipulated crops. Table 2.2 summarizes the main advantages of the mentioned materials.
Regarding the manufacturing process, several approaches can be mentioned:
• Moulding: fused material is placed inside a (typically 3D-printed) mold
Figure 2.8: Cambridge Consultants' Hank soft gripper.
Soft material | Main specifications | Shore hardness
Dragon Skin, Ecoflex, Smooth-Sil | Versatile, easy to use and handle, low cost | 10 to 50 Shore A
Elastosil M4601 | Highly resistant to bending and elongation; low viscosity in its uncured form; easy to mold | Approximately 28 Shore A
PDMS | High elasticity; a thermosetting polymer, obtained by irreversibly hardening (curing) a soft solid or viscous liquid prepolymer (resin). Precisely mathematically modellable through Finite Element Method (FEM) analysis; the variation of its hardness with mixing ratio has been extensively studied in the literature | Approximately 50 Shore A
TPU and TPE | Can be 3D printed; TPU-95 is very durable and suitable for agricultural environments, where harmful collisions with objects are frequent | 85 Shore A

Table 2.2: A summary of the characteristics of the main materials used in the soft grippers mentioned in [47].
and removed after hardening. This can be done manually or through Fused Deposition Modelling (FDM) printers;
• Shape Deposition Manufacturing (SDM): suitable for 3D soft actuators made of multiple materials with different properties;
• Soft lithography: suited for developing multichannel soft actuators;
• Virtual lost-wax casting: a variant of a technique normally applied to cast metal. In this case, the final part to be obtained is virtually designed (CAD) and a virtual mold is created by inverting the part design. This mold is then 3D printed and filled with uncured silicone. After curing, the mold is destroyed using a solvent;
• Soft 3D printing: the most promising technology due to the elimination of several moulding stages, which makes the manufacturing process easier and allows the design of more complex inner chambers or pneumatic networks.
In [29], a multi-fingered soft gripper design comprising hydraulic-driven, sheet-shaped fabric bending actuators is proposed. In [44], a bioinspired soft robotic gripper for adaptable grasping is proposed; its manufacturing process involves molding and casting Dragon Skin 30 silicone inside the molds. In [25], both a hybrid robotic gripper and a fully soft robotic gripper are proposed, characterized by retractable telescopic inflatable fingers. This design is intended for unknown environments thanks to its high conformability and compactness. As shown in Figure 2.9, the telescopic mechanisms are made of urethane rubber (Smooth-On Vytaflex 40), while the claws are 3D printed in PLA (polylactic acid) and driven via rack-and-gear couplings connected to three Robotis Dynamixel XM430-W350 smart actuators.
Figure 2.9: From left to right: a soft fabric gripper with gecko adhesion; a bioinspired soft gripper made of Dragon Skin 30 silicone; a hybrid/soft robotic gripper made of urethane rubber.
2.1.5 Controlling a soft gripper
As previously mentioned, soft actuators are deformable and compliant, which translates into a large intrinsic number of degrees of freedom. How a soft
actuator is controlled depends strongly on the chosen materials, and the control complexity can be reduced by design. Soft grippers are often cited as an example of morphological computation, meaning that control complexity is reduced by material softness and mechanical compliance [63]. Several control strategies have been proposed for FEA-type actuator technology, such as Proportional-Integral-Derivative (PID) control, closed-loop curvature control and real-time Artificial Neural Network (ANN) control; however, open-loop control remains one of the most frequently used. According to a recent review [47], difficulties can be encountered when controlling certain types of FEA soft actuators and passive structures actuated by external or tendon motors, due to their deflection around the object.
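As a minimal illustration of the PID option mentioned above, the sketch below regulates chamber pressure so that a toy first-order actuator model reaches a target bending curvature. The plant model and gains are invented for the example and would have to be identified for a real FEA.

```python
def pid_curvature_control(target, kp=2.0, ki=0.5, kd=0.1, dt=0.01, steps=5000):
    """Discrete PID driving a toy first-order actuator model
    (curvature relaxes toward 0.8 * pressure) to a target curvature."""
    curvature = integral = prev_err = 0.0
    for _ in range(steps):
        err = target - curvature
        integral += err * dt
        derivative = (err - prev_err) / dt
        prev_err = err
        pressure = kp * err + ki * integral + kd * derivative
        # toy plant dynamics, integrated with explicit Euler
        curvature += dt * (0.8 * pressure - curvature)
    return curvature

final_curvature = pid_curvature_control(0.5)
```

The integral term removes the steady-state error that a purely proportional (or open-loop) scheme would leave, which is the practical argument for closing the loop when force or curvature feedback is available.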
2.2 Agricultural practices' automation
2.2.1 Harvesting process classification
Harvesting is a process that comes into play at the final stage of fruit development and determines the fruit quality. It is important to harvest fruits and vegetables at the proper stage of maturity in order to maintain their nutrient quality and freshness for a prolonged period of time [30]. Nowadays, the majority of fruits intended for fresh consumption are harvested by hand, while mechanical harvesters may take care of those intended for processing. Hand harvesting is slow and labour-intensive, while mechanical harvesting has a greater efficiency. According to a recently published (2021) review paper [47], mechanical harvesting methods can be divided into:
• Indirect harvesting: a force is applied to the plant without making direct contact with the fruits; this involves methods such as air blasting, limb
shaking, trunk shaking and canopy shaking (typically used for olives, almonds and pistachio nuts).
• Direct harvesting: used whenever a plant cannot be shaken due to its structure, requiring the direct application of a mechanical force to the fruit or its peduncle. In this case, picking techniques (or patterns) such as twisting, pulling, bending, lifting, or a combination of them, are chosen to effectively detach the fruit from the stem (e.g. strawberries, apples, tomatoes).
• Direct harvesting with an actuation force on the peduncle: applied when a cutting tool is required to properly detach the fruit because of its hard peduncle connection to the plant (e.g. oranges, cucumbers, peppers).
Figure 2.10 shows a classification of the most commonly used harvesting methods, as presented in [47].
Figure 2.10: Classification of automatic harvesting methods, according to [47].
Depending on the crop, more than one harvesting technique may be used, and several factors, such as the size, shape and fragility of the tree, the maturity stage of the fruits, the acceptable risk of damaging fruit or plant, and financial profitability, determine the choice of the most suitable one.
2.2.2 Harvesting picking patterns
Regarding the second mentioned harvesting method (direct harvesting), further considerations can be made. A research branch in robotics focuses on studying the human movements performed during the harvesting of crops, with the objective of replicating them using robotic grippers. These movements are the so-called picking patterns, which include bending, lifting, twisting and pulling, or a combination of them (shown in Figure 2.11).
In the literature, several studies have been conducted to understand the most suitable picking pattern, and therefore gripper design, for each fruit, such as tomatoes, apples, kiwis and strawberries. In particular, soft grippers are being
Figure 2.11: A simplified scheme of basic picking techniques, according to [47].
developed because their compliance characteristics allow delicate manipulation of the fruit, since direct contact is required while harvesting. Also, direct harvesting with an additional actuation force can be addressed using soft gripper technology and a suitable cutting tool such as a saw, a hot wire, scissors or a knife [63].
This Master's Thesis, mainly due to the dimensions of the manufactured sensing device, focuses on the harvesting of small-size fruits such as strawberries and tomatoes, which can be harvested following the second method. The picking pattern usually includes twisting and pulling once the fruit is grasped. Conversely, many other fruits, such as olives, raspberries and blueberries, which would otherwise be harvested directly by hand, are far easier to harvest with the first method (e.g. shaking the plant) when automation is involved.
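The twist-and-pull pattern described above can be expressed as a simple parameterized trajectory for the end-effector; the helper below is a hypothetical sketch (the angle and distance values are placeholders, not taken from the literature).

```python
import math

def twist_pull_waypoints(total_twist_rad, pull_mm, n=10):
    """Return (twist angle, pull distance) pairs for a motion that
    rotates about the stem axis while simultaneously pulling along it."""
    return [(i / n * total_twist_rad, i / n * pull_mm) for i in range(n + 1)]

# e.g. a quarter-turn twist combined with a 20 mm pull, in 4 steps
waypoints = twist_pull_waypoints(math.pi / 2, 20.0, n=4)
```

Interpolating both motions together, rather than twisting first and pulling afterwards, keeps the detachment force on the peduncle low throughout the motion.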
2.2.3 Automation level of agricultural processes
As often happens, tasks that can be easily carried out by humans become very challenging for robots. In the field of crop harvesting, an experienced farmer can quickly and reliably distinguish the maturity stage of a crop not only by its color but also by its size, shape, surface texture, softness and resonance (the sound it makes when tapped). If tasted, the fruit or vegetable can be evaluated considering its aroma, sweetness, sourness and bitterness. Most of these properties are difficult for a robot to sense, which is why most agrobots are only suited for harvesting crops that will be processed before sale. Also, the farmland environment is typically non-structured, highly dynamic and full of obstacles. It is characterized by bumpy and uneven ground that only gets worse in bad weather. A brief review of the solutions currently available on the market follows.
According to a recently published (2021) review paper, "Advances in Agriculture Robotics: A State-of-the-Art Review and Challenges Ahead" [48], AI, IoT technologies and computer vision algorithms can be successfully used for soil and weed management, fruit classification and weed detection in complex environments. Regarding the land preparation task, the German company Raussendorf [55] developed in 2014 "Cäsar" (shown in Figure 2.12), a mobile four-wheel-drive, remote-controlled/temporarily-autonomous robot for soil fertilization. To perform such a task, Real-Time Kinematic (RTK) technology for the Global Navigation Satellite System (GNSS) is used, improving the robot's localization accuracy to 3 cm. It is designed to work in conjunction with farmers, featuring a collision detection system with a maximum detection distance of 5 m. On the other hand, the Chinese company DJI [15] developed a flying drone equipped with 8 rotors (AGRAS MG-1P) to perform agricultural activities such as applying liquid fertilizers, pesticides and herbicides (shown in Figure 2.12). It has a transport capacity of up to 10 liters over a maximum distance of up to 3 km, ensuring a spraying capacity of 6 ha/h. To avoid collisions with high-voltage wires or high vegetation, an omnidirectional radar anti-collision system is embedded, detecting obstacles up to 15 meters away. This approach can be useful whenever direct contact with the soil is not required, as it accelerates the process and avoids terrestrial obstacles and navigation over rough terrain.
Ascending Technologies [5] proposed the AscTec Falcon 8: a remotely controlled multicopter designed for inspection and monitoring, survey and mapping applications. In agriculture it can be used to monitor the amount of chlorophyll present in the vines, preventing or highlighting possible diseases. Tevel Aerobotics Technologies [70] proposed a flying autonomous robot aimed at automating the fruit picking task, selecting the ripe crops and gently grasping them. Figure 2.13 shows the mentioned flying drones.
Another key task in agriculture is weed control: a type of pest control that aims at reducing the growth of noxious weeds that compete with crops for space, nutrients, water and light. Several companies have developed autonomous robots to remove these undesirable weeds: the French company Naïo Technologies [46] proposed "Oz", "Dino" and "Ted" for large-scale vegetable farms and wine growers. According to Naïo Technologies, 70 Oz robots were sold in 2018 alone, with 80% of sales to the French internal market, 15% to European countries and 5% to the rest of the world. The three Naïo Technologies robots are shown in Figure 2.14.
The French company VITIROVER Solutions [74] developed a compact, lightweight mobile robot for weed removal (shown in Figure 2.15). It is able
Figure 2.12: On the left the Raussendorf autonomous system for agricultural purposes, Cäsar; on the right the DJI AGRAS MG-1P Series agriculture drone.
Figure 2.13: On the left the AscTec Falcon 8 flying drone; on the right the Tevel Aerobotics automated fruit picker.
to operate under various weather conditions, is equipped with photovoltaic panels and allows control and monitoring through a mobile application, putting the IoT concept into practice. Also, Tertill Corporation [69], based in Massachusetts, proposed Tertill: a cheap (349.00 USD), light and small wheeled autonomous robot designed to remove weeds from residential gardens (shown in Figure 2.15).
Once crops are mature, the harvesting process takes place. The Spanish company Agrobot [3] proposes the Agrobot E-Series: a robot that consists of up to 24 independent Cartesian robotic arms able to work together for gently
Figure 2.14: Autonomous robots proposed by Naïo Technologies, from left to right: Oz, Dino and Ted.
Figure 2.15: On the left the completely autonomous mobile robot for precisely controlling soil grassing proposed by VITIROVER Solutions; on the right the Tertill Weeding Robot.
harvesting delicate fruits such as strawberries. As can be seen in Figure 2.16, it has three wheels and its mechanical structure can be adjusted to suit the crop dimensions.
Moreover, the American company Harvest CROO Robotics [28] created Berry 5: a robotic picker that exploits AI to determine whether a strawberry is ripe before harvesting (shown in Figure 2.17). It has a picking speed of 8 seconds per fruit, moving through strawberry beds at a speed of 1.6 km/h, resulting in a yield equivalent to 25 to 30 human harvesters [48]. Like the Agrobot E-Series, the various mechanisms of the Berry 5 robot are protected by patents, making their scientific analysis difficult.
Other companies, such as Augean Robotics [9] and Harvest Automation [52], are focusing on robots that can cooperate with humans to carry and organize products, with the objective of increasing industrial productivity.
Figure 2.16: Agrobot E-Series stainless steel and military-grade aluminum robot.
Augean Robotics developed Burro (shown in Figure 2.18): a mobile collaborative robot that exploits computer vision, high-precision GPS and AI to follow people and navigate autonomously while carrying or towing objects. It has a maximum carrying payload of 226 kg, depending on the terrain, and a maximum towing capacity of 907 kg. Harvest Automation proposed the HV-100 (shown in Figure 2.18): a mobile autonomous robot designed to perform material handling tasks in unstructured, outdoor environments such as those typically found in commercial growing operations. The robot can safely collaborate with workers and requires minimal training to operate, with a maximum payload of 10 kg.
Interestingly, the review paper [48] shows that, among the 62 considered projects/available products, 80% are at the research stage. Also, most of them consist of four-wheel-drive (4WD) mobile robots and almost 70% of
Figure 2.17: Harvest CROO Robotics Berry 5 fruit picking robot.
Figure 2.18: On the left Augean Robotics' Burro self-driving robot; on the right Harvest Automation's HV-100 robot.
them do not include computer vision algorithms.
Despite constant technological advances, many challenges are still to be overcome, such as fruit occlusions, changes in ambient lighting, simplicity of construction and efficiency.
2.2.4 Open issues in agricultural automation and proposed solutions
According to [48], most agricultural robots are 4WD, but the agricultural environment is classified as semi-structured and this kind of locomotion is strongly affected by soil characteristics. A trade-off between the quality and cost of the embedded electronic devices (sensors, cameras, IoT components) must also be taken into account.
Wheeled robots not only struggle to move in an agricultural environment but can also cause undesired soil compaction. As mentioned, UAV devices can be a valid alternative if the task allows it. Legged robots are also proposed by [48]: they require less contact with the ground while moving and are able to adjust their posture depending on the terrain's slope. Legged robots such as those shown in Figure 2.19 are relatively light, small and autonomous, and have locomotion patterns that adapt to the environment. One drawback is that their small feet imply a small contact area, creating a considerable amount of pressure in the foot placement region. To prevent the robots' feet from penetrating soft soils and trapping them, a customized foot design is required.
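A back-of-the-envelope calculation makes the ground-pressure drawback concrete. The mass and contact areas below are illustrative, not taken from any specific robot.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def ground_pressure_kpa(mass_kg, contact_area_m2, n_contacts):
    """Static pressure under each contact, assuming the weight is
    shared equally among all contacts."""
    return mass_kg * G / (n_contacts * contact_area_m2) / 1000.0

foot_area = math.pi * 0.02 ** 2   # a 2 cm-radius circular foot pad
wheel_patch = 0.08 * 0.05         # an 8 cm x 5 cm wheel contact patch
p_legged = ground_pressure_kpa(30.0, foot_area, 4)     # ~59 kPa
p_wheeled = ground_pressure_kpa(30.0, wheel_patch, 4)  # ~18 kPa
```

Even with identical mass, the small feet concentrate roughly three times the pressure of the wheel patches, which is why customized, wider feet are needed on soft soils.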
Also, embedded sensors can have a significant impact on the final product's cost. One idea is to estimate variables instead of measuring them, through a smart design, as suggested by recent soft gripping and tactile sensing publications (discussed in Subsection 2.3.2). If the budget allows it, by investing, at a small price increment, in sensors with a high Ingress Protection (IP) rating, meaning they are better protected against dust and moisture, the robotic system can benefit in terms of lifetime. Also, computer vision and machine learning algorithms can further improve the efficiency of automated tasks such as disease identification, weed detection, selective application of pesticides, crop localization, ripeness classification and yield estimation. However, these algorithms need improvements in terms of robustness, making them independent of weather, temperature, humidity and lighting changes. The use of AI algorithms such as MLPs (Multi-Layer Perceptrons), CNNs (Convolutional Neural Networks), R-CNNs (Region-based CNNs) and SVMs (Support Vector Machines) proved to be adaptable to rapid variation in natural lighting, changing seasons and crop growth [48].
Figure 2.19: On the left ANYmal, proposed by ANYbotics; on the right the Unitree Go1, proposed by Unitree Robotics.
2.3 Open issues in soft robotics
2.3.1 Soft grippers' limitations and required improvements
As seen, soft grippers have many advantages but still need improvement. A common complaint is that soft robots and grippers require a time-consuming, multi-step fabrication process involving mold making, casting, curing and support removal [7]. As can be noticed in the previously mentioned state-of-the-art solutions, the manufacturing of most soft grippers is very "handmade" and still far from optimized for production. Repeatability can therefore be hard to achieve, even though processes based on 3D printing and lost-wax manufacturing can be valid options for standardizing the manufacturing.
Also, soft end-effectors are not easy to model, often requiring technical expertise to account for the continuous deformations of soft materials. Although their design, placement and testing are not trivial, many researchers propose easier-to-develop end-effectors exploiting 3D printing technology [7, 47, 63]. As an example, at Carnegie Mellon University, Pennsylvania, researchers proposed a fully printable, low-cost, dexterous soft manipulator designed through a framework they developed [7]. They were able to use the classical rigid-link Unified Robot Description Format (URDF), generally not capable of describing continuous deformations of soft materials, by exploiting quasi-rigid approximations. This way the end-effector's behaviour can be quickly evaluated in simulation. Depending on the type of gripping method there are advantages but also limitations: an apple can be firmly grasped, while a strawberry needs a more gentle approach. Some grippers are easier to maintain and clean, others are only able to grip smooth
and dry surfaces, while others have limited adaptive grasping. Features like modularity, ease of repair and the ability to handle food and multiple crops are desired for agricultural applications.
Another problem typically not addressed by researchers is the energy source system of soft grippers, which should be tailored to the unstructured agricultural environment. In fact, the proposed electrical, pneumatic or chemical energy sources are typically only suitable for a laboratory or a very structured industrial context.
In general, challenges for soft grippers include miniaturization, robustness, speed, integration of sensing, and control. Improved materials (elastomers, phase-change materials) and processing methods will play a large role in future improvements [63]. Finally, no matter how advanced and reliable the technology is, the transition of soft grippers from the research stage to the industrial context needs to be economically competitive with respect to older methodologies and semi-automatic or manual approaches. This is a non-trivial point to address, because without political manoeuvres to incentivize technological innovation, it is hard to justify huge investments for most small and medium-sized enterprises.
2.3.2 Case study-related state-of-the-art
This Subsection is dedicated to a roundup of the main papers that were taken as inspiration for this Master's Thesis. In the current state-of-the-art there is a lack of information about spherical and hemispherical soft tactile sensors and grippers, so the few available studies on this topic are considered worth mentioning. The following case study-related works will be useful to better understand the next chapter.
One of the papers that inspired this work is "Rapid manufacturing of
color-based hemispherical soft tactile fingertips", published in 2022 [58]. In this paper the authors present a 3D-printed tactile sensor called ChromaTouch that exploits the hue, centroid and apparent size of markers to estimate normal and lateral forces. The device, shown in Figure 2.20, is made with a Stratasys J735 multi-material additive manufacturing system that allows the precise alignment of up to 400 markers on a 21 mm-radius hemisphere. The sensing principle is based on the relative displacement between differently colored markers that lie on two separate layers. In particular, subtractive color mixing encodes the normal deformation of the membrane, while the lateral deformation is found by centroid detection. This approach stands out because
most of the existing marker-based solutions fail to directly encode the distance between the markers and a monocular camera, forcing the normal deformation to be estimated from the lateral displacement of the markers. In this case, the ChromaTouch sensor encodes normal deformation in the hue value of the markers. Also, using subtractive color mixing (the color of the translucent markers on the inner layer is mixed with the color of the opaque markers behind them) allows a higher sensing resolution with respect to older proposals such as the GelForce sensor. However, the main objective of that work is to perform accurate curvature estimation when the sensor is pressed against a positively or negatively curved object, while contact forces and torques are not estimated.
Figure 2.20: On the left the 3D-printed ChromaTouch tactile sensor and the forces transduced into marker appearance changes; on the right a render of the fingertip assembly.
Another research work, developed at the University of Madrid and entitled "A universal gripper using optical sensing to acquire tactile information and membrane deformation" [57], proposed a granular-jamming-based gripper with a semi-transparent filling that allows detecting the membrane's deformation and the object being grasped. The prototype, shown in Figure 2.21, is able to grasp cylindrical and rectangular objects (10 to 70 mm in length) while tracking the gripper's deformation, so that object classification through the reconstructed point cloud can be performed. Grasping success is also detected by estimating shear forces. The fabrication process consists of 3D printing molds that are later filled with silicone or epoxy resin to create the soft gripper's membrane and bulkhead, respectively. A thickness of 1 mm and a Shore hardness of A-20 were chosen, while the embedded circular markers have a diameter of 6 mm and a thickness of 1.5 mm. Even though the sizing and manufacturing of this prototype differ from ours, a very similar marker tracking approach was used, and the suggested 3D position estimation algorithm was implemented but did not achieve acceptable results.
Figure 2.21: On the left the rendered structure of the Universal Gripper; on the right the manufactured prototype.
The paper "Soft-bubble: A highly compliant dense geometry tactile sensor for robot manipulation" [4] proposes a dense-geometry sensor and end-effector for tactile object classification, pose estimation and tracking (shown in Figure 2.22). It measures the deformation of a thin, flexible, air-filled membrane using a depth camera. The sensed features are then exploited to perform object shape and texture classification (using a Deep Neural Network), object sorting, object pose estimation and tracking. As shown by the dimensioned scheme in Figure 2.22, the dimensions of the proposed device allow the use of a Time-of-Flight depth camera (PMD pico flexx), and the design follows its minimum sensing distance of 100 mm. As mentioned by the authors, this choice avoids non-trivial algorithms for 3D shape reconstruction (e.g. structured lighting, photometric stereo algorithms). To our knowledge, a similar approach could not be adopted in our case, while keeping a compact design, due to the limitations (size and sensing range) of the depth sensors currently available on the market.
Similarly to the paper just discussed, in "Soft-bubble
Figure 2.22: On the left the dimensioned (mm units) sensor assembly; on the right the soft-bubble mounted on a KUKA iiwa robot.
grippers for robust and perceptive manipulation" [40], a soft-bubble gripper system is presented (shown in Figure 2.23). The main contributions of this work are design improvements giving a smaller parallel-gripper form factor, the introduction of high-density markers on the internal bubble surface, used for estimating shear forces, a proximity pose estimation framework and integrated tactile classification. As in the previous work, a ToF camera is used to sense depth, but in this case a prototype camera from PMD Technologies with a working range of 4-11 cm was employed (and placed at an angle to reduce the overall gripper width). Also, the markers were added not to infer depth but to estimate slippage and grasp quality from shear-induced displacements.
Figure 2.23: On the left the soft-bubble parallel gripper that estimates in-hand pose and tracks shear-induced displacements; on the right the dimensioned scheme where the ToF depth sensor is depicted in blue.
In "GelSight Fin Ray: Incorporating Tactile Sensing into a Soft Compliant Robotic Gripper" [43], a soft gripper with two sensorized fingers for retrieving tactile information is proposed. The Fin Ray design has the advantage of not requiring actuation to securely grasp objects, unlike many soft and rigid grippers. As mentioned in other studies, external ambient
lighting can interfere with a vision-based system: in this case a dark cloth was applied to the sensing device, blocking outside lighting. As shown in
Figure 2.24, the proposed Fin Ray finger is equipped with markers for slip and twist detection; it can measure the orientation of the in-contact object and, through RGB illumination and pre-collected reference images, can perform 3D reconstruction.
Regarding the force estimation problem, in "HiVTac: A High-Speed Vision-Based Tactile Sensor for Precise and Real-Time Force Reconstruction with Fewer Markers" [53], a prototype for force reconstruction is proposed. The developed algorithm allows real-time estimation of the direction and intensity of the external force. The HiVTac tactile sensor, shown in Figure 2.25, is made of a square sheet of PDMS (polydimethylsiloxane) with dimensions of 40 mm × 40 mm. It has 4 markers that are tracked through a wide-angle camera. The main problem when re-adapting the obtained results to our case study is the strong geometric assumptions made possible by the planar design, which are not suited to a hemispherical dome.
Figure 2.24: The GelSight Fin Ray gripper.

In “FingerVision for Tactile Behaviors, Manipulation, and Haptic Feedback Teleoperation” [78] and “Implementing Tactile Behaviors Using FingerVision” [77] by Yamaguchi, the same force estimation approach is used. In these papers too, the vision-based tactile sensor (shown in Figure 2.26) is almost planar, making it different from our design. Nonetheless, the same marker tracking and force estimation approach has been implemented and tested to see if a simple linearization would be suitable, at least for small deformations. In particular, tangential forces are estimated considering the horizontal displacements of the markers, while the normal force, given the unstable marker-width readings of the detection algorithm, is estimated through the norm of the markers' position change. The noisy radius reading given by the Blob Detection algorithm is also confirmed by our work; the accuracy of this method on a hemispherical surface will be discussed later.
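The displacement-based estimation described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the thesis' calibrated model: the function name and the linear gains k_t and k_n are hypothetical placeholders that a real system would obtain through calibration.

```python
import numpy as np

def force_proxies(rest_uv, curr_uv, k_t=0.01, k_n=0.02):
    """Estimate tangential and normal force proxies from marker motion.

    rest_uv, curr_uv: (N, 2) arrays of marker centers in pixel units.
    k_t, k_n: placeholder linear gains (N/px), to be found via calibration.
    """
    disp = curr_uv - rest_uv                    # per-marker (du, dv)
    tangential = k_t * disp.mean(axis=0)        # mean in-plane displacement
    normal = k_n * np.linalg.norm(disp, axis=1).mean()  # mean motion magnitude
    return tangential, normal

# Toy example: every marker shifts by (3, 4) px under load.
rest = np.zeros((29, 2))
pressed = np.tile([3.0, 4.0], (29, 1))
tang, norm_f = force_proxies(rest, pressed)
```

The linearization only holds for small deformations, which is exactly the regime tested in the cited papers.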
Figure 2.25: The HiVTac tactile sensor prototype.
Figure 2.26: Design of the FingerVision and its prototype installed on the Baxter gripper.
In “Visiflex: A Low-Cost Compliant Tactile Fingertip for Force, Torque, and Contact Sensing” [18] a cheap compliant tactile fingertip is proposed. The sensor, shown in Figure 2.27, is capable of contact localization and force/torque estimation. According to the paper, tests indicate that typical errors in contact location detection are less than 1 mm and typical errors in force sensing are less than 0.3 N. At first glance, the design of the Visiflex is very similar to ours, even though different approaches for force and application-point estimation are used. In this work the American researchers used a dome-shaped acrylic waveguide covered by a silicone cap. Eight LEDs act as fiducial markers and the light injected into the waveguide is totally internally reflected except where the cap contacts the waveguide. This behaviour is exploited to easily sense either single or multiple contacts with the external environment, as shown in Figure 2.28. Another difference from our design is the 6-degrees-of-freedom (DoF) fingertip, accomplished using a 6-DoF flexure system. Wrench sensing is performed using several techniques such as stiffness matrix estimation, linear approximation and nonlinear approximation starting from the experimental data (based on Neural Networks).
Figure 2.27: A rendering of the Visiflex [18] tactile sensor and its exploded view.
“The TacTip Family: Soft Optical Tactile Sensors with 3D-Printed Biomimetic Morphologies” [75] is a slightly older paper (published in 2018) that compares several proposed devices sharing the same biomimetic design principle. In particular, they all exploit the deformation of the fingertip, sensing the displacement of pins or markers using a camera. Several patterns are considered and compared, based on their in-hand manipulation and object exploration capabilities. In Figure 2.29 three iterations of the design are shown.

Figure 2.28: The Visiflex sensor contacted at multiple points; red LEDs are the fiducial markers, while green LEDs mark the contact points that can be seen by the camera thanks to the waveguide.

Figure 2.29: As reported in [75]: Open TacTip (left): the original version of the sensor comprises a 3D-printed camera mount and base and a cast silicone skin. Improved TacTip (center): the redesigned base houses a webcam, and modular tips with 3D-printed rubber skin. Modular tips (right): separate modular tips with a nodular fingerprint (above) and flat tip (below).
In “DenseTact: Optical Tactile Sensor for Dense Shape Reconstruction” [16] a compact tactile sensor with high-resolution surface-deformation modeling for 3D reconstruction of the sensor surface is presented. In this work force estimation is not addressed and the design does not comprise fiducial markers. As shown in Figure 2.30, using 3-colored lighting inside the dome and an RGB camera, the surface deformation is estimated and the contacting object's shape is reconstructed. However, these estimates are highly dependent on the 3D shape calibration process (using 3D-printed objects and inferring CAD models) and on the Deep Neural Network's accuracy.
Figure 2.30: On the left the DenseTact sensor mounted on the Allegro hand and its 3D reconstruction results; on the right a visualization of the ray-casting algorithm, used to determine the radial depth from the 3D calibration surface, which is then projected onto the image plane.
Similarly to the previously cited paper, in “GelSight: High-Resolution Robot Tactile Sensors for Estimating Geometry and Force” [79] 3D shape reconstruction is performed using a structured-light setup and a photometric stereo algorithm. In particular, 3 differently colored LEDs are arranged along different directions and, by combining the shading from three or more directions, surface normals at each pixel of the shaded image are estimated (as shown in Figure 2.31). Afterwards, the surface normals are integrated to get the 3D shape of the surface. Also, force estimation is carried out through marker tracking. The proposed sensor has a planar geometry, so the mentioned approaches were not easily applicable to our case study.
Figure 2.31: As reported in [79]: (a) basic principle of the GelSight design, which consists of a sensing elastomer piece with the opaque reflective membrane on top, a supporting plate, LEDs and a camera to capture the shaded images under different lightings; (b) picture of the sensor; (c) arrangement of the LEDs and camera when viewed from the top.
Chapter 3

The design of the prototype
In this chapter the specific case study is presented, analyzing step by step the methodologies used to manufacture the prototype. Also, the experimental setups and the main Computer Vision algorithms are described in detail.
3.1 Specifications and goals
As discussed in Chapter 2, there are several reasons why soft robotics is drawing the attention of companies and researchers. Several recently published papers focus their attention on tactile sensing, which can provide information about the contact such as friction, slippage, surface features (curvature, texture), and applied reaction forces and torques. Moreover, it can be a valid option for describing the object's shape, orientation and rigidity in situations where vision is out of reach. As mentioned in [58], most of the research on this topic has been focusing on soft but flat tactile sensors. On the other hand, extending these technologies to freeform or hemispherical surfaces is not trivial. In order to sense the mechanical deformation, the common strategy is to apply a pattern of markers inside the fingertip and track their motion using an RGB or RGBD camera. This way, there is no need for direct wiring to the soft material, guaranteeing higher compliance than embedding capacitive or resistive materials.

Based on the chosen pattern and marker density, the sensing resolution can be fairly high. As discussed in Chapter 2, since the fingertip design has to be compact, there are no depth cameras (RGBD nor ToF) available on the market that are able to precisely sense depth at such close range, as they require a minimum distance of around 10 cm. So, if a monocular camera is used, other techniques have to be exploited to perform depth estimation. In [58] two layers of differently colored markers are exploited; in [57] they rely on camera calibration and the sensed markers' dimensions; in [43], [16] and [79] deep neural network approaches are used to calibrate the sensor and perform 3D reconstruction.
In our work we focused on the development of a cheap, 3D-printed, marker-based hemispherical soft gripper capable of sensing forces when in contact with the external environment. Moreover, the 3D shape estimation approach proposed in [57] has been explored.
The final goals of our work are to:
• design and manufacture a soft tactile sensor with a suitable pattern of fiducial markers to be tracked;
• calibrate the sensing device in order to perform online estimation of both shear and normal forces;
• design a structure that can hold the sensing device in place and mount it on the “Franka Emika Panda” [23] robot's end-effector (shown in Figure 3.1);
• attempt a simple harvesting task exploiting the developed device within a force control loop on the gripper and an external depth camera mounted on the robotic arm.
3.2 The manufacturing process
3.2.1 Prototyping and manufacturing the hemispherical dome
First of all, we focused on the manufacturing of the tactile sensor and the pattern selection. Prioritizing ease of manufacturing and reproducibility, we decided to avoid multi-material additive manufacturing and molding as much as possible, focusing on 3D printing. Our hemispherical sensing device was 3D printed using the Formlabs Form 2 stereolithography (SLA) 3D printer [21] shown in Figure 3.2. It costs around 3000 $ and is characterized by a layer thickness, or Z-axis resolution, of 25, 50 or 100 microns. The compatible materials are also provided by Formlabs [22] and they range from 150 $/l to 400 $/l depending on mechanical properties such as stiffness, elasticity, thermal resistance and achievable resolution (up to 0.005 mm). To 3D print the hemispherical dome we used the Elastic 50A Resin, a polymer resin that supports a maximum resolution of 100 microns with a Shore Hardness of 50A. The required printing time depends on the geometry and height of the model; in our case it was around 4 h at a 0.10 mm resolution for a single dome, and 5 h at the same resolution for two domes. SLA resins are photocurable through light that lies in the ultraviolet spectrum.

Figure 3.1: The Franka Emika Panda robotic arm equipped with the Franka Hand 2-finger gripper.
After the 3D printing process, the elastomer looks as in Figure 3.3, so all the unnecessary material has to be properly removed to make the surface smooth. After 3D printing, the models have to be cleaned with IPA (isopropyl alcohol), either in a proper washing machine or by hand, as in our case, due to the fragility of the material. In particular, the 3D-printed domes need to be dipped in IPA for about 10 minutes; they then have to be gently shaken while inside the washing tub and left to soak for another 10 minutes. This process removes all the resin residue from the polymerized structure, due to the fact that the printer continuously deposits resin on the model and the laser beam causes its polymerization just above the printing plane (the laser loses power beyond the 0.10 mm layer depth). The cleaning process is very important to prevent the residue on the surface from polymerizing during the next step.
Afterwards, the UV curing process takes place inside a special ultraviolet oven for about 20 minutes at 60 °C. Without going into too much detail, curing is a chemical process that causes the toughening or hardening of a polymer material by cross-linking of polymer chains; in our case it was induced by heat. The required curing time is computed considering the target Shore Hardness of the final material, which is about 50A. To better understand the meaning of this value, Figure 3.4 shows a Shore Hardness scale of general-purpose items. Shore Hardness is in fact a measurement unit that indicates the resistance of a material to indentation. As shown in Figure 3.4, there are different scales, depending on the application field and material type, which can overlap each other.

Figure 3.2: From left to right: the Formlabs Form 2 stereolithography 3D printer; the Elastic 50A Resin; a 3D-printed sample as shown on the website [22].
Figure 3.4: Shore Hardness scale of general-purpose items.
Figure 3.5 shows a summary of the properties and curing times of the most common materials used in soft robotics, according to [33].
Figure 3.6 shows how the polymer dome presents before and after the curing process.
Figure 3.3: From left to right: the just-printed dome; the washed dome; how the washed dome presents on the inside.
Figure 3.5: Chart of the most common soft materials' properties and curing times, according to [33].
Figure 3.6: On the left the hemispherical dome before being cured; on the right the hemispherical dome after being cured.
Once the material is cured, it becomes semi-transparent, allowing external light to partially pass through. This behaviour, which initially was thought to be an added value, is actually something to deal with in order to achieve an accurate tracking algorithm. In fact, the shadows generated by the in-contact object tend to interfere with the markers' detection and tracking algorithms.
To avoid this problem, a temporary light-colored cloth was placed over the dome during testing and calibration, while a thin layer of latex was used as a cover on the final sensing device. Also, it is worth mentioning that the longer the curing time, the greater the opacity but also the material's stiffness.
The prototyping process included multiple iterations of the dome design, in particular:
• initially,themarkers’diameterwas0.9mm(insideviewofthedome showninFigure3.7)andtheywerespacedby1.5mmintervals.Toimprovemarkers’detectionandtrackingtheywereincreasedtoadiameter of2mmandaspacingof2mm;
• four different patterns were printed and evaluated based on their sensitivity to applied forces and their robustness during markers' tracking. As mentioned by state-of-the-art related papers, a tradeoff between sensing resolution and signal-to-noise ratio must be found. In our case, we decided to adopt the “double cross” design shown in Figure 3.8. We made this decision based on the markers' detection and tracking reliability; also, the pattern was designed in such a way that diagonal crosses can be distinguished from orthogonal crosses by the number of markers and a slightly different spacing. The total number of markers on the chosen design is 29 and they were filled with black resin to maximize the contrast (this operation could easily be integrated during 3D printing, if the printer allows it).
3.2.2 Design and manufacturing of the case
The case was 3D printed using a cheap FDM (Fused Deposition Modelling) printer and is made of PLA (polylactic acid). Initially, it was conceived as a 4-part design, to decouple the different layers as much as possible, as shown in Figure 3.9. In fact, from top to bottom:
• a square piece constrains the hemispherical dome at its base;
Figure 3.7: On the left the initial “double cross” design consisting of 0.9 mm diameter markers with a spacing of 1.5 mm; on the right the superimposed detected blobs, which are clearly noisy.
Figure 3.8: On the left the four designed patterns; on the right the chosen “double cross” pattern.
• a thinner plate with a circular indentation holds the Adafruit NeoPixel Ring (12 LED) [1];
• a “box-like” shaped piece holds the 5 MP fish-eye RGB camera for Raspberry Pi [20];
• the base completes the design and includes several holes for positioning screws and cable management.
After several iterations, mainly due to a bad centering of the camera that led to poor force estimation results and a non-symmetric image, the design was further improved. Having a modular design allows re-thinking and printing a single part instead of the whole prototype, making the overall process faster and minimizing wasted material. In fact, the final design consists of 4 flat elements that are fast to print (around 40 minutes each) and the main through-hole box that requires 2 h, for a total of about 5 h at 0.18 mm resolution for the whole case. Considering the amount of required material, the production cost of the case is around 4 €.

Figure 3.9: On the left a photograph of an exploded view of the initial prototype; on the right a render of the prototype.
Figure 3.10 shows a render of the final design that was mounted on the Franka Emika Panda's gripper.
3.3 Experimental setups
In this section the experimental setups are described and divided into three categories: the testing setup used during software implementation; the temporary setup for data acquisition during the sensor's calibration; the definitive setup mounted on the robot.
3.3.1 Testing setup used during software implementation
The most frequently used setup during our work is the one shown in Figures 3.11 and 3.12. It includes the following hardware components:
• Raspberry Pi 3 Model B [54], its power supply and a USB WiFi adapter;
Figure 3.10: Multiple rendered views of the proposed 3D-printed modular design.
• Arduino Uno [68];
• the 3D-printed hemispherical dome mounted on the 3D-printed case;
• Adafruit NeoPixel Ring 12 RGB LED [1] (inside the case);
• IR fish-eye camera with 1.7 mm focal length, 5 MP resolution and 175° field of view (OV5647 chip) for Raspberry Pi [20] (inside the case);
• connection cables for the LED ring and the camera;
• a power supply connector and splitter for powering the Arduino Uno board and the LED ring.
On the Raspberry Pi 3 board, “Raspberry Pi OS” (previously called Raspbian) was installed. It acts as a server, establishing a connection with the main computer and sending it the real-time images collected by the camera. The Arduino Uno board was used to easily control the Adafruit NeoPixel Ring with the dedicated library [17]. The main computer connects to the Raspberry server as a client (they must be connected to the same Local Area Network) and processes the received raw frames.

Figure 3.11: Overall view of the testing experimental setup.
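The server/client frame transfer described above can be sketched with standard Python sockets. The thesis does not specify the wire protocol, so the length-prefixed framing below is an illustrative assumption; on the Pi the payload would typically be a JPEG-encoded frame, decoded on the laptop with something like cv2.imdecode.

```python
import socket
import struct

def send_frame(sock: socket.socket, payload: bytes) -> None:
    # Prefix each frame with its byte length so the client knows where it ends.
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_exact(sock: socket.socket, n: int) -> bytes:
    # Keep reading until exactly n bytes have arrived (TCP may fragment).
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        buf += chunk
    return buf

def recv_frame(sock: socket.socket) -> bytes:
    (length,) = struct.unpack(">I", recv_exact(sock, 4))
    return recv_exact(sock, length)
```

A pair of connected sockets (server on the Pi, client on the laptop) would call send_frame and recv_frame in a loop, one frame per iteration.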
3.3.2 Temporary setup for data acquisition during the sensor's calibration
Once the markers were properly detected and tracked by the Computer Vision algorithms (which will be discussed in Section 3.4), a temporary setup for data acquisition was created. In addition to all the mentioned components of the “testing setup”, an ATI Nano17-E force-torque sensor [6] and a 3-axis Cartesian robot were employed. Figure 3.13 shows a 3-axis Cartesian robot similar to the one we used and the ATI Nano17-E sensor. The “testing setup” was placed on a horizontal surface while the 3-axis robot holding the ATI Nano sensor was pressed against the hemispherical dome. In particular, we measured normal forces applied to the same point, which was, approximately, the dome's center. Data acquisition and processing will be discussed further in Subsection 3.4.3.
Figure 3.12: Closer views of the experimental setup.
3.3.3 Definitive setup mounted on the robot
As mentioned before, the Franka Emika Panda robotic arm was used. According to the datasheet [24], it has a maximum payload of 3 kg, a reach of 855 mm and 7 degrees of freedom (Figure 3.14 shows the robot's workspace from two different points of view).
Also, it is equipped with a parallel hand gripper with exchangeable fingers (shown in Figure 3.15), which we unscrewed to install the developed sensing device. Regarding the applicable force, it ensures a continuous force of 70 N and a maximum force of 140 N. The robot was programmed in C++ and Python using ROS (Robot Operating System).

Figure 3.13: On the left an example of a Cartesian robot; on the right the ATI Nano series 6-axis force/torque sensor.

Figure 3.14: On the left a side view of the arm's workspace; on the right a top view of the arm's workspace.
As shown in Figure 3.16, the setup that was mounted on the robot arm in order to perform a grasping task includes:
• the Franka Emika Hand gripper with 2 sensing devices (of which only one was used for real-time force sensing, assuming that the force is equally distributed between the two);
• an Intel RealSense D435i [32] depth camera;
• a Raspberry Pi 3 board;
• a series of suitable 3D-printed holders for each component.
Figure 3.16 shows two renders of the Panda's gripper with the custom sensing device that replaced the original fingers and the RealSense depth camera. Moreover, Figure 3.17 shows the 3D-printed design mounted on the real robot.
Figure 3.15: Original Franka Emika Hand gripper.
As can be noticed by looking at Figure 3.16, the Arduino Uno was removed to improve the design, by simply installing the dedicated library [26] on the Raspberry Pi to manage the Adafruit LED ring. The Intel RealSense D435i shown in Figure 3.18 is a depth camera that was exploited to autonomously perform a grasping/picking task. In fact, RGBD cameras allow retrieving not only a color image but also the 3D coordinates (with respect to the camera's own reference frame) associated with every pixel in the image. Future developments may take further advantage of it, offering: collision avoidance and consequent trajectory planning, safer human-robot interaction, more effective picking pattern selection, and more robust and reliable object detection.

Figure 3.16: Renders of the definitive setup mounted on the Franka Emika Panda robot.

Figure 3.17: Photographs showing the Franka Emika Panda robot with the custom gripper installed on the Franka Hand.
3.4 Implemented software algorithms and approaches
This section explores the main implemented algorithms and pipelines, ranging from marker detection to online force estimation.
3.4.1 Marker detection and tracking
The individual markers inside the dome are detected using OpenCV's Blob Detection algorithm. The input image that is sent from the Raspberry Pi board to the main laptop is converted to grayscale and used as input to the OpenCV SimpleBlobDetector. The detector's parameters are fine-tuned for our specific case; in particular, blobs are filtered by area and a maximum threshold is applied. According to the “LearnOpenCV” guide [41], the main steps of the SimpleBlobDetector are:
Figure 3.18: The Intel RealSense D435i RGBD camera.
• Thresholding: the source image is converted to several binary images by thresholding it with thresholds starting at minThreshold. These thresholds are incremented by thresholdStep until maxThreshold;
• Grouping: in each binary image, connected white pixels are grouped together; they are called binary blobs;
• Merging: thebinaryblobs’centersinthebinaryimagesarecomputed, andblobslocatedcloserthan minDistBetweenBlobs aremerged;
• Center & Radius Calculation: the centers and radii of the newly merged blobs are computed and returned.
As shown in Figure 3.19, further thresholding options can be set as parameters of the SimpleBlobDetector. In our application, markers can deform into ellipses but still need to be tracked, so filtering was applied by area and a manual filter was introduced to consider only markers detected within a pre-defined radius from the frame's center (assuming the dome to be centered with respect to the camera).
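The manual radius filter mentioned above amounts to a simple region-of-interest check on the detected blobs. The following is a minimal sketch (function name and tuple layout are illustrative; in practice the tuples would be unpacked from cv2.KeyPoint objects):

```python
import math

def filter_keypoints(keypoints, center, max_radius):
    """Keep only blobs within max_radius pixels of the image center.

    keypoints: iterable of (u, v, radius) tuples;
    center: (cx, cy) of the frame in pixel units.
    """
    cx, cy = center
    return [kp for kp in keypoints
            if math.hypot(kp[0] - cx, kp[1] - cy) <= max_radius]
```

Blobs detected near the image borders, which tend to be spurious, are discarded before tracking.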
Figure3.19: SimpleBlobDetector’sthresholdingoptions.
Listing 3.1 shows how the SimpleBlobDetector was created and initialized, specifically tuned to be reliable during marker detection with the 29-marker “double cross” design.
Listing 3.1: Python snippet of the SimpleBlobDetector's initialization.
import cv2

def set_parameters():
    # Initialize parameters of the SimpleBlobDetector
    params = cv2.SimpleBlobDetector_Params()

    # Filter by area
    params.filterByArea = True
    params.minArea = 20   # ideal for "double cross" pattern
    params.maxArea = 200  # ideal for "double cross" pattern

    # Maximum threshold
    params.maxThreshold = 125

    # Create a SimpleBlobDetector with the chosen parameters
    detector = cv2.SimpleBlobDetector_create(params)

    return detector

Also, Figure 3.20 shows both the raw frames (before and after deformation) sensed by the fish-eye camera and the superimposed detected markers. The green circle represents the manually defined area of interest, so if a blob is detected outside of it, it is ignored (sometimes the borders of the image tend to create problems), while the green lines mark the center of the camera image, allowing a better manual alignment. The Blob Detection algorithm is fairly robust even when high deformations are involved; on the other hand, it treats blobs as circles and not ellipses. This aspect is highlighted and properly managed in [57], where the diameter of the circular marker is computed as twice the semi-major axis of the ellipse shown in the camera image. The semi-major and semi-minor axes are computed using the ellipse fitting algorithm proposed by Fitzgibbon et al., also integrated in the cv2.fitEllipse() function. In our work this aspect was investigated, even though our markers' size is about one order of magnitude smaller than the Universal Gripper's, resulting in a less pronounced shape transition from circle to ellipse.
Looking at Figure 3.21, we can see that after applying a normal force of around 4 N, the markers' shapes tend to become slightly elliptic depending on the application point and marker location, but the standard Blob Detection algorithm returns circles that approximate their shape. The central image shows the Blob Detection's output, highlighting the circles in white and the centers with black crosses. The image on the right shows the fitted ellipses, which cannot be obtained directly from the SimpleBlobDetector (even though one could exploit circularity and inertia filters to retrieve the amount of deformation). Hence, a simple algorithm was developed; it can be summarized in the following steps:
• convert the input image to grayscale (cv2.cvtColor() function);
• blur the grayscale image with a 3 × 3 filter (cv2.blur() function);
• detect edges using the Canny algorithm (cv2.Canny() function);
• find contours on the binary image (cv2.findContours(canny_output, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE) function);
• for each contour (which represents a marker's border), if it consists of at least 5 points, fit an ellipse using Fitzgibbon's algorithm implemented in the cv2.fitEllipse() function;
• return the ellipses' centers and major axes.

Figure 3.20: From left to right: the raw frame when no forces are applied; the raw frame after applying a force; the frame after marker detection.
Figure 3.22 shows the horizontal, vertical and radius displacements of the detected circular markers in pixel units, while Figure 3.23 refers to the detected elliptic markers. These results come from one of the samples measured during the data acquisition phase: a centered normal force from the rest position to a maximum Z-deformation of around 10 mm, corresponding to about 4 N of vertical force.
As can be seen from the two graphs, the sensed displacements are quite similar, but the radii measurements become even noisier when considering the markers as ellipses. Considering the not-so-straightforward ellipse fitting algorithm, which includes several thresholding steps, the rest of the work uses the standard Blob Detection's output, which is accurate enough for our case study.
Figure 3.21: Left: the raw frame when a normal displacement of 10 mm is caused by the external force. Center: the frame after marker detection, fitting ellipses. Right: the frame after marker detection, fitting circles.
Figure3.22: Circularmarkers’displacements(horizontal,verticalandradius)inpixel units.Eachofthe29markersisrepresentedbyadifferenthue.
Figure3.23: Ellipticmarkers’displacements(horizontal,verticalandradius)inpixelunits.
The main reason why this method was implemented is that the 3D reconstruction formulas sometimes returned noisy and non-realistic results; unfortunately, considering the ellipses' centers and widths did not improve the outcomes by much.
Once the markers are properly detected, tracking comes into play. The tracking approach is fairly simple: after an initialization step, which is passed only if all the 29 markers are detected, marker detection is applied to every received frame and the new coordinates are associated with the closest marker of the previous frame. Using the Euclidean distance as the metric to sort markers, we found the approach robust enough: the frame rate is about 20 fps, allowing it to correctly perform tracking. In order to make the algorithm a bit more reliable during the offline dataset creation, a Boolean interpolation matrix has been created. Each row of this matrix contains 29 Boolean entries that indicate the tracking state (“True” means currently tracking; “False” means tracking was lost) of each marker in the corresponding frame. Once all the states are collected, the DataFrame data structure (pandas library) is exploited to perform linear interpolation of the u, v and radius measurements where tracking was lost (a code snippet is shown in Listing 3.2).
import numpy as np
import pandas as pd
# [...]
# Linear interpolation where the interpolation matrix value is True
for it, interp_arr in enumerate(interpolation_matrix):
    interp_arr = np.array(interp_arr)
    ind = list(np.where(interp_arr == True)[0])
    if len(ind) > 0:
        for marker_id in ind:
            all_traj_markers[it][marker_id] = [np.nan, np.nan, np.nan]

# Interpolation step: if "np.nan", tracking was lost,
# so interpolate linearly between positions
trajs = np.array(all_traj_markers)
for marker_id in range(num_markers_gripper):
    interp_df = pd.DataFrame(trajs[:, marker_id, :]).interpolate()
    for it, el in enumerate(interp_df.values.tolist()):
        all_traj_markers[it][marker_id] = el
# [...]

Listing 3.2: Python snippet of how linear interpolation is performed when tracking is lost.

Figure 3.24 shows a visual representation of the markers being independently tracked (subsequent coordinates of the same marker are scattered with the same color) and sorted (the sorting algorithm will be discussed in Subsection 3.4.4), while Figure 3.25 shows the effects of the same deformation directly on the frame sent by the Raspberry Pi.
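The per-frame nearest-neighbor association described above can be sketched as follows. This is a simplified illustration (greedy, one marker at a time, with a hypothetical max_dist gate), not the thesis' exact implementation:

```python
import numpy as np

def associate_markers(prev_uv, curr_uv, max_dist=30.0):
    """Greedy nearest-neighbor association between two frames.

    prev_uv: (N, 2) marker centers in the previous frame (fixed order).
    curr_uv: (M, 2) detections in the current frame, arbitrary order.
    Returns an (N, 2) array where row i is the detection closest to
    previous marker i, or [nan, nan] if none lies within max_dist.
    """
    prev_uv = np.asarray(prev_uv, float)
    curr_uv = np.asarray(curr_uv, float)
    out = np.full_like(prev_uv, np.nan)
    if len(curr_uv) == 0:
        return out
    # Pairwise Euclidean distances between old and new centers.
    d = np.linalg.norm(prev_uv[:, None, :] - curr_uv[None, :, :], axis=2)
    for i, row in enumerate(d):
        j = int(np.argmin(row))
        if row[j] <= max_dist:
            out[i] = curr_uv[j]
    return out
```

A NaN row marks a lost marker, which is exactly the state the Boolean interpolation matrix records before the pandas interpolation step fills the gap.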
Figure 3.24: Plot of the subsequent markers' coordinates in pixel units during deformation.
3.4.2 From pixel to metric units with a monocular setup
Webelievethatit’sworthmentioningallthetriedapproachestoestimatethe markers’3Dcoordinates,becauseatfirstwewantedtoexploitthoseresultsfor forceestimation.Despitetheefforts,intheendwedecidedtodirectlytrain theforceestimationmodelsonthecoordinatesexpressedinpixelunits,being themfarmorereliable.
As discussed in Subsection 2.3.2, most of the proposed methods for reconstructing the grasped object's (or dome's) 3D shape that do not involve RGBD cameras rely on Neural Networks or on a preliminary training process. In fact, obtaining depth information with a single camera (monocular setup) is not a trivial task. In our work we tried to reproduce the results obtained in [57] using the suggested formulas. In particular, 3D marker position estimation is performed as:
Figure 3.25: From left to right: the raw frame in the rest position; the superimposed arrows that show the displacement direction of each marker (the arrows were lengthened by a factor of 8 to make them more visible); the raw frame under a force of around 4 N.
X = (u − cx) · Omm / Opx (3.1)

Y = (cy − v) · Omm / Opx (3.2)

Z = (fx · Omm · Ipx) / (Opx · Imm) (3.3)

where (X, Y, Z) are the estimated 3D coordinates of the marker; (u, v) are the coordinates of the marker in pixel units; (cx, cy) are the principal point coordinates (obtained through camera calibration), usually at the image center; (fx, fy) are the focal lengths in metric units; Omm is the marker's width in metric units, while Opx is the marker's width in pixel units, both corresponding to the ellipse's major axis; Imm is the image sensor width in metric units, while Ipx is the image sensor width in pixel units. Figure 3.26, from [57], reports a visual representation of the markers' width and the resulting 3D shape estimation.
Figure 3.26: On the left a representation of how circular markers embedded in the membrane are detected as ellipses; on the right the developed viewer tool illustrating the gripper's deformation [57].
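Equations (3.1)-(3.3) translate directly into code. The function below is a hypothetical sketch with the symbol names spelled out as parameters; the Z formula follows the pinhole similar-triangles relation (focal length converted to pixels via the sensor width), consistent with the X and Y expressions.

```python
def marker_3d(u, v, o_px, cx, cy, fx_mm, o_mm, i_px, i_mm):
    """Estimate a marker's 3D position from its image center and width.

    (u, v): marker center [px]; o_px / o_mm: marker width [px] / [mm];
    (cx, cy): principal point [px]; fx_mm: focal length [mm];
    i_px / i_mm: image sensor width [px] / [mm].
    """
    x = (u - cx) * o_mm / o_px
    y = (cy - v) * o_mm / o_px
    z = (fx_mm * o_mm * i_px) / (o_px * i_mm)
    return x, y, z
```

A quick sanity check of the geometry: with an illustrative 2 mm focal length, 3.6 mm / 640 px sensor and a 2 mm marker imaged at 40 mm depth, the marker appears about 17.8 px wide, and plugging that width back in recovers the 40 mm depth.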
First of all, we focused on this approach. After applying the standard Computer Vision calibration process with a 9 × 6 chessboard and the use of OpenCV, the camera matrix and distortion parameters were computed. After that, the pixel-to-meters conversion takes place to retrieve the 3D positions of the 29 markers. In this regard, we tried to exploit both the circular and elliptic marker detections to compare the results and look for improvements. From a qualitative point of view, the retrieved 3D coordinates are reasonable but they do not measure up to the CAD model.
As can be noticed by looking at Figure 3.27, when the sensor is in its rest position, some of the central markers are actually lower than the external ones. To better understand the graphs: in Figure 3.27 and in the left image of Figure 3.28, the cyan coordinates represent the 3D ground truths obtained from the CAD model in the rest position; the green and red coordinates were computed using Equations (3.1)-(3.3) and they respectively refer to the rest positions (which should coincide with the cyan coordinates) and the deformed positions (for which we do not have a reliable ground truth). Moreover, Figure 3.28 shows a 3D mesh representing the deformed dome, obtained by performing a Boolean difference between a sphere with the same diameter as the dome and the 3D triangulated volume from the markers' coordinates after deformation, using the pyvista “delaunay_3d()” function.
Also, this method heavily relies on camera calibration, image rectification and a correct camera placement that ensures the validity of the geometric assumptions. In fact, with the initial temporary setup, we observed that moving and tilting the camera in different directions would drastically change the obtained 3D coordinates.
Trying to improve the 3D estimation, which could open up the possibility of performing in-hand object localization, classification and surface estimation, we tried to exploit the discussed ellipse fitting algorithm. So, each marker is represented by the (u, v) coordinates that define the ellipse's center and by twice the ellipse's semi-major axis in place of the circle's radius. As shown in Figure 3.29, this approach leads to slightly better results. Moreover, Figure 3.30 shows a comparison of the obtained 3D coordinates between fitting circles and fitting ellipses.

Figure 3.27: From left to right: top and front views of the obtained 3D coordinates of the markers detected as circles, using the formulas described in [57].
Unlike the previous two approaches, another method to estimate the 3D coordinates, developed from scratch, is presented here. It relies on the measured circles' radii and on the initial 3D positions of the markers extracted from the CAD model. In fact, despite being fairly noisy, the radius of each marker is the only “measurement” of depth that can be retrieved. So, the idea is to estimate the 3D position of each marker through a linear (for X and Y) or inverse (for Z) proportionality between the CAD coordinates and the (∆u, ∆v, ∆radius) displacements. Listing 3.3 shows the Python code used to perform the 3D estimation. As clearly stated by the comments, there are some manually defined normalizing factors that make the estimation reasonable; however, to compute them automatically, 3D ground truths of the deformed dome would be required. Regarding the “Z_reduction” array, the boundaries were set considering the normal displacement acquired during the dataset creation with the 3-axis robot. So, having recorded a maximum deformation of negative 10 mm, we set the boundaries accordingly so that the Z-coordinate estimation would be correct for the central marker; we then assumed that a linearly distributed normalizing factor could be suitable for the other markers. Also, the normalizing factors would need to remain consistent even with less pronounced deformations. On the other hand, the “norm_factor_u_v” normalizing factor was manually defined so that the X and Y estimations would be reasonable, even though (X, Y) ground truths are missing. Future developments could retrieve ground-truth 3D coordinates by simulating force application with software such as SOLIDWORKS [67].

Figure 3.28: Left: side view of the obtained 3D coordinates of the markers detected as circles, using the formulas described in [57]. Right: 3D mesh obtained using the “pyvista” library.

Figure 3.29: Top and front views of the obtained 3D coordinates of the markers detected as ellipses, using the formulas described in [57].

Figure 3.30: On the left a front view of the obtained 3D coordinates of the markers detected as circles; on the right a front view of the obtained 3D coordinates of the markers detected as ellipses.
1 # Normalize radius displacements between -1 and 1 2 displacements_px_c=final_markers_c-initial_markers_c 3 norm_radius_disp=displacements_px_c[:,2].copy() 4 norm_radius_disp/= max ( abs (norm_radius_disp)) 5 6 # Manually define X , Y , Z reduction factors 7 num_steps=30 8 Z_reduction=np.linspace(1,0.6,num_steps) 9 disp_steps=np.linspace(0.1,1,num_steps) 10 norm_factor_u_v=40 11 12 # Compute the final 3 D markers ’ coordinates 13 final_markers_3d=[] 14 for m_id,(rad_disp,mark) in enumerate ( zip (norm_radius_disp, 15 final_markers_c)):
Chapter3Thedesignoftheprototype72
Listing3.3: Pythonsnippetoftheproposed3Dmarkers’positionsestimation
Figure3.32showstheobtained3Dcoordinateswiththeproposedmethod, whileFigure3.31showsaheatmaprepresentingradiusdisplacement(from blacktoyellow)superimposedonthedeformedframe,onwhichisbasedthe Zcoordinateestimation.Finally,Figure3.33presentsa3Dvisualrepresentation ofthedeformeddomeobtainedthroughthe pyvista library.
Figure3.31: Ontheleftaradiusdisplacementheatmapsuperimposedonthedeformed frame;ontherightthebefore(coincidingwiththeCADgroundtruths)andafterdeformation meshesobtainedtriangulatingthe3Dmarkers’coordinates.
16 u,v,radius=mark 17 X,Y,Z=markers_XYZ[m_id,:] # get ground truths from CAD 18 u_disp,v_disp=displacements_px_c[m_id,0:2]/norm_factor_u_v 19 for k,factor in enumerate (disp_steps): 20 if rad_disp<0.3 and [X,Y,Z] not in final_markers_3d: 21 final_markers_3d.append([X*(1+u_disp),Y*(1+v_disp),Z]) 22 break 23 elif 0.3<=rad_disp<=factor: 24 final_markers_3d.append([X*(1+u_disp),Y*(1+v_disp), 25 Z*(Z_reduction[k])]) 26 break
Figure 3.32: Top and front views of the 3D markers' coordinates obtained with the proposed method.

Figure 3.33: From left to right: a top and a bottom view of the 3D mesh obtained through the boolean difference between the dome and the triangulated markers' coordinates; a side view of the 3D mesh obtained through the boolean difference between the rest-position and deformed-position triangulated markers' coordinates.
3.4.3 Raw data acquisition for sensor calibration

To properly calibrate the sensing device, ground-truth forces and tracking data have been acquired. In particular, as mentioned in Subsection 3.3.2, an ATI force/torque sensor was used to measure ground-truth forces while pressing against the dome's surface. To log the force measurements we used ROS (Robot Operating System), so they were saved inside the so-called "rosbags". After positioning and centering the robot above the dome and resetting the sensor's bias caused by its orientation, we collected force measurements and raw frames while slowly moving down along the Z-axis of the robot to generate pressure on the dome. In this fashion we acquired data from 9 different experiments, ranging from 3 mm to 12 mm (3, 4, 5, 6, 7, 8, 9, 10, 12 mm) of vertical displacement and generating a normal force approximately centered on the central marker. Figure 3.34 shows the force and torque components measured by the ATI Nano sensor.
Figure 3.34: Force and torque components measured by the ATI Nano sensor.
3.4.4 Offline semi-automated dataset creation pipeline

After collecting all the images and force measurements during the data acquisition process, the raw data has to be pre-processed and properly stored. In particular, after organizing the raw data in a proper folder structure, a semi-automated offline pipeline was developed in order to perform the following tasks:

1. automatic offline detection and tracking of the markers, starting from the images pre-collected during each trial. The low-pass filtered and, if needed, interpolated tracking data (timestamp, marker trajectory, marker displacement) is then written to a .json file;

2. semi-automatic synchronization between ground-truth forces and marker displacements. The synchronized ground truths and displacements are written into two separate .json files, while the corresponding images are correctly placed inside each folder.

Starting from the first task, the raw data was stored and organized in the following manner:
raw_data
├── 3mm
│   ├── out_img
│   │   ├── 1frame.png
│   │   ├── 2frame.png
│   │   ├── 3frame.png
│   │   └── ...
│   └── *.bag
├── 4mm
│   ├── out_img
│   │   └── ...
│   └── *.bag
└── ...
Once the code is executed, all the folders contained inside "raw_data" are browsed one by one. If a ".bag" file and the "out_img" folder are found, the stored frames are read following the index order, and then marker detection and tracking is performed. After having tracked the 29 markers, their trajectories are linearly interpolated, as explained in Subsection 3.4.1, and filtered with a Butterworth low-pass filter implemented in the scipy.signal library. It is worth mentioning that the detected markers are not sorted consistently between different trials, meaning that "marker 1" of experiment "3mm" could correspond to "marker 5" of experiment "4mm".
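A minimal sketch of this interpolation-plus-filtering step is shown below; the function name, filter order and cutoff frequency are illustrative assumptions, not the values used in the actual pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def smooth_trajectory(samples, cutoff=0.1, order=2):
    """Low-pass filter a 1D marker trajectory.

    samples: raw u (or v, or radius) values over time, possibly with
             NaNs where a marker was not detected in a frame.
    cutoff:  normalized cutoff frequency (fraction of the Nyquist rate).
    """
    samples = np.asarray(samples, dtype=float)
    t = np.arange(len(samples))
    # Linearly interpolate missing detections before filtering
    missing = np.isnan(samples)
    if missing.any():
        samples[missing] = np.interp(t[missing], t[~missing],
                                     samples[~missing])
    # Zero-phase Butterworth low-pass filter (no time shift)
    b, a = butter(order, cutoff)
    return filtfilt(b, a, samples)

# Example: a noisy increasing trajectory with one missing frame
raw = [0.0, 1.1, np.nan, 2.9, 4.2, 5.0, 6.1, 7.0, 7.9, 9.0]
smoothed = smooth_trajectory(raw)
```

Filtering with filtfilt (forward and backward passes) avoids introducing a time lag between the tracked displacements and the force ground truths.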
Next, the second task comes into play. Due to the absence of a unique triggering signal to synchronize the ATI measurements with the images or the online tracking data during acquisition, a semi-automatic and fast synchronization step was introduced. After the first step, Figure 3.35 pops up and the user is asked to click on the figure six times. In particular, the user needs to specify the start and end of the ground-truth portion (2 clicks on the first subplot) and only then the boundaries of the corresponding markers' displacements (2 clicks on the second subplot). This process has to be done 3 times, in order to automatically retrieve synchronized pixel displacements and trajectories, force ground truths and raw images. In fact, after the six clicks, the three obtained segments are shown to the user, who needs to supervise the results (this pipeline could easily be automated, but we think that this task is crucial enough to require human supervision). Figures 3.36, 3.37 and 3.38 show how the first, second and third segments appear after synchronization.
Figure 3.35: Figure showing, from top to bottom: the ground truths of the 3 force components, and the u, v and radius displacements of each marker.

Figure 3.36: Plot showing the first segment of the synchronized ground truths and markers' displacements.
Figure 3.37: Plot showing the second segment of the synchronized ground truths and markers' displacements.

Figure 3.38: Plot showing the third segment of the synchronized ground truths and markers' displacements.
Once all the data stored in the "raw_data" folder has been synchronized, the "clean_data" folder contains:

clean_data
├── 3mm_synchronized_partial1
│   ├── synchronized_rgb
│   │   ├── click1_frame.png
│   │   ├── click1+1_frame.png
│   │   ├── ...
│   │   └── click2_frame.png
│   ├── synchronized_pixel.json
│   └── synchronized_rosbag.json
├── 3mm_synchronized_partial2
│   ├── synchronized_rgb
│   │   ├── click3_frame.png
│   │   ├── click3+1_frame.png
│   │   ├── ...
│   │   └── click4_frame.png
│   ├── synchronized_pixel.json
│   └── synchronized_rosbag.json
├── 3mm_synchronized_partial3
│   ├── synchronized_rgb
│   │   ├── click5_frame.png
│   │   ├── click5+1_frame.png
│   │   ├── ...
│   │   └── click6_frame.png
│   ├── synchronized_pixel.json
│   └── synchronized_rosbag.json
└── ...
To make the mentioned folder representation clear, click1 refers to the index of the image that corresponds to the associated ground-truth and displacement values. Also, time normalization is performed due to the lack of a common trigger, and ground-truth downsampling is carried out because the ATI Nano sensor's sampling frequency is far greater than the received frames per second (since the marker tracking process is performed offline, the number of displacements, i.e. of markers' coordinates, corresponds to the number of acquired images). Some experiments were conducted to see whether introducing a linear upsampling of the displacements would lead to better results, thanks to
a higher number of training samples. In some cases this approach seemed to worsen the force estimates, maybe due to a linearization of highly nonlinear data, so it was not adopted to create the final dataset.
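The time normalization and ground-truth downsampling described above amount to resampling the high-rate force signal at the image timestamps; the function below is a hypothetical illustration (using np.interp, which may differ from the pipeline's actual implementation).

```python
import numpy as np

def downsample_ground_truth(force_t, force_vals, frame_t):
    """Resample high-rate force samples at the camera frame timestamps.

    force_t:    timestamps of the ATI readings (high rate)
    force_vals: force values at those timestamps
    frame_t:    timestamps of the acquired images (low rate)
    Both time axes are first normalized to [0, 1], since the two
    streams lack a common trigger.
    """
    force_t = (np.asarray(force_t) - force_t[0]) / (force_t[-1] - force_t[0])
    frame_t = (np.asarray(frame_t) - frame_t[0]) / (frame_t[-1] - frame_t[0])
    return np.interp(frame_t, force_t, force_vals)

# Example: a 1000-sample force stream resampled at 10 camera frames
t_force = np.linspace(0.0, 1.0, 1000)
f_force = np.sin(t_force)
t_frames = np.linspace(2.0, 3.0, 10)   # different clock origin
f_at_frames = downsample_ground_truth(t_force, f_force, t_frames)
```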
Once all the raw data was cleaned and synchronized, the dataset was split into 80% train and 20% test, both manually and using the "sklearn.model_selection.train_test_split()" function. Also, in addition to the mentioned trials, some dynamic tests were manually performed and synchronized to check the models' robustness and ability to generalize.
Chapter 4

Experimental results

In this Chapter all the utilized force estimation approaches are presented, briefly explaining the exploited algorithms, ranging from linear approximation to Machine Learning and Deep Learning techniques. After that, the achieved results are compared based on the Mean Squared Error metric.

4.1 Force estimation approaches

Several approaches were exploited and compared to perform force estimation, ranging from simple linear approximations to Machine Learning and Deep Learning algorithms. In particular, we used:

• a linear elastic force approximation based on [78];

• a non-linear and more sophisticated variant of [78];

• a properly tuned Linear Regression model implemented by sklearn [60];

• a properly tuned K-Neighbors Regressor model implemented by sklearn [59];
• a properly tuned Support Vector Regression model implemented by sklearn [62];

• a Neural Network Sequential model implemented through keras [37];

• a Deep Convolutional Neural Network model implemented through keras [36].
4.1.1 Machine Learning vs Deep Learning

In this Subsection a brief introduction to the Machine Learning pipeline is presented. Then, we discuss the main differences between Machine Learning and Deep Learning approaches, briefly explaining what a Neural Network is. The typical pipeline is composed of the following steps:

• Data preparation: collection, "cleaning" and pre-processing of the required input data (images or numerical data);

• Feature extraction: retrieving more manageable information that still describes the raw data and is suitable for modelling;

• Feature selection: reducing the dimensionality of the feature space, keeping only the most relevant features to train the model. In Machine Learning approaches, feature selection is driven by prior knowledge;

• Model selection: choosing a statistical model and tuning its hyperparameters to solve the regression problem. Model selection is also driven by prior knowledge in Machine Learning approaches;

• Model training: the process that uses the training dataset to build a model able to classify new samples belonging to the test set;

• Prediction: the model's output decision, based on its acquired knowledge;
• Model testing: testing the previously trained model with unseen input data to check its performance. The aim of model training is to build a general model that can classify new samples, avoiding overfitting on the train set.

The main difference between Machine Learning and Deep Learning approaches lies in the amount of prior knowledge required. In fact, when Neural Networks are used, not only feature selection and model selection, but also the feature extraction and prediction processes require considerably less hand-crafted prior knowledge: features are continuously learned during model training, through the layers of convolutional filters at the core of CNN architectures. Another important aspect that we considered when testing the real-time implementation was the required prediction time. In fact, Machine Learning algorithms tend to be faster to train and evaluate than Deep Learning architectures. Figure 4.1 schematizes the differences between Machine Learning and Deep Learning, while Figure 4.2 shows a typical Deep-CNN structure.

Figure 4.1: The main differences between Machine Learning and Deep Learning, and an example of a Deep-CNN [42].
4.1.2 Linear estimation

The first method we focused on was a linear approximation based on the elastic force formulation discussed in "Implementing Tactile Behaviors Using FingerVision" [78] and "FingerVision for Tactile Behaviors, Manipulation, and Haptic Feedback Teleoperation" [77]. Figure 4.3, extracted from paper [78], shows the detected markers' trajectories when the almost planar surface is deformed.

Let (dx, dy) be the horizontal displacement of each marker from its initial position and let (cx, cy, cz) be the constant elastic coefficients; a force estimate
Figure 4.2: The typical architecture of a Deep-CNN [71].

Figure 4.3: On the left, the detected markers' movement using SimpleBlobDetector; on the right, an example of marker movements when a normal force is applied [78].
applied at each marker is given by

$[f_x, f_y, f_z] = \left[\, c_x d_x,\; c_y d_y,\; c_z \sqrt{d_x^2 + d_y^2}\,\right].$  (4.1)

So, the overall force estimates of the sensing device are defined as the average of the single forces:

$[F_x, F_y, F_z] = \left[\, \frac{1}{29}\sum_{i=0}^{28} f_{x_i},\; \frac{1}{29}\sum_{i=0}^{28} f_{y_i},\; \frac{1}{29}\sum_{i=0}^{28} f_{z_i}\,\right].$  (4.2)

This approach assumes that every marker has the same impact on the force estimates and that there is a linear relation between horizontal/vertical displacements and forces. Regarding the normal force estimation, the authors use the norm of the markers' position change, since it is more stable and less affected by noise than the radius reading of the BlobDetector. As mentioned in Chapter 2, this approximation is really put to the test in our case study, since our surface is hemispherical rather than planar as in [78].

In this case the model is fairly simple, being linear and composed of 3 constant parameters. To retrieve the average stiffness coefficients along the 3 main directions, we used the following approach (a code snippet is shown in Listing 4.1):

• consider the train dataset's (Fx, Fy, Fz) ground truths and pixel horizontal/vertical displacements;

• if the ground truths are greater than a threshold, compute the three stiffness coefficients for each of the 29 markers by inverting Equation 4.1 (avoiding the division-by-zero exception);

• once the previous steps are repeated over all the training samples (among the several recorded experiments), the final (Cx, Cy, Cz) coefficients are computed as the median (instead of the mean) of all the collected values.
Listing 4.1: Python snippet of the stiffness coefficients' estimation, as suggested by [78]

    # [...]
    # Avoid having high C_hat coeffs. because of
    # near-to-zero displacements
    th_from_zero = 0.1
    if dx == 0:
        dx = th_from_zero
    elif abs(dx) < th_from_zero:
        dx = np.sign(dx) * th_from_zero
    if dy == 0:
        dy = th_from_zero
    elif abs(dy) < th_from_zero:
        dy = np.sign(dy) * th_from_zero
    # [...]
    # Linear formulas from the paper:
    # compute coeffs. only if the ground truth is meaningful
    force_th = 0.1
    if abs(force_gt_x) > force_th:
        Cx_hat_final.append(force_gt_x / dx)
    if abs(force_gt_y) > force_th:
        Cy_hat_final.append(force_gt_y / dy)
    if abs(force_gt_z) > force_th:
        Cz_hat_final.append(force_gt_z / np.sqrt(dx**2 + dy**2))
    # [...]
    # Compute the final stiffness coefficients considering
    # the median value
    Cx_hat_final = statistics.median(Cx_hat_final)
    Cy_hat_final = statistics.median(Cy_hat_final)
    Cz_hat_final = statistics.median(Cz_hat_final)

This linear model has the advantage of being easy to understand, implement and improve by complicating the force/displacement relations. In particular, we decided to compute the 3 stiffness coefficients using the median operator instead of the mean, since the median is more robust to outliers when the data is not normally distributed, as graphically shown in Figure 4.4 [45].
To further motivate this decision, Figure 4.5 shows histograms of the stiffness coefficients computed during each iteration. All these values are stored and, in the end, the 3 final coefficients are retrieved by selecting either the mean, mode or median value. Considering that the created dataset is balanced and wide regarding normal forces but fairly limited in the range of the X and Y components, the Cx and Cy coefficients seem normally distributed, while the Cz values are fairly skewed.

After the training process, the 3 constant coefficients are estimated, and testing can be done by directly applying Equation 4.1 and Equation 4.2 together to retrieve the total force components.
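As a sketch, once the three coefficients have been estimated, testing reduces to a direct application of Equations 4.1 and 4.2; the coefficient values and displacements below are made up for illustration.

```python
import numpy as np

def linear_force_estimate(dx, dy, Cx, Cy, Cz):
    """Apply Equations 4.1 and 4.2: per-marker elastic forces,
    then the average over the 29 markers."""
    dx, dy = np.asarray(dx), np.asarray(dy)
    fx = Cx * dx
    fy = Cy * dy
    fz = Cz * np.sqrt(dx**2 + dy**2)
    return fx.mean(), fy.mean(), fz.mean()

# 29 small synthetic pixel displacements, hypothetical stiffness values
rng = np.random.default_rng(0)
dx = rng.normal(0.0, 0.5, 29)
dy = rng.normal(0.0, 0.5, 29)
Fx, Fy, Fz = linear_force_estimate(dx, dy, Cx=0.8, Cy=0.8, Cz=1.5)
```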
Figure 4.4: Mode, mean and median in 3 different data distribution scenarios [45].

4.1.3 Non-linearly compensated and marker-location-based estimation

As mentioned, this method takes inspiration from the previous one, but tries to overcome some of its limitations due to the non-planar shape of our dome. In fact, it is more reasonable to approximate a quasi-planar surface with a linear elastic behavior than a hemispherical one. Another assumption is that every marker, meaning every region of the material, behaves the same in terms of stiffness, deformation and elastic properties. Due to the several steps required to manufacture the elastic dome, we experienced that many variables can affect the model's stiffness and therefore its response to external forces. We think that the previous method can rapidly give a good indication of how stiff the material is, but it could also lack generalization and robustness capabilities. For these reasons, we tried to improve this method by:

• considering the 29 markers independent of each other;
Figure 4.5: Histograms of the computed Cx, Cy, Cz coefficients during each iteration.
• non-equally distributing the ground-truth forces;

• creating a 29 × 3 stiffness matrix (instead of a 1 × 3 stiffness vector) so that every marker has an elastic coefficient for every direction;

• applying force estimation using the same equations as before (Equations 4.1 and 4.2) but using the estimated stiffness corresponding to the k-th marker.

How the 29 markers are properly sorted is explained in Subsection 4.1.9; for now, let's assume that the input marker trajectories are sorted as shown in Figure 4.6.

As Listing 4.2 shows, the ground-truth forces are compensated with coefficients that were tuned based on the outcomes. In particular, the logic behind their tuning is based on the percentage of the force that should be "absorbed" by the region around the considered marker. Assuming that a normal force is applied to the center of the dome, the central marker should be still no matter

Figure 4.6: Sorting order of the 29 fiducial markers.

the intensity of the normal force, while the least amount of force should be exerted on the farthest markers. If needed, this reasoning could be generalized by estimating the application point of the external force.
On this subject, a simple estimation of the force's application point is proposed (but not integrated with the mentioned ground-truth compensation).

    Cx_hat, Cy_hat, Cz_hat = [], [], []
    for k_it in range(num_iterations):
        # Ground truth forces corresponding to the k-th iteration
        force_gt_x, force_gt_y, force_gt_z = train_labels[k_it, :]

        Cx_hat_it, Cy_hat_it, Cz_hat_it = [], [], []
        for marker_id in range(num_markers):
            dx = all_dx[k_it, marker_id]
            dy = all_dy[k_it, marker_id]
            # -------------------------------------#
            if marker_id == 0:  # central marker
                compensation_fact = 0
            elif marker_id in [4, 5, 12, 13]:
                compensation_fact = 0.2 / 4
            elif marker_id in [19, 20, 25, 26]:
                compensation_fact = 0.3 / 4
            elif marker_id in [3, 6, 11, 14]:
                compensation_fact = 0.2 / 4
            elif marker_id in [2, 7, 18, 27, 21, 24, 10, 15]:
                compensation_fact = 0.15 / 8
            elif marker_id in [1, 8, 9, 16, 17, 22, 23, 28]:
                compensation_fact = 0.15 / 8
            force_gt_x_new = force_gt_x * compensation_fact
            force_gt_y_new = force_gt_y * compensation_fact
            force_gt_z_new = force_gt_z * compensation_fact

Listing 4.2: Python snippet of the proposed ground-truth compensation based on the marker location.
To retrieve the force's application point, the idea is to compute a weighted average of the initial markers' coordinates, depending on their radius increment. In this way, we exploit the correlation between the size increase of a detected blob and where the cause of the deformation comes from. Figure 4.7 shows a sequence of images collected by the fish-eye camera, with the superimposed estimate of the application point (green) and area (blue), depending on the force's magnitude.
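A minimal sketch of such a weighted average is given below; the function and variable names are illustrative, not taken from the thesis code.

```python
import numpy as np

def application_point(initial_uv, radius_increase):
    """Estimate the contact point as the average of the initial
    marker positions, weighted by each marker's radius increase."""
    w = np.clip(np.asarray(radius_increase, dtype=float), 0.0, None)
    if w.sum() == 0.0:          # no deformation detected
        return None
    w /= w.sum()                # normalize weights to sum to 1
    return w @ np.asarray(initial_uv, dtype=float)

# Three markers; only the second one grows, so the estimate
# collapses onto its position
uv = [[100.0, 100.0], [160.0, 120.0], [220.0, 140.0]]
point = application_point(uv, [0.0, 2.5, 0.0])
```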
Once the ground-truth forces are compensated, the implementation is similar to Listing 4.1, with the main difference that in this case 29 stiffness coefficients are estimated for each direction. Finally, after training, two testing approaches can be exploited: compute the (Fx, Fy, Fz) forces using as stiffness scalar the mean or median value for every direction (Equation 4.6); or compute the resultant forces as a summation of the single components that every marker absorbs (Equation 4.7).

Figure 4.7: A sequence of images showing the estimated application point (green) and area (blue), depending on the applied force's magnitude.
4.1.4 Linear Regression model

The ordinary least squares Linear Regression model has been used, exploiting the sklearn [60] library. As mentioned in the scikit-learn documentation, this model is fitted to the training data in order to minimize the residual sum of squares between the observed targets in the dataset and the predictions made by the linear approximation. In this case there are no hyperparameters to be tuned.
4.1.5 K-Neighbors Regressor model

The sklearn [59] library implements a regressor model based on the K-Nearest Neighbors model, typically used for classification tasks. As graphically shown in Figure 4.8, given a set of X training samples and a new sample to classify, the distance between the new point and all the other points is computed; the new sample is then classified as belonging to the most frequent class appearing inside the new point's "neighbourhood".
On the other hand, the K-Neighbors Regressor model can be used when the dataset's labels are continuous rather than discrete variables. The label assigned to a query point is computed based on the mean of the labels of its nearest neighbors.

For reference, the per-marker stiffness vectors and the two force estimates introduced in Subsection 4.1.3 (Equations 4.3 to 4.7) are:

$C_x = [c_{x_0}, c_{x_1}, \ldots, c_{x_{28}}]$  (4.3)

$C_y = [c_{y_0}, c_{y_1}, \ldots, c_{y_{28}}]$  (4.4)

$C_z = [c_{z_0}, c_{z_1}, \ldots, c_{z_{28}}]$  (4.5)

$[F_x, F_y, F_z] = \left[\, \frac{1}{29}\sum_{i=0}^{28} \bar{C}_x d_{x_i},\; \frac{1}{29}\sum_{i=0}^{28} \bar{C}_y d_{y_i},\; \frac{1}{29}\sum_{i=0}^{28} \bar{C}_z \sqrt{d_{x_i}^2 + d_{y_i}^2}\,\right]$  (4.6)

$[F_x, F_y, F_z] = \left[\, \sum_{i=0}^{28} c_{x_i} d_{x_i},\; \sum_{i=0}^{28} c_{y_i} d_{y_i},\; \sum_{i=0}^{28} c_{z_i} \sqrt{d_{x_i}^2 + d_{y_i}^2}\,\right]$  (4.7)

where $\bar{C}_x$, $\bar{C}_y$, $\bar{C}_z$ denote the mean (or median) of the corresponding stiffness vectors.

Figure 4.8: Graphical representation of the KNN algorithm [13].

The main hyperparameters to be tuned are:
• n_neighbors: the number of samples to be considered as neighbors (K);

• metric: the distance metric used for detecting the closest neighbors (minkowski, cityblock, cosine, euclidean);

• weights: the weighting policy applied to the neighborhood (uniform, distance). Figure 4.9 shows how the "weights" parameter can affect the estimation results.

To choose the best parameters, the sklearn.model_selection.GridSearchCV() function is used (not only here, but also for the following algorithms). To briefly explain this function: given the hyperparameters to try, it iterates among all the possible combinations, returning the best one (based on the accuracy achieved on the train set). Once the grid search is performed, the model with the best parameters is created and fitted on the training data.
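A sketch of this grid search for the K-Neighbors Regressor is shown below, on synthetic stand-in data; the parameter grid is an example, not the exact grid used for the sensor.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsRegressor

# Synthetic stand-in for the (displacement features -> force) dataset
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 4))
y = X[:, 0] * 0.8 + X[:, 1] ** 2          # one force component

# Hyperparameter grid (values here are illustrative)
param_grid = {
    "n_neighbors": [3, 5, 7, 9],
    "weights": ["uniform", "distance"],
    "metric": ["minkowski", "euclidean"],
}
search = GridSearchCV(KNeighborsRegressor(), param_grid, cv=5)
search.fit(X, y)
best_knn = search.best_estimator_
```

GridSearchCV exhaustively tries every combination in the grid with cross-validation, so the grid size should stay small when prediction time matters.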
Figure 4.9: The effect of the "weights" parameter on the estimates. The default value is "uniform" and assigns equal weights to all points; "distance" assigns weights proportional to the inverse of the distance from the query point [59].
4.1.6 Support Vector Regression model

The sklearn [62] library implements a regressor model based on the Support Vector Machine model, typically used for classification tasks. SVM is a binary classification technique that uses the training dataset to predict an optimal hyperplane in an N-dimensional space, separating the data into two classes. The identified hyperplane is called decision boundary and, by definition, it always has one dimension less than the data space it is built in (e.g. a line in 2D space is a hyperplane of dimension 1). To identify the optimal decision boundary that clearly separates the different classes, SVM uses Support Vectors, which are the data points closest to the edge of each class and the most difficult points to classify correctly. The other, more generic data points are ignored when determining the boundary. The distances between the hyperplane and the support vectors are called margins, and the model maximizes them to retrieve the optimal decision boundary. Support
Vector Machines are able to separate both linearly and non-linearly distributed data, using the so-called kernel trick. A kernel is a function that can be used to transform a dataset into a higher-dimensional space so that the data becomes linearly separable. The kernel trick effectively converts a non-separable problem into a separable one by increasing the number of dimensions of the problem space and mapping the data points into the new space. Figure 4.10 shows how SVM can be applied to linearly separable data using the "linear" kernel; Figure 4.11 shows some examples of non-linearly separable data and how they can be separated using non-linear decision boundaries.

Moreover, SVM, and consequently SVR, does not support multi-output estimation natively, so we trained 3 different models, one for each component of the force. In particular, we used the sklearn.svm.SVR() function to exploit the so-called Support Vector Regression model. The implementation is based on the libsvm library; it takes the concept of Support Vector Machines and re-adapts it to a continuous estimation problem, that is, regression. The main hyperparameters we tuned are:

• kernel: the type of kernel (linear, poly, rbf, sigmoid);

• C: the cost parameter, which regularizes the model with an inverse proportionality (the larger C, the weaker the regularization).
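Since SVR predicts a single output, the three per-component models can be sketched as follows; the synthetic data, kernel and C values are illustrative.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(150, 6))              # displacement features
Y = rng.normal(size=(150, 3))              # (Fx, Fy, Fz) ground truths

# One SVR per force component, since SVR is single-output
models = [SVR(kernel="rbf", C=1.0).fit(X, Y[:, i]) for i in range(3)]

def predict_force(x):
    """Stack the three single-component predictions into (n, 3)."""
    x = np.atleast_2d(x)
    return np.column_stack([m.predict(x) for m in models])

pred = predict_force(X[:5])
```

An equivalent alternative would be sklearn's MultiOutputRegressor wrapper, which fits one clone of the estimator per target internally.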
Figure 4.10: Application of SVM in the case of linearly distributed data [50].
4.1.7 Neural Network Sequential model

We used the keras [37] library to create a simple Neural Network Sequential model. In particular, Figure 4.12 shows the Neural Network's structure, which is composed of a normalization layer fitted on the training dataset, two hidden (Dense) layers with a depth of 64 and Rectified Linear Unit (ReLU) activations, and a final linear Dense layer that outputs the three estimated force components. The model was compiled using a standard Adam optimizer to minimize the Mean Squared Error (MSE). This rather simple model has a total of 8248 parameters, of which 8131 are trainable.
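The quoted parameter count is consistent with 58 input features (e.g. the 58 sorted horizontal/vertical displacements of Subsection 4.1.9), assuming a Keras Normalization layer whose mean, variance and count account for the 117 non-trainable parameters; this reconstruction is ours, sketched and checked below.

```python
# Sketch of the Sequential model described above (assuming 58 input
# features), in keras-style pseudocode:
#
#   model = keras.Sequential([
#       keras.layers.Normalization(input_shape=(58,)),  # fitted on train set
#       keras.layers.Dense(64, activation="relu"),
#       keras.layers.Dense(64, activation="relu"),
#       keras.layers.Dense(3),                          # (Fx, Fy, Fz)
#   ])
#   model.compile(optimizer="adam", loss="mse")
#
# The quoted parameter count checks out for 58 inputs:
n_in, h, n_out = 58, 64, 3
trainable = (n_in * h + h) + (h * h + h) + (h * n_out + n_out)
non_trainable = 2 * n_in + 1      # Normalization: mean, variance, count
total = trainable + non_trainable
```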
Figure 4.11: Application of SVM in the case of non-linearly distributed data [50].

4.1.8 Deep Convolutional Neural Network model

Finally, we exploited a Deep-CNN model implemented with the keras [36] library. In particular, several models were tested and, based on the Mean Squared Error metric, we chose the best performing architecture and the respective hyperparameters. In this case the models are "deep" in the sense that the number of trainable parameters is in the order of millions, while the number of hidden layers (which determines the depth) goes from 16 (e.g. VGG16) all the way to 152 (e.g. ResNet152). Such models tend to be time-consuming to train and are hardly interpretable, meaning that it is difficult to understand why certain decisions or predictions have been made. Nonetheless, the performance achievable thanks to the deep extracted features can be outstanding, especially when the tasks to perform are non-trivial; later in the Chapter we discuss whether this is required in our case study. Moreover, this is the only case in which we did not feed the model time series, but images instead. As mentioned, CNNs are built to work with images through the use
Figure 4.12: Left: a graphical representation of the chosen Sequential model using the plot_model() function; right: a generic example of a 2-hidden-layer Neural Network [12].

of convolutions. So, we fed the model both binarized images, with a white background and black dots representing the 29 fiducial markers, and (during another training instance) raw RGB images recorded by the fish-eye camera (examples are shown in Figure 4.13).
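The binarization step can be sketched with a plain intensity threshold; the threshold value and the toy frame below are assumptions for illustration, not the thesis' actual preprocessing parameters.

```python
import numpy as np

def binarize_frame(gray, threshold=60):
    """Turn a grayscale frame into a white background with black
    dots: pixels darker than the threshold (the markers) stay black."""
    out = np.full_like(gray, 255, dtype=np.uint8)
    out[gray < threshold] = 0
    return out

# Tiny synthetic frame: one dark "marker" on a bright background
frame = np.full((8, 8), 200, dtype=np.uint8)
frame[3:5, 3:5] = 10
binary = binarize_frame(frame)
```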
4.1.9 Feature extraction

Important aspects when training a Machine Learning or Deep Learning model are feature selection and extraction. After collecting and pre-processing the dataset, we first tried to extract features without shuffling the samples and noticed that none of the algorithms performed well enough. To make the models more robust and make them converge faster, the sklearn.model_selection.train_test_split() function was found to be essential. As shown in Listing 4.3, the whole dataset is divided into 80% train, of which 10% is used for validation during training, and 20% test. To give a quantitative measure, the training set is composed of 897 samples, followed by the validation set with 100 samples and the testing set with 250 samples.
    from sklearn.model_selection import train_test_split
    # Split dataset between train and test
    train_features, test_features, train_labels, test_labels = \
        train_test_split(all_features, all_labels, test_size=0.2,
                         random_state=42)
    # Split trainset between train and validation
    train_features, valid_features, train_labels, valid_labels = \
        train_test_split(train_features, train_labels, test_size=0.1,
                         random_state=42)

Listing 4.3: Python snippet showing how the dataset is divided between train, test and validation.

Figure 4.13: On the left, the raw image sensed by the fish-eye camera; on the right, the binarized image.
Features were extracted with several different methods, in order to compare the results and understand the amount of information and depth required to properly characterize the sensing device. In fact, using 2 or 3 features instead of millions, as in Deep Neural Network approaches, can seem a huge loss of information; on the other hand, the DCNN could easily overfit the data, while a simple linear estimation could generalize better thanks to a feature extraction with limited depth. We implemented and tested 8 different methods for feature extraction (on top of which there is the possibility of scaling the features through a MinMaxScaler):

1. the average displacement of the 29 markers along the horizontal and vertical axes (2 features);

2. the average displacement of the 29 markers along the horizontal and vertical axes, plus the average radius increment and decrement (4 features);

3. the average absolute displacement of the 29 markers along the horizontal and vertical axes (2 features);

4. the average absolute displacement of the 29 markers along the horizontal and vertical axes, plus the radius measurement (3 features);
5. the 29 sorted horizontal and vertical coordinates of the markers (58 features);

6. the 29 sorted coordinates and radii of the markers (87 features);

7. the 29 sorted horizontal and vertical displacements of the markers (58 features);

8. the 29 sorted horizontal, vertical and radius displacements of the markers (87 features).
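As an illustration, the simplest options reduce the 29 tracked markers to a handful of numbers; the sketch below shows hypothetical implementations of options 1 and 3 on made-up displacement arrays.

```python
import numpy as np

def extract_features_option1(du, dv):
    """Average horizontal and vertical displacement of the 29
    markers (feature option 1: 2 features)."""
    return np.array([np.mean(du), np.mean(dv)])

def extract_features_option3(du, dv):
    """Average *absolute* horizontal and vertical displacement
    (feature option 3: 2 features)."""
    return np.array([np.mean(np.abs(du)), np.mean(np.abs(dv))])

rng = np.random.default_rng(7)
du, dv = rng.normal(0, 1, 29), rng.normal(0, 1, 29)
f1 = extract_features_option1(du, dv)
f3 = extract_features_option3(du, dv)
```

Note that the absolute variant is always at least as large in magnitude as the signed average, since opposite displacements no longer cancel out.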
To briefly explain how the 29 markers were sorted, the developed algorithm can be summed up with the following steps (assuming that the dome is installed in a fixed position with respect to the inner camera):

• detect the 29 markers in the first frame;

• given the markers' pattern as prior information, use the RANSAC (RANdom SAmple Consensus) [61] algorithm to fit the vertical line;

• sort the vertical inliers according to the indexing shown in Figure 4.14;

• use the RANSAC algorithm to fit the horizontal line on the remaining 20 markers (considered outliers in the previous step, because they do not lie on the vertical line);

• sort the horizontal inliers according to Figure 4.14;

• repeat the same process for the remaining markers placed on the two diagonals;

• whenever a new array of 29 coordinates is given, it is automatically sorted (using the Euclidean distance metric) according to the initial sorting, which in this case does not change between different trials.
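The first RANSAC step above can be sketched with sklearn's RANSACRegressor on a synthetic marker layout; regressing u on v for the near-vertical line (to avoid an infinite slope) is our implementation choice, not necessarily the one used in the actual code.

```python
import numpy as np
from sklearn.linear_model import RANSACRegressor

rng = np.random.default_rng(1)
# Synthetic layout: 9 markers on a vertical line (u ~ 160 px)
# plus 20 scattered ones
v_line = np.linspace(40, 280, 9)
line_pts = np.column_stack([160 + rng.normal(0, 1.0, 9), v_line])
others = np.column_stack([rng.uniform(0, 320, 20),
                          rng.uniform(0, 320, 20)])
markers = np.vstack([line_pts, others])  # (29, 2) array of (u, v)

# Regress u on v so the (near-vertical) line has a finite slope
ransac = RANSACRegressor(residual_threshold=5.0, random_state=0)
ransac.fit(markers[:, 1:2], markers[:, 0])
inliers = ransac.inlier_mask_

# Sort the vertical inliers top-to-bottom (by v), as in Figure 4.14
vertical_sorted = markers[inliers][np.argsort(markers[inliers][:, 1])]
```

The remaining markers (the outliers of this step) would then be fed to the next RANSAC fit for the horizontal line.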
4.2 Comparison of the results

In this Section the obtained results are presented, starting from the evaluation on the testing portion of the collected dataset, followed by the online validation with the robotic gripper.

4.2.1 Evaluation on the test set

Figures 4.18, 4.19, 4.20 and 4.21 show the best results achieved when estimating the (Fx, Fy, Fz) force components with all the models that require numerical data (KNN, SVR, LR, Sequential NN, linear and non-linear methods). Moreover, Figures 4.15, 4.16 and 4.17 show a comparison of the achieved MSE values for each feature type/scaling combination (a total of 16 possibilities). To better understand the following graphs, some observations should be made:

• Mean Squared Error is the evaluation metric used to compare the results, even though the code allows computing other metrics (such as Root Mean Squared Error, R2 score, Mean Absolute Error);
Figure 4.14: Sorting order of the 29 fiducial markers.
• force estimates whose MSE exceeds 100% are not plotted; this keeps the graphs more readable by removing uninformative curves. The same concept applies to the histograms (Figures 4.15, 4.16 and 4.17), where the insignificant results are represented by negative MSE values;

• all the models were trained, validated and tested on the same portions of the dataset, randomly split by the mentioned train_test_split() function;

• the histograms shown in Figures 4.15, 4.16 and 4.17 depict the MSE of the 16 feature type/scaling combinations. This comparison is useful to understand whether any improvement is achieved when increasing the number and complexity of the features, and to visualize the effect of feature scaling.

Among all the possible combinations, the results of the best performing options are shown. In particular, Figure 4.18 shows the force estimates when feature option number 2 is selected (Option 2. in the list); due to the low number of features, the best result is achieved by KNN with an MSE of 14.1%. Figure 4.19 shows the force estimates when feature option number 5 is selected (Option 5. in the list); in this case 58 features representing each marker's horizontal and vertical coordinates are extracted, and the best achieved MSE is 6.3%, scored by the Linear Regressor. Figure 4.20 shows the force estimates using 87 features, similarly to the previous option but including the radii (Option 6. in the list). Figure 4.21 shows the force estimates using the 58 sorted horizontal and vertical displacements (Option 7. in the list), improving the performance of KNN to an MSE of 2.86%.
As shown in Figures 4.18, 4.19, 4.20 and 4.21, the normal force Fz is estimated with a considerable MSE that, depending on the type of extracted feature and the model used, ranges from 2% to above 100%. Moreover, due to how the sensing device was calibrated, forces are not estimated with the same reliability along the 3 directions: the dataset is strongly limited along (X, Y), while the Z ground truths are more linearly and widely distributed.

Figure 4.15: Histogram plot showing the Fx estimation Mean Squared Error depending on the feature type/scaling combination (negative bars correspond to errors above 100%).

Figure 4.16: Histogram plot showing the Fy estimation Mean Squared Error depending on the feature type/scaling combination (negative bars correspond to errors above 100%).

Figure 4.17: Histogram plot showing the Fz estimation Mean Squared Error depending on the feature type/scaling combination (negative bars correspond to errors above 100%).

Figure 4.18: Comparison of the estimated force components, considering feature option number 2 (Option 2. in the list) with feature scaling.

Figure 4.19: Comparison of the estimated force components, considering feature option number 5 (Option 5. in the list) with feature scaling.

Figure 4.20: Comparison of the estimated force components, considering feature option number 6 (Option 6. in the list) with feature scaling.

Figure 4.21: Comparison of the estimated force components, considering feature option number 7 (Option 7. in the list) without feature scaling.
We think that a non-negligible bias was introduced during data collection and pre-processing, resulting in limited estimation accuracy. Nonetheless, the developed software pipelines can easily be exploited to re-calibrate the sensing device on a wider dataset, including significant shear (tangential) forces. We verified that the device is sensitive enough to perceive normal deformations of 1 mm, suggesting that a wider data collection could lead to significant improvements of the Machine Learning models.
Figure 4.22 shows force estimates without shuffling the test set; this allows us to better visualize the forces' trends. As we can see, the ground truths are sometimes affected by spikes and noise, even though a moving average is applied, which could worsen the models' performance. Moreover, the estimates tend to correctly follow the increasing or decreasing trends, but more slowly than the ground truths.
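The moving-average smoothing mentioned above can be sketched as follows; the window length and the synthetic spiky trace are illustrative and not the values used during the actual pre-processing.

```python
import numpy as np

def moving_average(signal, window=5):
    """Centered moving average used to attenuate spikes and sensor noise
    in a force trace ('same' mode preserves the signal length)."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")

# Synthetic force trace: a linear ramp with one isolated spike,
# similar to the artifacts sometimes seen in the ground-truth logs.
t = np.linspace(0.0, 1.0, 100)
force = 2.0 * t
force[50] += 5.0                      # injected spike
smoothed = moving_average(force, window=5)
```

The spike's amplitude is divided by the window length while the smooth ramp passes through almost unchanged, which is why outliers in the ground truths are attenuated but not fully removed.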
Regarding the Deep Convolutional Neural Network, the achieved results proved to be comparable with the Machine Learning algorithms in terms of performance, with the main difference that the time required to retrieve predictions has to be taken into account. Figures 4.23 and 4.24 show the train and validation losses (MSE) when raw RGB images and binarized images, respectively, are used as input to the models. Figures 4.25 and 4.26 show the achieved Mean Squared Error when testing on the unseen portion of the dataset. In both cases, after a Grid Search process, we chose a ResNet50 Deep Convolutional Neural Network model pre-trained on ImageNet [31] with a learning rate of 0.001, a dropout probability of 0.5 and an Adam optimizer, which we trained for 50 epochs.
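The hyperparameter selection can be illustrated as an exhaustive grid search. The grid below contains the finally chosen values (learning rate 0.001, dropout 0.5), but the scoring function is a toy stand-in for actually training the ResNet50 for 50 epochs and measuring the validation MSE.

```python
from itertools import product

def validation_mse(learning_rate, dropout):
    """Hypothetical stand-in for 'train the network with these settings
    and return the validation MSE'; a toy score that happens to favor
    lr=0.001 and dropout=0.5."""
    return abs(learning_rate - 0.001) * 100 + abs(dropout - 0.5)

grid = {
    "learning_rate": [0.01, 0.001, 0.0001],
    "dropout": [0.3, 0.5, 0.7],
}

best_cfg, best_mse = None, float("inf")
for lr, p in product(grid["learning_rate"], grid["dropout"]):
    mse = validation_mse(lr, p)
    if mse < best_mse:                # keep the lowest validation MSE
        best_cfg, best_mse = (lr, p), mse
```

In the real pipeline each grid point costs a full training run, which is why the grid is kept small.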
Figure 4.22: Comparison of the estimated force components, considering feature option number 5 (Option 5. in the list) with feature scaling and without the shuffle option (to see the actual forces' trends).

Figure 4.23: ResNet50 train and validation loss considering raw RGB images as input.

Figure 4.24: ResNet50 train and validation loss considering binarized images as input.

Figure 4.25: ResNet50 predictions on the test set considering raw RGB images as input.

Figure 4.26: ResNet50 predictions on the test set considering binarized images as input.
Table 4.1 shows the best result achieved by every force estimation approach, in terms of Mean Squared Error, when evaluated on the test portion of the dataset. Note that only the Fz component is considered, as it is more representative of the actual models' performance. According to our experiments, KNN obtained the best results during testing, so we chose to use it during the harvesting task.
Table 4.1: Summary of the best performance achieved by every model evaluated on the test set (MSE values refer to the Fz component).

Model                                Feature option   Number of features   Feature scaling   Best MSE
Linear elastic force approximation   -                2                    -                 ≥ 200%
Non-linear compensation              -                58                   -                 ≥ 200%
Linear Regression                    6.               87                   Yes               6.16%
K-Neighbors Regressor                7.               58                   No                2.86%
Support Vector Regression            6.               87                   Yes               11.36%
Sequential Neural Network            6.               87                   Yes               6.72%
Deep Convolutional Neural Network    -                224 × 224 × 3        -                 14.38%

4.2.2 Evaluation with the robotic gripper

Once the initial dataset was exploited to both train and evaluate the algorithms, the fitted models were further evaluated on the real setup. To do that, the ATI Nano force/torque sensor was fixed in position and the robotic hand was closed at different speeds and widths (as shown in Figure 4.27). After collecting data from the sensor and the models, predictions were compared against the ground truth forces, as shown in Figures 4.28, 4.29 and 4.30. To acquire the necessary data, the ROS environment was used and a series of scripts were implemented to log images and JSON streams containing the estimated forces, the markers' trajectories and the application point. As previously done during the dataset collection, ground truth forces were stored inside a "rosbag" by reading the netft_data topic [76].
Figure 4.27: Photographs of the force estimation validation setup.
Figure 4.28: Estimated forces against ground truths during the validation phase, using the ATI Nano sensor and pressing with the developed gripper (1).

Figure 4.29: Estimated forces against ground truths during the validation phase, using the ATI Nano sensor and pressing with the developed gripper (2).

Figure 4.30: Estimated forces against ground truths during the validation phase, using the ATI Nano sensor and pressing with the developed gripper (3).
4.2.3 Real-time force feedback and strawberry detection
In this Subsection the fruit-picking task is attempted, and qualitative and quantitative results are shown. As mentioned at the beginning of this Thesis, the aim of a soft gripper is to gently grasp deformable objects without squeezing or damaging them. To demonstrate that this is possible, even though the gripper design and control architecture can be improved, we set up a series of strawberry plants, emulating a sort of hydroponic culture (shown in Figure 4.31).
Figure 4.31: Photograph showing how the strawberry plants were set up (emulating a hydroponic culture) to perform the picking task.

Afterwards, we tried to perform a simple harvesting task, starting from the detection of the ripe fruit, all the way to the picking of the strawberry with a suitable motion. This pipeline was implemented in ROS, with mainly Python scripts handling the nodes. The first developed ROS node provides the real-time force feedback using the online estimation pipeline based on the gripper's deformation (KNN was found to be the most reliable model to use); the second developed node takes care of fruit detection by exploiting a pre-trained CNN (YOLOv3) model that was fine-tuned on a small strawberry dataset for object detection (about 600 images). Figure 4.32 shows a qualitative result of the trained CNN for strawberry detection; it is able to locate the boundaries of the fruit while classifying it as ripe or unripe with a level of certainty.
The developed real-time pipeline requires some supervision, due to the temporary setup and a not perfectly tuned Neural Network, and consists of the following steps:
• as shown in Figure 4.33, the robotic arm is placed around 50 cm away from the plants and uses the RGBD camera information to both find ripe fruits and compute the 3D location of their central point (center of the estimated 2D bounding box) with respect to the camera;
• the estimated 3D point approximately represents the strawberry barycenter with respect to the camera. That coordinate is then transformed (exploiting the hand-eye calibration results) into the robot's base reference frame;

Figure 4.32: Qualitative result of the fine-tuned CNN, tested on one of the plants used for the setup shown in Figure 4.31.
• as shown in Figure 4.34, the gripper, still open, approaches the target point published on a specific ROS topic, according to a properly tuned impedance control law;
• once the gripper is correctly positioned, the closing command is published to the franka_gripper/MoveActionGoal topic;
• based on the received real-time force feedback, the gripper is stopped at its current position (by publishing the last width value) when the normal force reaches the 1.75 N threshold (Figure 4.37 shows the real-time force feedback estimated during the picking task);
• when the force feedback stabilizes, the gripper performs the suggested picking pattern for strawberries; in particular, it slowly rotates and moves back with respect to the plant, harvesting the fruit (this is represented by Figures 4.35 and 4.36).
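The frame transformation and stop condition in the steps above can be sketched as follows; the hand-eye matrix values are hypothetical placeholders, while the 1.75 N threshold matches the one used in the pipeline.

```python
import numpy as np

def camera_to_base(point_cam, T_base_cam):
    """Transform a 3D point from the camera frame to the robot base frame
    using the 4x4 homogeneous matrix from hand-eye calibration."""
    p = np.append(point_cam, 1.0)        # homogeneous coordinates
    return (T_base_cam @ p)[:3]

# Hypothetical hand-eye result: camera 0.5 m ahead of the base, no rotation.
T_base_cam = np.eye(4)
T_base_cam[:3, 3] = [0.5, 0.0, 0.0]
target_base = camera_to_base(np.array([0.0, 0.1, 0.4]), T_base_cam)

FORCE_THRESHOLD = 1.75  # [N], normal-force limit for stopping the closure

def should_stop(estimated_fz):
    """Hold the current gripper width once the estimated normal force
    reaches the threshold."""
    return estimated_fz >= FORCE_THRESHOLD
```

In the real system the estimated Fz comes from the KNN node, and the transformed target point is what gets published for the impedance controller to track.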
Figure 4.33: Phase 1: detection and localization of the ripe fruit.

Figure 4.34: Phase 2: approach to the ripe strawberry, given the 3D target point.

Figure 4.35: Phase 3: application of the picking pattern to harvest the strawberry.

Figure 4.36: Image sensed by the fish-eye camera when the dome is deformed, with the superimposed detected markers.

Figure 4.37: Online force feedback while grasping the strawberry (we used the KNN model to estimate the normal force and determine whether the 1.75 N force threshold has been reached).
Chapter 5 Conclusions

Most of the state-of-the-art work on soft tactile sensors focuses on finger-like or "palmar" soft grippers, which have a quasi-planar geometry, leading to simpler force estimation and overall characterization. We proposed a hemispherical, cheap-to-manufacture, 3D printed soft gripper that allows objects to be grasped gently. We characterized the sensing device by choosing a suitable placement and density of fiducial markers. By exploiting several Machine Learning and Deep Learning approaches, we estimated the forces exerted on the dome's surface, which proved reasonable and sufficient to attempt a picking task. In fact, even though several improvements can be made, the force estimates are stable, allowing us to easily set a threshold for stopping the Franka Hand gripper's closure. Moreover, in a real scenario it would be required to gently grasp and handle the fruit or vegetable to preserve its quality, meaning that fast pick-and-place control is probably neither required nor useful. The proposed force estimation techniques heavily rely on Computer Vision, Machine Learning algorithms and the calibration of the device. This means that software updates could surely improve the estimation performance and add new features without necessarily changing the design. Future developments could focus on this subject, providing
a fully automated calibration process. Also, the fruit detection and picking results should be made more robust against environmental disturbances. To conclude, this was a motivating and inspiring project to work on, also due to the interest that the agricultural industry is demonstrating in putting the "4.0 transition" into practice.
Figure 5.1: Photographs of the developed sensing device holding a strawberry without squeezing it.
Bibliography
[1] https://www.adafruit.com/product/1643, 2022.

[2] https://www.mccormick.it/as/agriculture-4-0-what-is-it-andwhat-are-its-tools-and-benefits, 2021.

[3] https://www.agrobot.com/e-series, 2022.
[4] Alex Alspach, Kunimatsu Hashimoto, Naveen Kuppuswamy, and Russ Tedrake. Soft-bubble: A highly compliant dense geometry tactile sensor for robot manipulation. IEEE, 2019.
[5] https://aerial-robotix.com/asctec-falcon-8/, 2022.

[6] https://www.ati-ia.com/products/ft/ft_models.aspx?id=nano17, 2022.
[7] Dominik Bauer, Cornelia Bauer, Arjun Lakshmipathy, and Nancy Pollard. Fully printable low-cost dexterous soft robotic manipulators for agriculture. AAAI 2022 Workshop, 2021.
[8] Anelise Borges and Natalie Huet. https://www.euronews.com/myeurope/2020/07/17/invisible-workers-underpaid-exploited-andput-at-risk-on-europe-s-farms, 2020.
[9] https://burro.ai/, 2022.

[10] https://www.cambridgeconsultants.com/press-releases/hankdexterous-robot-human-touch, 2022.
[11] Daniel Costa. https://www.epi.org/blog/the-farmworker-wagegap-continued-in-2020-farmworkers-and-h-2a-workers-earnedvery-low-wages-during-the-pandemic-even-compared-withother-low-wage-workers, 2021.
[12] https://cs231n.github.io/neural-networks-1/, 2023.

[13] https://www.datacamp.com/tutorial/k-nearest-neighborclassification-scikit-learn, 2023.
[14] Hoang Xuan Diem and Do Thi Thu Thuy. Prospects for agriculture 4.0 in developing countries: Case studies from Vietnam. IDE-JETRO, 2020.
[15] https://www.dji.com/it/mg-1p, 2022.
[16] Won Kyung Do and Monroe Kennedy. DenseTact: Optical tactile sensor for dense shape reconstruction. ICRA, 2022.
[17] https://fastled.io/, 2022.
[18] Alfonso J. Fernandez, Huan Weng, Paul B. Umbanhowar, and Kevin M. Lynch. Visiflex: A low-cost compliant tactile fingertip for force, torque, and contact sensing. IEEE, 2021.
[19] https://www.festo.com/, 2022.
[20] https://www.amazon.it/Tangxi-videocamera-RaspberryFotocamera-grandangolare/dp/B07WH1D4D4/ref=sr_1_3_sspa?__mk_it_IT=%C3%85M%C3%85%C5%BD%C3%95%C3%91&crid=1O0NE4N291TA3&keywords=camera+fisheye+raspberry&qid=1675413564&sprefix=camera+fish+eye+raspberr%2Caps%2C124&sr=8-3-spons&sp_csd=d2lkZ2V0TmFtZT1zcF9hdGY&psc=1&smid=A1UVPWM0U8T1NP, 2023.

[21] https://formlabs.com/it/3d-printers/form-2/, 2022.

[22] https://formlabs.com/materials/, 2022.

[23] https://www.franka.de/, 2022.

[24] https://www.generationrobots.com/media/panda-franka-emikadatasheet.pdf, 2022.
[25] Lucas Gerez, Che-Ming Chang, and Minas Liarokapis. Employing pneumatic, telescopic actuators for the development of soft and hybrid robotic grippers. Frontiers, 2020.
[26] https://github.com/rpi-ws281x/rpi-ws281x-python, 2022.
[27] Aimee Goncalves, Naveen Kuppuswamy, Andrew Beaulieu, Avinash Uttamchandani, Katherine M. Tsui, and Alex Alspach. Punyo-1: Soft tactile-sensing upper-body robot for large object manipulation and physical human interaction. IEEE, 2022.
[28] https://www.harvestcroorobotics.com/, 2022.
[29] Trung Thien Hoang, Jason Jia Sheng Quek, Mai Thanh Thai, Phuoc Thien Phan, Nigel Hamilton Lovell, and Thanh Nho Do. Soft robotic fabric gripper with gecko adhesion and variable stiffness. ScienceDirect, 2021.
[30] https://www.plugandplaytechcenter.com/resources/howautomation-transforming-farming-industry, 2022.
[31] https://image-net.org/about.php, 2023.
[32] https://www.intelrealsense.com/depth-camera-d435i/, 2022.

[33] http://www.interactivearchitecture.org/a-wearable-softrobot-with-variable-material-distribution.html, 2022.
[34] Snehal Jain, Saikrishna Dontu, Joanne Ee Mei Teoh, and Pablo Valdivia Y Alvarado. A multimodal, reconfigurable workspace soft gripper for advanced grasping tasks. Soft Robotics, 2020.
[35] https://www.joulin.com/company/vacuum-technology.html, 2022.

[36] https://keras.io/api/applications/, 2022.

[37] https://keras.io/guides/sequential_model/, 2022.
[38] Uikyum Kim, Dawoon Jung, Heeyoen Jeong, Jongwoo Park, Hyun-Mok Jung, Joono Cheong, Hyouk Ryeol Choi, Hyunmin Do, and Chanhun Park. Integrated linkage-driven dexterous anthropomorphic robotic hand. Nature, 2021.
[39] Maria Kondoyanni, Dimitrios Loukatos, Chrysanthos Maraveas, Christos Drosos, and Konstantinos G. Arvanitis. Bio-inspired robots and structures toward fostering the modernization of agriculture. MDPI, 2022.
[40] Naveen Kuppuswamy, Alex Alspach, Avinash Uttamchandani, Sam Creasey, Takuya Ikeda, and Russ Tedrake. Soft-bubble grippers for robust and perceptive manipulation. Soft Robotics, 2020.
[41] https://learnopencv.com/blob-detection-using-opencv-pythonc/, 2022.
[42] https://levity.ai/blog/difference-machine-learning-deeplearning, 2023.
[43] Sandra Q. Liu and Edward H. Adelson. GelSight Fin Ray: Incorporating tactile sensing into a soft compliant robotic gripper. IEEE, 2022.
[44] Mariangela Manti, Taimoor Hassan, Giovanni Passetti, Nicolò D'Elia, Cecilia Laschi, and Matteo Cianchetti. A bioinspired soft robotic gripper for adaptable and effective grasping. Soft Robotics, 2015.
[45] https://medium.com/@nhan.tran/mean-median-an-mode-instatistics-3359d3774b0b, 2023.

[46] https://www.naio-technologies.com/en/home/, 2022.
[47] Eduardo Navas, Roemi Fernández, Delia Sepúlveda, Manuel Armada, and Pablo Gonzalez de Santos. Soft grippers for automatic crop harvesting: A review. MDPI, 2021.
[48] Luiz F. P. Oliveira, António P. Moreira, and Manuel F. Silva. Advances in agriculture robotics: A state-of-the-art review and challenges ahead. MDPI, 2021.
[49] https://onrobot.com/, 2022.

[50] https://www.oreilly.com/library/view/python-machinelearning/9781783555130/ch03s04.html, 2023.
[51] Serena Giulia Pala. https://agronotizie.imagelinenetwork.com/agrimeccanica/2022/02/08/incentivi-40-opportunita-per-unagricoltura-piu-smart/73891, 2022.
[52] https://www.public.harvestai.com/, 2022.
[53] Shengjiang Quan, Xiao Liang, Hairui Zhu, Masahiro Hirano, and Yuji Yamakawa. HiVTac: A high-speed vision-based tactile sensor for precise and real-time force reconstruction with fewer markers. MDPI, 2022.
[54] https://www.raspberrypi.com/products/raspberry-pi-3-modelb/, 2022.
[55] https://www.raussendorf.de/en/fruit-robot.html, 2022.
[56] https://robotiq.com/it/, 2023.
[57] Tatsuya Sakuma, Felix von Drigalski, Ming Ding, Jun Takamatsu, and Tsukasa Ogasawara. A universal gripper using optical sensing to acquire tactile information and membrane deformation. IEEE, 2018.
[58] Rob B. N. Scharff, Dirk-Jan Boonstra, Laurence Willemet, Xi Lin, and Michael Wiertlewski. Rapid manufacturing of color-based hemispherical soft tactile fingertips. RoboSoft 2022, 2022.
[59] https://scikit-learn.org/stable/modules/generated/sklearn.neighbors.KNeighborsRegressor.html, 2022.

[60] https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LinearRegression.html, 2022.

[61] https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.RANSACRegressor.html, 2023.

[62] https://scikit-learn.org/stable/modules/generated/sklearn.svm.SVR.html, 2022.
[63] Jun Shintake, Vito Cacucciolo, Dario Floreano, and Herbert Shea. Soft robotic grippers. Advanced Materials, 2018.
[64] https://www.smooth-on.com/page/durometer-shore-hardnessscale/, 2022.

[65] https://www.robotics247.com/article/soft_robotics_raises_10m_to_help_meet_pandemic_induced_demand_for_food_automation, 2021.
[66] https://www.softroboticsinc.com/, 2022.

[67] https://www.solidworks.com/it/product/solidworks-simulation, 2022.
[68] https://store.arduino.cc/products/arduino-uno-rev3/?gclid=Cj0KCQiAic6eBhCoARIsANlox840SmKcsANukbxGxRVvHAKYanTWFUWyRZqJ-Cz-vqtDqEffMSQF58aAheIEALw_wcB, 2022.
[69] https://tertill.com/, 2022.

[70] https://www.tevel-tech.com/, 2022.

[71] https://towardsdatascience.com/a-comprehensive-guide-toconvolutional-neural-networks-the-eli5-way-3bd2b1164a53, 2023.
[72] https://www.universal-robots.com, 2022.
[73] Santiago Santos Valle and Josef Kienzle. Agricultural robotics and automated equipment for sustainable crop production. European Commission, 2020.
[74] https://www.vitirover.fr/, 2022.

[75] Benjamin Ward-Cherrier, Nicholas Pestell, Luke Cramphorn, Maria Elena Giannaccini, Benjamin Winstone, Jonathan Rossiter, and Nathan F. Lepora. The TacTip family: Soft optical tactile sensors with 3D-printed biomimetic morphologies. Soft Robotics, 2017.
[76] http://wiki.ros.org/netft_utils, 2023.
[77] Akihiko Yamaguchi. FingerVision for tactile behaviors, manipulation, and haptic feedback teleoperation. IEEJ, 2018.
[78] Akihiko Yamaguchi and Christopher G. Atkeson. Implementing tactile behaviors using FingerVision. IEEE, 2017.
[79] Wenzhen Yuan, Siyuan Dong, and Edward H. Adelson. GelSight: High-resolution robot tactile sensors for estimating geometry and force. MDPI, 2017.