2021 Ingenium: Journal of Undergraduate Research


Computer vision methods for tool guidance in a finger-mounted device for the blind

Yuxuan Hu, Rene R. Canady, B.S., Roberta Klatzky, Ph.D., and George Stetten, M.D., Ph.D.

The Visualization and Image Analysis (VIA) Laboratory, Department of Bioengineering

Yuxuan Hu is a Bioengineering senior at the University of Pittsburgh. He will be a graduate student next year at ETH Zurich, studying surgical vision and imaging.

Rene R. Canady received a B.S. in Bioengineering in 2020 from the University of Pittsburgh and is currently pursuing a Ph.D. in Sociology at Washington University in St. Louis. Her research interests include engineering ethnography and racial controversies in health.

Roberta Klatzky is a Professor of Psychology and Human-Computer Interaction at Carnegie Mellon University. She enjoys combining basic research with applications.

George Stetten is a Professor of Bioengineering at the University of Pittsburgh. He directs the Visualization and Image Analysis (VIA) Laboratory and the Music Engineering Laboratory (MEL) and is a Fellow of the American Institute for Medical and Biological Engineering.

Significance Statement

Tool handling and close-up operation are challenges for the visually impaired that have not been well addressed by assistive devices. We have developed novel, computationally efficient computer vision methods for real-time tool detection and target motion classification to improve “FingerSight,” a finger-mounted haptic device designed to help the visually impaired complete daily tasks.

Category: Methods

Keywords: Assistive Technology, Haptics, Computer Vision, Image Analysis

Abstract

People with visual impairment often have difficulty performing high-precision tasks such as interacting with a target using a tool. We propose a device for the visually impaired that provides accurate localization of targets via vibratory feedback, making tool-handling tasks easier. The device is an adaptation of our existing system, “FingerSight,” a finger-mounted device consisting of a camera and four vibrators (tactors) that respond to analysis of images from the camera. We report here on a new hardware design and on optimization of real-time algorithms for tool detection and motion classification. We determined that a tactor vibration duration of 90 ms yielded a minimum error rate (5%) in tactor identification, and that the best parameters for the tool recognition algorithm were threshold = 13 and kernel size = 11, yielding an average tracking error of 23 pixels in a 640 x 480 pixel camera frame. Anecdotal results obtained from a single healthy blindfolded subject demonstrate the functionality of the device as a whole and its potential to guide the visually impaired in manipulating tools in real-life scenarios.
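To make the reported parameters concrete, below is a minimal sketch of one plausible tool-segmentation pipeline, written in Python with OpenCV. Only the parameter values (threshold = 13, kernel size = 11) and the 640 x 480 frame size come from our results; the background-difference step, the morphological opening, and all function and variable names are illustrative assumptions, not the algorithm as implemented.

```python
# Hypothetical sketch of a threshold-plus-morphology tool segmentation step.
# Only THRESHOLD, KERNEL_SIZE, and the frame size are taken from the paper;
# the rest of the pipeline is an assumption for illustration.
import cv2
import numpy as np

THRESHOLD = 13    # optimal intensity threshold reported in the abstract
KERNEL_SIZE = 11  # optimal morphological kernel size reported in the abstract

def segment_tool(frame_gray, background_gray):
    """Return the (x, y) centroid of the largest foreground blob, or None."""
    # Difference against a reference frame, then binarize.
    diff = cv2.absdiff(frame_gray, background_gray)
    _, mask = cv2.threshold(diff, THRESHOLD, 255, cv2.THRESH_BINARY)
    # Morphological opening suppresses small noise blobs.
    kernel = np.ones((KERNEL_SIZE, KERNEL_SIZE), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    # Track the largest connected component as the tool candidate.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```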

1. Introduction

In 2015, approximately 3.22 million people in the United States were visually impaired, of whom 1.02 million were blind. By 2050, the number of people affected by visual impairment is projected to double [1]. This increasing prevalence has driven scientists and engineers to develop various assistive technologies, many of which utilize computer vision and haptics. Significant progress has been made in the area of user mobility in pedestrian environments. For example, a system using tactors attached to the torso has been developed to localize the user with respect to the surroundings and guide travel while avoiding obstacles [2]. Another study focuses on detecting aerial obstacles with a stereo-vision wearable device [3]. However, few solutions have addressed the commonplace problem of finding targets in peripersonal space, i.e., the space within reach where objects can be grasped and manipulated. Our laboratory previously developed a hand-mounted system with binocular cameras and five vibrators to help the user locate nearby targets in 3D space using a depth map generated with computer vision methods [4].

Our present system, “FingerSight,” is a wearable device originally intended to help the visually impaired navigate the environment and locate targets in peripersonal space. The device, mounted on the finger, contains a camera and a set of vibrators (tactors) that are activated based on computer vision analysis of the real-time camera image. Previous research on this technology by Satpute et al., in our laboratory, demonstrated its effectiveness for guiding blindfolded participants to move their hand to reach an LED target located in front of their body [5]. Building on the working prototype by Satpute et al., our present research focuses on incorporating the use of hand-held tools into the basic FingerSight framework. Accurate localization and feedback are needed.
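To illustrate how tactor feedback might be derived from the camera image, the sketch below maps a detected target's offset from the center of a 640 x 480 frame to one of four tactors. Only the four-tactor layout, the frame size, and the 90 ms pulse duration come from this work; the dead-zone size, the dominant-axis rule, and the driver stub are assumptions for illustration.

```python
# Hypothetical sketch of four-tactor guidance: vibrate the tactor on the side
# of the finger toward the target until the target is centered in the frame.
# The dead zone and dominant-axis rule are assumptions; the 90 ms pulse
# duration is the value reported to minimize tactor-identification error.
import time

FRAME_W, FRAME_H = 640, 480  # camera resolution used in the study
DEAD_ZONE = 20               # pixels around center counted as "on target" (assumed)
PULSE_MS = 90                # vibration duration minimizing identification error

def choose_tactor(target_x, target_y):
    """Map the target's pixel offset from the frame center to a tactor name."""
    dx = target_x - FRAME_W / 2
    dy = target_y - FRAME_H / 2
    if abs(dx) <= DEAD_ZONE and abs(dy) <= DEAD_ZONE:
        return None  # centered: no correction needed
    # The dominant axis decides which of the four tactors fires.
    if abs(dx) > abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

def pulse(tactor):
    """Placeholder driver: replace with real GPIO/PWM output to the tactors."""
    print(f"vibrate {tactor}")
    time.sleep(PULSE_MS / 1000.0)
```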

