
Connecting Humans and Machines

By Noah Pflueger-Peters

Through the new UC Davis Center for Neuroengineering and Medicine and projects funded by NASA and the National Science Foundation (NSF), mechanical and aerospace engineering (MAE) faculty members Sanjay Joshi, Jonathon Schofield and Steve Robinson are pushing the boundaries of the developing field of neuroengineering and finding new ways for humans and machines to work together.


“We have all the resources at UC Davis to be one of the absolute leaders in this field,” said Joshi. “We are one of the few universities in the world with tremendous programs in engineering, neuroscience and medicine. We have the potential to bring these programs together to do some really groundbreaking work.”

When a person moves a muscle, the brain sends an electrical signal to tell it what to do. Joshi, a former NASA engineer and control systems expert, thought these signals could also be used to help people control machines. His group found that even the most underused muscles in the body could produce a wide range of signals that could be used to control electrical devices.

“Our body is electrically controlled, and you can measure these electrical signals at different places on the body,” he said. “The body acts like a limited signal generator.”

In partnership with the disability community, his group began building neuromuscular-controlled devices that help people with even the most severe disabilities use a prosthetic or even move a computer cursor—devices that improve their everyday lives.
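As a rough, hypothetical illustration of how a single muscle signal might drive a cursor (a sketch of the general myoelectric-control idea, not the group's actual algorithm; the sensor parameters and thresholds below are invented), a raw surface-EMG trace can be rectified and smoothed into an activation level and then mapped to a cursor speed:

```python
import numpy as np

def emg_envelope(raw_emg, fs=1000, window_ms=150):
    """Rectify a raw surface-EMG trace and smooth it with a moving average
    to estimate the muscle activation level (a common first step in
    myoelectric control; parameters here are hypothetical)."""
    rectified = np.abs(raw_emg - np.mean(raw_emg))  # remove DC offset, rectify
    window = int(fs * window_ms / 1000)
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

def activation_to_cursor_speed(envelope_sample, rest_level, max_level, v_max=200.0):
    """Map the current activation level to a cursor speed (pixels/second).
    Activation below the resting threshold produces no motion."""
    span = max(max_level - rest_level, 1e-9)
    activation = np.clip((envelope_sample - rest_level) / span, 0.0, 1.0)
    return activation * v_max

# Example with synthetic data: 2 seconds of sensor noise plus a brief
# simulated muscle contraction.
fs = 1000
t = np.arange(0, 2, 1 / fs)
raw = 0.02 * np.random.randn(t.size)
raw[800:1200] += 0.3 * np.sin(2 * np.pi * 80 * t[800:1200])

env = emg_envelope(raw, fs)
print(activation_to_cursor_speed(env[1000], rest_level=0.02, max_level=0.2))
```

In a real system the resting and maximum levels would have to be calibrated for each user, since every body produces different signals.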

Joshi’s work is part of the rise of neuroengineering, a new field that combines engineering, neuroscience and medicine to restore or add function to humans using techniques from multiple disciplines. A collaborator by nature, he saw the potential to bring together UC Davis’ strengths to create a neuroengineering hub. In 2014, he established the Neuroengineering and Medicine Initiative, which was met with enthusiasm across campus.

GIVING ASTRONAUTS A HAND

One of his first collaborations in the area was with fellow MAE professor and former astronaut, Steve Robinson. They realized the human-machine interfaces Joshi had developed could make spacewalks safer by giving astronauts a robotic fifth limb—called a supernumerary robot—to increase their capabilities and range of motion.

They recruited MAE assistant professor Jonathon Schofield and neurobiology, physiology and behavior (NPB) associate professor Wil Joiner—both hired as part of the Neuroengineering and Medicine Initiative—along with NPB professor Lee Miller, and were awarded an NSF grant in 2019 for the project.

The brain can send multiple physiological signals to control multiple limbs, even if one of those limbs is a supernumerary robot controlled by muscles in a different part of the body. Learning to use the arm could potentially be like learning to coordinate two arms and facial muscles to play an instrument. Since supernumerary robots are new, however, there could be a significant learning curve.


EXTENDING HUMAN CAPABILITY

Feedback is key to easing this learning curve. Feedback can take many forms, or a combination of forms, ranging from an auditory response such as beeping, to visuals such as flashing lights or motion, to a sensation of touch, known as haptic feedback. Based on a person’s responses, says Schofield, researchers can tune the devices so that the user comes to embody the robot.

“Tool-embodiment occurs when an expert tennis player operates their racket,” said Schofield. “They’re very aware of where it is in space and how to manipulate it, and it functions as an extension of their body.”

Tool-embodiment makes it easier for humans to control these devices precisely, and therefore safely, especially in space, where collisions have serious consequences. As part of Robinson’s UC Davis Center for Spaceflight Research, Schofield, Robinson and Joiner are studying visual and haptic strategies for controlling external robotic arms on spacecraft. These arms are used for everything from assembling spacecraft to operating unmanned stations.

With stimulators placed on an astronaut’s arm that vibrate in coordination with the external robot arm, an astronaut could move their real arm, see the robot arm moving along with them and feel a vibration when the robot arm grabs hold of an object.

“There’s a tremendous amount of information that you can encode in vibration,” said Schofield. “For example, you can probably distinguish between five or six different apps sending you a notification, just by the way your phone pulses and vibrates.”

Though these techniques are being developed for space, Robinson sees the technology as applicable to any number of situations.

“It’s not just space robotics,” said Robinson. “It could be used for something like robotic surgery—anywhere where getting it wrong could be very critical.”
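As a toy illustration of Schofield's point about encoding information in vibration (a hypothetical sketch, not the team's actual feedback scheme; the event names and timings below are invented), distinct robot-arm events could each be assigned a distinguishable buzz pattern, much like different phone notifications:

```python
from dataclasses import dataclass

@dataclass
class VibrationPattern:
    """A simple pattern: how many pulses, how long each pulse lasts (ms),
    and the gap between pulses (ms)."""
    pulses: int
    pulse_ms: int
    gap_ms: int

# Hypothetical mapping from robot-arm events to distinguishable patterns,
# analogous to different apps using different notification buzzes.
EVENT_PATTERNS = {
    "object_grasped":  VibrationPattern(pulses=1, pulse_ms=300, gap_ms=0),
    "object_released": VibrationPattern(pulses=2, pulse_ms=120, gap_ms=100),
    "near_collision":  VibrationPattern(pulses=5, pulse_ms=60,  gap_ms=60),
}

def pattern_to_timeline(pattern, amplitude=1.0, dt_ms=10):
    """Expand a pattern into a per-timestep amplitude sequence that a
    vibrotactile stimulator driver could play back."""
    timeline = []
    for i in range(pattern.pulses):
        timeline += [amplitude] * (pattern.pulse_ms // dt_ms)
        if i < pattern.pulses - 1:
            timeline += [0.0] * (pattern.gap_ms // dt_ms)
    return timeline

# Example: the "near collision" warning becomes a rapid series of short buzzes.
print(pattern_to_timeline(EVENT_PATTERNS["near_collision"]))
```

Keeping the patterns few and clearly different is what would let a user tell them apart without looking, which is the point of haptic feedback in this setting.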


A challenge in neuroengineering is that every human and every situation is different, so devices need to be adaptable. Signals the body produces can change with time, and the body can also change after using new devices like the ones Joshi, Schofield and Robinson are developing.

“As we use any new tool over and over again, whether it be our cell phone or sporting equipment, our brains and our bodies change, so we’re interested to see what happens as we develop this whole new generation of robotic tools,” said Joshi.

Collaboration with neuroscientists, biologists and doctors is essential, say Joshi, Robinson and Schofield, who are excited about the multidisciplinary research opportunities the work will give their students.

“This field requires students to have an understanding of so many different aspects, from sciences to engineering to clinical study, so we’re really interested in forming a new model of how students can gain this knowledge,” said Joshi.

This multidisciplinary approach is a core tenet of the neuroengineering community at UC Davis, which Joshi now leads as co-director of the UC Davis Center for Neuroengineering and Medicine, along with biomedical engineering professor Karen Moxon and School of Medicine professor Carolynn Patten. The center is guided by the program’s steering committee, on which Robinson and Schofield serve.

As the center grows, so does the number of collaborations. One eventual goal is to bring the Davis campus and the UC Davis Medical Center in Sacramento closer together and develop a pipeline for neuroengineering innovation from basic research through clinical investigation.

Neuroengineering research is in its infancy, but the work Joshi, Robinson and Schofield are doing at UC Davis is poised to be foundational to the young field and to help people who are disabled or work in hazardous environments.

“I’ve been totally excited about the number of new collaborations that we’ve been able to establish, and the fact that we’re thinking about creative ways we can all work together to help people,” said Joshi. “There’s definitely a lot of momentum.”

Above Left: Professor Sanjay Joshi

Above Right: Assistant Professor Jonathon Schofield

Right: Sarah O’Meara, the chief scientist in Professor Robinson’s lab, working with a supernumerary robotic arm. (Steve Robinson/UC Davis)

Bottom photo: Supernumerary robots can give astronauts on the International Space Station an extra hand, making spacewalks safer and more efficient. Graphic courtesy of Sarah O’Meara.

Graphic A: Visualization of electromyography sensors placed on leg muscles to control a robot or prosthetic. Graphic courtesy of RASCAL/K. Lyons.

Graphic B: Myoelectric control, or sensing and responding to electric signals produced by muscle movement, is one potential way to control a supernumerary robot. Graphic courtesy of Kenneth Lyons/UC Davis.

Graphic C: Visualization of a person using foot movements to control an arm prosthetic, with muscle sensors placed on the ankle. Graphic courtesy of RASCAL/K. Lyons.
