MODERN MEDICINE
A COLLISION OF NEW TECHNOLOGIES
We are in the middle of a collision of new technologies that are changing design forever. Wirelessly connected products are the norm. High-speed data provides instantaneous access to more information than ever before. Machine learning is scouring everything ever published on the planet, and artificial intelligence (AI) is being applied to big data to identify patterns and probabilities that assist clinical decision-making. Surgical systems with integrated micro-robotics, 3D vision systems and synthetic haptics are extending surgeon performance. The intersection of these new high-tech platforms is reshaping the design brief, defining a broader scope and placing greater responsibility on designers than ever before. Three significant change agents are transforming the role of industrial designers and the complexity of R&D teams in the design of med-tech products.

Big Data, AI and the Clinician Dashboard

Today we have access to all the data. Everything published on every topic is available with a keystroke. Powerful machine-learning algorithms process this vast database looking for patterns and probabilities of future occurrence. But data alone is overwhelming and meaningless without a tool set to help us make sense of it. AI is serving this role. AI agents increasingly provide highly curated results to healthcare professionals in ways that support and expedite decision-making. AI is reshaping the clinician interface by removing the rote task of manually scanning volumes of information, the intrinsic human error, and the copious amount of time required by this traditional process.
For example, using AI, a radiologist will not need to study dozens of slices from an MRI scan because an AI agent will instantaneously scan, assess and flag all anomalies and identify the probabilities of their root causes. The implications are profound. Not only does this relieve clinicians of manually scanning disparate chunks of information while under growing pressure to improve patient throughput, but AI agents can also draw from global databases, which can dramatically increase diagnostic reliability and improve patient outcomes. Visually simplifying and reducing the density of data presented to clinicians on traditional dashboards will decrease time to diagnosis, reduce human error and stress, and free up precious time that can be reallocated to clinician-patient dialog.

Multisensory Integration

Robotic surgical systems represent the state of the art in the complexity and design of systems that rely on multisensory integration. 3D vision systems with graphic overlays of imaging and physiological data, combined with synthetic haptics and micro-robotics, are attempting to replace natural-born multisensory capabilities. Never before has the role of human factors engineering been more important. The human factors that now must be considered in complex surgical systems include eye tracking, eye-hand-foot motor coordination and movement times, synthetic haptics, and visual and acoustic feedback, to mention a few. These human factors bring together highly specialized disciplines that never intersected in the past. Experts in motor learning,