
Use of AI and smart lenses edges ever closer


Smart lenses, microelectronics and AI edge closer to the clinic. Dermot McGrath reports

Rapid advances in biomedical engineering, artificial intelligence and miniaturised electronics are bringing the prospect of smart lenses ever closer as potential solutions for various ophthalmic disorders, with potential applications in the management of presbyopia, according to Dimitri T Azar MD, MBA.

“There have been concerted efforts in recent years to bring microelectronics, sensors and smart lenses from the lab into ophthalmic clinical and preclinical testing,” he told delegates attending a symposium on artificial intelligence at the 37th Congress of the ESCRS in Paris, France.

Dr Azar, Senior Director of Ophthalmic Innovations at Verily Life Sciences, gave an overview of the future of microelectronics in ophthalmology, the broad progress made to date and the challenges in bringing microelectronics, diagnostics and smart lenses to the ophthalmic market.

“The future involves engineering novel solutions to address the considerable technical challenges of building wireless sensing capabilities in and on the eye. Future smart lenses are likely to incorporate sensors, miniaturised low-power electronics and data analytics in devices as small as 600 microns in diameter, which is quite a feat of engineering,” he said.

Dr Azar said that artificial intelligence, allied to the hardware miniaturisation revolution of recent years, would have a profound impact on all domains of life, including medicine and ophthalmology.

“We are in the new era of technological pervasiveness with AI and nanofabrication, which will inevitably change the way we practise medicine and ophthalmology. We have experienced three revolutions in medicine since the Flexner Report, a book-length study of medical education in the United States and Canada published in 1910: molecular biology in the 1950s with the discovery of DNA; genomics in the 1980s with all the biologics that we currently use in ophthalmology and medicine; and more recently, the convergence of AI with the hardware miniaturisation revolution in the 2000s,” he said.

UNMET NEEDS
These advances will allow ophthalmologists to address unmet needs in cataract and ophthalmic disease management, noted Dr Azar.

“AI can really help us with screening to identify which patients should be referred to the sub-specialist for management, and it will help reduce workload for better efficiency with limited healthcare resources. Algorithms can be run on PCs or smartphones; we will see improved sensitivity and specificity in detecting at-risk patients, and the promotion of personalised cataract surgery and personalised disease management,” he said.

Diagnostic imaging is currently the most advanced and most efficient application of AI-based analyses, and it will likely expand further as imaging modalities become more advanced and multi-modal, said Dr Azar.

“We are also looking at applications beyond diagnostics to AI-based discovery. AI should improve IOL power calculations,” said Dr Azar, with effective lens position (ELP) representing the biggest potential source of error. “Research led by Dr Jose De La Cruz at the University of Illinois is showing how data about the lens, cornea, anterior chamber depth and angle anatomy can be collected during laser cataract surgery in order to calculate the postoperative ELP,” he said.

“It requires knowledge of axial length, central K, A-constant and postoperative refraction. There is great scope for AI to capture these important predictors, as well as other preoperative characteristics, in order to improve outcomes, but we need better data sets to do this properly. We especially need data sets with accurate refractive error calculations in patients with good postoperative Snellen acuity,” he said.
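The predictors Dr Azar lists — axial length, central K and an A-constant — are the same inputs combined by classic regression formulas. As a rough illustration of how they interact (the original SRK regression formula, not a method described in the talk), the emmetropic lens power is P = A − 2.5·AL − 0.9·K:

```python
def srk_iol_power(a_constant: float, axial_length_mm: float, mean_k_d: float) -> float:
    """Original SRK regression formula for emmetropia.

    P = A - 2.5 * AL - 0.9 * K
    a_constant      -- lens-specific A-constant (e.g. ~118.4)
    axial_length_mm -- axial length in millimetres
    mean_k_d        -- average keratometry in dioptres
    """
    return a_constant - 2.5 * axial_length_mm - 0.9 * mean_k_d

# An average eye (AL 23.5 mm, K 43.8 D, A-constant 118.4) gives ~20.2 D.
print(round(srk_iol_power(118.4, 23.5, 43.8), 2))  # 20.23
```

Later-generation formulas replace these fixed regression constants with an explicit estimate of the effective lens position — exactly the term the article identifies as the biggest source of error, and the one AI-based approaches aim to predict from richer preoperative data.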

Published reports have also shown the potential of AI to predict refractive errors with a relatively high degree of accuracy just from analysis of fundus images, said Dr Azar.

“The images were derived from the UK Biobank and AREDS data sets. Looking at some of these fundus images, one could perhaps distinguish a high myope from a high hyperope. However, these deep learning programmes can do the job much better than humans in predicting refractive errors,” he said.

COMPLEX FUTURE TECHNOLOGIES
Dr Azar referred to experimental and theoretical data in the public domain regarding ophthalmic microelectronics, such as electronic autofocus lenses, and explained that these future technologies are much more complex than today’s IOLs.

“For implantable lenses, the lens would ideally be positioned in the bag or elsewhere in the eye. The first step is to determine the afferent pathway and estimate the distance at which the patient is looking. AI and feedback loops could then be incorporated to change the power of the lens,” he said.

“All of this would require miniaturisation, tiny batteries, antennae and computer chips placed inside ophthalmic devices,” he said.
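The control loop Dr Azar outlines — estimate the viewing distance, then adjust the lens power — can be sketched in a few lines. This is a hypothetical illustration only: the distance-sensing mechanism, the `max_add_d` limit and the proportional update are assumptions, not details from the talk.

```python
def required_add(viewing_distance_m: float, max_add_d: float = 3.0) -> float:
    """Dioptric add needed to focus at a given distance.

    For a distance-corrected eye with no residual accommodation,
    the add in dioptres is the reciprocal of the viewing distance
    in metres, capped at the lens's maximum achievable add.
    """
    return min(1.0 / viewing_distance_m, max_add_d)

def autofocus_step(estimated_distance_m: float, current_add_d: float, gain: float = 0.5) -> float:
    """One iteration of a simple proportional feedback loop:
    nudge the current add toward the target add."""
    target = required_add(estimated_distance_m)
    return current_add_d + gain * (target - current_add_d)

# Reading at 40 cm needs +2.5 D; starting from 0 D, the loop converges toward it.
add = 0.0
for _ in range(8):
    add = autofocus_step(0.40, add)
print(round(add, 2))  # 2.49
```

The engineering challenge the article describes is fitting the sensing, computation and actuation behind such a loop into the power and size budget of an intraocular device.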

Setting the optimal cut-off point for the add power of the smart lens is one of the key decisions to be made by the designers, said Dr Azar.

“One could potentially set the lens for progressive add corrections for near distance at +2D, +3D or +3.5D. There are trade-offs for every one of these decisions,” he said.
