
CYBERSECURITY
AS MUCH A SOCIAL PROBLEM AS A TECHNICAL ONE
BY KELLEY CHRISTENSEN
What cybersecurity means to many people is making sure your passwords are strong and perhaps paying your bank a monthly fee for additional security measures.
But in a world where an increasing number of the devices we use every day are becoming members of the Internet of Things—from the obvious ones like Alexa to the less obvious refrigerators and thermostats—cybersecurity attacks and countermeasures are expanding.
Cybersecurity is a cross-disciplinary field. Research opportunities abound because of the critical role security plays in autonomous vehicles, cyber-physical systems, industrial control systems and smart city infrastructure.
Four researchers in Michigan Technological University’s College of Computing focus their research on strengthening cybersecurity, but each approaches the field from different angles, including malware detection, security of cellphones and vehicles (both human-piloted and autonomous), and training the future workforce.
PATTERN PLAY
“Behavior” may seem like an odd word to apply to machines. But the algorithms that power our devices are creatures of habit and repetition that also learn. That is why Bo Chen, assistant professor of computer science, is interested in detecting unexpected behavior patterns that could be evidence of malware.
“A top concern is that when a device is hacked, the malware can steal your information,” Chen said. “If you can detect the malware, you can take action. Data compromised by malware can be recovered after the intrusion is detected.”
Chen’s research focuses on the two main ways cybersecurity professionals detect malware: by signature or by behavior. Signature detection is akin to finding fingerprints left at the scene of a crime—but instead of prints, it looks for known bytes out of place in code. Behavior is a little more complicated. Researchers like Chen are building models that can detect anomalous patterns in how software runs. If the patterns in a data set no longer line up—like crooked corners in a quilt or misplaced colors of thread in a carpet—the model can detect malware at work.
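Signature scanning, the simpler of the two approaches, can be sketched in a few lines. The byte patterns and names below are invented purely for illustration; real scanners match files against large, curated signature databases:

```python
# Illustrative sketch of signature-based malware detection: scan a byte
# stream for known malicious byte patterns. These "signatures" are
# made up for demonstration, not drawn from any real malware.
KNOWN_SIGNATURES = {
    "demo-trojan": bytes.fromhex("deadbeef"),
    "demo-worm": bytes.fromhex("cafebabe0001"),
}

def scan(data: bytes) -> list:
    """Return the names of any known signatures found in the data."""
    return [name for name, sig in KNOWN_SIGNATURES.items() if sig in data]

sample = b"\x00\x11" + bytes.fromhex("deadbeef") + b"\x22"
print(scan(sample))  # ['demo-trojan']
```

Behavior-based detection is harder to sketch this way, because it relies on a learned model of normal activity rather than a fixed list of fingerprints.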
Teaching the next generation of cybersecurity professionals is paramount, as an increasing number of devices important to modern life become susceptible to cyberattacks.
DEEP LEARNING
Teaching a machine to detect complicated patterns requires more than static programming. Neural network learning systems teach artificial intelligence (AI) to recognize situations similar to those it has already encountered. In learning by doing, neural networks create a framework for AIs to make decisions by weighing inputs against learned weights and biases. The more layers of decisions in a given network—a multitude of “if this, then that” questions—the deeper the learning.
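As a rough sketch of what one of those layered decisions looks like, the toy network below passes two inputs through two stacked layers. Every weight and bias here is a made-up number, not a trained model:

```python
import math

def layer(inputs, weights, biases):
    """One layer: weighted sum of inputs plus a bias, squashed to 0..1
    by a sigmoid, for each neuron in the layer."""
    return [
        1.0 / (1.0 + math.exp(-(sum(w * x for w, x in zip(ws, inputs)) + b)))
        for ws, b in zip(weights, biases)
    ]

# Two stacked layers: each added layer deepens the chain of decisions.
hidden = layer([0.5, 0.9], weights=[[1.2, -0.8], [0.4, 0.6]], biases=[0.1, -0.2])
output = layer(hidden, weights=[[1.5, -1.1]], biases=[0.05])
print(round(output[0], 3))
```

Training would adjust those weights and biases automatically; the sketch only shows how a decision flows forward through the layers.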
Xiaoyong Yuan, assistant professor of applied computing, focuses on cybersecurity and deep machine learning, studying how to use AI to solve cybersecurity challenges. Machine learning can help detect network attacks such as distributed denial-of-service (DDoS) strikes.
Because of the sheer amount of data involved, network attacks cannot be rebuffed manually. Automating analysis is necessary, and so is teaching AI how to automatically detect malware.
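A toy version of that automated analysis might simply flag sources sending anomalously many requests. The addresses, traffic log, and threshold below are all hypothetical:

```python
from collections import Counter

def flag_ddos_sources(requests, threshold=100):
    """Flag source addresses whose request counts exceed a threshold.
    A deliberately simple stand-in for automated traffic analysis."""
    counts = Counter(requests)
    return sorted(src for src, n in counts.items() if n > threshold)

# Hypothetical traffic log: one normal client, one flooding bot.
traffic = ["10.0.0.1"] * 5 + ["203.0.113.7"] * 500
print(flag_ddos_sources(traffic))  # ['203.0.113.7']
```

Real detectors learn far subtler features than a raw request count, but the principle is the same: no human could tally millions of packets by hand.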
Yuan is interested in issues of trust. In a culture becoming increasingly reliant on Alexa for everything from scheduling meetings to selecting playlists, security and privacy are paramount. The beautiful thing about neural networks is that they can learn new features automatically, which makes models more powerful; but the very quality that makes them powerful is also a potential danger.
“AI is a large part of our lives but it is not perfect. We need to build a trustworthy AI,” Yuan said. “This includes finding the security and privacy issues in deep learning and AI algorithms.”
Yuan gave the example of autocompletion, a feature now offered in many programs, including Google Mail and Microsoft Outlook. The AI behind it learns from your typing history so it can predict your next sentence.
“If you type something sensitive and confidential, the model will remember it,” Yuan said. “What’s the chance the algorithm will leak your confidential sensitive information? We must build security and privacy into machine learning to prevent data breaches.”
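The risk is easy to demonstrate with a toy autocomplete that learns word pairs from typing history. This deliberately naive sketch is not how production autocompletion works, but it shows the memorization problem in miniature:

```python
from collections import defaultdict

# A naive autocomplete: it "learns" by memorizing every word pair it
# sees, so anything typed -- including a secret -- is retained verbatim
# and can resurface as a prediction.
history = defaultdict(list)

def train(text):
    words = text.split()
    for prev, nxt in zip(words, words[1:]):
        history[prev].append(nxt)

def suggest(word):
    options = history[word]
    return max(set(options), key=options.count) if options else None

train("my password is hunter2")
print(suggest("is"))  # the memorized secret leaks back out: hunter2
```

Defenses against this kind of leak include filtering sensitive strings out of training data and adding noise to what the model memorizes.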
GHOSTS IN THE MACHINES
Malicious attacks on thermostats and refrigerators are inconvenient, but losing control of a vehicle barreling down the highway at 80 mph veers quickly into the realm of the disastrous. For that reason, Lan Zhang, assistant professor of electrical and computer engineering, is interested in creating defense mechanisms in machine-learning models. One approach is designing environments to be dynamic enough to prevent bad actors from finding private information.
Zhang and her collaborators are developing mechanisms and algorithms to address security threats in wireless communications, such as those used in driving environments. Connected and autonomous driving is one typical Internet of Things/cyberphysical systems application.
“Our solutions include physical-layer security design to augment the transmission environments to avoid eavesdropping; blockchain-based authentication systems for trust management; and intrusion detection for secure routing,” Zhang said.
Zhang also investigates federated learning, a privacy-preserving machine-learning paradigm, which allows users to collaboratively train a shared machine-learning model without migrating their confidential data into a central server. Federated learning is widely used in Internet of Things applications, like Google Keyboard.
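The core idea can be sketched with federated averaging, the basic algorithm behind federated learning. In this illustration each client fits a trivial one-parameter model (just a mean) on invented private data, so that only parameters, never the raw data, ever reach the server:

```python
def local_update(private_data):
    """Client-side step: fit a one-parameter model (here, the mean)
    on data that never leaves the device."""
    return sum(private_data) / len(private_data)

def federated_average(client_updates):
    """Server-side step: average the clients' parameters without ever
    seeing their underlying data."""
    return sum(client_updates) / len(client_updates)

# Three clients with private datasets that stay on their devices.
clients = [[1.0, 3.0], [5.0, 7.0], [9.0, 11.0]]
updates = [local_update(data) for data in clients]
print(federated_average(updates))  # 6.0
```

A real deployment averages millions of neural-network weights the same way, which is how a shared keyboard model can improve without any user's typing leaving their phone.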

THE NEXT GENERATION
Because cyber hacks and security breaches have dramatically increased in recent years, the nation is facing a talent shortage of cybersecurity professionals. To improve security education and meet industry needs, cybersecurity educators and researchers need to propose innovative approaches and ideas to defend the nation’s digital frontier.
Yu Cai, professor of applied computing, is the principal investigator of a multimillion-dollar CyberCorps: Scholarship for Service grant sponsored by the National Science Foundation. The grant provides up to three years of full scholarship support for 20 undergraduate and graduate students studying cybersecurity at Michigan Tech. In return, recipients will work for the federal government for several years after graduation.
The GenCyber program Cai leads is another cybersecurity workforce development effort, providing summer cybersecurity camp experiences for students and teachers at the K-12 level. GenCyber’s goals are to help students understand safe online behavior, learn fundamental cybersecurity knowledge, increase interest in cybersecurity careers, and improve pedagogical methods for delivering cybersecurity content in K-12 curricula.
“The U.S. is facing a significant shortage of well-trained and well-prepared cybersecurity professionals,” Cai said. “Michigan Tech has developed a national and international reputation in cybersecurity education, and research and outreach activities. We are thrilled to be part of the solution to the nation’s cybersecurity workforce challenge.”
Kelley Christensen is the Director of Research Communications at the University of Oregon. She was previously a science writer at Michigan Technological University.