In 2015, Arizona State University launched the Global Security Initiative to serve as an interdisciplinary hub for research on complex global security issues. Its leaders recognized that the challenges facing our nation and the broader globe — such as cyber threats, resource scarcity and data privacy — called for a new, holistic approach and collaboration with the government and private sectors.
While GSI grew and evolved over its first decade, its focus on people, impact and mission has remained the same, says Executive Director Nadya Bliss. Ten years on, the initiative’s goal is still to advance science and technology capabilities to meet defense, security and intelligence mission needs.
One of GSI’s major contributions to the ASU community has been its role in helping the university grow its engagement with U.S. Department of Defense research. ASU has doubled its DOD-funded research expenditures since GSI launched in 2015.
The initiative is celebrating several other key wins from the last decade, including improving homeland security operations, protecting vulnerable infrastructure, supporting outstanding young faculty, advancing human-AI team research, fighting adversaries’ online strategies, and educating the current and future national security workforce.
“Even though I've been running GSI for 10 years, it's probably been a different organization every year or so, because we've had all these different milestones,” Bliss says. “Running and building this organization with the team here has been one of the greatest privileges and honors of my life.”
“In 2015, ASU launched the Global Security Initiative to tackle emerging global security challenges. Over the past decade, GSI has received funding from over 50 federal agencies, significantly boosting ASU's Department of Defense research expenditures in particular. GSI has become a global leader in addressing complex defense issues, not only advancing our understanding of global security but also contributing to making the world a safer place.
I celebrate GSI’s accomplishments and look forward to its continued impact in the years to come!”
— Sally C. Morton Executive Vice President of ASU Knowledge Enterprise
IMPACT
Bolstered homeland security and public safety
Protected vulnerable infrastructure
Advanced human-AI collaboration
Combated malicious influence
PEOPLE
Trained the national security workforce
Supported the next generation of faculty
Taught learners at every stage
MISSION
Engaged ASU in defense solutions
Created a new paradigm for defense research
Guided the nation's responsible advancement
The U.S. Department of Homeland Security was in need of advanced tools to improve operations across its organizations, including the Transportation Security Administration, U.S. Coast Guard, Federal Emergency Management Agency, and Customs and Border Protection.
That’s why, in 2017, the agency turned to ASU researchers for help and established a DHS Center of Excellence. Dubbed the Center for Accelerating Operational Efficiency, it is housed in GSI with ASU’s Ira A. Fulton Schools of Engineering as a core partner.
DHS has selected only a handful of universities across the country to lead research efforts in its Centers of Excellence.
“That DHS chose ASU for this Center of Excellence speaks to ASU’s commitment to impactful, use-inspired research,” said Center Director Ross Maciejewski when the center launched. “We will develop new research and translate existing research into useful tools, such as data analytics, economic analysis or operations management systems, that DHS organizations can put in place for improved decision-making and effectiveness.”
In its first year, the center conducted groundbreaking research to improve efficiency and security at national borders, seaports and airports by using multidisciplinary, customer-driven and practical solutions.
To raise passenger satisfaction and reduce wait times without compromising security, researchers in the center began developing a decision-support tool to simulate and visualize TSA security screening checkpoint operations with ever-changing passenger demands. Phoenix Sky Harbor International Airport and El Paso International Airport were chosen as test sites for the tool, which is expected to benefit up to 900 million passengers per year by reducing wait times at U.S. airports.
When the COVID-19 pandemic hit in 2020, the center sprang into action. Working directly with DHS, the center helped some of the agencies charged with responding to the pandemic plan for and overcome both anticipated and unanticipated challenges.
For example, the center pivoted an ongoing project to address expected supply chain challenges around medical equipment and vaccines. The focus areas shifted to equalizing statewide access for vaccines, deploying ventilators by forecasting demand, simulating disease transmission, examining social distancing policies, and creating logistics for distribution of antivirals.
The center also added new projects, including one that estimated the economic impacts of COVID-19 on the U.S. and its major trading partners, examining six scenarios that ranged from a minor event to a disaster.
Another added project aimed to provide health care, economic and public policymakers with models that could determine how the pandemic and associated policy responses would affect the U.S. economy.
AI is a growing presence in our lives with great potential for good. It’s increasingly used in sectors like health care to aid in diagnosis and drug discovery, in finance to navigate the stock market, and in transportation to power self-driving cars.
While most people may not harbor active suspicion against AI, they may have low trust, much like you might have for a complete stranger. If an AI system diagnosed you with a disease, for example, would you pursue treatment right away or would you seek out a second opinion from a human health professional?
AI also plays a role in our national security. Regardless of whether the U.S. embraces AI technology, other countries will — countries with possibly different values than our own.
To that end, the center worked on a project that will help the U.S. government and industry acquire AI technology that people will feel confident using. Funded through DHS, the research group tested whether a new evaluation tool effectively measures the trustworthiness of AI systems.
The TSA’s federal security directors from Phoenix Sky Harbor International Airport, San Diego International Airport and Las Vegas Harry Reid International Airport each organized volunteer officers to participate in the study. Officers interacted with one of two simulated AI systems — one reliable, one not — that the ASU team created. Then they used the evaluation tool to indicate their level of trust in the system.
If the ASU team is able to show that the evaluation tool is useful for assessing AI trustworthiness, it will help in building and buying systems that people can rely on — paving the way for AI’s smooth integration into critical sectors, protecting national security and multiplying its power for positive impact.
In its first year, GSI established the Center for Cybersecurity and Digital Forensics. The initiative’s first center was tasked with taking a proactive, interdisciplinary approach to the issue of cybersecurity. Proper cybersecurity measures are critical when it comes to protecting our nation’s infrastructure.
“I am thrilled to have the Global Security Initiative’s first center address this challenge, bringing together expertise from across the campus, and connecting to both private and public partners,” GSI Executive Director Nadya Bliss said when launching the center. “In this age of interconnectedness and complexity, cybersecurity is at the forefront of our security as a human race.”
That same year, ASU was also named a partner in a $28.1 million national research program, the Cyber Resilient Energy Delivery Consortium. The consortium was tasked with developing cybersecurity tools and standards to protect the country’s electricity infrastructure from attacks.
Energy-delivery systems are critical infrastructures that rely on complex industrial control networks and enterprise networks for management and day-to-day operations. But these networks also expose the systems to cyberattacks with dire consequences.
“Our stake in this initiative is to focus on securing several new technologies that are emerging,” said Gail-Joon Ahn, founding director of the Center for Cybersecurity and Digital Forensics who co-led ASU’s efforts. “We will study the coupling that exists between energy-delivery systems and other infrastructures, including building and home automation infrastructures and the so-called Internet of Things, which can make the end use of electricity responsive to grid congestion, but may also be vulnerable to cyberattacks aimed at creating imbalance in the grid.”
In 2022, GSI combined this center with the Cybersecurity Education Consortium to form a single organization: the Center for Cybersecurity and Trusted Foundations. The unit holistically addresses the complex cybersecurity challenges facing the nation by fortifying the fundamental building blocks of security — technology, process and workforce.
Sometimes developers accidentally introduce a bug that leaves software vulnerable to attacks. When a company or developer discovers a vulnerability, they provide a fix — or patch — as soon as possible. But there are situations where patching isn’t so straightforward.
“What happens if the software company no longer exists? Who can fix those bugs, and how?” asked Adam Doupé, director of the Center for Cybersecurity and Trusted Foundations.
Critical infrastructure presents another cybersecurity problem. Because it’s critical, there’s a tendency to avoid software security updates since they might disrupt other parts of the system. Many critical infrastructure companies don’t have a way to check that a patch won’t interfere with the system’s function, Doupé said. This update lag leaves them open to ransomware attacks.
ASU began tackling these problems in 2020 when DARPA awarded a four-year contract to the center, which contributed research and development efforts to the Assured Micropatching program. A micropatch is a small patch that fixes one vulnerability without jeopardizing functionality.
The team developed VOLT, a Viscous, Orchestrated Lifting and Translation framework, to reverse-engineer outdated software. It can take the ones and zeros of a binary program and “translate” them into code that humans can read, and vice versa.
Once developers learn about a program, they can create the smallest patch possible that changes as little in the system as possible. The center’s team also aims to use mathematical proofs to guarantee that a system will still work after the patch is deployed.
This will allow the team to address security issues in the software used in critical infrastructure, as well as in deployed military equipment, at lower cost and in less time.
“I have worked on software reverse engineering for over 10 years, and much to my surprise, no one has created techniques to make effortless binary patching possible,” said Ruoyu "Fish" Wang, the lead investigator of the Assured Micropatching project. “Our VOLT framework, upon success, will be the first of its kind that enables easy bug fixing on deployed software. This capability will mean a lot to both industry and national security.”
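Conceptually, a micropatch changes only the bytes that implement the fix, leaving the rest of the binary byte-for-byte identical so that nothing else in the system is disturbed. The sketch below illustrates that idea in Python; the byte values, offsets and function name are invented for demonstration and this is not VOLT's actual mechanism:

```python
# Illustrative sketch of a "micropatch": replace a few bytes in a binary
# image while leaving everything else untouched. Offsets and byte values
# here are invented; real tools operate on lifted, human-readable code.

def apply_micropatch(image: bytes, offset: int, old: bytes, new: bytes) -> bytes:
    """Replace `old` with an equally sized `new` at `offset`, verifying first."""
    if len(old) != len(new):
        raise ValueError("micropatch must not change the binary's size")
    if image[offset:offset + len(old)] != old:
        raise ValueError("bytes at offset do not match the expected original")
    return image[:offset] + new + image[offset + len(old):]

# Example: flip a conditional jump opcode (0x74, JE) to its inverse (0x75, JNE).
firmware = bytes([0x55, 0x89, 0xE5, 0x74, 0x05, 0x90])
patched = apply_micropatch(firmware, 3, b"\x74", b"\x75")
assert len(patched) == len(firmware)  # size and layout preserved
```

Verifying the original bytes before writing mirrors, in miniature, the program's larger goal of guaranteeing that everything outside the patched region keeps working.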
Current AI is a good tool but a poor teammate because it lacks an understanding of the humans it works with.
That is why the research and development arm of the U.S. Department of Defense, the Defense Advanced Research Projects Agency (DARPA), is exploring new ways to build social skills in AI systems that would allow humans and machines to work effectively together.
Clues on how to create machines with social intelligence have been found in an unexpected place: the popular video game Minecraft.
ASU researchers teamed up with Aptima — a company that develops solutions to optimize and improve human performance in mission-critical, technology-intensive settings — during a four-year program funded by DARPA called Artificial Social Intelligence for Successful Teams, or ASIST. The project aims to improve the social intelligence of artificial intelligence and make it better able to assist teams of humans working in complex environments, including national security missions.
Researchers from GSI’s Center for Human, Artificial Intelligence, and Robot Teaming (CHART) generated data from 1,160 Minecraft games, which represents the largest publicly available human-AI team research dataset in history.
CHART is at the forefront of a new science of human, AI and robot teaming. It synthesizes research across computer science, robotics, law, art and social science to revolutionize how we ensure national security through human-machine teaming.
Team research is often statistically underpowered because it is hard to schedule and convene participants at the same time, but CHART found innovative ways of recruiting participants by hosting Minecraft competitions and virtual hackathons and giving awards to high performers.
More than 200 participants — including representatives from the Massachusetts Institute of Technology, Cornell, the University of Southern California and Carnegie Mellon University — played a role in the ASIST research. ASIST is one component of a broader DARPA initiative called AI Forward, which is taking a deep look at how to reliably build trustworthy AI systems by examining AI theory, AI engineering and human-AI teaming.
Founded in 2021, the Center on Narrative, Disinformation and Strategic Influence works to identify and mitigate the impacts of strategic influence campaigns from foreign adversaries intent on undermining the interests and security of the U.S. and democratic allies.
The center takes a uniquely interdisciplinary approach to address this globe-spanning challenge — think mathematics and engineering with communications, psychology and journalism.
Influence campaigns are a major component of global competition today, used by competitor nations like China or Russia to sow confusion or mold public opinion in a strategically important region, like the Indo-Pacific or Eastern Europe. The goals can vary — ranging from diminishing confidence in the U.S. as an ally to preemptive justification for military actions. Geopolitical influence operations are often inexpensive and low-risk, but can cause real damage to a nation’s ability to respond to strategic threats.
With the broad deployment of generative AI tools that could supercharge influence campaigns, the center has evolved to focus more heavily on understanding our rapidly evolving media and information landscape, developing tools to identify AI-generated content, and combining mathematics with the study of narrative structure.
“One question that I am very interested in is, how does a nation or a society heal from a successful malicious influence campaign?” said Joshua Garland, interim director of the center. “Adversarial influence campaigns are almost always premised on an ‘us versus them’ narrative meant to increase polarization. How do you start to reduce that polarization? I think that’s a really difficult challenge that I want to be working on.”
Studying Russian propaganda to detect a planned military invasion
The Office of Naval Research, a division of the U.S. Department of the Navy, awarded ASU a grant in 2018 to examine thousands of mass media and social media postings in the Baltic States — Lithuania, Latvia and Estonia — to help detect if Russia is planning a military invasion there.
Researchers from the Center for Strategic Communication at the Hugh Downs School of Human Communication, the School of Computing and Augmented Intelligence, and GSI collaborated on the work, a portion of which was subcontracted to the aerospace and defense company Lockheed Martin.
“About four months before the 2014 annexation of Crimea, we saw huge changes in Russian anti-Ukrainian propaganda,” said Steve Corman, director of the Center for Strategic Communication, who leads the study. “The Russians were clearly trying to rile up the Russian-speaking minorities to sow support for their cause. Obviously, there hasn’t been an invasion in the Baltics yet, but we will be trying to figure out if there are correlations between propaganda framing and conflict events in the Baltics.”
Understanding covert influence through propaganda
As information proliferates across new media technologies at a faster rate than ever before, nation-state actors can manipulate it, including through propaganda, and sometimes undermine a nation’s sovereignty and internal political stability. Over the last decade, China has made increasing use of such information operations as part of its “Three Warfares” strategy, which combines psychological, public opinion and legal warfare.
Scott Ruston, founding director of the Center on Narrative, Disinformation and Strategic Influence, received a Department of Defense Minerva Research Initiative award to study online influence targeting Indonesia and the Philippines that uses narratives to manipulate public opinion and create political action favoring China.
The research helps to fill the critical capability and knowledge gap the United States faces with regard to China’s engagement in “informationized” warfare, and will more broadly establish a model for effective analysis of strategic influence in the Middle East, Europe and other regions.
As new technologies make their way onto the battlefield, soldiers need to stay up to date to remain resilient and capable.
In response to this need, GSI helped create artificial intelligence and machine learning training to incorporate into the Warrant Officer Advanced Course at Fort Huachuca, Arizona. The training was developed in collaboration with the U.S. Army Intelligence Center of Excellence and the Army Research Laboratory and served as a primer for military intelligence soldiers on artificial intelligence, machine learning and data science.
“As new technologies like AI are developed and implemented, it’s important that we include the perspectives of multiple stakeholders to make sure our approaches are relevant and solve real-world problems,” said Jamie Winterton, GSI’s former senior director of strategy. “Engaging with the Chief Warrant Officers gave us great insights into how we can effectively teach AI and machine learning to new audiences, and how we can build new AI systems to solve Army problems."
The seminar featured a lecture by Chitta Baral, professor at the School of Computing and Augmented Intelligence in the Ira A. Fulton Schools of Engineering and a GSI affiliate. After the lecture, students separated into small practical exercise groups where they worked with data scientists to develop a presentation on an intelligence function that could benefit from artificial intelligence and machine learning.
“Our adversaries are certainly focused on implementing AI in their military,” said Chief Warrant Officer 3 Jonathan Berry, a student from the Utah National Guard’s 142nd Military Intelligence Battalion. “So in order to stay relevant in future conflicts, AI certainly does need to be implemented."
There are an estimated 3.5 million unfilled cybersecurity jobs worldwide, 750,000 of which are in the U.S.
Since 2017, students interested in filling that gap have enrolled in the National Science Foundation Scholarship for Service program at ASU focused on cybersecurity.
The NSF CyberCorps program accommodates students interested in earning undergraduate or graduate degrees, and all students who enroll are involved in cybersecurity research.
“Students who enroll can focus on many different areas that influence and build upon cybersecurity, including artificial intelligence, machine learning, networking, embedded devices and more,” said Adam Doupé, director of GSI's Center for Cybersecurity and Trusted Foundations.
Students who are accepted into the program receive a scholarship covering full-time tuition and education-related fees, a health insurance reimbursement, a professional development allowance, a book allowance, and a stipend. In exchange for their scholarships, recipients agree to work after graduation for a federal government agency in a position related to cybersecurity.
“Right now, the supply side of cybersecurity talent is not meeting the demand,” Doupé said. “This program is an excellent way to encourage students to enter into and study cybersecurity, while at the same time giving back to the public sector and government. Additionally, breaches such as the Equifax hack highlight the need for competent and qualified cybersecurity professionals in all areas and industries.”
Arizona State University is among the top three universities in the nation for the total number of DARPA Young Faculty Awards, alongside Massachusetts Institute of Technology and University of Michigan. Fueling ASU’s success, the Global Security Initiative launched the DARPA Working Group in 2015 to help junior ASU faculty members pursue the career-boosting DARPA awards. The Young Faculty Award program provides high-impact funding to rising academics in early-career research positions at U.S. institutions to advance innovative research enabling transformative Department of Defense capabilities.
Here are eight next-generation academic scientists, engineers and mathematicians recognized by DARPA with the prestigious award.
Kitchen
Military radios operate at low frequencies, which have the advantage of operating over longer distances and the disadvantage of equipment that is bulky, requires large antennas and drains battery quickly.
Kitchen, an associate professor of electrical engineering in the Ira A. Fulton Schools of Engineering, aims to integrate recent innovations in electronics architecture and processes to bring military radios down in size and up in efficiency and resiliency, without losing the benefits of low-frequency systems. The system would enable transmissions to be even more secure, and maybe even capable of stealth communications.
“The dream is to do all of that with something the size of a cell phone,” Kitchen said.
Mahyar Eftekhar
Eftekhar, an associate professor of supply chain management at the W. P. Carey School of Business, conducts research that could radically improve the nation’s approach to disaster relief and humanitarian assistance. His work concentrates on nonprofit operations management and humanitarian logistics. Through more coordinated disaster relief operations, humanitarian organizations and government entities could share resources to effectively and efficiently respond to disasters.
While his main goal is to improve the resilience of the humanitarian supply chain in response to rapid-onset disasters, he believes the results of his work could be adapted to many other nonprofit sectors.
Umit Ogras
Ogras, an adjunct faculty member with the Ira A. Fulton Schools of Engineering, aims to provide low-cost devices powered by external energy sources like heat, as well as tools that can sense and transmit data without wires. They are made by printing tiny electrical circuits on small, flexible polymer platforms on which commercially available computer processing chips can be mounted.
Those technologies are being designed to enable real-time analysis of an array of situations in areas of active national defense operations.
Such tools could help commanders evaluate the physical conditions and performance of personnel in the field, monitor the operability of safety-critical equipment and provide a virtual picture of a range of activities occurring across broad expanses of terrain.
Sze Zheng Yong
Yong, a visiting scholar and faculty researcher with the Ira A. Fulton Schools of Engineering, is an expert in system dynamics and control as well as robotics. His research focuses on understanding the intention of autonomous swarms controlled by another party, such as an enemy.
Yong used his DARPA funding to develop new computational foundations to classify and predict swarm intents. Swarm intents differ based on the mission and the objective, and swarm formation differs based on the desired outcome.
“My proposed ideas are rather different from conventional approaches, and the to-be-developed techniques will generally be applicable to a wide array of dynamic systems beyond swarm systems,” he said. “The swarm intent understanding problem has many potential defense applications.”
Yan Shoshitaishvili
Today’s alarming growth in cybercrime is exacerbated by a lacking cybersecurity workforce. There are an estimated 3.5 million unfilled cybersecurity jobs worldwide, around 750,000 of which are in the U.S.
Shoshitaishvili, an associate professor in the Ira A. Fulton Schools of Engineering, champions multiple projects at ASU to improve the nation’s cybersecurity.
Among his efforts, he plans to fill the jobs pipeline with a well-qualified, dedicated cybersecurity workforce that can beat the hackers at their game. He serves as associate director of workforce development at ASU’s Center for Cybersecurity and Trusted Foundations, or CTF. Fueled by a DARPA grant, the CTF team has established the American Cybersecurity Education (ACE) Institute.
Timothy Balmer
An assistant professor in the School of Life Sciences, Balmer investigates how sensory signals are processed by the brain with a focus on hearing and balance.
Balmer’s lab prioritizes disorders like Meniere’s disease, an ear condition that causes imbalance and vertigo. This disorder may also cause tinnitus, a hearing condition that causes a ringing or buzzing noise in the ear. Balmer's research is personal to him, as he suffers from tinnitus himself.
His lab studies the vestibular cerebellum, a sensory system in the brain integrating signals that convey head, body and eye movements to coordinate balance. A disruption in the neural processing of this system can cause conditions like Meniere’s disease.
Saeed Zeinolabedinzadeh
For Zeinolabedinzadeh, an assistant professor in the Ira A. Fulton Schools of Engineering, a DARPA Young Faculty Award propelled his research to develop a new high-precision, low-latency and cost-effective time transfer scheme to improve 5G, 6G, wireless sensor networks, navigation and defense applications.
“Our proposed approach significantly increases the synchronization accuracy and reduces the synchronization time,” Zeinolabedinzadeh said. “In addition, the system can robustly operate while the radios within a communication system are moving at high speed, such as a user in the 5G network.”
Such improvements would enhance the performance and reliability of wireless systems used for national security as well as other communications applications.
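For context, the textbook starting point for synchronizing two radios is the classic two-way timestamp exchange (as used in NTP); high-precision schemes like the one described above refine this basic estimate. The sketch below uses invented timestamps and is not Zeinolabedinzadeh's actual method:

```python
# Two-way time transfer, NTP-style: exchange four timestamps and estimate
# both the clock offset and the round-trip delay. Timestamps are invented.

def estimate_offset_and_delay(t1, t2, t3, t4):
    """t1: request sent (client clock), t2: request received (server clock),
    t3: reply sent (server clock), t4: reply received (client clock)."""
    offset = ((t2 - t1) + (t3 - t4)) / 2   # server clock minus client clock
    delay = (t4 - t1) - (t3 - t2)          # total round-trip path delay
    return offset, delay

# Server clock runs 5 ms ahead; each one-way trip takes 2 ms.
offset, delay = estimate_offset_and_delay(100.0, 107.0, 108.0, 105.0)
```

The estimate is exact only when the forward and return path delays are equal; reducing the error from that asymmetry, especially for fast-moving radios, is where advanced synchronization schemes come in.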
Yu Yao
Yao, an associate professor in the Ira A. Fulton Schools of Engineering, has developed a new imaging sensor for applications where understanding the properties of light is essential, such as autonomous driving, biomedical imaging and quality control in manufacturing. Her DARPA Young Faculty Award has funded her exploration of this technology’s potential for underwater navigation.
The sensor she developed not only can capture regular images but also collects detailed information about the light itself. All this technology is built directly onto a small chip, an integration that makes the device compact, portable and potentially cheaper to manufacture and operate.
“The design concept for this microscale polarization imaging sensor was inspired by the eye of the mantis shrimp, which can see the polarization difference of light,” Yao said.
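As general background on polarization imaging (not Yao's specific design), intensities measured behind polarizers at four angles can be reduced to the linear Stokes parameters, from which the degree of linear polarization follows. A minimal sketch with invented pixel values:

```python
import math

# Standard reduction of four polarizer-angle intensities (0, 45, 90, 135
# degrees) to linear Stokes parameters. Values below are invented; this is
# a textbook illustration, not the sensor's actual processing pipeline.

def linear_stokes(i0, i45, i90, i135):
    """Return (S0, S1, S2) from intensities at 0/45/90/135 degrees."""
    s0 = (i0 + i45 + i90 + i135) / 2  # total intensity
    s1 = i0 - i90                     # horizontal vs. vertical preference
    s2 = i45 - i135                   # diagonal preference
    return s0, s1, s2

def degree_of_linear_polarization(s0, s1, s2):
    return math.hypot(s1, s2) / s0    # 0 = unpolarized, 1 = fully polarized

# Fully polarized light aligned at 0 degrees.
s0, s1, s2 = linear_stokes(1.0, 0.5, 0.0, 0.5)
dolp = degree_of_linear_polarization(s0, s1, s2)
```

Capturing all four angle measurements on adjacent pixels of one chip is what makes such sensors compact enough for the applications described above.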
Hackathons, also known as codefests, offer students fun and exciting opportunities to design, innovate and build teamwork and hands-on technical skills in a fast-paced learning environment.
One regular offering is Devils Invent, a series of engineering and design challenges organized by ASU’s Ira A. Fulton Schools of Engineering. The event pairs students with academic and industry mentors.
In 2020, students at the Devils Invent hackathon solved various drone-related challenges in homeland security, including developing countermeasures for an unmanned aerial vehicle threatening an airport or stadium. In 2023, the hackathon challenged students to design effective responses to homeland security threats in “soft locations” like churches, museums, schools, stadiums and other public places. The 2024 event focused on solving cascading problems, in which an unexpected event triggers a series of calamities. ASU students won first place in all categories, taking home $42,000 in cash prizes.
At the 2024 DEF CON, the world’s largest hacker convention, the 25-person Shellphish team, composed of “hackademics” from ASU, Purdue University, and the University of California, Santa Barbara, won the semifinal round of the DARPA AI Cyber Challenge, also known as AIxCC.
The Shellphish team collaborated on the development of an AI-based system called ARTIPHISHELL, which can automatically analyze the code that runs a piece of software, correct any security vulnerabilities found and then retest the system.
The AIxCC event was hosted by DARPA to spur the development of a cybersecurity system powered by artificial intelligence. Because of its desire to protect hospitals, pharmacies and medical devices from cyberattacks, the U.S. Advanced Research Projects Agency for Health, or ARPA-H, also collaborated on the competition and expanded the prize pool.
The Global Security Initiative has ramped up initiatives to draw young people to work in a critical, understaffed field by training the next generation of cybersecurity professionals in grades K-12.
GSI’s Cybersecurity Education Consortium reached students in elementary and middle school, introducing them to core cybersecurity concepts, generating interest in the topic and working to create a pipeline of local talent. The consortium partnered with industry, local schools and other units at ASU to deliver cybersecurity content and training.
The consortium collaborated with ASU’s New College of Interdisciplinary Arts and Sciences to host the first GenCyber camp at ASU in 2021. Throughout the week, high school campers learned from a variety of presenters, including industry professionals, ASU faculty and cyber employees from the Arizona Department of Homeland Security.
To address the gender gap in cybersecurity roles, ASU’s West Valley campus hosted an event called CybersecurityDay4Girls in 2019. The event was hosted in partnership with IBM to introduce middle school girls to the field of cybersecurity.
CybersecurityDay4Girls covered topics to help middle school students and their families stay safe online in an ever more connected world. The program also introduced more advanced concepts like cryptography and blockchain, providing students with a better understanding of cybersecurity as a career and encouraging them to consider pursuing it further.
While dangerous hackers are stealing our data and our dollars, Yan Shoshitaishvili, an associate professor of computer science and engineering in the Ira A. Fulton Schools of Engineering, has come to stop them.
Shoshitaishvili plans to fill the jobs pipeline with a well-qualified, dedicated cybersecurity workforce that can beat the hackers at their game.
With his innovative project, pwn.college — a distinctive combination of an educational curriculum, a competitive practice environment and a set of communication tools to help students learn collaboratively — Shoshitaishvili has developed an effective system to train the next generation of cybersecurity professionals.
On the pwn.college site, cybersecurity students from around the world complete programming modules and participate in hacking exercises to gain real insight into how attackers access secured systems. Today, pwn.college is used in over 100 countries and is on the path to becoming the gold standard for cybersecurity training.
ASU’s Center for Cybersecurity and Trusted Foundations, or CTF, was awarded a two-year, $4.5 million grant from DARPA to establish an institute that will educate future cybersecurity experts and address critical workforce shortages.
As part of GSI, the CTF team established the American Cybersecurity Education (ACE) Institute. The institute focuses on preparing current students for the toughest cybersecurity challenges and recruiting enough students to fill jobs in the future. One of its first steps was creating a master’s degree in cybersecurity offered by the School of Computing and Augmented Intelligence.
The Global Security Initiative contributed to ASU’s increased engagement in solving defense challenges. Since the initiative’s founding in 2015, ASU has more than doubled its research spending funded by the U.S. Department of Defense.
Over the last 10 years, GSI has built relationships with the DOD and other federal agencies and helped ASU researchers write effective proposals and secure grants. In addition, the initiative’s researchers and centers have earned many grants of their own to develop national security solutions.
As GSI helped ASU grow its defense portfolio, it paved the way for the university’s most recent recognition — selection to provide academic research support to the DOD Irregular Warfare Center. Working closely with the DOD, ASU researchers will seek to better understand current and emerging global trends in nontraditional warfare. So far, it is the biggest DOD award managed by GSI, with an anticipated $24 million over five years.
Irregular warfare refers to a broad spectrum of missions and activities that are often indirect and non-attributable, including unconventional warfare. The ASU-supported effort represents an intellectual investment to ensure America can compete effectively in this unseen arena.
GSI also helped connect Hongbin Yu, a professor of electrical engineering in the Ira A. Fulton Schools of Engineering, to a research opportunity with DARPA. With GSI’s proposal guidance, Yu’s team received a $1.5 million grant as part of the Next-Generation Microelectronics Manufacturing program. The team recommended ways to manufacture new, more efficient chip technology, called 3D heterogeneously integrated microelectronics, in the U.S.
GSI, as the university’s primary interface to the Department of Defense and the Intelligence Community, works closely with faculty and academic units across the university to amplify ASU’s engagement in defense challenges. GSI provides tailored training to researchers new to national security sponsors, helps shape concepts to meet defense mission needs, and connects faculty with potential research sponsors.
Past DARPA projects at ASU include a brain-drone swarm control interface; wearable exosuits, including a jet pack that helps wearers run faster; and robot swarms that can perform tasks in unpredictable environments where communications and GPS don’t work well.
GSI’s scientists were also instrumental in getting ASU involved in multi-university defense research projects.
In 2017, researchers from ASU joined five other universities to bring the age-old concept, “know your enemy,” to digital battlefields to combat advanced persistent cyber threats and other forms of cyber malfeasance.
The project brought together experts in computer science, cybersecurity, game theory and cognition to conduct research on defending against cyberattacks by profiling the attackers.
It was supported by a $6.2 million Multidisciplinary University Research Initiative award, granted to the six partnering universities by the Army Research Office.
Nancy Cooke, the senior scientific advisor for the Center for Human, Artificial Intelligence and Robot Teaming, explained the aim of the project in simple terms: “We’re trying to deceive the deceiver.” Cooke is also a professor of human systems engineering at ASU’s Polytechnic School.
Cooke’s role was to gather data on human behavior using her Cyber Defense Exercises for Team Awareness Research simulator. The lab, which seats six people, ran simulated cyberattack and defense scenarios with participating graduate students.
That data went to researchers at Carnegie Mellon University, who in turn created cognitive models of decision-making by attackers.
“What we’re doing is developing a personalized form of deception,” Cooke said. “We try to understand the attacker. Instead of using a generalized honeypot, we specialize the offense against them, creating an environment in which they don’t know what’s real and what’s not.”
In 2019, GSI helped secure ASU another spot in a multi-university defense project. DARPA gave $11.7 million to support the Cognitive Human Enhancements for Cyber Reasoning Systems (CHECRS) project. The project’s goal was to create a human-assisted autonomous tool that can find and analyze software vulnerabilities and also learn from its interactions with humans.
GSI helped the ASU team win $6.6 million of the award funding, and its researchers supported the team’s work. Ruoyu “Fish” Wang, the associate director of impact in GSI’s Center for Cybersecurity and Trusted Foundations, led the ASU team.
“While modern automated tools run on computers that calculate billions of times faster than a human brain, human security analysts still find the majority of software vulnerabilities,” Wang said. “This is because the knowledge and intuition that humans possess outweigh the speed of calculation when facing problems with extreme complexity, for example, finding software vulnerabilities.”
The CHECRS team wanted to create an autonomous tool that can be used by a variety of human assistants, not just cybersecurity experts. When humans of varying skills and expertise are at work, or when the machine needs help connecting dots using intuition, the automated tool can delegate tasks it’s not good at to the humans while it switches over to other tasks computers are optimized to perform.
Not only do humans help in the moment, but the automated tool also will incorporate what it learns from human contributions to continuously improve upon itself — both in how it interacts with its human partners and in its own ability to accomplish tasks.
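As a rough illustration of the delegation pattern described above, an automated analyzer might triage tasks by its own confidence, keeping what it handles well and routing the rest to human partners. This is a speculative sketch, not CHECRS code; the function names, the threshold and the toy confidence heuristic are all invented.

```python
# Illustrative sketch of machine-to-human task delegation.
# Names, threshold and heuristic are invented, not from CHECRS.
def triage(tasks, confidence, threshold=0.6):
    """Split tasks into machine-handled and human-delegated lists."""
    machine, human = [], []
    for task in tasks:
        # Low-confidence tasks go to the human queue
        (machine if confidence(task) >= threshold else human).append(task)
    return machine, human

# Toy confidence function: pretend longer task names are harder to automate
conf = lambda t: 1.0 / (1 + len(t) / 10)
machine, human = triage(["scan", "fuzz-loop", "interpret-crash-semantics"], conf)
```

In a real system the confidence signal would come from the analyzer itself, and, as the passage notes, the human's answers would feed back into the model rather than being discarded.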
The Global Security Initiative is unique in the defense research space because it works outside of traditional academic boxes. Its researchers bring many disciplines — engineering, social science, biology and more — to bear on complex problems. In fact, this interdisciplinary style was part of its design from the beginning.
In its first 10 years, that design has proven highly effective, and GSI now serves as a model for other institutions around the nation. The initiative has led 46 projects and received funding from over 50 different federal agencies. Such a scale of work and breadth of investment signal recognition that GSI’s pioneering approach to defense research is a promising path toward solving the world’s biggest security challenges.
At the Center for Human, Artificial Intelligence and Robot Teaming (CHART), researchers are developing autonomous systems to control robotic swarms — groups of small robots that work together to complete tasks.
To design these robot systems, the team is studying the ways that social creatures like bees, ants, birds and fish can coordinate and respond to challenges to achieve a common goal. Their robot swarms will mimic this unique behavior perfected by nature.
This technology has many applications, including for defense. The swarms could perform security surveillance, search-and-rescue missions, and detection of chemical, biological and nuclear materials. They could also help monitor weather and climate conditions, transport materials, and collect and transmit data from remote places like underwater or outer space.
Another goal is to enable these robots to perform dependably in distant, challenging environments where communications could be limited and unreliable.
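The nature-inspired coordination described above is often prototyped with the classic "boids" flocking rules: separation, alignment and cohesion. The sketch below is an illustrative toy, not CHART's actual control software; every parameter value and the NumPy-based structure are assumptions.

```python
# Toy flocking simulation using the three classic boids rules.
# All weights and radii are illustrative, not from CHART's systems.
import numpy as np

def flock_step(pos, vel, radius=2.0, sep_w=0.05, ali_w=0.05, coh_w=0.01, dt=0.1):
    """Advance every agent one step; pos and vel are (n, 2) arrays."""
    n = len(pos)
    new_vel = vel.copy()
    for i in range(n):
        # Neighbors within the sensing radius (excluding the agent itself)
        d = np.linalg.norm(pos - pos[i], axis=1)
        mask = (d > 0) & (d < radius)
        if not mask.any():
            continue
        neighbors = pos[mask]
        # Separation: steer away from crowded neighbors
        sep = (pos[i] - neighbors).sum(axis=0)
        # Alignment: match the average heading of neighbors
        ali = vel[mask].mean(axis=0) - vel[i]
        # Cohesion: drift toward the neighbors' center of mass
        coh = neighbors.mean(axis=0) - pos[i]
        new_vel[i] += sep_w * sep + ali_w * ali + coh_w * coh
    return pos + new_vel * dt, new_vel

rng = np.random.default_rng(0)
pos = rng.uniform(0, 5, size=(10, 2))
vel = rng.uniform(-1, 1, size=(10, 2))
for _ in range(50):
    pos, vel = flock_step(pos, vel)
```

The appeal of this design, and why swarm researchers borrow it from bees and starlings, is that each agent uses only local neighbor information, so the group needs no central controller and degrades gracefully when individual robots fail.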
The Center on Narrative, Disinformation and Strategic Influence frequently brings experts in cybersecurity, AI, communications, psychology and narrative studies together to understand the sociotechnical aspects of informational warfare.
In a project that aimed to detect AI-generated text in DARPA’s Semantic Forensics (SemaFor) program, researchers from the center partnered with ASU’s Walter Cronkite School of Journalism and Mass Communication to add a journalism perspective.
The team was tasked with determining whether news articles were written by a person or by an AI-powered text generator. Where the computer scientists were stuck, the journalists could pick up clues from elements like language, story structure and grammar style. Once the team input style-guide conventions as features in their AI-detection algorithm, their tool became the top performer, with the highest accuracy scores among competing algorithms.
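To make the idea of turning journalistic style cues into classifier features concrete, here is a speculative sketch. The specific features, the helper name and the example text are invented for illustration; this is not the SemaFor team's actual pipeline.

```python
# Crude stylometric features of the kind a newsroom style guide suggests.
# Feature choices are invented examples, not the SemaFor team's pipeline.
import re

def style_features(article: str) -> dict:
    """Extract simple numeric style features from an article."""
    # Naive sentence split on terminal punctuation
    sentences = [s for s in re.split(r"[.!?]+\s*", article) if s]
    words = article.split()
    return {
        "avg_sentence_len": len(words) / max(len(sentences), 1),
        # AP style spells out most numbers under 10; raw digits can be a tell
        "digit_ratio": sum(w.isdigit() for w in words) / max(len(words), 1),
        "quote_count": article.count('"'),
    }

feats = style_features('The mayor said "we will act." Nine agencies responded.')
```

Features like these would then be fed, alongside the usual language-model signals, into whatever classifier distinguishes human from machine authorship.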
Researchers in the Center for Accelerating Operational Efficiency created a simulation toolkit to help detect, deter and disrupt illegal passage into the country more efficiently. It works by identifying probable pathways individuals might take to cross national borders and is currently being tested in a pilot effort in several sectors in Texas.
The toolkit combines data visualization, human factors engineering and mathematical modeling to meet specific security needs on the ground. The underlying technology is based on the same modeling and algorithm approach that Google Maps uses to get people from point A to point B.
When using the toolkit, each station or sector will be able to change the modeling criteria and build its own customized profile for how individuals in that area may act, based on factors such as landscape, prior apprehension locations, and sensor and security camera positions.
Additionally, the toolkit utilizes open-source software so it can be field-deployed on a government laptop in environments that lack internet access.
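The Google Maps comparison above points at weighted shortest-path search. Below is a hedged sketch of that idea using Dijkstra's algorithm over a grid whose cell weights stand in for terrain difficulty; the grid, the weights and the function name are invented, and nothing here reflects the toolkit's real models or data.

```python
# Dijkstra's algorithm over a weighted grid, sketching the shortest-path
# idea behind route prediction. All data here is invented for illustration.
import heapq

def likely_path(cost, start, goal):
    """Return the minimum-cost path between two cells of a weighted grid."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    pq = [(dist[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    # Walk predecessors back from the goal to recover the path
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

# Higher numbers = harder terrain (e.g., steep ground, open visibility)
terrain = [
    [1, 9, 1, 1],
    [1, 9, 1, 9],
    [1, 1, 1, 9],
    [9, 9, 1, 1],
]
print(likely_path(terrain, (0, 0), (3, 3)))
```

The customization the passage describes would correspond to each sector tuning the cost weights, for example raising the cost of cells near sensor or camera positions, before rerunning the search.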
From its inception, GSI has positioned itself as a leader in national security research and solutions. The initiative’s broad recognition at the national level is reflected in the fact that GSI faculty are regularly appointed to national bodies dedicated to advancing technology for societal good.
GSI’s Executive Director Nadya Bliss has contributed to multiple National Academies efforts focused on technology for national security and has served on the Computing Community Consortium for over seven years, stepping into the role of chair in 2024.
The CCC council is a national group of over 20 computing experts from academia, industry and government advancing computing research in socially responsible ways.
Throughout her time on the council, Bliss has engaged extensively in national efforts around technology research, design and development, often discussing ways to more deliberately anticipate potential harms and mitigate their consequences before they take root in society.
“Technology is advancing at a rapid pace, yet security often remains a secondary consideration, despite all the examples we have of unforeseen harms stemming at least partly from new technologies,” Bliss said. “We need to prioritize security alongside capabilities, and that goal is really at the core of my national service commitments.”
In 2020, Bliss and fellow GSI faculty member Nancy Cooke were appointed to the Defense Advanced Research Projects Agency (DARPA) Information Science and Technology Study Group.
The group brings the brightest scientists and engineers in the country together to identify new areas of development in computer and communication technologies and to highlight potential future research directions to DARPA.
During her terms as vice-chair and chair of the DARPA ISAT Study Group, Bliss has focused on emphasizing interdisciplinary research, fostering more inclusion in DOD research initiatives and enabling more collaborations with partners.
Nancy Cooke, senior scientific advisor for the GSI Center for Human, AI and Robot Teaming, studies human-machine teaming, a key priority for DARPA as the DOD incorporates more machines and artificial intelligence into its daily and mission operations.
“Being involved in predicting surprise for DARPA through collaborations with very bright scientists and engineers is incredibly rewarding,” Cooke said. “The collaboration helps to move my thought processes forward and provides me an opportunity to branch out into areas that are new to me.”
Scott Ruston, former director of GSI’s Center on Narrative, Disinformation and Strategic Influence, also had the opportunity to lend his expertise to improve national defense.
Rear Adm. Ruston, who balanced his responsibilities at GSI with those of an officer in the U.S. Navy Reserve, departed GSI in 2023 to direct a new Navy initiative called Get Real, Get Better. The program helps the Navy’s more than 340,000 active-duty sailors and 100,000 reservists identify strengths and shortcomings and use that analysis to improve performance.
“How do we structure the delivery of this initiative in a way that comports with people's identity as sailors and aligns with the missions we're trying to achieve and the values of the Navy?” Ruston asked. “It’s one of the things I'm looking forward to, and I think there's quite a bit of my research background that I can bring to bear to be positive and successful.”
GSI has the opportunity to further amplify its voice in national conversations around security and defense thanks to the Ambassador Barbara Barrett and Justice Sandra Day O’Connor Washington Center.
Since opening the center in 2018, ASU has brought a multitude of its innovative programs to the heart of the country. GSI joins other centers and initiatives in the strategic location just two blocks from the White House, where it can lend a valuable research perspective to think tanks, federal research agencies, policy councils, nonprofit organizations and national associations.
$238M: Total research expenditures since 2015
$27.7M: Research expenditures in FY 24
51: Federal agencies providing funding to GSI
193: Total proposals submitted since 2015
46: Total projects (both completed and active)
Global Security Initiative launched
The new initiative will serve as a universitywide interdisciplinary hub for research on complex global security issues.
Partnering for digital security
Center for Cybersecurity and Digital Forensics and Samsung Electronics announce cybersecurity partnership.
2015–2017
Proactively protecting data
GSI launches the Center for Cybersecurity and Digital Forensics for a proactive cybersecurity approach.
Exploring the food-energy-water nexus
An interdisciplinary team studies how these systems are interconnected and builds tools to support sustainable policy decision-making.
ASU becomes national leader in DARPA Young Faculty Awards
The DARPA Working Group led by GSI helps junior ASU faculty members pursue the career-boosting DARPA awards.
ASU organizes largest hacking competition in the world for the first time
Center for Cybersecurity and Digital Forensics plays a lead role at the DEF CON conference, considered the Olympics of hacking.
DHS selects ASU to lead new Center of Excellence
GSI’s new Center for Accelerating Operational Efficiency will conduct research to improve the effectiveness of DHS organizations.
2018–2019
First ASU Congressional Conference on Cybersecurity
Panelists, including the late Senator John McCain, debate the definition of critical infrastructure, workforce needs and who defends what.
Analyst wins DOD Minerva Research Initiative award
Scott Ruston combines narrative analysis with computer science to identify and combat adversarial information operations.
GSI launches the Center for Human, AI and Robot Teaming
The center will provide much-needed research on coordinating teams of humans and synthetic agents.
ASU researchers help shape national security priorities
Nadya Bliss and Nancy Cooke join the DARPA Information Science and Technology Study Group to inform research and technology developments.
Major advancements as DARPA cybersecurity portfolio expands
Center for Cybersecurity and Digital Forensics gets a four-year DARPA grant to protect vulnerable software through micropatching.
GSI uses the popular video game to assess how AI and humans team in complex environments and situations.
A new open-source training platform, pwn.college, prepares the next generation of cybersecurity experts with the moves to thwart cyberattacks.
Measuring our trust in AI
The Center for Accelerating Operational Efficiency tests a tool that could help government and industry develop trustworthy AI technology.
ASU to support DOD Irregular Warfare Center
ASU is selected for the award with GSI leading execution. It's the biggest DOD award managed by GSI to date.
2020
Helping federal agencies respond to COVID-19
The Center for Accelerating Operational Efficiency and DHS help federal agencies respond to the pandemic and overcome challenges.
GSI launches the Center on Narrative, Disinformation and Strategic Influence to fight adversarial influence campaigns.
GSI proposal team helps secure DOD National Defense Education Program grant to develop biotechnology education for high schools.
DARPA grants to reimagine the microchip
Center for Wireless Information Systems and Computational Architectures Director Dan Bliss will develop new chips for better communication.
ASU team makes finals of DARPA’s AI Cyber Challenge
Cybersecurity team wins at semifinals of the competition at DEF CON, earning a spot in the finals and $2 million to continue developing their AI-based system.
The United States faces a dynamic and unpredictable global security environment, one defined by broadly available, rapidly advancing technology that can be leveraged by geopolitical competitors and non-state actors alike in efforts to undermine the goals and influence of the U.S. and allies. In this landscape, advancing new capabilities in critical technology areas is important, but not sufficient. These advances must be coupled with:
• Scalable training to ensure the national security workforce is up to speed on the latest developments in their field.
• Efforts with allies to develop new frameworks and ethical guidelines for use cases of emerging technologies like artificial intelligence and quantum computing.
• Rapid adoption and operation of new tools and systems by end-users.
ASU is uniquely positioned to respond to this multifaceted challenge. Few organizations in the world have access to the disciplinary breadth and technical depth of expertise at ASU, home to over 5,000 faculty, more than 100,000 students, state-of-the-art research and development facilities, and robust national and global partnerships necessary to make headway on complex security problems.
At the Global Security Initiative, we leverage ASU’s strengths to address national security mission needs, driving advancements on today’s pressing security priorities and preparing for the challenges of the future — from securing critical infrastructure to navigating a messy information environment to developing trustworthy AI that operates as a teammate to humans. And as we move into our second decade, we are astounded by the progress we are making on these and other challenges.
Moving forward, GSI will work with partners to expand its focus areas to new national security domains to ensure we are meeting the challenges of the moment — from predicting and preparing for the impacts of extreme weather events on defense missions to improving forensic attribution of biological threats to advancing the security of U.S. capabilities in space.
The milestones and achievements of the last 10 years are thanks to the hard work of our faculty, staff and students, and it is their continued dedication to GSI’s mission that will propel us to reach and exceed our goals in pursuit of a secure future for all.
“I want to emphasize that it's about the people and the mission. I really care that the work is rewarding and impactful. The scale changed — it's different when there's three of you versus when there's a hundred of you. But throughout our 10 years of growth, I think we've stuck with those principles.”
— Nadya Bliss Executive Director ASU Global Security Initiative