The Pervasive Cyberthreat
Policymakers and experts have long used a lexicon of warfare and disaster when warning about cyberthreats. James Adams, a co-founder of the cybersecurity firm iDefense, predicted in these pages as early as 2001 that cyberspace would be the "new international battleground," where future military battles would be won or lost. In the years that followed, US defense officials warned of a "cyber–Pearl Harbor," as Defense Secretary Leon Panetta put it, and a "cyber 9/11," as Homeland Security Secretary Janet Napolitano put it. In 2015, James Clapper, then the director of national intelligence, warned that the US must prepare for "cyber Armageddon," although he conceded that such an event was unlikely. In response to the perceived danger, officials argued that cyberspace should be treated as a "domain" of combat, with "key terrain" that the US needed to seize or defend.
In the 20 years since Adams's warning, cyberthreats and cyberattacks have proved highly consequential, but not in the way most predictions anticipated. Cyber-spying and theft have amassed petabytes, exabytes, even zettabytes of sensitive and private data. Cyber-enabled information operations have endangered elections and sparked major social movements.
Cyberattacks on businesses have cost hundreds of billions of dollars. Yet while the cyberthreat is real and growing, expectations that cyberattacks would produce large-scale physical effects akin to those of a surprise bombing on American soil, hurl states into violent conflict, or determine who won or lost on the battlefield have not been borne out. By likening the cyberthreat to physical warfare, policymakers overlooked a far more insidious danger: the way cyber-operations corrode people's trust in markets, institutions, and even national authority.
Diagnosing the threat correctly is critical, in part because the diagnosis shapes how nations invest in cybersecurity. Focusing primarily on single, potentially catastrophic events and on the physical effects of cyberattacks leads states to overprioritize capabilities designed to protect against "the big one": large-scale responses to catastrophic cyberattacks, offensive measures that produce physical destruction, or punishments reserved for attacks that cross a strategic threshold. Such capabilities and responses do little to guard against what cyberattacks actually do: erode the trust that underpins contemporary economies, societies, governments, and militaries.
If trust is at stake, and it has already been severely eroded, then nations must take different actions to survive and operate in this new reality. The best way to avoid a "cyber–Pearl Harbor" may be to prevent it, but the best way to maintain trust in a digital world, given that cyberattacks are all but inevitable, is to build resilience and thereby foster confidence in today's institutions of commerce, government, military power, and international cooperation. States can improve their resilience by building redundant connections between people and between networks, selectively deploying analog systems where they are needed, and investing in procedures that allow for manual and human intervention. The key to long-term success in cyberspace is not to find a way to stop every cyberattack but to learn to survive the disruption and damage that such attacks create.
A "cyber 9/11" has yet to occur in the United States, and a cyberattack with immediate, devastating physical consequences remains unlikely. But Americans' trust in their government, their institutions, and even their fellow citizens is eroding fast, threatening the basic foundations of society. Cyberattacks exploit these weaknesses, sowing distrust in information, creating uncertainty and anxiety, and amplifying hatred and disinformation. This cyberthreat to trust will only become more existential as people's digital dependencies expand and the links among technology, people, and institutions grow more fragile. Policymakers should worry about this looming dystopian future and do everything necessary to prevent it.
THE TIES THAT BIND
In economies, societies, and the international system, trust can be described as "the deep conviction in the dependability, truth, competence, or strength of someone or something." Trust allows individuals, companies, and nations to delegate duties or obligations, freeing up time and resources for other activities and enabling them to cooperate rather than act alone. It is the glue that holds complicated relationships together, allowing markets to become more sophisticated, governments to scale to larger populations and harder problems, and states to trade, collaborate, and live within more complex alliance arrangements. "Extensions of trust... enable coordination of acts over wide domains of space and time, allowing more complex, differentiated, and diverse societies to benefit," argues the political scientist Mark Warren.
Those extensions of trust have played a critical role in human progress at every level. Sociologists describe small, isolated, and autocratic societies as relying on "particularized trust": faith only in known persons. Modern, networked states require "generalized trust," which extends beyond known circles and allows actors to delegate trust to people, organizations, and processes with whom the truster is not personally acquainted. Generalized trust enables complex market interactions, community involvement, trade, and cooperation among states; particularized trust produces loyalty within small groups, distrust of outsiders, and apprehension toward unfamiliar processes and institutions.
Without the trust that allows responsibility to be handed off to another entity, the contemporary market, for example, would not exist. People have faith in the value of currencies, in the ability of institutions to preserve and safeguard assets, and in the fulfillment of IOUs such as checks, credit cards, and loans. When people and businesses trust the financial system, wages, profits, and employment rise. Trust in property rights laws facilitates trade and economic growth. This generalized trust is even more vital in the digital economy. People no longer store gold in bank vaults. Instead, modern economies consist of complex webs of digital transactions in which users must trust not only that banks will protect and secure their assets but also that the digital medium, a series of ones and zeros linked together in code, will translate into actual value that can be used to purchase goods and services.
Social capital—the shared norms and interwoven networks that, as political scientist Robert Putnam memorably argued, contribute to more peaceful and wealthy communities—requires trust. The generalized trust that underpins social capital permits voters to shift responsibility for representing their interests to proxies and institutions. Voters must have faith that their representatives will advocate for them, that votes will be properly recorded and counted, and that the institutions that draft and enforce laws will do so fairly.
Finally, trust underpins how nations build national power and, ultimately, how they engage within the international system. It permits civilian heads of state to delegate leadership of the armed forces to military commanders, who may in turn exercise decentralized command over lower-level military operations and tactics. States with a high level of civil-military distrust are less likely to win wars, in part because trust shapes a regime's willingness to grant lower-level military units autonomy in combat. The political scientist Caitlin Talmadge, for example, has shown how Saddam Hussein's efforts to coup-proof his military, through frequent reassignment of officers, restrictions on foreign travel and training, and promotion incentives that rewarded loyalty to the regime, hampered an otherwise well-equipped Iraqi military. Trust also allows militaries to experiment and train with new technologies, making them more likely to innovate and produce breakthrough improvements in military power.
The stability of the international system also depends on trust. States rely on it to strike trade and arms control agreements and to feel confident that other countries will not launch a surprise attack or invasion. Trust promotes international cooperation and prevents arms races by enabling governments to share information, thereby avoiding the suboptimal outcome of a prisoner's dilemma, in which states choose conflict over cooperation because they cannot share the necessary information. The Russian proverb "Doveryai, no proveryai" ("trust, but verify") has governed arms control talks and accords since the Cold War.
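The prisoner's dilemma logic can be made concrete with a toy payoff table. The sketch below is illustrative only: the payoff numbers are hypothetical, chosen to satisfy the standard ordering (temptation > reward > punishment > sucker's payoff), and are not drawn from any real arms control scenario.

```python
# Illustrative prisoner's dilemma between two states that can
# "cooperate" (share information, honor an agreement) or "defect".
# Payoff numbers are hypothetical; only their ordering matters.
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),  # mutual restraint
    ("cooperate", "defect"):    (0, 5),  # the trusting state is exploited
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),  # arms race / conflict
}

def best_response(opponent_move):
    """Return the move that maximizes a state's own payoff,
    given what it expects the other state to do."""
    return max(["cooperate", "defect"],
               key=lambda my: PAYOFFS[(my, opponent_move)][0])

# Without a trusted way to share information and verify commitments,
# each state defects no matter what it expects of the other...
assert best_response("cooperate") == "defect"
assert best_response("defect") == "defect"
# ...even though mutual cooperation (3, 3) beats mutual defection (1, 1).
```

This is why verification mechanisms matter: they change what each side expects of the other, making the cooperative outcome reachable.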
In sum, today's world relies on trust more than ever before. That is due in large part to the widespread adoption of information and digital technologies by contemporary economies, communities, governments, and militaries, whose virtual character heightens the importance of trust in daily operations. This happens in several ways. First, the advent of automation and autonomous technologies, whether in traffic systems, financial markets, health care, or military weapons, requires a delegation of trust in which the user counts on the machine to perform a task safely and appropriately.
Second, users must trust that data is stored where it should be, that its values are what they expect, and that it will not be altered. Digital social media platforms, moreover, create new trust dynamics concerning identity, privacy, and authenticity. How do you know you can trust the people who create information, or that your social interactions are with real people? How confident are you that the information you share with others will remain private? Users' reliance on digital technology and information has produced extraordinarily complicated trust relationships.
The degree of trust required to conduct these online interactions and transactions creates a massive target. Cyber-operations sow disbelief in how, or whether, a system works. An exploit that takes advantage of a security flaw in a computer system could, for example, let an attacker take control of a pacemaker, spreading distrust among patients who depend on the device. A backdoor in a microchip might allow bad actors to gain access to smart weapons, creating doubt about who controls them. Cyber-operations can also call into question the integrity of data or the algorithms that make sense of it. Are voter registries up to date? Is the AI-enabled strategic warning system displaying an actual missile launch or a glitch in the computer code? Living in a digital society can likewise undermine confidence in the ownership or control of information: Will your photos stay private? Is your organization's intellectual property safe? Did government information related to nuclear weapons fall into the hands of an adversary? Finally, by manipulating social networks and relationships, cyber-operations breed distrust and erode social capital. Online personas, bots, and misinformation campaigns complicate individuals' ability to trust both information and one another. All of these cyberthreats can eat away at the foundations of markets, societies, governments, and the international system.
The digitally dependent economy is especially vulnerable to this erosion of trust. As the contemporary economy has moved online, cyberthreats have grown more sophisticated and pervasive. Estimates of the total economic impact of cyberattacks run from hundreds of billions to trillions of dollars per year. But the threat to the contemporary economy is more than the financial toll of these attacks; it is the way repeated attacks instill doubt about the integrity of the system as a whole.
The public's reaction to the ransomware attack on the American energy company Colonial Pipeline offered a vivid example. The pipeline, which supplies roughly 45 percent of the East Coast's fuel, was shut down in May 2021 by a criminal organization known as DarkSide, which demanded a ransom that the company eventually paid. Although the attack had only a minor effect on the company's capacity to deliver fuel to its customers, panic ensued: people rushed to gas stations with fuel canisters and even plastic bags, creating an artificial shortage at the pump. This kind of distrust, and the instability it creates, endangers not just the digital economy but the entire economy.
The inability to protect intellectual property from cybertheft has comparable ramifications. Hacking into a company's network to steal intellectual property or trade secrets has become a profitable criminal enterprise, one that governments such as China's and North Korea's use to catch up with the United States and other countries at the technological frontier. North Korea notably hacked Pfizer in an effort to steal its COVID-19 vaccine technology, while Chinese exfiltration of research from the US defense industrial base has produced copycat advances in aircraft and missile development. The more widespread and effective such attacks become, the less firms can trust that their research and development will pay off, eventually undermining knowledge-based economies. Nowhere are the dangers to trust more pervasive than in online banking. If people lose faith in the security of their digital data and money, the entire intricate contemporary financial system could implode. Meanwhile, the shift toward cryptocurrencies, most of which are not backed by government guarantees, raises the stakes of trust in the value of digital data even further.
Attacks on trust also threaten societies and governments. Schools, courts, and local governments have all become ransomware targets, their systems shut down or rendered useless unless the victim pays up. Virtual schools, access to court documents, and local emergency services are all in the crosshairs. The immediate impact of these attacks may degrade some governance and social functions, but the longer-term danger is that diminished trust in the integrity of data held by governments, whether marriage records, birth certificates, criminal records, or property records, can erode trust in society's basic functions. Democracies' dependence on information and social capital to generate trust in institutions has proved particularly vulnerable to cyber-enabled information operations. State-sponsored campaigns that raise questions about the integrity of governance data (such as vote tallies) or that fracture communities into small groups of particularized trust give rise to the kinds of forces that foment civil unrest and threaten democracy.
Cyber-operations can also jeopardize military power by undermining faith in contemporary weaponry. As digital capabilities advanced, beginning with the microprocessor, states came to rely on smart weapons, networked sensors, and autonomous platforms. As militaries grew more technologically sophisticated, they also grew more vulnerable to cyberattacks that threaten the security and functionality of these smart weapons systems. The genuine risk is not that cyber-operations will function like a bomb; it is that cyberattacks will make it impossible to trust that physical bombs will perform as intended. That trust becomes even more vital as militaries move farther from the battlefield through remote operations and as leaders delegate authority to autonomous technologies. Can militaries trust that cyberattacks on autonomous systems will not render them useless or, worse, cause fratricide or civilian deaths? Moreover, for highly networked militaries such as that of the United States, lessons drawn from the early information age produced doctrines, campaigns, and weapons that depend on complex distributions of information. Absent trust in information or the means by which it is disseminated, militaries will be stymied, awaiting new orders and unsure of how to proceed.
Taken together, these dynamics challenge the fragile trust mechanisms that promote peace and stability in the international system. They make commerce less likely and arms control and state-to-state communication more difficult. The rise of cybertools for spying, attack, and theft has deepened this distrust. Offensive cyber-capabilities are difficult to monitor, and the lack of norms about the appropriate use of cyber-operations makes it hard for states to trust that others will exercise restraint. Are Russian hackers probing U.S. power grids in preparation for a coming cyberattack, or are they merely mapping vulnerabilities with no intention of exploiting them? Are the United States' "defend forward" cyber-operations a pretext for aggressive strikes on Chinese or Russian command-and-control systems, or a genuine attempt to prevent attacks on US networks? Meanwhile, the use of mercenaries, middlemen, and gray-zone activities in cyberspace complicates attribution and the discernment of intent, endangering international trust and collaboration. Israeli spyware supporting Saudi efforts to stifle dissent, off-duty Chinese military hackers, and criminal groups that the Russian state tolerates but does not officially fund all make it difficult to attribute an operation cleanly to the state behind it. Such intermediaries also undermine the prospect of public agreements among nations on what constitutes appropriate conduct online.
LIVING WITH FAILURE
So far, American responses to cyberthreats have concentrated mostly on the cyberspace side of the problem: deterring, defending against, and destroying cyberthreats as they assault their targets. But these cyber-focused methods have struggled and even failed. Cyberattacks are on the rise, the efficacy of deterrence is debatable, and offensive approaches cannot stem the flood of small-scale strikes that threaten the world's digital underpinnings. Massive exploits, such as the recent breaches of SolarWinds' network management software and Microsoft Exchange Server's email software, are less a failure of US cyberdefenses than a symptom of how the targeted systems were designed and built in the first place. The objective should not be to prevent all cyberattacks but to develop systems that can survive them. This is not a new idea. When cannons and gunpowder arrived in Europe in the fourteenth and fifteenth centuries, cities struggled to withstand the assault of these more powerful weapons. In response, states transformed their fortifications, digging ditches, building bastions, raising cavaliers, and erecting massive polygonal edifices, all with the goal of creating towns that could survive a siege rather than preventing cannon fire from ever occurring. The best fortifications enabled an active defense, wearing down attackers until a counterattack could destroy the armies outside the city.
The fortification analogy invites a cyberstrategy that focuses on the system itself, whether a smart weapon, an electric grid, or an American voter's mind. How do you design systems that can function in a world in which trust has eroded? Network theory, which studies how networks thrive, fail, and survive, can help. According to studies of network resilience, the strongest networks are those with a high density of small nodes and many paths between nodes. Highly resilient networks can survive the loss of multiple nodes and links without breaking down, whereas concentrated networks with fewer paths and nodes have a lower critical threshold for degradation and failure. If economies, societies, governments, and the international system are to survive serious erosions of trust, they will need more bonds and links, fewer dependencies on central nodes, and new ways to reconstitute network components even while under attack. Together, these qualities will sustain generalized trust in the integrity of the system. How can states build such networks?
First, the networks and data structures that support the economy, vital infrastructure, and military power must emphasize resilience at the technical level. That requires decentralized and dense networks, hybrid cloud architectures, redundant applications, and backup mechanisms. It means planning for network failure and training personnel to adapt and continue to provide services even in the midst of a cyberattack. It means relying on physical backups for the most crucial data (such as votes) and on manual alternatives for operating systems when digital capabilities are lost. For some highly sensitive systems (nuclear command and control, for example), analog alternatives may provide extraordinary resilience, even if they are less efficient. The design of digital capabilities and networks should be driven by the distinction between binary trust (trusting the system to work perfectly or not at all) and a continuum of trust (trusting the system to function at some level between zero and 100 percent). These design decisions will bolster user trust while reducing the incentives for criminal and state-based actors to conduct cyberattacks.
Improving the resilience of essential infrastructure and military power against cyberattacks will also benefit global stability. Because they can bounce back quickly, more resilient infrastructure and societies are less susceptible to systemic, long-term damage from cyberattacks. States, in turn, are less likely to launch a preemptive cyberstrike against an opponent, since the efficacy of their attacks and their ability to coerce the target population would be in doubt. Faced with a difficult, expensive, and perhaps futile campaign, aggressors are less likely to see the benefit of risking the cyberattack in the first place. Moreover, governments that emphasize resilience and endurance in their digitally equipped armed forces are less likely to resort to first-strike or offensive operations, such as long-range missile attacks or preemptive campaigns. The security dilemma, in which states that would otherwise not go to war find themselves in conflict because they are uncertain about each other's intentions, suggests that when states focus more on defense than offense, they are less likely to spiral into conflicts born of distrust and uncertainty.
Solving the technological problems, however, is only part of the answer. The most crucial trust relationships that the Internet threatens are society's human networks: the bonds among individuals, neighbors, and citizens that allow them to work together to solve problems. Making these human networks more durable requires solutions even more sophisticated and demanding than the technical ones. Cyber-enabled information operations target the relationships that establish trust between individuals and communities. They sabotage these broader bonds by encouraging concentrated networks of particularized trust, for example, social media platforms that gather like-minded people or misinformation campaigns that foster in-group and out-group splits. Algorithms and sensationalism designed to provoke outrage only deepen these divisions and undermine trust in those outside the group.
Governments can try to regulate these dynamics on social media, but such virtual enclaves mirror real-world social divisions. There is also a feedback loop: online distrust spills into the real world, further dividing people into "us" and "them." Countering this requires education and political engagement, as well as the bowling leagues that Putnam argued were needed to rebuild Americans' social capital (his book Bowling Alone was published in 2000, just as the Internet was gaining traction). It is time to reenergize physical communities; time for neighborhoods, school districts, and towns to come together to repair the bonds and links that were severed to save lives during the pandemic.
The truth is that these schisms existed in American society long before the pandemic or the Internet intensified them. The solution, then, will not come from social media, platform CEOs, or technical fixes. It will take courageous local leaders who can rebuild trust from the ground up, finding ways to bring fractured communities back together. Reconnecting in person will require disconnecting more often from the Internet and the synthetic communities of particularized trust that have emerged there. Civic education can help by reminding people of their common ground and shared aims and by cultivating critical thinkers who can push for change within democratic institutions.
The cliché of "death by a thousand cuts" is often applied to cyber-operations, but a better parallel may be termites, which hide in the crevices of foundations and steadily eat away at the very structures that sustain people's lives. The earlier strategic focus on one-off, large-scale cyber-operations produced bigger and better cyber-capabilities, but it ignored the fragility of the foundations and networks beneath them.
Will cyberattacks ever have the significant physical consequences that have been predicted for the past two decades? Will a policy that emphasizes trust and resilience leave nations especially vulnerable? Of course, it is impossible to guarantee that no cyberattack will ever have large-scale physical consequences comparable to those of the bombing of Pearl Harbor. But such an attack is doubtful, because cyberspace's virtual, transitory, and ever-changing character makes it difficult for attacks to have lasting physical effects. And strategies that focus on trust and resilience, by investing in networks and relationships, make these kinds of attacks far harder to pull off. Building networks that can withstand repeated, smaller attacks thus has a happy side effect: greater resistance to one-off, large-scale attacks.
None of this is simple, and approaches that prioritize resilience, redundancy, and endurance over convenience, or over preventing and destroying cyberthreats, carry considerable costs in efficiency. The initial expense of these trust-building measures also falls disproportionately on democracies, which must cultivate generalized trust rather than the particularized trust on which autocracies rely for power. That can be a hard pill to swallow, especially as China and the US appear to be hurtling toward a more competitive relationship.
Despite the challenges and costs, democracies and modern economies such as the United States must prioritize building trust in the mechanisms that keep society running, whether the electric grid, banks, schools, voting machines, or the media. That means devising backup plans and fail-safes, making strategic judgments about what should be online or digital and what should remain analog or physical, and constructing networks, both online and in society, that can withstand an attack on a single node. As long as a stolen password can shut down an oil pipeline and a fake social media account can sway the political beliefs of thousands, cyberattacks will remain too profitable for autocracies and criminal actors to ignore. Unless greater resilience, both technical and human, is built in, the cycle of cyberattacks and the distrust they generate will continue to erode the foundations of democratic society.