Fostering Accountability in Cyberspace

Summarizing key themes of discussion from a recent workshop on “Fostering Accountability in Cyberspace”

The Stimson Center held a scoping workshop on May 4, 2023, in partnership with the European Union Institute for Security Studies (EU Cyber Direct) and hosted by the Embassy of Switzerland to the United States. This workshop sought to identify gaps and opportunities for advancing accountability in cyberspace by fostering multi-stakeholder dialogue in the field. It was also an opportunity for the Stimson Center to present its newly launched Cyber Accountability project. The workshop brought together a diverse group of cyber security professionals, including governmental representatives, policy experts, researchers, civil society organizations, international organizations, and representatives of technology companies.

The one-day workshop included three substantive sessions, in addition to a brief presentation of Stimson’s accountability project and small group discussions. It was held under the Chatham House Rule. Below are six themes that emerged from the discussions across all sessions and that may offer practical approaches to applying norms, law, and accountability in cyberspace.

1. Think bigger. Broad accountability in cyberspace is needed from all stakeholders; however, this means more clearly defining acceptable and unacceptable behavior for many actors.

Conventional thinking on “accountability” needs to be broadened beyond simply attributing malicious acts and punishing perpetrators. Accountability is needed both to reinforce the positive actions that can be taken to support cyber security and to ensure that negative acts are deterred, disrupted, and disincentivized. States, providers, and users of cyberspace need to be held accountable for relevant commitments, including implementation of the agreed, voluntary, and non-binding UN norms. These norms define acceptable and unacceptable behavior, with implications even beyond the purview of states.

One of the first requirements is to identify and understand the differing roles that diverse stakeholders play in fostering accountability. This includes building awareness of, and supporting the implementation of, agreed norms and law.

Non-governmental stakeholders also have an important role to play in fostering accountability. Some actors can be held accountable through peer reviews and other mechanisms, which may be initiated by, or involve, non-governmental stakeholders. Tech providers can develop transparent industry performance principles as an integral part of development rather than an add-on. Insurers can develop more standardized processes to hold their clients accountable, for example, through compliance with certain standards and vulnerability risk assessments. More broadly, the enterprises that comprise the layers of the internet — from IP addresses to routing — should also be part of accountability efforts. Other stakeholders have a role to play in reporting about cyber harm and raising awareness about responsible behavior and legal obligations.

2. Understand the differences. A clearer understanding of both the common visions shared by all stakeholders and the limitations inherent in their differences is important.

Building on the first theme, workshop participants also touched on the fact that states and stakeholders will have differing priorities and capacities in relation to accountability initiatives.

For instance, cybercrime is more of a concern for some states than others. States often underestimate the costs of cybercrime, resulting in lower prioritization of initiatives to address it both domestically and internationally. States and stakeholders also hold differing views on the trade-off between online anonymity and privacy on the one hand and reducing online harms, including surveillance, on the other. A process to openly discuss these trade-offs is needed, including within the context of the human rights framework.

Regions can, and already are, developing their own models of information sharing and accountability. As discussed, some Latin American countries are taking the lead on developing legislation and establishing cyber security agencies, generating examples and expertise that can be shared in the region. In Africa, the existing African Peer Review Mechanism might be leveraged. There may also be opportunities to replicate homegrown measures of accountability that have been demonstrated to work on other issues.

There is also an underlying tension in valuing security vis-à-vis perceived growth. This is particularly evident in the Global South, where the need for digital growth may be prioritized at the expense of cyber security. Consequently, gaining access to inexpensive information and communication technologies for socio-economic development can incur a cost when cyber incidents occur, impacting both the global commons and longer-term goals of stability and security. More clearly linking a stable and secure cyberspace to the UN’s Sustainable Development Goals could inspire increased interest in norms implementation.

Other, non-cyber elements of bilateral and multilateral state relationships can affect cooperation in the cyber domain. Domestically, states also differ in their values and in how those values drive decisions. Even within regional groups and blocs, conflict over some security issues exists, e.g., the tension between commercial profitability and human rights concerns.

3. Capacity building is needed across the spectrum of stakeholders in order to foster accountability.

Many participants stressed the importance of capacity building, given the vast disparities in knowledge and resources in this area. Capacity building should be planned and carried out in accordance with states’ desired outcomes rather than their current capabilities. A gap analysis of national needs and priorities, as well as a common vision of achievable objectives for different stakeholders, would assist states in identifying the most important and viable areas to undertake capacity building in the near term.

It was noted that other frameworks exist in addition to the UN framework. These need closer scrutiny and application so that they can be built upon and synergies established. Examples include the Global Digital Compact, evolving industry principles, and the development of standards and initiatives, such as the Paris Call and the Siemens Charter of Trust. Support for non-governmental stakeholder governance models and collaborations is growing.

Capacity building by and for all types of stakeholders is needed. To hold actors to account, it is essential to possess the requisite resources, skills, and authority. Citizens and non-governmental organizations (NGOs) need assistance understanding and fulfilling their roles in cyber accountability. For states seeking to embody responsible behavior, more detailed principles and guidance are needed. Capacity building should seek to bridge the gap between those with authority (actors with the agency and access, but not the capabilities, to achieve objectives) and those with resources (actors with the capabilities and expertise, but not the authority, to act).

4. Effective Accountability Requires Certain Elements as Prerequisites

Definitions and clarifications

The issue of definitions surfaced repeatedly in the workshop. Agreed definitions of certain terms and types of operations could bring clarity in some contexts, while others noted this might only encourage malicious actors to pursue operations in a grey zone. The political and other challenges of reaching agreed definitions were also noted. Some believed that clear definitions, combined with strong yet unspecified responses, could deter malicious behavior. At present, strategic ambiguity appears to be the favored approach. These concerns raised questions about whether and how to shift toward an incentive framework that rewards clarity and commitment. More detailed principles and guidance are needed for states to understand how to demonstrate positive behavior, not just acceptable behavior.

Many in civil society seem to want more clarity from states and technology companies, particularly regarding operating standards, principles, and what constitutes responsible behavior. They aspire to have more input into policy-making discussions. Questions were also raised about how key human rights instruments and legal commitments are reflected in this context. Companies have considerable control, for instance, over targeting algorithms. What controls over social media should be considered, including over ownership and liability? More exhaustive mapping of how human rights are affected by various cyber operations and activities was suggested, particularly of operations that may have a less immediately obvious human rights dimension. Some efforts on this are underway, especially within the human rights community, and were referenced in passing during the workshop.

When defining harm, it is important to recognize the various types and levels of harm. It is also necessary to identify incidents that have a significant impact as well as those that have relatively lower-level effects but cumulatively affect a larger number of individuals. Specific groups that suffer harm, including at-risk populations, should be identified rather than neglected. Harm should thus be assessed from multiple perspectives.

States should prioritize forging international agreements that define the most consequential risks, especially for potentially vulnerable groups, and set objectives for managing those risks. This will help states define what controls might be set on behavior, with allowance for uncertainties and for the deleterious secondary effects of regulation, such as impacts on small businesses.

Making cyber security a priority requires incentives and new approaches.

It will be necessary to find ways to elevate cyber security investment as a key priority for states and stakeholders. Positive incentives are needed to adopt and operationalize norms, and to interpret and apply international law. States can be supported in prioritizing security investments through explicit means, such as technology transfers, infrastructure development assistance, or sanctions relief. The true cost of malicious cyber incidents, including losses in time, money, and reputation, is not well appreciated or documented. Sustained investment in the cyber security of states and other organizations can pay dividends, including for economic development.

With adjustments in coverage, insurance could also play a role, as underwriters develop clearer understandings of risks and collaborate among themselves. Insurers also have an interest in defining systemic risks and warlike acts in order to better manage their exposure. To give the insured the benefits of good security investments, the industry could explore standardized safety certificates, coordinated stress tests, vendor reviews, liability reforms, as well as the adoption of international frameworks and collaborative partnerships among businesses and states.

Incentives for the private sector and civil society to be resilient are needed as well; these will necessarily be more tailored to their specific needs and interests.

5. Collaboration! States and stakeholders working together is key to improving security in cyberspace.

Cyber accountability is difficult given attribution challenges, such as plausible deniability, and because criminal and political motives for malicious acts often overlap.

Industry can and already does play an apolitical role in technical attribution. Technical organizations and others are providing characterizations of different threat actors that can help identify types/origins of actors, if not always specific actors and their motivations.

Technical attribution alone is insufficient, however. To be effective, political responses to malicious acts require collaboration and multilateralism, with like-minded states working together to call out malicious behavior, even without always calling out specific actors. Yet, states have different processes for pursuing political attribution or in how they coordinate with one another.

Some recommend that attribution statements should also describe the human harm caused by an incident or operation, thus bringing in the concerns of civil society. Ways to discern and measure harms need to be further explored. For example, technology companies can build an evidence base for cybercrime incident tracking, which could also help assess harm and human impact.  Additionally, a clearer idea is needed of how judicial systems can and already are advancing cyber accountability.

Deterrence can be achieved through means other than attribution; coercive diplomacy and incentives, including linking cyber issues to broader political negotiations, were referenced.

Efforts are already underway to further existing collaboration. Among the suggestions and references made in the workshop are:

  • Intelligence sharing could be broadened to include elements that help identify malicious actors’ motives, their use of tools, and their exploitation of vulnerabilities, in ways that do not necessarily disclose intelligence sources.
  • Cooperation on law enforcement. National judicial institutions are working together more to take down international groups of malicious cyber actors. The international collaboration against ransomware is noteworthy.
  • Existing regional coordination and cybersecurity directives have spillover effects in other global regions, where some countries choose to adopt self-constraining regulatory approaches developed elsewhere.
  • Concrete steps should be taken to reduce barriers to such cooperation and broaden collaboration. This may only happen on a multilateral and multi-stakeholder level. Yet globally, accountability is achievable in some forms. Shifting some of the focus on accountability from the norms calling for restraint to those focusing on positive behaviors will help build a safer, more secure cyberspace.

6. Research! Lessons can be taken from non-cyber risk areas.

The workshop provided insightful advice on Stimson’s efforts to identify lessons learned from non-cyber threats and risk areas. Among the research areas addressed were:

  • Arms control: Although arms control has its limits in terms of applicability to cyberspace, some cyber areas might be amenable to limitations, such as spyware and espionage. In addition to limiting capabilities, limits on the use of capabilities can certainly be pursued.
  • The global commons: Some domains such as air, space and the seas have had to develop rules for managing international conduct in the commons. These are areas that have parallels to cyberspace and should be considered for research.
  • Consensus/agreement process: The Stimson project should examine how consensus and agreement happen in other risk areas (regionally, bilaterally, and with or within industry). The research should look at what changed decision-making, the processes involved, and where cyberspace might develop similar processes for agreement.
  • Harm: In looking at harm, consider how other agreements have defined it and how definitions might differ for cyber, where harm is harder to quantify.
  • Crisis management: Given the inevitability of cyber crises, lessons from other areas on crisis management are needed. The environmental field and the COVID-19 pandemic might provide such lessons.
  • Shared capability: Another area to consider is how capability dependency has worked elsewhere, e.g., reliance on others’ intelligence in order to act.

These were important points not only on where to look for parallel lessons but also on what elements to research. 
