Nearly 35 years after the birth of the World Wide Web, the relationship between the internet, emerging technology, and democracy has never looked more uneven or uncertain. On the one hand, digital connectivity has been a boon for democratic speech and participation, allowing people all around the world to organize, scrutinize governments, and make their views heard. On the other hand, the digital domain appears to be an increasingly fractious and wild frontier where threats to human security and anti-democratic practices are on the rise.
Autocrats now use digital technologies to surveil and control their populaces. Artificial intelligence raises the risk that discrimination and inequality will become more entrenched. Global data flows are projected to triple by 2026, even as the connectivity divide between rich and poor countries and communities grows. Online platforms built on algorithms designed to capture attention and amplify divisive content are fueling political and social polarization, reducing trust in institutions, and causing an information crisis, all of which destabilize democracies and contribute to political violence around the world.
The digital domain's novelty, complexity, and rapid innovation make it challenging for governments to keep up with and efficiently regulate emerging technology. As a result, states, organizations, and communities are increasingly divided about the nature of sovereignty, privacy, public goods, and human security. Yet, current venues for resolving disputes over tech governance are too limited in scope and ambition to provide the multiplicity of stakeholders who have a vested interest in outcomes with meaningful avenues to shape the digital future.
On aims, there appears to be broad agreement among open societies. Governance should focus on preventing, managing, and resolving digital harms while preserving human rights and ensuring security. At the same time, it should foster innovation, creativity, and open collaboration, ensuring that digitization serves as a driver of human prosperity. Less evident is what types of governance can achieve such goals, especially given the contradictions among them, not to mention the difficult political and technical hurdles of establishing and empowering such institutions amid power disparities, rapid technological change, and a fracturing global order. All of this leaves open to question what it means when technologists, corporations, governments, communities, and citizens say they want "democratic digital governance."
In January, New America's Planetary Politics program joined with colleagues at the University of Denver's Korbel School and other partners to explore this question and the challenges of preventing, mitigating, and managing digital harms. Two dozen leading academics and civil society leaders with experience in digital rights and governance from North America, Latin America, South Asia, and Europe attended the workshop. Our conversation began with a debate about what "democratic governance" means in the context of the digital domain. We landed on a working definition with two mutually reinforcing parts. "Governance" refers to the systems and processes that shape how applications and technologies work. "Democratic" refers to participatory oversight of those systems, not to a political system with elected representatives.
We then considered three models for achieving democratic governance. The first was the "gatekeeper" model, which focuses on empowering institutions to police digital space—that is, to establish and enforce norms, regulations, and other safeguards to prevent harm and defend rights. Governments, particularly democratic ones, are traditionally regarded as the most legitimate gatekeepers. The second model stresses increased public participation as a means of achieving democratic governance goals. Policies, institutions, and the design of online spaces, in this vision, would aim to increase individuals' capacity to view themselves as part of a polity and then act in its interests. This concept draws on the pragmatist intellectual tradition, in which the American educator and philosopher John Dewey defined democracy as a way of life rather than a set of institutions. The third, a balance-of-power model, seeks to use countervailing power—governmental or otherwise—to check concentrations of power in the digital domain.
Our conversations focused on how these three governance models have played out since Tim Berners-Lee first issued his proposal on the management of distributed information systems. We agreed that digital governance is now an urgent global priority, but there is no one-size-fits-all solution or set of solutions. Governance of the digital wild can't and won't happen in a vacuum. We need to keep in mind that neither governments nor tech corporations are monoliths; conflict between states and technology firms over regulation is as much a reflection of internal tensions as of external pressures. Recent history has lessons for us when it comes to rulemaking, norm setting, and harm prevention, mitigation, and management. However, while analogs from the physical world—like maps with hard geographic borders and boundaries—are likely to gain traction fastest with most stakeholders, they may not make sense in a domain where power is distributed and boundless. Most importantly, pragmatism and a sense of perspective can go a long way toward improved digital governance. Tensions are inevitable; we just need to remember that democracy is a generational process, not an idyllic destination.
Digital Governance Was Always Hard, But Now It's Getting Harder
Though the internet and, later, social media applications such as Facebook and Twitter were initially celebrated as instruments of democracy, the last decade has revealed the darker possibilities of digital technologies, as harms ranging from election interference to financial crime to mass violence have proliferated. In Myanmar, Facebook's algorithms amplified disinformation and hateful content that incited violence against Rohingya Muslims, contributing to a brutal military campaign that displaced more than 700,000 people and left as many as 10,000 dead. In a sign of the growing sophistication and impact of cybercrime, a ransomware attack in 2021 forced the temporary shutdown of the 5,500-mile Colonial Pipeline serving the U.S. East Coast, causing gasoline and jet fuel shortages that triggered a rise in gas prices.
World leaders, from heads of state to the president of the European Commission, have called for global digital governance based on human rights and democratic principles. Intergovernmental bodies such as the United Nations, the OECD, UNESCO, and the G20 have digital governance agendas and are working to establish governing frameworks. Add to this dozens of multistakeholder and civil society initiatives, as well as efforts by tech companies, to bring some semblance of democratic governance to digital spaces.
Yet governing digital technology presents novel and complex challenges. Identifying these is an obvious first step toward understanding which democratic governance solutions are likely to help manage them. For one, "digital technology" encompasses a broad and complex array of applications, tools, and use cases, the impacts of which differ from one geographic or socioeconomic context to another. A related challenge is the scale and pace of change. The World Wide Web has been around for only three-plus decades, but it has upended human ideas of what is sacred, what is right, what is wrong, what is just, what is true, and, more and more, what is real in many parts of the world.
Adding to the difficulty are the many types of stakeholders involved in digital governance. Though born from a U.S. Department of Defense initiative, the internet developed as a loose collaboration among companies, universities, nonprofits, and individuals. The private sector has been critical from the start, and today much of the digital world's critical infrastructure and data is owned and operated by companies. This means that businesses especially, but also NGOs and other actors, end up playing a large role in digital governance, whether intentionally or by default.
Extreme concentrations of power in the digital domain exacerbate these challenges. Around the world, governments or state-owned enterprises control much of the physical infrastructure of the internet, enabling those with jurisdictional or territorial control to restrict it or shut it off. Companies and even individuals also wield immense power. Decisions by the entrepreneur Elon Musk about how and whether to provide Starlink satellite internet service to the Ukrainian military in its defense against the Russian invasion have had a significant impact, at first by providing connectivity and later by restricting it. Owing to the logic of network effects, the platforms and content on the internet are largely controlled by a handful of firms. In 2020, the top five accounted for 20 percent of the U.S. stock market's total worth, and billions of people used Meta-owned social media applications.
Expecting governments to regulate the digital domain according to democratic principles has proven to be wishful thinking. Even setting aside autocracies, which frequently use digital tools to surveil, control, and abuse their populations, many governments with democratic political systems also act undemocratically in digital space. In democracies around the world, including India and the United States, government leaders and their organizations have used social media to spread disinformation, mobilize anti-democratic protesters, and undermine democratic processes.
Even when they are well-meaning, government representatives often struggle to understand digital technology. Witness the congressional questioning of Meta CEO Mark Zuckerberg by befuddled U.S. Senators, or Supreme Court Justice Elena Kagan's quip during oral arguments in the content moderation case Gonzalez v. Google that "[Supreme Court Justices] are not the nine greatest experts on the internet."
The problem goes deeper, however. In Recoding America: Why Government Is Failing in the Digital Age and How We Can Do Better, Jennifer Pahlka, former U.S. Deputy Chief Technology Officer and a founder of the U.S. Digital Service in the Obama White House, notes that U.S. government agencies are unable to attract staff—such as programmers and technologists—who understand the cutting edge of digital technologies. The exception is the U.S. Defense Department, whose budget is sizable enough to compete with private companies when recruiting personnel.
Lack of digital literacy is also an issue for citizenries. We all eagerly embrace digital tools but often understand little about how they operate, the data they generate, and how that data is used. As a result, people often facilitate undemocratic practices—for example, by spreading Russian-created disinformation during the 2016 U.S. presidential election. Democracy depends on citizens' ability to hold governments, corporations, and other power centers to account. But if the public doesn't understand how exactly a government or tech company is violating their rights or abusing power in the digital domain or with digital tools, then it will have no basis on which to act.
Artificial intelligence, or AI, stands to compound this problem. Many AI systems are "black boxes" whose training data, inputs, or operations aren't visible to users or outside parties, such as regulators or civil society watchdogs. The problem doesn't end there. As AI becomes more advanced, it may not just be difficult to make sense of what the machine is doing—it may be next to impossible.
A final challenge lies in the novel properties of digital technology, and the political economy that springs from that novelty. It is tempting to borrow analogs from the physical world and repurpose them for governance. But existing regulatory standards and concepts that draw on common law or civil law traditions may not neatly apply; one example is the consumer welfare standard used in antitrust regulation. Moreover, many governing models and frameworks are predicated on material, territory-bound concepts, such as sovereignty, national identity, and citizenship. The digital domain is a new landscape detached from physical geography. Are legal and economic frameworks based on existing conceptions of sovereignty, ownership, and citizenship applicable in a non-physical space where borders are porous or nonexistent and people can adopt multiple identities?
How Governing the Digital Wild Is Getting Done Today
With these challenges in mind, workshop participants examined different instruments—whether initiatives, platform designs, networks, legislation, voluntary agreements, organizations, or others—that employ democratic means to govern digital technology and spaces. We identified three approaches to democratic governance—the gatekeeper model, the public participation model, and the balance-of-power model—and we examined how each informs the others.
Governments in Open Societies
Democratically elected governments have become more active over the last decade in regulating the activity of commercial actors in and on the internet. This is true at the subnational, national, and international levels. Generally speaking, governments take two approaches: regulation (hard law) and voluntary standards (soft law). When it comes to the former, many jurisdictions are enacting regulations in line with the gatekeeper model of democratic governance.
The most common form involves imposing some degree of liability on internet companies for the content that appears on their platforms or websites. Several national governments have passed legislation to this effect. Subnational governments have also introduced measures: California, for example, enacted a law requiring social media companies to publicly post their content moderation policies and report enforcement data to the state's attorney general. The most far-reaching and comprehensive of these measures is the European Union's Digital Services Act, which establishes obligations for commercial actors to enhance transparency, curb illegal content, and regulate advertising, with the aim of creating responsible platforms free from manipulative design.
Another hard-law approach is based on the balance-of-power model of democratic governance. Antitrust or competition regulation, which aims to use the power of government to diminish concentrations of private power, is an example. Again, the European Union has gone the furthest, using the size and heft of the European market to check anti-competitive and monopolistic practices by tech companies. The Digital Markets Act, adopted in 2022 alongside the Digital Services Act, explicitly aims to reduce the gatekeeping power of large online platform companies.
Using soft-law approaches, governments create guidelines, codes of conduct, and other standards of behavior that tech companies voluntarily agree to uphold. In a sense, such approaches align with the participatory model of democratic governance, whereby companies are viewed as citizens in a larger polity and invited to act in ways that further the greater interests of the whole. By way of example, the UN Guiding Principles on Business and Human Rights (UNGPs) and UN Special Rapporteur David Kaye's recommendations on content regulation offer companies frameworks with which to inform their decision making.
Similarly, in 2018, the EU introduced the Code of Practice on Disinformation, a set of voluntary commitments by signatory firms to reduce online disinformation. The Code, last updated in 2022, has 34 signatories, including major digital platform companies such as Google, Meta, and Twitter. In 2022, the White House Office of Science and Technology Policy issued a Blueprint for an AI Bill of Rights, identifying guidance for the design and use of AI systems. At a minimum, these guidelines instruct companies on how to act in more transparent and accountable ways. The Global Network Initiative, for instance, through its multi-stakeholder processes, has successfully pushed many social media companies to create human rights teams and integrate human rights principles consistent with the UNGPs into their decision-making. But the problem with soft-law mechanisms is that some rely on the good faith of company signatories, which often operate under the perverse incentive structures created by market competition. And as some commentators have argued, companies cannot be expected to meaningfully regulate themselves—especially in ways that cut into profits or undermine core business models.
The hard regulatory approach also has drawbacks. Workshop participants noted that it can unnecessarily consume company time and resources and stifle innovation. It can also have unintended side effects, such as undermining more flexible and agile forms of self-regulation better able to respond to the rapid pace of change in the digital realm. And uncoordinated laws by different governments can create a global labyrinth that is difficult to navigate, particularly for smaller transnational companies.
Companies in a Globalized World
There are two main ways that tech companies exercise some semblance of democratic governance over the digital domain. The first is by creating governing bodies or mechanisms that approximate and borrow democratic principles and practices; the second is through the design of platforms and applications themselves. An example of the first is the Oversight Board, a body of former political leaders, activists, and journalists, paid by an independent nonprofit foundation, that rules on appeals of content moderation decisions made by Facebook's parent company, Meta.
The Oversight Board operates in an environment where it strives to be responsive to billions of users all over the world. It uses the UNGPs and UN Special Rapporteur David Kaye's guidelines on content regulation as the framework for content moderation decisions. It has established tests to determine the applicability and fairness of content moderation standards: a legality test (is the ruling clear and compatible with rulings by bodies such as the UN Human Rights Council?), a legitimacy test (does the ruling reflect the public interest, not a corporate interest?), and a necessity test (does it use the least intrusive tools?). As such, the Oversight Board offers redress through more democratic governance mechanisms than the opaque or inconsistently justified procedures at other platform companies.
At a glance, the Oversight Board is a typical democratic gatekeeper. But as workshop participant Swati Srivastava has noted, instead of elected officials who act with the consent of the governed, the gatekeeper is selected by an unelected, relatively unaccountable corporation. Moreover, it makes decisions about the limits of free speech without delegated authority from national sovereigns. In its exercise of authority over global populations, the Facebook Oversight Board engages in what Srivastava calls "private polity-making."
Companies also affect democratic processes through the choices they make about the design of their platforms. In many cases, companies exercise power through design, by which algorithmic parameters shape the user's experience. For democratic governance, the central question is: does the platform's design gather people in such a way that they understand themselves to be part of a community in which they have opportunities for meaningful participation?
Two examples illustrate the contrast between design that enables such participation and design that does not. First is Facebook, an app with an architecture that discourages users from seeing themselves as part of a democratic public. The structure of the Facebook social network is dyadic, meaning the user connects with friends one by one, building a community out of discrete links. Groups are closed and private. The algorithms driving each user's Newsfeed select for similarity, not diversity. The space itself is also homogenous, designed by Facebook with no opportunity for users to shape it. Overall, the user on Facebook is encouraged to see herself as an atomized individual.
A contrasting example is Reddit, an online platform that starts with the community. A user on Reddit does not connect to other individual profiles, but instead joins open, topical communities called "subreddits" where strangers share information and opinions on a particular topic—politics, architecture, professional basketball, houseplant care, and countless others. Each subreddit is governed according to norms created and enforced by volunteer moderators who are themselves users. Anyone can provide input and engage in deliberation, discussion, and decision-making. Users have even been able to influence the corporate policies of the company itself. The overall experience is one that encourages a sense of shared citizenship among users.
Civil Society Amid Democratic Backsliding
Not all governance is about making and enforcing rules. Championing issues, shaping agendas, educating stakeholders, and monitoring commitments are all part of governance processes and express different democratic commitments. It is in these ways that civil society organizations contribute to democratic digital governance, both by encouraging public participation and by drawing citizens together to influence governments and companies. Increasingly, however, they are doing so amid democratic backsliding and the use of technology to crack down on challenges to the elite status quo.
There are dozens if not hundreds of civil society initiatives committed to fostering an open, secure, innovative digital domain. Ranking Digital Rights evaluates the practices of the world's most powerful tech and telecom companies and their effects on human rights; its research and rankings provide important tools for monitoring company behavior against democratic norms. The Public Interest Technology project encourages digital literacy, enabling young people especially to play a greater role in ensuring technology serves the public interest. Digital rights groups in Latin America promote sound digital policies, ethical frameworks, and regional initiatives. The Digital Impact and Governance Initiative works to catalyze solutions for digital public infrastructure, fostering innovations that increase participation while upholding democratic norms. Watchdog researchers shine a light on digital surveillance and repression, generating capacity to counter abuses of power. Others undertake policy research and develop guidance tools to shed light on the challenges posed by ICTs as well as their use for peaceful purposes. The Open Technology Institute connects researchers, organizers, and innovators across a broad array of concerns to foster equitable access to open and secure digital technology, thus enhancing participatory and norm-respecting action.
Civil society organizations can also help address global power imbalances. Rich nations in the Global North dominate the governance of digital technology. They have the most influence in international governing bodies, and they tend to define the frameworks and goals of digital governance. Poorer countries in the Global South, as well as historically marginalized and disenfranchised populations, often have less voice and influence. They also face different impacts from technology and might have different governance goals. Civil society organizations that represent Global South concerns can bring greater attention to priorities and risks such as environmental sustainability, accessibility, and various forms of bias.
Multi-Stakeholder Initiatives in a Multilateral Environment
Lastly, multi-stakeholder initiatives (MSIs) bring together governments, companies, and civil society organizations for digital governance. These emerged as the primary governance form for the internet, given its genesis as a decentralized collaboration in which all these actors played a role. Perhaps the best-known and most consequential MSI is the Internet Corporation for Assigned Names and Numbers (ICANN), a nonprofit whose mission is "to ensure the stable and secure operation of the Internet's unique identifier systems," such as the domain name system (DNS), without which the internet as we know it would cease to function.
The structure of ICANN has shifted over time. It was established in 1998, when the U.S. Department of Commerce gave it responsibility for managing the DNS and keeping the internet running smoothly in a "bottom up, consensus-driven manner." In the wake of the controversy around the U.S. National Security Agency's surveillance activities, the Obama administration ended U.S. government supervision of ICANN, transferring stewardship to the global multistakeholder community. Governments still participate as advisers, but ICANN now operates as a financially independent, California-based nonprofit that convenes businesses, nongovernmental organizations, and academics working alongside governments.
Though many workshop participants agreed that ICANN has been successful, there was less consensus about its democratic qualities. Some thought ICANN's capacity to keep the internet functioning and its narrow focus on technical issues minimized its susceptibility to politicization, even though governments are participants. Others pointed out that the ICANN we see now is very different from the body established in the 1990s, and that those imbued with the libertarian spirit behind its creation might cringe at the role it now plays in the profits of large multinational corporations. ICANN's success may be less attributable to its abstract qualities than to its pragmatic capacity to adapt. Its shifts and adjustments have continued to generate deference but also criticism, and its operations do not clearly reflect any of the democratic models we outline.
Another MSI of note is the Global Network Initiative (GNI), an organization of tech companies, civil society organizations, investors, and academics that seeks to influence governance on the internet. Since 2008, the GNI has developed and published normative frameworks based in large part on the UNGPs—including principles on freedom of expression and privacy and accompanying implementation guidelines—that aim to define responsible corporate conduct. As such, the GNI plays a "gatekeeper" role, influencing those who police the digital domain. At the same time, its members seek to encourage public participation in and knowledge of digital governance. Workshop participants argued that its success has hinged on how it has evolved in response to regulatory developments—similar to ICANN. Also important has been its ability to focus on relevant issues and the willingness of funders (both members and non-members) to value the space it creates for information sharing, trust building, and collaboration.
A similar but narrower initiative is the Digital Trust & Safety Partnership, which focuses on advancing content-agnostic best practices in trust and safety. It takes inspiration from industry frameworks and international management standards, such as those developed through the International Organization for Standardization. It also relies on a gatekeeper logic of democracy, seeking both to align stakeholders around practices that manage digital harms and to bring policies in line with international human rights law. It engages stakeholders through participatory processes to generate a framework for avoiding conflicting regulatory regimes that pose risks to both innovation and human rights.
Other initiatives aim to affect governance on the internet by fostering democratic deliberation. These include the Internet Governance Forum (IGF) and the Geneva Internet Platform (GIP). These broad initiatives are helpful in bringing together stakeholders and facilitating open discussions. They thus contribute to participatory engagement that may lead to governance initiatives by others.
Takeaways and Implications
Several key insights from the workshop carry implications for advancing democratic principles, approaches, and practices in the digital realm.
It is useful to draw on pre-existing frameworks and analogs from the physical world, but buyer beware. Though the novelty of the digital world makes it tempting to start from scratch, effective governance is often rooted in existing legal and human rights frameworks and norms. As the scholars Martha Finnemore and Duncan Hollis argue, norms are "social creatures that grow out of specific contexts via social processes and interactions among particular groups of actors." Norms for cybersecurity, algorithmic decision-making, and other aspects of the digital domain are more likely to gain traction if they emerge from frameworks that are already there. Still, some models simply won't map neatly onto a rapidly changing domain that has few physical boundaries and consists of distributed systems. That means we'll have to think more deeply about what we mean when we talk about sovereignty, citizenship, and identity in the digital realm.
Digital democratic governance will benefit from greater interaction among different instruments. Just as the internet is a network of networks, so too should the global governance regime be. We can learn from how existing democratic governance instruments already interact, and we may be well served by identifying opportunities for greater connection and learning among them. One study of the highly fragmented global AI governance regime found that greater centralization would make the system more efficient and politically powerful, but that locking in an inadequate centralized architecture would be even worse than fragmentation.
When it comes to regulation, one size does not fit all, and blended, process-based approaches are often more productive. Sensitivity to context is important for digital governance. Rigid, blanket regulations can create unintended consequences; a focus on processes rather than output-based prescriptions can be more productive. For instance, when it comes to content moderation decisions, specifying procedures and principles is more nuanced and effective than blanket restrictions on certain types of content. Similarly, soft law can generate buy-in and then be hardened using available instruments in different locales. When the EU brought the Digital Services Act into force, for instance, some of its requirements were already present in the voluntary Code of Practice on Disinformation, so many companies were already aware of their obligations under the regulation.
Neither private companies nor states are unitary actors. Large institutions, both private and public, are not monoliths. In a study of Facebook (now Meta), Chinmayi Arun examined the ways in which the company engaged differently with different external actors and how teams within the company had divergent, sometimes competing priorities. Those who regulate or otherwise govern large institutions should be aware of how internal politics can shape the adoption and implementation of regulations and practices. Multi-stakeholder initiatives can capitalize on these differences to create alliances across subgroups in governments and companies that are more attentive to democratic practices.
Tensions are inevitable. The major objectives of digital governance—security, innovation, access, and human rights—can be in conflict with one another. Protecting human rights might curb innovation; strengthening security can limit access; and so on. And the gatekeeping, participation, and power-balancing approaches can also conflict. Policing democratic norms may weaken opportunities for participation, for instance. Though unsettling, such tensions are unavoidable. Charting processes and mechanisms that help mediate these tensions in particular settings will be important for identifying paths forward.