A Framework for Trusted Openness in the Digital Age
This essay is part of my series Towards a Trusted Tech Alliance, which explores how democracies can move from fragmentation to common cause in shaping the digital future. Earlier essays have examined the costs of divergence between Washington and Brussels and the risks of authoritarian defaults wiring much of the world. This piece takes the next step: proposing that interoperability, the lifeblood of the digital economy, be treated not as a universal public good but as a club good.
Why Interoperability Matters
Interoperability allows data, digital identities, and applications to move seamlessly across systems. It lowers costs, accelerates innovation, and enables businesses and consumers to operate at global scale.
But interoperability also has a downside: if openness is universal and unconditional, it can be exploited. Authoritarian states can tap into democratic markets, technologies, and datasets even while keeping their own systems closed and controlled. The result is asymmetry: democracies bear the risks of openness while adversaries reap its rewards.
A Club Good for the Digital Age
In economic terms, a club good is non-rivalrous but excludable: members can share it widely, while outsiders can be kept out.
Applied to digital standards, this suggests an approach where:
- Inside the club, members enjoy interoperability — data portability, AI certification, privacy recognition, secure cloud standards.
- At the boundary, membership is conditioned on reciprocity, transparency, and alignment with trusted rules.
- Outside the club, adversaries cannot free-ride on openness or siphon off leading-edge technologies.
This is not about exclusion for its own sake. It is about making openness sustainable in an era of strategic competition.
Linking Rules to Incentives
What would make such a club attractive? The answer lies in incentives.
The United States and its partners still dominate the technology stacks that matter most: advanced semiconductors, hyperscale cloud, frontier AI models, and secure enterprise software. These stacks are not just economic engines — they are chokepoints.
If access to those stacks were tiered based on commitment to club rules, membership would become a competitive necessity. Countries that align on privacy interoperability, AI safety testing, cloud security, and export controls would gain privileged access. Those that do not would not.
Building on Existing Efforts
This idea does not begin from scratch. Several initiatives already point in this direction:
- Japan’s “Data Free Flow with Trust” (DFFT) — a G20 initiative stressing that data should flow only where safeguards build trust.
- The Global CBPR Forum — a certification system for cross-border privacy rules, originally launched in APEC, now open globally.
- Digital Economy Partnership Agreement (DEPA) — a modular pact among Singapore, New Zealand, Chile, and others, with plug-in modules on AI, fintech, and e-invoicing.
- OECD AI Principles and the G7 Hiroshima Process — articulating risk-based, democratic approaches to responsible AI.
- USMCA’s digital trade chapter — the strongest binding template for digital rules today.
- AUKUS Pillar II (advanced capabilities) — demonstrating how trusted partners can collaborate on advanced technologies such as AI, quantum, and cyber while ensuring interoperability and shared security.
- The Clean Network and the Krach Institute’s Global Trusted Tech Standard — pioneering efforts to translate trust into action through transparent, values-based frameworks for secure and democratic technology collaboration.
Each of these provides pieces of the puzzle. A club good approach would knit them together, adding the missing elements of excludability, reciprocity, and incentives — creating a trusted digital commons for those who play by the rules.
Questions for Further Exploration
The concept is promising, but it raises important questions:
- Membership. Who should qualify? Only established democracies, or also emerging economies willing to align?
- Tiers. Should there be multiple levels of access and obligation? How would countries move up the ladder?
- Governance. Who sets the standards and updates them? Could this be done through modular agreements, a standing secretariat, or a rotating consortium?
- Export Controls. How can members coordinate to prevent leakage of sensitive technologies while keeping rules targeted and innovation-friendly?
- Bridges. How might a club good framework interoperate with the EU’s regulatory sphere without forcing uniformity?
These are not trivial questions. But they are worth asking now, before authoritarian models harden into global defaults.
Why This Matters
The stakes are high. If democracies remain fragmented, authoritarian standards will spread by default, wiring the next billion users in ways that tilt toward coercion rather than empowerment.
Treating interoperability as a club good would turn democratic openness from a vulnerability into a strength. It would scale innovation across trusted markets, create incentives for alignment, and preserve space for rules that are both fair and innovation-friendly.
This is not a finished blueprint, but an idea worth exploring further — with policymakers, business leaders, and civil society. The alternative to acting together is not regulatory freedom, but regulatory capture by others.
Closing Thought
Interoperability is too important to be left unguarded. It can be the backbone of a trusted digital order — but only if democracies treat it as a club good, open to those who align, closed to those who would exploit it.
In line with my belief that responsibly embracing AI is essential to both personal and national success, this piece was developed with the support of AI tools, though all arguments and conclusions are my own.
Author
Mark Kennedy
WISC Director, DRI Senior Fellow