Opening — Why this matters now
AI governance isn’t just a technical issue—it’s an institutional one. As governments scramble to regulate, corporations experiment with ethics boards, and civil society tries to catch up, the question becomes: who actually holds the power to shape how AI unfolds in the real world? The latest ethnographic study by The Aula Fellowship, *Levers of Power in the Field of AI*, answers that question not through theory or models, but through people—the policymakers, executives, researchers, and advocates navigating this turbulent terrain.
Background — Context and prior art
Institutional theory often treats power as structural: laws, hierarchies, and formal authority. But in practice, real influence comes from subtler levers—social relationships, moral authority, narrative framing, and the ability to bridge worlds. Building on neo-institutional theory and Scandinavian institutionalism, the study explores 12 anonymized personas—decision-makers from academia, business, government, and civil society—each revealing how personal agency interacts with institutional inertia.
The researchers frame these as levers of power: from formal mechanisms like governance and regulation to informal ones like idea mobility and relational channels. It’s a conceptual upgrade from the usual top-down view of policy: not what institutions do, but how people inside them move the machine.
Analysis — What the study does
This is ethnography, not statistics or surveys. The study uses a modified governance framework incorporating dimensions such as logics (the rationale behind decisions), organizational models, norms, and informal relational channels. Respondents—ranging from senators and executives to rabbis and civil-society advocates—were asked how they perceive and exert power within their institutions.
Their stories are strikingly human:
- Alex, a policymaker in academia, sees influence as fragile and collaborative, dependent on cross-sector networks.
- Cameron, a senator, admits the state is unprepared for AI’s social impacts but leans on entrepreneurial instincts for agility.
- Emerson, a rabbi, reframed religious rituals during the pandemic through digital channels, symbolizing how technology redefines authority and access.
- Kinsey, a civil-society director, embodies the new hybrid—part activist, part diplomat—leveraging informal encouragement over bureaucratic power.
Each persona demonstrates how personal ethics, relationships, and even self-perception (“Breaker of Stereotypes!” says one) shape institutional responses to AI.
Findings — The levers of power in motion
The researchers distilled their findings into a simple but profound table:
| Lever of Power | Status in the AI Field |
|---|---|
| Logics | Market logic, social justice logic, technosaviourism |
| Institutional infrastructure | Emergent and translated |
| Governance | Incomplete coverage |
| Collective interest organizations | Few and far between |
| Regulators | Lacking in purview |
| Informal governance bodies | Inconclusive |
| Field-configuring events | Present, multi-logical |
| Status differentiators | Money, resources, authority |
| Organizational templates | Mixed industry standards |
| Categories / labels | High potential for impact |
| Norms | In flux, accessible |
| Relational channels | Formal, in transformation |
| Idea mobility | Low integration, high variance |
From these patterns emerge five hypotheses:
- Formal mechanisms are outpaced—informal power grows faster than bureaucracy can track.
- Institutions drift toward business logics, often sidelining civic engagement.
- Big Tech strategically co-opts both formal and informal governance.
- Informal influence remains elitist, even when wrapped in inclusion rhetoric.
- Collective action is resurgent, but fragile and uneven.
In short: AI governance today is less a coherent field than a fragmented ecosystem of influence. Power flows where formality falters.
Implications — What this means for business and policy
For executives and policymakers, the message is blunt: governance is personal. Regulation, ethics frameworks, and committees are necessary—but insufficient. Real power lies in relationships, credibility, and timing. When a field-configuring event (like an AI safety summit or corporate scandal) hits, influence shifts toward those who can act informally yet decisively.
For civil society, the takeaway is strategic: stop chasing seats at the table; start shaping the table’s conversation. The most effective change agents in this study weren’t the most senior—they were the most socially connected and narratively aware.
Viewed through an organizational lens, stability and change coexist. Technology destabilizes norms, but it also offers new anchors. Emerson’s virtual congregation and the responsible-AI frameworks championed by Dana, another of the study’s personas, both illustrate that resilience depends on translation—turning crises into continuity.
Conclusion — Power, personified
In an era when “AI governance” risks devolving into paperwork and press releases, *Levers of Power* reminds us that institutions don’t act—people do. They improvise, rationalize, and occasionally contradict themselves, but they move systems nonetheless.
The real question isn’t whether we can regulate AI—it’s whether our human institutions still remember how to evolve.
Cognaptus: Automate the Present, Incubate the Future.