Charles Spinelli on When AI Ethics Becomes Performance
As artificial intelligence spreads across workplace systems, many organizations point to ethics frameworks as proof of responsibility. Policy documents, review boards, and checklists signal care and control. Charles Spinelli has noted that these measures often project confidence without delivering protection, creating a form of compliance theater that reassures leadership while leaving workers exposed.
This gap between appearance and practice widens as AI tools move faster than oversight. Governance artifacts look complete, yet daily decisions tell a different story. Employees experience monitoring, scoring, and inference without clear recourse, even as organizations cite adherence to ethical standards.

The Appeal of the Checklist
Checklists offer speed and simplicity. They provide a sense of order in a complex landscape and allow teams to move forward with minimal friction. A completed form suggests risk has been addressed and responsibility assigned.
Yet ethics does not function like quality control. A static list cannot capture shifting contexts, downstream impacts, or the ways systems interact with power. When governance becomes procedural, it loses contact with lived experience. Boxes get checked while questions go unanswered.
This approach favors documentation over outcomes. Policies describe intent, not effect. As long as processes appear sound, harm can remain invisible.
Where Practice Falls Short
Workers feel the consequences when governance stays abstract. An ethics statement may promise fairness, while an algorithm quietly reshapes access to projects or scrutiny. A review committee may exist, yet few employees know how to raise concerns or whether raising them changes anything.
Compliance theater thrives when responsibility diffuses. Vendors point to client policies. Employers cite vendor safeguards. Accountability moves in circles. Ethical claims mean little without clear ownership of results.
Another weakness lies in scope. Many frameworks focus on deployment rather than use. Once tools go live, oversight tapers off. Models drift, data changes, and impacts compound, all outside the checklist’s reach.
The Cost of Performative Ethics
Superficial governance can erode trust faster than no framework at all. Employees recognize when protections feel symbolic. This disconnect breeds cynicism and silence, discouraging feedback that could surface real issues.
Leaders may also grow complacent. Formal compliance offers cover, reducing the urgency to ask harder questions. Ethics requires discomfort. It demands time, challenge, and willingness to pause. Theater replaces that work with reassurance.
Over time, systems gain legitimacy through repetition, not merit. What began as a pilot becomes standard. The checklist remains unchanged while impact deepens.
Moving From Signal to Substance
Effective governance centers on practice. It ties principles to clear limits, review cycles, and remedies. It invites worker input and treats concerns as data, not disruption. Transparency about failures supports credibility more than polished statements.
Charles Spinelli underscores that protecting workers depends on substance over signal. Ethics frameworks matter only when they shape decisions after deployment, not just approvals before launch. In workplaces shaped by AI, responsibility shows up in outcomes, not paperwork.