Amsterdam’s AI Welfare Dream Just Became a Nightmare

Amsterdam thought AI could end welfare bias. Instead, it proved that algorithms can discriminate faster and with greater confidence.
Amsterdam's ambitious project, Smart Check, aimed to use AI to screen welfare applications fairly, but it collapsed spectacularly. The city hoped an algorithm could detect fraud objectively, without bias.
The city meticulously implemented ethical AI guidelines, consulted stakeholders, and deployed an “explainable boosting machine” that evaluated 15 non-sensitive criteria. Despite exhaustive bias corrections, including reweighting the training data, the model kept unfairly flagging certain groups: initially migrants and men, and after the corrections, Dutch nationals and women.
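For readers curious what that kind of setup can look like in practice, here is a minimal, hypothetical sketch, not Amsterdam's actual pipeline: an explainable boosting machine from the open-source interpret library, trained on a few made-up non-sensitive features, with a Kamiran–Calders-style reweighting step meant to decouple group membership from the historical labels. The feature names, the proxy group variable, and the weighting scheme are all illustrative assumptions.

```python
# Hypothetical sketch: NOT the Smart Check system, just the general pattern of
# "glass-box model + reweighting" described in the article.
import numpy as np
import pandas as pd
from interpret.glassbox import ExplainableBoostingClassifier

# Toy historical data with invented, non-sensitive features.
rng = np.random.default_rng(0)
n = 1_000
X = pd.DataFrame({
    "months_on_benefits": rng.integers(0, 60, n),
    "num_prior_applications": rng.integers(0, 5, n),
    "reported_income_change": rng.normal(0, 1, n),
})
y = rng.integers(0, 2, n)      # 1 = past investigation found an issue (toy label)
group = rng.integers(0, 2, n)  # proxy group label, used only for reweighting

# Reweighting (Kamiran & Calders style): weight each (group, label) cell so that
# group membership carries no information about the label in the weighted set.
df = pd.DataFrame({"group": group, "y": y})
p_group = df["group"].value_counts(normalize=True)
p_label = df["y"].value_counts(normalize=True)
p_joint = df.groupby(["group", "y"]).size() / len(df)
weights = df.apply(
    lambda r: (p_group[r["group"]] * p_label[r["y"]]) / p_joint[(r["group"], r["y"])],
    axis=1,
)

# Glass-box model: per-feature shape functions that can be inspected afterwards,
# which is what made systems like this auditable in the first place.
ebm = ExplainableBoostingClassifier(random_state=0)
ebm.fit(X, y, sample_weight=weights)

# Global explanation of how each feature contributes to the risk score.
print(ebm.explain_global().data())
```

As Amsterdam's experience shows, reweighting like this can shift which groups are flagged rather than eliminate unfairness, which is exactly why the technical fix alone was not enough.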
This failure wasn’t due to negligence. Amsterdam brought in external experts such as Deloitte and the Civic AI Lab and tested its system thoroughly. Yet real-world use still amplified the biases present in historical data. Ironically, human caseworkers were similarly biased but never faced the same rigorous scrutiny.
As we accelerate digital transformation, we must learn three crucial lessons from Amsterdam’s AI misadventure:
- Bias persists even in rigorously tested AI.
- Humans and algorithms both reflect historical prejudices.
- Technical fixes alone can’t guarantee fairness.
Amsterdam’s experiment underscores the complexity of fairness and the need to pair technological innovation with robust human oversight. How do we ensure technology serves everyone fairly?
Read the full article on MIT Technology Review.
----
💡 We're entering a world where intelligence is synthetic, reality is augmented, and the rules are being rewritten in front of our eyes.
Staying up to date in a fast-changing world is vital. That is why I have launched Futurwise: a personalized AI platform that transforms information chaos into strategic clarity. With one click, users can bookmark and summarize any article, report, or video in seconds, tailored to their tone, interests, and language. Visit Futurwise.com to get started for free!
