<aside>

Written for educators, policymakers, and parents, this page explains why introducing AI into classrooms too quickly risks repeating the mistakes of the last ed-tech boom. In short: the smartest use of AI in education right now is restraint. It matters because learning isn't broken; governance is. The tools need testing before they earn trust. Use this when designing AI literacy policies, digital curricula, or pilot programmes.

</aside>

Everyone’s racing to plug AI into classrooms like it’s the missing charger for the education system. Tech giants are promising “personalised learning”, policymakers are nodding enthusiastically, and schools are told to “get future-ready or get left behind”.

But Michael Bloomberg just dropped a necessary bucket of cold water on the hype. His point? AI in education could become the next mobile phone disaster — another shiny thing that distracts instead of develops.

Here are the five takeaways that matter.

1. Big Tech doesn’t teach — it sells

Let’s start with the obvious.

Tech companies aren’t marching to the rhythm of children’s progress; they’re dancing for shareholders. When AI enters schools, it’s not charity — it’s market expansion.

Remember the “laptops for every child” crusade? Billions spent, grades dropped, and teachers turned into tech support. The same script is being replayed, only this time the promise is “AI tutors” instead of Chromebooks.

2. Screen time kills focus, not just time

Schools that banned mobile phones saw behaviour and results improve. No mystery there. Yet the same policymakers now want to double down on screen time with AI tools.

Research already shows that the more hours kids spend glued to screens, the worse their maths and reading performance becomes. AI won't fix that; it'll amplify it. Especially if "learning" starts meaning "chatting with a bot that occasionally hallucinates facts".

3. The real lessons are human

Education isn’t about mastering tools; it’s about mastering thought. Critical reasoning, empathy, and trustworthiness — these don’t come from a prompt box.

Bloomberg puts it bluntly: the goal of public education should be to develop minds, not to train operators. Kids don’t need to “master AI”; they need to master being human in an AI-saturated world. That’s a very different curriculum.

4. Evidence before evangelism

Yes, AI might help — in moderation.

A study by the OECD found that up to an hour of computer-based learning boosted performance, but beyond that, results collapsed. Even Khan Academy’s AI tutor, Khanmigo, admits it goes off-script after a while (same as every office chatbot you’ve ever met).

So here’s the rule: test before trust. AI tools in schools should be treated like medication — prescribed in small doses, with clear side effects listed, and under supervision.

5. The best learning is still face-to-face

Bloomberg’s closing argument is simple but fierce: until AI can prove its consistent value, schools should focus on what works — teacher excellence, accountability, and human connection.