I’ve been working with so many students who turn to AI as a first resort for everything. The second a problem stumps them, it’s AI. The first source for research is AI.
It’s not even about the tech; there’s just something about not wanting to learn that deeply upsets me. It’s not really something I can understand. There is no reason to avoid getting better at writing.


Yeah, that’s the way I came up too. But I disagree with the “maths without calculators” approach - mainly because it’s a brute-force solution that ignores the reality that calculators exist.
So does ChatGPT.
We should learn to use the tools we have, not pretend they aren’t there.
More importantly, using “do the maths the long way” as a proxy for teaching reasoning probably has limited transfer if it’s not framed explicitly. Like you, I learned a lot of logic through algebra - but no one ever connected those dots for me. I only realized years later that the real lesson was reasoning, not just manipulating symbols.
What I’m getting at is:
If we actually care about developing thinkers, we probably need to teach reasoning, skepticism, and how to interrogate outputs directly, including outputs from tools like AI.