I’ve been working with so many students who turn to it as a first resort for everything. The second a problem stumps them, it’s AI. The first source for research is AI.
It’s not even about the tech; there’s just something about not wanting to learn that deeply upsets me. It’s not really something I can understand. There is no reason to avoid getting better at writing.


You are so right about how important the process of thinking and learning is, and that is where AI fails.
I am not a teacher, but a couple weeks ago, I was a guest speaker in a high school IT class. I told them all about how critical it is to be an effective communicator by documenting the steps in their tickets in a way that others can follow, and told them, straight up, that communication is a skill: if you can’t communicate, I will not hire you. I told them I have actively declined to hire or promote people because they don’t communicate effectively.
I am not sure how to do something similar with, say, an English class, but I wonder if you could figure out how to expose them to the future professional repercussions of not understanding the topic deeply. I think it hit differently when the repercussion wasn’t just that their instructor would be unhappy.
AI is brilliant for learning. It is endlessly patient, answers all my questions at a pace that suits me, and can combine knowledge from hundreds of different sources to find the right concept, or the best way to explain something. If you’re not able to learn with AI, you’re doing something wrong.
Just ask it to explain bloom filters to you. Keep asking questions until you get it.
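Bloom filters are a good example because the whole idea fits in a few lines: a bit array plus several hash functions, trading a small false-positive rate for very compact membership tests. Here’s a minimal Python sketch (the size, hash count, and salting scheme are arbitrary choices for illustration, not a production design):

```python
import hashlib

class BloomFilter:
    """Toy bloom filter: a bit array plus k salted hashes.
    Can report false positives, but never false negatives."""

    def __init__(self, size=1024, num_hashes=3):
        self.size = size
        self.num_hashes = num_hashes
        self.bits = [False] * size

    def _indexes(self, item):
        # Derive k bit positions by hashing the item with k different salts.
        for salt in range(self.num_hashes):
            digest = hashlib.sha256(f"{salt}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item):
        for idx in self._indexes(item):
            self.bits[idx] = True

    def might_contain(self, item):
        # True means "possibly present"; False means "definitely absent".
        return all(self.bits[idx] for idx in self._indexes(item))

bf = BloomFilter()
bf.add("alice")
bf.add("bob")
print(bf.might_contain("alice"))    # True: everything added is always found
print(bf.might_contain("mallory"))  # almost certainly False here
```

The non-obvious parts (why false positives happen, how size and hash count trade off against error rate) are exactly the kind of follow-up questions worth pushing on.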
AI can point you in interesting directions, but if it is your first and only source, and you trust it to combine all these other sources together, you are shortchanging yourself. It does not do as well as you think at combining ideas, identifying edge cases, or demonstrating real understanding. What it teaches you may or may not be broadly accurate. It is a starting place, which, as I interpreted the OP, was their primary and often only source.
The act of forming hypotheses and researching to understand them is part of learning. If all your learning comes from reading tailored answers to specific questions, you miss out on the other ideas you would bump into while researching.
I’ve used AI to try to research things, and EVERY time, on deeper inspection of an idea, some of the information it shared ranged from false to technically true but not … really right.
It is, at best, like a personal TA: someone whose office hours you visit when you are stumped on something you’ve learned and need the idea explained differently, or when you have no idea where to start and need a point in the right direction. Helpful, but you would never ask that person to write your research.
It has problems with truthfulness, but for well-known topics it can be like having a better search engine or tutor.