I’ve been working with so many students who turn to it as a first resort for everything. The second a problem stumps them, it’s AI. The first source for research is AI.
It’s not even about the tech; there’s just something about not wanting to learn that deeply upsets me. It’s not really something I can understand. There is no reason to avoid getting better at writing.


AI can point you in interesting directions, but if it is your first and only source, and you trust it to combine all these other sources together, you are shortchanging yourself. It is not as good as you think it is at combining ideas, identifying edge cases, or demonstrating real understanding. What it teaches you may or may not be broadly accurate. It is a starting place, which, as I interpreted the OP, was their primary and often only source.
The act of forming hypotheses and researching to understand is part of learning. If all your learning comes from reading tailored answers to specific questions, you miss out on exposure to other ideas that you would bump into by researching.
I’ve used AI to try to research things, and EVERY time, on deeper inspection of an idea, some of the information it shared ranged from outright false to technically true, but not … really right.
It is, at best, like a personal TA; someone whose office hours you go to when you are stumped on a thing you’ve learned and need the idea explained differently, or when you have no idea where to start and need a point in the right direction. Helpful, but you would never have that person write your research.
It has problems with truthfulness, but for well-known topics it can be like having a better search engine or tutor.