AI Can Make You Productive or Dumb. Here’s How to Tell Which Side You’re On

It all depends on how you use AI.
Generative AI tools like ChatGPT have seen a massive rise in popularity, with over half of Americans reporting they had tried AI assistants or coding copilots by early 2024. From writing emails to generating code, more and more of our mental tasks are being outsourced to machines. This raises an important question: why bother remembering facts or composing text when AI can handle it for us? Is this digital convenience making us intellectually lazy, or even less sharp? Recent studies, expert opinions, and real-world examples suggest that while AI can significantly boost productivity, it may also weaken our critical thinking, memory, and social skills if we’re not careful. The impact isn’t inherently positive or negative; it largely depends on how we choose to use these tools.

Automation and Atrophy: When AI Thinks for Us

One of the clearest indicators of AI’s impact comes from a study by the MIT Media Lab, which scanned participants’ brain activity while they wrote essays with and without AI assistance. The results were striking. According to the researchers, participants who used ChatGPT showed the lowest levels of brain engagement and consistently underperformed at the neural, linguistic, and behavioral levels. In short, their brains were less active, especially in areas related to memory, imagination, and critical thinking, compared to those who wrote their essays without AI support.

The essays produced with AI also tended to be more generic and lacked originality, with many relying on repetitive, formulaic phrases. As participants continued writing, their effort declined. By the third essay, several were letting ChatGPT handle nearly all of the writing, simply copying and pasting the output. The more they leaned on AI, the more passive their approach became.

This mental disengagement came with real consequences. After the writing tasks, participants were asked to recall details from their own essays. Only 20% of those who had used ChatGPT could remember a single quote from their work, compared to 85% of those who wrote without AI or used only tools like Google. Even more surprising, 16% of the AI users didn’t recognize their own essays when presented with them; none of the participants who had written without AI had this issue. In effect, AI had disrupted their memory of the task. They completed the assignment, but the learning didn’t register. “The task was executed efficiently and comfortably,” said Dr. Nataliya Kosmyna, the study’s lead author, “but essentially nothing was integrated into the memory networks.” When AI handles most of the cognitive load, our brains tend to disengage. It’s a striking example of cognitive offloading, and it appears to weaken the very abilities we now rely on AI to perform for us.

A separate study by Microsoft and Carnegie Mellon University found that people solved data-related problems faster with AI tools, but at the cost of critical thinking. Workers with access to generative AI were more efficient, but they showed less reasoning and analysis than those who worked without it. The study also found a dose-dependent pattern: the more participants relied on AI-generated answers, the more their own critical thinking declined.

Real-world experiences support these findings. Software engineer Koen Van Belle noticed it himself after frequent use of GitHub Copilot. When his internet connection failed one day, he struggled to recall programming syntax and fix problems on his own. “I couldn’t remember how it worked,” he admitted.
He realized he had become overly dependent on AI and had to retrain himself in skills he once knew by heart. Later, when his company restricted AI access, productivity and code quality among interns dropped significantly. It became clear they hadn’t developed the core problem-solving skills needed to work independently. This points to an important insight: by skipping the struggle of learning and letting AI generate solutions, many young programmers missed the chance to build the mental resilience needed to solve new problems on their own.

The New “Google Effect”: Memory and Learning in the Age of AI

Relying on AI to do our thinking doesn’t just pose a risk to creativity and critical reasoning. It may also quietly reshape how we learn and what we actually retain. Cognitive science has shown for decades that learning is an active process: we absorb knowledge through effort, by working with ideas, making mistakes, and gradually refining our understanding. So what happens when an always-available AI steps in and interrupts that process?

Educators are already raising red flags. John Warner, a writing instructor, argues that using ChatGPT to write essays or complete assignments can “derail the important exploration of an idea that happens through the act of writing.” In other words, when a student skips straight to a polished answer, they miss out on the deeper, more meaningful learning that comes from wrestling with the problem.

This isn’t just a theoretical concern; there is real evidence it’s already happening. A 2023 classroom study found that students who relied heavily on AI to answer questions ended up learning less overall. High school students who used AI more frequently also showed less flexibility when adapting to new social situations. In an experiment at Duke University, students were asked to summarize readings and then answer questions about them. Those who used AI to generate their summaries performed worse on exams than students who took notes and recalled the material from memory. Researcher Qirui Ju explained that writing things out by hand helps strengthen understanding; when AI takes over that process, you may get a grammatically clean result, but you lose the part where actual learning happens. As he put it, even if AI-generated responses are more polished, the learning is weaker because students aren’t making the mental effort themselves.

This pattern is really a more advanced version of what researchers have called the “Google Effect.” We’ve already seen how internet searches made it easier to forget basic facts. Now AI doesn’t just help us find answers; it creates them. It can write essays, solve problems, and even generate full solutions, which means we might not even try to come up with ideas on our own. A study from 2011 found that people often turn to the internet to answer tough questions and, as a result, remember less of the information itself; what they do remember is how to find the answer. But with AI, even the act of searching and organizing information is being done for us, which can distance us even further from the material we’re trying to learn. According to a Harvard study titled The Google Effect, the internet has become a “primary form of external or transactive memory.” In a way, it has become an external brain. AI could take this concept even further, acting not only as a memory tool but as an external problem solver and creative engine.