Artificially Informed: How AI is Robbing Students of their Critical Thinking
Ryan Monaghan
Professor John Horgan
HST 401 Seminar in Science Writing
2 May 2025
As a course assistant for a core Computer Science degree requirement, I saw homework scores that were higher than any I had seen before, while test scores dropped to an all-time low. This wasn't a matter of easy homework and impossible exams; it was the result of an ever-increasing presence of AI in the education system. I'm not the only one who has noticed. Dr. Stephen Rupprecht, Assistant VP and Dean of Students at Kutztown University, who wrote his dissertation on academic integrity, suggests that faculty should learn to use AI-detection tools effectively when enforcing and upholding academic integrity. He was also "not surprised by [my] experience of seeing high homework scores and low test scores" (Rupprecht). That said, it isn't all bad. AI promises huge productivity boosts, and in my own experience I saw small improvements in my process while writing code. The biggest thing I noticed while using AI, however, was that I was not thinking as much as I usually did while programming. It took the work out of what I was doing, and it made me a worse programmer. That realization sparked my interest in this subject: I suspected the same thing was happening in school, with students using AI to check the boxes for a degree while learning less than they could have. I do not think we should entirely prevent the use of AI tools in education, but there is evidence and research to back up the claim that an overreliance on AI leads to a reduction in critical thinking. While AI tools in education promise efficiency and accessibility, I believe that overreliance on these tools undermines students' critical thinking by making them complacent, reducing problem solving from genuine understanding to output generation.
First, I will introduce some AI tools that I commonly saw being used in classrooms, so you can see where AI appears, whether students seek it out or not. I'll start with ChatGPT, the one I have used the least but found the easiest to interact with: you send the AI a message as if you were chatting with another person. Its information isn't always accurate, and I have found from experience that the more you know about a subject, the more you realize how much it doesn't know. One of my favorite examples of this comes from Linus Torvalds, creator of the Linux kernel, who has said that AI is only good at things it has seen before and struggles to pioneer new spaces. This transitions nicely into GitHub Copilot, an LLM tailored to generating code based on the context of the rest of the file you're working in. This is the AI I have used most extensively, and I found it good at generating code that was easy to write; the harder problems it could not get right. After a few months I had enough of AI stepping on my toes, so I uninstalled it. While helping students in my office hours, I have seen many with Copilot embedded in their code editors, and the reason this hurts so much is that you don't even need to ask it to write code for you; it does so automatically. Finally, Google's AI Overview in search summarizes pages (often incorrectly) after you search for something. If you google something and it summarizes the results for you, it takes the search out of research. This, too, is automatic: you don't need to visit a website and ask for a summary; it is simply given to you. AI is more commonplace than ever, and students are feeling the impact on their education whether they seek it out (ChatGPT, Copilot) or it happens automatically (AI Overview).
This raises the question: how do you define critical thinking without being inside someone's head? In my quest to answer that for myself, I discovered a framework created by Benjamin Bloom in 1956 for categorizing educational goals and cognitive skills. This framework, called Bloom's Taxonomy, describes critical thinking as occurring in a set of six steps: Remembering, Understanding, Applying, Analyzing, Evaluating, and Creating. With this framework in mind, you can probably imagine how significantly AI alters the process. When using AI, you no longer need to Remember, Understand, or Apply the knowledge you are meant to gain in your education. Going back to my own experience, I think the absence of these first three fundamental steps is what produces students with high homework scores and low test scores. Picture yourself sitting in a classroom, watching your professor walk through a problem on the board. They show you how to do it step by step, and you might even think to yourself, "that's not that hard," once you have been walked through it. What happens to these students when they face the same problems on an exam? They will likely struggle: they have trouble remembering the steps, they never understood why the professor did things the way they did, and they have not had to apply the knowledge until the exam itself. The same thing happens when students use AI to do their homework. Maybe they sit there and try to study why the AI did things a certain way, but without actually writing the code themselves, they will not be able to do it again on exams. That is the exact issue my professor and the other course staff saw on the most recent exam.
The students have no issue writing code for homework, but on exams they got entire concepts wrong because, you guessed it, they never applied the concepts on their own.
Now for the meat and potatoes of this paper: how does AI actually impact critical thought? It is easy to speculate based on experience, but couldn't my observations be a one-off under very specific circumstances? According to a Microsoft Research paper by Hao-Ping (Hank) Lee et al., The Impact of Generative AI on Critical Thinking, there is a measurable impact of AI use on critical thinking. Participants reporting "'much less effort' or 'less effort'" comprised "72% in Knowledge, 79% in Comprehension, 69% in Application, 72% in Analysis, 76% in Synthesis, and 55% in Evaluation" (Lee et al.), showing a large reduction across most of the stages of critical thinking described by Bloom's Taxonomy. This reduction likely stems from the mental model users have of the AI, in which they assume that "AI is competent for simple tasks," which "can lead to overestimating AI capabilities" (Lee et al.). Educators are concerned as well. In my interview with Dr. Rupprecht, I asked for his stance on the use of AI in classrooms and whether it has harmed or improved education. He responded, "I think it has improved education far more than it has harmed our field," citing the accessibility of these tools and how they can be as engaging as an in-person lecture. While Dr. Rupprecht acknowledges the benefits, he also raised concerns, among them plagiarism and AI's potential to diminish critical thinking in students: "the 'harm' that most concerns me relates to diminishing critical thinking skills for those who become over-reliant on these tools vs. their own brains!" This goes to show that not only is there data supporting the reduction in critical thinking, but those in higher education are concerned about it as well.
To wrap up this paper with my own thoughts: I absolutely think there is a space for AI in education. Its use is inevitable, and many past innovations have disrupted education in similar ways; each time, the process has adapted. The calculator reduced the need for mental math, Google Search made research easier, and now AI can even write essays for us. All of these things made menial tasks much quicker, and from my point of view it would be pointless to try to prevent an innovation like this from being used. However, if instructors do not adapt their courses to mitigate AI use, this reduction in critical thinking will continue to grow and will only cause more problems. Personally, I would not want to work with any of the students I know use AI, because when they are asked to write simple code they give me blank stares. I do not think the current attitude of using AI to get your degree without learning, just because you'll have access to the internet in the real world (yes, that was an actual argument a student made to me about AI), is a good one, and companies will most certainly take issue with proprietary information being fed into ChatGPT or similar platforms. In closing, I believe that while AI is useful and undoubtedly a good tool for quick answers, an over-reliance on it for critical thinking tasks, or even to replace entire employees, will prove problematic in the long run.
Works Cited
Bittle, Kyle, and Omar El-Gayar. “Generative AI and Academic Integrity in Higher Education: A Systematic Review and Research Agenda.” Information, vol. 16, no. 4, 8 Apr. 2025, p. 296, www.mdpi.com/2078-2489/16/4/296, https://doi.org/10.3390/info16040296.
Jackson, Justin. “Increased AI Use Linked to Eroding Critical Thinking Skills.” Phys.org, 13 Jan. 2025, phys.org/news/2025-01-ai-linked-eroding-critical-skills.html.
Lee, Hao-Ping (Hank), et al. "The Impact of Generative AI on Critical Thinking: Self-Reported Reductions in Cognitive Effort and Confidence Effects from a Survey of Knowledge Workers." 2025, advait.org/files/lee_2025_ai_critical_thinking_survey.pdf, https://doi.org/10.1145/3706598.3713778.
Melisa, Rahyuni, et al. "Critical Thinking in the Age of AI: A Systematic Review of AI's Effects on Higher Education." Educational Process: International Journal, vol. 14, no. 1, 1 Jan. 2025, https://doi.org/10.22521/edupij.2025.14.31.
Royce, Christine Anne, and Valerie Bennett. "To Think or Not to Think: The Impact of AI on Critical-Thinking Skills." NSTA Blog, National Science Teaching Association, 2025, www.nsta.org/blog/think-or-not-think-impact-ai-critical-thinking-skills.
Rupprecht, Stephen. "How Has AI Impacted Higher Education?" Personal interview, 2025.