OSU Learns STEM Students Lean on AI Too Much; New Study Suggests Ways to Keep Them Thinking

University students with traits coveted in STEM fields are also the ones most prone to relying on artificial intelligence to the detriment of their cognitive skills, according to a new study by scientists at Oregon State University.

Researchers in OSU’s colleges of Engineering and Liberal Arts offer suggestions for breaking what they call “the self-reinforcing cognitive debt cycle” stemming from routine AI dependence.

The study, led by computer science graduate student Rudrajit “Rudy” Choudhuri, indicates that without corrective action by educators and AI designers, thinking skills will steadily decline amid an illusion of increased efficiency.

“Our findings make you rethink what kind of AI literacy is useful after all,” he said.

Choudhuri and doctoral advisor Anita Sarma teamed up with Margaret Burnett, distinguished professor of computer science, and Christopher Sanchez, associate professor of psychological science, to analyze survey data from 299 STEM students representing five North American universities.

The analysis reveals a feedback loop: The routine use of generative AI serves to weaken students’ intellectual habits, which leads them to lean on AI even more, further diminishing their thinking skills.

“The most surprising finding was that the ‘AI-savvy’ ones, who you might think do significantly better in STEM careers, are actually spinning themselves deeper into the AI dependence spiral,” Choudhuri said. “The choices educators and AI designers make now will shape not only what students learn, but how they learn to think.”

The researchers note that the “cognitive laziness” associated with an over-reliance on AI is explained at least in part by evolution. Our ancestors who learned the skillful use of intuitive, automatic and low-effort “System 1” thinking were able to outcompete those who overapplied “System 2” thinking, which is deliberate, controlled and effortful.

“System 1 is what delivers the rapid answer to 2 + 2 = ? or completes the phrase ‘bread and _____,’” Choudhuri said. “System 2 is invoked when you calculate 17 × 24 or evaluate a complex argument. System 2 is lazy. It avoids effort whenever System 1 can provide a plausible solution. It’s what psychologists call the law of least effort, which is a pervasive tendency to accept the path of least cognitive resistance.”

Basically, human ancestors who conserved their mental energy for actual threats survived, whereas those who performed exhaustive analyses before every decision didn’t.

“Today we make shopping lists rather than rely on memory, let calculators do our arithmetic and use GPS instead of reading maps,” Sarma added. “Generative AI tools tend to amplify this sort of trend because they deliver fluent, confident support on demand. That means interacting with AI is more about choosing from outputs rather than thinking by doing.”

Though choosing instead of thinking can create a compelling sense of progress via time savings, they said, it’s problematic for learning. That’s particularly true in STEM education, where students need “desirable difficulties” – slow, painstaking cognitive work that builds intuition and transferable skills. In the survey, students who trusted and routinely used generative AI reported significantly lower cognitive engagement.

“The types of skills that students are farming out to AI actually put them in a hole when it comes to developing skills that are genuinely needed,” Sanchez said. “In short, they are not using AI as an assistant but as more of a substitute for actually engaging with learning a topic.”

The scientists suggest some ways for students to build these cognitive skills, including:

  • Increasing the intrinsic reward of cognitive engagement by designing coursework as opportunities for exploration through storytelling, game playing or puzzle solving.
  • Alternating between AI-assisted and independent work. For example, students might first solve a problem with help from AI, then, during a supervised lab, be required to independently critique, debug, repair, verify and adapt that AI contribution to their project requirements without further AI assistance.
  • Combining these approaches with in-class assessments and/or collaborative/group activities to further reinforce engagement by leveraging social accountability.
  • Designing generative AI systems as “bicycles for the mind.” Bikes amplify physical efficiency while requiring the rider’s active control; designed as mind-bicycles, AI would amplify cognition via critique, counterargument and provocation while keeping humans responsible for exploration, steering and judgment.

“Absent intervention, STEM education runs the risk of cultivating a generation of students unwilling and/or unable to exercise their capacity for reflection, understanding and critical thinking,” Burnett said. “If routine reliance on AI changes students’ willingness to engage in effortful thinking, many may enter professional life without having developed the intellectual habits that earlier generations gained through practice.”

The USDA National Institute of Food and Agriculture and the National Science Foundation supported this study, which can be viewed online in preprint form.

By Steve Lundeberg

Do you have a story for The Advocate? Email editor@corvallisadvocate.com