It's a fact of life that automation can make us lazier. Usually, the tradeoffs are worth it. But the bargain feels more pernicious with AI chatbots, which offer to automate thinking itself.
That's what Sam Schechner, a tech reporter for The Wall Street Journal, began to wise up to after developing a nasty ChatGPT habit.
"Artificial intelligence was eating my brain," he wrote in a recent essay for the newspaper.
Schechner, an American living in Paris, had come to rely on the OpenAI chatbot to draft emails in French. It got to the point where he even used it to write some of his texts to his French friends.
"After years of building up my ability to articulate nuanced ideas in French, AI had made this work optional. I felt my brain get a little rusty," Schechner wrote. "I was surprised to find myself grasping for the right words to ask a friend for a favor over text."
Robert Sternberg, a Cornell University professor of psychology, cautioned Schechner.
"With creativity, if you don't use it, it starts to go away," he told the WSJ reporter.
Schechner's experience isn't unique. A recent study from researchers at Microsoft and Carnegie Mellon suggested that people's critical thinking skills atrophied the more they relied on, and trusted in, AI responses. Another paper, published last year, found a distressing link between heavy ChatGPT use among students and both memory impairment and falling grades.
These are examples of a broader phenomenon that psychologists call cognitive offloading: outsourcing mental tasks to external tools, like an AI model. Cognitive offloading can be helpful, freeing our brains from tedious work so we can focus on more fulfilling pursuits, or on harder problems that deserve our full attention. But you can get too much of a good thing.
As their name warns us, large language models offer to take care of language for us. Thought and language aren't one and the same, but the line separating them is blurry. In any case, it's an awful lot to be surrendering to tools that are infamously prone to making up information and lying.
"Tools like GPS and generative AI make us cognitively lazy," Louisa Dahmani, a neuroscientist at Massachusetts General Hospital, told Schechner. In 2020, Dahmani showed that relying on GPS to get around cripples our spatial memory. "While it's possible to use these tools in a mindful manner, I think that most of us will take the path of least resistance."
Yes, we should all eat our vegetables. Still, it can be hard to make the case to someone that they shouldn't just ask a chatbot to write an essay and save them hours of work, or churn out that email to a boss they hate. After all, we struggle to put our smartphones down, even though we know that they too are turning our gray matter to mush.