Put simply, writing with AI reduces the maximum strain required from your brain. For many commentators responding to this article, this reality is self-evidently good. “The spreadsheet didn’t kill math; it built billion-dollar industries. Why should we want to keep our brains using the same resources for the same task?”
My response to this reality is split. On the one hand, I think there are contexts in which reducing the strain of writing is a clear benefit. Professional communication in email and reports comes to mind. The writing here is subservient to the larger goal of communicating useful information, so if there’s an easier way to accomplish this goal, then why not use it?
But in the context of academia, cognitive offloading no longer seems so benign. In a learning environment, the feeling of strain is often a by-product of getting smarter. To minimize this strain is like using an electric scooter to make the marches easier in military boot camp; it will accomplish this goal in the short term, but it defeats the long-term conditioning purposes of the marches.
I wrote many a journal entry in college complaining about this exact point, except we were still arguing about graphing calculator and laptop use.
Now that I’m older, I understand the split that Cal talks about here.
When I’m writing software to accomplish a task for work, it’s more important for me to spend my brain energy on building the context of the problem in my head.
When I’m writing an essay and trying to prove that I understand a concept, it’s more important for me to get the words out of my head and onto paper. Then I can use tools to help me clean it up later.
Maybe this points to a larger problem I’ve had with our education system. Imagine a spectrum of the intent of college. The left end of the spectrum represents “learning how to critically think about ideas”. The right end represents “learning skills that will help you survive in the real world”.
When someone makes fun of a film studies major, it’s because their evaluation of the spectrum is closer to the right end.
When someone makes fun of students using ChatGPT for writing their essays for them, it’s because their evaluation is closer to the left.
Continue to the full article →
Today, quite suddenly, billions of people have access to AI systems that provide augmentations, and inflict amputations, far more substantial than anything McLuhan could have imagined. This is the main thing I worry about currently as far as AI is concerned. I follow conversations among professional educators who all report the same phenomenon, which is that their students use ChatGPT for everything, and in consequence learn nothing. We may end up with at least one generation of people who are like the Eloi in H.G. Wells’s The Time Machine, in that they are mental weaklings utterly dependent on technologies that they don’t understand and that they could never rebuild from scratch were they to break down.
Before I give a counterpoint, I do want to note the irony that even now people do not understand how this stuff works. It’s math, all the way down. It shouldn’t work, frankly… but it does!
I think that is so beautiful. We don’t really understand much about our universe: dark matter, gravity, any number of naturally occurring phenomena.
But just because we don’t understand it doesn’t mean we can’t harness it to do amazing things.
As far as the students using ChatGPT… I mean, yeah, it’s painfully obvious to most teachers I chat with when their kids use the tech to get by.
I would posit, though, that this is the history of education in general. We teach students truths about the world, and they go out and show us how those truths are not entirely accurate anymore.
Sure, some kids will use ChatGPT to compose an entire essay, which circumvents the entire point of writing an essay in the first place: practicing critical thinking skills. That’s bad, and an obviously poor use of the tool.
But think of the kids who are using AI to punch up their thoughts, challenge their assumptions with unconsidered angles, and communicate their ideas with improved clarity. They’re using the tool as intended.
That makes me so excited about the future. That’s what I hope teachers lean into with artificial intelligence.
(via Simon)
Continue to the full article →
Saturday Morning Breakfast Cereal:
Why do we want a liberal education? Because everyone in the modern university is living in its opposite, and it sucks.
Oof, this was a great one. Makes me wonder what would make for a better collegiate experience. Perhaps not charging an insane amount for it, making it more accessible to a diverse set of students, and allowing more people to participate in the free exchange of ideas?
Continue to the full article →
"If they want to increase their rankings in U.S. News & World Report, an easy way to do that is to bribe high-scoring students to come to your university with non-need-based aid," said Richard Kahlenberg, a specialist in education at the Century Foundation.
I'd like to hope a fair share of them are accepting financial aid/loans because their parents are teaching them the value of money, but it still stinks for the rest of us with loans and debt for a degree in, ahem, journalism.
Continue to the full article →