
Last week, in my ecology class, while handing out an assignment, my professor said something that stopped me mid-note:
“At least sit with the assignment, even if you’re not able to solve it.”
Sit with it?
In a classroom increasingly anxious about AI-generated content, about students blatantly outsourcing even casual homework to ChatGPT, this sentence felt oddly out of place, almost tender, almost radical.
The dominant narrative is familiar by now: students are lazy, unserious, intellectually hollowed out by easy tools. AI, we are told, is making us stupid. Depending on who is speaking, the proposed solution ranges from stricter surveillance to outright moral scolding.
That same morning, I read an article arguing exactly this, before pivoting, almost sheepishly, to suggest that perhaps the better question is not whether AI makes us dumb, but how we might use it to become smarter. I returned to my professor’s sentence and wondered: Is this entire ordeal really that simple?
Let me zoom out.
Before we rush into ethical gymnastics about tools, users, and outcomes, before we debate plagiarism, shortcuts, or productivity, I want to ask a different question. Not who is committing the crime, or how, or even what the damage is.
I want to ask: why?
Why do we reach for ChatGPT so quickly? And no, I am not talking about convenience or efficiency. I am talking about something quieter, something psychological.
My professor was not merely complaining about AI; he was pointing to something deeper: students no longer even try to sit with the assignment. They do not misinterpret it or struggle through a wrong approach; they simply hand it over. The easy explanation is laziness, stupidity, or moral decay.
But that explanation itself is lazy.
I have seen students who love their craft, who are skilled, curious, and genuinely invested, use AI just as readily. Fear does not discriminate between the capable and the unprepared.
For many students, especially those insecure about their thinking or writing, AI is not a shortcut. It is an armor. This is not about avoiding work. It is about avoiding the risk of being wrong.
Thinking is slow; it requires uncertainty. It demands that we tolerate not knowing, that we entertain the possibility of error. And error, in our current systems, is not acceptable. It is punished.
We live in a world obsessed with precision, accuracy, and speed: traits we increasingly mistake for intelligence. Traits that are, frankly, robotic. Trying, real trying, is destinationless and probabilistic. You might be right, but you will almost certainly be wrong first. And when wrongness carries disproportionate consequences (grades, judgment, humiliation, exclusion), the reward of being right no longer feels worth the risk. So we outsource the uncertainty.
AI does not merely give answers. It gives reassurance. It offers something polished, correct-looking, and safe. It allows us to bypass the most uncomfortable part of thinking: sitting with confusion.
When people say, “Students aren’t even trying anymore,” I want to ask: Trying for what?
For a system that rewards correctness over curiosity?
For institutions that treat mistakes as evidence of inadequacy rather than as part of growth?
This is not merely a student failure. It is a systemic one. Creation, real creation, is messy and imperfect; it stumbles, doubles back, contradicts itself before finding clarity. But instead of making space for that mess, we chase perfection, optimization, and acceptance.
And in doing so, we risk something quieter than academic dishonesty. We risk replacing human honor in failure with a robotic impulse to fill every gap of imperfection with precision.
I keep returning to a line by Sophocles: “Rather fail with honor than succeed by fraud.”
Every great genius has failed countless times before arriving at something meaningful. Those failures did not diminish their efforts. They paved the path toward them. What we risk letting AI steal is not our intelligence, but our permission to be imperfect.
Just to be accepted.
Just to be rewarded.
My professor did not ask us to solve the assignment.
He asked us to sit with it.
Perhaps that is the part we are most afraid of losing.
