AI Detection in Schools

It's Not About the Technology

Why AI in education is really about trust.

Every conversation about AI in education eventually becomes a conversation about trust. The technology is just the stage. The drama playing out is much older.


I've been teaching long enough to remember when the big concern was Wikipedia. Then it was SparkNotes. Then phones in class. Now it's ChatGPT. The technology changes. The underlying question doesn't:

Do we trust our students?

And just as importantly:

Do our students trust us?

The Trust Deficit

When administrators mandate AI detection software, they're making a statement about trust — specifically, that they don't trust students to do their own work, and they don't trust teachers to identify problems without algorithmic assistance.

When students use AI to complete assignments, they're also making a statement about trust — that they don't trust the assignment to be worth their time, or they don't trust that teachers will understand their circumstances, or they don't trust themselves to succeed without help.

When teachers feel they need surveillance tools, it's often because the relationship that once made cheating obvious — knowing each student and each student's voice — has broken down.

AI didn't create these trust gaps. It just made them impossible to ignore.

What We've Lost

We've Lost Knowing Our Students

Class sizes have grown. Teacher loads have increased. The time we once had to read student work carefully, to conference individually, to know each writer's voice — that time has been squeezed out. Detection tools promise to replace what those relationships used to provide.

Students Have Lost Connection to Purpose

When assignments feel like compliance exercises — points to accumulate, boxes to check — students lose the sense that the work matters. Integrity requires believing that authentic engagement is worthwhile.

We've Lost Benefit of the Doubt

The default stance has shifted from assuming good faith to assuming potential cheating. This affects how students experience school — surveilled rather than supported.

What Detection Can't Do

Here's what no AI detection tool can tell you:

  • Whether your student is struggling with depression and couldn't face another blank page
  • Whether the assignment felt meaningless and AI seemed like reasonable triage
  • Whether the student has a learning disability that makes writing excruciating
  • Whether they're working 30 hours a week to help support their family
  • Whether they've lost faith that school leads anywhere

Detection tells you that something might be AI-generated. It doesn't tell you why, or what the student actually needs.

Rebuilding Trust

If the AI debate is really about trust, then the solution isn't better technology. It's rebuilding the conditions where trust can grow.

What Trust-Building Looks Like

  • Smaller class sizes so teachers can actually know their students and their writing
  • Meaningful assignments where students understand why the work matters
  • Process over product — valuing the journey of learning, not just final submissions
  • Conversations before accusations — when something seems off, asking questions first
  • Acknowledging student pressures — recognizing the unrealistic demands we place on young people
  • Grace alongside standards — maintaining expectations while leaving room for human imperfection

The Choice We Face

We can respond to AI with more surveillance, more detection, more adversarial dynamics. That path leads somewhere — but probably not somewhere good.

Or we can use this moment to ask harder questions: Why do students feel compelled to shortcut? What would make authentic engagement worthwhile? How do we rebuild relationships that make integrity feel natural?

The technology will keep changing. Trust is the constant. If we get that right, the tools matter less.