What 73 teachers told me about AI and the web in education

For this write-up, we cold-emailed a large number of teachers across Denmark, Sweden, Finland, and Estonia (mostly primary and secondary school). We framed the outreach as being about “the web in education” (rather than explicitly “AI”) to avoid mainly attracting AI enthusiasts or detractors, and then ran open-ended interviews about how the web, and increasingly AI as part of it, shows up in their day-to-day teaching.

I’m not sure I agree with all of the conclusions (e.g. that AI is causing weaker reading stamina), but I’m presenting them here for transparency.

Here are the 8 themes that came up repeatedly.

1) Distraction is constant when laptops/the web are open

When every student has a browser open, teachers describe attention as fragile and easily pulled away by games, YouTube, shopping, notifications, and “just checking one thing.” Even teachers who aren’t anti-tech describe it as an unfair fight: a single adult trying to compete with an entire internet designed to capture attention. The result is that “computer time” often becomes split between intended tasks and parallel entertainment, especially when work gets hard or boring.

“For many, the computer is perfect for doing something else if they don’t feel like listening.” — Upper-secondary teacher

“Honestly, pretty bad. If I’m not lecturing or giving instructions, they’re playing games or watching YouTube.” — Swedish teacher, vocational programs

“Distraction. Constant quick clicks—moving away from learning.” — Upper-secondary physics teacher

“You can’t let Year 1 or 2 students work freely online; after five minutes, half the class is elsewhere.” — Upper-secondary physics teacher

“One teacher versus thirty optimized brains is not a fair fight.” — Programming/physics teacher

2) AI becomes the default shortcut—students jump to answers before thinking

Across subjects, teachers describe a shift in “first instinct”: instead of trying, then searching, students often go straight to AI to get a complete response. That changes the sequence of learning. Rather than building understanding and using tools to support it, students often treat AI as the primary engine of thinking—especially for study questions, definitions, and written responses. Teachers describe this as fast, convenient, and hard to reverse once normalized.

“Year two—it’s lightning fast. They don’t think for themselves; they go straight to AI.” — Economics/law teacher

“Students using AI as a shortcut without thinking. That’s the biggest issue.” — Finnish teacher

“Now they don’t Google anymore, and they’ve become worse at it.” — Social studies teacher

“They ask poorly formulated questions, Chat understands anyway, and they don’t click links or check sources.” — Social studies teacher

“Right now it’s a runaway train.” — Economics/law teacher

3) Authentic assessment is collapsing (and teachers often can’t prove AI use)

A recurring pain point is that teachers can often sense when work isn’t the student’s—tone, vocabulary, structure, sudden jumps in quality—but can’t reliably prove it. That creates a social and professional problem: confronting a student can feel accusatory, while letting it slide makes grades feel less meaningful. It also breaks familiar assignment types, because tasks that used to show student thinking can now be “produced” without that thinking being present.

“With written assignments, I get this stream that isn’t really theirs—they can’t explain it.” — Economics/law teacher

“When you try to explain, ‘This is not your way of writing,’ … they say, ‘Prove it.’” — Math/natural science teacher

“It’s hard to know if the assignment is difficult or if it’s the setup.” — Swedish teacher, vocational programs

“Teachers often suspect that a student has cheated but can’t prove it.” — Interview summary, technical high school teacher

“Examinations—ensuring students actually wrote the work. Universities face the same problem.” — Philosophy/Swedish teacher

4) Teachers respond by moving work back into class: handwriting, oral checks, locked modes

In response, many teachers describe redesigning assessment and practice so they can observe the process: writing in class, handwriting, oral explanations, and locked exam environments. This is less about nostalgia and more about regaining visibility into student work. The tradeoff teachers describe is that longer, authentic writing processes (drafting and revising over time) become harder to run, so they compress writing into controlled sessions or shift toward formats that are harder to outsource to AI.

“Written homework doesn’t work at all.” — Economics/law teacher

“To prevent AI from doing the whole task, I often have assignments done in class, so I can see them writing.” — Danish/IB teacher

“More and more, I’m moving away from written assignments.” — Upper-secondary teacher

“We’ve started having students take notes by hand.” — Natural science/philosophy/history teacher

“I started handing out notebooks, and all students are required to take notes by hand. Laptops are only allowed when I explicitly permit them.” — Finnish lecturer, social studies

5) Reading, vocabulary, and attention are getting worse (and AI/web habits amplify it)

Multiple teachers connect digital habits to weaker reading stamina and vocabulary growth. They describe students struggling with longer texts, giving up quickly, and having less patience for building a “base” before reflecting. This shows up sharply in language-heavy subjects (law, Swedish, social science), where comprehension and precise language matter, and where AI can mask gaps until students have to explain or write without support.

“Reading and writing skills are worse. Attention span is shorter.” — Economics/law teacher

“Vocabulary is poor, even basic words.” — Upper-secondary teacher

“They read three sentences and think they know everything.” — Upper-secondary teacher

“There’s also the challenge of shortened attention spans.” — Finnish lecturer, social studies

6) Source criticism is a major gap—students treat AI/Google summaries as “truth”

Teachers repeatedly describe students struggling to evaluate sources, and AI makes that harder by presenting fluent, confident answers that feel “authoritative.” Several note that students cite “Google” as a source, don’t click through to links, and may not recognize when they’re reading an AI summary versus a human-authored text. This pushes teachers toward explicitly teaching what AI is (and isn’t), and requiring students to back claims with original sources.

“Many don’t even know what AI is… When I ask what sources they used, they say ‘Google said so.’” — Upper-secondary teacher

“Students already struggle with source criticism. They think AI has read everything and gives correct answers.” — Natural science/philosophy/history teacher

“I once compared a real source with an AI-generated one—the AI version looked more structured and clearer, but it was factually wrong.” — Year 9 social science teacher

“They need to be critical and not swallow things blindly.” — Math/technology teacher

7) The impact is unequal: weaker students (and the “middle”) are hit hardest

A common pattern is that strong students often use AI as an assistant and can still “carry” the reasoning, while weaker students are more likely to paste and submit without understanding—sometimes believing the teacher won’t notice. Several teachers also point to a “middle group” risk: students who could build competence but get pulled into shortcuts and distractions, which can flatten progress over time. This shows up as mismatches between submitted work and the student’s ability to explain it.

“A weaker student thinks the teacher will believe they’re amazingly good… I don’t see them in the text.” — Math/technology teacher

“Students with less background knowledge… disappear into the computer.” — Upper-secondary teacher

“The middle group—the ones who could become ambitious—are easily drawn into the digital world and lose that chance.” — Biology/chemistry teacher

8) Teachers want control over the digital environment—but tools are clunky/expensive and guidance lags

Many teachers describe feeling stuck between two bad options: close devices entirely, or accept distraction and AI misuse. They want selective control (allow Classroom and specific sites/tools, block games/ChatGPT at times), but current solutions are described as rigid, hard to manage, or expensive. In parallel, teachers describe a lack of training and uneven policies, leaving each teacher to improvise rules, enforcement, and pedagogy. (A minimal sketch of what that allow-by-default approach could look like follows the quotes below.)

“We’ve blocked five thousand sites, but there are still fifty million more.” — Math/natural science teacher

“That is the dream.” — Math/natural science teacher

“We use exam cookies. They work, but the locks are heavy and clunky. There must be a better way.” — Danish/IB teacher

“I tried Norstedts’ legal chat… one license costs 38,000 SEK. That doesn’t work.” — Economics/law teacher

“AI appeared like an atomic bomb. We got no training.” — Math/natural science teacher

“Many teachers are waiting for the ministry to give guidance—but they won’t.” — Danish/IB teacher
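To make the selective-control idea concrete: the complaint “we’ve blocked five thousand sites, but there are still fifty million more” is essentially an argument for default-deny. Below is a minimal, purely illustrative Python sketch of that allowlist logic, not anything any of these schools run; the domain names are placeholders, and a real deployment would live in a proxy, firewall, or device-management policy rather than a standalone script.

```python
from urllib.parse import urlparse

# Placeholder allowlist for illustration only; not a recommended policy.
ALLOWED_DOMAINS = {
    "classroom.google.com",
    "docs.google.com",
    "stackoverflow.com",  # e.g. enabled only for certain lessons
}


def is_allowed(url: str) -> bool:
    """Default-deny: permit a request only if its host is on the allowlist."""
    host = (urlparse(url).hostname or "").lower()
    return any(host == d or host.endswith("." + d) for d in ALLOWED_DOMAINS)


if __name__ == "__main__":
    print(is_allowed("https://classroom.google.com/c/abc"))   # True
    print(is_allowed("https://www.youtube.com/watch?v=xyz"))  # False
```

The point of the sketch is the shape of the rule: a short list a teacher can actually read and reason about, instead of an ever-growing blocklist chasing “fifty million more” sites.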

As a side note: one teacher described “rescuing” old laptops, installing Ubuntu on them, and scripting a simple on/off internet toggle from the teacher’s desk. Students still had the essentials (an editor plus offline documentation for Python and Pygame), and in some cases limited access to StackOverflow. The default mode was “work with what you have,” and the internet could be enabled only when it served the lesson.
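The teacher didn’t share the script itself, so the following is only a guess at its shape: a Python sketch that assumes the student laptops run Ubuntu with NetworkManager, are reachable over SSH with key-based authentication, and allow passwordless sudo for nmcli. The hostnames and the “student” user are invented for illustration.

```python
#!/usr/bin/env python3
"""Hypothetical classroom internet toggle, run from the teacher's desk.

Assumes Ubuntu student laptops with NetworkManager, SSH key auth, and
passwordless sudo for `nmcli`. Hostnames below are placeholders.
"""
import subprocess
import sys

STUDENT_HOSTS = ["lab-01.local", "lab-02.local"]  # placeholder hostnames


def set_internet(host: str, enabled: bool) -> None:
    """Turn networking on or off on one student laptop via nmcli."""
    state = "on" if enabled else "off"
    # Turning networking off drops the SSH session itself, so ssh may exit
    # non-zero even though the command ran; don't treat that as fatal.
    subprocess.run(
        ["ssh", f"student@{host}", "sudo", "nmcli", "networking", state],
        check=False,
    )


if __name__ == "__main__":
    mode = sys.argv[1] if len(sys.argv) > 1 else "off"  # default to offline
    for host in STUDENT_HOSTS:
        set_internet(host, enabled=(mode == "on"))
```

A blunt switch like this wouldn’t cover the “limited StackOverflow access” part of the setup; that would need something more selective, along the lines of the allowlist idea sketched above.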