How should exams, projects, coursework ...
The Core Dilemma: Assist or Damage?
Learning isn’t only about producing correct answers; it’s about learning to think, to reason, to innovate. AI platforms such as ChatGPT can be either:
- Learning enhancers: tutors, guides, and assistants that introduce learners to new paths of exploration.
- Learning underminers: crutches that hand students answers, letting them skim through assignments without gaining depth of knowledge.
 
The dilemma is how to incorporate AI so that it promotes curiosity, creativity, and critical thinking rather than replacing them.
1. Working with AI as a Teaching Companion
AI must not be framed as the enemy, but as a class teammate. A few approaches:
- Explainers in plain terms: Students are afraid to admit that they did not understand something. AI can describe things at different levels (child-level, advanced, step-by-step), dispelling the fear of asking “dumb” questions.
- Personalized examples: A mathematics teacher might have AI generate practice questions tailored to each student’s current level of understanding. In literature, it could propose alternative endings to novels for discussion.
- 24/7 study buddy: Students can “speak” with AI outside of class when teachers are not available, reinforcing learning without leaving them stranded.
- Brainstorming prompts: In art, creative writing, or debate classes, AI can jump-start brainstorming by presenting students with scenarios or viewpoints they may not have considered.
 
Here, AI opens doors, but it does not replace the teacher’s role in directing, contextualizing, and correcting.
2. Redesigning Tests for the Age of AI
The biggest worry is testing. If AI can produce polished essays or solve equations flawlessly, how do we measure what students really know? A few adjustments help:
- Move from recall to reasoning: Instead of “define this term” or “summarize this article,” have students compare, critique, or apply ideas—tasks AI can’t yet master alone.
- In-class, process-oriented evaluation: Teachers can assess students’ thinking by looking at drafts, outlines, or a discussion of how they approached a task, not just the final, polished product.
- Oral defenses & presentations: After writing an essay, students can defend their argument aloud. This shows they actually understand what is on the page.
- AI-assisted assignments: Teachers can instruct, “Use AI to jot down three ideas, but explain why you kept or dropped each one.” This keeps AI a visible part of the process, not a hidden shortcut.
 
This way, grading measures human thinking, judgment, and creativity, even when AI is used.
3. Training & Supporting Teachers
Many teachers are afraid of AI; they worry it will take their jobs. But successful integration happens when teachers are empowered to use it:
- Professional development: Hands-on training where teachers learn by using AI tools, not just hearing about them, so they truly understand the strengths and shortcomings.
- Communities of practice: Teachers share examples of successful AI implementation so that best practices spread naturally.
- Transparency with students: Instead of banning AI out of fear, teachers can show students how to use it responsibly, demonstrating that it’s a tool, not a cheat code.
 
When teachers feel secure, they can guide students toward healthy use rather than policing them out of fear.
4. Setting Boundaries & Ethical Standards
Students need clarity, not guesswork, about what counts as acceptable use of AI. A few guidelines go a long way:
- Disclosure: Ask students to report if and how they employed AI (e.g., “I used ChatGPT to get ideas for outlines”). This incorporates integrity into the process.
- Boundaries by skill level: Teachers can restrict AI use in the lower grades to protect foundational skill acquisition; more autonomy can be granted at higher levels.
 
Beyond explicit rules, schools can also talk ethics: instead of framing this as “don’t get caught,” they can hold open discussions about integrity, trust, and why learning matters beyond grades.
5. Keeping the Human at the Center
Education is not just about delivering information. It’s about developing thinkers, creators, and empathetic humans. AI can help with efficiency, access, and customization, but it can never substitute for:
- The excitement of discovery when a student learns something on their own.
 - The guidance of a teacher who sees potential in a young person.
 - The chaos of collaboration, argument, and experimentation in learning.
 
So the question shouldn’t be “How do we keep AI from killing education?” but rather:
“How do we rethink teaching and testing so AI can enhance humanity instead of erasing it?”
Last Thought
Think about calculators: once feared as machines that would destroy math skills, they are now everywhere because we redefined what we want students to learn (not just arithmetic, but mathematical problem-solving). AI can follow the same path, if we’re purposeful.
The best integrations will:
- Let AI perform repetitive, routine work.
- Preserve human judgment, creativity, and ethics.
- Teach students not only to use AI but to critique it, revise it, and in some instances, reject it.

That’s how AI transforms from a cheat into an amplifier of learning.
 
                    
The Old Model and Why It’s Under Pressure
Essays and homework were long the stalwarts of assessment. They measure knowledge, writing skill, and critical thinking. But with AI, it is now easy to produce well-written essays, finish problem sets, or even write code in minutes.
That does not mean students are learning less; it means the tools they use have changed. Relying on the old model without adapting is like asking students to write out multiplication tables by hand when calculators are everywhere. It misses the point.
Redesigning Exams
Exams are designed to test individual knowledge. With AI in the picture, they may need to shift so that testing is less about “what do you know” and more about “how do you think.”
Rethinking Projects & Coursework
Projects are where AI may either replace effort or spark new creativity. Keeping them relevant means shifting coursework away from work that can simply be outsourced and toward reflection, originality, and human effort.
Reframing Coursework Purposes Altogether
If AI can already handle the “garden variety” work, perhaps education can focus on higher-order goals.
The Human Side
This isn’t about “catching cheaters.” It’s about recognizing that tools evolve, but learning doesn’t. Students want to be challenged, but also supported. When everything turns into a test of whether they can outsmart AI bans, motivation falters. When, instead, they see AI as just one of several tools, and the question becomes how creatively, critically, and personally they use it, education comes alive again.
Last Thought
Just as calculators revolutionized math tests, AI will revolutionize written work. The answer is not to prohibit homework or essays but to reimagine them.
In short, the future of exams, project work, and coursework shouldn’t try to compete with AI; it should measure what only humans can uniquely do.