AI-Resilient Assessments

Introduction

Generative AI is reshaping higher education, and many traditional assessments can now be completed quickly with these tools. The PROTECT framework addresses this challenge by offering an approach to redesigning assessments so they remain meaningful and resilient in an AI-enabled environment. Rather than focusing only on the end deliverable, PROTECT emphasizes real-world complexity and the process of learning (Wu et al., 2025).

A Closer Look

Find Assessments Vulnerable to AI & PROTECT Them

Before assigning an assessment, take a moment to evaluate it using three quick questions:

  1. Audit the Goals: Does the assessment rely heavily on simple recall of information rather than synthesis and application?
  2. Audit the Deliverables: Does the assessment rely on a static final product (like a standard essay) rather than a visible learning process (e.g., submitting drafts, reflection logs, or annotated bibliographies)?
  3. Audit the Thinking: Does the assessment allow a student to produce a high-quality output using AI while bypassing the core cognitive work (critical analysis, synthesis, or evaluation) intended by the assignment?

If the answer to any of these questions is “yes,” the assessment may be vulnerable to AI-assisted completion. Use the PROTECT framework to redesign your assessment so it’s both more AI-resilient and a richer learning experience for students.

The PROTECT Framework

The PROTECT framework consists of seven interconnected principles designed to deepen engagement while ensuring authentic learning.

Personalization

Connect assessments to students’ personal experiences, communities, or local contexts. When students must work with their own environment or perspectives, generic AI responses become far less useful. This also encourages stronger ownership of the work.

Realism

Bring real-world complexity into assessments. Authentic situations often include incomplete data, limited budgets, and conflicting stakeholder interests. These “messy” conditions require judgment and interpretation that go beyond what AI can easily produce.

Originality

Encourage students to demonstrate learning in multiple ways. In addition to standard written work, consider video presentations, prototypes, interviews, or oral defenses.

Taxonomy-Guided Progression

Design assessments that move students up Bloom’s taxonomy, progressing from basic understanding toward analysis, evaluation, and creation. At higher levels, students must critique ideas, refine outputs, and apply knowledge in new contexts.

Evaluation Diversity

Remember to assess the learning process, not just the final product. Try incorporating checkpoints such as draft submissions, peer reviews, progress updates, or live discussions to verify genuine engagement and understanding.

Constraints

Provide clear guidelines for appropriate AI use. Help students distinguish between beneficial assistance (such as brainstorming or debugging) and inappropriate substitution (where AI replaces a student’s thinking). Clear boundaries support both learning and academic integrity.

Transparency

Treat AI use as a reflective learning practice. Ask students to document their prompts, describe how they used AI tools, and explain how they validated results. This approach shifts AI from a hidden shortcut to a visible and accountable academic tool.


Example: Redesigning an Assessment Using the PROTECT Framework

The Course: Undergraduate Economics

The Topic: Supply, Demand, and Price Ceilings

Traditional (AI-Vulnerable)

  • Students write a paper explaining the theoretical impact of rent control on housing supply.

Redesigned (AI-Resilient)

  • Students analyze the rental market in their university town by collecting current listing data and interviewing at least one local stakeholder.

Why This Works Better

  • Personalization and Realism: Students engage with real, local conditions and interpret messy variables instead of relying on simplified, AI-generatable theory.


Traditional (AI-Vulnerable)

  • Students submit a standard written essay.

Redesigned (AI-Resilient)

  • Students submit a five-minute video presentation proposing a policy recommendation to the city council.

Why This Works Better

  • Originality: A video requires communication skills, synthesis, and personal engagement, and it is harder for AI to replicate.


Traditional (AI-Vulnerable)

  • Students are only required to submit a basic summary of existing concepts.

Redesigned (AI-Resilient)

  • Students may use AI for tasks such as cleaning datasets, but they must submit an AI audit log documenting their prompts, workflow, and verification steps.

Why This Works Better

  • Transparency and Constraints: AI is used as a tool, while students remain responsible for analysis and decisions.


Traditional (AI-Vulnerable)

  • Students submit an essay heavily based on textbook theory.

Redesigned (AI-Resilient)

  • Students draft a specific legislative amendment addressing rent control policy in their city.

Why This Works Better

  • Taxonomy-Guided Progression: The redesigned assignment shifts students toward higher levels of Bloom’s taxonomy, moving beyond basic summarization (“Remember” and “Understand” levels) to applying, evaluating, and creating.



Additional Resources

Reference