AI-Resilient Assessments
Introduction
Generative AI is reshaping higher education, and many traditional assessments can now be completed quickly with these tools. The PROTECT framework addresses this challenge by offering a structured approach to redesigning assessments so they remain meaningful and resilient in an AI-enabled environment. Rather than focusing only on the final deliverable, PROTECT emphasizes real-world complexity and the process of learning (Wu et al., 2025).
A Closer Look
Find Assessments Vulnerable to AI & PROTECT Them
Before assigning an assessment, take a moment to evaluate it using three quick questions:
- Audit the Goals: Does the assessment rely heavily on simple recall of information rather than synthesis and application?
- Audit the Deliverables: Does the assessment rely on a static final product (like a standard essay) rather than a visible learning process (e.g., submitting drafts, reflection logs, or annotated bibliographies)?
- Audit the Thinking: Does the assessment allow a student to produce a high-quality output using AI while bypassing the core cognitive work (critical analysis, synthesis, or evaluation) intended by the assignment?
If the answer to any of these questions is “yes,” the assessment may be vulnerable to AI-assisted completion. Use the PROTECT framework to redesign your assessment so it’s both more AI-resilient and a richer learning experience for students.
The PROTECT Framework
The PROTECT framework consists of seven interconnected principles designed to deepen engagement while ensuring authentic learning.
Example: Redesigning an Assessment Using the PROTECT Framework
The Course: Undergraduate Economics
The Topic: Supply, Demand, and Price Ceilings
Traditional (AI-Vulnerable)
- Students write a paper explaining the theoretical impact of rent control on housing supply.
Redesigned (AI-Resilient)
- Students analyze the rental market in their university town by collecting current listing data and interviewing at least one local stakeholder.
Why This Works Better
- Personalization and Realism: Students engage with real, local conditions and interpret messy variables instead of relying on simplified theory that AI can easily reproduce.
Traditional (AI-Vulnerable)
- Students submit a standard written essay.
Redesigned (AI-Resilient)
- Students submit a five-minute video presentation proposing a policy recommendation to the city council.
Why This Works Better
- Originality: A video requires communication skills, synthesis, and personal engagement—and is harder for AI to replicate.
Traditional (AI-Vulnerable)
- Students are only required to submit a basic summary of existing concepts.
Redesigned (AI-Resilient)
- Students may use AI for tasks such as cleaning datasets, but they must submit an AI audit log documenting their prompts, workflow, and verification steps.
Why This Works Better
- Transparency and Constraints: AI is used as a tool, while students remain responsible for analysis and decisions.
Traditional (AI-Vulnerable)
- Students submit an essay heavily based on textbook theory.
Redesigned (AI-Resilient)
- Students draft a specific legislative amendment addressing rent control policy in their city.
Why This Works Better
- Taxonomy-Guided Progression: The redesigned assignment shifts students toward higher levels of Bloom’s taxonomy, moving beyond basic summarization (“Remember” and “Understand” levels) to applying, evaluating, and creating.
Additional Resources
- Anthropic Education Report: How University Students Use Claude
- Generative AI Course Reflection Document by the Office of Online & Professional Learning Resources
References
- Wu, X., Ma, J., Roopaei, M., & Wang, Y. (2025). PROTECT: A framework for preserving project-based learning integrity in the AI era. 2025 IEEE Frontiers in Education Conference (FIE).