The rapid rise of generative artificial intelligence has intensified longstanding challenges in mathematics assessment, particularly for open-ended projects that can be easily outsourced to AI tools. Yet projects remain essential for fostering mathematical sensemaking, creativity, and positive mathematical identity, especially in liberal arts mathematics courses serving non-STEM majors. This session presents a sequence of AI-resistant mini-projects implemented in a Liberal Arts Mathematics (MA111) course. These projects are intentionally designed around personalization, invention, and reflective explanation: features that demand authentic student thinking and cannot be meaningfully outsourced to AI. Examples include: (1) designing an original recursive number sequence inspired by Fibonacci-type patterns, (2) analyzing mathematical structure in a natural phenomenon of the student’s choosing, and (3) creating a personal number system with defined symbols, rules, and representations. The design framework integrates three core principles: 1. Personalization (student-chosen contexts and parameters), 2. Creation (students generate new mathematical objects or systems), and 3. Justification (written explanation of reasoning and meaning). Student artifacts and reflections indicate that these projects promote ownership, creativity, and deeper engagement with mathematical ideas while substantially reducing AI-generated submissions. Practical assignment prompts, scaffolding strategies, and grading approaches will be shared so participants can adapt AI-resistant project structures to their own courses.
The rapid advancement of artificial intelligence has created new challenges for mathematics educators seeking assignments that promote authentic student engagement rather than AI-assisted completion. This study examines student-created video solutions to homework problems as an AI-resistant assignment in undergraduate mathematics. Implemented across four undergraduate mathematics courses, the assignment required students to explain their problem-solving processes verbally and visually. Classroom-based empirical evidence was collected through two measures: (1) comparisons of exam performance across levels of video-homework completion and quality, and (2) a student perception survey. Across courses, students who consistently earned full-credit video scores demonstrated higher average exam performance than peers with incomplete or lower-quality submissions. Survey responses further indicated that students perceived improvements in conceptual understanding, organization of reasoning, and confidence in explaining mathematics. Together, these findings suggest that structured video explanation assignments can promote genuine engagement and deeper learning while serving as a practical AI-resistant assessment strategy in undergraduate mathematics.
It is often said that the best way to understand something is to try to explain it to someone else. Based on this, would asking our students to explain important course concepts in their own words be an effective learning technique? To test this hypothesis, a few years ago I introduced oral exams as an alternative assessment method in some of my upper-division courses. In this talk, I will describe how I conduct oral exams in my Real Analysis and Abstract Algebra courses, giving examples of the types of questions I ask, criteria for evaluation, strategies for alleviating student anxiety, and some benefits of assessing students using this method.