Minerva Assessment Tools

Designing a tool for providing frequent, detailed feedback on all aspects of student performance, including assignments, class contributions, and group activities.

HIGHER EDUCATION

LEARNING SCIENCE

UX DESIGN

WEB

At the core of Minerva’s active learning philosophy is frequent, detailed feedback on every element of student performance: not just assignment submissions, but also poll answers, worksheets completed in breakout groups, and even spoken contributions made during class.

The sheer amount of data generated by a single student in a typical week of classes is overwhelming. The design challenge was daunting: how could we turn that data into scored assessments with actionable feedback for students, while also letting faculty grade with as little friction as possible?

My Role

I was the Product Designer on this initiative, working closely with our Chief Learning Scientist, our engineering team, our Founding Dean, and faculty members.

Purpose

To create a seamless system that empowers both students and faculty by leveraging data to enhance learning outcomes and streamline the grading process, aligning with Minerva’s mission of active, data-driven education.

Outcome

Today, these tools are used by Minerva and its global partners to grade over 5,000 classes and 20,000 assignments per semester, helping accelerate student learning.

Breaking the Process into Two Products

We divided the assessment tool into two distinct products to address their unique inputs and needs. One product focused on evaluating items generated within the classroom, such as verbal contributions and poll answers. The other targeted written assignments completed outside the classroom.

Class Grader

Initial Research

For Class Grader, our research began with discussions with deans to determine product requirements. Each item needed to be scored using a predefined rubric and optionally include a grader comment. Faculty highlighted a concern about identifying specific moments in the vast classroom data, frequently requesting: “Let me quickly find a particular student’s contributions.”

Designing and Testing Assumptions

I began by sketching a variety of potential layouts and, after feedback from faculty, settled on a strategy of structuring the screen into thirds: one main column to house the elements to grade (either the classroom video with its “transcript” list of accessible classroom moments, or the poll answers), one column to filter what appears in the main column, and a third column to assess the selected element.

With this sketch as a guide, I created higher-fidelity designs and eventually a clickable prototype that tested well with faculty. Buoyed by the sense that I was on the right path, the Chief Learning Scientist and I worked with engineering to get the first version of the tool into production.
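To make that three-column structure concrete, here is a rough sketch of how such a screen could be wired together in React/TypeScript. The component, type, and filter names are purely illustrative assumptions for this write-up, not the production code.

```tsx
// Hypothetical sketch of the three-column Class Grader screen.
// Names and fields are illustrative, not the shipped implementation.
import React, { useState } from "react";

// A gradeable element: a spoken "moment" from the class transcript or a poll answer.
type ClassItem = {
  id: string;
  student: string;
  kind: "moment" | "poll";
  text: string; // transcript excerpt or poll answer
};

export function ClassGraderScreen({ items }: { items: ClassItem[] }) {
  const [studentFilter, setStudentFilter] = useState("");
  const [selectedId, setSelectedId] = useState<string | null>(null);

  // The main column shows only the items that pass the filters chosen in column one.
  const visible = studentFilter
    ? items.filter((i) => i.student === studentFilter)
    : items;
  const selected = visible.find((i) => i.id === selectedId) ?? null;

  return (
    <div style={{ display: "grid", gridTemplateColumns: "1fr 2fr 1fr", gap: 16 }}>
      {/* Column 1: filters, e.g. quickly find a particular student's contributions */}
      <aside>
        <input
          placeholder="Filter by student…"
          value={studentFilter}
          onChange={(e) => setStudentFilter(e.target.value)}
        />
      </aside>

      {/* Column 2: the elements to grade (transcript moments or poll answers) */}
      <main>
        {visible.map((item) => (
          <button key={item.id} onClick={() => setSelectedId(item.id)}>
            {item.student}: {item.text}
          </button>
        ))}
      </main>

      {/* Column 3: rubric score and optional comment for the selected element */}
      <aside>
        {selected ? <p>Assessing: {selected.text}</p> : <p>Select an item to grade.</p>}
      </aside>
    </div>
  );
}
```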

Enhancing Usability

Usability sessions revealed that grading was still time-consuming. We added features like faster video playback for efficiency and transcript filtering by length to prioritize substantive contributions. A bookmarking function allowed professors to mark moments during class for easy retrieval. Keyboard shortcuts streamlined grading actions, from scoring to saving.
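For a sense of how lightweight that shortcut layer was, the sketch below shows one way it could be wired up as a React hook. The specific key bindings and action names here are assumptions made for illustration, not the bindings we actually shipped.

```tsx
// Hypothetical sketch of a keyboard-shortcut layer for grading actions.
// Key bindings and handler names are illustrative assumptions.
import { useEffect } from "react";

type GradingActions = {
  setScore: (score: number) => void; // score the selected item against the rubric
  save: () => void;                  // persist the current assessment
  nextItem: () => void;              // jump to the next transcript moment or poll
  cyclePlaybackSpeed: () => void;    // step through faster video playback rates
};

export function useGradingShortcuts(actions: GradingActions) {
  useEffect(() => {
    const onKeyDown = (e: KeyboardEvent) => {
      // Ignore keystrokes while the grader is typing a comment.
      if (e.target instanceof HTMLInputElement || e.target instanceof HTMLTextAreaElement) return;
      if (e.key >= "1" && e.key <= "5") actions.setScore(Number(e.key));
      else if (e.key === "s") actions.save();
      else if (e.key === "n") actions.nextItem();
      else if (e.key === "f") actions.cyclePlaybackSpeed();
    };
    window.addEventListener("keydown", onKeyDown);
    return () => window.removeEventListener("keydown", onKeyDown);
  }, [actions]);
}
```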

Lessons Learned

The enhancements improved efficiency, enabling faculty to locate and grade classroom moments faster. Assessments per student increased by 40%, strengthening feedback loops and boosting student performance. These improvements highlighted the value of iterative design and user-focused enhancements in educational tools.

Assignment Grader

Initial Research

Class assignments were the second-largest factor in final grades. A tool was needed to process PDFs, enabling faculty to select text for contextual grading and comments. The design included a toolbar for assignment selection, a main display column, and an assessment column. Instructors could highlight text to open a modal for assigning grades, selecting outcomes, and adding comments, ensuring consistency across tools.

Designing and Testing Assumptions

Following the pattern established with Class Grader, I sketched a number of potential layouts and, after feedback from faculty, settled on a familiar structure: a toolbar for selecting the assignment to grade, a main column displaying the submitted PDF, and a third column for assessing the selected portion of the work.

From those sketches I created higher-fidelity designs and a clickable prototype that tested well with faculty, and we again worked with engineering to bring the first version of the tool into production.

Release and New Needs Discovered

After launch, plagiarism detection emerged as a key issue. Faculty relied on a manual third-party process, so we partnered with Unicheck to integrate an “originality score” for quick plagiarism alerts.

Grading policies also required tracking “foreground” learning outcomes. Initially manual, this process was streamlined with a new tracker feature, well-received during testing and integrated successfully.

To allow portions of a submission to be graded in context, I worked with our Chief Learning Scientist on a feature that let the instructor highlight a portion of text, which would then open a modal.

Within the modal, the grader could select from a list of outcomes specific to the class, assign a grade, and add an optional comment. We borrowed from the pattern designed for the class grader here to establish parity between the two grading applications.
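To give a sense of what each in-context assessment captured, here is a simplified, hypothetical shape for that data in TypeScript. The field names and score scale are illustrative assumptions, not the actual schema.

```ts
// Hypothetical shape of an in-context assessment attached to a highlighted span.
// Field names and the score scale are illustrative, not the real schema.
type InContextAssessment = {
  submissionId: string;
  highlight: { startOffset: number; endOffset: number; quotedText: string };
  outcomeId: string;   // one of the learning outcomes available for the class
  score: number;       // rubric score
  comment?: string;    // optional grader comment shown to the student
};

// Example: grading a highlighted passage against a class outcome.
const example: InContextAssessment = {
  submissionId: "sub-123",
  highlight: { startOffset: 480, endOffset: 612, quotedText: "…selected passage…" },
  outcomeId: "effective-argumentation",
  score: 4,
  comment: "Strong claim, but the supporting evidence needs a citation.",
};
```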

Outcomes

The class and assignment grader tools launched with relatively smooth rollouts but faced initial efficiency challenges, requiring significant iteration.

Today, these tools support Minerva and its global partners in grading over 5,000 classes and 20,000 assignments per semester, significantly enhancing student outcomes.

A key personal insight from this experience was understanding the context of faculty workload. Grading is just one part of their responsibilities, but the feedback-intensive approach of our pedagogy requires more grading time than at other institutions. Therefore, even minor efficiency improvements in the tools deliver meaningful benefits to faculty members.

The scope and depth of this project go beyond what’s presented here. I’d be glad to share a more comprehensive look at the process if you’re interested.

87%

Increase in assessment efficiency

527k

Classes & assignments graded


Interested in collaborating?

© Copyright 2025
