Learning Media Design
My team and I worked with the Research Science course at Winchester Thurston, an independent school in Pittsburgh, for our Learning Media Design final project, taught by Professor Marti Louw. As part of the course objectives, we set out to design a learning media solution aimed at improving documentation, assessment, and portfolio practices. After three phases of design and exploratory analysis, our final product was a tool that helps project-based learning instructors assess 21st-century competencies within a traditional grading system.
Phase 1: Expert Interviews
Understanding Expert Practices
To understand how experts document their work, we conducted interviews (n=3) with graduate students in human-computer interaction, interaction design, and the METALS program. We recruited three students who each displayed exemplary portfolio practices, as judged by the quality of their personal sites. Each student was in a different degree program, though all were studying user interface or experience design in one form or another, and two came from an architecture background. Having a diverse set of students was important so we could identify similarities and pain points universal to portfolio and documentation practices across disciplines.
Our goals were to
understand how experts document their work
learn how the project shapes documentation practices
identify the best and worst practices of documentation
learn how tools and environment influence the documentation process
The interviews were highly informative, but in ways we did not initially expect. As we probed on process, hoping to elicit a sequence of activities, we instead heard about individual decisions and why each expert documents the way they do. To better articulate the differences in our experts' beliefs and styles, we opted to use an identity model for final consolidation. This model highlighted contrasts and similarities between conventions and styles, and led us toward our overall lens of structure and openness.
We learned that experts build their portfolios and document their process in a highly structured way that becomes integral to their working process. However, they all dislike when documentation is imposed or forced upon them, as it produces non-organic documentation. They prefer more open work styles that fit their own habits and patterns. This led us to consider the value of openness and structure at large, and would eventually lead us away from forcing students to document their work.
Phase 2: Stakeholder Interviews
We conducted two stakeholder interviews, one with Mr. Adam Nye and one with Mr. Graig Marx. Mr. Marx is our primary stakeholder, as our intervention focused on his course, Research Science.
Stakeholder 1: Adam Nye, Assistant Head for Educational Strategy at Winchester Thurston
In our interview with Mr. Nye, we discussed the implementation challenges of PBL in formal education. In PBL, formative and summative assessments, classroom activities, and curriculum are interdependent components of a flexible knowledge-development system (Barron & Hammond, 2008). These assessments are often not standardized across projects within a class. Additionally, students end up at different stages of the design process, resulting in grading inconsistencies. Another challenge highlighted in project-based learning was how to demonstrate student performance to outsiders (i.e., to employers through portfolios, or to higher-education admissions through a transcript).
Stakeholder 2: Graig Marx, Instructor of Research Science at Winchester Thurston
We learned a lot about the instructor's central role in Research Science. To foster a more open PBL environment, the instructor favors less structure (i.e., seldom using rubrics and assessments). He has difficulty determining which projects are better than others, trouble anticipating which student groups will get farther in the design process, and no systematic process for evaluating student learning.
Our synthesis of these interviews focused on identifying problems we might target in our design. We consolidated the information into a diagnostic map, categorizing problems, causes, consequences, and possible solutions. Our data fell into several broad categories, including Grading, Motivation, Peer Evaluation, Space, and Mentors. The diagnostic map helped us see the relationships between these high-level items and think about the overall course structure from Mr. Marx's perspective. We started to see that the structure of the course was limited, with Mr. Marx serving as the hub of all interaction. One of our key findings is that Mr. Marx assesses students nearly every day, but informally, piecing together student and project progress over time and comparing that information only against his own intuition.
Our classroom observation showed us how students interact with each other and with Mr. Marx. Throughout the hour-long session, Mr. Marx went from one group to the next, got a brief catch-up on their design proposal (the current project phase), and then provided group-specific help and guidance. He repeated this until he had reached every group, which took until the end of the hour.
Several observations provided useful insights. One group was of particular interest, as they were noticeably working inefficiently. This group seemed to have a history of wheel-spinning; they were the only group to which Mr. Marx had sent research articles directly. Later in the class, Mr. Marx publicly announced the due date of the design proposal, which spurred the group into action; they repeatedly said they had to work because they had a deadline the next week. This group exemplified many of the problems we found during diagnostic mapping: students were unaware of deadlines, some groups worked better than others, students had differing motivation styles, and some group projects got off to a faster start without scaffolding.
The insights gained from our observations didn't lend themselves to their own synthesis model, but they added value to our diagnostic mapping and helped identify pain points to guide our design solution.
Identified Problems in Assessments
Our diagnostic mapping and affinity diagramming surfaced three problems in assessments:
Grading inconsistencies and subjectivity
Evidence of performance not collected
Unequal feedback among groups
Phase 3: Prototyping
In our next phase, we addressed the problems identified in our user research. Our prototypes were low-fidelity UI mockups of the application (digital) and storyboards of the integrated system (digital and physical). In these mockups, role prototypes were an effective way to figure out how our system could assist the teacher in documenting feedback. We used both digital and physical prototypes because the instructor requested a hybrid system containing both written notes and a digital component. Our team prioritized a seamless experience for the instructor while achieving the goals we set for the system, such as making feedback and assessments more transparent to students.
We wanted to mimic, as closely as possible, the experience the teacher would have interacting with the system. The goal was to pick his brain and see how he visualized his interactions with it. Many of our questions pertained to the specifics of how he would use the system: How often should students be able to see their reports? Will students have access to the system? How will we build a product that helps the teacher while maintaining the student-driven nature of the course? And how will we know if our system achieves the desired results? We went into the field with prototypes to see how the instructor would actually use the system.
We defined metrics to determine whether the system accomplishes those goals (e.g., are students reflecting more, or just being burdened by unnecessary assignments and bombarded by confusing visualizations of their competencies?). Some of these questions were resolved by getting feedback on our prototypes. The long-term metrics could not be measured within our time constraints; future work would delve into the long-term implications.
Final Product: Zipper
For our final product, we consolidated the Phase 3 prototypes with the user feedback and findings from our prototype testing. We presented the final demo to the instructors and principal of Winchester Thurston.
Future Work and Vision
Our next steps would be to continue user testing to understand how our assessment tracker could be integrated into other project-based learning courses. Our vision is to strike a balance between structure and openness and to improve the documentation of assessments.
Structure & Openness
Instructors impose structure while maintaining the open nature of the course
Instructors focus on providing evidence of students' skills and learning throughout the process
Our solution facilitates organic instructor documentation without being intrusive