EngageUW - A Learning Analytics Solution for UW

User-centered design project conducted over 12 weeks in Spring 2019


For my senior capstone project, I was lucky to partner with the UW Professional and Continuing Education department (PCE) to help research and redesign certain aspects of their online education platforms. At the time, fully online courses were not the norm; only a select number of certificate classes and graduate programs were delivered entirely online or as a mix of online and in-person instruction. The events of 2020 have since made online education a priority at every level, from pre-school through post-graduate.

I hope the results of this project can give some insight into what is possible through current online education platforms and how they can help shape better education outcomes for students in the future. Even when we go back to a "normal" education system, I believe utilizing online education platforms and tools will help bridge inequality gaps and create new opportunities for learning.

I recorded the entire process of this project in the process book linked below. On this page, I will focus most heavily on the research aspects of the project. Though we went through the full user-centered design process, I want to emphasize my role as lead researcher and highlight the work I was most heavily involved in. For a full picture of the project, please see the process book linked below.


Research Questions and Context

Before starting anything, we shaped our plan based on these two research questions:

  1. What data should UWCC focus on gathering and analyzing to help instructors monitor student performance?
  2. How can this e-learning data be best utilized to improve outcomes for students and instructors/course developers of the UW PCE?

The University of Washington uses several different platforms to support education both in person and online. These platforms collect and provide vast amounts of data about each student. However, they often surface data points in ways that are not meaningful to educators, or fail to provide the data points educators believe would be helpful. The PCE was planning to adopt a Learning Record Store (LRS) to manage student data from several different sources. This could greatly benefit an educator's ability to understand students and their own courses, but it was still unclear which data should flow into the LRS and how best to display them for educators.
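To make the LRS idea more concrete: Learning Record Stores typically collect learner activity as xAPI-style "statements" from each connected tool. Below is a minimal, purely illustrative sketch (in Python) of the kind of record such a store might aggregate from a platform like Canvas; the identifiers and values are hypothetical, not the PCE's actual data model.

    # Illustrative only: a minimal xAPI-style statement, the kind of record an
    # LRS typically aggregates from connected learning tools. All identifiers
    # and values below are hypothetical.
    statement = {
        "actor": {"name": "Student A", "mbox": "mailto:student-a@uw.edu"},
        "verb": {
            "id": "http://adlnet.gov/expapi/verbs/completed",
            "display": {"en-US": "completed"},
        },
        "object": {
            "id": "https://canvas.uw.edu/courses/1234/quizzes/5",
            "definition": {"name": {"en-US": "Week 3 Quiz"}},
        },
        "result": {"score": {"scaled": 0.85}, "completion": True},
        "timestamp": "2019-05-02T18:30:00Z",
    }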

The main goal of this project was to discover which data points teachers already utilize, which data points they wish they could utilize, and how to display this information to them most effectively. I determined the best method for answering these questions would be an interview study.

Exploratory Research - Interviewing

When initially devising our research study, I wanted to ensure that we were getting a full picture of how online courses worked within the PCE. To do so, I felt we should get perspectives from both main user groups in an online class: instructors and students. We conducted 9 interview sessions over a one-week period: 6 with PCE instructors and 3 with PCE students. We also conducted a small survey study with 33 students to better understand their sentiment about how their education data was being used.

After completing the research and discussing results with our stakeholders, we decided to design specifically for the instructor side of our product. However, our stakeholders were very interested in the research about students and their use of online education systems. It was a perspective they had not seen much information on, and they felt they might follow up on the idea in a more formal research study down the line.

Here, I will focus specifically on the instructor interview planning process and its results. For more information and a look at the student research, please see the full interview report.

Secondary Research

Before talking to any instructors, I felt I needed to be more familiar with the current systems that instructors use. I conducted background research into the designs and technical details of some of the products available to UW instructors.

I was fortunate to have recently TA'd a course, which gave me access to some of the main online platforms UW instructors use to look at student data. I had some familiarity with these systems already, but I dug deeper to see what data was currently available and how it was being displayed to instructors.

Screens from UW learning analytics platforms. More detail is available in the appendix of the full research report.

Planning

With a better understanding of some of the current UW learning platforms, I wrote out several lower-level research questions I wanted the interview study to answer:

● What student data/data points do instructors currently utilize (if at all) and how do they use them?
● What student data/data points are currently available in multiple systems that could be combined to give more insights and/or make instructors' jobs easier?
● How can we best illustrate student growth for instructors?
● What do instructors see as student competency?
● When do instructors feel they need to become more involved with a student or students to ensure they are successful in a course?
● How can we measure student competency in a topic area? Over multiple topic areas?
● Which measurable factors tracked by current learning analytics correlate with one another in ways that could be meaningful to instructors?
● How do students think their data is being used?
● How do students think their educational experience could be improved by their instructors?
● How could the implementation of a Learning Records Store (LRS) make it easier for teachers to improve their courses?

From these research questions, I created 7 interview questions to learn as much as I could about instructors and their relationship with these online systems. I chose a semi-structured style of interviewing so I could be flexible during interviews and follow up on any new ideas that came up. Here are a few of the interview questions, each with its rationale and potential follow-up questions (FUQs):

Q1: What classes do you teach? How long have you been teaching?
Rationale: This question is mostly for demographics and introductions, so we can get a better idea of the teachers’ experience.

  • FUQ1: What’s your favorite part about teaching? What’s your least favorite part about teaching?
  • FUQ2: If they have not mentioned the PCE or teaching for it: Do you teach for the PCE?
    If so: How has teaching classes for the PCE been different from your other teaching experiences?
  • FUQ3: What’s the hardest part about teaching?

Q3: What online platforms do you use for distributing education materials?
Rationale: This question starts to drill deeper into what UW instructors are using in their courses and how learning analytics fits into their experience.

  • FUQ1: How have these platforms affected your teaching experience?
  • FUQ2: Do you know if these platforms have any data tracking or analytics capabilities? (If they need an example, point to the learning analytics section of a Canvas course page.)
  • FUQ3: If so: How have you utilized these data tracking systems?

Q4: Have you noticed any strong correlations between different aspects of a class and student engagement or learning?
Rationale: This question gets at the core of the design question, aiming to figure out how student data can best be utilized and visualized by a learning analytics platform in ways teachers would understand and find helpful.

  • FUQ1: If so, have you ever used metrics to measure these correlations?
  • FUQ2: What student data do you think could be tracked and analyzed to help you understand student engagement more?
  • FUQ3: What student data do you think could be tracked and analyzed to help you understand student learning more?

Q6: In the classes you have taught, how did you keep track of student performance?
Rationale: The instructors we interview will likely have different levels of experience tracking and analyzing student data themselves. This question will help us understand the different needs of instructors with more or less technology/data experience and comfort.

  • FUQ1: (If they use more than the most basic system / Canvas grades) What data points have you tracked?
  • FUQ2: What programs have you used to analyze student data?
  • FUQ3: What has been helpful about the way you have been tracking student performance?
  • FUQ4: What are some challenges or pain points you have faced with tracking student data/performance?

Interviews and Analysis

I scheduled 6 interviews with UW PCE instructors by reaching out through an email list. Because of our tight research schedule (2 weeks), I had to conduct all the interviews within a five-day period. Though scheduling everything in this time frame was difficult, I was able to complete every interview through a mix of in-person and online sessions.

Each interview was around 45 minutes and included a facilitator and a note-taker. I facilitated 5 of the 6 interviews, though I ended up writing many notes myself alongside the note-taker. After each interview, we took 15 minutes to reflect on any salient ideas/themes and then discuss our thoughts. I recorded each interview so transcripts could be used to confirm notes and gather quotes during analysis.

Using thematic analysis, we worked through a large amount of interview data and synthesized the results into several major themes that came up throughout the interviews.

An excerpt from our thematic analysis. Check out the full report to see all the findings.

Results

In the end, I identified 7 key findings, each including explanations, justifications, and recommendations. Here are a few excerpts; full details can be found in the final report:

Key Finding #1: Instructors felt there were several areas/data points that could be better measured to help them teach more effectively.

Result: Though some instructors were not sure exactly what could be measured to help them teach more effectively, most had ideas about things they wished they had more insight into in their classrooms:

  • Measure and display pockets of students who are falling behind the class average ("pockets of students" rather than individuals, to avoid flagging an inevitable one or two outliers); a rough sketch of this idea follows below
  • Highlight at-risk students, especially those that show trends in line with students who may be likely to drop out of a program

...
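To illustrate the first bullet above, here is a minimal sketch (Python, with made-up data and thresholds of my own choosing) of how a "pocket" of students trailing the class average could be flagged without surfacing lone outliers:

    from statistics import mean

    def flag_falling_behind(grades, threshold=0.10, min_group=3):
        # `grades` maps student id -> current grade as a fraction (0.0-1.0).
        # A "pocket" is reported only if at least `min_group` students trail
        # the class average by more than `threshold`, so one or two outliers
        # are not flagged on their own.
        class_avg = mean(grades.values())
        behind = [s for s, g in grades.items() if g < class_avg - threshold]
        return behind if len(behind) >= min_group else []

    # Hypothetical roster: three students trail the average by a wide margin.
    print(flag_falling_behind(
        {"s1": 0.93, "s2": 0.90, "s3": 0.88, "s4": 0.62, "s5": 0.60, "s6": 0.58}
    ))  # -> ['s4', 's5', 's6']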

Quotes:

  • "I would like to see every single thing on a dashboard and then you can zoom in on one thing and then it expands and now you see the whole student, and then I want to zoom in on one assignment, and then it expands and I see that all the detail of that assignment. You see - like zoomable, go as deep as you can go." - P3

...

Design Recommendation: Ensure that all of the findings mentioned above are included in our designs, where possible and feasible. Some of the points mentioned by instructors are already available to them, such as seeing which parts of a Panopto video students are watching. Specific data like this should be available in an easy-to-find, easy-to-view format.

Key Finding #3: Teachers' current usage of available learning analytics was mixed, with instructors of purely online and self-paced classes having the least awareness of and knowledge about them.

Result: Interviewees showed a mixed level of awareness/usage of the learning analytics and student data currently available to them. Some instructors were very aware of learning analytics available through Canvas and used them in various ways, some knew of their existence but did not use them for anything besides checking grades, and some were completely unaware of them.

...

Design Recommendation: Our design needs to be accessible to a wide range of teachers with varying degrees of experience using learning analytics. Again, an onboarding process may help get all instructors who use this product on the same page. Our design should also be attractive to instructors of self-paced classes by giving insights that help them better understand their students. We need to ensure that instructors understand exactly what can be gained from the learning analytics, how they can improve their experience as instructors, and how they can benefit their students' experience and retention rates. This explanation should also be included in the onboarding process so that the product does not get ignored after initial exposure.

Conclusion

Our stakeholders were very pleased with our results and believed we were heading in the right direction for our designs. Though we had discovered several important findings, we now needed to use these findings to design a viable product.

Ideation, Prototyping, and Validation - A Brief Overview


As mentioned previously, I will only briefly go over the ideation, prototyping, and validation stages of this project.

Ideation

After our exploratory research, we used multiple ideation methodologies to better understand how our users could utilize our product in their workflow and to explore several ways our design recommendations could be implemented. We used three main ideation methods:

  • Persona Building
  • Storyboarding
  • Sketching

Here are some of the results of these methods. For a more in-depth explanation, please refer to the process book.

Our first persona, Katy. Katy teaches both online and in-person classes but feels she is unable to help students online as much as her in-person students.

Prototyping

With our initial sketches and ideas in mind, we moved toward creating a full-fledged prototype of our product to test with PCE teachers, our target users. We utilized three different prototyping methods that built upon each other.

  • Paper Prototypes and Testing
  • Annotated Wireframes
  • High Fidelity Prototypes

Here are some of the screens from these various prototypes. For a more in-depth explanation of the prototyping process, please refer to the process book.

A paper prototype screen I made. I designed this to allow instructors to compare current course stats with previous quarters.
Here is the homepage for our final prototype. It's divided into two main sections: data about individual students and data about the course as a whole.

Validation

Now that we had completed our high-fidelity prototype, we wanted to ensure that our designs were usable and logical. We ran usability tests with our target users to see if they could easily understand and use our product. Though we were towards the end of our project, this would give us the opportunity to make any last-minute updates to the design and gather sentiment on our product before passing it off to our stakeholders.

Planning

Our usability tests were thorough but relatively short. We tested the core functionality of our prototype through 5 tasks, given a scenario of using EngageUW while teaching a course.

Before each test, we asked participants to answer some rating questions to get a better understanding of how comfortable each participant was using learning analytics platforms. After each task, we asked participants to rate the difficulty of the task and answer follow-up questions about the task. At the end of each test, we asked participants about their thoughts on the product and test as a whole.

Testing

We conducted 5 usability tests with instructors. Overall, these tests went well and gave us very valuable information. Though they were brief, they provided many details on small updates we could make to our prototype. Instructors really enjoyed using the prototype, and many wanted to try it out with their own current classes.

With our testing done, we made various updates to the prototype accordingly and synthesized all our work into a final findings and recommendations report. We included this alongside our prototype in the final deliverables for our stakeholders.

Final Findings and Recommendations

Along with our final prototype, we included a revised version of our 7 key findings and 3 major design recommendations for any learning analytics platform. The full explanation can be found in the final report below, but here is a small excerpt from one of the key findings and design recommendations:

Key Finding 1: Value of Student Risk Profiles and Indicators

Student risk profiles and indicators were the features most valued by instructors during our usability testing.

This was also an idea that came up several times during our interviews with instructors in the research phase. Instructors, especially those whose classes do not meet in person regularly, are very interested in knowing which students are falling behind the class average, as this is a major indicator that a student is at risk of dropping the class or the program as a whole. Instructors also believe there are other factors that can indicate falling behind, such as:

  • Not engaging with instructor feedback.
  • Inactivity on e-learning platforms such as Canvas or Panopto.
  • Sudden deviation in grade or engagement trends.

Design Recommendation 1: At-Risk Student Analysis and Messaging

Analysis of at-risk students was the most well-received element of our design. Instructors would value the ability to view risk indicators for individual students, and they are interested in a feature that allows them to select groups of students based on their risk status and send a message to the group.

We recommend that a final application include a ranking of students using various metrics such as grade, engagement, and risk indicators. Instructors should be able to sort, select, and message students using these metrics.
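Here is a minimal sketch of how such a ranking and selection could work, assuming hypothetical per-student metrics; the field names and thresholds below are illustrative, not part of the delivered design:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class StudentRow:
        name: str
        grade: float        # current grade as a fraction, 0.0-1.0
        engagement: float   # e.g. share of course materials viewed
        risk_flags: List[str] = field(default_factory=list)  # e.g. ["inactive 14 days"]

    def rank_by_risk(rows: List[StudentRow]) -> List[StudentRow]:
        # Surface the highest-risk, least-engaged students first.
        return sorted(rows, key=lambda r: (-len(r.risk_flags), r.engagement, r.grade))

    def select_for_message(rows: List[StudentRow], min_flags: int = 1) -> List[StudentRow]:
        # The group an instructor might bulk-message: anyone carrying at
        # least `min_flags` risk indicators.
        return [r for r in rows if len(r.risk_flags) >= min_flags]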

This updated table uses accessible colors and allows instructors to learn more information about students.

 

Final Reflection

This project was a great way to end my college career at UW. It gave me the chance to really flesh out a research study and do a complete run-through of the user-centered design process. It was also my first opportunity to work closely with stakeholders and use that relationship to create something good while ensuring a client's needs were met. Our stakeholders were very happy with our final deliverables and may already have implemented some of our features into the UW curriculum, especially with the new requirements of online learning and the increased focus on the functionality and usability of online platforms.
