Connecting Campus: Resourcing Mental Health at UofT

User Experience Research

UX Research • Comprehensive • MidFi

Group Master’s Project

December 2021

Why is it that poor mental health has become ubiquitous in the university student experience? This is a loaded question, and one that my teammates and I aimed to take on in this end-to-end UX project. To do this, we spent our semester embarking on a design thinking journey from research to analysis to wireframing, and then back to more research. This involved one sprint per phase, each followed by a playback that crafted our user’s iterative story from start to finish.

Our Problem was the state of stress management and resource navigation at UofT.

Our Process was 4 research sprints that followed IBM’s Design Activation journey, conducted over a 2-month timeline.

Our Solution was a resource management platform for students, optimized to alleviate their three biggest pain points: access, convenience, and trust.

My Role was to co-facilitate user research with the group, craft our story, lead brainstorming, and produce our final video.

With that established, we can explore how we went about this.

Background

UofT graphic from official recruitment campaign. Source of image: https://i.ytimg.com/vi/SYd-jjrUvug/maxresdefault.jpg

95,000.

That’s how many students are enrolled at the University of Toronto in 2021. With 3 campuses, 7 colleges, 18 divisions, 900 student groups, and thousands of resource offerings, there’s a place for every student to feel heard, find their community, keep on top of their academics, and easily seek help when they need it, right?

Screenshots of news articles that highlight the mental health crisis at UofT

… well, it’s complicated.

In a university-wide community survey we conducted, we found that more than half of respondents used Google or Reddit as their main means of resource discovery. And while almost all of them reported above-average to extreme levels of stress during their degree, half of them didn’t make use of any on-campus support available to them. More than two-thirds said that they feel a below-average level of support from the university. Students are struggling to keep up.

But let’s pause and rewind for a bit - how did we get here?

Discovery

We start our journey at discovery. For our desk research, we made use of research papers, online articles, and press releases. For our user research, we had a total of 10 interview participants and 22 survey participants, from whom we were able to draw qualitative data. This was conducted over 2 weeks as we scoped out our problem space and deepened our understanding of the mental health landscape at UofT.

Highlights from our user research include the following quotes:

“As much as I would like to, I don’t really have the time to sift through things outside of a class context. I’m afraid that, if I wait long enough, I’ll be forced to make time”

“Everything I know about mental health resources comes from other students - there’s a shift in how information is being passed on”

“There just never seems to be somebody professional, who is also readily available, that I can talk to. It’s making me lose trust in the university”

“No doubt the university has valuable resources, it’s just that they never present it to us in a way that students would understand”

Once this round of research was completed, we gathered our data and took to the digital whiteboard. Over the course of two workshops, we broke down our data points and clustered them into an affinity map that revealed 5 key data clusters:

  • Students are busy with school and life stressors, limiting their capacity to actively seek out and use resources in a timely way.

  • University messaging about these resources feels impersonal and uncompassionate.

  • Students who seek out information often find it inaccessible, unavailable, and overwhelming.

  • This delivery has created isolation and distrust in university resources, with students struggling to find resources that effectively address their problems.

  • Despite relying on them for information, students lack support and empathy from faculty and administration, turning instead to personal networks to fill this hole.

In a perfect world, students would be able to access resources easily and during times that can accommodate their schedules, and quickly engage with support without needing to wait. They would see mental health resources integrated into their school life, increasing their awareness and knowledge of relevant information. They would have trust in the competence and compassion of the University when seeking help. And ultimately, they would feel supported and be part of a tight-knit community composed of both students and faculty.

Our findings revealed to us that the worlds these students were living in were anything but perfect.

Analysis

With our problem space scoped out, we were ready to consolidate our insights and put ourselves in the shoes of our user. This analysis started with the creation of a primary persona that we could turn to for guidance and direction as we began solutioning.

Meet Stressy Sammy:

Stylized depiction of the "Stressy Sammy" persona. Info from top-down, left to right: Photo, Name, Quote, About, Bio, Wellness Goals, Frustrations, Personality, Feelings, Task List

Sammy’s persona was informed and enriched by an empathy map that we divided into four interaction categories: “Says”, “Thinks”, “Does”, and “Feels”. This is what it looked like:

Empathy map that builds Sammy's persona, top left quadrant saying "Says", top right saying "Thinks", bottom left saying "Does", and bottom right saying "Feels"

If we intended on staying true to Sammy’s emotional journey, knowing their emotions in a vacuum was not enough - we needed to map out their process. We adapted and extrapolated the insights from our empathy map to build out Sammy’s step-by-step journey when navigating UofT’s mental health resources. The result was a four-step journey that we consolidated into 4 unique pain points:

  1. Sammy finds that messaging from the university regarding mental health is time-consuming and feels disingenuous, affecting their trust in the university’s ability to help them.

  2. Sammy is confused and overwhelmed by the current information due to scattered resources that are not easy to go through.

  3. Resources available at the university are not embedded in Sammy’s university life, which makes it difficult for them to access in their busy schedule.

  4. Sammy wishes that they could get much-needed help immediately rather than having to wait a month for advice and solutions.

We chose these four pain points through a dot-voting process that was guided by relative frequency and potential for breadth in the problem space. This is how these pain points looked in the context of our journey map.

As-is user journey that shows the 4 determined steps, and highlights the 4 relevant insights that we will move forward with

Ideation

Having converged on our primary pain points, it was time to once again embrace ambiguity through our ideation process. This started with a “big ideas” workshop, where we put our heads down and came up with as many ideas as we possibly could in 15 minutes. Emphasis was placed on quantity over quality in this session, and our ideas were guided by Sammy’s pain points and needs. We then gathered these ideas and grouped them on the basis of similarity. To zero in on the ideas that we would take with us into our proposed solution, we each had a fixed number of votes that we would assign to our idea clusters - half being votes for “impact”, and the other half for “feasibility”. Once tallied, we plotted our ideas into a prioritization grid that would help us get a better understanding of what we could (and should) move forward with.

Here are some snippets from our workshop:

 
Collage of pictures from the ideation workshop
 

We transcribed the results of this workshop back onto our digital whiteboard, from which we came out with the following digital prioritization grid:

Digitized version of the prioritization grid, with the x axis being Feasibility and the y axis being Impact

As is outlined in the grid, we segmented our big ideas into four categories. Ideas that received no votes for either criterion were left in the “cutting room floor” category (outside the grid). Ideas with imbalanced votes totalling four or fewer were classed as “marginal gains” (bottom left). Ideas with moderately balanced votes that had fewer than four votes in either criterion were “tough decisions” (middle). Ideas with balanced votes that had more than four in either criterion were our “home runs” (top right).
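For readers who like their sorting rules explicit, the categorization above can be sketched as a small classifier. This is purely illustrative: the function name and the "balanced means within one vote" threshold are assumptions, since in the workshop we judged balance by eye rather than by formula.

```python
def classify_idea(impact: int, feasibility: int) -> str:
    """Place a big idea into a prioritization-grid category from its
    "impact" and "feasibility" dot-vote counts.

    Note: the balance threshold (counts within 1 of each other) is an
    illustrative assumption, not a rule we formalized in the workshop.
    """
    balanced = abs(impact - feasibility) <= 1
    if impact + feasibility == 0:
        return "cutting room floor"   # no votes on either criterion
    if balanced and max(impact, feasibility) > 4:
        return "home run"             # strong, even support on both axes
    if balanced:
        return "tough decision"       # even but modest support
    return "marginal gain"            # lopsided support, low total

# Example: an idea with 5 impact votes and 5 feasibility votes
print(classify_idea(5, 5))  # home run
```

The useful property of encoding it this way is that the "home run" label demands both breadth (votes on each axis) and depth (more than four on one), which is exactly why those ideas survived into the proposed solution.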

Based on these results, we decided that we would proceed with a solution that integrated some combination of our home runs. The user stories were as follows:

Three digitized sticky notes showing slightly different depictions of the "Rating System" concept

Users are able to view resources, provide testimonials, and leave ratings, in a system akin to “Rate My Professor.”

Two digitized sticky notes showing slightly different ideas of the "One-stop-shop" concept

A platform for tailored navigation through all on-campus resources - including faculty, department, college, and campus-wide offerings.

Three digitized sticky notes showing slightly different ideas of the "Buddy System" concept.

A Buddy System that helps people connect with community members to manage stress, applicable to online forums & physical spaces.

In combining these three home-runs, we envisioned a to-be journey that improved the user experience at every step of the process. We represented this “perfect” journey through the following comic strip (read from left to right, top to bottom):

Comic panel 1: Sammy is pictured thinking, saying "Where can I quickly find help?"

Faced with the daily stressors of school, Sammy ponders where they can turn to for help.

Comic panel 2: Sammy with a happy expression saying "It's so easy to find all of this information"

Sammy can find all the information in one convenient place. They also have access to an online student forum for immediate help.

Comic panel 3: Sammy has a positive expression, looking at a 5-star review of the app on their phone, saying "Others seem to like it, so maybe I'll give it a shot"

Sammy can more easily identify the resource that best fits their needs by reading reviews and past student experiences.

Comic panel 4: Left side of image shows Sammy clicking a button on the phone, saying "Immediate booking?". Right side shows them with a positive expression, saying "This is so convenient"

Sammy can easily see what resources fit into their schedule and book multiple appointments in one place. They are able to find more timely resources when the one they wanted has a long wait time.

Comic panel 5: Left side of image shows a login page for an Art Therapy class that Sammy is signing up for. Right side shows Sammy in the class, focused on their task

Sammy accesses their resources and also has immediate access to a supportive community of students. Because Sammy contributes to this community, they feel heard, and through that contribution, Sammy can become a champion of mental health for other students.

Comic panel 6: Final depiction of Sammy excited and relieved after class is completed, leaving a 5-star review and saying "Wow, I really needed that"

Sammy feels less stressed out, instead feeling supported and excited about their future. They end their journey in a much better place than where they started.

Prototyping

With our envisioned journey and high-priority ideas in focus, we moved forward with prototyping. In doing this, we targeted three design goals, also known as “Hills” in the IBM Activation Journey, to create statements of clear intention and value. In a work environment, these would be crucial to any stakeholder involved in the development or success of the proposed product or solution. These hills were:

Trustability

Sammy the skeptical student can evaluate the suitability & effectiveness of a resource and verify it with real student feedback.

Community

Sammy the isolated student can discover & engage with a supportive network of like-minded students wherever, and whenever, they need it.

Convenience

Sammy the busy student can seamlessly identify, book & access a resource that suits them, all in one place.

For the purposes of our project, we leaned towards an app-based digital solution, developed with iOS standards and breakpoints in mind. The three steps of our prototyping journey were to conceptualize the app flow, create a minimum viable low-fidelity wireframe, and then gather lean feedback to create a clickable medium-fidelity prototype.

We started by getting our heads together in another workshop and established our desired app flow. We sliced this flow into three paths that would each aim to accomplish a Hill, then sketched our own individual versions of how we envisioned each screen. We gathered and voted on them on the basis of adherence to user-centered design principles, information hierarchy, and contribution to their respective hills. We then gathered, iterated on our flow, and re-sketched to ensure visual consistency. This is what the finalized sketch flow (left) and lo-fi wireframe looked like:

To move towards our medium-fidelity prototype, we conducted an informal lean evaluation with representative users to gain preliminary insight into our design and surface any critical issues, confusing parts, or missing elements. The main areas of improvement uncovered from this evaluation were ambiguous language choices, inflexible paths, and hierarchical inconsistencies that created cognitive overload.

Taking all of that into consideration, we were ready to create our clickable medium fidelity prototype using Balsamiq. This software offers ready-made assets to enable a more efficient drag-and-drop building process at this stage of the development cycle. We collected the screens of this prototype and crafted an end-to-end story describing it and its design goals, with the final output being a short video and presentation.

Here, I took full ownership of the video production process. I created the storyboard, wrote the script, curated the assets, and edited the video in its entirety. Based on the finalized script and the preferred competencies of the rest of my groupmates, I also put together a strategy to make sure everybody’s strengths were highlighted in the video - delegating narration tasks and the sketching of new visuals accordingly.

This was a medium that I had not previously explored to this extent, but I was excited to take it on and deliver it as part of our final playback. The playback took place in front of a panel of industry experts and the rest of the class, eliciting some incredible feedback. Per our professors’ request, the video was also used as a gold-star example of the final playback for future iterations of the course. So, in all, I think I’d call it a resounding success. Huge thanks to my team for giving me full creative license with the video, and for their invaluable contributions and feedback as we produced it.

The full video can be found here:

Usability Testing

The final step of our process was to put our prototype in the hands of our users and evaluate. To do this, we conducted 4 think-aloud usability tests with representative users, and walked them through modified versions of the tasks given during our lean evaluation. Those tasks were:

  1. Complete the application onboarding.

  2. Find, verify, and book a resource.

  3. Write an anonymous post in the student forum.

  4. Provide a 4-star review for the event attended.

Our moderation platform was Zoom, and our testing platform was Balsamiq. We ran pre-test, post-task, and post-test questionnaires to put the data we uncovered into context. All tasks were completed successfully by our participants, and we observed the following impacts on our Hill categories:

Trustability Metric

2/4 participants reported higher perceived trust in campus mental health resources after the test. Contributing features include UofT login credentials, resource reviews, and anonymous posting.

Community Metric

3/4 participants reported incremental increases in community sentiment after the test. Contributing features include the student forum and community reviews.

Convenience Metric

4/4 participants reported improved experiences locating resources after the test. Contributing features include search option variety, student reviews, and two-tap bookings.

Due to the low number of participants in this usability test, our reporting avoids generalizing from these quantitative measurements. Beyond relevance to our Hills, the key takeaways from these tests were as follows:

 
Evaluation summary outlining what worked well, what could be better, and what could be fixed.
 

Wrap-up

This brings us to the end of our 2-month journey, and to my personal reflections. As was the case for most of the team, this was my first exposure to the full-scale, end-to-end design process (beyond the lens of business strategy). This was double-edged: while it allowed us to bring fresh, interdisciplinary perspectives to the process, it made us prone to hitting walls more often than we had anticipated. I tried my hand at addressing this through the experience I had from past design-thinking projects, using strategies like sprint workshops and ‘bad idea’ brainstorming to get over slumps. I also paid extra attention to each team member’s skill set beyond what was communicated verbally, to make the division of tasks as efficient and flexible as possible. This made the emergence of roles in the team more organic, and resulted in more team synergy.

As far as areas of improvement go, many came about as a result of format constraints and our collective experience levels.

  • First, the details of our persona went far beyond what would be valuable for business or design stakeholders, resulting in a slightly overwhelming artefact that was not revisited as much as it could have been.

  • Second, our “big ideas” were anchored to a mobile-based platform without a concrete design justification for it. This heavily limited the platform’s potential as a robust solution for users.

  • Third, our prototyping phase was bottlenecked by the capabilities of the Balsamiq platform, as well as the types of feedback we were able to source and integrate. Our transition from low to medium fidelity would ideally have involved more internal stakeholders and fewer users.

  • Fourth, our usability tests were heavily informal and fell below industry standard in participant numbers. We would ideally have recruited eight or more participants to uncover a greater share of usability issues, but were constrained by time and outreach strategies.

  • Finally, our project ended before moving into the high-fidelity stage. This would ideally be what is tested in user evaluations, but the timeline and content covered in this course made moving into it practically infeasible.

I finally wanted to give a special thanks to Olivier and Velian for their guidance throughout this course. They equipped us with a strong foundation that I will carry with me for the rest of my design career. I’m excited to see where it will take me.

 
Group photo after the conclusion of the class. Left to right: Edith, Isaac, Me (Mado), Prof. Velian, Prof. Olivier, Mary. Group members not pictured: Iris, Ramya.
 
 