UX Designer + Researcher

[Image: service blueprint]

Braven

How can we improve the service experience for our clients?

I was the Manager of User Research and Service Design Strategy for Braven over the course of my 11 months as an Education Pioneers Fellow.  Braven is a leadership development start-up with a hybrid digital product + in-person service model, initially funded by the Chan Zuckerberg Initiative.

As a member of the Founding Team and Research Lead, I built up our data collection systems, figured out how to use our data to drive strategic decision making, and helped establish a culture of using data to inform product development and service design improvement.  In practice, I was responsible for strategizing and planning the research, managing internal and external stakeholders for quantitative and qualitative data collection, leading the analysis, and facilitating weekly cross-team meetings with the Service, Tech, Design, and Strategy teams so we could discuss how the research findings would shape our designs.

ONE: THE CHALLENGE

Through weekly data collection and discussion of KPIs, what can we learn to improve the experience of our users? 

With Braven's hybrid digital / in-person model, there is both a digital product design component and a service design component.  We wanted to collect data on both and figure out how to improve the user experience.  Our users are Braven Fellows: they engage with an online platform and are coached in person by Leadership Coaches.  We used a combination of qualitative and quantitative methods to gain the best possible understanding of our users, so we could design the best possible product and service.

 

TWO: THE FINDINGS

Our users prefer a more intense, high-human-touch experience.

We were able to conduct an A/B test – offering the same programming at one university over 6 months, and at another university over 6 weeks.  Student satisfaction ratings were substantially higher at the university where the program ran over 6 weeks.  These quantitative results were in line with in-person observations by people who had seen both programs.

Our users are much more likely to complete a task when they can see how it ties to their goals.

We conducted a digital log analysis to see which assignments users were engaging with, and which ones they were completing.  Our quantitative data then informed our next round of qualitative research: we followed up with in-depth interviews to find out why students were completing certain tasks and not others.  We learned that an assignment like "finish your resume so you can get an internship" resonated with students, whereas it was less clear to users how "upload a moodboard" tied to their goals.
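To illustrate the kind of digital log analysis described above, here is a hypothetical sketch in Python - the event names, log format, and counts are invented for illustration, not taken from the actual platform:

```python
# Hypothetical sketch: tally "opened" vs. "completed" events per assignment
# to compare engagement with completion. Event data is illustrative only.
from collections import defaultdict

events = [
    ("resume", "opened"), ("resume", "completed"),
    ("resume", "opened"), ("resume", "completed"),
    ("moodboard", "opened"), ("moodboard", "opened"),
]

counts = defaultdict(lambda: {"opened": 0, "completed": 0})
for assignment, action in events:
    counts[assignment][action] += 1

for assignment, c in counts.items():
    rate = c["completed"] / c["opened"] if c["opened"] else 0.0
    print(f"{assignment}: {c['completed']}/{c['opened']} completed ({rate:.0%})")
```

A gap like the moodboard's (opened but never completed) is exactly the kind of signal that prompted our follow-up interviews.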

Our users are seeking two main things: to improve their career prospects, and to find more of a community.

In our qualitative data collection process, which included both formal and informal observations and conversations, we heard that students wanted more internship opportunities and more ways to meet other Fellows, and that they took pride in being part of Braven's community.

 

THREE: THE CHANGES

I facilitated weekly strategy calls with the Service, Tech, Design, and Strategy teams where we brainstormed design changes, many of which have since been successfully implemented.

Based on the needs identified in the research findings, we worked together to brainstorm and then decide on actionable next steps.  The Service team implemented the suggested service changes, and the Design team worked on changes to the digital product.  The result was a more enjoyable, more effective learning experience for Braven's users ("the Fellows").

Braven's offerings have moved from "every 3 weeks" to every week, allowing more in-person coaching.

By moving to a weekly cycle, Braven is responding to the insight that users prefer a higher-touch experience.  The weekly cadence lets each week build more naturally off the previous one, and Leadership Coaches can focus on doing the thing that humans do best - coaching - so users feel more engaged.

Messaging has been tailored to make clear how activities tie to students' goals, like finding an internship.

Events like a hackathon hosted by LinkedIn, where Fellows were given a real-life business challenge to solve, provided the type of workplace simulation and thinking challenge that Fellows will need in their careers.

Braven now places an added emphasis on building community among its Fellows and Leadership Coaches.

With the program now meeting every week instead of every 3 weeks, Fellows have felt more a part of a community and have built stronger relationships with other Fellows and with their Leadership Coaches.

FOUR: THE APPROACH

A mixed-methods approach where we examined qualitative data alongside quantitative data, and determined design implications.

By looking at data from different angles (by team: Service, Tech, Design, Strategy; or by method: qualitative vs. quantitative) and comparing notes, we were able to gain nuance and insight.  At the same time, we made a conscious effort to translate these insights into actionable design recommendations.  In our weekly meetings, I tried to keep our designs moving forward and improving with each iteration.

 

Designing weekly KPI trackers

Our KPIs aligned with our broader goals of understanding user Retention, Attendance, and Mastery of material.  We analyzed data from digital log analysis and in-person observation.  The trackers were viewed by our entire team and had to make the data accessible to everyone; ultimately, my colleagues (my clients, in this case) felt that it was.  I was also given the design constraint of building the trackers in Google Sheets.  The focus is on presenting the key information: basic trendlines, the most recent week's data written out, and a short written synopsis along the lefthand side for each KPI.
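The synopsis column amounts to a small weekly computation over each KPI's history.  A minimal sketch of that logic - in Python rather than the Google Sheets formulas actually used, with illustrative numbers rather than Braven's real data:

```python
# Hypothetical sketch: one-line weekly synopsis per KPI (latest value,
# week-over-week change, trend direction). Numbers are illustrative only.

def summarize_kpi(name, weekly_values):
    """Return a one-line synopsis from a KPI's weekly history."""
    latest, previous = weekly_values[-1], weekly_values[-2]
    delta = latest - previous
    trend = "up" if delta > 0 else "down" if delta < 0 else "flat"
    return f"{name}: {latest:.0%} ({trend} {abs(delta):.0%} vs. last week)"

kpis = {
    "Retention":  [0.92, 0.90, 0.91],
    "Attendance": [0.78, 0.81, 0.85],
    "Mastery":    [0.60, 0.60, 0.60],
}

for name, values in kpis.items():
    print(summarize_kpi(name, values))
```

In the actual Sheets trackers, the same idea was expressed as a trendline plus a written synopsis per KPI row.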

 

Leading weekly strategy meetings

I led weekly strategy meetings between our Service, Tech, Design, and Strategy teams, where I would present the research findings for the week and we would discuss design implications.  For each of our goals, we had an ongoing metric for success (KPI), and each week we ran a data analysis for all of our goals.  Each analysis contained the observed reality, why we think it happened, and what to do about it (for example, would the Service team or the Design team be responsible for implementing a change?).  In this way, we were able to test innovations and iterate rapidly.

 

Managing administration of pre- and post-experience surveys

For this data collection effort, we wanted to understand our users' experience by measuring where they stood before the experience and comparing it to where they stood at the end.  This involved pre- and post-experience surveys.  I was responsible for overseeing administration across 3 geographic regions, 8 schools, and 2-4 types of surveys per school.  It also required managing the process between internal stakeholders (our Service and Strategy teams) and external stakeholders (our survey administration company and school districts).

 

Researching how to increase engagement with the digital platform

For this project, I led our team's research efforts via digital log analysis, in-depth interviewing, and in-context observation / ethnographic research.  Through these efforts, we identified major pain points in users' digital experiences, helped our entire team build empathy for our users and a common language for discussing them, and provided concrete design recommendations.

 

FIVE: THE OUTCOMES

My work helped build a culture of valuing research at Braven, which has seen impressive outcomes.


Through the design of digital tools, processes, and systems, and the establishment of a weekly cycle of research insights and design implications, my work as Founding Manager of User Research and Service Design Strategy helped make design research accessible to everyone within the organization.

After my Fellowship term ended, the organization experienced 300% growth in its user base, and continues to grow and expand to new regions.  I look forward to seeing how the product/service continues to evolve.

LEARNINGS AND THOUGHTS

1) Though we didn't have the time or capacity to develop personas for our service design, I was later able to develop them for our digital platform design.  We still incorporated knowledge of our users as we developed the service, but if I could go back, I would bring these personas with me and incorporate them more fully into our service design process.

2) It would be great to do more creative brainstorming around service design - how might we design with unusual constraints like a fully in-person experience, or an all-digital experience, or half the amount of time?

3) It would be great to think more about how to make the in-person and digital experiences more cohesive, so they feel like parts of the same awesome brand experience.