Virtualitics
Pathfinder

* This work is protected under an NDA.


ROLE

UX Researcher

TIMELINE

June 2024 - July 2024

GOALS

Evaluate Pathfinder’s usability and effectiveness in supporting users with their data analysis.

METHODOLOGIES

Usability Testing, Surveys, Affinity Mapping, Session Snapshots

A problem Virtualitics Explore faced was that users did not know what to do or where to start with their data exploration when they entered the platform. To resolve this issue, the team developed Pathfinder, an AI-powered tool designed to simplify data analysis for new or less experienced users of the platform. Based on previous research, the team identified three important analytical tasks, and Pathfinder offers AI-assisted capabilities for completing each of them: identifying segments, finding key drivers, and uncovering outliers.

 

For this study, we aimed to evaluate the usability and effectiveness of a minimum viable product (MVP) of Pathfinder in supporting users' data analysis via these three capabilities. This research was essential for informing product and design decisions for Pathfinder, as well as guiding discussions about upcoming features of the tool related to generative AI (GenAI).
 

Research
Project Journey

As the sole UX Researcher on this project, I led the design and execution of the research study. This included creating a research plan, recruiting participants through Respondent, conducting six usability tests, distributing surveys, analyzing and synthesizing data, writing a research report, and presenting my findings and recommendations to stakeholders. For tools, I used Zoom to facilitate research sessions, Respondent to recruit participants, Maze for surveys, and HeyMarvin for automated note-taking, recording, and livestreaming.

Business Objectives:

  • Inform design and development of existing and future products/features

  • Increase the accessibility of Virtualitics Explore for all users


Research Goals:

  • Evaluate the clarity, completeness, and actionability of the information presented in Pathfinder

  • Identify any gaps in the content and gather user suggestions for enhancements

  • Discover any usability problems within Pathfinder and uncover opportunities to improve the tool


Research Questions:

  • User Understanding: How do users interpret the information presented by Pathfinder?

  • Actionability: To what extent do users feel equipped to take action based on the information provided by Pathfinder?

    • In what ways do users incorporate the information from Pathfinder into their decision-making processes?

  • Content Satisfaction and Completeness: How satisfied are users with the information presented by Pathfinder?

    • What critical information, if any, do users find missing from Pathfinder?

    • What enhancements do users suggest for improving the content of Pathfinder?

  • Usability: What usability issues, if any, do users encounter while interacting with Pathfinder?

Planning


Before writing the research plan for this study, I tested the MVP of Pathfinder myself to gain a better understanding of the tool. Then, I organized and facilitated a kickoff meeting with key stakeholders, which included a Lead Product Manager and a Product Designer, to discuss research objectives, key research questions, methodologies, participant criteria, and timeline.

 

Once we were aligned on those components, I drafted the research plan, shared it with stakeholders for feedback, and then finalized the plan, which contained the following parts:

  • Objectives

  • Research questions

  • Methodologies

  • Participant screening criteria

  • Project timeline

  • Moderator guide (for usability testing)

  • Survey questions

Usability Testing and Surveys


Regarding the research methodologies, I conducted 40-minute remote usability tests via Zoom and distributed 5-minute surveys via Maze. The usability tests consisted of:

  • Introduction (5 minutes)

  • 3 tasks to test the 3 key features of Pathfinder (30 minutes)

    • Including post-task questions about the participants’ experience

  • Conclusion (5 minutes)

    • Including questions about generative AI

 

After completing the usability tests, I gave participants a post-session survey to get a quantitative evaluation of their experience. The survey included:

  • An NPS question about the participants’ likelihood of using the feature

  • Ratings about clarity, actionability, and ease of use

  • An optional free response for additional feedback

 

Considering my team’s need for a quick turnaround on results, as well as a balance of external and internal participants, I aimed for six participants total—three from each group. When recruiting internally, I looked for people who use Virtualitics Explore and closely interact with external users; in previous studies, these participants had provided especially valuable feedback, so we wanted to recruit them again. For external participants, I looked for data analysts working in maintenance operations, since my company was focusing on that space at the time. I recruited via Slack (internal participants) and Respondent (external participants).

Challenge

Recruiting external participants who met the target criteria. A challenge I faced during recruitment was finding external participants: the target audience was an extremely small pool on Respondent and the other recruitment platforms I evaluated. Eventually, I was able to find them, but the difficulty prompted me to consider how to scale recruitment for future studies, where more participants might be needed.

Pathfinder and its 3 capabilities: identifying segments, finding key drivers, and uncovering outliers


Evangelizing UX Research

During this study, I was also kickstarting initiatives to increase transparency and evangelize UX Research in my company at the time. While conducting the usability tests and surveys, some things I implemented were:

  • Virtual backrooms (aka live streaming) for stakeholders to observe sessions

  • A UX Research calendar for teammates to be informed of ongoing research and easily join a live stream

  • UX Snapshots (easy-to-digest summaries of insights from a session that can be quickly shared with the team)


Virtual backroom (livestream) on HeyMarvin


Example of a UX Snapshot

Analysis


Methods that I employed during analysis included:

  • Compiling the UX snapshots of each participant's session

  • Affinity mapping to find common themes

  • Quantitative data analysis of the surveys to calculate NPS and average ratings
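For context on the survey math: NPS is conventionally computed on a 0–10 scale as the percentage of promoters (scores of 9–10) minus the percentage of detractors (0–6). A minimal sketch of both calculations (the scores below are hypothetical, not the study's actual data):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

def average_rating(ratings):
    """Mean of a list of Likert-style ratings."""
    return sum(ratings) / len(ratings)

# Hypothetical scores from six participants (not the study's data)
likelihood_scores = [9, 10, 7, 8, 6, 9]   # 0-10 NPS-style question
clarity_ratings = [4, 5, 3, 4, 5, 4]      # e.g., 1-5 clarity ratings

print(nps(likelihood_scores))       # 3 promoters, 1 detractor of 6
print(average_rating(clarity_ratings))
```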


Here's an example of an affinity map I made for a different research study

Research Report


Once analysis was done, I wrote a research report containing an executive summary, findings, recommendations, and appendices with all relevant research materials and resources. I then shared the report with the team and stakeholders to discuss findings, recommendations, and next steps.

 

Three key themes from this study were that participants needed:

  1. Transparency into system processes (e.g., how Pathfinder decides what is a key driver of a metric)

  2. The ability to dive deeper into the data directly from the shown insights

  3. More common, everyday language to help them understand the insights better

Outcome

By the end of this project, I submitted the following final deliverables:

  • Research plan

  • Research report

  • Session recordings

  • Summary notes of sessions

  • Session snapshots


As a result of this study, I achieved the following outcomes: 

  • Identified opportunities to improve the next iteration of Pathfinder

  • Informed product and design decisions for upcoming features (including Generative AI) of the tool

Key Takeaways
Reflection

Recruiting actual users of the platform. Reflecting on this project, I think recruiting current users of the platform to give feedback on future versions of Pathfinder would be valuable and offer an additional perspective. While external participants provided great feedback on the MVP, they lacked the familiarity with our tool needed for more in-depth feedback.


Ensuring transparency in research processes. Based on positive feedback from my team about the initiatives I kickstarted during this study (i.e., session snapshots, virtual backrooms, and the UX Research calendar), I plan to continue these practices to keep research processes transparent and help evangelize UX Research.
 
