Identifying New Tools to Support UX Research at Q2

Q2 is a financial technology company that builds software for community financial institutions. I sought to identify and act upon opportunities to support UX research within the company. Q2's research team has conducted hundreds of studies over the last six years; however, the discoverability of that historic research is low, leading to difficulties onboarding new designers, an unnecessary burden on the team to retrieve past research, unanswered research questions, and wasted time. Additionally, given constraints in time and resources, qualitative analysis was often less thorough than it could be. My recommendations were pitched to Q2's Vice President of Product and Development. As a result of the project, the company purchased a new research tool at $2,400/year. I have also maintained a key role in ensuring the tool's successful adoption into the research team's regular processes.

Big Takeaways

This experience taught me a lot about the value of asking for regular feedback and establishing “buy-in”. The long-term project goal was to identify and build out a repository and analysis tool to be used by stakeholders outside of the UX Research Team. So, rather than wait for some “grand reveal” at the conclusion of the project and hope stakeholders would be satisfied, I sought regular feedback. The project took about 10 weeks to complete and, during that time, I gave a total of four presentations to all UX managers, designers, and researchers. Thus, by the time I proposed the purchase of (…spoiler alert!) Dovetail, I was confident the pitch would be successful, because I had already established buy-in and acted upon stakeholder feedback along the way.

Workflow


Ideation Exercise

When I started at Q2, like many new hires, I was eager to look around the company and identify opportunities for improvement. I first moderated a collaborative ideation exercise in Miro with all UX managers, designers, and researchers. Participants were asked a series of questions that probed processes and challenges related to current research activities (e.g., finding research, analyzing data, and reporting). Participants responded by writing out answers on “sticky notes”, and it was emphasized that each note should be limited to a single idea.

 

The perceived advantages of this approach, as opposed to 1-on-1 interviews or focus groups, were as follows: (1) the exercise was a relatively fast, efficient means of data collection, and (2) it created simultaneous, equal communication across all participants.

 

Notes were analyzed using affinity diagramming by first sorting notes into thematically-related, top-level categories and, second, sorting notes within each category into more nuanced themes to derive insights.


The results of my analysis revealed several important findings, summarized in the table below. Many challenges around research processes were found: (1) searching for historic research was difficult, reducing the impact of past research and burdening the research team with uncovering it, (2) research files and information were spread across multiple locations (e.g., Confluence pages, SharePoint folders, separate video recordings and transcript files, etc.), (3) qualitative data analysis was typically informal, not thorough, and prone to bias, (4) there was no standard for reporting research findings, and (5) while participants valued highlight reels, they found the process of creating them arduous. Findings from my analysis were written into a detailed research report, and I presented them to all UX managers, designers, and researchers.


Competitive Analysis

I then conducted a competitive analysis of ten research repository and analysis tools. I focused my analysis on a set of features identified through the previous exercise. For instance, it was important to identify repositories that could support tagging of projects with searchable metadata so that stakeholders could, for example, find projects by features and components.


My competitive analysis identified five candidate repository and analysis tools (i.e., Aurelius, Condens, Dovetail, Enjoy HQ, and Tetra Insights). Next, within each platform, I re-created the same project using data from a focus group study conducted in 2020. I made project pages, uploaded media files and other important files (e.g., research session notes), analyzed qualitative data via thematic tagging, and created highlight reels. I also evaluated each platform’s transcription accuracy by hand-transcribing a random selection of speech across multiple speakers and questions and comparing my transcription to the corresponding portions produced by each platform.
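A standard way to quantify this kind of transcript comparison is word error rate (WER): the number of word-level substitutions, insertions, and deletions needed to turn a platform's output into the hand transcription, divided by the length of the hand transcription. The sketch below illustrates the idea; the platform names and transcript snippets are placeholders, not the actual study data.

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = (substitutions + insertions + deletions) / reference word count,
    computed with a standard Levenshtein edit distance over words."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between the first i reference words
    # and the first j hypothesis words
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[len(ref)][len(hyp)] / len(ref)

# The hand transcription serves as the reference; each platform's
# automated output is scored against it (illustrative data only).
reference = "the onboarding flow felt confusing to me"
platform_outputs = {
    "Platform A": "the onboarding flow felt confusing to me",
    "Platform B": "the on boarding flow felt confusing",
}
scores = {name: word_error_rate(reference, out)
          for name, out in platform_outputs.items()}
```

Lower scores indicate more accurate transcription, which makes the results easy to rank objectively across platforms.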


To better summarize the findings of my competitive analysis, I ranked each repository along the following dimensions: search features, experience creating highlight reels, storage of research files (e.g., research notes), qualitative analysis features, and transcription accuracy. Rankings were based on objective criteria when possible (e.g., transcription accuracy); otherwise, they were based on subjective experience from working closely within each repository. Rankings were presented as part of a detailed written report to clearly highlight my findings and justify my rankings. I presented findings from my competitive analysis to all UX managers, designers, and researchers.
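One simple way to roll per-dimension ranks up into an overall ordering is to average each tool's ranks across dimensions. This is purely an illustrative sketch with made-up tool names and ranks, not the study's actual scores or aggregation method.

```python
from statistics import mean

# rank 1 = best on that dimension (hypothetical values)
rankings = {
    "search":        {"Tool A": 1, "Tool B": 2, "Tool C": 3},
    "highlights":    {"Tool A": 2, "Tool B": 1, "Tool C": 3},
    "file storage":  {"Tool A": 1, "Tool B": 3, "Tool C": 2},
    "analysis":      {"Tool A": 2, "Tool B": 1, "Tool C": 3},
    "transcription": {"Tool A": 1, "Tool B": 2, "Tool C": 3},
}

tools = rankings["search"].keys()
# Sort tools by mean rank across all dimensions (lowest mean = best overall)
overall = sorted(tools, key=lambda t: mean(d[t] for d in rankings.values()))
```

In practice a team might also weight dimensions by importance, but even an unweighted average makes trade-offs between candidates easy to see at a glance.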


Pitch

I recommended the purchase of a Dovetail subscription at $2,400/year and pitched my recommendation to Q2's Vice President of Product and Development. During my presentation, I presented solutions to each of the previously identified challenges. For example, designers had expressed difficulties searching through historic research, which would be addressed by both a universal search feature and metadata tagging inside Dovetail. As another example, informal qualitative analysis would be replaced with a more thorough practice of thematic tagging, streamlined via Dovetail's analysis tools. Creating artifacts, such as highlight reels, would also be streamlined with Dovetail.


I also laid out how Dovetail would be used to streamline and support each step of a typical research study, including planning, data collection, analysis, and reporting. For example, collection of interview data would be followed by uploading and transcribing video files into Dovetail, identifying responses via question-level tagging, and thematic tagging within questions. Question-level tagging refers simply to highlighting participant responses to each question. This type of tagging, in addition to thematic tagging, confers a couple of advantages: (1) it presents an opportunity to clean up transcripts, which benefits their discovery within a universal search feature, and (2) it allows stakeholders to easily retrieve all responses to any one question.

 

It was also important to remember that supporting UX research at Q2 will require more than just adopting new tools like Dovetail, a point I made during my presentation. Qualitative analysis, for example, will benefit from establishing best-practice guidelines that set expectations for all involved stakeholders. My proposal to purchase Dovetail was approved by the company.


Next Steps

Once all research-related content (e.g., planning documents, recordings, reports, etc.) has been migrated into Dovetail, the platform will serve as a single hub for all research information, artifacts, and analysis. More importantly, however, the discoverability of historic research projects will be significantly improved, aided by the inclusion of searchable metadata. In short, stakeholders will be able to better uncover relevant research and, when they do, be able to access all the content related to a project in one space.

 

The acquisition of Dovetail has also jump-started significant changes to the research team’s processes. I have been directly involved in building qualitative analysis and reporting guidelines in a way that will promote a more thorough practice of analysis and documentation of research findings. The guidelines will ensure consistency and set clear expectations around qualitative analysis and research reporting.