In my personal life, I will spend hours figuring out the quickest way to do something. Maybe it’s housework or grocery shopping or a driving route. While I’m figuring it out, it’ll take longer. (Drives my sister nuts – she’s a civil engineer who obviously knows the best way to do it the first time.) But once I have my aerodynamic path, everything takes a lot less time than if I hadn’t streamlined it. My professional life does not escape my efficiency efforts either: for most of my career, I have been working to build out infrastructure that supports both the execution of user research and access to its findings.
Planning, conducting, and reporting on user research is challenging work and takes a lot of time no matter how you do it. A researcher must clearly define the purpose of the study, determine the type of testing that will best serve that purpose, prepare testing materials, conduct the test, and then analyze and report the findings in a way that fulfills the original purpose. That alone is lots of work and lots of time, and then you add all of the other stuff on top: advocating for user research, participant screening and recruitment (and scheduling, compensation, checking in), consent/disclosure agreements, ensuring the affordances for research are in place (location, recording, software, production of assets for testing), and having a place to store the findings and raw data for later access. Never in my career has all of the necessary infrastructure existed to support user research, but I have often been part of and/or led initiatives to build it out.
Here’s how a typical user research project would go in the past (and possibly also in the present) . . .
- Early in an engagement, UX would start begging the product or account manager to approve user research, which had not been planned for in the budget or project timeline.
- If the request was not quashed based on scope, UX would write up a formal research proposal.
- Upon approval, UX would work out a detailed research plan, create a screener for recruiting, and write up a testing script.
- Someone, often a person whose job included making arrangements for (non-UX) focus groups and other market research activities, would reach out to an existing network of market research vendors to recruit participants using the screener created by UX.
- The selected vendor would also typically procure consents/disclosure signatures and handle incentives (such as gift cards) for participants. Recruiting would take weeks.
- Depending on the research plan, research sessions might take place remotely, in the office, or at a research facility arranged by the vendor.
- On research days, either the vendor or some other person would check in participants; after that, UX was on their own unless the research was done at a research facility that provided video recordings, in which case the vendor would later hand over the raw video footage.
- Otherwise, UX provided the test materials and equipment, ran the test, arranged for and managed video recordings, and otherwise documented the test in progress. If it was in-office or a remote test, UX would ensure the necessary software (such as Morae) had been procured and was available for the test. For a paper prototype test, UX would create the prototype, arrange for the prototype to be mounted on boards, and arrange for cameras in the room. And so on.
- And then UX would conduct the test and do the necessary analysis — first manually transcribing the videos — and then prepare and present a report.
- UX would save the report and all underlying evidence to a folder for that particular client engagement and, upon conclusion of that engagement, the research would never be seen again unless someone got nosy.
In recent years, most of my research projects have gone asynchronous, using online platforms like usertesting.com and playbookux.com. These services provide both remote and live studies. Some, like playbookux.com, automatically transcribe sessions; others, like usertesting.com, do not. In my experience, there is little difference in quality between automatic transcription and manual transcription. The biggest draws of platforms like usertesting.com and playbookux.com are overall price and recruitment.
How to Kill User Researchers
I was hired as a contractor to conduct two user research studies for Intuit QuickBooks, one before a redesign and one after, using usertesting.com. For the first study, I created my research plan and screener. I needed small business users of QuickBooks; I specified some parameters within that population to screen in and out. I selected the right options to set up my study, wrote up my script, and transcribed it into the usertesting.com interface. The service did its magic: I recruited exactly the people I wanted to test and recorded their interactions and comments. The test subjects were perfect! After the study I spent 70% of my billable time manually transcribing the user interviews before analyzing the results and producing a report that informed the redesign. For the second study, I repeated the operation, except that suddenly I was told I would be conducting multiple studies with no changes to the time parameters I had agreed to. Again, after the study I spent 70% of my billable time manually transcribing the user interviews before analyzing the results and producing a report, only this time I had worked unpaid overtime over a holiday weekend.
What went wrong? Among other things . . . First, the overall project plan failed to adequately account for and recognize the importance of user research. (This is really ironic because the agency running all of this claimed to specialize in UX.) Not only was the project not planned to accommodate user research, but a decision was made to keep the project on track by making the user researcher work double-time hours. Second, whatever project plan there was failed to account for the realities of user research, such as the time required for manually transcribing recordings. Third, encompassing the first two problems, logistics were not in place to support the researcher. The researcher spent most of her billable hours manually transcribing recordings and there was no place for the researcher to save or archive the research where the data could be referenced in the future.
Operations in Progress
Everywhere I have worked I have initiated processes to facilitate user research and make user research findings available to those who need it.
Working recently at a SaaS company, I found, in lieu of a corporate intranet, a disorganized wiki, Google Drive, and an abundance of other third-party applications used for particular purposes. Knowledge-wise, nothing had a home and nothing was findable unless you already knew what you were looking for.
Within this chaos, the UX team started to carve out a small corner, and among other things, that corner included a nook for UX research.
At this SaaS company,