Interaction Designer



Usability Research, Enterprise UX
Native iOS, Android, & Desktop



Certify is an expense management tool that enhances all the operational aspects of business travel through automated expense reports, analytics, and more. However, their data showed that a majority of users still typed in receipt information manually.

Certify engaged my team to uncover the behavioral drivers behind using—or more importantly, not using—Certify's automation tools so that we could make informed recommendations for their upcoming product refresh initiative.


UX Director
UX Researcher
UX Architect — My Role
Project Manager


6 weeks


  • Recruiting
    Tracking down Certify users who were local to the Chicago area and fit our criteria

  • Design Bias
    Getting past our own bias as a full-stack dev shop to uncover objective insights

Research plan



The project was broken into three phases over the course of six weeks: Knowledge Ramp Up, Usability Research, and Synthesis Readout.


research objectives

  • Understand participants’ motivations for using their preferred reporting method: automated, manual, or otherwise.

  • Identify common experience gaps within the system.

  • Identify opportunities for Certify to promote consistent use of automation features.

  • Gain insight into users’ unmet and latent needs.

participant selection

To observe behavioral drivers that may have developed over time, we chose to focus on current Certify users whose preferences ranged from manual to automated tools and who mainly submitted reports rather than approved them.

Certify was so excited to get going that they wanted to hear from all of their users, and they even offered to handle recruiting themselves. I put together a cheat sheet for their team to refer to as they recruited participants, and made sure to highlight our recommendation.

testing setup

I wrote the Moderator's Guide to start with qualitative questions about participants' past experiences with expensing and business travel. The participant would then work through a series of tasks designed to show us how they would create and submit their expenses and expense reports. We felt that a Moderator (XR) and two Observers (UXD, XA) in one room would be too much for the Participant, so the test environment was split into two rooms:

Credit: Dan LeBoeuf

Testing room

Set up to make the Participant feel as comfortable as possible. (I even bought a plant! 🌵) The XR moderated the conversation and tasks while we observed separately via livestream. This allowed us to think out loud without disturbing the Participant.

Credit: Dan LeBoeuf

observation room

Armed with mountains of stickies (sorry, Earth), the UX Director and I recorded observations in real time. Towards the end of each session, the XR came in and we'd get a chance to relay any follow-up questions to the Participant.


After the testing sessions, the XR and I commandeered a small room to group our observations into an affinity map. As themes emerged, we recorded early insights and recommendations on green stickies (pictured above).

General findings

Participants weren’t simply binary about automation vs. manual features. Many incorporated a combination of both types of behaviors into their workflow. In fact, even manual-leaning participants perceived Certify to be highly automated compared to the “archaic” spreadsheets of their past experiences.

Participants felt efficient in their preferred process and had little incentive to change their behavior towards more automated features. Ultimately, participants held themselves accountable for accurate expense reports. 

Research Themes

    “It’s tedious to scan by hand.”

    “Wish I had seen [these automation tools] sooner.”

    “Certify is pretty straightforward.”

    “I’m more comfortable typing on desktop.”

    “I’m worried that my expenses are incorrect.”

    “I don’t want to be wrong, especially when it comes to expense reports.”

Final readout

These findings made our report a little tricky. Fixing the UI/UX issues we uncovered would greatly improve usability, but would not likely push users to adopt automation features.

So, in the final readout we got client buy-in early by highlighting participants' positive perceptions of Certify as well as easy fixes to keep it that way. Each theme we introduced was designed to build upon the key insights and user quotes from the last, culminating in a section of recommendations for both short- and long-term enhancements.


  • Use visual cues to indicate preferred or main actions, as well as errors

  • Decrease the number of elements to reduce cognitive load and promote the happy path (automated features)

  • Establish automation habits as users are onboarded. This is critical as users grow complacent with their expensing process over time.

  • Highlight automation successes to build users' trust in the features over time


so what?

Anyone can dump a deck on a client and say, "fix these things and you'll make more money," but usability research shouldn't be a band-aid.

From day 1, we invested time in breaking down UX methodology because we saw an opportunity for Certify's talented product team to continue user-centered processes long after our synthesis was delivered, and I'm proud to say they're doing just that.


  • Recruiting is hard.

  • Timestamp stickies and write down key observations right away, or they'll be lost to your recall 5ever.

  • Research isn't always about what the user says. Oftentimes, observing hesitations, gestures, and body language can provide key insights.

  • Research is like sketching. Seeing a problem from multiple perspectives gives you greater insight into how to solve it.

  • UX isn't always about the user. Not only did our client walk away with a deck of actionable, informed insights & recommendations, they were inspired by this process to adopt user-centered methodologies into their company culture.

What Now?