Livy, a data science startup, needed user research, concept validation, and design for a mobile event recommendation app.





Role: UX Designer

Tools: Sketch, InVision

Timeline: Three weeks


Artifacts and deliverables

  • Research plan
  • Competitive analysis
  • User interviews
  • Subject matter expert interviews
  • User archetypes
  • Mental model
  • Journey map
  • App map
  • Content strategy
  • InVision prototype
  • Concept testing
  • Usability testing


Exploring the domain

Our team of three UX designers developed a research plan that started with competitive analysis and domain research. Livy has two types of direct competitors: online event calendars and social media. Here's a snapshot of their strengths and weaknesses:

Snapshot of the strengths and weaknesses of the two main competitor types



Recommendation services

Because Livy will leverage AI and machine learning to provide personalized event recommendations, we also looked at other personalized recommendation services. We wanted to know how these apps onboard users and present recommendations.

We saw three types of onboarding:

01. The Cold Open

no user input required; recommendations informed by user behavior

ex: Amazon, Spotify

02. Quick and Easy

user shares a few preferences; recommendations improve with user behavior

ex: Apple Music, Netflix

03. In-Depth

extensive user input required; recommendations are discrete, but will improve with repeated use of the service

ex: Bright Cellars, Next Glass

We noticed two approaches to presenting recommendations:

01. Discrete

user can view results but cannot edit them

02. Open and flexible

user can browse recommendations; user can actively improve results


Machine learning and artificial intelligence

Next, we wanted to understand the capabilities and constraints of machine learning and artificial intelligence. We felt it was necessary to have a technical perspective if we wanted to maximize the user experience. We interviewed two data scientists and a designer who works with machine learning platforms. Those conversations, combined with reading and online courses, gave us a solid understanding of Livy's technical environment.



Listening to event goers

Strong feelings about searching for events

We interviewed ten regular event goers in the Chicago metro area. We wanted to find out about their motivations for attending events and their process for discovering and evaluating the events they attend. Actively searching for events was something people felt strongly about.

I don’t like spending my free time searching. The pathway from Google can be kind of annoying.
— Breana
I like looking and I like sharing the information.
— George

Hesitant about artificial intelligence

Livy assumed that users would be attracted to an app powered by artificial intelligence. So we also explored participant worldviews of AI and machine learning. Participants appreciated recommendations like the ones Netflix provides. But they were not entirely comfortable with the idea that machine learning powers those recommendations.

I would want reasoning as to why these events were chosen [for me]. I don’t want to give the control over.
— Jason
Technology is taking over. Little scary sometimes for sure. I would be wary of [recommendations powered by artificial intelligence] right now. I would want to learn more.
— Alison


Getting insights from the data

Affinity mapping to make sense of user interview data


Who is the Livy user?

With hours of interview transcripts, we needed a method to organize the data. We started with affinity mapping. It was clear that there are two distinct types of event goers: Avid Event Seekers and Willing Participants. View the slide gallery below to see these two proto-personas, along with their overlapping goals, motivations, needs, and pain points.

From the Venn diagrams we could easily see how designing for the Avid Event Seeker would provide Livy with a clear value proposition and audience.


The Avid Event Seeker's mental model

Next, we created a mental model for the Avid Event Seeker. The model covered pre-, during-, and post-event considerations and behaviors.

This mental model shows the Avid Event Seeker's process from discovering an event to evaluating how it went.


CORE opportunities

  • Make it easier and faster to find events
  • Provide ways for users to share events
  • Focus the user experience on the benefits of personalization, not on selling the concept of AI

The Avid Event Seeker's journey

Next, we laid out the specific journey of the Avid Event Seeker. The journey map reinforced many of the points in the mental model. It also clarified the micro routines involved in finding and attending an event.

Journey map reveals opportunities for improvement


Areas for improvement

  • Reduce time to event selection
  • Facilitate social coordination
  • Provide enough information to determine fit and set expectations
  • Allow users to set specific preferences for price, location, and event content

Research-based design guidelines

Synthesizing our research helped clarify our opportunities. But before we started sketching and concept testing, we developed a couple of touchstones to guide our design process: a design opportunity statement and a set of design principles.


Design principles


Livy understands that users’ priorities change and makes it easy for users to find what they’re looking for, no matter what that is.


Livy embraces user feedback and available data to tailor recommendations without needlessly limiting the user’s sphere of opportunity for new experiences. The user has insight into why events are chosen and an opportunity to weigh in on the success of those choices.


Livy leverages feedback, incentives, and social connection to give users more for the time and energy they spend filling their schedule with the next great event.


Livy knows that sometimes users hit the town solo, but more often than not they’re looking for a "plus one." With Livy it's easy to connect the dots between what you’re going to do and who you’re going to do it with.



Iterating toward a solution

Quick sketching to ideate user flows


Our initial prototype needed to address onboarding, presenting recommendations, event details, sharing events, and incentives for user feedback on recommendations.

We did eight rounds of rapid sketches for each user flow. When we were done, we had a wide range of approaches on paper. We reviewed them together and chose the strongest directions to test. Then we each built a low-fidelity prototype for concept testing.

Each concept test session involved a moderator and a note taker. Our five testers, all regular event goers in Chicago, completed a series of tasks while speaking their thoughts and actions aloud.


Testing two onboarding approaches

We explored two directions for onboarding: 1) the user shares data to influence initial recommendations, and 2) the user signs up with Facebook and allows data sharing to support recommendations. There were pros and cons to each approach from both a user and a technical standpoint:

01. User input

Pros:

  • All users want to directly influence their recommendations
  • Gives Livy a standard base of preference data for all users

Cons:

  • The number of available data points is limited

02. Facebook integration

Pros:

  • Users like the idea of convenient sign up and sign in
  • Livy's unsupervised machine learning algorithms would benefit from the potential wealth of data points available from Facebook's API

Cons:

  • Some users are skeptical that their social media profile contains the type of information needed to provide personalized event recommendations
  • Too much reliance on an API to deliver value to users could expose Livy to unnecessary business risk

User input prevails, and gets better through iteration

Two of the onboarding sequences we tested used the user input approach; the third used a Facebook-only sign up. Even users who expressed interest in signing up with Facebook also wanted to share input directly with the app during onboarding. Because we had two versions of the user input approach, we were able to gather data on UI patterns even during concept testing. I was responsible for integrating onboarding feedback.

View the gallery below, which features my initial concept screens and subsequent iterations, to see how onboarding evolved through two rounds of testing.


To reward or not to reward?

Our client wanted to know how users would respond to getting a discounted app subscription in exchange for their feedback on the recommendations. We decided to test the idea of assigning a tangible value to user feedback by creating a rewards store. That way we could test two things:


01. What do users think of assigning a monetary value to their feedback?

02. What do users value most in return for their feedback?

Concept prototype: rewards

Introduction to LivyCash upon completion of onboarding


Example of user feedback modal


Example of LivyCash awarded to user for sharing feedback


LivyCash store including balance and "Earn More" link



  • Many users didn't read the LivyCash introduction
  • Even users who read the introduction were surprised to see that giving feedback on an event recommendation had earned them LivyCash
  • Users liked the idea of free tickets in exchange for LivyCash, but said it felt like a distraction from their primary reason to use the app
  • When asked, users said they would like to be rewarded with free use of the app more than free tickets
I like the idea that because I helped make the app better with my feedback I get to use the app for free. Like a reward for being a good community member.
— Becca

Listening to the user to deliver real rewards

As a team, we assessed the user feedback and its implications for the incentives. We decided that rewards could increase user engagement and loyalty. And we agreed that some major changes needed to be made to accomplish that goal. I was responsible for iterating on this aspect of our prototype. You can see the changes in the final prototype in the next section.


Two ways to present recommendations

We tested two approaches to the dashboard. One was a Tinder-style interface where users swiped left or right as they were presented with one recommendation at a time. The second used a swimlane pattern that let users browse and explore a selection of curated events. Users preferred the swimlane pattern. During our second round of testing, we sought to understand which categories users wanted and how they wanted the information presented. You can see the refined dashboard in the final prototype.



Final prototype

In this case study I've reviewed the user flows I was most responsible for: onboarding and rewards. These show how our collaborative process led to a user experience design grounded in both user insight and technical requirements. Our final prototype was a mix of screens created by each designer on our team of three.

Onboarding prototype

Click on the Create Account screen below to view the InVision prototype of the final onboarding flow.

Click here or on the phone image above to use the InVision prototype.




This collaboration was especially rewarding. Our design team consisted of three UX designers, an art director, and a project manager. We also worked closely with Livy's founder. As a data scientist, he had a lot to share with us about Livy's technical vision and environment, and he was eager to learn everything we could share with him about the user.



Conceptual fidelity

This project also reinforced the difference between conceptual and visual fidelity. Visual fidelity in the form of well-thought-out patterns, spacing, typography, and key iconography can help communicate the concept to user testers. However, visual fidelity in the form of color and images can distract users from the concept and focus their attention on content-specific elements. As a designer, I'll continue to be mindful of the objectives for testing when deciding how much visual fidelity is required to facilitate those objectives.

Subject matter research

I also learned that even when your client is an expert in the project subject matter, getting perspective from additional experts is invaluable.


I learned to notice the shift in energy that happens right around the time I could really use a critique of my work. If I lose focus on the 20% of my work that will be responsible for 80% of its impact, it's time to get feedback.



Thanks for making it this far. If you'd like, have a look at more case studies, view my résumé, or connect with me on LinkedIn.