
Parlay

Chatbot Survey + Analytics Dashboard for Capturing Product Feedback

 
 

TYPE: Class Project, Group (4 total)

DURATION: 3 weeks

TOOLS: Sketch, InVision

MY ROLE: Researcher, Information Architect

Table of Contents: Overview | Discovery | Research | Ideation & Sketching | Prototype | Usability Testing | Final Iteration

 

 

OVERVIEW

What is Parlay?

Meta UX for product owners

Parlay is a tool that helps product and UX people understand what's working and what's not through direct customer feedback. Wouldn't it be cool if you could gather intelligence on existing or proposed product features from your power users? And what if that feedback helped you understand the underlying issues behind well-received (or not-so-well-received) features?

Parlay to the rescue!

OUR ROLE

UX for Chatbot Survey + a tiny bit of product development

The Parlay app has two main facets: a user-facing survey that collects sentiment data and a client-facing analytics platform. Our job was to develop a survey that could intelligently collect data (and, of course, be a great user experience!) on various aspects of an existing or potential feature and build an analytics dashboard to display such data.

 

 

CLIENT BRIEF

 

Find a novel way to capture and measure user feedback on digital feature previews

...while maintaining a good user experience

...and providing actionable data

 

 

DELIVERABLES

 
 
 

 

DISCOVERY

AREAS OF INVESTIGATION

 
Areas of Investigation

 

COMPETITIVE ANALYSIS

The concept of Parlay already exists

But, not quite...

Based on guidance from our clients, we surveyed the major (and minor) players in the product/UX sentiment space.

Why these? These competitors had survey tools, sales tools, and other features that Parlay has, and we wanted to understand how they executed on them. More importantly, how could we do something different, something better?


COMPETITIVE RESEARCH: FEATURES

Product Feature Previews

One of Parlay's key selling points

Feature previews allow a user to see in real time how a product would fit into a website or web app. This gives unparalleled contextual information and saves time on user testing. It also allows Parlay to track performance by preview -- how cool is that?


COMPETITIVE RESEARCH: DASHBOARDS

Actionable Insights, Not Charts

Numbers are meaningless without contextual info

We decided to look at some of the major players in the analytics space in order to understand the state of the industry and analytics products.

Bottom line: we didn't want a dashboard full of graphs, charts, and numbers. We needed to think outside the box and think in terms of actions, taking it a step beyond a regular dashboard.

 

 

RESEARCH

USER INTERVIEWS

We interviewed 12 industry experts

Product, Development, UX, Analytics

We asked questions like:

  • “How do you get user feedback on newly-added features?”
  • “Are developers interested in feedback?”
  • “How do you use analytics to inform your product decisions?”
  • “What metrics are most important to your product decisions?”

Findings

Product Development is Messy

  • Sometimes, features might live or die by metrics alone, leaving the question: "Why did this feature succeed or fail?"
  • Developers sometimes don't understand the rationale behind development decisions
  • Lots of time and money may be invested in features that are not worth developing in the first place
 
 

 

PERSONAS

 
 

DIFFERENT ROLES, DIFFERENT PRIORITIES

Each specialist needs something different from user feedback

And different levels of quantitative vs. qualitative data

Product Needs:
Assurances = They need to know that what they're working on is worthwhile and that users will respond to the new product or feature, without sinking a ton of money into it. This is the advantage of previews.

Development Needs:
Actionable Insights = They need to understand the implications of a feature. If it looks like one feature is preferred over another, how can they get ahead and plan for that? This helps with development costs and tech debt.

Analyst Needs:
More Data = Great all around. This helps product owners and other business people decide what's working and what isn't.

UX Designer Needs:
More Feedback = They need on-demand feedback from power users. These people aren't always represented in user surveys and usability studies.

 
 

 

PROBLEM STATEMENT

"Product teams often have trouble anticipating their users’ wants and needs, and speaking about those wants and needs in a common language."

two "sub-problem" statements:

  1. How can we define a set of standards or benchmarks that lets Parlay's users intuitively interpret whether a particular product or feature is working out?

  2. How can we visualize this set of standards in a meaningful way?

 

 

BEFORE SKETCHING, DEFINING OUR CONSTRAINTS

THE SURVEY

Three separate surveys, adhering to the Customer Happiness Index (CHI)

  • What questions would we ask?
  • How can we avoid leading questions?

THE ANALYTICS DASHBOARD

Displaying Qualitative Data

  • How can we drive more action and less analysis with the dashboard?
  • How can we be different?
Sketching out some concepts

 

 

BRAINSTORMING

Translating product sentiment

Product and UX often want the same performance metrics

To kick off the brainstorm, we played with the idea of using UX Heuristics to help outline what users thought of the product and give UX/Product the vocabulary to evaluate performance.

This way, a user can express sentiment, and a UX/product person can use their empathy skills to understand what the user means when they convey like or dislike for a particular feature preview.

The Heuristic Model would end up being our primary contribution to the project and help set the stage for further survey and dashboard development.

The team, hard at work brainstorming and sketching

 

 

SOLUTION STATEMENT

 

We will address these user insights with a custom heuristic framework, so that they can be easily categorized and clearly understood.


 

 
 

 

DEVELOPING A FRAMEWORK FOR EVALUATING PRODUCTS

USING UX DESIGN PRINCIPLES: HEURISTICS

SURVEYING THE CROWD

What heuristics are important to you?

To save time, we surveyed nearly 100 UX and product professionals around the world to understand, in their unbiased opinion, which heuristics are most important and/or relevant to the work they do.

The heuristic model, refined

The initial results of the survey needed some refinement. Through additional brainstorming, we pared down the list and made some substitutions. Why? Some heuristics didn't work well with the product-feature model, and some were not easy to convey through the potential survey questions.

We arrived at the following heuristics: predictability, impact, consistency, relevance, value, and learnability.

Additionally, some heuristics apply better to positive, negative, or neutral survey sentiments than others.
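
To make the framework concrete, here is a minimal sketch of how the heuristic-to-sentiment mapping could be expressed as a simple configuration. The heuristic names come from our model above; the specific sentiment assignments shown are illustrative assumptions, not the team's final mapping.

```typescript
// Hypothetical sketch: the heuristic framework as a configuration object
// that maps each survey sentiment to the heuristics it probes.
// Heuristic names come from the model above; the sentiment assignments
// are assumptions for illustration only.

type Sentiment = "positive" | "neutral" | "negative";

type Heuristic =
  | "predictability"
  | "impact"
  | "consistency"
  | "relevance"
  | "value"
  | "learnability";

const heuristicsBySentiment: Record<Sentiment, Heuristic[]> = {
  positive: ["value", "impact", "relevance"],
  neutral: ["consistency", "predictability", "relevance"],
  negative: ["learnability", "predictability", "value"],
};

// Example: which heuristics the negative-sentiment survey would probe.
console.log(heuristicsBySentiment.negative.join(", "));
```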

Initial sketching on survey questions

 

 

"CHATBOT" IDEATION

We developed 3 surveys

Each one representing a positive, neutral, or negative sentiment

The questions needed to be conversational but brief, with closed responses to get consistent data across each heuristic and across multiple product "tests."

Questions also had to be contextual to the sentiment, so we built different questions for different sentiments. Each question flow required us to jump into the minds of people who might be satisfied, dissatisfied, or indifferent to their product experience.
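
As a rough sketch of how such a sentiment-specific flow might be structured, here is a hypothetical data model: each closed-response option is tagged with the heuristic it speaks to and a score used later by the dashboard. The question wording, option labels, and flow IDs are illustrative, not the actual survey copy.

```typescript
// Hypothetical structure for a branching chatbot survey flow.
// Each closed response maps to a heuristic and a -1 / 0 / +1 score;
// "next" points to the follow-up question. Wording is illustrative.

type Heuristic =
  | "predictability" | "impact" | "consistency"
  | "relevance" | "value" | "learnability";

interface SurveyOption {
  label: string;        // what the user taps in the chat UI
  heuristic: Heuristic; // which heuristic this answer speaks to
  score: -1 | 0 | 1;    // signal fed into the dashboard scoring
  next?: string;        // id of the follow-up question, if any
}

interface SurveyQuestion {
  id: string;
  prompt: string;       // conversational, but brief
  options: SurveyOption[];
}

// One step of an assumed positive-sentiment flow.
const positiveFlow: SurveyQuestion[] = [
  {
    id: "p1",
    prompt: "Glad you liked it! Did the feature behave the way you expected?",
    options: [
      { label: "Exactly as expected", heuristic: "predictability", score: 1, next: "p2" },
      { label: "Mostly", heuristic: "predictability", score: 0, next: "p2" },
      { label: "Not really", heuristic: "predictability", score: -1, next: "p2" },
    ],
  },
];

console.log(positiveFlow[0].prompt);
```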

Survey Question Flows

 

 

ANALYTICS DASHBOARD

Visual representation of product performance

Why Harvey Balls?

Through our conversations with product managers, we noted that Harvey Balls seemed to be the visual of choice when it came to expressing product performance.

Harvey Balls give a quick, visual representation of a quantitative metric. This was the at-a-glance, actionable dashboard item we were looking for.


FROM SURVEY QUESTION TO DASHBOARD ELEMENT

Assigning a simple point value system

We assigned a simple -1 (negative) / 0 (neutral) / +1 (positive) value to each potential survey response. This allowed us to take qualitative data, make it quantitative, and translate it back into a visual concept for quick consumption.
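
A minimal sketch of that scoring, assuming each response arrives tagged with the heuristic it addresses; the mapping of the -1 to +1 average onto a quarter-filled Harvey Ball is an illustrative assumption, not the exact thresholds we used.

```typescript
// Minimal sketch of the -1 / 0 / +1 scoring described above, with an
// assumed mapping of the average score onto a 0-4 quarter-filled
// Harvey Ball for the dashboard.

type Score = -1 | 0 | 1;

interface SurveyResponse {
  heuristic: string;
  score: Score;
}

// Average the scores for one heuristic across all survey responses.
function averageScore(responses: SurveyResponse[], heuristic: string): number {
  const relevant = responses.filter(r => r.heuristic === heuristic);
  if (relevant.length === 0) return 0;
  const total = relevant.reduce((sum, r) => sum + r.score, 0);
  return total / relevant.length; // ranges from -1 to +1
}

// Map a -1..+1 average onto a 0-4 quarter-filled Harvey Ball.
function toHarveyBallQuarters(avg: number): 0 | 1 | 2 | 3 | 4 {
  const normalized = (avg + 1) / 2;                       // 0..1
  return Math.round(normalized * 4) as 0 | 1 | 2 | 3 | 4; // quarters
}

// Example: two positive and one neutral response on "value" average to
// about +0.67, which renders as a three-quarter-filled ball.
const demo: SurveyResponse[] = [
  { heuristic: "value", score: 1 },
  { heuristic: "value", score: 1 },
  { heuristic: "value", score: 0 },
];
console.log(toHarveyBallQuarters(averageScore(demo, "value"))); // 3
```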

Harvey Balls

Translating the survey question to visualized quantitative data

 

 

DASHBOARD SKETCHES

Focus on actionable insights

Data-backed suggestions, not charts

We pored over countless dashboards, from Google Analytics to those of our competitors, to understand the gold standard in this product space.

Through our research, we sussed out a few key features that would make our dashboard stand out:

  • Visuals to assess performance (e.g., larger graphics to indicate higher frequency, colors to convey positive/negative)
  • Recommendations (Parlay would reveal which version of a product test performed better and recommend that you launch that particular feature; see the sketch after this list)
  • Although the dashboard would be a little different from what's currently out there, the functionality should follow convention (e.g., calendars to select specific date ranges, and the ability to "drill down" into separate reports and pull data in different ways)
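
As a rough illustration of the recommendation idea above, here is a hypothetical sketch that compares two feature-preview variants by their average heuristic scores and suggests the stronger one. The variant names, the data, and the simple averaging rule are assumptions for illustration, not Parlay's actual logic.

```typescript
// Hypothetical sketch of a dashboard recommendation: compare two
// feature-preview variants by their average per-heuristic scores
// (each in the -1..+1 range) and suggest the stronger one.

interface VariantResult {
  name: string;
  scores: Record<string, number>; // average score per heuristic
}

function overallScore(variant: VariantResult): number {
  const values = Object.values(variant.scores);
  if (values.length === 0) return 0;
  return values.reduce((sum, v) => sum + v, 0) / values.length;
}

// Return a plain-language recommendation for the dashboard.
function recommend(a: VariantResult, b: VariantResult): string {
  const [winner, loser] =
    overallScore(a) >= overallScore(b) ? [a, b] : [b, a];
  return `Users responded better to "${winner.name}" than to "${loser.name}"; consider launching "${winner.name}".`;
}

// Example with made-up numbers:
const versionA: VariantResult = { name: "Preview A", scores: { value: 0.6, learnability: 0.2 } };
const versionB: VariantResult = { name: "Preview B", scores: { value: 0.1, learnability: -0.3 } };
console.log(recommend(versionA, versionB));
```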
Various iterations of dashboards, some more "non-graph" than others

 

 

FINAL CHATBOT & DASHBOARD ITERATIONS

INITIAL POSITIVE SENTIMENT SURVEY FLOW

 
 
 

 

ANALYTICS DASHBOARD