Brewing a Better Connection: Reimagining the Frontline Employee Feedback Experience

Redesigning the feedback ecosystem for a global food & beverage leader to amplify the voices of its 400,000+ store employees and drive data-informed innovation.

My Role

Product Designer, UX Researcher

Owned strategy, mixed-method research, concept design, and pilot validation

Team

Sammantha
Rogger
Xuci
Khush

Timeline

12 Weeks

Skills

User Research

Product Strategy
Innovation Implementation

Visual Design

Prototyping

The Challenge

A Disconnected Voice

A global coffee leader was launching new in-store initiatives but lacked critical feedback from its frontline employees.

The company's primary feedback tool, a digital survey, had a response rate of just 5%, leaving leadership blind to operational issues.

The challenge was to find the root cause of the low engagement and design a solution to capture the invaluable insights being lost.

Impact highlights

From 5% to 60%

Reframed the problem from a simple survey issue to a systemic user experience challenge, delivering a three-phase innovation roadmap.

We developed and tested initial intervention prototypes across 4 US districts, culminating in a holistic feedback concept.

The most successful pilots resulted in a 60% increase in employee engagement, validating our hypothesis and securing stakeholder buy-in for the long-term vision.

1.6X Increase

Lift in employee engagement in the top-performing pilot, achieved by fixing the feedback gap that increased responses and helps prevent expensive losses down the line.

96% Willing

Partners said they were comfortable providing feedback, strong evidence that the issue was a systemic UX failure, not user apathy.

1 in 4 Partners

Share of partners who explicitly asked to see their feedback lead to tangible action, a core insight from analyzing 278 open-ended responses.

Why this mattered

The company validates hundreds of in-store innovations before national rollouts. The existing Partner Pride Survey had a 95% non-response rate, leaving product decisions supported by an extremely small and biased sample. That gap created systemic risk: a failed national rollout could represent nine-figure losses.

Research-driven insights

This project prioritized grounding every design decision in partner reality. The research program used a mixed-method approach that combined field observation, contextual interviews, qualitative coding, and analysis of existing survey responses to triangulate root causes.

  • 1 ideation workshop: conducted with 9 partners on the Store Testing team

  • Secondary research: innovations in feedback collection and best practices for surveys

  • Comparative testing: tested 4 intervention strategies across 4 districts to gauge the most effective approaches

  • Concept testing: with a Store Manager Peer Coach (SMPC), a shift supervisor, and a barista

  • 12 interviews: Store Testing team members; SMPCs, Store Managers (SMs), and Assistant Store Managers (ASMs); shift supervisors and baristas; survey experts

  • 7 intercepts: store managers, shift supervisors, and baristas

  • 8 store visits: city stores and Rapid Testing Stores

  • 2 ideation sessions: a free brainstorming session & an insight-smashing session

Filling out the survey is just one small part of the whole survey experience

  • Get notified a survey is live: “received an email about some survey.” (Email)

  • Hear about the survey: “I heard I need to fill out a survey??” (Verbal, chats, etc.)

  • Find time to do the survey: “When do I have the time to fill out!?” (Store iPads)

  • Find the survey link on iPad: “Ugh, where is the survey?” (Store iPads)

  • Fill out the survey: “Is it relevant to my work?” (Store iPads)

  • Go back to work or end shift: “OK, I got that out of the way.” (Store iPads)

  • Wonder what happens: “I wonder what will happen?” (?)

Barriers

Designing a better survey is the least of our concerns. Drawing from two surveys with 278 combined responses and our qualitative research, we categorized the barriers by their level of significance.

Accountability: no way to keep track of the store's progress

Time: difficult to find time to take the survey

Awareness: limited awareness & communication

Cognitive: struggle to remember to take the survey

Motivation: insufficient intrinsic motivation

Access: difficult access & navigation to the survey

Context: lack of understanding and context

UX: the overall survey experience

We asked store partners how likely they would be to take the survey if a given aspect were improved, and measured the frequency of each response. We triangulated the results with observations from the comparative testing, store visits, qualitative interviews, and open-ended survey answers.

Hypotheses
01

Making survey access visible and centralized will reduce friction and increase completion.

02

Adding social, competitive signals (store/district leaderboards) will increase motivation and completion among partners.

03

Closing the loop (showing what changed) will improve repeat engagement and perceived value.

Each pilot targeted one or more hypotheses so results could be triangulated back to specific mechanisms.

Testing Hypotheses
  • Build a personal connection with store partners

    Ideas explored: introduction videos, a testing roadshow, information sessions, an onboarding Zoom call, a letter to stores

    Piloted: intro video addressed to store partners

    Result: 58% decrease (Communication, Motivation, Education)

  • Help stores track completion through gamification

    Ideas explored: updates with a leaderboard, a Teams channel with stores, a district competition, in-store competitions, pins, badges, and merch

    Piloted: gamified store leaderboards

    Result: 50% increase (Motivation, Completion)

  • Support store partners through physical reminders

    Ideas explored: survey promo posters, partner-facing info cards, a reminder on the iPad dock, Store Testing brand materials

    Piloted: in-store posters promoting the survey

    Result: 25% decrease (Reminder, Communication, Education)

  • Optimize navigation and access to the survey

    Ideas explored: an infographic for access, a note on App 1, a note on App 2, posters with QR codes

    Piloted: navigation infographic + note on the apps

    Result: 60% increase (Access)
Design of the pilots

We intentionally chose low-cost, observable interventions (physical posters with QR codes, updated shift comms, a district leaderboard in a shared Teams channel, and scripted manager prompts).


Pilots were run as quasi-experiments across four Store Testing Team districts with consistent measurement windows and a control period baseline.
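To make those comparisons concrete, here is a minimal sketch of how per-district uplift against the control-period baseline could be computed. The district names and counts are illustrative assumptions, not the team's actual data or tooling.

```python
# Hypothetical sketch: comparing pilot completion rates against a
# control-period baseline, per district. All names and counts are
# illustrative.

def completion_rate(completed: int, eligible: int) -> float:
    """Share of eligible partners who completed the survey."""
    return completed / eligible

def relative_uplift(pilot_rate: float, baseline_rate: float) -> float:
    """Relative change vs. the control-period baseline (0.60 == +60%)."""
    return (pilot_rate - baseline_rate) / baseline_rate

# Illustrative per-district (completed, eligible) counts for each window.
districts = {
    "district_a": {"baseline": (12, 240), "pilot": (19, 238)},
    "district_b": {"baseline": (11, 220), "pilot": (18, 225)},
}

for name, windows in districts.items():
    base = completion_rate(*windows["baseline"])
    pilot = completion_rate(*windows["pilot"])
    print(f"{name}: baseline {base:.1%}, pilot {pilot:.1%}, "
          f"uplift {relative_uplift(pilot, base):+.0%}")
```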

Outcomes

2 districts implementing gamification + improved access saw a 60% uplift in completion.


2 districts saw declines attributed to confounding variables (technical issues and survey fatigue), a critical learning that informed rollout readiness.


In the successful pilots, the combined uplift equated to a 12x improvement over the 5% baseline.
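To keep the headline figures straight, the arithmetic behind them, using the numbers reported above:

```latex
% A 60% relative increase in engagement is a 1.6x multiple,
% while moving the completion rate from 5% to 60% is a 12x multiple.
\[
1 + 0.60 = 1.6\times
\qquad\qquad
\frac{0.60}{0.05} = 12\times
\]
```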

What we measured
Primary metric

Survey completion rate

Secondary signals

Qualitative sentiment
Anecdotal indicators

Solution overview

Espressit is a centralized feedback platform concept designed to solve the three root causes. It bundles three tightly focused features:

Transparent Feedback Loop

Post-survey completion splash screens and summaries so partners can see tangible results from participation.

Centralized To‑Do Hub

A single place to see active tests, who’s responsible, and the time remaining on each task, reducing fragmentation.

Gamified Dashboards

Store- and district-level progress metrics, streaks, and leaderboards to replace manual motivation efforts with friendly competition; managers strongly validated this concept.

Risks & limitations
01

Two pilot districts underperformed due to tech issues and fatigue, underscoring the need for telemetry and staged rollout.

02

Low-fi pilots are context-sensitive; transferability requires local tailoring.

03

Early stakeholder alignment was critical to scaling.

Learnings
Business outcomes & stakeholder impact

Pilot data created a defensible case for investing in a dedicated feedback channel, shifting budget conversations from speculative to evidence-driven.

Surfaced the previously invisible partner voice at scale, improving confidence in test outcomes and reducing rollout risk.

Next steps

Build a prototype of the Espressit MVP

Instrumentation: track completion rate, time-to-complete, drop-off points, and repeat engagement.

KPIs: response rate, median completion time, repeat participation, % of feedback acted on, and impact on test-to-launch decision confidence.
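As a sketch of what that instrumentation could look like, assuming an event-per-survey-session model (all field and type names here are hypothetical, not an existing internal schema):

```python
# Hypothetical survey-session telemetry record covering the
# instrumentation above: completion rate, time-to-complete,
# drop-off points, and repeat engagement.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class SurveySession:
    partner_id: str           # pseudonymous ID, enables repeat-engagement tracking
    store_id: str
    survey_id: str
    started_at: datetime
    completed_at: Optional[datetime] = None  # None means the partner dropped off
    last_question_seen: int = 0              # locates drop-off points

    @property
    def completed(self) -> bool:
        return self.completed_at is not None

    @property
    def time_to_complete_s(self) -> Optional[float]:
        """Seconds from start to completion; None for drop-offs."""
        if self.completed_at is None:
            return None
        return (self.completed_at - self.started_at).total_seconds()

def completion_rate(sessions: list[SurveySession]) -> float:
    """Primary KPI: share of started sessions that were completed."""
    return sum(s.completed for s in sessions) / len(sessions)
```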