UX604 · Research Method 2 · Winter 2026

Uber Eats
UX Research

Carol Chen
Independent UX Researcher
Professor Susie Simon
Wilfrid Laurier University
Table of Contents

Intro

The goal of this study is to evaluate the usability of the Uber Eats app, identify the root causes of cognitive friction and factors that undermine user trust, and develop solutions to address these issues and enhance the user experience.

  • 01 Summary: Overview video of the full research
  • 02 Key Findings: Three problem-evidence sections
  • 03 Actions: Triangulation matrix + prioritised design recommendations
  • 04 Methods: Research methods, participants & data
  • 05 Reflection: Project takeaways and personal growth
Executive Summary

Summary

This comprehensive UX research report evaluates the Uber Eats application to identify cognitive friction points and areas where user trust is compromised. By triangulating data from Usability Testing, PURE Expert Evaluation, and AI Heuristic Analysis, we uncovered three critical issues: invisible customer support, opaque pricing at checkout, and complex information architecture.

--- Get a quick overview of our research journey and top findings in the video below. ---

Research Insights

Three Key Findings

Confirmed and cross-validated across all three research methods.

01
Support Invisible
Priority — Critical
02
Price Transparency
Priority — Critical
03
Information Architecture
Priority — High
Finding 01 — Critical
Customer Support System Is Invisible
Problem Highlight: Support Invisible
Problem Overview
Uber Eats' customer support is effectively invisible, and there is no direct human contact option for reaching the restaurant. Users feel forced into an AI loop when they have an issue with an order.
Selected User Quotes

"Because it just don't show the number, and I have to connect AI and then ask the AI to go through the agent stuff."

— Participant 1

"I just couldn't find the phone number."

— Participant 5

Usability Testing
Usability Testing Data
Task 3 Data

The success rate for Task 3 is 0%, and the SEQ score is only 2.8. This process involves significant friction, making it completely impossible for users to achieve their goals.

PURE
PURE Evaluation Data
Task 3 Data

The PURE score indicates that multiple friction metrics exceed the threshold, constituting a critical path obstacle; immediate corrective action is required.

AI
AI Heuristic Analysis Data
Task 3 - Step 5

By the end of Task 3, the cognitive load reached 5 (the peak), and the interaction logic was highly counterintuitive, causing the task to be completely abandoned.

Finding 02 — Critical
The Issue of Price Transparency
Problem Highlight: Price Transparency
Problem Overview
Users experience noticeable hesitation during the final payment stage, leading to a decrease in financial confidence. The grand total is hidden behind ambiguous UI elements, making it impossible to confirm the final amount easily.
Selected User Quotes

"I cannot see the total price after I add a tip. That's a little bit annoying."

— Participant 1

"When I reached the final step after selecting the tip, I couldn't find a summary of the total price anywhere."

— PURE Evaluator 2

Usability Testing
Usability Testing Data
Task 2 Data

Operational ease was high (SEQ 5.4/7), but task success was only partial (success score 1.2) due to anxiety induced by price opacity.

PURE
PURE Evaluation Data
Task 2 - Step 7

In the final step of Task 2, the PURE score is 2, indicating a friction point at this step.

AI
AI Heuristic Analysis Data
Task 2 Data

Task 2 AI data indicates a high cognitive load, with an overall PURE score of 3 (serious issue).

Finding 03 — High
Information Architecture Problems
Problem Highlight: Architecture
Problem Overview
Across all three research methods, we found that the app's information architecture is complex, with ad display taking precedence over usability. This causes cognitive overload, and the convoluted information indexing makes it difficult for users to find what they need. New users struggle to adapt, which degrades the overall experience.
Selected User Quotes

"The interface is a bit too flashy and too complicated... it would be better if the page were simpler."

— Participant 5

"The search bar is not on the top of the screen. It makes me confused a little bit. Then I find it on the bottom."

— PURE Evaluator 3

Usability Testing
Usability Testing Data
Overall UMUX

With a 68% UMUX-Lite score and 80% neutral ease-of-use ratings, the app's usability is only average due to persistent navigation barriers.
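For reference, a UMUX-Lite score like the one cited above is conventionally computed from two 7-point items (perceived usefulness and ease of use) rescaled to 0–100. The sketch below uses hypothetical participant responses, not this study's raw data:

```python
def umux_lite(usefulness: int, ease: int) -> float:
    """UMUX-Lite: two 7-point items mapped to a 0-100 scale."""
    assert 1 <= usefulness <= 7 and 1 <= ease <= 7
    return (usefulness - 1 + ease - 1) / 12 * 100

# (usefulness, ease) pairs for five hypothetical participants
responses = [(6, 5), (5, 5), (6, 4), (5, 6), (5, 5)]
overall = sum(umux_lite(u, e) for u, e in responses) / len(responses)
print(round(overall, 1))  # mean score across participants on the 0-100 scale
```

Averaging per-participant scores this way makes the headline percentage directly comparable across tasks and study rounds.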

PURE
PURE Evaluation Data
Overall PURE Score

High friction ratings (red/yellow cells) reveal severe IA flaws.

AI
AI Heuristic Analysis Data
Overall PURE Score

Overall data reveals widespread severe friction, confirming critical IA problems.

Design Recommendations

Actions

Prioritised solutions to resolve the identified friction points.

We utilized a Triangulation Matrix to cross-validate the identified usability issues, and applied the Eisenhower Matrix to determine the priority of the problems to be solved.
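The cross-validation logic behind the Triangulation Matrix reduces to a simple rule: an issue earns "Strong Alignment" when all three methods agree on its severity. The sketch below uses illustrative placeholder ratings, not the study's actual data:

```python
# Illustrative triangulation: each issue maps method -> friction rating (1-3).
findings = {
    "Support Invisible":        {"usability": 3, "pure": 3, "ai": 3},
    "Price Transparency":       {"usability": 3, "pure": 3, "ai": 3},
    "Information Architecture": {"usability": 2, "pure": 3, "ai": 3},
}

def alignment(ratings: dict) -> str:
    """Strong when every method agrees on severity, otherwise Moderate."""
    return "Strong" if len(set(ratings.values())) == 1 else "Moderate"

for issue, ratings in findings.items():
    print(f"{issue}: {alignment(ratings)} alignment")
```

This mirrors the matrix below, where the two critical issues show strong alignment and Information Architecture shows moderate alignment.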

Triangulation Matrix — Uber Eats UX Evaluation
Issue: Support Invisible
Usability Testing Findings: Behavior: 100% task failure. Users felt frustrated and helpless ("I can't find a phone number anywhere"). Task 3 success rate is 0%.
PURE Evaluation Findings: Expert Rating: Score 3 – High friction. Task 3 is worth a total of 10 points; there are 4 friction points in Task 3.
AI Heuristic Analysis Findings: The AI assigns an extremely high cognitive load (5/5). AI Persona assigns the lowest SEQ score (2/7, extremely difficult). Heuristics violation: Violates H3 User Control and Freedom.
Final Triangulation Strong Alignment
All three methods highlight a critical blocker. The complete absence of a human escalation path creates severe emotional friction and guarantees task failure.
Issue: Price Transparency
Usability Testing Findings: Behavior: Users noticeably hesitated during checkout, reducing their financial confidence. Quantitative data show 80% of users could not find the final price; the SEQ score was low.
PURE Evaluation Findings: Experts reported that they could not find a summary of the total price anywhere. Expert Rating: Score 3 – High friction.
AI Heuristic Analysis Findings: Heuristic violations: Serious violation of the visibility of system status (H1). PURE 3 — High friction, Cognitive load 4 / 5. The AI persona gave a rating of 4 out of 7 and did not confirm the final price during the task.
Final Triangulation Strong Alignment
Real user hesitation is directly explained by the structural flaws identified by experts and AI—specifically, the hidden grand total and ambiguous payment UI.
Issue: Information Architecture
Usability Testing Findings: User Behavior: Users show noticeable hesitation when looking for the after-sales support option; some users have reported that the help option is not clearly visible.
PURE Evaluation Findings: Tasks 1, 2, and 3 all have several sticking points.
AI Heuristic Analysis Findings: Heuristic violations: H2 (abbreviations do not conform to natural language), H4 (search box placement deviates from standards). The AI experiences increased cognitive load when processing redundant hierarchical menus, and notes that the promotion mechanism appears at every decision point, constantly interrupting the core task workflow.
Final Triangulation Moderate Alignment
Analytical methods (PURE/AI) pinpointed specific structural issues that broadly align with the minor delays and confusion observed in real users.
Priority Matrix — Eisenhower
  • Do (Important & Urgent): 1. Price Transparency · 2. Support Invisible
  • Decide (Important, Not Urgent): 3. Information Architecture
  • Delegate (Not Important, Urgent): none
  • Delete (Not Important, Not Urgent): none
1
Price Transparency
This problem is severe but can be solved quickly, delivering an immediate improvement to the user experience.
2
Support Invisible
Although seeking help is the final step in the user flow, a reliable and user-friendly “safety net” is crucial. In the long term, resolving this significant friction point is decisive for restoring user trust, reducing frustration, and ultimately ensuring customer retention.
3
Information Architecture
Simplify navigation and the user's mental model. Refine global and local navigation for faster task completion and smoother interactions.
01
Fix Price Transparency at Checkout
How might we redesign the checkout flow to provide full price transparency and eliminate users' fear of price uncertainty?
Advice

Redesign the layout of the pricing page to ensure price transparency, allowing users to clearly see the total amount they will be charged, while also adding safeguards to prevent accidental orders.

Action screenshot
Action 1
The total price can be pinned to the bottom of the screen so users can see it at any time while scrolling, and the tax amount can be adjusted on the payment page.
Action screenshot
Action 2
Add a safeguard against accidental orders: when the user taps "Place Order," show a payment confirmation step before the charge is finalized.
Business Goals & KPIs
Collaboration
UI&UX Designer

Prototype the interactions and design the visuals for real-time price display.

UX Research

Usability testing on the new checkout flow to confirm reduced anxiety and hesitation.

Engineering

The front end is responsible for implementing the real-time calculation logic and smooth animations; the back end is responsible for ensuring that the price calculated before the click is 100% accurate.

Product Management

Track order completion rate and refund rate as primary KPIs post-launch.

02
Customer Support Functions are Not Visible
How might we establish a transparent communication mechanism within the app that allows users to resolve their needs at any time?
Advice

Establish a direct communication channel so users can contact the merchant or rider directly, removing the intermediary. The help page needs to be intuitive so users can quickly find the functions they need.

Action screenshot
Action 1
Design a three-way communication platform where users, merchants, and riders can communicate directly online. Real reference case: China's Meituan delivery app.
Action screenshot
Action 2
The help page should be reachable from both the home page and the order page. Options:
  • Add a help icon to the top-right corner of the home page.
  • Make the help icon a persistent floating element so users can see it wherever they navigate.
Business Goals & KPIs
Collaboration
UI&UX Designer

UX designers are responsible for designing the communication platform and the help-page interactions.

UX Research

A/B testing to determine which design users prefer.

Engineering

The front end implements the interaction for the updated help button.

Product Management

Forecast the increase in direct merchant contacts and adjust support capacity accordingly.

03
Redesign Information Architecture
How might we design an intuitive information architecture that minimises friction points for users during their interactions?
Advice

Restructure the navigation so critical functions are visible at a glance. Move the search bar to the top, add a sidebar for menu categories, and redesign the help page hierarchy so users can find support within two steps.

Action screenshot
Action 1
Product categories should be prominently displayed, with recommended items consolidated into a section below for users browsing without a specific preference. Any advertising sections must be clearly labeled but should not be overly prominent.
Search: move the search bar from the bottom to the top of the screen, in line with user habits.
Action screenshot
Action 2
Sidebar: The current product selection page does group items, but users must scroll down to see the groupings, and they are not visible immediately on entry. Instead, display the categories in a vertical sidebar on the left so users can see all groupings at a glance without scrolling.
Action screenshot
Action 3
Help Page:
  • Use card sorting to rearrange the information architecture, then test whether the proposed new structure is viable.
  • Previous orders should not be limited to displaying only the most recent one. While showing the last order directly allows users to find it quickly, it prevents them from accessing other orders. Therefore, create a module where users can click to view all pending orders requiring attention.
Business Goals & KPIs
Collaboration
UI&UX Designer

Redesign the information architecture of the homepage, menus, and help page based on card sorting, then test the new structure.

UX Research

Card sorting studies to validate new information architecture with real users.

Engineering

Implement new navigation components and sidebar category menu.

Product Management

Define success metrics for new IA — task completion rate and time-to-first-action.

Research Process

Methodology

Three progressive methods, designed to cross-validate each other.

01
Usability Study
Remote moderated testing via Zoom with 5 real Uber Eats users (ages 20–25). Three tasks with SEQ, UMUX-Lite, task success rate, error rate, and time-on-task.
N=5 Users · Remote/Zoom · SEQ + UMUX-Lite · Affinity Map
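The per-task metrics listed above (success rate, mean SEQ, time-on-task) reduce to simple aggregates over session records. A minimal sketch with made-up session data, not the study's raw results:

```python
# Hypothetical session records for one task across five participants
sessions = [
    {"success": True,  "seq": 6, "time_s": 48},
    {"success": True,  "seq": 5, "time_s": 62},
    {"success": False, "seq": 4, "time_s": 90},
    {"success": True,  "seq": 6, "time_s": 55},
    {"success": True,  "seq": 6, "time_s": 51},
]

success_rate = sum(s["success"] for s in sessions) / len(sessions)
mean_seq = sum(s["seq"] for s in sessions) / len(sessions)
mean_time = sum(s["time_s"] for s in sessions) / len(sessions)
print(f"success={success_rate:.0%}, SEQ={mean_seq:.1f}/7, time={mean_time:.0f}s")
```

With n=5, each participant shifts the success rate by 20 points, which is why we report these numbers alongside qualitative evidence rather than on their own.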
02
PURE Study
In-person expert evaluation by 3 UX Master's students (WLU). Each step rated on PURE 1–3 friction scale. Qualitative data analyzed via Affinity Mapping in Figma.
N=3 Experts · In-Person · PURE 1–3 Scale · 17 Steps
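The PURE arithmetic we use is simple: each step gets a 1–3 friction rating, the task score is the sum of step ratings, and the task's headline rating is its worst step. The step values below are hypothetical, chosen only to be consistent with the Task 3 figures cited in the triangulation matrix (10 points total, 4 friction points, headline score 3):

```python
# Hypothetical per-step friction ratings (1 = low, 3 = high) for Task 3
task3_steps = [2, 2, 3, 2, 1]

task_score = sum(task3_steps)                        # total task score
headline = max(task3_steps)                          # worst step drives the rating
friction_points = sum(1 for s in task3_steps if s >= 2)  # steps with friction
print(task_score, headline, friction_points)
```

Summing steps rewards smooth flows, while taking the worst step as the headline rating keeps a single severe blocker from being averaged away.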
03
AI Heuristic Eval
Gemini (Thinking) used as AI participant proxy with simulated think-aloud. Same 3 tasks evaluated against Nielsen's 10 Heuristics with SEQ scores recorded.
Gemini Thinking · Nielsen's 10 Heuristics · SEQ Scores · Think-Aloud

Usability Testing Tasks

To evaluate the core user flow and identify potential friction points, participants were asked to complete the following three key tasks:

  • Task 1: Item Selection & Pop-up Handling. Search for a "Sandwich" within the app, select a specific package, and successfully navigate through any promotional pop-ups that appear.
  • Task 2: Payment & Checkout Customization. Navigate to the payment page to modify the default payment method and adjust the tip amount before finalizing the order.
  • Task 3: Support & Issue Resolution. Locate customer service to inquire about the most recent order and attempt to find a direct customer service phone number.
UX Research Timeline
Jan 5 – Mar 23, 2026  ·  11 Weeks  ·  Methods: Usability · PURE · AI
  • W1–3 · Research Plan (Planning): Finalize task list & screening criteria · ✓ Done
  • W4 · Recruitment (Usability): Recruit 5 participants; screening & consent · ✓ Done
  • W5 · Testing (Usability): 5 sessions via Zoom / in person, recorded · ✓ Done
  • W6 · Analysis (Usability): Success rate, time-on-task, error data · ✓ Done
  • W7 · Recruitment (PURE): Secure 3 UX expert evaluators · ✓ Done
  • W8 · Evaluation (PURE): Task scoring (1–3 scale); organize findings · ✓ Done
  • W9 · AI Evaluation (AI Study): Simulate user flows; analyze task steps via AI · ✓ Done
  • W10 · Cross-Method (Comparison): Compare Usability, PURE & AI data · ✓ Done
  • W11 · Final Project (Deliverable): Consolidated report & final presentation · ✓ Done

Methodology Comparison: Strengths & Weaknesses

A core component of this study was comparing the efficacy of Usability Testing, PURE, and AI Evaluation.

1. Usability Testing
   Strengths: Provides undeniable behavioral evidence, genuine emotional reactions (e.g., frustration), and qualitative context (the "say-do" gap). High stakeholder buy-in.
   Weaknesses: Resource-intensive (recruiting, moderating), smaller sample size, and users may exhibit social desirability bias, overlooking friction if a task is eventually completed.
2. PURE (Expert Eval)
   Strengths: Systematic, structured, and fast. Excellent at identifying granular UI friction points and structural flaws that average users might not articulate.
   Weaknesses: Lacks real user empathy and emotional context. Experts might suffer from the "curse of knowledge," assuming tasks are easier (or harder) than they are for the target demographic.
3. AI Evaluation
   Strengths: Incredibly fast, scalable, and low-cost. Highly objective in flagging heuristic violations and quantifying cognitive load step by step.
   Weaknesses: Can be overly rigid or hallucinate context. AI lacks genuine human emotional nuance and cannot fully simulate the unpredictable, chaotic nature of real human interactions.

Strategic Integration: When to Leverage Each Research Method

Based on our comparative study, we propose the following strategic roadmap for integrating these methods into the product development lifecycle:

  • Early Stage (Ideation & Wireframing) -> AI Evaluation: Use AI as a low-cost, high-speed "first pass" to identify obvious heuristic violations, structural logic errors, and cognitive overload before any code is written.
  • Mid Stage (Prototyping) -> PURE (Expert Review): Once interactive prototypes are built, deploy UX experts to conduct systematic PURE evaluations. This ensures the interaction design is fluid and meets industry standards before investing in user recruitment.
  • Late Stage (Pre-Launch & Post-Launch) -> Usability Testing: Reserve resource-intensive real user testing for validating high-risk flows, assessing emotional impact (trust, anxiety), and uncovering the unpredictable "say-do" gaps that experts and AI cannot simulate.
Software & Tools Used
Zoom
Gemini
Figma
Google Docs
Canva
CapCut
GitHub
Project Takeaways

Reflection

What I learned from this comparative evaluation study.

1. Identifying the Gap: Behavioral Data vs. Self-Reporting

One of the most profound insights from this study was the "say-do" gap. I observed that users often assigned high satisfaction scores even when they encountered significant obstacles during tasks. This suggests that users may subconsciously overlook friction points once a task is completed, or they might feel a "social desirability bias" during testing. It taught me that relying solely on quantitative ratings can be misleading; we must cross-reference subjective feedback with objective behavioral observations to pinpoint where the experience truly breaks down.

2. Mastering Data Synthesis & Evidence-Based Prioritization

The biggest challenge was synthesizing vast amounts of qualitative and quantitative data. Transitioning from data collection to prioritizing findings requires more than just intuition—it requires a structured, evidence-based approach. Utilizing the Triangulation Matrix and the Eisenhower Priority Matrix was a turning point. These frameworks allowed me to align new findings with existing research, ensuring that our focus was directed toward the most critical issues rather than subjective preferences. This process was instrumental in building a robust, defensible evidence chain.

3. Future Improvements

  • Agile Methodology Selection: In future projects, I aim to more rapidly identify the most effective research methods tailored to specific goals, ensuring a more streamlined data-gathering phase.
  • Strategic Stakeholder Communication: I plan to further refine my reporting style by focusing on actionable insights. By simplifying complex data into high-impact narratives, I want to ensure that stakeholders can quickly grasp the core findings to make informed, data-driven decisions.