Presentations

Evidence-first results, deliverables, and closing the engagement.


The final presentation, delivered at the end of a project. Evidence-first: what was built, what was tested, what the data shows, what was changed, and what's next.


Why This Step Exists

The project ends with validated outcomes, not subjective feedback. The presentation is where evidence meets decisions — the client sees what users actually did, not what the designer thinks should happen.

A project can be a single sprint or multiple sprints. The presentation happens once — at the end — when all testing and UX fixes are complete. It closes the engagement cleanly with evidence, deliverables, and next steps.

Testimonials, feedback, and next-step conversations happened ad hoc on the first engagement; now they're part of the methodology.


How to Do It

1. Build the Presentation

The presentation is built as an interactive slide deck inside the prototype codebase — not a separate Keynote or Google Slides file. This keeps everything in one place and lets you embed live demos directly into the flow.
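
A minimal sketch of how a deck like this can live inside a React + TypeScript prototype: an array of slide components with arrow-key navigation. All component and slide names here are illustrative, not taken from the actual deck.

```tsx
// SlideDeck.tsx: minimal in-prototype slide deck (all names illustrative).
// Because the deck lives in the same codebase as the prototype, a "slide"
// is just a React component and can render the real product inline.
import React, { useEffect, useState } from "react";

const TitleSlide = () => <h1>Final Presentation</h1>;
const SprintQuestionsSlide = () => <h2>Sprint Questions</h2>;

const slides: React.FC[] = [TitleSlide, SprintQuestionsSlide];

export function SlideDeck() {
  const [index, setIndex] = useState(0);

  useEffect(() => {
    // Arrow keys move between slides, clamped to the deck's bounds.
    const onKey = (e: KeyboardEvent) => {
      if (e.key === "ArrowRight") setIndex((i) => Math.min(i + 1, slides.length - 1));
      if (e.key === "ArrowLeft") setIndex((i) => Math.max(i - 1, 0));
    };
    window.addEventListener("keydown", onKey);
    return () => window.removeEventListener("keydown", onKey);
  }, []);

  const Slide = slides[index];
  return <Slide />;
}
```

Because each slide is an ordinary component, a demo slide can render the live product rather than a screenshot of it, and individual slides can be mapped to routes so they stay linkable.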

The presentation follows this structure across 10 sections:

Section 1: Headline Results (lead with the answer)

  • Title slide: project name, sprint count, user count, date
  • The three sprint questions from the first workshop
  • One slide per sprint question showing the headline pass/fail result and ICP-weighted score
  • "Remaining gap" slide — frame the gap as depth features, not foundational fixes
  • ICP-weighted scorecard — all questions on one slide with the combined data
  • Sprint-over-sprint comparison table showing measurable improvement (session duration, feedback maturity, task success rates)

Section 2: Live Walkthrough

  • Before going into methodology or detailed results, show the product
  • Open the live prototype and walk through the core flow
  • This gives the client context for everything that follows

Section 3: Methodology

  • Explain how ICP weighting works and why it matters (a worked scoring sketch follows this list)
  • "Surgical instrument" framing: an early-stage product is built for users who already experience the pain — testing against non-target users produces accurate but not actionable data
  • Show who was weighted in vs. reduced/excluded, and why
  • Make the point that every user contributed — ICP users drive product-market fit scoring, non-ICP users gave actionable UX improvements
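
To make the weighting concrete, here is one way an ICP-weighted score could be computed. The weight values (1.0 / 0.5 / 0), field names, and example data are assumptions for illustration, not the engagement's actual scheme.

```ts
// icpScore.ts: one way to compute an ICP-weighted pass/fail score for a
// sprint question. Weights and shape are illustrative assumptions.
interface SessionResult {
  user: string;
  passed: boolean;   // did this user's session pass the sprint question?
  icpWeight: number; // 1.0 = full-weight ICP user, 0.5 = reduced, 0 = excluded
}

function icpWeightedScore(results: SessionResult[]): number {
  const totalWeight = results.reduce((sum, r) => sum + r.icpWeight, 0);
  const passedWeight = results
    .filter((r) => r.passed)
    .reduce((sum, r) => sum + r.icpWeight, 0);
  return totalWeight === 0 ? 0 : passedWeight / totalWeight;
}

// Hypothetical example: excluded users score 0 weight, but their sessions
// still feed the UX fixes backlog.
const q1 = icpWeightedScore([
  { user: "u1", passed: true, icpWeight: 1 },
  { user: "u2", passed: true, icpWeight: 1 },
  { user: "u3", passed: false, icpWeight: 0.5 },
  { user: "u4", passed: true, icpWeight: 0 }, // excluded from scoring
]);
console.log(`${Math.round(q1 * 100)}%`); // 80%
```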

Section 4: Results by Question

  • Deep-dive into each sprint question individually
  • For each question: the score, the key user quotes, and the nuance
  • Show the deeper signal beneath the headline number (e.g. "technical users could replicate the capability — they won't invest the time to build and maintain it")
  • If trust has levels (Level 1: "I can see sources" vs. Level 2: "I can see confidence and methodology"), break them out with before/after screenshots
  • Sprint decisions scorecard — every decision from the workshop, validated or not

Section 5: What We Changed

  • Title slide: number of major overhauls
  • For each overhaul: the user quote that surfaced the issue, a before/after comparison, and what was built
  • This is the output of the UX Fixes day — show it as "tested, fixed, improved"
  • Include live demo embeds where the change is best experienced interactively
  • Show the strongest recurring signal across the dataset (e.g. "5/9 users across both sprints prefer direct source access over chat")

Section 6: Quick Fixes

  • Title slide with the count (e.g. "16 Quick Fixes — All Implemented")
  • Each fix gets a card: numbered, the user quote that surfaced it, a screenshot of the fix, and the category (Trust, Context, Knowledge, Reduce, Emotion, Time, Organise)
  • "Every piece of user testing feedback addressed" — this demonstrates thoroughness

Section 7: Strategic Decisions

  • Separate from quick fixes — these need a team decision, not a code change
  • For each: the evidence, the competing perspectives, and a recommendation
  • Distinguish the things the client needs to decide from the things you've already resolved
  • Include "next product milestone" items that came from user testing but are too large for a UX fixes day

Section 8: Recommendations

  • Phased development priority (Ship / Iterate / Expand)
  • Mark which recommendations are already built in the prototype vs. need backend work vs. need a future sprint
  • Conclusion slide: "The product works for its target market" with the validation checklist

Section 9: Deliverables + Handover

  • Overview slide listing every deliverable with a one-line description
  • Individual walkthrough slides for each deliverable — open the repo, show the documentation structure, explain where to find decisions and user testing data
  • Link to the GitHub repository
  • "Questions?" slide sits between the overview and the detailed walkthrough — this is where the client conversation happens naturally

Section 10: Close

  • Summary slide: sprint count, user count, key validation metrics
  • Thank you + referral ask: "Ways you can help" — referrals, testimonials, introductions
  • Link back to journey overview

2. Present to the Client

Format: Video call or in-person, 45-60 min.

Tone: Evidence-first. Show user quotes, pass/fail data, before/after screenshots. Let the data make the argument.

Avoid: Defensive explanations for failing metrics. If Q1 failed 2/5, say so and explain what was changed. The failure is the finding — it's what user testing is for.

Live demos: Open the prototype during the presentation — don't just show screenshots. The presentation slides can embed the actual product as iframes for key sections.
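
A minimal sketch of such an embedded-demo slide, assuming the deck and prototype share one codebase and deployment; the route and headings are placeholders, not the actual product's.

```tsx
// DemoSlide.tsx: embeds the running prototype inside a slide.
// Route and titles are illustrative assumptions.
import React from "react";

export function DemoSlide() {
  return (
    <section>
      <h2>Live Demo</h2>
      {/* Same-origin iframe: since the deck ships inside the prototype,
          the embedded product is always the latest deployed build. */}
      <iframe
        src="/research-nodes"
        title="Live prototype demo"
        style={{ width: "100%", height: "80vh", border: "none" }}
      />
    </section>
  );
}
```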

3. Close the Engagement

At the end of the presentation:

  1. Confirm deliverables — Walk through each one: code in the repo, documentation structure, user testing data, where to find design decisions
  2. Share documentation — User testing analysis, presentation slides, design system brief, workshop materials, transcripts + written session notes
  3. Discuss next steps — Is there a follow-up sprint? What does the client need to build from here? Provide a phased development priority
  4. Ask for permission to quote — "Can we share your feedback on our website?"
  5. Referral ask — "If you know anyone who would benefit from this process, introductions go a long way"

Deliverables Checklist

Deliverable — Description
Coded prototype — Working React SPA in the client's repository, deployed to staging
Design system documentation — Visual style brief: typography, colours, layout rules, component patterns
User testing reports — Written analysis per sprint with ICP-weighted results, key quotes, recommendations
Presentations — Interactive slide decks built into the prototype codebase (one per sprint milestone)
Workshop materials — Miro boards, HMW clusters, sprint questions, scope documents
User testing data — Fireflies transcripts + handwritten session notes with behavioural observations
Example reports — Working data populated into the prototype for demo/sales use
Video walkthroughs — Screen.studio recordings of build progress across sprints

Sprint Retrospective Call

When: 1 week after delivery.

Purpose: What worked, what could improve, what's next.

Agenda (30 min):

  1. How has the team used the sprint output since delivery?
  2. What worked well about the process?
  3. What would you change?
  4. Would you recommend this to someone else? (testimonial opportunity)
  5. Is there a next sprint or referral?

This call is standard — not optional. It captures feedback that improves the methodology and creates natural upsell/referral conversations.


Locked Decisions

Decision — Why
Evidence-first presentation — User quotes and pass/fail data are more convincing than designer opinion
UX fixes completed before presentation — Shows "tested, fixed, improved", not "tested, here's a to-do list"
Presentation built in the prototype codebase — Keeps everything in one repo, enables live demo embeds, no separate slide tool needed
Lead with headline results, not questions — The client wants to know "did it work?" before the methodology deep-dive
Live walkthrough before methodology — Give the client product context before explaining how you weighted the data
Separate quick fixes from strategic decisions — Quick fixes are done; strategic decisions need the client's input; don't mix them
Sprint retrospective call (1 week post-delivery) — Testimonials, feedback, and next-step conversations were happening ad hoc; now standardised
Ask for permission to quote — Locked in on the first engagement; client gave permission to quote on website
Referral ask at the close — Natural moment when the client is most impressed; don't skip it
Presentation at end of project, not per sprint — Projects can span multiple sprints; mid-project workshops handle interim results

Research Tech Example

Live presentation: research-tech-zebra.vercel.app/journey/s3-final-01-title

Final Presentation (30 January 2026) — 50+ slides across 11 sections (the 10-section structure above plus an Additional Screens section):

The presentation was built as interactive slides inside the React prototype. Arrow keys navigated between slides. Two slides embedded the actual product as live iframes (executive summary + research nodes).

Section breakdown as delivered:

  1. Headline Results (7 slides): Title → Sprint Questions → Q1 Differentiation (90%) → Q2 Trust (100% Level 1) → Remaining Gap → ICP Scorecard → Sprint 2 vs Sprint 3 comparison table (session time halved from 60 min to 25-30 min, feedback matured from fundamental issues to feature requests)
  2. Live Walkthrough (1 slide): Link to open the Sprint 3 prototype — walked through the full flow live on the call
  3. Methodology (4 slides): "Surgical instrument" philosophy → ICP weighting (5 full-weight users, 4 reduced/excluded) → "Every user contributed — weighting tells us how" (non-ICP users drove the sources overhaul and DAG filtering improvements)
  4. Results by Question (8 slides): Q1 deep-dive with quotes ("It's obviously more niche... saves me a lot of time") + nuance (value = TIME + UX, not raw capability) → Q2 two distinct trust levels with before/after screenshots → Q3 navigation 100% solved → Sprint 3 decisions all validated
  5. What We Changed (9 slides): Chat reduces trust for source access (5/9 users across both sprints — strongest recurring signal) → Sources overhaul with before/after (removed chat intermediary, added confidence levels, rejected sources section) → Validation quotes → Live demo embed → Report categories restructured (risks/opportunities → team/market/product/financials) → Research nodes expanded to full-screen feature → DAG interactive navigation (category filtering, hover preview, priority sort, simplified colours)
  6. Quick Fixes (2 slides): 16 fixes, each with: user quote, screenshot, category tag. "Every piece of user testing feedback addressed."
  7. Strategic Decisions (3 slides): Review step (3/4 confused initially, but all navigated on play — recommendation: keep it) → Living document as next product milestone (2 ICP users independently requested) → Why it can't be built now (scope too large, needs enterprise validation)
  8. Recommendations (2 slides): Phased priority (Level 2 Trust features built → cross-reference tracking → review step decision → living document in future sprint) → Development build sequence (Ship / Iterate / Expand)
  9. Additional Screens (6 slides): Dashboard (standard + enterprise) → Setup workflow → Request dedicated research form → Landing page → Onboarding → Processing → Audio briefing
  10. Deliverables + Handover (9 slides): Overview of 9 deliverables → Questions pause → Individual walkthrough of each: coded prototype, design system, user testing reports, presentations, Miro board, workshop materials, testing data (transcripts + written notes), example reports, video walkthroughs
  11. Close (2 slides): "3 sprints. 9 users. 1 validated product." summary → Thank you + referral ask

Client reaction: "I'm amazed. Unbelievable." "I can't wait to give it to users." Szymon: "A bargain." Permission to quote on website — confirmed on the call.

What made it work: Every change was backed by a user quote and a before/after comparison. The client didn't have to trust the designer's taste — they could see what users actually said and how the product responded. The UX fixes day meant the presentation showed improvements already implemented, not just recommendations. Live demo embeds inside the slides kept the presentation grounded in the real product.

