INP (Interaction to Next Paint) thresholds: • ≤200ms → ✓ Good • >200ms and ≤500ms → ⚠️ Needs Improvement (380ms is in this range) • >500ms → ❌ Poor
Key vocabulary: • threshold — the boundary value that separates Good / Needs Improvement / Poor • needs improvement / within acceptable range — language for nuanced client reporting • LCP, CLS, INP — the three Core Web Vitals metrics Google uses in search ranking
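The INP bands above can be sketched as a small helper. This is a hypothetical function for illustration, not part of any library:

```javascript
// Classify an INP sample (in milliseconds) into its Core Web Vitals band.
// Thresholds: Good ≤ 200ms, Needs Improvement ≤ 500ms, Poor > 500ms.
function classifyINP(inpMs) {
  if (inpMs <= 200) return "good";
  if (inpMs <= 500) return "needs-improvement";
  return "poor";
}

// The 380ms example above lands in the middle band:
classifyINP(380); // "needs-improvement"
```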
A performance audit shows these Lighthouse scores for a SaaS dashboard:
A Performance score of 61 falls in the "Needs Improvement" band (50–89) — not broken, but a significant opportunity. Accessibility (94) and Best Practices (92) are in the Good band (90–100); SEO (88) sits just below it, at the top of Needs Improvement.
Key phrases for stakeholder communication: • "significant opportunities" — technical debt framing without alarm • "in good shape" — informal but clear status for non-technical reviewers • "bottleneck" — the limiting factor in a system; the constraint preventing improvement
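Those score bands map cleanly to code. Lighthouse colors category scores 0–49 red, 50–89 orange, and 90–100 green; a minimal sketch (the function name is invented for illustration):

```javascript
// Map a 0–100 Lighthouse category score to its band.
// 90–100 → green ("good"), 50–89 → orange, 0–49 → red.
function lighthouseBand(score) {
  if (score >= 90) return "good";
  if (score >= 50) return "needs-improvement";
  return "poor";
}

lighthouseBand(61); // "needs-improvement" — the Performance score above
lighthouseBand(94); // "good" — the Accessibility score
```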
Common Lighthouse performance issues: Render-blocking resources, large unoptimized images, unused JavaScript, long main-thread tasks, no caching headers.
A team has optimized their landing page. Before vs. after measurements:
How would you describe these results to the product manager?
What changed: • LCP: 5.8s (Poor: >4s) → 2.1s (Good: ≤2.5s) — major improvement, moved from Poor to Good • FCP: 3.2s (Needs Improvement: >1.8s) → 0.9s (Good: ≤1.8s) — moved to Good • TBT: 840ms → 120ms — 85.7% reduction; TBT directly measures main-thread blocking
Core Web Vitals vocabulary: • LCP (Largest Contentful Paint) — when the main content of the page renders • FCP (First Contentful Paint) — when the first text or image appears (not a Core Web Vital, but a key diagnostic metric) • TBT (Total Blocking Time) — the sum, across tasks longer than 50ms, of each task's time beyond 50ms • "moved out of the Poor zone" — precise language referencing the threshold categories
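The before/after arithmetic above can be checked with a short sketch. Helper names are hypothetical; the thresholds are the LCP bands cited above (Good ≤ 2.5s, Poor > 4s):

```javascript
// LCP zone, using the thresholds quoted above (in seconds).
function lcpZone(seconds) {
  if (seconds <= 2.5) return "Good";
  if (seconds <= 4) return "Needs Improvement";
  return "Poor";
}

// Percentage reduction between two measurements, rounded to one decimal.
function percentReduction(before, after) {
  return Math.round(((before - after) / before) * 1000) / 10;
}

percentReduction(840, 120); // 85.7 — the TBT reduction quoted above
lcpZone(5.8);               // "Poor"
lcpZone(2.1);               // "Good" — moved across two zone boundaries
```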
A developer says: "Our CLS jumped from 0.05 to 0.31 after the banner redesign." Which response shows the correct understanding of what this means for users?
CLS (Cumulative Layout Shift) measures visual stability — how much the content moves around unexpectedly as the page loads.
• 0.05 → ✓ Good (≤0.1 threshold) • 0.31 → ❌ Poor (>0.25 threshold) — significant jump across TWO thresholds
What a CLS of 0.31 feels like: You're about to click a button, and the page shifts — your click lands on a different element. This leads to accidental ad clicks, form submissions, or navigation errors. High CLS is one of the most frustrating UX issues.
Common causes of CLS: • Images/videos without explicit width/height attributes • Ads, embeds, or iframes appearing above existing content • Dynamically injected content (cookie banners, notification bars) • Web fonts causing FOUT (Flash of Unstyled Text)
Key vocab: • visual stability — how consistent the page layout is during loading • layout shift — when a visible element changes position unexpectedly • FOUT — Flash of Unstyled Text: text renders in fallback font then jumps when web font loads
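Per the Layout Instability spec, a single layout-shift entry scores impact fraction × distance fraction; real CLS then takes the worst session window of such shifts. A simplified sketch that ignores session windowing (function names invented for illustration):

```javascript
// Score for one layout shift: how much of the viewport was affected
// (impactFraction) times how far content moved relative to the viewport
// (distanceFraction). Both are in the range 0–1.
function layoutShiftScore(impactFraction, distanceFraction) {
  return impactFraction * distanceFraction;
}

// Naive sum of shift scores (real CLS uses session windowing, omitted here).
function cumulativeShift(shifts) {
  return shifts.reduce(
    (sum, s) => sum + layoutShiftScore(s.impact, s.distance),
    0
  );
}

// A banner pushing 75% of the viewport down by 25% of its height:
layoutShiftScore(0.75, 0.25); // 0.1875 — a single shift already past the 0.1 Good threshold
```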
A team wants to present their Core Web Vitals improvement to stakeholders. Which framing is most effective and accurate?
The best stakeholder framing combines: 1. Results in plain language ("all three Core Web Vitals into the Good zone") 2. Specific numbers (before → after for each metric) 3. Business context ("Google ranking signal", "reduce bounce rate")
Why the other options fail: • "Lighthouse score went up" — the Lighthouse score is a lab test, not field data. It's useful, but it doesn't prove real-user experience improved. • "No direct impact" — false. Core Web Vitals directly affect user experience, and they have been a Google Search ranking signal since 2021. • "Wait 2.5 seconds less" — technically inaccurate. LCP marks when the largest content element renders, not total page load time. Users don't wait from zero — they see content progressively.
Core Web Vitals in communication: • Always cite the zone (Good / Needs Improvement / Poor) alongside the raw number • Connect to business KPIs: bounce rate, conversion, organic ranking • Use "field data" (real user monitoring, RUM) vs "lab data" (Lighthouse) distinction when presenting to engineers
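Following that guidance — zone alongside raw number — a report line can be generated mechanically. A hypothetical formatter (names invented; the zone function uses the LCP thresholds cited earlier):

```javascript
// Format a before → after metric line with each value's zone attached,
// e.g. for a stakeholder summary slide.
function formatMetricLine(name, before, after, zoneOf) {
  return `${name}: ${before}s (${zoneOf(before)}) → ${after}s (${zoneOf(after)})`;
}

// LCP zones: Good ≤ 2.5s, Needs Improvement ≤ 4s, Poor > 4s.
const lcpZone = (s) => (s <= 2.5 ? "Good" : s <= 4 ? "Needs Improvement" : "Poor");

formatMetricLine("LCP", 5.8, 2.1, lcpZone);
// "LCP: 5.8s (Poor) → 2.1s (Good)"
```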