
Visual QA Checklist: What to Verify Before Every Release

A practical visual QA checklist for web teams. Covers layout, typography, responsiveness, cross-browser rendering, interactive states, and accessibility checks to run before every release.



Shipping a release without visual QA is a gamble. Functional tests confirm that features work, but they do not confirm that the interface looks correct. A button can pass every integration test and still overlap a form field on mobile Safari. A heading can render perfectly in English and break its container when translated into German.

This checklist covers the visual verifications that every web team should run before a release goes to production. Use it as-is or adapt it to your specific stack and audience.

Why a visual checklist matters

Visual bugs are uniquely damaging because they are immediately visible to users. A misaligned element or broken layout erodes trust in ways that a slow API response does not. Yet visual QA is often the most ad hoc part of the release process: done inconsistently, by different people, with no shared criteria.

A written checklist solves this by making visual QA repeatable and measurable. Everyone on the team knows what "visually verified" means, and no category of checks gets skipped because someone forgot.

Layout and spacing

Layout issues are the most common visual regressions. Check these systematically:

  • Page structure is intact: header, main content, sidebar (if present), and footer render in the correct order with expected spacing.
  • Grid and flexbox layouts wrap correctly: verify at desktop, tablet, and mobile widths. Look for unexpected column wrapping or items overflowing their containers.
  • Spacing between sections is consistent: margins and padding match the design specification. Watch for sections that are too close together or too far apart.
  • No overlapping elements: check that absolutely positioned elements (tooltips, dropdowns, modals) do not overlap content or extend beyond the viewport.
  • Sticky and fixed elements behave correctly: sticky headers and fixed footers should remain in position during scroll without covering content behind them.

For automated layout verification across breakpoints, see our guide on cross-browser testing workflows.
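The overlap check in the list above lends itself to automation: once your browser tooling reports element bounding boxes (for example via getBoundingClientRect), a plain rectangle-intersection test finds collisions. A minimal sketch in Python; the Rect type and function names are illustrative, not part of any particular tool:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    # Bounding box in CSS pixels, e.g. from getBoundingClientRect.
    x: float
    y: float
    width: float
    height: float

def overlaps(a: Rect, b: Rect) -> bool:
    """True if the two boxes intersect (touching edges do not count)."""
    return (a.x < b.x + b.width and b.x < a.x + a.width and
            a.y < b.y + b.height and b.y < a.y + a.height)

def extends_beyond_viewport(r: Rect, vw: float, vh: float) -> bool:
    """True if any part of the box falls outside the viewport."""
    return r.x < 0 or r.y < 0 or r.x + r.width > vw or r.y + r.height > vh
```

Run the pairwise check only over positioned elements (tooltips, dropdowns, modals) against their neighbors; checking every element pair on a page is usually noise.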

Typography and content

Text rendering issues are subtle but affect readability and professionalism:

  • Headings follow the correct hierarchy: H1, H2, H3 sizes and weights match the design system.
  • Body text is readable: font size, line height, and contrast meet accessibility standards. Test with actual content, not placeholder text.
  • No text truncation or overflow: long headings, translated strings, and dynamic content fit within their containers. Check especially with longer locales like German.
  • Links are visually distinguishable: underline, color, or other styling makes links identifiable without relying solely on color.
  • Code blocks and preformatted text render correctly: monospace fonts load, syntax highlighting works, and horizontal scroll appears for long lines.
  • Lists render with correct markers: ordered lists show numbers, unordered lists show bullets, and nesting is visually clear.

Responsive behavior

Responsive bugs account for a disproportionate share of user-reported visual issues:

  • Mobile layout (375px width): verify the full page at typical mobile width. The navigation collapses to a hamburger menu, images scale, and content stacks vertically.
  • Tablet layout (768px width): this is where most responsive bugs hide. Two-column layouts should transition cleanly, and touch targets should be adequately sized.
  • Desktop layout (1440px width): the primary desktop experience. Multi-column layouts display correctly, and maximum content widths are respected.
  • Wide desktop (1920px+ width): content should not stretch to fill ultra-wide screens. Check that maximum widths and centering work correctly.
  • Orientation changes: on mobile and tablet, switching between portrait and landscape should not break the layout.
  • Zoom behavior: browser zoom at 150% and 200% should not cause overlapping or hidden content.
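The widths above are worth encoding once so every screenshot run and manual spot-check uses the same values. A minimal sketch; the tier cutoffs (768, 1440, 1920) are assumptions you would align with your own CSS breakpoints:

```python
# Checklist widths from the list above; names are illustrative.
BREAKPOINTS = {"mobile": 375, "tablet": 768, "desktop": 1440, "wide": 1920}

def layout_tier(viewport_width: int) -> str:
    """Map a viewport width to the layout tier it should receive,
    using the same cutoffs the checklist tests against."""
    if viewport_width < 768:
        return "mobile"
    if viewport_width < 1440:
        return "tablet"
    if viewport_width < 1920:
        return "desktop"
    return "wide"
```

A screenshot loop can then iterate over BREAKPOINTS.values() and name each capture by its tier, so diffs are grouped the same way the checklist is.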

Cross-browser rendering

Different browser engines interpret CSS differently. Check at least these combinations:

  • Chrome desktop: your baseline rendering. Most users will see this version.
  • Firefox desktop: catches Gecko-specific issues with flexbox gap, font metrics, and custom properties.
  • Safari desktop: WebKit has unique behavior for backdrop-filter, position: sticky, and flex-wrap.
  • Safari mobile (iOS): all iOS browsers use WebKit. Viewport handling, safe-area insets, and scroll behavior differ from desktop.
  • Chrome mobile (Android): verify touch targets, viewport meta behavior, and mobile-specific styling.

You do not need to check every browser for every page. Prioritize by traffic data: cover the combinations that represent 90% of your audience. See our visual regression testing guide for more on building a browser matrix.
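The traffic-based prioritization can be made mechanical: sort combinations by share and keep adding until the coverage target is reached. A minimal sketch; the combo names are illustrative:

```python
def matrix_for_coverage(traffic: dict[str, float], target: float = 0.90) -> list[str]:
    """Pick the smallest set of browser/device combos whose traffic
    shares sum to at least `target` (greedy, largest share first)."""
    chosen: list[str] = []
    covered = 0.0
    for combo, share in sorted(traffic.items(), key=lambda kv: kv[1], reverse=True):
        if covered >= target:
            break
        chosen.append(combo)
        covered += share
    return chosen
```

Re-run this against fresh analytics each quarter; a matrix chosen once tends to drift away from where your users actually are.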

Interactive states

Interactive elements have multiple visual states. Each one needs verification:

  • Buttons: default, hover, focus, active, disabled, and loading states all render correctly and are visually distinct.
  • Form inputs: empty, focused, filled, error, and disabled states. Check that error messages appear in the right position and do not shift the layout.
  • Navigation: active page indicator, hover effects on links, dropdown menus opening and closing in the correct position.
  • Modals and dialogs: open smoothly, center correctly on all screen sizes, and display a visible backdrop. The close button is accessible.
  • Tooltips and popovers: appear in the correct position relative to their trigger, and do not get clipped by overflow-hidden containers or the viewport edge.
  • Loading states: skeleton screens, spinners, and progress bars display correctly during data fetching.

Images and media

Visual media requires its own set of checks:

  • Images load and display: no broken image icons. All images have appropriate alt text for accessibility.
  • Images are correctly sized: no distortion or unexpected cropping. Aspect ratios are maintained.
  • Responsive images serve appropriate sizes: mobile devices do not download desktop-sized images.
  • Video embeds work: play buttons display, aspect ratios are correct, and embeds do not overflow their containers.
  • Icons render correctly: SVG icons display at the right size and color. Icon fonts load without showing fallback characters.

Accessibility and color

Visual QA and accessibility overlap significantly:

  • Color contrast meets WCAG AA: text and interactive elements have sufficient contrast against their backgrounds. Check both light and dark themes if applicable.
  • Focus indicators are visible: keyboard navigation should show a clear focus ring on all interactive elements.
  • Dark mode renders correctly: if your application supports dark mode, verify that all components, images, and text are readable in both themes.
  • No information conveyed by color alone: error states, status indicators, and required fields use icons or text in addition to color.
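The contrast check is fully automatable with the WCAG 2.x formula: linearize each sRGB channel, compute relative luminance, then take the ratio. A minimal sketch:

```python
def _channel(c: int) -> float:
    # Linearize one sRGB channel (0-255) per the WCAG 2.x definition.
    s = c / 255.0
    return s / 12.92 if s <= 0.03928 else ((s + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple[int, int, int]) -> float:
    r, g, b = (_channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

def passes_aa(fg, bg, large_text: bool = False) -> bool:
    # WCAG AA thresholds: 4.5:1 for normal text, 3:1 for large text.
    return contrast_ratio(fg, bg) >= (3.0 if large_text else 4.5)
```

Note the boundary cases this catches: mid-gray #777777 on white comes out around 4.48:1, which fails AA for normal text but passes for large text.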

Performance-related visual checks

Performance issues can manifest as visual problems:

  • No layout shift during load: content does not jump as fonts, images, or scripts load. This is measured by Cumulative Layout Shift (CLS).
  • Web fonts load before paint: no flash of unstyled text (FOUT) or invisible text (FOIT) during page load.
  • Above-the-fold content renders quickly: the visible portion of the page should appear within 1–2 seconds on a good connection.
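The CLS number mentioned above is computed from individual layout-shift scores grouped into session windows: shifts belong to the same window when gaps between them are under 1 second and the window spans at most 5 seconds, and the page's CLS is the worst window's sum. A simplified sketch of that windowing rule; real tools read the per-shift scores from the browser's layout-shift performance entries:

```python
def cls_score(shifts: list[tuple[float, float]]) -> float:
    """Cumulative Layout Shift from (timestamp_seconds, shift_score)
    pairs: group shifts into session windows (gaps < 1 s, window <= 5 s)
    and return the largest window sum."""
    best = window = 0.0
    window_start = last = None
    for t, score in sorted(shifts):
        if last is None or t - last > 1.0 or t - window_start > 5.0:
            window, window_start = 0.0, t  # start a new session window
        window += score
        last = t
        best = max(best, window)
    return best
```

A CLS at or below 0.25 is commonly treated as needing improvement and 0.1 or less as good, so two small shifts in quick succession can matter more than one isolated shift.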

Pre-release verification workflow

Use this workflow to run the checklist consistently:

  1. Deploy to staging: verify against an environment that mirrors production as closely as possible.
  2. Run automated screenshot tests: capture baselines across your browser and device matrix. See our CI/CD automation guide for setup instructions.
  3. Review automated diffs: classify each diff as intentional, regression, or noise.
  4. Run manual spot-checks: automated tests cover static rendering. Manually verify interactive states, animations, and edge cases.
  5. Document results: record which checks passed, which failed, and what was fixed. This creates an audit trail for future releases.
  6. Approve or block: if all checks pass, approve the release. If critical visual bugs remain, block until they are resolved.
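Step 3's triage (intentional, regression, or noise) can be given a first-pass heuristic before a human looks at the diff. A minimal sketch; the 0.1% noise threshold and the selector-based matching are illustrative assumptions, not a standard algorithm:

```python
def classify_diff(diff_ratio: float, changed_selectors: set[str],
                  expected_changes: set[str],
                  noise_threshold: float = 0.001) -> str:
    """Rough triage for one screenshot diff.
    diff_ratio: fraction of pixels that changed (0.0-1.0).
    expected_changes: selectors this release intentionally restyles."""
    if diff_ratio <= noise_threshold:
        return "noise"          # anti-aliasing, font hinting, sub-pixel drift
    if changed_selectors and changed_selectors <= expected_changes:
        return "intentional"    # every changed region was expected
    return "regression"         # unexpected visual change: investigate
```

Anything the heuristic marks "regression" still goes to a reviewer; the point is to keep humans off the noise, not to auto-approve changes.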

Adapting the checklist to your team

This checklist is comprehensive by design. For your team, customize it based on:

  • Your stack: if you do not support dark mode, remove those checks. If you use a design system, add component-level checks.
  • Your audience: if 95% of your traffic comes from Chrome, reduce the cross-browser scope and increase mobile testing.
  • Your risk tolerance: revenue-critical applications need stricter checks. Internal tools can use a lighter process.

The key is consistency. A shorter checklist that runs every release is more valuable than a comprehensive one that gets skipped under deadline pressure.

Continue with ScanU

Automated screenshot testing handles the most time-consuming parts of this checklist: layout verification, cross-browser rendering, and responsive breakpoint checks. ScanU captures screenshots across browsers and devices so your team can focus on interactive states and edge cases. Explore plan options on Pricing, learn about the platform on Features, and see answers to common questions in the FAQ.