An Accessibility Audit Starts With a Number: 98

Manual accessibility auditing finds what automated scores do not show

Kateryna Bilous

Automated accessibility checkers routinely report scores above 90. Kateryna ran one on her assigned project interface on a Tuesday morning and got 98 out of 100. Then she spent the next six hours doing the manual audit that the automated tool cannot do, and found seven issues the score never mentioned.

What Automated Tools Miss

Tools like Axe and Lighthouse catch structural problems: missing alt text, absent form labels, incorrect heading hierarchy. They do not catch context failures. An icon-only button with no accessible name will pass some automated checks as long as the surrounding markup is technically valid. It will fail a screen reader test immediately.
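To make that failure concrete, here is a minimal TypeScript sketch of the check a screen reader effectively performs. It covers only the three most common sources of an accessible name (the full computation follows the WAI-ARIA accessible name algorithm, which handles many more cases), and the function name and example markup are illustrative, not taken from Kateryna's audit.

```typescript
// Simplified sketch of the accessible-name check a screen reader relies on.
// The real computation follows the WAI-ARIA accessible name algorithm;
// this version covers only the three most common sources.
function hasAccessibleName(button: HTMLElement): boolean {
  // 1. An explicit aria-label
  if (button.getAttribute("aria-label")?.trim()) return true;

  // 2. aria-labelledby references that resolve to text
  const ids = button.getAttribute("aria-labelledby");
  if (ids) {
    const text = ids
      .split(/\s+/)
      .map((id) => document.getElementById(id)?.textContent ?? "")
      .join(" ");
    if (text.trim()) return true;
  }

  // 3. Text content -- an <svg> icon or icon font contributes none
  return (button.textContent ?? "").trim().length > 0;
}

// <button><svg aria-hidden="true"></svg></button> has no accessible name:
// the markup validates, an automated score can stay high, and a screen
// reader announces nothing but "button".
```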

WCAG Success Criterion 1.4.11 (Non-text Contrast, carried into WCAG 2.2) requires that the visible keyboard focus indicator, like other user interface component states, meet a minimum contrast ratio of 3:1 against adjacent colors. Kateryna tested focus visibility across the interface using a browser-based contrast checker and found that three interactive elements failed this criterion outright. The visual design looked fine. The keyboard-only experience was disorienting.
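The 3:1 check itself is plain arithmetic once WCAG's definition of relative luminance is applied. The sketch below implements that definition in TypeScript; the sample colors are hypothetical, not the ones from this audit.

```typescript
// WCAG contrast ratio: (L1 + 0.05) / (L2 + 0.05), where L1 and L2 are the
// relative luminances of the lighter and darker color respectively.
type RGB = [number, number, number]; // 8-bit sRGB channels, 0-255

function relativeLuminance([r, g, b]: RGB): number {
  // Linearize each channel per the WCAG definition of relative luminance
  const [R, G, B] = [r, g, b].map((c) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : ((s + 0.055) / 1.055) ** 2.4;
  });
  return 0.2126 * R + 0.7152 * G + 0.0722 * B;
}

function contrastRatio(a: RGB, b: RGB): number {
  const la = relativeLuminance(a);
  const lb = relativeLuminance(b);
  const [hi, lo] = la > lb ? [la, lb] : [lb, la];
  return (hi + 0.05) / (lo + 0.05);
}

// A pale blue focus ring on a white background: roughly 1.5:1,
// well below the 3:1 minimum.
console.log(contrastRatio([173, 216, 230], [255, 255, 255]).toFixed(2));
```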

The Manual Audit Log, Hour by Hour

Audit findings by category

Category               Issues Found   WCAG Criteria
Keyboard navigation    3              1.4.11, 2.1.1
Color contrast         2              1.4.3
Screen reader output   2              4.1.2, 1.3.1

Seven issues. A score of 98. The gap between a passing number and a working interface is where most accessibility failures live. Kateryna documented each finding with a screenshot, the relevant criterion, and a proposed fix. The report took longer to write than the audit itself, which is probably the correct ratio.

Interested in structured UX learning?

Serverex Hub runs focused seminars on user experience design for professionals who prefer depth over broad overviews. Check the current program to see what sessions are available.
