This talk was presented at the AI The Docs 2025 online conference. We are thrilled to share the recording and the summary with you.
Visit the talk summary page to see all of the presentations from the conference.
Summary
In this talk, Kody Jackson (technical writer and manager at Cloudflare) explores how AI and automation help streamline documentation maintenance. Cloudflare's writers must balance essential maintenance with new feature development, and feature work occupies the majority of their workload. Maintenance tasks (updating API placeholders, ensuring consistency, and preventing outdated content) are therefore often under-resourced, resulting in inconsistencies and reduced documentation quality. The team's guiding question became: "How do we cheat at maintenance?" — that is, how to amplify the impact of limited effort.
Cloudflare manages over 5,000 English-only documentation pages with millions of monthly views and a GitHub repository that merges hundreds of pull requests each month. Their small but agile team combines content designers, technical writers, engineers, and production managers, using a tech stack that includes an Astro site hosted on Workers, the Starlight documentation framework, Algolia search, and GitHub for content.
To tackle maintenance challenges, Cloudflare developed the D.A.D.A. framework (Document, Automate, Democratize, AI). Style guides remain central, not only for alignment but to prompt critical thinking about decisions and define a “pain scale,” helping the team prioritize meaningful improvements. Automation targets repetitive and easily validated tasks: GitHub Actions handle required checks and periodic audits, custom Astro components enforce style and centralize sources of truth, and AI contributes to generating actions and components, reducing manual toil.
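The kind of repetitive, easily validated check described above can be sketched in a few lines. This is a minimal illustration, not Cloudflare's actual tooling: it assumes style rules live in a simple mapping (the terms below are hypothetical examples) and flags discouraged wording in documentation text, the sort of check a GitHub Action could run on every pull request.

```python
import re

# Hypothetical style rules mapping a discouraged term to the style
# guide's preferred form. A real team would keep these in shared config.
STYLE_RULES = {
    r"\be-mail\b": "email",
    r"\bwhite-list\b": "allowlist",
}

def find_style_violations(text: str, rules: dict[str, str]) -> list[tuple[int, str, str]]:
    """Return (line_number, matched_text, suggestion) for every rule hit."""
    violations = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for pattern, suggestion in rules.items():
            for match in re.finditer(pattern, line, flags=re.IGNORECASE):
                violations.append((lineno, match.group(0), suggestion))
    return violations

if __name__ == "__main__":
    sample = "Send an e-mail to support.\nAdd the IP to the white-list."
    for lineno, found, fix in find_style_violations(sample, STYLE_RULES):
        print(f"line {lineno}: '{found}' -> use '{fix}'")
```

A CI job would run such a script over changed files and fail the check when violations are found, which is exactly the "required checks" pattern the talk describes.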
Empowering engineers through internal tools like Clue (Content Legibility for User Ease) allows contributors to evaluate and improve content themselves, while AI assists with complex tasks that exceed the limits of simple regex. Cloudflare’s experience shows that not all AI interventions succeed (rewriting full pages or relying on hallucinating review bots often causes more work) but automated fixes for repetitive issues, like broken anchor links, deliver significant value with minimal oversight.
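The broken-anchor fix mentioned above can be illustrated with a small sketch (assumed behavior, not Cloudflare's implementation). It derives slugs from Markdown headings the way many renderers do and reports in-page links pointing at anchors that no longer exist, the kind of mechanical, verifiable task where automated fixes pay off.

```python
import re

def slugify(heading: str) -> str:
    """Approximate a common Markdown heading slug: lowercase, punctuation
    dropped, spaces collapsed to hyphens. Real renderers differ in edge cases."""
    slug = heading.strip().lower()
    slug = re.sub(r"[^\w\s-]", "", slug)
    return re.sub(r"\s+", "-", slug)

def find_broken_anchors(markdown: str) -> list[str]:
    """Return in-page anchor targets (#...) that match no heading slug."""
    headings = re.findall(r"^#{1,6}\s+(.+)$", markdown, flags=re.MULTILINE)
    valid = {slugify(h) for h in headings}
    anchors = re.findall(r"\]\(#([^)]+)\)", markdown)
    return [anchor for anchor in anchors if anchor not in valid]
```

Because the check is deterministic and easy to validate, a bot can repair these links (or open a pull request) with minimal human oversight, unlike full-page AI rewrites.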
Key takeaway: Kody Jackson emphasizes starting automation by identifying areas of pain and assessing potential value. Clear validation of tasks, critical evaluation using the pain scale, and targeted AI support allow technical writers to focus on high-value work, improving documentation quality while reducing manual effort.
Sign up for our Developer Portal Newsletter so that you never miss the latest API The Docs recaps and our devportal, API documentation, and Developer Experience research publications.