

Free and open source tools for writing, managing, testing, and rendering documentation

A recap from the devroom track at FOSDEM 2019

How do you document your project? What tools do you use?

Tool The Docs was a DevRoom at FOSDEM for the second year running, co-organized by Chris Ward, Mark Meyer, and Kristof Van Tomme. The track is a great opportunity for project leads, documentarians, and other documentation enthusiasts from across open source projects to come together and talk about tooling that makes contributing to docs easier.

A big shout-out to the organizers of the FOSDEM conference, and thank you to the presenters! It is with great pleasure that we share the recordings, slide decks, and notes from Tool The Docs.

Introduction to OpenAPI Specification

Lorna Mitchell

Developer Advocate at Nexmo

The OpenAPI Specification lets you describe APIs in a way that is both human- and machine-readable. It separates source content from output format: once you have the description, you can render it any way you like.

Design first, with all stakeholders. Consider:

  • Product owner has detailed insight into the functions of the API
  • Technical writer creates concise microcopy
  • Software engineer has a good grip on the edge cases

Using the OpenAPI spec is an investment. It is a standard format that is gaining adoption, and over time you will be able to generate more and more of it automatically.

"OpenAPI Spec is not complicated, but it is a little bit unwieldy, and it is verbose."

If you have an API with lots of endpoints, use tags for discoverability. Tagging is freeform, and you can attach multiple tags to one path.


  • Edit: Atom, Swagger UI, SwaggerHub, Stoplight (version 3 support in progress)
  • Validation: Speccy (a CLI with configurable rules, but opinionated), OpenAPI Spec Validator (good error message descriptions)
  • Preview: ReDoc, Speccy, OpenAPI-GUI
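A minimal sketch of how tags appear in an OpenAPI 3 document (the API name and path below are hypothetical, for illustration only):

```yaml
openapi: "3.0.0"
info:
  title: Demo Voice API        # hypothetical API
  version: "1.0.0"
tags:
  - name: calls
    description: Create and manage voice calls
paths:
  /calls:
    get:
      summary: List calls
      tags: [calls]            # an operation can carry multiple tags
      responses:
        "200":
          description: A list of calls
```

Renderers such as ReDoc and Swagger UI group operations by these tags, which is what makes a large API surface discoverable.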

Lorna's talk summary

video recording from FOSDEM


presentation slides

First talk in the #ToolTheDocs room at #FOSDEM is @lornajane talking about the Open API Spec

— Kristof Van Tomme (@kvantomme) February 2, 2019

Building Pantheon


Feedback on our documentation team's experience

Nicolas Massart

Developer at PegaSys

"Good documentation is one of the ways to achieve appreciation of your product's excellence."

The Ethereum blockchain is so complex that the docs team needed an exploration plan like old-time adventurers: gather data, share findings, and put those findings into a first basic version of the documentation.

First version of docs was a GitHub wiki.

Pros: an out-of-the-box wiki, table of contents, role-based access control.

Cons: restructuring required migrating the docs.

To choose the right documentation tool, they listed their requirements in priority order:

Requirements: Day 1

  • Markdown syntax
  • Manage versions based on tags
  • Code and docs versions have to be in sync
  • Enable reviews and comments
  • Contributions to docs have to be easy

Requirements: Day 2

  • Adapt the visual theme to PegaSys
  • Usage statistics
  • Enable feedback
  • Check broken links
  • Automatic ToC and navigation
  • Enable Markdown extensions
  • Have a search engine
  • Offline PDF/HTML version
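The talk summary does not spell out which tool ended up satisfying this list, but as an illustration, a Markdown-based generator such as MkDocs covers several of the Day 2 items out of the box. This mkdocs.yml is a hypothetical sketch, not the team's actual configuration:

```yaml
# mkdocs.yml — hypothetical sketch, not the Pantheon team's real config
site_name: Pantheon Documentation
theme:
  name: material          # themable to match company branding
markdown_extensions:      # Markdown syntax plus extensions
  - admonition
  - toc:
      permalink: true     # automatic ToC and navigation
plugins:
  - search                # built-in search engine
```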

In GitHub and Jira, they use tags to separate pure-code PRs from docs PRs.

Advice: gather accurate data, map it, report, and you can plan the next steps well.

Nicolas' talk summary

video recording from FOSDEM


presentation slides

Next up @MassartNicolas talking about Pantheon docs #Fosdem #ToolTheDocs

— Kristof Van Tomme (@kvantomme) February 2, 2019

Multilingual Kubernetes

The stack, how we got there, and what it took to get there

Zach Corleissen

Lead technical writer at Linux Foundation

Technical pivot achieved: Kubernetes docs natively support multilingual documentation.

Key questions to answer:

Q1: How to publish multilingual content using Jekyll as a static site generator?

Q2: How to approve PRs for languages the docs SIG (Kubernetes special interest group) doesn't read, e.g. Korean?

Prerequisites for a move to multilingual docs support:

  • scalable workflow
  • upstream contributions have to stay possible
  • be prepared for increasing repo maintenance costs
  • have a strategy for dealing with the increasing technical debt

Choices, solutions:

A1: Changed from Jekyll to Hugo: better multilingual support and shorter build times on large sites. Reduced technical debt.
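As a sketch of Hugo's native multilingual support (the languages and directories here are illustrative, not the actual Kubernetes configuration):

```toml
# config.toml — illustrative multilingual setup
defaultContentLanguage = "en"

[languages.en]
languageName = "English"
weight = 1
contentDir = "content/en"

[languages.ko]
languageName = "한국어"
weight = 2
contentDir = "content/ko"
```

Each language gets its own content tree and appears under its own URL prefix in the built site.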

A2: The Kubernetes project's custom CI system, Prow, allowed

  • setting review permission control at the subfolder level, keeping everything in one common repo,
  • filtering, to sort issues and PRs by localization language


Takeaways:

  • Invite the localization teams into the conversation as equal collaborators, not just stakeholders: you gain a deeper pool of technical knowledge for problem solving.
  • Reconsider other options; revisit and re-examine previous decisions.
  • Specify a fallback language for each file.

The process wasn’t easy.

  • Migrating content from Jekyll to Hugo: different Markdown parsers with differing settings
  • In Jekyll, the sitemap lives in a simple subcontent.yaml file, but in Hugo you have to specify a page weight up front for every individual .md file
  • Hugo is not perfect, but it is a good work in progress.
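The page weight mentioned above lives in each file's front matter; a sketch (values illustrative):

```yaml
---
title: "Concepts"
weight: 40   # lower weights sort earlier in section navigation
---
```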

Branching strategy: current+4. On Netlify, these branches (plus dev) each have their own deployment; the version selector takes you to the corresponding deployment.

Zach's talk summary

Write Drunk - Test Automated

Different aspects of Continuous Integration Testing for documentation testing

Sven Strack

DocOps Engineer at Pronovix

Writing can be hard and Continuous Integration isn't easy either.

Automated testing can only take you so far: quality assurance starts in your own text editor, with the right settings for both.

There is no golden solution.

Well written and maintained documentation should be part of your internal and public company culture.

The docs' source code should also be standardized and easy to read: quicker edits, faster deploys. Write meaningful and readable documentation.

  1. Plan: style guide, standards, check values, what if the test fails?
  2. Implement iteratively, preferably from the beginning of a project.
  3. Audience and Contributors: metrics, adjust check values.
  4. Be strict and be friendly. Adjust to developer workflow, provide templates, link to docs.
  5. Give implementable error messages.
  6. Check only the changed parts. Checks should depend on each other.
  7. Write checks in best coding practices and make sure the check itself passes the code linter.
  8. Do not break the build. If you did, fix it.
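A minimal sketch of points 5-7 in Python: a check that emits implementable error messages and is itself plain, lintable code. The rules and the 100-character limit are invented for illustration:

```python
MAX_LINE = 100  # hypothetical style-guide limit


def lint_markdown(text, path="<doc>"):
    """Return actionable error messages for simple style violations."""
    errors = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        if line != line.rstrip():
            errors.append(f"{path}:{lineno}: remove trailing whitespace")
        if len(line) > MAX_LINE:
            errors.append(
                f"{path}:{lineno}: line is {len(line)} chars; wrap at {MAX_LINE}"
            )
    return errors


sample = "# Title\nA short line.\nA line with trailing space.   \n"
for message in lint_markdown(sample, "README.md"):
    print(message)
```

Each message names the file, the line, and the fix to apply, so a contributor can act on it without digging through CI logs.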

Editor tip: use stricter testing rules locally; if they pass, they will pass with the CI/CD linter settings too.

Two short demos: basic examples of using Travis, and of using CI locally.

Q1: What if legacy docs were written without rules? A: You can only change them little by little; do not expect to work magic on all of them right now.

Q2: How to test code examples that are in a docs environment? A: Test code examples in a code repository and only then pull them into the docs.
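One lightweight way to keep examples tested at the source is Python's built-in doctest module, which executes the examples embedded in docstrings; slugify here is a made-up helper for illustration:

```python
import doctest


def slugify(title):
    """Convert a page title into a URL slug.

    >>> slugify("Tool The Docs")
    'tool-the-docs'
    """
    return "-".join(title.lower().split())


results = doctest.testmod()
print(results.failed)
```

If the example in the docstring drifts from the code, the test run fails, which is exactly the signal you want before the snippet lands in the docs.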

Sven's talk summary

video recording from FOSDEM


presentation slides

Live demo, only for the brave... @der_sven_ #Fosdem 


— Kristof Van Tomme (@kvantomme) February 2, 2019

Getting Closer to a Software Help Language

Untangling complexities of the LibreOffice Help

Olivier Hallot

Documentation coordinator at The Document Foundation

The gap between help documentation and new features widens as development speeds up: docs are contributed, and localization is an immense effort.

LibreOffice is a large, interdependent program, with teams working on translations into 100 languages: changes are risky, and errors in the docs have a large ripple effect.

They are developing a new toolchain; the audience suggested looking into possible cooperation with Open DITA.

Olivier's talk summary

video recording from FOSDEM


presentation slides

Open source XML Help language online editor developed by @Libreoffice @ohallot #Fosdem


— Kristof Van Tomme (@kvantomme) February 2, 2019

Who needs pandoc when you have Sphinx?

An exploration of the parsers and builders of the Sphinx documentation tool

Stephen Finucane

Senior software engineer at RedHat


  • reStructuredText provides the syntax
  • Docutils provides the parsing and file generation
  • Sphinx provides the cross-referencing and translation

How does Docutils work?

  • index.rst → Docutils → index.html
  • Docutils in itself provides some tooling which can help to understand what actually happens under the hood

  • Readers (read from the source and pass it to the parser)
  • Parsers (create a doctree model from the read file)
  • Transforms (add to, prune, or otherwise change the doctree model)
  • Writers (convert the doctree model to a file)
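Docutils' whole pipeline can be driven from a few lines of Python; publish_string wires a reader, parser, the transforms, and a writer together. This sketch assumes the docutils package is installed:

```python
from docutils.core import publish_string

rst_source = """
Tooling
=======

Docutils turns *reStructuredText* into other formats.
"""

# Runs the full reader -> parser -> transforms -> writer chain in one call
html = publish_string(source=rst_source, writer_name="html")
print(b"<em>reStructuredText</em>" in html)
```

Swapping writer_name (e.g. to "latex") picks a different writer without touching the source, which is exactly the separation the component list above describes.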

What about Sphinx?

  • Sphinx and Docutils share most of the same architecture (readers, parsers, transforms, writers), but the workflow is slightly different: Sphinx builds upon and extends Docutils' core functionality with builders, an application, and an environment
  • Multiple writers/builders are available for both Docutils and Sphinx
  • Sphinx also gives you cross-references and translation
  • Many more writers/builders, along with readers, are available
  • If you decide to write your own Sphinx extension, Docutils' documentation is excellent and provides plenty of information about writers and parsers

Stephen's talk summary

video recording from FOSDEM


presentation slides

One but last talk in the #ToolTheDocs room at #Fosdem is "Who needs Pandoc when you have Sphinx?" by @stephenfin

— Kristof Van Tomme (@kvantomme) February 2, 2019

To the future with Grav CMS

How we migrated from a Wiki to the open-source Grav CMS

Aleksei Akimov

Team Lead, Documentation and Developer Experience at Adyen

When selecting tools, list the requirements and functions, then pick the tools.

"But the tools will also define your processes, that will have an effect on your organization."

Five years ago, all docs were in PDF: hard to maintain, distribute, and analyze.

They changed to Confluence. Limitations: versioning, and the database is a black box.

The ideal tool doesn't exist.

Something may have worked well at a previous stage but no longer does (e.g. now we need to automate).

Docs as code: there are existing models (Markdown, git) that they could implement, but collaboration, contribution, and extensibility would suffer.

If you implement a docs as code toolchain, you will mostly get feedback and contribution from developers, from other roles it will be minimal. This can have a long-term effect on your documentation's quality.

Try docs as code coupled with a CMS: Adyen implemented this with the open source Grav CMS. They have 5k pages of docs, and this setup works well for them.

Lessons learnt

  1. Stack: be careful with changes, use the existing talent, and mind the effect of your tools on your processes.
  2. You can stay with Markdown and extend it when necessary.
  3. Build a proof of concept before changing for real.
  4. Automate, especially the migration.
  5. Migration problems can be fixed at any level: you can, e.g., change the source.
  6. Do user testing to learn from, e.g. how people use the tool's UI.
  7. Open Source as early as possible, as much as possible.

Aleksei's talk summary

video recording from FOSDEM


presentation slides

Last talk of the #ToolTheDocs room at #FOSDEM "To the future with Grav CMS" by @AlexeyAkimov

— Kristof Van Tomme (@kvantomme) February 2, 2019

