
Modern technical writing - Andrew Etter 2016

Metrics
Specifics

If you aren't keeping an eye on documentation metrics, you're making a huge mistake. User research is wonderful, but knowing exactly which pages are most popular, what your site's bounce rate is, and which behavioral flows are common is invaluable. Create a Google Analytics account, add the provided tracking code to your static site theme, and check the numbers regularly. I'm sure other tools exist, but Google Analytics is simple, free, and the industry standard. I've never used anything else.
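The tracking code is a short JavaScript snippet that Google Analytics generates for your account; you paste it into the base template of your static site theme so that every generated page reports a pageview. The exact snippet varies by Analytics version, so copy yours from the Analytics admin screen rather than from here. A representative sketch, with a placeholder measurement ID:

```html
<!-- Hypothetical example: Google Analytics (gtag.js) snippet in a theme's base template.
     G-XXXXXXXXXX is a placeholder; substitute the measurement ID from your own account. -->
<head>
  <script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
  <script>
    window.dataLayer = window.dataLayer || [];
    function gtag(){dataLayer.push(arguments);}
    gtag('js', new Date());
    // Records a pageview for every page rendered from this template
    gtag('config', 'G-XXXXXXXXXX');
  </script>
</head>
```

Because static site generators render every page from a handful of shared layouts, adding the snippet once to the base layout instruments the entire site.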

Metrics rarely serve as an obvious pointer to the correct course of action. Rather, they serve as a valuable check against your thoughts and intuition. If you create what you think is an amazing bit of documentation, and sure enough, it's one of the most popular pages on the site, perfect! If it's not very popular, maybe no one can find it, or maybe it's not as useful as you thought. Deeper investigation might reveal a major flaw, a minor tweak, or a reasonable explanation that requires no change at all. But not having metrics and using only your own criteria for success can be disastrous.

Note

Customers often frown upon a native application "phoning home" with usage metrics, especially to a third party like Google. Gathering these metrics from a native application requires costly development work, which is yet another reason to host your documentation exclusively on a website.

In the software industry, you often hear the questions, "What does good look like?" and "How do you measure success?" For a documentation website, any answer should include some mix of analytics, bug counts, reader feedback, alignment with corporate strategy (if applicable), and finally, personal intuition. I'm sure other ways of measuring success exist; I'm probably just ignorant of them. Detecting the difference between good and great documentation is an incredibly hard, unsolved problem, but that doesn't mean we can give up trying to solve it. Consider the following statements from two hypothetical technical writers:

· "In my professional opinion, the content is clear, concise, correct, and complete. The language is professional, conforms to our style guide, and projects a strong brand. Some of the tables were too wide for print, so we now enforce a two-column limit on all tables. Overall, I'm happy with the quality of the documentation."

· "The application logs show that the product only has 1,300 users, yet the documentation received 2,400 page views last month. In that same timespan, readers reported six inaccuracies, all of which I resolved within 72 hours. The five most popular search terms return the pages I would expect, and the design team recently helped me optimize the header margins for readability. Overall, I'm happy with the quality of the documentation."

I sincerely hope that you find the second argument more compelling. In any field, opinions become more credible when you attach quantitative metrics to them. Documentation is no different.