Making every word count: improving writing with data
Image: 'Spaghetti' by Craig A Rodway, via photopin (license)
Bad writing can be expensive. This is the conclusion of author Josh Bernoff, whose Harvard Business Review article ‘Bad writing is destroying your company’s productivity’ looks at the hidden costs of poor communication.
Bernoff surveyed 547 businesspeople and found that they spent an average of 25.5 hours per week reading for work, about a third of it on email. The majority agreed that the material they read was ‘frequently ineffective because it’s too long, poorly organized, unclear, filled with jargon, and imprecise’.
As Bernoff points out, one of the biggest problems with vague or imprecise writing is that it tends to reflect the unstructured or incomplete thinking of the writer:
Clear writing uses well-organized, active-voice sentences to explain what is happening, what ought to happen, and what people need to do. Conversely, inexact and passive language reflects gaps in thinking.
An organisation that values this skill, and hires, develops and rewards staff who can write clearly, will be more productive than one that tolerates a culture of poorly written communication.
John Saito certainly knows the importance of making every word count. In ‘Design words with data’ he explains how the Dropbox UX team use data to inform their choice of terminology when designing interfaces, where ‘one wrong word can break a user’s experience’.
Saito and his team use tools such as Google Trends, Google Ngram Viewer, research surveys and user studies to test their assumptions about which terms will resonate with users. For example, deciding whether to ask users to ‘log in’, ‘log on’, ‘sign in’ or ‘sign on’ can be simplified by comparing all four terms in Google Trends to see which one people actually search for. As it turns out, ‘sign in’ is the most common in Google queries, so it makes sense to use the same term in your interface.
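If you would rather script that comparison than click around the Google Trends site, something like the sketch below works. It assumes the unofficial pytrends package and the four example terms above; it needs network access, and it is just one way to run the same check rather than part of Saito's process.

```python
# Rough sketch: compare 'log in' / 'log on' / 'sign in' / 'sign on'
# using the unofficial pytrends package (pip install pytrends).
from pytrends.request import TrendReq

pytrends = TrendReq(hl='en-US', tz=0)

# Compare the four candidate labels over the last five years.
terms = ['log in', 'log on', 'sign in', 'sign on']
pytrends.build_payload(kw_list=terms, timeframe='today 5-y')

# interest_over_time() returns a DataFrame with one column per term.
interest = pytrends.interest_over_time()
mean_interest = interest[terms].mean().sort_values(ascending=False)

# The highest-scoring term is the one people search for most often.
print(mean_interest)
```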
Saito also suggests using readability tools such as Readability-Score.com or the Hemingway Editor to assess your writing. These report statistics such as grade level (for example, grade 8 means the text can be understood by a typical 8th grader in the US), average sentence length and instances of passive voice. While these tools can be a little simplistic (even Hemingway fails the Hemingway Editor test), they are still useful for highlighting sentences or paragraphs that could benefit from a rewrite.
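To give a sense of what those grade-level figures measure, here is a rough, self-contained sketch of the Flesch-Kincaid grade formula that readability tools of this kind typically report. The syllable counter is a crude vowel-group heuristic, so treat the output as approximate rather than as what Readability-Score.com or Hemingway would compute.

```python
import re

def count_syllables(word: str) -> int:
    # Crude estimate: count groups of consecutive vowels in the word.
    return max(1, len(re.findall(r'[aeiouy]+', word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    sentences = [s for s in re.split(r'[.!?]+', text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    # Flesch-Kincaid grade level:
    # 0.39 * (words per sentence) + 11.8 * (syllables per word) - 15.59.
    # Higher numbers mean harder-to-read text.
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words)) - 15.59)

draft = ("The report was written by the team, and it was decided that "
         "the proposal should be revisited at a later date.")
print(round(flesch_kincaid_grade(draft), 1))
```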
And while it might not be practical to conduct a research survey or user study to check if your email to a client is clear and unambiguous, asking a colleague to do a quick review before you hit send might be.