February 16, 2015

Taking the DITA troubleshooting topic for a spin

This guest post is by Carlos Evia, Ph.D., the director of Professional and Technical Writing at Virginia Tech.

The DITA Troubleshooting topic is one of the “new” features in version 1.3 of the standard. However, troubleshooting has been around the DITA world for a good eight years now.

A SourceForge archive of plug-ins for the DITA Open Toolkit still houses a Troubleshooting Specialization released in October 2007. The 2007 troubleshooting topic sounded like a visit to the doctor, with tags like tsSymptoms, tsCauses, tsDiagnose, and tsResolve (tsTake2Aspirins was too long, I guess).

It wasn’t until July 2014 that the DITA Adoption Technical Committee announced the troubleshooting topic as a new, formal content type in the standard. The committee then released the final version of the white paper Using DITA 1.3 Troubleshooting, authored by Bob Thomas. The white paper presents the rationale for the troubleshooting topic and provides detailed, accurate examples and templates, focusing on a structure of cause-remedy pairs of information to populate the topic.
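
For readers who have not seen the markup, here is a rough sketch of what one of those cause-remedy pairs looks like in a DITA 1.3 troubleshooting topic. The element names (troubleshooting, troublebody, condition, troubleSolution, cause, remedy) come from the standard; the topic id and the cardboard scenario are invented for this post.

<!DOCTYPE troubleshooting PUBLIC "-//OASIS//DTD DITA Troubleshooting//EN" "troubleshooting.dtd">
<!-- Sketch only: the cardboard content and ids are made up for illustration -->
<troubleshooting id="ts-blurred-print">
  <title>Blurred print on the box face</title>
  <troublebody>
    <condition>
      <p>Printed text and logos look smeared after die-cutting.</p>
    </condition>
    <troubleSolution>
      <cause>
        <p>Excess ink on the print rollers.</p>
      </cause>
      <remedy>
        <!-- steps-informal allows simple paragraphs; steps and steps-unordered
             borrow the step markup from the task topic -->
        <steps-informal>
          <p>Reduce the ink feed and run a test sheet.</p>
        </steps-informal>
      </remedy>
    </troubleSolution>
  </troublebody>
</troubleshooting>

A topic can hold one condition and as many troubleSolution pairs as the analysis produces.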

Around that time, I was invited to lead a consulting project for a client in need of an online manual for processes related to cardboard manufacturing (not their actual business; just an example for this post). The client wanted web-based “how to” information for operators in charge of the processes of corrugating and die-cutting cardboard (not the actual processes we documented). As I assembled a team of faculty and students in technical communication and computer science, the client revealed during an early meeting that the manual’s focus had to be on troubleshooting. That gave me the excuse I needed to take the troubleshooting topic for a spin.

Seven months into the process, as we wrap up the project, here are some lessons learned from my experience with the troubleshooting topic.

Conduct a root cause analysis

Task analysis, collecting and analyzing legacy documentation, and interviews with subject matter experts. Those traditional weapons of technical communication are probably not effective for obtaining troubleshooting information. When looking for cause-remedy pairs, the team (led by the client’s human resources personnel) conducted a root cause analysis. In the 3rd edition of their book Root Cause Analysis, Latino & Latino defined it by offering four different definitions! For the 4th edition (which adds a third Latino to the list of co-authors), they simplify the definition of root cause analysis as “the establishing of logically complete, evidence-based, tightly coupled chains of factors from the least acceptable consequences to the deepest significant underlying causes” (p. 15).

The specific cause-and-effect tool we used for this troubleshooting project was a five whys session, which can be used to “question each identified cause as to whether it is a symptom, a lower-level cause, or a root cause” and “continue the search for true root causes even after finding that a possible cause has been found” (Andersen & Fagerhaug, 2000, p. 117). The five whys exercise involved supervisors, operators with diverse levels of expertise, and personnel from the client’s human resources department. In the end, we had a series of tables documenting conditions and delivering the type of cause-remedy pairs specified by the DITA Adoption TC white paper.

Prioritize conditions and solutions

A long root cause analysis session with supervisors, users, and managers can produce far more material than a troubleshooting guide aimed at an audience of machine operators needs. Never forget the deliverable’s intended users and their unique needs. During the five whys exercise we came up with some conditions that had more than 15 possible cause-remedy pairs. They were all interesting and relevant to some aspect of cardboard production. However, some happened at least once a week and others were almost urban legends. Many of their solutions involved shift supervisors or technicians. We filtered the results based on a) the audience’s real needs within the scope of the project, and b) frequency on the production floor.

Realize that troubleshooting is an excellent starter topic

Students who had never been exposed to DITA had a short learning curve for authoring troubleshooting topics. The students knew the principles of effective, minimal documentation and persuasive writing. However, their knowledge of concept-task-reference was limited to a 5-minute presentation. To them, DITA was mainly a grammar for troubleshooting. Unlike students who started with a DITA 101 course and had to work with the standard for at least half a semester, the new troubleshooting authors had a smooth transition to topic-based writing.

Maybe it is because a task or concept as an isolated chunk of information needs a map and a transformation to make sense. The troubleshooting topic, on the other hand, has a cause and a solution and can incorporate elements of a task. The topic provides instant gratification to the author, who can see it as a small deliverable.
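
That “can incorporate elements of a task” part is literal: the remedy element accepts the same steps markup used in task topics, plus an optional responsibleParty. A sketch, again with invented cardboard content:

<troubleSolution>
  <cause>
    <p>The glue is not bonding the liner to the flutes.</p>
  </cause>
  <remedy>
    <responsibleParty>Machine operator</responsibleParty>
    <steps>
      <step><cmd>Stop the corrugator.</cmd></step>
      <step><cmd>Check the glue roll for dried residue.</cmd></step>
      <step><cmd>Run a test sheet before restarting production.</cmd></step>
    </steps>
  </remedy>
</troubleSolution>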

Remember that conrefs matter

Having new DITA authors who did not know much about the standard also brought problems. Students without previous DITA experience were good at learning the tags behind the troubleshooting topic and mastered cross-referencing links. But when it came to using conrefs, we had to appoint an inspector. We called them the “conref police.” After all, a dull blade on a cardboard-cutting machine can be the cause of many conditions, and the solution will always be “ask maintenance to replace the blade.”

The conref police was in charge of frequently talking to authors and proposing conref solutions without getting too deep into the concept and mechanics of reuse.
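
The pattern the conref police pushed for looked roughly like this (file names and ids are invented for this post): write the blade remedy once in a shared topic, then pull it into every condition that needs it.

<!-- In a shared topic, ts-shared-remedies.dita (invented name, topic id ts-shared-remedies),
     the blade remedy is written once: -->
<remedy id="replace-blade">
  <steps>
    <step><cmd>Ask maintenance to replace the blade.</cmd></step>
  </steps>
</remedy>

<!-- Every condition caused by a dull blade then pulls that remedy in by reference;
     the placeholder content keeps the referencing topic valid and is replaced at build time -->
<troubleSolution>
  <cause>
    <p>The cutting blade is dull.</p>
  </cause>
  <remedy conref="ts-shared-remedies.dita#ts-shared-remedies/replace-blade">
    <steps-informal><p/></steps-informal>
  </remedy>
</troubleSolution>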

Be aware that flowcharts kill good content

A troubleshooting topic can include several cause-remedy pairs (the condition of “humidity” in corrugating, for example, has many possible causes). When facing complex scenarios with many solutions, the DITA Adoption TC white paper proposes the use of static flowcharts inside an image tag. I have been teaching about DITA at the college level for eight years, and I always tell my students that good content goes to die in PowerPoint slides. Oh boy, I was not prepared for dealing with static flowcharts. Forget about good content that died of natural exporting causes; flowcharts kill good content without mercy. One minor change, filter application, or typo sends you back to OmniGraffle, and the graphics do not allow easy customization.
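
For the record, this is roughly what that approach amounts to in markup (file name invented): the flowchart is exported as a picture and referenced, so none of its decision logic lives in the DITA source.

<condition>
  <title>Humidity</title>
  <p>The board comes off the corrugator warped or soft.</p>
  <!-- Exported from a diagramming tool; any change to the logic means
       re-exporting and replacing this file -->
  <image href="images/humidity-flowchart.png">
    <alt>Decision flowchart for diagnosing humidity problems</alt>
  </image>
</condition>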

Maybe the solution is coming, with Jang Graat’s DITA-to-flowchart project, which he introduced at DITA Europe last year. We will wait and see.

Find a solution for the “remedy”

As a tag and title, “remedy” did not solve problems in this case. Maybe it was the unique situation of this project, where the client’s management staff and most of the authoring and development team were Hispanic. There is nothing etymologically wrong with the term, but the more we talked about it, the more “remedy” sounded to us like a cheap, quick fix. Think of the stigma attached to “remedial writing” in college. We decided to use “Solution” in the title of each section, but the tag is still remedy, and we can’t change that.
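
The remedy element does accept an optional title, so the workaround was simple (the step content below is invented for illustration):

<remedy>
  <!-- the tag stays remedy, but readers see "Solution" -->
  <title>Solution</title>
  <steps>
    <step><cmd>Ask maintenance to replace the blade.</cmd></step>
  </steps>
</remedy>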

Bend the rules (to help users)

Make documentation easy to find. Isn’t that one of the IBM characteristics of quality technical information (Carey et al., 2014)? For this project, the main web deliverable had a DITA-generated index and a search box. However, users needed to identify defective boxes by looking at pictures showing the most common conditions affecting the processes of corrugating and die-cutting. A quick solution, without specializing or modifying XSLTs, was to create a visual catalog of defects. On the main map, the topicref for the concept c-corrugatingtrouble.dita had a child for each documented condition.

The images came from each troubleshooting topic, where they had been (blasphemy!) included in the short description. It worked, and the users were able to identify the conditions starting from a defective box.
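
In markup, the catalog looked roughly like this; c-corrugatingtrouble.dita is the file name from the paragraph above, while the condition topic names and the image are invented for this post.

<!-- In the main map: one child topicref per documented condition -->
<topicref href="c-corrugatingtrouble.dita">
  <topicref href="ts-humidity.dita"/>
  <topicref href="ts-blurred-print.dita"/>
  <topicref href="ts-crushed-flutes.dita"/>
</topicref>

<!-- In each troubleshooting topic: the defect picture rides along in the
     short description (the blasphemy in question) -->
<shortdesc>Warped, soft board coming off the corrugator.
  <image href="images/humidity-defect.png"><alt>Warped board</alt></image>
</shortdesc>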

The troubleshooting topic, as included in the DITA 1.3 standard, was worth the long wait. It is a much-needed content type that authors can understand and adopt easily. Now I just have to update my teaching materials to expand the concept-task-reference language.

References

Andersen, B., & Fagerhaug T. (2000) Root cause analysis: simplified tools and techniques. Milwaukee, WI: ASQ Quality Press.

Carey, M., McFadden Lanyi, M., Longo, D., Radzinski, E., Rouiller, S., & Wilde, E. (2014). Developing quality technical information: A handbook for writers and editors. Upper Saddle River, NJ: IBM Press.

Latino, R. J., Latino, K. C., & Latino, M. A. (2011). Root cause analysis: Improving performance for bottom-line results (4th ed.). Boca Raton, FL: CRC Press.