CMS/DITA Conference Session 8: Improving Source and Translation Quality with DITA and Content Management


by David Kelly

This session was presented by Ann Adams of Kyocera and Dan Dube of DocZone.com. The gist of the presentation was similar to several others at this conference, with a slightly different twist in the type of CMS that was selected. The story followed a familiar pattern: a small writing staff and diverse legacy sources set against an exploding complexity of products, audiences, and document services/output types to fulfill.

Ms. Adams pointed to several keys to success and several points where they learned difficult lessons. Some of the things she thought helped:

* Attending a DITA User’s Group early on.

* XML training for authors. (When the XSL section of a video started, she turned it off – no need to confuse people.)

* DITA training.

* A workshop on modular writing.

The CMS they chose, DocZone, helped them accelerate the process because it is a hosted solution – they contract for services rather than purchasing the CMS and related tools. DocZone helped them with the data structure analysis and performed the initial conversion for them.

One point she made: they have continued to tweak the data structure as they have learned more about the structure of their legacy data. This was a common theme in the implementation stories I heard: take time to understand the structure of your data. More on this in another session.

Other contributions to success:

* Weekly meetings with the writers (“code reviews”)

* Weekly meetings with their vendor

* Using CMS workflow for reviews

Issues they encountered:

* Turnover in a small doc department causes significant loss of intellectual capital. One writer left, and another was out for six weeks on jury duty.

* The first conversion was easy, but subsequent documents uncovered more structure. They changed their minds about the data structure so often that they ended up purchasing the conversion software so they could reconfigure it at will rather than paying the conversion vendor each time they wanted a change.

Some interesting “side-effects”:

* The CMS provided high visibility into authors’ writing habits (e.g., making numerous corrections to a document over time creates a large version trail). Some writers did not like this.

* The ditamaps were good for book-oriented views, but writers wanted submaps showing only the topics they were working on. (A rough sketch of that idea follows this list.)

* The concept of “done” is being redefined. Consequently, it is hard to estimate when they will be “done.”
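
To make the submap idea concrete, here is a minimal sketch of how a working-set submap could be generated from a larger DITA map. This is my own illustration, not anything Kyocera or DocZone showed; the map content, file names, and working set are invented, and a real DITA map would also carry its DOCTYPE declaration and other metadata.

```python
# Sketch: build a "submap" containing only the topics a writer is working on.
# Purely illustrative -- the map, hrefs, and working set below are made up.

import xml.etree.ElementTree as ET

# A simplified book-level DITA map.
bookmap_xml = """<map title="Printer User Guide">
  <topicref href="install.dita"/>
  <topicref href="load_paper.dita"/>
  <topicref href="troubleshooting.dita"/>
</map>"""

# Topics the writer is actively editing.
working_set = {"load_paper.dita", "troubleshooting.dita"}

full_map = ET.fromstring(bookmap_xml)
submap = ET.Element("map", {"title": full_map.get("title") + " (working set)"})

# Copy over only the topicrefs the writer cares about right now.
for topicref in full_map.iter("topicref"):
    if topicref.get("href") in working_set:
        submap.append(topicref)

print(ET.tostring(submap, encoding="unicode"))
```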

Two advantages they have derived from implementing the CMS:

* Expanded capabilities (new delivery outputs and methods)

* The use of “author memory” has enabled them to reuse source language more effectively.

What is “author memory”? I hadn’t heard that term before. Ms. Adams said that it was “like translation memory,” then introduced Dan Dube (a product manager with DocZone, I believe), who picked up from there.

It turned out that “author memory” is essentially the same idea as translation memory, but in DocZone it is built into the CMS. So is translation memory, for that matter. Author memory holds the source material, while translation memory holds the translated material. Except, if I recall the statement correctly, in DocZone’s case the author memory and the translation memory are one and the same.

At any rate, the “author memory” approach helps the author find fuzzy matches against similar existing language and decide whether it is viable to reuse it. According to Ms. Adams, this has resulted in significantly more reuse of content than they expected.
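
For the curious, here is a rough sketch of the kind of fuzzy matching that sits behind an author-memory (or translation-memory) lookup. It is not DocZone’s implementation; the sample sentences and the similarity threshold are invented, and I am using Python’s difflib simply to show the general idea of surfacing near-duplicate source language for reuse.

```python
# Sketch of author-memory-style fuzzy matching: compare a newly drafted
# sentence against previously authored sentences and suggest close matches
# the writer could reuse instead of writing a near-duplicate.

from difflib import SequenceMatcher

# Hypothetical sentences already stored in the CMS.
author_memory = [
    "Press the power button to turn on the printer.",
    "Load paper into the main tray before printing.",
    "Press the power button to switch off the printer.",
]

def find_fuzzy_matches(new_sentence, memory, threshold=0.75):
    """Return (score, stored_sentence) pairs at or above the similarity threshold."""
    matches = []
    for stored in memory:
        score = SequenceMatcher(None, new_sentence.lower(), stored.lower()).ratio()
        if score >= threshold:
            matches.append((score, stored))
    return sorted(matches, reverse=True)

# A writer drafts a new sentence; the tool surfaces existing language to reuse.
draft = "Press the power button to turn the printer on."
for score, suggestion in find_fuzzy_matches(draft, author_memory):
    print(f"{score:.2f}  {suggestion}")
```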

There was a discussion about the adoption of DocZone’s version of translation memory vs. traditional mechanisms (e.g., Trados). Mr. Dube said translation vendors were wary, but were also interested because it would let them avoid paying Trados licensing fees to SDL, which now owns Trados and also offers translation services itself. Interesting.

The CMS/DITA implementation for Kyocera took about a year. That sounded like significantly less time than the figures I heard for “owned systems.” On the other hand, Kyocera is now locked into regular payments, so I guess it depends on what you are looking for and what kind of cost model you are willing to swallow. And, I suppose, how customizable you want your system to be.
