Scriptorium Publishing

content strategy consulting

Pokémon GO and community documentation

July 25, 2016

Yes, I’m playing.

Even if you aren’t twitchily checking your phone and resisting the urge to run outside to catch a Pikachu or Gyarados, you’ve probably heard all about the phenomenon of Pokémon GO. One of the most common criticisms of the game is that the in-app documentation is sparse at best. In response, the community banded together and began to document their theories and findings. You can readily find articles covering “eeveelutions,” theories on how to more easily capture Pokémon, and how to capture opposing gyms. It hearkens back to a time of meeting up in schoolyards to swap tips and rumors.

This community-driven documentation has done an amazing job of bringing players together in the absence of official documentation. While this works for a game, you need to be sure that your own documentation does not force your staff or users into this behavior.

In QA

The heart of QA is documentation. While a QA department can get by on oral tradition and tribal knowledge, there’s always the threat of brain drain. Should the unthinkable happen and your QA guru is hit by the lottery, what happens to the testing and validation portion of your workflow? In the best case, you lose efficiency while your remaining QA staff attempt to formalize what was once a loose testing structure. In the worst case, your workflow grinds to a halt while your staff tries to create documentation where there was none before.

The same also goes for general internal documentation, including style guides and information on how to configure business-critical software.

In production

Perhaps there was a miscommunication between your development and tech pubs teams, or something was lost in building out documentation requirements. Regardless, having mis- or undocumented features in your products or services can be a major headache, both in dealing with the fallout and attempting to rectify the inconsistency. Unlike with games and social media, your users probably won’t converge to do the documentation for you.

Taking action

So how do you prevent these issues?

For internal materials, consider using wikis or document repositories for storing critical information. Be sure that any information is easy for your staff to find. Having software licensing information or style guides stored won’t help you if no one can find them.

For external materials, have strong SOPs in place that govern how your content is reviewed before being published. You should also have some means to integrate feedback from users so that future publications can take real-world practice into account.

Another gratuitous Pokémon Go tie-in (Scriptorium year-end conference line-up)

July 18, 2016

Will Pokémon Go still be hot at the end of the year? If so, here are some opportunities to see Scriptorium and expand your Pokémon-collecting options.

TC Summer Camp, July 30, Fairfax, Virginia

Join Gretyl Kinsey for a content strategy workshop or stop by our booth to visit with her at the first-ever East Coast TC Camp.

Webcast: Balancing standardization against the need for creativity, August 3

OK, this one is virtual, so you’ll have to watch the webcast on one phone while wandering around with a second phone to play.

Join Alan Pringle for a collection of case studies in which we had to figure out how to balance creativity and standardization in different ways.

GIANT conference, October 17–19, Charlotte, NC

Sarah O’Keefe is presenting on Content Strategy Triage at 1 p.m. on Wednesday, October 19 at the GIANT conference in Charlotte. We are very excited to have an event within driving distance.

LavaCon Las Vegas, October 25–28, Las Vegas

Pokémon and Vegas. The mind boggles. Even though it’s Vegas, Bill Swallow will be presenting a session with the harmless-sounding title, The Value Proposition of Translation Strategies.

Viva LavaCon Las Vegas!

tcworld, November 8–11, Stuttgart, Germany

Finally (we hope), you’ll find Alan Pringle and Sarah O’Keefe at the tcworld conference in Stuttgart, Germany. Alan will be delivering a DITA case study centered around LearningDITA.com. Sarah is participating in at least one panel about DITA implementation.

We hope to see you at these and other Pokéstops.

New LearningDITA course: Introduction to reuse

July 11, 2016

We’ve introduced DITA and covered the basics of authoring topics and building maps on LearningDITA.com. (Many thanks to our 1,400 subscribers and counting!) Now we’re kicking off a new series of more advanced courses with Introduction to reuse in DITA.


Ducklings (Flickr: Eric Sonstroem)


This course includes the following lessons, which will help you get started with reusing your content:

  • Introduction to reuse
  • Creating reusable topics
  • Reusing topics and maps
  • Using content references

You’ll learn about the benefits that reuse can bring to your organization, from saving time and cost to improving localization. The lessons show you how to determine what content is reusable, demonstrate writing for reuse, and offer best practices for maximizing your content’s reuse potential.

This course will also help you take your first steps into the practical side of reuse. You’ll gain some hands-on experience with reusing topics and maps and creating conrefs. The guided practice with example code and on-your-own exercises will give you a chance to see reuse in action.
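If you haven’t seen a content reference (conref) in action before, here’s a minimal sketch of the idea; the filenames and IDs below are ours, not from the course. A note is written once in a shared topic, then pulled by reference into any other topic:

<!-- shared-content.dita: write the warning once -->
<topic id="shared-content">
  <title>Shared content</title>
  <body>
    <note id="power-warning" type="warning">Disconnect power before servicing the unit.</note>
  </body>
</topic>

<!-- any other topic can reuse it by pointing at the ID -->
<note conref="shared-content.dita#shared-content/power-warning"/>

If the warning ever changes, you edit it in one place and every referencing topic picks up the change.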

This course was contributed by Mike Rice and Annie Chen of easyDITA, with additional content from Simon Bate, Jake Campbell, and me here at Scriptorium. The accompanying slide deck on Content Reuse was created by Pam Noreault, Tracey Chalmers, and Julie Landman.

As always, we welcome your feedback! If there are any subjects you’d like to see covered in future courses, feel free to contribute your ideas or content to the ditatraining GitHub repository. You can also take a look at our project roadmap of current and planned future courses and let us know what you’d like to learn. To make sure you don’t miss any new courses, sign up for LearningDITA.com announcements (you can also sign up during site registration).

Special thanks to our sponsors: oXygen XML Editor, Adobe, IXIASOFT, SDL, easyDITA, and Flow.

Reduce translation costs with XML

July 4, 2016

$0.21 per word.

That’s the average cost in the US to translate content into another language, according to Slator, a translation news and analytics site. That number is not speculative; they analyzed the costs per word from over 80 actual proposals gathered by the US General Services Administration (GSA). You can view the source proposals here.


Some languages are more expensive to translate into than others, but this average cost is compelling. Are you getting the maximum value for your translation spending?

Translation is not a necessary evil; it’s a vital step in the process of conducting business globally. You’ve invested in creating quality products and content, and you (hopefully!) want to extend that quality into other markets. While translation can be expensive, cutting corners can result in lost revenue.

Searching for the lowest translation prices is fiscally prudent, but it should never be to the detriment of quality. Cost should be secondary to the class of service you receive. Look for translators who meet your needs with regard to subject matter expertise, language fluency, and local market knowledge. If you find several who meet your needs, then by all means weigh them by cost of services.

Reduce your overhead

robotic arm lifting dice

(source: Flickr/druscoe)

The true “trick” to reducing translation spending is to remove as much overhead from the process as possible. Overhead is best reduced through consistency of content and efficiency in formatting.

While you can accomplish this through the use of style guides, thorough editorial passes, and rigid templates, the effort involved is tremendous unless you can automate the processes. This is why an increasing number of companies with multiple language requirements are moving to XML.

Writing your content in XML removes the presentation layer (how published content looks) from the authoring side, and allows for easier reuse of common blocks of content. Writers and translators do not need to spend time formatting documents, and the amount of unique text is significantly reduced through reuse.
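For illustration, here’s a step from a hypothetical DITA task. The markup carries no fonts, spacing, or layout; styling is applied later by the publishing pipeline, so neither writers nor translators touch formatting:

<step>
  <cmd>Click <uicontrol>Save</uicontrol>.</cmd>
  <stepresult>The system saves your changes.</stepresult>
</step>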

An idealistic translation scenario

Consider a situation where two writers work on a new 500-page manual. This document then needs to be translated into 10 languages.

Assuming the old collegiate standard of 250 words per page and factoring in the average $0.21 per word, the translation budget for this 500-page manual is $262,500 (125,000 words × $0.21 per word × 10 languages).

By removing the need to format the translated content, you may be able to haggle the translation costs down a bit. Let’s assume a modest 10% reduction in cost; a rate reduction of $0.021 per word (yes, about two cents) will save you $26,250 on that 500-page manual.

If you can reduce the amount of content by 20% through reuse, you can save quite a bit on translation. You would essentially be removing 100 pages of content by reusing content from your other manuals, which could reduce translation costs by $52,500 using the average $0.21 rate.

Combining these approaches—20% reuse with a 10% rate reduction—your total translation cost for the exact same manual becomes $189,000. That is a savings of $73,500!
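Here’s the arithmetic behind those numbers:

  500 pages × 250 words/page × 10 languages = 1,250,000 words
  1,250,000 words × $0.21/word  = $262,500  (baseline)
  1,250,000 words × $0.189/word = $236,250  (10% rate reduction)
  1,000,000 words × $0.21/word  = $210,000  (20% reuse)
  1,000,000 words × $0.189/word = $189,000  (both combined)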

Make a realistic, compelling case

The projected savings in the example above is certainly compelling, but the example uses basic math and makes some big assumptions. Savings can vary greatly based on the total amount of reuse, how reuse is employed, and whether or not you can bargain with your translation vendor.

For a more conservative, realistic estimate, try our XML business case calculator. This calculator does not assume a drop in translation word count from reuse, as it is hard to quantify how many times a particular chunk of content will get reused. Instead, it factors in the amount of authoring time saved (multiplied by writer wages) through reuse.

Using the same scenario in the XML business case calculator, but assuming it takes the writers 2 hours to author every page at a $50/hour wage, the annual total savings from switching to XML is $53,250. (Not too different from the idealistic scenario.)

Once you factor in more manuals and more writers, the return on the investment in XML authoring becomes quite compelling. Go on, give it a try yourself.

The Rocky Road to Dublin, er, DITA

June 27, 2016

For LavaCon Dublin, Sarah O’Keefe and I delivered a case study presentation on some of the roadblocks we have encountered in implementing DITA at ADP. This article summarizes the key points of the presentation. The presentation and this blog do not represent the views of ADP, LLC.

Presentation slide with title "DITA Implementation Therapy Session" and a large gray DRAFT plastered across the slide

An early title, later replaced with something a bit more professional–and with a bit more local color

ADP, LLC is a global provider of cloud-based Human Capital Management (HCM) solutions and a leader in business outsourcing services, analytics, and compliance expertise. A centralized technical communication group within ADP undertook to move a group of 60 writers, mostly using RoboHelp, to structured authoring in DITA and a component content management system (CCMS).

It was relatively easy to build a business case, since the tools in place simply could not support the business demands for technical content. The primary driving factors were reducing localization cost, increasing reuse, and improving content quality.

However, the implementation was considerably more difficult. Some of the key challenges were:

  • Complex conditions
  • Resource constraints and expertise gaps
  • Matrixed reporting structure

Complex conditions

Traffic sign showing the "magic roundabout" at Swindon, which is a roundabout with five subordinate roundabouts

Complexity is expensive. Photo credit: dangerousroads.org

ADP has complex products with many variants. For example, some products are available as part of an integrated suite or as stand-alone products. Within these products, users with different roles have access to different functionality; for example, a manager can review timesheets for all direct reports, but an employee can only modify his or her own timesheet. And different content is sometimes needed for different geographic regions.

In building out a classification scheme for the content and for the various conditions, it became clear that we had to balance the value of reuse against the increasing complexity of conditions.

Increased reuse reduces the cost of creating content, but overly complex conditions increase the cost of authoring and maintaining content and the risk of tagging and publishing errors. Finding the optimal balance between these factors became an ongoing issue.
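To make that concrete, here’s a simplified sketch of how such variants typically end up as filtering attributes on elements (the values are illustrative, not ADP’s actual scheme):

<p product="suite" audience="manager">Review timesheets for all of your direct reports.</p>
<p product="suite standalone" audience="employee">Review and modify your own timesheet.</p>

Every additional axis (product, role, region) multiplies the combinations authors must tag correctly.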

Resource constraints and expertise gaps

Like all projects, this one had resource constraints—not enough people and not enough funding to complete an implementation quickly. Furthermore, as is common with new DITA implementations, the people assigned to the DITA implementation team had little or no previous experience with DITA or XML. The organization had no reason to hire for those skills when the tool of choice was RoboHelp. Any team members with DITA knowledge would have acquired it in a previous position.

Increasing DITA expertise is a critical need, but it takes time to transfer knowledge to the implementation team, and that can prolong implementations dramatically. Hiring a DITA expert is an appealing option, but that takes time, too. New hires also lack social capital within the organization. They do not know the company culture, and they don’t have the connections necessary to get things done.

ADP sought to fill the resource and expertise gaps to some degree with consultants. And to maximize the value of the consulting engagements, ADP prioritized knowledge transfer—ensuring that internal resources learn from the consultants—and getting quick wins to build momentum.

Matrixed reporting structure

The ADP DITA implementation team is part of a matrixed reporting structure, in which team members are accountable to the DITA project lead but “report” to a different manager. Because the demands of a DITA project rise and fall, especially in the early stages, a matrixed or dual reporting structure is common. The team members are assigned to the implementation project for a percentage of their time and are expected to complete regular assignments in the rest of their time. Addressing and resolving conflicts between the needs of the two assignments is often challenging. Excellent planning, coordination, and communication by the leaders is a must.

The rocky, winding road

In a session that preceded ours at LavaCon, the presenter asked for a show of hands of people who have done projects that went as planned. We had a similar point to make in our presentation.

The roadblocks we discussed certainly added some twists and turns to ADP’s experience. We identified a few key takeaways that can help teams mitigate—or at least cope with—these twists:

  • Start building expertise as early as possible
  • Insist on knowledge transfer if you engage consultants
  • Develop clear roles and expectations for team members
  • Have a clear communication plan if you work in a matrixed environment

Is your content overhead or a customer delight?

June 20, 2016

Delight is the difference between what you and your team cost, and the revenue you directly (or indirectly) produce (or protect). This concept is as important to charities as hedge funds.

Andy Kessler & Bruce Clarke

You may not think that “delighting” customers is part of your content creation responsibilities. But when customer delight is defined in terms of revenue and costs, it suddenly becomes a critical part of your job.

Determining whether your content is merely overhead or a customer delight may seem like a losing fight: it’s too subjective! However, there are questions you can ask to measure how delightful your content is, including:

  • Does the support team repeatedly answer questions addressed in the content? If customers are contacting support with queries that are (or should be) addressed in content, your content doesn’t explain things well, is hard to find, or both.
  • Where does your publicly available content show up in search engine results? If your content is not at the top of the search results, that means someone else’s content is getting all the attention. Your content probably needs an SEO tuneup. (I know some companies cannot open up their content on the web for competitive and security reasons. Even so, those companies may need a web page to direct users to an official resource, such as a customer-only portal.)
  • What do web analytics show? Web stats can show what content is popular and what isn’t getting any attention. If content isn’t getting read, can you do something to make it more useful, or should you refocus your efforts on the content customers are reading?
  • Do you have a customer feedback loop? Is there a way customers can send comments and questions about specific content? The mechanism can be as basic as a link to an email address that sends the comments to particular content creators. If you do a formal analysis of your content, be sure to include customers as part of the discovery process. Interviews with customers can be very illuminating, particularly when done by a third-party consultant (like me!) who may elicit more candid responses.
  • How do your partners or resellers use your content? If they are writing their own “cheat sheet” versions of official content for their customers (or are translating it because your company does not), your content is failing. You also lose control over how your product/service is being presented.

Measuring content use—and customers’ satisfaction with that content—is critical to proving delight. Without that customer delight, your job as a content creator is expendable overhead.

Content strategy patterns in the enterprise

June 13, 2016

What factors affect content strategy decisions? Every client has a different combination of requirements, and of course there are always outliers. But based on our consulting experience, here are some factors that affect content strategy decisions.

Is the content the product?

detail from Book of Kells

Book of Kells // flickr: Patrick Lordan

If yes, the content design will be relatively more important. The organization will want content development and publishing workflows that provide for maximum flexibility in order to deliver the highest possible quality in the final product.

Are the writers professional writers?

Full-time content creators may have a preferred way of doing things, but they usually have experience with a variety of tools, and understand that different tools are appropriate for different organizations.

Are the writers volunteers or paid professionals? Does writing the content demand specialized knowledge?

Domain knowledge is always important. If your writers have extremely specialized knowledge, are volunteering their time, or both, then they effectively have veto power over the authoring environment. Tread with care.

Are there regulatory or compliance requirements?

If so, you can expect a company that is relatively more willing to invest in content (since a failure could mean Serious Consequences), but these companies also tend to move slowly and be risk-averse. Review workflows will be relatively more important for regulated companies.

How many languages are supported or need to be supported?

More languages means more interest in governance because mistakes and inefficiencies are multiplied across each language.

Can misuse of the product injure or kill people?

If the product is potentially dangerous, the organization will look for ways to minimize the risk. At the most basic level, this results in documents with pages and pages of hazard warnings. More advanced organizations work on product design to minimize operational hazards and design their content to support correct product usage. Compliance and regulatory requirements may also come into play.

How many people contribute content? Are they full-time or part-time contributors?

A huge pool of part-time content contributors usually means looking for a lightweight, easy-to-use authoring tool that does not require per-seat licensing. A large group of full-time writers usually means investing in automation because even small productivity gains are valuable.

subjectScheme explained

June 6, 2016

Your project is coming along nicely. You have your workflow ready, your style guides are composed, and things are looking up. However, you have complex metadata needs that are starting to cause problems. You need a way to ensure that authors are only using valid attribute values, and that your publication pipeline isn’t going to suffer. This is a situation that calls for a subjectScheme.

In a note element, the type attribute only allows specific values.


By default, most DITA attributes accept any text value. If you have very specific needs for your attribute metadata, it can be helpful to allow only certain values. A subjectScheme allows you to define a list of values that are then associated with a specific attribute. When you include a subjectScheme in your map, it acts like an editor, going through your document and ensuring that your attribute values are valid.

Anatomy of a subjectScheme

A subjectScheme map consists of a root <subjectScheme> element that contains the following:

  • a <subjectdef> element, which defines your allowed values
  • an <enumerationdef> element, which binds your allowed values to a specific attribute

Take a look at this sample subjectScheme:

<subjectScheme>

  <!-- Defines the allowed values: internal, external (with its
       subtypes allowedex and disallowedex), and all -->
  <subjectdef keys="apps">
    <subjectdef keys="internal"/>
    <subjectdef keys="external">
      <subjectdef keys="allowedex"/>
      <subjectdef keys="disallowedex"/>
    </subjectdef>
    <subjectdef keys="all"/>
  </subjectdef>

  <!-- Binds those values to the app attribute -->
  <enumerationdef>
    <attributedef name="app"/>
    <subjectdef keyref="apps"/>
  </enumerationdef>

</subjectScheme>

We need to assign values to an attribute named app. The <attributedef> element identifies that attribute, and the <subjectdef> element after it assigns the allowed values. The nested <subjectdef> elements above the <enumerationdef> list five values (internal, external, allowedex, disallowedex, and all) that we consider valid. If we take this and reference it in a map, any app attribute containing anything other than those five values will cause the document to fail validation.

Also, notice that the <subjectdef> elements that define allowedex and disallowedex are nested in the <subjectdef> element that defines external. This indicates that allowedex and disallowedex are types of external apps, creating a semantic link between these types.
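In a topic, authors can then apply only the defined values. A hypothetical example:

<p app="internal">Configuration steps for our internal apps.</p>
<p app="allowedex">Configuration steps for approved external apps.</p>

A value outside the scheme, such as app="mobile", would cause validation to fail.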

To use a subjectScheme in a map, you need to use the following format when referencing it:

<topicref href="filename" format="ditamap" type="subjectScheme"/>
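For example (the filenames here are hypothetical), the reference sits in your main map alongside your regular topicrefs:

<map>
  <title>Administration guide</title>
  <topicref href="app-values.ditamap" format="ditamap" type="subjectScheme"/>
  <topicref href="configuring-apps.dita"/>
</map>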

Effects on conditional filtering

We’ve already discussed conditional filtering in a previous blog post. If you’re using a ditaval filter file to conditionally process content, the relationship between values defined in your subjectScheme map comes into play.

Let’s say that you have a ditaval file which includes the following:

<val>
  <prop action="include" att="app" val="internal"/>
  <prop action="exclude" att="app" val="external"/>
  <prop action="include" att="app" val="all"/>
</val>

If you run your content with this filter, elements that have an app value of internal or all will be included, and elements with an app value of external will be excluded. However, values of allowedex and disallowedex will still be included in your output, and you would need to include specific handling for those values.

If your map included a subjectScheme, though, the only conditioned elements that would come through would be those with an app value of internal or all. This is because of the semantic relationship that is defined within the subjectScheme. Because external apps are excluded and because the subjectScheme defines allowedex and disallowedex as types of external apps, allowedex and disallowedex are also excluded.

By using a subjectScheme map to enrich your attribute metadata, you not only gain a way to define valid values, but also a way to create relationships between them.

Localization testing: it’s not just translation

May 30, 2016

It takes considerable planning and effort to run a successful localization project, from following best practices to evaluating vendors to finding and fixing the weakest link in the localization chain. But the localization process does not end when you receive the translations. Localization testing is necessary for ensuring that your content and products are ready for a distributed global release.

People commonly assume that a quick proofread of the localized content is all that is needed before release, since it’s “just a translation” of the completed source material. This assumption is wrong. In fact, localized content needs to be treated with as much care and attention as the source from which it was derived.

test tubes

Testing is critical for achieving successful results. (image source: wikimedia)

When developing your source—whether it’s a manual, marketing material, or even an application—you likely (hopefully!) test it in some manner. As Jake Campbell recently blogged, product and content need a test bed and use cases to test with. Your localization testing should be conducted using the same criteria and scenarios as your source material.

Functional testing

The first step in localization testing is to ensure that everything is correct and is functioning properly. After a thorough content review and approval of the translations, the content needs to be applied to the products and content deliverables for functional testing.

During testing, check the following:

  • Does the content render? Make sure that the correct language displays, that there are no encoding issues, and that there are no special characters dropping out.
  • Does it render properly? Check for layout and formatting issues, text expansion concerns, font use, and so on.
  • Is it easy to navigate? Ensure that all navigation controls are clearly labeled and understandable in the target language, that any alphabetically sorted lists are in the correct order, and that content flow and usability conform to the target language expectations (particularly important for right-to-left languages).
  • Do all features work? Finally, make sure that everything functions as expected. Check all menus and dialog boxes, test the index and search features using terms and phrases common to the target languages, and proof all content in context to make sure it is still correct and understandable.

For subsequent translations, much of this can be smoke tested. But the content itself should be reviewed for completeness and correctness every time, in every language.

Testing against use cases

Once the localized content passes functional testing, it must be tested for usability and relevance. These tests rely on use cases and scripted scenarios.

The use cases you employ may vary from language to language and from location to location, but they should generally follow the same contexts used for the source language tests. These tests will ensure that your localized content and products are relevant, understandable, and useful.

Use real-life scenarios that people will encounter while using the products and content. All of these scenarios need to be tested in every language to make sure that the experience is very similar from language to language (some differences may be required depending on local requirements), and that instructions and next steps are clear.

Plan accordingly

Be realistic about scale, timelines, and effort when factoring localization testing into your project cycle. Every test designed for your source language needs to be applied to each target language. Some aspects of localization testing can be expedited based on known validity of content and the extent of changes from release to release. However, proper testing—even when expedited—takes time and effort to conduct.

If you are using third parties to conduct the testing (such as partners in your target markets), they need to follow the same test scripts and validate against the same criteria as you do. This is critical for tracking quality and pinpointing the source of any issues.

Do you have other tips for localization testing? Please share them in the comments!

Do you know a content strategy concrete head?

May 23, 2016

In lean management, a concrete head is someone resistant to change. In my years working on content strategy projects, I have noticed many people are resistant to changing how they develop and distribute content—sometimes without even knowing it.

Easter Island head statue

Don’t be a content strategy concrete head (flickr: William Warby)

If you hear any of the following things, there is a good chance your team includes a content strategy concrete head.

Disclaimer: I have heard the following thoughts expressed by multiple people working for different clients. This list is not focused on a particular client or two. Believe me.

“I don’t mind copying and pasting content.”

Copying and pasting is more efficient than writing something from scratch—in the short term. But creating another version of the same (or nearly identical) content sets up another maintenance headache. When a feature or product name changes, authors have to track down every mention and make the change. What are the chances they will miss one? Or two? Pretty high, especially if the information is across departments and developed in different content tools.

Reliance on copying and pasting is a sign your content needs a reuse strategy.

“I won’t give up the authoring tool I’m using now.”

It’s great that an author has mastered Tool You’ll Pry from My Cold, Dead Hands™. Yes, that tool may have served the authoring team and the company well. But business requirements change, and if a tool no longer supports the company’s overall requirements, it’s time to consider other options.

Ferocious loyalty to a tool can be a career-limiting move.

“The minute you put in a real process, things become unmanageable.”

The ad hoc processes in place may be working for individual authors, but probably not for the company as a whole.

Implementing consistent, repeatable processes can be inconvenient. But content creators must balance the short-term pain against the need to adapt for company growth.

Automatic dismissal of any process as “unmanageable” is really code for “I don’t want to be bothered.”

And speaking of not being bothered…

“Changing process is fine as long as it doesn’t affect what I’m doing.”

People are not really supporting change when they shift the burden of change onto others. Successful content strategies encompass the entire organization—not just a department or two. No department is an island.

“We put a PDF file out on the web. It’s searchable and easy to use.”

The PDF file is a dependable format, and it will likely be around for a while. However, reading a letter- or A4-sized PDF on a smartphone is not optimal. Also, searching a PDF for specific information is more difficult than, say, using a search field to find information across a set of web pages.

Putting a PDF file, help set, or any other content deliverable on the web is not the same thing as making content findable and useful. Find out how customers are accessing your content (or would like to), and adjust your content distribution methods accordingly.

What else have you heard from a content strategy concrete head? Put it in a comment, please!