Scriptorium Publishing

content strategy consulting

Breaking down content silos: expectation vs. reality

March 16, 2015 by

You’re probably hearing it more and more: silos are bad for your business. They discourage collaboration, lead to duplication and inconsistency, and prevent you from delivering a unified content experience to your customers. But what really happens when you try to break them down?


Expectation: One big happy content-creating family. Reality: “We don’t like change!”
(flickr: Daniele Nicolucci)

In Marcia Riefer Johnston’s recent post, Scott Abel discusses the importance of eliminating silos and restructuring content departments to foster collaboration. I agree that developing content without silos would be extremely beneficial for companies, content creators, and customers. However, the reality of achieving this goal can be slow, difficult, and painful.

Before you start the process of breaking down the barriers between your company’s content groups, it’s important to think about how and why those silos formed in the first place. Silos may not seem to add value to your business, but they do exist for several reasons:

  • Structure. Companies need to be divided into departments to stay organized and avoid chaos. This is especially true as a company grows larger and larger.
  • Accountability. With the department structure comes a reliable management system and chain of command. People can be held responsible for departments’ successes or failures.
  • Focus. It makes sense to group people together based on what type of content they are best at creating (for example, technical writers or marketing specialists).

Silos are often deeply entrenched. Maybe they’ve existed for years, or they’re an integral part of the company culture. Sometimes they lead to competitive relationships between departments (we’ve seen this happen with technical and marketing groups). The more ingrained these silos are, the more difficult they are to change. In many cases, it’s easier to leave silos as they are (for those who manage them as well as those who work inside them) than to try to dismantle them.

Here are challenges you can expect to face when you attempt to break down your company’s silos:

  • Lack of motivation. Some of your colleagues may say they support your efforts, but never do anything to back up their claims. Others may agree with you in principle, but feel that your goals are not realistic. Getting people to start thinking about breaking down silos is easy; convincing them to take action can be an uphill battle.
  • Change resistance. Be prepared for negative comments like, “We don’t have time for all this change with our deadlines,” “We’d never make up for the learning curve,” or “Why fix what’s been working for years?” Even if the people in one group can see the value in collaborating with others, they worry about what that means for their day-to-day work experience.
  • An extremely long process. It may be years before you see your efforts lead to any change in your company at all, much less achieve your end goal of no more silos. Future employees of your company may see more changes and fewer silos than you do.

Sometimes the best way to work toward breaking down silos is finding small ways to improve the relationships between departments. Introducing these compromises will likely be much more effective than suggesting drastic changes:

  • Encourage collaboration. Try having a weekly meeting with representatives from each content department so that they can each be aware of what the others are doing. Even if the silo structure is still in place, these meetings will keep the groups from working in isolation.
  • Manage your silos better. Remember that silos are important to your company from a management point of view, even if they hurt the business in other ways. If eliminating silos is not an option, suggest adding a new management position: someone who oversees all content and the coordination between departments.
  • Educate your colleagues. Talk about the benefits of working without silos. You can eliminate the risk of two separate groups producing duplicated or contradictory content, which will increase consistency and allow for reuse. More importantly, your content will have a unified look, feel, and message that improves customer experience and strengthens your brand.

Change can be slow, but it can also start small. Simply getting your colleagues into a collaborative mindset can be a great first step toward a future without silos. If you approach the problem of silos with reality in mind, you will achieve better results.

Going global: the demand for intelligent content

March 10, 2015 by

Companies experience their greatest growing pains when expanding business to global markets. It’s an exciting time but can also be a rude awakening as differing local requirements emerge for both product and content.

On the content side, keeping all of these requirements in check can be a daunting task. Proper planning and execution is critical for meeting these requirements and delivery dates, and for keeping your sanity.

Translation is a small part of the picture

The first thing most people think about when going global is translation. There are core languages of each country to consider, and then there are market niche languages (Russian in Italy, for example). There are right to left (RTL) languages to consider as well, and every language has an impact on layout, whether by direction or by length of the translation.

While planning for and managing the translation effort is a formidable (and expensive) task, in the grand scheme of things it is probably the easiest part of the globalization effort to execute.

Customization is key

wall of sorted Lego bricks

Keeping everything organized and labeled makes for easy assembly. (source: Flickr/firepile)

If your company is selling a physical product, it will likely have varying features based on where it is being sold. The documentation for the product will need to accommodate these variances. If you’re authoring with traditional desktop publishing tools (Word, FrameMaker, InDesign, etc.), you can handle much of this with conditions, but it will take considerable effort to tag it all accordingly.

Your company may sell to another global company with its own configuration requirements, which also may vary based on location. Suddenly you have a new set of conditions to manage, which may conflict with conditions you already have in place.

What if your company has more than one customer with this demand? Or many? Suddenly the idea of using conditions to manage all of the differences doesn’t look very efficient, and maintaining multiple custom documentation sets is extremely labor intensive.

The biggest culture shock when entering a foreign market, and one that can easily catch you off guard, is being met with different legal requirements. Imagine a sale or delivery being stalled because not all of your content has been translated into required languages, or your cautions and warnings do not meet local regulations. What if your marketing material is rejected from the market because it makes illegal claims? The United States has fairly relaxed requirements compared to other countries, and what is appropriate or forgivable in the US can be completely inappropriate if not illegal in other countries.

Intelligent content can help

Traditional authoring workflows simply can’t scale to meet global content demands without requiring additional labor, time, and cost to deliver. Intelligent content—semantically rich XML-based content—can better meet those demands. Some key benefits include:

  • Small reusable topics and content “chunks”: Content can be written once and reused as often as needed. This eliminates the need to rewrite the same information multiple times, saving on authoring and updating time. And because the content is authored only once, it only needs to be translated once per target language.
  • Separation of form and content: When you write in XML, you focus on the content itself and not how it will look in print, online, in mobile devices, and so on. The visual formatting is applied when you publish, usually automatically by a transformation process using stylesheets. Different languages can have their own unique transformations applied.
  • Custom tags and definitions for specific countries, regions, or customers: Rather than rely solely on conditional text to handle all of your unique content requirements, much of the customization can be handled by metadata and through special processing of certain elements (XML tags). Content identified through these means can be swapped in place of standard content as needed when publishing.
  • Everything in one place: The move to intelligent content usually involves implementing a component content management system (CCMS). This means that your content is managed in one place and is easily findable by authors, and all country, region, or customer specific information can be tagged, managed, and used by all authors.
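As a minimal sketch of the “write once, reuse everywhere” idea (the filenames, IDs, and warning text here are invented for illustration), a shared warning can live in one topic and be pulled into any other topic with a content reference:

```xml
<!-- warnings.dita: the warning is written and translated once -->
<topic id="warnings">
    <title>Shared warnings</title>
    <body>
        <note id="hot-surface" type="warning">Allow the unit to cool
        completely before servicing.</note>
    </body>
</topic>
```

Any topic that needs the warning then references it instead of rewriting it, for example with `<note conref="warnings.dita#warnings/hot-surface"/>`. Updating the source topic updates every deliverable that reuses it.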

There are many other benefits to using intelligent content for global content distribution, and all aspects can be customized for your unique implementation. Employing intelligent content can ease the pain of delivering content globally, accelerate your time to market, and help you keep your production costs in check.

If you would like to learn more, visit the related topics in this post or send us a message. We’re happy to help!

Buyer’s guide to CCMS evaluation and selection (premium)

March 2, 2015 by

“What CCMS should we buy?”

It’s a common question with no easy answer. This article provides a roadmap for CCMS evaluation and selection.

First, a few definitions. A CCMS (component content management system) is different from a CMS (content management system). You need a CCMS to manage chunks of information, such as reusable warnings, topics, or other small bits of information that are then assembled into larger documents. A CMS is for managing the results, like white papers, user manuals, and other documents.

1. Why do you need a CCMS?

assortment of yarns in store

Why do you need a CCMS? // flickr: lollyknit

The very first step in CCMS evaluation is to determine why you need a CCMS. (Sorry, this sounds like a Fight Club reference.) What business problem are you trying to solve? I am amazed at the number of people who think they need a CCMS but cannot articulate why. Some common reasons for CCMS implementation are increases in:

  • Volume of content
  • Translation requirements
  • Velocity of content
  • Versioning

Before you do anything, understand the scope of your problem and whether a CCMS can solve it. What is broken? Is it more broken than last year? Why? Some common examples of problems that a CCMS will not solve are:

  • Content is inaccurate.
  • Writers lack necessary technical expertise.
  • Departments are siloed because of a lack of mutual professional respect.
  • Content creators refuse to follow style guides because their content is special.
  • In short, all of the People Things™.

If you have issues with People Things, you need to address those in addition to (and preferably before) any CCMS initiatives.

2. What is your ROI?

CCMS implementation is expensive. The cost of CCMS software varies, but implementation is expensive no matter what. Assuming you have a problem that a CCMS will address, your next step is to calculate your approximate return on investment. You’re looking for a rough order of magnitude here, and much of this calculation is driven by your scale. If, for example, you have 50 full-time content creators, and you assume a loaded cost of $100,000 per person per year, then a 10 percent improvement in efficiency is equal to around $500,000 in yearly savings.

By contrast, if your annual translation budget is $50,000, you probably won’t find ROI justification in cutting translation costs.

You can build ROI based on:

  • Cost avoidance
  • Increased revenue
  • Time

Cost avoidance is an efficiency argument: “If we buy X, we can do task Y with less effort.”

Increased revenue is an investment argument: “If we buy X, we will get more income from Y.”

Time is usually a time-to-market argument: “If we buy X, we can deliver Y in two months instead of six months.”

Our XML business case calculator lets you estimate your potential ROI from a transition from desktop publishing to XML.

Once you have some sort of idea of the cost improvement (the “return” part of ROI), you can move on to figure out the other side of the equation–how much do you need to invest? Your initial cost estimates will tell you what class of CCMS you should be looking at. For general guidance, refer to our XML workflow costs article.

3. There is no bad CCMS. Only bad fits for you.

The trick to buying the right CCMS is to find the one that meets your requirements. Every system on the market has strengths and weaknesses. There is no single Best CCMS, nor is there a Bad CCMS. What we have is systems that are better in some scenarios than others. Therefore, you need to figure out the following:

  • What are your priorities?
  • Which system best matches your priorities?

Figuring out your own requirements is your job. Once you have some vague idea of what you need and have done preliminary research, consider issuing a formal RFI (request for information) to get better vendor information.

At this point, you should not be issuing an RFP (request for proposal). You don’t know exactly what you want to ask for yet.

4. What features do you need?

Knitting markers in various shapes

Pay attention to features // flickr: trilliumdesign

“What features do you need in a CCMS?”

“Version control.”

Well, duh. Let’s ask a better question: “What features do you need beyond the basics?” Answers in the past few years from different clients have included items such as:

  • “We have no IT support, so the system needs to be SaaS (software-as-a-service).”
  • “The system must not be SaaS.”
  • “The system must integrate with [SAP | web CMS | others].”
  • “The system must generate output via [this protocol] to [this format].”
  • “The interface must be available in [this language].”
  • “The system must cost less than [this amount].”
  • “The system must align with the branching techniques we use in [this software development tool].”
  • “The system must be installed and running by [some very soon date].”
  • “The system must support [this content model].”

You want to thoroughly understand your exact requirements and constraints so that you can narrow down your choices quickly. Don’t make a list of boring stuff that every CCMS does. It’s a waste of everyone’s time, especially yours.

Instead, do the following:

  • Identify your most unique requirements (regulatory? system architecture? content volume? languages?)
  • Write up detailed use cases for your most critical requirements and use those to evaluate the systems

Your use case scenarios will help you evaluate the candidates and figure out whether they really meet your requirements.

5. Is extensibility important?

How will your CCMS interact with other systems? Which other systems do you need to interact with? If extensibility is important, focus on the systems that provide a good starting point for this type of integration.

6. What is your exit strategy?

Start planning your exit strategy on the way in. If you ever decide you want to take your content and shift it to another system, will you be able to do so? What is your exit strategy?

7. Vendor relationship

Can you communicate successfully with the CCMS vendor? If not, the implementation and integration process is going to be extremely painful.

Don’t play games with your vendors. The CCMS world is extremely small. People move from one software vendor to another constantly. As a result, there is an excellent industry grapevine. Bad behavior, whether by CCMS vendors, by consultants, or by potential CCMS buyers, is not an option–word gets around quickly.

Not too long ago, I asked a colleague about working with a particular prospect, who seemed a bit… difficult. His response? “Run. Run fast. Then hide.”

Test your vendor relationship with a low-budget, low-commitment pilot project.

Additional resources

This brief overview only scratches the surface of what you need to consider. More resources:

How fast food can help your content strategy

February 23, 2015 by

These days, I generally avoid fast food, but it’s hard to pass up good French fries every now and then. Look beyond those yummy fries, and you can learn some valuable lessons that apply to content strategy.

A consistent—yet location-tailored—experience

In 1986, I took a whirlwind tour of Europe with a group of other teenagers. About a week into our trip, our not-so-refined palates were craving some American fast food. In Venice, we were elated to see this canopy:

Wendy restaurant in Venice, Italy

When in Venice, eat as the American teens do. Wendy(‘s) of Venice, 1986.

Even though the ‘s was missing because the English-language possessive makes no sense in Italian, we instantly recognized the lettering. The food was also comfortingly familiar. I was on another continent, but much of the food could have been from the Wendy’s just down the street from my family home in North Carolina.

That said, the menu board was in Italian, and there were some differences in the food that reflected Italian culinary flair. The burgers were also smaller than their American counterparts, perhaps to better match European serving sizes or to reflect supply costs.

Does your company’s content offer a similarly consistent experience that takes regional differences into account? In this global economy, distributing content in one language with little thought about delivery in other locales is usually a losing (and revenue-limiting) proposition. Merely translating words into another language is not enough, either. You also need to consider several issues, including:

  • Are images and colors culturally appropriate?
  • Does your content contain turns of phrase that are region-specific?
  • Can the formatting of your content handle text expansion from translation or a shift from a left-to-right language to a right-to-left language (or vice versa)?

My visit to the Venetian Wendy—without the ‘s, thank you very much—was an early lesson in adapting for different locales. A trip to Venetian fast-food outlets might give you some perspective, too. (Good luck with that expense report. I doubt “Alan Pringle told me to go to Venice” will cut it.)

“Have it your way”

I’ve already dated myself with the previous post about my teenage years, and I’m about to do it again with this ad:

“Have it your way” was a Burger King slogan for 40 years. As long as I can remember, the chain has promoted its ability to accommodate customers’ tweaks to menu items.

I’m no expert on fast-food workflows, but I’ll bet Burger King has done all sorts of studies on how to crank out the food quickly while fulfilling customers’ special requests. The chain has probably instituted specific processes based on those studies. Burger King knows it will lose business if it cannot correctly and quickly prepare customized food items.

Implementing intelligent content can help you deliver customized information to your customers—even more quickly than Burger King can put extra ketchup on your Whopper.

Suppose your company sells multiple models of the same item. Some features are consistent across all models, but other features are specific to particular versions. With intelligent content in place, you could create a web portal or app through which customers specify the model they own, what accessories they have, and so on, to generate instant custom content. It takes a lot of planning and work to set up systems that deliver on-the-fly custom information, but it is possible—and some companies are doing it now.

In addition to customizing the content itself, you also need to consider how information is displayed on phones, tablets, computers, and who knows what devices in the future. If your content doesn’t display well on differently sized screens, you aren’t letting customers have it their way. (I’m as displeased about reading a big PDF on my phone as I am about a restaurant that won’t hold the mayo on a burger. Blech.)

Any other “special sauce” lessons you’ve applied to your content strategy? Leave your thoughts in the comments.

Taking the DITA troubleshooting topic for a spin

February 16, 2015 by

This guest post is by Carlos Evia, Ph.D., the director of Professional and Technical Writing at Virginia Tech.

The DITA Troubleshooting topic is one of the “new” features in version 1.3 of the standard. However, troubleshooting has been around the DITA world for a good eight years now.

A SourceForge archive of plug-ins for the DITA Open Toolkit still houses a Troubleshooting Specialization released in October 2007. The 2007 troubleshooting topic sounded like a visit to the doctor, with tags like tsSymptoms, tsCauses, tsDiagnose, and tsResolve (tsTake2Aspirins was too long, I guess).

It wasn’t until July 2014 that the DITA Adoption Technical Committee announced the troubleshooting topic as a new, formal content type in the standard. The committee then released the final version of the white paper Using DITA 1.3 Troubleshooting, authored by Bob Thomas. The white paper presents the rationale for the troubleshooting topic and provides detailed, accurate examples and templates, focusing on a structure of cause-remedy pairs of information to populate the topic.
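Structurally, a troubleshooting topic pairs each cause with its remedy. A rough sketch of that markup (the condition, IDs, and step text here are invented, not from the project described below):

```xml
<troubleshooting id="paper-jam">
    <title>Paper jams during feeding</title>
    <troublebody>
        <condition>
            <title>Condition</title>
            <p>Sheets jam as they enter the feed rollers.</p>
        </condition>
        <troubleSolution>
            <cause><p>The feed rollers are dirty.</p></cause>
            <remedy>
                <steps>
                    <step><cmd>Power off the machine.</cmd></step>
                    <step><cmd>Wipe the rollers with a lint-free cloth.</cmd></step>
                </steps>
            </remedy>
        </troubleSolution>
    </troublebody>
</troubleshooting>
```

A topic can hold multiple troubleSolution elements, one per cause-remedy pair, which is what makes the structure a good match for root cause analysis output.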

Around that time, I was invited to lead a consulting project for a client in need of an online manual for processes related to cardboard manufacturing (not their actual business; just an example for this post). The client wanted to have web-based “how to” information for operators in charge of the processes of corrugating and die-cutting cardboard (not the actual processes we documented). As I was assembling a team of faculty and students in technical communication and computer science, the client revealed during an early meeting that the manual’s focus had to be on troubleshooting. That scratched my itch for taking the troubleshooting topic for a spin.

Seven months into the process, as we wrap up the project, here are some lessons learned from my experience with the troubleshooting topic.

Conduct a root cause analysis

Task analysis, collecting and analyzing legacy documentation, and interviews with subject matter experts: those traditional weapons of technical communication are probably not effective for obtaining troubleshooting information. When looking for cause-remedy pairs, the team (led by the client’s human resources personnel) conducted a root cause analysis. In the 3rd edition of their book Root Cause Analysis, Latino & Latino defined it by including four different definitions! For the 4th edition (which adds a 3rd Latino to the list of co-authors), they simplify the definition of root cause analysis as “the establishing of logically complete, evidence-based, tightly coupled chains of factors from the least acceptable consequences to the deepest significant underlying causes” (p. 15).

The specific cause and effect tool we used for this troubleshooting project was a five whys session, which can be used to “question each identified cause as to whether it is a symptom, a lower-level cause, or a root cause” and “continue the search for true root causes even after finding that a possible cause has been found” (Andersen & Fagerhaug, 2000, p. 117). The five whys exercise involved supervisors, operators with diverse levels of expertise, and personnel from the client’s human resources department. At the end, we had a series of tables documenting conditions, delivering the type of cause-remedy pairs specified by the DITA Adoption TC white paper.

Prioritize conditions and solutions

A long root cause analysis session with supervisors, users, and managers can be too exhaustive for a troubleshooting guide aimed at an audience of machine operators. Never forget the deliverable’s intended users and their unique needs. During the five whys exercise, we came up with some conditions that had more than 15 possible cause-remedy pairs. They were all interesting and relevant to some aspect of cardboard production. However, some happened at least once a week and others were almost urban legends. Many of their solutions involved shift supervisors or technicians. We filtered the results based on a) the audience’s real needs for the scope of the project, and b) frequency on the production floor.

Realize that troubleshooting is an excellent starter topic

Students who had never been exposed to DITA had a short learning curve for authoring troubleshooting topics. The students knew about principles of effective, minimal documentation, and persuasive writing. However, their knowledge of concept-task-reference was limited to a 5-minute presentation. To them, DITA was mainly a grammar for troubleshooting. Unlike students who started with a DITA 101 course and had to work for at least half a semester with the standard, the new troubleshooting authors had a smooth transition to topic-based writing.

Maybe it is because a task or concept as an isolated chunk of information needs a map and a transformation to make sense. The troubleshooting topic, on the other hand, has a cause and a solution and can incorporate elements of a task. The topic provides instant gratification to the author who can see it as a small deliverable.

Remember that conrefs matter

Having new DITA authors who did not know much about the standard also brought problems. Students without previous DITA experience were good at learning the tags behind the troubleshooting topic and mastered cross-referencing links. But when it came to using conrefs, we had to appoint inspectors. We called them the “conref police.” After all, a dull blade on a cardboard-cutting machine can be the cause of many conditions, and the solution will always be “ask maintenance to replace the blade.”

The conref police was in charge of frequently talking to authors and proposing conref solutions without getting too deep into the concept and mechanics of reuse.
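The kind of reuse the conref police proposed looks roughly like this (the filenames and IDs are invented): the dull-blade remedy lives in one shared topic, and every condition it applies to points at it instead of rewriting it.

```xml
<!-- shared-remedies.dita (hypothetical): the fix is written once -->
<remedy id="replace-blade">
    <steps>
        <step><cmd>Ask maintenance to replace the blade.</cmd></step>
    </steps>
</remedy>
```

Each troubleshooting topic then reuses the fragment with a content reference such as `<remedy conref="shared-remedies.dita#shared-remedies/replace-blade"/>`, so updating the shared topic updates every condition that cites it.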

Be aware that flowcharts kill good content

A troubleshooting topic can include several cause-remedy pairs (the condition of “humidity” in corrugating, for example, has many possible causes). When facing complex scenarios with many solutions, the DITA Adoption TC white paper proposes the use of static flowcharts inside an image tag. I have been teaching about DITA at the college level for eight years, and I always tell my students that good content goes to die in PowerPoint slides. Oh boy, I was not prepared for dealing with static flowcharts. Forget about good content that died of natural exporting causes; flowcharts kill good content without mercy. One minor change, filter application, or typo sends you back to OmniGraffle and does not allow easy customization.

Maybe the solution is coming, with Jang Graat’s DITA-to-flowchart project, which he introduced at DITA Europe last year. We will wait and see.

Find a solution for the “remedy”

As a tag and title, “remedy” did not solve problems in this case. Maybe it was the unique situation of this project, where the client’s management staff and most of the authoring and development team were Hispanic. There is nothing etymologically wrong with the term, but the more we talked about it, the more “remedy” sounded to us like a cheap, quick fix. Think of the stigma attached to “remedial writing” in college. We decided to use “Solution” in the title of each section, but the tag is still remedy, and we can’t change that.

Bend the rules (to help users)

Make documentation easy to find. Isn’t that one of the IBM characteristics of quality technical information? (Carey et al., 2014). For this project, the main web deliverable had a DITA-generated index and a search box. However, users needed to identify defective boxes by looking at pictures showing the most common conditions affecting the processes of corrugating and die-cutting. A quick solution, without specializing or modifying XSLTs, was to create a visual catalog of defects. On the main map, the topicref for the concept c-corrugatingtrouble.dita had a child for each condition documented.

The images came from each troubleshooting topic, where they had been (blasphemy!) included in the short description. It worked, and the users were able to identify the conditions starting from a defective box.
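The resulting map structure looked roughly like this (the condition topic filenames are invented; only c-corrugatingtrouble.dita is from the project description), with each documented condition nested under the concept that serves as the catalog page:

```xml
<map>
    <title>Corrugating troubleshooting</title>
    <topicref href="c-corrugatingtrouble.dita">
        <topicref href="tr-humidity.dita"/>
        <topicref href="tr-warp.dita"/>
        <topicref href="tr-delamination.dita"/>
    </topicref>
</map>
```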

The troubleshooting topic, as included in the DITA 1.3 standard, was worth the long wait. It is a much needed content type that authors can understand and adopt easily. Now I just have to update my teaching materials to expand the concept-task-reference language.


Andersen, B., & Fagerhaug, T. (2000). Root cause analysis: simplified tools and techniques. Milwaukee, WI: ASQ Quality Press.

Carey, M., McFadden Lanyi, M., Longo, D., Radzinski, E., Rouiller, S., & Wilde, E. (2014). Developing quality technical information: a handbook for writers and editors. Upper Saddle River, NJ: IBM Press.

Latino, R. J., Latino, K. C., & Latino, M. A. (2011). Root cause analysis: improving performance for bottom-line results (4th ed.). Boca Raton, FL: CRC Press.


Conditional content in DITA (premium)

February 9, 2015 by

This post provides an overview of techniques you can use to handle conditional content in DITA. The need for complex conditions is a common reason organizations choose DITA as their content model. As conditional requirements get more complex, the basic Show/Hide functionality offered in many desktop publishing tools is no longer sufficient.

Conditional processing is especially interesting—or maybe problematic—when you combine it with reuse requirements. You identify a piece of content that could be reused except for one small bit that needs to be different in the two reuse scenarios.

The first step in establishing a strategy for conditional content is to clarify your requirements and ensure that you understand what you are trying to accomplish.

Classes of text variants

DITA offers two basic types of variants:

  • Variables (implemented through DITA keys): a short snippet, like a product name, that often changes.
  • Conditional information: an element or group of elements that needs to be included or excluded selectively. Conditional information can occur at the topic, block, or inline level. Graphics and tables can also be conditionalized.

In DITA, your conditional assignments need to correspond to an element. In unstructured desktop publishing tools, it’s possible to assign conditions to an arbitrary chunk of content. This is not the case in DITA because you need to attach the conditional labeling to an element. (In theory, it’s possible to use processing instructions to mimic the arbitrary labeling, but just…don’t.)


Here’s what a simple variable looks like. First, you define the variable as a key (in this case, clientname) in the map file.

    <map>
        <title>DITA Topic Map</title>
        <keydef keys="clientname">
            <topicmeta>
                <keywords>
                    <keyword>My First Client</keyword>
                </keywords>
            </topicmeta>
        </keydef>
        <topicref href="sample.dita"/>
    </map>

Inside the topics, you reference the key:

<p>When we deliver this information to <keyword keyref="clientname"/>…</p>

You use a placeholder for the keyword in your text, and you use the map file to define the value of the placeholder. Therefore, you can use a single topic with a keyref along with multiple map files. The result will be different output for the key reference for each of the maps. (DITA 1.3 adds scoped keys, which allow you to change the key’s value inside a single map file.)
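For example (map filenames and the second client name are invented for illustration), two maps can reference the same topic while binding clientname to different values:

```xml
<!-- client-a.ditamap -->
<map>
    <keydef keys="clientname">
        <topicmeta><keywords><keyword>My First Client</keyword></keywords></topicmeta>
    </keydef>
    <topicref href="sample.dita"/>
</map>

<!-- client-b.ditamap: same topic, different key value -->
<map>
    <keydef keys="clientname">
        <topicmeta><keywords><keyword>My Second Client</keyword></keywords></topicmeta>
    </keydef>
    <topicref href="sample.dita"/>
</map>
```

Publishing sample.dita through each map produces output with the corresponding client name, with no change to the topic itself.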


In DITA, you use attributes to identify conditional content:

<p>This paragraph is for everyone.</p>
<p audience="advanced">This paragraph is only for advanced users.</p>

      It’s possible to do conditional content at the phrase level<ph platform="badidea">, but it’s a really terrible idea</ph>.

If you have more complex combinations, you use more than one attribute:

<p audience="expert"
   product="X">content goes here</p>
<p audience="expert"
   platform="windows mac"
   product="X Y Z">other content here</p>

Do not use conditions below the sentence—preferably paragraph—level.

Why not, you ask?

<p>The colo<ph xml:lang="en-uk">u</ph>r of money is a very speciali<ph xml:lang="en-uk">s</ph><ph xml:lang="en-us">z</ph>ed topic.</p>

Two reasons:

  1. Your translator will hate it.
  2. You will go insane.

Specifying conditional output

Once you have assigned your attribute values, you use a ditaval file to specify what to include and exclude when you generate output through the DITA Open Toolkit. Here is a simple example:

<val>
    <prop action="include" att="audience" val="expert"/>
    <prop action="include" att="product" val="X"/>
</val>
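You can also filter in the other direction. For example (a sketch; the novice value is invented), you can explicitly exclude content marked for one audience:

    <val>
        <prop action="exclude" att="audience" val="novice"/>
    </val>

By default, the DITA Open Toolkit includes any conditional content that is not explicitly excluded.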

Markup is the small(er) challenge

You assign attributes to an element to make it conditional. You can assign conditions, therefore, to anything that has an element, all the way down to phrases, words, or even letters (but again, don’t go below sentences). DITA gives you three attributes out of the box (audience, product, platform) for conditional processing. If you need more or different attributes, you’ll need to specialize.

Establishing a reasonable taxonomy and information architecture presents a much more difficult challenge than the actual assignment of conditional markup. You have to figure out which attributes to create, what values they should have available, and how you might combine the attributes to generate the variants you need.

Consider the case of information that is applicable only to a specific region, like California:

<warning audience="ca">
     <p>This product contains chemicals known to the State of California to cause cancer and birth defects or other reproductive harm.</p>
</warning>

This works, provided that your regions are limited to the fifty U.S. states. If you needed to flag information for Canada, that “ca” designator would suddenly become problematic. Perhaps you’d try specifying the country in addition to the state:

<warning audience="usa-ca">…California content…</warning>

<warning audience="ca">…Canada content…</warning>

As long as you planned for California and Canada, everything will be OK. The problem occurs when you start with a list of states and an assumption that you’ll never need non-US regions, and then suddenly you do.

Conditions and reuse

The combination of conditional variants and reuse is especially problematic. One interesting solution is to use a conref push. A conref push allows you to insert (or “push”) information into a topic.

We use this technique in some of our software assessments. We have a general overview of a particular piece of software with information our clients need, like cost, licensing terms, supported platforms, and so on. But we also need to include our overall recommendation for or against that software. This final bit of information is different for each customer.

To accommodate this, we set up the location where the information is needed with an ID. In our case, this is the last paragraph in the assessment of XYZ tool:

<p id="xyz">We recommend XYZ if ABC is a critical requirement.</p>

We then create another topic, referenced in the parent map file as a resource, that provides the information to be inserted:

<p conref="file.dita#id/xyz" conaction="mark"/>
<p conaction="pushafter">Using XYZ would eliminate the manual formatting that currently takes up so much production time at ClientA.</p>

For another client, we have a different map file and a different piece of content to be inserted:

<p conref="file.dita#id/xyz" conaction="mark"/>
<p conaction="pushafter">XYZ does not support right-to-left languages (such as Arabic), which ClientB needs.</p>
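For the pushed content to be processed, the topic that contains it also needs to be referenced in each client’s map without being published as a regular topic. That reference might look like this (the file name is hypothetical):

    <topicref href="clienta_pushes.dita" processing-role="resource-only"/>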


What are your experiences with DITA conditional content?

The talent deficit in content strategy

February 2, 2015 by

Content strategy is taking hold across numerous organizations. Bad content is riskier and riskier because of the transparency and accountability in today’s social media–driven world.

But now, we have a new problem: a talent deficit in content strategy.

Our industry has talented people; it’s just that the demand for content strategists exceeds the available supply. Furthermore, we have an even bigger problem in writing, editing, and production. Enterprise content strategy very often means the introduction of professional-grade tools and workflows (such as XML, CMS, and rigorous terminology and style guides), but many content creators are unprepared for this new environment.

Technical deficit

On the technical side, writers need new skills, such as structured authoring, XML, DITA, and CMS usage. Editors—who move into information architecture roles—must understand how to apply metadata and how to build an effective set of metadata for specific requirements. Deep knowledge of reuse types and applying them is critical. An understanding of localization best practices is helpful for any global organization.

The most technical roles, XSLT stylesheet developers and CMS administrators, must be filled by individuals with a hybrid of IT and publishing skill sets.

Business analysis deficit

Organizations need to first understand their business goals and then use the identified goals to decide on an appropriate content strategy. Here, we face another major skill gap. Although lots of people understand that XML is useful, and some can spell out how XML might be useful, the ability to connect “useful things XML can do” to “what the business needs” is rare. Most publishing people love books. The business component is only of incidental interest.

This presents a problem (or maybe an opportunity) for content strategy consultants.

Why is this a problem?

It’s a good time to be a content strategist, a writer with the right technical skills, a stylesheet developer, or a CMS administrator. From your point of view, there are tons of job opportunities, which likely means higher salaries.

When we look at the overall industry, though, we have a problem. Several of our clients have open positions that they are struggling to fill. They are having an especially difficult time finding information architects for DITA. If this continues, the benefits of a very challenging toolset (like DITA) may be outweighed by the lack of available staff. From an executive’s point of view, there are a lot of negatives: more skilled individuals command higher salaries and are hard to find.

Closing the gap

Mind the gap warning next to railroad tracks.

Be aware of the talent gap // flickr: cgpgrey

We won’t close the talent gap any time soon, but here are some suggestions for management:

  • Identify high-potential employees and support them in learning what they need.
  • Recognize that the best employers will get the best talent. If you are not a preferred employer, you may have a long road ahead.
  • Turnover is expensive. Do what you need to do to avoid it.

What’s needed is some really excellent training to start expanding the pipeline of qualified people.

Scriptorium job placement policy

Given the discussion about hiring and recruiting, it seems wise to include a note about our job placement policy.

As consultants, we have unique access to staff across a variety of companies. The tight market is leading to a lot of inquiries about job opportunities. To avoid conflicts of interest, we have a few simple rules:

  1. We do not poach staff from customers.
  2. We do not recruit staff from one customer to another customer.
  3. As long as there is no conflict with these first two items, we provide informal matchmaking assistance to our customers looking for candidates and to individuals seeking new positions. We do not charge for these services.

Are you facing a talent deficit? What’s your plan to address it?


DITA 1.3 overview

January 26, 2015 by

Robert Anderson, one of the DITA architects, compared the transition from DITA 1.1 to DITA 1.2 to the difference between having a couple of drinks with friends and a huge party. The DITA 1.2 specification introduced broad changes in the form of new base elements, new architectural structures, and new specializations.

DITA 1.3 isn’t the rowdy gathering that DITA 1.2 was—it’s more a group of friends going out to the new pub downtown while being sure to get some designated drivers. This article describes the most important additions to DITA 1.3.

Scoped keys

In DITA 1.2, keys were always global. That is, a key could only use one value in a ditamap. If you needed a key to use two different values in different parts of a single ditamap file, you were out of luck.

DITA 1.3 addresses this problem with the @keyscope attribute. The @keyscope attribute lets you constrain the value of a key to a specific part of the ditamap. In effect, you create a local key instead of a global key. This is useful for omnibus publications or for publications that are authored by multiple teams. For example, consider a widget repair manual that describes two different products. It needs to reference the prodinfo key, but with two different values. The @keyscope attribute supports this requirement.


<title>Widget Repair Manual</title>
<topichead navtitle="Widget A" keyscope="Widget_A">
    <keydef keys="prodinfo" href="prodinfo_a.dita"/>
    <topicref href="maintenance_a.dita"/>
</topichead>
<topichead navtitle="Widget B" keyscope="Widget_B">
    <keydef keys="prodinfo" href="prodinfo_b.dita"/>
    <topicref href="maintenance_b.dita"/>
</topichead>


Before DITA 1.3, this scenario required two different keys or separate publications for each widget. To reference a key within a specific scope, add a scope before the key name. For example, to explicitly reference Widget B’s product information, use Widget_B.prodinfo as the key value. If no scope is declared, the scope of the current topic is used instead.
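For example, a topic in the Widget A branch of the map above could still point at Widget B’s product information by qualifying the key name with its scope:

    <xref keyref="Widget_B.prodinfo"/>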

Cross-deliverable linking

DITA 1.3 also provides support for cross-deliverable linking through scoped keys. First, you create a mapref, set the @scope attribute to peer, and then define a keyscope.


<title>Widget User Guide</title>
<mapref href="../Repair_Manual/Repair_Manual.ditamap" scope="peer" keyscope="Repair_Manual"/>


Then you can use keys contained in the referenced map within the publication. In this example, topics within the Widget User Guide can use keys that are defined in Repair_Manual.ditamap. For example, to reference a task detailing the repair of a feature, an author could do the following:

<p>If replacing the module did not work, see <xref keyref="Repair_Manual.wiring_harness"/> in <ph conkeyref="Repair_Manual.title"/> for detailed instructions on how to repair the wiring harness.</p>

Branch filtering

In DITA 1.2, ditavals are global and applied when running a transform. In DITA 1.3, the <ditavalref> element allows you to specify ditavals at the map or topicref level and then cascade the filtering effects through the map. Consider a software documentation guide.


<title>App User Manual</title>
<topicref href="prodinfo.dita">
    <ditavalref href="user.ditaval"/>
</topicref>
<topicref href="functions.dita">
    <ditavalref href="mac.ditaval"/>
    <ditavalref href="windows.ditaval"/>
    <ditavalref href="linux.ditaval"/>
</topicref>



When output is generated from this map, prodinfo.dita is published using the settings in user.ditaval. In addition, functions.dita is published three times, once for each of the three ditaval files specified.


Troubleshooting topic

The new troubleshooting topic type is a specialization of the generic topic type intended for troubleshooting content, and its body provides <condition>, <cause>, and <remedy> elements. You can use multiple cause/remedy pairs to provide a series of corrective actions for conditions that can have multiple causes.

The optional <tasktroubleshooting> and <steptroubleshooting> elements and the new “trouble” note type provide quick, embedded troubleshooting within a topic. The DITA 1.3 specification recommends that these elements contain a condition, cause, and remedy.
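As a rough sketch (the product and its symptoms are invented), a minimal troubleshooting topic might look like this:

    <troubleshooting id="no_power">
        <title>Widget does not power on</title>
        <troublebody>
            <condition>
                <p>The power light stays off when you press the power button.</p>
            </condition>
            <troubleSolution>
                <cause>
                    <p>The battery is drained.</p>
                </cause>
                <remedy>
                    <steps>
                        <step><cmd>Charge the widget for 30 minutes, then try again.</cmd></step>
                    </steps>
                </remedy>
            </troubleSolution>
        </troublebody>
    </troubleshooting>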

The big picture

There are a lot of new toys to work with in DITA 1.3, and we haven’t even looked at some of the smaller additions, like new attributes for tables and organizational elements within topics. There’s a lot to consider: will your vendors support these features, and how soon?

How could some of these new features be used to your advantage? Have you already implemented similar functionality yourself?

XML overview for executives

January 20, 2015 by

Over the past year or two, our typical XML customer has changed. Until recently, most XML publishing efforts were driven by marketing communications, technical publications, or IT, usually by a technical expert. But today’s customer is much more likely to be an executive who understands the potential business benefits of XML publishing but not the technical details. This article provides an XML overview for executives. What do you need to know before you decide to lead your organization into an XML world?

1. What is the actual problem?

What is the problem that you are solving? The answer is not “we want XML.” Why do you want XML? Are there other ways of solving the business problem? Make sure that the organization has a solid content strategy. At a minimum, the strategy document should include the following:

  • Business goals
  • A description of how the content strategy will achieve the business goals
  • High-level plan: resources, timelines, and budget
  • ROI

2. Technology is the easiest part.

As with most projects, early tool selection is appealing. It’s a concrete task, there are lots of choices, and choosing a content management system or an XML authoring tool feels like progress. Unfortunately, choosing tools too early can lead to problems downstream, when you discover that your selected toolset can’t actually meet your requirements.

Before tool selection, you should have the following:

  • A general idea of how all the moving pieces and parts will be connected (systems architecture overview)
  • Estimate of business benefits, such as $100K in yearly cost savings or faster time to global markets (four months instead of eight months)
  • A list of constraints (required/rejected operating systems, SaaS or not?, language support requirements)

3. Separation of content and formatting

You’ve probably heard this one. Storing content in XML makes it possible to deliver multiple output formats, such as HTML and PDF, from a single set of source files. This statement is true, but it glosses over a lot of hard work. Many designers are not prepared to let go of their preferred tools for building web sites or printed documents. When you separate content and formatting, you remove the designer’s ability to “tweak” their layouts. The transition from a hand-crafted page (web or print) to automatically generated pages is challenging. For some document types, the final polish you get from manual layout may not be optional. In this case, you have to think hard about what value you get from XML. You can also consider a middle ground, in which you publish into an intermediate format, do some final cleaning up, and then render the final version.

Separation of content and formatting provides for complete automation and huge cost savings, but it is not the right choice for all document types.

4. What is your organization’s capacity for change?

The best solution is the one that your organization can master. Make sure that you assess the following before making any permanent decisions:

  • Corporate culture: Is the organization typically risk-averse? Is it an early or late adopter?
  • Change management: How do employees handle change? How is change communicated? Is change resistance a common concern?
  • Leadership: Does your organization have leaders who can manage change?
  • Technical expertise: Is the solution a good fit for the technical expertise of the staff? If not, what is the plan to address this issue?
  • Executive leadership: Will executive management support this change?
  • Risk mitigation: What are the risks? What is the risk mitigation plan?


2015 content trends

January 12, 2015 by

It’s a new year, which means it’s time for Scriptorium to discuss—and wildly speculate about—the latest trends in content. Here’s what Bill Swallow, Gretyl Kinsey, and I had to say about 2015 content trends.

Content strategy driven by IT

dog with paw on laptop keyboard

flickr: ttarasiuk

Alan: Years ago, IT was brought in a little later in [content strategy projects], and there was also some adversarial positioning going on. I can remember one client saying, “Our IT department: they’re impossible to work with. We don’t like them.” The IT department turned out to be a really huge asset for them and really went to the mat to help them get systems set up like they should have been.

More recently, I have seen cases where IT departments are the primary stakeholder and are the folks who are really instigating changes in content strategy because they see the overall bigger picture of how content is flowing across the company.

Gretyl: [One concern is] when your IT department is driving a project but then they are going to make a decision that works really well for them but makes things harder for content creators or doesn’t serve the organization as well as a whole.

Content as part of the customer journey

Gretyl: When someone is buying a product, they’re not just buying a product. They are buying an experience. It’s becoming a lot more important for content to be a major part of that experience of the customer journey. Marketing content will be what draws you in or educates you about the product.

And there is also some room for technical content to come into play here as well if you are looking for something that has a lot of technical information you need to know about before you can make a choice.

Bill: It seems to be a trend more on the customer side as an expectation. I think a lot of companies are really starting to pick up on this after the fact. What we’re seeing is people responding to improper information being posted online; we see a lot of complaining on social media. Pretty soon, the company is playing catch-up trying to save face.

Executive eye on content

dog wearing glasses

flickr: Monsieur Gordon

Bill: For a while now, the idea of having an executive sponsor for a major project like a content strategy implementation was fairly commonplace. But more and more, we are seeing executives who have a firm investment in the project if not direct ownership over it—so much so that we are seeing more titles out in the field for a chief content officer. [The title] was traditionally reserved for a lot of broadcasting and media companies. But now we’re seeing it in lots of different consumer companies, software, hardware, manufacturing.

Alan: There’s a stereotype out there: executives know nothing about the real world. There are some people [for which] that is true, unfortunately. They’re a little out of touch with what’s going on. But I have also seen the flip side of that very vividly where an executive basically says, “I don’t need all these metrics; I don’t need all these numbers. I can tell you exactly what is wrong with this part of the big picture. I know content is part of it. Fix it.” Sometimes, they have an uncanny ability to zero in on exactly what the problem is.

Accommodating experts as content contributors

Alan: I’ve really seen an increase in content creators wanting to be sure that subject matter experts, tool experts, [and] internal experts are able to contribute content easily to the process and to review content. More and more content creation systems are allowing people to actually get into the source—and not necessarily edit the source—but to suggest edits, and someone on the content creation side can approve it.

Gretyl: This goes along with the idea of getting more and more technologically inclined. That’s really helping our content strategy both inside and outside the organization. While we’ve talked a lot before about how [technology] is helping the customer; this is the flip side of that. If you’ve got content contributors that really don’t know anything about your actual [content] creation process—they just know about the subject they’re an expert in—this is using technology to your best advantage.

Cross-company focus in content strategy implementation

Sled dogs running

flickr: Frank Kovalchek

Gretyl: For me, this is a trend mostly about breaking down silos. These groups don’t interact and collaborate—and if they do interact, it’s to say, “Don’t touch my stuff.” What this leads to is content that really doesn’t work well together, and that can hold your company back. One of the ways you can get rid of this idea of silos is to develop a content strategy that encompasses all your content, and that will be more effective than just developing a content strategy, for example, for one department and ignoring all the others.

Bill: The cross-company focus is definitely on the rise, but it comes back to, “People really love their silos.” Even if they don’t, and they do want to work outside the silo, a lot of times, there is so much embedded in the way they need to work, the channels they have to go through because of financial reasons or because of reporting structures, that it is going to hinder this cross-company focused growth.

“Old” social media platforms decline

Bill: The social media landscape is going to change—it has been changing—but there are some outliers there that have been around for quite a while. Even though they still seem new to many of us, according to my kids and my kids’ friends, these are all “Mom-and-Dad networks all the old people are using.” They want nothing to do with it. They maintain a Facebook page to talk to family, but a lot of their actual interactions with friends are done on private networks, which is kind of scary.

It speaks to a different way of looking at social applications, social networking, and sharing of information. We’re not going to see a lot of Facebook, Google Plus, blogging, and other resources used by the younger generation. So, our social media and social engagement strategies around content strategy are going to need to adapt.

Alan: I think this goes back to one of the oldest and best guidelines for any kind of writing—tech comm, marcom, whatever—and that is know your audience. That’s the most important thing here. I think we’ve all seen in social media and advertising [when] companies selling fairly traditional products [are] trying to play like they’re all hip. It’s painful, absolutely painful, to see.

We had fun during the event. You can see for yourself when you watch the recording. You’ll probably enjoy my “Did I really say that?” moments a lot more than I did!