Scriptorium Publishing

content strategy consulting

How fast food can help your content strategy

February 23, 2015 by

These days, I generally avoid fast food, but it’s hard to pass up good French fries every now and then. Look beyond those yummy fries, and you can learn some valuable lessons that apply to content strategy.

A consistent—yet location-tailored—experience

In 1986, I took a whirlwind tour of Europe with a group of other teenagers. About a week into our trip, our not-so-refined palates were craving some American fast food. In Venice, we were elated to see this canopy:

Wendy restaurant in Venice, Italy

When in Venice, eat as the American teens do. Wendy(‘s) of Venice, 1986.

Even though the ‘s was missing because the English-language possessive makes no sense in Italian, we instantly recognized the lettering. The food was also comfortingly familiar. I was on another continent, but much of the food could have been from the Wendy’s just down the street from my family home in North Carolina.

That said, the menu board was in Italian, and there were some differences in the food that reflected Italian culinary flair. The burgers were also smaller than their American counterparts, perhaps to better match European serving sizes or to reflect supply costs.

Does your company’s content offer a similarly consistent experience that takes regional differences into account? In this global economy, distributing content in one language with little thought about delivery in other locales is usually a losing (and revenue-limiting) proposition. Merely translating words into another language is not enough, either. You also need to consider several issues, including:

  • Are images and colors culturally appropriate?
  • Does your content contain turns of phrase that are region-specific?
  • Can the formatting of your content handle text expansion from translation or a shift from a left-to-right language to a right-to-left language (or vice versa)?

My visit to the Venetian Wendy—without the ‘s, thank you very much—was an early lesson in adapting for different locales. A trip to Venetian fast-food outlets might give you some perspective, too. (Good luck with that expense report. I doubt “Alan Pringle told me to go to Venice” will cut it.)

“Have it your way”

I’ve already dated myself with the previous anecdote about my teenage years, and I’m about to do it again with this ad:

“Have it your way” was a Burger King slogan for 40 years. For as long as I can remember, the chain has promoted its ability to accommodate customers’ tweaks to menu items.

I’m no expert on fast-food workflows, but I’ll bet Burger King has done all sorts of studies on how to crank out the food quickly while fulfilling customers’ special requests. The chain has probably instituted specific processes based on those studies. Burger King knows it will lose business if it cannot correctly and quickly prepare customized food items.

Implementing intelligent content can help you deliver customized information to your customers—even more quickly than Burger King can put extra ketchup on your Whopper.

Suppose your company sells multiple models of the same item. Some features are consistent across all models, but other features are specific to particular versions. With intelligent content in place, you could create a web portal or app through which customers specify the model they own, what accessories they have, and so on, to generate instant custom content. It takes a lot of planning and work to set up systems that deliver on-the-fly custom information, but it is possible—and some companies are doing it now.

In addition to customizing the content itself, you also need to consider how information is displayed on phones, tablets, computers, and who knows what devices in the future. If your content doesn’t display well on differently sized screens, you aren’t letting customers have it their way. (I’m as displeased about reading a big PDF on my phone as I am about a restaurant that won’t hold the mayo on a burger. Blech.)

Any other “special sauce” lessons you’ve applied to your content strategy? Leave your thoughts in the comments.

Taking the DITA troubleshooting topic for a spin

February 16, 2015 by

This guest post is by Carlos Evia, Ph.D., the director of Professional and Technical Writing at Virginia Tech.

The DITA troubleshooting topic is one of the “new” features in version 1.3 of the standard. However, troubleshooting has been around the DITA world for a good eight years now.

A SourceForge archive of plug-ins for the DITA Open Toolkit still houses a Troubleshooting Specialization released in October 2007. The 2007 troubleshooting topic sounded like a visit to the doctor, with tags like tsSymptoms, tsCauses, tsDiagnose, and tsResolve (tsTake2Aspirins was too long, I guess).

It wasn’t until July 2014 that the DITA Adoption Technical Committee announced the troubleshooting topic as a new, formal content type in the standard. The committee then released the final version of the white paper Using DITA 1.3 Troubleshooting, authored by Bob Thomas. The white paper presents the rationale for the troubleshooting topic and provides detailed, accurate examples and templates, focusing on a structure of cause-remedy pairs of information to populate the topic.

Around that time, I was invited to lead a consulting project for a client in need of an online manual for processes related to cardboard manufacturing (not their actual business; just an example for this post). The client wanted to have web-based “how to” information for operators in charge of the processes of corrugating and die-cutting cardboard (not the actual processes we documented). As I assembled a team of faculty and students in technical communication and computer science, the client revealed during an early meeting that the manual’s focus had to be on troubleshooting. That scratched my itch for taking the troubleshooting topic for a spin.

Seven months into the process, as we wrap up the project, here are some lessons learned from my experience with the troubleshooting topic.

Conduct a root cause analysis

Task analysis, collecting and analyzing legacy documentation, interviews with subject matter experts: those traditional weapons of technical communication are probably not effective for obtaining troubleshooting information. When looking for cause-remedy pairs, the team (led by the client’s human resources personnel) conducted a root cause analysis. In the 3rd edition of their book Root Cause Analysis, Latino & Latino defined it by including four different definitions! For the 4th edition (which includes a 3rd Latino in the list of co-authors), they simplify the definition of root cause analysis as “the establishing of logically complete, evidence-based, tightly coupled chains of factors from the least acceptable consequences to the deepest significant underlying causes” (p. 15).

The specific cause and effect tool we used for this troubleshooting project was a five whys session, which can be used to “question each identified cause as to whether it is a symptom, a lower-level cause, or a root cause” and “continue the search for true root causes even after finding that a possible cause has been found” (Andersen & Fagerhaug, 2000, p. 117). The five whys exercise involved supervisors, operators with diverse levels of expertise, and personnel from the client’s human resources department. In the end, we had a series of tables documenting conditions, delivering the type of cause-remedy pairs specified by the DITA Adoption TC white paper.

Prioritize conditions and solutions

A long root cause analysis session with supervisors, users, and managers can be too exhaustive for a troubleshooting guide aimed at an audience of machine operators. Never forget the deliverable’s intended users and their unique needs. During the five whys exercise, we came up with some conditions that had more than 15 possible cause-remedy pairs. They were all interesting and relevant to some aspects of cardboard production. However, some happened at least once a week, and others were almost urban legends. Many of their solutions involved shift supervisors or technicians. We filtered results based on a) the audience’s real needs for the scope of the project, and b) frequency on the production floor.

Realize that troubleshooting is an excellent starter topic

Students who had never been exposed to DITA had a short learning curve for authoring troubleshooting topics. The students knew about principles of effective, minimal documentation, and persuasive writing. However, their knowledge of concept-task-reference was limited to a 5-minute presentation. To them, DITA was mainly a grammar for troubleshooting. Unlike students who started with a DITA 101 course and had to work for at least half a semester with the standard, the new troubleshooting authors had a smooth transition to topic-based writing.

Maybe it is because a task or concept as an isolated chunk of information needs a map and a transformation to make sense. The troubleshooting topic, on the other hand, has a cause and a solution and can incorporate elements of a task. The topic provides instant gratification to the author who can see it as a small deliverable.

Remember that conrefs matter

Having new DITA authors who did not know much about the standard also brought problems. Students without previous DITA experience were good at learning the tags behind the troubleshooting topic and mastered cross-referencing links. But when it came to using conrefs, we had to appoint inspectors. We called them the “conref police.” After all, a dull blade on a cardboard-cutting machine can be the cause of many conditions, and the solution will always be “ask maintenance to replace the blade.”

The conref police were in charge of talking frequently with authors and proposing conref solutions without getting too deep into the concept and mechanics of reuse.
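Here is a minimal sketch of the kind of reuse the conref police enforced. The file name and ids are hypothetical; only the dull-blade scenario comes from the project description above.

```xml
<!-- blade-library.dita (hypothetical): the reusable remedy text lives in one place -->
<p id="dull-blade-fix">Ask maintenance to replace the blade.</p>

<!-- In each troubleshooting topic, pull the shared text in by reference.
     The conref value follows the pattern file.dita#topic-id/element-id. -->
<p conref="blade-library.dita#blade-library/dull-blade-fix"/>
```

With this in place, a wording change to the shared remedy propagates to every topic that references it.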

Be aware that flowcharts kill good content

A troubleshooting topic can include several cause-remedy pairs (the condition of “humidity” in corrugating, for example, has many possible causes). When facing complex scenarios with many solutions, the DITA Adoption TC white paper proposes the use of static flowcharts inside an image tag. I have been teaching about DITA at the college level for eight years, and I always tell my students that good content goes to die in PowerPoint slides. Oh boy, I was not prepared for dealing with static flowcharts. Forget about good content that died of natural exporting causes; flowcharts kill good content without mercy. One minor change, filter application, or typo sends you back to OmniGraffle, and the format does not allow easy customization.

Maybe the solution is coming, with Jang Graat’s DITA-to-flowchart project, which he introduced at DITA Europe last year. We will wait and see.

Find a solution for the “remedy”

As a tag and title, “remedy” did not solve problems in this case. Maybe it was the unique situation of this project, where the client’s management staff and most of the authoring and developing team were Hispanic. There is nothing etymologically wrong with the term, but the more we talked about it, the more “remedy” sounded to us like a cheap, quick fix. Think of the stigma attached to “remedial writing” in college. We decided to use “Solution” in the title of each section, but the tag is still remedy, and we can’t change that.

Bend the rules (to help users)

Make documentation easy to find. Isn’t that one of the IBM characteristics of quality technical information? (Carey et al., 2014). For this project, the main web deliverable had a DITA-generated index and a search box. However, users needed to identify defective boxes by looking at pictures showing the most common conditions affecting the processes of corrugating and die-cutting. A quick solution, without specializing or modifying XSLTs, was to create a visual catalog of defects. On the main map, the topicref for the concept c-corrugatingtrouble.dita had a child for each condition documented.

The images came from each troubleshooting topic, where they had been (blasphemy!) included in the short description. It worked, and the users were able to identify the conditions starting from a defective box.
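The visual catalog structure in the main map might look something like this sketch; c-corrugatingtrouble.dita comes from the project, but the child topic filenames are hypothetical:

```xml
<!-- Main map: the concept topic gets a child topicref per documented condition -->
<topicref href="c-corrugatingtrouble.dita">
  <topicref href="t-humidity.dita"/>
  <topicref href="t-warping.dita"/>
  <topicref href="t-blade-marks.dita"/>
</topicref>
```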

The troubleshooting topic, as included in the DITA 1.3 standard, was worth the long wait. It is a much-needed content type that authors can understand and adopt easily. Now I just have to update my teaching materials to expand the concept-task-reference language.


Andersen, B., & Fagerhaug, T. (2000). Root cause analysis: Simplified tools and techniques. Milwaukee, WI: ASQ Quality Press.

Carey, M., McFadden Lanyi, M., Longo, D., Radzinski, E., Rouiller, S., & Wilde, E. (2014). Developing quality technical information: A handbook for writers and editors. Upper Saddle River, NJ: IBM Press.

Latino, R. J., Latino, K. C., & Latino, M. A. (2011). Root cause analysis: Improving performance for bottom-line results (4th ed.). Boca Raton, FL: CRC Press.


Conditional content in DITA (premium)

February 9, 2015 by

This post provides an overview of techniques you can use to handle conditional content in DITA. The need for complex conditions is a common reason organizations choose DITA as their content model. As conditional requirements get more complex, the basic Show/Hide functionality offered in many desktop publishing tools is no longer sufficient.

Conditional processing is especially interesting—or maybe problematic—when you combine it with reuse requirements. You identify a piece of content that could be reused except for one small bit that needs to be different in the two reuse scenarios.

The first step in establishing a strategy for conditional content is to clarify your requirements and ensure that you understand what you are trying to accomplish.

Classes of text variants

DITA offers two basic types of variants:

  • Variables (implemented through DITA keys): a short snippet, like a product name, that often changes.
  • Conditional information: an element or group of elements that needs to be included or excluded selectively. Conditional information can occur at the topic, block, or inline level. Graphics and tables can also be conditionalized.

In DITA, your conditional assignments need to correspond to an element. In unstructured desktop publishing tools, it’s possible to assign conditions to an arbitrary chunk of content. This is not the case in DITA because you need to attach the conditional labeling to an element. (In theory, it’s possible to use processing instructions to mimic the arbitrary labeling, but just…don’t.)


Here’s what a simple variable looks like. First, you define the variable as a key (in this case, clientname) in the map file.

    <map>
        <title>DITA Topic Map</title>
        <keydef keys="clientname">
            <topicmeta>
                <keywords>
                    <keyword>My First Client</keyword>
                </keywords>
            </topicmeta>
        </keydef>
        <topicref href="sample.dita"/>
    </map>

Inside the topics, you reference the key:

<p>When we deliver this information to <keyword keyref="clientname"/>…</p>

You use a placeholder for the keyword in your text, and you use the map file to define the value of the placeholder. Therefore, you can use a single topic with a keyref along with multiple map files. The result will be different output for the key reference for each of the maps. (DITA 1.3 adds scoped keys, which allow you to change the key’s value inside a single map file.)
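To make the multiple-maps point concrete, here is a sketch of a hypothetical second map (the map title and key value are made up for illustration) that reuses the same sample.dita topic with a different value for the clientname key:

```xml
<!-- second-client.ditamap (hypothetical): same topic, different key value -->
<map>
    <title>Second Client Deliverable</title>
    <keydef keys="clientname">
        <topicmeta>
            <keywords>
                <keyword>My Second Client</keyword>
            </keywords>
        </topicmeta>
    </keydef>
    <topicref href="sample.dita"/>
</map>
```

Publishing from each map produces output in which the keyref resolves to that map’s keyword value.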


In DITA, you use attributes to identify conditional content:

<p>This paragraph is for everyone.</p>
<p audience="advanced">This paragraph is only for advanced users.</p>
<p>It’s possible to do conditional content at the phrase level<ph platform="badidea">, but it’s a really terrible idea</ph>.</p>

If you have more complex combinations, you use more than one attribute:

<p audience="expert"
   product="X">content goes here</p>
<p audience="expert"
   platform="windows mac"
   product="X Y Z">other content here</p>

Do not use conditions below the sentence—preferably paragraph—level.

Why not, you ask?

<p>The colo<ph xml:lang="en-uk">u</ph>r of money is a very speciali<ph xml:lang="en-uk">s</ph><ph xml:lang="en-us">z</ph>ed topic.</p>

Two reasons:

  1. Your translator will hate it.
  2. You will go insane.

Specifying conditional output

Once you have assigned your attribute values, you use a ditaval file to specify what to include and exclude when you generate output through the DITA Open Toolkit. Here is a simple example:

<val>
    <prop action="include" att="audience" val="expert"/>
    <prop action="include" att="product" val="X"/>
</val>
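To show exclusion as well as inclusion, here is a sketch of a fuller ditaval file (the filename and the exclusion rule are assumptions for illustration). A prop element with no val attribute sets the default action for all other values of that attribute:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- expert-product-x.ditaval (hypothetical) -->
<val>
    <!-- Keep content flagged for expert audiences and for product X -->
    <prop action="include" att="audience" val="expert"/>
    <prop action="include" att="product" val="X"/>
    <!-- Drop content flagged for any other product value -->
    <prop action="exclude" att="product"/>
</val>
```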

Markup is the small(er) challenge

You assign attributes to an element to make it conditional. You can assign conditions, therefore, to anything that has an element, all the way down to phrases, words, or even letters (but again, don’t go below sentences). DITA gives you three attributes out of the box (audience, product, platform) for conditional processing. If you need more or different attributes, you’ll need to specialize.

Establishing a reasonable taxonomy and information architecture presents a much more difficult challenge than the actual assignment of conditional markup. You have to figure out which attributes to create, what values they should have available, and how you might combine the attributes to generate the variants you need.

Consider the case of information that is applicable only to a specific region, like California:

<warning audience="ca">
     <p>This product contains chemicals known to the State of California to cause cancer and birth defects or other reproductive harm.</p>
</warning>

This works, provided that your regions are limited to the fifty U.S. states. If you needed to flag information for Canada, that “ca” designator would suddenly become problematic. Perhaps you’d try specifying the country in addition to the state:

<warning audience="usa-ca">…California content…</warning>

<warning audience="ca">…Canada content…</warning>

As long as you planned for California and Canada, everything will be OK. The problem occurs when you start with a list of states and an assumption that you’ll never need non-US regions, and then suddenly you do.

Conditions and reuse

The combination of conditional variants and reuse is especially problematic. One interesting solution is to use a conref push. A conref push allows you to insert (or “push”) information into a topic.

We use this technique in some of our software assessments. We have a general overview of a particular piece of software with information our clients need, like cost, licensing terms, supported platforms, and so on. But we also need to include our overall recommendation for or against that software. This final bit of information is different for each customer.

To accommodate this, we set up the location where the information is needed with an ID. In our case, this is the last paragraph in the assessment of XYZ tool:

<p id="xyz">We recommend XYZ if ABC is a critical requirement.</p>

We then create another topic, referenced in the parent map file as a resource, that provides the information to be inserted:

<p conref="file.dita#id/xyz" conaction="mark"/>
<p conaction="pushafter">Using XYZ would eliminate the manual formatting that currently takes up so much production time at ClientA.</p>

For another client, we have a different map file and a different piece of content to be inserted:

<p conref="file.dita#id/xyz" conaction="mark"/>
<p conaction="pushafter">XYZ does not support right-to-left languages (such as Arabic), which ClientB needs.</p>


What are your experiences with DITA conditional content?

The talent deficit in content strategy

February 2, 2015 by

Content strategy is taking hold across numerous organizations. Bad content is riskier and riskier because of the transparency and accountability in today’s social media–driven world.

But now, we have a new problem: a talent deficit in content strategy.

Our industry has talented people; it’s just that the demand for content strategists exceeds the available supply. Furthermore, we have an even bigger problem in writing, editing, and production. Enterprise content strategy very often means the introduction of professional-grade tools and workflows (such as XML, CMS, and rigorous terminology and style guides), but many content creators are unprepared for this new environment.

Technical deficit

On the technical side, writers need new skills, such as structured authoring, XML, DITA, and CMS usage. Editors—who move into information architecture roles—must understand how to apply metadata and how to build an effective set of metadata for specific requirements. Deep knowledge of reuse types and applying them is critical. An understanding of localization best practices is helpful for any global organization.

The most technical roles, XSLT stylesheet developers and CMS administrators, must be filled by individuals with a hybrid of IT and publishing skill sets.

Business analysis deficit

Organizations need to first understand their business goals and then use the identified goals to decide on an appropriate content strategy. Here, we face another major skill gap. Although lots of people understand that XML is useful, and some can spell out how XML might be useful, the ability to connect “useful things XML can do” to “what the business needs” is rare. Most publishing people love books. The business component is only of incidental interest.

This presents a problem (or maybe an opportunity) for content strategy consultants.

Why is this a problem?

It’s a good time to be a content strategist, a writer with the right technical skills, a stylesheet developer, or a CMS administrator. From your point of view, there are tons of job opportunities, which likely means higher salaries.

When we look at the overall industry, though, we have a problem. Several of our clients have open positions that they are struggling to fill. They are having an especially difficult time finding information architects for DITA. If this continues, the benefits of a very challenging toolset (like DITA) may be outweighed by the lack of available staff. From an executive’s point of view, there are a lot of negatives: more skilled individuals command higher salaries and are hard to find.

Closing the gap

Mind the gap warning next to railroad tracks.

Be aware of the talent gap // flickr: cgpgrey

We won’t close the talent gap any time soon, but here are some suggestions for management:

  • Identify high-potential employees and support them in learning what they need.
  • Recognize that the best employers will get the best talent. If you are not a preferred employer, you may have a long road ahead.
  • Turnover is expensive. Do what you need to do to avoid it.

What’s needed is some really excellent training to start expanding the pipeline of qualified people.

Scriptorium job placement policy

Given the discussion about hiring and recruiting, it seems wise to include a note about our job placement policy.

As consultants, we have unique access to staff across a variety of companies. The tight market is leading to a lot of inquiries about job opportunities. To avoid conflicts of interest, we have a few simple rules:

  1. We do not poach staff from customers.
  2. We do not recruit staff from one customer to another customer.
  3. As long as there is no conflict with these first two items, we provide informal matchmaking assistance to our customers looking for candidates and to individuals seeking new positions. We do not charge for these services.

Are you facing a talent deficit? What’s your plan to address it?

Speaking of talent issues, I give you this gratuitous video:

DITA 1.3 overview

January 26, 2015 by

Robert Anderson, one of the DITA architects, compared the transition from DITA 1.1 to DITA 1.2 to the difference between having a couple of drinks with friends and a huge party. The DITA 1.2 specification introduced broad changes in the form of new base elements, new architectural structures, and new specializations.

DITA 1.3 isn’t the rowdy gathering that DITA 1.2 was—it’s more a group of friends going out to the new pub downtown while being sure to get some designated drivers. This article describes the most important additions to DITA 1.3.

Scoped keys

In DITA 1.2, keys were always global. That is, a key could only use one value in a ditamap. If you needed a key to use two different values in different parts of a single ditamap file, you were out of luck.

DITA 1.3 addresses this problem with the @keyscope attribute. The @keyscope attribute lets you constrain the value of a key to a specific part of the ditamap. In effect, you create a local key instead of a global key. This is useful for omnibus publications or for publications that are authored by multiple teams. For example, consider a widget repair manual that describes two different products. It needs to reference the prodinfo key, but with two different values. The @keyscope attribute supports this requirement.


<map>
<title>Widget Repair Manual</title>
<topichead navtitle="Widget A" keyscope="Widget_A">
<keydef keys="prodinfo" href="prodinfo_a.dita"/>
<topicref href="maintenance_a.dita"/>
</topichead>
<topichead navtitle="Widget B" keyscope="Widget_B">
<keydef keys="prodinfo" href="prodinfo_b.dita"/>
<topicref href="maintenance_b.dita"/>
</topichead>
</map>


Before DITA 1.3, this scenario required two different keys or separate publications for each widget. To reference a key within a specific scope, add a scope before the key name. For example, to explicitly reference Widget B’s product information, use Widget_B.prodinfo as the key value. If no scope is declared, the scope of the current topic is used instead.
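Inside a topic, the two forms of reference might look like this sketch (the surrounding sentence text is made up for illustration; the key and scope names come from the map above):

```xml
<!-- An unqualified keyref resolves within the current key scope;
     a scope-qualified keyref pins a specific scope explicitly. -->
<p>See <xref keyref="prodinfo"/> for this widget's specifications.</p>
<p>For the other model, see <xref keyref="Widget_B.prodinfo"/>.</p>
```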

Cross-deliverable linking

DITA 1.3 also provides support for cross-deliverable linking through scoped keys. First, you create a mapref, set the @scope attribute to peer, and then define a keyscope.


<map>
<title>Widget User Guide</title>
<mapref href="../Repair_Manual/Repair_Manual.ditamap" scope="peer" keyscope="Repair_Manual"/>
</map>


Then you can use keys contained in the referenced map within the publication. In this example, topics within the Widget User Guide can use keys that are defined in Repair_Manual.ditamap. For example, to reference a task detailing the repair of a feature, an author could do the following:

<p>If replacing the module did not work, see <xref keyref="Repair_Manual.wiring_harness"/> in <ph conkeyref="Repair_Manual.title"/> for detailed instructions on how to repair the wiring harness.</p>

Branch filtering

In DITA 1.2, ditavals are global and applied when running a transform. In DITA 1.3, the <ditavalref> element allows you to specify ditavals at the map or topicref level and then cascade the filtering effects through the map. Consider a software documentation guide.


<map>
<title>App User Manual</title>
<topicref href="prodinfo.dita">
    <ditavalref href="user.ditaval"/>
</topicref>
<topicref href="functions.dita">
    <ditavalref href="mac.ditaval"/>
    <ditavalref href="windows.ditaval"/>
    <ditavalref href="linux.ditaval"/>
</topicref>
</map>



When output is generated from this map, prodinfo.dita is published using the settings in user.ditaval. In addition, functions.dita is published three times, once for each of the three ditaval files specified.
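The referenced ditaval files are ordinary DITAVAL documents. As a sketch, mac.ditaval might contain something like the following (the attribute values are assumptions for illustration):

```xml
<val>
    <!-- Keep Mac-flagged content, drop the other platforms -->
    <prop action="include" att="platform" val="mac"/>
    <prop action="exclude" att="platform" val="windows"/>
    <prop action="exclude" att="platform" val="linux"/>
</val>
```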


Troubleshooting topic

The new troubleshooting topic type is designed for documenting problems and their corrective actions, and it is built around <condition>, <cause>, and <remedy> elements. You can use multiple cause/remedy pairs to provide a series of corrective actions for conditions that can have multiple causes.

The optional <tasktroubleshooting> and <steptroubleshooting> elements, along with the new trouble value for the note element’s type attribute, provide quick, embedded troubleshooting within a topic. The DITA 1.3 specification recommends that these elements contain a condition, cause, and remedy.
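A minimal troubleshooting topic sketch is shown below; the id, title, and problem scenario are made up for illustration. Note that in the full 1.3 markup, the condition sits in the topic body and each cause/remedy pair is wrapped in a <troubleSolution> element:

```xml
<troubleshooting id="screen-flicker">
    <title>Screen flickers intermittently</title>
    <troublebody>
        <condition>
            <p>The display flickers during normal operation.</p>
        </condition>
        <troubleSolution>
            <cause><p>The video cable is loose.</p></cause>
            <remedy>
                <steps>
                    <step><cmd>Power off the device.</cmd></step>
                    <step><cmd>Reseat the video cable.</cmd></step>
                </steps>
            </remedy>
        </troubleSolution>
    </troublebody>
</troubleshooting>
```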

The big picture

There are a lot of new toys to work with in DITA 1.3, and we haven’t even looked at some of the smaller additions, like new attributes for tables and organizational elements within topics. There is a lot to consider: will your vendors support these features, and how soon?

How could some of these new features be used to your advantage? Have you already implemented similar functionality yourself?

XML overview for executives

January 20, 2015 by

Over the past year or two, our typical XML customer has changed. Until recently, most XML publishing efforts were driven by marketing communications, technical publications, or IT, usually by a technical expert. But today’s customer is much more likely to be an executive who understands the potential business benefits of XML publishing but not the technical details. This article provides an XML overview for executives. What do you need to know before you decide to lead your organization into an XML world?

1. What is the actual problem?

What is the problem that you are solving? The answer is not “we want XML.” Why do you want XML? Are there other ways of solving the business problem? Make sure that the organization has a solid content strategy. At a minimum, the strategy document should include the following:

  • Business goals
  • A description of how the content strategy will achieve the business goals
  • High-level plan: resources, timelines, and budget
  • ROI

2. Technology is the easiest part.

As with most projects, early tool selection is appealing. It’s a concrete task, there are lots of choices, and choosing a content management system or an XML authoring tool feels like progress. Unfortunately, choosing tools too early can lead to problems downstream, when you discover that your selected toolset can’t actually meet your requirements.

Before tool selection, you should have the following:

  • A general idea of how all the moving pieces and parts will be connected (systems architecture overview)
  • An estimate of business benefits, such as $100K in yearly cost savings or faster time to global markets (four months instead of eight months)
  • A list of constraints (required or rejected operating systems, SaaS versus on-premises, language support requirements)

3. Separation of content and formatting

You’ve probably heard this one. Storing content in XML makes it possible to deliver multiple output formats, such as HTML and PDF, from a single set of source files. This statement is true, but it glosses over a lot of hard work. Many designers are not prepared to let go of their preferred tools for building web sites or printed documents. When you separate content and formatting, you remove the designer’s ability to “tweak” their layouts. The transition from a hand-crafted page (web or print) to automatically generated pages is challenging. For some document types, the final polish you get from manual layout may not be optional. In this case, you have to think hard about what value you get from XML. You can also consider a middle ground, in which you publish into an intermediate format, do some final cleaning up, and then render the final version.

Separation of content and formatting provides for complete automation and huge cost savings, but it is not the right choice for all document types.

4. What is your organization’s capacity for change?

The best solution is the one that your organization can master. Make sure that you assess the following before making any permanent decisions:

  • Corporate culture: Is the organization typically risk-averse? Is it an early or late adopter?
  • Change management: How do employees handle change? How is change communicated? Is change resistance a common concern?
  • Leadership: Does your organization have leaders who can manage change?
  • Technical expertise: Is the solution a good fit for the technical expertise of the staff? If not, what is the plan to address this issue?
  • Executive leadership: Will executive management support this change?
  • Risk mitigation: What are the risks? What is the risk mitigation plan?


2015 content trends

January 12, 2015 by

It’s a new year, which means it’s time for Scriptorium to discuss—and wildly speculate about—the latest trends in content. Here’s what Bill Swallow, Gretyl Kinsey, and I had to say about 2015 content trends.

Content strategy driven by IT

dog with paw on laptop keyboard

flickr: ttarasiuk

Alan: Years ago, IT was brought in a little later in [content strategy projects], and there was also some adversarial positioning going on. I can remember one client saying, “Our IT department: they’re impossible to work with. We don’t like them.” The IT department turned out to be a really huge asset for them and really went to the mat to help them get systems set up like they should have been.

More recently, I have seen cases where IT departments are the primary stakeholder and are the folks who are really instigating changes in content strategy because they see the overall bigger picture of how content is flowing across the company.

Gretyl: [One concern is] when your IT department is driving a project but then they are going to make a decision that works really well for them but makes things harder for content creators or doesn’t serve the organization as well as a whole.

Content as part of the customer journey

Gretyl: When someone is buying a product, they’re not just buying a product. They are buying an experience. It’s becoming a lot more important for content to be a major part of that experience of the customer journey. Marketing content will be what draws you in or educates you about the product.

And there is also some room for technical content to come into play here as well if you are looking for something that has a lot of technical information you need to know about before you can make a choice.

Bill: It seems to be a trend more on the customer side as an expectation. I think a lot of companies are really starting to pick up on this after the fact. What we’re seeing is people responding to improper information being posted online; we see a lot of complaining on social media. Pretty soon, the company is playing catch-up trying to save face.

Executive eye on content

dog wearing glasses

flickr: Monsieur Gordon

Bill: For a while now, the idea of having an executive sponsor for a major project like a content strategy implementation was fairly commonplace. But more and more, we are seeing executives who have a firm investment in the project if not direct ownership over it—so much so that we are seeing more titles out in the field for a chief content officer. [The title] was traditionally reserved for a lot of broadcasting and media companies. But now we’re seeing it in lots of different consumer companies, software, hardware, manufacturing.

Alan: There’s a stereotype out there: executives know nothing about the real world. There are some people [for which] that is true, unfortunately. They’re a little out of touch with what’s going on. But I have also seen the flip side of that very vividly where an executive basically says, “I don’t need all these metrics; I don’t need all these numbers. I can tell you exactly what is wrong with this part of the big picture. I know content is part of it. Fix it.” Sometimes, they have an uncanny ability to zero in on exactly what the problem is.

Accommodating experts as content contributors

Alan: I’ve really seen an increase in content creators wanting to be sure that subject matter experts, tool experts, [and] internal experts are able to contribute content easily to the process and to review content. More and more content creation systems are allowing people to actually get into the source—and not necessarily edit the source—but to suggest edits, and someone on the content creation side can approve it.

Gretyl: This goes along with the idea of getting more and more technologically inclined. That’s really helping our content strategy both inside and outside the organization. While we’ve talked a lot before about how [technology] is helping the customer, this is the flip side of that. If you’ve got content contributors who really don’t know anything about your actual [content] creation process—they just know about the subject they’re an expert in—this is using technology to your best advantage.

Cross-company focus in content strategy implementation

Sled dogs running

flickr: Frank Kovalchek

Gretyl: For me, this is a trend mostly about breaking down silos. These groups don’t interact and collaborate—and if they do interact, it’s to say, “Don’t touch my stuff.” What this leads to is content that really doesn’t work well together, and that can hold your company back. One of the ways you can get rid of this idea of silos is to develop a content strategy that encompasses all your content, and that will be more effective than just developing a content strategy, for example, for one department and ignoring all the others.

Bill: The cross-company focus is definitely on the rise, but it comes back to, “People really love their silos.” Even if they don’t, and they do want to work outside the silo, a lot of times, there is so much embedded in the way they need to work, the channels they have to go through because of financial reasons or because of reporting structures, that it is going to hinder this cross-company focused growth.

“Old” social media platforms decline

Bill: The social media landscape is going to change—it has been changing—but there are some outliers there that have been around for quite a while. Even though they still seem new to many of us, according to my kids and my kids’ friends, these are all “Mom-and-Dad networks all the old people are using.” They want nothing to do with it. They maintain a Facebook page to talk to family, but a lot of their actual interactions with friends are done on private networks, which is kind of scary.

It speaks to a different way of looking at social applications, social networking, and sharing of information. We’re not going to see a lot of Facebook, Google Plus, blogging, and other resources used by the younger generation. So, our social media and social engagement strategies around content strategy are going to need to adapt.

Alan: I think this goes back to one of the oldest and best guidelines for any kind of writing—tech comm, marcom, whatever—and that is know your audience. That’s the most important thing here. I think we’ve all seen in social media and advertising [when] companies selling fairly traditional products [are] trying to play like they’re all hip. It’s painful, absolutely painful, to see.

We had fun during the event. You can see for yourself when you watch the recording:

You’ll probably enjoy my “Did I really say that?” moments a lot more than I did!

Beneath the page: learning to see structure

January 6, 2015 by

We hear a lot about the learning curve for structured authoring, but what does that really mean?

Experienced knitters learn to read the work—they can look at a knitted object and map the knitted version to a written or charted pattern. This skill is extremely helpful in locating pattern mistakes. Beginning knitters usually can’t step outside their immediate concerns of needles, yarn, and unfamiliar motions.

Spotting a dropped stitch is easier for experienced knitters.

Reading the work // flickr: kibbles_bits

Similarly, listening to kids talk about video games is enlightening. They dissect the game, discuss the way a particular challenge is constructed, and argue about whether the various enemies are too easy or too hard. They also have an eye for game mechanics and game flow. More casual gamers (me) struggle just to figure out how to make the sword work.

Bloom’s (revised) taxonomy provides a general framework for cognitive learning:

  • Remember
  • Understand
  • Apply
  • Analyze
  • Evaluate
  • Create

You have to master the basics (remember, understand, and apply) before you can move up to the more sophisticated levels. For knitters, the analysis level is the ability to read the work. (Designing patterns is the “create” level.)

For structured content, we have a similar set of learning requirements:

  • Remember—learn basic ideas, such as elements, attributes, and hierarchy.
  • Understand—comprehend structured authoring concepts.
  • Apply—use elements and attributes as needed.
  • Analyze—look at a page (print or web) and understand how that page is constructed with elements and metadata.
  • Evaluate—assess whether a page is structured properly or develop best practices for using an existing tag set with unstructured content.
  • Create—develop your own set of elements and attributes to describe content (information architecture).
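As a small illustration of the “analyze” level, here is a Python sketch that walks a structured topic and reports its element hierarchy and metadata—the kind of under-the-page structure an experienced structured author learns to see. The topic and its element names are hypothetical, not drawn from any particular tag set.

```python
import xml.etree.ElementTree as ET

# A hypothetical structured topic; element names are invented for illustration.
PAGE = """
<topic id="install">
  <title>Installation</title>
  <body>
    <p>Download the package.</p>
    <note type="warning">Back up your data first.</note>
  </body>
</topic>
"""

def outline(elem, depth=0):
    """Return one indented line per element, with its attributes (metadata)."""
    attrs = " ".join(f'{k}="{v}"' for k, v in elem.attrib.items())
    lines = ["  " * depth + elem.tag + (f" [{attrs}]" if attrs else "")]
    for child in elem:
        lines.extend(outline(child, depth + 1))
    return lines

for line in outline(ET.fromstring(PAGE)):
    print(line)
```

Running this prints an indented outline of the topic—`topic`, then `title` and `body`, then the paragraph and the warning note—which is exactly the mapping from rendered page to underlying structure that the “analyze” level describes.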

And here is the crux of the structured authoring challenge:

Structured documents require authors to gain a deeper understanding of their documents than unstructured documents do. This is true even if the editing software hides elements and attributes from the author.

Just as moving from a typewriter to a word processor required additional skills, moving from a word processor to a structured document requires new skills. The software will get better and easier over time, but the cognitive leap required is permanent.

Content strategy and DITA and localization, oh my! Our best of 2014

December 29, 2014 by

Yes, you need another “best of 2014” list to round out the year. Without further ado, here are Scriptorium’s top 2014 blog posts on content strategy, DITA, and localization.

A hierarchy of content needs

Based on Maslow’s hierarchy of needs, the layers are, from bottom to top: available, accurate, appropriate, connected, and intelligent.

Maslow has his hierarchy of human needs: physiological, safety, and so on.

We have our hierarchy of content needs.

Ten mistakes for content strategists to avoid

As content strategy spreads far and wide, we are making old mistakes in new ways. Here are ten mistakes that content strategists need to avoid.

XML publishing: Is it right for you?

Wondering about a transition from desktop publishing to XML publishing for your content? Check out our new business case calculator. In five minutes, you can estimate your savings from reuse, automated formatting, and localization.
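For a rough sense of what such an estimate involves, here is a back-of-the-envelope reuse calculation. Every figure below is a made-up example for illustration, not output from Scriptorium’s calculator or a benchmark of any real project.

```python
# Hypothetical inputs -- all numbers are invented for illustration.
topics = 1000            # total topics in the content set
reuse_rate = 0.30        # fraction of topics shared across deliverables
cost_per_topic = 200     # authoring + maintenance cost per topic (USD)
languages = 3            # number of translated languages
cost_per_word = 0.20     # per-word translation cost (USD)
words_per_topic = 250    # average topic length

# Topics you no longer write, maintain, or translate more than once:
reused = int(topics * reuse_rate)
authoring_savings = reused * cost_per_topic
translation_savings = reused * words_per_topic * cost_per_word * languages

print(f"Authoring savings:   ${authoring_savings:,}")
print(f"Translation savings: ${translation_savings:,.0f}")
```

Even with modest assumptions, translation savings scale with the number of languages, which is why reuse numbers that look unremarkable in one language can dominate the business case in a multilingual program.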

Managing DITA projects (premium)

A DITA implementation isn’t merely a matter of picking tools. Several factors, including wrangling the different groups affected by an implementation, are critical to successfully managing DITA projects (registration required).

Content strategy mistake: replicating old formatting with new tools

photo of rubber stamp

(flickr: woodleywonderworks)

When remodeling your kitchen, would you replace 1980s almond melamine cabinets with the same thing? Probably not. Then why make the content strategy mistake of using new tools to re-create the old formatting in your content?


XML workflow costs (premium)

Everyone wants to know how much an XML workflow is going to cost. For some reason, our prospective clients are rarely satisfied with the standard consultant answer of “It depends.” This premium post (registration required) breaks down typical XML projects at four different budget levels: less than $50,000, $150K, $500K, and more than $500K.

Content strategy burdens: cost-shifting

In assessing an organization’s requirements, it’s important to identify content strategy burdens. That is, what practices or processes impose burdens on the organization? A content strategy burden might be an external cost, such as additional translation expense, or it might be an internal cost, such as a practice in one department that imposes additional work on a different department. A key to successful content strategy is to understand how these burdens are distributed in the organization.

Three factors for evaluating localization vendors

Localizing content can be a frustrating and expensive effort. In addition to per-word costs and turnaround times, keep these three key factors in mind when choosing a vendor.

XML product literature

Maria robot from movie Metropolis

“Machine Human” Maria in Metropolis (1927)

Your industrial products become part of well-oiled machines. Unfortunately, your workflow for developing product literature may not be as well-oiled. Using desktop publishing tools (such as InDesign) to develop product literature means you spend a lot of time applying formatting, designing complex tables, and so on. This post provides three examples of how XML can improve your processes for developing product literature.

Content strategy: first steps (premium)

Content: You’re doing it wrong. That’s easy for us to say—we rarely hear from people who are happy with their content. But are you ready for a major transformation effort? Our approach is to assess the overall content lifecycle, meet with all the stakeholders, identify needs, develop a strategy, and then execute the strategy. If you want a more incremental approach, consider these inexpensive first steps (registration required).

Content strategy vs. the undead

Lego zombie hunter; image via Kenny Louie on Flickr (Creative Commons)

Flickr: Kenny Louie

Implementing a content strategy can involve overcoming many challenges. Some of these challenges can be quite scary and hazardous to your strategy. In fact, overcoming these challenges is a lot like battling the undead.


Localization best practices (premium)

Localization—the process of adapting content to a specific locale—is a critical requirement for global companies. More than ever, products and services are sought, purchased, and consumed in multiple language markets. Proper localization practices are critical to drive sales, and they can save you time and money in production.

This post (registration required) describes best practices for efficient, effective localization.

Scriptorium wishes you the best for 2015! Please join us on January 8 for our annual trends webcast. This popular event is free, but registration is required.

Want to solve some content problems in 2015? Contact us. We’d love to help.

Celebrating the good stuff (Blog Secret Santa)

December 24, 2014 by

Or: A stranger takes over the Scriptorium blog and gets all enthusiastic about tone of voice
Merry Christmas, Scriptorium readers. And, Sarah O’Keefe, an especially Merry Christmas to you. I’m your writer, Santa, and this is your Blog Secret Santa gift. (Everyone else: yep, hi. I’m a random stranger writing for Sarah’s blog. Because Christmas is fun.)

And here’s your present: Four websites that perform the rare magic trick of taking things that are normally really boring and making them entertaining.

How do they do this? With quality content, of course (whatever that means). Behind that, though, these four sites are all absolute masters of ‘tone of voice’. They bubble with the enthusiasm of the person behind the keyboard.

Perhaps, Sarah, your gift might actually be many hours of enjoyable reading about things like science, philosophy and cooking. Especially if you don’t mind rude words. While assembling this list I’ve discovered a personal bias towards writers who swear like sailors. Who knew?

Anyway, on we go with Santa’s Celebration Of Wonderful Tone Of Voice (Potty-mouthed Internet Edition). In alphabetical order, we salute:

1. Myths Retold

Some guy called Ovid is behind this one. He takes myths from cultures all over the world and, as promised in the blog name, retells them. His style is a mix of epic poetry and long-winded stand-up comedy. Somehow, it works.

Every now and then the ‘mythology’ ends up being an old book, too. Like the fantastic Doctor Jekyll and Mister Hyde is Really About Meth:

And for a while, shit goes back to normal

Jekyll invites everyone over for dinner parties and it’s great

but then all of a sudden he stops having parties of any kind

and in London at this time that is a SERIOUS PROBLEM

so Utterson keeps trying to go hang out

but Jekyll just keeps being like NOPE STAY AWAY

until finally Utterson gives up and is like “Welp

I guess that’s why my momma always told me never to make friends with crazy people.”

Without this site, I would never have made it to the end of Jekyll and Hyde. I don’t know if you’ve noticed this or not, but a lot of the classics are seriously boring.

2. Philosophy Bro

The most impressive thing about Philosophy Bro—possibly the finest of the internet’s many, many bros—is that he really, really knows his stuff. If you took out the foul language and streetish slang, you’d have genuine philosophical essays, and accurate summaries of many of the world’s most important philosophical works. But then again, if you took out the foul language and streetish slang, no one would want to read it. Here’s a cut from John Locke’s ‘Second Treatise on Government’: A summary:

That’s the foundation of property – bros worked their goddamn asses off to make shit, so they had the right to that shit. An apple on a tree that no one owns is f&%$#*g useless – some bro had to pick it to eat it. Who are you to tell him he can’t eat an apple that he worked for? Yeah, I have no problem with them telling you to keep the f&@k out. Sorry I’m not sorry.

I like to think that this is actually John Locke’s first draft, and that he had a very patient editor.

3. Thug Kitchen

Ok, so with this thug’s vocabulary he’s going to wear out his M and F keys pretty soon, but sometimes it takes a lot to get noticed. And I’m not going to get offended by the only person in the world who can make vegan cooking both hilarious and attractive. That’s right: I just said “vegan” and “hilarious” in the same sentence. You don’t get that every Christmas.

Here’s a quick, slightly-cleaned-up taster that shows you how to make people read about soy-based nonsense in between slices of bread:

WHAT THE F#@% DID YOU EAT FOR LUNCH? If it wasn’t a summer tempah sammie, take the afternoon off and re-evaluate some shit.

I discovered Thug Kitchen when the love of my life did too much yoga and got weird about food. As that spun out into a literal unhealthy obsession, I started getting pretty mad at websites that recommended cutting all sorts of stuff out of your diet for no reason. But I could never get angry at Thug Kitchen.

4. What if? (xkcd)

If you don’t already read xkcd cartoons, you’re missing out. Or you’re not a nerd. Or both. Anyway, Randall Munroe, the ex-NASA scientist behind xkcd, takes reader questions and answers them on What If? The trick is that he’s smart enough to make his answers to crazy questions sound both sciency and plausible. Like when he was asked how long humanity would survive a robot apocalypse.

What people don’t appreciate, when they picture Terminator-style automatons striding triumphantly across a mountain of human skulls, is how hard it is to keep your footing on something as unstable as a mountain of human skulls. Most humans probably couldn’t manage it, and they’ve had a lifetime of practice at walking without falling over.

I should award extra tone of voice kudos here, because unlike the first three sites, What If? works its magic without a bunch of swearing. And that’s probably good.

So there you go. Science can be interesting. Mythology can be fun. Philosophy can be readable, and vegan cooking doesn’t have to be an over-earnest snore-fest. All it takes is something to make the content sparkle, like a perfectly worked tone of voice. If you’re struggling for traffic, you might have just found your 2015 New Year’s resolution: Sound like someone who loves what you’re writing about. Sound like yourself.

Merry Christmas!