Scriptorium Publishing

content strategy consulting

Content strategy patterns in the enterprise

June 13, 2016

What factors affect content strategy decisions? Every client has a different combination of requirements, and of course there are always outliers. But based on our consulting experience, here are some factors that affect content strategy decisions.

Is the content the product?

detail from Book of Kells

Book of Kells // flickr: Patrick Lordan

If yes, the content design will be relatively more important. The organization will want content development and publishing workflows that provide maximum flexibility so it can deliver the highest possible quality in the final product.

Are the writers professional writers?

Full-time content creators may have a preferred way of doing things, but they usually have experience with a variety of tools, and understand that different tools are appropriate for different organizations.

Are the writers volunteers or paid professionals? Does writing the content demand specialized knowledge?

Domain knowledge is always important. If your writers have extremely specialized knowledge, are volunteering their time, or both, then they effectively have veto power over the authoring environment. Tread with care.

Are there regulatory or compliance requirements?

If so, you can expect a company that is relatively more willing to invest in content (since a failure could mean Serious Consequences), but these companies also tend to move slowly and be risk-averse. Review workflows will be relatively more important for regulated companies.

How many languages are supported or need to be supported?

More languages means more interest in governance because mistakes and inefficiencies are multiplied across each language.

Can misuse of the product injure or kill people?

If the product is potentially dangerous, the organization will look for ways to minimize the risk. At the most basic level, this results in documents with pages and pages of hazard warnings. More advanced organizations work on product design to minimize operational hazards and design their content to support correct product usage. Compliance and regulatory requirements may also come into play.

How many people contribute content? Are they full-time or part-time contributors?

A huge pool of part-time content contributors usually means looking for a lightweight, easy-to-use authoring tool that does not require per-seat licensing. A large group of full-time writers usually means investing in automation because even small productivity gains are valuable.

tcworld China recap

April 25, 2016

The tcworld China event took place in Shanghai April 18 and 19. I was there to present on content strategy and advanced DITA (yes, I hear your gasp of surprise), but for me, the most interesting part of the trip was getting a chance to connect with the technical communication community in China.

Technical Communication in Chinese

“technical communication” in Chinese

There were more than 100 attendees at the event. Most of the people I met were from Shanghai, Beijing, and Shenzhen. There were also participants from other cities, like Nanjing, and from Japan and Singapore.

For those of us completely ignorant of Chinese geography (which I’m embarrassed to say included me until I found out about this trip), here is a basic map:

I don’t recommend making a strategic decision based on my single week in China, but nonetheless, here are some observations.

Blending authoring and localization

In several conversations, I heard about a blended authoring/localization workflow. Technical writers create information in Chinese and work with the engineers to have this information reviewed and approved. Once the Chinese document is finalized, the same technical writers rewrite the information in English. The English document becomes the starting point for localization into all other languages.

English as a pivot language is common in many places, but the difference here is that a single technical writer is expected to create both the Chinese and the English versions of a document. This means that the technical writers must be able to write in both languages.

Academic background

Chinese universities are just beginning to offer technical writing courses. These courses are often intended for engineers. Technical writing is not currently available as an academic major. Like North American technical writers, Chinese technical writers have varied educational backgrounds. The most common is a university degree in English or a related subject like English translation. Engineering or computer science majors also may end up in technical writing.

In English, we usually refer to people “falling into” technical writing, and German has the word “Quereinsteiger”; that is, “a person who climbs in sideways.” In Germany, however, a large percentage of technical communicators have university-level education in technical communication, and there is also a robust certification process.

It remains to be seen which approach the technical communication industry in China will choose, or whether China will choose a third way.

Business relevance

I delivered a presentation on content strategy in technical communication at the event. My key message, as always, was that you need to have business reasons as the driving force behind your content strategy decisions.

tcworld China slide: chocolate factory with a sign on the wall reading “400kg chocolate every three minutes.” The slide caption is “Justify your approach.”

I also spent some time discussing why cheap content is really expensive—product returns, legal exposure, and inefficient content processes all increase the cost of producing information.

tcworld China slide: two chocolate bunnies with their ears bitten off. The slide caption is “The myth of cheap content.”

Both of these messages seemed to resonate with the audience, but there was concern about how to get management support for any new content initiatives.

Several people told me that, in China, organizations are often not ready to invest in content or content strategy. Their corporate culture is to keep operational costs as low as possible. This makes the argument for content strategy investment, even with compelling ROI, a difficult one. That said, it is clear that some companies are shifting their strategy toward innovation—they are delivering cutting-edge products rather than commodities.

A view of the Bund and the river at night

Shanghai at night

There is an informal Association of Shanghai Technical Communicators, which communicates mainly via WeChat. If you can read Chinese, that would definitely be something to explore.

Platform differences

At home, I rely heavily on Slack (internal business), Twitter (mostly business), and Facebook (business and personal) for social media, along with email, Skype, web meeting tools, and more. Inside China, people use different platforms, such as WeChat (a messaging and social media platform). In part, this is because of the Great Firewall. Facebook, for example, is not officially allowed in China, and I expected to be blocked from using it.

What I found, however, was that in some locations I could use the Facebook mobile app over a cellular connection (but not Wi-Fi). In other locations, it appeared to be wide open. I had very little luck getting Twitter to work anywhere.

This presents a business problem for us. We want to continue to connect with the Chinese technical communication industry, but the social media tools we use are not appropriate for making those connections. Information posted on Twitter will not reach people in China, but the social media applications used in China are not widely used outside of China. We have a platform divide.

Communication challenges

Finally, I want to talk about some of the communication challenges I ran into. A colleague told me that the biggest challenge in China is that you are functionally illiterate, and even though many signs appear in both Chinese and English, this is quite true. Upon arrival, I hopped in a taxi and told the driver my hotel. But because the hotel name is different in Chinese, it wasn’t until I showed him the written address, in Chinese, that he understood where I needed to go. (Based on advice from colleagues, I was prepared with the necessary version of the address.)

Shanghai was actually easier in this regard than Shenzhen, where I also spent a couple of days. (This is probably a good spot to mention that Yuting Tang of tekom did a fantastic job organizing various outings, providing translation, and acting as a general fixer for me and other speakers. And I had a great time just hanging out with her! Without her, Shenzhen would have been a big challenge.)

In Shanghai, I had a twelve-hour time difference with my office in North Carolina. Given a conference during the Shanghai day, I generally had only a few hours in the evening for synchronous communication. That is, after I got back from one of our epic dining adventures until I fell into bed, I could check in with the office as needed. For a week-long stay, this wasn’t particularly critical. For an ongoing business relationship, though, this introduces obvious challenges. One (China-based) colleague had to leave an evening get-together to attend an 8 p.m. meeting. Another (visiting) colleague had previously scheduled a webcast, so he found himself at his computer at 11 p.m. local time. There’s not much that can be done about the time zones, but best practices like rotating meeting times (so that everyone shares the pain of the occasional 11 p.m. meeting) are important to show some respect to your team members.

 

I thoroughly enjoyed my time in China, and I was delighted to meet a few of the people working in technical communication across the country. I also made a significant dent in the country’s dumpling inventory. Many thanks to Michael Fritz at tekom for the invitation!

Dumplings!

Totally worth the trip.

 

The last mile: getting approval for your content strategy

April 18, 2016

You’ve thought about your content strategy. You have a business case. You have a plan. What you don’t have is a budget and approval to proceed. What can you do?

First, recognize your accomplishment. A solid strategy, business case, and plan already put you in the top 25 percent or so. But how do you get over this new hurdle and actually get a funded project with a green light to proceed?

At this point, it’s important to understand that the game has changed. Until now, all of your plotting and planning has been inside your content world. To get funding, you need executive approval, and executives by definition work on a broader scale.

To get funding, you have to show the value of your project (with a business case, and yours is beautiful). But that’s not enough. Organizations have limited budgets and lots of different projects are competing for scarce funding. You have to prove that your project is more deserving than the other projects. Otherwise, it’s a shiny nice-to-have that gets cut in the first round of budget negotiations.

Can you prove that your project is in fact mission-critical? Here are some factors to look at.

Return on investment

Can you show that the investment will yield increased revenue or cost savings? How long will it take for the organization to recoup the proposed investment? Are you arguing for efficiency and therefore lower cost, or are you arguing for an investment that will result in more revenue?

Another way to show return on investment is by accelerating time to market. If your proposal can speed up delivery of content in a global market, you have a compelling argument. Can you reduce a localization delay currently measured in months down to weeks?
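To make the payback question concrete, here is a minimal sketch of a payback-period calculation. All of the figures are invented placeholders for illustration, not benchmarks from any real project:

```python
def payback_months(investment, monthly_savings):
    """Months until cumulative savings recoup the up-front investment."""
    months = 0
    recovered = 0.0
    while recovered < investment:
        months += 1
        recovered += monthly_savings
    return months

# Hypothetical example: a $120,000 tooling investment that saves
# $8,000 per month in localization costs pays for itself in 15 months.
print(payback_months(120_000, 8_000))  # → 15
```

A one-function model like this is obviously simplistic (it ignores ramp-up time and discounting), but it gives executives the number they ask for first: how long until we break even?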

Time

Why are you asking for funding now? What happens if this project doesn’t happen until the next budget cycle? If the answer is “not much,” you can expect delays.

Perhaps you have a window of opportunity in which to make changes and think strategically before your next major product. Or perhaps your products are being redesigned in a way that makes the current strategy unsustainable. If you are increasing the number of required languages every year, the cost of inefficient content development is increasing quickly.

Keep in mind that implementing any sort of major change in content strategy is going to take at least six months. When your executives tell you, “Oh, we don’t need that until January 2017,” that means you need to get started absolutely no later than June 2016.

Timing is everything. Successful managers learn how the budget cycle and the project allocations really work, and figure out how to work the system. For example, you may have a CFO who responds well to efficiency and isn’t interested in innovation. Your CTO, on the other hand, may want to engage in a detailed discussion of nifty technology. Understand their priorities and work with them.

Customer journey

Technical writers are allergic to buzzwords. But tying your strategy into the current Next Big Thing is smart. With attention focused on the customer journey and the customer experience, your pitch for content strategy should include a focus on these concepts. How will your strategy support them?

Getting approval for your content strategy project requires you to understand how decisions are made in your organization, and then work within that process to get what you want. Some technical communicators feel that the quality of their work should speak for itself, and that these types of games are beneath them. We call them People Who Don’t Get Budget for their Projects.

Creating a unified customer experience with a content fabric

April 11, 2016

Coauthored by Anna Schlegel (Senior Director, Globalization and Information Engineering, NetApp) and Sarah O’Keefe (President, Scriptorium Publishing)

This post is also available as a white paper, which you can read in PDF format.

The interest in customer experience presents an opportunity for enterprise content strategists. You can use the customer experience angle to finally get content proposals and issues into the discussion. Ultimately, the challenge is in execution—once you raise awareness of the importance of content synchronization, you are expected to deliver on your promises. You must figure out how to deliver information that fits smoothly into the entire customer experience. At a minimum, that requires combining information from multiple departmental silos.

You need a customer experience that does not reproduce your organization’s internal structure. Customers need relevant, usable, and timely information—they don’t care that the video was developed by tech support and the how-to document by tech pubs. When customers search your web site, they want all relevant results, not just documents from a specific department. Furthermore, they assume you will use consistent terminology and will provide content in their preferred language. To meet these expectations, you need a unified content strategy.

At NetApp, the Information Engineering team uses the term content fabric to describe this unified approach. In the content fabric, customers get seamless delivery of the right information based on their needs. Multiple departments are involved in creating and delivering content. The processes are complex, but customers only see the end result. The content fabric aims to deliver information for each customer at the point of need.

Weaving a content fabric

To deliver a content fabric, you need the following key features:

  • Search across all content sources
  • Content in appropriate format(s)
  • Content in appropriate languages

Each of these requirements expands into significant implementation challenges. To provide search across all content sources, for example, you have to solve the following issues:

  • Provide search across more than a single deliverable (such as a PDF file)
  • Provide search across all deliverables from one department for one product
  • Provide search across all deliverables from all sources for one product
  • Provide search across all deliverables from all sources for all products
  • Align product classification schemes across the organization
  • Align product terminology across the organization
  • Align content localization across the organization

content fabric

Several teams typically share responsibility for content development and delivery. You might have the following:

  • Technical publications for technical communication
  • Technical support for knowledge base
  • IT for web site infrastructure
  • Digital experience for web site architecture and appearance
  • Marketing for technical white papers
  • Training for job aids and e-learning

Each group has a different perspective on information, a different tool set, and a different set of expectations from their content creators. But somehow, you have to ensure that their content works in the big picture.

Unifying content organizations is important

Delivering a seamless content fabric means that different organizations must deliver compatible information. There are two main options to achieve this goal:

  • Consolidate the content creators in a single organization
  • Ensure that diverse content creators use the same content standards

Consolidation makes sense for similar roles. For example, most organizations put the entire technical communication function in a single team. Technical support and marketing have important content to contribute, but their functions and priorities differ from those of tech comm. In this case, the sensible approach is to share content infrastructure, including the following:

  • Terminology and style guides. All content creators must use agreed-upon terminology to refer to the same thing. Everyone uses the same corporate style guide.
  • Taxonomy. The classification system for content is shared across the organization. For example, the organization defines a set of categories, such as product name, release number, and content type, that label products and information.
  • Content structure. A reference document always has the same structure, no matter who created it. Similarly, knowledge base articles always have the same structure across the organization.
  • Content formatting. Content formatting matches corporate branding standards. All company content looks related, and all content of a particular type matches. For example, videos always include the company logo in the lower right corner, use the same types of visuals, and include standard introductions and conclusions. These formatting standards are enforced throughout the organization for all videos, not just in a single department.
  • Search. All website content is searchable through a single search interface.
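One way to think about the shared taxonomy is as a controlled vocabulary that every department validates its labels against. The sketch below illustrates the idea; the facet names and values are hypothetical examples, not any real organization’s scheme:

```python
# Hypothetical shared taxonomy: facets and their allowed values.
TAXONOMY = {
    "product": {"WidgetPro", "WidgetLite"},
    "release": {"1.0", "2.0"},
    "content_type": {"reference", "how-to", "kb-article", "video"},
}

def validate_labels(labels):
    """Reject any label that falls outside the shared classification."""
    for facet, value in labels.items():
        if facet not in TAXONOMY or value not in TAXONOMY[facet]:
            raise ValueError(f"unknown label: {facet}={value}")

# Every department tags content with the same facets, so a single
# search interface can filter across all content sources.
validate_labels({"product": "WidgetPro", "content_type": "how-to"})
```

When tech pubs, support, and training all pass this kind of check, a customer’s faceted search can span every silo without special-casing each department’s labels.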

Connecting content development systems

The content fabric provides the reader with a single point of entry for information. This simple premise requires us to rethink how we develop and deliver information. The easiest way to deliver consistent information is to move all content creators into a single content development environment. Realistically, it’s more likely that you loosely connect multiple systems to produce a consistent end result. Challenges include different content development systems, taxonomies, search, and update cycles. Align these aspects to ensure that each content development pipeline delivers information that fits into the content fabric.

Adding translation to the mix

Globalization adds yet another layer of complexity to the content fabric. You must ensure that your unified delivery extends across all supported languages by making careful decisions about what, when, and how to translate. For example, if an organization wants to increase sales in South America, audit the content assets available in the local languages. Are you providing enough information? Is the content of high quality in the local language? Are you delivering Brazilian Portuguese (as opposed to Portuguese for Portugal) to Brazil? Is information being developed in the local languages or are you translating? If you are translating, what is your strategy for translation, localization, and transcreation?

The value of the content fabric

Why should organizations consider a content fabric like the one proposed at NetApp—a unified content strategy in which content efforts are carefully aligned across the enterprise? After all, it’s challenging to have consistency in a single department, let alone half a dozen groups across a far-flung organization.

The value of the content fabric is two-fold. First, you improve the customer experience. Instead of repeatedly transferring customers from one group to another, you provide the customer with a consistent, high-quality experience, in which questions are addressed in a single location. Second, you improve the overall content development process with less content redundancy and a single set of content development standards. In manufacturing terms, you are reducing waste and improving your quality control.

First steps toward your own content fabric

To begin the move toward your own content fabric, start with some basic questions:

  • What information do you need to deliver and where?
  • How is that information created and by whom?
  • What standards are needed?

Once you understand the current environment, you can look at improving two major areas:

  • Content development: Ensure that all content developers have the tools, technologies, and support they need to produce the right information.
  • Content delivery: Ensure that information is presented to the customer in a unified interface.

Scriptorium and the Information Engineering team at NetApp are collaborating on NetApp’s journey to the content fabric.

Author: Anna Schlegel

Anna Schlegel leads the Information Engineering team at NetApp as well as its Globalization Program Strategy Office. Anna is a native of Catalunya and a linguist at heart. Her Twitter account is @annapapallona. She loves tomatoes and eggplant. She does check baggage.

Content strategy for technical communication

February 22, 2016

Content strategy is the planned use of information to advance an organization’s goals. Your organization should have an enterprise content strategy that covers all customer-facing information, both persuasive content and informational content. Marketing content is generally persuasive, and technical content is generally informational.

First published: October 19, 2010
Updated: February 22, 2016

For marketing content, an enterprise content strategy means creating information that supports the organization’s communication strategy and aligns the voice and tone of content with the corporate branding across all channels.

For technical content, an enterprise content strategy means identifying business goals and then setting up a content development and delivery system that supports those goals.

Common business drivers for content strategy in technical communication include:

  • Compliance with legal and regulatory requirements
  • Controlling costs
  • Improving marketing and product visibility
  • Speeding up delivery (reducing time to market)
  • Integration with other content and data sources in the organization

Compliance

In many industries, compliance with legal and regulatory requirements is the first priority. Failure to produce content as required in a given market leads to expensive problems and, in extreme cases, the inability to do business at all. To support compliance requirements in technical content, consider the following steps:

  • Clearly document compliance requirements in each market. Requirements will vary by geographical location, product type, and other factors.
  • Pay particular attention to localization requirements. Failure to plan for localization in the product and content planning phase is expensive.
  • Identify critical regulatory differences among markets and develop a strategy to create and manage related content.
  • Ensure that content developers can easily identify approved content.
  • Provide an efficient workflow for compliance review.
  • Provide traceability—the ability to trace data back to its point of origin.

Controlling costs

Nobody wants to waste time or money, but many organizations have huge inefficiencies in content production and localization due to outdated technologies, convoluted workflows, and general neglect. If you localize content, efficiency gains in the source language(s) are multiplied across each language—two hours fixing page breaks in English might be acceptable, but repeating that process in 20 languages would take a week.
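The arithmetic behind that example is worth making explicit, since it is the core of the cost argument:

```python
# Back-of-the-envelope multiplier: a manual fix repeated in every
# localized version multiplies its cost by the number of languages.
hours_to_fix_page_breaks = 2   # tolerable for English alone
languages = 20                 # each language repeats the same fix
total_hours = hours_to_fix_page_breaks * languages

print(total_hours)  # → 40, i.e. a full 40-hour work week per release
```

The same multiplier works in your favor: automate the fix once in the source workflow, and the savings also scale across all 20 languages.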

To manage costs, consider the following:

  • Ensure that content development is efficient. You can apply lean principles to reduce wasted time and effort in content production.
  • Invest in content quality to reduce churn. For example, consistent terminology in content development results in more efficient localization.
  • Reuse content to reduce the total amount of content that you must manage.
  • Automate formatting, especially in multichannel workflows, to ensure consistency and timely delivery of the final output formats.
  • Capture metrics related to content costs. For example, a technical support organization might improve technical content to reduce the number of expensive technical support calls.
  • Invest in the right toolchain for your requirements.

Marketing support

Technical content is traditionally considered post-sales information, but potential buyers often seek out technical information to help them make a buying decision. Perhaps they want to understand how a specific feature works, or find out the exact product dimensions. Therefore, low-quality technical content can result in lost sales. Like marketing content, technical content is available to the customer throughout the customer journey, not just as an ugly piece of paper in the product box. A company that sells a luxury product needs all aspects of the product to support the premium branding. Technical content can either reinforce the marketing message or contradict it.

To provide marketing support, consider the following:

  • Ensure that the corporate branding guidelines are followed in all content. Marketing and technical content do not have to use the same exact design, but they should look as though they were created by the same organization.
  • Recognize that any customer-facing information has a de facto role in the sales process.
  • Ensure that localization strategy is aligned across the enterprise and not established department by department.

Reducing time to market

Reducing time to market can also drive a content strategy effort. By improving how content is created, managed, and delivered, you can deliver content faster. This tactic is especially important for localization. Many products cannot ship until their documentation is available in the local language. When you speed up the delivery of localized information, the organization gets revenue sooner in localized markets.

To reduce time to market, consider the following:

  • Look for opportunities to create content in parallel with product development.
  • Look for opportunities to localize in parallel with content creation.
  • Identify process improvement areas, especially those that eliminate delays.
  • Focus on smoothing out the critical path for content.

Integration with other content and data

Content and data need to flow across an organization. For example, CAD files produced by engineering are used in technical and marketing documents. Procedures created in technical communications are used by the training group. Knowledge base information moves into technical communications. Inventory information is connected to repair procedures. A focus on integration means understanding where information originates and how to share it efficiently and accurately with the people who need it.

Technical communication, training, support, and others in boxes, all interconnected with arrows. Below that, a list of possible deliverables with arrows going from the functional areas to the various output boxes.  It's a hot mess, which is basically the point.

Technical communication content integrates with other content functions across the organization. Graphic: Gretyl Kinsey

To integrate with other content and data, consider the following:

  • For each information type, understand its lifecycle in the organization: How is it developed, and by whom? Where does it go? How often does it change?
  • Assess storage and exchange mechanisms to ensure you can transfer information as needed. (Hint: if it’s copy and paste, you’re doing it wrong.)
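As a sketch of what the alternative to copy and paste looks like, structured exchange means handing content off as data that round-trips between systems without loss. The field names below are invented for illustration:

```python
import json

# A procedure authored once by tech comm, expressed as structured data
# rather than formatted text. (Field names are hypothetical examples.)
procedure = {
    "id": "proc-001",
    "title": "Replace the filter",
    "steps": [
        "Power off the unit.",
        "Open the access panel.",
        "Swap the filter cartridge.",
    ],
    "source": "techcomm",  # provenance supports traceability
    "version": 3,
}

exported = json.dumps(procedure)          # hand off to training, support, ...
assert json.loads(exported) == procedure  # round-trips without loss
```

Because the receiving system imports the data (rather than a pasted copy), updates to the source procedure can flow downstream automatically instead of drifting out of sync.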

 

Content strategy in technical communication requires you to align content development with business priorities. Instead of focusing on technical content as an expense, understand that it can be an asset. Like any major asset, content requires strategic planning to ensure that you extract maximum value from it.

The second wave of DITA

February 1, 2016

You have DITA. Now what?

More companies are asking this question as DITA adoption increases. For many of our customers, the first wave of DITA means deciding whether it’s a good fit for their content. The companies that choose to implement DITA find that it increases the overall efficiency of content production with better reuse, automated formatting, and so on.

Now, clients are looking for the second wave of DITA: they want to connect DITA content to other information and explore innovative ways of using information. The focus shifts from cost reduction to quality improvements with questions like:

  • How will our content strategy evolve as DITA evolves?
  • How do we make the most of our DITA implementation?
  • How do we tailor our DITA implementation to better suit our needs?
  • What can DITA do for us beyond the basics?
  • What other content sources are available and how can we integrate them with our baseline DITA content?
  • What new information products can we create using DITA as a starting point?
  • How can we improve the customer experience?

The second wave of DITA can go in two directions. In the apocalyptic scenario, the overhead and complexity of DITA exceeds any business value, so the organization looks for ways to get out of DITA. But if you measure implementation cost against business value before any implementation work begins, this scenario is unlikely. Instead, you can reap the benefits of a successful implementation and start exploring what else DITA can do for your business.

A huge wave and a tiny surfer.

Will you thrive or wipe out in the second wave? // flickr: jeffrowley

Extending DITA beyond the basics

Your first DITA implementation must achieve your objectives with minimum complexity. When the shock of using the system wears off, you can consider new initiatives:

  • Building additional specializations
  • Using advanced DITA techniques to accommodate complex requirements
  • Delivering new output files
  • Refining your reuse strategy

Integrating with other systems

In the first wave, organizations usually focus on getting their content in order—migrating to DITA and topic-based authoring, setting up reuse, establishing efficient workflows, and managing the staff transition into new systems.

In the second wave of DITA, the new baseline is a functioning, efficient content production process, and attention turns to requirements that increase the complexity of the system. For example, a company might combine DITA content with information in a web CMS, a knowledge base, an e-learning system, or various business systems.

Moving additional content types into the DITA CCMS is only one option to foster collaboration. Organizations can align content across different authoring systems. Another integration opportunity is combining business data (such as product specifications or customer information) with DITA content. Software connectors that allow disparate systems to exchange information are a huge opportunity in the second wave of DITA. You can share information as needed without forcing everyone into a single system.

Focusing on the business value of content

The emphasis is shifting. In the first wave, organizations focused on reducing the cost of producing content by improving operational efficiency. In effect, they built systems that reduced or eliminated waste in content manufacturing. In the second wave of DITA, the focus is on the business value of the content. After setting up the assembly line, the organization can build cars, er, content, with more and more features that authors and consumers need.

Some trends in this area include the following:

  • In localization, a shift from authoring in a single source language toward multilingual authoring. Product expertise is not confined to employees who are proficient in English. If your subject matter expert is most comfortable in Chinese, why not allow her to work in that language?
  • In management, an increasing recognition of the value of good content, and a demand for improvements.
  • In content creation, a greater recognition of the importance of content strategy and an increasing focus on the big picture.

DITA is a constantly evolving technology, and to get the most value out of your implementation, you must ensure that your content strategy evolves with it. Don’t stop at letting DITA solve your content problems—take advantage of the second wave of DITA and explore the many other ways it can advance your business.

We had some interesting discussion about the second wave of DITA during our 2016 content trends webcast, and we’d like to continue that in the comments. Are you in DITA and figuring out what comes next? Let us know.

Top eight signs it’s time to move to XML

January 11, 2016 by

How do you know it’s time to move to XML? Consult our handy list of indicators.

Italian version (Alessandro Stazi, January 28, 2016)

This list is in rough order from problems to opportunities. That is, the first few items describe situations where the status quo is problematic. Later in the list, you’ll see items that are more focused on improving the quality of the information you are delivering.

1. Overtaxed system

snail

Is your system fast enough? // flickr: Eirien

Your current system (tools and workflow) worked well in the past, but now it’s overburdened. Tasks take too long because you don’t have enough software licenses or enough people, or because your workflow has too many manual steps.

XML-based content is not the only way to solve this problem, but you can use an XML-based system to improve the efficiency of your operation:

  • XML content files have a smaller footprint than the equivalent binary files (because formatting is not stored in each XML file but instead centralized in the publishing layer).
  • You can use a variety of editors with XML files. Software developers might use their favorite programming text editors. Full-time content creators likely prefer an XML authoring tool. Getting software is less of a problem because not everyone needs a (potentially expensive) authoring tool.
  • Content creators spend a shocking amount of time on formatting tasks—up to 50% of their time. XML-based publishing replaces the ongoing formatting tasks with a one-time effort to create stylesheets.
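As a sketch of the first point, here is what a content-only XML file might look like. The element names follow DITA conventions but are purely illustrative; your content model may differ.

```xml
<!-- Illustrative XML topic: no fonts, margins, or page sizes anywhere.
     Formatting is applied later by centralized stylesheets. -->
<task id="replace-filter">
  <title>Replacing the filter</title>
  <taskbody>
    <steps>
      <step><cmd>Power off the unit.</cmd></step>
      <step><cmd>Remove the filter cover.</cmd></step>
    </steps>
  </taskbody>
</task>
```

Because the file stores only content and structure, it stays small, diffs cleanly, and renders identically no matter which editor touched it last.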

2. File management problems

Box labeled Fragile: Do Not Drop that has been dropped and crushed.

Not good. // flickr: urbanshoregirl

Your days are filled with administrative problems, such as the following:

  • Trying to manage increasingly fragile authoring tools in which formatting spontaneously combusts when you so much as breathe near your computer. (I’m looking at you, Microsoft Word.)
  • Searching through shared network drives, local file storage, and random archives for a specific file and, most important, the latest version of that file.

File administration is overhead at its worst.

The authoring tool problems are addressed by the simplicity of XML files—formatting is applied later in the process, so it cannot muck up your source files. (Note: Some software offers the option of saving to “MySoftware XML” format. In most cases, that XML does include formatting information, which destroys much of the value of an XML-based approach.)
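The difference is easy to see in a sketch. “MySoftware XML” below is a made-up stand-in for any tool-specific export format, and the attribute names are invented for illustration:

```xml
<!-- Hypothetical "MySoftware XML" export: formatting travels with every
     paragraph, so the file is as brittle as the binary original. -->
<para font="Arial" size="11pt" spaceBefore="6pt" indent="0.25in">
  Remove the filter cover.
</para>

<!-- Semantic XML: the element says what the text is;
     a stylesheet decides how it looks. -->
<cmd>Remove the filter cover.</cmd>
```

In the first version, a change to house style means touching every file; in the second, it means touching one stylesheet.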

The file search problem is a source and version control problem. The best solution for content is a component content management system (CCMS), in which you can track and manage the files. If, however, you cannot justify a CCMS for your organization, consider using a software source control system. Because XML files are text, you can use a common system such as Git or Subversion to manage your files. This approach doesn’t give you all the features of a CCMS, but the price is appealing. It’s also possible to manage binary files in a source control system, but you will run into additional limitations. (For example, you cannot compare versions of a binary file with the source control system’s diff tools.)

3. Rising translation and localization demands

Box with "no volcar" label.

No volcar. // flickr: manuuuuuu

Your “creative” workarounds to system limitations were acceptable when you only translated a few minor items into Canadian French, but now the company delivers in a dozen languages (with more expected next year), and correcting these problems in every language is getting expensive and time-consuming.

Localization is by far the most common factor that drives organizations into XML. The cost savings from automated formatting across even a few language variants are compelling. Furthermore, because most organizations use outside vendors for translation, it’s quite easy to quantify the cost of translation—you can just look at the vendor’s invoices.

4. Complex conditions

Most unstructured authoring tools offer a way to label information as belonging to a specific content variant and produce two or more versions of a document from a single file. For example, by flagging test answers as “Instructor,” a teacher could generate both a test and an instructor’s answer key from a single file.

In software documentation, a requirement to label information as belonging to a high-end “professional” version as opposed to the baseline product is common. Authors can then create documents for the baseline version and for the superset professional version from a single source.

With more complex variants, however, the basic flagging and filtering is insufficient. Consider, for example, a product that has the following variants:

  • U.S. product and European product with different safety requirements
  • Product used in different environments, like factories, mines, and retail establishments
  • Optional accessories, which change the way the product works
  • Components shared across different products, with small changes from product to product

In this example, you would need to create the following types of filters and have the ability to generate combinations of filters:

  • Regulatory environment
  • Operating environment
  • Accessory
  • Product (to identify variance in shared components)

In XML, you can use metadata to create these flags and filter on various combinations.
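A sketch of how this might look in DITA: the attribute names (otherprops, platform, product) are standard profiling attributes, but the values and content are invented for illustration.

```xml
<!-- Variant content flagged with DITA profiling attributes -->
<section id="safety">
  <title>Safety requirements</title>
  <p otherprops="eu">CE marking requirements apply to this unit.</p>
  <p otherprops="us">This unit complies with U.S. safety requirements.</p>
  <p platform="mine">In mining environments, use the sealed enclosure.</p>
  <p product="hoist-accessory">With the hoist accessory installed,
     observe the reduced load limit.</p>
</section>

<!-- A DITAVAL file selects a combination at publish time;
     this one builds the EU mining variant. -->
<val>
  <prop att="otherprops" val="us" action="exclude"/>
  <prop att="platform" val="factory" action="exclude"/>
  <prop att="platform" val="retail" action="exclude"/>
</val>
```

Each output variant is just a different DITAVAL file applied to the same source, so adding a variant does not mean copying content.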

5. No off-the-shelf solution meets your requirements

If your output requirements are exotic, it’s quite likely that no authoring/publishing tool will give you the delivery format you need out of the box. For example, you might need to deliver warning messages in a format that the product software can consume. Or you need information in strings that are compatible with web applications, perhaps in PHP or Python. JSON is increasingly required for data exchange.

If you are faced with building a pipeline to deliver an unusual format, starting from XML will be easier and less expensive than starting from any proprietary system.

6. More part-time content creators

In many XML environments, the full-time content staff is augmented with part-time content creators, often subject matter experts, who contribute information. This helps alleviate the shortage of full-time content people. Another strategy is to use XML to open up collaboration across departments. For example, tech comm and training departments can share the load of writing procedural information. Interchange via XML saves huge amounts of copying, pasting, and reformatting time.

Part-time content creators have a different perspective on authoring than full-timers. Their tolerance for learning curves and interface “challenges” generally decreases with the following factors:

  • Level of expertise. Subject-matter experts want to get in, write what they need to, and get out.
  • Level of compensation. Put too many obstacles in front of a volunteer, and your volunteer will simply drop out.
  • Scarcity of knowledge. The fewer people who understand the topic, the more likely your part-time content creators are to resist any workflow changes.

The solution is to focus on WIIFM (“What’s in it for me?”). If the content creator is accustomed to managing content in complex spreadsheets with extensive, time-consuming copy and paste, an XML system with bulletproof versioning and reuse will be quite popular.

7. Metadata

Text is no longer just text. You need the ability to provide supplemental data about text components. For example, you need to be able to identify the abstract section for each magazine article. Or you want to create a link from a book review to a site where you can purchase the book. Conveniently, a book’s ISBN provides the unique identifier you need, but you don’t want to display the ISBN in the review itself, so you need metadata.

Most unstructured tools let you specify metadata for a document (often, in something like “Document Properties”). XML lets you assign metadata to any document or document component, so you can include more detailed background information. (And you can use that metadata to filter the output results.)
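For the book review example, the idea might look like the following in DITA. This is a sketch: the topic title is invented and the ISBN is a made-up placeholder.

```xml
<!-- The ISBN travels with the topic as metadata, not as body text -->
<topic id="book-review">
  <title>Review: An Example Book</title>
  <prolog>
    <metadata>
      <othermeta name="isbn" content="978-0-000-00000-0"/>
    </metadata>
  </prolog>
  <body>
    <p>The review itself never displays the ISBN; the publishing layer
       can use it to generate a link to a bookseller.</p>
  </body>
</topic>
```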

8. Connectivity requirements

In some contexts, your text connects to other systems. These might include the following:

  • For a repair procedure, a link from a part to your company’s parts inventory, so that you can see whether the part is available and order it if necessary.
  • For software documentation, the ability to embed error messages and UI strings both in content and in the software itself.
  • For medical content, the ability to accept information from medical devices and change the information displayed accordingly. (For example, a high blood pressure reading might result in a warning being displayed in your medical device instructions.)

Does your organization show signs of needing XML? Can you justify the investment? Try our XML business case calculator for an instant assessment of your potential return on investment.

How to get budget for content strategy

January 4, 2016 by

One common roadblock to content strategy is a lack of funding. This post describes how to get budget, even in lean years (and recently, they have all been lean years!).

1. Establish credibility

Well before you ask for anything, you need a good reputation in your organization. You need management to know that you are:

  • Smart
  • Reliable
  • Productive
  • Great at solving problems
  • Not a complainer

Does your executive team occasionally ask for miracles? Make it happen, and be sure that they understand what you had to do to pull off the miracle.

Find ways to improve content that are inexpensive but have a real impact on cost and quality. For example, build out some decent templates that help people create content more efficiently and with higher quality even in your current, yucky system.

If you must complain about things, do so very far away from the Money People.

2. Identify a problem that the Money People care about

Your problems are the wrong problems. For example:

Keyboard with green "Budget" key in place of the regular Shift key.

Content strategy needs budget // flickr: jakerust

  • The content publishing process is inefficient and causes stress for the whole team during every release.
  • I hate this authoring tool, and I want to work in that cool new authoring tool.
  • Our content is not consistent from one writer to another.

These are all small-potatoes, internal problems. If you want funding for content strategy work, you need to communicate with executive management in terms they understand.

Hint: They understand M-O-N-E-Y.

So:

  1. For each release, the content publishing process takes 40 hours, per document, per language. We have two releases per year, with 5 documents, and 20 languages. That means we are spending
    40 x 2 x 5 x 20 = 8000 hours = roughly $400,000 per year on content publishing.
  2. Our organization has identified mobile-friendly content as a priority. Using our current authoring tools, we have to rework information to make it mobile-friendly. If we switch to a different tool, we could deliver mobile-friendly content immediately and automatically.
  3. Customers must currently search the technical support articles and the technical content separately. As a result, 20% of support phone calls are for information that is available in the technical content, but is not being found by customers.

3. Show ROI

After identifying the problem, show a solution that makes financial sense:

  1. An automated publishing workflow would eliminate that yearly recurring cost. The cost to implement it is roughly $150,000, so we come out ahead during the first automated release.
  2. The cost of the rework is roughly $100,000 per year, and delays delivery by four weeks. Investing in NewAuthoringTool will cost $50,000 for licensing and $30,000 for training.
  3. We want to improve the search facility to reduce the calls that can and should be solved by a search of technical content. Our technical support budget is $5M per year, so 20% is roughly $1M. We need $250,000 in funding to implement the new search, so we will break even in year 1 if we can reduce the not-found calls by 25%.

You will compete with other projects for limited funds, but a business-focused approach to content initiatives will ensure that your project is at least competitive with the other projects.


The advent of intelligent content

December 1, 2015 by

You can justify intelligent content with efficiency: more reuse, cheaper translation, better content management. The true value of intelligent content, however, is unlocked when you connect content across the organization.

For medium and large organizations, the efficiency argument easily wins the day. In a group of 10 or more writers, a one-time investment in automated formatting pays for itself almost immediately. When you add in the cost savings from localization, the financial benefits are compelling.

Efficiency, however, is not the only reason that intelligent content is important. Consider, once again, the rise of the printing press in the mid-1400s. The printing press was cheaper than hand-copying books, but it also had the following key differences:

  • Velocity. Printing is orders of magnitude faster than hand-copying.
  • Error rate. Printing errors are less common and easier to correct (because a mistake in printing setup results in an error in the same place in each copy).
  • Separation of copying and formatting. Scribes (and illuminators) performed copying and formatting simultaneously; printing separated the two functions. The printer typeset the pages; the printing press made the copies.

For intelligent content, we can also identify new characteristics:

  • Increased velocity, again.
  • Separation of content and formatting. Instead of applying formatting as the content is created, the formatting role is deferred until the information is rendered.
  • Metadata. Intelligent content can carry metadata, such as audience level or language.
  • Context. The context in which intelligent content is rendered can affect how the information is rendered.
  • Connection. We can integrate data, products, and content to provide new hybrids.

Of these, I think connection is the least understood today. The idea of hypertext is not particularly new, and we have basic examples already:

  • A reference to another book might take you to that book (or to an online bookseller’s listing of the book).
  • A reference to a part in a service manual could take you to a CAD drawing, a description of the part, or an order page.
  • In medical content, a mention of a specific disease could link to additional information about that disease.

But intelligent, connected content will go beyond these basic examples. Instead of laboriously encoding each link into information, I expect automatic linking. Based on metadata or other content characteristics, links will be generated when the content is rendered. The distinction between product and content will be blurred…when information appears as part of the product interface, is it part of the product? And there will be a need for product and content professionals who understand how to integrate data, product, and content to create a unified experience.

Intelligent content is about positioning information so that we are ready when these new requirements arise. A solution purpose-built to solve today’s problems will likely fail to address the new requirements. The investment in a more general solution may be worthwhile in order to keep your options open.

Sturm und DITA-Drang at tekom

November 16, 2015 by

This year’s tekom/tcworld conference reinforced the ongoing doctrinal chasm between North American technical communication and German technical communication.

I am speaking, of course, of the proper role of DITA in technical communication. If any.

Executive summary: There may be a valid use case for German non-DITA CMS systems, but vendors are shooting themselves in the foot with factually inaccurate information about DITA as a starting point for their arguments.

The program this year included several presentations, in both English and German, that provided the German perspective on DITA. They included the following:

The DERCOM effect

We also heard a great deal from a new organization, DERCOM. Founded in 2013, this organization is an association of German CMS manufacturers (the acronym sort of works in German) and includes Schema, Docufy, Empolis, Acolada, and three other member companies.

DERCOM has released a position paper entitled “Content Management und Strukturiertes Authoring in der Technischen Kommunikation” or (as you might expect) “Content Management and Structured Authoring in Technical Communication.” This document is available both in German and in English translation. Unfortunately, the link seems to be obfuscated. Go to the main DERCOM page and find a clickable link under “News.” DERCOM member Noxum has a direct link for the German version.

Uwe Reissenweber explicitly introduced his presentation as providing the official position of DERCOM.

Note that he used the German word “Lobbyist,” but perhaps “advocate” would be a better English translation than “lobbyist” since the latter term is so loaded with negative political connotations. Marcus Kesseler said that he was not speaking for DERCOM but rather for Schema in his individual presentation. Here is what I observed across the various presentations:

  • There was remarkable uniformity in the statements made by the various DERCOM members, even when they said they were speaking for their employer rather than the association.
  • There were a number of talking points that were repeated over and over again.
  • The descriptions of DITA were so technically inaccurate that they destroyed the credibility of the speakers’ entire argument and made it rather difficult to extract valid information.

For example, Uwe Reissenweber asserted that the DITA specialization mechanism, if used to create new elements (as opposed to removing them), does not allow for interoperability with other environments. That is, once you create new, specialized elements, you can no longer exchange your content with other organizations. This statement is technically inaccurate and betrays a fundamental misunderstanding of specialization. When you create a new element (for example, a warning), you base it on an existing element (for example, note). Because DITA maintains inheritance information, a downstream user would know that the warning element is based on note and can process it as a regular note element via a fallback mechanism. This is a critical—and unique—feature of the DITA architecture.

Marcus Kesseler asserted that vendor lock-in with DITA-based content is no different from lock-in with a system (like his) that uses a proprietary content model, because so much of the business logic is tied up in the CMS rather than the content model. The overall accuracy of this statement depends on how tightly business processes and other information are bound into the CMS. But it seems indisputable that it would be easier to move DITA content from CMS A to CMS B (with any attendant business logic issues) than it would be to move XML Flavor A from CMS A to XML Flavor B in CMS B. In the second case, you have to move all of the business logic and worry about possible incompatibilities between XML Flavor A and XML Flavor B.

“You can’t learn specialization in an afternoon.” This is a completely true statement from Uwe Reissenweber, to which I say, with great professionalism, “SO WHAT??” Surely we are not advocating the idea that anything that takes more than an afternoon to learn cannot be worth the effort.

After hearing these statements and others (see my Twitter feed for increasingly agitated coverage), it becomes difficult to take any of the presenters’ arguments seriously.
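As an aside, the fallback mechanism at issue can be sketched concretely. This is simplified: a real specialization also involves DTD or schema modules, and the module name warning-d is invented for the example.

```xml
<!-- Authors write the specialized element -->
<warning>Disconnect power before servicing.</warning>

<!-- DITA records the element's ancestry in its class attribute.
     The value below says: warning specializes topic/note. -->
<warning class="+ topic/note warning-d/warning ">
  Disconnect power before servicing.
</warning>

<!-- A downstream processor that has never seen <warning> matches
     " topic/note " in the class value and renders the element
     as a regular note. -->
```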
And this is unfortunate, because I do want to understand their position. Kesseler, for example, displayed a chart in which he made the case that business logic is embedded either in the CMS or possibly in the DITA Open Toolkit, but not in the core DITA topics.


His Schema co-founder, Stefan Freisler, believes that only 5–10% of return on investment realized from a CMS system is in the content model. Instead, the vast majority of the value resides in the workflow layer.

These are interesting points and worthy of further discussion.

DITA support in DERCOM CMSs?

Eliot Kimber, who has a lot more patience than I do (also, I had a scheduled meeting), stayed through a heated post-presentation Q&A with Kesseler. Kimber had this to say in his trip report:

It was an entertaining presentation with some heated discussion but the presentation itself was a pretty transparent attempt to spread fear, uncertainty, and doubt (FUD) about DITA by using false dichotomies and category errors to make DITA look particularly bad. This was unfortunate because Herr Kesseler had a valid point, which came out in the discussion at the end of his talk, which is that consultants were insisting that if his product (Schema, and by extension the other CCMS systems like Schema) could not do DITA to a fairly deep degree internally then they were unacceptable, regardless of any other useful functionality they might provide.

This lack of support is another starting point for discussion. (I would also note that it’s often not the Evil Consultants, but rather the Opinionated Clients, who are insisting on DITA.)

With a proprietary content model, you are putting your trust and a good bit of your ability to execute in the hands of your CMS vendor. Provided that the vendor does a good job of introducing new features that meet your needs, you could have a long and mutually beneficial relationship. But what if your vendor starts to falter? What if they are acquired and change their strategy to something that doesn’t meet your requirements? DERCOM members are asserting first that they are better at adapting information models to the needs of their clients and second, that the content model provides only a small part of the value of the CMS.

Do you throw your lot in with a vendor, their release cycle, their software development/R&D efforts, or do you choose to rely on a standard and therefore rely on the OASIS technical committee, with all of the advantages and disadvantages of the committee-based standards-building process?

If the content model is largely irrelevant to the CMS functionality, why not just have the best of both worlds and support DITA inside the Very Superior DERCOM systems? Some of the vendors are doing just that. Empolis supports DITA both in its core CLS offering and in a new, low-cost SaaS system that is under development.

It remains an exercise for the reader to understand why the other vendors are not following suit. Eliot says this:

DITA poses a problem for these products to the degree that they are not able to directly support DITA markup internally, for whatever reason, e.g., having been architected around a specific XML model such that supporting other models is difficult. So there is a clear and understandable tension between the vendors and happy users of these products and the adoption of DITA. Evidence of this tension is the creation of the DERCOM association (http://www.dercom.de/en/dercom-home), which is, at least in part, a banding together of the German CCMS vendors against DITA in general, as evidenced by the document “Content Management and Struktured Authoring in Technical Communication – A progress report”, which says a number of incorrect or misleading things about DITA as a technology.

During the German-language Intelligente Information panel, Professor Sissi Closs pointed out the importance of DITA as a multiplier. She mentioned that widespread adoption of DITA would lead to a network effect, in which the standard becomes more valuable because more and more people are using it and therefore training, support, community, and qualified employees are more readily available.

Some statistics

In North America, DITA is the clear leader in XML-based content work. We estimate that at least 80% of structured authoring implementations use DITA. The equivalent number for Germany is in the 5–10% range, based on research done by tcworld.

This chart was shown in Reissenweber’s presentation and attributed to tcworld as of 2015:

Chart: DITA usage statistics (tcworld, 2015)

Here is my English translation. In each grouping, the upper bar is for software companies and the lower bar for industrial companies.

Chart: English translation of the DITA usage data

Scriptorium’s perspective

For Scriptorium consulting projects, we use a standard methodology with roots in management consulting. In the assessment phase, we develop the following:

  • Business goals for the organization
  • Content strategy to support the identified business goals
  • Needs analysis
  • Gap analysis
  • Requirements
  • Implementation plan, ROI, and budget

The decision whether or not to use DITA is generally made in the requirements phase. Most North American companies, at this point in time, assume that DITA is the path of least resistance because of the large numbers of authoring tools, CMS systems, and supporting systems (like translation management and content delivery platforms) that support it.

DERCOM companies will have difficulty making inroads into this market without an affirmation that they can provide DITA support. Any advantages they might have in workflow or editorial management are irrelevant if they are ruled out as prospective vendors by a DITA requirement. Additionally, most of these vendors do not have much presence in North America, so support, training, and maintenance are a risk.

Classic case of disruption

In Germany, the DERCOM vendors are clearly dominant at this time. However, their continued insistence that their technology is superior and that the upstart DITA-based options should be ignored follows the classic pattern seen with disruptive technologies. When a disruptive technology offers a clear advantage that is different from the main feature set of the incumbent approach, the incumbents have great difficulty responding to the new challenge.

In the case of DITA, the disruptions are in the following areas:

  • A wide variety of options at each point in the tool chain (authoring tool, content management system, translation management system, content delivery portal, and others)
  • Incremental implementation. Because DITA can work both on a file system and on a CCMS, organizations can opt for an incremental implementation, where pilot projects are built and validated on the file system before CCMS selection begins.
  • Open standard. Interested parties can participate in the standards-development process through OASIS. Vendor assessment is based on conformance to the standard, which makes evaluation across vendors easier for the buyer. Content exchange is easier to implement for the buyer.
  • The ecosystem of DITA architecture, software vendors, consultants, open-source community, and more. (Take a look at the DITA projects available just on GitHub.)

NOTE: Translations from German are mine. If the original authors would like to provide a better or alternate translation, please leave a comment. My tweets of German presentations are real-time translations by me. I apologize in advance for any errors.

Additional reading (if by some miracle you made it this far):