Scriptorium Publishing

content strategy consulting

The second wave of DITA

February 1, 2016

You have DITA. Now what?

More companies are asking this question as DITA adoption increases. For many of our customers, the first wave of DITA means deciding whether it’s a good fit for their content. The companies that choose to implement DITA find that it increases the overall efficiency of content production with better reuse, automated formatting, and so on.

Now, clients are looking for the second wave of DITA: they want to connect DITA content to other information and explore innovative ways of using information. The focus shifts from cost reduction to quality improvements with questions like:

  • How will our content strategy evolve as DITA evolves?
  • How do we make the most of our DITA implementation?
  • How do we tailor our DITA implementation to better suit our needs?
  • What can DITA do for us beyond the basics?
  • What other content sources are available and how can we integrate them with our baseline DITA content?
  • What new information products can we create using DITA as a starting point?
  • How can we improve the customer experience?

The second wave of DITA can go in two directions. In the apocalyptic scenario, the overhead and complexity of DITA exceeds any business value, so the organization looks for ways to get out of DITA. But if you measure implementation cost against business value before any implementation work begins, this scenario is unlikely. Instead, you can reap the benefits of a successful implementation and start exploring what else DITA can do for your business.

Will you thrive or wipe out in the second wave? // flickr: jeffrowley

Extending DITA beyond the basics

Your first DITA implementation must achieve your objectives with minimum complexity. When the shock of using the system wears off, you can consider new initiatives:

  • Building additional specializations
  • Using advanced DITA techniques to accommodate complex requirements
  • Delivering new output files
  • Refining your reuse strategy

Integrating with other systems

In the first wave, organizations usually focus on getting their content in order—migrating to DITA and topic-based authoring, setting up reuse, establishing efficient workflows, and managing the staff transition into new systems.

In the second wave of DITA, the new baseline is a functioning, efficient content production process, and attention turns to requirements that increase the complexity of the system. For example, a company might combine DITA content with information in a web CMS, a knowledge base, an e-learning system, or various business systems.

Moving additional content types into the DITA CCMS is only one option to foster collaboration. Organizations can align content across different authoring systems. Another integration opportunity is combining business data (such as product specifications or customer information) with DITA content. Software connectors that allow disparate systems to exchange information are a huge opportunity in the second wave of DITA. You can share information as needed without forcing everyone into a single system.

Focusing on the business value of content

The emphasis is shifting. In the first wave, organizations focused on reducing the cost of producing content by improving operational efficiency. In effect, they built systems that reduced or eliminated waste in content manufacturing. In the second wave of DITA, the focus is on the business value of the content. After setting up the assembly line, the organization can build cars, er, content, with more and more features that authors and consumers need.

Some trends in this area include the following:

  • In localization, a shift from authoring in a single source language toward multilingual authoring. Product expertise is not confined to employees who are proficient in English. If your subject matter expert is most comfortable in Chinese, why not allow her to work in that language?
  • In management, an increasing recognition of the value of good content, and a demand for improvements.
  • In content creation, a greater recognition of the importance of content strategy and an increasing focus on the big picture.

DITA is a constantly evolving technology, and to get the most value out of your implementation, you must ensure that your content strategy evolves with it. Don’t stop at letting DITA solve your content problems—take advantage of the second wave of DITA and explore the many other ways it can advance your business.

We had some interesting discussion about the second wave of DITA during our 2016 content trends webcast, and we’d like to continue that in the comments. Are you in DITA and figuring out what comes next? Let us know.

XML workflow: gathering specifications for output

January 25, 2016

It is a common stereotype that an XML workflow for content is rigid, unbending, and free of creativity.

In my experience, the opposite is true. Creativity is necessary to develop XML workflows, and—if developed with specific requirements in mind—these workflows enable creative solutions. Business requirements often demand flexibility in how content is developed and distributed.

This is the first of three case studies about balancing creativity and standardization in XML workflows. This content first appeared in tcworld magazine.

Desktop publishing (DTP) software offers template designers a visual interface for developing the formatting of print and PDF output. The interface shows what the page will look like: the page size, the font for each paragraph, the spacing between paragraphs, and so on.

This visual aspect of template development is missing in structured workflows that generate automated outputs. Instead, programmers use a text or XML editor to develop stylesheet code that defines formatting. So, how can stylesheet programmers and DTP template designers collaborate most efficiently to identify formatting specifications for automated output?

Extensible Stylesheet Language-Formatting Objects (XSL-FO) is a common solution for creating PDF output from XML content.

Typical XML-to-PDF process

Coding in the XSL-FO stylesheets controls the formatting of the PDF output. The following sample code defines first-level headings:

<xsl:variable name="level-1-family">SansBold</xsl:variable>
<xsl:variable name="level-1-size">14pt</xsl:variable>
<xsl:variable name="level-1-line-height">15pt</xsl:variable>
<xsl:variable name="level-1-weight">normal</xsl:variable>
<xsl:variable name="level-1-style">normal</xsl:variable>
<xsl:variable name="level-1-indent">0pt</xsl:variable>
<xsl:variable name="level-1-color">#000000</xsl:variable>
<xsl:variable name="level-1-space-before">42pt</xsl:variable>
<xsl:variable name="level-1-space-after">4pt</xsl:variable>
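These variables might then be referenced where the heading is actually formatted. A minimal sketch of that idea follows; the template match and the use of attribute value templates are illustrative only, and a real DITA Open Toolkit plugin structures this differently:

```xml
<!-- Illustrative only: how the level-1 variables might feed the heading block -->
<xsl:template match="*[contains(@class, ' topic/title ')]">
  <fo:block font-family="{$level-1-family}"
            font-size="{$level-1-size}"
            line-height="{$level-1-line-height}"
            font-weight="{$level-1-weight}"
            font-style="{$level-1-style}"
            start-indent="{$level-1-indent}"
            color="{$level-1-color}"
            space-before="{$level-1-space-before}"
            space-after="{$level-1-space-after}">
    <xsl:apply-templates/>
  </fo:block>
</xsl:template>
```

Because the formatting values are isolated in variables, a designer's change request ("make it 16pt") touches one line rather than every template that emits a heading.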

In their early efforts to gather formatting information, Scriptorium’s stylesheet programmers extracted specifications from the paragraph styles, table styles, and so on, in the existing DTP templates. The programmers also received a list of changes from the template designers. Often, the designers made requests such as:

  • “Make the chapter heading a little bigger.”
  • “Change the heading color to the new corporate blue.”

These requests were perfectly reasonable, but programmatic formatting requires specific numbers. How many points is “a little bigger”? What is the Pantone color code for the corporate blue?

To improve communication between template designers and stylesheet programmers, a Scriptorium stylesheet programmer developed a process that relies on…stylesheets!

First, the programmer added detailed code comments to the stylesheets. For example, for the page size settings, the comment reads:

Physical page dimensions. US Letter is 8.5 in × 11 in; A4 is 8.3 in × 11.7 in. Dimensions that accommodate both paper sizes are 8.3 in × 11.0 in.

He then developed another stylesheet-based process that extracts the code comments and creates a Microsoft Word file. The Word file explains each setting and includes the default value. The template designer reviews the settings, modifies the values as necessary, and returns the revised Word file to the programmer.
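The extraction step can be sketched in a few lines. This is not Scriptorium's actual tool (which is stylesheet-based and produces Word output); it is a minimal illustration of the underlying idea, pairing each code comment with the setting that follows it so the settings can be dumped into a review document:

```python
import re

# Pair each <!-- comment --> with the xsl:variable immediately after it,
# producing (comment, variable name, default value) rows for review.
PAIR = re.compile(
    r"<!--\s*(?P<comment>.*?)\s*-->\s*"
    r"<xsl:variable\s+name=\"(?P<name>[^\"]+)\">(?P<value>[^<]*)</xsl:variable>",
    re.DOTALL,
)

def extract_settings(stylesheet_text):
    """Return (comment, variable name, default value) rows for review."""
    return [(m["comment"], m["name"], m["value"])
            for m in PAIR.finditer(stylesheet_text)]

sample = """
<!-- Font size of first-level headings. -->
<xsl:variable name="level-1-size">14pt</xsl:variable>
<!-- Space above first-level headings. -->
<xsl:variable name="level-1-space-before">42pt</xsl:variable>
"""

for comment, name, value in extract_settings(sample):
    print(f"{name}: {value}  ({comment})")
```

The real process renders these rows as a table in a Word file, which is what the designer reviews.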

Word file excerpt with page size information

Using the Word document to collect values greatly streamlines the development process. Programmers receive specific measurements they can add to the stylesheets. Also, the detailed code comments serve as a reference to those who later maintain the stylesheets in the XML workflow.

Download the file for PDF specifications

Next month: InDesign XML for well-designed print layouts

The cost of DITA specialization

January 18, 2016

One of the greatest benefits of using DITA is specialization. However, specialized DITA is more challenging and expensive to implement than standard, out-of-the-box DITA, which is something you should consider before you take the plunge.

In this follow-up post to Making metadata in DITA work for you and Tips for developing a taxonomy in DITA, you’ll learn about the cost of specialization, and how to decide whether it’s worthwhile for your business.

Know what’s involved

Is being a special snowflake worth the cost? (flickr: Dmitry Karyshev)

You’ve determined that DITA is the best solution for your company’s content, but now you have a choice to make—whether or not to specialize. Specialization means customizing the existing structure of DITA by adding, modifying, or removing elements to suit your needs.

The first step in your decision should be learning about what’s involved in the specialization process. If you specialize, you will need to:

  • Create a content model, or framework that shows how your content will be structured in DITA.
  • Develop the specialization, including custom DTDs, elements, and attributes.
  • Test the specialization with your content.
  • Make sure that your conversion process, output transforms, and tools work with your specialization (or modify them accordingly).

Implementing a DITA specialization will cost more—in terms of both time and expense—than standard DITA. Make sure you account for the added effort of specialization, especially if you’re on a tight schedule or budget.

Analyze your content

The structure of your content can help give you an idea of whether or not specialization is the best option for you. As you review your existing content, ask yourself:

  • How is your content structured? Keep in mind that even if your content is currently stored in an unstructured format, it probably still has an implied structure.
  • How closely does your content match the structure of standard DITA? If your content fits within standard DITA except for a few cases, it will likely be more cost-effective to rewrite those pieces of content than to create a specialization to handle them. However, if your structure differs significantly from standard DITA, you can probably make a strong case for specialization.
  • How consistent is your content? It can be tempting to use specialization to accommodate an inconsistent structure with numerous edge cases. But just because you can create specialized DITA doesn’t mean you should—especially if reworking your content to be more consistent is cheaper than specializing around it.
  • What semantic value does your content need? Can you assist your content creators by using element names that are more meaningful to them? If you’re in an industry with very specific language, such as pharmaceuticals, or if your company has a large, complex system of product names and categories, it might make sense to specialize—particularly when it comes to metadata.
  • How will your content be tracked? Do you or your audience need to search for and extract specific pieces of content (for example, a list of supplies from a datasheet for a certain product)? If so, creating a specialization that allows for semantic tagging might be the best (or only) way to accomplish this.

Estimate the costs

You’ve determined that your company could benefit from specialization based on the structure of your content. Now you’ll need to evaluate the costs involved in specialization so that you can present a strong business case for it. Here are some costs that could occur when you implement a DITA specialization:

  • Development costs. Do you have people in your organization who have the DITA knowledge and skill it takes to create your specialization? If so, you’ll need to account for their time and effort in your budget, especially if they already have other responsibilities. If not, you’ll need to reach out to an external resource (such as a consultant) or try to hire someone.
  • Conversion costs. Do you have legacy content that you plan to convert to DITA? How much? If you have enough content that you’ll be using a conversion vendor, ask them to estimate how much it will cost to convert your content using your specialization.
  • Output costs. What types of output will you need? How will your specialization affect the development of your output transforms? Depending on the nature of your specialization, your transforms may be more difficult or time-consuming to create than they would be with standard DITA.
  • Tool costs. What kind of support do the content management systems and authoring tools you’re considering have for your specialization? How difficult will it be to manage and update the specialization once your content is integrated? These factors can not only help you estimate the costs, but can also help you choose the right tools.
  • Localization costs. Do you need to translate your content into other languages? If so, keep in mind that the tool chain for any localization vendors you use must support your specialization, which could affect both vendor selection and implementation costs.
  • Testing costs. You’ll need to test your specialization at various stages throughout the implementation process, so make sure to allow for the cost of the time involved.

Specialization isn’t cheap or easy, and the decision to implement it shouldn’t be taken lightly. However, if it’s the best approach for your content, the costs involved are probably worthwhile. Now that you have a better understanding of the factors and costs of DITA specialization, you can make a more informed decision about whether or not to specialize—and support that decision with a stronger business case.

Top eight signs it’s time to move to XML

January 11, 2016

How do you know it’s time to move to XML? Consult our handy list of indicators.

Versione italiana (Alessandro Stazi, January 28, 2016)

This list is in rough order from problems to opportunities. That is, the first few items describe situations where the status quo is problematic. Later in the list, you’ll see items that are more focused on improving the quality of the information you are delivering.

1. Overtaxed system

Is your system fast enough? // flickr: Eirien

Your current system (tools and workflow) worked well in the past, but now it’s overburdened. Tasks take too long because you don’t have enough software licenses or enough people, or because your workflow has too many manual steps.

XML-based content is not the only way to solve this problem, but you can use an XML-based system to improve the efficiency of your operation:

  • XML content files have a smaller footprint than the equivalent binary files (because formatting is not stored in each XML file but instead centralized in the publishing layer).
  • You can use a variety of editors with XML files. Software developers might use their favorite programming text editors. Full-time content creators likely prefer an XML authoring tool. Getting software is less of a problem because not everyone needs a (potentially expensive) authoring tool.
  • Content creators spend a shocking amount of time on formatting tasks—up to 50% of their time. XML-based publishing replaces the ongoing formatting tasks with a one-time effort to create stylesheets.

2. File management problems

Not good. // flickr: urbanshoregirl

Your days are filled with administrative problems, such as the following:

  • Trying to manage increasingly fragile authoring tools in which formatting spontaneously combusts when you so much as breathe near your computer. (I’m looking at you, Microsoft Word.)
  • Searching through shared network drives, local file storage, and random archives for a specific file and, most important, the latest version of that file.

File administration is overhead at its worst.

The authoring tool problems are addressed by the simplicity of XML files—formatting is applied later in the process, so it cannot muck up your source files. (Note: Some software offers the option of saving to “MySoftware XML” format. In most cases, that XML does include formatting information, which destroys much of the value of an XML-based approach.)

The file search problem is a source and version control problem. The best solution for content is a component content management system (CCMS), in which you can track and manage the files. If, however, you cannot justify a CCMS for your organization, consider using a software source control system. Because XML files are text, you can use a common system such as Git or Subversion to manage your files. This approach doesn’t give you all the features of a CCMS, but the price is appealing. It’s also possible to manage binary files in a source control system, but you will experience additional limitations. (For example, you cannot compare versions of binary files within the source control system.)

3. Rising translation and localization demands

No volcar. // flickr: manuuuuuu

Your “creative” workarounds to system limitations were acceptable when you only translated a few minor items into Canadian French, but now the company delivers in a dozen languages (with more expected next year), and correcting these problems in every language is getting expensive and time-consuming.

Localization is by far the most common factor that drives organizations into XML. The cost savings from automated formatting across even a few language variants are compelling. Furthermore, because most organizations use outside vendors for translation, it’s quite easy to quantify the cost of translation—you can just look at the vendor’s invoices.

4. Complex conditions

Most unstructured authoring tools offer a way to label information as belonging to a specific content variant and produce two or more versions of a document from a single file. For example, by flagging test answers as “Instructor,” a teacher could generate both a test and an instructor’s answer key from a single file.

In software documentation, a requirement to label information as belonging to a high-end “professional” version as opposed to the baseline product is common. Authors can then create documents for the baseline version and for the superset professional version from a single source.

With more complex variants, however, the basic flagging and filtering is insufficient. Consider, for example, a product that has the following variants:

  • U.S. product and European product with different safety requirements
  • Product used in different environments, like factories, mines, and retail establishments
  • Optional accessories, which change the way the product works
  • Product components shared across different products with small changes

In this example, you would need to create the following types of filters and have the ability to generate combinations of filters:

  • Regulatory environment
  • Operating environment
  • Accessory
  • Product (to identify variance in shared components)

In XML, you can use metadata to create these flags and filter on various combinations.
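In DITA, for example, this flagging and filtering is typically handled with conditional attributes on the content plus a DITAVAL file at build time. A minimal sketch follows; the attribute values are invented for this example:

```xml
<!-- Topic content flagged for two of the product variants (values invented) -->
<p audience="us" otherprops="factory">
  Wear hearing protection when operating this unit on the factory floor.
</p>

<!-- DITAVAL file: build the U.S. factory manual -->
<val>
  <prop att="audience"   val="us"      action="include"/>
  <prop att="audience"   val="eu"      action="exclude"/>
  <prop att="otherprops" val="factory" action="include"/>
  <prop att="otherprops" val="mine"    action="exclude"/>
</val>
```

Generating a different combination (say, the European mining manual) means swapping in a different DITAVAL file, not touching the source content.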

5. No off-the-shelf solution meets your requirements

If your output requirements are exotic, it’s quite likely that no authoring/publishing tool will give you the delivery format you need out of the box. For example, you might need to deliver warning messages in a format that the product software can consume, or strings that are compatible with web applications, perhaps in PHP or Python. JSON is increasingly required for data exchange.

If you are faced with building a pipeline to deliver an unusual format, starting from XML will be easier and less expensive than starting from any proprietary system.
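As a sketch of how low that barrier is, a few lines of Python can turn XML message strings into the JSON a web application expects. The element names here are invented for illustration:

```python
import json
import xml.etree.ElementTree as ET

# Sketch: convert XML warning messages into JSON for a web application.
# The element and attribute names (msg, @id) are invented for illustration.
source = """
<messages>
  <msg id="E100">Disk full</msg>
  <msg id="E101">Network unreachable</msg>
</messages>
"""

root = ET.fromstring(source)
payload = {msg.get("id"): msg.text for msg in root.findall("msg")}
print(json.dumps(payload, indent=2))
```

Doing the equivalent from a proprietary binary format would require exporting or scripting against the tool's API first; with XML, the source is already parseable.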

6. More part-time content creators

In many XML environments, the full-time content staff is augmented with part-time content creators, often subject matter experts, who contribute information. This helps alleviate the shortage of full-time content people. Another strategy is to use XML to open up collaboration across departments. For example, tech comm and training departments can share the load of writing procedural information. Interchange via XML saves huge amounts of copying, pasting, and reformatting time.

Part-time content creators have a different perspective on authoring than full-timers. Their tolerance for learning curves and interface “challenges” generally decreases with the following factors:

  • Level of expertise. Subject-matter experts want to get in, write what they need to, and get out.
  • Level of compensation. Put too many obstacles in front of a volunteer, and your volunteer will simply drop out.
  • Scarcity of knowledge. The fewer people who understand the topic, the more likely your part-time content creators are to resist any workflow changes.

The solution is to focus on WIIFM (“What’s in it for me?”). If the content creator is accustomed to managing content in complex spreadsheets with extensive, time-consuming copy and paste, an XML system with bulletproof versioning and reuse will be quite popular.

7. Metadata

Text is no longer just text. You need the ability to provide supplemental data about text components. For example, you need to be able to identify the abstract section for each magazine article. Or you want to create a link from a book review to a site where you can purchase the book. Conveniently, a book’s ISBN provides the unique identifier you need, but you don’t want to display the ISBN in the review itself, so you need metadata.

Most unstructured tools let you specify metadata for a document (often, in something like “Document Properties”). XML lets you assign metadata to any document or document component, so you can include more detailed background information. (And you can use that metadata to filter the output results.)
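In DITA, for instance, the book-review scenario might carry the ISBN in topic-level metadata rather than in the displayed text. A sketch, with a placeholder ISBN value:

```xml
<topic id="book-review-example">
  <title>Book review</title>
  <prolog>
    <metadata>
      <!-- ISBN carried as metadata; never rendered in the review body -->
      <othermeta name="isbn" content="XXX-X-XXXX-XXXX-X"/>
    </metadata>
  </prolog>
  <body>
    <p>The review text itself never needs to show the ISBN.</p>
  </body>
</topic>
```

A publishing transform can read the `othermeta` value to build the purchase link while leaving the visible text untouched.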

8. Connectivity requirements

In some contexts, your text connects to other systems. These might include the following:

  • For a repair procedure, a link from a part to your company’s parts inventory, so that you can see whether the part is available and order it if necessary.
  • For software documentation, the ability to embed error messages and UI strings both in content and in the software itself.
  • For medical content, the ability to accept information from medical devices and change the information displayed accordingly. (For example, a high blood pressure reading might result in a warning being displayed in your medical device instructions.)

Does your organization show signs of needing XML? Can you justify the investment? Try our XML business case calculator for an instant assessment of your potential return on investment.

How to get budget for content strategy

January 4, 2016

One common roadblock to content strategy is a lack of funding. This post describes how to get budget, even in lean years (and recently, they have all been lean years!).

1. Establish credibility

Well before you ask for anything, you need a good reputation in your organization. You need management to know that you are:

  • Smart
  • Reliable
  • Productive
  • Great at solving problems
  • Not a complainer

Does your executive team occasionally ask for miracles? Make it happen, and be sure that they understand what you had to do to pull off the miracle.

Find ways to improve content that are inexpensive but have a real impact on cost and quality. For example, build out some decent templates that help people create content more efficiently and with higher quality even in your current, yucky system.

If you must complain about things, do so very far away from the Money People.

2. Identify a problem that the Money People care about

Your problems are the wrong problems. For example:

Content strategy needs budget // flickr: jakerust

  • The content publishing process is inefficient and causes stress for the whole team during every release.
  • I hate this authoring tool, and I want to work in that cool new authoring tool.
  • Our content is not consistent from one writer to another.

These are all small-potatoes, internal problems. If you want funding for content strategy work, you need to communicate with executive management in ways that they understand.

Hint: They understand M-O-N-E-Y.

So:

  1. For each release, the content publishing process takes 40 hours, per document, per language. We have two releases per year, with 5 documents, and 20 languages. That means we are spending
    40 x 2 x 5 x 20 = 8000 hours = roughly $400,000 per year on content publishing.
  2. Our organization has identified mobile-friendly content as a priority. Using our current authoring tools, we have to rework information to make it mobile-friendly. If we switch to a different tool, we could deliver mobile-friendly content immediately and automatically.
  3. Customers must currently search the technical support articles and the technical content separately. As a result, 20% of support phone calls are for information that is available in the technical content, but is not being found by customers.

3. Show ROI

After identifying the problem, show a solution that makes financial sense:

  1. An automated publishing workflow would eliminate that yearly recurring cost. The cost to implement it is roughly $150,000, so we come out ahead during the first automated release.
  2. The cost of the rework is roughly $100,000 per year, and delays delivery by four weeks. Investing in NewAuthoringTool will cost $50,000 for licensing and $30,000 for training, so we break even in less than a year.
  3. We want to improve the search facility to reduce the calls that can and should be solved by a search of technical content. Our technical support budget is $5M per year, so 20% is roughly $1M. We need $250,000 in funding to implement the new search, so we will break even in year 1 if we can reduce the not-found calls by 25%.
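The publishing-cost arithmetic above is easy to sanity-check in a few lines. The $50/hour loaded labor rate is an assumption implied by the "roughly $400,000" figure; adjust it for your organization:

```python
# Sanity check of the content publishing cost math above.
# The $50/hour loaded labor rate is an assumption implied by the
# "roughly $400,000 per year" figure; substitute your own rate.
HOURS_PER_DOC = 40
RELEASES_PER_YEAR = 2
DOCS = 5
LANGUAGES = 20
HOURLY_RATE = 50  # assumed loaded rate, in dollars

hours_per_year = HOURS_PER_DOC * RELEASES_PER_YEAR * DOCS * LANGUAGES
annual_cost = hours_per_year * HOURLY_RATE
automation_cost = 150_000  # one-time implementation estimate from item 1

print(hours_per_year)   # 8000
print(annual_cost)      # 400000
# One-time cost as a fraction of one year's recurring cost:
print(automation_cost / annual_cost)  # 0.375
```

Because the implementation costs less than half of one year's recurring spend, the project pays for itself before the second release of the year.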

You will compete with other projects for limited funds, but a business-focused approach to content initiatives will ensure that your project is at least competitive with the other projects.


The best of 2015

December 21, 2015

Let’s wrap up 2015 with a look back at popular posts from the year.

Scriptorium wishes you the best for 2016!

Buyer’s guide to CCMS evaluation and selection (premium)

“What CCMS should we buy?”

It’s a common question with no easy answer. We provide a roadmap for evaluating and selecting a component content management system (registration required).

Localization, scalability, and consistency

A successful content strategy embraces consistency and plans for scaling up in the future—which in turn means localization is more efficient.

To see how consistent XML-based content can save your company time and money, check out our business case calculator.

The talent deficit in content strategy

Content strategy is taking hold across numerous organizations. Bad content is riskier and riskier because of the transparency and accountability in today’s social media–driven world.

But now, we have a new problem: a talent deficit in content strategy.

Tech comm skills: writing ability, technical aptitude, tool proficiency, and business sense

“Technical Writing is only about what software you know!”

This comment from a LinkedIn post has it partially right: technical writers should have expertise with authoring software. But to be successful, they need a balance of skills.

DITA 1.3 overview

Curious about the additions to DITA in the version 1.3 spec? Here’s a quick rundown on scoped keys, cross-deliverable linking, and more.

Structured authoring: breaking the WYSIWYG habit

It can be difficult to switch from desktop publishing to structured authoring—especially breaking out of desktop publishing’s WYSIWYG authoring mode.

You can shake your WYSIWYG habits with these tips.

The Force Awakens: Content strategy and project hype

December 14, 2015

With the most anticipated film of the year—Star Wars: The Force Awakens—coming out this week, I couldn’t help but think about movie hype and how sometimes it leads to disappointment.

The same thing can happen when hype builds around content strategy. Excitement about implementing a new strategy can be good for an organization, especially when the alternative is hostility or resistance to change. But too much enthusiasm can have unintended consequences and result in failure. Here are some of the pitfalls of project hype and how you can avoid them.

Problem #1: Choosing tools too early

Star Wars at Heroes Con 2015 (photo by Gretyl Kinsey)

If your company uses outdated tools that make your content development process slow, tedious, and draining, you may be desperate for new ones. But over-excitement about upgrading your tools can make you more vulnerable to the “ooh, shiny!” factor when you’re looking at new options—and more likely to choose tools without proper investigation. Selecting tools too hastily increases your chances of getting stuck in a workflow that’s just as inefficient as your current one.

In The Empire Strikes Back, Luke Skywalker learned a valuable lesson about choosing the wrong tools for the job when he ignored Yoda’s advice to leave behind his weapons for a Jedi training exercise. You don’t want to make a similar mistake.

The solution: Take a step back and remember that your content strategy should come first. What are your business goals? What does your content development team need to achieve those goals, and what factors are standing in the way of them? Your content strategy should inform your tool choice, not the other way around.

Problem #2: Burning through your budget

When you’re overly eager to start implementing your content strategy, you might be more likely to overspend—especially in the early stages of the project, when you still have most or all of your budget available. If you go over budget in an early part of your implementation, it’s easy to tell yourself that you’ll make up for the loss in a later phase, whether or not that’s actually feasible. (We’ve seen companies fall into this mindset, but it’s a trap!)

Unexpected budget cuts or changes can crop up at any time in an organization. Implementations can also uncover costs you didn’t anticipate at first—maybe your conversion will cost more than you initially estimated, or your output requirements will become more complex halfway through the project. If you’ve been spending over budget in your excitement, your project will have a more difficult time surviving a sudden blow to the budget, and you might not be able to implement the solution that your organization really needs.

The solution: Plan carefully, spend wisely, and always include a budget backup plan in your content strategy. After all, you don’t want to end up like Han Solo, who spent most of the original Star Wars trilogy in debt to Jabba the Hutt.

Problem #3: Ignoring the sequence of your strategy

Implementations don’t always go exactly as planned, and they can sometimes hit a delay or come to a standstill. In your eagerness to keep things moving, you might be tempted to catch up in other areas if one part of your implementation starts lagging behind. However, content strategies usually involve phases with dependencies, and it’s important to pay attention to the order of these phases before you try to change it.

Jumping ahead to one stage of your implementation to offset lag in another—or trying to implement phases in tandem rather than in sequence to save time—can cause more problems than the delays themselves. The order of your content strategy matters. For example, before you can train your team, you need to know what tools you’ll be using, but before you can choose your tools, you need to define your content goals. If you skip ahead to a phase of your project without completing that phase’s prerequisites, you will most likely have to waste time and resources repeating that phase properly at a later time.

The solution: Stick to your strategy. If your project falls behind, find out what’s causing the delay and address that directly. Resist the temptation to skip ahead. Can you imagine what would have happened if the rebels tried to destroy the Death Star before Princess Leia brought them the plans?

Problem #4: Intimidating your team

If you’re the only one who’s excited about your content strategy implementation, there’s nothing wrong with trying to motivate the rest of your team—but be careful not to go overboard. If everyone around you is facing an intimidating learning curve or major changes to their everyday work experience, hyping up the project could backfire. If you make your team feel pressured, however unintentionally, they might respond with even stronger resistance to change, which is the opposite of what you want.

The solution: Be a leader. Give others the education, encouragement, and support they need to be on board with the implementation. Emphasize the ways your new strategy will improve things for your content creation team—and, if they have concerns, listen and address them. Remember Leia’s words of wisdom about the Empire’s heavy-handed rule of the galaxy and take a more diplomatic approach with your team.

It’s great to be enthusiastic about your new content strategy, but too much hype can set you up for failure. As Yoda told Luke, “Adventure… excitement… a Jedi craves not these things.” A calmer, more controlled approach will lead to a more successful implementation—and help you channel the Force of your positive energy into something you can use to your advantage.

The rise of part-time contributors

December 7, 2015 by

Content creation should no longer be the exclusive domain of full-time writers. Employees in other departments can offer valuable information that your company’s content should capture. Where can you find these part-time contributors?

Two groups that immediately come to my mind are product development and support.

Product development

Box with part-time ninja on it

DeviantArt: tinkun113

Product engineers have contributed content indirectly since the first product specifications were released. Specifications are usually rewritten by those in tech comm, marcom, training, and other groups focused primarily on content development, but the core information is still the same.

Because engineers have deep product knowledge, it makes sense to integrate them directly into the content development process. They can write small bits of content, review others’ content for accuracy, and so on.

I’m not talking about maintaining the old-school methods of sharing and reviewing content, either—the days of sharing content via email and hard-copy markups are waning, thankfully. Many of today’s desktop publishing tools offer trackable electronic reviews of source content. Content management systems for structured workflows have specific features that enable part-time contributors to develop and review content.

Product support

The support organization talks directly to customers to help them use a product successfully. Those conversations—whether via phone, chat, or other medium—often expose deficiencies in existing product content. Those conversations also uncover customers’ product uses that engineers and full-time content creators never considered.

Getting that valuable real-world information into product content is essential. Giving support staff access to the content creation process can help ensure that information flows back into product content.

Integrating support staff into content development is a two-way street. Support staff contribute critical real-world information from customers, but they also receive valuable information about new features, bug fixes, and so on, by reviewing updated information flowing from the product development group.

Synthesizing the knowledge and skills of part-time and full-time content contributors means your company has a much better chance of delivering useful information to customers. Have you integrated part-time contributors into your content workflows? What departments in your organization include part-time contributors? Please leave comments about your experiences—both good and bad—below.

The advent of intelligent content

December 1, 2015 by

You can justify intelligent content with efficiency: more reuse, cheaper translation, better content management. The true value of intelligent content, however, is unlocked when you connect content across the organization.

For medium and large organizations, the efficiency argument easily wins the day. In a group of 10 or more writers, a one-time investment in automated formatting pays for itself almost immediately. When you add in the cost savings from localization, the financial benefits are compelling.

Efficiency, however, is not the only reason that intelligent content is important. Consider, once again, the rise of the printing press in the mid-1400s. The printing press was cheaper than hand-copying books, but it also had the following key differences:

  • Velocity. Printing is orders of magnitude faster than hand-copying.
  • Error rate. Printing errors are less common and easier to correct (because a mistake in printing setup results in an error in the same place in each copy).
  • Separation of copying and formatting. The copy function and the formatting function performed simultaneously by scribes (and illuminators) were separated. The printer would typeset pages; the printing press made the copies.

For intelligent content, we can also identify new characteristics:

  • Increased velocity, again.
  • Separation of content and formatting. Instead of applying formatting as the content is created, the formatting role is deferred until the information is rendered.
  • Metadata. Intelligent content can carry metadata, such as audience level or language.
  • Context. The context in which intelligent content is rendered can affect how the information is rendered.
  • Connection. We can integrate data, products, and content to provide new hybrids.
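To make the metadata and context ideas above concrete, here is a minimal sketch in Python. The topic structure, field names, and audience values are all hypothetical illustrations, not DITA markup or any real system’s API; the point is simply that the same content, tagged with metadata, can be rendered differently depending on the delivery context.

```python
# Illustrative sketch: a topic carries metadata (audience), and the
# rendering step decides both whether and how to present it.
# All names here are invented for the example.

def render(topic, context):
    """Render a topic differently depending on the delivery context."""
    # Metadata filtering: skip content whose audience doesn't match the reader.
    if topic["audience"] != context["audience"]:
        return ""
    # Context-sensitive formatting: formatting is applied at render time,
    # not baked into the content itself.
    if context["channel"] == "web":
        return f"<h1>{topic['title']}</h1><p>{topic['body']}</p>"
    # Plain text for, say, an in-product help panel.
    return f"{topic['title'].upper()}\n{topic['body']}"

topic = {
    "title": "Resetting the device",
    "body": "Hold the power button for ten seconds.",
    "audience": "end-user",
}

print(render(topic, {"audience": "end-user", "channel": "web"}))
print(render(topic, {"audience": "end-user", "channel": "help-panel"}))
```

The same topic produces HTML in one context and plain text in another, and disappears entirely for the wrong audience—the separation of content and formatting in miniature.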

Of these, I think connection is the least understood today. The idea of hypertext is not particularly new, and we have basic examples already:

  • A reference to another book might take you to that book (or to an online bookseller’s listing of the book).
  • A reference to a part in a service manual could take you to a CAD drawing, a description of the part, or an order page.
  • In medical content, a mention of a specific disease could link to additional information about that disease.

But intelligent, connected content will go beyond these basic examples. Instead of laboriously encoding each link into information, I expect automatic linking: based on metadata or other content characteristics, links will be generated when the content is rendered. The distinction between product and content will be blurred—when information appears as part of the product interface, is it part of the product? And there will be a need for product and content professionals who understand how to integrate data, product, and content to create a unified experience.
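As a rough sketch of what render-time automatic linking could look like, the snippet below wraps known terms in links drawn from a metadata catalog rather than from hand-coded cross-references. The catalog, terms, and URLs are invented for illustration; a real system would draw them from content metadata or a terminology database.

```python
import re

# Hypothetical link catalog: term -> target. In practice this would be
# derived from metadata, not maintained by hand.
catalog = {
    "part 4711": "https://example.com/parts/4711",
    "measles": "https://example.com/diseases/measles",
}

def auto_link(text, catalog):
    """Generate links at render time for any catalog term found in the text."""
    for term, url in catalog.items():
        pattern = re.compile(re.escape(term), re.IGNORECASE)
        text = pattern.sub(lambda m: f'<a href="{url}">{m.group(0)}</a>', text)
    return text

print(auto_link("Order part 4711 if the sensor fails.", catalog))
# → Order <a href="https://example.com/parts/4711">part 4711</a> if the sensor fails.
```

Because the links are generated when the content is rendered, adding a new target to the catalog updates every occurrence everywhere—no one has to revisit the source topics.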

Intelligent content is about positioning information so that we are ready when these new requirements arise. A solution purpose-built to solve today’s problems will likely fail to address the new requirements. The investment in a more general solution may be worthwhile in order to keep your options open.