
Easy ways to undermine marketing with content strategy

October 24, 2016

Does your content deliver on your marketing promises?

“Our products lead the industry…”

but we can’t create a decent mobile experience.

Karen McGrane writes in the Harvard Business Review:

You don’t get to decide which device your customer uses to access the internet. They get to choose. It’s your responsibility to deliver essentially the same experience to them — deliver a good experience to them — whatever device they choose to use.

Any claim of cutting-edge industry leadership must be supported by a good web site experience, and that includes the mobile experience.

“We serve clients in 47 countries…”

provided that they speak English, because we do not offer localized products or content. Also, we use a lot of jargon in English, so good luck to customers with limited English proficiency.

“We care deeply about our customers…”

but only those users with perfect color vision, excellent fine-motor control, and pristine hearing. We use tiny, trendy low-contrast type. We do not provide alternatives to mouse-based navigation. We make heavy use of video, and we do not provide captions as an alternative to listening to the video.

“Our product offering is flexible and configurable…”

but our web site doesn’t work on Safari.

“We offer a luxury experience…”

as long as you don’t need help charging the batteries because that document is incomprehensible in any language.


Localization strategy: improving time to market

October 17, 2016

This post is part of a series on the value proposition of localization strategies.

You can make localization “better” by taking a look at localization value. Quality and cost are important value factors, but improved time to market returns the greatest value.

Improving time to market for localized products and content is no easy task. It’s not as simple as adding more translators to the effort; that may cause more problems (and more delays). Improving time to market involves moving localization up the project chain, and to do so effectively requires a localization strategy.

An effective localization strategy begins with the same foundation as other effective communication strategies: an audience-first approach. Who are you targeting? For what purpose? What do they need? What do they value?

[Image: cat closely inspecting a butterfly. Caption: Inspect every detail!]

At the very beginning of a project, the entire audience needs to be considered for every aspect of the project.

  • Marketing campaigns must be culturally and legally appropriate
  • Sales materials and contracts must be reviewed
  • Pricing must be adjusted in some cases
  • Product design must consider all local requirements

The list goes on and on. Every aspect of the project must be evaluated against every aspect of every target market. Doing so will identify variations in requirements before they become problems, and will identify opportunities before they are lost.

What does all of this have to do with time to market? It all starts with setting realistic expectations. The more you know about your target audiences, the earlier you can begin to define requirements, avoid unexpected issues, and plan your release strategy. You are also able to take an iterative approach to translation that runs parallel to development, and build localization testing into your core product testing.

In short, implementing a localization strategy helps you remove many unknowns from the tail end of a project and allows you to optimally target all audiences from the very beginning.

Have you experienced an unpleasant localization surprise at the tail end of a project? Have you improved your time to market by making changes to how you handle localization? Please share your stories in the comments.

DITA to InDesign: the gruesome details

October 10, 2016

We’ve written before on what lurks beneath the surface of an InDesign file, and how drastically it differs from the DITA standard. When you’re looking at going from DITA to InDesign, though, there’s a lot that you need to take into consideration before you jump in.

[Image: an apt visualization of today’s theme. Lisa Risager, Flickr]

DITA separates formatting from content, but formatting content is one of the most powerful features that InDesign offers. You need to prepare your DITA content for the transition from a no- or low-design environment to a high-design platform. You also need to ensure that your InDesign environment is ready, or you’ll wind up with inconsistently formatted content, or worse, output that crashes InDesign when you try to import it.

The DITA side

Taxonomy: You need to know your content. InDesign offers a wide range of ways to format your content, but there’s not always a direct mapping from DITA. For example, a paragraph element could have a basic body style applied, or it might need a style with a different margin. How do you determine this?

  • Robust metadata will allow you to identify elements that need to be treated differently. The quickest way is to use the outputclass attribute (see the sketch after this list), but for subtle variations on a style, you may need to consider…
  • Specialization allows you to define custom structures. If you have a type of admonition that lacks a label and adds detail to text nearby, you might create a callout element.
  • Don’t forget the stock offerings of the DITA specification. Images in particular can already specify things like alignment, which may fulfill your needs.
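
For example, here is a minimal sketch of the outputclass approach; the class value and the implied InDesign style mapping are hypothetical:

    <!-- DITA source: outputclass flags the variant paragraph -->
    <p>A standard paragraph that maps to the default body style.</p>
    <p outputclass="body-indented">A variant paragraph; the transform maps
      outputclass="body-indented" to a matching InDesign paragraph style.</p>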

Information overload: As a desktop publishing (DTP) platform, InDesign takes some shortcuts, and images in particular are a challenge. When you add images to your DITA content, be sure to include both height and width information. This is due to the way InDesign stores image display information. Rather than saying that an image appears at a point in the document and is X pixels wide and Y pixels high, InDesign identifies an anchor point, then a series of four points that describes a frame, and then places the image within that frame. Without both height and width, or an image format from which those dimensions can be derived, you’ll have trouble defining how the image displays. The moral of the story: if you have the information available, include it.
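
As a sketch, a DITA image reference that carries both dimensions might look like this (the file name and values are hypothetical):

    <image href="register-diagram.png" width="400px" height="250px">
      <alt>Register bit field diagram</alt>
    </image>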

Just plain weird stuff: While Adobe has made the IDML standard public, InDesign itself doesn’t anticipate someone placing raw code into a template. This results in some very strange behavior.

  • If you have a paragraph element that ends with bolded text, when you place your output into a template, all of the text following that paragraph element will be bolded until InDesign finds another character style element.
  • If something goes wrong with your output and InDesign doesn’t like it, one of two things will happen: the offending content will be dropped, or InDesign will crash without any error. Debugging this can be an exercise in patience.

The InDesign side

The most important part of preparing the InDesign portion of your workflow is getting your templates in order. They should either be prepared before you begin working on your DITA taxonomy requirements, or developed alongside them.

  • Do you need more than one template, or can you use a master template? If you need specific layouts or master pages, you’ll need multiple templates. If the paragraph, character, or object styles between those templates differ, you’ll need to communicate that to whoever is working on your plugin.
  • How well-defined are your object styles? You need to take into account not only things like margins, but also word wrap.
  • Do any of your style names have special characters in them? You need to avoid that. If you can’t, the style references on the DITA side must be escaped, and if they’re not escaped properly, InDesign will crash when you try to place your content into the template.
  • Do your paragraph styles have their hyphenation properties set up correctly? If you know you have tables that will be narrow, you need to be careful about this. If the text in a table cell is too short to become overset, but long enough to hyphenate and then become overset, InDesign will crash when you try to place your content into the template.


While transforming DITA into an ICML file will allow you to quickly get your content into InDesign, it isn’t a smart process (see the stripped-down ICML skeleton after this list).

  • Since an ICML file lacks any kind of page information, the only page breaks that will appear are those that are dictated by your paragraph styles.
  • An image only knows where its anchor point is relative to the body content it appears near. This means that if you have multiple images in close proximity, there’s no way to prevent them from overlapping.
  • When you auto-flow content into a page in a template, InDesign uses the same master page throughout. If you have sections of your content that require a different master page, you’ll have to apply it by hand.
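
For reference, here is a stripped-down sketch of the ICML shape such a transform produces. The element structure follows Adobe’s IDML specification, but the names and attribute values here are illustrative, not a complete valid file:

    <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
    <?aid style="50" type="snippet" readerVersion="6.0" featureSet="257"?>
    <Document DOMVersion="8.0" Self="doc">
      <RootParagraphStyleGroup Self="rpsg">
        <ParagraphStyle Self="ParagraphStyle/Body" Name="Body"/>
      </RootParagraphStyleGroup>
      <Story Self="story">
        <ParagraphStyleRange AppliedParagraphStyle="ParagraphStyle/Body">
          <CharacterStyleRange
              AppliedCharacterStyle="CharacterStyle/$ID/[No character style]">
            <Content>Body text flows here.</Content>
            <!-- A paragraph break is an explicit Br element -->
            <Br/>
          </CharacterStyleRange>
        </ParagraphStyleRange>
      </Story>
    </Document>

Note that every run of text lives inside a CharacterStyleRange. That is why the bolded-text problem described above bleeds formatting forward: if a character style isn’t explicitly reset, InDesign keeps applying it until it finds the next character style element.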

Despite these limitations, being able to leverage DITA’s portability and reusability with InDesign’s high-design environment remains a tantalizing prospect. PDF allows for quick, consistent publishing of your content, but any edits require new output, and any formatting updates require changes to the plugin. If you have a production pipeline that utilizes InDesign and you value the fine control that it grants you, a DITA to InDesign workflow may be worth it.

Using XML to integrate database content and desktop publishing

October 3, 2016

This article shows how Scriptorium helped one company use XML to integrate information in a database with desktop publishing content.

In most enterprises, useful content exists in a number of different tools or databases. To include that content in your publications, you might use traditional ways of moving the information, such as copy and paste. However, it can be far more reliable, repeatable, and efficient to automate conversion from those tools and integrate the result directly into your publishing solutions.

A large manufacturer of integrated circuits used Adobe FrameMaker (unstructured) to produce reference and conceptual documentation for their very complex processors. Each processor had thousands of registers. Most registers contained multiple bit fields, each containing specific pieces of information, all of which needed to be documented.

The information necessary for documenting the processors and their registers was maintained in the manufacturer’s chip-design database. To produce reference documentation, the writers copied the information from the chip-design database and pasted it into FrameMaker. The writers could—and did—edit the information, but usually they limited themselves to copying content and applying formatting.

The descriptions for each register consisted of two main parts:

  • A table, which described the bit fields and enumerated values (if any).
  • A diagram, which provided a quick reference to the name, location, and size of each bit field in the register, in addition to other information about the bit fields.

Depending on the size of the register and the number of fields, these diagrams could be quite complicated. The diagrams were constructed using FrameMaker tables, which were easier to manipulate than FrameMaker drawing tools or a separate drawing program.

There were several problems inherent in the documentation process:

  • When a chip was in development, the specifications for the registers and their contents could change rapidly. The copy-and-paste methodology made it hard to keep up with the changing and evolving design.
  • If the writers modified text when copying and pasting, those changes weren’t preserved in the chip design database.
  • Creating and maintaining the illustrations required a large amount of time and thought. Manipulating FrameMaker tables and table cell borders was inefficient, and the whole process was prone to errors.

Additionally, the manufacturer did not want to transition to a new tool set, so remaining in FrameMaker was a requirement. Unstructured FrameMaker was perfectly good for documenting the conceptual information; it was only the reference information that was difficult to maintain.

The manufacturer was aware that the reference information could be exported from the database in IP-XACT, an XML-based, vendor-neutral schema for describing chips and their components. However, they needed some help converting the IP-XACT into something that could integrate with FrameMaker, which is when they reached out to Scriptorium.

Scriptorium suggested that an XSL transform could convert the IP-XACT sources into files that could be imported into structured FrameMaker. FrameMaker allows mixing structured and unstructured files in FrameMaker books, so all of the conceptual information could still be maintained as unstructured FrameMaker files. The reference sections could be replaced in the book files whenever they were updated.
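
As an illustration of the approach (a sketch, not the actual transform), a minimal XSLT template set might look like the following. It assumes a simplified IP-XACT register description, and the output element names are invented; in practice they would match the structured FrameMaker EDD:

    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
        xmlns:spirit="http://www.spiritconsortium.org/XMLSchema/SPIRIT/1685-2009">

      <!-- One reference section per register -->
      <xsl:template match="spirit:register">
        <RegisterSection>
          <Title><xsl:value-of select="spirit:name"/></Title>
          <FieldTable>
            <xsl:apply-templates select="spirit:field"/>
          </FieldTable>
        </RegisterSection>
      </xsl:template>

      <!-- One table row per bit field -->
      <xsl:template match="spirit:field">
        <FieldRow>
          <Name><xsl:value-of select="spirit:name"/></Name>
          <Offset><xsl:value-of select="spirit:bitOffset"/></Offset>
          <Width><xsl:value-of select="spirit:bitWidth"/></Width>
          <Description><xsl:value-of select="spirit:description"/></Description>
        </FieldRow>
      </xsl:template>

    </xsl:stylesheet>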

Writers were granted access to the chip-design database, so that text corrections to the register and field descriptions could be made in one place.

In addition to solving the basic problem (extracting the descriptions from the database and converting them to FrameMaker), the transform organized the registers in a coherent series of chapters or sections, including a linked summary table for each set of registers, and built the register diagrams. The transform also created persistent cross-reference targets, so that writers could easily create cross-references to register descriptions from conceptual content.

Once the structured files were imported to structured FrameMaker, a Scriptorium-created custom ExtendScript performed final cleanup.

The resulting documentation and diagrams were clear, consistent, and could be re-created from updated sources in a matter of minutes.

The manufacturer (and the success of the project) benefited from these factors:

  • The chip-design database contained almost all the information needed to document the registers.
  • The chip-design database could export XML (IP-XACT).
  • The content in the chip-design database was consistent and of sufficient quality to expose to customers.
  • The writing team could access the chip-design database to enhance and correct the information, where necessary.

Automatically converting content from reliable and consistent sources produced reliable and consistent documentation, which then freed the team to focus their energies on conceptual content.

Consulting lite: life at Scriptorium

September 26, 2016

Scriptorium is hiring. Our consulting jobs are a unique blend that you don’t see in many other places. Over and over, I’ve found myself struggling to explain this blend to candidates. So here is an attempt to describe life at Scriptorium.

Job structure

Our technical consultants are full-time, permanent employees with benefits. Our interns are full-time temporary employees with benefits. After 6-12 months, interns are eligible to become permanent employees.

Client load

Employees typically work on multiple client projects in a single week. You might deliver a draft document to one client, then turn your attention to updates on another project, receive a few review comments from a third client, and clear a support ticket from a fourth.

Each project has a project lead. For small projects, the lead might also do the work; for larger projects, the lead coordinates the project team.

One of the biggest challenges is remembering different communication requirements. For example, we use Basecamp for project collaboration on some projects. For others, we use client infrastructure (usually SharePoint).

Client mix

Our clients come from a cross-section of industries: finance, life sciences, education, high-tech, heavy machinery, telecommunications, state and federal government, non-profit associations, semiconductors, and others.

We specialize in working with organizations that have content problems, and they are found everywhere!

Our consultants are exposed to content workflows across many industries.

Sales and marketing responsibilities

Unlike freelancers, our employees are not responsible for hunting down their own new projects. But our employees do have some sales and marketing responsibilities. These include:

  • Participating in social networking
  • Writing blog posts or other articles
  • Presenting at industry conferences
  • Ensuring that clients are happy
  • Noticing when a client asks for additional work and making sure their issue is addressed promptly
  • Contributing to proposals


Travel

All of our consultants travel. Some of that travel is for conferences and industry events, and some is for client visits. No consultant is expected to travel more than 25% of the time.

Cloud systems

Our office space is in Research Triangle Park (Durham), North Carolina. Most of our employees are based there, but all of our systems are cloud-based. Employees can access the information they need from the office, from home, or while traveling.

Scriptorium versus corporate life

It’s hard to generalize about Scriptorium versus All Possible Full-Time Jobs. But here are some things to consider:

  • Domain knowledge (expertise about the company’s products and industry) is more valuable in a corporate job. Scriptorium employees move from project to project, so the domain changes constantly.
  • If you like change, Scriptorium may be the place for you. We learn new things in every project, and we are always looking for better ways to do things. If you prefer to develop deep expertise in a single set of tools or a specific approach, this is not the right place for you.
  • As a change of pace from client work, you might find yourself writing blog posts or working on internal processes.

Scriptorium versus freelance life

Bring on the additional generalizations! Working as a consultant at Scriptorium is basically Consulting Lite:

  • Like a freelancer, you have project variety and an ever-changing list of assignments.
  • You do not have to do all your own sales and marketing work.
  • Scriptorium handles administrative support (payroll, taxes, invoicing, and office management tasks).
  • You are paid a salary and not an hourly rate.
  • You have coworkers who can QA your work before it’s sent to the client.
  • You have an office location (if based in RTP), and an internal communication network to discuss projects, share juicy gossip, and abuse the emoji capabilities.


Does consulting lite sound appealing? We’re hiring.


Making localization “better”

September 19, 2016

This post is the first in a series about the value proposition of localization strategies. You can also see a presentation on this topic at LavaCon this October.

Localization issues are a primary reason companies seek help with a new content strategy. One of the most common questions we hear is, “How do we make our localization process better?”

When we’re asked this question, we turn it around. What is wrong with your current localization process? What would you like to improve? How do you define “better”?

[Image: the fast/good/cheap project triangle for localization. Caption: Fast, good, cheap… why pick only two? (image: Wikimedia)]

The answers always fall somewhere on the project management triangle. Localization costs may be too high, localization may take too long or happen too late in the project cycle, or there may be quality issues with the translated content.

Usually when companies say they want “better” localization, they mean that they want to make a combination of improvements; they are paying too much for problematic translations that take too long to produce.

In short, they’re not getting value for their efforts.

Defining localization value

Localization value is usually measured in two ways:

  • Cost: how lean can it get?
  • Quality: how accurate and error-free can it get?

What’s interesting is the absence of “Time” in the value assessment. This absence is largely due to viewing localization as an end-game process. To see the powerful value of time, we must look at time to market.

How much of a revenue increase would result from bringing multilingual versions of a product to market three months sooner? Six months sooner? Concurrent with the “core” release?

How favorably do your multilingual customers currently view your company? How might that change if they could receive the same level of product or service within the same timeframe as other customers? Would they be more likely to promote your company? Might that increase sales in certain markets?

Improving time to market for localized products and services can be tricky, and should always include improvements in cost and quality. More on this in the next post in the series. But for now, a parting question:

What do you see as your biggest localization hurdle to overcome?

Glue strategy: connecting web CMS to CCMS

September 12, 2016

Web sites are fantastic at content delivery and generally terrible for content authoring. If you’re old enough (like me), you may have experienced the pain of hand-coding HTML or even editing HTML files live on your web server.

These days, hardly anyone codes HTML by hand. Instead, we use web content management systems (web CMSs), such as WordPress, Drupal, Magnolia, and many, many others. Web CMSs have two major functions: letting authors create content, and running a web site to deliver content to readers. The problem arises when web CMS A provides great authoring functionality and web CMS B provides great web site functionality. Which do you choose? Do you make life easier for your authors or for your audience?

After sufficient pain in that area, you eventually decide to decouple the web site and content management. This approach lets you choose the best option for authoring and web site, but it also requires you to glue the two components back together. Somehow, you have to move your content from the content management (authoring) system over to the web site:

[Diagram: a content management icon, with an arrow labeled “glue” pointing from it to a web site icon. Caption: Decoupling CMS and web site]

A decoupled CMS enables you to take advantage of new and innovative technologies for creating rich web and mobile experiences, while ensuring your content authors and editors have a consistent approach to content management. (Spelling out the advantages of a decoupled CMS, CMSWire)

The decoupled approach lets you choose a content management system with a great authoring experience for the authors and a web site delivery system that meets the needs of your web site visitors. It does introduce a few new problems, though:

  • You have to maintain two systems instead of one.
  • You have to find a way to glue the two systems together.

Adding complexity with DITA

When you add structured technical content and DITA into the mix, things get sticky (!!). How do you manage DITA content if you already have a web CMS (which may in fact be more than one platform)? If you decouple everything, you are faced with a fairly sketchy-looking architecture:

[Diagram: separate repositories for web content and DITA content, each with an arrow labeled “glue” pointing to the web CMS for rendering. Caption: So much glue, so little time]

NOTE: I’ve labeled the contents of each repository as CSS (for formatting), DITA (for DITA XML), and HTML (for web authoring). This is of course a gross oversimplification. Not all formatting is CSS. The content repositories can and should include other types of content.

It’s rare to see the decoupled architecture with DITA involved.

Instead, the web CMS owns one set of content (usually marketing content). Inside the web CMS, you manage that content and also control the presentation layer for the web site. A separate component content management system (CCMS) manages DITA content. So DITA content is created, edited, reviewed, and approved in the CCMS. Then we send it over to the web CMS. The process of gluing together the CCMS and the web CMS is generally painful and expensive. The advantage to this approach is that the web CMS people (marketing) and the CCMS people (tech comm) can basically ignore each other. Oh, and the people who know how to create glue code? They are very, very overworked.

[Diagram: a CCMS icon containing DITA, with a glue arrow pointing to a web CMS icon that contains HTML, CSS, and more content. Caption: Web CMS and CCMS]

In a few systems, the glue is built into the CCMS. For example, you can deploy SDL’s LiveContent Architect (the CCMS) along with LiveContent Reach (web CMS). easyDITA offers connectors to Mindtouch and WordPress. So in this case, the glue technology is attached to the CCMS:

[Diagram: similar to the previous image, but now the glue arrow is connected to the CCMS. Caption: Glue gets easier…]

With the just-announced XML/DITA connector for AEM (Adobe Experience Manager), Adobe is gluing together the repository and the display management. Nearly every other solution—for example, a DITA CCMS plus Drupal—requires you to create that glue.

[Diagram: inside the web CMS icon, we have HTML, CSS, and DITA. Caption: DITA inside the web CMS. No glue!]

If you take the AEM approach, you get free glue. (That is, when you license the XML/DITA connector along with AEM, you do not have to build out a connector yourself.) You can manage your DITA content in the same repository as your non-DITA content. And you can be sure that your web site delivery will be consistent across all of your content. If you work in a company that has already invested in AEM for web delivery, this could be a reasonable answer.

So what is your glue strategy? Will you choose individual components for maximum flexibility and pay for glue? Or does it make more sense to choose a single integrated solution?


This post provides only a general overview of possible glue strategies. If you need a recommendation for your specific situation, contact us to discuss a content strategy assessment.

P.S. I really wanted to entitle this post “The Glue Factory,” but my coworkers are mean.

Before XML, improve DTP

September 6, 2016

Thinking about migrating unstructured content to XML? Take a hard look at your existing desktop publishing workflow. The maturity of your DTP process will have a big impact on a move to XML.

Following a template-based DTP workflow is not just about implementing best-practice processes. Templates also make a potential move to XML less expensive and less painful.

Templates are key for efficient DTP

The cornerstone of a mature DTP workflow is the template: predefined paragraph, character, table, and other styles. With a template in place, authors don’t have to guess about the right formatting, and applying existing styles is much faster than manually adjusting the look and feel of content.

Templates also provide a consistent user experience: the standardization in formatting across a company’s content unifies information and reinforces branding.

There is an additional advantage to templatized content that deserves its own discussion: conversion.

Efficient conversion

Whether you are converting DTP files to web pages, online help, or even XML for a new structured authoring workflow, template-based content greatly increases the quality and efficiency of conversion. Conversion scripts map template styles to tagging for the new output.
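
The core of such a script is a set of style-to-tag mappings. Here is a minimal XSLT sketch; it assumes an intermediate XML export in which each paragraph carries its DTP style name, and both the style names and the output elements are hypothetical:

    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">

      <!-- Template styles map cleanly to structural tags -->
      <xsl:template match="para[@style='Heading 1']">
        <title><xsl:apply-templates/></title>
      </xsl:template>

      <xsl:template match="para[@style='Body']">
        <p><xsl:apply-templates/></p>
      </xsl:template>

      <!-- Ad hoc formatting has no style name to match on,
           so it must be handled case by case -->

    </xsl:stylesheet>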

Ad hoc formatting is much more difficult and expensive to script for automated conversion. If authors are not using templates well (or at all!) in a DTP workflow, content creation itself is inefficient, and by extension, all conversion efforts will be less efficient, too.

Before moving to XML, consider cleaning up the DTP process with better templates. Cleaning up unstructured content can be the right choice when, for example, some content needs to stay in the DTP workflow for a while because of release cycles.

Some unstructured source files are so poorly assembled that it will be more cost-efficient to re-create the content in XML. You may split the difference: some DTP content gets cleaned up and converted to XML, and other content gets re-created in XML. If you’re unsure of which path(s) to take, invest in a little third-party advice from a consultant—even if you intend to use internal resources for the actual conversion work.

There is not a one-size-fits-all solution for converting DTP files to XML. Automation through scripting, scheduling, and cost/benefit analysis all come into play.

Moving to XML is really hard for those who don’t use templates

[Photo: person leaping across rocks. Caption: Before leaping to XML, look at your DTP processes]

For content creators, moving from DTP to XML is challenging. Period. But those working in a template-based DTP environment usually have an easier time with the transition to XML.

A mature DTP workflow has an implied structure defined by correct template use. Content creators who use templates correctly are already accustomed to working in a more controlled process. Moving to the guided authoring/enforced structure in XML content workflows is less of a shock for these template users.

For authors who don’t use predefined template styles and apply ad hoc formatting, moving from template-free content development to structure is a huge, difficult leap. These authors often feel too constrained by structured authoring tools and resist the change to structure. The already difficult transition to XML then becomes even less pleasant for everyone involved.


P.S. Also worth noting: templatized DTP is essential for efficient localization processes (free registration required for link).

Tech comm, content strategy, and coaching

August 29, 2016

Earlier in the year, I was chatting with Sharon Burton. As an aside to our knitting-focused discussion, I asked her what new services we should offer.

The answer came back immediately: coaching.

So with many thanks to Sharon, we are offering coaching to managers, directors, and executives with responsibility for technical content, localization, and/or content strategy.

Leading a content or localization team can be lonely. Your peer managers don’t understand your job function, and your staff doesn’t understand the management challenges. Our goal, as coaches, is to provide a confidential setting in which to discuss management problems, strategic decisions, and new ideas.

Over the last 20 years, we have worked with many managers who were confident in their leadership abilities but needed support in understanding content teams. When your background is in HR, QA, or engineering, it often seems that the content creators are speaking another language. (And some staff members aren’t shy about pointing out your lack of front-line experience in technical communication as a reason that they should get to do things Their Way.)


[Photo: Tilikum Crossing, Portland, Oregon. flickr: Twelvizm]

Coaching can help you bridge the gap by filling in the technical expertise so that you can understand what your content team should be doing.

We are still finalizing the coaching offering, which will launch officially in January 2017, but there are a few things you can do today:

  • If you’d like to sign up immediately, we have an early-bird program, which will offer a discounted rate for coaching through the end of 2016. Contact us to get more information.
  • For more details, read the coaching services page.
  • If you’d like to provide some feedback and help shape the program, we have a very short survey. Or you can just leave a comment on this post.

Thank you!

Going for the gold with your content strategy

August 22, 2016

Now that the 2016 Olympic Games have come to a close, countries are tallying up their final medal counts. Athletes are assessing their performances, celebrating their victories or mourning their losses. After you’ve implemented a content strategy, you should also assess the project to determine how successful it was.

Your number one metric for success should be your goals and how well your implementation achieved them. Olympic athletes do this, too—some are aiming for a gold medal (or more), while others are just trying to make the podium, improve their best times, or qualify for their event finals.

Similarly, your content strategy should outline the goals you plan to accomplish. Once you’ve implemented that strategy, you can see how well your new content development processes line up with those goals. Here are some ways you can figure out whether your content strategy was a success.

Have you solved your content problems?

[Photo: Flickr: Paul Hudson]

If you implemented a strategy to make your content more localization-ready, have you seen any improvements in the localization process? Maybe now you’re spending less on localization than you were before, or you’ve cut your time to market in half. Maybe the customers who consume your content in another language are having a better experience using your products, and that’s showing in sales increases in global markets. Analyzing how much better localization is now than it was before can help you measure your strategy’s success.

The same logic applies to whatever content problems you set out to resolve. If you were missing lots of reuse opportunities before, are you maximizing your reuse potential now? If you were concerned about wasted time and resources in your content development process, have you made that process more efficient?

Has your strategy uncovered any other problems?

Sometimes implementing a content strategy can reveal problems you didn’t initially realize you had. Perhaps your implementation is already equipped to solve these new problems along with the original ones. If not, you’ll need to make some adjustments or additions to your content strategy to ensure that as many of these issues as possible are resolved.

In some cases, implementing a content strategy might resolve your original problems but introduce new ones that you didn’t expect. For example, the content management system you chose might improve workflow efficiency, but it might also make it harder to share your content with other groups, thereby exposing a problem with silos.

In cases like these, you will need to start thinking about other strategies to solve your new issues. With careful planning and time spent exploring potential problems before you implement, you can reduce your chances of finding yourself in such a situation.

Has your implementation gone according to plan?

The answer to this one is almost certainly no—even the most thoroughly researched, well-planned implementations usually end up veering into unexpected territory. But unless a few bumps in the road derailed your project completely, you can still use your original project plan to help determine how successful your strategy was.

Did you solve your content problems and improve your content development processes in the end, even if getting to that point took much longer than you anticipated? If your plans for change management didn’t go quite as smoothly as you’d hoped, was the overall implementation still successful in spite of issues with change resistance? Even if you still achieved the desired outcome with your strategy, assessing where your implementation veered from its original course can help you do a better job planning for future changes.

How has your strategy helped the business?

The main purpose of a content strategy is to use your content to support your company’s business goals. This means that a good strategy should reduce costs and improve the bottom line.

Now that you’ve put your strategy into practice, you can calculate the total cost of implementation. Are you earning the kind of revenue you need to offset those costs? Have you improved your content processes enough that your total profits will increase over time? The more your strategy saves, the more successful it is.

How have your business goals changed?

Because implementing a content strategy takes time—usually several months or even a couple of years—it’s possible, even likely, that your business goals could change during the implementation process. If the original business goals that drove your strategy shift before your implementation is finished, how adaptable is your strategy to those changes? A good content strategy addresses both short-term solutions and long-term possibilities so that you will be ready for changing business goals.

A successful implementation will put you in a position to scale up with new business requirements. You can build on the strategy you already put in place to keep achieving business goals as they grow and change. If your content strategy only focused on solving one or two short-term problems and didn’t account for future goals, your strategy probably won’t have lasting success, and you might need a new one sooner than you’d planned.

Now that you’ve done a post-implementation assessment of your content strategy, how did it measure up to your goals? Did it get the gold, silver, or bronze? Maybe it failed to make the podium, in which case you need a different strategy if your goal was to win a medal. Either way, analyzing the results of your content strategy implementation is the best way to determine how to keep using content to achieve your business goals going forward.