Scriptorium Publishing

content strategy consulting

Help this first-time tcworld attendee, please!

October 29, 2013

Whew! I’m just back from the excellent LavaCon event in Portland. I have (mostly) recovered from that trip, so now I’m focusing on the upcoming tcworld conference in Wiesbaden, Germany. And I need your help!

Next week will be my first visit to tcworld (and to Germany). Veteran tcworld attendees are accustomed to seeing Sarah O’Keefe there, and Sarah’s already given me some great pointers on what to expect. I’d like to get some advice from you, too, on:

  • Choosing sessions. Do you have a strategy for picking sessions? What’s on your must-see list?
  • Visiting the trade show. What should I expect from vendors? I’m bracing myself for a bit of culture shock on two fronts. Scriptorium won’t have a booth at this event, so I won’t have my usual booth duties. Also, I’ve heard booths are a bit more elaborate at tcworld than at many of the events I attend in the US.
  • Networking. I’m attending the International Networking Dinner on Wednesday, where I’m sure I’ll meet many, many people. Any other networking suggestions?
  • What to wear. Any article of clothing you wished you had brought (or left at home)? I’ve seen photos of past tcworld events, and it looks like tcworld attendees dress a notch more professionally than those at US-based tech comm events, where I see more casual attire. Is my assessment accurate?
  • Food, food, food. If you follow Scriptorium’s blog or my Twitter feed, you probably know I like to eat (and I’m partial to good pastries and chocolate). What culinary adventures do you recommend in the Wiesbaden area?
  • General travel advice. I’ve been to Europe before for business and vacation, but this will be my first visit to Germany. Any travel tips? (My years of studying Latin have helped me somewhat with Romance languages, but they won’t be so helpful with German!)

I look forward to meeting you next week—especially those I’ve known for years through blog interactions, Twitter, and webcast events but have never met in person. Also, I’ll be helping Sarah out during The Game of Content Strategy presentation, which runs on Wednesday and Thursday. Hope to see you there!

P.S. If you’d like to schedule a meeting with Sarah or me during the conference, send us a message through our contact form.

How the “we-meeting” kills good tech comm

October 14, 2013

Does this sound familiar?

One reason for lack of accountability is the we-meeting. You know the one: “We need a new process for handling customer service issues.” Lots of discussion follows, but no clear direction is given, nor is any responsibility taken.

Bruce Clarke (The View from HR column) referencing consultant Kathleen Kelly

Having worked on many content strategy projects, I can confirm the “we-meeting” is a huge problem for tech comm professionals—particularly when it comes to getting subject matter experts to review technical content.

Photo of ax

Flickr: Martin Cathrae

A solid technical review can be the difference between technical content that merely rehashes the patently obvious (“Press the Print button to print.” Oh really?) and content that gives users depth, context, and useful examples. A technical review must be a collaboration between the SME and writer. Unfortunately, some SMEs (and a handful of tech writers) often see reviews as cursory obligations that should get as little attention as possible. That attitude is deadly for useful content.

What’s the cure for these useless, superficial review cycles? Accountability, which can take many forms, including:

  • Using workflow tools to assign and track content reviews. For example, software companies already use bug tracking tools, so consider using the bug tracker to track review comments, too. Component content management systems often offer collaborative review tools, including ticketing systems, tracking mechanisms, and so on.
  • Building in reviews as part of the development process. Scheduling reviews not only sets aside crucial time for reviews, but it also sends the message, “Reviews are an official part of the process.”
  • Specifying what’s in and out of bounds for technical reviews. For example, nitpicks over formatting should be outlawed. If formatted content is what’s being reviewed, technical reviewers should not be requesting changes to line spacing, formatting, and so on, particularly when that formatting is merely aesthetic. (A bad line break in a code sample that could cause errors is another story, though.) If you’re working in an XML-based environment, you may have some options to present review content in a vanilla, formatting-neutral manner that stops useless formatting feedback in its tracks.
  • Instituting consequences for reviews that are late or do not meet criteria. Management has to step up and do icky management things when reviews don’t occur when they are supposed to. If reviewers and content creators aren’t giving reviews time and care, they are failing to meet their obligations as employees.
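The formatting-neutral review presentation mentioned in the list is straightforward to sketch in an XML-based environment. The following XSLT fragment is a minimal illustration, not a production stylesheet: it assumes a standard DITA topic as input and renders everything as unstyled HTML paragraphs, so reviewers see only the words, not the formatting.

```xml
<!-- plain-review.xsl: render DITA topic content as deliberately
     unstyled HTML so reviewers comment on substance, not layout -->
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="html"/>

  <xsl:template match="/topic">
    <html>
      <body>
        <h1><xsl:value-of select="title"/></h1>
        <xsl:apply-templates select="body"/>
      </body>
    </html>
  </xsl:template>

  <!-- every block element becomes a plain paragraph -->
  <xsl:template match="p | li | note">
    <p><xsl:apply-templates/></p>
  </xsl:template>

  <!-- drop inline markup (bold, highlighting, etc.), keep the text -->
  <xsl:template match="*">
    <xsl:apply-templates/>
  </xsl:template>
</xsl:stylesheet>
```

Because the output carries no styling at all, there is simply nothing for a reviewer to nitpick about line spacing or fonts.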

Codifying review cycles and their objectives is not fun. But you’ve got to do it to get “we-meetings” out of your tech comm.




Blemished—but better—tech comm?

September 4, 2013

Consumers’ demand for perfect things drives a lot of pesticide use….Ninety percent of pesticide use in apple crops is to get the last five percent of quality of the fruit.

That comment near the end of an On Point podcast confirmed that accepting a few blemishes on an apple treated with fewer or no synthetic insecticides is a compromise I am willing to make. It also made me think more about a tech comm–related post I saw last week.

There have been more than 100 responses to a LinkedIn post in the Technical Writer Forum about whether one or two spaces follow the period at the end of the sentence. I suppose I should be happy that the tech comm community has active social media networks, but my response was a lot less positive:

I’m skeptical that end users of technical content are that concerned about the number of spaces following a period. The technical writers’ desire for writing that perfectly adheres to a rule is what’s “driving demand” in the case of that post. Tech writers are not the true consumers of technical content. The end users are.

I’m not advocating style guide–free writing here. Style guidelines are important because they are the foundation of consistent writing, which is easier to understand and translate. But style guides are a small part of what makes good technical content. Stylistically pristine content is useless if it is technically inaccurate or doesn’t address the audience at the right level. It’s also a waste if it’s locked away in a PDF file that end users can’t find online, for example.


flickr: ollesvensson

Technical writers are generally better writers than most, but we can’t let our writing skills be our primary defining factor. Being a good writer is just a prerequisite in tech comm today; you can’t sustain a career in this industry by focusing on writing ability and style guides as your areas of expertise. (Besides, there are now tools that can automatically enforce style guidelines. Keeping a solitary focus on style is particularly foolhardy when a tool can do it for you.) We also can’t project our need for excellent writing onto the audiences for technical information. For them, stylistically “good enough” writing is often plenty.

Move beyond the mechanics of writing and ensure your content reaches your audience and gives them the answers they need. If the time it takes to make your content more accurate, accessible, and intelligent means there are a few stylistic blemishes—many of which end users won’t even notice—so be it.

I think it’s a worthy compromise.

P.S. I have a degree in English and worked as a technical editor for years. Style guidelines are probably floating around in my bloodstream.


Webcast: Managing DITA implementation

August 16, 2013

In this webcast recording, Alan Pringle discusses key factors for DITA implementation success. Alan touches on the following issues:

  • Good and bad conversion strategies
  • Information architecture and content modeling issues
  • Wrangling the various factions (authors, implementers, IT, and executives)
  • CMS evaluation and vendor management

Avoiding buyer’s remorse: techcomm tools edition

July 15, 2013

Yes, you can call me overly cautious.

Before making a purchase, I will research the you-know-what out of the item. If it’s a big purchase, I’ll hire a professional to help me make my decision (particularly when it comes to real estate). I’d rather part with a bit more cash than get angry with myself later for a bad purchase.

This careful mindset is why I can’t understand companies that purchase tools first and then seek consultants to help implement those tools. Why not work with a consultant before you buy?

In techcomm, we focus on tools. A lot. After all, tools are what we use every day to get our jobs done. That relentless focus on tools, however, can be a detriment when evaluating new strategies for developing and distributing content. Sarah O’Keefe and I wrote about this in Content Strategy 101:

A high level of proficiency in a specific tool fosters a sense of achievement and security in team members. But a strong attachment to a tool can cause “tool myopia.” … The straightforward (but admittedly painful) cure for this myopia is building requirements. Strategic thinking about content cannot happen when early discussions are framed as “Tool X can do this, and Tool Y can do that.”

When you are considering a move to a completely new process for content (DITA, for example), can you formulate strong requirements if no one on your team has detailed knowledge about that technology? Also, can team members with strong attachments to a tool or particular process objectively develop requirements for the replacement of their Precious?

If you’re fortunate, your company hires someone who has the very knowledge and wrangling skills you need to map business requirements for content to tool requirements. If you aren’t that lucky, I recommend you bring in a consultant—even just part-time—to help you sort out your process and tool requirements before you go shopping. The consultant can also help you vet the tools you’re considering.

envelope with tear strip marked "let the buyer's remorse begin"

flickr: benjami

“Imagine that! A consultant telling people to hire a consultant!” More work for content strategy consultants would certainly benefit me. The greater benefit, however, goes to the company that is about to make a huge investment in new tools and processes. Implementing a tool that will not meet the company’s needs in the long term—or that will never deliver what the tool vendor promised—wastes an enormous amount of time and money.

Buyer’s remorse over the purchase of electronics or a car is unpleasant. For expensive, department- and enterprise-level tools, buyer’s remorse can be a career-killer.



“No PDF for you!” The destructive power of arrogant thinking

June 5, 2013

I love it when an offhand remark on Twitter turns into a smart conversation.


I was joking with the reference to the much-maligned Windows ME, but Al Martine’s “arrogant thinking” observation is correct. Microsoft was foolish to think one new OS could change decades of how people use the PC interface: “You don’t need a stinkin’ Start button or the ability to boot to a desktop view. You’ll get our new Metro interface and LIKE it.”

Microsoft is about to eat crow with the release of Windows 8.1, which will include—drum roll—a Start button and booting to the desktop interface.

Tech comm professionals can learn some lessons from Microsoft’s poor decisions on Windows 8. We are experiencing huge shifts in how we can distribute content: PDF files/print are being superseded by web pages, ebooks, wikis, video, and more. But that doesn’t mean we just stop producing PDF files because they aren’t cutting edge.

You can’t force your customers to happily rely on new output formats when you’ve supplied just PDF content for the past umpteen releases. This is particularly true if contracts or industry regulations specify how you provide content. If you have a legal requirement to offer PDF, print, or some other format, it doesn’t matter that your HTML pages are searchable or that the EPUB version works well on a tablet. The HTML and EPUB don’t fulfill your obligations.

Even if you don’t have legal reasons to continue to provide PDF files, it’s the height of hubris (and stupidity) to assume your customers will immediately accept content distributed in new ways. Instead, be smart by offering your customers choices in how they consume content. For example, if you want to establish an HTML version of your content, your HTML pages could include links to the PDF manual in the header area. Google searches will lead customers to particular HTML pages, but if customers want the PDF version, they can get the PDF file with little extra effort.

More than once, I’ve heard, “PDF is dead, so we aren’t going to offer it any more.” That kind of short-sighted thinking can indeed lead to death—the death of your career at the hands of angry customers who clog up the phone lines and mailboxes of your support department.

Let your business requirements guide how you deliver content, and introduce new outputs alongside your PDF files and other “traditional” formats. Otherwise, your content—and the product it supports—may join Windows 8 as another casualty of arrogant thinking.


When IT is MIA, content strategy crumbles

May 22, 2013

Last year, I told you to hug it out with your IT department. Play nicely with your IT group, but you also need to ask tough questions and get commitments. Otherwise, IT problems can derail your content strategy.

During the assessment phase of your content strategy, it is crucial to get input from all affected parties—this includes the IT staff, who will maintain the infrastructure and tools. In my April 2012 post, I outlined questions you should ask about IT issues:

  • How much time will it take to maintain the system (database maintenance, backups, and so on)? As more users and content are added to the system, how does maintenance time increase?
  • Should the CMS reside on a physical or virtual server? How well will that server scale as users and content are added?
  • Can the current Internet connection and network handle the CMS (particularly when multiple sites are involved)? Does the company need a bigger pipe to accommodate CMS activity?
  • How will user accounts be added and removed from the system? Does the CMS integrate with the single sign-on solution your company already has in place?
  • Will there be in-depth training provided to the IT personnel on the CMS? What about follow-on support?
  • What is the process for installing new releases, and what kind of time do they generally require?
  • Is there a hosted solution that eliminates maintenance tasks when the IT staff doesn’t have the resources to manage another system?

Based on some recent painful experiences, I need to expand this list with some new questions:

image of finger with question mark above it

Things may get a bit tense (flickr: Tsahi Levent-Levi)

  • What happens if the primary IT contact leaves the company? Having a primary IT resource for content processes is a logical approach, but there needs to be a secondary resource who is more than just a backup in name only. The secondary resource should be well-versed in the tools and participate in basic maintenance to develop a working knowledge of the system.
  • What amount of time will the IT department commit to your project for implementation, system maintenance, and installing upgrades? If you can’t get straight answers to these questions, hosted (maintenance-free) tools may be a better choice. If the IT group grumbles about security at the mention of hosted solutions, ask them (again) for their project time commitments for the proposed installed solution.
  • Who are the primary and secondary technical contacts on the content development side? Among those authoring content, have at least two tech-savvy employees who are the main points of contact with the IT department. These technical liaisons collect information about performance issues and other problems and then work with the IT group to solve the issues. Don’t play the “I’m just a writer and don’t want to be bothered with the technical details” card and leave all the heavy lifting to the IT group.
  • Are there corporate policies requiring the use of virtual desktops to minimize licensing costs? Any perceived cost savings through the sharing of virtualized tool licenses must be balanced against reduced system performance, which lowers employee productivity. If virtual desktops are a requirement, the IT group should work closely with the tool vendors to ensure compatibility and acceptable performance.

Can you preplan for every possible IT-related emergency? Nope. But asking some pointed questions and getting commitments up front can minimize later pain.



From toilets to techcomm: tallying tool risks

May 9, 2013

I’m about to replace an old toilet, not-so-affectionately nicknamed the Lazy River.

As you might guess, the Lazy River is barely doing its job. I’ll give the toilet points for longevity—it was installed in the mid-1980s—but the Lazy River uses three gallons of water for each leisurely operation. It’s beyond time for a more efficient model.

Lazy river (flickr: archer10)

The leading contender to replace the Lazy River was an aesthetically pleasing option (well, as far as toilets can be) with an impressive warranty and a highly competitive price. And then I read a forum post about that product that changed everything.

The flushing mechanism and water supply line for this toilet rely on proprietary technology. (Proprietary. Toilet. Technology.) That would probably mean ordering any replacement parts directly from the manufacturer. What if the company decided to discontinue making toilets? The so-called “universal” replacement parts wouldn’t work, and I’d be stuck with a nice-looking toilet that I couldn’t repair. No thanks!

I ended up choosing another option. Replacement parts for it are more widely available, and universal parts will work, too. I have no worries about buying a unit that will become obsolete if the manufacturer quits making that model.

The lessons I learned with my toilet purchase most definitely apply to the technologies and tools that produce technical content. Consider the long-term risks associated with choosing a particular tool.

Proprietary storage formats for source files are a large consideration, and you should account for both a toolmaker’s longevity and the continuity of its products. For example, is there a history of products being poorly supported or discontinued altogether?

Poking around industry forums and wikis can help you gather this crucial intelligence. Talk to your peers at conferences and industry events—and don’t get snowed by fancy vendor demos and slick marketing that camouflage risks.

Before you choose a solution, figure out an escape hatch that will let you extract your content from the tool. For example, can you export source files to a generic format that you could transform and modify further? This is where having XML as a base technology is very useful: you can often programmatically transform one set of XML tags to another with minimal human intervention. In a sense, XML is a universal replacement part.
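As a sketch of that escape hatch, here is a tiny XSLT fragment that maps an exported tag set to DITA-style equivalents. The input-side element names (heading, para) are invented for illustration; a real vendor export would have its own vocabulary.

```xml
<!-- convert.xsl: map a hypothetical vendor export format to DITA-style tags -->
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">

  <!-- the vendor's <heading> becomes a <title> -->
  <xsl:template match="heading">
    <title><xsl:apply-templates/></title>
  </xsl:template>

  <!-- the vendor's <para> becomes a <p> -->
  <xsl:template match="para">
    <p><xsl:apply-templates/></p>
  </xsl:template>

  <!-- copy anything unrecognized through unchanged, flagged for manual cleanup -->
  <xsl:template match="@* | node()">
    <xsl:copy>
      <xsl:apply-templates select="@* | node()"/>
    </xsl:copy>
  </xsl:template>
</xsl:stylesheet>
```

The identity template at the end is the safety net: elements you haven’t mapped yet pass through intact instead of silently disappearing, so you can migrate the tag set incrementally.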

Open-source tools should not be exempt from scrutiny. Sure, an open-source technology may not rely upon a proprietary file format like many off-the-shelf tools do, but how well supported is that technology across the industry? Has it reached a critical mass that means you could, for example, choose from several consultants to help you develop your implementation? Or are you pretty much on your own with an open-source tool that has spotty or nonexistent documentation? And always remember: the free license associated with an open-source tool does not mean it will be cheap to implement and maintain.

While evaluating content development tools, it’s critical to consider the longer-term risks of choosing a particular solution and to understand your options if you choose to move to another tool later. Otherwise, you may find yourself up a not-so-lazy river without a paddle.

Rapids (flickr: amerune)

Five tips for converting content to DITA

April 18, 2013

So, you’ve decided to move to a DITA-based workflow. Before you convert your existing content to DITA, consider these five tips, which encompass both big-picture and coding-specific issues.

Meat grinder photo (flickr: klwatts)

flickr: klwatts

  1. Read a book about DITA best practices before you convert. Educating yourself about what coding works well (and doesn’t work so well) in the real world can save you a lot of headaches and rework. Merely reading the DITA specification is not going to give you advice, for example, on the best way to code commands in your content. The DITA Style Guide by Tony Self is a good resource, and I’m not saying that just because Scriptorium Press published it. That book has provided me with really useful information while working on DITA projects.
  2. Don’t assume the first sentences in a section are the short description for a DITA topic. There is a strong temptation to convert the first bit of information in a section to a topic’s short description (shortdesc element). Don’t succumb to that temptation. In my experience, it is rare that the first sentences in legacy content are a true short description, which should be a standalone summary of a topic’s content. For information on best practices for short descriptions and how different outputs (HTML, PDF, help) from the DITA Open Toolkit use shortdesc elements, see Kristen Eberlein’s Art of the short description.
  3. Use the right topic types for your content. DITA offers four topic types (generic topic, concept, task, and reference), and you should convert your content to match the purposes of those topic types. For example, don’t shoehorn a procedure better suited to a task topic into a concept topic. Yes, the DITA spec will let you code an ordered list in a concept topic that may look like a sensibly coded task. However, when it comes time to transform your DITA content to HTML or PDF, the styling for procedures may rely on coding specific to the task topic; an ordered list in a concept may not be formatted the same way.
  4. Consider how cross-references are processed by the DITA Open Toolkit. During conversion, it is a good idea to add ID attributes to items that are commonly referenced (tables and figures, for example); you need those ID attributes to create cross-references to elements. However, just because the DITA spec lets you put an ID attribute on the title element within a fig or table element, that does not mean you should point to that title element when creating cross-references. For example, in output based on the default XHTML plugin that comes with the DITA Open Toolkit, a cross-reference to a figure will not work when the xref element points to the title element within the fig element instead of the fig element itself. (Screenshot: a broken cross-reference produced by an xref element pointing to the title within a fig element.)
  5. Know that valid DITA content is not the same as good DITA content. Don’t be fooled when a conversion vendor makes a big deal about how quickly it can convert your legacy information into valid DITA. The problems I mentioned in tips 2–4 can exist in valid DITA topics. The validation feature in a DITA authoring tool is not going to tell you, for example, that the two sentences you converted to a short description are not a true short description.

    Valid DITA ≠ semantically correct, useful DITA.
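To illustrate tips 2 and 4, here is a minimal DITA concept sketch. The IDs, file name, and text are invented for illustration; the point is that the shortdesc stands alone as a summary and that the xref targets the fig element’s ID, not the title inside it.

```xml
<concept id="print_overview">
  <title>Printing overview</title>
  <!-- a true shortdesc is a standalone summary of the topic,
       not just whatever sentences happened to come first -->
  <shortdesc>The print service routes jobs to the nearest available
  printer and holds them until you authenticate.</shortdesc>
  <conbody>
    <!-- the ID goes on fig, not on its title -->
    <fig id="print_flow">
      <title>Print job flow</title>
      <image href="print_flow.png"/>
    </fig>
    <!-- correct: the xref points to the fig element's ID -->
    <p>See <xref href="#print_overview/print_flow"/> for the complete
    routing path.</p>
  </conbody>
</concept>
```

Both topics validate whether or not you follow these conventions, which is exactly the point of tip 5: a validating parser can’t tell a real short description from two borrowed sentences.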

There are many other tips I could offer, but these five are a good starting point. Feel free to share your own conversion tips and war stories below.


Webcast: The realities of ebook distribution

April 12, 2013

In this webcast recording, Alan Pringle discusses the challenges of ebook distribution and how Scriptorium has addressed them when selling EPUB and Kindle editions. Topics covered include:

  • Formatting differences in ereader devices and apps
  • Pricing
  • Other lessons learned through painful experience