Author: Sarah O'Keefe

Tools

WMF…that’ll shut ’em up

Which graphics formats should you use in your documentation? For print, the traditional advice is EPS for line drawings and TIFF for screen captures and photographs. That’s still good advice. These days, you might choose PDF and PNG for the same purposes. There are caveats for each of these formats, but in general, these are excellent choices.

Of course, everybody knows to stay away from WMF, the Windows Metafile Format. WMF doesn’t handle gradients, can’t have more than 256 colors, and refuses to play nice with anything other than Windows.

Think you’re too good to hang out with WMF? For your print and online documentation, perhaps. But it may be a great choice to give to your company’s PowerPoint users.

Are you familiar with this scenario? PowerPoint User saw some graphics in your documentation and thought they would work for some sales presentations. The screen captures are easy; you just give PowerPoint User PNGs or BMPs or whatever. It’s the line drawings that are the problem. PowerPoint User doesn’t have Illustrator and has never heard of EPS. PowerPoint User says, “Can you give me a copy of those pictures in a format that I can use in PowerPoint? Oh, and can you make that box purple and change that font for me first? And move that line just a little bit? And make that line thicker? And remove that entire right side of the picture and split it into two pictures?”

You want PowerPoint User to reuse the graphics; you’re all about reuse. But you have dealt with PowerPoint User before, and you know you will never get your real job done if you get pulled into the sucking vortex of PowerPoint User’s endless requests.

The secret is to give PowerPoint User the graphics in a format that can be edited from within PowerPoint (or Word): WMF. Here’s the drill that will make you a hero:

  1. Save your graphics as WMF.
  2. Place each WMF on a separate page in a PowerPoint or Word file.
  3. Tell PowerPoint User to double-click on a graphic to make it editable. (If you think your PowerPoint User is really dumb, you can double-click the graphic yourself before saving the file and respond to the dialog box asking if you want to make the drawing editable, but nobody is that dumb.)

WMF. It will make PowerPoint User go away…happy!

Conferences

Upcoming DITA events (free, cheap, and discounted)

Free

Tomorrow (February 5) at noon Eastern time, I’m doing a webinar, DITA 101–Why the Buzz?

This is a basic introduction to the Darwin Information Typing Architecture, an XML standard for technical communication content. If you’re wondering about this DITA “thing,” and want to get some basic information, this is the session for you.

Also, the price is right, as it’s free (register here). Audio will be Internet-based, so you don’t even have the expense of a phone call.

Many thanks to MadCap Software, who is organizing and sponsoring this series of free webinars. These sessions are “tool-independent” — they are not going to be pitches for MadCap products.

Cheap

I have to mention Simon Bate’s new Hacking the DITA OT white paper again. It’s crammed with useful tips and tricks on how to get started configuring DITA output to your satisfaction. It’s not free, but at $20 for an instant download, it’s pretty cheap.

Discounted

Conferences are more expensive than our $20 white paper, but they also give you the opportunity to talk with people face-to-face. My next conference event is DocTrain West (Palm Springs, CA). I have two sessions:

  • What Gutenberg Can Teach Us about XML: This session looks at movable type and explores how the changes introduced by the printing press compare to the changes introduced by XML.
  • Demystifying DITA to PDF Publishing: This session discusses the advantages and disadvantages of each approach to extracting PDF from DITA content. Includes discussion of the DITA Open Toolkit, FrameMaker, and InDesign.

You can register for the event at a $400 savings until February 17. I hope to see you there.

Content strategy, XML

Web 2.0: The tipping point for XML

STC Intercom, January 2009

As many-to-many communication in blogs, forums, and the like grows in volume, official product information will become just one of the many sources available to readers. Product owners who isolate their official information from the conversation run the risk of not being heard at all.

XML authoring can help to close the gap between official documentation and user-generated content, integrating the two and ensuring that the official voice is in the mix.

Download the PDF (125 K)

Opinion

Publishing DITA without the DITA Open Toolkit: A Trend or a Temporary Detour?

I estimate that about 80 percent of our consulting work is XML implementation. And about 80 percent of our XML implementation work is based on DITA. So we spend a lot of time with DITA and the DITA Open Toolkit.

I’m starting to wonder, though, whether the adoption rate of DITA and the DITA Open Toolkit is going to diverge.

For DITA, what we hear most often is that it’s “good enough.” DITA may not be a perfect fit for a customer’s content, but our customer doesn’t see a compelling reason to build the perfect structure. In other words, they are willing to compromise on document structure. DITA structure, even without specialization, offers a reasonable topic-based solution.

But for output, the requirements tend to be much more exacting. Customers want any output to match their established look and feel requirements precisely.

Widespread adoption of DITA leads to a sort of herd effect, with safety in numbers. Not so for the Open Toolkit: output requirements vary widely, and people are reluctant to contribute back to the Open Toolkit, perhaps because look and feel is considered proprietary.
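To make that concrete: customizing Open Toolkit output usually means overriding XSLT attribute sets in a local customization. Here’s a minimal sketch for PDF output. The font choice is invented for illustration, and __fo__root is an attribute-set name from the Toolkit’s PDF2 (IDIOM) plugin; treat the details as an assumption about your Toolkit version.

  <?xml version="1.0" encoding="UTF-8"?>
  <!-- Hedged sketch: override the PDF2 plugin's root attribute set to change
       the body font. The attribute-set name comes from the PDF2 (IDIOM)
       plugin; the font value is a placeholder. -->
  <xsl:stylesheet version="1.0"
                  xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                  xmlns:fo="http://www.w3.org/1999/XSL/Format">
    <xsl:attribute-set name="__fo__root">
      <xsl:attribute name="font-family">Georgia, serif</xsl:attribute>
    </xsl:attribute-set>
  </xsl:stylesheet>

Every shop ends up with a file like this, tuned to its own look and feel, and it’s exactly the sort of thing that never makes it back into the Toolkit.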

The pattern we’re seeing is that customers adopt the Open Toolkit when:

  • They intend to deploy onto multiple servers, and open source avoids licensing headaches.
  • The Open Toolkit provides a useful starting point for their output format.

Customers tend to adopt non-Open Toolkit solutions when:

  • They need attractive PDF. Getting this result from the Open Toolkit isn’t quite impossible, but it’s hard. There are other options that are faster, cheaper, and easier.
  • They need a format that the Open Toolkit doesn’t provide. The most common requirement here is web-based help. Getting from the XHTML output in the Open Toolkit over to a sophisticated tri-pane help system with table of contents, index, and search…well, let’s just say that it keeps me gainfully employed (see the sketch after this list). AIR is another platform that we need to support.
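For readers who haven’t driven the Open Toolkit directly, here’s a minimal sketch of the kind of Ant wrapper that kicks off an XHTML build. The args.input, transtype, and output.dir properties are standard Toolkit parameters; the dita.dir property and all file paths are hypothetical.

  <?xml version="1.0" encoding="UTF-8"?>
  <!-- Hedged sketch: delegate to the Open Toolkit's own build.xml.
       dita.dir and the map/output paths are placeholders. -->
  <project name="publish-help" default="xhtml">
    <property name="dita.dir" location="C:/DITA-OT"/>
    <target name="xhtml" description="Generate XHTML from a ditamap">
      <ant antfile="${dita.dir}/build.xml">
        <property name="args.input" location="maps/userguide.ditamap"/>
        <property name="transtype" value="xhtml"/>
        <property name="output.dir" location="out/xhtml"/>
      </ant>
    </target>
  </project>

Everything past that point (the tri-pane frameset, the TOC, the search) is custom work layered on top of the generated XHTML.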

The software vendors seem to be encouraging this trend. In part, I think they would like to find some way to get lock-in on DITA content. Consider the following:

  • Adobe FrameMaker can output lovely PDF from DITA content on its own (no Open Toolkit). You can also use the Open Toolkit to generate formats such as HTML.
  • ePublisher Pro uses the Open Toolkit under the covers, but provides a GUI that attempts to hide the complexities.
  • MadCap’s software will support DITA (initially) by importing DITA content and letting you publish through MadCap’s existing publishing engine.
  • Several other vendors provide support for publishing DITA, but do not use the Open Toolkit at all.

The strategy of supporting DITA structure through a proprietary publishing engine actually makes a lot of sense to me. From the customer’s point of view, you can:

  • Set up an XML-based authoring workflow
  • Manage XML content

It’s not until you’re ready to publish that you move into a proprietary environment.

To me, the interesting question is this: Will the use of proprietary publishing engines be a temporary phenomenon, or will the Open Toolkit eventually displace them in the same way that DITA is displacing custom XML structure?

DITA

The hidden costs of DITA

Originally published in STC Intercom, April 2008

DITA is a free, pre-made XML document structure. That statement can lead to a few erroneous assumptions: if it’s free, then it will cut down on costs, and if it’s pre-made, it will cut down on labor. There are several things to consider when choosing a DITA solution. Does your staff have the skills to author in a DITA environment? Will additional training be required? Does DITA even match your content model, and if it doesn’t, is it worth the effort to change?
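To make the content-model question concrete, here’s a minimal sketch of a DITA task topic, the structure your writers would actually be authoring in. The id, title, and step text are invented for illustration:

  <?xml version="1.0" encoding="UTF-8"?>
  <!DOCTYPE task PUBLIC "-//OASIS//DTD DITA Task//EN" "task.dtd">
  <!-- Hedged sketch of a minimal DITA task topic; all content is placeholder -->
  <task id="installing_widget">
    <title>Installing the widget</title>
    <taskbody>
      <prereq>Download the installer before you begin.</prereq>
      <steps>
        <step><cmd>Run the installer.</cmd></step>
        <step><cmd>Follow the prompts and click <uicontrol>Finish</uicontrol>.</cmd></step>
      </steps>
      <result>The widget appears in your list of installed programs.</result>
    </taskbody>
  </task>

If your content doesn’t decompose into concept, task, and reference topics this cleanly, closing that gap is one of the hidden costs.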

Sarah’s conclusion? “DITA may be free, but it’s not cheap.”

Download the PDF (950 K)

Conferences

WritersUA: DITA pilot techniques

Mark Wallis of IBM ISS presented on how to run a successful DITA pilot. There’s some great information in this presentation on how to reduce risk.

He recommends selecting your pilot project based on the following items:

  • Right timeframe — don’t choose the project that has an imminent release
  • Choose a manageable documentation set size
  • Reduce risk by avoiding the strongest (or most critical) product
  • Identify a product with a known need to improve the user experience

They had one person out of a group of twelve, a “senior in name only” writer, leave because of this transition.

The ideal team for a pilot will need cross-functional and complementary skills:

  • Project management skills
  • Tools and technology strengths
  • Product knowledge and understanding
  • Architecture and design skills
  • Editor for standards and styles

Some advice on planning your content. (It’s worth noting that these points apply to good writing and topic-oriented content in general, rather than to DITA tools specifically.)

  • No autopilot writing
  • Don’t just migrate existing content; you’ll get trapped in old paradigms (this assumes that existing content does not fit the DITA topic paradigm)
  • Perform use case analysis and task analysis
  • Determine the critical scenarios to document
  • Focus on tasks; backfill supporting information as needed

Some interesting discussion of “task support clusters,” which include conceptual overviews, related tasks, deep concept, and reference information. (Michael Hughes did a presentation on this earlier today, which I unfortunately was not able to attend.)

They set up a DITA War Room in a small conference room and met at least daily (1.5 to 2 hours per day. Yikes). They set weekly goals and used small tasks to build momentum.

There was also heavy use of an internal wiki to put up initial “straw man” design, then revise, comment, and discuss.

Layering deliverables
Implementation deliverables were split out into smaller tasks, such as:

  • Creating topic files, links, and navigation
  • Testing links from code and navigation
  • Creating task and reference topics
  • Validating help against the user interface
  • Creating concept topics for principles, guidelines, and best practices (“deep concept”)
  • Validating content in the expert community

For the third time, he points out that they are no longer documenting how to use a check box, so I guess I’ll mention it.

Choosing the DITA toolset

Task Modeler (free) for building and managing ditamaps, defining relationships between topics, and creating skeleton topics (stub files).
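For anyone who hasn’t seen one, here’s a minimal sketch of the kind of ditamap a tool like Task Modeler manages: a topic hierarchy plus a relationship table. All file names are invented:

  <?xml version="1.0" encoding="UTF-8"?>
  <!DOCTYPE map PUBLIC "-//OASIS//DTD DITA Map//EN" "map.dtd">
  <!-- Hedged sketch: topic hierarchy plus a relationship table.
       File names are placeholders. -->
  <map title="Widget help">
    <topicref href="overview.dita" type="concept">
      <topicref href="installing.dita" type="task"/>
      <topicref href="configuring.dita" type="task"/>
    </topicref>
    <reltable>
      <relrow>
        <relcell><topicref href="installing.dita"/></relcell>
        <relcell><topicref href="requirements.dita" type="reference"/></relcell>
      </relrow>
    </reltable>
  </map>

The reltable row generates related links between the task and its reference topic at build time, without hard-coding links in the topics themselves.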

DITA-compliant editor to edit your topics.

Compiler (part of open source toolkit). Compiler? What are they compiling? HTML Help? Oh. He just referred to Ant as a compiler. Ohhhhhkay.

Proof of concept

They picked a subset of the pilot to do the proof of concept.

The presenter’s boss is quoted as saying, “There’s no such thing as bad weather, only insufficient clothing.” I’m guessing that she’s never been to Minnesota in winter.

The objectives for the proof of concept:

  • Learn and evaluate tools
  • Address technical obstacles
  • Specify end-to-end requirements

They learned that deliverable formats matter because they must deliver several different formats.

Managing costs

Purchase toolsets only for the pilot team.

After completing the proof of concept (successfully!), invest in tools for the remaining writers.

Wiki

They used their wiki to capture conventions and guidelines.

Improving acceptance

They paid attention to the change management issues. He doesn’t mention it here, but I would assume that the combination of an acquisition by IBM plus the requirement to change the authoring environment could have caused significant angst. Their approach included presentations, wiki content, email discussions, and online training.

At the point of transition, DITA boot camp was offered.

They used collaborative walkthroughs, or reviews, to help standardize their content development. Interesting. This sounds as though it could be a) threatening and b) an unbelievable time sink. But just maybe it might also c) help improve the content.

Other lessons learned

Think more, write less. (Don’t document the obvious, don’t document common user interface conventions, and write only if you’re really adding value.)

Don’t squander your ignorance. (If something makes you stumble in the interface, that will probably also cause problems for your users, so capture it.)

The more structured your content, the easier the transition to DITA.

Documenting the obvious teaches readers to ignore your text, so don’t document the obvious.

The handouts are available here: http://www.writersua.com/ohc/suppmatl/

XML

When is XML the wrong answer?

Originally published in STC Intercom, November 2007

XML can benefit a publishing workflow in many ways: it can improve content reuse and consistency, and it can potentially automate much of the process. That all sounds wonderful, but XML is not the logical answer for everyone.

Implementing a structured authoring solution requires a significant change from the familiar desktop publishing routine to new tools, technologies, and processes. Switching to XML is going to cost time and money. Depending on your needs, it may not be the most efficient solution.

Download the PDF (350 K)

Conferences

tekom: Benefits for North American writers

My post about tekom generated some interesting comments, including this one, which I will address in pieces:

Thanks for this info. I’ve been lobbying my company to send me to Tekom for the last few years, unsuccessfully. I submitted 2 times for presentations but both were rejected. Our company is in Concord, Massachusetts, USA.
Could you discuss the benefits to North American writers attending such an international event. Are there things you learned there you will not learn anywhere else (business/tech stuff of course )

Interesting question.

Opinion

The Age of … Expertise?

Over on O’Reilly’s Radar blog, Andy Oram has a fascinating article about the demise (!) of the Information Age and what will be next:

[T]he Information Age was surprisingly short. In an age of Wikipedia, powerful search engines, and forums loaded with insights from volunteers, information is truly becoming free (economically), and thus worth even less than agriculture or manufacturing. So what has replaced information as the source of value? The answer is expertise. Because most activities offering a good return on investment require some rule-breaking–some challenge to assumptions, some paradigm shift–everyone looks for experts who can manipulate current practice nimbly and see beyond current practice. We are all seeking guides and mentors.

What comes after the information age? (be sure to read the comments, too)

It’s an interesting idea, but I don’t think we’re getting away from the Information Age into the Expertise Age. After all, expertise is just a specialized (useful!) form of information.

In the comments, Tim O’Reilly points out that the real change is in how information is gathered and distributed with “the rise of new forms of computer mediated aggregators and new forms of collective curation and communication.”

I believe that we are still firmly in the Information Age because information has not yet become a commodity product. There is, however, clearly a shift happening in how information is created and delivered. I think it’s helpful to look at communication dimensions:

  • Traditional technical writing is one-to-many. One person/team writes, many people consume it.
  • Wikis are many-to-many. Many people write; many people use the information.
  • Mailing lists are many-to-one. Many people respond to one person’s question.
  • Technical support is one-to-one. One person calls; one person responds.

Technical support is the most expensive option; it’s also often the most relevant. Technical writing is more efficient (because the answer to the question is provided just once), but also less personal and therefore less relevant.

Many technical writers are concerned about losing control over their content. For an example of the alarmist perspective, read JoAnn Hackos’s recent article on wikis. Then, be sure to read Anne Gentle’s eponymous rebuttal on The Content Wrangler.

Keep in mind, though, that you can’t stop people from creating wikis, mailing lists, third-party books, forums, or anything else. You cannot control what people say about your products, and it’s possible that the “unauthorized” information will reach a bigger audience than the Official Documentation(tm). You can attempt to channel these energies into productive information, but our new information age is the Age of Uncontrolled Information.

Furthermore, the fact that people are turning to Google to find information says something deeply unflattering about product documentation, online help, and other user assistance. Why is a Google search more compelling than looking in the help?
