
Author: Sarah O'Keefe

Content strategy, Localization, XML

Building efficient multilingual workflows

STC Intercom, April 2009

A common argument for XML-based workflows is that they automate production and localization tasks. With XML, localization can be reduced to a fraction of its original cost, but how exactly does that happen?

Sarah explores automation in localization and two technology standards used in multilingual workflows: the Extensible Stylesheet Language (XSL) and the XML Localization Interchange File Format (XLIFF).
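
As a rough illustration (not taken from the article), an XLIFF 1.2 file carries each translatable segment as a trans-unit with a source and a target; the file name, ID, and strings below are invented:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Illustrative sketch only: one translation unit with an English
         source and a French target. -->
    <xliff version="1.2" xmlns="urn:oasis:names:tc:xliff:document:1.2">
      <file original="userguide.xml" source-language="en"
            target-language="fr" datatype="xml">
        <body>
          <trans-unit id="intro-1">
            <source>Click Save to store your changes.</source>
            <target>Cliquez sur Enregistrer pour sauvegarder vos modifications.</target>
          </trans-unit>
        </body>
      </file>
    </xliff>

Translation tools exchange files like this one, which is how an XML-based workflow keeps translated segments aligned with their sources.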

Download the PDF (125 K)

Read More
News

DITA adoption increasing overall structured authoring adoption

I’m knee-deep in survey data analysis. With over 600 responses, our recent structured authoring survey was hugely successful; thank you. Many respondents added candid details about their experiences with structured authoring implementation: their fears, mistakes, and biggest surprises.

The survey report will be available later this month (free to participants, $200 for others), but I wanted to give you a couple of preliminary highlights:

  • About 30 percent of respondents said that they are currently using structured authoring.
  • There’s a lot of hype around DITA, but our data indicates that it’s backed up by reality. Consider this chart, which shows the top three types of structure (custom, DocBook, or DITA) implemented, being implemented, or planned.

[Chart: DITA accounts for the vast majority of structure implementations, past, present, and future.]

DITA dominates the chart. But it looks as though DITA is additive. That is, it’s not cannibalizing the numbers for DocBook or custom structures; those numbers are relatively flat. Instead, it looks as though DITA is increasing the total number of implementations.

If you are attending the STC Summit this year, I’m doing a presentation on the survey results on Monday, May 4, at 1:30 p.m., called “The State of Structure.”

Read More
News

What do Tech Writers Want?

Answer? I don’t know, but The Content Wrangler is conducting a survey to find out. Here’s the announcement:

2009 is a tough economic year for most of us. Companies are cutting back on nice-to-have purchases and focusing on what’s necessary. This survey conducted by The Content Wrangler aims to help us better understand your training needs for 2009 and to identify the types of classes you need. We plan to use this information to help training providers create relevant public and on-site training programs that address your needs and to gain an understanding of the current state of training program interest in our industry today.

In case you need further motivation, there is also a random drawing for some goodies. The survey has only five questions, so it should be quick.

Take the survey

Read More
Conferences

Life in the desert

Last week, I attended the annual DocTrain West event, which was held this year in Palm Springs, California.

Weather in Palm Springs was spectacular as always, with highs in the 80s during the day. Some of my more northerly friends seemed a bit shell-shocked by the sudden change from snow and slush to sun and sand. (North Carolina was 40 degrees when I left, so that was a nice change for me as well.)

Scott Abel did his usual fine job of organizing and somehow being omnipresent.

I promised to post my session slides. The closing keynote was mostly images and is probably not that useful without audio, so I’m going to point you to an article that covers similar ground (What do Movable Type and XML Have in Common, PDF link).

I have embedded the slides from my DITA to PDF session below.

I have also posted to our wiki the InDesign template file and the XSL we built to preprocess the DITA XML into something that InDesign likes. Note that running the XSL requires a working configuration of the DITA Open Toolkit. For more information, refer to the DITA to InDesign page on our wiki.
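
To give a sense of what that preprocessing step does, here is a heavily simplified sketch, not the actual stylesheet from the wiki; the output element and style names are placeholders standing in for whatever the InDesign import expects:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Simplified sketch: map a few DITA elements to placeholder
         InDesign-friendly elements. The real stylesheet is more
         involved and runs as part of a DITA Open Toolkit build. -->
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:output method="xml" indent="yes"/>

      <!-- A DITA topic becomes a story container. -->
      <xsl:template match="topic">
        <Story>
          <xsl:apply-templates/>
        </Story>
      </xsl:template>

      <!-- The topic title gets a heading paragraph style. -->
      <xsl:template match="topic/title">
        <Para style="Heading1">
          <xsl:value-of select="."/>
        </Para>
      </xsl:template>

      <!-- Body paragraphs get a body style. -->
      <xsl:template match="p">
        <Para style="Body">
          <xsl:value-of select="."/>
        </Para>
      </xsl:template>
    </xsl:stylesheet>

The point of the preprocessing is simply to translate DITA’s semantic markup into paragraph-level tags that the InDesign template can map to its own styles.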

Read More
News

How to Get a Job

[update to correct bad links]

This is the best advice for job seekers I’ve ever seen. India Amos writes about her pile of resumes:

And do you want to know what’s the most striking thing about most of these hopefuls? They are completely wasting their time. And mine, of course, but mostly their own. Because they’re not only not going to get a job with me, they’re not going to get a job with anyone unless that person is as slovenly and illiterate as these applicants.

She proceeds to offer some excellent advice in numerous categories. Here are some excerpts from a lengthy list about formatting:

  • Learn to use style sheets, so that you can make your heading styles consistent. If you choose to ignore my request for a PDF résumé, try to make sure your Word attachment doesn’t demonstrate to me what a slob you are, formatting everything locally and aligning text using spaces instead of tabs.
  • Don’t Capitalize Everything. I Cannot Emphasize This Enough. It Makes You Look Like a 419 Scammer.
  • Violet 9pt Arial is probably not a good choice for anything.

Hehe. (sob)

Related to this: How Not to Get a Job (Palimpsest, December 2007)

Of course, in today’s economy, lots of people need jobs. So here is some long-promised advice on how to get a job:

  1. Apply for jobs where your skillset is relevant. In this job market, with tons of job seekers, you are unlikely to get the “stretch” position. So, look for positions that are equivalent to your last position, that you are uniquely qualified for, or that you are slightly overqualified for. For instance, let’s say you are a technical writer with five years of experience and “the usual” complement of technical skills. What is your unique qualification? If you speak some Japanese, look for Japanese companies where your language skills might be useful. If your undergraduate degree is in music, look for a company that makes music software or products related to music. In other words, look for a position where your outside interests are also relevant. But, at a minimum, apply only for positions that you are reasonably qualified for. It’s tempting, especially when you really need a job Right Now, to take the firehose approach and spray resumes everywhere. It doesn’t work. Focus your job search and send out a smaller number of really good applications.
  2. Do your homework. Before contacting the company, investigate. Read their web site, read any recent news coverage. Look them up on LinkedIn and see if you know anyone in the organization. (You are on LinkedIn, right?) Use the information you find to make your application more relevant. If you get an interview, do more homework before the interview.
  3. When you apply for the job, follow the #!%$#!%#! instructions. If asked for PDF, provide PDF. If asked for Word, provide Word. Et cetera.
  4. Submit resumes online. Paper and snail mail take too long. By the time your resume arrives by mail, the position could be filled. Also, dropping off your resume in person? Creepy and needy. (One exception: if you know someone at the organization and they are willing to deliver the resume for you. Even then, I would recommend sending your contact an email with the resume attached and asking him or her to forward it.)
  5. Whether it is requested or not, write a cover letter. The cover letter should be the body of your email and not an attachment. Follow Ms. Amos’s excellent advice. You might also use a T letter as your cover letter, but do send the resume. Tom Murrell describes the T letter in detail in his article Get More Interviews with a T-letter. But again, I disagree with his advice to leave out the resume. If you are instructed to send a resume, send a resume.
  6. Show up on time for any in-person interview. If possible, do a dry run the day before to locate the building. Or plan to arrive very early. There are worse things than sitting in a nearby coffee shop for half an hour. (Don’t chug too much coffee.)

I could go on for a long time, but frankly, these six points will lift you above 95 percent of the other applicants, and you can do the rest.

(India Amos via words / myth / ampersand & virgule)

Read More
Opinion

I am not a Pod Person

Confession time: I don’t like podcasts.

And I think I know why.

I am a voracious reader. And by voracious, I mean that I often cook with a stirring spoon in one hand and a book in the other. I go through at least a dozen books a month (booksfree is my friend).

So why don’t I like podcasts?

  1. They’re inconvenient. I don’t have a lot of uninterrupted listening time, other than at the gym. And frankly, there’s a bizarre cognitive dissonance listening to Tom Johnson interview Bogo Vatovec while I’m lifting weights. I tried listening to a crafting podcast, but that was worse: my brain can’t handle auditory input describing crocheting techniques while simultaneously operating an elliptical machine. So I went back to Dr. Phil on the gym TV. It may rot my brain, but at least it doesn’t hurt.
  2. They’re inefficient. I can listen to a 30-minute podcast, or I can skim the equivalent text in 90 seconds.

I’ve been thinking about what would make a podcast more appealing to me, and realized that it’s not really the medium I object to; it’s my inability to control the delivery.

I’ll become a podcasting proponent when I perceive these properties:

  1. Better navigation. Podcasts, like other content, need to be divided into logical chunks. These chunks should be accessible via a table of contents and an index.
  2. Ability to skim. Podcasts need to provide the audio equivalent of flipping pages in a book or scrolling through a document while only reading the headings.

Depending on the software you use to consume podcasts, you may already have some of these features. For instance, a colleague told me that he listened to my recent DITA webinar at five times the normal speed:

I wanted to let you know about something in particular. I listened to it at 5x fast fwd in Windows Media Player while drinking a coke. My heart is still racing. You should try it. :o)

Do you enjoy podcasts? Do you have any special techniques for managing them efficiently?

Read More
Tools

WMF…that’ll shut ’em up

Which graphics formats should you use in your documentation? For print, the traditional advice is EPS for line drawings and TIFF for screen captures and photographs. That’s still good advice. These days, you might choose PDF and PNG for the same purposes. There are caveats for each of these formats, but in general, these are excellent choices.

Of course, everybody knows to stay away from WMF, the Windows Metafile Format. WMF doesn’t handle gradients, can’t have more than 256 colors, and refuses to play nice with anything other than Windows.

Think you’re too good to hang out with WMF? For your print and online documentation, perhaps. But it may be a great choice to give to your company’s PowerPoint users.

Are you familiar with this scenario? PowerPoint User saw some graphics in your documentation and thought they would work for some sales presentations. The screen captures are easy; you just give PowerPoint User PNGs or BMPs or whatever. It’s the line drawings that are the problem. PowerPoint User doesn’t have Illustrator and has never heard of EPS. PowerPoint User says, “Can you give me a copy of those pictures in a format that I can use in PowerPoint? Oh, and can you make that box purple and change that font for me first? And move that line just a little bit? And make that line thicker? And remove that entire right side of the picture and split it into two pictures?”

You want PowerPoint User to reuse the graphics; you’re all about reuse. But you have dealt with PowerPoint User before, and you know you will never get your real job done if you get pulled into the sucking vortex of PowerPoint User’s endless requests.

The secret is to give PowerPoint User the graphics in a format that can be edited from within PowerPoint (or Word): WMF. Here’s the drill that will make you a hero:

  1. Save your graphics as WMF.
  2. Place each WMF on a separate page in a PowerPoint or Word file.
  3. Tell PowerPoint User to double-click on a graphic to make it editable. (If you think your PowerPoint User is really dumb, you can double-click each graphic yourself, respond to the dialog box that asks whether to make the drawing editable, and save the file before sending it. But nobody is that dumb.)

WMF. It will make PowerPoint User go away…happy!

Read More
Opinion

Publishing DITA without the DITA Open Toolkit: A Trend or a Temporary Detour?

I estimate that about 80 percent of our consulting work is XML implementation. And about 80 percent of our XML implementation work is based on DITA. So we spend a lot of time with DITA and the DITA Open Toolkit.

I’m starting to wonder, though, whether the adoption rate of DITA and the DITA Open Toolkit is going to diverge.

For DITA, what we hear most often is that it’s “good enough.” DITA may not be a perfect fit for a customer’s content, but customers don’t see a compelling reason to build the perfect structure. In other words, they are willing to compromise on document structure. DITA structure, even without specialization, offers a reasonable topic-based solution.

But for output, the requirements tend to be much more exacting. Customers want any output to match their established look and feel requirements precisely.

Widespread adoption of DITA leads to a sort of herd effect, with safety in numbers. Not so for the Open Toolkit: output requirements vary widely, and people are reluctant to contribute back to the Open Toolkit, perhaps because look and feel is considered proprietary.

The pattern we’re seeing is that customers adopt the Open Toolkit when:

  • They intend to deploy onto multiple servers, and open source avoids licensing headaches.
  • The Open Toolkit provides a useful starting point for their output format. (A minimal build sketch follows this list.)
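
For readers who have not driven the Open Toolkit directly, the usual pattern is a small Ant wrapper that delegates to the toolkit’s own build file. This is a sketch only; the installation path, map name, and output directory are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Sketch of a wrapper build file that calls the DITA Open
         Toolkit's build.xml. dita.dir must point at an installed
         toolkit; the map and output paths are invented. -->
    <project name="dita2xhtml" default="xhtml">
      <property name="dita.dir" value="/opt/DITA-OT"/>
      <target name="xhtml">
        <ant antfile="${dita.dir}/build.xml" dir="${dita.dir}">
          <property name="args.input" value="${basedir}/userguide.ditamap"/>
          <property name="transtype" value="xhtml"/>
          <property name="output.dir" value="${basedir}/out/xhtml"/>
        </ant>
      </target>
    </project>

Swapping the transtype (to pdf2, for example) is what changes the output format, which is why the toolkit is attractive as a starting point and frustrating when the built-in formats don’t match your requirements.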

Customers tend to adopt non-Open Toolkit solutions when:

  • They need attractive PDF. Getting this result from the Open Toolkit isn’t quite impossible, but it’s hard. There are other options that are faster, cheaper, and easier.
  • They need a format that the Open Toolkit doesn’t provide. The most common requirement here is web-based help. Getting from the XHTML output in the Open Toolkit over to a sophisticated tri-pane help system with table of contents, index, and search….well, let’s just say that it keeps me gainfully employed. AIR is another platform that we need to support.

The software vendors seem to be encouraging this trend. In part, I think they would like to find some way to get lock-in on DITA content. Consider the following:

  • Adobe FrameMaker can output lovely PDF from DITA content directly, with no Open Toolkit involved. You can also use the Open Toolkit to generate formats such as HTML.
  • ePublisher Pro uses the Open Toolkit under the covers, but provides a GUI that attempts to hide the complexities.
  • MadCap’s software will support DITA (initially) by importing DITA content and letting you publish through MadCap’s existing publishing engine.
  • Several other vendors provide support for publishing DITA, but do not use the Open Toolkit at all.

The strategy of supporting DITA structure through a proprietary publishing engine actually makes a lot of sense to me. From a customer point of view, you can:

  • Set up an XML-based authoring workflow
  • Manage XML content

It’s not until you’re ready to publish that you move into a proprietary environment.

To me, the interesting question is this: Will the use of proprietary publishing engines be a temporary phenomenon, or will the Open Toolkit eventually displace them in the same way that DITA is displacing custom XML structure?

Read More
DITA

The hidden costs of DITA

Originally published in STC Intercom, April 2008

DITA is a free, pre-made XML document structure. That statement can lead to a few erroneous assumptions: if it’s free, then it will cut down on costs, and if it’s pre-made, it will cut down on labor. There are several things to consider when choosing a DITA solution. Does your staff have the skills to author in a DITA environment? Will additional training be required? Does DITA even match your content model, and if it doesn’t, is it worth the effort to change?
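
To make “pre-made structure” concrete, here is a minimal DITA task topic; the content and id are invented:

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE task PUBLIC "-//OASIS//DTD DITA Task//EN" "task.dtd">
    <!-- Illustrative only: a minimal DITA task topic. -->
    <task id="install-widget">
      <title>Installing the widget</title>
      <taskbody>
        <prereq>Download the installer from your account page.</prereq>
        <steps>
          <step><cmd>Run the installer.</cmd></step>
          <step><cmd>Restart the application.</cmd></step>
        </steps>
      </taskbody>
    </task>

Whether your content fits cleanly into structures like this one, or requires specialization and retraining to get there, is where the hidden costs come in.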

Sarah’s conclusion? “DITA may be free, but it’s not cheap.”

Read More