
Author: Sarah O'Keefe

Opinion

Some thoughts on “free”

Chris Anderson (author of The Long Tail and editor-in-chief of Wired Magazine) has just published Free: The Future of a Radical Price. The book is available (not free) in all the usual outlets, but you can also read it on Scribd. For free.

Reviews, so far, are mixed. Malcolm Gladwell, writing in the New Yorker, didn’t like it. The New York Times, not so much a fan. And there was an ugly little kerfuffle about attribution (or lack thereof) of content sourced from Wikipedia. Emma Duncan, writing for the Guardian, liked it.

This book is important because Anderson is attempting to define a taxonomy of different types of “free.” Businesses and organizations face the difficult challenge of figuring out what should and should not be free. To give you a tiny, itty-bitty example, Scriptorium offers a series of white papers, technical references, and books. What’s the difference between a white paper and a technical reference? The white papers are free; the tech references are not. Costs range from $10 to $200. But how do we decide whether a document should be free or not? We are still trying to figure out the right answer. As Anderson points out, the incremental cost of producing additional e-books (after the first one) is zero. Should all digital content be free? We have chosen, for the most part, to charge for books and for the more technical documents. White papers, which typically provide an overview of a technology or methodology, are generally free. We feel that this is a fair representation of our actual development costs.

Meanwhile, our friendly neighborhood technical communication organization is trying to figure out some similar issues. Currently, the STC web site has public content (free) and members-only content (not free).

The major argument I’m hearing from STC leadership for locking down content is basically that otherwise, people will be able to use the content without paying for it. In other words, the value of the STC membership is that it gives you access to members-only content. This logic would make some amount of sense if STC held a monopoly on content related to technical communication. It does not.

So, what happens when you lock down content and hide it from non-members? You lose the opportunity to participate in the community. You lose the opportunity to have non-members read your content, decide you are useful, and join the Society. You lose the opportunity for inbound links.

Similar logic applies to forums, wikis, and online communities. Members and non-members should be able to participate. Perhaps members get special badges in their profiles to indicate membership, but communities derive value from participation, and open access means more participation.

If stc.org can be transformed into a vital hub for the technical communication community, the organization itself will do fine. In a moment of apparent insanity, I have offered to help with this effort. If you’d like to join me, contact me in the comments below, via Twitter (@sarahokeefe), on the STC Ideas forum (stcideas.ning.com), or via whatever avenue makes the most sense to you. (Email and phone contact information are in the main part of our web site.)

Webinar

Summer webinar theme: Avoiding extinction

Ellis Pratt of Cherryleaf is delivering Beyond Documentation this Thursday, July 9th, at 11 a.m. Eastern (US) time. Ellis gave a similar presentation in Vienna, which was the basis for Tom Johnson’s post, How to Avoid Extinction as a Technical Communicator, and led to a lively discussion in the comments. Join us to see if you agree with Ellis’s point of view.

In the category of “what’s old is new again,” we have Writing to STOP from Tony Self of HyperWrite in Australia.

STOP – Sequential Thematic Organisation of Publications – was developed at Hughes Corporation in the 1960s. The purpose of STOP was to improve the speed of document production, and to allow multiple authors to work simultaneously on the same document. […]
The STOP approach still resonates in the age of online documentation, as we still have the same needs to reduce document creation times and to work collaboratively. In this session, we will look at how the STOP approach worked, and how it might be re-applied even more effectively in the 21st century. 

That presentation is July 15 at 5 p.m. Eastern time. (Note the time change. Our usual 11 a.m. time slot is 1 a.m. in Melbourne, Australia. That seemed impolite to our presenter.)

Finally, Jack Molisani of ProSpring and LavaCon is delivering How to Build a Business Case on August 4 at 11 a.m. Eastern time.

If you’ve ever submitted a purchase request that was not approved, chances are it lacked one or more of the vital components management looks for when allocating resources. 

In this segment, Jack Molisani will present a fun and practical session identifying the components of a successful business case, how to identify what is important to management, how to maximize your chances of approval, and more.

Jack usually rewards questions with chocolate, and I’m going to be impressed if he manages that in a webinar.

Don’t miss your chance to hear from these guys. You can register through our store; recordings of previous webcasts are now available as well.

PS Our presenters are based in England, California, and Australia. Registrants could be anywhere. The sessions are yours for $20. I love the Internet.

Opinion

This is the future of technical communication

First, read this article in the New York Times about the struggle to keep a reporter’s kidnapping quiet:

For seven months, The New York Times managed to keep out of the news the fact that one of its reporters, David Rohde, had been kidnapped by the Taliban. But that was pretty straightforward compared with keeping it off Wikipedia. 

Now, think about these issues as applied to technical communication. Let’s assume that your organization has an online community — forums and a wiki, maybe. Technical communicators are responsible for monitoring and managing the community. Under what circumstances do you delete information? How do you respond when:

  • Information is inaccurate
  • Information is unflattering
  • Both

What if the information is accurate but incomplete?
What if someone describes a way of using your product that is technically possible but could cause injury? Do you delete the information? Do you add a comment warning of possible injury? What if the reader sees the original post but not the comment?

In the absence of safety concerns, I think that accuracy must win. Thus, as the information curator, you have a responsibility to correct inaccurate information. If the inaccuracy is truly dangerous, you may need to edit the post directly. Make sure that you disclose what you’ve done, using brackets. For example:

I like riding my scooter down mountains, especially without guardrails. Wheee! [This is a really bad idea because You Might Die. -moderator]

or

I like [really bad idea redacted by moderator]. Wheee!

Deleting unflattering (but accurate) information will probably backfire on the organization. Instead of censoring negative content, try addressing the concern that’s been raised. Think of an impolite forum post as customer feedback. Does the poster have a valid point? Can you fix the problem that’s been identified?

I hate your scooters. They don’t come in enough colors. And they suck. 

What colors would you like to see? We do have two dozen available; see this list.
– Joe in TechComm

The life-or-death issues around Mr. Rohde’s kidnapping are relatively straightforward. We are likely to have much more difficult judgment calls in typical technical communication. Imagine, for example, that information were being suppressed because it criticized security arrangements and not because of safety concerns for the reporter. In that case, I think we can agree that Wikipedia’s response would have (and should have) been different. What would an equivalent scenario look like in your organization?

Opinion

Whither STC?

As you may have heard, STC is in a financial crisis. According to the board of directors meeting minutes from May 5, 2009 (PDF, page 2), STC must retain membership “for the next year or STC will be out of business in two years.” There’s a lively discussion on Twitter under the #stcorg hashtag.

For example, Bill Swallow (@techcommdood) wrote: “From STC I want innovation, education, and communication. Right now I get advertising, magazines, and frustration. #stcorg”

STC itself has requested feedback via private email, on Twitter with the #stcorg tag, and on a “private online forum.” I appreciate the idea, but I prefer to share my thoughts here, where anyone can read and comment on them.

According to the June 18 email message from Cindy Currie (STC president), the “unprecedented financial shortfall” is being caused by “the recession’s negative impact on our traditional sources of revenue.” Although it’s certainly true that the recession has caused a decline in membership along with a decline in conference attendance (the biggest two sources of income for STC), the recession is not the root cause of the problem.

The root cause is that STC is not perceived as sufficiently important by its membership. After all, a member could pay $200 for a membership by dropping cable television for a couple of months. Getting rid of cable for a year would come close to paying for conference attendance. It is true, of course, that a few members are in serious financial trouble due to layoffs or reduced income. In most cases, however, I think the member (or the sponsoring employer) has simply decided that STC (or the conference) does not offer enough value to justify the cost.

I have been an STC member for many years, and am an associate fellow. I participate in the annual conference both as a speaker and as an exhibitor. My company is a member of the Corporate Value Program. I have served on a couple of society-level committees and initiatives. This doesn’t make me a typical member, but I think it does give me a fairly broad perspective on the organization as a whole.

I believe that STC needs to make some significant changes in the following areas.

Velocity
Industry developments are fast and furious, and STC has not kept pace. For the STC conference, generally held in May, proposals are due the preceding summer. I turned in an article for Intercom on June 16, which will appear in the September issue. Chris Hester (@chris_oh) said it best on Twitter: “Why pay for a pub when it uses content that was on blogs months earlier?”

STC needs to increase what the military calls operational tempo. Intercom, as many others have said, probably needs to evolve into an online publication to cut down the publication time. This has some significant advantages:

  • Faster publishing
  • Cheaper publishing by eliminating print production, paper, and distribution costs
  • Ability to publish more often

There is concern that putting Intercom online (and, by the way, I do not mean in PDF format) would put a dent in advertising revenue. It will. However, my company does not currently advertise in Intercom because we think the rates are too high and the value is not there. I would greatly prefer advertising in an online Intercom. I would also expect those rates to be significantly lower than the equivalent print ad. Providing Intercom online would open up advertising to many smaller companies. Would it be more profitable? I don’t know, but it would be a better, more relevant, publication, so that’s a start.

Similarly, the proposal process for the annual conference needs to be compressed significantly. With nine months of lead time, it’s impossible to provide relevant content. And please don’t tell me “it can’t be done.” Joe Welinske of WritersUA usually evaluates proposals in September/October for a March conference. Germany’s tekom, which is significantly larger than the STC conference, generally requires proposals in May for a November event. Six months is still a long time, but it’s one-third shorter than STC’s process.

Community
STC’s main value is in providing a sense of community for technical writers/communicators. In the past, the organization delivered community through printed magazines mailed to the membership, through local chapter meetings, and through regional and national conferences. As email lists became popular, STC provided discussion lists for various SIGs, local chapters, and other groups (for example, there is a chapter presidents’ list. Or so I hear).

Today, however, communities of interest are meeting through various social media, and STC has not kept pace. STC should be providing a platform that encourages discussion and collaboration. The obvious template for this is what Scott Abel has done with the Content Wrangler network. STC serves writers; give the writers a place to write blogs, collaborate on a wiki, and the like.

Incidentally, the STC Body of Knowledge effort is an excellent example of open collaboration. However, it’s quite difficult to find from the main STC web site. These and other initiatives should all be under the stc.org umbrella. It’s not particularly difficult to set up subdomains so that, for example, bok.stc.org points to the Body of Knowledge and forum.stc.org points to the forums. And so on.

Openness
Finally, STC needs to embrace a culture of openness. That means:

  • Provide open access to Intercom and other publications online. Increase the readership, make the publications more relevant, and therefore increase their appeal to advertisers.
  • Provide open access to forums and other collaboration areas. Do not limit them to members only. The STC Single Sourcing SIG recently launched a Ning network (here), but access is restricted not just to STC members but actually to SIG members only. This balkanization reduces the value of the community. Instead, open up participation and build a valuable, must-have resource.
  • Improve member communications and especially focus on giving people a way of letting their voices be heard. The virtual town halls now in progress are a good idea, but the process of getting access is too difficult. I finally resorted to begging for help on Twitter and got the information I needed in less than five minutes. Unless there is a compelling reason to lock up information, it should be publicly available.

Change is hard. Transformational change is painful.
I have worked with many of the people in the STC office and in STC leadership, and it’s important to recognize that they are hard-working, smart people. I like them. (One of them is particularly entertaining in a hotel bar at 1 a.m. You Know Who You Are.)

They see the icebergs ahead and are trying hard to navigate through them. The problem is that turning a cruise ship takes time and effort. And, if you’ll pardon the tortured analogy, the larger problem is that navigating through the ice field is impossible in a huge cruise ship. The correct answer is to step outside today’s constraints and rethink the problem. Perhaps we should morph into a submarine and go under the icebergs. At this point, we are still discussing whether to make a 5-degree or a 10-degree turn.

The financial problem that STC faces is a symptom, not the disease. Let’s treat the symptom and get through this crisis, but please do not forget about the underlying disease. STC needs more velocity, more community, and more openness.

Update (6/23/2009): Since I published this post, several other bloggers have added their perspectives. Here they are, in no particular order. If I missed your post, please add it in the comments so that readers of this article can find you.

Reviews

Flare 5 DITA feature review, part 2

[Alan Pringle wrote most of this review.]

This post is Part 2 of our Flare 5 DITA feature review. Part 1 provides an overview and discusses localization and map files.

Cross-references and other links
I imported DITA content that contained three xref elements (I shortened the IDs below for readability):

  • Reference to another step in the same topic:
    <stepresult>
    Result of step. And here’s a reference to the <xref href="task1.xml#task_8F2F9" type="li" format="dita" scope="local">third step</xref>.
    </stepresult>
  • Reference to another topic:
    <stepresult>
    Result text. And here’s a link to the other task topic:
    <xref href="task2.xml#task_8F2F94" type="task" format="dita" scope="local"></xref>.
    </stepresult>
  • Link to web site:
    <cmd>
    Here’s another step. Here’s a link with external scope:
    <xref href="https://scriptorium.com" scope="external" format="html">www.scriptorium.com</xref>
    </cmd>

All three came across in the WebHelp I generated from Flare:


On the link to the topic, Flare applied a default cross-reference format that included the word “See” and the quotation marks around the topic’s name. You can modify the stylesheet for the Flare project to change that text and styling.

Relationship tables
DITA relationship tables let you avoid the drudgery of manually inserting (and managing!) related topic links. Based on the relationships you specify in the table, related topic links are generated in your output.
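If you haven’t worked with relationship tables before, here’s a minimal sketch of one inside a map (the topic file names are hypothetical, not taken from my test project):

<map>
  <!-- hypothetical topic files, for illustration only -->
  <topicref href="install-overview.xml" type="concept"/>
  <topicref href="installing.xml" type="task"/>
  <reltable>
    <relheader>
      <relcolspec type="concept"/>
      <relcolspec type="task"/>
    </relheader>
    <relrow>
      <!-- topics in the same row become related-topic links in the output -->
      <relcell><topicref href="install-overview.xml"/></relcell>
      <relcell><topicref href="installing.xml"/></relcell>
    </relrow>
  </reltable>
</map>

Topics that share a row are considered related, so the output gets a link from the concept to the task and from the task back to the concept.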

I imported a simple map file with a relationship table into Flare and created WebHelp. The output included the links to the related topics. I then tinkered with the project’s stylesheet and its language skin for English to change the default appearance and text of the heading for related concepts. The sentence-style capitalization and red text for “Related concepts” in the following screen shot reflect my modifications:

[Screen shot showing the “Related concepts” heading in red, with sentence-style capitalization]

Conrefs
DITA conrefs let you reuse chunks of content. I created a simple conref for a note and then imported the map file with one DITA file that contains the actual note and a second file that references the note via a conref.

Flare happily imported the information and turned the conref into a Flare snippet. It’s worth noting that the referencing, while equivalent, is not the same. In my source DITA files, I had this:

aardvark.xml contains:
<note id="nofeeding">Do not feed the animals.</note>

baboon.xml contains:
<note conref="aardvark.xml#aardvark/nofeeding"/>

Thus, we have two instances of the content in the DITA files — the original content and the content reference. In Flare, we end up with three instances — the snippet and two references to the snippet. In other words, Flare separates out the content being reused into a snippet and then references the snippet. This isn’t necessarily a bad thing, but it’s worth noting.

Specialization
Specialized content is not officially supported at this point. According to MadCap, it worked for some people in testing, but not for others. If you need to publish specialized DITA content through Flare, you might consider generalizing back to standard DITA first.

Conditional processing
When you import DITA content that contains attribute values, Flare creates condition tags based on those values. I imported a map file with a topic that used the audience attribute: one paragraph had that attribute set to user, and another had the attribute set to admin. When I looked in the Project Organizer at the conditions for the WebHelp target, conditions based on my audience values were listed:

[Screen shot showing the audience.admin and audience.user conditions]

I set Audience.admin to Exclude and Audience.user to Include, and then I created WebHelp. As expected, the output included the user-level paragraph and excluded the admin-level one.
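For reference, the source paragraphs looked something like this (a rough sketch; the wording is invented, and only the audience values match my test file):

<!-- wording invented for illustration; only the audience attribute values match my test -->
<p audience="user">To change your password, open the Preferences dialog.</p>
<p audience="admin">To reset another user's password, edit the accounts file on the server.</p>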

DITA support level
Flare supports DITA v1.1.

Our verdict

If you’re looking for a path to browser-based help for your DITA content, you should consider the new version of Flare. Without a lot of effort, we were able to create WebHelp from imported DITA content. Flare handled DITA constructs (such as conrefs and relationship tables) without any problems in our testing. Our only quibble was with the TOC entries in the WebHelp (as mentioned in Part 1), and we’ve heard that MadCap will likely be addressing that issue in the future.

We didn’t evaluate how Flare handles DITA-to-PDF conversion. However, if the PDF process in Flare works as smoothly as the one for WebHelp, Flare could provide a compelling alternative to modifying the XSL-FO templates that come with the Open Toolkit or adopting one of the commercial FO solutions for rendering PDF output.

Reviews

Flare 5 DITA feature review (Part 1: Overview and map files)

[Disclosure: Scriptorium is a Certified Flare Instructor.]
[Full disclosure: We’re also an Adobe Authorized Training Center, a JustSystems Services Partner, a founding member of TechComm Alliance, a North Carolina corporation, and a woman-owned business. Dog people outnumber cat people in our office. Can I start my post now?]

These days, most of our work uses XML and/or DITA as foundational technologies. As a result, our interest in help authoring tools such as Flare and RoboHelp has been muted. However, with the release of Flare 5, MadCap has added support for DITA. This review looks at the DITA features in the new product. (If you’re looking for a discussion of all the new features, I suggest you wander over to Paul Pehrson’s review. You might also read the official MadCap press release.)

The initial coverage reminds me a bit of this:

(My web site stats prove that you people are suckers for video. Also, I highly recommend TubeChop for extracting a portion of a YouTube video.)

Let’s take a look at the most important Flare/DITA integration pieces.

New output possibilities
After importing DITA content into Flare, you can publish to any of the output formats that Flare supports. Most important, in my opinion, is the option to publish cross-browser, cross-platform HTML-based help (“web help”) because the DITA Open Toolkit does not provide this output. We have created web help systems by customizing the Open Toolkit output, and that approach does make sense in certain situations, but the option to publish through Flare is appealing for several reasons:

  • Flare provides a default template for web help output (actually, three of them: WebHelp, WebHelp Plus, and WebHelp AIR)
  • Customizing Flare output is easier than configuring the Open Toolkit

I took some DITA files, opened them in Flare, made some minimal formatting changes, and published to WebHelp. The result is shown here:

[Screen shot: sample WebHelp generated from DITA through Flare]

Not bad at all for 10 minutes’ work. I added the owl logo and scriptorium.com in the header, changed the default font to sans-serif, and made the heading purple. Tweaking CSS in Flare’s visual editor is straightforward, and changes automatically cascade (sorry) across all the project files.

Ease of configuration
Flare wins. Next topic. (Don’t believe me? Read the DITA Open Toolkit User Guide — actually, just skim the table of contents.)

Language support
The Open Toolkit wins on volume and for right-to-left languages; Flare wins on easy configuration (I’m detecting a theme here.)

Out of the box, both Flare and the Open Toolkit provide strings (that is, localized output for interface elements such as the “Table of Contents” label) for simplified and traditional Chinese, Danish, Dutch, English, Finnish, French, German, Italian, Japanese, Korean, Norwegian, Portuguese, Spanish, Swedish, and Thai (I have omitted variations such as Canadian French).

Beyond that, we have the following:

  • Right-to-left language support: Only in the Open Toolkit
  • Language strings provided by the Open Toolkit but not by Flare: Arabic, Belarusian, Bulgarian, Catalan, Czech, Greek, Estonian, Hebrew, Croatian, Hungarian, Icelandic, Latvian, Lithuanian, Macedonian, Polish, Romanian, Russian, Slovak, Slovenian, Serbian, Turkish, and Ukrainian
  • Ease of adding support for a new language: Flare wins. In the Open Toolkit, you modify an XML file; in Flare, you use the Language Skin Editor (although it looks as though you could choose to modify the resource files directly if you really wanted to)

Thus, if you need Hebrew or Arabic publishing, you can’t use Flare. The Open Toolkit also provides default support for more languages.

Map files
I imported a map file into Flare and published. Then, I changed the map file to include a simple nested ditamap. Here is what I found:

  • Flare recognized the map file and the nested map file and built TOC files in Flare with the correct relationships.
  • Inexplicably, the nested map file was designated the primary TOC. I speculate that this might be because the nested map file was first in alphabetical order. I changed the parent map file to be the primary TOC to fix this. I don’t know what would happen for a more complex set of maps, but I am concerned.
  • Flare inserted an extra layer into the output TOC where the nested map is found.
  • The titles generated in the TOC are different in Flare than they are through the DITA Open Toolkit (see below).
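For reference, the nesting itself is nothing exotic; the parent map simply points to the child map, roughly like this (the child map’s file name is invented here, but the introduction entry matches the actual map, shown again below):

<map>
  <topicref href="introduction.xml" navtitle="Introduction" type="topic"/>
  <!-- the child map file name is invented for illustration -->
  <topicref href="implementation.ditamap" format="ditamap" navtitle="The decision to implement"/>
</map>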

I generated the output for my map file (the nested map is the “The decision to implement” section in this screen shot) through the DITA Open Toolkit and got the following XHTML output:

[Screen shot: TOC in the XHTML output from the DITA Open Toolkit]

Then, I imported the same map file into Flare, generated WebHelp, and got the following TOC output:

[Screen shot: TOC in the WebHelp output from Flare]

Notice that:

  • The TOC text is different (!!). The DITA Open Toolkit uses the text of the topic titles from inside the topic files. Flare uses the text of the @navtitle attribute in the map file. My topic titles and @navtitles don’t match because I created the map file, then changed a bunch of topic titles. The map file didn’t keep up with the new titles (because it doesn’t matter in the Open Toolkit), but it appears to matter for Flare. The entry in the map file for the first item is:

<topicref href="introduction.xml" navtitle="Introduction" type="topic">

Flare picks up the “Introduction” from the navtitle attribute.

Inside the file, you find:

<title>Executive summary</title>

The Open Toolkit uses the content of the title element from inside the file.

  • The Implementation section has added an extra layer in the Flare output. It appears that nesting a map file results in an extra level of hierarchy.

The inconsistency between the two implementations is annoying.

In part 2 of this review (coming soon), I’ll look at cross-references, reltables, conrefs, specialization, and conditional processing.

Webinar

Webinar mania!

I have several webinar-related updates to share:

Next week, the State of Structure

You probably know that Scriptorium conducted an industry survey on structured authoring earlier this year. The report, The State of Structure in Technical Communication, is available in our online store for $200.

There is a cheaper option to get the highlights. On Tuesday, June 16, at 1 p.m. Eastern time, I’ll be delivering a one-hour webinar that highlights the most important findings.

Coming in July and August

Expect to see additional webinars in cooperation with our TechComm Alliance partners, Cherryleaf and HyperWrite. We are also welcoming Jack Molisani of ProSpring, who will offer excellent and candid career development advice. Watch this space for details about these upcoming events. Scriptorium consultants will also be offering additional content.

Recorded events

Two of our recent webinars are now available for download:

  • Hacking the DITA Open Toolkit
  • Documentation as Conversation

Each webinar lasts about one hour and is $20, either live or recorded. You can register for the Tuesday webcast and download recordings in our online store.

(Warning: The recorded webcast files are quite large.)

Humor, Opinion

More cowbell!

About a year ago, we added Google Analytics to our web site. I have done some research to see what posts were the most popular in the past year:

  1. The clear winner was our FrameMaker 9 review. With 21 comments, I think it was also the most heavily commented post. Interestingly, the post itself is little more than a pointer to the PDF file that contains the actual review.
  2. The InDesign CS4 = Hannibal post, which discussed InDesign’s encroachment on traditional FrameMaker features.
  3. A surprise…a post from 2006 in which Mark Baker discussed the merits (or lack thereof) of DITA in To DITA or not to DITA.

Our readers appear to like clever headlines, because I don’t think the content quality explains the high numbers for posts such as:

We noticed this pattern recently, when a carefully crafted, meticulously written post was ignored in favor of a throwaway post dashed off in minutes with a catchy title (Death to Recipes!).

For useful, thoughtful advice on blogging, I refer you to Tom Johnson and Rich Maggiani. I, however, have a new set of blogging recommendations:

  1. Write catchy titles
  2. Have an opinion, preferably an outrageous one
  3. More cowbell

News

Think global

All your docs are belong to us.
We are joining with a couple of other technical communication companies to form the TechComm Alliance:

Three companies—Cherryleaf Ltd., HyperWrite, and Scriptorium Publishing—are forming TechComm Alliance to help us handle technical communication projects around the world. We are located in the United Kingdom, Australia, and the United States, respectively, and each company has customers in both its home location and in other countries. TechComm Alliance will make it easier to work with global companies that need services worldwide.

How will this work? We expect to:

  • Work together on large projects that require support in multiple locations. For instance, Scriptorium might be implementing structured authoring for a U.S. company that also has operations in Europe and Australia. During rollout, instead of sending a Scriptorium consultant around the world, we partner with Cherryleaf for the training in Europe and with HyperWrite for the training in Australia. The result? Our customer saves on travel expenses, and our consultants spend less time in airplanes.
  • Refer projects to each other. Each company has (and will continue to have) clients around the world. When we feel that a local presence would benefit the customer, we can refer the project to our alliance partners.
  • Produce webinars and other events together. I’d like for Scriptorium customers to benefit from the expertise of our partners, and we are working on joint webinars.
