
The elephant in the room—publishers and e-books

Two years ago, Nate Anderson wrote this on Ars Technica:

The book business, though far older than the recorded music business, is still lucky enough to have time on its side: no e-book reader currently offers a better reading experience than paper.

That’s what makes Apple’s iPad announcement so important. Books will now face stiff competition from e-books as the e-book experience improves.

Elephant in the room // flickr: mobilestreetlife

Meanwhile, the publishing industry (with the notable exception of O’Reilly Media) is desperately trying to avoid the inevitable. (For a slightly happier take, see BusinessWeek.)

Publishers are supposed to filter, edit, produce, distribute, and market content. Pre-Internet, all of these things were difficult and required significant financial resources. Today, many are easy and all are cheap.

There’s only one other thing.

Content.

But the revenue split between publishers and authors does not—yet—reflect the division of labor. The business relationships are still built on the idea that authors can’t exist without publishers. In fact, it’s the reverse that’s true.

Only the big publishers can get your book into every bookstore in the country. However, I’ve got news for you: Unless your name is on an elite shortlist with the likes of Dan Brown, John Grisham, Nora Roberts, and J.K. Rowling, it probably doesn’t matter.

If you know your audience, you can reach them at least as well as a big publisher can. And you need to reach a lot fewer people to succeed as an independent. The general rule of thumb is a 10-to-1 ratio. You’ll make the same amount selling 10,000 books through a traditional publisher as 1,000 books on your own.
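
To see the arithmetic behind that rule of thumb, plug in some illustrative numbers (my assumptions, not industry figures): a $20 cover price, a typical 10% royalty from a traditional publisher, and close to the full cover price when you sell direct.

    10,000 copies × $2 royalty per copy = $20,000 (traditional publisher)
     1,000 copies × $20 margin per copy = $20,000 (independent)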

It’s not so difficult to hire freelancers (especially in this economy) to edit and produce your book, if that’s not your cup of tea. Distribution is doable—Amazon is easy, bookstores a little more challenging. This is where e-books will accelerate the change—the challenges of shelf space and returns simply disappear.

And even if you have a publisher, they will expect you to do most of the marketing.

So, what will successful publishers look like in 2020?

  • They will provide editorial and production support for writers who do not want to deal with technical issues.
  • They will support authors in marketing by helping them with blogging platforms and other social media efforts.
  • They will get a much smaller cut of revenues than they currently do.

Actually, that looks a lot like Lulu.


    XML: The death of creativity in technical writing?

    Originally published in STC Intercom, February 2010.

    I spend a lot of time giving presentations on XML, structured authoring, and related technologies. The most common negative reaction, varied only in the level of hostility, is “Why are you stifling my creativity?”

    Does XML really mean the Death of Creativity for technical communicators? And does creativity even belong in technical content?


    ePub + tech pub = ?

    At Scriptorium earlier this week, we all watched live blogs of the iPad announcement. (What else would you expect from a bunch of techies?) One feature of the iPad that really got us talking (and thinking) is its support of the ePub open standard for ebooks.

    ePub is basically a collection of XHTML files zipped up with some baggage files. Considering a lot of technical documentation groups create HTML output as a deliverable, it’s likely not a huge step further to create an ePub version of the content. There is a transform for DocBook to ePub; there is a similar effort underway for DITA. You can also save InDesign files to ePub.
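
    As a concrete illustration of how little "baggage" there is: an ePub is a zip archive with a fixed layout. A minimal package looks something like this (the OEBPS directory name is conventional, not required):

        mimetype                 the literal string "application/epub+zip", stored uncompressed
        META-INF/container.xml   points to the package file
        OEBPS/content.opf        metadata, manifest, and reading order
        OEBPS/toc.ncx            navigation map
        OEBPS/chapter1.xhtml     the XHTML content you already produce

    The container.xml itself is boilerplate:

        <?xml version="1.0"?>
        <container version="1.0"
            xmlns="urn:oasis:names:tc:opendocument:xmlns:container">
          <rootfiles>
            <rootfile full-path="OEBPS/content.opf"
                media-type="application/oebps-package+xml"/>
          </rootfiles>
        </container>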

    While the paths to creating an ePub version seem pretty straightforward, does it make sense to release technical content as an ebook? I think a lot of the same reasons for releasing online content apply (less tree death, no printing costs, and interactivity, in particular), but there are other issues to consider, too: audience, how quickly ebook readers and software become widespread, how the features and benefits of the format stack up against those of PDF files and browser-based help, and so on.

    There's also the issue of actually leveraging the features of an output instead of merely doing the minimum of releasing text and images in that format. In the PDF version of a user manual, have you ever clicked an entry in the table of contents only to discover the TOC has no links? When that happens, I assume the company that released the content was more interested in using the format to offload the printing costs onto me and less interested in using PDF as a way to make my life easier.

    The technology supporting ebooks will continue to evolve, and there likely will be a battle to see which ebook file format(s) will reign supreme. (I suspect Apple’s choice of the ePub format will raise that format’s prospects.) While the file formats get shaken out and ebooks continue to emerge as a way to disseminate content, technical communicators would be wise to determine how the format could fit into their strategies for getting information to end users.

    What considerations come to your mind when evaluating the possibility of releasing your content in ePub (or other ebook) format?


    White paper on whitespace (and removing it)

    When I first started importing DITA and other XML files into structured FrameMaker, I was surprised by the excessive whitespace that appeared in the files. Even more surprising (in FrameMaker 8.0) were the red comments displayed via the EDD that said that some whitespace was invalid (these no longer appear in FrameMaker 9).

    The whitespace was visible because of an odd decision by Adobe to handle all XML whitespace as if it were significant. (XML divides the world into significant and insignificant whitespace; most XML tools treat whitespace as insignificant except where necessary…think <codeblock> elements.) This approach to whitespace exists in both FrameMaker and InDesign.

    At first I handled the whitespace on a case-by-case basis, removing it by hand or through regular expressions. Eventually, I realized this was a more serious problem and created an XSL transform to eliminate the whitespace as part of preprocessing. Because the transform uses XSL that Xalan accepts (not that hard), it can be integrated into a FrameMaker structured application.
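
    The core of such a transform is small. Here is a minimal sketch of the technique (my illustration, not the actual stylesheet from the Scriptorium site): an identity transform plus one template that drops whitespace-only text nodes, except where whitespace is significant, as in <codeblock> elements.

        <xsl:stylesheet version="1.0"
            xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
          <xsl:output method="xml" indent="no"/>

          <!-- Identity template: copy everything through unchanged. -->
          <xsl:template match="@*|node()">
            <xsl:copy>
              <xsl:apply-templates select="@*|node()"/>
            </xsl:copy>
          </xsl:template>

          <!-- Drop text nodes that are nothing but whitespace, except
               inside elements where whitespace is significant. -->
          <xsl:template match="text()[not(normalize-space())
              and not(ancestor::codeblock)]"/>
        </xsl:stylesheet>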

    I figured this whitespace problem must be affecting (and frustrating) more than a few of you out there, so I made the stylesheet available on the Scriptorium web site. I also wrote a white paper, “Removing XML whitespace in structured FrameMaker documents,” that describes the XSL that went into the stylesheet and how to integrate it with your FrameMaker structured applications.

    The white paper is available on the Scriptorium web site. Information about how to download the stylesheet is in the white paper.

    If the stylesheet and white paper are useful to you, let us know!


    Unedited content will get you deleted

    flickr: Nics events

    The abundance of information today forces content consumers to filter out redundant and unworthy information—much like an editor would. That, however, doesn’t mean content creators can throw up their hands and send out unreviewed content for readers to sort through. Instead, authors (and particularly their managers) need to understand how editing skills can ensure their information doesn’t get filtered out:

    [A]re we getting any better at editing in a broader context, which is editing ourselves? Or to rephrase it, becoming a better critic of our own work? Penelope Trunk (again) lists the reasons why she works with an editor for whatever she writes in public:

    • Start strong – cut boring introduction
    • Be short – and be brave
    • Have a genuine connection – write stuff that matters to the readers
    • Be passionate – write stuff that matters to you
    • Have one good piece of research – back your idea up

    They have one thing in common: difficult to do on our own.

    Granted, some of those bullet points don’t completely apply to technical writing, but it is hard to edit your own work, regardless of the kind of content. For that very reason, folks at Scriptorium get someone else to review their writing. Whether the content is in a proposal, book, white paper, important email to a client, or a blog post, we understand that somebody else’s feedback is generally going to make that information better.

    The same is true of technical content. Many documentation departments no longer hire dedicated editors, so peer reviewers handle editing tasks instead. Electronic review tools also make it easier than ever to offer feedback: even a quick online review of content by another writer will likely catch some potentially embarrassing typos and yield suggestions to make information more accessible to the end user. (You can read more about the importance of editing in a PDF excerpt from the latest edition of Technical Writing 101.)

    With so much competing information out on the Internet, companies can’t afford to have their official documentation ignored because it contains technical errors, misspellings, and other problems that damage the content’s credibility. Even if you don’t have the time or budget for a full-blown edit, take just a little time to have someone do a quick technical review of your work. Otherwise, end users seeking information about your product will likely do their own editing—in their minds, they’ll delete you as a source of reliable information. And that’s a deletion that’s hard to STET.

    PS: Software that checks spelling and grammar is helpful, but it’s not enough: it won’t point out technical inaccuracies.


    Behold, the power of free

    Lately, our webcasts are getting great participation. The December event had 100 people in attendance (the registered number was even higher), and the numbers for the next few months are strong, as well. Previous webcasts had attendance of A Lot Less than 100. What changed? The webcasts are now free. (Missing an event? Check our archives.)

    We’re going in a similar direction with white papers. We charge for some content, but we also offer a ton of free information.

    The idea is that free (and high-quality) information raises our profile and therefore later brings in new projects. I’m not so sure, though, that we have any evidence that supports this theory yet.

    So, I thought I’d ask my readers. Do you evaluate potential vendors based on offerings such as webcasts and white papers? Are there other, more important factors?

    PS: Upcoming events, including several DITA webcasts, are listed on our events page.


    2010 predictions for technical communication

    It’s time for my (apparently biennial) predictions post. For those of you keeping score at home, you can see the last round of predictions here. Executive summary: no clear leader for DITA editing, reuse analyzers, Web 2.0 integration, global business, Flash. In retrospect, I didn’t exactly stick my neck out on any of those. Let’s see if I can do better this year.

    Desktop authoring begins to fade

    Everyone else is talking about the cloud, but what about tech comm? Many content creation efforts will shift into the cloud and away from desktop applications and their monstrous footprints (I’m looking at you, Adobe). When your content lives in the cloud, you can edit from anywhere and be much less dependent on a specific computer loaded with specific applications.

    I expect to see much more content creation migrate into web applications, such as wiki software and blogging software. I do not, at this point, see much potential for the various “online word processors,” such as Buzzword or Zoho Writer, for tech comm. Creating documents longer than four or five pages in these environments is painful.

    In the ideal universe, I’d like to see more support for DITA and/or XML in these tools, but I’m not holding my breath for this in 2010.

    The ends justify the means

    From what we are seeing, the rate of XML adoption is steady or even accelerating. But the rationale for XML is shifting. In the past, the benefits of structured authoring—consistency, template enforcement, and content reuse—have been the primary drivers. But in several newer projects, XML is a means to an end rather than a goal—our customers want to extract information from databases, or transfer information between two otherwise incompatible applications. The project justifications reach beyond the issues of content quality and instead focus on integrating content from multiple information sources.

    Social-ism

    Is the hype about social media overblown? Actually, I don’t think so. I did a webcast (YouTube link) on this topic in December 2009. The short version: Technical communicators must now compete with information being generated by the user community. This requires greater transparency and better content.

    My prediction is that a strategy for integrating social media and official tech comm will be critical in 2010 and beyond.

    Collaboration

    The days of the hermit tech writer are numbered. Close collaboration with product experts, the user community, and others will become the norm. This requires tools that are accessible to non-specialists and that offer easy ways to manage input from collaborators.

    Language shifts

    There are a couple of interesting changes in language:

    • Content strategy rather than documentation plan
    • Decision engine (such as Hunch, Wolfram Alpha, and Aardvark) rather than search engine

    What are your predictions for 2010?



    Handling XSL:FO’s memory issue with large page counts

    Formatting Object (FO) processors (FOP, in particular) often fail with memory errors when processing very large documents for PDF output. Typically in XSL:FO, the body of a document is contained in a single fo:page-sequence element. When FO documents are converted to PDF output, the FO processor holds an entire fo:page-sequence in memory to perform pagination adjustments over the span of the sequence. Very large page counts can result in memory overflows or Java heap space errors.
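
    One common workaround, sketched below with illustrative master and content names, is to emit one fo:page-sequence per chapter rather than a single sequence for the entire body. The processor can then render and release each sequence in turn, so memory use tracks the largest chapter instead of the whole book:

        <fo:root xmlns:fo="http://www.w3.org/1999/XSL/Format">
          <fo:layout-master-set>
            <fo:simple-page-master master-name="body-master"
                page-width="8.5in" page-height="11in">
              <fo:region-body margin="1in"/>
            </fo:simple-page-master>
          </fo:layout-master-set>
          <!-- One page-sequence per chapter, not one for the whole body. -->
          <fo:page-sequence master-reference="body-master">
            <fo:flow flow-name="xsl-region-body">
              <fo:block>Chapter 1 content...</fo:block>
            </fo:flow>
          </fo:page-sequence>
          <fo:page-sequence master-reference="body-master">
            <fo:flow flow-name="xsl-region-body">
              <fo:block>Chapter 2 content...</fo:block>
            </fo:flow>
          </fo:page-sequence>
        </fo:root>

    Page numbering continues automatically from one sequence to the next; the trade-off is that the processor can no longer make pagination adjustments across chapter boundaries.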


    Friend or foe? Web 2.0 in technical communication

    The rise of Web 2.0 technology provides a platform for user-generated content. Publishing is no longer restricted to a few technical writers—any user can now contribute information. But the information coming from users tends to be highly specific, whereas technical documentation is comprehensive but less specific. The two types of information can coexist and improve the overall user experience.


    Managing implementation of structured authoring


    An updated version of this white paper is in Content Strategy 101. Read the entire book free online, or download the free EPUB edition.

    Moving a desktop publishing–based workgroup into structured authoring requires authors to master new concepts, such as hierarchical content organization, information chunking with elements, and metadata labeling with attributes. In addition to these technical challenges, the implementation itself presents significant difficulties. This paper describes Scriptorium Publishing’s methodology for implementing structured authoring environments. This document is intended primarily as a roadmap for our clients, but it could be used as a starting point for any implementation.
