Webinar

Webcast: The technology is the easy part! Leading through change

Change is constant in technical communication. Whether dealing with new technology, shifts in organizational structures, or growing business requirements, content creators must be able to adapt. In this webcast recording, a panel of content experts—Jack Molisani of The LavaCon Conference and ProSpring Staffing, Erin Vang of Dolby Laboratories, Sarah O’Keefe of Scriptorium, and moderator Toni Mantych of ADP—answer questions and give advice about dealing with change in the industry.

Opinion

The good manager litmus test: process change

For Kai Weber, a good manager is pivotal in making a job satisfying:

It’s the single most important factor in my satisfaction with a job. Nothing else shapes my memory and my judgment of a past job as much.

What really tests the mettle of a manager is how he or she handles process change. A good manager is absolutely critical when a documentation department switches to new authoring and publishing processes, particularly when moving from a desktop publishing environment to an XML-based one. Without good management, the implementation of new processes will likely fail. (I’ve seen bad management kill an implementation, and it’s ugly.)

So, what does a good manager do to ensure a smooth(er) transition? From my point of view, they will take the following actions (and this list is by no means all-encompassing):

  • Demonstrate the value of the change to both upper management and those in the trenches. A manager can often get the approval from upper management on a workflow change by showing cost savings in localization expenses, for example; (less) money talks to those higher up on the corporate chain. But mentions of reduced costs don’t usually warm the hearts of those who are doing the work. A good manager should show team members how the new process eliminates manual drudgery that everyone hates, explain how authors can focus more on writing good content instead of on secondary tasks (such as formatting), and so on. Demonstrating how the team’s work experience improves is more important than showing improvements in the bottom line—even though the cost savings are a result of those very changes. There is also the angle of professional development for a staff moving to a new environment; more on that in the next bullet.
  • Ensure those working in the new process understand the new tools and technologies by offering training/knowledge transfer. A good manager knows that changing processes and not including some sort of training as part of the transition is foolish; knowledge transfer should be part of the project cost. Sure, not all companies can afford formal classroom training, but there are less expensive options to consider. Web-based training is very cost effective, particularly when team members are geographically dispersed. Another option is training one or two team members and then having them share their expertise with the rest of the group (“train the trainer”). The benefits of knowledge transfer are two-fold: team members can ramp up on the new processes in less time (thereby more quickly achieving the cost savings that upper management likes so much), and the team members themselves gain new skills in their profession. A good manager recognizes that training benefits both the company as a whole and individual employees (and he or she can help team members recognize how they benefit in the long term professionally from learning new skills).
  • Know the difference between staff members who are bringing up legitimate issues with the new workflow and those who are being recalcitrant just to maintain the status quo. During periods of change, a manager will get pushback from staff. That’s a given. However, that pushback is a very healthy thing because it can point out deficiencies in the new process. A good manager will take feedback, consider it, and modify the process when there are weaknesses. Addressing genuine feedback in such a manner can also help a manager win “converts” to the new process.  However, there may be an employee (or two) who won’t be receptive to change, regardless of how well the manager has explained the change, how much training is offered, and so on. In these cases, the manager may need to consider other assignments for that employee: for example, maintaining legacy documentation in the old system, particularly when that employee’s domain knowledge is too important to lose. There are more unpleasant options (including termination) the manager may need to consider if the recalcitrant team member isn’t providing other value to the organization as a whole. Basically, a good manager won’t let one individual poison the working environment for everyone else.

I will be the first to say that these tasks are not easy, particularly dealing with an employee who is utterly against change. But managers need to address all of the preceding issues to ensure a smooth transition and to keep the work environment positive and productive for the staff as a whole.

I won’t even pretend I have covered all the issues managers need to address when a department changes workflows, and each company will face its own particular challenges because of differences in corporate culture, and so on. If you’ve been through a workflow transition, please share your experiences in the comments: I’d love to hear from both managers and team members on what worked well (and what didn’t) in how management handled the changes.

PS: You can read a more detailed discussion about managing process change in our white paper, Managing implementation of structured authoring (HTML and PDF).

Conferences

Tools of Change for Publishing/Norwegian Monks!

As part of a brief history of publishing in the opening keynote, I’ve already seen a few friends:

  • The Norwegian Monks video — Technical support for books
  • A reference to Vannevar Bush’s “As We May Think” article from 1945

According to Tim O’Reilly, Microsoft Encarta “fatally wounded” the Encyclopædia Britannica because of “asymmetric competition.”

A series of short, related keynotes kicks off the conference. I like this approach; in a nontechnical, high-level keynote, it can be difficult to fill a 60- or 90-minute slot.

Brian Murray, HarperCollins, Retooling HarperCollins for the Future
Consumer publishing *was* straightforward. All promotion was designed to drive traffic to a retailer.

In 2005, “the earth moved.” There were search wars, community sites, user-generated content, Web 2.0. Newspapers and magazines responded with premium, branded sites online based on advertising or subscription models.

Book publishers are confused. Search engines treat digitized book content like “free” content. Rights and permissions are unclear. Books are not online — except illegally! Book archives are not digitized.

Before 2004, “book search” took place in a book store.

  • What is the role of the publisher in a digital world?
  • What is the right digital strategy?
  • What are the right capabilities?
  • “Search” provides new opportunities for publishers.
  • Publishers must transition from paper to digital.
  • How can publishers create value and not destroy it?

Some statistics:

  • 65M in the U.S. read more than 6 books a year.
  • 10M read more than 50 books a year. [ed.: waves]
  • Younger consumers read less; they spend more time online.

Search is used more often than email.

HarperCollins decided to focus on connecting with customers, rather than e-commerce. Amazon and others already do e-commerce. They focused on the idea of a “digital warehouse” that is analogous to the existing physical warehouse. They want to:

  • promote and market to the digital consumer
  • use digitized books to create a new publishing/distribution chain
  • protect authors’ copyright
  • “replicate in digital world what we do in physical world”

The initiative got publicity and a strong public response, but there was no single vendor who could deliver a turnkey solution.

Improvements from digital production and workflow could fund some or all of the digital warehouse investment. Projects that were low priority “IT and production” projects become high priority. Savings were realized in typesetting/design costs, digital workflow, and digital asset management.

The digital warehouse now has 12,000 titles. (Looks as though they were scanned, which doesn’t meet *my* definition of “digital content.”)

At this point in the presentation, we began to hear a lot about “control.” Control of content, controlling distribution, and so on.

HarperCollins does not want others to replicate their 9-billion-page archive in multiple locations. They want others to link into their digital warehouse. But if storage is cheap and getting cheaper, what’s in it for, say, Google?

Strategic issues for book publishers

  • Should publishers digitize, organize, and own the exclusive digital copy of their book content?
  • Should publishers control the consumer experience on the web?
  • If the cost of the first two is zero, should every publisher do both? Would they?
  • How do publishers make money?

The focus on controlling content was interesting and perhaps not unexpected. The business case based on savings in digital production was also interesting.

Conferences

XML 2006: Content Management APIs

How Google and wireless access have changed the world: I’m sitting in this session, and the presenter’s approach isn’t working for me. So, I google JSR 170 and find this article at CMS Watch that explains it quite nicely.

Having skimmed that, I return my attention to the presenter, and find that he’s making a lot more sense.

The CMS Watch article has an excellent definition of JSR 170:

JSR-170 promises the Java world, and possibly beyond, a unified API that allows accessing any compliant repository in a vendor- or implementation-neutral fashion, leading to the kind of clean separation of concerns that characterizes modern IT architectures. Some people call JSR-170 the “JDBC [Java Database Connectivity] of Content Repositories.”

Now, we have Michael Wechner presenting on what is theoretically the same topic. Only not. He leads with this: “Today, every CMS is producing its own user interface, which is just kind of silly.” And then this analogy: mail servers are standardized, but you’re free to use your own client/front end. Similarly, CMSes need a common backend and you can do whatever on the front end.
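The “common backend, any frontend” idea is easy to sketch. The toy Python below is not JSR-170 (which is a Java API under javax.jcr); the class and method names here are invented purely to illustrate the separation of concerns the spec is after — a frontend written against a neutral interface works with any compliant backend.

```python
from abc import ABC, abstractmethod


class ContentRepository(ABC):
    """Hypothetical vendor-neutral repository interface (in the spirit of JSR-170)."""

    @abstractmethod
    def read(self, path: str) -> str: ...

    @abstractmethod
    def write(self, path: str, content: str) -> None: ...


class InMemoryRepository(ContentRepository):
    """One vendor's backend: a plain dict. Another vendor might use a database."""

    def __init__(self):
        self._store = {}

    def read(self, path: str) -> str:
        return self._store[path]

    def write(self, path: str, content: str) -> None:
        self._store[path] = content


def publish_page(repo: ContentRepository, path: str, body: str) -> str:
    """A 'frontend' that works against any compliant backend, unchanged."""
    repo.write(path, body)
    return repo.read(path)


repo = InMemoryRepository()
print(publish_page(repo, "/articles/xml2006", "Content Management APIs"))
```

Swapping `InMemoryRepository` for any other implementation of the interface leaves `publish_page` untouched — the same property the “JDBC of Content Repositories” nickname is pointing at.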

I feel smarter already.

Wechner’s company, Wyona, is an integrator for open source CMS.

He points out that the ability to work offline is important because people aren’t always online. He uses the example of a train ride in Europe — the obvious equivalent in the United States is airplanes. (Side note: If people are permitted to yap on their cell phones in-flight, I’m probably going to stop traveling altogether. It’s bad enough on the ground at the gate.)

OK, and I think he’s proposing that you use existing protocols, such as Atom and WebDAV, to do CMS connections.
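Using an existing protocol means the CMS connection is just HTTP plus a little XML. As a sketch: the `DAV:` namespace and the `propfind`/`prop` element names come from the WebDAV spec (RFC 4918); the helper function itself is made up for illustration, using only Python’s standard library.

```python
import xml.etree.ElementTree as ET

DAV_NS = "DAV:"  # the WebDAV XML namespace defined in RFC 4918


def propfind_body(properties):
    """Build the XML body of a WebDAV PROPFIND request for the given properties."""
    ET.register_namespace("D", DAV_NS)
    root = ET.Element(f"{{{DAV_NS}}}propfind")
    prop = ET.SubElement(root, f"{{{DAV_NS}}}prop")
    for name in properties:
        ET.SubElement(prop, f"{{{DAV_NS}}}{name}")
    return ET.tostring(root, encoding="unicode")


# Ask a WebDAV-capable repository for two standard properties.
body = propfind_body(["displayname", "getlastmodified"])
print(body)
```

Any WebDAV-speaking client or CMS frontend could send that body in a PROPFIND request — no vendor-specific API required, which is exactly the argument for building on existing protocols.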

Content operations, Podcast, Podcast transcript, Structured content

Tips for moving from unstructured to structured content with Dipo Ajose-Coker

In episode 159 of The Content Strategy Experts Podcast, Bill Swallow and special guest Dipo Ajose-Coker share tips for moving from unstructured to structured content.

“I mentioned it before: invest in training. It’s very important that your team knows first of all not just the tool, but also the concepts behind the tool. The concept of structured content creation, leaving ownership behind, and all of those things that we’ve referred to earlier on. You’ve got to invest in that kind of training. It’s not just a one-off, you want to keep it going. Let them attend conferences or webinars, and things like that, because those are all instructive, and those are all things that will give good practice.”

— Dipo Ajose-Coker
