May 6, 2024

Replatforming an early DITA implementation

Bill Swallow, Director of Operations at Scriptorium, and Emilie Herman, Director of Publishing at the Financial Accounting Foundation (FAF), shared lessons learned from a DITA implementation project. 

What did we want to accomplish with our project? One was to develop a single source of truth for our content, a single system to host all of it. Secondly, we wanted to modernize our information architecture and our content models and document all of it clearly. Lastly, we wanted to futureproof our content operations and go to a digital-first workflow.

— Emilie Herman

As FAF grew as an organization, their content operations evolved into a network of overlapping tools and processes, including:

  • Multiple content management systems, including a component content management system (CCMS)
  • Two different platforms for hosting products
  • Separate platforms for hosting websites, each with its own CMS

FAF “built around” their original print processes when opportunities arose to add new content options including digital formats, websites, and an XML feed for their stakeholders. Eventually, the add-on processes grew to an unsustainable level. After years of working with what they had, their team decided it was time for a change.

The end result was we always got the job done, but it took a lot of institutional knowledge, relying on a handful of people with deep institutional memory who could never go on vacation, and a lot of overlapping and duplicative processes.

— Emilie Herman

Before starting this DITA implementation project, FAF outlined three key goals:

  1. Develop a single source of truth (a single system) to host all content
  2. Modernize information architecture and content models, clearly documenting everything in the process
  3. Futureproof content operations and move to a digital-first workflow

Developing a single source of truth

As a non-profit organization that produces standards and rules for financial reporting, FAF’s primary asset is its content. Their use of DITA directly supports that primary asset, which is unusual: many other organizations use content only to support their products and services.

After replatforming their content operations processes into a DITA system, FAF now has a single source of truth (or single repository) for all content. From this repository, all content gets pushed out to FAF, Governmental Accounting Standards Board (GASB), and Financial Accounting Standards Board (FASB) sites. 

Modernizing information architecture and content models

Before starting this DITA implementation project, Scriptorium reviewed FAF’s existing DITA content model, which had been in use for about 15 years. The team that set up FAF’s initial DITA framework did a phenomenal job of customizing the model to fit FAF’s needs, as the DITA standard itself was limited at the time.

Over the past 15 years, the DITA standard evolved to meet those needs with its own standard elements, so FAF’s custom elements were no longer necessary, and it was time to upgrade and remove them. However, the upgrades had to be implemented without significantly changing FAF’s existing XML feeds, which still needed to remain accessible to stakeholders.

As an additional challenge, FAF supports two XML content models. Both models use similar processes but have unique needs, so they must remain separate. The second content model required converting DocBook and MS Word content. The DocBook content included multiple publications with variations in structure, numbering, and cross-referencing, and its content structures were designed for print books; with the shift to online delivery, those structures had to be adapted for online presentation and functionality.
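To illustrate the kind of restructuring this conversion involves, here is a generic sketch (not FAF’s actual markup) of how a print-oriented DocBook section maps onto a standalone DITA topic:

```xml
<!-- DocBook source (print-oriented); id and text are illustrative -->
<section xml:id="recognition">
  <title>Revenue Recognition</title>
  <para>Revenue is recognized when ...</para>
</section>

<!-- Equivalent DITA topic (online-oriented) -->
<topic id="recognition">
  <title>Revenue Recognition</title>
  <body>
    <p>Revenue is recognized when ...</p>
  </body>
</topic>
```

The mechanical element mapping is the easy part; the harder work, as described above, is deciding how print-specific structures like running numbering and book-style cross-references should behave once each topic stands alone online.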

Futureproofing content operations

FAF uses DITA to track changes to content and to trace each change back to its source, which requires a static numbering structure.

FAF publishes a large amount of content. Previously, when a single document or topic needed updating, approximately 12,000 pages had to be republished because of how the change affected archived content. If new content was added, everything referencing the live topic had to reference the archive instead, everything numerically ordered in the archive had to shift down accordingly, and the new content had to be referenced in many new places, with all cross-references updated.

Now when they do any kind of update, instead of doing a full run, they might just do a small update batch for their XML. They produce the updates as something they call “overlays,” which is essentially a small update package. You can think of it as a transparency sheet from old presentations, before we started using digital projectors. You could take a “sheet” with the updates and lay it on top of the existing content. Everything in the underlying model remains untouched and all of the new or changed content gets put into place. It’s complex because of all of the archiving that’s involved.

— Bill Swallow
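In generic terms, an overlay package of this kind might look like the following sketch. The element names, attributes, and paragraph number here are hypothetical illustrations, not FAF’s actual schema; the point is that only the changed content ships, keyed to a static number, while the base publication stays untouched:

```xml
<!-- Hypothetical overlay: supplies one replacement paragraph and      -->
<!-- archives the superseded version, leaving the base model in place. -->
<overlay base="codification-2024" effective-date="2024-05-06">
  <replace target="para-605-10-25-1">
    <p>An entity shall recognize revenue when ...</p>
  </replace>
  <archive target="para-605-10-25-1" as="superseded/para-605-10-25-1"/>
</overlay>
```

Because every paragraph keeps its static number, the overlay can be applied without shifting the numbering of the surrounding archived content, which is what made the old 12,000-page republishing runs unnecessary.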

Lastly, FAF needs to be able to show the historical view of all their content. Content can be deprecated and no longer effective, but it can’t be removed without a document announcing the change.

Unexpected challenges during the project

As with all change, this enterprise-altering project encountered obstacles during its implementation.

  • Moving to a DITA-based workflow. Deciding where in the process to move to DITA was a key decision for the project team. Tackling this and other change-management issues was tough, but supporting the team through training, communication, and documentation made the effort worthwhile.
  • Content migration. Migration was an all-hands effort among the migration partner, Scriptorium, the implementation vendor, and the FAF team. It required several iterations, as small adjustments had to be made to ensure the conversion was correct.

Ultimately, though this was a challenging project, FAF’s content team was set up for success with futureproof content operations. 

We built an end-to-end process where we are able to produce both the document and an update to our codification from a single source of content. That was a big and exciting win. Our key takeaway was that this has to start with knowing your organization and what makes you unique. That way, you can be very clear with your team about your scope and protect it, which is very hard to do on a long-term project like this. It’s a technology project obviously, but a lot of it is a very human process and it’s as good as the people you get in the room and the collaboration and forward thinking that you get from the team.

— Emilie Herman