Avoid Doing a “Hail Mary” Pass when Doing a DITA Implementation

Bill Hackos Presenting at the CSOFT World Summit 2011

I am in Beijing, attending a localization summit hosted by CSOFT International. I did a presentation on my company’s future development plans and how localization factors into how we reach out to our customers (DITA did get a mention, but it wasn’t the focus of the presentation). My talk was preceded by those of JoAnn and Bill Hackos, who gave the keynote addresses for the summit.

I’ve seen JoAnn in front of an audience many times, but I had never seen Bill give a presentation before, so I was particularly looking forward to it. I knew that he was in the thick of things DITA-wise, and one of his articles had set me straight as to how to go about measuring reuse effectively.

Much like JoAnn, he is a good and effective speaker, and definitely knowledgeable. The title of his talk was “Analyze the Numbers—The Impact of DITA on the Cost Structure of Information-Development Organizations”, and as that implies, it focused on how various organizations implemented DITA and how well their experiences went.

Right off the bat he emphasized that doing DITA without some sort of plan for managing its implementation is like throwing a “Hail Mary” pass in football: you can pray and hope for the best, but things go better when some planning is done first.

This comes out of a survey he helped conduct a couple of years ago, which looked at the productivity numbers of firms implementing DITA. There were just under 100 self-selected participants in the study, and while the results were largely anecdotal (few had any actual measurements to call upon), they are still interesting.

Not surprisingly, given that the data is a couple of years old, most of the respondents said they had not been using DITA for more than two years; when this survey was originally done, my own organization would have fit into this category. What I found interesting was the next statistic, which said that just under half of their organization (45%) was using DITA, with an equal percentage being “few or a limited number” (with presumably the remainder not using DITA at all). I find that interesting because it implies that many organizations have multiple tech writing groups, and that DITA adoption tends to come from one of those groups. Either that, or a core of tech writers in a single group were “trying it out”. To me this implies that DITA tends to be a grass-roots effort, driven more by the tech writers or their managers than by upper management. If so, it is a good thing that the DITA-OT is free, as the low initial cost makes it easier for groups to experiment with DITA. (This might also explain the surge in interest in DITA-based CMSes, as two years on these groups have matured and are looking for something more robust that can leverage all of the material they now have.)

The overall productivity improvements were also interesting, with just over half of those surveyed saying that they were actively reusing topics and publishing them to multiple types of deliverables, and just under half seeing the rate of collaboration between their authors increase. Given all of this, it was not surprising to see that the majority of these groups expected productivity to increase, and that a significant percentage of those who did not attributed it to not getting enough support or resources to effectively capitalize on what they had been able to do.
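For readers newer to DITA, the “multiple types of deliverables” point comes down to single-sourcing: one map acts as the master list of topics, and that same map can be rendered to HTML, PDF, online help, and so on. A minimal sketch follows; the file names are hypothetical and not anything from the survey:

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE map PUBLIC "-//OASIS//DTD DITA Map//EN" "map.dtd">
    <!-- userguide.ditamap (hypothetical name): the single source that a
         DITA publishing toolchain renders to each deliverable type -->
    <map>
      <title>User Guide</title>
      <topicref href="installing.dita"/>
      <topicref href="configuring.dita"/>
      <topicref href="troubleshooting.dita"/>
    </map>

The map itself does not change between outputs; only the transform run against it does, which is where the single-sourcing payoff comes from.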

The most intriguing thing about the survey results was that while most organizations implementing DITA ended up with significantly increased productivity, other groups saw a drop in productivity, even after two or more years of implementation, which he termed a “paradox”. You might expect all such groups implementing DITA to see a productivity increase (or, for that matter, a decrease), but not a mix of both. Even allowing for variations in how an organization is run or in its content, you would not normally see that sort of spread.

While he didn’t have a definite answer to this question (he hopes a subsequent survey may reveal what’s really going on), he seemed pretty sure that those whose DITA implementation “failed” had a number of problems going into the process, having skipped one or more of the recommended best practices, including:

  • measure relative productivity before you begin the transition to DITA (so you can measure the “before” and “after” accurately)
  • remodel your existing docs/do a document audit
  • create a strategy for topic reuse throughout your document set
  • use an Annotated Topic List to help develop your DITA maps (a rough sketch of what this kind of planned reuse looks like follows below)
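To make the reuse and map items a bit more concrete, here is a rough sketch of conref-based reuse; all file and id names are hypothetical and not taken from the presentation. A shared element is written once in a library topic and pulled into other topics by reference:

    <!-- warnings.dita (hypothetical): a library topic holding shared admonitions -->
    <topic id="warnings">
      <title>Shared warnings</title>
      <body>
        <note id="power-off" type="caution">Power off the unit before servicing.</note>
      </body>
    </topic>

    <!-- elsewhere, any topic reuses the note by reference instead of copying it -->
    <note conref="warnings.dita#warnings/power-off"/>

A reuse strategy is essentially a decision, made before conversion, about which chunks of content belong in library topics like this and which maps will pull them in; as I understand it, that is the sort of planning an Annotated Topic List is meant to capture.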

There is a definite cost to a failed DITA implementation, as depicted in the follow-up slide showing how it can lead to increased costs overall:

DITA Success vs. Failure Chart

To me this was the most important part of the presentation, underscoring the idea that a tech writing group shouldn’t plunge headlong into a DITA implementation without doing the necessary preliminary work. The goal should be not just to produce docs more efficiently and cheaply, but to produce them better as well. You may luck out with your “Hail Mary” pass, but simply doing things in DITA instead of some other toolchain doesn’t by itself improve the processes that deliver the real benefits of working in DITA.

About

"DITAWriter" is Keith Schengili-Roberts. I work for AMD as a Senior Manager for Technical Documentation, and have recently helped usher in a new company-wide DITA-based CCMS. And I like to write about DITA and the technical writing community. To get ahold of me you can email me at: keith@ditawriter.com.

2 thoughts on “Avoid Doing a “Hail Mary” Pass when Doing a DITA Implementation”

  1. I disagree with the conclusion you draw from this statistic:

    > just under half of their organization (45%) was using DITA, with an equal
    > percentage being “few or a limited number”

    DITA isn’t for every situation and every document. The overhead isn’t always justified, just like not every document should be converted (just ask DCL!). It is impressive that the statistic is this high.

    For the rest of it – I totally agree. Doing the preparation in advance will save you so much work farther down the road. This is where most of the work is. You can’t avoid doing it, but you can avoid having to repeat and fix errors created by not doing your analysis early on.
