As content distribution formats proliferate, production executives tackle digital archiving
In a world of rapidly evolving methods for distributing content online—RSS, Facebook, e-newsletters, webinars, mobile alerts, etc.—the pressure is building on media production and manufacturing executives to ensure that nearly every bit of content is being archived.
Of course, it can be difficult to archive everything. “You need to focus on your business requirements first and foremost,” said John Blanchard, VP-manufacturing at Reed Business Information. “Ideally, we'd like all content in all formats available to all participants in the workflow—both internal and external. Experience tells us that's not feasible.”
Cost, legacy systems and differing processes from unit to unit are some of the reasons Reed can't archive it all, Blanchard said. So the company mapped out a strategy in which the production department knows the types of content it is working with as well as who needs it and when.
At IDG, a large percentage of editorial content—stories, slide shows, video and podcasts—is initially published on the Internet. “Our content is divorced from design/layout elements and lives in a database,” said Stephan Scherzer, exec VP-general manager of PC World and Macworld Online. All production teams—online and print—have access to the database, allowing them to republish the content.
“As long as content appears as data and it is stored appropriately, the channel where you publish the content doesn't matter,” Scherzer said.
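Scherzer's point—that the publishing channel stops mattering once content is stored as data rather than as laid-out pages—can be sketched in a few lines. This is an illustrative model only; the record fields, channel names and markup here are assumptions, not IDG's actual system.

```python
from dataclasses import dataclass, field

@dataclass
class ContentItem:
    """A channel-neutral content record: data and metadata, no layout."""
    slug: str
    headline: str
    body: str
    tags: list = field(default_factory=list)

def render(item: ContentItem, channel: str) -> str:
    """Apply channel-specific formatting only at publish time."""
    if channel == "web":
        return f"<article><h1>{item.headline}</h1><p>{item.body}</p></article>"
    if channel == "rss":
        return (f"<item><title>{item.headline}</title>"
                f"<description>{item.body}</description></item>")
    raise ValueError(f"unknown channel: {channel}")

story = ContentItem("dam-rollout", "Publishers embrace DAM",
                    "Archiving goes media-neutral.", ["archiving"])
print(render(story, "rss"))
```

The same stored record feeds the Web template, the RSS feed or, in principle, a print export—the design choice is that formatting lives in the renderer, never in the archive.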
Jeremy Carlson, manager of digital prepress, digital imaging and media operations at Advanstar Communications, said the biggest challenge to archiving digital content is “finding a solution that's dynamic enough to be easily integrated with both a publisher's print and Web editorial/content systems.” He said that having a digital asset management (DAM) environment tied in with archives is an important step toward keeping the archives media-neutral.
“Ideally, this archive/DAM could act as a media gateway and be integrated with a publisher's print editorial system and Web CMS [content management system] so that the appropriate content can be exported with a click of a button or automated trigger to the format and channel desired,” Carlson said.
Using metatags on all content is crucial to a good archive system. “An archive/DAM system should also have a database that can read and embed XMP metadata for rights management, keyword queries, usage and description/captioning,” Carlson said.
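The kinds of fields Carlson describes—rights management, keywords, description/captioning—can be illustrated with a small metadata index. The field names below loosely follow the Dublin Core properties used inside XMP (`dc:rights`, `dc:subject`, `dc:description`), but this is a hypothetical sketch of a keyword query, not a real XMP reader or writer.

```python
# Hypothetical asset metadata, keyed by file name. Field names are modeled
# on XMP's Dublin Core schema for illustration only.
assets = {
    "photo-0413.jpg": {
        "dc:rights": "Licensed, editorial use only",
        "dc:subject": ["server", "data center"],
        "dc:description": "Rack-mounted servers in a colocation facility",
    },
    "chart-q2.png": {
        "dc:rights": "Staff-produced",
        "dc:subject": ["revenue", "chart"],
        "dc:description": "Q2 ad revenue by channel",
    },
}

def keyword_query(keyword: str) -> list:
    """Return the IDs of assets whose keyword list contains the term."""
    return [name for name, meta in assets.items()
            if keyword in meta.get("dc:subject", [])]

print(keyword_query("chart"))  # → ['chart-q2.png']
```

In a production DAM, the same metadata would be embedded in the asset file itself (via XMP) so that rights and captions travel with the image wherever it is exported.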
IDG is careful about what third-party data to archive. If it is identified as being important enough, “it is entered into our database, categorized and tagged so that it can in turn be connected with related stories produced in-house,” Scherzer said. “That way, third-party content can add tremendous value to our own content.”
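One simple way to connect tagged third-party content with related in-house stories, as Scherzer describes, is to rank stories by how many tags they share with the incoming item. The catalog and tag names below are invented for illustration; IDG's actual matching logic is not described in the article.

```python
def related(item_tags: list, catalog: dict) -> list:
    """Rank in-house story slugs by the number of tags they share
    with a third-party item; drop stories with no overlap."""
    scored = [(len(set(item_tags) & set(tags)), slug)
              for slug, tags in catalog.items()]
    return [slug for score, slug in sorted(scored, reverse=True) if score > 0]

# Hypothetical in-house catalog: slug -> tags.
catalog = {
    "dam-basics": ["archiving", "dam"],
    "xmp-primer": ["metadata", "xmp"],
    "q2-outlook": ["advertising"],
}

third_party_tags = ["dam", "metadata", "archiving"]
print(related(third_party_tags, catalog))  # → ['dam-basics', 'xmp-primer']
```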
Problems arise when publishers make articles central elements and build Web sites and production structures around them, Scherzer said. “You need to have a data-driven and data-centric model to be competitive with the large number of Web-only competitors,” he said. “All of them are very data-driven, and can aggregate and organize the content in a very flexible way.”
Scherzer said it's important to distinguish content with a short life span—news, product first looks, most blog posts, etc.—from content with a long life span. “From where we stand, the purpose of an archive is to enhance the shelf life, and thus the value, of the content we produce,” he said, which means divorcing content from its format and keeping it in a database in an unformatted way as well as tagging, describing and categorizing content so that it can be reused in many different ways.
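That shelf-life distinction could drive a simple routing rule: long-lived content gets the full tag/describe/categorize treatment, while short-lived items get a lighter pass. The content-type taxonomy and policy names here are hypothetical, not drawn from IDG's workflow.

```python
# Hypothetical taxonomy of short-lived content types.
SHORT_LIVED_TYPES = {"news", "first-look", "blog-post"}

def archive_policy(content_type: str) -> str:
    """Route content by shelf life: short-lived items get a lightweight
    pass; everything else gets full tagging and categorization."""
    if content_type in SHORT_LIVED_TYPES:
        return "light-archive"
    return "full-archive"

print(archive_policy("news"))     # → light-archive
print(archive_policy("feature"))  # → full-archive
```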
Carlson recommended having an archive/DAM system with a Web-based interface so that outside offices, external licensees and freelancers can access the information. He also suggested using a third-party integrator that has experience with archiving.