“We knew some people were doing testing, but there was no business ownership of testing as a concept. There was no governance or process to ensure syndication [of content] so we were getting the best bang for the buck.” In July, the company introduced the SAP.com Test Lab under the direction of the global search team that Sheridan leads. “[Search] being the highest traffic driver to SAP.com, it made sense for us,” Sheridan said. The team comprises a full-time director and individuals from SAP's marketing and Web analytics teams.

SAP uses a crowdsourcing model to solicit ideas about what to test, from the bottom up to the top down, Sheridan said. “We take an idea [such as an offer] and look at all the things that lead up to a conversion—the process, Web pages, design, user experience, content, offer and actual registration process,” he said. “Then we do a blend of A/B and multivariate testing, and image testing [such as putting an image next to the offer], to see where we get the most significant lift.” Sheridan added: “We've learned that simplicity is the key—easily scanned, brief content, appropriate imagery and a value-based call to action.”

For example, after simplifying a registration form for a particular call to action, SAP.com saw a 96% increase in the number of registrations. However, there were trade-offs associated with that initial lift in registrations, Sheridan said. “We have to look at what is the downside to SAP in removing a field, such as taking away 'industry.' We have industry-specific sales teams, so for each field we take away to get an increased conversion, there is a downside—now we have to do more work on the back end [to learn this information].”

So far, the test lab has helped improve performance on a variety of actions, with 85% of the tests resulting in some lift—from 2% to more than 100%, Sheridan said.
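The lift comparison described here can be sketched in code. The traffic and registration figures below are hypothetical, and the two-proportion z-test is a standard way to check significance, not necessarily SAP's actual method:

```python
import math

def lift_and_zscore(conv_a, n_a, conv_b, n_b):
    """Compare a control form (A) against a simplified variant (B).

    Returns the relative lift of B over A and a two-proportion
    z-score indicating whether the difference is significant.
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    lift = (p_b - p_a) / p_a  # relative lift, e.g. 0.96 == 96%
    # Pooled standard error for the difference of two proportions
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return lift, z

# Hypothetical numbers: 250 registrations from 10,000 visits on the
# long form vs. 490 from 10,000 on the simplified form (~96% lift).
lift, z = lift_and_zscore(250, 10_000, 490, 10_000)
print(f"lift = {lift:.0%}, z = {z:.1f}")  # a |z| above ~1.96 is significant at 95%
```

A winning variant would only be rolled out once the z-score clears the chosen significance threshold, which is why sample size matters as much as the headline lift number.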
“The goal is to syndicate insights across the organization so we don't have to reinvent the wheel,” he said.

Hewlett-Packard Co. is another company that makes testing a priority. “We are all about doing rigorous testing,” said Scott Anderson, VP-customer communications at HP Enterprise Group. “On the digital side, everything is connected. So not only do we test digital content, but there is a requirement to amplify the message.”

Anderson said the first thing HP does is set up “listening posts” to monitor conversations about the company taking place in social media networks, in user communities and broadly across the Web. It uses social media monitoring tools such as Radian6 to discern discussion topics, analyze sentiment about HP and find the top keywords and other topics of interest being discussed by the company's target audience of business and technology professionals. “One objective is to increase the share of conversation,” Anderson said. “One way we do this is by 'activating' the HP community. For example, if we have an event, we work with our customers to broadcast the event [such as on Twitter], and we have an ongoing testing process to see how we can keep the volume up and increase the share of conversations.”

The next level of testing is at the campaign level, during which HP's marketing group conducts A/B testing on headlines, graphics, offers and other content related to digital campaigns. “We always do rigorous testing on campaigns,” Anderson said. HP uses a marketing automation program from Eloqua to test offers to customers. “We can become more relevant based on their answers,” he said. For example, if a customer's warranty on an HP machine has expired, that triggers an automated email to the customer with a choice of service options.
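The warranty trigger described here might be sketched as follows. The names, payload shape and service options are illustrative assumptions and do not reflect Eloqua's actual API:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Customer:
    email: str
    warranty_expires: date

# Hypothetical service options; in practice each offer and headline
# would itself be a candidate for A/B testing.
SERVICE_OPTIONS = ["Extend warranty", "Pay-per-incident support", "Trade-in offer"]

def build_triggered_email(customer: Customer, today: date) -> Optional[dict]:
    """Return an email payload if the customer's warranty has expired."""
    if customer.warranty_expires >= today:
        return None  # warranty still active, no trigger
    return {
        "to": customer.email,
        "subject": "Your warranty has expired - here are your options",
        "offers": SERVICE_OPTIONS,
    }

msg = build_triggered_email(
    Customer("buyer@example.com", date(2011, 6, 30)), today=date(2011, 9, 1)
)
print(msg["subject"])
```

The point of the trigger model is that every field of the payload — subject, offers, call to action — becomes a testable variable once the email is generated automatically rather than hand-written.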
“For each response, we can test the offer, call to action, language and headlines, and dial up the conversions at each interaction with the customer.”

The third level of testing is what Anderson calls “functional testing” of content on HP Web pages and in newsletters and blogs. For its Web pages, HP scores content using tools such as Safeguard by SDL Tridion, which scans Web page content for keyword optimization, metadata and visual elements to optimize the user experience and improve SEO. HP's social media group reviews blogs written by HPers and implements best practices, such as testing headlines, language and calls to action. Once blogs have been reviewed and tested, blog page views may increase by as much as eight times, Anderson said. Finally, Anderson manages an email marketing team that tests all elements of email communications to customers, from subject lines to offers in the email, and has seen improvements of up to 600% in click-through rates and 300% in open rates.

Smaller companies are also doing more testing of digital content across an array of channels. JDA Software, a supply chain management software company, does content testing on websites and in blogs, newsletters and other marketing materials. “We have a publishing calendar every week, which looks at all the content we're pushing out to meet our corporate goals,” said Cindy Kim, director-marketing and social media at JDA. “We use LinkedIn, Twitter and Facebook to see what topics resonate.” JDA serves the retail industry, and there was some internal debate about whether to use the term “multichannel” or “cross-channel” in content.
A JDA blogger posed the question on his blog and to his LinkedIn groups to find out which term resonated more with the retail audience, and the informal research showed the industry preferred “multichannel.” In addition, JDA does A/B testing on email subject lines, newsletter and blog content, and website design, using metrics such as open and click-through rates as well as time spent on site.
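A subject-line test of the kind JDA describes can be sketched as below. The subject lines, send counts and the simple open-rate winner rule are illustrative assumptions, not JDA's data or process:

```python
def best_variant(results):
    """Pick the winning subject line by open rate; break ties by click-through.

    `results` maps subject line -> (sends, opens, clicks). Real tests
    should also check sample size and significance before declaring
    a winner, rather than taking the raw rates at face value.
    """
    def score(item):
        _, (sends, opens, clicks) = item
        return (opens / sends, clicks / sends)
    return max(results.items(), key=score)[0]

# Hypothetical split of one newsletter send across two subject lines
results = {
    "March newsletter":                (5000, 600, 90),   # 12% open rate
    "3 supply chain trends to watch":  (5000, 950, 160),  # 19% open rate
}
print(best_variant(results))  # → 3 supply chain trends to watch
```

Feeding open and click-through rates back into a weekly publishing calendar, as described above, is what turns a one-off test into an ongoing content-optimization loop.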