One of the most interesting sessions I attended at Adobe Summit this year examined how to build an optimization program across a series of different brands or divisions under a single organization. Large companies often run conglomerates of businesses, some of which may even occupy similar spaces in the marketplace. The core challenge for these companies becomes how to manage testing and optimization programs across these multiple brands, while maintaining governance over activities, assets, and brand messaging within different regions or markets. One session at Summit offered some best practices for doing just that.

A fundamental best practice underscored in the session is standardizing an effective testing process. Although tests among brands or regions/divisions may differ, the methods for executing a test, evaluating its success, and communicating the results should be consistent. It is no longer sufficient to test brands under a single organization separately, which requires extensive posttest analysis of disparate data to piece together cross-organizational results. Conducting multibrand testing within a single interface streamlines the process, ensures that critical steps are not overlooked, and minimizes the potential for errors in the results. For example, the quality assurance (QA) phase of your test design is a critical component of every test you conduct. Minor bugs in the test design can yield inaccurate results, which can skew the decision-making process. In a sense, QA for a new test rollout must be more rigorous than for a product rollout because errors cannot be tolerated owing to their potential impact: test designs are fragile environments, and errors in experiences or in result tracking can skew your reports.

Another best practice outlined in this session involved aligning people across the organization and across brands and establishing an experienced core team that can manage the program, prioritize efforts, and offer guidance and strategy. This context and collaboration across company-wide campaigns and global teams becomes much easier within a unified interface, such as that provided by Adobe Marketing Cloud, in which assets, data, tests, and results can easily be shared and monitored. Prior to recent advancements in cloud technology, dispersed departments would often conduct siloed testing, with the ability to communicate their learnings limited to a handful of cross-organizational meetings designed to review the results.

In today’s multibranded business, test optimization requires an optimization “steering committee.” This core team can manage and evaluate the learnings from all of the brands and divisions, as well as “steer” them on the right course based on optimization best practices, revenue performance, and the company’s overall business direction and brand goals.

In this Summit session, the speaker explained that their initial testing program was getting good results but failed to publicize those results throughout the company, which eroded stakeholders’ trust in the program. Because trust is one of the most important factors in a successful program, a process was put in place to validate test results and to align the company around the optimization activities.

The process begins with project norms, which are decisions the main stakeholders make about the program, such as how many tests will be run, how the product teams will be supported, and what types of tests will yield the most ROI. Next, ideas are gathered from throughout the company. Although the ideas may be disconnected or siloed, being able to take ideas from within the company and turn them into tests to show the benefit of the testing process helps to build confidence and trust with the program. Prioritizing test ideas relies on a cost-benefit analysis process to determine which will be the most effective in terms of execution and results.
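The cost-benefit prioritization described above can be sketched in code. This is an illustrative sketch only: the idea names, the estimated lift and effort figures, and the simple benefit-to-cost ratio are all hypothetical stand-ins, not a scoring model from the session.

```python
# Illustrative sketch: ranking test ideas by expected benefit vs. execution
# cost. All names, numbers, and the scoring formula are hypothetical.

from dataclasses import dataclass

@dataclass
class TestIdea:
    name: str
    expected_lift: float   # estimated benefit (e.g., projected revenue impact)
    effort: float          # estimated execution cost (e.g., development days)

def prioritize(ideas):
    """Rank ideas by a simple benefit-to-cost ratio, highest first."""
    return sorted(ideas, key=lambda i: i.expected_lift / i.effort, reverse=True)

ideas = [
    TestIdea("Checkout button copy", expected_lift=2.0, effort=1.0),
    TestIdea("Homepage redesign", expected_lift=5.0, effort=10.0),
    TestIdea("Email subject lines", expected_lift=1.5, effort=0.5),
]

for idea in prioritize(ideas):
    print(f"{idea.name}: {idea.expected_lift / idea.effort:.1f}")
```

In practice a real program would weigh many more factors (traffic, risk, strategic fit), but even a crude ratio like this makes the prioritization discussion concrete and repeatable.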

Once the program has been outlined and ideas generated, tests are designed and validated to ensure proper design and accurate results. From here, the optimization roadmap is finalized and the tests are implemented. Each test follows its own process: test planning and design, development and QA, communication of the test to stakeholders, a final test, a live testing period, and finally analysis and follow-up. The key to the individual test process, and to the program as a whole, is built-in accountability and ownership at each step. Specific people are responsible for the integrity of the program and the process at every stage, ensuring buy-in throughout the organization and better execution of the program.
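One way to picture the built-in accountability described above is to model the per-test workflow as an ordered set of stages, each with a named owner who signs off before the next stage begins. The stage names follow the process in the text; the owner roles are purely illustrative assumptions.

```python
# Hypothetical sketch of the per-test workflow, with an explicit owner for
# each stage to model built-in accountability. Owner roles are illustrative.

from collections import OrderedDict

test_workflow = OrderedDict([
    ("planning_and_design", "optimization strategist"),
    ("development_and_qa", "engineering lead"),
    ("stakeholder_communication", "program manager"),
    ("final_test", "QA owner"),
    ("live_testing", "analyst"),
    ("analysis_and_follow_up", "analyst"),
])

def run_test(name):
    """Walk a test through every stage, recording who signed off on each."""
    for stage, owner in test_workflow.items():
        # Each stage is signed off by its named owner before the next begins.
        print(f"[{name}] {stage}: signed off by {owner}")
```

Making ownership explicit in a shared artifact like this, rather than leaving it implicit, is what turns a checklist into the kind of accountability the session emphasized.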

The final concept examined in this session was embedded execution. Embedded execution refers to an organizational culture in which all program participants adhere to a standard process. With tools like the new Adobe Target interface, involving more and more members of the organization in optimization is easier than ever. As opposed to test design and communication being conducted with individual slices of the larger company, the new user interface allows colleagues to participate in the testing process while being governed and coordinated by the core team.

The posting of content, reports, concepts, and notifications within a unified environment and architecture allows for sharing of ideas and knowledge that have been gleaned from across the organizational brands. The lessons from successes in one part of the organization can be infused in another, creating embedded execution where the benefits of all activities in the program are leveraged across the entire organization.

Many businesses talk about goals such as coordinated collaboration, automated personalization, and master marketing profiles as a sort of future state. However, as we saw at this year’s Summit, these features and capabilities exist in today’s marketing organizations. They’re easy to pick up and use in concert for better, faster, smarter optimization and personalization, allowing company-wide optimization programs to become a reality.
