Navigating the depths of multivariate testing

By Gordon H. Bell and Roger Longbotham

Multivariable testing has made a big splash in the last few years with online retailers, yet this sudden success is really riding the crest of a wave that’s been building for years. Multivariable testing—also called scientific testing, multivariate or matrix testing, Taguchi methods, or other branded terms—is based on a specialized field of statistics that has evolved over the last 80 years. Since the 1930s, a small group of academic statisticians has developed new test designs and techniques focused on efficient ways to test more variables more quickly.

Often called “experimental design,” this specialty falls outside of mainstream statistics and has remained largely unknown to the business world. Only in the last decade have practitioners found a successful approach for using this impressive depth of academic theory to navigate fast-moving marketing channels.

Many variables at once

The concept is simple: with the right techniques you can change many variables at once—but in an organized way—so you can separate the impact of each. Complex mathematical principles define the “organized way” you need to set up your multivariable test.

The depth of statistical complexity below the surface can seem daunting. As marketers, you should understand the fundamental concepts and basic pros and cons of the selected test strategy. The expert who guides you through the process should be able to explain the rationale of his approach and have a good grasp of the vast realm of techniques available. These include efficient test designs like full-factorial, fractional-factorial and Plackett-Burman designs, plus a veritable A-to-Z of specialized tools: axial runs, Bonferroni method, confounding, dispersion effects and experimental units, plus orthogonality, projectivity and quadratic effects, down to the X-, Y- and Z-components of interaction.

Various designs and techniques are appropriate for different marketing programs and objectives. For example, Plackett-Burman designs work well for testing 10-20 creative elements very efficiently in high-production-cost direct mail programs. Fractional-factorial designs are flexible and powerful for testing 5-15 creative elements and select interactions in e-mail and Internet programs. For product, price and offer testing—where elements are known to be important and interactions can be very large and valuable—full-factorial designs often are best. The number and type of test elements, cost and constraints on the number of “recipes” you can create, and the desired speed and precision of the test are among the issues that impact your choice of test design and strategy.
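
To make the distinction concrete, here is a minimal Python sketch (with invented element names) of the full-factorial case: every combination of two-level elements becomes one recipe. Fractional-factorial and Plackett-Burman designs would run only a carefully chosen subset of these rows.

    from itertools import product

    elements = ["tagline", "price", "starburst"]  # three two-level elements

    # Full factorial: every combination of control (-1) and new idea (+1),
    # 2**3 = 8 recipes in all.
    recipes = list(product([-1, +1], repeat=len(elements)))

    for i, recipe in enumerate(recipes, start=1):
        settings = {name: ("new" if level > 0 else "control")
                    for name, level in zip(elements, recipe)}
        print(f"Recipe {i}: {settings}")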

Since the dawn of direct marketing, split-run techniques (also called A/B splits, test-control or champion-challenger testing) have been the standard for marketing testing. You may have a long-running (or “control”) banner advertisement and test it against a single challenger with only the tagline changed, so any difference in click-through and conversion can be attributed to this one variable alone.

In contrast, one multivariable test design is made up of a number of related test “recipes.” Instead of the one-variable change of a split-run test, one new banner ad in a multivariable test would include a number of changes—perhaps the new tagline along with a control graphic, new price, additional starburst and control background color. Each of these versions has a unique combination of all elements in the test, and each provides one new piece of data on every test element. By analyzing all recipes together, but grouping the data in different ways, you can separate the precise impact of each change. The statistical structure requires that the creative execution accurately follow the defined test recipes.
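
As a hedged illustration of that grouping idea, the Python sketch below estimates main effects from a coded design matrix. The tiny design, element names and conversion numbers are all invented; a real test would use a larger design and sample.

    import numpy as np

    # Coded design matrix: rows = recipes, columns = test elements,
    # -1 = control level, +1 = new idea (a tiny 2-element full factorial).
    design = np.array([
        [-1, -1],
        [+1, -1],
        [-1, +1],
        [+1, +1],
    ])
    conversion = np.array([1.00, 1.07, 0.98, 1.05])  # conversion (%) per recipe

    # Each main effect = mean response where the element is "new"
    # minus mean response where it is at the control level.
    for j, name in enumerate(["tagline", "background"]):
        col = design[:, j]
        effect = conversion[col == +1].mean() - conversion[col == -1].mean()
        print(f"Main effect of {name}: {effect:+.3f} points")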

Scientific multivariable tests have four key advantages over split-run techniques: you can test many marketing elements at once; you can use the same small sample size as an A/B split; results quantify the impact of each element alone (main effect) and in combination with others (interactions); and a vast array of techniques is available to customize your approach.

The best e-mail recipe

A large Internet retailer/cataloger wanted to increase e-mail conversion. With 2-3 e-mail drops per week to a customer base of 450,000, conversion rate averaged 1% per campaign. The team struggled to pinpoint what worked best because it continually changed e-mail creative and offers to keep the program fresh.

After brainstorming 42 ideas, the team narrowed the list down to 18 bold, independent test elements for one multivariable test made up of 20 different combinations, or recipes, of all 18 elements. Four of these “recipes” are shown on p. 75 (with control levels in black and the new ideas in orange). A direct subject line might be something like “Save 20% through Friday.” A creative subject line would change that to “Awesome new products and super savings that won’t last.”

Recipe 20 was simply the control. All other recipes had about half the elements set at the control level and half at the new level, but a different half-and-half for each recipe. Though these four may look like random combinations, all recipes fit within the precise statistical test design. Like the pieces of a puzzle, all recipes fit together to provide accurate data on the main effects and important interactions of all 18 elements.
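
The puzzle-fitting property has statistical names: balance and orthogonality. The small Python sketch below (using a toy eight-recipe full factorial rather than the article’s 20-recipe design) shows the two checks involved.

    import numpy as np
    from itertools import product

    # A toy full factorial: 8 recipes x 3 elements. The article's design
    # (20 recipes x 18 elements) is built on the same principles.
    design = np.array(list(product([-1, +1], repeat=3)))

    # Balance: every column sums to zero, i.e. each element is tested
    # at the control and new levels equally often.
    print("Column sums:", design.sum(axis=0))   # [0 0 0]

    # Orthogonality: off-diagonal entries of X'X are zero, so each
    # element's effect can be estimated independently of the others.
    print("X'X:\n", design.T @ design)          # 8 * identity matrix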

The team ran the same test across three different types of promotions to see promotion-specific effects plus elements that were important across all campaigns. Results for the first campaign are shown below, including the 18 main effects (shown in the bar chart) and one key interaction (in the line plot).

In the chart, main effects are arranged from the largest (D, at the top) to the smallest. Test elements are listed on the left with the “new idea” shown in parentheses. The length of the bar and the label show the size of the main effect. The +/- sign on the effect shows whether the new idea is better (positive effect) or the control is better (negative effect). The dashed line of significance is a measure of experimental error. All effects below that line (less than 6%) can be explained simply by random variation. Effects are shown as a percentage change from the control, so a 10% effect would increase conversion rate from 1% for the control to 1.1%.
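
The arithmetic in that last sentence, spelled out as a tiny Python sketch using the numbers quoted above:

    control_rate = 0.01        # the control's 1% conversion rate
    effect_pct = 10.0          # a +10% main effect
    significance_line = 6.0    # this test's line of significance (%)

    new_rate = control_rate * (1 + effect_pct / 100)
    print(f"Conversion with the new idea: {new_rate:.2%}")  # 1.10%
    print("Beyond the line of significance?",
          abs(effect_pct) > significance_line)              # True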

Four main effects were clearly significant: Product selection had the largest effect. Conversion rate increased by 10% when best-selling products (D+) were promoted instead of “unique” products. A larger headline with color (J+) increased conversion by 8.9%. Offering three products decreased conversion by 8.2% vs. one product (E-). Finally, the creative subject line (A+) beat the direct subject line by 6.9%.

Immersed deeper within the unique statistical structure is a wealth of information about interactions. On the surface, main effects show individual changes that increase conversion. Interactions show how these effects may ebb or flow depending on the relationship among marketing-mix elements.

The line plot on p. 76 shows the AB interaction. The main effect of A (subject line theme) changes significantly depending on whether certain words are capitalized in the subject line (element B). Supporting the main effect in the bar chart, the creative subject line is always better than the direct offer (going from left to right), but the impact is much greater with no capitalization (B+, orange line). This interaction shows that (1) capitalizing words in the subject line does have an impact (B+: no capitalization) and that (2) without capitalization, the effect of the creative subject line (A+B+) is about 40% larger than shown by the main effect in the bar chart. Interactions not only offer deeper insights into the true relationship among elements, they also help better quantify the impact of the optimal combination of elements.
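
For readers who want the mechanics, here is a hedged Python sketch of how a two-factor interaction such as AB is estimated from the four cell means. The values are invented, chosen only to mimic the pattern described above.

    # Mean conversion (%) in each (A, B) cell: A = subject line theme
    # (-1 direct, +1 creative), B = capitalization (-1 capitalized,
    # +1 no capitalization). All values are illustrative.
    cell = {(-1, -1): 0.95, (+1, -1): 1.02,
            (-1, +1): 0.97, (+1, +1): 1.14}

    effect_A_capitalized = cell[(+1, -1)] - cell[(-1, -1)]  # +0.07
    effect_A_no_caps = cell[(+1, +1)] - cell[(-1, +1)]      # +0.17

    # The AB interaction is half the difference between the two
    # conditional effects of A.
    interaction_AB = (effect_A_no_caps - effect_A_capitalized) / 2

    print(f"Effect of A, capitalized: {effect_A_capitalized:+.2f}")
    print(f"Effect of A, no caps:     {effect_A_no_caps:+.2f}")
    print(f"AB interaction:           {interaction_AB:+.3f}")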

What’s the difference?

In this case, what was the advantage of using multivariable testing? Well, if the team had used simple split-run techniques instead (a rough sketch of the arithmetic follows this list):

* Testing all 18 elements in one drop, not one effect would have been significant, since the line of significance would have been 2 1/2 times higher (only effects greater than 16% would be significant).

* For equal confidence, the team would have had to test only one element per drop, requiring 18 campaigns, with no way to separate seasonality or differences among campaigns.

* The team would never have seen the AB interaction (and others), and capitalization would appear to have no impact.
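
Here is the promised rough Python sketch of the sample-size arithmetic behind the first point, under simplifying assumptions (equal response variance, one shared control cell, no multiple-comparison adjustment). This crude allocation gives roughly a 3x penalty, the same ballpark as the 2 1/2 times cited above; the exact factor depends on how the sample is allocated.

    import math

    N = 450_000  # one drop to the full customer base (from the article)

    # Multivariable test: every main effect compares one half of the
    # sample (element at +1) against the other half (element at -1).
    se_mvt = math.sqrt(1 / (N / 2) + 1 / (N / 2))

    # Split-run alternative: 18 test cells plus a control, each N/19,
    # with every element judged from just two of those small cells.
    cell_size = N / 19
    se_split = math.sqrt(1 / cell_size + 1 / cell_size)

    print(f"Split-run standard error is {se_split / se_mvt:.1f}x larger")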

An extreme case of multivariable testing was one banner ad test of 26 elements. With 10 graphical elements, nine messages, pop-ups, drop-downs, animation and other marketing tactics in play, the biggest challenge was defining the elements and managing recipes to avoid completely absurd combinations (imagine all these words and graphics stuffed into one banner). In cases like this, the “art” of testing—defining clear, bold, independent test elements and creating a test design with recipes that push the limits of market knowledge without falling apart in execution—is as important as the science. Here, eight significant main effects and one very profitable interaction led to a 72% jump in conversion. The test was completed in four weeks; for equal confidence, split-run tests would have required 14 months.

An Internet retailer of consumer gifts ran a landing page test of 23 elements for three weeks and pinpointed seven changes that increased sales (and six “good” ideas that hurt) for a 14.3% jump in sales. Twenty-three separate A/B splits would have required over 40 weeks to achieve equal statistical confidence. The test paid for itself 10 days after results were implemented.

Getting started

Multivariable testing is most effective for retailers who have many ideas to test and the flexibility to create numerous recipes within a high-value marketing program. Key decisions in launching a retail test include: choosing the right test elements and levels for each, deciding which of all possible combinations should be executed for a valid test, creating and executing all recipes, and collecting data and analyzing results.

Software platforms offered by firms like Optimost and Offermatica can simplify the process of creating recipes and analyzing results for Internet tests. Consultants focused on the specialized statistics and strategies of testing can help guide you through the process, especially for e-mail and offline programs (like direct mail, media advertising and in-store tests). For outside assistance, you may want to budget about $10,000 per month for ongoing support; in return you can expect to reduce your learning curve and increase your testing efficiency and return on investment. Another option for small firms is Website Optimizer, the free, bare-bones service Google offers to AdWords advertisers.

Testing remains an integral part of every good marketing program. Like trading in your dinghy for a clipper ship, launching a multivariable test brings you the power and freedom to move faster through turbulent marketing channels. With an experienced guide to show you the way, scientific testing offers greater agility to respond to market changes, streamline your retail programs and explore new opportunities for growth.

Gordon H. Bell is president of LucidView, a marketing consulting firm specializing in scientific testing techniques, and can be reached at gbell@lucidview.com. Roger Longbotham is senior statistician at Amazon.com Inc., where he oversees the multivariable tests on Amazon.com and conducts data mining studies related to customer behavior. He can be reached at longboth@amazon.com.

http://www.internetretailer.com/article.asp?id=21312

SugarCRM Debuts Multichannel Marketing, Analytics

SugarCRM, a provider of open source CRM, has added multichannel marketing and business analytics functionality to its Sugar Open Source, Sugar Professional and Sugar Enterprise product lines.

This development is welcome news to those companies that are investigating open source as a viable alternative to packaged and on-demand CRM systems — a group that, while still minuscule, is becoming increasingly vocal about the benefits of do-it-yourself CRM.

“For a business model that owns, at best, less than one percent of the CRM market, they are getting a lot of buzz,” Yankee Group CRM analyst Sheryl Kingstone told CRM Buyer.

Enhancing Marketing

SugarCRM’s multichannel marketing features include the campaign wizard to set up and execute a campaign; campaign manager, which tracks the opportunities generated and closed by the campaign across customer channels; and automated lead capture, which can integrate Web leads into SugarCRM, associating them with a specific campaign.

Other features provide for better management of e-mail marketing, online advertising, newsletters, search engine marketing, list rentals, telesales programs, webcasts and traditional advertising.

New Business Analytics

The platform’s new business analytics includes features that generate ad-hoc, multi-module reporting to analyze marketing, sales and customer support. These reports can be manipulated into multiple formats such as pie charts or line graphs.

The multichannel marketing functionality is included in Sugar Open Source, Sugar Professional and Sugar Enterprise; business analytic capabilities are included in Sugar Professional and Sugar Enterprise only.

Use Case

Three years or so out of the gate, open source CRM may not have grabbed as much market share as initial speculation had suggested; however, it suits a wide range of companies.

Users could range from small companies that do not want to invest significant money in commercial systems, to large enterprises that want highly tailored systems, to firms that want to support open source, Kingstone said.

Additionally, companies in micro verticals — which the vendors have yet to penetrate — would also be interested in developing their own open source CRM system.

However, when faced with a buying decision, firms are more likely to continue to opt for the more conventional choices.

“I see Salesforce.com (NYSE: CRM) going up against Siebel or SAP (NYSE: SAP). I never see open source as an option in these deals,” Kingstone stated, adding that if she ever were to consider open source as a possibility, SugarCRM would likely be the tech provider.

Built on the LAMP (Linux, Apache, MySQL, PHP) platform, SugarCRM was one of the first vendors to come to market with an open source CRM platform and has since gained significant traction.

http://www.ecommercetimes.com/story/55445.html

SPSS Rolls Out Clementine Version 11

by Alex Woodie

Business intelligence software developer SPSS recently unveiled a new version of Clementine, the data mining software that’s often deployed alongside CRM systems to help detect patterns in customer data, such as fraud. Version 11 includes better data cleansing and transformation capabilities, many new calculations, and improved output.

Clementine is one of several predictive analytics products from SPSS that help companies improve their visibility into their customers’ buying trends and habits. In a nutshell, the software accomplishes this little feat of magic by analyzing information on past circumstances along with present events, and projecting customers’ future actions, such as whether they might stop being customers, or try to rip somebody off.

With version 11, SPSS is aiming to make predicting the future even easier. The product includes new algorithms for credit scoring, complex pricing models, CRM and response modeling, forecasting, and rule-based models that incorporate users’ business knowledge, the company says.

Future predictions will arrive earlier than before, thanks to better tooling with Clementine, according to SPSS. The new release includes more robust transformation capabilities, more automated data cleansing, and the use of “optimal binning” to enable more predictive power, the company says. What’s more, the new Binary Classifier feature makes it easier to build multiple models simultaneously, so that the user can pick the best one.
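
Clementine’s Binary Classifier itself is proprietary, but the underlying idea (fit several candidate binary classifiers and keep the one that scores best) can be sketched generically. The scikit-learn code below, run on synthetic data, illustrates the concept only; it is not Clementine’s API.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    # Synthetic binary-classification data standing in for, say,
    # a churn or fraud flag.
    X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

    candidates = {
        "logistic": LogisticRegression(max_iter=1000),
        "tree": DecisionTreeClassifier(random_state=0),
        "forest": RandomForestClassifier(random_state=0),
    }

    # Build all candidate models and compare cross-validated accuracy.
    scores = {name: cross_val_score(model, X, y, cv=5).mean()
              for name, model in candidates.items()}
    best = max(scores, key=scores.get)
    print(scores)
    print("Best model:", best)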

The future is also clearer with Clementine 11.0 thanks to a new graphics engine that makes it easier to generate and edit images, and closer integration with SPSS statistical products.

Unfortunately for iSeries shops, the OS/400 version of Clementine is still several months away from general availability, according to SPSS officials; the Windows version is available now. Stay tuned for more coverage of Clementine for iSeries in an upcoming issue of this newsletter.

http://www.itjungle.com/fhs/fhs013007-story06.html

SAS ranked best in data mining

CARY, N.C. — SAS, the leader in business intelligence, was voted Best Data Mining Toolset vendor for the third consecutive year. Value, reliability and broad applicability are the hallmarks of the Intelligent Enterprise Readers’ Choice Awards, the annual pick of preferred vendors. SAS also received 87 percent of votes in the customer satisfaction category for Web/Clickstream Analytics, tying for the top spot.

“SAS has a successful 30-year track record developing proven analytics capabilities,” said Mary Crissey, Analytics Product Marketing Manager at SAS. “Our investment in breadth of analytics and accuracy leads the industry. Intelligent Enterprise readers who are SAS customers enjoy the solid analytical foundation that is needed to gain the competitive edge.”

“Each year, subscribers select their preferred vendors, and SAS was a clear leader,” said Doug Henschen, editor of IntelligentEnterprise.com. “Hundreds of readers voiced their confidence in technology excellence through their ballots.”

DataFlux, a SAS company, also received the Readers’ Choice Award for Best Data Quality and Profiling software.

SAS Analytics Software is Unmatched in the Industry

SAS offers an integrated suite of analytics software that allows customers to formulate and evolve their analyses to obtain the best results and discover insights hidden in new data more quickly and easily than before.

SAS Enterprise Miner streamlines the entire data mining process from data access to model deployment by supporting all necessary tasks within a single, integrated solution, all while providing the flexibility for efficient workgroup collaborations. Delivered as a distributed client/server system, it is especially well-suited for data mining in large organizations. SAS Enterprise Miner is designed for data miners, marketing analysts, database marketers, risk analysts, fraud investigators, engineers and scientists who face difficult challenges in solving critical business or research issues.

The SAS Web Analytics solution delivers accurate, up-to-date information on an organization’s entire Web presence. The solution turns high volumes of Web data into key metrics specific to the business, enabling decision makers to determine the success of online operations and to proactively refine business strategies as needed.

About Intelligent Enterprise
IntelligentEnterprise.com is the only site dedicated to helping organizations plan and deploy strategic business applications that turn information into intelligence. Intelligent Enterprise empowers Business Application Strategists charged with unlocking the value of strategic information, setting, managing and running their enterprise business processes to provide collaborative information delivery that drives strategic decision making.

About SAS
SAS is the leader in business intelligence software and services. Customers at 40,000 sites use SAS software to improve performance through insight into vast amounts of data, resulting in faster, more accurate business decisions; more profitable relationships with customers and suppliers; compliance with governmental regulations; research breakthroughs; and better products. Only SAS offers leading data integration, intelligence storage, advanced analytics and business intelligence applications within a comprehensive enterprise intelligence platform. Since 1976, SAS has been giving customers around the world THE POWER TO KNOW®. www.sas.com

http://carolinanewswire.com/news/News.cgi?database=1news.db&command=viewone&id=2357&op=t

Where is web analytics headed in 2007?

Here’s an interesting ClickZ article about what’s in store for us this year in the world of web analytics…

Proto-Analytics for 2007

By Shane Atchison | January 11, 2007

It’s now the new year and after a week’s vacation, I found myself reading all my favorite Web analytics writers: Avinash Kaushik, Jason Burby, and Craig Danuloff. They all made their predictions for 2007, and I thought I’d share mine as well.

* It’s a tale of two software vendors, continued. In 2006, we conducted over 50 analytics software evaluations for Fortune 2000 companies. Along the way, we saw the emergence of two clear-cut leaders in the enterprise analytics market: Omniture and WebSideStory. I expect that trend to continue.

For Omniture, the Genesis release has provided a great platform for third-party integration. I expect the company to continue in this direction and to acquire other companies, thanks to its very successful 2006 IPO.

It looks like most of WebSideStory’s enterprise customers will be encouraged to migrate to Visual Sciences. The reason? WebSideStory purchased Visual Sciences, and Jim MacIntyre, founder of Visual Sciences, is now CEO of WebSideStory.

* Google Analytics will be a significant dark-horse vendor. Nowadays, everyone has his eyes on Google, wondering what this new player will do with its massive resources. Yet it hasn’t realized its potential in the analytics software business. I have great hope that it will develop some key innovations through its investment and perhaps acquisition of other companies.

* Innovation will come from third parties. Look for increased third-party integration in existing software platforms. In addition, we should get much better tools for behavioral content targeting, search bid management, and site and campaign optimization.

* Marketing executives will embrace Web analytics. 2007 will be the year the vast majority of CMOs and other marketing managers drink the Web analytics Kool-Aid. They’ll write marketing plans, project forecasts, creative briefs, and everything else with Web analytics as part of the process.

* Marketing execs’ biggest issue will be human capital. Wanting something and getting it are two entirely different things. While marketing professionals will want to add analytics to all their processes, the human resources needed to make those changes will be hard to find. If you’re looking for a job this year, having “Web analyst” at the head of your résumé will be a very good thing. Still, I and others in the industry worry that inexperience and unrealistic expectations will cause a lot of disappointment in companies trying to adopt data-driven processes.

* Employees will be held accountable for analytics. Nonetheless, at the ground level analytics will play an increased role in performance reviews. If you’re working in the Web organization of a Fortune 2000 company, expect to see an analytics dashboard or scorecard as part of your bonus structure.

* Optimization will be the hottest trend. It’s hard to think of a more promising area in analytics than optimization. Look to technology vendors such as Offermatica and Kefta to lead this space. These companies are prime candidates for additional capital and possible acquisition by Web analytics software vendors.

* Education will be huge. With the rising interest in analytics will come the arrival of many new books and materials on the field, including one by my colleague (and fellow ClickZ columnist) Jason Burby and me.

Finally, it may be more of a wish than a prediction, but I hope universities will begin to create curricula around analytics. If there are any academics out there interested, please e-mail me. We need your help.

Have any questions, comments, or predictions of your own? E-mail me. I’d be happy to hear from you.

http://www.clickz.com/showPage.html?page=3624481

Building business intelligence

Companies implementing information projects tend to be undergoing organizational change as well. Or at least it seems that way to me. In any event, dealing with change is inevitable in an information management career. Recognizing that you are in a period of change, and adjusting your approach accordingly, can help immensely.

The Challenge of Change

In times of change, organizations face major challenges such as retention and morale issues as well as maintaining high-level work results. Productivity is under assault during change. Information projects need to go forward, but the team members may be confused or, worse, demoralized by the activities going on around them. I’ve found that information projects which ignore the important aspects of organizational change do not produce good results, regardless of the best-laid architecture and technology plans.

Often, change is undermanaged and, therefore, stressful. That is the bad news. The good news is that during these times, there are numerous opportunities to deliver results. Successfully coping with and managing organizational change – and by extension a successful information project – requires awareness, motivation and discipline, resulting in personal excellence that will filter to the rest of the information project team. Here are some tips to manage change in a way that ensures success of your information project.

Tips to Manage Change

The right approach. It is natural for people and teams to resist change. Adopt an empathetic approach to dealing with change, and realize that employees who appear to be going with the flow in times of change probably are not. The common reactions to organizational change are most likely being experienced by everyone on the project team, even if they are not on display. Job preservation becomes a major concern. When people must live without closure – or at least information – about their personal situation, gossip and rumor fill the void. We see this all the time during business interviews for information projects. Because a company’s issues will take a back seat to personal issues, getting the issues that personally affect people out in the open and dealing with them will clear the way for dealing with company issues.

A healthy pessimism. Maintaining a healthy pessimism about the outcome of projects during change is, well, healthy. Early on, anticipate obstacles from the commonalities that most information projects find challenging: data quality, business participation, query performance, data volume management, getting the specification correct and building to the specification. Expect trouble; to do otherwise is to give yourself false hope. You can only come to grips with problems you are familiar with. Every employee is an agent in this regard, and each viewpoint should have an outlet. Make the truth welcome.

Focus and discipline. Focus. Focus. Focus. I have to say it – discipline is in short supply in information management. While many programs are addicted to firefights and flavors du jour, those who exercise control and discipline far outdistance their peers in success. The trick is to make yourself do the things that are important, even if it doesn’t feel like fun. We get spread thin with all the roles we play, and often we’re left with little that is tangible to show for our efforts at the end of the day. That’s why a set of daily nonnegotiable “to do’s” will help ensure progression through the tough times that change brings.

Obviously, production presents an easy set of daily nonnegotiable tasks through checking data movement, query performance, data quality and all the things that keep a production environment healthy. Likewise, development has opportunities for daily progress, but often they must be more creatively manufactured.

Deadlines can drive discipline. If I don’t have externally imposed deadlines, I create them for myself. I’m grateful for deadlines because they are the key to staying focused through change.

Self-control and teamwork. Finally, control yourself in the face of change. If you don’t, your focus and discipline problems will be exacerbated. During change, it is easy to forget that teams win together. It is not necessary for someone else to lose or for those around you to feel small for you to win.

Information management and organizational change initiatives often go hand in hand. While it is human nature to resist change, employing a few simple tips and techniques for managing change can make all the difference in any information project. Empathy, healthy pessimism, focus and self-control are key tools to successful change. Your personal excellence in dealing with change will overcome obstacles, inspire others and lead to successful information projects. The result is that information management is fun, the time flies and performance breakthroughs occur.

http://www.dmreview.com/article_sub.cfm?articleId=1062034