Category Archives: analytics

Advancements in SPSS 16.0 Expand Analytical Power

CHICAGO–(BUSINESS WIRE)–SPSS Inc. (Nasdaq:SPSS), a leading worldwide provider of predictive analytics software, today released SPSS 16.0, a significantly enhanced version of its flagship statistical software suite of products. New capabilities in SPSS 16.0 integrate this product with SPSS’ Predictive Enterprise Services™ (PES) analytics platform to direct and automate decisions. This use of predictive analytic technology produces a measurable return on investment and provides SPSS customers with a competitive advantage.

SPSS 16.0 targets the needs of organizations that value predictive modeling capabilities as corporate assets. With PES — an enterprise-level predictive platform addressing key requirements supporting the widespread use and implementation of predictive analytics — businesses utilize more powerful statistics to tap into customers’ attitudinal and behavioral data. The benefits achieved include detecting fraud, increasing revenues through cross-selling and up-selling, and improving overall business process efficiencies and outcomes.

The SPSS 16.0 release features a new user interface written completely in Java™, making the software more flexible and easier to use. Other major enhancements include significantly expanded analytical capabilities, enhanced data management, improved programmability and greater performance and scalability in enterprise applications. This release features the immediate availability of all SPSS 16.0 product modules across the major computing platforms — Microsoft® Windows®, Apple® Mac® OS® X or Linux®.

With version 16.0, SPSS addresses the growing demand for predictive analytics from the commercial sector, while maintaining its leading and respected position in academia. SPSS leads the charge in education as universities worldwide incorporate SPSS in their applied statistics training programs. Today, SPSS 16.0 is the flexible suite of products students-turned-business leaders incorporate into their daily analysis of their customers’ needs, wants and desires.

Customers Laud Added Capabilities of SPSS 16.0

Ken Kirsten, director of analytic consulting for The Nielsen Company, said, “Businesses that incorporate SPSS 16.0 into their technology stack will gain a real competitive advantage as they’ll have the suite of tools necessary to become predictive enterprises. I use SPSS for analysis that simply can’t be done in a simple spreadsheet. This newest version will help me do my job more efficiently and productively.”

Added Bob Muenchen, statistical consultant and manager of the University of Tennessee Statistical Consulting Center, “The SPSS name has been well-known at major colleges and universities around the world for almost four decades now. Students prefer SPSS because it lets them spend their time learning research methods rather than computer programming. SPSS offers a unique combination of ease-of-use for beginners, while providing advanced users programmability and the ability to handle massive data sets.”

Highlights of SPSS 16.0

Supporting the Predictive Enterprise

SPSS 16.0 delivers additional integration with PES, providing a highly efficient, cost-effective way to manage and update the growing number of analytical assets. Enhancements in the SPSS Adapter for Predictive Enterprise Services make it possible to store and manage a variety of analytic assets, including predictive models and data transformations — whether created with version 16.0 or other popular SPSS products — resulting in increased performance.

A New, More Flexible User Interface

The entire user interface, rewritten in Java, has a new form and functionality, making it even easier to work with data. SPSS 16.0 now provides the ability to instantly resize a dialog to see a more complete description of variables, as well as quickly select and drag the variables wanted for analysis.

More Powerful Statistics

A new optional module, SPSS Neural Networks™, provides a complementary approach to the statistical techniques available in SPSS 16.0. Access to neural networks is made simple with the familiar SPSS interface, leading to the discovery of more complex relationships in the data. These powerful analytics offer the ability to forecast consumer demand for a particular set of products, calculate the likely rate of response to a marketing campaign, determine an applicant’s creditworthiness or detect potentially fraudulent transactions.

Enhanced Data Management and Reporting Capabilities

For organizations working with data in multiple languages, SPSS 16.0 provides the ability to process Unicode data, as well as treat text according to Unicode properties for such tasks as sorting and case conversion. In addition, version 16.0 addresses data management concerns, offering more flexibility in how data are prepared and managed.
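For illustration, the kind of Unicode-aware sorting and case conversion described can be sketched in Python. This is a general-purpose example of the underlying Unicode behavior, not SPSS syntax, and the sample names are hypothetical:

```python
import unicodedata

def sort_key(s: str) -> str:
    # Case conversion per Unicode rules: casefold() handles cases that
    # a simple lower() misses, e.g. German "ß" -> "ss".
    folded = s.casefold()
    # Decompose accented characters (NFD) and drop combining marks,
    # so "Zürich" sorts next to "Zurich".
    decomposed = unicodedata.normalize("NFD", folded)
    return "".join(c for c in decomposed if not unicodedata.combining(c))

names = ["Zürich", "zebra", "Äpfel", "apple", "Straße"]
print(sorted(names, key=sort_key))
# ['Äpfel', 'apple', 'Straße', 'zebra', 'Zürich']
```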

Improved Programmability

The SPSS Programmability Extension™ enhances the capabilities of SPSS 16.0 by using external programming languages such as Python®. Integration plug-ins are available at the SPSS Developer Central Web site, as is the SDK extension from SPSS that allows users to create their own integration plug-ins.
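A minimal sketch of what the Python integration looks like inside SPSS syntax, assuming the Python plug-in is installed; the variable name `age` is purely illustrative:

```
* Requires the SPSS-Python integration plug-in.
BEGIN PROGRAM.
import spss
* Run ordinary SPSS syntax from Python (variable name is illustrative).
spss.Submit("FREQUENCIES VARIABLES=age.")
* Query metadata from the active dataset.
print(spss.GetVariableCount())
END PROGRAM.
```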

New to SPSS 16.0 is an integration plug-in for the Open Source statistical programming language, “R.” This enables access to the wealth of statistical routines created in R that can be used within the supportive infrastructure of the SPSS environment as part of SPSS syntax.

To view the entire list of enhancements in SPSS 16.0, please visit and download a PDF brochure.

About SPSS Inc.

SPSS Inc. (Nasdaq: SPSS) is a leading global provider of predictive analytics software and solutions. The company’s predictive analytics technology improves business processes by giving organizations forward visibility for decisions made every day. By incorporating predictive analytics into their daily operations, organizations become Predictive Enterprises — able to direct and automate decisions to meet business goals and achieve a measurable competitive advantage. More than 250,000 public sector, academic and commercial customers rely on SPSS technology to help increase revenue, reduce costs and detect and prevent fraud. Founded in 1968, SPSS is headquartered in Chicago, Illinois. For additional information, please visit

Analyse the Strengths and Weaknesses of SPSS BI

DUBLIN, Ireland –(Business Wire)– Research and Markets has announced the addition of “The SPSS Business Intelligence Radars (Vendor Focus)” to their offering.

This brief analyses the strengths and weaknesses of SPSS’s Business Intelligence offering. SPSS is rated according to its market impact (based on revenues), user sentiment (based on customer perceptions) and technology.

Scope of this title:

Technology: an assessment of SPSS’s technology based on specific attributes and the availability of certain features.

User sentiment: tracks end users’ impressions of SPSS’s products based on a survey of over 700 Business Intelligence users.

Market impact: measures SPSS’s market impact based on its revenues and financial performance.


Highlights of this title:

Many enterprises have invested heavily in enterprise applications that by now form an integral part of their core business processes. Consequently, enterprises are increasingly deploying Business Intelligence solutions to unlock the potential of the growing volume of mission-critical data captured within the enterprise. Since Business Intelligence could benefit a wide range of organizations, demand for these solutions will expand, particularly as organizations seek to consolidate their existing BI capabilities further.


Reasons to order your copy:

Gain detailed knowledge of SPSS’s strengths with regard to technology, user sentiment and market impact. Business Intelligence vendors can benchmark their own performance against SPSS on various key criteria. Enterprise IT managers will gain valuable insight to improve their Business Intelligence purchasing decisions.

Topics Covered:

The BI market will continue to evolve and grow

SPSS: Business Intelligence Radars

Recommendation: Explore
Our Ratings
Extended methodology
User Sentiment
Market Impact
Further reading
Ask the analyst

For more information, visit

Source: Datamonitor

Required Reading: Analytics: A Winning New Way

Thomas Davenport and Jeanne Harris argue that companies need analytics to make better decisions and extract maximum value from their business processes.

by Colin Beasty

Tuesday, May 01, 2007
In a world where the traditional bases of competitive advantage have largely disappeared, how do you separate your company’s performance from the rest of the pack’s? In Competing on Analytics: The New Science of Winning, coauthors Thomas Davenport, director of Accenture’s Institute for Strategic Change, and Jeanne Harris, executive senior research fellow and director of research at Accenture, argue that companies need analytics to make better decisions and extract maximum value from their business processes. Leading companies no longer just collect and store data; they build their competitive strategies around data-driven insights that generate big results. CRM’s Colin Beasty spoke with Harris about the book.

CRM magazine: In recent years we’ve seen BI tools bring analytics to the masses. How are companies leveraging this?
Harris: If you think about CRM tools four or five years ago, vendors were embedding basic decision-support capabilities and analysis. It wasn’t the kind of statistical analysis and predictive modeling you could achieve with a tool from a company like SAS, SPSS, or Cognos. But those tools required a lot more knowledge of math and statistics. I think what’s happened is, on the one hand, you have the CRM suite providers embedding more advanced decision-support capabilities and ad-hoc analysis in their solutions. On the other hand, best-of-breed BI vendors are coming at it from another angle. They’ve been selling solutions for years and realized that they needed to create customizable dashboards for end users in the financial department. They’ve realized that other parts of the business require the same functionality, such as marketing and sales. They’re both converging, and the market is going to explode as there are more and more of these applications as opposed to just tools on the market.
CRM magazine: With the amount of information businesspeople are being bombarded with on a daily basis, is it possible for a company to measure too many metrics using analytics?
Harris: If you’re going to be an analytical competitor, you’re going to need access to a lot of data. That said, there’s an art and a science to analytics. The science of analytics is being able to crunch the numbers and analyze all the data, but the true art is being able to understand what metrics matter. You’ll find that the most successful companies using analytics measure fewer metrics than most companies. They may have started out measuring 2,000 factors, but having done the analysis, they know that only 60 matter. You’ll capture a ton of data, but you’ll need the people to understand what the data is really telling you. Once you’ve done the analysis, you’re much better off because you’ll be able to focus.

CRM magazine: What will readers find most interesting about your book?
Harris: I think your readers will find chapter five interesting. It’s where we talk about using analytics to develop customer intimacy. In addition, I think there are a couple of messages that resonate throughout the book. Companies have used analytics for years. But today we’ve reached an inflection point where we finally have the data, the processing power, and a new generation of statistically astute executives who understand the importance of analytics, both tactically and strategically.

Navigating the depths of multivariate testing

By Gordon H. Bell and Roger Longbotham

Multivariable testing has made a big splash in the last few years with online retailers, yet this sudden success is really riding the crest of a wave that’s been building for years. Multivariable testing—also called scientific testing, multivariate or matrix testing, Taguchi methods, or other branded terms—is based on a specialized field of statistics that has evolved over the last 80 years. Since the 1930s, a small group of academic statisticians has developed new test designs and techniques focused on efficient ways to test more variables more quickly.

Often called “experimental design,” this specialty falls outside of mainstream statistics and has remained largely unknown to the business world. Only in the last decade have practitioners found a successful approach for using this impressive depth of academic theory to navigate fast-moving marketing channels.

Many variables at once

The concept is simple: with the right techniques you can change many variables at once—but in an organized way—so you can separate the impact of each. Complex mathematical principles define the “organized way” you need to set up your multivariable test.

The depth of statistical complexity below the surface can seem daunting. As a marketer, you should understand the fundamental concepts and the basic pros and cons of the selected test strategy. The expert who guides you through the process should be able to explain the rationale of his approach and have a good grasp of the vast realm of techniques available. These include efficient test designs like full-factorial, fractional-factorial and Plackett-Burman designs, plus a veritable A-to-Z of specialized tools: axial runs, the Bonferroni method, confounding, dispersion effects and experimental units, plus orthogonality, projectivity and quadratic effects, down to the X-, Y- and Z-components of interaction.

Various designs and techniques are appropriate for different marketing programs and objectives. For example, Plackett-Burman designs work well for testing 10-20 creative elements very efficiently in high-production-cost direct mail programs. Fractional-factorial designs are flexible and powerful for testing 5-15 creative elements and select interactions in e-mail and Internet programs. For product, price and offer testing—where elements are known to be important and interactions can be very large and valuable—full-factorial designs often are best. The number and type of test elements, cost and constraints on the number of “recipes” you can create, and the desired speed and precision of the test are among the issues that impact your choice of test design and strategy.
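As a minimal sketch of the “organized way” these designs arrange combinations, the Python snippet below enumerates a full-factorial design for three two-level elements. The element names are hypothetical, chosen only for illustration:

```python
from itertools import product

# Three two-level test elements (illustrative names):
# -1 = control level, +1 = new idea.
elements = ["subject_line", "headline_size", "price_display"]

# Full factorial: every combination of levels, 2**3 = 8 recipes.
design = list(product([-1, +1], repeat=len(elements)))

for i, recipe in enumerate(design, start=1):
    settings = {e: lvl for e, lvl in zip(elements, recipe)}
    print(f"recipe {i}: {settings}")

print(len(design))  # 8 recipes
```

Fractional-factorial and Plackett-Burman designs run a carefully chosen subset of these combinations instead of all of them.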

Since the dawn of direct marketing, split-run techniques (also called A/B splits, test-control or champion-challenger testing) have been the standard for marketing testing. You may have a long-running (or “control”) banner advertisement and test it against another version with only the tagline changed, so any difference in click-through and conversion can be attributed to this one variable alone.

In contrast, one multivariable test design is made up of a number of related test “recipes.” Instead of the one-variable change of a split-run test, one new banner ad in a multivariable test would include a number of changes—perhaps the new tagline along with a control graphic, new price, additional starburst and control background color. Each of these versions has a unique combination of all elements in the test, and each provides one new piece of data on every test element. By analyzing all recipes together but grouping the data in different ways, you can separate the precise impact of each change. The statistical structure requires that the creative execution accurately follow the defined test recipes.

Scientific multivariable tests have four key advantages over split-run techniques: you can test many marketing elements at once; you can use the same small sample size as an A/B split; the results quantify the impact of each element alone (main effect) and in combination with others (interaction); and a vast array of techniques is available to customize your approach.
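The “grouping data in different ways” behind main effects can be sketched in a few lines of Python. The design and conversion numbers below are synthetic, chosen only to illustrate the calculation, not taken from the article’s tests:

```python
# Estimate main effects from a balanced two-level design.
# Rows: one recipe per row; levels -1 (control) / +1 (new idea).
design = [
    (-1, -1), (+1, -1), (-1, +1), (+1, +1),  # 2x2 full factorial
]
conversion = [1.00, 1.10, 0.95, 1.05]  # % conversion per recipe (synthetic)

def main_effect(element: int) -> float:
    """Mean response at the +1 level minus mean response at -1."""
    hi = [y for row, y in zip(design, conversion) if row[element] == +1]
    lo = [y for row, y in zip(design, conversion) if row[element] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

print(main_effect(0))  # element A: (1.10+1.05)/2 - (1.00+0.95)/2 = +0.10
print(main_effect(1))  # element B: (0.95+1.05)/2 - (1.00+1.10)/2 = -0.05
```

Because every recipe contributes to every element’s comparison, each effect is estimated from the full sample rather than a small slice of it.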

The best e-mail recipe

A large Internet retailer/cataloger wanted to increase e-mail conversion. With 2-3 e-mail drops per week to a customer base of 450,000, conversion rate averaged 1% per campaign. The team had a challenge pinpointing what worked best because they continually changed e-mail creatives and offers to keep the program fresh.

After brainstorming 42 ideas, the team narrowed the list down to 18 bold, independent test elements for one multivariable test made up of 20 different combinations, or recipes, of all 18 elements. Four of these “recipes” are shown on p. 75 (with control levels in black and the new ideas in orange). A direct subject line might be something like “Save 20% through Friday.” A creative subject line would change that to “Awesome new products and super savings that won’t last.”

Recipe 20 was simply the control. All other recipes had about half the elements set at the control level and half at the new level, but a different half-and-half for each recipe. Though these four may look like random combinations, all recipes fit within the precise statistical test design. Like the pieces of a puzzle, all recipes fit together to provide accurate data on the main effects and important interactions of all 18 elements.

The team ran the same test across three different types of promotions to see promotion-specific effects plus elements that were important across all campaigns. Results for the first campaign are shown below, including the 18 main effects (shown in the bar chart) and one key interaction (in the line plot).

In the chart, main effects are arranged from the largest (D, at the top) to the smallest. Test elements are listed on the left with the “new idea” shown in parentheses. The length of the bar and the label show the size of the main effect. The +/- sign on the effect shows whether the new idea is better (positive effect) or the control is better (negative effect). The dashed line of significance is a measure of experimental error. All effects below that line (less than 6%) can be explained simply by random variation. Effects are shown as a percentage change from the control, so a 10% effect would increase conversion rate from 1% for the control to 1.1%.

Four main effects were clearly significant: Product selection had the largest effect. Conversion rate increased by 10% when best-selling products (D+) were promoted instead of “unique” products. A larger headline with color (J+) increased conversion by 8.9%. Offering three products decreased conversion by 8.2% vs. one product (E-). Finally, the creative subject line (A+) beat the direct subject line by 6.9%.

Immersed deeper within the unique statistical structure is a wealth of information about interactions. On the surface, main effects show individual changes that increase conversion. Interactions show how these effects may ebb or flow depending on the relationship among marketing-mix elements.

The line plot on p. 76 shows the AB interaction. The main effect of A (subject line theme) changes significantly depending on whether certain words are capitalized in the subject line (element B). Supporting the main effect in the bar chart, the creative subject line is always better than the direct offer (going from left to right), but the impact is much greater with no capitalization (B+, orange line). This interaction shows that (1) capitalizing words in the subject line does have an impact (B+: no capitalization) and that (2) without capitalization, the effect of the creative subject line (A+B+) is about 40% larger than shown by the main effect in the bar chart. Interactions not only offer deeper insights into the true relationship among elements, they also help better quantify the impact of the optimal combination of elements.
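The kind of interaction read off such a line plot can be computed from a 2x2 table of cell means. The conversion numbers below are synthetic stand-ins, not the article’s data:

```python
# Interaction of elements A and B from a 2x2 table of mean
# conversion rates (synthetic numbers for illustration).
means = {
    (-1, -1): 1.00,  # direct subject line, capitalization
    (+1, -1): 1.04,  # creative subject line, capitalization
    (-1, +1): 1.02,  # direct subject line, no capitalization
    (+1, +1): 1.16,  # creative subject line, no capitalization
}

# Effect of A at each level of B:
effect_a_when_b_lo = means[(+1, -1)] - means[(-1, -1)]   # 0.04
effect_a_when_b_hi = means[(+1, +1)] - means[(-1, +1)]   # 0.14

# AB interaction: half the change in A's effect across levels of B.
ab_interaction = (effect_a_when_b_hi - effect_a_when_b_lo) / 2
print(ab_interaction)  # ~0.05: A's payoff is larger without capitalization
```

A split-run test that never varied B would report only a single averaged effect for A and miss this structure entirely.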

What’s the difference?

In this case, what was the advantage of using multivariable testing? Well, if the team had used simple split-run techniques instead:

- Testing all 18 elements in one drop, not one effect would have been significant, since the line of significance would have been 2 1/2 times higher (only effects greater than 16% would be significant).

- For equal confidence, the team would have had to test only one element per drop, requiring 18 campaigns, with no way to separate seasonality or differences among campaigns.

- The team would never have seen the AB interaction (and others), and capitalization would appear to have no impact.
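A rough sketch of why the significance threshold rises: under simplifying equal-variance assumptions, splitting a fixed sample across k separate one-at-a-time tests shrinks each test’s sample to n/k, and the standard error of a mean scales as one over the square root of the sample size. (This simple sqrt(k) scaling does not reproduce the article’s exact 2 1/2x figure, which depends on the specifics of their design.)

```python
import math

def standard_error_ratio(k: int) -> float:
    """Inflation in standard error when a fixed sample of n responses
    is split across k separate A/B tests (n/k each) instead of one
    balanced multivariable design that reuses all n for every effect.
    Since SE of a mean scales as 1/sqrt(sample size), the ratio is sqrt(k).
    """
    n = 1.0                               # total sample (cancels out)
    se_mvt = 1.0 / math.sqrt(n)           # every effect uses all n
    se_split = 1.0 / math.sqrt(n / k)     # each split test gets n/k
    return se_split / se_mvt

print(standard_error_ratio(18))  # ~4.24 under these assumptions
```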

An extreme case of multivariable testing was one banner ad test of 26 elements. Testing 10 graphical elements, 9 messages, pop-ups, drop-downs, animation and other marketing tactics, the biggest challenge in this test was defining the elements and managing recipes to avoid completely absurd combinations (imagine all these words and graphics stuffed into one banner). In cases like this, the “art” of testing—defining clear, bold, independent test elements and creating a test design with recipes that push the limits of market knowledge without falling apart in execution—is equally as important as the science. In this case, eight significant main effects and one very profitable interaction led to a 72% jump in conversion. The test was completed in four weeks. For equal confidence, split-run tests would have required 14 months.

An Internet retailer of consumer gifts ran a landing page test of 23 elements for three weeks and pinpointed seven changes to increase sales (and six “good” ideas that hurt) for a 14.3% jump in sales. Twenty-three separate A/B splits would have required over 40 weeks to achieve equal statistical confidence. This test paid for itself 10 days after results were implemented.

Getting started

Multivariable testing is most effective for retailers who have many ideas to test and the flexibility to create numerous recipes within a high-value marketing program. Key decisions in launching a retail test include: choosing the right test elements and levels for each, deciding which of all possible combinations should be executed for a valid test, creating and executing all recipes, and collecting data and analyzing results.

Software platforms offered by firms like Optimost and Offermatica can simplify the process of creating recipes and analyzing results for Internet tests. Consultants focused on the specialized statistics and strategies of testing can help guide you through the process, especially for e-mail and offline programs (like direct mail, media advertising and in-store tests). For outside assistance, you may want to budget about $10,000 per month for ongoing support. In return you can expect to reduce your learning curve and increase your testing efficiency and return on investment. Another option for small firms is the free, bare-bones service from Google for AdWords advertisers called the Website Optimizer.

Testing remains an integral part of every good marketing program. Like trading in your dinghy for a clipper ship, launching a multivariable test brings you the power and freedom to move faster through turbulent marketing channels. With an experienced guide to show you the way, scientific testing offers greater agility to respond to market changes, streamline your retail programs and explore new opportunities for growth.

Gordon H. Bell is president of LucidView, a marketing consulting firm specializing in scientific testing techniques. Roger Longbotham is senior statistician at Inc., where he oversees the multivariable tests on and conducts data mining studies related to customer behavior.

SugarCRM Debuts Multichannel Marketing, Analytics

SugarCRM, a provider of open source CRM, has added multichannel marketing and business analytics functionality to its Sugar Open Source, Sugar Professional and Sugar Enterprise product lines.

This development is welcome news to those companies that are investigating open source as a viable alternative to packaged and on-demand CRM systems — a group that, while still minuscule, is becoming increasingly vocal about the benefits of do-it-yourself CRM.

“For a business model that owns, at best, less than one percent of the CRM market, they are getting a lot of buzz,” Yankee Group CRM analyst Sheryl Kingstone told CRM Buyer.
Enhancing Marketing

SugarCRM’s multichannel marketing features include the campaign wizard to set up and execute a campaign; campaign manager, which tracks the opportunities generated and closed by the campaign across customer channels; and automated lead capture, which can integrate Web leads into SugarCRM, associating them with a specific campaign.

Other features provide for better management of e-mail marketing, online advertising, newsletters, search engine marketing, list rentals, telesales programs, webcasts and traditional advertising.
New Business Analytics

The platform’s new business analytics include features that generate ad-hoc, multi-module reports to analyze marketing, sales and customer support. These reports can be rendered in multiple formats such as pie charts or line graphs.

The multichannel marketing functionality is included in Sugar Open Source, Sugar Professional and Sugar Enterprise; business analytic capabilities are included in Sugar Professional and Sugar Enterprise only.
Use Case

Three years or so out of the gate, open source CRM may not have grabbed as much market share as initial speculation had suggested; however, it is suitable for a wide range of companies.

Users could range from small companies that do not want to invest significant money in commercial systems, to large enterprises that want highly tailored systems, to firms that want to support open source, Kingstone said.

Additionally, companies in micro verticals — which the vendors have yet to penetrate — would also be interested in developing their own open source CRM system.

However, when faced with a buying decision, firms are more likely to continue to opt for the more conventional choices.

“I see (NYSE: CRM) going up against Siebel or SAP (NYSE: SAP). I never see open source as an option in these deals,” Kingstone stated, adding that if she ever were to consider open source as a possibility, SugarCRM would likely be the tech provider.

Built on the LAMP (Linux, Apache, MySQL, PHP) platform, SugarCRM was one of the first vendors to come to market with an open source CRM platform and has since gained significant traction.

Schools use analytic tools to address student performance

SPSS software used to develop performance models in grades K-12

As students head back to their pencils and books, school districts in several states are turning to predictive analytic tools to meet the data aggregation and analysis requirements of the No Child Left Behind Act and to focus teacher efforts on boosting academic performance.

Research firms Analytic Focus LLC and Reveal Technologies LLC have partnered with Chicago-based SPSS Inc. to help develop models that can predict student performance in grades K-12 based on current instructional methods used in a school. School districts in New York, Colorado, Minnesota, Alabama and Iowa will put in place the SPSS predictive analytics tools and participate in this new program, according to an SPSS announcement.

In addition, the Naperville, Ill., school district, located in the suburbs of Chicago, this summer has been training principals in its 21 schools how to use SPSS’s predictive analytics software, said Alan Leis, superintendent of the Naperville Community Unit School District 203.

“No Child Left Behind [legislation] forces us to focus on individual student data… and large groups by schools,” Leis said. “[SPSS] will allow us to see which students are on a normal growth path and which students are below it… and to predict which students are most at risk for not meeting achievement standards.”

The district began working with SPSS last year to build a master data warehouse that could pull together data from disparate databases containing test scores, demographic data and other information needed for predictive analysis, Leis added. This school year, the district will begin using the software to analyze data and build growth plans for schools and the district’s 19,000 students. The software will replace the time-consuming process of manually analyzing data from test score spreadsheets, Leis added.

“Now we can give [users] a CD with all this data on it so they can do the what-if analysis,” he said. “It allows you to not spend all this time figuring out the data but… figuring out what you did right and what you need to do better.”

Leis said he hopes to eventually expand the use of the software to the district’s 1,200 teachers.

Phil Ashworth, coordinator of testing data for the Hamilton County school district in Chattanooga, Tenn., said he has been using SPSS predictive analytics software for several years to analyze testing data. A year ago, he added SPSS’s Clementine data mining tool to the mix to provide a graphical representation of test scores from the district’s 40,000 students.

The tool allows him to set up the parameters for analysis and to run a report and apply those parameters to any of the 80 schools in the district without having to rewrite any of the instructions, he said. In addition, while testing is commonly done in the spring, it is at the beginning of the school year that teachers need to know their students’ strengths and weaknesses, Ashworth said. The SPSS tools allow him to provide testing data to each teacher at the beginning of the year, he said.

Analytical CRM gains rapid adoption: Datamonitor

London: Analytical customer relationship management (aCRM) technology, considered the logical evolution of the CRM lifecycle, is being adopted by enterprises on a broader global scale. This is revealed in a new report by the London-based independent market analyst Datamonitor (DTM.L).

The report, Analytical CRM, forecasts global enterprise investment in aCRM will grow from an estimated $2.3 billion today to over $3 billion in 2009.

aCRM (a subsection of the wider business intelligence [BI] market), whilst complex, is a compelling technology. By employing aCRM analytics, businesses stand to gain a fuller understanding of their customers in order to serve them better, thus increasing customer longevity and generating more profit.

“The aCRM, and wider CRM market is going through a period of exciting change,” says Tom Pringle, technology analyst at Datamonitor and author of the study. “High and stable growth reflects the value businesses place on understanding more about their customers. However, vendors will need to make every effort to educate enterprises. Many are still confused by the concept and technologies that constitute BI.”

According to Pringle, growth is already at a high, stable level, a reflection of some maturity in the wider BI market in North America and Western Europe, and strong growth in the APAC and CALA regions.

aCRM is the active collection, concentration and analysis of data gathered about customers and their interactions with the business. It represents the next logical step in the CRM lifecycle, making use of the customer data already held within the enterprise. This analysis is then used to generate value both for the enterprise and for its customers. As part of the wider CRM project, it also encompasses cultural change at every level: the creation of a customer-focused business.

The report reveals that the clear lead adopters are found in the financial services, retail, manufacturing and communications industries around the globe. However, with confusion and a lack of understanding among end users regarding aCRM and its uses, it will be imperative for vendors to educate enterprises across all verticals on aCRM’s function, workings, uses and benefits.

“Vendors in this space need to tread carefully to exploit the opportunities which exist. Market education is a clear requirement, with many potential users confused by the range of technological options available to them, and a lack of understanding around the uses of aCRM. There are clear signs marking the appropriateness of aCRM for different enterprises and vendors will do well targeting those that display them.”

Note: Datamonitor’s report defines the market as both a technology and a concept. It simplifies the technologies, uses and user groups found in the aCRM market, covering the full range of aCRM technologies, including ETL, data quality, data warehousing and BI tools. The development of aCRM technologies is discussed as part of the changes seen in the wider BI market, and a full market sizing by geography, vertical and technology is also given. The report also covers the varied ways in which aCRM is currently deployed and how this will change.

How to grow your own analytics team

Readers have asked how to start a dedicated Web analytics team within an enterprise. Smart question. In a previous column I said many companies fail to squeeze enough juice from the analytics grape because they don’t assign genuinely committed or qualified resources, or they consign analytics to a committee already tasked with too much else.

Suppose you had a chance to get it right. What kinds of people suit the roles? What skills are paramount? If you had the luxury of creating a new position in these tight times, how would you advertise it? We work with smart, nimble boutique firms and Fortune 100 juggernauts alike. When an enterprise analytics team clicks, it clicks for similar reasons, no matter the scale. Looking at successful teams, I see recurring models, assignments, and action plans.

A good foundation typically includes two starring roles: a technical lead and a champion.

Technical Lead

Whether they use an installed software solution or a hosted solution, companies I know that use analytics successfully invariably have an analytics technical lead. Key responsibilities include:

* Manage software and servers (depending on tracking tool).

* Manage a “tagging strategy”: Determine the best way to place visitor-tracking tags on Web site pages and ensure those pages show up correctly and consistently in the analytics tool. Work with third-party content providers and partners to tag pages.

* Ensure new pages or site changes carry proper tracking tags.

* Manage tracking tool changes and upgrades.
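One concrete way a technical lead keeps a tagging strategy honest is an automated check that every page carries the standard tracking snippet. Below is a minimal sketch in Python; the tag pattern (`analytics.js`) and the page inputs are hypothetical placeholders, since the real snippet depends on the vendor tool in use.

```python
import re

# Hypothetical snippet the team standardizes on; substitute the actual
# vendor tag (e.g. the hosted script URL your analytics tool provides).
TAG_PATTERN = re.compile(r'<script[^>]*src="[^"]*analytics\.js"', re.IGNORECASE)

def pages_missing_tag(pages):
    """Return names of pages whose HTML lacks the tracking snippet.

    `pages` maps a page name to its HTML source.
    """
    return [name for name, html in pages.items()
            if not TAG_PATTERN.search(html)]

pages = {
    "home.html": '<html><head><script src="/js/analytics.js"></script></head></html>',
    "new-landing.html": "<html><head><title>Untagged</title></head></html>",
}
print(pages_missing_tag(pages))  # → ['new-landing.html']
```

Run on each release, a check like this catches the "new pages or site changes" case above before untagged traffic goes unmeasured.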

Analytics Lead

The analytics lead is a champion, an evangelist, an advocate. This person is responsible for promoting analytics’ value throughout the organization. She often has a strong background in business strategy, Web strategy, or both. She has a solid understanding of your particular analytics tool. She knows what data it yields, how to research and troubleshoot, and how to report the benefits in a pithy, accessible way. Key responsibilities include:

* Help key stakeholders define site or section goals. Define the data readouts they need to drive improvements.

* Develop a data distribution strategy featuring weekly, monthly, and quarterly reports, as well as ways to get them in front of stakeholders.

* Help interpret data as a basis for site architecture or design changes.

* Drive A/B testing to solve specific site problems identified through analytics.
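For the A/B testing responsibility, one common building block is deterministic visitor bucketing, so a returning visitor always sees the same variant. This is a generic sketch, not a feature of any particular SPSS or analytics product; the visitor and experiment names are illustrative.

```python
import hashlib

def ab_bucket(visitor_id, experiment, split=0.5):
    """Deterministically assign a visitor to variant 'A' or 'B'.

    Hashing visitor and experiment together means each experiment gets an
    independent split, while any one visitor's assignment is stable.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    fraction = int(digest[:8], 16) / 0xFFFFFFFF  # map hash to [0, 1]
    return "A" if fraction < split else "B"

# A given visitor always lands in the same bucket for a given experiment.
assert ab_bucket("visitor-42", "homepage-redesign") == ab_bucket("visitor-42", "homepage-redesign")
```

The analytics lead can then compare the tracked metrics for the two buckets to decide which variant actually fixes the problem the data surfaced.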

Influential Circle and Beyond

These two roles call for very different skill sets. It’s hard for one person to cover both. Organizations that can dedicate two full-time people to these positions typically gain more ground faster in the analytics derby. Then, there’s the influential circle that can make or break a fledgling analytics team:

* Executive management supports the initiative and helps define Web site success metrics at the highest level.

* Web designers, information architects, and developers implement data-driven improvements.

* Business group leads sponsor the effort and act on the findings within their own groups.

As the team gains traction, it can add more analysts, work with outside firms to augment or train (that’s where I typically come in), or even designate analytics “owners” inside each business group to track data reports and marry them to site improvements.

Our goal is to get the whole enterprise focused on analytics. This hardly ever works when a big committee tries to run analytics. To get fast results, start with a dedicated technical lead and an evangelist to spotlight the outcome.