At A Glance: Foofoo.com
The company: Based in Arlington, Va., Foofoo.com is a content/e-commerce site focused on “the fun and finer things in life.”
The problem: Foofoo.com wanted a way to analyze and monitor the interests of visitors to its site so it could present the most appropriate products and services to its upwardly mobile clientele.
The solution: net.Genesis Corp.’s net.Analysis clickstream tool is the first step in analyzing customer activity on Foofoo.com’s Web site. With it, the business is able to tailor its editorial content, provide the right mix of products, and test and target marketing campaigns to reach out to its upscale customers. Eventually, the company plans to add demographic and customer data from its production SQL Server 7.0 databases as well as catalog information to round out its customer retention efforts.
Foofoo.com, a self-proclaimed content/e-commerce site focused on “the fun and finer things in life,” has a pretty good notion about how to cater to its upwardly mobile clientele. Luxurious bath products, fashions from the toniest designers, and gourmet foods are just some of its top-selling items, and visitors to the site can soak up articles on topics such as trendy travel destinations and home decorating tips.
Given these proclivities, Foofoo.com execs assumed that high-end electronics gear like DVDs and home theater systems would be equally enticing. So in preparation for the 1999 holiday season, they loaded up on content and products in this category. But when the season came and went, the electronics merchandise didn’t. “While internally, the hypothesis was this was a good product line to get into, it turns out there was mild to low interest,” says Philip Hawken, director of operations at Foofoo.com Inc., in Arlington, Va.
Hawken, like many execs, learned that guesstimating customer preferences is no way to maintain an edge in today’s frenetic and fiercely competitive e-commerce landscape. With Web logs essentially providing a detailed roadmap of how visitors move through a site, companies are sitting on a wealth of information that can be mined to answer basic questions like what customers are buying and what promotions are generating the most traffic. As a result, Foofoo.com and others are turning to new clickstream tools, which collect behavioral data as individuals interact through their browsers with remote Web sites, to analyze customer activity. Others are pushing the concept further, using fresh iterations of data warehouse products to drill down into Web log data, and marry results with operational data in enterprise systems and information from external data sources–all in the quest to uncover customer trends.
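The raw material behind these tools is the ordinary Web server log. As a rough illustration of the kind of question a clickstream tool answers, the Python sketch below tallies page requests from log lines in the Common Log Format; the hosts, URLs, and sample lines are invented, and real products of this type do far more.

```python
import re
from collections import Counter

# Hypothetical sample lines in the Common Log Format, the de facto
# standard for Web server logs of this era.
LOG_LINES = [
    '10.0.0.1 - - [15/Dec/1999:10:24:02 -0500] "GET /electronics/dvd.html HTTP/1.0" 200 5120',
    '10.0.0.2 - - [15/Dec/1999:10:25:11 -0500] "GET /bath/soaps.html HTTP/1.0" 200 3072',
    '10.0.0.1 - - [15/Dec/1999:10:26:45 -0500] "GET /electronics/dvd.html HTTP/1.0" 200 5120',
]

# host, identd, user, [timestamp], "method url protocol", status, size
CLF = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) (\S+)')

def page_counts(lines):
    """Tally successful requests per URL -- the most basic clickstream question."""
    hits = Counter()
    for line in lines:
        m = CLF.match(line)
        if m:
            host, ts, method, url, status, size = m.groups()
            if status == "200":  # count only successful page views
                hits[url] += 1
    return hits

print(page_counts(LOG_LINES).most_common(2))
```

Even this toy version surfaces the pattern Foofoo.com cared about: which parts of the site draw traffic and which do not.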
Whether you call it Webhousing or e-intelligence, this exercise is the next big step for dot-com and click-and-mortar companies trying to build a lucrative e-business. “The idea is to get closer to customers and have them return,” says Doug Laney, senior program director of application delivery strategy services at research firm META Group Inc., in Chicago. With the new tools, companies can deliver a personalized buying experience, tailor marketing programs more effectively, and redesign sites to generate optimal traffic. With clickstream data, companies can give visitors reasons not to leave their sites. “You’re setting up a barrier to exit because, in the Web world, it’s too easy in a single click for a customer to become a noncustomer,” Laney says.
In Foofoo.com’s case, net.Genesis Corp.’s net.Analysis clickstream tool helped Hawken and his crew determine that visitors to the site read up on home electronics, but left without buying. Customers were clicking on the editorial content, but there was minimal activity in the related shopping portion of the site. Based on this nugget and other insights garnered from net.Analysis, Foofoo.com redesigned its site and eliminated electronics equipment from its inventory right after the 1999 holiday season. “We made a lot of decisions solely based on what the data showed us,” Hawken says. “While we can’t tap into people’s psyches as much as we’d like, we can find out where they’re going and what their preferences are.”
A warehouse of Web wares
No matter how alluring the prospect, companies are bound to face a new set of challenges in turning raw Web data into gold. For one thing, there’s the issue of scalability. Webhouses draw on detailed information to find patterns, requiring considerably more data to be stored and managed than their enterprise cousins. There are also architecture considerations, since companies eventually want to react in real time to the information collected. For example, e-commerce sites would like to bring a customer directly to a page customized with only the products and promotions of interest to that particular individual. As companies move beyond clickstream analysis to an enterprise Webhouse focus, they also face a formidable task in integrating the data collected from Web logs with existing back-end systems and external data sources.
Webhousing wares
Accrue Software Inc.
Broadbase Software Inc.
Chutney Systems LLC
IBM Corp.’s SurfAid Analytics
SAS Institute Inc.
WebSideStory Inc. Offers HitBox, a Web audience analysis service; HitBox.com, a Webmaster resource center and community of independent Web sites; and StatMarket.com, a source of data on Internet user trends.
“With e-commerce, you need to be talking about a warehouse that’s customer-centric as opposed to traditional management information warehouses, which store information on the performance of a company and summarize it so people can see trends,” explains John McIntyre, director of global marketing at SAS Institute Inc., in Cary, N.C. “A customer-centric look is more likely to be augmented with external data. And to really personalize your relationship with customers, you need to take information from all points of contact that a company has with the customer to get the richest profile.”
SAS and many of the traditional data warehousing vendors, like Oracle Corp., see Webhousing as a natural extension of their product lines. For example, SAS is positioning its existing suite of tools, augmented by new additions, as a way for companies to produce reports, do analysis on their Web traffic, and develop rich customer profiles. SAS is also introducing what it calls Knowledge Solution add-ons to its Enterprise Miner datamining tool for specific functions like cross-selling, available since January 2000, and fraud detection and churn, which will be available in the first quarter of 2000.
Oracle also insists that Web business intelligence has to be part of an overall enterprise data warehouse effort to give companies a holistic view of their customers, according to Jagdish Mirani, senior director for Oracle’s Data Warehouse Program Office in Redwood Shores, Calif. The company’s Intelligent WebHouse, as it’s calling its end-to-end solution, comprises existing products, including Oracle Reports, the Darwin datamining tool, the Express multidimensional database, and Discoverer for ad hoc analysis. In March 2000, Oracle released Oracle Warehouse Builder, a lifecycle management tool for integrating data from enterprise resource planning (ERP) systems, Web sites, and external data sources into a single warehouse.
Along with traditional players, there are many newcomers to this Webhousing space (see “Webhousing wares”). Most of the upstarts–for example, net.Genesis, in Cambridge, Mass., and Accrue Software Inc., in Fremont, Calif.–are focused on clickstream analysis tools, while others like E.piphany Inc., of San Mateo, Calif., and Menlo Park, Calif.-based Broadbase Software Inc. deliver marketing campaign solutions. While clickstream tools basically analyze Web log data, detailing how customers move through a site, marketing campaign solutions offer a robust suite of tools for things like customer identification and analysis and real-time personalization capabilities to aid in one-to-one marketing.
Enterprise data warehouse vendors and many consultants contend that the clickstream category of tools doesn’t provide sufficient customer information for comprehensive personalization and profiling. “Web log data is not suited for customer relationship management,” says Augie MacCurrach, principal of technology at DiaLogos Inc., an e-business consulting firm in Boston. “It was originally built to help developers test whether a site was working; there’s no standard definition for what a unique user is or what a unique session is.”
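MacCurrach’s point about sessions can be made concrete. In the absence of a standard, a common heuristic is to treat hits from the same host as one session until that host goes idle for longer than some timeout, often 30 minutes. The sketch below illustrates that heuristic; the hosts, timestamps, and timeout are invented for illustration, and the approach misidentifies users behind shared proxies, which is exactly the weakness the critics cite.

```python
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)  # a common heuristic, not a standard

def sessionize(hits):
    """Group (host, timestamp) hits into sessions: a new session starts
    whenever the same host has been idle longer than SESSION_TIMEOUT."""
    sessions = {}   # host -> list of sessions, each a list of timestamps
    last_seen = {}  # host -> timestamp of that host's previous hit
    for host, ts in sorted(hits, key=lambda h: h[1]):
        if host not in last_seen or ts - last_seen[host] > SESSION_TIMEOUT:
            sessions.setdefault(host, []).append([ts])  # start a new session
        else:
            sessions[host][-1].append(ts)               # continue current one
        last_seen[host] = ts
    return sessions

hits = [
    ("10.0.0.1", datetime(1999, 12, 15, 10, 0)),
    ("10.0.0.1", datetime(1999, 12, 15, 10, 10)),
    ("10.0.0.1", datetime(1999, 12, 15, 11, 30)),  # >30-minute gap: new session
]
print(len(sessionize(hits)["10.0.0.1"]))  # two sessions for this host
```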
While they might not be a total solution, a growing number of companies view clickstream tools as a worthwhile first step. Since its site went live in June 1999, Foofoo.com has not just redefined and refined its product mix based on results from net.Analysis, but it also uses the clickstream tool to generate reports for its content partners. This helps them better target the site’s editorial content to readers’ needs, Hawken says. With net.Analysis, Foofoo.com is also able to track the effectiveness of e-mail campaigns and banner advertising, allowing the company to make more effective use of marketing dollars, not to mention testing ads before making a major investment.
Hawken acknowledges, however, that there’s much more work to be done to get a complete view of the customer. Later in 2000, Foofoo.com plans to use extensions to net.Analysis to tie the Web log data to its catalog and shopper demographic information stored in its production SQL Server 7.0 databases. “We definitely want to extend the tool, but the data we’re getting out of the box is more than enough to make key business decisions before we get into more in-depth data Webhousing,” he explains.
At della.com, a Della and James Inc. site that bills itself as a gift registry or wish-list aggregator, Accrue’s clickstream and datamining tools are just one piece of an overall customer marketing database that the firm is building. Currently, della.com feeds sales order data from its ERP systems and Web logs into a separate customer data warehouse. There it employs E.piphany’s marketing analysis suite to drill down and uncover customer trends, according to Dan McNamara, director of relationship marketing for the San Francisco-based company.
So far, McNamara says the results of the first phase of the project are pretty impressive. Accrue has enabled della.com to determine where customers are coming from and how long they stay on the site, helping company officials see what advertising vehicles draw their most profitable customers. The clickstream tool has also helped create landing pages for people coming in from different ads–the first step toward personalization, he adds.
But it’s when you factor in some of the other pieces of the customer data warehouse project that della.com has seen the greatest return. The site, which started off as a wedding registry, was able to determine with E.piphany that its high-end wedding customers were even more seasonal than originally anticipated. So the firm opted to revamp itself as a general gift registry in July 1999 to balance the spring/summer wedding season with a lucrative business during the fall/winter holiday season. Currently, the pieces of della.com’s customer data warehouse are separate, but McNamara says integration is the next big step, along with adding more sophisticated personalization capabilities, e-mail campaign management, and real-time recommendations.
A custom approach
The need to marry Web log data with existing ERP, supply chain, and data warehouse systems is so compelling that some companies are doing their own custom integration or turning to enterprise-class data management tools. AutoTrader.com LLC, a used car marketplace up and running since June 1998, scrapped a clickstream analysis tool in favor of customizing the SAS suite to tap into data sources other than Web logs. The new system now accesses two Oracle databases–one stocked with spec and availability information on cars, the other with e-mail generated by the site–along with advertising information from its DoubleClick ad server. “We need access to those databases to get the full picture,” notes Jerry Johannesen, MIS manager for AutoTrader.com, in Atlanta.
more.com has also prioritized integration with back-end systems as a way to leverage one-on-one marketing to its customer base, says Andy Felong, vice president of engineering for the San Francisco-based online health, beauty, and wellness product superstore. Using a beta version of Oracle Warehouse Builder, the data warehouse lifecycle tool Oracle released in March 2000, more.com is able to use one tool to integrate product-oriented ERP data from its Oracle Financials systems with session-oriented data from its Web site. By mining this mix of data, more.com has been able to identify profitable customers and then reach out to them with custom marketing and cross-selling programs. “We’re looking for that competitive edge, and we believe by obtaining the information and mining it, we can get it,” Felong says.
For Cyberian Outpost Inc., which aims to provide buyers of electronics and computer equipment with a totally personalized shopping experience, a customized, enterprise Webhouse approach was the only way. With the help of consultants including DiaLogos, Outpost built a system that integrates Sagent Inc.’s data mart environment, BroadVision’s e-commerce engine, SAS tools for datamining, and Rubrix’s campaign management software, now owned by BroadVision. Outpost’s personalization goals–to essentially present a known customer with a first screen that caters to all of his or her interests–couldn’t be accomplished with clickstream tools, says Dan Bachman, director of business intelligence for Outpost, in Kent, Conn. “Most of the clickstream tools read parsed Web logs and report passively on information; there’s no standard way to identify unique sessions or important clicks within a session,” he explains.
To address that shortcoming, Outpost designed what it calls a front-end observation server as part of its Webhouse architecture, which is based on Windows NT. This essentially captures the unique user sessions and feeds them into the Sagent data marts for analysis, which is performed overnight. The results of that analysis are then fed back into the system to generate the customized Web screens. The next step, Bachman says, is to use the observation server to analyze the clickstream data in real time. “But we have to walk before we can run,” he admits.
Ralph Kimball, a veteran data warehouse expert, is also a proponent of creating a real-time or what he calls a “hot response” cache as part of a Webhouse architecture (see diagram, “How to build a Webhouse”). “That way, the data warehouse can continually anticipate questions and provide a whole set of precomputed responses,” says Kimball, president of Ralph Kimball Associates Inc., in Boulder Creek, Calif., and co-author of The Data Webhouse Toolkit (see “Data warehousing meets the Web”).
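The “hot response” idea can be sketched in miniature: an overnight warehouse run precomputes one answer per customer, and the Web tier then serves those answers from a cache at click time instead of querying the warehouse live. The customer IDs, categories, and the favorite-category analysis below are invented stand-ins for whatever a real overnight run would compute.

```python
def nightly_precompute(purchase_history):
    """Stand-in for the overnight batch analysis: map each customer
    to the product category he or she buys from most often."""
    cache = {}
    for customer, categories in purchase_history.items():
        cache[customer] = max(set(categories), key=categories.count)
    return cache

def hot_response(cache, customer, default="bestsellers"):
    """Answer at click time from the precomputed cache; unknown
    visitors fall back to a generic page."""
    return cache.get(customer, default)

# Invented sample data: customer IDs mapped to past purchase categories.
history = {"c1001": ["bath", "bath", "travel"], "c1002": ["gourmet"]}
cache = nightly_precompute(history)
print(hot_response(cache, "c1001"))  # precomputed favorite category
print(hot_response(cache, "c9999"))  # unknown visitor gets the default
```

The design trade-off is the one Bachman describes: the cached answers are only as fresh as the last overnight run, which is why real-time analysis is the next step.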
Whatever the approach, smart Web businesses know that guesswork no longer cuts it when it comes to catering to customers. In today’s wild and woolly Web world, the name of the game is knowing exactly what customers want and when they want it. And that makes all the difference.
Beth Stackpole is a freelance writer living in Newbury, Mass. She can be reached at [email protected]
Data warehousing meets the Web
Many companies are still struggling with whipping enterprise data warehouse efforts into shape, and with the introduction of the Web, the exercise becomes far more daunting. Luckily, two data warehouse veterans have taken an early stab at defining and describing what they claim is the data warehouse reborn: the data Webhouse, a new entity at the center of the Web revolution.
As described by authors Ralph Kimball and Richard Merz in their book, The Data Webhouse Toolkit, published by John Wiley & Sons Inc., the new data Webhouse will be the engine that controls or analyzes the Web experience. As such, it increases the importance of the technology while changing its very nature from the data warehouses of the past decade. In the book, written for designers and project managers in IT organizations, Kimball and Merz lay out the differences between the two generations, provide a detailed roadmap for designing and modeling a data Webhouse, and discuss how to extend and adapt existing data warehouses to accommodate this critical Web component.
The Webhouse, the authors contend, has two personalities, which are reflected in the structure of the book. The first half describes bringing the Web to the warehouse. At the center of this discussion is understanding and leveraging the raw clickstreams–behavioral data collected as individuals interact through their browsers with remote Web sites–as another source of information to be massaged and integrated into a data warehouse. The second part of the book is keyed to bringing the existing data warehouse to the Web, which the authors say is essentially making all interfaces such as reporting, application development, and systems administration accessible via the browser.
The book is peppered with practical design tips and threads the theme of customer relationship management throughout its 16 chapters. Kimball and Merz do a nice job of balancing coverage of the business cases for embarking on one of these Webhouse projects–for instance, how to use the information to determine profitability of a Web business, to create customized marketing activities, or to assemble a clickstream value chain with customers or suppliers–with detailed, how-to technical discussions on everything from cookies to datamining to modeling data marts specifically for clickstream data. There is also ample space devoted to scalability and security–two highly important requirements and challenges associated with Webhousing.
In keeping with its practical–rather than theoretical–tone, the authors devote Chapter 15 to the special management and organizational issues surrounding Webhouse projects. Included in this discussion is a nice organizational chart that spells out the roles necessary for getting a project of this ilk off the ground and completed successfully.
The authors acknowledge that the book tackles its subject matter at the very early stages of development. And yet while big changes are undoubtedly on the horizon, they make the case that the impact of the Web is so profound that Webhousing is the future for data warehousing. If they’re right, it’s not too early to get acquainted with one’s new environment, making The Data Webhouse Toolkit a worthwhile read. –Beth Stackpole