At A Glance: Foofoo.com Inc. (foofoo.com)

The company: Based in Arlington, Va., Foofoo.com is a content/e-commerce site focused on "the fun and finer things in life."

The problem: Foofoo.com wanted a way to analyze and monitor the interests of visitors to its site so it could present the most appropriate products and services to its upwardly mobile clientele.

The solution: net.Genesis Corp.'s net.Analysis clickstream tool is the first step in analyzing customer activity on Foofoo.com's Web site. With it, the business is able to tailor its editorial content, provide the right mix of products, and test and target marketing campaigns to reach out to its upscale customers. Eventually, the company plans to add demographic and customer data from its production SQL Server 7.0 databases, as well as catalog information, to round out its customer retention efforts.
Hawken, like many execs, learned that guesstimating customer preferences is no way to maintain an edge in today's frenetic and fiercely competitive e-commerce landscape. With Web logs essentially providing a detailed roadmap of how visitors move through a site, companies are sitting on a wealth of information that can be mined to answer basic questions such as what customers are buying and which promotions are generating the most traffic. As a result, Foofoo.com and others are turning to new clickstream tools, which collect behavioral data as individuals interact through their browsers with remote Web sites, to analyze customer activity. Others are pushing the concept further, using fresh iterations of data warehouse products to drill down into Web log data and marry the results with operational data in enterprise systems and information from external data sources--all in the quest to uncover customer trends.

Whether you call it Webhousing or e-intelligence, this exercise is the next big step for dot-com and click-and-mortar companies trying to build a lucrative e-business. "The idea is to get closer to customers and have them return," says Doug Laney, senior program director of application delivery strategy services at research firm META Group Inc., in Chicago. By leveraging the new tools, companies can deliver a personalized buying experience, tailor marketing programs more effectively, and redesign sites to generate optimal traffic. In short, clickstream data lets companies give visitors reasons to stay. "You're setting up a barrier to exit because, in the Web world, it's too easy in a single click for a customer to become a noncustomer," Laney says. In Foofoo.com's case, net.Genesis Corp.'s net.Analysis clickstream tool helped Hawken and his crew determine that visitors to the site read up on home electronics but left without buying.
Customers were clicking on the editorial content, but there was minimal activity in the related shopping portion of the site. Based on this nugget and other insights garnered from net.Analysis, Foofoo.com redesigned its site and eliminated electronics equipment from its inventory right after the 1999 holiday season. "We made a lot of decisions solely based on what the data showed us," Hawken says. "While we can't tap into people's psyches as much as we'd like, we can find out where they're going and what their preferences are."

A warehouse of Web wares

No matter how alluring the prospect, companies are bound to face a new set of challenges in trying to turn raw Web data into gold. For one thing, there's the issue of scalability: Webhouses draw on detailed information to find patterns, and thus require considerably more data to be stored and managed than their enterprise cousins. There are also architecture considerations, since companies eventually want to react in real time to the information they collect. For example, an e-commerce site would like to bring a customer directly to a page customized with only the products and promotions of interest to that particular individual. As companies move beyond clickstream analysis to an enterprise Webhouse focus, they also face a formidable task in integrating the data collected from Web logs with existing back-end systems and external data sources.
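The kind of content-versus-commerce gap Foofoo.com uncovered can be illustrated with a minimal sketch in Python. The session IDs, URL paths (/editorial/..., /shop/...), and simplified log format below are hypothetical, invented for illustration; they are not net.Analysis's actual mechanics or Foofoo.com's real site structure.

```python
from collections import defaultdict

def views_vs_buys(log_lines):
    """For each product category, count sessions that viewed editorial
    content versus sessions that also reached the related shop pages."""
    viewed = defaultdict(set)   # category -> sessions that read content
    shopped = defaultdict(set)  # category -> sessions that hit the shop
    for line in log_lines:
        session, path = line.split()
        parts = path.strip("/").split("/")
        if len(parts) >= 2:
            area, category = parts[0], parts[1]
            if area == "editorial":
                viewed[category].add(session)
            elif area == "shop":
                shopped[category].add(session)
    # Report the conversion gap: readers who never made it to the shop.
    return {cat: (len(viewed[cat]), len(shopped[cat] & viewed[cat]))
            for cat in viewed}

logs = [
    "s1 /editorial/electronics",
    "s2 /editorial/electronics",
    "s3 /editorial/electronics",
    "s1 /shop/wine",
    "s3 /editorial/wine",
    "s3 /shop/wine",
]
print(views_vs_buys(logs))
# {'electronics': (3, 0), 'wine': (1, 1)}
```

Here electronics drew three readers, none of whom shopped, while the lone wine reader did. A production clickstream tool works from real server logs and far larger volumes, but the underlying question is the same.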