Gary Angel, President and CTO of Semphonic, a Web analytics consultancy, recently hosted an Omniture Developer Connection User Group in San Francisco. Gary co-founded Semphonic and leads its consulting efforts for companies like American Express, Charles Schwab, Intuit, Genentech, Nokia, Sears and Turner Broadcasting. Gary has published articles on Web and SEM analytics in DM News, American Demographics, CRM Guru, CRM Buyer, iMediaConnection, Business Geographics and Business Insurance. As an early adopter of the Omniture APIs, Semphonic has been building integrated solutions that combine Web analytics with enterprise data to help customers realize new marketing innovation and efficiency through the Omniture platform. After the San Francisco User Group, Gary and I connected to talk more about Semphonic’s use of the Omniture APIs:

Q: At the Omniture developer user group, you mentioned that customers are pulling you into integration projects that really advance beyond current web analytics paradigms. What are customers asking you to integrate and why?

A: One of the things that makes integration projects challenging is that they tend to be pretty unique. I guess I’d say that integrations tend to fall into one of three basic types. The most common is probably the integration of multiple management reporting streams. The vast majority of our customers are multi-channel. For many of them, the ultimate source of truth is their CRM and financial systems – and data from these systems forms the bulk of what gets reported up to senior management. But the top end of the funnel – where online behavior occurs – needs to be represented there as well. The API makes it possible to pull that online behavioral information at the same time as you are tapping into these other systems – and push everything into a single unified (and automated) report. The second type of integration, less common but to me more interesting, is when we pull information so that we can analyze it in more detail. In a way, our SAINT applications are like that – we take advantage of the full-featured regular expression libraries in .Net to build rule-based SAINT tables automatically – something that just isn’t possible otherwise. Finally, we’ve worked on a few integrations where the goal was simply to pull data out of the analytics and into some piece of an online system (usually to push or optimize some part of content). That’s useful but also very provisional – ultimately you’d hope that people use more complete tools for that kind of work.
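
To make the first type concrete, a unified report is often just a scheduled job that pulls from each system and writes one output. A minimal sketch, assuming hypothetical fetch_online_metrics() and fetch_crm_metrics() wrappers (neither is a real Omniture call; they stand in for whatever each system exposes):

```python
# Sketch of a unified management report: online funnel metrics merged with
# CRM/financial numbers into one automated output. Both fetch functions are
# hypothetical stand-ins, not real API calls.
import csv

def fetch_online_metrics(start, end):
    # Would call the web analytics reporting API in practice.
    return {"visits": 120000, "leads": 3400}          # placeholder numbers

def fetch_crm_metrics(start, end):
    # Would query the CRM / financial system of record in practice.
    return {"opportunities": 900, "closed_revenue": 410000.00}

def build_unified_report(start, end, path):
    row = {"period_start": start, "period_end": end}
    row.update(fetch_online_metrics(start, end))      # top of the funnel
    row.update(fetch_crm_metrics(start, end))         # bottom of the funnel
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(row))
        writer.writeheader()
        writer.writerow(row)

build_unified_report("2010-01-01", "2010-01-31", "unified_report.csv")
```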

Q: For those who aren’t familiar with the SAINT (SiteCatalyst Attribute Importing and Naming Tool) API, can you share an example of the kind of data you use it for and why?

A: SAINT is used to generate lookup tables in SiteCatalyst. Probably the most common use is for companies to assign friendly names and rollup categories to campaigns based on a campaign code captured from the URL. But it can be used for almost any kind of variable. Our clients often use SAINT tables for products and merchandising categories, article names, video metadata and more. This all works fine where you control the metadata that comes in (like product id to product name). But it’s nice to be able to categorize external variables as well – things like SEO keywords, article tags, and internal search keywords. To effectively analyze any of these, you need to categorize them. In theory, SAINT lets you do this. But SAINT tables are strictly a 1-to-1 lookup. You can’t apply any logic like “if the keyword phrase contains XXX assign it to Y Category.” That makes SAINT very difficult and manual for this type of application. We use the SAINT API to pull all the values from a SiteCatalyst variable and then apply a series of regular expression or lookup rules to categorize each value – then automatically generate the SAINT table. It’s almost the only practical way to do analysis on large open-ended fields like SEO keywords or internal search keywords, and it’s also very useful for sophisticated SEO reporting.
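
Semphonic’s tool lives in .Net, but the rule-based idea is easy to see in any language. Here is a rough Python sketch; the rules and values are invented, and pull_variable_values() stands in for the SAINT export call:

```python
# Rule-based categorization of an open-ended variable (e.g., internal search
# keywords) into a SAINT-style tab-delimited classification file.
# pull_variable_values() is a hypothetical stand-in for the SAINT export API.
import re

RULES = [
    (re.compile(r"mortgage|home loan", re.I), "Home Lending"),
    (re.compile(r"checking|savings",   re.I), "Deposit Accounts"),
    (re.compile(r"career|job",         re.I), "Careers"),
]

def categorize(value, default="Uncategorized"):
    # First matching rule wins; unmatched values fall through to a default.
    for pattern, category in RULES:
        if pattern.search(value):
            return category
    return default

def pull_variable_values():
    # Stand-in for pulling every captured value of the variable via the API.
    return ["refinance my mortgage", "open checking account", "job openings"]

# Emit a simple two-column classification table: key <tab> category.
with open("saint_table.txt", "w") as f:
    f.write("Key\tCategory\n")
    for value in pull_variable_values():
        f.write(f"{value}\t{categorize(value)}\n")
```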

Q: Semphonic was a very-early-adopter of the Omniture APIs. In your blog, you write about the increasing maturity of the Omniture APIs. Where have you seen improvements and how is that helping you serve customer needs better?

A: It has improved a lot – and in many of the ways you’d most expect as a software system matures. The documentation and examples are a lot better now. That’s a big deal when you’re first starting out with a new software tool. We happen to be a .Net shop, and when we first got started the security model was just a killer. It took forever just to figure out how to authenticate. That’s really frustrating, of course, because you feel like you’re just spinning your wheels. It’s also really hard, as a consulting firm, to justify those hours. You can’t tell a client you couldn’t figure out how to log on to the system – so you end up having to eat all those “educational” hours. Now there are good examples that make this pretty easy. The extent of the APIs has also improved a lot. The coverage is pretty darn good now. Finally, the cost model and token model have improved and been clarified.
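
For the curious, the authentication Gary is describing is WSSE-style: the client sends an X-WSSE header whose password digest is built from a nonce, a timestamp, and the shared secret. A minimal Python sketch of that construction (check the exact field names and digest order against the current API documentation):

```python
# WSSE-style authentication header of the kind the Omniture REST API expects.
# Digest construction follows the WSSE pattern: base64(sha1(nonce+created+secret)).
# Verify field names and digest order against the current API docs.
import base64, hashlib, os
from datetime import datetime, timezone

def wsse_header(username, secret):
    nonce = os.urandom(16).hex()                      # one-time random value
    created = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    digest = base64.b64encode(
        hashlib.sha1((nonce + created + secret).encode()).digest()
    ).decode()
    return ('UsernameToken Username="%s", PasswordDigest="%s", '
            'Nonce="%s", Created="%s"'
            % (username, digest,
               base64.b64encode(nonce.encode()).decode(), created))

# Sent on each request as:  X-WSSE: <value returned above>
print(wsse_header("user:company", "shared-secret"))
```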

Q: What would you like to see on the roadmap for the Omniture APIs?

A: Like most developers, I’d still like to see a throttling mechanism as opposed to the current token mechanism for controlling usage. I’ve also campaigned for a system where third parties like us could buy tokens and use them for custom apps to simplify our client arrangements. In terms of the APIs themselves, I guess I’d like to see direct access to the event-level data à la the Data Feed. I think it would be great to be able to customize and launch data feed requests and get back that true server-call level data.
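
For readers who haven’t hit the distinction: tokens are a prepaid call budget, while throttling just slows a client down when it calls too fast. A toy sketch of what a throttle feels like from the client side (purely illustrative, not an Omniture feature):

```python
# Toy token-bucket throttle: a client may burst up to `capacity` calls, then
# is slowed to `rate` calls per second -- rather than being cut off outright
# when a prepaid token budget runs out. Purely illustrative.
import time

class Throttle:
    def __init__(self, rate, capacity):
        self.rate, self.capacity = rate, capacity
        self.allowance, self.last = capacity, time.monotonic()

    def wait(self):
        now = time.monotonic()
        # Refill the bucket based on elapsed time, capped at capacity.
        self.allowance = min(self.capacity,
                             self.allowance + (now - self.last) * self.rate)
        self.last = now
        if self.allowance < 1:
            time.sleep((1 - self.allowance) / self.rate)
            self.allowance = 1
        self.allowance -= 1

throttle = Throttle(rate=2, capacity=5)   # 2 calls/sec, bursts of 5
for _ in range(10):
    throttle.wait()
    # ... issue an API request here ...
```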

Q: How do you see customer requirements evolving, vis-à-vis Web analytics integration with multi-channel marketing efforts, in 2010?

A: At the upper end of the maturity curve I think 2010 is going to see a lot of movement on the customer-level integration of web analytics data. To be honest, I thought a lot of this would happen in 2009, but it was such a tough year for big projects that I think most organizations ended up just pushing these types of projects out. My sense is that lots of companies are ready to combine key online events with their other customer/visitor data. Doing that often means moving data in both directions: certain kinds of customer data need to move out to the web analytics solution so that you can understand who you’re profiling and what their online behavior means. Then, as behavior manifests itself online, you need to be able to move that data back into your CRM and marketing warehouse systems for outbound messaging.
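
In code terms, the two directions look something like the sketch below; every function here is a hypothetical stand-in, since the real integration points (classifications, data sources, data feeds) vary by stack:

```python
# Sketch of the two data directions: customer attributes flowing out to the
# analytics tool, and online events flowing back to the CRM warehouse.
# All of these functions are hypothetical stand-ins.

def upload_classification(visitor_id, segment):
    # Stand-in for pushing a customer attribute into the analytics solution
    # (e.g., via a classification or data-source upload).
    print(f"analytics <- visitor {visitor_id}: segment={segment}")

def crm_record_event(customer_id, event_type):
    # Stand-in for writing a key online event into the CRM / warehouse.
    print(f"crm <- customer {customer_id}: {event_type}")

# Outbound: customer data moves to web analytics so you know who you're profiling.
for c in [{"visitor_id": "v42", "segment": "high-value"}]:
    upload_classification(c["visitor_id"], c["segment"])

# Inbound: online behavior moves back to CRM to drive outbound messaging.
for e in [{"customer_id": "c17", "event_type": "abandoned_application"}]:
    crm_record_event(e["customer_id"], e["event_type"])
```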

Q: What’s the most interesting use of Web analytics data you’ve seen, outside the reporting environment of SiteCatalyst?

A: I have two ways of answering this question. The most “powerful” uses of web analytics data I’ve seen are usually integrations that take advantage of event-based marketing. Visitor X did this online – respond with these changes to the site, this outbound email, this customer communiqué. This is powerful stuff and it takes a lot of marketing operations work – but truth to tell it’s not that interesting analytically. It’s usually just cherry-picking – because the best opportunities for this kind of integration are obvious. Analytically, I continue to believe that one of the most interesting analytics projects we do is full behavioral segmentation. Typically, we take a full Omniture data feed for a month or two and then build behavioral profiles of every visitor in tools like SPSS or SAS using cluster analysis. When we can, we’ll also integrate online survey data. This type of behavioral profiling is fascinating work – but I also think it’s genuinely powerful. It’s the best way I’ve found to make web analytics data actually come alive for marketers.
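
Semphonic does this profiling in SPSS or SAS; the same idea in open-source terms might look like the following sketch, with invented visitor features and scikit-learn doing the clustering:

```python
# Behavioral segmentation sketch: cluster visitor-level profiles built from
# a raw data feed. Feature names and values are invented; this shows the
# idea with scikit-learn rather than the SPSS/SAS tools mentioned above.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# One row per visitor: e.g. [visits, pages/visit, search uses, video starts]
profiles = np.array([
    [ 1, 2.0, 0, 0],
    [ 8, 5.5, 3, 1],
    [ 2, 1.5, 0, 2],
    [12, 7.0, 5, 4],
    [ 1, 1.0, 1, 0],
])

scaled = StandardScaler().fit_transform(profiles)  # put features on one scale
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(scaled)

for visitor, segment in zip(profiles, model.labels_):
    print(visitor, "-> segment", segment)
```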

Q: You’ve recently been writing about application measurement and how that’s different from Web analytics. As more apps move into the cloud, what should developers be thinking about in terms of measuring their applications?

A: I wrote five long blogs on this – so it’s a challenge to shrink it down to a bite-sized answer. But here are a couple of big-picture things. First, you’ll find that measuring applications requires a pretty fundamental shift in measurement thinking. The basic web site measurement stuff (pages, clicks) really doesn’t apply. Instead, you need to think about capturing functional usage, application states, and performance information. Unfortunately, you still need to translate this paradigm back into something that works in the analytics solution, and that can be pretty difficult. It’s also important for developers to realize that measurement integration takes real planning and testing cycles – and there aren’t simple automated solutions for testing. So it’s vitally important to integrate the measurement into the early stages of development and build a careful test plan – otherwise you’re likely to end up leaving most of the important measurement on the cutting-room floor.
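
As a concrete illustration of that shift, an application tracker logs function use, state transitions, and timings rather than page views. A minimal sketch with invented event names and a stubbed send():

```python
# Minimal application-measurement sketch: capture functional usage,
# application state, and performance timings instead of page views.
# Event names and the send() destination are invented for illustration.
import time

class AppTracker:
    def __init__(self):
        self.state = None

    def send(self, payload):
        # Stand-in for dispatching the payload to an analytics endpoint.
        print("track:", payload)

    def state_change(self, new_state):
        # Application states replace "pages" as the unit of navigation.
        self.send({"type": "state", "from": self.state, "to": new_state})
        self.state = new_state

    def action(self, name, **detail):
        # Functional usage: which features get exercised, and how.
        self.send({"type": "action", "name": name, **detail})

    def timed(self, name, fn, *args):
        # Performance: wrap an operation and record its duration.
        start = time.monotonic()
        result = fn(*args)
        self.send({"type": "perf", "name": name,
                   "ms": round((time.monotonic() - start) * 1000)})
        return result

tracker = AppTracker()
tracker.state_change("editing")
tracker.action("export", format="pdf")
tracker.timed("render", sum, range(1_000_000))
```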

Q: What advice would you give to a new Omniture Developer?

A: If you’re picking an environment, PHP is the best supported and documented. Definitely start with one of the sample programs in the gallery – obviously you should pick one for your environment. I find it’s just a lot easier to get started when you can begin by making tweaks to existing code that compiles and works. I also think it’s worth starting with the SiteCatalyst Reporting APIs – or at least understanding them – even if you’re focused elsewhere. I find we end up using these even when we are doing an application mainly focused on something else (SAINT, for instance). SiteCatalyst is still at the heart of the environment and those APIs are definitely worth understanding.
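
Gary recommends PHP, but the basic queue-then-fetch flow looks the same in any language. Here is a rough Python sketch against the 1.x REST conventions; the endpoint, method names, and response shapes are from that era’s documentation and should be verified against whatever API version you’re on:

```python
# Queue-and-fetch flow for a SiteCatalyst report over the REST API.
# Endpoint, method names ("Report.QueueOvertime", "Report.GetStatus",
# "Report.GetReport") and response fields follow the 1.x conventions --
# verify them against the API version you are actually on.
import base64, hashlib, json, os, time, urllib.request
from datetime import datetime, timezone

def wsse_header(username, secret):
    # Same WSSE construction sketched earlier in this post.
    nonce = os.urandom(16).hex()
    created = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    digest = base64.b64encode(
        hashlib.sha1((nonce + created + secret).encode()).digest()).decode()
    return ('UsernameToken Username="%s", PasswordDigest="%s", '
            'Nonce="%s", Created="%s"'
            % (username, digest,
               base64.b64encode(nonce.encode()).decode(), created))

ENDPOINT = "https://api.omniture.com/admin/1.3/rest/"   # version-dependent
USER, SECRET = "user:company", "shared-secret"          # illustrative

def call(method, body):
    req = urllib.request.Request(
        ENDPOINT + "?method=" + method,
        data=json.dumps(body).encode(),
        headers={"X-WSSE": wsse_header(USER, SECRET)})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

description = {"reportDescription": {
    "reportSuiteID": "mysuite",                         # illustrative values
    "dateFrom": "2010-01-01", "dateTo": "2010-01-31",
    "metrics": [{"id": "pageviews"}]}}

queued = call("Report.QueueOvertime", description)
status = {"status": "queued"}
while status.get("status") not in ("done", "failed"):  # poll until finished
    time.sleep(2)
    status = call("Report.GetStatus", {"reportID": queued["reportID"]})
print(call("Report.GetReport", {"reportID": queued["reportID"]}))
```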
