I know. “Shut Up and Play the Hits: The latest Adobe Analytics tips and tricks” is a strange title for a session at Adobe Summit 2014. The creative name comes from Andrew Koperwas, one of our excellent Product Marketing Managers, who describes himself as a big LCD Soundsystem fan. The title notwithstanding, I am very excited about the session, and I hope you will find time to attend it on Tuesday afternoon here in Salt Lake City! (UPDATE: A repeat of the session has just been added on Thursday afternoon!) We will not be shutting up, but we will be playing the hits. In this post, I want to give you a sense of how the session will work, and cover some of the content that we left on the proverbial cutting room floor.

We had a great year of innovation in Adobe Analytics between Summit 2013 and the present, with more than 70 new features released into the solution. With all of this great new functionality in the product, there are new tips, tricks, and best practices to master. This session is all about walking through eight of my favorite new additions to Adobe Analytics, discussing how we hope they will solve challenges that you may be facing in your role, and offering some ideas around how you can best take advantage of them. (SPOILER ALERT: I will also show off two as-yet-unseen new features coming to the product later this spring. You will get a sense of how to get the most value out of the great features we have released since last Summit, and a sense of where we are headed with our next release.)

There are so many great new features in Analytics that I had to leave a few favorites out. So in addition to the 10 tips I’ll be showing at Summit, here are two other really cool features; we’ll call them “Tip #11” and “Tip #12.”

Tip #11: Using the Time Prior to Event report alongside the Time Spent per Visit report

This is one that we technically released before Summit 2013. The Time Prior to Event report differs from Time Spent per Visit in that it associates each metric with the time bucket in which that metric occurred, rather than associating all metrics with the bucket representing the total length of the visit. Thus, it is a little more granular than the traditional Time Spent per Visit report. A big part of this “tip” is simply to get you familiar with the Time Prior to Event report, which might be what you and your constituents are really looking for when you run the Time Spent per Visit report. But the best insights come from using the two in concert to find patterns. Here is a very simple example.

Consider the following table showing a sequence of page views representing a complete visit, along with the time when each page view occurred.

Time   Page                                          Event
9:00   Home Page                                     None
9:02   Product Detail Page (Kamloops Snow Boot)      Product View
9:03   Add to Cart                                   Cart Addition
9:05   Checkout                                      None
9:08   Order Confirmation                            Order
9:15   Home Page                                     None
9:27   Product Detail Page (Interlaken Ski Jacket)   Product View
9:35   User Profile                                  Profile Edit
End of Visit

The two reports we are discussing would handle the events that occurred in this visit differently. Remember: in the Time Spent per Visit report, all metrics are associated with the total length of the visit, while in the Time Prior to Event report, each metric is associated with the time at which it occurred during the visit. Here is what that visit would look like in each report:

Time Prior to Event

                     Product Views   Orders   Profile Edits   Visits
Less than 1 minute   0               0        0               0
1-5 minutes          1               0        0               0
5-10 minutes         0               1        0               0
10-30 minutes        1               0        0               0
30-60 minutes        0               0        1               1

Time Spent per Visit

                     Product Views   Orders   Profile Edits   Visits
Less than 1 minute   0               0        0               0
1-5 minutes          0               0        0               0
5-10 minutes         0               0        0               0
10-30 minutes        0               0        0               0
30-60 minutes        1               1        1               1
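To make the difference between the two bucketing approaches concrete, here is a hypothetical sketch in JavaScript that reproduces the two tables above from the example visit. The bucket labels and event names come from the tables; the code is purely illustrative and is not Adobe's actual report logic (note that Cart Additions are omitted, as in the tables).

```javascript
// Example visit from the tables above; times are minutes since midnight.
const visitStart = 9 * 60; // 9:00
const hits = [
  { time: 9 * 60 + 2,  event: 'Product View' },  // 9:02
  { time: 9 * 60 + 8,  event: 'Order' },         // 9:08
  { time: 9 * 60 + 27, event: 'Product View' },  // 9:27
  { time: 9 * 60 + 35, event: 'Profile Edit' },  // 9:35
];

// Map minutes elapsed since the start of the visit to a report bucket.
function bucket(minutes) {
  if (minutes < 1)   return 'Less than 1 minute';
  if (minutes <= 5)  return '1-5 minutes';
  if (minutes <= 10) return '5-10 minutes';
  if (minutes <= 30) return '10-30 minutes';
  return '30-60 minutes';
}

// Time Prior to Event: each metric lands in the bucket where it occurred.
const timePriorToEvent = {};
for (const hit of hits) {
  const b = bucket(hit.time - visitStart);
  (timePriorToEvent[b] = timePriorToEvent[b] || []).push(hit.event);
}

// Time Spent per Visit: every metric lands in the single bucket
// corresponding to the total length of the visit.
const visitLength = hits[hits.length - 1].time - visitStart; // 35 minutes
const timeSpentPerVisit = { [bucket(visitLength)]: hits.map(h => h.event) };
```

Running this yields the same distribution as the tables: the events spread across four buckets in `timePriorToEvent`, while all of them collapse into the 30-60 minute bucket in `timeSpentPerVisit`.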

What do I learn here? I learn that this customer spent time browsing my site after purchasing, but did not make a second purchase. Next, I can try to assess which factors contribute to this behavior: What recommended items are we showing on the order confirmation page in this case? Are there certain traffic sources that tend to lead to browsing after purchase? I will likely end up with a bunch of great candidates for testing, and for targeting based on these specific criteria.

If I find that there is a significant amount of traffic matching this pattern (or one like it), I can go over to ad hoc analysis (formerly Discover) and build a sequential segment that reproduces a path like this very specifically, even down to the specific time increments if I want to. I can build a segment which says “Visits where an order occurred, and then a product view occurred after five minutes but within 25 minutes.” Then I can do complete behavioral analysis for this segment to better understand how to reach these visitors.
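The intent of that sequential segment can be sketched in plain JavaScript. This is a hypothetical illustration of the matching logic only, not how ad hoc analysis evaluates segments; the `matchesSegment` function and the hit shape are my own inventions for the example.

```javascript
// Does a visit contain an Order followed by a Product View that occurred
// more than 5 but no more than 25 minutes after the order?
// hits: [{ time: minutesSinceMidnight, event: string }, ...], in order.
function matchesSegment(hits) {
  for (const order of hits) {
    if (order.event !== 'Order') continue;
    for (const later of hits) {
      const gap = later.time - order.time;
      if (later.event === 'Product View' && gap > 5 && gap <= 25) {
        return true; // sequence found within the time window
      }
    }
  }
  return false;
}
```

The example visit above matches: the order fires at 9:08 and a product view follows at 9:27, a 19-minute gap that falls inside the five-to-25-minute window.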

Tip #12: Processing Rules improvements

In my session last year, I briefly covered Processing Rules, which allow you to populate variables, copy/concatenate variables, and change variable values after data collection. Also, my colleague Bret Gundersen gave a fantastic session on Processing Rules—so fantastic, in fact, that it was voted the #1 session of Summit 2013. Since then, we have made some significant improvements to Processing Rules and what you can do with them. From our recent release notes:

  • Max rules increased from 50 to 100 for each report suite. UI enhancements were also made to improve performance when displaying large numbers of rules.
  • “Else” condition support for rules lets you take action when a condition is not met.
  • When copying rules between report suites, you can now append rules to the target report suite rather than overwriting all rules.

I want to focus on the second and third bullet points above.

The value of the “else” condition is that it expands the range of possibilities for each of your rules, allowing you to be more flexible in the way you use Processing Rules to augment and enhance your implementation. This is particularly valuable in the area of data mapping. For example, let’s say I am selling books, movies, and artwork on my site. I have instructed my developers to use Context Data variables to send author (for books) or director (for movies) into Adobe Analytics, and then, as an admin, I am going to map those Context Data variables to eVar20. But on book pages, my developer is using s.contextData['authorName'], and on the movie pages he is using s.contextData['directorName']. In the past, this would have required two of my valuable Processing Rules. But now this becomes one simple rule:

Processing Rules
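In pseudocode terms, the rule behaves like the following JavaScript sketch. Processing Rules actually run server-side after collection; this hypothetical `applyRule` function just mirrors the if/else logic of that single rule for illustration.

```javascript
// Sketch of the single rule's logic: if authorName is set, copy it to
// eVar20; otherwise fall back to directorName. Illustrative only.
function applyRule(contextData) {
  const hit = { eVar20: undefined };
  if (contextData['authorName']) {
    // IF condition: the book pages' context variable is present.
    hit.eVar20 = contextData['authorName'];
  } else if (contextData['directorName']) {
    // ELSE condition: fall back to the movie pages' context variable.
    hit.eVar20 = contextData['directorName'];
  }
  return hit;
}
```

A book page sending `authorName` and a movie page sending `directorName` both end up populating the same eVar20 through this one rule.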

This change, coupled with the increase in the Processing Rules limit from 50 to 100 per report suite, means that you can do a whole lot more with your data even after it comes in from your site or app.

The other Processing Rules change to mention briefly is the ability to copy a Processing Rule to individual report suites. In the past, you could only copy a Processing Rule to all of your suites at once, which was troublesome because a.) each suite may use variables a little differently, so a rule that fits one suite may not fit another, and b.) Processing Rules, once in effect, permanently alter your data. In short, you often want to copy a rule to just one or a few report suites, and definitely NOT to all of them.

This problem is solved! Now, when you click “Copy Processing Rules,” you can multi-select report suites as destinations.

Copying Processing Rules

Note that you can also choose whether to overwrite a destination report suite’s existing Processing Rules, or simply append these new rules to the set already existing on the destination report suite.

See you at Summit!

If you’re coming to Summit 2014 and you’re an Adobe Analytics user in any role, at any company, I do hope you’ll consider joining my session on either Tuesday afternoon or Thursday afternoon. And if you do, please come down to the front of the room afterward and say hello. I love meeting members of our community and user base! If I don’t see you there, hopefully I’ll see you elsewhere around Summit. It’s the best week of the year for digital marketers and analysts!