Over the past couple of years, it’s been a lot of fun to meet more and more real estate professionals who are using their Web analytics data.
Two years ago, most of my conversations about Web analytics at conferences involved quelling fears about how "difficult" it is to get a Web analytics package set up (all it takes is a couple lines of code on your website).
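For the curious, here’s roughly what those "couple lines of code" look like today with Google Analytics — you paste the snippet from your own Analytics account into every page of your site, just before the closing head tag. The `G-XXXXXXXXXX` ID below is a placeholder; your account gives you the real one:

```html
<!-- Google Analytics tag: paste before </head> on every page.
     "G-XXXXXXXXXX" is a placeholder for your own property ID. -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXXXXX');
</script>
```

That really is the whole setup — the heavy lifting happens on Google’s servers, not your website.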
Now, I’m as likely to get into a deep conversation about how different metrics are used or how to gather "qualitative" analytics. Even better, now and then someone is taking the ball and running all the way to using Web stats to help solve problems outside the traditional Web marketing silo (informing offline branding campaigns, identifying operations inefficiencies, and so on).
As you start to beef up your Web analytics "code fu," there’s something to watch out for: obsessive stat gathering. The beginning Web analytics user is learning to grapple with the data-configuration issues and picking a few reports to watch.
That keeps him or her plenty busy. But the intermediate Web analytics junkie, having mastered data configurations, has a little more time on his or her hands. This can be dangerous … something about idle hands.
You see, in many Web analytics packages there are hundreds if not thousands of bits of trackable data. And since it’s all about what your customers are doing (and sometimes thinking), this data can be incredibly interesting.
But interesting isn’t the same as actionable.
Here’s an example: I was recently showing a client how to maintain a spreadsheet of Web analytics data — an important step in creating a Web analytics initiative that reflects the business needs of my client as opposed to the reporting capabilities of Google Analytics.
In our initial selection of things to measure (what analytics geeks call "KPIs," for "key performance indicators"), I had established a group of data that aggregated social media activity.
Measuring social media is a hot topic, and people like to do it so they know what’s working, where it’s working, and so on.
But this particular client really wasn’t using social media in a strategic fashion yet — they were just playing with social media to figure out how it works.
I should note that I think the client’s approach to social media at this point is good — the client’s goal is to learn how to use social media.
There was no strategic intent guiding my client’s use of social media, though, so I told the staffer who would be maintaining the spreadsheet going forward that he shouldn’t continue gathering the social media data that I had compiled.
Now it wasn’t that much extra effort to continue gathering that data — a few extra numbers in a spreadsheet, and that’s it. He figured it was no biggie and wanted to keep gathering it.
This seems like no big deal at first. What’s an extra three minutes?
The big deal here: It’s more than just the three extra minutes it takes to gather that data. It’s also added time for the people who read the spreadsheet later and spend time thinking about what these social media numbers may mean.
If there isn’t a clear strategic intent for social media, trying to find meaning in the social media numbers isn’t helpful.
Spending time gathering and thinking about social media data isn’t helpful here because no one is going to change the way they do social media based on these numbers; remember, they’re still just playing with it.
So effort and resources are going to be expended on measuring and reporting something that isn’t going to help anyone make a decision. In fact, it might confuse people or make people anxious.
It’s sort of like worrying about your website’s bounce rate when you’re not really going to make a change to your Web design.
When the total time it takes to gather and make sense of Web data is increased by paying attention to metrics that aren’t "actionable," we run the risk of endangering the whole Web analytics process.
Sooner or later someone will say, "Geez, why does this Web analytics thing take so much time for so little result?" When someone says that, it usually means the data gathering has gotten bloated.
This is why it’s critical, once you’ve got a handle on the basics of Web data, to become very focused on measuring and reporting only the stuff that helps you take action.
Just because you can measure something doesn’t mean you should.
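If it helps to make that filter concrete, here’s a tiny sketch in Python. The metric names and the `informs_decision` field are made up for illustration — the idea is simply that every number in your report should carry the decision it’s meant to inform, and anything without one gets cut:

```python
# Hypothetical KPI worksheet: each candidate metric records the
# decision it is supposed to inform. No decision, no report slot.
candidate_metrics = [
    {"name": "conversion rate", "informs_decision": "which landing page to promote"},
    {"name": "bounce rate", "informs_decision": "whether to redesign the home page"},
    {"name": "social media mentions", "informs_decision": None},  # still just playing -- no decision yet
]

# Keep only the actionable metrics; the rest are interesting, not actionable.
report = [m["name"] for m in candidate_metrics if m["informs_decision"]]
print(report)  # -> ['conversion rate', 'bounce rate']
```

Notice the social media line drops out — not because the number is wrong, but because nobody has named a decision it would change.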