Tracking AJAX objects is a new challenge for web analytics. It's tricky to do with tags because the script has to be embedded in the page. In fact, the same is true of all file downloads, such as PDFs. With Passive Data Capture, these items are easily captured because they are simply more request-responses between the visitor and the server.
The other question is how to define them within the context of Web Analytics. Is refreshing some objects on the page a new pageview, or is it something else? Sometimes it looks like a new pageview to the visitor, so it could reasonably be treated as one. However, many pages are simply swapping a small element, such as an icon in a catalogue, which does not really seem like a new pageview.
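For what it's worth, one pragmatic way to draw the line is to classify each captured request-response by what it is: a full HTML document fetched outside an AJAX call looks like a pageview, while a small AJAX refresh looks like an in-page event. Here is a minimal sketch in Python; the field names and the size threshold are illustrative assumptions, not anybody's product behaviour:

    # Illustrative rule for classifying captured hits; the field names and
    # thresholds here are assumptions, not part of any particular product.
    def classify_hit(content_type, response_bytes, requested_with=None):
        """Return 'pageview' or 'event' for one captured request-response."""
        is_html = content_type.startswith("text/html")
        is_ajax = requested_with == "XMLHttpRequest"   # header set by most JS libraries
        # A full HTML document fetched outside an XHR looks like a pageview...
        if is_html and not is_ajax:
            return "pageview"
        # ...while AJAX refreshes of small fragments (an icon, a price, one
        # catalogue cell) are better treated as in-page events.
        if is_ajax and response_bytes < 50_000:
            return "event"
        # Large AJAX responses that replace most of the page could arguably be
        # promoted to pageviews; that is a reporting decision, not a technical one.
        return "event"

    print(classify_hit("text/html", 120_000))                        # pageview
    print(classify_hit("application/json", 2_000, "XMLHttpRequest")) # event

The nice part is that with every request-response already on the wire, this is a reporting rule you can change later, not a tagging change that has to go back through the site.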
Tuesday, September 11, 2007
Sunday, September 9, 2007
Get your own data
Recently, we have been approached by a number of companies developing predictive behavioural analytics engines to improve visitor conversion, part of the broader site-optimization space. In many cases, their prospects use a third-party tagging solution, and the data they need is probably being collected already, at least for off-line analysis. However, it turns out that these companies are reluctant to provide the raw data back to the client. It is not clear whether this is because they are not set up to do it or because they want to develop this space themselves and this is a way to keep competitors out! Either way, it is another reason to get your own data in-house. As web activity becomes less of a novelty and more an integral part of the business, it becomes more important to be able to join web traffic data to other sales and marketing data. This is difficult to do if the data sits in someone else's database and they only allow selective access.
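As a toy illustration of why having the raw data in-house matters, here is the kind of join that becomes a one-liner once sessions and orders live in the same database. The table and column names are made up for the example:

    # Hypothetical example: once raw session data sits in your own warehouse,
    # joining it to order records is trivial. Table and column names are made up.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE sessions (session_id TEXT, visitor_id TEXT, campaign TEXT);
        CREATE TABLE orders   (order_id TEXT, visitor_id TEXT, amount REAL);
        INSERT INTO sessions VALUES ('s1', 'v42', 'spring_sale');
        INSERT INTO orders   VALUES ('o9', 'v42', 129.95);
    """)

    # Revenue by campaign: hard to get if the session data lives only in a
    # third party's hosted database with selective access.
    for row in db.execute("""
            SELECT s.campaign, SUM(o.amount)
            FROM sessions s JOIN orders o ON o.visitor_id = s.visitor_id
            GROUP BY s.campaign"""):
        print(row)   # ('spring_sale', 129.95)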
Friday, September 7, 2007
Packet Sniffing - Analytics without site changes
We knew that collecting web analytics data with Passive Data Capture (Packet Sniffing) instead of tags does not require any changes to the website or servers, but we did not understand how important this was until one of our clients told us that they had spent over $1M tagging their site! Why? The tag JavaScript is complex, and they have custom code embedded in the pages as well, but it is not $1M complex.
The reason is quite simple: tags change the operational site. First, they had to buy and configure a sophisticated test system. Then they had to implement against that test system, which of course was not exactly the same as real visitor traffic. Then came the nail-biting rollout, which immediately had problems with real traffic. Every tweak and change had the potential to take down a huge e-commerce site, so there was a long test, QA and approval cycle for each change. Add to this the fact that marketing kept coming up with new requirements, and the development and support effort kept ballooning.
With Passive Data Capture (Packet Sniffing), they can now implement and test against real traffic immediately, without fear of breaking the site. Because there are no nasty surprises on rollout and the business risk is far lower, the update cycle is massively reduced. Because we can emulate the logs that were coming from the in-house tag server, they can continue to use the existing analytics package. Now there are no tags on the site, no 30K file downloaded on each visit to slow down pages and present a security risk, and a much faster turn-around for marketing changes. Now if only they had done it that way in the first place . . .
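For anyone wondering what "emulate the logs" means in practice, it is just writing each captured hit back out in the format the existing analytics package already expects. A rough sketch, assuming a generic extended-log-style layout rather than any specific vendor's format:

    # Rough sketch of turning a captured request into a tag-server-style log line.
    # The field layout is an assumption (a generic extended-log style), not the
    # exact format of any particular analytics vendor.
    from datetime import datetime, timezone

    def emulate_log_line(hit):
        """hit: dict built by the passive capture layer from one request-response."""
        ts = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S")
        fields = [
            ts,
            hit.get("client_ip", "-"),
            hit.get("method", "GET"),
            hit.get("url", "-"),
            str(hit.get("status", 200)),
            hit.get("visitor_cookie", "-"),   # lets the reporting tool stitch sessions
            hit.get("referrer", "-"),
        ]
        return " ".join(fields)

    print(emulate_log_line({
        "client_ip": "203.0.113.7",
        "url": "/catalogue/item?id=1234",
        "status": 200,
        "visitor_cookie": "v=abc123",
        "referrer": "http://www.example.com/",
    }))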
Friday, April 27, 2007
Web Analytics-no tags or logs
As with any new technology, there is a lag while the users catch up to the vendor hype and start to drive towards making it actually useful. Web Analytics is no exception. Most of our users started with analyzing web logs. As their traffic grew, so did the number of servers and the number and size of the web logs. When it takes more than a night to process a day's traffic, something has to give. Some went to tagging solutions, which at least gave them cleaner data that was already organized by visitor. But whether they used a hosted service or managed their own data, the actual information was not much better than from web logs. Adding standard tags to a site is fairly painless, but try picking out interesting data like order numbers, catalogue numbers, campaign tracking IDs, error codes in the HTML, etc. Now you are into the realm of custom tagging, and that gets hard, fast. It also becomes a maintenance nightmare on sites that are constantly changing.
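With the traffic captured on the wire, those values can often be pulled out of the URLs and pages themselves with simple pattern rules instead of per-page custom tags. A toy illustration; the markup and patterns are invented for the example, and real rules would match your own pages:

    # Toy illustration of rule-based extraction from captured hits.
    # The markup and patterns are invented; real rules would match your own pages.
    import re

    RULES = {
        "order_number": re.compile(r"Order\s+#(\d+)"),
        "campaign_id":  re.compile(r"[?&]cid=([\w-]+)"),
        "error_code":   re.compile(r'class="error"[^>]*>\s*ERR-(\d+)'),
    }

    def extract(url, html):
        """Apply each rule to the URL and response body of one captured hit."""
        found = {}
        for name, pattern in RULES.items():
            m = pattern.search(url) or pattern.search(html)
            if m:
                found[name] = m.group(1)
        return found

    page = '<p>Thank you! Order #55012 confirmed.</p><span class="error"> ERR-404</span>'
    print(extract("/checkout/done?cid=spring-07", page))
    # {'order_number': '55012', 'campaign_id': 'spring-07', 'error_code': '404'}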
But the marketing department is figuring out what analytics can do for them and is starting to drive the conversation. I love it when that happens. It's great to see a technology really get used and appreciated! Top of the list is that they want more specific information. IT departments that are already overworked and underpaid (they say!) are feeling the strain and baulking at a whole new level of effort to support custom tagging.
Our larger users have solved this problem with passive data capture, which can monitor all the traffic, filter and clean it, sessionize it by visitor in real time, and pick out the data they need with simple rules. Of course, the marketing guys still want their favorite analytics package, and they get to choose, because passive capture can simply emulate a tag server log and feed the reporting package of their choice.
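Sessionizing on the fly is conceptually simple: group hits by the visitor identifier and start a new session after a period of inactivity. A bare-bones sketch; the 30-minute timeout is just the usual convention, not something any particular package mandates:

    # Bare-bones sessionization: hits arrive in time order as (visitor_id, timestamp)
    # pairs; a new session starts after 30 minutes of inactivity (a common convention).
    SESSION_TIMEOUT = 30 * 60  # seconds

    def sessionize(hits):
        """Yield (visitor_id, session_number) for each hit, in arrival order."""
        last_seen = {}    # visitor_id -> timestamp of previous hit
        session_no = {}   # visitor_id -> current session counter
        for visitor, ts in hits:
            if visitor not in last_seen or ts - last_seen[visitor] > SESSION_TIMEOUT:
                session_no[visitor] = session_no.get(visitor, 0) + 1
            last_seen[visitor] = ts
            yield visitor, session_no[visitor]

    hits = [("v1", 0), ("v2", 10), ("v1", 600), ("v1", 600 + 31 * 60)]
    print(list(sessionize(hits)))
    # [('v1', 1), ('v2', 1), ('v1', 1), ('v1', 2)]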
The really interesting evolution we are seeing is that there are all sorts of other applications for web data. Take fraud detection, for example. Most of the techniques already exist and are used by banks and the like, but they are applied after the fact. Catching the problem sooner involves alterations to the content and application servers, alterations that will need to change often. This is a big deal because a bug in a fraud detection application could break the site, which is the money generator. With passive data capture, they can get the data they need and build systems externally that cannot compromise the web site itself. We are seeing this more and more as a huge advantage as more uses are found for web data.
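As a simple illustration of the kind of external system this enables, here is a sketch that flags an IP address hammering the checkout with declined payments, built entirely on the captured hit stream and touching nothing on the application servers. The threshold, window and status code are illustrative assumptions:

    # Illustrative external fraud check running on the captured hit stream.
    # Nothing here touches the content or application servers.
    from collections import deque, defaultdict

    WINDOW = 5 * 60     # look at the last five minutes of activity (assumption)
    THRESHOLD = 5       # more than 5 failed payments in the window looks suspicious

    recent_failures = defaultdict(deque)   # client_ip -> timestamps of failed payments

    def check_hit(client_ip, url, status, ts):
        """Return True if this hit pushes the IP over the failed-payment threshold."""
        if "/payment" not in url or status != 402:   # 402 standing in for "declined"
            return False
        window = recent_failures[client_ip]
        window.append(ts)
        while window and ts - window[0] > WINDOW:
            window.popleft()
        return len(window) > THRESHOLD

    for i in range(7):
        flagged = check_hit("203.0.113.7", "/payment/submit", 402, i * 10)
    print(flagged)   # True: seventh declined payment within five minutes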
For more information on Passive Data Capture and Packet Sniffing for Web Analytics, see www.metronomelabs.com