
3 Reasons Google Analytics Fails for SEO

If you are serious about search engine optimization, Google Analytics is NOT for you. I know it’s tough to look past the (non-existent) price tag, ease of use and conversion tools — but trust me when I say that Google Analytics is a flawed program for your needs.

#1 – Limitations of Technology
The first trouble is that Google Analytics uses what is called “page tagging technology”. Without getting too deep into the details, one fact is critical: the program can only record information from browsers that execute JavaScript.

Automated browsers (including all spiders like Googlebot, Yahoo! Slurp, MSNBot, etc.) never execute the JavaScript in your source code. So, while Google Analytics may do a great job of tracking your human visitors, it’s unable to give you the goods on spider behavior.

In order to make the most of your SEO efforts you need to know when spiders are coming in, what pages they’re requesting, and how often they return. With this information you can launch and optimize new pages, set up the best internal links and prioritize your source code changes.

While it sounds strange, I would go so far as to say that tracking spiders is just as important as tracking your visitors.
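If you want to see this for yourself, the raw data is already sitting on your server. Here is a minimal Python sketch, assuming an Apache combined-format access log (the file name and the short spider list are illustrative, and no spider list is ever complete), that pulls out spider requests and the pages they asked for:

    import re
    from collections import Counter

    # Apache "combined" format: IP, identd, user, [time], "request", status, bytes, "referrer", "user agent"
    LINE = re.compile(r'(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) \S+ "([^"]*)" "([^"]*)"')

    # Fragments that identify the major spiders (illustrative, never exhaustive)
    SPIDERS = ("Googlebot", "Slurp", "msnbot")

    hits = Counter()
    with open("access.log") as log:  # illustrative path
        for line in log:
            m = LINE.match(line)
            if not m:
                continue
            ip, when, method, path, status, referrer, agent = m.groups()
            for spider in SPIDERS:
                if spider in agent:
                    hits[(spider, path)] += 1

    for (spider, path), count in hits.most_common(20):
        print(f"{spider:10} {count:5} {path}")

Run against a day of logs, this tells you which pages each spider is actually requesting, which is exactly the data a page tag can never capture.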

#2 — No Support for Log Files
I’ve met many novice SEOs who rely solely on Google Analytics. It makes me wonder how much they are actually doing for their clients, and I urge them to begin using log file analyzers.

Log file analysis is the counterpart to page tagging among the technologies behind web analytics. Tools that rely on log files mine all of their information from the raw logs your web server records. Every request for a file on your site is logged, along with an IP address, user agent and, in most cases, the referral string. While these applications can be a bit more work to use, the data you get back makes them a necessary evil.

The best part about log analysis is the filters you can build. I know Google Analytics tries to address this to some degree, but the control it gives you is lacking at best.
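To make that concrete, a filter can be as simple as “show me every Googlebot request under a given section, broken down by day”. A rough Python sketch of that filter, with an illustrative log path and section prefix:

    import re
    from collections import Counter

    # Pull the date, the requested path and the user agent from a combined-format line
    LINE = re.compile(r'\S+ \S+ \S+ \[(\d{2}/\w{3}/\d{4}):[^\]]+\] "\S+ (\S+) [^"]*" \d{3} \S+ "[^"]*" "([^"]*)"')

    daily = Counter()
    with open("access.log") as log:  # illustrative path
        for line in log:
            m = LINE.match(line)
            if m and "Googlebot" in m.group(3) and m.group(2).startswith("/blog/"):
                daily[m.group(1)] += 1  # Googlebot hits per day on this section

    for day, count in daily.items():  # logs are chronological, so insertion order is date order
        print(day, count)

From there you can watch crawl frequency rise or fall after you change internal links or publish new pages.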

Sadly, Google Analytics will not allow you to upload log files (or retrieve them via FTP/HTTP) as other applications do. If Google built this feature in, it would be a win-win situation. You could use GA to track all of your data — and so could Google!

#3 — You’re at Google’s Mercy!
When you need to check reports, you access them online via the Google Analytics interface. The reports cannot easily be taken with you, and you have no way of archiving your data efficiently.

Worse still, Google retains control over your data at all times. If they change the interface or the style of reporting, you are stuck dealing with what they give you.

If we’re talking about YOUR web site, YOUR visitors and YOUR analytics… Why should Google have all the power?

Before concluding that the product is complete, consider what the experts in web analytics have to say. According to Matt Durgin’s blog, Forrester Research was less than thrilled with GA…

Google Analytics was reviewed, but ranked far below the commercial solutions — further back than Urchin (which Google acquired and re-branded as Google Analytics) ever ranked behind its competitors. This could be evidence that Google will not devote the resources to keep up with the commercial solutions on the market. If this trend continues, look for a larger discrepancy between the “free” Google Analytics and the commercial, professional tools on the market.

Recommendations
In the end, Google Analytics is not a useless tool. It is, however, a tool that does not address the needs of serious search marketers. If you’re looking to make the most of your time spent on analytics, do yourself the favor of using multiple tools.

Generally speaking, Google Webmaster Tools should provide accurate data about your site and how Google relates to it. So should Yahoo’s Site Explorer.

At the end of the day though, you’ll need a solid log file analyzer to learn more. Here are three recommendations I can stand behind:

WebLog Expert
Free Demo Available / Commercial Versions at $74.95 and $124.95

Sawmill: Universal Log File Analyzer
Trial Available / Commercial Versions from $99 to $30,000

123LogAnalyzer
Trial Available / Commercial Versions from $99 to $699



48 thoughts on “3 Reasons Google Analytics Fails for SEO”

  1. I still like Analytics best for examining user behavior. Also, I think Google Webmaster Tools does a decent job of showing crawl information. You could probably get by with using both.

  2. Google Webmaster Tools actually fails to report data accurately in *all* instances. I’ve got some information on that, but am holding off before coming up with a complete article.

    In short, you’ll need log files to learn about spiders and automated agents. That includes Googlebot, but is not restricted to it. Understanding how Yahoo and MSN work with your site is also quite valuable.

  3. Eric –

    It sounds like your issue is that you are in love with log-file analytics and hate page-tagging analytics. Most (but not all) of the issues you are ranting about are specific to page tagging, not specific to GA. It doesn’t really sound like you hate GA — you just (probably) *know* GA.

    The logfile vs pagetagging arguments go on and on (and you can successfully argue either side, although market forces do show that page tagging is winning.) However, let’s not confuse “I Hate GA” with “I Hate Page Tagging Solutions.”

    I noticed that your blog uses GA, pretty exclusively, it appears. Maybe you have a server side solution that doesn’t set cookies, but all I see are utma, utmb, utmc and utmz. In other words: GA.

    Robbin Steif
    LunaMetrics
    Google Analytic Authorized Consultant

  4. Excellent reply Robbin!

    I agree that the flaws indicated above apply to all page tagging analytics, not exclusively GA.

    And, as indicated in the article, it’s best to use multiple options. Just because you see Google’s code on my personal blog doesn’t mean that I am using it exclusively.

    Please keep in mind that this is Search Engine Journal. Our readers are more familiar with Google Analytics than they are with other page tagging systems. Professionally, I am forced to use Omniture — and have some pretty strong feelings about that program’s problems too.

    To be clear — you have me pegged. I do love log analysis, which is why I was also a huge fan of NetTracker back in the days when they used page tagging AND log analysis to provide stronger reports. Unsure of what they have now.

    The key here though is that if you want information on spiders, you need the log files. It may sound like I dislike all tagging, but that’s not true. For the sake of an SEO measuring spider and bot activity, you NEED log analysis.

  5. I agree 100% on the issue you raise. I use GA and love it for user analysis.

    Do you or anyone know of a good analysis program that can sort thru the robot/spider information and give you good detailed information on the files and path they take on your site? Free or Inexpensive?

  6. Dustin, please check the recommendations above. For quick and easy filters, I use WebLog Expert the most. They do a great job of supporting their product too, as well as helping if you’d like to see a new filter created. I’ve got the full commercial version, but believe you can use their BETA versions free of charge.

  7. When you focus the log files vs. page taggers debate towards robot/spider issues, there are two ways to argue the point.

    The loggers will say that theirs is the only tool that will capture this data, and they are correct. Page tagging tools are largely blind to non-human traffic.

    The taggers will say that ignoring spider traffic will result in more accurate reporting, since log analysis tools must keep a list of robots/spiders and constantly filter the log files to separate human and non-human traffic, and that their lists of spiders can never be 100% up to date. They are also correct. Log file analysis tools can erroneously record traffic from spiders as human traffic.

    I guess at the end of the day, the advantages of page tagging (speed of reporting, tracking off-site or off-server pages, tracking cached pages) outweigh the disadvantages. NetTracker did have a “hybrid” tool (which was really just a page-tagging platform with a server plug-in for recording spider visits), and they may still have it in their new NetInsight line. WebTrends had something similar a few years back. I would think that all page-tagging platforms would offer something like this, but they don’t.
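    One middle road on the stale-lists problem: verify the big spiders by reverse DNS instead of trusting the user-agent string alone. A hedged Python sketch; the sample address and the crawl-*.googlebot.com naming follow Googlebot’s known reverse-DNS pattern:

        import socket

        def is_googlebot(ip):
            """Reverse-resolve the IP, check the domain, then forward-resolve
            the name and confirm it maps back to the same address."""
            try:
                host = socket.gethostbyaddr(ip)[0]  # e.g. crawl-66-249-67-55.googlebot.com
            except socket.herror:
                return False
            if not host.endswith(".googlebot.com"):
                return False
            try:
                return ip in socket.gethostbyname_ex(host)[2]  # forward-confirm
            except socket.gaierror:
                return False

        print(is_googlebot("66.249.67.55"))  # True for a genuine Googlebot address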

  8. Eric –

    I was recently reading about AWstats. Now I would never claim to be an expert in analytics or log file analysis (fairly new to both fields) but what are your thoughts on that program? It seems fairly limited to me but does provide info on some of the things you mentioned above.

  9. I use pMetrics in addition to GA. If visitors don’t have JavaScript enabled, it falls back to tracking via an invisible gif. It also tracks outbound links, which log parsing doesn’t without using redirect pages (annoying).

  10. For its analytics value, Mahesh, but not for its SEO statistics value. Running Google Analytics AND an analytics platform that is designed for SEO tracking, or at least friendlier to SEO tracking and SEO metrics, is the way to go.

    We run Google Analytics because of its slick interface and universal acceptance of its traffic measurement by advertisers.

    For pinpoint SEO breakdown purposes, something like ClickTracks would work much better, but Google Analytics does fine for what we use it for :)

  11. I used Google Analytics only once and never used it again. The program sucks, it is damn slow, and the graphs are of no practical SEO use, as stated in this article. A log analyzer like AWStats is 100 times better than Google Analytics. Webalizer sucks too.

  12. Eric,

    As a previous employee of a NetInsight/NetTracker distributor, I know that both applications still use a hybrid approach to data collection, using both page tags and log files. And there is a server plugin that will generate cookies, along with an advanced version of the Unica page tag that handles true first-party cookies (but is only available through SCL Analytics).

    As far as I know this is the only solution for having the best of both worlds.

  13. The “page tagging technology” problem isn’t exclusive to Google Analytics. Other web analytics tools that use the same technology have the same failings. Tagging technology is not enough to give information about search engine spiders, but it gives us information about user behavior that log files cannot. Hybrid web analytics certainly give us more complete reports.

    Some web analytics tools that use tagging technology, like Google Analytics, offer their customers a service rather than a web analytics infrastructure. Because of that, they retain control over the data. If Google Analytics fails for SEO, all other web analytics tools that use the same technology fail too.

    And, finally, tagging technology has been a way to solve the problems of customers who want reports on their site’s performance, but do not want to buy hardware and install software.

  14. The funny thing about the scraper mentioned below is that the comments on that article link back to THIS article. Lol. At least they are giving you a link, even if it is incredibly low value.

    As to the article, I’m not so sure that tracking spider activity is all that and a bag of chips. I mean, you can tell right away if a search engine is indexing new content by searching for it…right?

  15. I agree 100%. GA is a good (and free) tool to get a quick overview of site traffic. But any site with serious traffic needs a serious solution – end of story.

  16. I am fairly new to SEO. What about providing xml sitemaps on a site? Do these sitemaps affect getting the SEO info you discuss?

  17. I use Google Analytics to track my website visitors, but sometimes it fails to meet my requirements. By using log file analysis I got all the elements I need. Still, Google Analytics has more good features than other free analytics software.

  18. The huge flaw in Google Analytics is that it doesn’t report on the actual keywords that are being displayed by AdWords. For the beginner, what this means is that AdWords is automatically matching your keyword from “Blue Widget” to “Blue widgets are bad for the environment”, and the analytics software will not show you this anywhere. So, as the writer commented, GA alone is not enough. If you do it without another tool, you are set for bankruptcy.

    Take this from someone who has spent over $750,000 dollars on Adwords. No joke.

  19. I’m just going to touch on a few topics here, but everything stated here can be refuted with the right tricks.
    With a little JavaScript, you can get the exact keywords into your reports (strip them off the referrer and shove them into one of a number of places). ROI Revolution published their script to do just this. It’s not rocket science, really. Further, they’re likely adding the feature in the future.
    I won’t go into much detail here, but the truth is, even Google Analytics ultimately uses its own log files on the back end, where a clever person *can* fake hits based on spider behavior. Everything you need comes in via the HTTP headers and is written to your server logs. The trickiest part is sessioning, which ultimately requires that you grab a new cookie id, store it, and apply some user agent/IP timeout rules in your server-side script to make sure you’re sending all the right bits to GA.
    As far as tags vs. logs, I say you should use both, and I think you should do so like I’m describing here. Fill in the gaps with your log files by shoving that data into the tag-based solution. The extent to which you do this is up to you.
    You’ll deal with less caching skew with the browser-side tags, but you’ll still get those non-JavaScript-enabled users and hits to non-JavaScript files via the server-side script.
    What this article lacks is specifics about SEO. We care a heck of a lot more about traffic from properly segmented paid and organic search keywords and from internal search terms, and about the user performance of those terms. Provided we have the unmatched segmentation of paid (as I mentioned above), we can refine paid search campaigns (negatives, or split terms into separate groups) and focus SEO efforts based on this data.

    We can also use internal search to find what your users want and what they call it. We can do tricks to show whether they found any results. We can use this to edit the site to use this language (and the corresponding paid and organic search lists), or to turn it into the next big offering, or at least to investigate why in the world someone might think they would find that on your site.

    Yes, GA does a lot for search, and it’s soooooo much better than nothing. If you want to see some real gold, check out how, at http://www.analyticsindex.com (the tool is called Analytics Fox), you can get a tool (for Yahoo! stores only at the moment) that will tell you if your keywords are in your site in broad or exact match format. It’s insanely useful. Go GA! My rant here is done.
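    And for the log-side equivalent of that referrer trick: the same keywords can be mined offline from the referral strings your server already records. A rough Python sketch, where the engine list, parameter names and log path are illustrative assumptions:

        import re
        from collections import Counter
        from urllib.parse import urlparse, parse_qs

        # Major engines and the query-string parameter each one uses for the search terms
        PARAMS = {"google.": "q", "search.yahoo.": "p", "search.msn.": "q"}

        REFERRER = re.compile(r'"([^"]*)" "[^"]*"$')  # next-to-last quoted field of a combined log line

        keywords = Counter()
        with open("access.log") as log:  # illustrative path
            for line in log:
                m = REFERRER.search(line.strip())
                if not m:
                    continue
                url = urlparse(m.group(1))
                for host_fragment, param in PARAMS.items():
                    if host_fragment in url.netloc:
                        for kw in parse_qs(url.query).get(param, []):
                            keywords[kw.lower()] += 1

        for kw, count in keywords.most_common(25):
            print(f"{count:5}  {kw}")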

  20. Great comments here Brian, and I’m glad it caught someone like you and Avenue A’s attention.

    The article’s purpose was to illustrate the fact that GA is page tagging analytics, and sadly, page tagging has limitations.

    I’m actually a fan of GA through and through. It’s just that you’ll need other tools (like analytics fox and others) to put all of the pieces together.

  21. I’m stuck in the dark ages using GA and my server log / stats to track user behavior and so far it’s worked out ok for us. From reading this post it would seem that there are better options but I have to question if I actually need to go that far. I’m sure there is a lot of science going on here but do we actually need it?

  22. I don’t need to get any more bogged down in stats than my java based stats program. I can waste enough time on this alone. Time to get back to content creation.

  23. Enh, I agree…Analytics just doesn’t really click with me. Plus, it slows down my site and it’s a pain in the ass to set up a cron to download the .js file every coupla days.

    But, what else is free? Dedicated servers cost money, and I’m already whoring myself out as fast as I can…Can’t afford no fancy weblog thingamajig, aye? We be poor.

  24. I may have to correct you concerning Googlebot’s ability to execute JavaScript – I just found a hit from Googlebot on a site that uses JavaScript to track visitors.

    Browser: Firefox 1.0, Resolution: 1280 x 1024, 24 bits, rDNS: crawl-66-249-67-55.googlebot.com

  25. People, GA is still in development; they even have a site/blog with information on the updates every few weeks. This article is half a year old, and there have already been some changes in GA. I have a little site, and this amount of information is very interesting. You have to differentiate between sites by size and, for example, whether they are shops, how deep their link structure is, and so on.

  26. From a technical perspective, I agree completely with your article. I use Google Analytics as a simple, quick tool for a few things however, such as page visits, bounce rates, total time on site, and long tail metrics.

    For the more technical data, I use SmarterTools’ SmarterStats. It is a great log file analyzer and gives some great information.

  27. I think this is a question of how deep you want to dig into the data.

    If you just want the hit demographics, GA may be a sufficient and easy solution. But if you are looking at SEO, stolen resources hotlinked on someone else’s site (this happens a lot if your site is COOL), etc., log-based tools give you deeper insight. That is the bottom line.

  28. I have been on both sides of the fence and recently took the time to get more familiar with GA and its filter options. As a tool it does what you need it to do, and it keeps improving.

    The main issue with analyzing log files is the numerous spiders, scrapers and other tools that do not assign a user agent and because of this appear as customers. In some cases this can throw your logs out a long way. The other issue with relying on log files is analyzing customer pathways and trends. It is near impossible to track the pathway a user took in the logs, as you could have several users sitting behind a proxy accessing your site at the same time.

    If the main gripe is that GA or Omniture do not track spider movement, then solve the problem by placing a bot page link just below your GA code.

    Example (the file names below are illustrative):
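        <!-- invisible link that only spiders will follow; file names are illustrative -->
        <a href="/bot-page.html"><img src="/images/transparent.gif" width="1" height="1" alt="" border="0"></a>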

    Ensure that the image is a 1px * 1px transparent image so it cannot be seen by regular users.

    The bot page content only needs to have some basic text. e.g. This is a bot tracking page.

    What you can look for is the referring page that took the bot to the bot landing page. Now you will be able to see when your bots visited the site and what pages they visited.

    Cheers
    zeroanarchy

  29. I don’t trust this story and I do believe that you are promoting other paid alternatives simply for financial gain. I say this with a bit of hesitation.

    When you are looking at your visitors, you want to know where your “Visitors” are and not bots. You want to see what your “Visitors” are doing and not the bots. Bots will not tell you what they think of the content of a page. Bots won’t give you bailout info like “Visitors” will. Bots will not convert your site to cash money, “Visitors” will. So, who cares about bots. You can simply look at your hosting webstats to get bot info. You want to know what and where your “Visitors” are. Not bots…

    So, you can see why I say this posting is only for financial gain…

    Regards,

    KeepingItRealGuy

  30. Does Google Analytics mess up a lot for anyone else? It seems that GA has a hard time with unique users vs. returning users. Maybe it has something to do with users clearing their cookies, etc.

    Good post though. Enjoyable read.

  31. I guess I am one of those newbie SEOs, because I have always used Google Analytics. Mainly because it’s free, and because I don’t have to charge my clients more to absorb the cost. I have never used a log-based program though, so maybe I should take a look at that.

  32. I think GA is an excellent tool for the price. :) Though you’re right, it isn’t a good idea to keep all of your eggs in the same basket. This has made me interested in trying out alternative tools.

  33. It’s been a couple of years since this article was written. Are there any updates as to Google releasing a system that will pull your web server log files to provide the data the author says it needs?

  34. I wonder how much time and energy you put into stating such things. I’m no one in SEO, but I can surely understand that a product like Google Analytics is popular even while lacking such features. People who are specialists in SEO must already know all these things. Why are YOU putting your ENERGY into saying Google Analytics isn’t good enough? Why DON’T you mind your own business?

    Hey, no offence, but I’m not favoring Google or anything. It’s just that I didn’t like the negative energy around this article. I kind of think I wasted my time here.

  35. We use Google Analytics to track data for our clients. As far as free is concerned, Google Analytics is the best way to go. If we are talking about small business clients, we do not need a comprehensive reporting suite. I would say that if you own a small business, there is no data that Google Analytics cannot cover. I would recommend that if you are getting 20k-50k visits, Google Analytics is still good for you. If you get more than that, then using another reporting suite is a good idea.