If you have a sneaking suspicion that a big share of your website’s visitors are in fact robots, Google has a way for you to find out.
By enabling the new “Bot filtering” option in Google Analytics, website managers can now see bot-free stats for their websites. This means the analytics will reflect the true number of human visitors, rather than page views artificially inflated by bots, spiders and web crawlers.
Here are instructions for activating bot filtering in the Analytics dashboard, from Google Analytics’ Matthew Anderson:
You can simply select a new checkbox option, which is included at the view level of the management user interface. This option is labeled “Exclude traffic from known bots and spiders.” Selecting this option will exclude all hits that come from bots and spiders on the IAB known bots and spiders list. The backend will exclude hits matching the User Agents named in the list, as though they were subject to a profile filter. This will allow you to identify the real number of visitors coming to your site.
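To make the mechanism concrete, here is a minimal sketch in Python of the kind of filtering described above: dropping hits whose User-Agent string matches a known bot or spider pattern. Note that the pattern list, the hit structure and the function names below are illustrative assumptions only; the actual IAB/ABC International Spiders & Bots List is licensed and far more comprehensive, and Google applies the exclusion on its own backend rather than exposing it as code.

```python
import re

# Illustrative patterns only -- NOT the actual IAB known bots and spiders list.
KNOWN_BOT_PATTERNS = [
    r"googlebot",
    r"bingbot",
    r"baiduspider",
    r"yandexbot",
    r"crawler",
    r"spider",
]

BOT_RE = re.compile("|".join(KNOWN_BOT_PATTERNS), re.IGNORECASE)


def is_bot(user_agent: str) -> bool:
    """Return True if the User-Agent string matches a known bot pattern."""
    return bool(BOT_RE.search(user_agent or ""))


def filter_human_hits(hits):
    """Keep only hits that do not look like bot or spider traffic.

    Each hit is assumed (hypothetically) to be a dict with a 'user_agent' key,
    mirroring the idea of excluding hits by User Agent, as a profile filter would.
    """
    return [hit for hit in hits if not is_bot(hit.get("user_agent", ""))]


if __name__ == "__main__":
    sample_hits = [
        {"page": "/home", "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"},
        {"page": "/home", "user_agent": "Mozilla/5.0 (compatible; Googlebot/2.1)"},
    ]
    human_hits = filter_human_hits(sample_hits)
    print(f"{len(human_hits)} of {len(sample_hits)} hits look human")
```

In Google Analytics itself, none of this code is needed: ticking the checkbox at the view level is enough, and the exclusion is applied to incoming hits automatically.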