I have been running my personal website for more than 7 years and use Google Analytics to track my visitors. I found that more than 50% of my web traffic comes from bots (robots), not humans! I was very surprised. Are we creating beautiful designs and doing SEO, SEM, etc. just for bots to visit our websites?
Here is an in-depth analysis from the web security company Incapsula. They measured that bots can account for up to 61.5 percent of all website traffic, with only 38.5 percent being human. And the ratio is increasing every year.
Who are those bots? Is it good for a website?
More than 21% of bots are good, like search engines and other crawling bots (which crawl the website for the latest content). But more than 31% of bots are bad, like spammers, hackers, scrapers and impersonators. Most of the bad bots eat up server CPU and memory and reduce the website's performance, which in turn hurts the website's search engine ranking.
So how can we save the website from bad bots?
There are a few SEO best practices for tweaking robots.txt and .htaccess files to protect websites from bad bots. You can learn more here: http://moz.com/learn/seo/robotstxt. There are also many ways to stop bad bots at the web server level.
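As a minimal sketch of both approaches (the bot names below are made-up examples; replace them with user agents you actually see in your server logs): robots.txt politely asks well-behaved crawlers to stay away, while an Apache .htaccess rule can actively refuse requests from known bad user agents.

```
# robots.txt -- polite bots honor this; bad bots usually ignore it
User-agent: BadBot        # example name only, not a real bot
Disallow: /

# .htaccess (Apache, mod_rewrite) -- actively block by user agent
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (BadBot|EvilScraper) [NC]
RewriteRule .* - [F,L]    # return 403 Forbidden
```

Note that robots.txt is only a request: bad bots routinely ignore it, so the .htaccess (or equivalent web-server) block is what actually stops them.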
Credits: http://www.incapsula.com/the-incapsula-blog/item/820-bot-traffic-report-2013
Image credits: www.3news.co.nz