Bots take over most web traffic
A new study has found that human users account for less than half of all web traffic. The study estimated that 61.5 per cent of all website traffic is generated by software ‘bots’, while humans account for only 38.5 per cent. The proportion of automated bots is up from 51 per cent in 2012.
What are bots? Bots are tools used by software developers to crawl websites. For instance, they may be used by a company like Google to index pages for its search engine, by internet archive firms to save website content before the sites become defunct, and by data analytics firms to measure a site’s performance.
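A crawler bot’s core job is simple: fetch a page, extract its links, and queue those links for the next fetch. A minimal sketch of the link-extraction step is shown below, using only Python’s standard library; the HTML snippet and class name are invented for illustration, not taken from any real crawler.

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects href targets from anchor tags, as a crawler would
    before queueing them for the next round of fetches."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Record the destination of every <a href="..."> element.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A hypothetical fetched page, stood in for a real HTTP response.
page = """<html><body>
<a href="/about">About</a>
<a href="https://example.com/news">News</a>
</body></html>"""

collector = LinkCollector()
collector.feed(page)
print(collector.links)  # ['/about', 'https://example.com/news']
```

A real search-engine or archiving bot would wrap this in a fetch loop, respect each site’s robots.txt, and identify itself with a User-Agent string, which is also how traffic-analytics firms distinguish bot visits from human ones.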
Incapsula, the firm responsible for the report, based these results on a data set of 90 days of website traffic. In good news, the firm also reported a drastic reduction in malicious bots: bots that post spam or links to malicious software or phishing sites.
There was a 75 per cent reduction in these malicious bots, which the firm attributed to Google’s new efforts to curb spam links via its upgraded PageRank algorithm, called ‘Penguin’. There was also a 10 per cent drop in malicious hacking bot activity over the last year, along with a 55 per cent increase in ‘good bots’ working to improve online systems.
This increase in bot activity could put many websites under strain in the future. It is also becoming harder for search engine optimisation and online advertising firms to monetise in the new environment, as such firms had often resorted to using spam bots to achieve their goals.