Bots

Bots are pieces of software used by search engines to decide how websites will be ranked and how pages will be indexed.

Bot data is collected in an index. The index enables search engines to serve relevant search results more quickly.

More about bots

Bots find new web pages to crawl by following links. Once a new page is discovered, details about its content and its relationship to other pages are captured. Bots will also revisit sites to check for updates.
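As a rough sketch of how this works, the short Python example below follows links breadth-first from a starting page. The requests and BeautifulSoup libraries, the starting URL and the page limit are illustrative assumptions; real crawlers are far more sophisticated and also honour robots.txt.

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin

    def crawl(start_url, max_pages=10):
        # Breadth-first crawl: fetch a page, record it, queue its links.
        seen, queue = set(), [start_url]
        while queue and len(seen) < max_pages:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                page = requests.get(url, timeout=5)
            except requests.RequestException:
                continue  # skip pages that cannot be fetched
            soup = BeautifulSoup(page.text, "html.parser")
            # Discover new pages by following every hyperlink on this one.
            for link in soup.find_all("a", href=True):
                queue.append(urljoin(url, link["href"]))
        return seen

    crawl("https://example.com")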

In the case of search engines, bot data is collected in an index. The index enables search engines to serve relevant search results more quickly, drawing on the information captured by bots about which websites have relevant content on different topics.
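As a simplified illustration of the idea (not how any real search engine stores its data), the Python sketch below maps each word to the pages that mention it, so a query is answered by a direct lookup rather than a fresh crawl:

    from collections import defaultdict

    # An 'inverted index': each word maps to the set of pages containing it.
    index = defaultdict(set)

    def add_page(url, text):
        for word in text.lower().split():
            index[word].add(url)

    def search(word):
        # Answer the query with a direct lookup instead of re-crawling.
        return index.get(word.lower(), set())

    add_page("https://example.com/bots", "bots crawl the web by following links")
    add_page("https://example.com/index", "an index speeds up search results")
    print(search("bots"))  # only the first page is returned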

Bots can also enable websites to automatically pull in or ‘scrape’ information from other websites.
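For example, the Python sketch below pulls the headlines out of a page; the URL and the h2 tag are illustrative assumptions, and a real scraper should respect the target site's terms of use and robots.txt:

    import requests
    from bs4 import BeautifulSoup

    page = requests.get("https://example.com/news", timeout=5)
    soup = BeautifulSoup(page.text, "html.parser")
    # Extract the text of every second-level heading on the page.
    headlines = [h.get_text(strip=True) for h in soup.find_all("h2")]
    print(headlines)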

Bots can be used for legitimate or illegitimate purposes. Spammers might use bots to search the Internet for email addresses, collect them, and then use them later in spam campaigns.

Website owners can ask all bots, or specific ones, not to crawl parts of their website by adding rules to the site's robots.txt file. Most mainstream bots comply, but you can also attempt to block specific bots by editing the site's .htaccess file (on Apache servers).
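For example, the robots.txt rules below ask all bots to stay out of a /private/ directory and ask one bot to stay away entirely (the bot name is illustrative):

    User-agent: *
    Disallow: /private/

    User-agent: BadBot
    Disallow: /

Because robots.txt is only a request, a site running on an Apache server with mod_rewrite enabled can go further and refuse requests carrying that bot's user-agent string via .htaccess; this is a sketch under those assumptions:

    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} BadBot [NC]
    RewriteRule .* - [F,L]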

Some studies suggest that bots dominate web traffic, accounting for as many as three out of five 'visits' to a website.
