Crawl depth is the level to which a search engine is able to crawl your website and index the pages it finds. If you have a new site, expect Google's crawler, Googlebot, to visit it anywhere between four days and four weeks of it going live.
For the uninitiated, or those new to SEO, Google sends out crawlers (collectively known as Googlebot) to crawl websites and save the information they find in its index. Google then draws on this index to determine the SERP (search engine results page) every time a user performs a search.
What is crawl depth?
Crawl depth (also called click depth) is the level to which a search engine is able to crawl your website and index the pages it finds. It is determined by the number of clicks needed to reach each page from the home page, following the shortest path.
Because most websites have multiple pages that branch into subpages, a site's structure can run much deeper than its home page.
As such, a website’s home page is given a crawl depth of 0. Any pages linked from the home page are given a crawl depth of 1, any pages linked from those crawl depth 1 pages are given a crawl depth of 2, and so on.
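This numbering is, in effect, a breadth-first search over the site's link graph: the first time a page is reached is, by definition, via the shortest click path. A minimal sketch (the site structure and page paths here are made up for illustration):

```python
from collections import deque

def crawl_depths(links, home="/"):
    """Compute the crawl depth of every reachable page via breadth-first
    search: the home page is depth 0, pages it links to are depth 1, etc."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first visit = shortest click path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site: each page maps to the pages it links to.
site = {
    "/":              ["/blog", "/about"],
    "/blog":          ["/blog/seo-tips"],
    "/blog/seo-tips": ["/blog/crawl-depth"],
}

print(crawl_depths(site))
# {'/': 0, '/blog': 1, '/about': 1, '/blog/seo-tips': 2, '/blog/crawl-depth': 3}
```

Note that any page not returned by such a traversal is unreachable from the home page, which is exactly the situation crawlers struggle with.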
The further down this hierarchy a web page sits, the lower its chances of ranking highly in the SERP.
How does crawl depth affect SEO?
If a page is not linked directly from the home page, the crawlers' journey to reach it takes longer. A page that takes a high number of clicks to reach is very unlikely to rank highly on the SERP: the harder a page is to find, the lower its chances of being ranked.
This is because, and take note, crawlers only have a set amount of time to spend on each site (often called the crawl budget). If they can't reach every page of your site within that allowance, the unreached pages won't be indexed or ranked. Period.
How do I increase Google crawl rate?
- Internal linking. The easiest way to increase the Google crawl rate of your site is to improve its internal structure. That doesn't mean you have to reorganise the whole site, which for a large site would be a nightmare. Instead, the easiest way to improve your site's structure (and make it easier for Googlebot to navigate, while improving your users' experience) is to include plenty of internal links. Internal linking reduces the number of clicks it takes to reach the far-flung pages of your website, making them easier to find for both users and crawlers, and it also offers your users additional, potentially valuable content they are looking for (reducing your bounce rate in the process). Google's main aim is to make every user experience as friendly as possible, and for that you should be handsomely rewarded in the SERP.
- Breadcrumbs. Breadcrumbs let your users see, at a glance, where they are on your website. They give a top-down view of how your site is laid out, allowing easier navigation by both crawlers and humans. Breadcrumbs can be:
- Location based. This type of breadcrumb shows where a page sits within your site's categories, e.g. Home page > SEO learning > Crawl depth > Breadcrumbs.
- Keyword based. These don't necessarily show users their location on your site, but they do link pages through meta information or attributes.
- Path based. These reflect the route a user actually took to reach the current page. Path-based breadcrumbs are numerous and dynamic, since there are many possible routes to the same page.
Location-based breadcrumbs are probably the best for SEO purposes.
How often does Google crawl?
In short: no one knows.
Mainly because the crawl process is algorithmic – it is controlled by a machine, not a human.
If you have a new site, expect Googlebot to visit it anywhere between four days and four weeks of it going live. However, this isn't set in stone: some websites have been indexed within a day.
Google states that it regularly and routinely crawls websites, but how often that occurs, and when you will see the results of your recent SEO efforts, just isn't known. Google does say that how often it gets around your site depends on the links you have in place, your current page ranking and how many crawling roadblocks (e.g. dead links) your site has.
You can find out when Google last visited your site via Google Search Console (formerly Google Webmaster Tools).
- Log into Google Search Console.
- Click ‘Crawl’ in the side menu.
- The stats presented show all Googlebot activity on your website over the last 90 days.
- You can even check which URLs Google has indexed by clicking ‘Google Index’, followed by ‘Index Status’.