The number of pages that get crawled per day can make or break your site's SEO. If the number of valid business pages crawled per day is going down, take it for granted that your rankings are going to change too, and that change may be positive or negative.

A quick brief on a few terms:

The number of pages crawled per day is measured as the crawl rate.
The frequency at which a web page gets crawled is called the crawl frequency.
Crawl rate and crawl frequency together are referred to as crawl stats.

So, if the number of web pages crawled per day is coming down, it means your crawl rate is low. A drop in crawl rate can be positive or negative; in this article I will discuss when to consider it a serious problem and when not to.

Many SEO professionals have to realize one thing: everything that is done on a website, even before someone starts visiting it organically from search engines, is done to improve the crawl stats of that website.

Beyond that, whatever gets implemented in SEO is usually for one of the following reasons:

To improve the CTR
To improve user engagement
To improve the conversion rate
To improve the crawl rate

In the end, SEO is always a set of implementations intended to improve the factors above.

Before we get into the pros and cons of a drop in crawl rate, let's understand the reasons behind it (fewer pages getting crawled per day).

Reasons for the drop in crawl rate:

Below are the 8 most common reasons for a drop in a website's crawl rate.

Allowing or disallowing some page types in robots.txt:

When you find that some of the pages on your website are redundant, thin, or of no use to customers or bots, and you fix this through robots.txt by writing a directive such as "Disallow: /path-to-be-blocked/", bots will stop crawling those redundant pages, and that is when you will see the number of pages crawled come down. Conversely, if you allow some good page types that were previously blocked, the number of pages crawled will increase.
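As a hedged illustration (the paths below are hypothetical, not taken from any real site), a robots.txt that blocks redundant page types could look like this:

    User-agent: *
    # Hypothetical examples: block thin or duplicate page types
    Disallow: /search-results/
    Disallow: /tag/
    # Re-allow a valuable section so it keeps getting crawled
    Allow: /blog/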

Need help in defining robots.txt for your website? Saving crawl resources and making use of them effectively can increase your website traffic many times over.

Adding or removing noindex, nofollow on some web pages or on the complete website:

When you add noindex, nofollow meta tags in the head section of your web pages or across the website, bots stop spending time on those pages, so only a limited set of pages gets crawled and the pages crawled per day drops.

If you remove noindex, nofollow, or change it to index, follow, bots start accessing those pages again and the pages crawled per day will increase.
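For reference, these are the two states as they appear in a page's head section (a minimal sketch; which pages deserve which tag depends on your site):

    <!-- Keeps the page out of the index and tells bots not to follow its links -->
    <meta name="robots" content="noindex, nofollow">

    <!-- The crawl-friendly state -->
    <meta name="robots" content="index, follow">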

Lots of traffic to the website / server too burdened:

If your website is on shared hosting or has bandwidth issues, you may see the number of pages crawled come down. When your server is busy serving real user requests coming from browsers, it is not free for bots to crawl the web pages effectively; in this case you will either see 500 server-error status codes or pages loading too slowly. So, websites with low bandwidth and too much traffic will see their crawl stats impacted.
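One quick way to spot this (a minimal Python sketch, assuming the third-party requests package is installed; the URL is a placeholder) is to watch the status codes and response times your pages return:

    import requests

    # Placeholder URL: replace with pages from your own site
    url = "https://example.com/"

    resp = requests.get(url, timeout=10)
    print(resp.status_code)              # codes in the 500 range suggest an overloaded server
    print(resp.elapsed.total_seconds())  # slow responses also eat into the crawl budget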

Moving to a new, heavy design:

Bots visit your website with a certain bandwidth (crawl budget), which can be measured in terms of time (duration) or data.

Case 1: A bot with a bandwidth of 50 MB

Let's say that during its visit it can crawl 50 MB, and each of your web pages is 2 MB; the bot can then crawl only 25 pages.

Whereas in your old design each page was 500 KB, so the bot was able to crawl 100 web pages easily.

So, with the increase in web page sizes, the number of pages crawled has come down.

Case 2: A bot with a bandwidth of 60 seconds

Let's assume your new design is heavy, makes too many HTTP requests to the server, and takes 2 seconds to load a complete web page; in this case the bot can crawl only 30 web pages in its minute. Now assume your old design was very light and loaded in 0.5 seconds; the bot could then crawl 120 web pages in a minute. That is how page size and load time change the number of pages crawled per day.
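The arithmetic behind both cases is simple division; here it is as a small Python sketch (the budget and page-size numbers are the illustrative ones above, not real Googlebot figures):

    # Crawl-budget arithmetic from the two cases above
    def pages_by_data(budget_mb, page_size_mb):
        return int(budget_mb // page_size_mb)

    def pages_by_time(budget_seconds, load_time_seconds):
        return int(budget_seconds // load_time_seconds)

    print(pages_by_data(50, 2))      # heavy pages -> 25
    print(pages_by_data(50, 0.5))    # light pages -> 100
    print(pages_by_time(60, 2))      # slow pages  -> 30
    print(pages_by_time(60, 0.5))    # fast pages  -> 120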

Removing the current internal linking or implementing poor internal linking:

The number of pages crawled per day can go up only when bots can easily access all of your web pages. If bots cannot reach the inner web pages or the next levels of web pages, they cannot crawl them, and hence the pages crawled per day comes down.

Make sure you are doing this the right way. If your website is massive, with a few hundred or a few thousand web pages, seek our help. We help tens of businesses every week.

Removing or adding the HTML sitemap, or adding more web pages to it:

An HTML sitemap helps bots access all of the web pages on the website. Adding it creates convenience for bots and removing it creates inconvenience, so the number of web pages bots crawl will vary accordingly.
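In its simplest form, an HTML sitemap is just a page of plain links (a minimal sketch; the page names are hypothetical):

    <!-- Hypothetical HTML sitemap: one reachable link per important page -->
    <ul>
      <li><a href="/">Home</a></li>
      <li><a href="/services/">Services</a></li>
      <li><a href="/blog/">Blog</a></li>
      <li><a href="/contact/">Contact</a></li>
    </ul>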

Changing your hosting or servers:

A great server is always an advantage for SEO. Good server performance makes it pleasant for bots to crawl the web pages effectively, and hence a good server plays a role in the crawl rate.

Bots other than Googlebot:

There are a few hundred bots or web crawlers that may keep visiting your website and occupying it most of the time; in that case, Googlebot may not be able to crawl your website effectively. Any change in defining which kinds of bots may visit and which may not will also have an effect on the crawl rate.

You can explore an implementation of this on robots.txt, where only a specific set of bots is allowed to crawl the website, here: https://kandra.pro/robots.txt
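As a hedged sketch of that idea (the user-agent tokens are real crawler names, but this policy is just an illustration, not a copy of the file linked above), a robots.txt that admits only Googlebot could look like this:

    # Illustrative policy: let Googlebot in, turn every other bot away
    User-agent: Googlebot
    Disallow:

    User-agent: *
    Disallow: /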

Wondering whether this is a good thing or a bad one?

The following table will help you understand when it is good and when it is bad.

Effect of implementation changes on pages crawled per day:

| Sl No | Implementation Type | Implementation | Result | Status |
|-------|---------------------|----------------|--------|--------|
| 1 | Changes in robots.txt | Allowing only valid pages | Pages crawled dropped | Good |
| 2 | Changes in robots.txt | Allowing all pages | Pages crawled increases | Bad |
| 3 | Meta index or noindex | Allowing only valid pages | Pages crawled dropped | Good |
| 4 | Meta index or noindex | Allowing all pages | Pages crawled increases | Bad |
| 5 | Server too burdened | Too much traffic | Pages crawled dropped | Bad |
| 6 | Moving to new design | Site heavy | Pages crawled dropped | Bad |
| 7 | Moving to new design | Site light | Pages crawled increases | Good |
| 8 | Internal linking | Removed | Pages crawled dropped | Bad |
| 9 | Internal linking | Changed | Pages crawled increases | Good |
| 10 | Internal linking | Improved | Pages crawled increases | Good |
| 11 | HTML sitemap | Implemented | Pages crawled increases | Good |
| 12 | Change of host/server | Good server/good uptime | Pages crawled increases | Good |
| 13 | Change of host/server | Not good/less uptime | Pages crawled dropped | Bad |
| 14 | Allowing/disallowing bots | Allowing all bots | Pages crawled dropped | Bad |
| 15 | Allowing/disallowing bots | Allowing only specific bots like Googlebot | Pages crawled increases | Good |

Conclusion:

Crawl stats play a crucial role in SEO. If you see pages crawled per day coming down consistently, you must spare some time to understand the possible reasons, as that drop will often affect you negatively. If you are not putting in the effort to understand it, you may not be the right person for technical SEO.

In paid marketing, if you want to get traffic to your website, you pay Google; if you want your website to rank organically, you have to make sure it is getting crawled effectively on a regular basis.


-

Manjunath Chowdary

Digital Marketing Expert, Consultant, Mentor and
Director of KandraDigital Marketing
Solutions Pvt Ltd.

-Kandra Digital

An agency that has been built with the core purpose of delivering quality digital marketing in an era where digital marketing services are treated as just a business, rather than as value for the business, its owners, and their resources and time.

Get in touch with us.