Crawl budget is how fast and how many pages a search engine wants to crawl on your site. It’s affected by the amount of resources a crawler wants to use on your site and the amount of crawling your server supports.
More crawling doesn’t mean you’ll rank better, but if your pages aren’t crawled and indexed, they aren’t going to rank at all.
Most sites don’t need to worry about crawl budget, but there are a few cases where you may want to take a look. Let’s look at some of those cases.
When should you worry about crawl budget?
How to check crawl activity
What counts against crawl budget?
How does Google adjust its crawling?
How can I make Google crawl faster?
How can I make Google crawl slower?
When should you worry about crawl budget?
You usually don’t have to worry about crawl budget for popular pages. It’s usually pages that are newer, aren’t well linked, or don’t change much that aren’t crawled often.
Crawl budget can be a concern for newer sites, especially those with a lot of pages. Your server may be able to support more crawling, but because your site is new and likely not very popular yet, a search engine may not want to crawl your site very much. This is mostly a disconnect in expectations. You want your pages crawled and indexed, but Google doesn’t yet know whether your pages are worth indexing and may not want to crawl as many of them as you’d like.
Crawl budget can also be a concern for larger sites with millions of pages or sites that are frequently updated. In general, if you have lots of pages not being crawled or updated as often as you’d like, then you may want to look into speeding up crawling. We’ll talk about how to do that later in the article.
How to check crawl activity
If you want to see an overview of Google’s crawl activity and any issues it identified, the best place to look is the Crawl Stats report in Google Search Console.
There are various reports here that help you identify changes in crawling behavior, spot issues with crawling, and learn more about how Google is crawling your site.
You definitely want to look into any crawl statuses that Google flags.
There are also timestamps of when pages were last crawled.
If you want to see hits from all bots and users, you’ll need access to your log files. Depending on your hosting and setup, you may have access to tools like AWStats and Webalizer, which are common on shared hosts with cPanel. These tools show some aggregated data from your log files.
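If you just need a quick answer, you don’t even need a dedicated tool. Here’s a minimal sketch in Python that tallies Googlebot requests per URL and notes when each was last crawled. It assumes your server writes the standard Apache/Nginx “combined” log format; the file path is hypothetical, so point it at your real log:

import re
from collections import Counter

# Matches the Apache/Nginx "combined" log format:
# IP - - [timestamp] "METHOD path HTTP/x.y" status bytes "referer" "user-agent"
LINE_RE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+)[^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

hits_per_path = Counter()
last_crawled = {}  # path -> timestamp of the most recent hit

with open("access.log") as f:  # hypothetical path; adjust for your server
    for line in f:
        m = LINE_RE.match(line)
        if not m:
            continue  # skip lines that don't match the expected format
        if "Googlebot" in m.group("agent"):
            hits_per_path[m.group("path")] += 1
            # Log lines are chronological, so the last match is the newest
            last_crawled[m.group("path")] = m.group("time")

# Most-crawled URLs first
for path, count in hits_per_path.most_common(20):
    print(f"{count:6d}  {path}  (last crawled {last_crawled[path]})")

Keep in mind that anything can claim to be Googlebot in its User-Agent header, so for a rigorous audit you’d also verify that the requesting IPs actually belong to Google, for example with a reverse DNS lookup.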
For more complex setups, you’ll have to get access to and store data from the raw log files, possibly from multiple sources. For larger projects, you may also need specialized tools such as an ELK stack (Elasticsearch, Logstash, Kibana), which allows for storage, processing, and visualization of log files. There are also log analysis tools such as Splunk.
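Once the logs are shipped into Elasticsearch, a question like “how often does Googlebot hit us each day?” becomes a simple aggregation query. Here’s a hedged sketch using the official elasticsearch Python client; the weblogs-* index pattern and the user_agent and @timestamp field names are hypothetical and depend on how your pipeline parses the logs:

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # adjust to your cluster

resp = es.search(
    index="weblogs-*",  # hypothetical index pattern
    size=0,             # aggregation only; skip returning raw documents
    query={"match_phrase": {"user_agent": "Googlebot"}},
    aggs={
        "crawls_per_day": {
            "date_histogram": {"field": "@timestamp", "calendar_interval": "day"}
        }
    },
)

for bucket in resp["aggregations"]["crawls_per_day"]["buckets"]:
    print(bucket["key_as_string"], bucket["doc_count"])

The same daily histogram is what you’d typically chart in Kibana to watch for sudden drops or spikes in crawl activity.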