I have a website, and recently when I checked Webmaster Tools I found the bot is crawling many old, dead pages. I'm sure there's no way to reach those pages on the website anymore, so why is Googlebot still trying to crawl these non-existent pages? Because of this, the report shows lots of 404 errors.
Your sitemap needs to be kept up to date. If you're running something like a WordPress site, there are plugins that will keep it updated for you. We also had an online shop with a script that would regenerate the sitemap at whatever interval you set, e.g. daily, weekly, or monthly, and of course you can use cron to schedule that interval as well.
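As a rough illustration of that scripted approach, here is a minimal sketch of a sitemap generator that a cron job could run on whatever schedule you choose. The URL list and file paths are hypothetical placeholders; a real site would pull its live URLs from a database or CMS so dead pages are simply left out.

```python
from datetime import date
from xml.etree import ElementTree as ET

def build_sitemap(urls, outfile="sitemap.xml"):
    # Standard sitemap namespace per sitemaps.org.
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    today = date.today().isoformat()
    for u in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = u
        ET.SubElement(entry, "lastmod").text = today
    ET.ElementTree(urlset).write(outfile, encoding="utf-8",
                                 xml_declaration=True)

if __name__ == "__main__":
    # Hypothetical list of live pages only; removed URLs are left out,
    # so the sitemap never points Google at dead pages.
    build_sitemap(["https://example.com/", "https://example.com/about"])
```

To run it daily, a crontab entry like `0 3 * * * /usr/bin/python3 /path/to/build_sitemap.py` (path is a placeholder) would regenerate the file every night at 3 a.m.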