How to Resolve 30,000 Crawl Errors in Webmaster Tools?

arihantwebtech

New member
Yesterday I was checking my Webmaster Tools account and found 30,000 crawl errors. Can anyone tell me how to solve this issue in a quick way? Doing it manually has been very hard for me.
 
You can log in to Webmaster Tools and, on the errors page, hit "mark as solved" ;)
There are sites that provide free sitemaps along with page analysis; they trace the errors in your code and show them to you.
Then, if you have a recurring mistake, you can copy the whole code into Word and do a search and replace. Hope it helps.
 
Unfortunately you really need to go over them.

These errors can be caused by a multitude of reasons: old pages, bad links, bad directory permissions, something blocking the bot, etc.
All of these, however, have the potential to negatively impact your rankings if the crawler is having trouble accessing legitimate pages.

It could also be as simple as an automatic sitemap builder including things it shouldn't.
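If you suspect the sitemap, one practical first step is to check every URL it lists and see which ones no longer resolve. A minimal sketch in Python, assuming the requests library is installed; the sitemap URL is a placeholder for your own:

    import requests
    import xml.etree.ElementTree as ET

    SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder: your own sitemap

    # Fetch the sitemap and pull out every <loc> entry.
    resp = requests.get(SITEMAP_URL, timeout=10)
    root = ET.fromstring(resp.content)
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    urls = [loc.text for loc in root.findall(".//sm:loc", ns)]

    # HEAD each URL and flag anything that does not come back 200.
    for url in urls:
        try:
            status = requests.head(url, allow_redirects=True, timeout=10).status_code
        except requests.RequestException as exc:
            status = f"error: {exc}"
        if status != 200:
            print(status, url)

Any URL this flags is something the sitemap is feeding to the crawler that it shouldn't be.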
 
30,000 errors sounds like either you changed the structure of your site, you removed a bunch of pages, or something crazy happened.

It's not uncommon to have a blip of 10-20 errors show up where Google says it can't access a website, but then on a re-check it has no problem.

Check the date on the reported issues - are they all from the same date, or two? If so, something in the site likely changed. It should also give a reason why you're getting each error.

If it was a page structure change, then you could possibly create a wildcard redirect using regex, but you'll want to research that before implementing anything.

I certainly wouldn't remove them until I know why they're flagged. That heads-up by Google could save your website from losing traffic - so pay attention and get it resolved.
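To illustrate the wildcard-redirect idea: if the structure change is regular, one regex can map every old URL to its new home. A rough Python sketch; the URL patterns here are invented purely for illustration, and in practice the same pattern would typically become a single RedirectMatch 301 rule in Apache or a rewrite rule in nginx:

    import re

    # Hypothetical structure change: /articles/<year>/<slug> moved to /blog/<slug>.
    pattern = re.compile(r"^/articles/\d{4}/([a-z0-9-]+)/?$")

    def redirect_target(old_path):
        """Return the new path for an old-structure URL, or None if it doesn't match."""
        m = pattern.match(old_path)
        return f"/blog/{m.group(1)}" if m else None

    print(redirect_target("/articles/2013/crawl-errors"))  # -> /blog/crawl-errors
    print(redirect_target("/random-guess"))                # -> None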
 

The same thing is happening with me, though I haven't seen any loss in rankings yet. But I want to resolve this for my own knowledge and SEO understanding. Thanks for the reply!!!


One of my websites has more than 500 404 URLs and I am trying to resolve them manually, but it's proving difficult. When I reduced the number from 500 to 400, two days later it increased again from 400 to 460. I did do 301 redirections to relevant pages. Do you have any suggestions for how I can resolve them properly?
 
Are the 404s VALID? Meaning, was there once a page there that has since been deleted? If not, then you could be fielding 404 errors from a robot that's just guessing at words on a site.

Check your backlink profile and see if the places that are linking to those 404 URLs maybe need a link update, or if they're even legit.

Did you make a massive website structure change? Moving everything from one folder into another?
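One way to judge whether those 404s are legit is to look at who is actually requesting them. A quick sketch, assuming your server writes the standard Apache/nginx "combined" log format; the log path is a placeholder:

    import re
    from collections import Counter

    # Matches the request, status, and referrer fields of a "combined" format line.
    LOG_LINE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) \S+ "(?P<referer>[^"]*)"')

    referers = Counter()
    with open("access.log") as log:  # placeholder: path to your access log
        for line in log:
            m = LOG_LINE.search(line)
            if m and m.group("status") == "404":
                referers[m.group("referer")] += 1

    # The most common referrers show who is sending traffic to dead URLs;
    # "-" usually means direct hits or bots guessing at paths.
    for ref, count in referers.most_common(20):
        print(count, ref)

Real referrers are backlinks worth updating or redirecting; a wall of "-" entries points at bots.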
 
If those URLs don't exist any more, select all errors and mark them as resolved.

NOT a good SEO strategy.

If you've built traffic and potential links, you need to forward those to relevant pages. If nothing relates to the article you removed, then sure, 404 it, or dump the user to a search page - but if there's even a hint of related articles on the new site, you need to 301 redirect the URLs and keep that time/money spent marketing working for you.
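The decision logic is simple enough to sketch. All paths below are hypothetical, just to show the shape of it: 301 to the closest related page when one exists, fall back to a 404 (or a site-search page) when nothing relates:

    # Hypothetical mapping of removed URLs to their closest relevant replacements.
    REDIRECT_MAP = {
        "/old-guide/seo-basics": "/guides/seo-fundamentals",
        "/old-guide/link-building": "/guides/link-building",
    }

    def respond(path):
        """301 to a related page if one exists; otherwise fall back to a 404."""
        if path in REDIRECT_MAP:
            return 301, REDIRECT_MAP[path]
        return 404, None

    print(respond("/old-guide/seo-basics"))  # (301, '/guides/seo-fundamentals')
    print(respond("/never-existed"))         # (404, None)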
 
You really have to check what the errors were (404, etc.) and find out why they occurred before you just mark them as resolved. 30,000 is quite high. Also, they don't make it very easy to just mark all 30,000 as resolved. Last I checked, I think the most rows you could show and clear at once was 500.
 

But it is very difficult to do 301 redirections for all of those URLs manually; it's a very time-consuming process.
Is there any automated solution? If yes, please let me know.
 
Yes, you can automate it using RegEx, but again, I go all the way back to my previous question:
Did you change the structure of the site? Did the pages exist in the past? If so, then you can POSSIBLY use RegEx to redirect entire URL structures.

If the pages didn't exist before, then it could just be a bot knocking on the door, but you'll need to see where on the web those links might have existed etc.

One thing I can tell you about SEO - if it's easy, you're not doing it right! Take the time to invest in resolving the issues correctly, and Google, Bing, and users will reward you with results.
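Before deploying a RegEx redirect across an entire URL structure, it's worth dry-running it against the error list. A sketch under stated assumptions: it reuses the invented /articles/<year>/<slug> to /blog/<slug> mapping from earlier, the domain and file name are placeholders, and "crawl-errors.csv" stands in for the URL export Webmaster Tools lets you download, one URL path per row:

    import csv
    import re
    import requests

    SITE = "https://example.com"  # placeholder domain
    pattern = re.compile(r"^/articles/\d{4}/([a-z0-9-]+)/?$")

    with open("crawl-errors.csv") as f:
        for row in csv.reader(f):
            old_path = row[0]
            m = pattern.match(old_path)
            if not m:
                print("no rule for", old_path)  # needs a manual decision
                continue
            new_url = f"{SITE}/blog/{m.group(1)}"
            status = requests.head(new_url, allow_redirects=True, timeout=10).status_code
            print(old_path, "->", new_url, status)  # the target should be live (200)

Any "no rule for" lines are URLs your pattern doesn't cover, and any non-200 targets mean the rule would redirect visitors into another dead end.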
 