B2B Articles - Mar 14, 2012 12:26:34 AM

Crawl errors revisited for webmasters

Google just updated the crawl errors feature in Webmaster Tools, which informs designers and webmasters of problems Googlebot encounters when trying to crawl a website. This tool has become a staple for many web designers who need insight into problems that could potentially hurt search rankings or site visibility.

With the recent update, crawl errors are now divided into two categories:

  1. site errors
  2. URL errors

Site errors are categorized as:

  1. DNS errors
  2. Server connectivity problems
  3. Robots.txt fetch failures
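When a robots.txt fetch fails, Googlebot generally postpones crawling rather than guessing at the rules. To see what a successful fetch gives a crawler, here is a minimal sketch using Python's standard `urllib.robotparser`; the robots.txt contents below are a made-up example, not anything from Google's announcement:

```python
from urllib.robotparser import RobotFileParser

# Parse a hypothetical robots.txt file, supplied inline here.
# A real crawler would fetch it from http://example.com/robots.txt.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# The parsed rules tell the crawler which paths are off limits.
print(rp.can_fetch("Googlebot", "/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "/public/page.html"))   # True
```

If this file cannot be fetched at all, none of these answers are available, which is why a robots.txt fetch failure is treated as a site-level error rather than a per-URL one.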

URL errors are categorized as:

  1. Server error
  2. Access denied
  3. Soft 404
  4. Not found
  5. Not followed
  6. Other (as in, zombies got there first)
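Several of these buckets can be approximated from the HTTP status code alone, while a soft 404 requires looking at the page content: the server returns 200 OK, but the body is really an error page. The heuristic below is a simplified sketch of that distinction, not Google's actual classification logic, and the phrase matching is a made-up example:

```python
def classify_crawl_result(status: int, body: str = "") -> str:
    """Map an HTTP response to a rough URL-error bucket.

    A simplified heuristic, not Googlebot's real logic.
    """
    if status >= 500:
        return "server error"   # 5xx: the server itself failed
    if status in (401, 403):
        return "access denied"  # the crawler was refused
    if status in (404, 410):
        return "not found"      # a genuine hard 404/410
    # A "soft 404" returns 200 OK but the body is an error page.
    if status == 200 and "page not found" in body.lower():
        return "soft 404"
    return "ok"

print(classify_crawl_result(503))                          # server error
print(classify_crawl_result(200, "Oops, page not found"))  # soft 404
```

Soft 404s matter because the crawler wastes budget on pages that look fine to the server but offer nothing to index.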

Also, Google now shows trends over the past 90 days for each type of error within Webmaster Tools. Google is also listing URLs in priority order--ranked by numerous factors.

Unfortunately, the updates to Webmaster Tools have resulted in some serious missing functionality, which may or may not be intentional. This loss of functionality includes a decrease in the number of URLs that you can download, as well as specifics about soft 404s, URLs blocked by robots.txt, and more.
