Ken Lyons, August 13.
There are plenty of reasons why you'd want Googlebot to recrawl your website ahead of schedule. Maybe you've cleaned up a malware attack that damaged your organic visibility and want a clean bill of health so rankings recover faster; or maybe you've implemented site-wide canonical tags to eliminate duplicate content and want those updates sorted out quickly; or you want to accelerate indexing for that brand new resources section on your site.

To force recrawls, SEOs typically resort to tactics like resubmitting XML sitemaps, using a free ping service like Seesmic Ping (formerly Ping.fm) or Ping-O-Matic to try and coax a crawl, or firing a bunch of social bookmarking links at the site. Trouble is, these tactics are pretty much hit or miss.

The good news is, there's a better, more reliable way to get Googlebot to recrawl your site ahead of your standard crawl rate, and it's 100 percent Google-endorsed.
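As an aside, the sitemap "ping" tactic is easy to script yourself. Here's a minimal sketch in Python, assuming Google's documented sitemap ping endpoint; the sitemap URL is a hypothetical placeholder, and as noted above, a ping nudges Google to re-fetch the sitemap but guarantees nothing about recrawl timing.

```python
# A minimal sketch of the sitemap "ping" tactic mentioned above. The Google
# endpoint below was documented at the time of writing; the sitemap URL is a
# hypothetical placeholder -- swap in your own.
import urllib.parse
import urllib.request

SITEMAP_URL = "http://www.example.com/sitemap.xml"  # hypothetical

def ping_google(sitemap_url):
    """Ask Google to re-fetch a sitemap; returns the HTTP status code."""
    endpoint = ("http://www.google.com/ping?sitemap="
                + urllib.parse.quote(sitemap_url, safe=""))
    with urllib.request.urlopen(endpoint) as response:
        return response.getcode()

if __name__ == "__main__":
    print("Google sitemap ping returned HTTP", ping_google(SITEMAP_URL))
```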
Meet "Submit URL to Index"
Last year, Google updated "Fetch as Googlebot" in Webmaster Tools (WMT) with a new feature called "Submit URL to Index," which allows you to submit new and updated URLs that Google themselves say they "will usually crawl within a day."

For some reason, this addition to WMT got very little fanfare in the SEO sphere, and it should have been a much bigger deal than it was. Search marketers should know that Submit URL to Index comes as advertised, and is very effective in forcing a Google recrawl and yielding almost immediate indexing results.
Quick Case Study and Some Tips on Using "Submit URL to Index"
Recently, a client started receiving a series of notifications from Webmaster Tools about a big spike in crawl errors, including 403 errors and robots.txt file errors. These alerts from WMT are relatively new and part of Google's continued campaign to give site owners more visibility into their site's performance (and help them diagnose performance issues), which started with the revamping of the crawl errors feature back in March.

Almost immediately, organic traffic and SERP performance began to suffer for the client site, which is to be expected given the number of error notices (five in three days) and the rash of errors cropping up.

The final email WMT sent to the client warned that Google was unable to access the site, and that was the real tip-off: it turned out the client's developer had inadvertently blocked Google's IP (see the sketch after this section for one way to verify Googlebot addresses before blocking them).

In the past, technical issues like the one above might take days to discover. These new crawl error notifications are a real godsend: they saved us a ton of time and effort trying to isolate and diagnose the issues, and they greatly reduce the amount of time it takes to solve them. That means we spend less time fighting fires and more time on progressive SEO efforts.

After the developer removed the block on Google's IP and we felt the issue was solved, we wanted to force a recrawl. To do this, you first need to run your URLs through the "Fetch as Googlebot" feature, which gives you diagnostic feedback on whether Google succeeded or failed when attempting to fetch each URL. If Google is able to fetch the URL successfully, you're then granted access to use the "Submit URL to Index" feature.

Here are a few tips when using this feature:

- Select "URL and all linked pages" rather than "URL" when submitting for a recrawl. This designates the URL you submit as the starting point for a crawl, and includes a recrawl of all internal links on that page and thus whole interlinked sections of the site.
- You can also force Google to crawl URLs that aren't in your error reports by going to "Fetch as Googlebot" and plugging in any URL on your site. FYI, you can leave the field blank if you want Google to use the home page as the starting point for a recrawl.
- When choosing additional URLs to crawl, submit pages that house the most internal links, so you're "stacking the deck" in trying to force as deep a crawl as possible across as many URLs as possible: think HTML site map and other heavily linked-up pages.
- Keep in mind that Google limits you to ten index submissions per month, and that's per account. So if you host a number of client sites in the same WMT account, be aware and use your submits sparingly.

After forcing your recrawls, return to the crawl errors screen, select the offending category (in this case, the access denied tab), select URLs individually or select all, and mark them as fixed.

It's worth noting that there may be a system lag with some of these notices, so even after you've made fixes, you may still get technical error notices. But if you're confident you've solved all the issues, just repeat the process of marking URLs as fixed until you get a clean bill of health.

The day after forcing a Google recrawl of the client's site, we saw an immediate spike in crawl activity in Webmaster Tools. As a result, we were able to solve the issue in a few days, and traffic rebounded almost immediately.
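Since the root cause here was a server-side block on Google's IP, it's worth knowing Google's recommended way to verify a crawler before you firewall anything: do a reverse DNS lookup on the requesting IP, check the hostname is under googlebot.com or google.com, then do a forward lookup to confirm it resolves back to the same IP. Here's a minimal sketch of that check; the sample address is just an illustration, not a statement of Google's current ranges.

```python
# A minimal sketch of Google's recommended reverse-then-forward DNS check for
# verifying that an IP really belongs to Googlebot. The sample IP below is
# just an illustration, not a guarantee of Google's current address ranges.
import socket

def is_real_googlebot(ip_address):
    """Return True if ip_address reverse-resolves to a Google crawler host
    and that host forward-resolves back to the same IP."""
    try:
        host, _, _ = socket.gethostbyaddr(ip_address)  # reverse DNS lookup
    except socket.herror:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        forward_ips = socket.gethostbyname_ex(host)[2]  # forward DNS lookup
    except socket.gaierror:
        return False
    return ip_address in forward_ips

# Example call (hypothetical address): is_real_googlebot("66.249.66.1")
```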
I also believe that submitting multiple "internal link hub" type URLs for Google to crawl -- including the HTML site map and an extensively linked-up resources page -- really helped speed up recovery time.
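If you want corroborating evidence outside WMT's crawl stats, your server access logs will show the same spike. Here's a rough sketch that tallies Googlebot requests per day from a combined-format Apache/Nginx log; the log path is a hypothetical placeholder, and since a bare user-agent match can be spoofed, pair it with the DNS check above for anything security-sensitive.

```python
# A rough sketch for confirming a recrawl spike in your own server logs.
# Assumes a combined-format Apache/Nginx log at a hypothetical path; a bare
# user-agent match can be spoofed, so pair this with the DNS check above.
import re
from collections import Counter
from datetime import datetime

LOG_PATH = "/var/log/apache2/access.log"  # hypothetical log location
DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")  # matches e.g. [13/Aug/2012

def googlebot_hits_per_day(log_path):
    """Count requests per day whose user-agent string mentions Googlebot."""
    hits = Counter()
    with open(log_path) as log:
        for line in log:
            if "Googlebot" not in line:
                continue
            match = DATE_RE.search(line)
            if match:
                day = datetime.strptime(match.group(1), "%d/%b/%Y").date()
                hits[day] += 1
    return hits

if __name__ == "__main__":
    for day, count in sorted(googlebot_hits_per_day(LOG_PATH).items()):
        print(day, count)
```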
Final Thoughts on Submit to Index and Crawl Error Alerts
All of these feature upgrades in Webmaster Tools -- like the crawl error alert notifications -- are really instrumental in helping SEOs and site owners find and fix technical issues faster.

With Submit to Index, you no longer have to wait around for Googlebot to crawl your site and discover your fixes. You can resolve technical issues faster, leading to less SERP interruption and happier clients.