
How to Resubmit an Updated or New Robots.txt File


Our agency handles organic search consulting for several SaaS vendors. A client we recently began working with had followed a fairly common approach: hosting their application on a subdomain and moving their brochure site to the core domain. It's a standard setup because it lets the product team and the marketing team make updates as needed without depending on one another.

As a first step in analyzing their organic search health, we registered both the brochure and application domains in Google Search Console. That's when we identified an immediate issue: all of the application's pages were being blocked from search engine indexing. We navigated to the robots.txt report in Search Console and instantly spotted the cause.

While preparing for the migration, their development team didn't want the application subdomain to be indexed by search engines, so they disallowed crawler access. The robots.txt file is a plain text file found at the root of your site – yourdomain.com/robots.txt – that tells search engines whether or not they should crawl the site. You can write rules to allow or disallow crawling of the entire site or of specific paths, and you can also add a line that points to your sitemap file.
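As an illustration (the path and sitemap location here are hypothetical), a robots.txt file that blocks one directory, allows everything else, and declares a sitemap could look like this:

User-agent: *
Disallow: /private/

Sitemap: https://yourdomain.com/sitemap.xml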

The robots.txt file had the following entry, which prevented the site from being crawled and indexed in search results:
User-agent: *
Disallow: /

It should have been written as follows:
User-agent: *
Allow: /

The latter tells any search engine crawling the site that it may access every directory and file within the site.
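If you want to sanity-check how a crawler interprets those two rule sets, Python's built-in urllib.robotparser can evaluate them locally. This is just a quick sketch, and the subdomain URL below is a placeholder:

from urllib import robotparser

# The rule set that was blocking the client's application subdomain.
blocked = robotparser.RobotFileParser()
blocked.parse(["User-agent: *", "Disallow: /"])

# The corrected rule set.
allowed = robotparser.RobotFileParser()
allowed.parse(["User-agent: *", "Allow: /"])

# Placeholder URL for any page on the application subdomain.
url = "https://app.yourdomain.com/dashboard"
print(blocked.can_fetch("Googlebot", url))  # False: crawling disallowed
print(allowed.can_fetch("Googlebot", url))  # True: crawling permitted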

Great… so now the robots.txt file is correct, but how does Google know, and when will it check the site again? You can absolutely request that Google recheck your robots.txt, but the process isn't very intuitive.

In Google Search Console, navigate to Crawl > robots.txt Tester. The Tester displays the contents of the most recently crawled robots.txt file. If you'd like to resubmit your robots.txt file, click Submit and a popup will appear with a few options.


The final option is Ask Google to update. Click the blue Submit button next to that option, then navigate back to Crawl > robots.txt Tester and reload the page. You should now see the updated robots.txt file along with a date stamp showing that it was crawled again.

If you don't see an updated version, click Submit and select View uploaded version to navigate to the robots.txt file your site is actually serving. Many systems cache this file; IIS, for instance, generates it dynamically based on rules entered through its user interface. You'll most likely have to update the rules and refresh the cache before a new robots.txt file is published.
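One easy way to confirm what your server is actually serving (as opposed to whatever Google last cached) is to fetch the file directly. Here's a minimal sketch using Python's standard library; the URL is a placeholder for your own domain or subdomain:

import urllib.request

# Placeholder URL: replace with your own domain or subdomain.
with urllib.request.urlopen("https://app.yourdomain.com/robots.txt") as response:
    print(response.read().decode("utf-8"))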

