If you use a hosted website builder for your site, for example Wix or Blogger, you may not need to edit the robots.txt file directly. Instead, your provider may offer a search settings page or another mechanism for telling search engines which pages of your site may be indexed and which may not.

If you want to hide one of your pages from search engines or make it visible to them, look up the instructions for changing page visibility on your platform, for example with a search query such as "how to hide a page from search engines in Wix". This lets you control which files search robots may crawl without editing the robots.txt file yourself.
The robots.txt file is located in the root of your site. For example, for www.example.com, the robots.txt file lives at www.example.com/robots.txt. It is a plain text file that follows the Robots Exclusion Standard.

A robots.txt file consists of one or more rules, each of which blocks or allows a particular crawler's access to a specified path on the site. By default, all files on the site may be crawled unless you specify otherwise.
Here is an example of a simple robots.txt file with two rules:

User-agent: Googlebot
Disallow: /nogooglebot/

User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
What this file means:

- The crawler named Googlebot may not crawl any URL that starts with /nogooglebot/.
- All other crawlers may crawl the entire site (this is also the default behavior, so the rule is shown only for clarity).
- The site's Sitemap file is located at https://www.example.com/sitemap.xml.
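You can sanity-check how such rules behave with Python's built-in urllib.robotparser module. The sketch below parses the example rules from a string (the URLs are the placeholders from the example above) and asks whether given crawlers may fetch given paths:

import urllib.robotparser

rules = """\
User-agent: Googlebot
Disallow: /nogooglebot/

User-agent: *
Allow: /
"""

# Parse the rules from a string instead of fetching them over HTTP.
parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot is blocked from /nogooglebot/ but may crawl everything else.
print(parser.can_fetch("Googlebot", "https://www.example.com/nogooglebot/page.html"))  # False
print(parser.can_fetch("Googlebot", "https://www.example.com/page.html"))              # True

# Any other crawler may fetch the whole site.
print(parser.can_fetch("OtherBot", "https://www.example.com/nogooglebot/page.html"))   # True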
Creating and testing a robots.txt file consists of several stages:

- Create a plain text file named robots.txt.
- Add rules that control which crawlers may access which parts of the site.
- Upload the file to the root directory of your site.
- Test that the file is publicly accessible and that the rules behave as intended.
The rules in the robots.txt file tell crawlers which parts of your site they may crawl. A few recommendations for writing rules:

- Group rules by crawler: each group starts with a User-agent line naming the crawler it applies to.
- Remember that rules are case-sensitive: Disallow: /private/ does not block /Private/.
- Keep the default in mind: anything not explicitly disallowed may be crawled.

For example, the following file forbids Googlebot to crawl the /private/ directory while leaving the rest of the site open to all crawlers:

User-agent: Googlebot
Disallow: /private/

User-agent: *
Allow: /
After uploading the robots.txt file to your site, check that it is accessible to search robots. To do this, open an incognito window in your browser and go to your robots.txt file, for example https://example.com/robots.txt. If you can see the contents of your file, it is available for crawlers to process.
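You can also run this availability check programmatically. Here is a minimal sketch using Python's standard urllib; the example.com URL is a placeholder for your own domain:

import urllib.request

# Placeholder URL: substitute your own domain here.
url = "https://example.com/robots.txt"

with urllib.request.urlopen(url) as response:
    # An HTTP 200 status means crawlers can fetch the file.
    print(response.status)
    # Print the file contents to confirm they are what you uploaded.
    print(response.read().decode("utf-8"))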
To test the file and fix problems with its rules, you can use:

- the robots.txt report in Google Search Console;
- Google's open-source robots.txt parser library, which implements the same matching logic Google uses in production.
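You can also test your deployed rules locally with Python's standard urllib.robotparser, which fetches and parses the live file. A small sketch with a placeholder domain (note that urllib.robotparser is a simpler matcher than Google's own parser, so treat it as a quick check rather than the authoritative answer):

import urllib.robotparser

# Placeholder domain: point this at your own robots.txt.
parser = urllib.robotparser.RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()  # fetch and parse the live file

# Ask whether a given crawler may fetch a given URL.
print(parser.can_fetch("Googlebot", "https://example.com/private/report.html"))
print(parser.can_fetch("*", "https://example.com/index.html"))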
After the robots.txt file has been uploaded and tested, Google's crawlers will find it and start using it automatically; there is no need to submit the file manually. However, if you have updated the file and want Google to refresh its cached copy sooner, you can request a recrawl of the file from the robots.txt report in Google Search Console.
Here are some more useful robots.txt examples.

Block the entire site from all crawlers:

User-agent: *
Disallow: /

Block the /private/ directory for all crawlers:

User-agent: *
Disallow: /private/

Allow only Googlebot and block all other crawlers:

User-agent: Googlebot
Allow: /

User-agent: *
Disallow: /
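As a quick check of the last example, this sketch (again with Python's standard urllib.robotparser) confirms that Googlebot is let in while any other crawler falls into the blocking * group:

import urllib.robotparser

rules = """\
User-agent: Googlebot
Allow: /

User-agent: *
Disallow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("Googlebot", "/any/page.html"))  # True: Googlebot matches its own group
print(parser.can_fetch("Bingbot", "/any/page.html"))    # False: other agents match the * group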
If you have questions about setting up your robots.txt file or about other SEO issues, you can contact our SEO team at info@seo.computer or via WhatsApp at +79202044461.