Fixing JavaScript problems on your site for Google Search

This guide helps you identify and fix JavaScript problems that may be blocking your page, or specific content on JavaScript-powered pages, from showing up in Google Search. While Google Search does run JavaScript, there are some differences and limitations that you need to account for when designing your pages and applications so that Googlebot can access and render your content correctly. Our guide on the basics of JavaScript SEO has more information on how you can optimize your JavaScript site for Google Search.

Crawl frequency and behavior for your site in Google

Googlebot is designed to crawl pages efficiently while minimizing the impact on user experience. Using the Web Rendering Service (WRS), Googlebot continually analyzes and identifies resources that don't contribute to essential page content, and may not fetch those resources. For example, reporting and error requests that aren't essential to the page's main content may be skipped during crawling.

To track Googlebot activity on your site, use the Crawl Stats report in Google Search Console to monitor what Googlebot and WRS are doing on your site and to get feedback.

How to check whether JavaScript is blocking your content from Google

If you suspect that JavaScript problems might be blocking your content from appearing in Google Search, take the following steps. If you're not sure whether JavaScript is the main cause of the problem, follow our general debugging guide to determine the source of the problem precisely.

  • To check how Googlebot crawls and renders a URL, use the Rich Results Test or the URL Inspection tool in Google Search Console. These tools show the loaded resources, JavaScript console output and exceptions, the rendered DOM, and other information.

Additionally, it's a good idea to collect and audit the JavaScript errors encountered by users of your site, including Googlebot, to identify potential problems that may affect how content is rendered.

Preventing soft 404 errors on your site for Google

In single-page applications (SPAs), preventing soft 404 errors can be especially difficult. To avoid error pages being indexed, use one of the following strategies:

  • Redirect to a URL for which the server responds with a 404 status code.
  • Add or change the robots meta tag to noindex.

Because SPAs use client-side JavaScript to handle errors, they often report a 200 HTTP status code instead of the appropriate error status code, which can lead to error pages being indexed.
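Both strategies can be sketched in a few lines of JavaScript. The function names and the /not-found route below are hypothetical, and for option 1 your server must actually answer that route with an HTTP 404 status:

```javascript
// Option 1: send the browser to a URL for which the server returns a
// real HTTP 404 status code ('/not-found' is a hypothetical route).
function redirectToNotFound() {
  if (typeof window !== 'undefined') {
    window.location.href = '/not-found'; // server must respond 404 here
  }
}

// Option 2: keep the current URL but add a robots meta tag set to
// noindex, so the error view is not indexed.
function addNoindexMeta(doc) {
  const meta = doc.createElement('meta');
  meta.name = 'robots';
  meta.content = 'noindex';
  doc.head.appendChild(meta);
  return meta;
}
```

Either approach prevents Googlebot from treating the error view as a normal, indexable page.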

How to account for Googlebot's limitations in your application

Googlebot declines requests that require user permission. For example, if your application requires camera access, Googlebot can't grant it. Instead, provide a way for users to access your content without being forced to grant a permission such as camera access.
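One way to do this is to feature-detect the camera API and offer an alternative path that needs no permission at all. The 'upload' fallback below is a hypothetical alternative flow, not part of any real API:

```javascript
// Sketch: choose a photo input method without requiring camera access.
// The 'upload' path works without a permission prompt, so it also
// works for Googlebot.
function choosePhotoInput(nav) {
  const hasCamera =
    nav && nav.mediaDevices && typeof nav.mediaDevices.getUserMedia === 'function';
  return hasCamera ? 'camera' : 'upload';
}
```

Note that even where the API exists, the permission request itself can still be declined, so the camera path should also handle a rejected getUserMedia() promise by falling back to the upload flow.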

Don't use URL fragments to load different content on your site

Single-page applications may use URL fragments (for example, https://example.com/#/products) to load different views. However, the AJAX crawling scheme has been deprecated since 2015, so you can't rely on URL fragments to get content indexed by Googlebot. Use the History API instead to load different content in your SPA.
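The migration can be sketched with two small helpers; the route names and the loadView callback are hypothetical:

```javascript
// Rewrite a legacy fragment URL ('#/products') into a real,
// crawlable path.
function fragmentToPath(url) {
  const u = new URL(url);
  if (u.hash.startsWith('#/')) {
    u.pathname = u.hash.slice(1); // '/products'
    u.hash = '';
  }
  return u.toString();
}

// Navigate with the History API so the address bar shows a real URL;
// loadView is a hypothetical function that renders the route.
function navigate(path, loadView) {
  if (typeof window !== 'undefined' && window.history) {
    window.history.pushState({}, '', path);
  }
  loadView(path);
}
```

With real paths like https://example.com/products, each view gets a distinct, indexable URL, provided your server also responds to those paths directly.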

Don't rely on data persistence to serve content on your site to Google

WRS loads each URL independently, like a regular browser would, and doesn't preserve state across page loads. This means that data in local storage, session storage, and HTTP cookies is cleared between requests, which can affect how your content renders.
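In practice this means client-side storage should be treated as an optional cache with a sensible default, never as the only source of content. A minimal sketch, where the 'locale' key and the default value are hypothetical:

```javascript
// Read a stored preference, but always fall back to a default, since
// WRS (and private browsing modes) may clear or block storage.
function getPreferredLocale(storage, fallback = 'en') {
  try {
    return (storage && storage.getItem('locale')) || fallback;
  } catch (e) {
    return fallback; // storage unavailable or cleared
  }
}
```

Content that must be indexable should render correctly on the fallback path alone.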

Using content fingerprinting to prevent Googlebot caching problems

Googlebot caches aggressively to reduce network requests and resource usage. WRS may ignore caching headers, which can lead to outdated JavaScript or CSS files being used. To avoid this problem, include a content fingerprint in the file name, for example main.2bb85551.js. Because the fingerprint depends on the file's contents, every update produces a unique file name, which ensures that Googlebot fetches the current version.

Use feature detection for all critical APIs that your site needs

Make sure your application uses feature detection for all critical APIs it needs, and provide fallback behavior or a polyfill where an API is unavailable. Some web features may not yet be supported by all user agents, and some may be deliberately disabled. For example, if you use WebGL to render photo effects in the browser, feature detection shows that Googlebot doesn't support WebGL; in that case, skip the effect or use server-side rendering instead.
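The WebGL example boils down to a simple capability check. The returned renderer labels below stand in for hypothetical client- and server-side rendering functions:

```javascript
// Feature-detect WebGL before relying on it; fall back to
// server-rendered images where it is unavailable (e.g. for Googlebot).
function choosePhotoEffectRenderer(win) {
  if (win && typeof win.WebGLRenderingContext !== 'undefined') {
    return 'webgl';  // render the effects in the browser
  }
  return 'server';   // serve pre-rendered images instead
}
```

The same pattern applies to any critical API: detect first, then either polyfill or degrade gracefully.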

Make sure your content works with HTTP connections for Google

Googlebot uses HTTP requests to retrieve content from your server. It doesn't support other types of connections, such as WebSockets or WebRTC. To avoid problems, make sure to provide an HTTP fallback for retrieving your content, and use robust error handling and feature detection.
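A sketch of that fallback, preferring WebSockets where available but keeping the content reachable over plain HTTP; the endpoint URL, the polling interval, and the fetchUpdates callback are all hypothetical:

```javascript
// Pick a transport: WebSockets if the user agent supports them,
// otherwise periodic HTTP polling so the same data stays reachable.
function createTransport(win, fetchUpdates) {
  if (win && typeof win.WebSocket === 'function') {
    return {
      kind: 'websocket',
      connect: () => new win.WebSocket('wss://example.com/live'),
    };
  }
  return {
    kind: 'http',
    connect: () => setInterval(fetchUpdates, 5000), // poll over HTTP instead
  };
}
```

For indexing purposes, the important property is that the initial page content does not depend on the WebSocket path at all.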

Make sure your web components render correctly for Google

Use the Rich Results Test or the URL Inspection tool to check whether the rendered HTML contains all the content you expect. WRS flattens the light DOM and shadow DOM. If your web components don't use the <slot> mechanism for light DOM content, consult the component's documentation or use another component.
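A minimal sketch of a slot-using component; the element name is hypothetical, and the guard only lets the file load outside a browser:

```javascript
// A custom element that projects its light DOM children through a
// <slot>, so the slotted content survives when WRS flattens the
// light and shadow DOM into the rendered HTML.
const cardTemplate = '<h2>Card</h2><slot></slot>';

if (typeof HTMLElement !== 'undefined' && typeof customElements !== 'undefined') {
  class SlottedCard extends HTMLElement {
    constructor() {
      super();
      this.attachShadow({ mode: 'open' }).innerHTML = cardTemplate;
    }
  }
  customElements.define('slotted-card', SlottedCard);
}
```

Content placed between the element's opening and closing tags in the page is projected into the slot, so it remains visible in the flattened HTML that Googlebot indexes.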

After fixing the problems, test your page with the Rich Results Test or the URL Inspection tool in Google

After you make changes, test your page again with the Rich Results Test or the URL Inspection tool in Google Search Console. If you've fixed the problem, a green check mark appears and no errors are shown. If you still see errors, post in the Search Central help community.
