Before your resource can appear in search results, the search engine must first learn that it exists. For this, special programs are used: robots (crawlers) that visit web pages and save their data.
Adding your site's pages to the search engine involves several consecutive stages:
The search robot independently determines how often to visit sites, how many pages to scan, and in what order of priority. Its main sources of information for crawling include:
The robot repeatedly checks new links, changes, and the availability of the resource until the page:
When accessing the server, the bot receives an HTTP response code that determines its further actions:
If a temporarily unavailable page returns code 429, the bot will keep retrying. However, under prolonged overload, the crawl rate may decrease.
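The response-code logic above can be sketched as a simple status-to-action map. This is an illustrative assumption, not the search engine's actual policy; the action names are hypothetical:

```python
def crawler_action(status: int) -> str:
    """Map an HTTP response code to a hypothetical next crawler action."""
    if status == 200:
        return "index"        # page is available, content can be stored
    if status in (301, 302, 308):
        return "follow-redirect"
    if status in (404, 410):
        return "drop"         # remove the page from the index
    if status == 429:
        return "slow-down"    # keep retrying, but reduce the crawl rate
    if 500 <= status < 600:
        return "retry-later"  # temporary server-side error
    return "skip"
```

For example, `crawler_action(429)` returns `"slow-down"`, matching the behavior described above for overloaded servers.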
At this stage, the bot stores the page's structure, content, and meta information. The following is taken into account:
After analyzing this information, the bot passes the data to the ranking algorithms. Not all pages make it into the index, especially if:
If your page is useful but does not appear in search, the reason may be high competition or insufficient visibility.
The algorithm evaluates the page's relevance, the usefulness of its information, and its readability:
Only high-quality pages from your site can appear in search results. Not all pages are indexed, and some may later disappear from the index.
Why does the snippet differ from the Description?
The meta tag is not necessarily displayed; the search engine may instead show the most relevant text from the page.
Why are internal frames (iframes) displayed?
Check the availability of the parent window in the browser console.
No Last-Modified header on the server
This does not prevent indexing, but it can slow down the recrawling of changed pages.
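A server that does send Last-Modified lets a robot skip pages that have not changed since the last visit. A minimal sketch with Python's standard library; the `needs_recrawl` comparison logic is an illustrative assumption, not the search engine's algorithm:

```python
from email.utils import formatdate, parsedate_to_datetime

def needs_recrawl(last_modified_header: str, last_crawled_ts: float) -> bool:
    """Re-download only if the page changed after the last crawl (hypothetical check)."""
    modified_ts = parsedate_to_datetime(last_modified_header).timestamp()
    return modified_ts > last_crawled_ts

# Server side: advertise the page's modification time in HTTP-date format.
header = formatdate(1700000000, usegmt=True)  # e.g. "Tue, 14 Nov 2023 22:13:20 GMT"
```

Without the header, a robot cannot make this comparison and must re-download the page to detect changes, which is why recrawling can slow down.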
Does character encoding matter?
No. The robot detects the encoding itself, even if it is not specified.
Does the Revisit-After directive work?
No, it is ignored.
Indexing of foreign-language domains
Sites in Russian and other languages are indexed if they contain relevant content.
Does URL length matter?
Yes. Long, convoluted addresses can hurt indexing. The maximum length is 1024 characters.
Indexing of GZIP archives
Yes, it is supported.
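For example, a large sitemap is commonly served as a gzip archive, and the robot decompresses it transparently. A minimal round-trip with Python's standard library (the sitemap content is a placeholder):

```python
import gzip

sitemap = b'<?xml version="1.0"?><urlset></urlset>'  # placeholder content
compressed = gzip.compress(sitemap)            # what the server stores and sends
restored = gzip.decompress(compressed)         # what the robot reads back
assert restored == sitemap
```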
Indexing of links with anchors
The robot ignores the anchor (fragment), except for the AJAX format (#!).
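This fragment handling can be sketched as a URL normalization step. The function below is an illustrative assumption of such behavior, not the robot's actual code:

```python
from urllib.parse import urldefrag

def normalize_for_crawl(url: str) -> str:
    """Drop the fragment, except legacy AJAX '#!' URLs, which are kept as-is."""
    if "#!" in url:
        return url  # hashbang fragments identify distinct AJAX page states
    return urldefrag(url).url
```

So `https://example.com/page#section` and `https://example.com/page` are treated as the same document, while a `#!` URL keeps its fragment.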
What about pagination?
Pages with `Prev` and `Next` parameters are indexed as ordinary pages.
If your pages have not appeared in search results for a long time, or have disappeared, you can contact technical support and provide examples of such pages.
For any questions about the indexing and promotion of your site, you can contact an SEO company: