We present an overview of a tool that covers most of the tasks involved in conducting a technical SEO audit of a website.
Let's get started.
This tool is available in a freemium version: you can crawl a site of up to 500 URLs for free. Note, however, that this quota covers not only pages but also images, scripts, documents, and other resources. The free version also has limited analytical features.
The full version costs about £149, which at the time of writing is more than 14,000 rubles.
First, download the program from the official website; install it only from this trusted source, not from torrents offering cracked versions. After entering the license key in the “License” menu, you can start using the program.
Note that the interface is English-only; no localization is available. Every analyzed report can be exported to Excel for further work.
Let’s take a website as an example and analyze it. To do this, launch the program, enter the site domain in the appropriate field and begin scanning.
If you want to scan all pages of a site, simply enter the domain and click “Start”. The Crawl progress indicator shows the percentage of work completed.
If you need to analyze specific pages instead, use the “List” mode: select “Upload”, paste the links, and start scanning.
If you need to exclude certain pages or subdomains, you can configure this with regular expressions. Crawling can also be disabled for pages that are blocked by the robots.txt file.
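The regex-based exclusion logic can be illustrated with a minimal Python sketch. The patterns and URLs below are hypothetical examples of the kind of rules you might enter, not the program's own defaults:

```python
import re

# Hypothetical exclusion patterns, similar to what a crawler's
# "exclude by regular expression" setting accepts.
EXCLUDE_PATTERNS = [
    r"^https?://blog\.example\.com/",   # exclude an entire subdomain
    r"/tag/",                           # exclude tag archive pages
    r"\?sort=",                         # exclude sorted duplicate views
]

def is_excluded(url: str) -> bool:
    """Return True if the URL matches any exclusion pattern."""
    return any(re.search(p, url) for p in EXCLUDE_PATTERNS)

urls = [
    "https://example.com/products/",
    "https://blog.example.com/post-1",
    "https://example.com/catalog?sort=price",
]
kept = [u for u in urls if not is_excluded(u)]
print(kept)  # only https://example.com/products/ survives
```

Any URL matching at least one pattern is dropped before it is requested, which is exactly why exclusions reduce both crawl time and the free quota you consume.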
For more detailed settings, you can go to the Configuration -> Spider -> Basic menu and select which elements you want to include in the analysis.
The crawl speed can be adjusted through Configuration -> Speed, where Max Threads sets the number of concurrent threads hitting the site: the lower this value, the less load on the server.
On the right side of the interface, the “SEO elements” panel lists all the elements the program detected on the site; from this list you can select only those needed for your analysis.
To analyze outgoing links, go to the External tab. Here you can find out which page contains a particular outgoing link. By selecting the link you need, you can see on which pages it appears.
To analyze the site structure, open the Internal tab and choose the desired table view to see the internal pages and how they relate to each other.
The Protocol tab shows which pages on the site use HTTP or HTTPS. You can check the correctness of the redirect and make sure that all pages have the necessary security certificate.
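Splitting crawled URLs by protocol is straightforward to sketch in Python. The URLs here are hypothetical; the point is that any page still in the `http` bucket needs a redirect to its HTTPS version:

```python
from urllib.parse import urlparse

# Hypothetical crawl results: classify discovered URLs by scheme
# to spot pages still served over plain HTTP.
urls = [
    "https://example.com/",
    "http://example.com/old-page",
    "https://example.com/about",
]

by_scheme = {"http": [], "https": []}
for u in urls:
    by_scheme[urlparse(u).scheme].append(u)

print(by_scheme["http"])  # pages that still need a redirect to HTTPS
```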
To analyze server response codes, go to Response codes and select the desired response type for further analysis.
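Conceptually, the Response Codes report is a grouping of crawled URLs by HTTP status. A minimal sketch with hypothetical `(status, url)` pairs:

```python
from collections import defaultdict

# Hypothetical (status, url) pairs as a crawler might report them.
responses = [
    (200, "https://example.com/"),
    (301, "https://example.com/old"),
    (404, "https://example.com/missing"),
    (200, "https://example.com/about"),
]

by_status = defaultdict(list)
for status, url in responses:
    by_status[status].append(url)

for status in sorted(by_status):
    print(status, len(by_status[status]))
```

From such a grouping you would, for example, pull the 404 bucket to fix broken links and the 301 bucket to check redirect chains.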
To analyze URLs, go to the URL block. Here you can check for duplicate URLs, the presence of parameters in addresses and the length of URLs.
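The three URL checks named above can be sketched in a few lines of Python. The URL list and the 115-character "too long" threshold are assumptions for illustration:

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical crawl results.
urls = [
    "https://example.com/page",
    "https://example.com/page",
    "https://example.com/catalog?page=2",
    "https://example.com/" + "very-long-segment/" * 8,
]

duplicates  = [u for u, n in Counter(urls).items() if n > 1]
with_params = [u for u in urls if urlparse(u).query]
# 115 characters is an assumed threshold for "too long" URLs.
too_long    = [u for u in urls if len(u) > 115]

print(duplicates, with_params, too_long)
```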
Analyze titles in the Page Titles block. The important checks are missing titles, duplicates, titles identical to the H1, and multiple title tags on a single page.
Description tags are checked similarly to titles: missing tags, duplicates, and multiple tags on one page. Resolving these issues is recommended to improve the site's visibility.
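The title checks described above reduce to simple set logic. A hedged sketch over hypothetical per-page `(url, title, h1)` data:

```python
from collections import Counter

# Hypothetical per-page data: (url, title, h1).
pages = [
    ("https://example.com/",  "Home",     "Welcome"),
    ("https://example.com/a", "Products", "Products"),
    ("https://example.com/b", "Products", "Catalog"),
    ("https://example.com/c", "",         "Contacts"),
]

title_counts = Counter(title for _, title, _ in pages if title)

missing    = [url for url, title, _ in pages if not title]
duplicated = [url for url, title, _ in pages if title and title_counts[title] > 1]
same_as_h1 = [url for url, title, h1 in pages if title and title == h1]

print(missing, duplicated, same_as_h1)
```

The same three checks (missing, duplicated, multiple per page) apply to description tags with the obvious substitution.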
Pay particular attention to images: the program checks the file size of images as well as the presence of missing alt attributes.
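Detecting missing alt attributes is a plain HTML-parsing task. Here is a minimal sketch using Python's standard-library parser on a hypothetical snippet of markup:

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collect <img> tags that lack a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "?"))

html = '<p><img src="a.png" alt="Logo"><img src="b.png"></p>'
checker = AltChecker()
checker.feed(html)
print(checker.missing_alt)  # ['b.png']
```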
For meta keywords, it is usually enough to check only the pages where they are missing; the other parameters can be checked as desired.
It is important to configure the H1 and H2 tags correctly. Parameters such as the absence of tags, duplicates and the presence of several tags on the page are checked.
Check pages for canonical links and for correct use of rel="next" and rel="prev" tags in pagination.
Check meta robots directives such as noindex, nofollow, and noarchive to make sure search engines index your site as intended.
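A meta robots `content` value is just a comma-separated list of directives, so checking a page reduces to parsing that string. A minimal sketch:

```python
# Parse a meta robots content string into individual directives.
def parse_robots(content: str) -> set:
    return {d.strip().lower() for d in content.split(",") if d.strip()}

directives = parse_robots("noindex, NOFOLLOW")
print("noindex" in directives)   # True
print("noarchive" in directives)  # False
```

Normalizing to lowercase matters because directives are case-insensitive in practice, while a naive string comparison would miss `NOFOLLOW`.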
In the Sitemap menu, you can create an XML sitemap and an image map that meets the requirements of search engines.
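The XML sitemap format itself is simple; the sketch below builds a minimal valid sitemap for two hypothetical URLs using Python's standard library, following the sitemaps.org schema:

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace from the sitemaps.org protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

urlset = ET.Element("urlset", xmlns=NS)
for loc in ["https://example.com/", "https://example.com/about"]:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc

xml_bytes = ET.tostring(urlset, encoding="utf-8", xml_declaration=True)
print(xml_bytes.decode())
```

The program generates this file for you from the crawl results; the point of the sketch is only to show what the output format contains.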
In order for the program to work as a search bot, you can configure it through Configuration -> User Agent by selecting the desired preset.
To search for specific data on a site (for example, contact details or structured-data markup), use the Custom -> Search function: enter the desired search pattern and the program will look for it on every crawled page.
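A custom search like this is essentially a regex applied to each page's source. A sketch with a hypothetical email mask over a hypothetical HTML fragment:

```python
import re

html = """<footer>Call us today,
or write to contact@example.com</footer>"""

# Hypothetical search mask: a simple email pattern.
email_re = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
print(email_re.findall(html))  # ['contact@example.com']
```

The same approach works for any text fragment you want to locate site-wide, such as an `itemtype` attribute for microdata or a tracking-script snippet.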
Slow loading pages can be found through Response Codes -> Response Time, sorting the data by server response time.
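Sorting by response time is the whole trick here. A sketch over hypothetical `(url, seconds)` pairs:

```python
# Hypothetical (url, response_time_seconds) pairs from a crawl.
timings = [
    ("https://example.com/", 0.21),
    ("https://example.com/report", 3.80),
    ("https://example.com/about", 0.45),
]

slowest_first = sorted(timings, key=lambda t: t[1], reverse=True)
print(slowest_first[0][0])  # the slowest page
```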
Check the number of internal links through the Internal -> Inlinks tab. This will help improve internal linking and site structure.
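Counting inlinks is a matter of tallying link edges by their target. A sketch over a hypothetical internal-link graph:

```python
from collections import Counter

# Hypothetical internal-link edges: (source_page, target_page).
edges = [
    ("/", "/about"),
    ("/", "/catalog"),
    ("/about", "/catalog"),
    ("/catalog", "/"),
]

inlinks = Counter(target for _, target in edges)
print(inlinks["/catalog"])  # 2
```

Pages with few or zero inlinks are the ones worth surfacing through better internal linking.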
To integrate with Google Analytics, configure API access via Configuration -> API Access -> Google Analytics; this loads analytics data directly into the program for analysis.
If you have any questions or need help in analyzing the site, contact the SEO studio "SEO COMPUTER" by email: info@seo.computer.