
Лена

Writer & Blogger

How to force the search robot to visit your website more often for better indexation

[Image: Google crawling bot]

Webmasters often face a problem when new pages do not make it into a search engine's index for weeks. This is especially true for Yandex: new pages appear there much later than in Google, and lately Yandex has been running text index updates only rarely, 2-3 times a month. In this article we will look at what influences the indexation of a website, how it can be accelerated, and how to monitor it.

What influences how often the search robot indexes a website

— Load on the server hosting the website (hosting quality)

If the server hosts many other resources and its technical characteristics do not allow it to cope quickly with all of the robot's requests, the robot begins to visit such a site less frequently. Accordingly, it will take longer for the site's pages to appear in search results.

— How often the website is updated overall

Search robots analyze how often content on a website is refreshed and use that to decide how often to visit this or that site.

The more often new content is added to the website, the more often search robots visit it.

— Visitor interest in the website (behavioral factors)

The search robot can revise its crawl schedule and visit the site more often if new materials that interest users (for example, news or articles) are regularly added to the website, so that:

  • users spend time on the website;
  • click through to internal pages;
  • bookmark the website;
  • share material on social networks, etc.

How to improve the indexation of the website

At each visit the search robot indexes a certain number of pages according to a quota that depends on a set of the website's parameters. In other words, even if the robot has visited your site, that does not mean it will index and include every page of the website in its base. It is therefore very important that the indexing robot not only visits your site but also adds all new and changed pages to its base.

Below in the article we will look at how to get the search robot to visit your site more often and, at the same time, how to increase the number of pages it can index in one visit, that is, the crawl budget.

1. Check the website for duplicate pages

This point comes first because duplicate pages are one of the main problems that worsen the indexation of a website by search robots.

When a website is full of duplicate pages, the crawl budget is spent on useless pages instead of new pages or pages with updated material.
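One quick way to catch many duplicates is to compare page titles. Here is a minimal sketch in Python, assuming you already have a list of your site's URLs (the URLs below are hypothetical; the requests and beautifulsoup4 libraries must be installed):

```python
# Spot likely duplicate pages by identical <title> tags.
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

# Hypothetical URL list; in practice, take it from your sitemap or CMS.
urls = ["https://example.com/page-a/", "https://example.com/page-b/"]
pages_by_title = defaultdict(list)

for url in urls:
    html = requests.get(url, timeout=10).text
    title_tag = BeautifulSoup(html, "html.parser").title
    title = title_tag.get_text(strip=True) if title_tag else ""
    pages_by_title[title].append(url)

for title, pages in pages_by_title.items():
    if len(pages) > 1:
        print(f"Possible duplicates for title {title!r}: {pages}")
```

Identical titles do not prove duplication, but they are a cheap signal of where to look first.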

2. Configure the server to return correct HTTP status codes

Correctly configured HTTP status codes are very important for proper indexation of the website.

When the search robot requests a page of the website, the status code gives it information about the website and the specific page:

  • whether such a page exists or not,
  • whether a redirect is configured,
  • whether there is an error on the server side.

For example, the HTTP «404 Not Found» code reports that no page exists at the requested address, and the HTTP «200 OK» code reports that the page is available.

We also recommend configuring the Last-Modified header.

The Last-Modified header tells the search robot the date when the document was last changed. Thanks to it, the indexing robot re-checks only the documents that have actually changed since the previous crawl, as well as new pages, without spending the crawl budget on pages that have not changed.
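You can verify both the status codes and the Last-Modified setup with two requests. A minimal sketch (the URL is hypothetical; a correctly configured server answers the second, conditional request with «304 Not Modified»):

```python
import requests

url = "https://example.com/article/"  # hypothetical URL

# 1. Status code: expect 200 for a live page, 404 for a missing one.
response = requests.get(url)
print("Status:", response.status_code)

# 2. Last-Modified: repeat the request conditionally.
last_modified = response.headers.get("Last-Modified")
if last_modified:
    conditional = requests.get(url, headers={"If-Modified-Since": last_modified})
    print("Conditional status:", conditional.status_code)  # 304 = configured correctly
else:
    print("The server does not send Last-Modified")
```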

3. Monitor server response time and page loading speed

The server's response time to a browser request directly influences the indexation of the website.

Taking cross-network delays into account, it should be under 300 milliseconds.
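You can measure this yourself before turning to external services. A minimal sketch (the URL is hypothetical; `elapsed` counts the time until the response headers arrive, which approximates the server response time):

```python
import requests

url = "https://example.com/"  # hypothetical URL

# stream=True stops the transfer after the headers, so we measure
# the server's response time rather than the full download time.
response = requests.get(url, stream=True)
ms = response.elapsed.total_seconds() * 1000
print(f"Server response time: {ms:.0f} ms (target: under 300 ms)")
response.close()
```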

Services for measuring the server's response time:

  • http://www.host-tracker.com/
  • Yandex.Webmaster

[screenshot]

The loading speed of the website is also very important for indexation.

Page load time should not exceed 3-5 seconds.

Services for checking the loading speed:

  • http://tools.pingdom.com/fpt/
  • http://www.webpagetest.org/

You can also check for errors with Google's PageSpeed service.

4. Organize the structure of the website competently

The clearer the structure of the website is to the search robot, the better the website will be indexed.

Recommendations on the structure of the website:

— Nesting level of pages

Any promoted page should be no farther than 3 clicks from the main page. This is very important for indexation because the search robot will spend much less time indexing a simple, shallow website than a complex resource with a confusing navigation system.
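You can estimate the nesting level of every page with a simple breadth-first crawl from the main page. A minimal sketch for a small site (the homepage URL is hypothetical; it follows same-host links only and stops expanding after 3 clicks):

```python
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

start = "https://example.com/"  # hypothetical homepage
host = urlparse(start).netloc
depth = {start: 0}
queue = deque([start])

while queue:
    page = queue.popleft()
    if depth[page] >= 3:  # do not expand further; deeper pages are the problem cases
        continue
    try:
        html = requests.get(page, timeout=10).text
    except requests.RequestException:
        continue
    for link in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        url = urljoin(page, link["href"]).split("#")[0]
        if urlparse(url).netloc == host and url not in depth:
            depth[url] = depth[page] + 1
            queue.append(url)

for url, level in sorted(depth.items(), key=lambda item: item[1]):
    print(level, url)
```

Promoted pages that do not show up within 3 levels of this crawl are worth relinking.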

— Implement hub pages on the website

Hub pages are pages that contain links to sections and subsections and serve for users' navigation on the website.

To improve indexation, hub pages should be reachable in one click from the main page of the website.

Implementing hub pages will do the following:

  1. Reduce the nesting level;
  2. Accelerate the indexation of pages by the search bot;
  3. Help users find the necessary material faster.

Example of a hub page:

[screenshot]

— Place announcements of new pages on the main page of the website

The search robot visits the main page of the website most frequently. If links to new pages or pages with updated content are placed on it, there is a high probability that the robot will index them.

For example:

[screenshot]

— Implement a recent-articles widget in the sidebar

One more good way to interlink the pages of the website:

[screenshot]

— Do not build the menu with scripts or Flash

The main disadvantage of menus built with scripts or Flash is that search robots do not see them.

5. Configure the robots.txt file

A correctly composed robots.txt will prevent many of the problems that arise when the robot scans the website and can considerably accelerate the indexation of the resource as a whole.

In the robots.txt file you need to spell out instructions for search engine robots: which pages of the website to index and which not to index. Search engines will then need less time to scan the website.

For example, in robots.txt you can close off (see the sample file after this list):

  • utility files and CMS folders;
  • internal and external duplicates (if there are any);
  • reply forms on forums;
  • technical pages;
  • pages that carry no useful information for the user (for example, visit statistics or search results pages).
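A minimal sketch of such a file (all paths are hypothetical and depend on your CMS):

```
User-agent: *
Disallow: /admin/          # CMS service folder
Disallow: /tmp/            # utility files
Disallow: /search/         # search results pages
Disallow: /*?replytocom=   # reply forms

Sitemap: https://example.com/sitemap.xml
```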

How to create a correct robots.txt?

  1. Create the robots.txt file in a text editor.
  2. Fill it in according to the rules. You can learn the Robots syntax from Yandex's recommendations for webmasters.
  3. Check the file with the Yandex.Webmaster service —> Analysis of robots.txt.

Screenshot of Yandex.Webmaster:

[screenshot]

  4. If the check was successful and no errors were revealed, upload the file to the website root.

6. External signals by means of links

To accelerate the indexation of the website, we recommend attracting search robots by means of external links. For example:

  • Register the website in trusted directories and ratings (for example, Yandex.Catalog, DMOZ, and the Mail.ru directory);
  • Post paid articles on websites with the same subject matter;
  • Communicate on thematic forums;
  • Integrate the website with social networks;
  • Create an RSS feed on the website;
  • Post articles on social news services;
  • Work with social bookmarking services.

7. Write quality, unique content

The quality of content also influences the speed of indexation of the website. A website with poor content that does not fully answer the user's query, contains grammar mistakes, or is overloaded with keywords is indexed worse.

When writing the text, it is important to consider that:

  • Texts should contain only the most important information, useful to visitors.
  • The material should fully answer the query; after reading it, the user should no longer need to look for information on the subject.
  • Focus attention on important nuances and get rid of filler and idle speculation.
  • Texts should be competently structured (use headings and subheadings, divide the text into paragraphs, make lists where necessary, etc.).
  • Texts should contain keywords, but in moderate quantity; avoid keyword spam.
  • It is desirable that the uniqueness of texts be no lower than 70%; you can check it with http://advego.ru/plagiatus/ or https://www.etxt.ru/antiplagiat/service.

How to monitor the indexation of the website

Add the website to Yandex.Webmaster and Google Search Console

If the website has not yet been added to the search engines' webmaster panels, we strongly recommend doing so. With their help, you will be able to analyze the indexation of the project easily. For example:

  • when the robot last visited your resource;
  • how many pages were loaded into its base;
  • whether a certain page is included in the search base;
  • what errors the robot found;
  • the load time of a page.

[screenshot]

You can also inform the search engine about new or deleted pages and monitor other parameters of the website that are not connected with indexation, for example, the external links pointing to the website, the queries through which users reach the website, etc.

Only a full analysis will help you reveal and correct website indexation errors in time.

Registering the website in the Yandex and Google webmaster tools is simple.

In Yandex.Webmaster it is a 3-step procedure that comes with detailed instructions:

[screenshot]

Adding the website to Google Search Console is also accompanied by instructions:

[screenshot]

How to inform the search robot about changes on the website

— Create a site map

To help the search robot find new pages on the website, we recommend creating a site map in two formats: XML and HTML.

The map in HTML format should be published on the website as a separate page. When scanning the website, the robot lands on this page; it should contain links to all pages of the website, which will help the robot find them.

The map in XML format (Sitemap) has to be added to the Google and Yandex webmaster toolbars.

Recommendations on creating an XML site map (Sitemap):

— The site map should be generated and updated automatically when a new URL is added to the website or an old one is deleted.

— Pages that are closed from indexation should not get into the site map.

— After the file is created, make sure it contains no errors. You can do this with the special tools of the search engines.

In the XML map (Sitemap) you can also specify:

  • the indexation priority, with the <priority> tag;
  • the refresh rate of a specific page, with the <changefreq> tag.
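A minimal Sitemap sketch with both tags filled in (the URL and the values are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/new-article/</loc>
    <lastmod>2017-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```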

After the site map is created, information about it needs to be added to the Google and Yandex webmaster toolbars.

— Write a «bot hunter» script

On large projects you can apply the following technique: when the search robot comes to the website, a special script shows it links to the website's pages that have not been indexed yet.

The disadvantage of this technique is its complicated technical implementation.
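For illustration only, the idea can be sketched like this, assuming a Flask application and a get_unindexed_urls() helper that you would implement yourself (both are hypothetical, not the article's own code):

```python
from flask import Flask, request, render_template_string

app = Flask(__name__)

BOT_MARKERS = ("Googlebot", "YandexBot")  # crawler User-Agent substrings

def get_unindexed_urls():
    # Hypothetical helper: return URLs you know are not indexed yet,
    # e.g. by comparing your sitemap against webmaster-panel data.
    return ["/new-article-1/", "/new-article-2/"]

@app.route("/")
def home():
    user_agent = request.headers.get("User-Agent", "")
    is_bot = any(marker in user_agent for marker in BOT_MARKERS)
    extra_links = get_unindexed_urls() if is_bot else []
    return render_template_string(
        "<h1>Home page</h1>"
        "{% for url in links %}<p><a href='{{ url }}'>{{ url }}</a></p>{% endfor %}",
        links=extra_links,
    )

if __name__ == "__main__":
    app.run()
```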

— Use the search engines' «Add URL» forms

This is the most «traditional» way to accelerate the indexation of new pages on the website. Through «Add URL», a search engine can be informed both about a new website and about a new page on an existing website.

Below we list the «Add URL» links of the most popular search engines:

To inform the Yandex search engine about a new page, use the Yandex Add URL form:

[screenshot]

To inform the Google search engine about a new page, use the Google Add URL form:

[screenshot]

The disadvantage of this tool is that adding pages through these forms does not guarantee that the page will appear in the search engine's base right away.

— For fast indexation of new pages in Google, use the «Fetch as Google» tool

You can use this tool after the website has been added to Google Search Console.

Google Search Console —> Crawl section —> Fetch as Google —> enter the necessary URL —> Fetch button —> Submit to index button; the text «URL was sent for indexing» will then appear:

[screenshot]

This tool also lets you learn the page load time and details of the scanning process.

The only disadvantage is that every time you add new material, you have to enter the address manually.

— The Pinger plug-in for accelerating the indexation of new pages in Yandex

You can raise the indexing priority of new pages of the website by means of special POST requests. Look here for detailed information.

If your website is built on a popular CMS, for example WordPress, Joomla, or Drupal, you should install the special Pinger plug-in. It will automatically report the appearance of new pages on your website to Yandex.

What should be done for this purpose:

A. Install Yandex Search:

[screenshot]

B. Install and configure the plug-in according to the instructions:

[screenshot]

After publishing a new page, log into the plug-in's settings. The «Message on the Status of the Plug-in» window should show the message «The plug-in works correctly», and the last added address should be displayed.

You can learn other possible statuses and their meanings here.

— The search engines' special «Ping» forms for accelerating blog indexation

If you have a blog with an RSS subscription, you can use the search engines' special «Ping» forms to accelerate the indexation of new pages.

A «ping» is a notification to search engines about changes on your website.
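Under the hood, such forms and ping services typically speak the standard weblogUpdates XML-RPC protocol. A minimal sketch (the endpoint is a placeholder; substitute the real ping URL of the service you use):

```python
import xmlrpc.client

# Placeholder endpoint; use the actual ping URL of your service.
server = xmlrpc.client.ServerProxy("http://rpc.example-ping-service.com/RPC2")

# weblogUpdates.ping(blog_name, blog_url)
result = server.weblogUpdates.ping("My Blog", "https://example.com/")
print(result)  # usually a struct like {'flerror': False, 'message': '...'}
```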

To add a new record in Google, use the Blog Search form at the following address:

[screenshot]

To add a new record in Yandex, use the «Send a Ping» form at the following address:

[screenshot]

— Free ping services

To accelerate the indexation of new documents, use free ping services, for example, http://seo-ng.net/seo-instrumenty/ping_services.html:

[screenshot]

Important! Use ping services only when there is new material on the website. Also, do not ping the same website through different services.

Let's sum up. To accelerate the indexation of the website in general, you need to:

  1. Perform technical optimization of the website.
  2. Organize a competent website structure.
  3. Regularly add new, quality material to the website.
  4. Report new pages to the search engines when you add them.
  5. Attract search robots to the website by means of external links.