Getting new pages into the index faster: the danger of slow indexing

How to speed up website indexing in search engines is perhaps the question that excites webmasters most.

While Google indexes quickly and without much effort on your part, indexing in Yandex can drag on for a long time. So when we talk about speeding things up below, we primarily mean speeding up site indexing in Yandex.

You can speed up indexing with both paid and free methods. For a reliable result, it is best to combine several of the methods below. So, on to the methods themselves. I have tried to describe every free way of speeding up indexing as fully as possible.

Fast site indexing in Yandex and Google

1) Site structure. Think through the structure of your site so that the search robot can easily reach any page. Ideally, no page should be more than three clicks away from the main page; otherwise, indexing may be delayed.

2) Internal site linking. I have already written about the benefits of internal linking; one of its tasks is to speed up the indexing of the resource. Remember that the bot follows links within the site: the more links lead to a page, the faster the bot will find it and include it in the index.

3) Site map (sitemap). It should be an essential element of any website. Its main and only task is to help the robot find all the pages of the site faster. You can submit a sitemap to Google and Yandex through their webmaster panels; the path to it is also specified in the robots.txt file. It is also useful to make an HTML version of the sitemap, available in one click from the main page. There is no doubt that a sitemap helps speed up the indexing of a site's internal pages.

4) Adding the site to the Yandex and Google webmaster panels. Webmaster panels offer ample tools for monitoring various site parameters. For good sites this is a plus, but for an outright spam site it will be a minus. If your site is made for people, feel free to add it.

6) Adding the site to popular ratings. Well-known ratings such as Rating@Mail, Rambler TOP100 and LiveInternet are visited often by search bots. Accordingly, if they find a link to your site there, they will follow it.

7) Registration in trusted directories. Don't be lazy: submit requests to add your site to trusted directories. All popular search engines have had them: Yandex had Yandex.Catalog (it stopped accepting new sites in 2018), Google had DMOZ (closed in 2017), and there is the Mail.ru Catalog. In fact, registration in them serves not so much indexing as building up the site's trust. Even if your site is accepted, getting into a catalog takes three months at best (for Yandex.Catalog). In DMOZ you could wait a year: moderators were desperately scarce, and a site would hang under review until you turned blue in the face...

8) Directory submission runs. This means registering the site in a heap of low-grade directories. To avoid doing this madness manually, use special services. For example, I used to use the 1PS service; I liked that at night it could be used for free, and back then every penny counted. Nowadays this service offers many services besides submission runs.

9) Publishing articles in article directories. This is essentially the same as the previous point, except that there we place links and here we place articles. There are many such directories on the web, as well as submission services for them. A run through article directories to speed up the indexing of a new site can also be done through the service mentioned above.

10) Using social bookmarking. Social bookmarks are indexed well by search engines. You can do a bookmarking run or add links to your pages manually. The difference from directory runs is that directories usually accept only the main page, whereas bookmarks can also point to internal pages.

11) Posting announcements and press releases on trusted sites. The process is entirely manual and very labor-intensive. I built my own database of trusted sites through one service, which is now defunct.

12) Using question-and-answer services. Search bots are very fond of Q&A services, as are the users who hang out there. You can leave a link to your site there, either yourself or by hiring a performer through Advego. If your account has no built-up reputation, moderators will most likely delete your link; they are quite ruthless there. So it is better to find a good performer and work with them. Google has closed its Q&A service (it is now read-only), but you can still do this on Mail.ru.

13) Comments on topical blogs. Quality blogs that are not drowning in ads, are updated frequently and have regular readers are also visited often by search robots, so they too can be used to speed up indexing. Of course, spamming is pointless: admins will remove your links. Just add your link in the field provided for it. Finding such blogs is easy: use one of the internet ratings, or ask Yandex for a "blog rating".

14) RSS feed. If your site supports it, you can syndicate an RSS feed to special RSS directories. Announcements of your new material on other sites can lead search bots to your resource. I recommend adding the feed to these directories: Feedreader.ru, Feedburner.com, Newsread.ru, LiveRSS.ru, Rssreader.ru, Plazoo.com.

15) Using popular forums. A good way to speed up site indexing is a link in the profile of a popular forum (if profiles are open for indexing) and a link in your signature. Find the top forums on your subject and register on them. Bots love forums with many new posts every day. Take part in "hot" discussions, and the bot will find the link in your signature. Besides indexing, such links also bring the site some traffic, small but real.

16) Pinging. Popular website engines such as WordPress have a ping feature, i.e. they send a signal to the search engines about a new page on the site. If your CMS has such a feature, why not use it.
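For the curious, here is a minimal Python sketch of what such a ping looks like under the hood: CMSs like WordPress send an XML-RPC call named weblogUpdates.ping with the blog name and URL. The blog name and URL below are made-up placeholders, and the sketch only builds the request body rather than sending it to a real ping service.

```python
import xmlrpc.client

def build_ping_request(blog_name: str, blog_url: str) -> str:
    # Serialize the standard "weblogUpdates.ping" XML-RPC call that
    # blog engines send to ping services when a new post is published.
    return xmlrpc.client.dumps((blog_name, blog_url),
                               methodname="weblogUpdates.ping")

# Placeholder values for illustration only.
payload = build_ping_request("My Blog", "https://example.com/")
print(payload)
```

Sending it is then a plain HTTP POST of this payload to the ping service's XML-RPC endpoint, which the CMS does for you automatically.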

17) E-mail newsletters. It is a proven fact that mailing lists can speed up site indexing, especially if people actually follow the links from the mail to the advertised site. The most popular e-mail service is Subscribe.ru: register there and set up a newsletter.

18) Blogging on free blog hosting. Blogs on free platforms such as LiveJournal.com (a.k.a. LJ), Blogger.com, Liveinternet.ru and Blogs.Mail.ru can also help index a new site. Start a blog there and place links to the target site. Bots love these services and will quickly notice a new blog. To speed this up further, you can "friend" other bloggers on the same platform.

19) Social networks. I have saved this for last. It is currently one of the best ways to speed up site indexing, especially Twitter. Be sure to get yourself a Twitter account and build it up properly: it will become an indispensable tool for indexing new pages of the site. Literally minutes after a link is published on a well-developed Twitter account, the page appears in Yandex search results.

Webmasters often face the problem of pages not getting into the search index for weeks. It is especially acute in Yandex: new pages appear there much later than in Google, and lately Yandex has been running text updates only rarely, two or three times a month. In this article we will look at what affects site indexing, how to speed it up, and how to monitor it.

What affects the frequency of site indexing by search robots

— Load on the server hosting the site (hosting quality)

If the server hosts many other resources, and its technical specifications do not allow it to cope quickly with all of the robot's requests, the robot starts visiting the site less often. Accordingly, it takes longer for pages to appear in search results.

— The overall frequency of site updates

Search robots analyze how often the content on a site is updated and use this to determine how often to visit it.
The more often new content is added to the site, the more often the search engines' robots visit it.

— Interest of visitors to the site (PF)

The search robot may revise its scheduling policy and visit the site more often if materials of interest to users (for example, news or articles) are regularly added to it:

  • users stay on the site;
  • browse its internal pages;
  • add the site to bookmarks;
  • share content on social networks, etc.

How to improve site indexing

In one visit the search robot indexes a certain number of pages according to its allocated quota, which depends on many site parameters. That is, even if the robot visited your site, it will not necessarily index and store every page in its database. It is therefore very important that the indexing robot not only visits your site, but also adds all new and changed pages to its database.

Below we will look at how to make the search robot visit your site more often and, at the same time, increase the number of pages it can index per visit: the crawl budget.

1. Analyze the site for duplicates

This item is placed first because duplicate pages are one of the main problems that worsen a site's indexing by search robots.

When a site has duplicates, the crawl budget is spent on useless pages instead of new pages or pages with updated content.

How to find duplicates on the site is described in detail in the articles of our blog:

2. Configure the server to return the correct HTTP status

Correctly setting HTTP status codes is very important for proper site indexing.

When a PS robot requests a page of a site, the status code provides it with information about the site and the specific page:

  • whether the page exists or not,
  • whether redirection is configured,
  • whether there is an error on the server side.

For example, the HTTP code "404 Not Found" reports that no page exists at the requested address, while "200 OK" indicates that the page is available.
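As a rough illustration of how a crawler might interpret these codes, here is a small Python helper. The category names are my own shorthand, not terminology used by any particular search engine:

```python
def classify_status(code: int) -> str:
    # Map an HTTP status code to the situations described above:
    # page available, redirect configured, or an error.
    if 200 <= code < 300:
        return "page available"
    if 300 <= code < 400:
        return "redirect configured"
    if code == 404:
        return "page does not exist"
    if 400 <= code < 500:
        return "client error"
    if 500 <= code < 600:
        return "server-side error"
    return "unknown"

print(classify_status(200))  # page available
print(classify_status(404))  # page does not exist
```

A real crawler keys its behavior off exactly these ranges: 2xx pages get indexed, 3xx targets get followed, 404s get dropped from the index, and 5xx responses get retried later.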

The Last-Modified header informs the search robot of the date the document was last modified. This way, the indexing robot rechecks only documents that have actually changed since the previous crawl, as well as new pages, without spending crawl budget on pages that have not changed.

More about the Last-Modified header.
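A small Python sketch of the mechanics, using only the standard library: the crawler stores the Last-Modified date it received and echoes it back in an If-Modified-Since header on its next visit; an unchanged page can then answer "304 Not Modified" without being re-downloaded. The timestamp is an arbitrary example value.

```python
from email.utils import formatdate, parsedate_to_datetime

# What the server sent on the first crawl (RFC 1123 date, GMT).
last_modified = formatdate(1700000000, usegmt=True)

# On the next crawl the robot echoes that date back:
request_headers = {"If-Modified-Since": last_modified}

# The server compares the echoed date with the document's actual
# modification time; if nothing changed, it can reply 304.
echoed_ts = parsedate_to_datetime(last_modified).timestamp()
not_modified = 1700000000 <= echoed_ts
print(last_modified, not_modified)
```

The saving is exactly what the article describes: a 304 response has no body, so an unchanged page costs the robot almost nothing.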

3. Monitor server response time and page loading speed

The server response time to a browser request directly affects the indexing of the site.

Taking into account network delays, it should be no more than 300 ms.
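If you want to check the 300 ms budget yourself, here is a minimal Python timing helper; plug in your own function that fetches a page (for example via urllib). In this sketch a sleep stands in for a real server request.

```python
import time

def measure_ms(fn) -> float:
    # Time a callable and return the elapsed wall-clock time in
    # milliseconds; pass in a function that requests your page.
    start = time.perf_counter()
    fn()
    return (time.perf_counter() - start) * 1000.0

# A 50 ms sleep stands in for a real page request here.
elapsed = measure_ms(lambda: time.sleep(0.05))
print(f"response took {elapsed:.1f} ms, budget met: {elapsed <= 300}")
```

Run it a few times at different hours of the day: a server that is fine at night but blows the budget at peak load is exactly the "hosting quality" problem described above.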

Services for measuring server response:

You can check for problems using the Google PageSpeed service.

4. Organize your website structure well

The clearer the site's structure is to the search robot, the better the site will be indexed.

— Page nesting level

Any promoted page should be no more than 3 clicks away from the main page. This is very important for indexing: a simple, shallow site takes a search robot much less time to index than a complex resource with a confusing navigation system.

— Implement hub pages on the site

Hub pages are pages that contain links to sections and subsections and help users navigate the site.

To improve indexing, hub pages should be reachable in one click from the site's main page.

What will the implementation of hub pages give us:

A. It reduces the nesting level
B. It speeds up the indexing of pages by the search bot
C. It helps users find the right material more easily

— Display announcements of new pages on the site's main page

The search robot visits the main page most often. If it contains links to new pages or pages with updated content, there is a high probability that the robot will index them.

For instance:

— Implement a "latest articles" widget in the sidebar

Another good way to interlink the pages of the site:

— Do not build menus with scripts or Flash

The main drawback of script- and Flash-based menus is that search robots do not see them.

5. Customize the robots.txt file

A correctly composed robots.txt eliminates many of the problems that arise when a robot crawls a site. This can significantly speed up the indexing of the resource as a whole.

In the robots.txt file you write instructions for search-engine robots: which pages of the site to index and which not. Search engines then spend less time crawling the site.

For example, in robots.txt you can close:

  • CMS service files and folders;
  • internal and external duplicates (if any);
  • forum response forms;
  • technical pages;
  • files that are not useful for the user (for example, visit statistics, pages with search results).
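For illustration, a minimal robots.txt along these lines might look as follows. The paths are WordPress-style examples of my own choosing, not a recommendation for every site:

```
User-agent: *
# CMS service folders (WordPress-style example paths)
Disallow: /wp-admin/
Disallow: /cgi-bin/
# internal search results and other pages useless to visitors
Disallow: /search/
Disallow: /*?s=
# duplicate pages created by comment-reply links
Disallow: /*?replytocom=

# path to the sitemap
Sitemap: https://example.com/sitemap.xml
```

Which folders actually need closing depends entirely on your CMS; check the crawl reports in the webmaster panels before and after the change.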

How to create correct robots.txt?

1. Create a robots.txt file in a text editor.

3. Check the file in Yandex.Webmaster → robots.txt analysis.

Screenshot from Yandex.Webmaster:

4. If the check was successful and no errors were found, upload the file to the site root.

6. Attract search robots with external links

  • Register your site in trusted directories and ratings (for example, Yandex.Catalog, DMOZ and the Mail.ru Catalog);
  • Place your articles on a paid basis on sites of the same subject;
  • Communicate on thematic forums (more about crowd marketing);
  • Integrate the site with social networks;
  • Create an RSS feed on the site;
  • Post articles on social news services;
  • Work with social bookmarks.

7. Write quality and unique content

Content quality also affects a site's indexing speed. Poor content, which does not fully answer the user's query, contains grammatical errors or has an excessive keyword density, is indexed worse.

When writing text, it is important to consider that:

  • Texts should contain only the most important information, useful to visitors.
  • The material should answer the query so fully that, after reading it, the user stops looking for information on the topic.
  • Focus on the important nuances; cut the "water" and idle musing.
  • Texts should be well structured (use headings and subheadings, divide the text into paragraphs, make lists where appropriate, etc.).
  • Texts should contain keywords, but in moderation - avoid over-optimization.
  • It is desirable for the texts to be at least 70% unique according to https://advego.ru/plagiatus/ or https://www.etxt.ru/antiplagiat/.

How to track site indexing

Add your site to Yandex.Webmaster and Google Search Console

If the site has not yet been added to the search engines' webmaster panels, we strongly recommend adding it. With them you can easily analyze the project's indexing, for instance:

  • when the robot last visited your resource;
  • how many pages it has uploaded to its database;
  • whether a particular page is included in the search database;
  • what errors the robot found;
  • page load time.

You can also inform the search engines about new or deleted pages and monitor other site parameters not related to indexing, for example the external links leading to the site, the queries that bring users to it, and so on.

Only a complete analysis can help identify and correct site indexing errors in time.

Registering a site in the Yandex and Google webmaster panels is easy.

In Yandex.Webmaster the procedure takes three steps, each accompanied by instructions:

Adding a site to Google Search Console is likewise accompanied by instructions:

How to inform the PS robot about changes on the site

- Create a sitemap

To help the search robot find new pages on the site, we recommend creating a sitemap in two formats: xml and html.

The HTML map should be published on the site as a separate page. When crawling the site, the robot reaches this page; since it contains links to all the pages of the site, it helps the robot find them.

The XML map (Sitemap) should be added in the Google and Yandex webmaster panels.

- The sitemap should be automatically generated and updated when new URLs are added to the site or old ones are deleted.

- The sitemap should not include pages closed from indexing.

- After creating the file, make sure that it contains no errors. This can be done using the search engines' dedicated tools:

Also in the XML map (Sitemap) you can specify:

  • indexing priority - using the <priority> tag;
  • the update frequency of a particular page - using the <changefreq> tag.

After the sitemap is created, information about it must be added in the Google and Yandex webmaster panels.
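For reference, a minimal XML sitemap with these optional tags might look like this. The URL and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/new-article/</loc>
    <lastmod>2024-05-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Note that priority and changefreq are hints, not commands; search engines are free to ignore them, and lastmod is the value most worth keeping accurate.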

— Write a "bot catcher" script

For a large project you can apply the following technique: when a search robot arrives at the site, a special script shows it links to pages that are not yet indexed.

The disadvantage of this technique is its complex technical implementation.

- Use the search engines' "Add URL" forms

This is the most "traditional" way to speed up the indexing of new pages on the site. With "Add URL" you can tell the search engine about both the new site and the new page on the site.

To report a new page to Yandex, use the Yandex Add URL form:

To report a new page to Google, use the Google Add URL form:

The disadvantage of this tool is that submitting a page through the form does not guarantee that it will immediately appear in the search engine's database.

- To quickly index new pages in Google, use the "View as Googlebot" tool

You can use this tool after the site is added to Google Search Console (webmaster).

Google Search Console → Crawl section → View as Googlebot → enter the desired URL → Crawl button → Add to index button, after which the message "URL submitted for indexing" will appear:

Also, using this tool, you can find out the page load time and data on the scanning process.

The only drawback is that each time you add new material, the address must be entered manually.

— Pinger plugin to speed up the indexing of new pages in Yandex

You can increase the indexing priority of new site pages using special POST requests. Read more.

If your site is built on the basis of popular CMS, for example, WordPress, Joomla, Drupal - install a special Pinger plugin. It will automatically inform Yandex about the appearance of new pages on your site.

What needs to be done for this:

A. Install:

B. Install and configure the plugin according to the instructions:

After publishing a new page, open the plugin settings: the "Plugin Status Message" window should show "Plugin is working correctly" and display the last added address.

Other possible statuses and their meanings can be viewed.

— Special search engine ping forms to speed up blog indexing

If you have a blog with an RSS subscription, you can use a special Ping form to speed up the indexing of new pages.

"Ping" is a notification to search engines about changes on your site.

To add a new entry to the Google PS, you need to use the "Blog Search" form at:

To add a new entry to the Yandex PS, you need to use the "Send ping" form at:

To speed up the indexing of new documents, use free ping services, for example, https://seo-ng.net/seo-instrumenty/ping_services.html :

Important! Use ping services only when there is fresh material on the site. Also, do not ping the same site through different services.

Let's summarize - to speed up the indexing of the site as a whole, you need to:

1. Carry out technical optimization of the site.

2. Organize a competent site structure.

3. Regularly add new, high-quality material to the site.

4. When adding new pages, report them to the search engines.

5. Attract PS robots to the site using external links.


Dear readers, today we will talk about speeding up the indexing of a site or blog.

What is site indexing? It is the addition of your newly written articles to the index (database) of the search engines. In other words, the sooner the search robot visits your site and copies your new articles, the sooner they are indexed, and the better! Now I will explain everything in order.

Why is it important to speed up website indexing?

For speeding up indexing, the most important thing is, of course, the XML sitemap. Don't forget to specify its address in the webmaster panel and in the robots.txt file. This method is good because you set it up once and forget about it.

Customize the robots.txt file

Use the search engines' addurilkas (Add URL forms)

An addurilka (from the English "Add URL") is a service through which you can inform a search engine about the release of a new article. It is a kind of indexing queue, which you join by submitting a new article.

In addition, when you add an article to the Yandex addurilka, you will receive a response with the indexing result, or the service will inform you of an error or the impossibility of indexing the article. So this service is very convenient for speeding up site indexing, provided you are not too lazy to add articles to it every time.

Set up a ping

Ping is a signal that your server sends to the search engines when an article is published. The signal tells them that the new article needs to be indexed. This method of speeding up site indexing is also convenient because, apart from a one-time setup, it requires no action.

It's easy to set up: go to "Settings" → "Writing". At the very bottom, in "Update Services", enter the list of ping servers. Copy the full list from .

You could talk endlessly about how to speed up site indexing in Yandex and Google; the methods considered here are, in my opinion, the most effective and useful.

I advise you to use as many of the above methods as possible, and the result will not be long in coming. If you know other important and effective methods, be sure to share them in the comments.

Quick indexing to everyone!

If you are just starting your blog, I recommend that you do everything right from the very beginning; correcting old posts later is quite difficult and time-consuming and requires patience and concentration.

To get started, study this post: (it’s quite possible to follow all the rules)

You write a new post and want it to "fly" into the Google and Yandex index within minutes, or at least within a couple of hours (for some, even that is a result!). I remember my first posts, and it is a pity I did not know all the rules back then, because some posts went unindexed for half a year. But let's not dwell on sad things.

How to speed up indexing in Yandex

1. Use Yandex "Original texts"

To get started, you must register with Yandex Webmaster and add your site to your account.

When your post is ready, all that remains is to make small final adjustments: links, interlinking, spell checking, and so on.

Preferably 5-10 hours before you hit the "Publish" button, open your not-yet-published post, press the magic shortcut Ctrl + A (select all; if you didn't know it, it will come in handy) and then Ctrl + C (copy).

Then go to your Yandex.Webmaster account. Choose your site, then "Site Content".

After that, paste the text that you copied in your WordPress editor earlier. Paste it here:

And click the "Add" button under the pasted text. If you have not used "Original texts" before, I strongly recommend submitting the texts of all your blog posts one by one; it does not take long and is very useful, believe me.

2. Add URL to Yandex Adduril

Immediately after publishing a new post, copy its URL and add it to the Yandex addurilka using the link -

The main thing is to enter your URL correctly and solve the captcha)) and this step is done.

By the way, if you have edited an old post, it should also be added to the Yandex addurilka. This signals the search robots to re-index the edited content.

3. Post new content regularly

If you make a plan for yourself and publish one post per day, week or two weeks (it does not matter which), the search robots will come to the site at that interval for the next indexing. The key word here is regularly, not as often as possible.

It is better to publish a full review article every two weeks than to publish an article every three days about what's new on your blog. If you have a huge audience and you are a mega-popular person, then you can write several posts a day, I don't argue.

Search robots will visit you for indexing as often as you teach them to, and you will get the corresponding result. If you publish one post per month, don't expect it to fly into the Yandex index within a few hours or days. Sometimes it takes a month or more.

4. We distribute content on social networks

I will be extremely frank with you: not every kind of content, and not in every niche, can be promoted on social networks. You need to experiment and look for the social network where your audience lives.

Using one of my sites as an example, I can show how I collect a lot of traffic and more than 200 likes from posting in social media groups!

Everything is described there in a clear and accessible language, study and try to achieve even better results.

Be sure to place social network buttons in the body of every article. How you do it (script, code, plugin) does not matter. The main thing is to arrange the buttons conveniently and implement them so that they do not slow the site down too much.

I use this service, read it, maybe it will work for you:

5. The "Social Locker" plugin - simple and effective!

This is a very effective tool to help you collect reader likes.

To do this, you cover the topic of the article almost completely, but leave a couple of "goodies" for dessert. You hide that section of text behind a nice picture with the help of the plugin; the text is revealed after a like on any of the offered social networks.

Agree, everything ingenious is simple))

Here is a detailed article about this plugin:

6. Build links and trust

Just as in life you need to earn people's trust in order to be respected, on the Internet there is the concept of "trust", which, through certain algorithms, helps you become more visible in search.

If a site is trusted, new posts appear in the Yandex index within minutes.

7. Add search from Yandex to the site

Implementing this is quite simple, detailed instructions here:

This method will not directly help speed up indexing much, but indirectly it is an additional incentive. If you are more focused on Google search, use Google's search box instead, but some site search should be present on the site.

8. Create correct Sitemap.xml and robots.txt files

Although this is an important factor, implementing these files is not difficult. I described it in detail in this post:

That covers Yandex; now let's move on to getting a new article indexed by Google's robots.

Speed up indexing by Google robots

To begin with, every webmaster needs to register his site in Google Webmaster Tools.

After that, select the site whose indexing you want to speed up and go to the webmaster panel.

Select your site, then in the left panel choose "Crawl", then "View as Googlebot".

Enter the URL of the new post after the site name in the appropriate field, as in the screenshot below, and click "Scan".

A line will then appear with a green checkmark and the caption "Done", followed by an "Add to index" button. As you may have guessed, we need to press it. After that, you will see a window with the following content:

Choose option 1 or 2, tick the checkbox and click "Submit".

After that, you should see the phrase "URL added to the index".

That completes all the necessary steps for indexing a new post in Google. If you have any questions, be sure to write them in the comments; I will answer promptly.

See you!


A big problem in Yandex, for new sites and old ones alike, is page indexing: namely, slow indexing or pages not being indexed at all. Here you will learn how to get new pages into the index within 30 minutes.

Initially, Yandex indexes a site more slowly than its competitor Google, because the Yandex robot visits the site less often and follows links worse if the site is not optimized. This can become the main reason a site is not indexed!

7 ways to speed up site indexing for Yandex

I have found the 7 best ways to speed up site indexing in Yandex, which help get pages into the index within 30-40 minutes. Remember that fast page indexing can let you deliver important information to users first, for example breaking news.

1. New page notification via add url

2. Adding a site to Yandex webmaster

Also, many make the mistake of not adding the site to Yandex.Webmaster. Why is this a mistake? This service was created to keep the site's history (indexing, keywords, visits). The webmaster panel is also an essential tool for the steps below.

3. Create and upload the robots.txt file

robots.txt is a document that specifies the rules for indexing a site. This file configures the indexing not only of the site but also of individual links. You can create it yourself or with the help of services. Here is one of the services for creating robots.txt - .

After creating it, upload it to the site's root folder. Then open Yandex.Webmaster, go to the indexing settings, robots.txt analysis, and click the "download robots.txt from the site" button; a line will appear where you need to enter the site's main page.

4. Create a sitemap

Every resource needs a sitemap. First, create an HTML sitemap: a standard page where all the links are collected, like mine. Such a map should be either in the top menu or in the footer; the main thing is that it is present on every page.

In addition to the HTML map, you need to create an XML sitemap. I advise you to follow the link, where there are instructions for installing it. After installing the map, add it in Yandex.Webmaster: go to the indexing settings, sitemap files, first check that the map is correct and only then add it.

5. Announce the link on social networks

Since the robot finds new pages through links, I advise you to create accounts on social networks (you will find a good list of them at the link) and publish links to your pages. This greatly speeds up indexing, since links from social networks get into the search results fastest of all.

6. Quality and frequency of writing content for the site

The quality of the content also affects how quickly the following pages are indexed. Bad content, with many errors, a high keyword density or outright copy-paste, is indexed worse, so it pays to write proper texts for your sites.

In addition, the frequency of publishing also matters: the more often you write, the more often the robot visits your site. Publishing at least twice a week at a set time will increase the indexing speed several times over compared with writing once a month!

7. Linking site pages

It has been proven that the better the internal linking, the better the search results will be; in addition, you can get faster site indexing. One linking method is a "latest articles" widget in the sidebar: every page of your site then contains a link to the new article, which makes the search robot react quickly to its appearance.

If you use all 7 ways to speed up site indexing in Yandex, you will achieve indexing as fast as mine: each of my pages gets indexed in no more than 30 minutes.