Bing explains the new Sitemap Index Coverage report


Bing has quietly rolled out the new sitemap index coverage report in Bing Webmaster Tools and today posted a how-to on that reporting tool. I believe the new sitemap index coverage report began rolling out in late June 2023 and was pushed to everyone a few weeks ago, but Bing never formally announced the new report.

Bing said it tries to fetch relevant and valid sitemaps at least once a day, except when your lastmod date indicates that your sitemaps have not changed.
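For context, the lastmod date Bing checks is the `<lastmod>` element defined by the sitemaps.org protocol. A minimal sitemap sketch (the URL and date below are made-up examples, not from Bing):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/page-1</loc>
    <lastmod>2023-06-28</lastmod>
  </url>
</urlset>
```

If `<lastmod>` has not advanced since the last fetch, Bing can skip re-downloading the file that day.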

What it shows. The sitemap report shows:

  • A page that shows all submitted and organically discovered sitemaps for your site, including when they were submitted, status (success or error), last crawl date, the number of URLs discovered, a link to Index Coverage when that data is available, and the ability to resubmit sitemaps.
  • Per sitemap index file, a list of all the sitemaps included and the issues detected, if any.
  • Bing now reports a breakdown of URL indexing status (URLs indexed and URLs which need attention) from sitemaps and sitemap index files. Currently this is available for sites with more than 10,000 URLs indexed in Bing.
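The per-index inventory above is the same information you can pull from a sitemap index file yourself. A minimal sketch that lists the child sitemaps referenced by an index (the XML here is a hypothetical example, not Bing data):

```python
# List the child sitemaps referenced by a sitemap index file,
# the same inventory the report shows per sitemap index.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# Hypothetical sitemap index for illustration.
index_xml = """<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-posts.xml</loc>
    <lastmod>2023-06-28</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-pages.xml</loc>
    <lastmod>2023-05-01</lastmod>
  </sitemap>
</sitemapindex>"""

def list_child_sitemaps(xml_text):
    """Return (loc, lastmod) pairs for each <sitemap> entry in a sitemap index."""
    root = ET.fromstring(xml_text)
    entries = []
    for sm in root.findall(f"{SITEMAP_NS}sitemap"):
        loc = sm.findtext(f"{SITEMAP_NS}loc")
        lastmod = sm.findtext(f"{SITEMAP_NS}lastmod")
        entries.append((loc, lastmod))
    return entries

for loc, lastmod in list_child_sitemaps(index_xml):
    print(loc, lastmod)
```

Comparing such a listing against what the report shows is a quick way to spot child sitemaps Bing has not discovered.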

What it looks like. Here is a screenshot of this report:


How it helps. Bing said this report can help you do better in search by:

  • First, ensuring that Bing knows about all the relevant sitemaps associated with your website. If you find any relevant sitemaps are absent, submit them to Bing, reference them in your robots.txt file, or make sure they are part of a sitemap index file that is already referenced in your robots.txt file.
  • Second, identifying and fixing issues in your sitemaps. For instance, if you see that Bing cannot crawl your sitemaps, make sure there are no settings in robots.txt that block crawling by Bingbot. Additionally, make sure individual sitemaps are organized via sitemap index files so that the number of URLs per sitemap is less than 50,000 and each sitemap file is smaller than 50 MB, to ensure maximum discovery.
  • Third, and this is the role of the new indexing report, investigating why URLs listed in your sitemaps are not indexed, using the "URLs which need attention" breakdown.
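The per-sitemap limits in the second point can be applied mechanically when generating sitemaps. A minimal sketch of that partitioning rule; the 50,000-URL and 50 MB thresholds come from the article, while the per-entry byte estimate and example URLs are assumptions for illustration:

```python
# Split a URL list into sitemap-sized chunks: at most 50,000 URLs each
# and (approximately) under 50 MB uncompressed, per the stated limits.
MAX_URLS_PER_SITEMAP = 50_000
MAX_BYTES_PER_SITEMAP = 50 * 1024 * 1024  # 50 MB

def partition_urls(urls, overhead_bytes=200):
    """Greedily split URLs into chunks that respect both sitemap limits.

    overhead_bytes is a rough estimate of the XML wrapping around each URL.
    """
    chunks, current, current_bytes = [], [], 0
    for url in urls:
        entry_bytes = len(url) + overhead_bytes
        if current and (len(current) >= MAX_URLS_PER_SITEMAP
                        or current_bytes + entry_bytes > MAX_BYTES_PER_SITEMAP):
            chunks.append(current)
            current, current_bytes = [], 0
        current.append(url)
        current_bytes += entry_bytes
    if current:
        chunks.append(current)
    return chunks

# Hypothetical site with 120,000 URLs -> three sitemaps under one index.
urls = [f"https://www.example.com/page-{i}" for i in range(120_000)]
parts = partition_urls(urls)
print(len(parts))  # 3
```

Each resulting chunk becomes one sitemap file, and all of them are then referenced from a single sitemap index file that robots.txt points to.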

Why we care. If you have issues with your URLs being indexed in Bing Search, that can lead to less traffic and visibility for your business or organization. This report helps you understand which pages are indexed and which are not. More importantly, it gives you clues as to why those pages are not indexed.

This report can help you improve your indexing rate in Bing Search and thus increase your traffic from Bing Search.


About the author

Barry Schwartz

Barry Schwartz is a Contributing Editor to Search Engine Land and a member of the programming team for SMX events. He owns RustyBrick, a NY based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on very advanced SEM topics. Barry can be followed on Twitter here.

Source link

