Bing's Webmaster Blog

Announcing the new Bing Webmaster Tools (migration complete)

Since February, our team of developers and designers has been working relentlessly to offer a new and enhanced experience to our users. Today, we are pleased to announce the full release of the new Bing Webmaster Tools. The new portal is built on a few key principles: a cleaner, responsive design with faster and more actionable tools. With users' needs in mind, the portal uses a responsive design that gives you the flexibility to access it across devices. We strived to make the new portal exciting not only by improving the performance of the tools but also by fully redesigning the user experience to make it more user friendly and intuitive. For instance, where the old Bing Webmaster Tools had 47 navigation links, the new site has only 17, as we have bundled related features together, making it easier to navigate to the information you need to maximize your search results at Bing and elsewhere. We have not only redirected our customers from the old tools to the new ones, we have also enhanced the existing tools and enabled new ones. As an overview:

Enhanced tools
- Backlinks lists backlinks for any site, including similar sites.
- Keywords research allows filtering by country, language, and device.
- URL submission offers an easy-to-navigate user flow with metrics and history. We have also eased URL submission via our WordPress plugin, which gets your content indexed automatically and quickly simply by installing the plugin.
- SEO reports offers an improved classification of issues.
- Adding and verifying a site offers a smoother experience, whether done manually, via Google Search Console import, or via Domain Connect.

New tools
- URL inspection is an exciting BETA feature that lets you inspect the Bing-indexed version of your URLs and detect potential indexing issues caused by crawl problems or by not following the Bing Webmaster Guidelines.
- Site Scan is an on-demand site audit tool that crawls your site and checks for common technical SEO issues, helping you improve your website's performance on Bing and other search engines as well.
- Robots.txt tester lets you check your robots.txt file using the same parser we use in production and verify that your URLs are allowed or disallowed as you intend.

The Bing Webmaster APIs continue as-is, so users who rely on them to get their data programmatically do not have to make any changes. We are not done: now that we have migrated to this new, improved platform and experience, we will keep enhancing the tools based on your feedback and shipping new features, including a refreshed version of Index Explorer this summer. We also urge you to reach out and share feedback on Twitter; let us know how you feel about the new Bing Webmaster Tools, as we are here to help you. If you encounter any issues, please contact our support team. Regards, The Bing Webmaster Tools team

Get your WordPress content indexed immediately using Bing Webmaster Tools plugin

Today, we are excited to announce the release of the Bing URL Submissions Plugin for WordPress as an open source project. The plugin allows webmasters of WordPress sites to get their content easily, automatically and immediately indexed by Bing as soon as it is published! Who in the SEO community has not dreamed of such an ability? Since last year, webmasters have been able to submit up to 10,000 URLs per day, and more on request, through the Bing Webmaster Tools portal as well as the Bing Webmaster Tools API for immediate crawl and indexation. Today, we are making this submission super easy for WordPress sites with the Bing URL Submissions plugin. Once installed and configured with an API key obtained from the Bing Webmaster Tools portal, the plugin detects both page updates and new pages created in WordPress and automatically submits the URLs behind the scenes to the Bing Webmaster Tools API, ensuring that the site's pages are always fresh in the Bing index. Some other handy features included in the plugin:
- Toggle the automatic submission feature on and off.
- Manually submit a URL to the Bing index.
- View the list of recent URL submissions from the plugin.
- Retry any failed submissions from the recent submissions list.
- Download recent URL submissions for analysis.

Follow these two easy steps to install the WordPress plugin and enjoy automatic real-time indexing at Bing of your WordPress content:
1. Search for Bing URL Submissions Plugin or click this link.
2. Add your Bing Webmaster Tools API key to activate.

We're here to help beyond WordPress: we open sourced this plugin to make it easier for webmasters running their own or other content management systems to reuse our ideas and ease integration with our API. If you have comments or questions about the plugin, contact us via Twitter or find us on GitHub. Thanks, Bing Webmaster team
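Under the hood, the plugin automates calls to the same URL Submission API that any integration can use. As a rough sketch of the kind of request it issues when a post is published (not the plugin's exact code; API_KEY and the URLs are placeholders to replace with your own values):

curl -X POST "https://ssl.bing.com/webmaster/api.svc/json/SubmitUrl?apikey=API_KEY" \
  -H "Content-Type: application/json" -H "charset: utf-8" \
  -d '{"siteUrl": "https://www.example.com", "url": "https://www.example.com/my-new-post"}'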

How to Optimize Your Content for Search Questions using Deep Learning

One of the most exciting developments in search is how well Bing and other major search engines can answer questions typed by users in search boxes. Search is moving farther and farther away from the old ten blue links and from matching typed keywords against keywords in content. Answers to search questions can come in the form of intelligent answers, where we get a single result with the answer, and/or "People Also Ask", where we get a list of related questions and answers to explore further. This opens both opportunities and challenges for content producers and SEOs. First, there is no keyword mapping to do, as questions rarely include the same words as their corresponding answers. We also face the challenge that questions and answers can be phrased in many different ways. How do we make sure our content is selected when our target customers search for answers we can provide? I think one approach is to evaluate our content by following the same process that Bing's answering engine follows and contrast it with an evaluation of competitors that are doing really well. In the process of doing these competitive evaluations, we will learn about the strengths and weaknesses of the systems and the building blocks that help search engines answer questions.

BERT - Bidirectional Encoder Representations from Transformers

One of the several fundamental systems that Bing and other major search engines use to answer questions is called BERT (Bidirectional Encoder Representations from Transformers). As stated by Jeffrey Zhu, Program Manager of the Bing Platform, in the article Bing delivers its largest improvement in search experience using Azure GPUs: "Recently, there was a breakthrough in natural language understanding with a type of model called transformers (as popularized by Bidirectional Encoder Representations from Transformers, BERT) … Starting from April of this year, we used large transformer models to deliver the largest quality improvements to our Bing customers in the past year. For example, in the query "what can aggravate a concussion", the word "aggravate" indicates the user wants to learn about actions to be taken after a concussion and not about causes or symptoms. Our search powered by these models can now understand the user intent and deliver a more useful result. More importantly, these models are now applied to every Bing search query globally making Bing results more relevant and intelligent."

Releasing Additional Features in New Bing Webmaster Tools Portal and OAuth Support for Bing Webmaster APIs

In March 2020 we launched an update to the Bing Webmaster Tools portal with refreshes of the Search Performance report, Sitemaps tool and Backlinks tool. As promised, we are continuing to add new features to the portal as we move all functionality from the previous Webmaster Tools experience to the new, updated interface. With that promise in mind, we are delighted to announce three additional features migrating into the new Bing Webmaster portal:
- URL Submission – The most popular tool in Bing Webmaster Tools, Submit URL, is now updated. Using this tool, users can submit URLs to Bing for real-time indexation.
- Block URL – This tool can be used to temporarily prevent any URL from appearing in the search results. It can also be used to clear Bing's cache for a URL in case you have updated the page.
- Crawl Control – This tool can be used to control how fast Bingbot crawls your site. It lets you set the hourly crawl rate using a template or through a custom setup based on your site's traffic pattern.

As with the previous feature release, users will be able to use the current and new pages simultaneously for a short period, and we will be deprecating the current pages for these features in a few weeks. In addition to the above features, we are also announcing that the Bing Webmaster APIs can now be accessed through OAuth 2.0 to enable delegated access to registered site owners' data. The OAuth option is present in the new portal in the "API Access" section under Settings in the header bar. Existing users of the Bing Webmaster APIs will not have to change anything, as validation through API keys remains operational. You can check more details for accessing APIs through OAuth here. To check out the new portal, log in or sign up to Bing Webmaster Tools. Reach out to us on Twitter and Facebook to let us know how you feel about the new Bing Webmaster Tools. Look out for more exciting features coming soon to Bing Webmaster Tools. Regards, The Bing Webmaster Tools team
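For readers wondering how an OAuth-based call differs from the API-key examples shown elsewhere on this blog, here is a minimal sketch against the existing Submit URL endpoint. It assumes the standard OAuth 2.0 Bearer scheme; ACCESS_TOKEN is a placeholder for a token obtained through the flow described in the OAuth documentation linked above, and the URLs are hypothetical:

# Assumption: the delegated access token is passed as a standard Bearer header
curl -X POST "https://ssl.bing.com/webmaster/api.svc/json/SubmitUrl" \
  -H "Authorization: Bearer ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"siteUrl": "https://www.example.com", "url": "https://www.example.com/about"}'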

Helping news publishers control how their content is displayed in Bing

As part of Bing's continuing efforts to empower news publishers, Bing has announced that publishers will have more tools to control the snippets that preview their site on the Bing results page. A simple and efficient way for news publishers to submit their content for potential exposure to the millions of Windows, Outlook, and Bing users who get their news through these channels is to submit their site for inclusion in the Bing News publisher portal, Bing News PubHub. All news publications are strongly encouraged to register. By submitting your site, you may benefit from the following:
- You provide the most up-to-date information for your site so that Bing can make your content accessible in multiple formats and across many devices.
- You are informed of new or improved Bing features and products, and best practices on how to leverage them to drive users to your content.
- You are informed of opportunities to partner with us on creating the future of news experiences.
- You have a line of support when you need it.

To be considered, first ensure that your site meets both the Bing Webmaster Guidelines and the Bing News Publisher Guidelines, then visit Bing News PubHub to get started.

Announcing new options for webmasters to control their snippets at Bing

We're excited to announce that webmasters will have more tools than ever to control the snippets that preview their site on the Bing results page. For a long time, the Bing search results page has shown site previews that include a text snippet, image or video. These previews help users gauge whether a site is relevant to what they're looking for, or whether there is perhaps a more relevant search result for them to click on. The webmasters owning these sites have had some control over these text snippets; for example, if they think the information they're providing might be fragmented or confusing when condensed into a snippet, they may ask search engines to show no snippet at all so users click through to the site and see the information in its full context. Now, with these new features, webmasters will have more control than ever before over how their site is represented on the Bing search results page.

Letting Bing know about your snippet and content preview preferences using robots meta tags

We are extending our support for robots meta tags in HTML, or the X-Robots-Tag in the HTTP header, to let webmasters tell Bing about their content preview preferences.

max-snippet:[number]
Specify the maximum text length, in characters, of a snippet in search results.
Example: <meta name="robots" content="max-snippet:400" />
If value = 0, we will not show a text snippet. If value = -1, the webmaster does not specify a limit.

max-image-preview:[value]
Specify the maximum size of an image preview in search results.
Example: <meta name="robots" content="max-image-preview:large" />
If value = none, Bing will not show an image preview. If value = standard, Bing may show a standard size image. If value = large, Bing may show a standard or a large size image. If the value is not none, standard or large, the webmaster does not specify a limit.

max-video-preview:[number]
Specify the maximum length, in seconds (integer), of a video preview in search results.
Example: <meta name="robots" content="max-video-preview:-1" />
If value = 0, Bing may show a static image of the video. If value = -1, the webmaster does not specify a limit.

Please note that the NOSNIPPET meta tag is still supported, and the options above can be combined with other robots meta tags. For example, by setting <meta name="robots" content="max-snippet:-1, max-image-preview:large, max-video-preview:-1, noarchive" /> webmasters tell Bing that there is no snippet length limit, a large image preview may be shown, a long video preview may be shown, and no link to a cached page should be shown. Over the following weeks, we will start rolling out these new options, first for web and news, then for images, videos and our Bing answers results. We will treat these options as directives, not as hints. For more information, please read our documentation on meta tags. Please reach out to Bing Webmaster Tools support if you face any issues or have questions. Fabrice Canel Principal Program Manager Microsoft - Bing
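For sites that prefer to set these preferences at the server rather than in page HTML, the same directives can be sent via the X-Robots-Tag HTTP header mentioned above. A minimal sketch of a response carrying the combined example (the directive values mirror the meta tag example; the rest of the response is hypothetical):

HTTP/1.1 200 OK
Content-Type: text/html; charset=utf-8
X-Robots-Tag: max-snippet:-1, max-image-preview:large, max-video-preview:-1, noarchive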

Bing adopts schema.org markup for special announcements for COVID-19

Bing is adding new features to help keep everyone up to date on the latest special announcements related to the COVID-19 pandemic. In addition to our previously announced experiences for finding tallies of cases in different geographic regions, we will add announcements of special hours and closures for local businesses, information on risk assessment and testing centers, and travel restrictions and guidelines.

SpecialAnnouncement schema markup for government health agencies

Bing may consume case statistics from government health agencies at the country, state or province, administrative area, and city level that use the schema.org markup for diseaseSpreadStatistics associated with a SpecialAnnouncement. These statistics are used on bing.com/covid and other searches for COVID-19 statistics. As a government agency determining whether to use this tag for your webpages, consider whether they meet the following criteria, which are characteristics we consider when selecting case statistics to include:
- Your site must be the official government site reporting case statistics for your region.
- Information in the markup must be up to date and consistent with the statistics displayed to the general public on your site.
- Your special announcement must include the date it was posted, indicating the time at which the statistics were first reported.

SpecialAnnouncement schema markup for COVID-19 related business updates

Bing may consume special announcements from local businesses, hospitals, schools, government offices, and more that use the schema.org markup for SpecialAnnouncement. A label showing your special announcements related to the COVID-19 pandemic, with a link to your site for more details, may be shown on web results for your official website and in local listings on the SERP or map experiences. This provides an easy link for your customers and community to find your latest information. When determining whether to use this tag for your webpages, consider whether they meet the following criteria, which are characteristics we consider when selecting special announcements to display:
- The special announcements must be posted on your official website and refer only to changes related to COVID-19 for your own business, hospital, school, or government office.
- The name of the special announcement must be easily identified within the body of the special announcement page on your site.
- Your special announcement must include the date it was posted and should also include the time the announcement expires, if appropriate.

SpecialAnnouncement schema markup for risk assessment and testing centers

Bing may consume information on risk assessments and testing centers from healthcare providers and government health agencies that use the schema.org markup for gettingTestedInfo and CovidTestingFacility. Searches for nearby testing information may include information on how to get assessed to see whether getting tested is recommended and, if so, how to locate a nearby testing facility and find instructions for getting tested at that center. When determining whether to use this tag for your webpages, consider whether they meet the following criteria, which are characteristics we consider when selecting testing information to display:
- Your site must be an official site for a well-known healthcare facility or government health agency.
- gettingTestedInfo must refer to a webpage that specifies what assessment is required prior to being tested at the given testing location.
- The testing facility information must refer to URLs and facility locations already associated with your provider or agency. Listing other providers' facilities is not supported at this time.

SpecialAnnouncement schema markup for travel restrictions

Bing may consume information on travel restrictions from government agencies, travel agencies, airlines, hotels, and other travel providers that use the schema.org markup for travelBans and publicTransportClosuresInfo. Travel related searches may include information on updated hours, closures, and guidelines for travel. When determining whether to use this tag for your webpages, consider whether they meet the following criteria, which are characteristics we consider when selecting travel restrictions to display:
- Your site must be an official site for a well-known government agency, travel agency, airline, hotel, or other travel provider.
- The special announcement including the travel ban or public transport closure info must specify the location covered by the announcement.
- The name of the special announcement must be easily identified within the body of the special announcement page describing the ban or closure info on your site.
- Your special announcement must include the date it was posted and should also include the time the announcement expires, if appropriate.

More information on how to implement and use these tags can be found at https://schema.org/SpecialAnnouncement and the Bing Webmaster special announcement specifications.
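To make the markup concrete, here is a minimal JSON-LD sketch of a SpecialAnnouncement for a business closure, embedded in a page's HTML. The business name, URL, dates and text are hypothetical placeholders; https://schema.org/SpecialAnnouncement documents the full property set, including diseaseSpreadStatistics, gettingTestedInfo and travelBans used in the scenarios above:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "SpecialAnnouncement",
  "name": "Temporary closure due to COVID-19",
  "text": "Our store is closed to walk-in customers until further notice. Curbside pickup remains available.",
  "datePosted": "2020-03-20",
  "expires": "2020-06-30",
  "announcementLocation": {
    "@type": "LocalBusiness",
    "name": "Example Store",
    "url": "https://www.example.com"
  }
}
</script>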

Announcing the new Bing Webmaster Tools

Over the last few months, we have heard from the webmaster ecosystem that the Bing Webmaster Tools user interface is slow and outdated. With our user-first focus, we have taken your feedback and have been working on modernizing the tools. We are delighted to announce the first iteration of the refreshed Bing Webmaster Tools portal. The refreshed portal is being built on a few key principles: keeping the design simple while making the tools faster, cleaner, more responsive and actionable. We have updated the backend datastore to improve data extraction and redesigned the user experience to make it more user friendly and intuitive. With users' needs in mind, the portal is also device responsive, giving users the flexibility to access it across devices. In the first iteration, the new portal has three key features:
- Backlinks - The Inbound Links report in the current portal is integrated with the Disavow Links tool to become the new Backlinks report in the refreshed portal.
- Search Performance - The Page Traffic and Search Keywords reports are integrated into the new Search Performance report.
- Sitemaps - The Sitemaps page is a refreshed version of the current portal's Sitemaps page.

We are releasing the new portal to a select set of users this week and will be rolling it out to all users by the first week of March. To access the new portal, sign in to Bing Webmaster Tools, navigate to the Sitemaps, Inbound Links, Page Traffic or Search Keywords reports, and click the links to open the new portal. Over the next few months, we will focus on moving all functionality to the new portal. During the transition, users will be able to use the current and new pages simultaneously for a short period. We will be deprecating functionality from the old portal a few weeks after its inclusion in the new portal. We will strive to make this transition seamless and exciting for our users. The Bing Webmaster APIs will stay the same, so users who rely on the webmaster API to get their data programmatically do not have to make any changes. Reach out to us and share feedback on Twitter and Facebook and let us know how you feel about the new Bing Webmaster Tools. If you encounter any issues, please raise a service ticket with our support team. Regards, The Bing Webmaster Tools team

Bing partners with the ecosystem to drive fresh signals

Bing Webmaster Tools launched the Adaptive URL submission capability, which allows webmasters to submit up to 10,000 URLs using the online toolkit through the Bing Webmaster portal (Submit URLs option) or in batch mode using the Batch API. Since launch we have seen a high adoption rate by large websites as well as small and medium websites. We have been working with multiple partners to further Bing's vision of driving a fundamental shift in how search engines find content, using direct notification from websites whenever content is created or updated. A few examples to note:
- During the recent SEO conference (Pubcon Pro, Las Vegas), linkedin.com and love2dev.com showcased how they used the URL Submission API, ensuring search users find timely, relevant and trustworthy information on Bing.
- Similarly, we have been working with leading SEO platforms like Botify to integrate the URL Submission API into their product offerings. This integration is an expansion of Botify's new FastIndex solution, the first within the Botify Activation product suite, and the partnership builds upon Bing's new programmatic URL submission process. For more information, please refer to the announcement from Botify.

The URL Submission API reviews the performance of sites registered in Bing Webmaster Tools and adaptively increases their daily quota for submitting content to Bing. We encourage websites to register on Bing Webmaster Tools using standard methods, Google Search Console import, or Domain Connect based verification. Apart from integrating the URL Submission API, Botify is participating in Bing's content submission API pilot, which allows the direct push of HTML, images, and other site content to the search engine, reducing the need for crawling. Please refer to the documentation for the easy set-up guide, the Batch URL Submission API, and the cURL code example for more details on the URL Submission API. We are happy to bring in more partners to accelerate this shift in content discovery. Thanks! Bing Webmaster Tools Team

Announcing future user-agents for Bingbot

As announced in October, Bing is adopting the new Microsoft Edge as the engine to run JavaScript and render web pages. We have already switched to Microsoft Edge for thousands of web sites "under the hood". This evolution was transparent for most of the sites, and we carefully tested that each website rendered fine after switching to Microsoft Edge. Over the coming months, we will scale this migration to cover all sites. So far, we have been crawling using our existing bingbot user-agents. With this change, we will start the transition to a new bingbot user-agent, first for sites which require it for rendering and then gradually and carefully for all sites.

Bingbot user-agents today
Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)
Mozilla/5.0 (iPhone; CPU iPhone OS 7_0 like Mac OS X) AppleWebKit/537.51.1 (KHTML, like Gecko) Version/7.0 Mobile/11A465 Safari/9537.53 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)
Mozilla/5.0 (Windows Phone 8.1; ARM; Trident/7.0; Touch; rv:11.0; IEMobile/11.0; NOKIA; Lumia 530) like Gecko (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)

In addition to the existing user-agents listed above, the following are the new evergreen Bingbot user-agents.
Desktop
Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm) Chrome/W.X.Y.Z Safari/537.36 Edg/W.X.Y.Z
Mobile
Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/W.X.Y.Z Mobile Safari/537.36 Edg/W.X.Y.Z (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)

We are committed to regularly updating our web page rendering engine to the most recent stable version of Microsoft Edge, making the above user-agent strings evergreen. "W.X.Y.Z" will be substituted with the latest Microsoft Edge version we're using, for example "80.0.345.0".

How to test your web site
For most web sites, there is nothing to worry about, as we will carefully test that sites render fine before switching them to Microsoft Edge and our new user-agent. We invite you to install and test the new Microsoft Edge to check that your site looks fine with it. If it does, you will not be affected by the change. You can also register your site on Bing Webmaster Tools to get insights about your site, to be notified if we detect issues, and to investigate your site using our upcoming tools based on our new rendering engine. We look forward to sharing more details in the future. Thanks, Fabrice Canel Principal Program Manager Microsoft - Bing
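Note that every user-agent string above, old and new, still carries the bingbot/2.0 product token, so existing robots.txt rules keyed to that token keep applying unchanged through the transition. A minimal sketch (the /private/ path is a hypothetical example):

User-agent: bingbot
Disallow: /private/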

Accessing Bing Webmaster Tools API using cURL

Thank you, webmasters, for effectively using the Adaptive URL solution to notify bingbot about your website's freshest and most relevant content. But did you know you don't have to use the Bing Webmaster Tools portal to submit URLs? Bing Webmaster Tools exposes programmatic access to its APIs so webmasters can integrate them into their workflows. Here is an example using the popular command-line utility cURL that shows how easy it is to integrate the Submit URL single and Submit URL batch API endpoints. You can use the Get URL submission quota API to check the remaining daily quota for your account. The Bing API can be called from all modern languages (C#, Python, PHP…); however, cURL can help you prototype and test the API in minutes and also build complete solutions with minimal effort. cURL is considered one of the most versatile tools for command-line API calls and is supported by all major Linux shells - simply run the commands below in a terminal window. If you're a Windows user, you can run cURL commands in Git Bash, the popular Git client for Windows (no need to install cURL separately; Git Bash comes with cURL). If you are a Mac user, you can install cURL using a package manager such as Homebrew. When you try the examples below, be sure to replace API_KEY with your API key string obtained from Bing Webmaster Tools > Webmaster API > Generate. Refer to the easy set-up guide for Bing's Adaptive URL submission API for more details.

Submitting new URLs - Single
curl -X POST "https://ssl.bing.com/webmaster/api.svc/json/SubmitUrl?apikey=API_KEY" -H "Content-Type: application/json" -H "charset: utf-8" -d '{"siteUrl":"https://www.example.com", "url": "https://www.example.com/about"}'
Response: {"d": null}

Submitting new URLs - Batch
curl -X POST "https://ssl.bing.com/webmaster/api.svc/json/SubmitUrlBatch?apikey=API_KEY" -H "Content-Type: application/json" -H "charset: utf-8" -d '{"siteUrl":"https://www.example.com", "urlList":["https://www.example.com/about", "https://www.example.com/projects"]}'
Response: {"d":null}

Check remaining API quota
curl "https://ssl.bing.com/webmaster/api.svc/json/GetUrlSubmissionQuota?siteUrl=https://www.example.com&apikey=API_KEY"
Response: {"d": {"__type": "UrlSubmissionQuota:#Microsoft.Bing.Webmaster.Api", "DailyQuota": 973, "MonthlyQuota": 10973}}

So, integrate the APIs today to get your content indexed by Bing in real time. Please reach out to Bing Webmaster Tools support if you face any issues. Thanks, Bing Webmaster Tools team

Some Thoughts on Website Boundaries

In the coming weeks, we will update the Bing Webmaster Guidelines to make them clearer and more transparent to the SEO community. This major update will be accompanied by blog posts that share more details and context around some specific violations. In this first article of the series, we are introducing a new penalty to address "inorganic site structure" violations. This penalty will apply to malicious attempts to obfuscate website boundaries, which covers some old attack vectors (such as doorways) and new ones (such as subdomain leasing).

What is a website anyway?

One of the most fascinating aspects of building a search engine is developing the infrastructure that gives us a deep understanding of the structure of the web. We're talking trillions and trillions of URLs, connected to one another by hyperlinks. The task is herculean, but fortunately we can use some logical grouping of these URLs to make the problem more manageable - and understandable by us, mere humans! The most important of these groupings is the concept of a "website". We all have some intuition of what a website is. For reference, Wikipedia defines a website as "a collection of related network web resources, such as web pages, multimedia content, which are typically identified with a common domain name." It is indeed very typical for the boundary of a website to be the domain name. For example, everything that lives under the xbox.com domain name is a single website.

Fig. 1 - Everything under the same domain name is part of the same website.

A common alternative is the case of a hosting service where each subdomain is its own website, such as wordpress.com or blogspot.com. And there are some (less common) cases where each subdirectory is its own website, similar to what GeoCities offered in the late 90s.

Fig. 2 - Each subdomain is its own separate website.

Why does it matter?

Some fundamental algorithms used by search engines differentiate between URLs that belong to the same website and URLs that don't. For example, it is well known that most algorithms based on the link graph propagate link value differently depending on whether a link is internal (same site) or external (cross-site). These algorithms also use site-level signals (among many others) to infer the relevance and quality of content. That's why pages on a very trustworthy, high-quality website tend to rank more reliably and higher than others, even if such pages are new and haven't accumulated many page-level signals.

When things go wrong

Stating the obvious, we can't have people manually review billions of domains in order to assess what is a website. To solve this problem, like many of the other problems we need to solve at the scale of the web, we developed sophisticated algorithms to determine website boundaries. The algorithm gets it right most of the time. Occasionally it gets it wrong, either conflating two websites into one or viewing a single website as two different ones. And sometimes there's no obvious answer, even for humans! For example, if your business operates in both the US and the UK, with content hosted on two separate domains (respectively a .com domain and a .co.uk domain), you can be seen as running either one or two websites depending on how independent your US and UK entities are, how much content is shared across the two domains, how much they link to each other, etc.
However, when we reviewed sample cases where the algorithm got it wrong, we noticed that the most common root cause was that the website owner actively tried to misrepresent the website boundary. It can indeed be very tempting to try to fool the algorithm. If your internal links are viewed as external, you can get a nice rank boost. And if you can propagate some of the site-level signals to pages that don't technically belong to your website, these pages can get an unfair advantage.

Making things right

In order to maintain the quality of our search results while being transparent with the SEO community, we are introducing new penalties to address "inorganic site structure". In short, creating a website structure that actively misrepresents your website boundaries is going to be considered a violation of the Bing Webmaster Guidelines and will potentially result in a penalty. Some "inorganic site structure" violations were already covered by other categories, whereas some were not. To understand better what active misrepresentation is, let's look at three examples.

PBNs and other link networks

While not all link networks misrepresent website boundaries, there are many cases where a single website is artificially split across many different domains, all cross-linking to one another, for the obvious purpose of rank boosting. This is particularly true of PBNs (private blog networks).

Fig. 3 - All these domains are effectively the same website.

This kind of behavior is already in violation of our link policy. Going forward, it will also be in violation of our "inorganic site structure" policy and may receive additional penalties.

Doorways and duplicate content

Doorways are pages that are overly optimized for specific search queries but only redirect or point users to a different destination. The typical situation is someone spinning up many different sites hosted under different domain names, each targeting its own set of search queries but all redirecting to the same destination or hosting the same content.

Fig. 4 - All these domains are effectively the same website (again).

Again, this kind of behavior is already in violation of our webmaster guidelines. In addition, it is also a clear-cut example of "inorganic site structure", since there is ultimately only one real website, but the webmaster tried to make it look like several independent websites, each specialized in its own niche. Note that we will be looking for malicious intent before flagging sites in violation of our "inorganic site structure" policy. We acknowledge that duplicate content is unavoidable (e.g. HTTP vs. HTTPS); however, there are simple ways to declare one website or destination as the source of truth, whether by redirecting duplicate pages with HTTP 301 or by adding canonical tags pointing to the destination. Violators, on the other hand, will generally implement none of these, or will instead use sneaky redirects.

Subdomain or subfolder leasing

Over the past few months, we heard concerns from the SEO community around the growing practice of hosting third-party content, or letting a third party operate a designated subdomain or subfolder, generally in exchange for compensation. This practice, which some people call "subdomain (or subfolder) leasing", tends to blur website boundaries. Most of the domain is a single website, except for a single subdomain or subfolder, which is a separate website operated by a third party.
In most cases that we reviewed, the subdomain had very little visibility for direct navigation from the main website. Concretely, there were very few links from the main domain to the subdomain, and these links were generally tucked all the way at the bottom of the main domain's pages or in other obscure places. Therefore, the intent was clearly to benefit from site-level signals, even though the content on the subdomain had very little to do with the content on the rest of the domain.

Fig. 5 - The domain is mostly a single website, with the exception of one subdomain.

Some people in the SEO community argue that it's fair game for a website to monetize its reputation by letting a third party buy and operate from a subdomain. However, in this case the practice equates to buying ranking signals, which is not much different from buying links. Therefore, we decided to consider "subdomain leasing" a violation of our "inorganic site structure" policy when it is clearly used to bring a completely unrelated third-party service into the website boundary, for the sole purpose of leaking site-level signals to that service. In most cases, the penalties issued for that violation will apply only to the leased subdomain, not the root domain.

Your responsibility as domain owner

This article is also an opportunity to remind domain owners that they are ultimately responsible for the content hosted under their domain, regardless of the website boundaries that we identify. This is particularly true when subdomains or subfolders are operated by different entities. While clear website boundaries will prevent negative signals due to a single bad actor from leaking to other content hosted under the same domain, the overall domain reputation will be affected if a disproportionate number of websites end up in violation of our webmaster guidelines. Taking an extreme case, if you offer free hosting on your subdomains and 95% of your subdomains are flagged as spam, we will expand penalties to the entire domain, even if the root website itself is not spam. Another unfortunate case is hacked sites. Once a website is compromised, it is typical for hackers to create subdomains or subdirectories containing spam content, sometimes unbeknownst to the legitimate owner. When we detect this case, we generally penalize the entire website until it is clean.

Learning from you

If you believe you have been unfairly penalized, you can contact Bing Webmaster Support and file a reconsideration request. Please document the situation as thoroughly and transparently as possible, listing all the domains involved. However, we cannot guarantee that we will lift the penalty. Your feedback is valuable to us! Clarifying our existing duplicate content policy and our stance on subdomain leasing were two pieces of feedback we heard from the SEO community, and we hope this article achieved both. As we are in the middle of a major update of the Bing Webmaster Guidelines, please feel free to reach out to us and share feedback on Twitter or Facebook. Thank you, Frederic Dubut and the Bing Webmaster Tools Team
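As a concrete illustration of the duplicate-content remedies mentioned above, a duplicate page can declare its source of truth with a canonical tag in its HTML head, or the server can redirect it outright with an HTTP 301 (the URLs here are hypothetical placeholders):

<link rel="canonical" href="https://www.example.com/page" />

HTTP/1.1 301 Moved Permanently
Location: https://www.example.com/page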

The new evergreen Bingbot simplifying SEO by leveraging Microsoft Edge

Today we're announcing that Bing is adopting Microsoft Edge as the Bing engine to run JavaScript and render web pages. Doing so will create less fragmentation of the web and ease Search Engine Optimization (SEO) for all web developers. As you may already know, the next version of Microsoft Edge is adopting the Chromium open source project. This update means Bingbot will be evergreen, as we are committing to regularly update our web page rendering engine to the most recent stable version of Microsoft Edge.

Easing Search Engine Optimization

By adopting Microsoft Edge, Bingbot will now render all web pages using the same underlying web platform technology already used today by Googlebot, Google Chrome, and other Chromium-based browsers. This will make it easy for developers to ensure that their web sites and their content management systems work across all these solutions without having to spend time investigating each solution in depth. By disclosing our new Bingbot web page rendering technology, we are ensuring fewer SEO compatibility problems moving forward and increasing satisfaction in the SEO community. If your feedback can benefit the greater SEO community, Bing and Edge will propose and contribute changes to the open source Chromium project to make the web better for all of us. Head to GitHub to check out our explainers!

What happens next

Over the next few months, we will be switching to Microsoft Edge "under the hood", gradually over time. The key aspects of this evolution will be transparent for most sites. We may change our bingbot crawler user-agent as appropriate to allow rendering on some sites. For most web sites, there is nothing you need to worry about, as we will carefully test that they render fine before switching them to Microsoft Edge. We invite you to install and test Microsoft Edge and register your site with Bing Webmaster Tools to get insights about your site, to be notified if we detect issues, and to investigate your site using our upcoming tools based on our new rendering engine. We look forward to sharing more details in the future. We are excited about the opportunity to be an even more active part of the SEO community and to continue making the web better for everyone, including all search engines. Thanks, Fabrice Canel Principal Program Manager Microsoft - Bing

Import sites from Search Console to Bing Webmaster Tools

At Bing Webmaster Tools, we actively listen to the needs of webmasters, and verifying a website's ownership had been reported as a pain point. To simplify this process, we recently introduced a new method for webmasters to verify sites in Bing Webmaster Tools: webmasters can now import their verified sites from Google Search Console into Bing Webmaster Tools. The imported sites are auto-verified, eliminating the need to go through the manual verification process. Both Bing Webmaster Tools and Google Search Console use similar methods to verify the ownership of a website. Using this new functionality, webmasters can log into their Google Search Console account and import all their verified sites and the corresponding sitemaps into their Bing Webmaster Tools account. Webmasters just need to follow four simple steps to import their sites:

Step 1: Sign in to your Bing Webmaster Tools account, or create a new one here.
Step 2: Navigate to the My Sites page on Bing Webmaster Tools and click Import.
Step 3: Sign in with your Google Search Console account and click Allow to give Bing Webmaster Tools access to your list of verified sites and sitemaps.
Step 4: After authentication, Bing Webmaster Tools will display the list of verified sites present in your Google Search Console account, along with the number of sitemaps and your corresponding role for each site. Select the sites you want to add to Bing Webmaster Tools and click Import.

Webmasters can import multiple sites from multiple Google Search Console accounts. On successful completion, the selected sites will be added and automatically verified in Bing Webmaster Tools. Please note that it might take up to 48 hours to get traffic data for the newly verified websites. A maximum of 100 websites can be imported in one go; please follow the above steps again if you want to add more than 100 sites. The limit of 1,000 site additions per Bing Webmaster Tools account still applies. Bing Webmaster Tools will periodically validate your site ownership status by syncing with your Google Search Console account. Therefore, your Bing Webmaster Tools account needs ongoing access to your Google Search Console account. If access to your Google Search Console account is revoked, you will have to either import your sites again or verify your sites using other verification methods. We hope that both the Import from Google Search Console and Domain Connect verification methods will make the onboarding process easier for webmasters. We encourage you to sign up and leverage Bing Webmaster Tools to help drive more users to your sites. We want to hear from you! As a reminder, you can always reach out to us and share feedback on Twitter and Facebook. If you encounter issues using this solution, please raise a service ticket with our support team. Thanks! Bing Webmaster Tools team

Bing Webmaster Tools simplifies site verification using Domain Connect

In order to submit site information to Bing, get performance reports or access diagnostic tools, webmasters need to verify their site ownership in Bing Webmaster Tools. Traditionally, Bing Webmaster Tools has supported three verification options:

Option 1: XML file authentication
Option 2: Meta tag authentication
Option 3: Add a CNAME record to DNS

Options 1 and 2 require the webmaster to access the site's source code to complete the verification. With Option 3, the webmaster can avoid touching the source code but needs access to the domain hosting account to add a CNAME record holding the verification code provided by Bing Webmaster Tools. To simplify Option 3, we are announcing support for the Domain Connect open standard, which allows webmasters to seamlessly verify their site in Bing Webmaster Tools. Domain Connect is an open standard that makes it easy for a user to configure DNS for a domain running at a DNS provider (e.g. GoDaddy, 1&1 Ionos, etc.) to work with a service running at an independent service provider (e.g. Bing, O365, etc.). The protocol presents a simple experience to the user, isolating them from the details and complexity of DNS settings. Bing Webmaster Tools verification using Domain Connect is already live for users whose domains are hosted with supported DNS providers, and we will gradually integrate this capability with other DNS providers that support the Domain Connect open standard.

Quick guide on how to use the Domain Connect feature to verify your site in Bing Webmaster Tools:

Step 1: Open a Bing Webmaster Tools account. You can open a free Bing Webmaster Tools account by going to the Bing Webmaster Tools sign-in or sign-up page. You can sign up using a Microsoft, Google or Facebook account.
Step 2: Add your website. Once you have a Bing Webmaster Tools account, you can add sites to it by entering the URL of your site into the Add a Site input box and clicking Add.
Step 3: Check if your site is supported by the Domain Connect protocol. When you add the website, Bing Webmaster Tools does a background check to identify whether the domain is hosted on a DNS provider that has integrated the Domain Connect solution with Bing Webmaster Tools. If the site is not supported by the Domain Connect protocol, you will see the default verification options mentioned at the top of this blog.
Step 4: Verify using your DNS provider credentials. On clicking Verify, you are redirected to the DNS provider's site, where you should sign in using the account credentials associated with the domain under verification. On successful sign-in, your site will be verified by Bing Webmaster Tools within a few seconds. In certain cases, it may take longer for the DNS provider to send the site ownership signal to the Bing Webmaster Tools service.

Using the new verification option significantly reduces the time taken and simplifies the site verification process in Bing Webmaster Tools. We encourage you to try out this solution and get more users for your sites on Bing via Bing Webmaster Tools. If you face any challenges using this solution, you can raise a service ticket with our support team. We are building another solution to further simplify the site verification process and help webmasters easily add and verify their sites in Bing Webmaster Tools.
Watch this space for more!

Additional reference:
https://www.plesk.com/extensions/domain-connect/
https://www.godaddy.com/engineering/2019/04/25/domain-connect/

Thanks! Bing Webmaster Tools team
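For reference, the traditional Option 3 boils down to a single DNS record. Here is a sketch of what the record typically looks like in a zone file, assuming the verification code and target host that Bing Webmaster Tools displays for your site; copy the exact values from the portal rather than this illustration:

<verification-code>.example.com.  IN  CNAME  verify.bing.com.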

bingbot Series: Introducing Batch mode for Adaptive URL submission API

We launched the Adaptive URL submission capability that allows webmasters to submit up to 10,000 URLs using the online API or through the Bing Webmaster portal (Submit URLs option). Since the launch we have received multiple requests from webmasters for the ability to submit URLs in batches. Actively listening to webmasters and their needs, we are delighted to announce the Batch mode capability for the Adaptive URL Submission API, which allows webmasters and site managers to submit URLs in batches, saving them the excessive API calls needed to submit URLs individually. The Batch URL Submission API is very similar to the individual URL Submission API (blog post), so integrating the Batch API is very easy and follows the same steps. Example requests for the Batch URL Submission API for the supported protocols can be seen below.

JSON request sample

POST /webmaster/api.svc/json/SubmitUrlBatch?apikey=sampleapikeyEDECC1EA4AE341CC8B6 HTTP/1.1
Content-Type: application/json; charset=utf-8
Host: ssl.bing.com

{
  "siteUrl": "http://yoursite.com",
  "urlList": [
    "http://yoursite.com/url1",
    "http://yoursite.com/url2",
    "http://yoursite.com/url3"
  ]
}

XML request sample

POST /webmaster/api.svc/pox/SubmitUrlBatch?apikey=sampleapikeyEDECC1EA4AE341CC8B6 HTTP/1.1
Content-Type: application/xml; charset=utf-8
Host: ssl.bing.com

<SubmitUrlBatch xmlns="http://schemas.datacontract.org/2004/07/Microsoft.Bing.Webmaster.Api">
  <siteUrl>http://yoursite.com</siteUrl>
  <urlList>
    <string xmlns="http://schemas.microsoft.com/2003/10/Serialization/Arrays">http://yoursite.com/url1</string>
    <string xmlns="http://schemas.microsoft.com/2003/10/Serialization/Arrays">http://yoursite.com/url2</string>
    <string xmlns="http://schemas.microsoft.com/2003/10/Serialization/Arrays">http://yoursite.com/url3</string>
  </urlList>
</SubmitUrlBatch>

You will get an HTTP 200 response on successful submission of the URLs. Meanwhile, the URLs will be checked for compliance with the Bing Webmaster Guidelines and, if they pass, will be crawled and indexed in minutes. Please refer to the documentation for generating the API key and to the Batch URL Submission API documentation for more details. Do note that the maximum supported batch size in this API is 500 URLs per request; the total limit on the number of URLs submitted per day still applies. So, integrate the APIs today to get your content indexed by Bing in real time, and let us know what you think of this capability. Please reach out to bwtsupport@microsoft.com if you face any issues while integrating. Thanks! Bing Webmaster Tools Team

bingbot Series: Easy set-up guide for Bing’s Adaptive URL submission API

In February, we announced the launch of the adaptive URL submission capability. As called out at launch, as an SEO manager or website owner you do not need to wait for the crawler to discover new links; you can simply submit those links automatically to Bing to get your content indexed as soon as it is published! Who in SEO didn't dream of that? In the last few months we have seen rapid adoption of this capability, with thousands of websites submitting millions of URLs and getting them indexed on Bing instantly. At the same time, a few webmasters have asked for guidance on integrating the adaptive URL submission API. This blog shows how easy it is to set up.

Step 1: Generate an API key

Webmasters need an API key to be able to access and use the Bing Webmaster APIs. This API key can be generated from Bing Webmaster Tools by following these steps:
- Sign in to your account on Bing Webmaster Tools. In case you do not already have a Bing Webmaster account, sign up today using any Microsoft, Google or Facebook ID.
- Add and verify the site that you want to submit URLs for through the API, if not already done.
- Select and open any verified site through the My Sites page on Bing Webmaster Tools and click on Webmaster API in the left-hand navigation menu.
- If you are generating the API key for the first time, click Generate to create an API key. Otherwise you will see the previously generated key.

Note: Only one API key can be generated per user. You can change your API key anytime; the change takes effect within 30 minutes.

Step 2: Integrate with your website

You can use either of the below protocols to easily integrate the Submit URL API into your system.

JSON request sample

POST /webmaster/api.svc/json/SubmitUrl?apikey=sampleapikeyEDECC1EA4AE341CC8B6 HTTP/1.1
Content-Type: application/json; charset=utf-8
Host: ssl.bing.com

{
  "siteUrl": "http://example.com",
  "url": "http://example.com/url1.html"
}

XML request sample

POST /webmaster/api.svc/pox/SubmitUrl?apikey=sampleapikey341CC57365E075EBC8B6 HTTP/1.1
Content-Type: application/xml; charset=utf-8
Host: ssl.bing.com

<SubmitUrl xmlns="http://schemas.datacontract.org/2004/07/Microsoft.Bing.Webmaster.Api">
  <siteUrl>http://example.com</siteUrl>
  <url>http://example.com/url1.html</url>
</SubmitUrl>

If the URL submission is successful, you will receive an HTTP 200 response. This ensures that your pages will be discovered for indexing, and if the Bing Webmaster Guidelines are met, the pages will be crawled and indexed in real time. Using either of the above methods, you can directly and automatically let Bing know whenever new links are created on your website. We encourage you to integrate such a solution into your web content management system to let Bing auto-discover your new content at publication time. If you face any challenges during the integration, you can reach out to bwtsupport@microsoft.com to raise a service ticket. Feel free to contact us if your web site requires more than 10,000 URLs submitted per day; we will adjust as needed. Thanks! Bing Webmaster Tools team

bingbot Series: Get your content indexed fast by now submitting up to 10,000 URLs per day to Bing

Today, we are excited to announce a significant increase in the number of URLs webmasters can submit to Bing to get their content crawled and indexed immediately. We believe that this change will trigger a fundamental shift in the way search engines such as Bing retrieve and are notified of new and updated content across the web. Instead of Bing frequently monitoring RSS and similar feeds or frequently crawling websites to check for new pages, discover content changes and/or new outbound links, websites will notify Bing directly about the relevant URLs changing on their site. This means that eventually search engines can reduce how often they crawl sites to detect changes and refresh the indexed content. For many years, Bing has offered all webmasters the ability to submit their site URLs through the Bing Webmaster Tools portal as well as the Bing Webmaster Tools API for immediate crawl and indexation. Until today, this feature was throttled for all sites to a maximum of 10 URLs per day and 50 URLs per month. Today we are releasing the Adaptive URL submission feature, which increases the daily quota by 1,000x, allowing you to submit up to 10,000 URLs per day, with no monthly quota. The daily quota per site is determined based on the site's verified age in Bing Webmaster Tools, site impressions and other signals available to Bing, and we will tweak this logic as needed based on the usage and behavior we observe. Key things to note:
- Site verified age is one of the signals, but not the only signal, used to determine the daily URL quota per site.
- Webmasters can see the revised limit for their site on the Bing Webmaster Tools portal (Submit URLs option) or by using the Get URL submission quota API.
- In the Bing Webmaster Tools portal, under the Submit URLs option, webmasters will see a maximum of the 1,000 most recently submitted URLs, even though the permissible daily quota may be greater than 1,000.
- As per existing functionality, the Submit URL option is not applicable to subdomain-level sites. If sites are mapped to a subdomain, quota is allocated at the domain level, not the subdomain level.

So, log in to Bing Webmaster Tools or integrate the Bing Webmaster APIs into your content management system now to benefit from the increase, and contact us with feedback. Feel free to contact us as well if your web site requires more than 10,000 URLs submitted per day; we will adjust as needed. Thanks! Bing Webmaster Tools team

Introducing Clarity, a web analytics product for webmasters

Today, we are announcing the beta release of Clarity, a web analytics product that enables website developers to understand user behavior at scale. Web developers face many challenges in building compelling and user-friendly websites. Understanding why users struggle, where they run into issues or why they abandon a website is difficult. When making updates to a web experience, A/B experimentation helps developers decide which way to go. While A/B experiments let developers see when their key metrics move, their primary drawback is the lack of visibility into why the metrics moved in any given direction. This gap in understanding user behavior led us to build Clarity.

Session Replay

The session replay capability of Clarity allows web developers to view a user's page impression to understand user interactions such as mouse movement, touch gestures, click events and much more. Being able to replay user sessions allows web developers to empathize with users and understand their pain points.

Clarity Case Studies

Bing uses Clarity to detect poor user experience due to malware

In the Wild West of devices and browsers, the experience you think you are serving and what your users see can be completely different: devices, plugins, proxies and networks can all change and degrade the quality of your experience. These problems are expensive to diagnose and fix. (The first image shows a Bing page affected by malware, while the second shows the default Bing experience after the malware was removed.) Clarity has been used by the Bing UX team at Microsoft to delve into sessions that had negative customer satisfaction and determine what went wrong. In some cases, engineers were able to detect pages that had multiple overlays of advertisements and looked nothing like the expected experience for customers. Looking through the layout anomalies and network request logs that Clarity provides, Bing developers were able to diagnose the cause: malware installed on the end user's machine was hijacking the page and inserting bad content. With the knowledge of what was causing these negative experiences, Bing engineers were able to design and implement changes that defended the Bing page. By doing so they increased revenue while decreasing page load time, all while giving their customers a significantly improved experience.

Cook with Manali uses Clarity to improve user engagement

Cook with Manali is a food blog, and like many other blogs dedicated to cooking, its posts begin with a story about the inspiration behind the recipe. Posts have detailed instructions to prepare the meal, high-quality photographs of the finished dish and potentially step-by-step pictures to help explain the more complicated parts. Near the bottom of the page is a shorthand recipe card summarizing ingredients, instructions and nutritional information. While this long post format enables food bloggers to emotionally connect with their readers and preemptively address any complication in the recipe, some readers would rather get straight to the recipe. When the Cook with Manali team started using Clarity, they were able to investigate real user sessions and realized that almost thirty percent of users were abandoning the page before reaching the bottom of these recipe pages, which holds important information about the recipe. In many cases, it seemed that users felt they had to scroll too far to get to the recipe they really cared about and lost patience before making it far enough down the page.
The developers realized their strategy was backfiring and creating a bad experience for some of their users, prompting them to add a "Jump to Recipe" button at the top of these pages. With the new button deployed, the team saw traffic going up and abandonment going down. When they dug into the new session replays, they could see users using the button to get directly to the content they cared about. Abandonment for these pages dropped to roughly ten percent, signaling a significant increase in user satisfaction. Interestingly, many users now use the "Jump to Recipe" button and then scroll back up to read the larger story afterwards.

How does Clarity work?

Clarity works on any HTML webpage (desktop or mobile) after adding a small piece of JavaScript to the website. This JavaScript code listens to browser events and instruments layout changes, network requests and user interactions. The instrumentation data is then uploaded and stored in the Clarity server running on Microsoft Azure.
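To make that flow concrete, here is a minimal, illustrative sketch of the kind of event instrumentation described above. This is not Clarity's actual code - the real, uniquely generated snippet comes from the Clarity dashboard, and the full library is open sourced on GitHub. The collection endpoint and payload shape here are hypothetical:

    // Illustrative only: a tiny event recorder in the spirit of the
    // instrumentation described above. Endpoint and payload are hypothetical.
    (function () {
      var buffer = [];

      function record(type, detail) {
        buffer.push({ type: type, detail: detail, time: Date.now() });
      }

      // User interactions: clicks and (throttled) mouse movement.
      document.addEventListener("click", function (e) {
        record("click", { x: e.clientX, y: e.clientY });
      });
      var lastMove = 0;
      document.addEventListener("mousemove", function (e) {
        if (Date.now() - lastMove > 100) {  // sample at most ~10x per second
          lastMove = Date.now();
          record("mousemove", { x: e.clientX, y: e.clientY });
        }
      });

      // Layout changes: record that the DOM mutated (a real library
      // captures far more detail about what changed).
      new MutationObserver(function () {
        record("layout", { nodes: document.getElementsByTagName("*").length });
      }).observe(document.documentElement, { childList: true, subtree: true });

      // Periodically upload buffered events; sendBeacon survives page unload.
      setInterval(function () {
        if (buffer.length > 0) {
          navigator.sendBeacon("https://example.com/collect",  // hypothetical endpoint
                               JSON.stringify(buffer.splice(0)));
        }
      }, 5000);
    })();

The throttled sampling and sendBeacon upload reflect the constraints any such library faces: instrumentation has to stay cheap on the client and still deliver its final batch when the user leaves the page.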
Other capabilities coming soon

Interesting sessions are automatically bubbled up based on Clarity's AI and machine learning capabilities to help web developers review user sessions with abnormal click or scroll behavior, session length, JavaScript errors and more. Web developers can spend less time and gain more insight into their users by focusing on the sessions that Clarity marks as most relevant.

Related sessions are groupings of similar sessions recommended based on a single session. This feature allows web developers to quickly understand the scope of a specific user behavior and find other occurrences for the same user as well as other users.

Heatmaps provide a view into user behavior at an aggregate level through click/touch and scroll heatmaps. Click/touch heatmaps show the distribution of user interactions across a webpage, while scroll heatmaps show how far down users scroll on your webpage.

How do I get started?

Sign up at the Clarity website using your Microsoft account! (In case you don't have one, you can sign up here.) When you create a new project, it will be added to the waitlist. A notification will be sent when your project is approved for onboarding; you can then log in to Clarity to retrieve the uniquely generated JS code for your project. Once you have added the code to your website, you can use the Clarity dashboard to start replaying user sessions and gaining insights. Please reach out to ClarityMS@microsoft.com if you have any questions.

Contributing to Clarity

The Clarity team has also open sourced the JavaScript library which instruments pages to help understand user behavior on websites, available on GitHub. As Clarity is in active development with continuous improvements, join our community and contribute. Getting started is easy: just visit GitHub and read through our README.

Thank you,

bingbot Series: Getting the most out of Bingbot via Bing Webmaster Tools

There are multiple features in Bing Webmaster Tools that allow webmasters to check Bingbot's performance and issues on their site, provide input to Bingbot's crawl schedule, and check whether that random bot hitting their pages frequently is actually Bingbot. In part 4 of our bingbot series, Nikunj Daga, Program Manager for Bing Webmaster Tools, revisits some of the tools and features that assist webmasters in troubleshooting and optimizing Bingbot's performance on their site.

Crawl Information - Webmasters can get data about Bingbot's performance on their site in the Reports and Data section of Bing Webmaster Tools. The site activity chart on this page shows an overlapping view of total pages indexed, pages crawled and pages with crawl errors for the last six months, along with impressions and clicks data. Through this chart, webmasters can visually see whether the changes they made on their sites had any impact on page crawling. To get more information on the pages with crawl errors, webmasters can go to the Crawl Information page, which provides an aggregated count of pages for each type of error Bingbot encountered, along with the list of affected URLs. This makes it simple for webmasters to troubleshoot why a particular page they are looking for does not appear in Bing search results.

Crawl Errors - In addition to checking the Crawl Information page for crawl errors, Bing Webmaster Tools proactively notifies webmasters when Bingbot faces a significant number of issues while crawling a site. These notifications are sent to the Message Center in Bing Webmaster Tools, and the alerts can also be delivered by email to users who do not visit the tools regularly. So it is a good idea for webmasters to opt in to email communication from Bing Webmaster Tools through the Profile page, where they can set the kinds of alerts they want to receive emails for, along with the preferred contact frequency. Bingbot can face different kinds of errors while crawling a site; a detailed list, along with descriptions and recommended actions, can be found here.

Crawl Control - The Crawl Control feature allows webmasters to provide input to Bingbot about the crawl speed and timing for their site. It can be found under the "Configure My Site" section in Bing Webmaster Tools. Using this feature, you can set an hourly crawl rate for your site and ask Bingbot to crawl slowly during peak business hours and faster during off-peak hours. There are preset schedules to choose from, based on the most common business hours followed across the globe. In addition to the preset schedules, webmasters have the option to fully customize the crawl schedule based on their site's traffic pattern; customizing is as easy as dragging and clicking on the graph in the Crawl Control feature.

Fetch as Bingbot - The Fetch as Bingbot tool returns the code that Bingbot sees when it crawls a page. Webmasters can find this feature under the "Diagnostics and Tools" section and submit a request to fetch a page as Bingbot. Once the fetch is completed, the status changes from "Pending" to "Completed" and webmasters can see the code Bingbot receives when it crawls the page. This is a useful feature for webmasters who use dynamic content on their sites, and it is the basic check for seeing what data Bingbot receives among all the dynamic and static content on a page.

Verify Bingbot - Found under the "Diagnostics and Tools" section in Bing Webmaster Tools, the Verify Bingbot tool lets webmasters check whether a Bing user agent string appearing in their server logs is actually from Bing. This can help webmasters determine whether someone is hiding their true identity and hitting the site under Bing's name. It also helps webmasters who have manually whitelisted IPs for Bingbot on their servers: since Bing does not release a list of its IPs, webmasters can use this tool to check whether the IPs allowed on the server belong to Bing and whether they are whitelisting the right set of addresses.
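For readers who want to approximate this check directly on their own servers, a widely used approach is a reverse-then-forward DNS lookup: resolve the suspect IP to a hostname, confirm the hostname falls under Bing's crawler domain, then resolve that hostname forward again and confirm it maps back to the same IP. Below is a minimal Node.js sketch of that pattern, not the Verify Bingbot tool's actual implementation; the search.msn.com suffix is the domain commonly associated with Bing's crawlers, and the sample IP is a placeholder to replace with one from your own logs:

    // Illustrative sketch: reverse-then-forward DNS check for a suspected
    // Bingbot IP. Assumes Bing crawler hosts resolve under search.msn.com.
    const dns = require("dns").promises;

    async function looksLikeBingbot(ip) {
      try {
        const hostnames = await dns.reverse(ip);              // IP -> hostname(s)
        for (const host of hostnames) {
          if (!host.endsWith(".search.msn.com")) continue;    // expected crawler domain
          const addresses = await dns.resolve4(host);         // hostname -> IP(s)
          if (addresses.includes(ip)) return true;            // must round-trip to same IP
        }
      } catch (err) {
        // DNS lookup failures fall through to "not verified".
      }
      return false;
    }

    // Example: replace with an IP taken from your own server logs.
    looksLikeBingbot("157.55.39.1").then(function (ok) {
      console.log(ok ? "verified Bingbot" : "not verified");
    });

The forward confirmation step matters: reverse DNS records are controlled by whoever owns the IP block, so only the round trip back to the original IP proves the hostname is genuine.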
Thus, it is evident that webmasters can do a lot to improve Bingbot's performance on their site using the features in Bing Webmaster Tools. These features were developed and have evolved over the years based on feedback we receive from the webmaster community. So log in to Bing Webmaster Tools, try out these features and let us know what you think.

Thanks!
Nikunj Daga
Program Manager, Bing Webmaster Tools
