[Manoj]: Can you talk a little bit about the benefits of the latest features, such as DNS record update and clicks/avg. position?
[Maile Ohye]: Sure, Manoj, thanks for asking. Our Webmaster Tools team schedules a new release about every two months. Exciting! For site owners, that has meant new features every couple of months for the past 4+ years.
In each release, you’ve probably noticed that we aim to improve an existing feature (or our backend infrastructure), as well as release an entirely new feature. With DNS verification, we helped webmasters more easily verify ownership of subdomains in Webmaster Tools. Rather than individually verifying www.example.com, blog.example.com, and shopping.example.com, you can add one line to your DNS record and all associated sites/subdomains are verified at once.
We expect this feature to be most helpful to webmasters of larger sites.
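As a rough illustration of what that one line looks like, here’s a hypothetical TXT record in a DNS zone file. The verification token is generated per account in Webmaster Tools, so the value below is entirely made up:

```
; Hypothetical domain-verification entry (zone-file format).
; The google-site-verification token shown here is invented;
; Webmaster Tools generates a unique value for each account.
example.com.  3600  IN  TXT  "google-site-verification=abc123exampletoken"
```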
In our improved Search queries feature, we aimed to give all site owners access to impression and click data. Whether you’re the webmaster of a large ecommerce site, a cooking blogger, a large AdWords customer, or someone who has never heard of AdWords, every verified site owner can see data about their current potential visitors (impressions) and their actual visitors (clickthrough).
In terms of how to act on this data, one method is to find the queries where you’re receiving impressions but not getting clickthrough. Run those queries yourself from a search box and investigate why you’re not receiving visitors. How do your title and snippet look? Can you make your content more competitive?
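As a sketch of that workflow, here’s a small Python example that flags high-impression, low-clickthrough queries. The query data and thresholds below are invented for illustration; in practice you’d work from your own Search queries data:

```python
# Toy example: flag queries with many impressions but a low clickthrough rate.
# The rows and cutoffs are made up; use your own Search queries data in practice.
queries = [
    {"query": "chocolate cake recipe", "impressions": 12000, "clicks": 90},
    {"query": "easy weeknight dinners", "impressions": 800, "clicks": 120},
    {"query": "cast iron skillet care", "impressions": 5000, "clicks": 30},
]

def low_ctr_queries(rows, min_impressions=1000, max_ctr=0.02):
    """Return queries worth investigating: lots of impressions, few clicks."""
    flagged = []
    for row in rows:
        ctr = row["clicks"] / row["impressions"]
        if row["impressions"] >= min_impressions and ctr < max_ctr:
            flagged.append(row["query"])
    return flagged

print(low_ctr_queries(queries))
```

The queries this returns are the ones to run yourself in a search box, so you can judge how your title and snippet look against the competition.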
With Search queries, you can view data not just from web search, but also from other properties like Images and from mobile and smartphone queries. You can also filter the data by country of origin, such as the United Kingdom or Japan.
Features like Search queries’ average position were developed to give you an at-a-glance understanding of how a given keyword is performing. Additionally, if you only want to track a certain set of keywords, go ahead and “star” them for a simplified display (like you’d see in Gmail).
[Manoj]: How has real time search been incorporated?
[Maile Ohye]: Real Time results are triggered in Universal Search from the following:
Threshold queries based on volume and delta: This includes triggering queries like [lost] to show Real Time results on premiere/finale night. We algorithmically notice a large delta for this query compared to, say, yesterday or earlier in the week. When [lost] reaches a certain threshold, we understand that Real Time results may be most relevant for the user.
Common queries for Real Time results: This includes political queries and the like — things that are constantly talked about and where freshness and/or a variety of sources may be helpful to the user.
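The volume-plus-delta idea described above can be sketched in a few lines of Python. To be clear, the numbers, thresholds, and function here are purely hypothetical toys; Google’s actual triggering signals are far richer:

```python
# Toy model of delta/threshold triggering for Real Time results.
# All volumes and thresholds are invented numbers for illustration only.
def show_realtime_results(volume_today, baseline_volume,
                          min_volume=10000, min_delta_ratio=3.0):
    """Trigger Real Time results when a query is both high-volume and
    spiking relative to its recent baseline."""
    if volume_today < min_volume:
        return False  # not enough interest overall
    delta_ratio = volume_today / max(baseline_volume, 1)
    return delta_ratio >= min_delta_ratio  # large delta vs. yesterday/last week

# [lost] on finale night: huge spike over its recent baseline.
print(show_realtime_results(500000, 40000))   # spiking and high volume
print(show_realtime_results(500000, 450000))  # high volume, but no spike
```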
Real Time information on Google is also searchable. For certain queries, you can actually replay the conversation about a topic. I think this may be the only searchable, replayable, public archive of tweets.
[Manoj]: What feature of Webmaster tools is your favorite?
[Maile Ohye]: Picking a favorite feature is pretty difficult for me. It’s like picking my favorite niece (I love them all!). One feature I definitely feel goes under-recognized, though, is HTML suggestions. HTML suggestions tells you what pages have duplicate HTML titles and meta descriptions (which are often used in your snippet).
This is actionable data. To get a handle on whether your site has duplicate content, I’d first run a few site: queries, like [site:googlewebmastercentral.blogspot.com microformats], to see if Google had filtered duplicate results at serve time. Next, I’d go straight to HTML suggestions. Pages with duplicate titles and snippets are likely complete duplicate content.
At SES New York a few months ago, I gave a more in-depth presentation on duplicate content, multiple sites, and how to address the issues. I’ll try to film that presentation in mid-June (when I’m back from the SES Toronto keynote), and I’ll write a related post on our Webmaster Central Blog just in case people find the information useful.
[Manoj]: For a brand new site, would a site owner see quicker indexation of their site with an XML sitemap vs. without?
[Maile Ohye]: An XML Sitemap is a great way to maximize your site’s exposure to our crawlers. Once crawled, your site can be indexed. Once indexed, your site can be returned to users in search results. So yes, submit a Sitemap if you can.
Furthermore, when you submit a Sitemap, Webmaster Tools then displays the number of URLs from your Sitemap that are indexed. Win-win.
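For reference, an XML Sitemap is just a list of URLs in the standard sitemaps.org format. The URLs and dates below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-06-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/blog/</loc>
    <lastmod>2010-05-28</lastmod>
  </url>
</urlset>
```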
[Manoj]: What types of improvements are you hard at work on?
[Maile Ohye]: As for improvements, Webmaster Tools Message Center is only going to get better. If you’re a verified site owner, I’d recommend setting up email forwarding of our messages. We currently notify site owners about certain violations of our webmaster guidelines, about infinite spaces we’ve crawled, and about malware detected on their site.
Having a communication channel between us at Google and the opted-in, verified owner of a site has truly huge potential. Excitement mounts… music quickens… stay tuned!