Google created Webmaster Tools to help you understand what Google knows about your website, and to help spot serious problems hurting your search rankings.
Read on to learn how to set up Google Webmaster Tools, then get an overview of everything you can do with it.
To get started with Google Webmaster Tools, visit the signup page – you’ll need to log in with your Google Account or, if you don’t have one, follow the prompts to create it.
Once logged in, you’ll need to “add a site,” and then proceed to verification.
Verifying Your Site
The most common reason people have trouble setting up Webmaster Tools is verification, where Google makes sure you own your website. Helpfully, Google has made this easier by allowing verification through Google Analytics.
Verifying With Google Analytics
The default Webmaster Tools verification option should be to “Use your Google Analytics account:”
If you have set up Google Analytics (it must use the newer “async” tracking code) and you are an administrator for the account, go ahead and click verify. If you aren’t using Google Analytics, either learn more about it or try one of the alternate methods.
Alternate Verification Methods
Alternate methods for Google Webmaster Tools verification include:
- Upload an HTML file to your server – If you have FTP access to your website, this should be an easy option. You’ll get a small file to upload.
- Add a meta tag to your site’s homepage – This involves adding a small line of code to your website. If you can edit your site, great. If you use WordPress, it’s easily accomplished with many plugins. WordPress SEO by Joost is one of these, or if you just need verification, Webmaster Tools Verification should work.
- Add a DNS record to your domain’s configuration – If you have access to your DNS records, this is an option. Just be patient for the records to update. Learn more about managing DNS.
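For the meta-tag method, the tag you add looks roughly like this (a sketch only – the `content` token below is a placeholder, and Google generates the real value for you during verification):

```html
<!-- Placed inside the <head> of your homepage.
     Google provides the actual content value when you choose
     the meta-tag verification option. -->
<meta name="google-site-verification" content="YOUR-TOKEN-HERE" />
```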
When your site is set up, you’ll land on the dashboard. This is just a summary showing a few of the more important details about your site.
Down the left-hand side of the screen, there is a menu – I’m going to take you through each section one step at a time. If you don’t have Webmaster Tools setup yet, I’d suggest bookmarking this page and following along once you do.
Google uses the Messages section to communicate information about your site. These messages are usually important – telling you that your site is hosting malware (it has probably been hacked), or that a WordPress update is available, for instance.
It’s wise to forward these messages to your email account. Click the “Google Webmaster Tools” logo in the top-left to go back to the homepage. In the upper-right section of the screen there should be a drop-down that says “Don’t forward messages.” Clicking it will give you the option to forward messages to any email address associated with your Google Account.
Sitemaps provide Google with a list of pages on your site, to help the search engine make sure it crawls your entire website. This isn’t terribly important if you have a very small site, but it’s never a bad thing to have.
In order to submit a sitemap, you’ll first have to create one. If you are using WordPress, you can use Google XML Sitemaps just for this, or Yoast’s SEO plugin includes sitemaps as one of many features. If you’re not on WordPress, this site can build it for you or you can do it manually.
Once you have a sitemap, you can submit it via the button in Webmaster Tools, you just need to know its URL (usually something like http://yoursite.com/sitemap.xml). Come back later, and Google will tell you if it had any problems with the sitemap, how many webpages were in the sitemap, and how many of those pages are currently included in Google’s index.
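For reference, a minimal sitemap file follows the shape below (the URLs and date are hypothetical – a plugin or generator will fill in your real pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://yoursite.com/</loc>
    <lastmod>2011-10-01</lastmod>
  </url>
  <url>
    <loc>http://yoursite.com/about/</loc>
  </url>
</urlset>
```

Only the `<loc>` element is required for each URL; fields like `<lastmod>` are optional hints.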
If you have a small site, hopefully almost all the pages will be included – though this can take some time if your site is newer.
The Crawler Access section of Webmaster Tools is dedicated to helping you hide parts of your website from Google. The idea isn’t to be sneaky; some pages simply may not be important for search engines, or you may want to keep them private (although just asking Google not to look at them is really not sufficient).
For many small businesses, the bigger issue with crawler access will be making sure that nothing is accidentally being hidden. If your site is completely absent from the search results, there’s a good chance you’ll find an error here. Also refer to the “Crawl Errors” section further down this page.
The first tab here is “Test robots.txt.” This refers to a file on your web server that tells “bots” – such as those search engines use to crawl the web – which pages on your website they are allowed to access.
A common setting is:
User-agent: *
Disallow:
This essentially means that all robots (the *) are disallowed from nothing – meaning your entire site is open to the world. If you do not have a robots.txt file, or it is blank, that will produce the same result. It’s better to have a blank one than none at all. If you have something else here, and don’t know why, this could cause visibility problems in search.
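If you do want to keep crawlers out of part of your site, a robots.txt along these lines would do it (the directory name here is just an illustration):

```
User-agent: *
Disallow: /private/
```

Everything under /private/ is then off-limits to well-behaved bots, while the rest of the site stays open.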
Under separate tabs, Google also provides a robots.txt generator, and a “Remove URL” tool. The generator can be helpful if you do want to specify pages for exclusion from Google, and if there’s something private (on your site) appearing in Google, the removal tool can help get that taken out.
For some searches, Google returns a result where a single website dominates, with Google displaying what is essentially a menu for the site. The appearance of this “12-pack” of sitelinks is relatively new, as Google used to show fewer links.
Try searching for your website by name. If you see something like this, then the Sitelinks section of Google Webmaster Tools might be helpful for you.
You’ll want to look at the sitelinks that appear and decide if any are pages you don’t want appearing here. If so, fill out Google’s form. It’s a bit confusing – in this example, the first blank would be left empty, because the top result is my site’s home page. You would then enter the URL of the sitelink you wish to remove.
Change of Address
Google Webmaster Tools can help you change your website’s address. You’ll need to verify your new address before using this tool.
The settings page presents these options:
- Geographic target – If you specifically target users of a particular country, you can set this option.
- Preferred domain – Your site should not show up as both www.yoursite.com and yoursite.com – one should automatically redirect to the other (use what’s called a 301 redirect). If you’re using WordPress, this should be automatic. Whether you choose www or not, choose the same for your Preferred domain here.
- Crawl rate – This determines how rapidly Google’s spider crawls your site. People often want to make this very fast, but the default is probably best. It is mainly useful to slow down the crawler if it is moving fast enough to overload your site.
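On an Apache server, the preferred-domain redirect mentioned above can be sketched in an .htaccess file like this (hypothetical domain, and it assumes mod_rewrite is enabled – your host’s setup may differ):

```apache
# Send the bare domain to the www version with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]
```

If you prefer the non-www form, swap the two domains. The 301 status tells Google the move is permanent, so ranking signals consolidate on one version.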
Parameters are added onto the end of a web address – for instance, rather than http://yoursite.com, you might have http://yoursite.com?utm_source=search. The utm_source designation is usually used for Google Analytics – and could be ignored by Google’s crawler, because it does not affect the content of your web page.
On the other hand, some sites are set up so that these parameters actually determine which page on the site will be served. If Google ignores the parameters, it will miss most of your site.
In the URL Parameters section, Google displays parameters it has seen used on your site (if any). The default, “Let Googlebot decide,” is better than screwing things up if this is all foreign to you. If, however, Google is indexing many identical pages, you can click “edit” and tell it that the parameter has no effect on content.
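To see the idea concretely, here is a quick Python sketch (with a hypothetical URL) showing how a tracking parameter rides along in the query string while the path that identifies the page stays the same:

```python
# The utm_source parameter is extra baggage in the query string;
# the path ("/page") that identifies the content is unchanged.
from urllib.parse import urlparse, parse_qs

url = "http://yoursite.com/page?utm_source=search&id=42"
parsed = urlparse(url)

print(parsed.path)             # /page
print(parse_qs(parsed.query))  # {'utm_source': ['search'], 'id': ['42']}
```

A parameter like utm_source can safely be ignored by a crawler; a parameter like id, which selects the page, cannot.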
Your Site On The Web
OK, we’re through the basic and technical stuff – now for the fun parts. In “Your Site On The Web,” Google tells us interesting things it knows about our site.
The search queries report will show you the number of impressions and clicks your site is generating through Google.
The numbers are imprecise, but provide a look at what’s going on. You can also dig into data for particular queries – helpful to see if a little work might push you onto the first page more frequently for a particular search (average ranking is displayed), or to see where your click-through rate is particularly good/bad.
You can also click the “Top pages” tab to see your data by web page – again helpful to see which pages would provide the most benefit from improvement.
Links to Your Site
This section shows websites linking to your site. It is not too helpful, as Google only shows a sampling of the links it has found. There are three sections:
- Who links the most – You can see how many links point to your site from particular domains.
- Your most linked content – Find the pages on your website that have attracted the most outside links.
- How your data is linked – Shows the most commonly used anchor text. This refers to the actual text used in the link pointing to your site.
In the Keywords section, Google lists words that are used more commonly on your site than on most websites. This is one of the methods Google uses to determine the topic of your website. It’s an interesting thing to look at – if most of the keywords are not related to your business/industry, you should try to work more relevant keywords into the text on your website.
Click on any word in the list to see the number of its occurrences, variations on the word that Google considers the same (such as plurals), and the specific pages where Google finds the word most frequently.
This section of Webmaster Tools displays a list of the pages on your website, and the number of “Internal links” pointing at each page. Internal links are links from one page on your website to another.
You can click on any page to see a list of the pages linking to it. The highest counts will usually be pages included in your site-wide navigation.
Internal links can help Google determine what is important content on your website. Take a look at this report, and make sure that your key pages are among those with the most internal links.
If your site publishes an RSS feed and there are subscribers using Google products to access it, the number will show up here. This data is not nearly as complete as what Feedburner provides, so it’s not particularly useful. You can also submit your feed as a sitemap here, which could help keep Google informed about updates to your site.
Google recently created a +1 button (this is different from the Google+ social network). It is similar to Facebook’s “Like” button, letting people indicate your page is valuable.
This area describes how +1’s have impacted your site in the search results. You can see how frequently a result was displayed that told a searcher it had been “+1’d,” and what impact this had on the click-through rate for that result.
The +1 activity report shows the rate at which you are receiving +1’s, and the pages which are +1’d the most.
The audience section displays the number of total users that have +1’d a page on your site. Once you have a reasonable number of +1’s, demographic information is also displayed.
The diagnostics section of Webmaster Tools is an important area – it primarily alerts you to problems with your site.
In an effort to protect searchers, Google scans sites for malware. Alerts will appear here if malware is found on your site, which usually occurs because your site has been hacked.
Google provides suggestions for dealing with malware.
If you believe your site does not actually contain any malware, you can request a “malware review” from within this section of Webmaster Tools (only when malware has been detected). It is important to do this, because if Google believes there is malware on your site, it can dramatically lower your search rankings.
The Crawl Errors page displays problems Google had when crawling your site. The most common problems you’ll probably see are:
- Not found – This indicates there are links to a page that does not exist. A good solution is often to create a 301 redirect, pointing the missing URL at something appropriate. If the link/page is really meaningless, you can also ignore it – “not found” errors should not impact your rankings.
- Restricted by robots.txt – These are pages Google is being told not to crawl by that file. Make sure there is nothing that you want indexed in here. Read the “Crawler Access” section earlier in this article for more about robots.txt.
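For a single missing page, the redirect fix is a one-liner on an Apache server – a sketch with hypothetical paths (assumes mod_alias is available; the syntax varies on other servers):

```apache
# Permanently point the dead URL at its closest replacement
Redirect 301 /old-page/ http://yoursite.com/new-page/
```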
In the crawl stats, the main thing to watch for is a sudden drop-off in the number of pages crawled per day. You want Google to be consistently looking at your site, and a big drop could indicate a problem. Also, spikes in “Time spent downloading a page” may indicate that your server is having trouble serving up pages quickly.
Fetch As Googlebot
You can ask Google to crawl a specific page on your site. It takes a few seconds, and then you can see what Google found. It will be displayed as the raw HTML, which can be helpful in making sure Google is finding the same thing your visitors find.
One particular use is when Google says malware exists on your site – try fetching pages as Googlebot. A hacker may have set up your site to display different content to Google than to your visitors (scan what Google returns for spammy phrases, such as “viagra”).
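That scan can be automated with a few lines of Python – a rough sketch, not a Webmaster Tools feature, and the term list and sample pages are purely illustrative:

```python
# Scan fetched page source for spammy phrases that may indicate
# hacked content being shown only to Googlebot.
SPAM_TERMS = ["viagra", "cialis", "payday loan"]

def find_spam_terms(html):
    """Return the spam terms that appear in the page source."""
    lowered = html.lower()
    return [term for term in SPAM_TERMS if term in lowered]

clean = "<html><body>Welcome to our bakery!</body></html>"
hacked = "<html><body>Buy cheap VIAGRA online</body></html>"
print(find_spam_terms(clean))   # []
print(find_spam_terms(hacked))  # ['viagra']
```

Paste the HTML returned by the Fetch as Googlebot tool into a check like this if the page is too long to eyeball.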
You are limited to 50 fetches per week.
Once a web page has been fetched, you can manually “Submit to index.” This will make Google consider the page for inclusion in its search results, although Google is usually quite good at finding pages on its own. You can submit up to 10 pages per week (and can ask Google to look at all pages linked from that one as well).
Title tags and meta descriptions are HTML elements that Google likes to see, and often displays in the search results. Each page on your site should have a unique title and meta description. This section will alert you to pages whose titles or meta descriptions are duplicates of other pages, are too long or too short, or do not provide useful information.
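In a page’s source, the two elements look like this (a hypothetical page, just to show the shape and placement inside the head):

```html
<head>
  <title>Chocolate Cake Recipes | Example Bakery</title>
  <meta name="description"
        content="Step-by-step chocolate cake recipes, from classic
                 layer cakes to quick one-bowl bakes.">
</head>
```

Keep the title and description unique per page and reasonably short – Google truncates long values in the search results.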
It’s a good idea to go through this report periodically, and consider fixing any tags which Google reports as problematic.
The labs area includes features which Google considers still under development, although some of these have been there for quite a while.
Google now provides small image previews in the search results (click the magnifying glass next to any result). You can get a preview of the images that will be shown using this tool.
The preview takes a few seconds to be created – when you view it, Google also provides help if the preview does not display the content normally seen on your website.
A (really bad) site performance report:
Google now uses site load time as a factor in its search rankings. The company has stated that speed is not a major factor, but that sites with terrible load times may experience lower rankings. This tool shows how your site speed has changed over time, and how it compares to other websites.
In general, I would not be too concerned about this unless it is very bad – like the example above, which falls into the slowest 11% of sites. In order to get under the “fast” line, your site needs to load in about 1.5 seconds.
If you have videos on your website, you can create a video sitemap. This tool functions much like the regular sitemap tool, just finding problems in your video sitemap.
Google Webmaster Tools can help you understand how Google sees your site, and spot any serious problems hurting its search rankings. If you haven’t already, I suggest you sign up – and return to check the information periodically.