Ever get the feeling that Google really wants to be friends with you if you do SEO?
No, not really?
I know they have a tendency to unleash black-and-white animals on us and make changes that can drive any man or woman to drink.
Not to mention, if we don’t play according to their rules, we’ll get slapped in all kinds of ways.
But, still, you’ve got to admit — they do give us a few very helpful and useful tools that we utilize for our SEO, right?
There’s obviously Google Analytics (GA), with in-depth analytics to help you understand and improve website traffic.
And then there’s Google Webmaster Tools (GWT), which helps in planning and evaluating your SEO efforts and the overall health of your website.
You get a chance to see your website the way Google sees it.
And that can be crucial for the success of your SEO efforts.
You’ll see which pages are indexed, your most popular keywords, whether rich snippets and markup are set up correctly, what links are coming your way and from whom, how your site is crawled and indexed, and so much more.
You do want your site fully indexed and ranking, right? GWT is pure gold for this.
It’s like a health check and all about the metrics: traffic, links, indexing, crawling, data markup, keywords, and any errors to be fixed.
So many features, so many settings, so much benefit!
GWT Tries Really, Really Hard To Be Your Friend With All These Features
Let’s talk a bit about all those settings and features you can find within Google Webmaster Tools, yes?
Some of them are more important and beneficial than others, and we could probably talk all day and all night about each of the tools and reports.
But we don’t have time for a tea party and dollhouses, so let’s just get an overview with some details that will help you in your quest for SEO perfection.
You’ll find this menu at the top-right corner of your browser window by clicking the gears icon.
Let’s talk about the settings that matter for SEO purposes.
Google explains it like this: “If your site has a neutral top-level domain, such as .com or .org, geotargeting helps Google determine how your site appears in search results, and improves our search results for geographic queries. If you don’t want your site associated with any location, select Unlisted.”
Do you intend to target a specific geographic area only? Then by all means, use this.
Decide between a www or non-www URL version of your site. If you do set a preferred domain, Google will treat both links the same, regardless of whether the ‘www’ is used or not.
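If you want to enforce your preferred domain at the server level too, a 301 redirect handles it. Here’s a minimal sketch for an Apache .htaccess file, assuming (hypothetically) that www.example.com is your preferred version:

```apache
# Send non-www requests to the www version with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

That way Google, and your visitors, only ever see one version of each URL.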
You should let Google set the crawl rate itself and optimize it for your site. They say, “Our goal is to crawl as many pages from your site as we can without overwhelming your server’s bandwidth. You can change the crawl rate (the speed of Google’s requests during the crawl) for sites at the root or subdomain level – for example, www.example.com and http://subdomain.example.com. The new custom crawl rate will be valid for 90 days.”
Change Of Address
Moving your site to a new domain? After making sure you’ve got all your 301’s squared away, take a pass through here and tell Google about your new URL. They will update their index to reflect this and changes stay in effect for 180 days — plenty of time to crawl and index the pages at your new URL.
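Before you use Change of Address, every old URL should already 301 to its counterpart on the new domain. A minimal sketch for Apache, with hypothetical domains oldsite.com and newsite.com:

```apache
# Permanently redirect every URL on the old domain to the same path on the new one
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?oldsite\.com$ [NC]
RewriteRule ^(.*)$ http://newsite.com/$1 [R=301,L]
```

Page-to-page redirects like this preserve as much link equity as possible during the move.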
Site Messages: Notifications, Warnings, And More
If there’s an issue with your site, you’ll get a message sent to Webmaster Tools. How often do you check it? Hopefully often enough to know that everything is either going great or blowing up.
This is where you’d get one of those pesky unnatural link notifications, if you’ve been playing on the shadier part of SEOville. Isn’t that nice? Google will let you know when something is rotten. It’s not always clear what will happen, but if it’s something within your power to change, then do so.
You have worked hard on your structured data and rich snippets, right? This report tells you how many structured data items Google can find, and on how many pages.
Use this to make sure that your Schema.org markups are being crawled correctly.
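If you haven’t added any markup yet, here’s a minimal sketch of what Schema.org microdata looks like, using a hypothetical review as an example (the names and values are made up for illustration):

```html
<!-- Hypothetical example: Schema.org microdata for a review rich snippet -->
<div itemscope itemtype="http://schema.org/Review">
  <span itemprop="name">Vegetarian Dog Food Review</span>
  by <span itemprop="author">Jane Doe</span>
  <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    Rating: <span itemprop="ratingValue">4</span> out of
    <span itemprop="bestRating">5</span>
  </div>
</div>
```

Once markup like this is crawled, it should start showing up in this report.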
If you’re not too savvy with code, this tool helps you apply structured data information to your website in a point-and-click manner. Your site won’t be altered; Google just saves this information and applies it.
These are the links that show up under your domain in the SERPs. Here’s what I mean:
These are based on how much authority a domain has for a particular search query. Having these show up for your domain can help with, for example, reputation management, as it pushes any negative search results further down the page.
Don’t get too trigger-happy here; use this with caution.
This is probably one of the more useful tools available. Google will check your site for any potential issues with content (you know, things like missing, duplicate, or problematic title tags or meta descriptions).
Correct and adjust as much as you can when these things come up.
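For reference, every page should have one unique, descriptive title tag and meta description. A sketch with hypothetical content:

```html
<!-- One unique title and meta description per page; duplicates across pages
     are exactly what this report flags -->
<title>Vegetarian Dog Food: Brands, Recipes &amp; Buying Guide</title>
<meta name="description" content="Compare vegetarian dog food brands, plus homemade recipes and what to look for on the label.">
```

Duplicate or missing versions of these two tags account for most of what shows up in this report.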
Here you’ll find an estimate of the number of clicks and impressions your website is getting. It displays top search queries and while not as detailed as Google Analytics, it gives you an overview of rankings and traffic.
It isn’t that reliable but still useful for a glance every now and then.
Links To Your Site
Want to know who’s linking to you? Here’s where you can see that as well as your top anchor text. It lists who links to you the most, your most linked content, and how your data is linked.
You can also export all of this information and download it for your Sunday afternoon perusing if you wish.
If you find any low-quality websites that have plenty of links pointing to your site (these can be global, sitewide links), you probably want to do something about that.
In general, with some caveats, the more links a page gets, the more authority it has. I know, that’s not always true, but it’s true enough. Focus your internal linking mainly on your most important pages — those you want people to find all the time.
If your internal linking is up to par, you help Google crawl and index those key pages, and you help them rank in the SERPs.
Pretty self-explanatory, right? If you have any manual actions counted against you, they will show up here. Suffice it to say, you want to keep your eyes on this.
This report tells you how many of your website’s URLs are indexed. If you go to the Advanced tab, you can even see how many are blocked, have ever been crawled, or have been removed, along with the total indexed.
This is useful for troubleshooting any errors or mistakes on your site. Also, if you know your site has, say, 250 URLs but the report shows 2,000 URLs or some other crazy number, you probably have an issue with duplicate content or canonical URLs.
These are the most frequently used keywords on your website. This report should give you a quick glance at how well your site is themed for what you’re trying to rank for.
If you want to rank for “vegetarian dog food” but find that most of the keywords listed have nothing or very little to do with it, you are doing something wrong.
This is where synonyms and related keywords should show up, as well.
Want a URL to be taken out of the index? Your robots.txt will tell the bots how to crawl your site, but you can also request a removal here. There are requirements, and you should use this with caution.
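For context, a minimal robots.txt looks something like this (paths are hypothetical):

```text
# Hypothetical robots.txt: keep all bots out of admin and internal search pages
User-agent: *
Disallow: /admin/
Disallow: /search

Sitemap: http://www.example.com/sitemap.xml
```

Keep in mind that robots.txt only blocks crawling; a URL that’s already indexed can still appear in results, which is why the removal request tool exists in the first place.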
A basic snapshot of how your website is doing, covering DNS, Server Connectivity, Robots.txt Fetch, and URL Errors (error messages like 404s or 403s, and so on).
Finding any errors? Fix them, pronto, especially URL errors that affect your site’s crawl efficiency.
The similar Google Analytics report on page load times is useful, of course, and you might prefer it, but this gives you a decent enough overview of how many pages are crawled per day, kilobytes downloaded per day, and time spent downloading a page.
You want more pages crawled, more downloads, but more time spent downloading is not good. Page speed is a factor for rankings and optimizing load times (as in, reducing them) should be a tool in your SEO arsenal.
Fetch As Google
Do you have a large site with lots of moving parts and changes? It can be very helpful to fetch your site as if you were Google, to verify or troubleshoot any potential issues.
After you’ve fetched your site, you can also submit it to the index.
This report shows you what your robots.txt looks like and whether you’re blocking any URLs. Very handy for a quick look to make sure your robots.txt is set up properly. The User-agent feature is especially useful, letting you test how different bots, such as those for Images, AdSense, and AdWords, see your site.
Submitting your sitemaps through Webmaster Tools should be obvious. At least make sure you’re passing along your XML sitemap, but consider submitting a news sitemap (if applicable), or an image, video, or even a mobile sitemap.
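If you’ve never looked inside one, an XML sitemap is just a list of URLs with optional metadata. A minimal sketch with hypothetical values:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about/</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Most CMS platforms and plugins will generate this for you; all you have to do is submit the URL here.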
This is powerful and can either really help your site or really hurt it. For the most part, you’ll be fine with noindex, nofollow, 301s, rel=canonical, or robots.txt. However, for some sites (especially those that use URL parameters like it’s a drug; see news sites), this tool lets you request that Google crawl only specific URLs and ignore others.
You can configure how Google treats parameters like utm_source, utm_medium, and more.
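If you’d rather handle this in markup than in the tool, rel=canonical on a parameterized URL points Google at the clean version. A sketch with hypothetical URLs:

```html
<!-- Placed in the <head> of a parameterized page such as
     http://www.example.com/shoes?utm_source=newsletter&sort=price -->
<link rel="canonical" href="http://www.example.com/shoes">
```

Every tracking or sorting variation then consolidates its signals onto the one canonical URL.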
If this section of your Webmaster Tools says nothing but “Google has not detected any malware on this site”, then you’re all good!
If you are infected with malware, like a code injection, most visitors will see a warning that your site isn’t safe when they try to access it.
This is where Google just threw a bunch of extras into a corner that some webmasters will find useful.
You’ve got the Structured Data Testing Tool for checking that Google correctly parses your data markup.
There’s the Structured Data Markup Helper, the point-and-click tool for adding markup.
Let’s not forget about Google Places, although that has actually been replaced by Google+.
Last but not least, you’ve got Google Merchant Center, where you can upload product data if you sell online (like an e-commerce store).
This is a motley collection of tools and resources that frequently change, get added, or get dropped.
If you’ve set up your author profile correctly, using the rel=”author” tag and such, you’ll find statistics of your articles here.
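Setting up authorship is just a matter of linking your pages to your Google+ profile. A minimal sketch, with a hypothetical profile ID:

```html
<!-- Hypothetical authorship markup: link the page to a Google+ profile -->
<link rel="author" href="https://plus.google.com/1234567890/">

<!-- Or inline, in the article's byline -->
<a href="https://plus.google.com/1234567890/" rel="author">Jane Doe</a>
```

Once Google verifies the connection (your profile must link back to the site), your articles start accumulating the statistics shown here.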
You can add a Google search bar to your website, and if you do, you can see reporting of this.
You can preview what your website looks like from a few viewpoints, like On Demand Desktop Search Instant Preview, Pre-Render Desktop Search Instant Preview and Mobile Search Instant Preview.
It will also tell you if it finds any errors while fetching.
This is no longer supported, but you can use these other resources instead:
- Google Analytics Site Speed — Measures page load time as experienced by your visitors and allows you to measure other user-defined timings.
- PageSpeed Insights — Analyzes the content of your pages and provides suggestions to improve performance.
Did You Make Friends With Google Webmaster Tools Yet?
Are you convinced yet that Google really wants to be Best Friends Forever and help you with your site?
Set aside any suspicion you might harbor and let’s pretend they have your site’s best interest at heart.
If you do go along with that, you’ll find that the Webmaster Tools will give you top-level insight and data about your website that will, ultimately, help with your Search Engine Optimization strategy.
Understanding how your site is performing, how it’s crawled and indexed, how your content keywords match what you want to rank for, and who links to you: these are all pieces of information you need.
How are you using Google Webmaster Tools to support your SEO work?