Google’s search console is a familiar sight to many marketers, particularly those in the habit of checking for search penalties due to various gray hat techniques, or simply the age of their sites. Formerly known as Google Webmaster Tools, the Search Console is a collection of tools and information that can help with a variety of optimizations. I’ve come up with the five best approaches to optimization using only the data the Search Console gives you.
In January of 2018, Google updated their Search Console. Those who have used it before are given the opportunity to migrate from the old console to the new. This is worth doing: the new Search Console doesn’t yet have everything the old one had, so you lose access to some data and reports, but it adds new features and is more tightly integrated with the rest of Google’s products.
For example, it was only in August that Google added the link reports from the old console to the new one.
For now, the new Search Console has a number of useful reports. There’s an index coverage report, an AMP status report, a rich result status report, a performance report, and a sitemaps report. Additionally, there’s a URL inspection tool, a report about manual actions, a links report, and a mobile usability report. All of these are useful to some extent, though not every report applies to every site; if you’re not using the AMP system, for example, the AMP status report is meaningless to you.
Manual Actions are Google penalties. Some things that we refer to as penalties, like “Panda penalties,” are not actually penalties; they’re adjustments to the algorithm and the corresponding drops in ranking. Manual actions are deliberate demotions in ranking caused by specific issues that Google points out for you to fix.
If your site has any kind of manual action taken against it, that penalty will show up in this report. You should take steps to fix any manual action before even thinking about other site optimizations. Manual actions are holding you back far more than any other lack of optimization. To use a car analogy, it would be like focusing on exactly when, to the millisecond, you should shift gears to get the most power from your vehicle, while ignoring the trunk full of bricks. Remove the bricks and you’ll go faster, sooner.
There are a lot of different manual actions – enough to be covered in their own post, actually – so I’m only going to touch briefly on each of them. You can find a resource for fixing each issue if it’s one that affects you.
In all cases, you can see one of three results when you click on the action. You’ll see “All Pages”, a URL string that ends in a subfolder, or a specific page URL. This identifies the scope of where Google has seen the issue, and helps you identify where on your site the problem occurs. Use this to track down the problem and fix it.
Once you have fixed the issue, you can file a request for reconsideration. In the Search Console, find the “request review” button and file for reconsideration. Google will re-scrape your site and check for the issues they had previously identified. If those issues are resolved, they will remove the relevant penalty.
Under the Index Coverage Status Report, you’ll see some very useful information with regards to how much of your site Google can see. This will show you the number of pages on your site that Google indexes, as well as any indexing errors that come up.
You should look for a few possible problems. First of all, check to see if there are indexing errors, and more importantly, if there’s a spike in indexing errors.
An indexing error occurs when a page should be indexed – Google knows it exists – but is prevented from being indexed, usually by a noindex directive or a bot-blocking rule. A spike in indexing errors usually points to a change you made to your site that blocks pages, perhaps from editing a template file and breaking it. If a page should be indexed but isn’t, look for what is blocking Google from seeing the page and fix the problem. If a page is indexed but shouldn’t be, block Google from seeing it: pages behind a login wall, admin areas, and other private sections should be kept out of the index with a robots.txt rule or a noindex tag, just in case.
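As a sketch of the crawl-blocking mechanism, a robots.txt file at the site root asks compliant crawlers to skip whole sections (the paths below are illustrative):

```text
# robots.txt — asks compliant crawlers not to fetch these paths
User-agent: *
Disallow: /wp-admin/
Disallow: /members/
```

To keep an individual page out of the index while still letting crawlers fetch it, a `<meta name="robots" content="noindex">` tag in the page’s `<head>` does the job instead.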
If you know your site has 1,000 pages on it, but only 400 are indexed, you may have a subfolder or subdomain blocked. You should be able to identify what is being blocked by looking for what is indexed, and figuring out the shape of the holes.
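One way to find the “shape of the holes” is to group unindexed URLs by their first path segment. Here’s a minimal sketch, assuming you’ve exported your full URL list and the indexed URL list to work with (the URLs below are made up):

```python
from urllib.parse import urlparse
from collections import Counter

def missing_by_section(all_urls, indexed_urls):
    """Count unindexed URLs per top-level path segment."""
    missing = set(all_urls) - set(indexed_urls)
    counts = Counter()
    for url in missing:
        path = urlparse(url).path.strip("/")
        # First path segment identifies the site section, e.g. "products"
        section = path.split("/")[0] if path else "(root)"
        counts[section] += 1
    return counts

# Example: everything under /blog/ is indexed, /products/ is not
all_urls = [
    "https://example.com/blog/a", "https://example.com/blog/b",
    "https://example.com/products/x", "https://example.com/products/y",
]
indexed = ["https://example.com/blog/a", "https://example.com/blog/b"]
print(missing_by_section(all_urls, indexed))  # Counter({'products': 2})
```

If one section dominates the missing count, that subfolder (or its subdomain) is the place to look for a blocking rule.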
Sometimes pages will be overlooked. Google discovers new pages through sitemaps, by following links, and other forms of notification. If Google hasn’t discovered certain pages, it’s a good idea to build a full sitemap and submit it to Google so they can find everything.
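A minimal sitemap follows the sitemaps.org XML format; the URLs and dates here are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-09-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/some-post/</loc>
    <lastmod>2018-08-15</lastmod>
  </url>
</urlset>
```

Once the file is uploaded – typically at /sitemap.xml – submit its URL under the Sitemaps report in the Search Console so Google can find everything.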
Google is putting more and more emphasis on mobile web browsing every year, including the recent mobile-first indexing change they rolled out this year. As such, the mobile usability report is crucial for modern optimization.
This report will show you every indexed page on your site and whether or not it has an error. Note that sometimes, pages affected by issues are not shown because they aren’t bad enough in comparison to worse pages on your site. Essentially, the worse a page looks in the report, the more urgently it needs to be fixed. Once you fix the worst issues, re-run the report and fix anything else that pops up. The errors that can appear include an unset or misconfigured viewport, content wider than the screen, text too small to read, clickable elements too close together, and incompatible plugins like Flash.
Sooner or later your mobile site will need to be more optimized. Might as well start now.
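A frequent mobile usability culprit is a missing viewport declaration, and the standard fix is a single tag in each page’s `<head>`:

```html
<!-- Tells mobile browsers to match the device width instead of a desktop layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```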
The Rich Result status report will give you a listing of any pages on your site with the right kind of structured data to fill out a rich result. For example, searching for a recipe will often show you a box up top with a basic preview of the recipe.
That’s a rich result. Rich results exist for job postings, recipes, and events, among other pages. You can see if there are rich results on your site, and if you think you should have some but they aren’t there, you can troubleshoot problems. This only applies to sites that have the right kind of structured data, so it’s not high on my list.
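For a sense of what that structured data looks like, here’s a minimal JSON-LD snippet using schema.org’s Recipe type; the values are made up for illustration:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org/",
  "@type": "Recipe",
  "name": "Simple Banana Bread",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "prepTime": "PT15M",
  "cookTime": "PT1H",
  "recipeIngredient": ["3 ripe bananas", "2 cups flour", "1 cup sugar"]
}
</script>
```

Markup like this is what makes a page eligible to appear in the report – and, potentially, as a rich result in search.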
The URL Inspection Tool is a tool provided in the Search Console that allows you to inspect a specific URL. You can inspect Google’s indexed version and see if it matches your live version, as well as see any specific errors or penalties relating to that specific page. You can also request that Google fetch a live URL and inspect whether or not it’s able to be indexed. Note that this does not automatically index a page; you can’t use this to get your site indexed faster or anything. There’s a specific “request indexing” feature you can use to put your page in Google’s queue.
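Before reaching for the tool, you can run a rough local pre-check for the two common noindex signals – the X-Robots-Tag response header and the robots meta tag. This is a sketch, not a substitute for the URL Inspection Tool, which remains authoritative:

```python
import re

def indexability_signals(html, headers):
    """Return a list of signals that would keep a page out of the index."""
    signals = []
    # The X-Robots-Tag HTTP response header can carry a noindex directive
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        signals.append("noindex in X-Robots-Tag header")
    # <meta name="robots" content="...noindex..."> in the HTML head
    for m in re.finditer(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.I):
        if "noindex" in m.group(0).lower():
            signals.append("noindex in robots meta tag")
    return signals

html = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
print(indexability_signals(html, {}))  # ['noindex in robots meta tag']
```

An empty result doesn’t guarantee the page is indexable – robots.txt rules, canonical tags, and manual actions are all checked separately.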
When you inspect a URL, you can see whether or not the URL is part of the index, when it was last indexed, and if there are any errors relating to structured data, AMP, or other problems. Check the problem you see against Google’s reference table and explore how to fix any issues that crop up.