Google Search Console, formerly known as Google Webmaster Tools, is a free service offered by Google that helps you monitor and maintain the presence of your site in Google search results. This Google Search Console beginners guide will help you learn how to submit your website to Google Search Console and how to maintain your site's presence in Google search results.
Google Search Console:
Google Search Console is a free service offered by Google that helps you monitor and maintain your site's presence in Google search results. Search Console lets you easily monitor your website traffic, optimize your ranking and, in some cases, resolve server errors, site load issues and security issues.
Google Search Console Features:
It has tools that let webmasters:
- Submit a sitemap to Google and check it for errors
- Identify and resolve spam or malware issues
- Submit new content for crawling and remove content from Google if you don't want it to show in search results
- Get a list of internal and external pages that link to the website
- Check mobile site performance
- Write and test a robots.txt file that controls which website pages crawlers can discover
After launching your website, you need to submit it to search engines so that they can index it. Google is the most used search engine in the world. Submitting your website to Google helps it get indexed quickly and easily, and also helps to increase website traffic.
How to Submit Website to Google Search Console:
Step 1: Go to https://www.google.com/webmasters/tools/ and log in with your Gmail account.
A Gmail account is mandatory in order to connect your website to Google Search Console. If you don't have one, create one for free.
Step 2: After logging in to your Google account, type your website URL and click on the “Add A Property” button to add your website.
Step 3: Now you need to verify your website. Google offers several methods to verify your website. Simply choose an option and verify your website.
Verification Methods:
HTML File Upload:
You can verify your website ownership by uploading an HTML file to your website. You can use this method if you're able to upload files to your website's root directory.
To use this method, choose the HTML file upload method on the verification details page. You'll be asked to download an HTML file. Download it and upload the HTML file to your website without making any changes to the file. After uploading the HTML file, come back to Google Search Console and click on the verify button to verify your ownership.
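For reference, the downloaded verification file usually contains a single line like the one below (the token here is only a placeholder; yours will differ). Upload it unchanged to your root directory so it is reachable at, for example, https://www.example.com/google1234567890abcdef.html.

```text
google-site-verification: google1234567890abcdef.html
```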
If everything uploaded correctly, the website will be verified and you'll receive a confirmation; otherwise you may get an error.
Don't delete the HTML file from your website after verification; otherwise your website will become unverified in Google Search Console.
HTML Tag:
You can verify your website ownership by adding an HTML tag to your website. Click on alternate methods, select the HTML tag option and copy the code.
Now we need to add this meta tag in the header of your website. Log in to your WordPress website and add this code to the header.php file. Go to Appearance > Editor > header.php and paste the meta tag just after the opening head tag.
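As a rough sketch, and assuming a standard WordPress theme, the result in header.php would look something like this (the content value is a placeholder for the token Search Console gives you):

```php
<head>
    <meta charset="<?php bloginfo( 'charset' ); ?>">
    <!-- Google Search Console verification meta tag (replace the placeholder token with your own) -->
    <meta name="google-site-verification" content="your-verification-token-here" />
    <?php wp_head(); ?>
</head>
```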
Alternatively, you can use the Yoast SEO plugin, which makes it easy to add these meta tags. Simply log in to the WordPress dashboard, go to Yoast SEO > Dashboard > Webmaster Tools, add the code in the Google Search Console box and click on save changes.
Now go back to Google Search Console and click on the verify button to verify your website.
Don't remove the HTML tag from your website after verification; otherwise your website will become unverified in Google Search Console.
Domain Name Provider:
You can use this method to verify your website if you're able to sign in to your domain name provider. A domain name provider is the company you purchased your domain name from.
To use this method, choose the domain name provider method on the verification details page. Depending on your domain name provider, you'll be offered one of the following methods.
Directly sign in to your domain name provider from Search Console: some domain providers like GoDaddy enable you to sign in directly from Search Console. Select your domain provider from the list and verify your website.
Add a DNS TXT or CNAME record: If your domain name provider is not listed, you can verify your website ownership by adding a DNS record. You'll find instructions there to add a DNS TXT record. If a DNS TXT record doesn't work for your provider, you'll have the option to add a CNAME record instead.
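The exact screens differ from provider to provider, but the TXT record you create generally looks like this sketch (the verification token is a placeholder supplied by Search Console):

```text
Type:       TXT
Host/Name:  @   (the root of example.com)
Value:      google-site-verification=abc123placeholdertoken
TTL:        default (e.g. 3600 seconds)
```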
Google Analytics:
If you’re using Google Analytics for your site, you can easily verify your site using the Google Analytics tracking code that associated with your site.
To use this method, choose Google Analytics method on the verification details page and follow the instructions.
Google Tag Manager:
If you have a Google Tag Manager account, you can easily verify your site using Google Tag Manager.
To use this method, choose the Google Tag Manager method on the verification details page and follow the instructions.
Now choose any option and verify your website. After verification, you can monitor and manage your website with Google Search Console.
Monitor website with Google Search Console:
A. Crawl:
a. Add Sitemap:
A sitemap is an XML file that lists the URLs of your website that are accessible to crawlers and users. A sitemap can provide search engines with valuable data about each page you list in it, such as when the page was last updated, how often it changes and its important links. A sitemap does not boost your search engine rankings, but it allows search engines to crawl your website better.
It simply lets search engines know about your blog pages so that they can crawl and index your website.
There are several ways to create XML sitemaps in WordPress, but using the Yoast SEO plugin we can create sitemaps easily with one click. The Yoast SEO plugin has advanced XML sitemap functionality that helps to create sitemaps that include the images and videos in posts and pages. This step-by-step guide helps you to create the sitemap for a WordPress website using Yoast SEO.
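For reference, a minimal XML sitemap looks like the sketch below (the URL and dates are placeholders). Yoast SEO generates and updates a similar file for you automatically, typically at /sitemap_index.xml.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/first-post/</loc>
    <lastmod>2018-05-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```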
After creating your website sitemap, you need to submit it to Google Search Console.
How to Submit Website sitemap to Google Search Console:
We need to submit the website sitemap to Google Search Console by logging in to the webmaster tools dashboard.
Step 1: After logging in to your Google Search Console dashboard, click on the website to which you want to add a sitemap.
Step 2: In the dashboard's left sidebar, click on Crawl > Sitemaps and then click on Add/Test Sitemap to add your website sitemap.
On the same page, you can check which sitemaps were previously submitted and indexed by Google.
Step 3: Submit your sitemap here.
I suggest submitting an image sitemap for better image SEO, and you can also submit video sitemaps.
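For reference, an image sitemap entry simply extends a normal sitemap entry with an image namespace, roughly like this (all URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://www.example.com/first-post/</loc>
    <image:image>
      <image:loc>https://www.example.com/wp-content/uploads/photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```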
If you are submitting a sitemap for a new blog, it takes some time for the crawl and index status to appear. You can always check how many URLs have been submitted and indexed by Google on this page.
b. Fetch as Google:
The fastest way to get your website posts indexed by Google is to submit them manually using the Fetch as Google option. This option also enables you to test how Google crawls and renders your website URLs. You can check whether Googlebot can access your website pages and whether any resources are blocked from Googlebot.
Log in to your Google Search Console dashboard and go to Crawl > Fetch as Google.
Step 1: Add your website post URL, without the domain prefix, in the box beside your website name.
For example, if your website post link is www.example.com/first-post.html, you just enter first-post.html in the text box.
You can fetch your homepage by leaving the box blank.
Step 2: Now choose the type of Googlebot you wish to fetch as: either Desktop or Smartphone.
Step 3: Select Fetch or Fetch and Render.
Fetch: Fetches the specified URL only and displays the HTTP response, but does not run any associated resources like images and scripts.
Fetch and Render: Fetches the specified URL, displays the HTTP response and also renders the page for the specified platform (desktop or smartphone). This runs all associated resources like images and scripts.
The request will now be added to the fetch history with a status. Click on the Request Indexing button beside the fetch status.
The final step is to choose between two options: “Crawl Only This URL” and “Crawl This URL and Its Direct Links”. Choose the first option if you want to index a single page. Choose the second one to index the page along with its direct links, then click on Go.
Now Google will send bots to crawl your website. When the request is complete, the row will show the fetch request status.
You can do only 10 fetches a day, and the fetch history table shows up to 100 rows. You can see the complete details of a fetch by clicking on its status in the fetch history table. You may see different kinds of statuses.
Fetch Request Status:
Complete: Google contacted your website and successfully crawled your page. Click on the table row to get all the details of the fetch results.
Partial: Google contacted your website and got a response, but rendered your page only partially because some resources were blocked by the robots.txt file. Click on the table row to get all the details of the fetch results. If any resources were blocked, unblock them in the robots.txt file.
Redirected: Google contacted your website, but your server responded with a redirect pointing to another URL. The Fetch as Google tool will not follow redirects.
Fetch errors for other resource types will also be displayed. Here is a screenshot of the other resource fetch error types, according to Google.
By clicking on the status, you can see how Googlebot and a visitor see the rendered page. You can also check the blocked resources, along with their severity level, at the bottom of the page.
c. Crawl Errors:
Crawl Errors is one of the most popular features in GSC (Google Search Console). It provides details about the URLs that Google was not able to crawl or that returned errors such as 404.
This report has two sections.
1. Site Errors: Site errors are errors that affect your entire site. These include DNS resolution failures, connectivity issues with your server and problems fetching the robots.txt file.
To access your site errors, click on Crawl > Crawl Errors. This report shows the main issues from the past 90 days that prevented Google from crawling your website.
For a well-operating site, the Google crawl error report should show no errors. Google will notify you if it detects any errors on your site.
The following errors are reported in the site errors section.
DNS:
A DNS error means that Google is unable to communicate with the DNS server, either because your server is down or because there's an issue with the DNS routing to your domain.
While most DNS errors don't affect Googlebot's ability to access your site, you should still act immediately to resolve them. A DNS error may be a symptom of high latency, which can negatively impact your users and provide a poor user experience.
How to Fix DNS Errors?
Make sure Google can crawl your site: Google recommends using the Fetch as Google tool to check how Google crawls your homepage. If Google fetches your website content without any problem, you can assume that Google is able to access your site.
Check with your DNS provider: If Google is unable to fetch your site properly in the step above, check with your DNS provider about the issue. There could be a problem on their side.
Configure your server to respond with an error code: ensure your server responds with an error code such as 404 or 500 instead of a failed connection.
Server Errors:
A server error means your server is taking too long to respond and the request timed out. Googlebot can wait only a certain amount of time to crawl your website; if it takes too long, Googlebot cannot access your website and is forced to abandon the request.
Server errors are different from DNS errors. With DNS issues, Google cannot even look up your URL; with server errors, Google can look up your URL but cannot load the page.
You should resolve server errors immediately; otherwise they will harm your website's reputation and traffic.
Before fixing server errors, make sure to check which type of error you are getting from the server, as there are many types:
- Timeout
- Truncated headers
- Connection reset
- Truncated response
- Connection refused
- Connect failed
- Connect timeout
- No response
How to Fix Server Errors?
Make sure your web hosting server is not overloaded: Server errors can happen if your website is overloaded with too much traffic. If you get a connection timeout or a no-response error, check with your web hosting provider and consider increasing your website's ability to handle traffic.
Make sure Google can crawl your site: Use the Fetch as Google tool to check how Google crawls your homepage. If Google fetches your website content without any problem, you can assume that Google is able to access your site.
Robots Failure:
A robots failure means Google cannot retrieve your robots.txt file, which is located at your root domain. A robots.txt file is only necessary if you don't want Google to crawl some of your pages. A robots failure error occurs when your robots.txt file exists but Google is unable to reach it.
If Google cannot access your robots.txt file, it will not crawl your website or index newly published pages until it can access your robots.txt file.
How to Fix Robots Failure?
Ensure your robots.txt file is properly configured: Make sure your robots.txt file is configured with only the pages you don't want Google to crawl. Double-check the pages and the file. If the file seems in order and you are still getting the error, check whether the file is returning a 404 or 200 HTTP status code.
Make sure Google can access your robots.txt file: A robots failure may happen if your robots.txt file exists but Google is unable to reach it. Check with your web hosting provider and make sure they are not blocking Googlebot.
You don't always need a robots.txt file: A robots.txt file is only necessary if you don't want Google to crawl some of your pages. If you want Google to index everything on your website, you don't need a robots.txt file, not even an empty one.
2. URL Errors:
This section shows the top 1,000 URL errors for each category: desktop and smartphone. Unlike site errors, URL errors do not affect the entire website but only specific web pages.
Types of URL Errors:
(i) Common URL errors:
Server error: Your server took too long to respond and the request timed out. As a result, Googlebot couldn't access your website and was forced to abandon the request.
Soft 404: A soft 404 is an error where a page returns a successful 200 response code when it should return a 404 not found error. In some cases, instead of displaying a not found error, the page redirects to an empty page or a page with unrelated content.
404: Googlebot tried to crawl a URL that doesn't exist on your website.
Access denied: Googlebot was prevented from crawling the page because it requires authentication to view the content.
Not followed: Google could not follow your URL. Features like JavaScript, Flash and cookies can make it difficult to crawl your website.
(ii) Mobile only (Smartphone) URL errors:
Faulty redirects: Some websites configure separate URLs for smartphone and desktop users. A faulty redirect error occurs when a desktop page incorrectly redirects smartphone users to a page that is not relevant to their query (see the sketch after this list).
URLs blocked for smartphones: The URL is blocked from being crawled by the smartphone Googlebot in your robots.txt file.
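If your site serves separate mobile URLs, the usual way to avoid faulty redirects is to link the desktop and mobile versions of each page explicitly so that Google can send each device to the right URL. A minimal sketch, assuming an m. subdomain (the URLs are placeholders):

```html
<!-- On the desktop page: https://www.example.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page" />

<!-- On the corresponding mobile page: https://m.example.com/page -->
<link rel="canonical" href="https://www.example.com/page" />
```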
View URL Error details:
To access your URL errors, click on Crawl > Crawl Errors. This report shows the main issues from the past 90 days that prevented Google from crawling your website.
Click on Download to get the list of the top 1,000 URL errors for that category.
How to Fix URL Errors?
Google ranks the most important issues at the top. Not every error requires your attention, and many of the errors may already have been resolved. Still, it's very important to monitor this section for errors that can have a negative impact on your users and traffic.
Make sure your web page is published: Ensure your web page is published and not in draft status to avoid 404 errors.
Fix not found errors with 301 redirects: If you want a URL to redirect to another page, make sure to add a 301 redirect to the most appropriate page (see the example after this list).
Remove authentication from pages: Remove login requirements from web pages that Google should access to avoid access denied errors.
Check redirect chains and keep them short: If there are too many “hops”, Google will not follow the redirect sequence. Try to make redirects simple and short.
Update sitemaps: Replace your old sitemap with a new one and delete the old sitemaps; don't redirect them to the new sitemaps.
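As an example of the 301 redirect fix mentioned above, on an Apache server you could add a rule like this to your .htaccess file (the paths are placeholders; on WordPress, a redirect plugin achieves the same thing):

```apache
# .htaccess: permanently redirect a removed page to its closest replacement
Redirect 301 /old-post.html https://www.example.com/new-post/
```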
Mark as fixed:
Once you have resolved all the errors reported by Google, mark them as fixed.
Select a particular URL, or select all, and click on the “mark as fixed” button.
The URLs will now be removed from the list. URL errors will reappear if Google crawls the pages and finds the errors again.
Make sure to actually fix the errors before marking them as fixed in Google Search Console; the same error will reappear once Google crawls the page and finds it again.
d. Crawl Stats:
The Crawl Stats report provides information about Googlebot's activity on your website.
This is another great GSC tool for understanding how your site responds to the Google crawler.
To access your site's crawl stats, click on Crawl > Crawl Stats. This report shows the crawl rate of your website for the past 90 days and should look something like the picture below.
Crawl rate is how often Googlebot crawls your site.
- A high crawl rate means Googlebot can crawl your website more easily and quickly.
- Crawl rate depends heavily on your website speed.
Here, Crawl Stats shows Googlebot's activity on your website in 3 sections:
- Pages crawled per day
- Kilobytes downloaded per day
- Time spent downloading a page (in milliseconds)
They are all very important to monitor. Let’s check each section in detail.
Pages Crawled Per Day:
This shows how many pages Googlebot crawled from your website each day. You can see the report for the past 90 days.
Here you can see high, average and low crawl metrics on the left side of the graph.
You can check the crawled pages data for a specific date by hovering the mouse over the graph.
Kilobytes Downloaded Per Day:
Googlebot downloads your website pages whenever it visits your site for crawling. This section shows how much data Google downloaded from your pages, in kilobytes. You can see the report for the past 90 days.
Time spent downloading a page (in milliseconds):
Here you can see how much time Google spent downloading your pages over the past 90 days.
Compare the kilobytes downloaded per day graph with the time spent downloading a page graph to get better insights.
If both graphs rise together, it means Googlebot is taking too much time to download your pages.
If the kilobytes downloaded per day graph is high and the time spent downloading a page graph is low, Google is spending less time per page on your site, which leads to faster crawling and indexing.
Generally, all 3 graphs look relatively similar, but you should focus on the Pages Crawled per Day graph for crawl rate.
If you see a sudden drop or spike in the Pages Crawled per Day graph, there might be a problem with your site. Let's see what the reasons could be and what to do in each scenario.
If you see a sudden crawl rate drop:
You added a new, overly broad robots.txt rule: Make sure your robots.txt file blocks only the pages you don't want Google to crawl.
Don't block Google from important resources that it needs to understand your content, and revise your robots.txt file if it has grown too large. The comparison below illustrates the difference.
Broken HTML code or unsupported content: Google may not be able to crawl your pages if your site has broken HTML code that was added recently. Use the Fetch as Google tool to check how Google crawls your homepage.
Server error rates increased: If your server errors increase, Googlebot will slow down its requests to avoid overloading your server. Check the crawl errors report and fix the errors.
Site not updated frequently or has low-quality content: If your site is not updated frequently with fresh content, Google might not crawl it as often.
Google loves fresh, high-quality content. Keep adding new, high-quality content to get your website crawled by Google more frequently.
If you see a sudden crawl rate spike:
You added a lot of new content: If you've added a lot of new information to your site, it may simply be crawled a bit more.
Monitor your website crawl rate by looking at the 90-day averages.
You can see high, average and low crawl metrics on the left side of every graph, and you can check a specific date's crawled pages data by hovering the mouse over the graph.
One way to improve your crawl rate is to publish new, fresh content on your website regularly.
More content you publish = Higher crawl rate
So always use the Google Search Console crawl stats to monitor and optimize your website's crawl rate.
e. Robots.txt tester:
Here you can see your website's robots.txt file and check which URLs are blocked from crawling.
A robots.txt file is only necessary if you don't want Google to crawl some of your pages. If you want Google to index everything on your website, you don't need a robots.txt file, not even an empty one.
If Google cannot access your robots.txt file due to errors in the file, it will not crawl your website or index newly published pages until it can access the file.
Google Search Console allows you to test your robots.txt file using the robots.txt tester. You can make changes and test the file; the tester reports any errors and warnings it finds in your robots.txt file.
Go to the GSC dashboard and click on Crawl > robots.txt Tester. Add your new robots.txt rule in the editor shown in the image below and click on submit to check the errors and warnings.
Here you can also check whether any particular URL is blocked from or allowed for Google crawling. Just enter the web page URL in the box beside your website name and click on test, as shown in the image below.
Google will then report the URL's crawl status: allowed or blocked.
f. URL Parameters:
URL parameters are values added to your web page URLs, for example to track clicks.
Use this feature only if you understand how URL parameters work; otherwise take a developer's help. An incorrect configuration may ruin your website's SEO.
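For example, the URLs below all point to essentially the same content and differ only in tracking or sorting parameters; this is the kind of duplication the URL Parameters feature lets you describe to Google (the parameter names are only illustrations):

```text
https://www.example.com/shoes
https://www.example.com/shoes?sessionid=12345
https://www.example.com/shoes?utm_source=newsletter
https://www.example.com/shoes?sort=price
```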
To set new parameters, click on Crawl > URL Parameters and then click on Configure URL Parameters.
B. Google Index: This section provides information on Google index status and blocked resources.
a. Index Status:
Here we can check how many URLs are indexed by Google, how many URLs are blocked by the robots.txt file and how many URLs have been removed.
To access the Google index status report, click on Google Index > Index Status. Here you can see 2 options: Basic and Advanced.
In the basic report, only the total number of indexed pages is displayed, whereas in the advanced report you'll also get information about the pages blocked by robots.txt and the removed URLs.
b. Blocked resources:
Google needs complete access to your website, including the important resources that help it understand your content, like CSS files, images, etc.
If Google cannot access important resources on your pages, your pages might be indexed incorrectly, and in some cases Google will not crawl your website or index newly published pages.
To get the blocked resources information, click on Google Index > Blocked resources.
If any resources are blocked, unblock them in the robots.txt file.
c. Remove URL’s:
This tool helps you to remove any of your web pages from search engine results temporarily.
Click on the temporarily hide button, add the URL that you want to hide from search engine results and click on continue.
Now you need to specify the request type: whether you want to temporarily hide the page from search results and remove it from the cache, remove the page from the cache only, or temporarily hide an entire directory.
C. Search Traffic:
a. Search analytics:
The Search Analytics report shows how often your website appears in Google search results. The report displays clicks, impressions, CTR and position per keyword, broken down by country, device, search type and much more.
To access your site's search analytics, click on Search Traffic > Search Analytics. Now filter and group the data by categories such as queries, pages, countries, devices and search type to see your Google search performance for the last 90 days.
b. Links to your site:
This report shows the websites that link to your website and the pages on your website with the most links. You can also check the most common anchor text found by Google here.
To access Links to Your Site, click on Search Traffic > Links to Your Site. Now you can see the sites that link to you, the pages on your website with the most links and the most common anchor text found by Google over the last 90 days.
c. Internal links:
This report displays the number of internal pages linking to each specific page.
To check the internal links of your site, click on Search Traffic > Internal Links. Now you can see the target pages along with their number of links.
You can also check the total internal links for any specific page on your website using the find internal links option.
Add your web page in the box and click on the find option to get the total links.
The number of internal links to a specific page signals to Google how important that page is. As a result, well-linked pages tend to rank higher for their keywords.
If an important page on your website doesn't appear in that list, you should focus on building more internal links to that page.
d. Manual actions:
This report lists any spam issues or manual actions found on your website that need your attention. It also displays the pages that are not compliant with Google's quality guidelines.
e. International targeting:
This section is applicable only if your website content is translated or targeted at users in a particular country or region.
Once you have configured your multi-language site, you can use this tool to fix issues and keep your website healthy.
f. Mobile usability:
This report displays the mobile usability issues affecting your website.
Fix all the issues and make sure your website is mobile-friendly to gain more visitors and reach all types of people.
Google Search Console is a great tool to improve your website's SEO. Start using it and improve your website's ranking in Google search results.
I hope this article helps you understand Google Search Console.
If you have any questions, feel free to contact me, and share this post with your friends on Facebook, Twitter and Google+.