Checking Google Search Console
Google Search Console, formerly known as Google Webmaster Tools, is a platform that helps webmasters control how Google indexes their site: which URLs should Google index and which should it not index, which URLs should Google crawl and which should it not crawl. You can manage all of this and more through Google Search Console, and it all starts with verifying your domain. So let's get started with that.

Go to Google and search for Google Search Console. If you've never used Search Console before, you'll see two options for verifying your domain: verify the entire domain at the root level, or verify a URL prefix. Verifying the entire domain is a bit more involved (I posted a video on how to do this exactly), but it verifies all the subdomains on your site, so it's really worth it and saves you a lot of time. However, if you come across a subdomain that you don't want indexed, I highly recommend that you verify that subdomain separately with the URL prefix option. This will allow you to better control how that subdomain appears in Google without waiting for your development team to push code to that subdomain.

Once the site is verified, the next step is to note which keywords are currently ranking. This will help you track the impact of your recommendations. Lucky for us, Google Search Console does this automatically once someone has verified the site. If that's the case, you just log in and you'll see 16 months' worth of performance data. You can filter by URL, see average rankings, and even see individual keyword rankings.

The next thing is to make sure that you aren't accidentally creating what's known as technical duplicate URLs. Sometimes one URL will accidentally copy another URL on your site, and that usually happens through something called a parameter. A parameter is a piece of text that's appended to the end of a URL to track usage. Say you're running a Facebook advertising campaign pointing to a blog post you wrote. You'd append something like a question mark (which starts the parameter string) followed by utm_campaign=facebook_ad. So if anyone arrives at the full URL with the Facebook ad parameter in it, we know where that user came from. To avoid any confusion, Google Search Console has a feature called URL parameter handling. Just click on Legacy tools and reports, then click on URL Parameters. You'll see all the URL parameters Google has encountered in its crawl, and in this view you can click on each parameter, see examples, and decide whether it's a unique URL that should be indexed (click Yes) or a technical duplicate (click No). Google will also give you a timestamp for when these changes occurred.

The next thing you want to make sure of is that you submit your sitemap. A sitemap is a file that lists all the URLs that you want Google to index.
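To make the technical-duplicate idea concrete, here is a minimal Python sketch that strips tracking parameters from a URL so that parameterized variants collapse back to one canonical URL. The example URL and the list of tracking parameters are assumptions for illustration; adjust them to your own campaigns.

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Tracking parameters that create technical duplicates rather than unique pages.
# This set is an assumption for illustration; tailor it to your own campaigns.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "utm_term", "utm_content"}

def canonical_url(url: str) -> str:
    """Strip tracking parameters so duplicate URLs collapse to one canonical URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonical_url("https://example.com/blog/post?utm_campaign=facebook_ad&page=2"))
# https://example.com/blog/post?page=2
```

Note that `page=2` survives because pagination parameters usually do change the content, which is exactly the Yes/No judgment the URL Parameters tool asks you to make.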
It also contains information like how frequently Google should crawl these pages and what the crawl priority is for each of them. Going back to the new view of Google Search Console, you can simply add your sitemap here. Each sitemap can hold up to 50,000 URLs, so if you have more than 50,000 URLs, I suggest you nest your sitemaps within each other using something called a sitemap index file. One note: make sure you keep your sitemaps clean. If Google sees a lot of errors in your sitemap, for example if you're including pages that have noindex tags or are broken 404 pages, they'll stop following your recommendations entirely and just crawl whatever they want. Next, check for any error patterns you might find. Sometimes your site will generate 404 URLs unintentionally, so you may see a large group of similar-looking error pages, and solving for one of those pages will solve a whole bunch of them. Also note that you can export these errors to your SEO research spreadsheet, but hold off on doing that for right now, because we're actually going to do that when we get to the crawler tools. Finally, let's make sure that our robots.txt file isn't blocking anything important. To do that, go to this link, input your URL, and see if your robots.txt file is blocking it. And that's my simple framework for checking on Google Search Console. In the next section, we'll dive deep into how to enhance your crawl-related checks with crawler tools. See you soon.
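If you do end up with more than 50,000 URLs, a sitemap index is just a small XML file pointing at your child sitemaps. Here is a minimal sketch that builds one with Python's standard library; the child sitemap file names are hypothetical.

```python
import xml.etree.ElementTree as ET

# Namespace required by the sitemaps.org protocol.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap_index(sitemap_urls):
    """Build a sitemap index: one <sitemap><loc> entry per child sitemap,
    where each child can hold up to 50,000 URLs."""
    root = ET.Element("sitemapindex", xmlns=SITEMAP_NS)
    for url in sitemap_urls:
        sm = ET.SubElement(root, "sitemap")
        ET.SubElement(sm, "loc").text = url
    return ET.tostring(root, encoding="unicode")

# Hypothetical child sitemaps for illustration.
xml_out = build_sitemap_index([
    "https://example.com/sitemap-posts-1.xml",
    "https://example.com/sitemap-posts-2.xml",
])
print(xml_out)
```

You'd save the output as something like sitemap_index.xml at your site root and submit that single file in Search Console.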
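The robots.txt check can also be scripted. Here is a minimal sketch using Python's built-in robots.txt parser; the rules and URLs are made up for illustration, and in practice you'd point the parser at your live file with set_url() and read() instead of parsing a string.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for illustration.
rules = """
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# True: the blog post is crawlable; False: /private/ is blocked for all agents.
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))     # True
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))  # False
```

Running every URL in your sitemap through a check like this is a quick way to catch pages you want indexed that robots.txt is accidentally blocking.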