This section is all about technical audits, which are when you investigate the site thoroughly to seek out all the issues that are affecting it. In this section we'll cover sitemaps, robots.txt files, HTTPS, broken pages, duplicate content, and mobile-friendliness. So what we're looking at here is an example of an XML sitemap. As you can see, it's not glamorous, but it's really important for showing search engine bots how they can navigate the site and which pages are important. So, sitemaps are a list of your URLs that give hints to search engines on how they can crawl your website. The most common, widely accepted form is XML, which stands for Extensible Markup Language. You don't want to see a lot of variation between how many pages of your site Google has indexed and the number of crawled pages found by SEO tools; that's an indication of bad site health. So to determine whether the client has an XML sitemap, you can check Google Search Console, and if we click on the Sitemaps section again, we can see here that under "Add a new sitemap" is where you can enter the sitemap URL, which will often look like sitemap.xml, and then click the Submit button, and that will submit the sitemap directly to Google.
To diagnose any action that needs to be taken for XML sitemaps, you'll first want to compare the numbers of pages between Screaming Frog, the client's existing XML sitemap, and Google's index. So we have this example of Glossier's XML sitemap. We want to count the number of URLs in this sitemap. We would then want to open up Screaming Frog again, and we can put in Glossier's URL, and while that's crawling, we can look at the last component, which is Google's index. The way to check Google's index is by typing in site: and then the site name, so site:glossier.com. And then you'll see up here at the top the number of pages that Google has indexed for this site, which is about 536 results. This is not a hard and fast number, but it is a pretty good approximation of how many pages Google thinks are on the site. Now, we'll go back to Screaming Frog, and here it's crawled about 500 pages, so that checks out. But then when we look at the XML sitemap, we can see that there are not even close to 500 pages here. So that is an example of something you would want to investigate. You'd want to go back to Screaming Frog and look at all these pages. And when we do, we see that there are actually only 76 important pages that Screaming Frog has prioritized here. However, Google is finding about 536 pages for Glossier in its own index. So we want to investigate deeper to get a better understanding of why Google is seeing 536 results, Screaming Frog is showing that there are 76 important pages on the site, and the sitemap only has about 40 pages.
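The counting step above can be sketched in a short script. This is a minimal sketch, assuming the sitemap follows the standard sitemaps.org format; the sample document and example.com URLs are placeholders, not the client's real sitemap:

```python
# Minimal sketch: count the <loc> entries in an XML sitemap so the total
# can be compared against Screaming Frog's crawl and Google's site: count.
import xml.etree.ElementTree as ET

# Standard sitemap namespace from the sitemaps.org protocol.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def count_sitemap_urls(sitemap_xml: str) -> int:
    """Return the number of <loc> entries in a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return len(root.findall("sm:url/sm:loc", NS))

# Placeholder sitemap standing in for the client's real sitemap.xml:
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/shop</loc></url>
  <url><loc>https://www.example.com/about</loc></url>
</urlset>"""

print(count_sitemap_urls(sample))  # → 3
```

In practice you'd fetch the live sitemap and compare the count against the crawled and indexed totals; a big gap in either direction is the signal to dig deeper.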
If there truly are 76 important pages on the site, then you would want to update the sitemap to reflect all of the important pages. So in this case, the sitemap would need to be expanded to include more of the pages that Google is seeing. But if instead there's a sitemap that has an excessive number of pages, then it should be cleaned up, and the pages that are no longer on the site should be removed. And the final step in both cases is to submit the new or updated sitemap to Google within Google Search Console. So we'll navigate back there, we will go to the Sitemaps section in the new Search Console, and then again, right here is where you can enter that sitemap URL and hit Submit. This lets Google know that site changes have been made and that the search engine should re-crawl the site to update rankings accordingly. And for further details, you can check out Google's own sitemap guidelines at this URL.
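The expand-or-clean-up step can be sketched as regenerating the sitemap file from a curated page list. A minimal sketch, assuming you've already confirmed the important pages in Screaming Frog; the example.com URLs are placeholders:

```python
# Minimal sketch: rebuild a sitemap from a verified list of important pages.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return sitemap XML listing the given page URLs."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for page in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = page
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + ET.tostring(
        urlset, encoding="unicode"
    )

# Placeholder pages standing in for the full list of important pages.
pages = ["https://www.example.com/", "https://www.example.com/shop"]
xml_out = build_sitemap(pages)
print(xml_out)
```

You'd then upload the regenerated file to the site and resubmit its URL in Search Console so Google knows to re-crawl.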