Web page optimization
The audit tool also analyzes the robots.txt file of your website. The robots.txt file tells search engine robots which pages of your website they may crawl and index.
A robots.txt file with a wrong configuration can keep search engine robots away from your site, and your web pages won't be indexed.
As the robots.txt file is hosted on your server, you cannot edit it directly in the audit tool. If necessary, edit the robots.txt file with a plain text editor. It is important that the editor does not add formatting information to the file. For that reason, do not use MS Word to edit the robots.txt file.
If you don't want to exclude any search engines from your website, just leave the robots.txt file blank. If you want to exclude search engines, you can find examples here.
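For illustration, here is a minimal robots.txt sketch (the rules shown are generic examples, not recommendations for your site). An empty Disallow line allows everything, while Disallow: / blocks the whole site:

```
# Allow all search engine robots to crawl the whole site:
User-agent: *
Disallow:

# By contrast, this configuration would block all robots from the whole site:
# User-agent: *
# Disallow: /
```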
An XML Sitemap helps search engines to index all pages of your website. An XML Sitemap (with a capital S) is an XML file containing a detailed map of all URLs present on your website.
This helps search engine robots to index all of your website's pages. The Sitemap can also contain metadata that tells search engine crawlers when a page has changed or when you have added new content. The website audit tool in SEOprofiler automatically creates an XML Sitemap of your site.
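As an illustration, a minimal XML Sitemap with one URL looks like this (the URL and date are placeholders, not values taken from your site; the lastmod element is the metadata mentioned above):

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>
```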
Click the 'Download sitemap file' button to download the sitemap file to your local computer. The file is compressed with gzip (.gz), a format that is accepted by Google, Bing and other search engines.
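If you want to inspect the downloaded file locally before submitting it, a short Python sketch like the following can decompress the gzip file and list the URLs it contains (the filename sitemap.xml.gz is an assumption; use whatever name the download has):

```python
import gzip
import xml.etree.ElementTree as ET

def read_sitemap_urls(path):
    """Decompress a gzipped XML Sitemap and return the listed URLs."""
    with gzip.open(path, "rt", encoding="utf-8") as f:
        tree = ET.parse(f)
    # Sitemap entries live in the sitemaps.org XML namespace.
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    return [loc.text for loc in tree.getroot().findall("sm:url/sm:loc", ns)]
```

For example, read_sitemap_urls("sitemap.xml.gz") returns the list of page URLs in the Sitemap.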
After downloading the XML Sitemap, you can submit it to Google and Bing. Use the links on the 'Sitemap' page of the website audit tool to submit your XML Sitemap.
You can also include the XML Sitemap in your robots.txt file by adding a Sitemap directive:

Sitemap: sitemap_url

Of course, replace sitemap_url with the actual location of the XML Sitemap on your website.