Correct robots.txt for WordPress

Learn how to create the right robots.txt file for WordPress, with an example, so you can get the most out of your SEO.

Basic WordPress Robots.txt File

A robots.txt file is a plain text file located in the root of your website that tells search engine bots which parts of the site they may and may not crawl. If you run a WordPress website, it is worth creating a robots.txt file so that you can control how search engine bots access your site.

Here is an example of a basic robots.txt file for a WordPress website:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.example.com/sitemap_index.xml

The first line, User-agent: *, specifies which bots the rules apply to; the asterisk (*) means they apply to all crawlers. The second line tells those bots not to crawl the /wp-admin/ directory, which is the WordPress administration area. The third line makes an exception for the admin-ajax.php file, which themes and plugins call from the front end of the site, so blocking it could break functionality. Finally, the Sitemap line points to the XML sitemap, which helps search engine bots discover all of the pages on the website.
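If you want to confirm how a crawler would interpret these rules, you can test them yourself. Below is a minimal sketch using Python's standard urllib.robotparser module; the https://www.example.com URLs are placeholders for your own domain.

import urllib.robotparser

# Point the parser at the site's robots.txt (placeholder domain)
parser = urllib.robotparser.RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()  # fetches and parses the file

# Check a few paths the way a generic crawler ("*") would
for path in ("/wp-admin/", "/wp-admin/admin-ajax.php", "/sample-post/"):
    allowed = parser.can_fetch("*", "https://www.example.com" + path)
    print(path, "->", "allowed" if allowed else "disallowed")

Keep in mind that crawlers differ in how they resolve conflicting Allow and Disallow rules (Google favors the most specific rule, while Python's parser applies rules in file order), so treat the output as an approximation of how a standards-compliant bot would behave.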

It is important to note that the robots.txt file is not a security measure. It simply asks search engine bots not to crawl certain pages; it does not protect any sensitive information, and nothing prevents a client from requesting a disallowed URL directly. If you need to protect sensitive information on your website, use proper security measures such as password-protecting the files or directories.
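To illustrate why robots.txt is not a security measure: any HTTP client can still request a disallowed path directly, and only well-behaved crawlers honor the rules. The sketch below uses the same placeholder domain as above and a hypothetical User-Agent string.

import urllib.request

# robots.txt disallows /wp-admin/, but nothing stops a direct request;
# the server will still respond (typically with a login page or a redirect).
request = urllib.request.Request(
    "https://www.example.com/wp-admin/",
    headers={"User-Agent": "curiosity-example/1.0"},  # hypothetical UA string
)
with urllib.request.urlopen(request) as response:
    print(response.status, response.url)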
