
WordPress Robots.txt Optimization

The robots.txt file tells search engine crawlers which parts of your site they may crawl and index and which parts they should stay out of.

Search engines recommend providing a robots.txt file so their crawlers don't waste time and resources crawling and indexing files you don't want indexed.
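Crawlers look for the file at the root of your domain, e.g. http://www.example.com/robots.txt, so the quickest way to see exactly what they will read is to request that URL yourself. A minimal Python sketch (www.example.com is just a placeholder for your own domain):

from urllib.request import urlopen

# robots.txt is always served from the site root; swap in your own domain.
with urlopen("http://www.example.com/robots.txt") as response:
    print(response.read().decode("utf-8"))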

Here is what WordPress recommends your robots.txt file look like:

Sitemap: http://www.example.com/sitemap.xml

# Google Image
User-agent: Googlebot-Image
Disallow:
Allow: /*

# Google AdSense
User-agent: Mediapartners-Google
Disallow:

# digg mirror
User-agent: duggmirror
Disallow: /

# global
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/cache/
Disallow: /wp-content/themes/
Disallow: /trackback/
Disallow: /feed/
Disallow: /comments/
Disallow: /category/*/*
Disallow: */trackback/
Disallow: */feed/
Disallow: */comments/
Disallow: /*?
Allow: /wp-content/uploads/
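
Before uploading the file, it is worth sanity-checking that the rules block what you expect and nothing more. Here is a rough sketch using Python's standard-library urllib.robotparser against a subset of the rules above (www.example.com is a placeholder; note that this parser only does plain prefix matching, so wildcard patterns such as /*? are better checked with a dedicated robots.txt testing tool):

from urllib.robotparser import RobotFileParser

# A subset of the global rules above, fed straight to the parser.
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Allow: /wp-content/uploads/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Admin pages should be blocked for a generic crawler ("*")...
print(parser.can_fetch("*", "http://www.example.com/wp-admin/install.php"))         # False
# ...while uploaded media stays crawlable.
print(parser.can_fetch("*", "http://www.example.com/wp-content/uploads/logo.png"))  # True

If a URL you expect search engines to reach comes back False, loosen the matching Disallow rule before publishing the file.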