MediaWiki and SEO #2 – robots.txt

If you are using MediaWiki as your CMS, you’ll need to ensure robots.txt is configured correctly on your site.

Search engines don’t like seeing duplicate pages, and if you don’t configure robots.txt, MediaWiki will give them exactly that: Special:Random redirects to a different article on every request, and Special:Search generates an effectively unlimited number of search-result pages. This can lead to your pages being incorrectly marked down, or not indexed at all.

To prevent this from happening, create a file, robots.txt, with the following content:

User-agent: *
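# Block the duplicate-content generators: random-page and search-result URLs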
Disallow: /wiki/Special:Random
Disallow: /wiki/Special%3ARandom
Disallow: /wiki/Special:Search
Disallow: /wiki/Special%3ASearch

where wiki is the directory of your MediaWiki installation.

Place this file in the root HTML directory of your web server. You can verify that it is being served correctly by visiting http://your-web-server/robots.txt.
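If you would rather check the rules programmatically than eyeball the file, here is a minimal sketch using Python’s standard-library urllib.robotparser to fetch the live robots.txt and confirm that the Special pages are blocked for all crawlers. The http://your-web-server host is the same placeholder as above, and the Main_Page check simply assumes the default MediaWiki front page; substitute your own domain and a real article.

import urllib.robotparser

SITE = "http://your-web-server"  # placeholder -- use your own domain

rp = urllib.robotparser.RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()  # fetch and parse the live robots.txt

# These URLs should now be disallowed for all user agents ("*")
for url in (SITE + "/wiki/Special:Random", SITE + "/wiki/Special:Search"):
    print(url, "blocked" if not rp.can_fetch("*", url) else "STILL ALLOWED")

# An ordinary article should remain crawlable
article = SITE + "/wiki/Main_Page"  # assumed default front page
print(article, "allowed" if rp.can_fetch("*", article) else "BLOCKED")

If both Special URLs report “blocked” and the article reports “allowed”, the rules are doing their job.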

This fix prevents your MediaWiki-based web site from being marked down incorrectly due to duplicated content.
