Robot Configuration


Extension: Robot Configuration


Overview

Description: The "robots.txt" file was introduced to protect one's own pages on the Internet from web crawlers.
State: stable
Dependency: -
Developer: Martijn Koster
License: -
Type: MediaWiki
Category: -
Edition: BlueSpice free

For more info, visit MediaWiki.

Features

The "robots.txt" file was introduced to protect one's own pages on the Internet from web crawlers.

This file is located in the root directory of the server and controls web crawler access; it can ban crawlers from the site entirely or from specific areas.

It is often used to reduce the data traffic caused by the large number of web crawlers, or to block certain areas of a site from them.
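As a sketch of how such a file blocks certain areas, a minimal robots.txt might look like the following (the paths shown are hypothetical examples, not part of this extension):

```
# Rules for all crawlers
User-agent: *
# Ban crawlers from these areas of the site
Disallow: /private/
Disallow: /tmp/
# Everything else remains accessible

# Ban one specific crawler from the whole site
User-agent: BadBot
Disallow: /
```

Each `User-agent` line starts a rule group, and the `Disallow` lines beneath it list path prefixes that group of crawlers should not fetch. Note that compliance is voluntary: well-behaved crawlers honor the file, but it is not an access-control mechanism.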


For further information about this file, please refer to Wikipedia.
