Robot Configuration
Extension: Robot Configuration
| Overview | | | |
|---|---|---|---|
| Description: | The "robots.txt" file was introduced to protect a site's own pages on the Internet from web crawlers. | | |
| State: | stable | Dependency: | |
| Developer: | Martijn Koster | License: | - |
| Type: | MediaWiki | Category: | - |
| Edition: | BlueSpice free | | |
Features
The "robots.txt" file was introduced to protect a site's own pages on the Internet from web crawlers.
This file is located in the root directory of the web server and is a control file that can ban web crawlers from specified areas of the site.
It is often used to reduce the data traffic caused by the large number of web crawlers or to block certain areas of the site from them.
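The exact rules depend on a wiki's URL layout, so the following is only an illustrative sketch of what a robots.txt might contain; the paths shown are assumptions, not taken from any particular installation:

```
# Illustrative robots.txt (assumed paths, adjust to your own URL layout)
User-agent: *             # these rules apply to all crawlers
Disallow: /wiki/Special:  # keep crawlers out of special pages
Disallow: /w/             # block script and index paths
Crawl-delay: 10           # ask well-behaved crawlers to slow down
```

Note that directives such as Crawl-delay are honored by some crawlers but are not part of the original robots exclusion standard.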
For further information about this file, please refer to Wikipedia.