Version of 10:24, 6 December 2016 by NathalieKöpff
Extension: Robot Configuration
Overview | | | |
---|---|---|---|
Description: | The "robots.txt" file was introduced to protect a site's own pages on the Internet from web crawlers. | | |
State: | stable | Dependency: | |
Developer: | Martijn Koster | License: | - |
Type: | MediaWiki | Category: | - |
Edition: | BlueSpice free | | |
Features
The "robots.txt" file was introduced to protect a site's own pages on the Internet from web crawlers.

This file is located in the root directory of the web server; it is a control file that can ban web crawlers from the site or from parts of it.

It is often used to reduce the traffic caused by the large number of web crawlers, or to block certain areas of the site from them.

For further information about this file, please refer to the [https://en.wikipedia.org/wiki/Robots_exclusion_standard Wikipedia article on the robots exclusion standard].
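As an illustration, a minimal "robots.txt" might look like the following sketch. The paths and the crawler name are hypothetical examples, not values prescribed by this extension:

```
# Hypothetical example: let all crawlers in, but keep them
# out of one directory to reduce traffic on dynamic pages
User-agent: *
Disallow: /wiki/Special:

# Ban one specific (hypothetical) crawler from the whole site
User-agent: ExampleBot
Disallow: /
```

Each `User-agent` group applies to the named crawler ( `*` matches any), and each `Disallow` line blocks the path prefix that follows it; note that compliance is voluntary, so the file deters well-behaved crawlers only.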
[[de:Robotskonfiguration]][[en:{{FULLPAGENAME}}]]
[[Category:Extension]] [[Category:Spam]]