The behavior and performance of robots can be improved considerably with the
cooperation of Webmasters. Conversely, robot writers must follow some
guidelines so as not to upset Webmasters.
- Robots may exhibit unacceptable behavior when they retrieve documents
that are not suitable for robot consumption. Examples include mirrors of
other servers, cache hierarchies, temporary files, and some CGI scripts
for counting and/or feedback.
A Webmaster can help the robot writer by listing documents to avoid in a
robots.txt file.
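As a sketch of what such hints look like, a robots.txt file placed at the root of a server uses User-agent and Disallow directives; the paths below (a mirror area, temporary files, and CGI scripts) are hypothetical examples chosen to match the cases above:

```
# Applies to all robots
User-agent: *
Disallow: /mirror/     # mirrored copies of other servers
Disallow: /tmp/        # temporary files
Disallow: /cgi-bin/    # counting and feedback scripts
```

A well-behaved robot fetches this file before crawling a server and skips any path matching a Disallow line.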
- Robots may cause problems for Webmasters when they submit many requests
to the same server in a short period of time (causing a high load), or
when they run out of control, for instance by entering a "black hole",
as illustrated earlier by the example of infinite recursion.
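The high-load problem above can be avoided by rate-limiting requests per server. The following is a minimal sketch, assuming a hypothetical `PoliteScheduler` helper (not from the text) that enforces a minimum delay between successive requests to the same host:

```python
import time
from collections import defaultdict
from urllib.parse import urlparse

class PoliteScheduler:
    """Enforce a minimum delay between requests to the same host,
    so a robot never floods one server with rapid-fire requests."""

    def __init__(self, min_delay=1.0):
        self.min_delay = min_delay          # seconds between hits to one host
        self.last_hit = defaultdict(float)  # host -> time of last request

    def wait_for(self, url):
        host = urlparse(url).netloc
        elapsed = time.monotonic() - self.last_hit[host]
        if elapsed < self.min_delay:
            # Too soon to revisit this host: sleep off the remainder.
            time.sleep(self.min_delay - elapsed)
        self.last_hit[host] = time.monotonic()

sched = PoliteScheduler(min_delay=0.1)
for url in ["http://example.org/a", "http://example.org/b"]:
    sched.wait_for(url)   # blocks until it is polite to fetch url
    # ... fetch url here ...
```

Requests to different hosts are not delayed against each other, so a robot can still make progress by interleaving servers while staying polite to each one.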