# Robots Configuration File
#
# Example: Tell the "WebSpider" indexing robot where it can't go (remove the # to activate)
# User-agent: WebSpider
# Disallow: /          (disallows all files from being "roboted")
#

User-agent: *
Disallow: /cgi-bin/
Disallow: /lynnita_folder/
Disallow: /js/
Disallow: /_borders/
Disallow: /_derived/
Disallow: /_fpclass/
Disallow: /_overlay/
Disallow: /_private/
Disallow: /_themes/
Disallow: /_vti_bin/
Disallow: /_vti_cnf/
Disallow: /_vti_log/
Disallow: /_vti_map/
Disallow: /_vti_pvt/
Disallow: /_vti_txt/
Disallow: /fpdb/
Disallow: /images/

# These bots consume significant system resources and skew log statistics
User-agent: SlySearch
Disallow: /

User-agent: TurnitinBot
Disallow: /

User-agent: ia_archiver
Disallow: /