#
# robots.txt
#
# This file is to prevent the crawling and indexing of certain parts
# of your site by web crawlers and spiders run by sites like Yahoo!
# and Google. By telling these "robots" where not to go on your site,
# you save bandwidth and server resources.
#
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/
#
# MAINTAINED by puppet

Sitemap: http://lib.ugent.be/siteindex.xml

User-agent: *
Crawl-delay: 10

# Disallow for crawlers
Disallow: /status