Robots.txt Celebrates 20 Years Of Blocking Search Engines

by Barry Schwartz
Today is the 20th anniversary of the robots.txt directive being available for webmasters to block search engines from crawling their pages. Robots.txt was created by Martijn Koster in 1994, while he was working at Nexor, after crawlers hit his sites too hard. All major search engines back then, including WebCrawler, Lycos and AltaVista, quickly adopted it.
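For readers unfamiliar with the file itself: a robots.txt lives at a site's root (e.g. example.com/robots.txt), and a minimal one that asks all crawlers to stay away from the whole site looks like this (a sketch of the convention, not an enforcement mechanism — compliant crawlers honor it voluntarily):

```
# Applies to every crawler
User-agent: *
# Disallow crawling of all paths
Disallow: /
```

Leaving the Disallow line empty (`Disallow:`) instead permits crawling of everything.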