At PerimeterX, we defined a “Code of Good Bots” that provides basic rules of good behavior. The most famous good bot is the Googlebot, which crawls links to build the search engine many of us use. Good bots are required to download the robots.txt file from the site before accessing it, parse it, and follow the instructions. Good bot builders should provide a defensible method to verify a bot is what it declares itself to be. Bad bots can pretend to be legitimate by copying the user-agent header of a beneficial bot.
Source: https://www.darkreading.com/cloud/how-to-live-by-the-code-of-good-bots
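The robots.txt check described above can be sketched with Python's standard-library `urllib.robotparser`. The rules below are a hypothetical robots.txt (the domain `example.com` and the `/private/` path are illustrative assumptions, not from the source); a real good bot would fetch the file from the site's `/robots.txt` URL before crawling.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt a good bot would have downloaded from
# https://example.com/robots.txt before accessing the site.
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot may crawl public pages but must skip /private/.
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))   # True
print(parser.can_fetch("Googlebot", "https://example.com/private/x"))    # False

# Every other user agent is disallowed entirely by the wildcard rule.
print(parser.can_fetch("SomeOtherBot", "https://example.com/index.html"))  # False
```

Note that this check relies on the self-reported user-agent string, which is exactly what a bad bot spoofs; that is why the excerpt also calls for a verifiable identity method (Google, for instance, documents verifying Googlebot via reverse DNS lookup).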

