# robots.txt
This repository contains my very opinionated (and possibly overengineered)
robots.txt generation setup.
`lists` contains lists of user agents to disallow at root
* `.txt` files, line-by-line listings of user agents
* empty lines are ignored
* lines starting with `#` are treated as comments and ignored
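Given those rules, a list file could look like the following (the crawler names are purely illustrative, not necessarily what this repo blocks):

```text
# example list file, e.g. lists/crawlers.txt (names illustrative)
BadBot
EvilCrawler

# blank lines and comment lines like these are skipped
AnotherBot
```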
`bases` contains site-specific base robots.txt files
* base files in the standard robots.txt format
* file names match the domains they are served at
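A base file would then carry the per-site rules that the generated disallow list is layered on top of. A minimal illustrative example (domain and paths are made up):

```text
# bases/example.com — served at https://example.com/robots.txt
User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```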
`generate.sh` is a bash script for generating output robots.txt files
* arg `$1` is a required path to the output file
* arg `$2` is an optional path to a base file
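Putting the pieces together, a hypothetical sketch of what `generate.sh` might do, based only on the layout described above (the real script may differ): copy the optional base file, then append a disallow-at-root record for every user agent found in `lists/*.txt`. It assumes it runs from the repository root so the `lists/` glob resolves.

```shell
#!/usr/bin/env bash
# Sketch only — assumes the lists/ and bases/ layout described above.
set -euo pipefail
shopt -s nullglob   # if lists/ has no .txt files, the loop body never runs

generate() {
  local out="$1"        # required: path to the output robots.txt
  local base="${2:-}"   # optional: path to a site-specific base file

  # Start from the base file if one was given, otherwise start empty.
  if [[ -n "$base" ]]; then
    cp "$base" "$out"
  else
    : > "$out"
  fi

  # Append a disallow-at-root record for every listed user agent,
  # skipping empty lines and '#' comment lines.
  local list agent
  for list in lists/*.txt; do
    while IFS= read -r agent; do
      [[ -z "$agent" || "$agent" == \#* ]] && continue
      printf 'User-agent: %s\nDisallow: /\n\n' "$agent" >> "$out"
    done < "$list"
  done
}
```

An invocation would then look like `./generate.sh out/robots.txt bases/example.com` (paths illustrative).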