Clark Boylan c499b57e16 Add robots.txt to our list servers
We've noticed that our uwsgi queues are filling up and a lot of requests
are being made to robots.txt, which end up returning 500/503 errors. Add
a robots.txt file which allows crawling of our lists and archives with a
delay value, in the hope that this will cause bots to cache results and
not fill up the queue with repetitive requests.
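
For illustration, a minimal robots.txt along these lines might look as
follows; the specific delay value and the blanket Allow rule here are
assumptions for the sketch, not necessarily the contents of the file
added by this change:

    # Hypothetical sketch: permit crawling but ask bots to slow down.
    # Crawl-delay is non-standard but honored by several major crawlers.
    User-agent: *
    Allow: /
    Crawl-delay: 10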

Change-Id: I660d8d43f6b2d96663212d93ec48e67d86e9e761
2024-04-23 08:51:50 -07:00