From 45e3578b274f10c741a0d9771f98c5dbdf812910 Mon Sep 17 00:00:00 2001
From: Isaac Woods
We allow our API and website to be crawled by commercial crawlers such as GoogleBot. At our discretion, we may choose to allow access to experimental crawlers, as long as they limit their request rate to 1 request per second or less.
+
We also require all crawlers to provide a user-agent header that allows us to
uniquely identify your bot. This allows us to more accurately monitor any
impact your bot may have on our service. Providing a user agent that only
-identifies your HTTP client library (such as "request/0.9.1") increases the
+identifies your HTTP client library (such as "reqwest/0.9.1") increases the
likelihood that we will block your traffic.
It is recommended, but not required, to include contact information in your user
agent. This allows us to contact you if we would like a change in your bot's
behavior without having to block your traffic.
+
+Bad: "User-Agent: reqwest/0.9.1
"
+Better: "User-Agent: my_bot
"
+Best: "User-Agent: my_bot (my_bot.com/info)
" or "User-Agent: my_bot (help@my_bot.com)
"
+
We reserve the right to block traffic from any bot that we determine to be in violation of this policy or causing an impact on the integrity of our service.
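As a sketch only, separate from the patch itself: the snippet below shows what a request from a policy-compliant crawler might look like, assuming a recent version of the Rust reqwest and tokio crates. The bot name, contact URL, and target URL are placeholder values for illustration, not values taken from this policy.

// Sketch of a crawler following this policy: a unique, contactable
// User-Agent and a request rate of at most 1 request per second.
// Assumes recent reqwest and tokio crates; all names and URLs below
// are placeholders.
use std::time::Duration;

#[tokio::main]
async fn main() -> Result<(), reqwest::Error> {
    // Identify the bot and include contact information, instead of
    // relying on the library's default user agent (e.g. "reqwest/0.9.1").
    let client = reqwest::Client::builder()
        .user_agent("my_bot (my_bot.com/info)")
        .build()?;

    let urls = ["https://example.com/api/v1/items"];
    for url in urls {
        let body = client.get(url).send().await?.text().await?;
        println!("fetched {} bytes from {url}", body.len());
        // Keep the request rate at or below 1 request per second.
        tokio::time::sleep(Duration::from_secs(1)).await;
    }
    Ok(())
}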