Package: robotstxt
Type: Package
Title: A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker
Version: 0.7.15.9000
Authors@R: c(
    person(
      "Pedro", "Baltazar", role = c("ctb"),
      email = "pedrobtz@gmail.com"
    ),
    person(
      "Jordan", "Bradford", role = c("cre"),
      email = "jrdnbradford@gmail.com"
    ),
    person(
      "Peter", "Meissner", role = c("aut"),
      email = "retep.meissner@gmail.com"
    ),
    person(
      "Kun", "Ren", email = "mail@renkun.me", role = c("aut", "cph"),
      comment = "Author and copyright holder of list_merge.R."
    ),
    person("Oliver", "Keys", role = "ctb", comment = "original release code review"),
    person("Rich", "Fitz John", role = "ctb", comment = "original release code review")
  )
Description: Provides functions to download and parse 'robots.txt' files.
    Ultimately the package makes it easy to check whether bots
    (spiders, crawlers, scrapers, ...) are allowed to access specific
    resources on a domain.
License: MIT + file LICENSE
BugReports: https://github.com/ropensci/robotstxt/issues
URL: https://docs.ropensci.org/robotstxt/, https://github.com/ropensci/robotstxt
Imports:
    stringr (>= 1.0.0),
    httr (>= 1.0.0),
    spiderbar (>= 0.2.0),
    future.apply (>= 1.0.0),
    magrittr,
    utils
Suggests:
    knitr,
    rmarkdown,
    dplyr,
    testthat (>= 3.0.0),
    covr,
    curl
Depends:
    R (>= 3.0.0)
VignetteBuilder: knitr
RoxygenNote: 7.3.2
Encoding: UTF-8
Config/testthat/edition: 3
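
As a brief usage sketch (not part of the DESCRIPTION file itself): the permission check described above is exposed through `paths_allowed()`, which fetches and parses a domain's 'robots.txt' and reports whether a given bot may access the given paths. The domain and paths below are placeholders; the call performs a network request.

```r
library(robotstxt)

# Check whether a bot is allowed to access specific resources on a domain.
# "example.com" and the paths are illustrative placeholders.
paths_allowed(
  paths  = c("/images/", "/search"),
  domain = "example.com",
  bot    = "*"
)
# Returns a logical vector, one element per path.
```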