NOTE: This repo will be migrated to github.com/saitogroup/geotweets
Ariel Kalinowski and Trevor Owens 8/24/2016
About:
----------------------------------------------------------------------
This library is composed of several tools for scraping
geolocated tweets and visualizing data gleaned from these tweets.
Geo-tag your tweets!
--------------------
We rely on geo-tagged tweets. Please allow your location to be shared
when tweeting, especially when using this application! To enable this,
log into your main Twitter account and, under "Security and Privacy",
check the box next to "Tweet location". THANKS!
Install:
----------------------------------------------------------------------
Requirements: git, Python 2.7.x, pip
Python packages required: tweepy, nltk, matplotlib, geopy, argparse, json
(argparse and json already ship with Python 2.7's standard library.)
On Windows: upgrade PowerShell
(you may still have Unicode problems when printing to the command line)
and invoke pip through Python:
$ python -m pip install <package>
For each required package listed above run:
$ pip install <package>
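For convenience, the third-party packages can also be installed with a
single command (a sketch; the package names are those listed above):
$ pip install tweepy nltk matplotlib geopy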
Now we need some data, so we'll use the nltk downloader.
Run a Python shell from the command line:
$ python
>>> import nltk
>>> nltk.download()
In the downloader window, highlight "book" on the main page, click
Download, and that should be it...
These are the exact nltk packages required, in case you want to download less data:
1) under corpora -> highlight stopwords
2) under corpora -> highlight treebank
3) under all packages -> highlight punkt
4) under models -> highlight averaged-perceptron-tagger
This creates a folder called "nltk_data" in your home folder, which is
used by the program.
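If you prefer to skip the downloader GUI, the same data can be fetched
programmatically (the identifiers below are the nltk ids for the items
listed above):
$ python
>>> import nltk
>>> for pkg in ['stopwords', 'treebank', 'punkt', 'averaged_perceptron_tagger']:
...     nltk.download(pkg)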
Navigate to the folder where you want geotweets to be, then clone the repository:
$ git clone https://github.com/owenst/geotweets.git  #THIS WILL CHANGE
Get a consumerkeyandsecret file (see below) and put it in that folder.
cd into the folder and run sample.py from the command line (see below).
Consumer Key and Secret:
----------------------------------------------------------------------
The program looks for a file in the geotweets folder called
consumerkeyandsecret. This should have at least 2 lines: the consumer
key on the first line and the consumer secret (the longer one) on the
next, then (for streaming and posting) 2 more lines: the access token
on the 3rd and the access token secret on the 4th. You can get these
by going to https://apps.twitter.com in a web browser and creating an
app, then hitting the button to create access tokens. You may have to
set the app permissions to "read and write" if you want to use this to
send tweets on your behalf. After creating the app, copy the 4
alphanumeric keys into a blank file called "consumerkeyandsecret" as
described above and put this file in your "geotweets" folder.
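As an illustration (a sketch only, not necessarily the repo's exact
loading code), the four lines can be read and passed to tweepy like so:
  # Sketch: read consumerkeyandsecret and authenticate with tweepy.
  import tweepy
  with open('consumerkeyandsecret') as f:
      keys = [line.strip() for line in f if line.strip()]
  auth = tweepy.OAuthHandler(keys[0], keys[1])   # consumer key, consumer secret
  auth.set_access_token(keys[2], keys[3])        # access token, access token secret
  api = tweepy.API(auth)
  print(api.verify_credentials().screen_name)    # sanity check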
TOOLS:
----------------------------------------------------------------------
sample:
-------
One tool, called 'sample', allows you to scrape and save up to
100 geolocated tweets in batch form. You can optionally search within
this set for specific words or hashtags and visualize the top word
frequencies. See sample.py for details, or from the command line run:
$ python sample.py --help
$ python sample.py --doc
USAGE:
$ python sample.py [-h][-d][-v][-f FILENAME][-o OUTPUT][-vis]
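For example (flags as in the usage line above; the file names shown are
the documented defaults):
$ python sample.py -f params.txt -o output.txt -vis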
real time visualizer:
---------------------
Another tool, called 'real_time_vis', uses the previous tool to create
a word frequency distribution chart which can grow and change in near
real time as more tweets are grabbed. See real_time_vis.py for details,
or from the command line run:
$ python real_time_vis.py --help
$ python real_time_vis.py --doc
USAGE:
$ python real_time_vis.py [-h][-d][-f FILENAME][-n NUMBER][-s][-a ADDRESS]
Both tools use a parameter file with geolocation and search terms; see
params.txt for an example.
You may have to adjust your PYTHONPATH variable to run the programs
from the command line; otherwise, you can run them from the Python
interpreter.
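For example, on OS X or Linux you might add the repository to your
PYTHONPATH like this (the path shown is illustrative):
$ export PYTHONPATH=$PYTHONPATH:/path/to/geotweets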
scan_and_respond
----------------
This tool scans tweets and asks the user to verify them before sending
a tweet response. The relevant tweets are also saved to a JSON
file. This requires write access, which means the consumerkeyandsecret
file must contain all 4 lines.
usage: scan_and_respond.py [-h] [-d] [-f FILENAME] [-a ADDRESS] [-o OUTPUT]
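For example (flags as in the usage line above; the output file name is
illustrative):
$ python scan_and_respond.py -f params.txt -o relevant_tweets.json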
write
-----
This program classifies tweets into phrase types and
produces a JSON array containing these, called phrases.json. It uses
parameters from params.txt. This requires quite a bit of processing
time, which can be reduced by using a lower "count".
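The internals of write.py are not reproduced here, but as a rough,
hypothetical illustration of the kind of nltk processing involved
(tokenizing and part-of-speech tagging a tweet; not the repo's actual code):
  import json
  import nltk
  # Hypothetical sketch only; write.py's real logic may differ.
  tweet = "Lovely sunset over the bay tonight"
  tokens = nltk.word_tokenize(tweet)   # uses the 'punkt' tokenizer
  tagged = nltk.pos_tag(tokens)        # uses the averaged perceptron tagger
  print(json.dumps(tagged))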
suggest_bot
-----------
This is a robotically assisted poetry engine. The user can create
poems using a large supplied word corpus or use their own. It can also
add words to the corpus from the Twittersphere using the search
option.
HELP:
----------------------------------------------------------------------
All programs can be run from the command line (a.k.a. terminal in OS X).
The $ symbol represents a command prompt.
Typing $ python <program_name> -h will print help on the various
command line options.
Typing $ python <program_name> -d will print the program's
documentation string.
EXAMPLES:
----------------------------------------------------------------------
Grabbing geo-located tweets using the parameter file params.txt (default),
printing them to the command line, and writing them to output.txt (default):
$ python sample.py --verbose
Visualizing the data, using params.txt (default):
$ python real_time_vis.py
Streaming real time data to create a word frequency chart using a local address:
$ python real_time_vis.py -a "175 5th Avenue NYC" -s
UTILITIES:
----------------------------------------------------------------------
These modules contain methods to assist the "tools" listed above:
tweeter.py
utils.py
geo_converter.py
geosearchclass.py
streamer.py
The following two modules run unit tests:
test_real_time_vis
test_write