Tweet Analysis

Twitter allows users to mine tweets with its API. This project uses the API to find the most frequently used words in a given number of tweets and presents them as a word cloud.

Getting Started

Create a Twitter account, verify your phone number, and sign up for developer access.

Prerequisites

  1. Tweepy
  2. Twitter API key and Access Token
  3. configparser
  4. wordcloud
  5. scikit-learn
  6. collections (part of the Python standard library)

Prerequisites installation

pip install tweepy
pip install configparser
pip install wordcloud
pip install scikit-learn

Note that collections ships with Python and does not need to be installed separately, and the PyPI package for sklearn is named scikit-learn.

How to execute the files

Download the config file and add your API key, API key secret, Access Token, and Access Token Secret.
Next, download the final.py file and open it in VS Code or any IDE that can run Python.
Set your keyword, geolocation, and tweet limit, then run the script; the collected tweets are written to a CSV file (a sketch of this step follows below).
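The contents of final.py are not reproduced here; the following is a minimal sketch of what the collection step might look like, assuming a config.ini with a [twitter] section, an output file named tweets.csv, and example values for the keyword, geolocation, and limit:

import configparser
import csv
import tweepy

# Read credentials from config.ini; the section and key names are assumptions, e.g.:
#   [twitter]
#   api_key = ...
#   api_key_secret = ...
#   access_token = ...
#   access_token_secret = ...
config = configparser.ConfigParser()
config.read("config.ini")
keys = config["twitter"]

# Authenticate against the Twitter v1.1 API with Tweepy
auth = tweepy.OAuthHandler(keys["api_key"], keys["api_key_secret"])
auth.set_access_token(keys["access_token"], keys["access_token_secret"])
api = tweepy.API(auth, wait_on_rate_limit=True)

keyword = "python"                  # search term (example value)
geocode = "40.7128,-74.0060,50km"   # latitude,longitude,radius (example value)
limit = 300                         # number of tweets to collect (example value)

# Page through the search results and write them to a CSV file
with open("tweets.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["created_at", "user", "text"])
    for tweet in tweepy.Cursor(api.search_tweets, q=keyword, geocode=geocode,
                               lang="en", tweet_mode="extended").items(limit):
        writer.writerow([tweet.created_at, tweet.user.screen_name, tweet.full_text])

Setting wait_on_rate_limit=True makes Tweepy pause when the API rate limit is reached instead of raising an error.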
Finally, download visualization.py and run it to render the collected tweets as a word cloud (a sketch of this step also follows below).
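Likewise, visualization.py itself is not reproduced here; this is a minimal sketch of the visualization step, assuming the CSV produced above and a text column named text:

import csv
from collections import Counter

from sklearn.feature_extraction.text import CountVectorizer
from wordcloud import WordCloud

# Load the tweet text collected in the previous step (file and column names are assumptions)
with open("tweets.csv", newline="", encoding="utf-8") as f:
    texts = [row["text"] for row in csv.DictReader(f)]

# Tokenize and count the words, dropping common English stop words
vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(texts)
frequencies = Counter(dict(zip(vectorizer.get_feature_names_out(),
                               counts.sum(axis=0).A1)))

# Render the most frequent words as a word cloud and save the image
wc = WordCloud(width=800, height=400, background_color="white")
wc.generate_from_frequencies(frequencies)
wc.to_file("wordcloud.png")

Counter is used here only to hold the word frequencies; the actual script may split the counting between scikit-learn and collections differently.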

Output

  1. Wordcloud - 300 Tweets

  2. Wordcloud - 600 Tweets

  3. Wordcloud - 1200 Tweets

Author

License

This project is licensed under the MIT License - see the LICENSE.md file for details

References

  • AI Spectrum
  • Tweepy documentation
  • Stack Overflow
