Twitter's official API is pricey and heavily restricted, but the web frontend has its own API, which was reverse-engineered by @n0madic and is maintained by @imperatrona. Some endpoints require authentication, but it is easy to scale by buying additional accounts and proxies.
You can use this library to get tweets, profiles, and trends trivially.
- Installation
- Quick start
- Rate limits
- Methods that return channels
- Authentication
- Methods
- Get tweet
- Get tweet replies
- Get tweet retweeters
- Get user tweets
- Get user media
- Get bookmarks
- Get home tweets
- Get foryou tweets
- Search tweets
- Search params
- Get profile
- Get profile by id
- Search profile
- Get trends
- Get following
- Get followers
- Get space
- Like tweet
- Unlike tweet
- Create tweet
- Delete tweet
- Create retweet
- Delete retweet
- Get scheduled tweets
- Create scheduled tweet
- Delete scheduled tweet
- Upload media
- Account
- Connection
- Contributing
go get -u github.com/imperatrona/twitter-scraper
package main

import (
    "context"
    "fmt"

    twitterscraper "github.com/imperatrona/twitter-scraper"
)

func main() {
    scraper := twitterscraper.New()
    scraper.SetAuthToken(twitterscraper.AuthToken{Token: "auth_token", CSRFToken: "ct0"})

    // After setting Cookies or AuthToken you have to call the IsLoggedIn method.
    // Without it, the scraper won't be able to make requests that require authentication.
    if !scraper.IsLoggedIn() {
        panic("Invalid AuthToken")
    }

    for tweet := range scraper.GetTweets(context.Background(), "x", 50) {
        if tweet.Error != nil {
            panic(tweet.Error)
        }
        fmt.Println(tweet.Text)
    }
}
The API has a global limit on how many requests per second are allowed; don't make more than one request per 1.5 seconds from a single account. Each endpoint also has its own limit, most commonly 150 requests per 15 minutes.
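A simple way to stay within the global limit is the scraper's built-in delay (documented in the Connection section below); a minimal sketch:
// wait 2 seconds between API requests, comfortably above the
// ~1.5 second minimum spacing recommended per account
scraper.WithDelay(2)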
Twitter apparently doesn't limit the number of accounts that can be used per IP address, though this could change at any time. As of February 2024, I have been running 20 accounts per IP address for several months without receiving a ban.
OpenAccount was great in the past, but it has since been nerfed by Twitter. It allows 180 requests instead of 150, but you can only create one account per month per IP address. If you use OpenAccount, you should save your credentials and reuse them later with the WithOpenAccount method.
Some methods return channels. They were created to free you from dealing with cursors, but under the hood they use the same endpoints as their Fetch counterparts and are subject to the same rate limits. For example, GetTweets uses FetchTweets to get tweets. FetchTweets returns up to 20 tweets per request, so if you ask GetTweets for 150 tweets it will make 8 requests to FetchTweets (150/20 = 7.5, rounded up to 8).
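For illustration, here is a sketch of the equivalent manual pagination with FetchTweets and its cursor; the channel methods run this loop for you (the empty-cursor stop condition is an assumption):
var all []*twitterscraper.Tweet
var cursor string
for len(all) < 150 {
    // each call returns up to 20 tweets plus a cursor for the next page
    tweets, next, err := scraper.FetchTweets("x", 20, cursor)
    if err != nil {
        panic(err)
    }
    all = append(all, tweets...)
    // assumption: an empty cursor or an empty page means there is nothing left
    if next == "" || len(tweets) == 0 {
        break
    }
    cursor = next
}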
If the underlying Fetch method gets an error, the error is passed to the twitterscraper.TweetResult object and further scraping stops. In methods that return twitterscraper.TweetResult, you should check that tweet.Error is not nil before accessing the tweet content.
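For example, a sketch that logs the error and stops instead of panicking, which is handy when a rate limit ends the stream early:
for tweet := range scraper.GetTweets(context.Background(), "x", 150) {
    if tweet.Error != nil {
        // the first error stops further scraping, so handle it gracefully
        log.Println("scraping stopped:", tweet.Error)
        break
    }
    fmt.Println(tweet.Text)
}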
Most endpoints require authentication. The preferred way is to use SetCookies. You can also use SetAuthToken, but POST endpoints will not work with it. Login with a password may require confirmation by email and is a frequent cause of account bans.

Endpoints that work without authentication will not return sensitive content. To get sensitive content you need to authenticate with any available method, including OpenAccount.
// Deserialize cookies from JSON
var cookies []*http.Cookie
f, err := os.Open("cookies.json")
if err != nil {
    panic(err)
}
defer f.Close()
if err := json.NewDecoder(f).Decode(&cookies); err != nil {
    panic(err)
}
scraper.SetCookies(cookies)

if !scraper.IsLoggedIn() {
    panic("Invalid cookies")
}
To save cookies from an authorized client to a file, use GetCookies:
cookies := scraper.GetCookies()
data, err := json.Marshal(cookies)
if err != nil {
    panic(err)
}
if err := os.WriteFile("cookies.json", data, 0644); err != nil {
    panic(err)
}
The SetAuthToken method simply sets the required cookies auth_token and ct0.
scraper.SetAuthToken(twitterscraper.AuthToken{Token: "auth_token", CSRFToken: "ct0"})
if !scraper.IsLoggedIn() {
panic("Invalid AuthToken")
}
Warning
Deprecated. Nerfed by Twitter; doesn't support new endpoints. LoginOpenAccount is now limited to one new account per month per IP address.
account, err := scraper.LoginOpenAccount()
You should save the OpenAccount returned by LoginOpenAccount to reuse it later.
scraper.WithOpenAccount(twitterscraper.OpenAccount{
    OAuthToken:       "TOKEN",
    OAuthTokenSecret: "TOKEN_SECRET",
})
To log in, you have to use your username, not the email!
err := scraper.Login("username", "password")
If you have email confirmation, use your email address in addition:
err := scraper.Login("username", "password", "email")
If you have two-factor authentication, use the code:
err := scraper.Login("username", "password", "code")
Login status can be checked with the IsLoggedIn method:
scraper.IsLoggedIn()
To log out:
scraper.Logout()
150 requests / 15 minutes
The TweetDetail endpoint requires auth, so when auth is not provided the TweetResultByRestId endpoint is used instead, which doesn't return InReplyToStatus and Thread tweets.
tweet, err := scraper.GetTweet("1328684389388185600")
150 requests / 15 minutes
Returns ~5-10 tweets per request and multiple cursors, one for each thread.
var cursor string
tweets, cursors, err := scraper.GetTweetReplies("1328684389388185600", cursor)
To get all replies and replies of replies for a tweet, you can iterate over all cursors, as in the example below. To get only direct replies, check that cursor.ThreadID equals your tweet id.
tweetId := "1328684389388185600"
tweets, cursors, err := scraper.GetTweetReplies(tweetId, "")
if err != nil {
    panic(err)
}
for len(cursors) > 0 {
    // pop the next thread cursor from the queue
    var cursor *twitterscraper.ThreadCursor
    cursor, cursors = cursors[0], cursors[1:]
    moreTweets, moreCursors, err := scraper.GetTweetReplies(tweetId, cursor.Cursor)
    if err != nil {
        // you can check here if you are rate limited, wait, and retry the request
        panic(err)
    }
    tweets = append(tweets, moreTweets...)
    cursors = append(cursors, moreCursors...)
}
500 requests / 15 minutes
Returns a list of users who have retweeted the tweet.
var cursor string
retweeters, cursor, err := scraper.GetTweetRetweeters("1328684389388185600", 20, cursor)
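To collect every retweeter you can page with the returned cursor; a sketch, assuming an empty cursor marks the last page and the returned users carry a Username field:
var cursor string
for {
    retweeters, next, err := scraper.GetTweetRetweeters("1328684389388185600", 20, cursor)
    if err != nil {
        panic(err)
    }
    for _, user := range retweeters {
        fmt.Println(user.Username)
    }
    // assumption: an empty cursor means there are no more pages
    if next == "" {
        break
    }
    cursor = next
}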
150 requests / 15 minutes
GetTweets returns a channel with the specified number of user tweets. It uses the FetchTweets method under the hood. Read how this method works in Methods that return channels.
for tweet := range scraper.GetTweets(context.Background(), "taylorswift13", 50) {
    if tweet.Error != nil {
        panic(tweet.Error)
    }
    fmt.Println(tweet.Text)
}
FetchTweets returns tweets and a cursor for fetching the next page. Each request returns up to 20 tweets.
var cursor string
tweets, cursor, err := scraper.FetchTweets("taylorswift13", 20, cursor)
To get tweets and replies, use the GetTweetsAndReplies, FetchTweetsAndReplies, and FetchTweetsAndRepliesByUserID methods.
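A sketch of the channel variant, assuming GetTweetsAndReplies has the same signature and behavior as GetTweets:
for tweet := range scraper.GetTweetsAndReplies(context.Background(), "taylorswift13", 50) {
    if tweet.Error != nil {
        panic(tweet.Error)
    }
    fmt.Println(tweet.Text)
}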
500 requests / 15 minutes
GetMediaTweets returns a channel with the specified number of user tweets that contain media. It uses the FetchMediaTweets method under the hood. Read how this method works in Methods that return channels.
for tweet := range scraper.GetMediaTweets(context.Background(), "taylorswift13", 50) {
    if tweet.Error != nil {
        panic(tweet.Error)
    }
    fmt.Println(tweet.Text)
}
FetchMediaTweets returns tweets and a cursor for fetching the next page. Each request returns up to 20 tweets.
var cursor string
tweets, cursor, err := scraper.FetchMediaTweets("taylorswift13", 20, cursor)
Important
Requires authentication!
500 requests / 15 minutes
GetBookmarks returns a channel with the specified number of bookmarked tweets. It uses the FetchBookmarks method under the hood. Read how this method works in Methods that return channels.
for tweet := range scraper.GetBookmarks(context.Background(), 50) {
    if tweet.Error != nil {
        panic(tweet.Error)
    }
    fmt.Println(tweet.Text)
}
FetchBookmarks returns bookmarked tweets and a cursor for fetching the next page. Each request returns up to 20 tweets.
var cursor string
tweets, cursor, err := scraper.FetchBookmarks(20, cursor)
Important
Requires authentication!
500 requests / 15 minutes
GetHomeTweets returns a channel with the specified number of latest home timeline tweets. It uses the FetchHomeTweets method under the hood. Read how this method works in Methods that return channels.
for tweet := range scraper.GetHomeTweets(context.Background(), 50) {
    if tweet.Error != nil {
        panic(tweet.Error)
    }
    fmt.Println(tweet.Text)
}
FetchHomeTweets returns the latest home timeline tweets and a cursor for fetching the next page. Each request returns up to 20 tweets.
var cursor string
tweets, cursor, err := scraper.FetchHomeTweets(20, cursor)
Important
Requires authentication!
500 requests / 15 minutes
GetForYouTweets returns a channel with the specified number of tweets from the For You timeline. It uses the FetchForYouTweets method under the hood. Read how this method works in Methods that return channels.
for tweet := range scraper.GetForYouTweets(context.Background(), 50) {
    if tweet.Error != nil {
        panic(tweet.Error)
    }
    fmt.Println(tweet.Text)
}
FetchForYouTweets returns For You timeline tweets and a cursor for fetching the next page. Each request returns up to 20 tweets.
var cursor string
tweets, cursor, err := scraper.FetchForYouTweets(20, cursor)
Important
Requires authentication!
150 requests / 15 minutes
SearchTweets returns a channel with the specified number of tweets matching the search query. It uses the FetchSearchTweets method under the hood. Read how this method works in Methods that return channels.
for tweet := range scraper.SearchTweets(context.Background(),
    "twitter scraper data -filter:retweets", 50) {
    if tweet.Error != nil {
        panic(tweet.Error)
    }
    fmt.Println(tweet.Text)
}
FetchSearchTweets returns tweets and a cursor for fetching the next page. Each request returns up to 20 tweets.
var cursor string
tweets, cursor, err := scraper.FetchSearchTweets("taylorswift13", 20, cursor)
By default, search returns top tweets. You can change it by specifying the search mode before making requests. Supported modes are SearchTop, SearchLatest, SearchPhotos, SearchVideos, and SearchUsers.
scraper.SetSearchMode(twitterscraper.SearchLatest)
See Rules and filtering for building standard queries.
95 requests / 15 minutes
profile, err := scraper.GetProfile("taylorswift13")
95 requests / 15 minutes
profile, err := scraper.GetProfileByID("17919972")
Important
Requires authentication!
150 requests / 15 minutes
SearchProfiles returns a channel with the specified number of profiles matching the search query. It uses the FetchSearchProfiles method under the hood. Read how this method works in Methods that return channels.
for profile := range scraper.SearchProfiles(context.Background(), "Twitter", 50) {
    if profile.Error != nil {
        panic(profile.Error)
    }
    fmt.Println(profile.Name)
}
FetchSearchProfiles returns profiles and a cursor for fetching the next page. Each request returns up to 20 profiles.
var cursor string
profiles, cursor, err := scraper.FetchSearchProfiles("taylorswift13", 20, cursor)
trends, err := scraper.GetTrends()
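The result can be iterated directly; a short sketch, assuming GetTrends returns a plain slice of trend names:
if err != nil {
    panic(err)
}
for _, trend := range trends {
    fmt.Println(trend) // each entry is a trending topic name
}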
Important
Requires authentication!
500 requests / 15 minutes
var cursor string
users, cursor, err := scraper.FetchFollowing("Support", 20, cursor)
Important
Requires authentication!
50 requests / 15 minutes
var cursor string
users, cursor, err := scraper.FetchFollowers("Support", 20, cursor)
Important
Requires authentication!
500 requests / 15 minutes
Use this to retrieve data about a space and its participants. You can get up to 1,000 participants of a space. If the method returns fewer, it's probably because some listeners are anonymous.
space, err := scraper.GetSpace("space_id")
You can get the space_id from the space URL, which can be retrieved from a tweet. For example:
tweet, err := scraper.GetTweet("1815884577040445599")
if err != nil {
    panic(err)
}

var spaceId string
spaceUrl := tweet.URLs[0] // https://twitter.com/i/spaces/1mnxeAMPEqqxX
if strings.HasPrefix(spaceUrl, "https://twitter.com/i/spaces/") {
    spaceId = strings.Replace(spaceUrl, "https://twitter.com/i/spaces/", "", 1) // 1mnxeAMPEqqxX
}

space, err := scraper.GetSpace(spaceId)
Important
Requires authentication!
500 requests / 15 minutes (combined with the UnlikeTweet method)
err := scraper.LikeTweet("tweet_id")
Important
Requires authentication!
500 requests / 15 minutes (combined with the LikeTweet method)
err := scraper.UnlikeTweet("tweet_id")
Important
Requires authentication!
tweet, err := scraper.CreateTweet(twitterscraper.NewTweet{
    Text:   "new tweet text",
    Medias: nil,
})
To create a tweet with media, you need to upload the media first. Up to 4 media attachments are allowed; jpg, mp4, and gif formats are supported.
media, err := scraper.UploadMedia("./photo.jpg")
if err != nil {
    panic(err)
}

tweet, err := scraper.CreateTweet(twitterscraper.NewTweet{
    Text: "new tweet text",
    Medias: []*twitterscraper.Media{
        media,
    },
})
Important
Requires authentication!
err := scraper.DeleteTweet("1810458885008105870")
Important
Requires authentication!
Returns the retweet id, which is not the same as the source tweet id.
retweetId, err := scraper.CreateRetweet("1792634158977568997")
Important
Requires authentication!
To delete a retweet, use the source tweet id instead of the retweet id.
err := scraper.DeleteRetweet("1792634158977568997")
Important
Requires authentication!
500 requests / 15 minutes
tweets, err := scraper.FetchScheduledTweets()
Important
Requires authentication!
500 requests / 15 minutes
tweets, err := scraper.CreateScheduledTweet(twitterscraper.TweetSchedule{
    Text:   "New scheduled tweet text",
    Date:   time.Now().Add(time.Hour * 24 * 31),
    Medias: nil,
})
Important
Requires authentication!
500 requests / 15 minutes
err := scraper.DeleteScheduledTweet("123")
Important
Requires authentication!
50 requests / 15 minutes
Uploads a photo, video, or gif for later posting or scheduling. The uploaded media expires in 24 hours if not used.
media, err := scraper.UploadMedia("./files/movie.mp4")
Requires authentication!
To get the current account settings, use the GetAccountSettings method.
settings, err := scraper.GetAccountSettings()
If you use a session with multiple accounts, you can use the GetAccountList method to get a slice of all accounts.
accounts, err := scraper.GetAccountList()
By default, the client uses the user agent of Google Chrome v129 on macOS:
Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/129.0.0.0 Safari/537.36
You can set any user agent you want with the SetUserAgent method.
scraper.SetUserAgent("user-agent")
To get the current user agent, use GetUserAgent:
agent := scraper.GetUserAgent()
err := scraper.SetProxy("http://localhost:3128")
err := scraper.SetProxy("socks5://localhost:1080")
SOCKS5 proxies support authentication:
err := scraper.SetProxy("socks5://user:pass@localhost:1080")
Add a delay between API requests (in seconds):
scraper.WithDelay(5)
To load timelines with tweet replies:
scraper.WithReplies(true)
To run some tests, you need to set some form of authentication via environment variables. You can see all possible variables in the .vscode/settings.json file. You can also set them in that file so vscode picks them up automatically; just make sure you don't commit them in your contribution.