How to save all your Twitter followers?

In this tutorial, I will show you how to export all your followers to a CSV/Excel file using a simple Python script. Along with each follower's screen name, you will also get their description, number of followers, language, and location. This can help you monitor your account and track how your follower base grows over time.




This tutorial is based on the Tweepy library and Python 3.6.
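If you don't have the libraries yet, both can be installed with pip (assuming pip points at your Python 3 interpreter):

```shell
pip install tweepy pandas
```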





In order to access and download data from the Twitter API, you need credentials: consumer keys and access tokens. You get them by simply creating an app with Twitter.


Follow these steps:
  • Go to Twitter Application Management and log in with your Twitter account 
  • Click the “Create New App” button 
  • Fill in the required fields, then read and agree to the Twitter Developer Agreement 
  • Submit the form 

Below you will find the full code. Add your credentials and the screen name of the account whose followers you want to extract (the screen_name argument in the tweepy.Cursor call). The script saves the CSV file with today's date in the filename, which makes it easier to keep exports apart for further analysis.


import tweepy
import pandas as pd
from datetime import datetime


def lookup_user_list(user_id_list, api):
    """Hydrate a list of user IDs into full user objects, 100 at a time."""
    full_users = []
    users_count = len(user_id_list)
    try:
        for i in range((users_count // 100) + 1):
            print(i)
            batch_ids = user_id_list[i * 100:min((i + 1) * 100, users_count)]
            if not batch_ids:  # skip the empty tail batch when the count is a multiple of 100
                continue
            full_users.extend(api.lookup_users(user_ids=batch_ids))
        return full_users
    except tweepy.TweepError:
        print('Something went wrong, quitting...')


consumer_key = ''         # keep the quotes, replace this with your consumer key
consumer_secret = ''      # keep the quotes, replace this with your consumer secret key
access_token = ''         # keep the quotes, replace this with your access token
access_token_secret = ''  # keep the quotes, replace this with your access token secret

auth = tweepy.OAuthHandler(consumer_key, consumer_secret)
auth.set_access_token(access_token, access_token_secret)
api = tweepy.API(auth, wait_on_rate_limit=True, wait_on_rate_limit_notify=True)

# Collect the IDs of all followers, page by page.
ids = []
for page in tweepy.Cursor(api.followers_ids, screen_name="PiotrKrosniak").pages():
    ids.extend(page)

results = lookup_user_list(ids, api)

# Keep the fields we care about for each follower.
all_users = [{'id': user.id,
              'Name': user.name,
              'Statuses Count': user.statuses_count,
              'Friends Count': user.friends_count,
              'Screen Name': user.screen_name,
              'Followers Count': user.followers_count,
              'Location': user.location,
              'Language': user.lang,
              'Created at': user.created_at,
              'Time zone': user.time_zone,
              'Geo enabled': user.geo_enabled,
              'Description': user.description}
             for user in results]

df = pd.DataFrame(all_users)
date = datetime.now().strftime('%Y-%m-%d-')
df.to_csv(date + 'followers.csv', index=False, encoding='utf-8')
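The lookup_user_list helper batches IDs in groups of 100 because Twitter's users/lookup endpoint accepts at most 100 IDs per request. The slicing logic can be sketched in isolation (batch is a standalone illustration, not part of the script above):

```python
def batch(ids, size=100):
    """Yield successive slices of at most `size` items from a list of IDs."""
    for i in range((len(ids) // size) + 1):
        chunk = ids[i * size:min((i + 1) * size, len(ids))]
        if chunk:  # skip the empty tail when len(ids) is a multiple of size
            yield chunk

ids = list(range(250))
print([len(b) for b in batch(ids)])  # → [100, 100, 50]
```

With 250 follower IDs, the script therefore makes three lookup calls: two full batches of 100 and one of 50.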

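Once the script has run, the dated CSV can be pulled straight back into pandas for analysis. The rows below are made-up sample data standing in for a real export, and the filename is hypothetical:

```python
import pandas as pd

# Made-up sample rows, in the same shape the script writes out.
sample = pd.DataFrame([
    {'Screen Name': 'alice', 'Followers Count': 120, 'Location': 'Berlin'},
    {'Screen Name': 'bob',   'Followers Count': 340, 'Location': 'Warsaw'},
    {'Screen Name': 'carol', 'Followers Count': 75,  'Location': 'London'},
])
sample.to_csv('followers-sample.csv', index=False, encoding='utf-8')

# Load the export and rank followers by reach.
followers = pd.read_csv('followers-sample.csv')
top = followers.sort_values('Followers Count', ascending=False)
print(top['Screen Name'].tolist())  # → ['bob', 'alice', 'carol']
```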