Do you follow hundreds or even thousands of accounts on Twitter? Do you find yourself missing important announcement tweets from Google Search accounts? Were you late to know about the latest core update? No more! In this tutorial I’m going to show, using Python, how to create a very simple Twitter alert system. We’re going to use the Python module Advertools, created by Elias Dabbas, to connect to the Twitter API and scan the current day’s posts of specific accounts for keywords you are interested in. If a match is found you can do any number of things, but I’ll show you how to either send an email alert or an SMS. Never miss a core update announcement again!
Requirements and Assumptions
- Python 3 is installed and basic Python syntax understood.
- Access to a Linux installation (I recommend Ubuntu) or Google Colab.
- Twitter Developer account with API access
- Gmail account for email with yagmail configured or Textbelt account for SMS
Note: If you are copying and pasting the code into your own script, double-check the indents. Python is very sensitive to indentation, and WordPress or copy-pasting often doesn't preserve indents well.
Install Python Modules
First, we need to install the advertools and yagmail modules. In your terminal type the following. If in Google Colab, add an exclamation mark at the very beginning of the code.
pip3 install advertools
pip3 install yagmail
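For example, in a Google Colab cell the same commands would look like this:

!pip3 install advertools
!pip3 install yagmail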
Now let’s import the modules we’re going to use.
- advertools: For the Twitter API connection
- pandas: To handle the dataframe response from advertools
- datetime: To grab today’s date for the Twitter date range
- yagmail: To send the email alert
- requests: To call the Textbelt API if you choose the SMS alert
import pandas as pd
import advertools as adv
import yagmail
import requests
from datetime import date
Twitter API
To use the Twitter API you need to have registered a Twitter Developer account. Create a developer project and request API access. It can take a few days for them to accept you. When you are confirmed, head to your project and click on the "Keys and tokens" menu. There you will find your key, secret, and tokens. Fill them into the code below.
auth_params = {
    'app_key': '',
    'app_secret': '',
    'oauth_token': '',
    'oauth_token_secret': '',
}
Create Send Function
Now is a good time to create our alert function. This is in case you want to monitor more than one account; I'll highlight the areas you'll need to alter later if you do. The function accepts two parameters: the alert message (the Twitter handle/s you're monitoring plus the tweets that were detected) and how you want to be alerted. If you are monitoring more than two accounts, you may not want to enable SMS, as the message may get long.
Method 1 is to send an email alert using the yagmail module. It does require a small amount of configuration; be sure to follow the directions, it's easy (a one-time way to store your credentials is sketched after the code below). Be sure to replace YOUR_GMAIL_ADDRESS and TO_EMAIL_ADDRESS with your own accounts. For diagnostics, we keep a print statement to confirm the email was sent.
def send_alert(emailmessage, method):
    if method == 1:
        body = emailmessage
        yag = yagmail.SMTP("YOUR_GMAIL_ADDRESS")
        yag.send(
            to="TO_EMAIL_ADDRESS",
            subject="Twitter Alert",
            contents=body)
        print("Email sent")
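If you'd rather not keep your password in a config file, one option (a minimal sketch, assuming you use a Gmail app password) is to let yagmail store the credentials in your system keyring with a one-time register call, run once in a separate Python session:

import yagmail

# One-time setup (assumption: you have generated a Gmail app password).
# After this, yagmail.SMTP("YOUR_GMAIL_ADDRESS") can authenticate without a password argument.
yagmail.register("YOUR_GMAIL_ADDRESS", "YOUR_APP_PASSWORD")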
Method 2 will fire off an SMS message to your phone. In this example I'm using the Textbelt service. It's very easy and cheap; $10 for 1,000 texts should last you quite a while. Be sure to add your phone number and key below.
    if method == 0:
        resp = requests.post('https://textbelt.com/text', {
            'phone': '',
            # The same assembled alert text is sent as the SMS body
            'message': emailmessage,
            'key': '',
        })
        print(resp.json())
Initiate Advertools
The alert function is done, so let's move back to the beginning. First, we grab today's date in the default yyyy-mm-dd format. We plan on running this script once a day, so we only want to detect tweets made today. Once we have the date, we pass the authentication parameters you set up earlier to the advertools Twitter module.
today = str(date.today())
adv.twitter.set_auth_params(**auth_params)
Setup Monitor List and Storage Dataframe
Next, we create the dictionary that holds the Twitter handle and keywords (use regex for multiple keywords) for each account we want to monitor, along with an empty dataframe we'll use to store the results if a match is found. A sketch of adding another account follows the code.
alerts_list = {"searchliaison": "core|update|algorithm", "googlesearchc": "search\sconsole"}
df1 = pd.DataFrame(columns=['TwitterHandle', 'Message'])
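As an example of extending the watchlist, here is how you might add a third account; the handle and keyword pattern below are placeholders, not part of the original script:

# The pipe character acts as an OR in the regex, so any of these keywords triggers a match
alerts_list["some_other_handle"] = "lighthouse|pagespeed|indexing"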
Grab Twitter Timelines and Filter
Time to loop over the dictionary of accounts we created above. We use the advertools get_user_timeline() function to grab the timeline of each account.
for keys, values in alerts_list.items():
    df = adv.twitter.get_user_timeline(screen_name=keys, tweet_mode="extended")
The tweet_created_at column needs to be converted to a string, as it's returned as a datetime object. Once we have the timeline for an account, we filter it to keep only the tweets that contain the keywords we're looking for and that were posted today.
    df['tweet_created_at'] = df['tweet_created_at'].astype('string')
    df = df[df['tweet_full_text'].str.contains(values, regex=True) & df['tweet_created_at'].str.contains(today)]
Lastly, if after filtering we have a dataframe with at least one row, we add that row to a master dataframe that stores all the matched tweets across every account you are tracking, then loop back to the top to process the next handle and keywords.
    if len(df.index) > 0:
        # Join the matched tweet texts into a single string so they can be concatenated into the alert later
        df1 = df1.append({'TwitterHandle': keys, 'Message': "\n".join(df['tweet_full_text'])}, ignore_index=True)
Build Alert and Send Message
We have all our matched tweets in df1. We want to convert these to a dictionary using zip() so we can format them into a message we can send.
getlist = dict(zip(df1['TwitterHandle'].tolist(), df1['Message'].tolist()))
emailmessage = ""
Finally, we check whether the master dataframe has any matched records. If not, there is nothing to do today and no alert is sent. If there are rows in the dataframe, we loop over them and concatenate them into a message we can send. We also have two methods to send it: if method is set to 1, it sends an email; if 0, it sends an SMS. Only use SMS if you are tracking one or two accounts.
if len(df1.index) > 0:
    for key, value in getlist.items():
        emailmessage += key + ": " + value + "\n\n"
    method = 1
    send_alert(emailmessage, method)
Automating the System
The next task is to automate the script; running it manually doesn't do you much good, since you might as well just go check the accounts yourself. If you are running this locally, you can proceed below. If you are running in Google Colab, you'll need to transfer the script to Google Cloud Functions and use Google Cloud Scheduler.
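For the Google Cloud Functions route, the rough idea (a sketch only; run_alerts() is an assumed helper that wraps the timeline, filter, and send logic above, not something advertools provides) is an HTTP-triggered entry point that Cloud Scheduler hits on a schedule:

# main.py for an HTTP-triggered Cloud Function
def check_tweets(request):
    # run_alerts() is assumed to wrap the logic from this tutorial
    run_alerts()
    return "Alert check complete"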
Luckily, Linux already supplies us with a solution by using the crontab. The crontab stores entries of scripts where you can dictate when to execute them (like a scheduler). You have lots of flexibility with how you schedule your script (any time of day, day of the week, day of the month, etc.). To add entries to the crontab, run this command:
crontab -e
It will likely open the crontab file in the vi editor. On a blank line at the bottom of the file, type the code below. This entry will run the script at midnight every Sunday; to change the schedule, use an online cronjob time editor. Customize with your path to the script.
0 0 * * SUN /usr/bin/python3 PATH_TO_SCRIPT/filename.py
If you want to create a log file to record each time the script ran, you can use this instead. Customize with your path to the script.
0 0 * * SUN /usr/bin/python3 PATH_TO_SCRIPT/filename.py > PATH_TO_FILE/FILENAME.log 2>&1
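Since the script only looks at tweets posted today, you may prefer a daily schedule instead of a weekly one. For example, this variation runs the script every morning at 8 AM:

0 8 * * * /usr/bin/python3 PATH_TO_SCRIPT/filename.py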
Save the crontab file and you're good to go! Just note that your computer needs to be on at the time the cronjob is set to run.
Conclusion
And there you have it. You have the framework to start monitoring your favorite accounts for keywords you care about. You can extend this in so many ways by adding more alert methods, more accounts, more keywords and so much more. No longer will you miss a core update announcement or a new Search Console feature! Please follow me on Twitter for feedback and to showcase interesting ways to extend the script. Enjoy!