It’s Sunday night. You’re lying in bed. Do you know if your website is up?
In this intermediate tutorial, I’m going to show you how to create and automate a simple uptime monitor with email notifications using Python. For historical analysis later on, we’ll store the data in MySQL. In addition to detecting whether a site is up or down, we’ll also grab some quick speed data and perform basic TLS/SSL verification.
Note: Placeholders exist in some of the code below where you need to fill in the details for your environment.
Table of Contents
Requirements and Assumptions
- Access to a Linux installation (I recommend Ubuntu) and an understanding of basic terminal functions
- Python 3 is installed and basic Python syntax understood
- Access to a MySQL database, locally or externally
Importing Necessary Modules
Python modules are like libraries in other coding languages: collections of premade functions that save you time by not reinventing the wheel. Most of the Python modules we’re going to use should be preinstalled, but three that aren’t are fake_useragent, yagmail, and mysql.connector. Be sure to read the documentation on yagmail; there is a tiny bit of configuring, but it’s very easy. To install these, go to your command terminal and run these commands:
pip3 install mysql-connector-python
pip3 install fake_useragent
pip3 install yagmail
If you get any errors about other missing modules, you can install them the same way. Just replace the last part of the command with the name of the new module. Sometimes the package names aren’t obvious; you can search for them on PyPI.
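If you want to confirm the third-party modules installed correctly before running the script, a quick check like the one below works. Note that check_modules() is just an illustrative helper, not part of the tutorial script:

```python
import importlib

## Try importing each module by name; report which ones are missing
def check_modules(names):
    results = {}
    for name in names:
        try:
            importlib.import_module(name)
            results[name] = True
        except ImportError:
            results[name] = False
    return results

## Example: check_modules(["mysql.connector", "fake_useragent", "yagmail"])
```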
Creating MySQL Tables
First, we need to set up the database where we’ll store our monitoring data. There are two common ways to access and manage MySQL: via the command-line terminal and via the phpMyAdmin GUI in cPanel.
Option 1 – Command Line Terminal:
If you are without cPanel or most comfortable in the terminal, follow this guide for logging into MySQL to create the database and user. Then, run the SQL statements below to create the tables.
Option 2 – phpMyAdmin:
If you have access to cPanel, you can create the database and user in the MySQL Databases area. After that, head over to phpMyAdmin (also found in cPanel). Select your database from the list on the left side. In the SQL tab found at the top, enter the following SQL statement to create the table that will contain the websites you want to monitor. If you have this table already created from one of my earlier guides, you can reuse that table instead.
CREATE TABLE websites (
    websiteid int NOT NULL AUTO_INCREMENT,
    name varchar(255),
    url varchar(255),
    PRIMARY KEY (websiteid)
);
At this point, you’ll have an empty table for your websites (URLs). Naturally, you’ll want to populate this table with the URLs you want to monitor. I usually just focus on the homepage. If you created your table in phpMyAdmin, you can select it in the left side column and then select “Insert” at the top. Fill out that form for each website you want to monitor.
You can also insert website records via SQL as shown below (websiteid is auto-generated):
INSERT INTO websites (name,url) VALUES ("Rocket Clicks","https://www.rocketclicks.com")
Next, we’re ready to create the table for the monitoring data using the SQL statement below:
CREATE TABLE uptime (
    scanid int NOT NULL AUTO_INCREMENT,
    websiteid int,
    date varchar(255),
    time varchar(255),
    status_code int,
    speed float,
    cert_valid int,
    PRIMARY KEY (scanid)
);
Getting the Script Ready
Fire up your favorite code editor or IDE. I recommend PyCharm for more experienced coders or Thonny for beginners.
Place the code below on the first line of the file. It’s called a shebang (or hashbang) and tells Linux how to execute the file. It’s optional when you invoke the script with python3 explicitly, but it’s needed if the file is executed directly, as a cronjob can do (we’ll set one up later). This line tells Linux to run the file using Python 3.
#!/usr/bin/python3
First, let’s import the Python modules we’re going to use.
## Get today's date
from datetime import date
## Get today's time
from datetime import datetime
## For header info
import requests
## For time delay
import time
## To connect to mysql
import mysql.connector
## Generate random valid user agents for request
from fake_useragent import UserAgent
## For email notifications
import yagmail
From here we’re going to work backward a little as we create two functions. One to handle the website request and another to write to MySQL.
Let’s start with creating our uptime() function, which will request the website and record the interaction. This function takes in 3 variables: URL, name, and header information. We wrap the request in a try/except because we set the verify parameter of the request to True. This validates the TLS certificate and protects you from scanning malicious URLs. If you are certain of the security of the sites you’re scanning, you can set it to False. If verify is True and the URL’s certificate is invalid, an exception is raised and the script would stop; the try/except lets us record the event and continue after the error.
def uptime(url,name,header):
    try:
        response = requests.get(url,headers=header,verify=True)
    except:
        gettime = "n/a"
        status_code_num = "n/a"
        cert_valid = 1
        return status_code_num,gettime,cert_valid
Continuing inside the uptime() function, we grab the status_code from the response object we just created. We then test whether that status code was a successful 200 response. If it was, we grab the time it took to download the URL’s content (the source code) using the response’s elapsed property. If not a 200, we skip getting speed data.
    status_code_num = response.status_code
    cert_valid = 0
    if response.status_code == 200:
        gettime = round(response.elapsed.total_seconds(),2)
    else:
        gettime = 0
    return status_code_num,gettime,cert_valid
Now we create the function to store the data from the uptime() function. We can generate the time and date here which are important for tracking down the cause of the downtime. We build the SQL statement and execute it to insert the record.
def writetolog(websiteid,name,status_code_num,gettime,cert_valid):
    now = datetime.now()
    today = date.today()
    getnow = now.strftime('%H:%M')
    getdate = today.strftime('%m/%d/%Y')
    mydb = mysql.connector.connect(port="3306", host="HOSTIP", user="USER", password="PASSWORD", database="DATABASE")
    cursor = mydb.cursor()
    new_scan = "INSERT INTO uptime (websiteid,date,time,status_code,speed,cert_valid) VALUES ('" + str(websiteid) + "','" + getdate + "','" + getnow + "','" + str(status_code_num) + "','" + str(gettime) + "','" + str(cert_valid) + "')"
    print(new_scan)
    cursor.execute(new_scan)
    mydb.commit()
After writing the log record, let’s check whether the status code was not 200 or there is a certificate issue, and if so, send the notification that your site is down! Be sure to change YOUR_GMAIL_ADDRESS to the one you configured with yagmail earlier. Then replace TO_EMAIL_ADDRESS with the email address that should receive the notification. You can easily extend this with an if/else to alter the message depending on whether the site is down or has a certificate issue. Most times, with a severe certificate issue, the browser will alert the user and ask if they want to proceed.
    if status_code_num != 200 or cert_valid == 1:
        body = "Your website " + name + " is DOWN or has a certificate issue!\n\n"
        body += "Status code: " + str(status_code_num)
        yag = yagmail.SMTP("YOUR_GMAIL_ADDRESS")
        yag.send(
            to="TO_EMAIL_ADDRESS",
            subject=name + " is Down or there is a TLS certificate issue!",
            contents=body
        )
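As mentioned above, you can alter the message depending on whether the problem is downtime or a certificate failure. Here is a minimal sketch; build_alert() is a hypothetical helper, not part of the tutorial script, and its exact wording is up to you:

```python
## Hypothetical helper: build a tailored alert depending on the failure type.
## cert_valid == 1 means the TLS certificate failed validation.
def build_alert(name, status_code_num, cert_valid):
    if cert_valid == 1:
        subject = name + " has a TLS certificate issue!"
        body = "The TLS certificate for " + name + " failed validation.\n"
    elif status_code_num != 200:
        subject = name + " is DOWN!"
        body = "Your website " + name + " returned status " + str(status_code_num) + ".\n"
    else:
        ## Site is healthy: no alert needed
        subject = None
        body = None
    return subject, body
```

You would then call build_alert() before yag.send() and pass the returned subject and body through, skipping the send when subject is None.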
Here is where the script actually begins. You start by generating the website URL list from the database. Fill in your connection information for the mydb variable. You’ll see I create another database connection here. If anyone knows whether I can pass mydb and cursor to the writetolog() function, let me know! Moving on, we loop through our website records, grabbing the websiteid, name, and URL. I included a commented-out line where you can point the url variable at a URL that will fail, to test the downtime functionality.
mydb = mysql.connector.connect(port="3306", host="IP", user="USER", password="PASSWORD", database="DBNAME")
new_scan = "SELECT * FROM websites"
cursor = mydb.cursor()
cursor.execute(new_scan)
records = cursor.fetchall()

for row in records:
    websiteid = str(row[0])
    name = str(row[1])
    url = str(row[2])
    #url = "https://expired.badssl.com/"  ### For downtime testing
Next, we use fake_useragent’s UserAgent class to generate a Chrome user agent and pass the URL, name, and header into the uptime() function we created earlier. After the uptime() function runs, it returns the status code, speed, and certificate status. Then we pass all that information into the writetolog() function we already created, and the record is stored!
    ua = UserAgent()
    header = {'User-Agent': ua.chrome}
    status_code_num, gettime, cert_valid = uptime(url,name,header)
    writetolog(websiteid,name,status_code_num,gettime,cert_valid)

mydb.close()
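On the question raised earlier: yes, mysql.connector connections and cursors are ordinary Python objects, so you can pass the ones opened in the main script into writetolog() instead of reconnecting on every call. Here is a sketch of a modified writetolog() that accepts them and uses a parameterized query (the driver escapes the values for you); treat it as an illustration, not a drop-in replacement:

```python
from datetime import date, datetime

## Modified writetolog(): reuses the connection and cursor passed in
## rather than opening a new connection on every call.
def writetolog(mydb, cursor, websiteid, name, status_code_num, gettime, cert_valid):
    now = datetime.now()
    today = date.today()
    ## %s placeholders: mysql.connector binds and escapes the values
    sql = ("INSERT INTO uptime (websiteid,date,time,status_code,speed,cert_valid) "
           "VALUES (%s, %s, %s, %s, %s, %s)")
    values = (websiteid, today.strftime('%m/%d/%Y'), now.strftime('%H:%M'),
              str(status_code_num), str(gettime), str(cert_valid))
    cursor.execute(sql, values)
    mydb.commit()
```

In the main loop you would then call writetolog(mydb, cursor, websiteid, name, status_code_num, gettime, cert_valid) using the connection you already opened.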
Automating the Scan
If your uptime Python script is working well when you run it manually, it’s time to automate it. Luckily, Linux already supplies us with a solution: the crontab. The crontab stores entries of scripts and dictates when to execute them (like a scheduler). You have lots of flexibility with how you schedule your script (any time of day, day of the week, day of the month, etc.). To add entries to the crontab, run this command:
crontab -e
It will likely open the crontab file in the vi editor. On a blank line at the bottom of the file, type the line below. It will run the script at midnight every Sunday. To change the schedule, use a cronjob time editor such as crontab.guru. Customize with your path to the script.
0 0 * * SUN /usr/bin/python3 PATH_TO_SCRIPT/filename.py
If you want to create a log file to record each time the script ran, you can use this instead. Customize with your path to the script.
0 0 * * SUN /usr/bin/python3 PATH_TO_SCRIPT/filename.py > PATH_TO_FILE/FILENAME.log 2>&1
Save the crontab file and you’re good to go! Just note, your computer needs to be on at the time the cronjob is set to run.
Conclusion
So there you have it! You can rest easy knowing your websites are up and running, and if not, you’ll know about it. I even found an SMS module you can try to send your phone an SMS message. Naturally, the next step would be to tap into the database with another script or existing application to display or further analyze the data. Please follow me on Twitter for feedback and showcasing interesting ways to extend the script. Enjoy!
But wait, we’re not done! Coming soon, we’re going to extend this script with an LED monitor board and LCD screen using Raspberry Pi that lets you visually know if your site is up or down! Stay tuned!
Uptime Monitor FAQ
How can Python be used to create a website uptime monitor for SEO purposes?
Leverage Python scripts to implement a website uptime monitor, ensuring continuous monitoring of website availability for SEO optimization.
Are there specific Python libraries commonly used for website uptime monitoring?
Python offers various libraries such as Requests for making HTTP requests and BeautifulSoup for parsing HTML, which can be employed for website uptime monitoring.
What considerations should be taken into account when creating a website uptime monitor with Python?
Ensure robust error handling, consider the frequency of checks, and implement notifications to promptly address downtime, enhancing the effectiveness of the website uptime monitor.
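For example, robust error handling can include retrying a failed check before firing a notification, which avoids false alarms from a single transient network blip. A minimal sketch; check_with_retries() is illustrative and not part of the tutorial script:

```python
import time

## Retry a check a few times before declaring the site down.
## check_func is any callable that returns an HTTP status code.
def check_with_retries(check_func, retries=3, delay=5):
    for attempt in range(retries):
        status = check_func()
        if status == 200:
            return status
        ## Wait before the next attempt, except after the last one
        if attempt < retries - 1:
            time.sleep(delay)
    return status
```

You could wrap the tutorial's uptime() call in a small lambda and pass it in, only emailing when the final result is still not 200.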
Can Python scripts be scheduled to run periodically for automated website uptime checks?
Yes, Python scripts can be scheduled using tools like cron or task scheduler to run periodically, automating the website uptime monitoring process.
Where can I find a comprehensive guide or documentation for creating a website uptime monitor with Python?
Explore online tutorials, Python documentation, and SEO-focused resources for step-by-step guides and best practices in creating an effective website uptime monitor using Python.