CGI Scripting Tips for Bash or SH

Introduction

Background

You might recognize the situation: the deadline's just around the corner and you need to get some networked interprocess communication going within a very limited environment. If you're lucky enough to still be able to get your favourite scripting language like PHP up and running and build a fancy MySQL-backed XML-RPC bridge, go ahead - this article won't get in your way, but you might still find the information useful.

If the situation is a little different and you've only got access to some non-root accounts on a server you don't own, you might need to fall back on a few basic building blocks to implement the solution. This article assumes you don't have access to a web scripting language like PHP, a framework like Ruby on Rails, or even an up-to-date C compiler like gcc.

In this article we'll build a remote server monitoring tool that gathers uptime and load average data on a single database server. Impossible without big tools? No, in fact it's really easy to do.

Getting started

The bare necessities

So, with all the stuff you shouldn't have, what should you have? I'll give tips using a combination of a Bash or SH shell, the MySQL mysql command-line client and a bit of grep and sed magic. You'll also inevitably need access to a webserver path you can place CGI-scripts in, which might be the trickiest part.

I've sketched quite a bare landscape for creating an XML-RPC like bridge, but don't get desperate just yet: using these few tools you can still create something useful.

More "requirements"

You'll need a lightweight console text editor. Well, you don't really need it, but it sure beats echoing line after line to a file. So make sure you've got a spare vi or emacs installed somewhere. The choice of editor is ultimately yours.

After you've seated yourself a bit more comfortably, let's go ahead and look at the tips I've got for you.

Oh, and one more thing...

I'll assume you're familiar with the MySQL database server. For the server monitoring application we're about to build, you'll have to create a database containing a table with this structure:

CREATE TABLE requests(
  requestdate DATETIME NOT NULL,
  id INT NOT NULL,
  ip VARCHAR(255) NOT NULL,
  information VARCHAR(255) NOT NULL,
  PRIMARY KEY(requestdate, id)
);

The requestdate field will contain the date and time at which the request was offered to the server; this allows you to determine whether a certain host hasn't responded in the past hours. The id field contains a unique number per monitored computer. The ip field contains the IP address from which the request originated and, lastly, the information field will contain the uptime and load averages for each request.
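
In case you still need to create the database and a MySQL account for it, a sketch along these lines should get you going. The names and password are just examples that happen to match the settings used later in this article, and the exact GRANT syntax may differ per MySQL version:

mysql -u root -p -e "CREATE DATABASE monitor"
mysql -u root -p -e "GRANT SELECT, INSERT ON monitor.* TO 'monitor'@'localhost' IDENTIFIED BY 'supersecret'"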

Got the table up and running? Great, let's go.

Maker, Meet Your Tools

CGI-scripting 101, the things to know

There are three basic things you'll need to know whenever you're going to write a CGI-script using this article as a guide. Summarized, you should keep these three points in mind:

  • HTTP headers: no-one is going to hold your hand when writing plain CGI-scripts, so you'll need to give your webserver a clue about what content to expect. Failure to do so will result in 500 - Internal Server Error responses.
  • GET requests: try to use GET requests, which send parameter information to your script as part of the requested URL. That information is available to your script in the QUERY_STRING environment variable.
  • Remote address: sometimes you'll want to check or use the remote address of the client machine. This address is stored in the environment variable REMOTE_ADDR; the debugging sketch after this list shows both variables in action.
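
To get a feeling for the last two points, here's a minimal sketch of a debugging CGI-script that simply echoes both environment variables back to the client. Note the two header lines at the top, more about those in a minute:

#!/bin/bash
# Debugging CGI-script: show the raw environment variables
# the webserver hands to us.
echo "Content-type: text/html"
echo ""
echo "QUERY_STRING: $QUERY_STRING<br />"
echo "REMOTE_ADDR: $REMOTE_ADDR<br />"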

The only one of these three points that needs a little more explanation is the first one, about sending the HTTP headers. I wasn't completely telling the truth there: you don't really have to take care of all the headers yourself, you just need to tell the webserver you want to send HTML data back to the client. As long as the top two lines of your output read:

Content-type: text/html

You'll be fine. And yes, the second line is indeed blank.

Basic script header

For all tasks involving CGI-scripting you'll probably want your script to start like this:

#!/bin/bash
#
# CGI-script, (c) Copyright
#
# Start with outputting the HTTP headers.
#

echo "Content-type: text/html"
echo ""

#
# Start HTML content.
#

You basically need to send the HTTP headers in each of your scripts, so you might as well use a template that already contains them. It also sets the premise for the rest of your script: you only have to provide HTML content after the comment that says so.

The first line in the example above indicates which interpreter should be used to execute your script.

Parsing GET parameters

What might seem difficult at first is parsing the GET parameters. In a web scripting language like PHP you can use a simple reference to $_GET['parameter'] to retrieve the contents. In a plain Bash script you'll have to rely on external tools like grep to parse the query string the webserver hands you.

You won't get far using grep in its default line matching mode, so we'll use the extended regular expression mode, which grep offers via its -E option and which is also aliased under the command egrep.

For the server monitoring example, we're going to send two parameters via the GET request: an identification-number called id and the uptime information in the info parameter. Parsing these two parameters can be done like this:

ID=`echo "$QUERY_STRING" | grep -oE "(^|[?&])id=[0-9]+" | cut -f 2 -d "=" | head -n1`
INFO=`echo "$QUERY_STRING" | grep -oE "(^|[?&])info=[^&]+" | sed "s/%20/ /g" | cut -f 2 -d "="`

That doesn't look too hard, does it? What happens is simple: the id gets matched as a repetition of digits, and the info gets matched up to the first ampersand character. After the match, both parameter names get chopped off using the cut command.

By copying and adapting similar lines you can easily retrieve other and more GET parameters. Notice that the sed call replaces URL-encoded spaces, %20, with normal ASCII space characters.
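
You can even try these lines without a webserver by setting QUERY_STRING by hand. This little test - the values are made up, of course - should print ID=13 INFO=up 42 days:

# Simulate a request by setting the query string manually.
QUERY_STRING="id=13&info=up%2042%20days"
ID=`echo "$QUERY_STRING" | grep -oE "(^|[?&])id=[0-9]+" | cut -f 2 -d "=" | head -n1`
INFO=`echo "$QUERY_STRING" | grep -oE "(^|[?&])info=[^&]+" | sed "s/%20/ /g" | cut -f 2 -d "="`
echo "ID=$ID INFO=$INFO"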

Executing MySQL SQL statements

The last piece of the puzzle is connecting to the MySQL database and storing the request information in the desired table. We'll define the connection data inside the script, using four appropriately named variables, like this:

#
# MySQL settings
#
HOST=127.0.0.1
USER=monitor
PASS=supersecret
DB=monitor

Once these variables are set, we can use the command line utility mysql with the correct parameters to execute an arbitrary query:

mysql -u$USER -h$HOST --password=$PASS -e "INSERT INTO requests(requestdate, id, ip, information) VALUES(NOW(), '$ID', '$REMOTE_ADDR', '$INFO')" $DB

I hope that wasn't too hard; all of these parameters are fully documented in the mysql manual page. Do note that the parameter values end up in the SQL statement as-is, which is why the full script below strips single quotes from the info value before inserting it.
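
The same one-liner approach works for reading data back, which comes in handy when you want to check that the inserts actually arrive. For example, with the same four variables set:

mysql -u$USER -h$HOST --password=$PASS -e "SELECT * FROM requests ORDER BY requestdate DESC LIMIT 5" $DB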

Putting It Together

The serverside CGI-script

Putting all the previous tricks together, I present the serverside CGI-script for our monitoring solution. The comments, alongside the information I've already given you, should make this script easy to understand:

#!/bin/bash

# Interval Monitoring CGI-script
# (c) 2005 - 2009, Frank Schoep, Forever For Now

# MySQL settings
HOST=127.0.0.1
USER=monitor
PASS=supersecret
DB=monitor

# Parse the id and info parameters from the query string.
ID=`echo "$QUERY_STRING" | grep -oE "(^|[?&])id=[0-9]+" | cut -f 2 -d "=" | head -n1`
INFO=`echo "$QUERY_STRING" | grep -oE "(^|[?&])info=[^&]+" | sed "s/%20/ /g" | cut -f 2 -d "="`
# Strip single quotes so the value can't break the INSERT statement below.
INFO=`echo "$INFO" | sed "s/'//g"`
IP=$REMOTE_ADDR

# The HTTP headers and HTML output start here
echo "Content-type: text/html"
echo ""

echo "<html><head><title>Interval Monitoring</title></head><body><h1>Interval Monitoring</h1><pre>"
echo "ID: <em>$ID</em><br />"
echo "IP: <em>$IP</em><br />"
echo "Information: <em>$INFO</em><br />"

# Store the request in the database.
mysql -u$USER -h$HOST --password=$PASS -e "INSERT INTO requests(requestdate, id, ip, information) VALUES(NOW(), '$ID', '$IP', '$INFO')" $DB

echo "</pre></body></html>"

You can copy-and-paste this script if you'd like to.

Testing the script

You've made it this far, so you're probably wondering whether the thing actually works. Well, go ahead: surf to the place where you saved the CGI-script and add an id and an info parameter, for example:

http://www.yourserver.com/cgi-bin/monitor.sh?id=13&info=Testing

If all goes well, you should see the information you entered on the webpage you requested, and it should also have been inserted into the database. If it didn't work, check the URL you entered, check whether the script is executable (use chmod +x) and make sure your database settings are correct.
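
A quick way to rule out webserver trouble is to run the script directly from a shell with the two environment variables set by hand. The path below is just an example, so adjust it to your situation:

chmod +x /path/to/cgi-bin/monitor.sh
QUERY_STRING="id=13&info=Testing" REMOTE_ADDR=127.0.0.1 /path/to/cgi-bin/monitor.sh

If the HTML output appears and the row shows up in the requests table, any remaining problems are on the webserver side.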

Once the script works, you're all set to configure the systems you want monitored.

"Installing the client"

Setting up UNIX clients for this system is simple. We'll use wget in a crontab to send information to the server. It's important that you assign each monitored computer a unique id to send to the server; make sure you've got no duplicates, or your database won't be as useful.

Using wget is simple, try the following command on a UNIX system:

wget "http://www.yourserver.com/cgi-bin/monitor.sh?id=13&info=`uptime`"

Notice the backticks around uptime: they let us send the string output by the uptime command to the server machine. You can use any other command as well, so if you want to monitor disk usage, feel free to write a one-liner as a frontend for df, like the sketch below.
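
As a sketch of that idea, something like this reports root filesystem usage instead of the uptime. Keep in mind that the serverside script only decodes %20, so keep the reported string simple:

wget -q -O /dev/null "http://www.yourserver.com/cgi-bin/monitor.sh?id=13&info=`df -P / | tail -n 1`"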

Getting this command to run periodically can be done using crontab. Start the editor using:

crontab -e

Next, you can insert a line into the crontab which looks like this one:

0 * * * * wget -q -O /dev/null "http://www.yourserver.com/cgi-bin/monitor.sh?id=13&info=`uptime`"

This makes sure wget is executed every hour, on the hour. The -q option for wget suppresses status messages and the -O /dev/null option makes sure no output gets saved to disk. Save the crontab and enjoy your well deserved rest: everything should be in place now.

What's next?

You've now got a lot of machines reporting in to the server, sending their uptime and load averages. What you do with this data is up to you. You could write a script that checks whether machines have checked in during the past hours, or monitor the load for certain threshold values.
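
As a starting point, a query listing the machines that haven't checked in for the past two hours could look like this sketch, using the example connection settings from earlier:

mysql -umonitor -h127.0.0.1 --password=supersecret -e "SELECT id, MAX(requestdate) AS lastseen FROM requests GROUP BY id HAVING lastseen < NOW() - INTERVAL 2 HOUR" monitor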

The most important part of this article was showing that you can create advanced-looking things using simple tools. Especially the GET request parameter parser using grep may come in handy later on. I hope I've also given you some insight into the process of creating Bash CGI-scripts.

About this article

This article was added to the site on the 20th of December 2005. On the 1st of February 2009 I updated the scripts and added some extra information about wget thanks to helpful feedback from Ken Butcher.
