Lecture 22

These are the notes on make.

Math 481/581 Lecture 22: Using Make and Compiling GNU Software

© 1998 by Mark Hays <hays@math.arizona.edu>. All rights reserved.

Today we'll cover the make(1) utility and describe how to compile GNU/FSF packages.


Now that we've seen the Bourne shell and covered how compilers work on UNIX systems, we can immediately see a possible way to combine the two things. If we have a directory containing a bunch of FORTRAN files that need to get compiled into an executable program, we can write a shell script to do the job for us:
    #!/bin/sh
    # buildit -- script to build my code

    # name of program to build
    PROG=myprog

    # FORTRAN compiler to use
    FC=f77

    # compilation flags to use
    FCOPTS="-O"

    # extra libs to link against
    LIBS=""

    # compile each file
    for FILE in *.f; do
        $FC $FCOPTS -c $FILE
    done

    # link it
    $FC -o $PROG *.o $LIBS

    # done!
    exit 0

If you have a small number of files (so it doesn't take too long to compile everything, every time), this approach works fine.

If your project is a little more complicated and you don't want to compile every single file every time, the shell script approach may annoy you. Fortunately, there is another way which we'll get to in a moment.
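To get a feel for the idea, here is a rough sketch of selective recompilation in plain shell. It only reports which files would be rebuilt, and it assumes your test(1) supports the -nt ("newer than") operator, which is common but not strictly guaranteed on every old Bourne shell:

```shell
# report which .f files would need recompiling: those whose
# object file is missing, or older than the source file
for FILE in *.f; do
    OBJ=`basename "$FILE" .f`.o
    if [ ! -f "$OBJ" ] || [ "$FILE" -nt "$OBJ" ]; then
        echo "would recompile $FILE"
    fi
done
```

This is exactly the timestamp comparison that make(1) performs for you, for free, with far less typing.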

As a concrete example, we'll consider a small LaTeX project consisting of a main LaTeX file, an included LaTeX file, and a PostScript figure generated from a data file by our old friend, gnuplot.
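A Makefile for such a project might look like the following sketch. All of the file names here (paper.tex, body.tex, fig.ps, fig.dat, plot.gp) are made up for illustration; note that the command lines in a Makefile must begin with a TAB character, not spaces:

```make
# hypothetical Makefile for a small LaTeX project
paper.dvi: paper.tex body.tex fig.ps
	latex paper.tex

fig.ps: fig.dat plot.gp
	gnuplot plot.gp

clean:
	rm -f paper.dvi paper.aux paper.log fig.ps
```

Typing "make" rebuilds paper.dvi, but only the pieces whose inputs have actually changed: editing fig.dat reruns gnuplot and LaTeX, while editing body.tex reruns LaTeX alone.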

Enter make(1)

Let's head over to the SWIG page for a couple of discussions on make(1):


Installing GNU Software

The Free Software Foundation (FSF) maintains a large number of freely available software packages ranging from programming tools (compilers, debuggers, etc.) to user utilities (less, ispell, emacs, etc.) to games (GNU chess).

The FSF's primary effort is called GNU, which stands for "GNU's Not UNIX" --- the name "UNIX" is a trademark, originally held by AT&T. The ultimate goal of the GNU project is to create a fully functional, freely available UNIX-like operating system (much like Linux and FreeBSD).

One nice thing about GNU software is that it is easy to install. It usually takes four commands to install most GNU packages. Below, we'll see how to install a GNU package into your very own account.

As an example, I'll assume that you are on an SGI machine and want to use make(1). SGI's version of make(1) is totally, hopelessly, completely broken. I cannot offer a reason for this: the source code for make(1) is freely available and so there is no reason to screw it up. But SGI managed to do it.

The best bet on an SGI is to obtain and install GNU make(1). One of the primary GNU FTP sites is


The source code for GNU make(1) will be a file called make-<version>.tar.gz. As of 17 Nov 1998, the latest version of GNU make(1) is 3.77.

The steps to getting the package installed are:

   1. download the .tar.gz source archive,
   2. unpack the source code,
   3. configure the package with the configure script, and
   4. compile and install it with make.

Each of these steps is described below.

The .tar.gz file is a compressed archive of the source code for the package. Specifically, the archive is created with the tar program ("tar" stands for "tape archive") and is then compressed with the GNU gzip file compression program. To unpack the source code, you'll need to uncompress the file you downloaded and extract the files from the resulting archive.

What is an archive? An archive is simply a bunch of files stuffed into a single large file in some format that lets you get the bunch of files back again; in other words, an archive lets you carry around a whole bunch of files in a single suitcase.

Creating your own archives is very handy for making backup copies and snapshots of your own works in progress. For example, to make a backup of your $HOME/diss directory (and all of its contents recursively), type:

   cd $HOME
   tar cf diss-MMDDYY.tar diss/
You'll end up with a big diss-MMDDYY.tar that contains a snapshot of your dissertation. You can FTP this file all over the place so that you sleep well at night (well, as well as anyone writing a dissertation sleeps). A tar file is a binary file, so be sure to FTP it in binary mode!

If you goof something up or otherwise need to get a file or two back from your snapshot, you can unpack the thing by doing:

   cd $HOME
   mkdir tmp
   cd tmp
   tar xf /path/to/diss-MMDDYY.tar
This will create the subdirectory $HOME/tmp/diss/ that contains all the files in the archive. DO NOT unpack the tarfile in $HOME (unless you mean to)! If you do, existing files in your diss/ directory will be overwritten by the old versions!
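You can also list an archive's contents, or pull out a single file, without unpacking the whole thing (the file name chapter1.tex is made up for illustration; note that you must give the path exactly as it is stored in the archive):

```shell
# list the contents of the archive
tar tf /path/to/diss-MMDDYY.tar

# extract a single file, using its path as stored in the archive
tar xf /path/to/diss-MMDDYY.tar diss/chapter1.tex
```

Running "tar tf" first is a good habit in general: it tells you whether the archive unpacks into its own subdirectory or will scatter files into the current one.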

What is data compression? Data compression is a Very Interesting Thing. Basically, you want to take a file and somehow make it smaller (in a reversible way, of course). By doing this, you reduce the amount of disk space used by the file, as well as reducing the download time over the network. There are a number of data compression algorithms in use.

It is not possible to compress arbitrary data streams (for example, streams of random numbers). However, most data isn't arbitrary. English prose contains about 1.2 bits of information per 8 bit byte. This is due to regularities like the fact that nearly every "q" is followed by a "u", etc. If we could fully exploit this fact, we could theoretically reduce the amount of storage required by a factor of 8/1.2, or about 6.7.

UNIX systems come equipped with programs called compress and uncompress that use the Lempel-Ziv-Welch data compression algorithm to do their thing. Unfortunately, there are patent issues with this algorithm, and so the GNU people came up with their own compression algorithm that is embodied in the gzip program. You should know that gzip usually compresses files better than compress and that it can also decompress files produced with compress.
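As a quick illustration with a throwaway file: gzip replaces the original file with a .gz version, and gzip -d reverses the process:

```shell
# make a small, highly redundant test file
printf 'hello hello hello hello\n' > demo.txt

gzip demo.txt        # produces demo.txt.gz and removes demo.txt
gzip -d demo.txt.gz  # restores demo.txt and removes demo.txt.gz
```

The same -d flag works on .Z files produced by compress, which is why you'll sometimes see gzip used as a drop-in replacement for uncompress.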

The reason the FSF is concerned about patent issues is that all GNU software is to be freely available and redistributable. In addition, source code must be available for all GNU packages. Within fairly broad limits, you can do whatever you want with GNU source code and binaries. Anyway, the GNU free software philosophy prevents the use of patented algorithms that require licensing (which costs money and generally prohibits redistribution), nondisclosure agreements (which prevent source code from being (re)distributed), and so on.

Unpacking the Source Code

On many systems, unpacking the source code is as simple as:
   gzip -dc < make-3.77.tar.gz | tar xf -
This will create a subdirectory called make-3.77 that contains all of the source code for GNU make.
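If the tar on your system happens to be GNU tar, it can run gzip itself via the z flag, collapsing the pipeline into one command. This is a GNU extension, so don't count on it with a vendor tar:

```shell
tar zxf make-3.77.tar.gz
```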

The only wrinkle you might hit here is that the gzip program may not be installed. You can find out by typing

   gzip -V
If you get back something like "gzip: command not found", you'll need to obtain and install gzip before doing anything else. The fact is that some GNU packages require other GNU packages in order to function (or install, or even unpack!). Normally any such dependencies are described in the README or INSTALL files distributed with the source code. Of course, you need to be able to unpack the source code before you can read these files!

Almost all GNU source code is distributed in gzip format, so you'll definitely need the gzip program before doing anything. Fortunately, installing it is easy. Assuming you are installing things in your home directory, you can bootstrap yourself as follows: first, get the gzip source code from the above site --- the current version as of 17 Nov 1998 is 1.2.4. You can get gzip up and running with the following steps:

   cd $HOME
   mkdir bin info lib man man/man1
   tar xf gzip-1.2.4.tar
   cd gzip-1.2.4
   sh configure --prefix=$HOME
   make install
You'll want to modify your startup files and add $HOME/bin to your PATH variable. Once the command gzip -V works (it should print out the version number of gzip), you can proceed to unpack the source for GNU make(1) as described above. The meaning of the above steps is described below.
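For a Bourne-style login shell, the addition to your $HOME/.profile might look like this (csh users would adjust the path variable in .cshrc instead):

```shell
# put our private bin directory at the front of the search path
PATH=$HOME/bin:$PATH
export PATH
```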

Configuring the Package

Configuring most packages is also simple. All GNU software is configured via a Bourne shell script called configure. This script is distributed with each package.

The configure script serves two purposes. First, it lets you specify installation options such as the location in which the package is to be installed. When you run this script, the second function is invoked: it gropes your system to figure out things like the location of your C compiler, etc., which is nice because it saves you from having to know these gory details.

The configure script has one option you definitely need to know about: --prefix=<path>. This option specifies the installation prefix for the package. If you set the prefix to $HOME, the executables will be placed in $HOME/bin, private files and libraries will go into $HOME/lib, and the manpages will go into $HOME/man when you install the package.

For most GNU packages, configuration can be achieved by simply typing:

   sh configure --prefix=$HOME
   [ ... loads of diagnostic output ... ]
This will configure the package and set it up to be installed in your account.

Some GNU packages support various options -- these are normally detailed in the README or INSTALL files that accompany the source code. You can also get a list of options supported by typing:

   sh configure --help

Compiling and Installing the Package

Compiling and installing the package are easy:
   make              # build everything
   make install      # install it into $prefix
Now all you need to do is add $prefix/bin to your PATH and you'll be using GNU make. If you want access to the accompanying manpages, you'll also need to add $prefix/man to your MANPATH environment variable.
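The MANPATH addition to a .profile might look like the following sketch. Exactly how man(1) behaves when MANPATH was previously unset varies between systems, so you may need to list the system manpage directories explicitly:

```shell
# make the newly installed manpages visible to man(1)
MANPATH=$HOME/man:$MANPATH
export MANPATH
```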

GNU Documentation

The GNU project distributes its documentation in what is called "texinfo" format. If you have emacs, TeX, and the GNU texinfo package installed (along with any other necessary prerequisites), you can process and read GNU documentation.

I will not get into the religious and technical reasons of why I think this is a horrid abomination, but I will be happy to discuss it with anyone outside of class.

The following paragraph is now attached to the top of a number of FSF manpages:

       If we find that the things in this man page that  are  out
       of date cause significant confusion or complaints, we will
       stop distributing the man page.  The alternative, updating
       the  man  page when we update the Info file, is impossible
       because the rest of the work of maintaining GNU CC  leaves
       us no time for that.  The GNU project regards man pages as
       obsolete and should not let them take time away from other

I think the following would be a nice addition to the FSF offering: come up with a package (available on the same terms as other GNU software) that converts GNU texinfo and info files into plain ASCII text, nroff (manpage) format, and HTML (or SGML). Converting info files into perl POD format (which is a Real Simple format) should do the trick because you could then make use of the existing pod2text, pod2man, and pod2html scripts that are distributed with perl.
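For the curious, POD really is that simple. A manpage-style document in POD looks something like this (a made-up example; pod2text and pod2man would render it to plain text and nroff respectively):

```
=head1 NAME

frobnicate - a hypothetical utility

=head1 DESCRIPTION

Plain paragraphs become body text; =head1 lines become
section headings, and indented lines are shown verbatim.

=cut
```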