The GNU Autotools are nothing short of an astounding engineering achievement. They can configure, compile and install software on anything from supercomputers all the way down to a SunOS 1.0 box from 1983, or thereabouts. They can even do shared libraries portably, which is considered to be among the darkest of magicks on many of these platforms.

This technological triumph does not change the fact that using Autotools is awful. The feeling of using Autotools is not totally unlike trying to open a safe by hitting it repeatedly with your head. Here are just some ways they cause pain to developers.

Complexity is standard

If I were to describe Autotools with just one word, it would be this:

Ununderstandable

It’s not that Autotools is hard; a lot of systems are. It’s not that it has a weird syntax; lots of programs have that too. It’s that every single thing about Autotools seems to be designed to be as indecipherable as possible.

Suppose that a developer new to Autotools opens up a configure.ac for the first time. He might find lines such as these, which I took from a real-world project:

AM_CONDITIONAL([HAVE_CHECK],[test "x$have_check" = xyes])

AM_INIT_AUTOMAKE([foreign dist-bzip2])
AM_MAINTAINER_MODE

Several questions immediately come to mind. Why are all function arguments quoted in brackets? What is AM_MAINTAINER_MODE? Why is it needed, since I am not the maintainer of Automake? What is “xyes”? A misspelling of “xeyes”, perhaps? Are the bracketed things arrays? Is space the splitting character? Why is the answer no in some places and yes in others?

The interested reader might dig into these questions and a week or so later have answers. Most people don’t. They just get aggravated, copy existing files and hope that they work. A case in point: Autotools’ documentation states clearly that AM_MAINTAINER_MODE should not be used, yet it is in almost every Autotools project. It survives as a vestigial organ much like the appendix, because no-one wants to understand the system. And, just like the appendix, it sometimes tries to kill its host with a sudden inflammation.
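For what it’s worth, the “xyes” part at least has a one-line answer, sketched here in plain shell (an illustration, not taken from any generated configure script). The bracketed argument is really just Bourne shell, and the leading “x” is a defensive habit: it keeps either operand of test from being empty or from looking like an option, which some ancient test(1) implementations mishandle.

if test "x$have_check" = xyes; then
  # "x" on both sides guards against $have_check being empty or starting with "-"
  echo "we have Check, enable the unit tests"
fi

In other words, it is ordinary shell code wearing an M4 costume. Good luck guessing that from the macro call.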

My estimate is that there are fewer than 20 people in the entire world who truly understand Autotools. That is one hell of a small pool for something as fundamental as the most used build system in the Free software world.

A legacy of legacy

Autotools work by generating a configure script compatible with the Bourne shell (the one from the seventies) and makefiles (also of the seventies variety). To do this they use a macro language called M4 (guess which decade that one is from). There is nothing inherently wrong with using tried and tested tools.

The big problem is that the designers did not do proper encapsulation. The source files to Autotools do not have one syntax; they have several, all mixed together. The “xyes” thing mentioned above is in fact a shell script snippet that gets included (eventually) in the main configure script. Make directives can also be added for extra fun.

The end result is that you have code snippets in several different languages mixed together arbitrarily. In addition to being tedious to read, they also make automatic code inspection all but impossible. For example most non-trivial Autoconf snippets give syntax errors in Eclipse’s Autotools editor due to missing and/or extraneous parentheses and so on (to be fair, most of these are bugs in Eclipse, but they are caused by the fact that parsing is so hard). The only way to find errors is to compile and run the code. Debugging the result is even harder.
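To make the mixing concrete, here is a hypothetical configure.ac fragment (invented for illustration, not taken from the project quoted above): the macro calls are M4, their arguments and everything between them are Bourne shell, and the whole mess gets spliced into one enormous generated script.

AC_CHECK_HEADER([check.h],
                [have_check=yes],
                [have_check=no])
# The macro call above is M4; the conditional below is plain Bourne shell.
# Both end up, expanded and concatenated, inside the generated configure.
if test "$have_check" = yes; then
    AC_DEFINE([HAVE_CHECK], [1], [Define if the Check headers are available.])
fi

Two languages, zero visual distinction between them. Add Make fragments and you get three.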

Since Autotools is intertwingled with these tools and their idiosyncrasies, it can never be fixed, cleaned or substantially improved.

Surprising costs

One of the main goals of the C++ committee has been that you should never have to pay a penalty for features you don’t use. If you don’t use virtual functions, your function calls are just as fast as in C. Don’t need RTTI? Just disable it. Don’t use templates? Then you don’t get any extra bloat in your binaries.

Not so with Autotools.

Suppose you develop a program that uses GTK+ and D-Bus. That implies a rather modern Linux program that will never run on, say, AIX 4.0. So you would probably want to throw out of your build system all the garbage dealing with that platform’s linking peculiarities (and everything else about it, too). But you can’t.

Autotools is designed so that every single portion of it runs according to the lowest possible common denominator in the entire world (except when it doesn’t, we’ll come back to this). This has interesting consequences.

Bloat

The most common complaint about any piece of software is that it is bloated. For some reason this is never said of Autotools, even though it is one of the most bloated things in existence.

As an example, let’s look at utouch-grail. It is a plain C library that detects multitouch gestures. It is a small to medium sized project. Analyzing it with SLOCCount reveals that its configure script is three times larger than all the C source (including headers) put together.

THREE! TIMES! LARGER!

This is even more astounding when you remember that the configure script is written in a high level scripting language, whereas the library is plain C.

If you look inside the configure script, one of the things you notice quite quickly is that it does not use shell functions. Everything is unrolled. This is because the original plain Bourne shell did not support functions (or maybe it did, but they were broken in some versions of Ultrix or whatever), so Autotools will not use them in the code it generates. You pay the price whether you want to or not.
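Roughly speaking, the difference looks like this (a caricature, not literal configure output; the variable names merely imitate Autoconf’s cache variables). With functions, the script could call one helper per check. Without them, the whole compile-and-test dance is pasted in again for every single check.

# With shell functions this could be:
#   check_header stdio.h
#   check_header stdlib.h
# Without them, the full test is repeated for each header:
echo "checking for stdio.h"
cat > conftest.c <<EOF
#include <stdio.h>
int main(void) { return 0; }
EOF
$CC -c conftest.c >/dev/null 2>&1 && ac_cv_header_stdio_h=yes
rm -f conftest.c conftest.o

echo "checking for stdlib.h"
cat > conftest.c <<EOF
#include <stdlib.h>
int main(void) { return 0; }
EOF
$CC -c conftest.c >/dev/null 2>&1 && ac_cv_header_stdlib_h=yes
rm -f conftest.c conftest.o

Multiply that by a few hundred checks and the three-times-larger figure stops being surprising.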

Slow

My friend once told me that if you have a multicore machine and update Gentoo on it, a fascinating thing happens. For most packages, running configure takes a lot longer than the actual build. The reason is that configure is slow and, even worse, unparallelizable.

A question of state

Computers are very good at remembering state. Humans are notoriously bad at it. Therefore the basic rule in interaction design is to never have the user remember state that the machine can either remember or deduce by itself. Autotools forces its user to keep all sorts of state needlessly in his head.

When the user has changed the code, he types make to compile it. This usually works. But when the files describing the build system are changed (which ones? I really don’t know), just running make will fail. Even worse, it probably fails silently, claiming everything is fine but producing garbage.

In these cases the user has to manually run autogen.sh, autoreconf, or maybe something else, before make. Why? Why does the developer have to care? Is it really too much for a build dependency tracking system to, you know, actually track dependencies? To notice that a file that some other files depend on has changed and thus deduce the steps that need taking? And take those steps automatically?

For Autotools the answer is yes. They force the user to keep state in his head. Since the human mind can only keep about 7 things in short-term memory at any one time, this single choice needlessly wastes over 14% of developer brain capacity.
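For reference, the ritual goes roughly like this; the exact incantation (autogen.sh, autoreconf, something else entirely) varies from project to project, which is rather the point.

vi configure.ac       # or Makefile.am, or some random .m4 file
autoreconf --install  # regenerate configure, Makefile.in, aclocal.m4 and friends
./configure           # re-run so the actual Makefiles get regenerated
make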

When are binaries not binaries?

When they are shell scripts that invoke magical incantations in hidden directories, of course. This little gem is courtesy of Libtool, which is a part of Autotools.

If your project uses shared libraries, Autotools does not actually build them until after you do “make install”. Instead it creates so-called convenience libraries and, in a step of utmost convenience, hides them from the developer. Since the actual libraries do not exist, binaries in the build tree cannot use them, ergo they are replaced with magical wrapper scripts.

By itself this would not be so bad, but what happens when you want to run the files under gdb or Valgrind? You either always run make install before debugging or you follow the instructions on this page:

http://www.delorie.com/gnu/docs/libtool/libtool_11.html

(At least they are honest, seeing that their breakpoint was put on the line with the statement ‘printf (“Welcome to GNU Hell!\n”)’.)
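For reference, the documented workaround is to launch your debugger or Valgrind through Libtool itself; the binary name here is made up, but the invocation follows the manual.

libtool --mode=execute gdb ./hello
libtool --mode=execute valgrind --leak-check=full ./hello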

This decision again forces state on the user. How you run gdb, Valgrind or any other inspection tool depends on whether the program you build uses shared libraries or not. There goes another 14% of your brain. More if your project has several different kinds of binaries.

Consistent lack of consistency

With Autotools you can never depend on anything being the same, so you have to jump through unnecessary hoops all the time. Say you want to have an automated build service that does builds directly from revision control as well as release builds from tarballs.

To do this you need two different build rules: since the configure script is not in revision control, you need to generate it for the daily builds. But since source tarballs sometimes don’t contain autogen.sh, you can’t always call that before configure. And indeed you shouldn’t; you’re testing the release, after all.
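So the build service ends up with something along these lines (a sketch; the script names and the exact steps vary from project to project):

if test -x ./configure; then
    # release tarball: configure ships inside the tarball
    ./configure && make && make check
else
    # revision control checkout: generate configure first
    ./autogen.sh && ./configure && make && make check
fi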

As an added bonus any patch that changes configure is guaranteed to be non-mergeable with any other patch that does the same. So be sure to specifically tell your diff program to ignore the configure file. But be sure to remember whether you did that or not every single time.

This is just one more way Autotools gives you more state to deal with. These kinds of annoying one-off things keep popping up in places where they really should not. They sap developers’ time and effort constantly.

Poor portability

Autoconf’s main claim to fame is that it is portable. The only requirement it has, as mentioned above, is the userland of SunOS from 1983.

Unfortunately there is a platform that does not provide anything even close to that. It is quite popular in some circles. For example, it has over 90% market share on desktop machines (but who uses those, anyway).

You can use Autotools on Windows, but first you need to install either Cygwin or MSYS, and even then you can only use variants of GCC. There is roughly zero support for Visual Studio, which unfortunately is the most popular compiler on that platform.

The end result is that if you want or need to support Windows as a first-class platform, then Autotools can’t be used. Many projects provide both Autotools and Visual Studio projects, but that means you have two independent build systems that will go out of sync on a regular basis.

Portability is internal too

Autotools are not forward or backward compatible with themselves. The developers change the API quite often. This means that if you need to support several platforms of different ages, you need to support several versions of Autotools.

This can be done by programming the configure scripts to work differently in different Autotools versions. And who among us doesn’t like a bit of meta-meta-meta-metaprogramming?
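In practice that means sprinkling version checks into configure.ac, along these lines (a sketch; exactly which macros exist in which Autoconf version is precisely the kind of trivia you get to memorize):

dnl Pick a macro depending on which Autoconf version is generating this script.
m4_version_prereq([2.60],
    [AC_USE_SYSTEM_EXTENSIONS],
    [AC_GNU_SOURCE])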

Not producing garbage is better than cleaning it

As a final issue I’d like to mention build directories. This is a concept advocating source code directory hygiene. The idea is that you have a source directory, which contains all the files that are checked into revision control. In addition there is the build directory; all files generated during the build go there. This way the source directory is always clean. You can also have several build directories, each using different settings, a different compiler and so on.

Autotools do provide this functionality. If you run the configure script from a directory other than the source root, it writes the build files to that directory. But this is pretty much useless, as it only works on full tarballs. Actually, it probably won’t work even then, since most Autotools projects are written so that they only work when built in the source directory. Probably because the project they were copypasted from also did that.
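For completeness, this is what the advertised usage looks like when it does work (the tarball name is made up; the pattern assumes a tarball and a cooperative project):

tar xf project-1.0.tar.gz
mkdir build-debug && cd build-debug
../project-1.0/configure CFLAGS="-g -O0"
make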

Tarballs are used mostly by packagers and end users. Developers work on revision control checkouts. As their first step they need to run either autogen.sh or autoreconf. These commands will always vomit their files into the source directory. And there are lots of them; just look at almost any project’s revision control ignore list.

Thus we have a really useful feature, which is completely useless to those people who need it the most.

What to use then?

That depends. Believe it or not, there are build systems that are even worse. Actually most of them are.

My personal favorite is CMake. It fixes almost all of the issues listed here. It has a couple of downsides too. Its language syntax is weird in some places. The way its state and cache values interact on repeated invocations is non-intuitive. Fortunately you usually don’t have to care about that if you just want to build your stuff.
