Autotools: A Practitioner’s Guide to GNU Autoconf, Automake, and Libtool


www.nostarch.com
THE FINEST IN GEEK ENTERTAINMENT
SHELVE IN:
COMPUTERS/PROGRAMMING
$44.95 ($56.95 CDN)
CREATING PORTABLE SOFTWARE JUST GOT EASIER
“I LIE FLAT.” This book uses RepKover, a durable binding that won’t snap shut.
The GNU Autotools make it easy for developers to
create software that is portable across many Unix-like
operating systems. Although the Autotools are used
by thousands of open source software packages, they
have a notoriously steep learning curve. And good luck
to the beginner who wants to find anything beyond a
basic reference work online.
Autotools is the first book to offer programmers a tutorial-
based guide to the GNU build system. Author John
Calcote begins with an overview of high-level concepts
and a quick hands-on tour of the philosophy and design
of the Autotools. He then tackles more advanced details,
like using the M4 macro processor with Autoconf,
extending the framework provided by Automake, and
building Java and C# sources. He concludes the book
with detailed solutions to the most frequent problems
encountered by first-time Autotools users.
You’ll learn how to:
- Master the Autotools build system to maximize your software’s portability
- Generate Autoconf configuration scripts to simplify the compilation process
- Produce portable makefiles with Automake
- Build cross-platform software libraries with Libtool
- Write your own Autoconf macros
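As a taste of the Autoconf side of those tasks, a minimal configure.ac looks something like the sketch below. The macro names are real Autoconf and Automake macros covered in Chapters 3 through 5; the project name, version, bug-report address, and file layout here are illustrative stand-ins, not the book's actual Jupiter listings.

```m4
# configure.ac -- autoconf processes this file to generate the configure script
AC_PREREQ([2.63])
AC_INIT([jupiter], [1.0], [bugs@example.org])
AC_CONFIG_SRCDIR([src/main.c])
AM_INIT_AUTOMAKE
AC_PROG_CC
AC_CONFIG_HEADERS([config.h])
AC_CONFIG_FILES([Makefile src/Makefile])
AC_OUTPUT
```

Running autoreconf on a project containing this file produces a portable configure script that end users drive with the familiar `./configure && make && make install` sequence.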
Autotools focuses on two projects: Jupiter, a simple
“Hello, world!” program, and FLAIM, an existing,
complex open source effort containing four separate but
interdependent subprojects. Follow along as the author
takes Jupiter’s build system from a basic makefile to a
full-fledged Autotools project, and then as he converts
the FLAIM projects from complex hand-coded makefiles
to the powerful and flexible GNU build system.
ABOUT THE AUTHOR
John Calcote is a senior software engineer and architect
at Novell, Inc. He’s been writing and developing portable
networking and system-level software for nearly 20 years
and is active in developing, debugging, and analyzing
diverse open source software packages. He is currently
a project administrator of the OpenSLP, OpenXDAS, and
DNX projects, as well as the Novell-sponsored FLAIM
database project.
Autotools_02.book Page i Tuesday, June 15, 2010 2:38 PM
AUTOTOOLS
A Practitioner’s Guide to
GNU Autoconf, Automake,
and Libtool
by John Calcote
San Francisco
AUTOTOOLS. Copyright © 2010 by John Calcote.
All rights reserved. No part of this work may be reproduced or transmitted in any form or by any means, electronic or
mechanical, including photocopying, recording, or by any information storage or retrieval system, without the prior
written permission of the copyright owner and the publisher.
14 13 12 11 10 1 2 3 4 5 6 7 8 9
ISBN-10: 1-59327-206-5
ISBN-13: 978-1-59327-206-7
Publisher: William Pollock
Production Editor: Ansel Staton
Cover and Interior Design: Octopod Studios
Developmental Editor: William Pollock
Technical Reviewer: Ralf Wildenhues
Copyeditor: Megan Dunchak
Compositor: Susan Glinert Stevens
Proofreader: Linda Seifert
Indexer: Nancy Guenther
For information on book distributors or translations, please contact No Starch Press, Inc. directly:
No Starch Press, Inc.
38 Ringold Street, San Francisco, CA 94103
phone: 415.863.9900; fax: 415.863.9950; info@nostarch.com; www.nostarch.com
Library of Congress Cataloging-in-Publication Data
Calcote, John, 1964-
Autotools : a practitioner's guide to GNU Autoconf, Automake, and Libtool / by John Calcote.
p. cm.
ISBN-13: 978-1-59327-206-7 (pbk.)
ISBN-10: 1-59327-206-5 (pbk.)
1. Autotools (Electronic resource) 2. Cross-platform software development. 3. Open source software.
4. UNIX (Computer file) I. Title.
QA76.76.D47C335 2010
005.3--dc22
2009040784
No Starch Press and the No Starch Press logo are registered trademarks of No Starch Press, Inc. Other product and
company names mentioned herein may be the trademarks of their respective owners. Rather than use a trademark
symbol with every occurrence of a trademarked name, we are using the names only in an editorial fashion and to the
benefit of the trademark owner, with no intention of infringement of the trademark.
The information in this book is distributed on an “As Is” basis, without warranty. While every precaution has been
taken in the preparation of this work, neither the author nor No Starch Press, Inc. shall have any liability to any
person or entity with respect to any loss or damage caused or alleged to be caused directly or indirectly by the
information contained in it.
For Michelle
But to see her was to love her;
Love but her, and love forever.
—Robert Burns
BRIEF CONTENTS
Foreword by Ralf Wildenhues..........................................................................................xv
Preface .......................................................................................................................xvii
Introduction ..................................................................................................................xxi
Chapter 1: A Brief Introduction to the GNU Autotools..........................................................1
Chapter 2: Understanding the GNU Coding Standards .....................................................19
Chapter 3: Configuring Your Project with Autoconf ...........................................................57
Chapter 4: More Fun with Autoconf: Configuring User Options ..........................................89
Chapter 5: Automatic Makefiles with Automake..............................................................119
Chapter 6: Building Libraries with Libtool .......................................................................145
Chapter 7: Library Interface Versioning and Runtime Dynamic Linking ...............................171
Chapter 8: FLAIM: An Autotools Example.......................................................................195
Chapter 9: FLAIM Part II: Pushing the Envelope...............................................................229
Chapter 10: Using the M4 Macro Processor with Autoconf ..............................................251
Chapter 11: A Catalog of Tips and Reusable Solutions for Creating Great Projects .............271
Index.........................................................................................................................313
CONTENTS IN DETAIL
FOREWORD by Ralf Wildenhues xv
PREFACE xvii
Why Use the Autotools? .........................................................................................xviii
Acknowledgments ................................................................................................... xx
I Wish You the Very Best .......................................................................................... xx
INTRODUCTION xxi
Who Should Read This Book .................................................................................. xxii
How This Book Is Organized .................................................................................. xxii
Conventions Used in This Book ...............................................................................xxiii
Autotools Versions Used in This Book .......................................................................xxiii
1
A BRIEF INTRODUCTION TO THE GNU AUTOTOOLS 1
Who Should Use the Autotools? ................................................................................. 2
When Should You Not Use the Autotools? ................................................................... 2
Apple Platforms and Mac OS X ................................................................................. 3
The Choice of Language ........................................................................................... 4
Generating Your Package Build System ....................................................................... 5
Autoconf ................................................................................................................. 6
autoconf ........................................................................................................... 7
autoreconf ........................................................................................................ 7
autoheader ....................................................................................................... 7
autoscan ........................................................................................................... 7
autoupdate ....................................................................................................... 7
ifnames ............................................................................................................ 8
autom4te .......................................................................................................... 8
Working Together .............................................................................................. 8
Automake ................................................................................................................ 9
automake ....................................................................................................... 10
aclocal ........................................................................................................... 10
Libtool ................................................................................................................... 11
libtool ............................................................................................................. 12
libtoolize ........................................................................................................ 12
ltdl, the Libtool C API ........................................................................................ 12
Building Your Package ............................................................................................ 13
Running configure ............................................................................................ 13
Running make ................................................................................................. 15
Installing the Most Up-to-Date Autotools ..................................................................... 16
Summary ............................................................................................................... 18
2
UNDERSTANDING THE GNU CODING STANDARDS 19
Creating a New Project Directory Structure ................................................................ 20
Project Structure ..................................................................................................... 21
Makefile Basics ...................................................................................................... 22
Commands and Rules ....................................................................................... 23
Variables ........................................................................................................ 24
A Separate Shell for Each Command ................................................................. 25
Variable Binding ............................................................................................. 26
Rules in Detail ................................................................................................. 27
Resources for Makefile Authors .......................................................................... 32
Creating a Source Distribution Archive ...................................................................... 32
Forcing a Rule to Run ....................................................................................... 34
Leading Control Characters .............................................................................. 35
Automatically Testing a Distribution .......................................................................... 36
Unit Testing, Anyone? ............................................................................................. 37
Installing Products ................................................................................................... 38
Installation Choices .......................................................................................... 40
Uninstalling a Package ..................................................................................... 41
Testing Install and Uninstall ............................................................................... 42
The Filesystem Hierarchy Standard ........................................................................... 44
Supporting Standard Targets and Variables .............................................................. 45
Standard Targets ............................................................................................. 46
Standard Variables .......................................................................................... 46
Adding Location Variables to Jupiter .................................................................. 47
Getting Your Project into a Linux Distro ..................................................................... 48
Build vs. Installation Prefix Overrides ........................................................................ 50
User Variables ....................................................................................................... 52
Configuring Your Package ...................................................................................... 54
Summary ............................................................................................................... 55
3
CONFIGURING YOUR PROJECT WITH AUTOCONF 57
Autoconf Configuration Scripts ................................................................................. 58
The Shortest configure.ac File .................................................................................. 59
Comparing M4 to the C Preprocessor ....................................................................... 60
The Nature of M4 Macros ....................................................................................... 60
Executing autoconf ................................................................................................. 61
Executing configure ................................................................................................ 62
Executing config.status ............................................................................................ 63
Adding Some Real Functionality ............................................................................... 64
Generating Files from Templates .............................................................................. 67
Adding VPATH Build Functionality ............................................................................ 68
Let’s Take a Breather .............................................................................................. 70
An Even Quicker Start with autoscan ........................................................................ 71
The Proverbial autogen.sh Script ........................................................................ 73
Updating Makefile.in ....................................................................................... 75
Initialization and Package Information ...................................................................... 76
AC_PREREQ ................................................................................................... 76
AC_INIT ......................................................................................................... 76
AC_CONFIG_SRCDIR ...................................................................................... 77
The Instantiating Macros ......................................................................................... 78
AC_CONFIG_HEADERS ................................................................................... 83
Using autoheader to Generate an Include File Template ....................................... 84
Back to Remote Builds for a Moment ......................................................................... 87
Summary ............................................................................................................... 88
4
MORE FUN WITH AUTOCONF:
CONFIGURING USER OPTIONS 89
Substitutions and Definitions .................................................................................... 90
AC_SUBST ...................................................................................................... 90
AC_DEFINE .................................................................................................... 91
Checking for Compilers .......................................................................................... 91
Checking for Other Programs .................................................................................. 93
A Common Problem with Autoconf ........................................................................... 95
Checks for Libraries and Header Files ....................................................................... 98
Is It Right or Just Good Enough? ....................................................................... 101
Printing Messages .......................................................................................... 106
Supporting Optional Features and Packages ........................................................... 107
Coding Up the Feature Option ........................................................................ 109
Formatting Help Strings .................................................................................. 112
Checks for Type and Structure Definitions ................................................................ 112
The AC_OUTPUT Macro ....................................................................................... 116
Summary ............................................................................................................. 117
5
AUTOMATIC MAKEFILES
WITH AUTOMAKE 119
Getting Down to Business ...................................................................................... 120
Enabling Automake in configure.ac .................................................................. 121
A Hidden Benefit: Automatic Dependency Tracking ........................................... 124
What’s in a Makefile.am File? ............................................................................... 125
Analyzing Our New Build System .......................................................................... 126
Product List Variables ..................................................................................... 127
Product Source Variables ................................................................................ 132
PLV and PSV Modifiers ................................................................................... 132
Unit Tests: Supporting make check .......................................................................... 133
Reducing Complexity with Convenience Libraries ..................................................... 134
Product Option Variables ................................................................................ 136
Per-Makefile Option Variables ......................................................................... 138
Building the New Library ....................................................................................... 138
What Goes into a Distribution? .............................................................................. 140
Maintainer Mode ................................................................................................. 141
Cutting Through the Noise ..................................................................................... 142
Summary ............................................................................................................. 144
6
BUILDING LIBRARIES WITH LIBTOOL 145
The Benefits of Shared Libraries ............................................................................. 146
How Shared Libraries Work .................................................................................. 146
Dynamic Linking at Load Time ......................................................................... 147
Automatic Dynamic Linking at Runtime ............................................................. 148
Manual Dynamic Linking at Runtime ................................................................. 149
Using Libtool ........................................................................................................ 150
Abstracting the Build Process ........................................................................... 150
Abstraction at Runtime .................................................................................... 151
Installing Libtool ................................................................................................... 152
Adding Shared Libraries to Jupiter .......................................................................... 152
Using the LTLIBRARIES Primary ......................................................................... 153
Public Include Directories ................................................................................ 153
Customizing Libtool with LT_INIT Options .......................................................... 157
Reconfigure and Build .................................................................................... 161
So What Is PIC, Anyway? ............................................................................... 164
Fixing the Jupiter PIC Problem ......................................................................... 167
Summary ............................................................................................................. 170
7
LIBRARY INTERFACE VERSIONING AND
RUNTIME DYNAMIC LINKING 171
System-Specific Versioning .................................................................................... 172
Linux and Solaris Library Versioning ................................................................. 172
IBM AIX Library Versioning ............................................................................. 173
HP-UX/AT&T SVR4 Library Versioning .............................................................. 176
The Libtool Library Versioning Scheme .................................................................... 176
Library Versioning Is Interface Versioning .......................................................... 177
When Library Versioning Just Isn’t Enough ........................................................ 180
Using libltdl ......................................................................................................... 181
Necessary Infrastructure ................................................................................. 181
Adding a Plug-In Interface ............................................................................... 183
Doing It the Old-Fashioned Way ..................................................................... 184
Converting to Libtool’s ltdl Library .................................................................... 188
Preloading Multiple Modules ........................................................................... 192
Checking It All Out ........................................................................................ 193
Summary ............................................................................................................. 194
8
FLAIM: AN AUTOTOOLS EXAMPLE 195
What Is FLAIM? ................................................................................................... 196
Why FLAIM? ....................................................................................................... 196
An Initial Look ...................................................................................................... 197
Getting Started .................................................................................................... 199
Adding the configure.ac Files .......................................................................... 199
The Top-Level Makefile.am File ........................................................................ 202
The FLAIM Subprojects .......................................................................................... 204
The FLAIM Toolkit configure.ac File .................................................................. 205
The FLAIM Toolkit Makefile.am File .................................................................. 212
Designing the ftk/src/Makefile.am File ............................................................. 215
Moving On to the ftk/util Directory .................................................................. 217
Designing the XFLAIM Build System ........................................................................ 218
The XFLAIM configure.ac File .......................................................................... 219
Creating the xflaim/src/Makefile.am File ......................................................... 222
Turning to the xflaim/util Directory ................................................................... 223
Summary ............................................................................................................. 227
9
FLAIM PART II: PUSHING THE ENVELOPE 229
Building Java Sources Using the Autotools ............................................................... 230
Autotools Java Support ................................................................................... 230
Using ac-archive Macros ................................................................................ 233
Canonical System Information ......................................................................... 234
The xflaim/java Directory Structure .................................................................. 234
The xflaim/src/Makefile.am File ...................................................................... 235
Building the JNI C++ Sources .......................................................................... 236
The Java Wrapper Classes and JNI Headers ..................................................... 237
A Caveat About Using the JAVA Primary .......................................................... 239
Building the C# Sources ........................................................................................ 239
Manual Installation ........................................................................................ 242
Cleaning Up Again ........................................................................................ 243
Configuring Compiler Options ............................................................................... 243
Hooking Doxygen into the Build Process ................................................................. 245
Adding Nonstandard Targets ................................................................................ 247
Summary ............................................................................................................. 250
10
USING THE M4 MACRO PROCESSOR WITH AUTOCONF 251
M4 Text Processing .............................................................................................. 252
Defining Macros ............................................................................................ 253
Macros with Arguments .................................................................................. 255
The Recursive Nature of M4 .................................................................................. 256
Quoting Rules ............................................................................................... 258
Autoconf and M4 ................................................................................................. 259
The Autoconf M4 Environment ......................................................................... 260
Writing Autoconf Macros ...................................................................................... 260
Simple Text Replacement ................................................................................ 260
Documenting Your Macros .............................................................................. 263
M4 Conditionals ............................................................................................ 264
Diagnosing Problems ............................................................................................ 268
Summary ............................................................................................................. 269
11
A CATALOG OF TIPS AND REUSABLE SOLUTIONS
FOR CREATING GREAT PROJECTS 271
Item 1: Keeping Private Details out of Public Interfaces .............................................. 272
Solutions in C ................................................................................................ 273
Solutions in C++ ............................................................................................ 273
Item 2: Implementing Recursive Extension Targets
Item 3: Using a Repository Revision Number in a Package Version
Item 4: Ensuring Your Distribution Packages Are Clean
Item 5: Hacking Autoconf Macros
Providing Library-Specific Autoconf Macros
Item 6: Cross-Compiling
Item 7: Emulating Autoconf Text Replacement Techniques
Item 8: Using the ac-archive Project
Item 9: Using pkg-config with Autotools
Providing pkg-config Files for Your Library Projects
Using pkg-config Files in configure.ac
Item 10: Using Incremental Installation Techniques
Item 11: Using Generated Source Code
Using the BUILT_SOURCES Variable
Dependency Management
Built Sources Done Right
Item 12: Disabling Undesirable Targets
Item 13: Watch Those Tab Characters!
Item 14: Packaging Choices
Wrapping Up
INDEX
FOREWORD
When I was asked to do a technical review on a book
about the Autotools, I was rather skeptical. Several
online tutorials and a few books already introduce
readers to the use of GNU Autoconf, Automake, and
Libtool. However, many of these texts are less than ideal in some way:
they were written several years ago and are starting to show their age,
they contain inaccuracies, or they are incomplete for typical beginners'
tasks. On the other hand, the GNU manuals for these programs are fairly
large and rather technical, and as such, they may present a significant
entry barrier to learning your way around the Autotools.
John Calcote began this book with an online tutorial that shared at least
some of the problems facing other tutorials. Around that time, he became a
regular contributor to discussions on the Autotools mailing lists, too. John
kept asking more and more questions, and discussions with him uncovered
some bugs in the Autotools sources and documentation, as well as some
issues in his tutorial.
Since that time, John has reworked the text a lot. The review uncovered
several more issues in both software and book text, a nice mutual benefit. As
a result, this book has become a great introductory text that still aims to be
accurate, up to date with current Autotools, and quite comprehensive in a
way that is easily understood.
Always going by example, John explores the various software layers, port-
ability issues and standards involved, and features needed for package build
development. If you’re new to the topic, the entry path may just have become
a bit less steep for you.
Ralf Wildenhues
Bonn, Germany
June 2010
PREFACE
I’ve often wondered during the last ten years how it
could be that the only third-party book on the GNU
Autotools that I’ve been able to discover is GNU
Autoconf, Automake, and Libtool by Gary
Vaughan, Ben Elliston, Tom Tromey, and Ian Lance
Taylor, affectionately known by the community as
The Goat Book (so dubbed for the front cover—an old-
fashioned photo of goats doing acrobatic stunts).1
I’ve been told by publishers that there is simply no market for such a
book. In fact, one editor told me that he himself had tried unsuccessfully to
entice authors to write this book a few years ago. His authors wouldn’t finish
the project, and the publisher’s market analysis indicated that there was very
little interest in the book. Publishers believe that open source software devel-
opers tend to disdain written documentation. Perhaps they’re right. Interest-
ingly, books on IT utilities like Perl sell like Perl’s going out of style—which is
actually somewhat true these days—and yet people are still buying enough
Perl books to keep their publishers happy. All of this explains why there are
ten books on the shelf with animal pictures on the cover for Perl, but literally
nothing for open source software developers.
1. Vaughan, Elliston, Tromey, and Taylor, GNU Autoconf, Automake, and Libtool
(Indianapolis: Sams Publishing, 2000).
I’ve worked in software development for 25 years, and I’ve used open
source software for quite some time now. I’ve learned a lot about open source
software maintenance and development, and most of what I’ve learned,
unfortunately, has been by trial and error. Existing GNU documentation is
more often reference material than solution-oriented instruction. Had there
been other books on the topic, I would have snatched them all up immediately.
What we need is a cookbook-style approach with the recipes covering
real problems found in real projects. First the basics are covered, sauces and
reductions, followed by various cooking techniques. Finally, master recipes
are presented for culinary wonders. As each recipe is mastered, the reader
makes small intuitive leaps—I call them minor epiphanies. Put enough of these
under your belt and overall mastery of the Autotools is ultimately inevitable.
Let me give you an analogy. I’d been away from math classes for about
three years when I took my first college calculus course. I struggled the entire
semester with little progress. I understood the theory, but I had trouble
with the homework. I just didn’t have the background I needed. So the
next semester, I took college algebra and trigonometry back to back as half-
semester classes. At the end of that semester, I tried calculus again. This time
I did very well—finishing the class with a solid A grade. What was missing the
first time? Just basic math skills. You’d think it wouldn’t have made that much
difference, but it really does.
The same concept applies to learning to properly use the Autotools. You
need a solid understanding of the tools upon which the Autotools are built
in order to become proficient with the Autotools themselves.
Why Use the Autotools?
In the early 1990s, I was working on the final stages of my bachelor’s degree
in computer science at Brigham Young University. I took an advanced com-
puter graphics class where I was introduced to C++ and the object-oriented
programming paradigm. For the next couple of years, I had a love-hate rela-
tionship with C++. I was a pretty good C coder by that time, and I thought I
could easily pick up C++, as close in syntax as it was to C. How wrong I was!
I fought with the C++ compiler more often than I’d care to recall.
The problem was that the most fundamental differences between C
and C++ are not obvious to the casual observer, because they’re buried
deep within the C++ language specification rather than on the surface in
the language syntax. The C++ compiler generates an amazing amount of
code beneath the covers, providing functionality in a few lines of C++ code
that require dozens of lines of C code.
Just as programmers then complained of their troubles with C++, so like-
wise programmers today complain about similar difficulties with the GNU
Autotools. The differences between make and Automake are very similar to
the differences between C and C++. The most basic single-line Makefile.am
generates a Makefile.in (an Autoconf template) containing 300–400 lines of
parameterized make script, and that line count tends to grow with each revision of the
tool as more features are added.
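A minimal sketch of such a single-line Makefile.am, using the book's Jupiter project as the hypothetical example (when no sources are listed, Automake defaults jupiter_SOURCES to jupiter.c):

```makefile
# Build and install a single program; everything else is generated.
bin_PROGRAMS = jupiter
```

Feeding this one line (together with a matching configure.ac) through automake produces a Makefile.in that supplies all, install, clean, dist, and the other standard targets, which is where those hundreds of lines of make script come from.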
Thus, when you use the Autotools, you have to understand the under-
lying infrastructure managed by these tools. You need to take the time to
understand the open source software distribution, build, test, and installa-
tion philosophies embodied by—in many cases even enforced by—these
tools, or you’ll find yourself fighting against the system. Finally, you need to
learn to agree with these basic philosophies because you’ll only become frus-
trated if you try to make the Autotools operate outside of the boundaries set
by their designers.
Source-level distribution relegates to the end user a particular portion
of the responsibility of software development that has traditionally been
assumed by the software developer—namely, building products from source
code. But end users are often not developers, so most of them won’t know
how to properly build the package. The solution to this problem, from the
earliest days of the open source movement, has been to make the package
build and installation processes as simple as possible for the end user so that
he could perform a few well-understood steps to have the package built and
installed cleanly on his system.
Most packages are built using the make utility. It’s very easy to type make,
but that’s not the problem. The problem crops up when the package doesn’t
build successfully because of some unanticipated difference between the user’s
system and the developer’s system. Thus was born the ubiquitous configure
script—initially a simple shell script that configured the end user’s environ-
ment so that make could successfully find the required external resources
on the user’s system. Hand-coded configuration scripts helped, but they
weren’t the final answer. They fixed about 65 percent of the problems result-
ing from system configuration differences—and they were a pain in the neck
to write properly and to maintain. Dozens of changes were made incremen-
tally over a period of years, until the script worked properly on most of the
systems anyone cared about. But the entire process was clearly in need of an
upgrade.
Do you have any idea of the number of build-breaking differences there
are between existing systems today? Neither do I, but there are a handful of
developers in the world who know a large percentage of these differences.
Between them and the open source software community, the GNU Autotools
were born. The Autotools were designed to create configuration scripts and
makefiles that work correctly and provide significant chunks of valuable
end-user functionality under most circumstances, and on most systems—
even on systems not initially considered (or even conceived of) by the pack-
age maintainer.
With this in mind, the primary purpose of the Autotools is not to make
life simpler for the package maintainer (although it really does in the long
run). The primary purpose of the Autotools is to make life simpler for the end user.
Acknowledgments
I could not have written a technical book like this without the help of a lot of
people. I would like to thank Bill Pollock and the editors and staff at No Starch
Press for their patience with a first-time author. They made the process inter-
esting and fun (and a little painful at times).
Additionally, I’d like to thank the authors and maintainers of the GNU
Autotools for giving the world a standard to live up to and a set of tools that
make it simpler to do so. Specifically, I’d like to thank Ralf Wildenhues, who
believed in this project enough to spend hundreds of hours of his personal
time in technical review. His comments and insight were invaluable in taking
this book from mere wishful thinking to an accurate and useful text.
I would also like to thank my friend Cary Petterborg for encouraging me
to “just go ahead and do it,” when I told him it would probably never happen.
Finally, I’d like to thank my wife Michelle and my children: Ethan,
Mason, Robby, Haley, Joey, Nick, and Alex for allowing me to spend all of
that time away from them while I worked on the book. A novel would have
been easier (and more lucrative), but the world has plenty of novels and not
enough books about the Autotools.
I Wish You the Very Best
I spent a long time and a lot of effort learning what I now know about the
Autotools. Most of this learning process was more painful than it really had
to be. I’ve written this book so that you won’t have to struggle to learn what
should be a core set of tools for the open source programmer. Please feel
free to contact me, and let me know your experiences with learning the
Autotools. I can be reached at my personal email address at john.calcote
@gmail.com. Good luck in your quest for a better software development
experience!
John Calcote
Elk Ridge, Utah
June 2010
INTRODUCTION
Few software developers would deny that
GNU Autoconf, Automake, and Libtool
(the Autotools) have revolutionized the open
source software world. But while there are many
thousands of Autotools advocates, there are also many
developers who hate the Autotools—with a passion.
The reason for this dread of the Autotools, I think, is that when you use the
Autotools, you have to understand the underlying infrastructure that they
manage. Otherwise, you’ll find yourself fighting against the system.
This book solves this problem by first providing a framework for under-
standing the underlying infrastructure of the Autotools and then building
on that framework with a tutorial-based approach to teaching Autotools
concepts in a logically ordered fashion.
Who Should Read This Book
This book is for the open source software package maintainer who wants to
become an Autotools expert. Existing material on the subject is limited to
the GNU Autotools manuals and a few Internet-based tutorials. For years
most real-world questions have been answered on the Autotools mailing lists,
but mailing lists are an inefficient form of teaching because the same answers
to the same questions are given time and again. This book provides a
cookbook-style approach, covering real problems found in real projects.
How This Book Is Organized
This book moves from high-level concepts to mid-level use cases and examples
and then finishes with more advanced details and examples. As though we
were learning arithmetic, we’ll begin with some basic math—algebra and
trigonometry—and then move on to analytical geometry and calculus.
Chapter 1 presents a general overview of the packages that are consid-
ered part of the GNU Autotools. This chapter describes the interaction
between these packages and the files consumed by and generated by each
one. In each case, figures depict the flow of data from hand-coded input to
final output files.
Chapter 2 covers open source software project structure and organiza-
tion. This chapter also goes into some detail about the GNU Coding Standards
(GCS) and the Filesystem Hierarchy Standard (FHS), both of which have played
vital roles in the design of the GNU Autotools. It presents some fundamental
tenets upon which the design of each of the Autotools is based. With these
concepts, you’ll better understand the theory behind the architectural deci-
sions made by the Autotools designers.
In this chapter, we’ll also design a simple project, Jupiter, from start to
finish using hand-coded makefiles. We’ll add to Jupiter in a stepwise fashion
as we discover functionality that we can use to simplify tasks.
Chapters 3 and 4 present the framework designed by the GNU Autoconf
engineers to ease the burden of creating and maintaining portable, func-
tional project configuration scripts. The GNU Autoconf package provides
the basis for creating complex configuration scripts with just a few lines of
information provided by the project maintainer.
In these chapters, we’ll quickly convert our hand-coded makefiles into
Autoconf Makefile.in templates and then begin adding to them in order to
gain some of the most significant Autoconf benefits. Chapter 3 discusses the
basics of generating configuration scripts, while Chapter 4 moves on to more
advanced Autoconf topics, features, and uses.
Chapter 5 discusses converting the Jupiter project Makefile.in templates
into Automake Makefile.am files. Here you’ll discover that Automake is to
makefiles what Autoconf is to configuration scripts. This chapter presents
the major features of Automake in a manner that will not become outdated
as new versions of Automake are released.
Chapters 6 and 7 explain basic shared-library concepts and show how
to build shared libraries with Libtool—a stand-alone abstraction for shared
library functionality that can be used with the other Autotools. Chapter 6
begins with a shared-library primer and then covers some basic Libtool
extensions that allow Libtool to be a drop-in replacement for the more
basic library generation functionality provided by Automake. Chapter 7
covers library versioning and runtime dynamic module management fea-
tures provided by Libtool.
Chapters 8 and 9 show the transformation of an existing, fairly complex,
open source project (FLAIM) from using a hand-built build system to using
an Autotools build system. This example will help you to understand how you
might autoconfiscate one of your own existing projects.
Chapter 10 provides an overview of the features of the M4 macro proces-
sor that are relevant to obtaining a solid understanding of Autoconf. This
chapter also considers the process of writing your own Autoconf macros.
Chapter 11 is a compilation of tips, tricks, and reusable solutions to
Autoconf problems. The solutions in this chapter are presented as a set of
individual topics or items. Each item can be understood without context
from the surrounding items.
Most of the examples shown in listings in this book are available for
download from http://www.nostarch.com/autotools.htm.
Conventions Used in This Book
This book contains hundreds of program listings in roughly two categories:
console examples and file listings. Console examples have no captions, and
their commands are bolded. File listings contain full or partial listings of the
files discussed in the text. All named listings are provided in the download-
able archive. Listings without filenames are entirely contained in the printed
listing itself. In general, bolded text in listings indicates changes made to a
previous version of that listing.
For listings related to the Jupiter and FLAIM projects, the caption speci-
fies the path of the file relative to the project root directory.
Throughout this book, I refer to the GNU/Linux operating system simply
as Linux. By the term Linux, I mean GNU/Linux, its actual official name;
I use Linux simply as shorthand.
Autotools Versions Used in This Book
The Autotools are always being updated—on average, a significant update of
each of the three tools, Autoconf, Automake, and Libtool, is released every
year and a half, and minor updates are released every three to six months.
The Autotools designers attempt to maintain a reasonable level of backward
compatibility with each new release, but occasionally something significant is
broken, and older documentation simply becomes out of date.
While I describe new significant features of recent releases of the Auto-
tools, in my efforts to make this a more timeless work, I’ve tried to stick to
descriptions of Autoconf features (macros for instance) that have been in
widespread use for several years. Minor details change occasionally, but the
general use has stayed the same through many releases.
At appropriate places in the text, I mention the versions of the Autotools
that I’ve used for this book, but I’ll summarize here. I’ve used version 2.64 of
Autoconf, version 1.11 of Automake, and version 2.2.6 of Libtool. These were
the latest versions as of this writing, and even through the publication pro-
cess, I was able to make minor corrections and update to new releases as they
became available.
A BRIEF INTRODUCTION
TO THE GNU AUTOTOOLS
We shall not cease from exploration
And the end of all our exploring
Will be to arrive where we started
And know the place for the first time.
—T.S. Eliot, “Quartet No. 4: Little Gidding”
As stated in the preface to this book, the
purpose of the GNU Autotools is to make
life simpler for the end user, not the main-
tainer. Nevertheless, using the Autotools will
make your job as a project maintainer easier in the
long run, although maybe not for the reasons you suspect. The Autotools
framework is as simple as it can be, given the functionality it provides. The
real purpose of the Autotools is twofold: it serves the needs of your users, and
it makes your project incredibly portable—even to systems on which you’ve
never tested, installed, or built your code.
Throughout this book, I will often use the term Autotools, although you
won’t find a package in the GNU archives with this label. I use this term to
signify the following three GNU packages, which are considered by the com-
munity to be part of the GNU build system:
- Autoconf, which is used to generate a configuration script for a project
- Automake, which is used to simplify the process of creating consistent
  and functional makefiles
- Libtool, which provides an abstraction for the portable creation of
  shared libraries
Other build tools, such as the open source packages CMake and SCons,
attempt to provide the same functionality as the Autotools but in a more
user-friendly manner. However, the functionality these tools attempt to hide
behind GUI interfaces and script builders actually ends up making them less
functional.
Who Should Use the Autotools?
If you’re writing open source software that targets Unix or Linux systems, you
should absolutely be using the GNU Autotools, and even if you’re writing
proprietary software for Unix or Linux systems, you’ll still benefit significantly
from using them. The Autotools provide you with a build environment that
will allow your project to build successfully on future versions or distributions
with virtually no changes to the build scripts. This is useful even if you only
intend to target a single Linux distribution, because—let’s be honest—you
really can’t know in advance whether or not your company will want your soft-
ware to run on other platforms in the future.
When Should You Not Use the Autotools?
About the only time it makes sense not to use the Autotools is when you’re
writing software that will only run on non-Unix platforms, such as Microsoft
Windows. Although the Autotools have limited support for building Windows
software, it’s my opinion that the POSIX/FHS runtime environment embraced
by these tools is just too different from the Windows runtime environment to
warrant trying to shoehorn a Windows project into the Autotools paradigm.
Autotools support for Windows requires a Cygwin1 or MSYS2 environment
in order to work correctly, because Autoconf-generated configuration scripts
are Bourne-shell scripts, and Windows doesn’t provide a native Bourne shell.
Unix and Microsoft tools are just different enough in command-line options
and runtime characteristics that it’s often simpler to use Windows ports of GNU
tools, such as GCC or MinGW, to build Windows programs with an Autotools
build system.
I’ve seen truly portable build systems that use these environments and
tool sets to build Windows software using Autotools scripts that are common
between Windows and Unix. The shim libraries provided by portability envi-
ronments like Cygwin make the Windows operating system look POSIX enough
to pass for Unix in a pinch, but they sacrifice performance and functionality for
the sake of portability. The MinGW approach is a little better in that it targets the
native Windows API. In any case, these sorts of least-common-denominator
approaches merely serve to limit the possibilities of your code on Windows.
I’ve also seen developers customize the Autotools to generate build scripts
that use native (Microsoft) Windows tools. These people spend much of their
time tweaking their build systems to do things they were never intended to
do, in a hostile and foreign environment. Their makefiles contain entirely
different sets of functionality based on the target and host operating systems:
one set of code to build a project on Windows and another to build on
POSIX systems. This does not constitute a portable build system; it only
portrays the vague illusion of one.
1. Cygwin Information and Installation, http://www.cygwin.com/.
2. MinGW and MSYS, Minimalist GNU for Windows, http://www.mingw.org/.
For these reasons, I focus exclusively in this book on using the Autotools
on POSIX-compliant platforms.
NOTE I’m not a typical Unix bigot. While I love Unix (and especially Linux), I also appreciate
Windows for the areas in which it excels.3 For Windows development, I highly recommend
using Microsoft tools. The original reasons for using GNU tools to develop Windows
programs are more or less academic nowadays, because Microsoft has made the better
part of its tools available for download at no cost. (For download information, see
Microsoft Express at http://www.microsoft.com/Express.)
Apple Platforms and Mac OS X
The Macintosh operating system has been POSIX compliant since 2002 when
Mac OS version 10 (OS X) was released. OS X is derived from NeXTSTEP/
OpenStep, which is based on the Mach kernel, with parts taken from FreeBSD
and NetBSD. As a POSIX-compliant operating system, OS X provides all the
infrastructure required by the Autotools. The problems you’ll encounter with
OS X will most likely involve Apple's user interface and package-management
systems, both of which are specific to the Mac.
The user interface presents the same issues you encounter when dealing
with X Windows on other Unix platforms, and then some. The primary dif-
ference is that X Windows is used exclusively on most Unix systems, but Mac
OS has its own graphical user interface called Cocoa. While X Windows can be
used on the Mac (Apple provides a window manager that makes X applications
look a lot like native Cocoa apps), Mac programmers will sometimes wish to
take full advantage of the native user interface features provided by the oper-
ating system.
The Autotools skirt the issue of package management differences between
Unix platforms by simply ignoring it. They create packages that are little more
than compressed archives using the tar and gzip utilities, and they install and
uninstall products from the make command line. The Mac OS package manage-
ment system is an integral part of installing an application on an Apple system
and projects like Fink (http://www.finkproject.org/) and MacPorts (http://
www.macports.org/) help make existing open source packages available on the
Mac by providing simplified mechanisms for converting Autotools packages
into installable Mac packages.
The bottom line is that the Autotools can be used quite effectively on
Apple Macintosh systems running OS X or later, as long as you keep these
caveats in mind.
3. Hard core gamers will agree with me, I’m sure. I’m writing this book on a laptop running
Windows 7, but I’m using OpenOffice.org as my text editor, and I’m writing the book’s sample
code on my 3GHz 64-bit dual processor Opensuse 11.2 Linux workstation.
The Choice of Language
Your choice of programming language is another important factor to consider
when deciding whether to use the Autotools. Remember that the Autotools
were designed by GNU people to manage GNU projects. In the GNU com-
munity, there are two factors that determine the importance of a computer
programming language:
- Are there any GNU packages written in the language?
- Does the GNU compiler toolset support the language?
Autoconf provides native support for the following languages based on
these two criteria (by native support, I mean that Autoconf will compile, link,
and run source-level feature checks in these languages):
- C
- C++
- Objective C
- Fortran
- Fortran 77
- Erlang
Therefore, if you want to build a Java package, you can configure
Automake to do so (as we'll see in Chapters 8 and 9), but you can't ask Autoconf
to compile, link, or run Java-based checks,4 because Autoconf simply doesn’t
natively support Java. However, you can find Autoconf macros (which I will
cover in more detail in later chapters) that enhance Autoconf’s ability to
manage the configuration process for projects written in Java.
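The language selection described here is driven from configure.ac. As a hedged sketch (the project name and the checked header are illustrative, not from any particular project):

```m4
AC_INIT([jupiter], [1.0])
AC_PROG_CC
AC_PROG_CXX
dnl Subsequent feature checks are compiled with the C++ compiler.
AC_LANG([C++])
AC_CHECK_HEADERS([iostream])
AC_OUTPUT
```

AC_LANG may be called repeatedly to switch among the supported languages; each check macro then compiles, links, or runs its test program with the compiler for the currently selected language.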
Open source software developers are actively at work on the gcj compiler
and toolset, so some native Java support may ultimately be added to Autoconf.
But as of this writing, gcj is still a bit immature, and very few GNU packages are
currently written in Java, so the issue is not yet critical to the GNU community.
Rudimentary support does exist in Automake for both GNU (gcj) and
non-GNU Java compilers and JVMs. I’ve used these features myself on projects
and they work well, as long as you don’t try to push them too far.
If you’re into Smalltalk, ADA, Modula, Lisp, Forth, or some other non-
mainstream language, you’re probably not too interested in porting your code
to dozens of platforms and CPUs. However, if you are using a non-mainstream
language, and you’re concerned about the portability of your build systems,
consider adding support for your language to the Autotools yourself. This
is not as daunting a task as you may think, and I guarantee that you’ll be an
Autotools expert when you’re finished.5
4. This statement is not strictly true: I’ve seen third-party macros that use the JVM to execute
Java code within checks, but these are usually very special cases. None of the built-in Autoconf
checks rely on a JVM in any way. Chapters 8 and 9 outline how you might use a JVM in an Autoconf
check. Additionally, the portable nature of Java and the Java virtual machine specification make
it fairly unlikely that you’ll need to perform a Java-based Autoconf check in the first place.
5. For example, native Erlang support made it into the Autotools because members of the
Erlang community thought it was important enough to add it themselves.
Generating Your Package Build System
The GNU Autotools framework includes three main packages: Autoconf,
Automake, and Libtool. The tools in these packages can generate code that
depends on utilities and functionality from the gettext, m4, sed, make, and perl
packages, among others.
With respect to the Autotools, it’s important to distinguish between a
maintainer’s system and an end user’s system. The design goals of the Autotools
specify that an Autotools-generated build system should rely only on tools
that are readily available and preinstalled on the end user’s machine. For
example, the machine a maintainer uses to create distributions requires a
Perl interpreter, but a machine on which an end-user builds products from
release distribution packages should not require Perl.
A corollary is that an end user’s machine doesn’t need to have the Autotools
installed—an end user’s system only requires a reasonably POSIX-compliant
version of make and some variant of the Bourne shell that can execute the
generated configuration script. And, of course, any package will also require
compilers, linkers, and other tools deemed necessary by the project maintainer
to convert source files into executable binary programs, help files, and other
runtime resources.
If you’ve ever downloaded, built, and installed software from a tarball—a
compressed archive with a .tar.gz, .tgz, .tar.bz2, or other such extension—you’re
undoubtedly aware of the general process. It usually looks something like this:
$ gzip -cd hackers-delight-1.0.tar.gz | tar xvf -
...
$ cd hackers-delight-1.0
$ ./configure && make
...
$ sudo make install
...
NOTE If you’ve performed this sequence of commands, you probably know what they mean,
and you have a basic understanding of the software development process. If this is the
case, you’ll have no trouble following the content of this book.
Most developers understand the purpose of the make utility, but what’s the
point of configure? While Unix systems have followed the de facto standard
Unix kernel interface for decades, most software has to stretch beyond these
boundaries.
Originally, configuration scripts were hand-coded shell scripts designed
to set variables based on platform-specific characteristics. They also allowed
users to configure package options before running make. This approach worked
well for decades, but as the number of Linux distributions and custom Unix sys-
tems grew, the variety of features and installation and configuration options
exploded, so it became very difficult to write a decent portable configuration
script. In fact, it was much more difficult to write a portable configuration script
than it was to write makefiles for a new project. Therefore, most people just
created configuration scripts for their projects by copying and modifying the
script for a similar project.
In the early 1990s, it was apparent to many open source software devel-
opers that project configuration would become painful if something wasn’t
done to ease the burden of writing massive shell scripts to manage configura-
tion options. The number of GNU project packages had grown to hundreds,
and maintaining consistency between their separate build systems had become
more time consuming than simply maintaining the code for these projects.
These problems had to be solved.
Autoconf
Autoconf6 changed this paradigm almost overnight. David MacKenzie started
the Autoconf project in 1991, but a look at the AUTHORS file in the Savannah
Autoconf project7 repository will give you an idea of the number of people
that had a hand in making the tool. Although configuration scripts were long
and complex, users only needed to specify a few variables when executing
them. Most of these variables were simply choices about components, features,
and options, such as: Where can the build system find libraries and header files?
Where do I want to install my finished products? Which optional components do I
want to build into my products?
Instead of modifying and debugging hundreds of lines of supposedly
portable shell script, developers can now write a short meta-script file using a
concise, macro-based language, and Autoconf will generate a perfect config-
uration script that is more portable, more accurate, and more maintainable
than a hand-coded one. In addition, Autoconf often catches semantic or logic
errors that could otherwise take days to debug. Another benefit of Autoconf
is that the shell code it generates is portable between most variations of the
Bourne shell. Mistakes made in portability between shells are very common,
and, unfortunately, are the most difficult kinds of mistakes to find, because
no one developer has access to all Bourne-like shells.
NOTE While scripting languages like Perl and Python are now more pervasive than the Bourne
shell, this was not the case when the idea for Autoconf was first conceived.
Autoconf-generated configuration scripts provide a common set of options
that are important to all portable software projects running on POSIX systems.
These include options to modify standard locations (a concept I’ll cover in
more detail in Chapter 2), as well as project-specific options defined in the
configure.ac file (which I’ll discuss in Chapter 3).
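To make the shape of this input concrete, here is a minimal configure.ac sketch for the book's Jupiter example (the file paths are hypothetical, but AC_INIT, AC_CONFIG_SRCDIR, AC_PROG_CC, AC_CONFIG_FILES, and AC_OUTPUT are real Autoconf macros):

```
# configure.ac -- processed by autoconf to generate the configure script
AC_INIT([jupiter], [1.0])       # package name and version
AC_CONFIG_SRCDIR([src/main.c])  # sanity check: a file that must exist
AC_PROG_CC                      # locate a working C compiler
AC_CONFIG_FILES([Makefile])     # templates to instantiate at configure time
AC_OUTPUT                       # generate and run config.status
```

From these few macro calls, autoconf emits thousands of lines of portable Bourne shell.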
The autoconf package provides several programs, including the following:
- autoconf
- autoheader
- autom4te
6. For more on Autoconf origins, see the GNU webpage on the topic at http://www.gnu.org/software/autoconf.
7. See http://savannah.gnu.org/projects/autoconf.
- autoreconf
- autoscan
- autoupdate
- ifnames
autoconf
autoconf is a simple Bourne shell script. Its main task is to ensure that the cur-
rent shell contains the functionality necessary to execute the M4 macro proces-
sor. (I’ll discuss Autoconf’s use of M4 in detail in Chapter 3.) The remainder
of the script parses command-line parameters and executes autom4te.
autoreconf
The autoreconf utility executes the configuration tools in the autoconf,
automake, and libtool packages as required by each project. autoreconf
minimizes the amount of regeneration required to address changes in
timestamps, features, and project state. It was written as an attempt to
consolidate existing maintainer-written, script-based utilities that ran all
the required Autotools in the right order. You can think of autoreconf as a
sort of smart Autotools bootstrap utility. If all you have is a configure.ac file,
you can run autoreconf to execute all the tools you need, in the correct
order, so that configure will be properly generated.
autoheader
The autoheader utility generates a C/C++–compatible header file template
from various constructs in configure.ac. This file is usually called config.h.in.
When the end user executes configure, the configuration script generates
config.h from config.h.in. As maintainer, you’ll use autoheader to generate the
template file that you will include in your distribution package. (We’ll examine
autoheader in greater detail in Chapter 3.)
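As an illustration of what autoheader produces, here is a sketch of a typical config.h.in fragment (the symbols shown are common examples, not output from any particular project); configure rewrites each #undef line based on its check results:

```
/* config.h.in -- template generated by autoheader */

/* Define to 1 if you have the <stdlib.h> header file. */
#undef HAVE_STDLIB_H

/* Define to the full name and version of this package. */
#undef PACKAGE_STRING
```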
autoscan
The autoscan program generates a default configure.ac file for a new project; it
can also examine an existing Autotools project for flaws and opportunities
for enhancement. (We’ll discuss autoscan in more detail in Chapters 3 and 8.)
autoscan is very useful as a starting point for a project that uses a non-Autotools-
based build system, but it may also be useful for suggesting features that might
enhance an existing Autotools-based project.
autoupdate
The autoupdate utility is used to update configure.ac or the template (.in) files
to match the syntax supported by the current version of the Autotools.
ifnames
The ifnames program is a small and generally underused utility that accepts a list
of source file names on the command line and displays a list of C-preprocessor
definitions on the stdout device. This utility was designed to help maintainers
determine what to put into the configure.ac and Makefile.am files to make them
portable. If your project was written with some level of portability in mind,
ifnames can help you determine where those attempts at portability are located
in your source tree and give you the names of potential portability definitions.
autom4te
The autom4te utility is an intelligent caching wrapper for M4 that is used by
most of the other Autotools. The autom4te cache decreases the time successive
tools spend accessing configure.ac constructs by as much as 30 percent.
I won’t spend a lot of time on autom4te (pronounced automate) because
it’s primarily used internally by the Autotools. The only sign that it’s working
is the autom4te.cache directory that will appear in your top-level project direc-
tory after you run autoconf or autoreconf.
Working Together
Of the tools listed above, autoconf and autoheader are the only ones project
maintainers will use directly when generating a configure script, and autoreconf
is the only one that the developer needs to directly execute. Figure 1-1
shows the interaction between input files and autoconf and autoheader that
generates the corresponding product files.
Figure 1-1: A data flow diagram for autoconf and autoheader
NOTE I will use the data flow diagram format shown in Figure 1-1 throughout this book.
Dark boxes represent objects provided either by the user or by an Autotools package.
Light boxes represent generated objects. Boxes with square corners are scripts, and boxes
with rounded corners are data files. The meaning of most of the labels here should be
obvious, but at least one deserves an explanation: The term ac-vars refers to Autoconf-
specific replacement text. I’ll explain the gradient shading of the aclocal.m4 box shortly.
The primary task of this suite of tools is to generate a configuration script
that can be used to configure a project build directory. This script will not
rely on the Autotools themselves; in fact, autoconf is designed to generate
configuration scripts that will run on all Unix-like platforms and in most vari-
ations of the Bourne shell. This means that you can generate a configuration
script using autoconf and then successfully execute that script on a machine
that does not have the Autotools installed.
The autoconf and autoheader programs are executed either directly by the
user or indirectly by autoreconf. They take their input from your project’s
configure.ac file and various Autoconf-flavored M4 macro definition files,
using autom4te to maintain cache information. autoconf generates a configuration
script called configure, a very portable Bourne shell script that enables your
project to offer many useful configuration capabilities. autoheader generates
the config.h.in template based on certain macro definitions in configure.ac.
Automake
Once you’ve done it a few times, writing a basic makefile for a new project is
fairly simple. But problems may occur when you try to do more than just the
basics. And let’s face it—what project maintainer has ever been satisfied with
just a basic makefile?
Attention to detail is what makes an open source project successful. Users
lose interest in a project fairly easily—especially when functionality they expect
is missing or improperly written. For example, users have come to expect
makefiles to support certain standard targets or goals, specified on the make
command line, like this:
$ make install
Common make targets include all, clean, and install. In this example,
install is the target. But you should realize that none of these are real targets:
A real target is a filesystem object that is produced by the build system—usually a
file. When building an executable called doofabble, for instance, you’d expect
to be able to enter:
$ make doofabble
For this project, doofabble is a real target, and this command works for the
doofabble project. However, requiring the user to enter real targets on the
make command line is asking a lot of them, because each project must be built
differently—make doofabble, make foodabble, make abfooble, and so on. Standard-
ized targets for make allow all projects to be built in the same way using com-
monly known commands like make all or make clean. But commonly known
doesn’t mean automatic, and writing and maintaining makefiles that support
these targets is tedious and error prone.
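To see why maintaining these targets by hand is tedious, consider a minimal hand-written sketch of the conventional targets for the hypothetical doofabble program (recipe lines must begin with a tab character); Automake generates a far more complete version of this for you:

```
all: doofabble

doofabble: doofabble.c
	$(CC) $(CFLAGS) -o doofabble doofabble.c

clean:
	rm -f doofabble

install: doofabble
	cp doofabble $(DESTDIR)$(prefix)/bin/doofabble

.PHONY: all clean install
```

Even this sketch omits dozens of details (staged installs, uninstall, dist, distcheck, dependency tracking) that a real build system is expected to get right.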
Automake’s job is to convert a simplified specification of your project’s
build process into boilerplate makefile syntax that always works correctly the
first time and provides all the standard functionality expected. Automake creates
projects that support the guidelines defined in the GNU Coding Standards
(discussed in Chapter 2).
The automake package provides the following tools in the form of Perl
scripts:
- automake
- aclocal
automake
The automake program generates standard makefile templates (named
Makefile.in) from high-level build specification files (named Makefile.am).
These Makefile.am input files are essentially just regular makefiles. If you were
to put only the few required Automake definitions in a Makefile.am file, you’d
get a Makefile.in file containing several hundred lines of parameterized make
script.
If you add additional make syntax to a Makefile.am file, Automake will
move this code to the most functionally correct location in the resulting
Makefile.in file. In fact, you can write your Makefile.am files so all they contain
is ordinary make script, and the resulting makefiles will work just fine. This
pass-through feature gives you the ability to extend Automake’s functionality
to suit your project’s specific requirements.
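As a sketch of how little a Makefile.am can contain, the following lines (for a hypothetical program named doofabble) form a complete Automake input file, from which automake generates a Makefile.in of several hundred lines:

```
# Makefile.am -- complete Automake input for one program
bin_PROGRAMS = doofabble
doofabble_SOURCES = main.c util.c util.h
```

The bin_ prefix tells Automake where the product installs; the _SOURCES suffix lists what it's built from. Everything else is generated.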
aclocal
In the GNU Automake Manual, the aclocal utility is documented as a temporary
work-around for a certain lack of flexibility in Autoconf. Automake extends
Autoconf by adding an extensive set of macros, but Autoconf was not really
designed with this level of extensibility in mind.
The original documented method for adding user-defined macros to an
Autoconf project was to create a file called aclocal.m4, place the user-defined
macros in this file, and place the file in the same directory as configure.ac. Auto-
conf then automatically included this file of macros while processing configure.ac.
The designers of Automake found this extension mechanism too useful to
pass up; however, users would have been required to add an m4_include state-
ment to a possibly unnecessary aclocal.m4 file in order to include the Automake
macros. Since both user-defined macros and M4 itself are considered advanced
concepts, this was deemed too harsh a requirement.
aclocal was designed to solve this problem—this utility generates an
aclocal.m4 file for a project that contains both user-defined macros and all
required Automake macros.8 Instead of adding user-defined macros directly
to aclocal.m4, project maintainers should now add them to a new file called
acinclude.m4.
8. Automake macros are copied into this file, but the user-written acinclude.m4 file is merely
referenced with an m4_include statement at the end of the file.
To make it clear to readers that Autoconf doesn’t depend on Automake
(and perhaps due to a bit of stubbornness), the GNU Autoconf Manual doesn’t
make much mention of the aclocal utility. The GNU Automake Manual originally
suggested that you rename aclocal.m4 to acinclude.m4 when adding Automake
to an existing Autoconf project, and this approach is still commonly used.
The flow of data for aclocal is depicted in Figure 1-2.
Figure 1-2: A data flow diagram for aclocal
However, the latest documentation for both Autoconf and Automake
suggests that the entire paradigm is now obsolete. Developers should now
specify a directory that contains a set of M4 macro files. The current recom-
mendation is to create a directory in the project root directory called m4 and
add macros as individual .m4 files to it. All files in this directory will be gath-
ered into aclocal.m4 before Autoconf processes configure.ac.9
It should now be more apparent why the aclocal.m4 box in Figure 1-1
couldn’t decide which color it should be. When you’re using it without Auto-
make and Libtool, you write the aclocal.m4 file by hand. However, when you’re
using it with Automake, the file is generated by the aclocal utility, and you
provide project-specific macros either in acinclude.m4 or in an m4 directory.
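The m4-directory convention can be enabled with one real Autoconf macro and one Automake variable; a minimal sketch of the two additions looks like this:

```
# In configure.ac: declare the project macro directory.
AC_CONFIG_MACRO_DIR([m4])

# In the top-level Makefile.am: make aclocal search the same directory.
ACLOCAL_AMFLAGS = -I m4
```

With these in place, any .m4 file you drop into the m4 directory becomes available to configure.ac.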
Libtool
How do you build shared libraries on different Unix platforms without add-
ing a lot of very platform-specific conditional code to your build system and
source code? This is the question that the libtool package tries to address.
There’s a significant amount of common functionality among Unix-like
platforms. However, one very significant difference has to do with how shared
libraries are built, named, and managed. Some platforms name their librar-
ies libname.so, others use libname.a or even libname.sl, and still others don’t
even provide native shared libraries. Some platforms provide libdl.so to allow
software to dynamically load and access library functionality at runtime, while
others provide different mechanisms, and some platforms don’t provide this
functionality at all.
9. As with acinclude.m4, this gathering is virtual; aclocal.m4 merely contains m4_include statements
that reference these other files in place.
The developers of Libtool have carefully considered all of these differences.
Libtool supports dozens of platforms, providing not only a set of Autoconf
macros that hide library naming differences in makefiles, but also offering
an optional library of dynamic loader functionality that can be added to
programs. This functionality allows maintainers to make their runtime, dynamic
shared-object management code more portable.
The libtool package provides the following programs, libraries, and
header file:
- libtool (program)
- libtoolize (program)
- ltdl (static and shared libraries)
- ltdl.h (header file)
libtool
The libtool shell script that ships with the libtool package is a generic version
of the custom script that libtoolize generates for a project.
libtoolize
The libtoolize shell script prepares your project to use Libtool. It generates a
custom version of the generic libtool script and adds it to your project directory.
This custom script is shipped with the project along with the Automake-
generated makefiles, which execute the script on the user’s system at the
appropriate time.
ltdl, the Libtool C API
The libtool package also provides the ltdl library and associated header files,
which provide a consistent runtime shared-object manager across platforms.
The ltdl library may be linked statically or dynamically into your programs,
giving them a consistent runtime shared-library access interface between
platforms.
Figure 1-3 illustrates the interaction between the automake and libtool
scripts, and the input files used to create products that configure and build
your projects.
Automake and Libtool are both standard pluggable options that can be
added to configure.ac with just a few simple macro calls.
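As a sketch of how pluggable this is, the following fragments add Libtool to a project (LT_INIT, AM_INIT_AUTOMAKE, and the lib_LTLIBRARIES primary are real macros and variables; the library name is hypothetical):

```
# configure.ac additions:
AM_INIT_AUTOMAKE     # enable Automake
LT_INIT              # enable Libtool support

# Makefile.am: build a Libtool library rather than a plain one.
lib_LTLIBRARIES = libdoof.la
libdoof_la_SOURCES = doof.c doof.h
```

Note the .la extension and the _la_ infix: Libtool's wrapper decides at build time which native shared and static library formats to produce.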
Figure 1-3: A data flow diagram for automake and libtool
Building Your Package
As maintainer, you probably build your software packages fairly often, and
you’re also probably intimately familiar with your project’s components, archi-
tecture, and build system. However, you should make sure that your users’
build experiences are much simpler than your own. One way to do this is to
give users a simple, easy-to-understand pattern to follow when building your
software packages. In the following sections, I’ll show you the build pattern
provided by the Autotools.
Running configure
After running the Autotools, you’re left with a shell script called configure
and one or more Makefile.in files. These files are intended to be shipped with
your project release distribution packages. Your users will download these
packages, unpack them, and enter ./configure && make from the top-level
project directory. Then the configure script will generate makefiles (called
Makefile) from the Makefile.in templates created by automake and a config.h
header file from the config.h.in template generated by autoheader.
Automake generates Makefile.in templates rather than makefiles because
without makefiles, your users can’t run make; you don’t want them to run make
until after they’ve run configure, and this functionality guards against them
doing so. Makefile.in templates are nearly identical to makefiles you might write
by hand, except that you didn’t have to. They also do a lot more than most
people are willing to hand code. Another reason for not shipping ready-to-run
makefiles is that it gives configure the chance to insert platform characteristics
and user-specified optional features directly into the makefiles. This makes them
a better fit for their target platforms and the end user’s build preferences.
Figure 1-4 illustrates the interaction between configure and the scripts it
executes during the configuration process in order to create the makefiles
and the config.h header file.
Figure 1-4: A data flow diagram for configure
The configure script has a bidirectional relationship with another script
called config.status. You may have thought that your configure script generated
your makefiles. But actually, the only file (besides a log file) that configure
generates is config.status.
configure is designed to determine platform characteristics and features
available on the user’s system, as specified in configure.ac. Once it has this
information, it generates config.status, which contains all of the check results,
and then it executes this script. The config.status script, in turn, uses the
check information embedded within it to generate platform-specific config.h
and makefiles, as well as any other files specified for instantiation in configure.ac.
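At its heart, the template instantiation that config.status performs is text substitution: each @VAR@ token in a .in file is replaced with a configured value. The real script is far more elaborate, but a rough sketch of the idea using sed (with invented values) looks like this:

```shell
# Create a tiny Makefile.in template, then substitute values into it
# roughly the way config.status would (values are invented for the demo).
printf 'prefix = @prefix@\nbindir = @bindir@\n' > Makefile.in

sed -e 's|@prefix@|/usr/local|' \
    -e 's|@bindir@|${prefix}/bin|' \
    Makefile.in > Makefile

cat Makefile
```

Every Makefile.in and config.h.in in a project is instantiated this way, which is why the generated files fit the user's platform and options.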
NOTE As the double-ended fat arrow in Figure 1-4 shows, config.status can also call configure.
When used with the --recheck option, config.status will call configure using the
same command-line options used to originally generate config.status.
The configure script also generates a log file called config.log, which will
contain very useful information in the event that an execution of configure
fails on the user’s system. As the maintainer, you can use this information for
debugging. The config.log file also logs how configure was executed. (You can
run config.status --version to discover the command-line options used to
generate config.status.) This feature can be particularly handy when, for
example, a user returns from a long vacation and can’t remember which
options he used to originally generate the project build directory.
NOTE To regenerate makefiles and the config.h header files, just enter ./config.status from
within the project build directory. The output files will be generated using the same
options originally used to generate the config.status file.
Building Outside the Source Directory
A little-known feature of Autotools build environments is that they don’t need
to be generated within a project source tree. That is, if a user executes configure
from a directory other than the project source directory, he can generate a
full build environment within an isolated build directory.
In the following example, Joe User downloads doofabble-3.0.tar.gz,
unpacks it, and creates two sibling directories called doofabble-3.0.debug and
doofabble-3.0.release. He changes into the doofabble-3.0.debug directory, executes
doofabble-3.0’s configure script, using a relative path, with a doofabble-specific
debug option, and then runs make from within this same directory. Finally, he
switches over to the doofabble-3.0.release directory and does the same thing,
this time running configure without the debug option enabled:
$ gzip -dc doofabble-3.0.tar.gz | tar xf -
$ mkdir doofabble-3.0.debug
$ mkdir doofabble-3.0.release
$ cd doofabble-3.0.debug
$ ../doofabble-3.0/configure --enable-debug
...
$ make
...
$ cd ../doofabble-3.0.release
$ ../doofabble-3.0/configure
...
$ make
...
Users generally don’t care about remote build functionality, because all
they usually want to do is configure, build, and install your code on their plat-
forms. Maintainers, on the other hand, find remote build functionality very
useful, as it allows them to not only maintain a reasonably pristine source tree,
but it also allows them to maintain multiple build environments for their project,
each with complex configuration options. Rather than reconfigure a single
build environment, a maintainer can simply switch to another build directory
that has been configured with different options.
Running make
Finally, you run plain old make. The designers of the Autotools went to a lot of
trouble to ensure that you didn’t need any special version or brand of make.
Figure 1-5 depicts the interaction between make and the makefiles that are
generated during the build process.
As you can see, make runs several generated scripts, but these are all really
ancillary to the make process. The generated makefiles contain commands
that execute these scripts under the appropriate conditions. These scripts
are part of the Autotools, and they are either shipped with your package or
generated by your configuration script.
Figure 1-5: A data flow diagram for make
Installing the Most Up-to-Date Autotools
If you’re running a variant of Linux and you’ve chosen to install the compil-
ers and tools used for developing C-language software, you probably already
have some version of the Autotools installed on your system. To determine
which versions of autoconf, automake, and libtool you’re using, simply open a
terminal window and type the following commands:
$ which autoconf
/usr/local/bin/autoconf
$ autoconf --version
autoconf (GNU Autoconf) 2.65
Copyright (C) 2009 Free Software Foundation, Inc.
License GPLv3+/Autoconf: GNU GPL version 3 or later
<http://gnu.org/licenses/gpl.html>, <http://gnu.org/licenses/exceptions.html>
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.
Written by David J. MacKenzie and Akim Demaille.
$
$ which automake
/usr/local/bin/automake
$ automake --version
automake (GNU automake) 1.11
Copyright (C) 2009 Free Software Foundation, Inc.
License GPLv2+: GNU GPL version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.
Written by Tom Tromey <tromey@redhat.com>
and Alexandre Duret-Lutz <adl@gnu.org>.
$
$ which libtool
/usr/local/bin/libtool
$ libtool --version
ltmain.sh (GNU libtool) 2.2.6b
Written by Gordon Matzigkeit <gord@gnu.ai.mit.edu>, 1996
Copyright (C) 2008 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
$
NOTE If you have the Linux-distribution varieties of these Autotools packages installed on
your system, the executables will probably be found in /usr/bin, rather than /usr/
local/bin, as you can see from the output of the which command here.
If you choose to download, build, and install the latest version of any one
of these packages from the GNU website, you must do the same for all of them,
because the automake and libtool packages install macros into the Autoconf
macro directory. If you don’t already have the Autotools installed, you can
install them from their GNU distribution source archives with the following
commands (be sure to change the version numbers as necessary):
$ mkdir autotools && cd autotools
$ wget -q ftp://ftp.gnu.org/gnu/autoconf/autoconf-2.65.tar.gz
$ gzip -cd autoconf* | tar xf -
$ cd autoconf*
$ ./configure && make all check
...
$ su
Password: ******
# make install
...
# exit
$ cd ..
$
$ wget -q ftp://ftp.gnu.org/gnu/automake/automake-1.11.tar.gz
$ gzip -cd automake* | tar xf -
$ cd automake*
$ ./configure && make all check
...
$ su
Password: ******
# make install
# exit
$ cd ..
$
$ wget -q ftp://ftp.gnu.org/gnu/libtool/libtool-2.2.6b.tar.gz
$ gzip -cd libtool* | tar xf -
$ cd libtool*
$ ./configure && make all check
...
$ su
Password: ******
# make install
...
# exit
$ cd ..
$
You should now be able to successfully execute the version check commands
from the previous example.
Summary
In this chapter, I’ve presented a high-level overview of the Autotools to give
you a feel for how everything ties together. I’ve also shown you the pattern to
follow when building software from distribution tarballs created by Autotools
build systems. Finally, I’ve shown you how to install the Autotools and how to
tell which versions you have installed.
In Chapter 2, we’ll step away from the Autotools briefly and begin creat-
ing a hand-coded build system for a toy project called Jupiter. You’ll learn the
requirements of a reasonable build system, and you’ll become familiar with
the rationale behind the original design of the Autotools. With this background
knowledge, you’ll begin to understand why the Autotools do things the way they
do. I can’t really emphasize this enough: Chapter 2 is one of the most important
chapters in this book.
UNDERSTANDING THE GNU CODING STANDARDS
I don’t know what’s the matter with people: they don’t
learn by understanding, they learn by some other way—
by rote or something. Their knowledge is so fragile!
—Richard Feynman,
Surely You’re Joking, Mr. Feynman!
In Chapter 1, I gave an overview of the GNU
Autotools and some resources that can help
reduce the learning curve required to master
them. In this chapter, we’re going to step back a
little and examine project organization techniques
that you can apply to any project, not just one that uses
the Autotools.
When you’re done reading this chapter, you should be familiar with the
common make targets and why they exist. You should also have a solid under-
standing of why projects are organized the way they are. By the time you fin-
ish this chapter, you’ll be well on your way to becoming an Autotools expert.
The information provided in this chapter comes primarily from two sources:
- The GNU Coding Standards (GCS)1
- The Filesystem Hierarchy Standard (FHS)2
1. See the Free Software Foundation’s GNU Coding Standards at http://www.gnu.org/prep/standards/.
2. See Daniel Quinlan’s overview at http://www.pathname.com/fhs/.
If you’d like to brush up on your make syntax, you may also find the GNU
Make Manual3 very useful. If you’re particularly interested in portable make
syntax (and you probably should be), then check out the POSIX man page
for make.4
Creating a New Project Directory Structure
There are two questions you need to ask yourself when you’re setting up the
build system for an open source software project:
- Which platforms will I target?
- What do my users expect?
The first is an easy question—you get to decide which platforms to target,
but you shouldn’t be too restrictive. Open source software projects attain
greatness by virtue of the number of people who’ve adopted them, and arbi-
trarily limiting the number of platforms reduces the potential size of your
community.
The second question is more difficult to answer. First, let’s narrow the
scope to something manageable. What you really need to ask is: What do my
users expect of my build system? Experienced open source software developers
become familiar with these expectations by downloading, unpacking, building,
and installing thousands of packages. Eventually, they come to know intuitively
what users expect of a build system. But, even so, the processes of package
configuration, build, and installation vary widely, so it’s difficult to define any
solid norm.
Rather than taking a survey of every build system out there yourself,
you can consult the Free Software Foundation (FSF), sponsor of the GNU
project, which has done a lot of the leg work for you. The FSF is one of the
best definitive sources for information on free, open source software, includ-
ing the GCS, which covers a wide variety of topics related to writing, publishing,
and distributing free, open source software. Even many non-GNU open source
software projects align themselves with the GCS. Why? Well, they invented
the concept of free software, and their ideas make sense, for the most part.[5]
There are dozens of issues to consider when designing a system that manages
packaging, building, and installing software, and the GCS takes most of them
into account.
3. See the Free Software Foundation’s GNU Make Manual at http://www.gnu.org/software/make/manual/.
4. See the Open Group Base Specifications, Issue 6, at http://www.opengroup.org/onlinepubs/009695399/utilities/make.html.
5. In truth, it’s likely that the standards that came about from the BSD project were written much
earlier than the standards of the FSF, but the FSF had a big hand in spreading the information
to many different platforms and non–system specific software projects. Thus, it had a large part
in making these standards publicly visible and widely used.
Project Structure
We’ll start with a basic sample project and build on it as we continue our
exploration of source-level software distribution. I’ll call our project Jupiter
and I’ll create a project directory structure using the following commands:
$ cd projects
$ mkdir -p jupiter/src
$ touch jupiter/Makefile
$ touch jupiter/src/Makefile
$ touch jupiter/src/main.c
$ cd jupiter
$
We now have one source code directory called src, one C source file called
main.c, and a makefile for each of the two directories in our project. Minimal,
yes; but this is a new endeavor, and everyone knows that the key to a success-
ful open source software project is evolution. Start small and grow as needed—
and as you have the time and inclination.
Let’s start by adding support for building and cleaning our project. (We’ll
need to add other important capabilities to our build system later on, but
these two will get us going.) The top-level makefile does very little at this
point; it merely passes requests down to src/Makefile, recursively. This
constitutes a fairly common type of build system, known as a recursive build
system, so named because makefiles recursively invoke make on subdirectory
makefiles.[6]
WHAT’S IN A NAME?
You probably know that open source software projects generally have quirky names—
they might be named after some small furry animal that has (vaguely) similar character-
istics to the software, some device, an invention, a Latin term, a past hero, or an
ancient god. Some names are just made-up words or acronyms that are catchy and
easy to pronounce. Another significant characteristic of a good project name is
uniqueness—it’s important that your project be easy to distinguish from others. Addi-
tionally, you should make sure your project’s name does not have negative connotations
in any language or culture.
6. Peter Miller’s seminal paper, “Recursive Make Considered Harmful” (http://miller.emu.id.au/pmiller/books/rmch/), published over 10 years ago, discusses some of the problems recursive build systems can cause. I encourage you to read this paper and understand the issues Miller presents. While the issues are valid, the sheer simplicity of implementing and maintaining a recursive build system makes it, by far, the most widely used form of build system.
Listings 2-1 through 2-3 show the contents of each of these three files,
thus far.
all clean jupiter:
        cd src && $(MAKE) $@

.PHONY: all clean
Listing 2-1: Makefile: An initial draft of a top-level makefile for Jupiter
all: jupiter

jupiter: main.c
        gcc -g -O0 -o $@ main.c

clean:
        -rm jupiter

.PHONY: all clean
Listing 2-2: src/Makefile: The first draft of Jupiter’s src directory makefile
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char * argv[])
{
    printf("Hello from %s!\n", argv[0]);
    return 0;
}
Listing 2-3: src/main.c: The first version of the one source file in the Jupiter project
NOTE As you read this code, you will probably notice places where a makefile or a source code
file contains a construct that is not written in the simplest manner or is perhaps not
written the way you would have chosen to write it. There is a method to my madness:
I’ve tried to use constructs that are portable to many flavors of the make utility.
Now let’s discuss the basics of make. If you’re already pretty well versed
in it, then you can skip the next section. Otherwise, give it a quick read, and
we’ll return our attention to the Jupiter project later in the chapter.
Makefile Basics
If you don’t use make on a regular basis, it’s often difficult to remember exactly
what goes where in a makefile, so here are a few things to keep in mind. Besides
comments, which begin with a hash mark (#), there are only three basic types
of entities in a makefile:
zVariable assignments
zRules
zCommands
While there are several other types of constructs in a makefile (including
conditional statements, directives, extension rules, pattern rules, function
variables, and include statements, among others), for our purposes, we’ll just
touch lightly on them as needed instead of covering them all in detail. This
doesn’t mean they’re unimportant, however—on the contrary, they’re very
useful if you’re going to write your own complex build system by hand. How-
ever, our purpose is to gain the background necessary for understanding the
GNU Autotools, so I’ll only cover the aspects of make you need to know to
accomplish that goal.
If you want a broader education on make syntax, refer to the GNU Make
Manual. For strictly portable syntax, the POSIX man page for make is an excel-
lent reference. If you want to become a make expert, be prepared to spend a
good deal of time studying these resources—there’s much more to the make
utility than is initially apparent.
Commands and Rules
When a line in a makefile begins with a TAB character, make will always consider
it to be a command. Indeed, one of the most frustrating aspects of makefile
syntax to neophytes and experts alike is that commands must be prefixed with
an essentially invisible character. The error messages generated by the legacy
UNIX make utility when a required TAB is missing (or has been converted to
spaces by your editor) or an unintentional TAB is inserted are obscure at best.
GNU make does a better job with such error messages. Nonetheless, be careful
to use leading TAB characters properly in your makefiles—always and only
before commands.
A list of one or more commands is always associated with a preceding
rule. A rule takes the form of a target followed by a list of dependencies. In
general, targets are objects that need to be built, and dependencies are objects
that provide source material for targets. Thus, targets are said to depend upon
the dependencies. Dependencies are essentially prerequisites of the targets,
and thus they should be updated first.[7]
Listing 2-4 shows the general layout of a makefile.
var1=val1
var2=val2
...
target1 : t1_dep1 t1_dep2 ... t1_depN
<TAB> shell-command1a
<TAB> shell-command1b
...
target2 : t2_dep1 t2_dep2 ... t2_depN
<TAB> shell-command2a
<TAB> shell-command2b
...
Listing 2-4: The general layout of a makefile
7. You’ll often hear dependencies referred to as prerequisites for this reason.
The make utility is a rule-based command engine, and the rules at work
indicate which commands should be executed and when. When you prefix
a line with a TAB character, you’re telling make that you want it to execute the
following statements from a shell according to the preceding rule. The exist-
ence and timestamps of the files mentioned in the rules indicate whether the
commands should be executed, and in what order.
As make processes the text in a makefile, it builds a web of dependency
chains (technically called a directed graph). When building a particular target,
make must walk backward through the entire graph to the beginning of each
“chain.” make then executes the commands for each rule in these chains,
beginning with the rule farthest from the target and working forward to the
rule for the desired target. As make discovers targets that are older than their
dependencies, it must execute the associated set of commands to update
those targets before it can process the next rule in the chain. As long as the
rules are written correctly, this algorithm ensures that make will build a com-
pletely up-to-date product using the least number of operations possible.
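This up-to-date check is easy to observe. The following sketch (the file
names are invented for the demo, and it assumes make is on your PATH)
builds a target once, then uses make’s -q query option to confirm that a
second build would have nothing to do:

```shell
#!/bin/sh
# Sketch: make re-runs commands only when a target is out of date.
# File names are invented for the demo; assumes `make` is on PATH.
set -e
workdir=$(mktemp -d)
cd "$workdir"

# out.txt is rebuilt only when in.txt is newer than it.
printf 'out.txt: in.txt\n\tcp in.txt out.txt\n' > Makefile
echo data > in.txt

make -s out.txt        # first run: out.txt is missing, so cp executes
make -q out.txt        # -q (query) exits 0 when the target is up to date
uptodate=$?
echo "up to date: $uptodate"
```

Touching in.txt afterward would make the query exit nonzero again, and a
plain make would re-run the cp command.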
Variables
Lines in a makefile containing an equal sign (=) are variable definitions.
Variables in makefiles are somewhat similar to shell or environment variables,
but there are some key differences.
In Bourne-shell syntax, you’d reference a variable in this manner: ${my_var}.
The syntax for referencing variables in a makefile is identical, except that you
have the choice of using parentheses or curly brackets: $(my_var). To minimize
confusion, it has become somewhat of a convention to use parentheses rather
than curly brackets when dereferencing make variables. For single-character
make variables, using these delimiters is optional, but you should use them in
order to avoid ambiguities. For example, $X is functionally equivalent to $(X)
or ${X}, but $(my_var) would require parentheses so make does not interpret
the reference as $(m)y_var.
NOTE To dereference a shell variable inside a make command, escape the dollar sign by doubling
it—for example, $${shell_var}. Escaping the dollar sign tells make not to interpret the
variable reference, but rather to treat it as literal text in the command.
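Here is a minimal sketch of that escaping rule in action (the makefile
target and variable names are invented; it assumes make is on your PATH):

```shell
#!/bin/sh
# Sketch: a doubled dollar sign ($$) in a make command reaches the
# shell as a single $, so the shell (not make) expands the variable.
# Names are invented for the demo; assumes `make` is on PATH.
set -e
workdir=$(mktemp -d)

# The makefile's command line contains $${greeting}; make strips one
# dollar sign, so the shell sees ${greeting} and expands it normally.
printf 'hello:\n\tgreeting=world; echo "hello $${greeting}"\n' \
    > "$workdir/Makefile"

out=$(cd "$workdir" && make -s hello)
echo "$out"
```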
By default, make reads the process environment into its variable table
before processing the makefile; this allows you to access most environment
variables without explicitly defining them in the makefile. Note, however,
that variables set inside the makefile will override those obtained from the
environment.[8] It’s generally not a good idea to depend on the existence of
environment variables in your build process, although it’s okay to use them
conditionally. In addition, make defines several useful variables of its own,
such as the MAKE variable, the value of which is the complete command line
(with options) used to invoke the current make process.
8. You can use the -e option on the make command line to reverse this default behavior so that
variables defined within the environment override those defined within the makefile. However,
relying on this option can lead to problems caused by subtle environmental differences between
systems.
You can assign variables at any point in the makefile. However, you should
be aware that make processes a makefile in two passes. In the first pass, it gathers
variables and rules into tables and internal structures. In the second pass, it
resolves dependencies defined by the rules, invoking those rules as necessary
to rebuild the dependencies based on filesystem timestamps. If a dependency
in a rule is newer than the target or if the target is missing, then make executes
the commands of the rule to update the target. Some variable references are
resolved immediately during the first pass while processing rules, and others
are resolved later during the second pass while executing commands.
A Separate Shell for Each Command
As it processes rules, make executes each command independently of those
around it. That is, each individual command under a rule is executed in its
own shell. This means that you cannot export a shell variable in one command
and then try to access its value in the next.
To do something like this, you would have to string commands together
on the same command line with command separator characters (e.g., semi-
colons, in Bourne shell syntax). When you write commands like this, make passes
the set of concatenated commands as one command line to the same shell.
To avoid long command lines and increase readability, you can wrap them
using a backslash at the end of each line—usually after the semicolon. The
wrapped portion of such commands may also be preceded by a TAB character.
POSIX specifies that make should remove all leading TAB characters (even
those following escaped newlines) before processing commands, but be aware
that some make implementations do output—usually harmlessly—the TAB
characters embedded within wrapped commands.[9]
Listing 2-5 shows a few simple examples of multiple commands that will
be executed by the same shell.
foo: bar.c
        sources=bar.c; \
        gcc -o foo $${sources}

fud: baz.c
        sources=baz.c; gcc -o fud $${sources}

doo: doo.c
        TMPDIR=/var/tmp gcc -o doo doo.c
Listing 2-5: Some examples of multiple commands executed by the same shell
In the first example (the foo target), both lines are executed by the same
shell because the backslash escapes the newline character between the lines.
The make utility will remove any escaped newline characters before passing
a single, multi-command statement to the shell. The second example (the
fud target) is identical to the first, from make’s perspective.
9. Experiments have shown that many make implementations generate cleaner output if you
don’t use TAB characters after escaped newlines. Nevertheless, the community seems to have
settled on the consistent use of TAB characters in all command lines, whether wrapped or not.
The third example (the doo target) is a bit different. In this case, I’ve
defined the TMPDIR variable only for the child process that will run gcc.[10]
Note the missing semicolon; as far as the shell is concerned, this is a
single command.[11]
NOTE If you choose to wrap commands with a trailing backslash, be sure that there are no
spaces or other invisible characters after it. The backslash escapes the newline character,
so it must immediately precede that character.
Variable Binding
Variables referenced in commands may be defined after the command in the
makefile because such references are not bound to their values until just
before make passes the command to the shell for execution—long after the
entire makefile has been read. In general, make binds variables to values as
late as it possibly can.
Since commands are processed at a later stage than rules, variable refer-
ences in commands are bound later than those in rules. Variable references
found in rules are expanded when make builds the directed graph from the
rules in the makefile. Thus, a variable referenced in a rule must be fully defined
in a makefile before the referencing rule. Listing 2-6 shows a portion of a
makefile that illustrates both of these concepts.
...
mytarget=foo

$(mytarget): $(mytarget).c
        gcc -o $(mytarget) $(mytarget).c

mytarget=bar
...
Listing 2-6: Variable expansion in a makefile
In the rule, both references to $(mytarget) are expanded to foo because
they’re processed during the first pass, when make is building the variable
list and directed graph. However, the outcome is probably not what you’d
expect, because both references to $(mytarget) in the command are not
expanded until much later, long after make has already assigned bar to
mytarget, overwriting the original assignment of foo.
Listing 2-7 shows the same rule and command the way make sees them
after the variables are fully expanded.
...
foo: foo.c
        gcc -o bar bar.c
...
Listing 2-7: The results after variable expansion of the code in Listing 2-6
10. gcc uses the value of the TMPDIR variable to determine where to write temporary intermediate
files between tools such as the C-preprocessor and the compiler.
11. You cannot dereference TMPDIR on the command line when it’s defined in this manner. Only
the child process has access to this variable; the current shell does not.
The moral of this story is that you should understand where variables will
be expanded in makefile constructs so you’re not surprised when make refuses
to act in a sane manner when it processes your makefile. It is good practice
(and a good way to avoid headaches) to always assign variables before you
intend to use them. For more information on immediate and deferred
expansion of variables in makefiles, refer to “How make Reads a Makefile” in
the GNU Make Manual.
Rules in Detail
Lines in my sample makefiles that are not variable assignments (i.e., don’t con-
tain an equal sign), and are not commands (i.e., are not prefixed with a TAB
character) are all rules of one type or another. The rules used in my examples
are known as common make rules, containing a single colon character (:). The
colon separates targets on the left from dependencies on the right.
Remember that targets are products—that is, filesystem entities that can
be produced by running one or more commands, such as a C or C++ compiler,
a linker, or a documentation generator like Doxygen or LaTeX. Dependencies,
on the other hand, are source objects, or objects from which targets are cre-
ated. These may be computer language source files, intermediate products built
by a previous rule, or anything else that can be used by a command as a resource.
You can specify any target defined within a makefile rule directly on the
make command line, and make will execute all the commands necessary to gen-
erate that target.
NOTE If you don’t specify any targets on the make command line, make will use the default
target—the first one it finds in the makefile.
For example, a C compiler takes dependency main.c as input and generates
target main.o. A linker then takes dependency main.o as input and generates
a named executable target—program, in this case.
Figure 2-1 shows the flow of data as it might be specified by the rules
defined in a makefile.
Figure 2-1: A data flow diagram for the compile and link processes
The make utility implements some fairly complex logic to determine when
a rule should be run, based on whether a target exists and whether it is older
than its dependencies. Listing 2-8 shows a makefile containing rules that exe-
cute the actions in Figure 2-1.
[Figure 2-1 depicts gcc compiling the user-provided main.c into the
generated main.o, and ld linking main.o into the generated program
executable; gcc and ld are system executables.]
program: main.o print.o display.o
        ld main.o print.o display.o ... -o program

main.o: main.c
        gcc -c -g -O2 -o main.o main.c

print.o: print.c
        gcc -c -g -O2 -o print.o print.c

display.o: display.c
        gcc -c -g -O2 -o display.o display.c
Listing 2-8: Using multiple make rules to compile and link a program
The first rule in this makefile says that program depends on main.o, print.o,
and display.o. The remaining rules say that each .o file depends on the corre-
sponding .c file. Ultimately, program depends on the three source files, but the
object files are necessary as intermediate dependencies because there are
two steps to the process—compile and link—with a result in between. For
each rule, make uses an associated list of commands to build the rule’s target
from its list of dependencies.
Unix compilers are designed as higher-level tools than linkers. They
have built-in, low-level knowledge about system-specific linker requirements.
In the makefile in Listing 2-8, the ellipsis in the ld command is a placeholder for
a list of system-specific, low-level objects and libraries required to build all
programs on this system. The compiler can be used to call the linker,
silently passing these system-specific objects and libraries. (It’s so effective
and widely used that it’s often difficult to discover how to manually execute
the linker on a given system.) Listing 2-9 shows how you might rewrite the
makefile from Listing 2-8 to use the compiler to compile the sources and call
the linker in a single rule.[12]
sources = main.c print.c display.c

program: $(sources)
        gcc -g -O0 -o program $(sources)
Listing 2-9: Using a single make rule to compile sources into an executable
In this example, I’ve added a make variable (sources) that allows us to con-
solidate all product dependencies into one location. We now have a list of
source files captured in a variable definition that is referenced in two places:
in the dependency list and on the command line.
12. Using a single rule and command to process both steps is possible in this case because the
example is very basic. For larger projects, skipping from source to executable in a single step is
usually not the wisest way to manage the build process. However, in either case, using the compiler
to call the linker can ease the burden of determining the many system objects that need to be
linked into an application, and, in fact, this very technique is used quite often. More complex
examples, wherein each file is compiled separately, use the compiler to compile each source file
into an object file and then use the compiler to call the linker to link them all together into an
executable.
Automatic Variables
There may be other kinds of objects in a dependency list that are not in the
sources variable, including precompiled objects and libraries. These other
objects would have to be listed separately, both in the rule and on the com-
mand line. Wouldn’t it be nice if we had shorthand notation for referencing
the rule’s entire dependency list in the commands?
As it happens, there are various automatic variables that can be used to
reference portions of the controlling rule during the execution of a command.
Unfortunately, most of these are all but useless if you care about portability
between implementations of make. The $@ variable (which references the current
target) happens to be portable and useful, but most of the other automatic
variables are too limited to be very useful.[13] The following is a complete list
of portable automatic variables defined by POSIX for make:
- $@ refers to the full target name of the current target or the archive
  filename part of a library archive target. This variable is valid in both
  explicit and implicit rules.
- $% refers to a member of an archive and is valid only when the current
  target is an archive member—that is, an object file that is a member of a
  static library. This variable is valid in both explicit and implicit rules.
- $? refers to the list of dependencies that are newer than the current target.
  This variable is valid in both explicit and implicit rules.
- $< refers to the member of the dependency list whose existence allowed the
  rule to be chosen for the target. This variable is only valid in implicit rules.
- $* refers to the current target name with its suffix deleted. This variable
  is guaranteed by POSIX to be valid only in implicit rules.
GNU make dramatically extends the POSIX-defined list, but since GNU
extensions are not portable, it’s unwise to use any of these except $@.
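To see two of these at work in an explicit rule, here is a small sketch (the
file names are invented for the demo, and it assumes make is on your PATH)
in which $@ names the target and $? lists only the dependencies that
triggered the rebuild:

```shell
#!/bin/sh
# Sketch: the portable automatic variables $@ (current target) and
# $? (dependencies newer than the target) in an explicit rule.
# File names are invented for the demo; assumes `make` is on PATH.
set -e
workdir=$(mktemp -d)
cd "$workdir"

printf 'combined.txt: a.txt b.txt\n\tcat a.txt b.txt > $@\n\t@echo "rebuilt $@ from: $?" > report.txt\n' > Makefile

echo alpha > a.txt
echo beta > b.txt
make -s combined.txt   # first build: both dependencies count as newer

sleep 1
touch b.txt            # now only b.txt is newer than combined.txt
make -s combined.txt
report=$(cat report.txt)
echo "$report"         # only b.txt appears in $?
```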
Dependency Rules
In Listing 2-10, I’ve replaced the sources variable with an objects variable and
replaced the list of source files with a list of object files. This listing also elim-
inates redundancy by making use of both standard and automatic variables.
objects = main.o print.o display.o

main.o: main.c print.h display.h
print.o: print.c print.h
display.o: display.c display.h

program: $(objects)
        gcc -g -O0 -o $@ $(objects)
Listing 2-10: Using automatic variables in a command
13. This is because POSIX is not so much a specification for the way things should be done as it is a
specification for the way things are done. Essentially, the purpose of the POSIX standard is to keep
Unix implementations from deviating any further from the norm than necessary. Unfortunately,
most make implementations had wide acceptance within their own communities long before
the idea for a POSIX standard was conceived.
I’ve also added three dependency rules, which are rules without commands
that clarify the relationships between compiler output files and dependent
source and header files. Because print.h and display.h are (presumably) included
by main.c, main.c must be recompiled if either of those files changes; how-
ever, make has no way of knowing that these two header files are included by
main.c. Dependency rules allow the developer to tell make about such back-end
relationships.
Implicit Rules
If you attempt to mentally follow the dependency graph that make would build
from the rules within the makefile in Listing 2-10, you’ll find what appears to
be a hole in the web. According to the last rule in the file, the program executable
depends on main.o, print.o, and display.o. This rule also provides the command
to link these objects into an executable (using the compiler only to call the
linker this time). The object files are tied to their corresponding C source and
header files by the three dependency rules. But where are the commands that
compile the .c files into .o files?
We could add these commands to the dependency rules, but there’s really
no need, because make has a built-in rule that knows how to build .o files from
.c files. There’s nothing magic about make—it only knows about the relationships
you describe to it through the rules you write. But make does have certain
built-in rules that describe the relationships between, for example, .c files
and .o files. This particular built-in rule provides commands for building any-
thing with a .o extension from a file of the same base name with a .c extension.
These built-in rules are called suffix rules, or more generally, implicit rules,
because the name of the dependency (source file) is implied by the name of
the target (object file).
You can write implicit rules yourself, if you wish. You can even override
the default implicit rules with your own versions. Implicit rules are a power-
ful tool, and they shouldn’t be overlooked, but for the purposes of this book,
we won’t go into any more detail. You can learn more about writing and using
implicit rules within makefiles in “Using Implicit Rules” in the GNU Make
Manual.
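As a quick sketch of what writing your own suffix rule looks like, the
following demo uses an invented .in-to-.txt pairing (rather than .c-to-.o)
so that it needs nothing but make itself, no compiler:

```shell
#!/bin/sh
# Sketch: a user-written suffix (implicit) rule. The .in -> .txt pairing
# is invented so the demo needs only `make` itself, not a compiler.
set -e
workdir=$(mktemp -d)
cd "$workdir"

# .SUFFIXES registers the extensions; the .in.txt rule then tells make
# how to build any X.txt from a matching X.in. $< (the implied source)
# is valid here because this is an implicit rule.
printf '.SUFFIXES: .in .txt\n.in.txt:\n\tcp $< $@\n' > Makefile

echo "built by a suffix rule" > note.in
make -s note.txt       # make infers note.in as the source
result=$(cat note.txt)
echo "$result"
```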
To illustrate this implicit functionality, I created simple C source and
header files to accompany the sample makefile from Listing 2-10. Here’s
what happened when I executed make on this makefile:
$ make
cc -c -o main.o main.c
$
$ make program
cc -c -o print.o print.c
cc -c -o display.o display.c
gcc -g -O0 -o program main.o print.o display.o
$
As you can see, cc was magically executed with -c and -o options to generate
main.o from main.c. This is common command-line syntax used to make a
C-language compiler build objects from sources—it’s so common, in fact, that the
functionality is built into make. If you look for cc on a modern GNU/Linux
system, you’ll find that it’s a soft link in /usr/bin that refers to the system’s GNU
C compiler. On other systems, it refers to the system’s native C compiler.
Calling the system C compiler cc has been a de facto standard for decades.[14]
But why did the make utility build only main.o when we first typed make by
itself? Simply because the dependency rule for main.o provided the first
(and thus, the default) target for the makefile. In this case, to build
program, we needed to execute make program, as we did in the second
invocation. Remember that when you enter make
on the command line, the make utility attempts to build the first explicitly
defined target within the file called Makefile in the current directory. If we
wanted to make program the default target, we could rearrange the rules so
the program rule would be the first one listed in the makefile.
To see the dependency rules in action, touch one of the header files and
then rebuild the program target:
$ touch display.h
$ make program
cc -c -o main.o main.c
cc -c -o display.o display.c
gcc -g -O0 -o program main.o print.o display.o
$
After updating display.h, only display.o, main.o, and program were rebuilt.
The print.o object didn’t need to be rebuilt because print.c doesn’t depend on
display.h, according to the rules specified in the makefile.
Phony Targets
Targets are not always files. They can also be so-called phony targets, as in the
case of all and clean. These targets don’t refer to true products in the filesystem,
but rather to particular outcomes or actions—when you make these targets,
the project is cleaned, all products are built, and so on.
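The .PHONY special target, which you saw in Listings 2-1 and 2-2, is how
you tell make that a target is phony. This matters in practice: if a file
named clean ever appeared in the build directory, an ordinary clean target
would look permanently up to date and its commands would never run. A
quick sketch (assuming make on your PATH):

```shell
#!/bin/sh
# Sketch: why .PHONY matters. A stray file named "clean" would satisfy
# an ordinary clean target (it exists and has no dependencies, so it is
# always up to date); declaring it phony forces the commands to run.
# Assumes `make` is on PATH.
set -e
workdir=$(mktemp -d)
cd "$workdir"

printf 'clean:\n\t@echo ran clean\n.PHONY: clean\n' > Makefile
touch clean            # a stray file with the same name as the target

phony_out=$(make -s clean)   # the commands still run
echo "$phony_out"
```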
Multiple Targets
In the same way that you can list multiple dependencies on the right side of a
colon, you can combine rules for multiple targets with the same dependencies
and commands by listing the targets on the left side of a colon, as shown in
Listing 2-11.
all clean:
        cd src && $(MAKE) $@
Listing 2-11: Using multiple targets in a rule
14. POSIX has standardized the program (or link) names c89 and c99 to refer to 1989 and 1999
C-language standard compatible compilers. Since these commands can refer to the same compiler
with different command-line options, they’re often implemented as binary programs or shell
scripts, rather than merely as soft links.
While it may not be immediately apparent, this example contains two
separate rules: one for each of the two targets, all and clean. Because these
two rules have the same set of dependencies (none, in this case), and the
same set of commands, we’re able to take advantage of a shorthand notation
supported by make that allows us to combine their rules into one specification.
To help you understand this concept, consider the $@ variable in Listing 2-11.
Which target does it refer to? That depends on which rule is currently
executing—the one for all or the one for clean. Since a rule can only be exe-
cuted on a single target at any given time, $@ can only ever refer to one target,
even when the controlling rule specification contains several.
Resources for Makefile Authors
GNU make is significantly more powerful than the original AT&T UNIX make
utility, although GNU make is completely backward compatible, as long as you
avoid GNU extensions. The GNU Make Manual[15] is available online, and O’Reilly
has published an excellent book on the original AT&T UNIX make utility[16] and
all of its many nuances. While you can still find this title, the publisher has
recently merged its content into a new edition that also covers GNU make
extensions.[17]
This concludes the general discussion of makefile syntax and the make
utility, although we will look at additional makefile constructs as we encoun-
ter them throughout the rest of this chapter. With this general information
behind us, let’s return to the Jupiter project and begin adding some more
interesting functionality.
Creating a Source Distribution Archive
In order to actually get source code for Jupiter to our users, we’re going to
have to create and distribute a source archive—a tarball. We could write a separate
script to create the tarball, but since we can use phony targets to create arbi-
trary sets of functionality in makefiles, let’s design a make target to perform this
task instead. Building a source archive for distribution is usually relegated to
the dist target.
When designing a new make target, we need to consider whether its func-
tionality should be distributed among the makefiles of the project or handled
in a single location. Normally, the rule of thumb is to take advantage of a
recursive build system’s nature by allowing each directory to manage its own
portions of a process. We did just this when we passed control of building the
jupiter program down to the src directory, where the source code is located.
15. See the Free Software Foundation’s GNU Make Manual at http://www.gnu.org/software/make/
manual/.
16. Andy Oram and Steve Talbott, Managing Projects with make, Second Edition: The Power of GNU
make for Building Anything (Sebastopol, CA: O’Reilly Media, 1991), http://oreilly.com/catalog/
9780937175903/.
17. Robert Mecklenburg, Managing Projects with GNU Make, Third Edition: The Power of GNU make
for Building Anything (Sebastopol, CA: O’Reilly Media, 2004), http://www.oreilly.com/catalog/
9780596006105/.
However, building a compressed archive from a directory structure isn’t really
a recursive process.18 This being the case, we’ll have to perform the entire
task in one of the two makefiles.
Global processes are often handled by the makefile at the highest rele-
vant level in the project directory structure. We’ll add the dist target to our
top-level makefile, as shown in Listing 2-12.
package = jupiter
version = 1.0
tarname = $(package)
distdir = $(tarname)-$(version)
all clean jupiter:
cd src && $(MAKE) $@
dist: $(distdir).tar.gz
$(distdir).tar.gz: $(distdir)
tar chof - $(distdir) | gzip -9 -c > $@
rm -rf $(distdir)
$(distdir):
mkdir -p $(distdir)/src
cp Makefile $(distdir)
cp src/Makefile $(distdir)/src
cp src/main.c $(distdir)/src
.PHONY: all clean dist
Listing 2-12: Makefile: Adding the dist target to the top-level makefile
Besides the addition of the dist target, I’ve also made several other
modifications. Let’s look at them one at a time. I’ve added the dist target to
the .PHONY rule. .PHONY is a special kind of built-in rule called a dot-rule or
a directive. The make utility understands several different dot-rules. The purpose
of .PHONY is simply to tell make that certain targets don’t generate filesystem
objects. Normally, make determines which commands to run by comparing
the timestamps of the targets to those of their dependencies in the filesystem—
but phony targets don’t have associated filesystem objects. Using .PHONY ensures
that make won’t go looking for nonexistent product files named after these
targets.
Adding a target to the .PHONY rule has another effect. Since make won’t be
able to use timestamps to determine whether the target is up to date (that is,
newer than its dependencies), make has no recourse but to always execute the
commands associated with phony targets whenever these targets either are
requested on the command line or appear in a dependency chain.
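To see the failure mode .PHONY prevents, imagine that a file named clean accidentally appears in the top-level directory. Without the .PHONY declaration, make compares timestamps, decides the target is up to date, and skips its commands entirely:

```
$ touch clean
$ make clean        # reports that 'clean' is up to date; nothing runs
$ rm clean          # with .PHONY in place, the file is ignored anyway
```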
18. Well, okay, it is a recursive process, but the recursive portions of the process are tucked away
inside the tar utility.
I’ve separated the functionality of the dist target into three separate rules
for the sake of readability, modularity, and maintenance.
This is a great rule of thumb to follow in any software engineering process:
Build large processes from smaller ones, and reuse the smaller processes where it makes
sense.
The dist target depends on the existence of the ultimate goal—in
this case a source-level compressed archive package, jupiter-1.0.tar.gz. I’ve used
one variable to hold the version number (which makes it easier to update the
project version later) and another variable for the package name, which
will make it easier to change the name if I ever decide to reuse this makefile
for another project. I’ve also logically split the functions of package name
and tarball name; the default tarball name is the package name, but we do
have the option of making them different.
The rule that builds the tarball indicates how this should be done
with a command that uses the gzip and tar utilities to create the file. But,
notice that the rule has a dependency—the directory to be archived. The
directory name is derived from the tarball name and the package version
number; it’s stored in yet another variable called distdir.
We don’t want object files and executables from our last build attempt
to end up in the archive, so we need to build an image directory containing
exactly what we want to ship—including any files required in the build and
install processes and any additional documentation or license files. Unfortu-
nately, this pretty much mandates the use of individual copy (cp) commands.
Since there’s a rule in the makefile that tells how this directory
should be created, and since that rule’s target is a dependency of the tarball,
make runs the commands for that rule before running the commands for the
tarball rule. Recall that make processes rules to build dependencies recursively,
from the bottom up, until it can run the commands for the requested target.19
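To make the division of labor concrete, here is the same archive-then-delete sequence run by hand in a scratch directory; the demo-1.0 name and file contents are illustrative stand-ins, not part of Jupiter:

```shell
# Build an image directory, archive it, then remove it -- the same
# shape as the $(distdir) and $(distdir).tar.gz rules.
set -e
cd "$(mktemp -d)"                 # scratch directory, not the project tree
distdir=demo-1.0
mkdir -p "$distdir/src"
echo 'int main(void) { return 0; }' > "$distdir/src/main.c"
tar chof - "$distdir" | gzip -9 -c > "$distdir.tar.gz"
rm -rf "$distdir"                 # only the tarball remains
```

Unpacking the result with gzip -cd demo-1.0.tar.gz | tar xf - recreates the image directory, which is exactly what our users will do.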
Forcing a Rule to Run
There’s a subtle flaw in the $(distdir) target that may not be obvious right
now, but it will rear its ugly head at the worst of times. If the archive image
directory (jupiter-1.0) already exists when you execute make dist, then make
won’t try to create it. Try this:
$ mkdir jupiter-1.0
$ make dist
tar chof - jupiter-1.0 | gzip -9 -c > jupiter-1.0.tar.gz
rm -rf jupiter-1.0
$
Notice that the dist target didn’t copy any files—it just built an archive
out of the existing jupiter-1.0 directory, which was empty. Our users would get
a real surprise when they unpack this tarball! Worse still, if the image directory
from the previous attempt to archive happened to still be there, the new tar-
ball would contain the now outdated sources from our last attempt to create
a distribution tarball.
19. This process is formally called post-order recursion.
The problem is that the $(distdir) target is a real target with no depen-
dencies, which means that make will consider it up to date as long as it exists in
the filesystem. We could add the $(distdir) target to the .PHONY rule to force make
to rebuild it every time we make the dist target, but it’s not a phony target—it’s
a real filesystem object. The proper way to ensure that $(distdir) is always
rebuilt is to ensure that it doesn’t exist before make attempts to build it. One
way to accomplish this is to create a true phony target that will always execute,
and then add that target to the dependency chain for the $(distdir) target. A
common name for this kind of target is FORCE, and I’ve implemented this con-
cept in Listing 2-13.
...
$(distdir).tar.gz: $(distdir)
tar chof - $(distdir) | gzip -9 -c > $@
rm -rf $(distdir)
$(distdir): FORCE
mkdir -p $(distdir)/src
cp Makefile $(distdir)
cp src/Makefile $(distdir)/src
cp src/main.c $(distdir)/src
FORCE:
-rm $(distdir).tar.gz >/dev/null 2>&1
-rm -rf $(distdir) >/dev/null 2>&1
.PHONY: FORCE all clean dist
Listing 2-13: Makefile: Using the FORCE target
The FORCE rule’s commands are executed every time because FORCE
is a phony target. Since we made FORCE a dependency of the $(distdir) target,
we have the opportunity to delete any previously created files and
directories before make begins to evaluate whether it should execute the com-
mands for $(distdir).
Leading Control Characters
A leading dash character (-) on a command tells make not to care about the
status code of the command it precedes. Normally, when make encounters a
command that returns a nonzero status code to the shell, it will stop execu-
tion and display an error message—but if you use a leading dash, it will just
ignore the error and continue. I use a leading dash on the rm commands in
the FORCE rule because I want to delete previously created product files that
may or may not exist, and rm will return an error if I attempt to delete a non-
existent file.20
20. Another option would have been to use a -f command-line option with the rm command,
which would narrow the failure conditions to those not related to removing nonexistent files.
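The difference is easy to see in the shell; this standalone snippet (run in a throwaway scratch directory) shows the status codes involved:

```shell
# rm returns a nonzero status for a nonexistent operand, which would
# normally stop make; rm -f (footnote 20's alternative) succeeds quietly.
cd "$(mktemp -d)"
rm missing.txt 2>/dev/null || echo "plain rm failed"
rm -f missing.txt && echo "rm -f succeeded"
```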
Note that I did not use a leading dash on the rm command in the tarball
rule. This is because I want to know if something goes wrong with this
command—if it doesn’t succeed, something is very wrong, since the preced-
ing command should have created a tarball from this directory.
Another leading character that you may encounter is the at sign (@). A
command prefixed with an at sign tells make not to perform its normal behavior
of printing the command to the stdout device as it executes it. It is common
to use a leading at sign on echo statements. You don’t want make to print echo
statements, because then your message will be printed twice: once by make,
and then again by the echo statement itself.
It’s best to use the at sign judiciously. I usually reserve it for commands I
never want to see, such as echo statements. If you like quiet build systems, con-
sider using the global .SILENT directive in your makefiles. Or better still, simply
allow the user the option of adding the -s option to her make command lines.
This enables her to choose how much noise she wants to see.
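For example, here are the two behaviors side by side in a small fragment:

```
noisy:
	echo "Building jupiter..."

quiet:
	@echo "Building jupiter..."
```

Running make noisy prints the echo command and then its output, so the message appears twice; make quiet prints the message only once.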
Automatically Testing a Distribution
The rule for building the archive directory is probably the most frustrating
rule in this makefile, because it contains commands to copy individual files
into the distribution directory. Every time we change the file structure in our
project, we have to update this rule in our top-level makefile, or we’ll break
the dist target. But there’s nothing more we can do—we’ve made the rule
as simple as possible. Now we just have to remember to manage this process
properly.
Unfortunately though, breaking the dist target is not the worst thing
that could happen if you forget to update the distdir rule’s commands. It
may appear that the dist target is working, but it may not actually be copying
all of the required files into the tarball. In fact, it is far more likely that this,
rather than an error, will occur, because adding files to a project is a more
common activity than moving them around or deleting them. New files will
not be copied, but the dist rule won’t notice the difference.
There is a way to perform a sort of self-check on the dist target. We can
create another phony target called distcheck that does exactly what our users
will do: unpack the tarball and build the project. We can have this rule’s
commands perform this task in a temporary directory. If the build process
fails, then the distcheck target will break, telling us that we forgot something
crucial in our distribution.
Listing 2-14 shows the modifications to our top-level makefile that are
required to implement the distcheck target.
...
distcheck: $(distdir).tar.gz
gzip -cd $(distdir).tar.gz | tar xvf -
cd $(distdir) && $(MAKE) all
cd $(distdir) && $(MAKE) clean
rm -rf $(distdir)
@echo "*** Package $(distdir).tar.gz is ready for distribution."
...
.PHONY: FORCE all clean dist distcheck
Listing 2-14: Makefile: Adding a distcheck target to the top-level makefile
The distcheck target depends on the tarball itself, so the rule that builds
the tarball is executed first. make then executes the distcheck commands, which
unpack the tarball just built and then recursively run make on the all and clean
targets within the resulting directory. If that process succeeds, it prints out a
message indicating that your users will likely not have a problem with this tarball.
Now all you have to do is remember to execute make distcheck before you
post your tarballs for public distribution!
Unit Testing, Anyone?
Some people insist that unit testing is evil, but the only honest rationale they
can come up with for not doing it is laziness. Proper unit testing is hard work,
but it pays off in the end. Those who do it have learned a lesson (usually in
childhood) about the value of delayed gratification.
A good build system should incorporate proper unit testing. The most
commonly used target for testing a build is the check target, so we’ll go ahead
and add it in the usual manner. The actual unit test should probably go in
src/Makefile because that’s where the jupiter executable is built, so we’ll pass
the check target down from the top-level makefile.
But what commands do we put in the check rule? Well, jupiter is a pretty
simple program—it prints a message, Hello from some/path/jupiter! where
some/path depends on the location from which jupiter was executed. I’ll use
the grep utility to test that jupiter actually outputs such a string.
Listings 2-15 and 2-16 illustrate the modifications to our top-level and src
directory makefiles, respectively.
...
all clean check jupiter:
cd src && $(MAKE) $@
...
.PHONY: FORCE all clean check dist distcheck
Listing 2-15: Makefile: Passing the check target to src/Makefile
...
check: all
./jupiter | grep "Hello from .*jupiter!"
@echo "*** ALL TESTS PASSED ***"
...
.PHONY: all clean check
Listing 2-16: src/Makefile: Implementing the unit test in the check target
Note that check depends on all. We can’t really test our products unless
they are up to date, reflecting any recent source code or build system changes
that may have been made. It makes sense that if the user wants to test the
products, he also wants the products to exist and be up to date. We can ensure
they exist and are up to date by adding all to check’s dependency list.
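This works because grep exits with status zero on a match and nonzero otherwise, so the pipeline fails—and make stops—exactly when jupiter’s output is wrong. Simulating both cases with echo (the path shown is just an example of where jupiter might run from):

```shell
# A matching line: grep succeeds and echoes the line, so make continues.
echo 'Hello from /usr/local/bin/jupiter!' | grep "Hello from .*jupiter!"
# A non-matching line: grep exits nonzero, which would stop make here.
echo 'something else' | grep -q "Hello from .*jupiter!" || echo "no match"
```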
There’s one more enhancement we could make to our build system:
We can add check to the list of targets executed by make in our distcheck rule,
between the commands to make all and clean. Listing 2-17 shows where this
is done in the top-level makefile.
...
distcheck: $(distdir).tar.gz
gzip -cd $(distdir).tar.gz | tar xvf -
cd $(distdir) && $(MAKE) all
cd $(distdir) && $(MAKE) check
cd $(distdir) && $(MAKE) clean
rm -rf $(distdir)
@echo "*** Package $(distdir).tar.gz is ready for distribution."
...
Listing 2-17: Makefile: Adding the check target to the $(MAKE) command
Now when we run make distcheck, it will test the entire build system shipped
with the package.
Installing Products
We’ve reached the point where our users’ experiences with Jupiter should
be fairly painless—even pleasant—as far as building the project is concerned.
Users will simply unpack the distribution tarball, change into the distribution
directory, and type make. It really can’t get any simpler than that.
But we still lack one important feature—installation. In the case of the
Jupiter project, this is fairly trivial. There’s only one program, and most users
would guess correctly that to install it, they should copy jupiter into either
their /usr/bin or /usr/local/bin directory. More complex projects, however,
could cause users some real consternation when it comes to where to put
user and system binaries, libraries, header files, and documentation including
man pages, info pages, PDF files, and the more-or-less obligatory README,
AUTHORS, NEWS, INSTALL, and COPYING files generally associated with
GNU projects.
We don’t really want our users to have to figure all that out, so we’ll create
an install target to manage putting things where they go once they’re built
properly. In fact, why not just make installation part of the all target? Well,
let’s not get carried away. There are actually a few good reasons for not doing
this.
First, build and installation are separate logical concepts. The second
reason is a matter of filesystem rights. Users have rights to build projects in
their own home directories, but installation often requires root-level rights to
copy files into system directories. Finally, there are several reasons why a user
may wish to build but not install a project, so it would be unwise to tie these
actions together.
While creating a distribution package may not be an inherently recursive
process, installation certainly is, so we’ll allow each subdirectory in our project
to manage installation of its own components. To do this, we need to modify
both the top-level and the src-level makefiles. Changing the top-level makefile
is easy: Since there are no products to be installed in the top-level directory,
we’ll just pass the responsibility on to src/Makefile in the usual way.
The modifications for adding an install target are shown in Listings 2-18
and 2-19.
...
all clean check install jupiter:
cd src && $(MAKE) $@
...
.PHONY: FORCE all clean check dist distcheck install
Listing 2-18: Makefile: Passing the install target to src/Makefile
...
install:
cp jupiter /usr/bin
chown root:root /usr/bin/jupiter
chmod +x /usr/bin/jupiter
.PHONY: all clean check install
Listing 2-19: src/Makefile: Implementing the install target
In the top-level makefile shown in Listing 2-18, I’ve added install to the
list of targets passed down to src/Makefile. The installation of files is actually
handled by the src-level makefile shown in Listing 2-19.
Installation is a bit more complex than simply copying files. If a file is placed
in the /usr/bin directory, then root should own it, so that only root can delete or
modify it. Additionally, the jupiter binary should be flagged executable, so I’ve
used the chmod command to set the mode of the file as such. This is probably
redundant, as the linker ensures that jupiter is created as an executable file, but
some types of executable products are not generated by a linker—shell scripts,
for example.
Now our users can just type the following sequence of commands and
the Jupiter project will be built, tested, and installed with the correct system
attributes and ownership on their platforms:
$ gzip -cd jupiter-1.0.tar.gz | tar xf -
$ cd jupiter-1.0
$ make all check
...
$ sudo make install
Password: ******
...
Installation Choices
All of this is well and good, but it could be a bit more flexible with regard to
where things are installed. Some users may be okay with having jupiter installed
into the /usr/bin directory. Others are going to ask why it isn’t installed into
the /usr/local/bin directory—after all, this is a common convention. We could
change the target directory to /usr/local/bin, but then users may ask why they
don’t have the option of installing into their home directories. This is the
perfect situation for a little command-line supported flexibility.
Another problem with our current build system is that we have to do a
lot of stuff just to install files. Most Unix systems provide a system-level pro-
gram—usually a shell script—called install that allows a user to specify various
attributes of the files being installed. The proper use of this utility could simplify
things a bit for Jupiter’s installation, so while we’re adding location flexibil-
ity, we might as well use the install utility, too. These modifications are
shown in Listings 2-20 and 2-21.
...
prefix=/usr/local
export prefix
all clean check install jupiter:
cd src && $(MAKE) $@
...
Listing 2-20: Makefile: Adding a prefix variable
...
install:
install -d $(prefix)/bin
install -m 0755 jupiter $(prefix)/bin
...
Listing 2-21: src/Makefile: Using the prefix variable in the install target
Notice that I only declared and assigned the prefix variable in the top-
level makefile, but I referenced it in src/Makefile. I can do this because I used
the export modifier in the top-level makefile—this modifier exports the
make variable to the shell that make spawns when it executes itself in the src
directory. This feature of make allows us to define all of our user variables in
one obvious location—at the beginning of the top-level makefile.
NOTE GNU make allows you to use the export keyword on the assignment line, but this syntax
is not portable between GNU make and other versions of make.
I’ve now declared the prefix variable to be /usr/local, which is very nice
for those who want to install jupiter in /usr/local/bin, but not so nice for those
who want it in /usr/bin. Fortunately, make allows you to define make variables
on the command line, in this manner:
$ sudo make prefix=/usr install
...
Remember that variables defined on the command line override those
defined within the makefile.21 Thus, users who want to install jupiter into
the /usr/bin directory now have the option of specifying this on the make
command line.
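A minimal fragment demonstrates this precedence:

```
prefix = /usr/local

show-prefix:
	@echo "prefix is $(prefix)"
```

Here make show-prefix prints prefix is /usr/local, while make prefix=/usr show-prefix prints prefix is /usr; the command-line definition wins.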
With this system in place, our users may install jupiter into a bin directory
beneath any directory they choose, including a location in their home directory
(for which they do not need additional rights). This is, in fact, the reason we
added the install -d $(prefix)/bin command in Listing 2-21—this command
creates the installation directory if it doesn’t already exist. Since we
allow the user to define prefix on the make command line, we don’t actually
know where the user is going to install jupiter; therefore, we have to be pre-
pared for the possibility that the location may not yet exist. Give this a try:
$ make all
$ make prefix=$PWD/_inst install
$
$ ls -1p
_inst/
Makefile
src/
$
$ ls -1p _inst
bin/
$
$ ls -1p _inst/bin
jupiter
$
Uninstalling a Package
What if a user doesn’t like our package after he’s installed it, and he just wants
to get it off his system? This is a fairly likely scenario for the Jupiter project, as
it’s rather useless and takes up valuable space in his bin directory. In the case
of your projects, however, it’s more likely that a user would want to do a clean
install of a newer version of the project or replace the test build he down-
loaded from the project website with a professionally packaged version that
21. Unfortunately, some make implementations do not propagate such command-line variables
to recursive $(MAKE) processes. To alleviate this potential problem, variables that might be set
on the command line can be passed as var="$(var)" on sub-make command lines. My simple
examples ignore this issue because it’s a corner case, but you should at least be aware of this
problem.
comes with his Linux distribution. Support for an uninstall target would be
very helpful in situations like these.
Listings 2-22 and 2-23 show the addition of an uninstall target to our two
makefiles.
...
all clean install uninstall jupiter:
cd src && $(MAKE) $@
...
.PHONY: FORCE all clean dist distcheck install uninstall
Listing 2-22: Makefile: Adding the uninstall target to the top-level makefile
...
uninstall:
-rm $(prefix)/bin/jupiter
.PHONY: all clean check install uninstall
Listing 2-23: src/Makefile: Adding the uninstall target to the src-level makefile
As with the install target, this target requires root-level rights if the user
is using a system prefix, such as /usr or /usr/local. You should be very careful
about how you write your uninstall targets; unless a directory belongs specifi-
cally to your package, you shouldn’t assume you created it. If you do, you
may end up deleting a system directory like /usr/bin!
The list of things to maintain in our build system is getting out of hand.
There are now two places we need to update when we change our installation
processes: the install and uninstall targets. Unfortunately, this is really about
the best we can hope for when writing our own makefiles, unless we resort to
fairly complex shell script commands. But hang in there—in Chapter 5 I’ll
show you how to rewrite this makefile in a much simpler way using GNU
Automake.
Testing Install and Uninstall
Now let’s add some code to our distcheck target to test the functionality of
the install and uninstall targets. After all, it’s fairly important that both of
these targets work correctly from our distribution tarballs, so we should test
them in distcheck before declaring the tarball release-worthy. Listing 2-24
illustrates the necessary changes to the top-level makefile.
...
distcheck: $(distdir).tar.gz
gzip -cd $(distdir).tar.gz | tar xvf -
cd $(distdir) && $(MAKE) all
cd $(distdir) && $(MAKE) check
cd $(distdir) && $(MAKE) prefix=$${PWD}/_inst install
cd $(distdir) && $(MAKE) prefix=$${PWD}/_inst uninstall
cd $(distdir) && $(MAKE) clean
rm -rf $(distdir)
@echo "*** Package $(distdir).tar.gz is ready for distribution."
...
Listing 2-24: Makefile: Adding distcheck tests for the install and uninstall targets
Note that I used a double dollar sign on the $${PWD} variable references,
ensuring that make passes the variable reference to the shell with the rest of
the command line, rather than expanding it inline before executing the
command. I wanted this variable to be dereferenced by the shell, rather than
the make utility.22
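The distinction is easy to demonstrate in a fragment: the first reference below is expanded by make itself, while the doubled form reaches the shell intact:

```
demo:
	@echo $(PWD)      # make expands its own PWD variable
	@echo $${PWD}     # make passes ${PWD} through; the shell expands it
```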
What we’re doing here is testing to ensure the install and uninstall targets
don’t generate errors—but this isn’t very likely because all they do is install
files into a temporary directory within the build directory. We could add some
code immediately after the make install command that looks for the products
that are supposed to be installed, but that’s more than I’m willing to do. One
reaches a point of diminishing returns, where the code that does the checking
is just as complex as the installation code—in which case the check becomes
pointless.
But there is something else we can do: We can write a more or less generic
test that checks to see if everything we installed was properly removed. Since
the stage directory was empty before our installation, it had better be in a
similar state after we uninstall. Listing 2-25 shows the addition of this test.
...
distcheck: $(distdir).tar.gz
gzip -cd $(distdir).tar.gz | tar xvf -
cd $(distdir) && $(MAKE) all
cd $(distdir) && $(MAKE) check
cd $(distdir) && $(MAKE) prefix=$${PWD}/_inst install
cd $(distdir) && $(MAKE) prefix=$${PWD}/_inst uninstall
@remaining="`find $${PWD}/$(distdir)/_inst -type f | wc -l`"; \
if test "$${remaining}" -ne 0; then \
echo "*** $${remaining} file(s) remaining in stage directory!"; \
exit 1; \
fi
cd $(distdir) && $(MAKE) clean
rm -rf $(distdir)
@echo "*** Package $(distdir).tar.gz is ready for distribution."
...
Listing 2-25: Makefile: Adding a test for leftover files after uninstall finishes
The test first generates a numeric value in a shell variable called
remaining, which represents the number of regular files found in the stage
directory we used. If this number is not zero, it prints a message to the console
22. Technically, I didn’t have to do this because the PWD make variable was initialized from the
environment, but it serves as a good example of this process. Additionally, there are corner cases
where the PWD make variable is not quite as accurate as the PWD shell variable. It may be left pointing
to the parent directory on a subdirectory make invocation.
indicating how many files were left behind by the uninstall commands
and then it exits with an error. Exiting early leaves the stage directory intact
so we can examine it to find out which files we forgot to uninstall.
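The counting idiom itself is easy to experiment with outside the makefile; here it runs against a throwaway stage directory (the names are illustrative):

```shell
# Count regular files under a scratch stage directory, as the
# distcheck test does with find and wc.
stage="$(mktemp -d)"
mkdir -p "$stage/bin"
remaining=`find "$stage" -type f | wc -l`
echo "before install: $remaining regular file(s)"
touch "$stage/bin/jupiter"          # simulate a file left behind by uninstall
remaining=`find "$stage" -type f | wc -l`
echo "after: $remaining regular file(s)"
```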
NOTE This test code represents a good use of multiple shell commands passed to a single shell.
I had to do this here so that the value of remaining would be available for use by the if
statement. Conditionals don’t work very well when the closing fi is not executed by the
same shell as the opening if!
I don’t want to alarm people by printing the embedded echo statement
unless it really should be executed, so I prefixed the entire test with an at
sign (@) so that make wouldn’t print the code to stdout. Since make considers
these five lines of code to be a single command, the only way to suppress
printing the echo statement is to suppress printing the entire command.
Now, this test isn’t perfect—not by a long shot. This code only checks for
regular files. If your installation procedure creates any soft links, this test
won’t notice if they’re left behind. The directory structure that’s built during
installation is purposely left in place because the check code doesn’t know
whether a subdirectory within the stage directory belongs to the system or to
the project. The uninstall rule’s commands can be aware of which directories
are project specific and properly remove them, but I don’t want to add project-
specific knowledge into the distcheck tests—it’s that problem of diminishing
returns again.
The Filesystem Hierarchy Standard
You may be wondering by now where I’m getting these directory names. What
if some Unix system out there doesn’t use /usr or /usr/local? For one thing,
this is another reason for providing the prefix variable—to allow the user
some choice in these matters. However, most Unix-like systems nowadays follow
the Filesystem Hierarchy Standard as closely as possible. The FHS defines a number
of standard places including the following root-level directories:
/bin /etc /home
/opt /sbin /srv
/tmp /usr /var
This list is by no means exhaustive. I’ve only mentioned the directories
that are most relevant to our study of open source project build systems. In
addition, the FHS defines several standard locations beneath these root-level
directories. For instance, the /usr directory should contain the following
subdirectories:
/usr/bin /usr/include /usr/lib
/usr/local /usr/sbin /usr/share
/usr/src
The /usr/local directory should contain a structure very similar to that
of the /usr directory. The /usr/local directory provides a location for software
installation that overrides versions of the same packages installed in the /usr
directory structure, because system software updates often overwrite software
in /usr without prejudice. The /usr/local directory structure allows a system
administrator to decide which version of a package to use on her system, because
/usr/local/bin may be (and usually is) added to the PATH before /usr/bin. A fair
amount of thought has gone into designing the FHS, and the GNU Autotools
take full advantage of this consensus of understanding.
Not only does the FHS define these standard locations, but it also explains
in detail what they’re for and what types of files should be kept there. All in
all, the FHS leaves you, as project maintainer, just enough flexibility and choice
to keep your life interesting but not enough to make you wonder if you’re
installing your files in the right places.23
Supporting Standard Targets and Variables
In addition to those I’ve already mentioned, the GNU Coding Standards
lists some important targets and variables that you should support in your
projects—mainly because your users will expect support for them.
Some of the chapters in the GCS document should be taken with a grain
of salt (unless you’re actually working on a GNU-sponsored project). For
example, you probably won’t care much about the C source code formatting
suggestions in Chapter 5. Your users certainly won’t care, so you can use what-
ever source code formatting style you wish.
That’s not to say that all of Chapter 5 is worthless to non-GNU open
source projects. The “Portability between System Types” and “Portability
between CPUs” subsections, for instance, provide excellent information on
C source code portability. The “Internationalization” subsection gives some
useful tips on using GNU software to internationalize your projects.
While Chapter 6 discusses documentation the GNU way, some sections
of Chapter 6 describe various top-level text files commonly found in projects,
such as the AUTHORS, NEWS, INSTALL, README, and ChangeLog files. These
are all bits of information that the well-indoctrinated open source software
user expects to see in any reputable project.
The really useful information in the GCS document begins in Chapter 7:
“The Release Process.” This chapter is critical to you as a project maintainer
because it defines what your users will expect of your projects’ build systems.
Chapter 7 contains the de facto standards for the user options that packages
provide in source-level distributions.
23. Before I discovered the FHS, I relied on my personal experience to decide where files should
be installed in my projects. Mostly I was right, because I’m a careful guy, but after I read the FHS
documentation, I went back to some of my past projects with a bit of chagrin and changed things
around. I heartily recommend you become thoroughly familiar with the FHS if you seriously
intend to develop Unix software.
Standard Targets
The “How Configuration Should Work” subsection of Chapter 7 of the GCS
defines the configuration process, which I cover briefly in “Configuring Your
Package” on page 54. The “Makefile Conventions” subsection covers all of
the standard targets and many of the standard variables that users have come
to expect in open source software packages. Standard targets defined by the
GCS include the following:
all             install         install-html
install-dvi     install-pdf     install-ps
install-strip   uninstall       clean
distclean       mostlyclean     maintainer-clean
TAGS            info            dvi
html            pdf             ps
dist            check           installcheck
installdirs

You don’t need to support all of these targets, but you should consider supporting the ones that make sense for your project. For example, if you build and install HTML pages, you should probably consider supporting the html and install-html targets. Autotools projects support these and more. Some targets are useful to end users, while others are only useful to project maintainers.

Standard Variables

Variables you should support as you see fit include those listed in the following table. In order to provide flexibility for the end user, most of these variables are defined in terms of a few of them, and ultimately only one of them: prefix. For lack of a more standard name, I call these prefix variables. Most of these could be classified as installation directory variables that refer to standard locations, but there are a few exceptions, such as srcdir. Table 2-1 lists these prefix variables and their default values.
Table 2-1: Prefix Variables and Their Default Values

Variable         Default Value
prefix           /usr/local
exec_prefix      $(prefix)
bindir           $(exec_prefix)/bin
sbindir          $(exec_prefix)/sbin
libexecdir       $(exec_prefix)/libexec
datarootdir      $(prefix)/share
datadir          $(datarootdir)
sysconfdir       $(prefix)/etc
sharedstatedir   $(prefix)/com
Autotools-based projects support these and other useful variables auto-
matically, as needed; Automake provides full support for them, while Autoconf’s
support is more limited. If you write your own makefiles and build systems,
you should support as many of these as you use in your build and installation
processes.
Adding Location Variables to Jupiter
To support the variables that we’ve used so far in the Jupiter project, we need
to add the bindir variable, as well as any variables that it relies on—in this case,
the exec_prefix variable. Listings 2-26 and 2-27 show how to do this in the top-
level and src directory makefiles.
...
prefix = /usr/local
exec_prefix = $(prefix)
bindir = $(exec_prefix)/bin
export prefix
export exec_prefix
export bindir
...
Listing 2-26: Makefile: Adding the bindir variable
Table 2-1: Prefix Variables and Their Default Values (continued)

Variable         Default Value
localstatedir    $(prefix)/var
includedir       $(prefix)/include
oldincludedir    /usr/include
docdir           $(datarootdir)/doc/$(package)
infodir          $(datarootdir)/info
htmldir          $(docdir)
dvidir           $(docdir)
pdfdir           $(docdir)
psdir            $(docdir)
libdir           $(exec_prefix)/lib
lispdir          $(datarootdir)/emacs/site-lisp
localedir        $(datarootdir)/locale
mandir           $(datarootdir)/man
manNdir          $(mandir)/manN  (N = 1..9)
manext           .1
manNext          .N  (N = 1..9)
srcdir           The source-tree directory corresponding to the current directory in the build tree
...
install:
	install -d $(bindir)
	install -m 0755 jupiter $(bindir)

uninstall:
	-rm $(bindir)/jupiter
...

Listing 2-27: src/Makefile: Adding the bindir variable
Even though we only use bindir in src/Makefile, we have to export prefix,
exec_prefix, and bindir because bindir is defined in terms of exec_prefix, which
is itself defined in terms of prefix. When make runs the install commands, it
will first resolve bindir to $(exec_prefix)/bin, then to $(prefix)/bin, and finally
to /usr/local/bin. Thus, src/Makefile needs to have access to all three variables
during this process.
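The resolution chain is easy to observe with a throwaway makefile. The following sketch is purely illustrative and not part of Jupiter’s build system; the /tmp/resolve.mk file name is an arbitrary choice:

```shell
# Create a scratch makefile that mirrors the prefix/exec_prefix/bindir chain.
printf 'prefix = /usr/local\nexec_prefix = $(prefix)\nbindir = $(exec_prefix)/bin\nshow:\n\t@echo $(bindir)\n' > /tmp/resolve.mk

# With no overrides, bindir resolves through the chain to /usr/local/bin.
make -f /tmp/resolve.mk show

# Overriding only prefix on the command line re-roots the entire chain.
make -f /tmp/resolve.mk show prefix=/usr
```

Because make expands recursive variables at the point of use, overriding the single prefix variable silently updates every variable defined in terms of it.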
How do such recursive variable definitions make life better for the end
user? After all, the user can change the root install location from /usr/local to
/usr by simply typing the following:
$ make prefix=/usr install
...
The ability to change prefix variables at multiple levels is particularly useful
to a Linux distribution packager (an employee or volunteer at a Linux com-
pany whose job it is to professionally package your project as an RPM or APT
package), who needs to install packages into very specific system locations.
For example, a distro packager could use the following command to change
the installation prefix to /usr and the system configuration directory to /etc:
$ make prefix=/usr sysconfdir=/etc install
...
Without the ability to change prefix variables at multiple levels, configu-
ration files would end up in /usr/etc because the default value of $(sysconfdir)
is $(prefix)/etc.
Getting Your Project into a Linux Distro
Open source software maintainers often hope that their projects will be picked
up by a Linux distribution. When a Linux distro picks up your package for
distribution on its CDs and DVDs, your project magically moves from the
realm of tens of users to that of tens of thousands of users—almost overnight.
Some people will be using your software without even knowing it.
By following the GCS within your build system, you remove many of the
barriers to including your project in a Linux distro. If your tarball follows all
the usual conventions, distro packagers will immediately know what to do with it.
These packagers generally get to decide, based on needed functionality and
their feelings about your package, whether it should be included in their flavor
of Linux. Since they have a fair amount of power in this process, it behooves
you to please them.
Section 7 of the GCS contains a small subsection that talks about supporting
staged installations. It is easy to support this concept in your build system, but
if you neglect to support it, it will almost always cause problems for packagers.
Packaging systems such as the Red Hat Package Manager (RPM) accept
one or more tarballs, a set of patch files, and a specification file. The so-called
spec file describes the process of building and packaging your project for a
particular system. In addition, it defines all of the products installed into the
target installation directory structure. The package manager software uses
this information to install your package into a temporary directory, from
which it then pulls the specified binaries, storing them in a special binary
archive that the package installation software (e.g., RPM) understands.
To support staged installation, all you need is a variable named DESTDIR
that acts as a sort of super-prefix to all of your installed products. To show
you how this is done, I’ll add staged installation support to the Jupiter
project. This is so trivial that it only requires three changes to src/Makefile.
The required changes are shown in Listing 2-28.
...
install:
	install -d $(DESTDIR)$(bindir)
	install -m 0755 jupiter $(DESTDIR)$(bindir)

uninstall:
	-rm $(DESTDIR)$(bindir)/jupiter
...

Listing 2-28: src/Makefile: Adding staged build functionality
As you can see, I’ve added the $(DESTDIR) prefix to the $(bindir) references
in the install and uninstall targets that refer to installation paths. You don’t
need to define a default value for DESTDIR, because when it is left undefined,
it expands to an empty string, which has no effect on the paths to which it’s
prepended.
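A staged installation can then be exercised straight from the shell. The following is a minimal sketch using a scratch makefile of the same shape as Listing 2-28, rather than Jupiter’s real one; the /tmp paths and the prog file name are arbitrary:

```shell
# A bare-bones install rule that honors DESTDIR.
printf 'bindir = /usr/local/bin\ninstall:\n\tinstall -d $(DESTDIR)$(bindir)\n\tinstall -m 0755 /dev/null $(DESTDIR)$(bindir)/prog\n' > /tmp/stage.mk

# DESTDIR re-roots the whole installation under a scratch directory.
make -f /tmp/stage.mk DESTDIR=/tmp/_stage install

# The product lands in /tmp/_stage/usr/local/bin, not /usr/local/bin.
ls /tmp/_stage/usr/local/bin
```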
I didn’t need to add $(DESTDIR) to the uninstall rule’s rm command for the
sake of the package manager, because package managers don’t care how
your package is uninstalled. They only install your package so they can copy
the products from a stage directory. To uninstall the stage directory, package
managers simply delete it. Package managers such as RPM use their own
rules for removing products from a system, and these rules are based on a
package manager database, rather than your uninstall target.
However, for the sake of symmetry, and to be complete, it doesn’t hurt to
add $(DESTDIR) to uninstall. Besides, we need it to be complete for the sake of
the distcheck target, which we’ll now modify to take advantage of our staged
installation functionality. This modification is shown in Listing 2-29.
...
distcheck: $(distdir).tar.gz
	gzip -cd $(distdir).tar.gz | tar xvf -
	cd $(distdir) && $(MAKE) all
	cd $(distdir) && $(MAKE) check
	cd $(distdir) && $(MAKE) DESTDIR=$${PWD}/_inst install
	cd $(distdir) && $(MAKE) DESTDIR=$${PWD}/_inst uninstall
	@remaining="`find $${PWD}/$(distdir)/_inst -type f | wc -l`"; \
	if test "$${remaining}" -ne 0; then \
	  echo "*** $${remaining} file(s) remaining in stage directory!"; \
	  exit 1; \
	fi
	cd $(distdir) && $(MAKE) clean
	rm -rf $(distdir)
	@echo "*** Package $(distdir).tar.gz is ready for distribution."
...

Listing 2-29: Makefile: Using DESTDIR in the distcheck target
Listing 2-29: Makefile: Using DESTDIR in the distcheck target
Changing prefix to DESTDIR in the install and uninstall commands allows
us to properly test a complete installation directory hierarchy, as we’ll see shortly.
At this point, an RPM spec file could provide the following text as the
installation commands for the Jupiter package:
%install
make prefix=/usr DESTDIR=%BUILDROOT install
Don’t worry about package manager file formats. Instead, just focus on
providing staged installation functionality through the DESTDIR variable.
You may be wondering why the prefix variable couldn’t provide this
functionality. For one thing, not every path in a system-level installation is
defined relative to the prefix variable. The system configuration directory
(sysconfdir), for instance, is often defined as /etc by packagers. You can see
in Table 2-1 that the default definition of sysconfdir is $(prefix)/etc, so the
only way sysconfdir would resolve to /etc would be if you explicitly set it to do
so on the configure or make command line. If you configured it that way, only
a variable like DESTDIR would affect the base location of sysconfdir during
staged installation. Other reasons for this will become clearer as we talk
about project configuration later on in this chapter, and then again in the
next two chapters.
Build vs. Installation Prefix Overrides
At this point, I’d like to digress slightly to explain an elusive (or at least non-
obvious) concept regarding prefix and other path variables defined in the
GCS. In the preceding examples, I used prefix overrides on the make install
command line, like this:
$ make prefix=/usr install
...
The question I wish to address is: What is the difference between using a
prefix override for make all and for make install? In our small sample makefiles,
we’ve managed to avoid using prefixes in any targets not related to installation,
so it may not be clear to you at this point that a prefix is ever useful during the
build stage. However, prefix variables can be very useful during the build stage
to substitute paths into source code at compile time, as shown in Listing 2-30.
program: main.c
	gcc -DCFGDIR="\"$(sysconfdir)\"" -o $@ main.c
Listing 2-30: Substituting paths into source code at compile time
In this example, I’m defining a C-preprocessor variable called CFGDIR on
the compiler command line for use by main.c. Presumably, there’s some code
in main.c like that shown in Listing 2-31.
#ifndef CFGDIR
# define CFGDIR "/etc"
#endif
const char cfgdir[FILENAME_MAX] = CFGDIR;
Listing 2-31: Substituting CFGDIR at compile time
Later in the code, you might use the C global variable cfgdir to access the
application’s configuration file.
Linux distro packagers often use different prefix overrides for build and
install command lines in RPM spec files. During the build stage, the actual
runtime directories are hardcoded into the executable using commands like
the one shown in Listing 2-32.
%build
%setup
./configure prefix=/usr sysconfdir=/etc
make
Listing 2-32: The portion of an RPM spec file that builds the source tree
Note that we have to explicitly specify sysconfdir along with prefix, because,
as I mentioned above, the system configuration directory is usually outside of
the system prefix directory structure. The package manager installs these
executables into a stage directory so it can then copy them out of their installed
locations when it builds the binary installation package. The corresponding
installation commands might look like those shown in Listing 2-33.
%install
make DESTDIR=%BUILDROOT% install
Listing 2-33: The installation portion of an RPM spec file
Using DESTDIR during installation will temporarily override all installation
prefix variables, so you don’t have to remember which variables you’ve over-
ridden during configuration. Given the configuration command shown in
Listing 2-32, using DESTDIR in the manner shown in Listing 2-33 has the same
effect as the code shown in Listing 2-34.
%install
make prefix=%BUILDROOT%/usr sysconfdir=%BUILDROOT%/etc install
Listing 2-34: Overriding the default sysconfdir during installation
The key point here is one that I touched on earlier. Never write your install
target to build all or even part of your products in your makefiles. Installation func-
tionality should be limited to copying files, if possible. Otherwise, your users
won’t be able to access your staged installation features if they are using prefix
overrides.
Another reason for limiting installation functionality in this way is that it
allows the user to install sets of packages as a group into an isolated location
and then create links to the actual files in the proper locations. Some people
like to do this when they are testing out a package and want to keep track of
all its components.24
One final point: If you’re installing into a system directory hierarchy,
you’ll need root permissions. People often run make install like this:
$ sudo make install
...
If your install target depends on your build targets, and you’ve neglected
to build them beforehand, make will happily build your program before install-
ing it—but the local copies will all be owned by root. This inconvenience is
easily avoided by having make install fail for lack of things to install, rather
than jumping right into a build while running as root.
User Variables
The GCS defines a set of variables that are sacred to the user. These variables
should be referenced by a GNU build system, but never modified by a GNU build
system. These so-called user variables include those listed in Table 2-2 for C
and C++ programs.
24. Some Linux distributions provide a way of installing multiple versions of common packages.
Java is a great example; to support packages using multiple versions or brands of Java (perhaps
Sun Java versus IBM Java), some Linux distributions provide a script set called the alternatives
scripts. These allow a user (running as root) to swap all of the links in the various system directories
from one grouped installation to another. Thus, both sets of files can be installed in different
auxiliary locations, but links in the expected installation locations can be changed to refer to
each group at different times with a single root-level command.
Table 2-2: Some User Variables and Their Purposes

Variable    Purpose
CC          A reference to the system C compiler
CFLAGS      Desired C compiler flags
CXX         A reference to the system C++ compiler
CXXFLAGS    Desired C++ compiler flags
LDFLAGS     Desired linker flags
CPPFLAGS    Desired C/C++ preprocessor flags
. . .

This list is by no means comprehensive, and interestingly, there isn’t a comprehensive list to be found in the GCS. In fact, most of these variables come from the documentation for the make utility itself. These variables are used in the built-in rules of the make utility—they’re somewhat hardcoded into make, and so they are effectively defined by make. You can find a fairly complete list of program name and flag variables in the “Variables Used by Implicit Rules” section of the GNU Make Manual.

Note that make assigns default values for many of these variables based on common Unix utility names. For example, the default value of CC is cc, which (at least on Linux systems) is a soft link to the GCC C compiler (gcc). On other systems, cc is a soft link to the system’s own compiler. Thus we don’t need to set CC to gcc, which is good, because GCC may not be installed on non-Linux platforms.

For our purposes, the variables shown in Table 2-2 are sufficient, but for a more complex makefile, you should become familiar with the larger list outlined in the GNU Make Manual.

To use these variables in our makefiles, we’ll just replace gcc with $(CC). We’ll do the same for CFLAGS and CPPFLAGS, although CPPFLAGS will be empty by default. The CFLAGS variable has no default value either, but this is a good time to add one. I like to use -g to build objects with symbols, and -O0 to disable optimizations for debug builds. The updates to src/Makefile are shown in Listing 2-35.

...
CFLAGS = -g -O0
...
jupiter: main.c
	$(CC) $(CPPFLAGS) $(CFLAGS) -o $@ main.c
...

Listing 2-35: src/Makefile: Adding appropriate user variables

This works because the make utility allows such variables to be overridden by options on the command line. For example, to switch compilers and set some compiler command-line options, a user need only type the following:

$ make CC=gcc3 CFLAGS='-g -O2' CPPFLAGS=-dtest
In this case, our user has decided to use GCC version 3 instead of 4, gen-
erate debug symbols, and optimize her code using level-two optimizations.
She’s also decided to enable the test option through the use of a C-preprocessor
definition. Note that if these variables are set on the make command line, this
apparently equivalent Bourne-shell syntax will not work as expected:
$ CC=gcc3 CFLAGS='-g -O2' CPPFLAGS=-dtest make
The reason is that we’re merely setting environment variables in the local
environment passed to the make utility by the shell. Remember that environ-
ment variables do not automatically override those set in the makefile. To get
the functionality we want, we could use a little GNU make-specific syntax in
our makefile, as shown in Listing 2-36.
...
CFLAGS ?= -g -O0
...
Listing 2-36: Using the GNU make–specific query-assign operator (?=) in a makefile
The ?= operator is a GNU make-specific operator, which will only set the
variable in the makefile if it hasn’t already been set elsewhere. This means
we can now override these particular variable settings by setting them in the
environment. But don’t forget that this will only work in GNU make. In general,
it’s better to set make variables on the make command line.
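The difference is easy to demonstrate with a scratch makefile. This sketch assumes GNU make (for the ?= operator), and the file name is an arbitrary choice:

```shell
# A makefile that assigns CFLAGS only if it isn't already set.
printf 'CFLAGS ?= -g -O0\nshow:\n\t@echo $(CFLAGS)\n' > /tmp/qassign.mk

# Nothing in the environment: the makefile default applies.
make -f /tmp/qassign.mk show

# CFLAGS set in the environment: ?= leaves the environment value alone.
CFLAGS='-g -O2' make -f /tmp/qassign.mk show
```

Had the makefile used a plain = assignment, the second command would still have printed -g -O0, because makefile assignments normally win over the environment.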
Configuring Your Package
The GCS describes the configuration process in the “How Configuration
Should Work” subsection of Section 7. Up to this point, we’ve been able to
do about everything we’ve wanted to with Jupiter using only makefiles, so you
might be wondering what configuration is actually for. The opening paragraphs
of this subsection in the GCS answer our question:
Each GNU distribution should come with a shell script named
configure. This script is given arguments which describe the kind
of machine and system you want to compile the program for. The
configure script must record the configuration options so that they
affect compilation.
The description here is the specification of the interface for the
configure script in GNU packages. Many packages implement it
using GNU Autoconf (see Section “Introduction” in Autoconf )
and/or GNU Automake (see Section “Introduction” in Automake),
but you do not have to use these tools. You can implement it any
way you like; for instance, by making configure be a wrapper
around a completely different configuration system.
Another way for the configure script to operate is to make a link
from a standard name such as config.h to the proper configuration
file for the chosen system. If you use this technique, the distribution
should not contain a file named config.h. This is so that people won’t
be able to build the program without configuring it first.
Another thing that configure can do is to edit the Makefile. If you
do this, the distribution should not contain a file named Makefile.
Instead, it should include a file Makefile.in which contains the
input used for editing. Once again, this is so that people won’t
be able to build the program without configuring it first.25
So then, the primary tasks of a typical configuration script are as follows:
- Generate files from templates containing replacement variables.
- Generate a C-language header file (config.h) for inclusion by project source code.
- Set user options for a particular make environment (debug flags, etc.).
- Set various package options as environment variables.
- Test for the existence of tools, libraries, and header files.
For complex projects, configuration scripts often generate the project
makefiles from one or more templates maintained by project developers.
These templates contain configuration variables in a format that is easy to
recognize (and substitute). The configuration script replaces these variables with
values determined during the configuration process—either from command-
line options specified by the user or from a thorough analysis of the platform
environment. This analysis entails such things as checking for the existence
of certain system or package header files and libraries, searching various file-
system paths for required utilities and tools, and even running small programs
designed to indicate the feature set of the shell, C compiler, or desired libraries.
The tool of choice for variable replacement has, in the past, been the sed
stream editor. A simple sed command can replace all the configuration vari-
ables in a makefile template in a single pass through the file. However, Auto-
conf versions 2.62 and newer prefer awk to sed for this process. The awk utility
is almost as pervasive as sed these days, and it provides more functionality to
allow for efficient replacement of many variables. For our purposes on the
Jupiter project, either of these tools would suffice.
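As a sketch of what one such substitution pass looks like, the fragment below runs sed over an invented two-line template; the @var@ placeholder style matches the convention used in Autoconf template files:

```shell
# A miniature makefile template with configuration placeholders.
printf 'prefix = @prefix@\nbindir = @bindir@\n' > /tmp/Makefile.in

# One pass of sed substitutes every configuration variable.
sed -e 's|@prefix@|/usr/local|g' \
    -e 's|@bindir@|$(prefix)/bin|g' \
    /tmp/Makefile.in > /tmp/Makefile

cat /tmp/Makefile
```

A real configure script would compute the replacement values from user options and platform checks rather than hardcode them as done here.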
Summary
We have now created a complete project build system by hand, with one
important exception: We haven’t designed a configure script according to the
design criteria specified in the GNU Coding Standards. We could do this, but it
would take a dozen more pages of text to build one that even comes close
to conforming to these specifications. Still, there are a few key build features
25. See Section 7.1, “How Configuration Should Work,” in the GNU Coding Standards document
at http://www.gnu.org/prep/standards/html_node/Configuration.html#Configuration. GNU documentation
changes quite often. This text came from the March 26, 2010 version of the GCS document.
related specifically to the makefiles that the GCS indicate as being desirable.
Among these is the concept of VPATH building. This is an important feature
that can only be properly illustrated by actually writing a configuration
script that works as specified by the GCS.
Rather than spend the time and effort to do this now, I’d like to simply
move on to a discussion of Autoconf in Chapter 3, which will allow us to build
one of these configuration scripts in as little as two or three lines of code. With
that behind us, it will be trivial to add VPATH building and other common
Autotools features to the Jupiter project.
CONFIGURING YOUR PROJECT
WITH AUTOCONF
Come my friends,
’Tis not too late to seek a newer world.
—Alfred, Lord Tennyson, “Ulysses”
Because Automake and Libtool are essen-
tially add-on components to the original
Autoconf framework, it’s useful to spend
some time focusing on using Autoconf without
Automake and Libtool. This will provide a fair amount
of insight into how Autoconf operates by exposing
aspects of the tool that are often hidden by Automake.
Before Automake came along, Autoconf was used alone. In fact, many
legacy open source projects never made the transition from Autoconf to
the full GNU Autotools suite. As a result, it’s not unusual to find a file called
configure.in (the original Autoconf naming convention) as well as handwritten
Makefile.in templates in older open source projects.
In this chapter, I’ll show you how to add an Autoconf build system to an
existing project. I’ll spend most of this chapter talking about the fundamental
features of Autoconf, and in Chapter 4, I’ll go into much more detail about how
some of the more complex Autoconf macros work and how to properly use
them. Throughout this process, we’ll continue using the Jupiter project as
our example.
Autoconf Configuration Scripts
The input to the autoconf program is a shell script sprinkled with macro
calls. The input stream must also include the definitions of all referenced
macros—both those that Autoconf provides and those that you write yourself.
The macro language used in Autoconf is called M4. (The name means
M, plus 4 more letters, or the word Macro.1) The m4 utility is a general-purpose
macro language processor originally written by Brian Kernighan and Dennis
Ritchie in 1977.
While you may not be familiar with it, you can find some form of M4 on
every Unix and Linux variant (as well as other systems) in use today. The pro-
lific nature of this tool is the main reason it’s used by Autoconf, as the original
design goals of Autoconf stated that it should be able to run on all systems
without the addition of complex tool chains and utility sets.2
Autoconf depends on the existence of relatively few tools: a Bourne shell,
M4, and a Perl interpreter. The configuration scripts and makefiles it gener-
ates rely on the existence of a different set of tools, including a Bourne shell,
grep, ls, and sed or awk.3
NOTE Do not confuse the requirements of the Autotools with the requirements of the scripts
and makefiles they generate. The Autotools are maintainer tools, while the resulting
scripts and makefiles are end-user tools. We can reasonably expect a higher level of
installed functionality on development systems than we can on end-user systems.
The configuration script ensures that the end user’s build environment
is configured to properly build your project. This script checks for installed
tools, utilities, libraries, and header files, as well as for specific functionality
within these resources. What distinguishes Autoconf from other project con-
figuration frameworks is that Autoconf tests also ensure that these resources
can be properly consumed by your project. You see, it’s not only important
that your users have libxyz.so and its public header files properly installed
on their systems, but also that they have the correct versions of these files.
Autoconf is pathological about such tests. It ensures that the end user’s envi-
ronment is in compliance with the project requirements by compiling and
linking a small test program for each feature—a quintessential example, if
you will, that does what your project source code does on a larger scale.
Can’t I just ensure that libxyz.2.1.0.so is installed by searching library paths for
the filename? The answer to this question is debatable. There are legitimate
situations where libraries and tools get updated quietly. Sometimes, the spe-
cific functionality upon which your project relies is added in the form of a
security bug fix or enhancement to a library, in which case vendors aren’t
even required to bump up the version number. But it’s often difficult to tell
whether you’ve got version 2.1.0.r1 or version 2.1.0.r2 unless you look at the
file size or call a library function to make sure it works as expected.
1. As a point of interest, this naming convention is a fairly common practice in some software
engineering domains. For example, the term internationalization is often abbreviated i18n, for
the sake of brevity (or perhaps just because programmers love acronyms).
2. In fact, whatever notoriety M4 may have today is likely due to the widespread use of Autoconf.
3. Autoconf versions 2.62 and later generate configuration scripts that require awk in addition to
sed on the end user’s system.
However, the most significant reason for not relying on library version
numbers is that they do not represent specific marketing releases of a library.
As we will discuss in Chapter 7, library version numbers indicate binary interface
characteristics on a particular platform. This means that library version numbers
for the same feature set can be different from platform to platform, which
means that you may not be able to tell—short of compiling and linking against
the library—whether or not a particular library has the functionality your
project needs.
Finally, there are several important cases where the same functionality
is provided by entirely different libraries on different systems. For example,
you may find cursor manipulation functionality in libtermcap on one system,
libncurses on another, and libcurses on yet another system. But it’s not critical
that you know about all of these side cases, because your users will tell you
when your project won’t build on their system because of such a discrepancy.
What can you do when such a bug is reported? You can use the Autoconf
AC_SEARCH_LIBS macro to test multiple libraries for the same functionality. Simply
add a library to the search list, and you’re done. Since this fix is so easy, it’s
likely the user who noticed the problem will simply send a patch to your
configure.ac file.
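For the curses example just mentioned, such a patch might boil down to a single configure.ac line like the following sketch (initscr is an illustrative symbol; use whatever function your project actually calls):

```m4
# Try the no-library case first, then each named library in turn;
# fail with a diagnostic if the symbol is found nowhere.
AC_SEARCH_LIBS([initscr], [ncurses curses termcap], [],
  [AC_MSG_ERROR([could not find a curses-compatible library])])
```

When the search succeeds, AC_SEARCH_LIBS prepends the winning -l option to the LIBS output variable, so subsequent link commands pick it up automatically.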
Because Autoconf tests are written in shell script, you have a lot of flexi-
bility as to how the tests operate. You can write a test that merely checks for
the existence of a library or utility in the usual locations on your user’s system,
but this bypasses some of the most significant features of Autoconf. Fortunately,
Autoconf provides dozens of macros that conform to Autoconf’s feature-testing
philosophy. You should carefully study and use the list of available macros,
rather than write your own, because they’re specifically designed to ensure
that the desired functionality is available on the widest variety of systems and
platforms.
The Shortest configure.ac File
The simplest possible configure.ac file has just two lines, as shown in Listing 3-1.
AC_INIT([Jupiter], [1.0])
AC_OUTPUT
Listing 3-1: The simplest configure.ac file
To those new to Autoconf, these two lines appear to be a couple of func-
tion calls, perhaps in the syntax of some obscure programming language.
Don’t let their appearance throw you—these are M4 macro calls. The macros
are defined in files distributed with the autoconf package. You can find the
definition of AC_INIT, for example, in the autoconf/general.m4 file in Autoconf’s
installation directory (usually /usr/(local/)share/autoconf). AC_OUTPUT is defined
in autoconf/status.m4.
Comparing M4 to the C Preprocessor
M4 macros are similar in many ways to the C-preprocessor (CPP) macros
defined in C-language source files. The C preprocessor is also a text replacement
tool, which isn’t surprising: Both M4 and the C preprocessor were designed and
written by Kernighan and Ritchie around the same time.
Autoconf uses square brackets around macro parameters as a quoting
mechanism. Quotes are necessary only for cases in which the context of the
macro call could cause an ambiguity that the macro processor may resolve
incorrectly (usually without telling you). We’ll discuss M4 quoting in much
more detail in Chapter 10. For now, just use square brackets around every
argument to ensure that the expected macro expansions are generated.
Like CPP macros, you can define M4 macros to accept a comma-delimited
list of arguments enclosed in parentheses. In both utilities, the opening
parenthesis must immediately follow the macro name in its definition, with
no intervening whitespace. A significant difference, however, is that in M4,
the arguments to parameterized macros are optional, and the caller may simply
omit them. If no arguments are passed, you can also omit the parentheses.
Extra arguments passed to M4 macros are simply ignored. Finally, M4 does
not allow intervening whitespace between a macro name and the opening
parenthesis in a macro call.
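A tiny sketch makes the contrast concrete. Given the following M4 input (the greet macro is hypothetical), m4 prints Hello World!, Hello !, and Hello a! in turn:

```m4
define(greet, Hello $1!)dnl
dnl With an argument:
greet(World)
dnl Without parentheses ($1 expands to nothing):
greet
dnl Extra arguments are silently ignored:
greet(a,b,c)
```

A C-preprocessor macro defined with parameters, by contrast, simply would not expand at all if its name appeared without parentheses.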
The Nature of M4 Macros
If you’ve been programming in C for many years, you’ve no doubt run across
a few C-preprocessor macros from the dark regions of the lower realm. I’m
talking about those truly evil macros that expand into one or two pages of C
code. They should have been written as C functions, but their authors were
either overly worried about performance or just got carried away, and now
it’s your turn to debug and maintain them. But, as any veteran C programmer
will tell you, the slight performance gains you get by using a macro where
you should have used a function do not justify the trouble you cause main-
tainers trying to debug your fancy macros. Debugging such macros can be a
nightmare because the source code generated by macros is usually inaccessible
from within a symbolic debugger.4
Writing such complex macros is viewed by M4 programmers as a sort of
macro nirvana—the more complex and functional they are, the “cooler” they
are. The two Autoconf macros in Listing 3-1 expand into a file containing
over 2,200 lines of Bourne-shell script that total more than 60KB in size! But
you wouldn’t guess this by looking at their definitions. They’re both fairly
short—only a few dozen lines each. The reason for this apparent disparity is
simple: They’re written in a modular fashion, each macro expanding several
others, which, in turn, expand several others, and so on.
4. A technique I’ve used in the past for debugging large macros involves manually generating
source code using the C preprocessor, and then compiling this generated source. Symbolic
debuggers can only work with the source code you provide. By providing source with the macros
fully expanded, you enable the debugger to allow you to step through the generated source.
For the same reasons that programmers are taught not to abuse the C
preprocessor, the extensive use of M4 causes a fair amount of frustration for
those trying to understand Autoconf. That’s not to say Autoconf shouldn’t
use M4 this way; quite the contrary—this is the domain of M4. But there is a
school of thought that says M4 was a poor choice for Autoconf because of the
problems with macros mentioned above. Fortunately, being able to use Auto-
conf effectively usually doesn’t require a deep understanding of the inner
workings of the macros that ship with it.5
Executing autoconf
Running autoconf is simple: Just execute it in the same directory as your
configure.ac file. While I could do this for each example in this chapter, I’m
going to use the autoreconf program instead of the autoconf program, because
running autoreconf has exactly the same effect as running autoconf, except
that autoreconf will also do the right thing when you start adding Automake
and Libtool functionality to your build system. That is, it will execute all of
the Autotools in the right order based on the contents of your configure.ac file.
autoreconf is smart enough to only execute the tools you need, in the order
you need them, with the options you want (with one caveat that I’ll mention
shortly). Therefore, running autoreconf is the recommended method for exe-
cuting the Autotools tool chain.
Let’s start by adding the simple configure.ac file from Listing 3-1 to our
project directory. The top-level directory currently contains only a Makefile
and a src directory which contains its own Makefile and a main.c file. Once
you’ve added configure.ac to the top-level directory, run autoreconf:
$ autoreconf
$
$ ls -1p
autom4te.cache/
configure
configure.ac
Makefile
src/
$
First, notice that autoreconf operates silently by default. If you want to see
something happening, use the -v or --verbose option. If you want autoreconf
to execute the Autotools in verbose mode as well, then add -vv to the com-
mand line.6
Next, notice that autoconf creates a directory called autom4te.cache. This is
the autom4te cache directory. This cache speeds up access to configure.ac during
successive executions of utilities in the Autotools tool chain.
5. There are a few exceptions to this rule. Poor documentation can sometimes lead to a
misunderstanding about the intended use of some of the published Autoconf macros. This
book highlights a few of these situations, but a degree of expertise with M4 is the only way to
work your way through most of these problems.
6. You may also pass --verbose --verbose, but this syntax seems a bit . . . verbose to me.
The result of passing configure.ac through autoconf is essentially the same
file (now called configure), but with all of the macros fully expanded. You’re
welcome to take a look at configure, but don’t be too surprised if you don’t
immediately understand what you see. The configure.ac file has been trans-
formed, through M4 macro expansions, into a text file containing thousands
of lines of complex Bourne shell script.
Executing configure
As discussed in “Configuring Your Package” on page 54, the GNU Coding
Standards indicate that a handwritten configure script should generate
another script called config.status, whose job it is to generate files from
templates. Unsurprisingly, this is exactly the sort of functionality you’ll find
in an Autoconf-generated configuration script. This script has two primary
tasks:
- Perform requested checks
- Generate and then call config.status
The results of the checks performed by configure are written into
config.status in a manner that allows them to be used as replacement text for
Autoconf substitution variables in template files (Makefile.in, config.h.in, and
so on). When you execute configure, it tells you that it’s creating config.status.
It also creates a log file called config.log that has several important attributes.
Let’s run configure and then see what’s new in our project directory.
$ ./configure
configure: creating ./config.status
$
$ ls -1p
autom4te.cache/
config.log
config.status
configure
configure.ac
Makefile
src/
$
We see that configure has indeed generated both config.status and
config.log. The config.log file contains the following information:
- The command line that was used to invoke configure (very handy!)
- Information about the platform on which configure was executed
- Information about the core tests configure executed
- The line number in configure at which config.status is generated and then called
At this point in the log file, config.status takes over generating log infor-
mation and adds the following information:
- The command line used to invoke config.status
After config.status generates all the files from their templates, it exits,
returning control to configure, which then appends the following informa-
tion to the log:
- The cache variables config.status used to perform its tasks
- The list of output variables that may be replaced in templates
- The exit code configure returned to the shell
This information is invaluable when debugging a configure script and its
associated configure.ac file.
Why doesn’t configure just execute the code it writes into config.status
instead of going to all the trouble of generating a second script, only to
immediately call it? There are a few good reasons. First, the operations of
performing checks and generating files are conceptually different, and make
works best when conceptually different operations are associated with separate
make targets. A second reason is that you can execute config.status separately
to regenerate output files from their corresponding template files, saving the
time required to perform those lengthy checks. Finally, config.status is written
to remember the parameters originally used on the configure command line.
Thus, when make detects that it needs to update the build system, it can call
config.status to re-execute configure, using the command-line options that
were originally specified.
Executing config.status
Now that you know how configure works, you might be tempted to execute
config.status yourself. This was exactly the intent of the Autoconf designers
and the authors of the GCS, who originally conceived these design goals.
However, a more important reason for separating checks from template
processing is that make rules can use config.status to regenerate makefiles
from their templates when make determines that a template is newer than its
corresponding makefile.
Rather than call configure to perform needless checks (your environment
hasn’t changed—just your template files), makefile rules should be written to
indicate that output files are dependent on their templates. The commands
for these rules run config.status, passing the rule’s target as a parameter. If, for
example, you modify one of your Makefile.in templates, make calls config.status to
regenerate the corresponding Makefile, after which, make re-executes its own
original command line—basically restarting itself.7
7. This is a built-in feature of GNU make. However, for the sake of portability, Automake generates
makefiles that carefully reimplement this functionality as much as possible in make script, rather
than relying on the built-in mechanism found in GNU make. The Automake solution isn’t quite
as comprehensive as GNU make’s built-in functionality, but it’s the best we can do, under the
circumstances.
Listing 3-2 shows the relevant portion of such a Makefile.in template, con-
taining the rules needed to regenerate the corresponding Makefile.
...
Makefile: Makefile.in config.status
	./config.status $@
...
Listing 3-2: A rule that causes make to regenerate Makefile if its template has changed
A rule with a target named Makefile is the trigger here. This rule allows
make to regenerate the source makefile from its template if the template changes.
It does this before executing either the user’s specified targets or the default
target, if no specific target was given.
The rule in Listing 3-2 indicates that Makefile is dependent on config.status
as well as Makefile.in, because if configure updates config.status, it may generate
the makefile differently. Perhaps different command-line options were pro-
vided so that configure can now find libraries and header files it couldn’t find
previously. In this case, Autoconf substitution variables may have different val-
ues. Thus, Makefile should be regenerated if either Makefile.in or config.status
is updated.
Since config.status is itself a generated file, it stands to reason that you
could write such a rule to regenerate this file when needed. Expanding on the
previous example, Listing 3-3 adds the required code to rebuild config.status if
configure changes.
...
Makefile: Makefile.in config.status
	./config.status $@

config.status: configure
	./config.status --recheck
...
Listing 3-3: A rule to rebuild config.status when configure changes
Since config.status is a dependency of Makefile, make will look for a rule
whose target is config.status and run its commands if configure is newer than
config.status.
Adding Some Real Functionality
I’ve suggested before that you should call config.status in your makefiles to
generate those makefiles from templates. Listing 3-4 shows the code in
configure.ac that actually makes this happen. It’s just a single additional
macro call between the two original lines of Listing 3-1.
AC_INIT([Jupiter],[1.0])
AC_CONFIG_FILES([Makefile src/Makefile])
AC_OUTPUT
Listing 3-4: configure.ac: Using the AC_CONFIG_FILES macro
This code assumes that templates exist for Makefile and src/Makefile, called
Makefile.in and src/Makefile.in, respectively. These template files look exactly
like their Makefile counterparts, with one exception: Any text that I want
Autoconf to replace is marked as an Autoconf substitution variable, using the
@VARIABLE@ syntax.
To create these files, simply rename the existing Makefiles to Makefile.in
in both the top-level and src directories. This is a common practice when
autoconfiscating a project:
$ mv Makefile Makefile.in
$ mv src/Makefile src/Makefile.in
$
Next, let’s add a few Autoconf substitution variables to replace the original
default values. At the top of these files, I’ve also added the Autoconf substitu-
tion variable, @configure_input@, after a comment hash mark. Listing 3-5 shows
the comment text that is generated in Makefile.
# Makefile. Generated from Makefile.in by configure.
...
Listing 3-5: Makefile: The text generated from the Autoconf @configure_input@ variable
I’ve also added the makefile regeneration rules from the previous examples
to each of these templates, with slight path differences in each file to account
for their different positions relative to config.status and configure in the
build directory.
Listings 3-6 and 3-7 highlight in bold the required changes to the final ver-
sions of Makefile and src/Makefile from the end of Chapter 2.
# @configure_input@
# Package-specific substitution variables
package = @PACKAGE_NAME@
version = @PACKAGE_VERSION@
tarname = @PACKAGE_TARNAME@
distdir = $(tarname)-$(version)
# Prefix-specific substitution variables
prefix = @prefix@
exec_prefix = @exec_prefix@
bindir = @bindir@
...
$(distdir): FORCE
	mkdir -p $(distdir)/src
	cp configure.ac $(distdir)
	cp configure $(distdir)
	cp Makefile.in $(distdir)
	cp src/Makefile.in $(distdir)/src
	cp src/main.c $(distdir)/src

distcheck: $(distdir).tar.gz
	gzip -cd $(distdir).tar.gz | tar xvf -
	cd $(distdir) && ./configure
	cd $(distdir) && $(MAKE) all
	cd $(distdir) && $(MAKE) check
	cd $(distdir) && $(MAKE) DESTDIR=$${PWD}/_inst install
	cd $(distdir) && $(MAKE) DESTDIR=$${PWD}/_inst uninstall
	@remaining="`find $${PWD}/$(distdir)/_inst -type f | wc -l`"; \
	if test "$${remaining}" -ne 0; then \
	  echo "*** $${remaining} file(s) remaining in stage directory!"; \
	  exit 1; \
	fi
	cd $(distdir) && $(MAKE) clean
	rm -rf $(distdir)
	@echo "*** Package $(distdir).tar.gz is ready for distribution."

Makefile: Makefile.in config.status
	./config.status $@

config.status: configure
	./config.status --recheck
...
Listing 3-6: Makefile.in: Required modifications to Makefile from the end of Chapter 2
# @configure_input@
# Package-specific substitution variables
package = @PACKAGE_NAME@
version = @PACKAGE_VERSION@
tarname = @PACKAGE_TARNAME@
distdir = $(tarname)-$(version)
# Prefix-specific substitution variables
prefix = @prefix@
exec_prefix = @exec_prefix@
bindir = @bindir@
...
Makefile: Makefile.in ../config.status
	cd .. && ./config.status src/$@
../config.status: ../configure
	cd .. && ./config.status --recheck
...
Listing 3-7: src/Makefile.in: Required modifications to src/Makefile from the end of
Chapter 2
I’ve removed the export statements from the top-level Makefile.in and added
a copy of all of the make variables (originally only in the top-level Makefile)
into src/Makefile.in. Since config.status generates both of these files, I can
reap excellent benefits by substituting values for these variables directly into
both files. The primary advantage of doing this is that I can now run make in
any subdirectory without worrying about uninitialized variables that would
originally have been passed down by a higher-level makefile.
Since Autoconf generates entire values for these make variables, you may
be tempted to clean things up a bit by removing the variables and just substi-
tuting @prefix@ where we currently use $(prefix) throughout the files. There
are a few good reasons for keeping the make variables. First and foremost, we’ll
retain the original benefits of the make variables; our end users can continue
to substitute their own values on the make command line. (Even though
Autoconf places default values in these variables, users may wish to override
them.) Second, for variables such as $(distdir), whose values are comprised
of multiple variable references, it’s simply cleaner to build the name in one
place and use it everywhere else through a single variable.
I’ve also changed the commands in the distribution targets a bit. Rather
than distribute the makefiles, I now need to distribute the Makefile.in templates,
as well as the new configure script and the configure.ac file.8
Finally, I modified the distcheck target’s commands to run the configure
script before running make.
Generating Files from Templates
Note that you can use AC_CONFIG_FILES to generate any text file from a file of
the same name with an .in extension, found in the same directory. The .in
extension is the default template naming pattern for AC_CONFIG_FILES, but you
can override this default behavior. I’ll get into the details shortly.
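As a preview, the override uses a colon to name the input template explicitly. In this hypothetical fragment, Makefile is generated from build-aux/top.mk.in rather than from Makefile.in:

```m4
# Generate Makefile from build-aux/top.mk.in instead of Makefile.in.
AC_CONFIG_FILES([Makefile:build-aux/top.mk.in])
```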
Autoconf generates sed or awk expressions into the resulting configure script,
which then copies them into config.status. The config.status script uses these
expressions to perform string replacement in the input template files.
Both sed and awk are text-processing tools that operate on file streams.
The advantage of a stream editor (the name sed is a contraction of the phrase
stream editor) is that it replaces text patterns in a byte stream. Thus, both sed
and awk can operate on huge files because they don’t need to load the entire
input file into memory in order to process it. Autoconf builds the expression
list that config.status passes to sed or awk from a list of variables defined by
8. Distributing configure.ac is not merely an act of kindness—it could also be considered a
requirement of GNU source licenses, since configure.ac is very literally the source code for
configure.
various macros, many of which I’ll cover in greater detail later in this chap-
ter. It’s important to understand that Autoconf substitution variables are the
only items replaced in a template file while generating output files.
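You can imitate the substitution step by hand. The following shell sketch performs the same kind of replacement config.status does, though real config.status scripts use generated expression lists and are far more elaborate:

```shell
# Imitate config.status: turn a template containing @VARIABLE@ markers
# into output text by streaming it through sed.
template='package = @PACKAGE_NAME@
version = @PACKAGE_VERSION@'
output=$(printf '%s\n' "$template" \
  | sed -e 's|@PACKAGE_NAME@|Jupiter|g' -e 's|@PACKAGE_VERSION@|1.0|g')
printf '%s\n' "$output"
```

In the real tool, the expression list is built from the substitution variables declared by the macros in your configure.ac file.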
At this point, with very little effort, I’ve created a basic configure.ac file. I
can now execute autoreconf, followed by configure and then make, in order to
build the Jupiter project. This simple, three-line configure.ac file generates a
configure script that is fully functional, according to the definition of a proper
configuration script defined by the GCS.
The resulting configuration script runs various system checks and gener-
ates a config.status script that can replace a fair number of substitution
variables in a set of specified template files in this build system. That’s a lot
of functionality in just three lines of code.
Adding VPATH Build Functionality
At the end of Chapter 2, I mentioned that I hadn’t yet covered an important
concept—that of VPATH builds. A VPATH build is a way of using a makefile
construct (VPATH) to configure and build a project in a directory other than
the source directory. This is important if you need to perform any of the fol-
lowing tasks:
- Maintain a separate debug configuration
- Test different configurations side by side
- Keep a clean source directory for patch diffs after local modifications
- Build from a read-only source directory
The VPATH keyword is short for virtual search path. A VPATH statement con-
tains a colon-separated list of places to look for relative-path dependencies
when they can’t be found relative to the current directory. In other words,
when make can’t find a prerequisite file relative to the current directory, it
searches for that file successively in each of the paths in the VPATH statement.
Adding remote build functionality to an existing makefile using VPATH is very
simple. Listing 3-8 shows an example of using a VPATH statement in a makefile.
VPATH = some/path:some/other/path:yet/another/path
program: src/main.c
	$(CC) ...
Listing 3-8: An example of using VPATH in a makefile
In this (contrived) example, if make can’t find src/main.c in the current direc-
tory while processing the rule, it will look for some/path/src/main.c, and then for
some/other/path/src/main.c, and finally for yet/another/path/src/main.c before giving
up with an error message about not knowing how to make src/main.c.
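You can watch VPATH resolution happen in a small, self-contained experiment. All paths below are throwaway, and cat stands in for the compiler so that no toolchain is required:

```shell
# Build in a directory that contains no sources; VPATH tells make where
# to look, and $< expands to the path make actually discovered.
demo=$(mktemp -d)
mkdir -p "$demo/srctree/src" "$demo/build"
echo 'source text' > "$demo/srctree/src/main.c"
printf 'VPATH = ../srctree\nprogram: src/main.c\n\tcat $< > $@\n' \
  > "$demo/build/Makefile"
make -C "$demo/build" program >/dev/null
result=$(cat "$demo/build/program")
echo "$result"
rm -rf "$demo"
```

Because make substitutes the discovered path into automatic variables such as $<, the recipe never needs to know where the source actually lives.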
With just a few simple modifications, we can completely support remote
builds in Jupiter. Listings 3-9 and 3-10 illustrate the necessary changes to the
project’s two makefiles.
...
# VPATH-specific substitution variables
srcdir = @srcdir@
VPATH = @srcdir@
...
$(distdir): FORCE
	mkdir -p $(distdir)/src
	cp $(srcdir)/configure.ac $(distdir)
	cp $(srcdir)/configure $(distdir)
	cp $(srcdir)/Makefile.in $(distdir)
	cp $(srcdir)/src/Makefile.in $(distdir)/src
	cp $(srcdir)/src/main.c $(distdir)/src
...
Listing 3-9: Makefile.in: Adding VPATH build capabilities to the top-level makefile
...
# VPATH-related substitution variables
srcdir = @srcdir@
VPATH = @srcdir@
...
Listing 3-10: src/Makefile.in: Adding VPATH build capabilities to the lower-level makefile
That’s it. Really. When config.status generates a file, it replaces an Autoconf
substitution variable called @srcdir@ with the relative path to the template’s
source directory. The value substituted for @srcdir@ in a given Makefile within
the build directory structure is the relative path to the directory containing the
corresponding Makefile.in template in the source directory structure. The
concept here is that for each Makefile in the remote build directory, VPATH
provides a relative path to the directory containing the source code for that
build directory.
The changes required for supporting remote builds in your build system
are summarized as follows:
- Set a make variable, srcdir, to the @srcdir@ substitution variable.
- Set the VPATH variable to @srcdir@.
- Prefix all file dependencies used in commands with $(srcdir)/.
NOTE Don’t use $(srcdir) in the VPATH statement itself, because some older versions of make
won’t substitute variable references within the VPATH statement.
If the source directory is the same as the build directory, the @srcdir@ sub-
stitution variable degenerates to a dot (.). That means all of these $(srcdir)/
prefixes simply degenerate to ./, which is harmless.9
9. This is not strictly true for non-GNU implementations of make. GNU make is smart enough to
know that file and ./file refer to the same filesystem object. However, non-GNU implementations
of make aren’t always quite so intelligent, so you should be careful to refer to a filesystem object
using the same notation for each reference in your Makefile.in templates.
A quick example is the easiest way to show you how this works. Now that
Jupiter is fully functional with respect to remote builds, let’s give it a try. Start
in the Jupiter project directory, create a subdirectory called build, and then
change into that directory. Execute the configure script using a relative path,
and then list the current directory contents:
$ mkdir build
$ cd build
$ ../configure
configure: creating ./config.status
config.status: creating Makefile
config.status: creating src/Makefile
$
$ ls -1p
config.log
config.status
Makefile
src/
$
$ ls -1p src
Makefile
$
The entire build system has been constructed by configure and config.status
within the build subdirectory. Enter make to build the project from within the
build directory:
$ make
cd src && make all
make[1]: Entering directory '../prj/jupiter/build'
gcc -g -O2 -o jupiter ../../src/main.c
make[1]: Leaving directory '../prj/jupiter/build'
$
$ ls -1p src
jupiter
Makefile
$
No matter where you are, if you can access the project directory using
either a relative or an absolute path, you can do a remote build from that
location. This is just one more thing that Autoconf does for you in Autoconf-
generated configuration scripts. Imagine managing proper relative paths to
source directories in your own hand-coded configuration scripts!
Let’s Take a Breather
So far, I’ve shown you a nearly complete build system that includes almost all
of the features outlined in the GCS. The features of Jupiter’s build system are
all fairly self-contained and reasonably simple to understand. The most diffi-
cult feature to implement by hand is the configuration script. In fact, writing
a configuration script by hand is so labor intensive, compared to the simplicity
of using Autoconf, that I just skipped the hand-coded version entirely in
Chapter 2.
Although using Autoconf like I’ve used it here is quite easy, most people
don’t create their build systems in the manner I’ve shown you. Instead, they
try to copy the build system of another project, and tweak it to make it work
in their own project. Later, when they start a new project, they do the same
thing again. This can cause trouble because the code they’re copying was
never meant to be used the way they’re now trying to use it.
I’ve seen projects in which the configure.ac file contained junk that had
nothing to do with the project to which it belonged. These leftover bits came
from some legacy project, but the maintainer didn’t know enough about
Autoconf to properly remove all the extraneous text. With the Autotools,
it’s generally better to start small and add what you need than to start with a
copy of configure.ac from another full-featured build system, and try to pare it
down to size or otherwise modify it to work with a new project.
I’m sure you’re feeling like there’s a lot more to learn about Autoconf,
and you’re right. We’ll spend the majority of this chapter examining the
most important Autoconf macros and how they’re used in the context of the
Jupiter project. But first, let’s go back and see if we might be able to simplify
the Autoconf startup process even more by using another utility that comes
with the autoconf package.
An Even Quicker Start with autoscan
The easiest way to create a (mostly) complete configure.ac file is to run the
autoscan utility, which is part of the autoconf package. This utility examines
the contents of a project directory and generates the basis for a configure.ac
file (which autoscan names configure.scan) using existing makefiles and source
files.
Let’s see how well autoscan does on the Jupiter project. First, I’ll clean up
the droppings from my earlier experiments, and then run autoscan in the
jupiter directory. Note that I’m not deleting my original configure.ac file—I’ll
just let autoscan tell me how to improve it. In less than a second, I have a few
new files in the top-level directory:
$ rm -rf autom4te.cache build
$ rm configure config.* Makefile src/Makefile src/jupiter
$ ls -1p
configure.ac
Makefile.in
src/
$
$ autoscan
configure.ac: warning: missing AC_CHECK_HEADERS([stdlib.h]) wanted by: src/main.c:2
configure.ac: warning: missing AC_PREREQ wanted by: autoscan
configure.ac: warning: missing AC_PROG_CC wanted by: src/main.c
configure.ac: warning: missing AC_PROG_INSTALL wanted by: Makefile.in:18
$
$ ls -1p
autom4te.cache/
autoscan.log
configure.ac
configure.scan
Makefile.in
src/
$
The autoscan utility examines the project directory hierarchy and creates
two files called configure.scan and autoscan.log. The project may or may not
already be instrumented for Autotools—it doesn’t really matter, because
autoscan is decidedly non-destructive. It will never alter any existing files in a
project.
The autoscan utility generates a warning message for each problem it discovers in an existing configure.ac file. In this example, autoscan noticed that configure.ac should be using the Autoconf-provided AC_CHECK_HEADERS, AC_PREREQ, AC_PROG_CC, and AC_PROG_INSTALL macros. It made these assumptions based on information gleaned from the existing Makefile.in templates and from the C-language source files, as you can see from the file and line references appended to the warning messages. You can always see these messages (in even greater detail) by examining the autoscan.log file.
NOTE The notices you receive from autoscan and the contents of your configure.ac file may
differ slightly from mine, depending on the version of Autoconf you have installed. I
have version 2.64 of GNU Autoconf installed on my system (the latest, as of this writing).
If your version of autoscan is older (or newer), you may see some minor differences.
Looking at the generated configure.scan file, I note that autoscan has added
more text to this file than was in my original configure.ac file. After looking it
over to ensure that I understand everything, I see that it’s probably easiest for
me to overwrite configure.ac with configure.scan and then change the few bits
of information that are specific to Jupiter:
$ mv configure.scan configure.ac
$ cat configure.ac
# -*- Autoconf -*-
# Process this file with autoconf to produce a configure script.
AC_PREREQ([2.64])
AC_INIT([FULL-PACKAGE-NAME], [VERSION], [BUG-REPORT-ADDRESS])
AC_CONFIG_SRCDIR([src/main.c])
AC_CONFIG_HEADERS([config.h])
# Checks for programs.
AC_PROG_CC
AC_PROG_INSTALL
# Checks for libraries.
Configuring Your Project with Autoconf 73
# Checks for header files.
AC_CHECK_HEADERS([stdlib.h])
# Checks for typedefs, structures, and compiler characteristics.
# Checks for library functions.
AC_CONFIG_FILES([Makefile
src/Makefile])
AC_OUTPUT
$
My first modification involves changing the AC_INIT macro parameters for
Jupiter, as illustrated in Listing 3-11.
# -*- Autoconf -*-
# Process this file with autoconf to produce a configure script.
AC_PREREQ([2.64])
AC_INIT([Jupiter], [1.0], [jupiter-bugs@example.org])
AC_CONFIG_SRCDIR([src/main.c])
AC_CONFIG_HEADERS([config.h])
...
Listing 3-11: configure.ac: Tweaking the AC_INIT macro generated by autoscan
The autoscan utility does a lot of the work for you. The GNU Autoconf
Manual10 states that you should modify this file to meet the needs of your
project before you use it, but there are only a few key issues to worry about
(besides those related to AC_INIT). I’ll cover each of these issues in turn, but
first, let’s take care of a few administrative details.
The Proverbial autogen.sh Script
Before autoreconf came along, maintainers passed around a short shell script,
often named autogen.sh or bootstrap.sh, which would run all of the Autotools
required for their projects in the proper order. The autogen.sh script can be
fairly sophisticated, but to solve the problem of the missing install-sh script
(see “Missing Required Files in Autoconf” on page 74), I’ll just add a simple
temporary autogen.sh script to the project root directory, as shown in Listing 3-12.
#!/bin/sh
autoreconf --install
automake --add-missing --copy >/dev/null 2>&1
Listing 3-12: autogen.sh: A temporary bootstrap script that executes the required Autotools
The automake --add-missing option copies the required missing utility scripts
into the project, and the --copy option indicates that true copies should be
10. See the Free Software Foundation’s GNU Autoconf Manual at http://www.gnu.org/software/
autoconf/manual/index.html.
made (otherwise, symbolic links are created that refer to the files where they’re
installed with the Automake package).11
11. The automake --add-missing option copies in the missing required utility scripts, and the
--copy option indicates that true copies should be made—otherwise, symbolic links are created
to the files where the automake package has installed them. This isn’t as bad as it sounds, because
when make dist generates a distribution archive, it creates true copies in the image directory.
Therefore, links work just fine, as long as you (the maintainer) don’t move your work area
to another host. Note that automake provides a --copy option, but autoreconf provides just the
opposite: a --symlink option. Thus, if you execute automake --add-missing and you wish to actually
copy the files, you should pass --copy as well. If you execute autoreconf --install, --copy will be
assumed and passed to automake by autoreconf.
MISSING REQUIRED FILES IN AUTOCONF
When I first tried to execute autoreconf on the configure.ac file in Listing 3-11, I dis-
covered a minor problem related to using Autoconf without Automake. When I ran
the configure script, it failed with an error: configure: error: cannot find install-sh
or install.sh ...
Autoconf is all about portability and, unfortunately, the Unix install utility is not
as portable as it could be. From one platform to another, critical bits of installation
functionality are just different enough to cause problems, so the Autotools provide a
shell script called install-sh (deprecated name: install.sh). This script acts as a
wrapper around the system’s own install utility, masking important differences
between various versions of install.
autoscan noticed that I’d used the install program in my src/Makefile.in tem-
plate, so it generated an expansion of the AC_PROG_INSTALL macro. The problem is
that configure couldn’t find the install-sh wrapper script anywhere in my project.
I reasoned that the missing file was part of the Autoconf package, and it just
needed to be installed. I also knew that autoreconf accepts a command-line option
to install such missing files into a project directory. The --install option supported
by autoreconf is designed to pass tool-specific options down to each of the tools that
it calls in order to install missing files. However, when I tried that, I found that the file
was still missing, because autoconf doesn’t support an option to install missing files.1
I could have manually copied install-sh from the Automake installation direc-
tory (usually /usr/(local/)share/automake-*), but looking for a more automated solu-
tion, I tried manually executing automake --add-missing --copy. This command
generated a slew of warnings indicating that the project was not configured for
Automake. However, I could now see that install-sh had been copied into my
project root directory, and that’s all I was after. Executing autoreconf --install
didn’t run automake because configure.ac was not configured for Automake.
Autoconf should ship with install-sh, since it provides a macro that requires it,
but then autoconf would have to provide an --add-missing command-line option.
Nevertheless, there is actually a quite obvious solution to this problem. The install-sh
script is not really required by any code Autoconf generates. How could it be?
Autoconf doesn’t generate any makefile constructs—it only substitutes variables into
your Makefile.in templates. Thus, there’s really no reason for Autoconf to complain
about a missing install-sh script.2
1. Worse still, the GNU Autoconf Manual that I was using at the time told me that “Autoconf
comes with a copy of install-sh that you can use”—but it’s really Automake and Libtool that
come with copies of install-sh.
2. When I presented this problem on the Autoconf mailing list, I was told several times that
autoconf has no business copying install-sh into a project directory, thus there is no install-
missing-file functionality accessible from the autoconf command line. If this is indeed the case,
then autoconf has no business complaining about the missing file, either!
NOTE When make dist generates a distribution archive, it creates true copies in the image
directory, so the use of symlinks causes no real problems, as long as you (the main-
tainer) don’t move your work area to another host.
We don’t need to see the warnings from automake, so I’ve redirected the
stderr and stdout streams to /dev/null on the automake command line in
this script. In Chapter 5, we’ll remove autogen.sh and simply run autoreconf
--install, but for now, this will solve our missing file problems.
Updating Makefile.in
Let’s execute autogen.sh and see what we end up with:
$ sh autogen.sh
$ ls -1p
autogen.sh
autom4te.cache/
config.h.in
configure
configure.ac
install-sh
Makefile.in
src/
$
We know from the file list that config.h.in has been created, so we
know that autoreconf has executed autoheader. We also see the new install-sh
script that was created when we executed automake in autogen.sh. Anything
provided or generated by the Autotools should be copied into the archive
directory so that it can be shipped with release tarballs. Therefore, we’ll
add cp commands for these two files to the $(distdir) target in the top-level
Makefile.in template. Note that we don’t need to copy the autogen.sh script
because it’s purely a maintainer tool—users should never need to execute it
from a tarball distribution.
Listing 3-13 illustrates the required changes to the $(distdir) target in
the top-level Makefile.in template.
...
$(distdir): FORCE
mkdir -p $(distdir)/src
cp $(srcdir)/configure.ac $(distdir)
cp $(srcdir)/configure $(distdir)
cp $(srcdir)/config.h.in $(distdir)
cp $(srcdir)/install-sh $(distdir)
cp $(srcdir)/Makefile.in $(distdir)
cp $(srcdir)/src/Makefile.in $(distdir)/src
cp $(srcdir)/src/main.c $(distdir)/src
...
Listing 3-13: Makefile.in: Additional files needed in the distribution archive image directory
If you’re beginning to think that this could become a maintenance prob-
lem, then you’re right. I mentioned earlier that the $(distdir) target was painful
to maintain. Luckily, the distcheck target still exists and still works as designed.
It would have caught this problem, because attempts to build from the tarball
will fail without these additional files—and the check target certainly won’t
succeed if the build fails. When we discuss Automake in Chapter 5, we will
clear up much of this maintenance mess.
Initialization and Package Information
Now let’s turn our attention back to the contents of the configure.ac file in
Listing 3-11. The first section contains Autoconf initialization macros. These
are required for all projects. Let’s consider each of these macros individually,
because they’re all important.
AC_PREREQ
The AC_PREREQ macro simply defines the earliest version of Autoconf that may
be used to successfully process this configure.ac file:
AC_PREREQ(version)
The GNU Autoconf Manual indicates that AC_PREREQ is the only macro that
may be used before AC_INIT. This is because it’s good to ensure you’re using a
new enough version of Autoconf before you begin processing any other macros,
which may be version dependent.
AC_INIT
The AC_INIT macro, as its name implies, initializes the Autoconf system. Here’s
its prototype, as defined in the GNU Autoconf Manual:12
AC_INIT(package, version, [bug-report], [tarname], [url])
It accepts up to five arguments (autoscan only generates a call with the
first three): package, version, and optionally, bug-report, tarname, and url. The
package argument is intended to be the name of the package. It will end up
(in a canonical form) as the first part of the name of an Automake-generated
release distribution tarball when you execute make dist.
NOTE Autoconf uses a normalized form of the package name in the tarball name, so you can
use uppercase letters in the package name, if you wish. Automake-generated tarballs are
named tarname-version.tar.gz by default, but tarname is set to a normalized form of
the package name (lowercase, with all punctuation converted to underscores). Bear this
in mind when you choose your package name and version string.
12. The square brackets used in the macro definition prototypes within this book (as well as the
GNU Autoconf Manual) indicate optional parameters, not Autoconf quotes.
The optional bug-report argument is usually set to an email address, but any
text string is valid. An Autoconf substitution variable called @PACKAGE_BUGREPORT@ is
created for it, and that variable is also added to the config.h.in template as a C-
preprocessor definition. The intent here is that you use the variable in your
code to present an email address for bug reports at appropriate places—possibly
when the user requests help or version information from your application.
While the version argument can be anything you like, there are a few
commonly used OSS conventions that will make things a little easier for you.
The most widely used convention is to pass in major.minor (e.g., 1.2). However,
there’s nothing that says you can’t use major.minor.revision, and there’s nothing
wrong with this approach. None of the resulting VERSION variables (Autoconf,
shell, or make) are parsed or analyzed anywhere—they’re only used as place-
holders for substituted text in various locations.13 So if you wish, you may
even add nonnumeric text into this macro, such as 0.15.alpha1, which is
occasionally useful.14
NOTE The RPM package manager, on the other hand, does care what you put in the version
string. For the sake of RPM, you may wish to limit the version string text to only alpha-
numeric characters and periods—no dashes or underscores.
The optional url argument should be the URL for your project website.
It’s shown in the help text displayed by configure --help.
Autoconf generates the substitution variables @PACKAGE_NAME@,
@PACKAGE_VERSION@, @PACKAGE_TARNAME@, @PACKAGE_STRING@ (a stylized concatena-
tion of the package name and version information), @PACKAGE_BUGREPORT@,
and @PACKAGE_URL@ from the arguments to AC_INIT.
AC_CONFIG_SRCDIR
The AC_CONFIG_SRCDIR macro is a sanity check. Its purpose is to ensure that the
generated configure script knows that the directory on which it is being exe-
cuted is actually the project directory.
More specifically, configure needs to be able to locate itself, because it
generates code that executes itself, possibly from a remote directory. There
are myriad ways to inadvertently fool configure into finding some other
configure script. For example, the user could accidentally provide an incorrect
--srcdir argument to configure. The $0 shell script parameter is unreliable, at
best—it may contain the name of the shell, rather than that of the script, or it
may be that configure was found in the system search path, so no path infor-
mation was specified on the command line.
13. As far as M4 is concerned, all data is text; thus M4 macro arguments, including package and
version, are treated simply as strings. M4 doesn’t attempt to interpret any of this text as numbers
or other data types.
14. A future version of Autoconf will support a public macro that allows lexicographical comparison
of version strings, and certain internal constructs in current versions already use such functionality.
Thus, it’s good practice to form version strings that increase properly in a lexical fashion from
version to version.
The configure script could try looking in the current or parent directories,
but it still needs a way to verify that the configure script it locates is actually
itself. Thus, AC_CONFIG_SRCDIR gives configure a significant hint that it’s looking
in the right place. Here’s the prototype for AC_CONFIG_SRCDIR:
AC_CONFIG_SRCDIR(unique-file-in-source-dir)
The argument can be a path (relative to the project’s configure script) to
any source file you like. You should choose one that is unique to your project
so as to minimize the possibility that configure is fooled into thinking some
other project’s configuration file is itself. I try to choose a file that sort of rep-
resents the project, such as a source file named for a feature that defines the
project. That way, in case I ever decide to reorganize the source code, I’m
not likely to lose it in a file rename. But it doesn’t really matter, because both
autoconf and configure will tell you and your users if it can’t find this file.
The Instantiating Macros
Before we dive into the details of AC_CONFIG_HEADERS, I’d like to spend a little
time on the file generation framework Autoconf provides. From a high-level
perspective, there are four major things happening in configure.ac:
- Initialization
- Check request processing
- File instantiation request processing
- Generation of the configure script
We’ve covered initialization—there’s not much to it, although there are
a few more macros you should be aware of. Check out the GNU Autoconf
Manual for more information—look up AC_COPYRIGHT, for an example. Now
let’s move on to file instantiation.
There are actually four so-called instantiating macros: AC_CONFIG_FILES,
AC_CONFIG_HEADERS, AC_CONFIG_COMMANDS, and AC_CONFIG_LINKS. An instantiating
macro accepts a list of tags or files; configure will generate these files from
templates containing Autoconf substitution variables.
NOTE You might need to change the name of AC_CONFIG_HEADER (singular) to AC_CONFIG_HEADERS
(plural) in your version of configure.scan. The singular version is the older name for
this macro, and the older macro is less functional than the newer one.15
The four instantiating macros have an interesting common signature.
The following prototype can be used to represent each of them, with appro-
priate text replacing the XXX portion of the macro name:
AC_CONFIG_XXXS(tag..., [commands], [init-cmds])
15. This was a defect in autoscan that had not been fixed as of Autoconf version 2.61. However,
version 2.62 of autoscan correctly generates a call to the newer, more functional AC_CONFIG_HEADERS.
For each of these four macros, the tag argument has the form OUT[:INLIST],
where INLIST has the form IN0[:IN1:...:INn]. Often, you’ll see a call to one
of these macros with only a single argument, as in the three examples below
(note that these examples represent macro calls, not prototypes, so the square
brackets are actually Autoconf quotes, not indications of optional parameters):
AC_CONFIG_HEADERS([config.h])
In this example, config.h is the OUT portion of the above specification. The
default value for INLIST is the OUT portion with .in appended to it. So, in other
words, the above call is exactly equivalent to:
AC_CONFIG_HEADERS([config.h:config.h.in])
What this means is that config.status contains shell code that will gener-
ate config.h from config.h.in, substituting all Autoconf variables in the process.
You may also provide a list of input files in the INLIST portion. In this case, the
files in INLIST will be concatenated to form the resulting OUT file:
AC_CONFIG_HEADERS([config.h:cfg0:cfg1:cfg2])
Here, config.status will generate config.h by concatenating cfg0, cfg1, and
cfg2 (in that order), after substituting all Autoconf variables. The GNU Autoconf
Manual refers to this entire OUT[:INLIST] construct as a tag.
Why not just call it a file? Well, this parameter’s primary purpose is to
provide a sort of command-line target name—much like makefile targets. It
can also be used as a filesystem name if the associated macro generates files,
as is the case with AC_CONFIG_HEADERS, AC_CONFIG_FILES, and AC_CONFIG_LINKS.
But AC_CONFIG_COMMANDS is unique in that it doesn’t generate any files. Instead,
it runs arbitrary shell code, as specified by the user in the macro’s arguments.
Thus, rather than name this first parameter after a secondary function (the
generation of files), the GNU Autoconf Manual refers to it more generally,
according to its primary purpose—as a command-line tag that may be specified
on the config.status command line, in this manner:
$ ./config.status config.h
This config.status command line will regenerate the config.h file based
on the macro call to AC_CONFIG_HEADERS in configure.ac. It will only regenerate
config.h.
Enter ./config.status --help to see the other command-line options you
can use when executing config.status:
$ ./config.status --help
'config.status' instantiates files from templates according to the
current configuration.
Usage: ./config.status [OPTION]... [TAG]...
-h, --help print this help, then exit
-V, --version print version number and configuration settings, then exit
-q, --quiet, --silent
do not print progress messages
-d, --debug don't remove temporary files
--recheck update config.status by reconfiguring in the same
conditions
--file=FILE[:TEMPLATE]
instantiate the configuration file FILE
--header=FILE[:TEMPLATE]
instantiate the configuration header FILE
Configuration files:
Makefile src/Makefile
Configuration headers:
config.h
Report bugs to <bug-autoconf@gnu.org>.
$
Notice that config.status provides custom help about a project’s
config.status file. It lists configuration files and configuration headers
that we can use as tags on the command line where the usage specifies
[TAG]...; in this case, config.status will only instantiate the specified
objects. In the case of commands, it will execute the command set specified by
the tag passed in the associated expansion of the AC_CONFIG_COMMANDS macro.
Each of these macros may be used multiple times in a configure.ac file.
The results are cumulative, and we can use AC_CONFIG_FILES as many times as
we need to in configure.ac. It is also important to note that config.status
supports the --file= option. When you call config.status with tags on the
command line, the only tags you can use are those the help text lists as avail-
able configuration files, headers, links, and commands. When you execute
config.status with the --file= option, you’re telling config.status to generate
a new file that’s not already associated with any of the calls to the instantiating
macros found in configure.ac. This new file is generated from an associated
template using configuration options and check results determined by the
last execution of configure. For example, I could execute config.status in this
manner:
$ ./config.status --file=extra:extra.in
NOTE The default template name is the filename with a .in suffix, so this call could have
been made without using the :extra.in portion of the option. I added it here for clarity.
Let’s return to the instantiating macro signature at the bottom of
page 78. I’ve shown you that the tag... argument has a complex format,
but the ellipsis indicates that it also represents multiple tags, separated by
whitespace. The format you’ll see in nearly all configure.ac files is shown in
Listing 3-14.
...
AC_CONFIG_FILES([Makefile
src/Makefile
lib/Makefile
etc/proj.cfg])
...
Listing 3-14: Specifying multiple tags (files) in AC_CONFIG_FILES
Each entry here is one tag specification, which, if fully specified, would
look like the call in Listing 3-15.
...
AC_CONFIG_FILES([Makefile:Makefile.in
src/Makefile:src/Makefile.in
lib/Makefile:lib/Makefile.in
etc/proj.cfg:etc/proj.cfg.in])
...
Listing 3-15: Fully specifying multiple tags in AC_CONFIG_FILES
Returning to the instantiating macro prototype, there are two optional
arguments that you’ll rarely see used in these macros: commands and init-cmds.
The commands argument may be used to specify some arbitrary shell code that
should be executed by config.status just before the files associated with the tags
are generated. It is unusual for this feature to be used within the file-generating
instantiating macros. You will almost always see the commands argument used
with AC_CONFIG_COMMANDS, which generates no files by default, because a call to
this macro is basically useless without commands to execute!16 In this case,
the tag argument becomes a way of telling config.status to execute a specific
set of shell commands.
The init-cmds argument initializes shell variables at the top of config.status
with values available in configure.ac and configure. It’s important to remember
that all calls to instantiating macros share a common namespace along with
config.status. Therefore, you should try to choose your shell variable names
carefully so they are less likely to conflict with each other and with Autoconf-
generated variables.
The old adage about the value of a picture versus an explanation holds
true here, so let’s try a little experiment. Create a test version of your configure.ac
file that contains only the contents of Listing 3-16.
AC_INIT([test], [1.0])
AC_CONFIG_COMMANDS([abc],
[echo "Testing $mypkgname"],
[mypkgname=$PACKAGE_NAME])
AC_OUTPUT
Listing 3-16: Experiment #1—a simple configure.ac file
16. The truth is that we don’t often use AC_CONFIG_COMMANDS.
Now execute autoreconf, configure, and config.status in various ways to
see what happens:
$ autoreconf
$ ./configure
configure: creating ./config.status
config.status: executing abc commands
Testing test
$
$ ./config.status
config.status: executing abc commands
Testing test
$
$ ./config.status --help
'config.status' instantiates files from templates according to the current
configuration.
Usage: ./config.status [OPTIONS]... [FILE]...
...
Configuration commands:
abc
Report bugs to <bug-autoconf@gnu.org>.
$
$ ./config.status abc
config.status: executing abc commands
Testing test
$
As you can see from the transcript, executing configure caused config.status
to be executed with no command-line options. There are no checks specified
in configure.ac, so manually executing config.status has nearly the same
effect. Querying config.status for help indicates that abc is a valid tag;
executing config.status with that tag on the command line simply runs the
associated commands.
In summary, the important points regarding the instantiating macros are
as follows:
- The config.status script generates all files from templates.
- The configure script performs all checks and then executes config.status.
- When you execute config.status with no command-line options, it generates
files based on the last set of check results.
- You can call config.status to execute file generation or command sets
specified by any of the tags given in any of the instantiating macro calls.
- config.status may generate files not associated with any tags specified in
configure.ac, in which case it will substitute variables based on the last set
of checks performed.
AC_CONFIG_HEADERS
As you’ve no doubt concluded by now, the AC_CONFIG_HEADERS macro allows
you to specify one or more header files that config.status should generate
from template files. The format of a configuration header template is very
specific. A short example is given in Listing 3-17.
/* Define as 1 if you have unistd.h. */
#undef HAVE_UNISTD_H
Listing 3-17: A short example of a header file template
You can place multiple statements like this in your header template, one
per line. The comments are optional, of course. Let’s try another experiment.
Create a new configure.ac file like that shown in Listing 3-18.
AC_INIT([test], [1.0])
AC_CONFIG_HEADERS([config.h])
AC_CHECK_HEADERS([unistd.h foobar.h])
AC_OUTPUT
Listing 3-18: Experiment #2—a simple configure.ac file
Create a template header file called config.h.in that contains the two lines
in Listing 3-19.
#undef HAVE_UNISTD_H
#undef HAVE_FOOBAR_H
Listing 3-19: Experiment #2 continued—a simple config.h.in file
Now execute the following commands:
$ autoconf
$ ./configure
checking for gcc... gcc
...
checking for unistd.h... yes
checking for unistd.h... (cached) yes
checking foobar.h usability... no
checking foobar.h presence... no
checking for foobar.h... no
configure: creating ./config.status
config.status: creating config.h
$
$ cat config.h
/* config.h. Generated from ... */
#define HAVE_UNISTD_H 1
/* #undef HAVE_FOOBAR_H */
$
You can see that config.status generated a config.h file from the
simple config.h.in template we wrote. The contents of this header file are
based on the checks executed by configure. Since the shell code generated by
AC_CHECK_HEADERS([unistd.h foobar.h]) was able to locate a unistd.h header file
in the system include directory, the corresponding #undef statement was
converted into a #define statement. Of course, no foobar.h header was found
in the system include directory, as you can also see from the output of
configure; therefore, its definition was left commented out in the template.
Thus, you may add the sort of code shown in Listing 3-20 to appropriate
C-language source files in your project.
#if HAVE_CONFIG_H
# include <config.h>
#endif
#if HAVE_UNISTD_H
# include <unistd.h>
#endif
#if HAVE_FOOBAR_H
# include <foobar.h>
#endif
Listing 3-20: Using generated CPP definitions in a C-language source file
Using autoheader to Generate an Include File Template
Manually maintaining a config.h.in template is more trouble than necessary.
The format of config.h.in is very strict—for example, you can’t have any leading
or trailing whitespace on the #undef lines. Besides that, most of the informa-
tion you need from config.h.in is available in configure.ac.
Fortunately, the autoheader utility will generate a properly formatted
header file template for you based on the contents of configure.ac, so you
don’t often need to write config.h.in templates. Let’s return to the command
prompt for a final experiment. This one is easy—just delete your config.h.in
template and then run autoheader and autoconf:
$ rm config.h.in
$ autoheader
$ autoconf
$ ./configure
checking for gcc... gcc
...
checking for unistd.h... yes
checking for unistd.h... (cached) yes
checking foobar.h usability... no
checking foobar.h presence... no
checking for foobar.h... no
configure: creating ./config.status
config.status: creating config.h
$
$ cat config.h
/* config.h. Generated from config.h.in... */
/* config.h.in. Generated from configure.ac... */
...
/* Define to 1 if you have... */
/* #undef HAVE_FOOBAR_H */
/* Define to 1 if you have... */
#define HAVE_UNISTD_H 1
/* Define to the address where bug... */
#define PACKAGE_BUGREPORT ""
/* Define to the full name of this package. */
#define PACKAGE_NAME "test"
/* Define to the full name and version... */
#define PACKAGE_STRING "test 1.0"
/* Define to the one symbol short name... */
#define PACKAGE_TARNAME "test"
/* Define to the version... */
#define PACKAGE_VERSION "1.0"
/* Define to 1 if you have the ANSI C... */
#define STDC_HEADERS 1
$
NOTE Again, I encourage you to use autoreconf, which will automatically run autoheader if
it notices an expansion of AC_CONFIG_HEADERS in configure.ac.
As you can see from the output of the cat command, an entire set of
preprocessor definitions was derived from configure.ac by autoheader.
Listing 3-21 shows a much more realistic example of using a generated
config.h file to increase the portability of your project source code. In this
example, the AC_CONFIG_HEADERS macro call indicates that config.h should be
generated, and the call to AC_CHECK_HEADERS will cause autoheader to insert a
definition into config.h.
AC_INIT([test], [1.0])
AC_CONFIG_HEADERS([config.h])
AC_CHECK_HEADERS([dlfcn.h])
AC_OUTPUT
Listing 3-21: A more realistic example of using AC_CONFIG_HEADERS
The config.h file is intended to be included in your source code in loca-
tions where you might wish to test a configured option in the code itself using
the C preprocessor. This file should be included first in source files so it can
influence the inclusion of system header files later in the source.
NOTE The config.h.in template that autoheader generates doesn’t contain an include-guard
construct, so you need to be careful that it’s not included more than once in a source file.
It’s often the case that every .c file in a project needs to include config.h.
In this case, it might behoove you to include config.h at the top of an internal
project header file that’s included by all the source files in your project. You
can (and probably should) also add an include-guard construct to this inter-
nal header file to protect against including it more than once.
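As a sketch of this arrangement (the header name common.h is hypothetical), such an internal header might look like this:

```c
/* common.h: hypothetical internal project header, included
   first by every .c file in the project. */
#ifndef COMMON_H
#define COMMON_H

/* Pull in the configured options before any system headers. */
#if HAVE_CONFIG_H
# include <config.h>
#endif

/* ... project-wide declarations go here ... */

#endif /* COMMON_H */
```

The include guard makes a second inclusion harmless, and because config.h is pulled in at the top, every source file that includes common.h first gets the configuration macros before any system headers.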
Don’t make the mistake of including config.h in a public header file if
your project installs libraries and header files as part of your product set. For
more detailed information on this topic, refer to “Item 1: Keeping Private
Details out of Public Interfaces” on page 272.
Using the configure.ac file from Listing 3-21, the generated configure script
will create a config.h header file with appropriate definitions for determining,
at compile time, whether or not the current system provides the dlfcn inter-
face. To complete the portability check, you can add the code from Listing 3-22
to a source file in your project that uses dynamic loader functionality.
#if HAVE_CONFIG_H
# include <config.h>
#endif
#if HAVE_DLFCN_H
# include <dlfcn.h>
#else
# error Sorry, this code requires dlfcn.h.
#endif
...
#if HAVE_DLFCN_H
handle = dlopen("/usr/lib/libwhatever.so", RTLD_NOW);
#endif
...
Listing 3-22: A sample source file that checks for dynamic loader functionality
If you already had code that included dlfcn.h, autoscan would have gener-
ated a line in configure.ac to call AC_CHECK_HEADERS with an argument list containing
dlfcn.h as one of the header files to be checked. Your job as maintainer is to
add the conditional statements shown in Listing 3-22 to your source code
around the existing inclusions of the dlfcn.h header file and around calls to the dlfcn
interface functions. This is the crux of Autoconf-provided portability.
Your project might prefer dynamic loader functionality, but could get along
without it if necessary. It’s also possible that your project requires a dynamic
loader, in which case your build should terminate with an error (as this code
does) if the key functionality is missing. Often, this is an acceptable stopgap
until someone comes along and adds support to the source code for a more
system-specific dynamic loader service.
NOTE If you have to bail out with an error, it’s best to do so at configuration time rather than
at compile time. The general rule of thumb is to bail out as early as possible.
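For instance, if dlfcn.h were a hard requirement, the check from Listing 3-21 could fail at configuration time by using the optional action-if-not-found argument of AC_CHECK_HEADERS (a sketch; the error text is illustrative):

```m4
AC_CHECK_HEADERS([dlfcn.h], [],
    [AC_MSG_ERROR([dlfcn.h is required to build this package])])
```

With this in configure.ac, users discover the missing dependency when they run configure, rather than partway through a long build.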
One obvious flaw in this source code is that config.h is only included
if HAVE_CONFIG_H is defined in your compilation environment. You must define
HAVE_CONFIG_H manually on your compiler command lines if you’re writing
your own makefiles. Automake does this for you in generated Makefile.in
templates.
HAVE_CONFIG_H is part of a string of definitions passed on the compiler
command line in the Autoconf substitution variable @DEFS@. Before autoheader
and AC_CONFIG_HEADERS functionality existed, Autoconf added all of the compiler
configuration macros to the @DEFS@ variable. You can still use this method if
you don’t use AC_CONFIG_HEADERS in configure.ac, but it’s not recommended—
mainly because a large number of definitions make for very long compiler
command lines.
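If you do write your own Makefile.in, one way to pass these definitions is to substitute @DEFS@ into the compile command; here's a sketch based on the Jupiter rule from Listing 3-23:

```makefile
DEFS = @DEFS@

jupiter: main.c
	$(CC) $(DEFS) -I. -I$(srcdir) -I.. $(CPPFLAGS) $(CFLAGS) -o $@ main.c
```

When AC_CONFIG_HEADERS is used, @DEFS@ expands to little more than -DHAVE_CONFIG_H, so the conditional inclusion shown in Listing 3-22 works as intended.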
Back to Remote Builds for a Moment
As we wrap up this chapter, you’ll notice that we’ve come full circle. We started
out covering some preliminary information before we discussed how to add
remote builds to Jupiter. Now we’ll return to this topic for a moment, because I
haven’t yet covered how to get the C preprocessor to properly locate a gener-
ated config.h file.
Since this file is generated from a template, it will be at the same relative
position in the build directory structure as its counterpart template file,
config.h.in, is in the source directory structure. The template is located in the
top-level source directory (unless you chose to put it elsewhere), so the gener-
ated file will be in the top-level build directory. Well, that’s easy enough—it’s
always one level up from the generated src/Makefile.
Before we draw any conclusions then about header file locations, let’s
consider where header files might appear in a project. We might generate
them in the current build directory, as part of the build process. We might
also add internal header files to the current source directory. We know we
have a config.h file in the top-level build directory. Finally, we might also create
a top-level include directory for library interface header files our package pro-
vides. What is the order of priority for these various include directories?
The order in which we place include directives (-Ipath options) on the
compiler command line is the order in which they will be searched, so the
order should be based on which files are most relevant to the source file
currently being compiled. Thus, the compiler command line should include
-Ipath directives for the current build directory (.) first, followed by the source
directory [$(srcdir)], then the top-level build directory (..), and finally, our
project’s include directory, if it has one. We impose this ordering by adding
-Ipath options to the compiler command line, as shown in Listing 3-23.
...
jupiter: main.c
$(CC) -I. -I$(srcdir) -I.. $(CPPFLAGS) $(CFLAGS) -o $@ main.c
...
Listing 3-23: src/Makefile.in: Adding proper compiler include directives
Now that we know this, we need to add another rule of thumb for remote
builds to the list we created on page 69:
- Add preprocessor include directives for the current build directory, the
  associated source directory, and the top-level build directory, in that order.
Summary
In this chapter, we covered just about all the major features of a fully func-
tional GNU project build system, including writing a configure.ac file, from
which Autoconf generates a fully functional configure script. We’ve also covered
adding remote build functionality to makefiles with VPATH statements.
So what else is there? Plenty! In the next chapter, I’ll continue to show
you how you can use Autoconf to test system features and functionality before
your users run make. We’ll also continue enhancing the configuration script
so that when we’re done, users will have more options and understand exactly
how our package will be built on their systems.
MORE FUN WITH AUTOCONF:
CONFIGURING USER OPTIONS
Hope is not the conviction that something will turn out well,
but the certainty that something makes sense,
regardless of how it turns out.
—Václav Havel, Disturbing the Peace
In Chapter 3, we discussed the essentials of
Autoconf—how to bootstrap a new or exist-
ing project and how to understand some of
the basic aspects of configure.ac files. In this chap-
ter, we’ll cover some of the more complex Autoconf
macros. We’ll begin by learning how to substitute our
own variables into template files (e.g., Makefile.in) and how to define our own
preprocessor definitions from within the configuration script. Throughout
this chapter, we’ll continue to develop functionality in the Jupiter project by
adding important checks and tests. We’ll cover the all-important AC_OUTPUT
macro, and we’ll conclude by discussing the application of user-defined
project configuration options as specified in the configure.ac file.
In addition to all this, I’ll present an analysis technique that you can
use to decipher the inner workings of macros. Using the somewhat complex
AC_CHECK_PROG macro as an example, I’ll show you some ways to find out what’s
going on under the hood. After all, when software is distributed in source
format, its secrets can’t stay hidden forever.
Substitutions and Definitions
I’ll begin this chapter by discussing three of the most important macros in
the Autoconf suite: AC_SUBST and AC_DEFINE, along with the latter’s twin brother,
AC_DEFINE_UNQUOTED.
These macros provide the primary mechanisms for communication
between the configuration process and the build and execution processes.
Values that are substituted into generated files provide configuration informa-
tion to the build process, while values defined in preprocessor variables provide
configuration information at build time to the compiler and at runtime to
the built programs and libraries. As a result, it’s well worth becoming thor-
oughly familiar with AC_SUBST and AC_DEFINE.
AC_SUBST
You can use AC_SUBST to extend the variable substitution functionality that’s
such an integral part of Autoconf. Every Autoconf macro that has anything to
do with substitution variables ultimately calls this macro to create the substi-
tution variable from an existing shell variable. Sometimes the shell variables
are inherited from the environment; other times, higher-level macros set the
shell variables as part of their functionality before calling AC_SUBST. The signa-
ture of this macro is rather trivial (note that the square brackets in this prototype
represent optional arguments, not Autoconf quotes):
AC_SUBST(shell_var[, value])
NOTE If you choose to omit any trailing optional parameters when using M4 macro calls, you
may also omit the trailing commas. However, if you omit any arguments from the middle
of the list, you must show the commas as placeholders for the missing arguments.
The first argument, shell_var, represents a shell variable whose value you
wish to substitute into all files generated by config.status from templates. The
optional second parameter is the value assigned to the variable. If it isn’t speci-
fied, the shell variable’s current value will be used, whether it’s inherited or
set by some previous shell code.
The substitution variable will have the same name as the shell variable,
except that it will be bracketed with at signs (@) in the template files. Thus,
a shell variable named my_var would become a substitution variable named
@my_var@, and you could use it in any template file.
Calls to AC_SUBST in configure.ac should not be made conditionally; that is,
they should not be called within conditional shell statements like if-then-else
constructs. The reason becomes clear when you carefully consider the pur-
pose of AC_SUBST: You’ve already hardcoded substitution variables into your
template files, so you’d better use AC_SUBST for each variable unconditionally,
or else your output files will retain the substitution variables, rather than the
values that should have been substituted.
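The correct pattern, then, is to compute the shell variable's value conditionally but call AC_SUBST unconditionally; here's a sketch (the DEBUG_CFLAGS variable and enable_debug test are hypothetical):

```m4
if test "x$enable_debug" = xyes; then
  DEBUG_CFLAGS="-g -O0"
else
  DEBUG_CFLAGS="-O2"
fi
AC_SUBST([DEBUG_CFLAGS])
```

Every template that references @DEBUG_CFLAGS@ is then guaranteed to receive a value, whichever branch the shell code took.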
AC_DEFINE
The AC_DEFINE and AC_DEFINE_UNQUOTED macros define C-preprocessor macros,
which can be simple or function-like macros. These are either defined in the
config.h.in template (if you use AC_CONFIG_HEADERS) or passed on the compiler
command line (via the @DEFS@ substitution variable) in Makefile.in templates.
Recall that if you don’t write config.h.in yourself, autoheader will write it based
on calls to these macros in your configure.ac file.
These two macro names actually represent four different Autoconf macros.
Here are their prototypes:
AC_DEFINE(variable, value[, description])
AC_DEFINE(variable)
AC_DEFINE_UNQUOTED(variable, value[, description])
AC_DEFINE_UNQUOTED(variable)
The difference between the normal and the UNQUOTED versions of these
macros is that the normal versions use, verbatim, the specified value as the
value of the preprocessor macro. The UNQUOTED versions perform shell expansion
on the value argument, and they use the result as the value of the preprocessor
macro. Thus, you should use AC_DEFINE_UNQUOTED if the value contains shell
variables that you want configure to expand. (Setting a C-preprocessor macro
in a header file to an unexpanded shell variable makes no sense, because
neither the C compiler nor the preprocessor will know what to do with it
when the source code is compiled.)
The difference between the single- and multi-argument versions lies in
the way the preprocessor macros are defined. The single-argument versions
simply guarantee that the macro is defined in the preprocessor namespace,
while the multi-argument versions ensure that the macro is defined with a
specific value.
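To illustrate the difference (a sketch; the macro names are hypothetical, and the use of $host assumes AC_CANONICAL_HOST has been called earlier in configure.ac):

```m4
dnl A literal value: the normal version is fine.
AC_DEFINE([USE_FEATURE_X], [1], [Define to 1 to enable feature X.])

dnl The value contains a shell variable, so configure must expand
dnl it before it lands in config.h.
AC_DEFINE_UNQUOTED([BUILD_HOST], ["$host"], [Canonical host triplet.])
```

The resulting config.h would contain the expanded host triplet as a string literal, rather than the meaningless text $host.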
The optional third parameter, description, tells autoheader to add a com-
ment for this macro to the config.h.in template. (If you don’t use autoheader, it
makes no sense to pass a description here—hence it