\input texinfo @c -*-texinfo-*-
@c %**start of header
@setfilename libtool.info
@settitle Libtool
@c For double-sided printing, uncomment:
@c @setchapternewpage odd
@c Put everything in one index (arbitrarily chosen to be the concept index).
@syncodeindex vr cp
@syncodeindex fn cp
@syncodeindex tp cp
@synindex pg cp
@c %**end of header
@include version.texi
@set BUGADDR the Libtool bug reporting address @email{bug-libtool@@gnu.org}
@set MAILLIST the Libtool mailing list @email{libtool@@gnu.org}
@set objdir .libs
@copying
This manual is for GNU Libtool (version @value{VERSION}, @value{UPDATED}).
Copyright @copyright{} 1996--2019, 2021--2024 Free Software Foundation,
Inc.
Permission is granted to copy, distribute and/or modify this document
under the terms of the GNU Free Documentation License, Version 1.3
or any later version published by the Free Software Foundation;
with no Invariant Sections, with no Front-Cover Texts,
and with no Back-Cover Texts. A copy of the license is included in
the section entitled ``GNU Free Documentation License''.
@end copying
@dircategory Software development
@direntry
* Libtool: (libtool). Generic shared library support script.
@end direntry
@dircategory Individual utilities
@direntry
* libtool-invocation: (libtool)Invoking libtool. Running the @code{libtool} script.
* libtoolize: (libtool)Invoking libtoolize. Adding libtool support.
@end direntry
@titlepage
@title GNU Libtool
@subtitle For version @value{VERSION}, @value{UPDATED}
@author Gordon Matzigkeit
@author Alexandre Oliva
@author Thomas Tanner
@author Gary V. Vaughan
@page
@vskip 0pt plus 1filll
@insertcopying
@end titlepage
@contents
@ifnottex
@node Top, Introduction, (dir), (dir)
@comment node-name, next, previous, up
@top Shared library support for GNU
This file documents GNU Libtool, a script that allows package developers
to provide generic shared library support. This edition documents
version @value{VERSION}.
@xref{Reporting bugs}, for information on how to report problems with
GNU Libtool.
@menu
* Introduction:: What the heck is libtool?
* Libtool paradigm:: How libtool's view of libraries is different.
* Using libtool:: Example of using libtool to build libraries.
* Invoking libtool:: Running the @code{libtool} script.
* Integrating libtool:: Using libtool in your own packages.
* Other languages:: Using libtool without a C compiler.
* Versioning:: Using library interface versions.
* Library tips:: Tips for library interface design.
* Inter-library dependencies:: Libraries that depend on other libraries.
* Dlopened modules:: @code{dlopen}ing libtool-created libraries.
* Using libltdl:: Libtool's portable @code{dlopen} wrapper library.
* Trace interface:: Libtool's trace interface.
* FAQ:: Frequently Asked Questions
* Troubleshooting:: When libtool doesn't work as advertised.
* Maintaining:: Information used by the libtool maintainer.
* GNU Free Documentation License:: License for this manual.
* Combined Index:: Full index.
@detailmenu
--- The Detailed Node Listing ---
Introduction
* Motivation:: Why does GNU need a libtool?
* Issues:: The problems that need to be addressed.
* Other implementations:: How other people have solved these issues.
* Postmortem:: Learning from past difficulties.
Using libtool
* Creating object files:: Compiling object files for libraries.
* Linking libraries:: Creating libraries from object files.
* Linking executables:: Linking object files against libtool libraries.
* Debugging executables:: Running GDB on libtool-generated programs.
* Installing libraries:: Making libraries available to users.
* Installing executables:: Making programs available to users.
* Static libraries:: When shared libraries are not wanted.
Linking executables
* Wrapper executables:: Wrapper executables for some platforms.
Invoking @command{libtool}
* Compile mode:: Creating library object files.
* Link mode:: Generating executables and libraries.
* Execute mode:: Debugging libtool-generated programs.
* Install mode:: Making libraries and executables public.
* Finish mode:: Completing a library installation.
* Uninstall mode:: Removing installed executables and libraries.
* Clean mode:: Removing uninstalled executables and libraries.
Integrating libtool with your package
* Autoconf macros:: Autoconf macros exported by libtool.
* Makefile rules:: Writing @file{Makefile} rules for libtool.
* Using Automake:: Automatically supporting libtool.
* Configuring:: Configuring libtool for a host system.
* Distributing:: What files to distribute with your package.
* Static-only libraries:: Sometimes shared libraries are just a pain.
Configuring libtool
* LT_INIT:: Configuring @code{libtool} in @file{configure.ac}.
* Configure notes:: Platform-specific notes for configuration.
Including libtool in your package
* Invoking libtoolize:: @code{libtoolize} command line options.
* Autoconf and LTLIBOBJS:: Autoconf automates LTLIBOBJS generation.
Using libtool with other languages
* C++ libraries:: Writing libraries for C++
* Tags:: Tags
Library interface versions
* Interfaces:: What are library interfaces?
* Libtool versioning:: Libtool's versioning system.
* Updating version info:: Changing version information before releases.
* Release numbers:: Breaking binary compatibility for aesthetics.
Tips for interface design
* C header files:: How to write portable include files.
Dlopened modules
* Building modules:: Creating dlopenable objects and libraries.
* Dlpreopening:: Dlopening that works on static platforms.
* Linking with dlopened modules:: Using dlopenable modules in libraries.
* Finding the dlname:: Choosing the right file to @code{dlopen}.
* Dlopen issues:: Unresolved problems that need your attention.
Using libltdl
* Libltdl interface:: How to use libltdl in your programs.
* Modules for libltdl:: Creating modules that can be @code{dlopen}ed.
* Thread Safety in libltdl:: Registering callbacks for multi-thread safety.
* User defined module data:: Associating data with loaded modules.
* Module loaders for libltdl:: Creating user defined module loaders.
* Distributing libltdl:: How to distribute libltdl with your package.
Frequently Asked Questions about libtool
* Stripped link flags:: Dropped flags when creating a library
Troubleshooting
* Libtool test suite:: Libtool's self-tests.
* Reporting bugs:: How to report problems with libtool.
The libtool test suite
* Test descriptions:: The contents of the old test suite.
* When tests fail:: What to do when a test fails.
Maintenance notes for libtool
* New ports:: How to port libtool to new systems.
* Tested platforms:: When libtool was last tested.
* Platform quirks:: Information about different library systems.
* libtool script contents:: Configuration information that libtool uses.
* Cheap tricks:: Making libtool maintainership easier.
Porting libtool to new systems
* Information sources:: Where to find relevant documentation
* Porting inter-library dependencies:: Implementation details explained
Platform quirks
* Compilers:: Creating object files from source files.
* Reloadable objects:: Binding object files together.
* Multiple dependencies:: Removing duplicate dependent libraries.
* Archivers:: Programs that create static archives.
* Cross compiling:: Issues that arise when cross compiling.
* File name conversion:: Converting file names between platforms.
* Windows DLLs:: Windows header defines.
File name conversion
* File Name Conversion Failure:: What happens when file name conversion fails
* Native MinGW File Name Conversion:: MSYS file name conversion idiosyncrasies
* Cygwin/Windows File Name Conversion:: Using @command{cygpath} to convert Cygwin file names
* Unix/Windows File Name Conversion:: Using Wine to convert Unix paths
* LT_CYGPATH:: Invoking @command{cygpath} from other environments
* Cygwin to MinGW Cross:: Other notes concerning MinGW cross
@end detailmenu
@end menu
@end ifnottex
@node Introduction
@chapter Introduction
In the past, if you were a source code package developer and wanted to
take advantage of the power of shared libraries, you needed to write
custom support code for each platform on which your package ran. You
also had to design a configuration interface so that the package
installer could choose what sort of libraries were built.
GNU Libtool simplifies your job by encapsulating both the
platform-specific dependencies, and the user interface, in a single
script. GNU Libtool is designed so that the complete functionality of
each host type is available via a generic interface, but nasty quirks
are hidden from the programmer.
GNU Libtool's consistent interface is reassuring@dots{} users don't need
to read obscure documentation to have their favorite source
package build shared libraries. They just run your package
@code{configure} script (or equivalent), and libtool does all the dirty
work.
There are several examples throughout this document. All assume the
same environment: we want to build a library, @file{libhello}, in a
generic way.
@file{libhello} could be a shared library, a static library, or
both@dots{} whatever is available on the host system, as long as libtool
has been ported to it.
This chapter explains the original design philosophy of libtool. Feel
free to skip to the next chapter, unless you are interested in history,
or want to write code to extend libtool in a consistent way.
@menu
* Motivation:: Why does GNU need a libtool?
* Issues:: The problems that need to be addressed.
* Other implementations:: How other people have solved these issues.
* Postmortem:: Learning from past difficulties.
@end menu
@node Motivation
@section Motivation for writing libtool
@cindex motivation for writing libtool
@cindex design philosophy
Since early 1995, several different GNU developers have recognized the
importance of having shared library support for their packages. The
primary motivation for such a change is to encourage modularity and
reuse of code (both conceptually and physically) in GNU programs.
Such a demand means that the way libraries are built in GNU packages
needs to be general, to allow for any library type the package installer
might want. The problem is compounded by the absence of a standard
procedure for creating shared libraries on different platforms.
The following sections outline the major issues facing shared library
support in GNU, and how shared library support could be standardized
with libtool.
@cindex specifications for libtool
@cindex libtool specifications
The following specifications were used in developing and evaluating this
system:
@enumerate
@item
The system must be as elegant as possible.
@item
The system must be fully integrated with the GNU Autoconf and Automake
utilities, so that it will be easy for GNU maintainers to use. However,
the system must not require these tools, so that it can be used by
non-GNU packages.
@item
Portability to other (non-GNU) architectures and tools is desirable.
@end enumerate
@node Issues
@section Implementation issues
@cindex tricky design issues
@cindex design issues
The following issues need to be addressed in any reusable shared library
system, specifically libtool:
@enumerate
@item
The package installer should be able to control what sort of libraries
are built.
@item
It can be tricky to run dynamically linked programs whose libraries have
not yet been installed. @code{LD_LIBRARY_PATH} must be set properly (if
it is supported), or programs fail to run.
@item
The system must operate consistently even on hosts that don't support
shared libraries.
@item
The commands required to build shared libraries may differ wildly from
host to host. These need to be determined at configure time in
a consistent way.
@item
It is not always obvious with what prefix or suffix a shared library
should be installed. This makes it difficult for @file{Makefile} rules,
since they generally assume that file names are the same from host to
host.
@item
The system needs a simple library version number abstraction, so that
shared libraries can be upgraded in place. The programmer should be
informed how to design the interfaces to the library to maximize binary
compatibility.
@item
The install @file{Makefile} target should warn the package installer to set
the proper environment variables (@code{LD_LIBRARY_PATH} or equivalent),
or run @command{ldconfig}.
@end enumerate
@node Other implementations
@section Other implementations
Even before libtool was developed, many free software packages built and
installed their own shared libraries. At first, these packages were
examined to avoid reinventing existing features.
Now it is clear that none of these packages have documented the details
of shared library systems that libtool requires. So, other packages
have been more or less abandoned as influences.
@node Postmortem
@section A postmortem analysis of other implementations
@cindex other implementations, flaws in
@cindex reusability of library systems
In all fairness, each of the implementations that were examined does
the job it was intended to do, for a number of different host systems.
However, none of these solutions seem to function well as a
generalized, reusable component.
@cindex complexity of library systems
Most were too complex to use (much less modify) without understanding
exactly what the implementation does, and they were generally not
documented.
The main difficulty is that different vendors have different views of
what libraries are, and none of the packages that were examined seemed
to be confident enough to settle on a single paradigm that just
@emph{works}.
Ideally, libtool would be a standard that would be implemented as a series
of extensions and modifications to existing library systems to make them
work consistently. However, it is not an easy task to convince
operating system developers to mend their evil ways, and people want to
build shared libraries right now, even on buggy, broken, confused
operating systems.
For this reason, libtool was designed as an independent shell script.
It isolates the problems and inconsistencies in library building that
plague @file{Makefile} writers by wrapping the compiler suite on
different platforms with a consistent, powerful interface.
With luck, libtool will be useful to and used by the GNU community, and
the lessons that were learned in writing it will be taken up by the
designers of future library systems.
@node Libtool paradigm
@chapter The libtool paradigm
At first, libtool was designed to support an arbitrary number of library
object types. After libtool was ported to more platforms, a new
paradigm gradually developed for describing the relationship between
libraries and programs.
@cindex definition of libraries
@cindex libraries, definition of
In summary, ``libraries are programs with multiple entry points, and
more formally defined interfaces.''
The best way to introduce the libtool paradigm is to contrast it with
the paradigm of existing library systems, with examples from each. It
is a new way of thinking, so it may take a little time to absorb, but
when you understand it, the world becomes simpler.
@node Using libtool
@chapter Using libtool
@cindex examples of using libtool
@cindex libtool examples
It makes little sense to talk about using libtool in your own packages
until you have seen how it makes your life simpler. The examples in
this chapter introduce the main features of libtool by comparing the
standard library building procedure to libtool's operation on two
different platforms:
@table @samp
@item a23
An Ultrix 4.2 platform with only static libraries.
@item burger
A NetBSD/i386 1.2 platform with shared libraries.
@end table
Source files for the following examples are taken from the Autotest file
@file{tests/demo.at}. The files can be extracted by running a demo test and
preserving the artifacts:
@example
burger$ make check TESTSUITEFLAGS="-d -k 'preloaded static library'"
burger$ cp -r tests/testsuite.dir/027 demo/
@end example
You can follow these examples on your own platform, using the preconfigured
libtool script that was installed with libtool (@pxref{Configuring}). Assume
that we are building a library, @file{libhello}, out of the files @file{foo.c}
and @file{hello.c}.
Note that the @file{foo.c} source file uses the @code{cos} math library
function, which is usually found in the standalone math library, and not
the C library (@pxref{Trig Functions, , Trigonometric Functions, libc,
The GNU C Library Reference Manual}). So, we need to add @option{-lm} to
the end of the link line whenever we link @file{foo.lo} into an
executable or a library (@pxref{Inter-library dependencies}).
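As a concrete illustration, @file{foo.c} might look roughly like this
(a sketch only; the real demo source is extracted from
@file{tests/demo.at} as shown above):
@example
/* foo.c -- illustrative sketch only; see tests/demo.at for the
   real demo sources.  */
#include <math.h>
#include <stdio.h>

int
foo (void)
@{
  /* cos lives in the math library, hence -lm on the link line.  */
  printf ("cos (0.0) = %g\n", cos (0.0));
  return 0;
@}
@end example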
The same rule applies whenever you use functions that don't appear in
the standard C library@dots{} you need to add the appropriate
@option{-l@var{name}} flag to the end of the link line when you link
against those objects.
After we have built that library, we want to create a program by linking
@file{main.o} against @file{libhello}.
@menu
* Creating object files:: Compiling object files for libraries.
* Linking libraries:: Creating libraries from object files.
* Linking executables:: Linking object files against libtool libraries.
* Debugging executables:: Running GDB on libtool-generated programs.
* Installing libraries:: Making libraries available to users.
* Installing executables:: Making programs available to users.
* Static libraries:: When shared libraries are not wanted.
@end menu
@node Creating object files
@section Creating object files
@cindex compiling object files
@cindex object files, compiling
To create an object file from a source file, the compiler is invoked
with the @option{-c} flag (and any other desired flags):
@example
burger$ @kbd{gcc -I. -g -O -c main.c}
burger$
@end example
The above compiler command produces an object file, usually named
@file{main.o}, from the source file @file{main.c}.
For most library systems, creating object files that become part of a
static library is as simple as creating object files that are linked to
form an executable:
@example
burger$ @kbd{gcc -I. -g -O -c foo.c}
burger$ @kbd{gcc -I. -g -O -c hello.c}
burger$
@end example
@cindex position-independent code
@cindex PIC (position-independent code)
Shared libraries, however, may only be built from
@dfn{position-independent code} (PIC). So, special flags must be passed
to the compiler to tell it to generate PIC rather than the standard
position-dependent code.
@cindex library object file
@cindex @file{.lo} files
@cindex object files, library
Since this is a library implementation detail, libtool hides the
complexity of PIC compiler flags and uses separate library object files
(the PIC one lives in the @file{@value{objdir}} subdirectory and the
static one lives in the current directory). On systems without shared
libraries, the PIC library object files are not created, whereas on
systems where all code is PIC, such as AIX, the static ones are not
created.
To create library object files for @file{foo.c} and @file{hello.c},
simply invoke libtool with the standard compilation command as
arguments (@pxref{Compile mode}):
@example
a23$ @kbd{libtool --mode=compile gcc -I. -g -O -c foo.c}
gcc -I. -g -O -c foo.c -o foo.o
a23$ @kbd{libtool --mode=compile gcc -I. -g -O -c hello.c}
gcc -I. -g -O -c hello.c -o hello.o
a23$
@end example
Note that libtool silently creates an additional control file on each
@samp{compile} invocation. The @file{.lo} file is the libtool object,
which Libtool uses to determine what object file may be built into a
shared library. On @samp{a23}, only static libraries are supported so
the library objects look like this:
@example
# foo.lo - a libtool object file
# Generated by ltmain.sh (GNU libtool) @value{VERSION}
#
# Please DO NOT delete this file!
# It is necessary for linking the library.
# Name of the PIC object.
pic_object=none
# Name of the non-PIC object.
non_pic_object='foo.o'
@end example
On shared library systems, libtool automatically generates an
additional PIC object by inserting the appropriate PIC generation
flags into the compilation command:
@example
burger$ @kbd{libtool --mode=compile gcc -I. -g -O -c foo.c}
mkdir @value{objdir}
gcc -I. -g -O -c foo.c -fPIC -DPIC -o @value{objdir}/foo.o
gcc -I. -g -O -c foo.c -o foo.o >/dev/null 2>&1
burger$
@end example
Note that Libtool automatically created the @file{@value{objdir}} directory
upon its first execution, where PIC library object files will be stored.
Since @samp{burger} supports shared libraries, and requires PIC
objects to build them, Libtool has compiled a PIC object and
made a note of it in the libtool object:
@example
# foo.lo - a libtool object file
# Generated by ltmain.sh (GNU libtool) @value{VERSION}
#
# Please DO NOT delete this file!
# It is necessary for linking the library.
# Name of the PIC object.
pic_object='@value{objdir}/foo.o'
# Name of the non-PIC object.
non_pic_object='foo.o'
@end example
@cindex @option{-no-suppress}, libtool compile mode option
Notice the second run of GCC has its output discarded. This is
done so the compiler warnings aren't annoyingly duplicated. If you
need to see both sets of warnings (you might have conditional code
inside @samp{#ifdef PIC} for example), you can turn off suppression by
passing the @option{-no-suppress} option to libtool's compile mode:
@example
burger$ @kbd{libtool --mode=compile gcc -no-suppress -I. -g -O -c hello.c}
gcc -I. -g -O -c hello.c -fPIC -DPIC -o @value{objdir}/hello.o
gcc -I. -g -O -c hello.c -o hello.o
burger$
@end example
@node Linking libraries
@section Linking libraries
@pindex ar
Without libtool, the programmer would invoke the @command{ar} command to
create a static library:
@example
burger$ @kbd{ar cr libhello.a hello.o foo.o}
burger$
@end example
@pindex ranlib
But of course, that would be too simple, so many systems require that
you run the @code{ranlib} command on the resulting library in order to
generate a symbol table:
@example
burger$ @kbd{ranlib libhello.a}
burger$
@end example
It seems more natural to use the C compiler for this task, given
libtool's ``libraries are programs'' approach. So, on platforms without
shared libraries, libtool simply acts as a wrapper for the system
@command{ar} (and possibly @code{ranlib}) commands.
@cindex libtool libraries
@cindex @file{.la} files
Again, the libtool control file name (@file{.la} suffix) differs from
the standard library name (@file{.a} suffix). The arguments to
libtool are the same ones you would use to produce an executable named
@file{libhello.la} with your compiler (@pxref{Link mode}):
@example
a23$ @kbd{libtool --mode=link gcc -g -O -o libhello.la foo.o hello.o}
*** Warning: Linking the shared library libhello.la against the
*** non-libtool objects foo.o hello.o is not portable!
ar cr .libs/libhello.a
ranlib .libs/libhello.a
creating libhello.la
(cd .libs && rm -f libhello.la && ln -s ../libhello.la libhello.la)
a23$
@end example
Aha! Libtool caught a common error@dots{} trying to build a library
from standard objects instead of special @file{.lo} object files. This
doesn't matter so much for static libraries, but on shared library
systems, it is of great importance. (Note that you may replace
@file{libhello.la} with @file{libhello.a}, in which case libtool won't
issue the warning any more. Although this method works, it is not
recommended, because it makes you lose the benefits of using Libtool.)
So, let's try again, this time with the library object files. Remember
also that we need to add @option{-lm} to the link command line because
@file{foo.c} uses the @code{cos} math library function (@pxref{Using
libtool}).
Another complication in building shared libraries is that we need to
specify the path to the directory where they will (eventually) be
installed (in this case, @file{/usr/local/lib})@footnote{If you don't
specify an @code{rpath}, then libtool builds a libtool convenience
archive, not a shared library (@pxref{Static libraries}).}:
@example
a23$ @kbd{libtool --mode=link gcc -g -O -o libhello.la foo.lo hello.lo \
-rpath /usr/local/lib -lm}
ar cr @value{objdir}/libhello.a foo.o hello.o
ranlib @value{objdir}/libhello.a
creating libhello.la
(cd @value{objdir} && rm -f libhello.la && ln -s ../libhello.la libhello.la)
a23$
@end example
Now, let's try the same trick on the shared library platform:
@example
burger$ @kbd{libtool --mode=link gcc -g -O -o libhello.la foo.lo hello.lo \
-rpath /usr/local/lib -lm}
rm -fr @value{objdir}/libhello.a @value{objdir}/libhello.la
ld -Bshareable -o @value{objdir}/libhello.so.0.0 @value{objdir}/foo.o @value{objdir}/hello.o -lm
ar cr @value{objdir}/libhello.a foo.o hello.o
ranlib @value{objdir}/libhello.a
creating libhello.la
(cd @value{objdir} && rm -f libhello.la && ln -s ../libhello.la libhello.la)
burger$
@end example
Now that's significantly cooler@dots{} Libtool just ran an obscure
@command{ld} command to create a shared library, as well as the static
library.
@cindex @file{@value{objdir}} subdirectory
Note how libtool creates extra files in the @file{@value{objdir}}
subdirectory, rather than the current directory. This feature makes
it easier to clean up the build directory, and helps ensure
other programs fail horribly if you accidentally forget to use libtool
when you should.
Again, you should look at the @file{.la} file to see what Libtool
stores in it. You will see Libtool uses this file to remember the
destination directory for the library (the argument to @option{-rpath})
as well as the dependency on the math library (@samp{-lm}).
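For instance, on @samp{burger} the uninstalled @file{libhello.la} might
contain, in abridged form, entries like these (the exact field names and
values vary between platforms and Libtool releases, so treat this as a
sketch):
@example
# libhello.la - a libtool library file
dlname='libhello.so.0'
library_names='libhello.so.0.0 libhello.so.0 libhello.so'
old_library='libhello.a'
dependency_libs=' -lm'
libdir='/usr/local/lib'
@end example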
@node Linking executables
@section Linking executables
@cindex linking against installed libraries
If you choose at this point to @dfn{install} the library (put it in a
permanent location) before linking executables against it, then you
don't need to use libtool to do the linking. Simply use the appropriate
@option{-L} and @option{-l} flags to specify the library's location.
@cindex buggy system linkers
Some system linkers insist on encoding the full directory name of each
shared library in the resulting executable. Libtool has to work around
this misfeature by special magic to ensure that only permanent directory
names are put into installed executables.
@cindex security problems with buggy linkers
@cindex bugs, subtle ones caused by buggy linkers
The importance of this bug must not be overlooked: it won't cause
programs to crash in obvious ways. It creates a security hole,
and possibly even worse, if you are modifying the library source code
after you have installed the package, you will change the behaviour of
the installed programs!
So, if you want to link programs against the library before you install
it, you must use libtool to do the linking.
@cindex linking against uninstalled libraries
Here's the old way of linking against an uninstalled library:
@example
burger$ @kbd{gcc -g -O -o hell.old main.o libhello.a -lm}
burger$
@end example
Libtool's way is almost the same@footnote{However, you should avoid using
@option{-L} or @option{-l} flags to link against an uninstalled libtool
library. Just specify the relative path to the @file{.la} file, such as
@file{../intl/libintl.la}. This is a design decision to eliminate any
ambiguity when linking against uninstalled shared libraries.}
(@pxref{Link mode}):
@example
a23$ @kbd{libtool --mode=link gcc -g -O -o hell main.o libhello.la}
gcc -g -O -o hell main.o ./@value{objdir}/libhello.a -lm
a23$
@end example
That looks too simple to be true. All libtool did was transform
@file{libhello.la} to @file{./@value{objdir}/libhello.a}, but remember
that @samp{a23} has no shared libraries. Notice Libtool also
remembered @file{libhello.la} depends on @option{-lm}, so even
though we didn't specify @option{-lm} on the libtool command
line@footnote{
@c
And why should we? @file{main.o} doesn't directly depend on @option{-lm}
after all.
@c
} Libtool has added it to the @command{gcc} link line for us.
On @samp{burger} Libtool links against the uninstalled shared library:
@example
burger$ @kbd{libtool --mode=link gcc -g -O -o hell main.o libhello.la}
gcc -g -O -o @value{objdir}/hell main.o -L./@value{objdir} -R/usr/local/lib -lhello -lm
creating hell
burger$
@end example
@cindex linking with installed libtool libraries
Now assume @file{libhello.la} had already been installed, and you want
to link a new program with it. You could figure out where it lives by
yourself, then run:
@example
burger$ @kbd{gcc -g -O -o test test.o -L/usr/local/lib -lhello -lm}
@end example
However, unless @file{/usr/local/lib} is in the standard library search
path, you won't be able to run @code{test}. If instead you use libtool
to link the already-installed libtool library, it will do The Right
Thing (TM) for you:
@example
burger$ @kbd{libtool --mode=link gcc -g -O -o test test.o \
/usr/local/lib/libhello.la}
gcc -g -O -o @value{objdir}/test test.o -Wl,--rpath \
-Wl,/usr/local/lib /usr/local/lib/libhello.a -lm
creating test
burger$
@end example
Note that libtool added the necessary run-time path flag, as well as
@option{-lm}, the library that @file{libhello.la} depends upon. Nice, huh?
@cindex wrapper scripts for programs
@cindex program wrapper scripts
Notice the executable, @code{hell}, was actually created in the
@file{@value{objdir}} subdirectory. Then, a wrapper script (or, on
certain platforms, a wrapper executable @pxref{Wrapper executables}) was
created in the current directory.
Since libtool created a wrapper script, you should use libtool to
install it and debug it too. However, since the program does not depend
on any uninstalled libtool library, it is probably usable even without
the wrapper script.
On NetBSD 1.2, libtool encodes the installation directory of
@file{libhello}, by using the @samp{-R/usr/local/lib} compiler flag.
Then, the wrapper script guarantees the executable finds the
correct shared library (the one in @file{./@value{objdir}}) so it
can be properly installed.
Let's compare the two different programs:
@example
burger$ @kbd{time ./hell.old}
Welcome to GNU Hell!
** This is not GNU Hello. There is no built-in mail reader. **
0.21 real 0.02 user 0.08 sys
burger$ @kbd{time ./hell}
Welcome to GNU Hell!
** This is not GNU Hello. There is no built-in mail reader. **
0.63 real 0.09 user 0.59 sys
burger$
@end example
The wrapper script takes significantly longer to execute, but at least
the results are correct, even though the shared library hasn't been
installed yet.
So, what about all the space savings that shared libraries are supposed
to yield?
@example
burger$ @kbd{ls -l hell.old libhello.a}
-rwxr-xr-x 1 gord gord 15481 Nov 14 12:11 hell.old
-rw-r--r-- 1 gord gord 4274 Nov 13 18:02 libhello.a
burger$ @kbd{ls -l @value{objdir}/hell @value{objdir}/libhello.*}
-rwxr-xr-x 1 gord gord 11647 Nov 14 12:10 @value{objdir}/hell
-rw-r--r-- 1 gord gord 4274 Nov 13 18:44 @value{objdir}/libhello.a
-rwxr-xr-x 1 gord gord 12205 Nov 13 18:44 @value{objdir}/libhello.so.0.0
burger$
@end example
Well, that sucks. Maybe I should just scrap this project and take up
basket weaving.
Actually, it just proves an important point: shared libraries incur
overhead because of their (relative) complexity. In this situation, the
price of being dynamic is eight kilobytes, and the payoff is about four
kilobytes. So, having a shared @file{libhello} won't be an advantage
until we link it against at least a few more programs.
@menu
* Wrapper executables:: Wrapper executables for some platforms.
@end menu
@node Wrapper executables
@subsection Wrapper executables for uninstalled programs
@cindex wrapper executables for uninstalled programs
@cindex program wrapper executables
Some platforms, notably those hosted on Windows such as Cygwin
and MinGW, use a wrapper executable rather than a wrapper script
to ensure proper operation of uninstalled programs linked by libtool
against uninstalled shared libraries. The wrapper executable thus
performs the same function as the wrapper script used on other
platforms, but allows the @command{make} rules for the program, whose
name ends in @code{$(EXEEXT)}, to be satisfied. The actual program
executable is created below @file{@value{objdir}}, and its name will end
in @code{$(EXEEXT)} and may or may not contain an @code{lt-} prefix.
This wrapper executable sets various environment values so that the
program executable may locate its (uninstalled) shared libraries,
and then launches the program executable.
The wrapper executable provides a debug mode, enabled by passing the
command-line option @code{--lt-debug} (see below). When executing in
debug mode, diagnostic information will be printed to @code{stderr}
before the program executable is launched.
Finally, the wrapper executable supports a number of command line
options that may be useful when debugging the operation of the wrapper
system. All of these options begin with @code{--lt-}, and if present
they and their arguments will be removed from the argument list passed
on to the program executable. Therefore, the program executable may not
employ command line options that begin with @code{--lt-}. (In fact, the
wrapper executable will detect any command line options that begin with
@code{--lt-} and abort with an error message if the option is not
recognized.) If this presents a problem, please contact the Libtool
team at @value{BUGADDR}.
These command line options include:
@table @option
@item --lt-dump-script
Causes the wrapper to print a copy of the wrapper @emph{script}
to @code{stdout}, and exit.
@item --lt-debug
Causes the wrapper to print diagnostic information to @code{stderr},
before launching the program executable.
@end table
For consistency, both the wrapper @emph{script} and the wrapper
@emph{executable} support these options.
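For example, assuming the demo program @file{hell} was built on a MinGW
host (so that the wrapper is named @file{hell.exe}), the options could
be used as follows; the output is not shown here because it depends on
the platform and the build tree:
@example
$ @kbd{./hell.exe --lt-dump-script}
$ @kbd{./hell.exe --lt-debug}
@end example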
@node Debugging executables
@section Debugging executables
If @file{hell} was a complicated program, you would certainly want to
test and debug it before installing it on your system. In the above
section, you saw how the libtool wrapper script makes it possible to run
the program directly, but unfortunately, this mechanism interferes with
the debugger:
@example
burger$ @kbd{gdb hell}
GDB is free software and you are welcome to distribute copies of it
under certain conditions; type "show copying" to see the conditions.
There is no warranty for GDB; type "show warranty" for details.
GDB 4.16 (i386-unknown-netbsd), (C) 1996 Free Software Foundation, Inc.
"hell": not in executable format: File format not recognized
(gdb) @kbd{quit}
burger$
@end example
Sad. It doesn't work because GDB doesn't know where the executable
lives. So, let's try again, by invoking GDB directly on the executable:
@example
burger$ @kbd{gdb @value{objdir}/hell}
GNU gdb 5.3 (i386-unknown-netbsd)
Copyright 2002 Free Software Foundation, Inc.
GDB is free software, covered by the GNU General Public License,
and you are welcome to change it and/or distribute copies of it
under certain conditions. Type "show copying" to see the conditions.
There is no warranty for GDB. Type "show warranty" for details.
(gdb) @kbd{break main}
Breakpoint 1 at 0x8048547: file main.c, line 29.
(gdb) @kbd{run}
Starting program: /home/src/libtool/demo/.libs/hell
/home/src/libtool/demo/.libs/hell: can't load library 'libhello.so.0'
Program exited with code 020.
(gdb) @kbd{quit}
burger$
@end example
Argh. Now GDB complains because it cannot find the shared library that
@file{hell} is linked against. So, we must use libtool to
properly set the library path and run the debugger. Fortunately, we can
forget all about the @file{@value{objdir}} directory, and just run it on
the executable wrapper (@pxref{Execute mode}):
@example
burger$ @kbd{libtool --mode=execute gdb hell}
GNU gdb 5.3 (i386-unknown-netbsd)
Copyright 2002 Free Software Foundation, Inc.
GDB is free software, covered by the GNU General Public License,
and you are welcome to change it and/or distribute copies of it
under certain conditions. Type "show copying" to see the conditions.
There is no warranty for GDB. Type "show warranty" for details.
(gdb) @kbd{break main}
Breakpoint 1 at 0x8048547: file main.c, line 29.
(gdb) @kbd{run}
Starting program: /home/src/libtool/demo/.libs/hell
Breakpoint 1, main (argc=1, argv=0xbffffc40) at main.c:29
29 printf ("Welcome to GNU Hell!\n");
(gdb) @kbd{quit}
The program is running. Quit anyway (and kill it)? (y or n) @kbd{y}
burger$
@end example
@node Installing libraries
@section Installing libraries
@pindex strip
Installing libraries on a non-libtool system is quite
straightforward@dots{} just copy them into place:@footnote{Don't
strip static libraries though, or they will be unusable.}
@pindex su
@example
burger$ @kbd{su}
Password: @kbd{********}
burger# @kbd{cp libhello.a /usr/local/lib/libhello.a}
burger#
@end example
Oops, don't forget the @command{ranlib} command:
@example
burger# @kbd{ranlib /usr/local/lib/libhello.a}
burger#
@end example
@pindex install
Libtool installation is quite simple, as well. Just use the
@command{install} or @command{cp} command that you normally would
(@pxref{Install mode}):
@example
a23# @kbd{libtool --mode=install cp libhello.la /usr/local/lib/libhello.la}
cp libhello.la /usr/local/lib/libhello.la
cp @value{objdir}/libhello.a /usr/local/lib/libhello.a
ranlib /usr/local/lib/libhello.a
a23#
@end example
Note that the libtool library @file{libhello.la} is also installed, to
help libtool with uninstallation (@pxref{Uninstall mode}) and linking
(@pxref{Linking executables}) and to help programs with dlopening
(@pxref{Dlopened modules}).
Here is the shared library example:
@example
burger# @kbd{libtool --mode=install install -c libhello.la \
/usr/local/lib/libhello.la}
install -c @value{objdir}/libhello.so.0.0 /usr/local/lib/libhello.so.0.0
install -c libhello.la /usr/local/lib/libhello.la
install -c @value{objdir}/libhello.a /usr/local/lib/libhello.a
ranlib /usr/local/lib/libhello.a
burger#
@end example
@cindex stripping libraries
@cindex libraries, stripping
It is safe to specify the @option{-s} (strip symbols) flag if you use a
BSD-compatible install program when installing libraries.
Libtool will either ignore the @option{-s} flag, or will run a program
that will strip only debugging and compiler symbols from the library.
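For example (a sketch only; whether Libtool passes @option{-s} through
to @command{install} or strips the library itself depends on the
platform):
@example
burger# @kbd{libtool --mode=install install -c -s libhello.la \
                /usr/local/lib/libhello.la}
@end example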
Once the libraries have been put in place, there may be some additional
configuration that you need to do before using them. First, you must
make sure that the directory where the library is installed actually
agrees with the @option{-rpath} flag you used to build it.
@cindex postinstallation
@cindex installation, finishing
@cindex libraries, finishing installation
Then, running @samp{libtool -n finish @var{libdir}} can give you
further hints on what to do (@pxref{Finish mode}):
@example
burger# @kbd{libtool -n finish /usr/local/lib}
PATH="$PATH:/sbin" ldconfig -m /usr/local/lib
-----------------------------------------------------------------
Libraries have been installed in:
/usr/local/lib
To link against installed libraries in a given directory, LIBDIR,
you must use the '-LLIBDIR' flag during linking.
You will also need to do one of the following:
- add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
during execution
- add LIBDIR to the 'LD_RUN_PATH' environment variable
during linking
- use the '-RLIBDIR' linker flag
See any operating system documentation about shared libraries for
more information, such as the ld and ld.so manual pages.
-----------------------------------------------------------------
burger#
@end example
After you have completed these steps, you can go on to begin using the
installed libraries. You may also install any executables that depend
on libraries you created.
@node Installing executables
@section Installing executables
If you used libtool to link any executables against uninstalled libtool
libraries (@pxref{Linking executables}), you need to use libtool to
install the executables after the libraries have been installed
(@pxref{Installing libraries}).
So, for our Ultrix example, we would run:
@example
a23# libtool --mode=install install -c hell /usr/local/bin/hell
install -c hell /usr/local/bin/hell
a23#
@end example
On shared library systems that require wrapper scripts, libtool just
ignores the wrapper script and installs the correct binary:
@example
burger# libtool --mode=install install -c hell /usr/local/bin/hell
install -c @value{objdir}/hell /usr/local/bin/hell
burger#
@end example
@node Static libraries
@section Linking static libraries
@cindex static linking
@cindex convenience libraries
Why return to @command{ar} and @command{ranlib} silliness when you've had a
taste of libtool? Well, sometimes it is desirable to create a static
archive that can never be shared. The most frequent case is when you
have a set of object files that you use to build several different
libraries. You can create a ``convenience library'' out of those
objects, and link against that with the other libraries, instead of
listing all the object files every time.
If you just want to link this convenience library into programs, then
you could just ignore libtool entirely, and use the old @command{ar} and
@command{ranlib} commands (or the corresponding GNU Automake
@samp{_LIBRARIES} rules). You can even install a convenience library
using GNU Libtool, though you probably don't want to and hence GNU
Automake doesn't allow you to do so.
@example
burger$ @kbd{libtool --mode=install ./install-sh -c libhello.a \
/local/lib/libhello.a}
./install-sh -c libhello.a /local/lib/libhello.a
ranlib /local/lib/libhello.a
burger$
@end example
Using libtool for static library installation protects your library from
being accidentally stripped (if the installer used the @option{-s} flag),
and automatically runs the correct @command{ranlib} command.
But libtool libraries are more than just collections of object files:
they can also carry library dependency information, which old archives
do not. If you want to create a libtool static convenience library, you
can omit the @option{-rpath} flag and use @option{-static} to indicate that
you're only interested in a static library. When you link a program
with such a library, libtool will actually link all object files and
dependency libraries into the program.
If you omit both @option{-rpath} and @option{-static}, libtool will create a
convenience library that can be used to create other libtool
libraries, even shared ones. Just like in the static case, the library
behaves as an alias to a set of object files and dependency libraries,
but in this case the object files are suitable for inclusion in shared
libraries. But be careful not to link a single convenience library,
directly or indirectly, into a single program or library, otherwise you
may get errors about symbol redefinitions.
The key is remembering that a convenience library contains PIC
objects, and can be linked where a list of PIC objects makes sense;
i.e.@: into a shared library. A static convenience library contains
non-PIC objects, so can be linked into an old static library, or
a program.
When GNU Automake is used, you should use @code{noinst_LTLIBRARIES}
instead of @code{lib_LTLIBRARIES} for convenience libraries, so that
the @option{-rpath} option is not passed when they are linked.
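For example, a @file{Makefile.am} fragment along the following lines
builds a convenience library and links it into an installed libtool
library (the @file{libsupport} name and its source files are
hypothetical; only @file{libhello} comes from the running example):
@example
noinst_LTLIBRARIES = libsupport.la
libsupport_la_SOURCES = support.c

lib_LTLIBRARIES = libhello.la
libhello_la_SOURCES = foo.c hello.c
libhello_la_LIBADD = libsupport.la -lm
@end example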
As a rule of thumb, link a libtool convenience library into at most one
libtool library, and never into a program, and link libtool static
convenience libraries only into programs, and only if you need to carry
library dependency information to the user of the static convenience
library.
@cindex standalone binaries
Another common situation where static linking is desirable is in
creating a standalone binary. Use libtool to do the linking and add the
@option{-all-static} flag.
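For example, a fully static build of the demo program could be linked
like this (the output file name is arbitrary):
@example
burger$ @kbd{libtool --mode=link gcc -g -O -all-static -o hell-static \
                main.o libhello.la}
@end example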
@node Invoking libtool
@chapter Invoking @command{libtool}
@pindex libtool
@cindex libtool command options
@cindex options, libtool command
@cindex command options, libtool
The @command{libtool} program has the following synopsis:
@example
libtool [@var{option}]@dots{} [@var{mode-arg}]@dots{}
@end example
@noindent
and accepts the following options:
@table @option
@item --config
Display libtool configuration variables and exit.
@item --debug
Dump a trace of shell script execution to standard output. This
produces a lot of output, so you may wish to pipe it to @command{less} (or
@command{more}) or redirect to a file.
@item -n
@itemx --dry-run
Don't create, modify, or delete any files, just show what commands would
be executed by libtool.
@item --features
Display basic configuration options. This provides a way for packages
to determine whether shared or static libraries will be built.
@item --finish
Same as @option{--mode=finish}.
@item -h
Display short help message.
@item --help
Display a help message and exit. If @option{--mode=@var{mode}} is
specified, then detailed help for @var{mode} is displayed.
@item --help-all
Display help for the general options as well as detailed help for each
operation mode, and exit.
@item --mode=@var{mode}
Use @var{mode} as the operation mode. When using libtool from the
command line, you can give just @var{mode} (or a unique abbreviation
of it) as the first argument as a shorthand for the full
@option{--mode=@var{mode}}. For example, the following are equivalent:
@example
$ @kbd{libtool --mode=execute --dry-run gdb prog.exe}
$ @kbd{libtool execute --dry-run gdb prog.exe}
$ @kbd{libtool exe --dry-run gdb prog.exe}
$ @kbd{libtool e --dry-run gdb prog.exe}
@end example
@noindent
@var{mode} must be set to one of the following:
@table @option
@item compile
Compile a source file into a libtool object.
@item execute
Automatically set the library path so that another program can use
uninstalled libtool-generated programs or libraries.
@item link
Create a library or an executable.
@item install
Install libraries or executables.
@item finish
Complete the installation of libtool libraries on the system.
@item uninstall
Delete installed libraries or executables.
@item clean
Delete uninstalled libraries or executables.
@end table
@item --tag=@var{tag}
Use configuration variables from tag @var{tag} (@pxref{Tags}).
@item --preserve-dup-deps
Do not remove duplicate dependencies in libraries. When building packages
with static libraries, the libraries may depend circularly on each other
(shared libraries can too, but for those it doesn't matter), so there are
situations where @samp{-la -lb -la} is required, and the second
@samp{-la} may not be stripped or the link will fail. In cases where
these duplications are required, this option will preserve them, only
removing those duplicates that libtool knows can safely be stripped.
@item --no-finish
Do not execute @code{finish_cmds} (this option is disabled by default).
This option is for specifying that testing of local changes to shared
libraries is being performed, so that @command{ldconfig} will not alter
the shared library cache, which is an issue observed on OpenBSD 7.5.
This option should be combined with @option{--mode=install} and
@option{--mode=finish} to have any effect. Prior to using this option,
the shared library cache must not contain links to the listed install
directory for the shared libraries undergoing testing; otherwise, it
will have no useful effect. On OpenBSD, the shared library cache can be
reordered to prefer the directories containing the shared libraries
under test over the directories already listed in the cache with
@option{--reorder-cache=@var{shared_lib_dirs}}.
@item --reorder-cache=@var{shared_lib_dirs}
Reorder the shared library cache by providing the preferred directories
(@var{shared_lib_dirs}) to link shared libraries from. The previous
shared library cache is unconfigured, and the preferred directories are
configured with the previous directories appended to the end (if not in
the preferred directory list)@footnote{Additionally, all directories
that no longer exist will be removed from the shared library cache.}.
This option is currently only available on OpenBSD where @code{make
install} has been required before @code{make check} for the shared
library cache to be updated.
This option is essentially a wrapper for executing @command{ldconfig},
and it should be used as an independent option before and after testing
changes to shared libraries. Below are some usage examples:
@example
$ @kbd{libtool --reorder-cache=/tmp/testing}
Original: /usr/lib /usr/X11R6/lib /usr/local/lib
Reordered: /tmp/testing /usr/lib /usr/X11R6/lib /usr/local/lib
$ @kbd{libtool --reorder-cache=/usr/lib:/usr/X11R6/lib:/usr/local/lib}
Original: /tmp/testing /usr/lib /usr/X11R6/lib /usr/local/lib
Reordered: /usr/lib /usr/X11R6/lib /usr/local/lib /tmp/testing
@end example
@example
$ @kbd{libtool --reorder-cache=/tmp/testing}
Original: /usr/lib /usr/X11R6/lib /usr/local/lib
Reordered: /tmp/testing /usr/lib /usr/X11R6/lib /usr/local/lib
$ @kbd{rm -rf /tmp/testing}
$ @kbd{libtool --reorder-cache=/usr/lib:/usr/X11R6/lib:/usr/local/lib}
Original: /tmp/testing /usr/lib /usr/X11R6/lib /usr/local/lib
Reordered: /usr/lib /usr/X11R6/lib /usr/local/lib
@end example
@example
$ @kbd{libtool --reorder-cache=/tmp/testing:/usr/local/lib:/home/user/dir}
Original: /usr/lib /usr/X11R6/lib /usr/local/lib
Reordered: /tmp/testing /usr/local/lib /home/user/dir /usr/lib /usr/X11R6/lib
$ @kbd{libtool --reorder-cache=/usr/lib /usr/X11R6/lib /usr/local/lib}
Original: /tmp/testing /usr/local/lib /home/user/dir /usr/lib /usr/X11R6/lib
Reordered: /usr/lib /usr/X11R6/lib /usr/local/lib /tmp/testing /home/user/dir
@end example
@item --quiet
@itemx --silent
Do not print out any progress or informational messages.
@item -v
@itemx --verbose
Print out progress and informational messages (enabled by default),
as well as additional messages not ordinarily seen by default.
@item --no-quiet
@itemx --no-silent
Print out the progress and informational messages that are seen
by default. This option has no effect on whether the additional
messages seen in @option{--verbose} mode are shown.
@item --no-verbose
Do not print out any additional informational messages beyond
those ordinarily seen by default. This option has no effect
on whether the ordinary progress and informational messages
enabled by @option{--no-quiet} are shown.
Thus, there are now three different message levels (not counting
@option{--debug}), depending on whether the normal messages and/or
the additional verbose messages are displayed. Note that there is
no mechanism to display verbose messages, without also displaying
normal messages.
@table @strong
@item default
Normal messages are displayed, verbose messages are not displayed.
In addition to being the default mode, it can be forcibly achieved
by using both option @option{--no-verbose} and either option
@option{--no-silent} or option @option{--no-quiet}.
@item silent
Neither normal messages nor verbose messages are displayed. This
mode can be achieved using either option @option{--silent} or
option @option{--quiet}.
@item verbose
Both normal messages and verbose messages are displayed. This mode
can be achieved using either option @option{-v} or option
@option{--verbose}.
@end table
@item --version
Print libtool version information and exit.
@item -W
@itemx --warnings=@var{CATEGORY}
Report the warnings falling in category @var{CATEGORY}. The default
category is @samp{all}. To disable warnings, use the category
@samp{none}.
@end table
The current @command{libtool} implementation is done with a shell script
that needs to be invoked by the shell that @command{configure} chose for
configuring @command{libtool} (@pxref{, , config.status Invocation,
autoconf, The Autoconf Manual}). This shell is set in the she-bang
(@samp{#!}) line of the @command{libtool} script. Using a different
shell may cause undefined behavior.
The @var{mode-args} are a variable number of arguments, depending on the
selected operation mode. In general, each @var{mode-arg} is interpreted
by programs libtool invokes, rather than libtool itself.
@menu
* Compile mode:: Creating library object files.
* Link mode:: Generating executables and libraries.
* Execute mode:: Debugging libtool-generated programs.
* Install mode:: Making libraries and executables public.
* Finish mode:: Completing a library installation.
* Uninstall mode:: Removing installed executables and libraries.
* Clean mode:: Removing uninstalled executables and libraries.
@end menu
@node Compile mode
@section Compile mode
@cindex mode, compile
@cindex compile mode
For @dfn{compile} mode, @var{mode-args} is a compiler command to be used
in creating a ``standard'' object file. These arguments should begin with
the name of the C compiler, and contain the @option{-c} compiler flag so
that only an object file is created.
Libtool determines the name of the output file by removing the directory
component from the source file name, then substituting the source code
suffix (e.g.@: @samp{.c} for C source code) with the library object suffix,
@samp{.lo}.
If shared libraries are being built, any necessary PIC generation flags
are substituted into the compilation command.
The following components of @var{mode-args} are treated specially:
@table @option
@item -o
Note that the @option{-o} option is now fully supported. It is emulated
on the platforms that don't support it (by locking and moving the
objects), so libtool can be used with only minor modifications to your
Makefiles. For example, typing
@example
libtool --mode=compile gcc -c foo/x.c -o foo/x.lo
@end example
will do what you expect.
Note, however, that, if the compiler does not support @option{-c} and
@option{-o}, it is impossible to compile @file{foo/x.c} without
overwriting an existing @file{./x.o}. Therefore, if you do have a
source file @file{./x.c}, make sure you introduce dependencies in your
@file{Makefile} to make sure @file{./x.o} (or @file{./x.lo}) is
re-created after any sub-directory's @file{x.lo}:
@example
x.o x.lo: foo/x.lo bar/x.lo
@end example
@noindent
This will also ensure that make won't try to use a temporarily corrupted
@file{x.o} to create a program or library. It may cause needless
recompilation on platforms that support @option{-c} and @option{-o}
together, but it's the only way to make it safe for those that don't.
@item -no-suppress
If both PIC and non-PIC objects are being built, libtool will normally
suppress the compiler output for the PIC object compilation to save
showing very similar, if not identical duplicate output for each
object. If the @option{-no-suppress} option is given in compile mode,
libtool will show the compiler output for both objects.
@item -prefer-pic
Libtool will try to build only PIC objects.
@item -prefer-non-pic
Libtool will try to build only non-PIC objects.
@item -shared
Even if Libtool was configured with @option{--enable-static}, the object
file Libtool builds will not be suitable for static linking. Libtool
will signal an error if it was configured with @option{--disable-shared},
or if the host does not support shared libraries.
@item -static
Even if libtool was configured with @option{--disable-static}, the
object file Libtool builds @strong{will} be suitable for static
linking.
@item -Wc,@var{flag}
@itemx -Xcompiler @var{flag}
Pass a flag directly to the compiler. With @code{-Wc,}, multiple flags
may be separated by commas, whereas @code{-Xcompiler} passes through
commas unchanged. (See the example following this table.)
@end table
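For example, either of the following invocations passes the extra flags
@option{-O3} and @option{-fno-common} straight through to the compiler
(a sketch only; the flags themselves are arbitrary):
@example
$ @kbd{libtool --mode=compile gcc -c foo.c -Wc,-O3,-fno-common}
$ @kbd{libtool --mode=compile gcc -c foo.c -Xcompiler -O3 -Xcompiler -fno-common}
@end example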
@node Link mode
@section Link mode
@cindex link mode
@cindex mode, link
@dfn{Link} mode links together object files (including library
objects) to form another library or to create an executable program.
@var{mode-args} consist of a command using the C compiler to create an
output file (with the @option{-o} flag) from several object files.
The following components of @var{mode-args} are treated specially:
@table @option
@cindex undefined symbols, allowing
@cindex unresolved symbols, allowing
@item -all-static
If @var{output-file} is a program, then do not link it against any
shared libraries at all. If @var{output-file} is a library, then only
create a static library. In general, this flag cannot be used together
with @samp{disable-static} (@pxref{LT_INIT}).
@item -avoid-version
Tries to avoid versioning (@pxref{Versioning}) for libraries and modules,
i.e.@: no version information is stored and no symbolic links are created.
If the platform requires versioning, this option has no effect.
@item -bindir
Pass the absolute name of the directory for installing executable
programs (@pxref{Directory Variables, , Directory Variables, standards,
The GNU Coding Standards}). @command{libtool} may use this value to
install shared libraries there on systems that do not provide for any
library hardcoding and use the directory of a program and the @env{PATH}
variable as library search path. This is typically used for DLLs on
Windows or other systems using the PE (Portable Executable) format.
On other systems, @option{-bindir} is ignored. The default value used
is @file{@var{libdir}/../bin} for libraries installed to
@file{@var{libdir}}. You should not use @option{-bindir} for modules.
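For example (library and directory names are hypothetical), a package
targeting a PE system might link a library as follows:
@example
libtool --mode=link gcc -o libfoo.la foo.lo \
        -rpath /usr/local/lib -bindir /usr/local/bin
@end example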
@item -dlopen @var{file}
Same as @option{-dlpreopen @var{file}}, if native dlopening is not
supported on the host platform (@pxref{Dlopened modules}) or if
the program is linked with @option{-static},
@option{-static-libtool-libs}, or @option{-all-static}. Otherwise, no
effect. If @var{file} is @code{self} Libtool will make sure that the
program can @code{dlopen} itself, either by enabling
@option{-export-dynamic} or by falling back to @option{-dlpreopen self}.
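For instance (program and module names are hypothetical), a program
that needs to @code{dlopen} both itself and a libtool module could be
linked like this:
@example
libtool --mode=link gcc -o myprog main.o -dlopen self -dlopen libplugin.la
@end example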
@item -dlpreopen @var{file}
Link @var{file} into the output program, and add its symbols to the
list of preloaded symbols (@pxref{Dlpreopening}). If @var{file} is
@code{self}, the symbols of the program itself will be added to
preloaded symbol lists. If @var{file} is @code{force} Libtool will
make sure that a preloaded symbol list is always @emph{defined},
regardless of whether it's empty or not.
@item -export-dynamic
Allow symbols from @var{output-file} to be resolved with @code{dlsym}
(@pxref{Dlopened modules}).
@item -export-symbols @var{symfile}
Tells the linker to export only the symbols listed in @var{symfile}.
The symbol file should end in @file{.sym} and must contain the name of one
symbol per line. This option has no effect:
@itemize @bullet
@item
on static libraries, and
@item
on shared libraries on some platforms, such as AIX and Haiku.
@end itemize
By default all symbols are exported.
@item -export-symbols-regex @var{regex}
Same as @option{-export-symbols}, except that only symbols matching
the regular expression @var{regex} are exported.
By default all symbols are exported.
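As a sketch (file and symbol names are hypothetical), either of the
following commands restricts the exported symbols to those beginning
with @samp{foo_}; the first reads them from @file{foo.sym}, which lists
one symbol name per line, while the second matches them with a regular
expression:
@example
libtool --mode=link gcc -o libfoo.la foo.lo -rpath /usr/local/lib \
        -export-symbols foo.sym
libtool --mode=link gcc -o libfoo.la foo.lo -rpath /usr/local/lib \
        -export-symbols-regex '^foo_'
@end example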
@item -L@var{libdir}
Search @var{libdir} for required libraries that have already been
installed.
@item -l@var{name}
@var{output-file} requires the installed library @file{lib@var{name}}.
This option is required even when @var{output-file} is not an
executable.
@item -module
Creates a library that can be dlopened (@pxref{Dlopened modules}).
This option doesn't work for programs.
Module names don't need to be prefixed with @samp{lib}.
In order to prevent name clashes, however, @file{lib@var{name}} and @file{@var{name}}
must not be used at the same time in your package.
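A dlopenable module is therefore typically linked along these lines
(names are hypothetical); @option{-avoid-version} is often added
because modules rarely need version information:
@example
libtool --mode=link gcc -o plugin.la plugin.lo -module -avoid-version \
        -rpath /usr/local/lib/mypackage
@end example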
@item -no-fast-install
Disable fast-install mode for the executable @var{output-file}. Useful
if the program won't be necessarily installed.
@item -no-install
Link an executable @var{output-file} that can't be installed and
therefore doesn't need a wrapper script on systems that allow hardcoding
of library paths. Useful if the program is only used in the build tree,
e.g., for testing or generating other files.
@item -no-undefined
Declare that @var{output-file} does not depend on any libraries other
than the ones listed on the command line, i.e., after linking, it will
not have unresolved symbols. Some platforms require all symbols in
shared libraries to be resolved at library creation (@pxref{Inter-library
dependencies}), and using this parameter allows @command{libtool} to
assume that this will not happen.
@item -o @var{output-file}
Create @var{output-file} from the specified objects and libraries.
@item -objectlist @var{file}
Use a list of object files found in @var{file} to specify objects.
@item -os2dllname @var{name}
Use this to change the DLL base name on OS/2 to @var{name}, to keep
within the 8 character base name limit on this system.
@item -precious-files-regex @var{regex}
Prevents removal of files from the temporary output directory whose
names match this regular expression. You might specify @samp{\.bbg?$}
to keep those files created with @code{gcc -ftest-coverage} for example.
@item -release @var{release}
Specify that the library was generated by release @var{release} of your
package, so that users can easily tell what versions are newer than
others. Be warned that no two releases of your package will be binary
compatible if you use this flag. If you want binary compatibility, use
the @option{-version-info} flag instead (@pxref{Versioning}).
@item -rpath @var{libdir}
If @var{output-file} is a library, it will eventually be installed in
@var{libdir}. If @var{output-file} is a program, add @var{libdir} to
the run-time path of the program. On platforms that don't support
hardcoding library paths into executables and only search PATH for
shared libraries, such as when @var{output-file} is a Windows (or
other PE platform) DLL, the @file{.la} control file will be installed in
@var{libdir}, but see @option{-bindir} above for the eventual destination
of the @file{.dll} or other library file itself.
@item -R @var{libdir}
If @var{output-file} is a program, add @var{libdir} to its run-time
path. If @var{output-file} is a library, add @option{-R@var{libdir}} to its
@var{dependency_libs}, so that, whenever the library is linked into a
program, @var{libdir} will be added to its run-time path.
@item -shared
If @var{output-file} is a program, then link it against any
uninstalled shared libtool libraries (this is the default behavior).
If @var{output-file} is a library, then only create a shared library.
In the latter case, libtool will signal an error if it was configured
with @option{--disable-shared}, or if the host does not support shared
libraries.
@item -shrext @var{suffix}
If @var{output-file} is a libtool library, replace the system's standard
file name extension for shared libraries with @var{suffix} (most systems
use @file{.so} here). This option is helpful in certain cases where an
application requires that shared libraries (typically modules) have an
extension other than the default one. Please note you must supply the
full file name extension including any leading dot.
@item -static
If @var{output-file} is a program, then do not link it against any
uninstalled shared libtool libraries. If @var{output-file} is a
library, then only create a static library.
@item -static-libtool-libs
If @var{output-file} is a program, then do not link it against any
shared libtool libraries. If @var{output-file} is a library, then only
create a static library.
@item -version-info @var{current}[:@var{revision}[:@var{age}]]
If @var{output-file} is a libtool library, use interface version
information @var{current}, @var{revision}, and @var{age} to build it
(@pxref{Versioning}). Do @strong{not} use this flag to specify package
release information, rather see the @option{-release} flag.
@item -version-number @var{major}[:@var{minor}[:@var{revision}]]
If @var{output-file} is a libtool library, compute interface version
information so that the resulting library uses the specified major, minor and
revision numbers. This is designed to permit libtool to be used with
existing projects where identical version numbers are already used across
operating systems. New projects should use the @option{-version-info} flag
instead.
@item -weak @var{libname}
If @var{output-file} is a libtool library, declare that it provides a
weak @var{libname} interface. This is a hint to libtool that there is
no need to append @var{libname} to the list of dependency libraries of
@var{output-file}, because linking against @var{output-file} already
supplies the same interface (@pxref{Linking with dlopened modules}).
@item -Wc,@var{flag}
@itemx -Xcompiler @var{flag}
Pass a linker-specific flag directly to the compiler. With @code{-Wc,},
multiple flags may be separated by commas, whereas @code{-Xcompiler }
passes through commas unchanged.
@item -Wa,@var{flag}
@itemx -Xassembler @var{flag}
Pass a linker-specific flag directly to the assembler. With @code{-Wa,},
multiple flags may be separated by commas, whereas @code{-Xassembler }
passes through commas unchanged.
@item -Wl,@var{flag}
@itemx -Xlinker @var{flag}
Pass a linker-specific flag directly to the linker.
@item -XCClinker @var{flag}
Pass a link-specific flag to the compiler driver (@code{CC}) during linking.
@end table
If the @var{output-file} ends in @file{.la}, then a libtool library is
created, which must be built only from library objects (@file{.lo} files).
The @option{-rpath} option is required. In the current implementation,
libtool libraries may not depend on other uninstalled libtool libraries
(@pxref{Inter-library dependencies}).
If the @var{output-file} ends in @file{.a}, then a standard library is
created using @code{ar} and possibly @code{ranlib}.
@cindex partial linking
@cindex linking, partial
If @var{output-file} ends in @file{.o} or @file{.lo}, then a reloadable object
file is created from the input files (generally using @samp{ld -r}).
This method is often called @dfn{partial linking}.
Otherwise, an executable program is created.
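To illustrate the different output types (all file names are
hypothetical), the following commands create, in turn, a libtool
library, a reloadable object, and an executable program:
@example
libtool --mode=link gcc -o libfoo.la foo.lo bar.lo -rpath /usr/local/lib
libtool --mode=link gcc -o foobar.lo foo.lo bar.lo
libtool --mode=link gcc -o prog main.o libfoo.la
@end example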
@node Execute mode
@section Execute mode
@cindex execute mode
@cindex mode, execute
For @dfn{execute} mode, the library path is automatically set, then a
program is executed.
The first of the @var{mode-args} is treated as a program name, with the
rest as arguments to that program.
The following components of @var{mode-args} are treated specially:
@table @option
@item -dlopen @var{file}
Add the directory containing @var{file} to the library path.
@end table
This mode sets the library path environment variable according to any
@option{-dlopen} flags.
If any of the @var{args} are libtool executable wrappers, then they are
translated into the name of their corresponding uninstalled binary, and
any of their required library directories are added to the library path.
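For example, to run the debugger on an uninstalled program that
@code{dlopen}s a hypothetical module @file{libplugin.la}:
@example
libtool --mode=execute -dlopen libplugin.la gdb myprog
@end example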
@node Install mode
@section Install mode
@cindex install mode
@cindex mode, install
In @dfn{install} mode, libtool interprets most of the elements of
@var{mode-args} as an installation command beginning with
@command{cp}, or a BSD-compatible @command{install} program.
The following components of @var{mode-args} are treated specially:
@table @option
@item -inst-prefix-dir @var{inst-prefix-dir}
When installing into a temporary staging area, rather than the
final @code{prefix}, this argument is used to reflect the
temporary path, in much the same way @command{automake} uses
@env{DESTDIR}. For instance, if @code{prefix} is @file{/usr/local},
but @var{inst-prefix-dir} is @file{/tmp}, then the object will be
installed under @file{/tmp/usr/local/}. If the installed object
is a libtool library, then the internal fields of that library
will reflect only @code{prefix}, not @var{inst-prefix-dir}:
@example
# Directory that this library needs to be installed in:
libdir='/usr/local/lib'
@end example
not
@example
# Directory that this library needs to be installed in:
libdir='/tmp/usr/local/lib'
@end example
@code{inst-prefix} is also used to ensure that, if the installed
object must be relinked upon installation, it is relinked
against the libraries in @var{inst-prefix-dir}/@code{prefix},
not @code{prefix}.
In truth, this option is not really intended for use when calling
libtool directly; it is automatically used when @code{libtool --mode=install}
calls @code{libtool --mode=relink}. Libtool does this by
analyzing the destination path given in the original
@code{libtool --mode=install} command and comparing it to the
expected installation path established during @code{libtool --mode=link}.
Thus, end-users need change nothing, and @command{automake}-style
@code{make install DESTDIR=/tmp} will Just Work(tm) most of the time.
For systems where fast installation cannot be turned on, relinking
may be needed. In this case, a @samp{DESTDIR} install will fail.
Currently it is not generally possible to install into a temporary
staging area that contains needed third-party libraries that are
not yet visible at their final location.
@end table
The rest of the @var{mode-args} are interpreted as arguments to the
@command{cp} or @command{install} command.
The command is run, and any necessary unprivileged post-installation
commands are also completed.
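A typical invocation, using a BSD-compatible @command{install} program
and a hypothetical library, looks like this:
@example
libtool --mode=install /usr/bin/install -c libhello.la \
        /usr/local/lib/libhello.la
@end example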
@node Finish mode
@section Finish mode
@cindex finish mode
@cindex mode, finish
@dfn{Finish} mode has two functions. One is to help system administrators
install libtool libraries so that they can be located and linked into
user programs. To invoke this functionality, pass the name of a library
directory as @var{mode-arg}. Running this command may require superuser
privileges, and the @option{--dry-run} option may be useful.
The second is to facilitate transferring libtool libraries to a native
compilation environment after they were built in a cross-compilation
environment. Cross-compilation environments may rely on recent libtool
features, and running libtool in finish mode will make it easier to
work with older versions of libtool. This task is performed whenever
the @var{mode-arg} is a @file{.la} file.
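For the first of these uses, a system administrator might run, for
example:
@example
libtool --mode=finish /usr/local/lib
@end example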
@node Uninstall mode
@section Uninstall mode
@cindex uninstall mode
@cindex mode, uninstall
@dfn{Uninstall} mode deletes installed libraries, executables and objects.
The first @var{mode-arg} is the name of the program to use to delete
files (typically @command{/bin/rm}).
The remaining @var{mode-args} are either flags for the deletion program
(beginning with a @samp{-}), or the names of files to delete.
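For example, to remove a previously installed (hypothetical) library:
@example
libtool --mode=uninstall rm -f /usr/local/lib/libhello.la
@end example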
@node Clean mode
@section Clean mode
@cindex clean mode
@cindex mode, clean
@dfn{Clean} mode deletes uninstalled libraries, executables, objects
and libtool's temporary files associated with them.
The first @var{mode-arg} is the name of the program to use to delete
files (typically @command{/bin/rm}).
The remaining @var{mode-args} are either flags for the deletion program
(beginning with a @samp{-}), or the names of files to delete.
@node Integrating libtool
@chapter Integrating libtool with your package
This chapter describes how to integrate libtool with your packages so
that your users can install hassle-free shared libraries.
There are several ways that Libtool may be integrated in your
package, described in the following sections. Typically, the Libtool
macro files as well as @file{ltmain.sh} are copied into your package
using @command{libtoolize} and @command{aclocal} after setting up the
@file{configure.ac} and toplevel @file{Makefile.am}, then
@command{autoconf} adds the needed tests to the @file{configure} script.
These individual steps are often automated with @command{autoreconf}.
Here is a diagram showing how such a typical Libtool configuration works
when preparing a package for distribution, assuming that @file{m4} has
been chosen as the location for additional Autoconf macros, and
@file{build-aux} as the location for auxiliary build tools (@pxref{Input,,
The Autoconf Manual, autoconf, The Autoconf Manual}):
@example
@group
libtool.m4 -----. .--> aclocal.m4 -----.
ltoptions.m4 ---+ .-> aclocal* -+ +--> autoconf*
ltversion.m4 ---+--+ `--> [copy in m4/] --+ |
ltsugar.m4 -----+ | ^ | \/
lt~obsolete.m4 -+ +-> libtoolize* -----' | configure
[ltdl.m4] ------+ | |
`----------------------------------'
ltmain.sh -----------> libtoolize* -> [copy in build-aux/]
@end group
@end example
During configuration, the @file{libtool} script is generated either
through @command{config.status} or @command{config.lt}:
@example
@group
.--> config.status* --.
configure* --+ +--> libtool
`--> [config.lt*] ----' ^
|
ltmain.sh --------------------------------'
@end group
@end example
At @command{make} run time, @command{libtool} is then invoked as needed
as a wrapper around compilers, linkers, install and cleanup programs.
There are alternative choices for several parts of the setup; for
example, the Libtool macro files can either be copied or symlinked into
the package, or copied into @file{aclocal.m4}. As another example, an
external, pre-configured @command{libtool} script may be used,
by-passing most of the tests and package-specific setup for Libtool.
@menu
* Autoconf macros:: Autoconf macros exported by libtool.
* Makefile rules:: Writing @file{Makefile} rules for libtool.
* Using Automake:: Automatically supporting libtool.
* Configuring:: Configuring libtool for a host system.
* Distributing:: What files to distribute with your package.
* Static-only libraries:: Sometimes shared libraries are just a pain.
@end menu
@node Autoconf macros
@section Autoconf macros exported by libtool
Libtool uses a number of macros to interrogate the host system when it
is being built, and you can use some of them yourself too. Although
there are a great many other macros in the libtool installed m4 files,
these do not form part of the published interface, and are subject to
change between releases.
@noindent
Macros in the @samp{LT_CMD_} namespace check for various shell
commands:
@defmac LT_CMD_MAX_LEN
Finds the longest command line that can be safely passed to
@samp{$SHELL} without being truncated, and stores the result in the shell variable
@samp{$max_cmd_len}. It is only an approximate value, but command
lines of this length or shorter are guaranteed not to be truncated.
@end defmac
@noindent
Macros in the @samp{LT_FUNC_} namespace check characteristics of
library functions:
@defmac LT_FUNC_DLSYM_USCORE
@samp{AC_DEFINE} the preprocessor symbol @samp{DLSYM_USCORE} if we
have to add an underscore to symbol-names passed in to @samp{dlsym}.
@end defmac
@noindent
Macros in the @samp{LT_LIB_} namespace check characteristics of system
libraries:
@defmac LT_LIB_M
Set @samp{LIBM} to the math library or libraries required on this
machine, if any.
@end defmac
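As a minimal sketch of using this macro, the following
@file{configure.ac} fragment substitutes @code{LIBM} explicitly (in
case the macro does not already do so), and a hypothetical
@file{Makefile.am} then links against it:
@example
LT_INIT
LT_LIB_M
AC_SUBST([LIBM])
@end example
@noindent
and, in @file{Makefile.am}:
@example
myprog_LDADD = $(LIBM)
@end example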
@defmac LT_LIB_DLLOAD
This is the macro used by @samp{libltdl} to determine what dlloaders
to use on this machine, if any. Several shell variables are set (and
@samp{AC_SUBST}ed) depending on which dlload interfaces are available on
this machine.  @samp{LT_DLLOADERS} contains a list of libtool
libraries that can be used; @samp{LIBADD_DLOPEN} is set if additional
system libraries are required by the @samp{dlopen} loader, and
@samp{LIBADD_SHL_LOAD} is set if additional system libraries are
required by the @samp{shl_load} loader.  Finally, some symbols are set
in @file{config.h}
depending on the loaders that are found to work: @samp{HAVE_LIBDL},
@samp{HAVE_SHL_LOAD}, @samp{HAVE_DYLD}, @samp{HAVE_DLD}.
@end defmac
@noindent
Macros in the @samp{LT_PATH_} namespace search the system for the full
path to particular system commands:
@defmac LT_PATH_LD
Add a @option{--with-gnu-ld} option to @file{configure}. Try to find
the path to the linker used by @samp{$CC}, and whether it is the
GNU linker. The result is stored in the shell variable
@samp{$LD}, which is @code{AC_SUBST}ed.
@end defmac
@defmac LT_PATH_NM
Try to find a BSD-compatible @command{nm} or an MS-compatible
@command{dumpbin} command on this machine. The result is stored in the
shell variable @samp{$NM}, which is @code{AC_SUBST}ed.
@end defmac
@noindent
Macros in the @samp{LT_SYS_} namespace probe for system
characteristics:
@defmac LT_SYS_DLOPEN_SELF
Tests whether a program can dlopen itself, and then also whether the
same program can still dlopen itself when statically linked. Results
are stored in the shell variables @samp{$enable_dlopen_self} and
@samp{$enable_dlopen_self_static}, respectively.
@end defmac
@defmac LT_SYS_DLOPEN_DEPLIBS
Define the preprocessor symbol @samp{LTDL_DLOPEN_DEPLIBS} if the
OS needs help to load dependent libraries for @samp{dlopen} (or
equivalent).
@end defmac
@defmac LT_SYS_DLSEARCH_PATH
Define the preprocessor symbol @samp{LT_DLSEARCH_PATH} to the system
default library search path.
@end defmac
@defmac LT_SYS_MODULE_EXT
Define the preprocessor symbol @samp{LT_MODULE_EXT} to the extension
used for runtime loadable modules. If you use libltdl to open
modules, then you can simply use the libtool library extension,
@file{.la}.
@end defmac
@defmac LT_SYS_MODULE_PATH
Define the preprocessor symbol @samp{LT_MODULE_PATH_VAR} to the name
of the shell environment variable that determines the run-time module
search path.
@end defmac
@defmac LT_SYS_SYMBOL_USCORE
Set the shell variable @samp{sys_symbol_underscore} to @samp{no}
unless the compiler prefixes global symbols with an underscore.
@end defmac
@node Makefile rules
@section Writing @file{Makefile} rules for libtool
@cindex Makefile
@cindex Makefile.am
@cindex Makefile.in
Libtool is fully integrated with Automake (@pxref{Top,, Introduction,
automake, The Automake Manual}), starting with Automake version 1.2.
If you want to use libtool in a regular @file{Makefile} (or
@file{Makefile.in}), you are on your own. If you're not using
Automake, and you don't know how to incorporate libtool into your
package you need to do one of the following:
@enumerate 1
@item
Download the latest Automake distribution from your nearest GNU
mirror, install it, and start using it.
@item
Learn how to write @file{Makefile} rules by hand. They're sometimes complex,
but if you're clever enough to write rules for compiling your old
libraries, then you should be able to figure out new rules for libtool
libraries (hint: examine the @file{Makefile.in} in the
@file{tests/testsuite.dir/027} subdirectory, generated from the Autotest test
labeled 'link against a preloaded static library' in @file{tests/demo.at},
of the libtool distribution; note especially that it was automatically
generated from the @file{Makefile.am} by Automake).
@end enumerate
@node Using Automake
@section Using Automake with libtool
@vindex LTLIBRARIES
Libtool library support is implemented under the @samp{LTLIBRARIES}
primary.
Here are some samples from the Automake @file{Makefile.am} in the
libtool distribution's @file{tests/demo.at}.
First, to link a program against a libtool library, just use the
@samp{program_LDADD}@footnote{@c
@c
Since GNU Automake 1.5, the flags @option{-dlopen}
or @option{-dlpreopen} (@pxref{Link mode}) can be employed with the
@samp{program_LDADD} variable. Unfortunately, older releases didn't
accept these flags, so if you are stuck with an ancient Automake, we
recommend quoting the flag itself, and setting
@samp{program_DEPENDENCIES} too:
@example
program_LDADD = "-dlopen" libfoo.la
program_DEPENDENCIES = libfoo.la
@end example
@c
} variable:
@example
bin_PROGRAMS = hell hell_static
# Build hell from main.c and libhello.la
hell_SOURCES = main.c
hell_LDADD = libhello.la
# Create a statically linked version of hell.
hell_static_SOURCES = main.c
hell_static_LDADD = libhello.la
hell_static_LDFLAGS = -static
@end example
You may use the @samp{program_LDFLAGS} variable to stuff in any flags
you want to pass to libtool while linking @file{program} (such as
@option{-static} to avoid linking uninstalled shared libtool libraries).
Building a libtool library is almost as trivial@dots{} note the use of
@samp{libhello_la_LDFLAGS} to pass the @option{-version-info}
(@pxref{Versioning}) option to libtool:
@example
# Build a libtool library, libhello.la for installation in libdir.
lib_LTLIBRARIES = libhello.la
libhello_la_SOURCES = hello.c foo.c
libhello_la_LDFLAGS = -version-info 3:12:1
@end example
The @option{-rpath} option is passed automatically by Automake (except for
libraries listed as @code{noinst_LTLIBRARIES}), so you
should not specify it.
@xref{A Shared Library, Building a Shared Library, The Automake Manual,
automake, The Automake Manual}, for more information.
When building libtool archives which depend on built sources (for example a
generated header file), you may find it necessary to manually record
these dependencies.
Because the object file names for a libtool archive are generated by libtool,
manually recording these dependencies is not as straightforward as the
examples in Automake's manual describe.
This affects header files in particular, because simply listing them as
@samp{nodist_libfoo_la_SOURCES} will not cause Automake to establish a
dependent relationship for the object files of @file{libfoo.la}.
A useful trick (although somewhat imprecise) is to manually record built
sources used by a libtool archive as dependencies of all the objects for that
library as shown below (as opposed to a particular object file):
@example
# Build a libtool library, libhello.la which depends on a generated header.
hello.h:
echo '#define HELLO_MESSAGE "Hello, World!"' > $@@
BUILT_SOURCES = hello.h
CLEANFILES = hello.h
nodist_libhello_la_SOURCES = hello.h
libhello_la_SOURCES = hello.c foo.h foo.c bar.h bar.c
# Manually record hello.h as a prerequisite for all objects in libhello.la
$(libhello_la_OBJECTS): hello.h
@end example
@xref{Built Sources Example, Recording Dependencies manually, The Automake Manual,
automake, The Automake Manual}, for more information.
@node Configuring
@section Configuring libtool
@cindex configuring libtool
Libtool requires intimate knowledge of your compiler suite and operating
system to be able to create shared libraries and link against
them properly. When you install the libtool distribution, a
system-specific libtool script is installed into your binary directory.
However, when you distribute libtool with your own packages
(@pxref{Distributing}), you do not always know the compiler suite and
operating system that are used to compile your package.
For this reason, libtool must be @dfn{configured} before it can be
used. This idea should be familiar to anybody who has used a GNU
@code{configure} script. @code{configure} runs a number of tests for
system features, then generates the @file{Makefile}s (and possibly a
@file{config.h} header file), after which you can run @code{make} and
build the package.
Libtool adds its own tests to your @code{configure} script to
generate a libtool script for the installer's host machine.
@menu
* LT_INIT:: Configuring @code{libtool} in @file{configure.ac}.
* Configure notes:: Platform-specific notes for configuration.
@end menu
@node LT_INIT
@subsection The @code{LT_INIT} macro
If you are using GNU Autoconf (or Automake), you should add a call to
@code{LT_INIT} to your @file{configure.ac} file. This macro
adds many new tests to the @code{configure} script so that the generated
libtool script will understand the characteristics of the host. It's the
most important of a number of macros defined by Libtool:
@defmac LT_PREREQ (@var{version})
Ensure that a recent enough version of Libtool is being used. If the
version of Libtool used for @code{LT_INIT} is earlier than
@var{version}, print an error message to the standard
error output and exit with failure (exit status is 63). For example:
@example
LT_PREREQ([@value{VERSION}])
@end example
@end defmac
@defmac LT_INIT (@var{options})
@defmacx AC_PROG_LIBTOOL
@defmacx AM_PROG_LIBTOOL
Add support for the @option{--enable-shared}, @option{--disable-shared},
@option{--enable-static}, @option{--disable-static}, @option{--enable-pic}, and
@option{--disable-pic} @code{configure} flags.@footnote{@code{LT_INIT} requires
that you define the @file{Makefile} variable @code{top_builddir} in your
@file{Makefile.in}. Automake does this automatically, but Autoconf
users should set it to the relative path to the top of your build
directory (@file{../..}, for example).} @code{AC_PROG_LIBTOOL} and
@code{AM_PROG_LIBTOOL} are deprecated names for older versions of this macro;
@code{autoupdate} will upgrade your @file{configure.ac} files.
By default, this macro turns on shared libraries if they are available,
and also enables static libraries if they don't conflict with the shared
libraries. You can modify these defaults by passing either
@code{disable-shared} or @code{disable-static} in the option list to
@code{LT_INIT}, or using @code{AC_DISABLE_SHARED} or @code{AC_DISABLE_STATIC}.
@example
# Turn off shared libraries during beta-testing, since they
# make the build process take too long.
LT_INIT([disable-shared])
@end example
The user may specify modified forms of the configure flags
@option{--enable-shared} and @option{--enable-static} to choose whether
shared or static libraries are built based on the name of the package.
For example, to have shared @samp{bfd} and @samp{gdb} libraries built,
but not shared @samp{libg++}, you can run all three @code{configure}
scripts as follows:
@example
trick$ ./configure --enable-shared=bfd,gdb
@end example
In general, specifying @option{--enable-shared=@var{pkgs}} is the same as
configuring with @option{--enable-shared} every package named in the
comma-separated @var{pkgs} list, and every other package with
@option{--disable-shared}. The @option{--enable-static=@var{pkgs}} flag
behaves similarly, but it uses @option{--enable-static} and
@option{--disable-static}. The same applies to the
@option{--enable-fast-install=@var{pkgs}} flag, which uses
@option{--enable-fast-install} and @option{--disable-fast-install}.
The package name @samp{default} matches any packages that have not set
their name in the @code{PACKAGE} environment variable.
The @option{--enable-pic} and @option{--disable-pic} configure flags can be
used to specify whether or not @command{libtool} uses PIC objects. By default,
@command{libtool} uses PIC objects for shared libraries and non-PIC objects for
static libraries.
The @option{--enable-pic} option also accepts a comma-separated
list of package names. Specifying @option{--enable-pic=@var{pkgs}} is the same
as configuring every package in @var{pkgs} with @option{--enable-pic} and every
other package with the default configuration. The package name @samp{default}
is treated the same as for @option{--enable-shared} and
@option{--enable-static}.
This macro also sets the shell variable @code{LIBTOOL_DEPS}, that you
can use to automatically update the libtool script if it becomes
out-of-date. In order to do that, add to your @file{configure.ac}:
@example
LT_INIT
AC_SUBST([LIBTOOL_DEPS])
@end example
and, to @file{Makefile.in} or @file{Makefile.am}:
@example
LIBTOOL_DEPS = @@LIBTOOL_DEPS@@
libtool: $(LIBTOOL_DEPS)
$(SHELL) ./config.status libtool
@end example
If you are using GNU Automake, you can omit the assignment, as Automake
will take care of it. You'll obviously have to create some dependency
on @file{libtool}.
Aside from @code{disable-static} and @code{disable-shared}, there are
other options that you can pass to @code{LT_INIT} to modify its
behaviour. Here is a full list:
@table @samp
@item dlopen
Enable checking for dlopen support. This option should be used if
the package makes use of the @option{-dlopen} and @option{-dlpreopen}
libtool flags, otherwise libtool will assume that the system does not
support dlopening.
@item win32-dll
This option should be used if the package has been ported to build clean
dlls on win32 platforms. Usually this means that any library data items
are exported with @code{__declspec(dllexport)} and imported with
@code{__declspec(dllimport)}. If this option is not used, libtool will
assume that the package libraries are not dll clean and will build only
static libraries on win32 hosts.
Provision must be made to pass @option{-no-undefined} to @code{libtool}
in link mode from the package @code{Makefile}. Naturally, if you pass
@option{-no-undefined}, you must ensure that all the library symbols
@strong{really are} defined at link time!
@item aix-soname=aix
@itemx aix-soname=svr4
@itemx aix-soname=both
Enable the @option{--enable-aix-soname} option to @command{configure}, which the
user can pass to override the given default.
By default (and @strong{always} in releases prior to 2.4.4), Libtool always
behaves as if @code{aix-soname=aix} is given, with no @command{configure}
option for the user to override. Specifically, when the @option{-brtl} linker
flag is seen in @code{LDFLAGS} at build-time, static archives are built from
static objects only, otherwise, traditional AIX shared library archives of
shared objects using in-archive versioning are built (with the @code{.a} file
extension!). Similarly, with @option{-brtl} in @code{LDFLAGS}, libtool
shared archives are built from shared objects, without any filename-based
versioning; and without @option{-brtl} no shared archives are built at all.
When the @code{aix-soname=svr4} option is given, or the
@option{--enable-aix-soname=svr4} @command{configure} option is passed, static
archives are always created from static objects, even without @option{-brtl}
in @code{LDFLAGS}. Shared archives are made from shared objects, and filename
based versioning is enabled.
When the @code{aix-soname=both} option is given, or the
@option{--enable-aix-soname=both} @command{configure} option is passed, static
archives are built traditionally (as @option{aix-soname=aix}), and both
kinds of shared archives are built. The @code{.la} pseudo-archive specifies
one or the other depending on whether @option{-brtl} is specified in
@code{LDFLAGS} when the library is built.
@item disable-fast-install
Change the default behaviour for @code{LT_INIT} to disable
optimization for fast installation. The user may still override this
default, depending on platform support, by specifying
@option{--enable-fast-install} to @command{configure}.
@item shared
Change the default behaviour for @code{LT_INIT} to enable
shared libraries. This is the default on all systems where
Libtool knows how to create shared libraries.
The user may still override this default by specifying
@option{--disable-shared} to @command{configure}.
@item disable-shared
Change the default behaviour for @code{LT_INIT} to disable
shared libraries. The user may still override this default by
specifying @option{--enable-shared} to @command{configure}.
@item static
Change the default behaviour for @code{LT_INIT} to enable
static libraries. This is the default on all systems where
shared libraries have been disabled for some reason, and on
most systems where shared libraries have been enabled.
If shared libraries are enabled, the user may still override
this default by specifying @option{--disable-static} to
@command{configure}.
@item disable-static
Change the default behaviour for @code{LT_INIT} to disable
static libraries. The user may still override this default by
specifying @option{--enable-static} to @command{configure}.
@item pic-only
Change the default behaviour for @command{libtool} to try to use only
PIC objects. The user may still override this default by specifying
@option{--disable-pic} to @command{configure}.
@item no-pic
Change the default behaviour of @command{libtool} to try to use only
non-PIC objects. The user may still override this default by
specifying @option{--enable-pic} to @command{configure}.
@end table
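For instance, a package that dlopens its own modules, builds clean
DLLs on Windows, and defaults to shared-only libraries might call
(a sketch):
@example
LT_INIT([dlopen win32-dll disable-static])
@end example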
@end defmac
@defmac LT_LANG (@var{language})
Enable @command{libtool} support for the language given if it
has not yet already been enabled. Languages accepted are ``C++'',
``Fortran 77'', ``Java'', ``Go'', and ``Windows Resource''.
If Autoconf language support macros such as @code{AC_PROG_CXX} are
used in your @file{configure.ac}, Libtool language support will automatically
be enabled.
Conversely using @code{LT_LANG} to enable language support for Libtool
will automatically enable Autoconf language support as well.
Both of the following examples are therefore valid ways of adding C++
language support to Libtool.
@example
LT_INIT
LT_LANG([C++])
@end example
@example
LT_INIT
AC_PROG_CXX
@end example
@end defmac
@defmac AC_LIBTOOL_DLOPEN
This macro is deprecated, the @samp{dlopen} option to @code{LT_INIT} should be
used instead.
@end defmac
@defmac AC_LIBTOOL_WIN32_DLL
This macro is deprecated, the @samp{win32-dll} option to @code{LT_INIT} should
be used instead.
@end defmac
@defmac AC_DISABLE_FAST_INSTALL
This macro is deprecated, the @samp{disable-fast-install} option to @code{LT_INIT}
should be used instead.
@end defmac
@defmac AC_DISABLE_SHARED
@defmacx AM_DISABLE_SHARED
Change the default behaviour for @code{LT_INIT} to disable shared libraries.
The user may still override this default by specifying @samp{--enable-shared}.
The option @samp{disable-shared} to @code{LT_INIT} is a shorthand for this.
@code{AM_DISABLE_SHARED} is a deprecated alias for @code{AC_DISABLE_SHARED}.
@end defmac
@defmac AC_ENABLE_SHARED
@defmacx AM_ENABLE_SHARED
Change the default behaviour for @code{LT_INIT} to enable shared libraries.
This is the default on all systems where Libtool knows how to create
shared libraries. The user may still override this default by specifying
@samp{--disable-shared}. The option @samp{shared} to @code{LT_INIT} is a
shorthand for this.
@code{AM_ENABLE_SHARED} is a deprecated alias for @code{AC_ENABLE_SHARED}.
@end defmac
@defmac AC_DISABLE_STATIC
@defmacx AM_DISABLE_STATIC
Change the default behaviour for @code{LT_INIT} to disable static libraries.
The user may still override this default by specifying @samp{--enable-static}.
The option @samp{disable-static} to @code{LT_INIT} is a shorthand for this.
@code{AM_DISABLE_STATIC} is a deprecated alias for @code{AC_DISABLE_STATIC}.
@end defmac
@defmac AC_ENABLE_STATIC
@defmacx AM_ENABLE_STATIC
Change the default behaviour for @code{LT_INIT} to enable static libraries.
This is the default on all systems where shared libraries have been disabled
for some reason, and on most systems where shared libraries have been enabled.
If shared libraries are enabled, the user may still override this default by
specifying @samp{--disable-static}. The option @samp{static} to @code{LT_INIT}
is a shorthand for this.
@code{AM_ENABLE_STATIC} is a deprecated alias for @code{AC_ENABLE_STATIC}.
@end defmac
The tests in @code{LT_INIT} also recognize the following
environment variables:
@defvar CC
The C compiler that will be used by the generated @code{libtool}. If
this is not set, @code{LT_INIT} will look for @command{gcc} or
@command{cc}.
@end defvar
@defvar CFLAGS
Compiler flags used to generate standard object files. If this is not
set, @code{LT_INIT} will not use any such flags. It affects
only the way @code{LT_INIT} runs tests, not the produced
@code{libtool}.
@end defvar
@defvar CPPFLAGS
C preprocessor flags. If this is not set, @code{LT_INIT} will
not use any such flags. It affects only the way @code{LT_INIT}
runs tests, not the produced @code{libtool}.
@end defvar
@defvar LD
The system linker to use (if the generated @code{libtool} requires one).
If this is not set, @code{LT_INIT} will try to find out what is
the linker used by @code{CC}.
@end defvar
@defvar LDFLAGS
The flags to be used by @code{libtool} when it links a program. If
this is not set, @code{LT_INIT} will not use any such flags. It
affects only the way @code{LT_INIT} runs tests, not the produced
@code{libtool}.
@end defvar
@defvar LIBS
The libraries to be used by @code{LT_INIT} when it links a
program. If this is not set, @code{LT_INIT} will not use any
such flags. It affects only the way @code{LT_INIT} runs tests,
not the produced @code{libtool}.
@end defvar
@defvar NM
Program to use rather than checking for @command{nm}.
@end defvar
@defvar RANLIB
Program to use rather than checking for @command{ranlib}.
@end defvar
@defvar LN_S
A command that creates a link of a program, a soft-link if possible, a
hard-link otherwise. @code{LT_INIT} will check for a suitable
program if this variable is not set.
@end defvar
@defvar DLLTOOL
Program to use rather than checking for @command{dlltool}. Only meaningful
for Cygwin/MS-Windows.
@end defvar
@defvar OBJDUMP
Program to use rather than checking for @command{objdump}. Only meaningful
for Cygwin/MS-Windows.
@end defvar
@defvar AS
Program to use rather than checking for @command{as}. Only used on
Cygwin/MS-Windows at the moment.
@end defvar
@defvar MANIFEST_TOOL
Program to use rather than checking for @command{mt}, the Manifest Tool.
Only used on Cygwin/MS-Windows at the moment.
@end defvar
@defvar LT_SYS_LIBRARY_PATH
Libtool has heuristics for the system search path for runtime-loaded
libraries. If the guessed default does not match the setup of the host
system, this variable can be used to modify that path list, as follows
(@code{LT_SYS_LIBRARY_PATH} is a colon-delimited list like @code{PATH}):
@itemize @bullet
@item @code{path:}
The heuristically determined paths will be appended after the trailing
colon;
@item @code{:path}
The heuristically determined paths will be prepended before the leading
colon;
@item @code{path::path}
The heuristically determined paths will be inserted between the double
colons;
@item @code{path}
With no dangling colons, the heuristically determined paths will be
ignored entirely.
@end itemize
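For example, to place a hypothetical directory @file{/opt/mylibs/lib}
ahead of the heuristically determined paths when running
@command{configure}:
@example
trick$ LT_SYS_LIBRARY_PATH=/opt/mylibs/lib: ./configure
@end example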
@end defvar
With 1.3 era libtool, if you wanted to know any details of what
libtool had discovered about your architecture and environment, you
had to run the script with @option{--config} and grep through the
results. This idiom was supported up to and including 1.5.x era
libtool, where it was possible to call the generated libtool script
from @file{configure.ac} as soon as @code{LT_INIT} had
completed. However, one of the features of libtool 1.4 was that the
libtool configuration was migrated out of a separate @file{ltconfig}
file, and added to the @code{LT_INIT} macro (nee @code{AC_PROG_LIBTOOL}),
so the results of the configuration tests were available directly to code in
@file{configure.ac}, rendering the call out to the generated libtool
script obsolete.
Starting with libtool 2.0, the multipass generation of the libtool
script has been consolidated into a single @file{config.status} pass,
which happens after all the code in @file{configure.ac} has
completed. The implication of this is that the libtool script does
not exist during execution of code from @file{configure.ac}, and so
obviously it cannot be called for @option{--config} details anymore. If
you are upgrading projects that used this idiom to libtool 2.0 or
newer, you should replace those calls with direct references to the
equivalent Autoconf shell variables that are set by the configure time
tests before being passed to @file{config.status} for inclusion in the
generated libtool script.
@defmac LT_OUTPUT
By default, the configured @file{libtool} script is generated by the
call to the @code{AC_OUTPUT} command, and there is rarely any need to use
@file{libtool} from @file{configure}. However, sometimes it is
necessary to run configure time compile and link tests using
@file{libtool}. You can add @code{LT_OUTPUT} to your
@file{configure.ac} any time after @code{LT_INIT} and any
@code{LT_LANG} calls; that done, @file{libtool} will be created by a
specially generated @file{config.lt} file, and available for use in
later tests.
Also, when @code{LT_OUTPUT} is used, for backwards compatibility with
Automake regeneration rules, @file{config.status} will call
@file{config.lt} to regenerate @file{libtool}, rather than generating
the file itself.
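A minimal sketch of such a @file{configure.ac}, where configure-time
link tests need the generated @file{libtool} script:
@example
LT_INIT
LT_LANG([C++])
LT_OUTPUT
# ./libtool is now available for configure-time tests @dots{}
@end example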
@end defmac
@pindex aclocal
When you invoke the @command{libtoolize} program (@pxref{Invoking
libtoolize}), it will tell you where to find a definition of
@code{LT_INIT}. If you use Automake, the @command{aclocal} program
will automatically add @code{LT_INIT} support to your
@file{configure} script when it sees the invocation of @code{LT_INIT}
in @file{configure.ac}.
Because of these changes, and the runtime version compatibility checks
Libtool now executes, we now advise @strong{against} including a copy of
@file{libtool.m4} (and brethren) in @file{acinclude.m4}. Instead,
you should set your project macro directory with
@code{AC_CONFIG_MACRO_DIRS}. When you @command{libtoolize} your
project, a copy of the relevant macro definitions will be placed in
your @code{AC_CONFIG_MACRO_DIRS}, where @command{aclocal} can reference
them directly from @file{aclocal.m4}.
@node Configure notes
@subsection Platform-specific configuration notes
While Libtool tries to hide as many platform-specific features as possible,
some have to be taken into account when configuring either the Libtool package
or a libtoolized package.
@include notes.texi
@node Distributing
@section Including libtool in your package
In order to use libtool, you need to include the following files with
your package:
@table @file
@item config.guess
@pindex config.guess
Attempt to guess a canonical system name.
@item config.sub
@pindex config.sub
Canonical system name validation subroutine script.
@item install-sh
@pindex install-sh
BSD-compatible @command{install} replacement script.
@item ltmain.sh
@pindex ltmain.sh
A generic script implementing basic libtool functionality.
@end table
Note that the libtool script itself should @emph{not} be included with
your package. @xref{Configuring}.
You should use the @command{libtoolize} program, rather than manually
copying these files into your package.
@menu
* Invoking libtoolize:: @code{libtoolize} command line options.
* Autoconf and LTLIBOBJS:: Autoconf automates LTLIBOBJS generation.
@end menu
@node Invoking libtoolize
@subsection Invoking @command{libtoolize}
@pindex libtoolize
@cindex libtoolize command options
@cindex command options, libtoolize
@cindex options, libtoolize command
The @command{libtoolize} program provides a standard way to add libtool
support to your package. In the future, it may implement better usage
checking, or other features to make libtool even easier to use.
The @command{libtoolize} program has the following synopsis:
@example
libtoolize [@var{option}]@dots{}
@end example
@noindent
and accepts the following options:
@table @option
@item --copy
@itemx -c
Copy files from the libtool data directory rather than creating
symlinks.
@item --debug
Dump a trace of shell script execution to standard output. This
produces a lot of output, so you may wish to pipe it to @command{less} (or
@command{more}) or redirect to a file.
@item --dry-run
@itemx -n
Don't run any commands that modify the file system, just print them
out.
@item --force
@itemx -f
Replace existing libtool files. By default, @command{libtoolize} won't
overwrite existing files.
@item --help
Display a help message and exit.
@item --ltdl [@var{target-directory-name}]
Install libltdl in the @var{target-directory-name} subdirectory of
your package. Normally, the directory is extracted from the argument
to @code{LT_CONFIG_LTDL_DIR} in @file{configure.ac}, though you can
also specify a subdirectory name here if you are not using Autoconf
for example. If @command{libtoolize} can't determine the target
directory, @samp{libltdl} is used as the default.
@item --no-warn
Normally, Libtoolize tries to diagnose use of deprecated libtool macros
and other stylistic issues. If you are deliberately using outdated
calling conventions, this option prevents Libtoolize from explaining
how to update your project's Libtool conventions.
@item --nonrecursive
If passed in conjunction with @option{--ltdl}, this option will cause
the @samp{libltdl} installed by @command{libtoolize} to be set up for
use with a non-recursive @command{automake} build. To make use of it,
you will need to add the following to the @file{Makefile.am} of the
parent project:
@example
## libltdl/ltdl.mk @r{appends to the following variables}
## @r{so we set them here before including it:}
BUILT_SOURCES =
AM_CPPFLAGS =
AM_LDFLAGS =
include_HEADERS =
noinst_LTLIBRARIES =
lib_LTLIBRARIES =
EXTRA_LTLIBRARIES =
EXTRA_DIST =
CLEANFILES =
MOSTLYCLEANFILES =
include libltdl/ltdl.mk
@end example
@noindent
@item --quiet
@itemx -q
Work silently. @samp{libtoolize --quiet} is used by GNU Automake
to add libtool files to your package if necessary.
@item --recursive
If passed in conjunction with @option{--ltdl}, this option will cause
the @command{libtoolize} installed @samp{libltdl} to be set up for use
with a recursive @command{automake} build. To make use of it, you
will need to adjust the parent project's @file{configure.ac}:
@example
AC_CONFIG_FILES([libltdl/Makefile])
@end example
@noindent
and @file{Makefile.am}:
@example
SUBDIRS += libltdl
@end example
@item --subproject
If passed in conjunction with @option{--ltdl}, this option will cause
the @command{libtoolize} installed @samp{libltdl} to be set up for
independent configuration and compilation as a self-contained
subproject. To make use of it, you should arrange for your build to
call @command{libltdl/configure}, and then run @command{make} in the
@file{libltdl} directory (or the subdirectory you put libltdl into).
If your project uses Autoconf, you can use the supplied
@samp{LT_WITH_LTDL} macro, or else call @samp{AC_CONFIG_SUBDIRS}
directly.
Previous releases of @samp{libltdl} built exclusively in this mode,
but now it is the default mode both for backwards compatibility and
because, for example, it is suitable for use in projects that wish to
use @samp{libltdl}, but not use the Autotools for their own build
process.
@item --verbose
@itemx -v
Work noisily!  Give a blow-by-blow account of what
@command{libtoolize} is doing.
@item --version
Print @command{libtoolize} version information and exit.
@end table
@cindex LIBTOOLIZE_OPTIONS
Sometimes it can be useful to pass options to @command{libtoolize} even
though it is called by another program, such as @command{autoreconf}. A
limited number of options are parsed from the environment variable
@code{LIBTOOLIZE_OPTIONS}: currently @option{--debug}, @option{--no-warn},
@option{--quiet} and @option{--verbose}. Multiple options passed in
@code{LIBTOOLIZE_OPTIONS} must be separated with a space, comma or a
colon.
By default, a warning is issued for unknown options found in
@code{LIBTOOLIZE_OPTIONS} unless the first such option is
@option{--no-warn}.  Whereas @command{libtoolize} has always quit
on receipt of an unknown option at the command line, this and all
previous releases of @command{libtoolize} will continue unabated whatever
the content of @code{LIBTOOLIZE_OPTIONS} (modulo some possible warning
messages).
@example
trick$ @kbd{LIBTOOLIZE_OPTIONS=--no-warn,--quiet autoreconf --install}
@end example
@findex AC_CONFIG_MACRO_DIRS
If @command{libtoolize} detects an explicit call to
@code{AC_CONFIG_MACRO_DIRS} (@pxref{Input, , The Autoconf Manual,
autoconf, The Autoconf Manual}) in your @file{configure.ac}, it will
put the Libtool macros in the specified directory.
In the future other Autotools will automatically check the contents of
@code{AC_CONFIG_MACRO_DIRS}, but at the moment it is more portable to
add the macro directory to @code{ACLOCAL_AMFLAGS} in
@file{Makefile.am}, which is where the tools currently look. If
@command{libtoolize} doesn't see @code{AC_CONFIG_MACRO_DIRS}, it too
will honour the first @samp{-I} argument in @code{ACLOCAL_AMFLAGS}
when choosing a directory to store libtool configuration macros in.
It is perfectly sensible to use both @code{AC_CONFIG_MACRO_DIRS} and
@code{ACLOCAL_AMFLAGS}, as long as they are kept in synchronisation.
@example
ACLOCAL_AMFLAGS = -I m4
@end example
When you bootstrap your project with @command{aclocal}, then you will
need to explicitly pass the same macro directory with
@command{aclocal}'s @samp{-I} flag:
@example
trick$ @kbd{aclocal -I m4}
@end example
@findex AC_CONFIG_AUX_DIR
If @command{libtoolize} detects an explicit call to
@code{AC_CONFIG_AUX_DIR} (@pxref{Input, , The Autoconf Manual,
autoconf, The Autoconf Manual}) in your @file{configure.ac}, it
will put the other support files in the specified directory.
Otherwise they too end up in the project root directory.
Unless @option{--no-warn} is passed, @command{libtoolize} displays
hints for adding libtool support to your package, as well.
@node Autoconf and LTLIBOBJS
@subsection Autoconf and @code{LTLIBOBJS}
People used to add code like the following to their
@file{configure.ac}:
@cindex LTLIBOBJS
@example
LTLIBOBJS=`echo "$LIBOBJS" | sed 's/\.[^.]* /.lo /g;s/\.[^.]*$/.lo/'`
AC_SUBST([LTLIBOBJS])
@end example
@noindent
This is no longer required (since Autoconf 2.54), and doesn't take
Automake's deansification support into account either, so it doesn't work
correctly even with ancient Autoconfs!
Provided you are using a recent (2.54 or better) incarnation of
Autoconf, the call to @code{AC_OUTPUT} takes care of setting
@code{LTLIBOBJS} up correctly, so you can simply delete such snippets
from your @file{configure.ac} if you had them.
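If your @file{configure.ac} uses @code{AC_REPLACE_FUNCS} (or otherwise
adds to @code{LIBOBJS}), you can then simply refer to the substituted
variable from a hypothetical @file{Makefile.am}:
@example
lib_LTLIBRARIES = libfoo.la
libfoo_la_SOURCES = foo.c
libfoo_la_LIBADD = $(LTLIBOBJS)
@end example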
@node Static-only libraries
@section Static-only libraries
@cindex debugging libraries
@cindex developing libraries
@cindex double-compilation, avoiding
@cindex avoiding shared libraries
@cindex eliding shared libraries
@cindex using shared libraries, not
@cindex shared libraries, not using
@cindex time, saving
@cindex saving time
When you are developing a package, it is often worthwhile to configure
your package with the @option{--disable-shared} flag, or to override the
defaults for @code{LT_INIT} by using the @code{disable-shared} option
(@pxref{LT_INIT, , The @code{LT_INIT} macro}). This prevents libtool
from building shared libraries, which has several advantages:
@itemize @bullet
@item
compilation is twice as fast, which can speed up your development cycle,
@item
debugging is easier because you don't need to deal with any complexities
added by shared libraries, and
@item
you can see how libtool behaves on static-only platforms.
@end itemize
You may want to put a small note in your package @file{README} to let
other developers know that @option{--disable-shared} can save them time.
The following example note is taken from the GIMP@footnote{GNU Image
Manipulation Program, for those who haven't taken the plunge. See
@url{http://www.gimp.org/}.} distribution @file{README}:
@example
The GIMP uses GNU Libtool to build shared libraries on a
variety of systems. While this is very nice for making usable
binaries, it can be a pain when trying to debug a program. For that
reason, compilation of shared libraries can be turned off by
specifying the @option{--disable-shared} option to @file{configure}.
@end example
@node Other languages
@chapter Using libtool with other languages
@cindex C, not using
@cindex languages, non-C
@cindex C++, using
Libtool was first implemented to add support for writing shared
libraries in the C language. However, over time, libtool is being
integrated with other languages, so that programmers are free to reap
the benefits of shared libraries in their favorite programming language.
This chapter describes how libtool interacts with other languages,
and what special considerations you need to make if you do not use C.
@menu
* C++ libraries:: Writing libraries for C++
* Tags:: Tags
@end menu
@node C++ libraries
@section Writing libraries for C@code{++}
@cindex trouble with C++
@cindex pitfalls using C++
@cindex C++, pitfalls
Creating libraries of C++ code should be a fairly straightforward
process, because its object files differ from C ones in only three ways:
@enumerate 1
@item
Because of name mangling, C++ libraries are only usable by the C++
compiler that created them. This decision was made by the designers of
C++ to protect users from conflicting implementations of
features such as constructors, exception handling, and RTTI.
@item
On some systems, the C++ compiler must take special actions for the
dynamic linker to run dynamic (i.e., run-time) initializers. This means
that we should not call @command{ld} directly to link such libraries, and
we should use the C++ compiler instead.
@item
C++ compilers will link some Standard C++ library in by default, but
libtool does not know what these libraries are, so it cannot even run
the inter-library dependence analyzer to check how to link it in.
Therefore, running @command{ld} to link a C++ program or library is doomed
to fail.
@end enumerate
Because of these three issues, Libtool has been designed to always use
the C++ compiler to compile and link C++ programs and libraries. In
some instances the @code{main()} function of a program must also be
compiled with the C++ compiler for static C++ objects to be properly
initialized.
@node Tags
@section Tags
@cindex tag names
@cindex language names
@cindex inferring tags
Libtool supports multiple languages through the use of tags. Technically
a tag corresponds to a set of configuration variables associated with a
language. These variables tell @command{libtool} how it should create
objects and libraries for each language.
Tags are defined at @command{configure}-time for each language activated
in the package (see @code{LT_LANG} in @ref{LT_INIT}). Here is the
correspondence between language names and tag names.
@multitable {Windows Resource} {Tag name}
@item Language name @tab Tag name
@item C @tab CC
@item C++ @tab CXX
@item Java @tab GCJ
@item Fortran 77 @tab F77
@item Fortran @tab FC
@item Go @tab GO
@item Windows Resource @tab RC
@end multitable
@command{libtool} tries to automatically infer what tag to use from
the compiler command being used to compile or link. If it can't infer
a tag, then it defaults to the configuration for the @code{C} language.
The tag can also be specified using @command{libtool}'s
@option{--tag=@var{tag}} option (@pxref{Invoking libtool}). It is a good
idea to do so in @file{Makefile} rules, because that will allow users to
substitute the compiler without relying on @command{libtool} inference
heuristics. When no tag is specified, @command{libtool} will default
to @code{CC}; this tag always exists.
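For example, hand-written @file{Makefile} rules for C++ might specify
the tag explicitly (file names are hypothetical):
@example
libtool --tag=CXX --mode=compile g++ -c foo.cpp -o foo.lo
libtool --tag=CXX --mode=link g++ -o libfoo.la foo.lo -rpath /usr/local/lib
@end example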
Finally, the set of tags available in a particular project can be
retrieved by tracing for the @code{LT_SUPPORTED_TAG} macro (@pxref{Trace
interface}).
@node Versioning
@chapter Library interface versions
@cindex dynamic dependencies
@cindex dependency versioning
@cindex shared library versions
The most difficult issue introduced by shared libraries is that of
creating and resolving runtime dependencies. Dependencies on programs
and libraries are often described in terms of a single name, such as
@command{sed}. So, one may say ``libtool depends on sed,'' and that is
good enough for most purposes.
However, when an interface changes regularly, we need to be more
specific: ``Gnus 5.1 requires Emacs 19.28 or above.'' Here, the
description of an interface consists of a name, and a ``version
number.''
Even that sort of description is not accurate enough for some purposes.
What if Emacs 20 changes enough to break Gnus 5.1?
The same problem exists in shared libraries: we require a formal version
system to describe the sorts of dependencies that programs have on
shared libraries, so that the dynamic linker can guarantee that programs
are linked only against libraries that provide the interface they
require.
@menu
* Interfaces:: What are library interfaces?
* Libtool versioning:: Libtool's versioning system.
* Updating version info:: Changing version information before releases.
* Release numbers:: Breaking binary compatibility for aesthetics.
@end menu
@node Interfaces
@section What are library interfaces?
@cindex library interfaces
Interfaces for libraries may be any of the following (and more):
@itemize @bullet
@item
global variables: both names and types
@item
global functions: argument types and number, return types, and function names
@item
standard input, standard output, standard error, and file formats
@item
sockets, pipes, and other inter-process communication protocol formats
@end itemize
Note that static functions do not count as interfaces, because they are
not directly available to the user of the library.
@node Libtool versioning
@section Libtool's versioning system
@cindex libtool library versions
@cindex formal versioning
@cindex versioning, formal
Libtool has its own formal versioning system. It is not as flexible as
some, but it is definitely the simplest of the more powerful versioning
systems.
Think of a library as exporting several sets of interfaces, arbitrarily
represented by integers. When a program is linked against a library, it
may use any subset of those interfaces.
Libtool's description of the interfaces that a program uses is simple:
it encodes the least and the greatest interface numbers in the resulting
binary (@var{first-interface}, @var{last-interface}).
The dynamic linker is guaranteed that if a library supports @emph{every}
interface number between @var{first-interface} and @var{last-interface},
then the program can be relinked against that library.
Note that this can cause problems because libtool's compatibility
requirements are actually stricter than is necessary.
Say @file{libhello} supports interfaces 5, 16, 17, 18, and 19, and that
libtool is used to link @file{test} against @file{libhello}.
Libtool encodes the numbers 5 and 19 in @file{test}, and the dynamic
linker will only link @file{test} against libraries that support
@emph{every} interface between 5 and 19. So, the dynamic linker refuses
to link @file{test} against @file{libhello}!
In order to eliminate this problem, libtool only allows libraries to
declare consecutive interface numbers. So, @file{libhello} can declare at
most that it supports interfaces 16 through 19. Then, the dynamic
linker will link @file{test} against @file{libhello}.
So, libtool library versions are described by three integers:
@table @var
@item current
The most recent interface number that this library implements.
@item revision
The implementation number of the @var{current} interface.
@item age
The difference between the newest and oldest interfaces that this
library implements. In other words, the library implements all the
interface numbers in the range from number @code{@var{current} -
@var{age}} to @code{@var{current}}.
@end table
If two libraries have identical @var{current} and @var{age} numbers,
then the dynamic linker chooses the library with the greater
@var{revision} number.
@node Updating version info
@section Updating library version information
If you want to use libtool's versioning system, then you must specify
the version information to libtool using the @option{-version-info} flag
during link mode (@pxref{Link mode}).
This flag accepts an argument of the form
@samp{@var{current}[:@var{revision}[:@var{age}]]}. So, passing
@option{-version-info 3:12:1} sets @var{current} to 3, @var{revision} to
12, and @var{age} to 1.
If either @var{revision} or @var{age} is omitted, it defaults to 0.
Also note that @var{age} must be less than or equal to the @var{current}
interface number.
Here are a set of rules to help you update your library version
information:
@enumerate 1
@item
Start with version information of @samp{0:0:0} for each libtool library.
@item
Update the version information only immediately before a public release
of your software. More frequent updates are unnecessary, and only
guarantee that the current interface number gets larger faster.
@item
If the library source code has changed at all since the last update,
then increment @var{revision} (@samp{@var{c}:@var{r}:@var{a}} becomes
@samp{@var{c}:@math{r+1}:@var{a}}).
@item
If any interfaces have been added, removed, or changed since the last
update, increment @var{current}, and set @var{revision} to 0.
@item
If any interfaces have been added since the last public release, then
increment @var{age}.
@item
If any interfaces have been removed or changed since the last public
release, then set @var{age} to 0.
@end enumerate
@strong{@emph{Never}} try to set the interface numbers so that they
correspond to the release number of your package. This is an abuse that
only fosters misunderstanding of the purpose of library versions.
Instead, use the @option{-release} flag (@pxref{Release numbers}), but be
warned that no release of your package will then be binary compatible
with any other release.
The following explanation may help to understand the above rules a bit
better: consider that there are three possible kinds of reactions from
users of your library to changes in a shared library:
@enumerate 1
@item
Programs using the previous version may use the new version as a
drop-in replacement, and programs using the new version can also work
with the previous one.  In other words, no recompiling or relinking
is needed.  In this case, bump @var{revision} only; don't touch
@var{current} or @var{age}.
@item
Programs using the previous version may use the new version as a
drop-in replacement, but programs using the new version may use APIs not
present in the previous one. In other words, a program linking against
the new version may fail with ``unresolved symbols'' if linking against
the old version at runtime: set @var{revision} to 0, bump @var{current}
and @var{age}.
@item
Programs may need to be changed, recompiled, and relinked in order to use
the new version. Bump @var{current}, set @var{revision} and @var{age}
to 0.
@end enumerate
@noindent
In the above description, @emph{programs} using the library in question
may also be replaced by other libraries using it.
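For illustration only (the release descriptions and numbers here are
hypothetical), a library that starts at @samp{0:0:0} and follows the
rules above might move through these @option{-version-info} settings
over successive releases:
@example
0:0:0   initial release
0:1:0   bug fixes only: bump @var{revision}
1:0:1   interfaces added: bump @var{current} and @var{age}, reset @var{revision}
2:0:0   interfaces removed: bump @var{current}, reset @var{revision} and @var{age}
@end example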
@node Release numbers
@section Managing release information
Often, people want to encode the name of the package release into the
shared library so that it is obvious to the user what package their
programs are linked against. This convention is used especially on
GNU/Linux:
@example
trick$ @kbd{ls /usr/lib/libbfd*}
/usr/lib/libbfd.a /usr/lib/libbfd.so.2.7.0.2
/usr/lib/libbfd.so
trick$
@end example
On @samp{trick}, @file{/usr/lib/libbfd.so} is a symbolic link to
@file{libbfd.so.2.7.0.2}, which was distributed as a part of
@samp{binutils-2.7.0.2}.
Unfortunately, this convention conflicts directly with libtool's idea of
library interface versions, because the library interface rarely changes
at the same time that the release number does, and the library suffix is
never the same across all platforms.
So, to accommodate both views, you can use the @option{-release}
flag to set release information for libraries for which you do not
want to use @option{-version-info}. For the @file{libbfd} example, the
next release that uses libtool should be built with @samp{-release
2.9.0}, which will produce the following files on GNU/Linux:
@example
trick$ @kbd{ls /usr/lib/libbfd*}
/usr/lib/libbfd-2.9.0.so /usr/lib/libbfd.a
/usr/lib/libbfd.so
trick$
@end example
In this case, @file{/usr/lib/libbfd.so} is a symbolic link to
@file{libbfd-2.9.0.so}. This makes it obvious that the user is dealing
with @samp{binutils-2.9.0}, without compromising libtool's idea of
interface versions.
Note that this option causes a modification of the library name, so do
not use it unless you want to break binary compatibility with any past
library releases. In general, you should only use @option{-release} for
package-internal libraries or for ones whose interfaces change very
frequently.
@node Library tips
@chapter Tips for interface design
@cindex library interfaces, design
@cindex design of library interfaces
Writing a good library interface takes a lot of practice and thorough
understanding of the problem that the library is intended to solve.
If you design a good interface, it won't have to change often, you won't
have to keep updating documentation, and users won't have to keep
relearning how to use the library.
Here is a brief list of tips for library interface design that may
help you in your exploits:
@table @asis
@item Plan ahead
Try to make every interface truly minimal, so that you won't need to
delete entry points very often.
@item Avoid interface changes
@cindex renaming interface functions
Some people love redesigning and changing entry points just for the heck
of it (note: @emph{renaming} a function is considered changing an entry
point). Don't be one of those people. If you must redesign an
interface, then try to leave compatibility functions behind so that
users don't need to rewrite their existing code.
@item Use opaque data types
@cindex opaque data types
The fewer data type definitions a library user has access to, the
better. If possible, design your functions to accept a generic pointer
(that you can cast to an internal data type), and provide access
functions rather than allowing the library user to directly manipulate
the data.
That way, you have the freedom to change the data structures without
changing the interface.
This is essentially the same thing as using abstract data types and
inheritance in an object-oriented system; a minimal sketch of this
approach follows the table below.
@item Use header files
@cindex header files
If you are careful to document each of your library's global functions
and variables in header files, and include them in your library source
files, then the compiler will let you know if you make any interface
changes by accident (@pxref{C header files}).
@item Use the @code{static} keyword (or equivalent) whenever possible
@cindex global functions
The fewer global functions your library has, the more flexibility you'll
have in changing them. Static functions and variables may change forms
as often as you like@dots{} your users cannot access them, so they
aren't interface changes.
@item Be careful with array dimensions
The number of elements in a global array is part of an interface, even
if the header just declares @code{extern int foo[];}. This is because
on i386 and some other SVR4/ELF systems, when an application
references data in a shared library the size of that data (whatever
its type) is included in the application executable. If you might
want to change the size of an array or string, then provide a pointer,
not the actual array.
@end table
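As a minimal sketch of the @samp{opaque data types} tip above (the
@code{counter} type and its functions are invented purely for this
illustration), a header might expose only an incomplete type and a few
accessor functions:
@example
/* counter.h -- the definition of struct counter stays in counter.c.  */
typedef struct counter counter;       /* opaque to library users */

counter *counter_new (void);
void     counter_increment (counter *c);
int      counter_value (const counter *c);
void     counter_free (counter *c);
@end example
Because callers never see the members of @code{struct counter}, its
layout can change between releases without changing the library
interface.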
@menu
* C header files:: How to write portable include files.
@end menu
@node C header files
@section Writing C header files
@cindex portable C headers
@cindex C header files, portable
@cindex include files, portable
Writing portable C header files can be difficult, since they may be read
by different types of compilers:
@table @asis
@item C++ compilers
C++ compilers require that functions be declared with full prototypes,
since C++ is more strongly typed than C@. C functions and variables also
need to be declared with the @code{extern "C"} directive, so that the
names aren't mangled. @xref{C++ libraries}, for other issues relevant
to using C++ with libtool.
@item ANSI C compilers
ANSI C compilers are not as strict as C++ compilers, but functions
should be prototyped to avoid unnecessary warnings when the header file
is @code{#include}d.
@item non-ANSI C compilers
Non-ANSI compilers will report errors if functions are prototyped.
@end table
These complications mean that your library interface headers must use
some C preprocessor magic to be usable by each of the above compilers.
@file{foo.h}, defined in the @file{tests/demo.at} Autotest test group of
the libtool distribution, serves as an example of how to write a header file that
can be safely installed in a system directory.
Here are the relevant portions of that file:
@example
#ifndef FOO_H
#define FOO_H
#ifdef __cplusplus
extern "C" @{
#endif
int foo (void);
int hello (void);
#ifdef __cplusplus
@}
#endif
#endif /* !FOO_H */
@end example
This can also be achieved by utilizing macros:
@example
/* BEGIN_C_DECLS should be used at the beginning of your declarations,
so that C++ compilers don't mangle their names. Use END_C_DECLS at
the end of C declarations. */
#undef BEGIN_C_DECLS
#undef END_C_DECLS
#ifdef __cplusplus
# define BEGIN_C_DECLS extern "C" @{
# define END_C_DECLS @}
#else
# define BEGIN_C_DECLS /* empty */
# define END_C_DECLS /* empty */
#endif
/* PARAMS is a macro used to wrap function prototypes, so that
compilers that don't understand ANSI C prototypes still work,
and ANSI C compilers can issue warnings about type mismatches. */
#undef PARAMS
#if defined __STDC__ || defined _AIX \
|| (defined __mips && defined _SYSTYPE_SVR4) \
|| defined WIN32 || defined __cplusplus
# define PARAMS(protos) protos
#else
# define PARAMS(protos) ()
#endif
@end example
These macros can be used in @file{foo.h} as follows:
@example
#ifndef FOO_H
#define FOO_H
/* The above macro definitions. */
#include "@dots{}"
BEGIN_C_DECLS
int foo PARAMS((void));
int hello PARAMS((void));
END_C_DECLS
#endif /* !FOO_H */
@end example
Note that the @file{#ifndef FOO_H} prevents the body of @file{foo.h}
from being read more than once in a given compilation.
Also, the only things that must go outside the
@code{BEGIN_C_DECLS}/@code{END_C_DECLS} pair are @code{#include} lines.
Strictly speaking it is only C symbol names that need to be protected,
but your header files will be more maintainable if you have a single
pair of these macros around the majority of the header contents.
You should copy these definitions of @code{PARAMS}, @code{BEGIN_C_DECLS},
and @code{END_C_DECLS} into your own headers.  Then, you may use them to
create header files that are valid for C++, ANSI, and non-ANSI
compilers@footnote{We used to recommend @code{__P},
@code{__BEGIN_DECLS} and @code{__END_DECLS}. This was bad advice since
symbols (even preprocessor macro names) that begin with an underscore
are reserved for the use of the compiler.}.
Do not be naive about writing portable code.  Following the tips given
above will help you avoid the most obvious problems, but there are
definitely other subtle portability issues. You may need to cope with
some of the following issues:
@itemize @bullet
@item
Pre-ANSI compilers do not always support the @code{void *} generic
pointer type, and so need to use @code{char *} in its place.
@item
The @code{const}, @code{inline} and @code{signed} keywords are not
supported by some compilers, especially pre-ANSI compilers.
@item
The @code{long double} type is not supported by many compilers.
@end itemize
@node Inter-library dependencies
@chapter Inter-library dependencies
@cindex dependencies between libraries
@cindex inter-library dependencies
By definition, every shared library system provides a way for
executables to depend on libraries, so that symbol resolution is
deferred until runtime.
An @dfn{inter-library dependency} is where a library depends on
other libraries. For example, if the libtool library @file{libhello}
uses the @code{cos} function, then it has an inter-library dependency
on @file{libm}, the math library that implements @code{cos}.
Some shared library systems provide this feature in an
internally-consistent way: these systems allow chains of dependencies of
potentially infinite length.
However, most shared library systems are restricted in that they only
allow a single level of dependencies. In these systems, programs may
depend on shared libraries, but shared libraries may not depend on other
shared libraries.
In any event, libtool provides a simple mechanism for you to declare
inter-library dependencies: for every library @file{lib@var{name}} that
your own library depends on, simply add a corresponding
@code{-l@var{name}} option to the link line when you create your
library. To make an example of our @file{libhello} that depends on
@file{libm}:
@example
burger$ @kbd{libtool --mode=link gcc -g -O -o libhello.la foo.lo hello.lo \
-rpath /usr/local/lib -lm}
burger$
@end example
When you link a program against @file{libhello}, you don't need to
specify the same @samp{-l} options again: libtool will do that for you,
to guarantee that all the required libraries are found. This
restriction is only necessary to preserve compatibility with static
library systems and simple dynamic library systems.
Some platforms, such as Windows, do not even allow you this
flexibility. In order to build a shared library, it must be entirely
self-contained or it must have dependencies known at link time (that is,
have references only to symbols that are found in the @file{.lo} files
or the specified @samp{-l} libraries), and you need to specify the
@option{-no-undefined} flag. By default, libtool builds only static
libraries on these kinds of platforms.
The simple-minded inter-library dependency tracking code of libtool
releases prior to 1.2 was disabled because it was not clear when it was
possible to link one library with another, and complex failures would
occur. A more complex implementation of this concept was re-introduced
before release 1.3, but it has not been ported to all platforms that
libtool supports. The default, conservative behavior is to avoid
linking one library with another, introducing their inter-dependencies
only when a program is linked with them.
@node Dlopened modules
@chapter Dlopened modules
@findex dlopen
@findex dlsym
@findex dlclose
@findex shl_load
@cindex dynamic linking, applications
@cindex dlopening modules
@cindex modules, dynamic
@cindex application-level dynamic linking
It can sometimes be confusing to discuss @dfn{dynamic linking}, because
the term is used to refer to two different concepts:
@enumerate 1
@item
Compiling and linking a program against a shared library, which is
resolved automatically at run time by the dynamic linker. In this
process, dynamic linking is transparent to the application.
@item
The application calling functions such as @code{dlopen} that load
arbitrary, user-specified modules at runtime. This type of dynamic
linking is explicitly controlled by the application.
@end enumerate
To mitigate confusion, this manual refers to the second type of dynamic
linking as @dfn{dlopening} a module.
The main benefit to dlopening object modules is the ability to access
compiled object code to extend your program, rather than using an
interpreted language. In fact, dlopen calls are frequently used in
language interpreters to provide an efficient way to extend the
language.
Libtool provides support for dlopened modules. However, you should
indicate that your package is willing to use such support, by using the
@code{LT_INIT} option @samp{dlopen} in @file{configure.ac}. If this
option is not given, libtool will assume no dlopening mechanism is
available, and will try to simulate it.
This chapter discusses how you as a dlopen application developer might
use libtool to generate dlopen-accessible modules.
@menu
* Building modules:: Creating dlopenable objects and libraries.
* Dlpreopening:: Dlopening that works on static platforms.
* Linking with dlopened modules:: Using dlopenable modules in libraries.
* Finding the dlname:: Choosing the right file to @code{dlopen}.
* Dlopen issues:: Unresolved problems that need your attention.
@end menu
@node Building modules
@section Building modules to dlopen
On some operating systems, a program symbol must be specially declared
in order to be dynamically resolved with the @code{dlsym} (or
equivalent) function. Libtool provides the @option{-export-dynamic} and
@option{-module} link flags (@pxref{Link mode}), for you to make that
declaration. You need to use these flags if you are linking an
application program that dlopens other modules or a libtool library
that will also be dlopened.
For example, if we wanted to build a shared library, @file{hello},
that would later be dlopened by an application, we would add
@option{-module} to the other link flags:
@example
burger$ @kbd{libtool --mode=link gcc -module -o hello.la foo.lo \
hello.lo -rpath /usr/local/lib -lm}
burger$
@end example
If symbols from your @emph{executable} are needed to satisfy unresolved
references in a library you want to dlopen you will have to use the flag
@option{-export-dynamic}. You should use @option{-export-dynamic} while
linking the executable that calls dlopen:
@example
burger$ @kbd{libtool --mode=link gcc -export-dynamic -o helldl main.o}
burger$
@end example
@node Dlpreopening
@section Dlpreopening
Libtool provides special support for dlopening libtool object and
libtool library files, so that their symbols can be resolved
@emph{even on platforms without any @code{dlopen} and @code{dlsym}
functions}.
Consider the following alternative ways of loading code into your
program, in order of increasing ``laziness'':
@enumerate 1
@item
Linking against object files that become part of the program executable,
whether or not they are referenced. If an object file cannot be found,
then the compile time linker refuses to create the executable.
@item
Declaring a static library to the linker, so that it is searched at link
time to satisfy any undefined references in the above object
files. If the static library cannot be found, then the compile time
linker refuses to create the executable.
@item
Declaring a shared library to the runtime linker, so that it is searched
at runtime to satisfy any undefined references in the above
files. If the shared library cannot be found, then the dynamic linker
aborts the program before it runs.
@item
Dlopening a module, so that the application can resolve its own,
dynamically-computed references. If there is an error opening the
module, or the module is not found, then the application can recover
without crashing.
@end enumerate
Libtool emulates @option{-dlopen} on static platforms by linking objects
into the program at compile time, and creating data structures that
represent the program's symbol table. In order to use this feature,
you must declare the objects you want your application to dlopen by
using the @option{-dlopen} or @option{-dlpreopen} flags when you link your
program (@pxref{Link mode}).
@deftp {Data Type} {lt_dlsymlist} typedef struct @
@{ @w{const char *@code{name};} @w{void *@code{address};} @} lt_dlsymlist
The @code{name} attribute is a null-terminated character string of the
symbol name, such as @code{"fprintf"}. The @code{address} attribute is
a generic pointer to the appropriate object, such as @code{&fprintf}.
@end deftp
@deftypevar {const lt_dlsymlist } lt_preloaded_symbols[]
An array of @code{lt_dlsymlist} structures, representing all the preloaded
symbols linked into the program proper. For each module
@option{-dlpreopen}ed by the Libtool linked program
there is an element with the @code{name} of the module and an @code{address}
of @code{0}, followed by all symbols exported from this file.
For the executable itself the special name @samp{@@PROGRAM@@} is used.
The last element of all has a @code{name} and @code{address} of
@code{0}.
To facilitate inclusion of symbol lists into libraries,
@code{lt_preloaded_symbols} is @samp{#define}d to a suitably unique name
in @file{ltdl.h}.
This variable may not be declared @code{const} on some systems due to
relocation issues.
@end deftypevar
Some compilers may allow identifiers that are not valid in ANSI C, such
as dollar signs. Libtool only recognizes valid ANSI C symbols (an
initial ASCII letter or underscore, followed by zero or more ASCII
letters, digits, and underscores), so non-ANSI symbols will not appear
in @code{lt_preloaded_symbols}.
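As a rough illustration only (the file that libtool actually generates
differs in detail, and @code{lt_preloaded_symbols} is really the
mangled name mentioned above), preloading a module @file{hello.la}
that exports @code{foo} and @code{hello} produces a table shaped
something like this:
@example
const lt_dlsymlist lt_preloaded_symbols[] =
@{
  @{ "@@PROGRAM@@", 0 @},          /* the executable itself */
  @{ "hello.la", 0 @},             /* a -dlpreopened module@dots{} */
  @{ "foo", (void *) &foo @},      /* @dots{}and its exported symbols */
  @{ "hello", (void *) &hello @},
  @{ 0, 0 @}                       /* terminator */
@};
@end example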
@deftypefun int lt_dlpreload (const lt_dlsymlist *@var{preloaded})
Register the list of preloaded modules @var{preloaded}.
If @var{preloaded} is @code{NULL}, then all previously registered
symbol lists, except the list set by @code{lt_dlpreload_default},
are deleted. Return 0 on success.
@end deftypefun
@deftypefun int lt_dlpreload_default (const lt_dlsymlist *@var{preloaded})
Set the default list of preloaded modules to @var{preloaded}, which
won't be deleted by @code{lt_dlpreload}. Note that this function does
@emph{not} require libltdl to be initialized using @code{lt_dlinit} and
can be used in the program to register the default preloaded modules.
Instead of calling this function directly, most programs will use the
macro @code{LTDL_SET_PRELOADED_SYMBOLS}.
Return 0 on success.
@end deftypefun
@defmac LTDL_SET_PRELOADED_SYMBOLS
Set the default list of preloaded symbols.
Should be used in your program to initialize libltdl's
list of preloaded modules.
@example
#include <ltdl.h>
int main() @{
  /* ... */
  LTDL_SET_PRELOADED_SYMBOLS();
  /* ... */
@}
@end example
@end defmac
@deftypefn {Function Type} {int} lt_dlpreload_callback_func (lt_dlhandle @var{handle})
Functions of this type can be passed to @code{lt_dlpreload_open},
which in turn will call back into a function thus passed for each
preloaded module that it opens.
@end deftypefn
@deftypefun int lt_dlpreload_open (@w{const char *@var{originator},} @w{lt_dlpreload_callback_func *@var{func})}
Load all of the preloaded modules for @var{originator}. For every
module opened in this way, call @var{func}.
@noindent
To open all of the modules preloaded into @file{libhell.la}
(presumably from within the @file{libhell.a} initialisation code):
@example
#define preloaded_symbols lt_libhell_LTX_preloaded_symbols

static int hell_preload_callback (lt_dlhandle handle);

int
hell_init (void)
@{
  @dots{}
  if (lt_dlpreload (&preloaded_symbols) == 0)
    @{
      lt_dlpreload_open ("libhell", hell_preload_callback);
    @}
  @dots{}
@}
@end example
@noindent
Note that to prevent clashes between multiple preloaded modules, the
preloaded symbols are accessed via a mangled symbol name: to get the
symbols preloaded into @samp{libhell}, you must prefix
@samp{preloaded_symbols} with @samp{lt_}; the originator name,
@samp{libhell} in this case; and @samp{_LTX_}. That is,
@samp{lt_libhell_LTX_preloaded_symbols} here.
@end deftypefun
@node Linking with dlopened modules
@section Linking with dlopened modules
@cindex linking, dlopen
@cindex linking, dlpreopen
When, say, an interpreter application uses dlopened modules to extend
the list of methods it provides, an obvious abstraction for the
maintainers of the interpreter is to have all methods (including the
built-in ones supplied with the interpreter) accessed through
dlopen. For one thing, the dlopening functionality will be tested
even during routine invocations. For another, only one subsystem has
to be written for getting methods into the interpreter.
The downside of this abstraction is, of course, that environments that
provide only static linkage can't even load the intrinsic interpreter
methods. Not so! We can statically link those methods by
@strong{dlpreopening} them.
Unfortunately, since platforms such as AIX and cygwin require
that all library symbols be resolved at compile time, the
interpreter maintainers will need to provide a library for both their own
dlpreopened modules and for third-party modules loaded by dlopen to link
against.  In itself, that is not so bad, except that the interpreter too
must provide those same symbols, otherwise it will be impossible to resolve
all the symbols required by the modules as they are loaded. Things
are even worse if the code that loads the modules for the interpreter
is itself in a library -- and that is usually the case for any
non-trivial application. Modern platforms take care of this by
automatically loading all of a module's dependency libraries as the
module is loaded (libltdl can do this even on platforms that can't do
it by themselves).  In the end, this leads to problems with duplicate
symbols, prevents modules from loading, and prevents the
application from compiling when modules are preloaded.
@example
,-------------.     ,------------------.     ,------------------.
| Interpreter |---->|      Module      |---->|   Third-party    |
`-------------'     |      Loader      |     | Dlopened Modules |
                    |        |         |     `------------------'
                    | ,------v-------. |              |
                    | | Dlpreopened  | |              |
                    | |   Modules    | |              |
                    | `--------------' |              |
                    |        |         |              |
                    | ,------v-------. |     ,--------v---------.
                    | |    Module    | |     |      Module      |
                    | |  Interface   | |     |    Interface     |
                    | |   Library    | |     |     Library      |
                    | `--------------' |     `------------------'
                    `------------------'
@end example
Libtool has the concept of @dfn{weak library interfaces} to circumvent
this problem. Recall that the code that dlopens method-provider
modules for the interpreter application resides in a library: All of
the modules and the dlopener library itself should be linked against
the common library that resolves the module symbols at compile time.
To guard against duplicate symbol definitions, and for dlpreopened
modules to work at all in this scenario, the dlopener library must
declare that it provides a weak library interface to the common
symbols in the library it shares with the modules. That way, when
@command{libtool} links the @strong{Module Loader} library with some
@strong{Dlpreopened Modules} that were in turn linked against the
@strong{Module Interface Library}, it knows that the @strong{Module
Loader} provides an already loaded @strong{Module Interface Library}
to resolve symbols for the @strong{Dlpreopened Modules}, and doesn't
ask the compiler driver to link an identical @strong{Module Interface
Library} dependency library too.
In conjunction with Automake, the @file{Makefile.am} for the
@strong{Module Loader} might look like this:
@example
lib_LTLIBRARIES = libinterface.la libloader.la
libinterface_la_SOURCES = interface.c interface.h
libinterface_la_LDFLAGS = -version-info 3:2:1
libloader_la_SOURCES = loader.c
libloader_la_LDFLAGS = -weak libinterface.la \
-version-info 3:2:1 \
-dlpreopen ../modules/intrinsics.la
libloader_la_LIBADD = $(libinterface_la_OBJECTS)
@end example
And the @file{Makefile.am} for the @file{intrinsics.la} module in a
sibling @file{modules} directory might look like this:
@example
AM_CPPFLAGS = -I$(srcdir)/../libloader
AM_LDFLAGS = -no-undefined -module -avoid-version \
-export-dynamic
noinst_LTLIBRARIES = intrinsics.la
intrinsics_la_LIBADD = ../libloader/libinterface.la
../libloader/libinterface.la:
	cd ../libloader && $(MAKE) $(AM_MAKEFLAGS) libinterface.la
@end example
@cindex @option{-weak} option
For a more complex example, see the sources of @file{libltdl} in the
Libtool distribution, which is built with the help of the @option{-weak}
option.
@node Finding the dlname
@section Finding the correct name to dlopen
@cindex names of dynamic modules
@cindex dynamic modules, names
After a library has been linked with @option{-module}, it can be dlopened.
Unfortunately, because of the variation in library names,
your package needs to determine the correct file to dlopen.
The most straightforward and flexible implementation is to determine the
name at runtime, by finding the installed @file{.la} file, and searching
it for the following lines:
@example
# The name that we can @code{dlopen}.
dlname='@var{dlname}'
@end example
If @var{dlname} is empty, then the library cannot be dlopened.
Otherwise, it gives the dlname of the library. So, if the library was
installed as @file{/usr/local/lib/libhello.la}, and the @var{dlname} was
@file{libhello.so.3}, then @file{/usr/local/lib/libhello.so.3} should be
dlopened.
If your program uses this approach, then it should search the
directories listed in the @code{LD_LIBRARY_PATH}@footnote{@code{LIBPATH}
on AIX, and @code{SHLIB_PATH} on HP-UX.} environment variable, as well as
the directory where libraries will eventually be installed. Searching
this variable (or equivalent) will guarantee that your program can find
its dlopened modules, even before installation, provided you have linked
them using libtool.
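A minimal sketch of this approach follows; error handling is
deliberately thin, and the fixed buffer size is purely illustrative.
It scans an installed @file{.la} file for the @code{dlname=} line:
@example
#include <stdio.h>
#include <string.h>

/* Copy the dlname found in LA_FILE into BUF (of size BUFLEN);
   return 0 on success, or -1 if the file cannot be read, the line
   is missing, or the dlname is empty (not dlopenable).  */
static int
get_dlname (const char *la_file, char *buf, size_t buflen)
@{
  char line[1024];
  FILE *fp = fopen (la_file, "r");

  if (!fp)
    return -1;

  while (fgets (line, sizeof line, fp))
    if (strncmp (line, "dlname='", 8) == 0)
      @{
        char *end = strchr (line + 8, '\'');
        size_t len = end ? (size_t) (end - (line + 8)) : 0;

        if (len == 0 || len >= buflen)
          break;
        memcpy (buf, line + 8, len);
        buf[len] = '\0';
        fclose (fp);
        return 0;
      @}

  fclose (fp);
  return -1;
@}
@end example
A real implementation would combine the @var{dlname} with each of the
directories described above until the module is found.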
@node Dlopen issues
@section Unresolved dlopen issues
@cindex pitfalls with dlopen
@cindex dlopening, pitfalls
@cindex trouble with dlopen
The following problems are not solved by using libtool's dlopen support:
@itemize @bullet
@item
Dlopen functions are generally only available on shared library
platforms. If you want your package to be portable to static platforms,
you have to use either libltdl (@pxref{Using libltdl}) or develop your
own alternatives to dlopening dynamic code.
Most reasonable solutions involve writing wrapper functions for the
@code{dlopen} family, which do package-specific tricks when dlopening
is unsupported or not available on a given platform.
@item
There are major differences in implementations of the @code{dlopen}
family of functions. Some platforms do not even use the same function
names (notably HP-UX, with its @code{shl_load} family).
@item
The application developer must write a custom search function
to discover the correct module filename to supply to @code{dlopen}.
@end itemize
@node Using libltdl
@chapter Using libltdl
@findex libltdl
@findex dlopen
@findex dlsym
@findex dlclose
@findex dlerror
@findex shl_load
@cindex dynamic linking, applications
@cindex dlopening modules
@cindex modules, dynamic
@cindex application-level dynamic linking
Libtool provides a small library, called @file{libltdl}, that aims at
hiding the various difficulties of dlopening libraries from programmers.
It consists of a few headers and small C source files that can be
distributed with applications that need dlopening functionality. On
some platforms, whose dynamic linkers are too limited for a simple
implementation of @file{libltdl} services, it requires GNU DLD, or it
will only emulate dynamic linking with libtool's dlpreopening mechanism.
@noindent
libltdl currently supports the following dynamic linking mechanisms:
@itemize @bullet
@item
@code{dlopen} (POSIX compliant systems, GNU/Linux, etc.)
@item
@code{shl_load} (HP-UX)
@item
@code{LoadLibrary} (Win16 and Win32)
@item
@code{load_add_on} (BeOS)
@item
@code{NSAddImage} or @code{NSLinkModule} (Darwin and Mac OS X)
@item
GNU DLD (emulates dynamic linking for static libraries)
@item
libtool's dlpreopen (@pxref{Dlpreopening})
@end itemize
@noindent
libltdl is licensed under the terms of the GNU Lesser General
Public License, with the following exception:
@quotation
As a special exception to the GNU Lesser General Public License,
if you distribute this file as part of a program or library that
is built using GNU Libtool, you may include it under the same
distribution terms that you use for the rest of that program.
@end quotation
@menu
* Libltdl interface:: How to use libltdl in your programs.
* Modules for libltdl:: Creating modules that can be @code{dlopen}ed.
* Thread Safety in libltdl:: Registering callbacks for multi-thread safety.
* User defined module data:: Associating data with loaded modules.
* Module loaders for libltdl:: Creating user defined module loaders.
* Distributing libltdl:: How to distribute libltdl with your package.
@end menu
@node Libltdl interface
@section How to use libltdl in your programs
@noindent
The libltdl API is similar to the POSIX dlopen interface,
which is very simple but powerful.
@noindent
To use libltdl in your program you have to include the header file @file{ltdl.h}:
@example
#include <ltdl.h>
@end example
@noindent
The early releases of libltdl used some symbols that violated the
POSIX namespace conventions. These symbols are now deprecated,
and have been replaced by those described here. If you have code that
relies on the old deprecated symbol names, defining
@samp{LT_NON_POSIX_NAMESPACE} before you include @file{ltdl.h} provides
conversion macros. Whichever set of symbols you use, the new API is
not binary compatible with the last, so you will need to recompile
your application to use this version of libltdl.
@noindent
Note that libltdl is not well tested in a multithreaded environment,
though the intention is that it should work (@pxref{Thread Safety
in libltdl, , Using libltdl in a multi threaded environment}). If there are
any issues, working around them is left as an exercise for the reader;
contributions are certainly welcome.
@noindent
The following macros are defined by including @file{ltdl.h}:
@defmac LT_PATHSEP_CHAR
@code{LT_PATHSEP_CHAR} is the system-dependent path separator,
that is, @samp{;} on Windows and @samp{:} everywhere else.
@end defmac
@defmac LT_DIRSEP_CHAR
If @code{LT_DIRSEP_CHAR} is defined, it can be used as directory
separator in addition to @samp{/}. On Windows, this contains
@samp{\}.
@end defmac
@noindent
The following types are defined in @file{ltdl.h}:
@deftp {Type} lt_dlhandle
@code{lt_dlhandle} is a module ``handle''.
Every lt_dlopened module has a handle associated with it.
@end deftp
@deftp {Type} lt_dladvise
@code{lt_dladvise} is used to control optional module loading modes.
If it is not used, the default mode of the underlying system module
loader is used.
@end deftp
@deftp {Type} lt_dlsymlist
@code{lt_dlsymlist} is a symbol list for dlpreopened modules
(@pxref{Dlpreopening}).
@end deftp
@page
@noindent
libltdl provides the following functions:
@deftypefun int lt_dlinit (void)
Initialize libltdl.
This function must be called before using libltdl
and may be called several times.
Return 0 on success, otherwise the number of errors.
@end deftypefun
@deftypefun int lt_dlexit (void)
Shut down libltdl and close all modules.
This function actually shuts down libltdl only when it has been called
as many times as @code{lt_dlinit} has been successfully called.
Return 0 on success, otherwise the number of errors.
@end deftypefun
@deftypefun lt_dlhandle lt_dlopen (const char *@var{filename})
Open the module with the file name @var{filename} and return a
handle for it. @code{lt_dlopen} is able to open libtool dynamic
modules, preloaded static modules, the program itself and
native dynamic modules@footnote{Some platforms, notably Mac OS X,
differentiate between a runtime library that cannot be opened by
@code{lt_dlopen} and a dynamic module that can. For maximum
portability you should try to ensure that you only pass
@code{lt_dlopen} objects that have been compiled with libtool's
@option{-module} flag.}.
Unresolved symbols in the module are resolved using its dependency
libraries and, on some platforms, previously dlopened modules. If
the executable using this module was linked with the
@option{-export-dynamic} flag, then the global symbols in the executable
will also be used to resolve references in the module.
If @var{filename} is @code{NULL} and the program was linked with
@option{-export-dynamic} or @option{-dlopen self}, @code{lt_dlopen} will
return a handle for the program itself, which can be used to access its
symbols.
If libltdl cannot find the library and the file name @var{filename} does
not have a directory component it will additionally look in the
following search paths for the module (in the following order):
@enumerate 1
@item user-defined search path:
This search path can be changed by the program using the
functions @code{lt_dlsetsearchpath}, @code{lt_dladdsearchdir} and
@code{lt_dlinsertsearchdir}.
@item libltdl's search path:
This search path is the value of the environment variable
@env{LTDL_LIBRARY_PATH}.
@item system library search path:
The system dependent library search path
(e.g.@: on GNU/Linux it is @env{LD_LIBRARY_PATH}).
@end enumerate
Each search path must be a list of absolute directories separated by
@code{LT_PATHSEP_CHAR}, for example, @code{"/usr/lib/mypkg:/lib/foo"}.
The directory names may not contain the path separator.
If the same module is loaded several times, the same handle is returned.
If @code{lt_dlopen} fails for any reason, it returns @code{NULL}.
@end deftypefun
@deftypefun lt_dlhandle lt_dlopenext (const char *@var{filename})
The same as @code{lt_dlopen}, except that it tries to append
different file name extensions to the file name.
If the file with the file name @var{filename} cannot be found
libltdl tries to append the following extensions:
@enumerate 1
@item the libtool archive extension @file{.la}
@item the extension used for native dynamically loadable modules on the host platform, e.g., @file{.so}, @file{.sl}, etc.
@end enumerate
This lookup strategy was designed to allow programs that don't
have knowledge about native dynamic library naming conventions
to be able to @code{dlopen} such libraries as well as libtool modules
transparently.
@end deftypefun
@deftypefun lt_dlhandle lt_dlopenadvise (const char *@var{filename}, @w{lt_dladvise @var{advise}})
The same as @code{lt_dlopen}, except that it also requires an additional
argument that may contain additional hints to the underlying system
module loader. The @var{advise} parameter is opaque and can only be
accessed with the functions documented below.
Note that this function does not change the content of @var{advise}, so
unlike the other calls in this API it takes a direct @code{lt_dladvise}
type, not a pointer to one.
@end deftypefun
@deftypefun int lt_dladvise_init (lt_dladvise *@var{advise})
The @var{advise} parameter can be used to pass hints to the module
loader when using @code{lt_dlopenadvise} to perform the loading.
The @var{advise} parameter needs to be initialised by this function
before it can be used. Any memory used by @var{advise} needs to be
recycled with @code{lt_dladvise_destroy} when it is no longer needed.
On failure, @code{lt_dladvise_init} returns non-zero and sets an error
message that can be retrieved with @code{lt_dlerror}.
@end deftypefun
@deftypefun int lt_dladvise_destroy (lt_dladvise *@var{advise})
Recycle the memory used by @var{advise}. For an example, see the
documentation for @code{lt_dladvise_ext}.
On failure, @code{lt_dladvise_destroy} returns non-zero and sets an error
message that can be retrieved with @code{lt_dlerror}.
@end deftypefun
@deftypefun int lt_dladvise_ext (lt_dladvise *@var{advise})
Set the @code{ext} hint on @var{advise}. Passing an @var{advise}
parameter to @code{lt_dlopenadvise} with this hint set causes it to
try to append different file name extensions like @code{lt_dlopenext}.
The following example is equivalent to calling
@code{lt_dlopenext (filename)}:
@example
lt_dlhandle
my_dlopenext (const char *filename)
@{
  lt_dlhandle handle = 0;
  lt_dladvise advise;

  if (!lt_dladvise_init (&advise) && !lt_dladvise_ext (&advise))
    handle = lt_dlopenadvise (filename, advise);

  lt_dladvise_destroy (&advise);
  return handle;
@}
@end example
On failure, @code{lt_dladvise_ext} returns non-zero and sets an error
message that can be retrieved with @code{lt_dlerror}.
@end deftypefun
@deftypefun int lt_dladvise_global (lt_dladvise *@var{advise})
Set the @code{symglobal} hint on @var{advise}. Passing an @var{advise}
parameter to @code{lt_dlopenadvise} with this hint set causes it to try
to make the loaded module's symbols globally available for resolving
unresolved symbols in subsequently loaded modules.
If neither the @code{symglobal} nor the @code{symlocal} hints are set,
or if a module is loaded without using the @code{lt_dlopenadvise} call
in any case, then the visibility of the module's symbols will be as per
the default for the underlying module loader and OS. Even if a
suitable hint is passed, not all loaders are able to act upon it in
which case @code{lt_dlgetinfo} will reveal whether the hint was actually
followed.
On failure, @code{lt_dladvise_global} returns non-zero and sets an error
message that can be retrieved with @code{lt_dlerror}.
@end deftypefun
@deftypefun int lt_dladvise_local (lt_dladvise *@var{advise})
Set the @code{symlocal} hint on @var{advise}. Passing an @var{advise}
parameter to @code{lt_dlopenadvise} with this hint set causes it to try
to keep the loaded module's symbols hidden so that they are not
visible to subsequently loaded modules.
If neither the @code{symglobal} nor the @code{symlocal} hints are set,
or if a module is loaded without using the @code{lt_dlopenadvise} call
in any case, then the visibility of the module's symbols will be as per
the default for the underlying module loader and OS. Even if a
suitable hint is passed, not all loaders are able to act upon it in
which case @code{lt_dlgetinfo} will reveal whether the hint was actually
followed.
On failure, @code{lt_dladvise_local} returns non-zero and sets an error
message that can be retrieved with @code{lt_dlerror}.
@end deftypefun
@deftypefun int lt_dladvise_resident (lt_dladvise *@var{advise})
Set the @code{resident} hint on @var{advise}. Passing an @var{advise}
parameter to @code{lt_dlopenadvise} with this hint set causes it to try
to make the loaded module resident in memory, so that it cannot be
unloaded with a later call to @code{lt_dlclose}.
On failure, @code{lt_dladvise_resident} returns non-zero and sets an error
message that can be retrieved with @code{lt_dlerror}.
@end deftypefun
@deftypefun int lt_dladvise_preload (lt_dladvise *@var{advise})
Set the @code{preload} hint on @var{advise}. Passing an @var{advise}
parameter to @code{lt_dlopenadvise} with this hint set causes it to
load only preloaded modules, so that if a suitable preloaded module is
not found, @code{lt_dlopenadvise} will return @code{NULL}.
@end deftypefun
@deftypefun int lt_dlclose (lt_dlhandle @var{handle})
Decrement the reference count on the module @var{handle}.
If it drops to zero and no other module depends on this module,
then the module is unloaded.
Return 0 on success.
@end deftypefun
@deftypefun {void *} lt_dlsym (lt_dlhandle @var{handle}, const char *@var{name})
Return the address in the module @var{handle}, where the symbol given
by the null-terminated string @var{name} is loaded.
If the symbol cannot be found, @code{NULL} is returned.
@end deftypefun
@deftypefun {const char *} lt_dlerror (void)
Return a human readable string describing the most
recent error that occurred from any of libltdl's functions.
Return @code{NULL} if no errors have occurred since initialization
or since it was last called.
@end deftypefun
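Putting these calls together, a minimal sketch of the usual calling
sequence looks like this (the module name @file{foo1} and the symbol
@code{foo} are simply those of the example module shown later in this
chapter):
@example
#include <ltdl.h>
#include <stdio.h>

int
main (void)
@{
  lt_dlhandle handle;
  int errors = 0;

  if (lt_dlinit () != 0)
    return 1;

  handle = lt_dlopenext ("foo1");
  if (!handle)
    @{
      fprintf (stderr, "cannot open module: %s\n", lt_dlerror ());
      errors = 1;
    @}
  else
    @{
      int (*foo) (void) = (int (*) (void)) lt_dlsym (handle, "foo");

      if (foo)
        printf ("foo () returned %d\n", foo ());
      else
        fprintf (stderr, "symbol not found: %s\n", lt_dlerror ());

      lt_dlclose (handle);
    @}

  lt_dlexit ();
  return errors;
@}
@end example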
@deftypefun int lt_dladdsearchdir (const char *@var{search_dir})
Append the search directory @var{search_dir} to the current user-defined
library search path. Return 0 on success.
@end deftypefun
@deftypefun int lt_dlinsertsearchdir (@w{const char *@var{before}}, @w{const char *@var{search_dir}})
Insert the search directory @var{search_dir} into the user-defined library
search path, immediately before the element starting at address
@var{before}. If @var{before} is @samp{NULL}, then @var{search_dir} is
appended as if @code{lt_dladdsearchdir} had been called. Return 0 on success.
@end deftypefun
@deftypefun int lt_dlsetsearchpath (const char *@var{search_path})
Replace the current user-defined library search path with
@var{search_path}, which must be a list of absolute directories separated
by @code{LT_PATHSEP_CHAR}. Return 0 on success.
@end deftypefun
@deftypefun {const char *} lt_dlgetsearchpath (void)
Return the current user-defined library search path.
@end deftypefun
@deftypefun int lt_dlforeachfile (@w{const char *@var{search_path}}, @w{int (*@var{func}) (const char *@var{filename}, void * @var{data})}, @w{void * @var{data}})
In some applications you may not want to load individual modules with
known names, but rather find all of the modules in a set of
directories and load them all during initialisation. With this function
you can have libltdl scan the @code{LT_PATHSEP_CHAR}-delimited directory list
in @var{search_path} for candidates, and pass them, along with
@var{data} to your own callback function, @var{func}. If @var{search_path} is
@samp{NULL}, then search all of the standard locations that
@code{lt_dlopen} would examine. This function will continue to make
calls to @var{func} for each file that it discovers in @var{search_path}
until one of these calls returns non-zero, or until the files are
exhausted. @samp{lt_dlforeachfile} returns the value returned by the last
call made to @var{func}.
For example you could define @var{func} to build an ordered
@dfn{argv}-like vector of files using @var{data} to hold the address of
the start of the vector.
@end deftypefun
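For instance, a minimal sketch of such a callback (the directory name
is purely illustrative) that merely prints and counts the candidates
might look like this:
@example
#include <ltdl.h>
#include <stdio.h>

/* Called once per candidate file; DATA points at an int counter.  */
static int
print_module (const char *filename, void *data)
@{
  printf ("candidate module: %s\n", filename);
  ++*(int *) data;
  return 0;         /* return non-zero to stop the scan early */
@}

static void
list_modules (void)
@{
  int n = 0;

  lt_dlforeachfile ("/usr/local/lib/mypkg", print_module, &n);
  printf ("%d candidate(s) found\n", n);
@}
@end example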
@deftypefun int lt_dlmakeresident (lt_dlhandle @var{handle})
Mark a module so that it cannot be @samp{lt_dlclose}d. This can be
useful if a module implements some core functionality in your project
that would cause your code to crash if removed. Return 0 on success.
If you use @samp{lt_dlopen (NULL)} to get a @var{handle} for the running
binary, that handle will always be marked as resident, and consequently
cannot be successfully @samp{lt_dlclose}d.
@end deftypefun
@deftypefun int lt_dlisresident (lt_dlhandle @var{handle})
Check whether a particular module has been marked as resident, returning 1
if it has or 0 otherwise. If there is an error while executing this
function, return -1 and set an error message for retrieval with
@code{lt_dlerror}.
@end deftypefun
@node Modules for libltdl
@section Creating modules that can be @code{dlopen}ed
Libtool modules are created like normal libtool libraries with a few
exceptions:
You have to link the module with libtool's @option{-module} switch,
and you should link any program that is intended to dlopen the module with
@option{-dlopen @var{modulename.la}} where possible, so that libtool can
dlpreopen the module on platforms that do not support dlopening. If
the module depends on any other libraries, make sure you specify them
either when you link the module or when you link programs that dlopen it.
If you want to disable versioning (@pxref{Versioning}) for a specific module
you should link it with the @option{-avoid-version} switch.
Note that libtool modules don't need to have a "lib" prefix.
However, Automake 1.4 or higher is required to build such modules.
Usually a set of modules provides the same interface, i.e.@: exports the same
symbols, so that a program can dlopen them without having to know more
about their internals: In order to avoid symbol conflicts all exported
symbols must be prefixed with "modulename_LTX_" (@var{modulename} is
the name of the module). Internal symbols must be named in such a way
that they won't conflict with other modules, for example, by prefixing
them with "_modulename_". Although some platforms support having the
same symbols defined more than once it is generally not portable and
it makes it impossible to dlpreopen such modules.
libltdl will automatically cut the prefix off to get the real name of
the symbol. Additionally, it supports modules that do not use a
prefix so that you can also dlopen non-libtool modules.
@file{foo1.c} gives an example of a portable libtool module.
Exported symbols are prefixed with "foo1_LTX_", internal symbols
with "_foo1_". Aliases are defined at the beginning so that the code
is more readable.
@example
/* aliases for the exported symbols */
#define foo foo1_LTX_foo
#define bar foo1_LTX_bar
/* a global variable definition */
int bar = 1;
/* a private function */
int _foo1_helper() @{
  return bar;
@}

/* an exported function */
int foo() @{
  return _foo1_helper();
@}
@end example
@noindent
The @file{Makefile.am} contains the necessary rules to build the
module @file{foo1.la}:
@example
...
lib_LTLIBRARIES = foo1.la
foo1_la_SOURCES = foo1.c
foo1_la_LDFLAGS = -module
...
@end example
@node Thread Safety in libltdl
@section Using libltdl in a multi threaded environment
Libltdl provides a wrapper around whatever dynamic run-time object
loading mechanisms are provided by the host system, many of which are
themselves not thread safe. Consequently libltdl cannot itself be
consistently thread safe.
If you wish to use libltdl in a multithreaded environment, then you
must mutex lock around libltdl calls, since they may in turn be calling
non-thread-safe system calls on some target hosts.
Some old releases of libtool provided a mutex locking API that
was unusable with POSIX threads, so callers were forced to lock around
all libltdl API calls anyway. That mutex locking API was
next to useless, and is not present in current releases.
Some future release of libtool may provide a new POSIX thread
compliant mutex locking API.
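As a minimal sketch, assuming POSIX threads are available, an
application might serialize its libltdl use behind a single mutex of
its own:
@example
#include <ltdl.h>
#include <pthread.h>

static pthread_mutex_t ltdl_lock = PTHREAD_MUTEX_INITIALIZER;

static lt_dlhandle
my_dlopenext_locked (const char *filename)
@{
  lt_dlhandle handle;

  pthread_mutex_lock (&ltdl_lock);
  handle = lt_dlopenext (filename);
  pthread_mutex_unlock (&ltdl_lock);
  return handle;
@}
@end example
Similar wrappers would be needed around every other libltdl call that
the application makes.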
@node User defined module data
@section Data associated with loaded modules
Some of the internal information about each loaded module that is
maintained by libltdl is available to the user, in the form of this
structure:
@deftypefn {Type} {struct} lt_dlinfo @{ @w{char *@code{filename};} @
@w{char *@code{name};} @w{int @code{ref_count};} @
@w{int @code{is_resident};} @w{int @code{is_symglobal};} @
@w{int @code{is_symlocal};}@}
@code{lt_dlinfo} is used to store information about a module.
The @code{filename} attribute is a null-terminated character string of
the real module file name. If the module is a libtool module then
@code{name} is its module name (e.g.@: @code{"libfoo"} for
@code{"dir/libfoo.la"}), otherwise it is set to @code{NULL}. The
@code{ref_count} attribute is a reference counter that describes how
often the same module is currently loaded. The remaining fields can
be compared to any hints that were passed to @code{lt_dlopenadvise}
to determine whether the underlying loader was able to follow them.
@end deftypefn
The following function will return a pointer to libltdl's internal copy
of this structure for the given @var{handle}:
@deftypefun {const lt_dlinfo *} lt_dlgetinfo (@w{lt_dlhandle @var{handle}})
Return a pointer to a struct that contains some information about
the module @var{handle}. The contents of the struct must not be modified.
Return @code{NULL} on failure.
@end deftypefun
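For example, a short sketch of reporting that information for an
already opened @var{handle}:
@example
const lt_dlinfo *info = lt_dlgetinfo (handle);

if (info)
  printf ("file %s, name %s, loaded %d time(s)\n",
          info->filename,
          info->name ? info->name : "(not a libtool module)",
          info->ref_count);
else
  fprintf (stderr, "lt_dlgetinfo: %s\n", lt_dlerror ());
@end example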
Furthermore, to save you from having to keep a list of the
handles of all the modules you have loaded, these functions allow you to
iterate over libltdl's list of loaded modules:
@deftp {Type} lt_dlinterface_id
The opaque type used to hold the module interface details for each
registered libltdl client.
@end deftp
@deftypefn {Type} int lt_dlhandle_interface (@w{lt_dlhandle @var{handle},} @
@w{const char *@var{id_string}})
Functions of this type are called to check that a handle conforms to a
library's expected module interface when iterating over the global
handle list. You should be careful to write a callback function of
this type that can correctly identify modules that belong to this
client, both to prevent other clients from accidentally finding your
loaded modules with the iterator functions below, and vice versa. The
best way to do this is to check that module @var{handle} conforms
to the interface specification of your loader using @code{lt_dlsym}.
The callback may be given @strong{every} module loaded by all the
libltdl module clients in the current address space, including any
modules loaded by other libraries such as libltdl itself, and should
return non-zero if that module does not fulfill the interface
requirements of your loader.
@example
int
my_interface_cb (lt_dlhandle handle, const char *id_string)
@{
  char *(*module_id) (void) = NULL;

  /* @r{A valid my_module must provide all of these symbols.} */
  if (!((module_id = (char *(*) (void)) lt_dlsym (handle, "module_version"))
        && lt_dlsym (handle, "my_module_entrypoint")))
    return 1;

  if (strcmp (id_string, module_id ()) != 0)
    return 1;

  return 0;
@}
@end example
@end deftypefn
@deftypefun lt_dlinterface_id lt_dlinterface_register @
(@w{const char *@var{id_string}}, @w{lt_dlhandle_interface *@var{iface}})
Use this function to register your interface validator with libltdl,
and in return obtain a unique key to store and retrieve per-module data.
You supply an @var{id_string} and @var{iface} so that the resulting
@code{lt_dlinterface_id} can be used to filter the module handles
returned by the iteration functions below. If @var{iface} is @code{NULL},
all modules will be matched.
@end deftypefun
@deftypefun void lt_dlinterface_free (@w{lt_dlinterface_id @var{iface}})
Release the data associated with @var{iface}.
@end deftypefun
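For example, pairing the @code{my_interface_cb} validator shown above
with a registration (the identification string is arbitrary):
@example
lt_dlinterface_id my_interface_id
  = lt_dlinterface_register ("mypkg-module-v1", my_interface_cb);
@dots{}
lt_dlinterface_free (my_interface_id);
@end example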
@deftypefun int lt_dlhandle_map (@w{lt_dlinterface_id @var{iface}}, @
@w{int (*@var{func}) (lt_dlhandle @var{handle}, void * @var{data})}, @
@w{void * @var{data}})
For each module that matches @var{iface}, call the function
@var{func}. When writing the @var{func} callback function, the
argument @var{handle} is the handle of a loaded module, and
@var{data} is the last argument passed to @code{lt_dlhandle_map}. As
soon as @var{func} returns a non-zero value for one of the handles,
@code{lt_dlhandle_map} will stop calling @var{func} and immediately
return that non-zero value. Otherwise 0 is eventually returned when
@var{func} has been successfully called for all matching modules.
@end deftypefun
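Here is a minimal sketch of using @code{lt_dlhandle_map} to count the
modules that match an interface; @code{my_interface_id} is assumed to have
been obtained from @code{lt_dlinterface_register} as described above:

@example
static int
count_module_cb (lt_dlhandle handle, void *data)
@{
  int *counter = (int *) data;
  ++*counter;
  return 0;  /* @r{Keep iterating.} */
@}

@dots{}

int count = 0;
lt_dlhandle_map (my_interface_id, count_module_cb, &count);
@end example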
@deftypefun lt_dlhandle lt_dlhandle_iterate (@w{lt_dlinterface_id @
@var{iface}}, @w{lt_dlhandle @var{place}})
Iterate over the module handles loaded by @var{iface}, returning the
first matching handle in the list if @var{place} is @code{NULL}, and
the next one on subsequent calls. If @var{place} is the last element
in the list of eligible modules, this function returns @code{NULL}.
@example
lt_dlhandle handle = 0;
lt_dlinterface_id iface = my_interface_id;
while ((handle = lt_dlhandle_iterate (iface, handle)))
@{
@dots{}
@}
@end example
@end deftypefun
@deftypefun lt_dlhandle lt_dlhandle_fetch (@w{lt_dlinterface_id @var{iface}}, @w{const char *@var{module_name}})
Search through the module handles loaded by @var{iface} for a module named
@var{module_name}, returning its handle if found or else @code{NULL}
if no such named module has been loaded by @var{iface}.
@end deftypefun
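For example, where @code{"libfoo"} and @code{"foo_init"} are hypothetical
module and symbol names:

@example
lt_dlhandle foo = lt_dlhandle_fetch (my_interface_id, "libfoo");

if (foo)
  @{
    int (*foo_init) (void)
      = (int (*) (void)) lt_dlsym (foo, "foo_init");
    if (foo_init)
      foo_init ();
  @}
@end example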
You might think you still need to maintain your own list of loaded
module handles (in parallel with the list maintained inside libltdl)
if there is any other data that your application wants to associate
with each open module. Instead, you can use the following API
calls to do that for you. You must first obtain a unique interface id
from libltdl as described above, and subsequently always use it to
retrieve the data you stored earlier. This allows different libraries
to each store their own data against loaded modules, without
interfering with one another.
@deftypefun {void *} lt_dlcaller_set_data (@w{lt_dlinterface_id @var{key}}, @w{lt_dlhandle @var{handle}}, @w{void * @var{data}})
Set @var{data} as the set of data uniquely associated with @var{key} and
@var{handle} for later retrieval. This function returns the @var{data}
previously associated with @var{key} and @var{handle} if any. A result of
0 may indicate that a diagnostic for the last error (if any) is available
from @code{lt_dlerror()}.
For example, to correctly remove some associated data:
@example
void *stale = lt_dlcaller_set_data (key, handle, 0);

if (stale != NULL)
  @{
    free (stale);
  @}
else
  @{
    const char *error_msg = lt_dlerror ();

    if (error_msg != NULL)
      @{
        my_error_handler (error_msg);
        return STATUS_FAILED;
      @}
  @}
@end example
@end deftypefun
@deftypefun {void *} lt_dlcaller_get_data (@w{lt_dlinterface_id @var{key}}, @w{lt_dlhandle @var{handle}})
Return the address of the data associated with @var{key} and
@var{handle}, or else @code{NULL} if there is none.
@end deftypefun
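Putting the two calls together, here is a sketch of how a client might
associate a hypothetical @code{struct module_info} with each module it
loads, and retrieve it again later; @code{my_interface_id} is the interface
id registered earlier:

@example
struct module_info *info = malloc (sizeof *info);

/* @r{Associate @code{info} with the freshly loaded module.} */
if (!lt_dlcaller_set_data (my_interface_id, handle, info)
    && lt_dlerror ())
  my_error_handler ("failed to store per-module data");

@dots{}

/* @r{Retrieve it again later.} */
struct module_info *stored
  = (struct module_info *) lt_dlcaller_get_data (my_interface_id, handle);
@end example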
Old versions of libltdl also provided a simpler, but similar, API
based around @code{lt_dlcaller_id}. Unfortunately, it had no
provision for detecting whether a module belonged to a particular
interface as libltdl didn't support multiple loaders in the same
address space at that time. Those APIs are no longer supported
as there would be no way to stop clients of the old APIs from
seeing (and accidentally altering) modules loaded by other libraries.
@node Module loaders for libltdl
@section How to create and register new module loaders
Sometimes libltdl's many ways of gaining access to modules are not
sufficient for the purposes of a project. You can write your own
loader, and register it with libltdl so that @code{lt_dlopen} will be
able to use it.
Writing a loader involves writing at least three functions that can be
called by @code{lt_dlopen}, @code{lt_dlsym} and @code{lt_dlclose}.
Optionally, you can provide a finalisation function to perform any
cleanup operations when @code{lt_dlexit} executes, and a symbol prefix
string that will be prepended to any symbols passed to @code{lt_dlsym}.
These functions must match the function pointer types below, after
which they can be allocated to an instance of @code{lt_user_dlloader}
and registered.
Registering the loader requires that you choose a name for it, so that it
can be recognised by @code{lt_dlloader_find} and removed with
@code{lt_dlloader_remove}. The name you choose must be unique, and not
already in use by libltdl's builtin loaders:
@table @asis
@item "dlopen"
The system dynamic library loader, if one exists.
@item "dld"
The GNU dld loader, if @file{libdld} was installed when libltdl was
built.
@item "dlpreload"
The loader for @code{lt_dlopen}ing of preloaded static modules.
@end table
The prefix "dl" is reserved for loaders supplied with future versions of
libltdl, so you should not use that for your own loader names.
@noindent
The following types are defined in @file{ltdl.h}:
@deftp {Type} lt_module
@code{lt_module} is a dlloader dependent module.
The dynamic module loader extensions communicate using these low
level types.
@end deftp
@deftp {Type} lt_dlloader
@code{lt_dlloader} is a handle for module loader types.
@end deftp
@deftp {Type} lt_user_data
@code{lt_user_data} is used for specifying loader instance data.
@end deftp
@deftypefn {Type} {struct} lt_user_dlloader @{@w{const char *@code{sym_prefix};} @w{lt_module_open *@code{module_open};} @w{lt_module_close *@code{module_close};} @w{lt_find_sym *@code{find_sym};} @w{lt_dlloader_exit *@code{dlloader_exit};} @w{lt_user_data @code{dlloader_data};} @}
If you want to define a new way to open dynamic modules, and have the
@code{lt_dlopen} API use it, you need to instantiate one of these
structures and pass it to @code{lt_dlloader_add}. You can pass whatever
you like in the @var{dlloader_data} field, and it will be passed back as
the value of the first parameter to each of the functions specified in
the function pointer fields.
@end deftypefn
@deftypefn {Type} lt_module lt_module_open (@w{lt_user_data @var{loader_data},} @w{const char *@var{filename}})
The type of the loader function for an @code{lt_dlloader} module
loader. The value set in the dlloader_data field of the @code{struct
lt_user_dlloader} structure will be passed into this function in the
@var{loader_data} parameter. Implementation of such a function should
attempt to load the named module, and return an @code{lt_module}
suitable for passing in to the associated @code{lt_module_close} and
@code{lt_sym_find} function pointers. If the function fails it should
return @code{NULL}, and set the error message with @code{lt_dlseterror}.
@end deftypefn
@deftypefn {Type} int lt_module_close (@w{lt_user_data @var{loader_data},} @w{lt_module @var{module}})
The type of the unloader function for a user defined module loader.
Implementation of such a function should attempt to release
any resources tied up by the @var{module} module, and then unload it
from memory. If the function fails for some reason, set the error
message with @code{lt_dlseterror} and return non-zero.
@end deftypefn
@deftypefn {Type} {void *} lt_find_sym (@w{lt_user_data @var{loader_data},} @w{lt_module @var{module},} @w{const char *@var{symbol}})
The type of the symbol lookup function for a user defined module loader.
Implementation of such a function should return the address of the named
@var{symbol} in the module @var{module}, or else set the error message
with @code{lt_dlseterror} and return @code{NULL} if lookup fails.
@end deftypefn
@deftypefn {Type} int lt_dlloader_exit (@w{lt_user_data @var{loader_data}})
The type of the finalisation function for a user defined module loader.
Implementation of such a function should free any resources associated
with the loader, including any user specified data in the
@code{dlloader_data} field of the @code{lt_user_dlloader}. If non-@code{NULL},
the function will be called by @code{lt_dlexit}, and
@code{lt_dlloader_remove}.
@end deftypefn
For example:
@example
int
register_myloader (void)
@{
lt_user_dlloader dlloader;
/* User modules are responsible for their own initialisation. */
if (myloader_init () != 0)
return MYLOADER_INIT_ERROR;
dlloader.sym_prefix = NULL;
dlloader.module_open = myloader_open;
dlloader.module_close = myloader_close;
dlloader.find_sym = myloader_find_sym;
dlloader.dlloader_exit = myloader_exit;
dlloader.dlloader_data = (lt_user_data)myloader_function;
/* Add my loader as the default module loader. */
if (lt_dlloader_add (lt_dlloader_next (NULL), &dlloader,
"myloader") != 0)
return ERROR;
return OK;
@}
@end example
Note that if there is any initialisation required for the loader,
it must be performed manually before the loader is registered --
libltdl doesn't handle user loader initialisation.
Finalisation @emph{is} handled by libltdl however, and it is important
to ensure the @code{dlloader_exit} callback releases any resources claimed
during the initialisation phase.
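To make the above more concrete, here is a minimal sketch of what the loader
callbacks themselves might look like if they simply wrapped the POSIX
@code{dlfcn} API; the @code{myloader_*} names are the hypothetical functions
used in the registration example, and a real loader would normally do
something the builtin @samp{dlopen} loader cannot already do:

@example
#include <dlfcn.h>

static lt_module
myloader_open (lt_user_data loader_data, const char *filename)
@{
  lt_module module = dlopen (filename, RTLD_NOW);
  if (!module)
    lt_dlseterror (lt_dladderror (dlerror ()));
  return module;
@}

static int
myloader_close (lt_user_data loader_data, lt_module module)
@{
  return dlclose (module);
@}

static void *
myloader_find_sym (lt_user_data loader_data, lt_module module,
                   const char *symbol)
@{
  return dlsym (module, symbol);
@}

static int
myloader_exit (lt_user_data loader_data)
@{
  return 0;  /* @r{Nothing to clean up in this sketch.} */
@}
@end example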
@page
@noindent
libltdl provides the following functions for writing your own module
loaders:
@deftypefun int lt_dlloader_add (@w{lt_dlloader *@var{place},} @
@w{lt_user_dlloader *@var{dlloader},} @w{const char *@var{loader_name}})
Add a new module loader to the list of all loaders, either as the
last loader (if @var{place} is @code{NULL}), else immediately before the
loader passed as @var{place}. @var{loader_name} will be returned by
@code{lt_dlloader_name} if it is subsequently passed a newly
registered loader. These @var{loader_name}s must be unique, or
@code{lt_dlloader_remove} and @code{lt_dlloader_find} cannot
work. Returns 0 for success.
@example
/* Make myloader be the last one. */
if (lt_dlloader_add (NULL, myloader, "myloader") != 0)
  perror (lt_dlerror ());
@end example
@end deftypefun
@deftypefun int lt_dlloader_remove (@w{const char *@var{loader_name}})
Remove the loader identified by the unique name, @var{loader_name}.
Before this can succeed, all modules opened by the named loader must
have been closed. Returns 0 for success, otherwise an error message can
be obtained from @code{lt_dlerror}.
@example
/* Remove myloader. */
if (lt_dlloader_remove ("myloader") != 0)
perror (lt_dlerror ());
@end example
@end deftypefun
@deftypefun {lt_dlloader *} lt_dlloader_next (@w{lt_dlloader *@var{place}})
Iterate over the module loaders, returning the first loader if @var{place} is
@code{NULL}, and the next one on subsequent calls. The handle is for use with
@code{lt_dlloader_add}.
@example
/* Make myloader be the first one. */
if (lt_dlloader_add (lt_dlloader_next (NULL), myloader, "myloader") != 0)
  return ERROR;
@end example
@end deftypefun
@deftypefun {lt_dlloader *} lt_dlloader_find (@w{const char *@var{loader_name}})
Return the first loader with a matching @var{loader_name} identifier, or else
@code{NULL}, if the identifier is not found.
The identifiers that may be used by libltdl itself, if the host
architecture supports them, are @dfn{dlopen}@footnote{This is used for
the host dependent module loading API -- @code{shl_load} and
@code{LoadLibrary} for example.}, @dfn{dld} and @dfn{dlpreload}.
@example
/* Add a user loader as the next module loader to be tried if
   the standard dlopen loader were to fail when lt_dlopening. */
if (lt_dlloader_add (lt_dlloader_next (lt_dlloader_find ("dlopen")),
                     myloader, "myloader") != 0)
  return ERROR;
@end example
@end deftypefun
@deftypefun {const char *} lt_dlloader_name (@w{lt_dlloader *@var{place}})
Return the identifying name of @var{place}, as obtained from
@code{lt_dlloader_next} or @code{lt_dlloader_find}. If this function fails,
it will return @code{NULL} and set an error for retrieval with
@code{lt_dlerror}.
@end deftypefun
@deftypefun {lt_user_data *} lt_dlloader_data (@w{lt_dlloader *@var{place}})
Return the address of the @code{dlloader_data} of @var{place}, as
obtained from @code{lt_dlloader_next} or @code{lt_dlloader_find}. If
this function fails, it will return @code{NULL} and set an error for
retrieval with @code{lt_dlerror}.
@end deftypefun
@subsection Error handling within user module loaders
@deftypefun int lt_dladderror (@w{const char *@var{diagnostic}})
This function allows you to integrate your own error messages into
@code{lt_dlerror}. Pass in a suitable diagnostic message for return by
@code{lt_dlerror}, and an error identifier for use with
@code{lt_dlseterror} is returned.
If the allocation of an identifier fails, this function returns -1.
@example
int myerror = lt_dladderror ("doh!");
if (myerror < 0)
perror (lt_dlerror ());
@end example
@end deftypefun
@deftypefun int lt_dlseterror (@w{int @var{errorcode}})
When writing your own module loaders, you should use this function to
raise errors so that they are propagated through the @code{lt_dlerror}
interface. All of the standard errors used by libltdl are declared in
@file{ltdl.h}, or you can add more of your own with
@code{lt_dladderror}. This function returns 0 on success.
@example
if (lt_dlseterror (LTDL_ERROR_NO_MEMORY) != 0)
perror (lt_dlerror ());
@end example
@end deftypefun
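For instance, a custom loader might register its own diagnostics once at
start-up and raise them later from its callbacks. A minimal sketch, where
all the names are hypothetical:

@example
static int myloader_bad_magic;   /* @r{Filled in at start-up.} */

int
myloader_init (void)
@{
  myloader_bad_magic = lt_dladderror ("module has bad magic number");
  return myloader_bad_magic < 0 ? -1 : 0;
@}

@dots{}

/* @r{Later, inside a loader callback:} */
lt_dlseterror (myloader_bad_magic);
return NULL;
@end example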
@node Distributing libltdl
@section How to distribute libltdl with your package
Even though libltdl is installed together with libtool, you may wish
to include libltdl in the distribution of your package, for the
convenience of users of your package that don't have libtool or
libltdl installed, or if you are using features of a very new version
of libltdl that you don't expect your users to have yet. In such
cases, you must decide what flavor of libltdl you want to use: a
convenience library or an installable libtool library.
The most simplistic way to add @code{libltdl} to your package is to
copy all the @file{libltdl} source files to a subdirectory within
your package and to build and link them along with the rest of your
sources. To help you do this, the m4 macros for Autoconf are
available in @file{ltdl.m4}. You must ensure that they are available
in @file{aclocal.m4} before you run Autoconf@footnote{@c
@c
We used to recommend adding the contents of @file{ltdl.m4} to
@file{acinclude.m4}, but with @command{aclocal} from a modern
Automake (1.8 or newer) and this release of libltdl that is not only
unnecessary but makes it easy to forget to upgrade @file{acinclude.m4}
if you move to a different release of libltdl.
@c
}. Having made the macros available, you must add a call to the
@samp{LTDL_INIT} macro (after the call to @samp{LT_INIT})
to your package's @file{configure.ac} to
perform the configure time checks required to build the library
correctly. Unfortunately, this method has problems if you then try to
link the package binaries with an installed libltdl, or a library that
depends on libltdl, because of the duplicate symbol definitions. For
example, ultimately linking against two different versions of libltdl,
or against both a local convenience library and an installed libltdl
is bad. Ensuring that only one copy of the libltdl sources is linked
into any program is left as an exercise for the reader.
@defmac LT_CONFIG_LTDL_DIR (@var{directory})
Declare @var{directory} to be the location of the @code{libltdl}
source files, for @command{libtoolize --ltdl} to place
them. @xref{Invoking libtoolize}, for more details. Provided that you
add an appropriate @code{LT_CONFIG_LTDL_DIR} call in your
@file{configure.ac} before calling @command{libtoolize}, the
appropriate @code{libltdl} files will be installed automatically.
@end defmac
@defmac LTDL_INIT (@var{options})
@defmacx LT_WITH_LTDL
@defmacx AC_WITH_LTDL
@code{AC_WITH_LTDL} and @code{LT_WITH_LTDL} are deprecated names for
older versions of this macro; @command{autoupdate} will update your
@file{configure.ac} file.
This macro adds the following options to the @command{configure}
script:
@table @option
@item --with-ltdl-include @var{installed-ltdl-header-dir}
The @code{LTDL_INIT} macro will look in the standard header file
locations to find the installed @code{libltdl} headers. If
@code{LTDL_INIT} can't find them by itself, the person who builds
your package can use this option to tell @command{configure} where
the installed @code{libltdl} headers are.
@item --with-ltdl-lib @var{installed-ltdl-library-dir}
Similarly, the person building your package can use this option to
help @command{configure} find the installed @file{libltdl.la}.
@item --with-included-ltdl
If there is no installed @code{libltdl}, or in any case if the
person building your package would rather use the @code{libltdl}
sources shipped with the package in the subdirectory named by
@code{LT_CONFIG_LTDL_DIR}, they should pass this option to
@command{configure}.
@end table
If the @option{--with-included-ltdl} option is not passed at
configure time, and an installed @code{libltdl} is not
found@footnote{@c
@c
Even if libltdl is installed, @samp{LTDL_INIT} may fail
to detect it if libltdl depends on symbols provided by libraries
other than the C library.
@c
}, then @command{configure} will exit immediately with an error that
asks the user to either specify the location of an installed
@code{libltdl} using the @option{--with-ltdl-include} and
@option{--with-ltdl-lib} options, or to build with the
@code{libltdl} sources shipped with the package by passing
@option{--with-included-ltdl}.
If an installed @code{libltdl} is found, then @code{LIBLTDL} is set to
the link flags needed to use it, and @code{LTDLINCL} to the preprocessor
flags needed to find the installed headers, and @code{LTDLDEPS} will
be empty. Note, however, that no version checking is performed. You
should manually check for the @code{libltdl} features you need in
@file{configure.ac}:
@example
LT_INIT([dlopen])
LTDL_INIT
# The lt_dladvise_init symbol was added with libtool-2.2
if test yes != "$with_included_ltdl"; then
save_CFLAGS=$CFLAGS
save_LDFLAGS=$LDFLAGS
CFLAGS="$CFLAGS $LTDLINCL"
LDFLAGS="$LDFLAGS $LIBLTDL"
AC_CHECK_LIB([ltdl], [lt_dladvise_init],
[],
[AC_MSG_ERROR([installed libltdl is too old])])
LDFLAGS=$save_LDFLAGS
CFLAGS=$save_CFLAGS
fi
@end example
@var{options} may include no more than one of the following build
modes depending on how you want your project to build @code{libltdl}:
@samp{nonrecursive}, @samp{recursive}, or @samp{subproject}. In order
for @command{libtoolize} to detect this option correctly, if you
supply one of these arguments, they must be given literally (i.e.,
macros or shell variables that expand to the correct ltdl mode will not
work).
@table @samp
@item nonrecursive
This is how the Libtool project distribution builds the @code{libltdl}
we ship and install. If you wish to use Automake to build
@code{libltdl} without invoking a recursive make to descend into the
@code{libltdl} subdirectory, then use this option. You will need to set
your configuration up carefully to make this work properly, and you will
need releases of Autoconf and Automake that support
@code{subdir-objects} and @code{LIBOBJDIR} properly. In your
@file{configure.ac}, add:
@example
AM_INIT_AUTOMAKE([subdir-objects])
AC_CONFIG_HEADERS([config.h])
LT_CONFIG_LTDL_DIR([libltdl])
LT_INIT([dlopen])
LTDL_INIT([nonrecursive])
@end example
@noindent
You @emph{have to} use a config header, but it may have a name different
than @file{config.h}.
Also, add the following near the top of your @file{Makefile.am}:
@example
AM_CPPFLAGS =
AM_LDFLAGS =
BUILT_SOURCES =
EXTRA_DIST =
CLEANFILES =
MOSTLYCLEANFILES =
include_HEADERS =
noinst_LTLIBRARIES =
lib_LTLIBRARIES =
EXTRA_LTLIBRARIES =
include libltdl/ltdl.mk
@end example
@noindent
If you build any other libraries from this @file{Makefile.am},
you will also need to assign to @code{lib_LTLIBRARIES} with
@samp{+=} so that the @code{libltdl} targets declared in
@file{ltdl.mk} are not overwritten.
@item recursive
This build mode still requires that you use Automake, but (in contrast
with @samp{nonrecursive}) uses the more usual device of starting another
@code{make} process in the @file{libltdl} subdirectory. To use this
mode, you should add to your @file{configure.ac}:
@example
AM_INIT_AUTOMAKE
AC_CONFIG_HEADERS([config.h])
LT_CONFIG_LTDL_DIR([libltdl])
LT_INIT([dlopen])
LTDL_INIT([recursive])
AC_CONFIG_FILES([libltdl/Makefile])
@end example
@noindent
Again, you @emph{have to} use a config header, but it may have a name
different than @file{config.h} if you like.
Also, add this to your @file{Makefile.am}:
@example
SUBDIRS = libltdl
@end example
@item subproject
This mode is the default unless you explicitly add @code{recursive} or
@code{nonrecursive} to your @code{LTDL_INIT} options; @code{subproject}
is the only mode supported by previous releases of libltdl. In
@samp{subproject} mode, @code{libltdl} contains all the necessary files
to configure and build itself, even if you do not use Autoconf in the
parent project -- you just need to arrange for your build system to
call @file{libltdl/configure} with appropriate options, and then run
@code{make} in the @file{libltdl} subdirectory.
If you @emph{are} using Autoconf and Automake, then you will need to add
the following to your @file{configure.ac}:
@example
LT_CONFIG_LTDL_DIR([libltdl])
LTDL_INIT
@end example
@noindent
and to @file{Makefile.am}:
@example
SUBDIRS = libltdl
@end example
@end table
Aside from setting the libltdl build mode, there are other keywords
that you can pass to @code{LTDL_INIT} to modify its behavior when
@option{--with-included-ltdl} has been given:
@table @samp
@item convenience
This is the default unless you explicitly add @code{installable} to
your @code{LTDL_INIT} options.
This keyword will cause options to be passed to the @command{configure}
script in the subdirectory named by @code{LT_CONFIG_LTDL_DIR}
to cause it to be built as a convenience library. If you're not
using automake, you will need to define @code{top_build_prefix},
@code{top_builddir}, and @code{top_srcdir} in your makefile so that
@code{LIBLTDL}, @code{LTDLDEPS}, and @code{LTDLINCL} expand correctly.
One advantage of the convenience library is that it is not installed,
so the fact that you use @code{libltdl} will not be apparent to the
user, and it won't overwrite a pre-installed version of
@code{libltdl} the system might already have in the installation
directory. On the other hand, if you want to upgrade @code{libltdl}
for any reason (e.g.@: a bugfix) you'll have to recompile your package
instead of just replacing the shared installed version of
@code{libltdl}. However, if your programs or libraries are linked
with other libraries that use such a pre-installed version of
@code{libltdl}, you may get linker errors or run-time crashes.
Another problem is that you cannot link the convenience library into
more than one libtool library, then link a single program with those
libraries, because you may get duplicate symbols. In general you can
safely use the convenience library in programs that don't depend on
other libraries that might use @code{libltdl} too.
@item installable
This keyword will pass options to the @command{configure}
script in the subdirectory named by @code{LT_CONFIG_LTDL_DIR}
to cause it to be built as an installable library. If you're not
using automake, you will need to define @code{top_build_prefix},
@code{top_builddir} and @code{top_srcdir} in your makefile so that
@code{LIBLTDL}, @code{LTDLDEPS}, and @code{LTDLINCL} are expanded
properly.
Be aware that you could overwrite another @code{libltdl} already
installed to the same directory if you use this option.
@end table
@end defmac
Whatever method you use, @samp{LTDL_INIT} will define the shell variable
@code{LIBLTDL} to the link flag that you should use to link with
@code{libltdl}, the shell variable @code{LTDLDEPS} to the files that
can be used as a dependency in @file{Makefile} rules, and the shell
variable @code{LTDLINCL} to the preprocessor flag that you should use to
compile programs that include @file{ltdl.h}. So, when you want to link a
program with libltdl, be it a convenience, installed or installable
library, just use @samp{$(LTDLINCL)} for preprocessing and compilation,
and @samp{$(LIBLTDL)} for linking.
@itemize @bullet
@item
If your package is built using an installed version of @code{libltdl},
@code{LIBLTDL} will be set to the compiler flags needed to link against
the installed library, @code{LTDLDEPS} will be empty, and @code{LTDLINCL}
will be set to the compiler flags needed to find the @code{libltdl}
header files.
@item
If your package is built using the convenience libltdl, @code{LIBLTDL}
and @code{LTDLDEPS} will be the pathname for the convenience version of
libltdl (starting with @samp{$@{top_builddir@}/} or
@samp{$@{top_build_prefix@}}) and @code{LTDLINCL} will be @option{-I}
followed by the directory that contains @file{ltdl.h} (starting with
@samp{$@{top_srcdir@}/}).
@item
If an installable version of the included @code{libltdl} is being
built, its pathname starting with @samp{$@{top_builddir@}/} or
@samp{$@{top_build_prefix@}}, will be stored in @code{LIBLTDL} and
@code{LTDLDEPS}, and @code{LTDLINCL} will be set just as in the case of the
convenience library.
@end itemize
You should probably also use the @samp{dlopen} option to @code{LT_INIT}
in your @file{configure.ac}, otherwise libtool will assume no dlopening
mechanism is supported, and revert to dlpreopening, which is probably not
what you want. Avoid using the @option{-static},
@option{-static-libtool-libs}, or @option{-all-static}
switches when linking programs with libltdl. This will not work on
all platforms, because the dlopening functions may not be available
for static linking.
The following example shows you how to embed an installable libltdl in
your package. In order to use the convenience variant, just replace the
@code{LTDL_INIT} option @samp{installable} with @samp{convenience}. We
assume that libltdl was embedded using @samp{libtoolize --ltdl}.
configure.ac:
@example
...
# Name the subdirectory that contains libltdl sources
LT_CONFIG_LTDL_DIR([libltdl])
# Configure libtool with dlopen support if possible
LT_INIT([dlopen])
# Enable building of the installable libltdl library
LTDL_INIT([installable])
...
@end example
Makefile.am:
@example
...
SUBDIRS = libltdl
AM_CPPFLAGS = $(LTDLINCL)
myprog_LDFLAGS = -export-dynamic
myprog_LDADD = $(LIBLTDL) -dlopen self -dlopen foo1.la
myprog_DEPENDENCIES = $(LTDLDEPS) foo1.la
...
@end example
@defmac LTDL_INSTALLABLE
@defmacx AC_LIBLTDL_INSTALLABLE
These macros are deprecated; the @samp{installable} option to
@code{LTDL_INIT} should be used instead.
@end defmac
@defmac LTDL_CONVENIENCE
@defmacx AC_LIBLTDL_CONVENIENCE
These macros are deprecated; the @samp{convenience} option to
@code{LTDL_INIT} should be used instead.
@end defmac
@node Trace interface
@chapter Libtool's trace interface
@cindex trace interface
@cindex autoconf traces
This section describes macros whose sole purpose is to be traced using
Autoconf's @option{--trace} option (@pxref{autoconf Invocation, , The
Autoconf Manual, autoconf, The Autoconf Manual}) to query the Libtool
configuration of a project. These macros are called by Libtool
internals and should never be called by user code; they should only be
traced.
@defmac LT_SUPPORTED_TAG (@var{tag})
This macro is called once for each language enabled in the package. Its
only argument, @var{tag}, is the tag-name corresponding to the language
(@pxref{Tags}).
You can therefore retrieve the list of all tags enabled in a project
using the following command:
@example
autoconf --trace 'LT_SUPPORTED_TAG:$1'
@end example
@end defmac
@node FAQ
@chapter Frequently Asked Questions about libtool
This chapter covers some questions that often come up on the mailing
lists.
@menu
* Stripped link flags:: Dropped flags when creating a library
@end menu
@node Stripped link flags
@section Why does libtool strip link flags when creating a library?
When creating a shared library, but not when compiling or creating
a program, @command{libtool} drops some flags from the command line
provided by the user. This is done because flags unknown to
@command{libtool} may interfere with library creation or require
additional support from @command{libtool}, and because omitting
flags is usually the conservative choice for a successful build.
If you encounter flags that you think are useful to pass, as a
work-around you can prepend flags with @code{-Wc,} or @code{-Xcompiler }
to allow them to be passed through to the compiler driver
(@pxref{Link mode}). Another possibility is to add the flags to the
compiler command itself at @command{configure} time:
@example
./configure CC='gcc -m64'
@end example
If you think @command{libtool} should let some flag through by default,
here's how you can test such an inclusion: grab the Libtool development
tree, edit the @file{ltmain.in} file in the @file{libltdl/config}
subdirectory to pass through the flag (search for @samp{Flags to be
passed through}), re-bootstrap and build with the flags in question
added to @code{LDFLAGS}, @code{CFLAGS}, @code{CXXFLAGS}, etc. on the
@command{configure} command line as appropriate. Run the testsuite
as described in the @file{README} file and report results to
@value{BUGADDR}.
@node Troubleshooting
@chapter Troubleshooting
@cindex troubleshooting
@cindex problems, solving
@cindex solving problems
@cindex problems, blaming somebody else for
Libtool is under constant development, changing to remain up-to-date
with modern operating systems. If libtool doesn't work the way you
think it should on your platform, you should read this chapter to help
determine what the problem is, and how to resolve it.
@menu
* Libtool test suite:: Libtool's self-tests.
* Reporting bugs:: How to report problems with libtool.
@end menu
@node Libtool test suite
@section The libtool test suite
@cindex test suite
Libtool comes with an integrated set of tests that check that your build is
sane, exercise its capabilities, and report obvious bugs in the libtool program. The
test suite is based on Autotest from Autoconf (@pxref{testsuite Invocation, ,
Generating Test Suites with Autotest, autoconf, The Autoconf Manual}). These
tests, too, are constantly evolving, based on past problems with libtool, and
known deficiencies in other operating systems.
As described in the @file{README} file, you may run @kbd{make -k check} after
you have built libtool (possibly before you install it) to make sure that it
meets basic functional requirements.
@menu
* Test descriptions:: The contents of the test suite.
* When tests fail:: What to do when a test fails.
@end menu
@node Test descriptions
@subsection Description of test suite
The test suite uses keywords to classify certain test groups:
@table @samp
@item CXX
@itemx F77
@itemx FC
@itemx GCJ
The test group exercises one of these @command{libtool} language tags.
@item autoconf
@itemx automake
These keywords denote that the respective external program is needed
by the test group. The tests are typically skipped if the program is
not installed. The @samp{automake} keyword may also denote use of the
@command{aclocal} program.
@item interactive
This test group may require user interaction on some systems. Typically,
this means closing a popup window about a DLL load error on Windows.
@item libltdl
Denote that the @file{libltdl} library is exercised by the test group.
@item libtool
@itemx libtoolize
Denote that the @command{libtool} or @command{libtoolize} scripts are
exercised by the test group, respectively.
@item recursive
Denote that this test group may recursively re-invoke the test suite
itself, with changed settings and maybe a changed @command{libtool}
script. You may use the @env{INNER_TESTSUITEFLAGS} variable to pass
additional settings to this recursive invocation. Typically, recursive
invocations delimit the set of tests with another keyword, for example
by passing @code{-k libtool} right before the expansion of the
@env{INNER_TESTSUITEFLAGS} variable (without an intervening space, so
you get the chance for further delimitation).
Test groups with the keyword @samp{recursive} should not be denoted with
keywords, in order to avoid infinite recursion. As a consequence,
recursive test groups themselves should never require user interaction,
while the test groups they invoke may do so.
@end table
@cindex @samp{check-interactive}
@cindex @samp{check-noninteractive}
There is a convenience target @samp{check-noninteractive} that runs
all tests from both test suites that do not cause user interaction on
Windows. Conversely, the target @samp{check-interactive} runs the
complement of tests and might require closing popup windows about DLL
load errors on Windows.
Here is a list of some of the current files in the test suite, and what
they test for:
@table @file
@item @file{tests/am-subdirs.at}
Tests that a binary can be built and run from outside of the subdirectory in
which it is built.
@item @file{tests/archive-in-archive.at}
Tests convenience archive within another convenience archive.
Compiles @code{foo()} in @file{libfoo}, then compiles @file{libfoo} (and
@code{bar()} function) into @file{libbar}.
@item @file{tests/bindir.at}
Tests include a demonstration of various scenarios related to using the
@option{-bindir} option with @command{libtool}, verifying that installed files
are correctly linked and executed, and demonstrating flexibility in handling
different installation paths and adapting to changing directory structures.
@item @file{tests/bug_42313.at}
Tests that there are no conflicting warnings about AC_PROG_RANLIB by verifying
no autoscan AC_PROG_RANLIB warning and checking that AC_PROG_RANLIB declaration
has a warning.
@item @file{tests/bug_62343.at}
Tests that the @option{-no-canonical-prefixes} flag is not removed from the linking
command, but it is instead passed through to the linker.
@item @file{tests/bug_71489.at}
Tests that the local version of an installed program is used both with and
without an external library.
@item @file{tests/cdemo.at}
Tests include a demonstration of @command{libtool} convenience libraries, a
mechanism that allows build-time static libraries to be created, in a way that
their components can be later linked into programs or other libraries, even
shared ones.
@item @file{tests/cmdline_wrap.at}
Tests include a verification that @command{libtool} operates properly when the
maximum command line length is very small.
@item @file{tests/configure-funcs.at}
Tests creation of shell functions shared with @command{configure} and
@command{libtool}.
@item @file{tests/configure-iface.at}
Tests that exercise the configure interface to @file{libltdl}.
@item @file{tests/convenience.at}
Tests demonstrate @command{libtool}'s convenience archive capabilities for C,
C++, Fortran, and Java languages.
@item @file{tests/ctor.at}
Test that @command{libtool} can handle code with C++ constructors.
@item @file{tests/cwrapper.at}
Test cwrapper compliance with standards for uninstalled executables, string
length, and installed shared libraries.
@item @file{tests/darwin.at}
Tests for macOS, including compilation, concurrent library extraction,
@command{GDB} debug information, @command{ld} warnings, and verifying
@file{.dylib} and @file{.so} files can be used with @code{lt_dlopen}.
@item @file{tests/demo.at}
Tests include a demonstration of a trivial package that uses @command{libtool}.
The tests include scenarios to build both static and shared libraries, only
static libraries, only shared libraries, disabling fast-install mode, building
PIC code, and building non-PIC code.
@item @file{tests/depdemo.at}
Tests include a demonstration of inter-library dependencies with
@command{libtool}. The test programs link some interdependent libraries under
different scenarios.
@item @file{tests/deplib-in-subdir.at}
Tests building and linking various libraries within various directories and
sub-directories while changing directories as well. It should be possible to
use a nontrivial relative path to the output file name when creating libraries
and programs. The @code{deplibs} of these might have relative paths as well.
When executing uninstalled programs, the paths relative to @file{$PWD} at build
time needs to be translated to a path valid at execution time. Also test
installing these libraries and programs; however, use consistent relative paths
between @command{libtool} @option{--mode=link} and @command{libtool}
@option{--mode=install} in this test.
@item @file{tests/deplibs-ident.at}
Tests the correct detection and handling of identical dependency libraries when
using @command{libtool}.
@item @file{tests/deplibs-mingw.at}
Tests include various scenarios related to detecting deplibs correctly,
including cases where there is no @command{file} command installed as well
as the host OS being MinGW.
@item @file{tests/destdir.at}
Installs some libs in @file{$DESTDIR}, moves them to a different dir, then
installs some false libraries in @file{$DESTDIR} that should not be linked
against. If the program refers to these false libraries, there is a bug.
@item @file{tests/dlloader-api.at}
Tests that @code{lt_dlopen} can open a shared library and call a function from
the library, verifies that @code{lt_dlsym} can find a symbol in the opened
library, and makes sure that @code{lt_dlclose} can properly close the library.
@item @file{tests/dumpbin-symbols.at}
Tests whether on Windows a convenience symbol in a section of a @file{.lib}
file is present even if that section is hidden in the @file{.obj} file.
@item @file{tests/duplicate_conv.at}
Tests two convenience archives with the same name, and also containing an
object with the same name.
@item @file{tests/duplicate_deps.at}
Tests circular call of dependencies between two libraries, @file{liba} and
@file{libb}. Function @code{a1()} from @file{liba} calls @code{b1()} from
@file{libb} and function @code{b1()} from @file{libb} calls @code{a2()} from
@file{liba}.
@item @file{tests/duplicate_members.at}
Tests a library with multiple files of the same name (from different
directories), such as @file{1/a.c}, @file{2/a.c}, @file{3/a.c}, etc.
@item @file{tests/early-libtool.at}
Tests building binaries using @command{libtool} using two configure approaches.
@item @file{tests/exceptions.at}
Tests C++ exception handling with @command{libtool}.
@item @file{tests/execute-mode.at}
Tests the @option{--mode=execute} feature of @command{libtool}.
@item @file{tests/exeext.at}
Tests ensure that @code{$EXEEXT} handling works by linking and installing an
executable.
@item @file{tests/export.at}
Tests symbol exports if shared libraries are enabled.
@item @file{tests/export-def.at}
Test exporting from a DLL with module definition (@file{.def} files).
This test only runs if shared libraries are enabled and building a DLL is
supported.
@item @file{tests/f77demo.at}
Tests Fortran 77 support in @command{libtool} by creating libraries from
Fortran 77 sources and from mixed Fortran and C sources, plus a Fortran 77
program that uses the former library and a C program that uses the latter.
@item @file{tests/fail.at}
Tests to ensure that @command{libtool} really fails when it should, including
after compile failure, program creation failure, and shared library creation
failure.
@item @file{tests/fcdemo.at}
Tests are similar to the @file{tests/f77demo.at} tests, except that Fortran 90
is used in combination with the @samp{FC} interface provided by Autoconf and
Automake.
@item @file{tests/flags.at}
Tests include checks that compile and linker flags get passed through
@command{libtool}. Tests flags for C, C++, Fortran 77, and Fortran 90.
@item @file{tests/help.at}
Tests a variety of mode commands, including mode short-hands, to ensure basic
command line functionality. Also verify that the @option{--debug} flag is
handled correctly in each mode.
@item @file{tests/indirect_deps.at}
Tests indirect dependencies (or nested dependencies). @file{libd} depends
on @file{libconv}, which depends on @file{libb}, which depends on @file{liba}.
@item @file{tests/infer-tag.at}
Tests that @code{func_infer_tag} works by compiling various code snippets in
various languages (C, C++, Fortran, Java) without a @option{--tag} flag.
@item @file{tests/inherited_flags.at}
Tests the functionality of the @code{inherited_linker_flags} variable in
@command{libtool} library files.
@item @file{tests/install.at}
Tests install mode and ensures that @code{install_override_mode} overrides the
mode of the shared library (and only the shared library).
@item @file{tests/lalib-syntax.at}
Tests parsing of @file{.la} files including both correctly formed @file{.la}
files and malformed or bogus @file{.la} files.
@item @file{tests/libtool.at}
Tests basic @command{libtool} functionality including shell meta-character removal, lack
of supplied mode option, file extension handling, simple linking,
@option{objectlist} flag usage, and @code{LT_SUPPORTED_TAG} usage.
@item @file{tests/libtoolize.at}
Tests that errors and warnings are reported for invalid and unknown
@code{LIBTOOLIZE_OPTIONS} as well as checking the @option{--no-warn} flag
suppresses @command{libtoolize} warnings.
@item @file{tests/link-order2.at}
Tests to make sure that @code{depdepls} are added right after the libraries
that pull them in which is necessary at least for static linking and on systems
where libraries do not link against other libraries.
@item @file{tests/link-order.at}
Tests for problems with linking libraries in different order.
@item @file{tests/loadlibrary.at}
Tests @file{libltdl} (a @file{libdl} API for @code{dlopen}) support of
@code{LoadLibrary} for dynamically linking DLLs in Windows environments.
@item @file{tests/localization.at}
Tests to verify that invoking C compiler language localization options do not
cause problems with @command{libtool}.
@item @file{tests/lt_dladvise.at}
Tests @file{libltdl} (a @file{libdl} API for @code{dlopen}) module loading
advisor functions.
@item @file{tests/lt_dlexit.at}
Tests @file{libltdl} (a @file{libdl} API for @code{dlopen}) for a memory use
after free bug in @code{lt_dlexit}.
@item @file{tests/lt_dlopen_a.at}
Tests @code{lt_dlopen} with an archive file. Verifies that @code{lt_dlopen}
can load an archive file and successfully return a handle to it.
@item @file{tests/lt_dlopen.at}
Tests @file{libltdl} (a @file{libdl} API for @code{dlopen}) with a basic C
example.
@item @file{tests/lt_dlopenext.at}
Tests @file{libltdl} (a @file{libdl} API for @code{dlopen}) with an extern C++
function.
@item @file{tests/ltdl-api.at}
Tests that @command{libtool} doesn't mangle @file{gnulib} @code{argv} names.
@item @file{tests/ltdl-libdir.at}
Tests if @file{ltdl} can find an installed module using the @code{libdir}
variable in the @file{.la} file. Tests also include a MinGW test which uses a
Windows-style @code{libdir} name.
@item @file{tests/mdemo.at}
Tests include a demonstration of a package that uses @command{libtool} and the
system independent @code{dlopen} wrapper @file{libltdl} to load modules. The
library @file{libltdl} provides a @code{dlopen} wrapper for various platforms
(POSIX) including support for @code{dlpreopened} modules
(@pxref{Dlpreopening}).
@item @file{tests/need_lib_prefix.at}
Tests to check for failures on systems that require libraries to be prefixed
with @file{lib}.
@item @file{tests/nocase.at}
Tests the @code{want_nocaseglob} configuration option to search for libraries
regardless of case.
@item @file{tests/no-executables.at}
Tests @code{AC_NO_EXECUTABLES} macro with @command{gcc}.
@item @file{tests/nonrecursive.at}
Tests non-recursive DLL libraries with nonrecursive Automake @file{libltdl}
(a @file{libdl} API for @code{dlopen}) build.
@item @file{tests/old-m4-iface.at}
Tests various aspects of @command{libtool}'s old @command{m4} interface.
@item @file{tests/pic_flag.at}
Tests the @option{-fpic} flag with @command{gcc} and @command{g++}.
@item @file{tests/recursive.at}
Tests recursive DLL libraries with recursive Automake @file{libltdl} (a
@file{libdl} API for @code{dlopen}) build.
@item @file{tests/resident.at}
Tests that resident modules are not unloaded at program exit, as they need to
be able to invoke @code{atexit} handlers. A module being a resident module
means it is prevented from being @code{lt_dlclosed}.
@item @file{tests/runpath-in-lalib.at}
Tests the runpath configuration options in @command{libtool} (@option{-R} and
@option{-rpath}) and that the resulting @file{.la} files are installed in the
directory appended to their respective library's @code{dependency_libs} by
@option{-R}.
@item @file{tests/search-path.at}
Tests @code{sys_lib_search_path_spec}, which is an expression to get the
compile-time system library search path. Tests include determining if it is
possible to link an executable to system libraries and if
@code{sys_lib_search_path_spec} also works on MSVC (Microsoft Visual C++).
@item @file{tests/shlibpath.at}
Tests include verifying proper behaviour of the @code{shlibpath_var}, which is
the variable responsible for telling the linker where to find shared libraries.
Additionally, this tests the behaviour of the @code{shlibpath_overrides_runpath}
variable, which determines whether it is possible to override an executable's
hardcoded library search path with an environment variable.
@item @file{tests/slist.at}
Tests the functionality of the SList datastructure, which is an implementation
of singly linked lists. Tests include verifying the usual linked list
functions such as finding, removing, deleting, reversing, and sorting the
element(s) in SLists.
@item @file{tests/standalone.at}
Tests @file{libltdl} functionality as a standalone tool. Tests include
compiling a symlinked version of @file{libltdl} and a copied version of
@file{libltdl}. Next @file{libltdl} is installed locally and linked in a
project without the use of Autoconf or Automake.
@item @file{tests/static.at}
Tests various flags related to static and dynamic linking including
@option{-Bstatic} and @option{-Bdynamic}.
@item @file{tests/stresstest.at}
Tests various @command{libtool} flag and option combinations, tests linking
various types of objects in different sections, and tests regular expressions
to cover edge cases and unusual configurations over multiple iterations.
@item @file{tests/subproject.at}
Tests the @file{libltdl} flag for building subprojects in individual
directories. Tests cover soft-linked libltdl trees, copied @file{libltdl}
trees, and linking @file{libltdl} without Autotools.
@item @file{tests/sysroot.at}
Tests that @command{libtool} runs properly in sandboxed @file{sysroot}
directories.
@item @file{tests/tagdemo.at}
Tests include a demonstration of a package that uses @command{libtool}'s
multi-language support through configuration tags. It generates a library from
C++ sources, which is then linked to a C++ program.
@item @file{tests/template.at}
Tests that @command{libtool} can handle C++ code utilizing templates.
@item @file{tests/testsuite.at}
Main test suite framework file, processed by @command{autom4te} to enable
running the @command{libtool} test cases.
@item @file{tests/versioning.at}
Tests @command{libtool}'s versioning system. Tests begin with verifying the
behaviour of @command{libtool} versioning flags @option{-version-info} and
@option{-version-number}. Next, this tests installing a library, then updating
the library with a new revision, a compatible update, and an incompatible
update. In each case, the tests verify that the original library will link and
install as expected.
@item @file{tests/with_pic.at}
Tests the function of the @option{--enable-pic} flag. The @option{--enable-pic}
flag is used to specify whether or not @command{libtool} uses PIC objects.
This includes tests for setting @option{--enable-pic} to no, yes, or a comma
delimited list of package names.
@end table
@node When tests fail
@subsection When tests fail
@cindex failed tests
@cindex tests, failed
The Autotest-based test suite produces as output a file
@file{tests/testsuite.log} that contains information about failed tests.
You can pass options to the test suite through the @command{make} variable
@env{TESTSUITEFLAGS} (@pxref{testsuite Invocation, , Making testsuite Scripts,
autoconf, The Autoconf Manual}).
@node Reporting bugs
@section Reporting bugs
@cindex bug reports
@cindex reporting bugs
@cindex problem reports
If you think you have discovered a bug in libtool, you should think
twice: the libtool maintainer is notorious for passing the buck (or
maybe that should be ``passing the bug''). Libtool was invented to fix
known deficiencies in shared library implementations, so, in a way, most
of the bugs in libtool are actually bugs in other operating systems.
However, the libtool maintainer would definitely be happy to add support
for somebody else's buggy operating system. [I wish there was a good
way to do winking smiley-faces in Texinfo.]
Genuine bugs in libtool include problems with shell script portability,
documentation errors, and failures in the test suite (@pxref{Libtool
test suite}).
First, check the documentation and help screens to make sure that the
behaviour you think is a problem is not already mentioned as a feature.
Then, you should read the Emacs guide to reporting bugs (@pxref{Bugs, ,
Reporting Bugs, emacs, The Emacs Manual}). Some of the details
listed there are specific to Emacs, but the principle behind them is a
general one.
Finally, send a bug report to @value{BUGADDR} with any appropriate
@emph{facts}, such as test suite output (@pxref{When tests fail}), all
the details needed to reproduce the bug, and a brief description of why
you think the behaviour is a bug. Be sure to include the word
``libtool'' in the subject line, as well as the version number you are
using (which can be found by typing @kbd{libtool --version}).
@node Maintaining
@chapter Maintenance notes for libtool
This chapter contains information that the libtool maintainer finds
important. It will be of no use to you unless you are considering
porting libtool to new systems, or writing your own libtool.
@menu
* New ports:: How to port libtool to new systems.
* Tested platforms:: When libtool was last tested.
* Platform quirks:: Information about different library systems.
* libtool script contents:: Configuration information that libtool uses.
* Cheap tricks:: Making libtool maintainership easier.
@end menu
@node New ports
@section Porting libtool to new systems
Before you embark on porting libtool to an unsupported system, it is
worthwhile to send e-mail to @value{MAILLIST}, to make sure that you are
not duplicating existing work.
If you find that any porting documentation is missing, please complain!
Complaints with patches and improvements to the documentation, or to
libtool itself, are more than welcome.
@menu
* Information sources:: Where to find relevant documentation
* Porting inter-library dependencies:: Implementation details explained
@end menu
@node Information sources
@subsection Information sources
Once it is clear that a new port is necessary, you'll generally need the
following information:
@table @asis
@item canonical system name
You need the output of @code{config.guess} for this system, so that you
can make changes to the libtool configuration process without affecting
other systems.
@item man pages for @command{ld} and @command{cc}
These generally describe what flags are used to generate PIC, to create
shared libraries, and to link against only static libraries. You may
need to follow some cross references to find the information that is
required.
@item man pages for @command{ld.so}, @command{rtld}, or equivalent
These are a valuable resource for understanding how shared libraries are
loaded on the system.
@item man page for @command{ldconfig}, or equivalent
This page usually describes how to install shared libraries.
@item output from @kbd{ls -l /lib /usr/lib}
This shows the naming convention for shared libraries on the system,
including what names should be symbolic links.
@item any additional documentation
Some systems have special documentation on how to build and install
shared libraries.
@end table
If you know how to program the Bourne shell, then you can complete the
port yourself; otherwise, you'll have to find somebody with the relevant
skills who will do the work. People on the libtool mailing list are
usually willing to volunteer to help you with new ports, so you can send
the information to them.
To do the port yourself, you'll definitely need to modify the
@code{libtool.m4} macros to make platform-specific changes to
the configuration process. You should search that file for the
@code{PORTME} keyword, which will give you some hints on what you'll
need to change. In general, all that is involved is modifying the
appropriate configuration variables (@pxref{libtool script contents}).
Your best bet is to find an already-supported system that is similar to
yours, and make your changes based on that. In some cases, however,
your system will differ significantly from every other supported system,
and it may be necessary to add new configuration variables, and modify
the @code{ltmain.in} script accordingly. Be sure to write to the
mailing list before you make changes to @code{ltmain.in}, since they may
have advice on the most effective way of accomplishing what you want.
@node Porting inter-library dependencies
@subsection Porting inter-library dependencies support
@cindex inter-library dependency
@vindex deplibs_check_method
Since version 1.2c, libtool has re-introduced the ability to do
inter-library dependency on some platforms, thanks to a patch by Toshio
Kuratomi @email{badger@@prtr-13.ucsc.edu}. Here's a shortened version
of the message that contained his patch:
The basic architecture is this: in @file{libtool.m4}, the person who
writes libtool makes sure @samp{$deplibs} is included in
@samp{$archive_cmds} somewhere and also sets the variable
@samp{$deplibs_check_method}, and maybe @samp{$file_magic_cmd} when
@samp{deplibs_check_method} is file_magic.
@samp{deplibs_check_method} can be one of the following:
@table @samp
@item file_magic [@var{regex}]
@vindex file_magic
@vindex file_magic_cmd
@vindex file_magic_test_file
looks in the library link path for libraries that have the right
libname. Then it runs @samp{$file_magic_cmd} on the library and checks
for a match against the extended regular expression @var{regex}. When
@code{file_magic_test_file} is set by @file{libtool.m4}, it is used as an
argument to @samp{$file_magic_cmd} to verify whether the
regular expression matches its output, and warn the user otherwise.
@item pass_all
@vindex pass_all
will pass everything without any checking. This may work on platforms
where code is position-independent by default and inter-library
dependencies are properly supported by the dynamic linker, for example,
on DEC OSF/1 3 and 4.
@item none
@vindex none
It causes deplibs to be reassigned @samp{deplibs=""}. That way
@samp{archive_cmds} can contain deplibs on all platforms, but not have
deplibs used unless needed.
@item unknown
@vindex unknown
is the default for all systems unless overridden in @file{libtool.m4}.
It is the same as @samp{none}, but it documents that we really don't
know what the correct value should be, and we welcome patches that
improve it.
@end table
Then in @file{ltmain.in} we have the real workhorse: a little
initialization and postprocessing (to setup/release variables for use
with eval echo libname_spec etc.) and a case statement that decides
the method that is being used. This is the real code@dots{} I wish I could
condense it a little more, but I don't think I can without function
calls. I've mostly optimized it (moved things out of loops, etc.) but
there is probably some fat left. I thought I should stop while I was
ahead, work on whatever bugs you discover, etc.@: before thinking about
more than obvious optimizations.
@node Tested platforms
@section Tested platforms
This table describes when libtool was last known to be tested on
platforms where it claims to support shared libraries:
@example
@include PLATFORMS
@end example
Note: The vendor-distributed HP-UX @command{sed}(1) programs are horribly
broken, and cannot handle libtool's requirements, so users may report
unusual problems. There is no workaround except to install a working
@command{sed} (such as GNU @command{sed}) on these systems.
Note: The vendor-distributed NCR MP-RAS @command{cc} program emits a
copyright notice on standard error that confuses tests on the size of
@file{conftest.err}. The workaround is to run @code{configure} with
@kbd{CC='cc -Hnocopyr'}.
@node Platform quirks
@section Platform quirks
This section is dedicated to the sanity of the libtool maintainers. It
describes the programs that libtool uses, how they vary from system to
system, and how to test for them.
Because libtool is a shell script, it can be difficult to understand
just by reading it from top to bottom. This section helps show why
libtool does things a certain way. Combined with the scripts
themselves, you should have a better sense of how to improve libtool, or
write your own.
@menu
* Compilers:: Creating object files from source files.
* Reloadable objects:: Binding object files together.
* Multiple dependencies:: Removing duplicate dependent libraries.
* Archivers:: Programs that create static archives.
* Cross compiling:: Issues that arise when cross compiling.
* File name conversion:: Converting file names between platforms.
* Windows DLLs:: Windows header defines.
@end menu
@node Compilers
@subsection Compilers
The only compiler characteristics that affect libtool are the flags
needed (if any) to generate PIC objects. In general, if a C compiler
supports certain PIC flags, then any derivative compilers support the
same flags. Until there are some noteworthy exceptions to this rule,
this section will document only C compilers.
The following C compilers have standard command line options, regardless
of the platform:
@table @code
@item gcc
This is the GNU C compiler, which is also the system compiler for many
free operating systems (FreeBSD, GNU/Hurd, GNU/Linux, Lites, NetBSD, and
OpenBSD, to name a few).
The @option{-fpic} or @option{-fPIC} flags can be used to generate
position-independent code. @option{-fPIC} is guaranteed to generate
working code, but the code is slower on m68k, m88k, and SPARC chips.
However, using @option{-fpic} on those chips imposes arbitrary size limits
on the shared libraries.
@end table
The rest of this subsection lists compilers by the operating system that
they are bundled with:
@c FIXME these should all be better-documented
@table @code
@item aix3*
@itemx aix4*
Most AIX compilers have no PIC flags, since AIX (with the exception of
AIX for IA-64) runs on PowerPC and RS/6000 chips. @footnote{All code compiled
for the PowerPC and RS/6000 chips (@code{powerpc-*-*}, @code{powerpcle-*-*},
and @code{rs6000-*-*}) is position-independent, regardless of the operating
system or compiler suite. So, ``regular objects'' can be used to build
shared libraries on these systems and no special PIC compiler flags are
required.}
@item hpux10*
Use @samp{+Z} to generate PIC.
@item osf3*
Digital/UNIX 3.x does not have PIC flags, at least not on the PowerPC
platform.
@item solaris2*
Use @option{-KPIC} to generate PIC.
@item sunos4*
Use @option{-PIC} to generate PIC.
@end table
@node Reloadable objects
@subsection Reloadable objects
On all known systems, a reloadable object can be created by running
@kbd{ld -r -o @var{output}.o @var{input1}.o @var{input2}.o}. This
reloadable object may be treated as exactly equivalent to other
objects.
@node Multiple dependencies
@subsection Multiple dependencies
On most modern platforms the order in which dependent libraries are
listed has no effect on object generation.  In theory, there are
platforms that require libraries that provide missing symbols to other
libraries to be listed after those libraries whose symbols they provide.
In particular, if a pair of static archives each resolve some of the
other's symbols, it might be necessary to list one of those archives
both before and after the other one.  Libtool does not currently cope
with this situation well, since duplicate libraries are removed from
the link line by default.  Libtool provides the command line option
@option{--preserve-dup-deps} to preserve all duplicate dependencies
in cases where it is necessary, as in the example below.
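For example, the following hypothetical link line keeps a deliberately
repeated static archive on the command line (the archive and program
names are only illustrative):
@example
# liba.a and libb.a each resolve some of the other's symbols,
# so liba.a must appear both before and after libb.a.
libtool --preserve-dup-deps --mode=link $CC -o prog \
        main.o liba.a libb.a liba.a
@end example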
@node Archivers
@subsection Archivers
On all known systems, building a static library can be accomplished by
running @kbd{ar cr lib@var{name}.a @var{obj1}.o @var{obj2}.o @dots{}},
where the @file{.a} file is the output library, and each @file{.o} file is an
object file.
On all known systems, if there is a program named @command{ranlib}, then it
must be used to ``bless'' the created library before linking against it,
with the @kbd{ranlib lib@var{name}.a} command. Some systems, like Irix,
use the @code{ar ts} command, instead.
@node Cross compiling
@subsection Cross compiling
@cindex cross compile
Most build systems support the ability to compile libraries and applications
on one platform for use on a different platform, provided a compiler capable
of generating the appropriate output is available. In such cross compiling
scenarios, the platform where the libraries or applications are compiled
is called the @dfn{build platform}, while the platform where the libraries
or applications are intended to be used or executed is called the
@dfn{host platform}.
@ref{GNU Build System,, The GNU Build System, automake, The Automake Manual},
of which libtool is a part, supports cross compiling via arguments passed to
the configure script: @option{--build=...} and @option{--host=...}. However,
when the build platform and host platform are very different, libtool is
required to make certain accommodations to support these scenarios.
In most cases, because the build platform and host platform differ, the
cross-compiled libraries and executables can't be executed or tested on the
build platform where they were compiled. The testsuites of most build systems
will often skip any tests that involve executing such foreign executables when
cross-compiling. However, if the build platform and host platform are
sufficiently similar, it is often possible to run cross-compiled applications.
Libtool's own testsuite often attempts to execute cross-compiled tests, but
will mark any failures as @emph{skipped} since the failure might simply be due
to the differences between the two platforms.
In addition to cases where the host platform and build platform are extremely
similar (e.g. @samp{i586-pc-linux-gnu} and @samp{i686-pc-linux-gnu}), there is
another case where cross-compiled host applications may be executed on the
build platform. This is possible when the build platform supports an emulation
or API-enhanced environment for the host platform. One example of this
situation would be if the build platform were MinGW, and the host platform were
Cygwin (or vice versa). Both of these platforms can actually operate within a
single Windows instance, so Cygwin applications can be launched from a MinGW
context, and vice versa---provided certain care is taken. Another example
would be if the build platform were GNU/Linux on a 32-bit x86 processor, and
the host platform were MinGW. In this situation, the
@uref{http://www.winehq.org/, Wine} environment can be used to launch Windows
applications from the GNU/Linux operating system; again, provided certain care
is taken.
One particular issue occurs when a Windows platform such as MinGW, Cygwin, or
MSYS is the host or build platform, while the other platform is a Unix-style
system. In these cases, there are often conflicts between the format of the
file names and paths expected within host platform libraries and executables,
and those employed on the build platform.
This situation is best described using a concrete example: suppose the build
platform is GNU/Linux with canonical triplet @samp{i686-pc-linux-gnu}. Suppose
further that the host platform is MinGW with canonical triplet
@samp{i586-pc-mingw32}. On the GNU/Linux platform there is a cross compiler
following the usual naming conventions of such compilers, where the compiler
name is prefixed by the host canonical triplet (or suitable alias). (For more
information concerning canonical triplets and platform aliases, see
@ref{Specifying Target Triplets,, Specifying Target Triplets, autoconf,
The Autoconf Manual} and @ref{Canonicalizing,, Canonicalizing, autoconf,
The Autoconf Manual}.)  In this case, the C compiler is named
@samp{i586-pc-mingw32-gcc}.
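A @command{configure} invocation for this cross-compile scenario might
look like the following (the triplets are the illustrative ones used
above):
@example
$ @kbd{./configure --build=i686-pc-linux-gnu --host=i586-pc-mingw32}
@end example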
As described in @ref{Wrapper executables}, for the MinGW host platform libtool
uses a wrapper executable to set various environment variables before launching
the actual program executable. Like the program executable, the wrapper
executable is cross-compiled for the host platform (that is, for MinGW). As
described above, ordinarily a host platform executable cannot be executed on
the build platform, but in this case the Wine environment could be used to
launch the MinGW application from GNU/Linux. However, the wrapper executable,
as a host platform (MinGW) application, must set the @env{PATH} variable so
that the true application's dependent libraries can be located---but the
contents of the @env{PATH} variable must be structured for MinGW. Libtool
must use the Wine file name mapping facilities to determine the correct value
so that the wrapper executable can set the @env{PATH} variable to point to the
correct location.
For example, suppose we are compiling an application in @file{/var/tmp} on
GNU/Linux, using separate source code and build directories:
@example
@multitable @columnfractions 0.5 0.5
@item @file{/var/tmp/foo-1.2.3/app/} @tab (application source code)
@item @file{/var/tmp/foo-1.2.3/lib/} @tab (library source code)
@item @file{/var/tmp/BUILD/app/} @tab (application build objects here)
@item @file{/var/tmp/BUILD/lib/} @tab (library build objects here)
@end multitable
@end example
Since the library will be built in @file{/var/tmp/BUILD/lib}, the wrapper
executable (which will be in @file{/var/tmp/BUILD/app}) must add that
directory to @env{PATH} (actually, it must add the directory named
@var{objdir} under @file{/var/tmp/BUILD/lib}, but we'll ignore that detail
for now). However, Windows does not have a concept of Unix-style file or
directory names such as @file{/var/tmp/BUILD/lib}. Therefore, Wine provides
a mapping from Windows file names such as @file{C:\Program Files} to specific
Unix-style file names. Wine also provides a utility that can be used to map
Unix-style file names to Windows file names.
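For instance, the mapping can be inspected directly with Wine's
@command{winepath} utility; the drive letter in the output depends on
the local Wine configuration:
@example
$ @kbd{winepath -w /var/tmp/BUILD/lib}
Z:\var\tmp\BUILD\lib
@end example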
In this case, the wrapper executable should actually add the value
@example
Z:\var\tmp\BUILD\lib
@end example
@noindent
to the @env{PATH}.  Libtool contains support for path conversions of this
type, for a certain limited set of build and host platform combinations. In
this case, libtool will invoke Wine's @command{winepath} utility to ensure that
the correct @env{PATH} value is used. @xref{File name conversion}.
@node File name conversion
@subsection File name conversion
@cindex file name conversion
@cindex path conversion
In certain situations, libtool must convert file names and paths between
formats appropriate to different platforms. Usually this occurs when
cross-compiling, and affects only the ability to launch host platform
executables on the build platform using an emulation or API-enhancement
environment such as Wine. Failure to convert paths
(@pxref{File Name Conversion Failure}) will cause a warning to be issued, but
rarely causes the build to fail---and should have no effect on the compiled
products, once installed properly on the host platform. For more information,
@pxref{Cross compiling}.
However, file name conversion may also occur in another scenario: when using a
Unix emulation system on Windows (such as Cygwin or MSYS), combined with a
native Windows compiler such as MinGW or MSVC. Only a limited set of such
scenarios are currently supported; in other cases file name conversion is
skipped. The lack of file name conversion usually means that uninstalled
executables can't be launched, but only rarely causes the build to fail
(@pxref{File Name Conversion Failure}).
Libtool supports file name conversion in the following scenarios:
@multitable @columnfractions .25 .25 .5
@headitem build platform @tab host platform @tab Notes
@item MinGW (MSYS) @tab MinGW (Windows)
@tab @pxref{Native MinGW File Name Conversion}
@item Cygwin @tab MinGW (Windows)
@tab @pxref{Cygwin/Windows File Name Conversion}
@item Unix + Wine @tab MinGW (Windows)
@tab Requires Wine. @xref{Unix/Windows File Name Conversion}.
@item MinGW (MSYS) @tab Cygwin
@tab Requires @env{LT_CYGPATH}. @xref{LT_CYGPATH}. Provided for testing
purposes only.
@item Unix + Wine @tab Cygwin
@tab Requires both Wine and @env{LT_CYGPATH}, but does not yet work with
Cygwin 1.7.7 and Wine-1.2.
@xref{Unix/Windows File Name Conversion}, and @ref{LT_CYGPATH}.
@end multitable
@menu
* File Name Conversion Failure:: What happens when file name conversion fails
* Native MinGW File Name Conversion:: MSYS file name conversion idiosyncrasies
* Cygwin/Windows File Name Conversion:: Using @command{cygpath} to convert Cygwin file names
* Unix/Windows File Name Conversion:: Using Wine to convert Unix paths
* LT_CYGPATH:: Invoking @command{cygpath} from other environments
* Cygwin to MinGW Cross:: Other notes concerning MinGW cross
@end menu
@node File Name Conversion Failure
@subsubsection File Name Conversion Failure
@cindex File Name Conversion - Failure
@cindex Path Conversion - Failure
In most cases, file name conversion is not needed or attempted. However, when
libtool detects that a specific combination of build and host platform does
require file name conversion, it is possible that the conversion may fail.
In these cases, you may see a warning such as the following:
@example
Could not determine the host file name corresponding to
`... a file name ...'
Continuing, but uninstalled executables may not work.
@end example
@noindent
or
@example
Could not determine the host path corresponding to
`... a path ...'
Continuing, but uninstalled executables may not work.
@end example
@noindent
This should not cause the build to fail. At worst, it means that the wrapper
executable will specify file names or paths appropriate for the build platform.
Since those are not appropriate for the host platform, the uninstalled
executables would not operate correctly, even when the wrapper executable is
launched via the appropriate emulation or API-enhancement (e.g. Wine). Simply
install the executables on the host platform, and execute them there.
@node Native MinGW File Name Conversion
@subsubsection Native MinGW File Name Conversion
@cindex File Name Conversion - MinGW
@cindex Path Conversion - MinGW
@cindex MSYS
MSYS is a Unix emulation environment for Windows, and is specifically designed
such that in normal usage it @emph{pretends} to be MinGW or native Windows,
but understands Unix-style file names and paths, and supports standard Unix
tools and shells. Thus, ``native'' MinGW builds are actually an odd sort of
cross-compile, from an MSYS Unix emulation environment ``pretending'' to be
MinGW, to actual native Windows.
When an MSYS shell launches a native Windows executable (as opposed to other
@emph{MSYS} executables), it uses a system of heuristics to detect any
command-line arguments that contain file names or paths. It automatically
converts these file names from the MSYS (Unix-like) format, to the
corresponding Windows file name, before launching the executable. However,
this auto-conversion facility is only available when using the MSYS runtime
library. The wrapper executable itself is a MinGW application (that is, it
does not use the MSYS runtime library). The wrapper executable must set
@env{PATH} to, and call @code{_spawnv} with, values that have already been
converted from MSYS format to Windows. Thus, when libtool writes the source
code for the wrapper executable, it must manually convert MSYS paths to
Windows format, so that the Windows values can be hard-coded into the wrapper
executable.
@node Cygwin/Windows File Name Conversion
@subsubsection Cygwin/Windows File Name Conversion
@cindex File Name Conversion - Cygwin to Windows
@cindex Path Conversion - Cygwin to Windows
Cygwin provides a Unix emulation environment for Windows. As part of that
emulation, it provides a file system mapping that presents the Windows file
system in a Unix-compatible manner. Cygwin also provides a utility
@command{cygpath} that can be used to convert file names and paths between
the two representations. In a correctly configured Cygwin installation,
@command{cygpath} is always present, and is in the @env{PATH}.
Libtool uses @command{cygpath} to convert from Cygwin (Unix-style) file names
and paths to Windows format when the build platform is Cygwin and the host
platform is MinGW.
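For example, a conversion performed with @command{cygpath} might look
like this; the resulting Windows path depends on the Cygwin
installation's mount table, so the value shown is only illustrative:
@example
$ @kbd{cygpath -w /home/user/BUILD/lib}
C:\cygwin\home\user\BUILD\lib
@end example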
When the host platform is Cygwin, but the build platform is MSYS or some Unix
system, libtool also uses @command{cygpath} to convert from Windows to Cygwin
format (after first converting from the build platform format to Windows format;
@pxref{Native MinGW File Name Conversion}, and
@ref{Unix/Windows File Name Conversion}.)  Because the build platform is not
Cygwin, @command{cygpath} is not (and should not be) in the @env{PATH}.
Therefore, in this configuration the environment variable @env{LT_CYGPATH} is
required. @xref{LT_CYGPATH}.
@node Unix/Windows File Name Conversion
@subsubsection Unix/Windows File Name Conversion
@cindex File Name Conversion - Unix to Windows
@cindex Path Conversion - Unix to Windows
@uref{http://www.winehq.org/, Wine} provides an interpretation environment for
some Unix platforms where Windows applications can be executed. It provides
a mapping between the Unix file system and a virtual Windows file system used
by the Windows programs. For the file name conversion to work, Wine must be
installed and properly configured on the build platform, and the
@command{winepath} application must be in the build platform's @env{PATH}. In
addition, on 32-bit GNU/Linux it is usually helpful if the binfmt extension is
enabled.
@node LT_CYGPATH
@subsubsection LT_CYGPATH
@cindex LT_CYGPATH
For some cross-compile configurations (where the host platform is Cygwin), the
@command{cygpath} program is used to convert file names from the build platform
notation to the Cygwin form (technically, this conversion is from Windows
notation to Cygwin notation; the conversion from the build platform format
to Windows notation is performed via other means). However, because the
@command{cygpath} program is not (and should not be) in the @env{PATH} on
the build platform, @env{LT_CYGPATH} must specify the full build platform
file name (that is, the full Unix or MSYS file name) of the @command{cygpath}
program.
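For example, when cross-compiling for a Cygwin host from MSYS, the
variable could be set before running @command{configure} roughly like
this (the Cygwin installation directory and the triplets are
hypothetical and site specific):
@example
$ @kbd{export LT_CYGPATH=/c/cygwin/bin/cygpath.exe}
$ @kbd{./configure --build=i686-pc-mingw32 --host=i686-pc-cygwin}
@end example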
The reason @command{cygpath} should not be in the build platform @env{PATH} is
twofold: first, @command{cygpath} is usually installed in the same directory as
many other Cygwin executables, such as @command{sed}, @command{cp}, etc. If
the build platform environment had this directory in its @env{PATH}, then these
Cygwin versions of common Unix utilities might be used in preference to the
ones provided by the build platform itself, with deleterious effects. Second,
especially when Cygwin-1.7 or later is used, multiple Cygwin installations can
coexist within the same Windows instance. Each installation will have separate
``mount tables'' specified in @file{@var{CYGROOT-N}/etc/fstab}. These
@dfn{mount tables} control how that instance of Cygwin will map Windows file
names and paths to Cygwin form. Each installation's @command{cygpath} utility
automatically deduces the appropriate @file{/etc/fstab} file. Since each
@file{@var{CYGROOT-N}/etc/fstab} mount table may specify different mappings, it
matters which @command{cygpath} is used.
Note that @command{cygpath} is a Cygwin application; to execute this tool from
Unix requires a working and properly configured Wine installation, as well
as enabling the GNU/Linux @code{binfmt} extension. Furthermore, the Cygwin
@command{setup.exe} tool should have been used, via Wine, to properly install
Cygwin into the Wine file system (and registry).
Unfortunately, Wine support for Cygwin is intermittent. Recent releases of
Cygwin (1.7 and above) appear to require more Windows API support than Wine
provides (as of Wine version 1.2); most Cygwin applications fail to execute.
This includes @command{cygpath} itself. Hence, it is best @emph{not} to use
the @env{LT_CYGPATH} machinery in libtool when performing Unix to Cygwin
cross-compiles. Similarly, it is best @emph{not} to enable the GNU/Linux binfmt
support in this configuration, because while Wine will fail to execute the
compiled Cygwin applications, it will still exit with status zero. This tends
to confuse build systems and test suites (including libtool's own testsuite,
resulting in spurious reported failures). Wine support for the older
Cygwin-1.5 series appears satisfactory, but the Cygwin team no longer supports
Cygwin-1.5. It is hoped that Wine will eventually be improved such that
Cygwin-1.7 will again operate correctly under Wine. Until then, libtool will
report warnings as described in @ref{File Name Conversion Failure} in these
scenarios.
However, @env{LT_CYGPATH} is also used for the MSYS to Cygwin cross compile
scenario, and operates as expected.
@node Cygwin to MinGW Cross
@subsubsection Cygwin to MinGW Cross
@cindex Cygwin to MinGW Cross
There are actually three different scenarios that could all legitimately be
called a ``Cygwin to MinGW'' cross compile. The current (and standard)
definition is when there is a compiler that produces native Windows libraries
and applications, but which itself is a Cygwin application, just as would be
expected in any other cross compile setup.
However, historically there were two other definitions, which we will refer
to as the @emph{fake} one, and the @emph{lying} one.
In the @emph{fake} Cygwin to MinGW cross compile case, you actually use a
native MinGW compiler, but you do so from within a Cygwin environment:
@example
@kbd{export PATH="/c/MinGW/bin:$@{PATH@}"}
@kbd{configure --build=i686-pc-cygwin \
--host=mingw32 \
NM=/c/MinGW/bin/nm.exe}
@end example
In this way, the build system ``knows'' that you are cross compiling, and the
file name conversion logic will be used. However, because the tools
(@command{mingw32-gcc}, @command{nm}, @command{ar}) used are actually native
Windows applications, they will not understand any Cygwin (that is, Unix-like)
absolute file names passed as command line arguments (and, unlike MSYS, Cygwin
does not automatically convert such arguments).  Nevertheless, so long as only
relative file names are used in the build system, and non-Windows-supported
Unix idioms such as symlinks and mount points are avoided, this scenario should
work.
If you must use absolute file names, you will have to force Libtool to convert
file names for the toolchain in this case, by doing the following before you
run configure:
@example
@kbd{export lt_cv_to_tool_file_cmd=func_convert_file_cygwin_to_w32}
@end example
@cindex lt_cv_to_tool_file_cmd
@cindex func_convert_file_cygwin_to_w32
In the @emph{lying} Cygwin to MinGW cross compile case, you lie to the
build system:
@example
@kbd{export PATH="/c/MinGW/bin:$@{PATH@}"}
@kbd{configure --build=i686-pc-mingw32 \
--host=i686-pc-mingw32 \
--disable-dependency-tracking}
@end example
@noindent
and claim that the build platform is MinGW, even though you are actually
running under @emph{Cygwin} and not MinGW. In this case, libtool does
@emph{not} know that you are performing a cross compile, and thinks instead
that you are performing a native MinGW build. However, as described in
@ref{Native MinGW File Name Conversion}, that scenario triggers an ``MSYS
to Windows'' file name conversion. This, of course, is the wrong conversion
since we are actually running under Cygwin. Also, the toolchain is expecting
Windows file names (not Cygwin) but unless told so Libtool will feed Cygwin
file names to the toolchain in this case. To force the correct file name
conversions in this situation, you should do the following @emph{before}
running configure:
@example
@kbd{export lt_cv_to_host_file_cmd=func_convert_file_cygwin_to_w32}
@kbd{export lt_cv_to_tool_file_cmd=func_convert_file_cygwin_to_w32}
@end example
@cindex lt_cv_to_host_file_cmd
@cindex lt_cv_to_tool_file_cmd
@cindex func_convert_file_cygwin_to_w32
Note that this relies on internal implementation details of libtool, and
is subject to change. Also, @code{--disable-dependency-tracking} is required,
because otherwise the MinGW GCC will generate dependency files that contain
Windows file names. This, in turn, will confuse the Cygwin @command{make}
program, which does not accept Windows file names:
@example
Makefile:1: *** target pattern contains no `%'. Stop.
@end example
There have also always been a number of other details required for the
@emph{lying} case to operate correctly, such as the use of so-called
@dfn{identity mounts}:
@example
# @var{cygwin-root}/etc/fstab
D:/foo /foo some_fs binary 0 0
D:/bar /bar some_fs binary 0 0
E:/grill /grill some_fs binary 0 0
@end example
In this way, top-level directories of each drive are available using
identical names within Cygwin.
Note that you also need to ensure that the standard Unix directories
(like @file{/bin}, @file{/lib}, @file{/usr}, @file{/etc}) appear in the root
of a drive. This means that you must install Cygwin itself into the @file{C:/}
root directory (or @file{D:/}, or @file{E:/}, etc)---instead of the
recommended installation into @file{C:/cygwin/}. In addition, all file names
used in the build system must be relative, symlinks should not be used within
the source or build directory trees, and all @option{-M*} options to
@command{gcc} except @option{-MMD} must be avoided.
This is quite a fragile setup, but it has been in historical use, and so is
documented here.
@node Windows DLLs
@subsection Windows DLLs
@cindex Windows DLLs
This topic describes a couple of ways to portably create Windows Dynamic
Link Libraries (DLLs). Libtool knows how to create DLLs using GNU tools
and using Microsoft tools.
A typical library has a ``hidden'' implementation with an interface
described in a header file. On just about every system, the interface
could be something like this:
Example @file{foo.h}:
@example
#ifndef FOO_H
#define FOO_H
int one (void);
int two (void);
extern int three;
#endif /* FOO_H */
@end example
@noindent
And the implementation could be something like this:
Example @file{foo.c}:
@example
#include "foo.h"
int one (void)
@{
return 1;
@}
int two (void)
@{
return three - one ();
@}
int three = 3;
@end example
When using contemporary GNU tools to create the Windows DLL, the above
code will work there too, thanks to its auto-import/auto-export
features.  That is not the case when using older GNU tools or, perhaps
more interestingly, when using proprietary tools.  In those cases the
interface symbols need additional decoration with
@code{__declspec(dllimport)} and @code{__declspec(dllexport)}, depending
on whether the library is being built or consumed, and on how it is
built and consumed.  The code above would also have worked with
Microsoft tools if only the variable @code{three} hadn't been there,
because the Microsoft tools automatically import functions (but sadly
not variables) and Libtool automatically exports non-static symbols, as
described next.
With Microsoft tools, Libtool digs through the object files that make up
the library, looking for non-static symbols to automatically export.
That is, with Microsoft tools Libtool tries to mimic the auto-export
feature of contemporary GNU tools.  Note that the GNU auto-export
feature is turned off when an explicit @code{__declspec(dllexport)} is
seen; the GNU tools do this so as not to expose additional symbols in
projects that have already taken the trouble to decorate them.  There is no
similar way to limit what symbols are visible in the code when Libtool
is using Microsoft tools. In order to limit symbol visibility in that
case you need to use one of the options @option{-export-symbols} or
@option{-export-symbols-regex}.
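For example, a hypothetical libtool link line that restricts the
exported symbols to those listed in a file @file{foo.sym} (here
containing @code{one}, @code{two} and @code{three}) might look like
this:
@example
# Only the symbols listed in foo.sym are exported from the resulting DLL.
libtool --mode=link $CC -no-undefined -o libfoo.la foo.lo \
        -export-symbols foo.sym -rpath /usr/local/lib
@end example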
No matching help with auto-import is provided by Libtool, which is why
variables must be decorated to import them from a DLL for everything but
contemporary GNU tools. As stated above, functions are automatically
imported by both contemporary GNU tools and Microsoft tools, but for
other proprietary tools the auto-import status of functions is unknown.
When the objects that form the library are built, there are generally
two copies built for each object. One copy is used when linking the DLL
and one copy is used for the static library. On Windows systems, a pair
of defines are commonly used to discriminate how the interface symbols
should be decorated. The first define is @samp{-DDLL_EXPORT}, which is
automatically provided by Libtool when @command{libtool} builds the copy
of the object that is destined for the DLL. The second define is
@samp{-DLIBFOO_BUILD} (or similar), which is often added by the package
providing the library and is used when building the library, but not
when consuming the library.
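To illustrate, the two compilations of one source file might look
roughly like the following hypothetical command lines;
@samp{-DLIBFOO_BUILD} comes from the package's own build system, while
@samp{-DDLL_EXPORT} is added by @command{libtool} only for the copy
destined for the DLL:
@example
# Copy destined for the DLL:
$CC -DLIBFOO_BUILD -DDLL_EXPORT -c foo.c -o .libs/foo.o
# Copy destined for the static library:
$CC -DLIBFOO_BUILD -c foo.c -o foo.o
@end example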
However, the matching double compile is not performed when consuming
libraries. It is therefore not possible to reliably distinguish if the
consumer is importing from a DLL or if it is going to use a static
library.
With contemporary GNU tools, auto-import often saves the day, but see
the GNU ld documentation and its @option{--enable-auto-import} option
for some corner cases when it does not
(@pxref{Options, @option{--enable-auto-import}, Options specific to
i386 PE targets, ld, Using ld@comma{} the GNU linker}).
With Microsoft tools you typically get away with always compiling the
code such that variables are expected to be imported from a DLL and
functions are expected to be found in a static library. The tools will
then automatically import the function from a DLL if that is where they
are found. If the variables are not imported from a DLL as expected, but
are found in a static library that is otherwise pulled in by some
function, the linker will issue a warning (LNK4217) that a locally
defined symbol is imported, but it still works. In other words, this
scheme will not work to only consume variables from a library. There is
also a price connected to this liberal use of imports in that an extra
indirection is introduced when you are consuming the static version of
the library. That extra indirection is unavoidable when the DLL is
consumed, but it is not needed when consuming the static library.
For older GNU tools and other proprietary tools there is no generic way
to consume either the DLL or the static library without user
intervention; the tools need to be told what is intended.
One common assumption is that if a DLL is being built (@samp{DLL_EXPORT}
is defined) then that DLL is going to consume any dependent libraries as
DLLs. If that assumption is made everywhere, it is possible to select
how an end-user application is consuming libraries by adding a single
flag @samp{-DDLL_EXPORT} when a DLL build is required. This is of course
an all or nothing deal, either everything as DLLs or everything as static
libraries.
To sum up the above, the header file of the foo library needs to be
changed into something like this:
Modified @file{foo.h}:
@example
#ifndef FOO_H
#define FOO_H
#if defined _WIN32 && !defined __GNUC__
# ifdef LIBFOO_BUILD
# ifdef DLL_EXPORT
# define LIBFOO_SCOPE __declspec (dllexport)
# define LIBFOO_SCOPE_VAR extern __declspec (dllexport)
# endif
# elif defined _MSC_VER
# define LIBFOO_SCOPE
# define LIBFOO_SCOPE_VAR extern __declspec (dllimport)
# elif defined DLL_EXPORT
# define LIBFOO_SCOPE __declspec (dllimport)
# define LIBFOO_SCOPE_VAR extern __declspec (dllimport)
# endif
#endif
#ifndef LIBFOO_SCOPE
# define LIBFOO_SCOPE
# define LIBFOO_SCOPE_VAR extern
#endif
LIBFOO_SCOPE int one (void);
LIBFOO_SCOPE int two (void);
LIBFOO_SCOPE_VAR int three;
#endif /* FOO_H */
@end example
When the targets are limited to contemporary GNU tools and Microsoft
tools, the above can be simplified to the following:
Simplified @file{foo.h}:
@example
#ifndef FOO_H
#define FOO_H
#if defined _WIN32 && !defined __GNUC__ && !defined LIBFOO_BUILD
# define LIBFOO_SCOPE_VAR extern __declspec (dllimport)
#else
# define LIBFOO_SCOPE_VAR extern
#endif
int one (void);
int two (void);
LIBFOO_SCOPE_VAR int three;
#endif /* FOO_H */
@end example
This last simplified version can of course only work when Libtool is
used to build the DLL, as no symbols would be exported otherwise (i.e.,
when using Microsoft tools).
It should be noted that there are various projects that attempt to relax
these requirements by various low level tricks, but they are not
discussed here.
Examples are
@uref{https://github.com/ocaml/flexdll, FlexDLL} and
@uref{https://edll.sourceforge.net/, edll}.
@node libtool script contents
@section @code{libtool} script contents
@cindex implementation of libtool
@cindex libtool implementation
Since version 1.4, the @code{libtool} script is generated by
@code{configure} (@pxref{Configuring}). In earlier versions,
@code{configure} achieved this by calling a helper script called
@file{ltconfig}. From libtool version 0.7 to 1.0, this script
simply set shell variables, then sourced the libtool backend,
@code{ltmain.sh}. @code{ltconfig} from libtool version 1.1 through 1.3
inlined the contents of @code{ltmain.sh} into the generated
@code{libtool}, which improved performance on many systems. The tests
that @file{ltconfig} used to perform are now kept in @file{libtool.m4}
where they can be written using Autoconf. This has the runtime
performance benefits of inlined @code{ltmain.sh}, @emph{and} improves
the build time a little, while considerably reducing the amount of raw
shell code that has to be maintained.
The convention used for naming variables that hold shell commands for
delayed evaluation is to use the suffix @code{_cmd} where a single
line of valid shell script is needed, and the suffix @code{_cmds} where
multiple lines of shell script @strong{may} be delayed for later
evaluation. By convention, @code{_cmds} variables delimit the
evaluation units with the @code{~} character where necessary.
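For example, a @code{_cmds} variable holding two delayed commands might
look like this (a hypothetical value, shown only to illustrate the
@samp{~} delimiter between the evaluation units):
@example
old_archive_cmds="\$AR \$AR_FLAGS \$oldlib\$oldobjs~\$RANLIB \$oldlib"
@end example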
Here is a listing of each of the configuration variables, and how they
are used within @code{ltmain.sh} (@pxref{Configuring}):
@defvar AR
The name of the system library archiver.
@end defvar
@defvar CC
The name of the compiler used to configure libtool. This will always
contain the compiler for the current language (@pxref{Tags}).
@end defvar
@defvar ECHO
An @command{echo} program that does not interpret backslashes as an
escape character.  It may be given only one argument, so appropriate quoting
is necessary.
@end defvar
@defvar LD
The name of the linker that libtool should use internally for reloadable
linking and possibly shared libraries.
@end defvar
@defvar LTCC
@defvarx LTCFLAGS
The name of the C compiler and C compiler flags used to configure
libtool.
@end defvar
@defvar NM
The name of a BSD- or MS-compatible program that produces listings of
global symbols.
For BSD @command{nm}, the symbols should be in one of the following formats:
@example
@var{address} C @var{global-variable-name}
@var{address} D @var{global-variable-name}
@var{address} T @var{global-function-name}
@end example
For MS @command{dumpbin}, the symbols should be in one of the following
formats:
@example
@var{counter} @var{size} UNDEF notype External | @var{global-var}
@var{counter} @var{address} @var{section} notype External | @var{global-var}
@var{counter} @var{address} @var{section} notype () External | @var{global-func}
@end example
The @var{size} of a global variable is not zero, and the @var{section}
of a global function is not @samp{UNDEF}.  Symbols in ``pick any''
sections (where @samp{pick any} appears in the section header) are not
global either.
@end defvar
@defvar RANLIB
Set to the name of the @command{ranlib} program, if any.
@end defvar
@defvar allow_undefined_flag
The flag that is used by @samp{archive_cmds} to declare that
there will be unresolved symbols in the resulting shared library.
Empty, if no such flag is required. Set to @samp{unsupported} if there
is no way to generate a shared library with references to symbols that
aren't defined in that library.
@end defvar
@defvar always_export_symbols
Whether libtool should automatically generate a list of exported symbols
using @code{export_symbols_cmds} before linking an archive.
Set to @samp{yes} or @samp{no}. Default is @samp{no}.
@end defvar
@defvar archive_cmds
@defvarx archive_expsym_cmds
@defvarx old_archive_cmds
Commands used to create shared libraries, shared libraries with
@option{-export-symbols} and static libraries, respectively.
@end defvar
@defvar archiver_list_spec
Specify filename containing input files for @code{AR}.
@end defvar
@defvar old_archive_from_new_cmds
If the shared library depends on a static library,
@samp{old_archive_from_new_cmds} contains the commands used to create that
static library. If this variable is not empty, @samp{old_archive_cmds} is
not used.
@end defvar
@defvar old_archive_from_expsyms_cmds
If a static library must be created from the export symbol list to
correctly link with a shared library, @samp{old_archive_from_expsyms_cmds}
contains the commands needed to create that static library. When these
commands are executed, the variable @code{soname} contains the name of the
shared library in question, and the @samp{$objdir/$newlib} contains the
path of the static library these commands should build. After executing
these commands, libtool will proceed to link against @samp{$objdir/$newlib}
instead of @code{soname}.
@end defvar
@defvar lock_old_archive_extraction
Set to @samp{yes} if the extraction of a static library requires locking
the library file. This is required on Darwin.
@end defvar
@defvar build
@defvarx build_alias
@defvarx build_os
Set to the specified and canonical names of the system that libtool was
built on.
@end defvar
@defvar build_libtool_libs
Whether libtool should build shared libraries on this system. Set to
@samp{yes} or @samp{no}.
@end defvar
@defvar build_old_libs
Whether libtool should build static libraries on this system. Set to
@samp{yes} or @samp{no}.
@end defvar
@defvar compiler_c_o
Whether the compiler supports the @option{-c} and @option{-o} options
simultaneously. Set to @samp{yes} or @samp{no}.
@end defvar
@defvar compiler_needs_object
Whether the compiler has to see an object listed on the command line in
order to successfully invoke the linker. If @samp{no}, then a set of
convenience archives or a set of object file names can be passed via
linker-specific options or linker scripts.
@end defvar
@defvar dlopen_support
Whether @code{dlopen} is supported on the platform.
Set to @samp{yes} or @samp{no}.
@end defvar
@defvar dlopen_self
Whether it is possible to @code{dlopen} the executable itself.
Set to @samp{yes} or @samp{no}.
@end defvar
@defvar dlopen_self_static
Whether it is possible to @code{dlopen} the executable itself, when it
is linked statically (@option{-all-static}). Set to @samp{yes} or
@samp{no}.
@end defvar
@defvar exclude_expsyms
List of symbols that should not be listed in the preloaded symbols.
@end defvar
@defvar export_dynamic_flag_spec
Compiler link flag that allows a dlopened shared library to reference
symbols that are defined in the program.
@end defvar
@defvar export_symbols_cmds
Commands to extract exported symbols from @code{libobjs} to the
file @code{export_symbols}.
@end defvar
@defvar extract_expsyms_cmds
Commands to extract the exported symbols list from a shared library.
These commands are executed if there is no file @samp{$objdir/$soname-def},
and should write the names of the exported symbols to that file, for
the use of @samp{old_archive_from_expsyms_cmds}.
@end defvar
@defvar fast_install
Determines whether libtool will privilege the installer or the
developer. The assumption is that installers will seldom run programs
in the build tree, and the developer will seldom install. This is only
meaningful on platforms where @code{shlibpath_overrides_runpath} is
not @samp{yes}, so @code{fast_install} will be set to @samp{needless} in
this case.  If @code{fast_install} is set to @samp{yes}, libtool will create
programs that search for installed libraries, and, if a program is run
in the build tree, a new copy will be linked on-demand to use the
yet-to-be-installed libraries. If set to @samp{no}, libtool will create
programs that use the yet-to-be-installed libraries, and will link
a new copy of the program at install time. The default value is
@samp{yes} or @samp{needless}, depending on platform and configuration
flags, and it can be turned from @samp{yes} to @samp{no} with the
configure flag @option{--disable-fast-install}.
On some systems, the linker always hardcodes paths to dependent libraries
into the output. In this case, @code{fast_install} is never set to @samp{yes},
and relinking at install time is triggered. This also means that @env{DESTDIR}
installation does not work as expected.
@end defvar
@defvar file_magic_glob
How to find potential files when @code{deplibs_check_method} is
@samp{file_magic}. @code{file_magic_glob} is a @code{sed} expression,
and the @code{sed} instance is fed potential file names that are
transformed by the @code{file_magic_glob} expression. Useful when the
shell does not support the shell option @code{nocaseglob}, making
@code{want_nocaseglob} inappropriate. Normally disabled (i.e.
@code{file_magic_glob} is empty).
@end defvar
@defvar finish_cmds
Commands to tell the dynamic linker how to find shared libraries in a
specific directory.  These commands can be disabled with
@option{--no-finish} when testing local changes to shared libraries.
@end defvar
@defvar finish_eval
Same as @code{finish_cmds}, except the commands are not displayed.
@end defvar
@defvar global_symbol_pipe
A pipeline that takes the output of @code{NM}, and produces a listing of
raw symbols followed by their C names. For example:
@example
$ @kbd{eval "$NM progname | $global_symbol_pipe"}
D @var{symbol1} @var{C-symbol1}
T @var{symbol2} @var{C-symbol2}
C @var{symbol3} @var{C-symbol3}
@dots{}
$
@end example
The first column contains the symbol type (used to tell data from code)
but its meaning is system dependent.
@end defvar
@defvar global_symbol_to_cdecl
A pipeline that translates the output of @code{global_symbol_pipe} into
proper C declarations.  Since some platforms, such as HP-UX, have
linkers that differentiate code from data, data symbols are declared
as data, and code symbols are declared as functions.
@end defvar
@defvar hardcode_action
Either @samp{immediate} or @samp{relink}, depending on whether shared
library paths can be hardcoded into executables before they are installed,
or if they need to be relinked.
@end defvar
@defvar hardcode_direct
Set to @samp{yes} or @samp{no}, depending on whether the linker
hardcodes directories if a library is directly specified on the command
line (such as @samp{@var{dir}/lib@var{name}.a}) when
@code{hardcode_libdir_flag_spec} is specified.
@end defvar
@defvar hardcode_direct_absolute
Some architectures hardcode "absolute" library directories that cannot
be overridden by @code{shlibpath_var} when @code{hardcode_direct} is
@samp{yes}. In that case set @code{hardcode_direct_absolute} to
@samp{yes}, or otherwise @samp{no}.
@end defvar
@defvar hardcode_into_libs
Whether the platform supports hardcoding of run-paths into libraries.
If enabled, linking of programs will be much simpler but libraries will
need to be relinked during installation. Set to @samp{yes} or @samp{no}.
@end defvar
@defvar hardcode_libdir_flag_spec
Flag to hardcode a @code{libdir} variable into a binary, so that the
dynamic linker searches @code{libdir} for shared libraries at runtime.
If it is empty, libtool will try to use some other hardcoding mechanism.
@end defvar
@defvar hardcode_libdir_separator
If the compiler only accepts a single @code{hardcode_libdir_flag}, then
this variable contains the string that should separate multiple
arguments to that flag.
@end defvar
@defvar hardcode_minus_L
Set to @samp{yes} or @samp{no}, depending on whether the linker
hardcodes directories specified by @option{-L} flags into the resulting
executable when @code{hardcode_libdir_flag_spec} is specified.
@end defvar
@defvar hardcode_shlibpath_var
Set to @samp{yes} or @samp{no}, depending on whether the linker
hardcodes directories by writing the contents of @samp{$shlibpath_var}
into the resulting executable when @code{hardcode_libdir_flag_spec} is
specified. Set to @samp{unsupported} if directories specified by
@samp{$shlibpath_var} are searched at run time, but not at link time.
@end defvar
@defvar host
@defvarx host_alias
@defvarx host_os
Set to the specified and canonical names of the system that libtool was
configured for.
@end defvar
@defvar include_expsyms
List of symbols that must always be exported when using @code{export_symbols}.
@end defvar
@defvar inherit_rpath
Whether the linker adds runtime paths of dependency libraries to the
runtime path list, requiring libtool to relink the output when installing.
Set to @samp{yes} or @samp{no}. Default is @samp{no}.
@end defvar
@defvar install_override_mode
Permission mode override for installation of shared libraries. If the
runtime linker fails to load libraries with wrong permissions, then it
may fail to execute programs that are needed during installation,
because these need the library that has just been installed. In this
case, it is necessary to pass the mode to @command{install} with
@option{-m @var{install_override_mode}}.
@end defvar
@defvar libext
The standard old archive suffix (normally @samp{a}).
@end defvar
@defvar libname_spec
The format of a library name prefix. On all Unix systems, static
libraries are called @samp{lib@var{name}.a}, but on some systems (such
as OS/2 or MS-DOS), the library is just called @samp{@var{name}.a}.
@end defvar
@defvar library_names_spec
A list of shared library names. The first is the name of the file,
the rest are symbolic links to the file. The name in the list is
the file name that the linker finds when given @option{-l@var{name}}.
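For instance, on a typical ELF system the expanded list might look like
the following (illustrative names only), where @file{libfoo.so.1.0.2} is
the actual file, the other two names are symbolic links to it, and
@file{libfoo.so} is the name the linker finds when given @option{-lfoo}:
@example
libfoo.so.1.0.2  libfoo.so.1  libfoo.so
@end example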
@end defvar
@defvar link_all_deplibs
Whether libtool must link a program against all its dependency libraries.
Set to @samp{yes} or @samp{no}. Default is @samp{unknown}, which is
a synonym for @samp{yes}.
@end defvar
@defvar link_static_flag
Linker flag (passed through the C compiler) used to prevent dynamic
linking.
@end defvar
@defvar macro_version
@defvarx macro_revision
The release and revision from which the libtool.m4 macros were
taken. This is used to ensure that macros and @code{ltmain.sh}
correspond to the same Libtool version.
@end defvar
@defvar max_cmd_len
The approximate longest command line that can be passed to @samp{$SHELL}
without being truncated, as computed by @samp{LT_CMD_MAX_LEN}.
@end defvar
@defvar need_lib_prefix
Whether we can @code{dlopen} modules without a @samp{lib} prefix.
Set to @samp{yes} or @samp{no}. By default, it is @samp{unknown}, which
means the same as @samp{yes}, but documents that we are not really sure
about it. @samp{no} means that it is possible to @code{dlopen} a
module without the @samp{lib} prefix.
@end defvar
@defvar need_version
Whether versioning is required for libraries, i.e.@: whether the
dynamic linker requires a version suffix for all libraries.
Set to @samp{yes} or @samp{no}. By default, it is @samp{unknown}, which
means the same as @samp{yes}, but documents that we are not really sure
about it.
@end defvar
@defvar need_locks
Whether files must be locked to prevent conflicts when compiling
simultaneously. Set to @samp{yes} or @samp{no}.
@end defvar
@defvar nm_file_list_spec
Specify filename containing input files for @code{NM}.
@end defvar
@defvar no_builtin_flag
Compiler flag to disable builtin functions that conflict with declaring
external global symbols as @code{char}.
@end defvar
@defvar no_undefined_flag
The flag that is used by @samp{archive_cmds} to declare that
there will be no unresolved symbols in the resulting shared library.
Empty, if no such flag is required.
@end defvar
@defvar objdir
The name of the directory that contains temporary libtool files.
@end defvar
@defvar objext
The standard object file suffix (normally @samp{o}).
@end defvar
@defvar pic_flag
Any additional compiler flags for building library object files.
@end defvar
@defvar postinstall_cmds
@defvarx old_postinstall_cmds
Commands run after installing a shared or static library, respectively.
@end defvar
@defvar postuninstall_cmds
@defvarx old_postuninstall_cmds
Commands run after uninstalling a shared or static library, respectively.
@end defvar
@defvar postlink_cmds
Commands necessary for finishing linking programs. @code{postlink_cmds}
are executed immediately after the program is linked. Any occurrence of
the string @code{@@OUTPUT@@} in @code{postlink_cmds} is replaced by the
name of the created executable (i.e.@: not the wrapper, if a wrapper is
generated) prior to execution. Similarly, @code{@@TOOL_OUTPUT@@} is
replaced by the toolchain format of @code{@@OUTPUT@@}. Normally disabled
(i.e.@: @code{postlink_cmds} empty).
@end defvar
@defvar reload_cmds
@defvarx reload_flag
Commands to create a reloadable object. Set @code{reload_cmds} to
@samp{false} on systems that cannot create reloadable objects.
@end defvar
@defvar runpath_var
The environment variable that tells the linker what directories to
hardcode in the resulting executable.
@end defvar
@defvar shlibpath_overrides_runpath
Indicates whether it is possible to override the hard-coded library
search path of a program with an environment variable. If this is set
to no, libtool may have to create two copies of a program in the build
tree, one to be installed and one to be run in the build tree only.
When each of these copies is created depends on the value of
@code{fast_install}. The default value is @samp{unknown}, which is
equivalent to @samp{no}.
@end defvar
@defvar shlibpath_var
The environment variable that tells the dynamic linker where to find
shared libraries.
@end defvar
@defvar soname_spec
The name coded into shared libraries, if different from the real name of
the file.
@end defvar
@defvar striplib
@defvarx old_striplib
Command to strip a shared (@code{striplib}) or static (@code{old_striplib})
library, respectively. If these variables are empty, the strip flag
in the install mode will be ignored for libraries (@pxref{Install mode}).
@end defvar
@defvar sys_lib_dlsearch_path_spec
Expression to get the run-time system library search path. Directories
that appear in this list are never hard-coded into executables.
@end defvar
@defvar sys_lib_search_path_spec
Expression to get the compile-time system library search path. This
variable is used by libtool when it has to test whether a certain
library is shared or static. The directories listed in
@code{shlibpath_var} are automatically appended to this list, every time
libtool runs (i.e., not at configuration time), because some linkers use
this variable to extend the library search path. Linker switches such
as @option{-L} also augment the search path.
@end defvar
@defvar thread_safe_flag_spec
Linker flag (passed through the C compiler) used to generate thread-safe
libraries.
@end defvar
@defvar to_host_file_cmd
If the toolchain is not native to the build platform (e.g.@: if you are using
MSYS to drive the scripting, but are using the MinGW native Windows compiler)
this variable describes how to convert file names from the format used by the
build platform to the format used by the host platform.  It is normally
set to @samp{func_convert_file_noop}; libtool will autodetect most cases
where other values should be used.  On rare occasions, it may be necessary to override
the autodetected value (@pxref{Cygwin to MinGW Cross}).
@end defvar
@defvar to_tool_file_cmd
If the toolchain is not native to the build platform (e.g.@: if you are using
some Unix to drive the scripting together with a Windows toolchain running
in Wine) this variable describes how to convert file names from the format
used by the build platform to the format used by the toolchain. Normally set
to @samp{func_convert_file_noop}.
@end defvar
@defvar version_type
The library version numbering type. One of @samp{libtool},
@samp{freebsd-aout}, @samp{freebsd-elf}, @samp{irix}, @samp{linux},
@samp{osf}, @samp{sunos}, @samp{windows}, or @samp{none}.
@end defvar
@defvar want_nocaseglob
Find potential files using the shell option @code{nocaseglob}, when
@code{deplibs_check_method} is @samp{file_magic}. Normally set to
@samp{no}. Set to @samp{yes} to enable the @code{nocaseglob} shell
option when looking for potential file names in a case-insensitive
manner.
@end defvar
@defvar whole_archive_flag_spec
Compiler flag to generate shared objects from convenience archives.
@end defvar
@defvar wl
The C compiler flag that allows libtool to pass a flag directly to the
linker. Used as: @code{$@{wl@}@var{some-flag}}.
@end defvar
Variables ending in @samp{_cmds} or @samp{_eval} contain a
@samp{~}-separated list of commands that are @code{eval}ed one after
another. If any of the commands return a nonzero exit status, libtool
generally exits with an error message.
Variables ending in @samp{_spec} are @code{eval}ed before being used by
libtool.
@node Cheap tricks
@section Cheap tricks
Here are a few tricks that you can use to make maintainership
easier:
@itemize @bullet
@item
When people report bugs, ask them to use the @option{--config},
@option{--debug}, or @option{--features} flags, if you think they will help
you. These flags are there to help you get information directly, rather
than having to trust second-hand observation.
@item
Rather than reconfiguring libtool every time I make a change to
@code{ltmain.in}, I keep a permanent @code{libtool} script in my
@env{PATH}, which sources @code{ltmain.in} directly.
The following steps describe how to create such a script, where
@code{/home/src/libtool} is the directory containing the libtool source
tree, @code{/home/src/libtool/libtool} is a libtool script that has been
configured for your platform, and @code{~/bin} is a directory in your
@env{PATH}:
@smallexample
trick$ cd ~/bin
trick$ sed 's%^\(macro_version=\).*$%\1@@VERSION@@%;
s%^\(macro_revision=\).*$%\1@@package_revision@@%;
/^# ltmain\.sh/q' /home/src/libtool/libtool > libtool
trick$ echo '. /home/src/libtool/ltmain.in' >> libtool
trick$ chmod +x libtool
trick$ libtool --version
ltmain.sh (GNU @@PACKAGE@@@@TIMESTAMP@@) @@VERSION@@
Copyright (C) 2014 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
trick$
@end smallexample
@end itemize
The output of the final @samp{libtool --version} command shows that the
@code{ltmain.in} script is being used directly. Now, modify
@code{~/bin/libtool} or @code{/home/src/libtool/ltmain.in} directly in
order to test new changes without having to rerun @code{configure}.
@node GNU Free Documentation License
@appendix GNU Free Documentation License
@cindex FDL, GNU Free Documentation License
@include fdl.texi
@page
@node Combined Index
@unnumbered Combined Index
@printindex cp
@bye