\input texinfo
@c %**start of header
@setfilename R-admin.info
@settitle R Installation and Administration
@setchapternewpage on
@defcodeindex en
@c %**end of header
@syncodeindex fn vr
@dircategory Programming
@direntry
* R Administration: (R-admin). R Installation and Administration.
@end direntry
@finalout
@include R-defs.texi
@include version.texi
@copying
This manual is for R, version @value{VERSION}.
@Rcopyright{2001}
@quotation
@permission{}
@end quotation
@end copying
@titlepage
@title R Installation and Administration
@subtitle Version @value{VERSION}
@author R Core Team
@page
@vskip 0pt plus 1filll
@insertcopying
@end titlepage
@ifplaintext
@insertcopying
@end ifplaintext
@c @ifnothtml
@contents
@c @end ifnothtml
@ifnottex
@node Top, Obtaining R, (dir), (dir)
@top R Installation and Administration
This is a guide to installation and administration for R.
@insertcopying
@end ifnottex
@menu
* Obtaining R::
* Installing R under Unix-alikes::
* Installing R under Windows::
* Installing R under macOS::
* Running R::
* Add-on packages::
* Internationalization::
* Choosing between 32- and 64-bit builds::
* The standalone Rmath library::
* Essential and useful other programs under a Unix-alike::
* Configuration on a Unix-alike::
* Platform notes::
* The Windows toolset::
* Function and variable index::
* Concept index::
* Environment variable index::
@end menu
@node Obtaining R, Installing R under Unix-alikes, Top, Top
@chapter Obtaining R
@cindex Obtaining R
Sources, binaries and documentation for @R{} can be obtained via
@acronym{CRAN}, the ``Comprehensive R Archive Network'' whose current
members are listed at @uref{https://CRAN.R-project.org/@/mirrors.html}.
@menu
* Getting and unpacking the sources::
* Getting patched and development versions::
@end menu
@node Getting and unpacking the sources, Getting patched and development versions, Obtaining R, Obtaining R
@section Getting and unpacking the sources
@cindex Sources for R
The simplest way is to download the most recent
@file{R-@var{x}.@var{y}.@var{z}.tar.gz} file, and unpack it with
@example
tar -xf R-@var{x}.@var{y}.@var{z}.tar.gz
@end example
@noindent
on systems that have a suitable@footnote{e.g.@: @acronym{GNU}
@command{tar} version 1.15 or later, or that from the @samp{libarchive}
(as used on macOS versions 10.6 and later) or `Heirloom Toolchest'
distributions.} @command{tar} installed. On other systems you need to
have the @command{gzip} program installed, in which case you can use
@example
gzip -dc R-@var{x}.@var{y}.@var{z}.tar.gz | tar -xf -
@end example
The pathname of the directory into which the sources are unpacked should
not contain spaces, as most @command{make} programs (and specifically
@acronym{GNU} @command{make}) do not expect spaces.
If you want the build to be usable by a group of users, set @code{umask}
before unpacking so that the files will be readable by the target group
(e.g.,@: @code{umask 022} to be usable by all users). Keep this
setting of @code{umask} whilst building and installing.
If you use a fairly recent @acronym{GNU} version of @command{tar} and do
this as a root account (which on Windows includes accounts with
administrator privileges) you may see many warnings about changing
ownership, in which case you can use
@example
tar --no-same-owner -xf R-@var{x}.@var{y}.@var{z}.tar.gz
@end example
@noindent
and perhaps also include the option @option{--no-same-permissions}.
@enindex TAR_OPTIONS
(These options can also be set in the @env{TAR_OPTIONS} environment
variable: if more than one option is included they should be separated
by spaces.)
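For example, with a Bourne-style shell and a recent @acronym{GNU}
@command{tar} one might use something like
@example
TAR_OPTIONS="--no-same-owner --no-same-permissions" \
  tar -xf R-@var{x}.@var{y}.@var{z}.tar.gz
@end example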
@node Getting patched and development versions, , Getting and unpacking the sources, Obtaining R
@section Getting patched and development versions
A patched version of the current release, @samp{r-patched}, and the
current development version, @samp{r-devel}, are available as daily
tarballs and via access to the @R{} Subversion repository. (For the two
weeks prior to the release of a minor (3.x.0) version, @samp{r-patched}
tarballs may refer to beta/release candidates of the upcoming release,
the patched version of the current release being available via
Subversion.)
The tarballs are available from
@uref{https://stat.ethz.ch/R/daily}. Download
@file{R-patched.tar.gz} or @file{R-devel.tar.gz} (or the @file{.tar.bz2}
versions) and unpack as described in the previous section. They are
built in exactly the same way as distributions of @R{} releases.
@menu
* Using Subversion and rsync::
@end menu
@node Using Subversion and rsync, , Getting patched and development versions, Getting patched and development versions
@subsection Using Subversion and rsync
@cindex Subversion
Sources are also available via @uref{https://svn.R-project.org/R/}, the
R Subversion repository. If you have a Subversion client (see
@uref{https://subversion.apache.org/}), you can check out and update the
current @samp{r-devel} from
@uref{https://svn.r-project.org/@/R/@/trunk/} and the current
@samp{r-patched} from
@samp{https://svn.r-project.org/@/R/@/branches/@/R-@var{x}-@var{y}-branch/}
(where @var{x} and @var{y} are the major and minor number of the current
released version of R). E.g., use
@example
svn checkout https://svn.r-project.org/R/trunk/ @var{path}
@end example
@noindent
to check out @samp{r-devel} into directory @var{path} (which will be
created if necessary). The alpha, beta and RC versions of an upcoming
@var{x.y.0} release are available from
@samp{https://svn.r-project.org/R/branches/R-@var{x}-@var{y}-branch/} in
the four-week period prior to the release.
Note that @samp{https:} is required@footnote{for some Subversion clients
@samp{http:} may appear to work, but requires continual redirection.},
and that the SSL certificate for the Subversion server of the @R{}
project should be recognized as from a trusted source.
Note that retrieving the sources by e.g.@: @command{wget -r} or
@command{svn export} from that URL will not work (and will give an error
early in the @command{make} process): the Subversion information is
needed to build @R{}.
The Subversion repository does not contain the current sources for the
recommended packages, which can be obtained by @command{rsync} or
downloaded from @acronym{CRAN}. To use @code{rsync} to install the
appropriate sources for the recommended packages, run
@code{./tools/rsync-recommended} from the top-level directory of the
@R{} sources.
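For example, after a checkout into directory @var{path} as above, one
might run
@example
cd @var{path}
./tools/rsync-recommended
@end example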
If downloading manually from @acronym{CRAN}, do ensure that you have the
correct versions of the recommended packages: if the number in the file
@file{VERSION} is @samp{@var{x}.@var{y}.@var{z}} you need to download
the contents of @samp{https://CRAN.R-project.org/src/contrib/@var{dir}},
where @var{dir} is @samp{@var{x}.@var{y}.@var{z}/Recommended} for
r-devel or @file{@var{x}.@var{y}-patched/Recommended} for r-patched,
respectively, to directory @file{src/library/Recommended} in the sources
you have unpacked. After downloading manually you need to execute
@command{tools/link-recommended} from the top level of the sources to
make the requisite links in @file{src/library/Recommended}. A suitable
incantation from the top level of the @R{} sources using @command{wget}
might be (for the correct value of @file{@var{dir}})
@example
wget -r -l1 --no-parent -A\*.gz -nd -P src/library/Recommended \
https://CRAN.R-project.org/src/contrib/@var{dir}
./tools/link-recommended
@end example
@node Installing R under Unix-alikes, Installing R under Windows, Obtaining R, Top
@chapter Installing R under Unix-alikes
@cindex Installing under Unix-alikes
@R{} will configure and build under most common Unix and Unix-alike
platforms including @samp{@var{cpu}-*-linux-gnu} for the
@cputype{alpha}, @cputype{arm64}, @cputype{hppa}, @cputype{ix86},
@cputype{m68k}, @cputype{mips}, @cputype{mipsel}, @cputype{ppc64},
@cputype{s390}, @cputype{sparc64}, and @cputype{x86_64} @acronym{CPU}s,
@c (see e.g.@: @uref{https://buildd.debian.org/build.php?&pkg=r-base}),
@c Actually, see https://packages.debian.org/unstable/math/r-base-core as
@c the build daemon is not used for all platforms; note also that Debian
@c has x86_64 <=> amd64, ix86 <=> i386.
@samp{x86_64-@/apple-@/darwin}, @samp{i386-@/sun-@/solaris} and
@samp{sparc-@/sun-@/solaris} as well as
perhaps (it is tested less frequently on these platforms)
@samp{i386-@/*-@/freebsd}, @samp{x86_64-@/*-@/freebsd},
@samp{i386-@/*-@/netbsd}, @samp{x86_64-@/*-@/openbsd} and
@samp{powerpc-@/ibm-@/aix6*}.
@cindex Linux
@cindex macOS
In addition, binary distributions are available for some common Linux
distributions and for macOS (formerly OS X and Mac OS). See the
@acronym{FAQ} for current details. These are installed in
platform-specific ways, so for the rest of this chapter we consider only
building from the sources.
Cross-building is not possible: installing @R{} builds a minimal version
of @R{} and then runs many @R{} scripts to complete the build.
@menu
* Simple compilation::
* Help options::
* Making the manuals::
* Installation::
* Uninstallation::
* Sub-architectures::
* Other Options::
* Testing a Unix-alike Installation::
@end menu
@node Simple compilation, Help options, Installing R under Unix-alikes, Installing R under Unix-alikes
@section Simple compilation
First review the essential and useful tools and libraries in
@ref{Essential and useful other programs under a Unix-alike}, and install
those you
@enindex TMPDIR
want or need. Ensure that the environment variable @env{TMPDIR}
is either unset (and @file{/tmp} exists and can be written in and
scripts can be executed from) or points to the absolute path to a valid
temporary directory (one from which execution of scripts is allowed)
which does not contain spaces.@footnote{Most aspects will work with
paths containing spaces, but external software used by @R{} may not.}
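For example, in a Bourne-style shell one might point it at an existing
scratch directory (the path shown is purely illustrative) by
@example
TMPDIR=/var/tmp/R-build
export TMPDIR
@end example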
@findex R_HOME
Choose a directory to install the @R{} tree (@R{} is not just a binary, but
has additional data sets, help files, font metrics etc). Let us call
this place @var{R_HOME}. Untar the source code. This should create
directories @file{src}, @file{doc}, and several more under a top-level
directory: change to that top-level directory. (At this point North
American readers should consult @ref{Setting paper size}.) Issue the
following commands:
@findex configure
@example
./configure
make
@end example
@noindent
(See @ref{Using make} if your make is not called @samp{make}.) Users of
Debian-based 64-bit systems@footnote{which use @file{lib} rather than
@file{lib64} for their primary 64-bit library directories.} may need
@example
./configure LIBnn=lib
make
@end example
Then check the built system works correctly by
@example
make check
@end example
@noindent
Failures are not necessarily problems as they might be caused by missing
functionality, but you should look carefully at any reported
discrepancies. (Some non-fatal errors are expected in locales that do
not support Latin-1, in particular in true @code{C} locales and
non-UTF-8 non-Western-European locales.) A failure in
@file{tests/ok-errors.R} may indicate inadequate resource limits
(@pxref{Running R}).
More comprehensive testing can be done by
@example
make check-devel
@end example
@noindent
or
@example
make check-all
@end example
@noindent
see file @file{tests/README} and @ref{Testing a Unix-alike Installation}
for the possibilities of doing this in parallel. Note that these checks
are only run completely if the recommended packages are installed.
If the @command{configure} and @command{make} commands execute
successfully, a shell-script front-end called @file{R} will be created
and copied to @file{@var{R_HOME}/bin}. You can link or copy this script
to a place where users can invoke it, for example to
@file{/usr/local/bin/R}. You could also copy the man page @file{R.1} to
a place where your @command{man} reader finds it, such as
@file{/usr/local/man/man1}. If you want to install the complete @R{}
tree to, e.g., @file{/usr/local/lib/R}, see @ref{Installation}. Note:
you do not @emph{need} to install @R{}: you can run it from where it was
built.
You do not necessarily have to build @R{} in the top-level source
directory (say, @file{@var{TOP_SRCDIR}}). To build in
@file{@var{BUILDDIR}}, run
@findex configure
@example
cd @var{BUILDDIR}
@var{TOP_SRCDIR}/configure
make
@end example
@noindent
and so on, as described further below. This has the advantage of always
keeping your source tree clean and is particularly recommended when you
work with a version of @R{} from Subversion. (You may need
@acronym{GNU} @command{make} to allow this, and you will need no spaces
in the path to the build directory. It is unlikely to work if the
source directory has previously been used for a build.)
@c For those obtaining @R{} @emph{via} Subversion, one additional step is
@c necessary:
@c @cindex Vignettes
@c @cindex Subversion
@c @example
@c make vignettes
@c @end example
@c @noindent
@c which makes the @pkg{grid} vignettes (which are contained in the
@c tarballs): it make take several minutes.
Now @code{rehash} if necessary, type @kbd{R}, and read the @R{} manuals
and the @R{} @acronym{FAQ} (files @file{FAQ} or
@file{doc/manual/R-FAQ.html}, or
@uref{https://CRAN.R-project.org/@/doc/@/FAQ/@/R-FAQ.html} which always
has the version for the latest release of @R{}).
Note: if you already have @R{} installed, check that where you installed
@R{} replaces or comes earlier in your path than the previous
installation. Some systems are set up to have @file{/usr/bin} (the
standard place for a system installation) ahead of @file{/usr/local/bin}
(the default place for installation of @R{}) in their default path, and
some do not have @file{/usr/local/bin} on the default path.
@node Help options, Making the manuals, Simple compilation, Installing R under Unix-alikes
@section Help options
@R{} by default provides help pages as plain text displayed in a pager,
with the options (see the help for @code{help}) of displaying help as
HTML or PDF.
By default @HTML{} help pages are created when needed rather than being
built at install time.
If you need to disable the server and want @HTML{} help, there is the
option to build @HTML{} pages when packages are installed
(including those installed with @R{}). This is enabled by the
@command{configure} option @option{--enable-prebuilt-html}. Whether
@command{R CMD INSTALL} (and hence @code{install.packages}) pre-builds
@HTML{} pages is determined by looking at the @R{} installation and is
reported by @command{R CMD INSTALL --help}: it can be overridden by
specifying one of the @command{INSTALL} options @option{--html} or
@option{--no-html}.
The server is disabled by setting the environment variable
@enindex R_DISABLE_HTTPD
@env{R_DISABLE_HTTPD} to a non-empty value, either before @R{} is
started or within the @R{} session before @HTML{} help (including
@code{help.start}) is used. It is also possible that system security
measures will prevent the server from being started, for example if the
loopback interface has been disabled. See
@code{?tools::startDynamicHelp} for more details.
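For example, from a Bourne-style shell the server could be disabled for
a single session by something like
@example
R_DISABLE_HTTPD=yes R
@end example
@noindent
(@samp{yes} is just an arbitrary non-empty value).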
@node Making the manuals, Installation, Help options, Installing R under Unix-alikes
@section Making the manuals
@cindex Manuals
There is a set of manuals that can be built from the sources,
@table @samp
@item fullrefman
Printed versions of all the help pages for base and recommended packages
(around 3600 pages).
@item refman
Printed versions of the help pages for selected base packages (around
2200 pages).
@item R-FAQ
R @acronym{FAQ}
@item R-intro
``An Introduction to R''.
@item R-data
``R Data Import/Export''.
@item R-admin
``R Installation and Administration'', this manual.
@item R-exts
``Writing R Extensions''.
@item R-lang
``The R Language Definition''.
@end table
@noindent
To make these (with @samp{fullrefman} rather than @samp{refman}), use
@example
make pdf @r{to create PDF versions}
make info @r{to create info files (not @samp{refman} nor @samp{fullrefman}).}
@end example
@c texi2any from Mar 2013.
You will not be able to build any of these unless you have
@command{texi2any} version 5.1 or later installed, and for PDF you must
have @command{texi2dvi} and @file{texinfo.tex} installed (which are part
of the @acronym{GNU} @pkg{texinfo} distribution but are, especially
@file{texinfo.tex}, often made part of the @TeX{} package in
re-distributions). The path to @command{texi2any} can be set by macro
@samp{TEXI2ANY} in @file{config.site}. NB: @command{texi2any} requires
@command{perl}.
The PDF versions can be viewed using any recent PDF viewer: they have
hyperlinks that can be followed. The info files are suitable for
reading online with Emacs or the standalone @acronym{GNU} @command{info}
program. The PDF versions will be created using the paper size selected
at configuration (default ISO a4): this can be overridden by setting
@env{R_PAPERSIZE}
@enindex R_PAPERSIZE
on the @command{make} command line, or setting @env{R_PAPERSIZE} in the
environment and using @command{make -e}. (If re-making the manuals for
a different paper size, you should first delete the file
@file{doc/manual/version.texi}. The usual value for North America would
be @samp{letter}.)
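For example, either of the illustrative invocations
@example
make pdf R_PAPERSIZE=letter
R_PAPERSIZE=letter make -e pdf
@end example
@noindent
should select US letter paper.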
There are some issues with making the PDF reference manual,
@file{fullrefman.pdf} or @file{refman.pdf}. The help files contain both
ISO Latin1 characters (e.g.@: in @file{text.Rd}) and upright quotes,
neither of which are contained in the standard @LaTeX{} Computer Modern
fonts. We have provided four alternatives:
@table @code
@item times
(The default.) Using standard PostScript fonts, Times Roman, Helvetica
and Courier. This works well both for on-screen viewing and for
printing. One disadvantage is that the Usage and Examples sections may
come out rather wide: this can be overcome by using @emph{in addition}
either of the options @code{inconsolata} (on a Unix-alike only if found
by @command{configure}) or @code{beramono}, which replace the Courier
monospaced font by Inconsolata or Bera Sans mono respectively. (You
will need a recent version of the appropriate @LaTeX{} package
@pkg{inconsolata}@footnote{Instructions on how to install the latest
version are at
@uref{https://www.ctan.org/@/tex-archive/@/fonts/@/inconsolata/}.} or
@pkg{bera} installed.)
Note that in most @LaTeX{} installations this will not actually use the
standard fonts for PDF, but rather embed the URW clones NimbusRom,
NimbusSans and (for Courier, if used) NimbusMon.
This needs @LaTeX{} packages @pkg{times}, @pkg{helvetic} and (if used)
@pkg{courier} installed.
@item lm
Using the @emph{Latin Modern} fonts. These are not often installed as
part of a @TeX{} distribution, but can be obtained from
@uref{https://www.ctan.org/@/tex-archive/@/fonts/@/ps-type1/@/lm/} and
mirrors. This uses fonts rather similar to Computer Modern, but is not
so good on-screen as @code{times}.
@item cm-super
Using type-1 versions of the Computer Modern fonts by Vladimir Volovich.
This is a large installation, obtainable from
@uref{https://www.ctan.org/@/tex-archive/@/fonts/@/ps-type1/@/cm-super/}
and its mirrors. These type-1 fonts have poor hinting and so are
nowhere near as readable on-screen as the other three options.
@item ae
A package to use composites of Computer Modern fonts. This works well
most of the time, and its PDF is more readable on-screen than the
previous two options. There are three fonts for which it will need to
use bitmapped fonts, @file{tctt0900.600pk}, @file{tctt1000.600pk} and
@file{tcrm1000.600pk}. Unfortunately, if those files are not available,
Acrobat Reader will substitute completely incorrect glyphs so you need
to examine the logs carefully.
@end table
The default can be overridden by setting the environment variable
@enindex R_RD4PDF
@env{R_RD4PDF}. (On Unix-alikes, this will be picked up at install time
and stored in @file{etc/Renviron}, but can still be overridden when the
manuals are built, using @command{make -e}.) The usual@footnote{on a
Unix-alike, @samp{inconsolata} is omitted if not found by
@command{configure}.} default value for @env{R_RD4PDF} is
@samp{times,inconsolata,hyper}: omit @samp{hyper} if you do not want
hyperlinks (e.g.@: for printing the manual) or do not have @LaTeX{}
package @pkg{hyperref}, and omit @samp{inconsolata} if you do not have
@LaTeX{} package @pkg{inconsolata} installed.
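For example, to re-make the manuals without hyperlinks one might use
something like
@example
R_RD4PDF="times,inconsolata" make -e pdf
@end example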
Further options, e.g.@: for @pkg{hyperref}, can be included in a file
@file{Rd.cfg} somewhere on your @LaTeX{} search path. For example, if
you prefer to hyperlink the text and not the page number in the table of
contents use
@example
\ifthenelse@{\boolean@{Rd@@use@@hyper@}@}@{\hypersetup@{linktoc=section@}@}@{@}
@end example
@noindent
or
@example
\ifthenelse@{\boolean@{Rd@@use@@hyper@}@}@{\hypersetup@{linktoc=all@}@}@{@}
@end example
@noindent
to hyperlink both text and page number.
Ebook versions of most of the manuals in one or both of @file{.epub} and
@file{.mobi} formats can be made by running in @file{doc/manual} one of
@example
make ebooks
make epub
make mobi
@end example
@noindent
This requires @command{ebook-convert} from @command{Calibre}
(@uref{http://calibre-ebook.com/download}), or from most Linux
distributions. If necessary the path to @command{ebook-convert} can be
set as make macro @env{EBOOK} by editing @file{doc/manual/Makefile}
(which contains a commented value suitable for macOS) or using
@command{make -e}.
@node Installation, Uninstallation, Making the manuals, Installing R under Unix-alikes
@section Installation
@cindex Installation
To ensure that the installed tree is usable by the right group of users,
set @code{umask} appropriately (perhaps to @samp{022}) before unpacking
the sources and throughout the build process.
After
@findex configure
@example
./configure
make
make check
@end example
@noindent
(or, when building outside the source,
@code{@var{TOP_SRCDIR}/configure}, etc) have been completed
successfully, you can install the complete @R{} tree to your system by
typing
@example
make install
@end example
@noindent
A parallel make can be used (but run @command{make} before @command{make
install}). Those using GNU @command{make} 4.0 or later may want to use
@command{make -j @var{n} -O} to avoid interleaving of output.
This will install to the following directories:
@table @asis
@item @file{@var{prefix}/bin} or @file{@var{bindir}}
the front-end shell script and other scripts and executables
@item @file{@var{prefix}/man/man1} or @file{@var{mandir}/man1}
the man page
@item @file{@var{prefix}/@var{LIBnn}/R} or @file{@var{libdir}/R}
all the rest (libraries, on-line help system, @dots{}). Here
@var{LIBnn} is usually @samp{lib}, but may be @samp{lib64} on some
64-bit Linux systems. This is known as the @R{} home directory.
@end table
@noindent
where @var{prefix} is determined during configuration (typically
@file{/usr/local}) and can be set by running @command{configure} with
the option @option{--prefix}, as in
@findex configure
@example
./configure --prefix=/where/you/want/R/to/go
@end example
@noindent
where the value should be an absolute path. This causes @command{make
install} to install the @R{} script to
@file{/where/you/want/R/to/go/bin}, and so on. The prefix of the
installation directories can be seen in the status message that is
displayed at the end of @command{configure}. The installation may need
to be done by the owner of @file{@var{prefix}}, often a root account.
There is the option of using @command{make install-strip} (@pxref{Debugging
Symbols}).
You can install into another directory tree by using
@example
make prefix=/path/to/here install
@end example
@noindent
at least with @acronym{GNU} @command{make} (but not some other Unix
makes).
More precise control is available at configure time via options: see
@command{configure --help} for details. (However, most of the `Fine
tuning of the installation directories' options are not used by @R{}.)
Configure options @option{--bindir} and @option{--mandir} are supported
and govern where a copy of the @command{R} script and the @command{man}
page are installed.
The configure option @option{--libdir} controls where the main @R{}
files are installed: the default is @samp{@var{eprefix}/@var{LIBnn}},
where @var{eprefix} is the prefix used for installing
architecture-dependent files, defaults to @var{prefix}, and can be set
via the configure option @option{--exec-prefix}.
Each of @code{bindir}, @code{mandir} and @code{libdir} can also be
specified on the @command{make install} command line (at least for
@acronym{GNU} @command{make}).
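For example (with purely illustrative paths),
@example
make install bindir=/opt/R/bin mandir=/opt/R/share/man
@end example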
The @command{configure} or @command{make} variables @code{rdocdir} and
@code{rsharedir} can be used to install the system-independent
@file{doc} and @file{share} directories to somewhere other than
@code{libdir}. The C header files can be installed to the value of
@code{rincludedir}: note that as the headers are not installed into a
subdirectory you probably want something like
@code{rincludedir=/usr/local/include/R-@value{VERSIONno}}.
If you want the @R{} home to be something other than
@file{@var{libdir}/R}, use @option{rhome}: for example
@example
make install rhome=/usr/local/lib64/R-@value{VERSIONno}
@end example
@noindent
will use a version-specific @R{} home on a non-Debian Linux 64-bit
system.
If you have made @R{} as a shared/static library you can install it in
your system's library directory by
@example
make prefix=/path/to/here install-libR
@end example
@noindent
where @code{prefix} is optional, and @code{libdir} will give more
precise control.@footnote{This will be needed if more than one
sub-architecture is to be installed.} However, you should not install
to a directory mentioned in @env{LDPATHS} (e.g.@:
@file{/usr/local/lib64}) if you intend to work with multiple versions of
@R{}, since that directory may be given precedence over the @file{lib}
directory of other @R{} installations.
@example
make install-strip
@end example
@noindent
will install stripped executables, and on platforms where this is
supported, stripped libraries in directories @file{lib} and
@file{modules} and in the standard packages.
Note that installing @R{} into a directory whose path contains spaces is
not supported, and some aspects (such as installing source packages)
will not work.
@c The main problem is the Makefile include in etc/Makeconf
@cindex Manuals, installing
To install info and PDF versions of the manuals, use one or both of
@example
make install-info
make install-pdf
@end example
@noindent
Once again, it is optional to specify @code{prefix}, @code{libdir} or
@code{rhome} (the PDF manuals are installed under the @R{} home
directory).
More precise control is possible. For info, the setting used is that of
@code{infodir} (default @file{@var{prefix}/info}, set by configure
option @option{--infodir}). The PDF files are installed into the @R{}
@file{doc} tree, set by the @command{make} variable @code{rdocdir}.
A staged installation is possible, that is, installing @R{} into a
temporary directory in order to move the installed tree to its final
destination. In this case @code{prefix} (and so on) should reflect the
@enindex DESTDIR
final destination, and @env{DESTDIR} should be used: see
@uref{https://www.gnu.org/@/prep/@/standards/@/html_node/@/DESTDIR.html}.
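For example, a staged install into an (illustrative) scratch tree might
be done by
@example
make install DESTDIR=/tmp/R-stage
@end example
@noindent
after which the contents of @file{/tmp/R-stage} can be moved or packaged
for the final destination.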
You can optionally install the run-time tests that are part of
@command{make check-all} by
@example
make install-tests
@end example
@noindent
which populates a @file{tests} directory in the installation.
@node Uninstallation, Sub-architectures, Installation, Installing R under Unix-alikes
@section Uninstallation
You can uninstall @R{} by
@example
make uninstall
@end example
@noindent
optionally specifying @code{prefix} etc in the same way as specified for
installation.
This will also uninstall any installed manuals. There are specific
targets to uninstall info and PDF manuals in file
@file{doc/manual/Makefile}.
Target @code{uninstall-tests} will uninstall any installed tests, as
well as removing the directory @file{tests} containing the test results.
An installed shared/static @code{libR} can be uninstalled by
@example
make prefix=/path/to/here uninstall-libR
@end example
@node Sub-architectures, Other Options, Uninstallation, Installing R under Unix-alikes
@section Sub-architectures
Some platforms can support closely related builds of @R{} which can
share all but the executables and dynamic objects. Examples include
builds under Linux and Solaris for different @acronym{CPU}s or 32- and
64-bit builds.
@R{} supports the idea of architecture-specific builds, specified by
adding @samp{r_arch=@var{name}} to the @command{configure} line. Here
@var{name} can be anything non-empty, and is used to name subdirectories
of @file{lib}, @file{etc}, @file{include} and the package @file{libs}
subdirectories. Example names from other software are the use of
@file{sparcv9} on Sparc Solaris and @file{32} by @command{gcc} on
@cputype{x86_64} Linux.
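For example, a build named (arbitrarily) @samp{64} could be set up by
@example
./configure r_arch=64
make && make install
@end example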
If you have two or more such builds you can install them over each other
(and for 32/64-bit builds on one architecture, one build can be done
without @samp{r_arch}). The space savings can be considerable: on
@cputype{x86_64} Linux a basic install (without debugging symbols) took
74Mb, and adding a 32-bit build added 6Mb. If you have installed
multiple builds you can select which build to run by
@example
R --arch=@var{name}
@end example
@noindent
and just running @samp{R} will run the last build that was installed.
@code{R CMD INSTALL} will detect if more than one build is installed and
try to install packages with the appropriate library objects for each.
This will not be done if the package has an executable @code{configure}
script or a @file{src/Makefile} file. In such cases you can install for
extra builds by
@example
R --arch=@var{name} CMD INSTALL --libs-only @var{pkg1} @var{pkg2} @dots{}
@end example
If you want to mix sub-architectures compiled on different platforms
(for example @cputype{x86_64} Linux and @cputype{i686} Linux), it is
wise to use explicit names for each, and you may also need to set
@option{libdir} to ensure that they install into the same place.
When sub-architectures are used the version of @command{Rscript} in
e.g.@: @file{/usr/bin} will be the last installed, but
architecture-specific versions will be available in e.g.@:
@file{/usr/lib64/R/bin/exec$@{@var{R_ARCH}@}}. Normally all installed
architectures will run on the platform so the architecture of
@command{Rscript} itself does not matter. The executable
@command{Rscript} will run the @command{R} script, and at that time the
@enindex R_ARCH
setting of the @env{R_ARCH} environment variable determines the
architecture which is run.
When running post-install tests with sub-architectures, use
@example
R --arch=@var{name} CMD make check[-devel|all]
@end example
@noindent
to select a sub-architecture to check.
Sub-architectures are also used on Windows, but by selecting executables
within the appropriate @file{bin} directory,
@file{@var{R_HOME}/bin/i386} or @file{@var{R_HOME}/bin/x64}. For
backwards compatibility there are executables
@file{@var{R_HOME}/bin/R.exe} and @file{@var{R_HOME}/bin/Rscript.exe}:
these will run an executable from one of the subdirectories, which one
being taken first from the
@enindex R_ARCH
@env{R_ARCH} environment variable, then from the
@option{--arch} command-line option@footnote{with possible values
@samp{i386}, @samp{x64}, @samp{32} and @samp{64}.} and finally from the
installation default (which is 32-bit for a combined 32/64 bit @R{}
installation).
@menu
* Multilib::
@end menu
@node Multilib, , Sub-architectures, Sub-architectures
@subsection Multilib
For some Linux distributions@footnote{mainly on RedHat and Fedora, whose
layout is described here.}, there is an alternative mechanism for mixing
32-bit and 64-bit libraries known as @emph{multilib}. If the Linux
distribution supports multilib, then parallel builds of @R{} may be
installed in the sub-directories @file{lib} (32-bit) and @file{lib64}
(64-bit). The build to be run may then be selected using the
@command{setarch} command. For example, a 32-bit build may be run by
@example
setarch i686 R
@end example
The @command{setarch} command is only operational if both 32-bit and
64-bit builds are installed. If there is only one installation of @R{},
then this will always be run regardless of the architecture specified
by the @command{setarch} command.
There can be problems with installing packages on the non-native
architecture. It is a good idea to run e.g.@: @code{setarch i686 R} for
sessions in which packages are to be installed, even if that is the only
version of @R{} installed (since this tells the package installation
code the architecture needed).
There is a potential problem with packages using Java, as the
post-install for a @cputype{i686} RPM on @cputype{x86_64} Linux
reconfigures Java and will find the @cputype{x86_64} Java. If you know
where a 32-bit Java is installed you may be able to run (as root)
@example
export JAVA_HOME=<path to jre directory of 32-bit Java>
setarch i686 R CMD javareconf
@end example
@noindent
to get a suitable setting.
When this mechanism is used, the version of @command{Rscript} in
e.g.@: @file{/usr/bin} will be the last installed, but an
architecture-specific version will be available in
e.g.@: @file{/usr/lib64/R/bin}. Normally all installed architectures
will run on the platform so the architecture of @command{Rscript} does
not matter.
@node Other Options, Testing a Unix-alike Installation, Sub-architectures, Installing R under Unix-alikes
@section Other Options
There are many other installation options, most of which are listed by
@command{configure --help}. Almost all of those not listed elsewhere in
this manual are either standard @command{autoconf} options not relevant
to @R{} or intended for specialist uses by the @R{} developers.
One that may be useful when working on @R{} itself is the option
@option{--disable-byte-compiled-packages}, which ensures that the base
and recommended packages are not byte-compiled. (Alternatively the
(make or environment) variable @env{R_NO_BASE_COMPILE} can be set to a
non-empty value for the duration of the build.)
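For example, either of
@example
./configure --disable-byte-compiled-packages
make R_NO_BASE_COMPILE=yes
@end example
@noindent
(the value @samp{yes} is arbitrary: any non-empty value will do).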
Option @option{--with-internal-tzcode} makes use of @R{}'s own code and
copy of the Olson database for managing timezones. This will be
preferred where there are issues with the system implementation, usually
involving times after 2037 or before 1916. An alternative time-zone
directory@footnote{How to prepare such a directory is described in file
@file{src/extra/tzone/Notes} in the @R{} sources.} can be used, pointed
to by environment variable @env{TZDIR}: this should contain files such
as @file{Europe/London}. On all tested OSes the system timezone was
deduced correctly, but if necessary it can be set as the value of
environment variable @env{TZ}.
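For example, an (illustrative) invocation using a privately prepared
time-zone directory might be
@example
TZDIR=/path/to/zoneinfo TZ=Europe/London R
@end example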
@menu
* Debugging Symbols::
* OpenMP Support::
* C++ Support::
* Link-Time Optimization::
@end menu
@node Debugging Symbols, OpenMP Support, Other Options, Other Options
@subsection Debugging Symbols
By default, @command{configure} adds a flag (usually @option{-g}) to the
compilation flags for C, Fortran and CXX sources. This will slow down
compilation and increase object sizes of both @R{} and packages, so it
may be a good idea to change those flags (set @samp{CFLAGS} etc in
@file{config.site} before configuring, or edit files @file{Makeconf}
and @file{etc/Makeconf} between running @command{configure} and
@command{make}).
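For example, a @file{config.site} sketch omitting @option{-g} might
contain
@example
CFLAGS=-O2
FFLAGS=-O2
CXXFLAGS=-O2
@end example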
Having debugging symbols available is useful both when running @R{} under a
debugger (e.g., @command{R -d gdb}) and when using sanitizers and
@command{valgrind}, all things intended for experts.
Debugging symbols (and some others) can be `stripped' on installation by
using
@example
make install-strip
@end example
@noindent
How well this is supported depends on the platform: it works best on
those using GNU @code{binutils}. On @cputype{x86_64} Linux a typical
reduction in overall size was from 92MB to 66MB. On macOS debugging
symbols are not by default included in @file{.dylib} and @file{.so}
files, so there is negligible difference.
@node OpenMP Support, C++ Support, Debugging Symbols, Other Options
@subsection OpenMP Support
By default @command{configure} searches for suitable flags@footnote{for
example, @option{-fopenmp}, @option{-xopenmp} or @option{-qopenmp}.
This includes for @command{clang} and the Intel and Oracle compilers.}
for OpenMP support for the C, C++ (default standard) and Fortran
compilers.
Only the C result is currently used for @R{} itself, and only if
@code{MAIN_LD}/@code{DYLIB_LD} were not specified. This can be
overridden by specifying
@example
R_OPENMP_CFLAGS
@end example
Use for packages has similar restrictions (involving @code{SHLIB_LD} and
similar: note that as Fortran code is by default linked by the C (or
C++) compiler, both need to support OpenMP) and can be overridden by
specifying some of
@example
SHLIB_OPENMP_CFLAGS
SHLIB_OPENMP_CXXFLAGS
SHLIB_OPENMP_FFLAGS
@end example
@noindent
Setting these to an empty value will disable OpenMP for that compiler
(and configuring with @option{--disable-openmp} will disable all
detection@footnote{This does not necessarily disable @emph{use} of
OpenMP -- the @command{configure} code allows for platforms where OpenMP
is used without a flag. For the @command{flang} compiler in late 2017,
the Fortran runtime always used OpenMP.} of OpenMP). The
@command{configure} detection test is to compile and link a standalone
OpenMP program, which is not the same as compiling a shared object and
loading it into the C program of @R{}'s executable. Note that
overridden values are not tested.
@node C++ Support, Link-Time Optimization, OpenMP Support, Other Options
@subsection C++ Support
C++ is not used by @R{} itself, but support is provided for installing
packages with C++ code via @command{make} macros defined in file
@file{etc/Makeconf} (and with explanations in file @file{config.site}):
@example
CXX
CXXFLAGS
CXXPICFLAGS
CXXSTD
CXX11
CXX11STD
CXX11FLAGS
CXX11PICFLAGS
CXX14
CXX14STD
CXX14FLAGS
CXX14PICFLAGS
CXX17
CXX17STD
CXX17FLAGS
CXX17PICFLAGS
@end example
@noindent
The macros @code{CXX} etc are those used by default for C++ code.
@command{configure} will attempt to set the rest suitably, choosing for
@code{CXX11STD} a suitable flag such as @option{-std=c++11} for C++11
support. Similarly, configure will if possible choose for
@code{CXX14STD} a flag@footnote{Currently this is a valid option for
@command{g++} 5 and later and 2016 versions of the Intel and Solaris
compilers. For earlier versions of @command{g++} one could try
@option{-std=c++1y}.} such as @option{-std=c++14} for C++14 support and
@option{-std=c++17} or @option{-std=c++1z} for support for the C++17
standard. The inferred values can be overridden in file
@file{config.site} or on the @command{configure} command line:
user-supplied values will be tested by compiling some C++11/14/17 code.
@R{} versions 3.1.0 to 3.3.3 used @code{CXX1X} rather than @code{CXX11}:
these forms were deprecated in 3.4.4 and removed in 3.6.0.
It may be@footnote{This is true for earlier versions of @command{g++}
such as 4.2.1, and also for earlier versions of the Solaris
compiler @code{CC}.} that there is no suitable flag for C++11 support,
in which case a different compiler could be selected for @code{CXX11}
and its corresponding flags. Likewise, a different compiler can be
specified for C++14 support with @code{CXX14} and for C++17 support with
@code{CXX17}.
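For example, entries in @file{config.site} such as
@example
CXX14=g++-9
CXX14STD=-std=c++14
@end example
@noindent
(the compiler name is purely illustrative) select a different compiler
for C++14 support.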
The @option{-std} flag is supported by the GCC, @command{clang++}, Intel
and Solaris compilers (the latter from version 12.4). Currently
accepted values are (plus some synonyms)
@example
g++: c++98 gnu++98 c++11 gnu++11 c++14 gnu++14 c++17 gnu++17
Intel: gnu++98 c++11 c++14 (from 16.0) c++17 (from 17.0)
Solaris: c++03 c++11 c++14 (from 12.5)
@end example
@noindent
(Those for @command{clang++} are not documented, but seem to be based on
@code{g++}.) Versions 4.3.x to 4.8.x of @command{g++} accepted flag
@option{-std=c++0x} with partial support@footnote{For when features were
supported, see
@uref{https://gcc.gnu.org/@/projects/@/cxx-status.html#cxx11}.} for
C++11: this is currently still accepted as a deprecated synonym for
@option{-std=c++11}. (At least for versions 4.8.x it has sufficient
support to be picked by @command{configure}.) Option
@option{-std=c++14} was introduced in version 5.x.
@c c++1y does not pass the configure test in 4.9.3
@c , with @option{-std=c++1y} (introduced@footnote{See
@c @uref{https://gcc.gnu.org/@/projects/@/cxx-status.html#cxx14} for which
@c C++14 features it supported.} in version 4.9.x) remaining as a deprecated
@c synonym.
`Standards' for @command{g++} starting with @samp{gnu} enable `GNU
extensions': what those are is hard to track down.
For the use of C++11 and later in @R{} packages see the `Writing R
Extensions' manual. Prior to @R{} 3.6.0 the default C++ standard was
that of the compiler used: currently it is C++11 if supported by the
compiler: this can be overridden by setting @samp{CXXSTD} when @R{} is
configured, for example to @samp{-std=gnu++14}.
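For example,
@example
./configure CXXSTD="-std=gnu++14"
@end example
@noindent
selects C++14 with GNU extensions as the default C++ standard (assuming
the compiler accepts that flag).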
@node Link-Time Optimization, , C++ Support, Other Options
@subsection Link-Time Optimization
There is support for using link-time optimization (LTO) if the toolchain
supports it: configure with flag @option{--enable-lto}.
Whether toolchains support LTO is often unclear: all of the C compiler,
the Fortran compiler and linker have to support it, and support it by
the same mechanism (so mixing compiler families may not work and a
non-system linker such as @command{gold}@footnote{Nowadays part of GNU
@code{binutils} and used by LLVM to support LTO. It is sometimes
installed as @command{ld.gold}.} may be needed). It has been tested
recently on Linux with @command{gcc}/@command{gfortran} 8.x and 9.x:
that needed setting
@example
AR=gcc-ar
RANLIB=gcc-ranlib
@end example
@noindent
(e.g.@: in @file{config.site}). For non-system compilers or if those
wrappers have not been installed one may need something like
@example
AR="ar --plugin=/path/to/liblto_plugin.so"
RANLIB="ranlib --plugin=/path/to/liblto_plugin.so"
@end example
@noindent
and it may also be necessary to set @code{NM} similarly.
Unfortunately @option{--enable-lto} may be accepted but silently do
nothing if some of the toolchain does not support LTO: that has happened on
macOS.
When LTO is enabled it is used for compiled code in packages (including
the recommended packages) unless the flag @option{--enable-lto=R} is
used. With sufficient diagnostic flags (e.g.@: @option{-Wall} in GCC)
this can flag inconsistencies between source files in a package.
Under some circumstances and for a few packages, the PIC flags have
needed overriding on Linux with GCC 9: e.g.@: in @file{config.site}:
@example
CPICFLAGS=-fPIC
CXXPICFLAGS=-fPIC
CXX11PICFLAGS=-fPIC
CXX14PICFLAGS=-fPIC
CXX17PICFLAGS=-fPIC
FPICFLAGS=-fPIC
@end example
LTO support was added in 2011 for @command{gcc} 4.5.x on Linux but was
little used before 2019.
With @command{gcc}/@command{gfortran} 9.x@footnote{probably also 8.4 and
later when released.} this will flag inconsistencies in calls to Fortran
subroutines/functions, both between Fortran source files and between
Fortran and C/C++. @command{gfortran} 9.2@footnote{or 9.1-patched since
2019-05-09.} can help understanding these by extracting C prototypes
from Fortran source files with its @option{-fc-prototypes-external}
option, e.g.@: that (at the time of writing) Fortran @code{LOGICAL}
corresponds to @code{int_least32_t *}.
@node Testing a Unix-alike Installation, , Other Options, Installing R under Unix-alikes
@section Testing an Installation
Full post-installation testing is possible only if the test files have
been installed with
@example
make install-tests
@end example
@noindent
which populates a @file{tests} directory in the installation.
If this has been done, two testing routes are available. The first is
to move to the home directory of the @R{} installation (as given by
@command{R RHOME} or from @R{} as @code{R.home()}) and run
@example
cd tests
## followed by one of
../bin/R CMD make check
../bin/R CMD make check-devel
../bin/R CMD make check-all
@end example
@noindent
and other useful targets are @code{test-BasePackages} and
@code{test-Recommended} to run tests of the standard and recommended
packages (if installed) respectively.
This re-runs all the tests relevant to the installed @R{} (including for
example the code in the package vignettes), but not for example the ones
checking the example code in the manuals nor making the standalone Rmath
library. This can occasionally be useful when the operating environment
has been changed, for example by OS updates or by substituting the
@acronym{BLAS} (@pxref{Shared BLAS}).
Parallel checking of packages may be possible: set the environment
variable @env{TEST_MC_CORES} to the maximum number of processes to be
run in parallel. This affects both checking the package examples (part
of @command{make check}) and package sources (part of @command{make
check-devel} and @command{make check-recommended}). It does require a
@command{make} command which supports the @command{make -j @var{n}}
option: most do but on Solaris you need to select GNU @code{make} or
@code{dmake}.
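For example, something like
@example
TEST_MC_CORES=8 make check-devel
@end example
@noindent
(with @samp{8} replaced by a suitable number of processes for the
machine).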
Alternatively, the installed @R{} can be run, preferably with
@option{--vanilla}. Then
@enindex LC_COLLATE
@example
Sys.setenv(LC_COLLATE = "C", LC_TIME = "C", LANGUAGE = "en")
tools::testInstalledBasic("both")
tools::testInstalledPackages(scope = "base")
tools::testInstalledPackages(scope = "recommended")
@end example
@noindent
runs the basic tests and then all the tests on the standard and
recommended packages. These tests can be run from anywhere: the basic
tests write their results in the @file{tests} folder of the @R{} home
directory and run fewer tests than the first approach: in particular
they do not test things which need Internet access---that can be tested
by
@example
tools::testInstalledBasic("internet")
@end example
These tests work best if @command{diff} (in @file{Rtools*.exe} for
Windows users) is in the path.
It is possible to test the installed packages (but not their
package-specific tests) by @code{testInstalledPackages} even if
@command{make install-tests} was not run.
Note that the results may depend on the language set for times and
messages: for maximal similarity to reference results you may want to
try setting (before starting the @R{} session)
@example
LANGUAGE=en
@end example
@noindent
and use a UTF-8 or Latin-1 locale.
@node Installing R under Windows, Installing R under macOS, Installing R under Unix-alikes, Top
@chapter Installing R under Windows
@cindex Installing under Windows
The @file{bin/windows} directory of a @acronym{CRAN} site contains
binaries for a base distribution and a large number of add-on packages
from @acronym{CRAN} to run on 32- or 64-bit Windows (Windows 7 and later
are tested; XP is known to fail some tests) on @cputype{ix86} and
@cputype{x86_64} @acronym{CPU}s.
Your file system must allow long file names (as is likely except
perhaps for some network-mounted systems). If it doesn't also support
conversion to short name equivalents (a.k.a. DOS 8.3 names), then @R{}
@emph{must} be installed in a path that does not contain spaces.
Installation is @emph{via} the installer
@file{@value{RWVERSION}-win.exe}. Just double-click on the icon and
follow the instructions. When installing on a 64-bit version of Windows
the options will include 32- or 64-bit versions of @R{} (and the default is
to install both). You can uninstall @R{} from the Control Panel.
Note that you will be asked to choose a language for installation, and
that choice applies to both installation and un-installation but not to
running @R{} itself.
See the @uref{https://CRAN.R-project.org/@/bin/@/windows/@/base/@/rw-FAQ.html, R
Windows @acronym{FAQ}} for more details on the binary installer.
@menu
* Building from source::
* Testing a Windows Installation::
@end menu
@node Building from source, Testing a Windows Installation, Installing R under Windows, Installing R under Windows
@section Building from source
@R{} can be built as either a 32-bit or 64-bit application on Windows:
to build the 64-bit application you need a 64-bit edition of Windows:
such an OS can also be used to build 32-bit @R{}.
The standard installer combines 32-bit and 64-bit builds into a single
executable which can then be installed into the same location and share
all the files except the @file{.exe} and @file{.dll} files and some
configuration files in the @file{etc} directory.
Building is only tested in an 8-bit locale: using a multi-byte locale (as
used for CJK languages) is unsupported and may not work (the scripts do
try to select a @samp{C} locale; Windows may not honour this).
@strong{NB:} The build process is currently being changed to require
external binary distributions of third-party software. Their location
is set using macro @code{EXT_LIBS} with default setting
@file{$(LOCAL_SOFT)}; the @code{LOCAL_SOFT} macro defaults to
@file{$(R_HOME)/extsoft}. This directory can be populated using
@command{make rsync-extsoft}. The location can be overridden by
setting @code{EXT_LIBS} to a different path in
@file{src/gnuwin32/MkRules.local}. A suitable collection of files can
also be obtained from
@uref{https://CRAN.R-project.org/@/bin/@/windows/@/extsoft} or
@uref{https://www.stats.ox.ac.uk/@/pub/@/Rtools/@/libs.html}.
@menu
* Getting the tools::
* Getting the source files::
* Building the core files::
* Building the cairo devices files::
* Using ICU for collation::
* Support for libcurl::
* Checking the build::
* Building the manuals::
* Building the Inno Setup installer::
* Building the MSI installer::
* 64-bit Windows builds::
@end menu
@node Getting the tools, Getting the source files, Building from source, Building from source
@subsection Getting the tools
If you want to build @R{} from the sources, you will first need to
collect, install and test an extensive set of tools. See @ref{The
Windows toolset} (and perhaps updates in
@uref{https://CRAN.R-project.org/bin/@/windows/@/Rtools/}) for details.
The @file{Rtools*.exe} executable installer described in @ref{The
Windows toolset} also includes some source files in addition to the @R{}
source as noted below. You should run it first, to obtain a working
@code{tar} and other necessities. Choose a ``Full installation'', and
install the extra files into your intended @R{} source directory, e.g.@:
@file{C:/R}. The directory name @emph{should not contain spaces}. We
will call this directory @file{@var{R_HOME}} below.
@node Getting the source files, Building the core files, Getting the tools, Building from source
@subsection Getting the source files
You need to collect the following sets of files:
@itemize
@item
Get the @R{} source code tarball @file{R-@value{VERSIONno}.tar.gz} from
@acronym{CRAN}. Open a command window (or another shell) at directory
@var{R_HOME}, and run
@example
tar -xf R-@value{VERSIONno}.tar.gz
@end example
@noindent
to create the source tree in @var{R_HOME}. @strong{Beware}: do use
@command{tar} to extract the sources rather than tools such as WinZip.
If you are using an account with administrative privileges you may get a
lot of messages which can be suppressed by
@example
tar --no-same-owner -xf R-@value{VERSIONno}.tar.gz
@end example
@noindent
@enindex TAR_OPTIONS
or perhaps better, set the environment variable @env{TAR_OPTIONS} to the
value @samp{--no-same-owner --no-same-permissions}.
It is also possible to obtain the source code using Subversion; see
@ref{Obtaining R} for details.
@item
If you are not using a tarball you need to obtain copies of the
recommended packages from @acronym{CRAN}. Put the @file{.tar.gz} files
in @file{@var{R_HOME}/src/library/Recommended} and run @code{make
link-recommended}. If you have an Internet connection, you can do this
automatically by running in @file{@var{R_HOME}/src/gnuwin32}
@example
make rsync-recommended
@end example
@item
The binary distributions of external software. Download
@example
https://www.stats.ox.ac.uk/pub/Rtools/goodies/multilib/local323.zip
@end example
@noindent
(or a more recent version if appropriate), create an empty directory,
say @file{c:/R/extsoft}, and unpack it in
that directory by e.g.@:
@example
unzip local323.zip -d c:/R/extsoft
@end example
@item
Make a local copy of the configuration rules by
@example
cd @var{R_HOME}/src/gnuwin32
cp MkRules.dist MkRules.local
@end example
@noindent
and edit @file{MkRules.local}, uncommenting @code{EXT_LIBS} and setting
it to the appropriate path (in our example @file{c:/R/extsoft}).
Look through the file @file{MkRules.local} and make any other changes
needed: in particular, this is where a 64-bit build is selected and the
locations are set of external software for ICU collation and the
cairo-based devices.
@end itemize
The following additional item is normally installed by
@file{Rtools*.exe}. If instead you choose to do a completely manual
build you will also need
@itemize
@item
The Tcl/Tk support files are contained in @file{Rtools*.exe}. Please
make sure you install the right version: there is a 32-bit version and a
64-bit version. They should be installed to @file{@var{R_HOME}},
creating directory @file{Tcl} there.
@end itemize
@node Building the core files, Building the cairo devices files, Getting the source files, Building from source
@subsection Building the core files
@enindex TMPDIR
Set the environment variable @env{TMPDIR} to the absolute path to a
writable directory, with a path specified with forward slashes and no
spaces. (The default is @file{/tmp}, which may not be useful on
Windows.)
You may need to compile under a case-honouring file system: we found
that a @command{samba}-mounted file system (which maps all file names to
lower case) did not work.
Open a command window at @file{@var{R_HOME}/src/gnuwin32}, then run
@example
make all recommended vignettes
@end example
@noindent
and sit back and wait while the basic compile takes place.
Notes:
@itemize
@item
We have had reports of earlier versions of anti-virus software locking
up the machine, but not for several years. However, aggressive
anti-virus checking such as the on-access scanning of Sophos can slow
the build down several-fold.
@item
You can run a parallel make by e.g.
@example
make -j4 all
make -j4 recommended
make vignettes
@end example
@noindent
but this is only likely to be worthwhile on a multi-core machine with
ample memory, and is not 100% reliable.
@item
It is possible (mainly for those working on @R{} itself) to set the
(make or environment) variable @env{R_NO_BASE_COMPILE} to a non-empty
value, which inhibits the byte-compilation of the base and recommended
packages.
@end itemize
@node Building the cairo devices files, Using ICU for collation, Building the core files, Building from source
@subsection Building the cairo devices
@cindex winCairo.dll
The devices based on cairographics (@code{svg}, @code{cairo_pdf},
@code{cairo_ps} and the @code{type = "cairo"} versions of @code{png},
@code{jpeg}, @code{tiff} and @code{bmp}) are implemented in a separate
DLL @file{winCairo.dll} which is loaded when one of these devices is
first used. It is not built by default, and needs to be built (after
@command{make all}) by @command{make cairodevices}.
To enable the building of these devices you need to install the static
cairographics libraries built by Simon Urbanek at
@uref{https://www.rforge.net/@/Cairo/@/files/@/cairo-current-win.tar.gz}. Set
the macro @samp{CAIRO_HOME} in @file{MkRules.local}. (Note that this
tarball unpacks with a top-level directory @file{src/}:
@samp{CAIRO_HOME} needs to include that directory in its path.)
@node Using ICU for collation, Support for libcurl, Building the cairo devices files, Building from source
@subsection Using ICU for collation
It is recommended to build @R{} to support ICU (International Components
for Unicode, @uref{http://site.icu-project.org/}) for collation, as is
commonly done on Unix-alikes.
Two settings are needed in @file{MkRules.local},
@example
# set to use ICU
# USE_ICU = YES
# path to parent of ICU headers
ICU_PATH = /path/to/ICU
@end example
@noindent
The first should be uncommented and the second set to the top-level
directory of a suitably packaged binary build of ICU, for example that
at @url{https://www.stats.ox.ac.uk/pub/Rtools/goodies/ICU_531.zip}.
Depending on the build, it may be necessary to edit the macro
@code{ICU_LIBS}.
Unlike on a Unix-alike, it is normally necessary to call
@code{icuSetCollate} to set a locale before ICU is actually used for
collation, or set the environment variable @env{R_ICU_LOCALE}.
@node Support for libcurl, Checking the build, Using ICU for collation, Building from source
@subsection Support for libcurl
@code{libcurl} version 7.28.0 or later is used to support
@code{curlGetHeaders} and the @code{"libcurl"} methods of
@code{download.file} and @code{url}.
A suitable distribution can be found @emph{via}
@uref{https://www.stats.ox.ac.uk/@/pub/@/Rtools/@/libs.html} and its unpacked
location should be specified in file @file{MkRules.local}.
For secure use of e.g.@: @samp{https://} URLs Windows users may need to
specify the path to up-to-date @emph{CA root certificates}: see
@code{?download.file}.
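One possibility (see @code{?download.file} for the details) is to point
the environment variable @env{CURL_CA_BUNDLE} at a suitable certificate
bundle; the path here is purely illustrative:
@example
Sys.setenv(CURL_CA_BUNDLE = "c:/path/to/ca-bundle.crt")
@end example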
@node Checking the build, Building the manuals, Support for libcurl, Building from source
@subsection Checking the build
You can test a build by running
@example
make check
@end example
@noindent
The recommended packages can be checked by
@example
make check-recommended
@end example
@noindent
Other levels of checking are
@example
make check-devel
@end example
@noindent
for a more thorough check of the @R{} functionality, and
@example
make check-all
@end example
@noindent
for both @code{check-devel} and @code{check-recommended}.
If a test fails, there will almost always be a @file{.Rout.fail} file in
the directory being checked (often @file{tests/Examples} or
@file{tests}): examine the file to help pinpoint the problem.
Parallel checking of package sources (part of @command{make check-devel}
and @command{make check-recommended}) is possible: set the environment
variable @env{TEST_MC_CORES} to the maximum number of processes to be
run in parallel.
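For example, in the command window used for the build one might run
(the number of processes is illustrative):
@example
set TEST_MC_CORES=8
make check-devel
@end example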
@node Building the manuals, Building the Inno Setup installer, Checking the build, Building from source
@subsection Building the manuals
The PDF manuals require @pkg{texinfo} 5.1 or later, and can be made by
@example
make manuals
@end example
@noindent
If you want to make the info versions (not including the Reference
Manual), use
@example
cd ../../doc/manual
make -f Makefile.win info
@end example
@noindent
(all assuming you have @command{pdftex}/@command{pdflatex} installed and
in your path).
See the @ref{Making the manuals} section in the Unix-alike section for setting
options such as the paper size and the fonts used.
By default it is assumed that @pkg{texinfo} is not installed, and the
manuals will not be built. The comments in file @file{MkRules.dist}
describe settings to build them. (Copy that file to
@file{MkRules.local} and edit it.) The @pkg{texinfo} 5.x package for
use on Windows is available at
@uref{https://www.stats.ox.ac.uk/pub/Rtools/}: you will also need to
install @command{Perl}@footnote{Suitable distributions include
Strawberry Perl, @uref{http://strawberryperl.com/} and ActivePerl,
@uref{https://www.activestate.com/activeperl}.}.
@node Building the Inno Setup installer, Building the MSI installer, Building the manuals, Building from source
@subsection Building the Inno Setup installer
You need to have the files for a complete @R{} build, including bitmap and
Tcl/Tk support and the manuals (which requires @pkg{texinfo} installed),
as well as the recommended packages and Inno Setup (@pxref{The Inno
Setup installer}).
Once everything is set up
@example
make distribution
make check-all
@end example
@noindent
will make all the pieces and the installer and put them in the
@file{gnuwin32/cran} subdirectory, then check the build. This works by
building all the parts in the sequence:
@example
rbuild @r{(the executables, the @acronym{FAQ} docs etc.)}
rpackages @r{(the base packages)}
htmldocs @r{(the HTML documentation)}
cairodevices @r{(the cairo-based graphics devices)}
recommended @r{(the recommended packages)}
vignettes @r{(the vignettes in base packages:}
@r{ only needed if building from an @command{svn} checkout)}
manuals @r{(the PDF manuals)}
rinstaller @r{(the install program)}
crandir @r{(the @acronym{CRAN} distribution directory, only for 64-bit builds)}
@end example
The parts can be made individually if a full build is not needed, but
earlier parts must be built before later ones. (The @file{Makefile}
doesn't enforce this dependency---some build targets force a lot of
computation even if all files are up to date.) The first four targets
are the default build if just @command{make} (or @command{make all}) is
run.
Parallel make is not supported and likely to fail.
If you want to customize the installation by adding extra packages,
replace @code{make rinstaller} by something like
@example
make rinstaller EXTRA_PKGS='pkg1 pkg2 pkg3'
@end example
An alternative way to customize the installer starting with a binary
distribution is to first make an installation of @R{} from the standard
installer, then add packages and make other customizations to that
installation. Then (after having customized file @file{MkRules},
possibly @emph{via} @file{MkRules.local}, and having made @R{} in the
source tree) in @file{src/gnuwin32/installer} run
@example
make myR IMAGEDIR=rootdir
@end example
@noindent
where @file{rootdir} is the path to the root of the customized
installation (in double quotes if it contains spaces or backslashes).
Both methods create an executable with a standard name such as
@file{@value{RWVERSION}-win.exe}, so please rename it to indicate that
it is customized. If you intend to @emph{distribute} a customized
installer please do check that license requirements are met -- note that
the installer will state that the contents are distributed under GPL
and this has a requirement for @emph{you} to supply the complete sources
(including the @R{} sources even if you started with a binary distribution
of R, and also the sources of any extra packages (including their
external software) which are included).
The defaults for the startup parameters may also be customized. For example
@example
make myR IMAGEDIR=rootdir MDISDI=1
@end example
@noindent
will create an installer that defaults to installing @R{} to run in SDI
mode. See @file{src/@/gnuwin32/installer/Makefile} for the names and
values that can be set.
The standard @acronym{CRAN} distribution of a 32/64-bit installer is
made by first building 32-bit @R{} (just
@example
make 32-bit
@end example
@noindent
is needed), and then (in a separate directory) building 64-bit @R{} with
the macro @code{HOME32} set in file @file{MkRules.local} to the
top-level directory of the 32-bit build. Then the @command{make
rinstaller} step copies the files that differ between architectures from
the 32-bit build as it builds the installer image.
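A minimal sketch of the relevant line in the 64-bit build's
@file{MkRules.local} (the path to the 32-bit build is illustrative):
@example
HOME32 = c:/R/build32
@end example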
@node Building the MSI installer, 64-bit Windows builds, Building the Inno Setup installer, Building from source
@subsection Building the MSI installer
It is also possible to build an installer for use with Microsoft
Installer. This is intended for use by sysadmins doing automated
installs, and is not recommended for casual use.
It makes use of the Windows Installer XML (WiX) toolkit @emph{version
3.5} (or perhaps later, untested) available from
@uref{http://wixtoolset.org/}. Once WiX is installed, set the path to
its home directory in @file{MkRules.local}.
You need to have the files for a complete @R{} build, including bitmap and
Tcl/Tk support and the manuals, as well as the recommended packages.
There is no option in the installer to customize startup options, so
edit @file{etc/Rconsole} and @file{etc/Rprofile.site} to set these as
required. Then
@example
cd installer
make msi
@end example
@noindent
which will result in a file with a name like
@file{@value{RWVERSION}-win32.msi}. This can be double-clicked to be
installed, but those who need it will know what to do with it (usually
by running @command{msiexec /i} with additional options). Properties
that users might want to set from the @command{msiexec} command line
include @samp{ALLUSERS}, @samp{INSTALLDIR} (something like
@file{c:\Program Files\R\@value{RWVERSION}}) and @samp{RMENU} (the path
to the @samp{R} folder on the start menu) and @samp{STARTDIR} (the
starting directory for @R{} shortcuts, defaulting to something like
@file{c:\Users\name\Documents\R}).
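For example, an unattended all-users install might be run as (a sketch
only: the property values are illustrative, and @command{msiexec}'s
@option{/qn} option suppresses the user interface):
@example
msiexec /i @value{RWVERSION}-win.msi /qn ALLUSERS=1 INSTALLDIR="c:\Program Files\R\@value{RWVERSION}"
@end example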
The MSI installer can be built both from a 32-bit build of @R{}
(@file{@value{RWVERSION}-win32.msi}) and from a 64-bit build of @R{}
(@file{@value{RWVERSION}-win64.msi}, optionally including 32-bit files
by setting the macro @code{HOME32}, when the name is
@file{@value{RWVERSION}-win.msi}). Unlike the main installer, a 64-bit
MSI installer can only be run on 64-bit Windows.
Thanks to David del Campo (Dept of Statistics, University of Oxford)
for suggesting WiX and building a prototype installer.
@node 64-bit Windows builds, , Building the MSI installer, Building from source
@subsection 64-bit Windows builds
To build a 64-bit version of @R{} you need a 64-bit toolchain: the only one
discussed here is based on the work of the MinGW-w64 project
(@uref{http://sourceforge.net/@/projects/@/mingw-w64/}), but commercial
compilers such as those from Intel and PGI could be used (and have been
by @R{} redistributors).
Support for MinGW-w64 was developed in the @R{} sources over the period
2008--10 and was first released as part of @R{} 2.11.0. The assistance
of Yu Gong at a crucial step in porting @R{} to MinGW-w64 is gratefully
acknowledged, as well as help from Kai Tietz, the lead developer of the
MinGW-w64 project.
Windows 64-bit is now completely integrated into the @R{} and package
build systems: a 64-bit build is selected in file @file{MkRules.local}.
@node Testing a Windows Installation, , Building from source, Installing R under Windows
@section Testing an Installation
The Windows installer contains a set of test files used when building
@R{}.
The @code{Rtools} are not needed to run these tests, but more
comprehensive analysis of errors will be given if @command{diff} is in
the path (and @code{errorsAreFatal = FALSE} is then not needed below).
Launch either @code{Rgui} or @code{Rterm}, preferably with
@option{--vanilla}. Then run
@example
Sys.setenv(LC_COLLATE = "C", LANGUAGE = "en")
library("tools")
testInstalledBasic("both")
testInstalledPackages(scope = "base", errorsAreFatal = FALSE)
testInstalledPackages(scope = "recommended", errorsAreFatal = FALSE)
@end example
@noindent
runs the basic tests and then all the tests on the standard and
recommended packages. These tests can be run from anywhere: they write
some of their results in the @file{tests} folder of the @R{} home
directory (as given by @code{R.home()}), and hence may need to be run
under the account used to install @R{}.
The results of @code{example(md5sums)} when testing @pkg{tools} will
differ from the reference output as some files are installed with
Windows' CRLF line endings.
@node Installing R under macOS, Running R, Installing R under Windows, Top
@chapter Installing R under macOS
@cindex macOS
@macro Rapp{}
@sc{R.app}
@end macro
(`macOS' was known as `OS X' from 2012--2016 and as `Mac OS X' before that.)
The front page of a @acronym{CRAN} site has a link `Download R for (Mac)
OS X'. Click on that, then download the file
@file{R-@value{VERSIONno}.pkg} and install it. This runs on macOS 10.11
and later (El Capitan, Sierra, High Sierra, Mojave, Catalina, @dots{}).
Installers for R-patched and R-devel are usually available from
@uref{https://mac.R-project.org}. (Some of these packages are
unsigned/not notarized: to install those Control/right/two-finger click,
select @samp{Open With} and @samp{Installer}.)
For some older versions of the OS you can in principle (it is little
tested) install @R{} from the sources (see @ref{macOS}).
If you use a binary installer package, it is important that your OS is
fully updated: look at `Updates' from the `App Store' to be sure. (If
using XQuartz, check that it is current.)
To install, just double-click on the icon of the file you downloaded.
At the `Installation Type' stage, note the option to `Customize'. This
currently shows four components: everyone will need the `R Framework'
component: the remaining components are optional. (The `Tcl/Tk'
component is needed to use package @pkg{tcltk}. The `Texinfo' component
is only needed by those installing source packages or @R{} from its
sources.)
This is an Apple Installer package. If you encounter any problem during
the installation, please check the Installer log by clicking on the
``Window'' menu and item ``Installer Log''. The full output (select
``Show All Log'') is useful for tracking down problems. Note that the
installer is clever enough to try to upgrade the last-installed version
of the application where you installed it (which may not be where you
want this time @dots{}).
Various parts of the build require XQuartz to be installed: see
@uref{https://xquartz.macosforge.org/}. These include the @pkg{tcltk}
package and the @code{X11} device: attempting to use these without
XQuartz will remind you. XQuartz is also needed for the
cairographics-based devices (which are not often used on macOS) such as
@code{png(type = "cairo")}.
If you update your macOS version, you should re-install @R{} (and
perhaps XQuartz): the installer may tailor the installation to the
current version of the OS.
For building @R{} from source, see @ref{macOS}.
@menu
* Running R under macOS::
* Uninstalling under macOS::
* Multiple versions::
@end menu
@node Running R under macOS, Uninstalling under macOS, Installing R under macOS, Installing R under macOS
@section Running R under macOS
There are two ways to run @R{} on macOS from a @acronym{CRAN} binary
distribution.
There is a GUI console normally installed with the @R{} icon in
@file{/Applications} which you can run by double-clicking (e.g.@: from
Launchpad or Finder). (If you cannot find it there it was possibly
installed elsewhere so try searching for it in Spotlight.) This is
usually referred to as @Rapp{} to distinguish it from command-line @R{}:
its user manual is currently part of the macOS FAQ at
@uref{https://cran.r-project.org/@/bin/@/macosx/@/RMacOSX-FAQ.html} and
can be viewed from @Rapp{}'s `Help' menu.
You can run command-line @R{} and @command{Rscript} from a
Terminal@footnote{The installer puts links to @command{R} and
@command{Rscript} in @file{/usr/local/bin}. If these are missing, you
can run directly the copies in
@file{/Library/Frameworks/R.framework/Resources/}.} so these can be
typed as commands like any other Unix-alike: see the next chapter of
this manual. There are some small differences which may surprise users
of @R{} on other platforms, notably the default location of the personal
library directory (under @file{~/Library/R},
e.g.@: @file{~/Library/R/3.6/library}), and that warnings, messages and
other output to @file{stderr} are highlighted in bold.
@c https://stat.ethz.ch/pipermail/r-sig-mac/2014-October/011131.html
It has been reported that running @Rapp{} may fail if no preferences are
stored, so if it fails when launched for the very first time, try it
again (the first attempt will store some preferences).
Users of @Rapp{} need to be aware of the `App Nap' feature
(@uref{https://developer.apple.com/@/library/@/mac/@/releasenotes/@/MacOSX/@/WhatsNewInOSX/@/Articles/MacOSX10_9.html})
which can cause @R{} tasks to appear to run very slowly when not
producing output in the console. Here are ways to avoid it:
@itemize
@item
Ensure that the console is completely visible (or at least the activity
indicator at the top right corner is visible).
@item
In a Terminal, run
@example
defaults write org.R-project.R NSAppSleepDisabled -bool YES
@end example
@noindent
(see @uref{https://developer.apple.com/@/library/@/mac/@/releasenotes/@/MacOSX/@/WhatsNewInOSX/@/Articles/MacOSX10_9.html}).
@end itemize
Using the @code{X11} device or the X11-based versions of @code{View()}
and @code{edit()} for data frames and matrices (the latter are the
default for command-line @R{} but not @Rapp{}) requires an X sub-system
to be installed: see @ref{macOS}. So do the @pkg{tcltk} package and
some third-party packages.
@node Uninstalling under macOS, Multiple versions, Running R under macOS, Installing R under macOS
@section Uninstalling under macOS
@R{} for macOS consists of two parts: the GUI (@Rapp{}) and the R
framework. The un-installation is as simple as removing those folders
(e.g.@: by dragging them onto the Trash). The typical installation will
install the GUI into the @file{/Applications/R.app} folder and the R
framework into the @file{/Library/Frameworks/R.framework} folder. The
links to @file{R} and @file{Rscript} in @file{/usr/local/bin} should
also be removed.
If you want to get rid of @R{} more completely using a Terminal, simply
run:
@example
sudo rm -Rf /Library/Frameworks/R.framework /Applications/R.app \
/usr/local/bin/R /usr/local/bin/Rscript
@end example
The installation consists of up to four Apple packages:@footnote{The
framework for @R{} 3.3.x was named
@code{org.r-project.R.mavericks.fw.pkg}: use @command{pkgutil --pkgs |
grep org.r-project} to check for earlier versions of @R{}.}
@code{org.r-project.R.el-capitan.fw.pkg},
@code{org.r-project.R.el-capitan.GUI.pkg},
@code{org.r-project.x86_64.tcltk.x11} and
@code{org.r-project.x86_64.texinfo}. You can use @code{pkgutil --forget}
if you want the Apple Installer to forget about the package without
deleting its files (useful for the @R{} framework when installing
multiple @R{} versions in parallel), or after you have deleted the
files.
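For example, to list the @R{}-related receipts and then forget the
framework package (which may require administrator rights):
@example
pkgutil --pkgs | grep org.r-project
sudo pkgutil --forget org.r-project.R.el-capitan.fw.pkg
@end example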
Uninstalling the Tcl/Tk or Texinfo components (which are installed under
@file{/usr/local}) is not as simple. You can list the files they installed
in a Terminal by
@example
pkgutil --files org.r-project.x86_64.tcltk.x11
pkgutil --files org.r-project.x86_64.texinfo
@end example
@noindent
These are paths relative to @file{/}, the root of the file system.
@c Maybe too dangerous for naive users.
@c file.remove removes empty directories on Unix.
@c The second could be uninstalled by an @R{} script like
@c @example
@c lis <- system2("pkgutil", "--files org.r-project.x86_64.texinfo", stdout = TRUE)
@c setwd("/")
@c file.remove(rev(lis))
@c @end example
@c @noindent
@c run as the owner of @file{/usr/local}.
@node Multiple versions, , Uninstalling under macOS, Installing R under macOS
@section Multiple versions
The installer will remove any previous version@footnote{More precisely,
of the Apple package of the same name: this means that installing a package for
3.3.x does not remove an installation for 3.4.x or later.} of
the @R{} framework which it finds installed. This can be avoided by
using @command{pkgutil --forget} (see the previous section). However,
note that different versions are installed under
@file{/Library/Frameworks/R.framework/Versions} as @file{3.5},
@file{3.6} and so on, so it is not possible to have different
@samp{3.x.y} versions installed for the same @samp{x}.
A version of @R{} can be run directly from the command-line as e.g.@:
@example
/Library/Frameworks/R.framework/Versions/3.6/Resources/bin/R
@end example
@noindent
However, @Rapp{} will always run the `current' version, that is the last
installed version. A small utility, @command{Rswitch.app} (available at
@url{https://mac.R-project.org/#other}: it is 32-bit so not usable on
Catalina), can be used to change the `current' version. This is of
limited use as @Rapp{} is compiled against a particular version of @R{}
and will likely crash if switched to an earlier version. This may allow
you to install a development version of @R{} (de-selecting @Rapp{}) and
then switch back to the release version.
@node Running R, Add-on packages, Installing R under macOS, Top
@chapter Running R
How to start @R{} and what command-line options are available is discussed
in @ref{Invoking R, , Invoking R, R-intro, An Introduction to R}.
You should ensure that the shell has set adequate resource limits: @R{}
expects a stack size of at least 8MB and to be able to open at least 256
file descriptors. (Any modern OS should have default limits at least as
large as these, but apparently NetBSD may not. Use the shell command
@command{ulimit} (@command{sh}/@command{bash}) or @command{limit}
(@command{csh}/@command{tcsh}) to check.) For some
compilers@footnote{The Oracle compilers on Solaris (where the issue is
parsing very complex @R{} expressions) and GCC 9 on Linux.} and packages
a larger stack size has been needed: 20-25MB has sufficed to date.
@R{} makes use of a number of environment variables, the default values
of many of which are set in file @file{@var{R_HOME}/etc/Renviron} (there
are none set by default on Windows and hence no such file). These are
set at @command{configure} time, and you would not normally want to
@enindex R_PAPERSIZE
change them -- a possible exception is @env{R_PAPERSIZE} (@pxref{Setting
paper size}). The paper size will be deduced from the @samp{LC_PAPER}
locale category if it exists and @env{R_PAPERSIZE} is unset, and this
will normally produce the right choice from @samp{a4} and @samp{letter}
on modern Unix-alikes (but can always be overridden by setting
@env{R_PAPERSIZE}).
Various environment variables can be set to determine where @R{} creates
its per-session temporary directory. The environment variables
@enindex TMPDIR
@enindex TMP
@enindex TEMP
@env{TMPDIR}, @env{TMP} and @env{TEMP} are searched in turn and the
first one which is set and points to a writable area is used. If none
do, the final default is @file{/tmp} on Unix-alikes and the value of
@enindex R_USER
@env{R_USER} on Windows. The path should be an absolute path not
containing spaces (and it is best to avoid non-alphanumeric characters
such as @code{+}).
Some Unix-alike systems are set up to remove files and directories
periodically from @file{/tmp}, for example by a @command{cron} job
@enindex TMPDIR
running @command{tmpwatch}. Set @env{TMPDIR} to another directory
before starting long-running jobs on such a system.
Note that @env{TMPDIR} will be used to execute @command{configure}
scripts when installing packages, so if @file{/tmp} has been mounted as
@samp{noexec}, @env{TMPDIR} needs to be set to a directory from which
execution is allowed.
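A minimal sketch for a long-running batch job (the directory and script
name are illustrative):
@example
mkdir -p $HOME/rtmp
TMPDIR=$HOME/rtmp R CMD BATCH myscript.R
@end example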
@node Add-on packages, Internationalization, Running R, Top
@chapter Add-on packages
@cindex Packages
@cindex Libraries
@menu
* Default packages::
* Managing libraries::
* Installing packages::
* Updating packages::
* Removing packages::
* Setting up a package repository::
* Checking installed source packages::
@end menu
It is helpful to use the correct terminology. A @emph{package} is
loaded from a @emph{library} by the function @code{library()}. Thus a
library is a directory containing installed packages; the main library
is @file{@var{R_HOME}/library}, but others can be used, for example by
@enindex R_LIBS
setting the environment variable @env{R_LIBS} or using the @R{} function
@code{.libPaths()}. To avoid any confusion you will often see a library
directory referred to as a `library tree'.
@node Default packages, Managing libraries, Add-on packages, Add-on packages
@section Default packages
@cindex Packages, default
The set of packages loaded on startup is by default
@example
> getOption("defaultPackages")
[1] "datasets" "utils" "grDevices" "graphics" "stats" "methods"
@end example
@noindent
(plus, of course, @pkg{base}) and this can be changed by setting the
option in startup code (e.g.@: in @file{~/.Rprofile}). It is initially
@enindex R_DEFAULT_PACKAGES
set to the value of the environment variable @env{R_DEFAULT_PACKAGES} if
set (as a comma-separated list). Setting @env{R_DEFAULT_PACKAGES=NULL}
ensures that only package @pkg{base} is loaded.
Changing the set of default packages is normally used to reduce the set
for speed when scripting: in particular not using @pkg{methods} will
reduce the start-up time by a factor of up to two. But it can also be
used to customize @R{}, e.g.@: for class use. @command{Rscript}
also checks the environment variable @env{R_SCRIPT_DEFAULT_PACKAGES};
@enindex R_SCRIPT_DEFAULT_PACKAGES
if set, this takes precedence over @env{R_DEFAULT_PACKAGES}.
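As an illustration only, the default set could be reduced (dropping
@pkg{methods} for faster scripted start-up) by placing in
@file{~/.Rprofile}:
@example
options(defaultPackages =
        c("datasets", "utils", "grDevices", "graphics", "stats"))
@end example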
@node Managing libraries, Installing packages, Default packages, Add-on packages
@section Managing libraries
@cindex Libraries, managing
@R{} packages are installed into @emph{libraries}, which are
directories in the file system containing a subdirectory for each
package installed there.
@R{} comes with a single library, @file{@var{R_HOME}/library} which is
the value of the @R{} object @samp{.Library} containing the standard and
recommended@footnote{unless they were excluded in the build.} packages.
Both sites and users can create others and make use of them (or not) in
an @R{} session. At the lowest level @samp{.libPaths()} can be used to
add paths to the collection of libraries or to report the current
collection.
@cindex Libraries, site
@cindex Site libraries
@R{} will automatically make use of a site-specific library
@file{@var{R_HOME}/site-library} if this exists (it does not in a
vanilla @R{} installation). This location can be overridden by
setting@footnote{its binding is locked once the startup files have been
read, so users cannot easily change it.} @samp{.Library.site} in
@file{@var{R_HOME}/etc/Rprofile.site}, or (not recommended) by setting
the
@enindex R_LIBS_SITE
environment variable @env{R_LIBS_SITE}. Like @samp{.Library}, the
site libraries are always included by @samp{.libPaths()}.
@cindex Libraries, user
@cindex User libraries
@enindex R_LIBS_USER
Users can have one or more libraries, normally specified by the
environment variable @env{R_LIBS_USER}. This has a default value (to
see it, use @samp{Sys.getenv("R_LIBS_USER")} within an @R{} session),
but that is only used if the corresponding directory actually exists
(which by default it will not).
Both @env{R_LIBS_USER} and @env{R_LIBS_SITE} can specify multiple
library paths, separated by colons (semicolons on Windows).
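For example, on a Unix-alike a user might set in a shell start-up file
(the paths are illustrative):
@example
export R_LIBS_USER=$HOME/R/library:$HOME/R/dev-library
@end example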
@node Installing packages, Updating packages, Managing libraries, Add-on packages
@section Installing packages
@cindex Packages, installing
@menu
* Windows packages::
* macOS packages::
* Customizing package compilation::
* Multiple sub-architectures::
* Byte-compilation::
* External software::
@end menu
Packages may be distributed in source form or compiled binary form.
Installing source packages which contain C/C++/Fortran code requires
that compilers and related tools be installed. Binary packages are
platform-specific and generally need no special tools to install, but
see the documentation for your platform for details.
Note that you may need to specify implicitly or explicitly the library to
which the package is to be installed. This is only an issue if you have
more than one library, of course.
@c If installing packages on a Unix-alike to be used by other users, ensure
@c that the system @code{umask} is set to give sufficient permissions (see
@c also @code{Sys.umask} in @R{}). (To a large extent this is unnecessary
@c in recent versions of @R{}, which install packages as if @code{umask = 022}.)
@enindex TMPDIR
Ensure that the environment variable @env{TMPDIR} is either unset (and
@file{/tmp} exists and can be written in and executed from) or is the
absolute path to a valid temporary directory, not containing spaces.
For most users it suffices to call
@samp{install.packages(@var{pkgname})} or its GUI equivalent if the
intention is to install a @acronym{CRAN} package and internet access is
available.@footnote{If a proxy needs to be set, see
@command{?download.file}.} On most systems @samp{install.packages()}
will allow packages to be selected from a list box (typically with
thousands of items).
To install packages from source on a Unix-alike use in a terminal
@example
R CMD INSTALL -l /path/to/library @var{pkg1} @var{pkg2} @dots{}
@end example
@noindent
The part @samp{-l /path/to/library} can be omitted, in which case the
first library of a normal @R{} session is used (that shown by
@code{.libPaths()[1]}).
There are a number of options available: use @code{R CMD INSTALL --help}
to see the current list.
@findex install.packages
Alternatively, packages can be downloaded and installed from within
@R{}. First choose your nearest @acronym{CRAN} mirror using
@command{chooseCRANmirror()}. Then download and install packages
@pkg{pkg1} and @pkg{pkg2} by
@example
> install.packages(c("pkg1", "pkg2"))
@end example
@noindent
The essential dependencies of the specified packages will also be fetched.
Unless the library is specified (argument @code{lib}) the first library
in the library search path is used: if this is not writable, @R{} will
ask the user (in an interactive session) if the default personal library
should be created, and if allowed to will install the packages there.
If you want to fetch a package and all those it depends on (in any way)
that are not already installed, use e.g.
@example
> install.packages("Rcmdr", dependencies = TRUE)
@end example
@code{install.packages} can install a source package from a local
@file{.tar.gz} file (or a URL to such a file) by setting argument
@code{repos} to @code{NULL}: this will be selected automatically if the
name given is a single @file{.tar.gz} file.
@code{install.packages} can look in several repositories, specified as a
character vector by the argument @code{repos}: these can include a
@acronym{CRAN} mirror, Bioconductor, R-forge, rforge.net,
local archives, local files, @dots{}. Function
@code{setRepositories()} can select amongst those repositories that the
@R{} installation is aware of.
Naive users sometimes forget that as well as installing a package, they
have to use @code{library} to make its functionality available.
@node Windows packages, macOS packages, Installing packages, Installing packages
@subsection Windows
What @code{install.packages} does by default is different on Unix-alikes
(except macOS) and Windows. On Unix-alikes it consults the list of
available @emph{source} packages on @acronym{CRAN} (or other
repository/ies), downloads the latest version of the package sources,
and installs them (via @code{R CMD INSTALL}). On Windows it looks (by
default) first at the list of @emph{binary} versions of packages
available for your version of @R{} and downloads the latest versions (if
any). If no binary version is available or the source version is newer,
it will install the source versions of packages without compiled
C/C++/Fortran code, and offer to do so for those with, if @command{make}
is available (and this can be tuned by option
@code{"install.packages.compile.from.source"}).
On Windows @code{install.packages} can also install a binary package
from a local @file{zip} file (or the URL of such a file) by setting
argument @code{repos} to @code{NULL}. @code{Rgui.exe} has a menu
@code{Packages} with a GUI interface to @code{install.packages},
@code{update.packages} and @code{library}.
Windows binary packages for @R{} are distributed as a single binary
containing either or both architectures (32- and 64-bit).
A few of the binary packages need other software to be installed on your
system: see for example
@uref{https://CRAN.R-project.org/@/bin/@/windows/@/contrib/@/3.2/@/@@ReadMe}.
Packages using Gtk+ (@CRANpkg{Cairo}, @CRANpkg{RGtk2},
@CRANpkg{cairoDevice} and those that depend on them) need the @file{bin}
directory of a bundled distribution of Gtk2 from
@uref{http://ftp.gnome.org/@/pub/@/gnome/@/binaries/@/win32/@/gtk+} or
@uref{http://ftp.gnome.org/@/pub/@/gnome/@/binaries/@/win64/@/gtk+} in
the path: it should work to have both 32- and 64-bit Gtk+ @file{bin}
directories in the path on a 64-bit version of @R{}.
@command{R CMD INSTALL} works in Windows to install source packages.
No additional tools are needed if the package does not contain
compiled code, and @code{install.packages(type="source")} will work
for such packages (and for those with compiled code if the tools (see
@ref{The Windows toolset}) are on the path, and the variables
@code{BINPREF} and @code{BINPREF64} are set properly; see the
discussion below). We have seen occasional permission problems after
unpacking source packages on some systems: these have been
circumvented by setting the environment variable @env{R_INSTALL_TAR}
to @samp{tar.exe}.
@enindex R_INSTALL_TAR
If you have only a source package that is known to work with current
@R{} and just want a binary Windows build of it, you could make use of
the building service offered at
@uref{https://win-builder.r-project.org/}.
For almost all packages @command{R CMD INSTALL} will attempt to install
both 32- and 64-bit builds of a package if run from a 32/64-bit install
of @R{}. It will report success if the installation of the architecture
of the running @command{R} succeeded, whether or not the other
architecture was successfully installed. The exceptions are packages
with a non-empty @file{configure.win} script or which make use of
@file{src/Makefile.win}. If @file{configure.win} does something
appropriate to both architectures use@footnote{for a small number of
@acronym{CRAN} packages where this is known to be safe and is needed by
the autobuilder this is the default. Look at the source of
@file{tools:::.install_packages} for the list. It can also be specified
in the package's @file{DESCRIPTION} file.} option
@option{--force-biarch}: otherwise @command{R CMD INSTALL
--merge-multiarch} can be applied to a source tarball to merge separate
32- and 64-bit installs. (This can only be applied to a tarball, and
will only succeed if both installs succeed.)
If you have a package without compiled code and no Windows-specific
help, you can zip up an installation on another OS and install from that
zip file on Windows. However, such a package can be installed from the
sources on Windows without any additional tools.
@enindex LOCAL_SOFT
@enindex BINPREF
@enindex BINPREF64
Packages with compiled code may need to have paths to the compilers
set explicitly, and there is provision to make use of a system-wide
library of installed external software. The compiler paths are set
using the @command{make} variables @code{BINPREF} and (usually)
@code{BINPREF64}. The library location is set using @command{make}
variable @code{LOCAL_SOFT}, to give an equivalent of @file{/usr/local}
on a Unix-alike. All of these can be set in
@file{src/gnuwin32/MkRules.local} when @R{} is built from sources (see
the comments in @file{src/gnuwin32/MkRules.dist}), or in
file@footnote{or by adding it in a file such as
@file{etc/i386/Makevars.site}, which does not exist by default.}
@file{etc/i386/Makeconf} or @file{etc/x64/Makeconf} for an installed
version of @R{}. In the latter case only @code{BINPREF} is used, with
the 64 bit path used in @file{etc/x64/Makeconf}. The version used by
@acronym{CRAN} can be installed as described in @ref{Building from
source}.
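A minimal sketch of such settings in @file{etc/x64/Makeconf} (or in
@file{MkRules.local} when @R{} is built from sources); the paths are
illustrative and depend on where the toolchain and the external
software library were installed:
@example
BINPREF = c:/Rtools/mingw_64/bin/
LOCAL_SOFT = c:/R/extsoft
@end example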
@node macOS packages, Customizing package compilation, Windows packages, Installing packages
@subsection macOS
On macOS (formerly OS X) @code{install.packages} works as it does on
other Unix-alike systems, but there are additional types starting with
@code{mac.binary} (available for the @acronym{CRAN} distribution but not
when compiling @R{} from source: @code{mac.binary.el-capitan} for an
`El Capitan and later' build with @code{"default"} a synonym for the
appropriate variant) which can be passed to @code{install.packages} in
order to download and install binary packages from a suitable
repository. These binary package files for macOS have the extension
@samp{.tgz}. The @Rapp{} GUI provides menus for installation of either
binary or source packages, from @acronym{CRAN} or local files.
On @R{} builds using binary packages, the default is type @code{both}:
this looks first at the list of binary packages available for your
version of @R{} and installs the latest versions (if any). If no binary
version is available or the source version is newer, it will install the
source versions of packages without compiled C/C++/Fortran code and offer
to do so for those with, if @command{make} is available.
Note that most binary packages which include compiled code are tied to a
particular series (e.g.@: @R{} 3.6.x or 3.5.x) of @R{}.
Installing source packages which do not contain compiled code should
work with no additional tools. For others you will need the
`Command Line Tools' for @command{Xcode} and compilers which match those
used to build @R{}: see @ref{macOS}.
Package @CRANpkg{rJava} and those which depend on it need a Java runtime
installed and several packages need X11 installed, including those using
Tk. See @ref{macOS} and @ref{Java (macOS)}.
Tcl/Tk extensions @code{BWidget} and @code{Tktable} are part of the
Tcl/Tk contained in the @R{} installer. These are required by a number
of @acronym{CRAN} and Bioconductor packages.
A few of the binary packages need other software to be installed on your
system. In particular packages using Gtk+ (@CRANpkg{RGtk2},
@CRANpkg{cairoDevice} and those that depend on them) need the GTK
framework installed from @uref{https://mac.R-project.org/libs/}: the
appropriate version at the time of writing was
@uref{https://mac.R-project.org/@/libs/@/GTK_2.24.17-X11.pkg}.
The default compilers specified are shown in file
@file{/Library/Frameworks/@/R.framework/@/Resources/etc/Makeconf}. At
the time of writing these settings assumed that the C, Fortran and C++
compilers were on the path, using @command{gfortran} 6.1.0 (see
@ref{macOS}). The settings can be changed, either by editing that file
or in a file such as @file{~/.R/Makevars} (see the next section).
Entries which may need to be changed include @samp{CC}, @samp{CXX},
@samp{FC}, @samp{FLIBS} and the corresponding flags, and perhaps
@samp{CXXCPP}, @samp{DYLIB_LD}, @samp{MAIN_LD}, @samp{SHLIB_CXXLD} and
@samp{SHLIB_LD}, as well as the @samp{CXX11}, @samp{CXX14} and
@samp{CXX17} variants.
So for example you could select a specific build of @command{clang} for
both C and C++ with extensive checking by having in @file{~/.R/Makevars}
@example
CC = /usr/local/clang7/bin/clang
CXX = /usr/local/clang7/bin/clang++
CXX11 = $CXX
CXX14 = $CXX
CXX17 = $CXX
CFLAGS = -g -O2 -Wall -pedantic -Wconversion -Wno-sign-conversion
CXXFLAGS = -g -O2 -Wall -pedantic -Wconversion -Wno-sign-conversion
CXX11FLAGS = $CXXFLAGS
CXX14FLAGS = $CXXFLAGS
CXX17FLAGS = $CXXFLAGS
@end example
@noindent
and @command{gfortran} by
(El Capitan)
@example
FC = /usr/local/gfortran/bin/gfortran
FLIBS = -L/usr/local/gfortran/lib/gcc/x86_64-apple-darwin15/6.1.0
-L/usr/local/gfortran/lib -lgfortran -lquadmath -lm
@end example
@noindent
or (Sierra or High Sierra)
@example
FC = /usr/local/gfortran/bin/gfortran
FLIBS = -L/usr/local/gfortran/lib/gcc/x86_64-apple-darwin16/6.3.0
-L/usr/local/gfortran/lib -lgfortran -lquadmath -lm
@end example
or (Mojave or later)
@example
FC = /usr/local/gfortran/bin/gfortran
FLIBS = -L/usr/local/gfortran/lib/gcc/x86_64-apple-darwin18/8.2.0
-L/usr/local/gfortran/lib -lgfortran -lquadmath -lm
@end example
@noindent
(with lines broken here for legibility).
If using the C/C++ compilers from the Command Line Tools (which do not
have OpenMP support) one will need to include
@example
SHLIB_OPENMP_CFLAGS =
SHLIB_OPENMP_CXXFLAGS =
@end example
@noindent
to compile OpenMP-using packages.
Apple includes many Open Source libraries in macOS but increasingly
without the corresponding headers (not even in Xcode or the Command
Line Tools): they are often rather old versions. If installing packages
from source using them it is usually easiest to install a
statically-linked up-to-date copy of the Open Source package from its
sources or from @uref{https://mac.R-project.org/libs}. But sometimes
it is desirable/necessary to use Apple's dynamically linked library, in
which case appropriate headers could be extracted from the
sources@footnote{Note that capitalization and version may differ from
the Open Source project.} available @emph{via}
@uref{https://opensource.apple.com}.
@node Customizing package compilation, Multiple sub-architectures, macOS packages, Installing packages
@subsection Customizing package compilation
The @R{} system and package-specific compilation flags can be overridden
or added to by setting the appropriate Make variables in the personal
file @file{@var{HOME}/.R/Makevars-@var{R_PLATFORM}} (but
@file{@var{HOME}/.R/Makevars.win} or @file{@var{HOME}/.R/Makevars.win64}
on Windows), or if that does not exist, @file{@var{HOME}/.R/Makevars},
where @samp{R_PLATFORM} is the platform for which @R{} was built, as
available in the @code{platform} component of the @R{} variable
@code{R.version}. The path to an alternative personal
file@footnote{using a path containing spaces is likely to cause
problems} can be specified @emph{via} the environment variable
@env{R_MAKEVARS_USER}.
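For example, an alternative personal file could be selected for a
session by setting (the file name is illustrative):
@example
export R_MAKEVARS_USER=$HOME/.R/Makevars.clang
@end example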
Package developers are encouraged to use this mechanism to enable a
reasonable amount of diagnostic messaging (``warnings'') when compiling,
such as e.g.@: @option{-Wall -pedantic} for tools from GCC, the GNU
Compiler Collection or for @command{clang}.
Note that this mechanism can also be used when it is necessary to change
the optimization level whilst installing a particular package. For
example
@example
## @r{for C code}
CFLAGS = -g -O -mtune=native
## @r{for C++ code}
CXXFLAGS = -g -O -mtune=native
## @r{for fixed-form Fortran code}
FFLAGS = -g -O -mtune=native
@end example
Another use is to override the settings in a binary installation of R.
For example, to use a different Fortran compiler on macOS
@example
FC = /usr/local/gfortran/bin/gfortran
FLIBS = -L/usr/local/gfortran/lib/gcc/x86_64-apple-darwin16/6.3.0
-L/usr/local/gfortran/lib -lgfortran -lquadmath -lm
@end example
@noindent
(line split for legibility here).
There is also provision for a site-wide @file{Makevars.site} file under
@file{@var{R_HOME}/etc} (in a sub-architecture-specific directory if
appropriate). This is read immediately after @file{Makeconf}, and the
path to an alternative file can be specified by environment variable
@env{R_MAKEVARS_SITE}.
Note that these mechanisms do not work with packages which fail to pass
settings down to sub-makes, perhaps reading @file{etc/Makeconf} in
makefiles in subdirectories. Fortunately such packages are unusual.
@node Multiple sub-architectures, Byte-compilation, Customizing package compilation, Installing packages
@subsection Multiple sub-architectures
When installing packages from their sources, there are some extra
considerations on installations which use sub-architectures. These are
commonly used on Windows but can in principle be used on other
platforms.
When a source package is installed by a build of @R{} which supports
multiple sub-architectures, the normal installation process installs the
packages for all sub-architectures. The exceptions are
@table @emph
@item Unix-alikes
where there is a @file{configure} script, or a file @file{src/Makefile}.
@item Windows
where there is a non-empty @file{configure.win} script, or a file
@file{src/Makefile.win} (with some exceptions where the package is known
to have an architecture-independent @file{configure.win}, or if
@option{--force-biarch} or field @samp{Biarch} in the @file{DESCRIPTION}
file is used to assert so).
@end table
@noindent
In those cases only the current architecture is installed. Further
sub-architectures can be installed by
@example
R CMD INSTALL --libs-only @var{pkg}
@end example
@noindent
using the path to @command{R} or @command{R --arch} to select the
additional sub-architecture. There is also @command{R CMD INSTALL
--merge-multiarch} to build and merge the two architectures, starting
with a source tarball.
@node Byte-compilation, External software, Multiple sub-architectures, Installing packages
@subsection Byte-compilation
As from @R{} 3.6.0, all packages are by default byte-compiled.
Byte-compilation can be controlled on a per-package basis by the
@samp{ByteCompile} field in the @file{DESCRIPTION} file.
@node External software, , Byte-compilation, Installing packages
@subsection External software
Some @R{} packages contain compiled code which links to external
software libraries. Unless the external library is statically linked
(which is done as much as possible for binary packages on Windows and
macOS), the libraries have to be found when the package is loaded and
not just when it is installed. How this should be done depends on the
OS (and in some cases the version).
For Unix-alikes except macOS the primary mechanism is the @code{ld.so}
cache controlled by @command{ldconfig}: external dynamic libraries
recorded in that cache will be found. Standard library locations will
be covered by the cache, and well-designed software will add its
locations (as for example @pkg{openmpi} does on Fedora). The secondary
mechanism is to consult the environment variable @env{LD_LIBRARY_PATH}.
The @R{} script controls that variable, and sets it to the concatenation
of @env{R_LD_LIBRARY_PATH}, @env{R_JAVA_LD_LIBRARY_PATH} and the
environment value of @env{LD_LIBRARY_PATH}. The first two have defaults
which are normally set when @R{} is installed (but can be overridden in
the environment) so @env{LD_LIBRARY_PATH} is the best choice for a user
to set.
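For example, a user might make libraries installed under an unusual
prefix findable at load time by something like (the path is
illustrative; prepend to any existing value if one is already set):
@example
export LD_LIBRARY_PATH=/opt/somelib/lib
@end example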
On macOS the primary mechanism is to embed the absolute path to
dependent dynamic libraries into an object when it is compiled. Few
@R{} packages arrange to do so, but it can be edited@footnote{They need
to have been created using @option{-headerpad_max_install_names}, which
is the default for an @R{} package.} @emph{via}
@command{install_name_tool} --- that only deals with direct dependencies
and those would also need to be compiled to include the absolute paths
of their dependencies. If the choice of absolute path is to be deferred
to load time, how they are resolved is described in @command{man dyld}:
the role of @env{LD_LIBRARY_PATH} is replaced on macOS by
@env{DYLD_LIBRARY_PATH} and @env{DYLD_FALLBACK_LIBRARY_PATH}. Running
@command{R CMD otool -L} on the package shared object will show where
(if anywhere) its dependencies are
resolved. @env{DYLD_FALLBACK_LIBRARY_PATH} is preferred (and it is that
which is manipulated by the @R{} script), but as from 10.11 (`El
Capitan') the default behaviour had been changed for security reasons to
discard these environment variables when invoking a shell script (and
@file{R} is a shell script). That makes the only portable option to set
@env{R_LD_LIBRARY_PATH} in the environment, something like
@example
export R_LD_LIBRARY_PATH="`R RHOME`/lib:/opt/local/lib"
@end example
The precise rules for where Windows looks for DLLs are complex and
depend on the version of Windows. But for present purposes the main
solution is to put the directories containing the DLLs the package
links to (and any those DLLs link to) on the @env{PATH}. 64-bit
versions of Windows will ignore 32-bit DLLs from 64-bit @R{} and
@emph{vice versa}.
The danger with any of the methods which involve setting environment
variables is of inadvertently masking a system library. The risk is less
for @env{DYLD_FALLBACK_LIBRARY_PATH} and for @emph{appending} to
@env{PATH} on Windows (as it should already contain the system library
paths).
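For example, on Windows one might append to the path in the command
prompt before starting @R{} (cmd syntax; the directory is illustrative):
@example
set PATH=%PATH%;c:\R\extsoft\bin
@end example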
@node Updating packages, Removing packages, Installing packages, Add-on packages
@section Updating packages
@findex update.packages
@cindex Packages, updating
The command @code{update.packages()} is the simplest way to ensure that
all the packages on your system are up to date. It downloads the list
of available packages and their current versions, compares it with those
installed and offers to fetch and install any that have later versions
on the repositories.
An alternative interface to keeping packages up-to-date is provided by
the command @code{packageStatus()}, which returns an object with
information on all installed packages and packages available at multiple
repositories. The @code{print} and @code{summary} methods give an
overview of installed and available packages, the @code{upgrade} method
offers to fetch and install the latest versions of outdated packages.
One sometimes-useful additional piece of information that
@code{packageStatus()} returns is the status of a package, as
@code{"ok"}, @code{"upgrade"} or @code{"unavailable"} (in the currently
selected repositories). For example
@example
> inst <- packageStatus()$inst
> inst[inst$Status != "ok", c("Package", "Version", "Status")]
Package Version Status
Biobase Biobase 2.8.0 unavailable
RCurl RCurl 1.4-2 upgrade
Rgraphviz Rgraphviz 1.26.0 unavailable
rgdal rgdal 0.6-27 upgrade
@end example
@node Removing packages, Setting up a package repository, Updating packages, Add-on packages
@section Removing packages
@findex remove.packages
@cindex Packages, removing
Packages can be removed in a number of ways. From a command prompt they
can be removed by
@example
R CMD REMOVE -l /path/to/library @var{pkg1} @var{pkg2} @dots{}
@end example
From a running @R{} process they can be removed by
@example
> remove.packages(c("pkg1", "pkg2"),
lib = file.path("path", "to", "library"))
@end example
Finally, one can just remove the package directory from the library.
@node Setting up a package repository, Checking installed source packages, Removing packages, Add-on packages
@section Setting up a package repository
@cindex Repositories
Utilities such as @code{install.packages} can be pointed at any
@acronym{CRAN}-style repository, and @R{} users may want to set up their
own. The `base' of a repository is a URL such as
@uref{http://www.stats.ox.ac.uk/pub/RWin}: this must be a URL scheme
that @code{download.packages} supports (which also includes
@samp{https://}, @samp{ftp://} and @samp{file://}). Under that base URL
there should be directory trees for one or more of the following types
of package distributions:
@itemize
@item
@code{"source"}: located at @file{src/contrib} and containing
@file{.tar.gz} files. Other forms of compression can be used, e.g.@:
@file{.tar.bz2} or @file{.tar.xz} files. Complete repositories contain
the sources corresponding to any binary packages, and in any case it is
wise to have a @file{src/contrib} area with a possibly empty
@file{PACKAGES} file.
@item
@code{"win.binary"}: located at @file{bin/windows/contrib/@var{x.y}} for
@R{} versions @var{x.y.z} and containing @file{.zip} files for Windows.
@item
@code{"mac.binary.el-capitan"}: located at
@file{bin/macosx/el-capitan/contrib/@var{3.y}} for the CRAN builds for
`El Capitan (and later)' for @R{} versions @var{3.y.z}, containing
@file{.tgz} files.
@end itemize
Each terminal directory must also contain a @file{PACKAGES} file. This
can be a concatenation of the @file{DESCRIPTION} files of the packages
separated by blank lines, but only a few of the fields are needed. The
simplest way to set up such a file is to use function
@code{write_PACKAGES} in the @pkg{tools} package, and its help explains
which fields are needed. Optionally there can also be
@file{PACKAGES.rds} and @file{PACKAGES.gz} files, downloaded in
preference to @file{PACKAGES}. (These files will be smaller:
@file{PACKAGES.rds} is used only from @R{} 3.4.0. If you have a
mis-configured server that does not correctly report non-existent files,
you may need these files.)
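A minimal sketch, assuming the source @file{.tar.gz} files have already
been copied into the repository's @file{src/contrib} directory (the path
is illustrative):
@example
library("tools")
write_PACKAGES("/path/to/repobase/src/contrib", type = "source")
@end example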
To add your repository to the list offered by @code{setRepositories()},
see the help file for that function.
Incomplete repositories are better specified @emph{via} a
@code{contriburl} argument than @emph{via} being set as a repository.
A repository can contain subdirectories, when the descriptions in the
@file{PACKAGES} file of packages in subdirectories must include a line
of the form
@example
Path: @var{path/to/subdirectory}
@end example
@noindent
---once again @code{write_PACKAGES} is the simplest way to set this up.
@node Checking installed source packages, , Setting up a package repository, Add-on packages
@section Checking installed source packages
It can be convenient to run @command{R CMD check} on an installed
package, particularly on a platform which uses sub-architectures. The
outline of how to do this is, with the source package in directory
@file{@var{pkg}} (or a tarball filename):
@example
R CMD INSTALL -l @var{libdir} @var{pkg} > @var{pkg}.log 2>&1
R CMD check -l @var{libdir} --install=check:@var{pkg}.log @var{pkg}
@end example
@noindent
Where sub-architectures are in use the @command{R CMD check} line can be
repeated with additional architectures by
@example
R --arch @var{arch} CMD check -l @var{libdir} --extra-arch --install=check:@var{pkg}.log @var{pkg}
@end example
@noindent
where @option{--extra-arch} selects only those checks which depend on
the installed code and not those which analyse the sources. (If
multiple sub-architectures fail only because they need different
settings, e.g.@: environment variables, @option{--no-multiarch} may need
to be added to the @code{INSTALL} lines.) On Unix-alikes the
architecture to run is selected by @option{--arch}: this can also be
used on Windows with @file{@var{R_HOME}/bin/R.exe}, but it is more usual
to select the path to the @command{Rcmd.exe} of the desired
architecture.
So on Windows to install, check and package for distribution a source
package from a tarball which has been tested on another platform one
might use
@example
.../bin/i386/Rcmd INSTALL -l @var{libdir} @var{tarball} --build > @var{pkg}.log 2>&1
.../bin/i386/Rcmd check -l @var{libdir} --extra-arch --install=check:@var{pkg}.log @var{pkg}
.../bin/x64/Rcmd check -l @var{libdir} --extra-arch --install=check:@var{pkg}.log @var{pkg}
@end example
@noindent
where one might want to run the second and third lines in a different
shell with different settings for environment variables and the path (to
find external software, notably for Gtk+).
@command{R CMD INSTALL} can do a @code{i386} install and then add the
@code{x64} DLL from a single command by
@example
R CMD INSTALL --merge-multiarch -l @var{libdir} @var{tarball}
@end example
@noindent
and @option{--build} can be added to zip up the installation.
@node Internationalization, Choosing between 32- and 64-bit builds, Add-on packages, Top
@chapter Internationalization and Localization
@cindex Internationalization
@cindex Localization
@emph{Internationalization} refers to the process of enabling support
for many human languages, and @emph{localization} to adapting to a
specific country and language.
Current builds of @R{} support all the character sets that the
underlying OS can handle. These are interpreted according to the
@cindex Locale
current @code{locale}, a sufficiently complicated topic to merit a
separate section. Note though that @R{} has no built-in support for
right-to-left languages and bidirectional output, relying on the OS
services. For example, how character vectors in UTF-8 containing both
English digits and Hebrew characters are printed is OS-dependent (and
perhaps locale-dependent).
The other aspect of the internationalization is support for the
translation of messages. This is enabled in almost all builds of @R{}.
@menu
* Locales::
* Localization of messages::
@end menu
@node Locales, Localization of messages, Internationalization, Internationalization
@section Locales
@cindex Locale
A @emph{locale} is a description of the local environment of the user,
including the preferred language, the encoding of characters, the
currency used and its conventions, and so on. Aspects of the locale are
accessed by the @R{} functions @code{Sys.getlocale} and
@code{Sys.localeconv}.
The system of naming locales is OS-specific. There is quite wide
agreement on schemes, but not on the details of their implementation. A
locale needs to specify
@itemize
@item
A human language. These are generally specified by a lower-case
two-character abbreviation following ISO 639 (see e.g.@:
@uref{https://en.wikipedia.org/@/wiki/@/ISO_639-1}).
@item
A `territory', used mainly to specify the currency. These are generally
specified by an upper-case two-character abbreviation following ISO 3166
(see e.g.@: @uref{https://@/en.wikipedia.org/@/wiki/@/ISO_3166}).
@item
A charset encoding, which determines both how a byte stream should be
divided into characters, and which characters the subsequences of bytes
represent. Sometimes the combination of language and territory is used
to specify the encoding, for example to distinguish between traditional
and simplified Chinese.
@item
Optionally, a modifier, for example to indicate that Austria is to be
considered pre- or post-Euro. The modifier is also used to indicate the
script (@code{@@latin}, @code{@@cyrillic} for Serbian, @code{@@iqtelif})
or language dialect (e.g.@: @code{@@saaho}, a dialect of Afar, and
@code{@@bokmal} and @code{@@nynorsk}, dialects of Norwegian regarded by
some OSes as separate languages, @code{no} and @code{nn}).
@end itemize
@R{} is principally concerned with the first (for translations) and
third. Note that the charset may be deducible from the language, as
some OSes offer only one charset per language.
@menu
* Locales under Unix-alikes::
* Locales under Windows::
* Locales under macOS::
@end menu
@node Locales under Unix-alikes, Locales under Windows, Locales, Locales
@subsection Locales under Unix-alikes
Modern Linux uses the XPG@footnote{`X/Open Portability Guide', which has
had several versions.} locale specifications which have the form
@samp{en_GB}, @samp{en_GB.UTF-8}, @samp{aa_ER.UTF-8@@saaho},
@samp{de_AT.iso885915@@euro}, the components being in the order listed
above. (See @command{man locale} and @command{locale -a} for more
details.) Similar schemes are used by most Unix-alikes: some (including
some distributions of Linux) use @samp{.utf8} rather than @samp{.UTF-8}.
Note that whereas UTF-8 locales are nowadays almost universally used,
locales such as @samp{en_GB} use 8-bit encodings for backwards
compatibility.
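For example, to list the locales the OS provides and to run a single
@R{} session in one of them (a sketch; the names reported by
@command{locale -a} vary by system):
@example
locale -a | grep -i 'en_GB'
LC_ALL=en_GB.UTF-8 R
@end example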
@node Locales under Windows, Locales under macOS, Locales under Unix-alikes, Locales
@subsection Locales under Windows
Windows also uses locales, but specified in a rather less concise way.
Most users will encounter locales only via drop-down menus, but more
information and lists can be found by searching for @samp{Windows
language country strings}.
It offers only one encoding per language.
Some care is needed with Windows' locale names. For example,
@code{chinese} is Traditional Chinese and not Simplified Chinese as used
in most of the Chinese-speaking world.
@node Locales under macOS, , Locales under Windows, Locales
@subsection Locales under macOS
macOS supports locales in its own particular way, but the @R{} GUI tries to
make this easier for users. See
@uref{https://developer.apple.com/@/library/@/content/@/documentation/@/MacOSX/@/Conceptual/@/BPInternational/}
for how users can set their locales. As with Windows, end users will
generally only see lists of languages/territories. Users of @R{} in a
terminal may need to set the locale to something like @samp{en_GB.UTF-8}
if it defaults to @samp{C} (as it sometimes does when logging in
remotely and for batch jobs: note whether @command{Terminal} sets the
@env{LANG} environment variable is an (advanced) preference, but does so
by default).
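So for remote logins and batch jobs it may be necessary to set the
locale in a shell start-up file, for example (assuming a Bourne-style
shell):
@example
export LANG=en_GB.UTF-8
@end example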
Internally macOS uses a form similar to Linux: the main difference from
other Unix-alikes is that where a character set is not specified it is
assumed to be @code{UTF-8}.
@node Localization of messages, , Locales, Internationalization
@section Localization of messages
The preferred language for messages is by default taken from the locale.
This can be overridden first by the setting of the environment variable
@enindex LANGUAGE
@enindex LC_ALL
@enindex LC_MESSAGES
@enindex LANG
@env{LANGUAGE} and then@footnote{On some systems setting
@env{LC_ALL} or @env{LC_MESSAGES} to @samp{C} disables @env{LANGUAGE}.}
by the environment variables @env{LC_ALL}, @env{LC_MESSAGES} and
@env{LANG}. (The last three are normally used to set the locale and so
should not be needed, but the first is only used to select the language
for messages.) The code tries hard to map locales to languages, but on
some systems (notably Windows) the locale names needed for the
environment variable @env{LC_ALL} do not all correspond to XPG language
names and so @env{LANGUAGE} may need to be set. (One example is
@samp{LC_ALL=es} on Windows which sets the locale to Estonian and the
language to Spanish.)
It is usually possible to change the language once @R{} is running
@emph{via} (not Windows) @code{Sys.setlocale("LC_MESSAGES",
"new_locale")}, or by setting an environment variable such as
@env{LANGUAGE}, @emph{provided}@footnote{If you try changing from French
to Russian except in a UTF-8 locale, you will most likely find messages
change to English.} the language you are changing to can be output in
the current character set. But this is OS-specific, and has been known
to stop working on an OS upgrade.
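As a sketch, on a Unix-alike with the distributed German catalogues
installed and a UTF-8 locale, a whole session can be run with
translated messages by setting @env{LANGUAGE} when @R{} is started:
@example
LANGUAGE=de Rscript -e 'sqrt(-1)'
@end example
@noindent
The @samp{NaNs produced} warning should then appear in German, falling
back to English if no suitable catalogue or charset is available.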
Messages are divided into @emph{domains}, and translations may be
available for some or all messages in a domain. @R{} makes use of the
following domains.
@itemize
@item
Domain @code{R} for the C-level error and warning messages from the @R{}
interpreter.
@item
Domain @code{R-@var{pkg}} for the @R{} @code{stop}, @code{warning} and
@code{message} messages in each package, including @code{R-base} for the
@pkg{base} package.
@item
Domain @code{@var{pkg}} for the C-level messages in each package.
@item
Domain @code{RGui} for the menus etc of the @R{} for Windows GUI front-end.
@end itemize
Dividing up the messages in this way allows @R{} to be extensible: as
packages are loaded, their message translation catalogues can be loaded
too.
@R{} can be built without support for translations, but it is enabled by
default.
R-level and C-level domains are subtly different, for example in the way
strings are canonicalized before being passed for translation.
Translations are looked for by domain according to the currently
specified language, as specifically as possible, so for example an
Austrian (@samp{de_AT}) translation catalogue will be used in preference
to a generic German one (@samp{de}) for an Austrian user. However, if a
specific translation catalogue exists but does not contain a
translation, the less specific catalogues are consulted. For example,
@R{} has catalogues for @samp{en_GB} that translate the Americanisms
(e.g., @samp{gray}) in the standard messages into English.@footnote{the
language written in England: some people living in the USA appropriate
this name for their language.} Two other examples: there are
catalogues for @samp{es} (Spanish as written in Spain), which by
default are also used in Spanish-speaking Latin American countries,
and for @samp{pt_BR}, which are used for Brazilian locales but not for
locales specifying Portugal.
Translations in the right language but the wrong charset are made use of
@enindex LANGUAGE
by on-the-fly re-encoding. The @env{LANGUAGE} variable (only) can be a
colon-separated list, for example @samp{se:de}, giving a set of
languages in decreasing order of preference. One special value is
@samp{en@@quot}, which can be used in a UTF-8 locale to have American
error messages with pairs of single quotes translated to Unicode directional
quotes.
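As a sketch, assuming a UTF-8 locale and that the relevant catalogues
are installed:
@example
## prefer Spanish, then German, falling back to English
LANGUAGE=es:de R
## American messages with Unicode directional quotes
LANGUAGE=en@@quot R
@end example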
If no suitable translation catalogue is found or a particular message is
not translated in any suitable catalogue, `English'@footnote{with
Americanisms.} is used.
See @uref{https://developer.r-project.org/@/Translations30.html} for how to
prepare and install translation catalogues.
@node Choosing between 32- and 64-bit builds, The standalone Rmath library, Internationalization, Top
@chapter Choosing between 32- and 64-bit builds
Almost all current @acronym{CPU}s have both 32- and 64-bit sets of
instructions. Most OSes running on such @acronym{CPU}s offer the choice
of building a 32-bit or a 64-bit version of @R{} (and details are given
below under specific OSes). For most a 32-bit version is the default,
but for some (e.g., @cputype{x86_64} Linux and macOS @geq{} 10.6)
64-bit is.
All current versions of @R{} use 32-bit integers (this is enforced in
the build) and @acronym{ISO}/@acronym{IEC}@tie{}60559@footnote{also
known as @acronym{IEEE}@tie{}754} double-precision reals, and so compute
to the same precision@footnote{at least when storing quantities: the
on-FPU precision is allowed to vary} and with the same limits on the
sizes of numerical quantities. The principal difference is in the size
of the pointers.
64-bit builds have both advantages and disadvantages:
@itemize
@item
The total virtual memory space made available to a 32-bit process is
limited by the pointer size to 4GB, and on most OSes to 3GB (or even
2GB). The limits for 64-bit processes are much larger (e.g.@:
8--128TB).
@R{} allocates memory for large objects as needed, and removes any
unused ones at garbage collection. When the sizes of objects become an
appreciable fraction of the address limit, fragmentation of the address
space becomes an issue and there may be no hole available that is the
size requested. This can cause more frequent garbage collection or the
inability to allocate large objects. As a guide, this will become an
issue for 32-bit builds with objects more than 10% of the size of the
address space (around 300Mb) or when the total size of objects in use is
around one third (around 1Gb).
@item
Only 64-bit builds support `long vectors', those with @math{2^{31}} or
more elements (which need at least 16GB of storage for each numeric
vector).
@item
Most 32-bit OSes by default limit file sizes to 2GB (and this may also
apply to 32-bit builds on 64-bit OSes). This can often be worked
around: @command{configure} selects suitable defines if this is
possible. (We have also largely worked around that limit on 32-bit
Windows.) 64-bit builds have much larger limits.
@item
Because the pointers are larger, @R{}'s basic structures are larger.
This means that @R{} objects take more space and (usually) more time to
manipulate. So 64-bit builds of @R{} will, all other things being
equal, run slower than 32-bit builds. (On Sparc Solaris the difference
was 15-20%.)
@item
However, `other things' may not be equal. In the specific case of
@cputype{x86_64} @emph{vs} @cputype{ix86}, the 64-bit CPU has features
(such as SSE2 instructions) which are guaranteed to be present but are
optional on the 32-bit CPU, and also has more general-purpose registers.
This means that on chips like a desktop Intel i7 the vanilla 64-bit
version of @R{} has been around 10% faster on both Linux and macOS.
(Laptop CPUs are usually relatively slower in 64-bit mode.)
@end itemize
So, for speed you may want to use a 32-bit build (especially on a
laptop), but to handle large datasets (and perhaps large files) a 64-bit
build. You can often build both and install them in the same place:
@xref{Sub-architectures}. (This is done for the Windows binary
distributions.)
Even on 64-bit builds of @R{} there are limits on the size of @R{}
objects (see @code{help("Memory-limits")}), some of which stem from the
use of 32-bit integers (especially in Fortran code). For example, each
dimension of an array is limited to @math{2^{31} - 1}.
@node The standalone Rmath library, Essential and useful other programs under a Unix-alike, Choosing between 32- and 64-bit builds, Top
@chapter The standalone Rmath library
The routines supporting the distribution and
special@footnote{e.g.@: Bessel, beta and gamma functions} functions in @R{}
and a few others are declared in C header file @file{Rmath.h}. These
can be compiled into a standalone library for linking to other
applications. (Note that they are not a separate library when @R{} is
built, and the standalone version differs in several ways.)
The makefiles and other sources needed are in directory
@file{src/nmath/standalone}, so the following instructions assume that
is the current working directory (in the build directory tree on a
Unix-alike if that is separate from the sources).
@file{Rmath.h} contains @samp{R_VERSION_STRING}, which is a character
string containing the current @R{} version, for example @code{"3.6.0"}.
There is full access to @R{}'s handling of @code{NaN}, @code{Inf} and
@code{-Inf} via special versions of the macros and functions
@example
ISNAN, R_FINITE, R_log, R_pow and R_pow_di
@end example
@noindent
and (extern) constants @code{R_PosInf}, @code{R_NegInf} and @code{NA_REAL}.
There is no support for @R{}'s notion of missing values, in particular
not for @code{NA_INTEGER} nor the distinction between @code{NA} and
@code{NaN} for doubles.
A little care is needed to use the random-number routines. You will
need to supply the uniform random number generator
@example
double unif_rand(void)
@end example
@noindent
or use the one supplied (and with a shared library or DLL you may
have to use the one supplied, which is the Marsaglia-multicarry with
an entry point
@example
set_seed(unsigned int, unsigned int)
@end example
@noindent
to set its seeds).
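As a minimal sketch of calling the standalone library from C (the file
name @file{rmath_demo.c}, the seeds and the functions chosen are
arbitrary; compiling and linking are covered in the following
sections):
@example
/* rmath_demo.c: a minimal sketch using standalone Rmath */
#define MATHLIB_STANDALONE
#include <stdio.h>
#include <Rmath.h>

int main(void)
@{
    set_seed(123, 456);            /* seed the supplied generator */
    printf("unif_rand() = %f\n", unif_rand());
    printf("dnorm(1.96, 0, 1) = %f\n", dnorm(1.96, 0.0, 1.0, 0));
    printf("qnorm(0.975, 0, 1) = %f\n", qnorm(0.975, 0.0, 1.0, 1, 0));
    return 0;
@}
@end example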
The facilities to change the normal random number generator are
available through the constant @code{N01_kind}. This takes values
from the enumeration type
@example
typedef enum @{
    BUGGY_KINDERMAN_RAMAGE,
    AHRENS_DIETER,
    BOX_MULLER,
    USER_NORM,
    INVERSION,
    KINDERMAN_RAMAGE
@} N01type;
@end example
@noindent
(and @samp{USER_NORM} is not available).
@menu
* Unix-alike standalone::
* Windows standalone::
@end menu
@node Unix-alike standalone, Windows standalone, The standalone Rmath library, The standalone Rmath library
@section Unix-alikes
If @R{} has not already been made in the directory tree,
@command{configure} must be run as described in the main build
instructions.
Then (in @file{src/nmath/standalone})
@example
make
@end example
@noindent
will make standalone libraries @file{libRmath.a} and @file{libRmath.so}
(@file{libRmath.dylib} on macOS): @samp{make static} and @samp{make
shared} will create just one of them.
To use the routines in your own C or C++ programs, include
@example
#define MATHLIB_STANDALONE
#include <Rmath.h>
@end example
@noindent
and link against @samp{-lRmath} (and @samp{-lm} if needed on your OS).
The example file @file{test.c} does nothing useful, but is provided to
test the process (via @command{make test}). Note that you will probably
not be able to run it unless you add the directory containing
@enindex LD_LIBRARY_PATH
@file{libRmath.so} to the @env{LD_LIBRARY_PATH} environment variable
(@file{libRmath.dylib}, @env{DYLD_FALLBACK_LIBRARY_PATH} on macOS).
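For example, on a Unix-alike where the shared library has just been
built in the current directory, the supplied test can usually be run
by something like (a sketch):
@example
LD_LIBRARY_PATH=`pwd` make test
@end example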
The targets
@example
make install
make uninstall
@end example
@noindent
will (un)install the header @file{Rmath.h} and shared and static
@enindex DESTDIR
libraries (if built). Both @code{prefix=} and @env{DESTDIR} are
supported, together with more precise control as described for the main
build.
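For example, to install under @file{/opt/Rmath} while staging into a
scratch root (a sketch; both paths are arbitrary):
@example
make prefix=/opt/Rmath DESTDIR=/tmp/Rmath-stage install
@end example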
@samp{make install} installs a file for @command{pkg-config} to use by
e.g.
@example
$(CC) `pkg-config --cflags libRmath` -c test.c
$(CC) `pkg-config --libs libRmath` test.o -o test
@end example
On some systems @samp{make install-strip} will install a stripped shared
library.
@node Windows standalone, , Unix-alike standalone, The standalone Rmath library
@section Windows
You need to set up@footnote{including copying @file{MkRules.dist} to
@file{MkRules.local} and selecting the architecture.} almost all the
tools to make @R{} and then run (in a Unix-like shell)
@example
(cd ../../gnuwin32; make MkRules)
(cd ../../include; make -f Makefile.win config.h Rconfig.h Rmath.h)
make -f Makefile.win
@end example
@noindent
Alternatively, in a @file{cmd.exe} shell use
@example
cd ../../include
make -f Makefile.win config.h Rconfig.h Rmath.h
cd ../nmath/standalone
make -f Makefile.win
@end example
This creates a static library @file{libRmath.a} and a DLL
@file{Rmath.dll}. If you want an import library @file{libRmath.dll.a}
(you don't need one), use
@example
make -f Makefile.win shared implib
@end example
To use the routines in your own C or C++ programs using MinGW-w64, include
@example
#define MATHLIB_STANDALONE
#include <Rmath.h>
@end example
@noindent
and link against @samp{-lRmath}. This will use the first found of
@file{libRmath.dll.a}, @file{libRmath.a} and @file{Rmath.dll} in that
order, so the result depends on which files are present. You should be
able to force static or dynamic linking @emph{via}
@example
-Wl,-Bstatic -lRmath -Wl,-Bdynamic
-Wl,-Bdynamic -lRmath
@end example
@noindent
or by linking to explicit files (as in the @samp{test} target in
@file{Makefile.win}: this makes two executables, @file{test.exe} which
is dynamically linked, and @file{test-static.exe}, which is statically
linked).
It is possible to link to @file{Rmath.dll} using other compilers, either
directly or via an import library: if you make a MinGW-w64 import library as
above, you will create a file @file{Rmath.def} which can be used
(possibly after editing) to create an import library for other systems
such as Visual C++.
If you make use of dynamic linking you should use
@example
#define MATHLIB_STANDALONE
#define RMATH_DLL
#include <Rmath.h>
@end example
@noindent
to ensure that the constants like @code{NA_REAL} are linked correctly.
(Auto-import will probably work with MinGW-w64, but it is better to be
sure. This is likely to also work with VC++, Borland and similar
compilers.)
@node Essential and useful other programs under a Unix-alike, Configuration on a Unix-alike, The standalone Rmath library, Top
@appendix Essential and useful other programs under a Unix-alike
This appendix gives details of programs you will need to build @R{} on
Unix-like platforms, or which will be used by @R{} if found by
@command{configure}.
Remember that some package management systems (such as @acronym{RPM} and
Debian/Ubuntu's) make a distinction between the user version of a
package and the development version. The latter usually has the same
name but with the extension @samp{-devel} or @samp{-dev}: you need both
versions installed.
@menu
* Essential programs and libraries::
* Useful libraries and programs::
* Linear algebra::
@end menu
@node Essential programs and libraries, Useful libraries and programs, Essential and useful other programs under a Unix-alike, Essential and useful other programs under a Unix-alike
@section Essential programs and libraries
You need a means of compiling C and Fortran 90 (see @ref{Using
Fortran}). Your C compiler should be
@acronym{ISO}/@acronym{IEC}@tie{}60559@footnote{also known as
@acronym{IEEE}@tie{}754}, POSIX 1003.1 and C99-compliant.@footnote{Note
that C11 compilers need not be C99-compliant: @R{} requires support for
@code{double complex} and variable-length arrays which are optional in
C11 but are mandatory in C99. C18 (also known as C17) is a `bugfix
release' of C11, clarifying the standard.} @R{} tries to choose suitable
flags@footnote{Examples are @option{-std=gnu99}, @option{-std=c99} and
@option{-c99}.} for the C compilers it knows about, but you may have to
set @code{CC} or @code{CFLAGS} suitably. For versions of @command{gcc}
prior to 5.1 with @code{glibc}-based Linux this means including
@option{-std=gnu99}@footnote{@option{-std=c99} excludes POSIX
functionality, but @file{config.h} will turn on all @acronym{GNU}
extensions to include the POSIX functionality for @R{} itself: this does
not apply to badly-written packages. The default mode for GCC 5.1 and
later is @option{-std=gnu11}, which currently includes the optional
features @R{} needs.}. (Note that options essential to run the compiler
even for linking, such as those to set the architecture, should be
specified as part of @code{CC} rather than in @code{CFLAGS}.)
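For example, something like the following might be used (a sketch; the
exact flags needed depend on the compiler and platform):
@example
./configure CC="gcc -std=gnu99 -m64" CFLAGS="-g -O2"
@end example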
Unless you do not want to view graphs on-screen (or use macOS) you need
@samp{X11} installed, including its headers and client libraries. For
recent Fedora/RedHat distributions it means (at least) RPMs
@samp{libX11}, @samp{libX11-devel}, @samp{libXt} and @samp{libXt-devel}.
On Debian/Ubuntu we recommend the meta-package @samp{xorg-dev}. If you
really do not want these you will need to explicitly configure @R{}
without X11, using @option{--with-x=no}.
The command-line editing (and command completion) depends on the
@acronym{GNU} @code{readline} library (including its headers): version
4.2 or later is needed for all the features to be enabled. Otherwise
you will need to configure with @option{--with-readline=no} (or
equivalent).
A suitably comprehensive @code{iconv} function is essential. The @R{}
usage requires @code{iconv} to be able to translate between
@code{"latin1"} and @code{"UTF-8"}, to recognize @code{""} (as the
current encoding) and @code{"ASCII"}, and to translate to and from the
Unicode wide-character formats @code{"UCS-[24][BL]E"} --- this is true
by default for @code{glibc}@footnote{However, it is possible to break
the default behaviour of @code{glibc} by re-specifying the @code{gconv}
modules to be loaded.} but not of most commercial Unixes. However, you
can make use of @acronym{GNU} @code{libiconv} (as used on macOS: see
@uref{https://www.gnu.org/@/software/@/libiconv/}).
The OS needs to have enough support@footnote{specifically, the C99
functionality of headers @file{wchar.h} and @file{wctype.h}, types
@code{wctrans_t} and @code{mbstate_t} and functions @code{mbrtowc},
@code{mbstowcs}, @code{wcrtomb}, @code{wcscoll}, @code{wcstombs},
@code{wctrans}, @code{wctype}, and @code{iswctype}.} for wide-character
types: this is checked at configuration. Some C99
functions@footnote{including @code{expm1}, @code{hypot}, @code{log1p},
@code{nearbyint} and @code{va_copy}.} are required and checked for at
configuration. A small number of POSIX functions@footnote{including
@code{opendir}, @code{readdir}, @code{closedir}, @code{popen},
@code{stat}, @code{glob}, @code{access}, @code{getcwd} and @code{chdir}
system calls, @code{select} on a Unix-alike, and either @code{putenv} or
@code{setenv}.} are essential, and others@footnote{such as
@code{realpath}, @code{symlink}.} will be used if available.
@c zlib 1.2.5 is from July 2010, bzip2 1.0.6 from Sept 2010
@c xz 5.0.3 is from May 2011
Installations of @code{zlib} (version 1.2.5 or later), @code{libbz2}
(version 1.0.6 or later: called @pkg{bzip2-libs}/@pkg{bzip2-devel} or
@pkg{libbz2-1.0}/@pkg{libbz2-dev} by some Linux distributions) and
@code{liblzma}@footnote{most often distributed as part of @code{xz}:
possible names in Linux distributions include
@code{xz-devel}/@code{xz-libs} and @code{liblzma-dev}.} version 5.0.3 or
later are required.
@c PCRE[1] 8.32 is from Nov 2012, but Ubuntu 14.04LTS has 8.31
@c and that is supported until Apr 2019.
@c Debian Wheezy has 8.30, Ubuntu 12.04LTS has 8.12
PCRE@footnote{sometimes known as PCRE1, and not PCRE2 which started at
version 10.0.} (version 8.32 or later, although versions 8.20--8.31 will
be accepted with a deprecation warning) is required (or just its library
and headers if packaged separately). Only the `8-bit' interface is used
(and only that is built by default when installing from sources). PCRE
must be built with UTF-8 support (not the default, and checked by
@command{configure}) and support for Unicode properties is assumed by
some @R{} packages. JIT support (optionally available) is desirable for
the best performance: support for this and Unicode properties can be
checked at run-time by calling @code{pcre_config()}. If building PCRE
for use with @R{} a suitable @command{configure} command might be
@example
./configure --enable-utf --enable-unicode-properties --enable-jit --disable-cpp
@end example
@noindent
The @option{--enable-jit} flag is supported for most common CPUs. (See
also the comments for Solaris.)
@c libcurl 7.22.0 was released in Sep 2011, in Ubuntu 12.04 LTS,
@c end-of-life Apr 2017
@c libcurl 7.26.0 was released in May 2012, still in Debian 7 Wheezy LTS,
@c end-of-life May 2018.
@c libcurl 7.28.0 was released in Oct 2012
Library @code{libcurl} (version 7.22.0 or later@footnote{but not a major
version greater than 7 should there ever be one: the major version has
been 7 since 2000.}) is required, with at least 7.28.0 being desirable.
Information on @code{libcurl} is found from the @command{curl-config}
script: if that is missing or needs to be overridden@footnote{for
example to specify static linking with a build which has both shared and
static libraries.} there are macros to do so described in file
@file{config.site}.
A @command{tar} program is needed to unpack the sources and packages
(including the recommended packages). A version@footnote{Such as
@acronym{GNU} @command{tar} 1.15 or later, @command{bsdtar} (from
@uref{https://github.com/@/libarchive/@/libarchive/}, as used as
@command{tar} by FreeBSD and macOS 10.6 and later) or @command{tar} from
the Heirloom Toolchest
(@uref{http://heirloom.sourceforge.net/@/tools.html}), although the
latter does not support @command{xz} compression.} that can
automagically detect compressed archives is preferred for use with
@code{untar()}: the configure script looks for @command{gtar} and
@command{gnutar} before
@enindex TAR
@command{tar} -- use environment variable @env{TAR} to override this.
(On NetBSD/OpenBSD systems set this to @command{bsdtar} if that is
installed.)
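For example (a sketch; the path is an assumption):
@example
TAR=/usr/local/bin/bsdtar ./configure
@end example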
There need to be suitable versions of the tools @command{grep} and
@command{sed}: the problems are usually with old AT&T and BSD variants.
@command{configure} will try to find suitable versions (including
looking in @file{/usr/xpg4/bin} which is used on some commercial
Unixes).
You will not be able to build most of the manuals unless you have
@command{texi2any} version 5.1 or later installed, and if not most of
the @HTML{} manuals will be linked to a version on @acronym{CRAN}. To
make PDF versions of the manuals you will also need file
@file{texinfo.tex} installed (which is part of the @acronym{GNU}
@pkg{texinfo} distribution but is often made part of the @TeX{} package
in re-distributions) as well as
@command{texi2dvi}.@footnote{@command{texi2dvi} is normally a shell
script. Some versions (including those from @pkg{texinfo} 5.2 and
6.0-6.6) need to be run under @command{bash} rather than a Bourne shell,
especially on Solaris. Some of the issues which have been observed with
broken versions of @command{texi2dvi} can be circumvented by setting the
environment variable @env{R_TEXI2DVICMD} to the value @code{emulation}.}
Further, the versions of @command{texi2dvi} and @file{texinfo.tex} need
to be compatible: we have seen problems with older @TeX{} distributions.
@cindex Subversion
If you want to build from the @R{} Subversion repository then
@command{texi2any} is highly recommended as it is used to create files
which are in the tarball but not stored in the Subversion repository.
@cindex Vignettes
The PDF documentation (including @file{doc/NEWS.pdf}) and building
vignettes needs @command{pdftex} and @command{pdflatex}. We require
@LaTeX{} version @code{2005/12/01} or later (for UTF-8 support).
Building PDF package manuals (including the @R{} reference manual) and
vignettes is sensitive to the version of the @LaTeX{} package
@pkg{hyperref} and we recommend that the @TeX{} distribution used is
kept up-to-date. A number of standard @LaTeX{} packages are required
(including @pkg{url} and some of the font packages such as @pkg{times},
@pkg{helvetic}, @pkg{ec} and @pkg{cm-super}) and others such as
@pkg{hyperref} and @pkg{inconsolata} are desirable (and without them you
may need to change @R{}'s defaults: @pxref{Making the manuals}). Note
that package @pkg{hyperref} (currently) requires packages
@pkg{kvoptions}, @pkg{ltxcmds} and @pkg{refcount}. For distributions
based on TeX Live the simplest approach may be to install collections
@pkg{collection-latex}, @pkg{collection-fontsrecommended},
@pkg{collection-latexrecommended}, @pkg{collection-fontsextra} and
@pkg{collection-latexextra} (assuming they are not installed by
default): Fedora uses names like @pkg{texlive-collection-fontsextra} and
Debian/Ubuntu like @pkg{texlive-fonts-extra}.
@enindex PATH
The essential programs should be in your @env{PATH} at the time
@command{configure} is run: this will capture the full paths.
Those distributing binary versions of @R{} may need to be aware of the
licences of the external libraries it is linked to (including `useful'
libraries from the next section). The @code{liblzma} library is in the
public domain and X11, @code{libbzip2}, @code{libcurl} and @code{zlib}
have MIT-style licences. PCRE has a BSD-style licence which requires
distribution of the licence (included in @R{}'s @file{COPYRIGHTS} file)
in binary distributions. GNU @code{readline} is licensed under GPL
(which version(s) depending on the @code{readline} version).
@node Useful libraries and programs, Linear algebra, Essential programs and libraries, Essential and useful other programs under a Unix-alike
@section Useful libraries and programs
The ability to use translated messages makes use of @code{gettext} and
most likely needs @acronym{GNU} @code{gettext}: you do need this to work
with new translations, but otherwise the version contained in the R
sources will be used if no suitable external @code{gettext} is found.
The `modern' version of the @code{X11()}, @code{jpeg()}, @code{png()}
and @code{tiff()} graphics devices uses the @code{cairo} and
(optionally) @code{Pango} libraries. Cairo version 1.2.0 or later is
required. Pango needs to be at least version 1.10, and 1.12 is the
earliest version we have tested. (For Fedora users we believe the
@code{pango-devel} RPM and its dependencies suffice.) @R{} checks for
@command{pkg-config}, and uses that to check first that the
@samp{pangocairo} package is installed (and if not, @samp{cairo}) and if
additional flags are needed for the @samp{cairo-xlib} package, then if
suitable code can be compiled. These tests will fail if
@command{pkg-config} is not installed@footnote{If necessary the path to
@command{pkg-config} can be specified by setting @env{PKG_CONFIG} in
@file{config.site}, on the @command{configure} command line or in the
environment.}, and are likely to fail if @code{cairo} was built
statically (unusual). Most systems with @code{Gtk+} 2.8 or later
installed will have suitable libraries.
For the best font experience with these devices you need suitable fonts
installed: Linux users will want the @code{urw-fonts} package. On
platforms which have it available, the @code{msttcorefonts}
package@footnote{also known as @code{ttf-mscorefonts-installer} in the
Debian/Ubuntu world: see also
@uref{https://en.wikipedia.org/@/wiki/@/Core_fonts_for_the_Web}.} provides
TrueType versions of Monotype fonts such as Arial and Times New Roman.
Another useful set of fonts is the `liberation' TrueType fonts available
at
@uref{https://fedorahosted.org/@/liberation-fonts/},@footnote{@code{ttf-liberation}
in Debian/Ubuntu.} which cover the Latin, Greek and Cyrillic alphabets
plus a fair range of signs. These share metrics with Arial, Times New
Roman and Courier New, and contain fonts rather similar to the first two
(@uref{https://en.wikipedia.org/@/wiki/@/Liberation_fonts}). Then there
is the `Free UCS Outline Fonts' project
(@uref{https://www.gnu.org/@/software/@/freefont/}) which are
OpenType/TrueType fonts based on the URW fonts but with extended Unicode
coverage. See the @R{} help on @code{X11} on selecting such fonts.
The bitmapped graphics devices @code{jpeg()}, @code{png()} and
@code{tiff()} need the appropriate headers and libraries installed:
@code{jpeg} (version 6b or later, or @code{libjpeg-turbo}) or
@code{libpng} (version 1.2.7 or later) and @code{zlib} or @code{libtiff}
(versions 4.0.[5-10] and 4.1.0 have been tested) respectively.
@command{pkg-config} is used if available and so needs the appropriate
@file{.pc} file (which requires @code{libtiff} version 4.x and is not
available on all platforms for @code{jpeg} before version 9c). They
also need support for either @code{X11} or @code{cairo} (see above).
Should support for these devices @strong{not} be required or broken
system libraries need to be avoided there are @command{configure}
options @option{--without-libpng}, @option{--without-jpeglib} and
@option{--without-libtiff}. The TIFF library has many optional features
such as @code{jpeg}, @code{libz}, @code{lzma}, @code{jbig} and
@code{jpeg12}, none of which is required for the @code{tiff()} devices
but may need to be present to link the library (usually only an issue
for static linking).
Option @option{--with-system-tre} is also available: it needs a recent
version of TRE. (The latest (2016) sources are in the @command{git}
repository at @url{https://github.com/laurikari/tre/}, but at the time
of writing the resulting build will not pass its checks.)
An implementation of @acronym{XDR} is required, and the @R{} sources
contain one which is likely to suffice (although a system version may
have higher performance). @acronym{XDR} is part of @acronym{RPC} and
historically has been part of @file{libc} on a Unix-alike. (In
principle @command{man xdr_string} should tell you which library is
needed, but it often does not: on Solaris and others it is provided by
@code{libnsl}.) However some builds@footnote{This is the default as
from @code{glibc} 2.26 and has been confirmed for Fedora 28, which does
not mention this on its @command{man} pages.} of @code{glibc} omit or
hide it with the intention that the @acronym{TI-RPC} library be used, in
which case @code{libtirpc} (and its development version) should be
installed, and its headers@footnote{@R{} uses @file{rpc/xdr.h} but that
includes @file{netconfig.h} from the top @file{tirpc} directory.} need
to be on the C include path or under @file{/usr/include/tirpc}.
Use of the X11 clipboard selection requires the @code{Xmu} headers and
libraries. These are normally part of an X11 installation (e.g.@: the
Debian meta-package @samp{xorg-dev}), but some distributions have split
this into smaller parts, so for example recent versions of Fedora
require the @samp{libXmu} and @samp{libXmu-devel} RPMs.
Some systems (notably macOS and at least some FreeBSD systems) have
inadequate support for collation in multibyte locales. It is possible
to replace the OS's collation support by that from ICU (International
Components for Unicode, @uref{http://site.icu-project.org/}), and this
provides much more precise control over collation on all systems. ICU
is available as sources and as binary distributions for (at least) most
Linux distributions, Solaris, FreeBSD and AIX, usually as @code{libicu}
or @code{icu4c}. It will be used by default where available: should a
very old or broken version of ICU be found this can be suppressed by
@option{--without-ICU}.
The @code{bitmap} and @code{dev2bitmap} devices and function
@code{embedFonts()} use ghostscript
(@uref{http://www.ghostscript.com/}). This should either be in your
path when the command is run, or its full path specified by the
environment variable @env{R_GSCMD} at that time.
@enindex R_GSCMD
At the time of writing a full installation on Fedora Linux used the
following packages and their development versions, and this may provide
a useful checklist for other systems:
@example
bzip2 cairo fontconfig freetype fribidi glib2 libX11 libXext libXt
libcurl libicu libjpeg libpng libtiff libtirpc libxcrypt ncurses pango
pcre readline tcl tk xz zlib
@end example
@menu
* Tcl/Tk::
* Java support::
* Other compiled languages::
@end menu
@node Tcl/Tk, Java support, Useful libraries and programs, Useful libraries and programs
@subsection Tcl/Tk
The @pkg{tcltk} package needs Tcl/Tk @geq{} 8.4 installed: the sources are
available at @uref{https://@/www.tcl.tk/}. To specify the locations of the
Tcl/Tk files you may need the configuration options
@table @option
@item --with-tcltk
use Tcl/Tk, or specify its library directory
@item --with-tcl-config=@var{TCL_CONFIG}
specify location of @file{tclConfig.sh}
@item --with-tk-config=@var{TK_CONFIG}
specify location of @file{tkConfig.sh}
@end table
@noindent
or use the configure variables @code{TCLTK_LIBS} and
@code{TCLTK_CPPFLAGS} to specify the flags needed for linking against
the Tcl and Tk libraries and for finding the @file{tcl.h} and
@file{tk.h} headers, respectively. If you have both 32- and 64-bit
versions of Tcl/Tk installed, specifying the paths to the correct config
files may be necessary to avoid confusion between them.
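For example, on a 64-bit Fedora-like system where the configuration
scripts live under @file{/usr/lib64}, one might use (a sketch; the
paths are assumptions):
@example
./configure --with-tcl-config=/usr/lib64/tclConfig.sh \
            --with-tk-config=/usr/lib64/tkConfig.sh
@end example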
Versions of Tcl/Tk up to 8.5.19 and 8.6.9 have been tested (including
most versions of 8.4.x, but not recently).
Note that the @file{tk.h} header includes@footnote{This is true even for
the `Aqua' version of Tk on macOS, but distributions of that include a
copy of the X11 files needed.} X11 headers, so you will need X11 and its
development files installed.
@node Java support, Other compiled languages, Tcl/Tk, Useful libraries and programs
@subsection Java support
The build process looks for Java support on the host system, and if it
finds it sets some settings which are useful for Java-using packages
(such as @CRANpkg{rJava} and @CRANpkg{JavaGD}: these require a full
JDK). This check can be suppressed by configure option
@option{--disable-java}.
@enindex JAVA_HOME
Configure variable @env{JAVA_HOME} can be set to point to a specific
JRE/JDK, on the @command{configure} command line or in the environment.
Principal amongst these settings are some paths to the Java
libraries and JVM, which are stored in environment variable
@enindex R_JAVA_LD_LIBRARY_PATH
@env{R_JAVA_LD_LIBRARY_PATH} in file @file{@var{R_HOME}/etc/ldpaths} (or
a sub-architecture-specific version). A typical setting for
@cputype{x86_64} Linux is
@example
JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.212.b04-0.fc30.x86_64/jre
R_JAVA_LD_LIBRARY_PATH=$@{JAVA_HOME@}/lib/amd64/server
@end example
Unfortunately this depends on the exact version of the JRE/JDK
installed, and so may need updating if the Java installation is updated.
This can be done by running @code{R CMD javareconf} which updates
settings in both @file{@var{R_HOME}/etc/Makeconf} and
@file{@var{R_HOME}/etc/ldpaths}. See @code{R CMD javareconf --help} for
details: note that this needs to be done by the account owning the @R{}
installation.
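For example, after switching to a different JDK one might run (a
sketch; the @env{JAVA_HOME} path is an assumption, and the command is
run here via @command{sudo} as the account owning the @R{}
installation):
@example
sudo JAVA_HOME=/usr/lib/jvm/java-11-openjdk R CMD javareconf
@end example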
Another way of overriding those settings is to set the environment variable
@enindex R_JAVA_LD_LIBRARY_PATH
@env{R_JAVA_LD_LIBRARY_PATH} (before @R{} is started, hence not in
@file{~/.Renviron}), which suffices to run already-installed
Java-using packages. For example
@example
R_JAVA_LD_LIBRARY_PATH=/usr/lib/jvm/java-1.8.0/jre/lib/amd64/server
@end example
It may be possible to avoid this by specifying an invariant link as the
path when configuring. For example, on that system any of
@example
JAVA_HOME=/usr/lib/jvm/java
JAVA_HOME=/usr/lib/jvm/java-1.8.0
JAVA_HOME=/usr/lib/jvm/java-1.8.0/jre
@end example
@noindent
worked.
As from version 11, `non-server' distributions of Java consist of a full JDK.
However, Linux distributions can be confusing: for example Fedora 30 had
@example
java-1.8.0-openjdk
java-1.8.0-openjdk-devel
java-openjdk
java-openjdk-devel
java-11-openjdk
java-11-openjdk-devel
java-latest-openjdk
java-latest-openjdk-devel
@end example
@noindent
where the @code{-devel} RPMs are needed to complete the JDK. (At the
time of writing @code{java-openjdk} was Java 12.) Debian/Ubuntu use
@samp{-jre} and @samp{-jdk}, e.g.
@example
sudo apt install default-jdk
@end example
@c https://www.digitalocean.com/community/tutorials/how-to-install-java-with-apt-on-ubuntu-18-04
@node Other compiled languages, , Java support, Useful libraries and programs
@subsection Other compiled languages
Some add-on packages need a C++ compiler. This is specified by the
configure variables @code{CXX}, @code{CXXFLAGS} and similar.
@command{configure} will normally find a suitable compiler. However, in
many cases this will be a C++98 compiler, and it is
possible to specify an alternative compiler for use with C++11 by the
configure variables @code{CXX11}, @code{CXX11STD}, @code{CXX11FLAGS} and
similar (@pxref{C++ Support}). Again, @command{configure} will normally
find a suitable value for @code{CXX11STD} if the compiler given by
@code{CXX} is capable of compiling C++11 code, but it is possible that a
completely different compiler will be needed.
For source files with extension @file{.f90} or @file{.f95} containing
free-form Fortran, the compiler defined by the macro @code{FC} is used
by @command{R CMD INSTALL}. Note that it is detected by the name of the
command without a test that it can actually compile Fortran 90 code.
Set the configure variable @code{FC} to override this if necessary:
variables @code{FCFLAGS} and @code{FCLIBS_XTRA} might also need to be
set.
See file @file{config.site} in the @R{} source for more details about
these variables.
@node Linear algebra, , Useful libraries and programs, Essential and useful other programs under a Unix-alike
@section Linear algebra
@cindex BLAS library
@menu
* BLAS::
* LAPACK::
* Caveats::
@end menu
@node BLAS, LAPACK, Linear algebra, Linear algebra
@subsection BLAS
The linear algebra routines in @R{} can make use of enhanced
@acronym{BLAS} (Basic Linear Algebra Subprograms,
@uref{http://www.netlib.org/@/blas/@/faq.html}) routines. However,
these have to be explicitly requested at configure time: @R{} provides
an internal @acronym{BLAS} which is well-tested and will be adequate for
most uses of @R{}.
You can specify a particular @acronym{BLAS} library @emph{via} a value
for the configuration option @option{--with-blas} and not to use an
external @acronym{BLAS} library by @option{--without-blas} (the
default). If @option{--with-blas} is given with no @code{=}, its value
is taken from the
@enindex BLAS_LIBS
environment variable @env{BLAS_LIBS}, set for example in
@file{config.site}. If neither the option nor the environment variable
supply a value, a search is made for a suitable@footnote{The search
includes OpenBLAS, ATLAS and a generic @file{libblas}, plus some
platform-specific choices (see below).} @acronym{BLAS}. If the value is
not obviously a linker command (starting with a dash or giving the path
to a library), it is prefixed by @samp{-l}, so
@example
--with-blas="foo"
@end example
@noindent
is an instruction to link against @samp{-lfoo} to find an external
@acronym{BLAS} (which needs to be found both at link time and run time).
The configure code checks that the external @acronym{BLAS} is complete
(it must include all double precision and double complex routines, as
well as @code{LSAME}), and appears to be usable. However, an external
@acronym{BLAS} has to be usable from a shared object (so must contain
position-independent code), and that is not checked.
Some enhanced @acronym{BLAS}es are compiler-system-specific
(@code{sunperf} on Solaris@footnote{Using the Oracle Developer Studio
@command{cc} and @command{f95} compilers}, @code{libessl} on IBM,
@code{Accelerate} on macOS). The correct incantation for these is often
found @emph{via} @option{--with-blas} with no value on the appropriate
platforms.
Some of the external @acronym{BLAS}es are multi-threaded. One issue is
that @R{} profiling (which uses the @code{SIGPROF} signal) may cause
problems, and you may want to disable profiling if you use a
multi-threaded @acronym{BLAS}. Note that using a multi-threaded
@acronym{BLAS} can result in taking more @acronym{CPU} time and even
more elapsed time (occasionally dramatically so) than using a similar
single-threaded @acronym{BLAS}. On a machine running other tasks, there
can be contention for CPU caches that reduces the effectiveness of the
optimization of cache use by a @acronym{BLAS} implementation: some
people warn that this is especially problematic for hyperthreaded CPUs.
Note that under Unix (but not under Windows) if @R{} is compiled against
a non-default @acronym{BLAS} and @option{--enable-BLAS-shlib} is
@strong{not} used (it is the default on all platforms except AIX), then
all @acronym{BLAS}-using packages must also be. So if @R{} is re-built
to use an enhanced @acronym{BLAS} then packages such as
@CRANpkg{quantreg} will need to be re-installed; they may be under other
circumstances.
@R{} relies on @acronym{ISO}/@acronym{IEC}@tie{}60559 compliance of an
external @acronym{BLAS}. This can be broken if for example the code
assumes that terms with a zero factor are always zero and do not need to
be computed---whereas @code{x*0} can be @code{NaN}. This is checked in
the test suite.
External @acronym{BLAS} implementations often make less use of
extended-precision floating-point registers (where available) and will
almost certainly re-order computations. This can result in less
accuracy than using a reference @acronym{BLAS}, and may result in
different solutions, e.g.@: different signs in SVD and
eigendecompositions.
Debian/Ubuntu systems provide a system-specific way to switch the BLAS
in use. Build @R{} with @option{--with-blas} to select the OS version
of the reference BLAS, and then use @command{update-alternatives} to
switch between the available BLAS libraries. See
@uref{https://wiki.debian.org/DebianScience/LinearAlgebraLibraries}.
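A sketch of doing so (the exact names of the alternatives are
assumptions and vary by release):
@example
update-alternatives --list libblas.so.3-x86_64-linux-gnu
sudo update-alternatives --config libblas.so.3-x86_64-linux-gnu
sudo update-alternatives --config liblapack.so.3-x86_64-linux-gnu
@end example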
The URIs for several of these BLAS have been subject to frequent
gratuitous changes, so you will need to search for their current
locations.
BLAS (and LAPACK) routines may be used inside threaded code, for example
in OpenMP sections in packages such as @pkg{mgcv}. The reference
implementations are thread-safe but external ones may not be (even
single-threaded ones): this can lead to hard-to-track-down incorrect
results or segfaults.
@c Seen for OpenBLAS 3.2 in 2018.
@strong{NOTE:} BLAS libraries built with @command{gfortran}@tie{}9 (and
future versions 8.4, 7.5 and later) require calls from C/C++ to handle
`hidden' character lengths --- @R{} itself does so but many packages do
not and some segfault. (This applies also to external LAPACK libraries.)
@menu
* ATLAS::
* OpenBLAS::
* MKL::
* Shared BLAS::
@end menu
@node ATLAS, OpenBLAS, BLAS, BLAS
@subsubsection ATLAS
ATLAS (@uref{http://math-atlas.sourceforge.net/}) is a ``tuned''
@acronym{BLAS} that runs on a wide range of Unix-alike platforms.
Unfortunately it is built by default as a static library that on some
platforms may not be able to be used with shared objects such as are
used in @R{} packages. Be careful when using pre-built versions of
ATLAS static libraries (they seem to work on @cputype{ix86} platforms,
but not always on @cputype{x86_64} ones).
ATLAS contains replacements for a small number of LAPACK routines, but
can be built to merge these with LAPACK sources to include a full LAPACK
library.
Recent versions of ATLAS can be built as a single shared library, either
@code{libsatlas} or @code{libtatlas} (serial or threaded respectively):
these may even contain a full LAPACK. Such builds can be used by one of
@example
--with-blas=satlas
--with-blas=tatlas
@end example
@noindent
or, as on @cputype{x86_64} Fedora where a path needs to be specified,
@example
--with-blas="-L/usr/lib64/atlas -lsatlas"
--with-blas="-L/usr/lib64/atlas -ltatlas"
@end example
@noindent
Distributed ATLAS libraries cannot be tuned to your machine and so are a
compromise: for example Fedora tunes @cputype{x86_64} for CPUs with SSE3
extensions, and separate @samp{atlas-sse2} and @samp{atlas-sse3}
@cputype{i686} RPMs are available.@footnote{There were others for earlier
versions of ATLAS, and are for non-Intel architectures. The only way to
see exactly which CPUs the distributed libraries have been tuned for is
to read the @file{atlas.spec} file: at the time of writing
@samp{HAMMER64SSE3} and @samp{Corei264AVX} for @cputype{x86_64} Fedora.}
Note that building @R{} on Linux against distributed shared libraries
may need @samp{-devel} or @samp{-dev} packages installed.
Linking against multiple static libraries requires one of
@example
--with-blas="-lf77blas -latlas"
--with-blas="-lptf77blas -lpthread -latlas"
--with-blas="-L/path/to/ATLAS/libs -lf77blas -latlas"
--with-blas="-L/path/to/ATLAS/libs -lptf77blas -lpthread -latlas"
@end example
Consult its installation
guide@footnote{@uref{http://math-atlas.sourceforge.net/atlas_install/}}
for how to build ATLAS as a shared library or as a static library with
position-independent code (on platforms where that matters).
According to the ATLAS
FAQ@footnote{@uref{http://math-atlas.sourceforge.net/faq.html#tnum}} the
maximum number of threads used by multi-threaded ATLAS is set at compile
time. Also, the author advises against using multi-threaded ATLAS on
hyperthreaded CPUs without restricting affinities at compile-time to one
virtual core per physical CPU. (For the Fedora libraries the
compile-time flag specifies 4 threads.)
@c http://math-atlas.sourceforge.net/atlas_install/node21.html
@node OpenBLAS, MKL, ATLAS, BLAS
@subsubsection OpenBLAS
Dr Kazushige Goto wrote a tuned @acronym{BLAS} for several processors
and OSes, which was frozen in mid-2010. OpenBLAS
(@uref{http://www.openblas.net/}) is a descendant project with support
for some later CPUs.
This can be used by configuring @R{} with something like
@example
--with-blas="-lopenblas"
@end example
@noindent
See @ref{Shared BLAS} for an alternative (and in many ways preferable)
way to use them.
Some platforms provide multiple builds of OpenBLAS: for example Fedora 30
has RPMs@footnote{(and more, e.g.@: for 64-bit ints and static versions).}
@example
openblas
openblas-threads
openblas-openmp
@end example
@noindent
providing shared libraries
@example
libopenblas.so
libopenblasp.so
libopenblaso.so
@end example
@noindent
respectively, each of which can be used as a shared BLAS. For the
second and third the number of threads is controlled by
@env{OPENBLAS_NUM_THREADS} and @env{OMP_NUM_THREADS} (as usual for
OpenMP) respectively. There is also a Fedora RPM @samp{openblas-Rblas}
to replace @file{libRblas.so} in their distribution of @R{}.
Note that building @R{} on Linux against distributed libraries may need
@samp{-devel} or @samp{-dev} packages installed.
@c https://wiki.debian.org/DebianScience/LinearAlgebraLibraries
For @cputype{ix86} and @cputype{x86_64} most distributed libraries
contain several alternatives for different CPU microarchitectures with
the choice being made at run time.
@node MKL, Shared BLAS, OpenBLAS, BLAS
@subsubsection Intel MKL
For Intel processors (and perhaps others) and some distributions of
Linux, there is Intel's Math Kernel Library. You are strongly
encouraged to read the MKL User's Guide, which is installed with the
library, before attempting to link to MKL. This includes a `link line
advisor' which will suggest appropriate incantations: its use is
recommended. Or see
@uref{https://software.intel.com/@/en-us/@/articles/@/intel-mkl-link-line-advisor}.
There are also versions of MKL for macOS@footnote{The issue for macOS
has been the use of double-complex routines.} and Windows, but when
these have been tried they did not work with the default compilers used
for @R{} on those platforms.
The MKL interface has changed several times but has been stable in
recent years: the following examples have been used with versions 10.3
to 2019.4, for GCC compilers on @cputype{x86_64}.
To use a sequential version of MKL we used
@example
MKL_LIB_PATH=/path/to/intel_mkl/mkl/lib/intel64
export LD_LIBRARY_PATH=$MKL_LIB_PATH
MKL="-L$@{MKL_LIB_PATH@} -lmkl_gf_lp64 -lmkl_core -lmkl_sequential"
./configure --with-blas="$MKL" --with-lapack
@end example
@noindent
The option @option{--with-lapack} is used since MKL contains a tuned
copy of LAPACK (often older than the current version) as well as
@acronym{BLAS} (@pxref{LAPACK}), although this can be omitted.
Threaded MKL may be used by replacing the line defining the variable
@code{MKL} by
@example
MKL="-L$@{MKL_LIB_PATH@} -lmkl_gf_lp64 -lmkl_core \
     -lmkl_gnu_thread -ldl -fopenmp"
@end example
@R{} can also be linked against a single shared library,
@code{libmkl_rt.so}, for both BLAS and LAPACK, but the correct OpenMP and
MKL interface layer then has to be selected via environment variables. With
64-bit builds and the GCC compilers, we used
@example
export MKL_INTERFACE_LAYER=GNU,LP64
export MKL_THREADING_LAYER=GNU
@end example
On Debian/Ubuntu, MKL is provided by package @code{intel-mkl-full} and one
can set @code{libmkl_rt.so} as the system-wide implementation of both BLAS
and LAPACK during installation of the package, so that also @R{} installed
from Debian/Ubuntu package @code{r-base} would use it. It is, however,
still essential to set @code{MKL_INTERFACE_LAYER} and
@code{MKL_THREADING_LAYER} before running @R{}, otherwise MKL computations
will produce incorrect results. @R{} does not have to be rebuilt to use MKL,
but @code{configure} includes tests which may discover some errors such as a
failure to set the correct OpenMP and MKL interface layer.
@noindent
The default number of threads will be chosen by the OpenMP software, but
can be controlled by setting @code{OMP_NUM_THREADS} or
@code{MKL_NUM_THREADS}, and in recent versions seems to default to a
sensible value for sole use of the machine. (Parallel MKL has not
always passed @command{make check-all}, but did with MKL 2019.4.)
MKL includes a partial implementation of FFTW3, which causes trouble for
applications that require some of the FFTW3 functionality unsupported in
MKL. Please see the MKL manuals for description of these limitations and
for instructions on how to create a custom version of MKL which excludes the
FFTW3 wrappers.
@c https://stat.ethz.ch/pipermail/r-devel/2015-September/071717.html
It was reported in 2015 that
@example
--with-blas='-mkl=parallel' --with-lapack
@end example
@noindent
worked with the Intel 2015.3 compilers on Centos 6.
@node Shared BLAS, , MKL, BLAS
@subsubsection Shared BLAS
The @acronym{BLAS} library will be used for many of the add-on packages
as well as for @R{} itself. This means that it is better to use a
shared/dynamic @acronym{BLAS} library, as most of a static library will
be compiled into the @R{} executable and each @acronym{BLAS}-using
package.
@R{} offers the option of compiling the @acronym{BLAS} into a dynamic
library @code{libRblas} stored in @file{@var{R_HOME}/lib} and linking
both @R{} itself and all the add-on packages against that library.
This is the default on all platforms except AIX unless an external
@acronym{BLAS} is specified and found: for the latter it can be used by
specifying the option @option{--enable-BLAS-shlib}, and it can always be
disabled via @option{--disable-BLAS-shlib}.
This has both advantages and disadvantages.
@itemize
@item
It saves space by having only a single copy of the @acronym{BLAS}
routines, which is helpful if there is an external static @acronym{BLAS}
(as used to be standard for ATLAS).
@item
There may be performance disadvantages in using a shared @acronym{BLAS}.
Probably the most likely is when @R{}'s internal @acronym{BLAS} is used
and @R{} is @emph{not} built as a shared library, when it is possible to
build the @acronym{BLAS} into @file{R.bin} (and @file{libR.a}) without
using position-independent code. However, experiments showed that in
many cases using a shared @acronym{BLAS} was as fast, provided high
levels of compiler optimization are used.
@item
It is easy to change the @acronym{BLAS} without needing to re-install
@R{} and all the add-on packages, since all references to the
@acronym{BLAS} go through @code{libRblas}, and that can be replaced.
Note though that any dynamic libraries the replacement links to will
need to be found by the linker: this may need the library path to be
changed in @file{@var{R_HOME}/etc/ldpaths}.
@end itemize
Another option to change the @acronym{BLAS} in use is to symlink a
single dynamic @acronym{BLAS} library to
@file{@var{R_HOME}/lib/libRblas.so}. For example, just
@example
mv @var{R_HOME}/lib/libRblas.so @var{R_HOME}/lib/libRblas.so.keep
ln -s /usr/lib64/libopenblasp.so.0 @var{R_HOME}/lib/libRblas.so
@end example
@noindent
on @cputype{x86_64} Fedora will change the @acronym{BLAS} used to
multithreaded OpenBLAS. A similar link works for most versions of the
OpenBLAS (provided the appropriate @file{lib} directory is in the
run-time library path or @command{ld.so} cache). It can also be used
for a single-library ATLAS, so on @cputype{x86_64} Fedora
@example
ln -s /usr/lib64/atlas/libsatlas.so.3 @var{R_HOME}/lib/libRblas.so
ln -s /usr/lib64/atlas/libtatlas.so.3 @var{R_HOME}/lib/libRblas.so
@end example
@noindent
can be used with its distributed ATLAS libraries. (If you have the
@samp{-devel} RPMS installed you can omit the @code{.0}/@code{.3}.)
Note that rebuilding or symlinking @file{libRblas.so} may not suffice
if the intention is to use a modified LAPACK contained in an external
BLAS: the latter could even cause conflicts. However, on Fedora where
the OpenBLAS distribution contains a copy of LAPACK, it is the latter
which is used.
@node LAPACK, Caveats, BLAS, Linear algebra
@subsection LAPACK
@cindex LAPACK library
Provision is made for using an external LAPACK library, principally to
cope with @acronym{BLAS} libraries which contain a copy of LAPACK (such
as @code{sunperf} on Solaris, @code{Accelerate} on macOS and ATLAS and MKL
on @cputype{ix86}/@cputype{x86_64} Linux). At least LAPACK version 3.2
is required. This can only be done if @option{--with-blas} has been used.
However, the likely performance gains are thought to be small (and may
be negative). The default is not to search for a suitable LAPACK
library, and using an external one is definitely @strong{not}
recommended. You can
specify a specific LAPACK library or a search for a generic library by
the configuration option @option{--with-lapack}. The default for
@option{--with-lapack} is to check the @acronym{BLAS} library and then
look for an external library @samp{-llapack}. Sites searching for the
fastest possible linear algebra may want to build a LAPACK library using
the ATLAS-optimized subset of LAPACK: this is simplest with a dynamic
ATLAS library which contains a full LAPACK, when @option{--with-lapack}
suffices.
A value for @option{--with-lapack} can be set @emph{via} the environment
variable
@enindex LAPACK_LIBS
@env{LAPACK_LIBS}, but this will only be used if @option{--with-lapack}
is specified (as the default value is @code{no}) and the @acronym{BLAS} library
does not contain LAPACK.
If you do use @option{--with-lapack}, be aware of potential problems
with bugs in the LAPACK sources (or in the posted corrections to those
sources). In particular, bugs in @code{DGEEV} and @code{DGESDD} have
resulted in error messages such as
@example
DGEBRD gave error code -10
@end example
@noindent
Other potential problems are incomplete versions of the libraries,
seen several times in Linux distributions over the years.
Please @strong{do} bear in mind that using @option{--with-lapack} is
`definitely @strong{not} recommended': it is provided @strong{only}
because it is necessary on some platforms and because some users want to
experiment with claimed performance improvements. Reporting problems
where it is used unnecessarily will simply irritate the @R{} helpers.
Note too the comments about @acronym{ISO}/@acronym{IEC}@tie{}60559
compliance in the section on external @acronym{BLAS}: these apply
equally to an external LAPACK, and for example the Intel MKL
documentation has said
@quotation
LAPACK routines assume that input matrices do not contain IEEE 754
special values such as INF or NaN values. Using these special values may
cause LAPACK to return unexpected results or become unstable.
@end quotation
We rely on limited support in LAPACK for matrices with @math{2^{31}} or
more elements: it is possible that an external LAPACK will not have that
support.
@node Caveats, , LAPACK, Linear algebra
@subsection Caveats
As with all libraries, you need to ensure that they and @R{} were
compiled with compatible compilers and flags. For example, this has
meant that on Sun Sparc using the Oracle compilers the flag
@option{-dalign} is needed if @code{sunperf} is to be used.
On some systems it has been necessary that an external
@acronym{BLAS}/LAPACK was built with the same Fortran compiler used to
build @R{}.
LAPACK 3.9.0 has a bug in its DCOMBSSQ subroutine which may cause NA
to be interpreted as zero. This is fixed in R 3.6.3, but if you use an
external LAPACK, you may need to fix it there.
The code (in @code{dlapack.f}) should read
@example
*     ..
*     .. Executable Statements ..
*
      IF( V1( 1 ).GE.V2( 1 ) ) THEN
         IF( V1( 1 ).NE.ZERO ) THEN
            V1( 2 ) = V1( 2 ) + ( V2( 1 ) / V1( 1 ) )**2 * V2( 2 )
         ELSE
            V1( 2 ) = V1( 2 ) + V2( 2 )
         END IF
      ELSE
         V1( 2 ) = V2( 2 ) + ( V1( 1 ) / V2( 1 ) )**2 * V1( 2 )
         V1( 1 ) = V2( 1 )
      END IF
      RETURN
@end example
(The inner ELSE clause was missing in LAPACK 3.9.0).
@node Configuration on a Unix-alike, Platform notes, Essential and useful other programs under a Unix-alike, Top
@appendix Configuration on a Unix-alike
@menu
* Configuration options::
* Internationalization support::
* Configuration variables::
* Setting the shell::
* Using make::
* Using Fortran::
* Compile and load flags::
* Maintainer mode::
@end menu
@node Configuration options, Internationalization support, Configuration on a Unix-alike, Configuration on a Unix-alike
@section Configuration options
@command{configure} has many options: running
@example
./configure --help
@end example
@noindent
will give a list. Probably the most important ones not covered
elsewhere are (defaults in brackets)
@table @option
@item --with-x
use the X Window System [yes]
@item --x-includes=@var{DIR}
X include files are in @var{DIR}
@item --x-libraries=@var{DIR}
X library files are in @var{DIR}
@item --with-readline
use readline library (if available) [yes]
@item --enable-R-profiling
attempt to compile support for @code{Rprof()} [yes]
@item --enable-memory-profiling
attempt to compile support for @code{Rprofmem()} and @code{tracemem()} [no]
@item --enable-R-shlib
build @R{} as a shared/dynamic library [no]
@item --enable-BLAS-shlib
build the @acronym{BLAS} as a shared/dynamic library [yes, except on AIX]
@end table
@noindent
You can use @option{--without-foo} or @option{--disable-foo} for the
negatives.
You will want to use @option{--disable-R-profiling} if you are building
a profiled executable of @R{} (e.g.@: with @samp{-pg}). Support for @R{}
profiling requires OS support for POSIX threads (@emph{aka}
@samp{pthreads}), which are available on all mainstream Unix-alike
platforms.
Flag @option{--enable-R-shlib} causes the make process to build @R{} as
a dynamic (shared) library, typically called @file{libR.so}, and link
the main @R{} executable @file{R.bin} against that library. This can
only be done if all the code (including system libraries) can be
compiled into a dynamic library, and there may be a
performance@footnote{We have measured 15--20% on @cputype{i686} Linux
and around 10% on @cputype{x86_64} Linux.} penalty. So you probably
only want this if you will be using an application which embeds @R{}.
Note that C code in packages installed on an @R{} system linked with
@option{--enable-R-shlib} is linked against the dynamic library and so
such packages cannot be used from an @R{} system built in the default
way. Also, because packages are linked against @R{} they are on some
OSes also linked against the dynamic libraries @R{} itself is linked
against, and this can lead to symbol conflicts.
For maximally effective use of @command{valgrind}, @R{} should be
compiled with valgrind instrumentation. The @command{configure} option
is @option{--with-valgrind-instrumentation=@var{level}}, where
@var{level} is 0, 1 or 2. (Level 0 is the default and does not add
anything.) The system headers for @command{valgrind} can be requested
by option @option{--with-system-valgrind-headers}: they will be used if
present (on Linux they may be in a separate package such as
@pkg{valgrind-devel}). Note though that there is no guarantee that the
code in @R{} will be compatible with very old@footnote{We believe that
versions 3.4.0 to 3.13.0 are compatible.} or future @command{valgrind}
headers.
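For example, a build intended for memory debugging might be configured
with something like

@example
./configure --with-valgrind-instrumentation=2 \
    --with-system-valgrind-headers
@end example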
If you need to re-configure @R{} with different options you may need to run
@code{make clean} or even @code{make distclean} before doing so.
The @file{configure} script has other generic options added by
@command{autoconf} which are not supported for @R{}: in particular
building for one architecture on a different host is not possible.
@node Internationalization support, Configuration variables, Configuration options, Configuration on a Unix-alike
@section Internationalization support
Translation of messages is supported via @acronym{GNU} @code{gettext}
unless disabled by the configure option @option{--disable-nls}.
The @code{configure} report will show @code{NLS} as one of the
`Additional capabilities' if support has been compiled in, and running
in an English locale (but not the @code{C} locale) will include
@example
Natural language support but running in an English locale
@end example
@noindent
in the greeting on starting R.
@node Configuration variables, Setting the shell, Internationalization support, Configuration on a Unix-alike
@section Configuration variables
@findex configure
If you need or want to set certain configure variables to something
other than their default, you can do that by either editing the file
@file{config.site} (which documents many of the variables you might want
to set: others can be seen in file @file{etc/Renviron.in}) or on the
command line as
@example
./configure @var{VAR}=@var{value}
@end example
@noindent
If you are building in a directory different from the sources, there can
be copies of @file{config.site} in the source and the build directories,
and both will be read (in that order). In addition, if there is a file
@file{~/.R/config}, it is read between the @file{config.site} files in
the source and the build directories.
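Either way, the settings are ordinary shell assignments.  As an
illustration only (the compiler and paths are site-specific), a
@file{config.site} might contain

@example
## purely illustrative values
CC=gcc
CFLAGS="-g -O2"
CPPFLAGS=-I/opt/local/include
LDFLAGS=-L/opt/local/lib
@end example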
There is also a general @command{autoconf} mechanism for
@file{config.site} files, which are read before any of those mentioned
in the previous paragraph. This looks first at a file specified by the
@enindex CONFIG_SITE
environment variable @env{CONFIG_SITE}, and if that is not set, at files such
as @file{/usr/local/share/config.site} and
@file{/usr/local/etc/config.site} in the area (exemplified by
@file{/usr/local}) where @R{} would be installed.
These variables are @emph{precious}, implying that they do not have to
be exported to the environment, are kept in the cache even if not
specified on the command line, checked for consistency between two
configure runs (provided that caching is used), and are kept during
automatic reconfiguration as if having been passed as command line
arguments, even if no cache is used.
See the variable output section of @code{configure --help} for a list of
all these variables.
If you find you need to alter configure variables, it is worth noting
that some settings may be cached in the file @file{config.cache}, and it
is a good idea to remove that file (if it exists) before re-configuring.
Note that caching is turned @emph{off} by default: use the command line
option @option{--config-cache} (or @option{-C}) to enable caching.
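So a cached (re-)configuration might look something like

@example
rm -f config.cache
./configure -C --enable-R-shlib   # the option shown is illustrative
@end example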
@menu
* Setting paper size::
* Setting the browsers::
* Compilation flags::
* Making manuals::
@end menu
@node Setting paper size, Setting the browsers, Configuration variables, Configuration variables
@subsection Setting paper size
@enindex R_PAPERSIZE
One common variable to change is @env{R_PAPERSIZE}, which defaults to
@samp{a4}, not @samp{letter}. (Valid values are @samp{a4},
@samp{letter}, @samp{legal} and @samp{executive}.)
This is used both when configuring @R{} to set the default, and when
running @R{} to override the default. It is also used to set the
paper size when making PDF manuals.
The configure default will most often be @samp{a4} if @env{R_PAPERSIZE}
is unset. (If the (Debian Linux) program @command{paperconf} is found
@enindex PAPERSIZE
or the environment variable @env{PAPERSIZE} is set, these are used to
produce the default.)
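For example, a site using US letter paper could configure with

@example
./configure R_PAPERSIZE=letter
@end example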
@node Setting the browsers, Compilation flags, Setting paper size, Configuration variables
@subsection Setting the browsers
@enindex R_BROWSER
Another precious variable is @env{R_BROWSER}, the default @HTML{}
browser, which should take a value of an executable in the user's path
or specify a full path.
@enindex R_PDFVIEWER
Its counterpart for PDF files is @env{R_PDFVIEWER}.
@node Compilation flags, Making manuals, Setting the browsers, Configuration variables
@subsection Compilation flags
If you have libraries and header files, e.g., for @acronym{GNU}
readline, in non-system directories, use the variables @code{LDFLAGS}
(for libraries, using @samp{-L} flags to be passed to the linker) and
@code{CPPFLAGS} (for header files, using @samp{-I} flags to be passed to
the C/C++ preprocessors), respectively, to specify these locations.
These default to @samp{-L/usr/local/lib} (@code{LDFLAGS},
@samp{-L/usr/local/lib64} on most 64-bit Linux OSes) and
@samp{-I/usr/local/include} (@code{CPPFLAGS}, but note that on most
systems @file{/usr/local/include} is regarded as a system include
directory and so instances in that macro will be skipped) to catch the
most common cases. If libraries are still not found, then maybe your
compiler/linker does not support re-ordering of @option{-L} and
@option{-l} flags.
@c (years ago this was reported to be a problem on HP-UX with the native
@c @command{cc}).
In this case, use a different compiler (or a front-end shell script
which does the re-ordering).
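For example (the @file{/opt/readline} prefix is hypothetical), a
readline installed outside the system directories could be picked up by

@example
./configure CPPFLAGS=-I/opt/readline/include \
    LDFLAGS=-L/opt/readline/lib
@end example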
These flags can also be used to build a faster-running version of @R{}.
On most platforms using @command{gcc}, having @samp{-O3} in
@code{CFLAGS} and @code{FFLAGS} produces worthwhile
performance gains with @command{gcc} and @command{gfortran}, but may
result in a less reliable build (both segfaults and incorrect numeric
computations have been seen). On systems using the @acronym{GNU} linker
(especially those using @R{} as a shared library), it is likely that
including @samp{-Wl,-O1} in @code{LDFLAGS} is worthwhile, and
@samp{'-Bdirect,--hash-style=both,-Wl,-O1'} is recommended at
@uref{https://lwn.net/@/Articles/@/192624/}. Tuning compilation to a
specific @acronym{CPU} family (e.g.@: @samp{-mtune=native} for
@command{gcc}) can give worthwhile performance gains, especially on
older architectures such as @cputype{ix86}.
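Putting these suggestions together, one possible (untuned and
unguaranteed) set of @file{config.site} entries for a faster build with
@command{gcc} and the @acronym{GNU} linker is

@example
## aggressive settings: test the resulting build carefully
CFLAGS="-O3 -mtune=native"
FFLAGS="-O3 -mtune=native"
LDFLAGS="-Wl,-O1 -L/usr/local/lib"
@end example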
@node Making manuals, , Compilation flags, Configuration variables
@subsection Making manuals
@enindex R_RD4PDF
@enindex R_PAPERSIZE
The default settings for making the manuals are controlled by
@env{R_RD4PDF} and @env{R_PAPERSIZE}.
@node Setting the shell, Using make, Configuration variables, Configuration on a Unix-alike
@section Setting the shell
By default the shell scripts such as @file{R} will be @samp{#!/bin/sh}
scripts (or using the @env{SHELL} chosen by @file{configure}). This is
almost always satisfactory, but on a few systems @file{/bin/sh} is not a
Bourne shell or clone, and the shell to be used can be changed by
setting the configure variable @env{R_SHELL} to a suitable value (a full
path to a shell, e.g.@: @file{/usr/local/bin/bash}).
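For example,

@example
./configure R_SHELL=/usr/local/bin/bash
@end example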
@node Using make, Using Fortran, Setting the shell, Configuration on a Unix-alike
@section Using make
@findex make
To compile @R{}, you will most likely find it easiest to use
@acronym{GNU} @command{make}, although the Sun @command{make} works on
Solaris.
To build in a separate directory you need a @command{make} that supports
the @code{VPATH} variable, for example @acronym{GNU} @command{make} and
Sun @command{make}.
@command{dmake} has also been used, e.g.@: on Solaris 10.
If you want to use a @command{make} by another name, for example if your
@acronym{GNU} @command{make} is called @samp{gmake}, you need to set the
variable @code{MAKE} at configure time, for example
@findex configure
@example
./configure MAKE=gmake
@end example
@node Using Fortran, Compile and load flags, Using make, Configuration on a Unix-alike
@section Using Fortran
@cindex Fortran
To compile @R{}, you need a Fortran 90 compiler. The current default
is to search for
@c From AC_PROG_FC
@command{gfortran}, @command{g95}, @command{xlf95}, @command{f95},
@command{fort}, @command{ifort}, @command{ifc}, @command{efc},
@command{pgfortran}, @command{pgf95}, @command{lf95}, @command{ftn},
@command{nagfor}, @command{xlf90}, @command{f90}, @command{pgf90},
@command{pghpf}, @command{epcf90}. (Note that these are searched for by
name, without checking the standard of Fortran they support.) The
command and flags used should support fixed-form Fortran with extension
@file{.f}: in the unusual case that a specific flag is needed for
free-form Fortran with extension @file{.f90} or @file{.f95}, this can be
specified as part of @code{FCFLAGS}.
The search mechanism can be changed using the configure variable
@code{FC} which specifies the command that runs the Fortran compiler.
If your Fortran compiler is in a non-standard location, you
@enindex PATH
should set the environment variable @env{PATH} accordingly before
running @command{configure}, or use the configure variable @code{FC} to
specify its full path.
If your Fortran libraries are in slightly peculiar places, you should
@enindex LD_LIBRARY_PATH
also look at @env{LD_LIBRARY_PATH} (or your system's equivalent) to make
sure that all libraries are on this path.
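For example (the @file{/opt/gcc9} prefix is hypothetical), a Fortran
compiler in a non-standard location might be used by

@example
## hypothetical installation prefix
export LD_LIBRARY_PATH=/opt/gcc9/lib64:$LD_LIBRARY_PATH
./configure FC=/opt/gcc9/bin/gfortran
@end example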
Note that only Fortran compilers which convert identifiers to lower case
are supported.
You must set whatever compilation flags (if any) are needed to ensure
that Fortran @code{integer} is equivalent to a C @code{int} pointer and
Fortran @code{double precision} is equivalent to a C @code{double}
pointer. This is checked during the configuration process.
Some of the Fortran code makes use of @code{DOUBLE COMPLEX} and
@code{COMPLEX*16} variables. This is checked for at configure time, as
well as its equivalence to the @code{Rcomplex} C structure defined in
@file{R_ext/Complex.h}.
Pre-release versions of @command{gfortran} 10 enforce a restriction on
the widespread practice of passing Fortran array elements where an
array is expected, which was previously allowed without even a warning.
To continue to allow this (with a warning), add
@option{-fallow-argument-mismatch}@footnote{At the time of writing using
this with @option{-pedantic} still gave an error.} to @code{FFLAGS}.
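For example, assuming the usual default flags of @samp{-g -O2}, this
might be set in @file{config.site} as

@example
FFLAGS="-g -O2 -fallow-argument-mismatch"
@end example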
@node Compile and load flags, Maintainer mode, Using Fortran, Configuration on a Unix-alike
@section Compile and load flags
A wide range of flags can be set in the file @file{config.site} or as
configure variables on the command line. We have already mentioned
@table @code
@item CPPFLAGS
header file search directory (@option{-I}) and any other miscellaneous
options for the C and C++ preprocessors and compilers
@item LDFLAGS
path (@option{-L}), stripping (@option{-s}) and any other miscellaneous
options for the linker
@end table
@noindent
and others include
@table @code
@item CFLAGS
debugging and optimization flags, C
@item MAIN_CFLAGS
ditto, for compiling the main program (e.g.@: when profiling)
@item SHLIB_CFLAGS
for shared objects (no known examples)
@item FFLAGS
debugging and optimization flags, fixed-form Fortran
@item FCFLAGS
debugging and optimization flags, free-form Fortran
@item SAFE_FFLAGS
ditto for source files which need exact floating point behaviour
@item MAIN_FFLAGS
ditto, for compiling the main program (e.g.@: when profiling)
@item SHLIB_FFLAGS
for shared objects (no known examples)
@item MAIN_LDFLAGS
additional flags for the main link
@item SHLIB_LDFLAGS
additional flags for linking the shared objects
@item LIBnn
the primary library directory, @file{lib} or @file{lib64}
@item CPICFLAGS
special flags for compiling C code to be turned into a shared object
@item FPICFLAGS
special flags for compiling Fortran code to be turned into a shared object
@item CXXPICFLAGS
special flags for compiling C++ code to be turned into a shared object
@item DEFS
defines to be used when compiling C code in @R{} itself
@end table
@noindent
Library paths specified as @option{-L/lib/path} in @code{LDFLAGS} are
@enindex LD_LIBRARY_PATH
collected together and prepended to @env{LD_LIBRARY_PATH} (or your
system's equivalent), so there should be no need for @option{-R} or
@option{-rpath} flags.
Variables such as @env{CPICFLAGS} are determined where possible by
@command{configure}. Some systems allow two types of PIC flags, for
example @samp{-fpic} and @samp{-fPIC}, and if they differ the first
allows only a limited number of symbols in a shared object. Since @R{}
as a shared library has about 6200 symbols, if in doubt use the larger
version.
Other variables often set by @command{configure} include
@samp{MAIN_LDFLAGS}, @samp{SAFE_FFLAGS}, @samp{SHLIB_LDFLAGS} and
@samp{SHLIB_CXXLDFLAGS}: see file @file{config.site} in the sources for
more documentation on these and others.
To compile a profiling version of @R{}, one might for example want to
use @samp{MAIN_CFLAGS=-pg}, @samp{MAIN_FFLAGS=-pg},
@samp{MAIN_LDFLAGS=-pg} on platforms where @samp{-pg} cannot be used
with position-independent code.
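On such a platform one might therefore add to @file{config.site}
something like

@example
MAIN_CFLAGS=-pg
MAIN_FFLAGS=-pg
MAIN_LDFLAGS=-pg
@end example

@noindent
and configure with @option{--disable-R-profiling}.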
@strong{Beware}: it may be necessary to set @code{CFLAGS} and
@code{FFLAGS} in ways compatible with the libraries to be used: one
possible issue is the alignment of doubles, another is the way
structures are passed.
On some platforms @command{configure} will select additional flags for
@code{CFLAGS}, @code{CPPFLAGS} and @code{LIBS} in @code{R_XTRA_CFLAGS}
(and so on). These are for options which are always required, for
example to force @acronym{IEC}@tie{}60559 compliance.
@node Maintainer mode, , Compile and load flags, Configuration on a Unix-alike
@section Maintainer mode
There are several files that are part of the @R{} sources but can be
re-generated from their own sources by configuring with option
@option{--enable-maintainer-mode} and then running @command{make} in the
build directory. This requires other tools to be installed, discussed
in the rest of this section.
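That is, something like

@example
./configure --enable-maintainer-mode
make
@end example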
File @file{configure} is created from @file{configure.ac} and the files
under @file{m4} by @command{autoconf} and @command{aclocal} (part of the
@pkg{automake} package). There is a formal version requirement on
@command{autoconf} of 2.69 or later, but it is unlikely that anything
other than the most recent versions@footnote{at the time of revision of
this para in late 2018, @pkg{autoconf-2.69} from 2012 and
@pkg{automake-1.16.1} from 2018.} have been thoroughly tested.
File @file{src/include/config.h} is created by @command{autoheader}
(part of @pkg{autoconf}).
Grammar files @file{*.y} are converted to C sources by an implementation
of @command{yacc}, usually @command{bison -y}: these are found in
@file{src/main} and @file{src/library/tools/src}. It is known that
earlier versions of @command{bison} generate code which reads (and in
some cases writes) outside array bounds: @command{bison} 2.6.1 was found
to be satisfactory.
The ultimate sources for package @pkg{compiler} are in its @file{noweb}
directory. To re-create the sources from
@file{src/library/compiler/noweb/compiler.nw}, the command
@command{notangle} is required. Some Linux distributions include this
command in package @pkg{noweb}. It can also be installed from the
sources at @url{https://www.cs.tufts.edu/~nr/noweb/}@footnote{The links
there have proved difficult to access, in which case either point an FTP
client at @uref{ftp://www.eecs.harvard.edu/pub/nr/} or grab the copy
made available at
@uref{http://developer.r-project.org/noweb-2.11b.tgz}.}. The package
sources are only re-created even in maintainer mode if
@file{src/library/compiler/noweb/compiler.nw} has been updated.
@c It is likely that in future creating @code{configure} will need the GNU
@c `autoconf archive' installed. This can be found at
@c @c and it has moved to github!
@c @url{https://www.gnu.org/software/autoconf-archive/} and as a package
@c (usually called @pkg{autoconf-archive}) in most packaged distributions,
@c for example Debian, Fedora, OpenCSW, Homebrew and MacPorts.
@node Platform notes, The Windows toolset, Configuration on a Unix-alike, Top
@appendix Platform notes
This section provides some notes on building @R{} on different Unix-alike
platforms. These notes are based on tests run on one or two systems in
each case with particular sets of compilers and support libraries.
Success in building @R{} depends on the proper installation and functioning
of support software; your results may differ if you have other versions
of compilers and support libraries.
Older versions of this manual contain notes on platforms such as HP-UX,
IRIX, Alpha/OSF1 (for @R{} < 2.10.0, and support has since been removed
for all of these) and AIX (for @R{} <= 3.5.x) for which we have had no
recent reports.
C macros to select particular platforms can be tricky to track down
(there is a fair amount of misinformation on the Web). The Wiki
(currently) at @uref{http://sourceforge.net/@/p/@/predef/@/wiki/@/Home/}
can be helpful. The @R{} sources have used (often in included software
under @file{src/extra})
@example
AIX: _AIX
Cygwin: __CYGWIN__
FreeBSD: __FreeBSD__
HP-UX: __hpux__, __hpux
IRIX: sgi, __sgi
Linux: __linux__
macOS: __APPLE__
NetBSD: __NetBSD__
OpenBSD: __OpenBSD__
Solaris: __sun, sun
Windows: _WIN32, _WIN64
@end example
@menu
* X11 issues::
* Linux::
* macOS::
* Solaris::
* FreeBSD::
* OpenBSD::
* Cygwin::
* New platforms::
@end menu
@node X11 issues, Linux, Platform notes, Platform notes
@section X11 issues
The @samp{X11()} graphics device is the one started automatically on
Unix-alikes (except most macOS builds) when plotting. As its name
implies, it displays on a (local or remote) X server, and relies on the
services provided by the X server.
The `modern' version of the @samp{X11()} device is based on @samp{cairo}
graphics and (in most implementations) uses @samp{fontconfig} to pick and
render fonts. This is done on the server, and although there can be
selection issues, they are more amenable than the issues with
@samp{X11()} discussed in the rest of this section.
When X11 was designed, most displays were around 75dpi, whereas today
they are of the order of 100dpi or more. If you find that @samp{X11()}
is reporting@footnote{for example, @code{X11 font at size 14 could not
be loaded}.} missing font sizes, especially larger ones, it is likely
that you are not using scalable fonts and have not installed the 100dpi
versions of the X11 fonts. The names and details differ by system, but
will likely have something like Fedora's
@example
xorg-x11-fonts-75dpi
xorg-x11-fonts-100dpi
xorg-x11-fonts-ISO8859-2-75dpi
xorg-x11-fonts-Type1
xorg-x11-fonts-cyrillic
@end example
@noindent
and you need to ensure that the @samp{-100dpi} versions are installed
and on the X11 font path (check via @command{xset -q}). The
@samp{X11()} device does try to set a pointsize and not a pixel size:
laptop users may find the default setting of 12 too large (although very
frequently laptop screens are set to a fictitious dpi to appear like a
scaled-down desktop screen).
More complicated problems can occur in non-Western-European locales, so
if you are using one, the first thing to check is that things work in
the @code{C} locale. The likely issues are a failure to find any fonts
or glyphs being rendered incorrectly (often as a pair of @acronym{ASCII}
characters). X11 works by being asked for a font specification and
coming up with its idea of a close match. For text (as distinct from
the symbols used by plotmath), the specification is the first element of
the option @code{"X11fonts"} which defaults to
@example
"-adobe-helvetica-%s-%s-*-*-%d-*-*-*-*-*-*-*"
@end example
If you are using a single-byte encoding, for example ISO 8859-2 in
Eastern Europe or KOI8-R in Russian, use @command{xlsfonts} to find an
appropriate family of fonts in your encoding (the last field in the
listing). If you find none, it is likely that you need to install
further font packages, such as @samp{xorg-x11-fonts-ISO8859-2-75dpi} and
@samp{xorg-x11-fonts-cyrillic} shown in the listing above.
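For example (the pattern is illustrative), candidate families for an
ISO 8859-2 locale could be listed by

@example
xlsfonts -fn '*-iso8859-2'
@end example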
Multi-byte encodings (most commonly UTF-8) are even more complicated.
There are few fonts in @samp{iso10646-1}, the Unicode encoding, and they
only contain a subset of the available glyphs (and are often fixed-width
designed for use in terminals). In such locales @emph{fontsets} are
used, made up of fonts encoded in other encodings. If the locale you
are using has an entry in the @samp{XLC_LOCALE} directory (typically
@file{/usr/share/X11/locale}), it is likely that all you need to do is to
pick a suitable font specification that has fonts in the encodings
specified there. If not, you may have to get hold of a suitable locale
entry for X11. This may mean that, for example, Japanese text can be
displayed when running in @samp{ja_JP.UTF-8} but not when running in
@samp{en_GB.UTF-8} on the same machine (although on some systems many
UTF-8 X11 locales are aliased to @samp{en_US.UTF-8} which covers several
character sets, e.g.@: ISO 8859-1 (Western European), JISX0208 (Kanji),
KSC5601 (Korean), GB2312 (Chinese Han) and JISX0201 (Kana)).
On some systems scalable fonts are available covering a wide range of
glyphs. One source is TrueType/OpenType fonts, and these can provide
high coverage. Another is Type 1 fonts: the URW set of Type 1 fonts
provides standard typefaces such as Helvetica with a larger coverage of
Unicode glyphs than the standard X11 bitmaps, including Cyrillic. These
are generally not part of the default install, and the X server may need
to be configured to use them. They might be under the X11 @file{fonts}
directory or elsewhere, for example,
@example
/usr/share/fonts/default/Type1
/usr/share/fonts/ja/TrueType
@end example
@node Linux, macOS, X11 issues, Platform notes
@section Linux
@cindex Linux
Linux is the main development platform for @R{}, so compilation from the
sources is normally straightforward with the most common compilers and
libraries.@footnote{For example, @code{glibc}: other C libraries such as
@code{musl} have been used but are not routinely tested.}
Recall that some package management systems (such as @acronym{RPM} and
deb) make a distinction between the user version of a package and the
developer version. The latter usually has the same name but with the
extension @samp{-devel} or @samp{-dev}: you need both versions
installed. So please check the @code{configure} output to see if the
expected features are detected: if for example @samp{readline} is
missing add the developer package. (On most systems you will also need
@samp{ncurses} and its developer package, although these should be
dependencies of the @samp{readline} package(s).) You should expect to
see in the @command{configure} summary
@example
Interfaces supported: X11, tcltk
External libraries: readline, curl
Additional capabilities: PNG, JPEG, TIFF, NLS, cairo, ICU
@end example
When @R{} has been installed from a binary distribution there are
sometimes problems with missing components such as the Fortran
compiler. Searching the @samp{R-help} archives will normally reveal
what is needed.
It seems that @cputype{ix86} Linux accepts non-PIC code in shared
libraries, but this is not necessarily so on other platforms, in
particular on 64-bit @acronym{CPU}s such as @cputype{x86_64}. So care
can be needed with @acronym{BLAS} libraries and when building @R{} as a
shared library to ensure that position-independent code is used in any
static libraries (such as the Tcl/Tk libraries, @code{libpng},
@code{libjpeg} and @code{zlib}) which might be linked against.
Fortunately these are normally built as shared libraries with the
exception of the ATLAS @acronym{BLAS} libraries.
The default optimization settings chosen for @code{CFLAGS} etc are
conservative. It is likely that using @option{-mtune} will result in
significant performance improvements on recent CPUs: one possibility is
to add @option{-mtune=native} for the best possible performance on the
machine on which @R{} is being installed. It is also possible to
increase the optimization levels to @option{-O3}: however for many
versions of the compilers this has caused problems in at least one
@acronym{CRAN} package.
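For example, an (untested, machine-specific) set of @file{config.site}
entries along those lines is

@example
## tuned to the build machine; not portable to other CPUs
CFLAGS="-g -O2 -mtune=native"
FFLAGS="-g -O2 -mtune=native"
CXXFLAGS="-g -O2 -mtune=native"
@end example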
For platforms with both 64- and 32-bit support, it is likely that
@example
LDFLAGS="-L/usr/local/lib64 -L/usr/local/lib"
@end example
@noindent
is appropriate since most (but not all) software installs its 64-bit
libraries in @file{/usr/local/lib64}. To build a 32-bit version of @R{}
on @cputype{x86_64} with Fedora 28 we used
@example
CC="gcc -m32"
CXX="g++ -m32"
FC="gfortran -m32"
OBJC=$@{CC@}
LDFLAGS="-L/usr/local/lib"
LIBnn=lib
@end example
@noindent
Note the use of @samp{LIBnn}: @cputype{x86_64} Fedora installs its
64-bit software in @file{/usr/lib64} and 32-bit software in
@file{/usr/lib}. Linking will skip over inappropriate binaries, but for
example the 32-bit Tcl/Tk configure scripts are in @file{/usr/lib}. It
may also be necessary to set the @command{pkg-config} path, e.g.@: by
@example
export PKG_CONFIG_PATH=/usr/local/lib/pkgconfig:/usr/lib/pkgconfig
@end example
@noindent
The 32-bit system @code{libcurl} did not work with the system CA
certificates: this is worked around in @R{}'s test suite.
64-bit versions on Linux are built with support for files > 2Gb, and
32-bit versions will be if possible unless @option{--disable-largefile}
is specified.
Note that 32-bit @code{glibc} currently@footnote{This has been announced
to change in version 2.29, having been postponed from 2.28.} uses a
32-bit @code{time_t} type, so to pass all the date-time checks needs
@R{} built with flag @option{--with-internal-tzcode}.
To build a 64-bit version of @R{} on @cputype{ppc64} (also known as
@cputype{powerpc64}) with @command{gcc}@tie{}4.1.1, Ei-ji Nakama used
@example
CC="gcc -m64"
CXX="gxx -m64"
FC="gfortran -m64"
CFLAGS="-mminimal-toc -fno-optimize-sibling-calls -g -O2"
FFLAGS="-mminimal-toc -fno-optimize-sibling-calls -g -O2"
@end example
@noindent
the additional flags being needed to resolve problems linking against
@file{libnmath.a} and when linking @R{} as a shared library.
@c suggestion of https://gcc.gnu.org/wiki/FloatingPointMath
The setting of the macro @samp{SAFE_FFLAGS} may need some help. It
should not need additional flags on platforms other than @cputype{68000}
(not likely to be encountered) and @cputype{ix86}. For the latter, if
the Fortran compiler is GNU (@command{gfortran} or possibly
@command{g77}) the flags
@example
-msse2 -mfpmath=sse
@end example
@noindent
are added: earlier versions of @R{} added @option{-ffloat-store} and
this might still be needed if a @cputype{ix86} CPU is encountered
without SSE2 support. Note that it is a @emph{replacement} for
@samp{FFLAGS}, so should include all the flags in that macro (except
perhaps the optimization level).
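For example, if @samp{FFLAGS} is the default @samp{-g -O2}, a suitable
setting would be something like

@example
SAFE_FFLAGS="-g -O2 -msse2 -mfpmath=sse"
@end example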
Additional compilation flags can be specified for added safety/security
checks. For example Fedora 30 adds
@example
-Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS
-fexceptions -fstack-protector-strong -fasynchronous-unwind-tables
-fstack-clash-protection -fcf-protection
@end example
@noindent
to all the C, C++ and Fortran compiler flags (even though
@code{_GLIBCXX_ASSERTIONS} is only for C++ in current GCC and
@code{glibc} and none of these are documented for @code{gfortran}).
Use of @code{_GLIBCXX_ASSERTIONS} will link @code{abort} and
@code{printf} into almost all C++ code, and @command{R CMD check
--as-cran} will warn.
@menu
* Clang::
* Intel compilers::
@end menu
@node Clang, Intel compilers, Linux, Linux
@subsection Clang
@R{} has been built with Linux @cputype{ix86} and @cputype{x86_64} C and
C++ compilers (@uref{http://clang.llvm.org}) based on the Clang
front-ends, invoked by @code{CC=clang CXX=clang++}, together with
@command{gfortran}. These take very similar options to the
corresponding GCC compilers.
This has to be used in conjunction with a Fortran compiler: the
@command{configure} code will remove @option{-lgcc} from @env{FLIBS},
which is needed for some versions of @command{gfortran}.
The current out-of-the-box default for @command{clang++} is to use the
C++ runtime from the installed @command{g++}. Using the runtime from
the @code{libc++} project (@url{http://libcxx.llvm.org/}, Fedora RPM
@code{libcxx-devel}) @emph{via} @option{-stdlib=libc++} has also been
tested.
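For example (a sketch only, assuming @command{gfortran} is on the
path), a @file{config.site} for such a build might contain

@example
CC=clang
CXX="clang++ -stdlib=libc++"
FC=gfortran
@end example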
Recent versions have (optional when built) OpenMP support.@footnote{This
also needs the OpenMP runtime which has sometimes been distributed
separately.}
There is a project called @command{flang}
(@uref{https://github.com/@/flang-compiler/@/flang}) to develop a
Fortran compiler similar to clang but based on the Portland Group's front
end. This needs something like
@example
FC=/usr/local/flang/bin/flang
LDFLAGS="-L/usr/local/flang/lib -L/usr/local/lib64"
@end example
Note that @command{flang} accepts all the flags which @command{clang}
does (the driver is a modified version of @command{clang}, and
@command{flang} is a symbolic link to @command{clang}), but does not
implement all of them for Fortran compilation: it also accepts most
PGI-style flags such as @option{-mp} for OpenMP. It currently produces
few diagnostics even with @option{-Wall -pedantic}.
@command{flang}'s Fortran runtime is compiled against OpenMP and it
seems this conflicts with using OpenMP in @R{}. So it may be necessary
to disable the latter by configuring @R{} using @option{--disable-openmp}.
It is not clear what architectures @command{flang} intends to support:
our experiments were done on @cputype{x86_64}. At the time of writing
binary `releases' were available for that platform (called by them
@cputype{x86}) and @cputype{ppc64le}.
@node Intel compilers, , Clang, Linux
@subsection Intel compilers
Intel compilers have been used under @cputype{ix86} and @cputype{x86_64}
Linux. Brian Ripley used version 9.0 of the compilers for
@cputype{x86_64} on Fedora Core 5 with
@example
CC=icc
CFLAGS="-g -O3 -wd188 -ip -mp"
FC=ifort
FLAGS="-g -O3 -mp"
CXX=icpc
CXXFLAGS="-g -O3 -mp"
ICC_LIBS=/opt/compilers/intel/cce/9.1.039/lib
IFC_LIBS=/opt/compilers/intel/fce/9.1.033/lib
LDFLAGS="-L$ICC_LIBS -L$IFC_LIBS -L/usr/local/lib64"
SHLIB_CXXLD=icpc
@end example
@noindent
It may be necessary to use @code{CC="icc -std=c99"} or @code{CC="icc
-c99"} for C99-compliance. The flag @option{-wd188} suppresses a large
number of warnings about the enumeration type @samp{Rboolean}. Because
the Intel C compiler sets @samp{__GNUC__} without complete emulation of
@command{gcc}, we suggest adding @code{CPPFLAGS=-no-gcc}.
To maintain correct @acronym{IEC}@tie{}60559 arithmetic you most likely
need to add flags to @code{CFLAGS}, @code{FFLAGS} and
@code{CXXFLAGS} such as @option{-mp} (shown above) or @option{-fp-model
precise -fp-model source}, depending on the compiler version.
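Combining these suggestions, an illustrative (version-dependent and
untested) @file{config.site} fragment is

@example
## the flags needed depend on the compiler version
CC="icc -std=c99"
CPPFLAGS=-no-gcc
CFLAGS="-g -O3 -fp-model precise -fp-model source -wd188"
FC=ifort
FFLAGS="-g -O3 -fp-model precise -fp-model source"
@end example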
Others have reported success with versions 10.x and 11.x.
@c https://stat.ethz.ch/pipermail/r-devel/2015-September/071717.html
Bjørn-Helge Mevik reported success with version 2015.3 of the compilers,
using (for a SandyBridge CPU on Centos 6.x)
@example
fast="-fp-model precise -ip -O3 -opt-mem-layout-trans=3 -xHost -mavx"
CC=icc
CFLAGS="$fast -wd188"
FC=ifort
FFLAGS="$fast"
CXX=icpc
CXXFLAGS="$fast"
@end example
It is possible that 32-bit builds need to force the use of SSE2 instructions
in @code{SAFE_FFLAGS}, e.g.@: by
@example
SAFE_FFLAGS=-axsse2
@end example
@node macOS, Solaris, Linux, Platform notes
@section macOS
@cindex macOS
(`macOS' was known as `OS X' from 2012--2016.)
The instructions here are for 64-bit (@cputype{x86_64}) builds on 10.11
(El Capitan) or later. @R{} can in principle be built for 10.6 and
later, although this is little tested and it may be necessary to install
later versions of software such as @code{libcurl}.
@menu
* Prerequisites::
* Recommended C/C++ compilers::
* Other libraries::
* Tcl/Tk headers and libraries::
* Java (macOS)::
* Frameworks::
* Building R.app::
@end menu
@node Prerequisites, Recommended C/C++ compilers, macOS, macOS
@subsection Prerequisites
The following are essential to build @R{}
@itemize
@item
Apple's `Command Line Tools': these can be (re-)installed by
@command{xcode-select --install}. (If you have a fresh OS installation,
running e.g.@: @command{make} in a terminal will offer the installation
of the command-line tools. If you have installed Xcode, this provides
the command-line tools. The tools may need to be reinstalled when macOS
is upgraded, as upgrading partially removes them.)
The Command Line Tools provide C and C++ compilers.
@item
A Fortran compiler. Installers@footnote{Some of these are
unsigned packages: to install them you may need to right-click and
select @code{Open with -> Installer}.} are available at
@itemize
@item
El Capitan and later@*
@uref{https://cran.r-project.org/@/bin/@/macosx/@/tools/@/gfortran-6.1.pkg},@*
which is a re-packaged (signed) version of@*
@uref{https://github.com/@/fxcoudert/@/gfortran-for-macOS/@/releases/@/download/@/6.1/@/gfortran-6.1-ElCapitan.dmg}
@item
Sierra and High Sierra@*
@uref{https://github.com/@/fxcoudert/@/gfortran-for-macOS/@/releases/@/download/@/6.3/@/gfortran-6.3-Sierra.dmg}
@item
Mojave and later@*
@uref{https://github.com/@/fxcoudert/@/gfortran-for-macOS/@/releases/@/download/@/8.2/gfortran-8.2-Mojave.dmg}.
@end itemize
These all install into @file{/usr/local/gfortran}. Its @file{bin}
directory does not need to be on the path as a full path can be used for
@code{FC}: see below.
@item
Binary components @code{pcre} and @code{xz} (for @code{liblzma}) from
@uref{https://mac.R-project.org/libs}, as recent macOS versions provide
libraries but not headers for these (and the system @code{pcre} is too
old at 8.02 for versions up to Sierra, although High Sierra had 8.40).
For example
@example
curl -OL https://mac.r-project.org/libs/pcre-8.40-darwin.15-x86_64.tar.gz
sudo tar -xvzf pcre-8.40-darwin.15-x86_64.tar.gz -C /
curl -OL https://mac.r-project.org/libs/xz-5.2.3-darwin.15-x86_64.tar.gz
sudo tar -xvzf xz-5.2.3-darwin.15-x86_64.tar.gz -C /
@end example
@end itemize
and desirable
@itemize
@item
GNU @code{readline} from @uref{https://mac.R-project.org/libs},
or configure with @option{--without-readline}.
@item
Components @code{jpeg}, @code{libpng}, @code{pkgconfig},
@code{pkgconfig-system-stubs} and @code{tiff} from
@uref{https://mac.R-project.org/libs}, for the full range of bitmapped
graphics devices.
@item
An X sub-system unless configuring using @option{--without-x}: see
@uref{https://xquartz.macosforge.org/}. @R{}'s @command{configure}
script can be told to look for @code{X11} in @code{XQuartz}'s main
location of @file{/opt/X11}, e.g.@: by
@example
--x-includes=/opt/X11/include --x-libraries=/opt/X11/lib
@end example
@noindent
although linked versions under @file{/usr/X11} will be found.
@item
An Objective-C compiler, as provided by @command{clang} in the Command
Line Tools: this is needed for the @code{quartz()} graphics device.
Use @option{--without-aqua} if you want a standard Unix-alike build:
apart from disabling @code{quartz()} and the ability to use the build
with @Rapp{}, it also changes the default location of the personal
library (see @code{?.libPaths}).
@item
Support for @code{cairo} (without @code{Pango}) can be enabled if
@code{pkg-config} and XQuartz are available. Make sure the XQuartz's
@code{pkg-config} files are found first on the configuration path: for
example by setting
@example
export PKG_CONFIG_PATH=/opt/X11/lib/pkgconfig:/usr/local/lib/pkgconfig:/usr/lib/pkgconfig
@end example
@noindent
or appending that variable to the @command{configure} command.
@item
A TeX installation. @xref{Other libraries}.
@item
@command{texi2any} from a @samp{texinfo} distribution, which requires
@command{perl} (currently part of macOS but it has been announced that
it will not be from macOS 10.16). (An old version of @command{texi2any}
has been included in the binary distribution of @R{}.)
@item
Compilers supporting OpenMP: see the next subsection.
@end itemize
To build @R{} itself from the sources with the compilers in the Command
Line Tools (or Xcode) and one of the @command{gfortran} installers, use
a file @file{config.site} containing
@example
CC=clang
OBJC=$CC
FC=/usr/local/gfortran/bin/gfortran
CXX=clang++
@end example
@noindent
and configure by something like
@example
./configure -C \
--enable-R-shlib --enable-memory-profiling \
--x-includes=/opt/X11/include --x-libraries=/opt/X11/lib \
PKG_CONFIG_PATH=/opt/X11/lib/pkgconfig:/usr/local/lib/pkgconfig:/usr/lib/pkgconfig
@end example
To install packages using compiled code one needs the Command Line Tools
and appropriate compilers, e.g.@: Fortran and the C/C++ compilers from
those tools. Some packages have further requirements such as
@command{pkg-config}.
@menu
* Note for Catalina users::
@end menu
@node Note for Catalina users, , Prerequisites, Prerequisites
@subsubsection Note for Catalina users
The default security settings for Catalina can make it difficult to
install Apple packages built after 2019-06-01 which have not been
`notarized'@footnote{See
@uref{https://developer.apple.com/@/documentation/@/xcode/@/notarizing_macos_software_before_distribution}.}
by Apple. And not just packages, as this has been seen for executables
contained in tarballs/zipfiles (for example, for @command{pandoc}).
There are workarounds, including in some cases not installing the latest
version. Usually one can use @samp{Open With}
(Control/right/two-finger-click in Finder), then select @samp{Installer}
and @samp{Open} if you get a further warning message.
This applies also to some @R{} distributions, including @R{} 3.6.2 and
`nightly builds' from @uref{https://mac.r-project.org/}.
@node Recommended C/C++ compilers, Other libraries, Prerequisites, macOS
@subsection Recommended C/C++ compilers
@acronym{CRAN} binary distributions of @R{} 3.6.x use the build of
@command{clang} 7.0.0 contained in
@uref{https://cran.r-project.org/@/bin/@/macosx/@/tools/@/clang-7.0.0.pkg}.
Other recent distributions (including 9.0.0 and 8.0.0) of
@command{clang} are available from @uref{http://releases.llvm.org/}
(which the above re-packages). In particular, these include support for
OpenMP which Apple builds of @command{clang} do not.
Suppose one of these distributions is installed under
@file{/usr/local/clang7}. Use a file @file{config.site} containing
@example
CC=/usr/local/clang7/bin/clang
OBJC=$CC
FC=/usr/local/gfortran/bin/gfortran
CXX=/usr/local/clang7/bin/clang++
LDFLAGS="-L/usr/local/clang7/lib -L/usr/local/lib"
R_LD_LIBRARY_PATH=/usr/local/clang7/lib:/usr/local/lib
@end example
@noindent
The care to specify library paths is to ensure that the OpenMP runtime
library, here @file{/usr/local/clang7/lib/libomp.dylib}, is found when
needed. If this works, you should see the line
@example
checking whether OpenMP SIMD reduction is supported... yes
@end example
@noindent
in the @command{configure} output. Also, @samp{R_LD_LIBRARY_PATH} needs
to be set to find the latest version of the C++ run-time libraries
rather than the system ones.
For macOS 10.14 (`Mojave') and versions 10.x of the Command Line Tools,
an additional step is needed to install the headers to
@file{/usr/include}: from a Terminal run
@c https://developer.apple.com/documentation/xcode_release_notes/xcode_10_release_notes
@example
sudo installer -pkg \
/Library/Developer/CommandLineTools/Packages/macOS_SDK_headers_for_macOS_10.14.pkg \
-target /
@end example
@noindent
(This will need to be re-run if the OS decides to update the Command
Line Tools.)
Alternatively, change the system paths @emph{via}
@example
CC="/usr/local/clang7/bin/clang -isysroot /Library/Developer/CommandLineTools/SDKs/MacOSX.sdk"
CXX="/usr/local/clang7/bin/clang++ -isysroot /Library/Developer/CommandLineTools/SDKs/MacOSX.sdk"
@end example
@noindent
(The location of the SDK can be found by running @command{xcrun
-show-sdk-path}.)
This alternative approach is needed for versions 11.x of the Command
Line Tools/Xcode.
@node Other libraries, Tcl/Tk headers and libraries, Recommended C/C++ compilers, macOS
@subsection Other libraries
Pre-compiled versions of many of the @ref{Useful libraries and programs}
are available from @uref{https://mac.R-project.org/libs/}.
@cindex BLAS library
@cindex LAPACK library
The @code{Accelerate}
library@footnote{@uref{https://developer.apple.com/@/documentation/@/accelerate}.}
can be used @emph{via} the configuration options
@example
--with-blas="-framework Accelerate"
@end example
@noindent
to provide potentially higher-performance versions of the @acronym{BLAS}
and LAPACK routines.@footnote{It was reported that for some non-Apple
toolchains @code{CPPFLAGS} needed to contain @code{-D__ACCELERATE__}:
not needed for @command{clang} from @url{releases.llvm.org} though.}
This also includes a full LAPACK which can be used @emph{via}
@option{--with-lapack}: however, the version of LAPACK it contains is
often seriously old.
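For example, both can be selected by configuring with

@example
./configure --with-blas="-framework Accelerate" --with-lapack
@end example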
@c https://developer.apple.com/documentation/accelerate/veclib
In recent versions of macOS, threading in Accelerate is controlled by
`Grand Central Dispatch' and is said not to need user control.
Looking at the top of
@file{/Library/Frameworks/R.framework/Resources/etc/Makeconf}
will show the compilers and configuration options used for the
@acronym{CRAN} binary package for @R{}: at the time of writing the
non-default options
@example
--enable-memory-profiling --enable-R-framework --x-libraries=/opt/X11/lib
@end example
@noindent
were used. (@option{--enable-R-framework} implies @option{--enable-R-shlib}.)
Configure option @option{--with-internal-tzcode} is the default on macOS,
as the system implementation of time zones does not work correctly for
times before 1902 or after 2037 (despite using a 64-bit @code{time_t}).
The @TeX{} implementation used by the developers is MacTeX
(@uref{https://www.tug.org/mactex/}): the full installation is about
5GB, but a smaller version (`Basic TeX') is available at
@uref{https://www.tug.org/mactex/morepackages.html} to which you will
need to add some packages, e.g.@: for the 2019 version we needed to
add@footnote{E.g.@: @emph{via} @command{tlmgr install cm-super helvetic
inconsolata texinfo} .} @pkg{cm-super}, @pkg{helvetic},
@pkg{inconsolata} and @pkg{texinfo} which brought this to about
310MB. (After updates in Dec 2019 it was also necessary to add
@pkg{letltxmacro} to make @file{NEWS.pdf}.) @samp{TeX Live Utility}
(available @emph{via} the MacTeX front page) provides a graphical means
to manage @TeX{} packages. This is documented to require Sierra or
later: for earlier versions see the instructions on the MacTeX front
page.
Checking packages thoroughly requires @command{ghostscript} (part of the
full MacTeX distribution or separately from
@uref{https://www.tug.org/mactex/morepackages.html}) and @command{qpdf}
(an early version of which is in the @file{bin} directory of a binary
installation of @R{}, usually @file{/Library/Frameworks/R.framework/Resources/bin/qpdf}).
One macOS quirk is that the default path has @file{/usr/local/bin} after
@file{/usr/bin}, contrary to common practice on Unix-alikes. This means
that if you install tools from the sources they will by default be
installed under @file{/usr/local} and not supersede the system
versions.
@node Tcl/Tk headers and libraries, Java (macOS), Other libraries, macOS
@subsection Tcl/Tk headers and libraries
If you plan to use the @code{tcltk} package for @R{}, you need to
install a distribution of Tcl/Tk. There are two alternatives. If you
use @Rapp{} you will want to use X11-based Tcl/Tk (as used on other
Unix-alikes), which is installed as part of the CRAN binary for @R{} and
available as separate @code{tcl} and @code{tk} components from
@uref{https://mac.R-project.org/libs/}. This may need
@command{configure} options
@example
--with-tcltk=/usr/local/lib
@end example
or
@example
--with-tcl-config=/usr/local/lib/tclConfig.sh
--with-tk-config=/usr/local/lib/tkConfig.sh
@end example
Note that this requires a matching XQuartz installation.
There is also a native (`Aqua') version of Tcl/Tk which produces widgets
in the native macOS style: this will not work with @Rapp{} because of
conflicts over the macOS menu, but for those only using command-line @R{}
this provides a much more intuitive interface to Tk for experienced Mac
users. Most versions of macOS come with Aqua Tcl/Tk libraries, but these
are not at all recent versions of Tcl/Tk (8.5.9 in Sierra, which is
not even the latest patched version in that series). It is better to
install Tcl/Tk 8.6.x from the sources or a binary distribution from
@uref{https://www.activestate.com/@/activetcl/@/downloads}. Configure @R{}
with
@example
--with-tcl-config=/Library/Frameworks/Tcl.framework/tclConfig.sh
--with-tk-config=/Library/Frameworks/Tk.framework/tkConfig.sh
@end example
If you need to find out which distribution of Tk is in use at run time,
use
@example
library(tcltk)
tclvalue(.Tcl("tk windowingsystem")) # "x11" or "aqua"
@end example
@node Java (macOS), Frameworks, Tcl/Tk headers and libraries, macOS
@subsection Java
The situation with Java support on macOS is messy,@footnote{For more
details see @uref{http://www.macstrategy.com/@/article.php?3}.} and
distribution of Java for all platforms changed in 2018.
macOS no longer comes with an installed Java runtime (JRE), and a macOS
upgrade may remove one if already installed: it is intended to be
installed at first use. Check if a JRE is installed by running
@command{java -version} in a @command{Terminal} window: if Java is not
installed@footnote{In the unlikely event that the version reported does
not start with @code{1.8.0}, @code{11} or higher you need
to update your Java.} this should prompt you to install it. You can
also install directly a recent Java from Oracle (currently from
@uref{http://www.oracle.com/@/technetwork/@/java/@/javase/@/downloads/@/index.html}).
Builds of OpenJDK may also be available, e.g.@: from
@uref{http://jdk.java.net/}. We recommend you install a version with
long-term support, e.g.@: 8 or 11 (but not 9, 10, 12 or 13 which
have/had a 6-month lifetime).
Binary distributions of @R{} are built against a specific version
(e.g.@: 9) of Java so @command{sudo R CMD javareconf} will likely be
needed before using Java-using packages.
To see what compatible versions of Java are currently installed, run
@command{/usr/libexec/java_home -V -a x86_64}. If needed, set the
environment variable @env{JAVA_HOME} to choose between these, both when
@R{} is built from the sources and when @command{R CMD javareconf} is
run.
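For example (assuming a JDK 11 is installed and administrator rights
are available), one might use

@example
export JAVA_HOME=`/usr/libexec/java_home -v 11`
sudo -E R CMD javareconf   # -E preserves JAVA_HOME, if the sudoers policy allows
@end example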
Configuring and building @R{} both look for a JRE and for support for
compiling JNI programs (used by packages @CRANpkg{rJava} and
@CRANpkg{JavaGD}); the latter requires a JDK (Java SDK) and not just a
JRE@footnote{As from Java 11, there is no separate client JRE
distribution.}.
The build process tries to fathom out what JRE/JDK to use, but it may
need some help, e.g.@: by setting @env{JAVA_HOME}. A JDK can be
specified explicitly by something like
@example
JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk-11.jdk/Contents/Home
JAVA_CPPFLAGS="-I/$@{JAVA_HOME@}/include -I/$@{JAVA_HOME@}/include/darwin"
JAVA_LD_LIBRARY_PATH="$@{JAVA_HOME@}/lib/server"
JAVA_LIBS="-L/$@{JAVA_HOME@}/lib/server -ljvm"
@end example
@noindent
or
@example
JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_191.jdk/Contents/Home
JAVA_CPPFLAGS="-I/$@{JAVA_HOME@}/include -I/$@{JAVA_HOME@}/include/darwin"
JAVA_LD_LIBRARY_PATH="$@{JAVA_HOME@}/jre/lib/server"
JAVA_LIBS="-L/$@{JAVA_HOME@}/jre/lib/server -ljvm"
@end example
@noindent
in @file{config.site}.
To use the builds of OpenJDK (tarballs) from @uref{http://jdk.java.net/},
set @env{JAVA_HOME}:
@example
JAVA_HOME=@var{/path/to/JDK}/jdk-11.jdk/Contents/Home
@end example
@noindent
where @file{@var{/path/to/JDK}} is wherever the distribution tarball was
unpacked.
Note that it is necessary to set the environment variable @env{NOAWT} to
@code{1} to install many of the Java-using packages.
@node Frameworks, Building R.app, Java (macOS), macOS
@subsection Frameworks
The @acronym{CRAN} build of @R{} is installed as a framework, which is
selected by the option
@example
./configure --enable-R-framework
@end example
(This is intended to be used with an Apple toolchain: others may not
support frameworks correctly but those from @code{llvm.org} do.)
It is only needed if you want to build @R{} for use with the @Rapp{}
console, and implies @option{--enable-R-shlib} to build @R{} as a
dynamic library. This option configures @R{} to be built and installed
as a framework called @file{R.framework}. The default installation path
for @file{R.framework} is @file{/Library/Frameworks} but this can be
changed at configure time by specifying the flag
@option{--enable-R-framework[=@var{DIR}]} (or @option{--prefix}) or at
install time @emph{via}
@example
make prefix=/where/you/want/R.framework/to/go install
@end example
Note that installation as a framework is non-standard (especially to a
non-standard location) and Unix utilities may not support it (e.g.@: the
@command{pkg-config} file @file{libR.pc} will be put somewhere unknown
to @command{pkg-config}).
@node Building R.app, , Frameworks, macOS
@subsection Building R.app
Note that building the @Rapp{} GUI console is a separate project, using
Xcode. Before compiling @Rapp{} make sure the current version of @R{}
is installed in @file{/Library/Frameworks/R.framework} and working at
the command-line (this can be a binary install).
The current sources can be checked out by
@example
svn co https://svn.r-project.org/R-packages/trunk/Mac-GUI
@end example
@noindent
and built by loading the @code{R.xcodeproj} project (select the
@code{R} target and a suitable configuration), or from the command-line
by e.g.@:
@example
xcodebuild -target R -configuration Release
@end example
See also the @file{INSTALL} file in the checkout or directly at
@uref{https://svn.r-project.org/@/R-packages/@/trunk/@/Mac-GUI/@/INSTALL}.
@Rapp{} does not need to be installed in any specific way. Building
@Rapp{} results in the @Rapp{} bundle which appears as one @R{} icon. This
application bundle can be run anywhere and it is customary to place it
in the @file{/Applications} folder.
@node Solaris, FreeBSD, macOS, Platform notes
@section Solaris
@cindex Solaris
@menu
* 64-bit builds::
* Using gcc::
@end menu
@R{} has been built successfully on Solaris 10 using the (zero cost)
Oracle Developer Studio@footnote{Oracle Solaris Studio prior to 2016,
and previously Sun Studio.} compilers: there has also been success with
@command{gcc}/@command{gfortran}. (Recent Sun machines are AMD Opterons
or Intel Xeons (@cputype{amd64}) rather than @cputype{x86}, but 32-bit
@cputype{x86} executables are the default.) How these compilers
identify themselves is slightly confusing: the command @command{CC -V} with
Developer Studio 12.5 and 12.6 reports versions 5.14 and 5.15 respectively. We
will only consider Developer Studio versions 12.5 (May 2016) and 12.6
(July 2017): instructions for 12.3 can be found in versions of this
manual for @R{} 3.3.x.
There have been few reports on Solaris 11, with no known extra issues.
Solaris was last tested on Sparc machines in June 2017.
The Solaris versions of several of the tools needed to build @R{}
(e.g.@: @command{make}, @command{ar} and @command{ld}) are in
@file{/usr/ccs/bin}, so if using those tools ensure this is in your
path. A version of the preferred @acronym{GNU} @command{tar} is (if
installed) in @file{/usr/sfw/bin}. It may be necessary to avoid the
tools in @file{/usr/ucb}: POSIX-compliant versions of some tools can be
found in @file{/usr/xpg4/bin} and @file{/usr/xpg6/bin}.
A large selection of Open Source software can be installed from
@uref{https://www.opencsw.org}, by default installed under
@file{/opt/csw}. Solaris 10 ships with @code{bzlib} version 1.0.6
(sufficient) but @code{zlib} version 1.2.3 (too old): OpenCSW has 1.2.8.
(Note from 2019: updating of OpenCSW has slowed or stopped.)
At least when compiling with Oracle compilers, Solaris uses far more
stack space than other platforms.  This makes it desirable to build PCRE
with the option @option{--disable-stack-for-recursion}: the OpenCSW
distribution was built this way at the time of writing.
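A minimal sketch of configuring PCRE this way from its unpacked sources
(other PCRE options, e.g.@: for UTF-8 support, will normally also be
wanted) is
@example
./configure --disable-stack-for-recursion
make
make install
@end example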
The Oracle compilers are unusual in not including
@file{/usr/local/include} in the default include search path: @R{}'s
default @code{CPPFLAGS=-I/usr/local/include} remedies this. If you rely
on OpenCSW software you may need @code{CPPFLAGS=-I/opt/csw/include} (or
both).
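For example, a @file{config.site} entry covering both locations would be
@example
CPPFLAGS="-I/opt/csw/include -I/usr/local/include"
@end example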
You will need @acronym{GNU} @code{libiconv} and @code{readline}: the
Solaris version of @code{iconv} is not sufficiently powerful.
The native @command{make} suffices to build @R{} but a number of
packages require @acronym{GNU} @command{make} (some without declaring it
as @samp{SystemRequirements} in the @file{DESCRIPTION} file).
The support for the C99 @code{long double} type on Sparc hardware uses
quad-precision arithmetic, and this is usually slow because it is done
by software emulation. On such systems the @command{configure} option
@option{--disable-long-double} can be used for faster but less accurate
computations.
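For example,
@example
./configure --disable-long-double
@end example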
The Solaris time-zone conversion services seem to be unreliable pre-1916
in Europe (when daylight-savings time was first introduced): most often
reporting in the non-existent DST variant. Using @command{configure}
option @option{--with-internal-tzcode} is recommended, and required if
you find time-zone abbreviations being given odd values (as has been
seen on 64-bit builds without it).
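For example,
@example
./configure --with-internal-tzcode
@end example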
When using the Oracle compilers do @emph{not} specify @option{-fast}, as
this disables @acronym{IEEE} arithmetic and @command{make check} will
fail.
A little juggling of paths (seen below in @code{R_LD_LIBRARY_PATH}) was
needed to ensure @acronym{GNU} @code{libiconv} (in @file{/usr/local})
was used rather than the Solaris @code{iconv}; settings such as the
following have been used:
@example
CC="cc -xc99"
CFLAGS="-O -xlibmieee"
FC=f95
FFLAGS=-O
CXX=CC
CXXSTD="-std=c++11 -library=stdcpp,CrunG3"
CXX11STD="-std=c++11 -library=stdcpp,CrunG3"
CXX14STD="-std=c++14 -library=stdcpp,CrunG3"
CXXFLAGS=-O
R_LD_LIBRARY_PATH="/opt/developerstudio12.6/lib:/usr/local/lib:/opt/csw/lib"
@end example
The Oracle compilers do not by default conform to the C99 standard
(appendix F 8.9) on the return values of functions such as @code{log}:
use @option{-xlibmieee} to ensure this.
A peculiarity of some versions of the Fortran compiler has been that
when asked to link a shared object they did not link against all the
Fortran 9x runtime libraries, hence
@example
FCLIBS_XTRA="-lfsu /opt/developerstudio12.6/lib/libfui.so.2"
@end example
@noindent
has been needed.
Using @code{-xlibmil} in @code{CFLAGS} or @code{-libmil} in
@code{FFLAGS} allows more system mathematical functions
to be inlined.
On @cputype{x86} you will get marginally higher performance @emph{via}
@example
CFLAGS="-xO5 -xlibmieee -xlibmil -nofstore -xtarget=native"
FFLAGS="-xO5 -libmil -nofstore -xtarget=native"
CXXFLAGS="-xO5 -xlibmil -nofstore -xtarget=native"
SAFE_FFLAGS="-O -libmil -fstore -xtarget=native"
@end example
@noindent
but the use of @code{-nofstore} can be less numerically stable, and some
packages have in the past failed to compile at optimization
level 5.
The Oracle compilers provide several implementations of the C++
standards which select both the set of headers and a C++ runtime
library. One of those is selected by the @option{-library} flag, which
as it is needed for both compiling and linking is best specified as part
of the compiler or standard. Current @R{} expects a C++11 compiler, for
which the choice given above is the only possibility. Although version
12.5 accepted the flag @option{-std=c++14}, it did not pass
@command{configure}'s conformance tests: version 12.6 does.
@cindex BLAS library
@cindex LAPACK library
The performance library @code{sunperf} is available for use with the
Oracle compilers. If selected as a @acronym{BLAS}, it must also be
selected as LAPACK @emph{via}
@example
./configure --with-blas='-library=sunperf' --with-lapack
@end example
@noindent
This has often given test failures in the past, in several different
places.@footnote{When last checked it failed in @file{tests/reg-BLAS.R},
and on some builds, including for @cputype{amd64}, it failed in
@code{example(eigen)}.}
Parsing very complex @R{} expressions needs a lot of stack space when
the Oracle compilers are used: several packages require the stack
increased to at least 20MB.
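In a Bourne-compatible shell the stack limit can be raised before
starting @R{}, e.g.@: (the value is in kilobytes)
@example
ulimit -s 20480
@end example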
Some people have reported that the Solaris @code{libintl} needs to be
avoided, for example by using @option{--disable-nls} or
@option{--with-included-gettext} or using @code{libintl} from OpenCSW.
(On the other hand, there have been many successful installs which
automatically detected @code{libintl} from OpenCSW or selected the
included @code{gettext}.)
It has been reported that some Solaris installations need
@example
INTERNET_LIBS="-lsocket -lnsl"
@end example
@noindent
on the @command{configure} command line or in file @file{config.site};
however, there have been many successful installs without this.
@node 64-bit builds, Using gcc, Solaris, Solaris
@subsection 64-bit builds
On both @samp{x86} and @samp{Sparc} platforms the compilers default to
32-bit code.
For a 64-bit target add @option{-m64} to the compiler macros
and use something like @code{LDFLAGS=-L/usr/local/lib/amd64} or
@code{LDFLAGS=-L/usr/local/lib/sparcv9} as appropriate (and other 64-bit
library directories if used, e.g.@: @code{-L/opt/csw/lib/amd64}).
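As an illustration only (adapting the Oracle-compiler settings given
earlier; untested as written), an @cputype{amd64} build might use
@file{config.site} entries such as
@example
CC="cc -m64 -xc99"
CFLAGS="-O -xlibmieee"
FC="f95 -m64"
FFLAGS=-O
CXX="CC -m64"
CXXFLAGS=-O
LDFLAGS="-L/usr/local/lib/amd64 -L/opt/csw/lib/amd64"
@end example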
It will also be necessary to point @command{pkg-config} at the 64-bit
directories, e.g.@: by something like
@example
PKG_CONFIG_PATH=/usr/local/lib/amd64/pkgconfig:/opt/csw/lib/64/pkgconfig:/usr/lib/64/pkgconfig
@end example
@noindent
and to specify a 64-bit Java VM by e.g.@:
@example
JAVA_CPPFLAGS="-I$@{JAVA_HOME@}/../include -I$@{JAVA_HOME@}/../include/solaris"
JAVA_LD_LIBRARY_PATH=$@{JAVA_HOME@}/lib/amd64/server
JAVA_LIBS="-L$@{JAVA_HOME@}/lib/amd64/server \
-R$@{JAVA_HOME@}/lib/amd64/server -ljvm"
@end example
@node Using gcc, , 64-bit builds, Solaris
@subsection Using gcc
If using @command{gcc}, ensure that the compiler was compiled for the
version of Solaris in use. (This can be ascertained from @command{gcc
-v}.) @command{gcc} makes modified versions of some header files, and
several reports of problems were due to using @command{gcc} compiled on
one version of Solaris on a later version. Note that this can even
apply to OS patches: some 2016 patches to Solaris 10 changed its C
header files in a way incompatible@footnote{In particular, header
@file{cmath} in C++11 mode includes @file{math.h} and
@file{iso/math_c99.h} and @command{gcc} had `fixed' an earlier version
of the latter.} with the modified versions included with OpenCSW's
binary distribution.
The notes here are for @command{gcc} set up to use the Solaris linker:
it can also be set up to use GNU @command{ld}, but that has not been
tested. The tests were for compilers from the OpenCSW repository:
Solaris systems often come with much older compilers installed under
@file{/usr/sfw/bin}. One of @option{-m32} or @option{-m64} will be the
default and could be omitted, but it is not easy to find out which.
(For OpenCSW it is @option{-m32}.)
Compilation for an @cputype{x86} target with @command{gcc}@tie{}5.2.0
needed
@example
CC="gcc -m32"
CPPFLAGS="-I/opt/csw/include -I/usr/local/include"
FC="gfortran -m32"
CXX="g++ -m32"
LDFLAGS="-L/opt/csw/lib -L/usr/local/lib"
@end example
For an @cputype{amd64} target we used
@example
CC="gcc -m64"
CPPFLAGS="-I/opt/csw/include -I/usr/local/include"
FC="gfortran -m64"
CXX="g++ -m64"
LDFLAGS="-L/opt/csw/lib/amd64 -L/usr/local/lib/amd64"
@end example
Note that paths such as @file{/opt/csw/lib}, @file{/usr/local/lib/amd64}
and @file{/opt/csw/lib/amd64} may need to be in the
@enindex LD_LIBRARY_PATH
@env{LD_LIBRARY_PATH} during configuration.
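For example, for an @cputype{amd64} build one might use something like
@example
LD_LIBRARY_PATH=/opt/csw/lib/amd64:/usr/local/lib/amd64:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH
@end example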
@node FreeBSD, OpenBSD, Solaris, Platform notes
@section FreeBSD
@cindex FreeBSD
There have been few recent reports on FreeBSD: there is a `port' at
@uref{https://www.freebsd.org/@/ports/@/math.html}. Recent versions of
FreeBSD use Clang and the @code{libc++} C++ headers and runtime, but the
`port' is configured to use GCC.
Use of ICU for collation and the @command{configure} option
@option{--with-internal-tzcode} are desirable workarounds.
@node OpenBSD, Cygwin, FreeBSD, Platform notes
@section OpenBSD
@cindex OpenBSD
Ingo Feinerer installed @R{} version 3.2.2 on OpenBSD 5.8 arch
@cputype{amd64} (their name for @cputype{x86_64}). Details of the build
(and patches applied) are at
@uref{http://cvsweb.openbsd.org/@/cgi-bin/@/cvsweb/@/ports/@/math/@/R/}. (Downgrading
the @code{zlib} requirement to 1.2.3 is against the advice of the @R{}
developers.)
@node Cygwin, New platforms, OpenBSD, Platform notes
@section Cygwin
The 32-bit version never worked well enough to pass @R{}'s @command{make
check}, and residual support from earlier experiments was removed in
@R{} 3.3.0.
The 64-bit version was never supported.
@node New platforms, , Cygwin, Platform notes
@section New platforms
There are a number of sources of problems when installing @R{} on a new
hardware/OS platform. These include
@strong{Floating Point Arithmetic}: @R{} requires arithmetic compliant
with @acronym{IEC}@tie{}60559, also known as @acronym{IEEE}@tie{}754.
This mandates the use of plus and minus infinity and @code{NaN} (not a
number) as well as specific details of rounding. Although almost all
current FPUs can support this, selecting such support can be a pain.
The problem is that there is no agreement on how to set the signalling
behaviour; Sun/Sparc, SGI/IRIX and @cputype{ix86} Linux require no
special action, FreeBSD requires a call to (the macro)
@code{fpsetmask(0)} and OSF1 required that computation be done with a
@option{-ieee_with_inexact} flag etc. On a new platform you must find
out the magic recipe and add some code to make it work. This can often
be done via the file @file{config.site} which resides in the top level
directory.
Beware of using high levels of optimization, at least initially. On
many compilers these reduce the degree of compliance to the
@acronym{IEEE} model. For example, using @option{-fast} on the Oracle
compilers has caused @R{}'s @code{NaN} to be set incorrectly, and
@command{gcc}'s @option{-ffast-math} and @command{clang}'s
@option{-Ofast} have given incorrect results.
@strong{Shared Objects}: There seems to be very little agreement
across platforms on what needs to be done to build shared objects:
there are many different combinations of flags for the compilers and
loaders.  (@acronym{GNU} libtool cannot be used (yet), as it currently
does not fully support Fortran: one would need a shell wrapper for
this.)  The technique we use is to first interrogate the X window system
about what it does (using @command{xmkmf}), and then override this in
situations where we know better (for tools from the @acronym{GNU}
Compiler Collection and/or platforms we know about). This typically
works, but you may have to manually override the results. Scanning the
manual entries for @command{cc} and @command{ld} usually reveals the
correct incantation. Once you know the recipe you can modify the file
@file{config.site} (following the instructions therein) so that the
build will use these options.
It seems that @command{gcc}@tie{}3.4.x and later on @cputype{ix86} Linux
defeat attempts by the LAPACK code to avoid computations entirely in
extended-precision registers, so file @file{src/modules/lapack/dlamc.f}
may need to be compiled without optimization or with additional flags.
Set the configure variable @env{SAFE_FFLAGS} to the flags to be used for
this file.
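As an illustration (an assumption here, not a prescription), with
@command{gfortran} on @cputype{ix86} one might try
@example
SAFE_FFLAGS="-O2 -ffloat-store"
@end example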
If you do manage to get @R{} running on a new platform please let us
know about it so we can modify the configuration procedures to include
that platform.
If you are having trouble getting @R{} to work on your platform please
feel free to use the @samp{R-devel} mailing list to ask questions. We
have had a fair amount of practice at porting @R{} to new platforms
@enddots{}
@node The Windows toolset, Function and variable index, Platform notes, Top
@appendix The Windows toolset
If you want to build @R{} or add-on packages from source in Windows, you
will need to collect, install and test an extensive set of tools. See
@uref{https://CRAN.R-project.org/@/bin/@/windows/@/Rtools/} for the current
locations and other updates to these instructions. (Most Windows users
will not need to build add-on packages from source; see @ref{Add-on
packages} for details.)
We have found that the build process for @R{} is quite sensitive to
the choice of tools: please follow our instructions @strong{exactly},
even to the choice of particular versions of the tools.@footnote{For
example, the Cygwin version of @code{make 3.81} fails to work
correctly.} The build process for add-on packages is somewhat more
forgiving, but we recommend using the exact toolset at first, and only
substituting other tools once you are familiar with the process.
@emph{This appendix contains a lot of prescriptive comments. They are
here as a result of bitter experience. Please do not report problems to
the @R{} mailing lists unless you have followed all the prescriptions.}
We have collected most of the necessary tools (unfortunately not all,
due to license or size limitations) into an executable installer named
@file{Rtools*.exe}, available from
@uref{https://CRAN.R-project.org/@/bin/@/windows/@/Rtools/}. You should
download and run it, choosing the default ``Package authoring
installation'' to build add-on packages, or the ``full installation'' if
you intend to build @R{}.
You will need the following items to build @R{} and packages.
See the subsections below for detailed descriptions.
@itemize
@item
The command line tools (in @file{Rtools*.exe})
@item
The MinGW-w64 32/64-bit toolchain to compile C, Fortran and C++.
@end itemize
For installing simple source packages containing data or @R{} source but
no compiled code, none of these are needed.
A complete build of @R{} (including the PDF manuals) and producing the
installer will also need the following:
@itemize
@item
@LaTeX{}
@item
The Inno Setup installer
@item
(optional) @code{qpdf}
@end itemize
@enindex PATH
It is important to set your @env{PATH} properly. The installer
@file{Rtools*.exe} optionally sets the path to components that it
installs.
Your @env{PATH} may include @file{.} first, then the @file{bin}
directories of the tools and @LaTeX{}. Do not
use filepaths containing spaces: you can always use the short forms
(found by @code{dir /x} at the Windows command line). Network shares
(with paths starting @code{\\}) are not supported.
For example for a 32-bit build, all on one line,
@example
PATH=c:\Rtools\bin;c:\MiKTeX\miktex\bin;
c:\R\R-3.2\bin\i386;c:\windows;c:\windows\system32
@end example
@noindent
It is essential that the directory containing the command line tools
comes first or second in the path: there are typically like-named
tools@footnote{such as @command{sort}, @command{find} and perhaps
@command{make}.} in other directories, and they will @strong{not}
work. The ordering of the other directories is less important, but if in
doubt, use the order above.
Our toolset contains copies of Cygwin DLLs that may conflict with other
ones on your system if both are in the path at once. The normal
recommendation is to delete the older ones; however, at one time we
found our tools did not work with a newer version of the Cygwin DLLs, so
it may be safest not to have any other version of the Cygwin DLLs in your
path.
@menu
* LaTeX::
* The Inno Setup installer::
* The command line tools::
* The MinGW-w64 toolchain::
* Useful additional programs::
@end menu
@node LaTeX, The Inno Setup installer, The Windows toolset, The Windows toolset
@section @LaTeX{}
The @samp{MiKTeX} (@uref{http://www.miktex.org/}) distribution of
@LaTeX{} includes a suitable port of @code{pdftex}. This can be set up
to install extra packages `on the fly', which is the simplest way to use
it (and the default). The `basic' version of @samp{MiKTeX} almost
suffices: when last checked packages
@example
epsf inconsolata mptopdf url
@end example
@noindent
needed to be added (on the fly or @emph{via} the @samp{MiKTeX} Package
Manager) to install @R{}. In any case ensure that the @pkg{inconsolata}
package is installed---you can check with the @samp{MiKTeX} Package
Manager.
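The @samp{MiKTeX} Package Manager also has a command-line interface,
@command{mpm}, so (assuming it is on your path) the package can be
installed by e.g.@:
@example
mpm --install=inconsolata
@end example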
The @file{Rtools*.exe} installer does @emph{not} include any version of
@LaTeX{}.
It is also possible to use the TeX Live distribution from
@uref{https://www.tug.org/texlive/}.
@enindex R_RD4PDF
Please read @ref{Making the manuals} about how to make @file{fullrefman.pdf}
and set the environment variable @env{R_RD4PDF} suitably; ensure you
have the required fonts installed or that @samp{MiKTeX} is set up to
install @LaTeX{} packages on first use.
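For example, at the Windows command prompt (see @ref{Making the
manuals} for the currently recommended value):
@example
set R_RD4PDF=times,inconsolata,hyper
@end example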
@node The Inno Setup installer, The command line tools, LaTeX, The Windows toolset
@section The Inno Setup installer
To make the installer package (@file{@value{RWVERSION}-win.exe}) we
currently require the Unicode version of Inno Setup 5.3.7 or later from
@uref{http://jrsoftware.org/} (starting from 6.0, Inno Setup provides only
one version, which supports Unicode). This is @emph{not} included in
@file{Rtools*.exe}.
Copy file @file{src/gnuwin32/MkRules.dist} to
@file{src/gnuwin32/MkRules.local} and edit it to set @code{ISDIR} to the
location where Inno Setup was installed.
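For example, @file{MkRules.local} might then contain a line such as
(the installation path shown is illustrative only)
@example
ISDIR = C:/packages/Inno
@end example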
@node The command line tools, The MinGW-w64 toolchain, The Inno Setup installer, The Windows toolset
@section The command line tools
This item is installed by the @file{Rtools*.exe} installer.
@c INSTALL may use sh make zip (and tar if R_INSTALL_TAR is used)
@c build may use make and sh.
@c Rdiff.sh USED to use diff echo grep sed tr: grep and tr are no longer used.
@c basename is used in src/library/Recommended/Makefile.win
@c comm, sort, uniq are used in producing .def files
@c cmp is used in src/include/Makefile.win, tools/{copy,move}-if-change
@c cp is used as $(CP) in numerous Makefiles
@c cut is used to make RVER
@c date is used when building base and tools
@c diff is used by tools::Rdiff and tests/Makefile.common
@c du is used by R CMD check
@c expr is used in tools/GETVERSION
@c find is used in installer/Makefile
@c expr is used in tools/GETCONFIG
@c gzip is used in src/library/Makefile.win, R CMD build
@c ls is used in src/library/*/Makefile.win
@c mkdir is used in numerous Makefiles
@c rsync is only needed if building from svn
@c sed is used in tools/GETVERSION, many Makefiles
@c touch is used in Makefiles
@c unzip is used in making R, e.g. for zoneinfo.zip
@c AFAICS [g]awk, egrep, grep, head, rmdir, tail, tr, wc are no longer used
If you choose to install these yourself, you will need suitable versions
of at least @code{basename}, @code{cat}, @code{cmp}, @code{comm},
@code{cp}, @code{cut}, @code{date}, @code{diff}, @code{du}, @code{echo},
@code{expr}, @code{gzip}, @code{ls}, @code{make}, @code{makeinfo},
@code{mkdir}, @code{mv}, @code{rm}, @code{rsync}, @code{sed}, @code{sh},
@code{sort}, @code{tar}, @code{texindex}, @code{touch} and @code{uniq};
we use those from the Cygwin distribution
(@uref{https://www.cygwin.com/}) or compiled from the sources. You will
also need @code{zip} and @code{unzip} from the Info-ZIP project
(@uref{http://www.info-zip.org/}). All of these tools are in
@file{Rtools*.exe}.
@c So needed for end users:
@c comm cp diff echo gzip make mkdir rm sh sort tar uniq zip
@strong{Beware}: `Native' ports of make are @strong{not} suitable
(including those called `MinGW make' at the MinGW SourceForge site and
@command{mingw32-make} in some MinGW-w64 distributions). There were
also problems with other versions of the Cygwin tools and DLLs. To
avoid frustration, please use our tool set, and make sure it is at the
front of your path (including before the Windows system directories).
If you are using a Windows shell, type @code{PATH} at the prompt to find
out.
@enindex CYGWIN
You may need to set the environment variable @env{CYGWIN} to a value
including @samp{nodosfilewarning} to suppress messages about
Windows-style paths.
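For example, at the Windows command prompt:
@example
set CYGWIN=nodosfilewarning
@end example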
@node The MinGW-w64 toolchain, Useful additional programs, The command line tools, The Windows toolset
@section The MinGW-w64 toolchain
Technically you need more than just a compiler, so the set of tools is
referred to as a `toolchain'.
The preferred toolchain is part of @code{Rtools*.exe}: this uses a
version of @command{gcc 4.9.3} and version rt_v3 of the MinGW-w64
project's runtime.
This toolchain does not use @emph{multilib}: separate front-ends are
used for 32-bit and 64-bit compilation. These compilers need to be
specified in @code{BINPREF} and @code{BINPREF64} make variables as
described previously at the end of @ref{Windows packages}.
To select a 32-bit or 64-bit build of @R{}, set the options in
@file{MkRules.local} appropriately (following the comments in the file).
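As an illustration (variable names as in recent @file{MkRules.dist}
files; check the comments in your copy), a 64-bit build might set
@example
WIN = 64
BINPREF64 = c:/Rtools/mingw_64/bin/
@end example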
Some external software libraries will need to be re-compiled under the
new toolchain: especially those providing a C++ interface. Many of
those used by @acronym{CRAN} packages are available from
@uref{https://www.stats.ox.ac.uk/@/pub/@/Rtools/@/multilib/}. Users
developing packages with @CRANpkg{Rcpp} need to ensure that they use a
version built with exactly the same toolchain as their package: the
recommendation is to build @CRANpkg{Rcpp} from its sources yourself.
There is support for OpenMP and pthreads in this toolchain. As the
performance of OpenMP on Windows is poor for small tasks, it is not used
for @R{} itself.
@node Useful additional programs, , The MinGW-w64 toolchain, The Windows toolset
@section Useful additional programs
The process of making the installer will make use of @code{qpdf} to
compact some of the package vignettes, if it is available. Windows
binaries of @code{qpdf} are available from
@uref{http://sourceforge.net/@/projects/@/qpdf/@/files/}. Set the path
to the @code{qpdf} installation in file @file{MkRules.local}.
Developers of packages will find some of the `goodies' at
@uref{https://www.stats.ox.ac.uk/@/pub/@/Rtools/@/goodies} useful.
There is a version of the @command{file} command that identifies the
type of files, and is used by @command{Rcmd check} if available. The
binary distribution is included in @file{Rtools*.exe}.
The file @file{xzutils.zip} contains the program @command{xz} which can
be used to (de)compress files with that form of compression.
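For example, to decompress an (illustrative) file @file{foo.tar.xz}:
@example
xz -d foo.tar.xz
@end example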
@node Function and variable index, Concept index, The Windows toolset, Top
@unnumbered Function and variable index
@printindex vr
@node Concept index, Environment variable index, Function and variable index, Top
@unnumbered Concept index
@printindex cp
@node Environment variable index, , Concept index, Top
@unnumbered Environment variable index
@printindex en
@bye
@c Local Variables: ***
@c mode: TeXinfo ***
@c End: ***