|
A newer version of the Astropy package has been released. With this commit,
it has been updated from v3.2.1 to v4.0.
|
|
Now that it is 2020, it is necessary to include this year in the copyright
statements.
|
|
An extra backslash in the prerequisites of the Jeepney Python package was
causing a crash during its installation. With this commit, the problem has
been fixed by removing the backslash.
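To illustrate the kind of problem (only a sketch with made-up rule names
and a trivial recipe, not the exact lines of the Python Makefile): a stray
backslash at the end of the prerequisite list splices the following line
into the same logical line, so the rule Make actually reads is not the
intended one.
  # Broken: the second '\' splices the recipe line into the
  # prerequisite list, so the rule gets bogus prerequisites and
  # no recipe at all.
  $(ipydir)/jeepney: $(tdir)/$(jeepney-tarball) \
                     $(ipydir)/setuptools \
        touch $@

  # Fixed: no continuation after the last prerequisite.
  $(ipydir)/jeepney: $(tdir)/$(jeepney-tarball) \
                     $(ipydir)/setuptools
        touch $@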
|
|
Until now, when building Matplotlib on macOS systems, we were using the
default C compiler. However, while Yahya Sefidbakht (previously) and
Mahdieh Nabavi (now) were trying to build the template on their macOS
systems with the GNU Compiler Collection (GCC), we found that Matplotlib
needs special macOS headers that GCC doesn't recognize.
With this commit, when Matplotlib is built on macOS systems, `clang' is
used instead, and this fixed the problem (so far checked on Mahdieh's
machine).
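Conceptually, the change in the build recipe is something like the
following sketch (the `on_mac_os' variable name and the install command
are illustrative assumptions, not the literal recipe):
  # Sketch: prefer clang on macOS instead of the project's GCC.
  if [ x"$on_mac_os" = xyes ]; then
      export CC=clang
      export CXX=clang++
  fi
  cd matplotlib-* && python3 setup.py install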
|
|
While working on a different branch to build the GNU C Library, I noticed a
few places in the template that need corrections, which are now applied:
1. A new-line character is printed after the "C compiler works" notice at
   the start of the configure script.
2. A possible `::' in the `LD_LIBRARY_PATH' definition of `basic.mk' is
   removed (see the sketch after this list). Note that it's not necessary
   in the other steps because we don't use any outside-defined
   `LD_LIBRARY_PATH'.
3. GMP is now built for C++ and also with `--enable-fat'.
4. The unpacked Perl tarball directory is removed after its installation.
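As a sketch of point 2 (illustrative, not the exact lines of `basic.mk'):
appending to a possibly-empty `LD_LIBRARY_PATH' with a plain colon can
produce `::', and an empty entry is interpreted by the dynamic linker as
the current directory. The colon should therefore only be added when the
old value is non-empty, for example:
  # Possible '::' when the old LD_LIBRARY_PATH is empty:
  #   export LD_LIBRARY_PATH := $(LD_LIBRARY_PATH):$(ildir)
  # Only add the colon when the old value is non-empty:
  export LD_LIBRARY_PATH := $(if $(LD_LIBRARY_PATH),$(LD_LIBRARY_PATH):)$(ildir)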
|
|
These two packages are necessary to build the GNU C Library.
|
|
Perl is necessary to build Texinfo and later to build LaTeX. Until now we
were just using the host operating system's installation of Perl, but in
some instances that Perl can be too old and not support the necessary
features. With this commit, Perl is now built from source during the basic
installation step of the template.
This was reported by Idafen Santana Pérez, after trying the pipeline on an
Amazon AWS EC2 instance (running Amazon's own GNU/Linux distribution for
its cloud services).
|
|
Until now, the tarballs were the first normal prerequisite of the
software. As a result, if their date changed, the whole software would be
re-built. However, for tarballs specifically, we actually check their
contents with a checksum, so their date is irrelevant (if it is newer than
the built program, but has the same checksum, there is no need to re-build
the software).
Also, passing the tarball name as an argument to the building process (for
example `gbuild') was redundant. It is now automatically found from the
list of order-only prerequisites within `gbuild' and `cbuild' (similar to
how it was previously found in `pybuild' for building Python modules).
A `README.md' file has also been placed in `reproduce/software/make' to
help describe the shared properties of the software-building
Makefiles. This will hopefully grow much larger in the future.
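In Make terms, the change is roughly the following (a sketch with
illustrative target, variable and argument names, not an exact rule from
the project):
  # Before (sketch): the tarball is a normal prerequisite, so a newer
  # time-stamp alone triggers a rebuild, and its name is repeated as
  # an argument to the build recipe.
  $(ibidir)/foo: $(tdir)/foo-$(foo-version).tar.gz $(ibidir)/bar
        $(call gbuild, foo-$(foo-version).tar.gz, foo-$(foo-version))

  # After (sketch): the tarball is an order-only prerequisite (after
  # the '|'), so only its existence matters, and the recipe finds it
  # from the automatic variable '$|' instead of taking it by name.
  $(ibidir)/foo: $(ibidir)/bar | $(tdir)/foo-$(foo-version).tar.gz
        $(call gbuild, $(notdir $(filter $(tdir)/%,$|)), foo-$(foo-version))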
|
|
Python's `lmfit' module and all its major dependencies (`asteval',
`corner', `emcee' and `uncertainties') have been included in the template.
While doing this, I noticed that if the tarballs are the last prerequisite
of each software-building rule, then when building in parallel, the
template will immediately start building packages as soon as the first one
is downloaded, unlike the previous behavior, where it would attempt to
download several tarballs before starting to build. For now, this has been
implemented in the Python build rules for all the modules and we'll later
do the same for the other programs and libraries. This also motivated a
simplification of the `pybuild' function: it now internally looks into the
prerequisites and selects the tarball from the prerequisite that is in the
tarballs directory.
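Concretely, a module's rule now looks roughly like this sketch (the module
names and `pybuild' arguments are only illustrative): the recipe no longer
gets the tarball as an argument, it selects it from the prerequisites
(`$^') that are inside the tarball directory.
  # Sketch: the tarball is the last prerequisite and the recipe finds
  # it by filtering the prerequisite list for the tarball directory.
  $(ipydir)/lmfit: $(ipydir)/asteval \
                   $(ipydir)/uncertainties \
                   $(tdir)/lmfit-$(lmfit-version).tar.gz
        $(call pybuild, $(filter $(tdir)/%,$^), lmfit-$(lmfit-version))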
One remaining oddity (not a problem for the build): Python reads the
version of `emcee' as `0.0.0' and I don't understand why. But it doesn't
cause any crash in the build, so for now it's fine.
|
|
The tarball of HEALPix includes multiple languages and doesn't ship a
ready-to-run GNU Build System by default: we actually have to generate the
`./configure' script for the C/C++ libraries ourselves. So it was necessary
to also include GNU Autoconf and GNU Automake as prerequisites of HEALPix.
However, the official GNU Autoconf tarball (dating from 2012) doesn't build
on modern systems, so I cloned it from its source repository, bootstrapped
it and built a modern tarball, which we are using here.
|
|
The following software packages are added with this commit: eigency,
esutil, flake8, future, galsim, lsstdesccoord, pybind11 and pyflakes.
|
|
As part of an effort to bring in all the dependencies of the LSST Science
Pipelines (which includes the last commit), these software packages are now
available in the template.
|
|
These three software packages had not been updated for some time! With this
commit, the template now uses the most recent stable release of each.
Also, the hosting server for the ImageMagick tarball was changed to my own
webpage, because unfortunately ImageMagick removes the tarballs of previous
versions from its own server.
|
|
New versions of astropy, bash, cmake, curl, findutils, gawk, gcc,
ghostscript, git, make and gsl had recently been released, so they are
updated with this commit.
About GNU Findutils and GNU Make: I was bootstrapping (building the tarball
of) these two separately because their standard tarball releases had
problems on some systems. Both have been updated now, so I am no longer
using my own webpage as their main URL.
A special note about GNU Make: I just noticed that during bootstrapping,
GNU Make would use the fixed version string `4.2.90' for any commit! But
fortunately they have officially released their 4.2.90 version, so we are
safely using their own webpage. The only difference is the compression
format: my old bootstrapped build was `tar.lz', but the standard release is
`tar.gz'.
Also, all the basic programs (installed in `.local/bin') in `basic.mk' are
now existence-only dependencies (after a `|'). Later programs only use them
at a very basic level, so there is no need to rebuild everything when, for
example, Bash is updated.
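In Make terms, "existence-only" means the programs are order-only
prerequisites (after the `|'), roughly like this sketch (the target and
program names are only illustrative):
  # Sketch (prerequisites only): everything after the '|' just needs
  # to exist; rebuilding Bash, for example, will not cause this
  # target to be rebuilt.
  $(ibidir)/gcc: $(tdir)/gcc-$(gcc-version).tar.xz \
                 | $(ibidir)/bash $(ibidir)/sed $(ibidir)/binutils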
|
|
Until this commit, the name of the variable for the `beautifulsoup4'
checksum was wrong, and because of that, the project was not able to
install it. With this commit, `beautifulsoup-checksum' has been replaced
with `beautifulsoup4-checksum' in the `reproduce/software/make/python.mk'
Makefile, and the problem has been fixed. This was not noticed previously
because this Python package is only installed when some high-level programs
are requested to be installed.
With this commit, the version of the `imagemagick' program has also been
updated, because the previous version is not available on the official
website anymore.
|
|
In the line where we check if libffi is installed in `lib/' or `lib64/',
there was a typo that led to an incorrect result: we had forgotten to add a
`-f' to the second check. This is corrected with this commit.
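The shape of the bug was roughly this (a sketch; the real check uses the
project's own paths and file names):
  # Broken: the second test has no '-f', so it only checks that the
  # string is non-empty and is therefore always true.
  if [ -f "$idir"/lib/libffi.so ] || [ "$idir"/lib64/libffi.so ]; then
      echo "libffi found"
  fi

  # Fixed: both sides now actually test the file's existence.
  if [ -f "$idir"/lib/libffi.so ] || [ -f "$idir"/lib64/libffi.so ]; then
      echo "libffi found"
  fi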
|
|
Until now, we weren't explicitly disabling Java in the build of libffi
because there was no Java platform on the systems we tested. But recently
Yahya Sefidbakht reported a crash in libffi (bug #56716) because of Java
not being compiled properly.
Unfortunately libffi doesn't have a configure option to disable Java, but
it does have an internal macro (`NO_JAVA_RAW_API'). With this commit, we
define this macro in the `make' step, and it solved the problem.
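Concretely, the macro is passed through the compiler flags at the `make'
step of the recipe, roughly like this (a sketch; the exact flags in the
real recipe may differ):
  # Sketch: define the macro so libffi's Java raw API is disabled
  # even when a Java platform is detected.
  make CFLAGS="-g -O2 -DNO_JAVA_RAW_API=1"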
This fixes bug #56716.
|
|
Until now, mpi4py (version 3.0.1) couldn't be built with the most recent
version of OpenMPI. However, after trying the next version (3.0.2), I
noticed that it builds successfully.
|
|
Until now, there was no check on the integrity of the contents of the
downloaded/copied software tarballs; we only relied on the tarball
name. This could be bad for reproducibility and security: for example, a
server could serve a tarball with the same name but different contents.
With this commit, the SHA512 checksums of all the software are stored in
the newly created `checksums.mk' (similar to how the versions are stored in
`versions.mk'). The resulting variable is then defined for each software
package, and after downloading/copying the file we check that the new
tarball has the same checksum as the stored value. If it doesn't, the
script will abort with an error, informing the user of the problem.
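Per tarball, the verification is conceptually the following shell sketch
(the variable names are illustrative; in the Makefile the same logic lives
inside the download/copy recipe):
  # Sketch: compare the file's SHA512 checksum with the value stored
  # in checksums.mk and abort if they differ.
  checksum=$(sha512sum "$tarball" | awk '{print $1}')
  if [ "$checksum" = "$expected_checksum" ]; then
      mv "$tarball" "$tdir"/
  else
      echo "$tarball: checksum does not match the expected value"; exit 1
  fi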
The only limitation now is a bootstrapping problem: if the host system
doesn't already have an `sha512sum' executable, we will not do any checksum
verification until we install our own `sha512sum' (as part of GNU
Coreutils). All the tarballs downloaded after GNU Coreutils is built will
have their checksums validated. By default, almost all GNU/Linux systems
will have a usable `sha512sum' (it has been part of GNU Coreutils for a
long time: according to the Coreutils ChangeLog file, at least since 2013).
This completes task #15347.
|
|
Until now, to work on a project, it was necessary to `./configure' it and
build the software. Then we had to run `.local/bin/make' to run the project
and do the analysis every time. If the project was a shared project between
many users on a large server, it was necessary to call the `./for-group'
script.
This way of managing the project had a major problem: since the user
directly called the lower-level `./configure' or `.local/bin/make', it was
not possible to provide high-level control (for example, limiting the
environment variables). This was especially noticed recently with a bug
related to environment variables (bug #56682).
With this commit, this problem is solved using a single script called
`project' in the top directory. To configure and build the project, users
can now run these commands:
$ ./project configure
$ ./project make
To work on the project with other users in a group these commands can be
used:
$ ./project configure --group=GROUPNAME
$ ./project make --group=GROUPNAME
The old options to both configure and make the project are still valid. Run
`./project --help' to see a list. For example:
$ ./project configure -e --host-cc
$ ./project make -j8
The old `configure' script has been moved to
`reproduce/software/bash/configure.sh' and is called by the new `./project'
script. The `./project' script now just manages the options, then passes
control to the `configure.sh' script. For the "make" step, it also reads
the options, then calls Make. So at the lower level nothing has changed;
only the `./project' script is now the single, direct user interface of the
project.
On a parallel note: as part of bug #56682, we also found out that on some
macOS systems, the `DYLD_LIBRARY_PATH' environment variable has to be set
to blank. This is no problem because RPATH is automatically set in macOS
and the executables and libraries contain the absolute address of the
libraries they should link with. But having `DYLD_LIBRARY_PATH' set can
conflict with some low-level system libraries and cause very hard-to-debug
linking errors (like the one reported in this bug).
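In practice, this means the high-level scripts do something like the
following on macOS (a sketch; the exact test and placement are
assumptions):
  # Sketch: on macOS, clear DYLD_LIBRARY_PATH; RPATH inside the built
  # executables and libraries already records where to find them.
  if [ x"$(uname)" = xDarwin ]; then
      export DYLD_LIBRARY_PATH=""
  fi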
This fixes bug #56682.
|
|
On some Fedora systems, libffi installs under `lib64', not `lib'. As a
result, Python's Setuptools can't find it and will not build (complaining
about not finding the `_ctypes' module).
With this commit, we fix this problem by explicitly putting a copy of
libffi's installed libraries within the `lib' directory.
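The fix is a small step at the end of libffi's build recipe, roughly like
this sketch (the paths and copy command are illustrative):
  # Sketch: if libffi was installed under lib64/, also put a copy in
  # lib/ so Setuptools (and the '_ctypes' module) can find it.
  if [ -d "$idir"/lib64 ]; then
      cp "$idir"/lib64/libffi* "$idir"/lib/
  fi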
This issue was found while testing the pipeline with Elham Saremi and Hamed
Altafi.
|
|
Until this commit, the name of the decompressed tarball directory of the
PyYAML Python package was wrong: it has to be `PyYAML-version' instead of
`pyyaml-version'. When I ran the installation on a macOS system, it went
through to the end of the installation with no error. However, when I tried
to install it on a GNU/Linux system, it complained about not finding the
`pyyaml-version' directory, which is expected because the name was wrong!
With this commit, I have fixed this issue by writing the name of the
decompressed tarball directory correctly.
|
|
With this commit, the PyYAML Python package has been added to the project.
It is widely used in the Python community and the goal is to have
human-readable configuration files. As the web page
(https://pypi.org/project/PyYAML/) says:
  YAML is a data serialization format designed for human readability and
  interaction with scripting languages. PyYAML is a YAML parser and
  emitter for Python.
|
|
Matplotlib can optionally use LaTeX and Ghostscript as dependencies to
render text with LaTeX.
Since we already have their build rules, with this commit, they have been
added as its prerequisites.
|
|
With this commit, the ImageMagick program has been set as a prerequisite of
the Matplotlib Python package. Some submodules of Matplotlib rely on
ImageMagick programs (for example, those that make image animations), and
now that ImageMagick is in the project, it is good to install it whenever
Matplotlib is requested to be installed.
|
|
Especially because of the new convention regarding backslashes, there were
many conflicts that are now fixed. But none were substantial.
|
|
When we need to quote the new-line character, we end the line with a
backslash (`\'). Until now, our convention has been to put all such
backslashes under each other to help in visual inspection.
But this causes a lot of confusion in version control: if only one line's
length grows larger, the whole block will be marked as changed, which makes
it hard to visually see the actual change. It also makes debugging the code
(adding some temporary lines) hard.
With this commit, I went through all the files and tried to fix all such
cases so only a single white-space character is between the last command
character and the backslash. Where there was an empty line (ending with a
backslash, to help in visually separating the code into blocks), I put the
backslash right under the previous line's.
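For example (an illustrative snippet, not taken from any particular file):
  # Old convention: backslashes aligned under each other.
  prerequisites = first-software      \
                  second-software     \
                  third-long-software

  # New convention: a single space before each backslash.
  prerequisites = first-software \
                  second-software \
                  third-long-software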
This completes task #15259.
|
|
Until now, we were missing `numpy' and `six' as prerequisites of `h5py'.
Because we did the configure step with all cores, `numpy' and `six' were
always built before `h5py' without any crash. However, in a single-threaded
configure run, we noticed that we were missing these two dependencies.
With this commit, we fix this issue by setting `numpy' and `six' as
prerequisites of `h5py'.
|
|
Until this commit, the prerequisites of `sip_tpv' were `mpmath' and
`sympy'. However, the real prerequisites of `sip_tpv' are `astropy',
`numpy' and `sympy'.
With this commit, we fix this issue, and now `mpmath' is only a
prerequisite of `sympy'.
|
|
Until now, we were not citing the paper of the `sip_tpv' package.
With this commit, we have fixed this issue.
|
|
With this commit, we add the `sip_tpv' Python package to the template.
This is a small package to convert SIP distortion coefficients into PV
distortion coefficients, and the other way around.
This package is useful in an astronomical context, especially when `swarp'
is going to be used to resample images: `swarp' can only understand PV
distortion coefficients.
|
|
With this commit, we add the `sympy' Python package to the template. This
is a package to do symbolic mathematics.
The motivation is that it is a prerequisite of the `sip_tpv' Python
package, which is useful to convert SIP distortion coefficients into PV
coefficients (in the context of astronomical images). However, the
availability of `sympy' in the template will be useful for anyone
interested in this package.
|
|
With this commit, we add the `mpmath' Python package to the template. This
package is a prerequisite of `sympy', a package to do symbolic mathematics.
The motivation for adding this package is that it is a dependency of
`sympy', which is more widely used in the Python science community.
|
|
Until now, to specify which high-level software you want the project to
contain, it was necessary to edit the `high-level.mk' Makefile, which is
complicated and can create bugs.
With this commit, a new `reproduce/software/config/installation/TARGETS.mk'
file has been created that cleanly documents the final high-level software
that must be built for the project.
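For example, a project's `TARGETS.mk' can now be as small as something
like this sketch (the variable and package names here are only
illustrative; see the comments in the file itself for the exact names to
use):
  # Sketch of a TARGETS.mk: the final high-level software to build.
  top-level-programs = gnuastro
  top-level-python   = astropy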
Also, until now, FFTW was set as a dependency of Numpy, while we couldn't
actually get Numpy to use it! It was just there for future reference and to
justify its build rule. But now that many software packages won't be built
by default, and there is no problem with having rules a project might not
use, this dependency has been removed.
|
|
Until this commit, we were using `python3' when calling Python (because
we were using Python version 3.6.8). This would force us to change the
name in the future, for example when `python4' becomes available and is
brought into the pipeline.
With this commit, at the end of the Python installation, a symbolic link
to the Python executable is created with the new name `python'.
As a consequence, whatever version of Python is installed, within the
project we will always use `python' to invoke it.
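The link is made at the end of Python's build recipe, roughly like this
(a sketch; the paths are illustrative):
  # Sketch: the project can now always call 'python', independent of
  # the installed Python's major version.
  ln -fs "$idir"/bin/python3 "$idir"/bin/python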
|
|
Until now, the software building and analysis steps of the pipeline were
intertwined. However, these steps (how to build the software, and how to
use it) are logically completely independent.
Therefore with this commit, the pipeline now has a new architecture
(particularly in the `reproduce' directory) to emphasize this distinction:
The `reproduce' directory now has the two `software' and `analysis'
subdirectories and the respective parts of the previous architecture have
been broken up between these two based on their function. There is also no
more `src' directory. The `config' directory for software and analysis is
now mixed with the language-specific directories.
Some of the software versions were also updated after checking their
webpages.
This new architecture will allow much more focused work on each part of the
pipeline (to install the software and to run them for an analysis).
|