Age | Commit message | Author | Lines |
|
Until now, if a project needed the healpy software package, Maneage would
crash with the following error message (the build directory's full path is
abridged here):
    make: *** No rule to make target '.../version-info/proglib/healpix-'
This was caused by a typo in the version of 'healpix' (the dependency of
'healpy').
With this commit, the typo in line 334 of 'python.mk' is fixed, so that
when '$(ipydir)/healpy-$(healpy-version)' gets called it correctly searches
for a rule to make '$(ibidir)/healpix-$(healpix-version)'.
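A minimal sketch of the corrected dependency (only the target and
prerequisite names above are from this fix; the full rule in 'python.mk'
contains more steps):
    $(ipydir)/healpy-$(healpy-version): $(ibidir)/healpix-$(healpix-version)
            ... recipe to build healpy ...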
|
|
POSSIBLE EFFECT ON YOUR PROJECT: The changes in this commit may only cause
conflicts in your project if you have changed the software building
Makefiles in your project's branch (e.g., 'basic.mk', 'high-level.mk' and
'python.mk'). If your project has only added analysis, it shouldn't be
affected.
This is a large commit, involving a long series of corrections in a
different branch which is now finally being merged into the core Maneage
branch. All changes were related and came up naturally as the low-level
infrastructure was improved, so separating them for the final merge would
have been very time consuming and we are merging them as one commit.
In general, the software building Makefiles are now much easier to read,
modify and use, and several new features have been added. See below for the
full list.
- Until now, Maneage needed the host to have a 'make' implementation,
  because Make was necessary to build Lzip, and Lzip is then used to
  uncompress the source of our own GNU Make. However, the minimalist/slim
  versions of operating systems (for example those used to build Docker
  images) don't include Make by default. Since Lzip was the only program
  built before our own GNU Make, we consulted Antonio Diaz Diaz (creator of
  Lzip) and he kindly added the necessary functionality to a new version of
  Lzip, which we are using now. Hence we no longer need to assume a Make
  implementation on the host. With this commit, Lzip and GNU Make are built
  without Make, allowing everything else to be safely built with our own
  custom version of GNU Make and not using the host's 'make' at all.
- Until recently (Commit 3d8aa5953c4), GNU Make was built in
  'basic.mk'. Therefore 'basic.mk' was written in a way that it could also
  be used with other 'make' implementations (i.e., important shell commands
  starting with '&&' and ending in '\' without any comments between
  them!). Furthermore, to help in style uniformity, the rules in
  'high-level.mk' and 'python.mk' also followed a similar structure. But
  thanks to the point above, we can now guarantee that GNU Make is used
  from the very first Makefile, so this hard-to-read structure has been
  removed from the software build recipes and they are much more readable
  and edit-friendly now.
- Until now, the default backup servers were at some fixed URLs, on our own
  pages or on GitLab. But recently we uploaded all the necessary software
  to Zenodo (https://doi.org/10.5281/zenodo.3883409), which is more
  suitable for this task (it promises longevity and has a fixed DOI, while
  allowing us to add new content, such as new software tarball
  versions). With this commit, a small script has been written to extract
  the most recent Zenodo upload link from the Zenodo DOI and use it for
  downloading the software source code.
- Until now, we primarily used the webpage of each software package for
  downloading its tarball. But this caused many problems: 1) some of them
  needed JavaScript before the download, 2) some URLs had a complex
  dependency on the version number, 3) some servers would be randomly down
  for maintenance, and so on. So thanks to the point above, we now use the
  Zenodo server as the primary download location. However, if a user wants
  to use a custom software package that is not (yet!) in Zenodo, the
  download script gives priority to a custom URL that the user can give as
  a Make variable. If that variable is defined, then the script will use
  that URL before going on to Zenodo. We now have a special place for such
  URLs: 'reproduce/software/config/urls.conf' (an example entry is sketched
  after this list). The old URLs (which are good documentation in
  themselves) are preserved there, but are commented by default.
- The software source code downloading and checksum verification steps have
  been moved into a Make function called 'import-source' (defined in
  'build-rules.mk' and loaded in all software Makefiles). Having moved all
  the low-level steps there, I noticed that there is no longer any need to
  have the tarball as a separate target! So with this commit, a single rule
  is the only place that needs to be edited/added, greatly simplifying the
  software building Makefiles (a sketch of such a rule is given after this
  list).
- Following task #15272, a new option has been added to the './project'
script called '--all-highlevel'. When this option is given, the contents
of 'TARGETS.conf' are ignored and all the software in Maneage are built
(selected by parsing the 'versions.conf' file). This new option was
added to confirm the extensive changes made in all the software building
recipes and is great for development/testing purposes.
- Many of the software packages hadn't been tested for a long time! So
  after using the newly added '--all-highlevel', we noticed that some
  needed to be updated. In general, with this commit, 'libpaper' and 'pcre'
  were added as new software, and the versions of the following software
  were updated: 'boost', 'flex', 'libtirpc', 'openblas' and 'lzip'. A
  'run-parts.in' shell script was added in 'reproduce/software/shell/'
  which is installed with 'libpaper'.
- Even though we intentionally add the necessary flags to put RPATH inside
  the built executable at compilation time, some software don't do it
  (different software on different operating systems!). Until now, for
  historical reasons, this check was done in different ways for different
  software on GNU/Linux systems. But now it is unified: if 'patchelf' is
  present, we apply it (an example call is given after this list). Because
  of this, 'patchelf' has been made a top-level prerequisite, right after
  Tar, and is installed before anything else.
- In 'versions.conf', GNU Libtool is recognized as 'libtool', but in
'basic.mk', it was 'glibtool'! This caused many confusions and is
corrected with this commit (in 'basic.mk', it is also 'libtool').
- A new argument is added to the './project' script to allow easy loading
of the project's shell and environment for fast/temporary testing of
things in the same environment as the project. Before activating the
project's shell, we completely remove all host environment variables to
simulate the project's environment. It can be called with this command:
'./project shell'. A simple prompt has also been added to highlight that
the user is using the Maneage shell!
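For reference, a custom URL entry in 'reproduce/software/config/urls.conf'
(mentioned in the list above) could look something like the following; the
software name and address are only illustrative, not copied from the actual
file:
    # Uncomment and edit to use a custom location instead of Zenodo.
    # gsl-url = http://ftp.gnu.org/gnu/gsl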
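A rough sketch of what a software building rule now looks like with
'import-source' (the 'foo' names and the exact calling convention are only
illustrative, not copied from the actual Makefiles):
    $(ibidir)/foo-$(foo-version):
            tarball=foo-$(foo-version).tar.gz
            $(call import-source, $(foo-url), $(foo-checksum))
            ... build and install foo, then write its name/version into $@ ...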
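Finally, the unified RPATH fix mentioned above boils down to a call like
the following when 'patchelf' is present (the binary name and RPATH value
are only illustrative):
    patchelf --set-rpath "$instdir/lib" "$instdir/bin/foo"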
|
|
Until now, Maneage would only build Flock before building everything else
in parallel with Make (calling 'basic.mk'). Flock was necessary to avoid
parallel downloads during the building of software (which could cause
network problems). But after recently trying Maneage on FreeBSD (which is
not yet complete, see bug #58465), we noticed that the BSD implementation
of Make couldn't parse 'basic.mk' (in particular, complaining about the
'ifeq' parts) and its shell also had some peculiarities.
It was thus decided to also install our own minimalist shell, Make and
compressor program before calling 'basic.mk'. In this way, 'basic.mk' can
now assume the same GNU Make features that high-level.mk and python.mk
assume. The pre-make building of software is now organized in
'reproduce/software/shell/pre-make-build.sh'.
Another nice feature of this commit is for macOS users: until now, the
default macOS Make had problems with parallel building of software, so
'basic.mk' was run with a single thread. But now that we build the core
tools with GNU Make on macOS too, it uses all threads. Furthermore, since
we now run 'basic.mk' with GNU Make, we can use '.ONESHELL' and don't have
to finish every line of a long rule with a backslash to keep variables and
such.
Generally, the pre-make software are now organized like this: first we
build Lzip before anything else; it is downloaded as a simple '.tar' file
that is not compressed (only ~400kb). Once Lzip is built, the pre-make
phase continues with building GNU Make, Dash (a minimalist shell) and
Flock. All of their tarballs are in '.tar.lz'. Maneage then enters
'basic.mk' and the first program it builds is GNU Gzip (itself packaged as
'.tar.lz'). Once Gzip is built, we build all the other compression software
(all downloaded as '.tar.gz'). Afterwards, any compression standard is fine
for the other software because we have the tools to handle it.
In the process, a bug related to using backup servers when
'reproduce/analysis/bash/download-multi-try' is called outside of
'basic.mk' was found and fixed, and Bash-specific features were removed
from that script. As a result of that bug-fix, and because we now have
multiple servers for software tarballs, the backup servers now have their
own configuration file in
'reproduce/software/config/servers-backup.conf'. This makes it much easier
to maintain the backup server list across the multiple places that we need
it.
Some other minor fixes:
- In building Bzip2, we need to specify 'CC' so it doesn't use 'gcc'.
- In building Zip, the 'generic_gcc' Make option caused a crash on FreeBSD
(which doesn't have GCC).
- We are now using 'uname -s' to check if we are on a Linux kernel or not;
  if not, we are still using the old 'on_mac_os' variable.
- While I was trying to build on FreeBSD, I noticed some further
  corrections that could help. For example, the 'makelink' Make-function
  now takes a third argument which can be a different name compared to the
  actual program (used for example to make a link to '/usr/bin/cc' from
  'gcc').
- Until now, we didn't know if the host's Make implementation supports
  placing a '@' at the start of a recipe (to avoid printing the actual
  commands to standard output). Especially in the tarball download phase,
  many lines are printed for each download, which was really annoying. We
  already used '@' in 'high-level.mk' and 'python.mk' before, but now that
  we know 'basic.mk' is also called with our custom GNU Make, we can use it
  there too for a cleaner stdout.
- Until now, WCSLIB assumed a Fortran compiler, but when the user is on a
system where we can't install GCC (or has activated the '--host-cc'
option), it may not be present and the project shouldn't break because
of this. So with this commit, when a Fortran compiler isn't present,
WCSLIB will be built with the '--disable-fortran' configuration option.
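The idea is roughly the following (only '--disable-fortran' is from this
commit; the surrounding check is an illustrative sketch):
    if type gfortran > /dev/null 2>&1; then
        ./configure
    else
        ./configure --disable-fortran
    fi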
This commit (task #15667) was completed with help/checks by Raul
Infante-Sainz and Boud Roukema.
|
|
In time, some of the copyright license descriptions had been mistakenly
shortened to two paragraphs instead of the original three that are
recommended by the GPL. With this commit, they are corrected to be exactly
in the same three-paragraph format suggested by the GPL.
The following files also didn't have a copyright notice, so one was added
for them:
reproduce/software/make/README.md
reproduce/software/bibtex/healpix.tex
reproduce/analysis/config/delete-me-num.conf
reproduce/analysis/config/verify-outputs.conf
|
|
Until now, we had manually inserted a `\' before the `_' of the `sip_tpv'
package name. However, we also recently added a step in the configure
script to add a `\' before every `_' when writing the final LaTeX
macro. This was because some C compilers (when the host's is used) have an
`_' in their version string that we have no control over.
With this commit, the `\' is removed from `sip_tpv' in its build rule and
we let the backslash be inserted automatically.
|
|
Until now, when you changed the version of a software package in an
already-built system, its tarball would be downloaded, but it wouldn't
actually be built. The only way was to force the build by deleting the main
target of that software (under `.local/version-info/TYPE/PROGRAM'). This
was because the tarballs were an order-only prerequisite, which was
implemented some time ago based on the theoretical argument that if only
the tarball's date changes, the software should not be rebuilt (because we
check the checksum anyway).
However, the problems this causes are more than those it solves: users may
forget to delete the main target of the program and mistakenly think that
they are using the new version. The fact that the software version numbers
reported in the paper would also show the new version further hides the
problem.
With this commit, tarballs are no longer order-only, so any time the
version of a software package is updated, it will automatically be rebuilt,
avoiding confusion and manual intervention by the users. As a result of
this change, I also had to correct the way we find the tarball in the list
of prerequisites.
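The difference between the two kinds of prerequisite can be sketched like
this (the 'foo' names and directory variables are only illustrative):
    # Old (order-only): a newer tarball never triggers a rebuild.
    $(ibidir)/foo-$(foo-version): | $(tdir)/foo-$(foo-version).tar.gz
    # New (normal): updating the version (hence the tarball) rebuilds it.
    $(ibidir)/foo-$(foo-version): $(tdir)/foo-$(foo-version).tar.gz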
|
|
Until now, the sed script for determining URL download rules in the three
software building Makefiles (`basic.mk', `high-level.mk' and `python.mk')
considered package names such as `fftw-3...` and `fftw2-2.1...` to be
identical. As this example shows, this would make it hard to include
software that may have conflicting non-numeric names.
With this commit, the sed script that is used to separate the version from
the tarball name only matches numbers that come after a dash (`-'). It
therefore considers `fftw-3...` and `fftw-2...` to be identical, but
`fftw-3...` and `fftw2-2.1...` to be different. As a result of this change,
the `elif' check for some of the other programs, like `m4' or `help2man',
was also corrected in all three Makefiles.
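The rule being described can be illustrated with a small sed command (this
is not the actual script, just a demonstration of matching only the numbers
that follow a dash):
    $ echo "fftw2-2.1.5.tar.gz" | sed -e 's/\(.*\)-[0-9].*/\1/'
    fftw2
    $ echo "fftw-3.3.8.tar.gz"  | sed -e 's/\(.*\)-[0-9].*/\1/'
    fftw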
While doing this check on all the software, we noticed that `zlib-version'
was repeated twice in `version.conf', so the duplicate was removed. It
caused no complications, because both had the same value, but it could have
led to bugs later.
|
|
Until now, throughout Maneage we were using the old name of "Reproducible
Paper Template". But we have finally decided to use Maneage, so to avoid
confusion, the name has been corrected in `README-hacking.md' and also in
the copyright notices.
Note also that in `README-hacking.md', the main Maneage branch is now
called `maneage', and the main Git remote has been changed to
`https://gitlab.com/maneage/project' (this is a new GitLab Group that I
have set up for all Maneage-related projects). In this repository there is
only one `maneage' branch, to avoid complications with the `master' branch
of the projects using Maneage later.
|
|
Until now, the software configuration parameters were defined under the
`reproduce/software/config/installation/' directory. This was because the
configuration parameters of analysis software (for example Gnuastro's
configurations) were placed under there too. But this was terribly
confusing, because the run-time options of programs fall under the
"analysis" phase of the project.
With this commit, the Gnuastro configuration files have been moved under
the new `reproduce/analysis/config/gnuastro' directory and the software
configuration files are directly under `reproduce/software/config'. A clean
build was done with this change and it didn't crash, but it may cause
crashes in derived projects, so after merging with Maneage, please
re-configure your project to see if anything has been missed. Please let us
know if there is a problem.
|
|
Elham Saremi recently reported the following error when building Numpy in
numpy/core/src/npysort/radixsort.c.src: "error: 'for' loop initial
declarations are only allowed in C99 or C11 mode". After some searching, I
found issue 14147[1] on Numpy's main repository about the same problem. As
described there, Numpy apparently needs a C99 compiler, but doesn't check
for it or set it manually (for some strange reason, leaving it to the
packagers to check if they want!!!).
Anyway, after a check with Elham, we were able to fix it by adding
`--std=c99' to the CFLAGS of Numpy's build, and with this commit it is
implemented in the core Maneage so it won't cause a problem in any other
project.
[1] https://github.com/numpy/numpy/issues/14147
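In essence, the fix amounts to something like the following before Numpy's
build step (a sketch; the exact place this is set in 'python.mk' may
differ):
    export CFLAGS="--std=c99 $CFLAGS"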
|
|
Until now, Astropy was instructed to build its own internal copy of the
Expat library. However, with the recent commits, Maneage now includes an
installation of Expat, and Astropy can't keep the two (its internal version
and the project's version) separate, so they conflict and don't let Astropy
get built.
With this commit, the problem is fixed by setting the Expat library as an
explicit dependency of Astropy and asking Astropy to ignore its internal
copy.
While doing this, I realized that it is much easier and more elegant to add
steps to the various stages of the `pybuild' function through hooks instead
of variables. So the fifth argument of the `pybuild' function was removed,
and `pybuild' now checks if hooks are defined as functions; if so, they
will be called.
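As a rough sketch, a Python package's rule can now define a hook before
calling `pybuild' (the body below is only illustrative):
    pyhook_after() {
      # Hypothetical extra step, run by 'pybuild' after the standard
      # build only when this function is defined.
      echo "running extra post-build steps"
    }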
The `pyhook_after' function was also implemented in the installation of
`pybind11' (which needed it, given that the 5th argument of `pybuild' was
removed), and after doing a test build, I noticed that two lines were not
ending with a `\' in `boost' (a dependency of `pybind11').
Commit written originally by Mohammad Akhlaghi.
|
|
Until this commit, PyYAML was not set as a prerequisite of Astropy. This
package is an optional dependency of Astropy for some particular
functions. However, we have already included PyYAML in this project, so it
is available. With this commit, PyYAML has been set as a prerequisite of
Astropy.
In addition to this, Html5lib and Beautifulsoup4 have also been added as
prerequisites of Astropy (and removed from Astroquery's prerequisites),
since I noticed that both of them are optional dependencies of Astropy.
|
|
In the latest update of Astropy to version 4.0, some things were removed
that the previous version of Astroquery needed. As a consequence, it is
also necessary to update Astroquery to be able to run with Astropy
4.0. With this commit, Astroquery has been updated to its most recent
version (0.4).
|
|
Until now, the main download script could only check one server for the
given URL. However, ultimately the actual server that a file is downloaded
from is irrelevant for this project: we check its checksum
anyway. Especially in the case of software (which is distributed over many
servers), this could be very annoying: the servers may not properly
communicate with the running system and even the 10 trials won't be enough.
With this commit, the download script
`reproduce/analysis/bash/download-multi-try' can take a new optional
argument (a 5th argument). It assumes this argument is a space-separated
list of server(s) to use as backup for the original URL. When downloading
from the original URL fails, it will look into this list and try
downloading the same file from each given server.
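A usage sketch (the first four arguments are shown with hypothetical
placeholder names; only the new fifth argument, the space-separated backup
server list, is described by this commit):
    ./reproduce/analysis/bash/download-multi-try \
        "$downloader" "$lockfile" "$inputurl" "$outputname" \
        "http://backup1.example.org http://backup2.example.org"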
|
|
A newer version of the Astropy package has been released. With this commit,
it has been updated from v3.2.1 to v4.0.
|
|
Now that it's 2020, it's necessary to include this year in the copyright
statements.
|
|
An extra backslash in the prerequisites of the Jeepney Python package was
causing a crash in the installation of that software. With this commit,
this problem has been fixed by removing the backslash.
|
|
Until now, when building Matplotlib on macOS systems, we were using the
default C compiler. However, while Yahya Sefidbakht (previously) and
Mahdieh Nabavi (now) were trying to build the template on their macOS
systems using the GNU Compiler Collection (GCC), we found that Matplotlib
needs special macOS headers that GCC doesn't recognize.
With this commit, when Matplotlib is being built on macOS systems, it uses
`clang', and this fixed the problem (so far checked on Mahdieh's machine).
|
|
While working on a different branch to build the GNU C Library, I noticed a
few places in the template that need corrections which are now applied:
1. A new-line character after the "C compiler works" notice at the start
of the configure script.
2. Removing possible `::' in the `LD_LIBRARY_PATH' definition of
   `basic.mk'. Note that it's not necessary in the other steps because we
   don't use any outside-defined `LD_LIBRARY_PATH'.
3. Building GMP for C++ and also with `--enable-fat'.
4. Removing the unpacked Perl tarball directory after its installation.
|
|
These two packages are necessary to build the GNU C Library.
|
|
Perl is necessary to build Texinfo and later to build LaTeX. Until now, we
were just using the host operating system's installation of Perl, but in
some instances that Perl can be too old and not support the necessary
features. With this commit, Perl is now built from source during the basic
installation step of the template.
This was reported by Idafen Santana Pérez, after trying the pipeline on an
Amazon AWS EC2 system (running a Linux distro by Amazon for its cloud
services).
|
|
Until now, the tarballs were the first normal prerequisite of the
software. As a result, if their date changed, the whole software package
would be rebuilt. However, for tarballs specifically, we actually check
their contents with a checksum, so their date is irrelevant (if it is newer
than the built program but has the same checksum, there is no need to
rebuild the software).
Also, calling the tarball name as an argument to the building process (for
example `gbuild') was redundant. It is now automatically found from the
list of order-only prerequisites within `gbuild' and `cbuild' (similar to
how it was previously found in the `pybuild' for Python building).
A `README.md' file has also been placed in `reproduce/software/make' to
help describe the shared properties of the software building
Makefiles. This will hopefully grow much larger in the future.
|
|
Python's `lmfit' module and all its major dependencies (`asteval',
`corner', `emcee' and `uncertainties') have been included in the template.
While doing this, I noticed that if the tarballs are the last prerequisite
of each software building rule, then when building in parallel, the
template will immediately start building packages as soon as the first one
is downloaded, unlike the previous behavior where it would attempt to
download several, then start building. For now, this has been implemented
in the Python build rules for all the modules and we'll later do the same
for the other programs and libraries. This also motivated a simplification
of the `pybuild' function: it now internally looks into the prerequisites
and selects the tarball from the prerequisite that is in the tarballs
directory.
This isn't a problem for the build, but I just don't understand why Python
can't recognize the version of `emcee': Python reads the version of `emcee'
as `0.0.0'! But it doesn't cause any crash in the build, so for now it's
fine.
|
|
The tarball of HEALPix includes multiple languages and doesn't include a
ready-to-run GNU Build System by default; we actually have to generate the
`./configure' script for the C/C++ libraries. So it was necessary to also
include GNU Autoconf and GNU Automake as prerequisites of HEALPix.
However, the official GNU Autoconf tarball (dating from 2012) doesn't build
on modern systems, so I cloned its source repository, bootstrapped it, and
built a modern tarball which we are using here.
|
|
The following software are added with this commit: eigency, esutil, flake8,
future, galsim, lsstdesccoord, pybind11 and pyflakes.
|
|
As part of an effort to bring in all the dependencies of the LSST Science
pipeline (which includes the last commit), these software are now available
in the template.
|
|
It had been some time since these three software packages were updated!
With this commit, the template now uses the most recent stable release of
these packages.
Also, the hosting server for ImageMagick was moved to my own webpage
because, unfortunately, ImageMagick removes older tarballs from its own
webpage.
|
|
New versions of astropy, bash, cmake, curl, findutils, gawk, gcc,
ghostscript, git, make and gsl had recently come out, so they are updated
with this commit.
About GNU Findutils and GNU Make: I was bootstrapping (building the tarball
of) these two separately because their standard tarball releases had
problems on some systems. Both have been updated now, so I am no longer
using my own webpage as their main URL.
A special note about GNU Make. I just noticed that during bootstrapping,
GNU Make would use the fixed version string of `4.2.90' for any commit!!!
But fortunately they have officially released their 4.2.90 version, so we
are safely using their own webpage. The only difference is the compression
format. My old bootstrapped build was `tar.lz', but the standard release is
`tar.gz'.
Also, all the basic programs (installed in `.local/bin') in `basic.mk' are
now existence-only dependencies (after a `|'), because later programs only
use them at a very basic level, so there is no need to rebuild everything
when, for example, Bash gets updated.
|
|
Until this commit, the name of the variable for the `beautifulsoup4'
checksum was wrong, and because of that, it was not possible to install it.
With this commit, `beautifulsoup-checksum' has been replaced with
`beautifulsoup4-checksum' in the `reproduce/software/make/python.mk'
Makefile, and the problem has been fixed. This was not noticed previously
because this Python package is only installed when some high-level programs
are requested to be installed.
With this commit, the version of the `imagemagick' program has also been
updated, because the previous version is not available on the official
website anymore.
|
|
In the line where we check if libffi is installed in `lib/' or `lib64/',
there was a typo that led to an incorrect result: we had forgotten to add a
`-f' to the second check. This is corrected with this commit.
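To illustrate the effect (the file name here is only an example), without
the `-f' the second bracket test is just a non-empty string, so it is
always true:
    # Buggy: the second test lacks '-f', so it always succeeds.
    if [ -f lib/libffi.so ] || [ lib64/libffi.so ]; then ...
    # Fixed:
    if [ -f lib/libffi.so ] || [ -f lib64/libffi.so ]; then ...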
|
|
Until now, we weren't explicitly disabling Java in the build of libffi
because there was no Java platform on the systems we tested. But recently
Yahya Sefidbakht reported a crash in libffi (reported in bug #56716)
because of Java not being compiled properly.
Unfortunately, libffi doesn't have a configure option to disable Java, but
it does have an internal macro (`NO_JAVA_RAW_API'). With this commit, we
define this macro in the `make' step, and it solved the problem.
This fixes bug #56716.
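One way to define such a macro in the make step is sketched below (the
exact variable used in the libffi rule may differ):
    make CFLAGS="-DNO_JAVA_RAW_API=1"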
|
|
Until now, in version 3.0.1, mpi4py couldn't be built with the most recent
version of OpenMPI. However, after trying the next version (3.0.2), I
noticed that it builds successfully without a problem.
|
|
Until now, there was no check on the integrity of the contents of the
downloaded/copied software tarballs; we only relied on the tarball
name. This could be bad for reproducibility and security: for example, on
one server the name of a tarball may be the same but with different
contents.
With this commit, the SHA512 checksums of all the software are stored in
the newly created `checksums.mk' (similar to how the versions are stored in
`versions.mk'). The resulting variable is then defined for each software
package, and after downloading/copying the file we check to see if the new
tarball has the same checksum as the stored value. If it doesn't, the
script will crash with an error, informing the user of the problem.
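Conceptually, the check boils down to something like this (a sketch, not
the actual code; the variable names are illustrative):
    sum=$(sha512sum "$tarball" | awk '{print $1}')
    if [ x"$sum" != x"$expected" ]; then
        echo "$tarball: checksum mismatch"; exit 1
    fi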
The only limitation now is a bootstrapping problem: if the host system
doesn't already have an `sha512sum' executable, we will not do any checksum
verification until we install our own `sha512sum' (as part of GNU
Coreutils). All the tarballs downloaded after GNU Coreutils is built will
have their checksums validated. By default, almost all GNU/Linux systems
will have a usable `sha512sum' (it has been part of GNU Coreutils for a
long time: from the Coreutils ChangeLog file, at least since 2013).
This completes task #15347.
|
|
Until now, to work on a project, it was necessary to `./configure' it and
build the software. Then we had to run `.local/bin/make' to run the project
and do the analysis every time. If the project was a shared project between
many users on a large server, it was necessary to call the `./for-group'
script.
This way of managing the project had a major problem: since the user
directly called the lower-level `./configure' or `.local/bin/make' it was
not possible to provide high-level control (for example limiting the
environment variables). This was especially noticed recently with a bug
that was related to environment variables (bug #56682).
With this commit, this problem is solved using a single script called
`project' in the top directory. To configure and build the project, users
can now run these commands:
$ ./project configure
$ ./project make
To work on the project with other users in a group these commands can be
used:
$ ./project configure --group=GROUPNAME
$ ./project make --group=GROUPNAME
The old options to both configure and make the project are still valid. Run
`./project --help' to see a list. For example:
$ ./project configure -e --host-cc
$ ./project make -j8
The old `configure' script has been moved to
`reproduce/software/bash/configure.sh' and is called by the new `./project'
script. The `./project' script now just manages the options, then passes
control to the `configure.sh' script. For the "make" step, it also reads
the options, then calls Make. So in the lower-level nothing has
changed. Only the `./project' script is now the single/direct user
interface of the project.
On a parallel note: as part of bug #56682, we also found out that on some
macOS systems, the `DYLD_LIBRARY_PATH' environment variable has to be set
to blank. This is no problem because RPATH is automatically set on macOS
and the executables and libraries contain the absolute addresses of the
libraries they should link with. But having `DYLD_LIBRARY_PATH' set can
conflict with some low-level system libraries and cause very hard-to-debug
linking errors (like the one reported in the bug report).
This fixes bug #56682.
|
|
On some Fedora systems, libffi installs under `lib64', not `lib'. As a
result, Python's Setuptools can't find it and will not build (complaining
about not finding the `_ctypes' module).
With this commit, we fix this problem by explicitly putting a copy of
libffi's installed libraries within the `lib' directory.
This issue was found while testing the pipeline with Elham Saremi and Hamed
Altafi.
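The fix is conceptually something like this (a sketch; `$instdir' is a
hypothetical placeholder for the installation directory and the exact
recipe may differ):
    if [ -d "$instdir"/lib64 ]; then
        cp -a "$instdir"/lib64/libffi* "$instdir"/lib/
    fi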
|
|
Until this commit, the name of the decompressed tarball directory of the
PyYAML Python package was wrong: it has to be `PyYAML-version' instead of
`pyyaml-version'. When I ran the installation on a macOS system, it went
all the way to the end of the installation with no error. However, when I
tried to install it on a GNU/Linux system, it complained about not finding
the `pyyaml-version' directory, which is expected because the name was
wrong!
With this commit, I have fixed this issue by writing the name of the
decompressed tarball directory correctly.
|
|
With this commit, the PyYAML Python package has been added into the
project. It is widely used in the Python community and its goal is to have
human-readable configuration files. As its web page
(https://pypi.org/project/PyYAML/) says:
YAML is a data serialization format designed for human readability and
interaction with scripting languages. PyYAML is a YAML parser and
emitter for Python.
|
|
Matplotlib can optionally use LaTeX and Ghostscript as dependencies to
render text with LaTeX.
Since we already have their build rules, with this commit, they have been
added as its prerequisites.
|
|
With this commit, the ImageMagick program has been set as a prerequisite of
the Matplotlib Python package. Some submodules of Matplotlib rely on
ImageMagick programs (for example, those that make image animations), and
now that we have ImageMagick in the project, it is good to install it when
Matplotlib is requested to be installed.
|
|
Especially because of the new convention regarding backslashes, there were
many conflicts that are now fixed. But none were substantial.
|
|
When we need to quote the new-line character we end the line with a
backslash (`\'). Until now, our convention has been to put all such
backslashes under each other to help in visual inspection.
But this causes a lot of confusion in version control: if only one line's
length grows, the whole block will be marked as changed, which makes it
hard to visually see the actual change. It also makes debugging the code
(adding some temporary lines) hard.
With this commit, I went through all the files and tried to fix all such
cases so only a single white space character is between the last command
character and the backslash. Where there was an empty line (ending with a
backslash, to help in visually separating the code into blocks), I put the
backslash right under the previous line's.
This completes task #15259.
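For illustration, the two conventions look like this (a made-up command,
only to show the whitespace difference):
    # Old convention (backslashes aligned in a column):
    somecommand --option-one=1       \
                --option-two=2       \
                --option-three=3
    # New convention (a single space before each backslash):
    somecommand --option-one=1 \
                --option-two=2 \
                --option-three=3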
|
|
Until now, we were missing `numpy' and `six' as prerequisites of `h5py'.
Because we did the configure step with all cores, `numpy' and `six' were
always built before `h5py' without any crash. However, in a single-threaded
configure run we noticed that we were missing these two dependencies.
With this commit, we fix this issue by setting `numpy' and `six' as
prerequisites of `h5py'.
|
|
Until this commit, the prerequisites of `sip_tpv' were `mpmath' and
`sympy'. However, the real prerequisites of `sip_tpv' are `astropy',
`numpy' and `sympy'.
With this commit, we fix this issue, and now `mpmath' is only a
prerequisite of `sympy'.
|
|
Until now, we were not citing the paper of the `sip_tpv' package.
With this commit, we fix this issue.
|
|
With this commit, we add the `sip_tpv' Python package into the template.
This is a small package to convert SIP distortion coefficients into PV
distortion coefficients, and the other way around.
This package is useful in an astronomical context, especially when `swarp'
is going to be used to resample images. The reason is that `swarp' can only
understand PV distortion coefficients.
|
|
With this commit, we add the `sympy' Python package into the template. This
is a package for doing symbolic mathematics.
The motivation is that it is a prerequisite of the `sip_tpv' Python
package, which is useful to convert SIP distortion coefficients into PV
coefficients (in the context of astronomical images). However, the
availability of `sympy' in the template will be useful for anyone
interested in this package.
|
|
With this commit, we add the `mpmath' Python package into the template.
This package is a prerequisite of `sympy', a package for doing symbolic
mathematics.
The motivation for adding this package is that it is a dependency of
`sympy', which is widely used in the Python science community.
|
|
Until now, to specify which high-level software you want the project to
contain, it was necessary to go into the `high-level.mk' Makefile, which is
complicated and can create bugs.
With this commit, a new `reproduce/software/config/installation/TARGETS.mk'
file has been created that is easily/cleanly in charge of documenting the
final high-level software that must be built for the project.
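The kind of content that goes into `TARGETS.mk' can be sketched like this
(the variable and software names below are only illustrative, not copied
from the actual file):
    # High-level software to build for this project.
    top-level-programs  = gnuastro
    top-level-python    = astropy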
Also, until now, FFTW was set as a dependency of Numpy, while we couldn't
actually get Numpy to use it! It was just there for future reference and to
justify its build rule. But now that many software packages won't be built
and there is no problem with having rules that a project might not use,
this dependency has been removed.
|
|
Until this commit, we were using `python3' when calling Python (because we
were using Python version 3.6.8). This would force us to change the name in
the future, for example when `python4' becomes available and enters the
pipeline.
With this commit, at the end of the Python installation, a symbolic link to
the Python executable is created with the new name `python'. As a
consequence, whatever version of Python is installed, we will always use
`python' to invoke it within the project.
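The idea is simply something like the following at the end of the Python
rule (a sketch; `$instdir' is a hypothetical placeholder for the
installation directory):
    ln -fs "$instdir"/bin/python3 "$instdir"/bin/python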
|
|
Until now, the software building and analysis steps of the pipeline were
intertwined. However, these steps (how to build the software, and how to
use it) are logically completely independent.
Therefore with this commit, the pipeline now has a new architecture
(particularly in the `reproduce' directory) to emphasize this distinction:
The `reproduce' directory now has the two `software' and `analysis'
subdirectories and the respective parts of the previous architecture have
been broken up between these two based on their function. There is also no
more `src' directory. The `config' directory for software and analysis is
now mixed with the language-specific directories.
Some of the software versions were also updated after some checks with
their webpages.
This new architecture will allow much more focused work on each part of the
pipeline (to install the software and to run them for an analysis).
|