author | Mohammad Akhlaghi <mohammad@akhlaghi.org> | 2019-04-15 01:47:58 +0100
---|---|---
committer | Mohammad Akhlaghi <mohammad@akhlaghi.org> | 2019-04-15 02:24:09 +0100
commit | 313b936b502d22b6a2ff43f560dee0bb51fd01d0 (patch) |
tree | 70f884b91b393be4d3c6b7cfaeaf3412900bd16f /reproduce/src |
parent | 4722ea598edd6b630227404c48c1c09ac527e9b8 (diff) |
New architecture to separate software-building and analysis steps
Until now, the software-building and analysis steps of the pipeline were
intertwined. However, these two steps (how to build the software, and how to
use it) are logically independent of each other.

Therefore, with this commit, the pipeline has a new architecture
(particularly in the `reproduce' directory) to emphasize this distinction:
the `reproduce' directory now has two subdirectories, `software' and
`analysis', and the respective parts of the previous architecture have been
split between them based on their function. There is no longer a `src'
directory, and the `config' directory for software and analysis now sits
alongside the language-specific directories.

Some of the software versions were also updated after checking their
webpages.

This new architecture will allow much more focused work on each part of the
pipeline (installing the software, and running it for the analysis).
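
For orientation, the sketch below is one plausible reading of the new layout.
Only the `software' and `analysis' names come from this commit message; the
subdirectories shown (`bash', `make', `config') are assumptions carried over
from the deleted `src' directory and the commit's note about `config'.

```sh
# A minimal sketch of the assumed new layout (illustrative only, not taken
# verbatim from the repository):
mkdir -p reproduce/software/config reproduce/software/bash reproduce/software/make
mkdir -p reproduce/analysis/config reproduce/analysis/bash reproduce/analysis/make
```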
Diffstat (limited to 'reproduce/src')
-rwxr-xr-x | reproduce/src/bash/download-multi-try | 112
-rw-r--r-- | reproduce/src/bash/git-post-checkout | 66
-rw-r--r-- | reproduce/src/bash/git-pre-commit | 83
-rw-r--r-- | reproduce/src/make/delete-me.mk | 126
-rwxr-xr-x | reproduce/src/make/dependencies-atlas-multiple.mk | 72
-rwxr-xr-x | reproduce/src/make/dependencies-atlas-single.mk | 54
-rw-r--r-- | reproduce/src/make/dependencies-basic.mk | 873
-rw-r--r-- | reproduce/src/make/dependencies-build-rules.mk | 120
-rw-r--r-- | reproduce/src/make/dependencies-python.mk | 506
-rw-r--r-- | reproduce/src/make/dependencies.mk | 718
-rw-r--r-- | reproduce/src/make/download.mk | 91
-rw-r--r-- | reproduce/src/make/initialize.mk | 341
-rw-r--r-- | reproduce/src/make/paper.mk | 142
-rw-r--r-- | reproduce/src/make/top.mk | 135
14 files changed, 0 insertions, 3439 deletions
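
As context for the first deleted file below: `download-multi-try' wrapped a
downloader command in a retry loop and an `flock'-based lock so that parallel
analysis jobs download one file at a time. A hypothetical invocation,
following the usage documented in the script's own header (all paths and the
URL here are placeholders, not values from the project):

```sh
# The downloader command must end with the option that takes the output
# name (e.g. "wget -O"); the second argument is a shared lock file, then
# the source URL and the local file name. Placeholder values throughout.
./reproduce/src/bash/download-multi-try "wget -O" \
    /path/to/locks/download \
    http://example.org/input.fits \
    input.fits
```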
diff --git a/reproduce/src/bash/download-multi-try b/reproduce/src/bash/download-multi-try deleted file mode 100755 index 1fd7497..0000000 --- a/reproduce/src/bash/download-multi-try +++ /dev/null @@ -1,112 +0,0 @@ -# Attempt downloading multiple times before crashing whole project. From -# the top project directory (for the shebang above), this script must be -# run like this: -# -# $ /path/to/download-multi-try downloader lockfile input-url downloaded-name -# -# NOTE: The `downloader' must contain the option to specify the output name -# in its end. For example "wget -O". Any other option can also be placed in -# the middle. -# -# Due to temporary network problems, a download may fail suddenly, but -# succeed in a second try a few seconds later. Without this script that -# temporary glitch in the network will permanently crash the project and -# it can't continue. The job of this script is to be patient and try the -# download multiple times before crashing the whole project. -# -# LOCK FILE: Since there is ultimately only one network port to the outside -# world, downloading is done much faster in serial, not in parallel. But -# the project's processing may be done in parallel (with multiple threads -# needing to download different files at the same time). Therefore, this -# script uses the `flock' program to only do one download at a time. To -# benefit from it, any call to this script must be given the same lock -# file. -# -# Copyright (C) 2019 Mohammad Akhlaghi <mohammad@akhlaghi.org> -# -# This script is free software: you can redistribute it and/or modify it -# under the terms of the GNU General Public License as published by the -# Free Software Foundation, either version 3 of the License, or (at your -# option) any later version. -# -# This script is distributed in the hope that it will be useful, but -# WITHOUT ANY WARRANTY; without even the implied warranty of -# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General -# Public License for more details. See <http://www.gnu.org/licenses/>. - - - - - -# Script settings -# --------------- -# Stop the script if there are any errors. -set -e - - - - - -# Input arguments and necessary sanity checks. -inurl="$3" -outname="$4" -lockfile="$2" -downloader="$1" -if [ "x$downloader" = x ]; then - echo "$0: downloader (first argument) not given."; exit 1; -fi -if [ "x$lockfile" = x ]; then - echo "$0: lock file (second argument) not given."; exit 1; -fi -if [ "x$inurl" = x ]; then - echo "$0: full input URL (third argument) not given."; exit 1; -fi -if [ "x$outname" = x ]; then - echo "$0: output name (fourth argument) not given."; exit 1; -fi - - - - - -# Try downloading multiple times before crashing. -counter=0 -maxcounter=10 -while [ ! -f "$outname" ]; do - - # Increment the counter. We need the `counter=' part here because - # without it the evaluation of arithmetic expression will be like and - # error and the script is set to crash on errors. - counter=$((counter+1)) - - # If we have passed a maximum number of trials, just exit with - # a failed code. - if (( counter > maxcounter )); then - echo - echo "Failed $maxcounter download attempts: $outname" - echo - exit 1 - fi - - # If this isn't the first attempt print a notice and wait a little for - # the next trail. - if (( counter > 1 )); then - tstep=$((counter*5)) - echo "Download trial $counter for '$outname' in $tstep seconds." - sleep $tstep - fi - - # Attempt downloading the file (one-at-a-time). 
Note that the - # `downloader' ends with the respective option to specify the output - # name. For example "wget -O" (so `outname', that comes after it) will - # be the name of the downloaded file. - flock "$lockfile" bash -c \ - "if ! $downloader $outname $inurl; then rm -f $outname; fi" -done - - - - - -# Return successfully -exit 0 diff --git a/reproduce/src/bash/git-post-checkout b/reproduce/src/bash/git-post-checkout deleted file mode 100644 index 9552f01..0000000 --- a/reproduce/src/bash/git-post-checkout +++ /dev/null @@ -1,66 +0,0 @@ -#!@BINDIR@/bash -# -# The example hook script to store the metadata information of version -# controlled files (with each commit) using the `metastore' program. -# -# Copyright (C) 2016 Przemyslaw Pawelczyk <przemoc@gmail.com> -# Copyright (C) 2018-2019 Mohammad Akhlaghi <mohammad@akhlaghi.org> -# -# This script is taken from the `examples/hooks/pre-commit' file of the -# `metastore' package (installed within the project, with an MIT license -# for copyright). We have just changed the name of the `MSFILE' and also -# set special characters for the installation location of meta-store so our -# own installation is found by Git. -# -# Permission is hereby granted, free of charge, to any person obtaining a -# copy of this software and associated documentation files (the -# "Software"), to deal in the Software without restriction, including -# without limitation the rights to use, copy, modify, merge, publish, -# distribute, sublicense, and/or sell copies of the Software, and to permit -# persons to whom the Software is furnished to do so, subject to the -# following conditions: -# -# The above copyright notice and this permission notice shall be included -# in all copies or substantial portions of the Software. -# -# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS -# OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF -# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN -# NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, -# DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR -# OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE -# USE OR OTHER DEALINGS IN THE SOFTWARE. - - -# File containig the metadata and metastore executable. -MSFILE=".file-metadata" -MSBIN=@BINDIR@/metastore - -# If metastore is not installed, then ignore this script (exit with a -# status of 0). -if [ ! -f $MSBIN ]; then exit 0; fi - -# Delete all temporary files -find @TOP_PROJECT_DIR@/ -name "*~" -type f -delete - -# Function to help in reporting a crash. -exit_on_fail() { - "$@" - if [ $? -ne 0 ]; then - echo "Failed to execute: $@" >&2 - exit 1 - fi -} - -# Check if the metadata file exists. -if [ ! -e "$MSFILE" ]; then - echo "\"$MSFILE\" missing" >&2 - exit 1 -fi - -# Run metastore. -exit_on_fail \ - $MSBIN -a -m -e -E -q -O @USER@ -G @GROUP@ -f "$MSFILE" - -# Return with a success code (0). -exit 0 diff --git a/reproduce/src/bash/git-pre-commit b/reproduce/src/bash/git-pre-commit deleted file mode 100644 index dbe0ecc..0000000 --- a/reproduce/src/bash/git-pre-commit +++ /dev/null @@ -1,83 +0,0 @@ -#!@BINDIR@/bash -# -# The example hook script to store the metadata information of version -# controlled files (with each commit) using the `metastore' program. -# -# Copyright (C) 2016 Przemyslaw Pawelczyk <przemoc@gmail.com> -# Copyright (C) 2018-2019 Mohammad Akhlaghi <mohammad@akhlaghi.org> -# -# WARNING: -# -# If the commit is aborted (e.g. 
by not entering any synopsis), -# then updated metastore file (.metadata by default) is not reverted, -# so its new version remains in the index. -# To undo any changes in metastore file written since HEAD commit, -# you may want to reset and checkout HEAD version of the file: -# -# git reset HEAD -- .metadata -# git checkout HEAD -- .metadata -# -# This script is taken from the `examples/hooks/pre-commit' file of the -# `metastore' package (installed within the project, with an MIT license -# for copyright). Here, the name of the `MSFILE' and also set special -# characters for the installation location of meta-store so our own -# installation is found by Git. -# -# Permission is hereby granted, free of charge, to any person obtaining a -# copy of this software and associated documentation files (the -# "Software"), to deal in the Software without restriction, including -# without limitation the rights to use, copy, modify, merge, publish, -# distribute, sublicense, and/or sell copies of the Software, and to permit -# persons to whom the Software is furnished to do so, subject to the -# following conditions: -# -# The above copyright notice and this permission notice shall be included -# in all copies or substantial portions of the Software. -# -# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS -# OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF -# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN -# NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, -# DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR -# OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE -# USE OR OTHER DEALINGS IN THE SOFTWARE. - -# File containig the metadata and metastore executable. -MSFILE=".file-metadata" -MSBIN=@BINDIR@/metastore - -# If metastore is not installed, then ignore this script (exit with a -# status of 0). -if [ ! -f $MSBIN ]; then exit 0; fi - -# Function to help in reporting a crash. -exit_on_fail() { - "$@" - if [ $? -ne 0 ]; then - echo "Failed to execute: $@" >&2 - exit 1 - fi -} - -# Run metastore. -exit_on_fail \ - $MSBIN -O @USER@ -G @GROUP@ -s -f "$MSFILE" - -# If it's first metastore commit, store again to include $MSFILE in $MSFILE. -if ! git-ls-tree --name-only HEAD 2>/dev/null | grep -Fqx "$MSFILE"; then - exit_on_fail \ - $MSBIN -O @USER@ -G @GROUP@ -s -f "$MSFILE" -fi - -# Check if the metadata file exists. -if [ ! -e "$MSFILE" ]; then - echo "\"$MSFILE\" missing" >&2 - exit 1 -fi - -# Add the metadata file to the Git repository. -exit_on_fail \ - git-add "$MSFILE" - -# Return with a success code (0). -exit 0 diff --git a/reproduce/src/make/delete-me.mk b/reproduce/src/make/delete-me.mk deleted file mode 100644 index 701a316..0000000 --- a/reproduce/src/make/delete-me.mk +++ /dev/null @@ -1,126 +0,0 @@ -# Dummy Makefile to create a random dataset for plotting. -# -# Copyright (C) 2018-2019 Mohammad Akhlaghi <mohammad@akhlaghi.org> -# -# This Makefile is free software: you can redistribute it and/or modify it -# under the terms of the GNU General Public License as published by the -# Free Software Foundation, either version 3 of the License, or (at your -# option) any later version. -# -# This Makefile is distributed in the hope that it will be useful, but -# WITHOUT ANY WARRANTY; without even the implied warranty of -# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General -# Public License for more details. 
-# -# A copy of the GNU General Public License is available at -# <http://www.gnu.org/licenses/>. - - - - - -# Dummy dataset -# ------------- -# -# We will use AWK to generate a table showing X and X^2 and draw its plot. -dmdir = $(texdir)/delete-me -dm = $(dmdir)/data.txt -$(dmdir): | $(texdir); mkdir $@ -$(dm): $(pconfdir)/delete-me-num.mk | $(dmdir) - - # When the plotted values are re-made, it is necessary to also - # delete the TiKZ externalized files so the plot is also re-made. - rm -f $(tikzdir)/delete-me.pdf - - # Generate the table of random values. - awk 'BEGIN {for(i=1;i<=$(delete-me-num);i+=0.5) print i, i*i; }' > $@ - - - - - -# WFPC2 image PDF -# ----------------- -# -# For an example image, we'll make a PDF copy of the WFPC II image to -# display in the paper. -dddemodir = $(texdir)/delete-me-demo -$(dddemodir): | $(texdir); mkdir $@ -demopdf = $(dddemodir)/wfpc2.pdf -$(demopdf): $(dddemodir)/%.pdf: $(indir)/%.fits | $(dddemodir) - - # When the plotted values are re-made, it is necessary to also - # delete the TiKZ externalized files so the plot is also re-made. - rm -f $(tikzdir)/delete-me-wfpc2.pdf - - # Convert the dataset to a PDF. - astconvertt --colormap=gray --fluxhigh=4 $< -h0 -o$@ - - - - - -# Histogram of WFPC2 image -# ------------------------ -# -# For an example plot, we'll show the pixel value histogram also. -histogram = $(dddemodir)/wfpc2-hist.txt -$(histogram): $(dddemodir)/%-hist.txt: $(indir)/%.fits | $(dddemodir) - - # When the plotted values are re-made, it is necessary to also - # delete the TiKZ externalized files so the plot is also re-made. - rm -f $(tikzdir)/delete-me-wfpc2.pdf - - # Generate the pixel value distribution - aststatistics --lessthan=5 $< -h0 --histogram -o$@ - - - - - -# Basic statistics -# ---------------- -# -# This is just as a demonstration on how to get analysic configuration -# parameters from variables defined in `reproduce/config/pipeline'. -stats = $(dddemodir)/wfpc2-stats.txt -$(stats): $(dddemodir)/%-stats.txt: $(indir)/%.fits | $(dddemodir) - aststatistics $< -h0 --mean --median > $@ - - - - - -# TeX macros -# ---------- -# -# This is how we write the necessary parameters in the final PDF. -# -# NOTE: In LaTeX you cannot use any non-alphabetic character in a variable -# name. -$(mtexdir)/delete-me.tex: $(dm) $(demopdf) $(histogram) $(stats) - - # Write the number of random values used. - echo "\newcommand{\deletemenum}{$(delete-me-num)}" > $@ - - # Note that since Make variables start with a `$(', if you want to - # use `$' within the shell (not Make), you have to quote any - # occurance of `$' with another `$'. That is why there are `$$' in - # the AWK command below. - # - # Here, we are first using AWK to find the minimum and maximum - # values, then using it again to read each separately to use in the - # macro definition. - mm=$$(awk 'BEGIN{min=99999; max=-min} - {if($$2>max) max=$$2; if($$2<min) min=$$2;} - END{print min, max}' $(dm)); - v=$$(echo "$$mm" | awk '{printf "%.3f", $$1}'); - echo "\newcommand{\deletememin}{$$v}" >> $@ - v=$$(echo "$$mm" | awk '{printf "%.3f", $$2}'); - echo "\newcommand{\deletememax}{$$v}" >> $@ - - # Write the statistics of the WFPC2 image as a macro. 
- mean=$$(awk '{printf("%.2f", $$1)}' $(stats)) - echo "\newcommand{\deletemewfpctwomean}{$$mean}" >> $@ - median=$$(awk '{printf("%.2f", $$2)}' $(stats)) - echo "\newcommand{\deletemewfpctwomedian}{$$median}" >> $@ diff --git a/reproduce/src/make/dependencies-atlas-multiple.mk b/reproduce/src/make/dependencies-atlas-multiple.mk deleted file mode 100755 index fef25c7..0000000 --- a/reproduce/src/make/dependencies-atlas-multiple.mk +++ /dev/null @@ -1,72 +0,0 @@ -# Rules to build ATLAS shared libraries in multi-threaded mode on GNU/Linux -# -# ------------------------------------------------------------------------ -# !!!!! IMPORTANT NOTES !!!!! -# -# This Makefile will be run during the initial `./configure' script. It is -# not included into the reproduction pipe after that. -# -# ------------------------------------------------------------------------ -# -# Copyright (C) 2019 Mohammad Akhlaghi <mohammad@akhlaghi.org> -# -# This Makefile is free software: you can redistribute it and/or modify it -# under the terms of the GNU General Public License as published by the -# Free Software Foundation, either version 3 of the License, or (at your -# option) any later version. -# -# This Makefile is distributed in the hope that it will be useful, but -# WITHOUT ANY WARRANTY; without even the implied warranty of -# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General -# Public License for more details. -# -# A copy of the GNU General Public License is available at -# <http://www.gnu.org/licenses/>. - -ORIGLDFLAGS := $(LDFLAGS) - -include Make.inc - -all: libatlas.so libf77blas.so libptf77blas.so libstcblas.so libptcblas.so \ - libblas.so libcblas.so liblapack.so.3.6.1 libptlapack.so.3.6.1 liblapack.so.3 - -libatlas.so: libatlas.a - ld $(ORIGLDFLAGS) $(LDFLAGS) -shared -soname $@ -o $@ \ - --whole-archive libatlas.a --no-whole-archive -lc $(LIBS) - -libf77blas.so : libf77blas.a libatlas.so - ld $(ORIGLDFLAGS) $(LDFLAGS) -shared -soname libf77blas.so.3 \ - -o $@ --whole-archive libf77blas.a --no-whole-archive \ - $(F77SYSLIB) -L. -latlas - -libptf77blas.so : libptf77blas.a libatlas.so - ld $(ORIGLDFLAGS) $(LDFLAGS) -shared -soname libblas.so.3 \ - -o $@ --whole-archive libptf77blas.a --no-whole-archive \ - $(F77SYSLIB) -L. -latlas - -libstcblas.so : libcblas.a libatlas.so libblas.so - ld $(ORIGLDFLAGS) $(LDFLAGS) -shared -soname libstcblas.so \ - -o $@ --whole-archive libcblas.a -L. -latlas -lblas - -libptcblas.so : libptcblas.a libatlas.so libblas.so - ld $(ORIGLDFLAGS) $(LDFLAGS) -shared -soname libcblas.so \ - -o $@ --whole-archive libptcblas.a -L. -latlas -lblas - -libblas.so: libptf77blas.so - ln -s $< $@ - -libcblas.so: libptcblas.so - ln -s $< $@ - -liblapack.so.3.6.1 : liblapack.a libstcblas.so libf77blas.so - ld $(ORIGLDFLAGS) $(LDFLAGS) -shared -soname libstlapack.so.3 \ - -o $@ --whole-archive liblapack.a --no-whole-archive \ - $(F77SYSLIB) -L. -lstcblas -lf77blas - -libptlapack.so.3.6.1 : libptlapack.a libcblas.so libblas.so - ld $(ORIGLDFLAGS) $(LDFLAGS) -shared -soname liblapack.so.3 \ - -o $@ --whole-archive libptlapack.a --no-whole-archive \ - $(F77SYSLIB) -L. 
-lcblas -lblas - -liblapack.so.3: libptlapack.so.3.6.1 - ln -s $< $@ diff --git a/reproduce/src/make/dependencies-atlas-single.mk b/reproduce/src/make/dependencies-atlas-single.mk deleted file mode 100755 index dde2926..0000000 --- a/reproduce/src/make/dependencies-atlas-single.mk +++ /dev/null @@ -1,54 +0,0 @@ -# Rules to build ATLAS shared libraries for single threads on GNU/Linux -# -# ------------------------------------------------------------------------ -# !!!!! IMPORTANT NOTES !!!!! -# -# This Makefile will be run during the initial `./configure' script. It is -# not included into the reproduction pipe after that. -# -# ------------------------------------------------------------------------ -# -# Copyright (C) 2019 Mohammad Akhlaghi <mohammad@akhlaghi.org> -# -# This Makefile is free software: you can redistribute it and/or modify it -# under the terms of the GNU General Public License as published by the -# Free Software Foundation, either version 3 of the License, or (at your -# option) any later version. -# -# This Makefile is distributed in the hope that it will be useful, but -# WITHOUT ANY WARRANTY; without even the implied warranty of -# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General -# Public License for more details. -# -# A copy of the GNU General Public License is available at -# <http://www.gnu.org/licenses/>. - -ORIGLDFLAGS := $(LDFLAGS) - -include Make.inc - -all: libatlas.so libf77blas.so libcblas.so libblas.so liblapack.so.3.6.1 - -libatlas.so: libatlas.a - ld $(ORIGLDFLAGS) $(LDFLAGS) -shared -soname $@ -o $@ \ - --whole-archive libatlas.a --no-whole-archive -lc $(LIBS) - -libf77blas.so : libf77blas.a libatlas.so - ld $(ORIGLDFLAGS) $(LDFLAGS) -shared -soname libblas.so.3 \ - -o $@ --whole-archive libf77blas.a --no-whole-archive \ - $(F77SYSLIB) -L. -latlas - -libcblas.so : libcblas.a libatlas.so libblas.so - ld $(ORIGLDFLAGS) $(LDFLAGS) -shared -soname $@ -o $@ \ - --whole-archive libcblas.a -L. -latlas -lblas - -libblas.so: libf77blas.so - ln -s $< $@ - -liblapack.so.3.6.1 : liblapack.a libcblas.so libblas.so - ld $(ORIGLDFLAGS) $(LDFLAGS) -shared -soname liblapack.so.3 \ - -o $@ --whole-archive liblapack.a --no-whole-archive \ - $(F77SYSLIB) -L. -lcblas -lblas - -liblapack.so.3: liblapack.so.3.6.1 - ln -s $< $@ diff --git a/reproduce/src/make/dependencies-basic.mk b/reproduce/src/make/dependencies-basic.mk deleted file mode 100644 index b56d01d..0000000 --- a/reproduce/src/make/dependencies-basic.mk +++ /dev/null @@ -1,873 +0,0 @@ -# Build the VERY BASIC project dependencies before everything else assuming -# minimal/generic Make and Shell. -# -# ------------------------------------------------------------------------ -# !!!!! IMPORTANT NOTES !!!!! -# -# This Makefile will be run by the initial `./configure' script. It is not -# included into the reproduction pipe after that. -# -# This Makefile builds very low-level and basic tools like GNU Tar, and -# various compression programs, GNU Bash, and GNU Make. Therefore this is -# the only Makefile in the reproduction pipeline where you MUST NOT assume -# that modern GNU Bash or GNU Make are used. After this Makefile, other -# Makefiles can safely assume the fixed version of all these software. -# -# This Makefile is a very simplified version of `dependencies.mk' in the -# same directory. See there for more comments. 
-# -# ------------------------------------------------------------------------ -# -# Copyright (C) 2018-2019 Mohammad Akhlaghi <mohammad@akhlaghi.org> -# -# This Makefile is free software: you can redistribute it and/or modify it -# under the terms of the GNU General Public License as published by the -# Free Software Foundation, either version 3 of the License, or (at your -# option) any later version. -# -# This Makefile is distributed in the hope that it will be useful, but -# WITHOUT ANY WARRANTY; without even the implied warranty of -# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General -# Public License for more details. -# -# A copy of the GNU General Public License is available at -# <http://www.gnu.org/licenses/>. - - -# Top level environment -include reproduce/config/pipeline/LOCAL.mk -include reproduce/src/make/dependencies-build-rules.mk -include reproduce/config/pipeline/dependency-versions.mk - -lockdir = $(BDIR)/locks -ddir = $(BDIR)/dependencies -tdir = $(BDIR)/dependencies/tarballs -idir = $(BDIR)/dependencies/installed -ibdir = $(BDIR)/dependencies/installed/bin -ildir = $(BDIR)/dependencies/installed/lib -ibidir = $(BDIR)/dependencies/installed/version-info/bin -ilidir = $(BDIR)/dependencies/installed/version-info/lib - -# We'll need the system's PATH for making links to low-level programs we -# won't be building ourselves. -syspath := $(PATH) - -# As we build more programs, we want to use this project's built programs -# and libraries, not the host's. -export CCACHE_DISABLE := 1 -export PATH := $(ibdir):$(PATH) -export PKG_CONFIG_PATH := $(ildir)/pkgconfig -export PKG_CONFIG_LIBDIR := $(ildir)/pkgconfig -export LDFLAGS := $(rpath_command) -L$(ildir) $(LDFLAGS) -export CPPFLAGS := -I$(idir)/include $(CPPFLAGS) -export LD_LIBRARY_PATH := $(ildir):$(LD_LIBRARY_PATH) - -# Define the programs that don't depend on any other. -top-level-programs = low-level-links wget gcc -all: $(foreach p, $(top-level-programs), $(ibidir)/$(p)) - - - - - -# Tarballs -# -------- -# -# Prepare tarballs. Difference with that in `dependencies.mk': `.ONESHELL' -# is not recognized by some versions of Make (even older GNU Makes). So -# we'll have to make sure the recipe doesn't break into multiple shell -# calls (so we can preserve the variables). -# -# Software hosted at akhlaghi.org/src: As of our latest check (November -# 2018) their major release tarballs either crash or don't build on some -# systems (for example Make or Gzip), or they don't exist (for example -# Bzip2). -# -# In the first case, we used their Git repo and bootstrapped them (just -# like Gnuastro) and built the most recent tarball off of that. In the case -# of Bzip2: its webpage has expired and doesn't host the data any more. It -# is available on the link below (archive.org): -# -# https://web.archive.org/web/20180624184806/http://www.bzip.org/1.0.6/bzip2-1.0.6.tar.gz -# -# However, downloading from this link is slow (because its just a link). So -# its easier to just keep a with the others. 
-$(lockdir): | $(BDIR); mkdir $@ -downloadwrapper = ./reproduce/src/bash/download-multi-try -tarballs = $(foreach t, bash-$(bash-version).tar.gz \ - binutils-$(binutils-version).tar.lz \ - bzip2-$(bzip2-version).tar.gz \ - cert.pem \ - coreutils-$(coreutils-version).tar.xz \ - diffutils-$(diffutils-version).tar.xz \ - file-$(file-version).tar.gz \ - findutils-$(findutils-version).tar.lz \ - gawk-$(gawk-version).tar.lz \ - gcc-$(gcc-version).tar.xz \ - gmp-$(gmp-version).tar.lz \ - grep-$(grep-version).tar.xz \ - gzip-$(gzip-version).tar.gz \ - isl-$(isl-version).tar.bz2 \ - libtool-$(libtool-version).tar.xz \ - lzip-$(lzip-version).tar.gz \ - m4-$(m4-version).tar.gz \ - make-$(make-version).tar.lz \ - mpfr-$(mpfr-version).tar.xz \ - mpc-$(mpc-version).tar.gz \ - ncurses-$(ncurses-version).tar.gz \ - openssl-$(openssl-version).tar.gz \ - patchelf-$(patchelf-version).tar.gz \ - pkg-config-$(pkgconfig-version).tar.gz \ - readline-$(readline-version).tar.gz \ - sed-$(sed-version).tar.xz \ - tar-$(tar-version).tar.gz \ - wget-$(wget-version).tar.lz \ - which-$(which-version).tar.gz \ - xz-$(xz-version).tar.gz \ - zlib-$(zlib-version).tar.gz \ - , $(tdir)/$(t) ) -$(tarballs): $(tdir)/%: | $(lockdir) - if [ -f $(DEPENDENCIES-DIR)/$* ]; then \ - cp $(DEPENDENCIES-DIR)/$* $@; \ - else \ - n=$$(echo $* | sed -e's/[0-9\-]/ /g' \ - -e's/\./ /g' \ - | awk '{print $$1}' ); \ - \ - mergenames=1; \ - if [ $$n = bash ]; then w=http://ftp.gnu.org/gnu/bash; \ - elif [ $$n = binutils ]; then w=http://ftp.gnu.org/gnu/binutils; \ - elif [ $$n = bzip ]; then w=http://akhlaghi.org/src; \ - elif [ $$n = cert ]; then w=http://akhlaghi.org/src; \ - elif [ $$n = coreutils ]; then w=http://ftp.gnu.org/gnu/coreutils;\ - elif [ $$n = diffutils ]; then w=http://ftp.gnu.org/gnu/diffutils;\ - elif [ $$n = file ]; then w=ftp://ftp.astron.com/pub/file; \ - elif [ $$n = findutils ]; then w=http://akhlaghi.org/src; \ - elif [ $$n = gawk ]; then w=http://ftp.gnu.org/gnu/gawk; \ - elif [ $$n = gcc ]; then w=http://ftp.gnu.org/gnu/gcc/gcc-$(gcc-version); \ - elif [ $$n = gmp ]; then w=https://gmplib.org/download/gmp; \ - elif [ $$n = grep ]; then w=http://ftp.gnu.org/gnu/grep; \ - elif [ $$n = gzip ]; then w=http://ftp.gnu.org/gnu/gzip; \ - elif [ $$n = isl ]; then w=ftp://gcc.gnu.org/pub/gcc/infrastructure; \ - elif [ $$n = libtool ]; then w=http://ftp.gnu.org/gnu/libtool; \ - elif [ $$n = lzip ]; then w=http://download.savannah.gnu.org/releases/lzip; \ - elif [ $$n = m ]; then \ - mergenames=0; \ - w=http://akhlaghi.org/src/m4-1.4.18-patched.tar.gz; \ - elif [ $$n = make ]; then w=http://akhlaghi.org/src; \ - elif [ $$n = mpfr ]; then w=http://www.mpfr.org/mpfr-current;\ - elif [ $$n = mpc ]; then w=http://ftp.gnu.org/gnu/mpc; \ - elif [ $$n = ncurses ]; then w=http://ftp.gnu.org/gnu/ncurses; \ - elif [ $$n = openssl ]; then w=http://www.openssl.org/source; \ - elif [ $$n = patchelf ]; then w=http://nixos.org/releases/patchelf/patchelf-$(patchelf-version); \ - elif [ $$n = pkg ]; then w=http://pkg-config.freedesktop.org/releases; \ - elif [ $$n = readline ]; then w=http://ftp.gnu.org/gnu/readline; \ - elif [ $$n = sed ]; then w=http://ftp.gnu.org/gnu/sed; \ - elif [ $$n = tar ]; then w=http://ftp.gnu.org/gnu/tar; \ - elif [ $$n = wget ]; then w=http://ftp.gnu.org/gnu/wget; \ - elif [ $$n = which ]; then w=http://ftp.gnu.org/gnu/which; \ - elif [ $$n = xz ]; then w=http://tukaani.org/xz; \ - elif [ $$n = zlib ]; then w=http://www.zlib.net; \ - else \ - echo; echo; echo; \ - echo "'$$n' not a basic dependency name (for 
downloading)." \ - echo; echo; echo; \ - exit 1; \ - fi; \ - \ - if [ $$mergenames = 1 ]; then tarballurl=$$w/"$*"; \ - else tarballurl=$$w; \ - fi; \ - \ - echo "Downloading $$tarballurl"; \ - if [ -f $(ibdir)/wget ]; then \ - downloader="wget --no-use-server-timestamps -O"; \ - else \ - downloader="$(DOWNLOADER)"; \ - fi; \ - \ - touch $(lockdir)/download; \ - $(downloadwrapper) "$$downloader" $(lockdir)/download \ - $$tarballurl $@; \ - fi - - - - - -# Low-level (not built) programs -# ------------------------------ -# -# For the time being, we aren't building a local C compiler, but we'll use -# any C compiler that the system already has and just make a symbolic link -# to it. -# -# ccache: ccache acts like a wrapper over the C compiler and is made to -# avoid/speed-up compiling of identical files in a system (it is commonly -# used on large servers). It actually makes `gcc' or `g++' a symbolic link -# to itself so it can control them internally. So, for our purpose here, it -# is very annoying and can cause many complications. We thus remove any -# part of PATH of that has `ccache' in it before making symbolic links to -# the programs we are not building ourselves. -makelink = origpath="$$PATH"; \ - export PATH=$$(echo $(syspath) | tr : '\n' | grep -v ccache \ - | tr '\n' :); \ - a=$$(which $(1) 2> /dev/null); \ - if [ -e $(ibdir)/$(1) ]; then rm $(ibdir)/$(1); fi; \ - if [ x$$a = x ]; then \ - if [ "x$(strip $(2))" = xmandatory ]; then \ - echo "'$(1)' is necessary for higher-level tools."; \ - echo "Please install it for the configuration to continue."; \ - exit 1; \ - fi; \ - else \ - ln -s $$a $(ibdir)/$(1); \ - fi; \ - export PATH="$$origpath" -$(ibdir) $(ildir):; mkdir $@ -$(ibidir)/low-level-links: | $(ibdir) $(ildir) - - # The Assembler - $(call makelink,as) - - # Compiler (Cmake needs the clang compiler which we aren't building - # yet in the project). - $(call makelink,clang) - $(call makelink,clang++) - - # The linker - $(call makelink,ar) - $(call makelink,ld) - $(call makelink,nm) - $(call makelink,ps) - $(call makelink,ranlib) - - # Mac OS specific - $(call makelink,sysctl) - $(call makelink,sw_vers) - $(call makelink,dsymutil) - $(call makelink,install_name_tool) - - # On Mac OS, libtool is different compared to GNU Libtool. The - # libtool we'll build in the high-level dependencies has the - # executable name `glibtool'. - $(call makelink,libtool) - - # GNU Gettext (translate messages) - $(call makelink,msgfmt) - - # Needed by TeXLive specifically. - $(call makelink,perl) - - # Necessary libraries: - # Libdl (for dynamic loading libraries at runtime) - # POSIX Threads library for multi-threaded programs. - for l in dl pthread; do \ - rm -f $(ildir)/lib$$l*; \ - if [ -f /usr/lib/lib$$l.a ]; then \ - ln -s /usr/lib/lib$$l.* $(ildir)/; \ - fi; \ - done - - # We want this to be empty (so it doesn't interefere with the other - # files in `ibidir'. - touch $@ - - - - - - - - - - -# Level 1 (MOST BASIC): Compression programs -# ------------------------------------------ -# -# The first set of programs to be built are those that we need to unpack -# the source code tarballs of each program. First, we'll build the -# necessary programs, then we'll build GNU Tar. -$(ibidir)/gzip: $(tdir)/gzip-$(gzip-version).tar.gz - $(call gbuild, $<, gzip-$(gzip-version), static, , V=1) \ - && echo "GNU Gzip $(gzip-version)" > $@ - -# GNU Lzip: For a static build, the `-static' flag should be given to -# LDFLAGS on the command-line (not from the environment). 
-ifeq ($(static_build),yes) -lzipconf="LDFLAGS=-static" -else -lzipconf= -endif -$(ibidir)/lzip: $(tdir)/lzip-$(lzip-version).tar.gz - $(call gbuild, $<, lzip-$(lzip-version), , $(lzipconf)) \ - && echo "Lzip $(lzip-version)" > $@ - -$(ibidir)/xz: $(tdir)/xz-$(xz-version).tar.gz - $(call gbuild, $<, xz-$(xz-version), static) \ - && echo "XZ Utils $(xz-version)" > $@ - -$(ibidir)/bzip2: $(tdir)/bzip2-$(bzip2-version).tar.gz - # Bzip2 doesn't have a `./configure' script, and its Makefile - # doesn't build a shared library. So we can't use the `gbuild' - # function here and we need to take some extra steps (inspired - # from the "Linux from Scratch" guide for Bzip2): - # 1) The `sed' call is for relative installed symbolic links. - # 2) The special Makefile-libbz2_so builds shared libraries. - # - # NOTE: the major version number appears in the final symbolic - # link. - tdir=bzip2-$(bzip2-version); \ - if [ $(static_build) = yes ]; then \ - makecommand="make LDFLAGS=-static"; \ - makeshared="echo no-shared"; \ - else \ - makecommand="make"; \ - if [ x$(on_mac_os) = xyes ]; then \ - makeshared="echo no-shared"; \ - else \ - makeshared="make -f Makefile-libbz2_so"; \ - fi; \ - fi; \ - cd $(ddir) && rm -rf $$tdir && tar xf $< && cd $$tdir \ - && sed -e 's@\(ln -s -f \)$$(PREFIX)/bin/@\1@' Makefile \ - > Makefile.sed \ - && mv Makefile.sed Makefile \ - && $$makeshared \ - && cp -a libbz2* $(ildir)/ \ - && make clean \ - && $$makecommand \ - && make install PREFIX=$(idir) \ - && cd .. \ - && rm -rf $$tdir \ - && cd $(ildir) \ - && ln -fs libbz2.so.1.0 libbz2.so \ - && echo "Bzip2 $(bzip2-version)" > $@ - -# GNU Tar: When built statically, tar gives a segmentation fault on -# unpacking Bash. So we'll build it dynamically. -$(ibidir)/tar: $(tdir)/tar-$(tar-version).tar.gz \ - $(ibidir)/bzip2 \ - $(ibidir)/lzip \ - $(ibidir)/gzip \ - $(ibidir)/xz - # Since all later programs depend on Tar, the configuration will be - # stuck here, only making Tar. So its more efficient to built it on - # multiple threads (when the user's Make doesn't pass down the - # number of threads). - $(call gbuild, $<, tar-$(tar-version), , , -j$(numthreads)) \ - && echo "GNU Tar $(tar-version)" > $@ - - - - - - - - - - -# Level 2 (SECOND MOST BASIC): Bash and Make -# ------------------------------------------ -# -# GNU Make and GNU Bash are the second layer that we'll need to build the -# basic dependencies. -# -# Unfortunately Make needs dynamic linking in two instances: when loading -# objects (dynamically linked libraries), or when using the `getpwnam' -# function (for tilde expansion). The first can be disabled with -# `--disable-load', but unfortunately I don't know any way to fix the -# second. So, we'll have to build it dynamically for now. -$(ibidir)/make: $(tdir)/make-$(make-version).tar.lz \ - $(ibidir)/tar - # See Tar's comments for the `-j' option. - $(call gbuild, $<, make-$(make-version), , , -j$(numthreads)) \ - && echo "GNU Make $(make-version)" > $@ - -$(ilidir)/ncurses: $(tdir)/ncurses-$(ncurses-version).tar.gz \ - $(ibidir)/make - - # Delete the library that will be installed (so we can make sure - # the build process completed afterwards and reset the links). - rm -f $(ildir)/libncursesw* - - # Delete the (possibly existing) low-level programs that depend on - # `readline', and thus `ncurses'. Since these programs are actually - # used during the building of `ncurses', we need to delete them so - # the build process doesn't use the project's Bash and AWK, but the - # host's. 
- rm -f $(ibdir)/bash* $(ibdir)/awk* $(ibdir)/gawk* - - # Standard build process. - $(call gbuild, $<, ncurses-$(ncurses-version), static, \ - --with-shared --enable-rpath --without-normal \ - --without-debug --with-cxx-binding \ - --with-cxx-shared --enable-widec --enable-pc-files \ - --with-pkg-config=$(ildir)/pkgconfig ) - - # Unfortunately there are many problems with `ncurses' using - # "normal" (or 8-bit) characters. The standard way that will work - # is to build it with wide character mode as you see above in the - # configuration (or the `w' prefix you see below). Also, most - # programs (and in particular Bash and AWK), first look for other - # (mostly obsolete) libraries like tinfo, which define the same - # symbols. The links below address both situations: we need to fool - # higher-level packages to find this library even if they aren't - # explicitly mentioning its name correctly (as a value to `-l' at - # link time in their configure scripts). - # - # This part is taken from the Arch Linux build script[1], then - # extended to Mac thanks to Homebrew's script [2]. - # - # [1] https://git.archlinux.org/svntogit/packages.git/tree/trunk/PKGBUILD?h=packages/ncurses - # [2] https://github.com/Homebrew/homebrew-core/blob/master/Formula/ncurses.rb - # - # Since we can't have comments, in the connected script, here is a - # summary: - # - # 1. We find the actual suffix of the library, from the file that - # is not a symbolic link (starting with `-' in the output of - # `ls -l'). - # - # 2. We make symbolic links to all the "ncurses", "ncurses++", - # "form", "panel" and "menu" libraries to point to their - # "wide" (character) library. - # - # 3. We make symbolic links to the "tic" and "tinfo" libraries to - # point to the same `libncursesw' library. - # - # 4. Some programs link with "curses" (not "ncurses", notice the - # starting "n"), so we'll also make links for these to point - # to the `libncursesw' library. - # - # 5. A link is made to also be able to include files from the - # `ncurses' headers. 
- if [ x$(on_mac_os) = xyes ]; then so="dylib"; else so="so"; fi; \ - if [ -f $(ildir)/libncursesw.$$so ]; then \ - \ - sov=$$(ls -l $(ildir)/libncursesw* \ - | awk '/^-/{print $$NF}' \ - | sed -e's|'$(ildir)/libncursesw.'||'); \ - \ - cd "$(ildir)"; \ - for lib in ncurses ncurses++ form panel menu; do \ - ln -fs lib$$lib"w".$$sov lib$$lib.$$so; \ - ln -fs $(ildir)/pkgconfig/"$$lib"w.pc pkgconfig/$$lib.pc; \ - done; \ - for lib in tic tinfo; do \ - ln -fs libncursesw.$$sov lib$$lib.$$so; \ - ln -fs libncursesw.$$sov lib$$lib.$$sov; \ - ln -fs $(ildir)/pkgconfig/ncursesw.pc pkgconfig/$$lib.pc; \ - done; \ - ln -fs libncursesw.$$sov libcurses.$$so; \ - ln -fs libncursesw.$$sov libcursesw.$$sov; \ - ln -fs $(ildir)/pkgconfig/ncursesw.pc pkgconfig/curses.pc; \ - ln -fs $(ildir)/pkgconfig/ncursesw.pc pkgconfig/cursesw.pc; \ - \ - ln -fs $(idir)/include/ncursesw $(idir)/include/ncurses; \ - echo "GNU NCURSES $(ncurses-version)" > $@; \ - else \ - exit 1; \ - fi - -$(ilidir)/readline: $(tdir)/readline-$(readline-version).tar.gz \ - $(ilidir)/ncurses - $(call gbuild, $<, readline-$(readline-version), static, \ - --with-curses --disable-install-examples, \ - SHLIB_LIBS="-lncursesw" ) \ - && echo "GNU Readline $(readline-version)" > $@ - -$(ibidir)/patchelf: $(tdir)/patchelf-$(patchelf-version).tar.gz \ - $(ibidir)/make - $(call gbuild, $<, patchelf-$(patchelf-version), static) \ - && echo "PatchELF $(patchelf-version)" > $@ - - -# IMPORTANT: Even though we have enabled `rpath', Bash doesn't write the -# absolute adddress of the libraries it depends on! Therefore, if we -# configure Bash with `--with-installed-readline' (so the installed version -# of Readline, that we build below as a prerequisite or AWK, is used) and -# you run `ldd $(ibdir)/bash' on the resulting binary, it will say that it -# is linking with the system's `readline'. But if you run that same command -# within a rule in this project, you'll see that it is indeed linking with -# our own built readline. -ifeq ($(on_mac_os),yes) -needpatchelf = -else -needpatchelf = $(ibidir)/patchelf -endif -$(ibidir)/bash: $(tdir)/bash-$(bash-version).tar.gz \ - $(ilidir)/readline \ - $(needpatchelf) - - # Delete the (possibly) existing Bash executable. - rm -f $(ibdir)/bash - - # Build Bash. Note that we aren't building Bash with - # `--with-installed-readline'. This is because (as described above) - # Bash needs the `LD_LIBRARY_PATH' set properly before it is - # run. Within a recipe, things are fine (we do set - # `LD_LIBRARY_PATH'). However, Make will also call the shell - # outside of the recipe (for example in the `foreach' Make - # function!). In such cases, our new `LD_LIBRARY_PATH' is not set. - # This will cause a crash in the shell and thus the Makefile, - # complaining that it can't find `libreadline'. Therefore, even - # though we build readline below, we won't link Bash with an - # external readline. - # - # Bash has many `--enable' features which are already enabled by - # default. As described in the manual, they are mainly useful when - # you disable them all with `--enable-minimal-config' and enable a - # subset using the `--enable' options. - if [ "x$(static_build)" = xyes ]; then stopt="--enable-static-link";\ - else stopt=""; \ - fi; \ - $(call gbuild, $<, bash-$(bash-version),, \ - --with-installed-readline=$(ildir) $$stopt ) - - # Atleast on GNU/Linux systems, Bash doesn't include RPATH by - # default. 
So, we have to manually include it, currently we are - # only doing this on GNU/Linux systems (using the `patchelf' - # program). - if [ "x$(needpatchelf)" != x ]; then \ - if [ -f $(ibdir)/bash ]; then \ - $(ibdir)/patchelf --set-rpath $(ildir) $(ibdir)/bash; fi \ - fi - - # To be generic, some systems use the `sh' command to call the - # shell. By convention, `sh' is just a symbolic link to the - # preferred shell executable. So we'll define `$(ibdir)/sh' as a - # symbolic link to the Bash that we just built and installed. - # - # Just to be sure that the installation step above went well, - # before making the link, we'll see if the file actually exists - # there. - if [ -f $(ibdir)/bash ]; then \ - ln -fs $(ibdir)/bash $(ibdir)/sh; \ - echo "GNU Bash $(bash-version)" > $@; \ - else \ - echo "GNU Bash not built!"; exit 1; fi - - - - - -# Downloader -# ---------- -# -# Some programs (like Wget and CMake) that use zlib need it to be dynamic -# so they use our custom build. So we won't force a static-only build. -# -# Note for a static-only build: Zlib's `./configure' doesn't use Autoconf's -# configure script, it just accepts a direct `--static' option. -$(idir)/etc:; mkdir $@ -$(ilidir)/zlib: $(tdir)/zlib-$(zlib-version).tar.gz \ - $(ibidir)/bash - $(call gbuild, $<, zlib-$(zlib-version)) \ - && echo "Zlib $(zlib-version)" > $@ - -# OpenSSL: Some programs/libraries later need dynamic linking. So we'll -# build libssl (and libcrypto) dynamically also. -# -# Until we find a nice and generic way to create an updated CA file in the -# project, the certificates will be available in a file for this pipeline -# along with the other tarballs. -# -# In case you do want a static OpenSSL and libcrypto, then uncomment the -# following conditional and put $(openssl-static) in the configure options. -# -#ifeq ($(static_build),yes) -#openssl-static = no-dso no-dynamic-engine no-shared -#endif -$(ilidir)/openssl: $(tdir)/openssl-$(openssl-version).tar.gz \ - $(tdir)/cert.pem \ - $(ilidir)/zlib | $(idir)/etc - # According to OpenSSL's Wiki (link bellow), it can't automatically - # detect Mac OS's structure. It will need some help. So we'll use - # the `on_mac_os' Make variable that we defined in the configure - # script and help it with some extra configuration options and an - # environment variable. - # - # https://wiki.openssl.org/index.php/Compilation_and_Installation - if [ x$(on_mac_os) = xyes ]; then \ - export KERNEL_BITS=64; \ - copt="shared no-ssl2 no-ssl3 enable-ec_nistp_64_gcc_128"; \ - fi; \ - $(call gbuild, $<, openssl-$(openssl-version), , \ - zlib \ - $$copt \ - $(rpath_command) \ - --openssldir=$(idir)/etc/ssl \ - --with-zlib-lib=$(ildir) \ - --with-zlib-include=$(idir)/include, , , \ - ./config ) && \ - cp $(tdir)/cert.pem $(idir)/etc/ssl/cert.pem; \ - if [ $$? = 0 ]; then \ - if [ x$(on_mac_os) = xyes ]; then \ - echo "No need to fix rpath in libssl"; \ - else \ - patchelf --set-rpath $(ildir) $(ildir)/libssl.so; \ - fi; \ - echo "OpenSSL $(openssl-version)" > $@; \ - fi - -# GNU Wget -# -# Note that on some systems (for example GNU/Linux) Wget needs to explicity -# link with `libdl', but on others (for example Mac OS) it doesn't. We -# check this at configure time and define the `needs_ldl' variable. -# -# Also note that since Wget needs to load outside libraries dynamically, it -# gives a segmentation fault when built statically. -# -# There are many network related libraries that we are currently not -# building as part of this project. 
So to avoid too much dependency on the -# host system (especially a crash when these libraries are updated on the -# host), they are disabled here. -$(ibidir)/wget: $(tdir)/wget-$(wget-version).tar.lz \ - $(ibidir)/pkg-config \ - $(ilidir)/openssl - libs="-pthread"; \ - if [ x$(needs_ldl) = xyes ]; then libs="$$libs -ldl"; fi; \ - $(call gbuild, $<, wget-$(wget-version), , \ - LIBS="$$LIBS $$libs" \ - --with-libssl-prefix=$(idir) \ - --with-ssl=openssl \ - --with-openssl=yes \ - --without-metalink \ - --without-libuuid \ - --without-libpsl \ - --without-libidn \ - --disable-pcre2 \ - --disable-pcre \ - --disable-iri ) \ - && echo "GNU Wget $(wget-version)" > $@ - - - - - -# Basic command-line tools and their dependencies -# ----------------------------------------------- -# -# These are basic programs which are commonly necessary in the build -# process of the higher-level programs and libraries. Note that during the -# building of those higher-level programs (after this Makefile finishes), -# there is no access to the system's PATH. -$(ibidir)/coreutils: $(tdir)/coreutils-$(coreutils-version).tar.xz \ - $(ilidir)/openssl - # Coreutils will use the hashing features of OpenSSL's `libcrypto'. - # See Tar's comments for the `-j' option. - $(call gbuild, $<, coreutils-$(coreutils-version), static, \ - LDFLAGS="$(LDFLAGS)" CPPFLAGS="$(CPPFLAGS)" \ - --enable-rpath --disable-silent-rules --with-openssl, \ - -j$(numthreads)) \ - && echo "GNU Coreutils $(coreutils-version)" > $@ - -$(ibidir)/diffutils: $(tdir)/diffutils-$(diffutils-version).tar.xz \ - $(ibidir)/bash - $(call gbuild, $<, diffutils-$(diffutils-version), static, , V=1) \ - && echo "GNU Diffutils $(diffutils-version)" > $@ - -$(ibidir)/findutils: $(tdir)/findutils-$(findutils-version).tar.lz \ - $(ibidir)/bash - $(call gbuild, $<, findutils-$(findutils-version), static, , V=1) \ - && echo "GNU Findutils $(findutils-version)" > $@ - -$(ibidir)/gawk: $(tdir)/gawk-$(gawk-version).tar.lz \ - $(ibidir)/bash \ - $(ilidir)/mpfr \ - $(ilidir)/gmp - # AWK doesn't include RPATH by default, so we'll have to manually - # include it using the `patchelf' program (which was a dependency - # of Bash). Just note that AWK produces two executables (for - # example `gawk-4.2.1' and `gawk') and a symbolic link `awk' to one - # of those executables. - $(call gbuild, $<, gawk-$(gawk-version), static, \ - --with-readline=$(idir)) \ - && if [ "x$(needpatchelf)" != x ]; then \ - if [ -f $(ibdir)/gawk ]; then \ - $(ibdir)/patchelf --set-rpath $(ildir) $(ibdir)/gawk; \ - fi; \ - if [ -f $(ibdir)/gawk-$(gawk-version) ]; then \ - $(ibdir)/patchelf --set-rpath $(ildir) \ - $(ibdir)/gawk-$(gawk-version); \ - fi; \ - fi \ - && echo "GNU AWK $(gawk-version)" > $@ - -$(ilidir)/gmp: $(tdir)/gmp-$(gmp-version).tar.lz \ - $(ibidir)/bash - $(call gbuild, $<, gmp-$(gmp-version), static, , , make check) \ - && echo "GNU Multiple Precision Arithmetic Library $(gmp-version)" > $@ - -# On Mac OS, libtool does different things, so to avoid confusion, we'll -# prefix GNU's libtool executables with `glibtool'. 
-$(ibidir)/glibtool: $(tdir)/libtool-$(libtool-version).tar.xz \ - $(ibidir)/m4 - $(call gbuild, $<, libtool-$(libtool-version), static, \ - --program-prefix=g) \ - && echo "GNU Libtool $(libtool-version)" > $@ - -$(ibidir)/grep: $(tdir)/grep-$(grep-version).tar.xz \ - $(ibidir)/bash - $(call gbuild, $<, grep-$(grep-version), static) \ - && echo "GNU Grep $(grep-version)" > $@ - -$(ibidir)/m4: $(tdir)/m4-$(m4-version).tar.gz \ - $(ibidir)/bash - $(call gbuild, $<, m4-$(m4-version), static) \ - && echo "GNU M4 $(m4-version)" > $@ - -$(ilidir)/mpfr: $(tdir)/mpfr-$(mpfr-version).tar.xz \ - $(ilidir)/gmp - $(call gbuild, $<, mpfr-$(mpfr-version), static, , , make check) \ - && echo "GNU Multiple Precision Floating-Point Reliably $(mpfr-version)" > $@ - -$(ibidir)/pkg-config: $(tdir)/pkg-config-$(pkgconfig-version).tar.gz \ - $(ibidir)/bash - # Some Mac OS systems may have a version of the GNU C Compiler - # (GCC) installed that doesn't support some necessary features of - # building Glib (as part of pkg-config). So to be safe, for Mac - # systems, we'll make sure it will use LLVM's Clang. - if [ x$(on_mac_os) = xyes ]; then export compiler="CC=clang"; \ - else export compiler=""; \ - fi; \ - $(call gbuild, $<, pkg-config-$(pkgconfig-version), static, \ - $$compiler --with-internal-glib \ - --with-pc-path=$(ildir)/pkgconfig) \ - && echo "pkg-config $(pkgconfig-version)" > $@ - -$(ibidir)/sed: $(tdir)/sed-$(sed-version).tar.xz \ - $(ibidir)/bash - $(call gbuild, $<, sed-$(sed-version), static) \ - && echo "GNU Sed $(sed-version)" > $@ - -$(ibidir)/which: $(tdir)/which-$(which-version).tar.gz \ - $(ibidir)/bash - $(call gbuild, $<, which-$(which-version), static) \ - && echo "GNU Which $(which-version)" > $@ - - - - - - - - - - -# GCC and its prerequisites -# ------------------------- -# -# Binutils' linker `ld' is apparently only good for GNU/Linux systems and -# other OSs have their own. So for now we aren't actually building -# Binutils (`ld' isn't a prerequisite of GCC). -$(ibidir)/binutils: $(tdir)/binutils-$(binutils-version).tar.lz - $(call gbuild, $<, binutils-$(binutils-version), static) \ - && echo "GNU Binutils $(binutils-version)" > $@ - -# `file' is not a prerequisite of GCC. However, since it is low level, it is -# set as a prerequisite of GCC to have it installed. -$(ibidir)/file: $(tdir)/file-$(file-version).tar.gz - $(call gbuild, $<, file-$(file-version), static) \ - && echo "File $(file-version)" > $@ - -$(ilidir)/isl: $(tdir)/isl-$(isl-version).tar.bz2 \ - $(ilidir)/gmp - $(call gbuild, $<, isl-$(isl-version), static) \ - && echo "GNU Integer Set Library $(isl-version)" > $@ - -$(ilidir)/mpc: $(tdir)/mpc-$(mpc-version).tar.gz \ - $(ilidir)/mpfr - $(call gbuild, $<, mpc-$(mpc-version), static, , , make check) \ - && echo "GNU Multiple Precision Complex library" > $@ - -# We are having issues with `libiberty' (part of GCC) on Mac. So for now, -# GCC won't be built there. Since almost no natural science paper's -# processing depends so strongly on the compiler used, for now, this isn't -# a bad assumption, but we are indeed searching for a solution. -# -# Based on the GCC manual, the GCC build can benefit from a GNU -# environment. So, we'll build GCC after building all the basic tools that -# are often used in a configure and build scripts of GCC components. -# -# Objective C and Objective C++ is necessary for installing `matplotlib'. 
-# -# We are currently having problems installing GCC on macOS, so for the time -# being, if the project is being run on a macOS, we'll just set a link. -ifeq ($(host_cc),1) -gcc-prerequisites = -else -gcc-prerequisites = $(tdir)/gcc-$(gcc-version).tar.xz \ - $(ilidir)/isl \ - $(ilidir)/mpc -endif -$(ibidir)/gcc: $(gcc-prerequisites) \ - $(ibidir)/sed \ - $(ibidir)/bash \ - $(ibidir)/file \ - $(ibidir)/gawk \ - $(ibidir)/grep \ - $(ibidir)/which \ - $(ibidir)/glibtool \ - $(ibidir)/coreutils \ - $(ibidir)/diffutils \ - $(ibidir)/findutils - - # GCC builds is own libraries in '$(idir)/lib64'. But all other - # libraries are in '$(idir)/lib'. Since this project is only for a - # single architecture, we can trick GCC into building its libraries - # in '$(idir)/lib' by defining the '$(idir)/lib64' as a symbolic - # link to '$(idir)/lib'. - if [ $(host_cc) = 1 ]; then \ - $(call makelink,gcc); \ - $(call makelink,g++,mandatory); \ - $(call makelink,gfortran,mandatory); \ - ccinfo=$$(gcc --version | awk 'NR==1'); \ - echo "C compiler (""$$ccinfo"")" > $@; \ - else \ - rm -f $(ibdir)/gcc* $(ibdir)/g++ $(ibdir)/gfortran $(ibdir)/gcov*;\ - rm -rf $(ildir)/gcc $(ildir)/libcc* $(ildir)/libgcc*; \ - rm -rf $(ildir)/libgfortran* $(ildir)/libstdc* rm $(idir)/x86_64*;\ - \ - ln -fs $(ildir) $(idir)/lib64; \ - \ - cd $(ddir); \ - rm -rf gcc-build gcc-$(gcc-version); \ - tar xf $< \ - && mkdir $(ddir)/gcc-build \ - && cd $(ddir)/gcc-build \ - && ../gcc-$(gcc-version)/configure SHELL=$(ibdir)/bash \ - --prefix=$(idir) \ - --with-mpc=$(idir) \ - --with-mpfr=$(idir) \ - --with-gmp=$(idir) \ - --with-isl=$(idir) \ - --with-build-time-tools=$(idir) \ - --enable-shared \ - --disable-multilib \ - --disable-multiarch \ - --enable-threads=posix \ - --with-local-prefix=$(idir) \ - --enable-languages=c,c++,fortran,objc,obj-c++ \ - --disable-libada \ - --disable-nls \ - --enable-default-pie \ - --enable-default-ssp \ - --enable-cet=auto \ - --enable-decimal-float \ - && make SHELL=$(ibdir)/bash -j$$(nproc) \ - && make SHELL=$(ibdir)/bash install \ - && cd .. \ - && rm -rf gcc-build gcc-$(gcc-version) \ - \ - && if [ "x$(on_mac_os)" != xyes ]; then \ - for f in $$(find $(idir)/libexec/gcc); do \ - if ldd $$f &> /dev/null; then \ - patchelf --set-rpath $(ildir) $$f; \ - fi; \ - done; \ - fi \ - && echo "GNU Compiler Collection (GCC) $(gcc-version)" > $@; \ - fi diff --git a/reproduce/src/make/dependencies-build-rules.mk b/reproduce/src/make/dependencies-build-rules.mk deleted file mode 100644 index a8c8731..0000000 --- a/reproduce/src/make/dependencies-build-rules.mk +++ /dev/null @@ -1,120 +0,0 @@ -# Generic configurable recipes to build packages with GNU Build system or -# CMake. This is Makefile is not intended to be run directly, it will be -# imported into `dependencies-basic.mk' and `dependencies.mk'. They should -# be activated with Make's `Call' function. -# -# Copyright (C) 2018-2019 Mohammad Akhlaghi <mohammad@akhlaghi.org> -# -# This Makefile is free software: you can redistribute it and/or modify it -# under the terms of the GNU General Public License as published by the -# Free Software Foundation, either version 3 of the License, or (at your -# option) any later version. -# -# This Makefile is distributed in the hope that it will be useful, but -# WITHOUT ANY WARRANTY; without even the implied warranty of -# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General -# Public License for more details. 
-# -# A copy of the GNU General Public License is available at -# <http://www.gnu.org/licenses/>. - - - - - -# IMPORTANT note -# -------------- -# -# Without using `&&', if a step fails, the process will continue. However, -# in the `if' statements, we need `;' (particularly between `]' and -# `then'). So we need to put any necessary checks at the start, then when -# we start the process, every command will be separated by an `&&'. - - - - - -# GNU Build system -# ---------------- -# -# Arguments: -# 1: Tarball full address. -# 2: Directory name after unpacking. -# 3: Set to `static' for a static build. -# 4: Extra configuration options. -# 5: Extra options/arguments to pass to Make. -# 6: Step to run between `make' and `make install': usually `make check'. -# 7: The configuration script (`configure' by default). -# 8: Arguments for `make install'. -# -# NOTE: Unfortunately the configure script of `zlib' doesn't recognize -# `SHELL'. So we'll have to remove it from the call to the configure -# script. -# -# NOTE: A program might not contain any configure script. In this case, -# we'll just pass a non-relevant function like `pwd'. So SED should be used -# to modify `confscript' or to set `configop'. -gbuild = if [ x$(static_build) = xyes ] && [ "x$(3)" = xstatic ]; then \ - export LDFLAGS="$$LDFLAGS -static"; \ - fi; \ - check="$(6)"; \ - if [ x"$$check" = x ]; then check="echo Skipping-check"; fi; \ - cd $(ddir); rm -rf $(2); \ - if ! tar xf $(1); then echo; echo "Tar error"; exit 1; fi; \ - cd $(2); \ - \ - if [ x"$(strip $(7))" = x ]; then confscript=./configure; \ - else confscript="$(strip $(7))"; \ - fi; \ - \ - if [ -f $(ibdir)/bash ]; then \ - if [ -f $$confscript ]; then \ - sed -e's|\#\! /bin/sh|\#\! $(ibdir)/bash|' \ - -e's|\#\!/bin/sh|\#\! $(ibdir)/bash|' \ - $$confscript > $$confscript-tmp; \ - mv $$confscript-tmp $$confscript; \ - chmod +x $$confscript; \ - fi; \ - shellop="SHELL=$(ibdir)/bash"; \ - elif [ -f /bin/bash ]; then shellop="SHELL=/bin/bash"; \ - else shellop="SHELL=/bin/sh"; \ - fi; \ - \ - if [ -f $$confscript ]; then \ - if [ x"$(strip $(2))" = x"zlib-$(zlib-version)" ]; then \ - configop="--prefix=$(idir)"; \ - else configop="$$shellop --prefix=$(idir)"; \ - fi; \ - fi; \ - \ - echo; echo "Using '$$confscript' to configure:"; echo; \ - echo "$$confscript $(4) $$configop"; echo; \ - $$confscript $(4) $$configop && \ - make "$$shellop" $(5) && \ - $$check && \ - make "$$shellop" install $(8) && \ - cd .. && rm -rf $(2) - - - - -# CMake -# ----- -# -# According to the link below, in CMake `/bin/sh' is hardcoded, so there is -# no way to change it. -# -# https://stackoverflow.com/questions/21167014/how-to-set-shell-variable-in-makefiles-generated-by-cmake -cbuild = if [ x$(static_build) = xyes ] && [ $(3)x = staticx ]; then \ - export LDFLAGS="$$LDFLAGS -static"; \ - opts="-DBUILD_SHARED_LIBS=OFF"; \ - fi; \ - cd $(ddir) && rm -rf $(2) && tar xf $(1) && cd $(2) && \ - rm -rf project-build && mkdir project-build && \ - cd project-build && \ - cmake .. -DCMAKE_LIBRARY_PATH=$(ildir) \ - -DCMAKE_INSTALL_PREFIX=$(idir) \ - -DCMAKE_VERBOSE_MAKEFILE:BOOL=ON $$opts $(4) && \ - make && make install && \ - cd ../.. && \ - rm -rf $(2) diff --git a/reproduce/src/make/dependencies-python.mk b/reproduce/src/make/dependencies-python.mk deleted file mode 100644 index 837b0ad..0000000 --- a/reproduce/src/make/dependencies-python.mk +++ /dev/null @@ -1,506 +0,0 @@ -# Build the project's Python dependencies. 
-# -# ------------------------------------------------------------------------ -# !!!!! IMPORTANT NOTES !!!!! -# -# This Makefile will be run by the initial `./configure' script. It is not -# included into the reproduction pipe after that. -# -# ------------------------------------------------------------------------ -# -# Copyright (C) 2019 Raul Infante-Sainz <infantesainz@gmail.com> -# Copyright (C) 2019 Mohammad Akhlaghi <mohammad@akhlaghi.org> -# -# This Makefile is free software: you can redistribute it and/or modify it -# under the terms of the GNU General Public License as published by the -# Free Software Foundation, either version 3 of the License, or (at your -# option) any later version. -# -# This Makefile is distributed in the hope that it will be useful, but -# WITHOUT ANY WARRANTY; without even the implied warranty of -# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General -# Public License for more details. -# -# A copy of the GNU General Public License is available at -# <http://www.gnu.org/licenses/>. - - - - - -# Python enviroment -# ----------------- -# -# The main Python environment variable is `PYTHONPATH'. However, so far we -# have found several other Python-related environment variables on some -# systems which might interfere. To be safe, we are removing all their -# values. -export PYTHONPATH := $(installdir)/lib/python/site-packages -export PYTHONPATH2 := $(PYTHONPATH) -export PYTHONPATH3 := $(PYTHONPATH) -export _LMFILES_ := -export LOADEDMODULES := -export MPI_PYTHON_SITEARCH := -export MPI_PYTHON2_SITEARCH := -export MPI_PYTHON3_SITEARCH := - - - - - -# Tarballs -# -------- -# -# All the necessary tarballs are defined and prepared with this rule. -# -# Note that we want the tarballs to follow the convention of NAME-VERSION -# before the `tar.XX' prefix. For those programs that don't follow this -# convention, but include the name/version in their tarball names with -# another format, we'll do the modification before the download so the -# downloaded file has our desired format. 
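As an illustration of this renaming convention (before the actual tarball list that follows), the sketch below downloads a hypothetical package `foo' whose upstream tarball is published as `Foo_src_VERSION.tgz', but stores it locally under the standardized `foo-VERSION.tar.gz' name. The package, its URL and the `foo-version' variable are made up; `$(tdir)', `$(lockdir)' and `$(downloadwrapper)' are the same definitions used by the real rule below.

$(tdir)/foo-$(foo-version).tar.gz: | $(lockdir)
	# Upstream's file name differs from our NAME-VERSION convention,
	# so give the full URL and let `wget -O' write the download under
	# our own name (the target, `$@').
	tarballurl=https://example.org/foo/Foo_src_$(foo-version).tgz
	touch $(lockdir)/download
	$(downloadwrapper) "wget --no-use-server-timestamps -O" \
	                   $(lockdir)/download $$tarballurl $@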
-pytarballs = $(foreach t, asn1crypto-$(asn1crypto-version).tar.gz \ - astroquery-$(astroquery-version).tar.gz \ - astropy-$(astropy-version).tar.gz \ - beautifulsoup4-$(beautifulsoup4-version).tar.gz \ - certifi-$(certifi-version).tar.gz \ - cffi-$(cffi-version).tar.gz \ - chardet-$(chardet-version).tar.gz \ - cryptography-$(cryptography-version).tar.gz \ - cycler-$(cycler-version).tar.gz \ - cython-$(cython-version).tar.gz \ - entrypoints-$(entrypoints-version).tar.gz \ - h5py-$(h5py-version).tar.gz \ - html5lib-$(html5lib-version).tar.gz \ - idna-$(idna-version).tar.gz \ - jeepney-$(jeepney-version).tar.gz \ - kiwisolver-$(kiwisolver-version).tar.gz \ - keyring-$(keyring-version).tar.gz \ - libffi-$(libffi-version).tar.gz \ - matplotlib-$(matplotlib-version).tar.gz \ - mpi4py-$(mpi4py-version).tar.gz \ - numpy-$(numpy-version).zip \ - pkgconfig-$(pypkgconfig-version).tar.gz \ - pip-$(pip-version).tar.gz \ - pycparser-$(pycparser-version).tar.gz \ - python-$(python-version).tar.gz \ - python-dateutil-$(python-dateutil-version).tar.gz \ - pyparsing-$(pyparsing-version).tar.gz \ - requests-$(requests-version).tar.gz \ - scipy-$(scipy-version).tar.gz \ - secretstorage-$(secretstorage-version).tar.gz \ - setuptools-$(setuptools-version).zip \ - setuptools_scm-$(setuptools_scm-version).tar.gz \ - six-$(six-version).tar.gz \ - soupsieve-$(soupsieve-version).tar.gz \ - urllib3-$(urllib3-version).tar.gz \ - webencodings-$(webencodings-version).tar.gz \ - virtualenv-$(virtualenv-version).tar.gz \ - , $(tdir)/$(t) ) -pytopurl=https://files.pythonhosted.org/packages -$(pytarballs): $(tdir)/%: - if [ -f $(DEPENDENCIES-DIR)/$* ]; then - cp $(DEPENDENCIES-DIR)/$* $@ - else - - # Convenience variable - # -------------------- - # - # `n' is just for convenience and to avoid having to repeat the - # package tarball name in the conditional to find its URL. - # - # For some packages (for example `python-dateutil', or those with - # a number or dash in their name), we need special consideration - # because the tokenization above will produce `python' as the - # first string. - if [ $* = python-dateutil-$(python-dateutil-version).tar.gz ]; then - n=dateutil - elif [ $* = h5py-$(h5py-version).tar.gz ]; then - n=h5py - - # elif [ $* = strange-tarball5name-version.tar.gz ]; then - # n=strange5-name - else - # Remove all numbers, `-' and `.' from the tarball name so we can - # search more easily only with the program name. - n=$$(echo $* | sed -e's/[0-9\-]/ /g' -e's/\./ /g' \ - | awk '{print $$1}') - fi - - # Set the top download link of the requested tarball. The ones - # that have non-standard filenames (differing from our archived - # tarball names) are treated first, then the standard ones. 
- mergenames=1 - if [ $$n = cython ]; then - mergenames=0 - hash=36/da/fcb979fc8cb486a67a013d6aefefbb95a3e19e67e49dff8a35e014046c5e - h=$(pytopurl)/$$hash/Cython-$(cython-version).tar.gz - elif [ $$n = python ]; then - mergenames=0 - h=https://www.python.org/ftp/python/$(python-version)/Python-$(python-version).tgz - elif [ $$n = libffi ]; then - mergenames=0 - h=ftp://sourceware.org/pub/libffi/libffi-$(libffi-version).tar.gz - elif [ $$n = secretstorage ]; then - mergenames=0 - hash=a6/89/df343dbc2957a317127e7ff2983230dc5336273be34f2e1911519d85aeb5 - h=$(pytopurl)/$$hash/SecretStorage-$(secretstorage-version).tar.gz - elif [ $$n = asn ]; then h=fc/f1/8db7daa71f414ddabfa056c4ef792e1461ff655c2ae2928a2b675bfed6b4 - elif [ $$n = astroquery ]; then h=61/50/a7a08f9e54d7d9d97e69433cd88231e1ad2901811c9d1ae9ac7ccaef9396 - elif [ $$n = astropy ]; then h=eb/f7/1251bf6881861f24239efe0c24cbcfc4191ccdbb69ac3e9bb740d0c23352 - elif [ $$n = beautifulsoup ]; then h=80/f2/f6aca7f1b209bb9a7ef069d68813b091c8c3620642b568dac4eb0e507748 - elif [ $$n = certifi ]; then h=55/54/3ce77783acba5979ce16674fc98b1920d00b01d337cfaaf5db22543505ed - elif [ $$n = cffi ]; then h=64/7c/27367b38e6cc3e1f49f193deb761fe75cda9f95da37b67b422e62281fcac - elif [ $$n = chardet ]; then h=fc/bb/a5768c230f9ddb03acc9ef3f0d4a3cf93462473795d18e9535498c8f929d - elif [ $$n = cryptography ]; then h=07/ca/bc827c5e55918ad223d59d299fff92f3563476c3b00d0a9157d9c0217449 - elif [ $$n = cycler ]; then h=c2/4b/137dea450d6e1e3d474e1d873cd1d4f7d3beed7e0dc973b06e8e10d32488 - elif [ $$n = entrypoints ]; then h=b4/ef/063484f1f9ba3081e920ec9972c96664e2edb9fdc3d8669b0e3b8fc0ad7c - elif [ $$n = h5py ]; then h=43/27/a6e7dcb8ae20a4dbf3725321058923fec262b6f7835179d78ccc8d98deec - elif [ $$n = html ]; then h=85/3e/cf449cf1b5004e87510b9368e7a5f1acd8831c2d6691edd3c62a0823f98f - elif [ $$n = idna ]; then h=ad/13/eb56951b6f7950cadb579ca166e448ba77f9d24efc03edd7e55fa57d04b7 - elif [ $$n = jeepney ]; then h=16/1d/74adf3b164a8d19a60d0fcf706a751ffa2a1eaa8e5bbb1b6705c92a05263 - elif [ $$n = keyring ]; then h=15/88/c6ce9509438bc02d54cf214923cfba814412f90c31c95028af852b19f9b2 - elif [ $$n = kiwisolver ]; then h=31/60/494fcce70d60a598c32ee00e71542e52e27c978e5f8219fae0d4ac6e2864 - elif [ $$n = matplotlib ]; then h=89/0c/653aec68e9cfb775c4fbae8f71011206e5e7fe4d60fcf01ea1a9d3bc957f - elif [ $$n = mpi ]; then h=55/a2/c827b196070e161357b49287fa46d69f25641930fd5f854722319d431843 - elif [ $$n = numpy ]; then h=cf/8d/6345b4f32b37945fedc1e027e83970005fc9c699068d2f566b82826515f2 - elif [ $$n = pip ]; then h=4c/4d/88bc9413da11702cbbace3ccc51350ae099bb351febae8acc85fec34f9af - elif [ $$n = pkgconfig ]; then h=6e/a9/ff67ef67217dfdf2aca847685fe789f82b931a6957a3deac861297585db6 - elif [ $$n = pycparser ]; then h=68/9e/49196946aee219aead1290e00d1e7fdeab8567783e83e1b9ab5585e6206a - elif [ $$n = pyparsing ]; then h=b9/b8/6b32b3e84014148dcd60dd05795e35c2e7f4b72f918616c61fdce83d27fc - elif [ $$n = dateutil ]; then h=ad/99/5b2e99737edeb28c71bcbec5b5dda19d0d9ef3ca3e92e3e925e7c0bb364c - elif [ $$n = requests ]; then h=52/2c/514e4ac25da2b08ca5a464c50463682126385c4272c18193876e91f4bc38 - elif [ $$n = scipy ]; then h=a9/b4/5598a706697d1e2929eaf7fe68898ef4bea76e4950b9efbe1ef396b8813a - elif [ $$n = secretstorage ]; then h=a6/89/df343dbc2957a317127e7ff2983230dc5336273be34f2e1911519d85aeb5 - elif [ $$n = setuptools ]; then h=c2/f7/c7b501b783e5a74cf1768bc174ee4fb0a8a6ee5af6afa92274ff964703e0 - elif [ $$n = setuptools_scm ]; then h=54/85/514ba3ca2a022bddd68819f187ae826986051d130ec5b972076e4f58a9f3 - elif [ 
$$n = six ]; then h=dd/bf/4138e7bfb757de47d1f4b6994648ec67a51efe58fa907c1e11e350cddfca - elif [ $$n = soupsieve ]; then h=0c/52/e9088bb9b96e2d39fc3b33fcda5b4fde9d71473536ac660a1ca9a0958a2f - elif [ $$n = urllib ]; then h=b1/53/37d82ab391393565f2f831b8eedbffd57db5a718216f82f1a8b4d381a1c1 - elif [ $$n = virtualenv ]; then h=51/aa/c395a6e6eaaedfa5a04723b6446a1df783b16cca6fec66e671cede514688 - elif [ $$n = webencodings ]; then h=0b/02/ae6ceac1baeda530866a85075641cec12989bd8d31af6d5ab4a3e8c92f47 -# elif [ $$n = strange5-name ]; then h=XXXXX - else - echo; echo; echo; - echo "'$$n' not recognized as a dependency name to download." - echo; echo; echo; - exit 1 - fi - - # Download the requested tarball. Note that some packages may not - # follow our naming convention (where the package name is merged - # with its version number). In such cases, `w' will be the full - # address, not just the top directory address. But since we are - # storing all the tarballs in one directory, we want it to have - # the same naming convention, so we'll download it to a temporary - # name, then rename that. - if [ $$mergenames = 1 ]; then tarballurl=$(pytopurl)/$$h/"$*" - else tarballurl=$$h - fi - - # Download using the script specially defined for this job. - touch $(lockdir)/download - downloader="wget --no-use-server-timestamps -O" - $(downloadwrapper) "$$downloader" $(lockdir)/download \ - $$tarballurl $@ - fi - - - - - -# Necessary programs and libraries -# -------------------------------- -# -# While this Makefile is for Python programs, in some cases, we need -# certain programs (like Python itself), or libraries for the modules. -$(ilidir)/libffi: $(tdir)/libffi-$(libffi-version).tar.gz - $(call gbuild, $<, libffi-$(libffi-version)) \ - echo "Libffi $(libffi-version)" > $@ - -$(ibidir)/python3: $(tdir)/python-$(python-version).tar.gz \ - $(ilidir)/libffi - # On Mac systems, the build complains about `clang' specific - # features, so we can't use our own GCC build here. - if [ x$(on_mac_os) = xyes ]; then \ - export CC=clang; \ - export CXX=clang++; \ - fi; \ - $(call gbuild, $<, Python-$(python-version),, \ - --without-ensurepip \ - --with-system-ffi \ - --enable-shared) \ - && v=$$(echo $(python-version) | awk 'BEGIN{FS="."} \ - {printf "%d.%d\n", $$1, $$2}') \ - && ln -s $(ildir)/python$$v $(ildir)/python \ - && rm -rf $(ipydir) \ - && mkdir $(ipydir) \ - && echo "Python $(python-version)" > $@ - - - - - -# Non-PiP Python module installation -# ---------------------------------- -# -# To build Python packages with direct access to a `setup.py' (if no direct -# access to `setup.py' is needed, pip can be used). -# Arguments of this function are the numbers -# 1) Unpack command -# 2) Package name -# 3) Unpacked directory name after unpacking the tarball -# 4) site.cfg file (optional) -# 5) Official software name.(for paper). -pybuild = cd $(ddir); rm -rf $(3); \ - if ! $(1) $(2); then echo; echo "Tar error"; exit 1; fi; \ - cd $(3); \ - if [ "x$(strip $(4))" != x ]; then \ - sed -e 's|@LIBDIR[@]|'"$(ildir)"'|' \ - -e 's|@INCDIR[@]|'"$(idir)/include"'|' \ - $(4) > site.cfg; \ - fi; \ - python3 setup.py build \ - && python3 setup.py install \ - && cd .. \ - && rm -rf $(3) \ - && echo "$(5)" > $@ - - - - - -# Python modules -# --------------- -# -# All the necessary Python modules go here. 
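For orientation before the real rules, this is the bare template such a module rule follows, shown for a hypothetical module `mymodule' (its tarball and the `mymodule-version' variable are made up). The empty fourth argument means no `site.cfg' is needed, and the fifth argument is the human-readable name and version that `pybuild' writes into the target file.

$(ipydir)/mymodule: $(tdir)/mymodule-$(mymodule-version).tar.gz \
                    $(ipydir)/setuptools
	$(call pybuild, tar xf, $<, mymodule-$(mymodule-version), ,\
	                MyModule $(mymodule-version))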
-$(ipydir)/asn1crypto: $(tdir)/asn1crypto-$(asn1crypto-version).tar.gz \ - $(ipydir)/setuptools - $(call pybuild, tar xf, $<, asn1crypto-$(asn1crypto-version), , \ - Asn1crypto $(asn1crypto-version)) - -$(ipydir)/astroquery: $(tdir)/astroquery-$(astroquery-version).tar.gz \ - $(ipydir)/beautifulsoup4 \ - $(ipydir)/html5lib \ - $(ipydir)/requests \ - $(ipydir)/astropy \ - $(ipydir)/keyring \ - $(ipydir)/numpy - $(call pybuild, tar xf, $<, astroquery-$(astroquery-version), ,\ - Astroquery $(astroquery-version)) - -$(ipydir)/astropy: $(tdir)/astropy-$(astropy-version).tar.gz \ - $(ipydir)/h5py \ - $(ipydir)/numpy \ - $(ipydir)/scipy - $(call pybuild, tar xf, $<, astropy-$(astropy-version)) \ - && cp $(dtexdir)/astropy.tex $(ictdir)/ \ - && echo "Astropy $(astropy-version) \citep{astropy2013,astropy2018}" > $@ - -$(ipydir)/beautifulsoup4: $(tdir)/beautifulsoup4-$(beautifulsoup4-version).tar.gz \ - $(ipydir)/soupsieve - $(call pybuild, tar xf, $<, beautifulsoup4-$(beautifulsoup4-version), ,\ - BeautifulSoup $(beautifulsoup4-version)) - -$(ipydir)/certifi: $(tdir)/certifi-$(certifi-version).tar.gz \ - $(ipydir)/setuptools - $(call pybuild, tar xf, $<, certifi-$(certifi-version), ,\ - Certifi $(certifi-version)) - -$(ipydir)/cffi: $(tdir)/cffi-$(cffi-version).tar.gz \ - $(ilidir)/libffi \ - $(ipydir)/pycparser - $(call pybuild, tar xf, $<, cffi-$(cffi-version), ,\ - cffi $(cffi-version)) - -$(ipydir)/chardet: $(tdir)/chardet-$(chardet-version).tar.gz \ - $(ipydir)/setuptools - $(call pybuild, tar xf, $<, chardet-$(chardet-version), ,\ - Chardet $(chardet-version)) - -$(ipydir)/cryptography: $(tdir)/cryptography-$(cryptography-version).tar.gz \ - $(ipydir)/asn1crypto \ - $(ipydir)/cffi - $(call pybuild, tar xf, $<, cryptography-$(cryptography-version), ,\ - Cryptography $(cryptography-version)) - -$(ipydir)/cycler: $(tdir)/cycler-$(cycler-version).tar.gz \ - $(ipydir)/six - $(call pybuild, tar xf, $<, cycler-$(cycler-version), ,\ - Cycler $(cycler-version)) - -$(ipydir)/cython: $(tdir)/cython-$(cython-version).tar.gz \ - $(ipydir)/setuptools - $(call pybuild, tar xf, $<, Cython-$(cython-version)) \ - && cp $(dtexdir)/cython.tex $(ictdir)/ \ - && echo "Cython $(cython-version) \citep{cython2011}" > $@ - -$(ipydir)/entrypoints: $(tdir)/entrypoints-$(entrypoints-version).tar.gz \ - $(ipydir)/setuptools - $(call pybuild, tar xf, $<, entrypoints-$(entrypoints-version), ,\ - EntryPoints $(entrypoints-version)) - -$(ipydir)/h5py: $(tdir)/h5py-$(h5py-version).tar.gz \ - $(ilidir)/hdf5 \ - $(ipydir)/cython \ - $(ipydir)/pypkgconfig \ - $(ipydir)/setuptools - #$(ipydir)/mpi4py # AFTER its problem is fixed. - #export HDF5_MPI=ON; # AFTER its problem is fixed. 
- export HDF5_DIR=$(ildir); \ - $(call pybuild, tar xf, $<, h5py-$(h5py-version), ,\ - h5py $(h5py-version)) - -$(ipydir)/html5lib: $(tdir)/html5lib-$(html5lib-version).tar.gz \ - $(ipydir)/six \ - $(ipydir)/webencodings - $(call pybuild, tar xf, $<, html5lib-$(html5lib-version), ,\ - HTML5lib $(html5lib-version)) - -$(ipydir)/idna: $(tdir)/idna-$(idna-version).tar.gz \ - $(ipydir)/setuptools - $(call pybuild, tar xf, $<, idna-$(idna-version), ,\ - idna $(idna-version)) - -$(ipydir)/jeepney: $(tdir)/jeepney-$(jeepney-version).tar.gz \ - $(ipydir)/setuptools - $(call pybuild, tar xf, $<, jeepney-$(jeepney-version), ,\ - Jeepney $(jeepney-version)) - -$(ipydir)/keyring: $(tdir)/keyring-$(keyring-version).tar.gz \ - $(ipydir)/entrypoints \ - $(ipydir)/secretstorage \ - $(ipydir)/setuptools_scm - $(call pybuild, tar xf, $<, keyring-$(keyring-version), ,\ - Keyring $(keyring-version)) - -$(ipydir)/kiwisolver: $(tdir)/kiwisolver-$(kiwisolver-version).tar.gz \ - $(ipydir)/setuptools - $(call pybuild, tar xf, $<, kiwisolver-$(kiwisolver-version), ,\ - Kiwisolver $(kiwisolver-version)) - -$(ipydir)/matplotlib: $(tdir)/matplotlib-$(matplotlib-version).tar.gz \ - $(ipydir)/cycler \ - $(ilidir)/freetype \ - $(ipydir)/kiwisolver \ - $(ipydir)/numpy \ - $(ipydir)/pyparsing \ - $(ipydir)/python-dateutil - $(call pybuild, tar xf, $<, matplotlib-$(matplotlib-version)) \ - && cp $(dtexdir)/matplotlib.tex $(ictdir)/ \ - && echo "Matplotlib $(matplotlib-version) \citep{matplotlib2007}" > $@ - -# Currently mpi4py doesn't build because of some conflict with OpenMPI: -# -# In file included from src/mpi4py.MPI.c:591, -# from src/MPI.c:4: -# src/mpi4py.MPI.c: In function '__pyx_f_6mpi4py_3MPI_del_Datatype': -# src/mpi4py.MPI.c:15094:36: error: expected expression before '_Static_assert' -# __pyx_t_1 = (((__pyx_v_ob[0]) == MPI_UB) != 0); -# -# But atleast on my system it fails. 
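The mpi4py rule just below, like the Astropy, Cython and Matplotlib rules above, uses the citation variant of this pattern: `pybuild' is called without its fifth argument, the module's citation text is copied from `$(dtexdir)' into `$(ictdir)', and the rule itself then writes the version string (with a `\citep' command) into the target. A hypothetical template (the module, its `mymodule.tex' file and BibTeX key are made up):

$(ipydir)/mymodule: $(tdir)/mymodule-$(mymodule-version).tar.gz \
                    $(ipydir)/setuptools
	$(call pybuild, tar xf, $<, mymodule-$(mymodule-version)) \
	&& cp $(dtexdir)/mymodule.tex $(ictdir)/ \
	&& echo "MyModule $(mymodule-version) \citep{mymodule}" > $@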
-$(ipydir)/mpi4py: $(tdir)/mpi4py-$(mpi4py-version).tar.gz \ - $(ipydir)/setuptools \ - $(ilidir)/openmpi - $(call pybuild, tar xf, $<, mpi4py-$(mpi4py-version)) \ - && cp $(dtexdir)/mpi4py.tex $(ictdir)/ \ - && echo "mpi4py $(mpi4py-version) \citep{mpi4py2011}" > $@ - -$(ipydir)/numpy: $(tdir)/numpy-$(numpy-version).zip \ - $(ipydir)/setuptools \ - $(ilidir)/openblas \ - $(ilidir)/fftw \ - $(ibidir)/unzip - if [ x$(on_mac_os) = xyes ]; then \ - export LDFLAGS="$(LDFLAGS) -undefined dynamic_lookup -bundle"; \ - else \ - export LDFLAGS="$(LDFLAGS) -shared"; \ - fi; \ - conf="$$(pwd)/reproduce/config/pipeline/dependency-numpy-scipy.cfg"; \ - $(call pybuild, unzip, $<, numpy-$(numpy-version),$$conf, \ - Numpy $(numpy-version)) \ - && cp $(dtexdir)/numpy.tex $(ictdir)/ \ - && echo "Numpy $(numpy-version) \citep{numpy2011}" > $@ - -$(ibidir)/pip3: $(tdir)/pip-$(pip-version).tar.gz \ - $(ipydir)/setuptools - $(call pybuild, tar xf, $<, pip-$(pip-version), ,\ - PiP $(pip-version)) - -$(ipydir)/pypkgconfig: $(tdir)/pkgconfig-$(pypkgconfig-version).tar.gz \ - $(ipydir)/setuptools - $(call pybuild, tar xf, $<, pkgconfig-$(pypkgconfig-version), , - pkgconfig $(pypkgconfig-version)) - -$(ipydir)/pycparser: $(tdir)/pycparser-$(pycparser-version).tar.gz \ - $(ipydir)/setuptools - $(call pybuild, tar xf, $<, pycparser-$(pycparser-version), ,\ - pycparser $(pycparser-version)) - -$(ipydir)/pyparsing: $(tdir)/pyparsing-$(pyparsing-version).tar.gz \ - $(ipydir)/setuptools - $(call pybuild, tar xf, $<, pyparsing-$(pyparsing-version), ,\ - PyParsing $(pyparsing-version)) - -$(ipydir)/python-dateutil: $(tdir)/python-dateutil-$(python-dateutil-version).tar.gz \ - $(ipydir)/setuptools_scm \ - $(ipydir)/six - $(call pybuild, tar xf, $<, python-dateutil-$(python-dateutil-version), ,\ - python-dateutil $(python-dateutil-version)) - -$(ipydir)/requests: $(tdir)/requests-$(requests-version).tar.gz \ - $(ipydir)/certifi \ - $(ipydir)/chardet \ - $(ipydir)/idna \ - $(ipydir)/numpy \ - $(ipydir)/urllib3 - $(call pybuild, tar xf, $<, requests-$(requests-version), ,\ - Requests $(requests-version)) - -$(ipydir)/scipy: $(tdir)/scipy-$(scipy-version).tar.gz \ - $(ipydir)/numpy - if [ x$(on_mac_os) = xyes ]; then \ - export LDFLAGS="$(LDFLAGS) -undefined dynamic_lookup -bundle"; \ - else \ - export LDFLAGS="$(LDFLAGS) -shared"; \ - fi; \ - conf="$$(pwd)/reproduce/config/pipeline/dependency-numpy-scipy.cfg"; \ - $(call pybuild, tar xf, $<, scipy-$(scipy-version),$$conf) \ - && cp $(dtexdir)/scipy.tex $(ictdir)/ \ - && echo "Scipy $(scipy-version) \citep{scipy2007,scipy2011}" > $@ - -$(ipydir)/secretstorage: $(tdir)/secretstorage-$(secretstorage-version).tar.gz \ - $(ipydir)/cryptography \ - $(ipydir)/jeepney - $(call pybuild, tar xf, $<, SecretStorage-$(secretstorage-version), ,\ - SecretStorage $(secretstorage-version)) - -$(ipydir)/setuptools: $(tdir)/setuptools-$(setuptools-version).zip \ - $(ibidir)/python3 \ - $(ibidir)/unzip - $(call pybuild, unzip, $<, setuptools-$(setuptools-version), ,\ - Setuptools $(setuptools-version)) - -$(ipydir)/setuptools_scm: $(tdir)/setuptools_scm-$(setuptools_scm-version).tar.gz \ - $(ipydir)/setuptools - $(call pybuild, tar xf, $<, setuptools_scm-$(setuptools_scm-version), ,\ - Setuptools-scm $(setuptools_scm-version)) - -$(ipydir)/six: $(tdir)/six-$(six-version).tar.gz \ - $(ipydir)/setuptools - $(call pybuild, tar xf, $<, six-$(six-version), ,\ - Six $(six-version)) - -$(ipydir)/soupsieve: $(tdir)/soupsieve-$(soupsieve-version).tar.gz \ - $(ipydir)/setuptools - $(call pybuild, tar xf, 
$<, soupsieve-$(soupsieve-version), ,\ - SoupSieve $(soupsieve-version)) - -$(ipydir)/urllib3: $(tdir)/urllib3-$(urllib3-version).tar.gz \ - $(ipydir)/setuptools - $(call pybuild, tar xf, $<, urllib3-$(urllib3-version), ,\ - Urllib3 $(urllib3-version)) - -$(ipydir)/webencodings: $(tdir)/webencodings-$(webencodings-version).tar.gz \ - $(ipydir)/setuptools - $(call pybuild, tar xf, $<, webencodings-$(webencodings-version), ,\ - Webencodings $(webencodings-version)) diff --git a/reproduce/src/make/dependencies.mk b/reproduce/src/make/dependencies.mk deleted file mode 100644 index fd9bffa..0000000 --- a/reproduce/src/make/dependencies.mk +++ /dev/null @@ -1,718 +0,0 @@ -# Build the project's dependencies (programs and libraries). -# -# ------------------------------------------------------------------------ -# !!!!! IMPORTANT NOTES !!!!! -# -# This Makefile will be run by the initial `./configure' script. It is not -# included into the reproduction pipe after that. -# -# ------------------------------------------------------------------------ -# -# Copyright (C) 2018-2019 Mohammad Akhlaghi <mohammad@akhlaghi.org> -# -# This Makefile is free software: you can redistribute it and/or modify it -# under the terms of the GNU General Public License as published by the -# Free Software Foundation, either version 3 of the License, or (at your -# option) any later version. -# -# This Makefile is distributed in the hope that it will be useful, but -# WITHOUT ANY WARRANTY; without even the implied warranty of -# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General -# Public License for more details. -# -# A copy of the GNU General Public License is available at -# <http://www.gnu.org/licenses/>. - - - -# Top level environment -include reproduce/config/pipeline/LOCAL.mk -include reproduce/src/make/dependencies-build-rules.mk -include reproduce/config/pipeline/dependency-texlive.mk -include reproduce/config/pipeline/dependency-versions.mk - -lockdir = $(BDIR)/locks -ddir = $(BDIR)/dependencies -dtexdir = $(shell pwd)/tex/dependencies -tdir = $(BDIR)/dependencies/tarballs -idir = $(BDIR)/dependencies/installed -ibdir = $(BDIR)/dependencies/installed/bin -ildir = $(BDIR)/dependencies/installed/lib -ibidir = $(BDIR)/dependencies/installed/version-info/bin -ilidir = $(BDIR)/dependencies/installed/version-info/lib -itidir = $(BDIR)/dependencies/installed/version-info/tex -ictdir = $(BDIR)/dependencies/installed/version-info/cite -ipydir = $(BDIR)/dependencies/installed/version-info/python - -# Define the top-level programs to build (installed in `.local/bin'). -# -# About ATLAS: currently the template does not depend on ATLAS but many -# high level software depend on it. The current rule for ATLAS is tested -# successfully on Mac (only static) and GNU/Linux (shared and static). But, -# since it takes a few hours to build, it is not currently a target. -top-level-libraries = # atlas -top-level-programs = gnuastro metastore unzip zip -top-level-python = astroquery matplotlib -all: $(foreach p, $(top-level-libraries), $(ilidir)/$(p)) \ - $(foreach p, $(top-level-programs), $(ibidir)/$(p)) \ - $(foreach p, $(top-level-python), $(ipydir)/$(p)) \ - $(itidir)/texlive - -# Other basic environment settings: We are only including the host -# operating system's PATH environment variable (after our own!) for the -# compiler and linker. For the library binaries and headers, we are only -# using our internally built libraries. 
-# -# To investigate: -# -# 1) Set SHELL to `$(ibdir)/env - NAME=VALUE $(ibdir)/bash' and set all -# the parameters defined bellow as `NAME=VALUE' statements before -# calling Bash. This will enable us to completely ignore the user's -# native environment. -# -# 2) Add `--noprofile --norc' to `.SHELLFLAGS' so doesn't load the -# user's environment. -.ONESHELL: -.SHELLFLAGS := --noprofile --norc -ec -export CCACHE_DISABLE := 1 -export PATH := $(ibdir) -export SHELL := $(ibdir)/bash -export CPPFLAGS := -I$(idir)/include -export PKG_CONFIG_PATH := $(ildir)/pkgconfig -export PKG_CONFIG_LIBDIR := $(ildir)/pkgconfig -export LD_RUN_PATH := $(ildir):$(il64dir) -export LD_LIBRARY_PATH := $(ildir):$(il64dir) -export LDFLAGS := $(rpath_command) -L$(ildir) - - -# We want the download to happen on a single thread. So we need to define a -# lock, and call a special script we have written for this job. These are -# placed here because we want them both in the `dependencies.mk' and -# `dependencies-python.mk'. -$(lockdir): | $(BDIR); mkdir $@ -downloader="wget --no-use-server-timestamps -O"; -downloadwrapper = ./reproduce/src/bash/download-multi-try - - -# Python packages -include reproduce/src/make/dependencies-python.mk - - -# Tarballs -# -------- -# -# All the necessary tarballs are defined and prepared with this rule. -# -# Note that we want the tarballs to follow the convention of NAME-VERSION -# before the `tar.XX' prefix. For those programs that don't follow this -# convention, but include the name/version in their tarball names with -# another format, we'll do the modification before the download so the -# downloaded file has our desired format. -tarballs = $(foreach t, cfitsio-$(cfitsio-version).tar.gz \ - atlas-$(atlas-version).tar.bz2 \ - cmake-$(cmake-version).tar.gz \ - curl-$(curl-version).tar.gz \ - freetype-$(freetype-version).tar.gz \ - fftw-$(fftw-version).tar.gz \ - ghostscript-$(ghostscript-version).tar.gz \ - git-$(git-version).tar.xz \ - gnuastro-$(gnuastro-version).tar.lz \ - gsl-$(gsl-version).tar.gz \ - hdf5-$(hdf5-version).tar.gz \ - install-tl-unx.tar.gz \ - jpegsrc.$(libjpeg-version).tar.gz \ - lapack-$(lapack-version).tar.gz \ - libbsd-$(libbsd-version).tar.xz \ - libpng-$(libpng-version).tar.xz \ - libgit2-$(libgit2-version).tar.gz \ - metastore-$(metastore-version).tar.gz \ - openmpi-$(openmpi-version).tar.gz \ - unzip-$(unzip-version).tar.gz \ - openblas-$(openblas-version).tar.gz \ - tiff-$(libtiff-version).tar.gz \ - wcslib-$(wcslib-version).tar.bz2 \ - zip-$(zip-version).tar.gz \ - , $(tdir)/$(t) ) -$(tarballs): $(tdir)/%: | $(lockdir) - if [ -f $(DEPENDENCIES-DIR)/$* ]; then - cp $(DEPENDENCIES-DIR)/$* $@ - else - # Remove all numbers, `-' and `.' from the tarball name so we can - # search more easily only with the program name. - n=$$(echo $* | sed -e's/[0-9\-]/ /g' -e's/\./ /g' \ - | awk '{print $$1}' ) - - # Set the top download link of the requested tarball. - mergenames=1 - if [ $$n = cfitsio ]; then - mergenames=0 - v=$$(echo $(cfitsio-version) | sed -e's/\.//' \ - | awk '{l=length($$1); \ - printf (l==4 ? "%d\n" \ - : (l==3 ? "%d0\n" \ - : (l==2 ? 
"%d00\n" \ - : "%d000\n") ), $$1)}') - w=https://heasarc.gsfc.nasa.gov/FTP/software/fitsio/c/cfitsio$$v.tar.gz - elif [ $$n = atlas ]; then - mergenames=0 - w=https://sourceforge.net/projects/math-atlas/files/Stable/$(atlas-version)/atlas$(atlas-version).tar.bz2/download - elif [ $$n = cmake ]; then w=https://cmake.org/files/v3.12 - elif [ $$n = curl ]; then w=https://curl.haxx.se/download - elif [ $$n = fftw ]; then w=ftp://ftp.fftw.org/pub/fftw - elif [ $$n = freetype ]; then w=https://download.savannah.gnu.org/releases/freetype - elif [ $$n = hdf ]; then - mergenames=0 - majorver=$$(echo $(hdf5-version) | sed -e 's/\./ /g' | awk '{printf("%d.%d", $$1, $$2)}') - w=https://support.hdfgroup.org/ftp/HDF5/releases/hdf5-$$majorver/hdf5-$(hdf5-version)/src/$* - elif [ $$n = ghostscript ]; then w=https://github.com/ArtifexSoftware/ghostpdl-downloads/releases/download/gs926 - elif [ $$n = git ]; then w=http://mirrors.edge.kernel.org/pub/software/scm/git - elif [ $$n = gnuastro ]; then w=http://ftp.gnu.org/gnu/gnuastro - elif [ $$n = gsl ]; then w=http://ftp.gnu.org/gnu/gsl - elif [ $$n = install ]; then w=http://mirror.ctan.org/systems/texlive/tlnet - elif [ $$n = jpegsrc ]; then w=http://ijg.org/files - elif [ $$n = lapack ]; then w=http://www.netlib.org/lapack - elif [ $$n = libbsd ]; then w=http://libbsd.freedesktop.org/releases - elif [ $$n = libpng ]; then w=https://download.sourceforge.net/libpng - elif [ $$n = libgit ]; then - mergenames=0 - w=https://github.com/libgit2/libgit2/archive/v$(libgit2-version).tar.gz - elif [ $$n = metastore ]; then w=http://akhlaghi.org/src - elif [ $$n = openblas ]; then - mergenames=0 - w=https://github.com/xianyi/OpenBLAS/archive/v$(openblas-version).tar.gz - elif [ $$n = openmpi ]; then - mergenames=0 - majorver=$$(echo $(openmpi-version) | sed -e 's/\./ /g' | awk '{printf("%d.%d", $$1, $$2)}') - w=https://download.open-mpi.org/release/open-mpi/v$$majorver/$* - elif [ $$n = tiff ]; then w=https://download.osgeo.org/libtiff - elif [ $$n = unzip ]; then w=ftp://ftp.info-zip.org/pub/infozip/src - mergenames=0; v=$$(echo $(unzip-version) | sed -e's/\.//') - w=ftp://ftp.info-zip.org/pub/infozip/src/unzip$$v.tgz - elif [ $$n = wcslib ]; then w=ftp://ftp.atnf.csiro.au/pub/software/wcslib - elif [ $$n = zip ]; then - mergenames=0; v=$$(echo $(zip-version) | sed -e's/\.//') - w=ftp://ftp.info-zip.org/pub/infozip/src/zip$$v.tgz - else - echo; echo; echo; - echo "'$$n' not recognized as a dependency name to download." - echo; echo; echo; - exit 1 - fi - - # Download the requested tarball. Note that some packages may not - # follow our naming convention (where the package name is merged - # with its version number). In such cases, `w' will be the full - # address, not just the top directory address. But since we are - # storing all the tarballs in one directory, we want it to have - # the same naming convention, so we'll download it to a temporary - # name, then rename that. - if [ $$mergenames = 1 ]; then tarballurl=$$w/"$*" - else tarballurl=$$w - fi - - # Download using the script specially defined for this job. - touch $(lockdir)/download - downloader="wget --no-use-server-timestamps -O" - $(downloadwrapper) "$$downloader" $(lockdir)/download \ - $$tarballurl $@ - fi - - - - - -# Libraries -# --------- -# -# We would prefer to build static libraries, but some compilers like LLVM -# don't have static capabilities, so they'll only build dynamic/shared -# libraries. 
Therefore, we can't use the easy `.a' suffix for static -# libraries as targets and there are different conventions for shared -# library names. -# -# For the actual build, the same compiler that built the library will build -# the programs, so exact knowledge of the suffix is ultimately irrelevant -# for us here. So, we'll make an `$(ildir)/built' directory and make a -# simple plain text file in it with the basic library name (an no prefix) -# and create/write into it when the library is successfully built. -$(ilidir)/cfitsio: $(tdir)/cfitsio-$(cfitsio-version).tar.gz \ - $(ibidir)/curl - - # CFITSIO hard-codes the absolute address of cURL's `curl-config' - # program (which gives the necessary header and linking - # information) into the configure script. So we'll have to modify - # it manually before doing the standard build. - topdir=$(pwd); cd $(ddir); tar xf $< - customtar=cfitsio-$(cfitsio-version)-custom.tar.gz - sed cfitsio/configure \ - -e's|/usr/bin/curl-config|$(ibdir)/curl-config|g' \ - > cfitsio/configure_tmp - mv cfitsio/configure_tmp cfitsio/configure - chmod +x cfitsio/configure - tar cf $$customtar cfitsio - cd $$topdir - - # Continue the standard build on the customized tarball. - $(call gbuild, $$customtar, cfitsio, static, \ - --enable-sse2 --enable-reentrant) \ - && rm $$customtar \ - && echo "CFITSIO $(cfitsio-version)" > $@ - -$(ilidir)/gsl: $(tdir)/gsl-$(gsl-version).tar.gz - $(call gbuild, $<, gsl-$(gsl-version), static) \ - && echo "GNU Scientific Library $(gsl-version)" > $@ - -$(ilidir)/fftw: $(tdir)/fftw-$(fftw-version).tar.gz - $(call gbuild, $<, fftw-$(fftw-version), static, \ - --enable-shared) \ - && cp $(dtexdir)/fftw.tex $(ictdir)/ \ - && echo "FFTW $(fftw-version) \citep{fftw}" > $@ - -# Freetype is necessary to install matplotlib -$(ilidir)/freetype: $(tdir)/freetype-$(freetype-version).tar.gz \ - $(ilidir)/libpng - $(call gbuild, $<, freetype-$(freetype-version), static) \ - && echo "FreeType $(freetype-version)" > $@ - -$(ilidir)/hdf5: $(tdir)/hdf5-$(hdf5-version).tar.gz \ - $(ilidir)/openmpi - export CC=mpicc; \ - export FC=mpif90; \ - $(call gbuild, $<, hdf5-$(hdf5-version), static, \ - --enable-parallel \ - --enable-fortran, V=1) \ - && echo "HDF5 library $(hdf5-version)" > $@ - -$(ilidir)/libbsd: $(tdir)/libbsd-$(libbsd-version).tar.xz - $(call gbuild, $<, libbsd-$(libbsd-version), static,,V=1) \ - && echo "Libbsd $(libbsd-version)" > $@ - -$(ilidir)/libjpeg: $(tdir)/jpegsrc.$(libjpeg-version).tar.gz - $(call gbuild, $<, jpeg-9b, static) \ - && echo "Libjpeg $(libjpeg-version)" > $@ - -$(ilidir)/libpng: $(tdir)/libpng-$(libpng-version).tar.xz - $(call gbuild, $<, libpng-$(libpng-version), static) \ - && echo "Libpng $(libpng-version)" > $@ - -$(ilidir)/libtiff: $(tdir)/tiff-$(libtiff-version).tar.gz \ - $(ilidir)/libjpeg - $(call gbuild, $<, tiff-$(libtiff-version), static, \ - --disable-webp --disable-zstd) \ - && echo "Libtiff $(libtiff-version)" > $@ - -$(ilidir)/openmpi: $(tdir)/openmpi-$(openmpi-version).tar.gz - $(call gbuild, $<, openmpi-$(openmpi-version), static, , V=1) \ - && echo "OpenMPI $(openmpi-version)" > $@ - -$(ilidir)/atlas: $(tdir)/atlas-$(atlas-version).tar.bz2 \ - $(tdir)/lapack-$(lapack-version).tar.gz - - # Get the operating system specific features (how to get - # CPU frequency and the library suffixes). 
To make the steps - # more readable, the different library version suffixes are - # named with a single character: `s' for no version in the - # name, `m' for the major version suffix, and `f' for the - # full version suffix. - # GCC in Mac OS doesn't work. To work around this issue, on Mac - # systems we force ATLAS to use `clang' instead of `gcc'. - if [ x$(on_mac_os) = xyes ]; then - s=dylib - m=3.dylib - f=3.6.1.dylib - core=$$(sysctl hw.cpufrequency | awk '{print $$2/1000000}') - clangflag="--force-clang=$(ibdir)/clang" - else - s=so - m=so.3 - f=so.3.6.1 - clangflag= - core=$$(cat /proc/cpuinfo | grep "cpu MHz" \ - | head -n 1 \ - | sed "s/.*: \([0-9.]*\).*/\1/") - fi - - # See if the shared libraries should be build for a single CPU - # thread or multiple threads. - N=$$(nproc) - srcdir=$$(pwd)/reproduce/src/make - if [ $$N = 1 ]; then - sharedmk=$$srcdir/dependencies-atlas-single.mk - else - sharedmk=$$srcdir/dependencies-atlas-multiple.mk - fi - - # The linking step here doesn't recognize the `-Wl' in the - # `rpath_command'. - export LDFLAGS=-L$(ildir) - cd $(ddir) \ - && tar xf $< \ - && cd ATLAS \ - && rm -rf build \ - && mkdir build \ - && cd build \ - && ../configure -b 64 -D c -DPentiumCPS=$$core \ - --with-netlib-lapack-tarfile=$(word 2, $^) \ - --cripple-atlas-performance \ - -Fa alg -fPIC --shared $$clangflag \ - --prefix=$(idir) \ - && make \ - && if [ "x$(on_mac_os)" != xyes ]; then \ - cd lib && make -f $$sharedmk && cd .. \ - && for l in lib/*.$$s*; do \ - patchelf --set-rpath $(ildir) $$l; done \ - && cp -d lib/*.$$s* $(ildir) \ - && ln -fs $(ildir)/libblas.$$s $(ildir)/libblas.$$m \ - && ln -fs $(ildir)/libf77blas.$$s $(ildir)/libf77blas.$$m \ - && ln -fs $(ildir)/liblapack.$$f $(ildir)/liblapack.$$s \ - && ln -fs $(ildir)/liblapack.$$f $(ildir)/liblapack.$$m; \ - fi \ - && make install - - # We need to check the existance of `libptlapack.a', but we can't - # do this in the `&&' steps above (it will conflict). So we'll do - # the check after seeing if `libtatlas.so' is installed, then we'll - # finalize the build (delete the untarred directory). - if [ "x$(on_mac_os)" != xyes ]; then \ - [ -e lib/libptlapack.a ] && cp lib/libptlapack.a $(ildir); \ - cd $(ddir); \ - rm -rf ATLAS; \ - fi - - # We'll check the full installation with the static library (not - # currently building shared library on Mac. - if [ -f $(ildir)/libatlas.a ]; then \ - echo "ATLAS $(atlas-version)" > $@; \ - fi - -$(ilidir)/openblas: $(tdir)/openblas-$(openblas-version).tar.gz - if [ x$(on_mac_os) = xyes ]; then \ - export CC=clang; \ - fi; \ - cd $(ddir) \ - && tar xf $< \ - && cd OpenBLAS-$(openblas-version) \ - && make \ - && make PREFIX=$(idir) install \ - && cd .. \ - && rm -rf OpenBLAS-$(openblas-version) \ - && echo "OpenBLAS $(openblas-version)" > $@ - - - - -# Libraries with special attention on Mac OS -# ------------------------------------------ -# -# Libgit2 and WCSLIB don't set their installation path, or don't do it -# properly, in their finally installed shared libraries. But since we are -# linking everything (including OpenSSL and its dependencies) dynamically, -# we need to also make a shared libraries and can't use static -# libraries. So for Mac OS systems we have to correct their addresses -# manually. -# -# For example, Libgit2 page recommends doing a static build, especially for -# Mac systems (with `-DBUILD_SHARED_LIBS=OFF'): "It’s highly recommended -# that you build libgit2 as a static library for Xcode projects. 
This -# simplifies distribution significantly, as the resolution of dynamic -# libraries at runtime can be extremely problematic.". This is a major -# problem we have been having so far with Mac systems: -# https://libgit2.org/docs/guides/build-and-link -$(ilidir)/libgit2: $(tdir)/libgit2-$(libgit2-version).tar.gz \ - $(ibidir)/cmake \ - $(ibidir)/curl - # Build and install the library. - $(call cbuild, $<, libgit2-$(libgit2-version), static, \ - -DUSE_SSH=OFF -DBUILD_CLAR=OFF \ - -DTHREADSAFE=ON ) - - # Correct the shared library absolute address if necessary. - if [ x$(on_mac_os) = xyes ]; then - install_name_tool -id $(ildir)/libgit2.26.dylib \ - $(ildir)/libgit2.26.dylib - fi - - # Write the target file. - echo "Libgit2 $(libgit2-version)" > $@ - -$(ilidir)/wcslib: $(tdir)/wcslib-$(wcslib-version).tar.bz2 \ - $(ilidir)/cfitsio - # Build and install the library. - $(call gbuild, $<, wcslib-$(wcslib-version), , \ - LIBS="-pthread -lcurl -lm" \ - --with-cfitsiolib=$(ildir) \ - --with-cfitsioinc=$(idir)/include \ - --without-pgplot --disable-fortran) - - # Correct the shared library absolute address if necessary. - if [ x$(on_mac_os) = xyes ]; then - install_name_tool -id $(ildir)/libwcs.6.2.dylib \ - $(ildir)/libwcs.6.2.dylib; - fi - - # Write the target file. - echo "WCSLIB $(wcslib-version)" > $@ - - - - - -# Programs -# -------- -# -# CMake can be built with its custom `./bootstrap' script. -$(ibidir)/cmake: $(tdir)/cmake-$(cmake-version).tar.gz \ - $(ibidir)/curl - # After searching in `bootstrap', I couldn't find `LIBS', only - # `LDFLAGS'. So the extra libraries are being added to `LDFLAGS', - # not `LIBS'. - # - # On Mac systems, the build complains about `clang' specific - # features, so we can't use our own GCC build here. - if [ x$(on_mac_os) = xyes ]; then \ - export CC=clang; \ - export CXX=clang++; \ - fi; \ - cd $(ddir) \ - && rm -rf cmake-$(cmake-version) \ - && tar xf $< && cd cmake-$(cmake-version) \ - && ./bootstrap --prefix=$(idir) --system-curl --system-zlib\ - --system-bzip2 --system-liblzma --no-qt-gui \ - && make LIBS="$$LIBS -lssl -lcrypto -lz" VERBOSE=1 \ - && make install \ - && cd .. \ - && rm -rf cmake-$(cmake-version) \ - && echo "CMake $(cmake-version)" > $@ - -# cURL (and its library, which is needed by several programs here) can -# optionally link with many different network-related libraries on the host -# system that we are not yet building in the template. Many of these are -# not relevant to most science projects, so we are explicitly using -# `--without-XXX' or `--disable-XXX' so cURL doesn't link with them. Note -# that if it does link with them, the configuration will crash when the -# library is updated/changed by the host, and the whole purpose of this -# project is avoid dependency on the host as much as possible. 
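Whichever set of `--without-XXX'/`--disable-XXX' flags a rule ends up using, a rough way to confirm afterwards that a built program really avoids host libraries (the concern described above) is to inspect its dynamic links. The sketch below is only illustrative and GNU/Linux-only: it assumes `ldd' exists on the host (macOS would need `otool -L' instead), and the `check-host-links' target is made up, not part of the project. The actual cURL rule follows.

.PHONY: check-host-links
check-host-links: $(ibidir)/curl
	# Print any resolved shared library that is not inside this
	# project's own library directory; the dynamic loader and
	# `linux-vdso' have no `=>' path, so they are skipped.
	ldd $(ibdir)/curl \
	    | awk '$$2=="=>" {print $$3}' \
	    | grep -v '^$(ildir)' \
	    || echo "cURL only links against libraries in $(ildir)"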
-$(ibidir)/curl: $(tdir)/curl-$(curl-version).tar.gz - $(call gbuild, $<, curl-$(curl-version), , \ - LIBS="-pthread" \ - --with-zlib=$(ildir) \ - --with-ssl=$(idir) \ - --without-mesalink \ - --with-ca-fallback \ - --without-librtmp \ - --without-libidn2 \ - --without-wolfssl \ - --without-brotli \ - --without-gnutls \ - --without-cyassl \ - --without-libpsl \ - --without-axtls \ - --disable-ldaps \ - --disable-ldap \ - --without-nss, V=1) \ - && echo "cURL $(curl-version)" > $@ - -$(ibidir)/ghostscript: $(tdir)/ghostscript-$(ghostscript-version).tar.gz - $(call gbuild, $<, ghostscript-$(ghostscript-version)) \ - && echo "GPL Ghostscript $(ghostscript-version)" > $@ - -$(ibidir)/git: $(tdir)/git-$(git-version).tar.xz \ - $(ibidir)/curl - $(call gbuild, $<, git-$(git-version), static, \ - --without-tcltk --with-shell=$(ibdir)/bash, \ - V=1) \ - && echo "Git $(git-version)" > $@ - -# Metastore is used (through a Git hook) to restore the source modification -# dates of files after a Git checkout. Another Git hook saves all file -# metadata just before a commit (to allow restoration after a -# checkout). Since this project is managed in Makefiles, file modification -# dates are critical to not having to redo the whole analysis after -# checking out between branches. -# -# Note that we aren't using the standard version of Metastore, but a fork -# of it that is maintained in this repository: -# https://gitlab.com/makhlaghi/metastore-fork -# -# Libbsd is not necessary on macOS systems, because macOS is already a -# BSD-based distribution. But on GNU/Linux systems, it is necessary. -ifeq ($(on_mac_os),yes) -needlibbsd = -else -needlibbsd = $(ilidir)/libbsd -endif -$(ibidir)/metastore: $(tdir)/metastore-$(metastore-version).tar.gz \ - $(needlibbsd) \ - $(ibidir)/git - - # The build command below will change the current directory of this - # build, so we'll fix its value here. - current_dir=$$(pwd) - - # Metastore doesn't have any `./configure' script. So we'll just - # call `pwd' as a place-holder for the `./configure' command. - # - # File attributes are also not available on some systems, since the - # main purpose here is modification dates (and not attributes), - # we'll also set the `NO_XATTR' flag. - $(call gbuild, $<, metastore-$(metastore-version), static,, \ - NO_XATTR=1 V=1,,pwd,PREFIX=$(idir)) - - # Write the relevant hooks into this system's Git hooks, so Git - # calls metastore properly on every commit and every checkout. - # - # Note that the -O and -G options used here are currently only in a - # fork of `metastore' currently hosted at: - # https://github.com/mohammad-akhlaghi/metastore - user=$$(whoami) - group=$$(groups | awk '{print $$1}') - cd $$current_dir - if [ -f $(ibdir)/metastore ]; then - for f in pre-commit post-checkout; do - sed -e's|@USER[@]|'$$user'|g' \ - -e's|@GROUP[@]|'$$group'|g' \ - -e's|@BINDIR[@]|$(ibdir)|g' \ - -e's|@TOP_PROJECT_DIR[@]|'$$current_dir'|g' \ - reproduce/src/bash/git-$$f > .git/hooks/$$f - chmod +x .git/hooks/$$f - echo "Metastore (forked) $(metastore-version)" > $@ - done - else - echo; echo; echo; - echo "*****************" - echo "metastore couldn't be installed!" - echo - echo "Its used for preserving timestamps on Git commits." - echo "Its useful for development, not simple running of the project." - echo "So we won't stop the configuration because it wasn't built." 
- echo "*****************" - fi - -# The order of dependencies is based on how long they take to build (how -# large they are): Libgit2 depends on CMake which takes a VERY long time to -# build. Also, Ghostscript and GSL are relatively large packages. So when -# building in parallel, its better to have these packages start building -# early. -$(ibidir)/gnuastro: $(tdir)/gnuastro-$(gnuastro-version).tar.lz \ - $(ilidir)/gsl \ - $(ilidir)/wcslib \ - $(ilidir)/libjpeg \ - $(ilidir)/libtiff \ - $(ilidir)/libgit2 \ - $(ibidir)/ghostscript -ifeq ($(static_build),yes) - staticopts="--enable-static=yes --enable-shared=no"; -endif - $(call gbuild, $<, gnuastro-$(gnuastro-version), static, \ - $$staticopts, -j$(numthreads), \ - make check -j$(numthreads)) \ - && cp $(dtexdir)/gnuastro.tex $(ictdir)/ \ - && echo "GNU Astronomy Utilities $(gnuastro-version) \citep{gnuastro}" > $@ - -$(ibidir)/unzip: $(tdir)/unzip-$(unzip-version).tar.gz - v=$$(echo $(unzip-version) | sed -e's/\.//') - $(call gbuild, $<, unzip$$v, static,, \ - -f unix/Makefile generic_gcc \ - CFLAGS="-DBIG_MEM -DMMAP",,pwd, \ - -f unix/Makefile \ - BINDIR=$(ibdir) MANDIR=$(idir)/man/man1 ) \ - && echo "Unzip $(unzip-version)" > $@ - -$(ibidir)/zip: $(tdir)/zip-$(zip-version).tar.gz - v=$$(echo $(zip-version) | sed -e's/\.//') - $(call gbuild, $<, zip$$v, static,, \ - -f unix/Makefile generic_gcc \ - CFLAGS="-DBIG_MEM -DMMAP",,pwd, \ - -f unix/Makefile \ - BINDIR=$(ibdir) MANDIR=$(idir)/man/man1 ) \ - && echo "Zip $(zip-version)" > $@ - - - - - -# Since we want to avoid complicating the PATH, we are putting a symbolic -# link of all the TeX Live executables in $(ibdir). But symbolic links are -# hard to track for Make (as a target). Also, TeX in general is optional -# for the project (the processing is the main target, not the generation of -# the final PDF). So we'll make a simple ASCII file called -# `texlive-ready-tlmgr' and use its contents to mark if we can use it or -# not. -$(itidir)/texlive-ready-tlmgr: $(tdir)/install-tl-unx.tar.gz \ - reproduce/config/pipeline/texlive.conf - - # Unpack, enter the directory, and install based on the given - # configuration (prerequisite of this rule). - @topdir=$$(pwd) - cd $(ddir) - rm -rf install-tl-* - tar xf $(tdir)/install-tl-unx.tar.gz - cd install-tl-* - sed -e's|@installdir[@]|$(idir)|g' \ - $$topdir/reproduce/config/pipeline/texlive.conf > texlive.conf - - # TeX Live's installation may fail due to any reason. But TeX Live - # is optional (only necessary for building the final PDF). So we - # don't want the configure script to fail if it can't run. - if ./install-tl --profile=texlive.conf; then - - # Put a symbolic link of the TeX Live executables in `ibdir'. The - # main problem is that the year and build system (for example - # `x86_64-linux') are also in the directory names, making it hard - # to be generic. We are using wildcards here, but only in this - # Makefile, not in any other. - ln -fs $(idir)/texlive/20*/bin/*/* $(ibdir)/ - - # Register that the build was successful. - echo "TeX Live is ready." > $@ - else - echo "NOT!" > $@ - fi - - # Clean up - cd .. - rm -rf install-tl-* - - - - - -# To keep things modular and simple, we'll break up the installation of TeX -# Live itself (only very basic TeX and LaTeX) and the installation of its -# necessary packages into two packages. -$(itidir)/texlive: reproduce/config/pipeline/dependency-texlive.mk \ - $(itidir)/texlive-ready-tlmgr - - # To work with TeX live installation, we'll need the internet. 
- @res=$$(cat $(itidir)/texlive-ready-tlmgr) - if [ x"$$res" = x"NOT!" ]; then - echo "" > $@ - else - # Install all the extra necessary packages. If LaTeX complains - # about not finding a command/file/what-ever/XXXXXX, simply run - # the following command to find which package its in, then add it - # to the `texlive-packages' variable of the first prerequisite. - # - # ./.local/bin/tlmgr info XXXXXX - # - # We are putting a notice, because if there is no internet, - # `tlmgr' just hangs waiting. - tlmgr install $(texlive-packages) - - # Make a symbolic link of all the TeX Live executables in the bin - # directory so we don't have to modify `PATH'. - ln -fs $(idir)/texlive/20*/bin/*/* $(ibdir)/ - - # Get all the necessary versions. - texlive=$$(pdflatex --version | awk 'NR==1' | sed 's/.*(\(.*\))/\1/' \ - | awk '{print $$NF}'); - - # Package names and versions. - tlmgr info $(texlive-packages) --only-installed | awk \ - '$$1=="package:" {version=0; \ - if($$NF=="tex-gyre") name="texgyre"; \ - else name=$$NF} \ - $$1=="cat-version:" {version=$$NF} \ - $$1=="cat-date:" {if(version==0) version=$$2; \ - printf("%s %s\n", name, version)}' >> $@ - fi diff --git a/reproduce/src/make/download.mk b/reproduce/src/make/download.mk deleted file mode 100644 index dfc49da..0000000 --- a/reproduce/src/make/download.mk +++ /dev/null @@ -1,91 +0,0 @@ -# Download all the necessary inputs if they are not already present. -# -# Since most systems only have one input/connection into the network, -# downloading is essentially a serial (not parallel) operation. so the -# recipes in this Makefile all use a single file lock to have one download -# script running at every instant. -# -# Copyright (C) 2018-2019 Mohammad Akhlaghi <mohammad@akhlaghi.org> -# -# This Makefile is free software: you can redistribute it and/or modify it -# under the terms of the GNU General Public License as published by the -# Free Software Foundation, either version 3 of the License, or (at your -# option) any later version. -# -# This Makefile is distributed in the hope that it will be useful, but -# WITHOUT ANY WARRANTY; without even the implied warranty of -# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General -# Public License for more details. See <http://www.gnu.org/licenses/>. - - - - - -# Download input data -# -------------------- -# -# The input dataset properties are defined in `$(pconfdir)/INPUTS.mk'. For -# this template we only have one dataset to enable easy processing, so all -# the extra checks in this rule may seem redundant. -# -# In a real project, you will need more than one dataset. In that case, -# just add them to the target list and add an `elif' statement to define it -# in the recipe. -# -# Files in a server usually have very long names, which are mainly designed -# for helping in data-base management and being generic. Since Make uses -# file names to identify which rule to execute, and the scope of this -# research project is much less than the generic survey/dataset, it is -# easier to have a simple/short name for the input dataset and work with -# that. In the first condition of the recipe below, we connect the short -# name with the raw database name of the dataset. -# -# Download lock file: Most systems have a single connection to the -# internet, therefore downloading is inherently done in series. As a -# result, when more than one dataset is necessary for download, if they are -# done in parallel, the speed will be slower than downloading them in -# series. 
We thus use the `flock' program to tie/lock the downloading -# process with a file and make sure that only one downloading event is in -# progress at every moment. -$(indir):; mkdir $@ -downloadwrapper = $(srcdir)/bash/download-multi-try -inputdatasets = $(foreach i, wfpc2, $(indir)/$(i).fits) -$(inputdatasets): $(indir)/%.fits: | $(indir) $(lockdir) - - # Set the necessary parameters for this input file. - if [ $* = wfpc2 ]; then - origname=$(WFPC2IMAGE); url=$(WFPC2URL); mdf=$(WFPC2MD5); - else - echo; echo; echo "Not recognized input dataset: '$*.fits'." - echo; echo; exit 1 - fi - - # Download (or make the link to) the input dataset. - if [ -f $(INDIR)/$$origname ]; then - ln -s $(INDIR)/$$origname $@ - else - touch $(lockdir)/download - $(downloadwrapper) "wget --no-use-server-timestamps -O" \ - $(lockdir)/download $$url/$$origname $@ - fi - - # Check the md5 sum to see if this is the proper dataset. - sum=$$(md5sum $@ | awk '{print $$1}') - if [ $$sum != $$mdf ]; then - wrongname=$(dir $@)/wrong-$(notdir $@) - mv $@ $$wrongname - echo; echo; echo "Wrong MD5 checksum for '$$origname' in $$wrongname" - echo; echo; exit 1 - fi - - - - - -# Final TeX macro -# --------------- -# -# It is very important to mention the address where the data were -# downloaded in the final report. -$(mtexdir)/download.tex: $(pconfdir)/INPUTS.mk | $(mtexdir) - echo "\\newcommand{\\wfpctwourl}{$(WFPC2URL)}" > $@ diff --git a/reproduce/src/make/initialize.mk b/reproduce/src/make/initialize.mk deleted file mode 100644 index cd533f2..0000000 --- a/reproduce/src/make/initialize.mk +++ /dev/null @@ -1,341 +0,0 @@ -# Project initialization. -# -# Copyright (C) 2018-2019 Mohammad Akhlaghi <mohammad@akhlaghi.org> -# -# This Makefile is free software: you can redistribute it and/or modify it -# under the terms of the GNU General Public License as published by the -# Free Software Foundation, either version 3 of the License, or (at your -# option) any later version. -# -# This Makefile is distributed in the hope that it will be useful, but -# WITHOUT ANY WARRANTY; without even the implied warranty of -# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General -# Public License for more details. -# -# A copy of the GNU General Public License is available at -# <http://www.gnu.org/licenses/>. - - - - - -# High-level directory definitions -# -------------------------------- -# -# Basic directories that are used throughout the project. -# -# Locks are used to make sure that an operation is done in series not in -# parallel (even if Make is run in parallel with the `-j' option). The most -# common case is downloads which are better done in series and not in -# parallel. Also, some programs may not be thread-safe, therefore it will -# be necessary to put a lock on them. This project uses the `flock' program -# to achieve this. -texdir = $(BDIR)/tex -srcdir = reproduce/src -lockdir = $(BDIR)/locks -indir = $(BDIR)/inputs -mtexdir = $(texdir)/macros -pconfdir = reproduce/config/pipeline -installdir = $(BDIR)/dependencies/installed -# --------- Delete for no Gnuastro --------- -gconfdir = reproduce/config/gnuastro -# ------------------------------------------ - - - - - -# TeX build directory -# ------------------ -# -# In scenarios where multiple users are working on the project -# simultaneously, they can't all build the final paper together, there will -# be conflicts! 
It is possible to manage the working on the analysis, so no -# conflict is caused in that phase, but it would be very slow to only let -# one of the project members to build the paper at every instance -# (independent parts of the paper can be added to it independently). To fix -# this problem, when we are in a group setting, we'll use the user's ID to -# create a separate LaTeX build directory for each user. -# -# The same logic applies to the final paper PDF: each user will create a -# separte final PDF (for example `paper-user1.pdf' and `paper-user2.pdf') -# and no `paper.pdf' will be built. This isn't a problem because -# `initialize.tex' is a .PHONY prerequisite, so the rule to build the final -# paper is always executed (even if it is present and nothing has -# changed). So in terms of over-all efficiency and processing steps, this -# doesn't change anything. -ifeq (x$(GROUP-NAME),x) -texbdir = $(texdir)/build -final-paper = paper.pdf -else -user = $(shell whoami) -texbdir = $(texdir)/build-$(user) -final-paper = paper-$(user).pdf -endif -tikzdir = $(texbdir)/tikz - - - - - -# Original system environment -# --------------------------- -# -# Before defining the local sub-environment here, we'll need to save the -# system's environment for some scenarios (for example after `clean'ing the -# built programs). -sys-path := $(PATH) -sys-rm := $(shell which rm) -curdir := $(shell echo $$(pwd)) - - - - - -# High level environment -# ---------------------- -# -# We want the full recipe to be executed in one call to the shell. Also we -# want Make to run the specific version of Bash that we have installed -# during `./configure' time. -# -# Regarding the directories, this project builds its major dependencies -# itself and doesn't use the local system's default tools. With these -# environment variables, we are setting it to prefer the software we have -# build here. -.ONESHELL: -.SHELLFLAGS = -ec -export CCACHE_DISABLE := 1 -export PATH := $(installdir)/bin -export LD_LIBRARY_PATH := $(installdir)/lib -export LDFLAGS := -L$(installdir)/lib -export SHELL := $(installdir)/bin/bash -export CPPFLAGS := -I$(installdir)/include - - - - - -# Python enviroment -# ----------------- -# -# The main Python environment variable is `PYTHONPATH'. However, so far we -# have found several other Python-related environment variables on some -# systems which might interfere. To be safe, we are removing all their -# values. -export PYTHONPATH := $(installdir)/lib/python/site-packages -export PYTHONPATH3 := $(PYTHONPATH) -export _LMFILES_ := -export PYTHONPATH2 := -export LOADEDMODULES := -export MPI_PYTHON_SITEARCH := -export MPI_PYTHON2_SITEARCH := -export MPI_PYTHON3_SITEARCH := - - - - - -# High-level level directories -# ---------------------------- -# -# These are just the top-level directories for all the separate steps. The -# directories (or possible sub-directories) for individual steps will be -# defined and added within their own Makefiles. -# -# The `.SUFFIXES' rule with no prerequisite is defined to eliminate all the -# default implicit rules. The default implicit rules are to do with -# programming (for example converting `.c' files to `.o' files). The -# problem they cause is when you want to debug the make command with `-d' -# option: they add too many extra checks that make it hard to find what you -# are looking for in the outputs. 
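For example, once the `.SUFFIXES' line below has removed the implicit rules, a debugging run such as

	make -d 2>&1 | less

(a generic invocation; append whatever target the project is building) only traces the project's own rules, instead of hundreds of irrelevant built-in `.c'-to-`.o'-style checks.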
-.SUFFIXES: -$(lockdir): | $(BDIR); mkdir $@ -$(texbdir): | $(texdir); mkdir $@ -$(tikzdir): | $(texbdir); mkdir $@ && ln -fs $@ tex/tikz - - - - - -# High-level Makefile management -# ------------------------------ -# -# About `.PHONY': these are targets that must be built even if a file with -# their name exists. -# -# Only `$(mtexdir)/initialize.tex' corresponds to a file. This is because -# we want to ensure that the file is always built in every run: it contains -# the project version which may change between two separate runs, even when -# no file actually differs. -packagebasename := $(shell echo paper-$$(git describe --dirty --always)) -packagecontents = $(texdir)/$(packagebasename) -.PHONY: all clean dist dist-zip distclean clean-mmap $(packagecontents) \ - $(mtexdir)/initialize.tex - -# --------- Delete for no Gnuastro --------- -clean-mmap:; rm -f reproduce/config/gnuastro/mmap* -# ------------------------------------------ - -clean: clean-mmap - # Delete the top-level PDF file. - rm -f *.pdf - - # Delete all the built outputs except the dependency - # programs. We'll use Bash's extended options builtin (`shopt') to - # enable "extended glob" (for listing of files). It allows extended - # features like ignoring the listing of a file with `!()' that we - # are using afterwards. - shopt -s extglob - rm -rf $(BDIR)/!(dependencies) - -distclean: clean - # We'll be deleting the built environent programs and just need the - # `rm' program. So for this recipe, we'll use the host system's - # `rm', not our own. - $(sys-rm) -rf $(BDIR) reproduce/build - $(sys-rm) -f Makefile .gnuastro .local - $(sys-rm) -f $(pconfdir)/LOCAL.mk $(gconfdir)/gnuastro-local.conf - - - - - -# Packaging rules -# --------------- -# -# With the rules in this section, you can package the project in a state -# that is ready for building the final PDF with LaTeX. This is useful for -# collaborators who only want to contribute to the text of your project, -# without having to worry about the technicalities of the analysis. -$(packagecontents): | $(texdir) - - # Set up the output directory, delete it if it exists and remake it - # to fill with new contents. - dir=$(texdir)/$(packagebasename) - rm -rf $$dir - mkdir $$dir - - # Build a small Makefile to help in automatizing the paper building - # (including the bibliography). - m=$$dir/Makefile - echo "paper.pdf: paper.tex paper.bbl" > $$m - printf "\tpdflatex -shell-escape -halt-on-error paper\n" >> $$m - echo "paper.bbl: tex/src/references.tex" >> $$m - printf "\tpdflatex -shell-escape -halt-on-error paper\n" >> $$m - printf "\tbiber paper\n" >> $$m - echo ".PHONY: clean" >> $$m - echo "clean:" >> $$m - printf "\trm -f *.aux *.auxlock *.bbl *.bcf\n" >> $$m - printf "\trm -f *.blg *.log *.out *.run.xml\n" >> $$m - - # Copy the top-level contents into it. - cp configure COPYING for-group README.md README-hacking.md $$dir/ - - # Build the top-level directories. - mkdir $$dir/reproduce $$dir/tex $$dir/tex/tikz $$dir/tex/pipeline - - # Copy all the `reproduce' contents except for the `build' symbolic - # link. 
-        shopt -s extglob
-        cp -r tex/src $$dir/tex/src
-        cp tex/tikz/*.pdf $$dir/tex/tikz
-        cp -r reproduce/!(build) $$dir/reproduce
-        cp -r tex/pipeline/!($(packagebasename)) $$dir/tex/pipeline
-        cp -r tex/dependencies $$dir/tex/dependencies
-
-        # Clean up unnecessary/local files: 1) the $(texdir)/build*
-        # directories (when building in a group structure, there will be
-        # `build-user1', `build-user2', etc.) are just temporary LaTeX
-        # build files and don't have any relevant/hand-written files in
-        # them. 2) The `LOCAL.mk' and `gnuastro-local.conf' files just have
-        # this machine's local settings and are irrelevant for anyone else.
-        rm -rf $$dir/tex/pipeline/build*
-        rm $$dir/reproduce/config/pipeline/LOCAL.mk
-        rm $$dir/reproduce/config/gnuastro/gnuastro-local.conf
-
-        # PROJECT SPECIFIC: under this comment, copy any other file for
-        # packaging, or remove any of the copied files above to suit your
-        # project.
-
-        # Since the packaging is mainly intended for high-level building of
-        # the PDF with LaTeX, we'll comment the `makepdf' LaTeX macro in
-        # the paper.
-        sed -e's|\\newcommand{\\makepdf}{}|%\\newcommand{\\makepdf}{}|' \
-            paper.tex > $$dir/paper.tex
-
-        # Just in case the package users want to rebuild some of the
-        # figures (by manually un-commenting the `makepdf' command we
-        # commented above), correct the TikZ external directory, so the
-        # figures can be rebuilt.
-        pgfsettings="$$dir/tex/src/preamble-pgfplots.tex"
-        sed -e's|{tikz/}|{tex/tikz/}|' $$pgfsettings > $$pgfsettings.new
-        mv $$pgfsettings.new $$pgfsettings
-
-        # Clean temporary (currently those ending in `~') files.
-        cd $(texdir)
-        find $(packagebasename) -name \*~ -delete
-
-# Package into `.tar.gz'.
-dist: $(packagecontents)
-        curdir=$$(pwd)
-        cd $(texdir)
-        tar -cf $(packagebasename).tar $(packagebasename)
-        gzip -f --best $(packagebasename).tar
-        cd $$curdir
-        mv $(texdir)/$(packagebasename).tar.gz ./
-
-# Package into `.zip'.
-dist-zip: $(packagecontents)
-        curdir=$$(pwd)
-        cd $(texdir)
-        zip -q -r $(packagebasename).zip $(packagebasename)
-        cd $$curdir
-        mv $(texdir)/$(packagebasename).zip ./
-
-
-
-
-
-# Check the version of programs which write their version
-# -------------------------------------------------------
-pvcheck = prog="$(strip $(1))"; \
-          ver="$(strip $(2))"; \
-          name="$(strip $(3))"; \
-          macro="$(strip $(4))"; \
-          verop="$(strip $(5))"; \
-          if [ "x$$verop" = x ]; then V="--version"; else V=$$verop; fi; \
-          v=$$($$prog $$V | awk '/'$$ver'/{print "y"; exit 0}'); \
-          if [ x$$v != xy ]; then \
-            echo; echo "PROJECT ERROR: Not running $$name $$ver"; echo; \
-            exit 1; \
-          fi; \
-          echo "\newcommand{\\$$macro}{$$ver}" >> $@
-
-lvcheck = idir=$(BDIR)/dependencies/installed/include; \
-          f="$$idir/$(strip $(1))"; \
-          ver="$(strip $(2))"; \
-          name="$(strip $(3))"; \
-          macro="$(strip $(4))"; \
-          v=$$(awk '/^\#/&&/define/&&/'$$ver'/{print "y";exit 0}' $$f); \
-          if [ x$$v != xy ]; then \
-            echo; echo "PROJECT ERROR: Not linking with $$name $$ver"; \
-            echo; exit 1; \
-          fi; \
-          echo "\newcommand{\\$$macro}{$$ver}" >> $@
-
-
-
-
-# Project initialization results
-# ------------------------------
-#
-# This file will store some basic info about the project that is necessary
-# for the final PDF. Since these are not version controlled, it must be
-# calculated every time the project is run. So even though this file
-# actually exists, it is also added as a `.PHONY' target above.
-$(mtexdir)/initialize.tex: | $(mtexdir)
-
-        # Version of the project.
-        @v=$$(git describe --dirty --always);
-        echo "\newcommand{\pipelineversion}{$$v}" > $@
diff --git a/reproduce/src/make/paper.mk b/reproduce/src/make/paper.mk
deleted file mode 100644
index 0c42bee..0000000
--- a/reproduce/src/make/paper.mk
+++ /dev/null
@@ -1,142 +0,0 @@
-# Build the final PDF paper/report.
-#
-# Copyright (C) 2018-2019 Mohammad Akhlaghi <mohammad@akhlaghi.org>
-#
-# This Makefile is free software: you can redistribute it and/or modify it
-# under the terms of the GNU General Public License as published by the
-# Free Software Foundation, either version 3 of the License, or (at your
-# option) any later version.
-#
-# This Makefile is distributed in the hope that it will be useful, but
-# WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General
-# Public License for more details.
-#
-# A copy of the GNU General Public License is available at
-# <http://www.gnu.org/licenses/>.
-
-
-
-
-# LaTeX macros for paper
-# ----------------------
-#
-# To report the input settings and results, the final report's PDF (final
-# target of this project) uses macros generated from various steps of the
-# project. All these macros are defined in `$(mtexdir)/pipeline.tex'.
-#
-# `$(mtexdir)/pipeline.tex' is actually just a combination of separate
-# files that keep the LaTeX macros related to each workhorse Makefile (in
-# `reproduce/src/make/*.mk'). Those individual macros are pre-requisites to
-# `$(mtexdir)/pipeline.tex'. The only workhorse Makefile that doesn't need
-# to produce LaTeX macros is this Makefile (`reproduce/src/make/paper.mk').
-#
-# This file is thus the interface between the processing scripts and the
-# final PDF: when we get to this point, all the processing has been
-# completed.
-#
-# Note that if you don't want the final PDF and just want the
-# processing and file outputs, you can remove the value of
-# `pdf-build-final' in `reproduce/config/pipeline/pdf-build.mk'.
-$(mtexdir)/pipeline.tex: $(foreach s, $(subst paper,,$(makesrc)), $(mtexdir)/$(s).tex)
-
-        # If no PDF is requested, or if LaTeX isn't available, don't
-        # continue to build the final PDF. Otherwise, merge all the TeX
-        # macros into one for building the PDF.
-        @if [ -f .local/bin/pdflatex ] && [ x"$(pdf-build-final)" != x ]; then
-
-          # First make sure the `tex/pipeline' symbolic link exists.
-          if [ ! -e tex/pipeline ]; then ln -s $(texdir) tex/pipeline; fi
-
-          # Put a LaTeX input command for all the necessary macro files.
-          rm -f $(mtexdir)/pipeline.tex
-          for t in $(subst paper,,$(makesrc)); do
-            echo "\input{tex/pipeline/macros/$$t.tex}" >> $(mtexdir)/pipeline.tex
-          done
-        else
-          echo
-          echo "-----"
-          echo "The processing has COMPLETED SUCCESSFULLY! But the final "
-          echo "LaTeX-built PDF paper will not be built."
-          echo
-          if [ x$(more-on-building-pdf) = x1 ]; then
-            echo "To do so, make sure you have LaTeX within the project (you"
-            echo "can check by running './.local/bin/latex --version'), _AND_"
-            echo "make sure that the 'pdf-build-final' variable has a value."
-            echo "'pdf-build-final' is defined in: "
-            echo "'reproduce/config/pipeline/pdf-build.mk'."
-            echo
-            echo "If you don't have LaTeX within the project, please re-run"
-            echo "'./configure' when you have internet access. To speed it up,"
-            echo "you can keep the previous configuration files (answer 'n'"
-            echo "when it asks about re-writing previous configuration files)."
-          else
-            echo "For more, run './.local/bin/make more-on-building-pdf=1'"
-          fi
-          echo
-          echo "" > $@
-        fi
-
-
-
-
-
-# The bibliography
-# ----------------
-#
-# We need to run the `biber' program on the output of LaTeX to generate the
-# necessary bibliography before making the final paper. So we'll first have
-# one run of LaTeX (similar to the `paper.pdf' recipe), then `biber'.
-#
-# NOTE: `$(mtexdir)/pipeline.tex' is an order-only-prerequisite for
-# `paper.bbl'. This is because we need to run LaTeX in both the `paper.bbl'
-# recipe and the `paper.pdf' recipe. But if `tex/src/references.tex' hasn't
-# been modified, we don't want to re-build the bibliography, only the final
-# PDF.
-$(texbdir)/paper.bbl: tex/src/references.tex \
-                      | $(tikzdir) $(texbdir) $(mtexdir)/pipeline.tex
-        # If `$(mtexdir)/pipeline.tex' is empty, don't build PDF.
-        @macros=$$(cat $(mtexdir)/pipeline.tex)
-        if [ x"$$macros" != x ]; then
-
-          # We'll run LaTeX first to generate the `.bcf' file (necessary
-          # for `biber') and then run `biber' to generate the `.bbl' file.
-          p=$$(pwd);
-          export TEXINPUTS=$$p:$$TEXINPUTS;
-          cd $(texbdir);
-          pdflatex -shell-escape -halt-on-error $$p/paper.tex;
-          biber paper
-
-        fi
-
-
-
-
-
-# The final paper
-# ---------------
-#
-# Run LaTeX in the `$(texbdir)' directory so all the intermediate and
-# auxiliary files stay there and keep the top directory clean. To be able
-# to run everything cleanly from there, it is necessary to add the current
-# directory (top project directory) to the `TEXINPUTS' environment
-# variable.
-paper.pdf: $(mtexdir)/pipeline.tex paper.tex $(texbdir)/paper.bbl \
-           | $(tikzdir) $(texbdir)
-
-        # If `$(mtexdir)/pipeline.tex' is empty, don't build the PDF.
-        @macros=$$(cat $(mtexdir)/pipeline.tex)
-        if [ x"$$macros" != x ]; then
-
-          # Go into the top TeX build directory and make the paper.
-          p=$$(pwd)
-          export TEXINPUTS=$$p:$$TEXINPUTS
-          cd $(texbdir)
-          pdflatex -shell-escape -halt-on-error $$p/paper.tex
-
-          # Come back to the top project directory and copy the built PDF
-          # file here.
-          cd $$p
-          cp $(texbdir)/$@ $(final-paper)
-
-        fi
diff --git a/reproduce/src/make/top.mk b/reproduce/src/make/top.mk
deleted file mode 100644
index 763dbd7..0000000
--- a/reproduce/src/make/top.mk
+++ /dev/null
@@ -1,135 +0,0 @@
-# Top-level Makefile (first to be loaded).
-#
-# Copyright (C) 2018-2019 Mohammad Akhlaghi <mohammad@akhlaghi.org>
-#
-# This Makefile is free software: you can redistribute it and/or modify it
-# under the terms of the GNU General Public License as published by the
-# Free Software Foundation, either version 3 of the License, or (at your
-# option) any later version.
-#
-# This Makefile is distributed in the hope that it will be useful, but
-# WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General
-# Public License for more details.
-#
-# A copy of the GNU General Public License is available at
-# <http://www.gnu.org/licenses/>.
-
-
-
-
-
-# Load the local configuration (created after running `./configure').
-include reproduce/config/pipeline/LOCAL.mk
-
-
-
-
-
-# Ultimate target of this project
-# -------------------------------
-#
-# The final paper/report (`paper.pdf') is the main target of this
-# project. As defined in the Make paradigm, it must be the first target
-# that Make encounters (immediately after loading the local configuration
-# settings, necessary for a group building scenario mentioned next).
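An aside on the Make behaviour that the last paragraph relies on: the first explicit target that Make reads becomes the default goal, so it is what runs on a bare `make' invocation (included files that only define variables, like `LOCAL.mk' above, don't affect this). A minimal standalone sketch (the file and target names are hypothetical, not part of the project):

    # demo.mk -- `make -f demo.mk' runs `first' (the default goal);
    # `second' only runs when requested by name: `make -f demo.mk second'.
    first:;  @echo "first is the default goal"
    second:; @echo "second must be asked for explicitly"
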
-#
-#
-# Group build
-# -----------
-#
-# This project can also be configured to have a shared build directory
-# between multiple users. In this scenario, many users (on a server) can
-# have their own/separate version controlled project source, but share the
-# same build outputs (in a common directory). This will allow a group to
-# work separately on parallel parts of the analysis that don't
-# interfere. It is thus very useful in cases where special storage
-# requirements or CPU power is necessary and it's not possible/efficient
-# for each user to have a fully separate copy of the build directory.
-#
-# Controlling this requires two variables that are available at this stage:
-#
-#   - `GROUP-NAME': from `LOCAL.mk' (which was built by `./configure').
-#   - `reproducible_paper_group_name': from the `./for-group' script (if it
-#     was used to call Make).
-#
-# The analysis is only done when both have the same group name. Note that
-# when the project isn't being built for a group, both variables will be an
-# empty string.
-#
-#
-# Only processing, no LaTeX PDF
-# -----------------------------
-#
-# If you are just interested in the processing and don't want to build the
-# PDF, you can skip the creation of the final PDF by removing the value
-# of `pdf-build-final' in `reproduce/config/pipeline/pdf-build.mk'.
-ifeq (x$(reproducible_paper_group_name),x$(GROUP-NAME))
-all: paper.pdf
-else
-all:
-        @if [ "x$(GROUP-NAME)" = x ]; then \
-          echo "Project is NOT configured for groups, please run"; \
-          echo " $$ .local/bin/make"; \
-        else \
-          echo "Project is configured for groups, please run"; \
-          echo " $$ ./for-group $(GROUP-NAME) make -j8"; \
-        fi
-endif
-
-
-
-
-
-# Define source Makefiles
-# -----------------------
-#
-# To keep things clean, manageable and readable, each set of operations
-# is (and must be) classified (modularized) by context into separate
-# Makefiles: the more the better. These modular steps are then
-# included in this top-level Makefile through the `include' command of
-# the next step. Each Makefile should also produce a LaTeX macro file
-# with the same fixed name (used to keep all the parameters and
-# relevant outputs of the steps in it for the final paper).
-#
-# In the rare case that no special LaTeX macros are necessary in a
-# workhorse Makefile, you can simply make an empty file with `touch
-# $@'. This will not add any lines to the final combined LaTeX macros
-# file, but will create the file that is a prerequisite to the final
-# paper generation.
-#
-# To (significantly) help in readability, this top-level Makefile should be
-# the only one in charge of including Makefiles. So if you care about easy
-# maintenance and understandability (even for yourself, in one year! It
-# is VERY IMPORTANT and as a scientist, you MUST care about it!), do not
-# include Makefiles from any other Makefile.
-#
-# IMPORTANT NOTE: order matters in the inclusion of the processing
-# Makefiles. As the project grows, some Makefiles will define
-# variables/dependencies that later Makefiles need. Therefore we are using
-# a `foreach' loop in the next step to explicitly request loading them in
-# the same order that they are defined here (we aren't just using a
-# wild-card like the configuration Makefiles).
-makesrc = initialize \
-          download \
-          delete-me \
-          paper
-
-
-
-
-
-# Include all Makefiles
-# ---------------------
-#
-# We have two classes of Makefiles, separated by context and their location:
-#
-# 1) First, we'll include all the configuration-Makefiles. These
-#    Makefiles only define variables with no rules or order. We just
-#    won't include `LOCAL.mk' because it has already been included
-#    above.
-#
-# 2) Then, we'll import the workhorse-Makefiles which contain rules to
-#    actually do this project's processing.
-include $(filter-out %LOCAL.mk, reproduce/config/pipeline/*.mk)
-include $(foreach s,$(makesrc), reproduce/src/make/$(s).mk)
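A hedged note on the first of the two `include' lines above: at least in GNU Make, variable and function references are expanded before `include' performs its own wildcard expansion, so `filter-out' sees the single literal word `reproduce/config/pipeline/*.mk' (which does not match `%LOCAL.mk') and `LOCAL.mk' appears to be read a second time in practice. That should be harmless here, since it only defines variables, but if the exclusion is meant to be strict, expanding the glob first makes it explicit. A sketch under that assumption (the `config-mks' variable name is illustrative, not part of the project):

    # Expand the glob first so that `filter-out' sees individual file names
    # rather than the literal pattern.
    config-mks = $(filter-out %LOCAL.mk, \
                              $(wildcard reproduce/config/pipeline/*.mk))
    include $(config-mks)
    include $(foreach s, $(makesrc), reproduce/src/make/$(s).mk)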