From 3d8aa5953c4e0b79278ab2e27ec4e1051310d04f Mon Sep 17 00:00:00 2001
From: Mohammad Akhlaghi
Date: Mon, 1 Jun 2020 03:05:37 +0100
Subject: Core software build before using Make to build other software

Until now, Maneage would only build Flock before building everything else using Make (calling 'basic.mk') in parallel. Flock was necessary to avoid parallel downloads during the building of software (which could cause network problems). But after recently trying Maneage on FreeBSD (which is not yet complete, see bug #58465), we noticed that the BSD implementation of Make couldn't parse 'basic.mk' (in particular, it complained about the 'ifeq' parts) and its shell also had some peculiarities. It was thus decided to also install our own minimalist shell, Make and compression program before calling 'basic.mk'. In this way, 'basic.mk' can now assume the same GNU Make features that high-level.mk and python.mk assume. The pre-make building of software is now organized in 'reproduce/software/shell/pre-make-build.sh'.

Another nice feature of this commit is for macOS users: until now the default macOS Make had problems with parallel building of software, so the software of 'basic.mk' was built with a single thread. But now that we can build the core tools with GNU Make on macOS too, it uses all threads. Furthermore, since we now run 'basic.mk' with GNU Make, we can use '.ONESHELL' and don't have to finish every line of a long rule with a backslash to keep variables and such.

Generally, the pre-make software is now built in this order: first we build Lzip before anything else; it is downloaded as a simple '.tar' file that is not compressed (only ~400kb). Once Lzip is built, the pre-make phase continues with building GNU Make, Dash (a minimalist shell) and Flock. All of their tarballs are in '.tar.lz' format. Maneage then enters 'basic.mk' and the first program it builds is GNU Gzip (itself packaged as '.tar.lz'). Once Gzip is built, we build all the other compression software (all downloaded as '.tar.gz'). Afterwards, the tarballs of the remaining software can use any compression format, because we have the necessary tools for all of them.

In the process, a bug related to using backup servers (when the script is called outside of 'basic.mk') was found in 'reproduce/analysis/bash/download-multi-try' and fixed, and its Bash-specific features were removed. As a result of that bug-fix, and because we now have multiple servers for software tarballs, the backup servers now have their own configuration file in 'reproduce/software/config/servers-backup.conf'. This makes it much easier to maintain the backup server list across the multiple places that we need it.

Some other minor fixes:

- In building Bzip2, we need to specify 'CC' so it doesn't use 'gcc'.

- In building Zip, the 'generic_gcc' Make option caused a crash on FreeBSD (which doesn't have GCC).

- We are now using 'uname -s' to check if we are on a Linux kernel or not; if not, we still use the old 'on_mac_os' variable.

- While I was trying to build on FreeBSD, I noticed some further corrections that could help. For example, the 'makelink' Make-function now takes a third argument which can be a different name compared to the actual program (used for example to make a 'gcc' link that points to the host's '/usr/bin/cc').

- Until now we didn't know if the host's Make implementation supports placing a '@' at the start of the recipe (to avoid printing the actual commands to standard output). Especially in the tarball download phase, many lines were printed for each download, which was really annoying.
We already used '@' in 'high-level.mk' and 'python.mk' before, but now that we also know that 'basic.mk' is called with our custom GNU Make, we can use it at the start for a cleaner stdout. - Until now, WCSLIB assumed a Fortran compiler, but when the user is on a system where we can't install GCC (or has activated the '--host-cc' option), it may not be present and the project shouldn't break because of this. So with this commit, when a Fortran compiler isn't present, WCSLIB will be built with the '--disable-fortran' configuration option. This commit (task #15667) was completed with help/checks by Raul Infante-Sainz and Boud Roukema. --- reproduce/analysis/bash/download-multi-try | 57 ++-- reproduce/software/config/checksums.conf | 9 +- reproduce/software/config/servers-backup.conf | 14 + reproduce/software/config/versions.conf | 3 +- reproduce/software/make/basic.mk | 466 +++++++++++++------------- reproduce/software/make/build-rules.mk | 40 ++- reproduce/software/make/high-level.mk | 12 +- reproduce/software/make/python.mk | 1 + reproduce/software/shell/configure.sh | 148 +++----- reproduce/software/shell/pre-make-build.sh | 249 ++++++++++++++ 10 files changed, 635 insertions(+), 364 deletions(-) create mode 100644 reproduce/software/config/servers-backup.conf create mode 100755 reproduce/software/shell/pre-make-build.sh (limited to 'reproduce') diff --git a/reproduce/analysis/bash/download-multi-try b/reproduce/analysis/bash/download-multi-try index 8d10bf4..e4ccd26 100755 --- a/reproduce/analysis/bash/download-multi-try +++ b/reproduce/analysis/bash/download-multi-try @@ -1,4 +1,4 @@ -#!/bin/bash +#!/bin/sh # # Attempt downloading multiple times before crashing whole project. From # the top project directory (for the shebang above), this script must be @@ -90,50 +90,61 @@ counter=0 maxcounter=10 while [ ! -f "$outname" ]; do - # Increment the counter. We need the `counter=' part here because - # without it the evaluation of arithmetic expression will be like and - # error and the script is set to crash on errors. - counter=$((counter+1)) + # Increment the counter. + counter=$(echo $counter | awk '{print $1+1}') # If we have passed a maximum number of trials, just exit with # a failed code. - if (( counter > maxcounter )); then - echo + reachedmax=$(echo $counter \ + | awk '{if($1>'$maxcounter') print "yes"; else print "no";}') + if [ x$reachedmax = xyes ]; then + echo "" echo "Failed $maxcounter download attempts: $outname" - echo + echo "" exit 1 fi # If this isn't the first attempt print a notice and wait a little for # the next trail. - if (( counter > 1 )); then - tstep=$((counter*5)) + if [ x$counter = x1 ]; then + just_a_place_holder=1 + else + tstep=$(echo $counter | awk '{print $1*5}') echo "Download trial $counter for '$outname' in $tstep seconds." sleep $tstep fi - # Attempt downloading the file (one-at-a-time). Note that the - # `downloader' ends with the respective option to specify the output - # name. For example "wget -O" (so `outname', that comes after it) will - # be the name of the downloaded file. + # Attempt downloading the file. Note that the `downloader' ends with + # the respective option to specify the output name. For example "wget + # -O" (so `outname', that comes after it) will be the name of the + # downloaded file. if [ x"$lockfile" = xnolock ]; then if ! $downloader $outname $inurl; then rm -f $outname; fi else # Try downloading from the requested URL. - flock "$lockfile" bash -c \ + flock "$lockfile" sh -c \ "if ! 
$downloader $outname $inurl; then rm -f $outname; fi" + fi + + # If the download failed, try the backup server(s). + if [ ! -f "$outname" ]; then + if [ x"$backupservers" != x ]; then + for bs in $backupservers; do - # If it failed, try the backup server(s). - if [ ! -f "$outname" ]; then - if [ x"$backupservers" != x ]; then - for bs in "$backupservers"; do - flock "$lockfile" bash -c \ + # Use this backup server. + if [ x"$lockfile" = xnolock ]; then + if ! $downloader $outname $bs/$urlfile; then rm -f $outname; fi + else + flock "$lockfile" sh -c \ "if ! $downloader $outname $bs/$urlfile; then rm -f $outname; fi" - done - fi + fi + + # If the file was downloaded, break out of the loop that + # parses over the backup servers. + if [ -f "$outname" ]; then break; fi + done fi fi - done diff --git a/reproduce/software/config/checksums.conf b/reproduce/software/config/checksums.conf index 643be9c..e23df9f 100644 --- a/reproduce/software/config/checksums.conf +++ b/reproduce/software/config/checksums.conf @@ -20,26 +20,27 @@ bzip2-checksum = 00ace5438cfa0c577e5f578d8a808613187eff5217c35164ffe044fbafdfec9 cert-checksum = a81dfa59c70788126a395c576e54cb8f61c1ea34da69b5cd42e2d83ee6426c2a26941360c7302793774ea98ca16846deb6e683144cc7fb6da6ef87b70447e4c8 coreutils-checksum = ef8941dae845bbf5ae5838bc49e44554a766302930601aada6fa594e8088f0fbad74e481ee392ff89633e68b99e4da3f761fcb5d31ee3b233d540fe2a2d4e1af curl-checksum = df8fc6b2cccf100f7479e25cad743964a84066b587da19585b36a788b0041925e33944986d636a451d6bb95a452d5ac6812b2d5fa6631a10e0ac82a2c7821c75 +dash-checksum = 9d55090115ac04f505d70e6790179331178950f96fe713b33fd698fa8bfa60d4eff1b68cb7b8a2f099d29c587d36034a17dccd6658ba1623ff0a625ac1fb9620 diffutils-checksum = 7b12cf8aea1b9844773748f72272d9c6a38adae9c3c3a8c62048f91fb56c60b76035fa5f51665dceaf2cfbf1d1f4a3efdcc24bf47a5a16ff4350543314b12c9c file-checksum = 3ec5e51ffb7a82defa74845a90fbc983f6e169fc116606049bc01ff6e720d340c8abf6eb7a08b9ac1099162a5c02deac3633b07b039d486344c8abd9052ca751 findutils-checksum = 650a24507f8f4ebff83ad28dd27daa4785b4038dcaadc4fe00823b976e848527074cce3f9ec34065b7f037436d2aa6e9ec099bc05d7472c29864ac2c69de7f2e -flock-checksum = 2fe663839b5fd03a08e8b3d0596ce1b4216d8f19a1c4da4fa3db8b409aa4aa292358cc671be857e0f308315458bb2e10288f9d2152dce9940085d33cb7e4a24b +flock-checksum = ddb997174c0653bc3d29410a5a16b6290e737aa40fbf4b746e2d1db1e88e5acb08ec11a25c27c8a5a5fbf5a00fcac17abeaa245e7df27bd975ae86364d400b86 gawk-checksum = 3734740b7406ddfec9e04bb7774e76c6446cba76642a6180266e7b1822de20aab824c29c4e417256d877762ef04ef3f9df855cd4a3ca414a9225323b49d79195 gcc-checksum = a12dff52af876aee0fd89a8d09cdc455f35ec46845e154023202392adc164848faf8ee881b59b681b696e27c69fd143a214014db4214db62f9891a1c8365c040 gettext-checksum = 08d20c659004a77e607af17df15f5ce9bd4fc0feca9436aa206b0cbd2516f9f0c98c7ee1faacf7ff429f9b0dd9de219947b300216887a60727602a688acabc82 git-checksum = 5d92d07b171c5cd6e89a29c1211c73c1c900cd51c74d690aebfb4a3d0e93b541b09b42b6d6a1a82f5c3d953096771f9a8605c63be139f559f58698c1a0eabcfc gmp-checksum = 8aea94f867174eacac44f395ceb9212530c208e8de69d0bb53056f52360317230fc84ac177fd3ffc9fdb19a07c7549305dcc34c83c34821ccfab9dc63a16e67e grep-checksum = e4805dbddf7cd0f0faf412557d408509650c1ccf703bc450f10a3f727c690dbfaa1235aa81939a0e4b7ac6190f88c15ea1fcc562b343d4b4c7550f967aeb15db -gzip-checksum = 7939043e74554ced0c1c05d354ab4eb36cd6dce89ad79d02ccdc5ed6b7ee390759689b2d47c07227b9b44a62851afe7c76c4cae9f92527d999f3f1b4df1cccff +gzip-checksum = 
753fbcf5eb104bfc8a8eb81b69b8701f757b5158e6333b17438574169a4662642a122e1fdbd920a536edbcb77253d65fa571e4f507dbe72a70fee5eb161d6324 isl-checksum = 85d0b40f4dbf14cb99d17aa07048cdcab2dc3eb527d2fbb1e84c41b2de5f351025370e57448b63b2b8a8cf8a0843a089c3263f9baee1542d5c2e1cb37ed39d94 libbsd-checksum = 435822b8f2495a5e2705e5ab5c834a4f0f3a177b3e5c46a7c6162924507ca984e957e94a512b5ebd0067ecb413bac458fade357709ef199e9b75edf0315de91c libiconv-checksum = 365dac0b34b4255a0066e8033a8b3db4bdb94b9b57a9dca17ebf2d779139fe935caf51a465d17fd8ae229ec4b926f3f7025264f37243432075e5583925bb77b7 libtool-checksum = a6eef35f3cbccf2c9e2667f44a476ebc80ab888725eb768e91a3a6c33b8c931afc46eb23efaee76c8696d3e4eed74ab1c71157bcb924f38ee912c8a90a6521a4 libunistring-checksum = 01dcab6e05ea4c33572bf96cc0558bcffbfc0e62fc86410cef06c1597a0073d5750525fe2dee4fdb39c9bd704557fcbab864f9645958108a2e07950bc539fe54 libxml2-checksum = cb7784ba4e72e942614e12e4f83f4ceb275f3d738b30e3b5c1f25edf8e9fa6789e854685974eed95b362049dbf6c8e7357e0327d64c681ed390534ac154e6810 -lzip-checksum= 0349b4c6c0b41e601b7ee381c3254d741397beb3ef9354c08162f346f131f4f48f6613ee0a610cdc6d827530df634f884ecfeee35215b10045a40fee76f8e938 +lzip-checksum = e3331bbf0166541332182a9a28c2e08f522735ef668a06dfa26055251d5142a09227d97e6ae50b38c0b8805479a307a9e55c81b120d5befad2fde44676261843 m4-checksum = a92cad4441b3fd7c033837389ca3499494523d364a5fda043d92c517051510f1758b3b837f0477f42d2258a179ab79a4993e5d1694ef2673db6d96d1faff84fe -make-checksum = 9a1185cc468368f4ec06478b1cfa343bf90b5cd7c92c0536567db0315b0ee909af53ecce3d44cfd93dd137dbca1ed13af5713e8663590c4fdd21ea635d78496b +make-checksum = ddf0fdcb9ee1b182ef294c5da70c1275288c99bef60e63a25c0abed2ddd44aba1770be4aab1db8cac81e5f624576f2127c5d825a1824e1c7a49df4f16445526b metastore-checksum = b2a5fdde9de5ddc1e6c368d5da1b2e97e4fdbaa138a7be281ccb40a81dd4a9bb1849d36b2d5d3f01205079bace60441f82a7002097ff3a7037340a35b0f1574a mpc-checksum = 72d657958b07c7812dc9c7cbae093118ce0e454c68a585bfb0e2fa559f1bf7c5f49b93906f580ab3f1073e5b595d23c6494d4d76b765d16dde857a18dd239628 mpfr-checksum = d583555d08863bf36c89b289ae26bae353d9a31f08ee3894520992d2c26e5683c4c9c193d7ad139632f71c0a476d85ea76182702a98bf08dde7b6f65a54f8b88 diff --git a/reproduce/software/config/servers-backup.conf b/reproduce/software/config/servers-backup.conf new file mode 100644 index 0000000..a4320fa --- /dev/null +++ b/reproduce/software/config/servers-backup.conf @@ -0,0 +1,14 @@ +# Default servers to use as backup, later this should go in a file that is +# not under version control (the actual server that the tarbal comes from +# is irrelevant). Note that this is not a to be read as a variable but will +# be parsed as a list. +# +# Copyright (C) 2018-2020 Mohammad Akhlaghi +# +# Copying and distribution of this file, with or without modification, are +# permitted in any medium without royalty provided the copyright notice and +# this notice are preserved. This file is offered as-is, without any +# warranty. 
+http://gitlab.com/maneage/tarballs-software/-/raw/master +http://git.maneage.org/tarballs-software.git/plain +http://akhlaghi.org/maneage-software \ No newline at end of file diff --git a/reproduce/software/config/versions.conf b/reproduce/software/config/versions.conf index e9aad2e..1a4172a 100644 --- a/reproduce/software/config/versions.conf +++ b/reproduce/software/config/versions.conf @@ -18,6 +18,7 @@ bash-version = 5.0.11 binutils-version = 2.32 coreutils-version = 8.31 curl-version = 7.65.3 +dash-version = 0.5.10.2 diffutils-version = 3.7 file-version = 5.36 findutils-version = 4.7.0 @@ -34,7 +35,7 @@ libiconv-version = 1.16 libtool-version = 2.4.6 libunistring-version = 0.9.10 libxml2-version = 2.9.9 -lzip-version= 1.20 +lzip-version = 1.20 m4-version = 1.4.18 make-version = 4.3 metastore-version = 1.1.2-23-fa9170b diff --git a/reproduce/software/make/basic.mk b/reproduce/software/make/basic.mk index e0da312..b4745e2 100644 --- a/reproduce/software/make/basic.mk +++ b/reproduce/software/make/basic.mk @@ -52,6 +52,7 @@ syspath := $(PATH) # As we build more programs, we want to use this project's built programs # and libraries, not the host's. +.ONESHELL: export CCACHE_DISABLE := 1 export PATH := $(ibdir):$(PATH) export PKG_CONFIG_PATH := $(ildir)/pkgconfig @@ -59,6 +60,17 @@ export PKG_CONFIG_LIBDIR := $(ildir)/pkgconfig export CPPFLAGS := -I$(idir)/include $(CPPFLAGS) export LDFLAGS := $(rpath_command) -L$(ildir) $(LDFLAGS) +# Note that we build GNU Bash here in 'basic.mk'. So we can't assume Bash +# in this Makefile and use the DASH shell that was before calling this +# Makefile: http://gondor.apana.org.au/~herbert/dash. Dash is a minimalist +# POSIX shell, so it doesn't have startup options like '--noprofile +# --norc'. But from its manual, to load startup files, Dash actually +# requires that it be called with a '-' before it (for example '-dash'), so +# it shouldn't be loading any startup files if it was interpretted +# properly. +.SHELLFLAGS := -e -c +export SHELL := $(ibdir)/dash + # This is the "basic" tools where we are relying on the host operating # system, but are slowly populating our basic software envirnoment. To run # (system or template) programs, `LD_LIBRARY_PATH' is necessary, so here, @@ -85,7 +97,10 @@ all: $(foreach p, $(top-level-programs), $(ibidir)/$(p)) # Servers to use as backup, later this should go in a file that is not # under version control (the actual server that the tarbal comes from is # irrelevant). -backupservers = http://akhlaghi.org/maneage-software +backupservers := $(shell awk '!/^#/{printf "%s ", $$1}' \ + reproduce/software/config/servers-backup.conf) + + @@ -131,16 +146,14 @@ tarballs = $(foreach t, bash-$(bash-version).tar.lz \ git-$(git-version).tar.xz \ gmp-$(gmp-version).tar.lz \ grep-$(grep-version).tar.xz \ - gzip-$(gzip-version).tar.gz \ + gzip-$(gzip-version).tar.lz \ isl-$(isl-version).tar.bz2 \ libbsd-$(libbsd-version).tar.xz \ libiconv-$(libiconv-version).tar.gz \ libtool-$(libtool-version).tar.xz \ libunistring-$(libunistring-version).tar.xz \ libxml2-$(libxml2-version).tar.gz \ - lzip-$(lzip-version).tar.gz \ m4-$(m4-version).tar.gz \ - make-$(make-version).tar.gz \ metastore-$(metastore-version).tar.gz \ mpfr-$(mpfr-version).tar.xz \ mpc-$(mpc-version).tar.gz \ @@ -167,105 +180,108 @@ $(tarballs): $(tdir)/%: | $(lockdir) # the first character of the version to be a digit: packages such # as `foo' and `foo-3' will not be distinguished, but `foo' and # `foo2' will be distinguished. 
- n=$$(echo $* | sed -e's/-[0-9]/ /' -e's/\./ /g' \ - | awk '{print $$1}' ); \ - \ + @n=$$(echo $* | sed -e's/-[0-9]/ /' -e's/\./ /g' \ + | awk '{print $$1}' ) + mergenames=1; \ - if [ $$n = bash ]; then c=$(bash-checksum); w=http://akhlaghi.org/maneage-software; \ - elif [ $$n = binutils ]; then c=$(binutils-checksum); w=http://ftp.gnu.org/gnu/binutils; \ - elif [ $$n = bzip2 ]; then c=$(bzip2-checksum); w=http://akhlaghi.org/maneage-software; \ - elif [ $$n = cert ]; then c=$(cert-checksum); w=http://akhlaghi.org/maneage-software; \ - elif [ $$n = coreutils ]; then c=$(coreutils-checksum); w=http://ftp.gnu.org/gnu/coreutils;\ - elif [ $$n = curl ]; then c=$(curl-checksum); w=https://curl.haxx.se/download; \ - elif [ $$n = diffutils ]; then c=$(diffutils-checksum); w=http://ftp.gnu.org/gnu/diffutils;\ - elif [ $$n = file ]; then c=$(file-checksum); w=ftp://ftp.astron.com/pub/file; \ - elif [ $$n = findutils ]; then c=$(findutils-checksum); w=http://ftp.gnu.org/gnu/findutils; \ - elif [ $$n = gawk ]; then c=$(gawk-checksum); w=http://ftp.gnu.org/gnu/gawk; \ - elif [ $$n = gcc ]; then c=$(gcc-checksum); w=http://ftp.gnu.org/gnu/gcc/gcc-$(gcc-version); \ - elif [ $$n = gettext ]; then c=$(gettext-checksum); w=https://ftp.gnu.org/gnu/gettext; \ - elif [ $$n = git ]; then c=$(git-checksum); w=http://mirrors.edge.kernel.org/pub/software/scm/git; \ - elif [ $$n = gmp ]; then c=$(gmp-checksum); w=https://gmplib.org/download/gmp; \ - elif [ $$n = grep ]; then c=$(grep-checksum); w=http://ftp.gnu.org/gnu/grep; \ - elif [ $$n = gzip ]; then c=$(gzip-checksum); w=http://ftp.gnu.org/gnu/gzip; \ - elif [ $$n = isl ]; then c=$(isl-checksum); w=ftp://gcc.gnu.org/pub/gcc/infrastructure; \ - elif [ $$n = libbsd ]; then c=$(libbsd-checksum); w=http://libbsd.freedesktop.org/releases; \ - elif [ $$n = libiconv ]; then c=$(libiconv-checksum); w=https://ftp.gnu.org/pub/gnu/libiconv; \ - elif [ $$n = libtool ]; then c=$(libtool-checksum); w=http://ftp.gnu.org/gnu/libtool; \ - elif [ $$n = libunistring ]; then c=$(libunistring-checksum); w=http://ftp.gnu.org/gnu/libunistring; \ - elif [ $$n = libxml2 ]; then c=$(libxml2-checksum); w=ftp://xmlsoft.org/libxml2; \ - elif [ $$n = lzip ]; then c=$(lzip-checksum); w=http://download.savannah.gnu.org/releases/lzip; \ - elif [ $$n = m4 ]; then \ - mergenames=0; \ - c=$(m4-checksum); \ - w=http://akhlaghi.org/maneage-software/m4-1.4.18-patched.tar.gz; \ - elif [ $$n = make ]; then c=$(make-checksum); w=https://ftp.gnu.org/gnu/make; \ - elif [ $$n = metastore ]; then c=$(metastore-checksum); w=http://akhlaghi.org/maneage-software; \ - elif [ $$n = mpc ]; then c=$(mpc-checksum); w=http://ftp.gnu.org/gnu/mpc; \ - elif [ $$n = mpfr ]; then c=$(mpfr-checksum); w=http://www.mpfr.org/mpfr-current;\ - elif [ $$n = ncurses ]; then c=$(ncurses-checksum); w=http://ftp.gnu.org/gnu/ncurses; \ - elif [ $$n = openssl ]; then c=$(openssl-checksum); w=http://www.openssl.org/source; \ - elif [ $$n = patchelf ]; then c=$(patchelf-checksum); w=http://nixos.org/releases/patchelf/patchelf-$(patchelf-version); \ - elif [ $$n = perl ]; then \ - c=$(perl-checksum); \ - v=$$(echo $(perl-version) | sed -e's/\./ /g' | awk '{printf("%d.0", $$1)}'); \ - w=https://www.cpan.org/src/$$v; \ - elif [ $$n = pkg-config ]; then c=$(pkgconfig-checksum); w=http://pkg-config.freedesktop.org/releases; \ - elif [ $$n = readline ]; then c=$(readline-checksum); w=http://ftp.gnu.org/gnu/readline; \ - elif [ $$n = sed ]; then c=$(sed-checksum); w=http://ftp.gnu.org/gnu/sed; \ - elif [ $$n = tar ]; then 
c=$(tar-checksum); w=http://ftp.gnu.org/gnu/tar; \ - elif [ $$n = texinfo ]; then c=$(texinfo-checksum); w=http://ftp.gnu.org/gnu/texinfo; \ - elif [ $$n = unzip ]; then \ - c=$(unzip-checksum); \ - mergenames=0; v=$$(echo $(unzip-version) | sed -e's/\.//'); \ - w=ftp://ftp.info-zip.org/pub/infozip/src/unzip$$v.tgz; \ - elif [ $$n = wget ]; then c=$(wget-checksum); w=http://ftp.gnu.org/gnu/wget; \ - elif [ $$n = which ]; then c=$(which-checksum); w=http://ftp.gnu.org/gnu/which; \ - elif [ $$n = xz ]; then c=$(xz-checksum); w=http://tukaani.org/xz; \ - elif [ $$n = zip ]; then \ - c=$(zip-checksum); \ - mergenames=0; v=$$(echo $(zip-version) | sed -e's/\.//'); \ - w=ftp://ftp.info-zip.org/pub/infozip/src/zip$$v.tgz; \ - elif [ $$n = zlib ]; then c=$(zlib-checksum); w=http://www.zlib.net; \ - else \ - echo; echo; echo; \ - echo "'$$n' not recognized as a software tarball name to download."; \ - echo; echo; echo; \ - exit 1; \ - fi; \ - \ - \ - if [ -f $(DEPENDENCIES-DIR)/$* ]; then \ - cp $(DEPENDENCIES-DIR)/$* "$@.unchecked"; \ - else \ - if [ $$mergenames = 1 ]; then tarballurl=$$w/"$*"; \ - else tarballurl=$$w; \ - fi; \ - \ - echo "Downloading $$tarballurl"; \ - if [ -f $(ibdir)/wget ]; then \ - downloader="wget --no-use-server-timestamps -O"; \ - else \ - downloader="$(DOWNLOADER)"; \ - fi; \ - \ - touch $(lockdir)/download; \ + if [ $$n = bash ]; then c=$(bash-checksum); w=http://akhlaghi.org/maneage-software + elif [ $$n = binutils ]; then c=$(binutils-checksum); w=http://ftp.gnu.org/gnu/binutils + elif [ $$n = bzip2 ]; then c=$(bzip2-checksum); w=http://akhlaghi.org/maneage-software + elif [ $$n = cert ]; then c=$(cert-checksum); w=http://akhlaghi.org/maneage-software + elif [ $$n = coreutils ]; then c=$(coreutils-checksum); w=http://ftp.gnu.org/gnu/coreutils + elif [ $$n = curl ]; then c=$(curl-checksum); w=https://curl.haxx.se/download + elif [ $$n = diffutils ]; then c=$(diffutils-checksum); w=http://ftp.gnu.org/gnu/diffutils + elif [ $$n = file ]; then c=$(file-checksum); w=ftp://ftp.astron.com/pub/file + elif [ $$n = findutils ]; then c=$(findutils-checksum); w=http://ftp.gnu.org/gnu/findutils + elif [ $$n = gawk ]; then c=$(gawk-checksum); w=http://ftp.gnu.org/gnu/gawk + elif [ $$n = gcc ]; then c=$(gcc-checksum); w=http://ftp.gnu.org/gnu/gcc/gcc-$(gcc-version) + elif [ $$n = gettext ]; then c=$(gettext-checksum); w=https://ftp.gnu.org/gnu/gettext + elif [ $$n = git ]; then c=$(git-checksum); w=http://mirrors.edge.kernel.org/pub/software/scm/git + elif [ $$n = gmp ]; then c=$(gmp-checksum); w=https://gmplib.org/download/gmp + elif [ $$n = grep ]; then c=$(grep-checksum); w=http://ftp.gnu.org/gnu/grep + elif [ $$n = gzip ]; then c=$(gzip-checksum); w=http://akhlaghi.org/src + elif [ $$n = isl ]; then c=$(isl-checksum); w=ftp://gcc.gnu.org/pub/gcc/infrastructure + elif [ $$n = libbsd ]; then c=$(libbsd-checksum); w=http://libbsd.freedesktop.org/releases + elif [ $$n = libiconv ]; then c=$(libiconv-checksum); w=https://ftp.gnu.org/pub/gnu/libiconv + elif [ $$n = libtool ]; then c=$(libtool-checksum); w=http://ftp.gnu.org/gnu/libtool + elif [ $$n = libunistring ]; then c=$(libunistring-checksum); w=http://ftp.gnu.org/gnu/libunistring + elif [ $$n = libxml2 ]; then c=$(libxml2-checksum); w=ftp://xmlsoft.org/libxml2 + elif [ $$n = m4 ]; then + mergenames=0 + c=$(m4-checksum) + w=http://akhlaghi.org/maneage-software/m4-1.4.18-patched.tar.gz + elif [ $$n = metastore ]; then c=$(metastore-checksum); w=http://akhlaghi.org/maneage-software + elif [ $$n = mpc ]; then c=$(mpc-checksum); 
w=http://ftp.gnu.org/gnu/mpc + elif [ $$n = mpfr ]; then c=$(mpfr-checksum); w=http://www.mpfr.org/mpfr-current + elif [ $$n = ncurses ]; then c=$(ncurses-checksum); w=http://ftp.gnu.org/gnu/ncurses + elif [ $$n = openssl ]; then c=$(openssl-checksum); w=http://www.openssl.org/source + elif [ $$n = patchelf ]; then c=$(patchelf-checksum); w=http://nixos.org/releases/patchelf/patchelf-$(patchelf-version) + elif [ $$n = perl ]; then + c=$(perl-checksum) + v=$$(echo $(perl-version) | sed -e's/\./ /g' | awk '{printf("%d.0", $$1)}') + w=https://www.cpan.org/src/$$v + elif [ $$n = pkg-config ]; then c=$(pkgconfig-checksum); w=http://pkg-config.freedesktop.org/releases + elif [ $$n = readline ]; then c=$(readline-checksum); w=http://ftp.gnu.org/gnu/readline + elif [ $$n = sed ]; then c=$(sed-checksum); w=http://ftp.gnu.org/gnu/sed + elif [ $$n = tar ]; then c=$(tar-checksum); w=http://ftp.gnu.org/gnu/tar + elif [ $$n = texinfo ]; then c=$(texinfo-checksum); w=http://ftp.gnu.org/gnu/texinfo + elif [ $$n = unzip ]; then + c=$(unzip-checksum) + mergenames=0; v=$$(echo $(unzip-version) | sed -e's/\.//') + w=ftp://ftp.info-zip.org/pub/infozip/src/unzip$$v.tgz + elif [ $$n = wget ]; then c=$(wget-checksum); w=http://ftp.gnu.org/gnu/wget + elif [ $$n = which ]; then c=$(which-checksum); w=http://ftp.gnu.org/gnu/which + elif [ $$n = xz ]; then c=$(xz-checksum); w=http://tukaani.org/xz + elif [ $$n = zip ]; then + c=$(zip-checksum) + mergenames=0; v=$$(echo $(zip-version) | sed -e's/\.//') + w=ftp://ftp.info-zip.org/pub/infozip/src/zip$$v.tgz + elif [ $$n = zlib ]; then c=$(zlib-checksum); w=http://www.zlib.net + else + echo; echo; echo + echo "'$$n' not recognized as a software tarball name to download." + echo; echo; echo + exit 1 + fi + + # Download the raw tarball, using an '.unchecked' suffix to specify + # that it is not yet fully checked and usable. But first, since the + # download may be interrupted in a previous build and an incomplete + # '.unchecked' file may remain, we'll remove any possibly existing + # uncheked file. + rm -f "$@.unchecked" + if [ -f $(DEPENDENCIES-DIR)/$* ]; then + cp $(DEPENDENCIES-DIR)/$* "$@.unchecked" + else + if [ $$mergenames = 1 ]; then tarballurl=$$w/"$*" + else tarballurl=$$w + fi + + echo "Downloading $$tarballurl" + if [ -f $(ibdir)/wget ]; then + downloader="wget --no-use-server-timestamps -O" + else + downloader="$(DOWNLOADER)" + fi + + touch $(lockdir)/download $(downloadwrapper) "$$downloader" $(lockdir)/download \ - $$tarballurl "$@.unchecked" "$(backupservers)"; \ - fi; \ - \ - \ - if type sha512sum > /dev/null 2>/dev/null; then \ - checksum=$$(sha512sum "$@.unchecked" | awk '{print $$1}'); \ - if [ x"$$checksum" = x"$$c" ]; then \ - mv "$@.unchecked" "$@"; \ - else \ - echo "ERROR: Non-matching checksum for '$*'."; \ - echo "Checksum should be: $$c"; \ - echo "Checksum is: $$checksum"; \ - exit 1; \ - fi; \ - else mv "$@.unchecked" "$@"; \ - fi; + $$tarballurl "$@.unchecked" "$(backupservers)" + fi + + # Make sure the file's Checksum is correct. + if type sha512sum > /dev/null 2>/dev/null; then + checksum=$$(sha512sum "$@.unchecked" | awk '{print $$1}') + if [ x"$$checksum" = x"$$c" ]; then + mv "$@.unchecked" "$@" + else + echo "ERROR: Non-matching checksum for '$*'." 
+ echo "Checksum should be: $$c" + echo "Checksum is: $$checksum" + exit 1 + fi + else mv "$@.unchecked" "$@" + fi @@ -274,36 +290,39 @@ $(tarballs): $(tdir)/%: | $(lockdir) # Low-level (not built) programs # ------------------------------ # -# For the time being, we aren't building a local C compiler, but we'll use -# any C compiler that the system already has and just make a symbolic link -# to it. +# For the time being, some components of the project on some systems, so we +# are simply making a symbolic link to the system's files here. We'll do +# this after building GNU Coreutils to have trustable elements. # -# ccache: ccache acts like a wrapper over the C compiler and is made to -# avoid/speed-up compiling of identical files in a system (it is commonly -# used on large servers). It actually makes `gcc' or `g++' a symbolic link -# to itself so it can control them internally. So, for our purpose here, it -# is very annoying and can cause many complications. We thus remove any -# part of PATH of that has `ccache' in it before making symbolic links to -# the programs we are not building ourselves. +# About ccache: ccache acts like a wrapper over the C compiler and is made +# to avoid/speed-up compiling of identical files in a system (it is +# commonly used on large servers). It actually makes `gcc' or `g++' a +# symbolic link to itself so it can control them internally. So, for our +# purpose here, it is very annoying and can cause many complications. We +# thus remove any part of PATH of that has `ccache' in it before making +# symbolic links to the programs we are not building ourselves. makelink = origpath="$$PATH"; \ - export PATH=$$(echo $(syspath) \ - | tr : '\n' \ - | grep -v ccache \ - | tr '\n' :); \ - a=$$(which $(1) 2> /dev/null); \ - if [ -e $(ibdir)/$(1) ]; then rm $(ibdir)/$(1); fi; \ - if [ x$$a = x ]; then \ - if [ "x$(strip $(2))" = xmandatory ]; then \ - echo "'$(1)' is necessary for higher-level tools."; \ - echo "Please install it for the configuration to continue."; \ - exit 1; \ - fi; \ - else \ - ln -s $$a $(ibdir)/$(1); \ - fi; \ - export PATH="$$origpath" + export PATH=$$(echo $(syspath) \ + | tr : '\n' \ + | grep -v ccache \ + | tr '\n' :); \ + if type $(1) > /dev/null 2> /dev/null; then \ + if [ x$(3) = x ]; then \ + ln -sf $$(which $(1)) $(ibdir)/$(1); \ + else \ + ln -sf $$(which $(1)) $(ibdir)/$(3); \ + fi; \ + else \ + if [ "x$(strip $(2))" = xmandatory ]; then \ + echo "'$(1)' is necessary for higher-level tools."; \ + echo "Please install it for the configuration to continue."; \ + exit 1; \ + fi; \ + fi; \ + export PATH="$$origpath" + $(ibdir) $(ildir):; mkdir $@ -$(ibidir)/low-level-links: | $(ibdir) $(ildir) +$(ibidir)/low-level-links: $(ibidir)/coreutils | $(ibdir) $(ildir) # Not-installed (but necessary in some cases) compilers. # Clang is necessary for CMake. @@ -325,11 +344,11 @@ $(ibidir)/low-level-links: | $(ibdir) $(ildir) # Necessary libraries: # Libdl (for dynamic loading libraries at runtime) # POSIX Threads library for multi-threaded programs. 
- for l in dl pthread; do \ - rm -f $(ildir)/lib$$l*; \ - if [ -f /usr/lib/lib$$l.a ]; then \ - ln -s /usr/lib/lib$$l.* $(ildir)/; \ - fi; \ + for l in dl pthread; do + rm -f $(ildir)/lib$$l*; + if [ -f /usr/lib/lib$$l.a ]; then + ln -s /usr/lib/lib$$l.* $(ildir)/ + fi done # We want this to be empty (so it doesn't interefere with the other @@ -349,28 +368,23 @@ $(ibidir)/low-level-links: | $(ibdir) $(ildir) # ------------------------------------------ # # The first set of programs to be built are those that we need to unpack -# the source code tarballs of each program. First, we'll build the -# necessary programs, then we'll build GNU Tar. -$(ibidir)/gzip: $(tdir)/gzip-$(gzip-version).tar.gz +# the source code tarballs of each program. We have already installed Lzip +# before calling 'basic.mk', so it is present and working. Hence we first +# build the Lzipped tarball of Gzip, then use our own Gzip to unpack the +# tarballs of the other compression programs. Once all the compression +# programs/libraries are complete, we build our own GNU Tar and continue +# with other software. +$(ibidir)/gzip: $(tdir)/gzip-$(gzip-version).tar.lz $(call gbuild, gzip-$(gzip-version), static, , V=1) \ && echo "GNU Gzip $(gzip-version)" > $@ -# GNU Lzip: For a static build, the `-static' flag should be given to -# LDFLAGS on the command-line (not from the environment). -ifeq ($(static_build),yes) -lzipconf="LDFLAGS=-static" -else -lzipconf= -endif -$(ibidir)/lzip: $(tdir)/lzip-$(lzip-version).tar.gz - $(call gbuild, lzip-$(lzip-version), , $(lzipconf)) \ - && echo "Lzip $(lzip-version)" > $@ - -$(ibidir)/xz: $(tdir)/xz-$(xz-version).tar.gz +$(ibidir)/xz: $(ibidir)/gzip \ + $(tdir)/xz-$(xz-version).tar.gz $(call gbuild, xz-$(xz-version), static) \ && echo "XZ Utils $(xz-version)" > $@ -$(ibidir)/bzip2: $(tdir)/bzip2-$(bzip2-version).tar.gz +$(ibidir)/bzip2: $(ibidir)/gzip \ + $(tdir)/bzip2-$(bzip2-version).tar.gz # Bzip2 doesn't have a `./configure' script, and its Makefile # doesn't build a shared library. So we can't use the `gbuild' # function here and we need to take some extra steps (inspired @@ -380,28 +394,28 @@ $(ibidir)/bzip2: $(tdir)/bzip2-$(bzip2-version).tar.gz # # NOTE: the major version number appears in the final symbolic # link. - tdir=bzip2-$(bzip2-version); \ - if [ $(static_build) = yes ]; then \ - makecommand="make LDFLAGS=-static"; \ - makeshared="echo no-shared"; \ - else \ - makecommand="make"; \ - if [ x$(on_mac_os) = xyes ]; then \ - makeshared="echo no-shared"; \ - else \ - makeshared="make -f Makefile-libbz2_so"; \ - fi; \ - fi; \ + tdir=bzip2-$(bzip2-version) + if [ $(static_build) = yes ]; then + makecommand="make LDFLAGS=-static" + makeshared="echo no-shared" + else + makecommand="make" + if [ x$(on_mac_os) = xyes ]; then + makeshared="echo no-shared" + else + makeshared="make -f Makefile-libbz2_so" + fi + fi cd $(ddir) && rm -rf $$tdir \ && tar xf $(word 1,$(filter $(tdir)/%,$^)) \ && cd $$tdir \ && sed -e 's@\(ln -s -f \)$$(PREFIX)/bin/@\1@' Makefile \ > Makefile.sed \ && mv Makefile.sed Makefile \ - && $$makeshared \ + && $$makeshared CC=cc \ && cp -a libbz2* $(ildir)/ \ && make clean \ - && $$makecommand \ + && $$makecommand CC=cc \ && make install PREFIX=$(idir) \ && cd .. 
\ && rm -rf $$tdir \ @@ -409,21 +423,23 @@ $(ibidir)/bzip2: $(tdir)/bzip2-$(bzip2-version).tar.gz && ln -fs libbz2.so.1.0 libbz2.so \ && echo "Bzip2 $(bzip2-version)" > $@ -$(ibidir)/unzip: $(tdir)/unzip-$(unzip-version).tar.gz - v=$$(echo $(unzip-version) | sed -e's/\.//'); \ +$(ibidir)/unzip: $(ibidir)/gzip \ + $(tdir)/unzip-$(unzip-version).tar.gz + v=$$(echo $(unzip-version) | sed -e's/\.//') $(call gbuild, unzip$$v, static,, \ - -f unix/Makefile generic_gcc \ + -f unix/Makefile generic \ CFLAGS="-DBIG_MEM -DMMAP",,pwd, \ - -f unix/Makefile \ + -f unix/Makefile generic \ BINDIR=$(ibdir) MANDIR=$(idir)/man/man1 ) \ && echo "Unzip $(unzip-version)" > $@ -$(ibidir)/zip: $(tdir)/zip-$(zip-version).tar.gz - v=$$(echo $(zip-version) | sed -e's/\.//'); \ +$(ibidir)/zip: $(ibidir)/gzip \ + $(tdir)/zip-$(zip-version).tar.gz + v=$$(echo $(zip-version) | sed -e's/\.//') $(call gbuild, zip$$v, static,, \ - -f unix/Makefile generic_gcc \ + -f unix/Makefile generic \ CFLAGS="-DBIG_MEM -DMMAP",,pwd, \ - -f unix/Makefile \ + -f unix/Makefile generic \ BINDIR=$(ibdir) MANDIR=$(idir)/man/man1 ) \ && echo "Zip $(zip-version)" > $@ @@ -432,7 +448,8 @@ $(ibidir)/zip: $(tdir)/zip-$(zip-version).tar.gz # # Note for a static-only build: Zlib's `./configure' doesn't use Autoconf's # configure script, it just accepts a direct `--static' option. -$(ibidir)/zlib: $(tdir)/zlib-$(zlib-version).tar.gz +$(ibidir)/zlib: $(ibidir)/gzip \ + $(tdir)/zlib-$(zlib-version).tar.gz $(call gbuild, zlib-$(zlib-version)) \ && echo "Zlib $(zlib-version)" > $@ @@ -443,8 +460,7 @@ $(ibidir)/zlib: $(tdir)/zlib-$(zlib-version).tar.gz # software to be built). $(ibidir)/tar: $(ibidir)/xz \ $(ibidir)/zip \ - $(ibidir)/gzip \ - $(ibidir)/lzip \ + $(ibidir)/gzip \ $(ibidir)/zlib \ $(ibidir)/bzip2 \ $(ibidir)/unzip \ @@ -476,13 +492,7 @@ $(ibidir)/tar: $(ibidir)/xz \ # function (for tilde expansion). The first can be disabled with # `--disable-load', but unfortunately I don't know any way to fix the # second. So, we'll have to build it dynamically for now. -$(ibidir)/make: $(ibidir)/tar \ - $(tdir)/make-$(make-version).tar.gz - # See Tar's comments for the `-j' option. - $(call gbuild, make-$(make-version), , , -j$(numthreads)) \ - && echo "GNU Make $(make-version)" > $@ - -$(ibidir)/ncurses: $(ibidir)/make \ +$(ibidir)/ncurses: $(ibidir)/tar \ $(tdir)/ncurses-$(ncurses-version).tar.gz # Delete the library that will be installed (so we can make sure @@ -540,32 +550,32 @@ $(ibidir)/ncurses: $(ibidir)/make \ # # 5. A link is made to also be able to include files from the # `ncurses' headers. 
- if [ x$(on_mac_os) = xyes ]; then so="dylib"; else so="so"; fi; \ - if [ -f $(ildir)/libncursesw.$$so ]; then \ - \ + if [ x$(on_mac_os) = xyes ]; then so="dylib"; else so="so"; fi + if [ -f $(ildir)/libncursesw.$$so ]; then + sov=$$(ls -l $(ildir)/libncursesw* \ | awk '/^-/{print $$NF}' \ - | sed -e's|'$(ildir)/libncursesw.'||'); \ - \ - cd "$(ildir)"; \ - for lib in ncurses ncurses++ form panel menu; do \ - ln -fs lib$$lib"w".$$sov lib$$lib.$$so; \ - ln -fs $(ildir)/pkgconfig/"$$lib"w.pc pkgconfig/$$lib.pc; \ - done; \ - for lib in tic tinfo; do \ - ln -fs libncursesw.$$sov lib$$lib.$$so; \ - ln -fs libncursesw.$$sov lib$$lib.$$sov; \ - ln -fs $(ildir)/pkgconfig/ncursesw.pc pkgconfig/$$lib.pc; \ - done; \ - ln -fs libncursesw.$$sov libcurses.$$so; \ - ln -fs libncursesw.$$sov libcursesw.$$sov; \ - ln -fs $(ildir)/pkgconfig/ncursesw.pc pkgconfig/curses.pc; \ - ln -fs $(ildir)/pkgconfig/ncursesw.pc pkgconfig/cursesw.pc; \ - \ - ln -fs $(idir)/include/ncursesw $(idir)/include/ncurses; \ - echo "GNU NCURSES $(ncurses-version)" > $@; \ - else \ - exit 1; \ + | sed -e's|'$(ildir)/libncursesw.'||') + + cd "$(ildir)" + for lib in ncurses ncurses++ form panel menu; do + ln -fs lib$$lib"w".$$sov lib$$lib.$$so + ln -fs $(ildir)/pkgconfig/"$$lib"w.pc pkgconfig/$$lib.pc + done + for lib in tic tinfo; do + ln -fs libncursesw.$$sov lib$$lib.$$so + ln -fs libncursesw.$$sov lib$$lib.$$sov + ln -fs $(ildir)/pkgconfig/ncursesw.pc pkgconfig/$$lib.pc + done + ln -fs libncursesw.$$sov libcurses.$$so + ln -fs libncursesw.$$sov libcursesw.$$sov + ln -fs $(ildir)/pkgconfig/ncursesw.pc pkgconfig/curses.pc + ln -fs $(ildir)/pkgconfig/ncursesw.pc pkgconfig/cursesw.pc + + ln -fs $(idir)/include/ncursesw $(idir)/include/ncurses + echo "GNU NCURSES $(ncurses-version)" > $@ + else + exit 1 fi $(ibidir)/readline: $(ibidir)/ncurses \ @@ -575,7 +585,7 @@ $(ibidir)/readline: $(ibidir)/ncurses \ SHLIB_LIBS="-lncursesw" -j$(numthreads)) \ && echo "GNU Readline $(readline-version)" > $@ -$(ibidir)/patchelf: $(ibidir)/make \ +$(ibidir)/patchelf: $(ibidir)/tar \ $(tdir)/patchelf-$(patchelf-version).tar.gz $(call gbuild, patchelf-$(patchelf-version)) \ && echo "PatchELF $(patchelf-version)" > $@ @@ -629,13 +639,13 @@ $(ibidir)/bash: $(needpatchelf) \ # default. As described in the manual, they are mainly useful when # you disable them all with `--enable-minimal-config' and enable a # subset using the `--enable' options. - if [ "x$(static_build)" = xyes ]; then stopt="--enable-static-link";\ - else stopt=""; \ - fi; \ + if [ "x$(static_build)" = xyes ]; then stopt="--enable-static-link" + else stopt="" + fi; export CFLAGS="$$CFLAGS \ -DDEFAULT_PATH_VALUE='\"$(ibdir)\"' \ -DSTANDARD_UTILS_PATH='\"$(ibdir)\"' \ - -DSYS_BASHRC='\"$(BASH_ENV)\"' "; \ + -DSYS_BASHRC='\"$(BASH_ENV)\"' " $(call gbuild, bash-$(bash-version),, $$stopt \ --with-installed-readline=$(ildir) \ --with-curses=yes, \ @@ -645,9 +655,9 @@ $(ibidir)/bash: $(needpatchelf) \ # default. So, we have to manually include it, currently we are # only doing this on GNU/Linux systems (using the `patchelf' # program). 
- if [ "x$(needpatchelf)" != x ]; then \ - if [ -f $(ibdir)/bash ]; then \ - $(ibdir)/patchelf --set-rpath $(ildir) $(ibdir)/bash; fi \ + if [ "x$(needpatchelf)" != x ]; then + if [ -f $(ibdir)/bash ]; then + $(ibdir)/patchelf --set-rpath $(ildir) $(ibdir)/bash; fi fi # To be generic, some systems use the `sh' command to call the @@ -658,10 +668,10 @@ $(ibidir)/bash: $(needpatchelf) \ # Just to be sure that the installation step above went well, # before making the link, we'll see if the file actually exists # there. - if [ -f $(ibdir)/bash ]; then \ - ln -fs $(ibdir)/bash $(ibdir)/sh; \ - echo "GNU Bash $(bash-version)" > $@; \ - else \ + if [ -f $(ibdir)/bash ]; then + ln -fs $(ibdir)/bash $(ibdir)/sh + echo "GNU Bash $(bash-version)" > $@ + else echo "GNU Bash not built!"; exit 1; fi @@ -678,14 +688,14 @@ perl-conflddlflags = else perl-conflddlflags = -Dlddlflags="-shared $$LDFLAGS" endif -$(ibidir)/perl: $(ibidir)/make \ +$(ibidir)/perl: $(ibidir)/tar \ $(tdir)/perl-$(perl-version).tar.gz major_version=$$(echo $(perl-version) \ | sed -e's/\./ /g' \ - | awk '{printf("%d", $$1)}'); \ + | awk '{printf("%d", $$1)}') base_version=$$(echo $(perl-version) \ | sed -e's/\./ /g' \ - | awk '{printf("%d.%d", $$1, $$2)}'); \ + | awk '{printf("%d.%d", $$1, $$2)}') cd $(ddir) \ && rm -rf perl-$(perl-version) \ && if ! tar xf $(word 1,$(filter $(tdir)/%,$^)); then \ @@ -795,7 +805,7 @@ $(idir)/etc:; mkdir $@ # Note: cert.pm has to be AFTER the tarball, otherwise the build script # will try to unpack cert.pm and crash (it unpacks the first dependency # under `tdir'). -$(ibidir)/openssl: $(ibidir)/make \ +$(ibidir)/openssl: $(ibidir)/tar \ $(tdir)/openssl-$(openssl-version).tar.gz \ $(tdir)/cert.pem \ | $(idir)/etc @@ -969,13 +979,13 @@ $(ibidir)/libiconv: $(ibidir)/pkg-config \ $(call gbuild, libiconv-$(libiconv-version), static) \ && echo "GNU libiconv $(libiconv-version)" > $@ -$(ibidir)/libunistring: $(ibidir)/make \ +$(ibidir)/libunistring: $(ibidir)/libiconv \ $(tdir)/libunistring-$(libunistring-version).tar.xz $(call gbuild, libunistring-$(libunistring-version), static,, \ -j$(numthreads)) \ && echo "GNU libunistring $(libunistring-version)" > $@ -$(ibidir)/libxml2: $(ibidir)/make \ +$(ibidir)/libxml2: $(ibidir)/tar \ $(tdir)/libxml2-$(libxml2-version).tar.gz # The libxml2 tarball also contains Python bindings which are built # and installed to a system directory by default. If you don't need @@ -1150,7 +1160,7 @@ $(ibidir)/mpfr: $(ibidir)/gmp \ $(call gbuild, mpfr-$(mpfr-version), static, , , make check) \ && echo "GNU Multiple Precision Floating-Point Reliably $(mpfr-version)" > $@ -$(ibidir)/pkg-config: $(ibidir)/make \ +$(ibidir)/pkg-config: $(ibidir)/tar \ $(tdir)/pkg-config-$(pkgconfig-version).tar.gz # An existing `libiconv' can cause a conflict with `pkg-config', # this is why `libiconv' depends on `pkg-config'. On a clean build, @@ -1305,11 +1315,11 @@ $(ibidir)/gcc: $(gcc-tarball) \ # in '$(idir)/lib' by defining the '$(idir)/lib64' as a symbolic # link to '$(idir)/lib'. 
if [ $(host_cc) = 1 ]; then \ - $(call makelink,gcc); \ - $(call makelink,g++,mandatory); \ - $(call makelink,gfortran,mandatory); \ + $(call makelink,cc); \ + $(call makelink,cc,,gcc); \ + $(call makelink,c++,,g++); \ + $(call makelink,gfortran); \ $(call makelink,strip,mandatory); \ - ln -sf $$(which gcc) $(ibdir)/cc; \ ccinfo=$$(gcc --version | awk 'NR==1'); \ echo "C compiler (""$$ccinfo"")" > $@; \ else \ diff --git a/reproduce/software/make/build-rules.mk b/reproduce/software/make/build-rules.mk index 9f5b493..260ded8 100644 --- a/reproduce/software/make/build-rules.mk +++ b/reproduce/software/make/build-rules.mk @@ -34,6 +34,31 @@ +# Unpack a tarball in the current directory. The issue is that until we +# install GNU Tar within Maneage, we have to use the host's Tar +# implementation and in some cases, they don't recognize '.lz'. +uncompress = csuffix=$$(echo $$tarball \ + | sed -e's/\./ /g' \ + | awk '{print $$NF}'); \ + if [ x$$csuffix = xlz ]; then \ + intarrm=1; \ + intar=$$(echo $$tarball | sed -e's/.lz//'); \ + lzip -c -d $$tarball > $$intar; \ + else \ + intarrm=0; \ + intar=$$tarball; \ + fi; \ + if tar xf $$intar; then \ + if [ x$$intarrm = x1 ]; then rm $$intar; fi; \ + else \ + echo; echo "Tar error"; exit 1; \ + fi + + + + + + # GNU Build system # ---------------- # @@ -58,13 +83,13 @@ gbuild = if [ x$(static_build) = xyes ] && [ "x$(2)" = xstatic ]; then \ fi; \ check="$(5)"; \ if [ x"$$check" = x ]; then check="echo Skipping-check"; fi; \ - cd $(ddir); rm -rf $(1); \ + cd $(ddir); \ + rm -rf $(1); \ if [ x"$$gbuild_tar" = x ]; then \ tarball=$(word 1,$(filter $(tdir)/%,$^)); \ else tarball=$$gbuild_tar; \ fi; \ - if ! tar xf $$tarball; then \ - echo; echo "Tar error"; exit 1; fi; \ + $(call uncompress); \ cd $(1); \ \ if [ x"$(strip $(6))" = x ]; then confscript=./configure; \ @@ -114,10 +139,11 @@ cbuild = if [ x$(static_build) = xyes ] && [ $(2)x = staticx ]; then \ export LDFLAGS="$$LDFLAGS -static"; \ opts="-DBUILD_SHARED_LIBS=OFF"; \ fi; \ - cd $(ddir) \ - && rm -rf $(1) \ - && tar xf $(word 1,$(filter $(tdir)/%,$^)) \ - && cd $(1) \ + tarball=$(word 1,$(filter $(tdir)/%,$^)); \ + cd $(ddir); \ + rm -rf $(1); \ + $(call uncompress); \ + cd $(1) \ && rm -rf project-build \ && mkdir project-build \ && cd project-build \ diff --git a/reproduce/software/make/high-level.mk b/reproduce/software/make/high-level.mk index 19cb52f..7cc2d51 100644 --- a/reproduce/software/make/high-level.mk +++ b/reproduce/software/make/high-level.mk @@ -104,7 +104,8 @@ export BASH_ENV := $(shell pwd)/reproduce/software/shell/bashrc.sh # Servers to use as backup, later this should go in a file that is not # under version control (the actual server that the tarbal comes from is # irrelevant). -backupservers = http://akhlaghi.org/maneage-software +backupservers := $(shell awk '!/^#/{printf "%s ", $$1}' \ + reproduce/software/config/servers-backup.conf) # Building flags: # @@ -335,6 +336,7 @@ $(tarballs): $(tdir)/%: | $(lockdir) # storing all the tarballs in one directory, we want it to have the # same naming convention, so we'll download it to a temporary name, # then rename that. + rm -f "$@.unchecked" if [ -f $(DEPENDENCIES-DIR)/$* ]; then cp $(DEPENDENCIES-DIR)/$* "$@.unchecked" else @@ -812,11 +814,17 @@ $(ibidir)/libgit2: $(ibidir)/curl \ $(ibidir)/wcslib: $(ibidir)/cfitsio \ $(tdir)/wcslib-$(wcslib-version).tar.bz2 + # If Fortran isn't present, don't build WCSLIB with it. + if type gfortran &> /dev/null; then fortranopt=""; + else fortranopt="--disable-fortran" + fi + + # Build WCSLIB. 
$(call gbuild, wcslib-$(wcslib-version), , \ LIBS="-pthread -lcurl -lm" \ --with-cfitsiolib=$(ildir) \ --with-cfitsioinc=$(idir)/include \ - --without-pgplot) \ + --without-pgplot $$fortranopt) \ && if [ x$(on_mac_os) = xyes ]; then \ install_name_tool -id $(ildir)/libwcs.6.4.dylib \ $(ildir)/libwcs.6.4.dylib; \ diff --git a/reproduce/software/make/python.mk b/reproduce/software/make/python.mk index 43499c7..eef8279 100644 --- a/reproduce/software/make/python.mk +++ b/reproduce/software/make/python.mk @@ -250,6 +250,7 @@ $(pytarballs): $(tdir)/%: # storing all the tarballs in one directory, we want it to have # the same naming convention, so we'll download it to a temporary # name, then rename that. + rm -f "$@.unchecked" if [ -f $(DEPENDENCIES-DIR)/$* ]; then cp $(DEPENDENCIES-DIR)/$* "$@.unchecked" else diff --git a/reproduce/software/shell/configure.sh b/reproduce/software/shell/configure.sh index 69097c2..5cf813b 100755 --- a/reproduce/software/shell/configure.sh +++ b/reproduce/software/shell/configure.sh @@ -1,4 +1,4 @@ -#! /bin/sh +#!/bin/sh # # Necessary preparations/configurations for the reproducible project. # @@ -319,19 +319,27 @@ static_build=no -# If we are on a Mac OS system -# ---------------------------- -# -# For the time being, we'll use the existance of `otool' to see if we are -# on a Mac OS system or not. Some tools (for example OpenSSL) need to know -# this. +# See if we are on a Linux-based system +# -------------------------------------- # -# On Mac OS, the building of GCC crashes sometimes while building libiberty -# with CLang's `g++'. Until we find a solution, we'll just use the host's C -# compiler. -if type otool > /dev/null 2>/dev/null; then +# Some features are tailored to GNU/Linux systems, while the BSD-based +# behavior is different. Initially we only tested macOS (hence the name of +# the variable), but as FreeBSD is also being inlucded in our tests. As +# more systems get used, we need to tailor these kinds of things better. +kernelname=$(uname -s) +if [ x$kernelname = xLinux ]; then + on_mac_os=no +else host_cc=1 on_mac_os=yes +fi + + + + + +# Print warning if the host CC is to be used. +if [ x$host_cc = x1 ]; then cat < /dev/null; then - if [ $jobs = 0 ]; then +# If the user hasn't manually specified the number of threads, see if we +# can deduce it from the host: +# - On systems with GNU Coreutils we have 'nproc'. +# - On BSD-based systems (for example FreeBSD and macOS), we have a +# 'hw.ncpu' in the output of 'sysctl'. +# - When none of the above work, just set the number of threads to 1. +if [ $jobs = 0 ]; then + if type nproc > /dev/null 2> /dev/null; then numthreads=$(nproc --all); else - numthreads=$jobs + numthreads=$(sysctl -a | awk '/^hw\.ncpu/{print $2}') + if [ x"$numthreads" = x ]; then numthreads=1; fi fi else - numthreads=1; -fi - - - - - -# Build `flock' before other program -# ---------------------------------- -# -# Flock (or file-lock) is a unique program that is necessary to serialize -# the (generally parallel) processing of make when necessary. GNU/Linux -# machines have it as part of their `util-linux' programs. But to be -# consistent in non-GNU/Linux systems, we will be using our own build. -# -# The reason that `flock' is sepecial is that we need it to serialize the -# download process of the software tarballs. 
-flockversion=$(awk '/flock-version/{print $3}' $depverfile) -flockchecksum=$(awk '/flock-checksum/{print $3}' $depshafile) -flocktar=flock-$flockversion.tar.gz -flockurl=http://github.com/discoteq/flock/releases/download/v$flockversion/ - -# Prepare/download the tarball. -if ! [ -f $tardir/$flocktar ]; then - flocktarname=$tardir/$flocktar - ucname=$flocktarname.unchecked - if [ -f $ddir/$flocktar ]; then - cp $ddir/$flocktar $ucname - else - if ! $downloader $ucname $flockurl/$flocktar; then - rm -f $ucname; - echo - echo "DOWNLOAD ERROR: Couldn't download the 'flock' tarball:" - echo " $flockurl" - echo - echo "You can manually place it in '$ddir' to avoid downloading." - exit 1 - fi - fi - - # Make sure this is the correct tarball. - if type sha512sum > /dev/null 2>/dev/null; then - checksum=$(sha512sum "$ucname" | awk '{print $1}') - if [ x$checksum = x$flockchecksum ]; then mv "$ucname" "$flocktarname" - else echo "ERROR: Non-matching checksum for '$flocktar'."; exit 1 - fi; - else mv "$ucname" "$flocktarname" - fi -fi - -# If the tarball is newer than the (possibly existing) program (the version -# has changed), then delete the program. -if [ -f .local/bin/flock ]; then - if [ $tardir/$flocktar -nt $ibidir/flock ]; then - rm $ibidir/flock - fi -fi - -# Build `flock' if necessary. -if ! [ -f $ibidir/flock ]; then - cd $tmpblddir - tar xf $tardir/$flocktar - cd flock-$flockversion - ./configure --prefix=$instdir - make - make install - cd $topdir - rm -rf $tmpblddir/flock-$flockversion - echo "Discoteq flock $flockversion" > $ibidir/flock + numthreads=$jobs fi - # Paths needed by the host compiler (only for `basic.mk') # ------------------------------------------------------- # @@ -1261,14 +1197,28 @@ fi -# Build basic software -# -------------------- +# Build core tools for project +# ---------------------------- +# +# Here we build the core tools that 'basic.mk' depends on: Lzip +# (compression program), GNU Make (that 'basic.mk' is written in), Dash +# (minimal Bash-like shell) and Flock (to lock files and enable serial +# download). +./reproduce/software/shell/pre-make-build.sh \ + "$bdir" "$ddir" "$downloader" + + + + + +# Build other basic tools our own GNU Make +# ---------------------------------------- # # When building these software we don't have our own un-packing software, # Bash, Make, or AWK. In this step, we'll install such low-level basic # tools, but we have to be very portable (and use minimal features in all). echo; echo "Building necessary software (if necessary)..." -make -k -f reproduce/software/make/basic.mk \ +.local/bin/make -k -f reproduce/software/make/basic.mk \ sys_library_path=$sys_library_path \ rpath_command=$rpath_command \ static_build=$static_build \ diff --git a/reproduce/software/shell/pre-make-build.sh b/reproduce/software/shell/pre-make-build.sh new file mode 100755 index 0000000..e2ac789 --- /dev/null +++ b/reproduce/software/shell/pre-make-build.sh @@ -0,0 +1,249 @@ +#!/bin/sh +# +# Very basic tools necessary to start Maneage's default building. +# +# Copyright (C) 2020 Mohammad Akhlaghi +# +# This script is free software: you can redistribute it and/or modify +# it under the terms of the GNU General Public License as published by +# the Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. +# +# This script is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the +# GNU General Public License for more details. +# +# You should have received a copy of the GNU General Public License +# along with this script. If not, see . + + + + + +# Script settings +# --------------- +# Stop the script if there are any errors. +set -e + + + + + +# Input arguments. +bdir=$1 +ddir=$2 +downloader="$3" + + + + + +# Basic directories/files +topdir=$(pwd) +sdir=$bdir/software +tardir=$sdir/tarballs +instdir=$sdir/installed +tmpblddir=$sdir/build-tmp +confdir=reproduce/software/config +ibidir=$instdir/version-info/proglib +downloadwrapper=reproduce/analysis/bash/download-multi-try + +# Derived directories +bindir=$instdir/bin +versionsfile=$confdir/versions.conf +checksumsfile=$confdir/checksums.conf +backupfile=$confdir/servers-backup.conf + + + + +# Set the system to first look into our newly installed programs. +export PATH="$bindir:$PATH" + + + + + +# Load the backup servers +backupservers=$(awk '!/^#/{printf "%s ", $1}' $backupfile) + + + + + +# Download the necessary tarball. +download_tarball() { + # Basic definitions + maneagetar=$tardir/$tarball + + # See if the tarball already exists in Maneage. + if [ -f "$maneagetar" ]; then + just_a_place_holder=1 + else + ucname=$tardir/$tarball.unchecked + + # See if it is in the input software directory. + if [ -f "$ddir/$tarball" ]; then + cp $ddir/$tarball $ucname + else + $downloadwrapper "$downloader" nolock $url/$tarball $ucname \ + "$backupservers" + fi + + # Make sure this is the correct tarball. + if type sha512sum > /dev/null 2> /dev/null; then + checksum=$(sha512sum "$ucname" | awk '{print $1}') + expectedchecksum=$(awk '/^'$progname'-checksum/{print $3}' $checksumsfile) + if [ x$checksum = x$expectedchecksum ]; then mv "$ucname" "$maneagetar" + else + echo "ERROR: Non-matching checksum for '$tarball'." + echo "Checksum should be: $expectedchecksum" + echo "Checksum is: $checksum" + exit 1 + fi; + else mv "$ucname" "$maneagetar" + fi + fi + + # If the tarball is newer than the (possibly existing) program (the version + # has changed), then delete the program. + if [ -f $ibidir/$progname ]; then + if [ $maneagetar -nt $ibidir/$progname ]; then + rm $ibidir/$progname + fi + fi +} + + + + + +# Build the program from the tarball +build_program() { + if ! [ -f $ibidir/$progname ]; then + + # Go into the temporary building directory. + cd $tmpblddir + unpackdir="$progname"-"$version" + + # Some implementations of 'tar' don't recognize Lzip, so we need to + # manually call Lzip first, then call tar afterwards. + csuffix=$(echo $tarball | sed -e's/\./ /g' | awk '{print $NF}') + rm -rf $unpackdir + if [ x$csuffix = xlz ]; then + intarrm=1 + intar=$(echo $tarball | sed -e's/.lz//') + lzip -c -d $tardir/$tarball > $intar + else + intarrm=0 + intar=$tardir/$tarball + fi + + # Unpack the tarball and build the program. + tar xf $intar + if [ x$intarrm = x1 ]; then rm $intar; fi + cd $unpackdir + ./configure --prefix=$instdir + make + make install + cd $topdir + rm -rf $tmpblddir/$unpackdir + echo "$progname_tex $version" > $ibidir/$progname + fi +} + + + + + +# Lzip +# ---- +# +# Lzip is a compression program that is the first built program in Maneage +# because the sources of all other programs (including other compression +# softwaer) are compressed. Lzip has the advantage that it is very small +# (without compression it is just ~400Kb). So we use its '.tar' file and +# won't rely on the host's compression tools at all. 
+progname="lzip" +progname_tex="Lzip" +url=http://akhlaghi.org/src +version=$(awk '/^'$progname'-version/{print $3}' $versionsfile) +tarball=$progname-$version.tar +download_tarball +build_program + + + + + +# GNU Make +# -------- +# +# The job orchestrator of Maneage is GNU Make. Although it is not +# impossible to account for all the differences between various Make +# implementations, its much easier (for reading the code and +# writing/debugging it) if we can count on a special implementation. So +# before going into the complex job orchestration in building high-level +# software, we start by building GNU Make. +progname="make" +progname_tex="GNU Make" +url=http://akhlaghi.org/src +version=$(awk '/^'$progname'-version/{print $3}' $versionsfile) +tarball=$progname-$version.tar.lz +download_tarball +build_program + + + + + +# Dash +# ---- +# +# Dash is a shell (http://gondor.apana.org.au/~herbert/dash). Having it in +# this phase will allow us to have a fixed/identical shell for 'basic.mk' +# (which builds GNU Bash). +progname="dash" +progname_tex="Dash" +url=http://akhlaghi.org/src +version=$(awk '/^'$progname'-version/{print $3}' $versionsfile) +tarball=$progname-$version.tar.lz +download_tarball +build_program + +# If the 'sh' symbolic link isn't set yet, set it to point to Dash. +if [ -f $bindir/sh ]; then just_a_place_holder=1 +else ln -sf $bindir/dash $bindir/sh; +fi + + + + + +# Flock +# ----- +# +# Flock (or file-lock) is necessary to serialize operations when +# necessary. GNU/Linux machines have it as part of their `util-linux' +# programs. But to be consistent in non-GNU/Linux systems, we will be using +# our own build. +# +# The reason that `flock' is built here is that generally the building of +# software is done in parallel, but we need it to serialize the download +# process of the software tarballs to avoid network complications when too +# many simultaneous download commands are called. +progname="flock" +progname_tex="Discoteq flock" +url=http://akhlaghi.org/src +version=$(awk '/^'$progname'-version/{print $3}' $versionsfile) +tarball=$progname-$version.tar.lz +download_tarball +build_program + + + + + +# Finish this script successfully +exit 0 -- cgit v1.2.1