Diffstat (limited to 'peer-review/1-answer.txt')
-rw-r--r--  peer-review/1-answer.txt | 80
1 file changed, 38 insertions, 42 deletions
diff --git a/peer-review/1-answer.txt b/peer-review/1-answer.txt
index ae28c5f..55be70a 100644
--- a/peer-review/1-answer.txt
+++ b/peer-review/1-answer.txt
@@ -7,17 +7,18 @@ already done a very comprehensive review of the tools (as you may notice
from the Git repository[1]). However, the CiSE Author Information
explicitly states: "The introduction should provide a modicum of
background in one or two paragraphs, but should not attempt to give a literature
-review". This is the usual practice in previously published papers at CiSE and
-is in line with the very limited word count and maximum of 12 references to
-be used in bibliography.
+review". This is the usual practice in previously published papers at CiSE
+and is in line with the maximum 6250 word-count and maximum of 12
+references to be used in bibliography.

We agree with the need for this extensive review to be on the public record
-(creating the review took a lot of time and effort; most of the tools were run and
-tested). We discussed this with the editors and the following
-solution was agreed upon: we include the extended review as a set of appendices in
-the arXiv[2] and Zenodo[3] pre-prints of this paper and mention these
-publicly available appendices in the submitted paper so that any interested
-reader can easily access them.
+(creating the review took a lot of time and effort; most of the tools were
+run and tested). We discussed this with the editors and the following
+solution was agreed upon: the extended reviews will be published as a set
+of appendices in the arXiv[2] and Zenodo[3] pre-prints of this paper. These
+publicly available appendices are also mentioned in the submitted paper so
+that any interested reader of the final paper published by CiSE can easily
+access them.

[1] https://gitlab.com/makhlaghi/maneage-paper/-/blob/master/tex/src/paper-long.tex#L1579
[2] https://arxiv.org/abs/2006.03018
@@ -205,24 +206,24 @@ ANSWER:
large. However, the 6250 word-count limit is very tight and if we add more
on it in this length, we would have to remove points of higher priority.
Hopefully this can be the subject of a follow-up paper.
-3. A review of ReproZip is in Appendix B.
-4. A review of Occam is in Appendix B.
-5. A review of Popper is in Appendix B.
-6. A review of Whole Tale is in Appendix B.
-7. A review of Snakemake is in Appendix A.
-8. CWL and WDL are described in Appendix A (Job management).
-9. Nextflow is described in Appendix A (Job management).
-10. Sumatra is described in Appendix B.
-11. Podman is mentioned in Appendix A (Containers).
-12. AppImage is mentioned in Appendix A (Package management).
-13. Flatpak is mentioned in Appendix A (Package management).
-14. Snap is mentioned in Appendix A (Package management).
+3. A review of ReproZip is in Appendix C.
+4. A review of Occam is in Appendix C.
+5. A review of Popper is in Appendix C.
+6. A review of Whole Tale is in Appendix C.
+7. A review of Snakemake is in Appendix B.
+8. CWL and WDL are described in Appendix B (Job management).
+9. Nextflow is described in Appendix B (Job management).
+10. Sumatra is described in Appendix C.
+11. Podman is mentioned in Appendix B (Containers).
+12. AppImage is mentioned in Appendix B (Package management).
+13. Flatpak is mentioned in Appendix B (Package management).
+14. Snap is mentioned in Appendix B (Package management).
15. nbdev and jupytext are high-level tools to generate documentation and
packaging custom code in Conda or pypi.
High-level package managers like Conda and Pypi have already been thoroughly
reviewed in Appendix A for their longevity issues, so we feel that there is
no need to include these.
-16. Bazel is mentioned in Appendix A (job management).
+16. Bazel is mentioned in Appendix B (job management).
17. Debian's reproducible builds are only designed for ensuring that
software packaged for Debian is bitwise reproducible. As mentioned in the
discussion section of this paper, the bitwise reproducibility of software is
@@ -244,12 +245,12 @@ ANSWER:
* A model project for reproducible papers: https://arxiv.org/abs/1401.2000
* Executable/reproducible paper articles and original concepts

-ANSWER: Thank you for highlighting these points. Appendix B starts with a
+ANSWER: Thank you for highlighting these points. Appendix C starts with a
subsection titled "suggested rules, checklists or criteria" with a review
of existing sets of criteria. This subsection includes the sources proposed
by the reviewer [Sandve et al; Rule et al; Nust et al] (and others).

-ArXiv:1401.2000 has been added in Appendix A as an example paper using
+ArXiv:1401.2000 has been added in Appendix B as an example paper using
virtual machines. We thank the referee for bringing up this paper, because
the link to the VM provided in the paper no longer works (the URL
http://archive.comp-phys.org/provenance_challenge/provenance_machine.ova
@@ -348,7 +349,7 @@ FreeBSD (despite having bit-wise different executables).
provides little novelty (see comments below).

ANSWER: The previously suggested sets of criteria that were listed by
-Reviewer 1 are reviewed by us in the newly added Appendix B, and the
+Reviewer 1 are reviewed by us in the newly added Appendix C, and the
novelty and advantages of our proposed criteria are contrasted there with
the earlier sets of criteria.

@@ -541,7 +542,7 @@ ANSWER: Thank you very much for pointing out the works by Thain. We
couldn't find any first-author papers in 2015, but found Meng & Thain
(https://doi.org/10.1016/j.procs.2017.05.116) which had a related
discussion of why they didn't use Docker containers in their work. That
-paper is now cited in the discussion of Containers in Appendix A.
+paper is now cited in the discussion of Containers in Appendix B.

------------------------------

@@ -554,7 +555,7 @@ paper is now cited in the discussion of Containers in Appendix A.
ANSWER: Thank you for the reference. We are restricted in the main
body of the paper due to the strict bibliography limit of 12
-references; we have included Kurtzer et al 2017 in Appendix A (where
+references; we have included Kurtzer et al 2017 in Appendix B (where
we discuss Singularity).

------------------------------

@@ -569,7 +570,7 @@ we discuss Singularity).
ANSWER: The FAIR principles have been mentioned in the main body of the
paper, but unfortunately we had to remove its citation in the main paper
(like many others) to keep to the maximum of 12 references. We have cited it in
-Appendix B.
+Appendix C.

------------------------------

@@ -583,15 +584,10 @@ Appendix B.
further enrich the tool presented.

-ANSWER: Our section II discussing existing tools seems to be the most
-appropriate place to mention IPOL, so we have retained its position at
-the end of this section.
-
-We have indeed included an in-depth discussion of IPOL in Appendix B.
-We recommend it to the reader for any project written uniquely in C,
-and we comment on the readiness of Maneage'd projects for a similar
-level of peer-review control.
-
+ANSWER: We agree and have removed the IPOL example from that section.
+We have included an in-depth discussion of IPOL in Appendix C and we
+comment on the readiness of Maneage'd projects for a similar level of
+peer-review control.

------------------------------

@@ -657,7 +653,7 @@ Within the constraints of space (the limit is 6500 words), we don't
see how we could add more discussion of the history of our choice of
criteria or more anecdotal examples of their relevance.

-We do discuss some alternatives lists of criteria in Appendix B.A,
+We do discuss some alternatives lists of criteria in Appendix C.A,
without debating the wider perspective of which criteria are the most
desirable.

@@ -1027,7 +1023,7 @@ ANSWER: This work and the proposed criteria are very different from
Popper. We agree that VMs and containers are an important component of
this field, and the appendices add depth to our discussion of this.
However, these do not appear to satisfy all our proposed criteria.
-A detailed review of Popper, in particular, is given in Appendix B.
+A detailed review of Popper, in particular, is given in Appendix C.

------------------------------

@@ -1041,7 +1037,7 @@ A detailed review of Popper, in particular, is given in Appendix B.
most promising tools for offering true reproducibility.

ANSWER: Containers and VMs have been more thoroughly discussed in
-the main body and also extensively discussed in appendix A (that are
+the main body and also extensively discussed in appendix B (that are
now available in the arXiv and Zenodo versions of this paper). As
discussed (with many cited examples), Containers and VMs are only
appropriate when they are themselves reproducible (for example, if
@@ -1088,7 +1084,7 @@ of each step (why and how the analysis was done).
additional work highly relevant to this paper.

ANSWER: Thank you for the interesting paper by Lofstead+2019 on Data
-pallets. We have cited it in Appendix A as an example of how generic the
+pallets. We have cited it in Appendix B as an example of how generic the
concept of containers is.

The topic of linking data to analysis is also a core result of the criteria
@@ -1126,7 +1122,7 @@ ANSWER: All these tools have been reviewed in the newly added appendices.

ANSWER: A thorough review of current low-level tools and and high-level
reproducible workflow management systems has been added in the extended
-Appendix.
+Appendices.

------------------------------