tags: [decision tools, reproducibility] categories: [synthesis]

reproducibility for decision support tools

I’ve set aside this document for developing a parallel framework of reproducibility issues that go beyond NHST and are particular to Decision Support Tools in ecology and conservation.

Why? Because most work addressing reproducibility, both in and outside of ecology, has focused on hypothesis testing (predominantly NHST), while non-NHST methods are important tools in the Decision Support toolbox for conservation science and applied ecology.

From this, I hope to:

a) develop the coding criteria for the systematic review
b) propose some sort of gold-standard protocol for developing reproducible decision support tools

Questionable Research Practices

P-hacking / Cherry-picking

Is there some a priori preferred decision, one that the stakeholder, or even the analyst, prefers? What happens if the tool doesn’t support this? Is the tool “hacked” until it does support the preferred decision? Are there particular modelling approaches or tools that are more or less immune to this type of hacking?

Decision Tool Frameworks

distinguishing decision from process modelling

Is the process of making the decision given its own focus and its own implementation procedure, or is it just taken as a given? That is, is the decision process distinguished from the prediction of impacts of alternative actions / policy scenarios?

Many tools, particularly in conservation prioritisation problems, present the results of some process model / projection analyses, but without detailing how the choice between alternative actions / scenarios should occur. I think this is problematic because, at this point, the values and risk-attitudes of the decision-maker can influence the chosen alternative.

**decision-maker / stakeholder values and risk-attitudes**

Thus, another criterion might be: are values / risk attitudes given attention in the absence of some technical method for deciding between alternatives? (And even then, what values / preferences are embedded in the ‘objective’ tool…?)
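To illustrate how risk attitudes alone can flip the chosen alternative, here is a minimal sketch in Python. The actions, scenarios, and payoff numbers are entirely hypothetical; the point is that the same projected outcomes, fed through two standard decision rules (risk-neutral expected value vs. risk-averse maximin), select different actions.

```python
# Sketch: the same projected outcomes can yield different "best" actions
# depending on the decision rule, i.e. the decision-maker's risk attitude.
# All actions, scenarios, and payoffs below are hypothetical illustrations.

# Projected benefit of each action under three scenarios (e.g. climate futures)
outcomes = {
    "protect_reserve_A": [10, 10, 10],   # safe, modest benefit everywhere
    "restore_corridor_B": [30, 25, -5],  # high upside, one bad scenario
}

def expected_value(vals):
    """Risk-neutral rule: average over scenarios (equal weights assumed)."""
    return sum(vals) / len(vals)

def maximin(vals):
    """Risk-averse rule: judge each action by its worst-case scenario."""
    return min(vals)

for rule in (expected_value, maximin):
    best = max(outcomes, key=lambda a: rule(outcomes[a]))
    print(f"{rule.__name__}: choose {best}")
# expected_value: choose restore_corridor_B
# maximin: choose protect_reserve_A
```

A tool that reports only the projections, without stating which rule (or scenario weighting) it applies, leaves this value-laden step implicit and unreproducible.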

