tags: [systematic review, decision support] categories: [meeting]

Meeting, 15 March 2018

Libby, Hannah present.

Exploring systematic review methods


Add my notes from CJ meeting here.


Also see:

  • BES - will be discussed in coding club / reading club.
  • SER (Society for Ecological Restoration) -> Australian chapter, SERA.
  • Other disciplines where science / evidence is used in decision-making (ecotoxicology, biosecurity).

Authors who have worked on systematic reviews in ecology (@TODO):

  • Catherine Pickering
  • Robin Hale
  • W. J. Sutherland

Developing Questions for the review, and the thesis

Seeing 3 areas of focus emerging:

  1. What constitutes reproducibility in this context, and how do we evaluate it?
  2. Reporting standards / transparency, especially around particular decision points. Libby: would you look only at the model development phase, or also at other steps, such as conceptual model development?
  3. Incorporation and synthesis of evidence in DST development, and the issue of modeller’s choice (i.e. in the absence of available / appropriate evidence, modellers often use themselves as experts, choosing, for example, a particular probability distribution for a given variable; a hypothetical sketch of documenting such a choice follows below). Libby notes that this is a subset of point 2 above: these choices need to be documented in a transparent fashion.
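
As a concrete illustration of point 3, here is a minimal, hypothetical sketch of how a modeller's subjective distributional choice could be recorded transparently alongside the model. The `ElicitedDistribution` record and all names in it are assumptions made for this sketch, not anything agreed in the meeting.

```python
# A minimal sketch of recording a modeller's subjective choice of
# probability distribution so it can be audited later. All names
# here are hypothetical illustrations, not an agreed standard.
from dataclasses import dataclass


@dataclass
class ElicitedDistribution:
    """One documented distributional choice made during model development."""
    variable: str      # model variable the distribution applies to
    family: str        # distribution family chosen, e.g. "beta"
    parameters: dict   # parameter values chosen
    source: str        # evidence base, or "expert judgement"
    rationale: str     # why this choice was made


# Example: no usable field data for a survival probability, so the
# modeller acts as their own expert and says so explicitly.
adult_survival = ElicitedDistribution(
    variable="adult_survival",
    family="beta",
    parameters={"alpha": 8, "beta": 2},
    source="expert judgement (modeller)",
    rationale="No field estimates for this system; centred near 0.8 "
              "based on related species.",
)

print(adult_survival)
```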

There is arguably a question around how to do this ‘reproducibly’: how do we set standards for incorporating evidence into decision tools?

What types of systematic bias emerge because of the state of the evidence base? How do these forms of systematic bias feed into the reproducibility, or lack thereof, of DSTs?

Multiple levels…

Possible Experiments

CJ approached Libby and Elise for advice and potential collaboration on demonstrating the benefit of environmental water allocations across Victoria. At this point the project is loosely defined, with lots of scope to work on something of mutual benefit to CJ and Elise, and there is also the opportunity for funding / resources. Because there are multiple river systems, this presents the opportunity for some sort of reproducibility / replication experiment.

Give a problem brief, with 2-3 objectives, to a group of researchers (~5) and ask them to develop a decision support tool, choosing whichever approach / method they like. Give a survey afterwards, asking them to describe what they did and why; if they can’t remember, then it’s probably not replicable / reproducible. Alternatively, ask them to write up a methods section to hand over at the end.

How to entice?

  • Authorship
  • Payment - there are funds available

Who? Target different modellers / people taking different approaches to decision support / analysis:

  • Anca, Tom, Mick…
  • Less “tech heavy” people taking other approaches - Terry Walshe?
  • CEED / UQ folk working on environmental decisions

How to implement? Chris to be present in the room; participants can consult with Chris at any point (record whether this itself represents a decision point).

How to evaluate? And how would this also benefit Chris? It would be a sort of sensitivity analysis, telling him about the robustness of different models, but also of different processes. It could show that the decision is reasonably robust to different processes / tools.

Elise’s to-dos for next week:

  • Start thinking about the structure of the review - what questions would you ask?
  • What questions do you want to ask in the thesis?
  • 3-month review - set up a Doodle poll to schedule the meeting. JUST GET IT DONE, Elise!