Five Reasons Why Infrastructure Sustainability Assessments Fail to Manage Community Risk #5

This is the last in a series of five posts exploring common community engagement failures in infrastructure planning. It follows on from posts exploring how world-view biases can be a pitfall when translating community values into operational decision-making criteria, and how heuristics can unhelpfully influence project option design and collective evaluation. This post considers the treatment of uncertainty when reporting the outcomes of multi-criteria analysis (MCA) to stakeholders and decision-makers.

Reason #5: Failure to present residual uncertainty to decision-makers

Usually, the MCA process leads to the reporting of one or more recommendations. A report will typically explain the MCA method and the reasons for its selection, and it will provide transparent justification for the appraisal of project options, including the key factors the authors believe justify the recommendation(s).

Stakeholders, including the media, have become used to detailed impact assessments and appear to place enormous emphasis on modelling. Don’t get me wrong: as impact assessors, our bread and butter is the well-informed guess that is social impact modelling. But the reporting of uncertainty around modelled outcomes can be challenging – especially in public sector infrastructure projects.

Some MCA methods incorporate the treatment of uncertainty into the model, and all methods should include a sensitivity analysis to demonstrate how far key assumptions and weightings need to change before another option becomes most preferred. But uncertainty will continue to be an important factor in many decisions.
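To make the sensitivity analysis idea concrete, here is a minimal sketch of a weight-sweep on a weighted-sum MCA. Everything in it is hypothetical – the option names, criterion scores, and weights are invented for illustration, not drawn from any real appraisal:

```python
# Minimal sketch of a sensitivity check on a weighted-sum MCA.
# All option names, scores, and weights here are hypothetical.

options = {
    "Option A": [0.8, 0.4, 0.6],  # scores against three criteria
    "Option B": [0.5, 0.9, 0.5],
}
weights = [0.5, 0.3, 0.2]  # baseline criterion weights (sum to 1)

def total(scores, w):
    """Weighted-sum score for one option."""
    return sum(s * wi for s, wi in zip(scores, w))

def preferred(w):
    """The most preferred option under weight vector w."""
    return max(options, key=lambda name: total(options[name], w))

# Sweep the weight on criterion 1 across a grid, renormalising the
# remaining weights, and report where the preferred option switches.
prev = None
for i in range(21):
    w1 = i / 20
    scale = (1 - w1) / (weights[1] + weights[2])
    w = [w1, weights[1] * scale, weights[2] * scale]
    p = preferred(w)
    if prev is not None and p != prev:
        print(f"Preference switches near w1 = {w1:.2f}: {prev} -> {p}")
    prev = p
```

The threshold weight at which the ranking flips is exactly the kind of finding worth surfacing to decision-makers: if a small, defensible change in one weighting would reverse the recommendation, the report should say so.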

We have been focused on the inclusion and representation of community values in MCA processes, but the avoidance of reporting uncertainty is not confined to MCA. We have mentioned the IAP2 Quality Standards, which require an evaluation of the successes and failures of a public participation project: essentially, what the decision-maker needs to know about the limitations of the engagement given cost, time and capability constraints, and how the quality of similar future projects might be improved. Yet these evaluations are still relatively rare in the public domain.

The Ostrich Effect describes the tendency to ignore information that points to failure, or to justify actions that led to failure, in order to avoid taking responsibility. We see this effect after cost blow-outs on infrastructure projects. The leadership challenge is to promote a culture that talks openly about past shortcomings and documents failures and their causes. Most managers are comfortable celebrating success, but leadership is required to signal the value of learning from mistakes too.

In project sustainability assessments, we have seen consultants avoid probing past mistakes for fear of putting client teams on the defensive. More successful engagements (often on projects employing Alliancing or Early Contractor Involvement procurement strategies) have been associated with leadership teams that communicated the value of challenging assumptions and current practices.

To ensure equitable consideration of the most vulnerable in the community, there are some common-sense guidelines for reporting the uncertainty aspects of an MCA to stakeholders and decision-makers:

  • Characterize residual uncertainties as explicitly and unambiguously as possible.
  • If it is possible to assign probabilities to competing assumptions being true, that will assist the decision-maker. It may even be possible to construct a probability distribution over a particular outcome.
  • Where possible, provide an indication of the order of magnitude of a consequence or estimate the consequences of project options under alternative scenarios.
  • Make any trade-off of risks under any option explicit to the decision-maker. Consider supplementing the performance matrix with a diagram of risk profiles under various assumptions.
  • Provide context for the uncertainty. For instance, is one project option modelled close to a steep part of the likely cost curve, or does an option come close to a previous trigger of community outrage?
  • Objectively report the level of agreement and disagreement among participants/experts and the detailed reasons for that disagreement. While consensus is desirable, it is not mandatory.
  • Where relevant, provide recommendations to further reduce uncertainty (the nature and scale of further studies, the types of studies that are not useful in reducing uncertainty, continued monitoring of the situation, or targeted consultation).
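Several of these points – probabilities over competing assumptions, distributions of outcomes, and risk profiles under various assumptions – can be sketched with a small Monte Carlo simulation. The numbers below are entirely illustrative assumptions, not data from any project: each criterion score is given a plausible range, and repeated sampling shows how often each option comes out most preferred.

```python
import random

# Hypothetical Monte Carlo sketch: sample uncertain criterion scores
# and count how often each option is most preferred under a
# weighted-sum MCA. Option names, ranges, and weights are invented.

random.seed(1)

# (low, high) ranges expressing residual uncertainty in each score
score_ranges = {
    "Option A": [(0.7, 0.9), (0.3, 0.5), (0.5, 0.7)],
    "Option B": [(0.4, 0.6), (0.8, 1.0), (0.4, 0.6)],
}
weights = [0.5, 0.3, 0.2]  # criterion weights (sum to 1)

wins = {name: 0 for name in score_ranges}
N = 10_000
for _ in range(N):
    totals = {
        name: sum(random.uniform(lo, hi) * w
                  for (lo, hi), w in zip(ranges, weights))
        for name, ranges in score_ranges.items()
    }
    wins[max(totals, key=totals.get)] += 1  # tally this sample's winner

for name, count in wins.items():
    print(f"{name}: most preferred in {count / N:.0%} of samples")
```

Rather than a bare ranking, this supports a statement of how often each option is preferred across the sampled assumptions, which is a more honest basis for a recommendation when uncertainty is material.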

Key point #5:

Decision quality extends to the decision-maker being trusted to understand the limitations of the analysis and being able to tolerate both ambiguity and a degree of uncertainty.