Discussion about this post

Doug Radford

As someone who just completed an engineering PhD thesis on using optimisation for bushfire risk management, I have a lot of thoughts on this. I think you're absolutely right to say that using tech like optimisation in decision-making can have a dark side (it does).

Whatever we might try to convince ourselves are "optimal" decisions will only ever be "optimal" within the context of whatever objectives we define as part of the optimisation (importantly, considering who is setting those objectives and how), and of how well we can represent those objectives (given our limited ability to model fire behaviour, environmental processes, etc. across the aforementioned deep time/space dimensions).

Should we use optimisation to make decisions? Absolutely not - unless you think you are an all-knowing being who has accounted for every possible objective, represented them all accurately and precisely, and then weighed up those objectives perfectly (whatever that would mean).

If you do not have a god complex and believe yourself to be a normal person, then using optimisation to make your decisions would be akin to getting ChatGPT to do your homework and just clicking submit.

However (and it's always on the "however" that you reveal yourself to be the fool), I do think that optimisation gives us an interesting ability to explore the decisions we might make and how those decisions might change with respect to changing conditions or objectives. For example, how might person A make decisions compared with person B, based on their different priorities or objectives? Are there decisions that might work for both parties? Are there alternative ways for person A to reach their objectives without negatively impacting person B's? Do the actions of person B maybe help person A more than they realise, and should person A take a bit more time to appreciate that? Are there ways of achieving our "objectives" that we can't even imagine, given the deluge of land tenures, constraints, trade-offs and multiple objectives we are trying to juggle?
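The exploratory use described above can be sketched with a toy multi-objective example. All of the candidate actions, scores and weights below are hypothetical, invented purely for illustration - the point is only that the "optimal" choice shifts as the objective weights (i.e. whose priorities count, and by how much) change:

```python
# Toy sketch: the "optimal" decision depends on how we weight the objectives.
# Actions, scores and weights are all hypothetical illustrations.

# Each candidate action is scored against two objectives (higher is better),
# e.g. bushfire risk reduced vs. ecological impact avoided.
actions = {
    "broad_burn":    (0.9, 0.2),
    "targeted_burn": (0.6, 0.7),
    "no_burn":       (0.1, 0.9),
}

def best_action(w_risk, w_ecology):
    """Return the action maximising a weighted sum of the two objectives."""
    return max(actions, key=lambda a: w_risk * actions[a][0] + w_ecology * actions[a][1])

# Person A prioritises risk reduction; person B prioritises ecology.
print(best_action(0.9, 0.1))  # broad_burn
print(best_action(0.1, 0.9))  # no_burn
print(best_action(0.5, 0.5))  # targeted_burn - a compromise neither weighting alone finds
```

Run as exploration rather than prescription, sweeping the weights like this maps out the trade-off space (here, the middle weighting surfaces a compromise action) without pretending any single weighting is "the" right one.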

I don't think we should use optimisation to make decisions outright (clearly), but I do wonder what role it could play (dark side impulse?)...

I hope that if we do choose to pick optimisation up, we make sure to take the time to do so in a way that is responsible (what might that even look like?). In the final paragraph of my thesis, I wrote that risk-based planning (and using optimisation within that space) is itself at risk of being a modern extension of the colonial views we have seen towards fire suppression. Is this avoidable? Or are we in the midst of that Jurassic Park scene: waiting for Jeff Goldblum to tell us that we were so focussed on whether we could that we didn't stop to ask ourselves whether we should (the whole monologue prior to that line is perfect).

Patrick Jurd

The serenity prayer has been a personal fav for some time - and then I discovered its link with 12-step programs!!

Control is very human, but it can quickly turn toxic, and I think it speaks of fear and insecurity. As a recovering control freak, I know!!

Thanks again
