Saturday, November 21, 2015

Towards a more sophisticated precautionary principle

Amish society is resilient--but not to asteroids
Source: amishbuggyrides.com
I'm a huge fan of Nassim Taleb's concept of antifragility and the overall philosophical framework that goes with it. Greater emphasis on tail risk--the most extreme example being existential risk--is direly needed in our society. But something that's been troubling me is Taleb's opposition to synthetic biology. This has been in the news lately due to the FDA's approval of AquaBounty's frankenfish.

Taleb's position essentially rests on the precautionary principle: despite the potential upsides of GM crops, the risk of extremely terrible outcomes (biosphere collapse) is high enough to justify banning the technology. The ball metaphor in the embedded video roughly describes the structure of technological risk that drives Taleb's position.

Concern over GM crops in some ways makes sense. Tinkering with systems that involve exponential growth and replication, like biology, exposes us to massive downside risk. The probability of a biosphere collapse caused by synthetic biotech is hazy but scientifically plausible. Fair enough.

Where I think Taleb errs is in assuming this issue exists in isolation and apart from the broader context of political and cultural attitudes towards technology.

The fundamental dilemma that humanity faces regarding existential risk is this: the stakes are so catastrophic that no degree of precautionary restriction on innovation is sufficient to guarantee a 0% probability of human extinction. As such, the only--only--solution is to establish modular and sustainable space colonies, ensuring that the inevitable obliteration of one local human population won't doom us all.
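The logic of redundancy here is just multiplication of independent probabilities. A toy sketch (my own illustration, with made-up numbers--not a claim from Taleb or anyone else) shows why independent colonies shrink joint extinction risk so dramatically:

```python
# Toy model: extinction risk with independent populations.
# The per-period catastrophe probability p is a hypothetical
# number chosen purely for illustration.

def joint_extinction_prob(p: float, n: int) -> float:
    """Probability that n fully independent populations are ALL
    wiped out, if each independently faces a local catastrophe
    with probability p in a given period."""
    return p ** n

# One population, hypothetical 1% catastrophe risk per century:
one = joint_extinction_prob(0.01, 1)    # 0.01
# Three independent colonies facing the same local risk:
three = joint_extinction_prob(0.01, 3)  # ~1e-06, a ten-thousandfold drop
```

The key assumption is independence: colonies only buy this benefit if a single event (or a single engineered pathogen) can't reach all of them at once, which is exactly why the post stresses "modular" colonies.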

So then, the question of whether the practical application of a precautionary principle has negative ramifications for humanity's ability to sustainably colonize space becomes quite important. And on the question of GM crops, I think the answer is quite clearly that opposition to them both sustains and is fueled by an anti-science and generally myopic worldview. On any single technology, taking an overly cautious approach might be rational. But aggregated over a million technologies, we'll see increased stasis and decreased dynamism.
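The "aggregated over a million technologies" point can be made concrete with another toy calculation (again my own illustrative numbers, not figures from the post): even a tiny chance of wrongly blocking any one technology compounds across many of them.

```python
# Toy model: compounding precaution. q is a hypothetical
# probability that any single stepping-stone technology gets
# over-cautiously blocked; the technologies are assumed
# independent for simplicity.

def prob_all_permitted(q: float, n: int) -> float:
    """Chance that all n stepping-stone technologies survive
    review, if each is independently blocked with probability q."""
    return (1 - q) ** n

one_tech = prob_all_permitted(1e-6, 1)          # ~0.999999: negligible cost
million = prob_all_permitted(1e-6, 1_000_000)   # ~0.37 (about e**-1): large
```

A one-in-a-million chance of blocking each technology looks harmless in isolation, but across a million technologies the odds that the whole chain survives fall to roughly 1/e. That's the stasis worry in miniature.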

Sustainably colonizing space is a long way off. But I think it's safe to assume that this outcome is not at all inevitable. To get there, we'll first need stepping-stone technologies, some of which (like synthetic biology) will bring tail risks. The problem then becomes one of weighing these risks against the ultimate goal of space livin'. Too much precaution over too many issues might ultimately doom us to eventual oblivion. Since time is indeed a factor, I tend to see a proactionary attitude towards new technologies as in fact the truest interpretation of the precautionary principle's tenets.