Tuesday, 31 July 2012

When recycling goes wrong

Having blogged so consistently about the opportunities for great improvements in the recycling of waste, it is worth recalling that recycling can go wrong.

A great (terrible) example among many is the outbreak of Bovine spongiform encephalopathy (BSE) in the UK in the late 1980s and early 1990s. This was caused by the recycling of animal remains (specifically those of contaminated cattle) into meat and bone meal, which was then added to cattle feed to improve its quality. The idea was that the animal remains were sterilised, and so no diseases could be spread in this form. On the face of it, this seems a wise use of an otherwise wasted resource.

Of course, with hindsight we know this was not the case. The disease led to the slaughter of 4 million cattle in the UK, and to the deaths of over 200 people by October 2009 (given its long incubation period, cases may continue to appear). And with hindsight we ask ourselves how it could ever have been considered a good idea to have cattle, which are naturally plant eaters, in effect cannibalising other cattle.

[Image: Cow with BSE, digging frantically but going nowhere. Source: Wikipedia]


We can respond to the BSE experience in several ways. One extreme is to demand full testing of all risks before we try anything new. This approach essentially requires that anything new be proven to be safe - formally an impossibility (as Nassim Nicholas Taleb pithily wrote: absence of evidence is not evidence of absence), and insane as a blanket rule. In effect it decides that the status quo remains, and the status quo may be far worse than the unprovable alternative.

The other extreme is to pursue and implement any new idea, stopping only when it proves to be bad. This is easy to do, but built into the approach is an acceptance that bad things will happen, and that this is ok. On big risks this is also insane.

So the middle ground ends up, as always, being the best ground. You try to avoid big negative outcomes. This is not quite the same as quantifying the risk, because some risks are unknowable with the current knowledge base. The aim is to identify what bad things might happen, even though we don't know how they might happen, and then to guard against the worst.

I don't quite know what the rules of thumb might be for this evaluation. Perhaps it has something to do with not messing with the ordinary order of things (raising the questions "what is ordinary?", "ordinary to whom?" and so on). It might look at the natural cycles of materials (though there is not a lot natural about steel manufacture). It might even settle for making things no worse than they are now. But I think something could be derived.

At core, we need to be careful, but not paralysed by fear. We need to be especially careful around food and around systems that are hugely complex and perhaps unknowable in the full scale of their potential interactions (such as animals and the environment). But at the same time, we need to keep the course clear for innovation. And from there, keep a ready eye out for things going wrong.

Perhaps this warning is obvious and pointless. Perhaps, but I think it is worth the reminder that recycling is not always best. Some paths need to be very carefully thought through before they are begun. But this does not mean no path should be begun.
