In a Swedish newspaper article/blog, the Swedish journalist Clas Svahn discusses the book ”Katastrofer under 100 år” (Catastrophes during 100 years) written by the Danish researcher and historian Rasmus Dahlberg. Reading the article, I react against the view of the role of both humans and technology.
The effect of human negligence never seems to diminish: “The more people we become and the more technical solutions are introduced, the more possible sources of misery there will be. ... The common thread here is without a doubt humans and the fact that we are not flawless, but full of arrogance and laziness and swayed by peer pressure, and we therefore make decisions that sometimes result in disasters that no one could have predicted.”
I don’t like this perspective. Technology and technical solutions are not introduced to increase safety; they are introduced to make things possible that were not possible before. At the introduction there is an unconscious (and sometimes conscious) process of weighing the gain against the risks. If the gain does not weigh heavier, the solution will not be introduced (that’s how our laziness works, and it has made humans successful at spreading across this planet). Look for instance at inventions such as the bungee cord, the machine gun or the car. Not safe at all, but enough people have perceived the benefits as more important than the risks and therefore used these things. No one thinks these inventions are harmless, and most of the disasters they can lead to have been envisioned and predicted (including an uninhabitable Earth).
As I’ve written before, humans are very good at feeling/identifying when something is going wrong. In most of these instances humans react and stop catastrophes in the making. This happens all the time and is most often not documented (and the saved lives are not counted). Humans are fantastic!
Also, the number of people in this world is only possible thanks to our inventions. Technology kills many, but supports many more (at least for now). And looking at how we destroy our planet, it is not human errors leading to catastrophes that are killing the Earth. It is a systematic flaw in weighing gain against risk, because there are two things that don’t work as well as they could, or should:
- The process of weighing potential gain against the risks (today, many introductions of new technology are too complex for our intuition).
- Our never-ending striving for (economic) growth, which skews our perspective on gain and therefore (can) lead us to introduce things that we don’t need.