Originally published December 23, 2012
In Harnessing Innovation Through More Knowledge, I argued that the best way to deal with the dilemmas of technology innovation is through more innovation and more knowledge. We need knowledge of the basics of technology, of its use, and of its context.
So in the end, I am a technology optimist. I have to be. Technology is our evolutionary path, even as a species. Technology pessimism means there is no future, no perspective. I believe technological innovation should be encouraged, sponsored, and enabled with every means possible because it can bring so much good to humanity. We should know as much as we can about it, and then some, as Manfred Eigen suggests.
Exactly this, however, raises a realistic concern. Unlike in the Age of Reason, most of us no longer believe we can think everything through. Human rationality has clear boundaries. In our actions and in our decision making, we are driven by more primal needs. We simply do not have the capacity to think through the consequences of our actions; there are too many variables. We tend to be structurally overconfident in our own abilities. Every individual has their own frame of mind, and a total perspective on a matter is rare, if not impossible.
Manfred Eigen argued for more knowledge. I interpret that in several ways: understanding the basics, understanding the use of systems, and understanding the context. Perhaps the most important contextual understanding is to realize that we don’t know the consequences of technology innovation. Socrates got it right when he explained that the one reason he could think of why people called him wise was that he at least realized he didn’t know, whereas others thought they knew.
In conclusion, there are no simple answers here (most likely not even complex ones). Partly we will simply learn by suffering the consequences, as with the effects of nuclear weapons. Sometimes we will discover that something good can come out of it. Some claim that nuclear energy is a good source of energy, though after the Fukushima accident in Japan in 2011, that group has shrunk. Sometimes something bad will come out of inventions aimed at the good, like the side effects of certain medications. This is what the consequentialists will propose as the way forward.
The universalists demand to know the rules up front. If we know that a certain cure can also kill, we should keep searching for one that doesn’t. If we can imagine a scenario in which cars driven by computers instead of humans lead to horrible accidents, we should first test such technologies thoroughly in restricted environments before selling them as commercial products.
Both universalists and consequentialists will agree that the way forward is public discourse. An open debate. Simply banning stem cell research is not the answer, but instructing scientists to organize open public debate as part of the research procedure can be. Case by case, as a community, we need to build an understanding – as much as we humanly can – of the role technology plays in our society. Sometimes we make mistakes, and we learn. Sometimes, when society is uncomfortable, we need to delay and continue the discussion, however much the scientists may want to press on (which I think is better than just saying “no,” as the same research will pop up elsewhere anyway). And sometimes, we’ll get it right. Form your opinion, voice it, and keep an open mind.
Read. Listen. Talk. Write.