This week it was reported that soldiers may, in the near future, have their minds plugged directly into weapons systems and have their learning boosted by neural stimulation. The Royal Society's [UK Academy of Sciences] Brain Waves project on new directions in neuroscience gives us much to reflect on and worry about. It follows the news last week that scientists are developing a "mind-reading" technique to capture thoughts.
Research in all this is in its infancy, but though new understandings of how the brain works generate new treatments for disease and brain damage, they also expose us to many new dangers. The challenge is always to use judgement and, if necessary, force to maximize good and minimize evil. However, we should be clear that there is no risk-free precautionary option; therapy delayed is rescue denied. As in all other areas of human activity, choice is not an option but a destiny. How should we choose?
The Royal Society report spoke of brain-machine interfaces (BMIs) to connect people's brains directly to machinery. These interfaces are already being used to control artificial limbs for amputees, but they would also be effective in improving the speed and accuracy of weapons systems.
Rod Flower, chair of the report’s working group, rightly asks: “If you are controlling a drone and you shoot the wrong target or bomb a wedding party, who is responsible for that action? Is it you or the BMI?”
While this is a nice puzzle, the alternative without BMIs might be a greater likelihood that the wrong target will be chosen or hit. If we ban military BMIs, who is responsible for that?
The bigger question, though, is how to reduce the incidence of events in which people suffer and others must be called to account. Think of smart drugs that improve thought. Modafinil, a drug that keeps pilots alert, can aid military pilots, but in doing so it also protects civilian passengers. The same drug enhances other cognitive functioning, including exam performance.
We humans need to be smarter in order to combat a monstrous regiment of dangers that include climate change, meteorite strikes, diseases such as AIDS and Creutzfeldt-Jakob disease, and an over-precautionary approach to innovation that could increase, rather than reduce, our vulnerability to these and other dangers. The dilemma is: Whither caution? The ability to choose between caution and adventure assumes we can predict accurately — something we humans have been lamentably bad at.
In future, we are also likely to face an ethical dilemma over memory manipulation. This is now a distinct possibility because drugs are available that can wipe, or certainly dampen, our recollection of events. Why should we tamper with our access to history? Well, one good reason is that memories can be traumatic. The victim of, for example, a brutal rape, might well wish to wipe the memory. However, what if so doing removes the capacity to identify the perpetrator, and leaves him free to ruin others’ lives?
The neurotransmitter serotonin and the molecule oxytocin are hailed as agents that, by increasing reluctance to cause suffering on the one hand and trust on the other, can bring about an improvement in morals. Adjusting the levels of these chemicals in the body will effect changes that bypass decision-making and make certain behavior, for all practical purposes, automatic. Why should we worry about bypassing morally defective decision-making? One reason is that it takes away our freedom.