The 2018 edition of USI introduces a new format, the Open Discussion, the first of which brings together three experts in psychology and the human sciences: Dan Ariely, Moran Cerf and Sandra Matz. The topic under discussion, both thorny and fascinating, sits at the crossroads of science and philosophy: can human behavior be changed in an ethical manner?
The hurdle of ethics
Professor Dan Ariely, author of the bestselling Predictably Irrational: The Hidden Forces That Shape Our Decisions, opens the panel discussion with an attempt to draw the line between the ethical and the unethical: “If you could write any experiment, what would you do?” Every scientist in the humanities dreams of studying the behavior of twins brought up in two completely different contexts, such as one in China and one in the US. And yet all agree that such an experiment would be immoral.
On a more personal note, Dan Ariely recalls his hospitalization after he suffered third-degree burns over more than 70% of his body. The hospital staff wrapped him in tight bandages, a very painful procedure intended to massage the skin and maintain blood pressure. The professor of psychology and behavioral economics noted not only the absence of results on his own body but also the absence of any proper experimentation behind the method. He therefore offered to be burned twice more, with one burn treated with the tight bandages and the other left uncovered, so as to measure the method’s effectiveness. Despite his willingness, the experiment was deemed unethical by his doctors, who preferred to stick to their treatment. But is it really more ethical to inflict a painful treatment that has never been scientifically validated? He concludes: “Some experiments are unethical, but sometimes it’s unethical not to do the experiment.”
According to Sandra Matz, a specialist in the social sciences, current technology makes it possible to infer the personality, values, and political and religious beliefs of any given person from their cell phone, credit card records or Facebook profile. This psychological profiling was used by the company Cambridge Analytica during the last US presidential election in an attempt to sway the vote, which caused a public uproar. But which is more shocking, the means or the end? Dan Ariely invites us to ask: “If Cambridge Analytica had helped prevent Trump from being elected, would you consider it immoral?”
Informed or uninformed consent
Sandra Matz is convinced that psychological targeting can have a positive effect on society and individuals, under three conditions: people’s consent must be obtained, they must be told what their data will be used for, and, crucially, the data must serve a good cause. None of these three conditions were met by Cambridge Analytica.
For Moran Cerf, a former hacker turned neuroscientist, the most important question is that of informed consent. In one experiment, for example, carried out with people who wanted to stop smoking, simply releasing the smell of cigarettes followed by that of rotten eggs while the subjects slept was enough to make them no longer want to smoke. Moran Cerf finds these results disquieting: what if the same method were used to suppress or modify people’s memories? It could completely transform their beliefs and behavior, for better or for worse. “Everyone aspires to changing their behavior. But if you’re sleeping, you’re not really there. It’s not ethical.” He concludes by warning us of the risk of letting Alexa, Amazon’s intelligent personal assistant, control our sleep.
The price of freedom
Most of us make bad decisions on a daily basis: we eat too much, don’t exercise enough, don’t sleep enough… Studies show that 100 years ago, only 10% of deaths were linked to a bad decision, while today the figure is over 40%. Just think of car accidents caused by texting or alcohol. Have we become more stupid? For Dan Ariely, there are simply more temptations today: “We have created a world of temptations all around us.”
In the Odyssey, Ulysses resists the Sirens’ temptation by having himself tied to the mast and plugging his crew’s ears with wax. Dan Ariely tries to help us resist with the best of technology: an app given to patients who have just undergone heart surgery. It features a turtle that wakes up full of energy in the morning but slowly retreats into its shell and withers away if its “owner” does not exercise or forgets to take their medication. Worse: the app goes so far as to delete the user’s most frequently used apps: Facebook, Instagram, Twitter… One could say it limits people’s freedom for their own good. “We eliminate human freedom from the equation because it helps them behave better.”
On the same principle but on a larger scale, the Chinese government has set up a social scoring system that aims to improve civic behavior by reinstating reputation in an anonymous society. It takes into account every aspect of daily life, from using the crosswalk to donating to charity. For Sandra Matz the system is somewhat extreme, but it could be positive if it helps reduce social inequality.
Lastly, Moran, Sandra and Dan agree on one point: freedom can also be bad for people. “We have created a level playing field that is killing us. It’s only fair to help people fight this war.”