Digital Ethics — the Conscience of Artificial Intelligence

When the astronaut Dave tries to re-enter the spaceship in Stanley Kubrick’s “2001: A Space Odyssey” (1968), the on-board computer HAL refuses to follow orders: “I’m sorry, Dave. I’m afraid I can’t do that.”

The narrative is science fiction. Fifty years later, fiction has become reality. Today’s virtual assistants Alexa (Amazon), Cortana (Microsoft) and Siri (Apple) still apologize profusely when they fail to understand and execute a voice command, and their tone suggests this is a situation they would like to change sooner rather than later. Half a century after “2001: A Space Odyssey”, Amazon developed software intended to rank job applications with the help of AI. Unfortunately, the artificial intelligence detected that women are currently a minority among the company’s several hundred thousand employees and rated female attributes as a disadvantage.

The Digital World Needs Ethics

The University of Applied Sciences in Business Administration Zurich (HWZ) has published a whitepaper on digital ethics. “Since the Greek philosopher Aristotle, ethics has been understood as the discipline of philosophy that studies morals: it asks what is right and wrong, and what kind of behavior may be considered appropriate under different circumstances.” Traffic information in Google Maps relies on us sharing our location data with Google. As far as I am concerned, this is not a problem as long as data security is ensured. My grown-up son finds it unacceptable because it makes him feel like he is under surveillance. Voilà! What is acceptable and what is not depends on a person’s morals. And morals, according to Thomas Beschorner, Director of the Institute for Business Ethics at the University of St. Gallen, “have always been influenced by technological developments”.

We Are Technological Development

In the Swiss “Handelszeitung”, Cornelia Diethelm puts it like this: “If you use data as a strategic resource, you should ask yourself what you are actually going to do with the data – and what you are not going to do.” It is all about keeping the customer in focus when using data: “This builds trust and increases the acceptance of digital innovations”. To a certain extent, a data strategy is thus self-regulating. Lorenzo Mutti, UX Director at Unic, has a very clear plan for this: “Answering ethical questions requires caution, reflection and contemplation. This can be promoted through diversity in teams and an awareness of ethical issues.” On the technical side, he lists respecting people’s privacy, preventing leaks of sensitive data, and defining and implementing fairness. While artificial intelligence appears neutral on the surface, fairness is usually violated by the humans behind it.

In user experience design, for instance, there is more than just good and bad design. There is also evil design: “dark patterns” are user interfaces deliberately designed to trick users into actions that may not be in their best interests. Lufthansa subsidiary Swiss even made headlines with its rather special cookie consent banner. The magazine “Die Republik” called the approach “a kind of digital sleight of hand. Don’t look here, look over there!” It is a game that Facebook, Instagram and TikTok mastered long ago: their user interfaces are designed to rope us in and keep us there for as long as possible. In her book “Digitale Ethik” (Digital Ethics), Sarah Spiekermann develops the antithesis to this: “ethics by design”. Technologies are to be inherently ethical, right from the start.

Interview with Cornelia Diethelm

Cornelia Diethelm founded the Centre for Digital Responsibility (CDR), an independent think tank for digital ethics in the German-speaking DACH region (Germany, Austria and Switzerland). We wanted to know more about it.

You founded a think tank for digital ethics. What does this think tank do? 

Cornelia Diethelm: For one thing, the think tank supports companies and organizations in the DACH region, for example with our monthly “Digitale Ethik” trend report, our annual “Shift” conference and related projects, presentations, expertise and consulting. In addition, I have identified a definite desire in our society to reflect on the possibilities and limitations of new technologies and their effect on our everyday lives. That is why I am delighted to be in charge of a new course of studies and a seminar on digital ethics at the HWZ. I am sure that training on this topic will become increasingly important in the digital age in the coming years. 

Ethics is an age-old subject. Why is its importance increasing in our day and age?

At first, we were amazed by everything that new technologies such as AI made possible. But not everything that’s possible is also rational and desirable. We are starting to realize that and are asking ourselves: What do we want, what don’t we want and where should we draw the line? This is about the responsible management of customer and employee data and making conscious decisions about where we rely on humans and where we make use of machines, among other things.

How much awareness of this topic is there within companies?

This varies tremendously and does not correlate with company size. The awareness within companies depends on the individuals. In general, I have been positively surprised by the interest people take in ethical questions, including board members, by the way! There are more and more supervisory and management boards looking into this issue. At the end of the day, it’s the result that counts, of course. Does a company succeed in abiding by principles such as the protection of privacy, non-discrimination or fairness, even during product development? Is there transparent communication regarding the use of new technologies? That is what it’s about.

What are the critical factors involved? How do you embed an ethical approach within a company?

The main thing is for companies to maintain credibility. They should also beware of “ethics washing”: instead of building trust, which is the basis for innovations based on new technologies, ethics washing destroys it. It is only natural for that to lead to regulation. Of course, voluntary commitments take effect much faster in the case of unethical business practices, and in many areas they are much more efficient. Companies can tackle the topic credibly by issuing ethics guidelines and implementing them with swift governance. Credible decision-makers and well-trained employees are the keys to success. And you need a culture of discourse, so that sensitive topics can be discussed within the company. I am personally convinced that external ethics boards are an asset to any company, because they are not subject to internal logic and can contribute additional knowledge.

The digital world creates new structures and use cases that require ethical assessment.

More On This Topic In Our Dossier On Artificial Intelligence