Regional Bishop Ernst-Wilhelm Gohl at the AI theme day in Heilbronn. Image: Gottfried Stoppel
In the last five years, artificial intelligence (AI) has increasingly become a mass phenomenon. Laboratories and institutes around the world are researching it, and breakthroughs are reported again and again. But it was only with the freely available ChatGPT software that AI truly arrived in our everyday lives.
The consequences of this technological leap are currently difficult to gauge. Does the gain in freedom outweigh the loss of freedom? How beneficial can AI become for our everyday lives? And how dangerous would it be if, at some point, it were no longer we who control AI but AI that controls us?
The global appeal published on March 22 last year by numerous AI experts on the Future of Life Institute website sounds even more dramatic (see note 1). Systems like ChatGPT, the appeal says, have now become “too powerful and too dangerous,” and human-level AI poses fundamental risks to society and humanity. The appeal therefore calls for a moratorium on AI research of at least six months.
It is therefore necessary to put these threat scenarios into perspective. Where are the opportunities of AI, and where are the dangers? Today’s theme day offers a competent forum for all of these questions – thanks also to you, dear Prof. Lasi, and to the Steinbeis Institute. It is good that we as a church are addressing these questions together. For that I would like to thank the state synod and, representing everyone involved in the preparation, you, dear Sabine Foth.
AI raises new ethical questions. The challenge for all of us is not to condemn AI across the board when it comes to ethical questions, but rather to bring these questions into our internal church and social debates in an informed and differentiated manner in the light of biblical traditions.
In this context, I am very relieved that with the AI Act of 2024 the EU intends to create a binding legal framework for the use of AI. The core of the AI Act is the distinction between three risk classes for the use of artificial intelligence.
The highest risk comes from AI applications that massively restrict people’s freedoms or even threaten their lives.
These applications will be banned in the EU in the future. In the EU we do not want Chinese conditions and reject, for example, social scoring (see note 2).
Applications of AI in the latest weapons technology are also dramatic.
Russia and Ukraine are on the verge of deploying drones that use AI to decide completely autonomously which targets to destroy.
As a church, we welcome the ban on these applications. This is about defending fundamental values that derive from human dignity and the protection of the weak.
In contrast to this risk class 1, the EU defines a class of AI applications that are considered harmless and whose development is even to be promoted economically. Especially with a view to possible uses in diakonia, it must be clarified where we as a church can promote such use. I am thinking of support for medical diagnostics – only last week I spoke to a radiologist who finds AI very helpful in evaluating images – or of practical help for people who need support.
In my opinion, the real challenge lies in the applications the EU defines as risk class 2. The benefits and risks of these AI applications cannot be assessed unambiguously; their assessment must be negotiated in public discourse, and the church must make its contribution to this. That is why exchange forums like this one are important, and not the only reason why I am looking forward to today’s lectures and discussions. As an impulse for this day, I would like to briefly outline three criteria from the churches’ perspective for weighing the risks and benefits of AI.
First criterion: justice
The theologian and former chairman of the German Ethics Council, Peter Dabrock, has pointed out that it is crucial to ask the right questions in the debates about AI. By this he means that many ethically clear-cut positions receive a great deal of publicity, while the really relevant problems tend to remain hidden.
But before we can talk about solutions to ethical conflicts, elementary challenges of “digital justice” must first be solved. The focus is therefore on questions of participation. In the diaconal sense, equitable participation depends crucially on the resources available. This also includes critically analyzing the market power of the large tech companies and pointing out their role in technical innovations. The technological leaps these companies help to trigger are not primarily driven by altruistic motives but by the pursuit of economic success. That is not reprehensible, but this motivation must be kept in mind. “Digital justice” therefore also includes the question: which business models do we as churches endorse when we use the digital services of these companies?
Second criterion: education
Participatory justice encompasses more than access to resources; access to education is also crucial. Both are closely related to the Christian view of humanity: all people should be empowered to make self-determined decisions about their lives. This includes informational self-determination and the ability to make qualified judgments about the use of AI.
If you look back at the beginnings of Protestant educational work, you can see a basic attitude strongly shaped by the biblical view of humanity. The human being is the image of God and therefore has an inalienable dignity – a dignity that does not depend on gender, social status or any other criterion. Education must therefore ensure each individual’s capacity for judgment. Not least for this reason, Protestant educational work is a fundamental alternative to a society of the uninformed, a society without an ethical compass that is particularly easily swayed by “fake news” and conspiracy stories. Protestant educational work is more important than ever because it strengthens the individual’s ability to judge.
Third criterion: community
“Digitalization is changing our world. It opens up new personal and social scope that makes special gains in freedom possible.” (see note 3)
The current EKD memorandum on digitalization begins with this statement. It was published three years ago under the programmatic title “Freedom Digital”. Using the Ten Commandments, it reflects not only on the opportunities of digitalization but also on its limits. The aim, according to the memorandum, is to “shape this change in a humane and appropriate manner”. How can AI be used so that its use remains humane?
That is why I want to place this third criterion alongside justice and education. It has become increasingly clear to me in conversations over the last few months: AI used in the church primarily offers people opportunities for contact and communication. More precisely, many offerings are aimed at community. I am thinking of care robots, tools for preparing and conducting services, blessing robots and pastoral care services.
Church offerings are fundamentally aimed at exchange and contact between church members and are derived from communication with God.
Before using AI, we as a church should ask: do these offerings really open up new forms of community, or do they merely serve to reduce costs? At the same time, the alternatives must always be kept in mind: if the alternative to an AI-supported offering is that the church can no longer provide it at all, then there is probably something to be said for the AI-supported solution.
There is a real dilemma in pastoral care. I can imagine AI-supported contact and informational conversations: those in need of help often need initial contact quickly, and this could relieve the burden on pastors and professionalize the church’s help at this point. But I cannot imagine pastoral care between human beings being replaced by AI, either now or in the future.
Pastoral care is the mother tongue of the church. Christian counseling is essentially determined by the relationship between two people. In pastoral care, the person who listens, encourages and comforts is not like a doctor who makes a diagnosis and then initiates therapy. Every pastor knows that every path in life is unique.
In contrast to the doctor, the pastor is a witness of solidarity. This witnessing means being there in the suffering of others: going along with them, enduring with them, lamenting with them, remaining silent with them.
This witness is empathetic and, above all, deeply human. And it is felt when the witness shows himself to be a human being. Ultimately, here as in other areas of church work, the question will be whether human community – this central Christian value – can be strengthened.
Remarks:
2. Applied to people, this means that behavior is evaluated through the collection and analysis of data: points are awarded for desirable behavior, and points are deducted for undesirable behavior.
2024-04-10 13:52:05