Digital Religion(s): Der Blog

Let’s discuss Law, Religion and Digital Technologies!

27 April 2023 | Cristina Frei

You won’t often meet legal scholars whose research interests include both religion and digital technologies. During my research stay at the Università degli Studi di Milano at the Dipartimento di Diritto Pubblico Italiano e Sovranazionale, I got the opportunity to discuss my research with Prof. Luca Pietro Vanoni, Associate Professor of comparative public law, and Dr. Giada Ragone, Researcher in constitutional law. They kindly agreed to answer a few questions about their papers published in the issue titled „The spatial ramifications of religion: new and traditional legal challenges“ of the electronic journal „Stato, Chiese e pluralismo confessionale“.

Università degli Studi di Milano

Cristina Frei (CF): Luca Vanoni, you reflected on the dematerialisation of the traditional public sphere and asked about new challenges for religious freedom. Could you please describe what you understand by the „dematerialisation of the traditional public sphere“ in the context of the digital transformation?

Luca Pietro Vanoni (LV): I think it is a very big issue. What is the „public space“ and the „public sphere“? In American literature, such a question is pivotal because they have the Establishment Clause and, for that reason, they had to define the spaces in which it is allowed to show religious habits, ceremonies, symbols … So, in that context, it is fundamental to pinpoint the spaces in which you can host religion and the places where you cannot. There is an interesting book written by Richard Neuhaus in 1984 that explains exactly this problem. He called it „The Naked Public Square“. He says that the public square cannot be naked, because otherwise we do not have a public sphere anymore.

The really interesting thing is that during the 20th century, some scholars tried to define and divide the public sphere, or the public place, into at least three different categories. Silvio Ferrari, who is a very important scholar in Italian academia, says that we have a common space, that is, the physical space where we meet (streets, squares, and so on). Then we have the political space, that is, the space in which the public debate goes on. Lastly, we have the institutional space, which is the space of the courts, the parliaments and, of course, of schools. This is a really fascinating idea, because when you divide the public space into functional categories, you can use different normative rules in order to address the problems that religious symbols, or the presence of religion, bring up in our pluralistic society.

The problem is that this definition starts from the division of physical, material space. After the technological revolution, we quickly discovered that, for example, the political space is not a physical space anymore. You can see how often politicians use social networks such as Twitter and Facebook to connect with the people who are supposed to vote for them. So, how can we use rules to regulate a space that is not material anymore? As we saw in the Trump case, it is Twitter, Facebook and the other providers, as private companies, that decide who must be excluded from the dematerialised political space. We have to think in other categories in order to reflect on new normative issues: the traditional definition of public space is not useful anymore.

CF: You pointed out that scholars of law and religion need to address the new challenges that the digital revolution raises through this dematerialisation. Could you specify a few of these new challenges for religious freedom?

LV: There are a lot of problematic issues concerning the relationship between religion and new technologies. When dealing with them, I think you have to start by looking at the different technological tools we are all using and the different problems they raise. So, for example, religion in social networks is an issue mainly concerned with hate speech and non-discrimination. Artificial intelligence entails the challenge of translating human values, including the idea of religious freedom, into the vocabulary of algorithms, et cetera.

A very interesting example, which came up in Italy during the pandemic, concerns students who have to attend class online. We have a huge problem (and a rich case law) in our system with the presence of religious symbols in classrooms in public schools. But what about religious symbols in the rooms where students and teachers sit for online calls? In one case reported by newspapers, a professor asked a student, during a digital class, to take down the crucifix he had in his bedroom. This scenario brings us to new problems. When you are teaching a class online, are the screens a public space? Are they like the public space of a classroom? So, here we have something that we can define as a pandemic public space, and we have to deal with it.

CF: Giada Ragone, you wrote about artificial intelligence and new scenarios of religious discrimination in virtual and real space. Which issues in relation to the application of traditional instruments of current discrimination law to algorithmic bias would you highlight?

Giada Ragone (GR): Thank you very much for your question. I do agree with those who think that non-discrimination law already has the legal categories for framing the phenomenon of AI discrimination. But, at the same time, I think we also need specific rules to face this kind of discrimination, for at least two reasons.

The first one is that the phenomenon of AI discrimination is not tied to specific contexts. We have a lot of sources of non-discrimination law, often related to specific contexts, such as discrimination in the workplace. For example, perhaps the most important European Union directive concerning discrimination (Directive 2000/78/EC) is focused on discrimination in the labour market. But the use of AI is not tied to a specific context (work, schools, institutional communication, etc.). You can use the same kind of AI in different contexts and, so, I think that if you continue to use legal sources that refer to specific contexts, there will always be a grey area that is not covered by non-discrimination rules.

The second element is that, especially if we consider unintentional ways in which AI could provoke discrimination, there are open issues about liability and responsibility. If human beings directly use AI to discriminate against people, it could be easier to apply traditional tools of non-discrimination law to prohibit or sanction this kind of behaviour. The problem arises not in this case, but when you have a discriminatory effect due to, for example, an AI driven by poor data, a poorly functioning AI, or other aspects that were not intentionally put in place by human beings. In this case, you have to provide legal instruments capable of allocating responsibility and liability to the proper subject. Who is the human being legally responsible for an AI’s unintentional discrimination? Its designer? Its end user?

CF: You note that there is no case law on AI and religious bias and that mainly other grounds of discrimination are paid attention to in literature. Still, you conclude that jurists should also focus on AI’s potentially discriminatory effects on the grounds of religion. Could you please explain why you think this is important?

GR: I think that studies on AI religious discrimination should be extended and deepened for two main reasons. The first one is that religion, religious habits and religious events really characterise human beings’ lives, and the same could be said about the use of AI. So, I think that, statistically, there will be many cases in which AI provokes discrimination on religious grounds in different ways. We know that AI could have an impact on traditional ways of living religion, of being religious, but we also know that AI could have an impact on new ways of living religion, that is, through access to new technologies. Therefore, I think it is very likely that the use of these technologies will have discriminatory effects in the field of religious rights.

The second aspect is that religion has to do with the fundamental rights and liberties of people and with human beings’ search for the meaning of life. That is why I think it should not be left as a secondary ground of discrimination, something to be explored only after all the other grounds have been explored. I think that it should have a particular relevance because of the importance of this dimension for human beings’ existence.

CF: Thank you very much for taking the time to respond to my questions!


Blog post by Cristina Frei, UFSP Project 7 Religion and the Digital Turn: New Legal Challenges in Cyberspace


Filed under: Project presentation