Professor Reinemann, what is your current research project on measuring opinion-forming power and concentration control on the internet about?
Prof Reinemann: The background to the project is regulatory issues in the German media sector. Up to now, the focus has primarily been on television due to its assumed power of opinion. However, with the advent of the internet and social media, the range of players, sources and channels that influence opinion formation has expanded considerably.
In the past, the traditional media had a virtual monopoly, especially in the journalistic field. This monopoly is now being broken up, which has both positive and problematic aspects. On the one hand, we can speak of a democratisation of public communication. On the other hand, there are challenges such as disinformation, misinformation and hate speech. The question we are asking ourselves as part of the project is how and by whom power of opinion is exercised in these new channels. We want to understand how many people are reached by media they trust and how these media ultimately affect people.
How can an abstract concept such as opinion-forming power be quantified at all?
Prof Reinemann: From a social science perspective, there are various approaches. One simple indicator is the reach of a medium. Another important factor is the trust a medium enjoys, especially if many people use it as their sole source of information. In practice, however, often only market share is considered; for television channels, for example, it is based on viewing time. This indicator is susceptible to distortion by extremely intensive users. The most valid approach, of course, is to measure the actual impact of individual media, but that is quite time-consuming if you want to do it continuously.
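The distortion Reinemann describes can be made concrete with a toy calculation (all channel names and numbers here are invented for illustration): a single extremely intensive viewer can give a channel a dominant viewing-time market share even though its reach, the number of distinct people it touches, is tiny.

```python
# Hypothetical data: minutes watched per (user, channel).
# "Channel B" is watched by one very heavy viewer only.
viewing = {
    "Channel A": {"user1": 30, "user2": 25, "user3": 20},
    "Channel B": {"user4": 400},
}

# Total viewing time across all channels and users.
total_minutes = sum(sum(users.values()) for users in viewing.values())

for channel, users in viewing.items():
    share = sum(users.values()) / total_minutes  # market share by viewing time
    reach = len(users)                           # number of distinct users
    print(f"{channel}: share={share:.0%}, reach={reach}")

# → Channel A: share=16%, reach=3
# → Channel B: share=84%, reach=1
```

Measured by viewing time, Channel B looks far more powerful; measured by reach, Channel A touches three times as many people. This is why the project combines several indicators rather than relying on market share alone.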
Instead of radio and television, the younger generation now uses YouTube, Instagram and TikTok to form their opinions.
So what approach are you taking?
Prof Reinemann: We are proposing a continuous monitoring of opinion power that would have a modular structure and a multi-method design. The first goal is to understand the user's perspective. We ask: Which media do you use and which sources do you trust? It is important to be open and also to record the use of obscure or lesser-known sources. The quality of the source is of secondary importance. Our proposal is to use a combination of different questions and methods to find out what content is actually being used.
We rely on a combination of traditional surveys and tracking of online information behaviour. Building on this, further sub-studies could investigate media content and the actual effects of media, as well as, for example, the resonance of media in social media or their effects on politics and other media. Our approach does not necessarily have to lead directly to regulatory intervention, but can also initially serve to improve and strengthen social transparency about the media used.
The sheer volume of content on YouTube, Instagram and TikTok makes it difficult to decide which channels deserve trust. What is the state of media literacy in Germany?
Prof Reinemann: There is definitely room for improvement. For example, there are proposals to introduce media literacy as a school subject, but schools are often not in a position to implement this in the way that would be desirable. The majority of teachers have simply not been given the necessary tools in their training. Such programmes are then often implemented by external providers or there are initiatives that are more symbolic in nature, such as the 'media driving licence' in Bavaria, the effectiveness of which is not evaluated. However, we must not be under the illusion that media literacy solves all problems. It's not just about technical skills, but also about building trust and democratic values. Even extremists are often very media-savvy. In a study on the question of how young people come into contact with extremism, we therefore emphasised the need to link media literacy and democracy education.
To what extent are disinformation campaigns already spreading online?
Prof Reinemann: If you look at the last few years, events such as the coronavirus pandemic, the war in Ukraine and the Israel-Gaza conflict stand out. You can see that both state actors from outside Germany and actors within the country are trying to spread their perspective on conflicts not only with well-supported information, but also through polarising disinformation. This is often done by spreading fears or discrediting traditional media and political actors through misinformation. Another problem is that such disinformation often circulates within certain groups and remains there as misinformation. New challenges arise with the development of AI. Only recently, there have been cases of AI-generated voices of news anchors appearing in fake videos apologising for their reporting. Such technologies are becoming increasingly sophisticated and this makes it very difficult to effectively combat such disinformation campaigns.
Is it necessarily the case that younger people are more susceptible to misinformation because they spend more time on social media?
Prof Reinemann: Studies, including those from other countries, suggest that susceptibility to disinformation is not necessarily a question of age, but that the problem lies in the way people use the internet. Young people who are not interested in political issues may be less exposed to the flood of disinformation than those who have a strong interest and already have preconceived opinions. In combination with the platforms' algorithms, this can lead to the formation of information bubbles in which one's own world view is confirmed. Older target groups in particular, who have learnt to use the internet in recent years, especially during the coronavirus pandemic, are also susceptible.
Anyone can become an opinion maker on the Internet. What does this mean for your study and the associated diffusion of opinion power?
Prof Reinemann: In this respect, there are several factors that we have to take into account. Firstly, there are non-journalistic players, so-called alternative media players. Then there are the platforms themselves, which do not produce their own content but use their algorithms to select and present content in a controlled manner. At a certain point in the communication process, they therefore also exercise power of opinion. An additional problem is that we often do not know exactly what is happening behind the scenes. So far, there is a lack of data to show exactly how these algorithms work and which content is ultimately selected and presented for the individual user.
How transparent are the platforms in this respect?
Prof Reinemann: So far, they have not been transparent at all. For a long time, researchers criticised platforms such as Meta or Google for not providing data. Although there were initiatives by large platforms that involved scientists, there were enormous problems with the willingness to actually release relevant data. However, with the EU's Digital Services Act, we now have the prospect of gaining access to relevant data. Scientists can approach the EU Commission and specify what kind of data they need. It will be interesting to see how this develops in research practice and how co-operative the companies will be. There are methods for gaining an external impression, for example by examining the results of certain search queries or the content served to artificial profiles in social networks. But internal access to the data is essential for a well-founded analysis.
You mentioned earlier that your project takes a regulatory approach. What is your recommendation regarding the regulation of opinion power?
Prof Reinemann: We are working on a concept that differs from the current approach. The current control of concentration in the media sector is too simplistic and mainly focused on television, which no longer corresponds to today's media reality, especially among younger generations. We advocate looking at the entire media spectrum and analysing where power of opinion arises. As a society, it is important to know where the powerful media are. Our proposal is therefore to move from a rigid regulation that allows intervention above a certain threshold of concentrated power of opinion to a pure monitoring system that creates social transparency. Although regulators show understanding, historical experience has made them very reluctant to intervene. A middle way is probably needed, but the current situation definitely needs to change.
Carsten Reinemann is a professor of communication science with a focus on political communication. As part of the project funded by the Bavarian Research Institute for Digital Transformation (bidt), he is focussing on the question of what contemporary measurement and regulation of opinion-forming power on the internet could look like.