Flashback: Orientation in the digital world – conference on Google, TikTok & Co

At the beginning of March, the Evangelische Akademie Tutzing and the Open Search Foundation organised a conference on orientation in the digital world. Participants discussed the societal, digital-policy and ethical aspects of search engines, social media and artificial intelligence with bestselling authors and experts from all over Germany.

“Trust is the basis for successful cooperation”

The philosopher and author Dr Nicolas Dierks opened the conference with a lecture on “Ethics as orientation in digital change”. He defined ethics as a “dialogue about what we can rightly expect from each other” and then took the audience on a philosophical journey. His presentation focussed on the values of trust, responsibility and transparency, with Dierks using examples to illustrate that we all have an ethical compass within us.

He invited the audience to conduct their own thought experiments (based on Amartya Sen’s flute example from “The Idea of Justice”). One insight: ethics offers no ready-made solutions; what matters is dialogue and the struggle over values. According to Dierks, the task of digital ethics is to ask: “Is this a good thing? Is this the way we want it?”

The philosopher and author Dr Nicolas Dierks giving a lecture at the conference “Orientation in the digital world. Where are Google, TikTok & Co taking us?”

“The task of digital ethics is to ask: ‘Is this a good thing?’” – Dr Nicolas Dierks, philosopher and author

For Dierks, responsibility is a key value for successful digitalisation. He defined responsibility as a “moral obligation to justify and share one’s own actions and decisions with others in dialogue”. Only when it is clear who is responsible, for what and to whom does real responsibility arise, he stated with regard to digital platforms.

So what does it take for people to trust you? The answer is actually quite simple, explained Dierks: “Behave in a trustworthy manner.”

The lecture by Nicolas Dierks is available as a recording on YouTube.

“We need an upgrade for a fair digital society”

Lajla Fetic dispelled the myths surrounding artificial intelligence: AI is magic? Algorithms are neutral? European AI models are better? The expert in digital policy and the social impact of AI and algorithm-based technologies used concrete examples to show that algorithms and AI are none of these things: they are made by humans and rest on human decisions.

Using the algorithm-based fight against poverty in Jordan, she impressively demonstrated why “the opportunities and risks of the digital society are unevenly distributed”. The World Bank’s automated cash-transfer programme “Takaful”, which is used there, poses a whole host of problems, she explained. For example, a poor family was unable to apply for support because they earned less from their work than they needed to live on: the application form simply did not allow a negative income to be entered. (Background information on this example can be found in the Human Rights Watch article.)

Algorithmic biases, poor results, lack of sustainability … Large language models currently have many weaknesses. Lajla Fetic presented some of them, referring to research by Timnit Gebru and Emily Bender, among others. In their article “On the Dangers of Stochastic Parrots”, the renowned ethics researchers warned of the dangers of large language models as early as 2021.

In general, it is hard work to make AI models transparent, comprehensible and explainable, explained Lajla Fetic. This is also one of the reasons why European AI applications are not necessarily better than their US counterparts.

So how can we succeed in creating a fair digital society? Lajla Fetic named some “recipes” for making algorithm-based applications fairer, for example:

  • Recognise that the digital transformation is also a social transformation, and understand algorithms and AI as social tools rather than neutral elements
  • Consider in advance whether an application based on algorithms or artificial intelligence is actually necessary and useful for the intended purpose
  • Develop not for people but with people, i.e. involve those affected in the development of applications from the outset – this could avoid fundamental errors (such as the case from Jordan mentioned above)

Lajla Fetic concludes that responsibility must be demanded from those responsible, because “there are risks that we need to address. But it is primarily those who benefit from the systems who must take responsibility.”


“We need an upgrade for a fair digital society.” – Lajla Fetic

Workshops on social media and “fake news”, legal aspects of digitalisation, “digital literacy” and artificial intelligence

After brief introductions to the workshop topics, the workshops began, in which the conference visitors were able to deepen their knowledge and ask questions in small groups.


Prof Dr Melanie Platz uses “Bubble Sort” to show how algorithms work and why they are difficult to understand.

In the “Education and Literacy” workshop, Prof Dr Melanie Platz from Saarland University used practical exercises to show how algorithmic sorting works. In a very practical and playful way, “Bubble Sort” made it clear that algorithms are made by humans and that it is usually not clear what criteria were used for sorting.
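The sorting rule behind “Bubble Sort” can be sketched in a few lines of Python (an illustration for readers, not code from the workshop): neighbouring items are compared and swapped whenever they are out of order, so the largest remaining item “bubbles” to the end of the list on each pass.

```python
def bubble_sort(items):
    """Return a sorted copy of items using the Bubble Sort rule."""
    items = list(items)  # work on a copy, leave the input untouched
    # After each outer pass, the largest remaining item has bubbled
    # to position n, so the next pass can stop one element earlier.
    for n in range(len(items) - 1, 0, -1):
        for i in range(n):
            if items[i] > items[i + 1]:
                # Swap adjacent items that are out of order
                items[i], items[i + 1] = items[i + 1], items[i]
    return items

print(bubble_sort([5, 1, 4, 2, 8]))  # → [1, 2, 4, 5, 8]
```

Note that the sorted output alone does not reveal which comparison criterion was used – exactly the opacity the workshop pointed out.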


Algorithmic sorting in practice

In the workshop led by Prof. Dr Alexander Decker, media professor at Ingolstadt University of Applied Sciences, the participants examined why we are all easily taken in by disinformation and so-called fake news, how we can recognise dubious news – and what we can “learn” from Donald Trump when it comes to disinformation.


Prof Dr Alexander Decker: “We can all fall for fake news and disinformation.”

Prof Dr Matthias Wendland, Professor of Law at the University of Oldenburg, discussed the legal framework such as the General Data Protection Regulation and the European legal situation as well as the importance of regulation with the workshop participants.


Law, legislation and digitalisation in practice: Prof. Dr. Matthias Wendland, LL.M. (Harvard), holder of the Chair of Information Law at the University of Oldenburg, consultant, speaker; research focus on ethics and regulation of AI, Oldenburg

Lajla Fetic, who had already given a presentation on the social impact of AI (see above), opened the doors wide for questions and shared her wealth of knowledge and experience with the participants. True to the motto: there are no stupid questions, especially not when it comes to finding your way around such complex topics as artificial intelligence and algorithms.


Deep dive into artificial intelligence – room for questions

Debugging Digitalisation!

A team from the “Common Grounds Forum” stirred up the audience in the “Debugging digitalisation” workshop. In 2023, students Daphne Auer, Emma Beuschel, Sébastien Elbracht, Daniel Mendes Jenner and Ludwig Lorenz worked with other young people to develop positions on digital policy as part of a participation project organised by the Gesellschaft für Informatik e.V. and the BMBF. They now presented these for discussion.

The Common Grounds Forum at the conference “Orientation in the digital world – Where are Google, TikTok & Co taking us?” at the Evangelische Akademie Tutzing

Students from the “Common Grounds Forum” project organised the “Debugging digitalisation” workshop

After a brief introduction, it was straight into “digital bingo” and lively discussions across all generations. The participants then learnt about the Common Grounds Forum’s positions, including demands on “monopolies and big tech”, “participation and opportunities”, “democracy and media monopolies” and “open source”.

Their demands included, for example:

  • “We need a system in which citizens don’t have to worry about protecting their digital rights themselves.”
  • “People’s self-determination should be at the heart of digitalisation.”
  • “Digitalisation should help to protect our resources instead of wasting them.”
  • “Education across all generations is the key to an enlightened life in the digital world.”

In the subsequent ‘reality check’, the young team actively involved the conference participants in small groups: Where are there gaps in the demands? Where do the predominantly older conference participants feel that the demands perhaps miss the mark? What can be done to put them into practice?

A successful exchange that all those present, across all age groups, found very fruitful.

“Search engines shape our world view”

Christine Plote’s lecture dealt with the ethical aspects of internet search: how search engines shape our everyday lives and our information behaviour, and what effects this has on our society.

In her presentation, the chairwoman of the Open Search Foundation and moderator of the osf Ethics working group showed, among other things,

  • that search engines only show us a selective section of reality;
  • the effects of the overwhelming dominance of a few providers;
  • how search engines reproduce and even amplify social prejudices;
  • why the intensive collection of user information and comprehensive tracking by search engine providers such as Google undermines our (not only) digital privacy;
  • how, for the same search queries, different search results are displayed depending on context and location, which means that we do not have a common view of reality;
  • that the text snippets on the search results page do not necessarily represent the actual intention of a website;
  • how artificial intelligence affects internet searches, e.g. when AI-generated images are ranked first instead of real historical images when searching for historical events.

Christine Plote warned: “Last but not least, this also jeopardises our democratic structures. Think of the upcoming elections, for example” and added: “Searching on the internet is only free at first glance. In the end, we pay a high price for it.”

Finally, she gave specific tips for better searching on the Internet. For example, she showed some “search operators” that can be used to get results faster. For example, you can search for an exact sequence of words by placing several search terms in inverted commas, or search within a specific website by entering the prefix “site:” together with a domain, followed by the search query.
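As an illustration of these two operators (with example.org standing in as a hypothetical domain), such queries might look like this:

```
"open search"                     exact phrase: results must contain these words in this order
site:example.org search engines   restricts the results to pages on example.org
```

Most major search engines support these common operators, though the exact syntax can vary between providers.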

The most important tip, however, is to “always scrutinise the search results critically and scroll further down, because that’s where the more suitable results are often hidden.”


“Search engines are neither neutral nor fair.” – Christine Plote, Director of the Open Search Foundation and co-moderator of the osf Ethics Working Group

In the subsequent “workshop”, conference participants were able to ask the Open Search Foundation team specific questions – for example, about safe surfing on the Internet, browser settings, changing search engines, social media account settings – and received concrete assistance on their devices.

“The diversity of information in our society is suffering. We are running out of time”

Martin Andree showed very impressively that the diversity of information in our society is threatened by the large digital platforms. The media scientist and bestselling author (“Big Tech muss weg”) was the first to measure media concentration on the internet and prove that almost all media use is concentrated on a handful of digital platforms. Traditional, analogue media no longer play a role because “the fair and free market has been completely abolished in the digital media sector”, he concludes.

The presentation traced how this situation has arisen over recent decades: through network effects, closed standards, the suppression of so-called outlinks, unpaid “user-generated content”, platforms’ refusal to take responsibility for content, and monopoly abuse.


“The fair and free market has been completely abolished in the digital media sector” – media scientist and author Dr Martin Andree

Andree concluded his stirring presentation with the encouraging message “We can do something” and named five concrete measures that could be implemented relatively easily.

  • Enable outlinks, so that so-called traffic can flow away from the platforms to the original content. This would democratise traffic.
  • Open standards, similar to the e-mail market, where you can contact other people regardless of their e-mail provider. This would restore market diversity.
  • Separate transmission channel and content: no one may own both the channel and the content at the same time.
  • Treat social media platforms as media companies, analogous to analogue media such as television, where a market share of more than 30% is not permitted in Germany.
  • No monetisation of criminal content: platforms continue to monetise criminal content (Martin Andree cited Instagram as an example, which still earns advertising money with the accounts of the NPD and its successor organisation “Neue Heimat”). As soon as someone earns advertising money with content, they make it their own and must be held accountable.

Measures and projects for media education and the development of alternatives to the monopoly platforms would achieve nothing, because “as long as we don’t approach the monopolies, we have no chance at all.”

The lecture by Martin Andree is available as a recording on YouTube (German).
Further information on Martin Andree’s research on the monopolisation of platforms and the threat to traditional media can be found on the following English language websites: bigtechmustgo, atlasofthedigitalworld

“Between elective biography and self-optimisation, exhaustion sets in.”

On Sunday morning, Dr Sabrina Wilkenshof, pastor and author (“Wie man den Staub von der Hoffnung wischt” [How to wipe the dust off hope]), gave an introduction to a completely different kind of orientation in the digital world: “Coffee, contingency, church” was how she described the search for meaning in social media. There are meaningful influencers in all areas of Instagram: order, time management, motherhood, sport and nutrition as well as in the church. In principle, they all cater to people’s longing for a happy life. According to Wilkenshof, accounts that succeed in addressing the balancing act between “elective biography and self-optimisation” are met with great interest, especially when they find the right balance between intimacy, closeness and distance.

Wilkenshof showed how “meaningful influencers” on Instagram and TikTok use simple messages such as “tidy room – tidy soul” and beautiful styling to tap into people’s desires and, in some cases, turn them into extremely lucrative business models.

However, she also introduced pastors who see Instagram as an extended church and provide serious and professional pastoral care there. This certainly has advantages: “Religious communication on Instagram starts directly at the point of life experience. What a sermon has to painstakingly ‘work out’, namely the concrete lifeworld of the worshipping community, is easily tangible here.” In this way, communication on Instagram can take place at eye level.

However, Sabrina Wilkenshof did not spare her criticism. She criticised the platforms’ business model and the dependence of content providers on these platforms: anyone who wants to reach a lot of people is currently dependent on the big platforms such as Instagram and TikTok and has to abide by their rules. She also saw the harmonisation of styles towards the mainstream as a problem.


“People often say that Instagram is the platform for perfection, aesthetics and consumption. And it is. But it’s more than that. It’s a place for identification, inspiration, differentiation and individuality – and for mass taste, hate speech and toxic comparisons.” – Pastor Dr Sabrina Wilkenshof

“Wrap-up in four acts”

In the concluding interactive “wrap-up in four acts”, Prof Dr Alexander Decker once again invited all participants to exchange ideas: What impressions will they take away with them? What were the most important insights? Where can and should there be further discussion?

It also became clear in the discussion rounds that many people are unsettled by the rapid developments in the digital world and greatly fear a decline of traditional media and the associated threat to democracy. One older participant made a particularly urgent appeal to younger people: “Please don’t let it get that far. I don’t want to experience a dictatorship again.”


Further Information


The Open Search Foundation regularly organises talks, webinars and conferences, such as the Free Web Search Day #FWSD and the Open Search Symposium #ossym24 in autumn. If you would like to stay informed, subscribe to the osf newsletter.