
Algorithms and Society

The digitization of society is bringing about a structural transformation of the public and private spheres, in which algorithms – quite literally – play a decisive role. We all interact in and with socio-technical systems such as social media, search engines, online stores, job application platforms, and news and information platforms. In these systems, algorithms largely decide which content, groups, people, or institutions are presented or recommended to us and how they are prioritized. Algorithms often take the concrete behavior of users as their starting point, creating a complex recursive interplay between the operating algorithm and human action and experience. In this way, artificial intelligence, algorithms, and automated processes create dynamics that may be imperceptible to users, yet they shape social structures that substantially influence our individual lives and society as a whole. Whether these consequences are desirable can only be discussed and evaluated if we know precisely how digital technologies, the Web, and the algorithms within them shape social structures.

This is why GESIS studies the mechanisms of socio-technical systems: to understand the social change they bring about and to improve the basis for informed and "good" decisions. We do this by collecting digital behavioral data on societal issues, conducting online experiments to analyze behavioral patterns and their susceptibility to influence in digital environments, and developing analytical tools. One of the most pressing social issues is inequality. Algorithms can reinforce existing social inequality or generate new distortions and discrimination. We investigate how distortions (e.g., gender bias) arise in digital practice and, conversely, how algorithms and AI can be used to counteract structural inequality, injustice, and misinformation.
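The recursive feedback described above – algorithms taking user behavior as input and feeding it back into what is shown next – can be illustrated with a minimal, purely hypothetical simulation. This sketch is not one of the GESIS models; the group sizes, the small initial visibility gap, and the "clicks proportional to popularity" rule are all illustrative assumptions:

```python
import random

random.seed(42)

# Hypothetical setup: two groups of items, where the minority group
# starts with a slightly lower visibility score. Users "click" items
# with probability proportional to current popularity, and each click
# raises that item's future visibility (a rich-get-richer feedback loop).
N_MAJORITY, N_MINORITY = 80, 20
popularity = {f"maj{i}": 1.0 for i in range(N_MAJORITY)}
popularity.update({f"min{i}": 0.9 for i in range(N_MINORITY)})  # small initial gap

for _ in range(10_000):  # simulated user interactions
    items = list(popularity)
    weights = [popularity[i] for i in items]
    clicked = random.choices(items, weights=weights, k=1)[0]
    popularity[clicked] += 1.0  # feedback: a click increases future exposure

# Compare the minority's share of the top-ranked items to its population share.
top10 = sorted(popularity, key=popularity.get, reverse=True)[:10]
minority_share = sum(name.startswith("min") for name in top10) / 10
print(f"Minority share of top-10 ranking: {minority_share:.0%} "
      f"(population share: {N_MINORITY / (N_MAJORITY + N_MINORITY):.0%})")
```

Even though the initial disadvantage is tiny, the feedback loop tends to amplify it over many interactions – a toy version of how ranking algorithms can entrench existing disparities without any explicit discriminatory rule.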

Selected publications:

  • Lee, Eun, Fariba Karimi, Claudia Wagner, Hang-Hyun Jo, Markus Strohmaier, and Mirta Galesic. 2019. "Homophily and minority-group size explain perception biases in social networks." Nature Human Behaviour 3 (10): 1078–1087. doi: https://doi.org/10.1038/s41562-019-0677-4.
  • Tsvetkova, Milena, Claudia Wagner, and Andrew Mao. 2018. "The emergence of inequality in social groups: Network structure and institutions affect the distribution of earnings in cooperation games." PLoS ONE 13 (7): e0200965. doi: https://doi.org/10.1371/journal.pone.0200965.
  • Karimi, Fariba, Mathieu Génois, Claudia Wagner, Philipp Singer, and Markus Strohmaier. 2018. "Homophily influences ranking of minorities in social networks." Scientific Reports 8. doi: https://doi.org/10.1038/s41598-018-29405-7.
  • Zagovora, Olga, Katrin Weller, Milan Janosov, Claudia Wagner, and Isabella Peters. 2018. "What increases (social) media attention: Research impact, author prominence or title attractiveness?" In Proceedings of the 23rd International Conference on Science and Technology Indicators. doi: https://doi.org/10.31235/osf.io/mwxye. urn: http://hdl.handle.net/1887/65362.
  • Kinder-Kurlanda, Katharina E., Katrin Weller, Wolfgang Zenk-Möltgen, Jürgen Pfeffer, and Fred Morstatter. 2017. "Archiving Information from Geotagged Tweets to Promote Reproducibility and Comparability in Social Media Research." Big Data & Society 4 (2): 1–14. doi: https://doi.org/10.1177/2053951717736336.
Title | Start | End | Funder
Political polarization and individualized online information environments: A longitudinal tracking study (POLTRACK) | 2022-01-01 | 2025-08-30 | SAW (Leibniz)
NFDI for Data Science and Artificial Intelligence (NFDI4DS) | 2021-10-01 | 2026-09-30 | DFG
Dehumanization Online: Measurement and Consequences (Professorinnenprogramm) (DeHum) | 2021-01-01 | 2026-09-30 | SAW (Leibniz)
Artificial Intelligence without BIAS (NoBIAS) | 2020-01-01 | 2023-12-31 | Horizon 2020
Inequality research: The emergence of inequality in social systems (Inequality) | 2017-02-01 | 2022-02-28 | Stiftung