The digitization of society is bringing about a structural change of the public and private spheres, in which algorithms – quite literally – play a decisive role. We all interact in and with socio-technical systems such as social media, search engines, online stores, job application platforms, and news and information platforms. In these systems, algorithms largely decide which content, groups, people, or institutions are presented or recommended to us and how they are prioritized. Algorithms often take the concrete behavior of users as their starting point, creating a complex recursive interaction between the operating algorithm and human action and experience. In this way, artificial intelligence, algorithms, and automated processes create dynamics that may not be perceivable by users but nevertheless create social structures that substantially influence our individual lives and society as a whole. Whether these consequences are desirable can only be discussed and evaluated if we know precisely how digital technologies, the Web, and the algorithms within them shape social structures.
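The recursive interaction described above can be illustrated with a minimal toy model (a sketch for illustration only, not GESIS code or any specific platform's algorithm): a ranker recommends items in proportion to their past clicks, users accept the recommendations, and small early advantages compound over time. All names and parameters here are hypothetical.

```python
import random

def feedback_loop(n_items=5, n_rounds=1000, seed=42):
    """Toy model of the algorithm-user feedback loop: each round,
    the 'algorithm' recommends an item with probability proportional
    to its accumulated clicks, and the 'user' clicks it, so early
    random advantages are reinforced (a Polya-urn dynamic)."""
    rng = random.Random(seed)
    clicks = [1] * n_items  # every item starts with one click
    for _ in range(n_rounds):
        # recommend proportionally to current popularity
        i = rng.choices(range(n_items), weights=clicks)[0]
        clicks[i] += 1  # the user accepts the recommendation
    return clicks

# attention typically concentrates on a few items, even though
# all items started out identical
print(feedback_loop())
```

The point of the sketch is that the resulting concentration is produced by the loop itself, not by any difference in the items – exactly the kind of dynamic that is invisible to individual users.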
This is why GESIS studies the mechanisms of socio-technical systems: to understand the social change they bring about and to improve the basis for informed and "good" decisions. We do this by collecting digital behavioral data on societal issues, conducting online experiments to analyze behavioral patterns and their susceptibility to influence in digital environments, and developing analytical tools. One of the most pressing social issues is inequality. Algorithms can reinforce existing social inequality or generate new distortions and discrimination. We investigate how distortions (e.g., gender bias) arise in digital practice and, conversely, how algorithms and AI can be used to counteract structural inequality, injustice, and misinformation.
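How an algorithm can reinforce inequality can also be made concrete with a hedged toy example (again an illustration under assumed parameters, not an implementation of any study cited on this page): if a ranking is based on a score that carries even a small systematic penalty for one group, that group's share of the top ranks shrinks disproportionately relative to its population share.

```python
import random

def top_k_share(penalty=0.0, minority_frac=0.3, n=1000, k=100, seed=7):
    """Toy illustration of bias amplification by ranking: every
    candidate draws a 'merit' score from the same distribution, but
    the observed score of minority-group members is reduced by
    `penalty`. Returns the minority share among the top-k ranked."""
    rng = random.Random(seed)
    people = []
    for _ in range(n):
        minority = rng.random() < minority_frac
        score = rng.gauss(0, 1) - (penalty if minority else 0.0)
        people.append((score, minority))
    top = sorted(people, reverse=True)[:k]
    return sum(m for _, m in top) / k

print(top_k_share(penalty=0.0))  # close to the population share
print(top_k_share(penalty=0.5))  # clearly below the population share
```

Because ranking truncates at a threshold, a modest score penalty translates into a much larger under-representation at the top – one mechanism by which digital systems can turn small biases into structural disadvantage.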
Learn more about our consulting and services:
Analyzing Digital Behavioral Data
Methods, tools, frameworks and infrastructures for analyzing digital behavioral data.
CSS Capacity Building
Talks, tutorials, and materials on computational methods for the collection, processing, and analysis of digital behavioral data.
Digital Behavioral Data: Datasets
Curated digital behavioral data – datasets for scientific re-use.
- Lee, Eun, Fariba Karimi, Claudia Wagner, Hang-Hyun Jo, Markus Strohmaier, and Mirta Galesic. 2019. "Homophily and minority-group size explain perception biases in social networks." Nature Human Behaviour 3 (10): 1078–1087. doi: https://doi.org/10.1038/s41562-019-0677-4.
- Tsvetkova, Milena, Claudia Wagner, and Andrew Mao. 2018. "The emergence of inequality in social groups: Network structure and institutions affect the distribution of earnings in cooperation games." PLoS ONE 13 (7): e0200965. doi: https://doi.org/10.1371/journal.pone.0200965.
- Karimi, Fariba, Mathieu Génois, Claudia Wagner, Philipp Singer, and Markus Strohmaier. 2018. "Homophily influences ranking of minorities in social networks." Scientific Reports 8. doi: https://doi.org/10.1038/s41598-018-29405-7.
- Zagovora, Olga, Katrin Weller, Milan Janosov, Claudia Wagner, and Isabella Peters. 2018. "What increases (social) media attention: Research impact, author prominence or title attractiveness?" In Proceedings of the 23rd International Conference on Science and Technology Indicators. doi: https://doi.org/10.31235/osf.io/mwxye. urn: http://hdl.handle.net/1887/65362.
- Kinder-Kurlanda, Katharina E., Katrin Weller, Wolfgang Zenk-Möltgen, Jürgen Pfeffer, and Fred Morstatter. 2017. "Archiving Information from Geotagged Tweets to Promote Reproducibility and Comparability in Social Media Research." Big Data & Society 4 (2): 1–14. doi: https://doi.org/10.1177/2053951717736336.
Political polarization and individualized online information environments: A longitudinal tracking study
NFDI for Data Science and Artificial Intelligence
Dehumanization Online: Measurement and Consequences (Professorinnenprogramm)
Artificial Intelligence without BIAS
The emergence of inequality in social systems