Kristin Kaltenhäuser

PhD Fellow, Department of Computer Science, University of Copenhagen


My research is situated at the intersection of machine learning (ML), computer-supported cooperative work (CSCW), and human-computer interaction (HCI). I explore creative, interdisciplinary methods to extract meaning from quantitative and textual data, and to reimagine how statistical and ML tools can be used to study complex, real-life phenomena. A central concern in my work is enabling diverse communities to meaningfully participate in shaping how AI is developed and applied in society.

I hold a BA in Media and Culture Studies, an MSc in Software Design (specialising in algorithms and data science) from the IT University of Copenhagen, Denmark, and an MA in Intercultural Communication and Gender Studies from the European University in Frankfurt/Oder, Germany. I also bring several years of industry experience: I worked as a diversity analyst at CERN, Switzerland, for two years, and spent three years at Ørsted, a wind energy company in Denmark, as a data scientist and model engineer, developing ML models to predict mechanical failures in large-scale infrastructure.

Currently, I am a PhD student in the Human-Centred Computing (HCC) section at the Department of Computer Science, University of Copenhagen, Denmark, and part of the Confronting Data Co-Lab.

My PhD project is part of the interdisciplinary and international research project NordASIL spanning law, computer science and medicine. The project examines how asylum decisions are made in Denmark, Sweden, and Norway—and why outcomes vary so widely even among similar cases.

One area I focus on is how people can be unintentionally excluded by the way data is cleaned or processed. For example, many tools used in data analysis automatically remove “outliers” — data points that don’t fit expected patterns. But when these tools are used in sensitive areas like asylum decision-making, those “outliers” can actually be real people with complex stories. If they’re excluded from the data, they risk being left out of systems that affect their lives.
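As a minimal sketch of the kind of filtering I mean, the snippet below shows a common rule-of-thumb outlier removal step (the 1.5 × IQR fence) implemented with pandas. The column name, threshold, and data are purely illustrative and not drawn from any specific tool or case file.

```python
import pandas as pd

def drop_iqr_outliers(df: pd.DataFrame, column: str, k: float = 1.5) -> pd.DataFrame:
    """Drop rows whose value in `column` falls outside the k * IQR fences.

    This is the textbook rule that many analysis pipelines apply by default;
    every dropped row is a case that simply disappears from all later steps.
    """
    q1, q3 = df[column].quantile([0.25, 0.75])
    iqr = q3 - q1
    lower, upper = q1 - k * iqr, q3 + k * iqr
    return df[df[column].between(lower, upper)]

# Illustrative only: "processing_days" is a made-up column name.
cases = pd.DataFrame({"processing_days": [30, 45, 50, 55, 60, 900]})
kept = drop_iqr_outliers(cases, "processing_days")
# The 900-day case, perhaps the one with the most complex story, is silently removed.
```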

In my work, I examine how different tools make these decisions, and what it means when people are treated as exceptions or ignored by algorithms. I believe we need to think more deeply about who is represented in data, how algorithms shape decisions, and what makes a system truly useful and just—not just whether it performs well in a technical sense.

Overall, my goal is to help build technologies that are more inclusive, participatory, and socially responsible—especially in areas where the stakes are high and where people’s futures are on the line.

I give public talks on my research as well as on diversity and inclusion.

In my free time, I enjoy weightlifting, yoga, and the occasional run. I also like meditating, knitting, and immersing myself in art exhibitions. I am passionate about teaching Python and data analysis to beginners, which I did for the ReDI School of Digital Integration in 2021 and for Ukrainian refugees in 2022.

I am also a certified NLP (neuro-linguistic programming) practitioner, trained by the amazing Tristan Soames.