Data, Power and Justice: How AI Can Reinforce Social Inequalities
AI systems often appear objective because they are based on data. This unit shows why this assumption is misleading. Data emerges in social contexts, reflects power relations, and excludes certain perspectives. You will explore how AI can reinforce social inequalities and what responsibility this creates for pedagogical professionalism, democracy, and inclusion.
Warm up
Data is not a neutral raw material. It emerges in social contexts.
Imagine the following situation:
A data-based system assesses risks or support needs. Certain groups are classified as problematic far more often than others, without it being clear why.
Now take a moment for the following questions:
- When do data seem objective or reliable to you?
- Who decides which data are collected and which are not?
- Who benefits from data-based decisions, and who remains invisible?
Together with your buddy:
Talk about where, in your professional field, you work with assessments, classifications, or categories that are based on data.
Learn
Data is often regarded as the objective foundation of AI systems. In reality, it is the result of social selection processes. Data reflects existing power relations, norms, and exclusions. AI cannot recognise these distortions but continues to process them. As a result, inequalities can be reproduced or reinforced.
Read more here:
Data as Social Constructions
Dive in 1
Algorithmic Fairness, Discrimination and Responsibility
How can decisions that appear fair but are in fact discriminatory emerge from data-based systems?
In this section, you will explore why fairness in AI systems cannot be produced by technical means alone, how responsibility is shifted in automated decision-making processes, and which particular challenges this creates for pedagogical professionalism.
You can find more on this here:
Algorithmic Fairness, Discrimination and Responsibility
Transfer 1
Positioning data, fairness and responsibility within one’s own professional practice
Educators often work in contexts in which assessments, classifications, or decisions are increasingly prepared or legitimised through data. This task supports conscious reflection on one’s own role within such systems.
Pedagogical professionalism is not only reflected in direct contact with children and young people, but also in how structural requirements, assessments, and seemingly objective decision bases are handled. The goal is not to delegate responsibility, but to actively assume it.
Are you ready?
Take time for an individual written reflection or a structured exchange with a colleague or a buddy.
Work on the following guiding questions:
- In which situations in my pedagogical everyday life do data, categories, or assessments play a role?
- Where do I experience decisions that are presented as objective or without alternatives because they are data-based?
- At which points is there a risk that structural inequalities are reproduced through such decisions?
Then formulate two to three concrete principles that should guide your own actions, for example:
- How you deal with data-based assessments
- Where you deliberately ask about context, history, and individual life situations
- How you keep responsibility visible even in complex systems
These principles can serve as personal orientation or as a basis for collegial discussions.
Transfer 2
Promoting sensitivity to data power and discrimination in direct contact
In everyday pedagogical practice, children and young people increasingly encounter assessments, rankings, and automated evaluations. These are often perceived as neutral or fair because they are based on data. Educators have the task of contextualising and questioning this perception without creating mistrust or feelings of powerlessness.
This task focuses on how a sensitive approach to data-based decisions can be fostered in direct contact with children and young people. The goal is to develop awareness of power, exclusion, and responsibility.
Are you ready?
Develop a conversation prompt, a short exercise, or a recurring question with which you address the following aspects:
- That data does not capture everything that is important for people
- That assessments are always based on certain assumptions
- That injustice can also arise where equal treatment is promised
Reflect on the following:
- Which examples from the everyday lives of children or young people are particularly suitable?
- How can you explain that data can be helpful without granting it absolute authority?
- How can you encourage children and young people to ask questions and not simply accept assessments?
The goal is not to reject technology, but to strengthen critical thinking and self-efficacy.
Reflect
Dealing with data and AI touches on fundamental questions of power, justice, and participation. Educators stand at an interface between individual life situations and societal structures. Their stance influences whether data-based systems are experienced as supportive or as excluding. Data-based decisions often appear factual and neutral. Precisely for this reason, it is necessary to make their underlying assumptions visible and not lose sight of responsibility.
Reflection questions on your own stance
- Where am I myself inclined to trust data-based decisions more than relational or contextual knowledge?
- Which forms of inequality do I particularly perceive in my pedagogical environment?
- How do categories and assessments influence my own professional practice?
Reflection questions for future practice
- What do I want to consciously retain in dealing with data and AI because it is fair and responsible?
- What do I want to change in order to deal more sensitively with power, exclusion, and invisibility?
- Which stance on fairness, justice, and responsibility do I want to pass on to children, young people, colleagues, or parents?