If it discriminates, this is not my AI - III Conference on Thinking about Global Digital Justice
The entire process of designing, developing and implementing AI is marked by deeply discriminatory dynamics. As the applications of this technology become mainstream, we must address the complexity of the phenomenon. For that reason, this conference has two parts with two different approaches: a dialogue and a roundtable.
The first part is a dialogue between experts who will take a global approach, discussing how imbalances between the global north and global south have shaped the consolidation of an AI development model that, since its inception, has implicitly entailed discrimination. The hegemonic economic model, the role of big business in the sector, the imbalances in technical conditions between different geographical areas and even the frameworks constructed for their regulation mean that AI system developments are discriminatory from the outset.
In the second part, which takes the form of a roundtable discussion between experts, participants will discuss the vectors of discrimination articulated through specific uses of AI: from the resources and training datasets, to the areas where AI-based systems are implemented, to the design of the algorithms themselves. The experts will discuss how AI-based tools, as techno-social artefacts, increase institutional racism and gender-based discrimination in access to public services, in the management of public space and in the mechanisms for controlling human mobility, for example.
In both sessions, participants will also aim to show that the current hegemonic model is not the only one possible, and will propose what an AI model that does not feed on racism and other forms of discrimination should look like.
- Languages
  - English
  - Spanish
- Organiser
- Cost
  - Free