Petersmann, Marie (ORCID: 0000-0001-5665-0211) (2025) Refusing algorithmic recognition. European Journal of International Law. ISSN 0938-5428
Text (chae068), Published Version. Available under License Creative Commons Attribution. Download (244kB)
Abstract
The Black Technical Object retraces and problematizes the entanglements between technology, data and race. Ramon Amaro's genealogy of racial sorting argues that machine learning is preconditioned by a prototypical Whiteness that relegates Black beings to the status of objects measured against a White norm. The book raises salient questions about technological, political and legal agency, which intersect with various modes of critique of algorithmic governance. I explore three main contributions it makes to international law. First, machine learning and algorithmic decision-making are increasingly deployed in different domains of international law. The Black Technical Object speaks to emergent strands of scholarship that map how these developments reconfigure and disrupt key legal concepts and categories. Second, both the individual 'subject' and the collective 'public' of (international) law come into formation differently through these technological systems. Amaro helps us engage with these modalities of subject-making and understand their lineages and political stakes. Finally, the book offers a distinct political perspective on refusal and resistance – a refusal of recognition that resists the reinforcement of racial constructs (re)produced in the digital space.
Item Type: Article
Additional Information: © 2025 The Author(s)
Divisions: Law
Subjects: K Law; T Technology; Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Date Deposited: 13 Dec 2024 11:33
Last Modified: 18 Feb 2025 09:27
URI: http://eprints.lse.ac.uk/id/eprint/126345