Louise Amoore’s research focuses on algorithmic ethics and the broad societal implications of AI, examining classification systems, social dynamics, and discrimination. Through projects like “Algorithmic Societies,” Amoore strives to engage the public in critical discussions of AI’s ethical dimensions. She aims to inspire scholars to scrutinize AI’s political and ethical implications closely and to influence the shaping of AI-driven worlds for the better.

Which wall does your research break?

My research aims to break the walls that limit our ethical and political responses to AI technologies. A limitation of social science research on machine learning systems is that questions of ethics and norms are most commonly posed outside the technical building of the algorithms. My research seeks to position the normative assumptions of algorithms within the structures of the technology itself. In this way, every decision taken or pathway followed must carry the full weight of the ethics of a wider society. In systems such as facial recognition technology or biometric border controls, for example, the impact on society extends significantly beyond how the technology is implemented or whether it has legal safeguards. My research proposes that we take seriously how AI is changing the systems of classification, social clustering, and forms of division and discrimination that have accompanied the growth of statistical models. From the new analysis of biomedical images to the new use of AI in welfare decisions, the algorithmic society demands that we rethink how we live together ethically and politically.

What inspired or motivated you to work on your current research or project?

My current project, Algorithmic Societies: Ethical Life in the Machine Learning Age, was inspired by more than a decade of research on how social inequalities and discrimination are intensified in technological systems. My early work on the use of algorithms in warfare, for example, was concerned with the forms of automation and machine learning that were entering lethal decisions from the late twentieth century. In the twenty-first century, these systems are mutating once again, using new AI models that are more difficult to explain or to open to scrutiny. I am inspired to find new ways for society and the public to critically interrogate and respond to the algorithms in their lives.

In what ways does society benefit from your research?

It is important to me that my academic research is translated into forms of discussion that are accessible to a wider public. For example, the Algorithmic Societies project has worked with musicians to translate the politics of AI into performances that open new discussion and public engagement. We have also worked with artists to build installations and objects that can be encountered by people in spaces that are experimental and open. More practically, the findings from my research have informed two oral testimonies before the UK House of Commons Science and Technology Select Committee – one on the future of biometric technologies, and the other on the ethics of algorithmic systems in public life.

Looking ahead, what are your hopes or aspirations for the future based on your research or project?

My greatest hope is that a new generation of scholars will be inspired to research AI systems, up close and in detail, with greater attention paid to the technical assumptions that become political forces in the world. In particular, I hope that people will become more attuned to the world-making that is taking place via AI systems. It is crucial that we find ways to shape and govern the ethical consequences of the kinds of worlds that are being made in and through machine learning tools.
