Redefining Autonomy: Julia Janssen's Fight Against Digital Heteronomy
Breaking the Wall of Digital Heteronomy
Winner Interview 2024: Art & Science
Discover the visionary work of Julia Janssen, who explores how data and AI shape our autonomy in her project "Mapping the Oblivion". By visualizing Netflix’s prediction models, Janssen raises critical questions about living in a world dominated by algorithms. Her innovative installations make complex, opaque systems tangible and inspire a movement towards a more autonomous and fair digital future. Dive into this thought-provoking journey and redefine what it means to be human in the age of AI.
Which wall does your research or project break?
The walls of algorithms increasingly shape your life, telling you what to buy, where to go, what news to believe and which songs to listen to. Data helps us navigate the world’s complexity and its endless possibilities. Artificial intelligence promises frictionless experiences, tailored and targeted, seamless and optimized to serve you best. But at what cost? Frictionlessness comes with obedience: to the machine, the market and your own prophecy.
Mapping the Oblivion investigates the influence of data and AI on human autonomy. The installation visualizes Netflix’s percentage-based prediction models to provoke questions about the extent to which we want to quantify our choices. Will you only watch movies that are over 64% to your liking? Dine only at restaurants that match your appetite above 76%? Date people with a compatibility rate of 89%? Will you never choose the career you want when there is only a 12% chance you’ll succeed? Do you want to outsmart your intuition with systems you do not understand and follow the map of probabilities and statistics?
Digital heteronomy is a condition in which one is guided by data, governed by AI and ordained by the industry. Homo Sapiens, the knowing being, becomes Homo Stultus, the controllable being.
Living a quantified life in a numeric world. Not having to choose, doubt or wonder. Kept safe, risk-free and predictable within algorithmic walls. Exhausted of autonomy, creativity and randomness. Imprisoned in bubbles, profiles and behavioural tribes. Controllable, observable and monetizable.
Breaking the wall of digital heteronomy means taking back control over our data, identity, choices and chances in life. Honouring the unexpected: risk, doubt and an unknown future. Shattering the power structures created by Big Tech to harvest information and capitalize on unfairness, vulnerabilities and fears. Breaking the wall of digital heteronomy means breaking down a system where profit is more important than people.
The right to be forgotten is a European right that entails the possibility of requesting the deletion of personal data. However, this right poses a multitude of challenges. The complexity of data-sharing infrastructures requires a new understanding of the scope of the right to be forgotten. Moreover, it raises the question of what it means to be forgotten in a world that is built on data. Janssen imagines a state of oblivion on the Internet. A state without documented history or quantified future. Where you can just exist and be human.
What are the three main goals of your research or project?
Janssen has been working on data protection and digital rights for years. In 2016, a time before whistle-blowers, Facebook scandals and privacy debates, she took a stand: “We are enslaved by the data business.” Janssen dedicates her work to the fundamental questions of living in a digital world. “We need to break the algorithmic walls. Not just because these systems are highly opaque, biased and unfair. Not just because they are made to be addictive and feed on anger and vulnerabilities. Not just because they are driven by capitalism and designed for profit maximisation. Beyond all those major challenges, I’m wondering what will happen if we let AI navigate our lives. If we ban the unpredicted. If we find comfort only in probabilities. If we increasingly live within the mere margins of algorithms, what does it mean to be human?”
We don’t have privacy problems. We have innovation and liability problems. Janssen stresses the importance of looking beyond privacy to explore how we deal with values like fairness, autonomy, freedom and democracy in a data-driven society. Privacy appeals to individual responsibility: it is up to people to deny cookies, make informed decisions and not use an application. That distracts from the fact that governments and industry take too little or no responsibility to create fair systems, protect rights and ensure data minimization. We all want to use the beautiful things the Internet has to offer. But we need to be careful not to become too dependent on them, not to live for them.
Every new invention gives something but also takes something away. It’s not necessarily bad to have the machine decide, but we are becoming more and more oblivious to what is not programmed. Every time we ask ChatGPT to write us a text, it takes away the opportunity to challenge our minds and structure our thoughts. Every time we choose a tailored playlist like Jazzy-dinner vibes, it takes away the joy of memory and of sharing songs that bring stories to the table. Every time we lean on turn-by-turn directions, it distracts us from looking around and vaporizes the unexpected clues about where else we might want to go.
Julia Janssen creates awareness, movement and change. Towards a society where digital rights are guaranteed and data protection is at the core of technology. Where you have full autonomy over your digital identity and where you can exist in oblivion.
What advice would you give to young scientists or students interested in pursuing a career in research, or to your younger self starting in science?
To grow beyond your own practice, education or discipline, you often have to position yourself outside the scope of what you know or feel comfortable with. I started my practice at a campus for innovation and technology, surrounded by think tanks, journalists and start-ups, to learn about entrepreneurship, make connections and gain confidence in public speaking.
Dare to choose against the odds, respect advice but dare to neglect it, and keep on challenging your own thinking. When I graduated from art school in 2016, I stated that “the data business has enslaved us”. Nobody cared about data protection. I never expected that I, as a young female artist, could have a voice in shaping our digital future. But I knew it was worth fighting for.
What inspired you to be in the profession you are today?
At age fifteen, when I had to choose the course of my education, I chose the economics profile supplemented with higher mathematics, art and art history. The counselor asked me: “Why this very unusual combination?” I said: “I need mathematics to make art. And art helps me to understand mathematics and complexity.” Not knowing that, right there, I had laid the foundation for my practice.
What impact does your research or project have on society?
My art installations make complex, opaque systems tangible and understandable to a wide audience. In this work, I make people aware of digital heteronomy by visualizing how we increasingly live within the mere margins of AI. This allows people to reflect on their relationship with technology and question how to deal with fairness, autonomy, freedom and democracy in a data-driven world.
What is one surprising fact about your research or project that people might not know?
Can you know someone based on data? By researching and visualizing my family’s Netflix algorithms for this project, I discovered that my grandmother is profiled as liking Korean horror movies. That was quite a surprise, considering she has never visited a foreign country.
What’s the most exciting moment you've experienced over the course of your research or project?
In October 2023, I received the Rise25 award from Mozilla, as one of 25 visionaries reshaping the internet. For the ceremony and my speech, I made a gown from the spare fabric of the Mapping the Oblivion installation. Walking down the red carpet in my grandma’s algorithm.