Freedom To Think – Susie Alegre & Freedom Matters
Protecting our human rights against the threat of surveillance capitalism
This week we welcome Susie Alegre, a leading human rights barrister at the internationally renowned Doughty Street Chambers. She has been a legal pioneer in digital human rights, in particular the impact of artificial intelligence on the rights to freedom of thought and opinion.
Her book, Freedom to Think, charts the history and importance of our most basic human right: freedom of thought.
From Galileo to Nudge Theory to Alexa, Susie explores how the powerful have always sought to get inside our heads, influence how we think, and shape what we buy. Providing a bold new framework to understand how our agency is being gradually undermined, Freedom to Think is a groundbreaking and vital charter for taking back our humanity and safeguarding our reason in the technological age.
In this fascinating episode we discuss:
- How human rights underpin what it means to be human and why the right to the freedom of thought should be protected at all costs
- How historically this right has come under threat, but never more so than today when the threat of surveillance capitalism means our minds are read every single minute
- How to stay cognizant of how technology is affecting our freedom of thought
- Future strategies to keep us safe.
This episode is part of our mini-series on ‘Self’ where we explore how our technology impacts some of the most important aspects of being human.
Over the coming weeks, we will speak with Krista Tippett, creator of On Being; Susie Alegre, human rights lawyer and author of Freedom to Think; Jillian Horton MD, physician and author of We Are All Perfectly Fine; Casey Schwartz, author of Attention: A Love Story; L. M. Sacasas, renowned commentator on technology and society; and Sharath Jeevan OBE, motivation expert and author of Intrinsic. Our goal is to help our listeners think more critically about the role of technology in our lives, and how it shapes who we are.
Host and Producer: Georgie Powell, Sentient Digital
Music and audio production: Toccare, Philip Amalong
Transcript:
Susie: I suppose one of the big problems with the way the internet has developed is about personalization and targeting, and the way that our online engagements are constantly surveilled.
And so what we’ve found is that the information that I find if I go and look online is going to be different from the information that you find when you go and look online. Not because of what I think I’m looking for, but because of what the search engines think I’m looking for, what the search engines have decided I should be served up with.
We’re not able to freely search for information in the way that it feels like we are. But in reality, we are being served up information in a very personalized and targeted way.
Georgie: Welcome to Freedom Matters, where we explore the intersection of technology, productivity, and digital well-being. I’m your host, Georgie Powell. And each week, we’ll be talking to experts in productivity and digital wellness. We’ll be sharing their experiences on how to take back control of technology. We hope you leave feeling inspired. So, let’s get to it.
This week, we welcome Susie Alegre, a leading human rights barrister. She has been a legal pioneer in digital human rights, in particular the impact of AI on the rights to freedom of thought and opinion, and she recently released a book on the topic, Freedom to Think.
In this fascinating conversation, we discuss how human rights underpin what it means to be human, and why the right to freedom of thought should be protected at all costs. We explore how historically this right has come under threat, but never more so than today, when the threat of surveillance capitalism means our minds are read every single minute. And we discuss how to hold on to ourselves by keeping our thoughts close and our eyes wide open.
Susie, welcome to the Freedom Matters podcast. Thank you so much for joining us today. It’s an absolute pleasure to have you.
Susie: It’s a pleasure, thanks so much for inviting me.
Georgie: So, your book, Freedom to Think, focuses on the important right and freedom of thought, and historically how that has been challenged and threatened and how, perhaps with the digital world, it’s more threatened than ever before.
Before we get into understanding that specific human right of the freedom to think, freedom of thought, the first question I wanted to ask you is to understand a little bit more about human rights and why they’re so important to what it means to be human, and how we come to understand ourselves.
Because I think you made the point a few times that when your life is comfortable, you don’t really think about human rights. So, yes, can you start by explaining the role of human rights and understanding what it means to be human?
Susie: When we think about human rights today, often we think about human rights being a problem for other people. So, we think about human rights as connected to the rights of terrorists or the rights of immigrants; we don’t necessarily connect them to ourselves. But the human rights project, particularly in the 20th century, came about really strongly in the aftermath of the horrors of the Second World War.
And the Universal Declaration of Human Rights was a kind of statement, bringing together all the nations of the world, all different cultures, all different philosophical, religious and political perspectives and contexts, to decide what rights we needed in order to be human. And so it’s a kind of condensation of what basic rights we all need to be able to enjoy our lives, to develop as individuals, and to develop as societies.
And so the Universal Declaration of Human Rights covers a really broad range of things, from the right to liberty or the right to a fair trial, those kinds of classic civil and political rights. But also, crucially, this idea of the right to freedom of thought, the sort of human spark, if you like, and what it is that we need to protect in ourselves as human beings.
And that was in a context where half the world was moving forward into sort of capitalist liberal democracies, while the other half of the world was at that time living under the Soviet Union or the development of socialism and communism. So, the UDHR really distilled that idea of what it meant to be human and what needs to be protected for us to be human, no matter what our political or cultural context.
Georgie: I’d love to understand a little bit more about you and your journey as a human rights lawyer, and how that has helped you to understand yourself more.
Susie: That’s a very good question. I suppose, I mean, I started out not planning to be a lawyer of any kind. I intended to be a poet and/or a philosopher. But after university, I went and worked in Spain, and I landed up working initially as a translator. And then as a research assistant for an international conflict resolution NGO, where I was working on issues around the conflict in Spain, but also conflict in Latin America in places like Colombia.
And then finally on the Northern Ireland conflict, which at the time was going through a peace process. And that’s when I met lawyers from Northern Ireland who were coming to do research about the role of politically motivated prisoners in a peace process. And they then started talking about judgments of the European Court of Human Rights and about the way human rights law worked.
And for me, this was a real revelation, because I saw that human rights law was really about explaining what it is to be human; about managing the political context that we all live in, but in a way that would help us to develop as societies and as individuals. And it was really that experience that sent me back to the UK to train as a lawyer, and to become, ultimately, a human rights lawyer.
Working in human rights law in a really wide range of contexts showed me how this human rights framework that developed in the middle of the 20th century can help us to understand and navigate some of the biggest issues that face us as societies, in a way that is humane, and in a way that reflects what it means to be human and how our futures will develop.
Georgie: That’s fascinating. And you talk about how you think freedom of thought is one of the most important human rights. Can you explain a bit what constitutes that right, and why you think it is so important?
Susie: Most human rights are what are called limited rights. And that means that there are certain situations where that right can be limited without breaching the right. So, for example, we all have a right to liberty. But that right can be limited if, for example, you’re arrested to face trial, or if you’re sentenced to prison following a trial. So, there are a list of reasons why your right to liberty can legitimately be restricted or limited.
But there is a small group of rights which are called absolute rights: human rights which can never be restricted for any reason at all. And so the classic examples of absolute rights are the right to freedom from torture and the right to freedom from slavery. If you think about those rights, they’re really about the dignity of human beings. There can never be a justification for torturing someone.
Similarly, with the right to freedom from slavery, somebody cannot legitimately be sold, or sell themselves, into slavery, because the whole concept and practice of slavery is so fundamentally inhumane that it can never ever be justified.
And so similarly, the right to freedom of thought, insofar as it relates to what’s going on inside your head, can never, ever be limited. Because doing that really takes away what it means to be human, this inviolable space inside our heads.
The right to freedom of thought and the right to freedom of opinion, have a kind of dual aspect. They have an internal aspect of what is going on inside our own heads at any moment, and an external aspect, which is when we decide to share what we’re thinking. Once you share your thoughts and opinions, once you express your thoughts and opinions, then there can be limitations placed legitimately on those expressions, classically to protect the rights of others.
So, once you say something, if you express something that is hateful, that amounts to hate speech, then the state can limit what you’re saying in order to protect the rights of others. But what goes on inside your own head is really entirely your business, and that is an absolute right.
And that’s important, because it also gives us the chance to think dreadful things and then put those thoughts away without acting on them or expressing them. It gives us the space to change our minds, and also to judge what we want to share, on the understanding that once we share our thoughts, we’re going to have to deal with the consequences of sharing them.
Georgie: And importantly, those thoughts are totally private; no one else has the right to see or hear or judge them in any kind of way.
Susie: Absolutely. So, no one has the right to coerce us into revealing our thoughts. And one of the things that I think is very interesting is that the drafters of international human rights conventions recognize that inferences about our thoughts, could also be a violation of that right to keep our thoughts private.
So, it’s not just about whether or not someone gets our thoughts right, and sort of extracts our thoughts from our head, and correctly reads our minds. Even if they incorrectly read our minds, that in itself may be a violation, where they’re acting on inferences about what’s going on inside our heads.
Georgie: Why is it dangerous for someone to infer what we’re thinking?
Susie: I think if you look at the past, a classic example is witch hunts. In witch hunts, a witch finder would turn up and infer that you were a witch because you were, in some way, communing with the devil. It doesn’t matter whether you are in fact someone who thinks they’re a witch; the inference that you’re a witch might well still see you being burned at the stake.
If we look at that in the current sort of digital environment, inferences about what kind of a person you are, may well lead to, for example, a risk assessment. So, if you’re applying for insurance, somebody reads your big data, makes inferences about what kind of a person you are.
Not necessarily based on things you’ve done, but on an analysis of all the information that can be gathered about you in the digital sphere. And they may then decide what price your insurance will be, or whether or not you’re actually insurable. And so it doesn’t matter whether or not those inferences are correct, they may still have an impact on your daily life.
And then another, harder example, I suppose, is if you look at technology that’s been developed to identify criminality and to say that somebody has a criminal predisposition before they’ve actually done anything. It doesn’t matter whether or not it’s right; you may well still find yourself being penalized for what the tech says about you and infers about you.
Georgie: Yeah, and you talk about the sort of ‘computer says no’ anecdote that so many of us have experienced.
The right to freedom of thought has always been under pressure. And I think this is something you’re really good at documenting in the book, how through religion, through science, psychiatry, there has always been this kind of attack, this idea of trying to understand what people are thinking and to manipulate that. Can you talk a bit about how it’s been threatened in the past?
Susie: Historically, freedom of thought was really not a thing that most people enjoyed. And so while we see philosophers like Spinoza, for example, writing about the importance of freedom of thought, and philosophers going back even to the ancient Greeks developing ideas around this freedom, the political reality was that if your thoughts were challenging the status quo, it was very unlikely that you would be allowed to carry on peacefully about your business.
And one of my favorite examples was Galileo and his ideas about heliocentrism: that the Earth traveled around the Sun, rather than the universe traveling around the Earth, which was the church’s doctrine at the time. And so we can see how that scientific reality was a real challenge to the status quo and the control of the church.
And so historically, we’ve seen that while philosophers might have talked about freedom of thought, the reality on the ground was that most people were driven to accept the doctrine, whether it was that of the church or of the political classes, which often went hand in hand.
And so I think what we saw then, in the 20th century, with the development of the right to freedom of thought as part of the human rights framework and the international legal framework, was a recognition that for us to develop as humans, to innovate, to thrive as humans, we needed legal protection of this right to freedom in our own minds.
But I think that was something that wasn’t at all recognized politically, or religiously in centuries and millennia before we started to develop this international human rights framework.
Georgie: And then you talk about how initially, the Internet was seen actually as a place which could really support independent thinking to really flourish because you’d be able to source information from a whole range of places. And that’s really beneficial. You’re not receiving a singular message of propaganda or whatever that might have previously been served by the state.
Why, in your view, has the internet not actually improved our freedom of thought, but rather put it further at risk?
Susie: Well, I think absolutely, this idea of freedom of information is absolutely [inaudible 00:15:24] freedom of opinion and freedom of thought. To decide what we think about things, we need to be able to go out and find out about them in order to form our own opinions.
I suppose one of the big problems with the way the internet has developed is about personalization and targeting, and the way that our online engagements are constantly surveilled.
And so what we’ve found is that the information that I find if I go and look online is going to be different from the information that you find when you go and look online. Not because of what I think I’m looking for, but because of what the search engines think I’m looking for, what the search engines have decided I should be served up with.
We’re not able to freely search for information in the way that it feels like we are. But in reality, we are being served up information in a very personalized and targeted way.
Georgie: Is that the difference between, say, how we are today versus how we were back in the 18th century? Is it that now we think we have freedom of thought, but actually we still don’t? Is that the problem?
Susie: Yes, I think the big problem is this sort of personalization and targeting. In previous generations, propaganda, whether religious or political, had mass impact; the Nazi propaganda machine, for example, is a classic example of dominating the information sources of an entire population in order to manipulate their minds.
What we can see now is the ability to pick us off as individuals or as very small groups, in order to mold our minds, very particularly, rather than just tailoring a message for a whole population or even a whole geographical area.
Georgie: Yeah. At a time when the user feels like they have full access to all the information on the internet. They think they’re looking at everything, when actually they’re not.
Susie: No. And I mean, what you’re looking at, or what you’re being served up, is based on what big data has amassed about you. So, it’s about a sort of granular understanding of you as a person, of you as an emotional being, not just of you as a consumer.
Georgie: And you talk about how it’s these emotional subtleties, isn’t it, that mean that in many cases, people say this technology now knows us better than we know ourselves. One of the interviews I love the most is Tristan Harris interviewing Yuval Noah Harari for Wired, where Harari says that his computer could have told him, many years before he actually came out, that he was gay. But it took him longer to get to that point himself. Is there not a role where technology knowing us in this way could actually help us to know ourselves better?
Susie: I think that’s debatable. I mean, the problem is, well, there are several problems. One is who has the access to that information. It’s not technology as a disembodied entity, if you like, that is helping us to know ourselves. These are companies or political organizations who are gathering this information and using it for their own ends.
The quantified-self movement says that we’re going to be able to know ourselves better. And one of the classic examples of that is period tracker apps, which supposedly allow women to manage their fertility.
But the reality behind period tracker apps is that all of the detailed information that you’re giving about your physical and emotional state, your relationship status, all of that information is actually being shared with a company that is potentially using it in a wider ecosystem.
Which may again go back to affect your ability to get credit, your ability to get insurance, your ability to access health care. Or in countries where, for example, adultery is illegal or abortion is illegal, that kind of information may well land you up in prison, quite apart from that broader context.
So, if you were looking at quantified-self applications which were a closed circuit, if you like, where only you, and, if you wanted, your medical practitioner, were able to use this information in any way whatsoever, then you would be looking at a different situation. But with applications and technology that are designed to, or claim to, help you know yourself better, I think you always have to ask how that might be used.
Another example of this trend is developments in tech which claim to be able to predict Alzheimer’s many years before a medical diagnosis. But you have to ask yourself how helpful it is to you if your word processing software or your email account can tell that you have early-onset Alzheimer’s. How is that going to be used against you and/or for your benefit?
And so I think there are really big questions about the way technology is developing, even when it’s packaged with claims that it’s going to be helpful for us.
Georgie: Yeah. It’s the question that I kind of encourage everyone to think about. Every technology is sold on its benefits. That’s why we use it, because it meets some need, whether it’s entertainment, connection, creation, some kind of need. But there is always a cost. So, it’s about being able to critically assess what that cost is relative to the net benefit that it brings.
Susie: I think there is. But I think that’s also the space where, from a human rights perspective, governments have an obligation to protect our human rights from each other and from corporations. So, in some countries, like the UK, there have been recent developments in criminal law to protect us from coercive control in domestic partnership situations.
And that is part of the government’s obligation to protect us from each other. And I think we need to see legislation coming in that also says that there are certain activities, or certain uses of our data that technology can never, ever do. And so once we know what can’t be done, then we’re able to develop technology and to use technology in a safer space.
Georgie: It’s clear that Susie would like to see this human right translated into much more practical legislation that puts in place stricter boundaries around how data can be used.
An important part of this is surveillance advertising, something she’d like to see banned. What exactly is surveillance advertising, and how does it differ from the targeted advertising that we’re so used to?
Susie: I think of targeted advertising as contextual advertising. So, if you’re visiting a website about camping holidays and you then get adverts for tents in your geographical region, that might be contextual targeted advertising.
Surveillance advertising is where there’s a system called real-time bidding: when you go on to a website, your access to that website is sold at auction in real time around the world, to allow people to decide how much your eyes are worth, if you like, for their adverts.
But that might include information that says whether or not you’ve got a gambling problem. It might be information that says that you are someone who suffers from anxiety, and therefore, an advert that makes you feel more anxious might well help to sell whatever it is that’s being sold to you. It might be being sold for political ends or it might be being sold for commercial ends.
And so in that split second, your individual eyes are effectively auctioned globally to press your buttons with whatever advert you’re going to be served up. And that’s very different from this contextual advertising, which says, okay, here’s a person in the UK, who is looking to go on a camping holiday, and therefore, they might want to buy a tent. That’s a very, very different context.
Georgie: This is much more about mindset, emotional behavior, personality type; all those things that you wouldn’t know about a person if you were walking down the street, that you wouldn’t be able to see.
Susie: Absolutely. Although there have also been technological developments in billboards: smart billboards in Mexico were being used to read the person looking at the billboard. So, you’re looking at your local politician selling themselves to you, and in the meantime, the billboard is reading how you’re responding to that, so that it can be fed back into their campaign strategy.
And so even on the street, we’re seeing smart technology being able to respond to what we’re thinking and feeling. Or, as you’re walking around the mall, the technology might well be reading your face to see whether or not you’re likely to be a potential shoplifter, based on how you’re walking or how you’re looking.
So, this kind of constant reading is also way beyond just looking at your phone or your computer. It’s increasingly developing in our physical environments around us.
Georgie: Yeah, fascinating, and terrifying. And it’s no wonder that 1984 is referenced so much in your book.
What about her proposal to ban surveillance advertising? Is it really that radical, or is it a real possibility?
Susie: I think surveillance advertising is not the only problem. But I think it’s the economic driver behind a lot of these developments. So, we often think, oh, well, it’s just down to being sold annoying stuff; I already bought this thing and now I’m being targeted with the next thing.
I think you need to look behind the adverts that you’re seeing to that wider ecosystem. And so yes, I think banning surveillance advertising would take away the profit incentive for this direction of surveillance capitalism that Shoshana Zuboff describes so thoroughly in her book on surveillance capitalism.
And interestingly, when I first started working on the right to freedom of thought, banning surveillance advertising seemed like absolute pie in the sky; no one would ask for it, because there was absolutely no chance.
But what we’ve seen recently, just in the last couple of months, Biden, in his State of the Union speech, started to talk about stopping the manipulative extractive techniques that are being used on children. So, he’s really talking about stopping surveillance advertising targeting children.
But effectively, if you’re going to stop surveillance advertising targeting children, you really just need to look at the whole business model. It’s going to be very difficult to say how you do this with children and not with adults.
And essentially, if it is a rights violation for children, really, it is a rights violation for all of us. And I think it was just a couple of weeks ago that the French digital minister, Cédric O, in the European context, also talked about banning surveillance advertising for children, and about banning surveillance advertising that uses sensitive data for adults.
So, we are starting to see mainstream politicians in Europe and in the US talking about getting rid of certainly large chunks of the surveillance advertising model.
Georgie: Yeah. Do you feel like there is a way that this human right can be better protected through technology?
Susie: I think there’s no reason to assume that the tech giants that are invading our right to freedom of thought are impervious to the passage of time, or to the developing recognition that this is actually a massive human rights violation, that we don’t want it, and that we need a reset and a change of direction for technological innovation. So, I am optimistic. Absolutely.
Georgie: We just spoke with Krista Tippett, and she had a very similar view. Technology, it’s got its training wheels on. It’s so new and there are so many problems, but that doesn’t mean that they can’t be resolved.
Susie: Absolutely. But I think that the narrative that it’s just so huge that we can’t deal with it is quite a self-serving narrative. I mean, obviously, for people who don’t want to be regulated, it’s very easy to say it’s just too complicated. Don’t worry your pretty little head about it.
Georgie: Yeah. There are always reasons why things can’t be done. And then, final question: when you think about the role of technology in your life, how do you consciously think about technology? What is technology’s place in your life?
Susie: Well, one of the downsides of publishing a book is that suddenly you have to use social media to sell your book.
Georgie: It’s an irony we’ve come across with a number of our guests.
Susie: Absolutely. So, you have to use social media. My book is available on Amazon. So, technology and social media are a key part of my life, both personal and professional.
And the reason why I came across Freedom was me trying to navigate a path that allows me to engage with the outer world, to engage with my friends, to engage professionally, while also somehow finding ways to protect my own agency and autonomy.
So, tools that give me a wake-up call are, I think, very useful for me in understanding how technology is playing me, if you like. Privacy International had a Twitter bot that was really revelatory: you signed up for the privacy bot, and it would tell you each week what you had been revealing about yourself through your tweets.
Another thing that I used was Apply Magic Sauce, again an app which you could connect to your social media accounts. And one of the things I found interesting was that my Twitter data indicated I was a 30-something man.
So, I think using these tools, which are obviously themselves technological tools, to help you understand what you’re revealing is very useful; it gives you a repeated wake-up call about how technology is reading you and how you’re using it.
Georgie: And how they are sometimes very wrong.
Susie: Yes. Absolutely.
Georgie: Susie, you’ve been an absolutely fantastic guest for the Freedom Matters podcast. Thank you so much for joining us today. I’m really so grateful.
Susie: It’s a pleasure. Thanks so much for having me. It’s been a pleasure talking to you.
Georgie: Thank you for joining us on Freedom Matters. If you like what you hear, then subscribe on your favorite platform. And until next time, we wish you happy, healthy, and productive days.