103 Meredith Whittaker: Technology acts politically and hides behind objectivity

With us today is Meredith Whittaker, president of the Signal Foundation who serves on its board of directors. She was formerly the Minderoo Research Professor at New York University (NYU), and the co-founder and faculty director of the AI Now Institute.

She also served as a senior advisor on AI to Chair Lina Khan at the Federal Trade Commission. Whittaker was employed at Google for 13 years, where she founded Google’s Open Research group and co-founded the M-Lab. In 2018, she was a core organizer of the Google Walkouts and resigned from the company in July 2019. She now runs Signal, the leading global privacy-oriented NGO.

Transcript of the episode:

00:00:06 Domen Savič / Citizen D
Welcome everybody, it’s the 10th of September 2024, but we are releasing this episode of Citizen D podcast on the 15th of October 2024. With us today is Meredith Whittaker, president of the Signal Foundation who serves on its board of directors. She was formerly the Minderoo Research Professor at New York University (NYU), and the co-founder and faculty director of the AI Now Institute. She also served as a senior advisor on AI to Chair Lina Khan at the Federal Trade Commission. Whittaker was employed at Google for 13 years, where she founded Google’s Open Research group and co-founded the M-Lab. In 2018, she was a core organizer of the Google Walkouts and resigned from the company in July 2019. Welcome to the show, Meredith!

00:00:55 Meredith Whittaker / Signal
Thank you so much for having me, it’s great to be here.

00:00:58 Domen Savič / Citizen D

Let’s start at the beginning. I’m curious to hear your thoughts on the consequences or the whole Google walkout situation. Back then, it seems to me that it was your personal convictions that started everything, and I want to hear your thoughts on the balance between personal involvement, personal responsibility and the push for systemic change in a particular area. Does one happen without the other?

00:01:32 Meredith Whittaker / Signal

That’s a big question and I’m not sure I have an easy one-size-fits-all answer. Ultimately, we’re all individuals and when we act, we need to act on our own volition. We need to recognize what we believe and take it seriously and be accountable to our analysis.

But I wouldn’t actually say that the walkout was the very beginning for me, the walkout was a culmination of a lot of work, a lot of thinking, a lot of conversations that I’d had over more than a decade. And the walkout also wasn’t just me. It was thousands and thousands of people. It was apparently the biggest labor action that has happened in tech, with 20,000 people leaving work in protest, you know, against the unethical business conduct at Google and against a culture that persistently valued some people more than others and developed products that often caused serious risk for those who were devalued, so to speak, due to that culture and those design decisions.

I think the walkout was one way in which, throughout my career, I have endeavored to be accountable to my analysis. I have endeavored to do what I can to change things when I saw them going in a bad direction. But I had worked for many years, in many different ways, from the inside, trying to influence and shape policy, and many of these things I still do… So again, I think the walkout wasn’t the beginning. It was one manifestation of a theory of change that looked to collective action from below to remedy some of the dangers and harms of the concentrated tech business model.

00:03:52 Domen Savič / Citizen D
I was wondering because it’s an example of this personal engagement that is usually pitted against “please don’t regulate us” and “the industry is capable of regulating itself”.
So, with these types of personal engagements and campaigns from people who are working within the industry… do you think that’s enough to resolve these issues revolving around privacy, security, surveillance and other related topics?

00:04:42 Meredith Whittaker / Signal
I don’t think there is one trick we’re going to find that solves all these issues at once. As for labor organizing: the structural leverage that employees and workers have relative to the institutions and people who employ them is a well-known lever. Before the 1980s, before the hegemonic ascent of neoliberal ideology, it was very well accepted, on the right and in the center.

Conservatives, liberals, leftists… recognized labor power as a simple structural check on toxic capitalism: workers having some say in what they work on and how. I don’t know that this is individual so much as going back to some of the basics and recognizing that we have an industry that is making decisions and putting revenue and growth above the common good in ways that could be really, really dangerous, given the power and information possessed by this industry.

00:06:11 Domen Savič / Citizen D
And speaking of power and controlling power, you’re now running Signal, which is, for privacy activists, journalists and many others across the globe, the app or the service to use if somebody wants to protect their privacy and have a decent level of security. I would like to know: how do you generate trust in the app, in the system, in the symbol that Signal currently represents in terms of privacy, security and other protections of human rights, especially in a time when distrust in institutions, and in each other, is growing?

00:07:15 Meredith Whittaker / Signal
Well, I think… look, trust is earned. It’s slow to earn, it is fast to erode, and it reflects a pattern of behavior and trustworthiness over time. So in fact, we don’t ask people to trust us; we endeavor to be as trustworthy as possible, to be as open as possible: to develop our code in the open, open source; to make our encryption protocols open; to allow people to vet and scrutinize the math and the implementation; and to ensure that, insofar as possible, we are never asking someone simply to blindly trust us because they like what I say or they like the way Signal looks or operates.

We are asking people to validate our claims, and we are making, insofar as possible, everything available for them to do that. I think that is why Signal is so trusted: because we are going above and beyond to be trustworthy in a way that most actors in the ecosystem can’t or are unwilling to, for a number of reasons.

00:08:36 Domen Savič / Citizen D
An easy follow-up question: how hard is it? I mean, how hard is it to go above and beyond and do all the things you just said Signal is doing in order to be trustworthy?

00:08:55 Meredith Whittaker / Signal
Well, it is difficult, and it’s difficult for a couple of reasons. First, the tech ecosystem, the tech industry as it exists now, as it’s been built since the 1990s and before, is structured around making money off data collection; it’s structured around monetizing surveillance. That’s the business model. So, you collect data, you use that data to target advertisements, or you use that data to train an AI model… The assumption that we would want to collect as much data as possible is built into all of the tools we use, into the narratives we think with, into the libraries we can access.

And so, it is difficult to do the opposite. We actually end up having to rewrite parts of the stack, so to speak, in order to enable privacy, in order to reject data collection as a norm. That is difficult because we are swimming upstream against a massive current in a trillion-dollar industry, where privacy has not been prioritized and trust around privacy has certainly not been part of the business model. Relatedly, it’s also difficult because there isn’t a business model for privacy at this point in the tech industry, and this is one of the huge harms we are grappling with.

The profit motive is oppositional to privacy; data collection is oppositional to privacy. So it’s difficult from that perspective in that we have to really think about our structure and protect ourselves from the imperatives of profit and growth, not necessarily because they’re bad in and of themselves, but because following those imperatives would, at this point, lead us down a path toward surveillance, toward data collection.

So, this is why Signal is structured as a non-profit. This is why we really go out of our way to take the incentives for surveillance off the table when it comes to Signal. We’re structured for success in the long term, so we can stay laser-focused on our mission.

00:11:29 Domen Savič / Citizen D
And speaking of the economy of surveillance capitalism… that’s one part of the equation, right? The industry is focusing on gathering information, repackaging it and selling it to the highest bidder. On the other hand, you have politicians who are worried about privacy eroding security.

So, is it hard for you, for Signal to argue for privacy when faced with a fake dilemma of choice between privacy and security?

00:12:17 Meredith Whittaker / Signal
I think that debate, sadly, has gone on since… we can date that debate to the mid-1970s and before, when public key cryptography was introduced as a way to secure a network.
And we’ve seen that debate resurface under multiple pretexts, from “the good guys need to undermine privacy so we can stop terrorism” to stopping child abuse to whatever else it is. But the motive is always the same.

And the motive is that there are some among governments and law enforcement who feel that the fundamental human right to private communication should not be available to people online, that there should be no communications network that is not tappable, that law enforcement or governments aren’t able to surveil.

And I think that is… It’s just simply incredibly dangerous, and it flies in the face of the long-standing expert consensus that knows there is no way to create a backdoor, create a way in that only “the good guys” can access, that anytime you create a flaw in these infrastructures, anyone with the tools and expertise to exploit that flaw will, and so you are corroding the very same cyber security measures, the very same private communications networks that your government also relies on, that your law enforcement also relies on and you are making those vulnerable to hackers, to hostile nations and to whoever else might want to infiltrate those.

So, it is a very pernicious line of argument, and I don’t think it’s always in good faith. I don’t think that we’re ever going to win this battle simply by being correct, simply by force of argument. We’ve been correct for multiple, multiple decades. The facts have not changed, but the will to create some magical formula that lets the government spy on everything does not seem to die.

00:14:35 Domen Savič / Citizen D
And currently it seems like the debate, or the situation, is getting worse on both sides of the Atlantic, right? You have several legislative proposals from the EU, from the US government, from Australia and other countries that would corrode privacy, enable backdoors for the “good guys” and so on. Why do you think the current situation is going from bad to worse?

00:15:09 Meredith Whittaker / Signal
Well, I don’t know. I don’t have insight into the rationale of each individual government, but if we look at the arc of recent history, we see that the situation was pretty bad in the 2000s. Large surveillance companies, the big tech companies we talk about now, had open relationships with governments and were handing over huge amounts of data. You know, that still happens, but in 2013 the Snowden revelations really shone a light on this imbrication between the large surveillance companies, the business model that we touched on, and the US and other governments.

And I think that provoked a kind of counter-reaction. You saw a number of the platform companies, you had iOS and Android adding encryption to their operating systems, you had a turn to privacy from an industry that wanted, in effect, to save its reputation, if we’re going to be cynical about it, and to distance itself from government spying by adding privacy features. And shortly after that, you see a showdown between the FBI and Apple in the US over the encryption on the iPhone.

And you begin to see an escalating campaign, as it were, to undermine the privacy guarantees that were put in place post-Snowden, most profound among these being Signal and the Signal protocol.
The Signal protocol pushed the state of the art forward significantly by making truly private communication possible on mobile apps. After it was released in 2013, WhatsApp integrated the Signal protocol, and it now protects the contents of your messages, not every piece of data, but the contents of your messages, using our technology. Since then, as private communications become more and more mainstream, as people begin to recognize, in the wake of data breach after data breach after data breach, that it really is important to prioritize privacy, there is a mounting anxiety that I see on the part of some in law enforcement and some in government.

Often these are people who are perhaps a bit parochial or confused, who want to undermine and walk back these changes. Now, there are many dynamics that I think have helped or hindered this, but I see this as one more salvo in an ongoing battle, and no sign that we are losing the war… In fact, in the last couple of years we, those of us in the privacy world who are pushing for these fundamental human rights, have had a number of wins. We have pushed back on a number of pieces of very bad legislation in the face of often incredibly emotional and compelling narratives that are difficult to fight against, particularly when someone is bringing a heartfelt story to the table and then you’re on the other side debating the nuances of cryptographic mathematics or something, right?
It’s a difficult battle, but it’s, you know, nonetheless incredibly worthwhile.

00:18:50 Domen Savič / Citizen D
And one of the issues of that battle is also the commodification of privacy. Now, as people become more aware of privacy-first or privacy-savvy technologies, privacy becomes a business offer, right? You trade off your privacy for usability, or you trade privacy for data, for product. Do you think that poses any significant danger moving forward? In terms of companies saying, “OK, you can have privacy, but you either have to pay for it or you do not get to use all of the services and features”?

00:19:37 Meredith Whittaker / Signal
I think this pattern requires a bit of nuance, a bit of understanding of the particulars of every case, but what I can say in the context of communication is that requiring people to pay for privacy simply doesn’t work. Communication constitutively requires more than one person: at the least, it’s me and the other people who are going to be involved in whatever message or call or what have you. If only one party pays, the other party is not benefiting from privacy either, and the example of e-mail is a good one.

There are great services for private e-mail like Proton, but Proton is also interoperable with other major e-mail services like Gmail or Outlook which means that if I e-mail somebody on Gmail from my Proton account, Google has all of that information shared on their servers, right?

Similarly, if I pay for privacy, I’m paying the subscription fee so that you don’t collect my data, say; but if I e-mail somebody, or communicate with them in some way, and they don’t pay for that, I would also not be kept private. So, you know, communication networks are really something that shows us just how interdependent we are when it comes to privacy.
A “pay to play” model breaks down pretty quickly when it comes to communication.

00:21:25 Domen Savič / Citizen D
Speaking of networks and big players in this field, we already touched upon Google and its monopoly in the digital communication sphere. It seems to me that on the one side you have an issue with privacy, personal data, all of these dangers that we’ve talked about, but on the other hand you can see Google, Apple, Amazon and other big intermediaries almost becoming essential services to netizens and to basically everybody around the globe. So how do you see, or what would be, in your words, the best way to dismantle this or to start doing things differently on a universal scale?

00:22:19 Meredith Whittaker / Signal
Yeah, well, there isn’t one answer to that question. In my view, in part because these services have infiltrated and become core infrastructure for so many heterogeneous functions of daily social and economic and even governmental life. Imagine trying not to use Google or Amazon at your job? Imagine having no social media presence and trying to apply for a loan or get government benefits, right?
When we begin to engage in those thought experiments, we begin to recognize just how deeply these services have infiltrated so many domains across the globe, and so I think to answer those questions, we need to get a little bit more granular.

We need to ask: what would it take to efficiently fulfill the role these core services now play? What type of technical infrastructure, not controlled by monopoly actors, would we need to be able to govern technology in a way that is more democratic? What type of governance structures do we need to put in place that allow participation at the level of design and features, in how a technology behaves, how it respects or fails to respect fundamental rights?

And those are questions that are ultimately very exciting, and I do think we are in a time when there’s no longer any debate over whether this business model is good or bad. You even have institutions like Y Combinator, which has done, you know, arguably as much as anyone to cement and promote the toxic Silicon Valley business model, now coming out and saying, “Hey, we’re actually not very into big tech. We’re looking at little tech now. We want to promote the small players!” And whether that’s good faith or not, I think it really shows us that there is a sea change in terms of sentiment and that there is an opportunity to think things through.

How would we dismantle and disarm the centralized power that is held by these platform companies? You know, what would independent cloud infrastructure look like? What would independent communication networks look like? What would interoperable protocols that enable more flexibility and independence at the application layer look like? And how do we find the capital to fund these things and maintain them over time?

And how do we put in place governance structures that don’t behave like the boardrooms of big tech, with their focus on profit, revenue and growth over everything else, but have more civic-minded duties and processes that work to aerate the tech ecosystem and make it more amenable to building technologies that actually serve beneficial futures?

00:25:46 Domen Savič / Citizen D
Who should be the driver of this debate, of this big rethink? It sometimes seems that this debate sort of formulates in one area, let’s say one part of the industrial ecosystem, one company, sometimes one person thinks about a concept and then others just parrot it. And when you listen to political debates, when you listen to media discussions, it seems that a lot of people aren’t thinking about this, they’re just copy pasting arguments and counter-arguments for a specific case.

00:26:33 Meredith Whittaker / Signal
There isn’t one actor who should be driving this, because what we’re talking about is a massive scope, a scope that affects people in heterogeneous domains across the globe. We are in a situation where what Facebook means in Myanmar will be very different from what Facebook means in Stockholm, for example; the use of TikTok in Nairobi may look very, very different from the use of TikTok in Connecticut.

So, I think in a sense that when I say we need to get down to a more local level, that’s not fetishizing the small and the local. That’s really recognizing the function of these platforms, the role they play vis-à-vis government services, vis-à-vis commerce and communications really does vary across contexts and that the people in those contexts are almost certainly best positioned to answer some of these questions.

We don’t want to repeat the mistakes of one-size-fits-all, billion-user platforms; we want to do those kinds of interventions right, because there is no one-size-fits-all.

00:28:09 Domen Savič / Citizen D
And do you see a danger of activism fatigue, or even personal fatigue, in this regard? Because these problems, these systemic pushes, are just so massive and are taking a toll on our health, right? Do you see it as an issue that the NGO or civic sphere is not replicating fast enough to counter the industry’s and politicians’ pressures in this area, to be an effective counterweight on privacy versus security and everything else we’ve talked about?

00:29:11 Meredith Whittaker / Signal
Yeah. Well, I certainly do see that. And I think we do need to look long and hard at the political economy of the NGO sphere and the civil society sphere, because I’ve been in this work for almost two decades now, and what I’ve really seen, sadly, is that it is often very susceptible to trends.

In the US, which I know is different from many other places, this work is generally funded through philanthropy, so you don’t have long-term sustainable funding. In most cases you are at the whim of whatever a foundation or your donor might think is important at that moment, and of course that is also susceptible to trends and to whim and to hype, which makes it very difficult to pursue a long-term strategy, particularly when you are rowing upstream against vested interests that, frankly, may have a lot more access to some of the leaders in philanthropy than some of the activists who are on the ground doing real work but not being seen and appreciated.

So again, I think the political economy of NGO work and civil society needs a lot more scrutiny. And I think we need to be a bit bolder in frankly demanding the kind of support and capital that we need to do this work. There are a lot of really good ideas out there, really good architectures, incredibly brilliant thinking around how we could build tech differently, how we could build more respectful tech, but an idea is not the solution.

An idea is a possible template, and what is not generally understood, or let’s say respected, is just how much work and how much money it costs to build reliable tech. It’s never just built once. It isn’t two guys in a garage who come up with something genius and the world changes. No, it’s two guys in a garage, a really good idea, and then billions of dollars of capital and hundreds of thousands of hours of labor that make that idea real, that maintain that idea in a volatile and dynamic environment, and that do so forever, or until that idea dies or that tech ceases to exist.

I think we also need to reframe our understanding of tech and recognize that we can’t have Sam Altman be the only one who’s talking big money, right? If we’re serious about this change, we need to be at the table, and we need to be demanding a cut of that.

00:32:19 Domen Savič / Citizen D
Part of that issue is also this illogical mantra about tech being neutral and non-political, that it sort of just flies out there in the world and smart guys and gals are just reaching for it. So, do you think that’s one of the stereotypes that prevents us from moving forward or thinking differently about the whole development cycle and the uses and misuses of information technology nowadays?

00:32:57 Meredith Whittaker / Signal
Oh, absolutely. And I think that mantra, that narrative that often couches highly political decisions in a veil of objectivity, is not new to computational tech. If we go a little far afield, there is an Enlightenment-era paradigm that saw certain forms of knowledge, and certain people, as neutral and objective, able to discern and represent the facts in a way that was simply stenography of the world. And this is a paradigm that has been very, very well studied by a number of scholars looking at the way this guise of neutrality was used to mask often very political and very brutal regimes and determinations.

We can look at things like race science, which couched structural inequality as neutral biological destiny: they were just observing differences and then determining what those differences meant, in ways that were pernicious and harmful for the world. I think we need to question narratives of neutrality to begin with, and then, particularly in tech, I think there has been a conflation of computational technology with scientific progress, which has been promoted by the tech industry.

The story goes that the reason we’re all suddenly using Google or hosting on Amazon is not because those companies were successful during the primitive accumulation stage of tech, but simply because what they discovered is significant scientific advance and they are introducing it to the world. As such, they bear no responsibility for it; what they are doing is neutral and inevitable, it cannot be changed, and if you were to question it, or if you were, say, to desire to regulate it in a way that wasn’t beneficial for those companies, you are anti-progress or anti-science, you’re putting your finger on the scales of human advancement. I think that narrative has done as much as anything to really chill our ability to grapple with and meaningfully regulate these technologies over the past number of decades.

00:35:49 Domen Savič / Citizen D
It is funny, because when you talk about using Signal or being a Signal supporter, it seems that, especially in the political arena, there’s a bit of hypocrisy going on. At least in Slovenia, but I bet all over the world, you have many politicians, MPs, ministries and so on using Signal for their own communication, but at the same time they don’t see a problem in undermining Signal as the solution to the very problem they’re trying to solve with it – usually the privacy of communication. So how do you see the current political arguments going on both sides of the Atlantic, with privacy on one side and everything else on the other?

00:36:50 Meredith Whittaker / Signal
Well, I mean, I don’t think the hypocritical desire to have it for me, but not for thee, is very new to politics, right? And that’s basically what I see happening here: “Yes, my privacy is important, but you must understand, as a responsible steward of our society, I need to invade your privacy.”

You know, these are people who imagine themselves as always in a position of power, and thus don’t generally question the type of levers of power that we are creating and the ways those could be misused if somebody with more pernicious intentions were occupying their seat. So, I think this is an age-old pathology, and it’s why we need to hold anyone who has a position of power to incredibly stringent standards and recognize that it’s really not personal, but that if you are going to take that kind of responsibility, you need to be held accountable, and the people who are worthy of that responsibility should be embracing that.

00:38:05 Domen Savič / Citizen D
And if they don’t?

00:38:10 Meredith Whittaker / Signal
Well, if they don’t… read history. There are many, many, many things that can be done if they don’t, but I don’t have a specific example. I just think that, you know, they’re clearly not suitable for the responsibility of the office they have endeavored to occupy.

00:38:34 Domen Savič / Citizen D
We’re slowly wrapping up, so for the last part I just have a few, let’s say, Signal-specific questions. First off, could you describe your design process when deciding on a new feature being implemented or not? You hear a lot of different ideas, that Signal should be offered as a native app on GrapheneOS, or that it should not use telephone numbers as IDs. How do these ideas get streamlined into the design process?

00:39:18 Meredith Whittaker / Signal
Well, we are very serious about reading and thinking through the feedback and the ideas that come from the massive community of people who use Signal. And we’re also serious about ruthlessly prioritizing, so that we can focus our small but mighty energies on the things that really matter.

So, there are many, many things, I would say 99% of the things that we would like to do, that are not things we choose to do, because we really value focus. We think long and hard about new features: we think about whether we can build a feature in a way that meets our very strict privacy bar, and we think about whether it is useful to people.
Are there features that are common in communication apps in one region or another, such that when people pick up Signal, the absence of that feature makes it feel incomplete or broken, or means people can’t use it?

For example, Signal introduced Stories a couple of years ago, similar to Instagram, but our Stories are actually private, right? While they’re not a hugely common feature in the US, they’re massive in South Asia and in Brazil, and we were hearing from people there that when they pick up Signal it feels broken, because it doesn’t have this feature that has been a core way of communicating among the people who use Signal there.

So, it’s a lot of conversations, a lot of collaboration with our Chief Product Officer Clancy Childs, who is very brilliant and very experienced, has been working on messaging for over a decade, and really has a lot of instincts there. And we try to do some market research.

We don’t collect user data; we don’t collect telemetry and analytics the way almost every other communication service does. So we often don’t have, or almost never have, the kind of signals that our competition does, but we do have other ways of gathering information and doing user research in the field that give us a sense of how people are using Signal and what they might enjoy, and we go from there.

00:42:08 Domen Savič / Citizen D
What are some typical technology limitations you have to work around or overcome when you’re deciding how to improve Signal, or how to sustain the current quality level of the service?

00:42:47 Meredith Whittaker / Signal

As a rule, doing things with strong encryption takes more resources; computationally it is slower. It often means that you can’t do the obvious thing, you have to do a workaround. One example we’ve talked about in the past is gif search. People want to be able to send reaction gifs on Signal. This is again one of those features that people feel is really missing if it’s not there.
The company Giphy was collecting all that data and was then acquired by Meta, so we weren’t going to simply shove in a Giphy library and call it a day. We had to do a lot of work, involving some pretty deep re-architecting of parts of our system, in order to make gif search available to people in a way that met our privacy bar.

Now the end result for a normal user is gif search, right? It looks exactly the same as if we just shoved it in there and said “Hey, we’re giving all the data to Meta”, but in fact we spent orders of magnitude more time, creativity and rigor doing that than the competition.

00:44:14 Domen Savič / Citizen D
You have repeatedly been quoted as saying that Signal will not be available in markets that pass privacy-invasive legislation or mandate backdoors for the good guys.
What do you think will happen by the end of the year, or in 2025, in this area? Are you seeing some movement, politicians sort of backing off of these terrible ideas, or do you see the opposite?

00:45:04 Meredith Whittaker / Signal
Well, I am not a wizard or a psychic, so I don’t have a clear prediction. I do think that more and more politicians are hearing the consensus from the expert community, the arguments that we and others have put forward. Now, I don’t think those arguments alone are going to stop the push from some who don’t really care about privacy; they just want the ability to surveil.
But what I can guarantee is that whatever happens, we will continue to fight, and our position is not going to change. We’re simply not going to comply with any regulatory or legal provision that would force us to implement a backdoor or otherwise undermine the privacy guarantees that people depend on, often in life-or-death situations.

00:46:02 Domen Savič / Citizen D
And just one final question. I’ve labeled this part “controversies” in order to encompass everything that has been floating around the web about the Signal app and service: rumors that Signal is supposed to be a CIA front, that it’s not secure, that it is being mismanaged, and so on. Is there a way for the company to present itself as trustworthy?
We already said a few words about that in the beginning, but I’m curious to hear your thoughts on users choosing between Signal and Telegram or other services while not having the technological experience to really go into the nitty-gritty of individual services and decide for themselves. The choice for privacy often seems almost like a fashion statement or a fashion choice, right? You either wear blue or red.

00:47:20 Meredith Whittaker / Signal
Yeah, but it’s sadly much, much more serious than that, and we do see misinformation campaigns very frequently, which we believe are in some cases attributable to nation states who would really prefer that people be using a very insecure communications app like Telegram instead of Signal.

How do we put the disinformation to rest, and how do we push back against it? Well, we really do go above and beyond. We are open source. You can validate our claims if you have the skills: you can go into our repos, and there are people who watch every commit we make and analyze it on Reddit, looking for new features, discussing what we’re doing, submitting bug reports. So we are one of the most scrutinized apps, and ours is one of the most scrutinized and audited cryptographic protocols in the world.

How does that translate to a popular message? Well, we are out there talking about Signal’s security as much as we can. When these disinformation campaigns appear, saying that Signal is a CIA asset or whatever the nonsense is, we spend a lot of time pushing back, even though we know it is fully fabricated and, I would say, almost lazy. We also recognize the stakes are really high: there are people who don’t have the expertise to validate these claims themselves and can get really worried by them.

So, we also ask the overzealous community of people who may be amplifying those claims, or making them in service of getting attention or going viral, to please find something else to do, because digital security is life and death for a number of people who use Signal in authoritarian contexts, and these kinds of rumors can have real, harmful impacts on people, even if they are completely baseless.

00:49:54 Domen Savič / Citizen D
Well, I think that’s all for us today. Thank you so much, Meredith, for taking the time, and thank you for all the answers and all the expertise.
Best of luck going forward. Fingers crossed we at some point reach a situation where Signal won’t be necessary anymore, because everybody will understand the importance of privacy, surveillance capitalism will die a horrible death with few or no victims, and life will be great again.

00:50:32 Meredith Whittaker / Signal
Well, from your lips to God’s ears. Thank you, Domen, thank you for having me on the show. It’s been great.

Citizen D advice:

  • Use Signal
  • Engage in a political debate around privacy
  • Rethink.

More information:

  • Time100 AI: Meredith Whittaker – article
  • Signal president Meredith Whittaker criticizes EU attempts to tackle child abuse material – article
  • ‘You cannot do mass surveillance privately, full stop’: Signal boss hits out at government encryption-busting moves – article

About the podcast:

Podcast Citizen D gives you a reason for being a productive citizen. Citizen D features talks by experts in different fields focusing on the pressing topics in the field of information society and media. We can do it. Full steam ahead!

