107 Joe McNamee: Disinformation is not freedom of speech

We sat down with Joe McNamee, senior policy expert at the EU DisinfoLab, who previously led the European Digital Rights (EDRi) network.

The topic of the discussion? Everything from digital rights to fighting disinformation, the current EU political climate and more.

How does transparency come into play when addressing digital rights and countering disinformation, why does the current EU situation not bode well for guardrails in either field, and what can we do about it?

Transcript of the episode:


00:00:02 Domen Savič / Državljan D

Welcome, it’s the 11th of February 2025, but you’re listening to this episode of the Citizen D Podcast on the 15th of February of the same year. With us today is Joe McNamee, a senior policy expert at the EU DisinfoLab who has been working on topics related to Internet regulation for over 20 years.

From 2009 to 2018, he led European Digital Rights (EDRi), the association of digital civil rights organizations in Europe, working on major topics such as the adoption of the General Data Protection Regulation and the Copyright Directive. So welcome, Joe, welcome to the show.

00:00:43 Joe McNamee

Thank you.

00:00:45 Domen Savič / Državljan D

Let’s start at the beginning… there are tons of things happening in the disinfo area of EU policy and practice as well, but I want to start with your previous, or pre-previous, employer: the European Digital Rights association.

How did this field of digital rights change in the last, let’s say, 10 years, the so-called digital decade? What were some of the challenges that were, let’s say, present at the beginning, and what were some of the solutions we saw as the decade wrapped up?

00:01:30 Joe McNamee

Well, I’d say the problems are easier to define than the solutions, because the problems are inspired by things that seem to last forever, and solutions change as technology changes. So, I think over the last 10 years, as you mentioned, getting the General Data Protection Regulation across the finish line was already a success, and it being a reasonably solid instrument was also a success.

We had the adoption of the Digital Services Act and the Digital Markets Act on a European level, which served to strengthen the role of individuals in the digital society; we had things like the latest revision of the Audiovisual Media Services Directive, which is more important than people think in regulating what we see and hear in the audiovisual space, particularly with video streaming services, both those based on profiling like YouTube and paid-for ones like Netflix…

So, the one thing I think is a fundamental change is that I remember in 2014, 2015, there were a lot of objections to the notion that there were digital rights.

Whereas now I think the question is answered: the rights valued in the pre-Internet age have a different character and a different value in the digital age, in 2025, and we have to think about what it means to transfer those rights and to protect those rights in the digital environment compared with the more analogue life that we had prior to 2015.

00:03:50 Domen Savič / Državljan D

This was actually my follow-up… so how do you see, or how did you see, the change in, let’s say, civic organizations or the general public, the change of perception of this field?

So, I remember, back in, was it 2011, 2012, one of my first, let’s say, campaigns was against the ACTA Treaty, and back then it was really hard to sort of broaden this issue of treaties, of surveillance, of, you know, monitoring the different digital paths. It was very hard to translate this into a language that the general public would understand, right… as we progressed it seemed, in a way, that the public was getting more attuned to these issues, to these values. There’s a bigger connection between, as you said, human rights and, let’s say, digital rights.

But again, looking at the past 10 years, how did you see this area changing or adapting to the new reality?

00:05:04 Joe McNamee

I think it’s a question of people and society being better able to map what they consider to be unquestionable values of society like democracy, like privacy, like freedom of expression, mapping that onto the digital society.

And I think ACTA… it was perhaps one of the first examples of a certain part of society going “We can’t just leave this to arbitrariness…”, we have to have better control of our freedom of communication online, and since then we’ve had different challenges, different opportunities, like the update of the Data Protection Directive which became the GDPR, where there was a lot more understanding among policy makers and politicians than there would have been two or even three years before.

I remember when the GDPR was first proposed by the Commission, it was met with a degree of bafflement, a degree of misunderstanding and, to a certain extent, deliberate misunderstanding. So, I think it’s society becoming more in tune with old rights in a new digital form, I think that’s really the big change over the past 10 years.

00:06:46 Domen Savič / Državljan D

OK. And what were some of the drivers of this attunement?

00:06:54 Joe McNamee

I think, when it came to something like data protection, it became more obvious just how much data was being generated about each individual and what could be extrapolated by computers… if we remember, the foundational work for the GDPR was done in the early 1990s, when the number of computers that existed and their computing power were fairly minimal and we were talking about protecting personal data that was largely on sheets of paper. And I think, in the course of the decade between 2010 and 2020, it became more embedded in society that it’s a whole different order of magnitude: personal data being protected now with huge amounts of data being collected and data that can be compared with each other.

I wrote a paper for EDRi, back in the day, where I talked about data that had no parents: data that was generated by comparing two other pieces of data. If you’re this, and if you’re that, then chances are you will be… if you’re A and you’re B, chances are you’ll be C. The awareness of that change in what those that hold the data, and those that are allowed to process the data, can do with it was finally understood in that period. And I think even in 2014, which I think is when the initial conversation started, society wasn’t there yet, hadn’t quite understood, whereas now you know that your data can be used in this way, and I think that is a fundamental shift that allowed good things to happen, even if implementation of the GDPR is sometimes disappointing.
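
A minimal sketch of the “if you’re A and you’re B, chances are you’ll be C” pattern described above, using invented attribute names and an invented inference rule purely for illustration:

```python
# A toy illustration of "data with no parents": an attribute nobody ever
# supplied, derived purely by combining two observed attributes (A and B -> C).
# The attribute names and the inference rule below are hypothetical.

def infer_derived_attribute(profile: dict) -> dict:
    """Attach an inferred attribute the person never provided themselves."""
    a = profile.get("postcode_income_band")  # observed attribute A
    b = profile.get("late_night_browsing")   # observed attribute B

    # "If you're A and you're B, chances are you'll be C."
    if a == "low" and b == "high":
        profile["inferred_credit_risk"] = "elevated"      # derived attribute C
    else:
        profile["inferred_credit_risk"] = "unremarkable"
    return profile


print(infer_derived_attribute({"postcode_income_band": "low",
                               "late_night_browsing": "high"}))
# -> {'postcode_income_band': 'low', 'late_night_browsing': 'high',
#     'inferred_credit_risk': 'elevated'}
```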

00:09:33 Domen Savič / Državljan D

And how do you see the difference between… let’s say, for quite some time the good guys were the big intermediaries, right? Facebook and other US-based companies, and the bad actors were these shady companies that nobody really knew about.

As time moved forward, I felt like there was a change in the perception of these big intermediaries, which weren’t all good, and at the same time the issue of foreign states came into play, right? You had the US, let’s say China, Russia in the last couple of years… so do you think this change from a commercial to a geopolitical sense of digital rights helped the progression of this field?

00:10:49 Joe McNamee

I’m not sure there was ever a possibility to make a complete distinction between state and private, because we saw, most simply in the Data Retention Directive, that once data is stored and once data is processed, it will be kidnapped and repurposed for state reasons… So, I think one leads to the other rather than one being different from the other.

Of course, China went off in one direction, the US, with its big tech base, went off in another direction and is going off in a new direction now, but I wouldn’t see them necessarily as distinct phenomena.

00:11:53 Domen Savič / Državljan D

I mean, sure, they’re connected. But at the same time, I get the feeling that, when we were discussing, let’s say, the GDPR, the main actors that the data needed to be protected against were the big intermediaries.

Right now, you have… after, or maybe even during or just before, COVID, you had this shift to a more geopolitical sense, where companies were maybe replaced by country names or regime names in a way, right?

So, you were talking, or we were all talking, about, you know, EU digital sovereignty with regard to both China and the US, and you had this whole push towards a more EU-centric development of the private, but also the public, sector in this regard.

And on the other side you also had these quarrels between the West and the East, right, so you had the US countering the push of, let’s say, the Chinese, and then you have Russian companies in this regard.

So, I was trying to lead into the next topic on the agenda, which was: are our digital rights something that pertains mostly to the consumer, or are they something that pertains more to, let’s say, a citizen?

00:13:50 Joe McNamee

I think… if something is a value, then it’s a value regardless of time, to a point. So, the rights that we have are rights that we see as values that underpin our society, even if the challenges to those values change, whether it’s privacy or freedom of expression or whatever, the right to life… These are the things that need to be protected, so it’s easier to explain that in one year by explaining the threat from big tech, and in another year it’s easier to explain from a threat-from-government perspective, but the actual underlying value that our society holds dear is timeless, to a point.

For example, there was the profiling case of the Dutch social security office, I can’t remember what it’s called anymore, that did some illegal profiling of individuals in order to assess whether or not their applications were appropriate, and it became easier to explain to Dutch people the danger of state profiling, because there was an example fresh in their minds as regards what had to be avoided and what underlying values of society, as reflected in human rights law, are required to be protected. Whereas if you go to a Portuguese person or a Spanish person and try to explain the same thing, you can’t explain it in the same way, because that society’s experience is different.

So… I think it’s better to focus on the value that the right represents rather than on it necessarily being something that needs to be protected either from private business or from the state… it changes and it will continue to change.

00:16:38 Domen Savič / Državljan D

But do you think the approach to protecting these values is the same, or… are there some similarities between arguing for privacy against Google and big intermediaries and arguing against a democratic, or somewhat democratic, state?

00:17:04 Joe McNamee

I think… nothing is so good that it can’t be done badly, and try as we might, we always find ways of doing good things in bad ways. So, if you say a democratic state by definition doesn’t need the accountability of data rules, then that democracy will be eroded by the lack of accountability.

So, it’s a very dangerous and counterproductive thing to say we can trust these people but not those people, because that’s where the erosion of morals and values happens. No government structure, no private structure is good enough to withstand, without accountability, the temptation to take shortcuts.

00:18:18 Domen Savič / Državljan D

Looking at the EU level, but also: how do this regulation and these protective frameworks transfer from the Brussels level, the EU level, to the Member States? Do you see this as an effective system of translation and implementation, or is this something that is actually even more important than the regulation itself but is then so diluted or spread so far apart that it’s very hard to follow it through to the actual enactment of all these EU regulations in practice?

00:19:12 Joe McNamee

That’s a question that, if I could answer it effectively, I would be very rich. The GDPR relies on national data protection authorities and they vary quite a lot in quality… and this is traditionally the way the EU did business, and it traditionally works quite well if the issue is within the borders of a country.

So, if you take telecom regulation, for example… some countries implemented it well, some countries implemented it badly. The countries that implemented it well have fast and innovative companies with low prices; the countries that didn’t have higher prices, less competition and lower-quality telephone and Internet connections.

That’s not good, but ultimately it’s either a problem or a solution that is chosen by that country for that country, and so at a given moment, before the GDPR was thought of, it made a degree of sense.

So, when the GDPR came along, they followed the same approach as the 1995 Data Protection Directive and said, “OK, we’ll leave it to the Member States to implement”, for companies that are based in those countries.

But what happened was that that made the traditionally weak Irish data protection authority responsible for Google, Facebook and every other big tech company that wanted to establish a base in Europe, and an underfunded, unsupported authority became responsible for vastly more than it could ever have been able to manage, and that has led to significant problems for the protection of personal data for individuals in the EU and beyond.

And you saw that in the discussions around the Digital Services Act, for example, where one of the things that was repeated over and over again in the political discussions was that we can’t have what happened to the GDPR happen to the Digital Services Act, and there was a push from the center and the right of the European Parliament saying we can’t have a new agency, ideologically we just can’t defend a new agency.

So, the compromise that was reached was that the European Commission would be the regulator for big tech, taking that away from the risk that it might, allegedly, fall to Ireland again, and the national authorities would be responsible for everything else on the national level.

So, there’s been an evolution in the thinking of the EU, from a pre-networked age where everything could be left to national choices that would have national consequences, to a networked age where we have networked data and networked freedom of expression and so on, which requires a higher authority… which, for good or evil, is the Commission in this case.

00:23:34 Domen Savič / Državljan D

And, you know, looking at or comparing the field that you’re working in now, so disinformation, or tackling the spread of disinformation, and digital rights, your previous job… what are some of the similarities and what are some of the differences?

Because in a way I thought about this really weird connection where one could argue that on one side digital rights were more about freedom of expression and the uninhibited transfer of information across the Internet, while the disinformation field is slightly different, in that not everything that moves freely is actually good.

00:24:30 Joe McNamee

Well, that goes back to my frequently repeated mantra that everything can be done badly if you try hard enough, and defending digital rights on the basis that nothing has any limits is as wrong as saying that everything needs to be restricted by default. The truth lies somewhere in the middle.

If you leave it to big tech to control what you see and hear on the basis of reverse censorship, pushing the information that they see as profitable up and pushing everybody else’s freedom of expression down… that’s not defending freedom of expression either. So, a balance needs to be achieved where voices are heard without undue interference by the companies. And in the same way, there are fundamentally anti-democratic forces that are organized in spreading disinformation, which removes the freedom of expression and democratic rights of individuals and groups, and that needs to be addressed without necessarily preventing any human being from speaking.

I thought, before I started this job, that there might be a friction between my strong belief in freedom of expression, which I defended in EDRi, and my strong belief in democratic rights, which I defend in my current job, but actually there isn’t, because both can be defended at the same time.

If you look at the pre-censorship of algorithms that promote toxic content, or at foreign information networks, those are both contrary to democratic rights and contrary to the freedom of expression rights of individuals and groups.

So, I’m happy to say with complete certainty that the two go hand in hand, and they’re not contradictory to each other.

00:27:49 Domen Savič / Državljan D

Because I was wondering about that… let’s say these autocratic forces are usually using freedom of speech protections to sort of further the spread of disinformation, while you have the counter-argument that companies that are, you know, in charge of algorithmic content management are sort of, you know, left alone, or they’re usually, you know, defended by “Oh, no, no, these are private, you don’t need to or shouldn’t meddle in them”, right?

So how do we address, at least on paper or in practice, the issue of private companies that are now becoming more and more essential and more and more inescapable, if I can use this term, in the modern communication sphere?

00:28:59 Joe McNamee

I think we can solve multiple problems with one effort, and the effort is to be found in the shortest article in the Digital Services Act, and I think possibly the shortest article in the history of European legislation, which is Article 38, which says that there must be non-profiling-based options available on very large online platforms that are providing services.

If you give people the power to build their own feed of the information that they want to see and the people that they want to hear from, and to get recommendations from those people about other people they might want to hear from, you have an increase in media literacy, because people are able to understand what a menu is and what choosing based on your own tastes is.

And that means that different people would have different feeds and there wouldn’t be the single point of failure that we have at the moment, which is the monolithic algorithmic feed designed not for the interests of the user but for the commercial interests of the platform.

And I think if we could break that monopoly of the information feed in the platforms, we would have platforms serving society in a more transparent way, in a safer way, because it’s comparatively easy to work out the basics of how an algorithm works and exploit it; it’s quite another thing working out how to overcome an algorithm that is based on the tastes of many millions or a billion subscribers.

That’s a personal opinion rather than one of either my former employers or my current employer.
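
As a rough sketch of the contrast drawn here, assuming hypothetical data structures and field names: the first function below imitates a monolithic, engagement-ranked feed, while the second builds a non-profiling feed from nothing but the accounts the user has explicitly chosen, ordered by recency.

```python
# Hypothetical sketch: engagement-ranked feed vs. a non-profiling, user-curated
# feed (the kind of alternative Article 38 of the DSA points towards).
# All data structures and field names are invented for illustration.

from dataclasses import dataclass
from datetime import datetime


@dataclass
class Post:
    author: str
    text: str
    published: datetime
    predicted_engagement: float  # the platform's commercial ranking signal


def platform_ranked_feed(posts: list[Post]) -> list[Post]:
    """Monolithic algorithmic feed: ordered by whatever keeps people scrolling."""
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)


def user_curated_feed(posts: list[Post], followed: set[str]) -> list[Post]:
    """Non-profiling feed: only accounts the user chose, newest first."""
    chosen = [p for p in posts if p.author in followed]
    return sorted(chosen, key=lambda p: p.published, reverse=True)
```

The point of the contrast is that the second function takes the user’s explicit choices as its only input, so every user’s feed is different and there is no single ranking signal for a bad actor to reverse-engineer.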

00:31:55 Domen Savič / Državljan D

But is that, you know, feasible? Because then you’re usually left with the question of these platforms, which are running these algorithmic content decision-making solutions, while on the other hand you have the end users and state entities.

But how do you make the whole thing work? So, we met back in, was it 2006? And one of the first issues even back then was that these self-regulatory models do not work when push comes to shove, so we need a stronger role for the state in this field, right?

And as we moved on, we saw even in the digital decade that the states aren’t really pulling their weight compared to always reverting back to self-regulatory models and codes of conduct… there was a lot of the latter and not a lot of the former, right?

So is there… I know it’s a million-dollar question, but is there an effective way to change the practice in this regard… to make it more functional and effective, and actually do what it says it does on paper?

00:33:36 Joe McNamee

Well, I would first of all ask whether it’s possible to make it less functional than it is at the moment because the monolithic algorithms are causing harm and any improvement would likely be a change for the better.

I think one important thing to understand and accept is that self-regulation doesn’t always work, but self-regulation does sometimes work. The Council of Europe adopted guidelines on best practice for content moderation, for example, and that sets out some very basic principles for what rules need to be followed in order to have a self-regulatory approach that achieves public policy objectives for the good of society.

And we need to move beyond the embarrassing levels of naivety of governments when they approach this topic. If it is to the economic disadvantage of a company to self-regulate in the way that governments want, it will not do it well, it possibly will not do it at all, and having a press release and a photo opportunity saying something different is not going to work.

So, in environments where self-regulation is not going to work, then regulation has to work, and the Digital Services Act from the EU tries to strike a balance between flexible, rapid measures that can be taken in a self-regulatory environment, several articles on transparency to ensure the appropriate levels of evidence and data are produced to show when regulatory intervention is needed, necessary and proportionate, and rules that apply in all cases.

I think that the architecture, if not necessarily the outcome, is an admirable one, one that potentially could have been done better, but I think it’s a model that isn’t a failure by default.

00:36:37 Domen Savič / Državljan D

And, staying on the topic of EU regulation, how do you see the issue of these changing decision-making bodies on the EU level? We’ve just got a new European Commission; we’ve got a new European Parliament. They’re now deciding what they will be doing for the next five years… How do you see the next five years in the area of digital rights?

00:37:23 Joe McNamee

That’s a very big question… the only thing that we see clearly at the moment is the noise around enforcement of the Digital Services Act and the noise around the creation of the Subcommittee on the Democracy Shield; the majorities in the Parliament are quite difficult to build because of the election results, so whether anything is possible on a scale similar to the GDPR or the DSA seems quite limited, and the Commission is under tremendous political pressure not to do anything that would be perceived as undermining the financial security or the economic well-being of the Union.

So, in every direction that you look, there’s a constraint on the Commission as a whole, and the strange division of powers that the Commission President has imposed on the Commissioners is a restraint on all of them acting freely in the best interests of the people of the European Union, in my opinion.

And even if something strong on any subject were going to come out of the European Commission, there are forces in the European Parliament now, partly there because of external forces, that don’t want Europe to have effective regulation on anything, and therefore the scope for building a majority to pass legislation that the European Commission would already have difficulty proposing is limited as well.

There are significant barriers to there being any major, meaningful policies adopted in the next five years, in my personal opinion.

00:40:01 Domen Savič / Državljan D

So, will there be more bad policies or as you said before, it’s going to be a stalemate where nothing will move at all?

00:40:15 Joe McNamee

I think there will be an effort to deregulate, to move backwards in the name of bureaucratic simplification. And however far that goes, it sounds nice, it sounds sellable, “we’re simplifying bureaucracy, we’re getting rid of red tape,” but the red tape was designed for the achievement of a public policy objective that was shared by a majority in the Parliament and by everybody in the Commission.

So, it’s a seductive term whose least bad consequence is the removal of measures that were designed for the good of the people and businesses in Europe, and adopted with a majority in the three institutions.

So, I see more political potential to move backwards, being seduced by simplification and getting rid of “bad red tape”, than I see potential to move forward with good policy development and good legislation.

00:41:56 Domen Savič / Državljan D

And if, in the last part of the show, we move on to disinformation and compare all of the work that’s been done in the last digital decade on issues related to digital rights with the current situation in the field of disinformation…

Do you think the next five years of the new European Commission will provide some solutions, some regulatory frameworks, that will address the issue of disinformation, or is this, with the current powers at play, sort of a lost cause, an issue that, you know, nobody’s going to touch?

00:42:55 Joe McNamee

Ideologically, I don’t like the notion of condemning somebody for a failure that they haven’t failed at yet. I think in principle the Digital Services Act, the Digital Markets Act and the GDPR give us all of the tools that we need, within the EU context, to make major strides in the fight against organized efforts to undermine our ability to choose by undermining our ability to know what our choices are.

It is still a little bit too early to tell if the implementation phase will fail, and I hope not. But I think we have a duty not to condemn those who are trying their best to implement legislation that was adopted for the good of our society, because the best way of ensuring legislation will fail is by talking it to death before it has had a chance to succeed.

00:44:26 Domen Savič / Državljan D

And staying in the disinfo area, what would be some of the key issues or the most pressing problems that the EU is facing in the current situation?

00:44:48 Joe McNamee

This is again more of a personal opinion, but I think the economics of disinformation, of how disinformation is funded and shared, is central. If we start with the algorithmic choices made on social media that promote certain content and demote other content, that is a problem inherent to any system reliant on an algorithm which is not in the interest of the user and not in the interest of quality, but in the interest of driving traffic for the purpose of generating data.

And that’s why I always come back to the recommender systems… So, I think that’s the Achilles heel on which a lot is based. Then you have a quasi-duopoly in the online advertising space, on which online media and newspapers in their digital or hybrid format rely, and because that siphons off revenue from the newspapers to the online advertising industry, that needs to be addressed.

The fact that you generate data for your newspaper by generating clicks, which means that your headlines and your content have to be based on, or leaning towards, clickbait and lower quality rather than the type of high quality that a subscriber would rely on, is a problem in that it drives quality down and in that it drives revenue away.

And on top of that, you have the untransparent nature of the online advertising market, and there was a report only this week about child abuse material being rewarded with advertising from major brands and, in some cases, from government departments.

So, these are all interconnected problems, all generated by the attention economy, and I think this is the place to look if we want to solve many problems by addressing that one source of problems.

00:48:15 Domen Savič / Državljan D

And who do you see as the leading actors in this field? We have already talked about the civic sector, the decision makers, the companies in the digital rights field… is the situation similar in countering disinformation, or is there somebody who’s leading the pack a bit more compared to the field of digital rights?

00:48:46 Joe McNamee

So, from the digital rights side, different NGOs in the space have different expertise and different priorities… however, there is a lot of energy in the data protection and freedom of expression fields and there’s a lot of energy in the disinformation field, and I think we all have a shared interest in getting to the root of this whole range of problems: by requiring transparency, by requiring alternative recommender systems, by requiring an end to the fraud in the advertising sector, and by helping wean newspapers off low quality and back, as some are beginning to do, towards quality-driven, subscriber-based models.

So, I think there’s a big overlap; I wouldn’t like to pick out anybody as being the champion in this area, there are a lot of champions, thankfully.

00:50:04 Domen Savič / Državljan D

No, my question wasn’t meant to put you on the spot… I was just trying to hear your thoughts on whether there are differences between the ecosystems, right?

Digital rights, at least in the beginning, but also in the last few years, had very strong support from the digital industry sector, which was present in this field to promote its own goals, but also to sort of expand and normalize the discussion. In the area of disinformation, you kind of don’t have anybody who’s actively or openly for disinformation.

You have various, you know, interest groups that wouldn’t mind if disinformation was flowing freely, right? So, in addressing this issue, is the majority of the burden again on the NGOs, the fact-checkers, the journalistic organizations? Or is there somebody else, from the field of academia, from the field of state institutions, helping address this issue?

00:51:31 Joe McNamee

I don’t think I’m going to be able to provide a simple answer to that, because I think anybody that’s working on rigorous implementation of data protection law is working towards an environment where disinformation is harder to drive through the attention economy.

If there are NGOs working on implementation of the DSA, and for example on measures against systemic risks and measures to ensure adequate transparency, including transparency of algorithms, they’re not working directly on disinformation, but their success will be a success in the fight against disinformation.

There are a lot of people working on analyzing the strategies behind disinformation campaigns, and these are not… nobody has got the free speech right to build a billion bots to drown out everybody else’s voice; that is not a free speech right.

And there are a lot of people working on analysing the networks and the structures and the technologies behind disinformation that are also working more explicitly in the disinformation field.

I have a particular personal dislike for the word ecosystem as a metaphor, but it really is apt in this case: in the different battles we’re fighting in the different areas of digital rights, we are all playing our part to make the environment better for everybody.

00:53:43 Domen Savič / Državljan D

OK. And one final question. I know it’s only February, but the annual EU DisinfoLab conference is coming to Ljubljana this year in October. You’re still filtering through the program proposals, but let me ask you this… why did you choose Ljubljana for the next iteration? Was it because we at some point had a place in the disinformation universe?

00:54:23 Joe McNamee

Well, sadly, every country has got its place in the disinformation universe at the moment. It wasn’t me who made the choice, so I can’t be precise, but I think from my experience that area of Europe has got an unusually sharp and vibrant civil society, as well as experience of being at a sort of political crossroads.

Ljubljana has got a lot going for it as a location, it’s a beautiful, small, ancient city, the neighboring countries have a strong civil society, everything we heard from local government agencies was also very positive, and we’ve had very productive discussions on topics related to us with government officials and those implementing the DSA.

So, I can’t think of any argument that I heard against Ljubljana.

00:55:55 Domen Savič / Državljan D

OK, thank you very much, we’ll wrap it up at this point. Thank you, Joe, for dropping by and for sharing these thoughts, best of luck going forward, and if we don’t see each other before, we’ll definitely see each other in October.

00:56:14 Joe McNamee

Much looking forward to it, it will only be my second trip to Ljubljana.

Citizen D advice:

  • Demand political action on the funding of disinformation
  • Push back against reducing legal protections of digital rights
  • Join us in October 2025 at the annual EU DisinfoLab conference in Ljubljana!

More information:

  • Big tech funding disinformation – article
  • Countering disinformation – booklet
  • Digital Rights Worsened in Central, Southeast Europe in 2024 – report

About the podcast:

The Citizen D podcast gives you a reason to be a productive citizen. Citizen D features talks by experts from different fields, focusing on pressing topics in the information society and the media. We can do it. Full steam ahead!

