104 Mackenzie Funk and the birth of surveillance capitalism

We sat down with award-winning investigative journalist Mackenzie Funk, whose book “The Hank Show: How a House-Painting, Drug-Running DEA Informant Built the Machine That Rules Our Lives” tells the story of the origins of the surveillance capitalism we know and hate today.

We talk about the book and the man behind the story, but we also touch on his legacy, the surveillance capitalism that stems from the data economy and big-data intermediaries, and the ways we have to address this on a local as well as a systemic level.

The conversation also covers critiques of predictive policing, the almost unchecked business of data analytics, and the way forward.

Transcript of the episode:

00:00:06 Domen Savič / Citizen D

Welcome everybody. It’s the 23rd of October 2023, but you’re listening to this episode of Citizen D podcast on the 15th of November same year.

With us today is Mackenzie Funk, an award-winning investigative journalist whose second book, and it’s a mouthful, “The Hank Show: How a House-Painting, Drug-Running DEA Informant Built the Machine That Rules Our Lives”, tells a story about, well, I’m just going to say it: it basically describes the birth of surveillance capitalism.

Hello, Mack. Welcome to the show. Would you say the description is accurate?

00:00:44 Mackenzie Funk

Yes, you’re right. That is a mouthful. I never had an easy way to explain what the book was about, except that it was… Yes, it was about a person we’ve all never heard of, who started so much of this world we now live in.

00:01:01 Domen Savič / Citizen D

And before we start with the book… You’re an award-winning investigative journalist, and you focused mostly on environmental stories, global warming and things like that. What made you pick the Hank Asher story?

00:01:21 Mackenzie Funk

I think, as crazy as it sounds, that surveillance and climate change are very similar in some ways, especially to an audience that doesn’t know very much about either one.

They can seem very dense and very boring at first, and so you need to find a way to tell a story that is not just about the facts of it. In the case of climate change, it was just about “here’s the science.” And so, for many years before, I was working on my climate change book.

You would have these arguments in the United States between the people who believed in the science and the people who did not, and it was not a very good way to win the argument, just to be right, just to have the scientists have the right facts.

And so, I was trying to find a way with the climate change book to tell a story that would show the stakes, show why people should care.

When it came to surveillance, that was my same technique: I focused on a person, because I figured even if you’re not interested in privacy or surveillance, you might be interested in this person. But the second thing is that they’re both big systems, and the reason some people find them boring or hard to understand is because they’re so complex.

But there are these big systems that seem to fall the heaviest on the people who are poorest among us and the most vulnerable. It’s becoming more obvious with both climate change and privacy that the poorest people in the world, the poorest people in each of our countries, those are the people who are bearing the brunt of this and that there can even be winners in these new economies from climate change and from surveillance capitalism.

And those winners are not the poorest, they’re the usual winners in our societies, and I find both of these, climate change and our lack of privacy, to be accelerating some of the worst inequities in society.

00:03:25 Domen Savič / Citizen D

Before we jump into that, I saved that part for the last part of the show. But first let’s talk about the book. So how did you find this guy, Hank Asher? I’ll let you tell the story, or a brief recap, but this was…

Reading your book was the first time I heard about him, and I’ve been working in the field of, you know, digital privacy and digital activism for the last fifteen to twenty years.

00:03:56 Mackenzie Funk

Yeah, I would not say that I was by any means a privacy expert when I started this, but I certainly paid attention. From reporting abroad, especially in places like China and Russia, I was very careful about… I tried to understand where my information was going and who would see it, and even after the Snowden revelations in the United States, I became a little careful about what my own government was looking at.

And I had never heard of Hank Asher. His name first came up when… it’s a complex story, but a magazine approached me. An editor I knew, and he said, I have a story for you.

And the story was about this group of people who were trying to stop child predators, as they called them, and they were using this software built by a person named Hank Asher, who they described on their website as the father of data fusion. So, I was looking at this group, and I saw that note on their website. And I said: who, what?

And that is what began my journey. I googled the name, and I saw that he had been a cocaine smuggler, that he made a fortune multiple times and then lost it in Florida in the ’80s, and I saw that he had this crazy story, that he was involved in the security build-up after 9/11 in the United States.

That his technology undergirded some of the biggest surveillance companies in the world, and certainly in this country, and that he was just this character nobody had ever heard of. And I found that amazing.

But what really did it for me was when I stumbled upon his obituary page because he had died by the time I heard his name and the things that people wrote about this man.

Just on his online obituary website that his company set up for him, and then another one that the funeral home put up, are just not the kinds of things you see written about most dead people, you know.

You read obituaries, you read the comments people make, and they say nice things about people. But the things they said about Asher were like: he changed my life. He paid for my kids’ college. He fixed my broken teeth. Or: he yelled at me, he swore at me, he was the craziest person I ever met. He changed this country forever. He changed my life forever. He just seemed to have this outsized impact not only on privacy, but also on the people around him.

00:06:39 Domen Savič / Citizen D

The book is full of these anecdotes or happenings in his life. I mean, the title tells a pretty good story. So, you have a guy who’s been drug running for the DEA, or who was an informant for the DEA, but at the same time, he was saving lives, helping people, you know, chase down child molesters and such.

But what was, in your view, the thing that surprised you the most when you were doing the research for the book?

00:07:16 Mackenzie Funk

Of course, everything about it. I got perhaps a little too obsessed with understanding what exactly he was doing in the 1980s, when in the United States the center of all drug smuggling was Florida, because of its proximity to Latin America, to Colombia, to Jamaica, to the Caribbean. I became obsessed with that. But that was interesting, not surprising.

The biggest surprise was how these open-records laws in Florida began as a good thing. In this country, we wanted transparency in the states, especially after some scandals, Watergate and others. They decided to make sure that the public could know what the government was doing in their name, and so many states opened up their records, opened up their books, so that citizens could see what the government was doing. And people like Hank Asher, in states like Florida, which had very open public records, exploited this.

Using this state law, they were able to go in and get all of the driver’s license records and all of the vehicle registration records, marriage records, birth records, divorce records, all the housing records. Everything, everything you can imagine that the local government would use as a citizen moves through their life.

This was public record, and because the governments wanted citizens to know what they were doing, they opened it up to a new species of person: these data aggregators.

And with the technology, by the time Asher came along in the late ’80s and 1990s, you were suddenly able to scoop all this up and make sense of it in a way that I don’t think you could have when these laws were written.

So that was the big surprise: this very progressive policy became something very different.

Domen Savič / Citizen D

And if we compare Hank Asher’s period, where he was buying up or gathering all of these data points and putting them into a searchable database, would you say there are similarities if we draw a comparison with Mark Zuckerberg or Elon Musk or any of the other big digital intermediaries’ bosses…

00:09:54 Mackenzie Funk

Yeah, there are similarities and an important difference. The big trick in Hank Asher’s era, we’re talking the 1990s and early 2000s, is that they were going out to all these different databases and trying to make sense of the little tidbits of information they got from each one; they were trying to connect all of them to a single person.

And to do that, they ended up assigning a tracking number to each American citizen and each American resident, and also many people across the world, so that each of us has our own bar code or something like this.

And then they had very tricky rules, rules that became algorithms and that they have now perfected through machine learning, rules that assign a new data point that comes in to an individual with some degree of faith that it is going to the right person. I am not the only Mackenzie Funk in the United States.

It turns out there are many of us and they tend to know when I’ve moved or when I’ve bought something. And they do so because they’re able to see, of course, my address, but also the people I’ve lived with and my age, my gender, these kinds of details. So, they can with some faith put this new data point with me.

Someone with a name like John Smith, two very common names in this country, is much more difficult, and yet they would have ways to make sense of this.
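The matching logic Funk describes, attaching an incoming data point to the right one of several people with the same name, can be sketched in a few lines. The fields, weights, and threshold below are invented for illustration; they are not Asher’s actual rules.

```python
# Toy record linkage: several people share the name "John Smith", and an
# incoming record must be attached to the right one with "some degree of
# faith". Weights and threshold are illustrative assumptions only.

def match_score(candidate: dict, incoming: dict) -> int:
    """Score how well an incoming record fits a known person."""
    score = 0
    if candidate["address"] == incoming.get("address"):
        score += 3                      # a shared address is strong evidence
    if candidate["birth_year"] == incoming.get("birth_year"):
        score += 2
    # overlap with known past cohabitants (roommates, spouses)
    if set(candidate["cohabitants"]) & set(incoming.get("cohabitants", [])):
        score += 2
    return score

def link_record(people: list, incoming: dict, threshold: int = 3):
    """Attach the record to the best-scoring person, or to no one."""
    best = max(people, key=lambda p: match_score(p, incoming))
    return best["id"] if match_score(best, incoming) >= threshold else None

people = [
    {"id": "JS-001", "name": "John Smith", "address": "12 Oak St",
     "birth_year": 1970, "cohabitants": ["Mary Smith"]},
    {"id": "JS-002", "name": "John Smith", "address": "99 Pine Ave",
     "birth_year": 1988, "cohabitants": ["Ann Lee"]},
]

new_point = {"name": "John Smith", "address": "12 Oak St",
             "birth_year": 1970, "cohabitants": ["Mary Smith"]}
print(link_record(people, new_point))   # JS-001
```

The threshold is what gives the system its “degree of faith”: a record that matches no one well enough is simply left unattached rather than misassigned.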

Contrast that with what Zuckerberg built, where you have to use your real name on Facebook, and the moment you do, you become your own aggregator. Zuckerberg did not need a fancy algorithm or a fancy set of rules to attach each new data point to an individual, because we did that for him; we did it by logging in under our real names.

We did it by putting Facebook on our computers and on our phones, and so it almost turns the problem on its head from the perspective of the aggregator. I think that’s the biggest difference between today and Asher’s day: back then, the problem was that information was coming in from all sides and you needed to find a way to really attach it to a person.

In Zuckerberg’s era, it was: how do we make people build their own databases and then give us access to them? And I think right now, in a maybe post-Facebook era, we’re a little bit in a hybrid space between the two. The same problem Asher solved is now being solved again from the data broker perspective; they’re trying to do it again.

00:12:51 Domen Savič / Citizen D

And just to continue your train of thought… What are some of the lessons we can learn from his story regarding the data economy, these data points, data aggregation, and the use of data in, I don’t know, policing and surveillance?

00:13:19 Mackenzie Funk

Yeah. Well, one thing is that Asher, I think, had high-minded goals for what he was doing and why he was doing it. Yes, he wanted to get rich, but like many people in Silicon Valley, say, 15 years ago, he kind of believed he was the good guy, and that he was aggregating all this information to do good things, and that included, especially in his case, using this information to go after child predators, to go after child molesters, and to help the police do that.

Then after the attacks of 9/11, he used it to go after supposed terrorists, he thought it would help the police solve crimes. He thought it would eventually, you know, help companies determine lower rates for people who had good driving records, for instance, or otherwise look good. And it was hard for Hank Asher to see the downside of what he built.

But I think he was an early stand-in for people like Zuckerberg, who may have claimed high-minded principles at the beginning, you know, making the world more connected or whatever the slogan was in the early days of Facebook. And yet all this stuff has such a dark side that if you build this weapon, eventually you’re not going to be the one in control of where it’s aimed, or it will take on a life of its own.

And with Asher, you could see the dark side of this pretty quickly. His technology was used, not by him, but by the people who took control of his first company, to change the results of one of the most important elections in this country by kicking black Americans off the voter rolls in Florida. They did this in the name of keeping fraud out of elections, but the end result was, it appears, that George Bush beat out Al Gore in the 2000 election in Florida and therefore won the entire country.

You can draw a line from what Asher built to that moment, not something that he would have wanted. He actually was in favor of the Democrats, in favor of Al Gore, but that’s how his technology was used. I don’t think he would like how it was used much later in Florida, again disenfranchising and taking away the vote of black Americans.

I don’t know that he would have liked how it misidentified various Muslims in this country as somehow being terrorists. But it did. I don’t think he would like how people were wrongly accused because a police officer who was badly trained in using this information would pick the name of somebody from his database and say, “Oh, this has got to be our suspect.” But that’s happened.

So, I think that’s the parable you can draw from what he built: this stuff has such power, and eventually its creator won’t be in control of where that power goes.

00:16:44 Domen Savič / Citizen D

Do you think… what you just said… does this lend itself to the theory that technology is neutral and that it matters who’s running the show, if I can borrow a phrase from the title of the book? Or would you say that technology has some inherent biases and inherent problems built in, and the driver behind the steering wheel isn’t the only, let’s say, good-or-bad switch?

00:17:20 Mackenzie Funk

That’s a good question. I think the reality is that it is both things. I think the driver matters quite a bit, but the technology can also have something built in that makes it inherently worse or better, and I don’t think it has to be one or the other.

And that again is the lesson I would see here. If you think about what Hank Asher built and why he built it, in some sense it’s a suspicion machine. Keep in mind that this is a former cocaine smuggler who suddenly wants to work with the police; he knows that he has something hidden in his own background, he thinks like a criminal.

He knows how the Drug Enforcement Agency, the DEA, has gone after other drug smugglers, and he’s seen how they put information together and draw connections, and so he kind of built a machine that would show hidden connections between people, or between people and what they were trying to hide, for instance their assets.

And a lot of the machine was built on his own psychology, and because of that, it’s something that surfaces things most Americans would think would be hidden, like if you’ve lived with the same person three times in a row. They know that because his machines have your entire address history and they can see, “Oh, these people were roommates.”

What if, in a time when this country was much more anti-gay, that was your husband? If you’re a man and that’s your husband, but you were keeping your relationship a secret… Well, that’s the kind of hidden information that his machines would surface.

And they basically remember every little thing that you might want to hide, or think would be hidden. And that’s based in part on who the maker was and how he thought about what would be useful to investigators and what criminals or other people might be hiding… So yeah, there’s some inherent danger to the technology, but how it gets used very much depends on who’s in control of it.

00:19:44 Domen Savič / Citizen D

And if I can just backtrack a little bit, you mentioned that after Watergate, the government wanted to be transparent and open. Nowadays, you have digital activists, you have human rights activists arguing for more transparency in how the government works, how the police work, in these supposedly closed systems.

But on the other side, transparency on its own… I sometimes get the feeling it’s not doing much good, in a way that, OK, sure, you open up all the databases and people are drowning in these data points. And every once in a while, you have this Hank Asher type of person who, you know, makes sense of it all, and a little bit of money on the side.

So how would you argue, or what would be the correct way to argue, for transparency, but at the same time not argue for: OK, you just open up the floodgates, let it all out, and, you know, we’ll figure it out as we go…

00:21:08 Mackenzie Funk

I am no expert in the data privacy laws of the United States versus Europe, but I do know that you have a much better understanding that data collected should be used for the purposes it was collected for, and not for other things. One of the big things that went wrong in the United States, and I think continues to go wrong around the world, is that this idea of transparency also means that companies can purchase or otherwise acquire data and use it for things it was never meant for, say, advertising.

It’s something that a local government has collected, but because there are no controls on how it is used, two or three purchasers down the line it becomes transformed into something very different.

So I think the answer to much of the transparency and privacy problem, which is real, is some sort of control: is this used for what the person who gave up their information to their local government intended? If I give my information to the Department of Motor Vehicles in my state here, I don’t expect that they are going to then sell that information on to LexisNexis, the data broker.

I don’t expect that LexisNexis will then send that information on to immigration authorities, and that the immigration authorities could then come to my home if, say, I’m not a citizen and I don’t have the right papers to be here. Those are not things I expect when I go to get a driver’s license. And I think if something like that is built into the laws, it can go a long way to helping… I tend to agree that transparency isn’t enough. I tend to think that governments are collecting too much information and keeping it far longer than they need to.

Because if there are transparency laws and the information happens to still be in the database, well then, they might have to give it to whoever asks for it. If they don’t collect it in the first place, or if they don’t retain it for a long time, that’s often safer.

The tension is always going to be there, but I think a lot of it is about the gut check: is this what I thought was going to happen to my information?

00:23:47 Domen Savič / Citizen D

And do you see things changing now? Maybe after the pandemic, or even during the pandemic, it seems that the data industry, all of these big digital intermediaries, finally lost their, I should say, user-friendly halo, right?

And people are sort of pushing, in the United States and also in Europe and in other places across the globe, for a different approach to these issues of privacy, of, you know, companies knowing almost everything or everything about a person.

So do you see that happening as we move away from, let’s say, the COVID pandemic and go forward?

00:24:36 Mackenzie Funk

Yes, I do see it happening. And again, I know the situation in the United States better than the one in Europe, but in the United States, the big change came after the first election of Donald Trump.

After the 2016 election, Facebook’s role in that election, the Cambridge Analytica scandal, those things seemed to very much change the public perception of social media in particular.

And then you see more controls on our phones. Both Android and Apple have done a better job of not letting other companies track us all over the web all the time. Not that they’re perfect, but we have a lot more control than we did eight years ago.

And I think that is a good thing. That said, I think we’re coming back, especially in the United States, from 20 years of almost no controls on privacy, and so even if there are some incremental gains in the digital space…

A lot of this information is already out there, and a lot of the information in the companies Hank Asher built, and those are big parts of LexisNexis and Thomson Reuters and TransUnion, those databases are still there. They’ve got 20, going on 25, years of information about each of us, and they never got rid of it. They didn’t purge it, they kept it. And so even if governments are collecting less, even if our phones are giving away a little less information about us, the length of the history that some of these data brokers have on us, and that still includes Facebook, really matters, because they can get a sense of where your life is going and where you’ve been. That’s pretty important information when they’re trying to help other companies make decisions about your life.

When I looked at healthcare in the United States, doctors had begun to look not just at your clinical conditions, but at the conditions of your life in general. Where do you live? Do you have access to a vehicle to get to your doctor’s appointments? Or a bus or a train?

Do you drive a certain kind of car? Because you can judge someone based on their car; you can kind of guess what their general health is. Where do you get your information? Because if you’re getting all your news from Facebook versus a newspaper, you might be lower-information, and you might not follow the directions of your healthcare provider as much.

That kind of information, or even things like where you lived when you were growing up and whether that area was worse in terms of air pollution… That kind of information going toward healthcare decisions now… You’re not going to easily get away from that just because your phone isn’t telling Apple or Google exactly where you are right now.

00:27:59 Domen Savič / Citizen D

And just to follow up… Do you think all of these data points about a person, the information packets you just described, are actually used by the people, companies, and industries collecting them? Or do you feel that this data economy is basically, I’m not going to say a mirage, but something that looks nice, looks fancy, looks, I don’t know, important, but at the same time you just get these data dumps that nobody uses, and then, you know, when somebody hacks a big pharmaceutical or health-related database, then it becomes a problem, right?

00:28:47 Mackenzie Funk

Yeah, I think they are using it. In fact, I know they are using it. Most of the world’s biggest banks and most of the biggest companies use Hank Asher’s products. Most of the law enforcement agencies in this country use these products. And imagine, it’s not just the little bits of data, right? It’s all tied to a person’s identity, all the information you might want to know about them.

For figuring out how much to charge them for insurance or, like I said, for healthcare decisions. Or if you’re a police officer trying to track someone down: where do they live? Where have they lived? Who are their friends and who are their relatives? If you’re trying to find them, where would they go?

That kind of information is in there if you are an immigration officer trying to track people down, as could happen and has happened in this country. It’s the same kinds of questions, and also: when did they arrive? When does their name first appear in this database? Does this show that they’re a recent arrival in this country?

And I found evidence that it’s being used all the time. I don’t know how banks work in Europe, but when I log in from a new location, say I have a new computer or a new phone and I want to access my bank information, sometimes it will send me a bunch of questions: which of these streets have you lived on in your past? Which of these people have you lived with? Little details about my biography that only I would know.

They ask about things that stretch back 15 to 20 years into my history, and that’s a Hank Asher product built right in, used for something that I think is pretty benign: making sure that you are really you. But it shows the reach that these products have, everywhere in the background of our lives in this country. And nobody really has any idea…
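The bank login quiz Funk describes is a form of knowledge-based authentication: questions generated from a person’s address history, with decoys mixed in. The data, names, and question format below are invented for illustration, not any bank’s actual system.

```python
# Toy knowledge-based authentication built from a (hypothetical) address
# history database: pick a true answer from the user's history and mix in
# decoy streets the user never lived on.
import random

history = {"mfunk": {"streets": ["Oak St", "Pine Ave"],
                     "cohabitants": ["J. Funk"]}}
decoy_streets = ["Elm Rd", "Maple Dr", "Birch Ln"]

def street_question(user, rng=random.Random(0)):
    """Build one multiple-choice question: which street has the user lived on?"""
    answer = rng.choice(history[user]["streets"])      # the true answer
    options = [answer] + rng.sample(decoy_streets, 2)  # plus two decoys
    rng.shuffle(options)
    return "Which of these streets have you lived on?", options, answer

question, options, answer = street_question("mfunk")
print(question, options)
```

Only someone who actually lived that history (or a data broker holding it) can pick the right option, which is exactly why the same database serves both verification and surveillance.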

00:31:10 Domen Savič / Citizen D

So, the case you just described with the banks, I think, perfectly describes the tension between privacy and security, right? You have these data products that are a boon for security, but at the same time they are, almost literally, killing our privacy. Right?

So, do you see this conflict resolving in the future, in any shape or form? Or do you think we’re just going to go back and forth between, you know, “this is too much security, we want more privacy” and “this is too much privacy, we want more security”?

00:31:54 Mackenzie Funk

Yeah, I don’t see it resolving, nor do I see the sister of that one resolving, which is the privacy-and-convenience question. We see that one all the more with the new AI products that, say, Microsoft or Apple are coming out with. How much of your information do you want to give to these companies, especially if they’re uploading it to the cloud to make your life more convenient, to complete your emails, to calculate when you have to leave so you can get to your next meeting, or to reschedule a meeting?

These AI products can already do this if you let them have access to your calendar or all of your emails, and I find that scary and very interesting, because I think a lot of people will choose convenience over privacy here, and the privacy controls will always come late. And as for security and privacy, it goes back to your earlier question: are they actually using these products? Do they actually work?

I think we’re finding that, at least in this country, as much as they were used for, say, counterterrorism or predictive policing, all these things where the idea was that if you get enough information about everyone in a country or in a city, and if you really build these algorithms to help predict who’s who among the general population, you can stop crime before it occurs, or stop a terrorist attack before it occurs…

So far, that hasn’t worked out as well as it was supposed to. Yes, police officers are using Hank Asher’s products to, say, figure out someone’s address history or who their relatives are, but they’re not using them in sort of an AI-powered, let’s-just-give-everybody-a-score way. I mean, they are using it, but when they’ve used it, it hasn’t worked very well.

So, a lot of them have dropped it. A lot of the predictive policing projects have been dropped in major U.S. cities because they didn’t work as well as they thought they would. It just turned out to be racism at scale.

It just turned out to wrongly tar huge swaths of a city with the idea that they might have a higher risk of being criminals, which didn’t always match up with reality and was certainly unfair to those people who were caught up in the algorithm.

So, I do think that’s shown, especially with law enforcement and counterterrorism, that it’s not as useful as they thought.

00:34:49 Domen Savič / Citizen D

I was asking this because this podcast is part of the NGO with the same name, and we focus on privacy, security and, among other things, surveillance capitalism. We’ve been doing this investigation into the CCTV cameras set up by municipalities across the country, and we realized that it’s basically… it’s a promise of security.

They never deliver, because of the whole pipeline that follows detection. We have somewhere around 500 CCTV cameras across Ljubljana, and at the same time, they’re not doing much in terms of crime prevention, but they’re doing a lot in terms of, you know, privacy invasion. And if you look across the world, you see that some technologies work in particular cases, but they sure don’t work in other cases, right?

So the debate around security and privacy becomes very targeted, I should say. You have to look at individual cases, individual technologies and so forth. So, would you say data policing, or policing with the use of data, is something similar, or would you say the success rate is higher?

00:36:18 Mackenzie Funk

Hmm… we have a similar debate with traffic cameras and surveillance cameras here. I would say that the use of data in policing has been useful, and that what Asher brought and what his companies bring to local police departments has accelerated the kinds of investigations they do.

For instance, I was talking to these detectives who worked in Florida in the 1990s or the 1980s about how they would go about their casework. They would go and track down each of these individual pieces of information from different police departments across the state, and they would not necessarily have access to, say, the stuff you could get from credit cards, which always have a good address history. Or they could get it, but they would have to get a court order, and the court order would take weeks. They could get someone’s phone records to know who they had been calling, but that again would take weeks or months. So just to get enough information to try to track down a suspect in a crime would take them a very long time.

And all of them found these products that Hank Asher built to be transformative, helping them do the work they were doing much faster.

And as much as I was skeptical of where the technology has gone, I wanted to be fair and recognize that it really did help some of these police do their jobs, and in a way that didn’t necessarily hurt privacy much more than before. It was just happening faster. But things changed.

When Asher’s company and others began to make this more algorithmic and to start assigning scores to someone’s criminality or their likelihood of being a terrorist, and I think where the a lot of the failure has been, is that when taking humans out of the loop or in saying that we need all of this so that we can solve these crimes, I don’t think you need all of this. And you certainly don’t need the scoring technology to help you surface people you think might be more likely to be criminals than others.

It's very different to try to predict crime than it is to track down suspects in a crime that has already occurred. And I think that once they started to try to predict the future using this data, that's where many of the biggest harms have happened, and that's when it moves away from these narrow uses. I can see that it might be more fair to give this information to someone investigating a murder, for instance, than to someone investigating a minor crime.

One thing we have in the United States, beyond the surveillance cameras around cities, are the license plate readers on all of the police cars and on many city vehicles all over the country. Many of those were installed by a private company, and that private company has taken all these scanned plates and is able to make predictions about where someone will be.

For instance, if you always see someone's car parked in front of a certain address at a certain time of day, you can predict that they'll be there again pretty soon. And some of that just seems like too much. Who actually needs that?

You know, who needs that is real estate companies or banks, trying to use the data for things very different from policing. And these same companies were selling to the police while they were also selling to business.

And I think the selling to business doesn't need to happen, and the retention of all these scans over the course of many years does not need to happen.

Other people using it were immigration authorities, trying to track down people in this country who might not have come legally. My opinion on that is that in the United States there are so many people who came to the country without the right papers, and only so many the authorities are going to want to arrest.

And it seems like, if you have to prioritize, it's the worst idea to go after the people who have been here so long that they might have children who are citizens of this country, who have real lives, and who are living like any citizen.

And yet these systems surface the people who are living the most openly, who are not trying to hide, right? The ones who engage in banking, who properly registered their car and got a driver's license and got insurance, who bought a home.

All that information goes into these systems. And so it turns out that the people who are not trying to hide, who are living non-criminal, very average lives, are the ones who are easiest to find in these systems. And if they're used for purposes like immigration enforcement, that just seems like a total perversion of what they're for.

00:42:08 Domen Savič / Citizen D

I have two more questions before we wrap up. So, the first one is basically a continuation of this debate.

So why do you think these technologies, these data silos that can basically tell us where a person is going to be at a particular moment, why do you think their usage exploded in literally every corner of our society? If you take Hank Asher, for example, in the beginning he was basically helping the police catch pretty basic criminals, right? And then the usage extended to anti-terrorism, and then to hunting child molesters.

So why do you think people are so excited about copy-pasting the same basic solution to fields that may not have much in common with the original field this technology was used in?

00:43:20 Mackenzie Funk

That's a good question. I don't know the full answer, but I can say that the original product Hank Asher had, the one that many of these things grew out of, was not for policing but for insurance: private insurance companies trying to get a sense of who am I selling this policy to, and how can I reduce my risk?

And the idea of reducing risk is basically to look at the past to try to understand the future: to do a calculation about what someone has done before in order to predict what they'll do next. And whoever does that best makes the most money; an insurance company will charge more to those clients who are riskier and much less to clients who are not really going to have any problems.

So basically they're getting money for nothing, and that's the whole idea. It seems like with Hank Asher, they took this idea of risk management from insurance and applied it across all of society.

This idea that you can just look at someone's past to predict their future, or more broadly look at people like this person and what they've done in the past, is a very seductive one. Because if you can predict the future, you can make money off it, or you can protect against whatever danger is coming.

We all love to know what's coming around the corner. Human literature is full of stories about time travel, about the person who could see the future, the seer; whoever knows what's coming is almost the most powerful person in the world. And that's what these systems promise to police, to counterterrorism investigators, and to everyone else: a chance to predict the future.

And maybe they get it right for the most part, but the problem is, this isn't advertising. A lot of these systems are not used just to send somebody an advertisement for a tennis shoe or a handbag they might want. These are used to make life-and-death decisions that really affect people's lives: who gets insurance, who gets a job, who gets arrested. And getting it right most of the time, I don't think, should be good enough.

I think it's very different if you send the wrong ad to somebody: no big deal, you've wasted a little bit of money. But if you arrest somebody because you think they're going to do something wrong, that's a very big deal. And so we can't have the logic of insurance, or the logic of advertising, which is similar in that it's about your risk pool: if you get it right 70% of the time, that's great; if you get 50%, that's great.

We can’t have that logic be the same logic for the rest of our lives.

00:46:36 Domen Savič / Citizen D

And since you mentioned healthcare… OK, two more questions. Since you mentioned healthcare, why do you think this data surveillance, this data economy, isn't regulated as well as the health field?

So, you've just mentioned that 70% isn't good enough… imagine if you had those odds going to a doctor, and the doctor would say, well, I can heal you, but I can also kill you, so let's see what happens next.

And the second part of this question: why do you think the regulatory frameworks still rely so heavily on the end user? You've mentioned Apple and Android building all of these protections into our devices, our online services and so on; why do you think there wasn't a bigger, let's say, systemic push, I'm not going to say for corporate protections of our privacy, but for state protections of our privacy?

So why are we, at the end of the day, David versus Goliath?

00:47:54 Mackenzie Funk

Good questions. The first one, healthcare: it's very easy to understand that these are life-and-death decisions. It's very easy to understand that what happens with the doctor or at the hospital is going to change everything for a person.

I think it was harder for people to understand and imagine that what happened with privacy would have such impact, and this goes back to your earlier question about how you go from climate change to privacy.

Early on in climate change, everyone was talking about polar bears, right? They knew, they cared, but they couldn't imagine that climate change would really affect their lives.

They couldn't imagine the wildfire smoke, or the glaciers disappearing, or anything really changing for them. So sure, they cared about it, but not in a way that was visceral and close to them, not like a healthcare decision.

And I think with privacy, a lot of people are still stuck in the idea that it's about advertising, about these ads following you around as you surf the Internet, and that yes, it's a little bit creepy, but what does it really have to do with my life?

How does it really change my life? The public has been slow to wake up to the reality that it has everything to do with your life, that more and more the information about you is out there being used to make decisions about your life and what opportunities you'll be given.

And once that understanding is there, and it's becoming more widespread, maybe data will be regulated a little bit more like healthcare, because we'll see that it's the same level of decision.

As to the second part of the question, why is it always the end user, why is it not systemic? I think it's related. Especially in the United States, people think, well, it's your choice to use Facebook, it's your choice to give up this information online, and if you really care, you can do something about it. The companies that still dominate the privacy landscape, which are American companies, are built with this American individualist ethos.

We are a country where we pretend that the individual has lots of power, that we are supposedly self-reliant, that the government shouldn't step in and regulate, and that if you really care, you'll take action on your own. And of course, that's nonsense. That's nonsense with climate change, and it's nonsense with privacy. The idea that people have any way to meaningfully counteract the most powerful forces in our lives without banding together, without asking government to do what government is supposed to do, which is step in when it's a collective problem, is nonsense.

But it still permeates the debate, and that's the biggest issue: we're stuck with this individualist sort of American ethos.

And another point on that: the idea that there is any meaningful choice anymore, that you can somehow avoid the digital economy, that you don't have to give up your information if you don't want to, that you don't have to use the tools these tech companies built, that's not… I mean, for young people, for anybody who wants to participate in society as it is now, you have to be using these tools, and therefore, for the most part, you have to be giving up information.

And that's not really a meaningful choice, when a gun is at your head.

00:51:54 Domen Savič / Citizen D

We're trying, though we're not always successful, to wrap up on a positive note. So, I'm not going to ask you for an uplifting message for the end, and I'm not even going to ask you how the current US election is going and what you think will happen after November in this field… but is there something you've noticed regarding the data economy and these protections that has, let's say, been improving in the last couple of years? Is there a light at the end of the tunnel?

00:52:34 Mackenzie Funk

Yes, as we've talked about, I do think the public is much more skeptical of Silicon Valley and of these technologies, in the United States in particular, than it was five or eight years ago.

And I think we are slowly waking up to the fact that this has life-and-death consequences for people, that privacy is not just an ad that follows you around the Internet. And that knowledge, even if it comes late, even if it comes a little slow, will eventually translate into change, I think.

I think it must. We're not stuck just doing whatever these companies have us do, because there is still voting in all of our countries, there are still elections, and if voters care about these issues, and they care more and more, I think there can be change, even in little things.

I saw this with Hank Asher's technology, for instance. The utility companies, we're talking Internet companies, electricity companies, water companies, for many years were all getting together, and they created a database of all their different customers, their addresses and their names. They did it in the name of making sure customers were creditworthy, so that you couldn't just switch from one electricity provider to another.

If you were someone who didn't pay your bills, they would know about it. But then they started taking this information and selling it to the data brokers. You would never imagine that you move into a house, decide to get electricity, and because of that some corporation knows where you are, or that they sell that to the police.

And that was one of those things where, once it was uncovered in the United States a few years ago, there was outrage, and they stopped selling it to the data brokers; that went away. It's little things like that; there is change.

In this country, and I know also in the EU, there are lots of small fights against overreach by the surveillance companies. Not all of them are victorious, but there are more and more, and I think that is a sign of some hope.

00:55:10 Domen Savič / Citizen D

Excellent, that's a perfect way to end this debate about a really horrifying situation everywhere. The book is called The Hank Show; thank you, Mackenzie, for dropping by and sharing your thoughts. It's really nice to see these ideas about privacy and the ineffectiveness of surveillance capitalism echoing around the world.

Thank you so much, dear listener. This has been the Citizen D podcast; we publish an episode every month, so feel free to subscribe, and we'll talk to each other next month. Thanks again, Mackenzie.

Citizen D advice:

  • Demand new regulation of the data economy
  • Pay attention to local implementations of surveillance capitalism
  • Focus on re-contextualising the debate between privacy and security

More information:

  • LAPD ends another data-driven crime program touted to target violent offenders – article
  • Editorial: The problem with LAPD’s predictive policing – article
  • Chicago PD automated policing program: Heatlisted – article
  • New Orleans ends its Palantir predictive policing program – article
  • Health Insurers Are Vacuuming Up Details About You — And It Could Raise Your Rates – article
  • Automatic License Plate Readers: Legal Status and Policy Recommendations for Law Enforcement Use – article
  • How ICE Picks Its Targets in the Surveillance Age – article
  • Hank show excerpt – article