Claudio Agosti is a fellow researcher at the University of Amsterdam. He has a hacker background and is a security professional. He researches, implements, and promotes technology in the public interest. In the last decade, he has addressed issues such as communication protection for whistleblowers, analysis of the web-tracking phenomenon, and algorithm analysis.
In 2019 he started to work with the ETUI Foresight Unit on research and training courses aimed at understanding the ‘technicalities’ behind Artificial Intelligence. Five years later, the ETUI is releasing a technical report which meticulously outlines the techniques used by researchers to observe the internal logic of the app used by riders in Italy and documents its actual behavior in terms of harvesting their personal data.
We sat down with Claudio to discuss his investigation, the consequences of the unchecked sharing economy, and the way forward.
Transcript of the episode:
00:00:23 Domen Savič / Citizen D
OK, so welcome everybody. It’s the 8th of November 2023, but you’re listening to this episode of Citizen D podcast on the 15th of December 2023. With us today is Claudio Agosti, algorithms explorer and digital rights evangelist at the AI forensics NGO and the topic of today’s discussion is worker rights in the digital economy. So first of all, hello.
00:00:52 Claudio Agosti
Hello and thank you people of the future.
00:00:56 Domen Savič / Citizen D
Let’s start with a quick recap of the report. It’s titled “Exercising workers’ rights in algorithmic management systems”. What does that mean? What was the topic of the report, what did you investigate, and what were some of the findings?
00:01:15 Claudio Agosti
Thank you. The report tells the story of an investigation that began in 2019. I had founded a project, Tracking Exposed, meant to do algorithm analysis; we were analyzing social media platforms, Amazon, and other web platforms. I got in touch with Aida of the European Trade Union Institute because I met her at Privacy Camp, the event organized every year in Brussels to bring together privacy folks and other people working on these rights. She was fascinated by our approach to analyzing algorithms, because it is black-box analysis.
We met in 2019, and I started to collaborate with the institute to teach trade unions a bit about why they should be skeptical of the apps that run on riders’ phones to organize their work, because those apps may violate certain privacy rights and also certain labour rights.
Initially it was just an insight, an intuition, and only in 2020, or around that time, did we start to investigate. We approached it in two ways. The first was a survey, a set of questions meant for riders, to understand whether they felt the technology organizing their work was discriminating against them. That’s because, if you want to bring a company to court, you need evidence that the company did something bad, evidence of the violations.
The other path was purely technical analysis: trying to reverse-engineer the app. But to reverse-engineer a rider app you need a rider’s login and password, because the app only starts to execute, and to do all the potential privacy leakage and surveillance of the worker, once you log in properly. We needed to find a rider willing to share a login and password, and that was particularly difficult. We also tried to sign up as riders ourselves, but we were not accepted; I don’t know if it was because of the place we were living.
After 18 months, and that is a huge amount of time, one of the major costs of this investigation, one that could have been reduced and will be reduced in the future, we found a person willing to share their login and password. So we set up a methodology that started with static analysis by Exodus Privacy, an online service I suggest you consult, because it shows, through static analysis, how many known trackers are present in any given mobile app.
Then we used a proxy, software that allows you to intercept the traffic the app sends to the outside, and then Frida, a system that allows you to run the application in a sort of special environment where the calls made to the operating system can be intercepted, recorded, or modified; in that way we could observe when the app was actually accessing the GPS or other peripherals. So with these three methods we started to observe how the app was behaving, and we started to realize what was happening.
The first finding was that the app was revealing the location of the worker even outside of the working shift. The second: by intercepting the traffic between the app and the platform’s infrastructure, you could see that some requests were made by the app to identify the rider using it, so you could see the profile and the information tied to this profile. And then there were other requests more focused on getting new orders.
Inside the rider’s profile, there was a score. And it was not the number you could have expected, in the sense that Glovo officially acknowledged the existence of an excellence score, and we realized this was a different score. So there was a hidden scoring mechanism, or at least one present in this communication; how it was used by the app or by the infrastructure, we don’t know. But that was evidence, and even if it’s not that surprising for a labour unionist, it’s very important to have this evidence, because labour unionists have long demanded that workers should not be subjected to customer votes, or that they can be subjected to votes, but the rating they get from customers should not impact their ability to work.
And that is part of labour rights. It is not acceptable that your work can stop because someone starts to rate you poorly, even if this seems to be a standard in the online market; it’s normal for me, when I buy something on eBay, to check the trustworthiness of the seller. But it is not OK that, if your life and your work depend on a system, someone can game this system, start to downvote you, and make you suffer a loss of business. And last but not least, we saw that there were also third parties, not declared in the contracts or in the privacy policy, that were getting all this information about the user profile, the location, and everything the rider was doing in the app: you clicked here, you moved this panel, this is where you were when it happened.
How much you were moving at that moment: all those kinds of detailed information were given to third parties, and that is another problem. But I don’t want to keep summarizing; there is the report, which is 60 pages long, and there is also a video of around 50 minutes with me and Joanna that talks about it, and you can find it on the website https://www.reversing.works, because Tracking Exposed, the project I mentioned before, closed this year and became two different projects. One is AI Forensics, the one you mentioned, which carries on the algorithm analysis of influential algorithms.
So we look at TikTok, Bing Chat, and large language models. That part of the algorithm analysis is carried on by AI Forensics; Reversing Works focuses more on the impact of algorithms and surveillance capitalism on worker rights. So this kind of effort is carried on by this second group, and the Reversing Works website also contains more references about this report.
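As an editorial aside, the traffic-analysis step described above, intercepting the app’s requests and checking what leaves the phone, to whom, and when, can be sketched in a few lines of Python. Everything below is invented for illustration: the request shape, the host names, the field names, and the shift window are hypothetical, not the actual traffic format documented in the report.

```python
from datetime import datetime

# Hosts the (hypothetical) privacy policy declares as recipients of data.
DECLARED_HOSTS = {"api.delivery-app.example"}

def flag_requests(captured, shift_start_hour=18, shift_end_hour=23):
    """Flag captured requests that send location outside the working shift,
    or that go to hosts not declared in the privacy policy."""
    findings = []
    for req in captured:
        ts = datetime.fromisoformat(req["time"])
        has_location = "lat" in req["body"] and "lng" in req["body"]
        off_shift = not (shift_start_hour <= ts.hour < shift_end_hour)
        if has_location and off_shift:
            findings.append(("location-off-shift", req["host"], req["time"]))
        if req["host"] not in DECLARED_HOSTS:
            findings.append(("undeclared-third-party", req["host"], req["time"]))
    return findings

# Mock capture, loosely shaped like what a man-in-the-middle proxy might log.
captured = [
    {"host": "api.delivery-app.example", "time": "2023-06-01T19:30:00",
     "body": {"lat": 45.46, "lng": 9.19, "order": 1287}},      # in shift, first party
    {"host": "api.delivery-app.example", "time": "2023-06-01T09:05:00",
     "body": {"lat": 45.47, "lng": 9.20}},                     # location sent off shift
    {"host": "analytics.tracker.example", "time": "2023-06-01T19:31:00",
     "body": {"screen": "orders", "lat": 45.46, "lng": 9.19}}, # undeclared third party
]

for kind, host, when in flag_requests(captured):
    print(kind, host, when)
```

In the real investigation the captured requests would come from a proxy capture rather than a hand-written list, but the two checks mirror the two findings the interview describes: location transmitted outside the shift, and recipients not named in the contract or privacy policy.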
00:10:08 Domen Savič / Citizen D
So in your investigation you looked at one app and one company, basically in one market, right? How fair would the assumption be that other providers of these types of services, and the same providers in different markets, are using the same tactics, let’s say social scoring, hidden grades, and so forth? Do you think this is limited to one market, or is it present everywhere?
00:10:43 Claudio Agosti
I believe there are insights that make us assume this is a frequent condition, because on one side analytics third parties sit in between, offering analytics services to these companies, showing how the app is actually used; but they are also tied to marketing campaigns, so they use the data collected this way to resell it for advertising or, I don’t know, customer segmentation or brand analysis…
This is still part of surveillance capitalism in the sense that for me, if you install a third party that monitors my behavior and uses this information to make a product out of it, that is the exact definition of surveillance capitalism.
And those companies, let’s say, use the offering of analytics or other analysis services as a reason to be on the device. That, I believe, is a phenomenon present in many apps. The Exodus Privacy website I mentioned allows you to inspect which trackers are present in an app, and in how many apps a given tracker is present, and that can give you an idea of how widespread this phenomenon is.
This phenomenon is widespread in the market, and the problem for me is when it also affects an app that you have to use to do your work, because then you are not a user but a worker, so worker rights apply to you and not just consumer protections. Because those apps were born under the surveillance-capitalist Internet, it is fair to assume this behavior is quite widespread.
Not all apps have the same number of trackers, but all of them, or let’s say all the ones I checked, have a heavy presence of Google services: some Google infrastructure runs together with the company infrastructure, and the Google infrastructure also receives a large amount of personal data. Now, we could talk about Google and the current EU-US privacy agreement, but that is a different topic.
Other kinds of behavior, like secret scoring or other kinds of discrimination, are all part of the business optimization that made this economy effective. Now is the time for workers, for labour unions, and also for regulators to get up to date on how this problem is affecting the population and perhaps make better regulation, because we cannot expect someone who builds a tool to optimize delivery, and uses this tool on different continents, to implement a different, more protective policy for riders’ rights in a specific country unless they are forced to. At the moment they are not yet forced. That’s why I assume what happens in one market also happens in all the markets where this app is present.
00:14:27 Domen Savič / Citizen D
Before we move on to the topic of workers’ rights, I just want to ask a follow-up question. This app is rider-specific, right? It’s not the same app consumers are using, I’m guessing?
00:14:45 Claudio Agosti
Correct, this is a different app, and it requires a login and password.
00:14:51 Domen Savič / Citizen D
So my question is: would you say the consumer app uses similar strategies and approaches to getting more information about its users than it says on the label, so to speak?
00:15:08 Claudio Agosti
I believe that in the consumer app this is a given, and it is also written in the fine print.
Because in Exodus Privacy you can see that the analytics and marketing third party is the same one present in the courier app. So it’s really one third party collecting data from couriers and from consumers. For the consumer, you accept terms of service when you create your account: let’s say they’re going to use your location because they must deliver food to you, so when you have the app open it’s normal that they know your location, and they’re going to study what you consume. It’s not surprising that this is happening; it is, let’s say, more justified as a way to give you a service. For the rider, though, having your location surveilled outside of the working shift is not justified.
00:16:11 Domen Savič / Citizen D
Mm-hmm… and what was the reaction of the unions? You’ve mentioned you’ve cooperated with workers’ unions on these issues. Are they worried about this? Are they paying attention? Is this a topic they are, let’s say, actively pursuing?
00:16:34 Claudio Agosti
Partially, in the sense that I tried to work with the union and explain that this was a potential problem, but they don’t have those kinds of technical skills. That’s why we started to do the analysis independently, to show there was evidence and potential. Then, because you also have to think about how your findings can have an impact, we went to the data protection authority.
In 2021 we started to collaborate with a lawyer, but this lawyer took two years to present a complaint, and the complaint was initially rejected by the Italian authority because the data were too old, even though we showed that the version of the app was changing nearly every week.
This summer, after this rejection, we made a test: instead of making a legal filing articulating the potential violations, we just submitted a technical analysis, and that allowed us to do the analysis and the report in one week. This has been accepted, so now there is an investigation open at the data protection authority in Italy. Why is this important? Because in 2021, unrelated to our investigation, the authority had already fined Glovo 2,000,000€. They did not have our insights, but they had other kinds of insights because they were running their own investigation. They proceeded ex officio, meaning the authority can investigate whatever it considers important, and because in 2019 the Italian government, the Labour Ministry, was making some high-level agreements with the rider companies, the authority started an investigation to verify how compliant they were. In 2021 that investigation was concluded, and it found Glovo at fault for many different reasons.
Interestingly, Glovo appealed, and in 2022 the judge told Glovo it didn’t have to pay the fine because the fine was too high. But the authority had not only issued the fine; it had also requested many remedies.
The real solution is that the app stops being abusive; that is the true remedy. And Glovo apparently ignored all this remediation, because we were doing our technical analysis during this time and we were seeing that the problems were still present.
Then, after we submitted our technical evidence this July, at the end of September the court ruled on the appeal and reversed it, saying: no, Glovo, you have to pay the fine. So now, two years later, Glovo has been told to pay the fine and to implement the remediation ordered two years ago, and there is an investigation in progress showing that they did not implement this remediation and that there were more problems than before. So we tried to use the GDPR to enforce labour rights, and even if for us privacy activists that seems a linear path, for a labour union it is not, because normally a labour union protects collective rights and defends the rider with a different strategy: they want the contract to be fair, and then the company can do the business it wants as long as the work is regulated fairly. What we want to argue is that if you want to talk about power dynamics in the modern work environment, you need to understand how the technology works, because it is in the technology that power is exerted. So it is not enough to protect the contractual obligations and agreements.
And so it’s a new thing for labour law and for labour unions that the GDPR, or privacy rights, can be used in this collective frame, and that’s why it was initially a bit complex to engage with them. At the moment we don’t have any active projects with unions yet. We are trying to show that they need to be trained on this, and that this problem is present in every European country. So now the goal at Reversing Works is the replication and scalability of this experiment.
00:21:34 Domen Savič / Citizen D
You’ve mentioned the GDPR. Do you think the current European legal frameworks in this area, the Digital Services Act, the Digital Markets Act, all that is coming up or has come up in the past few years, are effective legal tools to pursue these types of privacy abuses?
00:22:07 Claudio Agosti
I would rely only on the GDPR. The Digital Services Act, or the currently debated Platform Work Directive, may offer additional tools, but the Digital Services Act is not meant for those kinds of applications and services, and I have not yet made a full assessment of the new legal tools we have, so I can only speculate about how we might use them for worker rights.
At the moment the GDPR already gives us a lot of tools, and there are many parts that can be explored. One of the experiments we made was to use data subject access requests, because if you start to get the data held about you, you can start to understand what is stored and, based on what you receive, why some service is offered to you or not. Those forms of transparency should be further enforced by the Platform Work Directive, but this directive first needs to be completed in this legislative cycle.
And at the moment the trilogue is stuck on the contractual question: should the worker be a full-time employee with a contractual agreement that protects them, or not? That is where the focus of the trilogue is. There are some articles that talk about algorithmic transparency and the possibility of oversight, and those articles will need to be transposed into national law.
I believe there will be some potential for lobbying in that phase, because algorithmic transparency is something nobody has ever really seen. Look at what Instagram does, for example: it publishes a blog post saying, let’s say, “the algorithm works this way”. But for me, algorithmic transparency means that every time I get something assigned, I know based on what, and even if I get excluded from something, I need to know why.
When I’m doing some activity, I should be aware whether it has an impact on my ability to work in the near future. So transparency means full accountability, full control, and constant feedback between what the app records about you and how this will impact your life. This spectrum is not defined in the regulation and needs to become part of the demands at work. Perhaps labour unions should also promote these kinds of demands, because they can be part of collective bargaining. At the moment that is not the case; that is why the fight for algorithmic transparency, if not for more control, is still to be fought. And the current regulation already helps enough: even without new regulation, there are many paths the GDPR gives us that we have to explore.
And more than having privacy activists exploring these regulations alone, our strategy is to explore them together with the labour unions, with other representatives of gig economy workers, and with the workers themselves.
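The data subject access request experiment mentioned above can also be illustrated with a small sketch. Assuming, hypothetically, that an Article 15 export comes back as JSON, a first pass might count what is held per category and list any score-like fields, which is where an undisclosed ranking could surface. The export structure and every field name here are made up; real exports vary widely per company.

```python
import json

# A made-up fragment of what a GDPR Art. 15 export from a platform might
# contain. The categories and keys are hypothetical, for illustration only.
export = json.loads("""
{
  "profile": {"id": 4242, "city": "Milano", "vehicle": "bike"},
  "scores": {"excellence_score": 87, "internal_rank": 0.61},
  "locations": [
    {"time": "2023-06-01T09:05:00", "lat": 45.47, "lng": 9.20},
    {"time": "2023-06-01T19:30:00", "lat": 45.46, "lng": 9.19}
  ]
}
""")

def summarize(export):
    """Count the records held per category and list score-like fields."""
    summary = {}
    for category, value in export.items():
        summary[category] = len(value) if isinstance(value, (list, dict)) else 1
    score_fields = sorted(export.get("scores", {}))
    return summary, score_fields

summary, score_fields = summarize(export)
print(summary)       # how much data is held in each category
print(score_fields)  # candidate hidden-scoring fields to question
```

A worker, a union, or an advocate comparing such a summary against what the company publicly acknowledges is exactly the kind of low-effort first check that the interview argues should be replicated across apps and countries.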
00:25:53 Domen Savič / Citizen D
Hmm, I’m asking because the election year is coming up in the European Union, in the European Parliament, and I’m curious to hear your thoughts on, let’s say, the media representation of these types of issues. Is this a topic that is being discussed in media reporting? The reason I’m asking is that, at least in Slovenia, the political decision-making processes and media reporting go hand in hand, right? Politicians usually pick up the topics they want to address in, let’s say, the Parliament or in their work based on the amount of media reporting.
So if an issue isn’t reported enough, or isn’t reported regularly, the politicians say: OK, I’m not going to get any brownie points for addressing this issue, so they just don’t. And I’d like to hear your thoughts or your analysis of the situation in Italy in this field.
00:27:12 Claudio Agosti
Well, it’s quite weak, in the sense that we had some coverage when we released the report, and a section of the CGIL, which is one of the important labour unions, talked about it and organized a strike in Milan where the riders were asking for algorithmic transparency.
So that was the cyberpunk moment, when the actual bottom layer of the pyramid demanded algorithmic transparency. But beyond that event, this topic is not touched, not discussed, maybe because privacy, or data processing in general, is abstract and not that compelling. It is not seen as a problem, and the media did not catch up.
It’s different in Spain. In Spain there is El Diario, a media outlet that dedicates a lot of coverage to rider issues. But that’s also because Glovo is based there, and there are other companies that developed the gig economy based in Madrid or elsewhere in Spain.
And then there is the “Ley Rider”, the dedicated law that better protects labour rights in this economy. So let’s say Spain was a bit more advanced in the public debate, and that has also converted these demands into better regulation. So what you say is correct: perhaps if there were more coverage, there would also be more progress in labour rights. But at the moment in Italy we don’t have a government that cares much about this topic. Of other countries I’m not aware of particular movements.
00:29:16 Domen Savič / Citizen D
The reason I’m asking is exactly because we’re slowly wrapping up, and I try to wrap up these discussions by looking into the future, right? So what would be an effective way to address these issues going forward? Do we need more cooperation between privacy and digital activists and trade unions? Do we need better journalistic training in this regard, to hone journalistic sensibilities around this issue so reporters will be able to write about these problems and investigate them?
Do we need better regulators, so that activists and other players in this field from the non-governmental sector can basically stop worrying about the atomic bomb because it will be regulated well enough? Is it all of the above?
00:30:13 Claudio Agosti
Yeah, a bit of all of the above. I have to say, especially in the privacy activism sector, that although in theory these problems have been looked at, they were not considered different from any other privacy issue. Instead, they can give us more tools, because if surveillance, a privacy violation, sorry, runs up against a regulation, a law that protects riders, sorry, workers, and this law is stronger than others, then that is a legal tool you can use for broader privacy improvement.
So this path has not been explored much, and when I try to advocate for it or pitch it to some grantors, let’s say very few are keen on the idea. Grantors that come from the US, especially, never saw labour rights as something important, and most of their accountability effort is focused on disinformation or other forms of platform accountability.
Labour has not been a compelling keyword, even in our sector of activism; now I hope it will be. How can we help that? I believe by improving the amount of evidence. We made an analysis involving one rider, one app, one nation, but that can be easily replicated.
We understood what the slow points were, like finding a rider willing to give us a login and password, and the legal filing: instead of filing a legal analysis, we can just provide the technical assessment to the authority, and they have the legal experts who will make a case out of it.
So now that we have identified the two slow points, we can train other technologists to do this kind of reverse engineering, train unionists or rider advocates to do at least a superficial analysis and understand what is potentially problematic, and talk more to the workers. But it’s clear that a worker who already has the problem of working for an app is perhaps not that interested in a technical analysis of that app. It is a kind of reporting and analysis that requires a lot of technical knowledge, or a lot of abstract thinking about how this affects society, your ability, and your autonomy as a worker. In the end, algorithmic influence is not as compelling a problem as other, physical problems. It is a quite abstract problem, and we are still there. That’s why, for example, there is a documentary whose name I forget now, but I will offer you the link.
It’s a documentary produced by some colleagues at the Hermes Center. They are part of a European media initiative in Italy that investigates the gig economy, and they made a documentary that, for example, interviews a worker and asks her: so, what do you feel about the algorithm? And she honestly answers: what is the algorithm?
I mean, in the end they just get orders in an app, and explaining what is behind the scenes, which impact and which power it has, is not a given. That’s why I believe privacy advocates are the community that can most easily start to catch up and eventually put some thought and effort into this domain. But the large-scale communication of it has a lot of complexity.
00:34:33 Domen Savič / Citizen D
That’s true. And just one more question, since you brought it up several times during the discussion. We always try to question the idea of this empowered user, because this is basically what’s sold to us by the app providers, by the tech industry, which says: we provide, or they provide, all the information, and users must decide for themselves what they want and what they don’t want, right?
It’s also in the GDPR. The GDPR focuses on the personal capability of users to be responsible for their data, to worry about it, and to have tools that help them use the data, delete the data, remove the data, check the data, and so forth.
But on the other hand, as you mentioned, the users are not all super users, right? They’re not going to be very invested in this investigative work, reverse-engineering algorithms and that type of thing. So, do you think we need more emphasis on collective protection?
So, you’ve mentioned privacy activists as a possible solution; what about the establishment of a government body, like the Information Commissioner or the Data Protection Authority, focused particularly on the algorithms, on the decision-making systems running behind the curtains, to speed up the process of addressing these issues in real time?
00:36:22 Claudio Agosti
Are you talking about workers, or about every algorithm?
00:36:24 Domen Savič / Citizen D
Generally, but yeah, we can focus on workers, yes.
00:36:28 Claudio Agosti
Because for workers, in theory, the body that protects workers’ interests and rights should be the trade union, and the problem is that they are not updated on the new challenges, perhaps?
That shows, if you want, that the institutions established before the Internet arrived have not caught up. We know this because we saw how certain kinds of privacy regulation arrived 15 years after the beginning of surveillance capitalism. But some institutions are even tougher to change and update, and labour unions seem to be among the most traditional ones.
And I would hope that these algorithms, these logics, would be transparently communicated to the workers and to the union, so they can oversee them and agree on what is in everybody’s best interest. We are far from it, and regulation may go in this direction, but it mostly needs to be a cultural shift: you need to have workers asking for it, and you need to understand what is better for you.
Take this example: if I have just made a delivery and had to bike for 30 minutes, would it be fairer for me to stop for 5 minutes to catch my breath, or, because I’m on my working shift, should I optimize and do as many deliveries as I can?
So what is the logic? The algorithm’s implementation is also about how to better optimize the worker’s time and how to better offer the service. But what is better for the rider? Has the rider ever had the opportunity to ask for it? Does the rider have an interface that says: OK, today I’m working but I don’t feel that well, so please, a slower pace; or: today I’m working and I want to make as much money as possible, so please give me all the difficult deliveries? Those options have never been discussed, but if that happens, it will be a sign of a society that actually understands the new power dynamics, understands how they are implemented, and has all the actors speaking equally and voicing their demands.
We are far from it, but that is what I hope to see.
00:39:34 Domen Savič / Citizen D
Fingers crossed. Thank you, Claudio. I guess the final call to action would be to update the trade unions; that would be a good start. Thank you so much, Claudio, for sitting down with us, and best of luck with your future endeavors. This was a Citizen D podcast episode; we publish an episode every month, so we’ll see you next time.
Citizen D advice:
- Workers’ rights need an update for the information society
- Privacy is not a commodity
- The globality of surveillance capitalism needs to be tackled locally
More information:
- Exercising workers rights in algorithmic management systems – Lessons learned from the Glovo-Foodinho digital labour platform case – report
- AI talks @ ETUI: Algorithmic management – event
- AlgorithmWatch in Italy – reports
About the podcast:
Podcast Citizen D gives you a reason for being a productive citizen. Citizen D features talks by experts in different fields, focusing on pressing topics in the information society and media. We can do it. Full steam ahead!