ECREA Workshop 2025: Why Platform Audits Need to Get Technical (And Why We Need to Work Together)

The Digital Services Act promised transparency and accountability. Platforms would undergo independent audits, vetted researchers would gain access to data, and we’d finally be able to verify whether these systems actually comply with the rules meant to protect users. But between the promise on paper and what’s actually happening, there’s a gap – one that became very clear at the ECREA Communication Law and Policy Workshop I attended in Brussels this September.

I was really looking forward to being back in Brussels after such a long time. This city is like a breath of fresh air for me – multicultural, diverse, very young, and it genuinely feels like a policy hub where things can actually be shaped. The VUB campus brought back memories of my undergrad years in Amsterdam, which felt fitting for the conversations ahead about making the DSA work in practice, not just in theory.

As part of Panel 2 on DSA assessments and compliance, I presented our work on algorithmic auditing. We looked at actual audit reports that Very Large Online Platforms have been producing – reports that are supposed to show how they're complying with DSA requirements around the profiling of minors, recommender system transparency, and targeted advertising. What we found was concerning: significant inconsistencies in methodologies and a real lack of technical depth when it comes to evaluating AI-powered systems. Our proposal is to use algorithmic auditing – essentially simulating user behavior, observing how algorithms respond, and analyzing those responses to get empirical evidence rather than just taking platforms at their word.
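
To make the idea concrete, here is a minimal sketch of what such an audit loop can look like, written in Python. Everything platform-facing in it is a hypothetical stand-in: the `client` object and its `fetch_feed` and `engage` methods are placeholders for whatever session automation a given platform requires, not our actual tooling or any real API.

```python
import random
import time
from collections import Counter
from dataclasses import dataclass, field


@dataclass
class Persona:
    """A simulated user (a 'sock puppet') with a fixed interest profile."""
    name: str
    interests: set[str]
    observed: Counter = field(default_factory=Counter)


def run_audit(client, persona: Persona, rounds: int = 50) -> Counter:
    """Simulate behavior, observe what the algorithm serves, log exposure.

    `client` is a hypothetical wrapper around an authenticated platform
    session (e.g. browser automation); `fetch_feed` and `engage` are
    placeholders, not a real platform API.
    """
    for _ in range(rounds):
        feed = client.fetch_feed(persona.name)
        for item in feed:
            persona.observed[item.topic] += 1
            # Send engagement signals only for the persona's declared
            # interests, so later differences in the feed can be traced
            # back to the simulated behavior.
            if item.topic in persona.interests:
                client.engage(persona.name, item.id)
        # Randomized, human-like pacing between sessions.
        time.sleep(random.uniform(2.0, 8.0))
    return persona.observed
```

Comparing the `observed` counters across personas – say, a profile registered as a minor versus an adult one – is what turns simulated behavior into empirical evidence about profiling and recommender outputs.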

I was a bit nervous about the technical aspects of my presentation, honestly. This was predominantly a social science audience focused on media freedom and pluralism, so I tried to emphasize the policy analysis of audit reports. But something interesting happened. The audience didn’t just engage with the policy side – they were genuinely curious about the technical realization. Multiple people, from heads of research institutes to fellow researchers, came up afterward asking detailed questions: How do sock-puppeting audits actually work? How do you avoid getting banned by platforms? What does the technical infrastructure look like?

I found myself consulting with Ivan, our project's principal investigator, who wasn't at the conference, and then relaying our answers. But more importantly, these questions told me something: people are starting to realize you can't just theorize about platform governance anymore. You need to get your hands dirty with these technical methods. And that's exactly why our research resonates – it bridges that gap. We bring the legal and policy analysis, and we pair it with empirical evidence from our AI researchers. This multidisciplinary approach creates real leverage for policy change because it speaks to both worlds.

The conversation that followed my presentation went deeper into Article 40 of the DSA – the provision that’s supposed to enable vetted researchers to access platform data. At the time of the conference, we were still waiting for guidelines on how this would actually work. I shared our experience with TikTok: 21 months to get access to a virtual compute environment, which wasn’t even what we originally applied for. Twenty-one months. The room reacted with shock, but also with a kind of recognition. Some researchers said they’d faced similar struggles but just didn’t have the manpower to keep appealing like we did.

This is what "researcher access" looks like under the DSA right now. If you don't have the resources to be persistent for almost two years, you're essentially locked out. And that matters, because without data access you can't do the rigorous, independent research the DSA was supposed to enable.

What made the workshop valuable beyond my own presentation was seeing how all these pieces connect. The presentations on systemic risk reports and their evolution over the past two years will definitely inform our work moving forward. The debates on media freedom and pluralism reminded me why this matters – these platforms shape what information reaches citizens, and they can threaten media diversity and democratic participation. It's not just about technical compliance; it's about protecting democratic spaces.

But honestly, what resonated most was the buzz after my presentation. People wanted to talk about it. They wanted to understand how to implement these methods themselves. And I think that’s because we’re highlighting something people are experiencing but maybe struggling to articulate: that the audits happening now have real problems, real inconsistencies. The DSA was supposed to bring transparency and accountability, but if the mechanisms enforcing it lack rigor, we’re just creating compliance theater.

This is why the connections I made at the workshop feel so important. Researchers from Austria, Germany, and Belgium working on systemic risk assessments, enforcement, transparency reports – we’re all tackling different pieces of the same puzzle. And here’s the thing: getting access to VLOP data is incredibly difficult right now, especially for smaller research institutes. But if we work together, if we share knowledge and insights and actually coordinate our efforts, we can build something stronger than any of us could alone. We’re not competing. We’re all trying to figure out if the DSA is working, and that requires collaboration.

Looking back now, what strikes me is how much work remains but also how much momentum is building. The DSA created this framework, but implementation is messy. Access to data takes years. Audit methodologies are inconsistent. Yet researchers are getting more sophisticated about technical methods, more willing to bridge disciplines, more committed to making this work. There’s a real community forming around these questions.

For those of us working on platform governance in Slovakia and beyond, this is where we need to be – bridging technical and policy expertise, sharing resources, pushing for the kind of rigorous, independent auditing that the DSA promised but hasn’t fully delivered yet.

The conversations from Brussels will continue. And hopefully, a year from now, we’ll be further along – clearer guidelines, better access, audits with real teeth.