Holding Big Platforms Accountable: The Why, What, and How of Auditing

Imagine you’re about to audit one of the world’s largest online platforms – perhaps TikTok, Google, or Facebook. Your task? To assess whether they’re managing risks effectively, moderating content fairly, and ensuring transparency in how their algorithms shape the digital landscape. This isn’t just a bureaucratic formality; it’s about accountability. These platforms wield immense influence over society, and their operations must be scrutinised accordingly.

This is the essence of Article 37 of the Digital Services Act (DSA). Under this regulation, platforms classified as Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) are now legally required to undergo independent annual audits. These audits should go beyond mere box-checking – they are designed to uncover how platforms handle systemic risks, protect users (including minors), and comply with EU regulations.

In short, platforms like Facebook, TikTok, Instagram, Google, and YouTube no longer operate in a regulatory vacuum. The DSA sets out specific obligations they must adhere to, ensuring that their power comes with responsibility. So, what exactly do these obligations entail?

Key Areas of Article 37

The independent annual audit is the essence of Article 37 of the DSA. The obligations covered by such an audit mirror the hierarchical, pyramid-like structure of requirements in the DSA. At the apex of this pyramid sit the special obligations that apply only to VLOPs and VLOSEs, and the following sections address the main points an auditor needs to understand in order to perform an audit.

Systemic Risk Management

Platforms must assess risks tied to illegal content, fundamental rights, public security, and user well-being. For example, a VLOP must ensure its platform does not distribute child abuse material or enable the sale of counterfeit goods. Similarly, a social media VLOP must prevent the amplification of harmful disinformation, for instance around elections.

A major focus is design ethics. Article 25 of the DSA prohibits platforms from designing interfaces that deceive or manipulate users. If a platform buries privacy settings deep in menus or nudges users into sharing more data than they intended, it may be using dark patterns – a clear compliance failure. Auditors will examine whether users can make informed, free choices without manipulation, including over protracted periods of use.

Child safety is another priority. Article 28 of the DSA requires platforms accessible to minors to implement appropriate safeguards ensuring high levels of privacy, safety, and security. This means features like default privacy settings for minors, age-appropriate content controls, and limitations on targeted advertising. An audit should check whether these measures are actually enforced or merely a superficial policy.
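
To illustrate what such a check could look like in practice, below is a minimal Python sketch that evaluates the defaults observed on freshly created minor test accounts. The account records, field names, and expected defaults are our own illustrative assumptions, not taken from any platform’s actual interface or API.

    from dataclasses import dataclass

    # Hypothetical record of the defaults observed on a freshly registered minor test account.
    @dataclass
    class MinorAccountDefaults:
        account_id: str
        profile_private: bool            # profile not publicly visible by default
        dms_restricted: bool             # strangers cannot message the account by default
        personalised_ads_enabled: bool   # profiling-based ads (expected to be off for minors)

    def check_minor_defaults(accounts: list[MinorAccountDefaults]) -> list[str]:
        """Return human-readable findings for every default that looks non-compliant."""
        findings = []
        for acc in accounts:
            if not acc.profile_private:
                findings.append(f"{acc.account_id}: profile is public by default")
            if not acc.dms_restricted:
                findings.append(f"{acc.account_id}: direct messages are open to strangers by default")
            if acc.personalised_ads_enabled:
                findings.append(f"{acc.account_id}: profiling-based advertising is enabled for a minor")
        return findings

In a real audit, the observations would come from systematic test-account registrations across age brackets and regions; the point here is only that “default settings” can be turned into a concrete, repeatable check rather than a policy statement.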

Risk Mitigation Measures

Once risks are identified, platforms must demonstrate concrete steps to mitigate them. This goes beyond drafting policies—it requires measurable actions.

Take recommender systems as an example. Article 27 of the DSA mandates that platforms explain why certain content is recommended. A video-sharing platform such as TikTok or YouTube should clearly state whether recommendations are based on past interactions, trending content, or paid promotions.

But transparency alone isn’t enough. The same article requires that users be able to modify their preferences at any time, directly from where recommendations appear. If a user wants to switch from an algorithm-driven feed to a chronological one, they should not have to navigate a maze of settings. Auditors will test whether these options are genuinely accessible or deliberately obscured.
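
One way an auditor might operationalise this is to record walkthroughs of the interface and score how far the non-profiling option sits from the recommendation feed itself. The Python sketch below does exactly that; the platform names, step counts, and threshold are invented for illustration, since the DSA requires the option to be directly and easily accessible but does not prescribe any particular click count.

    # Hypothetical walkthrough logs: how many interface steps it takes to switch
    # the feed to a non-profiling (e.g. chronological) option.
    feed_switch_walkthroughs = [
        {"platform": "platform_a", "steps_from_feed": 1, "option_visible_on_feed": True},
        {"platform": "platform_b", "steps_from_feed": 6, "option_visible_on_feed": False},
    ]

    MAX_ACCEPTABLE_STEPS = 2  # illustrative threshold, not a legal standard

    for run in feed_switch_walkthroughs:
        accessible = run["option_visible_on_feed"] and run["steps_from_feed"] <= MAX_ACCEPTABLE_STEPS
        status = "directly accessible" if accessible else "potentially obscured"
        print(f"{run['platform']}: feed-mode switch is {status} "
              f"({run['steps_from_feed']} steps from the recommendation feed)")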

Similarly, advertising transparency is critical. Platforms must clearly label ads, explain targeting criteria, and allow users to opt out of profiling-based recommendations. Auditors verify whether these commitments hold up under scrutiny—whether platforms genuinely respect user autonomy or employ dark patterns to nudge them back into data-driven personalisation. 
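
A similar check can be run over ads collected by test accounts while browsing. The sketch below flags observed ads that are missing any of the basic disclosures; the records and field names are hypothetical placeholders for whatever an audit actually captures, not a real ad-repository schema.

    # Hypothetical ad observations collected by test accounts while browsing.
    observed_ads = [
        {"ad_id": "a1", "labelled_as_ad": True, "advertiser_shown": True, "targeting_explained": True},
        {"ad_id": "a2", "labelled_as_ad": True, "advertiser_shown": False, "targeting_explained": False},
    ]

    REQUIRED_DISCLOSURES = ("labelled_as_ad", "advertiser_shown", "targeting_explained")

    def audit_ad_transparency(ads: list[dict]) -> list[tuple[str, list[str]]]:
        """Return (ad_id, missing disclosures) for every ad that fails the check."""
        issues = []
        for ad in ads:
            missing = [field for field in REQUIRED_DISCLOSURES if not ad[field]]
            if missing:
                issues.append((ad["ad_id"], missing))
        return issues

    print(audit_ad_transparency(observed_ads))  # [('a2', ['advertiser_shown', 'targeting_explained'])]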

Comprehensive Reporting and Transparency

Platforms must also publicly disclose how they address systemic risks. These reports must be based on clear methodologies and include measurable outcomes. A vague statement about “enhancing safety” isn’t enough—platforms must show what steps were taken, what data was collected, and how effective their measures were.
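
As a simple illustration of what a measurable outcome looks like, the sketch below computes the prevalence of violating content before and after one mitigation measure. All figures and the measure name are invented; the point is that a report should contain numbers an auditor can recompute, not adjectives.

    # Hypothetical aggregates that a risk report might disclose for a single mitigation measure.
    report = {
        "measure": "demotion of borderline election disinformation",
        "views_sampled_before": 1_000_000, "violating_views_before": 5_200,
        "views_sampled_after": 1_000_000, "violating_views_after": 1_900,
    }

    def prevalence(violating_views: int, sampled_views: int) -> float:
        """Share of sampled views that landed on violating content."""
        return violating_views / sampled_views

    before = prevalence(report["violating_views_before"], report["views_sampled_before"])
    after = prevalence(report["violating_views_after"], report["views_sampled_after"])
    print(f"{report['measure']}: prevalence {before:.2%} -> {after:.2%} "
          f"({1 - after / before:.0%} relative reduction)")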

Regulators and researchers must also have access to internal platform data to verify claims. Auditors will assess whether a platform provides meaningful access or just superficial summaries.

Other Requirements

However, the aforementioned obligations are only the tip of the iceberg. Platforms must also adhere to content moderation rules, providing transparent and user-friendly procedures for notifying illegal content and for handling such notices. They must recognise trusted flaggers – a privileged class of users whose notices are handled with priority – and inform users about out-of-court dispute resolution mechanisms. The DSA also contains rules on what platforms’ terms and conditions should look like.

How the Audit is Conducted

The audit proceeds in three parts: (1) audit risk analysis, (2) audit procedures and testing, and (3) the audit report and findings. First, the audit risk analysis serves as a foundational element before the audit proper and distinguishes between inherent, control, and detection risks. Second, during the audit procedures and testing, auditors focus on the controls implemented by the provider and analyse algorithmic interactions and decision-making parameters in areas such as content moderation. In this step, auditors also test algorithms and related systems in simulated environments that mirror real-world conditions. Finally, the audit report classifies compliance as positive, positive with comments, or negative. If weaknesses are found, platforms must address them within one month or provide justification.
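
To make this structure tangible, here is a small Python sketch of how an audit team might encode it: a per-obligation risk record in which inherent and control risk drive how much substantive testing is planned (keeping detection risk acceptable), and a simplified mapping from unresolved findings to the three possible conclusions. The levels, thresholds, and field names are our illustrative assumptions, not something prescribed by the DSA or its delegated rules on audits.

    from dataclasses import dataclass
    from enum import Enum

    class Level(Enum):
        LOW = 1
        MEDIUM = 2
        HIGH = 3

    @dataclass
    class ObligationRisk:
        """Risk assessment for one audited obligation (structure is illustrative)."""
        obligation: str   # e.g. "Art. 27 recommender transparency"
        inherent: Level   # risk of non-compliance before considering the provider's controls
        control: Level    # risk that the provider's own controls fail to catch it

    def planned_testing_depth(risk: ObligationRisk) -> str:
        """Higher inherent/control risk calls for deeper testing to keep detection risk low."""
        score = risk.inherent.value + risk.control.value
        if score >= 5:
            return "extensive testing, including simulated-environment runs"
        if score >= 3:
            return "standard sample-based testing"
        return "limited walkthroughs and document review"

    def audit_conclusion(unresolved_findings: int, material: bool) -> str:
        """Map findings to the three possible conclusions (deliberately simplified)."""
        if unresolved_findings == 0:
            return "positive"
        return "negative" if material else "positive with comments"

    example = ObligationRisk("Art. 27 recommender transparency", Level.HIGH, Level.MEDIUM)
    print(planned_testing_depth(example))       # extensive testing, including simulated-environment runs
    print(audit_conclusion(2, material=False))  # positive with comments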

The Role of Researchers

Researchers play a crucial oversight role in the auditing process. They may contribute in three ways:

  • Conducting independent audits as recognised auditing bodies.
  • Verifying audit results to ensure they are accurate and unbiased.
  • Analysing audit credibility by reviewing whether platforms are making genuine compliance efforts or just aiming for regulatory approval.

Researchers also act as a safeguard against lenient audits. If an auditing body produces a report that appears too favourable, independent researchers can challenge its findings, increasing accountability.

AI-Auditology Project and Its Relation to DSA

At the Kempelen Institute of Intelligent Technologies, we are pioneering a groundbreaking approach to model-based algorithmic auditing within the AI-Auditology project. This novel paradigm seeks to automate key steps of the auditing process, ultimately creating a powerful watchdog tool that addresses the limitations of conventional audits. 

This kind of independent (external) algorithmic solution provides a novel means by which we, as researchers, can get involved in all three of the above-mentioned roles. One of the main aims of the AI-Auditology project is to enable longitudinal and cross-platform algorithmic audits. Such audits can subsequently be used to verify VLOPs’/VLOSEs’ compliance with obligations stipulated by the DSA, particularly those related to recommender system transparency (Articles 27 & 38 of the DSA), the protection of minors (Article 28 of the DSA), and advertising practices (Articles 26 & 39 of the DSA).
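
As a rough sketch of what “longitudinal and cross-platform” means in code, the Python snippet below runs one collection wave across several platforms and user personas and timestamps every record, so that waves repeated over weeks or months can be compared. The collect_recommendations function is a placeholder for whatever collection method a concrete study uses (instrumented test accounts, data donations, or researcher data access under Article 40 of the DSA); the platform and persona names are made up.

    import datetime

    def collect_recommendations(platform: str, persona: str) -> list[dict]:
        # Placeholder: real collection would use instrumented test accounts,
        # data donations, or researcher data access under Article 40 of the DSA.
        return [{"item_id": "demo-item", "stated_reason": "unknown"}]

    def run_audit_wave(platforms: list[str], personas: list[str]) -> list[dict]:
        """One wave of data collection; repeating waves enables longitudinal comparison."""
        collected_at = datetime.datetime.now(datetime.timezone.utc).isoformat()
        records = []
        for platform in platforms:
            for persona in personas:
                for item in collect_recommendations(platform, persona):
                    records.append({"collected_at": collected_at, "platform": platform,
                                    "persona": persona, **item})
        return records

    wave = run_audit_wave(["platform_a", "platform_b"], ["teen_persona", "adult_persona"])
    print(len(wave), "records collected in this wave")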

Acknowledgement

The content of this blog was created as part of research funded by the EU NextGenerationEU through the Recovery and Resilience Plan for Slovakia under project No. 09I03-03-V03-00020.