Core Concepts
EU legislation, particularly the AIA, grants no data access to researchers and civil society, hindering the establishment of an effective AI audit ecosystem.
Abstract
The European legislature proposed the Digital Services Act (DSA) and Artificial Intelligence Act (AIA) to regulate platforms and AI products. The DSA mandates independent audits for very large online platforms (VLOPs) and very large online search engines (VLOSEs), while the AIA focuses on high-risk AI systems. However, a regulatory gap exists: the AIA does not provide data access for researchers and civil society, limiting accountability mechanisms. Third-party audits are crucial for compliance and oversight but face challenges due to restricted API access and other platform-imposed limitations.
Directory:
Introduction:
EU legislative response to AI risks.
Algorithm Audits:
Definitions of audits and key attributes.
Regulatory Framework:
DSA mandates independent audits for VLOPs/VLOSEs.
Data Access:
DSA grants data access to vetted researchers.
Conformity Assessment:
AIA focuses on internal control with optional external audits for high-risk systems.
Audit Ecosystem:
Importance of third-party audits by researchers and civil society.
Technical Methods:
Different audit types from code audits to experimental audits.
Access Levels:
Continuum from "white-box" to "black-box" auditing access levels.
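The "black-box" end of this access continuum can be sketched in code: the auditor can only query the system and observe its outputs, with no access to code, weights, or training data. The snippet below is a minimal, hypothetical illustration of an experimental audit via paired probes; `query_model` is a stand-in for a real platform API, and the decision rule inside it is invented purely for demonstration.

```python
# Minimal sketch of a "black-box" experimental audit: the auditor only
# queries the system and observes outputs (no code, weights, or data).

def query_model(prompt: str) -> str:
    # Hypothetical system under audit; a real audit would call the
    # platform's API instead of this invented decision rule.
    if "loan" in prompt and "young" in prompt:
        return "approved"
    return "denied"

def paired_probe(template: str, groups: list[str]) -> dict[str, str]:
    """Send minimally differing prompts and record each group's outcome."""
    return {g: query_model(template.format(group=g)) for g in groups}

results = paired_probe(
    "Should a {group} applicant receive a loan?",
    ["young", "elderly"],
)

# Differing outcomes for otherwise identical prompts flag a disparity
# that warrants deeper investigation (e.g. with "white-box" access).
disparity = len(set(results.values())) > 1
```

With only this level of access, the auditor can document disparate outputs but cannot explain their cause, which is why the paper stresses that access to the model and training data is crucial for meaningful third-party auditing.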
Key Statements
"The purpose of auditing is to determine whether providers comply with obligations standardized by the DSA."
"Independent third parties play a crucial role in additional oversight under the DSA."
"Access to model and training data is crucial for third-party auditing."
"DSA mandates independent annual audits for VLOPs/VLOSEs."
"AIA focuses on internal control with optional external audits for high-risk systems."