Longterm Wiki

PMC Academic - Obligations to assess: Recent trends in AI accountability regulations

paper

Authors

Serena Oduro·Emanuel Moss·Jacob Metcalf

Credibility Rating

4/5 — High (4)

High quality. Established institution or organization with editorial oversight and accountability.

Rating inherited from publication venue: PubMed Central

This peer-reviewed journal article analyzes emerging AI accountability regulations and their shift toward impact assessments and governance documentation, directly addressing regulatory approaches to managing risks from automated decision systems.

Paper Details

Citations
15
Year
2022
Methodology
peer-reviewed
Categories
Patterns

Metadata

journal article · analysis

Summary

This paper examines recent trends in AI accountability regulations that require developers to conduct impact assessments of automated decision systems across social, economic, and ethical dimensions. Analyzing four legislative examples from the US and EU, the authors demonstrate how regulations are shifting beyond technical assessments toward accountability documentation as a governance mechanism. The paper identifies three core concerns these regulations address: identifying and documenting harms, ensuring public transparency, and enforcing anti-discrimination rules. The authors provide insights for system designers on preparing for and complying with emerging regulatory requirements.

Cited by 1 page

Page: US State AI Legislation Landscape
Type: Analysis
Quality: 70.0

Cached Content Preview

HTTP 200 · Fetched Apr 10, 2026 · 0 KB
reCAPTCHA: Checking your browser before accessing pmc.ncbi.nlm.nih.gov ... Click here if you are not automatically redirected after 5 seconds.
Resource ID: f4ba840569bf2bb5 | Stable ID: sid_sjqvDz1Fvp