Research

Data on the compliance web.

We scan the public web for compliance signals — trackers, AI disclosures, accessibility failures, license contamination — and publish what we find. Open data, reproducible methods, linked tooling.

By The Gridworker Team · License CC BY 4.0 (data) · MIT (tools) · Contact [email protected]
No reports published yet.

The first research drop lands soon. Reports will appear here as they ship — each with its full methodology, the raw dataset, and the scanner you can run yourself.

How this works

Every Gridwork research report follows the same playbook. We pick a compliance question with a clear public-web signal. We run one of our open-source scanners against a named sample of sites. We publish the raw data, the methodology, and a summary of what we found — plus every edge case the scanner flagged for human review.

All scanners are published on npm and GitHub, so every finding is reproducible. If you think we got something wrong, tell us — corrections get published, credited to the person who caught them if they want the credit.

Tools used: gridwork-siteaudit, gridwork-privacy, gridwork-aiact, gridwork-license, gridwork-wiki.