How We Normalized 67 Child Safety Laws into 45 API Rule Categories
The Phosra Child Safety Spec (PCSS) is an open specification that maps every child safety law we could find to a single, machine-readable taxonomy. This is the technical story of how we built it.
My 7-year-old got a Nintendo Switch for his birthday. I sat down to configure parental controls and spent 45 minutes in Nintendo's app. Then I realized I also needed to update YouTube, Netflix, Roblox, the iPad, and the WiFi router. Each platform has its own parental control system. Nintendo uses a custom age-based rating. Apple uses content descriptors plus app-level restrictions. YouTube's “Restricted Mode” is a single toggle — on or off. Netflix has its own maturity ratings that don't map to any other system. None of these platforms share a vocabulary for what “age-appropriate” means, and none of them can import settings from each other.
As an engineer, I recognized this as a normalization problem — the same class of issue as timezones before the Olson database, or payment processing before Stripe. N platforms, each with their own schema for child safety, and no interoperability layer.
It gets worse at the regulatory level. There are now 67 child safety laws across 7 jurisdictions — US federal, US state, EU, UK, Asia-Pacific, Americas, and Middle East & Africa. Each law defines its own requirements for content filtering, screen time, privacy, algorithmic safety, and parental consent. A platform operating globally needs to comply with all of them simultaneously, but there is no shared language between KOSA's “duty of care” requirements and the EU Digital Services Act's “systemic risk mitigation” obligations, even when they mandate functionally identical controls.
67 Laws, 60+ Platforms, No Shared Language
The compliance fragmentation problem is worse than most engineers realize. Consider three laws that all address algorithmic recommendations for minors:
- KOSA (US Federal) requires platforms to let minors “opt out of personalized algorithmic recommendations” and disable addictive design features like autoplay and notification streaks by default.
- EU Digital Services Act requires “very large online platforms” to assess systemic risks to minors and prohibits profiling-based recommendations for minors when they're aware a user is a child.
- California SB 976 (the Protecting Our Kids from Social Media Addiction Act) bans platforms from serving addictive feeds to minors without parental consent and restricts notifications during school hours and overnight.
Three different laws, three different legal frameworks, three different enforcement mechanisms — but they all boil down to two technical controls: disable algorithmic feeds and limit addictive design patterns. A developer building compliance for a social media app has to read all three laws, understand the nuances, and implement what is effectively the same feature three different ways, because each law uses different terminology and defines different thresholds.
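Normalized into PCSS terms, that overlap is literally two rules. Here is a sketch of what the shared requirement behind all three laws looks like as machine-readable rules (the category names come from the taxonomy introduced below; the config fields are illustrative, not the normative schema):

```json
[
  {
    "category": "algo_feed_control",
    "enabled": true,
    "config": { "mode": "chronological" }
  },
  {
    "category": "addictive_design_control",
    "enabled": true,
    "config": { "autoplay": false, "streaks": false }
  }
]
```

A platform that enforces these two rules has implemented the common technical core of KOSA, the DSA provision, and SB 976; what remains per-law is thresholds and paperwork, not new controls.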
Now multiply that across 67 laws. The Kids Online Safety and Media Act (KOSMA) adds algorithmic audit requirements. Virginia's SB 854 mandates notification curfews. New York's SAFE for Kids Act requires usage timer notifications. COPPA 2.0 bans targeted advertising to minors. Every US state that has passed a children's code or age-appropriate design code adds its own variation of the same core protections.
The industry needs what the timezone world got with the IANA Time Zone Database: a single, maintained, machine-readable mapping that normalizes the mess. That is what PCSS is.
From Legal Text to JSON: The Rule Category Taxonomy
The core of PCSS is a taxonomy of 45 rule categories. These aren't arbitrary — every one of them was derived by reading every provision of every law in our registry and identifying the distinct, enforceable technical controls they require. When multiple laws mandate the same control, that control gets a single category. When a law introduces a genuinely new requirement, we add a new category.
The 45 categories are organized into 12 domains:
| Domain | Rule Categories |
|---|---|
| Content | content_rating, content_block_title, content_allow_title, content_allowlist_mode, content_descriptor_block |
| Time | time_daily_limit, time_scheduled_hours, time_per_app_limit, time_downtime |
| Purchases | purchase_approval, purchase_spending_cap, purchase_block_iap |
| Social | social_contacts, social_chat_control, social_multiplayer |
| Web Filtering | web_safesearch, web_category_block, web_custom_allowlist, web_custom_blocklist, web_filter_level |
| Privacy | privacy_location, privacy_profile_visibility, privacy_data_sharing, privacy_account_creation |
| Monitoring | monitoring_activity, monitoring_alerts |
| Algorithmic Safety | algo_feed_control, addictive_design_control |
| Notifications | notification_curfew, usage_timer_notification |
| Advertising & Data | targeted_ad_block, dm_restriction, age_gate, data_deletion_request, geolocation_opt_in |
| Compliance & Safety | csam_reporting, library_filter_compliance, ai_minor_interaction, social_media_min_age, image_rights_minor |
| Legislation (2025) | parental_consent_gate, parental_event_notification, screen_time_report, commercial_data_ban, algorithmic_audit |
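For programmatic use, the taxonomy reduces to a category-to-domain lookup. A minimal Go sketch (the variable and type shapes here are illustrative, not the actual PCSS types; only two of the 12 domains are shown):

```go
package main

import "fmt"

// domainOf maps a PCSS rule category to its domain.
// Two domains shown; the full taxonomy has 45 categories across 12 domains.
var domainOf = map[string]string{
	"time_daily_limit":         "Time",
	"time_scheduled_hours":     "Time",
	"time_per_app_limit":       "Time",
	"time_downtime":            "Time",
	"algo_feed_control":        "Algorithmic Safety",
	"addictive_design_control": "Algorithmic Safety",
}

func main() {
	fmt.Println(domainOf["algo_feed_control"]) // Algorithmic Safety
}
```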
Each law in our registry maps to one or more of these categories. Here is what a real law entry looks like in our system — this is the actual KOSA entry from our LawEntry type:
{
  "id": "kosa",
  "shortName": "KOSA",
  "fullName": "Kids Online Safety Act",
  "jurisdiction": "United States (Federal)",
  "jurisdictionGroup": "us-federal",
  "country": "US",
  "status": "passed",
  "statusLabel": "Passed Senate (Jul 2024)",
  "summary": "Establishes a duty of care for platforms, requiring them to disable addictive features and algorithmic feeds for minors by default.",
  "keyProvisions": [
    "Duty of care requiring platforms to prevent and mitigate harms to minors",
    "Strongest default privacy settings for minors must be enabled by default",
    "Minors must be able to opt out of algorithmic recommendations",
    "Platforms must disable addictive design features by default for minors",
    "FTC enforcement authority with civil penalties up to $50,000 per violation",
    "Annual independent audits of platform compliance"
  ],
  "ruleCategories": [
    "algo_feed_control",
    "addictive_design_control",
    "targeted_ad_block",
    "algorithmic_audit"
  ],
  "platforms": ["Netflix", "YouTube", "TikTok", "Instagram"],
  "ageThreshold": "All minors",
  "penaltyRange": "Up to $50,000 per violation"
}
The ruleCategories field is the critical bridge. It tells any consuming system exactly which technical controls this law requires, using a vocabulary shared by every law in the registry. When KOSA says “minors must be able to opt out of algorithmic recommendations,” that maps to algo_feed_control. When it says “disable addictive design features,” that maps to addictive_design_control. A developer who implements support for these two rule categories is simultaneously compliant with the equivalent provisions in KOSA, the EU DSA, California SB 976, and every other law that mandates the same controls.
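That claim is mechanically checkable. A hedged sketch of the coverage query (the registry shape mirrors the LawEntry example above; the type and function names are ours, not the Phosra API):

```go
package main

import "fmt"

// law pairs a law ID with the rule categories it requires,
// mirroring the ruleCategories field of a LawEntry.
type law struct {
	ID             string
	RuleCategories []string
}

// satisfies reports whether a set of implemented categories
// covers every category the law requires.
func satisfies(implemented map[string]bool, l law) bool {
	for _, c := range l.RuleCategories {
		if !implemented[c] {
			return false
		}
	}
	return true
}

func main() {
	kosa := law{"kosa", []string{
		"algo_feed_control", "addictive_design_control",
		"targeted_ad_block", "algorithmic_audit",
	}}
	impl := map[string]bool{
		"algo_feed_control":        true,
		"addictive_design_control": true,
	}
	fmt.Println(satisfies(impl, kosa)) // false: two categories still missing
}
```

Run the same check against every law in the registry and you get a coverage report instead of a legal reading assignment.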
Why 45 categories and not 30 or 100? Because we started from the legal text, not from engineering convenience. We read every provision of every law and asked: “Does this require a technical control that is genuinely distinct from every other category we already have?” If yes, we added a new category. If the provision could be satisfied by an existing category, we mapped it there. The result is a taxonomy that is as small as possible while covering every enforceable requirement across all 67 laws. We expect the number to grow as new legislation passes — the five most recent categories (parental_consent_gate, parental_event_notification, screen_time_report, commercial_data_ban, algorithmic_audit) were added to cover 2025 legislation.
The Platform Adapter Interface
A taxonomy of rule categories is useful for reading comprehension, but it doesn't enforce anything by itself. To actually apply rules to a platform, we need an adapter layer. In Phosra's architecture, every platform implements a single Go interface:
// Adapter is the core interface all platforms implement.
type Adapter interface {
    Info() PlatformInfo
    Capabilities() []Capability
    ValidateAuth(ctx context.Context, auth AuthConfig) error
    EnforcePolicy(ctx context.Context, req EnforcementRequest) (*EnforcementResult, error)
    GetCurrentConfig(ctx context.Context, auth AuthConfig) (map[string]any, error)
    RevokePolicy(ctx context.Context, auth AuthConfig) error
    SupportsWebhooks() bool
    RegisterWebhook(ctx context.Context, auth AuthConfig, callbackURL string) error
}
The key method is Capabilities(). Each adapter declares what it can do — not which rule categories it supports, but which capabilities it has. This is an important distinction. A Capability is a cluster of related rule categories that a platform can handle natively. For example, the web_filtering capability covers five rule categories: web_filter_level, web_category_block, web_custom_allowlist, web_custom_blocklist, and web_safesearch.
Here is a concrete example. The NextDNS adapter declares four capabilities:
func (a *Adapter) Info() provider.PlatformInfo {
    return provider.PlatformInfo{
        ID:          "nextdns",
        Name:        "NextDNS",
        Category:    domain.PlatformCategoryDNS,
        Tier:        domain.ComplianceLevelCompliant,
        Description: "DNS-level content filtering and parental controls",
        AuthType:    "api_key",
    }
}

func (a *Adapter) Capabilities() []provider.Capability {
    return []provider.Capability{
        provider.CapWebFiltering,
        provider.CapSafeSearch,
        provider.CapCustomBlocklist,
        provider.CapCustomAllowlist,
    }
}
NextDNS can filter web content, enforce safe search, and manage custom block and allow lists. It cannot set screen time limits, manage in-app purchases, or control social features — those capabilities don't exist at the DNS layer. When the engine encounters a rule like time_daily_limit targeting a child who has NextDNS connected, it knows immediately that NextDNS can't handle it and routes the rule elsewhere.
This capability-based routing is what makes PCSS work as a universal spec. You don't need to know the specific API of every platform. You declare rules using the 45-category taxonomy, and the engine figures out which platform can enforce each rule. If no connected platform supports a given rule category natively, the engine routes it to one of Phosra's own services as a fallback.
Split-Brain Enforcement
The hardest problem in the system is what we call “split-brain enforcement.” When a parent sets a policy for their child, the rules need to be enforced across every connected platform. But not every platform can handle every rule. The CompositeEngine solves this by splitting each rule set into two buckets: rules the native platform adapter handles, and rules that Phosra's services handle.
// RouteRules splits rules between native provider and Phosra services.
// For each enabled rule, it checks if the adapter's capabilities cover it.
// If yes -> NativeRules. If no -> routes to the appropriate Phosra service.
func (e *CompositeEngine) RouteRules(
    adapter provider.Adapter,
    rules []domain.PolicyRule,
) *RuleRouting {
    routing := &RuleRouting{
        PhosraRules: make(map[string][]domain.PolicyRule),
    }
    adapterCaps := adapter.Capabilities()
    for _, rule := range rules {
        if !rule.Enabled {
            continue
        }
        // Check if the adapter natively supports this rule
        nativelySupported := false
        for _, cap := range adapterCaps {
            if matchesCapability(rule.Category, cap) {
                nativelySupported = true
                break
            }
        }
        if nativelySupported {
            routing.NativeRules = append(routing.NativeRules, rule)
        } else {
            // Route to the appropriate Phosra service
            if svcName, ok := e.categoryToSvc[rule.Category]; ok {
                routing.PhosraRules[svcName] = append(routing.PhosraRules[svcName], rule)
            }
        }
    }
    return routing
}
The engine is initialized with 9 Phosra services, each responsible for a subset of rule categories that platforms commonly lack native support for:
| Service | Purpose |
|---|---|
| notification | Curfew notifications, usage timer alerts, parental event notifications |
| analytics | Activity monitoring, alerts, screen time reporting |
| age_verification | Age gates, parental consent gates, social media minimum age enforcement |
| content_classify | Content rating, descriptor blocking, allowlist mode |
| privacy_consent | Data deletion requests, data sharing opt-outs, profile visibility |
| compliance_attest | CSAM reporting, library filter compliance, AI interaction rules, algorithmic audits |
| social | Contact management, chat controls, DM restrictions, multiplayer settings |
| location | Location tracking, geolocation opt-in enforcement |
| purchase | Purchase approval workflows, spending caps, IAP blocking |
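The categoryToSvc map that RouteRules consults is populated from this table. A minimal sketch of what that initialization might look like (the map shape matches the routing code above, but this literal covers only a few representative entries and is illustrative, not the production wiring):

```go
package main

import "fmt"

// categoryToSvc routes rule categories that platforms rarely support
// natively to the Phosra service responsible for them.
// Representative entries only; service names are from the table above.
var categoryToSvc = map[string]string{
	"notification_curfew":         "notification",
	"usage_timer_notification":    "notification",
	"parental_event_notification": "notification",
	"age_gate":                    "age_verification",
	"parental_consent_gate":       "age_verification",
	"csam_reporting":              "compliance_attest",
	"purchase_approval":           "purchase",
}

func main() {
	fmt.Println(categoryToSvc["notification_curfew"]) // notification
}
```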
This split-brain architecture solves a real problem. Consider a family that has NextDNS for web filtering, Apple Screen Time for device management, and YouTube connected directly. When the parent enables a policy with web_safesearch, time_daily_limit, and algo_feed_control, the engine routes each rule to the right handler:
- web_safesearch goes to NextDNS (native capability)
- time_daily_limit goes to Apple Screen Time (native capability)
- algo_feed_control goes to YouTube (native capability via algorithmic safety)
If the parent also enables notification_curfew and none of the connected platforms support it natively, the engine routes it to Phosra's notification service, which handles the curfew scheduling itself and sends push notifications to the child's registered devices. The parent doesn't need to know which platform handles which rule. The policy is expressed once in PCSS, and the engine does the routing.
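The family scenario can be traced with a toy version of the routing loop (the capability sets and coverage map below are deliberately simplified stand-ins; the real engine goes through the Adapter interface and the full capability mapping):

```go
package main

import "fmt"

// Toy capability sets for the three connected platforms.
var platformCaps = map[string][]string{
	"nextdns":     {"safe_search", "web_filtering"},
	"screen_time": {"time_limit", "scheduled_hours"},
	"youtube":     {"algorithmic_safety"},
}

// Toy capability -> rule category coverage (a subset of the full mapping).
var capCovers = map[string][]string{
	"safe_search":        {"web_safesearch"},
	"time_limit":         {"time_daily_limit", "time_per_app_limit"},
	"algorithmic_safety": {"algo_feed_control", "addictive_design_control"},
}

// routeRule returns the platform whose capabilities cover the category,
// or "" if none does -- the case where the engine falls back to a
// Phosra service.
func routeRule(category string) string {
	for platform, caps := range platformCaps {
		for _, cap := range caps {
			for _, c := range capCovers[cap] {
				if c == category {
					return platform
				}
			}
		}
	}
	return ""
}

func main() {
	for _, r := range []string{
		"web_safesearch", "time_daily_limit",
		"algo_feed_control", "notification_curfew",
	} {
		fmt.Println(r, "->", routeRule(r))
	}
}
```

notification_curfew routes to "" here, which is exactly the gap the Phosra notification service fills.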
The Compliance Graph
The real power of PCSS emerges when you see the three-way mapping as a graph. On one side, you have laws. In the middle, you have rule categories. On the other side, you have platform capabilities. Every law maps to a set of rule categories. Every capability maps to a set of rule categories. A platform is compliant with a law if, for every rule category that law requires, at least one connected platform (or Phosra service) has a capability that covers it.
The capability-to-rule mapping is defined in the engine's matchesCapability function. Here is the full mapping:
| Capability | Rule Categories Covered |
|---|---|
| content_rating | content_rating, content_block_title, content_allow_title, content_allowlist_mode, content_descriptor_block |
| time_limit | time_daily_limit, time_per_app_limit |
| scheduled_hours | time_scheduled_hours, time_downtime |
| purchase_control | purchase_approval, purchase_spending_cap, purchase_block_iap |
| web_filtering | web_filter_level, web_category_block, web_custom_allowlist, web_custom_blocklist |
| safe_search | web_safesearch |
| social_control | social_contacts, social_chat_control, social_multiplayer, dm_restriction |
| location_tracking | privacy_location, geolocation_opt_in |
| activity_monitoring | monitoring_activity, monitoring_alerts, screen_time_report |
| privacy_control | privacy_profile_visibility, privacy_data_sharing, privacy_account_creation, data_deletion_request |
| algorithmic_safety | algo_feed_control, addictive_design_control, algorithmic_audit |
| notification_control | notification_curfew, usage_timer_notification, parental_event_notification |
| ad_data_control | targeted_ad_block, commercial_data_ban |
| age_verification | age_gate, parental_consent_gate, social_media_min_age |
| compliance_reporting | csam_reporting, library_filter_compliance, ai_minor_interaction, image_rights_minor |
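This table compiles directly into the matchesCapability check used by RouteRules. A sketch of one plausible implementation (the real function lives in the closed-source engine; our map literal covers just two rows of the table):

```go
package main

import "fmt"

// capabilityCovers holds capability -> rule category edges from the
// mapping table. Two rows shown for brevity.
var capabilityCovers = map[string][]string{
	"web_filtering": {
		"web_filter_level", "web_category_block",
		"web_custom_allowlist", "web_custom_blocklist",
	},
	"safe_search": {"web_safesearch"},
}

// matchesCapability reports whether a capability covers a rule category.
func matchesCapability(category, capability string) bool {
	for _, c := range capabilityCovers[capability] {
		if c == category {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(matchesCapability("web_safesearch", "safe_search"))   // true
	fmt.Println(matchesCapability("web_safesearch", "web_filtering")) // false
}
```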
This graph is what makes compliance auditable. Given any law, you can programmatically ask: “Which rule categories does this law require? For each of those categories, which of the family's connected platforms have a capability that covers it? What is the compliance gap?” The answer is always computable because every edge in the graph is explicit in the data.
For platforms building their own compliance, the graph works in reverse. A platform can ask: “Given my declared capabilities, which laws am I already covering, and which rule categories am I missing?” This turns the open-ended question of “are we compliant?” into a specific, enumerable list of gaps to close.
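Both directions of that query are a set difference over the graph's edges. A hedged sketch of the gap computation (function and variable names are ours, for illustration):

```go
package main

import "fmt"

// complianceGap returns the rule categories a law requires that no
// connected capability covers, given capability -> category edges.
func complianceGap(required []string, caps map[string][]string) []string {
	covered := map[string]bool{}
	for _, categories := range caps {
		for _, c := range categories {
			covered[c] = true
		}
	}
	var gap []string
	for _, c := range required {
		if !covered[c] {
			gap = append(gap, c)
		}
	}
	return gap
}

func main() {
	// KOSA's four required categories, per the LawEntry shown earlier.
	kosa := []string{
		"algo_feed_control", "addictive_design_control",
		"targeted_ad_block", "algorithmic_audit",
	}
	// A family whose only relevant capability is algorithmic_safety.
	connected := map[string][]string{
		"algorithmic_safety": {
			"algo_feed_control", "addictive_design_control",
			"algorithmic_audit",
		},
	}
	fmt.Println(complianceGap(kosa, connected)) // [targeted_ad_block]
}
```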
The PCSS v1.0 Spec
PCSS also specifies a wire format for enforcement requests and responses. An enforcement request describes a set of rules to apply for a child, expressed in the 45-category taxonomy. Here is the request format:
// PCSS Enforcement Request
{
  "rules": [
    {
      "category": "web_safesearch",
      "enabled": true,
      "config": { "enabled": true }
    },
    {
      "category": "time_daily_limit",
      "enabled": true,
      "config": { "minutes": 120 }
    },
    {
      "category": "algo_feed_control",
      "enabled": true,
      "config": { "mode": "chronological" }
    },
    {
      "category": "notification_curfew",
      "enabled": true,
      "config": {
        "start": "21:00",
        "end": "07:00",
        "timezone": "America/Chicago"
      }
    }
  ],
  "auth_config": {
    "api_key": "...",
    "extra_params": { "profile_id": "abc123" }
  },
  "child_name": "Emma",
  "child_age": 10
}
The response reports exactly what happened — which rules were applied, which were skipped (because the platform doesn't support them), and which failed:
// PCSS Enforcement Response
{
  "rules_applied": 3,
  "rules_skipped": 1,
  "rules_failed": 0,
  "details": {
    "web_safesearch": "applied",
    "time_daily_limit": "applied",
    "algo_feed_control": "applied",
    "notification_curfew": {
      "routed_to": "phosra_notification_service",
      "status": "applied"
    }
  },
  "message": "3 rules applied natively, 1 routed to Phosra services"
}
The EnforcementRequest and EnforcementResult types are defined in our Go provider package. The request contains the rules (each with a category, enabled flag, and JSON config), authentication credentials, and optional child metadata. The response is a simple accounting of what happened, with per-rule details for debugging and audit trails.
We chose this format specifically because it is platform-agnostic. The request doesn't mention NextDNS or Apple or YouTube. It speaks only in terms of rule categories. The adapter translates rule categories into platform-specific API calls. This means a PCSS-compatible request can be sent to any adapter without modification — the adapter decides which rules it can handle and reports back.
What We're Open-Sourcing
We believe the taxonomy and the law data should be public infrastructure. The specific problem — “what does KOSA require, technically?” — should not require a lawyer and three months of work for every platform to answer independently. The PCSS spec repo contains:
- The PCSS specification — The 45-category taxonomy with descriptions and domain groupings
- The full law registry — All 67 laws with metadata, rule category mappings, key provisions, and jurisdiction data
- The capability mapping — the capability clusters and their rule category coverage
- A reference adapter interface — The Go Adapter and Capability types for building compliant platform integrations
What remains proprietary is the enforcement engine (the CompositeEngine, the 9 Phosra services, the actual platform adapters, and the production API). We think this is the right split: the data and the spec should be open; the infrastructure that makes it fast and reliable is the product.
The spec is versioned. PCSS v1.0 ships with 45 categories and 67 laws. When new legislation passes, we add categories and law entries and bump the version. Contributions to the law registry are welcome — we know there are child safety laws in jurisdictions we haven't covered yet.
Help Us Build This
PCSS is an attempt to solve a coordination problem. The fragmentation of child safety compliance is costing the industry enormous amounts of duplicated work, and the children it is meant to protect are getting inconsistent coverage depending on which platforms they use. A shared, open taxonomy can fix this, but only if the industry adopts it.
We are looking for feedback from three groups: platform engineers who have to implement these controls, compliance teams who have to audit them, and parents who have to live with the patchwork results. If you have built parental controls for a platform, we want to know what we got wrong in the taxonomy. If you have read a child safety law we haven't covered, we want to add it. If you are a parent who spends too many hours configuring controls on six different devices, we want to hear what matters most to you.
Check out the Phosra Compliance Hub to explore all 67 laws and their rule category mappings. The PCSS spec and law data are available on GitHub.