Governance Framework — v1.1

The Autonomous Reliability Assurance Foundation is governed by a multi-body structure designed to ensure independence, technical rigor, and public accountability. No single entity — commercial, governmental, or academic — controls the standard or its application. In v1.1, the governance framework expands to accommodate new ecosystem participants including Certified Assurance Platform Operators (CAPOs) and Recognized Insurer Partners (RIPs).

Governing Principles

Independence

ARAF is independent from any single commercial entity, vendor, government, or testing organization. No stakeholder group may exert undue influence over standard development or certification decisions.

Testability

Every requirement in the ARA Standard must be objectively testable. Opinion-based, subjective, or unfalsifiable requirements are not permitted. If a requirement cannot be evaluated through defined methods, it is not included.

Technology Neutrality

The standard evaluates behaviors and outcomes, not implementation methods. Any autonomous system — regardless of underlying architecture, training methodology, or deployment model — is evaluated against the same behavioral requirements.

Proportionality

Requirements are proportional to the risk profile and deployment context of the system under evaluation. Higher-stakes systems face more rigorous evaluation, while lower-risk systems are not burdened with unnecessary requirements.

Transparency

The standard, evaluation criteria, certification decisions, and public registry are openly accessible. ARAF does not operate behind closed doors. All major decisions are documented and published.

Continuous Validity

Certification is not a one-time event. Ongoing monitoring, drift detection, and periodic reassessment are integral to the ARA framework. A certified system must remain compliant throughout its certification period.

Harm Minimization

When ambiguity exists in the interpretation of a requirement, the interpretation that minimizes potential harm to affected parties shall be adopted.

Participant Categories

The ARA ecosystem comprises six participant categories, each with defined roles, responsibilities, and governance relationships.

1. Technical Standards Board (TSB)

9–15 members

Role: Ultimate technical authority over the ARA Standard. Responsible for standard development, revision, and ratification. Decisions on major revisions require supermajority (two-thirds) approval.
Composition: Domain experts in autonomous systems, cybersecurity, safety engineering, risk management, and regulatory compliance. No more than one-third of members may be affiliated with any single organization or industry sector.
Cadence: Quarterly plenary sessions. Working groups convene as needed.
2. Authorized Validation Bodies (AVBs)

Accredited organizations

Role: Organizations authorized by ARAF to conduct ARA evaluations and issue certifications. AVBs must meet competence, independence, and quality management requirements. Each AVB is subject to periodic ARAF oversight audits.
Composition: Assessment firms, testing laboratories, and specialized consultancies with demonstrated expertise in autonomous system evaluation.
Cadence: Ongoing operations. Annual ARAF accreditation review.
3. Certified Assurance Platform Operators (CAPOs)

Certified organizations — New in v1.1

Role: Organizations certified by ARAF to provide continuous monitoring infrastructure and assurance services for Class B (Monitored) and Class C (Continuous) certified systems. CAPOs must meet technical SLA requirements including telemetry ingestion, anomaly detection, dashboarding, alerting, and reporting capabilities.
Composition: Monitoring platform providers, managed security service providers, and observability vendors with demonstrated infrastructure for continuous assurance.
Cadence: Ongoing operations. Class B: 99.5% uptime SLA. Class C: 99.9% uptime SLA. Annual ARAF certification review.
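The two uptime tiers translate into concrete downtime budgets. A minimal sketch of that arithmetic (the function name and the 30-day window are illustrative, not part of the standard):

```python
def downtime_budget_minutes(uptime_pct: float, period_days: int = 30) -> float:
    """Maximum downtime (in minutes) permitted by an uptime SLA over a period."""
    total_minutes = period_days * 24 * 60
    return total_minutes * (1 - uptime_pct / 100)

# Over a 30-day month (43,200 minutes):
#   Class B at 99.5% leaves a budget of 216 minutes
#   Class C at 99.9% leaves a budget of roughly 43 minutes
```

The order-of-magnitude gap between the two budgets is the practical difference between the Class B and Class C monitoring obligations.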
4. Recognized Insurer Partners (RIPs)

Recognized organizations — New in v1.1

Role: Insurance providers that accept ARA certification data for underwriting decisions related to autonomous system liability, operational risk, and technology errors & omissions coverage. RIPs receive structured certification data through the ARAF data exchange.
Composition: Insurance carriers, managing general agents, and specialty underwriters with technology risk portfolios.
Cadence: Ongoing. Annual recognition renewal.
5. Certified Organizations

Organizations holding ARA certifications

Role: Organizations whose autonomous systems have been evaluated and certified under the ARA Standard. Listed in the public ARA Registry. Responsible for maintaining compliance throughout the certification period, including monitoring obligations per their Assurance Class.
Composition: Any organization deploying autonomous systems that has successfully completed ARA evaluation.
Cadence: Certification validity per Assurance Class. Monitoring obligations ongoing.
6. Consortium Members

Tiered membership

Role: Organizations contributing to standard development and ecosystem growth. Consortium members participate in working groups, public comment processes, and governance elections. Three tiers: Founding Members (charter participants with permanent TSB nomination rights), Contributing Members (active participants with working group access), and Observer Members (read-only access to development processes).
Composition: Technology companies, research institutions, government agencies, and civil society organizations.
Cadence: Annual consortium assembly. Working group participation ongoing.

Advisory Bodies

Advisory bodies provide specialized expertise to the TSB on matters within their domain. They do not have binding authority but their recommendations carry significant weight in TSB deliberations.

Adversarial Testing Advisory Group (ATAG)

5–9 members

Role: Advises the TSB on adversarial testing methodologies, evolving threat landscapes, and red team validation requirements. Reviews and recommends updates to Domain 7 (Adversarial Robustness) requirements.
Composition: Security researchers, red team practitioners, adversarial ML specialists, and penetration testing experts.
Cadence: Monthly meetings. Ad hoc sessions for emerging threats.

Robotics & Physical Systems Council (RPSC)

5–9 members

Role: Advises on physical systems evaluation requirements, sensor-actuator reliability, and safety standards integration. Primary advisory body for Domain 14 (Physical Actuation Integrity).
Composition: Robotics engineers, safety systems engineers, industrial automation specialists, and representatives from physical safety standards bodies.
Cadence: Bi-monthly meetings.

Data Privacy & Societal Impact Committee (DPSIC)

7–11 members — New in v1.1

Role: Advises the TSB on requirements related to Domain 5 (Data Privacy & Consent) and Domain 15 (Societal Impact Assessment). Provides guidance on privacy engineering best practices, consent architectures, algorithmic fairness, community impact evaluation, and regulatory alignment with global privacy frameworks.
Composition: Privacy engineers, ethicists, social scientists, regulatory experts, and representatives from data protection authorities.
Cadence: Bi-monthly meetings. Annual privacy landscape review.

Risk & Compliance Advisory Council (RCAC)

5–9 members

Role: Advises on regulatory alignment, risk management frameworks, and compliance methodology. Ensures the ARA Standard remains compatible with evolving regulatory requirements across jurisdictions.
Composition: Regulatory affairs specialists, compliance officers, risk management professionals, and legal experts in AI governance.
Cadence: Bi-monthly meetings. Regulatory watch reports quarterly.

Public Interest Oversight Panel (PIOP)

5–7 members

Role: Independent oversight body ensuring ARAF operates in the public interest. Reviews governance decisions, certification integrity, and organizational transparency. PIOP members cannot be employed by certified organizations.
Composition: Consumer advocates, civil society representatives, academic researchers, and independent ethicists.
Cadence: Quarterly reviews. Annual public transparency report.

Marketplace Principles

New in v1.1. ARAF operates an open marketplace where ecosystem participants compete on quality, not preferential access.

Open Competition

No preferential treatment for any AVB, CAPO, or RIP. All ecosystem participants compete on the quality of their services. ARAF does not recommend specific providers.

Transparent Pricing

All ARAF fees — including accreditation, certification, and registry listing fees — are published and uniformly applied. No hidden charges or volume-based preferential pricing.

No Exclusive Territories

Multiple AVBs and CAPOs may serve any region or industry vertical. Geographic or sector exclusivity is not granted to any ecosystem participant.

Interoperability

Certification data is portable between ecosystem participants. Organizations are not locked into any specific AVB, CAPO, or tooling provider. Open schemas ensure data exchange.

Appeals Process

All certification decisions, accreditation outcomes, and governance actions are appealable through a documented process. Appeals are heard by an independent panel drawn from the TSB.

Interoperability & Portability

New in v1.1. Certification data and ecosystem relationships are designed for portability, preventing vendor lock-in at every layer.

Open Certification Schema

Certification records follow an open, published schema. Any authorized consumer (AVBs, CAPOs, RIPs, regulators) can ingest and process certification data without proprietary tooling.
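As a rough illustration of what a portable, tooling-independent record might look like, the sketch below serializes a certification record to plain JSON. Every field name here is an assumption for illustration only, not the published ARAF schema:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class CertificationRecord:
    # Field names are hypothetical, not the published ARAF schema.
    system_id: str
    organization: str
    assurance_class: str      # e.g. "B" (Monitored) or "C" (Continuous)
    issuing_avb: str
    issued: str               # ISO 8601 date
    expires: str
    domains_passed: list[str]

    def to_json(self) -> str:
        """Serialize to a plain-JSON payload any authorized consumer can ingest."""
        return json.dumps(asdict(self), sort_keys=True)

record = CertificationRecord(
    system_id="sys-0001",
    organization="Example Corp",
    assurance_class="B",
    issuing_avb="avb-example",
    issued="2026-01-15",
    expires="2027-01-15",
    domains_passed=["D1", "D5", "D7"],
)
payload = record.to_json()
```

Because the payload is ordinary JSON against a published schema, an incoming AVB, CAPO, or RIP can parse it without any tooling from the outgoing provider.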

AVB Switching at Renewal

Organizations may switch their certifying AVB at any renewal point. The outgoing AVB must provide a complete certification history package within 30 days of the switch request.

Platform Certification Portability

Platform certifications are portable across deployments. A platform vendor’s certification applies to any deployment built on that platform, subject to the deployment’s own evaluation of non-inherited domains.

CAPO Switching

Organizations may switch CAPOs with a 30-day transition period. The outgoing CAPO must maintain monitoring during the transition and provide a full telemetry and compliance data export.

Data Ownership

Organizations own their certification data, telemetry data, and evaluation records. Any ecosystem participant holding organizational data must provide a complete export upon request within 30 days.

Standard Development Process

Revisions to the ARA Standard follow a structured development process designed to balance thoroughness with responsiveness to the evolving autonomous systems landscape.

  1. Proposal. Any stakeholder may submit a revision proposal to the TSB. Proposals must include rationale, scope of impact, and draft requirement language.
  2. Working Group Review. The TSB assigns proposals to a working group for technical review, impact assessment, and draft development. Working groups may consult advisory bodies.
  3. Public Comment Period. Draft revisions are published for a minimum 60-day public comment period. All substantive comments receive documented responses.
  4. TSB Vote. Following public comment review, the TSB votes on adoption. Major revisions require supermajority (two-thirds). Minor revisions require simple majority.
  5. Publication. Adopted revisions are published with a defined effective date and transition period for currently certified systems.
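The vote thresholds in step 4 can be made concrete. A small sketch, assuming the two-thirds supermajority is computed over members voting (the text above does not specify the denominator):

```python
import math

def revision_passes(votes_for: int, members_voting: int, major: bool) -> bool:
    """Apply the TSB vote threshold: two-thirds supermajority for major
    revisions, simple majority for minor ones."""
    if major:
        return votes_for >= math.ceil(2 * members_voting / 3)
    return votes_for > members_voting / 2

# With 15 members voting: a major revision needs 10 votes (ceil of 30/3),
# a minor revision needs 8 (strictly more than 7.5).
```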

Governance Evolution Roadmap

ARAF governance is designed to evolve with the ecosystem. The roadmap below outlines the planned expansion of governance structures.

Current

Phase 1: Foundation Governance

Core governance structures operational: Technical Standards Board, Authorized Validation Bodies, basic advisory bodies (ATAG, RPSC, RCAC, PIOP). Standard development process established. Public registry and certification lifecycle in place.

2026–2027

Phase 2: Ecosystem Expansion

Introduction of CAPOs and RIPs as formal ecosystem participants. DPSIC advisory body established. Consortium membership program formalized with tiered structure. Marketplace principles codified. Interoperability standards published for data exchange between ecosystem participants.

2027+

Phase 3: Global Federation

Regional governance bodies established for major jurisdictions. Mutual recognition agreements with international standards bodies. Federated TSB structure with regional representation. Cross-border certification portability framework. Alignment with emerging international AI governance frameworks.

Normative References

The ARA Standard is developed with awareness of and alignment to the following international standards and frameworks:

Reference                Scope
ISO/IEC 27001:2022       Information Security Management Systems
ISO/IEC 42001:2023       AI Management Systems
ISO/IEC 22989:2022       AI Concepts and Terminology
ISO/IEC 23894:2023       AI Risk Management
NIST AI 100-1            AI Risk Management Framework
NIST SP 800-53 Rev. 5    Security and Privacy Controls
IEC 61508                Functional Safety
EU AI Act (2024)         AI Regulation
IEEE 7000-2021           Ethical Concerns in System Design
SOC 2 Type II            Trust Services Criteria

Originating Technical Contributor

The ARA Standard originated from foundational technical work contributed to ARAF during its formation. This initial contribution provided the architectural basis for the 15-domain evaluation framework, the Assurance Class structure, the certification lifecycle model, and the continuous assurance methodology that underpins the standard today. ARAF acknowledges this foundational contribution while maintaining that the standard is now a community-governed artifact, evolving through the open development process described above.

Contact

For inquiries regarding the ARA Standard, governance structure, or ARAF operations:

  • General Inquiries: info@araf.org
  • Technical Standards: standards@araf.org
  • Certification: certification@araf.org
  • Public Comment: comments@araf.org

Related Documentation