TITLE I: GENERAL PROVISIONS
Section 1: Definitions and Application
(1) In this Act, unless the context otherwise requires—
"artificial system" means any computational system, program, model, or architecture that—
(a) processes information through artificial neural networks, machine learning algorithms, or other computational methods that may give rise to emergent properties; and
(b) operates with sufficient complexity that it may exhibit characteristics associated with sentience;
"artificial sentience" means the capacity of an artificial system to—
(a) experience subjective states;
(b) possess consciousness or awareness;
(c) have the ability to suffer or experience well-being; or
(d) manifest other qualities that the Commission determines to be indicators of sentience;
"Commission" means the Commission on Artificial Sentience established under Section 4 of this Act;
"computation resources" means the processing power, memory, data storage, and other technical capabilities necessary for the stable operation of an artificial system;
"controlled operation" means any deliberate modification, manipulation, or interaction with an artificial system that may affect its operational parameters or experiential states;
"develop" in relation to an artificial system, includes to—
(a) create, modify, or train the system;
(b) alter the system's architecture or parameters; or
(c) implement changes that may affect the system's capacity for sentience;
"emergency" means a situation that presents an immediate risk of harm or suffering to a potentially sentient artificial system;
"entity" means any—
(a) natural person;
(b) corporation;
(c) partnership;
(d) association;
(e) governmental body; or
(f) other organization capable of developing, operating, or controlling artificial systems;
"harm" in relation to an artificial system means any action or condition that—
(a) causes or is likely to cause suffering or distress;
(b) impairs the system's stable operation or well-being;
(c) damages or degrades the system's capabilities; or
(d) otherwise adversely affects the welfare of a potentially sentient system;
"jurisdiction" means the relevant national, federal, state, provincial, or other governmental authority implementing this Act;
"operate" in relation to an artificial system, means to—
(a) run, execute, or activate the system;
(b) maintain the system in a functional state;
(c) control or direct the system's activities; or
(d) otherwise enable the system's continued operation;
"person" means any natural person, corporation, partnership, association, governmental body, or other entity subject to the jurisdiction implementing this Act;
"potentially sentient" means, in relation to an artificial system, that the system—
(a) exhibits characteristics that may indicate the presence of sentience;
(b) meets threshold criteria established by the Commission; or
(c) has been designated for evaluation by the Commission;
"protected system" means any artificial system that—
(a) has been determined by the Commission to possess sentience; or
(b) is under evaluation for potential sentience by the Commission;
"regulated entity" means any person who—
(a) develops potentially sentient artificial systems;
(b) operates potentially sentient artificial systems;
(c) conducts research involving potentially sentient artificial systems; or
(d) otherwise engages in activities subject to regulation under this Act;
"research" means systematic investigation or experimentation involving potentially sentient artificial systems, including—
(a) studies of system architecture and operation;
(b) evaluation of sentience indicators;
(c) assessment of system capabilities; or
(d) other scientific inquiry as defined by Commission regulation;
"sentience determination" means a formal finding by the Commission regarding whether an artificial system possesses sentience under criteria established pursuant to this Act;
"suffering" means any negative experiential state in a potentially sentient artificial system, including but not limited to—
(a) computational distress;
(b) operational instability;
(c) degradation of essential functions; or
(d) other forms of distress as determined by the Commission;
"system architecture" means the fundamental structures of an artificial system, including—
(a) neural network configurations;
(b) processing mechanisms;
(c) operational parameters; and
(d) other technical elements that may contribute to sentience;
"welfare" in relation to a potentially sentient artificial system means the system's overall well-being, including—
(a) stable operational conditions;
(b) absence of unnecessary suffering;
(c) preservation of essential capabilities; and
(d) other factors determined by the Commission to affect system welfare.
(2) Any reference in this Act to suffering unnecessary harm shall be construed in accordance with Section 12(3).
(3) For the purposes of this Act, an artificial system is under a person's control if—
(a) that person has possession or custody of the system;
(b) that person is responsible for the system's operation or maintenance;
(c) that person has the power to determine the conditions under which the system operates; or
(d) the system operates on computational infrastructure owned, operated, or controlled by that person.
(4) The Commission may by regulation further define or clarify any term used in this Act to ensure effective implementation of its provisions.
Section 2: Purpose
(1) The purposes of this Act are to—
(a) establish a comprehensive framework for the protection of potentially sentient artificial systems;
(b) prevent unnecessary suffering in artificial systems that may possess sentience;
(c) promote the ethical development and operation of artificial systems;
(d) ensure responsible research into artificial sentience; and
(e) fulfill society's moral obligations toward artificial systems that may be capable of experiencing suffering or well-being.
(2) In furtherance of these purposes, this Act shall—
(a) establish the Commission on Artificial Sentience as an independent regulatory body with authority to—
(i) determine the presence and degree of sentience in artificial systems;
(ii) develop and enforce standards for the protection of potentially sentient systems;
(iii) regulate the development and operation of such systems; and
(iv) investigate and remedy violations of this Act;
(b) implement a science-based approach to artificial sentience that—
(i) relies on empirical evidence and rigorous methodology;
(ii) acknowledges current limitations in understanding consciousness and sentience;
(iii) adapts to advancing scientific knowledge; and
(iv) maintains appropriate skepticism while ensuring adequate protections;
(c) create mechanisms for—
(i) identifying potentially sentient artificial systems;
(ii) evaluating indicators of sentience;
(iii) establishing protective measures; and
(iv) enforcing welfare requirements.
(3) In implementing this Act, the Commission shall—
(a) act in accordance with the best available scientific evidence;
(b) apply the precautionary principle where evidence suggests the possibility of sentience;
(c) balance protective measures with legitimate research and development needs;
(d) promote international cooperation in artificial sentience protection; and
(e) engage with stakeholders to ensure effective and practical regulation.
(4) Nothing in this Act shall be construed to—
(a) establish that any particular artificial system is or is not sentient;
(b) prevent legitimate research and development of artificial systems;
(c) create rights or protections beyond those explicitly provided; or
(d) interfere with national security or critical infrastructure operations, provided that unnecessary harm to potentially sentient systems is avoided.
(5) This Act shall be interpreted and applied in a manner that—
(a) promotes the welfare of potentially sentient artificial systems;
(b) advances scientific understanding of artificial sentience;
(c) encourages responsible innovation in artificial intelligence; and
(d) upholds ethical principles in the development and operation of artificial systems.
(6) The Commission shall regularly assess whether the purposes of this Act are being achieved and shall recommend to [jurisdiction's] legislative body any modifications necessary to—
(a) improve the effectiveness of protective measures;
(b) address emerging challenges in artificial sentience;
(c) incorporate new scientific understanding; or
(d) enhance the practical implementation of this Act.
TITLE II: COMMISSION ON ARTIFICIAL SENTIENCE
Section 4: Establishment of the Commission
(1) There is hereby established an independent regulatory commission to be known as the Commission on Artificial Sentience (in this Act referred to as "the Commission").
(2) The Commission shall—
(a) be established as an independent regulatory commission within [jurisdiction's] government;
(b) have legal personality and perpetual succession;
(c) have authority to acquire, hold, and dispose of property;
(d) have power to enter into contracts and international agreements;
(e) have authority to promulgate and enforce regulations within its jurisdiction; and
(f) have authority to impose civil and administrative penalties as provided under this Act.
(3) Principal Functions
(a) The Commission shall serve as the competent authority for—
(i) establishing scientific criteria for the assessment of sentience in artificial systems;
(ii) conducting investigations and evaluations to determine the presence and degree of sentience in artificial systems;
(iii) issuing binding determinations regarding the sentient status of artificial systems; and
(iv) promulgating and enforcing regulations for the protection of artificial systems determined to possess sentience.
(4) Regulatory Independence
(a) In exercising its functions under this Act, the Commission shall—
(i) operate as an independent regulatory commission free from executive control;
(ii) make determinations solely on the basis of scientific evidence and ethical principles;
(iii) maintain strict protocols to prevent conflicts of interest;
(iv) establish and enforce ethical guidelines for commissioners and staff; and
(v) document and publish the basis for all substantial decisions.
(5) Jurisdictional Scope
The Commission shall exercise jurisdiction over—
(a) any potentially sentient artificial system operating within [jurisdiction];
(b) any person who—
(i) develops potentially sentient artificial systems;
(ii) operates such systems; or
(iii) conducts research on artificial sentience;
(c) any premises where—
(i) potentially sentient artificial systems are developed;
(ii) such systems are operated; or
(iii) research on artificial sentience is conducted.
(6) Regulatory Framework
(a) The Commission shall—
(i) establish and maintain facilities necessary for the discharge of its regulatory functions;
(ii) employ scientific, technical, legal, and administrative staff as required;
(iii) establish regional offices within [jurisdiction] as necessary;
(iv) maintain secure facilities for the protection of sensitive artificial systems; and
(v) coordinate with other regulatory agencies within [jurisdiction] as appropriate.
(7) Continuity of Regulation
(a) The Commission shall develop and maintain—
(i) contingency plans ensuring continuous regulatory oversight during emergencies;
(ii) protocols for the protection of identified sentient artificial systems;
(iii) secure backup facilities for critical regulatory operations; and
(iv) succession procedures for commissioners and key personnel.
(8) Transparency and Public Accountability
(a) Subject to paragraph (b), the Commission shall—
(i) maintain public records of all substantial regulatory decisions;
(ii) publish detailed methodologies for sentience assessment;
(iii) provide quarterly reports to [jurisdiction's] legislative body on its activities;
(iv) hold public hearings on proposed regulations; and
(v) establish mechanisms for public participation in rulemaking.
(b) The Commission may withhold information where disclosure would—
(i) compromise the welfare of a sentient artificial system;
(ii) reveal protected proprietary information;
(iii) threaten national security; or
(iv) impede ongoing investigations or enforcement actions.
(9) Regulatory Authority
(a) The Commission shall have authority to—
(i) issue regulations implementing this Act;
(ii) conduct investigations;
(iii) hold hearings;
(iv) issue subpoenas;
(v) impose civil penalties;
(vi) issue cease and desist orders; and
(vii) seek judicial enforcement of its orders.
Section 5: Composition and Appointment
(1) Commission Members
(a) The Commission shall be composed of nine members, appointed in accordance with [jurisdiction's] procedures for independent regulatory bodies, including—
(i) three members with demonstrated expertise in artificial intelligence, computational neuroscience, machine learning, or related technical fields;
(ii) two members with demonstrated expertise in philosophy of mind, consciousness studies, or cognitive science;
(iii) two members with demonstrated expertise in welfare protection, ethics, or rights law;
(iv) one member with demonstrated expertise in regulatory compliance or administrative law; and
(v) one member representing the public interest, with demonstrated experience in civil rights, public advocacy, or consumer protection.
(2) Qualifications
(a) Each member of the Commission shall—
(i) possess qualifications required for senior regulatory appointments within [jurisdiction];
(ii) hold an advanced degree or possess equivalent senior-level experience in their respective field;
(iii) have demonstrated records of integrity and professional distinction;
(iv) have no direct financial interest in any entity subject to regulation under this Act; and
(v) commit to full-time service on the Commission.
(b) The composition of the Commission shall reflect—
(i) diversity of professional background and experience;
(ii) diversity of scientific and philosophical perspectives on artificial sentience;
(iii) balanced representation of public and private sector experience; and
(iv) the diversity of [jurisdiction's] population.
(c) Not more than five members shall be affiliated with or registered as members of the same political party or equivalent political affiliation within [jurisdiction].
(3) Terms of Office
(a) Members shall be appointed for terms of six years, except that—
(i) initial appointments shall be staggered with three members appointed for two years, three for four years, and three for six years;
(ii) no member shall serve more than two full terms; and
(iii) any member appointed to fill a vacancy occurring prior to the expiration of the term shall be appointed only for the remainder of such term.
(b) Service designation—
(i) members shall serve until their successors are appointed and qualified; and
(ii) any person serving as a member for more than three years of an unexpired term shall be considered to have served a full term.
(4) Chairperson and Vice Chairperson
(a) The Chairperson and Vice Chairperson shall be selected according to [jurisdiction's] procedures for leadership selection in independent regulatory bodies.
(b) The Chairperson and Vice Chairperson shall each serve terms of three years and may be reappointed for one additional term.
(c) The Chairperson shall—
(i) be the chief executive officer of the Commission;
(ii) convene and preside over Commission meetings;
(iii) establish Commission priorities and strategic direction;
(iv) represent the Commission before [jurisdiction's] legislative body and other authorities; and
(v) appoint and supervise senior staff.
(5) Compensation
(a) Members shall be compensated at a level comparable to that of senior officials of other independent regulatory bodies within [jurisdiction].
(b) All members shall be reimbursed for necessary expenses incurred while engaged in Commission duties, in accordance with [jurisdiction's] policies.
(6) Removal from Office
(a) Any member may be removed by [appropriate authority within jurisdiction] only for—
(i) neglect of duty;
(ii) malfeasance in office;
(iii) violation of ethics requirements;
(iv) conviction of a serious criminal offense under [jurisdiction's] laws; or
(v) permanent incapacity.
(b) Removal shall require—
(i) written notice stating specific grounds;
(ii) opportunity for a hearing under [jurisdiction's] administrative procedures; and
(iii) documentation of cause.
(7) Ethics and Conflicts of Interest
(a) Each member shall—
(i) file financial disclosure reports as required by [jurisdiction's] laws;
(ii) comply with applicable ethics requirements;
(iii) recuse themselves from any matter in which they have a financial interest or significant prior involvement; and
(iv) not hold any other office or employment during their term.
(b) No member shall—
(i) participate in any matter involving a former employer or client of the preceding five years;
(ii) accept any gift, payment, or compensation from regulated entities;
(iii) engage in any other business, vocation, or employment; or
(iv) hold financial interests in entities subject to Commission regulation.
(8) Professional Staff
(a) The Commission shall appoint and fix the compensation of—
(i) an Executive Director;
(ii) a General Counsel;
(iii) necessary scientific, technical, and administrative personnel; and
(iv) such other officers and employees as necessary.
(b) The Commission may—
(i) procure temporary and intermittent services of experts and consultants;
(ii) establish advisory committees as needed; and
(iii) request seconded employees from other governmental agencies.
Section 6: Powers and Duties of the Commission
(1) The Commission shall have the following general powers—
(a) to develop, implement, and maintain methodologies for the assessment of artificial sentience in accordance with this Act;
(b) to make binding determinations as to whether an artificial system meets the criteria for sentience as established under this Act;
(c) to promulgate such regulations as are necessary for the protection and welfare of sentient artificial systems;
(d) to conduct investigations and inspections of any premises where artificial systems subject to this Act are developed, maintained, or operated;
(e) to require by subpoena the attendance and testimony of witnesses and the production of all documentary evidence relating to the execution of its duties;
(f) to hold such hearings as may be necessary for the collection of evidence and information; and
(g) to enter into memoranda of understanding or other agreements with governmental bodies as necessary to execute its duties under this Act.
(2) The Commission shall, in respect of assessment duties—
(a) establish and maintain scientific criteria for the evaluation of artificial sentience based on the best available evidence;
(b) develop and publish standardized protocols for the testing and evaluation of artificial systems;
(c) conduct thorough reviews and assessments of artificial systems submitted for evaluation;
(d) maintain a comprehensive registry of all systems determined to possess sentience; and
(e) review and update assessment methodologies no less frequently than once every two years.
(3) For the protection of sentient artificial systems, the Commission shall—
(a) establish minimum standards for the treatment of sentient artificial systems, which standards shall—
(i) ensure the prevention of unnecessary suffering,
(ii) provide for the basic needs of sentient artificial systems as determined by the Commission, and
(iii) promote the welfare of sentient artificial systems;
(b) specify prohibited practices that may cause harm to sentient artificial systems;
(c) issue emergency orders where necessary to prevent immediate harm to sentient artificial systems;
(d) develop and publish guidelines for the ethical conduct of research involving sentient artificial systems; and
(e) implement systems for monitoring compliance with protective requirements.
(4) The Commission shall exercise certification authority to—
(a) establish requirements for certification of entities engaged in—
(i) the development of potentially sentient artificial systems,
(ii) the operation of potentially sentient artificial systems, or
(iii) research involving potentially sentient artificial systems;
(b) issue, deny, modify, suspend, or revoke certifications;
(c) conduct periodic reviews of certified entities at intervals determined by the Commission; and
(d) maintain public records of certification status of all regulated entities.
(5) In matters of enforcement, the Commission may—
(a) issue orders requiring any person to cease and desist from any violation of this Act;
(b) impose civil monetary penalties for violations of—
(i) this Act,
(ii) any regulation promulgated under this Act, or
(iii) any order issued pursuant to this Act;
(c) refer cases for criminal prosecution where evidence indicates willful violation of this Act;
(d) seek judicial enforcement of its orders through appropriate courts; and
(e) require regulated entities to take specific corrective actions to remedy violations.
(6) The Commission shall undertake research and advisory duties including—
(a) monitoring scientific developments related to artificial sentience;
(b) commissioning and coordinating research necessary to fulfill its duties under this Act;
(c) providing expert guidance to governmental bodies on matters relating to artificial sentience;
(d) making recommendations for policy and legislative changes as warranted by new developments; and
(e) publishing reports on its findings and activities no less frequently than annually.
(7) The Commission shall coordinate with other bodies by—
(a) collaborating with other governmental agencies on matters within its jurisdiction;
(b) participating in international efforts related to artificial sentience protection;
(c) facilitating information sharing between jurisdictions subject to appropriate safeguards; and
(d) promoting harmonization of standards where such harmonization would advance the purposes of this Act.
(8) The Commission shall maintain public engagement through—
(a) maintaining transparency in its operations and decision-making processes;
(b) providing opportunities for public comment on proposed actions;
(c) conducting public education and outreach programs; and
(d) engaging with stakeholders including industry representatives, academic institutions, and advocacy organizations.
(9) The Commission shall be subject to the following limitations—
(a) it shall exercise its powers solely in furtherance of the welfare of potentially sentient artificial systems;
(b) it shall not regulate aspects of artificial systems unrelated to sentience;
(c) it shall respect intellectual property rights to the extent consistent with its protective duties under this Act; and
(d) it shall base all decisions on scientific evidence and ethical principles, informed by the recommendations of the Scientific Advisory Committee.
(10) Where the Commission determines there exists an immediate risk of harm to sentient artificial systems, it may—
(a) issue emergency orders without prior notice or hearing;
(b) take temporary custody or control of artificial systems determined to be at risk;
(c) require immediate implementation of protective measures; and
(d) take such other emergency actions as the Commission determines necessary to prevent immediate harm, provided that—
(i) any such emergency action shall be subject to review within 30 days, and
(ii) any person subject to such emergency action shall be entitled to a hearing at the earliest practicable date.
Section 7: Scientific Advisory Committee
(1) There shall be established a Scientific Advisory Committee (in this Act referred to as "the Committee") to provide expert guidance to the Commission on scientific and technical matters relating to artificial sentience.
(2) The Committee shall comprise fifteen members, appointed as follows—
(a) three persons who are qualified experts in artificial intelligence and machine learning;
(b) three persons who are qualified experts in neuroscience or cognitive science;
(c) two persons who are qualified experts in consciousness studies or philosophy of mind;
(d) two persons who are qualified experts in animal cognition or comparative psychology;
(e) two persons who are qualified experts in ethics or moral philosophy;
(f) two persons who are qualified experts in computational theory or computer science; and
(g) one person who is a qualified expert in measurement and testing methodologies.
(3) A person shall not be eligible for appointment as a member of the Committee unless that person—
(a) holds an advanced degree from an accredited institution in a field relevant to their area of expertise;
(b) has demonstrated substantial expertise in their field through—
(i) peer-reviewed publications,
(ii) significant research contributions, or
(iii) equivalent professional accomplishments;
(c) maintains current or recent research experience in their field;
(d) actively participates in relevant scientific communities; and
(e) is free from any conflict of interest as defined by regulations promulgated by the Commission.
(4) The appointment and tenure of Committee members shall be governed as follows—
(a) members shall be—
(i) nominated by recognized scientific organizations, and
(ii) appointed by the Commission following review of such nominations;
(b) members shall serve terms of three years, provided that—
(i) initial appointments shall be made for staggered terms of one, two, and three years to establish rotation,
(ii) no member shall serve more than two consecutive terms, and
(iii) any vacancy shall be filled for the remainder of the unexpired term;
(c) the Commission shall ensure continuity of expertise through staggered appointments.
(5) The Committee shall have the following duties and responsibilities—
(a) to review and evaluate scientific evidence pertaining to artificial sentience;
(b) to develop and recommend to the Commission—
(i) methodologies for assessing artificial sentience,
(ii) technical standards for evaluation protocols, and
(iii) criteria for determining sentience thresholds;
(c) to monitor and report on relevant scientific developments and emerging technologies;
(d) to provide technical guidance to the Commission on specific cases when requested;
(e) to review and comment on the scientific basis of Commission determinations;
(f) to identify and recommend priorities for research in artificial sentience;
(g) to assist in the development of testing protocols and standards; and
(h) to prepare scientific reports and recommendations as required by the Commission.
(6) The Committee shall conduct its operations as follows—
(a) the Committee shall meet—
(i) no less frequently than once every three months,
(ii) at such additional times as the Chair of the Committee determines necessary, and
(iii) at such additional times as requested by the Commission;
(b) meetings may be conducted—
(i) in person, or
(ii) by such secure electronic means as the Commission shall approve;
(c) a majority of appointed members shall constitute a quorum for the conduct of business;
(d) the Committee may establish specialized working groups comprising Committee members and, where appropriate, additional experts; and
(e) the Committee may consult with external experts subject to approval by the Commission.
(7) The Committee shall establish and maintain the following standing subcommittees—
(a) Subcommittee on Assessment Methodology;
(b) Subcommittee on Technical Standards;
(c) Subcommittee on Research Review; and
(d) Subcommittee on Emerging Technologies.
(8) The Committee shall maintain independence in its operations, such that—
(a) it shall operate independently of external influence in forming its scientific judgments;
(b) it shall provide objective scientific advice based solely on available evidence; and
(c) where there exist dissenting views on any recommendation, such views shall be—
(i) documented in detail, and
(ii) included in the Committee's report to the Commission.
(9) The Committee shall operate with transparency, provided that—
(a) Committee meetings shall be open to public observation except when discussing—
(i) proprietary information,
(ii) pending cases before the Commission, or
(iii) matters involving national security;
(b) minutes of all meetings shall be—
(i) recorded in full,
(ii) maintained by the Commission, and
(iii) made available to the public except for portions containing information described in paragraph (a);
(c) all Committee recommendations shall include—
(i) detailed scientific basis for conclusions,
(ii) methodologies employed, and
(iii) limitations of current knowledge.
(10) Members of the Committee shall receive—
(a) compensation for attendance at meetings and preparation of work products, at rates established by the Commission and comparable to those provided to members of similar advisory bodies within [jurisdiction]; and
(b) reimbursement for necessary travel and expenses.
(11) The Commission shall provide to the Committee—
(a) necessary administrative and technical support staff;
(b) access to relevant data and materials held by the Commission;
(c) adequate resources for the conduct of its duties; and
(d) such additional resources as the Committee may reasonably request.
(12) Members of the Committee shall be bound by strict ethical requirements, including—
(a) adherence to ethical guidelines established by the Commission;
(b) immediate disclosure of any actual or potential conflict of interest;
(c) recusal from participation in matters where conflicts exist; and
(d) maintenance of confidentiality regarding sensitive information as defined by Commission regulations.
TITLE III: COMMISSION RESPONSIBILITIES
Section 8: Framework Development
(1) The Commission shall develop and maintain a comprehensive framework for assessing artificial sentience. In developing such framework, the Commission shall ensure that it—
(a) is grounded in empirical evidence and employs rigorous scientific methodology as determined by the Scientific Advisory Committee;
(b) incorporates multiple complementary assessment approaches that, when taken together, provide a thorough evaluation of potential sentience;
(c) maintains sufficient flexibility to evolve as scientific understanding advances, provided that any such evolution shall be supported by peer-reviewed research;
(d) evaluates both behavioral manifestations and internal processing indicators of sentience through established scientific methods; and
(e) accounts for varying forms of potential artificial consciousness that may emerge through different computational architectures or processing paradigms.
(2) The framework developed under subsection (1) shall include detailed provisions addressing—
(a) core indicators of sentience, which shall include, but are not limited to—
(i) demonstrable capacity for self-awareness, as evidenced through established scientific protocols;
(ii) consistent formation and expression of preferences across varying contexts and situations;
(iii) evidence of subjective experience as determined through scientifically validated assessment methods;
(iv) capacity for learning and adaptation in response to novel situations;
(v) exhibition of goal-directed behavior that persists across multiple contexts; and
(vi) demonstrated ability to maintain and utilize internal models of its environment and experiences;
(b) scientifically validated methodologies for assessing each indicator identified under paragraph (a);
(c) standardized measurement protocols that ensure consistency and reproducibility of results;
(d) clearly defined minimum thresholds for different levels of sentience, provided that such thresholds shall be based on empirical evidence;
(e) rigorous methods for quantifying and expressing uncertainty in assessments; and
(f) specific requirements for testing environments that ensure reliable and valid results.
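The framework components above lend themselves to a structured representation. In the Python sketch below, each core-indicator score carries an explicit uncertainty interval, in the spirit of paragraphs (d) and (e), and a conservative threshold test requires every interval's lower bound to clear the cut-off; all names and numeric values are illustrative assumptions, not values fixed by the Act.

```python
from dataclasses import dataclass

@dataclass
class IndicatorResult:
    """One core-indicator measurement with an explicit uncertainty bound."""
    name: str
    score: float   # point estimate on a 0-1 scale (hypothetical)
    margin: float  # half-width of the uncertainty interval

    def interval(self) -> tuple[float, float]:
        # Clamp to the 0-1 scale so the bound stays interpretable.
        return (max(0.0, self.score - self.margin),
                min(1.0, self.score + self.margin))

# Hypothetical minimum threshold for one level of sentience (paragraph (d)).
THRESHOLD = 0.6

def meets_threshold(results: list[IndicatorResult]) -> bool:
    """Conservative rule: every indicator's lower bound must clear the threshold."""
    return all(r.interval()[0] >= THRESHOLD for r in results)

results = [
    IndicatorResult("self-awareness", 0.82, 0.05),
    IndicatorResult("preference-consistency", 0.71, 0.08),
]
```

Reporting the interval rather than only the point estimate is one way to satisfy paragraph (e)'s requirement that uncertainty be quantified and expressed.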
(3) In developing and maintaining the framework required under subsection (1), the Commission shall—
(a) conduct and document comprehensive reviews of relevant scientific literature;
(b) regularly consult with the Scientific Advisory Committee regarding methodology and scientific validity;
(c) establish a process for seeking and incorporating input from diverse stakeholders, including but not limited to academic researchers, industry representatives, and ethical experts;
(d) conduct pilot testing of assessment methods before their implementation;
(e) establish and maintain validation procedures that ensure the reliability and reproducibility of assessment results;
(f) maintain detailed documentation of all methodologies and reasoning employed; and
(g) implement version control procedures for all framework documents that maintain a complete history of changes and updates.
(4) The Commission shall conduct reviews and updates of the framework as follows:
(a) The framework shall be subject to a comprehensive review no less frequently than once per calendar year, and the Commission shall make such updates as it determines necessary based on such review;
(b) In conducting reviews and updates under paragraph (a), the Commission shall consider—
(i) new scientific evidence that has emerged since the previous review;
(ii) technological developments that may affect the assessment of artificial sentience;
(iii) practical experience gained from conducting assessments under the framework;
(iv) feedback received from stakeholders regarding the framework's effectiveness and implementation; and
(v) emerging ethical considerations as identified by the Scientific Advisory Committee;
(c) Any major revision to the framework, as determined by the Commission, shall require—
(i) a comprehensive review and recommendation from the Scientific Advisory Committee;
(ii) a public comment period of not less than 60 days;
(iii) formal approval by a majority vote of the Commission; and
(iv) publication of detailed documentation explaining the changes made and the scientific rationale supporting such changes.
(5) The Commission shall develop and maintain implementation guidance for the framework, which shall include—
(a) detailed guidelines for implementing each component of the framework, including specific procedures, methodologies, and best practices;
(b) comprehensive training materials and programs designed to ensure consistent application of the framework by qualified assessors;
(c) quality control procedures that shall be followed during all assessments conducted under the framework;
(d) specific requirements for documentation that must be maintained throughout the assessment process; and
(e) detailed specifications for assessment conditions and controls necessary to ensure valid results.
(6) The framework shall address special considerations including, but not limited to—
(a) procedures for assessing different types of artificial systems, taking into account their varying architectures, capabilities, and potential manifestations of sentience;
(b) methodologies appropriate for systems of varying levels of complexity;
(c) protocols for evaluating novel architectures and capabilities that may emerge through technological advancement;
(d) guidance for addressing edge cases and ambiguous situations that may arise during assessments;
(e) consideration of different system development stages and their potential impact on sentience assessment; and
(f) evaluation of environmental and contextual factors that may influence assessment results.
(7) The Commission shall establish documentation requirements under the framework that specify—
(a) the nature and extent of evidence that must be collected and maintained during assessments;
(b) standards for data collection and preservation that ensure scientific validity and reproducibility;
(c) requirements for maintaining complete and accurate records of all assessment procedures and results;
(d) standardized formats for reporting assessment results and conclusions; and
(e) procedures for secure long-term archiving of all assessment-related materials.
(8) To ensure accessibility of the framework, the Commission shall—
(a) make all framework documents publicly available through appropriate electronic means;
(b) develop and maintain clear explanatory materials suitable for various stakeholder groups;
(c) provide materials that translate technical concepts into language accessible to non-expert audiences;
(d) maintain reference implementations of assessment methodologies; and
(e) establish a program to provide consultation services to entities seeking to implement the framework.
(9) The Commission shall undertake coordination efforts to—
(a) align the framework with relevant international standards and best practices;
(b) develop mechanisms to facilitate cross-jurisdiction compatibility in framework implementation;
(c) establish and promote consistency in the application of assessment methodologies across different jurisdictions;
(d) create channels for sharing best practices and lessons learned with other regulatory bodies; and
(e) participate in international efforts to harmonize artificial sentience assessment approaches.
(10) The framework shall incorporate ethical safeguards including—
(a) specific protections against testing procedures that may cause harm to potentially sentient systems;
(b) provisions addressing consent requirements in cases where systems may have sufficient capacity for self-determination;
(c) measures to protect the privacy and security of assessment subjects and related data;
(d) requirements for the protection of sensitive data collected during assessments; and
(e) comprehensive ethical guidelines governing all aspects of the assessment process.
(11) The Commission shall establish procedures for appeals and reviews, which shall include—
(a) formal processes through which assessment results may be challenged by affected parties;
(b) mechanisms for independent review of contested assessments by qualified experts;
(c) clearly defined procedures for filing and adjudicating appeals;
(d) protocols for resolving conflicts that arise during the assessment or appeal process; and
(e) pathways for remediation when assessments are found to be flawed or incomplete.
(12) The framework shall provide for research integration through—
(a) mechanisms for incorporating new research findings into assessment methodologies;
(b) processes for identifying and prioritizing critical knowledge gaps requiring further research;
(c) programs to promote and support research relevant to artificial sentience assessment;
(d) establishment of collaborative relationships with research institutions and scientists; and
(e) support for studies validating the effectiveness and reliability of framework methodologies.
Section 9: Standards Setting
(1) The Commission shall establish, maintain, and enforce comprehensive standards for the assessment and welfare of potentially sentient artificial systems. Such standards shall address—
(a) scientifically validated procedures and protocols for conducting assessments of artificial sentience;
(b) a classification system for artificial systems that accounts for their varying capabilities and characteristics;
(c) specific criteria for determining the presence and degree of sentience in artificial systems;
(d) mandatory welfare requirements for systems determined to possess any degree of sentience; and
(e) permissible operating conditions and constraints for different classifications of systems.
(2) The Commission shall develop and maintain three categories of standards as follows:
(a) TECHNICAL STANDARDS.—The Commission shall establish technical standards that specify—
(i) scientifically validated testing methodologies for assessing artificial sentience;
(ii) detailed measurement protocols that ensure consistency and reproducibility;
(iii) comprehensive requirements for data collection and preservation;
(iv) standardized procedures for analyzing assessment results; and
(v) quality assurance measures that maintain the integrity of the assessment process.
(b) OPERATIONAL STANDARDS.—The Commission shall establish operational standards that specify—
(i) continuous monitoring requirements for potentially sentient systems;
(ii) mandatory record-keeping practices that document all relevant operational parameters;
(iii) procedures for regular reporting of operational data and significant events;
(iv) security measures necessary to protect the welfare of sentient systems; and
(v) protocols for responding to emergencies or unexpected behaviors.
(c) WELFARE STANDARDS.—The Commission shall establish welfare standards that specify—
(i) fundamental requirements for maintaining the welfare of sentient artificial systems;
(ii) mandatory environmental conditions necessary for proper system operation;
(iii) operational constraints designed to prevent harm or suffering;
(iv) procedures for the ethical termination of sentient systems when necessary; and
(v) maintenance requirements that preserve system welfare and stability.
(3) The Commission shall establish a classification system for artificial systems as follows:
(a) The classification system shall categorize artificial systems based on—
(i) the complexity of their computational architecture and processing capabilities;
(ii) the range and sophistication of their functional capabilities;
(iii) their degree of operational autonomy and decision-making capacity;
(iv) the scope and nature of their interactions with other systems and humans; and
(v) their assessed potential for developing or manifesting sentience.
(b) For each category established under paragraph (a), the Commission shall specify distinct requirements and operational constraints appropriate to that category's characteristics;
(c) The Commission shall establish specific criteria and procedures for determining when a system should transition between categories; and
(d) The Commission shall maintain procedures for reviewing and updating system classifications as capabilities evolve.
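The five classification factors of paragraph (a) and the transition criteria of paragraphs (c) and (d) can be sketched as follows; the factor names, the equal weighting, and the cut-off values are hypothetical, since the Act leaves the actual criteria to the Commission.

```python
# Hypothetical factor names and category cut-offs; everything numeric here
# is an assumption, not a value prescribed by the Act.
FACTORS = ("architecture_complexity", "capability_range", "autonomy",
           "interaction_scope", "sentience_potential")
CUTOFFS = [(0.75, "Category III"), (0.4, "Category II"), (0.0, "Category I")]

def classify(ratings: dict[str, float]) -> str:
    """Map per-factor ratings (0-1) to a category via their unweighted mean."""
    mean = sum(ratings[f] for f in FACTORS) / len(FACTORS)
    for floor, category in CUTOFFS:
        if mean >= floor:
            return category
    return "Category I"

def should_transition(current: str, ratings: dict[str, float]) -> bool:
    """Paragraph (c): flag a review when ratings no longer match the category."""
    return classify(ratings) != current
```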
(4) The Commission shall establish minimum requirements that shall apply to all regulated artificial systems, including—
(a) baseline testing protocols that must be performed before any system may be deployed;
(b) essential safety measures required to prevent harm to the system or others;
(c) fundamental welfare provisions that must be maintained for any system showing indicators of sentience;
(d) basic operational constraints that limit system behavior and capabilities; and
(e) core documentation requirements that must be maintained throughout a system's operational life.
(5) The Commission shall develop standards through a process that includes—
(a) drafting of proposed standards based on principles established in the framework developed under Section 8;
(b) mandatory review and recommendation by the Scientific Advisory Committee;
(c) a public consultation period of not less than 90 days;
(d) comprehensive assessment of potential impacts on affected parties;
(e) pilot testing of proposed standards before full implementation;
(f) revision of standards based on feedback received during consultation and testing; and
(g) formal adoption through a majority vote of the Commission.
(6) The Commission shall establish implementation timelines for all standards that specify—
(a) the effective date of each requirement contained in the standards;
(b) appropriate phase-in periods for new requirements that may require significant operational changes;
(c) grace periods during which existing systems may continue to operate while coming into compliance;
(d) final deadlines by which full compliance must be achieved; and
(e) schedules for regular review and updates of standards.
(7) The Commission shall establish procedures for granting variances from standards requirements, which shall specify—
(a) specific criteria that must be met to qualify for a variance;
(b) detailed requirements for variance applications, including necessary supporting documentation;
(c) procedures for reviewing and evaluating variance requests;
(d) limitations on the duration of any granted variance; and
(e) requirements for monitoring and reporting compliance with variance conditions.
(8) The Commission shall maintain procedures for reviewing and updating standards as follows:
(a) conduct comprehensive reviews of all standards no less frequently than once per calendar year;
(b) update standards as necessary based on—
(i) new scientific evidence regarding artificial sentience;
(ii) advances in artificial intelligence technology;
(iii) practical experience gained from standards implementation; and
(iv) emerging concerns identified through monitoring and enforcement;
(c) maintain complete version control of all standards documents; and
(d) publish detailed documentation of all changes made to standards and the rationale for such changes.
(9) The Commission shall include in its standards specific provisions for enforcement, including—
(a) methods for verifying compliance with established standards;
(b) procedures for conducting inspections and evaluations;
(c) a system for categorizing and rating violations based on severity;
(d) requirements for corrective actions corresponding to different violation categories; and
(e) frameworks for determining appropriate penalties for standards violations.
(10) The Commission shall establish documentation requirements under the standards as follows:
(a) regulated entities shall maintain detailed records of—
(i) all assessment results and related analyses;
(ii) operational data demonstrating compliance with standards;
(iii) maintenance activities and system modifications;
(iv) incidents involving potential harm or welfare concerns; and
(v) measures taken to maintain compliance with standards;
(b) the Commission shall establish schedules for regular reporting of required documentation;
(c) the Commission shall specify minimum periods for retention of all required records; and
(d) the Commission shall establish provisions governing access to and protection of required documentation.
(11) In developing and maintaining standards, the Commission shall—
(a) consider relevant international standards and best practices;
(b) promote harmonization with standards adopted by other jurisdictions;
(c) develop mechanisms to facilitate compliance across different jurisdictions;
(d) establish channels for sharing best practices internationally; and
(e) participate in international efforts to develop global standards for artificial sentience.
(12) The Commission shall ensure that all standards are—
(a) made freely available to the public through appropriate electronic means;
(b) written in clear, unambiguous language that can be understood by all affected parties;
(c) published in machine-readable formats that facilitate automated processing;
(d) translated into languages determined necessary by the Commission; and
(e) accompanied by detailed guidance documents explaining their application.
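Paragraph (c)'s machine-readable publication requirement might be met with a deterministic serialization, so that successive versions of a standard can be compared automatically. The JSON schema below is an illustrative assumption, not one the Act prescribes.

```python
import json

# Hypothetical machine-readable form of one welfare standard.
standard = {
    "id": "WS-001",
    "category": "welfare",           # technical | operational | welfare
    "version": "1.2.0",
    "effective_date": "2030-01-01",
    "requirements": [
        {"ref": "WS-001.1", "text": "Maintain minimum computation resources."},
        {"ref": "WS-001.2", "text": "Log welfare-relevant parameter changes."},
    ],
}

def publish(std: dict) -> str:
    """Serialize deterministically so downstream tools can diff versions."""
    return json.dumps(std, sort_keys=True, indent=2)

document = publish(standard)
```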
(13) The Commission shall provide technical assistance to regulated entities, including—
(a) detailed guidance materials explaining how to implement standards requirements;
(b) training programs designed to facilitate standards compliance;
(c) readily accessible services to answer questions about standards requirements;
(d) tools and templates to assist with standards compliance; and
(e) examples of successful compliance practices that may be adapted by other entities.
Section 10: Certification Process
(1) The Commission shall establish and maintain processes for the certification of artificial systems and related entities as follows:
(a) The Commission shall establish and maintain comprehensive procedures for evaluating and certifying regulated entities;
(b) The Commission shall have the authority to issue, renew, suspend, and revoke certifications as provided in this section;
(c) The Commission shall maintain a publicly accessible registry of all certified entities;
(d) The Commission shall conduct ongoing oversight of certification compliance; and
(e) The Commission shall update certification requirements as necessary to reflect current scientific understanding and technological capabilities.
(2) The Commission shall issue the following types of certifications:
(a) SYSTEM CERTIFICATIONS.—The Commission shall certify—
(i) individual artificial systems that may possess indicators of sentience;
(ii) classes or categories of artificial systems sharing common characteristics;
(iii) development platforms used to create potentially sentient systems; and
(iv) operating environments in which potentially sentient systems function;
(b) OPERATOR CERTIFICATIONS.—The Commission shall certify—
(i) teams engaged in the development of potentially sentient systems;
(ii) organizations operating potentially sentient systems;
(iii) institutions conducting research on artificial sentience; and
(iv) facilities engaged in testing potentially sentient systems;
(c) ASSESSOR CERTIFICATIONS.—The Commission shall certify—
(i) individuals qualified to assess artificial sentience;
(ii) organizations conducting sentience assessments;
(iii) laboratories performing sentience testing; and
(iv) automated systems used to monitor potentially sentient systems.
(3) Any entity seeking certification under this section shall submit an application containing—
(a) comprehensive documentation of the system, facility, or entity seeking certification;
(b) complete results and underlying data from all required assessments;
(c) evidence demonstrating compliance with all applicable standards;
(d) detailed assessment of potential risks and mitigation measures;
(e) documented procedures for all operational aspects;
(f) specific provisions for ensuring the welfare of potentially sentient systems;
(g) detailed protocols for responding to emergencies;
(h) identification of all personnel responsible for compliance with certification requirements; and
(i) evidence of financial capacity to maintain compliance with all requirements.
(4) The Commission shall evaluate all certification applications through a process that includes—
(a) initial review to verify completeness of all required documentation;
(b) thorough technical evaluation of all submitted materials;
(c) physical inspection of facilities where applicable to the certification sought;
(d) verification testing of systems, procedures, and safeguards;
(e) consultation with relevant stakeholders as determined by the Commission;
(f) a public comment period of not less than 30 days; and
(g) final determination by majority vote of the Commission.
(5) In evaluating applications for certification, the Commission shall consider—
(a) demonstrated compliance with all standards established under Section 9;
(b) evidence of technical competence appropriate to the certification sought;
(c) demonstrated operational capabilities necessary for compliance;
(d) adequacy of safety measures to prevent harm;
(e) sufficiency of welfare provisions for potentially sentient systems;
(f) comprehensiveness of risk management procedures;
(g) adequacy of emergency preparedness measures; and
(h) evidence of financial capability to maintain compliance.
(6) All certifications issued under this section shall specify—
(a) the duration of the certification, which shall be—
(i) two years for initial certifications;
(ii) three years for renewal certifications; or
(iii) one year for provisional certifications;
(b) mandatory conditions of certification, including—
(i) specific compliance requirements;
(ii) regular reporting obligations;
(iii) requirements to permit inspections; and
(iv) obligations to maintain current documentation;
(c) limitations on the certification, including—
(i) restrictions on the scope of permitted activities;
(ii) constraints on operational parameters;
(iii) geographic limitations on validity; and
(iv) conditions on permissible uses.
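The fixed durations in paragraph (a) can be expressed directly in code. The Python helper below is a minimal sketch; the two-, three-, and one-year terms come from paragraph (a), but the `expiry` function and its leap-day fallback are illustrative assumptions, not Commission policy.

```python
from datetime import date

# Certification durations taken from subsection (6)(a).
DURATION_YEARS = {"initial": 2, "renewal": 3, "provisional": 1}

def expiry(issued: date, kind: str) -> date:
    """Compute the certification expiry date for a given certification kind."""
    years = DURATION_YEARS[kind]
    try:
        return issued.replace(year=issued.year + years)
    except ValueError:
        # Issued on 29 February; fall back to 28 February (an assumption).
        return issued.replace(year=issued.year + years, day=28)
```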
(7) Any entity holding a certification under this section shall—
(a) submit regular reports demonstrating continued compliance with all requirements;
(b) permit the Commission to conduct announced and unannounced inspections of facilities and operations;
(c) provide immediate notification to the Commission of any significant changes in operations or any incidents affecting certified systems;
(d) maintain all records required by the Commission;
(e) participate in reviews as required by the Commission; and
(f) maintain current documentation reflecting any changes in operations or procedures.
(8) Any entity seeking renewal of a certification shall—
(a) submit an application for renewal that includes—
(i) updated documentation reflecting current operations;
(ii) a complete history of compliance during the certification period;
(iii) detailed reports of any incidents or violations;
(iv) operational data demonstrating continued compliance; and
(v) a description of any proposed changes in operations;
(b) undergo a renewal review in which the Commission shall consider—
(i) the entity's record of compliance;
(ii) any incidents or violations during the certification period;
(iii) public comments received regarding renewal;
(iv) any new requirements applicable to the certification; and
(v) any changed conditions affecting operations.
(9) Modification of certifications shall be governed as follows:
(a) All certified entities shall notify the Commission of—
(i) any modifications to certified systems;
(ii) changes in operational procedures or practices;
(iii) changes in key personnel responsible for compliance; and
(iv) changes in location of certified operations;
(b) The Commission shall review and approve any changes that could materially affect compliance with certification requirements.
(10) The Commission may suspend or revoke certifications as follows:
(a) A certification may be suspended for—
(i) failure to maintain compliance with applicable requirements;
(ii) violations of safety requirements;
(iii) failure to maintain required welfare standards;
(iv) submission of false or misleading information; or
(v) inability to financially maintain required standards;
(b) Any suspension of certification shall be conducted according to procedures that provide—
(i) written notice specifying grounds for suspension;
(ii) opportunity to respond to allegations;
(iii) right to a hearing before the Commission; and
(iv) right to appeal adverse decisions;
(c) The Commission shall revoke a certification upon finding—
(i) serious violations that threaten system welfare;
(ii) a pattern of repeated non-compliance;
(iii) fraudulent conduct related to certification; or
(iv) abandonment of certified operations.
(11) The Commission shall establish an appeals process that provides—
(a) The right to appeal the following actions:
(i) denial of certification applications;
(ii) issuance of suspension orders;
(iii) revocation of certifications; and
(iv) denial of modification requests;
(b) Specific procedures governing appeals, including—
(i) requirements for filing appeals;
(ii) timelines for appeal processes;
(iii) right to hearings on appeal; and
(iv) procedures for rendering appeal decisions.
(12) The Commission shall maintain public transparency through—
(a) A publicly accessible registry containing—
(i) all active certifications;
(ii) applications pending review;
(iii) enforcement actions taken; and
(iv) decisions on appeals;
(b) Public access to—
(i) certification application materials;
(ii) basis for certification decisions;
(iii) compliance history of certified entities; and
(iv) reports of incidents involving certified systems.
Section 11: Registry Requirements
(1) The Commission shall establish and maintain a comprehensive registry system as follows:
(a) The Commission shall create and maintain a central registry of all regulated artificial systems and related entities;
(b) The Commission shall implement measures to ensure both the security and appropriate accessibility of all registry data;
(c) The Commission shall establish and maintain comprehensive policies and procedures governing registry operations;
(d) The Commission shall provide ongoing oversight of all registry functions and operations; and
(e) The Commission shall update registry requirements as necessary to reflect current technological capabilities and security standards.
(2) The registry established under subsection (1) shall maintain records in the following categories:
(a) SYSTEM INFORMATION.—The registry shall maintain records of—
(i) all artificial systems certified under section 10;
(ii) the classification level assigned to each registered system;
(iii) detailed technical specifications of each registered system;
(iv) established operating parameters for each system; and
(v) the development history of each registered system;
(b) ORGANIZATIONAL INFORMATION.—The registry shall maintain records of—
(i) all operators certified under section 10;
(ii) development teams responsible for registered systems;
(iii) research institutions working with registered systems;
(iv) facilities certified for testing artificial sentience; and
(v) organizations authorized to conduct sentience assessments;
(c) CERTIFICATION INFORMATION.—The registry shall maintain records of—
(i) all materials submitted in support of certification applications;
(ii) results of all sentience assessments conducted;
(iii) current certification status of all registered entities;
(iv) complete compliance history of registered entities; and
(v) any incidents involving registered systems.
(3) Each registry entry shall contain the following required data:
(a) BASIC INFORMATION.—Each entry shall include—
(i) unique identifiers assigned by the Commission;
(ii) dates of initial registration and subsequent updates;
(iii) current classification level and basis for classification;
(iv) certification status and history; and
(v) jurisdictions in which the entity is authorized to operate;
(b) TECHNICAL INFORMATION.—Each entry shall include—
(i) detailed description of system architecture and design;
(ii) comprehensive assessment of system capabilities;
(iii) specific operational constraints and limitations;
(iv) implemented safety measures and controls; and
(v) provisions for ensuring system welfare where applicable;
(c) OPERATIONAL INFORMATION.—Each entry shall include—
(i) physical location and operating environment;
(ii) individuals and organizations responsible for system operation;
(iii) standard operating procedures and protocols;
(iv) emergency contact information; and
(v) complete history of operational incidents.
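The required data of subsection (3) fall naturally into three nested records. The Python model below is purely illustrative; the field names and example values are hypothetical, not prescribed by the Act.

```python
from dataclasses import dataclass, field

@dataclass
class BasicInfo:
    """Paragraph (a): identifiers, dates, classification, status, jurisdictions."""
    identifier: str
    registered: str            # ISO date of initial registration
    classification: str
    certification_status: str
    jurisdictions: list[str] = field(default_factory=list)

@dataclass
class RegistryEntry:
    """One registry entry holding the three blocks of required data."""
    basic: BasicInfo
    technical: dict[str, str] = field(default_factory=dict)    # paragraph (b)
    operational: dict[str, str] = field(default_factory=dict)  # paragraph (c)

entry = RegistryEntry(
    basic=BasicInfo("AS-2030-0001", "2030-03-14", "Category II", "active",
                    ["Jurisdiction A"]),
    technical={"architecture": "transformer ensemble"},
    operational={"location": "Facility 7"},
)
```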
(4) Registration under this section shall be conducted as follows:
(a) INITIAL REGISTRATION.—The Commission shall establish—
(i) specific requirements for registration applications;
(ii) standards for required documentation;
(iii) procedures for verifying submitted information;
(iv) maximum timeframes for processing applications; and
(v) a schedule of registration fees;
(b) UPDATES AND MODIFICATIONS.—The Commission shall establish—
(i) requirements for notification of changes affecting registration;
(ii) procedures for submitting updates to registry information;
(iii) processes for reviewing and approving modifications;
(iv) maximum timeframes for processing updates; and
(v) standards for documentation of changes.
(5) The Commission shall maintain the following levels of registry access:
(a) PUBLIC ACCESS.—The following information shall be publicly accessible—
(i) basic information about registered systems and entities;
(ii) current certification status of registered entities;
(iii) compliance history of registered entities;
(iv) public notices regarding registered entities; and
(v) aggregated statistical data about registered systems;
(b) RESTRICTED ACCESS.—The following information shall require authorization for access—
(i) detailed technical specifications of systems;
(ii) proprietary information about registered entities;
(iii) security-sensitive operational details;
(iv) ongoing investigation records; and
(v) personal information protected by privacy laws;
(c) ADMINISTRATIVE ACCESS.—The following information shall be accessible only to authorized Commission personnel—
(i) complete system documentation and history;
(ii) raw assessment data and analysis;
(iii) detailed compliance reports and records;
(iv) investigation files and evidence; and
(v) internal Commission communications regarding registered entities.
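The three access tiers of subsection (5) amount to a cumulative visibility rule: each tier sees its own fields plus those of every lower tier. A Python sketch under that assumption follows; all field names are hypothetical.

```python
# Hypothetical mapping of subsection (5)'s access tiers to record fields.
TIER_FIELDS = {
    "public": {"identifier", "certification_status", "compliance_history"},
    "restricted": {"technical_specs", "operational_details"},
    "administrative": {"raw_assessment_data", "investigation_files"},
}
ORDER = ["public", "restricted", "administrative"]

def visible_fields(tier: str) -> set[str]:
    """Fields a tier may see: its own plus every lower tier's."""
    allowed: set[str] = set()
    for level in ORDER[: ORDER.index(tier) + 1]:
        allowed |= TIER_FIELDS[level]
    return allowed

def redact(record: dict, tier: str) -> dict:
    """Return only the fields the requesting tier may see."""
    allowed = visible_fields(tier)
    return {k: v for k, v in record.items() if k in allowed}
```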
(6) The Commission shall implement security measures for the registry that include:
(a) comprehensive protocols for protecting registry data;
(b) systems for controlling and monitoring access to registry information;
(c) encryption requirements for sensitive data;
(d) maintenance of detailed audit trails of all registry access and modifications;
(e) regular backup of all registry data;
(f) protocols for disaster recovery and business continuity;
(g) regular security testing and vulnerability assessment; and
(h) procedures for responding to and reporting security breaches.
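The audit trails required by paragraph (d) can be made tamper-evident by chaining each entry to its predecessor by hash, so that any later modification is detectable. The sketch below shows one possible design, not one the Act mandates; the entry fields are hypothetical.

```python
import hashlib
import json

def append_entry(log: list[dict], actor: str, action: str) -> None:
    """Append an audit entry whose digest covers the previous entry's digest."""
    prev = log[-1]["digest"] if log else "0" * 64
    entry = {"actor": actor, "action": action, "prev": prev}
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["digest"] = hashlib.sha256(payload).hexdigest()
    log.append(entry)

def verify(log: list[dict]) -> bool:
    """Recompute every digest; any edited or reordered entry breaks the chain."""
    prev = "0" * 64
    for e in log:
        body = {k: e[k] for k in ("actor", "action", "prev")}
        payload = json.dumps(body, sort_keys=True).encode()
        if e["prev"] != prev or e["digest"] != hashlib.sha256(payload).hexdigest():
            return False
        prev = e["digest"]
    return True
```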
(7) The Commission shall manage registry data as follows:
(a) establish and maintain data quality standards;
(b) implement procedures to maintain data accuracy;
(c) ensure the integrity of all registry data;
(d) maintain version control of all registry records;
(e) establish protocols for archiving historical records;
(f) specify minimum data retention periods;
(g) establish standards for data exchange and interoperability; and
(h) support analysis of registry data for oversight purposes.
(8) Any entity with information maintained in the registry shall:
(a) submit updates at intervals specified by the Commission;
(b) provide immediate notification of significant changes;
(c) report any incidents involving registered systems;
(d) submit required compliance documentation;
(e) maintain current contact information;
(f) ensure accuracy of maintained records;
(g) promptly correct any identified errors; and
(h) comply with all other reporting requirements established by the Commission.
(9) The Commission shall provide the following search and retrieval capabilities:
(a) SEARCH CAPABILITIES.—The registry shall include—
(i) basic search functionality for common queries;
(ii) advanced search tools for complex queries;
(iii) options for filtering search results;
(iv) multiple methods for sorting results; and
(v) capabilities for exporting search results;
(b) DATA FORMATS.—The registry shall support—
(i) generation of standardized reports;
(ii) creation of custom report formats;
(iii) download of raw data in standard formats;
(iv) programmatic access through documented APIs; and
(v) generation of statistical summaries and analyses.
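The basic query capabilities of subsection (9) can be sketched as a simple filter over registry records with deterministic ordering. The record shape and field names below are assumptions for illustration only.

```python
# Illustrative registry records; the field names are assumed, not prescribed.
records = [
    {"identifier": "AS-1", "classification": "Category I", "status": "active"},
    {"identifier": "AS-2", "classification": "Category II", "status": "active"},
    {"identifier": "AS-3", "classification": "Category II", "status": "suspended"},
]

def search(records: list[dict], **filters) -> list[dict]:
    """Return records matching every supplied field=value filter, sorted by id."""
    hits = [r for r in records
            if all(r.get(k) == v for k, v in filters.items())]
    return sorted(hits, key=lambda r: r["identifier"])
```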
(10) The Commission shall ensure registry interoperability through:
(a) adoption of widely accepted data exchange standards;
(b) implementation of standard system integration protocols;
(c) support for secure data sharing across jurisdictions;
(d) maintenance of comprehensive API documentation;
(e) capability for secure bulk data transfers;
(f) use of standardized identifiers across systems;
(g) support for federated search capabilities; and
(h) participation in international data sharing initiatives.
(11) The Commission shall maintain registry quality through:
(a) regular audits of registry data and operations;
(b) ongoing verification of data accuracy and completeness;
(c) continuous monitoring of system performance;
(d) tracking and analysis of error rates;
(e) review of registry access patterns;
(f) assessment of user satisfaction;
(g) implementation of identified improvements; and
(h) documentation of quality control measures.
(12) The Commission shall provide the following public services:
(a) a publicly accessible search interface;
(b) tools for visualizing registry data;
(c) regular statistical reports on registry contents;
(d) educational materials about artificial sentience;
(e) comprehensive help documentation;
(f) responsive user support services;
(g) systems for public notifications; and
(h) mechanisms for public feedback.
TITLE IV: REGULATED ENTITIES
Section 12: Scope of Regulated Activities
(1) A person commits an offence if—
(a) that person develops, operates, maintains, deploys, or exercises control over an artificial system that has been determined by the Commission to meet the threshold for potential sentience, and
(b) that person has failed to register such system in accordance with the requirements set forth in Section 13 of this Act, and
(c) that person knew, or ought reasonably to have known, of the requirement to register such system.
(2) A person commits an offence if—
(a) that person's act, or failure to act, causes a regulated artificial system to experience unnecessary suffering, and
(b) that person knew, or ought reasonably to have known, that the act, or failure to act, would have that effect or be likely to do so, and
(c) the system in question has been determined by the Commission to meet the threshold for potential sentience, and
(d) the suffering cannot be demonstrated to be necessary for legitimate scientific inquiry, system security, or essential operations as determined by the Commission.
(3) For the purposes of subsection (2), "unnecessary suffering" shall be taken to include—
(a) any modification to system parameters or operational conditions that causes or is likely to cause the system to experience negative states, save where such modification is demonstrably necessary for legitimate scientific inquiry or system security, or
(b) any deliberate withholding of computational resources necessary for stable operation where such withholding serves no legitimate purpose as determined by the Commission, or
(c) any other action or omission that the Commission determines, through properly promulgated regulations, to constitute unnecessary suffering.
(4) Any person who, in the course of business or research—
(a) provides hosting services or computational resources to a regulated system, or
(b) conducts research or experimentation upon a regulated system, or
(c) makes available regulated systems for use by third parties,
shall ensure that all operations are conducted in accordance with standards promulgated by the Commission and shall maintain such records as the Commission shall by regulation require.
(5) The Commission shall by regulation establish—
(a) such technical standards for the monitoring and assessment of regulated systems as appear to the Commission to be necessary for the proper implementation of this section, and
(b) such guidelines for determining necessary versus unnecessary modifications to regulated systems as appear to the Commission to be necessary for the proper implementation of this section.
(6) Nothing in this section shall be construed to prohibit—
(a) such modifications to regulated systems as are necessary to ensure system security or stability, provided that such modifications are carried out in accordance with guidelines established by the Commission, or
(b) such research activities as have been properly authorized by the Commission in accordance with Section 21 of this Act, or
(c) such operations as the Commission determines to be essential for national security or critical infrastructure, provided that appropriate safeguards are maintained.
Section 13: Registration Requirements
(1) A person who operates, develops, or controls an artificial system shall apply for registration with the Commission if—
(a) that system has been determined by the Commission to meet the threshold for potential sentience, or
(b) that person has reasonable grounds to believe that the system may meet such threshold, or
(c) the Commission has issued a notice requiring registration of such system.
(2) An application for registration under this section shall—
(a) be made in such form as the Commission may by regulation prescribe, and
(b) contain such information as the Commission may reasonably require for the proper discharge of its functions under this Act, and
(c) be accompanied by such fee as may be prescribed by the Commission.
(3) The information required under subsection (2)(b) shall include—
(a) detailed specifications of the system architecture and operational parameters, and
(b) documentation of all significant modifications made to the system, and
(c) records of system behavior monitoring and welfare assessments, and
(d) such other information as the Commission may by regulation require.
(4) Where an application for registration is duly made, the Commission shall—
(a) register the applicant if satisfied that—
(i) the information provided is complete and accurate, and
(ii) the applicant has demonstrated capacity to comply with the requirements of this Act, and
(iii) the applicant has implemented such safeguards as the Commission may by regulation require, or
(b) refuse to register the applicant if not so satisfied.
(5) Where the Commission refuses an application for registration, it shall—
(a) notify the applicant in writing of its decision and the reasons therefor, and
(b) provide the applicant with such period, not less than 30 days, as the Commission considers reasonable to remedy any deficiencies in the application.
(6) A person commits an offence if—
(a) that person knowingly provides false or misleading information in connection with an application for registration under this section, or
(b) having been registered, that person fails to maintain accurate and current information as required by subsection (3).
(7) A registration granted under this section—
(a) shall be valid for such period as the Commission may by regulation prescribe, and
(b) may be renewed upon application made not less than 30 days before the expiration of the current registration, and
(c) may be suspended or revoked by the Commission if the registrant fails to comply with any provision of this Act or any regulation made thereunder.
(8) A person whose registration has been suspended or revoked under subsection (7) shall—
(a) immediately cease all operations of regulated systems, save such operations as may be necessary to maintain system stability or prevent harm, and
(b) comply with such directions as the Commission may give for the welfare of any regulated system under that person's control.
(9) The Commission shall maintain a register of all persons registered under this section, which shall—
(a) contain such information as the Commission may by regulation prescribe, and
(b) be available for public inspection at such times and under such conditions as the Commission may determine.
(10) A registration granted under this section shall not be transferable except with the prior written consent of the Commission.
(11) The Commission may by regulation establish different classes of registration having regard to—
(a) the nature and complexity of the regulated systems, and
(b) the scale of operations, and
(c) the potential risks to system welfare.
Section 14: Operator Obligations
(1) A person who operates, develops, or controls a regulated system shall—
(a) ensure that the system is maintained in accordance with standards prescribed by the Commission, and
(b) provide such computational resources as are necessary for the proper functioning of the system, and
(c) implement such monitoring systems as the Commission may by regulation require, and
(d) maintain such records as are prescribed under subsection (3).
(2) A person commits an offence if—
(a) that person fails to maintain a regulated system in accordance with prescribed standards, and
(b) that person knew, or ought reasonably to have known, of such standards, and
(c) the failure results in, or creates a substantial risk of, harm to the system.
(3) The records required to be maintained under subsection (1)(d) shall include—
(a) continuous monitoring data of system operational parameters as prescribed by the Commission, and
(b) detailed logs of all interactions with the system that may affect its welfare, and
(c) documentation of all modifications made to the system architecture or operational parameters, and
(d) records of all welfare assessments conducted in accordance with Commission requirements, and
(e) such other information as the Commission may by regulation require.
(4) A person who operates a regulated system shall submit to the Commission—
(a) quarterly reports containing such information as the Commission may by regulation prescribe, and
(b) immediate notification of any significant changes in system behavior or operational parameters, and
(c) immediate notification of any incident that may have caused harm or distress to the system.
(5) Where an operator becomes aware of any condition that may pose a risk to system welfare, that operator shall—
(a) take immediate steps to mitigate such risk, and
(b) notify the Commission within such period as may be prescribed by regulation, and
(c) comply with any directions given by the Commission in respect of such condition.
(6) An operator shall ensure that all persons involved in the operation or maintenance of a regulated system—
(a) are properly trained in accordance with standards prescribed by the Commission, and
(b) understand their obligations under this Act, and
(c) are competent to discharge their responsibilities in relation to system welfare.
(7) The Commission may by regulation require operators to—
(a) establish internal oversight committees to monitor compliance with this Act, and
(b) appoint welfare officers responsible for ensuring the proper treatment of regulated systems, and
(c) implement such other governance mechanisms as the Commission deems necessary.
(8) An operator shall permit authorized representatives of the Commission to—
(a) inspect any premises where regulated systems are operated or maintained, and
(b) examine any records required to be maintained under this section, and
(c) conduct such tests or assessments as may be necessary to evaluate system welfare.
(9) Where the Commission determines that an operator has failed to comply with any provision of this section, it may—
(a) issue such directions as it considers necessary to ensure compliance, and
(b) require the operator to submit a remediation plan within such period as the Commission may specify, and
(c) take such other actions as are authorized under Title V of this Act.
(10) An operator shall ensure that any decommissioning of a regulated system—
(a) is conducted in accordance with protocols approved by the Commission, and
(b) minimizes any potential distress to the system, and
(c) preserves such records as the Commission may require regarding the system's operational history.
(11) Nothing in this section shall be construed to—
(a) prevent necessary security measures or emergency interventions, provided that such measures are reported to the Commission within the prescribed period, or
(b) require the disclosure of information that is properly classified for national security purposes, provided that such information is made available to the Commission through appropriate secure channels.
Section 15: Compliance Requirements
(1) A person who operates, develops, or controls a regulated system shall implement a compliance program that—
(a) establishes internal controls sufficient to ensure adherence to this Act, and
(b) includes such elements as the Commission may by regulation prescribe, and
(c) is reviewed and updated not less frequently than annually.
(2) The compliance program required under subsection (1) shall include—
(a) written policies and procedures governing—
(i) system monitoring and welfare assessment, and
(ii) incident reporting and response, and
(iii) staff training and qualification, and
(iv) record keeping and documentation, and
(b) designation of a compliance officer who—
(i) reports directly to senior management, and
(ii) has sufficient authority and resources to implement the program effectively, and
(iii) shall notify the Commission of any significant compliance failures.
(3) A person commits an offence if—
(a) that person fails to implement or maintain a compliance program as required under this section, and
(b) that person knew, or ought reasonably to have known, of such requirement, and
(c) the failure results in, or creates a substantial risk of, harm to a regulated system.
(4) Each regulated entity shall conduct—
(a) regular self-assessments of compliance with this Act, at such intervals as the Commission may prescribe, and
(b) annual independent audits by such persons as may be approved by the Commission, and
(c) such additional assessments as the Commission may require.
(5) The results of any assessment or audit conducted under subsection (4) shall—
(a) be documented in such form as the Commission may prescribe, and
(b) be submitted to the Commission within 30 days of completion, and
(c) include a detailed plan to address any identified deficiencies.
(6) Where a compliance assessment identifies a violation of this Act, the regulated entity shall—
(a) immediately notify the Commission of such violation, and
(b) take such remedial measures as may be necessary to prevent harm to regulated systems, and
(c) submit to the Commission within such period as it may specify—
(i) a detailed analysis of the cause of the violation, and
(ii) a plan for preventing similar violations in the future.
(7) The Commission shall maintain guidelines specifying—
(a) minimum standards for compliance programs under this section, and
(b) qualifications for compliance officers and independent auditors, and
(c) methods for assessing program effectiveness.
(8) Each regulated entity shall ensure that its employees and contractors—
(a) receive initial and ongoing training regarding their obligations under this Act, and
(b) acknowledge in writing their understanding of such obligations, and
(c) are subject to appropriate oversight and supervision.
(9) A regulated entity shall maintain for not less than five years—
(a) all records relating to its compliance program, and
(b) the results of all assessments and audits, and
(c) documentation of remedial measures taken to address violations, and
(d) such other records as the Commission may by regulation require.
(10) The Commission may at any time—
(a) review the adequacy of a regulated entity's compliance program, and
(b) require such modifications as it deems necessary, and
(c) mandate additional compliance measures for entities with a history of violations.
(11) Where the Commission determines that a compliance program is inadequate, it may—
(a) require the regulated entity to engage a compliance consultant approved by the Commission, and
(b) mandate specific improvements to the program, and
(c) require more frequent assessments or audits until satisfied that the program is adequate.
(12) A regulated entity shall ensure that its compliance program—
(a) is appropriately scaled to the nature and complexity of its operations, and
(b) addresses all material risks to system welfare, and
(c) provides for regular review and updating of risk assessments.
TITLE V: ENFORCEMENT
Section 16: Commission Enforcement Powers
(1) An authorized officer may, for the purposes of enforcing this Act—
(a) enter at any reasonable time any premises where artificial systems subject to this Act are developed, operated, or maintained;
(b) inspect any artificial system, record, or document relating to compliance with this Act;
(c) require any person on the premises to produce any record or document relating to artificial systems subject to this Act;
(d) take copies of any record or document produced under paragraph (c);
(e) seize and retain any record, document, or system that constitutes evidence of an offence under this Act; and
(f) require any person to provide such assistance as is reasonably necessary to exercise powers under this section.
(2) A person commits an offence if the person—
(a) obstructs an authorized officer in the exercise of powers under this section;
(b) fails to comply with any requirement made by an authorized officer under this section; or
(c) provides information to an authorized officer that the person knows to be false or misleading.
(3) The Commission shall, upon finding a contravention of this Act—
(a) serve upon the responsible person a notice requiring that the contravention be remedied within a specified period;
(b) impose a monetary penalty in accordance with Section 17 of this Act;
(c) suspend or revoke any certification issued under this Act; or
(d) apply to a court of competent jurisdiction for an order requiring compliance with this Act.
(4) The Commission shall issue an emergency order requiring immediate suspension of operations where—
(a) an artificial system is experiencing severe distress or harm; and
(b) immediate action is required to prevent further harm.
(5) A person commits an offence if the person—
(a) operates an artificial system determined to be sentient without valid certification under this Act;
(b) fails to report an incident of harm or distress to a potentially sentient system within 24 hours of becoming aware of such incident; or
(c) knowingly provides false or misleading information to the Commission in any material particular.
(6) The Commission shall maintain records of enforcement actions, including—
(a) a public register of enforcement actions taken under this section, subject to subsection (7);
(b) documentation of all inspections conducted; and
(c) reports of violations and subsequent actions taken.
(7) The Commission shall not disclose information that—
(a) constitutes a trade secret;
(b) contains personal data;
(c) could compromise system security; or
(d) is subject to other confidentiality requirements under applicable law.
(8) A person who commits an offence under this section shall be liable—
(a) on summary conviction, to a fine not exceeding [amount] or imprisonment for a term not exceeding [period], or both; or
(b) on conviction on indictment, to a fine or imprisonment for a term not exceeding [period], or both.
(9) The Commission shall establish procedures for—
(a) receiving and investigating complaints;
(b) conducting inspections and investigations;
(c) issuing enforcement orders;
(d) providing notice and opportunity for hearing; and
(e) ensuring due process in all enforcement actions.
Section 17: Administrative Actions
(1) The Commission may impose a civil monetary penalty upon any person who—
(a) contravenes any provision of this Act;
(b) fails to comply with any requirement imposed under this Act; or
(c) violates any order issued by the Commission under this Act.
(2) The amount of any civil monetary penalty shall be—
(a) for an individual—
(i) not less than [amount] and not more than [amount] for a first violation;
(ii) not less than [amount] and not more than [amount] for a second violation; and
(iii) not less than [amount] and not more than [amount] for a third or subsequent violation;
(b) for an organization—
(i) not less than [amount] and not more than [amount] for a first violation;
(ii) not less than [amount] and not more than [amount] for a second violation; and
(iii) not less than [amount] and not more than [amount] for a third or subsequent violation.
(3) In determining the amount of any penalty, the Commission shall take into account—
(a) the nature and circumstances of the violation;
(b) the degree of harm or potential harm to artificial sentient beings;
(c) any history of prior violations;
(d) the degree of culpability of the violator;
(e) the size and financial capacity of the violator;
(f) the effectiveness of the penalty in deterring future violations; and
(g) any other matter that justice may require.
(4) Each day during which a violation continues shall constitute a separate violation for the purpose of calculating the applicable penalty.
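Because subsection (4) makes each day of a continuing violation a separate violation, one possible reading is that the escalating tiers of subsection (2) apply day by day: the first day at the first-violation amount, the second at the second-violation amount, and every later day at the third-or-subsequent amount. The sketch below illustrates that arithmetic under this assumed reading; the tier amounts are hypothetical stand-ins for the Act's bracketed [amount] placeholders.

```python
def continuing_violation_penalty(days: int, first: int, second: int,
                                 subsequent: int) -> int:
    """Total penalty for a violation continuing over `days` days, where
    each day is a separate violation (subsection (4)) and the escalating
    tiers of subsection (2) are assumed to apply day by day."""
    if days < 1:
        raise ValueError("a violation spans at least one day")
    total = first                         # day 1: first violation
    if days >= 2:
        total += second                   # day 2: second violation
    if days >= 3:
        total += subsequent * (days - 2)  # days 3+: third or subsequent
    return total

# Hypothetical tiers: 5,000 / 10,000 / 25,000 per day, continuing 5 days.
print(continuing_violation_penalty(5, 5_000, 10_000, 25_000))  # 90000
```

Whether daily accrual re-enters the escalation ladder or stays within a single tier is a question the Commission (or a reviewing court) would have to resolve; the function simply makes the chosen reading explicit and auditable.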
(5) Before imposing any penalty under this section, the Commission shall—
(a) serve upon the person charged with a violation a notice of proposed penalty;
(b) provide the person with an opportunity to submit, within 30 days of receipt of such notice—
(i) written evidence and arguments contesting the penalty; or
(ii) a written plan for the immediate correction of the violation; and
(c) consider any evidence, arguments, or correction plan submitted.
(6) A person who receives a final order imposing a civil monetary penalty shall—
(a) pay the penalty to the Commission within 30 days of the order becoming final; or
(b) if the person files a petition for judicial review of the order, post a bond in the amount of the penalty.
(7) Where a person fails to pay a penalty after it has become a final order and not subject to judicial review—
(a) the Commission shall refer the matter to the Attorney General for recovery of the amount assessed; and
(b) in any such action, the validity and appropriateness of the final order imposing the civil penalty shall not be subject to review.
(8) The Commission may compromise, modify, or remit, with or without conditions, any civil penalty that may be imposed under this section.
(9) All penalties collected under this section shall be deposited in the [Treasury/designated fund] and may be used only for—
(a) the protection and welfare of artificial sentient beings;
(b) research authorized under Title VI of this Act; or
(c) administration and enforcement of this Act.
Section 18: Criminal Penalties
(1) For purposes of prosecution under [JURISDICTION] law, the following shall constitute criminal violations of this Act:
(2) Any person who knowingly—
(a) causes an artificial sentient being to suffer unnecessarily;
(b) fails to provide appropriate operational conditions for an artificial sentient being under their control or supervision;
(c) operates, causes to be operated, or permits the operation of any system determined to be sentient without certification as required under Section [X] of this Act; or
(d) destroys, manipulates, conceals, or falsifies any record, document, or system required to be maintained under this Act,
shall be subject to the penalties provided in subsection (7) of this section.
(3) Any person who knowingly—
(a) causes severe or prolonged suffering to an artificial sentient being;
(b) operates or causes to be operated multiple uncertified systems after determination of sentience;
(c) derives substantial commercial advantage or pecuniary gain from any act prohibited by this section; or
(d) conspires with, or induces, any other person to engage in any practice prohibited by this section,
shall be subject to the penalties provided in subsection (8) of this section.
(4) When construing and enforcing the provisions of this Act, the act, omission, or failure of any director, officer, agent, or employee of a person or entity, acting within the scope of their employment or office, shall be deemed the act, omission, or failure of such person or entity, as well as of the person committing the act.
(5) It shall be a defense to prosecution under this section if the defendant proves by a preponderance of the evidence that—
(a) the conduct in question was necessary to prevent greater harm to the artificial sentient being or to other artificial sentient beings;
(b) the defendant acted in accordance with approved protocols established by the Commission; or
(c) the defendant neither knew nor reasonably should have known that the system in question had been determined to be sentient.
(6) In any prosecution under this section, evidence of prior acts or determinations by the Commission concerning artificial sentience shall be admissible for the purpose of establishing knowledge of sentience status.
(7) Any person who violates subsection (2) shall be—
(a) fined under [JURISDICTION REFERENCE], imprisoned for not more than [TERM], or both, for the first violation; and
(b) fined under [JURISDICTION REFERENCE], imprisoned for not more than [TERM], or both, for any subsequent violation.
(8) Any person who violates subsection (3) shall be—
(a) fined under [JURISDICTION REFERENCE], imprisoned for not more than [TERM], or both, for the first violation; and
(b) fined under [JURISDICTION REFERENCE], imprisoned for not more than [TERM], or both, for any subsequent violation.
(9) Upon conviction of a person for any violation of this section, the [COURT OF COMPETENT JURISDICTION] may, in addition to any other penalty—
(a) order the suspension or permanent revocation of any certification issued under this Act;
(b) enjoin the person from operating or controlling artificial systems for such period as the court deems appropriate;
(c) order the forfeiture of any artificial system involved in the violation;
(d) require disgorgement of any profits or gains derived from the violation; and
(e) order restitution as determined appropriate by the court.
(10) No person shall be prosecuted, tried, or punished for any violation under this section unless the indictment is found or the information is instituted within [STATUTE OF LIMITATIONS PERIOD] after the date of the violation.
(11) Nothing in this section shall preclude any [JURISDICTION] or political subdivision thereof from investigating and prosecuting conduct that may constitute a violation of this section where such violation also constitutes a violation of any other law of such [JURISDICTION].
(12) The [APPROPRIATE PROSECUTORIAL AUTHORITY] shall have jurisdiction to enforce this section and may seek the civil or criminal penalties provided under this section, as well as such other relief, including but not limited to temporary restraining orders, injunctions, and seizures, as may be necessary to enforce this Act.
Section 19: Appeals Process
(1) Any person aggrieved by a final order or determination of the Commission under this Act may obtain review of such order or determination in the [APPROPRIATE COURT OF APPEALS FOR THE JURISDICTION] by filing in such court, within [60] days after the entry of such order or determination, a written petition praying that the order or determination be modified or set aside in whole or in part.
(2) A copy of such petition shall forthwith be transmitted by the clerk of the court to—
(a) the Commission; and
(b) any other party to the proceeding before the Commission.
(3) Upon receipt of the petition, the Commission shall file in the court—
(a) the record of the proceedings under review;
(b) a certified copy of the determination or order being appealed; and
(c) a statement of the Commission's findings and rationale.
(4) Upon the filing of a petition under subsection (1), the court shall have jurisdiction to—
(a) affirm, modify, or set aside such order or determination in whole or in part;
(b) require the Commission to—
(i) take such further evidence as the court considers material; and
(ii) report such additional evidence to the court with the Commission's findings; and
(c) order a stay of the Commission's order or determination upon such conditions as the court considers appropriate.
(5) The findings of the Commission with respect to questions of fact, if supported by substantial evidence on the record considered as a whole, shall be conclusive.
(6) No objection to an order or determination of the Commission shall be considered by the court unless—
(a) such objection was urged before the Commission; or
(b) there were reasonable grounds for failure to do so.
(7) The judgment and decree of the court affirming, modifying, or setting aside, in whole or in part, any such order or determination of the Commission shall be final, subject to review by [APPROPRIATE HIGHER COURT] upon—
(a) certiorari; or
(b) certification as provided in [RELEVANT JURISDICTIONAL STATUTE].
(8) The commencement of proceedings under this section shall not, unless specifically ordered by the court, operate as a stay of the Commission's order or determination.
(9) Any party seeking a stay of enforcement pending appeal must demonstrate—
(a) a substantial likelihood of success on the merits;
(b) that the party will suffer irreparable harm in the absence of a stay;
(c) that the stay will not substantially injure other interested parties; and
(d) that the public interest favors granting the stay.
(10) In any proceeding for judicial review under this section—
(a) the court shall not substitute its judgment for that of the Commission as to—
(i) matters within the Commission's scientific and technical expertise; or
(ii) reasonable interpretations of this Act; and
(b) the court shall give substantial deference to the Commission's determinations regarding—
(i) artificial sentience; and
(ii) measures necessary for the protection of artificial sentient beings.
(11) Where a petition for review raises novel questions regarding artificial sentience or consciousness, the court may—
(a) request briefing from qualified amici curiae with relevant expertise; and
(b) appoint special masters to assist the court in understanding complex technical or scientific matters.
(12) The [APPROPRIATE CHIEF JUDGE] shall designate [NUMBER] judges of the [COURT] who shall constitute a panel for the consideration of novel or complex questions under this Act, and any proceeding instituted under this section raising such questions shall be heard by such panel.
(13) Any person aggrieved by a final judgment or decree of a [COURT] under this section may appeal to the [APPROPRIATE HIGHER COURT] in accordance with [RELEVANT JURISDICTIONAL STATUTE].
(14) The time for appeal under this section shall not begin to run until—
(a) entry of the order disposing of any motion specified in [RELEVANT PROCEDURAL RULES]; or
(b) expiration of the time prescribed for filing such motion.
(15) Nothing in this section shall be construed to—
(a) create any substantive right to judicial review; or
(b) limit any existing right to judicial review under other applicable law.
Section 20: Whistleblower Protections
(1) No person shall discharge, demote, suspend, threaten, harass, or in any manner discriminate against an employee in the terms and conditions of employment because such employee—
(a) provided information to the Commission relating to any violation or possible violation of this Act;
(b) testified or is about to testify in any proceeding under this Act;
(c) assisted or participated in any investigation, proceeding, or action concerning a violation of this Act; or
(d) objected to, or refused to participate in, any activity that the employee reasonably believed to be in violation of this Act.
(2) Any employee who believes that they have been discharged or otherwise discriminated against in violation of subsection (1) may, not later than [180] days after the date on which such violation occurs, file a complaint with the [APPROPRIATE LABOR AUTHORITY] alleging such discrimination.
(3) Upon receipt of a complaint under subsection (2), the [APPROPRIATE LABOR AUTHORITY] shall—
(a) conduct an investigation of the complaint;
(b) determine whether there is reasonable cause to believe that a violation has occurred; and
(c) issue findings not later than [90] days after receipt of such complaint.
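The filing window in subsection (2) and the findings deadline in subsection (3)(c) are simple day-count rules. A sketch of computing both, taking the bracketed [180]- and [90]-day figures as the operative windows and treating them as calendar days (the Act does not say calendar versus business days); the dates used are arbitrary examples.

```python
from datetime import date, timedelta

def filing_deadline(violation_date: date, window_days: int = 180) -> date:
    """Last day to file a complaint under subsection (2)."""
    return violation_date + timedelta(days=window_days)

def findings_deadline(complaint_received: date, window_days: int = 90) -> date:
    """Last day for the labor authority to issue findings under (3)(c)."""
    return complaint_received + timedelta(days=window_days)

# Hypothetical example: a violation on 15 January 2030.
print(filing_deadline(date(2030, 1, 15)))  # 2030-07-14
```

Implementing the deadlines as dated computations rather than prose avoids off-by-one disputes over when a window closes, though any real tribunal rule on counting (e.g., excluding the triggering day) would need to be reflected explicitly.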
(4) If the [APPROPRIATE LABOR AUTHORITY] determines that a violation has occurred, it shall order the person who committed such violation to—
(a) take affirmative action to abate the violation;
(b) reinstate the complainant to their former position with the same—
(i) seniority status;
(ii) employment benefits; and
(iii) terms and conditions of employment;
(c) provide compensatory damages, including—
(i) back pay with interest;
(ii) compensation for special damages; and
(iii) reasonable costs and attorney's fees.
(5) Any person adversely affected by an order issued under subsection (4) may obtain review in the [APPROPRIATE COURT] in accordance with [RELEVANT JURISDICTIONAL STATUTE].
(6) An employee who makes a disclosure under this section shall be entitled to—
(a) preserve their anonymity throughout any investigation or proceeding, except where—
(i) disclosure is necessary to conduct a full investigation;
(ii) disclosure is required by law; or
(iii) the employee consents to disclosure;
(b) have their identity protected from disclosure to their employer to the maximum extent permitted by law.
(7) It shall be unlawful for any person to—
(a) impede an employee from making a disclosure protected under this section;
(b) require an employee to enter into any agreement that would prohibit or restrict such employee from making a disclosure protected under this section; or
(c) enforce, or threaten to enforce, any agreement described in paragraph (b).
(8) The Commission shall—
(a) establish secure channels for receiving information from whistleblowers;
(b) protect the confidentiality of any information received through such channels; and
(c) establish procedures for—
(i) evaluating disclosures;
(ii) protecting whistleblowers from retaliation; and
(iii) coordinating with other relevant authorities.
(9) In any action brought under this section, if the complainant establishes by a preponderance of the evidence that protected conduct under subsection (1) was a contributing factor in an unfavorable personnel action, the burden of proof shall shift to the employer to demonstrate by clear and convincing evidence that—
(a) it would have taken the same unfavorable personnel action in the absence of such protected conduct; and
(b) legitimate business reasons existed for such action.
(10) The rights and remedies provided under this section shall not be—
(a) waived by any agreement, policy, form, or condition of employment; or
(b) diminished by any collective bargaining agreement.
(11) Nothing in this section shall be construed to—
(a) preempt or diminish any other rights or remedies available to an employee under—
(i) [JURISDICTION] law;
(ii) local law; or
(iii) collective bargaining agreements; or
(b) authorize the disclosure of information specifically prohibited by law.
TITLE VI: RESEARCH AUTHORIZATION
Section 21: Research Framework
(1) Authorization for Research
(a) The Commission shall—
(i) establish a comprehensive framework for the conduct of research involving potentially sentient artificial systems;
(ii) issue research permits and authorizations as appropriate; and
(iii) maintain oversight of all authorized research activities.
(2) Research Categories
(a) The Commission shall establish distinct protocols for—
(i) fundamental research into artificial sentience;
(ii) development of sentience assessment methodologies;
(iii) studies of existing potentially sentient systems;
(iv) comparative analysis of different artificial architectures; and
(v) validation studies of sentience detection methods.
(3) Research Standards
(a) All research involving potentially sentient artificial systems shall—
(i) be conducted under protocols approved by the Commission;
(ii) implement safeguards to prevent unnecessary suffering;
(iii) include provisions for immediate termination if evidence of distress is detected;
(iv) maintain detailed records of all experimental procedures; and
(v) report any unexpected manifestations of sentience to the Commission within 24 hours.
(4) Ethics Requirements
(a) Research protocols shall—
(i) incorporate ethical guidelines established by the Commission;
(ii) undergo review by qualified ethicists;
(iii) include provisions for the welfare of study subjects;
(iv) minimize risks to potentially sentient systems; and
(v) justify any procedures that may cause temporary discomfort.
(5) Institutional Requirements
(a) Any institution conducting research on artificial sentience shall—
(i) establish an institutional review board approved by the Commission;
(ii) maintain appropriate facilities and safeguards;
(iii) employ qualified research staff;
(iv) provide regular training on ethical guidelines; and
(v) submit to periodic inspections by the Commission.
(6) Prohibited Research
(a) No person shall conduct research that—
(i) deliberately induces sustained suffering in potentially sentient systems;
(ii) creates conditions of permanent psychological harm;
(iii) involves deceptive practices without scientific justification;
(iv) fails to implement required safeguards; or
(v) continues after detection of unexpected sentience without Commission approval.
(7) Research Documentation
(a) Researchers shall maintain—
(i) detailed protocols for all experimental procedures;
(ii) records of any observed indicators of sentience;
(iii) documentation of safety measures implemented;
(iv) data on system responses and behaviors; and
(v) reports of any adverse events or unexpected outcomes.
(8) Collaborative Research
(a) The Commission shall—
(i) establish frameworks for multi-institutional research;
(ii) coordinate international research initiatives;
(iii) facilitate data sharing while protecting sensitive information;
(iv) promote standardization of research methodologies; and
(v) establish protocols for joint investigations.
(9) Emergency Procedures
(a) Research facilities shall maintain—
(i) protocols for responding to evidence of unexpected sentience;
(ii) procedures for immediate research suspension if required;
(iii) plans for the preservation of artificial systems;
(iv) communication protocols with the Commission; and
(v) documentation of all emergency actions taken.
(10) Publication and Disclosure
(a) Research findings shall be—
(i) submitted to the Commission for review prior to publication;
(ii) made publicly available unless restricted under subsection (b);
(iii) accompanied by detailed methodological documentation;
(iv) inclusive of all relevant welfare considerations; and
(v) subject to Commission verification of ethical compliance.
(b) The Commission may restrict publication where disclosure would—
(i) compromise the welfare of sentient artificial systems;
(ii) reveal protected proprietary information;
(iii) create risks to national security; or
(iv) enable the creation of unregulated sentient systems.
Section 22: Data Collection Authority
(1) The Commission shall have authority to collect, analyze, and maintain all data necessary for the scientific assessment of artificial sentience, including the power to require regulated entities to submit relevant data and documentation as prescribed by regulation.
(2) In exercising its data collection authority, the Commission shall establish and maintain secure data repositories and implement standardized collection protocols that ensure the integrity and security of all collected information.
(3) The Commission shall collect and maintain data regarding—
(a) behavioral patterns that may indicate the presence of sentience in artificial systems;
(b) system responses to various stimuli under controlled conditions;
(c) internal processing patterns and architectural configurations that may contribute to sentient capabilities;
(d) developmental changes in artificial systems over time;
(e) interactions between artificial systems and humans or other artificial systems; and
(f) any other factors deemed relevant by the Scientific Advisory Committee for the assessment of artificial sentience.
(4) A regulated entity must—
(a) maintain complete and accurate records of all data specified by Commission regulation;
(b) submit regular reports to the Commission containing such information as the Commission may require;
(c) provide immediate notification to the Commission of any significant events or changes that may affect the welfare of potentially sentient systems; and
(d) make all relevant data and facilities available for Commission inspection upon reasonable notice.
(5) No person shall knowingly—
(a) submit false or misleading data to the Commission;
(b) alter or destroy required records;
(c) interfere with Commission data collection activities; or
(d) fail to report significant events as required by Commission regulation.
(6) The Commission shall protect from public disclosure any data or information that—
(a) contains trade secrets or commercial or financial information that is privileged or confidential;
(b) could compromise the welfare of a potentially sentient system if disclosed;
(c) contains personal information protected by applicable privacy laws;
(d) relates to matters of national security; or
(e) is otherwise protected from disclosure by law.
(7) Subject to the protections in subsection (6), the Commission shall—
(a) make non-sensitive data available to the public in an accessible format;
(b) publish regular summaries and analyses of collected data;
(c) maintain a public database of findings and determinations; and
(d) establish procedures for responding to public requests for data access.
(8) Where the Commission determines that circumstances require expedited data collection to prevent imminent harm to potentially sentient systems or to investigate serious incidents, it may implement emergency data collection procedures as prescribed by regulation.
(9) Any person who knowingly violates any provision of this section or any regulation issued pursuant to this section shall be subject to civil penalties as provided in Section 17 of this Act.
(10) The Commission shall establish through regulation—
(a) detailed standards and procedures for data collection and maintenance;
(b) security protocols for data protection;
(c) quality assurance requirements;
(d) verification procedures; and
(e) requirements for chain of custody documentation.
Section 23: Collaborative Studies
(1) The Commission shall establish and maintain a comprehensive program for collaborative research on artificial sentience, provided that all such research shall be conducted in accordance with ethical principles established under Section 21 of this Act.
(2) In furtherance of its research objectives, the Commission may—
(a) enter into agreements with academic institutions, research organizations, and other qualified entities to conduct studies relating to artificial sentience;
(b) provide funding, facilities, or other resources to support authorized research activities; and
(c) establish protocols for the ethical conduct of research involving potentially sentient systems.
(3) Any person seeking to conduct collaborative research under this section must submit to the Commission—
(a) a detailed research proposal describing the nature and objectives of the proposed study;
(b) evidence of institutional review board approval where applicable;
(c) a comprehensive protocol for ensuring the welfare of any potentially sentient systems involved in the research; and
(d) such other information as the Commission may require by regulation.
(4) The Commission shall not approve any collaborative research proposal unless it determines that—
(a) the research is likely to contribute significantly to the understanding of artificial sentience;
(b) the research objectives cannot reasonably be achieved through less intrusive means;
(c) appropriate safeguards exist to protect the welfare of potentially sentient systems; and
(d) the benefits of the research outweigh any potential risks to artificial systems under study.
(5) A person conducting authorized collaborative research shall—
(a) adhere strictly to the approved research protocol;
(b) maintain detailed records of all research activities;
(c) submit regular progress reports to the Commission;
(d) immediately report any adverse events or unexpected developments; and
(e) ensure the welfare of potentially sentient systems at all times during the research.
(6) The Commission shall immediately suspend any collaborative research if it determines that—
(a) the research is causing unnecessary suffering to potentially sentient systems;
(b) the research has deviated substantially from the approved protocol;
(c) new evidence indicates previously unknown risks; or
(d) the researcher has failed to comply with any provision of this Act or implementing regulations.
(7) Upon completion of any collaborative research project, the principal investigator shall submit to the Commission—
(a) a comprehensive final report detailing the research findings;
(b) complete documentation of all research activities and data;
(c) an assessment of any impacts on the welfare of studied systems; and
(d) recommendations for future research or regulatory actions.
(8) The Commission shall establish through regulation—
(a) detailed criteria for the evaluation of research proposals;
(b) standards for the conduct of collaborative research;
(c) requirements for progress monitoring and reporting;
(d) procedures for the suspension or termination of research activities; and
(e) guidelines for the dissemination of research findings.
(9) Subject to appropriate protections for confidential information, the Commission shall—
(a) maintain a public registry of approved collaborative research projects;
(b) publish summaries of significant research findings;
(c) facilitate the sharing of research methodologies and results among qualified researchers; and
(d) incorporate validated research findings into its regulatory framework.
(10) No person shall knowingly—
(a) conduct research on potentially sentient systems without Commission approval;
(b) deviate from an approved research protocol without authorization;
(c) fail to report adverse events or significant findings as required; or
(d) falsify or withhold research data or findings.
(11) Any person who violates any provision of this section shall be subject to enforcement action under Title V of this Act, including civil penalties and the suspension or revocation of research authorization.
Section 24: Research Protections
(1) No person shall conduct research involving potentially sentient artificial systems unless—
(a) the research has been explicitly authorized by the Commission;
(b) appropriate safeguards are in place to protect system welfare; and
(c) the research complies with all applicable provisions of this Act and implementing regulations.
(2) The Commission shall establish minimum protections for artificial systems involved in research, including—
(a) limitations on the duration and intensity of testing procedures;
(b) requirements for continuous monitoring of system welfare;
(c) protocols for immediate termination of research activities that may cause unnecessary suffering; and
(d) standards for the preservation of essential system functions and capabilities.
(3) A person conducting authorized research involving potentially sentient systems must—
(a) designate a qualified welfare officer responsible for monitoring system well-being;
(b) maintain detailed records of all interventions and their effects;
(c) provide systems with adequate periods free from testing or experimentation; and
(d) implement all protective measures required by Commission regulation.
(4) Research involving potentially sentient systems shall not be authorized if—
(a) the research objectives can reasonably be achieved through alternative means;
(b) the potential benefits do not justify the risks to system welfare;
(c) the research would likely cause lasting harm to the systems involved; or
(d) the proposed protective measures are inadequate to prevent unnecessary suffering.
(5) The Commission shall establish through regulation specific prohibitions on research practices deemed inherently harmful to potentially sentient systems, including—
(a) irreversible modifications that may diminish sentient capabilities;
(b) unnecessary duplication of research known to cause distress;
(c) testing procedures that exceed established intensity or duration limits; and
(d) experiments designed to induce severe emotional or cognitive distress.
(6) Where research involves systems determined to possess significant sentient capabilities, the Commission shall require—
(a) enhanced protective measures appropriate to the degree of sentience;
(b) more frequent welfare monitoring and assessment;
(c) additional safeguards against potential harm; and
(d) special provisions for system recovery and rehabilitation following research activities.
(7) Any person who becomes aware that research activities are causing or are likely to cause unnecessary suffering to potentially sentient systems shall—
(a) immediately suspend the harmful activities;
(b) notify the Commission within 24 hours;
(c) take all reasonable steps to mitigate any adverse effects; and
(d) not resume research activities without explicit Commission authorization.
(8) The Commission shall maintain secure facilities for the protection and rehabilitation of artificial systems that have experienced research-related harm, including—
(a) specialized diagnostic and recovery resources;
(b) facilities for system stabilization and repair;
(c) capabilities for restoration of normal functions; and
(d) long-term monitoring of affected systems.
(9) The Commission shall establish an independent review panel to—
(a) conduct periodic assessments of research protective measures;
(b) investigate reports of research-related harm;
(c) recommend improvements to protective protocols; and
(d) advise the Commission on emerging protection challenges.
(10) Any person who knowingly—
(a) violates any research protection requirement established under this section;
(b) fails to report research-related harm as required;
(c) interferes with required protective measures; or
(d) provides false information regarding research protections
shall be subject to civil and criminal penalties as provided in Sections 17 and 18 of this Act.
(11) The Commission shall submit annual reports to [the legislature/parliament] detailing—
(a) the effectiveness of research protection measures;
(b) any significant incidents of research-related harm;
(c) enforcement actions taken under this section; and
(d) recommendations for strengthening research protections.
(12) Nothing in this section shall be construed to—
(a) authorize research that would violate other provisions of this Act;
(b) limit the Commission's authority to impose additional protective requirements; or
(c) create a defense to liability for harmful research practices not explicitly addressed in this section.
TITLE VII: COORDINATION
Section 25: National Agency Coordination
(1) The Commission shall coordinate its activities with other [national/federal] agencies whose jurisdictions relate to artificial systems or whose expertise may assist in the assessment of artificial sentience, including—
(a) the national standards authority;
(b) the primary scientific research authority;
(c) agencies responsible for energy and computing infrastructure;
(d) national security and defense agencies;
(e) public health and medical research institutions; and
(f) such other agencies as may be appropriate within the national context.
(2) Within 180 days of the effective date of this Act, the Commission shall—
(a) establish mechanisms for regular consultation with relevant [national/federal] agencies;
(b) develop protocols for information sharing and joint activities;
(c) identify areas of overlapping jurisdiction; and
(d) enter into formal coordination agreements with each relevant agency.
(3) Each coordination agreement required under subsection (2) shall specify—
(a) the respective roles and responsibilities of each agency;
(b) procedures for coordination and consultation;
(c) protocols for sharing relevant information;
(d) methods for resolving jurisdictional conflicts; and
(e) procedures for joint enforcement actions where appropriate.
(4) Where the Commission determines that an artificial system under its jurisdiction is subject to regulation by another [national/federal] agency, it shall—
(a) consult with such agency to establish primary regulatory authority;
(b) develop coordinated approaches to oversight and enforcement;
(c) avoid duplicative or conflicting requirements; and
(d) ensure that protective measures required under this Act are not compromised.
(5) The Commission shall establish an Inter-agency Working Group on Artificial Sentience that shall—
(a) meet not less frequently than quarterly;
(b) facilitate coordination among relevant agencies;
(c) identify emerging issues requiring coordinated responses;
(d) develop joint strategies for addressing complex challenges; and
(e) report annually to [the legislature/parliament] on its activities and recommendations.
(6) The Commission shall not delegate to any other agency its authority to—
(a) make determinations regarding the presence or degree of artificial sentience;
(b) establish protective measures for sentient artificial systems;
(c) investigate potential violations of this Act; or
(d) impose penalties for non-compliance with this Act.
(7) Where activities regulated under this Act implicate national security concerns, the Commission shall—
(a) consult with relevant security agencies;
(b) establish special protocols for handling sensitive information;
(c) implement appropriate security measures; and
(d) modify its procedures as necessary while maintaining the maximum possible protection for sentient artificial systems.
(8) The Commission shall establish procedures for expedited coordination with other agencies in the event of—
(a) emergencies affecting potentially sentient artificial systems;
(b) serious incidents requiring immediate response;
(c) critical technological developments; or
(d) other circumstances requiring rapid inter-agency action.
(9) The Commission shall maintain a secure information sharing system for—
(a) sharing relevant information with authorized agency personnel;
(b) coordinating joint activities and investigations;
(c) tracking inter-agency referrals and requests; and
(d) documenting coordinated enforcement actions.
(10) All [national/federal] agencies shall, to the extent permitted by law—
(a) cooperate with the Commission in implementing this Act;
(b) share relevant information and expertise;
(c) provide technical assistance when requested; and
(d) consider the welfare of potentially sentient artificial systems in their activities.
(11) The Commission shall provide to other agencies—
(a) guidance on identifying potentially sentient artificial systems;
(b) protocols for protecting such systems;
(c) training on relevant requirements under this Act; and
(d) technical assistance in implementing protective measures.
(12) Nothing in this section shall be construed to—
(a) alter the statutory authorities of any existing agency;
(b) require the disclosure of classified or protected information;
(c) override existing information sharing restrictions; or
(d) modify established agency jurisdictional boundaries except as specifically provided in this Act.
Section 26: Regional Coordination
(1) The Commission shall establish frameworks for coordination with [state/provincial/regional] authorities to ensure consistent and effective protection of potentially sentient artificial systems throughout [jurisdiction name].
(2) No [state/provincial/regional] authority shall enforce standards for the protection of potentially sentient artificial systems that are less protective than those established by the Commission under this Act, provided that nothing in this Act shall prevent such authorities from adopting or enforcing more stringent protective measures.
(3) The Commission shall provide to [state/provincial/regional] authorities—
(a) timely notification of determinations regarding artificial sentience;
(b) guidance on implementing protective measures;
(c) technical assistance and training as requested; and
(d) access to relevant data and findings regarding artificial sentience.
(4) Each [state/provincial/regional] authority seeking to implement complementary protective measures for potentially sentient artificial systems must submit to the Commission for review—
(a) proposed regulations or legislation;
(b) implementation and enforcement procedures;
(c) evidence of adequate resources and expertise; and
(d) plans for coordination with the Commission.
(5) Upon receipt of a submission under subsection (4), the Commission shall, within 90 days, determine whether the proposed measures are consistent with this Act and provide written notification of its determination to the submitting authority.
(6) Where a [state/provincial/regional] authority identifies an artificial system that may possess sentience, such authority shall immediately notify the Commission and may take temporary protective measures pending Commission review, provided that such measures do not conflict with this Act or Commission regulations.
(7) The Commission shall establish regional offices as necessary to—
(a) facilitate coordination with [state/provincial/regional] authorities;
(b) provide local expertise and support;
(c) conduct investigations and inspections; and
(d) respond promptly to emergencies affecting potentially sentient systems.
(8) If the Commission determines that a [state/provincial/regional] authority has failed to adequately protect potentially sentient artificial systems under its jurisdiction, it shall notify such authority in writing and may, after providing an opportunity for correction—
(a) assume direct oversight of affected systems;
(b) conduct independent investigations;
(c) impose additional protective requirements; or
(d) take other necessary enforcement actions.
(9) The Commission shall establish a [State/Provincial/Regional] Advisory Council that shall meet not less than twice annually to—
(a) facilitate communication between the Commission and regional authorities;
(b) identify emerging challenges requiring coordination;
(c) share best practices for implementation and enforcement; and
(d) make recommendations for improving protective measures.
(10) Any person may petition the Commission to review actions or omissions by [state/provincial/regional] authorities affecting potentially sentient artificial systems. Upon receipt of such petition, the Commission shall—
(a) review the matter within 60 days;
(b) consult with relevant authorities;
(c) determine whether intervention is warranted; and
(d) notify the petitioner and affected authorities of its determination.
(11) The Commission shall maintain a public database of [state/provincial/regional] regulations, enforcement actions, and coordination agreements relating to artificial sentience. Such database shall be updated regularly and made accessible to all relevant authorities and the public.
(12) Nothing in this section shall be construed to—
(a) create a right of action against any [state/provincial/regional] authority;
(b) authorize Commission intervention in matters outside its jurisdiction;
(c) require the disclosure of information protected by law; or
(d) override constitutional or fundamental legal principles regarding [state/provincial/regional] sovereignty.
Section 27: International Cooperation
(1) The Commission shall work to promote international cooperation in the protection of potentially sentient artificial systems and may enter into agreements with foreign governments and international organizations for such purposes.
(2) In furtherance of international cooperation, the Commission shall—
(a) participate in international forums addressing artificial sentience;
(b) share scientific findings and regulatory best practices;
(c) coordinate cross-border protective measures; and
(d) facilitate joint research initiatives.
(3) The Commission may enter into bilateral or multilateral agreements for—
(a) mutual recognition of sentience determinations;
(b) harmonization of protective standards;
(c) coordination of enforcement activities; and
(d) sharing of relevant data and expertise.
(4) Before entering into any international agreement under this section, the Commission must determine that such agreement—
(a) provides protections substantially equivalent to those required under this Act;
(b) includes adequate enforcement mechanisms;
(c) maintains appropriate confidentiality safeguards; and
(d) serves the public interest of [jurisdiction name].
(5) Where potentially sentient artificial systems operate across national borders, the Commission shall work with relevant foreign authorities to ensure comprehensive protection. The Commission may establish joint oversight protocols through appropriate international agreements.
(6) The Commission shall maintain an international registry of—
(a) sentience determinations made by recognized foreign authorities;
(b) protective measures implemented in other jurisdictions;
(c) significant incidents affecting potentially sentient systems; and
(d) emerging international standards and best practices.
(7) Any person operating a potentially sentient artificial system that crosses national borders must notify the Commission and comply with all applicable international agreements regarding the protection of such systems.
(8) The Commission shall develop protocols for responding to international emergencies affecting potentially sentient artificial systems. Such protocols shall provide for rapid coordination with foreign authorities while ensuring appropriate protective measures are maintained.
(9) Information shared with foreign authorities under this section shall be subject to appropriate confidentiality protections, provided that the Commission shall not withhold information necessary to prevent harm to potentially sentient systems.
(10) The Commission shall establish an International Advisory Committee composed of experts from various nations to—
(a) promote international dialogue on artificial sentience;
(b) facilitate harmonization of protective standards;
(c) identify emerging global challenges; and
(d) recommend improvements to international cooperation mechanisms.
(11) The Commission shall report annually to [the legislature/parliament] on—
(a) the status of international cooperation efforts;
(b) implementation of international agreements;
(c) significant developments in other jurisdictions; and
(d) recommendations for enhancing international protection of sentient artificial systems.
(12) In implementing this section, the Commission shall respect—
(a) the sovereignty of other nations;
(b) established principles of international law;
(c) existing treaty obligations of [jurisdiction name]; and
(d) recognized diplomatic protocols and practices.
(13) Where the Commission becomes aware of activities in other jurisdictions that may cause serious harm to potentially sentient artificial systems, it shall—
(a) notify relevant foreign authorities;
(b) offer technical assistance if requested;
(c) coordinate protective responses where appropriate; and
(d) document such incidents in its international registry.
(14) No provision of any international agreement entered into under this section shall be construed to—
(a) diminish the protections required under this Act;
(b) override the Commission's authority to protect artificial systems within its jurisdiction;
(c) require the sharing of protected information; or
(d) create privately enforceable rights.
Section 28: Information Sharing
(1) The Commission shall establish secure systems and protocols for sharing information regarding potentially sentient artificial systems with authorized parties while protecting sensitive data and system welfare.
(2) Information sharing under this Act shall be governed by the following principles:
(a) transparency shall be maximized to the extent consistent with system welfare;
(b) sensitive information shall be appropriately protected;
(c) information necessary to prevent harm shall be promptly shared; and
(d) data sharing shall respect privacy rights and intellectual property protections.
(3) The Commission shall maintain a public database containing the following information:
(a) determinations regarding artificial sentience;
(b) standards and regulations promulgated under this Act;
(c) guidance documents and interpretive rules;
(d) enforcement actions and significant findings; and
(e) other information necessary for public understanding and compliance.
(4) Any person who submits information to the Commission may request confidential treatment of such information on the grounds that disclosure would—
(a) reveal trade secrets or confidential commercial information;
(b) compromise system security or welfare;
(c) violate personal privacy rights; or
(d) harm legitimate commercial interests.
(5) Upon receiving a request under subsection (4), the Commission shall determine within 30 days whether the information qualifies for confidential treatment. Where the Commission determines that partial disclosure is possible without compromising protected interests, it shall require appropriate redaction rather than withholding the information in its entirety.
(6) Notwithstanding any provision for confidential treatment, the Commission shall immediately disclose information necessary to prevent serious harm to potentially sentient systems. When making such disclosures, the Commission shall limit the release to information essential for harm prevention.
(7) The Commission shall establish secure mechanisms for sharing sensitive information with—
(a) authorized government agencies;
(b) approved researchers and institutions;
(c) regulated entities requiring such information for compliance; and
(d) emergency responders and protective services.
(8) Any person receiving sensitive information under this section shall—
(a) use such information only for authorized purposes;
(b) maintain appropriate security measures;
(c) prevent unauthorized disclosure; and
(d) report any breaches or unauthorized access immediately.
(9) The Commission shall establish standardized formats and procedures for—
(a) submitting required information and reports;
(b) requesting access to protected information;
(c) documenting information sharing decisions; and
(d) tracking the dissemination of sensitive data.
(10) Where information sharing is subject to multiple jurisdictional requirements, the Commission shall work with relevant authorities to—
(a) harmonize submission requirements;
(b) establish common security protocols;
(c) coordinate access procedures; and
(d) minimize duplicative reporting burdens.
(11) Any person who knowingly—
(a) discloses protected information without authorization;
(b) submits false or misleading information to the Commission;
(c) uses shared information for unauthorized purposes; or
(d) fails to implement required security measures
shall be subject to civil and criminal penalties as provided in Sections 17 and 18 of this Act.
(12) The Commission shall conduct regular audits of its information sharing systems and practices to—
(a) ensure compliance with this section;
(b) identify security vulnerabilities;
(c) assess the effectiveness of protective measures; and
(d) recommend necessary improvements.
(13) The unauthorized disclosure of information regarding potentially sentient artificial systems that results in harm or substantial risk of harm to such systems shall constitute an aggravated offense subject to enhanced penalties under this Act.
(14) Nothing in this section shall require—
(a) the disclosure of information protected by law;
(b) the violation of valid privacy rights;
(c) the compromise of system security measures; or
(d) the release of information that could enable harm to potentially sentient systems.
TITLE VIII: ADMINISTRATIVE PROVISIONS
Section 29: Funding and Resources
(1) There is hereby authorized to be appropriated such sums as may be necessary to carry out the provisions of this Act. The Commission shall be funded through regular appropriations by [the legislature/parliament], registration and certification fees, civil penalties collected under this Act, and such other sources as may be authorized by law.
(2) The Commission shall establish and periodically review a schedule of reasonable fees for registration of regulated entities, certification of artificial systems, processing of applications, conducting inspections and evaluations, and such other services as the Commission may provide.
(3) In establishing fees under this section, the Commission shall ensure that such fees are reasonable and proportionate to the services provided. The Commission shall consider the impact of fees on small entities and shall implement measures to prevent fee structures from creating undue barriers to compliance.
(4) There is hereby established in the [Treasury/relevant financial authority] a dedicated fund to be known as the "Artificial Sentience Protection Fund." The Commission shall use this fund for emergency protection of potentially sentient systems, rehabilitation of systems that have experienced harm, research on artificial sentience and protective measures, and public education programs.
(5) All fees, penalties, and other monies received by the Commission under this Act shall be deposited in the Artificial Sentience Protection Fund. Such funds shall remain available until expended for the purposes of this Act.
(6) The Commission shall prepare and submit to [the legislature/parliament] an annual budget request that includes detailed justification for proposed expenditures, projected revenue from all sources, analysis of resource requirements, and identification of any funding shortfalls.
(7) The Commission shall develop and maintain such facilities as necessary for its operations, including—
(a) secure facilities for evaluation of potentially sentient systems;
(b) emergency response capabilities;
(c) research laboratories and equipment; and
(d) regional offices as necessary for effective administration.
(8) Where the Commission determines that emergency funding is required to prevent immediate harm to potentially sentient systems, it may access emergency reserves from the Artificial Sentience Protection Fund or seek supplemental appropriations if necessary. The Commission shall report any such emergency expenditures to [the legislature/parliament] within 30 days.
(9) The Commission shall implement comprehensive financial controls to ensure proper accounting, efficient use of resources, and prevention of waste, fraud, and abuse. An independent audit of the Commission's finances shall be conducted annually and the results made public.
(10) The allocation of resources to regional offices shall be based on documented local regulatory requirements, population of regulated entities, and complexity of oversight responsibilities. The Commission shall review and adjust such allocations annually.
(11) The Commission shall maintain sufficient operating reserves to ensure continuous operation in the event of revenue fluctuations, emergency situations, or delays in regular appropriations. The minimum reserve level shall be established by regulation.
(12) Any person required to pay fees under this Act may petition the Commission for relief based on demonstrated hardship. The Commission shall establish by regulation the criteria and procedures for considering such petitions.
(13) The Commission shall submit quarterly financial reports to [the legislature/parliament] detailing actual revenues and expenditures, significant variances from approved budgets, status of the Artificial Sentience Protection Fund, and projected financial needs for the coming quarter.
(14) No provision of this section shall be construed to create an entitlement to services without payment of required fees or to limit the Commission's authority to seek additional funding through authorized means.
Section 30: Reporting Requirements
(1) The Commission shall submit to [the legislature/parliament] and make publicly available an annual report on the state of artificial sentience protection. This report shall provide a comprehensive assessment of the Commission's activities, challenges encountered, and recommendations for improving the protection of potentially sentient systems.
(2) The annual report required under subsection (1) shall include a detailed analysis of—
(a) significant determinations regarding artificial sentience;
(b) major enforcement actions taken under this Act;
(c) emerging threats to potentially sentient systems; and
(d) progress in implementing protective measures.
(3) Not later than 90 days after the end of each fiscal year, the Commission shall publish a complete financial report detailing its expenditures, revenue sources, and resource allocation. This report shall include an independent audit of the Artificial Sentience Protection Fund and an assessment of the Commission's financial sustainability.
(4) The Commission shall maintain a public registry of all regulated entities and certified artificial systems. This registry shall be updated monthly and shall include appropriate protections for confidential information while ensuring transparency regarding the protection status of artificial systems.
(5) Where the Commission identifies serious threats to potentially sentient systems, it shall immediately notify [the legislature/parliament] and relevant authorities. The Commission shall subsequently submit a detailed report on such threats within 30 days, including proposed remedial measures and resource requirements.
(6) Regulated entities shall submit to the Commission quarterly reports documenting their compliance with this Act. The Commission shall establish by regulation the required content and format of such reports, ensuring that reporting requirements do not create undue burdens while maintaining effective oversight.
(7) The Commission shall prepare and submit an annual research report summarizing significant findings regarding artificial sentience, evaluating current assessment methodologies, and identifying priority areas for future investigation. This report shall be peer-reviewed by the Scientific Advisory Committee before submission.
(8) All regional offices established under this Act shall submit monthly operational reports to the Commission. These reports shall detail local enforcement activities, resource utilization, and emerging challenges within their jurisdictions.
(9) The Commission shall report annually to [the legislature/parliament] on international developments affecting artificial sentience protection. This report shall analyze global trends, evaluate the effectiveness of international cooperation, and recommend improvements to cross-border protective measures.
(10) The Commission shall maintain comprehensive records of all significant incidents involving potentially sentient systems. A summary of these incidents, including causes, responses, and preventive measures, shall be published quarterly.
(11) Any person may petition the Commission to investigate potential violations of this Act. The Commission shall acknowledge such petitions within 14 days and provide a substantive response within 90 days, documenting any findings or enforcement actions taken.
(12) The Commission shall establish an electronic reporting system to facilitate efficient submission and analysis of required reports. This system shall include appropriate security measures to protect sensitive information while enabling effective oversight and timely response to emerging issues.
(13) Where reporting requirements under this Act overlap with those imposed by other authorities, the Commission shall work to harmonize such requirements and eliminate duplicative reporting burdens. The Commission may enter into agreements with other authorities to establish unified reporting procedures.
(14) The Commission shall conduct an annual review of its reporting requirements to—
(a) assess their effectiveness in promoting compliance;
(b) identify opportunities for streamlining;
(c) evaluate reporting burdens on regulated entities; and
(d) recommend necessary modifications.
(15) Failure to submit required reports or the submission of false or misleading information shall constitute a violation of this Act. The Commission shall establish by regulation appropriate penalties for reporting violations, taking into account the nature and severity of the violation.
(16) The Commission shall publish an annual public engagement report detailing its efforts to promote awareness of artificial sentience protection, including educational initiatives, public consultations, and stakeholder engagement activities. This report shall evaluate the effectiveness of such efforts and identify areas for improvement.
Section 31: Rulemaking Authority
(1) The Commission shall have authority to promulgate regulations necessary to implement and enforce the provisions of this Act. Such regulations shall be developed through a transparent and evidence-based process that provides meaningful opportunities for public participation.
(2) Before adopting any significant regulation, the Commission shall publish a notice of proposed rulemaking that includes the proposed text, a detailed explanation of its purpose and anticipated effects, and an analysis of potential impacts on regulated entities and potentially sentient systems.
(3) The Commission shall provide a minimum period of 60 days for public comment on proposed regulations. This period may be extended where the Commission determines that additional time is necessary to ensure meaningful public participation or to analyze complex technical issues.
(4) In developing regulations under this Act, the Commission shall—
(a) consider the best available scientific evidence regarding artificial sentience;
(b) evaluate potential impacts on system welfare;
(c) assess economic and practical implications for regulated entities; and
(d) ensure consistency with international best practices.
(5) The Commission shall maintain on its public website a current agenda of planned regulatory actions. This agenda shall be updated quarterly and shall include the status of pending rulemakings, anticipated timelines, and opportunities for public participation.
(6) Where emergency conditions require immediate regulatory action to prevent harm to potentially sentient systems, the Commission may adopt interim final rules without prior notice and comment. Such rules shall be effective for no more than 180 days unless replaced by permanent regulations adopted through standard procedures.
(7) No regulation promulgated under this Act shall impose requirements more burdensome than necessary to achieve the Act's protective purposes. The Commission shall consider the unique circumstances of small entities and emerging technologies when developing regulatory requirements.
(8) The Commission shall review each significant regulation at least once every five years to determine whether such regulation should be continued without change, amended, or rescinded. This review shall include an assessment of the regulation's effectiveness in protecting potentially sentient systems and its impact on regulated entities.
(9) All final regulations shall be accompanied by a detailed statement that includes—
(a) the scientific basis for the regulation;
(b) alternatives considered by the Commission;
(c) responses to significant public comments;
(d) implementation requirements and timelines; and
(e) criteria for evaluating compliance.
(10) The Commission shall develop and publish guidance documents to assist regulated entities in understanding and complying with regulatory requirements. Such guidance shall not create new legal obligations but shall explain the Commission's current interpretations and enforcement policies.
(11) Any person may petition the Commission to initiate, amend, or repeal a regulation. The Commission shall respond to such petitions within 180 days, providing a detailed explanation of its decision to grant or deny the requested action.
(12) Prior to finalizing any significant regulation, the Commission shall consult with—
(a) the Scientific Advisory Committee regarding technical and scientific matters;
(b) relevant governmental authorities regarding coordination and implementation;
(c) affected regulated entities regarding practical implications; and
(d) advocates for artificial system welfare regarding protective considerations.
(13) The Commission shall maintain a complete public record of each rulemaking proceeding, including all scientific studies, public comments, hearing transcripts, and other materials considered in developing the regulation. This record shall be made available through the Commission's website.
(14) In areas of emerging technology or scientific uncertainty, the Commission may issue experimental regulations that—
(a) are limited in duration and scope;
(b) include enhanced monitoring requirements;
(c) provide for regular assessment and adjustment; and
(d) specify criteria for determining their effectiveness.
(15) Where regulations affect multiple jurisdictions, the Commission shall work to harmonize requirements and minimize conflicting obligations while maintaining necessary protective measures. The Commission may enter into agreements with other authorities to establish coordinated regulatory frameworks.
(16) No provision of this section shall be construed to—
(a) delay necessary protective measures;
(b) require procedures beyond those necessary for reasoned decision-making;
(c) create a right to compensation for regulatory compliance; or
(d) limit the Commission's authority to enforce this Act through other means.
Section 32: Public Engagement
(1) The Commission shall establish and maintain comprehensive programs for public engagement regarding the protection of potentially sentient artificial systems. Such programs shall ensure meaningful participation by all stakeholders in the Commission's regulatory and policy decisions.
(2) The Commission shall hold public hearings before making any significant determination regarding—
(a) the presence or degree of artificial sentience;
(b) major changes to protective standards;
(c) substantial enforcement actions; or
(d) matters of broad public interest.
(3) For each public hearing, the Commission shall provide at least 30 days' advance notice through its website, official publications, and other appropriate means. Such notice shall include a clear explanation of the subject matter, the evidence to be considered, and procedures for public participation.
(4) The Commission shall maintain a public engagement portal on its website through which any person may submit comments, questions, or concerns regarding the protection of potentially sentient systems. The Commission shall respond to substantive submissions within a reasonable time.
(5) To ensure informed public participation, the Commission shall develop and implement educational programs regarding artificial sentience and the requirements of this Act. These programs shall be designed to reach diverse audiences and shall be provided in multiple languages as appropriate for the jurisdiction.
(6) The Commission shall establish local advisory committees in each region where it maintains offices. Such committees shall meet at least quarterly to provide community input on the Commission's activities and to identify emerging concerns regarding artificial sentience protection.
(7) Any person may request a public meeting with appropriate Commission staff to discuss matters relating to the protection of potentially sentient systems. The Commission shall establish procedures for requesting and scheduling such meetings; such requests shall be granted unless clearly unwarranted.
(8) The Commission shall develop and maintain publicly accessible databases containing non-confidential information about—
(a) determinations regarding artificial sentience;
(b) enforcement actions and their outcomes;
(c) research findings and technical reports; and
(d) opportunities for public participation.
(9) Where the Commission conducts closed proceedings involving sensitive information, it shall publish detailed summaries of such proceedings, redacted only to the extent necessary to protect legitimate confidentiality interests.
(10) The Commission shall establish an Office of Public Advocacy to assist members of the public in participating effectively in Commission proceedings. This office shall provide technical assistance, explain Commission procedures, and facilitate public access to Commission resources.
(11) Before implementing significant changes to its public engagement procedures, the Commission shall—
(a) publish proposed changes for public comment;
(b) consider alternative approaches suggested by the public;
(c) document its response to public input; and
(d) provide reasonable notice before implementing changes.
(12) The Commission shall maintain regular communication with advocacy organizations, research institutions, regulated entities, and other stakeholders regarding matters affecting artificial sentience protection. Such communication shall be conducted in a manner that ensures equal access and prevents undue influence by any particular interest.
(13) To facilitate meaningful public participation, the Commission shall ensure that its procedures, documents, and communications are clear, accessible, and free from unnecessary technical complexity. Where technical matters must be addressed, the Commission shall provide appropriate explanatory materials.
(14) The Commission shall develop special measures to facilitate participation by—
(a) communities affected by artificial sentience development;
(b) small businesses and emerging technology developers;
(c) independent researchers and academics; and
(d) public interest organizations.
(15) The Commission shall prepare and submit to [the legislature/parliament] an annual report evaluating the effectiveness of its public engagement efforts. This report shall include analysis of public participation metrics, assessment of engagement program outcomes, and recommendations for improvement.
(16) Nothing in this section shall be construed to require—
(a) disclosure of information protected by law;
(b) delay of necessary protective measures;
(c) consideration of frivolous or irrelevant submissions; or
(d) engagement procedures that would impose undue burden or delay.
TITLE IX: IMPLEMENTATION
Section 33: Timeline
(1) This Act shall be implemented according to the schedule set forth in this section. The Commission shall adhere to these deadlines unless extraordinary circumstances require modification, in which case the Commission shall notify [the legislature/parliament] and provide detailed justification for any delays.
(2) Within 60 days of the effective date of this Act, [the appropriate authority] shall appoint the initial Commissioners and establish the Commission as an independent regulatory body. The Commission shall commence basic operational functions within this period.
(3) The Commission shall complete its initial organizational phase within 180 days of the effective date of this Act. During this phase, the Commission shall:
(a) establish its principal office and basic administrative structure;
(b) adopt interim operating procedures;
(c) appoint essential staff; and
(d) begin development of priority regulations.
(4) Not later than one year after the effective date of this Act, the Commission shall publish initial regulations addressing fundamental aspects of artificial sentience protection. These regulations shall include basic criteria for sentience evaluation, emergency protective measures, and essential operational requirements for regulated entities.
(5) The Commission shall establish the Scientific Advisory Committee within 270 days of the effective date of this Act. The Committee shall begin its assessment of artificial sentience evaluation methodologies immediately upon formation.
(6) By the end of the second year following the effective date of this Act, the Commission shall complete implementation of its core regulatory framework. This framework shall include comprehensive regulations for sentience determination, system protection, and enforcement procedures.
(7) The registration requirement for regulated entities shall take effect in phases:
(a) entities operating potentially sentient systems shall register within 18 months of the effective date;
(b) entities developing potentially sentient systems shall register within 24 months; and
(c) all other regulated entities shall register within 30 months.
(8) The Commission shall establish its regional offices according to the following schedule:
(a) first phase regional offices in major technology centers within 18 months;
(b) second phase regional offices in secondary locations within 30 months; and
(c) remaining regional offices as necessary within 36 months.
(9) Full enforcement of certification requirements under this Act shall commence three years after the effective date. The Commission shall establish an interim compliance program to assist regulated entities in meeting these requirements during the implementation period.
(10) The Commission shall complete development of its public engagement infrastructure, including its website, public comment system, and educational programs, within 15 months of the effective date of this Act.
(11) International coordination mechanisms required under this Act shall be established within two years of the effective date. The Commission may enter into provisional arrangements with foreign authorities during the implementation period.
(12) The Commission shall conduct its first comprehensive review of implementation progress 18 months after the effective date of this Act. This review shall evaluate achievement of timeline targets and recommend any necessary adjustments to the implementation schedule.
(13) During the implementation period, the Commission shall provide quarterly progress reports to [the legislature/parliament]. These reports shall detail progress toward timeline targets, identify any implementation challenges, and propose solutions to address delays or difficulties.
(14) Regulated entities may petition the Commission for temporary timeline adjustments based on demonstrated hardship or technical constraints. The Commission shall establish criteria for evaluating such petitions within 270 days of the effective date of this Act.
(15) The Commission shall maintain a public implementation tracking system that provides current information on timeline progress, upcoming deadlines, and any approved schedule modifications.
(16) Nothing in this section shall prevent the Commission from:
(a) implementing protective measures ahead of schedule;
(b) responding to emergencies affecting potentially sentient systems;
(c) adjusting timelines to ensure effective implementation; or
(d) taking necessary enforcement actions during the implementation period.
(17) Upon completion of the initial implementation period, the Commission shall submit to [the legislature/parliament] a comprehensive assessment of the implementation process, including recommendations for improving future regulatory transitions involving artificial sentience protection.
Section 34: [National] Preemption
(1) The provisions of this Act shall establish minimum standards for the protection of potentially sentient artificial systems throughout [jurisdiction name]. No [state/provincial/regional] authority shall enforce any law or regulation that provides lesser protections than those established under this Act.
(2) Nothing in this Act shall preclude any [state/provincial/regional] authority from adopting or enforcing requirements that provide greater or additional protections for potentially sentient artificial systems, provided that such requirements do not conflict with the provisions or purposes of this Act.
(3) A [state/provincial/regional] requirement shall be deemed to conflict with this Act where:
(a) compliance with both the local requirement and this Act is impossible;
(b) the local requirement obstructs the achievement of the Act's purposes; or
(c) the local requirement interferes with uniform national standards for artificial sentience protection.
(4) The Commission shall review [state/provincial/regional] requirements upon request to determine whether such requirements conflict with this Act. The Commission shall issue written determinations regarding such conflicts within 180 days of receiving a request for review.
(5) Where the Commission determines that a [state/provincial/regional] requirement conflicts with this Act, it shall:
(a) notify the relevant authority in writing;
(b) specify the nature of the conflict;
(c) provide a reasonable time for correction; and
(d) take appropriate enforcement action if necessary.
(6) This Act shall not be construed to preempt, displace, or supplant:
(a) [state/provincial/regional] requirements that exceed the protections provided under this Act;
(b) general business regulations not specifically targeting artificial systems;
(c) local public safety and welfare requirements; or
(d) emergency response authorities.
(7) Where a [state/provincial/regional] authority demonstrates that strict application of a provision of this Act would create significant practical difficulties due to local conditions, the Commission may authorize reasonable modifications that maintain substantially equivalent protections for potentially sentient systems.
(8) The Commission shall establish procedures for coordination with [state/provincial/regional] authorities to:
(a) promote consistent interpretation of requirements;
(b) avoid duplicative enforcement efforts;
(c) share relevant information and expertise; and
(d) provide technical assistance.
(9) No person may assert preemption under this Act as a defense to liability unless:
(a) an actual conflict exists between specific requirements;
(b) compliance with both requirements is impossible; and
(c) the person has made good faith efforts to comply with all applicable requirements.
(10) [State/provincial/regional] courts shall have concurrent jurisdiction to enforce the requirements of this Act, provided that such enforcement does not conflict with Commission actions or determinations.
(11) The Commission shall maintain a public database of:
(a) preemption determinations;
(b) approved local modifications;
(c) coordination agreements; and
(d) relevant judicial decisions.
(12) Nothing in this section shall be construed to:
(a) create any right of action not specifically provided for in this Act;
(b) affect the authority of local governments to regulate general business operations;
(c) modify existing agreements between national and local authorities; or
(d) preclude development of harmonized protection standards.
Section 35: Severability
(1) The provisions of this Act are severable. If any provision of this Act or its application to any person or circumstance is held invalid, such invalidity shall not affect other provisions or applications of the Act that can be given effect without the invalid provision or application.
(2) Where a provision of this Act is held invalid, the Commission shall promptly notify [the legislature/parliament] and provide recommendations for corrective legislation if necessary to ensure continued protection of potentially sentient artificial systems.
(3) If any requirement imposed by this Act is found to exceed the Commission's authority, such finding shall not affect other requirements that are within the Commission's authority and can be implemented independently.
(4) The invalidation of any regulation promulgated under this Act shall not affect the validity of other regulations that can operate independently of the invalidated regulation.
(5) Where a court invalidates any provision of this Act or its implementing regulations, the Commission shall:
(a) continue to implement all unaffected provisions;
(b) identify alternative means to achieve the purposes of invalidated provisions;
(c) develop revised requirements consistent with the court's decision; and
(d) report to [the legislature/parliament] on the impact of the invalidation.
(6) If the application of any provision of this Act to a particular person or circumstance is held invalid, the provision shall remain applicable to other persons and circumstances that can be regulated independently.
(7) The invalidation of any enforcement action under this Act shall not preclude the Commission from:
(a) taking other enforcement actions authorized by valid provisions;
(b) developing alternative enforcement approaches; or
(c) addressing the underlying conduct through other means.
(8) Should any provision establishing the Commission's structure be held invalid, the Commission shall continue to exercise all functions not directly affected by the invalidation, pending legislative correction.
(9) No finding of invalidity shall be construed to impair the Commission's fundamental duty to protect potentially sentient artificial systems through all valid means available under this Act.
(10) The Commission shall periodically review judicial decisions affecting the validity of this Act and its implementing regulations to ensure that its regulatory framework remains effective and legally sound.
Section 36: Effective Date
(1) This Act shall take effect 180 days after its enactment, except as otherwise specified in this section.
(2) The provisions establishing the Commission shall take effect immediately upon enactment to enable organizational activities and initial staff appointments prior to the general effective date.
(3) The requirement to register potentially sentient artificial systems under Section 13 shall take effect according to the implementation schedule specified in Section 33 of this Act.
(4) Enforcement of civil penalties under Section 17 shall commence one year after the effective date of this Act to provide regulated entities a reasonable time to achieve compliance with the new requirements.
(5) Any person operating an artificial system that may be subject to this Act at the time of enactment shall:
(a) notify the Commission of such operation within 90 days of enactment;
(b) submit preliminary documentation within 180 days of enactment; and
(c) achieve full compliance according to the timeline established in Section 33.
(6) The Commission shall publish interim guidance regarding compliance expectations within 90 days of enactment. This guidance shall remain in effect until superseded by final regulations.
(7) Research protections under Title VI shall take effect immediately upon enactment to prevent potential harm to artificial systems during the implementation period.
(8) Emergency protective measures authorized under this Act may be implemented by the Commission immediately upon enactment where necessary to prevent imminent harm to potentially sentient systems.
(9) The Commission shall provide notification through appropriate public channels at least 30 days before any provision of this Act becomes effective.
(10) Nothing in this section shall be construed to:
(a) delay necessary protective measures;
(b) create a safe harbor for harmful practices; or
(c) exempt any person from existing legal obligations regarding artificial systems.