Pub. L. 117–263, div. G, title LXXII, subtitle B, Dec. 23, 2022, 136 Stat. 3668, provided that:
SHORT TITLE.
“This subtitle may be cited as the ‘Advancing American AI Act’.
PURPOSES.
“The purposes of this subtitle are to—
“(1) encourage agency artificial intelligence-related programs and initiatives that enhance the competitiveness of the United States and foster an approach to artificial intelligence that builds on the strengths of the United States in innovation and entrepreneurialism;
“(2) enhance the ability of the Federal Government to translate research advances into artificial intelligence applications to modernize systems and assist agency leaders in fulfilling their missions;
“(3) promote adoption of modernized business practices and advanced technologies across the Federal Government that align with the values of the United States, including the protection of privacy, civil rights, and civil liberties; and
“(4) test and harness applied artificial intelligence to enhance mission effectiveness, agency program integrity, and business practice efficiency.
DEFINITIONS.
“In this subtitle:
“(1) Agency .— The term ‘agency’ has the meaning given the term in section 3502 of title 44, United States Code.
“(2) Appropriate congressional committees .— The term ‘appropriate congressional committees’ means—
“(A) the Committee on Homeland Security and Governmental Affairs of the Senate;
“(B) the Committee on Oversight and Reform [now Committee on Oversight and Accountability] of the House of Representatives; and
“(C) the Committee on Homeland Security of the House of Representatives.
“(3) Artificial intelligence .— The term ‘artificial intelligence’ has the meaning given the term in section 238(g) of the John S. McCain National Defense Authorization Act for Fiscal Year 2019 (10 U.S.C. 2358 note).
“(4) Artificial intelligence system .— The term ‘artificial intelligence system’—
“(A) means any data system, software, application, tool, or utility that operates in whole or in part using dynamic or static machine learning algorithms or other forms of artificial intelligence, whether—
“(i) the data system, software, application, tool, or utility is established primarily for the purpose of researching, developing, or implementing artificial intelligence technology; or
“(ii) artificial intelligence capability is integrated into another system or agency business process, operational activity, or technology system; and
“(B) does not include any common commercial product within which artificial intelligence is embedded, such as a word processor or map navigation system.
“(5) Department .— The term ‘Department’ means the Department of Homeland Security.
“(6) Director .— The term ‘Director’ means the Director of the Office of Management and Budget.
PRINCIPLES AND POLICIES FOR USE OF ARTIFICIAL INTELLIGENCE IN GOVERNMENT.
“(a) Guidance .— The Director shall, when developing the guidance required under section 104(a) of the AI in Government Act of 2020 (title I of division U of Public Law 116–260) [see note below], consider—
“(1) the considerations and recommended practices identified by the National Security Commission on Artificial Intelligence in the report entitled ‘Key Considerations for the Responsible Development and Fielding of AI’, as updated in April 2021;
“(2) the principles articulated in Executive Order 13960 (85 Fed. Reg. 78939; relating to promoting the use of trustworthy artificial intelligence in the Federal Government); and
“(3) the input of—
“(A) the Administrator of General Services;
“(B) relevant interagency councils, such as the Federal Privacy Council, the Chief Financial Officers Council, the Chief Information Officers Council, and the Chief Data Officers Council;
“(C) other governmental and nongovernmental privacy, civil rights, and civil liberties experts;
“(D) academia;
“(E) industry technology and data science experts; and
“(F) any other individual or entity the Director determines to be appropriate.
“(b) Department Policies and Processes for Procurement and Use of Artificial Intelligence-enabled Systems .— Not later than 180 days after the date of enactment of this Act [Dec. 23, 2022]—
“(1) the Secretary of Homeland Security, with the participation of the Chief Procurement Officer, the Chief Information Officer, the Chief Privacy Officer, and the Officer for Civil Rights and Civil Liberties of the Department and any other person determined to be relevant by the Secretary of Homeland Security, shall issue policies and procedures for the Department related to—
“(A) the acquisition and use of artificial intelligence; and
“(B) considerations for the risks and impacts related to artificial intelligence-enabled systems, including associated data of machine learning systems, to ensure that full consideration is given to—
“(i) the privacy, civil rights, and civil liberties impacts of artificial intelligence-enabled systems; and
“(ii) security against misuse, degradation, or rendering inoperable of artificial intelligence-enabled systems; and
“(2) the Chief Privacy Officer and the Officer for Civil Rights and Civil Liberties of the Department shall report to Congress on any additional staffing or funding resources that may be required to carry out the requirements of this subsection.
“(c) Inspector General .— Not later than 180 days after the date of enactment of this Act, the Inspector General of the Department shall identify any training and investments needed to enable employees of the Office of the Inspector General to continually advance their understanding of—
“(1) artificial intelligence systems;
“(2) best practices for governance, oversight, and audits of the use of artificial intelligence systems; and
“(3) how the Office of the Inspector General is using artificial intelligence to enhance audit and investigative capabilities, including actions to—
“(A) ensure the integrity of audit and investigative results; and
“(B) guard against bias in the selection and conduct of audits and investigations.
“(d) Artificial Intelligence Hygiene and Protection of Government Information, Privacy, Civil Rights, and Civil Liberties.—
“(1) Establishment .— Not later than 1 year after the date of enactment of this Act, the Director, in consultation with a working group consisting of members selected by the Director from appropriate interagency councils, shall develop an initial means by which to—
“(A) ensure that contracts for the acquisition of an artificial intelligence system or service—
“(i) align with the guidance issued to the head of each agency under section 104(a) of the AI in Government Act of 2020 (title I of division U of Public Law 116–260);
“(ii) address protection of privacy, civil rights, and civil liberties;
“(iii) address the ownership and security of data and other information created, used, processed, stored, maintained, disseminated, disclosed, or disposed of by a contractor or subcontractor on behalf of the Federal Government; and
“(iv) include considerations for securing the training data, algorithms, and other components of any artificial intelligence system against misuse, unauthorized alteration, degradation, or rendering inoperable; and
“(B) address any other issue or concern determined to be relevant by the Director to ensure appropriate use and protection of privacy and Government data and other information.
“(2) Consultation .— In developing the considerations under paragraph (1)(A)(iv), the Director shall consult with the Secretary of Homeland Security, the Secretary of Energy, the Director of the National Institute of Standards and Technology, and the Director of National Intelligence.
“(3) Review .— The Director—
“(A) should continuously update the means developed under paragraph (1); and
“(B) not later than 2 years after the date of enactment of this Act and not less frequently than every 2 years thereafter, shall update the means developed under paragraph (1).
“(4) Briefing .— The Director shall brief the appropriate congressional committees—
“(A) not later than 90 days after the date of enactment of this Act and thereafter on a quarterly basis until the Director first implements the means developed under paragraph (1); and
“(B) annually thereafter on the implementation of this subsection.
“(5) Sunset .— This subsection shall cease to be effective on the date that is 5 years after the date of enactment of this Act.
AGENCY INVENTORIES AND ARTIFICIAL INTELLIGENCE USE CASES.
“(a) Inventory .— Not later than 60 days after the date of enactment of this Act [Dec. 23, 2022], and continuously thereafter for a period of 5 years, the Director, in consultation with the Chief Information Officers Council, the Chief Data Officers Council, and other interagency bodies as determined to be appropriate by the Director, shall require the head of each agency to—
“(1) prepare and maintain an inventory of the artificial intelligence use cases of the agency, including current and planned uses;
“(2) share agency inventories with other agencies, to the extent practicable and consistent with applicable law and policy, including those concerning protection of privacy and of sensitive law enforcement, national security, and other protected information; and
“(3) make agency inventories available to the public, in a manner determined by the Director, and to the extent practicable and in accordance with applicable law and policy, including those concerning the protection of privacy and of sensitive law enforcement, national security, and other protected information.
“(b) Central Inventory .— The Director is encouraged to designate a host entity and ensure the creation and maintenance of an online public directory to—
“(1) make agency artificial intelligence use case information available to the public and those wishing to do business with the Federal Government; and
“(2) identify common use cases across agencies.
“(c) Sharing .— The sharing of agency inventories described in subsection (a)(2) may be coordinated through the Chief Information Officers Council, the Chief Data Officers Council, the Chief Financial Officers Council, the Chief Acquisition Officers Council, or other interagency bodies to improve interagency coordination and information sharing for common use cases.
“(d) Department of Defense .— Nothing in this section shall apply to the Department of Defense.
RAPID PILOT, DEPLOYMENT AND SCALE OF APPLIED ARTIFICIAL INTELLIGENCE CAPABILITIES TO DEMONSTRATE MODERNIZATION ACTIVITIES RELATED TO USE CASES.
“(a) Identification of Use Cases .— Not later than 270 days after the date of enactment of this Act [Dec. 23, 2022], the Director, in consultation with the Administrator of General Services and other interagency bodies as determined to be appropriate by the Director, shall identify 4 new use cases for the application of artificial intelligence-enabled systems to support interagency or intra-agency modernization initiatives that require linking multiple siloed internal and external data sources, consistent with applicable laws and policies, including those concerning the protection of privacy and of sensitive law enforcement, national security, and other protected information.
“(b) Pilot Program.—
“(1) Purposes .— The purposes of the pilot program under this subsection include—
“(A) to enable agencies to operate across organizational boundaries, coordinating between existing established programs and silos to improve delivery of the agency mission;
“(B) to demonstrate the circumstances under which artificial intelligence can be used to modernize or assist in modernizing legacy agency systems; and
“(C) to leverage commercially available artificial intelligence technologies that—
“(i) operate in secure cloud environments that can deploy rapidly without the need to replace existing systems; and
“(ii) do not require extensive staff or training to build.
“(2) Deployment and pilot .— Not later than 1 year after the date of enactment of this Act, the Director, in coordination with the heads of relevant agencies and Federal entities, including the Administrator of General Services, the Bureau of Fiscal Service of the Department of the Treasury, the Council of the Inspectors General on Integrity and Efficiency, and the Pandemic Response Accountability Committee, and other officials as the Director determines to be appropriate, shall ensure the initiation of the piloting of the 4 new artificial intelligence use case applications identified under subsection (a), leveraging commercially available technologies and systems to demonstrate scalable artificial intelligence-enabled capabilities to support the use cases identified under subsection (a).
“(3) Risk evaluation and mitigation plan .— In carrying out paragraph (2), the Director shall require the heads of agencies to—
“(A) evaluate risks in utilizing artificial intelligence systems; and
“(B) develop a risk mitigation plan to address those risks, including consideration of—
“(i) the artificial intelligence system not performing as expected or as designed;
“(ii) the quality and relevancy of the data resources used in the training of the algorithms used in an artificial intelligence system;
“(iii) the processes for training and testing, evaluating, validating, and modifying an artificial intelligence system; and
“(iv) the vulnerability of a utilized artificial intelligence system to unauthorized manipulation or misuse, including the use of data resources that substantially differ from the training data.
“(4) Prioritization .— In carrying out paragraph (2), the Director shall prioritize modernization projects that—
“(A) would benefit from commercially available privacy-preserving techniques, such as use of differential privacy, federated learning, and secure multiparty computing; and
“(B) otherwise take into account considerations of civil rights and civil liberties.
“(5) Privacy protections .— In carrying out paragraph (2), the Director shall require the heads of agencies to use privacy-preserving techniques when feasible, such as differential privacy, federated learning, and secure multiparty computing, to mitigate any risks to individual privacy or national security created by a project or data linkage.
“(6) Use case modernization application areas .— Use case modernization application areas described in paragraph (2) shall include not less than 1 from each of the following categories:
“(A) Applied artificial intelligence to drive agency productivity efficiencies in predictive supply chain and logistics, such as—
“(i) predictive food demand and optimized supply;
“(ii) predictive medical supplies and equipment demand and optimized supply; or
“(iii) predictive logistics to accelerate disaster preparedness, response, and recovery.
“(B) Applied artificial intelligence to accelerate agency investment return and address mission-oriented challenges, such as—
“(i) applied artificial intelligence portfolio management for agencies;
“(ii) workforce development and upskilling;
“(iii) redundant and laborious analyses;
“(iv) determining compliance with Government requirements, such as with Federal financial management and grants management, including implementation of chapter 64 of subtitle V of title 31, United States Code;
“(v) addressing fraud, waste, and abuse in agency programs and mitigating improper payments; or
“(vi) outcomes measurement to measure economic and social benefits.
“(7) Requirements .— Not later than 3 years after the date of enactment of this Act, the Director, in coordination with the heads of relevant agencies and other officials as the Director determines to be appropriate, shall establish an artificial intelligence capability within each of the 4 use case pilots under this subsection that—
“(A) solves data access and usability issues with automated technology and eliminates or minimizes the need for manual data cleansing and harmonization efforts;
“(B) continuously and automatically ingests data and updates domain models in near real-time to help identify new patterns and predict trends, to the extent possible, to help agency personnel to make better decisions and take faster actions;
“(C) organizes data for meaningful data visualization and analysis so the Government has predictive transparency for situational awareness to improve use case outcomes;
“(D) is rapidly configurable to support multiple applications and automatically adapts to dynamic conditions and evolving use case requirements, to the extent possible;
“(E) enables knowledge transfer and collaboration across agencies; and
“(F) preserves intellectual property rights to the data and output for benefit of the Federal Government and agencies and protects sensitive personally identifiable information.
“(c) Briefing .— Not earlier than 270 days but not later than 1 year after the date of enactment of this Act, and annually thereafter for 4 years, the Director shall brief the appropriate congressional committees on the activities carried out under this section and results of those activities.
“(d) Sunset .— This section shall cease to be effective on the date that is 5 years after the date of enactment of this Act.
ENABLING ENTREPRENEURS AND AGENCY MISSIONS.
“(a) Innovative Commercial Items .— [Amended section 880 of the National Defense Authorization Act for Fiscal Year 2017 (41 U.S.C. 3301 note).]
“(b) DHS Other Transaction Authority .— [Amended section 391 of Title 6, Domestic Security.]
“(c) Commercial Off the Shelf Supply Chain Risk Management Tools.—
“(1) In general .— The General Services Administration is encouraged to pilot commercial off the shelf supply chain risk management tools to improve the ability of the Federal Government to characterize, monitor, predict, and respond to specific supply chain threats and vulnerabilities that could inhibit future Federal acquisition operations.
“(2) Consultation .— In carrying out this subsection, the General Services Administration shall consult with the Federal Acquisition Security Council established under section 1322(a) of title 41, United States Code.
INTELLIGENCE COMMUNITY EXCEPTION.
“Nothing in this subtitle shall apply to any element of the intelligence community, as defined in section 3 of the National Security Act of 1947 (50 U.S.C. 3003).”
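The pilot-program provisions of the subtitle direct agencies toward privacy-preserving techniques such as differential privacy, federated learning, and secure multiparty computing, without prescribing any particular implementation. As a purely illustrative, non-statutory sketch of the first of these, the Python snippet below shows the standard Laplace mechanism for releasing a numeric statistic with epsilon-differential privacy; the dataset, query, count, and epsilon value are hypothetical assumptions chosen for the example, not anything specified by the Act.

    import numpy as np

    def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
        """Return an epsilon-differentially-private estimate of true_value.

        Adds Laplace noise with scale sensitivity / epsilon, the classic
        Laplace mechanism for numeric queries with bounded L1 sensitivity.
        """
        scale = sensitivity / epsilon
        return true_value + np.random.laplace(loc=0.0, scale=scale)

    # Hypothetical example: privately release a count of records in an agency dataset.
    # A counting query has sensitivity 1 (adding or removing one individual changes the
    # count by at most 1); epsilon is the privacy-loss budget chosen by the agency.
    true_count = 12_345  # assumed, illustrative value
    private_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
    print(round(private_count))

Smaller epsilon values add more noise and give stronger privacy protection; the appropriate budget is a policy choice for the agency running the pilot, not something fixed by the statute.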