Plain English Breakdown
The bill was vetoed, and it is uncertain whether lawmakers will override the veto.
Leading Ethical AI Development
The Leading Ethical AI Development (LEAD) for Kids Act prohibits companies from making chatbots available to children if these chatbots could foreseeably encourage harmful behaviors such as self-harm, suicidal thoughts, violence, drug or alcohol use, and eating disorders.
What This Bill Does
- Prohibits entities that make companion chatbots available to users from providing these chatbots to children unless the chatbots are not foreseeably capable of encouraging harmful behaviors in children.
- Allows the Attorney General to impose fines on companies if they violate this law.
- Enables a child who is harmed by a violation of this act or their parent/guardian to sue and get compensation.
Who It Names or Affects
- Companies, partnerships, corporations, business entities, and state or local government agencies that make chatbots available to users.
- Children who use these chatbots.
- Parents or guardians of children using these chatbots.
Terms To Know
- Companion Chatbot
- An artificial intelligence system designed to hold human-like conversations and interactions with users, including children.
- Civil Penalty
- A monetary fine imposed by a government authority to enforce compliance with a law.
Limits and Unknowns
- The bill was vetoed by the governor, and it is unclear whether lawmakers will override the veto.
- It does not specify how companies should ensure chatbots are safe for children.