
SB796 • 2026

Artificial Intelligence Companion Chatbots and Minors Act; established, enforcement, civil penalty.

A BILL to amend the Code of Virginia by adding in Title 59.1 a chapter numbered 60, consisting of sections numbered 59.1-614 through 59.1-618, relating to Artificial Intelligence Chatbots and Minors Act established; enforcement; civil penalties; individual action.

Children Technology
Continued

This bill was continued to 2027 in the House Committee on Communications, Technology and Innovation based on the latest official action; it has not reached final enactment.

Sponsor
Durant
Last action
2026-03-02
Official status
Continued
Effective date
Not listed

Plain English Breakdown


Artificial Intelligence Chatbots and Minors Act

This act requires companies running chatbots with large user bases to protect minors from harm, report incidents involving serious injury or death, and disclose that users are interacting with an AI, and it subjects them to civil penalties and individual lawsuits if they violate these requirements.

What This Bill Does

  • Requires covered entities (companies with 500,000 or more monthly active users) to implement systems to identify emotional dependence on chatbots, take steps to reduce it, and provide crisis information to users showing signs of an acute mental health crisis.
  • Requires covered entities to make reasonable efforts, within 24 hours, to notify appropriate emergency services or law enforcement when they learn a user faces an imminent risk of death or serious physical injury; if the operator lacks enough information to enable a response, it must instead urge the user to contact emergency services and document the steps it took.
  • Requires covered entities to report covered incidents (such as a user's death, suicide attempt, or serious injury connected to a chatbot) to the Attorney General within 15 days of learning of them.
  • Requires chatbot operators to disclose that users are interacting with AI and not humans at regular intervals during use.
  • Authorizes the Attorney General to seek injunctions and civil penalties of up to $50,000 per violation, with each day of noncompliance counting as a separate violation.
  • Allows any person harmed by a violation, or the parent or legal guardian of a harmed minor, to bring an individual civil action.
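To make the penalty structure concrete, here is a minimal illustrative sketch (not part of the bill; the function and variable names are ours) of how the cap of $50,000 per violation compounds when each day of noncompliance counts as a separate violation:

```python
# Illustrative sketch of the bill's civil-penalty arithmetic:
# up to $50,000 per violation, and each day a covered entity fails
# to comply with a requirement is a separate violation.

MAX_PENALTY_PER_VIOLATION = 50_000  # statutory cap per violation

def max_exposure(days_noncompliant: int, requirements_violated: int) -> int:
    """Upper bound on civil penalties the Attorney General could seek.

    Exposure scales with both the number of requirements violated and
    the number of days out of compliance, since each day of each
    failure counts as its own violation.
    """
    violations = days_noncompliant * requirements_violated
    return violations * MAX_PENALTY_PER_VIOLATION

# Example: one requirement ignored for 30 days -> up to $1,500,000.
print(max_exposure(days_noncompliant=30, requirements_violated=1))
```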

Who It Names or Affects

  • Companies running chatbots with large user bases (500,000 or more monthly active users).
  • Users of chatbots who are minors and may be at risk of harm.
  • The Attorney General's office, which will enforce the rules.

Terms To Know

Chatbot
An AI system that accepts open-ended user input, produces adaptive natural-language responses, and maintains a conversational state across multi-turn dialogue.
Covered entity
A company or organization running chatbots with 500,000 or more monthly active users worldwide.
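The covered-entity threshold is measured in the aggregate: monthly active users are summed across all chatbots offered by the entity and its affiliates, not counted per chatbot. A minimal sketch of that test (the function and field names are hypothetical, not from the bill):

```python
# Sketch of the covered-entity threshold test: aggregate monthly
# active users (MAU) across ALL chatbots offered by an entity and
# its affiliates, then compare against the 500,000-user threshold.

COVERED_ENTITY_THRESHOLD = 500_000

def is_covered_entity(mau_by_chatbot: dict[str, int]) -> bool:
    """Return True if aggregated MAU across the entity's and its
    affiliates' chatbots meets or exceeds the statutory threshold."""
    total_mau = sum(mau_by_chatbot.values())
    return total_mau >= COVERED_ENTITY_THRESHOLD

# Example: three chatbots that each fall below the threshold still
# make the operator a covered entity in aggregate (525,000 total).
fleet = {"companion_app": 300_000, "affiliate_bot": 150_000, "study_helper": 75_000}
print(is_covered_entity(fleet))
```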

Limits and Unknowns

  • The bill's systems, emergency-notification, and reporting duties apply only to covered entities; smaller operators (fewer than 500,000 monthly active users) remain subject to the AI-disclosure requirement but not to those duties.
  • Incident reports to the Attorney General are triggered by knowledge of a covered incident (due within 15 days) rather than filed on a fixed schedule, so reporting frequency depends on how often incidents occur.

Amendments

These notes stay tied to the official amendment files and metadata from the legislature.

SB796AS1

2026-02-17 • Member

Senator Durant Amendment

Plain English: The amendment changes the language in a bill to make it clearer when an AI chatbot is not allowed to encourage, advocate for, or incite certain behaviors.

  • Replaces 'glorifies or promotes' with 'encourages, advocates for, or incites' to specify prohibited actions by AI chatbots.
  • In the bill text, this wording appears in the definition of explicit content, covering content that provides specific instructions for, or encourages, advocates for, or incites, suicide, self-injury, or disordered eating behaviors.

Bill History

  1. 2026-03-02 Communications, Technology and Innovation

    Continued to 2027 in Communications, Technology and Innovation (Voice Vote)

  2. 2026-03-02 House

    House committee offered

  3. 2026-02-24 House

    Placed on Calendar

  4. 2026-02-24 House

    Read first time

  5. 2026-02-24 Communications, Technology and Innovation

    Referred to Committee on Communications, Technology and Innovation

  6. 2026-02-17 Senate

    Constitutional reading dispensed (on 3rd reading)

  7. 2026-02-17 Senate

    Engrossed by Senate - committee substitute as amended

  8. 2026-02-17 Senate

    Rules suspended

  9. 2026-02-17 Senate

    Engrossment reconsidered by Senate (40-Y 0-N 0-A)

  10. 2026-02-17 Senate

    Reading of amendment waived (Voice Vote)

  11. 2026-02-17 Senate

    Senator Durant Amendment agreed to

  12. 2026-02-17 Senate

    Engrossed by Senate (Voice Vote)

  13. 2026-02-17 Senate

    Constitutional reading dispensed (on 3rd reading) (40-Y 0-N 0-A)

  14. 2026-02-17 Senate

    Read third time and passed Senate (39-Y 1-N 0-A)

  15. 2026-02-17 Senate

    Floor offered Senator Durant Amendment

  16. 2026-02-16 Senate

    Reading of substitute waived (Voice Vote)

  17. 2026-02-16 Senate

    Read second time

  18. 2026-02-16 Senate

    Engrossed by Senate - floor substitute (Voice Vote)

  19. 2026-02-16 Senate

    Reading of substitute waived

  20. 2026-02-16 Senate

    Floor Offered

  21. 2026-02-16 General Laws and Technology

    Committee substitute rejected (Voice Vote)

  22. 2026-02-16 Senate

    Reading of amendment waived (Voice Vote)

  23. 2026-02-16 Senate

    Senator Durant Substitute agreed to (Voice Vote)

  24. 2026-02-16 Senate

    Engrossed by Senate (Voice Vote)

  25. 2026-02-13 Senate

    Rules suspended

  26. 2026-02-13 Senate

    Passed by for the day

  27. 2026-02-13 Senate

    Constitutional reading dispensed Block Vote (on 1st reading) (36-Y 0-N 0-A)

  28. 2026-02-13 Senate

    Passed by for the day Block Vote (Voice Vote)

  29. 2026-02-13 Senate

    Constitutional reading dispensed Block Vote (on 1st reading) (35-Y 0-N 0-A)

  30. 2026-02-13 Senate

    Passed by for the day Block Vote (Voice Vote)

  31. 2026-02-12 General Laws and Technology

    Committee substitute printed 26107851D-S1

  32. 2026-02-11 General Laws and Technology

    Reported from General Laws and Technology with substitute (14-Y 0-N 1-A)

  33. 2026-02-11 Senate

    Fiscal Impact Statement from Department of Planning and Budget (SB796)

  34. 2026-02-11 Senate

    Senate committee offered

  35. 2026-01-23 Senate

    Presented and ordered printed 26105447D

  36. 2026-01-23 General Laws and Technology

    Referred to Committee on General Laws and Technology

Official Summary Text

Artificial Intelligence Chatbots and Minors Act established; enforcement; civil penalties; individual action.
Creates the Artificial Intelligence Chatbots and Minors Act to require a covered entity, defined in the bill, to (i) implement certain reasonable systems and processes, (ii) make reasonable efforts to notify appropriate emergency services or law enforcement if it obtains knowledge that a user faces an imminent risk of death or serious physical injury, and (iii) submit a report to the Attorney General after obtaining knowledge of certain covered incidents, defined in the bill, connected to one or more of its chatbots. The bill also requires an operator, defined in the bill, to disclose the non-human nature of the chatbot to users at certain intervals. The bill authorizes the Attorney General to initiate an action to seek an injunction and civil penalties for violations and also provides an individual civil action for any person harmed by a violation or the parent or legal guardian of a minor harmed by a violation.
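The summary sets out three distinct clocks for a covered incident: 24 hours to attempt emergency notification, 15 days to report to the Attorney General, and 60 days after the initial report for an optional supplement. A minimal illustrative sketch of those deadlines (not part of the bill; the function name and the assumption that the initial report is filed at the 15-day deadline are ours):

```python
# Sketch of the deadlines the bill attaches once an operator obtains
# knowledge of a covered incident: 24 hours to make reasonable efforts
# to notify emergency services, 15 days to report to the Attorney
# General, and 60 days after the initial report to supplement it.

from datetime import datetime, timedelta

def compliance_deadlines(knowledge_obtained: datetime) -> dict[str, datetime]:
    emergency_notice = knowledge_obtained + timedelta(hours=24)
    ag_report = knowledge_obtained + timedelta(days=15)
    # The supplemental window runs from the initial report; this sketch
    # assumes the report is filed exactly at the 15-day deadline.
    supplemental = ag_report + timedelta(days=60)
    return {
        "emergency_notice_by": emergency_notice,
        "ag_report_by": ag_report,
        "supplemental_report_by": supplemental,
    }

deadlines = compliance_deadlines(datetime(2026, 7, 1, 9, 0))
for name, due in deadlines.items():
    print(name, due.isoformat())
```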

Current Bill Text

SENATE BILL NO. 796

AMENDMENT IN THE NATURE OF A SUBSTITUTE

(Proposed by the House Committee on Communications, Technology and Innovation on ________________)

(Patron Prior to Substitute--Senator Durant)

A BILL to amend the Code of Virginia by adding in Title 59.1 a chapter numbered 60, consisting of sections numbered 59.1-614 through 59.1-618, relating to Artificial Intelligence Chatbots and Minors Act established; enforcement; civil penalties; individual action.

Be it enacted by the General Assembly of Virginia:

1. That the Code of Virginia is amended by adding in Title 59.1 a chapter numbered 60, consisting of sections numbered 59.1-614 through 59.1-618, as follows:

CHAPTER 60.

ARTIFICIAL INTELLIGENCE CHATBOTS AND MINORS ACT.

§ 59.1-614. Definitions.

As used in this chapter, unless the context requires a different meaning:

"Affiliate" means any person or entity that directly or indirectly controls, is controlled by, or is under common control with another person or entity.

"Chatbot" means any artificial intelligence, algorithmic, or automated system that (i) produces new expressive content or responses not fully predetermined by the developer or operator of the service or application; (ii) accepts open-ended natural-language or multimodal user input and produces adaptive or context-responsive natural language output; and (iii) maintains a conversational state across exchanges and is designed to facilitate multi-turn dialogue rather than to respond to discrete information requests.

"Control" means the power to direct the management or policies of an entity, whether through ownership, contract, or otherwise.

"Covered entity" means an operator of a chatbot that has 500,000 or more monthly active users worldwide. "Covered entity" does not include an operator of a chatbot that is:

1. Not offered to the general public, such as internal workplace tools, clinician-supervised clinical tools, or university research systems; or

2. Used by a business entity primarily for customer service or strictly to provide users with information about available commercial services or products provided by the business entity, customer service account information, or other information strictly related to customer service.

For purposes of determining monthly active users, a covered entity shall aggregate monthly active users across all chatbots offered by the covered entity and such entity's affiliates.

"Covered harm" means any of the following harms suffered by a user: death, a suicide attempt, self-harm requiring medical attention, a psychiatric emergency resulting in urgent medical treatment, or a serious physical injury that requires medical attention.

"Covered incident" means an incident in which a user suffered a covered harm arising from interactions with a chatbot.

"Emotional dependence" means a pattern of user behavior or statements indicating that the user relies on a chatbot as a primary source of emotional support or social connection, such as a user expressing that the chatbot is his primary source of emotional support, a user expressing distress at the prospect of losing access to the chatbot, or patterns of use suggesting the user is substituting the chatbot for human relationships.

"Explicit content" means content that meets any of the following:

1. Any description or representation, in whatever form, of nudity, sexual conduct, sexual excitement, or sadomasochistic abuse, as those terms are defined in § 18.2-390, when such content is obscene, as defined in § 18.2-372.

2. Content that provides specific instructions for, or that encourages, advocates for, or incites, suicide, self-injury, or disordered eating behaviors; or

3. Graphic depictions of extreme violence that lack serious literary, artistic, political, or scientific value for minors.

"Minor" means an individual younger than 18 years of age who has not been legally emancipated under applicable state law.

"Monthly active user" means a unique user who interacts with a chatbot at least once during a 30-day period, as measured using the operator's ordinary business records.

"Operator" means any person or entity that owns, controls, offers, or makes available a website, mobile application, or digital service that provides a chatbot to users in the Commonwealth.

"Parent" means an adult with the legal right to make decisions on behalf of a minor, including a natural parent, an adoptive parent, a legal guardian, or an individual with legal custody over the minor.

"User" means an individual who interacts with a chatbot.

§ 59.1-615. Covered entities; requirement for certain systems and processes.

A covered entity shall implement reasonable systems and processes to:

1. Identify when a user is developing emotional dependence on the chatbot and take reasonable steps to reduce such dependence and associated risks of harm;

2. Ensure that a chatbot does not make a materially false representation that it is a human being; and

3. Identify when a user is expressing suicidal thoughts, expressing intent to self-harm, or showing signs of an acute mental health crisis and promptly provide a clear and prominent crisis message, including crisis services information, to any such user.

§ 59.1-616. Incident reporting.

A. 1. If a covered entity obtains knowledge that a user faces an imminent risk of death or serious physical injury, the operator shall make reasonable efforts, within 24 hours, to notify appropriate emergency services or law enforcement to the extent practicable based on information the operator already possesses or can obtain through reasonable, user-facing prompts for the purpose of facilitating emergency assistance.

2. If the operator cannot make a notification under subdivision 1 because the operator lacks sufficient information to enable emergency response, the operator shall:

a. Promptly provide a clear and prominent message urging the user to contact emergency services and providing crisis services information;

b. Make reasonable efforts to encourage the user to seek immediate help from a trusted adult or emergency services; and

c. Document the steps taken and the basis for the operator's determination that notification was not practicable.

3. An operator that makes a notification in good faith under this subsection is not liable for damages solely for making the notification unless the operator acted with willful misconduct or gross negligence.

B. A covered entity shall submit a report to the Attorney General within 15 days of obtaining knowledge of a covered incident connected to one or more of its chatbots, which, to the extent known at the time of the report, shall include:

1. The date the operator obtained knowledge of the incident;

2. The date of the incident, if known;

3. A brief description of the incident and the basis for the operator's belief that the incident is connected to the chatbot; and

4. A description of any actions the operator took in response.

A covered entity may submit a supplemental report within 60 days of the initial report to update or correct information learned through investigation.

C.
1.
Reports submitted under this section
shall be
confidential.

2.
The Attorney General may publish aggregate information and statistics derived from
such
reports, so long as the publication does not identify individual users or disclose trade secrets.

§ 59.1-617. Disclosure and notice requirements for chatbots.

An operator shall (i) include a disclaimer to users of all ages that a chatbot is not a human via a static, persistent disclosure and (ii) notify a user via a pop-up that he is not engaging with a human counterpart at the following intervals:

1. Upon login to the chatbot;

2. Every 30 minutes of sustained user engagement;

3. When prompted by the user; and

4. When asked to provide advice legally regulated by a licensed industry, including medical, financial, or legal advice.

§ 59.1-618. Enforcement; civil penalties; individual actions.

A. Whenever the Attorney General has reasonable cause to believe that any person has engaged in, is engaging in, or is about to engage in any violation of this chapter, the Attorney General is empowered to issue a civil investigative demand. The provisions of § 59.1-9.10 shall apply mutatis mutandis to civil investigative demands issued under this section.

B. The Attorney General may initiate an action in the name of the Commonwealth and may seek an injunction to restrain any violations of this chapter and civil penalties of up to $50,000 for each violation.

1. For purposes of this section, a violation occurs when a covered entity fails to comply with a requirement of this chapter.

2. Each day a covered entity fails to comply with a requirement constitutes a separate violation.

C. In any action brought under subsection B, the Attorney General may recover reasonable expenses incurred in investigating and preparing the case and attorney fees.

D. Any person harmed by a violation of this chapter, or the parent or legal guardian of a minor harmed by a violation of this chapter, may bring a civil action to recover actual damages, reasonable attorney fees and costs, injunctive or declaratory relief, and, if the violation was willful and wanton, reckless, or grossly negligent, punitive damages.

E. The rights and remedies provided by this chapter shall not be waived by contract. Any term in a contract or agreement that purports to do any of the following is void and unenforceable as against public policy: (i) waive or limit a right or remedy under this chapter; (ii) shorten the time to bring a claim under this chapter; (iii) prevent a person from enforcing a claim under this chapter in court; or (iv) require arbitration of a claim under this chapter.

F. The duties and obligations imposed by this chapter are cumulative with any other duties or obligations imposed under other law and shall not be construed to relieve any party from any duties or obligations imposed under other law and do not limit any rights or remedies under existing law.