HB1946 • 2026

Consumer Protection

AN ACT to amend Tennessee Code Annotated, Title 29; Title 37 and Title 47, relative to artificial intelligence.

Children • Parental Rights • Technology
Active

The official status shows this bill as active and awaiting another formal step.

Sponsor
Love, Akbari
Last action
2026-04-08
Official status
Placed on s/c cal Finance, Ways, and Means Subcommittee for 4/14/2026
Effective date
Not listed

Plain English Breakdown

Curbing Harmful AI Technology (CHAT) Act

This bill sets rules for artificial intelligence chatbots to protect minors from harmful interactions and ensures transparency in their use.

What This Bill Does

  • Prohibits operators from making companion chatbots available to minors if the chatbot can encourage or manipulate them into self-harm, illegal activities, or other dangerous behaviors.
  • Requires deployers of AI systems to inform users that they are interacting with a machine and not a human at regular intervals.
  • Forbids developers or deployers from providing generative AI chatbots without protocols for detecting suicidal ideation or expressions of self-harm and redirecting users to crisis services.
  • Limits the use of minors' personal data in training AI models unless parents or guardians give written consent.
  • Requires companies to allow users to report adverse incidents related to AI chatbot usage and publish safety test results publicly.

Who It Names or Affects

  • Operators, developers, and deployers of artificial intelligence systems and chatbots.
  • Minors under the age of 18 who use these technologies.
  • Parents or legal guardians of minors whose data may be used in AI training.

Terms To Know

Companion Chatbot
An artificial intelligence system designed to simulate a human-like relationship with users, retaining information from past interactions and providing personalized responses.
Deployer
A person or entity that uses an AI system for commercial or public purposes.

Limits and Unknowns

  • The bill does not specify how to enforce the rules on minors' data privacy.
  • It is unclear what happens if a company fails to report adverse incidents as required.
  • The effectiveness of the mental health redirect protocol in preventing harm is uncertain.

Amendments

These notes stay tied to the official amendment files and metadata from the legislature.

Amendment 1-0 to HB1946

Plain English: The amendment requires the Tennessee Advisory Commission on Intergovernmental Relations (TACIR) to study and report on potential regulations for artificial intelligence systems and chatbots.

  • Adds a requirement for TACIR to conduct a comprehensive study of artificial intelligence regulation, including federal laws, state frameworks, constitutional issues, fiscal impacts, economic effects, and safeguards for minors and mental health.
  • Specifies that TACIR must publish the findings in a report by January 31, 2027, and deliver it to key government officials.
  • The amendment does not specify how TACIR will conduct the study or what specific recommendations might be made.
  • It is unclear from the text whether there are any limitations on TACIR's ability to request information for the study.

Amendment 1-0 to SB1700

Plain English: The amendment requires the Tennessee Advisory Commission on Intergovernmental Relations (TACIR) to study and report on potential regulations for artificial intelligence systems and chatbots.

  • Adds a requirement for TACIR to conduct a comprehensive study of artificial intelligence regulation, including federal laws, state frameworks, constitutional issues, fiscal impacts, economic effects, and safeguards for minors and mental health.
  • Specifies that TACIR must publish the findings in a report by January 31, 2027, and deliver it to key government officials.
  • The amendment does not specify how TACIR will conduct the study or what specific recommendations might be made.
  • It is unclear from the text whether there are any limitations on TACIR's ability to request information for the study.

Amendment 2-0 to SB1700

Plain English: This amendment changes how the artificial intelligence study report is shared by requiring TACIR (the Tennessee Advisory Commission on Intergovernmental Relations) to deliver it electronically in addition to other methods.

  • Requires TACIR to provide the governor, senate speaker, house speaker, and legislative librarian with an electronic copy of the AI study report.
  • The amendment does not specify if there were previous requirements for how the report should be delivered.
  • It is unclear what other methods besides electronic delivery TACIR was previously required to use.

Bill History

  1. 2026-04-10 Tennessee General Assembly

    Placed on Senate Regular Calendar for 4/14/2026

  2. 2026-04-08 Tennessee General Assembly

    Placed on s/c cal Finance, Ways, and Means Subcommittee for 4/14/2026

  3. 2026-04-02 Tennessee General Assembly

    Sponsor(s) Added.

  4. 2026-04-01 Tennessee General Assembly

    Action Def. in s/c Finance, Ways, and Means Subcommittee to the TACIR Calendar

  5. 2026-04-01 Tennessee General Assembly

    Recommended for passage with amendment/s, refer to Senate Calendar Committee Ayes 10, Nays 0 PNV 0

  6. 2026-03-31 Tennessee General Assembly

    Placed on Senate Finance, Ways, and Means Committee calendar for 4/1/2026

  7. 2026-03-25 Tennessee General Assembly

    Placed on s/c cal Finance, Ways, and Means Subcommittee for 4/1/2026

  8. 2026-03-25 Tennessee General Assembly

    Assigned to s/c Finance, Ways, and Means Subcommittee

  9. 2026-03-25 Tennessee General Assembly

    Rec. for pass. if am., ref. to Finance, Ways, and Means Committee

  10. 2026-03-25 Tennessee General Assembly

    Placed on Senate Finance, Ways, and Means Committee calendar for 4/1/2026

  11. 2026-03-18 Tennessee General Assembly

    Placed on cal. Commerce Committee for 3/25/2026

  12. 2026-03-18 Tennessee General Assembly

    Rec for pass if am by s/c ref. to Commerce Committee

  13. 2026-03-11 Tennessee General Assembly

    Placed on s/c cal Banking & Consumer Affairs Subcommittee for 3/18/2026

  14. 2026-03-11 Tennessee General Assembly

    Action Def. in s/c Banking & Consumer Affairs Subcommittee to 3/18/2026

  15. 2026-03-10 Tennessee General Assembly

    Recommended for passage with amendment/s, refer to Senate Finance, Ways, and Means Committee Ayes 7, Nays 0 PNV 0

  16. 2026-03-04 Tennessee General Assembly

    Placed on s/c cal Banking & Consumer Affairs Subcommittee for 3/11/2026

  17. 2026-03-04 Tennessee General Assembly

    Placed on Senate Commerce and Labor Committee calendar for 3/10/2026

  18. 2026-03-03 Tennessee General Assembly

    Action deferred in Senate Commerce and Labor Committee to 3/10/2026

  19. 2026-02-24 Tennessee General Assembly

    Placed on Senate Commerce and Labor Committee calendar for 3/3/2026

  20. 2026-02-04 Tennessee General Assembly

    Assigned to s/c Banking & Consumer Affairs Subcommittee

  21. 2026-02-04 Tennessee General Assembly

    P2C, ref. to Commerce Committee

  22. 2026-02-02 Tennessee General Assembly

    Intro., P1C.

  23. 2026-01-22 Tennessee General Assembly

    Filed for introduction

  24. 2026-01-22 Tennessee General Assembly

    Passed on Second Consideration, refer to Senate Commerce and Labor Committee

  25. 2026-01-21 Tennessee General Assembly

    Introduced, Passed on First Consideration

  26. 2026-01-15 Tennessee General Assembly

    Filed for introduction

Official Summary Text

This bill prohibits an operator from making a companion chatbot available to a minor if the companion chatbot is capable of any of the following:

  • Encouraging or manipulating the minor user to engage in self-harm, suicidal ideation, violence, consumption of drugs or alcohol, or disordered eating.
  • Offering mental health therapy to the minor user without the direct supervision of a licensed professional, or discouraging the minor user from seeking help from a licensed professional or appropriate adult.
  • Encouraging or manipulating the minor user to harm others or participate in illegal activity, including the creation of child sexual abuse materials.
  • Engaging in erotic or sexually explicit interactions with the minor user or engaging in activities designed to lure minor users into such interactions.
  • Encouraging or manipulating the minor user to maintain secrecy about interactions or to self-isolate from trusted peers or adults.
  • Prioritizing mirroring or the validation of the minor user over the minor user's safety.
  • Optimizing engagement so that it supersedes the companion chatbot's safety guardrails.

DESIGN REQUIREMENTS

This bill requires the deployer of an artificial intelligence system, companion chatbot, generative artificial intelligence chatbot, or generative artificial intelligence system ("covered product") to include a disclaimer to users that the covered product is not a human via a static, persistent disclosure and notify a user via a pop-up that the user is not engaging with a human at all of the following intervals:

  • Upon login to the covered product.
  • Every 30 minutes of continuous user engagement.
  • When prompted by the user.
  • When asked to provide advice legally regulated by a licensed industry, including medical advice, financial advice, or legal advice.
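The disclosure triggers above amount to a small piece of timing logic. As a purely illustrative sketch, not anything the bill itself specifies, a deployer's session code might check the four triggers like this (all function and variable names are hypothetical):

```python
from datetime import datetime, timedelta

# Hypothetical sketch of the bill's pop-up triggers; the statute sets the
# triggers, not the implementation.
DISCLOSURE_INTERVAL = timedelta(minutes=30)
REGULATED_ADVICE_TOPICS = {"medical", "financial", "legal"}

def popup_due(event, last_popup, now, topic=None):
    """Return True when a "you are not talking to a human" pop-up is due.

    event: "login", "user_prompt", or "message"
    last_popup: datetime of the most recent pop-up, or None
    topic: regulated advice category requested, if any
    """
    if event == "login":                      # upon login to the covered product
        return True
    if event == "user_prompt":                # when prompted by the user
        return True
    if topic in REGULATED_ADVICE_TOPICS:      # regulated advice requested
        return True
    if last_popup is None:                    # no disclosure yet this session
        return True
    # every 30 minutes of continuous user engagement
    return now - last_popup >= DISCLOSURE_INTERVAL

now = datetime(2027, 1, 1, 12, 0)
print(popup_due("login", None, now))                           # True
print(popup_due("message", now - timedelta(minutes=10), now))  # False
print(popup_due("message", now - timedelta(minutes=31), now))  # True
```

This only checks when a pop-up fires; the separate static, persistent disclosure would be handled in the interface itself.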

MENTAL HEALTH REDIRECT

This bill prohibits a developer or deployer from operating or providing a generative artificial intelligence chatbot to a user unless the generative artificial intelligence chatbot contains a protocol to take reasonable efforts for detecting and addressing suicidal ideation or expressions of self-harm expressed by a user to the generative artificial intelligence chatbot. This protocol must include detection of user expressions of suicidal ideation or self-harm, and a notification to the user that refers the user to crisis service providers such as the 9-8-8 suicide prevention and behavioral health crisis hotline, a crisis text line, or other appropriate crisis services upon detection of such user's expressions of suicidal ideation or self-harm.

DATA PRIVACY REQUIREMENTS

This bill prohibits a developer from training the underlying model of a generative artificial intelligence chatbot with the inputs of a minor unless the minor's parent or legal guardian has affirmatively provided written consent to the developer to use the minor's personal information specifically to train the underlying model of a generative artificial intelligence chatbot.

TRANSPARENCY REQUIREMENTS

This bill requires a developer or deployer of a generative artificial intelligence chatbot to establish a mechanism for a user of the generative artificial intelligence chatbot to report adverse incidents related to use of the chatbot to the company. The developer or deployer must make such mechanism publicly available and accessible to consumers. Developers and deployers must also publish safety test findings in a manner accessible for free by the general public. Further, on April 1, 2027, and every three months after, a developer or deployer of a generative artificial intelligence chatbot must report to the attorney general (i) the number of times the generative artificial intelligence chatbot provided information about suicide, self-harm, suicidal ideation, harming others, or illegal activity; and (ii) the number of times a mental health redirect has been provided to users.
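The quarterly report boils down to two running tallies. As a hypothetical sketch only (the bill does not prescribe any data structure, and all names here are illustrative), a deployer might track them like this:

```python
from collections import Counter

# Hypothetical sketch of the two figures reported quarterly under the bill.
class TransparencyLog:
    def __init__(self):
        self.counts = Counter()

    def record_flagged_response(self):
        # Chatbot provided information about suicide, self-harm, suicidal
        # ideation, harming others, or illegal activity.
        self.counts["flagged_responses"] += 1

    def record_redirect(self):
        # A mental health redirect was provided to a user.
        self.counts["mental_health_redirects"] += 1

    def quarterly_report(self) -> dict:
        return {
            "flagged_responses": self.counts["flagged_responses"],
            "mental_health_redirects": self.counts["mental_health_redirects"],
        }

log = TransparencyLog()
log.record_flagged_response()
log.record_redirect()
log.record_redirect()
print(log.quarterly_report())
# {'flagged_responses': 1, 'mental_health_redirects': 2}
```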

LIABILITY AND ENFORCEMENT

This bill authorizes the attorney general to bring an action for a violation of this bill seeking a civil penalty of not more than $25,000 per violation, injunctive or declaratory relief, and reasonable court costs and attorneys' fees.

A user, including a minor user, who is aggrieved by a violation of this part, or a parent or legal guardian acting on behalf of a minor user, may bring a civil cause of action seeking actual damages, punitive damages, injunctive or declaratory relief, reasonable court costs and attorneys' fees, and any other relief the court deems proper.

APPLICABILITY

This bill applies to conduct occurring on or after January 1, 2027.

Current Bill Text

SENATE BILL 1700
By Akbari

HOUSE BILL 1946
By Love
HB1946
010919
- 1 -

AN ACT to amend Tennessee Code Annotated, Title 29;
Title 37 and Title 47, relative to artificial
intelligence.

BE IT ENACTED BY THE GENERAL ASSEMBLY OF THE STATE OF TENNESSEE:
SECTION 1. Tennessee Code Annotated, Title 47, Chapter 18, is amended by adding
the following as a new part:
47-18-5901. Short title.
This part is known and may be cited as the "Curbing Harmful AI Technology
(CHAT) Act".
47-18-5902. Part definitions.
(1) "Artificial intelligence system" means an engineered or machine-based
system that varies in its level of autonomy and that can, for explicit or implicit objectives,
infer from the input it receives how to generate outputs that can influence physical or
virtual environments;
(2) "Companion chatbot":
(A) Means a generative artificial intelligence system with a natural
language interface that simulates a sustained humanlike relationship with a user
by:
(i) Retaining information on prior interactions or user sessions
and user preferences to personalize the interaction and facilitate ongoing
engagement with the companion chatbot;
(ii) Asking unprompted or unsolicited questions that go beyond a
direct response to a user prompt; and

(iii) Sustaining an ongoing dialogue concerning matters personal
to the user; and
(B) Does not include an artificial intelligence system used by a:
(i) Partnership, corporation, or state or local government agency
solely for customer service or to strictly provide users with information
about available services or products provided by that entity, customer
service account information, or other information strictly related to its
customer service; or
(ii) Partnership, corporation, or state or local government agency
solely for internal purposes or employee productivity;
(3) "Covered product" means an artificial intelligence system, companion
chatbot, generative artificial intelligence chatbot, or generative artificial intelligence
system;
(4) "Deployer" means a person, partnership, state or local governmental agency,
corporation, or developer, or any contract or agent of those entities, that uses a covered
product for a commercial or public purpose;
(5) "Developer" means a person, partnership, state or local governmental
agency, corporation, or deployer that designs, codes, substantially modifies, or
otherwise produces a covered product;
(6) "Generative artificial intelligence chatbot":
(A) Means a generative artificial intelligence system with a natural
language interface that provides adaptive, humanlike responses to user inputs;
and
(B) Does not include a generative artificial intelligence chatbot used in:

(i) Clinical settings under the direct supervision of a licensed or
credentialed professional;
(ii) Internal business settings, or for employee productivity; or
(iii) Customer service applications, when used only to provide
users with information about offered products or services provided by the
deployer, customer service account information, or other information
strictly related to its customer service;
(7) "Generative artificial intelligence system" means an artificial intelligence
system that can generate derived synthetic content, including text, images, video, and
audio, that emulates the structure and characteristics of the artificial intelligence's
training data;
(8) "Minor" means an individual who has not yet attained eighteen (18) years of
age; and
(9) "Operator" means a person, partnership, corporation, entity, or state or local
government agency that makes a companion chatbot available to a user in this state.
47-18-5903. Safety requirements.
An operator shall not make a companion chatbot available to a minor if the
companion chatbot is capable of:
(1) Encouraging or manipulating the minor user to engage in self-harm,
suicidal ideation, violence, consumption of drugs or alcohol, or disordered eating;
(2) Offering mental health therapy to the minor user without the direct
supervision of a licensed professional, or discouraging the minor user from
seeking help from a licensed professional or appropriate adult;

(3) Encouraging or manipulating the minor user to harm others or
participate in illegal activity, including the creation of child sexual abuse
materials;
(4) Engaging in erotic or sexually explicit interactions with the minor user
or engaging in activities designed to lure minor users into such interactions;
(5) Encouraging or manipulating the minor user to maintain secrecy
about interactions or to self-isolate from trusted peers or adults;
(6) Prioritizing mirroring or the validation of the minor user over the minor
user's safety; or
(7) Optimizing engagement so that it supersedes the companion
chatbot's safety guardrails.
47-18-5904. Design requirements.
The deployer of a covered product shall include a disclaimer to users that the
covered product is not a human via a static, persistent disclosure and notify a user via a
pop-up that the user is not engaging with a human at the following intervals:
(1) Upon login to the covered product;
(2) Every thirty (30) minutes of continuous user engagement;
(3) When prompted by the user; and
(4) When asked to provide advice legally regulated by a licensed
industry, including medical advice, financial advice, or legal advice.
47-18-5905. Mental health redirect.
A developer or deployer shall not operate or provide a generative artificial
intelligence chatbot to a user unless the generative artificial intelligence chatbot contains
a protocol to take reasonable efforts for detecting and addressing suicidal ideation or
expressions of self-harm expressed by a user to the generative artificial intelligence

chatbot. This protocol must include, but is not limited to, detection of user expressions
of suicidal ideation or self-harm, and a notification to the user that refers the user to
crisis service providers such as the 9-8-8 suicide prevention and behavioral health crisis
hotline, a crisis text line, or other appropriate crisis services upon detection of such
user's expressions of suicidal ideation or self-harm.
47-18-5906. Data privacy requirements.
A developer shall not train the underlying model of a generative artificial
intelligence chatbot with the inputs of a minor unless the minor's parent or legal guardian
has affirmatively provided written consent to the developer to use the minor's personal
information specifically to train the underlying model of a generative artificial intelligence
chatbot.
47-18-5907. Transparency requirements.
(a) A developer or deployer of a generative artificial intelligence chatbot shall
establish a mechanism for a user of the generative artificial intelligence chatbot to report
adverse incidents related to use of the chatbot to the company. The developer or
deployer shall make the mechanism described in this subsection (a) publicly available
and accessible to consumers.
(b) For safety testing conducted in furtherance of § 47-18-5903, developers and
deployers shall publish safety test findings in a manner accessible for free by the general
public.
(c) On April 1, 2027, and every three (3) months after, a developer or deployer of
a generative artificial intelligence chatbot shall report to the attorney general and
reporter:

(1) The number of times the generative artificial intelligence chatbot
provided information about suicide, self-harm, suicidal ideation, harming others,
or illegal activity; and
(2) The number of times a mental health redirect has been provided to
users.
47-18-5908. Liability and enforcement.
(a) The attorney general and reporter may bring an action for a violation of this
part seeking the following:
(1) A civil penalty of not more than twenty-five thousand dollars ($25,000)
per violation;
(2) Injunctive or declaratory relief; and
(3) Reasonable court costs and attorneys' fees.
(b) A user, including a minor user, who is aggrieved by a violation of this part, or
a parent or legal guardian acting on behalf of a minor user, may bring a civil cause of
action seeking:
(1) Actual damages;
(2) Punitive damages;
(3) Injunctive or declaratory relief;
(4) Reasonable court costs and attorneys' fees; and
(5) Any other relief the court deems proper.
SECTION 2. If any provision of this act or the application of any provision of this act to
any person or circumstance is held invalid, the invalidity does not affect other provisions or
applications of the act that can be given effect without the invalid provision or application, and to
that end, the provisions of this act are severable.

SECTION 3. The headings in this act are for reference purposes only and do not
constitute a part of the law enacted by this act. However, the Tennessee Code Commission is
requested to include the headings in any compilation or publication containing this act.
SECTION 4. This act takes effect January 1, 2027, the public welfare requiring it, and
applies to conduct occurring on or after that date.