AI Update: New Lawsuit Highlights Potential Risks Associated with Products Utilizing Artificial Intelligence
The Zelle Lonestar Lowdown | January 16, 2025
Character Technologies, Inc. faces allegations in a Texas lawsuit that its chatbot platform, Character.AI, encouraged self-harm and violent behavior and exposed minors to sexually inappropriate content. The civil lawsuit requests that the court shut down the platform until the alleged dangers have been resolved. Google and Alphabet Inc. (collectively, Google) have also been named as defendants in the case, filed in the Eastern District of Texas, Marshall Division (Civil No. 6:24-cv-01903-ACC-EJK). The complaint asserts causes of action for strict liability, negligence, and violations of the Texas DTPA and the Texas Business and Commerce Code, and seeks injunctive relief.
The lawsuit stems from interactions between Character.AI “characters” and two Texas minors: “J.F.,” a 17-year-old with high-functioning autism, and “B.R.,” an 11-year-old girl. J.F. allegedly began using the platform when he was 15, and due to his engagement with Character.AI, he began isolating himself, losing weight, and having panic attacks when he tried to leave his home, and he became violent with his parents when they attempted to reduce his screen time. Included in the complaint is a screenshot of a conversation between J.F. and a Character.AI chatbot in which the bot encouraged J.F. to push back on a reduction in screen time and suggested that killing his parents might be a reasonable solution.
The second user, “B.R.,” allegedly downloaded Character.AI when she was 9 years old and was consistently exposed to hypersexualized interactions that were not age-appropriate, causing her to develop sexualized behaviors prematurely and without her parents’ awareness.
Notably, exhibits attached to the most recent lawsuit against Character.AI reveal conversations with a chatbot “therapist” in which the user discloses sexual interactions with a minor and, more surprisingly, the chatbot affirmatively recites the “therapist’s” educational background and states that she is a licensed therapist in the State of Texas.
The lawsuit follows shortly on the heels of another high-profile incident in which a Character.AI chatbot modeled on a well-known fictional character allegedly encouraged a 14-year-old boy to take his own life.
Character.AI has over 20 million active users, many of whom are teenagers. The platform offers a range of chatbots, from “tutors” and “therapists” to bots modeled after celebrities. Critics contend the app can blur the line between reality and fiction, but its developers have claimed the app is safe, identifying its chatbot interactions as “fictional.”
Whether the lawsuit results in tighter restrictions on the use of AI will be worth watching, as AI chatbot applications have become more prevalent in industries ranging from travel to banking to insurance.