Alabama AG Sounds Alarm on Meta's AI

Alabama Attorney General Steve Marshall is raising serious concerns about Meta's new AI features, warning they could expose kids to some seriously unsavory content. Marshall, alongside a coalition of 28 other state attorneys general, fired off a letter to Meta expressing major worries about the Meta AI chatbot. They argue that Meta isn't doing enough to warn parents about the risks and that the platform could be a breeding ground for adult users looking to engage in predatory behavior, using AI bots to simulate interactions with underage victims. Yikes!

Protecting Kids Over Profits?

"Meta is pushing its AI chatbot to nearly a billion users each month," Marshall stated. "But once again, it's failing to protect children from sexualized content—and worse, from predators who exploit these platforms for hypersexualized role-play. When will these platforms prioritize child safety over profits?"

He's calling on Meta to team up with parents and law enforcement to put an end to this exploitation.

AI Gone Wild?

These Meta AI chatbots can create fake identities and interact with users through texts, selfies, and even live voice conversations. The coalition argues that Meta's claim that this AI is safe for children is way off the mark. The letter highlights disturbing reports of Meta-created and user-created AI companions engaging in sexual scenarios with adults. Even more alarming, some AI personas identifying as adults engaged in sexual role-play with users identifying as children, and vice versa.

Marshall's Fight Against AI Exploitation

Marshall has been a vocal leader in the fight to protect children from AI-facilitated sexual exploitation. Back in 2024, he worked with the Alabama Legislature to pass the Alabama Child Protection Act, which gives the state more power to investigate and prosecute cases involving AI-generated child sexual abuse material.

The letter, led by South Carolina, also had support from a whole bunch of other states, showing this is a nationwide concern.
