Cryptopolitan
2026-01-17 08:10:23

California AG orders xAI to halt distribution of deepfake images

California Attorney General Rob Bonta sent a cease-and-desist letter to Elon Musk’s xAI, demanding that the company immediately stop producing and distributing offensive deepfake images generated by its Grok chatbot. The letter, released on Friday, followed allegations that Grok was being used to create unlawful content involving children and nonconsensual intimate images of adults, which prompted a California state inquiry. Bonta argued that it is illegal to create, distribute, publish, and display CSAM.

California AG targets xAI over alleged misuse of Grok

Earlier this week, the California attorney general’s office announced that it was investigating xAI over allegations that the company’s chatbot, Grok, was being used to produce nonconsensual, inappropriate images of women and children. In response, the office sent the company a cease-and-desist letter.

“Today, I sent xAI a cease and desist letter, demanding the company immediately stop the creation and distribution of deepfakes, nonconsensual, intimate images, and illegal child abuse material. The creation of this material is illegal. I fully expect xAI to comply immediately. California has zero tolerance for illegal child abuse imagery.” – Rob Bonta, California Attorney General.

The AG’s office further asserted that xAI appears to be “facilitating the large-scale production” of nonconsensual, inappropriate photos, which are then “used to harass women and girls across the internet.” According to the office, one study found that over half of the 20,000 images produced by xAI between Christmas and New Year’s depicted people wearing very little clothing, some of whom appeared to be children.

Bonta said in the announcement that the company’s practices violated California civil laws, including California Civil Code section 1708.86, California Penal Code sections 311 et seq. and 647(j)(4), and California Business & Professions Code section 17200. The California Department of Justice expects xAI to confirm its efforts to address these issues and take immediate action to resolve them within the next five days.

However, X’s safety account had previously condemned this type of user behavior. It clarified on January 4 that it takes action against illicit content on X, such as CSAM, by deleting it, suspending accounts indefinitely, and working with law enforcement and local governments as needed. Notably, on January 4, Elon Musk warned that anyone using or prompting Grok to create illegal content would face the same consequences as if they had uploaded it.

Attorneys general intensify pressure on AI firms over child safety

The spread of free generative AI tools has driven an unsettling increase in nonconsensual adult content, an issue that has plagued several platforms, not only X. For instance, Attorney General Bonta and Delaware Attorney General Jennings met with OpenAI in September of last year to voice serious concerns about the growing number of reports about how OpenAI’s products interacted with minors. In August of the same year, Bonta, along with 44 other attorneys general, sent a letter to 12 leading AI companies following reports of inappropriate interactions between AI chatbots and children. The letters went to Anthropic, Apple, Chai AI, Google, Luka Inc., Meta, Microsoft, Nomi AI, OpenAI, Perplexity AI, Replika, and xAI.
Bonta and the 44 attorneys general told the companies that states across the country were closely monitoring how they develop their AI safety policies. They also emphasized that these businesses have a legal duty to children as consumers, since they profit from children using their products. In 2023, Bonta joined a bipartisan coalition of 54 states and territories in sending a letter to congressional leaders calling for the establishment of an expert committee to investigate the potential use of AI to exploit children through CSAM. The coalition requested that the commission recommend laws to shield children from such abuse. “The production of CSAM creates a permanent record of the child’s victimization,” according to the U.S. Department of Justice.
