
Soldier in the U.S. Army accused of using AI to produce pictures of child sex abuse

The arrest of Seth Herrera shows that the government is increasingly intent on punishing the use of artificial intelligence in the production of child sexual abuse material.


A U.S. Army soldier has been charged by federal prosecutors with using artificial intelligence to create sexually explicit images of minors he knew, a case that highlights the government's heightened focus on criminalizing the use of AI to produce child sexual abuse imagery.


According to a Justice Department statement released on Monday, Seth Herrera, 34, an Army soldier stationed in Anchorage, possessed thousands of images depicting the violent sexual abuse of children and used artificial intelligence (AI) tools to create realistic child sexual abuse material. Following his arrest last week, he made his first court appearance on Tuesday.

According to court filings, Herrera used AI software on photos of children he knew, either digitally stripping the subjects or transforming the images into pornographic depictions of the children engaged in oral sex or being penetrated. He received and saved images of child sexual abuse through popular chat apps such as Telegram.


Herrera's arrest comes amid the proliferation of AI-generated child sexual abuse material (CSAM) on the internet, driven by programs that produce artificial images. According to child safety experts who spoke with The Washington Post, the tools are increasingly promoted on pedophile forums as a means of producing more realistic sexual depictions of children.

Federal prosecutors, meanwhile, are testing their legal arguments, contending that AI-generated images should be treated the same as depictions of real-world child sexual abuse.


In the Justice Department statement, Deputy Attorney General Lisa Monaco said that "the misuse of cutting-edge generative AI is accelerating the proliferation of dangerous content," and that "criminals should pause and reconsider if they're thinking about using AI to continue their crimes."


The Defense Department referred questions to the Army, which did not immediately respond to a request for comment. Herrera's lawyer, Benjamin Muse, an associate federal public defender, declined to comment.

Herrera's arrest follows several recent federal cases involving AI and child abuse imagery. In May, a Wisconsin man was charged with using AI to create images of child sexual abuse, in what is likely the first federal case involving images generated entirely by AI.


In two other recent cases, federal officials said, men in North Carolina and Pennsylvania used artificial intelligence either to digitally remove children's clothing from authentic photos or to superimpose children's faces onto explicit sexual images, a technique known as "deepfakes."

According to court documents, a review of three Samsung Galaxy phones seized from Herrera during the execution of a Homeland Security Investigations search warrant found "tens of thousands" of videos and pictures, dating back to March 2021, depicting the violent rape of infants and young children.

In addition to Telegram, Herrera moved pornographic content through a number of messaging apps, including Potato Chat, Enigma, and Nandbox. According to court documents, he also created his own public Telegram group to store his material.

According to court filings, Herrera took pictures and videos of children he knew in private moments, such as while showering, and "morphed" them into imagery of sexual abuse. Prosecutors said he would use AI to "enhance" these pictures by zooming in, and that when those images "did not satisfy his sexual desire," he used AI to generate photos of the children engaged in "the type of sexual conduct he wanted to see."


Robert Hammer, special agent in charge of Homeland Security Investigations' Pacific Northwest Division, said that Herrera's creation of child sexual abuse imagery while serving as a soldier was a "profound violation of trust" and foreshadowed the difficulties law enforcement will face in safeguarding children.

According to Stars and Stripes, Herrera is an enlisted Army specialist who worked as a motor transport operator in the 11th Airborne Division at Joint Base Elmendorf-Richardson in Anchorage.


He is charged with one count each of receiving, transporting, and possessing child sexual abuse imagery. If convicted, Herrera faces up to 20 years in prison.
