How Schools Across America Are Struggling With AI Deepfakes


The rise of fake AI-generated images and videos targeting administrators and students prompts new laws and policies.

Gone are the days when the biggest concern was students drawing alien ears on their science teacher or printing images of a friend’s face attached to a four-legged body with scales and a tail.

That was 30-something years ago. Now, schools are being forced to develop emergency response plans in case sexually explicit images of students or teachers generated by artificial intelligence (AI) pop up on social media.

In two separate cases, school principals were seen or heard spewing racist, violent language against black students. Both were AI-generated deepfakes: one was produced by students, and the other was made by a disgruntled athletic director who was later arrested.

Deepfakes are defined as “non-consensually AI-generated voices, images, or videos that are created to produce sexual imagery, commit fraud, or spread misinformation,” according to a nonprofit group focused on AI regulation.

As education leaders scramble to set policy to mitigate the damage of deepfakes—and as state legislators work to criminalize such malicious acts specific to schools or children—the technology to combat AI tools that can replicate a person’s image and voice doesn’t yet exist, says Andrew Buher, founder and managing director of the Opportunity Labs nonprofit research organization.

“There is a lot of work to do, both with prevention and incident response,” he said during a virtual panel discussion held by Education Week last month on teaching digital and media literacy in the age of AI. “This is about social norming [because] the technical mitigation is quite a ways away.”

Legislation Targets Deepfakes

On Sept. 29, California Gov. Gavin Newsom signed into law a bill criminalizing AI-generated child pornography. It’s now a felony in the Golden State to possess, publish, or pass along images of individuals under the age of 18 simulating sexual conduct.

There are similar new laws in New York, Illinois, and Washington State.

At the national level, Sen. Ted Cruz (R-Texas) has proposed the Take It Down Act, which would criminalize the “intentional disclosure of nonconsensual intimate visual depictions.”

By Aaron Gifford
