Washington News Bureau

Washington lawmakers hold hearing on AI’s use in scams, criminal behavior

WASHINGTON — Lawmakers are warning the country about a growing number of scammers using artificial intelligence technology.

The warning comes as the use of AI rises rapidly across multiple sectors.

Channel 2 Washington Correspondent Nicole D’Antonio was at the Capitol, where efforts to crack down on fraud and increase protections related to the use of AI were the subject of a congressional hearing.

From deepfakes to voice cloning scams, advocates say AI has made it easier for scammers to target families in a more personalized way.

At the Capitol, lawmakers discussed ways to strengthen guardrails on AI, while advocates warned about how it’s become harder to differentiate what’s real, and what’s manipulated or generated.



Advocates said examples include deepfake videos showing celebrities endorsing politicians or products, as well as children and teenagers being depicted in sexually explicit ways through the technology.

“My 14-year-old daughter, along with her sophomore classmates at Westfield High School, was a confirmed victim of AI deepfake misuse,” Dorota Mani, a parent, told members of Congress. “Boys in my daughter’s grade used AI to generate sexually explicit images of her and other girls.”

Mani says she wants school districts to implement AI literacy programs that teach students how to use the technology safely and ethically, and that ensure they understand the responsibilities that come with such powerful tools.

“I strongly believe there is a critical missing component in our approach to artificial intelligence, which is education to prevent misuse,” Mani said.

Democratic and Republican lawmakers are working together on a series of bills to increase protections related to AI.

Lawmakers say they hope to get the bills across the finish line and passed into law in the coming weeks.
