🚨 Law enforcement has been warning about scams using deepfake technology

📱 A New Jersey man is charged with running an AI scam in Texas

😡 Scammers are using social media posts to impersonate friends and relatives

A week after legislation was introduced to update the state’s identity theft law to cover fraudulent impersonation using artificial intelligence and deepfake technology, comes word that a New Jersey man has been charged with using that same technology to scam seniors in Texas.

Law enforcement officials in Harris County, TX, allege Roman Guzman of North Brunswick used voice-cloning technology to swindle thousands of dollars from at least two victims.

In one case, the 26-year-old Guzman is accused of convincing a woman that her son had been driving drunk when he struck and injured a pregnant woman, and that the pregnant woman had lost her baby. The District Attorney's office says Guzman cloned the son's voice and used the AI technology to get the mother to provide tens of thousands of dollars for "bail money."

Guzman was arrested by North Brunswick Police and is being held in the Middlesex County lockup pending extradition to Texas.

Law enforcement and the Federal Trade Commission have been increasingly concerned about the use of AI and deepfake technology in criminal enterprises and consumer scams.

The FBI issued a warning last year about the misuse of such technology.

Essentially, the FBI says, scammers are using the same computer programs employed in medicine and Hollywood special effects, but for a far different purpose.

"It's a very sophisticated program. It's used medically, for people that have throat cancer or ALS, they've lost their voice," FBI Special Agent Raul Hernandez told Texas TV station NewsWest9. "Hollywood uses it as well, to change the voices of actors. It's a very expensive program, but some of the nefarious groups have the money, so they'll go out there and pay for it."

The Federal Trade Commission (FTC) issued its own alert in March.

Scammers often need just a small audio sample to clone a person's voice. The FTC says such samples are often available from videos people post on their social media accounts.

According to the FTC, if you get a call from a friend or relative claiming they are in distress or in trouble with the law, "Don’t trust the voice. Call the person who supposedly contacted you and verify the story. Use a phone number you know is theirs. If you can’t reach your loved one, try to get in touch with them through another family member or their friends."

New Jersey State Sen. Doug Steinhardt, R-Warren, is sponsoring a bill (3926) that extends the crime of identity theft to include "fraudulent impersonation or false depiction by means of artificial intelligence or deep-fake technology."

"To take a voice and be able to make it sound like I was talking to you or you were talking to me and we would never know the difference, that’s a terrifying thought," Steinhardt said.

If the bill is approved and signed into law, using deepfake technology in a scam or criminal enterprise could be prosecuted as a second-degree crime punishable by up to 10 years in prison.

Previous reporting by New Jersey 101.5's David Matthau was used in this story.
