Later that day, Madi Hine approached her mother crying hysterically about the harassing texts, her mother said. Her daughter admitted that she was embarrassed and told her that she had been receiving anonymous text messages depicting her naked or urging her to die by suicide for a month before the gym’s owners received anything.
“I was so upset,” Jennifer Hine said. “What kept her sane was getting back to that gym.”
Days after Ms. Hine’s daughter returned to the gym, Ms. Hine said, she began to receive anonymous text messages on her cellphone about her daughter’s whereabouts and how the sender was disappointed that her daughter had returned to the gym. That prompted Ms. Hine to move her daughter to an out-of-state gym. But in August, Ms. Hine said, similar texts started to go to her daughter’s friends, too.
The gym’s owners expressed regret for the campaign of harassment.
“Victory Vipers has always promoted a family environment and we are sorry for all individuals involved,” the gym’s owners, Mark McTague and Kelly Cramer, said in a statement, adding that the incident occurred outside the gym and that all the athletes involved no longer attended there. “We have very well-established policies, and a very strict anti-bullying policy in our program.”
Police officials said they executed multiple search warrants throughout the year to determine the source of the text messages. Investigators requested that providers reveal the IP addresses associated with the various phone numbers, which led back to Ms. Spone’s residence.
On Dec. 18, the police said, they entered Ms. Spone’s home with a search warrant and seized several devices, including multiple cellphones. Under a second search warrant, approved on Dec. 28, the police said, they analyzed the devices and found that six messages on one of the cellphones lined up with the date the victims had received the texts.
Henry Ajder, who researches deepfakes, said crimes like the one Ms. Spone has been accused of committing are something he has long anticipated. Generating deepfakes has become more accessible through apps and face-swapping and lip-synchronization tools. People can even hire others through online forums to generate more realistic deepfakes.
While many of the available apps, like one offered through the genealogy website MyHeritage, don’t produce especially realistic images, Mr. Ajder anticipates that technology to create more convincing depictions could become widely available within the next five years.