Look what they made X do.
Earlier in the week, graphic AI-generated images of Taylor Swift went viral on the social media platform, and now X users are being met with an error message when they search her name.
After the issue came to light, X said in a statement, “This is a temporary action and done with an abundance of caution as we prioritize safety on this issue.”
Renderings featuring the 34-year-old billionaire flooded X this past week, showing the “Cruel Summer” songstress in various sexual scenarios at boyfriend Travis Kelce’s Kansas City Chiefs game.
As soon as the images started going viral, the singer’s diehard fans came to her defense, begging people not to share them.
Swift is reportedly “furious,” too, and she’s considering taking legal action, a source told the Daily Mail Thursday.
“Whether or not legal action will be taken is being decided, but there is one thing that is clear: These fake, AI-generated images are abusive, offensive, exploitative and done without Taylor’s consent and/or knowledge,” the insider said.
“The Twitter account that posted them does not exist anymore. It is shocking that the social media platform even let them be up to begin with,” the source added.
“Legislation needs to be passed to prevent this, and laws must be enacted.”
Swift has not yet publicly addressed the scandal, but her fans have been flooding X with positive messages about her in an attempt to push back against the images, known as “deepfakes.”
“people sharing the ai pics are sick and disgusting. protect taylor swift at all costs,” one fan tweeted about the images that show the singer in provocative poses.
“using ai generated pornography of someone is awful and inexcusable. you guys need to be put in jail,” another fan wrote of the person or persons behind the offensive snaps.
“Protect Taylor…