Google's search trends don't lie: Taylor Swift is one of the personalities who generates the most interest among internet users. However, a few hours ago a user created sexually explicit content of the singer with the help of artificial intelligence and uploaded it to social networks. Because of this, X (formerly Twitter) moved quickly and blocked searches to prevent the spread of those fake images and videos.
The week got off to a rough start for the pop star. Searching for "Taylor Swift" or "Taylor AI" inside X now immediately returns an error message, a measure that arose from the proliferation of AI-generated pornography online. This means that even legitimate content about Swift has become more difficult to access on the platform.
In the past few hours, her devoted Swiftie fanbase quickly rallied, launching an online counterattack with the hashtag #ProtectTaylorSwift to flood feeds with positive images of the artist. Some said they were reporting accounts sharing deepfakes (videos, images or audio manipulated by artificial intelligence to appear real and authentic).
The deepfake-detection group Reality Defender said it found an avalanche of non-consensual pornographic material depicting the American singer, particularly on X. Some images also appeared on Facebook, owned by Meta, and other social media platforms. "Unfortunately, they spread to millions and millions of users by the time some of them were removed," said Mason Allen, head of development at Reality Defender.
Researchers found more than twenty unique AI-generated images. The most widely shared were football-related, showing a painted or bloodied Swift and, in some cases, depicting violent harm to her deepfake likeness.
A worrying and growing problem
Elon Musk bought Twitter for $44 billion, an exorbitant figure that allows him to run the platform as he pleases. Among other things, he reduced the resources devoted to content moderation and softened its policies, reflecting the businessman's ideals of freedom of expression.
There is an active market of AI tools that let anyone create videos or images of a celebrity or politician in a few clicks. Although deepfake technology has been available for years, recent advances in AI have made it easier to produce more realistic images.
In that context, researchers agree that the number of explicit deepfakes has increased in recent years because the technology used to create such images has become more accessible and easier to use. In 2019, a report published by the artificial intelligence firm Deeptrace Laboratories showed that these images were overwhelmingly used against women, with most of the victims being Hollywood actresses and K-pop singers.