“Deepfake” porn now has another enemy: Twitter. The company told Motherboard that it’s investigating and banning accounts that either originally posted AI-edited videos or are dedicated to posting these types of clips.
Twitter says these face swaps violate the company’s “intimate media” policy, which bans any sexually explicit photos or videos produced or shared without someone’s consent. In effect, Twitter is treating deepfakes the same way it treats revenge porn.
Twitter joins companies like Discord, Gfycat and Pornhub, all of which have said they will not allow deepfakes or other nonconsensual porn on their platforms. A hard stance is no guarantee that these posts can be eliminated entirely, but at least these companies have made their positions clear and are actively working to remove this type of content.
Twitter is in an unusual position here because, unlike many of its peers, it allows sexually explicit material on its platform as long as it’s flagged properly. Facebook, by contrast, doesn’t allow it at all.
Even vigorous enforcement may not be enough. Reddit’s deepfake subreddits, where the AI-generated porn really took off, are still running, have thousands of subscribers, and show no signs of slowing down. Eliminating deepfake porn will be much harder for Twitter as long as the tools needed to create it remain widely available.