Reddit is full of stolen OnlyFans content, and owners of that content say the platform often ignores their takedown requests.
There are dozens of subreddits devoted to "leaked" or stolen content. One of the biggest communities devoted to sharing stolen OnlyFans content, r/LEAKEDonlyfans, described itself as a place to "give people pictures from only fans when they can't afford it," and had nearly 100,000 subscribers.
"This is our income, a lot of us have other businesses but this is a way that we make a living too," adult performer and OnlyFans user Alexa Pearl told me. She said Reddit is her biggest culprit for this type of theft, and she's been dealing with content theft stemming from the platform for nearly four years.
"Excuse my language but it's fucking with my livelihood," Pearl said. "Again, this is how myself and hundreds of thousands of others, women and men, make their living, so I'm not quite sure why [Reddit] hasn't done much."
Reddit banned that 100,000-subscriber subreddit last week, citing excessive copyright violations. "In accordance with Reddit's User Agreement, we respond to valid DMCA takedown requests for cases of infringing or copyright materials and will action any users or communities in appropriate circumstances," a spokesperson told Motherboard. "The subreddit in question was banned for violating this policy."
Pearl said she files Digital Millennium Copyright Act (DMCA) takedown requests with Reddit every week, and that she has contacted the platform on multiple occasions. "They kind of just give you that rundown like oh, anyone is free to post whatever they like here as long as it's not like 18 or under porn or something like that, but they're definitely not helpful at all."
In these subreddits, users will post short gifs or videos and then post links promising more content in the comments. Much of what's posted to these forums is part of a linking scheme that makes you click through a maze of tasks with the promise of an album download at the end. The majority of these schemes use the same site, where the process to access the "free" content is mostly the same: download a Google Chrome extension (Motherboard has investigated the malicious browser extension market in the past), activate browser notifications, and click on a bunch of articles from a chumbox of content. The content thieves then get paid for the number of views they send to the site. It's similar to the "OnlyFans premium" scams we found all over YouTube last year.
Last year, Motherboard investigated the economy of stolen OnlyFans material and found that Reddit was one site where it was widely shared. At the time, Reddit directed Motherboard to its user agreement, which states that it expects users to respect intellectual property and that it bans repeat copyright infringers.
Several of the most frequent posters on the biggest stolen OnlyFans content subreddits are also mods for their own, smaller stolen content subs. Despite these users and subreddits openly advertising their practice of monetizing stolen OnlyFans content, from creators' points of view, Reddit is slow to enforce its policy on intellectual property. Sometimes, it takes hiring a lawyer to handle takedown requests. "I suspect my only option for purging groups these days is cease-and-desist letters—yet another way purging content costs more for victims and how their legal fees stack up," Charles DeBarber, lead privacy analyst at Phoenix Advocates & Consultants, told me. DeBarber often helps clients try to get non-consensual porn off platforms, including Reddit.
Attorney Lawrence Walters, whose firm Walters Law Group specializes in adult entertainment legal services, said that Reddit is just one of many platforms dealing with this problem. "Reddit has been an important platform for free expression since the early days of the Internet," Walters said. "Illegal and infringing content uploads are an ongoing problem for any open platform. In our experience, Reddit has been reasonably responsive to takedown requests by our clients."
Others don't have as much luck. DeBarber, whose clients include victims of Girls Do Porn's fraud, told me that Reddit has not acted on the numerous complaints and requests for removal of his clients' content. Even if one individual post is removed, the comments below it may remain, with women being doxed or harassed in the same Reddit thread.
"They literally have no mechanism for reporting a subreddit itself," DeBarber said. "Some subreddits [which include the victim's real name in the subreddit name] are dedicated to doxing and harassing individuals! Some are based on sharing massive amounts of pirated pornography. Reddit needs a 'report subreddit' tool, but lacks one."
A small percentage of the posts on leaked forums are from creators themselves, advertising their own content as "leaked" to stay within the subreddits' themes of stolen content, but then directing users to their official account. This also happens on adult sites, including Pornhub, where performers title content "leaked" or "leaked Onlyfans" because they know people are searching for it.
"Complex issues with no simple solution"
Compared to other mainstream social media sites like Facebook and Twitter, which are constantly changing their terms to push sexual content off their platforms, Reddit is very sex-friendly. Reddit doesn't just allow adult content; it also hosts dozens of subreddits devoted to sharing advice on how to make the most of OnlyFans: technical tips, safety advice, and promotion-focused subs for advertising content, some broken down into fetish themes like r/OnlyFansBlonde, r/Onlyfansfeet and r/OnlyfansAmateurs.
OnlyFans itself is not a good discoverability tool for its own users: there's a "suggested" column on the site, but no categories or search functions like those on other creator-subscription sites such as Patreon. For the most part, people use OnlyFans because they got there through a creator's link that they saw somewhere else on the internet. Creators are doing most of the work in driving traffic to the site. Reddit, in a way, can act as that discoverability tool.
But it's also one of the biggest sources of content theft for some creators, and that's reflected in Reddit's own annual transparency reports: In 2019, it received 34,989 copyright notices and removed 124,257 pieces of content. In 2018, it removed 26,234 pieces of content for copyright violations. In 2017, that number was 4,352. In 2016, it was 610.
Reddit relies on decentralized moderation: mods maintain their own subreddits on a volunteer basis, making sure users within their corner of Reddit follow the platform's rules and stepping in to remove posts or ban users when they don't, which keeps Reddit itself from intervening and shutting the whole subreddit down. For the most part, this works for the platform: people spend their free time moderating content for a company valued at $3 billion, the company gets that labor for free, and communities are left to self-regulate. Reddit does give mods the option to set up a bot called AutoModerator to detect and flag the most egregious rule violations, which takes some of the burden off.
Reddit has stepped in and given the banhammer to uniquely awful subreddits in the past: r/The_Donald and r/ChapoTrapHouse in 2020, and the QAnon community r/GreatAwakening in 2018, for instance. But these bans almost always happen after months or years of journalists asking the platform why such communities are allowed to stay up in blatant violation of the rules. Even Reddit co-founder Alexis Ohanian said publicly that Reddit needed to ban a flagrantly racist subreddit, which was only removed after we reached out to Reddit about Ohanian's comment.
User-moderated communities work well enough until the mods are in on the grift, as with leaked-content subreddits. If the whole subreddit is set up to infringe on someone's copyright by reposting their photos, or to spread nonconsensual videos that dox people in the comments, it takes someone from outside to flag it.
Pearl said the solution should be simple: Reddit needs to listen to content creators and act faster on takedown requests. "It's just whether or not they're willing to take a stand and really say, hey, we do protect performers and we do advocate for these people."
Walters sees the issue as more complex than anything a more stringent moderation process can solve. Between biased or unfairly applied algorithms and human mods who are prone to error and slow to respond, it's not an easy problem to fix, especially as social media platforms crack down on sexual content all over the internet. More moderation likely isn't the answer.
"Increased moderation frequently results in over-censorship of lawful materials," Walters told me. "The platforms that adopted stricter moderation policies have squelched voices of sex workers and censored constitutionally protected sexual expression... In the current legal climate, platforms like Reddit are likely struggling to strike a balance of competing interests involving free expression, privacy, intellectual property rights, and elimination of unlawful content. These are complex issues with no simple solution."
The misogyny behind why people feel entitled to steal and devalue sex workers' content online, and to go to absurd lengths to get it for free instead of buying it directly from creators themselves, is a much bigger problem, one that tech companies can't solve.