A Technical Guide to Burning Down a Troll Farm

Clara “Keffals” Sorrenti, a popular Twitch streamer, answered her door on August 5 to Ontario police pointing guns in her face. She was arrested, questioned and then released once the police realized she had been “swatted”: someone impersonating her had emailed a threat that she was going to carry out a mass shooting.
Soon, Sorrenti became a victim of a month-long, cross-continental campaign of doxxing (publicly releasing personal information about an individual) and swatting.
All of this was retaliation for her role in co-organizing the #DropKiwiFarms campaign. Kiwi Farms is an online forum that organizes trolling, harassment and stalking, mostly of transgender, disabled and neurodivergent people.
The New Stack spoke to Liz Fong-Jones — Honeycomb.io’s principal developer advocate, who has been working to bring down the hate site for more than five years — to learn how the volunteer team decided which piece of the Kiwi Farms stack to target, why it eventually came down (and then went up again), and how members of the tech industry can take better responsibility for the infrastructure they use.
Which Layer of the Stack to Target?
If you consider all of the parts of the modern tech stack that make it possible to put content online — client side, cloud and data centers, hosting and network providers, databases, and so on — it would seem like there are many layers to potentially target to take down a troll farm. So the #DropKiwiFarms campaign applied the process of elimination.
The big three cloud providers simply won’t have it. Amazon Web Services, Microsoft Azure and Google Cloud have terms of service that prohibit this kind of harassment and threat-making, and all three are known to act on abuse reports. Kiwi Farms, Fong-Jones said, deliberately picked a host that was friendly to it: a so-called “pink host,” the kind notorious for hosting spammers as well as hate groups that more mainstream hosts reject.
“Pay them money, they didn’t care what they did with the bandwidth,” Fong-Jones said. That made targeting the hosting provider with a pressure campaign futile.
“If you can’t go after the hosting provider itself, because they are aligned, you only go [for the] CDN or the network transit,” she said.
Cloudflare, Kiwi Farms’ content delivery network (CDN) and distributed denial of service (DDoS) mitigation layer, became the obvious target. According to Cloudflare’s website, it provides security services for more than 20% of the web.
But the #DropKiwiFarms campaign didn’t choose Cloudflare just because it provided the forum’s DDoS protection. Fong-Jones argued that the company was effectively subsidizing Kiwi Farms’ operating costs.
“The forum worked off of a substantial amount of image- and video-based harassment,” Fong-Jones said, including photos of victims and their families, screenshots, dehumanizing propaganda, and photos that support doxxing by revealing personally identifiable information, like a person’s location.
Cloudflare provides an edge-caching service, storing copies of content closer to end users.
“Every request does not go to the origin server. From a technical perspective they were providing free bandwidth to Kiwi Farms,” Fong-Jones said. “Kiwi Farms is serving one of these abusive images [that weighs] a megabyte. Cloudflare is caching that one-megabyte image. So if that image gets viewed 1,000 times, instead of costing [Kiwi Farms] a gigabyte of bandwidth transfer, it’s once. They would’ve had to provision a thousand times the server capacity.”
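To make that arithmetic concrete, here is a back-of-the-envelope sketch in Python. The numbers simply mirror Fong-Jones’ illustrative example, not any measured traffic:

```python
# Back-of-the-envelope math for the edge-caching subsidy Fong-Jones
# describes. All numbers are illustrative, taken from her example.
image_size_mb = 1    # one image-based harassment post, roughly 1 MB
views = 1_000        # number of times that image is viewed

# Without a CDN, every single view is served by the origin server.
origin_egress_mb = image_size_mb * views   # 1,000 MB, about 1 GB

# With edge caching, the origin serves the image roughly once;
# the CDN's edge nodes absorb the remaining 999 requests.
cached_egress_mb = image_size_mb * 1

print(f"Origin bandwidth without CDN: {origin_egress_mb} MB (~1 GB)")
print(f"Origin bandwidth with CDN:    {cached_egress_mb} MB")
print(f"Capacity the forum avoided provisioning: "
      f"{origin_egress_mb // cached_egress_mb}x")
```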
In addition, Kiwi Farms isn’t just a forum. As Ben Collins, a senior reporter for NBC, described on an episode of WNYC’s “On the Media” program, “It’s a way to archive every movement of people they deemed to be a ‘lolcow,’” the site’s harassment targets, which, as of late, tend to be transgender people.
“They were going after people who were mostly private citizens,” he told the radio show’s audience. “They archived their every movement, their every address. They lived their lives in actual terror.”
Kiwi Farms stands out among its extremist forum competitors, which tend to delete their content regularly: it maintains a searchable archive, backed in part by this edge-caching feature.
The site is built in a way that amplifies content, to the point, Collins said, that Kiwi Farms threads would rank first on Google if you searched a victim’s name. This likely contributes to the disproportionate unemployment and poverty the transgender community faces, as most employers Google a job candidate before hiring them.
“The right to free speech isn’t the right to free amplification,” Fong-Jones said. Harassment perpetrated by Kiwi Farms users led to at least three suicides.
Pressure to de-platform the forum came not just from activists and doxxing victims, but also from an unknown number of Cloudflare’s paying customers, including Dreamwidth, a blogging service. Of course, few Cloudflare customers would be vocal about why they migrated away, because going public could make their own employees harassment targets, too.
“If you are a Cloudflare enterprise customer, you are massively subsidizing the cost of the stack and therefore subsidizing hate groups,” Fong-Jones said. “You are paying for the servers, bandwidth and network that Cloudflare uses to subsidize hate groups.”
Cloudflare (Eventually) Takes Action
“Cloudflare claimed we don’t host it, we just proxy it. We absolve ourselves of responsibility. The only thing we’ll promise is we will pass on your details [of the complaint] to the end host,” which, Fong-Jones added, “they changed to not include contact details.”
Indeed, Matthew Prince, Cloudflare’s CEO and co-founder, argued on the company blog that “voluntarily terminating access to services that protect against cyberattack is not the correct approach.” (The New Stack reached out to Cloudflare repeatedly for comment for this article, but has not received a response.)
An August 31 post on the company’s blog, co-written with Alissa Starzak, Cloudflare’s vice president and global head of public policy, argued that taking down a site (Kiwi Farms is not named) simply because the company finds it reprehensible would set a dangerous precedent and be an abuse of power:
“Some argue that we should terminate these services to content we find reprehensible so that others can launch attacks to knock it offline,” the blog post said. “That is the equivalent argument in the physical world that the fire department shouldn’t respond to fires in the homes of people who do not possess sufficient moral character.”
Less than three days later, Prince announced via Cloudflare’s blog that the company had reversed course and was blocking Kiwi Farms’ content from being accessed through Cloudflare’s infrastructure.
In his blog post, Prince emphasized that his team made the move because of the sudden, unprecedented escalation of immediate threats to human life published on the forum, which he wrote was a result of the pressure campaign to de-platform the forum.
“We do not believe that terminating security services is appropriate, even to revolting content,” he continued, expressing regret for the decision to drop Kiwi Farms as a customer.
Some other vendors that Kiwi Farms used dropped it as well, including hCaptcha, an independent cybersecurity service that runs on about 15% of the internet.
“As a security service, we do not host or see content on sites using us, but do review specific complaints regarding users who may be violating our terms,” the hCaptcha team wrote as part of its notification to Kiwi Farms, terminating its free account.
As user bases scale into the millions and billions, the question becomes whether service providers can, and should, monitor every user, or whether they should rely on some sort of flagging system.
The Impact of a Takedown
Getting Cloudflare to drop the hate group, and then getting its content removed from the Internet Archive, is not a cure-all for bigotry. There’s no doubt that those intent on organizing harassment, doxxing, stalking and swatting will continue, whether on other sites or through Kiwi Farms itself, which quickly migrated to Russian servers.
On Monday, Kiwi Farms was showing a 503 “Intermission” error page and reported that it had been hacked via a cross-site scripting (XSS) attack. Founder Joshua Moon stated on the website’s error page: “The forum was hacked. You should assume the following.
- Assume your password for the Kiwi Farms has been stolen.
- Assume your email has been leaked.
- Assume any IP you’ve used on your Kiwi Farms account in the last month has been leaked.”
Furthermore, he wrote, “Cloudflare not only provided DDoS protection, they also accounted for many popular exploits like this. As I’ve worked for weeks to combat the endless flow of attacks from every conceivable angle I have spread myself very thin and hurridly [sic] replaced old systems with new ones that are not properly vetted.”
The hack would have been far more difficult to pull off if Cloudflare hadn’t dropped Kiwi Farms. There’s also a solid chance that the breach will enable both civilians and law enforcement to more easily identify Kiwi Farms members who participated in criminal harassment.
Cloudflare, in the aforementioned blog post, also noted that it is working with law enforcement for ongoing investigations into individuals who posted on the forum. However, legislation and enforcement lag far behind in regulating and pursuing this online, distributed, anonymous form of harassment.
Kiwi Farms will be back up soon, but perhaps with less power. When it does sign up with another DDoS mitigation provider, that provider will undoubtedly become the target of the next de-platforming pressure campaign.
The Impact of Better Vendor Vetting
In the short term, containing the offline harm caused by online hate groups will likely come down to buyer power. Organizations can figure out who is backing these hate sites by running them through a Whois lookup and, Fong-Jones suggested, by examining their DNS servers, IP addresses and HTTP headers, which can even reveal content acceleration through different caching layers.
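As an illustration of what that kind of inspection can look like, here is a minimal Python sketch. It assumes the third-party requests library is installed and that the standard dig and whois command-line tools are available; example.com is a placeholder, not any real site:

```python
# Minimal sketch: identify the infrastructure behind a domain.
# Assumes the `requests` package plus the `dig` and `whois` CLIs.
import socket
import subprocess

import requests

domain = "example.com"  # placeholder, not a real target

# Resolve the IP address the domain currently points at.
ip = socket.gethostbyname(domain)
print(f"IP address: {ip}")

# The authoritative name servers often reveal the DNS provider.
ns = subprocess.run(["dig", "+short", "NS", domain],
                    capture_output=True, text=True).stdout.strip()
print(f"DNS servers:\n{ns}")

# CDNs tend to identify themselves in HTTP response headers,
# e.g. `server: cloudflare`, `cf-ray` or `cf-cache-status`.
headers = requests.head(f"https://{domain}", timeout=10).headers
for name in ("server", "cf-ray", "cf-cache-status", "x-cache", "via"):
    if name in headers:
        print(f"{name}: {headers[name]}")

# WHOIS on the IP names the network operator announcing it.
whois_out = subprocess.run(["whois", ip],
                           capture_output=True, text=True).stdout
print(whois_out[:400])  # the first lines identify the owning network
```

Any provider that surfaces in these signals is part of the supply chain a vetting policy should cover.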
Organizations already have extensive policies in place guiding supplier selection and vetting. “We need to rethink the role of infrastructure providers,” she said. “If you do business with neo-Nazis and go out of your way, people have the right to not do business with you.”
It’s reasonable for companies to add moral criteria to that vetting process for both software-as-a-service subscriptions and integration partnerships, asking:
- Who else is this vendor standing up?
- Which of our clients and/or employees might be harmed by us financially supporting this vendor?
“You should, as a company that provides services to your customers, have a policy around what conduct is acceptable,” Fong-Jones said. “You should absolutely evaluate the risk that supplier poses to your reputation and your employees,” including those at the greatest risk of being doxxed.
“We have to stop treating technology as neutral, [claiming] we provide equal service to everyone,” she said. “That doesn’t work anymore when someone is using your service to harm people.”