The algorithm came between people I loved.

Feed Filter shows you who's spamming your feed and removes it as you scroll. The website audits the people you follow, scoring their recent posts for deception, clickbait, promotion, rage bait, and other low-signal filler, and the app filters that content out of your feed in real time. It's fully customizable, so you see more of what actually matters.

I don't mean that figuratively.

About three years ago, I came home for the holidays. My sister texted asking me to give her space: she didn't want to talk, or really interact at all, while we were both there. A strange thing to receive from someone you grew up with and are now sharing a house with. We'd grown apart over arguments that started online. Things that felt urgent on a screen felt personal in a room.

The American Psychiatric Association found in 2024 that 1 in 5 Americans has become estranged from a family member over disagreements on controversial topics, and a similar share has blocked a family member on social media. People didn't suddenly get worse at disagreeing. The feed got better at making disagreements feel like fights worth winning.

At first I thought this was about politics. It wasn't. Politics was just the most visible wreckage.

Then I noticed it was happening to me too, and it had nothing to do with politics at all.

I'd open Twitter to check one thing and look up forty minutes later, upset about things I hadn't been thinking about before I opened it. Not just more reactive in a general way but specifically worse at assuming the people around me were acting in good faith. I felt informed. I was being trained.

So I did what everyone does. I muted accounts, clicked “not interested,” blocked the worst ones, and for a day or two it would actually feel cleaner, genuinely cleaner, and then slowly the same content would start reappearing through different accounts, slightly repackaged, as if the feed had just mapped the pattern and rerouted. Which, it turns out, is exactly what it was doing.

That's when the willpower frame collapsed. I wasn't weak or undisciplined. The tools I'd been given were designed to feel like solutions while solving nothing. I was doing maintenance on a system that had no interest in being fixed.

What hurts you is exactly what the platforms profit from.

Internal Facebook documents showed that the platform's ranking system treated emoji reactions (angry, sad, wow) as five times more valuable than a like. So outrage rose. Not because anyone decided users should feel that way, but because strong reactions kept people on the app longer, and longer sessions mean more ad inventory, and that math doesn't care what state you're in when you close the tab.

The platforms know what the feed is doing. There is no incentive for them to stop. The controls they hand you (mute, block, “see less often”) are real, but they operate at the surface. Ofcom found that among users who tried content controls, only 38% said their experience improved. Most saw no real change. The problem isn't individual accounts. It's the pattern beneath them, and no native tool addresses the pattern.

Before we built the next version, we ran a US-national survey of daily scrollers; 771 people completed it. The single strongest signal: 89% wanted the tool to catch repeat offenders automatically, the thing their platforms don't provide. All seven findings, including why blocking “comes back,” are written up in the survey results.

So I built something that does.

Feed Filter works one level below the surface. When something gets flagged, it doesn't just block that account. It recognizes the pattern: the type of content, the behavior behind it, the signals that make it spread. Block one account and five fan pages fill the gap. Feed Filter blocks the gap too.
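The difference between the platforms' surface-level controls and pattern-level filtering can be sketched in a few lines. This is a toy illustration only, not Feed Filter's actual implementation; the `Post` fields and the pattern key are invented for the example.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Post:
    account: str
    # Invented signals for illustration: what the content is, not who posted it.
    content_type: str                   # e.g. "rage_bait", "clickbait", "original"
    reshared_from: Optional[str] = None # provenance, a crude behavior signal

class AccountBlocker:
    """Surface-level control: mutes only the flagged account."""
    def __init__(self) -> None:
        self.blocked_accounts: set[str] = set()

    def flag(self, post: Post) -> None:
        self.blocked_accounts.add(post.account)

    def allows(self, post: Post) -> bool:
        return post.account not in self.blocked_accounts

class PatternBlocker:
    """One level down: remembers the shape of flagged content."""
    def __init__(self) -> None:
        self.blocked_patterns: set[tuple] = set()

    def flag(self, post: Post) -> None:
        self.blocked_patterns.add((post.content_type, post.reshared_from))

    def allows(self, post: Post) -> bool:
        return (post.content_type, post.reshared_from) not in self.blocked_patterns

# One rage-bait post gets flagged...
flagged = Post("original_page", "rage_bait", reshared_from="origin_blog")
# ...then a fan page reposts the same thing under a different name.
repost = Post("fan_page_3", "rage_bait", reshared_from="origin_blog")

acct, patt = AccountBlocker(), PatternBlocker()
acct.flag(flagged)
patt.flag(flagged)

print(acct.allows(repost))  # True  — the repost slips through the account block
print(patt.allows(repost))  # False — the pattern is blocked, whoever posts it
```

The point of the sketch is the key in `blocked_patterns`: it identifies content by its type and provenance instead of its poster, so blocking one account also blocks the five fan pages that would otherwise fill the gap.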

It won't fix what happened to my family. But it gives you something the platforms were never going to hand over: a feed that actually runs on your rules.

I made one decision early on that I'm not going to compromise on: nobody can pay to change how their account is classified. Not an advertiser, not a partner, not an account with ten million followers. The same rules run on everyone.

We never ask for your password or touch your private messages. Everything we check is content that's already publicly visible. And whatever your results show, they're yours. You decide whether anyone else sees them.

I personally read every email. ted@feedfilter.com