E-viction: How Digital Censorship Is Pushing Sex Workers Off the Internet

When your art gets deleted for being too real, and your accounts vanish overnight without warning, you’re not being hacked; you’re being censored. That’s the quiet reality behind E-viction, a digital art exhibition that turns the banishment of sex workers from online platforms into a powerful visual statement. This isn’t just about nudity or sexuality. It’s about survival, visibility, and the invisible walls built by algorithms that decide who gets to exist online, and who doesn’t.

Many of the artists featured in E-viction once used platforms like Instagram, OnlyFans, and TikTok to earn a living. Some posted photos of themselves in Parisian apartments, offering companionship services under quiet labels like ‘escort in Paris.’ Others shared intimate moments from their daily lives: coffee in Montmartre, late-night walks along the Seine, quiet Sundays in the 16th arrondissement. But when their content was flagged, suspended, or erased, they didn’t just lose income; they lost their voice. Platforms don’t explain why. They don’t offer appeals. They just delete.

What Is E-viction?

E-viction is not a gallery with white walls and spotlights. It’s a website built by former sex workers, digital rights activists, and independent curators. Every piece is a screenshot, a video loop, or a text scroll pulled from accounts that no longer exist. One artwork shows a 24-hour countdown timer that never reaches zero; the account was suspended before the timer finished. Another is a collage of DMs from users who said, ‘I didn’t know you were a sex worker until they took your page down.’

The name comes from the word ‘eviction,’ but flipped. Instead of being kicked out of a home, sex workers are being kicked out of the digital public square. And unlike real eviction, there’s no court hearing, no notice, no right to appeal. Just silence.

Why Does This Happen?

Big tech companies claim they ban ‘sexual content’ to stay compliant with advertisers and laws. But the rules are inconsistent. A yoga instructor posting in a sports bra is fine. A sex worker in the same pose, with the same lighting, is banned. A photo of a couple kissing on a beach? Allowed. A photo of the same couple in a bedroom, clothes on? Banned.

Automated systems flag content based on pixel patterns, not context. A tattoo on the back? Flagged. A strapless dress? Flagged. A woman smiling while holding a coffee cup? Fine. But if she’s in a hotel room with a window behind her? Flagged. There’s no human review. No understanding of consent, agency, or economic necessity.
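The mechanics of that context blindness are easy to illustrate. The rules and weights below are entirely hypothetical (real moderation systems use trained classifiers, not hand-written lists), but this minimal sketch shows the failure mode described above: a score built from visual signals alone, with no input anywhere for context, consent, or intent.

```python
# Hypothetical sketch of a context-blind moderation heuristic.
# All signal names and weights are invented for illustration.

FLAG_THRESHOLD = 0.7

# Each detected visual signal adds (or subtracts) "risk".
SIGNAL_WEIGHTS = {
    "skin_ratio_high": 0.4,
    "bedroom_setting": 0.3,
    "hotel_window": 0.3,
    "sports_equipment": -0.2,  # a "yoga" cue lowers the score
}

def moderation_score(detected_signals):
    """Sum weights for detected pixel-level signals. Note what is
    missing: no notion of who posted, why, or with what consent."""
    return sum(SIGNAL_WEIGHTS.get(s, 0.0) for s in detected_signals)

def is_flagged(detected_signals):
    return moderation_score(detected_signals) >= FLAG_THRESHOLD

# The same pose and lighting, classified differently by setting alone:
yoga_post = ["skin_ratio_high", "sports_equipment"]    # score 0.2 -> kept
hotel_post = ["skin_ratio_high", "bedroom_setting",
              "hotel_window"]                          # score 1.0 -> flagged
```

The asymmetry in the article follows directly: swap the background signals and an otherwise identical image crosses the threshold.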

And it’s not just about images. Algorithms also shadowban search terms. Try searching ‘escort paris 16’ on Google or Instagram. You’ll get ads for massage parlors, dating apps, and travel guides, but never the real people who use those terms to connect safely with clients. The platforms don’t just remove content; they erase the language used to find it.
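Query-level suppression of this kind can be sketched in a few lines. The blocklist and result entries here are invented, not any platform’s real list; the point is that the filtering happens silently, and the searcher is never told anything was removed.

```python
# Hypothetical sketch of query-level shadowbanning.
BLOCKED_TERMS = {"escort"}  # illustrative, not a real platform blocklist

def search(query, index):
    """Return results for a query, silently dropping any indexed entry
    tagged with a blocked term. The caller never learns why."""
    words = set(query.lower().split())
    if words & BLOCKED_TERMS:
        # The query is "soft-blocked": only entries without blocked
        # tags (ads, travel guides) survive the filter.
        return [r for r in index if not (set(r["tags"]) & BLOCKED_TERMS)]
    return list(index)

index = [
    {"title": "Paris travel guide", "tags": ["travel"]},
    {"title": "Independent companion, Paris 16e", "tags": ["escort"]},
]
# search("escort paris 16", index) returns only the travel guide;
# search("paris hotels", index) returns both entries.
```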

[Image: A glowing digital map of Paris with pulsing red pins that whisper voice recordings of lost online lives.]

The Human Cost

When a sex worker loses their online presence, they lose more than followers. They lose rent money. Medical care. Therapy. Safe housing. Many rely on digital platforms to screen clients, set boundaries, and avoid violence. Without them, some return to street-based work, where the risks are far higher.

One artist featured in E-viction, known only as Lila, posted regularly from her apartment in the 14th arrondissement. She documented her life as a writer and companion, blending poetry with photos of her cat and the view from her balcony. When her accounts were deleted, she lost access to her savings. She couldn’t pay her landlord. She moved to a shelter. Her art stopped. Her voice disappeared.

Her story isn’t rare. In 2024, a study by the Global Network of Sex Work Projects found that 78% of sex workers who used digital platforms for income reported losing access to at least one account in the past year. Of those, 43% said they had no way to recover their income. Many didn’t even know why they were banned.

How Artists Are Fighting Back

The creators of E-viction didn’t just document the loss; they built a new space. The exhibition is hosted on decentralized platforms like IPFS and Matrix, where content can’t be easily removed. Each piece is archived with metadata: the date the account was deleted, the platform, the reason given (if any), and the artist’s statement.
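The archive records described above can be sketched as follows. The field names are assumptions, not the exhibition’s actual schema, and IPFS-style content addressing is simulated here with a SHA-256 hash of the serialized record: any mirror serving the record can be checked byte-for-byte against its address.

```python
# Sketch of an archive record with provenance metadata.
# Schema is hypothetical; content addressing is simulated with SHA-256.
import hashlib
import json

def archive_piece(artwork_bytes, platform, deleted_on, reason, statement):
    """Bundle an artwork with its deletion metadata and derive a
    content address, so the record can be verified and mirrored."""
    record = {
        "platform": platform,
        "deleted_on": deleted_on,       # ISO date the account vanished
        "reason_given": reason,         # often None: platforms rarely say
        "artist_statement": statement,
        "artwork_sha256": hashlib.sha256(artwork_bytes).hexdigest(),
    }
    # Canonical serialization so the same record always hashes the same.
    payload = json.dumps(record, sort_keys=True).encode()
    record_id = hashlib.sha256(payload).hexdigest()
    return record_id, record
```

Because the address is derived from the content, deleting the original account does not invalidate the record; anyone holding a copy can re-host it under the same identifier.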

One of the most powerful pieces is a digital map of Paris, showing pins where former sex workers used to post from. Click on one, and you hear a voice recording: ‘I used to post from my kitchen window. Now I post from a friend’s couch. I miss the light.’ Another pin leads to a poem written in French and English, titled ‘They Called Me a Prostitute, But I Was Just Trying to Pay My Bills.’

Some artists have turned to blockchain-based galleries, where each artwork is minted as an NFT. But even there, payment processors like Stripe and PayPal refuse to support them. So they use cryptocurrency wallets. Bitcoin. Ethereum. Monero. It’s not easy. But it’s safer than relying on a platform that can erase you in seconds.

[Image: A virtual gallery with ghostly silhouettes of sex workers dissolving into data streams around an empty armchair.]

What This Means for Everyone

This isn’t just about sex workers. It’s about who gets to tell their story online. When platforms decide what’s ‘inappropriate,’ they’re also deciding what’s ‘normal.’ They’re silencing queer voices, feminist art, body positivity, and medical education about reproductive health, all under the same vague banner of ‘safety.’

Imagine if your therapist’s video on anxiety was deleted because it showed a person lying on a couch. Or if a nurse’s post about postpartum care was banned because it showed a breastfeeding mom in a hospital gown. That’s the logic being applied here.

And the worst part? Most people don’t even know it’s happening. Because the people being erased aren’t celebrities. They’re not influencers with millions of followers. They’re ordinary people trying to survive. Their stories don’t trend. Their disappearances don’t make headlines.

Can Anything Be Done?

Yes, but not by waiting for platforms to change. The fight is happening in courts, in protests, and in art. In 2023, a group of sex workers in France sued Meta for discriminatory content moderation. The case is still ongoing. In the U.S., the EARN IT Act threatened to force platforms to scan private messages. Advocates blocked it, for now.

Meanwhile, E-viction continues to tour. It’s been shown in Berlin, Toronto, and online through virtual reality spaces. Visitors can walk through a digital apartment that looks exactly like the ones sex workers used to post from. They can click on a lamp and hear a voice say, ‘I turned this on every night so you’d know I was home.’

The exhibition doesn’t ask for pity. It asks for recognition. For the right to exist without permission. For the understanding that online censorship doesn’t protect people; it punishes them.

And if you’ve ever posted a photo of yourself in a bikini, or shared a poem about your body, or used the internet to find work, you’re one algorithm change away from being next. The line between ‘acceptable’ and ‘dangerous’ is drawn by code. And code doesn’t care about your story.

So next time you see a post disappear, don’t assume it was inappropriate. Ask: Who decided that? And who gets to decide what’s allowed?

One artist in E-viction left this note: ‘I didn’t choose to be erased. I chose to survive. And I still am.’

That’s the quiet power of this show. It doesn’t scream. It just remembers.

Some of the artists still post from the 14th. Others moved to Lisbon. A few went dark. But their art remains. And so does the question: When the internet stops being a public square, who gets to decide what’s left standing?