I joined Facebook on October 24, 2006, on my 28th birthday. Why I joined is pretty boring: I joined because it was the thing you did. I joined because I wanted to be part of a larger society, and it was this network that allowed me to reach out to all of those people I had long forgotten or missed since graduating from high school.
In the seven-plus years that followed, I saw Facebook go from what felt like the newest version of MySpace to a multi-billion-dollar company that relied on my data and the data of those around me to sell ads, curate content, and promote events and causes from people marginally close to me. My friend number went up and down and up and down. I struggled to maintain relevance, gave up trying to be heard, pared back my social circle, and fought with what had become a more and more irrelevant feed.
I got tired of Facebook – of the site’s weird algorithm and of my network’s passive-aggressive messages – but I couldn’t let it go.
My relationship status with Facebook? It’s Complicated.
News Feed: Most Recent (A History)
I was like everyone else when Facebook launched: thrilled with the ability to connect with people around me, to share my thoughts and likes, to get closer to people I had only met a few times, or strengthen the bonds between old friends and far-away family members. Facebook was a way for me to live globally, to move beyond Sioux Falls without losing my community.
But, each year, that freedom shifted. Pages were downplayed. My feed was shifted out of order. The idea of a “News Feed” shifted from a river of current content to a curated web of what an algorithm thought I wanted to see. And while I love how algorithms have created better search results and more relevant sidebars and a feeling that I can make quick scans of content without subjecting myself to everything, I still have fundamental issues with someone doing that to my friends and family.
When “News Feed” became a euphemism for robot-curated content, I felt betrayed. “Most Recent” became the only way to see a real-time feed of news from your friends. When “Most Recent” was further buried – a seemingly defensive mechanism to force users into lockstep with the algorithm – I felt angry. And when news of the emotion experiments surfaced last week, I felt exhausted.
Listen, I get it. I’m getting dramatic and railing against a corporation that is responsible for thousands of jobs and has business goals and all of that. I’m raging against a machine and not offering any answers.
But the content presented by Facebook isn’t just a series of stories written by authors I don’t know, or videos pitched based on what I most recently read. It’s the life and thoughts of the people I know. For some people, it’s the equivalent of a conversation at the bar, or a look into a diary. It’s personal. It’s not anonymous; it’s deeply connected in a way that no other network can claim.
If Facebook actively wandered into a face-to-face conversation and assumed the best lines, rearranging them for maximum impact, what would we think? If they took our stack of Christmas cards and determined who would be the best recipients, what would we think?
The Public Dislikes Facebook’s Link
So then this whole Facebook social experimentation thing happened. To very grossly summarize, Facebook’s research department was using the feeds of 700,000 Facebook members to perform experiments on the effects of positive or negative comments. (“Emotional contagion through social networks,” it was called.) What do we do when things are going great – or horribly? Were we more likely to post if we saw positive things in our feed? Were we less likely to share if we saw too many negative posts?
Rightfully so, Facebook opponents rose up in arms. This was psychological experimentation without consent, with the flimsy (and vague) language of Facebook’s terms of service used as some kind of crutch. Who was Facebook to control the output of my friends and family, of my feed, of the organizations I chose to follow? Who was Facebook to use my data in a way that was less than ethical?
In the midst of the discussion, we talked about the morality of A/B testing, the separation between social networking and marketing, and the possibility of opting in to data mining and experimentation. We talked about best practices in research. We talked about “business as usual.” We talked about how maybe we were just blowing this out of proportion.
That last point stuck with me – in the grand scheme of things, is this really something I wanted to raise my ire about? In the grand scheme of things, does this really matter?
Do I care?
Disinterest Has Sent You a Friend Request
I’ll be honest: I don’t. I don’t care about the experiments. I don’t care about the data they’ve used, because I assumed they already used it. I’ve threatened to quit over it before, but I haven’t. Which shows I probably don’t care. Not really.
I do care, however, about Facebook’s assumption that they know better than I do what I want to look at.
As danah boyd writes in her wonderful article about the growing anxiety around data manipulation, “What does the Facebook experiment teach us?”:
I get the anger. I personally loathe Facebook and I have for a long time, even as I appreciate and study its importance in people’s lives. But on a personal level, I hate the fact that Facebook thinks it’s better than me at deciding which of my friends’ posts I should see.
This was never an issue of experimentation and consent – this was a clear reminder of what I don’t like about Facebook. This was not the straw that broke the camel’s back, but a catalyst for my justification. This was what would send me, finally, after years of threats, away from Facebook.
Facebook’s algorithmic adaptation of my friends’ lives is a model unlike any other used on the web. News sites and blogs use algorithms to force certain stories to the top, but that is the type of editorial curation we expect from a journalism source. Search uses algorithms to assume solutions, but in those cases we’re typically not sure of the solution we’re looking for in the first place.
But this? This is full-scale reinterpretation of actual lives: a kind of dramatized version of my social circle, like a classic book being remade into a film with a happier ending and a few extra sex scenes. This is not what I signed up for; I signed up for the full feed – the flaws, the bumps, the happy and the sad. Internet, you can go ahead and curate and editorialize the things that are not directly connected to me: the world of search, the editorials on TechCrunch, the assumptions of movies I might want to see.
Just don’t fuck with my friends.
Corey Vilhauer Has Updated His Relationship!
The reason I still cling so closely to Twitter is that it does not depend on Facebook’s algorithms of assumed value – on the premise that I need someone else to filter through my friends’ thoughts, like a warden pre-screening an inmate’s mail. Twitter gives me a running feed of everything my friends say. It is up to me to negotiate that feed – to pare it down and curate it in order to retain some value. Instagram does this too – it’s just every picture from every friend – and despite the hypocrisy of fighting Facebook and keeping Instagram, I’m still okay with Instagram.
Facebook is where the worst opinions are surfaced. It’s where platitudes go to die, where everyone has an opinion, where long rambling diatribes are all the rage, where companies compete to pay their way to my heart and friends compete to gather sympathy. This is not the fault of Facebook. This is not the fault of my friends. This is simply what the ecosystem has become, and it is an ecosystem that I no longer feel the drive to be a part of.
Yeah. The experiments matter. They are news that we need to focus on. They are a betrayal of trust, and while most people don’t care (and only a very small portion of people were actually affected) they still represent the first step toward wanton abuse of personal data.
But the experiments are not why I’m quitting Facebook. They are just the reason I remembered to do it in the first place.