In its never-ending quest to win back the ground it has been losing to other social networks in time spent on the platform, and to address the privacy concerns that have destroyed its credibility, Facebook announced last week that it is rolling out a new feature to let users better understand why the algorithm is showing them certain content.
Essentially an extension of the “Why am I seeing this ad?” feature that came out in 2014, “Why am I seeing this?” is Facebook’s first foray into explaining how the organic content algorithm works. Facebook has gotten a lot of criticism for its algorithm in the past. First from publishers who saw their traffic dwindle. Then from brands who saw their reach strangled. And finally from the public at large, as the algorithm played into the hands of social manipulators who used it to spread highly divisive content and ultimately sway elections. The biggest complaint now is that the algorithm is so complex that no single person can understand how it works.
When a problem gets this complex, the default is to blame the company for unleashing a monster. Facebook becomes the target. But in fact the algorithm is made up of a series of incremental changes, each designed to show you content that’s more relevant to you. “Why am I seeing this?” only scratches the surface of why a certain piece of content pops into your feed. Look at the image Facebook provided as an example:
The top reasons you see content are obviously social (it’s a social network, after all). You see content from your friends and from groups you’re a part of. Facebook gives preference to people you interact with more. It’s hard to argue with that, and by giving you the option to review your past behavior – the actual posts you interacted with from a certain person – Facebook is shifting the responsibility for what you see from itself to you.
Facebook also starts to show some of the metadata reasons for surfacing certain content. If you disproportionately like photos, for example, it will show you more photos. We can assume the same holds for videos and text posts. There is also an “other factors” option, which is where many other signals come in and where the opacity grows.
Giving a Semblance of Control Back to You
Facebook doesn’t want you to be turned off by what you see. I have many friends whom I like spending time with but who share things that aren’t relevant to me. Our differences are what make our relationships interesting, but that doesn’t mean I want to sift through every post about social justice from a particularly passionate friend. The “Why am I seeing this?” feature stops short of providing direct options for you to control what you see.
Facebook already offered the ability to unfollow and mute accounts – either for a certain period of time or for good. But now the ability to edit your news feed preferences is placed front and center via the new feature instead of being buried in settings. The problem is that it is still highly focused on who you follow, not on what you want to see. You can select people whose content you absolutely want to see, and you can review who you unfollowed to decide whether to follow them again. You can also prioritize between groups: if there is an important group, say your group of best friends, you can rank it above your casual groups, say a music fan group where multiple random people post every day.
What I Really Want to Control
Since we cannot know in advance what type of content any of our friends are going to post, what I really want are content filters. I want to be able to decide what types of content I see, independently of who is posting it.
For example, here is a list of things I don’t want to see:
- Anything related to politics – no news articles, diatribes or memes. It’s too depressing and even if I agree with you, posting about politics to social media is a waste of everyone’s time
- Viral soccer videos – none of them are that cool; every goal is pretty much the same thing, like watching home runs in baseball
- Cryptic messages, like the classic “Why me?” post that’s begging for interactions
- Ultrasounds of your baby – in fact, I would like to automatically unfollow everyone who posts an ultrasound, because if you’re posting a photo of your baby before it’s born, it’s going to be an onslaught of baby photos from here on out. Don’t get me wrong, I’m probably very happy for you and your family, but I don’t want to scroll through pictures of babies anywhere
- Anyone asking for recommendations in a city where I don’t live. No Karen, I don’t know a good electrician in Washington DC
- Your updated cover photo, either you share a photo or you don’t
- Thanking your friends for their birthday wishes in a single generic post. This is the worst. Not only did I NOT wish you a happy birthday, you didn’t even take the time to individually thank the people who individually took the time to wish you one.
Now this list is completely personalized to me. Maybe you don’t care when people share music videos or news about a sports team. Maybe you hate people talking about career achievements since you’re unemployed. Maybe you hate pictures of people partying because you have three kids and live in the countryside and haven’t been out for drinks in the past three years. Everyone has a different set of preferences.
Here are things that I would like to see:
- Your achievements so I can say congrats
- Your milestones like your wedding or new job
- You when you are doing things with other people I know
- Your travels and adventures
- Thought-provoking articles about digital brand strategy or media
- Recommendations for great places to eat
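The kind of control I’m describing isn’t hard to imagine in principle. As a rough illustration – not anything Facebook actually offers – here is a toy sketch of per-user content-type filters, where every field name, type label, and keyword rule is invented for the example:

```python
# Hypothetical sketch of per-user content-type filters.
# All type labels and keyword rules below are made up for illustration.

BLOCKED_TYPES = {"politics", "viral_sports", "birthday_thanks"}

# Naive keyword-based classifier; a real system would need actual
# content-understanding models, not string matching.
KEYWORD_RULES = {
    "politics": ["election", "senator", "congress"],
    "viral_sports": ["goal", "highlight"],
    "birthday_thanks": ["thanks for the birthday wishes"],
}

def classify(post_text):
    """Assign a post to the first content type whose keywords it contains."""
    text = post_text.lower()
    for content_type, keywords in KEYWORD_RULES.items():
        if any(kw in text for kw in keywords):
            return content_type
    return "other"

def filter_feed(posts, blocked_types):
    """Keep only posts whose content type the user has not blocked."""
    return [p for p in posts if classify(p) not in blocked_types]

feed = [
    "Amazing goal highlight from last night!",
    "We just got married!",
    "Thanks for the birthday wishes everyone",
]
print(filter_feed(feed, BLOCKED_TYPES))  # only the wedding post survives
```

The point of the sketch is the shape of the control, not the classifier: the preference lives with the reader, applies to content regardless of who posted it, and could be as personal as the lists above.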
If Facebook ever gets intelligent enough to be able to let us control what we see and not just the people we see things from, it will start to regain some of its lost lustre.
But really, Facebook’s new feature is just a way to cover its ass by showing you that it was really you all along who determined what you see. Facebook only makes logical changes that are hard to argue with on an individual level. Facebook is just a mirror of society, nothing more, and therefore it cannot be held to any sort of moral standard.
The decline will continue.