People have started to think more negatively about Facebook than before because the giant, used by 1.9 billion people, has become a synonym for negative influences on people’s lives. For years, we’ve known it’s too addictive and that many are losing sleep over social media in general. But recently, more serious negative effects of Facebook have become apparent: Facebook is a conveyor of false information and fake news. And Facebook’s algorithms, which define what we perceive as reality, are undermining our democracies and making us less educated and enlightened. Who would have thought, at the birth of the information economy, that people could become less informed and knowledgeable about fundamental matters that affect their lives?
Examples of fake news:
Despite Mark Zuckerberg’s efforts to describe Facebook as a medium that unites the world, more and more people are realizing that Facebook can not only make people less informed, it can also pave the way for crooks to gain power. And when we have come to a point where someone has used these flaws of Facebook to become the president of the United States, we are obliged to discuss what to do.
Imagine a typical coal miner, relying on a single income from an industry that should be fading away. Facebook’s algorithms could easily make sure that this coal miner sees the world in a light where most people are completely happy about the continued use of coal. And when that person encounters someone, typically an environmentalist, with opposite opinions, the miner can get angry because of his made-up, biased world view: everyone is in favor of coal mining. And if you think that almost everyone shares your opinions, you are not willing to listen to other people’s reasoning, knowledge and experience. They are just stupid, they are wrong and you are right. This is how Facebook is making the world a darker place, where respect for other people’s opinions is fading at an incredible rate.
An example of a biased world view, where a person is convinced that the majority of people share his views, attitudes and manners:
An example of the consequences when Facebook’s algorithms create an echo chamber around you (the ‘I am right, all the time’ syndrome):
Facebook’s carefully planned diluting of people’s perceptions and vision has become especially apparent since Donald Trump started his run for the White House. With more fake news than ever and more biased world views than ever, the world has truly become a less interesting place. A dirtier, darker place where dignity, appreciation, respect and fairness are labeled as unimportant, if not thrown in the trash.
Therefore, it is now time for Zuckerberg’s team to make a few changes, not only to restore Facebook’s brand image but, more importantly, to fix what has been broken; to support the values that make a successful society. The right path for Facebook would be to think in terms of unbiased and true information. But what Zuckerberg has said so far (his recent 6,000-word manifesto) has unfortunately not convinced many. ‘We will make sure that people will not be able to share news if they haven’t read it’ is just one of many statements that reveal a weak intention to get to the core of the matter. Most likely, Facebook has a conflict of interest here. Sharing all news, including fake news, is lucrative for Facebook. And biasing our perceptions is extremely lucrative for Facebook. Therefore we could argue that it is unlikely that Facebook will fix things on a deep level. But keeping that in mind, Facebook could easily make a few changes that would really help give people back their faith in Facebook and its intention to serve society well.
Here, therefore, is a list of changes that Facebook could consider implementing in order to restore people’s faith in Facebook and win back at least some of the trust that the brand has lost over recent months:
#1: On-Off Algorithm-Filtering
Add options to Facebook’s news feed that would allow us, once in a while, to see the news feed without algorithm-filtering: a clean news feed from every person and page we have connected with, without Facebook judging what we should and should not get to see.
Facebook doesn’t give us an algorithm-filtered news feed because they are control freaks. They do it to increase revenues. More control over the news feed means more revenues in their pocket. That is the biggest reason why they filter our news feed.
But as we now have enough bad examples that are direct consequences of algorithm-filtering, Facebook users should have the option of making the news feed filter-free.
It could be as simple as this small menu:
This ‘filtering-free’ news feed would allow us to perceive the world as it is and learn how common various opinions and attitudes really are. One could argue that followers of uncommon views and opinions would often be too narrow-minded to turn that option on, and maybe that is true to some extent. But being a person who only looks at the filtered news feed would soon become a negative label, commonly associated with narrow-mindedness, prejudice and discrimination. When arguing for racially discriminatory views, one could get an answer back: “Oh dear, have you been looking at your filtered news feed for too long?”
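A minimal Python sketch of what the on/off option described above could mean under the hood. The `Post` structure, the `engagement_score` field and the sample posts are all invented for illustration; Facebook’s real ranking signals are not public.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Post:
    author: str
    text: str
    posted_at: datetime
    engagement_score: float  # hypothetical stand-in for Facebook's opaque ranking signal

def news_feed(posts, filtering_on):
    """With filtering on, rank posts by the engagement score;
    with filtering off, show every post in plain reverse-chronological order."""
    if filtering_on:
        return sorted(posts, key=lambda p: p.engagement_score, reverse=True)
    return sorted(posts, key=lambda p: p.posted_at, reverse=True)

now = datetime(2017, 3, 1, 12, 0)
posts = [
    Post("Alice", "Old but viral", now - timedelta(hours=9), engagement_score=0.9),
    Post("Bob", "Fresh update", now - timedelta(minutes=5), engagement_score=0.2),
]

print([p.author for p in news_feed(posts, filtering_on=True)])   # ['Alice', 'Bob']
print([p.author for p in news_feed(posts, filtering_on=False)])  # ['Bob', 'Alice']
```

The point of the sketch is that the filter-free feed is the simpler of the two: it needs no judgment from Facebook at all, only a timestamp sort.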
#2: Tell It Like It Is
It’s funny to hear Mark Zuckerberg condemn fake news when Facebook is faking basic user information. Let’s look at the news feed. You are told that you can control whether the news feed shows you the most popular or the most recent status updates from your Facebook friends: you can choose ‘Top Stories’ or ‘Most Recent’ at the top of the news feed page (see picture below). But Facebook fakes the idea that the user actually controls this, in order to increase revenues. In truth, Facebook takes over control of what you see most of the time. These two options are therefore mostly fake.
A typical news feed brings this to light if you choose to see the most recent statuses. Once in a while you might get your most recent statuses, but most of the time Facebook’s deep intention of controlling what you get to see takes over. In the picture below is a real news feed (names of pages have been changed), and it shows Facebook publishing a 9-hour-old status even after the user asks for the most recent statuses. This user has over 2,000 friends and receives new status updates almost every few minutes. So, offering a 9-hour-old status when the user is asking for the latest statuses is a cheap trick to control your perception of the world.
Facebook should be true to Zuckerberg’s words and tell it like it is: don’t say you are showing your users the most recent news feed when it is half a day old. And what’s more important: don’t make it so hard for users to see the truth and reality of the world around them. Allow them to disconnect from your controlled news feed, at least for a while, if they really ask for it.
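The honest meaning of ‘Most Recent’ can be made precise with a small check, sketched here in Python. The page names and timestamps are invented, mirroring the 9-hour-old-status example above.

```python
from datetime import datetime, timedelta

def is_honest_most_recent(shown, available):
    """True only if the posts shown are simply the newest available, newest first."""
    truly_recent = sorted(available, key=lambda p: p["posted_at"], reverse=True)
    return shown == truly_recent[:len(shown)]

now = datetime(2017, 3, 1, 12, 0)
available = [
    {"author": "Page A", "posted_at": now - timedelta(hours=9)},
    {"author": "Friend B", "posted_at": now - timedelta(minutes=3)},
    {"author": "Friend C", "posted_at": now - timedelta(minutes=30)},
]

# What the screenshot above shows: a 9-hour-old status at the top,
# even though posts from 3 and 30 minutes ago are available.
shown = [available[0]]
print(is_honest_most_recent(shown, available))  # False
```

Under this definition, the feed in the screenshot fails the check: a 9-hour-old post at the top of a ‘Most Recent’ feed is, quite literally, not the most recent.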
#3: Fund an Independent News Approval Agency, Accessible to Everyone
One of the worst viruses to have affected our societies is the spread of fake news all over the Internet. Newspapers and other media are too weak to fight this vermin, and government establishments should not take it on either. The world needs a financially super-strong entity with a sincere will to support the good values that keep our societies together. A company like Facebook would be the first name that comes to mind when looking for an entity that could pave the path to a non-fake-news reality. Of course, we cannot be naive here; a complete elimination of fake news is not a plausible target, and such news may exist for years to come. But our goal should be to minimize the effect of fake news as far as possible.
Facebook is financially strong enough to fund an independent news approval agency that would be accessible to every news medium and social medium out there. The goal would be to give everyone well-funded, fact-checking approval of the news and other facts that people share across various platforms. It could be an institution, with a name such as NewsApproval.org, that would both do its own fact-checking and also support and fund other institutions and websites with similar goals. One of the websites already established for this purpose is PolitiFact, a fact-checking website that won the Pulitzer Prize in 2009.
It would be against basic principles of democracy to have all fact-checking in one hand. Therefore, it would be great to have a well-funded entity that supports such fact-checking websites and institutions and acts as a conveyor, delivering their results into one easily understood ranking system attached to all shared news across the world. Each shared news item should include a ranking that lets the reader find out whether that particular piece is worth trusting.
Finally, here is an example of one possible ranking system that would be great to attach to all news shared on social media, worldwide:
This could be a metering system simple enough for everyone to understand instantly, something similar to this:
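One way such a metering system could aggregate several independent fact-checkers into a single, at-a-glance label is sketched below in Python. The checker names (other than PolitiFact), the scores and the thresholds are all hypothetical.

```python
def trust_label(checker_scores):
    """Average independent fact-checker scores (0.0 = fabricated, 1.0 = verified)
    into one label a reader can understand at a glance."""
    if not checker_scores:
        return "Unrated"
    avg = sum(checker_scores.values()) / len(checker_scores)
    if avg >= 0.75:
        return "Verified"
    if avg >= 0.4:
        return "Disputed"
    return "Likely fake"

# Hypothetical scores from three independent checkers for one shared story.
scores = {"PolitiFact": 0.2, "NewsApproval.org": 0.1, "CheckerX": 0.3}
print(trust_label(scores))  # Likely fake
```

Keeping the fact-checking distributed across many independent sources, and only the final label centralized, is exactly the division of labor argued for above.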
Not only would strong support from Facebook for these activities make the world truly a better place, it would also boost Facebook’s brand image higher than anything Zuckerberg has ever done. He would then truly be one of the dominant figures in the world who have given something invaluable back to it. I urge you, Zuckerberg!