European Central: Reduced Political Posts on Facebook in Ireland, Spain, and Sweden
Facebook announced that it is expanding its test of reducing the visibility of political posts to three more countries: Ireland, Spain, and Sweden. The company initially began testing this earlier this year in the United States, Canada, Brazil, and Indonesia. Facebook will reduce the visibility of political content in its algorithm in response to feedback from users, who appear unhappy with the misinformation spread on the site; this is how Facebook is attempting to do damage control.
Looking at the three countries chosen in Europe, while it is not initially obvious, each one can benefit from attempts to lessen political division. Spain, Sweden, and Ireland may not be the most discussed countries when it comes to European political issues, as more attention is focused on Poland and Hungary, but each faces political challenges of its own.
Starting with Ireland, Brexit caused a headache, and there was concern that violence would resume at the border between Ireland and Northern Ireland. Turning to Sweden, the Sweden Democrats have gained more support and more seats with each parliamentary election, making it harder for the other parties to form coalitions that exclude them. While the Sweden Democrats have been kept out of power, these coalitions are not especially stable, as the member parties may disagree on key topics. Finally, in Spain, Vox has been gaining popularity as a populist party but has become divisive as well.
A drawback of this plan is that media companies may be hurt financially, as their political content may be hidden from potential readers. This is particularly concerning because outlets that traditionally distributed their work on paper have had to scramble to embrace the digital age. Social media is a great way for people to discover a specific news story or learn about a news source they do not already know. When people share news stories on social media, media companies benefit from free advertising, but under this new policy that advantage will be greatly reduced. Because some media companies depend on the ad revenue earned when people read an article, their revenue may fall if Facebook eventually stops recommending political content to users in all countries, not just the ones where the policy is currently being tested. Just because Facebook plans to roll out the policy slowly does not mean it will be any easier for media companies to maintain current levels of article views if their posts are suppressed by updates to Facebook’s algorithm.
It is understandable that Facebook is working to close the divide among its users, and by extension among the populations of entire countries, but making political news articles less visible may also leave users less informed about current events. Facebook users have other ways to learn about what is happening in their own country, such as tuning into the news on TV. The problem is that information about the political situations of other countries is harder to find. When political events trend on social media, users become connected with other parts of the world. If these events are hidden on social media, it may become more difficult for people to read about events that are occurring but that they are not yet aware of.
Another issue with this policy is that, instead of fact-checking political content on the platform, Facebook is simply making it harder for users to read any political content. Rather than rooting out the problem, this method punishes everyone involved. As Facebook itself points out, political content is only 6 percent of all content on the site. Since political content is such a small minority of content, it is disappointing that Facebook could not come up with a different solution, one that would not hurt the media companies that distribute political articles on Facebook, particularly those doing their due diligence and posting only fact-checked articles.
It is also important to remember that political content on Facebook can have benefits and can inform users rather than divide them along ideological lines. For example, Facebook has in the past encouraged people to share a post on their timeline announcing that they registered to vote, so that their friends would remember to register as well. Political content on Facebook can also remind users when elections will occur. Media companies may release articles simply stating the positions of candidates in an election to inform readers before they vote. Other political content may be shared simply to tell users about recent events that may affect them. All of these positive types of political content could become less visible unless Facebook makes exceptions in its policy of not recommending such content to users.
Ideally, Facebook would push down news articles that have been flagged as containing unverified or false information rather than every post that is political in nature. This is logical, as Facebook users have criticized the site more for the spread of misinformation than for the mere presence of political content. It would make more sense for Facebook’s algorithm not to promote a story about violence along the Ireland/Northern Ireland border if the violence turned out not to be occurring and the writer was trying to stir tensions in the region. Unfortunately, this is difficult to do: going through each report to verify whether a news story is trustworthy after users flag it would be time-consuming. Facebook already employs 15,000 human moderators and uses AI to moderate the site. Since these human moderators already help verify whether flagged posts violate Facebook’s policies, they could also be used to confirm whether circulating news articles are informing people or spreading misinformation. There have been concerns that the number of moderators is insufficient to moderate the site, and the NYU Stern Center for Business and Human Rights released a study suggesting that Facebook needs to double its number of moderators. Since that recommendation came before any suggestion that moderators also decide whether news stories shared on the site are legitimate, Facebook may have to increase the number of moderators beyond simply doubling it.