02 Sep 2016

Allegations that Facebook was hiding specific types of news caused an uproar among social media users earlier this year. Since then, the role that algorithms and humans play in the curation and dissemination of news on social media has sparked huge interest and debate about veracity, relevance and chronology.

Facebook announced this week that it had eliminated the human element of curation in favour of allowing the algorithm to work alone. So is there no value in human analysis? The company claims it always intended to move to full automation, replacing its editors. However, problems have arisen since the changeover, highlighting flaws in the quality of purely automated analysis. The editorial team had worked on the basis that only legitimate, known sources were included, filtering out inappropriate content. Since the editors were removed, offensive and spammy articles have been thrown into the mix.

This illustrates that machines are not yet capable of the sophisticated insight provided by a human brain. It also raises valid questions about how aware social media users are of their interaction with, and reliance on, algorithms. Consider, for example, news coverage promoting a specific worldview on social platforms, one that tallies with a user's own political, economic and social perspectives. If an algorithm only offers what is popular or of suggested overlapping interest, people's existing views could be reinforced by the impression that everyone holds the same beliefs.

Does an algorithm only serve to reinforce already fixed opinions, rather than providing users with a broader genuine reflection of world news and information?

In the case of social media monitoring, reliance on automation and algorithms alone can lead to inaccurate data that does not reflect reality or yield insight. Yet with the billions of websites and social media platforms that now exist, purely human analysis of online posts demands considerable time and resources to rationalise all visible content, while contending with a world of Twitter bots and spam blogs.

The value of human-analysed content cannot be discounted. Legitimate, relevant sources can only be truly evaluated by an editorial team. Rather than leaving the automatons to it, programmed analysis can and should work in harmony with human expertise, so that everyone, whether a company assessing reputational risks or a consumer trawling Facebook, is provided with the information they need.