22 Sep 2017

There seems to be an ever-increasing number of news stories about algorithms failing, often in the most basic ways.

We need to talk about algorithms

Over the last few weeks, there has been the high-profile Amazon example, where the algorithm appeared to suggest the combination of materials needed to make explosives. The issue for Amazon lies in the ‘frequently bought together’ and ‘customers also bought’ functions. Marketers will argue that automation helps customers get to the products they need more efficiently and that this improves brand performance, which is true. However, simple oversight from Amazon would have minimised the damage to its brand reputation. Days after Channel 4 News aired a segment on the issue, Amazon had still not made any changes to the site.
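The mechanics are worth spelling out. Below is a minimal, hypothetical sketch of the co-occurrence counting that typically underpins a ‘frequently bought together’ feature; Amazon’s actual system is proprietary and certainly more sophisticated, and the basket data here is invented. The point is that nothing in the logic knows what the items are.

```python
# Minimal sketch of co-occurrence recommendation, the general technique
# behind 'frequently bought together' features. Illustrative only --
# the baskets and item names are invented. Note the pairing logic is
# purely statistical: it has no idea what the items actually are.
from collections import Counter
from itertools import combinations

orders = [
    {"item_a", "item_b", "item_c"},   # hypothetical purchase baskets
    {"item_a", "item_b"},
    {"item_b", "item_c"},
]

pair_counts = Counter()
for basket in orders:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

def frequently_bought_with(item, top_n=3):
    """Rank co-purchased items by raw co-occurrence count."""
    scores = Counter()
    for (a, b), count in pair_counts.items():
        if a == item:
            scores[b] += count
        elif b == item:
            scores[a] += count
    return scores.most_common(top_n)

print(frequently_bought_with("item_a"))
```

If customers repeatedly buy two chemicals together, the pairing is surfaced automatically; there is no semantic filter, which is exactly the gap that human oversight is needed to fill.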

Yesterday, Guardian reporter Olivia Solon received a rape threat and posted a screenshot of it on Instagram. The Facebook-owned company then turned that screenshot into a dark ad served to users on Olivia’s timeline. Although this was not a paid-for, above-the-line ad, it still raises questions about content being selected automatically. It appears the algorithm boosted the content precisely because it was offensive and controversial. Facebook has responded by saying it will stop fully automated ad targeting and bring human beings into the process. Other ethical and free-speech questions are also hovering over Sheryl Sandberg and Mark Zuckerberg’s inboxes.

The large platforms are now aware that the public knows complete automation of every process is not possible and that human intervention is critical. Where content contains controversial statements about politics, race or gender, relying solely on algorithms typically does not end well. Human input in programming obviously carries built-in bias; even so, there needs to be human oversight across the whole process to avoid obvious cock-ups. The investment required for human oversight of content and decision-making is substantial, but it is dwarfed by the market caps of these platform behemoths.
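As a sketch of what that oversight might look like in practice, here is one hypothetical human-in-the-loop gate: an upstream model scores content, but anything touching sensitive topics or exceeding a risk threshold is routed to a person rather than published automatically. The topic list, threshold and function names are invented for illustration.

```python
# Illustrative human-in-the-loop gate: the classifier proposes, a
# person disposes. Topics and threshold are invented for this sketch.
SENSITIVE_TOPICS = {"politics", "race", "gender"}
RISK_THRESHOLD = 0.7

def route_content(topics, risk_score):
    """Decide whether content can be auto-published or needs review.

    risk_score is assumed to come from an upstream model (0.0-1.0).
    """
    if topics & SENSITIVE_TOPICS or risk_score >= RISK_THRESHOLD:
        return "human_review"   # never auto-publish; a person decides
    return "auto_publish"

print(route_content({"retail"}, 0.2))     # auto_publish
print(route_content({"politics"}, 0.2))   # human_review
```

The design choice is the point: the algorithm still does the heavy lifting of triage, but a human makes the final call on anything it cannot be trusted to judge.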

There needs to be an acceptance that not every task can be automated. As renowned computer security expert Bruce Schneier has noted: "Finding terrorism plots … is a needle-in-a-haystack problem, and throwing more hay on the pile doesn’t make that problem any easier. We’d be far better off putting people in charge of investigating potential plots and letting them direct the computers, instead of putting the computers in charge and letting them decide who should be investigated."
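Schneier’s point is a base-rate problem, and the arithmetic is stark. All the figures below are invented for illustration: even a screening algorithm that catches 99% of plotters and wrongly flags only 1% of innocents, applied to millions of mostly innocent people, buries the handful of true positives under a mountain of false alarms.

```python
# Back-of-envelope base-rate calculation (all figures invented to
# illustrate the needle-in-a-haystack problem).
population = 10_000_000        # people screened
actual_plotters = 10           # the needles
true_positive_rate = 0.99      # plotters correctly flagged
false_positive_rate = 0.01     # innocents incorrectly flagged

true_positives = actual_plotters * true_positive_rate            # ~10
false_positives = (population - actual_plotters) * false_positive_rate

print(f"True positives:  {true_positives:,.0f}")
print(f"False positives: {false_positives:,.0f}")                # ~100,000
print(f"Chance a flag is real: "
      f"{true_positives / (true_positives + false_positives):.4%}")
```

Adding more data (‘more hay’) grows the false-positive pile while the number of needles stays fixed, which is exactly why directing human investigators beats letting the computers decide who gets investigated.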


Giles Brown

gbrown@s360group.com

Giles has over ten years of experience in sales and marketing within the B2B media marketplace. He previously worked for the United Business Media group across a range of media, largely within the TMT sector. Giles has managed large-scale social media monitoring programmes within the natural resources, pharmaceutical, financial services, defence and retail sectors.

+44 (0)20 8875 7969
