The editorial distribution model has changed dramatically since we all switched our homepages from our favourite news source to Facebook. Now, we get news from our friends.
I’m not complaining – it’s nice to feel connected through the things we read and watch. But it’s created a few problems.
Firstly, fake news is everywhere.
The aggregated, news-feed style of presenting information puts less emphasis on the publisher and more on the sharer. I’m not getting my news from an editor, I’m getting it from Aunty Janet. And when we feel like we’re sharing information with like-minded people, there’s less resistance and less fact-checking. ‘Everyone I know believes it, so it must be true’.
The second issue is the ‘filter bubble’, and it’s a symptom of something bigger: the problem with letting algorithms run our lives.
Personalisation, in general, makes sense – where possible, we should see things that reflect our interests. But in something as high-stakes as the dissemination of information, screening out differing views entirely is becoming a real problem, driving a wider divide than ever between the two sides of politics.
There is some good news, however – interesting new technology is helping people reverse the effects.
PolitEcho is an app users can download to actively work against ‘filter bubble’ biases in their own feeds. A Norwegian website has started requiring readers to pass a quiz before they can comment on stories, and even Facebook itself has recently rolled out a tool to identify false news.
But are we living in an editorial version of the psychopath test? If I suspect I’m in a filter bubble, or that I’m seeing fake news, I’m probably already correcting for it subconsciously.
Although ‘censorship’ is a dirty word, perhaps moderating online content is the best way to fight fakery.
At Reevoo, we deal with online product reviews. Filtering reviews is a necessary part of the process, and it makes the experience more useful for everyone. Every review that comes through our platform (and there are millions) is moderated by a real human being before it’s published. It’s the only proper way to make sure reviews are genuinely useful and authentic, and that everyone’s opinion gets an equal chance to be heard.
We do this because we believe we have a responsibility to the consumer (and, to a degree, to the brands we collect reviews for). But when it comes to our everyday news consumption, where does the responsibility lie?
Does Facebook need to hire an editor-in-chief? Maybe a whole department of people who aren’t motivated by ad revenue from clicks, but by impartiality?
It might just be the only way to restore some balance. At the very least, it would buy Facebook a little credibility while it works out a longer-term fix for the news feed.
Although we may not like it at first, some form of human moderation is currently the only way we have to keep the internet a safe, reliable source of information for us all.