Fake News. It’s a phrase that’s
been thrown around a lot on social media, especially during this past election.
But what does it mean, really?
Fake news can be defined as “a type of yellow journalism or propaganda that consists of deliberate misinformation or hoaxes spread via traditional print and broadcast news media or online social media.”
Facebook can be a cesspool of fake news
articles. I’m sure you’ve seen articles along the lines of, “Hillary Clinton Orders
the Execution of Donald Trump’s Daughter,” from a website with a name like
“FreedomEagleSoldier.com,” or an article like this one from “News Bible
Report,” claiming Sasha Obama had been shot and killed. These articles are clearly ludicrous, and the thought that someone could believe them is borderline hilarious.
Unfortunately, these articles can do some real damage. Propaganda and fake news during the 2016 election had such a profound impact that fake stories outperformed real news in engagement on Facebook. Facebook received so much backlash over this that it had to do something to combat the problem. Now, if you see an article you believe is fake, you have the option to report it. If enough people report it as fake, it gets sent to a fact-checking organization like Snopes or PolitiFact. If the fact-checkers determine that it is indeed fake, the article shows up on Facebook with a red banner reading, “Disputed by Third-Party Fact-Checkers.”
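To make that report-and-review loop concrete, here is a minimal Python sketch of the idea. The report threshold, the instant fact-check verdict, and all the names are assumptions for illustration only; Facebook’s actual pipeline is not public and certainly works differently.

```python
from dataclasses import dataclass

REPORT_THRESHOLD = 100  # hypothetical number of user reports before review


@dataclass
class Article:
    url: str
    reports: int = 0
    disputed: bool = False


def fact_check(article: Article) -> str:
    """Stand-in for the third-party review (Snopes, PolitiFact, ...)."""
    # A real system would wait for a human verdict; here we just return one.
    return "fake"


def report(article: Article) -> None:
    """A user flags the article; enough flags trigger a fact-check."""
    article.reports += 1
    if article.reports >= REPORT_THRESHOLD and not article.disputed:
        if fact_check(article) == "fake":
            article.disputed = True  # would show the red "Disputed" banner


story = Article("http://example.com/outrageous-claim")
for _ in range(REPORT_THRESHOLD):
    report(story)
print(story.disputed)  # True
```

The point of the sketch is simply the ordering: the banner only appears after both the report threshold and the external review, which is why, as discussed next, it often arrives too late.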
In some instances, these precautions don’t have the desired effect. Fake news stories usually don’t get branded as fake until after they’ve gone viral and the damage has been done. Certain groups see that Facebook has flagged a particular story and say, “Hey, they’re trying to censor us! Share this!” This is what happened with Christian Winthrop’s false story claiming that hundreds of thousands of Irish people were brought to the US as slaves. Telling people, “Don’t share this, it’s fake news,” had the opposite effect.
A number of technologists, academics, and media experts have pitched ideas to Facebook to help it with its fake news problem. Some ideas include verifying news pages, somewhat like the blue checkmark Twitter uses for verified celebrity accounts; sharer reputation ranking, which takes into account the types of articles a person shares; and connecting fake news to fact-checking sites, so that when you click a link to a false story, it can also open a page debunking it.
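Of those proposals, the sharer reputation ranking is the most algorithmic, so here is a rough Python sketch of one way it could work. The scoring formula, the neutral prior for new accounts, and how the score would feed into the news feed are all my own assumptions, not a published design.

```python
from typing import List


def sharer_reputation(past_shares_flagged: List[bool]) -> float:
    """Reputation = fraction of a user's past shares that were NOT flagged.
    Each entry is True if that earlier share was later marked as fake."""
    if not past_shares_flagged:
        return 0.5  # neutral prior for accounts with no history
    return 1.0 - sum(past_shares_flagged) / len(past_shares_flagged)


def story_score(sharers_histories: List[List[bool]]) -> float:
    """Average reputation of everyone sharing the story.
    A low score could be one signal for down-ranking it in the feed."""
    if not sharers_histories:
        return 0.5
    return sum(sharer_reputation(h) for h in sharers_histories) / len(sharers_histories)


# A story shared mostly by accounts with a record of flagged posts scores low.
print(story_score([[True, True, False], [True, True, True], [False]]))  # ≈ 0.44
```

The appeal of this kind of scheme is that it doesn’t need to judge the story itself in real time; it leans on the track record of the people spreading it.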
Even with all of these modifications, some think that the more Facebook tries to debunk a story, the more some people will dig in and believe it anyway. So, the question remains: how can we get people to accept facts when those facts don’t fit how they want to view the world?
Written by Alexia Amato of Rebecca Adele PR & Events