That’s right, Facebook is turning to its user base in the fight against fake news, flagging users known for “brigading” and reducing the visibility of their reports.
The bill is due
Now, the eye of Facebook turns to us, the masses, and begins to note how accurately we have flagged fake news that third-party fact-checkers have verified.
Facebook is measuring all this through a trustworthiness score from 0 to 1. That’s it. If you flag something as fake that turns out to be true, your score is docked.
However, if you flag something that turns out to be genuinely false, your score rises and Facebook notes you as a trustworthy source of quality flagging.
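Facebook hasn’t published the actual formula, but the mechanic described above can be sketched in a few lines. Everything here (the function name, the step size, the symmetric up/down adjustment) is a hypothetical illustration, not Facebook’s real algorithm:

```python
# Hypothetical sketch only: Facebook has not disclosed its scoring formula.
def update_trust(score: float, flag_was_correct: bool, step: float = 0.1) -> float:
    """Nudge a user's trustworthiness score up or down, clamped to the 0-1 range."""
    score += step if flag_was_correct else -step
    return max(0.0, min(1.0, score))

score = 0.5
score = update_trust(score, flag_was_correct=True)   # accurate flag: score rises
score = update_trust(score, flag_was_correct=False)  # bad flag: score is docked
```

The clamp matters: however many flags a user gets right or wrong, the score stays bounded between 0 and 1, matching the range the reports describe.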
Secret friend-ranking scores
Apparently, “friend-ranking” is another scoring system, one that weighs the signals you share with a friend (friends in common, Groups, profile creeping, etc.) and scores the relationship accordingly, but the interaction has to be reciprocal.
Yes, if you creep without being creeped on, your score gap will widen. The system dictates how often a friend appears in your News Feed, and was first reported here.
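To make the reciprocity idea concrete, here is a purely hypothetical sketch of how such a score could penalize one-sided attention. The inputs, weights, and function are illustrative assumptions, not the reported system:

```python
# Hypothetical illustration: a reciprocity-sensitive friend score.
def friend_score(views_a_to_b: int, views_b_to_a: int, mutual_friends: int) -> float:
    """Score a friendship; one-sided "creeping" lowers the reciprocity factor.

    Reciprocity is the ratio of the smaller view count to the larger one,
    so equal attention scores 1.0 and fully one-sided attention scores 0.0.
    Shared signals (here, mutual friends) boost the final score.
    """
    reciprocity = min(views_a_to_b, views_b_to_a) / max(views_a_to_b, views_b_to_a, 1)
    return reciprocity * (1 + mutual_friends / 10)
```

Under this toy model, a pair who view each other equally scores highest, while a lurker who is never viewed back scores zero no matter how many mutual friends the pair shares, which mirrors the “score gap” behavior described above.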
Facebook is undertaking the immensely arduous task of verifying news shared across its entire platform, while making sure the kids play nice. It's a mammoth, perhaps impossible task. Is the Zuck up to it?
What do you think about Facebook’s effort to cull fake news through trustworthiness? Is this an achievable undertaking? Let us know in the comments.