I know MediaBiasFactCheck is not the be-all and end-all of truth/bias in media, but I find it to be a useful resource.
It makes sense to downvote it in posts that have great discussion – let the content rise up so people can have discussions with humans, sure.
But sometimes I see it getting downvoted when it’s the only comment there. Which does nothing, unless a reader app automatically hides downvoted comments (and even then, the reader could just expand the comment anyway…so really no difference).
What’s the point of downvoting? My only guess is that there are people who are salty about something it said about some source they like. Yet I don’t see anyone providing an alternative to MediaBiasFactCheck…
Bias can be subtle and take work to suss out, especially if you’re not familiar with the source.
After getting a credibility read on MediaBiasFactCheck itself (which I’ve done only superficially for myself), it seems to be a potentially useful shortcut. And easy to block if it gets annoying.
The main problem I see with MBFC, aside from the simple fact that it’s a third party rather than one’s own judgment (which is not infallible, but should still certainly be exercised, in both senses of the term), is that it appears to only measure factuality, which is just a tiny part of bias.
In spite of all of the noise about “fake news,” very little news is actually fake. The vast majority of bias resides not in the nominal facts of a story, but in which stories are run and how they’re reported - how those nominal facts are presented.
As an example, admittedly exaggerated for effect, compare:
Tom walked his dog Rex.
with
Rex the mangy cur was only barely restrained by Tom’s limp hold on his thin leash.
Both relay the same basic facts, and it’s likely that by MBFC’s standards, both would be rated the same for that reason alone. But it’s plain to see that the two are not even vaguely similar.
Again, exaggerated for effect.
MBFC doesn’t only measure how factual something is. They very much look at inflammatory language like that, and grade a media outlet accordingly. It’s just not reflected in the factual rating; it’s in the bias rating. Which makes sense since, like you said, both stories can be factually accurate.
I haven’t seen any evidence that it does that, and quite the contrary, evidence that it does not - examples from publications ranging from Israel Times to New York Times to Slate in which it accompanied articles full of clearly loaded language with an assessment of high credibility.
It’s possible that it’s improved of late - I don’t know, since I blocked it weeks ago, after a particularly egregious example of that accompanied a technically factually accurate but brazenly biased Israel Times article.
Both relay the same basic facts
NO, THEY DO NOT.
“Rex has mange” is a factual statement that can be investigated and either confirmed or rejected.
The same goes for “Rex’s leash was inadequate” and “Tom’s hold on the dog was weak.”
There are a lot more facts in your second example compared to the first one.
it’s likely that by MBFC’s standards, both would be rated the same for that reason alone
No, they would not, and it’s pretty easy to find out: https://mediabiasfactcheck.com/methodology/
Your powers of “paying attention, weighing, analyzing, reviewing and questioning” are not as strong as you think.
Be careful not to hurt yourself when you fall down from that mountain.
So are you saying that you wouldn’t be able to recognize my second example as a biased statement without the MBFC bot’s guidance?
Or did you just entirely miss the point?