By: Liz Keiper
If you want to become discouraged real quick, ask a teenager to define the term “fake news.”
Sadly, instead of legitimate criteria that would rightly earn a news source the label of “fake,” you will likely hear a plethora of names of news sources shouted at you. If you ask those same teenagers why they believe a given news source to be “fake,” you will probably hear answers such as, “It’s promoting a liberal agenda!” or “It’s conservative propaganda!” or “It’s fake—everyone knows that!”
And it’s not really the fault of teens. Many adults would answer the same way. In fact, the term “fake news” has become such a convoluted, hot-button topic that it has become almost impossible to have a non-partisan discussion about it.
As a society, we are increasingly confused by trying to determine what is “fake” and what is real.
And, English teacher friends, that should terrify us.
We’ve all read the dystopias; we all know where that leads. The purpose of a free press is to curb abuses by the government and those in power. In societies without freedom of the press, the government can tell its citizens whatever it wants, and they have no way of knowing whether that information is true (e.g., North Korea). However, in a society in which people cannot tell fake news from real news, the citizenry ends up similarly uninformed, even though the press may be technically free.
Part of this confusion has come at the hands of politicians and public figures who, over the last few decades, have taken to labeling as “fake news” any outlet which disseminates a view which is unfavorable to them. So, we now find ourselves in a culture in which, if you don’t like something or someone, you can just call it “fake” and discredit it.
Wow. No wonder our students have trouble telling the difference between real and fake news.
The true definition of fake news is something along the lines of this: a mode of media “wherein a group or individual purposely misleads someone with inaccurate facts.” Inaccurate is a key word here. Just because someone disagrees with a news outlet’s interpretation of events does not make the news source’s reporting of those events inherently “inaccurate.” However, there is actual fake news out there that does just that—fabricates statistics or quotes, promotes ridiculous conspiracy theories, circulates false or unsubstantiated claims, or skews evidence so egregiously that falsehood becomes indistinguishable from truth and the result is well past any reasonable interpretation of events. Simply calling a dissenting opinion “fake” detracts from the pressing problem that actual fake news presents to society.
The fake news problem has been a scorpion in my mind, to quote The Bard, for a while now. It has absolutely irked me when a student refuses to even consider an article that I assign in class because they have determined, in the infinite wisdom of their 14-year-old-ness, that the source is “fake.” (And I immediately picture the SNL skit featuring CNN in a cage in the corner screaming, “I’m not fake news!” But that’s beside the point.)
On the flipside, it has also made my stomach drop to see students citing sources such as Breitbart and The Wall Street Journal together as if they stood on equal footing of academic merit, simply because those students are young and lack experience with the reputations of those sources. I do have my students conduct independent research in my classroom, and I struggled for a long time with how to give them tools to evaluate a news source for credibility, since thoroughly vetting each source is arduous and could take a research paper’s worth of time in itself.
Also, because of the vitriolic level of opinions that the fake news epidemic has come to elicit, I did not know how to tackle this topic in my classroom in a non-partisan way. However, I have come upon a source that, though imperfect, is a step in the right direction and has been helpful for my students this year.
Media Bias/Fact Check (MBFC) is an independently run organization which aims to rate news sites in two ways: on a scale from very left-biased to very right-biased and on a scale from very high to very low factual reporting. This year, I incorporated MBFC into my research unit in my lesson on finding credible sources. While students are taking notes on their sources for their research projects, I always make them write a short description of how they know that the source is credible, and this year, I allowed them to use the rankings on MBFC as evidence for credibility of a source.
One of my favorite aspects of MBFC is that they use two different categorizations for news sources rather than just labeling a news source as “real” or “fake.” This acknowledges that a news source’s political or ideological leaning is not necessarily what makes the source real or fake but rather its methods of sourcing and level of factual reporting.
And I explicitly told that to my students this year. I showed them the tabs at the top of the website and told them, “Do you see these tabs? Any of these news sources all the way from Left Bias to Right Bias could be credible. You may not agree with their bias, but that’s ok as long as you understand their bias. In fact, all news sources have some sort of bias because news articles are written by people, and people have inherent ways that they think about life—it’s called their worldview. Everyone has one. And so does every news source. What you need to stay away from is any news source that is listed under Conspiracy-Pseudoscience or Questionable Sources.” I also showed my students how the site was organized, how the news sources were rated in the two different categories, and how the managers of the site used evidence to back up their claims about the credibility of each news source.
Now, here comes the caveat about MBFC: as I mentioned above, it’s not perfect. Below are the pros and cons of the website, from my point of view.
Cons of MBFC:
- It’s Run By Amateurs: Media Bias/Fact Check is independently owned by Dave Van Zandt, who by the site’s own admission works in the health care field and not in journalism. Beyond that, the researching and writing is done by volunteers. So, would I ultimately trust this site a bit more if it were run by a university journalism program? Yeah, probably. But then again, in that case, the site would likely be open to funding incentives from corporations, and any organization which depends on corporate funding to pay its employees is subject to being biased in favor of its funders’ initiatives. So, is the fact that it’s largely run by volunteers actually a good sign in the long run? Maybe. The jury is still out on that one.
- Lack of Information on Contributors: There are links with a few sentences of information on the contributors, but no pictures, and the information is sparse. That is actually something that I teach my students to look for in determining the credibility of a source, and I don’t really see a reason why the contributors shouldn’t tell us more about themselves.
Pros of MBFC:
- Transparency of Methodology: Their page describing how they come to particular conclusions about ratings for news sites is quite thorough, and this makes me think that they are attempting objectivity.
- Evidence to Back Up Claims: While the Media Bias Chart from Ad Fontes Media is popular and often shared to back up claims about the reliability of a news source, it has always struck me that the chart is pretty subjective. Though the creators of the chart do base their placement of sources on research, a viewer pretty much has to take their evaluation at face value or leave it, which doesn’t promote critical thinking. It’s easy to think that something you read in a source must be true just because the source was highly placed on the chart, and it is equally easy to dismiss the chart altogether because you disagree with the placement of a source and decide the chart itself must be “biased.” I like that when you click on a source’s rating on MBFC, the contributors explain exactly why they rated the source as they did, and they even link example articles from the source to back up their claims. This does promote critical thinking because you do not have to simply take their word for their rating—you can see the evidence of their research and, based on that evidence, either agree or disagree with their conclusions about the source.
- Consideration of Reader Votes: As an attempt to eliminate internal bias in the organization, MBFC has enabled a voting platform for each news source which allows readers to share how they would rank the news source. If there seems to be a large discrepancy between reader votes and MBFC’s ranking, they perform more research and reconsider the ranking.
- Open Dialogue: MBFC welcomes correspondence from those who beg to differ with their rankings and offer evidence to support differing claims. This is essentially a way to fact-check their fact-checking through crowd-sourcing and gives me the impression that they don’t have a hidden agenda and are legitimately seeking accuracy because they are seeking input.
- Recategorization of Sources: When it comes to light that there is a more accurate categorization for a news source, that news source is moved to a new category. This shows that they are open to and welcome critique.
- Admission of Subjectivity: Though MBFC is open about how they categorize sources, they also openly admit that categorizing news sources by bias level is an inherently subjective task because of everyone’s own inherent biases. While this leaves some unsatisfied with the validity of their categorizations, it actually gives me confidence that they don’t have some hidden agenda which they are using the site to promote. I think that if they were trying to dupe us, they would ask for our blind agreement with their rankings.
- General Accuracy: I welcome you to click around on MBFC for yourself. Though I have occasionally encountered news sources which I would myself categorize slightly differently, I have never encountered a source that was egregiously mislabeled. And, I think that since complete objectivity in this sort of topic is impossible, that’s a pretty good sign.
At the end of the day, MBFC is not the be-all and end-all source on media bias. However, it is a good place for my freshmen to start. It gives them an introductory summary of a person’s evaluation of a news source and of why that person evaluated the source that way. This year, it has helped them ward off some seriously biased or propaganda-filled sites that they encountered without knowing any better, and it has begun to instill in their minds the important difference between “fake news” and fake news: the idea that just because you disagree with the political slant of a news source does not inherently make that source factually inaccurate. And that is one big step in the right direction for society.
Liz Keiper is a contributing blogger for WVCTE. When she’s not dressing up in togas or running around her classroom with foam swords reenacting Shakespeare, she can be found enjoying the great outdoors, playing guitar, or adding to her rather out-of-control rubber duck collection. You can follow her on Twitter @KeiperET1.
WVCTE is wondering…
- How do you teach media literacy in your ELA classroom?
- What other methods do you use to help students determine if a source is real or fake news?
Leave us a comment, Tweet us your thoughts @WVCTE, or connect with us on Facebook!