1. Ask yourself whether you know the website or the source of the information. Do you know its reputation? Don't read or share media until you know what it is.
2. If you feel you are getting overwhelmed in your fact-checking efforts, STOP and remember your purpose.
Are you there to repost, read an interesting story, or get a high-level explanation of a concept? Whatever your purpose, first find out whether the information is reputable.
Investigate the source
Use Wikipedia or a search engine (e.g., Google) to investigate an organization or other resource and find information about it.
Find Trusted Coverage
You need to identify the claim that the article is making, then find out whether that claim is true or false. To do this, look for trusted reporting or analysis on the claim.
In this case it's a good idea to find other coverage of the claim, something that is "more trusted, more in-depth, and maybe just more varied."
Do you need to agree with the consensus that you find? No, but you will need to understand the context and the history of a claim so you can better evaluate the information.
Trace claims, quotes, and media back to the original context.
Much of what's available on the internet has been stripped of context. Do we know what happened before? What was clipped out of the video? Maybe a claim is made, but you're not sure how to verify it. In these cases you'll have to trace the claim, quote, or media back to the original source and see its original context. After doing this, you'll be able to tell whether what you saw was accurately presented.
Ways to do this:
Look for the original reporting; it should be linked in the source you're viewing, and if it's not, ask yourself why.
Look for reporting sources, such as a bibliography.
Look to see if the claim, quote, or media was fairly represented.
Algorithms can be biased based on who builds them and how they're used. If an algorithm is biased, it will consistently make biased choices, unless a computer programmer adjusts the algorithm.
Algorithmic biases can stem from text and images that data scientists use to train their algorithm models. For example, if you search "boss" in a search engine the pictures that show up will likely be of white men. If this data is fed into the algorithm, the model will likely conclude that bosses are usually white and male, possibly perpetuating stereotypes against communities of color.
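The mechanism described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration: the label counts are invented for the example, and the "model" is just a frequency counter, not a real search-ranking algorithm. It shows how a model that simply learns the majority pattern in skewed training data will reproduce that skew in its predictions.

```python
from collections import Counter

# Hypothetical toy training set standing in for scraped image labels.
# The skew below is an assumption for illustration, not real data.
training_labels = (
    ["boss:white_man"] * 80
    + ["boss:white_woman"] * 10
    + ["boss:woman_of_color"] * 5
    + ["boss:man_of_color"] * 5
)

def most_likely_label(labels):
    """A frequency-count 'model': it predicts whatever label dominates the training data."""
    counts = Counter(labels)
    return counts.most_common(1)[0][0]

# Because the training data over-represents one group, the model's
# default answer for "boss" reflects and reinforces that imbalance.
print(most_likely_label(training_labels))  # prints "boss:white_man"
```

Real search and image-classification systems are far more complex, but the core issue is the same: if the training data over-represents one group, the model's output will too, until a programmer or data scientist corrects the data or the algorithm.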
Search engines are not concerned with information retrieval in the same way a librarian or other information professional is. When you use a search engine (e.g., Google), you're dealing with advertisement-driven information retrieval. This can make a fundamental difference in the type of information you receive. (Noble, 2016)
Revisiting Search Engine Bias by Eric Goldman
Questions about search engine bias have percolated in the academic literature for over a decade. In the past few years, the issue has evolved from a quiet academic debate to a full-blown regulatory and litigation frenzy. At the center of this maelstrom is Google, the dominant market player.
This essay looks at changes in the industry and political environment over the past half-dozen years that have contributed to the current situation. It supplements my prior contribution to the literature, a 2006 essay entitled Search Engine Bias and the Demise of Search Engine Utopianism.
Algorithms of Oppression by Safiya Umoja Noble
A revealing look at how negative biases against women of color are embedded in search engine results and algorithms. Run a Google search for "black girls"--what will you find? "Big Booty" and other sexually explicit terms are likely to come up as top search terms. But, if you type in "white girls," the results are radically different. The suggested porn sites and un-moderated discussions about "why black women are so sassy" or "why black women are so angry" present a disturbing portrait of black womanhood in modern society. In Algorithms of Oppression, Safiya Umoja Noble challenges the idea that search engines like Google offer an equal playing field for all forms of ideas, identities, and activities. Data discrimination is a real social problem; Noble argues that the combination of private interests in promoting certain sites, along with the monopoly status of a relatively small number of Internet search engines, leads to a biased set of search algorithms that privilege whiteness and discriminate against people of color, specifically women of color. Through an analysis of textual and media searches as well as extensive research on paid online advertising, Noble exposes a culture of racism and sexism in the way discoverability is created online. As search engines and their related companies grow in importance--operating as a source for email, a major vehicle for primary and secondary school learning, and beyond--understanding and reversing these disquieting trends and discriminatory practices is of utmost importance. An original, surprising and, at times, disturbing account of bias on the internet, Algorithms of Oppression contributes to our understanding of how racism is created, maintained, and disseminated in the 21st century.