Module 2: The Consumption of Information


No doubt most people are familiar with the friend who, no matter how much evidence is presented, will find a way to argue against it. If the debate were a football field, it would have the curious feature of moving goalposts, the end zone receding further and further with every yard that is gained. Whether they want to admit it or not, all people do this to some extent, because they interact with information using a variety of heuristics, through a wide range of cognitive filters, and under the influence of a number of biases.

Complicating this matter more is that most people have what is known as a bias blind spot, which is the inability to recognize and compensate for their own cognitive biases. This module will introduce the reader to epistemology, heuristics, a few different kinds of biases, and some of the sociocultural filters through which people look at information. Learning to recognize these features of how people consume information is a fundamental skill of information literacy.


Epistemology

Epistemology is the study of knowledge: how people know things and what it means to be justified in believing the things they believe. In a course about information literacy, it only makes sense to touch on the nuances of how people send and receive information, and to discuss those things as precisely as possible. For instance, most people have likely never considered that a fact is not the same as knowledge and that knowledge is not the same as information. Yet it becomes much easier to think critically about information with the following framework in mind:

A Fact is a record of a verifiable thing that happened or is actually the case.

Information is what is presented by arranging facts and/or non-facts in any given order (or disorder, for that matter).

Knowledge is the assemblage of facts and information, with each idea built upon the ideas that came before it so that the next idea can build upon it.

The Definition of Knowledge:

There are three aspects of knowledge to be aware of:

  • Belief — a statement of faith in something. (The train will be here soon.)
  • Truth — a belief isn’t always true. But if the belief tests positive (the train shows up soon), then it is said to be true (at least conditionally).
  • Justification — just because something turns out to be true doesn’t mean it is a justified belief. (Did you think the train was coming because you are an optimist, or because you looked at the schedule?)

Because of these three aspects, Knowledge is said to be a justified, true belief.

In building knowledge, there are multiple levels on which meaning can be lost: people can be ignorant of facts, they can misinterpret information, and, because of those things, they can construct a towering assemblage of beliefs that are not true. While facts can be learned and information clarified, the third instance is the most troubling, because once untrue beliefs have ossified in people’s heads, they are difficult to shake loose.

The process through which people decide what is true and justified occurs constantly in both personal and public spheres. Individuals map out the world in their minds, weighing evidence and experience to try to figure out the best way forward. But it also happens in society, whenever people communicate their personal ideas to one another in the public sphere. These ideas are compared, contrasted, and tested to see if and how they may be true. Ideally, in this way a society would come to justified, true beliefs, or knowledge. Writing research papers may become a more interesting act if students recognize that every paper they write is part of the test that justifies beliefs, is part of the societal conversation that determines the best way forward.

It must be a very interesting time to be an epistemologist because the internet allows a front-row seat to a giant, unregulated barroom brawl over personal and public manifestations of belief, information, and knowledge. Here individuals, groups, organizations, governments, corporations, academics and everyone in between advance their ideas to inform the societal understanding of truth. In this way people construct knowledge or at the very least agree as a community what the truth is, even if that agreement represents an imperfect or even incorrect version of the facts.

Case Study


Often people take shortcuts when they approach research.  But being informed means more than finding the exact fact that backs up your thesis; it means understanding how different facts fit together to create knowledge. In the quest for knowledge, people rely on primary, secondary, and tertiary sources of information. Often there is a misconception that one kind of source is better than another, that primary sources are the real information that then gets filtered down into secondary, and finally tertiary sources. But this is an oversimplified view of how these sources work together to create knowledge.

Two examples that spring to mind are the Haymarket Riot and vaccines, which provide two useful case studies about sources.

Timothy Messer-Kruse is a professor of history who has been researching the Haymarket Riot for over a decade. He has written books and articles about the 1886 labor dispute in which a bomb was thrown at the police. Several police officers died, and the ensuing trial led to the conviction of eight labor activists. The interesting thing is that the trial is most often presented as a railroading in which no evidence was presented to actually link the suspects to the bombing. Decades upon decades of history courses have taught it this way, in fact. But when a student’s in-class observation spurred Professor Messer-Kruse to investigate the trial transcripts, he learned that there was a substantial body of evidence linking the suspects to the crime, and that the Haymarket Riot had therefore been incorrectly taught for all those years. But when he tried to correct the Wikipedia page about it, he was told that the site’s editorial practices give preference to consensus over primacy. That is, until more secondary sources reported what the primary sources actually said, rather than the incorrect information, he would not be allowed to change the page.

A researcher might be tempted to draw the conclusion that secondary and tertiary sources should not be trusted. But keep in mind that primary sources are always products of their times, and thus, a political speech from 1960 will not reflect an understanding of the present world, just as the Declaration of Independence won’t tell you who won the American Revolution, and an original research study about living with AIDS written in 1999 won’t explain the current best practices. Also, a distanced summary of a field that reflects all of its ups and downs and points and counterpoints is likely to give you a better understanding of the field than reading a single experiment or research paper.

For instance, if you read Andrew Wakefield’s Ileal-Lymphoid-Nodular Hyperplasia, Non-Specific Colitis, and Pervasive Developmental Disorder in Children, published in The Lancet in 1998 (a primary source), you might find its suggestion that vaccines are linked to autism persuasive. However, reading a broader overview of the field, for instance a secondary source such as Good Investigative Reporting May Finally Debunk the Myth that Vaccines Cause Autism, published in Harvard Health Publications, would reveal that the study was ultimately retracted by The Lancet in 2010 because its research methods were flawed.

Thus, being information literate means more than finding the exact article that backs up your thesis; it means understanding how all the articles fit together within the broader structure and knowledge of the discipline. Information literacy demands attention to both primary and secondary sources as well as the critical thinking necessary to evaluate each for timeliness, credibility, and bias.

If epistemology is about building knowledge, it is also about how people arrange the structure of knowledge, with one understanding built on another. Around 17,000 years ago, someone realized that carrying a box up a hill was not very efficient and decided to invent the wheel. Some 12,000 years after that, someone else figured out that it would work better with an axle. Then, 5,000 years later, society is blessed with wheels, axles, gears, engines, satellites, and YouTube videos of singing cats. One piece of knowledge builds upon the one that came before it, and, as noted before, societies need to protect their knowledge and make it accessible so it can be put to good use.

Libraries are one way this goal can be attained. Melvil Dewey, a librarian at Columbia University in the late 1800s, created the Dewey Decimal system around 1876 because he thought it would be helpful to arrange books in a way that would lead to serendipitous discovery: that is, lucky discovery, the kind of luck librarians help people have. Were someone to go to the shelves to retrieve a book about the Chicago River, she would find next to it other books about the Chicago River, books she did not even know existed and was not looking for, but which she now knows about, because Dewey imagined 140 years ago that arranging books by subject would let a patron who found one book discover others like it. Before that convention, books were arranged by size: the tall books went on the tall shelf, and the short books went on the short shelf. As efficient and economical as that arrangement was, it did not encourage learning, because patrons had to know exactly what they were looking for in order to find it.

Knowledge can be mapped and cross-referenced and plotted, but it is not stagnant, unchanging, or set in stone. When people talk about an issue, it is important to recognize that there are facts and ignorance, information and misinterpretation, knowledge and assemblages of incorrect beliefs, and every permutation in between. All of that provides a socially constructed context in which a debate happens and in which knowledge is said to be created. 

The Impact of Heuristics and Cognitive Biases on Information

According to Tversky and Kahneman’s often-cited “Judgment Under Uncertainty: Heuristics and Biases” (1974), heuristics may be simply defined as automatic shortcuts a person’s brain takes in its daily attempt at survival (1124). These shortcuts are necessary because of the constant volume of input coming from all around. A person’s brain has to filter out what seems extraneous in order to focus on what seems important. For instance, someone might tune out the swooping arcs of the birds and the color of the leaves to focus instead on the two strangers walking toward her on the sidewalk. People use the same kinds of processes to make decisions quickly, rather than taking the time to research, find exact measurements, and plot decisions using perfectly correct calculations. Consider, for instance, weighing the estimated amount of gasoline left in the tank against the approximate distance one needs to go, the exact time one needs to be there, the traffic and road conditions, and how long it will take to stop for gas. Taking the time to put exact numbers on all of those variables would likely make someone just as late as running out of gas on the way there. That kind of planning does not require exact calculations as long as it is reasonably accurate.

The field of heuristics has startling implications for information literacy. While most of the time shortcuts work well, when people become dependent on their shortcuts in situations calling for more precise information, they are subject to a wide range of failures. Not every task needs to be backed up with formal research, but when it comes to making important decisions, people should seek out unbiased, timely, and credible sources. As Tversky and Kahneman pointed out, the heuristics people employ to make intuitive decisions can cause them to make mistakes in judgment regarding the likelihood of events and the reliability of evidence. For instance, they found that over-reliance on heuristics may lead people to draw false correlations between unrelated events or to inaccurately categorize people because of perceived characteristics. Similarly, Stavy and Tirosh (2000) found that reliance on heuristics can lead students to make critical errors in math and science coursework.

The heuristics people use come from a variety of inputs, including personal experience, family, culture, and political preferences. Some even seem to be rooted in the deep structures of the brain and have lingered through millennia of evolution (Westen 2007). They are not in and of themselves bad things. In fact, they are quite useful at getting people through the day safely and efficiently. But it is important to recognize that quick, automatic decisions cannot help but oversimplify complex problems, sometimes in a way that does more harm than good.

An important facet of using information is to recognize when a quick mental calculation will suffice and when deeper research is needed. For instance, when trying to get across town on a nearly empty tank of gas, estimates will probably lead to a reasonably reliable decision. On the other hand, trying to put an astronaut on the moon using the same shortcuts would be catastrophic.

Overreliance on heuristics sometimes leads to bad decisions. Other times, someone’s unconscious thinking is so out of line with rational decision-making that it interferes with his objective understanding of reality. Cognitive biases are ingrained thought processes that skew perception. There are many different kinds of cognitive bias recognized in the literature, of which these may be the most important for the sake of research:

Confirmation Bias refers to people’s willingness to believe sources that agree with them with little critical effort.

Disconfirmation Bias refers to the opposite effect, that people will expend a great deal of critical effort to reject information that disagrees with their prior beliefs.

Fundamental Attribution Error refers to the error of assuming that people’s behavior is the result of their personalities or dispositions instead of external circumstances that may have led to the behavior.

Bias Blind Spot refers to people’s inability to see their own biases.

Anchoring Bias refers to people’s over-reliance on one piece of information or evidence.

Projection Bias refers to the error of assuming that others share one’s beliefs or feelings.

These biases are but a handful of all the recognized cognitive biases, but being aware of them will assist in the task of evaluating resources for credibility, timeliness, and authority. Compensating for one’s cognitive biases is not at all an easy task, because they operate pre-consciously. But the crucial first step is simply to be aware that they exist. The second step is to create a rubric by which to judge all sources, and to internalize it to such a degree that its application becomes practically automatic. Evenly applying standards to research in an a priori manner typically yields the best results.

Because people tend to be blind to their own biases, it can be informative to use a hidden bias checker. The Southern Poverty Law Center maintains a page on understanding prejudices and biases, and one resource it links to is Harvard’s Implicit Association Test. People should try taking tests like these from time to time to see what they discover about themselves.


The Impact of Culture on Information

In anthropological terms, culture is the collection of the ways of living and being and the system of beliefs and values that are shared and expressed by a group of people and transmitted from one generation to the next. It includes all aspects of intellectual, political, economic, spiritual, and artistic endeavor within a group, as well as much more. Shennan (2007) espouses the notion of culture as information that is learned via teaching or imitation, stating that cultural tradition is “a process of inheritance through social learning” (38). He helpfully explains that while culture is not the same as behavior, it does influence the range of possible behaviors through the “creation of cultural tradition,” which is inherited through social learning (42). Gertrude Himmelfarb suggests that much of this social learning and cultural tradition comes from interaction with family. The family, she argues, is the place where “the most elemental, primitive emotions come into play and we learn to express and control them, where we come to trust and relate to others, where we acquire the habits of feeling, thinking, and behaving….where we are, in short, civilized, socialized, and moralized” (44).

Admittedly, it is somewhat of a fool’s errand to try to speak of how culture impacts information, as information itself is part of culture. Culture is what is written in textbooks, what is broadcast on the news, what flashes across the screen in movie theaters, and what serves as both the subject and the substance of discussions on internet message boards. Without culture of some kind, there would not be much to say and no way to say it. However, the cultural preferences with which people identify very much affect how they relate to information. Moreover, as much as culture represents the makeup of an individual’s dispositions, it cannot be overstated that there is a reciprocal relationship between the individual and the group in which she finds herself. In this sense, culture can be said to be self-reinforcing. The more a group’s culture is shaped by a particular idea, the more that idea becomes normalized and entrenched in the society; and the more entrenched an idea becomes, the more difficult it is for an individual to express something outside its boundaries. This is known as the process of enculturation.

Once information becomes enshrined in culture, proposing narratives that challenge the orthodoxy becomes difficult. Joanne Melish writes in “Recovering (from) Slavery: Four Struggles to Tell the Truth” about a Massachusetts museum that wished to alter its exhibits to tell a more nuanced and accurate story about slavery in the northeastern United States. The museum’s staff felt it was important to acknowledge that Massachusetts itself had been complicit in slavery until the 1780s. Melish explains the difficulties they encountered:

    Persuading administrative, curatorial, and educative staffs to recast their interpretations to incorporate the lives of slaves and free people of color is one issue; getting trustees, members, subscribers, and especially donors to buy into new interpretations that not only challenge the celebratory narratives of ‘their’ founders and patriots but also move the objects and documents many of them have donated off center stage is another (103).

In other words, it is one thing to acknowledge the truth, but quite another to expect wholesale acceptance of that truth by people who are made uncomfortable by it. As this example illustrates, the representations and institutions of culture are informed by people, just as people are dependent on them to be informed. Culture spreads while reinforcing itself, and people become less inclined to critically examine or challenge popular narratives or beliefs. Thus, people may resist messages that go against the grain of what they already believe to be an important part of their culture. The past and present are rife with examples of this exact phenomenon. The lessons for the student of information literacy are that culture resists critical inquiry from within, and that all people are part of one culture or another and need to recognize that they are infused with biases, dispositions, beliefs, and ideas that seem as natural to them as their very skin. Information literacy, at the very least, asks people to be aware of their own biases.

Another source of people’s cultures and worldviews is found in their political beliefs. Studies such as Oxley et al.’s Political Attitudes Vary with Physiological Traits (2008) and Dodd et al.’s The Political Left Rolls with the Good and the Political Right Confronts the Bad: Connecting Physiology and Cognition to Preferences (2012) both found correlations between physiology and politics. Oxley et al. found that higher physiological reactions to sudden noises and threatening visual images were linked to approval of military spending, capital punishment, and patriotism, while those with lower physical sensitivities were more likely to support gun control, pacifism, and other core “leftist” stances. For its part, Dodd et al. found that people who dwelled on aversive stimuli tended to be right-leaning, while those who sought out greater exposure to pleasing stimuli tended to be left-leaning. In that sense, political beliefs are somewhat akin to heuristics, because they are internal guides which encourage decision-making along particular and entrenched lines. Thus, it can be said that political beliefs may predispose people to accept one line of evidence over another, to embrace one policy decision and shun another, and to filter decisions through the frame of their partisanship. At the very least, these aspects of culture make for a divided citizenry unable to get to the bottom of why they disagree with one another. At their worst, they contribute to intractable gridlock and inaction on issues of vital public importance.

Still another root can be found in economics. Zac Bissonnette, author of The Great Beanie Baby Bubble, notes that the height of speculative buying of Beanie Babies coincided with the internet bubble, and he postulates that buying Beanie Babies gave the middle and lower classes the feeling that they were participating in the market in the same way the wealthy and elite were with technology stocks and internet websites. He finds this not unusual, drawing on Robert Shiller’s work in Irrational Exuberance, which shows that times of market speculation among the wealthy can contribute to an overall feeling among the general population that the future seems brighter than it is (qtd. in Bissonnette 6).

This would seem nothing more than a humorous cautionary tale, except to the people who dumped their savings into toys that retained none of their speculative value and little of their actual value. But it illustrates a more sinister point. Fueled by religion, politics, economics, or other forces of culture, people can lose all judgment of information and perspective on fact and get caught up in the fervor of some moment or movement. Charles Mackay sums up the thesis of his Extraordinary Popular Delusions and the Madness of Crowds bluntly: “Whole communities suddenly fix their minds upon one object and go mad in its pursuit; that millions of people become simultaneously impressed with one delusion and run after it, till their attention is caught by some new folly more captivating than the first” (xvii). The book plays the thesis out across a panoply of crusades, witch hunts, haunted houses, and dueling as a way of settling disputes. Indeed, a student of information literacy might aid society by finding which notions are so feverishly pursued today and, through critical investigation of dispassionate sources, beginning to chart a corrected course.



The Impact of Psychology and Biology on Information

If political beliefs influence people’s interpretation and use of information, then it may concern some to find that political affiliation seems to be ingrained in people rather than chosen through an active process of deliberation. The fact that Oxley et al. and Dodd et al. correlated physiology with political beliefs seems to indicate that something intrinsic about people makes them political beings. If that is indeed the case, it suggests that there are naturally left-leaning people and right-leaning people as a matter of brain function and morphology, and that the two corresponding political parties representing those predilections naturally followed.

It would make sense from an evolutionary standpoint that people would evolve with variations of those two mindsets. Individualists are akin to archetypal conservatives, and community-minded people to archetypal liberals. Each mindset has its advantages, because the survival of the species depends on the skills of both. Sometimes everyone has to work together to solve a problem, and sometimes an individual can succeed where a large group cannot. Likewise, both sides are generally affected by cognitive biases and by thinking limited by the margins of their minds. An idealist might suppose that society overall benefits from the interactions of these two mindsets. Richard Lanham, in The Economics of Attention: Style and Substance in the Age of Information (2006), acknowledges that two-sided arguments are “profoundly social” and are ultimately a way of solving social problems and disputes in democratic governance (25). In this model, two sides of an argument would be presented and the superior side would presumably win. Political scientist Arthur Lupia (2004), on the other hand, contends that people do not pay attention to most of the information they encounter, and that what they do pay attention to, they do not necessarily understand or entertain in its entirety (Par. 6). In fact, as Lupia (2016) recounted, neuroscience suggests that “reason” is suffused with emotion, and that positive and negative feelings about things manifest even before a thought is consciously processed. Moreover, this theory holds, the conscious thoughts people take as deliberation are actually more like rationalizations for why they made the choice they made (108-11).

In other words, people do not even consciously engage an idea before making a decision about how they feel about it, which explains why it is so easy for people to accept information with which they happen to agree and why they often avoid information with which they think they will disagree. To understand why this is so, a brief voyage into brain morphology is required.

Case Study



An example of this theory in action is found in the infamous 1988 Willie Horton television ad.


Though the explicit statement about Dukakis’s stance on crime was damning, the emotional overtones of the commercial were much more so. According to Westen, the Willie Horton commercials were “well-attuned to the primate brain, and particularly to the amygdala, which is highly responsive to both facial expressions and to fear-evoking stimuli. The ad was packed with both” (65). Everything about the ad takes advantage of the primitive structures of the brain, from the explicit comparison of the confused face of Dukakis with the more focused expression of Bush, and from the forceful language to the implicit projection of black men as a threat to safety (63-68).



Michael Ghiglieri’s The Dark Side of Man (1999) offers an entertaining romp through the human brain, detailing how its different parts function to process stimuli. What people think of as a simple decision requires the interaction of approximately 100 billion brain cells and 20 quadrillion synapses. Ghiglieri breaks the brain down by saying, essentially, that at the bottom of everything is the brainstem, above it is the limbic system, and above that is the cerebral cortex.

The brainstem (some call it the reptilian brain, because that is pretty much the extent of a reptile’s brain) controls the autonomic nervous system functions such as breathing and heartbeat (33).

The limbic system makes up about 20% of the brain. It comprises the hippocampus, the amygdala, the thalamus, and the hypothalamus. The hippocampus helps people store and access memories, and it generates emotions. The amygdala helps people understand how other people are feeling, and it also provides the very important service of letting you feel fear. The thalamus receives all the sensory input from your nerves and muscles. The hypothalamus, though, is for the purposes of information literacy the most interesting aspect of the limbic system. It is responsible for determining how people feel about and react to outside stimuli, such as challenges or dangers. It is important to note that this all happens pre-cognitively, that is, before people register it in their conscious minds. The signals at that point can either be sent to the body to tell it to react, or be sent up to the cerebral cortex for further consideration. So when people say that fear is irrational, they are right: the sensation of fear happens in the hypothalamus, which sits in the part of the brain below where rational thinking happens.

The cerebral cortex, representing approximately 70% of brain matter and volume, does the higher thinking people need for complex activities, such as “rational thought, inspiration, insight, and creativity” (33). In other words, people’s initial interaction with information happens in the limbic system, the same system that perceives something is out of order in a person’s surroundings and sets off a fight-or-flight signal. It is responding to the feeling of cognitive dissonance, the feeling of the brain in disagreement with itself. This could manifest itself in ancient times as a primitive man in the jungle noticing, in his favorite place to scavenge for berries, a pattern of color that reminds him of a tiger’s stripes, setting off in him the feeling that something is wrong and that he needs to run away or defend himself; or it could manifest itself in modern times when he encounters information that challenges his core beliefs or values.

This is a reasonable supposition. Ghiglieri points out that the brain does not necessarily measure its response in accordance with whether an emotion is justified or unjustified. For instance, he states: “The spate of hormones that fuel rage are secreted automatically when we conceive an insult or injury, whether it be someone who cuts in line or someone who steals our parking space or our mate….” (36). In other words, it is hard not to overreact to perceived slights, because that reaction happens in a part of the brain that has nothing to do with being calm, cool, and collected. When a person’s brain encounters something that makes it feel cognitive dissonance, the amygdala signals to all parts of the brain and body that it is time to fight or flee. That fear needn’t be rational or necessarily even real, and it might manifest itself simply in the act of changing the channel when a news story challenges someone’s worldview. Because, HEY, something doesn’t seem right here!

Questions for Critical Thinking

How does Wikipedia determine truth? Why do you agree or disagree with its approach?

Heuristics may be simply defined as automatic shortcuts your brain takes in its daily attempt at survival. What are some examples of people using heuristics in daily life? How might the use of heuristics help or hinder them?

In the quest for knowledge, people rely on primary, secondary, and tertiary sources of information. What are the advantages and disadvantages of using each kind of source?

The reading lists six cognitive biases. Choose two types of bias and give examples of people you have encountered who exhibited them.

The reading suggests that you use a rubric for evaluating sources. Why is this a good practice?

Joanne Melish writes in “Recovering (from) Slavery: Four Struggles to Tell the Truth” about a Massachusetts museum that wished to alter its exhibits to tell a more nuanced and accurate story about slavery in the northeastern United States. What were some of the difficulties the museum encountered, and why?

Michael Ghiglieri’s The Dark Side of Man (1999) details how the different parts of the brain function to process stimuli. In what ways does brain morphology pose a challenge when encountering new information?


Research Skills


Many of us have been taught that in order to start a research paper we need a thesis statement, and while that’s true, coming up with the thesis statement first is not necessarily a good way to start your research. Simply stated, a thesis statement is what your paper intends to prove or show. A research question is what you need to learn in order to come up with a good thesis statement.

Instead of starting with a thesis statement, it’s better to start with a question, and there are a couple of reasons for that.

The first reason is that starting with a thesis statement presupposes that you already know enough about your topic to have not only a well-informed opinion, but the most up-to-date and expert opinion possible on the matter. The vast majority of us don’t have that kind of knowledge about academic subjects, so research is required.

The second reason is that starting with a thesis statement builds your own biases into your search and limits your findings only to the ones you expected to find in the first place, which keeps you from learning important new things.

Let’s say you want to write a paper about binge drinking and college students. If you start with the thesis statement, “Binge drinking among college students is caused by peer pressure and rebellion,” and search for those terms, one of three things will happen:

  1. You will find all the information you need because peer pressure and rebellion are the only two reasons that college students binge drink.
  2. You will find no information because experts all agree that binge drinking is caused by other factors.

These first two scenarios are not very likely, but the third one, which is just as bad for your research, is:

  3. You will find some of the information you need, but not all of it, because your query does not allow for results that show other important reasons that students binge drink.

On the other hand, if you start from the point of asking, “What are the reasons that college students binge drink?” you will find ALL of the reasons that experts think college students binge drink, not just the ones that agree with you. This approach exposes you to a fuller range of ideas about the topic than you started with, and that knowledge can only make your paper or project better.

After you have completed your research and read the articles you retrieved, in order to write a thesis statement, all you have to do is answer your research question with the information that you have discovered.

“What are the causes of binge drinking among college students?”

May become

The causes of binge drinking among college students are socialization, pleasure, the affordability of alcohol, and the institutional promotion of drinking culture.

Before you can take a definitive stand on an issue, you need to be well-informed about it. That’s why you should start with a question, not with a statement.


After deciding on your research question, you can create a search strategy by keeping only the keywords and joining them with Boolean operators. The keywords are ONLY the important words of the sentence, stripped of everything else. In a very basic search, if you are interested in finding out the causes of binge drinking among college students, you can break your research question

What are the causes of binge drinking among college students? 

Into the following keywords:

causes, binge drinking, college students

After you have a list of keywords, you need to connect them with Boolean operators such as:

AND, OR, and NOT

Use AND when you want to join your concepts together and retrieve ONLY articles that contain all of your keywords.

In this case, the search will retrieve articles that contain the word causes but only if they also contain the words binge drinking and college students.

Causes and binge drinking and college students
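To see how AND narrows a result set, here is a toy sketch in Python. The article titles and the simple substring matching are invented for illustration only; real library databases use indexes and subject metadata rather than scanning text like this:

```python
# A tiny invented "database" of article titles.
articles = [
    "Causes of binge drinking among college students",
    "Binge drinking and liver damage",
    "Stress and college students",
    "Peer pressure: causes of risky behavior in college students and binge drinking",
]

def matches_all(text, keywords):
    """AND logic: keep a record only if EVERY keyword appears in it."""
    text = text.lower()
    return all(kw in text for kw in keywords)

# Only the articles containing ALL three keywords are retrieved.
results = [
    a for a in articles
    if matches_all(a, ["causes", "binge drinking", "college students"])
]
```

Running this keeps the first and last titles and drops the two articles that mention only some of the keywords, which is exactly what AND is for.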

When you use the Boolean operator OR, you’re signifying to the database that you want any article that contains either one word or the other, regardless of whether both appear in the same article. It would not make sense to do a search for

Causes or binge drinking or college students

because any useful articles about all three of your keywords would be lost in a virtual haystack of millions of articles about one or two of the keywords but not the other(s).

However, it might be extremely useful to search for

Causes or reasons


binge drinking


college students or university students

Such a search would retrieve any article containing either the word causes or the word reasons, but only if it also contained the phrase binge drinking along with either college students or university students.

If you use NOT, you are saying, “I don’t want any articles that contain this word.” That is a useful strategy if, for instance, you want to isolate binge drinking from, say, drug use.

Causes or reasons


binge drinking not drug use


college students or university students
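The combined strategy above can be mimicked in the same toy Python style: each line of the strategy becomes a group of synonyms joined by OR, the groups are joined by AND, and NOT excludes a term. Again, the articles and matching logic are invented for illustration, not actual database internals:

```python
def boolean_search(text, groups, excluded=()):
    """A record matches if each group contributes at least one
    keyword (AND across groups, OR within a group), and none of
    the excluded terms appear (NOT)."""
    text = text.lower()
    has_every_group = all(any(kw in text for kw in group) for group in groups)
    has_no_excluded = not any(term in text for term in excluded)
    return has_every_group and has_no_excluded

articles = [
    "Reasons for binge drinking among university students",
    "Binge drinking and drug use in college students",
    "Causes of stress in undergraduates",
]

# (causes OR reasons) AND (binge drinking) AND
# (college students OR university students) NOT drug use
strategy = [
    ["causes", "reasons"],
    ["binge drinking"],
    ["college students", "university students"],
]

results = [
    a for a in articles
    if boolean_search(a, strategy, excluded=["drug use"])
]
```

Only the first title survives: the second is excluded by NOT drug use, and the third fails the binge drinking group.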

If your first search does not retrieve enough useful resources, you can use OR to broaden it:

Causes or reasons or predictors


binge drinking not drug use


college students or university students or undergraduates

As you can see, this search is both powerful and specific. Library search systems are built on Boolean logic, so they adapt effortlessly to your complex search strategies. Also, keep in mind that this is a search strategy meant to answer a pretty simple question, namely what the causes of binge drinking among college students are.

Academic and professional research tasks often require a much broader scope and call for stacking search strategies and synthesizing the results in order to derive new understandings about the topic. For example, if the above search strategy helps you understand that the problem of student binge drinking has five root causes, you might have to perform additional searches in order to craft solutions for each. After synthesizing the information, your paper could then propose a comprehensive plan for addressing the problems of binge drinking on campus. Thus, depending on the nature of the information assignment, you might need to perform multiple searches.


I. Write a research question about your subject:


Which crime prevention programs are most effective at cutting down on repeat offenses of juvenile delinquents?

What are the effects of pollution on frogs in marshlands?

How did Lewis Carroll portray madness in Alice in Wonderland?

How can wireless technology improve patient care in hospitals?




II. Write down the key concepts found in your research question:

Key concepts from one of the examples:

Wireless technology, patient care, hospitals

________________ __________________ ___________________

Write 2 or 3 key concepts from your question. If your question contains more than three keywords, you might need to do multiple searches and synthesize the results.


III. Find Synonyms of (or words related to) your concepts:

Synonyms of example concepts:

wireless technology: wireless lan, wlan, hotspots

patient care: PCS services, patient recovery, patient treatment

hospitals: clinics, emergency rooms


List synonyms or words related to concepts in your own research question:

______________ _______________ _______________

______________ _______________ _______________

______________ _______________ _______________


IV. Connect Your search terms with Boolean Operators

And narrows your search:

A search for Wireless technology and patient care and hospitals

will retrieve only articles about all three concepts.

Or broadens your search:

A search for patient care or patients or medical records

will retrieve all articles about any of the three concepts.


A good search for this topic might look like this:

wireless technology OR wireless lan OR wlan OR hotspots


patient care OR PCS services OR patient recovery OR patient treatment


hospitals OR clinics OR emergency rooms

This search connects the synonyms and related concepts with OR and connects the different concepts with AND, thus doing a broad search for articles that must contain certain specific ideas.


V. Enter your terms into one or more library databases:

As needed, substitute or include other terms from your list of synonyms and related concepts. Keep in mind that the articles you retrieve can be read to find additional search terms, such as important people, related concepts, and/or Library of Congress subject headings. These new words can be added to your next search.


VI. If You Need Help

Always feel free to ask a librarian for help!