How stereotypes influence behavior

So far we’ve seen that when we hold stereotypes about particular groups, those stereotypes may influence what we notice, how we interpret people’s behavior, and what we remember. That naturally leads to the next idea, which is that stereotypes also influence behavior.

Self-fulfilling prophecies

One of the first ways this has been explored is through the idea of a self-fulfilling prophecy. Self-fulfilling prophecies occur when a perceiver’s expectations influence their own behavior, which can elicit expectancy-confirming behavior from a target individual, thereby “proving” the perceiver’s expectation true. If I expect someone to be unintelligent or incapable of understanding something, for example, I may be very short with them, giving incomplete or unhelpful explanations. They may respond with confusion and several questions as I roll my eyes, thinking about how dumb they are. My expectation came true not because the person was unintelligent, but because of how I treated them.

Studies show that:

(1) teachers who expect certain students to “bloom” in the coming year spend more time helping and challenging those students, so that in the end, those students (who were in fact randomly given the label) learn more throughout the year.

(2) men who believe they are speaking on the phone with an attractive woman (because they have been shown a photo, supposedly of her) are nicer, and they elicit more positive social behavior from the woman they are speaking to (even though the photo does not actually depict her).

(3) people interviewing Black applicants sit further away, make more speech errors, and conduct shorter interviews. In turn, applicants who are treated with such “low immediacy,” whether Black or White, perform worse in interviews.

(4) people who think they are speaking with someone who is in psychological counseling are more uncomfortable and cold, eliciting anti-social behavior from their partner.

In all these examples, perceivers are essentially manufacturing “proof” of their initial expectations – that certain students are better than others, that more beautiful people are more socially skilled, that Black people are inherently less qualified job candidates, and that people in psychological counseling are socially awkward. The problem is that people usually don’t realize it was their own behavior (and nothing internal to the target) that led to those outcomes.

Even without our behavior as a mediator, stereotypes can sometimes ‘prove’ themselves true. Stereotype threat occurs when a person is afraid of being judged in line with an existing stereotype, or afraid that they might fulfill someone’s stereotypic expectancies. The mere existence of stereotypes in society can thus interfere with someone’s performance in a way that provides ‘proof’ that they are accurate. Research shows that when group memberships or stereotypes are salient:

(1) women perform worse on math tests.

(2) Black participants score worse on intellectual tasks.

(3) White men perform worse on athletic tasks.

No group can really escape this (for once, general privilege doesn’t seem to matter; what matters is the existence of a negative stereotype!). If we’re afraid we might confirm an existing stereotype, we often do. There are exceptions: when a stereotypic trait is exceptionally discrepant from one’s self-concept, we may not hold that fear of confirming the stereotype in the first place, and thus our performance may not be negatively impacted (in fact, it may be boosted!).

Those first two effects – self-fulfilling prophecies and stereotype threat – show how stereotypes are self-perpetuating. By influencing the behavior of target individuals, we manufacture evidence that the stereotype is true. Stereotypes are quite powerful in that way. But they may be even more problematic in the way they fill in details in decisions and quick reactions.

As we saw last week, stereotypes influence what people notice and remember. Think about how important that can be in situations such as jury trials, where jurors are given a massive amount of information, some of which is likely incriminating (the person is on trial after all), some of which is probably exculpatory (if the defense attorney is doing his or her job), and some of which is fairly neutral or irrelevant. If we focus on information in a stereotype-confirming manner, the natural consequence is that we should be more likely to convict individuals who belong to groups stereotyped as criminal or violent.

One study told participants that they would be participating in a study on jury decision-making. Participants read several pieces of evidence about the defendant and the case, then rendered guilt-likelihood judgments and were asked to recall as much of the evidence as they could. The researchers changed the name of the defendant across two conditions of the study. That is, one group of participants read that the defendant was Carlos Ramirez (presumably Latino), and another group of participants read that the defendant was Robert Johnson. Everything else that the participants read about the case was identical.

The results showed that, even though participants read identical evidence, those who read the case against Carlos judged him more likely to be guilty, and recalled relatively more incriminating than exonerating information about him. By directing attention to particular elements of the evidence, the stereotype shaped judgments.

This kind of effect illustrates why it is so detrimental to label an entire group of people as, for example, “bad hombres.” Labeling an entire group as criminals, or creating a registry to highlight a particular group’s criminal behavior (while ignoring the same behavior in other groups), reinforces the stereotype, which then likely influences the way individuals are judged (which reinforces the stereotype, and so on). All of our ‘evidence’ is a manufactured product of the stereotype, but we use it to justify or prove our expectations true.

A clear consequence of this is highlighted by recent data showing that Black men are disproportionately wrongly convicted of crimes, especially murder, sexual assault, and drug offenses. This may very well be attributable to the way Black people (men in particular) are stereotyped, and thus to how juries consider evidence against them.

And that leads to the final tragic consequence I’d like to discuss in this post, an effect known as “shooter bias” or “the police officer’s dilemma.” Horrified by the killing of Amadou Diallo, a West African immigrant to the U.S. at whom four plain-clothes police officers fired 41 shots because they thought he resembled a rape suspect, researchers began investigating biases in people’s quick decisions.


More specifically, researchers wanted to investigate the decisions that people make in quickly identifying potentially dangerous objects and their decisions to shoot or not shoot target people. Participants were asked to play a rudimentary video game where they saw target people in different backgrounds, and were asked to “shoot” any armed targets and “not shoot” any unarmed targets (people holding phones, soda cans, wallets, etc.). The results were compelling. Participants were faster to shoot an armed black target person than an armed white target person. They were also faster to not shoot an unarmed white target person than an unarmed black target person. In other words, the speed of people’s decisions was influenced by the target’s race in stereotype-confirming fashion.

If the results ended there, they wouldn’t be too discouraging, but studies show that we also make a different pattern of errors depending on the target’s race. More specifically, people are more likely to accidentally shoot an unarmed Black target than an unarmed White target, and more likely to accidentally NOT shoot an armed White target than an armed Black target.
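That error pattern is essentially a 2×2 table (target race × armed status) of mistake rates. As a rough sketch of how such data might be summarized – with entirely made-up trial records, not data from the actual studies – one could compute the error rate per cell:

```python
from collections import Counter

# Hypothetical trial records (race, object, response) for a simplified
# shooter task. The data are illustrative only, constructed to show the
# stereotype-confirming error pattern described above.
trials = [
    ("black", "armed", "shoot"), ("black", "armed", "shoot"),
    ("black", "unarmed", "shoot"), ("black", "unarmed", "no-shoot"),
    ("white", "armed", "shoot"), ("white", "armed", "no-shoot"),
    ("white", "unarmed", "no-shoot"), ("white", "unarmed", "no-shoot"),
]

def error_rates(trials):
    """Per (race, armed-status) cell, return the proportion of incorrect
    responses: shooting an unarmed target, or not shooting an armed one."""
    errors, totals = Counter(), Counter()
    for race, obj, response in trials:
        correct = (obj == "armed") == (response == "shoot")
        totals[(race, obj)] += 1
        if not correct:
            errors[(race, obj)] += 1
    return {cell: errors[cell] / totals[cell] for cell in totals}

rates = error_rates(trials)
# In this toy data: more mistaken shots at unarmed Black targets,
# more mistaken holds on armed White targets.
print(rates)
```

Summarizing the data this way makes the asymmetry visible at a glance: the two error types are mirror images across the race of the target.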


This kind of shooter bias seems to be predicted by people’s knowledge (not endorsement) of the cultural stereotype that “blacks are dangerous.” That is, the more we are aware of that stereotype, the more likely we are to show such bias. Shooter bias is unaffected by a person’s level of prejudice. It’s not because we dislike black people that we make such mistakes; it’s the simple existence of the stereotype that leads to them. Of course, explicit prejudice may very well have an amplifying effect on the process, as do perceptions that the world is a dangerous place. That means that any rhetoric that tells us that we are in danger and need to be protected may amplify bias against people who are different from us.

Studies show that Black participants show the same reaction-time biases, though maybe not the same pattern of errors, as White participants. Police officers also show a similar pattern of bias, although it may be somewhat reduced compared to the general public. Finally, other research has shown similar effects with different target groups. In particular, people show a bias to shoot “Muslim-looking” targets over less Muslim-looking targets, and Latino targets over White and Asian targets.

Clearly, stereotypes linking danger, criminality, or violence to particular groups can have very dangerous implications. This is again why using the label “radical Islamic terror” (rather than simply “extremist” or “terrorist”) has a negative impact on the millions of peaceful Muslims in the world. It’s why scapegoating Latinos and immigrants for crime can be dangerous for those groups. And it’s the scientific basis of the “Black Lives Matter” movement, showing that – yes – innocent Black people are more likely to become victims of “accidental” police shootings than are innocent Whites. So yes, blue lives matter, and indeed, all lives matter. The problem is, some lives are (perhaps unintentionally) targeted because of insidious stereotypes. We are all responsible for these tragedies and need to work together to change – or at the very least soften – stereotypes that link groups to such characteristics. Until then, it’s worth repeating: #blacklivesmatter.



How boxes influence thinking

Once we’ve categorized someone as a member of a group, whatever information our mind has stored in our schema about that group – our stereotype – comes into play in the way we think about them. We’re not unbiased observers, no matter how much we may want to convince ourselves that we are.

[Image: stereotyping comic]

Remember that schemas are mental structures that influence what people notice, think about, and remember. Let’s develop that idea, and see how it works in the case of stereotypes.

What we notice:

Remember also that we are motivated to pay careful attention to information that is deemed relevant to us. Thus, if we hold a stereotype that a particular group is dangerous, our mind should deem members of that group as especially notable. In fact, research shows that the more people hold the stereotype that Black Americans are dangerous, the more their attention is drawn to Black faces in their environment.

Other people become relevant when contextual features make “them” a particular concern. For example, research shows that IF we are feeling anxious AND the idea of terrorism has been activated in our minds, we become especially attentive to “Middle Eastern looking” faces.

Think about how important that is in places like airports, where many people are anxious to fly, and where the loudspeaker reminds us of the current terror threat level. Perhaps it’s no wonder that many people’s attention is drawn to men with thick beards, or people speaking Arabic or other foreign languages. This attentional bias is leaking our underlying stereotypes. Our attention is only directed toward certain people because of the beliefs and expectations we have associated with particular groups.

How we think about information:

[Image: comic – “Is this Mel’s secretary?” “No, this is Mel.”]

Stereotypes don’t just direct our attention to particular information, they also influence how we interpret it. More specifically, we tend to interpret ambiguous information in line with our existing stereotypes. In one classic study, white participants watched two men arguing, culminating in one man ‘shoving’ the other. The “trick” to the experiment is that all participants were actually watching a video recording, not a live broadcast as they expected. Some participants watched as a black man ‘shoved’ another man (either white or black). Others watched as a white man shoved another man (either white or black). Participants were asked to describe the interaction, and their responses were coded by people who didn’t know which version of the video they saw.

The researchers found that when the ‘shover’ was Black, a sizable proportion of participants spontaneously labeled the shove as aggressive; very few labeled the behavior that way when it was performed by a White man. In this case, a stereotype linking violence, aggression, and criminality to Black people influenced the way White participants evaluated the event. Not only that: whereas participants blamed the ‘shoving’ conducted by a White man on the context, they were more likely to blame the ‘shoving’ conducted by a Black man on his nature, showing another way that stereotypes influence the way we interpret behavior.

That study was conceptually replicated with black and white children, who evaluated ambiguous behavior of children presented in comics. The researchers found that both black and white children evaluated black characters as more mean and threatening than white characters (similar to the cartoon that opens this post). Once stereotypes exist, even when they are about our own group, we may use them to disambiguate information we perceive.

What we remember:

Finally, stereotypes influence what information we remember about people, often biasing our recall in the direction of the stereotype. In other words, rather than remembering information about people accurately, we may fill in certain details using our stereotypes.

Demonstrating this, researchers asked participants to read and evaluate college applications. All participants read the same application from Emily Chen. Among lots of other information provided in her application, participants saw that Emily had scored 640/800 on her math SAT.

After reviewing her application, some participants were asked to recall the SAT math performance of “the female high school student whose materials you just read.” Other participants were asked to recall the SAT math performance of “The Asian-American high school student whose materials you just read.” Still others were asked to recall the SAT math performance of the “high school student whose materials you just read.”

In the last instance, no relevant social category was cued, whereas in the first two, either the applicant’s gender or her ethnicity was activated. Results showed that when asked to recall the scores of the “Asian-American” student, participants recalled her scoring better (compared to their memory when no category was activated). In contrast, when asked to recall the scores of the “female” student, participants recalled her scoring worse (compared to their memory when no category was activated). In other words, participants remembered her performance in line with the stereotype. Because Asians are stereotyped as good at math, recall was biased toward assuming the student performed well. Because women are stereotyped as bad at math, recall was biased toward assuming she performed poorly.

In the same way, we may remember a person we met as more kind, more aggressive, more outspoken, or more selfish than they actually were, just because our stereotypes suggest that they must have been.
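One simple way to think about this kind of biased recall – a toy sketch in the spirit of “category adjustment” models of memory, not the model these researchers tested – is as a weighted blend of the actual memory trace and the activated category’s expectation. All the numbers below (the blend weight and the two “expectation” scores) are illustrative assumptions:

```python
def remembered_score(actual, category_expectation, trace_strength=0.7):
    """Toy schema-based recall: blend what was actually seen with what the
    activated stereotype leads us to expect. trace_strength is the weight
    given to the real memory trace (an assumed value)."""
    return trace_strength * actual + (1 - trace_strength) * category_expectation

actual_sat = 640  # Emily's actual math SAT score, from the study description

# Hypothetical category expectations (not figures from the study):
with_asian_cue = remembered_score(actual_sat, 720)   # "good at math" expectation
with_female_cue = remembered_score(actual_sat, 560)  # "bad at math" expectation

# Recall drifts up with the "Asian-American" cue and down with the "female"
# cue, while no cue leaves the trace alone.
print(with_asian_cue, with_female_cue)
```

The point of the sketch is just that the same stored trace can produce different “memories” depending on which category (and thus which expectation) is activated at the moment of recall.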

So here is the ominous thing… if stereotypes influence how we interpret information (and in fact make us do so in stereotype-confirming ways) AND what information we readily recall (again, in line with our stereotypes) – well, crap; you might realize just how darn good they are going to be at maintaining themselves. But that’s a discussion for a few weeks from now…

Why do we do this:

We use stereotypes for a variety of reasons, some of which I’ll explore in the next post. We may be filling in details we don’t have, or expanding the amount of information we feel we have about a topic. We may be distracted or unmotivated.

But we may also be pre-programmed to use stereotypes simply because doing so is cognitively efficient. Stereotypes have been labeled “energy-saving devices” because when we use them, we conserve cognitive resources to focus on other things; that is, we don’t have to spend effort or time understanding a new person, and can instead focus on finding our airport gate, finishing our homework, or discussing the latest fashion trends. Unfortunately, the energy we save may come at the cost of inaccurate assumptions. Stay tuned…

Stereotypes: Where do they come from?

Officially, stereotypes are the set of beliefs and expectations we have about members of a social group or category. In other words, they are schemas about groups. But schemas aren’t innate. They have to be formed through some sort of life experience. So where do these beliefs and expectations come from?


[Image: stereotype comic]

They come from everywhere.

[Image: gender-specific children’s toys]

Think of your unconscious mind as a serious hoarder. Everything we learn about a particular group, be it from personal experience, an overheard conversation, a movie, a news program, etc. is placed in the metaphorical box we have representing a group. Some of our boxes are overflowing with an inconsistent jumble of information – we’re likely to recognize the diversity of these groups. Others overflow with a brilliantly organized and consistent set of information – these people, it seems to us, are all the same. Still others sit virtually empty – we know very little about people that represent these groups. Although you may not consciously remember some specific information or experience at a particular point in time – just like for a hoarder, it is there, somewhere, influencing our overall impression of the group.

We begin to learn stereotypes at a young age, and one way we do so is through social learning. Social learning is the process of observing and mimicking the behavior of others. As we mimic behavior (or beliefs or attitudes), we are reinforced or punished according to whether we’ve picked things up ‘correctly.’

Social learning is one of the primary ways we learn gender roles (and corresponding gender stereotypes). Children may play dress-up, for example, as my sisters and I did with our younger brother. If he were to wear a beautiful princess dress to school, though, his classmates might ridicule him. Many parents may do the same, telling boys that ‘those clothes are for girls.’ In the same way, we tell boys to “man up” and girls that they “look pretty,” and very quickly we learn both how to fit in and what is expected of different groups.

We also form stereotypes on the basis of something called an illusory correlation. The concept is fairly simple. Minority groups are numerically distinctive – they stand out. Likewise, unusual behaviors or events, and in particular negative behaviors or events, stand out to us. So if we see a member of a minority group doing something bad, it’s doubly distinctive. It captures our attention and becomes memorable. As a result, we tend to overestimate the correlation or association between that type of behavior and that group. In other words, we form a stereotype that the whole group is ‘like that.’
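Since the “overestimated correlation” is ultimately a statistical claim, a tiny simulation can make it concrete. In the sketch below (the counts and the memory weight are illustrative assumptions, not data from any study), both groups perform negative behaviors at exactly the same rate, but the doubly distinctive observations – a minority member plus a negative behavior – are weighted more heavily in “memory”:

```python
# Both groups misbehave at the SAME rate (25%); only group size differs.
observations = (
    [("majority", "negative")] * 20 + [("majority", "neutral")] * 60 +
    [("minority", "negative")] * 5  + [("minority", "neutral")] * 15
)

def recalled_negative_rate(group, distinctive_weight=2.0):
    """Remembered proportion of negative behavior for a group, over-weighting
    doubly distinctive (minority + negative) observations."""
    weighted_negative = weighted_total = 0.0
    for g, behavior in observations:
        if g != group:
            continue
        weight = distinctive_weight if (g == "minority" and behavior == "negative") else 1.0
        weighted_total += weight
        if behavior == "negative":
            weighted_negative += weight
    return weighted_negative / weighted_total

# The actual rate is 0.25 for BOTH groups, but distinctiveness-weighted
# "memory" inflates the minority's rate to 10 / 25 = 0.4.
print(recalled_negative_rate("majority"), recalled_negative_rate("minority"))  # → 0.25 0.4
```

Nothing about the minority group’s actual behavior differs here; the illusory correlation falls out of how attention and memory weight the observations.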

Many prominent stereotypes that exist in the world today likely result from this kind of process. The stereotype that Muslims are terrorists, for example, may result because both Muslims and terrorism are distinctive, at least to Western audiences. Terrorist events attract so much attention, in fact, that they are exceptionally notable. Exceptionally horrifying. Exceptionally distinctive. And we are likely to focus on the distinctive examples of Muslim terrorists, rather than the millions of examples of non-distinctive Muslims in forming our impressions of the group.

The issue is compounded by the language we use in reporting news events or setting a political agenda. Do we label a perpetrator, for example, as a ‘radical Islamic terrorist’ (linking the action to a group and repeating that link anytime the action is condemned, piling and re-piling consistent information into our “Muslim box” until the link seems a formal truth)? Or just as a ‘terrorist’ (a category of its own that incorporates multiple ideologies – a box that includes both the 9/11 hijackers and the Oklahoma City bombers, people “who use terror as a means of coercion”)? The first label creates and reinforces a stereotype by repeatedly linking a behavior to a group (even when such behavior is rejected by the majority of the group’s members); the second labels a person by the intent of their violent actions. It allows for a more comprehensive, more nuanced, but perhaps also more jumbled box. And darnit, most of us like organized boxes.

In this way and others, the news and other forms of media help us form stereotypes. In fact, even without ever meeting a member of a particular group, we may hold beliefs and expectations about ‘them’ because of how ‘they’ are portrayed on television, in the movies, or in the news. Because media often highlight negative events, coverage may color our perception of various groups, leading us to form those organized stacks in the little box labeled “Black” or “Mexican” or “transgender” or “refugee.” And that means that when we finally meet a member of that group, we already have expectations about what they might be like.

Those expectations are our stereotypes. They have important consequences. Stay tuned…

Thinking inside the boxes

One of the ways we make sense of the complex world we live in is by creating boxes. Many of the schemas we form are about “types” of people. Take a moment to think about how important it is that we’re able to do that. Imagine that you are standing in the middle of a foreign city, lost, in need of directions. Who will you approach? What language will you speak? Being able to quickly differentiate child from adult, native from tourist, police officer from thief is an essential skill. In other words, making and using categories is helpful in our complex world.


People can be categorized on the basis of countless social groups, including age, sex, race, religion, nationality, occupation, political affiliation, sports team affiliation, etc.

When we meet people, we tend to categorize them automatically based on whatever features are salient in the context. If there is only one man in a group we meet, his sex may jump out at us (“you know, that guy who was there…”). If we’re having a discussion about race, we’re likely to note the race of the participants (“You know, what that Asian person said…”). In other words, features of the context interact with our current goals to influence how we see others.

The process of social categorization has a number of immediate consequences. Take a look at the following image:


We could label the people in it as a collection of individuals. If we do so, we are likely to notice both the things the people have in common, and the things that make them unique. If we give the crowd a label, though, we’re likely to look at the people differently. Let’s call them German fans. Suddenly we may start to focus on elements that the people have in common, and ignore many differences.

This is called within-group assimilation. It’s the tendency to perceive members of a labeled group as more similar to one another than we would perceive the same people as a collection of individuals. The simple act of labeling a collection of people as a group has perceptual consequences – it amplifies the similarities we see among people, which sets an important basis for stereotyping others. We can only generalize traits or characteristics across people, after all, if they are all (at least basically) alike.

Now let’s say that we see a crowd containing TWO groups – maybe Muslims and Christians. Once again there is a direct perceptual consequence, called between-group contrast. If we perceive people as belonging to two different groups, we amplify the differences we see between them, failing to notice or think about everything they have in common. This allows us to perceive polarization between groups that may exceed what actually exists.


Finally, let’s put these two effects together. Let’s say we’re watching a debate on the floor of the United States Senate. Members of both parties speak. A few hours later we discuss the session with friends. Something interesting might happen – remember that we’re seeing members of each group as more similar to each other than, perhaps, we should (within-group assimilation), and as more different from the other group than, perhaps, they are (between-group contrast). And the debate was lively, with many people contributing. It was complex.

We may find that we don’t really remember which senator said what, but we certainly know which group did. That is, we might misattribute a quote to John McCain that was actually said by Mitch McConnell. Or we might be convinced that Elizabeth Warren said something that was actually said by Kamala Harris. Or maybe we just know that ‘a Republican’ or ‘a Democrat’ said something… Any version of this effect shows that we’re no longer processing information at the level of the individual; we’re processing it at the level of the group. Members have become interchangeable. Anything we know about the group can then be attributed to any individual member.

These three perceptual biases arise even without consideration of the self. We don’t have to be a member of any of these groups for the effects to occur. In other words, they are basic cognitive processes; self-protection and enhancement are not involved.

When the self is involved, other biases arise. Let’s say that we’re not just perceiving two groups, but one group that we belong to (an ingroup) and another that we do not (an outgroup). Maybe we’re watching that Senate debate as an affiliated Democrat or Republican. In this case, we’re likely to feel attachment and affiliation for one group more than the other. The need to feel good about the group may lead us to favor members of our group over others. This is called ingroup favoritism, and we seem to show it even for groups with which we are minimally identified. Although perhaps harmless if it just means that we smile more at someone “on our team,” it’s also a basis of prejudice.

Inherent limitations of the human mind set the basis for processes that can be extremely detrimental. And yet they are also fundamentally human, arising out of the mind’s normal (not pathological) functioning. Stay tuned…

How the mind works

The world is a complex place and the human mind does a remarkable job of making sense of it all. From discovering hidden laws of math and physics, to just trying to understand the intentions of a neighbor who says that they “like” your dress (while making a funny face), the mind does a lot of work. Sometimes that ‘work’ leads us to understand the world correctly, other times our conclusions are inaccurate. Both outcomes result from the same basic processes.

Sometimes we feel our mind at work. This is called conscious processing. It’s the kind of thinking that is effortful, intentional, and controllable. We ‘experience’ the act of conscious processing, which involves seeing images in the mind or having a conversation inside our head. When trying to make decisions, we may agonize over different options, rehearsing and reconsidering the positive and negative elements of different choices. We may write our thoughts down. We may express them out loud.

But conscious processing isn’t the only way the mind works. We also have a whole set of processes that go on outside of conscious awareness. These processes operate constantly (some even while we’re sleeping) and help us navigate our complex social environment, all without us ‘feeling’ the experience.

Let me try to help you feel it. The following words are presented in colored font. Rather than read the words presented, I want you to try to say the color of the font. Go from the beginning to the end of the line as quickly as you can:

[Image: a line of words whose font colors are easy to name]

Not bad, right? In fact, probably pretty easy. Now do the same thing with the following line (remember, you want to say the font color):

[Image: a line of color words printed in mismatched font colors]

Was it harder than the first line? Why?

For the purpose of this discussion, let’s say that as fluent English speakers (which I assume anyone reading this blog is), we read automatically. We don’t have to think about it, and maybe we don’t even WANT to do it (indeed, your instructions here were to do something different). Nevertheless, it happens. It’s a process going on outside of conscious control. We can become aware of it in this task because it interferes with the task we are actually trying to do. Other times, though, we may not even realize that unconscious processes are affecting us.
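Since the colored lines above can’t be reproduced in plain text, here is a rough sketch of how one might generate the two kinds of Stroop trials and print them in a terminal; the particular color set and ANSI escape codes are just illustrative choices, not part of any specific study:

```python
import random

COLORS = ["red", "green", "blue", "yellow"]
ANSI = {"red": "31", "green": "32", "blue": "34", "yellow": "33"}  # SGR color codes

def make_trials(n, congruent, seed=0):
    """Generate n (word, font_color) Stroop trials. Congruent trials match
    word and font color; incongruent trials never do."""
    rng = random.Random(seed)
    trials = []
    for _ in range(n):
        color = rng.choice(COLORS)
        word = color if congruent else rng.choice([c for c in COLORS if c != color])
        trials.append((word, color))
    return trials

def render(trials):
    """Return the trial words wrapped in ANSI escape codes for their font color."""
    return "  ".join(f"\033[{ANSI[color]}m{word.upper()}\033[0m" for word, color in trials)

# Easy line (word and color match), then hard line (they conflict):
print(render(make_trials(6, congruent=True)))
print(render(make_trials(6, congruent=False)))
```

Naming the font colors on the second line is slow precisely because the automatic reading process delivers a conflicting answer before the deliberate color-naming process finishes.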

One way that the unconscious mind influences us positively is with the use of schemas. Schemas are mental structures that organize our knowledge on a particular topic, and influence the way we notice, think about, and remember information. We develop schemas through experience, so, for example, if your neighbor brought you cookies when you moved in and is willing to take care of your mail when you’re out of town, you may form a schema that she’s kind and generous. Likewise you have schemas about what happens when you eat in a restaurant (there is a whole different procedure than when you’re eating at home!), how to take a taxi, and what ‘science’ is. Whatever your beliefs and expectations about these topics, picked up from whatever experiences you’ve had with them (first, second, or third hand) – that’s your schema.

As we’re navigating life, so long as things proceed as we expect (i.e., so long as things fit our existing schemas), we don’t need to waste any time or effort trying to figure anything out about the situation we are in. We can run on automatic pilot. And we may do very many things in this mode of thought. I, for example, have a morning routine that includes washing my hair. I go through the motions, but am usually consciously thinking about what I have to do for the day. Half of the time when I’m drying off, I find myself wondering whether I actually shampooed my hair. I probably did, but I wasn’t consciously thinking about it, I was just running on automatic pilot.

Anytime our schemas are violated, though – let’s say your neighbor yells at you for parking too near her spot, or your shampoo bottle explodes, sending shampoo all over the shower – your unconscious mind will tell your conscious mind: hey! Something’s weird! You need to figure this out! In those moments, we pay closer attention to what is going on in our environment, redirecting our attention to elements we would have ignored, save for the schema-violating experience.

This dual-processing approach to navigating the world is rather functional, allowing us to save ‘space’ in the conscious mind to focus on what is important in the moment. I don’t really need to talk myself through shampooing my hair every morning – if I did, I’d have to get up MUCH earlier to be prepared for my day.

Let’s take a moment to experience that allocation of conscious awareness. Play the following video (it’s 60 seconds, you have time!), and really focus on the task at hand. The video quality isn’t great, so it’s actually really hard. Do what you can to focus and try to get the right answer. Ready? Press play:

[Video: a selective-attention “awareness test” clip]

Well? Did you get the right answer?

We need ‘space’ in the mind if we want to think about anything at a conscious level. More specifically, we need cognitive capacity – this is kind of like mental energy, and that energy is best devoted to one thing at a time. It’s why we’re limited in terms of how many things we can do (well) at once, or why I can’t ever remember if I actually shampooed – my mind is otherwise occupied with considerations of what I have to do for the day, focused on only the most important cognitive task at hand.

More generally, if we’re thinking about something carefully, we may totally miss other things going on – that’s why when you were watching the video, you probably got the right answer OR you saw the moon-walking bear, but not both (maybe some of you did, especially if you expected there was some trick at hand, but hopefully you get the idea).

Just like fuel in your car or strength of your phone battery, our cognitive energy is limited. If we spend too much time studying or reading technical information, we may use up the energy we have, and we may need to replenish it by taking a brain break, getting some rest, or having a healthy snack (particularly fruit).

But having capacity isn’t enough. We also need motivation to consider information at a conscious level. Sometimes we simply decide to allocate our conscious awareness to some task (like working on our tax forms). Other times our attention is directed to some information either because it’s relevant to us, or because it violates our expectations. An example of the former is if you’re at a party, engrossed in conversation, when across the room you hear someone say your name. Here the unconscious mind is telling the conscious mind – hey, direct your attention over there, it might be important! An example of the latter was given above – when your sweet neighbor yelled at you. Again, it seems very strange, and different than what we would expect, so our mind is compelled to figure out why it has happened.

You may assume that these unconscious processes can lead us astray in the way we perceive the world because we’re not ‘really thinking’. Certainly they can. But in my last two posts, I’ve written about how motivation can lead to bias. Hopefully you can see that even the conscious mind isn’t going to be a perfect information processor. It’s human, and it balances efficiency and accuracy the best way it can, given the inherent limits of our mental skills. So remember these two forms of information processing; they will be important to bear in mind as we see how each plays a role in our perception of the world in future posts. Stay tuned…


Bias and the Media

Let’s talk about the media. And let’s talk about bias. Certainly those two words have been tossed around a lot these days, mixed and matched in different ways by different sources who blame either mainstream or new media for the misrepresentation of public opinion…

Rather than focus on ways that the media is biased, as a psychologist, I’d like to focus on a different source of bias in our interactions with the media. That bias comes from us as perceivers. We may be seeing bias as something that exists out in the world when in fact, it exists inside our own mind.

In my last post, I outlined how humans are motivated to protect their personal sense of value. But our sense of value and worth comes from multiple sources. Social identities are ways we perceive ourselves as members of groups. Some identities are centrally meaningful to us, whereas others are category labels that may apply to the self without provoking attachment or identification. Just as we have a motivation to protect our sense of personal worth, we have a motivation to protect the image, esteem, or value of the groups that are important to us. This has important implications.

One implication is called the hostile media bias, which is the tendency to perceive objectively neutral media coverage on a controversial issue as hostile to our side. For example, objective media coverage of an election debate may be perceived as anti-Republican by Republicans, and simultaneously as anti-Democrat by Democrats. This perceived hostility results because we are highly attentive to criticism of our group, and because we are irritated that the report gives voice to an alternative view – one which we perceive as unfair or incorrect. In this case, the media is objectively balanced, but we can’t see it that way because it doesn’t adequately promote the righteousness of our group.

Perceiver bias that’s rooted in social identity doesn’t stop there. We are also more receptive to arguments or proposals that come from our own groups, while being especially antagonistic to arguments that come from the other side. Research conducted in Israel, for example, showed that Israeli Jews devalued an Israeli/Palestinian peace plan that was actually authored by the Israeli government when they were told that it was authored by Palestinians. Similarly, both Israeli Jews and Israeli Arabs devalued a plan that was described as coming from the other side. More generally, it seems that we trust information that comes from our own groups, and we devalue information that comes from outside. In the age of social media, we may be tricked into agreeing or disagreeing with a particular proposal or issue, just because someone created a meme that attributed the idea to a particular side.

And once again, the problem doesn’t stop there. It’s clear that we don’t like news coverage which is actually fair and balanced – so long as our identity is involved, we won’t consider it fair or balanced. We do, however, like news which comes from “our side” – this may be the reason that partisans prefer the news sources they do, with liberals preferring Slate or CNN, while conservatives prefer Breitbart or Fox News. We like to hear news that fits our own perception of the world, and we feel uncomfortable when our view is challenged. This is called confirmation bias.

Confirmation bias is the tendency to seek out information which confirms our existing views, and to disregard or ignore information which contradicts them. This again is why we may be tricked by false news that aligns with our preexisting views of people or policies.

These days, rather than watching conventional news programs, many of us select opinion-oriented or interpretation-oriented shows. Here the focus is more on building an argument aligned with our own ideological position than on presenting different sides of an issue. Tomi Lahren and Stephen Colbert, for example, offer a passionate or satirical interpretation of the news. The show that sits comfortably with us doesn’t necessarily do so because it’s right, but because it confirms our own interpretation of reality. It feels comfortable to us. It reinforces what we already suspected (or wanted) to be true.

Here’s the thing… sometimes the argument that someone on our side is making is, well… crap. If the argument isn’t centrally relevant to our group, we probably won’t even notice. The reason is that we don’t spend a lot of effort or energy thinking about all the news that bombards us throughout the day. Because we intuitively trust stuff that comes from our side, we’ll agree with it without much thought. Stuff from the other side, which we mistrust, we can dismiss just as thoughtlessly. When a message is centrally relevant to our group, though, we’ll think about it much more closely… so long as it comes from someone in our group. In this case, we’ll agree with a message from our side when it’s strong and compelling, but dismiss it when it’s weak. This suggests that we’re paying careful attention and thinking deeply about the message – we notice when it’s valid or invalid. We won’t devote the same energy to a message presented from the other side, though, often dismissing it out of hand without deeply considering the value of the argument. Ignoring messages from the other side helps protect our pre-existing views, as well as the felt value of our group, at least relative to the other side. After all, we don’t have to see our side as wrong if we’re not willing to deeply consider an alternative perspective.

The more we elect to receive our news from partisan sources, the more polarized our view of the world may become. Take a look at these graphics created by the Wall Street Journal, which show the differences in what highly identified Republicans vs. Democrats see in their Facebook news feeds. These feeds seem to represent two alternate universes, rather than a single country that houses a diversity of people and ideas. It’s a shame, actually, if we’re only exposed to one side, because it means that at a minimum, we’ll be blind to the reality of life experienced by those who share the world – or perhaps the neighborhood – with us. This may be part of the reason many Democrats were blindsided by the outcome of the 2016 election: they failed to notice the perspective and strength of a large contingent of the country. Perhaps we all need to do a better job of knowing what other people in the world actually think. It’s a fallacy to expect that they all think as we do, no matter how red or blue our own newsfeed appears.


If you want to try to reduce your own bias in the way you interpret the media or political action, it’s important to (1) be critical and conscientious in the way you receive news or information that comes from BOTH your side and the other side, and (2) explicitly seek out sources of information from outside your group. This ensures you’re exposed to arguments or information that you might otherwise avoid; even if you ultimately dismiss it, it’s good to know what message is reaching the other side.

But to truly be a critical consumer, we need to step away from our identities, detaching in a way that allows us to depersonalize information (assuming that the information doesn’t have a direct effect on us as individuals). This helps to control the “modifications of the mind” by removing self or social-identity protective bias. It’s a point on which Eastern and Western science of the mind agree, though they also agree that it’s easier said than done. Bias is insidious. It works hard to convince us that it lives somewhere else – in the media, in the other group, or in society at large. It may live there, but make no mistake, it also lives in your mind.

Motivation and Perception

Spend a few minutes looking at the image below. How would you describe what is happening?


My Social Psychology students suggest:

  1. “the wife is sad, because the husband just died.”
  2. “the wife is upset because her husband is drunk.”
  3. “maybe she just woke up, and he’s still sleeping.”

For what it’s worth, the first two suggestions came from women. The third from a man.

But there are other possible explanations. Maybe she’s not his wife. Maybe she’s his nurse, his sister, a prostitute, or a stranger. The setting could be a hospital, a hotel, or a home. The woman could be sad, angry, ashamed, or tired. Or maybe her neck is just sore. The man could be asleep or awake. Dead or alive. Caught in flagrante or innocently observed. The mind could tell a thousand stories about the same simple drawing.

The story we tell – the interpretation we provide – is shaped by our past experience, our beliefs and values, our cultural background, and our current goals and motives, just to name a few. Thus, the stories we tell may be more indicative of something about us, than of anything about the targets.

Of course, this is a drawing – there is no objective reality, no correct answer regarding its meaning. But we experience a conceptually similar process on a daily basis, watching people interact on public transportation, or at the store, or at work. Our mind is constantly making sense of experiences, filling in details that we don’t actually know to give us a feeling of understanding. That understanding may be right or wrong, but so long as it feels right to us, it becomes our subjective reality.

If there is anything systematic to the way we draw our inferences – the way we construct our subjective reality – it represents bias. Bias takes on many forms and can influence us through many channels.

One motive which introduces bias into the way we perceive the world is the need to view ourselves positively. When that motive is threatened, such as when someone insults us, or when we fail at some task, our need for self-defense or self-justification grows. As a result, we become susceptible to many forms of bias aimed at self-protection.

Let’s consider another set of images. The following are of Barack Obama’s 2009 inauguration (left), and Donald Trump’s in 2017 (right).


There are many reasons Donald Trump’s crowd may have been smaller than Barack Obama’s; however, there is no alternative truth (truth being representative of objective reality) about the sizes of the crowds in these photos – Obama’s was bigger.

There is, however, alternative interpretation (interpretation being reflective of subjective experience – and therefore subject to bias). For someone who not only has a strong desire to see himself as ‘great’ but who has an acknowledged desire to see himself as ‘the greatest,’ lower crowd turnout represents a threat to self-esteem and value. It implicates self-defensive bias. Under these conditions, Donald Trump may convince himself that there is something amiss in or with the photos.

Bear in mind that this was Trump’s view of that event:


Amazing, right? It must have been overwhelming to look out across that crowd of thousands and be sworn in as President. To later see side-by-side images from a different perspective contradicts his experience (“it feels like a lot of people!”). As a result, he may look for evidence which confirms those feelings, and ignore evidence which contradicts them. Of course, he may accurately understand the differences in crowd sizes, and his assertion that his inaugural crowd was the largest ever may be an attempt to impact the subjective perception of millions of viewers. In trying to know his intentions, this is a case where objective reality is unclear.

When reality is tangible, it leaves evidence. Pictures of crowds or reliably measured data about global temperatures reflect fact. The meaning of those facts can be debated, but if our perception contradicts or ignores the objective indicators (“the crowd in that image is clearly bigger!” “the global temperature is dropping!”), it must be biased.

When reality is intangible (for example, when we’re trying to understand someone’s motives), objective reality is harder to define. In this case, it can be difficult – maybe impossible – to know if our perception matches ACTUAL reality. However, the uncomfortable consideration we give to self-threatening information (information which makes us look bad, or which contradicts our preexisting views) shows that our motivation is at least partially tempered by a concern for accuracy. Refusing to give such information that consideration leaks our underlying motive, and indicates our bias.

If our self-esteem motive is working well, we will be blind to its influence. That is, we will believe that our perception reflects reality, not bias. Because it influences us automatically, and outside of conscious awareness, bias is hard to taste or feel or smell. That’s what makes it so insidious, and that’s why we may get into arguments with people who view the world differently than we do – both sides believing that our perspective is the right one, and unable to understand how someone could possibly “see” it another way.

Although I can’t give you the tools to identify bias in yourself and others every time it occurs, the goal of this blog is to help you identify times and situations that it might influence you, to help you understand why it would do so, and to give you some tools to help counter its role in the way you interact with others. Feedback, questions, and suggestions welcome.

**Motivation and Perception in Action: If you would like to know how different motives may influence your interpretation of the world, take this test here: (you’ll need at least 10 minutes). Because it’s not administered under carefully controlled conditions, it won’t give you a perfect snapshot of your motives in life, or even your motives right now, but it might give you some things to think about.**