
Survey Research

From OPOSSEM



Objectives

  • Would you say that you approve or disapprove of survey research?

Introduction

Response Effects

Response effects refer to the phenomenon where something about the survey process itself influences the answers that respondents give. While it is impossible to eliminate all response effects, it is possible to reduce or eliminate the structural causes of many of them (that is, cases where the structure of the survey is the cause, such as the order of questions or alternatives, or the wording of questions) and to minimize the impact of others (for example, response effects caused by the race or sex of the interviewer can be minimized by randomization in the sampling process).

Framing

Framing refers to the use of language, words, or phrases which are likely to lead survey respondents to give certain types of answers.

Example of Framing

In one telephone poll of adults 18 and older in York County, SC conducted in 2003, two versions of one question were included to demonstrate the phenomenon of framing. Respondents were randomly selected to receive one of the versions. In a series of questions where respondents were asked to Strongly Agree, Agree, Neither Agree nor Disagree, Disagree, or Strongly Disagree with a statement, half of the respondents received Version A of one statement while half received Version B.

Version A: "I don't mind if the government keeps tabs on regular people."

Results: [chart not available]


Note: The majority of respondents value their privacy and do not approve of the government keeping tabs on them.


Version B: "I don't mind if the government keeps tabs on regular people if it helps keep us safer from terrorism."


Results: [chart not available]


Note: Now, the majority of respondents appear to be willing to sacrifice their privacy for perceived safety.

Side-by-side comparison of Versions A and B: [chart not available]

By creating a frame of "terrorism" for the question, respondents gave a different answer than they otherwise might have. Whether you attribute the cause to placing terrorism at the "top of the head" of the respondent<ref>Zaller, John (1992). The Nature and Origins of Mass Opinions. Cambridge University Press. </ref> or to causing them to consult a different core predisposition<ref>Alvarez, R. Michael; Brehm, John (2002). Hard Choices, Easy Answers: Values, Information, and American Public Opinion. Princeton University Press. </ref>, the result is the same: an altered response.
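The randomized split-ballot assignment used in the York County poll can be sketched in Python. This is a minimal illustration only; the respondent IDs and the sample size of 1,000 are hypothetical, not taken from the actual poll:

```python
import random

def assign_versions(respondent_ids, seed=42):
    """Split-ballot design: shuffle the respondents, then give Version A
    to one half and Version B to the other, so each version reaches a
    random half of the sample."""
    rng = random.Random(seed)
    ids = list(respondent_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {rid: ("A" if i < half else "B") for i, rid in enumerate(ids)}

# Hypothetical sample of 1,000 respondents
assignments = assign_versions(range(1000))
n_a = sum(1 for v in assignments.values() if v == "A")
n_b = sum(1 for v in assignments.values() if v == "B")
```

Because assignment is random, any difference between the Version A and Version B results can be attributed to the wording rather than to differences between the two groups of respondents.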

Example of Framing Related to Word Choice

In the fall of 2008, the United States Congress was considering The Troubled Asset Relief Program (TARP), the Bush administration's plan to purchase assets and equity from financial institutions in hopes of stabilizing the financial industry. It was signed into law on October 3, 2008. In September 2008, three separate national telephone surveys (Pew, LA Times/Bloomberg, and ABC News/Washington Post) sought to gauge public opinion regarding this program. Each survey used different language to describe the program.

Pew wording: "As you may know, the government is potentially investing billions to try and keep financial institutions and markets secure. Do you think this is the right thing or the wrong thing for the government to be doing?"

Pew results: Right 57%, Wrong 30%

LA Times/Bloomberg wording: "Do you think the government should use taxpayers’ dollars to rescue ailing private financial firms whose collapse could have adverse effects on the economy and market, or is it not the government’s responsibility to bail out private companies with taxpayers’ dollars?"

LA Times/Bloomberg results: Should 31%, Should not 55%

ABC News/Washington Post wording: "Do you approve or disapprove of the steps the Federal Reserve and the Treasury Department have taken to try to deal with the current situation involving the stock market and major financial institutions?"

ABC News/Washington Post results: Approve 44%, Disapprove 42%


Impact of Word Choice on Survey Results

Poll             Wording       Favor Program   Do Not Favor Program
Pew              "investing"   57%             30%
LAT/Bloomberg    "bail out"    31%             55%
ABC News/Post    "steps"       44%             42%


Certainly, there was no intention to bias the results; all three polls have established national reputations for reliability. At first blush, none of the questions has any obvious wording problem. Nonetheless, the results vary dramatically from one poll to the next. Part of the reason may be the affective connotations attached to the different key words.

"Investing" often carries positive connotations: storing something away for the future or improving something. The Pew results show that a solid majority of Americans liked the idea of "investing" in the security of national markets and financial institutions. "Bail out," on the other hand, often carries negative connotations. Many people associate being bailed out with being rescued from a foolish mistake, and see it as undeserved and made necessary by imprudence. The LA Times/Bloomberg results show that a definite majority of Americans opposed a "bail out" for companies that had gotten themselves into trouble. Finally, taking "steps" usually carries no strong connotations in either direction; the language is much more neutral. The ABC News/Washington Post results show a country evenly divided on the "steps" proposed to "deal with the [...] situation."


It could be that framing from simple differences in word choice is more likely to occur on issues where individuals do not hold strong opinions or which are difficult to understand. Any survey question long enough to explain TARP adequately, giving the respondent enough information to fully understand the program, would have been far too long to include practically in a survey. As such, respondents may be more likely to look for, or pick up on, inadvertent cues in the question wording. "Investing" provided a slight positive cue and produced results favorable to TARP. "Bail out" provided a slight negative cue and produced unfavorable results. "Steps" provided no cue in either direction and produced results that could mean the public was either evenly divided on TARP or uncertain due to unfamiliarity with its specifics.
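To check whether wording gaps like these are larger than sampling noise, one can compare two polls with a standard two-proportion z-test. The sketch below uses the Pew ("investing", 57% favor) and LA Times/Bloomberg ("bail out", 31% favor) figures; the sample sizes of 1,000 per poll are assumed purely for illustration, since the actual n's are not reported here:

```python
import math

def two_prop_ztest(p1, n1, p2, n2):
    """Two-sided z-test for the difference between two independent
    sample proportions, using the pooled-proportion standard error."""
    x1, x2 = p1 * n1, p2 * n2          # implied counts of "favor" responses
    p_pool = (x1 + x2) / (n1 + n2)     # pooled proportion under H0: p1 == p2
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal p-value
    return z, p_value

# Assumed n = 1,000 per poll (hypothetical sample sizes)
z, p = two_prop_ztest(0.57, 1000, 0.31, 1000)
```

With samples of this size, a 26-point gap is far too large to be sampling error, which supports attributing the difference to question wording.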

Priming

Michael Parkin defines priming as the "psychological process in which exposure to a stimulus activates a concept in memory that is then given increased weight in subsequent judgment tasks. Priming works by making the activated concept accessible so that it can be readily used in evaluating related objects."<ref>Parkin, Michael (2008). "Priming". In Paul J. Lavrakas. Encyclopedia of Survey Research Methods. Sage. </ref>

Example of Priming

In surveys, priming response effects are often the result of question order. Imagine a survey that seeks to measure individuals' concerns about the economy. If respondents are asked what they consider to be the most important problem facing their country or state after they have been asked a series of questions regarding economic concerns, they may be more likely to cite the economy as the "most important problem."

Priming due to question order may also change the criteria used for judgment or evaluation. Asking a series of questions about the economy before asking respondents to evaluate the U.S. president may result in higher evaluations than might otherwise be expected in economic good times and lower than expected evaluations in an economic downturn. Similarly, asking for a presidential evaluation after a series of questions regarding foreign military engagements may result in the individual subconsciously creating their evaluation of the president based on their attitudes regarding foreign policy rather than any domestic issue.

In fact, for these reasons, most political surveys ask questions regarding approval or evaluation for candidates or elected officials, as well as questions regarding the most important problem or issue facing the country or state, somewhere near the beginning of the survey. Another example would be asking questions regarding religious beliefs and church attendance prior to asking questions about issues that are often religiously charged, such as abortion or same sex marriage.
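Another general mitigation, beyond placing sensitive items early, is to randomize question order independently for each respondent so that order-based priming averages out across the sample. A minimal sketch, with hypothetical question identifiers (this is a general technique, not the practice of any particular survey described above):

```python
import random

# Hypothetical question identifiers
QUESTIONS = [
    "presidential_approval",
    "most_important_problem",
    "economic_outlook",
    "foreign_policy",
]

def randomized_order(questions, rng=None):
    """Return an independently shuffled copy of the question list for one
    respondent; across many respondents, any single ordering (and its
    priming effects) appears only in a random subset of interviews."""
    rng = rng or random.Random()
    order = list(questions)
    rng.shuffle(order)
    return order

order = randomized_order(QUESTIONS, random.Random(7))
```

Note that randomization spreads order effects across respondents rather than removing them for any one interview, so fixed early placement of key items remains common in practice.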

More on Question Order

The order in which questions are asked can impact results in ways other than priming. In a brilliant survey-based experiment conducted in 1948 by Herbert Hyman and Paul Sheatsley<ref>Hyman, Herbert; Paul Sheatsley (1950). "The Current Status of American Public Opinion". In J. C. Payne. The Teaching of Contemporary Affairs. Twenty-first Yearbook of the National Council of Social Studies. </ref>, question order was shown to affect respondent attitudes by inspiring a sense of reciprocity or fairness.

Respondents were asked both of the following questions:

Communist reporter item: "Do you think the United States should let Communist newspaper reporters come in here and send back to their papers the news as they see it?"
American reporter item: "Do you think a Communist country like Russia should let American newspaper reporters come in and send back to America the news as they see it?"

However, the order in which the questions were asked was varied. Americans' willingness to allow a Communist reporter into the United States varied greatly depending on which question was asked first.


Question Order Effects

Item asked first            Percent in favor of admitting a Communist reporter
Communist reporter item     36.5%
American reporter item      73.1%


In 1948, as fears surrounding the Cold War grew, individuals who were first asked about allowing a Communist reporter into the United States showed little willingness to open the country to them, with only a bit more than one-third favoring admitting the Communist reporter.  However, when asked about allowing in a Communist reporter after being asked whether Communist countries should permit the entrance of an American reporter, more than 7 out of 10 favored allowing in a Communist reporter.  Thinking about Communist countries allowing in an American reporter generated feelings of reciprocal obligation and made respondents more likely to respond favorably to the Communist reporter item.  Howard Schuman and Stanley Presser repeated the experiment exactly in 1980 with nearly as dramatic results.<ref>Schuman, Howard; Presser, Stanley (1996). Questions and Answers in Attitude Surveys: Experiments on Question Form, Wording, and Context. Sage Publications. </ref>


Assessing the impact of Cell-Only households on telephone surveys

According to data collected in the latter half of 2009 by the United States Centers for Disease Control and Prevention's National Health Interview Survey<ref>Stephen J. Blumberg, Ph.D., and Julian V. Luke (May 2010). "Wireless Substitution:Early Release of Estimates From the National Health Interview Survey, July–December 2009". http://www.cdc.gov/nchs/data/nhis/earlyrelease/wireless201005.pdf. Retrieved July 9, 2011. </ref> the percentage of "cell-only" households in the United States, that is, households that do not have a landline, stood at 24.5% -- nearly 1 in 4 households. Additionally, "[t]he percentage of households that are wireless-only has been steadily increasing. The 4.3-percentage-point increase from the last 6 months of 2008 through the last 6 months of 2009 is nearly equivalent to the 4.4-percentage-point increase observed from the last 6 months of 2007 through the last 6 months of 2008."<ref>Stephen J. Blumberg, Ph.D., and Julian V. Luke (May 2010). "Wireless Substitution:Early Release of Estimates From the National Health Interview Survey, July–December 2009". http://www.cdc.gov/nchs/data/nhis/earlyrelease/wireless201005.pdf. Retrieved July 9, 2011. </ref> Researchers may also be interested to learn "[t]he percentages of adults and children living without any telephone service have remained relatively unchanged over the past 3 years [from the date of the report]. Approximately 2.0% of households had no telephone service (neither wireless nor landline)<ref>Stephen J. Blumberg, Ph.D., and Julian V. Luke (May 2010). "Wireless Substitution:Early Release of Estimates From the National Health Interview Survey, July–December 2009". http://www.cdc.gov/nchs/data/nhis/earlyrelease/wireless201005.pdf. Retrieved July 9, 2011. </ref>."

The CDC report noted the following demographic differences for cell-only households (all of the following are taken directly from the CDC report<ref>Stephen J. Blumberg, Ph.D., and Julian V. Luke (May 2010). "Wireless Substitution:Early Release of Estimates From the National Health Interview Survey, July–December 2009". http://www.cdc.gov/nchs/data/nhis/earlyrelease/wireless201005.pdf. Retrieved July 9, 2011. </ref>):

  • More than three in five adults living only with unrelated adult roommates (62.9%) were in households with only wireless telephones. This is the highest prevalence rate among the population subgroups examined.
  • More than two in five adults renting their home (43.1%) had only wireless telephones. Adults renting their home were more likely than adults owning their home (14.0%) to be living in households with only wireless telephones.
  • Nearly half of adults aged 25–29 years (48.6%) lived in households with only wireless telephones. More than one-third of adults aged 18–24 or 30–34 (37.8% and 37.2%, respectively) lived in households with only wireless telephones.
  • As age increased from 35 years, the percentage of adults living in households with only wireless telephones decreased: 23.9% for adults aged 35–44; 14.9% for adults aged 45–64; and 5.2% for adults aged 65 and over.
  • Men (24.5%) were more likely than women (21.3%) to be living in households with only wireless telephones. However, the proportion of women among all wireless-only adults increased from approximately 46% to 48.2%.
  • Adults living in poverty (36.3%) and adults living near poverty (29.0%) were more likely than higher income adults (19.6%) to be living in households with only wireless telephones.
  • Adults living in the Midwest (25.6%), South (25.4%), and West (22.2%) were more likely than adults living in the Northeast (15.1%) to be living in households with only wireless telephones.
  • Hispanic adults (30.4%) were more likely than non-Hispanic white adults (21.0%) or non-Hispanic black adults (25.0%) to be living in households with only wireless telephones.
  • Among all wireless-only adults, the proportion of adults aged 30 years and over has steadily increased. In the last 6 months of 2009, the majority of wireless-only adults (59.2%) were aged 30 and over, up from 48.4% in the first 6 months of 2006.
  • The proportion of employed adults among all wireless-only adults has decreased from 78.6% to 69.1%. Over the same time period, the proportion of adults with an employment status other than working, keeping house, or going to school increased. These adults (largely unemployed or retired) made up 20.2% of wireless-only adults in the last 6 months of 2009, up from 10.3% in the first 6 months of 2006.


Impact on Political Polls

As noted in the demographic characteristics listed above, young adults are much more likely to live in cell-only households. While it is often thought of as a truism that young adults in the United States are more likely to vote Democratic, additional evidence exists that cell-only young adults can exhibit different behaviors and hold different opinions than young adults who live in landline only households or dual (landline and cell) households <ref>Blumberg, Stephen J.; Luke, Julian V. (2007). "Coverage Bias in Traditional Telephone Surveys of Low-Income and Young Adults". Public Opinion Quarterly 71: 734-749. </ref>. According to a Pew Research Center study based on three pre-2010 election polls, research "showed Democrats with a 53%-to-38% lead over Republicans among registered voters younger than age 30. But estimates based only on interviews from the landline sample showed Democratic and Republican candidates running about even among young voters -- 49% said that if the elections were held today they would vote for the Democratic candidate, while 45% backed the Republican candidate in their district. The difference in the margin between the combined sample [of young voters] and the landline sample [of young voters] was 11 points<ref>Scott Keeter, Leah Christian and Michael Dimock (November 22, 2010). "The Growing Gap between Landline and Dual Frame Election Polls". PEW RESEARCH CENTER FOR THE PEOPLE & THE PRESS. http://pewresearch.org/pubs/1806/growing-gap-between-landline-and-dual-frame-election-polls. Retrieved July 9, 2011. </ref>."

This and other differences between cell-only and landline respondents contribute to a significant Republican skew in polls that exclude cell phones. As the Pew report notes, Republicans "held a lead that was on average 5.1 percentage points larger in the landline sample than in the combined landline and cell phone sample [across three pre-2010 election polls]<ref>Scott Keeter, Leah Christian and Michael Dimock (November 22, 2010). "The Growing Gap between Landline and Dual Frame Election Polls". PEW RESEARCH CENTER FOR THE PEOPLE & THE PRESS. http://pewresearch.org/pubs/1806/growing-gap-between-landline-and-dual-frame-election-polls. Retrieved July 9, 2011. </ref>."

This is relevant because the Telephone Consumer Protection Act of 1991 <ref>"Telephone Consumer Protection Act of 1991". http://www.law.cornell.edu/uscode/html/uscode47/usc_sec_47_00000227----000-.html. Retrieved July 9, 2011. </ref> and the Telecommunications Act of 1996<ref>"Telecommunications Act of 1996". http://transition.fcc.gov/Reports/tcom1996.pdf. Retrieved July 9, 2011. </ref> forbid the use of automated calling or auto-dialers to contact cell phones. In recent years, the media have given widespread coverage to results published by several polling companies or institutions that exclusively use automated polling -- also often referred to as Interactive Voice Response, or IVR, polling. Polls which only use this methodology, and are therefore barred by law from contacting cell phone users in the United States, will have an inherent Republican skew. This bias cannot be completely eliminated by weighting the responses of respondents who are demographically similar to cell-only individuals more heavily since, as noted above, cell-only individuals exhibit different attitudes than demographically similar individuals who are not cell-only<ref>Blumberg, Stephen J.; Luke, Julian V. (2007). "Coverage Bias in Traditional Telephone Surveys of Low-Income and Young Adults". Public Opinion Quarterly 71: 734-749. </ref>.
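The demographic weighting itself is straightforward; the limitation just described is substantive, not computational. In a minimal post-stratification sketch (the age groups, interview counts, and population shares below are hypothetical), each group's weight is its population share divided by its share of the sample, which corrects demographic imbalance but cannot recover the distinct attitudes of cell-only respondents:

```python
def poststratify_weights(sample_counts, population_shares):
    """Post-stratification: weight each group by population share divided
    by its share of the sample, so weighted totals match the population."""
    n = sum(sample_counts.values())
    return {group: population_shares[group] / (sample_counts[group] / n)
            for group in sample_counts}

# Hypothetical landline-only sample that under-represents younger adults
sample = {"18-29": 50, "30-49": 400, "50+": 550}          # interview counts
population = {"18-29": 0.20, "30-49": 0.35, "50+": 0.45}  # assumed census shares
weights = poststratify_weights(sample, population)
```

Here the 50 younger respondents would each receive a weight of 4.0, magnifying any difference between the young landline users who were reached and the cell-only young adults who were not.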


Conclusion

References

<references group=""></references>

Discussion questions

Problems

  1. Your predilection for philately (stamp collecting) has finally caught up with you. The recent purchase of a Treskilling Yellow has left you so heavily indebted that you are required to seek paid employment of any type. Thankfully, a position has just become available at Pontificate Polling – a somewhat less than reputable polling firm that promises to deliver results for its clients. Your job is to design a survey for one of two clients: “The federation of stingy taxpayers of Canada” or “The leftist alliance of deficit lovers of Canada”. Both groups want a survey that will lead respondents to support their political objective: in the first case a decrease in foreign aid, and in the second an increase in foreign aid. Produce a questionnaire (containing at least 5 questions) that will elicit the desired response from a randomly selected sample of Canadians. Give a brief explanation of the techniques that you used to bias the results. Be subtle in your bias. If your bias is too obvious, then people will not respond in the way that you want.

Having completed the first survey, you have an epiphany of sorts. You reason (quite rightly) that this type of survey would not be ethical. You make it your mission to challenge biased surveys from this moment on. Draw up a second survey that demonstrates what a non-biased question set should look like for this issue.


Glossary
