Polls and Surveys: Evaluate Statistically Based Reports


Page 1: Polls and surveys

Polls and Surveys: Evaluate Statistically Based Reports

Page 2: Polls and surveys

Classification of Survey Methods (Figure 7.4)

Survey Methods:
• Telephone: Traditional Telephone; Computer-Assisted Telephone Interviewing (CATI)
• Personal: In-Home; Mall Intercept; Computer-Assisted Personal Interviewing (CAPI)
• Mail: Mail/Fax Interview; Mail Panel
• Electronic: E-Mail; Internet

Page 3: Polls and surveys

Other data collection methods

• Experimental design
– Subjects are randomly assigned to treatments (variables) by the researcher
– Causal inferences are stronger
– Random sampling from the population is less important
– Usually conducted in a laboratory
• Observational design (e.g., surveys)
– Subjects are not randomly assigned to variables
– Random sampling is important
– Selection bias
– Causal inferences are compromised


Page 8: Polls and surveys

Probability sampling methods

• Systematic random sample: pick a random case from the first k cases, then select every kth case after that one
• Stratified random sample: divide the population into groups (strata), then select a simple random sample from each stratum
• Cluster sampling: divide the population into groups called clusters or primary sampling units (PSUs), then take a random sample of the clusters
• Multistage sampling: several levels of nested clusters, often combining both stratified and cluster sampling techniques
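As a rough illustration of these schemes, the Python sketch below draws systematic, stratified and cluster samples from a made-up population of 100 students; the year-group strata and class clusters are invented purely for demonstration, and multistage sampling is only indicated in a comment.

```python
# Illustrative sketch only: toy versions of the probability sampling methods
# listed above, using Python's standard library. The population, strata and
# cluster labels are made up for demonstration.
import random

random.seed(1)

# Hypothetical population of 100 students, each tagged with a year group
# (stratum) and a class code (cluster / primary sampling unit).
population = [
    {"id": i, "year": random.choice(["Y12", "Y13"]), "class": f"C{i % 10}"}
    for i in range(100)
]

# Systematic random sample: pick a random case from the first k, then every kth case.
def systematic_sample(pop, k):
    start = random.randrange(k)
    return pop[start::k]

# Stratified random sample: a simple random sample of fixed size from each stratum.
def stratified_sample(pop, stratum_key, n_per_stratum):
    strata = {}
    for unit in pop:
        strata.setdefault(unit[stratum_key], []).append(unit)
    return [unit for group in strata.values()
            for unit in random.sample(group, n_per_stratum)]

# Cluster sample: randomly choose whole clusters (PSUs) and keep every unit in them.
def cluster_sample(pop, cluster_key, n_clusters):
    clusters = sorted({unit[cluster_key] for unit in pop})
    chosen = set(random.sample(clusters, n_clusters))
    return [unit for unit in pop if unit[cluster_key] in chosen]

print(len(systematic_sample(population, k=10)))       # every 10th student
print(len(stratified_sample(population, "year", 5)))  # 5 from each year group
print(len(cluster_sample(population, "class", 2)))    # 2 whole classes
# Multistage sampling would combine these steps, e.g. sample clusters first,
# then take a random sample of students within each chosen cluster.
```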

Page 9: Polls and surveys

Random Sample Size
(Source: Dr. G. Johnson, www.ResearchDemsytified.org)

• Sample size is a function of three things:
– the size of the population of interest
– a decision about how important it is to be accurate (the confidence level)
– a decision about how important it is to be precise (the sampling error, also called the margin of error or confidence interval)
• In general, accuracy and precision are improved by increasing the sample size
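To make the precision decision concrete, the following sketch uses the standard large-sample approximations for a proportion: the margin of error for a given sample size, and the conservative sample size (assuming p = 0.5) needed to hit a target margin of error at a chosen confidence level. The example figures (1,000 respondents, a 3% target) are illustrative only.

```python
# Rough sketch of the usual approximations linking sample size, confidence
# level and margin of error for a proportion; the example numbers are
# illustrative, not taken from the slides.
from math import sqrt, ceil

Z = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}  # common z-values

def margin_of_error(n, p=0.5, confidence=0.95):
    """Approximate margin of error for a sample proportion p from n responses."""
    return Z[confidence] * sqrt(p * (1 - p) / n)

def sample_size_needed(moe, confidence=0.95):
    """Conservative sample size (worst case p = 0.5) for a target margin of error."""
    return ceil((Z[confidence] / moe) ** 2 * 0.25)

print(round(margin_of_error(1000), 3))  # ~0.031, i.e. about +/- 3%
print(sample_size_needed(0.03))         # ~1068 respondents for +/- 3% at 95% confidence
```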

Page 10: Polls and surveys

Questionnaire Design
(© 2005 Brooks/Cole, a division of Thomson Learning, Inc.)

Over the years, a lot of thought has been put into the science of designing survey questions. Key design principles:
1. Keep the questionnaire as short as possible.
2. Ask short, simple, and clearly worded questions.
3. Start with demographic questions to help respondents get started comfortably.
4. Use dichotomous (yes/no) and multiple-choice questions.
5. Use open-ended questions cautiously.
6. Avoid leading questions.
7. Pretest the questionnaire on a small number of people.
8. Think about how you intend to use the collected data when preparing the questionnaire.

Page 11: Polls and surveys

What is a poll?

A poll allows you to ask one (or a few) multiple-choice questions. Participants choose from among answers that you predefine. You can allow voters to select just one answer or to choose multiple answers. You also have the option of adding an "Other" field so that voters can enter their own answer.

Page 12: Polls and surveys

Example

Page 13: Polls and surveys

What type of driving licence do you hold?

1. full
2. restricted
3. learner
4. none

Page 14: Polls and surveys

Do you have a problem with this statement?

15% of New Zealanders have a restricted licence.

Page 15: Polls and surveys

Did you consider

1. target population, 2. sample, 3. random selection, 4. making an inference

Page 16: Polls and surveys

Target population

The target population is the complete group whose relevant characteristics are to be determined through sampling.

The target population should be clearly delineated where possible. For example, does "all pre-college students" include only primary and secondary students, or also students in other specialised educational institutions?

Page 17: Polls and surveys

Sampling Frame

The sampling frame is a list of the population elements from which the sample will be drawn.

Examples of sampling frames are a student telephone directory (for the student population), the list of companies on the stock exchange, the directory of medical doctors and specialists, the yellow pages (for businesses), and the electoral roll.

Often the list does not include the entire population. This discrepancy is a source of error associated with the selection of the sample (sampling frame error).

Page 18: Polls and surveys

Probability Sampling – Every element in the population under study has a known, non-zero probability of selection to the sample; in a simple random sample, every member of the population has an equal probability of being selected.

Non-Probability Sampling – An arbitrary means of selecting sampling units based on subjective considerations, such as personal judgment or convenience. It is less preferred than probability sampling.

Page 19: Polls and surveys

Give some examples of where you see this form of questioning

• TV polls, e.g. The Vote, Campbell Live, X Factor
• Radio polls
• Internet polls
• Phone polls

Page 20: Polls and surveys

What is a survey?

A survey allows you to ask multiple questions across a wider range of question types, so you can ask for a comment, an email address, a name, an address, etc., as well as asking multiple-choice questions.

Page 21: Polls and surveys

Example

Page 22: Polls and surveys

Sources of error in surveys:

• Sampling errors (from the random sampling process)
• Non-sampling errors:
– Selection bias
– Non-response bias
– Self-selection bias
– Question effects
– Behavioural considerations
– Interviewer effects
– Survey-format effects
– Transfer of findings

Page 23: Polls and surveys

Selection bias is a statistical bias in which there is an error in choosing the individuals or groups to take part in a study.

If selection bias is not taken into account, the conclusions drawn may be wrong.

Page 24: Polls and surveys

[Diagram] Selection bias: the population sampled is not (or may not be) exactly the population of interest.

• Target population (e.g. adults in NZ)
• Sampling frame (e.g. households with a landline phone) – part of the target population is not included in the sampling frame
• Sampled population – excludes those who are not eligible for the survey, cannot be contacted, refuse to respond, or are incapable of responding
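A small simulation can show how a landline-only sampling frame produces selection bias when landline ownership is related to the characteristic being measured. Everything in the sketch below (population size, ages, landline rates) is invented for illustration.

```python
# Illustrative sketch (not from the slides): sampling only from a landline
# frame when landline ownership is related to the quantity being measured.
import random

random.seed(3)

# Hypothetical adult population: age, plus whether the household has a landline.
# Older adults are assumed (for illustration only) to be more likely to have one.
population = []
for _ in range(10_000):
    age = random.randint(18, 90)
    has_landline = random.random() < (0.2 + 0.6 * (age - 18) / 72)
    population.append((age, has_landline))

frame = [age for age, landline in population if landline]  # landline sampling frame
sample = random.sample(frame, 500)                          # simple random sample from the frame

true_mean_age = sum(age for age, _ in population) / len(population)
sample_mean_age = sum(sample) / len(sample)

print(f"true mean age:    {true_mean_age:.1f}")
print(f"sampled mean age: {sample_mean_age:.1f}")  # older on average -> selection bias
```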

Page 25: Polls and surveys

Sources of Non-sampling Errors

Non-response bias: occurs when people who have been targeted to be surveyed do not respond.

Page 26: Polls and surveys

Non-response bias

Non-response bias occurs in statistical surveys if the answers of respondents differ from the potential answers of those who did not answer.

e.g. Non-respondents in an employment survey are likely to be those who work long hours.
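The effect of such non-response can be illustrated with a small simulation: if the probability of responding falls as weekly hours rise, the respondents' average understates the true average. All figures in the sketch below are made up for illustration.

```python
# Illustrative simulation (not from the slides): if people who work long hours
# are less likely to respond, the respondents' mean understates the population's
# true mean weekly hours. All numbers are invented.
import random

random.seed(2)

# Hypothetical population: weekly hours worked, roughly 25-60.
population = [random.uniform(25, 60) for _ in range(10_000)]

# Response probability drops as hours increase (long-hours workers are busier).
def responds(hours):
    return random.random() < max(0.1, 1 - (hours - 25) / 40)

respondents = [h for h in population if responds(h)]

true_mean = sum(population) / len(population)
survey_mean = sum(respondents) / len(respondents)

print(f"true mean hours:  {true_mean:.1f}")
print(f"respondent mean:  {survey_mean:.1f}")  # noticeably lower -> non-response bias
```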

Page 27: Polls and surveys

Sources of Non-sampling Errors

Self-selection bias: people decide for themselves whether or not to be surveyed.


Page 28: Polls and surveys

A poll suffering from self-selection bias is termed a self-selecting opinion poll, or "SLOP".

Most TV and radio polls are SLOPs.

Page 29: Polls and surveys

People's decision to participate may be correlated with traits that affect the study, making the participants a non-representative sample.

Page 30: Polls and surveys

Self-selection bias

Online and phone-in polls, for example, are self-selected. Individuals who are highly motivated to respond, typically those with strong opinions, are overrepresented, while individuals who are indifferent or apathetic are less likely to respond. This often leads to a polarisation of responses, with extreme perspectives given disproportionate weight in the summary.

Page 31: Polls and surveys

Behavioural considerations

People tend to answer questions in a way they consider to be socially desirable.

e.g. pregnant women being asked about their drinking habits may be reluctant to admit that they drink alcohol


Page 32: Polls and surveys

Interviewer effects

Different interviewers asking the same question can obtain different results.

e.g. the sex, race, religion, or manner of the interviewer may influence how people respond to a particular question.


Page 33: Polls and surveys

Famous example

The Bradley Effect

Page 34: Polls and surveys

A theory proposed to explain observed discrepancies between voter opinion polls and election outcomes in some United States government elections where a white candidate and a non-white candidate run against each other.

Page 35: Polls and surveys

The theory proposes that some voters will tell pollsters they are undecided or likely to vote for a black candidate, while on election day they vote for the white candidate.

Page 36: Polls and surveys

It was named after Los Angeles Mayor Tom Bradley, an African-American who lost the 1982 California governor's race despite being ahead in voter polls going into the election.

Page 37: Polls and surveys

The Bradley effect theory posits that the inaccurate polls were skewed by the phenomenon of social desirability bias. Specifically, some white voters give inaccurate polling responses for fear that, by stating their true preference, they will open themselves to criticism of racial motivation.

Page 38: Polls and surveys

Members of the public may feel under pressure to provide an answer that is deemed to be more publicly acceptable, or 'politically correct'.

Page 39: Polls and surveys

Interviewer Effects in Racial Questions

In 1968, one year after a major racial disturbance in Detroit, a sample of black residents was asked: "Do you personally feel that you trust most white people, some white people, or none at all?"

• With a white interviewer: 35% answered "most"
• With a black interviewer: 7% answered "most"

Page 40: Polls and surveys

Interviewer error

Interviewer errors arise when:
• different interviewers administer a survey in different ways
• respondents react differently to different interviewers, e.g. to interviewers of their own sex or own ethnic group
• interviewers are inadequately trained
• inadequate attention is paid to the selection of interviewers
• the workload for the interviewer is too high

Page 41: Polls and surveys

Respondent behaviours

Errors arise when:
• the respondent gives an incorrect answer, e.g. due to prestige or competence implications, or due to the sensitivity or social undesirability of the question
• the respondent misunderstands the requirements
• the respondent lacks motivation to give an accurate answer
• a "lazy" respondent gives an "average" answer
• the question requires memory/recall

Page 42: Polls and surveys

Instrument or question errors

Instrument or question errors arise when:
• the question is unclear, ambiguous, or difficult to answer
• the list of possible answers suggested in the recording instrument is incomplete
• the requested information assumes a framework unfamiliar to the respondent
• the definitions used by the survey are different from those used by the respondent (e.g. "How many part-time employees do you have?" – see the next slide for an example)

Page 43: Polls and surveys

The following example is from Ruddock (1998)

In the Short Term Employment Survey (STES) conducted by the Office for National Statistics (ONS) in the UK, data are collected on the numbers of full-time and part-time employees on a given reference date.

Some firms ignored the reference date and gave figures for employees paid at the end of the month, thus including those who joined and those who left in that month – leading to an over-estimate.

Firms found it difficult to give details of part-time employees as their definition of “part-time” did not agree with that used by ONS.

Page 44: Polls and surveys

Survey effects

– Question order, e.g.:
"To what extent do you think teenagers are affected by peer pressure when drinking alcohol?"
followed by:
"Name the top 5 peer pressures you think teenagers face today."

Page 45: Polls and surveys

Survey effects

– survey layout
– whether the survey is conducted by phone, in person, or by mail

Page 46: Polls and surveys

Transfer of findings: taking the data from one population and transferring the results to another.

e.g. Auckland opinions may not be a good indication of New Zealand opinions.


Page 47: Polls and surveys

Non-sampling Errors

• can be much larger than sampling errors

• are always present

• can be virtually impossible to correct for after the completion of the survey

• it is virtually impossible to determine how badly they will affect the result

• good surveys try to minimize them in the design of the survey (e.g. by doing a pilot survey first)

Page 48: Polls and surveys

Questioning in polls

Page 49: Polls and surveys

Consider Wording

Be aware that the wording of a question influences the answers.

Examples:

"Is our government providing too much money for welfare programs?"
– 44% said "yes"

"Is our government providing too much money for assistance to the poor?"
– 13% said "yes"

Page 50: Polls and surveys

18 August 1980 New York Times/CBS News Poll

“Do you think there should be an amendment to the constitution prohibiting abortions?”

Yes 29% No 62%

Later the same people were asked: "Do you think there should be an amendment to the constitution protecting the life of the unborn child?"

Yes 50% No 39%

Page 51: Polls and surveys

Question Effects in the NZ Census

1986: "What is your ethnic origin?" (Tick the box or boxes which apply to you.)
1991: "Which ethnic group do you belong to?" (Tick the box or boxes which apply to you.)
1996: "Tick as many circles as you need to show which ethnic group(s) you belong to."

Ethnicity (% of responses)   1986   1991   1996
Single ethnicity             94.6   94.3   81.0
  European                   81.2   78.1   65.8
  Maori                       9.1    9.6    7.6
Two ethnicities               4.0    4.5   11.2
  European & Maori            2.9    2.7    4.7
  Two European groups         0.0    0.6    4.5

Page 52: Polls and surveys

Question effects

Polls have been asking the following (or similar) question regularly since the 1960s and '70s:

"If a hopelessly ill patient, in great pain, with absolutely no chance of recovering, asks for a lethal dose, so as not to wake again, should the doctor be allowed to give the lethal dose?”

Page 53: Polls and surveys

Questions asked influence the answers given

"If a hopelessly ill patient, in great pain, with absolutely no chance of recovering, asks for a lethal dose, so as not to wake again, should the doctor be allowed to give the lethal dose?”

It would be hard for an uninformed person to say "no" to that question without feeling negligent, dogmatic or insensitive.

Page 54: Polls and surveys

But when the current ability of good palliative care to relieve the severe pain of terminal illness is known (though it is also, tragically, known not to be sufficiently available), the same question could be put more accurately:

Page 55: Polls and surveys

"If a doctor is so negligent as to leave a terminally-ill patient in pain, severe enough to drive him / her to ask to be killed, should the doctor be able to compound that negligence by killing the patient, instead of seeking help?"

The question is really about medical standards, not euthanasia.

Page 56: Polls and surveys

But whose opinion should count?

Public opinion polls are reported as showing strong public support for the legalisation of assisted suicide and/or euthanasia.

Page 57: Polls and surveys

Consider the following

Page 58: Polls and surveys

Support is lower if the word 'suicide' is used.

Page 59: Polls and surveys

Why might this be important?

When polls are conducted, they are seldom targeted at elderly or disabled people. For the most part the questions are aimed at relatively young and healthy people who are horrified at the idea of growing old and feeble, or not even so old but terribly sick, losing their faculties, their mobility or their mind (reverting to the helplessness of infancy).

Page 60: Polls and surveys

As such, they often favour legalising euthanasia and assisted suicide because they can't imagine wanting to live with disability or infirmity if they were in that position.

Page 61: Polls and surveys

Disability activists point out that, while many might wish to die in the early weeks and months following a disability, once they come to terms with their condition they are thankful that no one took them at their word when they were most vulnerable.

Page 62: Polls and surveys

Should we be listening to everyone or those who are most affected?

• Euthanasia is NOT generally supported by people suffering a terminal illness.

• Euthanasia is NOT generally supported by the aged population.

• Euthanasia IS opposed by nearly all groups representing the terminally ill.

• Euthanasia IS opposed by nearly all groups representing the elderly.

Page 63: Polls and surveys

Questions to Ask Before You Believe a Poll

• Who carried out the survey?
• What was the population?
• How was the sample selected?
• How large was the sample?
• What was the margin of error?
• What was the response rate?
• How were the subjects contacted?
• When was the survey conducted?
• What questions were asked?

Page 64: Polls and surveys

A report on a sample survey/poll should include:
• who carried it out and who funded it
• the target population (population of interest)
• the sample selection method
• the sample size and the margin of error
• the date of the survey
• the exact question(s) asked
• the results
• the claims (inferences) made