
Business Research Methods

Survey Research

Surveys

Surveys ask respondents for information using verbal or written questioning

Respondents are a representative sample of people

Gathering Information via Surveys
Advantages: quick, inexpensive, efficient, accurate, flexible
Problems: poor design, improper execution

Tree Diagram of Total Survey Error: Total error = Random sampling error + Systematic error (bias)

Random Sampling Error: A statistical fluctuation that occurs because of chance variation in the elements selected for the sample.
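
Because random sampling error is purely a matter of chance selection, it can be illustrated with a short simulation. The sketch below is for illustration only; the synthetic population, sample sizes, and number of repeated samples are invented. It shows sample means scattering around the true population mean, with the scatter shrinking as the sample size grows.

```python
# Illustration of random sampling error with an invented population.
# Repeated random samples give means that fluctuate around the true mean,
# and the fluctuation shrinks as the sample size n grows.
import random
import statistics

random.seed(7)
population = [random.gauss(50, 10) for _ in range(100_000)]  # hypothetical measurements
true_mean = statistics.mean(population)

for n in (25, 100, 400):
    sample_means = [statistics.mean(random.sample(population, n)) for _ in range(200)]
    print(f"n={n:3d}  true mean={true_mean:6.2f}  "
          f"mean of sample means={statistics.mean(sample_means):6.2f}  "
          f"spread of sample means={statistics.stdev(sample_means):5.2f}")
```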

Systematic Error: Results from some imperfect aspect of the research design or from a mistake in the execution of the research.

Tree Diagram of Total Survey Error: Systematic error (bias) = Administrative error + Respondent error

Sample Bias

Sample bias - when the results of a sample show a persistent tendency to deviate in one direction from the true value of the population parameter

Tree Diagram of Total Survey Error: Respondent error = Nonresponse error + Response bias

Respondent Error: A classification of sample bias resulting from some respondent action or inaction. Its two forms are nonresponse bias and response bias.

Nonresponse Error
Nonrespondents: people who refuse to cooperate; not-at-homes
Self-selection bias: over-represents extreme positions and under-represents indifference

Tree Diagram of Total Survey Error: Response bias = Unconscious misrepresentation + Deliberate falsification

Response Bias

A bias that occurs when respondents tend to answer questions with a certain slant that consciously or unconsciously misrepresents the truth

Tree Diagram of Total Survey Error: Response bias = Acquiescence bias + Extremity bias + Interviewer bias + Auspices bias + Social desirability bias

A category of response bias that results because some individuals tend to agree with all questions or to concur with a particular position.

Acquiescence Bias

A category of response bias that results because response styles vary from person to person; some individuals tend to use extremes when responding to questions.

Extremity Bias

A response bias that occurs because the presence of the interviewer influences answers.

Interviewer Bias

Auspices Bias: Bias in the responses of subjects caused by the respondents being influenced by the organization conducting the study.

Social Desirability Bias: Bias in responses caused by respondents' desire, either conscious or unconscious, to gain prestige or appear in a different social role.

Tree Diagram of Total Survey Error: Systematic error (bias) = Administrative error + Respondent error

Administrative Error: Improper administration of the research task; blunders arising from confusion, neglect, or omission.

Tree Diagram of Total Survey Error: Administrative error = Data processing error + Sample selection error + Interviewer error + Interviewer cheating

Administrative Error
Interviewer cheating: filling in fake answers or falsifying interviews
Data processing error: incorrect data entry, computer programming, or other procedural errors during the analysis stage
Sample selection error: improper sample design or sampling procedure execution
Interviewer error: field mistakes

Classifying Survey Research Methods: by method of communication, by structured and disguised questions, and by temporal classification.

Time Period for Surveys: cross-sectional or longitudinal

Cross-Sectional Study: A study in which various segments of a population are sampled; data are collected at a single moment in time.

Longitudinal Study: A survey of respondents at different times, thus allowing analysis of changes over time.

Tracking study: compares trends and identifies changes, for example in consumer satisfaction.

Consumer Panel

A longitudinal survey of the same sample of individuals or households to record (in a diary) their attitudes, behavior, or purchasing habits over time.

Total quality management - A business philosophy that emphasizes market-driven quality as a top organizational priority.

Stages in Tracking Quality Improvement

Commitment and Exploration
Benchmarking
Initial Quality Improvement
Continuous Quality Improvement

Stages in Tracking Quality Improvement

Commitment and Exploration Stage: Management makes a commitment to total quality assurance. Business researchers explore external customers' needs and problems, and internal customers' needs, beliefs, and motivations.

Benchmarking Stage: Research establishes quantitative measures as benchmarks or points of comparison, such as overall satisfaction, quality ratings of specific attributes, and employees' actual performance and perceptions.

Initial Quality Improvement Stage: Tracking wave 1 measures trends. Establishes a quality improvement process within the organization, translates quality issues into the internal vocabulary of the organization, and establishes performance standards and expectations for improvement.

Continuous Quality Improvement: Consists of many consecutive waves with the same purpose: to improve over the previous period. Quality improvement management continues.

Determinants of the Quality of Goods: performance, features, conformance with specifications, reliability, durability, serviceability, aesthetic design

Determinants of Service Quality: access, communication, competence, courtesy, reliability, credibility

Business Research Methods

Survey Research: Basic Communication Methods

Surveys

Surveys ask respondents for information using verbal or written questioning

Communicating with Respondents
Personal interviews: door-to-door, shopping mall intercepts
Telephone interviews
Self-administered questionnaires

Personal Interviews

Good afternoon, my name is _________. I am with _________ survey research company. We are conducting a survey on _________.

Door-to-Door Personal Interview
Speed of data collection: moderate to fast
Geographical flexibility: limited to moderate
Respondent cooperation: excellent
Versatility of questioning: quite versatile
Questionnaire length: long
Item nonresponse: low
Possibility of respondent misunderstanding: lowest
Degree of interviewer influence on answers: high
Supervision of interviewers: moderate
Anonymity of respondent: low
Ease of callback or follow-up: difficult
Cost: highest
Special features: visual materials may be shown or demonstrated; extended probing possible

Mall Intercept Personal Interview
Speed of data collection: fast
Geographical flexibility: confined, urban bias
Respondent cooperation: moderate to low
Versatility of questioning: extremely versatile
Questionnaire length: moderate to long
Item nonresponse: medium
Possibility of respondent misunderstanding: lowest
Degree of interviewer influence on answers: highest
Supervision of interviewers: moderate to high
Anonymity of respondent: low
Ease of callback or follow-up: difficult
Cost: moderate to high
Special features: taste tests and viewing of TV commercials possible

Telephone Surveys

Speed of data collection: very fast
Geographical flexibility: high
Respondent cooperation: good
Versatility of questioning: moderate
Questionnaire length: moderate
Item nonresponse: medium
Possibility of respondent misunderstanding: average
Degree of interviewer influence on answers: moderate
Supervision of interviewers: high, especially with central-location WATS interviewing
Anonymity of respondent: moderate
Ease of callback or follow-up: easy
Cost: low to moderate
Special features: fieldwork and supervision of data collection are simplified; quite adaptable to computer technology

Telephone survey variations: central location interviewing, computer-assisted telephone interviewing, computerized voice-activated interviews

Most Unlisted Markets: Sacramento, CA; Oakland, CA; Fresno, CA; Los Angeles/Long Beach, CA
(Source: The Frame, November 2001, published by Survey Sampling, Inc.)

Self-Administered Questionnaires
Paper questionnaires: mail, in-person drop-off, inserts, fax
Electronic questionnaires: e-mail, Internet website, kiosk

Self-Administered Questionnaires

Mail Surveys

Speed of data collection: researcher has no control over return of questionnaire; slow
Geographical flexibility: high
Respondent cooperation: moderate; a poorly designed questionnaire will have a low response rate
Versatility of questioning: highly standardized format
Questionnaire length: varies depending on incentive
Item nonresponse: high
Possibility of respondent misunderstanding: highest; no interviewer present for clarification
Degree of interviewer influence on answers: none; interviewer absent
Supervision of interviewers: not applicable
Anonymity of respondent: high
Ease of callback or follow-up: easy, but takes time
Cost: lowest

How to Increase Response Rates for Mail Surveys
• Write a "sales-oriented" cover letter
• Money helps
  - As a token of appreciation
  - For a charity
• Stimulate respondents' interest with interesting questions
• Follow up
  - Keying questionnaires with codes
• Advance notification
• Sponsorship by a well-known and prestigious institution

Increasing Response Rates: effective cover letter, money, interesting questions, follow-ups, advance notification, survey sponsorship, keying questionnaires

E-Mail Questionnaire Surveys
Speed of data collection: instantaneous
Geographic flexibility: worldwide
Cheaper distribution and processing costs
Flexible, but extensive differences in the capabilities of respondents' computers and e-mail software limit the types of questions and the layout
E-mails are not secure, and "eavesdropping" can possibly occur
Respondent cooperation: varies depending on whether the e-mail is seen as "spam"

A self-administered questionnaire posted on a Web site.

Respondents provide answers to questions displayed online by highlighting a phrase, clicking an icon, or keying in an answer.

Internet Surveys

Speed of data collection: instantaneous
Cost effective
Geographic flexibility: worldwide
Visual and interactive

Respondent cooperation: varies depending on the web site and the type of sample. When users do not opt in or expect a voluntary survey, cooperation is low. Self-selection problems arise in web-site visitation surveys: participants tend to be more deeply involved than the average person.

Versatility of questioning: extremely versatile
Questionnaire length: individualized based on respondent answers; longer questionnaires possible with panel samples
Item nonresponse: software can assure none

Representative samples: the quality of internet samples may vary substantially. A sample of those who visit a web page and voluntarily fill out a questionnaire can have self-selection error. In addition: (1) not all individuals in the general public have internet access; (2) many respondents lack powerful computers with high-speed connections to the internet; (3) many respondents' computer skills are relatively unsophisticated.

Possibility of respondent misunderstanding: high
Degree of interviewer influence on answers: none
Supervision of interviewers: not required
Anonymity of respondent: respondent can be anonymous or known
Ease of callback or follow-up: difficult unless e-mail address is known
Special features: allows graphics and streaming media

Welcome Screen
The welcome screen is like a cover letter. It contains the name of the research company and how to contact the organization if there is a problem or concern.

"If you have any concerns or questions about this survey, or if you experience any technical difficulties, please contact (NAME OF RESEARCH ORGANIZATION)."

The welcome screen should ask for a password and give instructions:

"Please enter your personal password from your invitation. Then, press the 'enter' key to begin the survey, or simply click on the right arrow at the bottom of the page to begin the survey (after you have read the remaining instructions)."

During the survey, please do not use your browser's FORWARD and BACK buttons.

Use the arrows on the lower right to move backward and forward through the survey.

There is no best form of survey; each has advantages and disadvantages.

Selected Questions to Determine the Appropriate Technique
Is the assistance of an interviewer necessary?
Are respondents interested in the issues being investigated?
Will cooperation be easily attained?

Selected Questions to Determine the Appropriate Technique

How quickly is the information needed?

Will the study require a long and complex questionnaire?

How large is the budget?

Pre-testing: A trial run with a group of respondents to iron out fundamental problems in the instructions or design of the survey.

“Practice is the best of all instructors.”

Business Research Methods

Measurement

Concept

A generalized idea about a class of objects, attributes, occurrences, or processes

Operational Definition

Specifies what the researcher must do to measure the concept under investigation

Media skepticism - the degree to which individuals are skeptical toward the reality presented in the mass media. Media skepticism varies across individuals, from those who are mildly skeptical and accept most of what they see and hear in the media to those who completely discount and disbelieve the facts, values, and portrayal of reality in the media.

Media Skepticism: Conceptual Definition (above)

Media Skepticism: Operational Definition

Please tell me how true each statement is about the media. Is it very true, not very true, or not at all true?
1. The program was not very accurate in its portrayal of the problem.
2. Most of the story was staged for entertainment purposes.
3. The presentation was slanted and unfair.
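
As a sketch of how an operational definition like this turns verbal answers into data, the code below assigns numeric codes to the three response options and sums them into a skepticism score. The 3/2/1 coding and the example answers are assumptions made for illustration, not part of the original instrument.

```python
# Hypothetical scoring of the three media-skepticism items above.
# The 3/2/1 coding and the sample answers are assumed for illustration.
CODES = {"very true": 3, "not very true": 2, "not at all true": 1}

def media_skepticism_score(answers):
    """Sum the coded responses (higher = more skeptical of the media)."""
    return sum(CODES[a.lower()] for a in answers)

respondent = ["Very true", "Not very true", "Not at all true"]
print(media_skepticism_score(respondent))  # 6, on a possible range of 3 to 9
```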

Scale: A series of items arranged according to value for the purpose of quantification; a continuous spectrum.

Nominal Scale

Ordinal Scale

Interval Scale

Ratio Scale

Scale Properties: uniquely classifies, preserves order, equal intervals, natural zero

Nominal Scale Properties: uniquely classifies. Example: Sammy Sosa #21, Barry Bonds #25.

Ordinal Scale Properties: uniquely classifies, preserves order. Example: win, place, and show.

Interval Scale Properties: uniquely classifies, preserves order, equal intervals. Examples: Consumer Price Index (base 100), Fahrenheit temperature.

Ratio Scale Properties: uniquely classifies, preserves order, equal intervals, natural zero. Examples: weight and distance.
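
A compact way to keep the four levels straight is to list the descriptive statistics usually treated as meaningful at each level. The mapping below is a rough, assumption-based summary added for illustration rather than content taken from the lecture.

```python
# Rough guide to statistics usually considered meaningful at each scale level.
PERMISSIBLE_STATS = {
    "nominal":  ["mode", "frequency counts"],
    "ordinal":  ["mode", "frequency counts", "median", "percentiles"],
    "interval": ["mode", "frequency counts", "median", "percentiles",
                 "mean", "standard deviation"],
    "ratio":    ["mode", "frequency counts", "median", "percentiles",
                 "mean", "standard deviation", "meaningful ratios (e.g. twice as heavy)"],
}

for scale, stats in PERMISSIBLE_STATS.items():
    print(f"{scale:>8}: {', '.join(stats)}")
```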

Index Measures

Attributes: A single characteristic or fundamental feature that pertains to an object, person, or issue.

Composite measure: A measure that combines several variables to capture a single concept; a multi-item instrument.

The Goal of Measurement

Validity: The ability of a scale to measure what was intended to be measured.

Reliability: The degree to which measures are free from random error and therefore yield consistent results.

Reliability and Validity on Target (figure): Target A, old rifle: low reliability. Target B, new rifle: high reliability. Target C, new rifle with sun glare: reliable but not valid.

Validity (diagram): face or content validity; criterion validity (concurrent and predictive); construct validity.

Reliability (diagram): stability (test-retest); internal consistency (equivalent forms, splitting halves).
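
The splitting-halves idea can be sketched numerically: score two halves of a multi-item instrument, correlate them, and step the correlation up to a full-test estimate with the Spearman-Brown formula, r_full = 2r / (1 + r). The item responses below are invented, and the sketch assumes equal-length halves.

```python
# Split-half reliability sketch with invented data (5 respondents, 6 items scored 1-5).
import statistics

def pearson_r(x, y):
    """Pearson correlation of two equal-length lists."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return num / (len(x) * statistics.pstdev(x) * statistics.pstdev(y))

responses = [            # rows = respondents, columns = items (made-up scores)
    [4, 5, 4, 5, 3, 4],
    [2, 1, 2, 2, 1, 2],
    [5, 4, 5, 4, 5, 5],
    [3, 3, 2, 3, 3, 2],
    [1, 2, 1, 1, 2, 1],
]
odd_half  = [sum(row[0::2]) for row in responses]   # items 1, 3, 5
even_half = [sum(row[1::2]) for row in responses]   # items 2, 4, 6

r = pearson_r(odd_half, even_half)
full_test = 2 * r / (1 + r)                         # Spearman-Brown step-up
print(f"half-test correlation = {r:.2f}, estimated full-test reliability = {full_test:.2f}")
```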

Sensitivity: A measurement instrument's ability to accurately measure variability in stimuli or responses.

Attitude Measurement

Business Research Methods

Attitude

An enduring disposition to consistently respond in a given manner

Attitudes as Hypothetical Constructs

The term hypothetical construct is used to describe a variable that is not directly observable, but is measurable by an indirect means such as verbal expression or overt behavior - attitudes are considered to be such variables.

Three Components of an Attitude

Affective Cognitive Behavioral

Affective: the feelings or emotions toward an object

Cognitive: knowledge and beliefs

Behavioral: predisposition to action, intentions, behavioral expectations

Measuring Attitudes: ranking, rating, sorting, choice

The Attitude Measuring Process

Ranking - Rank order preference

Rating - Estimates magnitude of a characteristic

Sorting - Arrange or classify concepts

Choice - Selection of preferred alternative

Ranking tasks require that the respondent rank order a small number of objects in overall performance on the basis of some characteristic or stimulus.

Rating asks the respondent to estimate the magnitude of a characteristic, or quality, that an object possesses. The respondent’s position on a scale(s) is where he or she would rate an object.

Sorting might present the respondent with several concepts typed on cards and require that the respondent arrange the cards into a number of piles or otherwise classify the concepts.

Choice between two or more alternatives is another type of attitude measurement - it is assumed that the chosen object is preferred over the other.

Physiological measures of attitudes provide a means of measuring attitudes without verbally questioning the respondent, for example galvanic skin response and blood pressure measures.

Simple Attitude Scaling: In its most basic form, attitude scaling requires that an individual agree with a statement or respond to a single question. This type of self-rating scale merely classifies respondents into one of two categories.

Simplified Scaling Example

THE PRESIDENT SHOULD RUN FOR RE-ELECTION

_______ AGREE ______ DISAGREE

Category Scales: A category scale is a more sensitive measure than a scale having only two response categories; it provides more information. Question wording is an extremely important factor in the usefulness of these scales.

Example of Category Scale: How important were the following in your decision to visit San Diego? (Check one for each item: Very Important, Somewhat Important, or Not Too Important.)

Climate
Cost of travel
Family oriented
Educational/historical aspects
Familiarity with area

An extremely popular means for measuring attitudes. Respondents indicate their own attitudes by checking how strongly they agree or disagree with statements. Response alternatives: “strongly agree”, “agree”, “uncertain”, “disagree”, and “strongly disagree”.

Method of Summated Ratings: The Likert Scale

Likert Scale for Measuring Attitudes Toward Tennis

It is more fun to play a tough, competitive tennis match than to play an easy one.
___Strongly Agree ___Agree ___Not Sure ___Disagree ___Strongly Disagree

There is really no such thing as a tennis stroke that cannot be mastered.
___Strongly Agree ___Agree ___Not Sure ___Disagree ___Strongly Disagree

Likert Scale for Measuring Attitudes Toward Tennis

Playing tennis is a great way to exercise.

___Strongly Agree___Agree ___Not Sure ___Disagree ___Strongly Disagree

Likert Scale for Measuring Attitudes Toward Tennis
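
A minimal sketch of the summated-ratings idea behind the Likert scale: code each response, reverse-score any negatively worded items, and sum. The 5-to-1 coding and the sample answers to the three tennis items are assumed conventions for illustration, not taken from the example above.

```python
# Summated Likert score for a short set of items such as the tennis statements above.
# The strongly agree = 5 ... strongly disagree = 1 coding is an assumed convention.
SCORES = {"strongly agree": 5, "agree": 4, "not sure": 3,
          "disagree": 2, "strongly disagree": 1}

def summated_score(answers, reverse_items=()):
    """Sum item codes; reverse-score any negatively worded items."""
    total = 0
    for i, answer in enumerate(answers):
        code = SCORES[answer.lower()]
        if i in reverse_items:      # flip the coding for an unfavorable statement
            code = 6 - code
        total += code
    return total

answers = ["Agree", "Strongly Agree", "Not Sure"]   # made-up respondent
print(summated_score(answers))                      # 12, on a possible range of 3 to 15
```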

Semantic Differential: A series of seven-point bipolar rating scales. Bipolar adjectives, such as "good" and "bad", anchor both ends (or poles) of the scale.

Semantic Differential

A weight is assigned to each position on the rating scale. Traditionally, scores are 7, 6, 5, 4, 3, 2, 1, or +3, +2, +1, 0, -1, -2, -3.

Semantic Differential Scales for Measuring Attitudes Toward Tennis

Exciting ___ : ___ : ___ : ___ : ___ : ___ : ___ Calm

Interesting ___ : ___ : ___ : ___ : ___ : ___ : ___ Dull

Simple ___ : ___ : ___ : ___ : ___ : ___ : ___ Complex

Passive ___ : ___ : ___ : ___ : ___ : ___ : ___ Active
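
To show how the +3 to -3 weighting mentioned above might be applied, the sketch below converts a checked position (1 through 7, counted from the left-hand adjective) into a score. Which pole of each pair counts as favorable, and the example responses, are assumptions made only for this illustration.

```python
# Semantic-differential scoring sketch: positions 1..7 mapped to +3..-3.
# Treating the left-hand adjective as favorable (or not) is assumed per pair.
def position_to_weight(position, favorable_pole="left"):
    """Convert a checked position (1..7) to a score from +3 to -3."""
    weight = 4 - position            # 1 -> +3, 4 -> 0, 7 -> -3
    return weight if favorable_pole == "left" else -weight

profile   = {"Exciting-Calm": 2, "Interesting-Dull": 1,
             "Simple-Complex": 5, "Passive-Active": 6}   # made-up responses
favorable = {"Exciting-Calm": "left", "Interesting-Dull": "left",
             "Simple-Complex": "left", "Passive-Active": "right"}

scores = {pair: position_to_weight(pos, favorable[pair]) for pair, pos in profile.items()}
print(scores)
print("overall image score:", sum(scores.values()))
```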

Numerical Scales: Numerical scales have numbers as response options, rather than "semantic space" or verbal descriptions, to identify categories (response positions).

Stapel Scales: Modern versions of the Stapel scale place a single adjective as a substitute for the semantic differential when it is difficult to create pairs of bipolar adjectives. The advantages and disadvantages of a Stapel scale, as well as the results, are very similar to those for a semantic differential. However, the Stapel scale tends to be easier to conduct and administer.

A Stapel Scale for Measuring a Store's Image

(Department Store Name)
+3
+2
+1
Wide Selection
-1
-2
-3

Select a plus number for words that you think describe the store accurately. The more accurately you think the word describes the store, the larger the plus number you should choose. Select a minus number for words you think do not describe the store accurately. The less accurately you think the word describes the store, the larger the minus number you should choose. Therefore, you can select any number from +3 for words that you think are very accurate all the way to -3 for words that you think are very inaccurate.

The behavioral differential instrument has been developed for measuring the behavioral intentions of subjects towards any object or category of objects. A description of the object to be judged is placed on the top of a sheet, and the subjects indicate their behavioral intentions toward this object on a series of scales. For example:

A 25-year-old woman sales representative
Would ___ : ___ : ___ : ___ : ___ : ___ : ___ Would Not

Ask this person for advice.

Behavioral Differential

Paired Comparisons: In paired comparisons, the respondents are presented with two objects at a time and asked to pick the one they prefer. Ranking objects with respect to one attribute is not difficult if only a few products are compared, but as the number of items increases, the number of comparisons increases geometrically, n(n - 1)/2. If the number of comparisons is too great, respondents may fatigue and no longer carefully discriminate among them.
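
The geometric growth noted above is easy to check: n items produce n(n - 1)/2 distinct pairs. The brand names in the sketch below are placeholders.

```python
# Number of paired comparisons grows as n * (n - 1) / 2.
from itertools import combinations

brands = ["Brand A", "Brand B", "Brand C", "Brand D", "Brand E"]  # placeholder names
pairs = list(combinations(brands, 2))
n = len(brands)
assert len(pairs) == n * (n - 1) // 2    # 5 items -> 10 comparisons
print(f"{n} items -> {len(pairs)} paired comparisons")
for a, b in pairs:
    print(f"Which do you prefer: {a} or {b}?")
```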

Divide 100 points among each of the following brands according to your preference for the brand:

Brand A _________

Brand B _________

Brand C _________
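
A small sketch of how a points-allocation (constant-sum) response like the one above might be checked and converted to preference shares; the brand point values are invented for illustration.

```python
# Constant-sum check: the allocated points must total 100.
# The brand allocations are invented for illustration.
allocation = {"Brand A": 50, "Brand B": 30, "Brand C": 20}

total = sum(allocation.values())
if total != 100:
    raise ValueError(f"points must sum to 100, got {total}")

shares = {brand: points / total for brand, points in allocation.items()}
print(shares)   # {'Brand A': 0.5, 'Brand B': 0.3, 'Brand C': 0.2}
```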

Graphic Rating Scales

A graphic rating scale presents respondents with a graphic continuum.

(Graphic continuum scored from 3, "Very Good", to 1, "Very Poor")

Graphic Rating Scale Stressing Pictorial Visual Communications

Monadic Rating Scale

A Monadic Rating Scale asks about a single concept

Now that you’ve had your automobile for about 1 year, please tell us how satisfied you are with its engine power and pickup.

Completely Satisfied    Very Satisfied    Fairly Well Satisfied    Somewhat Dissatisfied    Very Dissatisfied

A Comparative Rating Scale

A Comparative Rating Scale asks respondents to rate a concept by comparing it with a benchmark

Please indicate how the amount of authority in your present position compares with the amount of authority that would be ideal for this position.

TOO MUCH ABOUT RIGHT TOO LITTLE

An Unbalanced Scale
An unbalanced scale has more responses distributed at one end of the scale.

How satisfied are you with the bookstore in the Student Union?

Quite Satisfied    Satisfied    Neither Satisfied Nor Dissatisfied    Very Dissatisfied

Examples of Bias in Questions

Unfair Alternatives
Bad: Some people say that the city is spending too much on building new public schools. Do you agree or disagree?
Improved: Some people say that the city is spending too much on building public schools ... and others say the city is not spending enough. With which opinion do you agree?

Examples of Bias in Questions

Maligning the Other Side
Bad: Do you think the government should spend more of our tax money on the slums?
Improved: Do you think the government should spend more ... or less money on replacing the slum neighborhoods in the city with new housing projects?

Examples of Bias in Questions

Damning with Faint Praise
Bad: Some people say that the Mayor's plan is a poor plan to solve garbage removal problems in the city. Others say it will do for now until a better solution is found. Do you think it is a good plan or a poor plan?
Improved: Some people favor and some oppose the plan for combined garbage and trash removal by the city. Do you think the plan is a good solution ... or a poor solution to the garbage removal problem?

Examples of Bias in Questions

Deliberately Omitting Names
Bad: Hello ... I'm conducting a poll for Sam Snide, a candidate for mayor of the city. If the election were held today, whom would you vote for ... Mr. Snide, or one of the other candidates?
Improved: I am conducting a survey on the mayoral election...

Examples of Bias in Questions

Inappropriate Use of Titles
Bad: State Attorney General Allen P. Mutt is running for governor this year against Tom L. Jeff. Which man, Mutt or Jeff, is best qualified to be governor?
Improved: Allen P. Mutt and Tom L. Jeff are running for governor this year. Which man, Mutt or Jeff, is best qualified to be governor?

Examples of Bias in Questions

Personalities
Bad: Would you say that Governor Hunt's energy program for promoting solar heating of private homes has been very effective, fairly effective, not too effective, or not effective at all?
Improved: Would you say that the state energy program for promoting solar heating of private homes has been very effective, fairly effective, not too effective, or not effective at all?

Examples of Bias in Questions

Emotionally Charged Words
Bad: Congressman Porkbarrel has been accused of defrauding the voters of this district. Do you agree or disagree with that charge?
Improvement: One of the issues in this campaign is how well Congressman Porkbarrel has carried out his campaign promises. Do you think that Porkbarrel has done an excellent, good, poor, or very poor job of doing what he said he would do?

Examples of Bias in Questions

Conditioned by Context
Q1: Tom Fetzer is mayor of the city. In your opinion is he doing a good ... or poor job as mayor?
Q2: As you understand it, what are the mayor's principal duties in office?
Note: the response to the second question is conditioned by the response to the first; it would be better to reverse the order or ask only one.

Examples of Bias in Questions

Embarrassing Questions
Bad: How much time did you spend reading the newspaper yesterday?
Improvement: Did you have a chance to read the newspaper yesterday? (IF YES: About how much time did you spend reading the newspaper yesterday?)
Bad: What is your religion?
Improvement: Do you happen to have a religious preference? (IF YES: What is your religious preference?)

Examples of Bias in Questions

Embarrassing Questions
Bad: Did you vote in the city election last month?
Improvement: Did you happen to vote in the city election last month, or didn't you have a chance to vote?
Bad: How old are you?
Improvement: In what year were you born?


Examples of Bias in Questions

Illogical Sentence Construction
Bad: Some people say that Senator Helms is doing an excellent job in office, and some people say he is doing a very poor job. What kind of job do you think Senator Helms is doing ... excellent, good, poor, or very poor?
Improvement: Would you say that Helms is doing an excellent, good, poor, or very poor job as United States Senator?

Examples of Bias in Questions

Two-part Questions
Bad: Do you think that Mayor Booth should run for re-election this year, or could the Democrats find a stronger candidate?
Improvement:
Q1: Do you think that Mayor Booth should or should not run for re-election this year?
Q2: Do you think the Democrats could or could not find a stronger candidate than Mayor Booth this year?

Examples of Bias in Questions

Ambiguous Questions
Bad: Did you vote in the last election?
Improvement: Did you vote in the city election for Mayor last June?
Bad: Are you in favor of a larger government role in housing and the environment?
Improvement:
Q1: Are you in favor of a larger role for the federal government in the environment?
Q2: Are you in favor of a larger role for the federal government in housing?

Examples of Bias in Questions

Indefinite Persons or Places
Bad: Are there many voters living around here?
Improvement: Of the people you personally know living on Apple Street between 34th and 35th Avenues, about how many do you know to be registered to vote ... would you say nearly all, maybe about three-quarters, about half ... or less than half?

Examples of Bias in Questions

Indefinite Concepts
Bad: Among your circle of friends, is there anyone whose opinions or advice you frequently ask about the public affairs issues of the day?
Improvement: Among your circle of friends, is there anyone whose opinions or advice you frequently ask ... about such issues as the energy crisis?
