Tools for Testing Contingency Models
Prof. Brian Boyd, W. P. Carey School, Arizona State University
http://www.briankboyd.com
Kati Haynes, Mike Hitt, Don Bergh, & Dave Ketchen

TRANSCRIPT

Page 1

Tools for Testing Contingency Models

Prof. Brian Boyd
W. P. Carey School, Arizona State University
http://www.briankboyd.com

Kati Haynes, Mike Hitt, Don Bergh, & Dave Ketchen

Page 2

Contingency

[Figure: Contingency in context: Taylorism, Rationalization, Compartmentalization, Strategic Choice, Confluence of multiple factors.]

Page 3

Problems With Contingency Theory

• Issues regarding causality
• Imprecision regarding definition of “fit”
• Near-infinite combination of variables
• Limited or inconsistent support for many theoretical predictions
• Schoonhoven (1981): Contingency is an orienting strategy or metatheory, but not a theory per se
• See Peteraf and Reed (2007) for “the rise and fall and rise again of contingency theory”

Page 4

Why Are Contingency Effects Important?

Examples from corporate governance research:

• Meta-analyses of both duality and insider ratios report minimal links to performance

• These same variables show highly significant roles in the context of moderators, mediators, and other effects

Weak main effects in other SM areas as well:

• Strategic planning and performance

• Acquisitions and performance

• Tests of assumptions of transaction cost economics

Page 5

Roadmap

• Venkatraman’s (1989) typology as a toolbox for theory development

• Content analysis of 30 years of studies in SMJ

• Implications for future studies

Page 6

Venkatraman’s (1989) Typology of Fit

Page 7

[Figure: Venkatraman’s (1989) typology of fit. The six forms (moderation, mediation, matching, gestalts, profile deviation, covariation) are arrayed along two dimensions: the number of variables in the fit equation (few to many) and the degree of specificity in the functional form of the fit-based relationship (low to high), and are distinguished as criterion-specific versus criterion-free.]

Page 8

Content Analysis: Descriptives

Page 9

Benchmarking Prior Studies

• Fit is central to theory development in strategic management

• First empirical article published in SMJ used a contingency design

• We examined all empirical studies published in SMJ between 1980 and 2009

• Practices/trends generally track those of other management sub-specialties

Page 10

Breakout of Articles

[Figure: Articles per year in SMJ (0 to 90), plotted for total, empirical, and contingency articles.]

Page 11

Data Sources Over Time

[Figure: Percentage of studies (0% to 100%) using archival, survey, and other data sources in the 1980s, 1990s, 2000s, and overall.]

Page 12

Sample Size Over Time

[Figure: Sample size (0 to 5,000) over time for archival and survey studies.]

• Sample sizes varied widely across studies
• Gains in N are only marginally significant for both survey- and archival-based studies

Page 13

Articles By Type

[Figure: Breakdown of SMJ articles into conceptual, other empirical, and the contingency types: interaction, subgroup, mediation, matching, gestalt, profile, and covariation.]

Page 14

Distribution of Contingency Papers

[Figure: Share of contingency papers using interaction, subgroup, mediation, matching, gestalt, profile, and covariation approaches.]

Page 15

Contingency Papers Over Time

[Figure: Percentage of contingency papers (0% to 50%) using interaction, subgroup, mediation, matching, gestalt, profile, and covariation approaches over time.]

Page 16

Top Theoretical Perspectives

Rank   1980s            1990s         2000s
1      IO econ          IO econ       RBV
2      Contingency      RBV           Agency
3      SCP model        Agency        Knowledge
4      Org theory       Contingency   Networks
5      Upper echelons   TCE           TCE

Page 17

Content Analysis: Individual Tools

Page 18

Moderation

Interaction:
• First-order relationship between x and y changes at different levels of z
• Arnold (1982): Form moderation
• Some potential issues:
  – Power
  – Collinearity

Subgroups:
• First-order relationship between x and y can be stronger or weaker at different levels of z
• Arnold (1982): Strength moderation
• Some potential issues:
  – Power
  – Coarse measurement of z
  – Less robust hypothesis tests

Approaches are interrelated, but do not always yield comparable results. A minimal regression sketch of the interaction approach follows.
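To make the interaction approach concrete, here is a minimal sketch (not from the slides) of form moderation as a mean-centered product term in OLS, using simulated data and statsmodels; the variable names x, z, and y are purely illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated illustration only: x is a predictor, z a moderator, y an outcome.
rng = np.random.default_rng(42)
n = 200
df = pd.DataFrame({"x": rng.normal(size=n), "z": rng.normal(size=n)})
df["y"] = 0.5 * df["x"] + 0.3 * df["z"] + 0.4 * df["x"] * df["z"] + rng.normal(size=n)

# Mean-center before forming the product term; this eases interpretation of the
# lower-order coefficients and reduces nonessential collinearity.
df["xc"] = df["x"] - df["x"].mean()
df["zc"] = df["z"] - df["z"].mean()

# 'xc * zc' expands to xc + zc + xc:zc, so the main effects are retained.
model = smf.ols("y ~ xc * zc", data=df).fit()
print(model.summary())
```

Centering does not change the interaction estimate itself; it only makes each lower-order coefficient interpretable at the mean of the other variable.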

Page 19

Interactions – Results

• Became the dominant tool by the 2000s
• Mainly used in regression, increasingly common with limited dependent variable models, rarely in SEM
• A minority of studies distinguish between types of moderation, stable over time
• Only 20% of studies discuss power
• One in three papers address measurement quality
• Mean centering more widely discussed, but infrequently used
• Increasingly common to address multicollinearity issues
• Interactions often not shown visually, but this is becoming more common over time (see the simple-slopes sketch below)

See Aguinis & Gottfredson (2010), Journal of Organizational Behavior, for interaction best practices.
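Because the content analysis finds that interactions are often not plotted, here is a hedged companion sketch that visualizes simple slopes at low and high values of the moderator; it re-simulates the same illustrative data used above.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import statsmodels.formula.api as smf

# Re-simulate the illustrative data (already mean-zero, so centering is omitted here).
rng = np.random.default_rng(42)
n = 200
df = pd.DataFrame({"x": rng.normal(size=n), "z": rng.normal(size=n)})
df["y"] = 0.5 * df["x"] + 0.3 * df["z"] + 0.4 * df["x"] * df["z"] + rng.normal(size=n)
model = smf.ols("y ~ x * z", data=df).fit()

# Predicted y across the range of x at low (-1 SD) and high (+1 SD) moderator values.
x_grid = np.linspace(df["x"].min(), df["x"].max(), 50)
for z_val, label in [(-df["z"].std(), "z at -1 SD"), (df["z"].std(), "z at +1 SD")]:
    pred = model.predict(pd.DataFrame({"x": x_grid, "z": z_val}))
    plt.plot(x_grid, pred, label=label)

plt.xlabel("x")
plt.ylabel("predicted y")
plt.title("Simple slopes at low and high levels of the moderator")
plt.legend()
plt.show()
```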

Page 20

Subgroups – Results

• Second most widely used contingency tool
• Comparison of correlations has declined dramatically over time
• Regression is the primary tool, with big gains in use of both SEM and logit/probit
• A minority of studies distinguish between types of moderation; an increasing number report supplementary analyses
• Virtually no studies discuss power
• One in ten papers address measurement quality
• Only 20% of studies report any significance tests of differences between subgroups (a sketch of such a test follows)
• 15–20% of studies use subgroups for post hoc analyses
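As a hedged illustration of testing, rather than merely describing, subgroup differences, the sketch below splits a simulated sample at the moderator's median and applies a simple large-sample z-test to the difference between the two subgroup slopes; all names and data are illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated illustration: the x -> y slope is stronger when the moderator z is high.
rng = np.random.default_rng(7)
n = 400
df = pd.DataFrame({"x": rng.normal(size=n), "z": rng.normal(size=n)})
df["y"] = 0.5 * df["x"] + 0.4 * df["x"] * (df["z"] > 0) + rng.normal(size=n)

# Split at the median of z and fit the first-order model in each subgroup.
low = df[df["z"] <= df["z"].median()]
high = df[df["z"] > df["z"].median()]
fit_low = smf.ols("y ~ x", data=low).fit()
fit_high = smf.ols("y ~ x", data=high).fit()

# Large-sample z-test on the difference between the two subgroup slopes.
b_lo, se_lo = fit_low.params["x"], fit_low.bse["x"]
b_hi, se_hi = fit_high.params["x"], fit_high.bse["x"]
z_stat = (b_hi - b_lo) / np.sqrt(se_lo**2 + se_hi**2)
print(f"slope(low z) = {b_lo:.2f}, slope(high z) = {b_hi:.2f}, z = {z_stat:.2f}")
```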

Page 21

Mediation

• The relationship between x and y is affected by the intervening variable z
• Simple versus complex or multiple mediation
• Methods choices: MacKinnon et al. (2002) identified 14 choices for testing effects (one bootstrap variant is sketched below)
• Growing trend of integrating mediating and moderating effects concurrently

See Wood et al. (2008), ORM, for a content analysis of mediation studies in micro and mixed-focus journals.
See Mathieu, DeShon & Bergh (2008), ORM, for an overview of mediation research designs.
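One of the many testing choices catalogued by MacKinnon et al. (2002) is the product-of-coefficients estimate of the indirect effect with a percentile bootstrap confidence interval; the sketch below shows that single variant on simulated data. Variable names are illustrative, and this is not presented as the approach of any particular study in the sample.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated simple mediation x -> m -> y; the indirect effect is a*b.
rng = np.random.default_rng(1)
n = 300
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)
y = 0.4 * m + 0.1 * x + rng.normal(size=n)
df = pd.DataFrame({"x": x, "m": m, "y": y})

def indirect_effect(d):
    a = smf.ols("m ~ x", data=d).fit().params["x"]      # x -> m path
    b = smf.ols("y ~ m + x", data=d).fit().params["m"]  # m -> y path, controlling for x
    return a * b

# Percentile bootstrap confidence interval for the indirect effect.
boot = [indirect_effect(df.sample(n, replace=True)) for _ in range(1000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {indirect_effect(df):.3f}, 95% bootstrap CI = [{lo:.3f}, {hi:.3f}]")
```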

Page 22

Mediation – Results

• Less than five percent of studies use mediation – comparable to other outlets
• SEM is by far the dominant tool of choice, a marked difference from the emphasis on regression in other management outlets
• Most (90%) studies use complex mediation – more than other journals
• Cross-sectional designs dominate, but (a) less so than in other journals, and (b) declining over time
• Power rarely addressed; measurement more so
• One third of studies consider alternate causal configurations

See Miller et al. (2007), RMISM, for an examination of the Baron and Kenny ‘four step’ approach in strategy studies.

Page 23

Gestalts and Configurations

• Venkatraman (1989): Degree of internal coherence among a set of theoretical attributes

• Miller and Friesen (1977): Holistic and ordered patterns of attributes

• Similar to configurations and archetypes

Page 24

Gestalts and Configurations – Results

• Over time, configurations increasingly based on theory rather than exploratory data analysis
• Use of multiple clustering algorithms has grown dramatically over time
• Sample size is a historical concern, and has not improved over time
• Mixed results for validity tests:
  – Split-half reliabilities and hold-out samples rarely used (a clustering sketch with a split-half check follows)
  – Use of criterion validity tests improving

See Short et al. (2008), JOM, for a review of configuration studies.
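A minimal sketch, on simulated data, of two practices the results point to: deriving configurations with more than one clustering algorithm and running a split-half agreement check. scikit-learn's KMeans and AgglomerativeClustering stand in for whatever algorithms a given study would actually use.

```python
import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.metrics import adjusted_rand_score
from sklearn.preprocessing import StandardScaler

# Simulated attribute data with three loose "configurations" (purely illustrative).
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(loc=c, scale=0.6, size=(60, 4)) for c in (-1.0, 0.0, 1.0)])
X = StandardScaler().fit_transform(X)

# 1) Derive configurations with two different algorithms and check their agreement.
km_labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
hc_labels = AgglomerativeClustering(n_clusters=3).fit_predict(X)
print("algorithm agreement (adjusted Rand):", round(adjusted_rand_score(km_labels, hc_labels), 2))

# 2) Split-half check: fit the clustering on two random halves, assign the full
#    sample from each solution, and compare the two sets of assignments.
idx = rng.permutation(len(X))
half_a, half_b = idx[: len(X) // 2], idx[len(X) // 2:]
labels_a = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X[half_a]).predict(X)
labels_b = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X[half_b]).predict(X)
print("split-half agreement (adjusted Rand):", round(adjusted_rand_score(labels_a, labels_b), 2))
```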

Page 25

Profile Deviation

• Micro-Taylorism: There is an ideal combination or bundle of elements, but that combination is highly situation dependent

• Thomas, Litschert, and Ramaswamy (1991): Firms pursuing Prospector and Defender strategies each have unique needs vis-à-vis TMT characteristics. Alignment of strategy and human capital will shape future performance

Page 26

Profile Deviation – Results

• Extremely rare in practice
• Little consistency in defining the ideal: ranges from the top 10% of performers to average firm behavior (a deviation-score sketch follows)
• Arguments for and against weighting of components, but no empirical testing of the different approaches
• Multiple samples recommended, but rarely utilized
• Little consistency in approach for testing predictive validity

See Zajac et al. (2000), SMJ, for an examination of directional effects of departure from the ideal.
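A minimal sketch of one common operationalization noted above: define the ideal profile as the mean of the top 10% of performers, score each firm by its Euclidean distance from that profile, and relate the misfit score to performance. Everything here is simulated and illustrative; in practice the ideal should be calibrated on a separate or hold-out sample, consistent with the multiple-samples recommendation.

```python
import numpy as np
import pandas as pd

# Simulated firms described by k attributes and a toy performance measure.
rng = np.random.default_rng(5)
n, k = 300, 5
attrs = pd.DataFrame(rng.normal(size=(n, k)), columns=[f"a{i}" for i in range(k)])
perf = attrs.mean(axis=1) + rng.normal(scale=0.5, size=n)

# Ideal profile: attribute means of the top decile of performers (one convention only).
ideal = attrs[perf >= perf.quantile(0.90)].mean()

# Misfit: unweighted Euclidean deviation of each firm from the ideal profile.
misfit = np.sqrt(((attrs - ideal) ** 2).sum(axis=1))

# The usual prediction is a negative association between misfit and performance.
print("correlation(misfit, performance):", round(np.corrcoef(misfit, perf)[0, 1], 2))
```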

Page 27

Covariation

• Related to gestalt, as fit is seen as a consistent pattern across a number of areas

• Venkatraman (1989): Covariation requires a high degree of theoretical precision in the linkage among elements

• Modeled as second-order factor structures (a generic specification is sketched below)

• Hult and Ketchen (2004): Use the RBV to assess how four capabilities combine to create a distinct advantage for the firm
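For reference, a generic second-order factor specification can be written as follows; this is a sketch in conventional notation, not reproduced from the slides. Each indicator loads on a first-order factor, and the first-order factors load on a single second-order fit factor.

```latex
% Second-order factor model (illustrative notation):
% x_{ij} = indicator i of first-order factor \eta_j;  \xi = second-order fit factor.
x_{ij} = \lambda_{ij}\,\eta_j + \varepsilon_{ij}, \qquad
\eta_j  = \gamma_j\,\xi + \zeta_j, \qquad j = 1, \ldots, 4
```

Covariation-type fit is then the claim that the second-order loadings are substantial and that this model fits about as well as the correlated first-order model, which is why a formal comparison of first- versus second-order structures matters.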

Page 28

Covariation – Results

• Also quite rarely used, but more common in recent years than profile deviation or gestalt
• Typical paper uses 4-5 first-order factors, with 2-7 indicators per factor
• Few studies conduct a formal comparison of first- versus second-order structure
• Several methodological issues:
  – Small ratio of observations to indicators
  – Often no control variables included
  – Only partial correlation matrices reported
• Several studies have applied covariation in conjunction with mediation or moderation

Page 29

Matching

• Venkatraman (1989): “A measure of fit between variables…developed independently of any performance anchor”

• Fit can be based on deviation scores, residual analysis, or a three-way interaction (a deviation-score sketch follows this slide)

• Most studies in SMJ referencing “matching” were simple interaction terms

• Powell (1992) is an example of profile deviation; Habib & Victor (1991) is an example of a three-way interaction

• Very rarely used tool
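A minimal sketch of the deviation-score variant of matching on simulated data: the fit score is formed from the two variables alone, with no performance anchor, and only afterward related to performance. The "strategy" and "structure" names and the data are illustrative; the residual-analysis and three-way-interaction variants are not shown.

```python
import numpy as np

# Simulated illustration of criterion-free matching fit via deviation scores.
rng = np.random.default_rng(11)
n = 250
strategy = rng.normal(size=n)
structure = 0.6 * strategy + rng.normal(scale=0.8, size=n)
performance = -np.abs(strategy - structure) + rng.normal(scale=0.5, size=n)

def zscore(v):
    return (v - v.mean()) / v.std()

# Matching fit: the (negative) absolute gap between the standardized variables,
# computed without reference to any performance criterion.
match = -np.abs(zscore(strategy) - zscore(structure))

# Only in a second step is the fit score related to performance.
print("correlation(match, performance):", round(np.corrcoef(match, performance)[0, 1], 2))
```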

Page 30

Suggestions for Future Studies

Page 31

Theory Development

• Frame more nuanced hypotheses
  – Multi-theoretic perspectives
  – Complementary or competing, synergy or suppression?
  – Combinations of different contingencies (e.g., moderated mediation)
• Faculty training
  – Declining variety observed in the types of tools used in our content analysis
  – Shook et al. (2003: 1236): “Doctoral training may not be keeping pace with data analytic trends and future research needs.”

Page 32

Methodology

• Design studies with sufficient power (a minimal power calculation is sketched after this list)
  – Power is generally weak in SM studies
  – Broad constructs often measured with single items
  – Power is a concern with many contingency tools
• Continue advances with mediation
  – Less than 5% of studies use mediation
• Pay greater attention to the type of moderator effect
• Apply interactions to limited dependent variable studies
• Rarely used tools offer great opportunities for theoretical development
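As a minimal illustration of a priori power planning (an assumption-laden sketch, not a prescription), statsmodels can solve for the sample size needed to detect a given standardized effect in a simple two-group design; contingency tests typically require more than this baseline suggests, because moderator effects are often small.

```python
from statsmodels.stats.power import TTestIndPower

# Illustrative a priori power calculation: sample size per group needed to detect
# a small-to-medium standardized effect (d = 0.30) with 80% power at alpha = .05
# in a two-group comparison. The effect size, alpha, and power are assumptions.
n_per_group = TTestIndPower().solve_power(effect_size=0.30, alpha=0.05, power=0.80)
print(f"required n per group: {n_per_group:.0f}")
```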

Page 33

Organizational Research Methods
Special issue on Construct Measurement in Strategic Management

Sample topics:
• Development and validation of new measures
• Multi-level variables
• Problems with proxies
• Articles/notes and empirical/conceptual papers welcome

Call for Papers in the April issue of ORM
Manuscripts due December 1, 2011