Evaluating Future Technology Assessment
Jonathan Calof and France Bouthillier
Telfer School of Management, University of Ottawa and McGill University
The 4th International Seville Conference on Future-Oriented Technology Analysis (FTA)
12 & 13 May 2011


Page 1

Evaluating Future Technology Assessment

Jonathan Calof and France Bouthillier

Telfer School of Management, University of Ottawa and McGill University

The 4th International Seville Conference on Future-Oriented Technology Analysis (FTA)

12 & 13 May 2011

Page 2

The Methodology – what we did

• Literature review

• Review of NRC TA documents

• Interviews with ITAs (Technical Advisors), ISs (Information Specialists), and clients

Page 3

The Database

• Over 100 projects reviewed

• 15 users selected for interview based on frequency of TA use

• 15 TA users interviewed

• Projects ranged from short-term-oriented TA (under a 1-year time horizon) to long-term (40-year time horizon), with a mix of methodologies and costs.

Page 4

The Definition of FTA that I am using

ForLearn:

• Foresight enhances such thinking by gathering anticipatory intelligence from a wide range of knowledge sources in a systematic way and linking it to today's decision making.

• FORESIGHT is a participative approach to creating shared long-term visions to inform short-term decision-making processes. (www.foresight-network.eu)

Page 5

Literature review

• Setting the context – why assess performance?

• Measures from the foresight literature

• Measures from the CI literature

• Measures from services and consulting literature

Page 6

Why Assess Performance?

Purpose – Related Question

1. To Evaluate: How well is our CI department, group, manager, task force or unit (etc.) performing?
2. To Control: How can CI managers ensure their reports do the right things?
3. To Budget: To what CI programs, people, projects, consultants, vendors or information sources should resources be allocated?
4. To Motivate: How can CI executives motivate their reports as well as other functional stakeholders to do the things necessary to improve both CI and the enterprise's performance?
5. To Promote: How can CI managers convince their superiors and other relevant stakeholders that their function is doing a good job?
6. To Celebrate: What CI accomplishments are worthy of the important organizational ritual of celebrating success?
7. To Learn: What CI activities or efforts are working and not working, and why?
8. To Improve: What should be done differently to improve CI performance, and by whom?

Adapted from Behn (2003) in Blenkhorn & Fleisher (2007)

Page 7

Why assess performance?

• Because the decision makers/funders are asking for it; tough budgeting decisions in difficult times; pragmatic reality.

• If we cannot come up with a methodology for doing this ourselves, we will end up being evaluated by another person's (wrong) measures.

Page 8

Impacts of FTA (Ladikas and Decker, 2004)

Issue dimension crossed with impact dimension (I. Raising knowledge, II. Forming attitudes/opinions, III. Initialising action):

Technological/scientific aspects
• Scientific assessment: a) technical options assessed and made visible; b) comprehensive overview on consequences given
• Agenda setting: f) setting the agenda in the political debate; g) stimulating public debate; h) introducing visions or scenarios
• Reframing of debate: o) new action plan or initiative to further scrutinise the problem at stake; p) new orientation in policies established

Societal aspects
• Social mapping: c) structure of conflicts made transparent
• Mediation: i) self-reflecting among actors; j) blockade running; k) bridge building
• New decision-making processes: q) new ways of governance introduced; r) initiative to intensify public debate taken

Policy aspects
• Policy analysis: d) policy objectives explored; e) existing policies assessed
• Re-structuring the policy debate: l) comprehensiveness in policies increased; m) policies evaluated through debate; n) democratic legitimisation perceived
• Decision taken: s) policy alternatives filtered; t) innovations implemented; u) new legislation is passed
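To make the three-by-three structure explicit, here is a hypothetical encoding of the matrix as a lookup table in Python. The cell names follow the slide above; the code itself is illustrative and not part of the framework.

# Hypothetical encoding (not from Ladikas and Decker) of the impact
# matrix, so observed project impacts could be tagged by cell.
TAMI_MATRIX = {
    ("technological/scientific", "raising knowledge"): "scientific assessment",
    ("technological/scientific", "forming attitudes"): "agenda setting",
    ("technological/scientific", "initialising action"): "reframing of debate",
    ("societal", "raising knowledge"): "social mapping",
    ("societal", "forming attitudes"): "mediation",
    ("societal", "initialising action"): "new decision-making processes",
    ("policy", "raising knowledge"): "policy analysis",
    ("policy", "forming attitudes"): "re-structuring the policy debate",
    ("policy", "initialising action"): "decision taken",
}

def classify(issue: str, impact: str) -> str:
    """Return the impact-type label for an (issue, impact) cell."""
    return TAMI_MATRIX[(issue, impact)]

print(classify("policy", "initialising action"))  # -> decision taken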

Page 9

EFMN (2005)

(a) Quality of products
• Produce future-oriented material for the system to use
• Development of reference material for policymakers and other innovation actors
• Creating a language and practice for thinking about the future
• More informed STI priorities
• A source of inspiration for policy system actors

(b) Organisation and quality of social interactions
• Aid discussions of the future
• Facilitate thinking out of the box
• Challenge mindsets
• Creation of new networks and clusters, re-positioning of existing ones
• Establishment of communication structures between innovation actors
• Support the empowerment of system actors
• Contribute towards development of actor identities

Page 10

(c) Impacts in terms of learning effects

• Support system actors to create their own futures

• Creating a shared vision

• Gain insights into complex interactions and emerging drivers of change

• Build trust between system actors

• Detect and analyse weak signals to ‘foresee’ changes in the future

• Facilitate better understanding of potential disruptive change

• Provide anticipatory intelligence to system actors

• Development of new ways of thinking

• Collective learning through an open exchange of experiences

• Highlighting the need for a systemic approach to both policymaking and innovation

• Stimulation of others to conduct their own foresight exercises after being inspired

• Accumulation of experience in using foresight tools and thinking actively about the future

• Enhanced reputational position and positive image of those actors running a foresight

• Better understanding (and visibility) of a territory’s strengths and competencies

(d) Impacts in terms of strategy formulation for action

• Support decision-making

• Improve policy implementation

• Better informed strategies in general

• Using foresight results to evaluate and future-proof strategies

• Better evidence-based policy

• Making the case for increased investments in R&D

• Achievement of long-term reform of the productive system through a raised emphasis on high technology

• Better manage external pressures and challenges

• Overcome path dependency and lock-ins

Page 11

Competitive Intelligence Literature

• No universal method for measurement

• CI is a service; it is thus intangible and has a persuasive effect

• Cause-and-effect relationships cannot always be established due to many factors

Page 12

Prescott & Bhardwaj (ref. Herring 1999, 15)

• Influencing decision makers

• Improved early warning

• Identifying new opportunities

• Exploiting competitor vulnerabilities

• Sharing of ideas

• Better serving the company’s customers

Page 13

Hard and Soft Measures of CI Success (Simon, 1998)

Hard Measures

Costs – CI contribution to the bottom line (input)
1. cost of doing the research
2. cost benefit of CI research
3. financial gain from ideas

Quantitative measures (output)
1. clients serviced
2. projects completed
3. suggestions submitted
4. suggestions implemented
5. projects assisted
6. number of BI/CI staff
7. staff productivity
8. participants in the CI process (direct and indirect)

Quality measures
1. intelligence product measures
2. accuracy of information (validity and reliability)
3. immediate usability of results (no rework)

Soft Measures

Customer usability
1. work habits
2. user-friendly reports
3. participation on teams
4. contributions to teams
5. communication skills
6. contact follow-ups
7. customer satisfaction ratings
8. understanding

Acceptance and alliance measures
1. work climate
2. number of requests for service
3. number of repeated requests for service
4. requests for participation in team meetings
5. referrals from customers
6. further integration of CI projects

CI practitioner performance measures – initiative
1. implementation of new ideas
2. degree of supervision required
3. ability to set goals and objectives

Page 14

Simon (cont.)

Hard Measures

Time measures
1. ability to produce timely information
2. efficiency
3. time saved by CI
4. on-time delivery

CI practitioner performance measures
1. effective use of resources (resourceful and creative)
2. knowledge of CI methods
3. resourcefulness

Soft Measures

Unit and personnel effectiveness measures – feeling/attitude
1. solicitation for services
2. attitude changes – clients taking you into confidence or consulting with you
3. customer loyalty rating
4. perception of CI contributions
5. relationship building (sharing of personal information)
6. problem-solver perception

Personnel development/advancement – rewards
1. job effectiveness
2. attendance at CI orientation and training programs (participant or teaching)
3. promotion
4. pay increases
5. work accomplishment acknowledgments

Page 15

CI Measurement according to McGonagle and Vella (2002)

Assignments and Projects
1. Meeting objectives
2. Number completed
3. Number completed on time
4. Number requested
5. Number requested – increase by end users
6. Number of follow-up assignments
7. Number of projects assisted
8. Number of suggestions submitted

Budget
1. Comparative cost savings – compared with cost of outsider
2. Comparative cost savings – compared with cost of untrained
3. Meeting project and function budget constraints

Efficiency
1. Accuracy of analysis
2. Data quality
3. First-time results (no reworking)
4. Meeting project time line
5. Time for research versus time for response

End users
1. Creating compelling reasons to use CI
2. Effectiveness of implementation of findings
3. Meeting needs
4. Number of referrals
5. Number served

Feedback
1. Feedback – written
2. Feedback – oral

Financial
1. Cost avoidance
2. Cost savings
3. Financial goals met
4. Linking CI to specific investments
5. Linking CI to investments enhancement
6. Linking CI to specific savings from unneeded investments
7. Revenue enhancement
8. Value creation

Page 16

McGonagle and Vella (cont)

Internal Relationships
1. Building strong relationships with end users
2. Formulating relevant strategy and tactics
3. Quality of relationship with end users
4. Quality of participation on cross-functional teams

New Products and Services
1. Number developed due to use of CI
2. Cost savings/avoidance in development from use of CI

Performance
1. Profitable growth for the unit or firm
2. Impact on strategic direction of unit or firm
3. Market share gains for unit or firm

Reports and Presentations
1. Number
2. Number of follow-ups
3. Production of actionable CI

Sales effectiveness
1. Customer satisfaction
2. Linking to specific customer wins
3. Number of customers retained
4. Number of leads generated
5. Repeat business
6. Improvement in win-loss ratio

Surveys
1. Surveys – written
2. Surveys – oral

Time
1. Time gained by CI input
2. Projects delivered on time
3. Time saved by input

Page 17

SCIP Study (2006)

Effectiveness measures used:          Responses   Percent
Customer satisfaction                 301         57.9
Decisions made/supported              248         47.7
CI productivity/output                210         40.4
Strategies enhanced                   196         37.7
New products or services              113         21.7
ROI calculation                        72         13.8
We have no effectiveness measures     154         29.6

The value of CI:                      Responses   Percent
New or increased revenue              152         29.2
New products or services developed    147         28.3
Cost savings or avoidance             141         27.1
Time savings                          116         22.3
Profit increases                      105         20.2
Financial goals met                   103         19.8
We have no value measures             222         35.0
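Because respondents could select multiple measures, the percentages do not sum to 100. Each count/percent pair implies the same survey base of roughly 520 respondents (e.g., 301 / 0.579 is about 520). A minimal cross-check sketch in Python; the table values are transcribed from above, and the computation itself is only illustrative:

# Illustrative cross-check: each (count, percent) row of the SCIP table
# should imply roughly the same total number of respondents.
rows = {
    "Customer satisfaction": (301, 57.9),
    "Decisions made/supported": (248, 47.7),
    "CI productivity/output": (210, 40.4),
    "ROI calculation": (72, 13.8),
}

for name, (count, percent) in rows.items():
    implied_n = count / (percent / 100.0)  # count = percent% of N
    print(f"{name}: implied N ~ {implied_n:.0f}")
# Every row prints roughly 520, confirming the table is internally consistent.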

Page 18

Consulting Performance

• Included because, like TA, consulting is an intangible, advice-based service.

• Dominant measures: service quality (expected vs. received), satisfaction, value, trust, intention to use.

• Virtually all measures are subjective.

Page 19

Taking these streams of literature

• We designed the following questionnaire to see whether it could be used to measure the impact of TA

• Used it as an element of an interview to better appreciate the broader issues involved in measuring TA

Page 20

Initial Questionnaire

1. Which impact do you think CTI has on direct clients?
Scale: 1 2 3 4 5 (strongly disagree ... strongly agree)

Impact on decision makers (ITA or business analysts)
• Client made decision in a more effective way (effectiveness)
• Client made decision more rapidly (timeliness)
• Client made better decision (appropriateness)
• Client made decision with more confidence (confidence)
• Client's analysis was confirmed (reassurance)

Financial impact
• Client was able to save time (need to quantify)
• Client was able to save money (need to quantify)
• Client was able to save resources (need to identify which ones)

Page 21

2. Which impact do you think CTI has on indirect clients?
Scale: 1 2 3 4 5 (strongly disagree ... strongly agree)

Impact on decision makers
• Client made decision in a more effective way (effectiveness)
• Client made decision more rapidly (timeliness)
• Client made better decision (appropriateness)
• Client made decision with more confidence (confidence)
• Client's analysis was confirmed (reassurance)

Financial impact
• Client was able to save time (need to quantify)
• Client was able to save money (need to quantify)
• Client was able to save resources (need to identify which ones)
• Client was able to reduce costs
• Client was able to avoid costs
• Client was able to increase revenues
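Both questionnaire pages share a 1-5 agree/disagree scale. As a minimal sketch (not the authors' actual scoring procedure), responses could be aggregated into per-construct means; the item keys and sample data below are illustrative assumptions:

# Sketch: aggregate 1-5 Likert responses into per-construct impact scores.
from statistics import mean

# Each construct groups the questionnaire items that probe it.
CONSTRUCTS = {
    "decision_impact": ["effectiveness", "timeliness",
                        "appropriateness", "confidence", "reassurance"],
    "financial_impact": ["saved_time", "saved_money", "saved_resources"],
}

def construct_scores(responses):
    """Average each construct's item ratings per respondent,
    then average across respondents."""
    scores = {}
    for construct, items in CONSTRUCTS.items():
        per_respondent = [mean(r[item] for item in items if item in r)
                          for r in responses]
        scores[construct] = round(mean(per_respondent), 2)
    return scores

# Two hypothetical client responses on the 1-5 scale.
sample = [
    {"effectiveness": 4, "timeliness": 5, "appropriateness": 4,
     "confidence": 5, "reassurance": 3,
     "saved_time": 4, "saved_money": 3, "saved_resources": 3},
    {"effectiveness": 5, "timeliness": 4, "appropriateness": 5,
     "confidence": 4, "reassurance": 4,
     "saved_time": 5, "saved_money": 4, "saved_resources": 4},
]
print(construct_scores(sample))
# -> {'decision_impact': 4.3, 'financial_impact': 3.83}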

Page 22

ITAs and Other Clients: Power Users

• We asked them what the benefits were.

• We showed them the projects they had commissioned and asked for the benefits they received.

• We then gave them the questionnaire and asked if it captured the benefits they had received.

• The questionnaire was then revised based on the clients' comments.

Page 23

Results of the Interviews – general

• Clients raved about the service
• Lots of stories of positive impacts
• Clear indication of difficulty in measuring impact due to:
  – Mediating variables (type of client, etc.)
  – Indirect nature of intelligence (intelligence is only one aspect of what is used for the decision)
  – TA officers do not control implementation of the assessments
  – Time – some TA recommendations affect decisions that can take up to 20 years for full project realization
  – Lots of indirect flow producing significant benefit that is hard to measure
  – Lots of direct flow that will be difficult to link directly to the decisions

Page 24

Flow of TA – Direct Investment Decision

[Flow diagram; node labels:]
• Director: invest / don't invest
• Lead ITA: does it go forward for investment?
• TBA/IS: market and technical intelligence scan
• BA: business case
• Other ITAs: technical assessment
• SME: "I want investment"
• Province: potential funder
• Other federal
• Others

Page 25

Flow of Intelligence – Direct / Other

[Flow diagram; node labels:]
• Client: internal or external
• TBA/IS: market and technical intelligence scan
• SME: "I want investment"
• Province: potential funder
• OGDs
• Other NRC personnel

Page 26

But there is also secondary benefit

• Indirect benefit

• Intelligence spillovers

• Evidenced by looking at the actual flow of intelligence during a project.

• Network benefits

Page 27

Examples of secondary impact arising from projects

• ITA reads the TBA report and integrates it in discussions with others and in decision making.

• TBA makes speech based on information developed for a TA report.

• TBA does a report using information gathered during a previous report.

• ITA/Client takes the intelligence and puts it in a public report

• Lots of back and forth with the client during the project, so intelligence is passed along throughout the process, helping the client focus and improving the intelligence gathered.

• ITA/IS sends email/newsletter updates to their network based on information gathered during the project

• Participants learn during the process

Page 28

Flow of Intelligence – Indirect Investment Decision

[Flow diagram; same node labels as the direct investment decision:]
• Director: invest / don't invest
• Lead ITA: does it go forward for investment?
• TBA/IS: market and technical intelligence scan
• BA: business case
• Other ITAs: technical assessment
• SME: "I want investment"
• Province: potential funder
• Other federal
• Others

Page 29

Indirect – Other Decisions

[Flow diagram; node labels:]
• Client: internal
• TBA/IS: market and technical intelligence scan
• SME
• Province: potential funder
• OGDs
• Other NRC personnel
• Others

Page 30

There was also informal intelligence – no project

• ITA/other has an informal chat with the TBA (in the office, cafeteria, or hallway)

• ITA/IS sends people an email: "information you might like to know"

• Client/other drops by the ITA/IS office, calls them, or emails them and asks, "What do you think of this idea?"

Page 31

Other direct and secondary impact questions

• Was considered in making the decision / influenced the policy
• Made a difference
• Policy/decision was successful
• Assess the organization/policy as of a future date and compare it to the same before the project – look for the differences, success of recommendations/advice
• New networks created
• Knowledge used in other places (spillover)
• Decision makers thinking longer term
• Futurization of the organization
• Social improvements
• Clients now doing some intelligence themselves
• "I didn't have the time or patience to do it"
• "The TBA provided valuable analysis rather than information"
• "It is a skill set that I don't have"
• "It is a perspective that I do not have"
• "Provides information/analysis that I can't do"
• "Helped me/my client focus in on the right questions"
• "Provided more than just information"
• "I/my client could not afford to do it ourselves"
• "I am getting busier" (more people are asking the TBA to do things, so they must find it valuable)

Page 32

Evaluating Performance

• Direct economic measures – difficult due to the flow of intelligence

• Attitude measures (asking clients what value they received) are easier to apply and are the most common in the literature

• Clients indicated that attitude measures were also the best to use

Page 33

The Performance Model – So what are we measuring?

Page 34

Based on the literature and the interviews

• Organization performance: How well is the organization doing in its TA program? How well is it developing? (A general area of measurement – the entire organization.)

• Individual performance: Are the TA officers doing their job well? (Again, a general area of measurement.)

• Project/process performance: Is the TA process being conducted appropriately?

• Output performance: What is the quality of the TA output itself?

• Impact performance: What direct and intended impact did the TA have?

• Secondary impact performance: What indirect, unanticipated impacts arose from the TA? (A sketch of one way to structure these six levels follows this list.)
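One way to operationalize these six levels is a simple scorecard that collects 1-5 attitude ratings per level, since attitude measures were the ones clients preferred. This is a hypothetical sketch, not an NRC instrument; all names are assumptions.

# Sketch of a scorecard over the six measurement levels named above.
from dataclasses import dataclass, field

LEVELS = [
    "organization",      # how well the TA program is doing overall
    "individual",        # are TA officers doing their job well
    "project_process",   # is the TA process conducted appropriately
    "output",            # quality of the TA output itself
    "impact",            # direct, intended impact of the TA
    "secondary_impact",  # indirect, unanticipated impacts
]

@dataclass
class TAScorecard:
    """Holds 1-5 attitude ratings per measurement level."""
    ratings: dict = field(default_factory=lambda: {lvl: [] for lvl in LEVELS})

    def add(self, level: str, score: int) -> None:
        if level not in self.ratings or not 1 <= score <= 5:
            raise ValueError(f"bad level or score: {level}, {score}")
        self.ratings[level].append(score)

    def summary(self) -> dict:
        # Mean rating per level; None where no ratings were collected.
        return {lvl: (sum(s) / len(s) if s else None)
                for lvl, s in self.ratings.items()}

card = TAScorecard()
card.add("output", 4)
card.add("impact", 5)
print(card.summary())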

Page 35

Measuring impact: Direct and secondary

Page 36

Revised Questionnaire – still a 1-5 scale, but now with a distinction

• Between decision maker and decision recommender

• These are the questions on impact and secondary benefit

Page 37

Impact on ITAs/Advisors

• I made my recommendation in a more effective way (effectiveness)
• I made my recommendation more rapidly (timeliness)
• I made a better recommendation (appropriateness)
• Made me more confident in my recommendation
• My recommendation was validated (reassurance)
• I became aware of important issues that I was not aware of before
• It saved me time
• It saved me money
• It saved me resources
• It reduced uncertainty
• It gave me information that I was able to use in future projects
• It gave me important information that I was previously unaware of
• It gave me important new ideas
• It broadened my knowledge
• It has helped improve service to my clients
• Reduced bias in my decision making/recommendation
• Has given me the information required to improve my client's proposal
• Has given me the information I needed to provide my client with good advice
• Has enabled me to do my job better
• Reduced the possibility of errors in my recommendation
• Helped to reduce risk
• Gave me information that is hard for me to get
• Gave me information that I did not know how to get
• Provided me with new networks/contacts
• Could validate hypotheses and intuition; could identify potential markets

Page 38

Impact on decision makers

• Made the decision in a more effective way (effectiveness)
• Made the decision more rapidly (timeliness)
• Made a better decision (appropriateness)
• I was more confident in my decisions (confidence)
• It validated/confirmed my proposal/plan (reassurance)
• I became aware of important issues to address that I was not aware of before
• It saved time
• It saved money
• It saved resources
• It reduced uncertainty
• It gave me information that I was able to use in future projects
• It broadened my knowledge
• It made my decision less biased
• It made my proposal/plan better
• Has enabled me to do my job better
• Reduced the possibility of errors in my decision
• Helps to reduce risk in my decision
• It helped me to avoid making mistakes
• It helped me to pursue an opportunity
• It helped me improve management processes
• It helped me improve productivity
• It helped me improve R&D
• It helped me develop better strategy
• It helped me identify new markets
• It helped me identify new lines of business
• It helped me choose a technology direction
• It helped me choose a new product
• Prevented me from going in the wrong direction
• Prevented me from making the wrong decision
• Provided me with new networks/contacts
• It gave me important information that I was previously unaware of
• It gave me important new ideas

Page 39

Impact on researchers

• Stopped unproductive research

• Modified a research design

Page 40

Direct benefit from TA

• Flows from how the intelligence is developed

Page 41

Measuring the Organization

• Number of projects done
• Number of projects done on time
• Client satisfaction
• Projects/reports cited
• People reading/ordering reports
• Requests for service
• Repeat clients
• Referral clients
• References to our material/citations
• Requests for speeches
• Client intention to use the service again
• Client intention to use CTI more frequently
• Extent to which client experiences are positive
• Overall perceived quality of CTI services
• Overall cost
• Overall benefits
• Overall cost/benefit
• Extent recommendations are accepted
• Number of clients served
• Number of staff
• Projects/staff
• Staff productivity
• Customer loyalty
• Number of projects assisted
Page 42

Measuring the Individual

• Acted in a professional manner
• Demonstrated professional conduct
• Has the appropriate skills
• Certification/knowledge testing
• Patents/papers done
• Number of invitations they get
• Being perceived as a valuable member of the client's team
• Being invited to key meetings
• Knowledge of the client's area
• Understanding of the problem
• Flexibility in adapting to requests
• Communication skills
• Collection skills
• Analytical skills
• Planning skills

Page 43

Measuring the Process

• Followed proper practices
• Finished on time
• Proper mix of time: 15-20% planning, 25-35% collection, 25-35% assessment, 10-15% communication, 15-20% management (a consistency check is sketched after this list)
• Used proper analytical techniques
• Proper mix of primary and secondary sources
• Client's needs were understood
• Client's needs were dealt with
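A minimal sketch (an assumption for illustration, not the NRC's actual tooling) of checking a project's recorded phase hours against the time-mix bands listed above:

# Target time-allocation bands from the slide (fractions of total effort).
BANDS = {
    "planning": (0.15, 0.20),
    "collection": (0.25, 0.35),
    "assessment": (0.25, 0.35),
    "communication": (0.10, 0.15),
    "management": (0.15, 0.20),
}

def check_mix(hours: dict) -> dict:
    """Flag phases whose share of total hours falls outside the band."""
    total = sum(hours.values())
    report = {}
    for phase, (lo, hi) in BANDS.items():
        share = hours.get(phase, 0) / total
        report[phase] = "ok" if lo <= share <= hi else f"out of band ({share:.0%})"
    return report

# Hypothetical 200-hour project; management is under-weighted at 10%.
print(check_mix({"planning": 30, "collection": 70, "assessment": 60,
                 "communication": 20, "management": 20}))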

Page 44

Measuring the Output

• Number of recommendations made
• Met client's expectations
• Project professionally done
• ROI of the output
• Impact on decision
• Quality of recommendations
• Number of times the report has been used
• Number of times the report has been quoted
• Exceeded expectations
• Readability of the report
• Reliability of the intelligence
• Accuracy of the intelligence (over time)
• Client's overall perception of the quality of the intelligence
• Extent recommendations accepted
• Usability of the results
• User-friendliness of reports

Page 45

What the Research Has Driven Home

• As a field we need to develop standards of practice that are measurable.

• We need to become recognized as a legitimate body of knowledge

Page 46

For more information

• Jonathan Calof

• Telfer School of Management, University of Ottawa

[email protected]

• Phone: 1-613-228-0509