Thinking tools: from top motors, through software process improvement, to context-driven (2007)

© September 2007 Neil Thompson
Thinking tools: from top motors, through software process improvement, to context-driven
Neil Thompson, Thompson information Systems Consulting Ltd
SoftTest Ireland, with the support of the All Ireland Software Network
Belfast, 20 Sep 2007
23 Oast House Crescent, Farnham, Surrey, England, UK, GU9 0NP
www.TiSCL.com


TRANSCRIPT

Page 1: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)


Page 2: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)


Can software process improvement learn from these?

TOYOTA CELICA GT4

TOYOTA PRIUS

Page 3: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

How Toyota progressed through quality to global dominance, and now innovation
• Quality (top reputation):
– has dominated the JD Power satisfaction survey for a decade
– Toyota Production System (TPS): 14 principles across Philosophy, Problem-Solving, Process and People & Partners
• Global dominance:
– market value > GM, Chrysler & Ford combined
– on track to become (2006) the world's largest-volume car manufacturer
• Innovation (fast):
– Lexus: invaded the "quality" market and won
– Prius: not evolutionary but revolutionary – and launched 2 months early and sold above expectations
– Toyota Product Development System (TPDS): 13 (!) principles across Process, People and Tools & Technology

Page 4: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

Agenda
• Contents:
– Analogies between world-leading improvements in manufacturing and what we may do in the software development lifecycle (SDLC):
• 1. Things that flow through a process: inventory, value (EuroSP3 2004)
• 2. Constraints on process, and thinking tools to improve (EuroSTAR 2006)
• 3. From process improvement to process definition, eg context-driven (STAREast 2003)
• Acknowledgements:
– Jens Pas (EuroSTAR 1998) – my introduction to Goldratt
– Greg Daich (STAREast 2002) – I generalised his idea, then worked backwards to the roots
• Objectives for audience:
– entertainment? – something a bit different
– appreciate some fundamental principles
– take away a set of simple diagrammatic thinking tools which are useful in many situations
– think about your particular SDLC – where are the constraints?
– go on to read some of the references – benefit by then improving your own processes
– be more ready to learn from other disciplines & industries

Page 5: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

The "new" paradigm in manufacturing: value flow, pull not push, problem-solving
GOLDRATT: Drum-Buffer-Rope; maximise throughput; Critical Chain management; monitoring buffers; cause-effect trees; conflict resolution diagrams; identify constraint, "elevate" & iterate
TOYOTA (TPS & TPDS): Takt (rhythm); low-inventory ("lean"); Just-In-Time; minimise waste; Andon (stop and fix); Kanban cards; tagging slow movers; one-page metrics; chain of 5 "why"s; Plan-Do-Check-Act
Selected Toyota principles:
• 2. Continuous process flow to surface problems
• 3. Pull to avoid over-production
• 4. Level workload
• 7. Visual control to see problems
• 12. See for yourself to thoroughly understand
• 13. Decide slowly (all options) by consensus
• 14. Learning organisation via reflection & improvement
• Customer-defined value (to separate value-added from waste)
• Front-load product development to explore alternatives thoroughly while maximising design space
• And now these principles have been successfully applied beyond actual manufacturing, into product development
• But what about development of software?...
• I prefer Goldratt for thinking tools...

Page 6: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

Goldratt's Theory of Constraints: an analogy to explain
(diagram based on those in "The Race", E.M. Goldratt & R. Fox 1986: Drum, Buffer, Rope)
Goal: to win the war
Objective: to maximise throughput (right soldiers doing right things)
Constraint on throughput: the slowest marcher
Critical chain: the weakest link is all we need to fix, by means of...
Five focussing steps: identify the constraint, exploit it, subordinate all else, elevate it (i.e. strengthen it so it is no longer the weakest), then... identify the next constraint
But now it's no longer simple: so we need iterative tools for: what to change, what to change to, and how
Five thinking tools (based on sufficient causes & necessary conditions)
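The slide's first focussing step, identify the constraint, can be sketched in code: for a serial process, the constraint is simply the stage with the lowest capacity, and throughput of the whole chain can never exceed it. This is a minimal illustration; the stage names and rates below are hypothetical, not from the presentation.

```python
# Sketch of the first focussing step: identify the constraint of a serial
# process. Stage names and capacities are hypothetical illustration.

def identify_constraint(rates):
    """Return (name, rate) of the slowest stage (the constraint).

    rates: dict mapping stage name -> capacity in units per time period.
    """
    constraint = min(rates, key=rates.get)
    return constraint, rates[constraint]

sdlc = {  # hypothetical capacities, in "features per week"
    "requirements": 8.0,
    "design": 6.0,
    "programming": 5.0,
    "system testing": 3.0,   # the "slowest marcher"
    "acceptance testing": 4.0,
}
stage, rate = identify_constraint(sdlc)
print(f"Constraint: {stage} at {rate}/week")
```

Once found, the later steps (exploit, subordinate, elevate) act on that one stage, then the search repeats.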

Page 7: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

Applicability of TOC beyond manufacturing
• Military logistics
• Marketing, sales & distribution
• Project management
• Measurements, human relationships, medicine etc
• Using technology, eg assessing benefits of functionality
• IT systems development:
– focussing of Goldratt's Critical Chain on hi-tech projects (Robert C. Newbold)
– methodology design (Alistair Cockburn)
– "Lean software development" (Mary & Tom Poppendieck)
– Agile management using Feature Driven Development (David J. Anderson)

Page 8: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

But software development isn't like manufacturing?
• Software isn't like hardware
• Intellect adds value less predictably than machines
• The manufacturing part of software development is disk duplication: "development" is really a design activity
• People are more important than processes
• Software development doesn't repeat exactly; people are always tinkering with the processes
• Development involves discovery, production involves reducing variation
• But I say: does that make all the analogies worthless, or do they just need interpreting? I suggest the latter…

Page 9: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

A code factory and a bug factory
• No-waste factory: stated requirements flow through programming straight to demonstrations & acceptance tests
• Now here's some waste: documented requirements and implicit requirements diverge, so work queues for a meeting/escalation to agree (and loaded personal memories), or sits as inventory before the acceptance tests
[Diagram: two factories. In the first, stated requirements (a, b, c) pass through programming to demonstrations & acceptance tests. In the second, documented requirements and implicit requirements produce mismatched outputs (b vs b'), with inventory (I) and a meeting/escalation to agree before the acceptance tests]

Page 10: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

Specs & unfinished software are inventory
• Specifications are generally not needed after go-live (I will come to exceptions later), so they are not end-product, they are work-in-progress (especially intermediate levels like functional & non-functional specs)
• Untested software, and even finished software not paid for, is also inventory
• Iterative lifecycles help if "adaptive" (product-based) rather than "transformational" (where specifications multiply!)
[Diagram: requirements (a, b) pass through programming to (a', b'); revised & new requirements (c), which may include redesign, queue as inventory (I)]

Page 11: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

The full traditional W-model bulges with inventory!
[Diagram: W-model. Development leg ("make into"): Business objectives → Requirements Statement → Functional Spec. → Technical Design → Module Specs. → Code. Alongside each, static-check / verify / validate (incl. "QA") activities: Verify & Validate RS + specify Acceptance Test; V&V FS + spec. AT; V&V TD + spec. Integration Test; V&V MS + spec. Unit Test. Testing leg ("test against"): Unit test → Integration test → System test → Acceptance test → Post-implementation review, each followed by retest, fix & regression test, retesting lower levels where necessary]

Page 12: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

In a factory, small batches reduce inventory
(Based on Goldratt, The Race, North River Press 1986)
[Chart: inventory over 4 months for a five-stage line processing at (i) 1.3, (ii) 10.0, (iii) 1.0, (iv) 10.0, (v) 2.0 units/hour. Single-batch (ie "waterfall"): one batch of 1000 units; multi-batch (ie "iterative"): five batches of 200 units. Inventory stays far lower with the smaller batches]
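The chart's arithmetic can be sketched as a small simulation. The stage rates and batch sizes are those on the chart; the batch-transfer model (a whole batch moves to the next stage only once that stage has finished it) is my modelling assumption, not something the slide states.

```python
# Sketch of the slide's comparison: five serial stages with the chart's
# capacities (units/hour), processing 1000 units either as one batch
# ("waterfall") or as five batches of 200 ("iterative").
# Assumption (mine): batch transfer - a batch moves on only when complete.

RATES = [1.3, 10.0, 1.0, 10.0, 2.0]  # stages (i)..(v), units per hour

def makespan(total_units, batch_size, rates):
    """Hours until the last batch leaves the last stage."""
    n_batches = total_units // batch_size
    times = [batch_size / r for r in rates]   # hours per batch at each stage
    done = [0.0] * len(rates)                 # latest completion per stage
    for _ in range(n_batches):
        for k, t in enumerate(times):
            # a stage starts when it is free AND the batch has arrived
            start = max(done[k], done[k - 1] if k else 0.0)
            done[k] = start + t
    return done[-1]

single = makespan(1000, 1000, RATES)   # one big batch
multi = makespan(1000, 200, RATES)     # five batches of 200
print(round(single, 1), round(multi, 1))  # 2469.2 1293.8
```

The small batches finish in roughly half the time because the fast stages overlap with the slow ones instead of waiting for the whole batch.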

Page 13: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

Drum-buffer-rope approach to constraints
• Optimise throughput by:
– (1) drumbeat based on the constraining stage (a) & orders (b)
– (2) buffer to protect the constraining stage from upstream disruptions
– (3) rope to prevent the "leader" extending its gap on the constraining stage
• For subassemblies feeding in, have additional buffers
[Diagram: "troops marching" materials flow – raw materials in → "leader" → buffer → constraining stage (a) → assembly → orders (b), with the rope linking the constraining stage back to the leader, and a subassembly feeding in through its own buffer]
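The rope's effect on inventory can be sketched as a toy hour-by-hour simulation: releasing work at the pace of the constraint keeps the queue in front of it bounded, while releasing at the leader's pace makes it grow without limit. The rates and horizon below are hypothetical illustration.

```python
# Sketch of the "rope": release raw material at the constraint's pace, not
# the fastest upstream stage's pace, so work-in-progress in front of the
# constraint stays bounded. Rates and horizon are hypothetical.

def wip_before_constraint(release_per_hour, constraint_per_hour, hours):
    """Track queue size in front of the constraint, hour by hour."""
    wip = 0.0
    history = []
    for _ in range(hours):
        wip += release_per_hour               # upstream delivers
        wip -= min(constraint_per_hour, wip)  # constraint consumes what it can
        history.append(wip)
    return history

no_rope = wip_before_constraint(release_per_hour=5.0, constraint_per_hour=3.0, hours=100)
with_rope = wip_before_constraint(release_per_hour=3.0, constraint_per_hour=3.0, hours=100)
print(no_rope[-1], with_rope[-1])  # queue grows without the rope, stays flat with it
```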

Page 14: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

In software development & testing, small batches = agile methods: consider inventory moving through the SDLC
[Cumulative flow diagram: amount of functionality vs date, with one line per stage – Requirements, Specification, Design, Programming & unit testing, Integration testing, System testing, Acceptance testing, Live and paid-for. The vertical gap between adjacent lines is the inventory in that stage; the horizontal gap is the lead time for that stage; the total vertical spread is the inventory in the process overall]
• If the lines are not approximately parallel, inventory is growing
• Within each stage of testing, can subdivide by pass/fail, bug states etc
(Based on Anderson, Agile management for software engineering, Prentice Hall 2004)
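The diagram's readings can be sketched as arithmetic on cumulative counts: at any date, inventory in a stage is the cumulative count that has entered the stage minus the cumulative count that has left it. The weekly counts below are hypothetical illustration.

```python
# Sketch of reading inventory off a cumulative flow diagram.
# The cumulative feature counts are hypothetical.

def stage_inventory(entered, left):
    """Per-date inventory in one stage, given cumulative entered/left counts."""
    return [e - l for e, l in zip(entered, left)]

def inventory_growing(inventory):
    """The slide's warning sign: lines not parallel, ie inventory trending up."""
    return inventory[-1] > inventory[0]

# hypothetical weekly cumulative counts of features
specified  = [10, 25, 45, 70, 100]   # entered programming
programmed = [ 5, 15, 25, 35,  45]   # left programming, entered testing
tested     = [ 2, 10, 20, 32,  44]   # left testing

prog_wip = stage_inventory(specified, programmed)   # [5, 10, 20, 35, 55]
test_wip = stage_inventory(programmed, tested)      # [3, 5, 5, 3, 1]
print(inventory_growing(prog_wip), inventory_growing(test_wip))  # True False
```

Here programming is accumulating inventory (its band widens every week) while testing is keeping pace: exactly the non-parallel-lines signal the slide describes.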

Page 15: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

Agile methods: pull value instead of pushing documentation
• LEVELS OF DOCUMENTATION, pushed by specifiers: Requirements + Functional Spec + Technical Design + Unit / Component specifications + Test Specifications
• FLOW OF FULLY-WORKING SOFTWARE, pulled by customer demand: Unit / Component-tested → Integrated → System-tested → Accepted → WORKING SOFTWARE

Page 16: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

But even if our context is not suitable (or ready) for agile methods, we should understand flow
[Diagram, mapping the factory picture onto the SDLC: raw materials in → Requirements → Specification → Design → Programming → Unit testing → Integration testing (sub-assemblies) → System testing (assembly) → Acceptance testing → order(s)]
• Where is/are the constraining stage(s)?
• Where should buffers be / not be?

Page 17: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

New paradigm problem-solving: the Goldratt-Dettmer* "Thinking Tools"
What to change (1):
• CURRENT REALITY: core problem + (other) root causes → intermediate effects → undesirable effects
What to change to (2, 3):
• CONFLICT RESOLUTION: objective ← requirements + INJECTIONS ← prerequisites + conflicts
• FUTURE REALITY: CURRENT REALITY + injections → intermediate effects → desired effects
How to change (4, 5):
• PRE-REQUISITES: objective ← intermediate objectives, with obstacles to overcome
• TRANSITION: CURRENT REALITY + needs + specific actions → intermediate effects → objective
* very slightly paraphrased here
Sources: Dettmer, W. Goldratt's Theory of Constraints (ASQ 1997); Thompson, N. "Best Practices" & Context-Driven – building a bridge (STAREast 2003)

Page 18: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

The thinking tools are complementary diagrams
• Causes and effects
• Necessary and sufficient conditions
http://www.osaka-gu.ac.jp/php/nakagawa/TRIZ/eTRIZ/eforum/eETRIACon2003/Fig11TillmannB.jpg
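One way to see how the two diagram types complement each other is to treat a cause-effect tree as rules whose antecedents are jointly sufficient for their effect, then forward-chain from the root causes. This minimal sketch reuses the cause-effect chain from the SWOT example later in the deck; the rule encoding itself is my own illustration, not part of the presentation.

```python
# Sketch of a cause-effect (current reality) tree: each effect has a set of
# antecedents that are jointly sufficient to cause it. From root causes we
# can derive which undesirable effects follow.

SUFFICIENT_CAUSES = {  # effect -> antecedents that together cause it
    "test specs are large and texty": {"specs are heavy text documents",
                                       "testers prefer text to diagrams"},
    "coverage omissions and overlaps": {"test specs are large and texty"},
    "too many failures in live": {"coverage omissions and overlaps"},
}

def derive_effects(facts, rules):
    """Fire any rule whose antecedents all hold (simple forward chaining)."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for effect, antecedents in rules.items():
            if effect not in known and antecedents <= known:
                known.add(effect)
                changed = True
    return known - set(facts)  # only the derived effects

roots = {"specs are heavy text documents", "testers prefer text to diagrams"}
print(sorted(derive_effects(roots, SUFFICIENT_CAUSES)))
```

Note that both root causes are needed before the first effect fires: that is the "sufficient causes" half; the prerequisite trees work the other way round, on necessary conditions.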

Page 19: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

Why better than "traditional" process improvement in software testing
[Diagram: Test Maturity Model (TMM℠), Test Process Improvement (TPI®) and Test Organisation Maturity (TOM™) all use predefined subject areas and maturity levels; TOM offers some flexibility; medical analogies]
Sources:
TMM℠ – http://www.stsc.hill.af.mil/crosstalk/1996/09/developi.asp
TPI® – based on http://www.sogeti.nl/images/TPI_Scoring_Tool_v1_98_tcm6-30254.xls
TOM™ – based on http://www.evolutif.co.uk/tom/tom200.pdf, as interpreted by Reid, S. Test Process Improvement – An Empirical Study (EuroSTAR 2003)

Page 20: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

Extending the new paradigm to testing: by rearranging TPI's key areas… …we can begin to see cause-effect trees…
1. Test strategy; 2. Lifecycle model; 3. Moment of involvement; 4. Estimating & planning; 5. Test specification techniques; 6. Static techniques; 7. Metrics; 8. Test automation; 9. Test environment; 10. Office environment; 11. Commitment & motivation; 12. Test functions & training; 13. Scope of methodology; 14. Communication; 15. Reporting; 16. Defect management; 17. Testware management; 18. Test process management; 19. Evaluation; 20. Low-level testing

Page 21: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

Cause-effect trees: can start with TPI's inbuilt dependencies …eg for getting to at least level A throughout
[Diagram (slightly simplified): the 20 key areas connected by TPI's inbuilt dependencies, each annotated with its target level, eg A: Single high-level test; A: Informal techniques (later B: Formal techniques); A: Budget & time; A: Plan, spec, exec; A: Complete test basis; A: Substantiated; A: Product for project; B: Testing integrated in project organisation; B: Progress, activities, prioritised defects; A: Defects; A: Internal; A: Planning & execution (B: + monitoring & adjustment); A: Managed-controlled; A: Testers & Test Manager; A: Project-specific]
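The idea of starting from TPI's inbuilt dependencies can be sketched as a topological sort: key areas that others depend on are addressed first. The dependency edges below are hypothetical illustrations of the shape of such a graph, not TPI's actual dependency matrix.

```python
# Sketch of ordering improvement work from key-area dependencies.
# The edges are hypothetical examples, not TPI's real dependency matrix.

from graphlib import TopologicalSorter  # standard library, Python 3.9+

DEPENDS_ON = {  # key area -> areas it depends on (hypothetical)
    "test specification techniques": {"lifecycle model"},
    "test strategy": {"lifecycle model", "moment of involvement"},
    "metrics": {"reporting"},
    "reporting": {"defect management"},
}

order = list(TopologicalSorter(DEPENDS_ON).static_order())
print(order)  # prerequisites appear before the areas that depend on them
```

The same structure would also detect circular dependencies (graphlib raises `CycleError`), which in a cause-effect tree signals a modelling mistake.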

Page 22: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

Can add extra "key areas", lifecycle inputs & outputs, general categories …eg TPI / TMap's four "cornerstones"
[Diagram: the 20 key areas grouped under the four cornerstones – LIFECYCLE in general, TECHNIQUES in general, INFRASTRUCTURE in general, ORGANISATION in general – plus INPUTS & INFLUENCES on testing, OUTPUTS from testing, + Test data, + Risk-Based STAR]

Page 23: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

Can go beyond the fixed questions: SWOT each subject area
(small Post-it® notes are good for this; solid borders denote as in TPI, dashed borders denote additional)
[Diagram: SWOT grids (Strengths / Weaknesses / Opportunities / Threats) for two areas, INPUTS & INFLUENCES on STAR and 4. Estimating & planning. Example notes:
• Strengths: monitored, and adjustments made if needed
• Weaknesses: not substantiated, just "we did it as in the previous project"; system specs are heavy text documents; system specs & designs are defective, just timeboxed; system requirements are agreed too late; too busy for well-considered estimating & planning
• Threats: the most experienced business analysts are leaving, more may follow; release dates are fixed; can't recruit more staff; the squeeze on testing is likely to worsen
• Opportunities: some managers are considering agile methods; business analysts may be motivated by UML training]

Page 24: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

Applying the thinking tools to information from SWOT analysis
The SWOT method can be "nested", eg aggregate up from individual subject areas to the whole lifecycle. Using extracts from both the 1st & 2nd examples:
[Diagram:
• CURRENT REALITY: "SDLC method does not encourage diagrams", "Culture of our testers is to prefer large text documents to diagrams" and "System specs are heavy text documents" lead to "Test specs are large & 'texty'", hence "Test coverage omissions & overlaps", hence "Too many failures in Live"; mitigation: can still improve coverage at macro level with informal techniques (80/20)
• CONFLICT RESOLUTION and FUTURE REALITY draw on Opportunities: "Some managers are considering agile methods", "Business analysts may be motivated by UML training"
• PRE-REQUISITES: anticipating & overcoming obstacles (use Threats to help identify obstacles; use Strengths to help amplify opportunities)
• TRANSITION (action planning): STRATEGIC – improve the SDLC method; TACTICAL – address culture by worked examples of diagrams; TACTICAL – include tables & diagrams in test specifications; ONGOING – techniques training & coaching]

Page 25: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

Difficulties in problem-solving: conflict resolution (eg for documentation)
The objectives of documentation are to help build and maintain a fit-for-purpose system by knowing and agreeing what is built and what is tested.
CONFLICT: "We need more documentation" vs "We need less documentation"
Arguments for less: "Signed-off requirements are counterproductive to systems meeting real user needs now"; "Documented test plans are counterproductive to the best testing"; "People never read any documentation"; specifications are like inventory, with no end value
Arguments for more: "They will when their memories have faded or when there's a contract dispute"; "Reviews are powerful at finding defects early, but it's difficult to review just speech"; "If it's not written, it can't be signed off"; "Test reports need to be formal documents"; "Will the live system be maintained by its developers?" No! – documentation is still needed for maintenance after go-live; our users cannot be on-site with the project throughout
Resolving injections: documentation varies – need to distinguish necessary from unnecessary; need to distinguish quality of documentation, not just quantity; can mix exploratory & scripted testing ("Are test analysts writing tests for others to run?" No!); "Sign-off can be by agreed meeting outcomes" – though "Are there few enough people to make frequent widespread meetings practical?" No!; "What documentation is needed for contractual reasons? Still time to negotiate?" Yes!; agree in a workshop what documentation is needed; documentation doesn't have to be paper: use wikis etc; make maximum use of tables & diagrams
Developed further to Daich, G.T. Software documentation superstitions (STAREast 2002). See also Rüping, A. Agile documentation (Wiley 2003)

Page 26: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

Not only process improvement – we can apply the thinking tools to defining "appropriate" practices!
[Diagram, steps 1 to 5b:
1. CURRENT REALITY – Context: root causes → intermediate effects → effects; "methodology unhappy with" (actions); "unsure how best to test" (conditions)
2a. CONFLICT RESOLUTION (upper) – Always-good principles: objectives ← requirements ← "prerequisites"
2b. CONFLICT RESOLUTION (lower) – Good practices in this context: extremes, POSITIONING + justifications, sub-requirements (valid & invalid assumptions) + INJECTIONS
3. FUTURE REALITY – What "appropriate" means in this context: REALITY + injections → intermediate effects → desired effects
4. PRE-REQUISITES – Questions to consider: sub-prerequisites, obstacles, intermediate sub-prerequisites
5a/5b. TRANSITION – Choice categories & actions + NEEDS + INTERACTIONS: specific actions → intermediate effects → sub-objectives → objectives]

Page 27: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

This is a structure I argued could build a bridge between "best practices" and context-driven
[Diagram: Best Practice (risk: "fossilised thinking") and Context-Driven (risk: "formalised sloppiness") span What and How; the unifying points are constraints, requirements, objectives etc; "always-good" principles; Goldratt's "thinking tools"; and expert pragmatism with structure]

Page 28: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

Context (CURRENT REALITY) – 1
• Unsure how best to test; methodology happy with; methodology unhappy with
Context factors (SCOPE, COST, TIME, QUALITY / RISK FACTORS):
• Business/organisation sector; nation (eg USA); corporate culture
• Technology; application type
• Legal constraints: regulation, standards
• Moral constraints, eg: human safety; money, property; convenience
• Process constraints, eg: quality management, configuration management
• Job type & size: project/programme; bespoke/product; new/maintenance
• Resources: money (skills, environments), time

Page 29: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

Always-good principles (CONFLICT RESOLUTION upper level) – 2a
Always-good: effectiveness; efficiency; risk management; quality management; insurance; assurance
• V-model: what testing against; W-model: quality management
• Risks: list & evaluate; prioritise tests based on risks; tailor risks & priorities etc to factors
• Define & detect errors (UT, IT, ST); give confidence (AT)
• Refine test specifications progressively: plan based on priorities & constraints; design flexible tests to fit; allow appropriate script format(s); use synthetic + lifelike data
• Allow & assess for coverage changes; document execution & management procedures
• Distinguish problems from change requests; prioritise urgency & importance
• Distinguish retesting from regression testing
• Use handover & acceptance criteria
• Define & measure test coverage; measure progress & problem significance
• Be pragmatic over quality targets; quantify residual risks & confidence
• Decide process targets & improve over time; define & use metrics; assess where errors were originally made
• Define & agree roles & responsibilities; use appropriate skills mix; use independent system & acceptance testers
• Use appropriate techniques & patterns; use appropriate tools
• Plan early, then rehearse-run, acceptance tests
• Optimise efficiency

Page 30: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

Conflicting interpretations of these principles
The next diagram will take each box from the previous diagram and assess it on a formal-informal continuum. In preparation for this: what do we mean by "formality"?
• adherence to standards and/or proprietary methods
• detail
• amount of documentation
• scientific-ness
• degree of control
• repeatability
• consistency
• contracted-ness
• trained-ness and certification of staff
• "ceremony", eg degree to which tests need to be witnessed, results audited, progress reported
• any others?

Page 31: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

Good practices in this context (CONFLICT RESOLUTION lower level) – 2b
APPROPRIATE USE OF V-model: what testing against
"Conflict": Use a waterfall V-model vs Don't use a V-model
Reasons against: the V-model is discredited; we want to be trendy, anyway; we're too lazy to think; we're doing adaptive development (no specs); we're doing iterative development; we're object-oriented; documentation must be minimised; we have little time
Reasons for: many people stand by the V-model; we want baselines to test against; we want to test the viewpoints of users, someone expert & independent, designers, and programmers; different levels mitigate different risks; two heads are better than one; all systems are integrated from parts
Injections (resolving the conflict): NEED NOT BE 4 LEVELS; NEED NOT BE 1-1 CORRESPONDENCE SPECS-LEVELS; THEY ARE LEVELS, NOT STAGES; MULTIPLE PARTIAL PASSES; SOME SPECS ARE OUT OF DATE / IMPERFECT, BUT WE COPE; CAN USE EXPLORATORY TESTING AGAINST CONSENSUS BASIS; V-MODEL IS IMPLICIT IN BINDER'S BOOK Testing OO systems: models, patterns & tools

Page 32: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

What "appropriate" means in this context (FUTURE REALITY) – 3
• NEED NOT BE 4 LEVELS: a V-model with only 3 levels – acceptance (v. consensus), system (v. spec), unit (informal)
• We don't need a separate integration test level: our system is very simple
• We do need separate development & acceptance test levels: the system has users, (potentially) expert & independent testers, designers (where significant), and programmers
• Our user requirements are out of date and were vague when written; our programmers hate documentation; but we do have a good functional spec and independent testers available

Page 33: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

And so on… overall what we have done is deconstruct then reconstruct: the framework is a "meta-V-model"
[Diagram: a V shape. Descending leg: all possible contexts (CURRENT REALITY) → always-good principles / each practice to examine (CONFLICT RESOLUTION upper & lower) → questions to consider (PRE-REQUISITES) → choice categories & actions / choices (TRANSITION upper & lower). Ascending leg: what "appropriate" means in your context → your context (FUTURE REALITY)]

Page 34: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

Conclusions
• Summary:
– Toyota's success (and penetration of Just In Time)
– "The Goldratt Trilogy":
• 1. Things that flow through a process: inventory, value
• 2. Constraints on process, and thinking tools to improve
• 3. From process improvement to process definition, eg context-driven
• Lessons learned:
– three papers are enough?
• Take away:
– read references – Dettmer is key
• Way forward:
– examples!

Page 35: Thinking tools - From top motors through s'ware proc improv't to context-driven (2007)

Key references
• Context-Driven: Kaner, Bach & Pettichord (2002) Lessons learned in software testing, Wiley
• Best Practice: ISEB, ISTQB??
• My inspiration: Jens Pas (EuroSTAR 1998) Software testing metrics; Gregory Daich (STAREast 2002) Software documentation superstitions
• Theory of Constraints understanding: Eliyahu M. Goldratt (1984, then 1992 with Jeff Cox) The Goal; (1986, with R. Fox) The Race; (1997) Critical Chain
• TOC overview and the thinking tools: H. William Dettmer (1997) Goldratt's Theory of Constraints: a systems approach to continuous improvement, ASQ
• Related (but differently-specialised) thinking from the Agile community: Alistair Cockburn, A methodology per project, www.crystalmethodologies.org; Mary Poppendieck, Lean development: an agile toolkit, www.poppendieck.com