
Quantification: a driver for cultural change
C. C. Shelley
OXFORD SOFTWARE ENGINEERING Ltd
“Take away number in all things and all things perish. Take calculation from the world and all is enveloped in dark ignorance, nor can he who does not know the way to reckon be distinguished from the rest of the animals.”
St. Isidore of Seville (c. 600)
Objective
•To persuade you that:
– quantification and measurement are primary indicators and enablers of mature software development
– CMMI requires and expects quantification
– and CMMI contains some of the tools you will need
...Status of Software Measurement
•Emerged as a new practice in 1976
– very popular for a while...
– ...explored many aspects of software and software development...
– ...but failed to deliver on its promise
•Currently unfashionable and unpopular
– perceived as a stick for management to beat developers with...
– ...or a low-value bureaucratic burden...
– ...frequently dysfunctional *
Current 'state of practice'
•'Mediæval'
– Schoolmen exploring and analysing surviving IP from the ancient world
– Emergence of the first universities
– An extraordinary flowering of reason and logic
– But reasoning is almost totally qualitative
•Craftsmen & Artisans...
– Construct some of the most memorable artefacts, ever
– Using empirical 'know-how',
– 'sub-Euclidean' geometry
– And rule of (calloused) thumb
– Also almost entirely qualitative *
however...
•Quantification starts to emerge...
– improbably, time is quantized – hours
– Publicly
– (and the calendar comes under intense scrutiny and is eventually reformed)
The Appraisal
•As is music...
– Remembered
– Shared
– Standard
– And (critically) visual *
Measurement in CMMI®...
•(See: www.osel.co.uk/presentations/cmmiandmetrics.pdf )
•CMMI introduced a powerful tool in the Measurement and Analysis (M&A) PA
•'Policy' is an opportunity for cultural change
•Basili's Goal Question (Indicator) Metric
•Which is often misinterpreted:
– A one-shot deal – put 'metrics' in place
– Presumed to be 'systemic' reporting (admittedly its original use) – no local or tactical measurement
– The requirement for pervasive process measurement is somewhat eroded – the M&A 'common feature' is now only a generic vestige
– 'High maturity' measurement is severely limited for design-intensive environments
...Measurement in CMMI®
•Mature organizations tend to have a culture conducive to good measurement
•Characteristics:
– Simple
– Pervasive
– Competent
– Local
– Trusted
– Low visibility (!)
– Seems obvious
– ...and more than GQM
•Chicken or Egg?
( In passing...
•Hint for quick PA evaluation
– Credible measurement is the last characteristic to emerge as a PA becomes institutionalized
• Process data
– ...useful,
– ...usable...
– ...used (Ed Weller)
• Someone cares )
Three quantification and measurement tools (in order)
•'What are you doing?'
– Binary (ish)
•Empirical Analysis Framework
– For exploratory analysis of data
•GQ(I)M
– For specifying and designing measures
'What are you doing?'...
•There are two fundamental types of analysis:

    Enumerative                 Analytic
    “How Many?”                 “Why?”
    Tactical / Operational      Strategic / SPI
    Provoke / Direct Change     Understand
    Supervised                  Delegated
    Public                      Private
...'What are you doing?'...
•Decide what you're doing – before anything else
– Or it will default to 'enumerative'
– And the choice suggests which types of tools to use
• GQM or EAF (or both, or elements of both, or either...)
…'What are you doing?'...
•How long until testing is finished?
•Have we found all the defects?
•How many defects have we found?
•Why do we keep getting this defect?
•What other defects of this type are we getting?
•How many times has this occurred?
•Could we prevent this defect from occurring again?
•Why is it taking so long?
•How long will it take to fix?
•What's the backlog?
•What's the acceptable number of defects?
•How do we compare to other organizations?
•How much is this costing us?
…'What are you doing?'
Empirical Analysis Framework...
…Empirical Analysis Framework...
•Mostly for analytic work (for gaining insights, growing understanding, 'doing science')
•(undocumented, tacit, progressive, scalable)
•Prerequisites: a need for information, understanding of context, models
•(optional, but can be valuable – read the data)
1. Show the data: graphics (and then stop, probably)
2. Summarize the data: arithmetic (and stop...)
3. Verify the data: hypothesis testing, statistical significance, statistical modelling (if you absolutely have to – very rarely required)
•Iterate and adapt as required
•“Graphs are friendly.”
•“Arithmetic often exists to make graphics possible.”
•“Graphs force us to notice the unexpected; nothing could be more important.”
John Tukey, EDA, p157
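A minimal sketch of steps 1 and 2 in Python (the defect fix-time data and names are invented for illustration; matplotlib and the standard statistics module are assumed):

    import statistics
    import matplotlib.pyplot as plt

    # Hypothetical data: days taken to close each defect, in closing order.
    fix_days = [2, 3, 3, 4, 5, 5, 6, 8, 9, 21]

    # 1. Show the data: a simple graphic first (and then, probably, stop).
    plt.plot(fix_days, marker="o")
    plt.xlabel("Defect (in order closed)")
    plt.ylabel("Days to fix")
    plt.title("Defect fix times")
    plt.show()

    # 2. Summarize the data: plain arithmetic, only if the picture needs support.
    print("median:", statistics.median(fix_days))
    print("mean:  ", statistics.mean(fix_days))

    # 3. Verify the data: formal statistics only if you absolutely have to --
    #    very rarely required, so deliberately omitted here.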
GQ(I)M (Goal Question (Indicator) Metric)
•For defining or selecting measures to meet information needs
– Designing measures and analyses
– Originally for specifying and designing metrics (and collection mechanisms) – then collecting the data...
– ...but now more likely to be used for identifying and selecting from software development's extant 'pervasive data'
GQM (Goal Question Metric)
…Setting Measurement Goals...
•Explore, refine and formalize the Measurement Goal
•Begin with the informal statement (or a revision, if prompted by later steps in the process) and place it into the format:
“The measurement goal is to:
<analyse> (characterize, predict, monitor…)
the <object> (process, product, model…)
in order to <purpose> (understand, manage, control, improve…)
the <quality focus> (cost, resources, timeliness…)
from the viewpoint of the <viewpoint> (developer, tester, user, manager…)
in the context of the <context> (test env, team, department, supplier…)”
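A hypothetical instantiation, with the project details invented for illustration:
“The measurement goal is to characterize the system test process in order to understand its timeliness from the viewpoint of the project manager in the context of the Alpha project team.”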
…Refining Measurement Goals – Asking Questions...
Begin by documenting the measurement goal (as a focus and aide-mémoire), then complete the four quadrants of the abstraction sheet:
1.Draft the quality focus – what matters to me/us?
2.Draft the baseline hypothesis – how are things at present?
–Then, when these are reasonable:
3.Identify the variation factors – what influences have an impact on the current situation?
4.Identify the impact of the variation factors – how do they influence the current situation?
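A sketch of the four quadrants for the hypothetical goal above (all values invented for illustration):
– Quality focus: elapsed days from defect report to verified fix
– Baseline hypothesis: most fixes close within a week; a few take far longer
– Variation factors: defect severity, component age, tester availability
– Impact of variation factors: defects in older components are expected to take roughly twice as long to fix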
… Refining Measurement Goals - Asking Questions...
•Next, develop the questions:
–Primed by the understanding developed in completing the abstraction sheets
–Not ‘what?’, ‘why?’, ‘how?’ – these should already be known from the process model
–Ask quantitative questions:
•How much?
•When?
•How many?
•How many more?
Refer to the process model for prompts
…Identifying and Defining Software Measures…
Why?
•“An operational definition puts communicable meaning into a concept… …without
operational definition, a specification is meaningless.”
— W. Edwards Deming, Out of the Crisis, Chapter Nine.
•Operational definitions of software measures tell people how to collect the data. They:
1.are the data ‘counting rules’
2.enable others to know what has been measured, and how it was measured – they
are communicable
3.enable others to repeat the measurement and get the same results – repeatability
•“…if there is no criterion for determining whether a given numeral should or should
not be assigned, it is not measurement”
–Stevens, S. S., “Measurement, Psychophysics, and Utility”, pp. 18–63 in C. West Churchman & Philburn Ratoosh (eds.), Measurement: Definitions and Theories. New York: John Wiley & Sons, 1959.
…Identifying and Defining Software Measures…
Operational definitions are important because…
1.There are no ‘true’ or ‘exact’ measures.
•Measurements (of the speed of light, the population of Manchester, GDP…) all depend for their result on the method used – i.e. the operational definition. Even apparent ‘absolutes’ will be found to be subject to this. (Try defining the criteria of any measure to see how quickly things become specific.)
2.Measurement practice in industry is more exacting than in pure science or teaching.
•Pure science works to the best precision and accuracy affordable – regardless of how poor these may be – and reports estimates of errors.
•Industry is required to work to satisfy commercial or contractual needs. Failure to measure to specification may lead to misunderstanding, financial loss or legal action.
…Identifying and Defining Software Measures…
How?
•The software measurement literature – and the measurement literature generally – is remarkably limited
•Kan describes operational (metrics) definitions as ‘definitions that actually spell out the metrics to be used and the procedures used to obtain the data… …an operational definition of software product defect rate would indicate the formula for defect rate, what defect is to be measured [ ], what denominator [ ], how to measure, and so forth…’
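A minimal sketch of such a definition expressed as explicit counting rules in Python (the severities, phases and KLOC denominator are assumptions for illustration, not Kan's actual rules):

    from dataclasses import dataclass

    @dataclass
    class Defect:
        severity: str       # e.g. "critical", "major", "minor" (assumed labels)
        phase_found: str    # e.g. "system_test", "field"
        is_duplicate: bool

    def defect_rate(defects: list[Defect], kloc: float) -> float:
        """Defects per KLOC under explicit, communicable counting rules:
        count only non-duplicate defects found in system test or the field;
        the denominator is thousands of delivered source lines (KLOC)."""
        counted = [d for d in defects
                   if not d.is_duplicate
                   and d.phase_found in ("system_test", "field")]
        return len(counted) / kloc

    # Example with invented numbers: 12 counted defects in 40 KLOC gives
    # 0.3 defects/KLOC; anyone applying the same rules to the same data
    # should get the same result (repeatability).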
…Identifying and Defining Software Measures…
How?
•The best method for defining measures is that of Park et al. (CMU/SEI-96-HB-002):
•It is checklist-based
•The procedure has three parts:
1.Build a structured checklist
2.Build a matching ‘usage matrix’
3.‘Check’ the items on the checklist
NB – this is a technical activity and should be treated with care
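A minimal sketch of the structured-checklist idea as data, here for a staff-hour measure (the attributes and values are invented and are not Park's actual checklist items):

    # Each attribute of the measured entity lists its possible values and
    # whether each value is included in the count (all values assumed).
    effort_hours_checklist = {
        "activity": {"design": True, "coding": True, "unit_test": True,
                     "meetings": False},
        "employment_class": {"staff": True, "contractor": True,
                             "subcontractor": False},
        "hour_type": {"regular": True, "overtime_paid": True,
                      "overtime_unpaid": False},
    }

    def is_counted(record: dict[str, str], checklist: dict) -> bool:
        """A raw record is counted only if every one of its attribute
        values is marked as included in the checklist."""
        return all(checklist[attr].get(value, False)
                   for attr, value in record.items())

    # Example: a contractor's paid overtime coding hours are counted.
    print(is_counted({"activity": "coding",
                      "employment_class": "contractor",
                      "hour_type": "overtime_paid"},
                     effort_hours_checklist))   # -> True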
Putting it in place...
“ It should be borne in mind that there is nothing
more difficult to handle, more doubtful of success,
and more dangerous to carry through than
initiating changes in a state’s constitution. The
innovator makes enemies of all those who
prospered under the old order, and only lukewarm
support is forthcoming from those that would
prosper under the new. Their support is lukewarm
partly from fear of their adversaries, who have the
existing laws on their side, and partly because men
are generally incredulous, never really trusting new
things unless they have tested them by experience.
In consequence, whenever those who oppose the
changes can do so they attack vigorously, and the
defence made by the others is only lukewarm. So
both the innovator and his friends come to grief. ”
Machiavelli: The Prince, Chapter VI
Two steps to good measurement (one easy, one hard)
1.Recognize poor or dysfunctional measurement *
2.Eliminate it
1. Recognizing dysfunctional measurement…
•Investigate
•Keep the specifics confidential
•Work with developers, testers, team leads, PMs, line managers and the board to find answers * to the following questions:
1. Recognizing dysfunctional measurement…
1.What measurement data do you collect?
2.Why?
3.How do you use it?
4.What decisions or actions do you take based on your analysis of the data?
1. Recognizing dysfunctional measurement…
And some technical questions:
1.What are the <data type or count> definitions?
2.How accurate does the data need to be?
3.How accurate is it?
2. Eliminating dysfunctional measurement…
Note: This is difficult
1.Establish ‘policies’ for good, functional measurement, and monitor the move to – and adherence to – them, e.g. policies...
•that establish local ownership of data
•and recognize and value data privacy (i.e. reduce ‘visibility’ to enable informational measurement and acknowledge the ‘information horizon’)
•and/or support aggregation (to avoid focusing on the specific, and to avoid sub-optimization)
…2. Eliminating dysfunctional measurement…
2.Support technical correctness *
•GQM
•‘M’ definitions, and access to those definitions by collectors and users
•Simple analyses (good graphics and minimal arithmetic, esp. division – 'the devil's operator'; see the note below)
•V&V
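A tiny illustration of why division earns that name (all numbers invented): one defect found in a 50-line module gives 20 defects/KLOC, while the same single defect in a 2,000-line module gives 0.5 defects/KLOC – a 40× swing driven almost entirely by a small, noisy denominator rather than by quality.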
…2. Eliminating dysfunctional measurement…
3.Periodically evaluate the information need and the value delivered by measurements
•At all levels
•Assess the degree of dysfunction (see previous questions)
•Attempt * to stop collecting dysfunctional (and low value) data
Related material
•Cultural Change:
– Methods and tools for evaluation and change
– www.osel.co.uk/presentations/ccp.pdf
– www.osel.co.uk/papers/changingculture.pdf
– www.osel.co.uk/papers/listofattributes.pdf
•Process Improvement
– Hunting for Anti-patterns
– www.osel.co.uk/presentations/HAP.pdf
– Getting the best from CMMI
– www.osel.co.uk/papers/energizingcmmi.pdf
‘If Software Metrics leads to something like Taylorian “scientific management” in software development, I for one, will bow my head in shame.’
Gerald M. Weinberg
writing in the foreword of Tom Gilb’s ‘Software Metrics’, the first software metrics book
‘The good news is that you can succeed in producing a culture conducive
to measurement. There are organizations in which people have given
themselves completely to the pursuit of organizational goals…
…organizations in which members hunger for measurement as a tool
that helps get the job done… …To use measurement inappropriately
would betray a sacred trust, and no one would consider such a
betrayal.’
Robert D. Austin
OXFORD
SOFTWARE ENGINEERING
LIMITED
9 Spinners Court, 53 West End,
Witney,
Oxfordshire
OX28 1NH
www.osel.co.uk
[email protected]
Tel. +44 (0) 1993 700878