Systems Thinking for Adaptors & Innovators

Abstract

In our consulting and teaching work we face a pressing requirement: to increase the adoption of Systems Thinking in addressing current societal challenges. As students of the pioneering management thinkers Beer, Ackoff and Deming, we now see the struggles as their holistic, emergent thinking is applied narrowly and formulaically.

Using information from occupational preferences we recognise the difficulty of Systems Thinking for the adaptor personality: a profile of tool users, of reductionist thinking and checklist methodologies.

Geoff Elliott & Roger James

Learning Management

It is perhaps a mixed blessing that we are seeing a growing interest in Systems Thinking as organisations in both the public and private sectors require better techniques to cope with the pace and complexity of change. But this interest is being met with a mixed response from those providing answers and solutions. Systems Thinking is at a tipping point – where great work can re-establish the importance of Systems Thinking in the portfolio of interventions, or where an influx of ill-advised applications, led by a coterie of poor consultants, guarantees it is seen as a short-term fad.

Our point of view comes from 30 years of working as both practitioners and academics. Avoiding age-related dogmatism – the rigid orthodoxy of approach – we believe we approach assignments open-minded but not empty-headed. We have built a perspective across many techniques, grounded in the past and challenged by the future. It spans the classic works of Ackoff, Beer and Deming, developed in the factory and in manufacturing but now applied to the global, technology-led knowledge industries. It is an era during which the use of information has been revolutionised: from the slim pickings used for the elegant theories of Operations Research to the brute force of big data and model-less heuristics.

The practice of Systems Thinking has been adversely affected by the schism that appeared in Operations Research: in the early 1980s the discipline split. The chasm lay between methods for simple problems, deemed incapable of dealing with complex social problems, and methods for complex social problems, too academic and obtuse for everyday needs. The pragmatic middle, delivering practical solutions for difficult problems, became a barren area for academic research yet remains a significant area of our real-world assignments and practice.

Boisot and McKelvey have developed their own critique of the difficulty of organisational science: “Management scholars thus face a stark choice: (a) either say something that practitioners want to hear but do so through narratives in which rhetorically dramatic effects are achieved at the expense of academic rigor or (b) maintain academic integrity by sacrificing perceived practitioner relevance.” They are trapped between the characteristics of idea propagation, which demand wide applicability, and the need for idea novelty, which demands academic purity. In Systems Thinking this is a specifically acute problem – real-world problems, the wicked problems of Rittel and Webber, often produce hybrid, even mongrel, solutions. The pure meme-otype, beloved of academic research, is seldom encountered in practice despite its prevalence in case studies. Real solutions and the real world often involve fuzzy boundaries, purposeful agents and things that cheat.

Current practice in ST appears caught between the over-simple and the over-elaborate. In the former, critical elements and behaviours of the system are ignored and simple solutions are forcibly applied; in the latter, the complexity and detail of the technique appear out of line with a practical, parsimonious solution. Either way ST stands to fail.

There is great variety in Systems Thinking approaches or, as they are called, methodologies. It is easy to know how to use each approach, but the struggle comes with knowing why to use a technique, or when to use it. Alternatively, we invent specific approaches, the latest being lean systems thinking, in ignorance of what has gone before, in the belief that each is the new universal answer [the curse of the management fad].

Principled Cheating

Anyone familiar with Ackoff’s work will recall his example of the mirror in the lift as a way of ‘solving’ the engineering of the slow lift – the shortest version of this comes from Re-designing Society and simply states: “Complaints of occupants of an office building about slow elevator service were dissolved not by speeding up or adding elevators but by putting mirrors on the walls of the waiting areas. This occupied those waiting in looking at each other or themselves without appearing to do so. Then time passed quickly.” Much longer, more elaborate and suitably embroidered versions of this story appear in his other books – amplified to meet the sense of drama and pathos required of the academic writer.

In a practical engineering sense the mirror solution is no solution at all, but in the complex, real social system it is a clever and dramatic intervention. We teach the rock and bird metaphor: imagine trying to throw either into a waste paper basket at the far end of the room. Both obey the laws of physics – ballistics, gravity and aerodynamics apply to the trajectory of either. For the rock we could write a case study of formulating the problem, of solving the range of differential equations and of the training required of the thrower – all finishing with the event where the Olympic-standard athlete hits the target: a perfect solution for an unreal and restrictive problem. Contrast this with the bird: here cunning replaces athleticism. Simply place bird food in the wastepaper bin and, without the need for the big equations or the hero athlete, we produce a much more applicable, scalable and robust solution.

People Cheating

Guilfoyle, writing from the perspective of a serving police officer, presents an excellent critique of the manic marriage of targets and deliverology so characteristic of the recent government agenda. The strength of the critique makes the case for authentic systems thinking, but sadly no answers are provided.

At the core of the criticism of the theory of governance by targets lie two overriding flaws, with the reliance upon:

  • ‘Synecdoche’—taking a part to represent the whole. In performance terms, this is where one takes the performance of a part of the system and interprets it as a surrogate measure of the whole system’s performance;

and

  • The assumption that governance by targets can ever be immune to ‘gaming’.

[It is ironic that the biggest critics of deliverology fall into the trap of synecdoche in their own critique, denouncing all targets categorically without understanding the difference between good targets and bad targets. The challenge lies in discriminating between good and bad, not in decrying targets].

These criticisms are addressed by the appropriate use of Systems Thinking approaches: they are based on holism and address purposefulness.

Sharp Tools: Blunt minds

The pioneers of Systems Thinking, ranging from Beer to Boulding or Deming, would not fall into this trap; they understood the characteristics of the human world and the complexities that lie within. They had time to think – coming from a period when there were few one-size-fits-all solutions and when an elegance of ideas had to compensate for a shortage of data.

Socrates in his teaching had a strong distrust of writing, suggesting that “it will produce forgetfulness in the minds of those who learn to use it, because they will not practice their memory”. Whilst Boulding encouraged the use of techniques he was equally guarded:

“By means of mathematics we purchase a great ease of manipulation at the cost of a certain loss of complexity of content. If we forget this cost, and it is easy for it to fall to the back of our minds, then the very ease with which we manipulate symbols may be our undoing. All I am saying is that mathematics in any of its applied fields is a wonderful servant but a very bad master: it is so good a servant that there is a tendency for it to become an unjust steward and usurp the master’s place.”

It is simple human nature to apply what we know, in contrast to what is needed. This is evident in the current fixation on lean methods as the one solution to improving efficiency, without asking questions of effectiveness; we need to understand where, when and if this is an appropriate response. Many excellent ST approaches are featured in the academic literature, such as System Dynamics, Senge’s organisational learning school or the interest in wicked, messy problems.

Systems Thinking Approaches or Approaches to Systems Thinking?

So what are we to do? Our common challenge is how to teach Systems Thinking and Systems Practice, usually in a practice setting but also academically. Typically our audience comprises ‘middle experience’ staff, usually with a background in lean and Six Sigma. Usually we struggle with the gap between the strong analytical traits often well developed in such groups and the conceptual understanding needed for Systems Thinking. In this respect many people do not understand the difference between analysis and synthesis or, to put it another way, convergent and divergent thinking.

In response to a request from one such group for a checklist of Systems Thinking we realised the central problem: it was the problem epitomised in the following cartoon [taken from the Open University course on creativity]. Occupational profiling studies by McBer, reported in Competence at Work, suggest that 5% or fewer of individuals are naturally conceptual thinkers [the innovators of the cartoon].

 

In our consulting the difference is obvious and immediate: a few individuals are instinctive Systems Thinkers; the majority, adaptors, are the ones who need a checklist for Systems Thinking, whose focus is the steps and stages of the approach rather than the concepts and the problem. Individuals looking to Systems Thinking as the next step from a procedural discipline such as Lean or Six Sigma become trapped in the detail without appreciating the conceptual basis.

A step-by-step Systems Thinking approach for Adaptors

Russell Ackoff in his work had the opportunity for grand gestures and sweeping critiques – excellent in conference but poor as an instructional technique. Looking for a checklist approach adapted for adaptors, we began with the series of articles on Systems Thinking by William Dettmer. In Part 6, entitled Systems and Constraints: The Concept of Leverage, Dettmer introduces the Theory of Constraints, reminding us of the importance of the system constraint as the only point of useful intervention.

This is the complete antithesis of the typical wicked problem but, usefully, it represents one end of a spectrum of systems intervention – the end represented by a closed system, defined by analysis and requiring the optimisation of a single variable. As we study any system – under conditions of change, longer timescales, the introduction of social factors – we can start to identify where the models weaken and approximations become invalid. This is the practical illustration of George Box’s dictum “all models are wrong, some models are useful”: our practical world is comprised of a number of simplifying assumptions which allow us to be efficient but which, unless challenged, ultimately cause us to be ineffective.
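Dettmer’s constraint principle lends itself to a checklist-friendly sketch. The code below is our own illustration, not Dettmer’s (the stage names and rates are invented): in a serial pipeline, throughput is set by the slowest stage, so improving anything other than the constraint changes nothing.

```python
# Illustrative sketch of the Theory of Constraints in a serial pipeline:
# system throughput equals the rate of the slowest stage (the constraint).

def throughput(stage_rates):
    """Throughput of a serial pipeline, in units per hour."""
    return min(stage_rates)

# Hypothetical stage rates (units/hour)
stages = {"cut": 40, "weld": 25, "paint": 60}

base = throughput(stages.values())          # 25 - limited by 'weld'

stages["paint"] = 120                       # improve a non-constraint stage
assert throughput(stages.values()) == base  # the system does not change

stages["weld"] = 35                         # improve the constraint itself
assert throughput(stages.values()) == 35    # only now does output rise
```

The point of the toy model is the asymmetry: optimisation effort is only useful at one place in the system, which is why identifying the constraint must precede any intervention.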

Systems Thinkers often criticise reductionism – breaking a system into smaller and smaller parts – but, as explained by Anderson, the real challenge is to understand constructionism: how to move our students from the narrowing reductionist approach to the correct constructionist thinking. Anderson in More is Different makes the fundamental point that reductionism and constructionism are asymmetric: you can always disassemble a system by reductionism, but there is never a guarantee that you can re-assemble the parts into the original, or into a coherent, whole.

We may borrow the funnel experiment from Deming to explain the operational difference between reductionism and constructionism. As liquid is poured into a funnel, the flow is aligned and narrowed to a finer and finer focus: reductionism works! Reverse the simple linear flow from the narrow spout and the output is complex, the direction unpredictable and the asymmetry evident: constructionism is problematic!

This brings us to one of our techniques that help bridge the gap … the new factors prism.

Figure 1: The spectrum of problem-solving approaches, from those applicable to the closed physical systems on the left (‘the world of manufacturing’) to the open, often social, systems on the right (‘the world of purposeful systems’)

On the left we have the approaches for closed systems with limited problem dimensionality, often requiring optimisation. As we move right, driven by conditions of change or open systems, resolving the situation becomes more difficult as the number of dimensions increases. As the situation becomes more complex and the problem becomes more wicked we require a systems thinking approach, such as VSM, SSM, CSH and so on. The challenge is first to recognise that the situation being studied is no longer a closed system and then to identify which, of many possible, Systems Thinking approaches will provide insight.

The danger, introduced in the section on tools, is that we force the approach before we understand the problem, and here we offer our prism technique, which uses the problem characteristics to guide us towards an appropriate approach. It uses reductionist techniques to identify a constructionist approach and bridges the adaptor’s world of analysis to the innovator’s world of synthesis.

Mobs, rabbles, ideas and progress.

Reactions to the IFG Report

After being part of the team that put together the IFG report on Systems Failure, I am encouraged by the response but concerned about the range of opinion. Opinion is good and democratic but often not helpful when it comes to being able to deliver something. In reacting to discordant voices the conditioned response is more likely to be either blind stridency on the part of the report authors or accommodation, which leads to death by a thousand ideas, as studied by Belbin in his work on the collective IQ of teams.

Ideally the response is to synthesise a solution from as many differing opinions as possible while at the same time remaining capable of coherent action. The love-it/hate-it opinion on agile forces the either/or dichotomy of managing, which generates the pendulum swing of management fashion oscillating from one extreme to another. Discordant, strident voices become the trigger for management fads, and the reified “my way is best” simply defies the reality of the problem being tackled.

As a skilled manager, or sane individual, you avoid the worst of the extremes and learn by integrating the potential of new ideas with the practicality of the old ones. The passion of the zealots may inspire the heart but all too often messes with the head. If in doubt try the simple diagnostic of “when good ideas go bad”, where too much of a good thing becomes toxic – in the lab carrot juice can kill, not because it is inherently toxic at low levels but because in extremis the toxic limit is reached.

This thinking brings me to one of the key challenges and core peeves around IT. I also have a background in Operations Research (a 70s discipline) or Systems Thinking (the popular 00s term for the same thing), so I recognise the need to understand the problem before rushing into a solution. And it is precisely ‘solutionology’ that generates much of the disconnect in IT generally, in big IT and in big bad government IT. Einstein recognised this in his writing: “If I was engaged in a really difficult task and had just one hour to live I would spend at least 55 minutes on trying to understand the problem as I know that once I understand the problem the solutions come easily”.

Recent material on scale would suggest that scale per se is bad, and certainly government IT sees scale as special. Inherently it is neither bad nor special. Large and successful IT implementations dwarf government IT and give proof that scale does work – which raises questions of problem understanding, design and applicability in the cases where it does not. But it should not provoke the luddite response that seems fashionable among some consultants. And before you condemn big failed IT projects (such as the national identity card), think first how big successful IT systems work (such as Visa or Google).

The report took some time to consider procurement. For me the damning statistic is to compare the average of 77 weeks for major procurements with the 5 years in which Facebook has come from nothing to 25% of US web traffic – barely three and a half procurement cycles. It is here that the simple logic of agile – iteratively learning the problem and progressively delivering solutions – can make a big difference. In the video I suggest that it is not the failure of government IT that is bad; it is the long, slow and expensive failure of government IT that delivers the worst of all possible outcomes. Small companies, entrepreneurial outfits and researchers are not immune to failure – but they learn to fail fast and fail cheaply! Perhaps the successes would look after themselves if the failures could be managed cheaply, and whilst agile approaches handle success and failure equally well, the established methods manage only success.

Although we had a number of discussions on risk, here too agile makes a difference. Simply by focusing productive work on workable components, progress is evident and the need to ‘trust and verify’ can be managed around the tangible outputs and outcomes of work. Much of the institutional attitude to risk revolves around proxy measures: verifying hidden progress, or measuring inputs as a proxy for outputs. The academic theory of agile management draws from the manufacturing revolution – reducing queues, making work visible, eliminating waste, managing by outcomes and navigating by customer pull – and is only now being applied to IT development. Every supplier organisation should welcome this, remembering, of course, that its reputation is sustained on the last thing it delivered.

From the comments there are already some terrific ideas, deep expertise and substantial knowledge that can help government IT, but every idea is amplified by its integration with others, as opposed to standing in splendid isolation. Drawing from Léon Walras’s economic theory of utility, the value of the best ideas is the product of the quality of the idea and its adoption …
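The multiplicative claim above can be made concrete with a toy calculation (our own illustration; the numbers are invented): because delivered value is quality times adoption, a brilliant idea standing in splendid isolation loses to a merely good idea that plays nicely with others.

```python
# Toy model: delivered value = intrinsic quality x adoption (0..1).
# All numbers are invented purely to illustrate the multiplicative point.

def delivered_value(quality, adoption):
    """Value actually realised from an idea."""
    return quality * adoption

brilliant_isolated = delivered_value(9.0, 0.2)  # strong idea, poorly adopted
good_integrated = delivered_value(6.0, 0.8)     # weaker idea, widely adopted

assert good_integrated > brilliant_isolated     # integration beats brilliance
```

The design point is that either factor near zero drives the product towards zero, which is why coherence across ideas matters as much as the quality of any single one.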

Solving the right problem wrongly.

Tonight at the IFG I got into the complexity argument, where noble objectives defeat practical solutions.

Think of Google or Visa: through the clear simplicity of one solution they focus on secure access and make it work – completely and universally. The government takes the course of multiple safeguards and intermediaries.

So, based on the medieval ways of defending a castle, the government wisely adopts the multiple-defences approach but forgets that this multiplies the number of citadels: a solution which amplifies the problem.
And don’t forget the human factors. The technologist’s approach to password vulnerability is to make passwords more difficult to guess (and remember) and to change them more often. The net result is that ‘real people’ simply write their passwords down and stick them on their monitors – fixes that fail, just like government IT.

On the Toxicity of good ideas

When Good Ideas go Bad

My thinking for this last week has focused on the report “Systems Failure” published by the Institute for Government and the associated blog and twitter discussion [#ukgovit]. There are some really good comments but together they represent a cacophony of ideas – distilling a course of action, of best advice, from the din of divergent opinion is problematic.

The Value of Ideas – Quality * Coherence

It is all so reminiscent of my time at the British Computer Society Healthcare Group just prior to the launch of the National Programme for IT in the NHS, which became Connecting for Health and which history recognises as one of the largest failed promises in IT. Why so? How did this happen, and were the ideas people responsible [not simply the Dilbert-like cadre of dumb managers]?

Really, then, what happened was that the enthusiasm and ideas of the innovators produced the spark that something needed to be done but, of themselves, they were incapable of making it happen – too many strong opinions, too many divergent views and simply too many obdurate arguments. What then should a policy maker do?

In this case Tony Blair brought in the consultants who, with the power of bogus abstraction, captured the promise of the gaggle of enthusiasts but with a veneer of coherence. Their PowerPoints suggested it could be done – wait – that they could deliver the world’s biggest IT project, and that the commercial processes would help identify companies experienced enough to do the work and big enough to handle the responsibility.

Companies experienced enough to do the work and big enough to handle the responsibility? [My analytical mind objects to this last – experience is learned and size is a matter of diminishing populations. Small companies learn; big companies copy. The outcome was evident to anyone who knows statistics – the qualification of size produced such a small number of firms that it delivered a thin envelope of experience].

The situation was obvious if you spoke to the project teams within the large (and by now loss-making) contracts. Why were you chosen? Because we are big. Have you done anything like this before? No! All of the expertise from the passionate, knowledgeable enthusiasts was discarded – not because of the quality of their individual ideas but because together they became toxic. And of course the debris of the National Programme continues – a renewed scepticism about IT projects, a missed opportunity, good experience ignored and financial waste.

In another post I have introduced Belbin’s ideas on destructive IQ.

Statistics 101

The situation also reminds me of one of my first faux pas at work. Early in my career at a software company I was considered the statistics expert due to my ‘nerdy’ behaviour – fine, I relished it long before geek was cool. At a management meeting the operations director of the software house showed a plot of project performance, with project size graphed against project estimating accuracy (and hence profitability). Of some 50 points on the graph, about 40 appeared strongly correlated with a reasonable scatter – an excellent basis for a commercial company. The remaining 10 were for big projects and showed significant, if not random, scatter.

Invited by the Ops Director to comment, my immediate response was “don’t do big projects”. This was the message from the data but not the message the company, which was building its business on large projects, wanted to hear. It was also not the most astute answer but, 3 years later, after I had left the company, it was proved correct. The company lost money and was taken over as a result of its dependency on large projects. They were just too impatient to learn!

And so

Just two things to do ….

  1. To quote Bob Dylan – it’s peace and quiet we need, to go back to work again – hopefully taking just a few of the really good ideas with us. Not necessarily the best ideas, but the ones that play nicely together.
  2. Learn about the zone of statistical significance for experience – and take your time and your opportunities to build experience.