Canada is waking up, with many new developments underway in Web 2.0 innovation.
Government organizations are increasingly seeing the value of Web 2.0 solutions for providing social networking capabilities to their employees. A major Ontario government project is underway that will eventually support over 250,000 people across 58 government departments. Key technology for this initiative is being provided by Waterloo, Ont.-based Open Text.
Wikis, blogs and other Internet-based collaboration tools can also be used to communicate information during emergency situations such as the SARS outbreak. Online conferencing and broadcasting technologies can be used to host online discussions on issues such as climate change policies, federal and provincial budgets, or upcoming elections.
More effective use of government information
Public sector organizations collect, store and manage huge amounts of data covering areas such as health records, crime statistics, education and the economy. Unfortunately, this information is stored in numerous separate systems. Emerging Web 2.0 technologies allow government data to be "mashed up" by independent parties.
By combining disparate sources of information into consolidated applications, users can gain easier access to data and have it delivered in a context relevant to their needs.
The Ministry of Transportation in British Columbia, for example, combined MapQuest data with real-time traffic data to provide timely advisories to motorists. By providing third parties access to some structured and unstructured data, governments can enable these organizations to create information services that the authorities themselves do not provide.
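As a concrete, purely hypothetical sketch of this mashup pattern, the Python snippet below turns records from an assumed public traffic-advisory feed into markers a mapping service could plot. The endpoint URL and field names are illustrative assumptions, not the actual BC Ministry of Transportation or MapQuest APIs.

```python
# A minimal mashup sketch: take records from a (hypothetical) public
# traffic-advisory feed and turn them into simple map markers.
# The endpoint URL and field names are illustrative assumptions, not a real API.
import json
import urllib.request

TRAFFIC_FEED = "https://example.gov/api/traffic-advisories.json"  # assumed endpoint

def fetch_advisories(url=TRAFFIC_FEED):
    """Download the advisory feed and return it as a list of dicts."""
    with urllib.request.urlopen(url) as response:
        return json.load(response)

def to_map_markers(advisories):
    """Convert advisory records into marker dicts a mapping API could plot."""
    return [
        {"lat": a["latitude"], "lon": a["longitude"], "label": a["description"]}
        for a in advisories
        if "latitude" in a and "longitude" in a
    ]

if __name__ == "__main__":
    # Sample records stand in for a live feed so the sketch runs offline.
    sample = [
        {"latitude": 49.28, "longitude": -123.12, "description": "Lane closure on Hwy 1"},
        {"latitude": 48.43, "longitude": -123.37, "description": "Ferry traffic delays"},
    ]
    for marker in to_map_markers(sample):
        print(f'{marker["label"]}: ({marker["lat"]}, {marker["lon"]})')
```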
Streamline internal operations
Information silos that consume massive amounts of resources often restrict information flow. Sharing information within and across various agencies can drive higher quality and more timely outcomes.
For example, Intellipedia, a wiki developed by the U.S. intelligence community, permits employees across a number of security agencies to engage in open discussions on topics concerning them.
In April last year, the North Star Implementation Task Group, a team of Natural Resources Canada (NRCan) employees, developed an action plan for knowledge management which includes the examination of Web 2.0 technologies to enable online collaboration in government.
Their work has resulted in the creation of the NRCan Resource Wiki.
Annual government budget processes typically consume substantial resources and time before anything can be implemented. Online collaboration between budget officers and program managers can cut much of that work, according to Macmillan.
Policy analysts could also use wikis to develop and update policy briefing notes and ensure input comes from a wider group of people.
Attracting top talent
Governments around the world are challenged to obtain and retain top quality employees.
Today's high school and university students will become tomorrow's leaders, and the best way to reach out to these talented individuals is through the Internet-based technology they use, said a recent Deloitte report. The report urges governments to establish environments that challenge and engage the so-called Generation Y. It also suggests that governments study and emulate innovative recruitment programs at companies such as Google to improve their ability to develop, attract and retain talent.
Summary
We have been following developments in Web 2.0 beyond the government sector, and we see tremendous innovation in the FSI sector as well. Financial services firms are also waking up to the need to create new business models that support emerging generational expectations: work that is more social and engaging, with easy access to people across the organization, irrespective of their role, in order to get business done.
We often use the term "mashing" Web 2.0 solutions together, but in many respects we are smashing the old paradigms of how business, in its Frederick Taylor roots, was perceived, and replacing them with something far more human, authentic and interactive. Clearly, with the evolution of these new social enabling tools, we are well on our way to creating new employee, customer and partner experiences.
Key Web 2.0 Innovation and Growth Reflection Questions
1.) How well prepared is your organization to take advantage of these solutions?
2.) Do you have a Web 2.0 strategy, business owner, and roadmap that represent your overall business needs, rather than point solutions?
3.) Do your leadership team and employees have easy access to blogging or wiki tools to support their day-to-day interactions and increase opportunities for employee engagement?
4.) Are you exploring the value of virtual worlds and how this can make a difference to your organization?
If you would like a copy of our white paper on Web 2.0 that tackles some of these questions, send us an email to info@helixcommerce.com with your name, company and phone number and we would be pleased to forward you some of our recent best practice research.
References
Nestor E. Arellano, ITBusiness.ca, May 28, 2008.
Tuesday, May 20, 2008
Trends: The Power of Free
In Wired magazine, Chris Anderson argues that "free economics" will define the digital economy. Three technologies in particular have become increasingly ubiquitous: bandwidth, storage and processing.
We have seen tremendous shifts as the cost of transistors and computing processing power has moved from tens of dollars only a few decades ago to less than 0.00001 cents, and hence almost free to the consumer.
This has never happened before. With the web-based economy comes an increasingly rich array of products and services available at no charge. Yahoo, for example, offers users unlimited storage capacity at no cost. Google offers its applications free to individual users, along with the storage they need. The Wall Street Journal may only need 1% of its online readers to subscribe to premium content in order to subsidize, and give away, most of its content for nothing while still making a profit.
Anderson, in his new book Free (to be released in 2009), even describes a company that makes electric cars and sells the electricity. In time, he predicts, movies will be free with expensive popcorn, and other companies will follow the Google model, where advertisers pay to get noticed but the products themselves are virtually free.
On the web, distribution is almost free, and it reaches over 1.2 billion consumers who conduct commerce online. Anderson argues that technology has made many products incredibly cheap to produce. And once they are cheap, they can be given away for free. Providing a free product isn't just offering it at a lower price than a competitor; it's employing a completely different business model.
"From the consumer's perspective," Anderson wrote, "there is a huge difference between cheap and free. Give a product away and it can go viral. Charge a single cent for it and you're in an entirely different business, one of clawing and scratching for every customer."The truth is that zero is one market and any other price is another. In many cases, that's the difference between a great market and none at all."
Irrespective of whether Chris Anderson's predictions are accurate, one cannot refute the realities of increasingly low-cost digital distribution models. Many examples of "free" emerging as a full-fledged economy include Radiohead, Trent Reznor of Nine Inch Nails, and a swarm of MySpace bands that grasped the audience-building merits of zero.
The fastest-growing parts of the gaming industry are ad-supported casual games online and free-to-try massively multiplayer online games. Virtually everything Google does is free to consumers, from Gmail to Picasa to GOOG-411. Even the open source movement, and companies like Red Hat, helped innovation occur by bringing legitimacy to free.
The rise of "freeconomics" is being driven by the underlying technologies that power the Web. Just as Moore's law dictates that a unit of processing power halves in price every 18 months, the price of bandwidth and storage is dropping even faster. Which is to say, the trend lines that determine the cost of doing business online all point the same way: to zero.
Sunday, May 18, 2008
The Science of Complexity Theory and Implications to Management & Innovation
Blog Summary
This entry pays tribute to Jonathan Rosenhead for his research on complexity theory and management practice. The blog below presents extracts of Jonathan's perspectives from his book, with additional points of view on the implications for innovation and management leadership. Key takeaways are: 1.) Leadership needs to ensure it encourages agile and adaptive management practices 2.) Ambiguity, uncertainty and conflict are healthy dynamics and should be encouraged rather than constrained 3.) Experimentation and iterative or perpetual betas yield deeper insights than detailed contingency planning and analysis.
Perhaps the most important point, based on Helix's research (which parallels Innosight's), is that disruptive innovation resides in complexity and less predictable dynamics than incremental innovation does. Both approaches need to be stressed as valuable in developing effective organizational systems. A healthy tension between stable and less predictable system dynamics creates healthier organizational systems.
Rational Analysis for a Problematic World: Problem Structuring Methods for Complexity, Uncertainty and Conflict, 2nd Edition (Paperback), Jonathan Rosenhead (Editor), John Mingers (Editor).
Jonathan can be reached at J.Rosenhead@lse.ac.uk. Dr. Gordon can be reached at cindy@helixcommerce.com.
Making Sense of Complexity Theory
One of the areas of science most critical to innovation capacity development is complexity theory. Complexity theory is concerned with the behaviour over time of certain kinds of complex systems.
Over the last 30 years attention to complexity has been driven by diverse research fields such as astronomy, chemistry, evolutionary biology, geology, meteorology, and physics. Although there is no unified complexity theory, there is broad agreement on the systems of interest: under certain conditions they perform in regular, predictable ways; under other conditions they exhibit behaviour in which regularity and predictability are lost.
Complexity lies in dynamic systems which are capable of changing over time – and the concern is with the predictability of their behaviour. Some systems, though they are constantly changing, do so in a completely regular manner. For definiteness, think of the solar system, or a clock pendulum. Other systems lack this stability: for example, the universe (if we are to believe the ‘big bang’ theory), or a bicyclist on an icy road. Unstable systems move further and further away from their starting conditions until/unless brought up short by some over-riding constraint – in the case of the bicyclist, impact with the road surface.
Stable and unstable behaviour as concepts are part of the traditional repertoire of physical science. What is novel is the concept of something in between – chaotic behaviour. For chaos here is used in a subtly different sense from its common language usage as ‘a state of utter confusion and disorder’. It refers to systems which display behaviour which, though it has certain regularities, defies prediction. Think of the weather as we have known it. Despite immense efforts, success in predicting the weather has been limited, and forecasts get worse the further ahead they are pitched. And this is despite vast data banks available on previous experience. Every weather pattern, every cold front is different from all its predecessors. And yet…the Nile doesn’t freeze, and London is not subject to the monsoon.
Systems behaviour, then, may be divided into two zones, plus the boundary between them. There is the stable zone, where if it is disturbed the system returns to its initial state; and there is the zone of instability, where a small disturbance leads to movement away from the starting point, which in turn generates further divergence. Which type of behaviour is exhibited depends on the conditions which hold: the laws governing behaviour, the relative strengths of positive and negative feedback mechanisms. Under appropriate conditions, systems may operate at the boundary between these zones, sometimes called a phase transition, or the ‘edge of chaos’. It is here that they exhibit the sort of bounded instability which we have been describing – unpredictability of specific behaviour within a predictable general structure of behaviour.
Before the emergence of complexity theory, the unpredictability of such systems was attributed to randomness – a notion that bundles up all unexplained variation and treats it as best captured by probabilities. What actually happens on any given occasion is understood as the result of random choice among possible outcomes, but in proportion to their probabilities. Thus probability becomes a catch-all for what cannot be explained in terms of cause leading to effect; paradoxically the implication is that variation about predicted values results from as yet unexplained causal factors, and that as we understand more about what is going on the residual random element will be progressively reduced.
Underlying the explosion of interest in chaos is the discovery that apparently random results can be produced without the need for any probabilistic element at all. That is, we can take some quite simple equations, compute the values of some variables of interest repetitively using the outputs of any stage of the calculation as the input to the next, and get results which skip around as the calculation proceeds. (Put in more technical terms, these are non-linear dynamic systems incorporating both positive and negative feedback loops.) More significantly, if we repeat the calculation a second time from a starting point only infinitesimally different from the first, after a time the computed values diverge and follow a quite different path. This is the mathematical equivalent of the ‘butterfly effect’. The small difference in starting conditions is analogous to an additional movement of the butterfly’s wings; the quite different trajectories which result correspond to distinct weather sequences bearing little or no resemblance to each other.
That last statement is not quite right. For though the different streams of values outputted by the mathematical calculations, like the different weather sequences, are highly irregular, they are not formless. The indeterminate meanderings of these systems, plotted over time, show that there is a pattern to the movements. Though they are infinitely variable, the variation stays within a pattern, a family of trajectories. Such a pattern of trajectories (and a whole range of different ones have been identified by trying out interesting ideas in the branch of mathematics called topology) is called a strange attractor. They are called ‘strange’ to distinguish them from stable attractors, states to which the system reliably returns if disturbed. A strange attractor has the property of being fractal or self-similar – that is, its pattern repeats itself at whatever scale it is examined. Indeed one can say that chaos and fractals are mathematical cousins, with chaos emphasising the dynamics of irregularity, and fractals picking out its geometry (Stewart 1989).
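The behaviour described above can be illustrated numerically with the logistic map, a standard textbook non-linear feedback equation (my choice for illustration; the text itself names no particular equation). Two starting points differing by one part in a million soon follow completely different, yet still bounded, trajectories.

```python
# Logistic map x_{n+1} = r * x_n * (1 - x_n): a simple deterministic equation
# whose iterates "skip around" and show sensitive dependence on initial conditions.
def trajectory(x0, r=4.0, steps=30):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two starting values differing by one part in a million diverge completely,
# yet both stay bounded between 0 and 1 (the "pattern within variation").
a = trajectory(0.200000)
b = trajectory(0.200001)
for n in (0, 10, 20, 30):
    print(f"step {n:2d}: {a[n]:.6f} vs {b[n]:.6f}")
```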
All of the above are, of course, ‘only’ abstract mathematical results – demonstrating at best that certain kinds of unstable behaviour are theoretically possible. However mathematicians are likely to assert that "anything that shows up as naturally as this in the mathematics has to be all over the place" (Stewart 1989, p.125). And the literature on chaos can cite examples that appear to validate this claim. One example is the wobbly orbit of Hyperion, one of Saturn’s moons. Another is the propagation of turbulence in fluids. In the field of chemistry, Prigogine and colleagues won the Nobel prize for work showing that under appropriate conditions chemical systems pass through randomness to evolve into higher level self-organised dissipative structures – so called because they dissipate unless energy is fed in from outside to maintain them. It has been used as the basis for an approach offering an alternative (or at least a complement) to Darwinian natural selection as an explanation of the ordered complexity of living organisms. And so on.
Complexity itself is centre-stage, rather than an emergent property of research in particular disciplines, at the Santa Fe Institute. At SFI, set up in 1984 as an independent research centre, scientists (some of them eminent) from a range of disciplines – physics, biology, psychology, mathematics, immunology and more - have engaged with computing expertise to conduct interdisciplinary work on the behaviour of complex adaptive systems. They have built models which can be interpreted as representing biological, ecological and economic phenomena.
All of this is undoubtedly an exciting journey of intellectual discovery, which already boasts very significant achievements. The great nineteenth century mathematician Poincare has been claimed as a founding figure (because he ‘almost’ discovered complexity), but in effect the achievements are all those of the last four decades. Some distinguished authors even believe that this work already represents a watershed for natural science, ending three centuries (since Newton) of determinism (see Prigogine 1989).
But why should it be of interest to innovation leadership and how management lead? The next section considers the arguments which have been advanced for the extension of these ideas to the role of management.
Management and Complexity Theory
The work of Ralph Stacey is probably the most widely read and influential of the management complexity authors.
Justifications of the significance of complexity theory for management generally start from a description of the pre-existing state of management theory and practice, which stresses its unacknowledged but self-imposed limitations, its tunnel-vision. (These limitations are, subsequently, shown to be transcended by complexity-based thinking.)
Given that the key finding claimed for complexity theory is the effective unknowability of the future, the common assumption among managers that part of their job is to decide where the organisation is going, and to take decisions designed to get it there is seen as a dangerous delusion.
Management, afflicted by increasing complexity and information overload, can react by becoming quite intolerant of ambiguity. Factors, targets, organizational structures all need to be nailed down. Uncertainty is ignored or denied. The management task is seen to be the enunciation of mission, the determination of strategy, and the elimination of deviation. Stability is sought with highly reliable reporting systems and controls.
What management does about this is reflected in some standard operating norms. There should be a Chief Executive Officer presiding over a cohesive management team with a vision or strategic intent supported by a common culture. The organisation should stick to its core business and competencies, build on its strengths, adapt to the market environment, and keep its eyes focused on the bottom line. Despite the critical hammering taken by 1970’s-style long-term planning, strategic management will nevertheless incorporate the tasks of goal formation; environmental analysis; strategy formulation, evaluation and implementation; and strategic control.
All these approaches are completely wrong from the perspective of management writers influenced by complexity theory.
This kind of management theory and practice bears the hallmarks of the over-rationalist thinking which has dominated since the triumphs of Newton and Descartes.
The organisation, like the universe, is conceptualised as a giant piece of clockwork machinery. The latter was thought to be, in principle, entirely predictable; and good management should be able to get similarly reliable performance from the former. Discoveries by the theorists of complexity and chaos show that even the natural world does not operate this way – and this revelation of the role of creative disorder in the universe needs to be taken to heart by managers. The consequences, as Stacey (1993) comprehensively summarises, are to turn much management orthodoxy on its head:
1.) analysis loses its primacy
2.) contingency (cause and effect) loses its meaning
3.) long-term planning becomes impossible
4.) visions become illusions
5.) consensus and strong cultures become dangerous
6.) statistical relationships become dubious.
What are the implications or lessons learned for management?
What lessons, it is claimed, does complexity theory teach managers? These can be divided, loosely, into two categories: general suggestions as to how managers should approach their jobs, and more detailed prescriptions for particular tasks. The latter, of course, derive at least in principle from the former.
General lessons
The general lessons concern how learning can be fostered in organisations, how they should view instability, and the (negative) consequences of a common internal culture. The need for an emphasis on learning stems from the central finding of this theory – that the future is in principle unknowable for systems of any complexity. If we accept that we can have no idea of the future environment, then long-term planning becomes an irrelevance, if not a hindrance. This absence of any reliable long-term chart makes learning crucially important, and this must be what has been named ‘double-loop learning’. That is, it is not enough for managers to adjust their behaviour in response to feedback on the success of their actions relative to pre-established targets; they also need to reflect on the appropriateness, in the light of unfolding events, of the assumptions (the mental model) used to set up those actions and targets.
This sort of learning cannot easily take place within an organization which puts a premium on maintaining a common culture. The dynamics of ‘group think’ and the possible effects of divergence on promotion or even survival within the organisation are potent pressures for conformity. This is not an atmosphere in which searching re-examination of cherished assumptions can thrive – rather the reverse. Yet agility of thought based on the fostering of diversity is a prerequisite for the organisation’s longer-term success.
For an organisation to seek stable equilibrium relationships with an environment which is inherently unpredictable is bound to lead to failure. The organisation will build on its strengths, fine-tune its adjustments – and succumb to more innovative rivals. Successful strategies, especially in the longer-term, do not result from fixing an organisational intention and mobilising around it; they emerge from complex and continuing interactions between people. Even the dominant 1980’s approach to strategy, which distanced itself so emphatically from the strategic planning paradigm of preceding decades, at base maintained the aim of strategic management as the realisation of prior intent. Management complexity theorists emphasise, rather, the importance of openness to accident, coincidence, serendipity. Strategy is the emerging resultant.
Rather than trying to consolidate stable equilibrium, the organisation should aim to position itself in a region of bounded instability, to seek the edge of chaos. The organisation should welcome disorder as a partner, use instability positively. In this way new possible futures for the organisation will emerge, arising out of the (controlled) ferment of ideas which it should try to provoke. Instead of a perfectly planned corporate death, the released creativity leads to an organisation which continuously re-invents itself. Members of an organisation in equilibrium with its environment are locked into stable work patterns and attitudes; far from equilibrium, behaviour can be changed more easily.
Yet although they challenge sharply certain management orthodoxies ripe for reassessment, these are all in effect ‘motherhood’ statements. Their motherhoodicity lies in their generality and non-specificity, their sense of being unchallengeable within the offered framework of ideas. If you accept the relevance of complexity theory to the managerial condition, then you must also accept the package of systemic categorical imperatives which are embedded in it. As Wheatley says, the first step required is an act of faith.
Specific lessons
There are thus two reasons for paying close attention to the actionable proposals – for managerial structure, strategy etc. – which it is claimed are deduced from complexity theory. The first is that managers are thereby provided with a subjective reality test. Managers who can say, after due reflection, that these concrete proposals are plausibly beneficial will feel less inclined to be sceptical, more inclined to accept the general stance as well as the specific recommendations; and vice versa. The second, of course, is that it is through the specifics that change in management practice will be effected.
Stacey (1993) makes a key distinction between ordinary and extraordinary management. Ordinary management is required in order to carry out day-to-day problem solving to achieve the organisation’s established objectives. It employs a logical analytic process involving data analysis, goal setting, evaluating options against goals, rational choice, implementation through the hierarchy, monitoring. This is planning and management based on a shared ideological consensus, with control at its centre. Competent ordinary management is necessary if the organisation is to deliver cost-effective performance.
Extraordinary management, by contrast, is what is required if the organisation is to be able to transform itself in situations of open-ended change. Here rationalistic forms of decision making are largely inoperative, since these require as their starting point precisely those ‘givens’ which must now be disputed.
The problems and practices of ordinary management have been repeatedly addressed in management texts.
What is innovative is the concept of extraordinary management.
Extraordinary management requires the activation of the tacit knowledge and creativity available within the organisation. This necessitates the encouragement of informal structures – for example, workshops round particular issues or processes, with membership drawn from different business units, functions, and levels. Formation of these groups should be essentially spontaneous, provoked by paradoxes, anomalies and conflicts thrown up in the process of normal management. They need to be self-organising, capable of redefining or extending their remit rather than being bound by fixed terms of reference. Under these conditions group learning can occur, and its results inputted as arguments to the broader management process. In the necessary absence of hard evidence, arguments in favour of new assumptions and directions will be analogical and intuitive, and the process of decision making will be political as champions attempt to persuade others to their point of view.
Stacey does not propose that ordinary management should drive out extraordinary management. His case is rather that both are needed in viable organisations, and they must be enabled to coexist. There is, however, an intrinsic tension between the two modes. If the boundaries limiting the scope of extraordinary management’s informal networks are drawn too tight, it will wither; too loose, and the organisation will descend into anarchy, failing to deliver on its core short-term tasks.
It follows that one key task of senior management is to manage these boundaries. It needs also, eg by mid-career recruitment or job rotation, to ensure that there is not a single homogeneous organisational culture. It should take steps to promote an active internal politics that is both open and broadly democratic in style. Senior management should not espouse a unique vision or long-term plan, but should rather promote the conditions for the emergence of an evolving agenda of strategic issues, and aspirations. It should intervene only selectively, and then at sensitive points; to do so effectively it needs to have an understanding of the qualitative patterns of behaviour which such intervention could produce, without wishing to control it to a preconceived path or believing that it could.
In effect management needs to combine permissive style with abrasive challenge. If necessary it should provoke conflict through ambiguity, deliberately steer away from equilibrium, intentionally escalate small changes, amplify rather than damp down the effects of chance events.
The role for analysis in extraordinary management is extraordinarily limited. While the whole purpose is to ensure long-term survival, there is no long-term plan and precious little long-term planning. The strategic role of senior management is largely to facilitate processes of dialogue which can lead to innovation, rather than to preside as final arbiters over an elaborate analytic process. Stacey in particular relegates rather cursorily even those tools which might be thought to be consistent with extraordinary management (eg simulation, scenario analysis) to a marginal role, if any.
This downplaying of analysis is asserted rather than argued. Nevertheless it clearly stems from the very firm distinction which Stacey draws between rationality and creativity. For him rationality is fine, and necessary, for handling routine business, but is just not up to the job of sense-making in poorly structured situations. There is, however, evidence to support the common-sense observation that in practice rationality and creativity are not kept in such tidy compartments. Shallice (1996) has demonstrated that dealing with novel situations involves complex cognitive processes which have many rational elements. More piquantly, it was Poincaré himself, so often cited by writers on complexity as the unwitting father of their subject, who has left us the most insightful personal story of (mathematical) creativity from the point of view of the creator. In this classic account (Poincaré 1946) he makes absolutely clear the crucial role of very extensive, even exhaustive, analytic endeavour in preparing the ground for his breakthrough!
These lessons constitute an ambitious agenda for change. Explicit claims are made to have identified managerial imperatives for the survival of organisations. Implicit at least is a claim to have a more adequate characterisation of the threats and opportunities confronting organisations and their managements than has previously been articulated. It is therefore appropriate to evaluate the achievement of the management complexity writers on two fronts: in what respects is the account novel? and what is the evidence for the claims? These two approaches will be taken in this and in the following section.
The general question of novelty has two components. Are there important features of their analysis which have been previously identified and prioritised, but from other management research perspectives or traditions? And have disciplines peripheral to or outside management (taken narrowly) generated other proposals for handling these issues which the complexity writers ignore or unreasonably dismiss?
On the first of these Stacey repeatedly cites other management researchers who, despite working in very different conceptual frameworks, have reached comparable conclusions. These convergences include
* that organisations do not only adapt to their environments, but help to create them
* that organisational success can come from contradiction as well as consistency
* that success may stem from being part of a self-reinforcing cycle, rather than from an explicit ‘vision’
* that revolutionary as well as incremental changes may lie on the route to organisational success.
These outputs of more conventional schools of managerial thought can serve the function of making the complexity writers’ prescriptions appear less threatening. In assessing the novelty of the complexity perspective we need also to take account of the claims for priority in certain areas which other approaches could lodge.
The prior work which is most directly relevant comes, predominantly, with a ‘systems’ pedigree. Thus cybernetics for more than 50 years has been attempting to develop understandings of the behaviour of self-organising systems; while since the 1960’s system dynamics has been exploring how complex systems incorporating feedback can generate counter-intuitive consequences.
Writers like Vickers (1965) and Schon (1973) took related systems concepts into broader social discourse a generation and more ago. Etzioni (1971) is another thinker who anticipates some of the complexity-based insights. His mixed scanning approach to planning is an attempt to encompass the paradox that organisations need both control and innovation.
Stacey (1996) holds that systems contributions pre-dating the complexity revolution are irredeemably trapped. They operate within the stable equilibrium paradigm, and are wedded to notions of efficiency, effectiveness and control. They are thus outdated by the new findings. (He makes a limited exception for system dynamics, and cites Senge’s (1990) system archetypes – a small number of qualitative generic structures of feedback interaction - as examples of the sort of broad patterns of behaviour which management needs to take into account.)
Such literature as does exist, in a number of other disciplines, on disorderly processes in organisations, emergent strategies, dialectical evolution etc. lacks a coherent theoretical framework. The ‘science of complexity’, he says, adds value to such compatible work as can be found in the literature by supplying that coherence.
The managerially relevant claims for complexity theory are thus large. In their demotion of pre-existing systems insights, and in a number of other respects, they rest explicitly on the authority of science. We should look next at the solidity of this foundation.
What is the evidence?
It hardly needs saying that there is no formally validated evidence demonstrating that the complexity theory-based prescriptions for management style, structure and process do produce the results claimed for them. These results are generally to do with long-term survival, a phenomenon not susceptible to study using short-term experimental methods. Such evidence as is adduced is almost exclusively anecdotal in character. The stories range from improving tales of successful corporate improvisation, to longer accounts of organisational death wishes or of innovation which bypasses the obstruction of the formal hierarchy; there are also approving quotations from business leaders.
The problem with anecdotal evidence is that it is most persuasive to those who experienced the events in question, and to those who are already persuaded. For others it can be hard to judge the representativeness of the sample of exhibits. This is especially so if, even unintentionally, different standards of proof or disproof are used for different sides of an argument. Such distortions do occur in Stacey (1992). Thus the advantage of opportunistic policies is supported by presenting examples of success, while the perils of formal planning methods are driven home by examples of failure. Yet obviously opportunism has its failures, and analytic techniques even have their modest achievements – which are not cited.
In the absence of a conclusive case based on evidence of organisational success, it is not surprising that great weight is placed on the authority of science. Wheatley (1992) has it in her title – "Leadership and the New Science". Merry (1995) relegates it to his sub-title, but in the plural: "Insights from the New Sciences of Chaos, Self-Organization and Complexity". Indeed ‘New Sciences’, always capitalised, runs through his book like the message in Blackpool rock. However all management complexity authors lean heavily on ‘science’ in their texts. These are liberally peppered with phrases like "Scientific discoveries have shown that…" or "The science of complexity shows that…". The illustrative examples provided are commonly of natural rather than social or managerial phenomena – the behaviour of molecules when the temperature of liquid rises, a laser beam, the weather…
Many of the ‘results’ cited in the complexity literature are not, however, firmly grounded on empirical observations. They are the outputs of computer simulations. Typically some simple laws of behaviour and interaction are postulated, and the computer is used to see how the operations of these laws would translate into long-term development or macro-behaviour. For example Kauffman (1993, 1995) models how an organism might evolve through an ‘adaptive walk’ of mutations across available alternatives, depending on the degree of cross-coupling of the organism’s component parts. Krugman (1996) shows how aggregate patterns of land use (eg the formation of multiple business districts, racial segregation) could result from individual responses to purely local conditions. Stacey (1996) reports a wide variety of simulations, mostly produced under the auspices of the Santa Fe Institute, in which simple rules of individual behaviour generate replications of the flocking of birds, the trail-laying of ants, the dynamics of organism-parasite systems…In each case the computer tracks the way in which such simple laws, if they were to hold, could produce patterned order.
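To give a flavour of this style of work, here is a toy, one-dimensional Schelling-style segregation model in Python; it is only an illustrative stand-in for the simulations described above, not a reproduction of Krugman's or the Santa Fe Institute's models.

```python
# A toy one-dimensional Schelling-style segregation model: simple local
# preferences produce an aggregate pattern (clusters of like agents).
import random

def run(size=60, tolerance=0.34, steps=2000, seed=1):
    random.seed(seed)
    cells = [random.choice("AB.") for _ in range(size)]          # '.' marks an empty cell

    def unhappy(i):
        agent = cells[i]
        if agent == ".":
            return False
        neighbours = [cells[(i + d) % size] for d in (-2, -1, 1, 2)
                      if cells[(i + d) % size] != "."]
        if not neighbours:
            return False
        like = sum(1 for n in neighbours if n == agent) / len(neighbours)
        return like < tolerance                                   # move if too few like neighbours

    for _ in range(steps):
        i = random.randrange(size)
        if unhappy(i):
            j = random.choice([k for k, c in enumerate(cells) if c == "."])
            cells[j], cells[i] = cells[i], "."                    # relocate to a random empty cell
    return "".join(cells)

print(run())   # runs of like agents tend to emerge from purely local preferences
```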
Evidently such demonstrations, absorbing though they may be, cannot constitute proofs that these laws are indeed the cause of the observed behaviour. Indeed Kauffman (1993) in the introduction to his 700 page volume, stresses that "this is not a finished book …Premises and conclusions stand open to criticism." Krugman (1996) adopts an informal approach, and allows himself to include "a few wild speculations". Mittleton-Kelly (1997) recognises a further need for circumspection which arises in essaying to transfer complexity theory formulations from the natural to the social domain. Behaviour in the former may be assumed to be governed by laws; in the latter, awareness of a claimed law may itself generate changed behaviour. In this crucial respect, social systems (including organisations and their managements) are fundamentally different from all other complex systems.
It can be seen from this that scientific authority is an unsafe ground for asserting that specific results from complexity theory necessarily apply to organisations, or that complexity-based lessons constitute imperatives for management practice.
Stacey makes a series of strong assertions about the range of systems which come within the remit of complexity theory. Thus (p.124) "most of nature’s systems are nonlinear feedback ones" that in many respects exhibit chaotic and therefore unpredictable behaviour. It is not quite clear how one could either substantiate or refute such a claim – but as organisations are social rather than natural systems we do not have to address this question.
However the pronouncements on organisations are in the same vein. Following an account of nonlinear feedback systems operating "far from equilibrium" (p.11) we read that "an innovative business is just such a system". This is broadened out in a passage (pp. 48-51) which appears to extend the scope of this mapping to all businesses whether innovative or not.
The argument is that "almost all interactions in organizations between individuals, or groups of individuals, take on a nonlinear feedback form"; as a result "every business" is a web of such feedback loops; and the scientific discoveries of complexity therefore apply. In his later book – Stacey (1996) – the remit is extended still further: "human systems, that is, individuals, groups, organizations, and societies, are all nonlinear feedback networks" to which the findings of complexity theory consequentially apply (p.47).
But do they?
There are some concealed mathematical booby-traps along the route which this argument has just traversed. Consider two of these. The first is that not all non-linear dynamical systems exhibit chaotic behaviour. Many are quite happy to settle down to stable equilibria, on which an infinitesimal difference in starting conditions has only an infinitesimal effect. Whether we get chaos or not depends critically not only on the form of the equation but also on the parameters defining the strength of the feedback loops. Evidence that the feedback parameters in the whole array of social systems listed above do in fact take chaos-generating values is absent.
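The first point can be illustrated with the same logistic map used earlier (again an illustrative choice, not an equation drawn from Stacey): with a weak feedback parameter the iteration settles to a stable equilibrium that shrugs off small differences, while a strong one never settles.

```python
# The same non-linear feedback equation, different parameter values:
# r = 2.8 settles to a fixed point, r = 3.2 to a two-value cycle, r = 4.0 to chaos.
def settle(r, x0=0.3, steps=200):
    x = x0
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

for r in (2.8, 3.2, 4.0):
    # Look at a few successive values after a long run-in.
    tail = [settle(r, steps=200 + k) for k in range(5)]
    print(f"r = {r}: {[round(v, 4) for v in tail]}")
```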
The second difficulty is that the mathematical work cited by management complexity writers is virtually all about deterministic chaos. That is, the unpredictability arises without the need for any randomness of input or process. The weird and wonderful results of mathematical chaos theory which have so gripped the public imagination all stem from this type of formulation. However it is surely indisputable that in the real world of social processes and organisations to be managed, probabilistic elements abound. Sadly perhaps, under what is called stochastic chaos, strange attractors fail to manifest themselves (and consequently these more realistic situations have attracted much less theoretical attention). Intuitively, we may understand this result as the buffeting of random shocks knocking the system out of the delicate entrapment of the strange attractor. Systems return to a stable equilibrium, or retreat indefinitely from it. The edge of chaos, on which so much management complexity writing is predicated, is abolished.
These severe limitations on the practical relevance of chaos/complexity have evidently not deterred the rather inclusive claims made for its sphere of operation. Indeed the strength and generality of the assertions based on complexity theory merge into a sense that its findings are non-negotiable. We find a text where the words ‘manager’, and ‘must’ or ‘need to’, go together like, as they used to say, a horse and carriage. We hear of actions which "managers must take to be successful"; and that "managers need to embrace this new frame of reference" (both in the Preface to Stacey (1992)).
We find that "human organizations…must operate in chaos if they are to be continually creative" (p.124). Some of these ‘oughts’ are less problematic than others. No author can reasonably be debarred from urging the theme of his/her book on us as a new way of thinking, an improved mental model. There can be concern, though, when the new frame of reference (supported by appeals to science) generates prescriptions for management control, management style and culture, risk taking behaviour, skill enhancement…The pervasive message is that corporate survival depends on how well management internalises the complexity perspective.
Complexity as metaphor
If it is unwarranted (and it is) to say to managers "Take these actions because science has discovered that they are necessary", does that dispose of the matter? Can managers safely put away those rather daunting books with complexity in their titles, and get on with managing by the seat of their pants (or by the recipe of some less scientific guru)? Well, not necessarily. Though one cannot prove that the claimed results (about creativity at the edge of chaos, versus controlled extinction in an equilibrium state) apply to the management of organisations, there is another way in which they can be relevant. This is the route of metaphor or analogy.
There is of course a difference between the two. A metaphor is a figure of speech in which a name or descriptive term is transferred to some object to which it is not properly applicable. Metaphor is an entirely legitimate device. It can be a way of illuminating certain phenomena in a novel way, so that routine understandings of their significance may be enriched or replaced by interpretations based on the quite different field to which they are juxtaposed. Literature would be impoverished without it. So, quite possibly, would science. For example, a good case can be made out that Darwin’s phrase "natural selection" is a highly debatable metaphor drawn from the field of plant and animal breeding (Young 1985).
However if complexity/chaos is an extended metaphor when applied to organisational behaviour and management, then clearly it loses any prescriptive force. Any influence it may have over policy will depend on the vividness of the metaphor, and on the plausibility in the real world of organisations of any inferences for practice that the metaphor seems to imply. For example, economists (or perhaps politicians) used to make frequent use of an automobile metaphor to describe macroeconomic policy – applying the brakes, a touch on the accelerator. Yet no one would have proposed that this metaphor, doubtless helpful to understanding, could justify designing economic institutions based on the big end or tyre valve.
An analogy carries rather more clout. An analogy consists of some assertions of similarity or difference between corresponding elements in two different systems, and about the sets of causal relations operating within each system (Hesse 1966). Analogies are widely used to suggest scientific hypotheses worth investigating, to the extent that ‘analogy’ and ‘model’ can be treated as virtual synonyms (Brodbeck 1968). Generally the analogy is used to connect a well-understood domain to one in which understanding is less developed. So, for example, Huygens developed his wave theory of light with ideas from the familiar view of sound as a wave phenomenon; and Fourier’s theory of heat conduction was constructed by analogy with the known laws of the flows of liquids (Nagel 1961).
In the light of this, could the management complexity writers, deprived of ultimate natural scientific authority, nevertheless claim that the results they draw from complexity theory hold for management by analogy? The requirements for this would be
(a) that the natural scientific domain of complexity theory is better understood than that of management;
(b) that there are concepts in the first domain which have been clearly put in one-to-one correspondence with similarly precise equivalents in the second;
and (c) that connections (especially causal ones) between groups of concepts in the first domain are implicitly preserved between their equivalents in the second (Brodbeck 1968).
If the answers to all these queries are positive, then the analogy may reasonably be used to articulate a theory in the new domain. Validation of this new theory can only come from empirical work in the new domain.
(a) Complexity theory proper is a rapidly developing but still young and imperfectly integrated field. It is certainly arguable whether it is sufficiently well established to serve as a reliable source of analogies for the field of management.
(b) Management complexity authors are rather unspecific about what organisational aspects are to be put in relation with what concepts drawn from complexity theory. For example, is it the organisation or its environment which is the focus of interest in which chaotic behaviour might be observed?
(c) Broadly the relationships claimed to be preserved across the two domains are those of non-linear feedback between elements within each of them. This mapping is so general and therefore so undemanding as to add only limited credibility to the analogy.
This situation could well change, as complexity theory develops further, or as management complexity writers refine their analyses. However at present the conceptual basis seems inadequate to support testable analogical insights.
As a footnote to this discussion, it is worth recalling Stacey’s key task for ‘extraordinary management’ – to understand the qualitative patterns of behaviour which their interventions could provoke.
These recognisable patterns, he says (Stacey 1993, pp231-2), are typified by Senge’s (1990) archetypes – qualitative maps which are essentially simplified causal loop diagrams from the field of system dynamics. In fact, though commonly attributed to Senge, the concept of ‘archetype’ in this context was originated by Meadows (1982). These archetypes are designed to illustrate a range of generic types of counter-intuitive behaviours of systems. Lane (1998) argues that archetypes have at most an analogical [I would say, metaphorical] relationship with their real world targets. "With weakly defined attributes there seem to be few ways to test the isomorphism and so gain confidence in the relevance of the diagram." He is driven to ask "is this science or ‘new age’ waffle?" Regardless of how one answers this question, the central place of Senge’s archetypes in Stacey’s scheme dramatises the role of approximate and suggestive thinking in the management lessons drawn from complexity theory.
Summary
This account of attempts to apply ideas from complexity theory to management practice has been broadly critical – critical of claims for the authoritative status of what would be better presented as stimulating metaphors. It is indeed curious that a message based on the importance of accepting instability, uncertainty and the limits to our knowledge should be presented with such an excess of certainty. The explanation for this paradox may lie in the twin heritage of management complexity. The ‘systems’ community world-wide has been particularly prone to sectarianism and evangelism, while the audience for management texts is conditioned to expect large generalisations supported anecdotally. It can be a heady mixture.
This conceptual imperialism is both unfortunate and unnecessary. Unfortunate because some of those exposed to these ideas may reject them on grounds of over-selling, while others (recall that complexity theory proper is far from transparent) may be persuaded to place more reliance on the ‘findings’ than is merited. Unnecessary, because management complexity has indeed generated metaphorically based insights which are novel and instructive.
There are a number of complexity theory insights we can take away. Consider the handling of the long-term. Long-term planning has taken such a battering that the complexity-based view that it is impossible anyway can hardly classify as startling. However Stacey’s extension of this critique to cover the view of ‘strategy as vision’ is a powerful antidote to much management writing. He makes a good case (Stacey 1992, pp 126-144) that a single vision to serve as intended organisational future, motivator of behaviour and guarantor of corporate cohesion is a thoroughly bad idea. It produces a culture of dependency, restricts the expression of conflicting views, and generates shared mental models tending towards groupthink. One must hope that this debunking of ‘the vision thing’ will prove influential.
Not that the long-term is dismissed as an effective irrelevance. What is proposed is a refocusing: rather than establish a future target and work back to what we do now to achieve it, the sequence is reversed. We should concentrate on the significant issues which need to be handled in the short-term, and ensure that the debate about their long-term consequences is lively and engaged. Read in this way, the lesson of complexity theory is not to justify short-termism, but to point towards a more practicable way of taking the future into account.
For this more pragmatic balance between present concerns and future potentialities to be achieved, Stacey proposes the deliberate fostering of an active organisational politics. Only in this way can learning, the essential requirement for organisational survival, be institutionalised. This is an uncomfortable message, which many organisations will find sits uncomfortably with current structures and cultures. However it is arguably a much needed corrective to the authoritarian, hegemonic or presumptively consensual regimes which are our management heritage.
There are, of course, insights and insights. Some of the implications of Stacey’s complexity-based view are harder to embrace, even in principle. I will highlight just one of these here – the notion that organisations ‘should’ operate at the edge of chaos. (Another, the downgrading of analysis, I will address a little later.) We are repeatedly told that "the space for creativity in an adaptive system is a phase transition at the edge of chaos, at the edge of system disintegration" (Stacey 1996, p.97); "human organizations …must operate in chaos if they are to be continually creative" (Stacey 1992, p.124); or again, "the constant creativity and innovation necessary for success can occur only in this state" – namely bounded instability, or chaos (Stacey 1992, p.186).
To stay far from equilibrium, the organisation has to operate as a ‘dissipative structure’, maintainable only by "continual inputs of energy, attention and information" (Stacey 1993, p.231). For this reason these states occur in nature only fleetingly – they break down in disorder or relapse into equilibrium. If the analogy to organisations can be sustained (on which I have said enough above), it can hardly support an imperative for organisations to operate ‘constantly’ and ‘continually’ in this mode. It might instead suggest a more intermittent invocation of a fluid and responsive stance, during which radically new directions would be entertained. (This is, in fact, the view taken by McMaster, 1995.) In between, the organisation would operate in a relatively stable manner.
Careful thought about procedures would be needed to stop this mixed mode degenerating into crisis management, pure and simple. But the alternative proposed by Stacey is that our organisations should operate permanently poised on the edge between order and disintegration. This is in effect to propose an unsecured gamble with institutions which are dominant repositories of our physical and intellectual capital, and with the psychological health of those who would operate them under continuously stressful and under-structured conditions.
The discussion above is intended as one example of how we might make constructive use of management complexity. We do not need to swallow it whole, or reject it root and branch. Indeed Stacey (despite his tendencies towards the prescriptive) at intervals stresses that there are no recipes, that this is a way of thinking, that these are mental models. Wheatley (1992, p.150) urges the use of management complexity ideas to make sense of our own experience. So long as we give that experience powers of veto, management complexity can serve as a constructive provocation.
The evolution of management complexity
Management complexity theory redescribes the organisational world. Why, in this way, now?
One explanation might be that this new perspective is driven by substantive changes in that world: organisations getting bigger, more information arriving sooner, environments changing faster. Undoubtedly there is something to this argument; ideas do need to be timely. The ‘learning organisation’, for example, would not have been a helpful concept to the pioneers of the first industrial revolution, lacking as they did any cadre of professional managers. Nor would they have appreciated advice to aim for the edge of chaos, obsessed as they were with establishing greater control. Nevertheless this argument, while excluding past eras, still leaves a good deal of leeway for the timing of this conceptual revolution.
Timeliness also plays a part more indirectly. The development of the various elements of complexity theory has depended heavily on computing power; as the cost of computing has fallen, so the feasibility of elaborate simulations of dynamic behaviour has grown. This has meant that the natural science results from which management complexity theory might borrow have in effect been accumulating only since the 1960s.
Nevertheless it is striking that general as opposed to academic interest in complexity theory mushroomed from the end of the 1980s and as yet shows no signs of abating. This is as true for best-selling works of popular science as it is for the management complexity developments we have been considering. Ideas which get taken up need to be in keeping with the spirit of the age. It is not, therefore, necessary to see the congruence (excellent, as we will see below) between complexity theory and the broader current of events and ideas in society as just a remarkable coincidence.
These effects are sometimes easier to see and accept in historical perspective than in relation to our own times. So, to provide some extra credibility before applying the same approach to complexity theory, let me make a short detour back to Darwin. (The following account is based on Young, 1985.) His "On the Origin of Species" was one of the two most significant publications of the nineteenth century. (The other was Marx’s Das Kapital.) It precipitated a revolution in thought, transforming how future generations would see the place of humans in the universe.
The full title of Darwin’s great work was "On the Origin of Species by Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life". It was published in 1859 after a gestation period of more than 20 years. I have already referred to the metaphorical role of his phrase ‘natural selection’. Here I wish to draw attention to the mechanism which he proposed for selection, namely the ‘struggle for life’. Darwin’s notes, sketches and correspondence make clear the formative role in this respect of his reading of Malthus in 1838. Malthus’s theories were concerned not with biological diversity, but with social and economic conditions at the epicentre of the convulsion of the industrial revolution in Britain. Agricultural enclosures were dispossessing the rural population, who flooded to the new urban industrial centres. Against this background he published in 1798 his influential "An Essay on the Principle of Population", in which he argued that, since population tends to increase geometrically, but food supplies can grow only linearly, increasing population would always tend to outstrip the means of subsistence. The widening gap would be handled by poverty, famine and war. So Darwin’s adoption of the survival of the fittest as a key element of his theory was actually based on a philosophy which derived from and was particularly relevant to its era – that of the most brutal and self-confident phase of industrial capitalism.
The reason for this detour is that we can find comparable resonances linking complexity theory, larger currents of thought, and the social and economic circumstances of our own era. The past 20 years have not only seen the mushrooming of interest in complexity theory. They have also been years in which a hegemonic common wisdom has developed, in favour of the market rather than planning as the way to order our affairs; in which the demise of the Soviet Union has consolidated this victory in geo-political terms; in which mass production of standard products has been replaced by a Post-Fordist fluidity in promoting and responding to changes in taste and fashion; in which influential thinkers have mounted an assault on the pre-eminence of reason in social decision-making (see Rosenhead 1992); and in which postmodernism has made good progress in sweeping such notions as ‘progress’ off the agenda of respectable discourse.
All of this makes a theory which appears to argue that the future is in principle unknowable and therefore unplannable an extremely welcome menu item at the intellectual dinner table. It appears to add the authority of science to pronouncements that "there is no alternative" to the market. At present these ideas seem well on the way to incorporation into the set of ideas in good currency for making sense of the social world.
There is a precedent for this process in the response to Darwin, which was not only scientific and theological in character; his work also provoked an influential current of social theory. Known as Social Darwinism, it pronounced that those in lowly social positions were there as a result of their lesser competence in the social struggle for survival. Conveniently (for those in higher social positions), it was ‘nature’s law’ that this was so. (The same argument demonstrated the evolved superiority of successful nations.) There was no need, indeed it would be anti-progressive, to attempt to alleviate the condition of the poor, or to restrain the free competition which led to it. (See Bannister 1979.)
As we have seen, this ‘timely’ idea had a tangled provenance, having moved from Malthus to Darwin, and then back to the Social Darwinists where it could be presented as the findings of science. The situation for complexity theory is simpler. There has been no yo-yoing back and forth between the natural scientific and the social domains, just a one way ticket. The mechanism can be seen in operation in the work of Kauffman. His monumental book (Kauffman 1993) in fact returns to the terrain of Darwin. He is concerned to show that random variation and natural selection provide only a partial explanation for the observable order in the natural world, and that there are sources of self-organising, spontaneous order in the biological phenomena which natural selection has to work with.
So far so good, you might think, but rather remote from organisations and management. Not so. In Kauffman (1995) these ideas are popularised, and applied to technological innovation and organisational design. Large organisations, he says, need to be broken down into ‘patches’ rather than run hierarchically from the top. Then, "as if by an invisible hand", they will be able by mutual interaction to discover excellent solutions to tough problems. The analogy between these organisational issues and his primary research field of biology, he suspects, may be more than analogy; rather, the biosphere and the technosphere "may be governed by the same or similar fundamental laws".
We have already addressed the question of analogy at some length. But let us pause on that "invisible hand". Krugman (1996, p.76) uses the same words to describe the spontaneous emergence of a regular spatial pattern of urban centres. Neither of them invented the phrase. It dates from the eighteenth century and is due to Adam Smith, who used it metaphorically in connection with the workings of the market. The market operates so that, although all individuals act in their own self-interest, ‘as if by an invisible hand’ the wealth of the nation will grow. Smith’s conclusion was that no interference with free competition should be countenanced.
There is indeed a striking parallelism here. Adam Smith said that self-interested mutual interaction in the market is uniquely efficient in economic terms. Complexity theorists see that spontaneous self-organisation can produce survival strategies where central planning would fail. Two hundred years after Smith, in a period when there is once again an almost mystical belief in the beneficial properties of market forces, we are told on the authority of science that non-intervention is best.
So times change, and acceptable meta-ideas change with them. It is curious that, neatly in step with the popularisation of complexity theory, a blood relative of Social Darwinism has emerged to considerable acclaim under the label of ‘evolutionary psychology’ (Barkow et al 1992; Pinker 1998). The programme of this movement is to argue that natural selection operates not only on physical characteristics, but also on the mind – dispositions, tastes, preferences and attitudes. It follows that those which we have are deeply embedded, and hence much less steerable by social engineering than was often previously assumed. Thus it too, like complexity theory, provides justification for governmental non-interference. These are truly ideas well in touch with the spirit of the age!
A role for analysis
Stacey’s explicit rejection of a significant role for analytic methods was highlighted in section 3 above. This position is not uniform among management complexity writers; mostly they simply ignore the topic. (McMaster (1995) does at least implicitly concede that scenario analysis could be useful in thinking about the future.) The general posture seems to be that managers who realise the importance of creativity in a world of complexity will have concerns for which analysis is an irrelevance.
The picture which Stacey (1992) paints of ‘step-by step analysis’ is a caricature designed to show up complexity-based thinking to maximum advantage. He portrays a mindless routine in which set models, rules and computations are applied with blinkered religious fervour. The choice, the reader is invited to believe, is between this, and a more fluid, creative and political process. Either-or, not both-and. (Yet we have seen, in section 3, that creativity and rationality are mutually supportive rather than exclusive.) Because the organisational world is one of non-linear dynamics, we are told, extraordinary managers must be intuitive, innovate, and spot emergent strategies. And all without recourse to analytic crutches.
As a corrective to analytic over-dependence, where it is still to be found, this rhetoric may have some value. But elsewhere it will serve as further incitement to an irrationalism which is already in the ascendancy. In any case, the logic by which this anti-analytical conclusion follows from Stacey’s complexity theory-based diagnosis is less than compelling. For he presents a powerful argument (Stacey 1992, pp. 101-102) for the inadequacy of informal mental models – the same case incidentally that operational research advanced thirty or so years earlier – only to leave managers with no tools to supplement them. Changed structures, processes and culture can make a major contribution, to be sure. But in the end, ways forward need to emerge from hard thinking by individuals or groups about the complexity and uncertainty of their situation. Dealing with interconnected systems and environmental turbulence as they do, they need all the help they can get.
Consider some of the difficulties confronting Stacey’s ‘extraordinary managers’ for which appropriate analytic assistance is in principle available – operating in an uncertain world, learning to learn, taking decisions politically. On the handling of uncertainty, Stacey thinks that this can only be done by predicting the future, which complexity theory claims is impossible in any meaningful sense. Yet scenario planning is precisely predicated on this proposition. It assumes "that there is irreducible uncertainty and ambiguity in any situation faced by the strategist, and that successful strategy can only be developed in full view of this" (van der Heijden 1996, p.8). The different scenarios represent alternative possible futures for the organisation’s context; they are developed as a stimulus to constructive debate on strategy. Indeed Sunter (1992), whose work was influential in South Africa's peaceful transition to democracy, justifies scenario planning in terms of uprooting the complacency of (temporarily) successful organisations, terms which mirror Stacey's concern about perfectly planned corporate death.
We are not finished with analytic assistance related to uncertainty. According to Stacey, uncertainty requires that strategic thinking should be anchored to the here and now, not the future. That is essentially the perspective of robustness analysis (Rosenhead, 1989b). This methodology adopts a bifocal approach in which thinking about possible futures informs current choices, not through the adoption of a ‘plan’, but via the (measurable) flexibility which current choices will, or will not, preserve.
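To make ‘measurable flexibility’ a little more concrete, here is a minimal sketch in Python. It is my own illustration in the spirit of robustness analysis rather than a reproduction of Rosenhead’s procedure: the commitments, the end-configurations and the set judged acceptable are all invented for the example, and the score is simply the fraction of acceptable end-configurations a commitment keeps open.

```python
# A minimal, illustrative sketch: the end-configurations, the set judged
# acceptable, and what each initial commitment keeps attainable are all
# invented for the example (they are not taken from the text).

acceptable = {"A", "B", "D"}               # end-configurations judged acceptable
reachable_after = {                        # configurations still attainable after each commitment
    "commitment_1": {"A", "B", "C"},
    "commitment_2": {"B"},
    "commitment_3": {"A", "B", "D", "E"},
}

def robustness(kept_open, acceptable_set):
    """Fraction of acceptable end-configurations a commitment keeps open."""
    return len(kept_open & acceptable_set) / len(acceptable_set)

for commitment, kept_open in reachable_after.items():
    print(commitment, round(robustness(kept_open, acceptable), 2))
# commitment_3 keeps all three acceptable futures open and so scores 1.0;
# commitment_2 forecloses most of them and scores lowest.
```

The point of such a score is not prediction but comparison: it lets current choices be ranked by how much acceptable future room for manoeuvre they preserve.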
Moving on, we find learning, carried out in groups, at the centre of Stacey’s version of strategic thinking. There are methods which can make these processes more effective, and there is no requirement of principle or common-sense why they should not be employed. Facilitated groups are likely to be more productive (Phillips and Phillips 1993), and a considerable variety of group decision support systems (GDSS) now exist which enhance rather than substitute for the group process (Huxham 1996).
Finally there is the question of reaching agreement on what to do. For Stacey this is one of the functions of the political process within the organisation. Crudely, the process is a heaving mass of power plays, rhetoric, self-interest, deals and ego-involvement all carried out in smoke-filled rooms – and of which he thoroughly approves as a mechanism far superior to central control for generating the decisions from which strategy emerges. In effect he treats politics as essentially a no-go area for rational thought. Yet all the participants do have their own separate rationalities. Not only that, there is usually the implicit acceptance of a common and overarching constraint that, despite their divergent perceptions and priorities, some form of operable outcome is required. This common ground will often provide scope for the use of one or more of the family of problem structuring methods (Rosenhead 1989a, 1996). These provide structured ways of avoiding mutual misperception, identifying a problem focus, opening up space for negotiation and getting buy-in to commitments.
Aids to rational discourse on organisational issues fit awkwardly within the world-view promoted by management complexity authors. Indeed they provide rational arguments against rationality, as well as forecasting with great confidence the impossibility of forecasting, and planning for the absence of planning. If we resist their invitation to elevate one view of management and organisation into the view, there are many thought-provoking and practical insights along the way.
Bibliography
R C Bannister (1979) Social Darwinism: science and myth, University of Pennsylvania Press, Philadelphia.
J H Barkow, L Cosmides and J Tooby (eds.) (1992) The Adapted Mind: evolutionary psychology and the generation of culture, Oxford University Press, New York.
M Brodbeck (1968) ‘Models, meaning, and theories’. In M Brodbeck (ed.) Readings in the Philosophy of the Social Sciences, Macmillan, New York (pp 579-600).
A Etzioni (1971) The Active Society: a theory of societal and political processes, Free Press, New York.
K van der Heijden (1996) Scenarios: the art of strategic conversation, Wiley, Chichester.
M B Hesse (1966) Models and Analogies in Science, University of Notre Dame Press, Notre Dame, Indiana.
C Huxham (1996) ‘Group decision support for collaboration’. In C Huxham (ed.) Collaborative Advantage, Sage, London (pp 141-151).
S A Kauffman (1993) The Origins of Order: self-organization and selection in evolution, Oxford University Press, New York.
S A Kauffman (1995) ‘Escaping the Red Queen effect’, The McKinsey Quarterly 1995 No. 1 (pp 119-129).
P Krugman (1996) The Self-Organizing Economy, Blackwell, Cambridge Mass.
D C Lane (1998) ‘Can we have confidence in generic structures?’, J. Opl Res. Soc. 49 (pp 936-947).
M D McMaster (1995) The Intelligence Advantage: organising for complexity, Knowledge Based Development, Douglas IOM.
D H Meadows (1982) ‘Whole earth models and systems’, The Coevolution Quarterly, Summer (pp 98-108).
U Merry (1995) Coping With Uncertainty: insights from the new sciences of chaos, self-organization and complexity, Praeger, Westport, Conn.
E Mitleton-Kelly (1997) ‘Organisations as co-evolving complex adaptive systems’. BPRC Paper No.5, Business Process Resource Centre, University of Warwick, Coventry.
G Morgan (1986) Images of Organization, Sage, Beverly Hills, CA.
E Nagel (1961) The Structure of Science: problems in the logic of scientific explanation, Routledge and Kegan Paul, London.
L Phillips and M Phillips (1993) ‘Facilitated work groups: theory and practice’, J. Opl Res. Soc. 44 (pp 533-549).
S Pinker (1998) How the Mind Works, Allen Lane: Penguin Press, London.
H Poincaré (1946) The Foundations of Science, The Science Press, Lancaster, Pa.
I Prigogine (1989) ‘The philosophy of instability’, Futures, August 1989 (pp 396-400).
J Rosenhead (ed.)(1989a) Rational Analysis for a Problematic World: problem structuring methods for complexity, uncertainty and conflict, Wiley, Chichester.
J Rosenhead (1989b) ‘Robustness analysis: keeping your options open’. In J.Rosenhead (1989a) (pp 193-218).
J Rosenhead (1992) ‘Into the swamp: the analysis of social issues’, J. Opl Res. Soc. 43 (pp 293-305).
J Rosenhead (1996) ‘What’s the problem? An introduction to problem structuring methods’, Interfaces 26 (pp 117-131).
D A Schon (1973) Beyond the Stable State, Norton, NY.
P M Senge (1990) The Fifth Discipline: the art and practice of the learning organization, Doubleday, New York.
T Shallice (1996) 'The domain of supervisory processes and temporal organisation of behaviour', Phil. Trans. Roy. Soc. Lond., 351 (pp 1405-1412).
R D Stacey (1992) Managing The Unknowable: strategic boundaries between order and chaos in organizations, Jossey-Bass, San Francisco.
R D Stacey (1993) Strategic Management and Organisational Dynamics, Pitman, London.
R D Stacey (1996) Complexity and Creativity in Organizations, Berrett-Koehler, San Francisco.
I Stewart (1989) Does God Play Dice? the mathematics of chaos, Blackwell, Oxford.
C Sunter (1992) The New Century: quest for the high road, Human and Rousseau (with Tafelberg), Cape Town.
G Vickers (1965) The Art of Judgment, Chapman and Hall, London.
M J Wheatley (1992) Leadership and the New Science: learning about organization from an orderly universe, Berrett-Koehler, San Francisco.
R M Young (1985) Darwin’s Metaphor: nature’s place in Victorian culture, Cambridge University Press, Cambridge.
This entry pays tribute to Jonathan Rosenhead for his research on complexity theory and management practices. The blog entry below consists of extracts of Jonathan's book perspectives, with additional points of view based on the implications for innovation and management leadership. Key takeaways are: 1.) Leadership needs to ensure it encourages agile and adaptive management practices. 2.) Ambiguity, uncertainty and conflict are healthy dynamics and should be encouraged rather than constrained. 3.) Experimentation and iterative or perpetual betas yield deeper insights than detailed contingency planning and analysis.
Perhaps the most important point, based on Helix research that also parallels Innosight's research, is that disruptive innovation resides in more complex and less predictable dynamics than incremental innovation does. Both approaches need to be stressed as valuable in developing effective organizational systems. A healthy tension between stable and less predictable system dynamics creates healthier organizational systems.
Rational Analysis for a Problematic World: Problem Structuring Methods for Complexity, Uncertainty and Conflict, 2nd Edition (Paperback), Jonathan Rosenhead (Editor), John Mingers (Editor).
Jonathan can be reached at: J.Rosenhead@lse.ac.uk. Dr. Gordon can be reached at: cindy@helixcommerce.com.
Making Sense of Complexity Theory
One of the areas of science most critical to developing innovation capacity is complexity theory. Complexity theory is concerned with the behaviour over time of certain kinds of complex systems.
Over the last 30 years attention to complexity has been driven by research in diverse fields such as astronomy, chemistry, evolutionary biology, geology, meteorology and physics. Although there is no unified complexity theory, these fields share an interest in systems which under certain conditions perform in regular, predictable ways, while under other conditions they exhibit behaviour in which regularity and predictability are lost.
Complexity lies in dynamic systems which are capable of changing over time – and the concern is with the predictability of their behaviour. Some systems, though they are constantly changing, do so in a completely regular manner. For definiteness, think of the solar system, or a clock pendulum. Other systems lack this stability: for example, the universe (if we are to believe the ‘big bang’ theory), or a bicyclist on an icy road. Unstable systems move further and further away from their starting conditions until/unless brought up short by some over-riding constraint – in the case of the bicyclist, impact with the road surface.
Stable and unstable behaviour as concepts are part of the traditional repertoire of physical science. What is novel is the concept of something in between – chaotic behaviour. For ‘chaos’ is used here in a subtly different sense from its common language usage as ‘a state of utter confusion and disorder’. It refers to systems which display behaviour which, though it has certain regularities, defies prediction. Think of the weather as we have known it. Despite immense efforts, success in predicting the weather has been limited, and forecasts get worse the further ahead they are pitched. And this is despite vast data banks available on previous experience. Every weather pattern, every cold front is different from all its predecessors. And yet…the Nile doesn’t freeze, and London is not subject to the monsoon.
Systems behaviour, then, may be divided into two zones, plus the boundary between them. There is the stable zone, where if it is disturbed the system returns to its initial state; and there is the zone of instability, where a small disturbance leads to movement away from the starting point, which in turn generates further divergence. Which type of behaviour is exhibited depends on the conditions which hold: the laws governing behaviour, the relative strengths of positive and negative feedback mechanisms. Under appropriate conditions, systems may operate at the boundary between these zones, sometimes called a phase transition, or the ‘edge of chaos’. It is here that they exhibit the sort of bounded instability which we have been describing – unpredictability of specific behaviour within a predictable general structure of behaviour.
Before the emergence of complexity theory, the unpredictability of such systems was attributed to randomness – a notion that bundles up all unexplained variation and treats it as best captured by probabilities. What actually happens on any given occasion is understood as the result of random choice among possible outcomes, but in proportion to their probabilities. Thus probability becomes a catch-all for what cannot be explained in terms of cause leading to effect; paradoxically the implication is that variation about predicted values results from as yet unexplained causal factors, and that as we understand more about what is going on the residual random element will be progressively reduced.
Underlying the explosion of interest in chaos is the discovery that apparently random results can be produced without the need for any probabilistic element at all. That is, we can take some quite simple equations, compute the values of some variables of interest repetitively using the outputs of each stage of the calculation as the input to the next, and get results which skip around as the calculation proceeds. (Put in more technical terms, these are non-linear dynamic systems incorporating both positive and negative feedback loops.) More significantly, if we repeat the calculation a second time from a starting point only infinitesimally different from the first, after a time the computed values diverge and follow a quite different path. This is the mathematical equivalent of the ‘butterfly effect’. The small difference in starting conditions is analogous to an additional movement of the butterfly’s wings; the quite different trajectories which result correspond to distinct weather sequences bearing little or no resemblance to each other.
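The computation described here is easy to reproduce. The sketch below uses the much-studied logistic map as an illustrative non-linear equation; the parameter value and the two starting points are my assumptions, chosen because this region of the map is known to behave chaotically.

```python
# A minimal sketch of sensitivity to initial conditions, using the logistic
# map x_next = r * x * (1 - x). The value r = 3.9 and the two starting points
# are illustrative assumptions; they differ only in the tenth decimal place.

def iterate(x, r=3.9, steps=60):
    """Iterate the logistic map and return the full trajectory."""
    trajectory = []
    for _ in range(steps):
        x = r * x * (1 - x)
        trajectory.append(x)
    return trajectory

a = iterate(0.2)             # first run
b = iterate(0.2000000001)    # second run, infinitesimally different start

for n in (10, 30, 50):
    print(n, round(a[n], 4), round(b[n], 4), round(abs(a[n] - b[n]), 4))
# At first the two trajectories are indistinguishable; after a few dozen
# iterations they bear little or no resemblance to each other.
```

No randomness enters the calculation at any point; the divergence is produced entirely by the repeated feedback of each output into the next step.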
That last statement is not quite right. For though the different streams of values output by the mathematical calculations, like the different weather sequences, are highly irregular, they are not formless. The indeterminate meanderings of these systems, plotted over time, show that there is a pattern to the movements. Though they are infinitely variable, the variation stays within a pattern, a family of trajectories. Such a pattern of trajectories (and a whole range of different ones have been identified by trying out interesting ideas in the branch of mathematics called topology) is called a strange attractor. They are called ‘strange’ to distinguish them from stable attractors, states to which the system reliably returns if disturbed. A strange attractor has the property of being fractal or self-similar – that is, its pattern repeats itself at whatever scale it is examined. Indeed one can say that chaos and fractals are mathematical cousins, with chaos emphasising the dynamics of irregularity, and fractals picking out its geometry (Stewart 1989).
All of the above are, of course, ‘only’ abstract mathematical results – demonstrating at best that certain kinds of unstable behaviour are theoretically possible. However mathematicians are likely to assert that "anything that shows up as naturally as this in the mathematics has to be all over the place" (Stewart 1989, p.125). And the literature on chaos can cite examples that appear to validate this claim. One example is the wobbly orbit of Hyperion, one of Saturn’s moons. Another is the propagation of turbulence in fluids. In the field of chemistry, Prigogine and colleagues won the Nobel prize for work showing that under appropriate conditions chemical systems pass through randomness to evolve into higher level self-organised dissipative structures – so called because they dissipate unless energy is fed in from outside to maintain them. It has been used as the basis for an approach offering an alternative (or at least a complement) to Darwinian natural selection as an explanation of the ordered complexity of living organisms. And so on.
Complexity itself is centre-stage, rather than an emergent property of research in particular disciplines, at the Santa Fe Institute. At SFI, set up in 1984 as an independent research centre, scientists (some of them eminent) from a range of disciplines – physics, biology, psychology, mathematics, immunology and more - have engaged with computing expertise to conduct interdisciplinary work on the behaviour of complex adaptive systems. They have built models which can be interpreted as representing biological, ecological and economic phenomena.
All of this is undoubtedly an exciting journey of intellectual discovery, which already boasts very significant achievements. The great nineteenth century mathematician Poincaré has been claimed as a founding figure (because he ‘almost’ discovered complexity), but in effect the achievements are all those of the last four decades. Some distinguished authors even believe that this work already represents a watershed for natural science, ending three centuries (since Newton) of determinism (see Prigogine 1989).
But why should any of this be of interest to innovation leadership, and to how management leads? The next section considers the arguments which have been advanced for the extension of these ideas to the role of management.
Management and Complexity Theory
The work of Ralph Stacey is probably the most widely read and influential of the management complexity authors.
Justifications of the significance of complexity theory for management generally start from a description of the pre-existing state of management theory and practice, which stresses its unacknowledged but self-imposed limitations, its tunnel-vision. (These limitations are, subsequently, shown to be transcended by complexity-based thinking.)
Given that the key finding claimed for complexity theory is the effective unknowability of the future, the common assumption among managers – that part of their job is to decide where the organisation is going, and to take decisions designed to get it there – is seen as a dangerous delusion.
Management, afflicted by increasing complexity and information overload, can react by becoming quite intolerant of ambiguity. Factors, targets, organizational structures all need to be nailed down. Uncertainty is ignored or denied. The management task is seen to be the enunciation of mission, the determination of strategy, and the elimination of deviation. Stability is sought with highly reliable reporting systems and controls.
This view of what management does generates some standard operating norms. There should be a Chief Executive Officer presiding over a cohesive management team with a vision or strategic intent supported by a common culture. The organisation should stick to its core business and competencies, build on its strengths, adapt to the market environment, and keep its eyes focused on the bottom line. Despite the critical hammering taken by 1970’s-style long-term planning, strategic management will nevertheless incorporate the tasks of goal formation; environmental analysis; strategy formulation, evaluation and implementation; and strategic control.
All these approaches are completely wrong from the perspective of management writers influenced by complexity theory.
This kind of management theory and practice bears the hallmarks of the over-rationalist thinking which has dominated since the triumphs of Newton and Descartes.
The organisation, like the universe, is conceptualised as a giant piece of clockwork machinery. The universe was thought to be, in principle, entirely predictable; and good management should be able to get similarly reliable performance from the organisation. Discoveries by the theorists of complexity and chaos show that even the natural world does not operate this way – and this revelation of the role of creative disorder in the universe needs to be taken to heart by managers. The consequences, as Stacey (1993) comprehensively summarises, are to turn much management orthodoxy on its head:
1.) analysis loses its primacy
2.) contingency (cause and effect) loses its meaning
3.) long-term planning becomes impossible
4.) visions become illusions
5.) consensus and strong cultures become dangerous
6.) statistical relationships become dubious.
What are the implications or lessons learned for management?
What lessons, it is claimed, does complexity theory teach managers? These can be divided, loosely, into two categories: general suggestions as to how managers should approach their jobs, and more detailed prescriptions for particular tasks. The latter, of course, derive at least in principle from the former.
General lessons
The general lessons concern how learning can be fostered in organisations, how they should view instability, and the (negative) consequences of a common internal culture. The need for an emphasis on learning stems from the central finding of this theory – that the future is in principle unknowable for systems of any complexity. If we accept that we can have no idea of the future environment, then long-term planning becomes an irrelevance, if not a hindrance. This absence of any reliable long-term chart makes learning crucially important, and the learning required is what has been named ‘double-loop learning’. That is, it is not enough for managers to adjust their behaviour in response to feedback on the success of their actions relative to pre-established targets; they also need to reflect on the appropriateness, in the light of unfolding events, of the assumptions (the mental model) used to set up those actions and targets.
This sort of learning cannot easily take place within an organization which puts a premium on maintaining a common culture. The dynamics of ‘group think’ and the possible effects of divergence on promotion or even survival within the organisation are potent pressures for conformity. This is not an atmosphere in which searching re-examination of cherished assumptions can thrive – rather the reverse. Yet agility of thought based on the fostering of diversity is a prerequisite for the organisation’s longer-term success.
For an organisation to seek stable equilibrium relationships with an environment which is inherently unpredictable is bound to lead to failure. The organisation will build on its strengths, fine-tune its adjustments – and succumb to more innovative rivals. Successful strategies, especially in the longer-term, do not result from fixing an organisational intention and mobilising around it; they emerge from complex and continuing interactions between people. Even the dominant 1980’s approach to strategy, which distanced itself so emphatically from the strategic planning paradigm of preceding decades, at base maintained the aim of strategic management as the realisation of prior intent. Management complexity theorists emphasise, rather, the importance of openness to accident, coincidence, serendipity. Strategy is the emerging resultant.
Rather than trying to consolidate stable equilibrium, the organisation should aim to position itself in a region of bounded instability, to seek the edge of chaos. The organisation should welcome disorder as a partner, use instability positively. In this way new possible futures for the organisation will emerge, arising out of the (controlled) ferment of ideas which it should try to provoke. Instead of a perfectly planned corporate death, the released creativity leads to an organisation which continuously re-invents itself. Members of an organisation in equilibrium with its environment are locked into stable work patterns and attitudes; far from equilibrium, behaviour can be changed more easily.
Yet although they challenge sharply certain management orthodoxies ripe for reassessment, these are all in effect ‘motherhood’ statements. Their motherhoodicity lies in their generality and non-specificity, their sense of being unchallengeable within the offered framework of ideas. If you accept the relevance of complexity theory to the managerial condition, then you must also accept the package of systemic categorical imperatives which are embedded in it. As Wheatley says, the first step required is an act of faith.
Specific lessons
There are thus two reasons for paying close attention to the actionable proposals – for managerial structure, strategy etc. – which it is claimed are deduced from complexity theory. The first is that managers are thereby provided with a subjective reality test. Managers who can say, after due reflection, that these concrete proposals are plausibly beneficial will feel less inclined to be sceptical, more inclined to accept the general stance as well as the specific recommendations; and vice versa. The second, of course, is that it is through the specifics that change in management practice will be effected.
Stacey (1993) makes a key distinction between ordinary and extraordinary management. Ordinary management is required in order to carry out day-to-day problem solving to achieve the organisation’s established objectives. It employs a logical analytic process involving data analysis, goal setting, evaluating options against goals, rational choice, implementation through the hierarchy, monitoring. This is planning and management based on a shared ideological consensus, with control at its centre. Competent ordinary management is necessary if the organisation is to deliver cost-effective performance.
Extraordinary management, by contrast, is what is required if the organisation is to be able to transform itself in situations of open-ended change. Here rationalistic forms of decision making are largely inoperative, since these require as their starting point precisely those ‘givens’ which must now be disputed.
The problems and practices of ordinary management have been repeatedly addressed in management texts.
What is innovative is the concept of extraordinary management.
Extraordinary management requires the activation of the tacit knowledge and creativity available within the organisation. This necessitates the encouragement of informal structures – for example, workshops round particular issues or processes, with membership drawn from different business units, functions, and levels. Formation of these groups should be essentially spontaneous, provoked by paradoxes, anomalies and conflicts thrown up in the process of normal management. They need to be self-organising, capable of redefining or extending their remit rather than being bound by fixed terms of reference. Under these conditions group learning can occur, and its results fed in as arguments to the broader management process. In the necessary absence of hard evidence, arguments in favour of new assumptions and directions will be analogical and intuitive, and the process of decision making will be political as champions attempt to persuade others to their point of view.
Stacey does not propose that ordinary management should drive out extraordinary management. His case is rather that both are needed in viable organisations, and they must be enabled to coexist. There is, however, an intrinsic tension between the two modes. If the boundaries limiting the scope of extraordinary management’s informal networks are drawn too tight, it will wither; too loose, and the organisation will descend into anarchy, failing to deliver on its core short-term tasks.
It follows that one key task of senior management is to manage these boundaries. It needs also, eg by mid-career recruitment or job rotation, to ensure that there is not a single homogeneous organisational culture. It should take steps to promote an active internal politics that is both open and broadly democratic in style. Senior management should not espouse a unique vision or long-term plan, but should rather promote the conditions for the emergence of an evolving agenda of strategic issues, and aspirations. It should intervene only selectively, and then at sensitive points; to do so effectively it needs to have an understanding of the qualitative patterns of behaviour which such intervention could produce, without wishing to control it to a preconceived path or believing that it could.
In effect management needs to combine permissive style with abrasive challenge. If necessary it should provoke conflict through ambiguity, deliberately steer away from equilibrium, intentionally escalate small changes, amplify rather than damp down the effects of chance events.
The role for analysis in extraordinary management is extraordinarily limited. While the whole purpose is to ensure long-term survival, there is no long-term plan and precious little long-term planning. The strategic role of senior management is largely to facilitate processes of dialogue which can lead to innovation, rather than to preside as final arbiters over an elaborate analytic process. Stacey in particular relegates rather cursorily even those tools which might be thought to be consistent with extraordinary management (eg simulation, scenario analysis) to a marginal role, if any.
This downplaying of analysis is asserted rather than argued. Nevertheless it clearly stems from the very firm distinction which Stacey draws between rationality and creativity. For him rationality is fine, and necessary, for handling routine business, but is just not up to the job of sense-making in poorly structured situations. There is, however, evidence to support the common-sense observation that in practice rationality and creativity are not kept in such tidy compartments. Shallice (1996) has demonstrated that dealing with novel situations involves complex cognitive processes which have many rational elements. More piquantly, it was Poincaré himself, so often cited by writers on complexity as the unwitting father of their subject, who has left us the most insightful personal story of (mathematical) creativity from the point of view of the creator. In this classic account (Poincaré 1946) he makes absolutely clear the crucial role of very extensive, even exhaustive, analytic endeavour in preparing the ground for his breakthrough!
These lessons constitute an ambitious agenda for change. Explicit claims are made to have identified managerial imperatives for the survival of organisations. Implicit at least is a claim to have a more adequate characterisation of the threats and opportunities confronting organisations and their managements than has previously been articulated. It is therefore appropriate to evaluate the achievement of the management complexity writers on two fronts: in what respects is the account novel? and what is the evidence for the claims? These two approaches will be taken in this and in the following section.
The general question of novelty has two components. Are there important features of their analysis which have been previously identified and prioritised, but from other management research perspectives or traditions? And have disciplines peripheral to or outside management (taken narrowly) generated other proposals for handling these issues which the complexity writers ignore or unreasonably dismiss?
On the first of these Stacey repeatedly cites other management researchers who, despite working in very different conceptual frameworks, have reached comparable conclusions. These convergences include
* that organisations do not only adapt to their environments, but help to create them
* that organisational success can come from contradiction as well as consistency
* that success may stem from being part of a self-reinforcing cycle, rather than from an explicit ‘vision’
* that revolutionary as well as incremental changes may lie on the route to organisational success.
These outputs of more conventional schools of managerial thought can serve the function of making the complexity writers’ prescriptions appear less threatening. In assessing the novelty of the complexity perspective we need also to take account of the claims for priority in certain areas which other approaches could lodge.
The prior work which is most directly relevant comes, predominantly, with a ‘systems’ pedigree. Thus cybernetics for more than 50 years has been attempting to develop understandings of the behaviour of self-organising systems; while since the 1960s system dynamics has been exploring how complex systems incorporating feedback can generate counter-intuitive consequences.
Writers like Vickers (1965) and Schon (1973) took related systems concepts into broader social discourse a generation and more ago. Etzioni (1971) is another thinker who anticipates some of the complexity-based insights. His mixed scanning approach to planning is an attempt to encompass the paradox that organisations need both control and innovation.
Stacey (1996) holds that systems contributions pre-dating the complexity revolution are irredeemably trapped. They operate within the stable equilibrium paradigm, and are wedded to notions of efficiency, effectiveness and control. They are thus outdated by the new findings. (He makes a limited exception for system dynamics, and cites Senge’s (1990) system archetypes – a small number of qualitative generic structures of feedback interaction - as examples of the sort of broad patterns of behaviour which management needs to take into account.)
Such literature as does exist, in a number of other disciplines, on disorderly processes in organisations, emergent strategies, dialectical evolution etc. lacks a coherent theoretical framework. The ‘science of complexity’, he says, adds value to such compatible work as can be found in the literature by supplying that coherence.
The managerially relevant claims for complexity theory are thus large. In their demotion of pre-existing systems insights, and in a number of other respects, they rest explicitly on the authority of science. We should look next at the solidity of this foundation.
What is the evidence?
It hardly needs saying that there is no formally validated evidence demonstrating that the complexity theory-based prescriptions for management style, structure and process do produce the results claimed for them. These results are generally to do with long-term survival, a phenomenon not susceptible to study using short-term experimental methods. Such evidence as is adduced is almost exclusively anecdotal in character. The stories range from improving tales of successful corporate improvisation, to longer accounts of organisational death wishes or of innovation which bypasses the obstruction of the formal hierarchy; there are also approving quotations from business leaders.
The problem with anecdotal evidence is that it is most persuasive to those who experienced the events in question, and to those who are already persuaded. For others it can be hard to judge the representativeness of the sample of exhibits. This is especially so if, even unintentionally, different standards of proof or disproof are used for different sides of an argument. Such distortions do occur in Stacey (1992). Thus the advantage of opportunistic policies is supported by presenting examples of success, while the perils of formal planning methods are driven home by examples of failure. Yet obviously opportunism has its failures, and analytic techniques even have their modest achievements – which are not cited.
In the absence of a conclusive case based on evidence of organisational success, it is not surprising that great weight is placed on the authority of science. Wheatley (1992) has it in her title – "Leadership and the New Science". Merry (1995) relegates it to his sub-title, but in the plural: "Insights from the New Sciences of Chaos, Self-Organization and Complexity". Indeed ‘New Sciences’, always capitalised, runs through his book like the message in Blackpool rock. However all management complexity authors lean heavily on ‘science’ in their texts. These are liberally peppered with phrases like "Scientific discoveries have shown that…" or "The science of complexity shows that…". The illustrative examples provided are commonly of natural rather than social or managerial phenomena – the behaviour of molecules when the temperature of a liquid rises, a laser beam, the weather…
Many of the ‘results’ cited in the complexity literature are not, however, firmly grounded on empirical observations. They are the outputs of computer simulations. Typically some simple laws of behaviour and interaction are postulated, and the computer is used to see how the operations of these laws would translate into long-term development or macro-behaviour. For example Kauffman (1993, 1995) models how an organism might evolve through an ‘adaptive walk’ of mutations across available alternatives, depending on the degree of cross-coupling of the organism’s component parts. Krugman (1996) shows how aggregate patterns of land use (eg the formation of multiple business districts, racial segregation) could result from individual responses to purely local conditions. Stacey (1996) reports a wide variety of simulations, mostly produced under the auspices of the Santa Fe Institute, in which simple rules of individual behaviour generate replications of the flocking of birds, the trail-laying of ants, the dynamics of organism-parasite systems… In each case the computer tracks the way in which such simple laws, if they were to hold, could produce patterned order.
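The flavour of such simulations can be conveyed with a very small sketch. The following is not a reproduction of any of the cited models, merely an illustration in their spirit: two types of agent on a ring relocate only in response to purely local conditions (an assumed tolerance threshold), yet clustering emerges at the aggregate level.

```python
# A minimal sketch in the spirit of the simulations described above, not a
# reproduction of any cited model. Two types of agent sit on a ring; an
# unhappy agent (too few like neighbours) swaps places with a randomly chosen
# cell. The neighbourhood size, tolerance threshold and number of rounds are
# all illustrative assumptions.
import random

random.seed(1)
N, THRESHOLD = 60, 0.5                      # cells on the ring; minimum share of like neighbours
cells = [random.choice("AB") for _ in range(N)]

def unhappy(i):
    """True if fewer than THRESHOLD of the four nearest neighbours match cell i."""
    neighbours = [cells[(i + d) % N] for d in (-2, -1, 1, 2)]
    like = sum(1 for n in neighbours if n == cells[i])
    return like / len(neighbours) < THRESHOLD

def boundaries():
    """Count adjacent unlike pairs: fewer boundaries means more clustering."""
    return sum(1 for i in range(N) if cells[i] != cells[(i + 1) % N])

print("boundaries at start:", boundaries())
for _ in range(5000):
    i = random.randrange(N)
    if unhappy(i):
        j = random.randrange(N)
        cells[i], cells[j] = cells[j], cells[i]
print("boundaries at end:  ", boundaries())
# Typically the number of boundaries falls sharply: purely local decisions
# have produced a global pattern of clustering that no agent intended.
```

The demonstration shows only that such rules could produce patterned order; it says nothing about whether they are the rules actually at work.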
Evidently such demonstrations, absorbing though they may be, cannot constitute proofs that these laws are indeed the cause of the observed behaviour. Indeed Kauffman (1993), in the introduction to his 700-page volume, stresses that "this is not a finished book …Premises and conclusions stand open to criticism." Krugman (1996) adopts an informal approach, and allows himself to include "a few wild speculations". Mitleton-Kelly (1997) recognises a further need for circumspection which arises in essaying to transfer complexity theory formulations from the natural to the social domain. Behaviour in the former may be assumed to be governed by laws; in the latter, awareness of a claimed law may itself generate changed behaviour. In this crucial respect, social systems (including organisations and their managements) are fundamentally different from all other complex systems.
It can be seen from this that scientific authority is an unsafe ground for asserting that specific results from complexity theory necessarily apply to organisations, or that complexity-based lessons constitute imperatives for management practice.
Stacey makes a series of strong assertions about the range of systems which come within the remit of complexity theory. Thus (p.124) "most of nature’s systems are nonlinear feedback ones" that in many respects exhibit chaotic and therefore unpredictable behaviour. It is not quite clear how one could either substantiate or refute such a claim – but as organisations are social rather than natural systems we do not have to address this question.
However the pronouncements on organisations are in the same vein. Following an account of nonlinear feedback systems operating "far from equilibrium" (p.11) we read that "an innovative business is just such a system". This is broadened out in a passage (pp. 48-51) which appears to extend the scope of this mapping to all businesses whether innovative or not.
The argument is that "almost all interactions in organizations between individuals, or groups of individuals, take on a nonlinear feedback form"; as a result "every business" is a web of such feedback loops; and the scientific discoveries of complexity therefore apply. In his later book – Stacey (1996) – the remit is extended still further: "human systems, that is, individuals, groups, organizations, and societies, are all nonlinear feedback networks" to which the findings of complexity theory consequentially apply (p.47).
But do they?
There are some concealed mathematical booby-traps along the route which this argument has just traversed. Consider two of these. The first is that not all non-linear dynamical systems exhibit chaotic behaviour. Many are quite happy to settle down to stable equilibria, on which an infinitesimal difference in starting conditions has only an infinitesimal effect. Whether we get chaos or not depends critically not only on the form of the equation but also on the parameters defining the strength of the feedback loops. Evidence that the feedback parameters in the whole array of social systems listed above do in fact take chaos-generating values is absent.
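The point about parameter values is easy to demonstrate with the same illustrative equation used earlier: holding everything else fixed and changing only the strength of the feedback switches the system between a stable equilibrium and chaotic wandering. The parameter values below are my own choices for the example.

```python
# A minimal sketch of parameter dependence in the logistic map (an
# illustrative equation, with parameter values chosen for the example).

def settle(r, x=0.2, transient=200, keep=5):
    """Run past the transient, then report the next few values."""
    for _ in range(transient):
        x = r * x * (1 - x)
    tail = []
    for _ in range(keep):
        x = r * x * (1 - x)
        tail.append(round(x, 4))
    return tail

print("r = 2.5:", settle(2.5))   # settles on the fixed point 0.6
print("r = 3.9:", settle(3.9))   # never settles: values keep skipping about
```

Nothing about the form of the equation guarantees chaos; whether an organisation’s ‘feedback loops’ happen to take chaos-generating values is precisely the empirical question left unanswered.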
The second difficulty is that the mathematical work cited by management complexity writers is virtually all about deterministic chaos. That is, the unpredictability arises without the need for any randomness of input or process. The weird and wonderful results of mathematical chaos theory which have so gripped the public imagination all stem from this type of formulation. However it is surely indisputable that in the real world of social processes and organisations to be managed, probabilistic elements abound. Sadly perhaps, under what is called stochastic chaos strange attractors fail to manifest themselves (and consequently these more realistic situations have attracted much less theoretical attention). Intuitively, we may understand this result as the buffeting of random shocks knocking the system out of the delicate entrapment of the strange attractor. Systems return to a stable equilibrium, or retreat indefinitely from it. The edge of chaos, on which so much management complexity writing is predicated, is abolished.
These severe limitations on the practical relevance of chaos/complexity have evidently not deterred the rather inclusive claims made for its sphere of operation. Indeed the strength and generality of the assertions based on complexity theory merge into a sense that its findings are non-negotiable. We find a text where the words ‘manager’, and ‘must’ or ‘need to’, go together like, as they used to say, a horse and carriage. We hear of actions which "managers must take to be successful"; and that "managers need to embrace this new frame of reference" (both in the Preface to Stacey (1992)).
We find that "human organizations…must operate in chaos if they are to be continually creative"(p.124). Some of these ‘oughts’ are less problematic than others. No author can reasonably be debarred from urging the theme of his/her book on us as a new way of thinking, an improved mental model. There can be concern, though, when the new frame of reference (supported by appeals to science) generates prescriptions for management control, management style and culture, risk taking behaviour, skill enhancement…The pervasive message is that corporate survival depends on how well management internalises the complexity perspective.
Complexity as metaphor
If it is unwarranted (and it is) to say to managers "Take these actions because science has discovered that they are necessary", does that dispose of the matter? Can managers safely put away those rather daunting books with complexity in their titles, and get on with managing by the seat of their pants (or by the recipe of some less scientific guru)? Well, not necessarily. Though one cannot prove that the claimed results (about creativity at the edge of chaos, versus controlled extinction in an equilibrium state) apply to the management of organisations, there is another way in which they can be relevant. This is the route of metaphor or analogy.
There is of course a difference between the two. A metaphor is a figure of speech in which a name or descriptive term is transferred to some object to which it is not properly applicable. Metaphor is an entirely legitimate device. It can be a way of illuminating certain phenomena in a novel way, so that routine understandings of their significance may be enriched or replaced by interpretations based on the quite different field to which they are juxtaposed. Literature would be impoverished without it. So, quite possibly, would science. For example, a good case can be made out that Darwin’s phrase "natural selection" is a highly debatable metaphor drawn from the field of plant and animal breeding (Young 1985).
However if complexity/chaos is an extended metaphor when applied to organisational behaviour and management, then clearly it loses any prescriptive force. Any influence it may have over policy will depend on the vividness of the metaphor, and on the plausibility in the real world of organisations of any inferences for practice that the metaphor seems to imply. For example, economists (or perhaps politicians) used to make frequent use of an automobile metaphor to describe macroeconomic policy – applying the brakes, a touch on the accelerator. Yet no one would have proposed that this metaphor, doubtless helpful to understanding, could justify designing economic institutions based on the big end or tyre valve.
An analogy carries rather more clout. An analogy consists of some assertions of similarity or difference between corresponding elements in two different systems, and about the sets of causal relations operating within each system (Hesse 1966). Analogies are widely used to suggest scientific hypotheses worth investigating, to the extent that ‘analogy’ and ‘model’ can be treated as virtual synonyms (Brodbeck 1968). Generally the analogy is used to connect a well-understood domain to one in which understanding is less developed. So, for example, Huygens developed his wave theory of light with ideas from the familiar view of sound as a wave phenomenon; and Fourier’s theory of heat conduction was constructed by analogy with the known laws of the flows of liquids (Nagel 1961).
In the light of this, could the management complexity writers, deprived of ultimate natural scientific authority, nevertheless claim that the results they draw from complexity theory hold for management by analogy? The requirements for this would be
(a) that the natural scientific domain of complexity theory is better understood than that of management;
(b) that there are concepts in the first domain which have been clearly put in one-to-one correspondence with similarly precise equivalents in the second;
and (c) that connections (especially causal ones) between groups of concepts in the first domain are implicitly preserved between their equivalents in the second (Brodbeck 1968).
If the answers to all these queries are positive, then the analogy may reasonably be used to articulate a theory in the new domain. Validation of this new theory can only come from empirical work in the new domain.
(a) Complexity theory proper is a rapidly developing but still young and imperfectly integrated field. It is certainly arguable whether it is sufficiently well established to serve as a reliable source of analogies for the field of management.
(b) Management complexity authors are rather unspecific about what organisational aspects are to be put in relation with what concepts drawn from complexity theory. For example, is it the organisation or its environment which is the focus of interest in which chaotic behaviour might be observed?
(c) Broadly the relationships claimed to be preserved across the two domains are those of non-linear feedback between elements within each of them. This mapping is so general and therefore so undemanding as to add only limited credibility to the analogy.
This situation could well change, as complexity theory develops further, or as management complexity writers refine their analyses. However at present the conceptual basis seems inadequate to support testable analogical insights.
As a footnote to this discussion, it is worth recalling Stacey’s key task for ‘extraordinary management’ – to understand the qualitative patterns of behaviour which their interventions could provoke.
These recognisable patterns, he says (Stacey 1993, pp231-2), are typified by Senge’s (1990) archetypes – qualitative maps which are essentially simplified causal loop diagrams from the field of system dynamics. In fact, though commonly attributed to Senge, the concept of ‘archetype’ in this context was originated by Meadows (1982). These archetypes are designed to illustrate a range of generic types of counter-intuitive behaviours of systems. Lane (1998) argues that archetypes have at most an analogical [I would say, metaphorical] relationship with their real world targets. "With weakly defined attributes there seem to be few ways to test the isomorphism and so gain confidence in the relevance of the diagram." He is driven to ask "is this science or ‘new age’ waffle?" Regardless of how one answers this question, the central place of Senge’s archetypes in Stacey’s scheme dramatises the role of approximate and suggestive thinking in the management lessons drawn from complexity theory.
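For readers who have not met these archetypes, a deliberately toy sketch (my own construction, not Senge’s or Stacey’s formulation) of the ‘limits to growth’ pattern shows the kind of generic behaviour a simplified causal loop diagram is meant to convey: a reinforcing growth loop coupled to a balancing loop that eventually throttles it.

def limits_to_growth(steps=60, growth_rate=0.15, capacity=100.0):
    # Reinforcing loop: the larger the stock, the more it adds each period.
    # Balancing loop: growth is damped as the stock approaches the limiting capacity.
    stock = 1.0
    history = []
    for _ in range(steps):
        damping = max(0.0, 1.0 - stock / capacity)
        stock += growth_rate * stock * damping
        history.append(stock)
    return history

trajectory = limits_to_growth()
print([round(x, 1) for x in trajectory[:5]])   # near-exponential take-off
print(round(trajectory[-1], 1))                # flattens out close to the capacity

The early steps look exponential; the later ones flatten, which is precisely the counter-intuitive lesson the archetype is meant to deliver to anyone extrapolating from the early trend.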
Summary
This account of attempts to apply ideas from complexity theory to management practice has been broadly critical – critical of claims for the authoritative status of what would be better presented as stimulating metaphors. It is indeed curious that a message based on the importance of accepting instability, uncertainty and the limits to our knowledge should be presented with such an excess of certainty. The explanation for this paradox may lie in the twin heritage of management complexity. The ‘systems’ community world-wide has been particularly prone to sectarianism and evangelism, while the audience for management texts is conditioned to expect large generalisations supported anecdotally. It can be a heady mixture.
This conceptual imperialism is both unfortunate and unnecessary. Unfortunate because some of those exposed to these ideas may reject them on grounds of over-selling, while others (recall that complexity theory proper is far from transparent) may be persuaded to place more reliance on the ‘findings’ than is merited. Unnecessary, because management complexity has indeed generated metaphorically based insights which are novel and instructive.
There are a number of complexity theory insights we can take away. Consider the handling of the long-term. Long-term planning has taken such a battering that the complexity-based view that it is impossible anyway can hardly classify as startling. However Stacey’s extension of this critique to cover the view of ‘strategy as vision’ is a powerful antidote to much management writing. He makes a good case (Stacey 1992, pp 126-144) that a single vision to serve as intended organisational future, motivator of behaviour and guarantor of corporate cohesion is a thoroughly bad idea. It produces a culture of dependency, restricts the expression of conflicting views, and generates shared mental models tending towards groupthink. One must hope that this debunking of ‘the vision thing’ will prove influential.
Not that the long-term is dismissed as an effective irrelevance. What is proposed is a refocusing: rather than establish a future target and work back to what we do now to achieve it, the sequence is reversed. We should concentrate on the significant issues which need to be handled in the short-term, and ensure that the debate about their long-term consequences is lively and engaged. Read in this way, the lesson of complexity theory is not to justify short-termism, but to point towards a more practicable way of taking the future into account.
For this more pragmatic balance between present concerns and future potentialities to be achieved, Stacey proposes the deliberate fostering of an active organisational politics. Only in this way can learning, the essential requirement for organisational survival, be institutionalised. This is an uncomfortable message, and many organisations will find that it sits awkwardly with current structures and cultures. However it is arguably a much needed corrective to the authoritarian, hegemonic or presumptively consensual regimes which are our management heritage.
There are, of course, insights and insights. Some of the implications of Stacey’s complexity-based view are harder to embrace, even in principle. I will highlight just one of these here – the notion that organisations ‘should’ operate at the edge of chaos. (Another, the downgrading of analysis, I will address a little later.) We are repeatedly told that "the space for creativity in an adaptive system is a phase transition at the edge of chaos, at the edge of system disintegration" (Stacey 1996, p.97); "human organizations …must operate in chaos if they are to be continually creative" (Stacey 1992, p.124); or again, "the constant creativity and innovation necessary for success can occur only in this state" – namely bounded instability, or chaos (Stacey 1992, p.186).
To stay far from equilibrium, the organisation has to operate as a ‘dissipative structure’, maintainable only by "continual inputs of energy, attention and information" (Stacey 1993, p.231). For this reason these states occur in nature only fleetingly – they break down into disorder or relapse into equilibrium. If the analogy to organisations can be sustained (on which I have said enough above), it can hardly support an imperative for organisations to operate ‘constantly’ and ‘continually’ in this mode. It might instead suggest a more intermittent invocation of a fluid and responsive stance, during which radically new directions would be entertained. (This is, in fact, the view taken by McMaster, 1995.) In between, the organisation would operate in a relatively stable manner.
Appropriate thought on procedures would be necessary to stop this mixed mode degenerating into crisis management, pure and simple. But the alternative proposed by Stacey is that our organisations should operate permanently poised on the edge between order and disintegration. This is in effect to propose an unsecured gamble with institutions which are dominant repositories of our physical and intellectual capital, and with the psychological health of those who would operate them under continuously stressful and under-structured conditions.
The discussion above is intended as one example of how we might make constructive use of management complexity. We do not need to swallow it whole, or reject it root and branch. Indeed Stacey (despite his tendencies towards the prescriptive) at intervals stresses that there are no recipes, that this is a way of thinking, that these are mental models. Wheatley (1992, p.150) urges the use of management complexity ideas to make sense of our own experience. So long as we give that experience powers of veto, management complexity can serve as a constructive provocation.
The evolution of management complexity
Management complexity theory redescribes the organisational world. Why, in this way, now?
One explanation might be that this new perspective is driven by substantive changes in that world: organisations getting bigger, more information arriving sooner, environments changing faster. Undoubtedly there is something to this argument; ideas do need to be timely. The ‘learning organisation’, for example, would not have been a helpful concept to the pioneers of the first industrial revolution, lacking as they did any cadre of professional managers. Nor would they have appreciated advice to aim for the edge of chaos, obsessed as they were with establishing greater control. Nevertheless this argument, while excluding past eras, still leaves a good deal of leeway for the timing of this conceptual revolution.
Timeliness also plays a part more indirectly. The development of the various elements of complexity theory has depended heavily on computing power; as the cost of computing has fallen, so the feasibility of elaborate simulations of dynamic behaviour has grown. This has meant that the natural science results from which management complexity theory might borrow have in effect been accumulating only since the 1960’s.
Nevertheless it is striking that general as opposed to academic interest in complexity theory mushroomed from the end of the 1980’s and as yet shows no signs of abating. This is as true for best selling works of popular science as it is for the management complexity developments we have been considering. Ideas which get taken up need to be in keeping with the spirit of the age. It is not, therefore, necessary to see the congruence (excellent, as we will see below) between complexity theory and the broader current of events and ideas in society as just a remarkable coincidence.
These effects are sometimes easier to see and accept in historical perspective than in relation to our own times. So, to provide some extra credibility before applying the same approach to complexity theory, let me make a short detour back to Darwin. (The following account is based on Young, 1985.) His "On the Origin of Species" was one of the two most significant publications of the nineteenth century. (The other was Marx’s Das Kapital.) It precipitated a revolution in thought, transforming how future generations would see the place of humans in the universe.
The full title of Darwin’s great work was "On the Origin of Species by Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life". It was published in 1859 after a gestation period of more than 20 years. I have already referred to the metaphorical role of his phrase ‘natural selection’. Here I wish to draw attention to the mechanism which he proposed for selection, namely the ‘struggle for life’. Darwin’s notes, sketches and correspondence make clear the formative role in this respect of his reading of Malthus in 1838. Malthus’s theories were concerned not with biological diversity, but with social and economic conditions at the epicentre of the convulsion of the industrial revolution in Britain. Agricultural enclosures were dispossessing the rural population, who flooded to the new urban industrial centres. Against this background he published in 1798 his influential "An Essay on the Principle of Population", in which he argued that, since population tends to increase geometrically, but food supplies can grow only linearly, increasing population would always tend to outstrip the means of subsistence. The widening gap would be handled by poverty, famine and war. So Darwin’s adoption of the survival of the fittest as a key element of his theory was actually based on a philosophy which derived from and was particularly relevant to its era – that of the most brutal and self-confident phase of industrial capitalism.
The reason for this detour is that we can find comparable resonances linking complexity theory, larger currents of thought, and the social and economic circumstances of our own era. The past 20 years have not only seen the mushrooming of interest in complexity theory. They have also been years in which a hegemonic common wisdom has developed, in favour of the market rather than planning as the way to order our affairs; in which the demise of the Soviet Union has consolidated this victory in geo-political terms; in which mass production of standard products has been replaced by a Post-Fordist fluidity in promoting and responding to changes in taste and fashion; in which influential thinkers have mounted an assault on the pre-eminence of reason in social decision-making (see Rosenhead 1992); and in which postmodernism has made good progress in sweeping such notions as ‘progress’ off the agenda of respectable discourse.
All of this makes a theory which appears to argue that the future is in principle unknowable and therefore unplannable an extremely welcome menu item at the intellectual dinner table. It appears to add the authority of science to pronouncements that "there is no alternative" to the market. At present such pronouncements seem well on the way to incorporation into the set of ideas in good currency for making sense of the social world.
There is a precedent for this process in the response to Darwin, which was not only scientific and theological in character; his work also provoked an influential current of social theory. Known as Social Darwinism, it pronounced that those in lowly social positions were there as a result of their lesser competence in the social struggle for survival. Conveniently (for those in higher social positions), it was ‘nature’s law’ that this was so. (The same argument demonstrated the evolved superiority of successful nations.) There was no need, indeed it would be anti-progressive, to attempt to alleviate the condition of the poor, or to restrain the free competition which led to it. (See Bannister 1979.)
As we have seen, this ‘timely’ idea had a tangled provenance, having moved from Malthus to Darwin, and then back to the Social Darwinists where it could be presented as the findings of science. The situation for complexity theory is simpler. There has been no yo-yoing back and forth between the natural scientific and the social domains, just a one way ticket. The mechanism can be seen in operation in the work of Kauffman. His monumental book (Kauffman 1993) in fact returns to the terrain of Darwin. He is concerned to show that random variation and natural selection provide only a partial explanation for the observable order in the natural world, and that there are sources of self-organising, spontaneous order in the biological phenomena which natural selection has to work with.
So far so good, you might think, but rather remote from organisations and management. Not so. In Kauffman (1995) these ideas are popularised, and applied to technological innovation and organisational design. Large organisations, he says, need to be broken down into ‘patches’ rather than run hierarchically from the top. Then, "as if by an invisible hand", they will be able by mutual interaction to discover excellent solutions to tough problems. The analogy between these organisational issues and his primary research field of biology, he suspects, may be more than analogy; rather, the biosphere and the technosphere "may be governed by the same or similar fundamental laws".
We have already addressed the question of analogy at some length. But let us pause on that "invisible hand". Krugman (1996, p.76) uses the same words to describe the spontaneous emergence of a regular spatial pattern of urban centres. Neither of them invented the phrase. It dates from the eighteenth century and is due to Adam Smith, who used it metaphorically in connection with the workings of the market. The market operates so that, although all individuals act in their own self-interest, ‘as if by an invisible hand’ the wealth of the nation will grow. Smith’s conclusion was that no interference with free competition should be countenanced.
There is indeed a striking parallelism here. Adam Smith said that self-interested mutual interaction in the market is uniquely efficient in economic terms. Complexity theorists see that spontaneous self-organisation can produce survival strategies where central planning would fail. Two hundred years after Smith, in a period when there is once again an almost mystical belief in the beneficial properties of market forces, we are told on the authority of science that non-intervention is best.
So times change, and acceptable meta-ideas change with them. It is curious that, neatly in step with the popularisation of complexity theory, a blood relative of Social Darwinism has emerged to considerable acclaim under the label of ‘evolutionary psychology’ (Barkow et al 1992; Pinker 1998). The programme of this movement is to argue that natural selection operates not only on physical characteristics, but also on the mind – dispositions, tastes, preferences and attitudes. It follows that those which we have are deeply embedded, and hence much less steerable by social engineering than was often previously assumed. Thus it too, like complexity theory, provides justification for governmental non-interference. These are truly ideas well in touch with the spirit of the age!
A role for analysis
Stacey’s explicit rejection of a significant role for analytic methods was highlighted in section 3 above. This position is not uniform among management complexity writers; mostly they simply ignore the topic. (McMaster, 1995 does at least implicitly concede that scenario analysis could be useful in thinking about the future.) The general posture seems to be that managers who realise the importance of creativity in a world of complexity will have concerns for which analysis is an irrelevance.
The picture which Stacey (1992) paints of ‘step-by step analysis’ is a caricature designed to show up complexity-based thinking to maximum advantage. He portrays a mindless routine in which set models, rules and computations are applied with blinkered religious fervour. The choice, the reader is invited to believe, is between this, and a more fluid, creative and political process. Either-or, not both-and. (Yet we have seen, in section 3, that creativity and rationality are mutually supportive rather than exclusive.) Because the organisational world is one of non-linear dynamics, we are told, extraordinary managers must be intuitive, innovate, and spot emergent strategies. And all without recourse to analytic crutches.
As a corrective to analytic over-dependence, where it is still to be found, this rhetoric may have some value. But elsewhere it will serve as further incitement to an irrationalism which is already in the ascendancy. In any case, the logic by which this anti-analytical conclusion follows from Stacey’s complexity theory-based diagnosis is less than compelling. For he presents a powerful argument (Stacey 1992, pp 101-2) for the inadequacy of informal mental models – the same case incidentally that operational research advanced thirty or so years earlier – only to leave managers with no tools to supplement them. Changed structures, processes and culture can make a major contribution, to be sure. But in the end, ways forward need to emerge from hard thinking by individuals or groups about the complexity and uncertainty of their situation. Dealing with interconnected systems and environmental turbulence as they do, they need all the help they can get.
Consider some of the difficulties confronting Stacey’s ‘extraordinary managers’ for which appropriate analytic assistance is in principle available – operating in an uncertain world, learning to learn, taking decisions politically. On the handling of uncertainty, Stacey thinks that this can only be done by predicting the future, which complexity theory claims is impossible in any meaningful sense. Yet scenario planning is precisely predicated on this proposition. It assumes "that there is irreducible uncertainty and ambiguity in any situation faced by the strategist, and that successful strategy can only be developed in full view of this" (van der Heijden 1996, p.8). The different scenarios represent alternative possible futures for the organisation’s context; they are developed as a stimulus to constructive debate on strategy. Indeed Sunter (1992), whose work was influential in South Africa's peaceful transition to democracy, justifies scenario planning in terms of uprooting the complacency of (temporarily) successful organisations, terms which mirror Stacey's concern about perfectly planned corporate death.
We are not finished with analytic assistance related to uncertainty. According to Stacey, uncertainty requires that strategic thinking should be anchored to the here and now, not the future. That is essentially the perspective of robustness analysis (Rosenhead, 1989b). This methodology adopts a bifocal approach in which thinking about possible futures informs current choices, not through the adoption of a ‘plan’, but via the (measurable) flexibility which current choices will, or will not, preserve.
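As a hypothetical illustration (the sites and configurations below are invented, and the scoring is a simplification of the method described in Rosenhead 1989b), the flexibility that robustness analysis measures can be expressed as the fraction of acceptable end-state configurations that an initial commitment still leaves attainable.

def robustness(still_attainable, acceptable):
    # Robustness of an initial commitment: the share of acceptable end-states
    # at the planning horizon with which that commitment remains compatible.
    return len(still_attainable & acceptable) / len(acceptable)

# Invented example: acceptable end-state configurations of facilities, and the
# subset of them that each candidate first-step commitment keeps open.
acceptable = {"A+B", "A+C", "B+C", "A+B+C"}
keeps_open = {
    "commit to site A now": {"A+B", "A+C", "A+B+C"},
    "commit to site C now": {"A+C", "B+C", "A+B+C"},
    "commit to sites A and B now": {"A+B", "A+B+C"},
}
for commitment, futures in keeps_open.items():
    print(commitment, robustness(futures, acceptable))

On these made-up numbers, either single-site commitment preserves three of the four acceptable futures, while the larger early commitment forecloses half of them; the debate about long-term consequences is thereby brought to bear directly on the short-term choice.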
Moving on, we find learning, carried out in groups, at the centre of Stacey’s version of strategic thinking. There are methods which can make these processes more effective, and there is no requirement of principle or common-sense why they should not be employed. Facilitated groups are likely to be more productive (Phillips and Phillips 1993), and a considerable variety of group decision support systems (GDSS) now exist which enhance rather than substitute for the group process (Huxham 1996).
Finally there is the question of reaching agreement on what to do. For Stacey this is one of the functions of the political process within the organisation. Crudely, the process is a heaving mass of power plays, rhetoric, self-interest, deals and ego-involvement all carried out in smoke-filled rooms – and of which he thoroughly approves as a mechanism far superior to central control for generating the decisions from which strategy emerges. In effect he treats politics as essentially a no-go area for rational thought. Yet all the participants do have their own separate rationalities. Not only that, there is usually the implicit acceptance of a common and overarching constraint that, despite their divergent perceptions and priorities, some form of operable outcome is required. This common ground will often provide scope for the use of one or more of the family of problem structuring methods (Rosenhead 1989a, 1996). These provide structured ways of avoiding mutual misperception, identifying a problem focus, opening up space for negotiation and getting buy-in to commitments.
Aids to rational discourse on organisational issues fit awkwardly within the world-view promoted by management complexity authors. Indeed they provide rational arguments against rationality, as well as forecasting with great confidence the impossibility of forecasting, and planning for the absence of planning. If we resist their invitation to elevate one view of management and organisation into the view, there are many thought-provoking and practical insights along the way.
Bibliography
R C Bannister (1979) Social Darwinism: science and myth, University of Pennsylvania Press, Philadelphia.
J H Barkow, L Cosmides and J Tooby (eds.) (1992) The Adapted Mind: evolutionary psychology and the generation of culture, Oxford University Press, New York.
M Brodbeck (1968) ‘Models, meaning, and theories’. In M Brodbeck (ed.) Readings in the Philosophy of the Social Sciences Macmillan, New York (pp 579-600).
A Etzioni (1971) The Active Society: a theory of societal and political processes, Free Press, New York.
K van der Heijden (1996) Scenarios: the art of strategic conversation, Wiley, Chichester.
M B Hesse (1966) Models and Analogies in Science, University of Notre Dame Press, Notre Dame, Indiana.
C Huxham (1996) ‘Group decision support for collaboration’. In C Huxham (ed.) Collaborative Advantage, Sage, London (pp 141-151).
S A Kauffman (1993) The Origins of Order: self-organization and selection in evolution, Oxford University Press, New York.
S A Kauffman (1995) ‘Escaping the Red Queen effect’, The McKinsey Quarterly 1995 No. 1 (pp 119-129).
P Krugman (1996) The Self-Organizing Economy, Blackwell, Cambridge Mass.
D C Lane (1998) ‘Can we have confidence in generic structures?’, J. Opl Res. Soc. 49 (pp 936-947).
M D McMaster (1995) The Intelligence Advantage: organising for complexity, Knowledge Based Development, Douglas IOM.
D H Meadows (1982) ‘Whole earth models and systems’, The Coevolution Quarterly, Summer (pp 98-108).
U Merry (1995) Coping With Uncertainty: insights from the new sciences of chaos, self-organization and complexity, Praeger, Westport, Conn.
E Mitleton-Kelly (1997) ‘Organisations as co-evolving complex adaptive systems’. BPRC Paper No.5, Business Process Resource Centre, University of Warwick, Coventry.
G Morgan (1986) Images of Organization, Sage, Beverly Hills, CA.
E Nagel (1961) The Structure of Science: problems in the logic of scientific explanation, Routledge and Kegan Paul, London.
L Phillips and M Phillips (1993) ‘Facilitated work groups: theory and practice’, J. Opl Res. Soc. 44 (pp 533-549).
S Pinker (1998) How the Mind Works, Allen Lane:Penguin Press, London.
H Poincaré (1946) The Foundations of Science, The Science Press, Lancaster, Pa.
I Prigogine (1989) ‘The philosophy of instability’, Futures, August 1989 (pp 396-400).
J Rosenhead (ed.)(1989a) Rational Analysis for a Problematic World: problem structuring methods for complexity, uncertainty and conflict, Wiley, Chichester.
J Rosenhead (1989b) ‘Robustness analysis: keeping your options open’. In J.Rosenhead (1989a) (pp 193-218).
J Rosenhead (1992) ‘Into the swamp: the analysis of social issues’, J. Opl Res. Soc. 43 (pp 293-305).
J Rosenhead (1996) ‘What’s the problem? An introduction to problem structuring methods’, Interfaces 26 (pp 117-131).
D A Schon (1973) Beyond the Stable State, Norton, NY.
P M Senge (1990) The Fifth Discipline: the art and practice of the learning organization, Doubleday, New York.
T Shallice (1996) 'The domain of supervisory processes and temporal organisation of behaviour', Phil. Trans. Roy. Soc. Lond., 351 (pp 1405-1412).
R D Stacey (1992) Managing The Unknowable: strategic boundaries between order and chaos in organizations, Jossey-Bass, San Francisco.
R D Stacey (1993) Strategic Management and Organisational Dynamics, Pitman, London.
R D Stacey (1996) Complexity and Creativity in Organizations, Berrett-Koehler, San Francisco.
I Stewart (1989) Does God Play Dice? the mathematics of chaos, Blackwell, Oxford.
C Sunter (1992) The New Century: quest for the high road, Human and Rousseau (with Tafelberg), Cape Town.
G Vickers (1965) The Art of Judgment, Chapman and Hall, London.
M J Wheatley (1992) Leadership and the New Science: learning about organization from an orderly universe, Berrett-Koehler, San Francisco.
R M Young (1985) Darwin’s Metaphor: nature’s place in Victorian culture, Cambridge University Press, Cambridge.
Labels:
Complexity Science,
Complexity Theory,
Innovation
Sunday, May 11, 2008
Web 2.0 Market will surge to $4.6B by 2013
Despite a long-term future marked by commoditization, enterprise spending on Web 2.0 technologies will surge over the next five years, growing 43 percent each year to reach $4.6 billion globally by 2013, according to a new report by Forrester Research, Inc. (Nasdaq: FORR - News).
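As a quick check on the arithmetic (my own back-of-the-envelope calculation, assuming the 43 percent figure is applied as a compound annual growth rate over the five years to 2013), the forecast implies a starting market of well under $1 billion:

target_2013 = 4.6e9      # forecast global spend in dollars
annual_growth = 0.43     # assumed compound annual growth rate
years = 5

implied_base = target_2013 / (1 + annual_growth) ** years
print(f"implied starting market size: ${implied_base / 1e9:.2f}B")   # roughly $0.77B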
The five-year Forrester forecast includes a breakdown of future business spending on technologies such as social networking, RSS, blogs, wikis, mashups, podcasting, and widgets, as well as an analysis of enterprise Web 2.0 spending across North America, Europe, and Asia Pacific.
Forrester believes that Web 2.0 technologies represent a fundamentally new way to connect with customers and prospects and harness the collaborative power of employees. Large enterprises such as General Motors, McDonald’s, Northwestern Mutual Life Insurance, and Wells Fargo have all made heavy use of these tools, and 56 percent of North American and European enterprises consider Web 2.0 to be a priority in 2008 according to a recent Forrester survey.
“Software firms can make money selling enterprise Web 2.0 software, but it will not be an easy road to hundred-million-dollar run rates,” said Forrester Research Analyst G. Oliver Young. “The market for enterprise Web 2.0 tools will be defined by commoditization, eroding prices, and incorporation into enterprise collaboration software over the next five years. It will eventually disappear into the fabric of the enterprise, despite the major effects the technology will have on how businesses market their products and optimize their workforces.”
The key question for software firms is who pays for Web 2.0 in the enterprise? Three challenges face vendors: IT shops are wary of what they perceive as insecure, consumer-grade technology; ad-supported Web 2.0 tools on the consumer side have set “free” as a starting point; and Web 2.0 technologies enter a crowded space dominated by legacy software investments.
Currently, large businesses are spending more on employee collaboration tools than customer-facing Web 2.0 technologies, but Forrester expects that trend to reverse by next year. By 2013, investment in customer-facing Web 2.0 technology will dwarf spending on internal collaboration software by nearly a billion dollars.
“Social Computing and Web 2.0 marketing are still in their infancy; and in general, the market is still in an experimentation phase,” said Young. “In the long run, the effect of Web 2.0 will be enormous. But what may prove to be of more value to vendors will be the skills of running a successful software-as-a-service (SaaS) business. For the vendors that do it well, disaggregating expertise about the medium from Web 2.0 content is likely to provide far more value than wikis and blogs ever did.”
“Global Enterprise Web 2.0 Market Forecast: 2007 To 2013” is currently available to Forrester RoleView™ clients and can also be purchased directly at
Source: Forrester Research
Friday, May 9, 2008
Corporate Executive Board report on Growth Challenges
If you work in a large company and you want to become humble quickly, check out the Stall Points Initiative, a fascinating stream of research by the Corporate Executive Board. The research shows that almost all companies hit a point where historical growth rates decelerate. Once the corporate growth engine stalls, it is very hard to restart.
The study involved close to 500 companies that have appeared on the Fortune 100 or international equivalents over the past 50 years. Close to 90 percent of those companies experienced a stall, or “secular reversals in company growth fortunes.”
Only 50 percent of companies that stalled were able to grow even moderately over the next decade.
The Corporate Executive Board highlights four primary reasons why companies stall:
- Premium-position captivity, when companies get “stuck” in the high end of their industry
- Innovation management breakdown, a “chronic problem in managing the internal business processes for updating existing products and services and creating new ones”
- Premature core abandonment, when a company fails to capture all of the growth opportunities in and around its core business
- Talent bench shortfall, or a lack of leaders who have the capabilities to execute strategy
One thing that’s not directly on the list, but perhaps should be, is “inappropriate hurdles for innovation efforts.” As a company grows, the hurdle rate for new initiatives becomes so high that many potential game-changing initiatives never see the light of day.
The problem plays out in two ways. First, companies set the bar for the ultimate size of new initiatives so high that it becomes very hard to find attractive opportunities.
Only 250 public U.S. companies have $10 billion in revenues. How many high-flying start-up companies from the last decade reached $10 billion in revenue in 10 years? Well, Google hit $10 billion in its eighth year (2006) and … that’s it.
Unfortunately, massive businesses don’t always look like massive businesses in their early days. Innosight did a quick analysis of revenue by year of close to 20 recent disruptors. The list included Google, eBay, Amazon.com, First Solar, Enernoc, Baidu, and several others.
The average first year revenue of the collection of companies was less than $40 million, with many companies having revenues of less than $1 million. It took until year 3 for the average of the sample to get close to $100 million, and year 7 for the average of the sample to exceed $1 billion.
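Translating those averages into a growth rate (my own rough arithmetic, treating the quoted figures as exact): rising from about $40 million in year 1 to about $1 billion by year 7 implies compounding at roughly 70 percent a year.

start_revenue = 40e6      # average first-year revenue quoted above
end_revenue = 1e9         # average revenue exceeded by year 7
years_of_growth = 6       # year 1 to year 7

implied_cagr = (end_revenue / start_revenue) ** (1 / years_of_growth) - 1
print(f"implied average annual growth: {implied_cagr:.0%}")   # roughly 71%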
That is astronomically fast growth, but it would not be fast enough for a company seeking a $100-million pop in the first year. The only reliable way to create top-line growth of that magnitude is through relatively large acquisitions, which tend to be at best value-neutral.
So what’s a behemoth to do? One key to success: individual units responsible for growth should be kept small enough so they can prioritize opportunities that start relatively modestly. For a long time Hewlett-Packard had a practice of splitting up any division that reached a certain size, to minimize bureaucracy and leave the smaller unit free to prioritize relatively small opportunities.
For More Information
A summary of the Stall Points research is here:
Blog Acknowledgement Source: Scott Anthony, Innosight.
Tuesday, May 6, 2008
Video and Web 2.0 Increasing Collaboration Focus
Summary
Majority of Enterprises Plan to Ready Networks to Support Innovative Video and Collaborative Applications in the Next 5 Years
Cisco recently released a study on the use of video and Web 2.0 technology in businesses worldwide. The study revealed that as consumer adoption of video and Web 2.0 has grown, companies are increasingly interested in using video to help grow their businesses, reach new customers, increase collaboration between their employees and look for more environmentally conscious means of communicating. More than half of the 850 corporate information technology (IT) decision makers surveyed say they are using video and Web 2.0 tools today. Another 25 percent said that they are exploring such tools. However, nearly all those surveyed said more needs to be done to ready the network before they can implement video and Web 2.0 technologies to support organization-wide communication and collaboration.
Video and Web 2.0 technologies such as blogs, Wikis, telepresence and web conferencing are helping companies to keep pace with rapid market changes. Nearly 30 percent of companies surveyed reported that the primary business reason for investing in video and Web 2.0 tools is to address the demand for innovative products and services from their customers. The desire to be more environmentally conscious (26 percent) was also reported as a consideration in rolling out video applications.
"The tipping point for mainstream enterprise adoption of video and Web 2.0 technologies will depend on how clearly business cases establish the link with business growth and competitive advantage," said Marie Hattar, senior director of network systems for Cisco. "With the increased globalization and spread of the enterprise workforce, IT's role is expanding from managing network operations to also shaping a business's impact by innovating how employees, customers and partners communicate and collaborate."
In this era of the dynamic, collaborative knowledge worker, marked by the growing importance of global teaming and a flatter, more interactive organization, enterprises need to innovate with their communication tools to respond more agilely to market changes. Nearly half the respondents anticipate using video more widely in the next five years, with more efficient collaboration with remote employees (66 percent) and reduced travel costs (56 percent) as the business drivers.
"At JWT, a multidisciplinary global communications company, the self-expression and collaboration of ideas across our network, clients and partners is paramount. The heart of which runs on a 'glocalized' network," said James Hudson, chief information officer--Worldwide, JWT. "Technology enables this collaboration through self-forming communities supported by Web2.0 initiatives."
Noteworthy is the fact that in the United States, companies that plan to use video conferencing technologies in the next five years are also most likely to be faster-growing companies, as measured by fiscal year growth. On the other hand, the fast-growing companies in Europe and the emerging markets are most likely to use Web 2.0 tools in the next five years.
"Collaboration can be a game changer for organizations," said John Kaltenmark, global managing director of Accenture Technology Consulting. "Not only are collaboration technologies critical to enhancing efficiency and productivity, but they can also play a key role in creating a more environmentally responsible workplace. However, in order for this promise to be truly realized, IT has to work hand in hand with the business to implement a long-term strategy that places these technologies in the broader business context, thus enabling business growth."
Additional key findings from the study include the following:
* Aside from cost, the biggest barrier to deploying video was challenges associated with maintaining a secure network (27 percent).
* Respondents agreed that IT complexity will increase as companies resolve how to deploy video and Web 2.0 technologies on top of existing collaboration applications.
* Only a small percentage of companies report that their network is ready to support video; the leading barriers are insufficient bandwidth and a lack of network infrastructure.
* Decision-makers in the United States are more likely to state that their network is increasing in complexity, becoming more costly to manage while enabling greater mobility.
The new study includes responses from more than 850 corporate IT decision makers from companies with more than 1,000 employees in seven countries: the United States, United Kingdom, France, Germany, India, Russia and Brazil. The research was written and analyzed in the fall of 2007 by an independent international third-party market research firm, Illuminas.