Tag Archives: Complexity Theory

Defining Complex Projects

There has been a lot written about ‘complex project management’ over the last few years, much of which has confused projects with programs, complexity with size, and complexity with complicated technology. For an overview of complexity theory see: A Simple View of ‘Complexity’ in Project Management.

A sentence in the paper ‘Translation and Convergence in Projects: An Organisational Perspective on Project Success’ (Project Management Journal, Sept. 2011) triggered this post and sums up project complexity nicely: “The key difficulty with complex projects is that those managing them will often be ‘feeling their way’ towards a solution rather than following a reliable blueprint or project plan”.

Our view has consistently been that complexity, as understood through complexity theory, is a dimension of every project and program. This means every project has a degree of complexity in the same way that it has a defined size, a degree of technical difficulty and a degree of uncertainty, and all four dimensions interact and affect each other. These four dimensions are discussed in the Mosaic White Paper at: http://www.mosaicprojects.com.au/WhitePapers/WP1072_Project_Size.pdf.

What the thought from the paper above highlights is the very close linkage between complexity, which we see as being primarily a function of the project’s stakeholder community, and the degree of uncertainty associated with the project outcome. Our blog post, Projects aren’t projects – Typology, outlines one way of measuring uncertainty based on a model by Eddie Obeng.

I’m not sure how to measure this empirically yet, but I do have a feeling there is a need to define a measurement system that incorporates the type of uncertainty within the overall matrix of stakeholder engagement and supportiveness already embedded in the Stakeholder Circle® methodology – any thoughts will be appreciated.
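To make the idea concrete, here is a minimal sketch of what such a combined measure might look like. The uncertainty categories follow Obeng’s four project types, but the scores, the 1–5 supportiveness scale and the equal weighting are entirely illustrative assumptions on my part, not part of the Stakeholder Circle® methodology.

```python
# Hypothetical sketch only: the categories, scales and weights below are
# illustrative assumptions, not an agreed measurement system.

# Obeng-style uncertainty types, scored by how well 'what' and 'how' are known
UNCERTAINTY_SCORES = {
    "painting_by_numbers": 1,  # what and how both known
    "going_on_a_quest": 2,     # what known, how unknown
    "making_a_movie": 3,       # how known, what unknown
    "lost_in_the_fog": 4,      # neither known
}

def complexity_index(uncertainty_type, stakeholder_support):
    """Combine uncertainty (1-4) with average stakeholder supportiveness
    (1 = fully supportive ... 5 = actively opposed) into a 0-1 index."""
    u = UNCERTAINTY_SCORES[uncertainty_type]
    # Normalise each dimension to 0-1, then take a simple (equal-weight) average
    u_norm = (u - 1) / 3
    s_norm = (stakeholder_support - 1) / 4
    return round((u_norm + s_norm) / 2, 2)

print(complexity_index("painting_by_numbers", 1))  # low complexity -> 0.0
print(complexity_index("lost_in_the_fog", 5))      # high complexity -> 1.0
```

Even a toy model like this makes the linkage explicit: a project with an emergent output and an unsupportive stakeholder community scores at the top of the range, however routine its technology.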

See our earlier posts on Complexity.

If it can go wrong……

One derivative of Murphy’s Law is: If it can go wrong it will go wrong, usually at the most inconvenient moment!

Planning the assault

This post may be old news to many Europeans, but in November 2009 the 27-kilometer (16.8 mile) Large Hadron Collider (LHC), buried under fields on the French/Swiss border, suffered serious overheating in several sections after a small piece of baguette landed in a piece of equipment above the accelerator ring. Dr Mike Lamont, the LHC’s Machine Coordinator, said that “a bit of baguette”, believed to have been dropped by a passing bird (other sources suggest a malicious pigeon), caused the superconducting magnets to heat up from 1.9 Kelvin (-271.25C) to around 8 Kelvin (-265C), close to the level where they stop superconducting.

In theory, had the LHC been fully operational, this could have caused a catastrophic breakdown similar to the one that occurred shortly after it was first switched on. Fortunately, the machine has several fail-safes that would have shut it down before the temperature rose too high.

Part of the LHC

The total cost to build and commission the accelerator is of the order of 4.6 billion Swiss francs (approx. $4400M, €3100M, or £2800M as of Jan 2010), within an overall budget of 9 billion US dollars (approx. €6300M or £5600M as of Jan 2010), making the LHC the most expensive scientific experiment in human history. Politicians are probably asking how a bungling bird could target a critical part of the machine with a small piece of bread and shut the whole system down.

A more realistic question for project practitioners is how design engineers and risk managers could be expected to foresee this type of problem. Failure Mode and Effects Analysis (FMEA) may help, but I can just see the reaction to someone in a risk workshop hypothesising that a bird would fly over the machine and drop its dinner precisely on target to cause the maximum damage. ‘Theoretically possible, but hardly plausible’ would be a polite reaction… until after it happened.

One of the messages from books like ‘The Black Swan’ and from complexity theory is that the future is inherently unpredictable. This is probably as good an example of a ‘Black Swan’ as any I’ve heard of.

For more on the LHC see: http://en.wikipedia.org/wiki/Large_Hadron_Collider

PMI COS Seminar

This week, I will be presenting live from Australia the final session of the Fall PMI College of Scheduling (COS) Wednesday Webinar Series: Scheduling in the Age of Complexity. This hour-long event will provide key insights for better scheduling from a personal level: What is the role of the scheduler, and what is our future?

The PMI-COS Fall series is designed to bring highlights from the 6th Annual Scheduling Conference held in Boston, MA earlier this year. Archived presentations are available at http://www.pmicos.org/ondemandlearning.asp. If you find them of interest, why not sign up for the College?

The Featured Presentation:   Scheduling in the Age of Complexity

Scheduling was developed as a computer-based modelling process at a time when ‘command and control’ was the dominant management paradigm. The mathematical precision of the early scheduling calculations was somehow translated into certainty about project outcomes. Today, the certainties are no longer so apparent: most projects run late, and uncertainty and complexity are starting to take center stage.

This paper identifies the key elements in Complexity Theory to suggest the real role of a schedule in ‘the age of complexity’. It concludes by recommending a way to re-establish the role of the scheduler in the successful delivery of projects in the 21st Century.

DATE:  Wednesday, December 2, 2009
TIME:   5:00pm EST (US Eastern Standard Time); Doors open at 4:45pm

LOCATION: http://pmi.acrobat.com/r31077016/

There is no dial-in telephone option for the presentation. All voice will be through the classroom platform.

Mathematical Modelling of Project Estimates

I have just finished reading a very interesting paper by Dr. Pavel Barseghyan, ‘Problem of the Mathematical Theory of Human Work’; the paper is available from the PM World Today web site.

Dr. Barseghyan’s key message is the unreliability of historical data for predicting future project outcomes using simple regression analysis. This is similar to the core argument I raised in my paper Scheduling in the Age of Complexity, presented to the PMI College of Scheduling conference in Boston earlier this year. Historical data is all we have, but it cannot be relied on due to the complexity of the relationships between the various project ‘actors’. As a practitioner, I was looking at the problem from an ‘observed’ perspective; it’s fascinating to see rigorous statistical analysis reaching similar conclusions.

A counterpoint to Dr. Barseghyan’s second argument, that improved analysis will yield more correct results, is the work of N.N. Taleb, particularly in his book ‘The Black Swan’. Taleb’s arguments go a long way towards explaining much of the Global Financial Crisis – models based on historical data cannot predict unknown futures. For more on this argument see: http://www.edge.org/3rd_culture/taleb08/taleb08_index.html

Personally, I feel both of these lines of reasoning need to be joined in the practice of modern project management. We need the best possible predictors of likely future outcomes based on effective modelling (as argued by Dr. Barseghyan). But we also need to be aware that even the best predictions cannot control the future, and adopt prudent, effective and simple risk management processes that recognise each project is a unique journey into the unknown.
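One simple illustration of the gap between deterministic prediction and effective modelling is comparing a single-point schedule estimate with a probabilistic range. The sketch below is illustrative only: the tasks and durations are invented, and the triangular distribution is just the simplest way to express three-point estimates, not a claim about any particular method.

```python
# Illustrative sketch: a deterministic point estimate versus a simple
# Monte Carlo range. Task names and durations are invented examples.
import random

random.seed(42)

# Three sequential tasks: (optimistic, most likely, pessimistic) days
tasks = [(4, 5, 9), (8, 10, 16), (3, 4, 7)]

# Deterministic point estimate: just add the most likely durations
point_estimate = sum(ml for _, ml, _ in tasks)

# Monte Carlo: sample each task duration from a triangular distribution
trials = 10_000
totals = sorted(
    sum(random.triangular(lo, hi, ml) for lo, ml, hi in tasks)
    for _ in range(trials)
)

p50 = totals[trials // 2]          # median outcome
p80 = totals[int(trials * 0.8)]    # 80% confidence level

print(f"Point estimate: {point_estimate} days")
print(f"P50: {p50:.1f} days, P80: {p80:.1f} days")
```

Because the pessimistic tails are longer than the optimistic ones, the median of the simulated totals sits above the sum of the ‘most likely’ durations: the point estimate is optimistic before work even starts, which is one reason contingency needs to be explicit rather than hidden in the estimate.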

I would certainly recommend reading Dr. Barseghyan’s paper.


As readers of this blog and our published papers will know, I have a passing interest in complexity theory and its application to project management. This seems to be an expanding area of interest worldwide.

Last night I was at the PMI Canberra Chapter presenting a summary of my paper ‘Scheduling in the Age of Complexity’ [download the paper] – another good reception for the ideas but, more importantly, several people in the audience were involved in parallel lines of enquiry. Possibly of most interest are the ideas of Graham Durant-Law; see his blog at http://www.durantlaw.info.

Another interesting development is a new publication from PMI, Exploring the Complexity of Projects, written by Svetlana Cicmil, Terry Cooke-Davies, Lynn Crawford and Kurt Richardson [see: http://www.pmi.org/Marketplace/Pages/Default.aspx]. A quick skim suggests this is a comprehensive round-up of the current state of complexity theory in project management. More on this once I have had a chance to read it.

What is gratifying is seeing the confusion created by the so-called ‘College of Complex Project Managers’ and Prof. David Dombkins receding rapidly into obscurity. Rather than treating large complicated programs of work as a synonym for complexity theory (as Dombkins did in the original College manifesto), thought leaders worldwide seem to be converging on a more rigorous view.

The work on understanding complexity in project management has a long way to go and will undoubtedly be the subject of future blogs. Your contribution to the discussion will be welcome.

Complex Projects

Further to two earlier posts on the subject, Projects aren’t Projects, I have run across an interesting journal paper from Jon Whitty focused on complexity.

A number of organisations are promoting the concept of ‘complex project management’; many other commentators, including me, feel complexity is a factor in every project. The elements of complexity theory include non-linearity, emergence and unpredictability. In the project management space, this translates to the interactions of the people involved in and around the project with each other and with the project work (for more on this see A Simple View of Complexity in Project Management).

Jon’s paper reinforces the argument that projects have multiple dimensions and, additionally, that projects and programs are quite different. Whilst there is likely to be a correlation between size and complexity, the two dimensions are not directly related! To read more see:

Projects aren’t Projects

Project management is not a one-size-fits-all process or discipline. The PMBOK® Guide makes this clear in Chapter 1. There are at least four dimensions to a project:

  • its inherent size usually measured in terms of value;
  • the degree of technical difficulty (complication) involved in the work;
  • the degree of uncertainty involved in defining its objectives; and
  • the complexity of the relationships surrounding the project.
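One way to picture these four dimensions together is as a simple project profile, scored independently on each axis. The sketch below is purely illustrative: the 1–5 scales and field names are my assumptions, not a published assessment instrument.

```python
# A hypothetical way to record the four dimensions as a project profile;
# the 1-5 scales and field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ProjectProfile:
    size: int         # 1 (small) .. 5 (mega), usually measured by value
    difficulty: int   # 1 (routine work) .. 5 (leading-edge technology)
    uncertainty: int  # 1 (clear objectives) .. 5 (objectives still emerging)
    complexity: int   # 1 (simple stakeholder community) .. 5 (highly complex)

    def hardest_dimension(self) -> str:
        """Name the dimension that poses the biggest management challenge."""
        dims = {"size": self.size, "difficulty": self.difficulty,
                "uncertainty": self.uncertainty, "complexity": self.complexity}
        return max(dims, key=dims.get)

# A large but simple mining project: big, yet technically and socially routine
mine = ProjectProfile(size=5, difficulty=1, uncertainty=1, complexity=2)
print(mine.hardest_dimension())  # -> size
```

The point of scoring the dimensions separately is exactly the argument that follows: a project can be at the top of one scale and the bottom of the others, so a single ‘big/small’ or ‘complex/simple’ label hides more than it reveals.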

Project Size
The size of the project will impact the degree of difficulty in achieving its objectives, but large projects are not necessarily technically complicated or complex. There are projects in Australia to shift millions of cubic meters of overburden from mine sites, with expenditures rising to several million dollars per day, but the work is inherently simple (excavating, trucking and dumping dirt), and the relationships in and around the project are relatively straightforward. The management challenges are essentially in the area of logistics.

Technical Difficulty (degree of complication)
Complicated high-tech projects are inherently more difficult to manage than simple ones. The nature of the technical difficulties and the degree of certainty largely depend on how well understood the work is. The important thing to remember with complicated work, though, is that systems can be developed and people trained to manage the complications. The work may require highly skilled people and sophisticated processes, but it is understandable and solvable.

Degree of Uncertainty
The degree of uncertainty associated with the desired output from the team’s endeavours has a major impact on the management of the project. The less certain the client is of its requirements, the greater the uncertainty associated with delivering a successful project, and the greater the effort required from the project team to work with the client to evolve a clear understanding of what’s required for success.

This is not an issue as long as all of the project stakeholders appreciate they are on a journey to first determine what success looks like, and then deliver the required outputs. Budgets and timeframes are expected to change to achieve the optimum benefits for the client, and the project is set up with an appropriately high level of contingencies to deal with the uncertainty. Problems occur if the expectations around the project are couched in terms of achieving an ‘on time, on budget’ delivery when the output is not defined and the expected benefits are unclear. Managing uncertainty is closely associated with, and influences, the complexity of the relationships discussed below.

Complexity = The People
Complexity Theory has become a broad platform for the investigation of complex interdisciplinary situations and helps understand the social behaviours of teams and the networks of people involved in and around a project. These ideas apply equally to small in-house projects as to large complicated programs. In this regard, complexity is not a synonym for complicated or large. It focuses on the inherent unpredictability of people’s actions and reactions to ideas and information within the network of relationships that form in and around the project team.

Size is straightforward and most organisations have processes for assigning more experienced project managers to larger projects. What’s missing is consideration of the other three aspects.

The last item, complexity, is very much an emerging area of thought and discussion. For a brief overview see: A Simple View of ‘Complexity’ in Project Management, and for some practical considerations of the impact of complexity theory on scheduling see: Scheduling in the Age of Complexity. However, I expect it will be some years before ‘complexity theory’ and project management sit comfortably together.

Of more immediate interest is the interaction of uncertainty and technical difficulty. Knowing both ‘what to do’ and ‘how to do it’, or, more importantly, knowing how much you know about these two elements, is critically important in establishing a framework for managing a project. Some ideas on this topic will be the subject of my next post.