Author Archives: Pat Weaver

Just for the record – Climate science pre-dates the UN and modern China!

Global Temperature

In developing a theory to explain the ice ages, Svante August Arrhenius (1859 – 1927), a Nobel Prize-winning Swedish scientist, developed the formula that is still used to predict the effect of greenhouse gases.

In 1896, he was the first to use basic principles of physical chemistry to estimate the extent to which increases in atmospheric carbon dioxide (CO2) will increase Earth’s surface temperature through the greenhouse effect. These calculations led him to conclude that human-caused CO2 emissions, from fossil-fuel burning and other combustion processes, are large enough to cause global warming. 120 years later some idiots still seem to think the concept is a ‘hoax’.
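For reference, Arrhenius' relationship is usually quoted today in a simplified logarithmic form (the coefficient below comes from later work, not from his 1896 paper):

\[
\Delta F = \alpha \ln\!\left(\frac{C}{C_0}\right), \qquad \alpha \approx 5.35\ \mathrm{W\,m^{-2}}
\]

where \(C_0\) is the reference CO2 concentration and \(\Delta F\) is the additional radiative forcing; the surface temperature response is then approximately \(\Delta T = \lambda\,\Delta F\) for a climate sensitivity parameter \(\lambda\).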

Even earlier, the French scientist Claude Pouillet made the first estimate of the solar constant in 1838. He concluded that the temperature at the Earth's surface was much higher than could be explained by the sun's radiation alone, and suggested the atmosphere must provide some form of insulation. Arrhenius confirmed this hypothesis and identified the primary cause of the warming effect.

The origins of PERT and CPM – What came before the computers!

The development of PERT and CPM as mainframe software systems starting in 1957 is well documented, with contemporary accounts from the key people involved readily available.  What is less clear is how two systems developed contemporaneously but in isolation (as well as a number of less well documented similar systems developed in the same timeframe in the UK and Europe) came to have so many similar features.  These early tools used the 'activity-on-arrow' (AoA or ADM) notation, which is a far from obvious model.  Later iterations of the CPM concept used the 'precedence' notation, which evolved from the way flow charts were, and still are, drawn.


One obvious connection between the early developments was the community of interest around Operation (or Operational) Research (OR), a concept developed by the British at the beginning of WW2.  By the mid-1950s OR had developed to include linear programming, which is the mathematical underpinning of CPM; but while this link explains some of the cross-pollination of ideas and the mathematics, it does not explain terms such as 'float' or the AoA notation (for more on the development of CPM as a computer-based tool see http://www.mosaicprojects.com.au/PDF_Papers/P042_History%20of%20Scheduing.pdf).

A recent email from Chris Fostel, an Engineering Planning Analyst with Northrop Grumman Corporation (CFostel@rcn.com), appears to offer a rational explanation.  I've reproduced Chris' email pretty much verbatim below; the challenge posed to you is to see if the oral history laid out below can be corroborated or validated.  I look forward to the responses.

Chris’ Oral History

I was told this story in 1978 by a retired quartermaster who founded his own company after the war to utilize his global contacts and planning skills.  Unfortunately, the individual who told me this story passed away quite a few years ago, and I'm not sure any of his compatriots are still alive either.  Regardless, I thought I should pass this along before I join them in the next life.  I do not wish to minimize the work of Kelley and Walker; they introduced critical path scheduling to the world and formalized the algorithms.  But they did not develop or invent the technique.

The origin of critical path scheduling was the planning of the US Pacific island-hopping campaign during World War II.  The Quartermaster Corps coordinated orders to dozens if not hundreds of warships, troop ships and supply ships for each assault on a new island.  If any ships arrived early it would alert the Japanese to an imminent attack.  Surprise was critical to the success of the island-hopping campaign.  The US did not have enough warships to fight off the much larger Japanese fleet until late in the war.  Alerting the Japanese high command would allow the Japanese fleet to intercept and destroy the slow-moving US troop ships before they had a chance to launch an attack.

Initially the quartermasters drew up their plans on maps of the Pacific islands, including the current location and travel times of each ship involved.  The travel times were drawn as arrows on the map.  Significant events, personnel or supplies that traveled by air were shown as dashed lines hopping over the ships' arrows.  The quartermasters would then calculate the shortest and longest travel times to the destination for all ships involved in the assault.  The plans became very complicated.  Many ships made intermediate stops at various islands to refuel or transfer cargo and personnel.  The goal was to have all ships arrive at the same time.  It didn't take the quartermasters long to realize that a photograph of the planning maps would be a devastating intelligence lapse.  They started drawing the islands as identical bubbles with identification codes, and no particular geographical order, on the bubble-and-arrow charts.  These were the first activity-on-arrow critical path charts, circa 1942.

The only validation I can offer you is that by now you should realize that activity-on-arrow diagrams were intuitive, as was the term 'float'.  Float was the amount of time a particular ship could float at anchor before getting underway for the rendezvous.  Later, when the US quartermasters introduced the technique to the British for planning the D-Day invasion, the British changed 'float' to 'slack', broadening the term to include air force and army units which did not float, but could 'slack off' for the designated period of time.

You will not find a written, dated account of this story by a Quartermaster Corps veteran.  Critical path scheduling was a military secret until declassification in 1956.  In typical fashion, the veterans of WWII did not write about their experiences during the war; no one broke the military secrecy.  After 1956 they were free to pass the method on to corporate planners such as Kelley and Walker.  A living WWII quartermaster veteran should be able to provide more than my intuitive confirmation.

This narrative makes sense to me from both a historical perspective (military planning has involved drawing arrows on maps for at least 200 years) and a timing perspective.  Can we find any additional evidence to back this up?  Over to you!

New Articles posted to the Web #56

We have been busy beavers updating the PM Knowledge Index on our website with White Papers and Articles.  Some of the more interesting items uploaded during the last couple of weeks include:

And we continue to tweet a free PMI-style exam question every day for PMP, CAPM, and PMI-SP candidates: see today's question and then click through for the answer and the Q&As from last week.

You are welcome to download and use the information under our Creative Commons licence.

USA moving to formalise project and program management capabilities

The concept of professional project management is gathering pace. The US Government's Program Management Improvement and Accountability Act of 2015 (PMIAA) was passed unanimously by the US Senate in November 2015, and was passed by the House of Representatives in September 2016 on a 404-11 vote.  Because the House made some minor changes, the Bill was returned to the Senate before being signed into law by the President on the 14th December 2016 (see comment below).


The Act requires the Deputy Director for Management of the Office of Management and Budget (OMB) to:

  • adopt and oversee implementation of government-wide standards, policies, and guidelines for program and project management for executive agencies;
  • chair the Program Management Policy Council (established by this Act);
  • establish standards and policies for executive agencies consistent with widely accepted standards for program and project management planning and delivery;
  • engage with the private sector to identify best practices in program and project management that would improve Federal program and project management;
  • conduct portfolio reviews to address programs identified as high risk by the Government Accountability Office (GAO);
  • conduct portfolio reviews of agency programs at least annually to assess the quality and effectiveness of program management; and
  • establish a five-year strategic plan for program and project management.

The Act also requires the head of each federal agency that is required to have a Chief Financial Officer (other than Defence, which has its own rules) to designate a Program Management Improvement Officer to implement agency program management policies and develop a strategy for enhancing the role of program managers within the agency.

The Office of Personnel Management must issue regulations that:

  1. identify key skills and competencies needed for an agency program and project manager,
  2. establish a new job series or update and improve an existing job series for program and project management within an agency, and
  3. establish a new career path for program and project managers.

And finally, the GAO must issue a report within three years of enactment, in conjunction with its high-risk list, examining the effectiveness of the following (as required or established under this Act) on improving Federal program and project management:

  • the standards, policies, and guidelines for program and project management;
  • the strategic plan;
  • Program Management Improvement Officers; and
  • the Program Management Policy Council.

When enacted, the Act will enhance accountability and best practices in project and program management throughout the federal government by:

  1. Creating a formal job series and career path for program/project managers in the federal government, to include training and mentoring – PMP, PMI-SP and similar certifications will become increasingly important!
  2. Developing and implementing, with input from private industry, a standards-based program/project management policy across the federal government.
  3. Recognizing the essential role of executive sponsorship and engagement by designating a senior executive in federal agencies to be responsible for program/project management policy and strategy.
  4. Sharing knowledge of successful approaches to program/project management through an inter-agency council on program and project management.
  5. Implementing program/project portfolio reviews.
  6. Establishing a 5-year strategic plan for program/project management.

You can read the text of the Act here, and stay up-to-date on the Act's progress here.  The USA's approach is aligned with regulatory actions in both the UK and the EU requiring government agencies to improve project and program delivery. If this trend continues, hopefully the 'accidental' project manager / sponsor will be consigned to history and the use of qualified professionals will become the norm.

Follow these links for more on achieving your PMP credential or PMI-SP credential.

New Articles posted to the Web #55

We have been busy beavers updating the PM Knowledge Index on our website with White Papers and Articles.  Some of the more interesting items uploaded during the last couple of weeks include:

And we continue to tweet a free PMI-style exam question every day for PMP, CAPM, and PMI-SP candidates: see today's question and then click through for the answer and the Q&As from last week.

You are welcome to download and use the information under our Creative Commons licence.

The future of project controls

Last week I participated in two PUXX panel discussions, in Perth and Sydney, focused on predicting the influence of technology on project controls.  The subjects covered ranged from drones and remote monitoring to virtual reality.

Many of the topics discussed offered better ways to do things we already do, provided we can make effective use of the data generated in ever-increasing quantities – significant improvements, but essentially 'business-as-usual' done better. The aspect I want to focus on in this post is the potential to completely reframe the way project schedules are developed and controlled when existing 'gaming technology' and BIM are synthesised.

The current paradigm used for critical path scheduling is a (dumbed-down) solution to a complex set of problems, simplified to allow the software to run on primitive mainframe computers in the late 1950s – and the fundamentals have not changed since! See: A Brief History of Scheduling.

The underlying assumption is that a project consists of a set of activities, each with a defined duration; depending on the logical relationships between the activities, some are 'critical' while others have 'float'.  The basic flaw in this approach can be demonstrated by looking at the various options open to a scheduler to define the work involved in three simple foundations involving excavation and mass concrete fill.
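The whole of that classic CPM calculation fits in a few lines; here is a minimal Python sketch of the forward and backward passes every CPM tool still performs (the tiny network of three foundations and two trades, and the durations, are my own illustrative assumptions):

```python
# A minimal CPM forward/backward pass. Activity names, durations and logic
# are illustrative assumptions only (three foundations, two trades).

activities = {
    # name: (duration_days, [predecessors]); listed in topological order
    'Exc F1':  (2, []),
    'Exc F2':  (2, ['Exc F1']),
    'Exc F3':  (2, ['Exc F2']),
    'Conc F1': (1, ['Exc F1']),
    'Conc F2': (1, ['Exc F2', 'Conc F1']),
    'Conc F3': (1, ['Exc F3', 'Conc F2']),
}

# Forward pass: earliest start/finish for each activity.
early = {}
for name, (dur, preds) in activities.items():
    es = max((early[p][1] for p in preds), default=0)
    early[name] = (es, es + dur)

project_finish = max(ef for _, ef in early.values())

# Backward pass: latest start/finish, processed in reverse topological order.
late = {}
for name in reversed(list(activities)):
    dur, _ = activities[name]
    succs = [s for s, (_, preds) in activities.items() if name in preds]
    lf = min((late[s][0] for s in succs), default=project_finish)
    late[name] = (lf - dur, lf)

# Total float = latest start - earliest start; zero float marks the critical path.
for name in activities:
    total_float = late[name][0] - early[name][0]
    status = 'critical' if total_float == 0 else f'float = {total_float}'
    print(f'{name}: ES {early[name][0]}, EF {early[name][1]}, {status}')
```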

[Figure: schedule-options – four alternative CPM representations of the three-foundation work]

All four of the options above are viable alternatives that may be chosen by different schedulers to describe the work using CPM, and none of them really describes what actually happens. The addition of more links would help, but even then they would not capture the real situation: one resource crew visits three locations in turn and excavates the foundations, and a second crew follows and places the concrete, with options for overlapping, parallel working, and possibly synchronising the actual pouring of all three foundations on the same day. Optimising the work of the crews is the key to a cost-effective outcome, and this depends on what follows their work.  For more on resource optimisation see: www.mosaicprojects.com.au/Resources_Papers_152.html. Advances in computer software offer the opportunity to develop a new way of working.
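By way of contrast, a minimal sketch of the resource-flow view argued for here: the timing emerges from crew availability and location readiness rather than from hand-drawn logic links (the crews, durations and rules are illustrative assumptions):

```python
# The same job seen as crews flowing through locations: each crew starts work
# at a location as soon as the crew is free AND the location is ready for it.
# Crews, durations and sequence are illustrative assumptions.

locations = ['F1', 'F2', 'F3']
work = {'excavate': 2, 'concrete': 1}           # days per foundation, per crew

crew_free = {'excavate': 0, 'concrete': 0}      # when each crew next becomes available
location_ready = {loc: 0 for loc in locations}  # when each location is workable

for loc in locations:
    for trade in ('excavate', 'concrete'):      # the concrete crew follows the excavators
        start = max(crew_free[trade], location_ready[loc])
        finish = start + work[trade]
        crew_free[trade] = finish               # the crew flows on to its next location
        location_ready[loc] = finish            # the next trade cannot start here earlier
        print(f'{trade} {loc}: day {start} to day {finish}')
```

The dates produced match a correctly linked CPM model, but the schedule is now a by-product of the crews' flow: change a crew size or sequence and the dates re-derive themselves, with no logic links to edit.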

The starting point for the hypothesis outlined in this post is 4D BIM (Building Information Modelling). Last month I was in London working on the final edits to the second edition of the CIOB's book, Guide to Good Practice in the Management of Time in Complex Projects (due for publication in 2017 as The Management of Time in Major Projects). One of the enhancements in the second edition is an increased focus on BIM. To assist our work, a demonstration of cutting-edge 4D BIM was provided by Freeform.

Their current capabilities include:

  • The ability to model, in real time, clashes in working space, provided the space needed for each crew's work is parameterised. Change the timing of one work crew and the effect on others in a space is highlighted (a simple sketch of this type of check follows this list).
  • The ability to view the work from any position at any time in the construction process; allowing things such as a tower crane driver’s actual line of sight to be literally ‘seen’ at different stages of the construction.
  • The relatively routine ability to import schedule timings from a range of standard tools to animate the building of the model, and the ability to feed back information derived from processes such as the identification of clashes in the use of working space.
  • The space occupied by temporary works and various pieces of equipment can be defined and clashes with permanent works identified over time.
  • Finally, the ability for a person to see and move around within the virtual model using the same type of 3D virtual reality goggles used by many gaming programmes. The wearer is literally immersed in the model.
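As promised above, a rough illustration of a space-time clash check (the Task structure, box encoding and sample data are my own assumptions, not Freeform's implementation):

```python
# 4D workspace clash detection: each crew's working space is an axis-aligned
# box occupied for a time window; two tasks clash if their boxes AND their
# time windows overlap. Data structure and values are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Task:
    name: str
    box: tuple      # (xmin, ymin, zmin, xmax, ymax, zmax) in metres
    start: float    # working days
    finish: float

def boxes_overlap(a, b):
    # Overlap on every axis: each box's min must be below the other's max.
    return all(a.box[i] < b.box[i + 3] and b.box[i] < a.box[i + 3] for i in range(3))

def times_overlap(a, b):
    return a.start < b.finish and b.start < a.finish

def clashes(tasks):
    return [(a.name, b.name)
            for i, a in enumerate(tasks)
            for b in tasks[i + 1:]
            if boxes_overlap(a, b) and times_overlap(a, b)]

tasks = [
    Task('Scaffold erection', (0, 0, 0, 10, 5, 8), start=0, finish=4),
    Task('Services rough-in', (6, 2, 0, 12, 8, 4), start=3, finish=6),
]
print(clashes(tasks))  # -> [('Scaffold erection', 'Services rough-in')]
```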

For all of this in action on a major rail project see: https://www.newcivilengineer.com/future-tech/pushing-the-limits-of-bim/10012298.article

Moving into the world of game playing, there are many different games that allow players, in competition or collaboration, to 'build' cities, empires, fortifications, farms, etc. These games know the resources available to the players and how many resources will be required to construct each new element in the game – if you don't have the resources, you can't build the new asset.

Combining these two concepts opens up the possibility of a completely new approach to scheduling physical projects that involve the deployment of resources to physical locations to undertake work. The concept of location-based scheduling is not new; it was used in the 1930s to construct the Empire State Building (see: Line of Balance) and is still widely used.  For more on location-based scheduling see: Location-Based Management for Construction: Planning, Scheduling, and Control by Prof. Russell Kenley.

How these concepts tie into BIM starts with the model itself.  A BIM model consists of a series of parameterised objects. Each object can contain data on its size, weight, durability, cost, maintainability, carbon footprint, etc. As BIM develops, many of these objects will come from standard libraries created by suppliers and subcontractors. Change an object (for example, replace windows from manufacturer 'A' with similar windows from manufacturer 'B') and the model is updated, and potential issues with sizes, fixings and waterproofing can be identified. It is only a small step from this point to add parameters related to the resources needed to undertake the work of installation.
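A speculative sketch of what such a resource-parameterised object could look like (the field names and figures are illustrative assumptions, not any real BIM or IFC schema):

```python
# A resource-parameterised BIM object. The fields and values are illustrative
# assumptions, not a real BIM/IFC schema.

from dataclasses import dataclass

@dataclass
class BimObject:
    name: str
    size_mm: tuple        # (width, height, depth)
    weight_kg: float
    cost: float
    # proposed extension: what it takes to install this object
    install_crew: str = 'unassigned'
    install_crew_hours: float = 0.0
    workspace_m: float = 0.0   # clear working space needed around the object

window_a = BimObject('Window W1 (manufacturer A)', (1200, 1500, 90), 45.0, 820.0,
                     install_crew='glaziers', install_crew_hours=3.0, workspace_m=1.5)

# Swapping to manufacturer B only changes the parameters; downstream checks
# (sizes, fixings, crew workload) simply re-run against the new data.
window_b = BimObject('Window W1 (manufacturer B)', (1210, 1500, 100), 52.0, 790.0,
                     install_crew='glaziers', install_crew_hours=3.5, workspace_m=1.5)
```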

With this information and relatively minor enhancements to current BIM capabilities, once the engineering model is reasonably complete a whole new paradigm for planning work opens up.


To plan the work, the 'planning team' put on their virtual reality headsets and literally 'walk' onto the site.  As they locate temporary works and begin the building process, the model tracks the use of resources and physical space in real time. The plan is developed based on the embedded parameters in the fully integrated 3D model.

Current 4D imports a schedule and 'shows you' the effect.  Using the proposed gaming approach and parameterised objects you can literally build the project in the virtual space and either see the consequences on resource loading or be limited by resource availability.  A whole bunch of games do this already; add in existing clash-detection capabilities (but applied to workers using the space) and you change the whole focus of planning a project. Decisions can be made to adjust the size of resource crews, and the flow of work can be optimised to balance the competing objectives of cost efficiency, time efficiency and resource optimisation.

The proposed model is a paradigm shift away from CPM and its arbitrary determination of activities and durations to a process focused on the smooth flow of resources through work areas. The computational base will be focused on resource effectiveness and resource utilisation. Change ‘critical path’ to ‘critical resources’, eliminate the illusion of ‘float’ but look for underutilised resources and resource waiting time. To optimise the work, different scenarios can be stored, replayed and edited – the ultimate ‘what-if’ experience.

The concept of schedule density ties in with this approach nicely; initial planning is done for the whole project at the ‘low density’ level with activity durations of several weeks or months setting out the overall ‘time budget’ for the project and establishing the strategic flow of work.  As the design improves and more information becomes available, the schedule is enhanced first to ‘medium density’ and then to ‘high density’. The actual work is controlled by the ‘high density’ part of the schedule. For more on ‘schedule density’ see: www.mosaicprojects.com.au/WhitePapers/WP1016_Schedule_Density.pdf.
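In code terms, schedule density is simply progressive elaboration inside a fixed time budget; a toy illustration (the activities and dates are my own assumptions):

```python
# Schedule density as progressive elaboration: a low-density activity owns a
# time budget, and the medium/high-density detail added later must fit inside
# it. Activities and dates are illustrative assumptions.

budget = {'Fit-out level 3': (120, 160)}   # low-density activity: (start day, finish day)

detail = [                                 # higher-density children added later
    ('Partitions', 120, 135),
    ('Services',   130, 150),
    ('Finishes',   148, 162),              # this one breaches the budget
]

start, finish = budget['Fit-out level 3']
for name, s, f in detail:
    if s < start or f > finish:
        print(f'{name}: outside the time budget ({start}-{finish}), replan needed')
```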

Where this concept gets really interesting is in the control of the work.  The medium and high density elements of the schedule are built using the same 'virtual reality' process as the overall schedule, therefore each object in the overall BIM model can include data on the resources allocated to the work, the sequence of work and the time allowed. Given workers on BIM-enabled projects already use various PDAs to access details of their work, the same tablet or smart device can be used to tell the workers their next job and how long they have to complete it. When they complete a task, updating the BIM model with that progress information updates the schedule, tells the crew their next job, and tells the next resources planned to move into the area that the space is available. The schedule and the 3D model are the same entity.
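A toy sketch of that update loop (the class, data and method names are illustrative assumptions; in practice this logic would live inside the BIM platform):

```python
# Completing a task in the model updates the schedule, dispatches the crew's
# next job, and tells the following trade the space is free. Names and data
# are illustrative assumptions.

from collections import deque

class ModelSchedule:
    """Toy stand-in for a BIM model that is also the schedule."""

    def __init__(self, jobs_by_crew, waiting_on_space):
        self.jobs = {crew: deque(jobs) for crew, jobs in jobs_by_crew.items()}
        self.waiting = waiting_on_space    # space -> crew queued behind it

    def complete(self, crew, space):
        done = self.jobs[crew].popleft()
        print(f'{crew} finished {done!r}; model and schedule updated')
        if self.jobs[crew]:
            print(f'{crew}: next job is {self.jobs[crew][0]!r}')
        follower = self.waiting.get(space)
        if follower:
            print(f'{follower}: space {space!r} is now available')

model = ModelSchedule(
    {'excavation crew': ['dig F1', 'dig F2'], 'concrete crew': ['pour F1']},
    waiting_on_space={'F1': 'concrete crew'},
)
model.complete('excavation crew', 'F1')
```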

Similarly, off-site manufacturing and design lead-times can be integrated into the dataset.  Each manufactured item can have its design, manufacture, transport and approval times associated with the element, making the development of an off-site works / procurement schedule a simple report to extract once the schedule is set.  Identifying delays in the supply chain and dealing with changes in the timing of installation becomes straightforward.

When the inevitable problems occur, the project management team will have the ideal tool to work through solutions and determine the optimum way forward; as soon as the new schedule is agreed, the BIM model already holds the information.

One of the key concepts in 'schedule density' is that any work planned for the short-term future has to be based on the actual performance of the crews doing the work. In a BIM-enabled scheduling system this can also be automated. The work content of each activity is held in the model, as is the crew assigned to the work. As soon as the work crew's productivity can be measured, the benchmark values used in the original planning can be updated with real data. Where changes in performance are needed to deal with slippages and productivity issues, these can be properly planned and incorporated into the schedule based on when the implemented changes can be expected to take effect.
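A small worked example of that re-benchmarking (all quantities are assumed):

```python
# Re-forecasting from measured productivity instead of the original benchmark.
# All quantities are illustrative assumptions.

planned_rate = 12.0    # m3 of excavation per crew-day (original benchmark)
work_done = 30.0       # m3 completed so far
days_spent = 3.0
remaining_work = 42.0  # m3 still to excavate

measured_rate = work_done / days_spent           # 10.0 m3/day actually achieved
forecast_days = remaining_work / measured_rate   # 4.2 days at the real rate
planned_days = remaining_work / planned_rate     # 3.5 days originally assumed
print(f'slippage on remaining work: {forecast_days - planned_days:.1f} days')
```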

I’m not sure if this is BIM2 or BIM++ but these ideas are not very far in advance of current capabilities – all we need now is a software developer to take on the ideas and make them work.

These concepts will be harder to apply to 'soft projects', but the planning paradigms in soft projects have already been shaken up by Agile. Combining 3D modelling with a capability for real 4D interaction certainly seems to make sense for projects where the primary time management issue is the flow of resources in the correct sequence through a defined series of work locations in three dimensions.  What do you think?

PUXX – Perth and Sydney

Panel Discussion:  The Future of Project Controls

I’m honoured to have been invited to join the PUXX speaker panel for a Q&A session focused on the Future of Project Controls at their final events for 2016 in Perth and Sydney.

The other panellists are:

Gordon Comins, the founder and CEO of the original Primavera software distributor – Primavera Australia.

Jenny Purdie, Enterprise Services Executive and a Non-Executive Director of Nexion Corp.

Ray Paulk, Head of Project Services for BHP Billiton’s global Project Services function, including Cost Estimating, Planning and Scheduling, Cost Engineering, Project Controls, Project Information Management, and Project Risk & Assurance.

Hosted by Prescience Technology:

For more information on the events (registration is essential – places are filling fast with only a few remaining), Prescience Technology, and the Prescience User Experience Exchange (PUXX) go to http://prescience.com.au/puxx

  • 22 November 2016 – 5:00pm for 5:30pm start at the Parmelia Hilton,  14 Mill St, Perth CBD
  • 23 November 2016 – 5:30pm for 6:00pm start at the Greenwood Hotel, Greenwood Plaza Rooftop, 36 Blue Street, North Sydney