Tag Archives: Resource planning

CIOB launches Project Time Management Certificate

The Chartered Institute of Building (CIOB) has launched its Project Time Management Qualification (PTMQ) framework, under which the CIOB will assess and accredit project time management professionals, placing the CIOB at the forefront of establishing the premier industry standard in planning, scheduling and project control.

The first element of the framework, the Project Time Management Certificate (PTMC), was launched at a gala function in London by the CIOB President last week. Unlike existing certifications, this qualification focuses on assessing the candidate's knowledge of practical project time management. It is designed for new entrants to planning and scheduling as well as those already engaged in the management of time on projects. Holders of the PTMC will have demonstrated a rigorous understanding of the practice that underpins project planning and scheduling.

The launch of the PTMQ framework moves the CIOB one step closer to completing a five-year strategy to provide standard education, training and accreditation in time management.

Back in 2008, CIOB research found that 67% of complex building projects were late. Of those delayed, 13% were more than three months late and 18% more than six months late. This finding prompted the CIOB to develop and publish, in 2011, the CIOB Guide to Good Practice in the Management of Time in Complex Projects, which sets down the process and standards to be achieved in preparing and managing a time model.

The Guide underpins the new CIOB contract for the management of complex projects, due for publication later this year, and the PTMQ framework for assessing and accrediting the project time management professionals required under the CIOB contract.

The PTMC examination is open to CIOB members and non-members, those who have gone through Project Time Management training or those who have self-studied. It will appeal to anyone looking for a relevant and credible qualification in project time management. And in combination with the forthcoming Practitioner (PTMP) and Specialist (PTMS) credentials, it will offer a project time management qualification structure that will provide a progressive development path based on assessment of skills, knowledge and experience in planning, scheduling and project controls.

Mosaic is the exclusive CIOB partner for delivery of training in Australia and New Zealand, with rights to deliver training throughout the wider region. We are currently working on a planned series of public workshops and examinations commencing in Q1 of 2013. Courses and/or examinations can also be arranged for organised groups. For more information on this exciting development see: http://www.mosaicprojects.com.au/Training-CIOB-TM_Credential.html

UK and European readers contact: http://www.athenaprojectservices.com/


Scheduling Tools

Has Microsoft overcooked the price and performance of Microsoft Project (MSP)? With the impending release of Project 2010 most organisations should be re-evaluating their scheduling tools. Blindly following the Microsoft upgrade path should not be an option.

The trigger for this post is a number of emails I have received, plus comments in a number of published articles and on Planning Planet. Some users criticise MSP for flawed analytical performance, poor data handling and lack of real power in analysis. Other users criticise MSP for being too complex and too hard to use (you could almost feel sympathy for the MSP development team's dilemma). These criticisms have not changed much since the release of Project 2003 and Project Server. What has changed dramatically is the scheduling software market.

Through to the early 2000s Microsoft virtually gave MSP away; almost anyone could access a ‘competitive upgrade’ for under US$100. The very low cost of MSP effectively destroyed 90%+ of the mid-to-low-end competition: TimeLine, CA SuperProject and a host of other businesses closed, merged or changed focus.

Today most people outside of major corporations pay around US$1000 for a copy of MSP. This tenfold increase in the ‘real price’ of the tool, caused primarily by the elimination of heavy discounts, has opened the window for a host of new players in the mid-to-low-end scheduling marketplace, many with free options.

Asta PowerProject seems to be a complete replacement for MSP, with equivalent levels of capability and sophistication and better presentation and analytical capabilities.

Other graphical tools include CASCAD-e and NetPoint.

Some of the tools that are completely free, or have free entry-level options, include: jxProject, Gantter.com, PlanningForce and OpenProj.

This is not a comprehensive list by any means; more tools are documented on our ‘scheduling home page’. And I have not ventured into discussing the high-end products such as ACOS, Micro Planner, Primavera, Spider and the Deltek range.

The purpose of this post is to challenge every organisation to really evaluate its scheduling requirements and test the market before letting its IT department blindly follow the Microsoft upgrade path.

Project 2010 may still be the best answer, but this needs to be an informed decision based on a proper review of the available alternatives. Simply paying the cost of upgrading to Project 2010 (including licence fees, retraining and data conversion costs) without re-testing the market should be seen as totally unacceptable, because in 2010 there is a real choice of tools available!

We’ve got it wrong…

Apparently weather forecasters don’t need satellites and schedulers are redundant (or should that be rodent…?). Based on the following announcement, “Catch all of the show at Gobblers Knob on Groundhog Day this year as Punxsutawney Phil and the Inner Circle of the Punxsutawney Groundhog Club predict the end to winter weather” (see: http://www.groundhog.org/), and the resulting answer of six weeks, all we need is a well-trained rat or two (maybe a Capybara for really big problems).

If you don’t know what a Capybara is see: http://en.wikipedia.org/wiki/Capybara

Planning is a social process!

I have just finished reading an article by Simon Harris on the use of 3M’s ubiquitous Post-It Notes in developing a schedule. Reading the article and taking a quick tour of Simon’s website reinforced a whole range of thoughts, ideas and practices I have held for a long time:

  1. Schedules cannot control the future, there are numerous posts on this blog and papers on the Mosaic scheduling home page that explain why. Bookies and casino owners make money, this would be impossible if people could predict the future.
  2. I have also been posting and writing on the value of scheduling as a way to develop a communal view of what might be a good ‘future’ and the power of the schedule to influence people’s decisions and actions to help create that future. The schedule is a communication vehicle and needs to be as simple as possible to make sure the message is understood.

Whilst it is implicit in what I have written, one element that has probably been understated is the critical need to involve the project team in developing the project schedule if they are going to accept the schedule as ‘theirs’ and work to make it happen. This is where Simon’s article dragged me back to ideas we used to teach 10 to 15 years ago.

Using Post-It Notes to shape up a schedule, particularly the activity definition and sequencing has always been a valuable process (at least since the launch of the notes in the early 1980s – I can still remember the first free samples that arrived in a magazine). Simon’s article reinforced the value of the process:

  • To start with, people really do think better on their feet; the cerebellum is important for physical movement, cognitive thinking and importantly planning future behaviour. The standing and moving associated with a Post-It Notes planning session (PIN-PS) helps the thinking.
  • Negotiating and debating ideas in a non-confrontational way needs the idea under discussion to be separated from the person. Focusing on the PIN-PS means looking at the wall; eye contact is minimised and the ‘idea’ (ie, the Post-It Note) is separated from the person. The interpersonal challenge is minimised.
  • Everyone can contribute, draw lines, move the notes and discuss the options. No-one controls the keyboard! This allows group consensus to be reached and group ownership of the outcome. The photograph of the finished wall is the single version of the truth.
  • The physical limitations of the process prevent too much granularity. Excessive detail does not assist understanding or accuracy.

The power in the process is not the physical elements of the PIN-PS; it is the discussion and shared understanding that are developed. Later, as the project schedule hits the inevitable problems, this shared understanding allows options and solutions to be reached quickly, with team buy-in.

But before rushing off to try a PIN-PS there are a few tips…..

  • Use glass or paper as a background – Post-It Notes don’t stick well to whiteboards that have been cleaned, or to vinyl wall coverings. Whiteboard markers work well on glass.
  • Cotton or wool can be used for link lines if whiteboard markers would leave a permanent stain (as they do on most paints, cloths and papers).
  • Peel your Post-It Note off the pack from the side – if you peel from the bottom the gummed portion curls and does not stick well.

If you think this is just plain common sense, consider as a final thought Simon’s definition of common sense: “that which is commonly considered right and proper when observed”, not “that which anyone and everyone has already thought of”.

The full article was published in the December 2009 edition of Project Manager Today. For more SOOPs (Simon’s observations on projects) see http://www.logicalmodel.net

Schedule Density

I have mentioned the work being done by the CIOB (UK) to develop a practice standard for scheduling in a few posts. This valuable work is now at the public comment stage and has a number of really innovative ideas.

The concept of schedule density contained in the CIOB ‘guide’ is not dissimilar to rolling wave planning but has far more practical advice.

The concept is based on the idea that it is practically impossible to fully detail a schedule for a complex project at ‘day 1’ – too many factors are unknown or still to be developed. The CIOB advice is to plan the overall project at ‘low density’, expand the work for the next 9 months to ‘medium density’ and plan the next 3 months at ‘high density’.

Schedule Density Over Time

Low density activities may be several months in duration. Medium density activities are no longer than two months and focused on one type of work in one specific location. High density activities are fully resourced, with a planned duration no longer than the schedule update period and with specific workers allocated.

Activities are expanded to increase density

As the ‘density’ of the schedule is increased, the plan takes into account the current status of the work, current production rates and what is required to achieve the overall objective of the project.
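
The banding described above can be sketched in a few lines of Python. This is purely illustrative: the `density_band` function and its 30.4-day month are my own shorthand, not anything from the CIOB guide; only the three- and nine-month horizons come from the guide's advice.

```python
from datetime import date

# Hypothetical sketch: classify activities into CIOB density bands by how far
# ahead of 'today' they start. Only the 3- and 9-month horizons follow the
# guide; the function and its month length are illustrative assumptions.

def density_band(activity_start: date, today: date) -> str:
    months_ahead = (activity_start - today).days / 30.4
    if months_ahead <= 3:
        return "high"    # fully resourced, durations <= update period
    if months_ahead <= 9:
        return "medium"  # one work type, one location, <= 2 months
    return "low"         # outline activities, may span several months

today = date(2010, 1, 4)
for start in [date(2010, 2, 1), date(2010, 6, 1), date(2011, 3, 1)]:
    print(start, density_band(start, today))
```

At each schedule update, activities crossing into the next band would be expanded to the higher density, which is where the rolling-wave character of the approach comes from.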

This approach has a range of advantages over more traditional ways of scheduling, not the least of which is engaging the people who will be responsible for doing the work over the next two to three months in the detailed planning of ‘their work’.

More later.

Resourcing Schedules – A Conundrum 2

Following on from the comments on my post ‘Resourcing Schedules – A Conundrum’, there are still some basic problems to resolve.

As the commentators suggest, KISS is certainly an important aspect of effective resource planning: ie, planning resources at an appropriate level of detail for real management needs. But the basic issue remains; you cannot rely on a scheduling tool to optimise the duration of a resource-levelled schedule.

We use the basic network below in our Scheduling courses (download the network – or – see more on our scheduling training)

Network for Analysis (download a PDF from the link above)

No software I know of gets this one ‘right’.

When you play with the schedule, the answer to achieving the shortest overall duration is to start the critical resource (Resource 3) as soon as possible.

To achieve this, Resource 2 has to focus 100% on completing Task B as quickly as possible. BUT Task C, not Task B, is on the time-analysis critical path, and 99% of the time the software picks C to start before B.

This is not a new problem; a recent paper by Kastor and Sirakoulis (International Journal of Project Management, Vol. 27, Issue 5 (July), p. 493) has the results of a series of tests – Primavera P6 achieved a duration of 709, Microsoft Project 744 and Open Workbench 863. Play with the resource levelling settings in P6 and its results are 709, 744, 823 and 893 – a huge range of variation, and the best option (P6) was still some 46% longer than the time-analysis result. Other analyses reported in the 1970s and 80s showed similar variability of outcomes.
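
For readers who want to see why the choice of priority rule matters so much, here is a minimal, invented example (not the course network linked above): a serial levelling pass over a single-capacity resource, where scheduling the ‘wrong’ task first stretches the levelled duration.

```python
# Minimal sketch (invented task data, not the course network): one resource
# must do B and C serially; D follows B and E follows C, unconstrained.
# The only thing that changes between runs is the priority order.

def level(order):
    """Return the levelled project duration for a given priority order."""
    durs = {"B": 3, "C": 4, "D": 10, "E": 2}
    finish, t = {}, 0
    for task in order:            # serial pass over the shared resource
        t += durs[task]
        finish[task] = t
    finish["D"] = finish["B"] + durs["D"]   # D can start once B finishes
    finish["E"] = finish["C"] + durs["E"]   # E can start once C finishes
    return max(finish.values())

print(level(["B", "C"]))   # B first -> project ends at 13
print(level(["C", "B"]))   # C first -> project ends at 17
```

Same network, same resource, two answers; a real levelling engine is making thousands of choices like this, which is why different settings (and different tools) diverge so widely.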

As Prof. George Box stated, “All models are wrong, some are useful”… the important question is how wrong the model can be before it is no longer useful.

Computer-driven resource schedules are never optimal; done well, they are close enough to be useful (but this needs a good operator plus a good tool). And good scheduling practice requires knowing when near enough is good enough, so that you can use the insights and knowledge gained to get on with running the project – remembering that even the most perfectly balanced resource schedule will fall out of balance at the first update…

How you encapsulate this in a guide to good scheduling practice is altogether a different question. I would certainly appreciate any additional feedback.

Resourcing Schedules – A Conundrum

I have recently been forced to think about the value of incorporating resources into schedules. At one level it’s not too hard to do, but is it useful?

From one aspect, it is impossible to schedule at any level without the active consideration of resources. Resources do the work in a given time and changing either the quality or quantity of the resource has some inevitable impact on duration. Consequently, it is critical to know the resource assumptions used in planning to validate the schedule and more importantly understand deviations from the plan during the execution of the work.

Generally, what I mean by the term ‘considered’ is the basic need to know the resources needed to undertake the work on every activity:

  • At the feasibility stage, the big picture, tied to the strategy for the project.
  • At the contract stage, to determine which tasks are the responsibility of which contractor/subcontractor.
  • At the weekly level, the supervisors need to know who is working where and when.

These decisions also need to be recorded and monitored. How much detail is recorded in the scheduling tool, and which scheduling functions are used, is though an altogether different question – this is what I refer to as ‘quantitative’ resource analysis.

Consideration is not the same as quantitative analysis within a scheduling tool. Quantitative resource analysis requires answers, or assumptions to be made, about a range of uncertain issues. Some of the nearly insoluble questions include:

  1. There is no direct ‘straight line’ correlation between resource quantities and either task or project durations – there is a complex ‘J’ curve relationship and, in some circumstances, a negative correlation. For more on this see: The Cost of Time or, for a more learned approach, The Mythical Man-Month by Frederick P. Brooks Jr., originally published in 1975.
  2. It is nearly impossible to define skill levels for people who will be employed on a project at some time in the future, but we know a skilled worker can be far more productive than an unskilled worker. The skill of the worker changes the production rates and consequently the durations.
  3. The other issue is the degree of motivation/morale of the people – a highly motivated team will always accomplish more than a ‘business as usual’ team, and both more than a de-motivated workforce. Therefore the questions of management and, more importantly, leadership also influence resource performance and therefore durations.
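
As a hedged illustration of point 1, a simple Brooks-style model treats each extra worker as adding capacity but also pairwise communication overhead. The coefficients below are invented; the shape of the resulting ‘J’ curve, not the numbers, is the point.

```python
# Illustrative sketch only: net production per day falls as the number of
# communication pairs, n(n-1)/2, grows. Rate and overhead values are invented.

def duration(work, crew, rate=1.0, overhead=0.08):
    pairs = crew * (crew - 1) / 2
    effective = crew * rate - overhead * pairs   # net production per day
    return work / effective if effective > 0 else float("inf")

for n in (2, 5, 10, 20):
    print(n, round(duration(100, n), 1))
```

In this toy model, duration falls steeply at first, flattens, and then rises again once the crew passes the point where coordination overhead swamps the extra capacity – the negative correlation mentioned above.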

These unanswerable questions are complicated by the fact that all scheduling software fails to optimally level resources. Basically, the tools get it wrong; the only question is how wrong: some are not too bad, others are unmitigated disasters. Resource scheduling needs both knowledge and common sense – no software applies common sense yet. But we have to plan resources – they need working space, accommodation, etc. And resources are the source of all cost expended on the project!

Another really interesting factor is the emerging understanding of the interaction between the schedule and the behaviour of people. IF people believe the schedule represents a realistic approach to their work, they will (and do) modify their behaviours to conform to the schedule so as to be seen as successful. Obviously, a schedule that includes resources is far more credible than one that does not. This was touched on in Scheduling in the Age of Complexity (read from p19 – the rest is not relevant and it’s a horribly long paper…. with a bit of luck this may turn into a book in a couple of years….).

So, in conclusion, I would suggest that consideration of resources is critical, as is having some form of method statement; together they dictate the planned durations of the work.

However, whilst using scheduling tools to calculate and level resource demands is useful, and can help gain valuable insights, you need real skill on the part of the scheduler and the right tools to achieve sensible results.

My feeling is that the value of the process to the development of a realistic and achievable schedule depends on the circumstances of the project. Probably the biggest determinant of the value of quantitative resource analysis is the ease of adding to or reducing the resource pool. If this is easy, rudimentary quantitative analysis is all that’s needed, if any. If it is difficult to quickly change the resource pool, far more rigour is required (eg, developing remote-area mine sites in Australia). The quantitative analysis will still be ‘wrong’, but it is important to reduce the level of error as much as possible.

This is a complex issue – what are your thoughts?