Resource Leveling: The Complete Series, Part 2


Project Management Institute (PMI)® Professional Development Units (PDUs):
This Webinar is eligible for 1 PMI® PDU in the Technical category of the Talent Triangle.

Event Description:

Join MPUG author Daryl Deffler as he continues the journey through his latest series of articles on Resource Leveling.

Slide Deck: MPUG Project 2013 Resource Leveling Part II


Watch Part One


Read Daryl’s articles in this resource leveling series here:
“Scheduling vs. Leveling”
“Problem Indicators”
“Leveling Mechanics”
“Leveling Hierarchy, Part 1”
“Leveling Hierarchy, Part 2”
“Resolution Options”
“Understanding Split Task Options”
“Leveling Fields”
“Limitations”
“Preparing to Level”
“Resource Leveling: It’s Time to Level Your Schedule”
“Resource Leveling: The Leveling Cycle”
“Resource Leveling: Recommendations”


Presenter Info:

Daryl Deffler is currently employed by a large insurance company, where he provides project management, project scheduling tool, process, and standards consulting for an enterprise project management office comprising about 200 project managers. He has over 25 years in the IT project management field, with experience managing everything from small projects to large programs. During this time he has also developed and taught classes in both project management and scheduling tools such as Microsoft Project 2013, Primavera, and ABT Workbench. He has worked in the IT industry since 1979 in additional roles spanning software development, technical support, and management across mainframe, midrange, and PC platforms.


Have you watched this webinar recording? Tell MPUG viewers what you think!


4 Comments
  1. Excellent presentation and great job on the associated articles. Are the slides available?

  2. Great presentation

  3. Daryl, I’ve seen many references which discourage adding dependencies to every task as you are suggesting toward the end of this video. I was wondering what your thoughts are on this. Here’s an example: https://social.technet.microsoft.com/Forums/en-US/a7e6cb6c-58d0-4d94-8042-c6569d429c01/ms-project-pro-2010-linking-tasks-that-have-no-dependency-but-have-the-same-resource?forum=projectprofessional2010general

    Thanks,
    Dan

  4. Tasks with no predecessor will be scheduled immediately (as in today), and tasks with no successor will be scheduled anywhere from the current date to the end of the project. Generally, this is not what is desired, and the primary reasons to add dependencies to every task are:
    * to avoid the above situation
    * if critical paths are reviewed, to allow Project to calculate a real, contiguous critical path. Missing dependencies result in fragmented critical paths.
    * to keep the schedule predictable and maintainable

    With that said, adding dependency relationships does not necessarily mean everything is waterfall. For example, Bob could have 8 tasks to complete after milestone A. All 8 tasks could have the same milestone A predecessor. On the plus side, this would allow Project to dynamically determine the scheduling order for those 8 tasks. On the negative side, Project might change the order of those 8 tasks every time the schedule is leveled in each subsequent week. I’ve seen that occur.

    There are a few primary reasons I add dependencies, including false dependencies. Chief among them is to simplify and remove unpredictability from the schedule/level functions. (I’m selfish. I want to make my PM role simpler!)

    First, I want to see how long specific blocks of work will take to complete. The internal sequence in which the individual tasks within that block finish may be irrelevant to me; all I care about is the overall duration. So in the prior example, I may add false dependencies to Bob’s tasks 1-8 to schedule them in a waterfall sequence that Bob and I agree upon, simply so that I can see that all 8 tasks will take 2 months. I would then let Bob know that he can work the tasks in any order, but that overall, all 8 tasks should be completed in 2 months. And that’s what I use for tracking.
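
    As a rough illustration of that “track the block, not the tasks” idea, here is a minimal sketch in Python (not Microsoft Project itself); the task estimates are hypothetical and it assumes an 8-hour working day and roughly 20 working days per month:

```python
HOURS_PER_DAY = 8          # assumed working day
DAYS_PER_MONTH = 20        # assumed working days per month

# Hypothetical estimates for Bob's tasks 1-8 (hours); they total 320
task_hours = [80, 60, 40, 40, 30, 30, 20, 20]

# With false finish-to-start dependencies, the tasks run back to back,
# so the block's duration is simply the sum of the task durations.
block_days = sum(hours / HOURS_PER_DAY for hours in task_hours)
print(f"Block duration: {block_days:.0f} days "
      f"(~{block_days / DAYS_PER_MONTH:.1f} months)")
# -> Block duration: 40 days (~2.0 months)
```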

    The second reason is so that the weekly schedule/level functions don’t keep changing the task order. In an environment where team members are using online time sheets or weekly reports to guide what they should be working on that week, having the task sequence change constantly from week to week causes a lot of confusion. So as a PM, I want them to focus on tasks until completion rather than jumping between multiple in-flight tasks.

    The third reason is to create a simpler, more predictable schedule. In a complicated schedule, I may have multiple work streams for Bob running in parallel, meaning Bob has the 8 tasks above as one work stream, along with two more groups of tasks, and all three groups could be running in parallel. If I let Project control the sequencing, there’s a very good chance that I’ll get different leveled end dates for each of the work streams with each subsequent weekly schedule/leveling run.
    So by adding some false dependencies, I can control the scheduling/leveling function and end up with more consistent and predictable results every time.
    Along these same lines, I could overly complicate the schedule and configure Bob to work at 12.5% on all 8 tasks in parallel. But that adds a LOT more complexity to scheduling and schedule maintenance, so why not keep it simple?

    As an example of the added complexity: if Bob has 320 hours of work to complete on these 8 tasks, scheduling them at 100% allocation with artificial waterfall dependencies will generally result in a 2 month duration. I could instead schedule all 8 tasks at 12.5% running in parallel. In theory this could produce the same 2 month duration, but odds are it will result in a later date unless every task has the exact same hours estimate. For example, an 80 hour task allocated at 12.5% (1 hour per day) will schedule out to 80 days, or 16 weeks (4 months), while a 20 hour task will take 20 days, or 4 weeks (1 month).

    As a side effect, once the shorter tasks complete, Bob is only being used at 87.5%, then 75%, and so on. This can be corrected, but it means you as the PM must manually adjust the allocation percentage on each remaining task: after the first task in the group completes, the remaining tasks must be bumped up to roughly 14.3% allocation; after the second, to roughly 16.7%; and so on. While this can work, it’s far too much burden on you as the PM. So keep it simple: focus on the end date for the group of work and schedule the tasks within the group as simply as possible.
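
    To make that allocation arithmetic concrete, here is another minimal sketch (again Python rather than Project, using the same hypothetical task estimates and an assumed 8-hour day):

```python
HOURS_PER_DAY = 8
task_hours = [80, 60, 40, 40, 30, 30, 20, 20]   # hypothetical estimates, 320 h total

# Option A: all 8 tasks in parallel at a fixed 12.5% each (1 hour/day/task).
# Each task's duration in days equals its hours; the block ends with the longest task.
parallel_days = max(hours / (0.125 * HOURS_PER_DAY) for hours in task_hours)
print(f"Parallel @ 12.5% each: {parallel_days:.0f} days")   # 80 days (~4 months)

# Option B: artificial waterfall dependencies at 100% allocation.
waterfall_days = sum(hours / HOURS_PER_DAY for hours in task_hours)
print(f"Waterfall @ 100%: {waterfall_days:.0f} days")        # 40 days (~2 months)

# To keep Bob fully utilized in Option A, the PM has to re-bump the allocation
# on the remaining tasks each time one finishes: 1/8 = 12.5%, 1/7 = 14.3%, ...
for remaining in range(8, 0, -1):
    print(f"{remaining} task(s) left -> {100 / remaining:.1f}% each")
```
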
    I know I went off on a bit of a tangent, but these topics are all related to having dependency relationships on every task. Not only is there a reason for a relationship on each task, but thought also needs to be given to HOW those relationships will be configured, which then leads downstream into how each approach impacts ongoing schedule maintenance.
    Hope that helps.
