
Best Practices for creating plans to maximize performance

Symptoms
========
When a project plan is opened, saved, or updated, it can take a very long time to process. The application may appear to hang, and Task Manager may show that Project is "Not Responding". However, if you let the process complete, access to the application returns to normal and the plan is opened, saved, or updated successfully.

Cause
=====
The problem could be due to any one or a combination of the following:
1. There are many custom fields that are set up to use a formula.
2. There are a large number of assignments in the project. An assignment is created for each resource assigned to a task.
3. There are tasks that span a year or more.
4. There are a large number of tasks.
Resolution
==========
The more complex the plan, the larger its size and the longer it takes to process. To resolve the issue, analyze the plan to see where you can make changes that will reduce its size.

1. Custom fields with formulas: having one custom field with a formula slows down processing only slightly. However, if there are 10 fields that all have formulas, the effect compounds. Depending on the computing power of the system, you may see slower open, save, or update times.

2. When you assign one resource to a one-day task in Project, what is actually stored in the timephased table is how much work will be done on a minute-by-minute basis; for an 8-hour day, that is 480 pieces of data. Imagine what happens if the task spans a year and has 400 resources assigned. Opening, saving, or updating takes longer because there is so much more data being stored and updated. (A back-of-the-envelope estimate follows this list.)

3. As above, a task that spans a year or more takes longer to open, save, or update than a one-day task.

4. Large number of tasks: Microsoft recommends keeping a plan to fewer than 750 tasks for a couple of reasons. First, it is easier to manage a smaller number of tasks, to display them, and to work through the file. Second, performance improves with a smaller file.

5. When a file is reused, there may be large gaps in the Unique ID numbers due to deleting tasks and creating new ones. When the plan is opened, saved, or updated, these gaps and inflated Unique ID values add to the processing.

6. When a baseline is saved, the Duration, Work, Cost, Start, and Finish dates for each task and each resource, as well as assignment information, are saved in the file. You can save up to 11 different baselines.
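
To put point 2 in perspective, here is a minimal back-of-the-envelope sketch (in Python, purely illustrative) of how many minute-level timephased values the scenarios above would store. The 480-values-per-working-day figure comes from the explanation in point 2; the 250 working days per year is an assumption used only for the estimate.

# Back-of-the-envelope estimate of minute-level timephased values per task,
# using the 480 values per 8-hour working day described in point 2.
MINUTES_PER_WORKING_DAY = 480
WORKING_DAYS_PER_YEAR = 250  # illustrative assumption

def timephased_values(duration_days, resources):
    # One value per working minute, per assigned resource.
    return duration_days * MINUTES_PER_WORKING_DAY * resources

print(f"{timephased_values(1, 1):,}")                        # 480
print(f"{timephased_values(WORKING_DAYS_PER_YEAR, 400):,}")  # 48,000,000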

Resolution to causes 1-4
========================
In cases 1-4, evaluate whether such extensive use of each feature is necessary.
1. Can the same be accomplished with fewer custom fields, another kind of field, or a macro? Can the formulas be simplified? Is there another field that shows similar information?
2. Can you replace the long task with many resources assigned with more, shorter tasks that have fewer resources assigned?
3. Can the same task be represented by a summary task with smaller child tasks that roll up into it?
4. Break the plan into smaller projects and use a Master plan to view all the data when needed.

Resolution to cause 5
==================
1. Export the data to a different file format and remove the Unique IDs.
2. Open the exported data as a new project plan.
The result is that the Unique IDs are reassigned starting with 1.
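
If you save the plan in the Project XML format as the export step, you can first check how far the task Unique IDs have drifted before deciding whether the re-import is worthwhile. The sketch below is a rough diagnostic only; the file name plan.xml and the XML namespace are assumptions, so verify both against your own export.

# Minimal diagnostic sketch: compare the number of tasks with the highest
# task Unique ID in a Project XML export (assumed file: plan.xml).
import xml.etree.ElementTree as ET

NS = {"p": "http://schemas.microsoft.com/project"}  # assumed XML namespace

root = ET.parse("plan.xml").getroot()
uids = [
    int(e.text)
    for e in root.findall("./p:Tasks/p:Task/p:UID", NS)
    if e.text and e.text.isdigit()
]

if uids:
    print(f"Tasks: {len(uids)}")
    print(f"Highest Unique ID: {max(uids)}")
    print(f"Gap (unused IDs): {max(uids) - len(uids)}")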

Workaround for cause 6
===================
Remove/Clear unused baselines
If you have been saving baselines periodically over the course of the project, there may be some baselines that are no longer needed.
1. Click Tools > Tracking > Clear Baseline.
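
Before clearing, you can use the fields listed in point 6 to make a rough, order-of-magnitude estimate of how many values each saved baseline adds to the file. This is a sketch only, with a hypothetical plan size; the real storage format differs.

# Rough, illustrative estimate of values captured per saved baseline, based on
# the five fields listed in point 6 (Duration, Work, Cost, Start, Finish).
FIELDS_PER_ROW = 5

def baseline_values(tasks, resources, assignments, baselines):
    # Each baseline captures the five fields for every task, resource,
    # and assignment in the plan.
    return FIELDS_PER_ROW * (tasks + resources + assignments) * baselines

# Hypothetical plan: 750 tasks, 100 resources, 2000 assignments, 11 baselines.
print(f"{baseline_values(750, 100, 2000, 11):,}")  # 156,750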

More information for #2
=============
How many assignments are in the plan? Project loads summary assignment information when it is launched; if there are many assignments across the EPM deployment, you will see a performance hit.
A quick test to see whether the assignments are the issue:
1. Start Project (using the account that accepts the updates).
2. Uncheck the option to load "Summary Assignment" information.
3. Go and accept the updates.
4. Project is already set up not to load this extra information, so the plan will then open.
It should be somewhat faster because Project does not need to query the extra data that is not needed for an update.
Other things to look at
· How many hops are there to the SQL Server? More hops mean more network latency when opening a lot of data. (A quick connection-latency sketch follows this list.)
· How big is their Enterprise Global (essentially, custom fields with formulas)? This causes a slowdown while Project calculates the data, and it is a big issue if the machine is slow.
· How fast is this when you turn off the scheduling engine before accepting updates? That takes calculation out of the equation.
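
As a quick check of the network latency mentioned in the first bullet, you can time a plain TCP connection to the database server. This sketch uses a hypothetical server name (sqlserver01) and the default SQL Server port 1433; it measures connection round-trip time only, not query performance.

# Minimal sketch: time repeated TCP connections to the SQL Server to gauge
# network latency. Host name is hypothetical; 1433 is the default SQL port.
import socket
import time

HOST = "sqlserver01"  # replace with your database server
PORT = 1433
SAMPLES = 5

for i in range(SAMPLES):
    start = time.perf_counter()
    with socket.create_connection((HOST, PORT), timeout=5):
        elapsed_ms = (time.perf_counter() - start) * 1000
    print(f"connect {i + 1}: {elapsed_ms:.1f} ms")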


Additional Resources

How datasets affect performance and capacity in Project Server 2013 - http://technet.microsoft.com/en-us/library/fp161198.aspx

Plan for performance and capacity in Project Server 2013 - http://technet.microsoft.com/en-us/library/ff646967.aspx

Run a Project Server 2013 performance test lab - http://technet.microsoft.com/en-us/library/ee956502.aspx

Plan for software boundaries (Project Server) - http://technet.microsoft.com/en-us/library/cc197693(v=office.12).aspx
Properties

Article ID: 2143876 - Last Review: 05/06/2016 08:58:00 - Revision: 5.0

Microsoft Project Online, Microsoft Project Web App, Microsoft Project Professional 2013, Microsoft Project 2013 Standard, Microsoft Project Professional 2010, Microsoft Project Standard 2010, Microsoft Office Project Professional 2007, Microsoft Office Project Standard 2007, Microsoft Project for Office 365

  • KB2143876