Why do you need Performance Management?

Management directives, technical capabilities, focus groups, and other "Voice of the Customer" activities produce myriad requirements. Modeling, simulation, and other design activities produce enormous amounts of raw data. From all this, how do you make sense of how well (or poorly) a project is performing?

  • Are review meetings the first time you find out about problems in a design?
  • Do your team members make a change to a design one day, only to find out two weeks later that it caused problems in a different part of the design?
  • Do you struggle to keep track of ever-changing requirements?
  • Do your team members know how their work is performing relative to the project requirements?

Whether you design corporate software systems or energy-efficient green buildings, you need a system that will keep track of your performance. This means defining and managing requirements, as well as capturing and managing results.

Metrics

Each metric measures the performance of an aspect of a design. Calculation of a metric consists of a comparison between requirements and results. For example, a new building might be evaluated on two metrics: cost and time-to-complete.
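The comparison described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration: the `Metric` class and its method names are assumptions for this example, not part of any real product API.

```python
# Hypothetical sketch: a metric as a comparison between a
# requirement (target value) and a result (measured or simulated value).
from dataclasses import dataclass

@dataclass
class Metric:
    name: str
    requirement: float  # target from the requirements
    result: float       # value from simulation or measurement

    def margin(self) -> float:
        """Positive margin means the result beats the requirement."""
        return self.requirement - self.result

    def passes(self) -> bool:
        # In this toy model, lower results are better (cost, duration).
        return self.result <= self.requirement

# Example from the text: a new building evaluated on cost and time-to-complete.
cost = Metric("cost ($M)", requirement=12.0, result=11.4)
schedule = Metric("time-to-complete (months)", requirement=18.0, result=19.5)

for m in (cost, schedule):
    print(f"{m.name}: {'PASS' if m.passes() else 'FAIL'} (margin {m.margin():+.1f})")
```

Here the cost metric passes with margin to spare, while the schedule metric fails; updating either the requirement or the result recomputes the metric immediately.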

Requirements often change. Simulation results and predictions of product behavior often change, especially in the early stages of a design. As a result, product performance is often unknown or wildly varying.

Your performance management system must adapt to changes to both requirements and results.

Testing and Validation

Testing and validation are critical to a quality product. Whether you design custom ASICs or simulate energy prices, a rigorous, methodical approach to testing and validation will ensure that you catch problems before your customers do.

Tests and testing procedures change over time. Tests may be done manually at first, then become automated as the product evolves. Over the life of a product, the number of tests will grow. In addition, tests will change as the product and/or design changes.

Your performance management system must be flexible enough to accommodate these changes. It should move with you from manual system testing to automated unit tests. It should encourage running and re-running of tests rather than hindering them by being brittle and inflexible.
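As one illustration of the manual-to-automated progression described above, a check that once happened by hand in a review meeting can be captured as a re-runnable test. Everything here is hypothetical: `predicted_cost` is a toy stand-in for a real simulation, and the budget figure is invented for the example.

```python
# Hypothetical sketch: a manual spot-check ("is predicted cost still
# under budget?") captured as an automated test so it can be re-run
# cheaply every time the design changes.

BUDGET = 12_000_000  # requirement, in dollars (invented for this example)

def predicted_cost(floor_area_m2: float, rate_per_m2: float) -> float:
    """Toy cost model standing in for a real analysis or simulation."""
    return floor_area_m2 * rate_per_m2

def test_cost_within_budget():
    # What used to be a manual review-meeting check, now automated.
    assert predicted_cost(floor_area_m2=8_000, rate_per_m2=1_400) <= BUDGET

test_cost_within_budget()  # re-run at every design change
```

Because the check costs nothing to repeat, it can run on every change rather than once per review cycle, which is exactly the flexibility argued for above.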

Integration with Revision Management

If your team members cannot see how their actions affect the product, they have no way to know how well they are performing. If they are not sharing a common set of performance indicators or a process for communicating performance, problems with system performance will not be discovered until later in the design.

Displaying performance indicators in Revision Manager provides accurate, up-to-the-minute feedback. When team members regularly commit their work to a repository, that repository holds a snapshot of the current state of the project. Part of that state is a set of performance indicators.

Not only do the indicators show how a project is performing, but they can also provide immediate feedback to team members about their specific contributions. No longer will you have to wait until a team meeting to find out about an improvement in results: the indicators will show it as soon as the results are committed to the repository.
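One common pattern for wiring this up is a post-commit hook that recomputes the indicators from the freshly committed results. The sketch below is an assumption about how such a hook might work, not a description of Revision Manager's actual interface; the dictionary format and field names are invented for the example.

```python
# Hypothetical post-commit hook body: after results are committed,
# compare each result against its requirement and regenerate the
# indicator set stored alongside the project snapshot.
import json

def update_indicators(results: dict, requirements: dict) -> dict:
    """Recompute pass/fail indicators from the latest committed results."""
    return {
        name: {
            "requirement": requirements[name],
            "result": results[name],
            # Toy convention: lower results are better (cost, duration).
            "status": "pass" if results[name] <= requirements[name] else "fail",
        }
        for name in requirements
    }

# Example: the committed results for a building project.
requirements = {"cost_musd": 12.0, "months_to_complete": 18.0}
results = {"cost_musd": 11.4, "months_to_complete": 19.5}

print(json.dumps(update_indicators(results, requirements), indent=2))
```

Because the hook runs on every commit, the indicators are only ever as stale as the last commit, which is what makes the feedback "up-to-the-minute" rather than waiting for the next team meeting.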