Part 4: The worst of what we found, and why it matters

The main elements of asset management planning need to be carried out together - recording and analysing information, determining work required on the assets, establishing levels of service, financial forecasting, and making links to wider service and financial planning. No element can be effective in isolation.

As we carried out our asset management audits, we came across examples where organisations’ asset management was not working effectively. To help others avoid these pitfalls, this Part presents a series of case studies illustrating some of the things that can go wrong.

The case studies are based on real examples but have been anonymised. Before reading them, you should also consider the following points.

Insufficient interest, time, and control to manage effectively

The Capital Asset Management Team at the Treasury has identified a number of factors that can undermine effective asset management.5 Many of these match our own experience:

  • lack of interest and support at senior management or governance level;
  • a project manager without the time or skills; and
  • no clear targets, timelines, roles, or responsibilities, and no active management of planning progress.

Failure to use resources effectively

Resources, both skilled asset engineers and finance, are clearly limited. However, while some of the organisations we reviewed were struggling with relatively few, inexperienced staff, others were not making the best use of the resources they did have.

A related problem is isolation. We found that some organisations seemed to operate in isolation, unwilling to learn from others or to adapt and update their approach to planning to match the latest developments.

Confusion about levels of service

"Levels of service" seems to be an area that many organisations struggle with, and at worst it can cause considerable confusion. The confusion seems to come in a number of forms:

  • A lack of distinction between technical and customer levels of service. Customer levels of service should describe how the customer experiences the service in a way that they can understand. Technical levels of service are about how the organisation provides the service and are often expressed in terms of technical standards and specifications.
  • A lack of clarity that customer satisfaction surveys, performance measures, performance indicators, and performance targets are not levels of service, although they are all useful elements in a performance framework. Levels of service are a statement of the standard the customer can expect. Customer satisfaction measures whether these standards are meeting the customer’s expectations. They do not of themselves define the standards. Similarly, performance measures and indicators provide information on whether levels of service are being achieved. Performance targets define a planned future level of performance.
  • Confusing the level of provision with the levels of service. A simple example demonstrates this. The number or length of roads that a council owns is not a level of service, but it does quantify the volume of roads provided. The levels of service will relate to the smoothness, safety, reliability, attractiveness, and so on, of those roads. However, in other services, this distinction is lost. The number of parks, libraries, swimming pools, and so on, is not a level of service, although it often seems to be quoted as one. It is, however, one factor affecting the accessibility of such facilities, which would be an appropriate level of service.
  • Failure to consider service from a range of perspectives - such as quantity, availability, quality, convenience, responsiveness, environment, cost, and system efficiency - or to ensure that all aspects of a service are covered. The number of levels of service needs to be manageable, but it also needs to effectively cover the full scope of the service as its users understand and experience it. For example, parks levels of service covering only playing field maintenance standards and not paths, seating, planting, play equipment, and so on, does not cover the range of perspectives and services relevant to a park.
  • Poor or missing links to higher objectives and community outcomes so that the levels of service give no indication of what the most important aspects of the service are. If an organisation’s main objective is about safety, the safety of its roads is an important level of service; but, if the main objective is economic development, a more important level of service might be reliability of travel between business districts.

All of this is important because the purpose of asset management is to provide a desired level of service through the management of assets in the most cost-effective manner for present and future customers. Without setting levels of service, there can be no asset management.
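
For organisations that keep their performance framework in an asset management information system, the distinctions above can be made concrete in a simple data model. The following sketch, in Python, is purely illustrative: the class and field names, the roading example, and all figures are our own assumptions, not drawn from any particular council’s system. It separates the level of service (the standard the customer can expect) from the performance measures that report on it and the targets that define planned future performance.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class PerformanceMeasure:
        # Reports whether a level of service is being achieved;
        # it is not a level of service in itself.
        name: str
        unit: str
        target: float            # performance target: planned future performance
        actual: Optional[float] = None

        def target_met(self) -> bool:
            return self.actual is not None and self.actual >= self.target

    @dataclass
    class LevelOfService:
        # A statement of the standard the customer can expect.
        statement: str           # customer level of service, in customer terms
        technical_standard: str  # technical level of service: how it is delivered
        linked_outcome: str      # link to a higher objective or community outcome
        measures: list = field(default_factory=list)

    # Hypothetical roading example; all values are invented for illustration.
    smooth_roads = LevelOfService(
        statement="Sealed roads are smooth and comfortable to drive on",
        technical_standard="Resurface when roughness exceeds the agreed threshold",
        linked_outcome="Safe and reliable travel",
        measures=[
            PerformanceMeasure(
                name="Smooth travel exposure",
                unit="% of vehicle-km travelled on smooth sealed roads",
                target=87.0,
                actual=85.5,
            ),
        ],
    )

    for m in smooth_roads.measures:
        status = "met" if m.target_met() else "not met"
        print(f"{m.name} ({m.unit}): {m.actual} against target {m.target} - {status}")

Note that the quantity of assets (the level of provision, such as kilometres of road) appears nowhere in the level of service itself; only the standard the customer experiences, and the measures and targets attached to it, do.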

Lack of external scrutiny

It was striking that, for many councils, the most advanced asset management plan was the one for roading (or transportation). Although roading is clearly a high-value, complex asset that requires good-quality planning, we found that an important driver of more advanced practice was the scrutiny that this planning receives from the New Zealand Transport Agency. We were left with two questions:

  • Why is external scrutiny needed to raise performance, when good quality planning is for the organisation’s own benefit?
  • If external scrutiny is a key factor driving good performance, who is scrutinising the management of other critical assets such as water services, other utilities, and networks of public buildings throughout the country?

Our case studies in this Part exemplify these and other points:

  • Case study 4.1: Organisation A – risk increased by lack of policy and plans
  • Case study 4.2: Organisation B – unreliable and missing information
  • Case study 4.3: Organisation C – plans not current
  • Case study 4.4: Organisation D – plans do not link framework elements
  • Case study 4.5: Organisation E – deferred maintenance creates problems
  • Case study 4.6: Organisation F – affordability and investment not balanced

5: The Treasury (National Infrastructure Unit), in association with the National Asset Management Steering Group of the Association of Local Government Engineering NZ Inc (INGENIUM) (2009), Capital Asset Management (CAM) Leadership Training: "Towards stronger Capital Asset Management", The Treasury, Wellington, overhead slide 13, "Reasons why the CAM team may falter" (Session 3).