4 Performance Reference Model

4.1 Introduction

4.1.1 Measurement and the Australian Public Service

Measurement and evaluation have been used in the Australian Public Service (APS) since the early 1980s. The Portfolio Evaluation Program was a centralised program aimed at integrating program evaluation with the central budgetary process, in an effort to place greater focus on outcomes and cost effectiveness rather than simply on inputs and processes. Although successful at first, the Portfolio Evaluation Program was ultimately abandoned in the early 1990s: it was argued to have become too resource intensive to be effective, it was seen as too cumbersome and prohibitive of government innovation, and there were shortages of skilled evaluation practitioners.

Despite more recent evolutions in government policy associated with measurement and evaluation that devolve responsibility for measurement and evaluation to government agencies (the 1997 Outcomes and Outputs Framework and the revised 2009 Outcomes and Programs Policy), challenges remain in establishing and operating a robust, effective measurement and evaluation framework.

In a presentation to the Canberra Evaluation Forum (CEF)8 on 18 August 2010, the Secretary of the Department of Finance and Deregulation, Mr David Tune, stated that the availability of useful evaluation information across the APS is uneven: some agencies maintain a best-practice, coherent and well-coordinated evaluation function, with well-developed and stable internal evaluation capability and partnerships with external expert consultants; others appear less focused, which can lead to problems with the usefulness, objectivity, transparency and openness of the data that is collected.

What is needed is a measurement framework capable of operating in all government agencies, and capable of simplifying and streamlining the measurement and evaluation process.

4.1.2 Performance Reference Model

The Performance Reference Model (PRM) (Figure 4-1) has a measurement framework that aligns with the Inputs–Transformation–Outcome (ITO) Model9, the Outcome Process Model (OPM)10 and the Government Business Operation model. It supports the identification and definition of measures that capture and describe for the business:

  • the efficiency of government agencies in their utilisation of public funds and resources in the delivery of business outputs
  • the effectiveness of the outputs produced by agencies in realising desired outcomes for the government and the agency
  • the overall efficacy (ability to execute) of agencies and the delivery of government programs.

When implemented in support of business intelligence systems, balanced scorecards, enterprise architecture or other measurement systems, this framework for measurement delivers to agency executives a line of sight between the inputs of a business initiative and the realisation of outcomes from that initiative.

Figure 4-1 shows the 14 domain sub-types within the five measurement domains that comprise the PRM Classification Framework.

Figure 4-1: The PRM Classification Framework

There are five measurement domains within the PRM, and within those domains there are 14 domain sub-types. Under each domain sub-type are groupings of sub-type attributes that correspond to the characteristics of the sub-type, and below the sub-type attributes are measurement groupings that provide further refinement of attributes that possess multiple variables.

Domain        Domain Sub-Types
[1] Outcomes  [11] Program outcomes; [12] Business outcomes
[2] Usage     [21] Product consumption; [22] Service delivery
[3] Outputs   [31] Products; [32] Services
[4] Work      [41] Projects; [42] Ad hoc tasks; [43] Processes and operations (BAU)
[5] Inputs    [51] Human Resources; [52] Data and information; [53] Technology; [54] Fixed Assets; [55] Money
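For agencies building tooling around the PRM, the classification above maps directly onto a simple data structure. The sketch below is purely illustrative (it is not part of the AGA); the domain and sub-type codes and names are taken from the table above, and `classify` is a hypothetical helper that resolves a sub-type code to its place in the hierarchy.

```python
# Illustrative model of the PRM Classification Framework (Figure 4-1).
# Domain and sub-type codes/names are taken from the table above.
PRM_CLASSIFICATION = {
    "1": ("Outcomes", {"11": "Program outcomes", "12": "Business outcomes"}),
    "2": ("Usage", {"21": "Product consumption", "22": "Service delivery"}),
    "3": ("Outputs", {"31": "Products", "32": "Services"}),
    "4": ("Work", {"41": "Projects", "42": "Ad hoc tasks",
                   "43": "Processes and operations (BAU)"}),
    "5": ("Inputs", {"51": "Human Resources", "52": "Data and information",
                     "53": "Technology", "54": "Fixed Assets", "55": "Money"}),
}

def classify(sub_type_code: str) -> str:
    """Resolve a sub-type code (e.g. '53') to 'Domain > Sub-type'."""
    domain_name, sub_types = PRM_CLASSIFICATION[sub_type_code[0]]
    return f"{domain_name} > {sub_types[sub_type_code]}"
```

For example, `classify("53")` resolves to `"Inputs > Technology"`, and the structure holds all 14 sub-types across the five domains.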

The classification framework and hierarchical structure of the PRM have been shaped by the measurement and evaluation frameworks that existed in government in the past, operate in government today, and will support government into the future. By design, the PRM has been derived from pre-existing Australian Government initiatives such as:

  • the 1997 Outcomes and Outputs Framework
  • the 2009 Outcomes and Programs Policy
  • the Strategic Review Framework.

The drivers for the framework and structure are straightforward. The PRM must:

  • be easy to understand and implement within the Australian Government
  • not require large reinvestment of agency resources by establishing a new regime
  • lessen the existing burden of measurement and evaluation in government.

4.1.3 Australian Government Architecture Reference Models

The PRM is included alongside the other AGA reference models11, but it is distinctly different from them in both form and function (Figure 4-2). Where the other reference models are intended to assist in the development of architecture within a specific domain, or layer, of an organisation, the PRM is intended to facilitate measurement across all domains and layers of the organisation. Within the PRM Framework there are subjects, topics and terms that align with and are directly attributable to the other reference models of the AGA; when the PRM is used in concert with them, their functionality and effectiveness are significantly improved.

Figure 4-2: Linking the PRM to the AGA Reference Models and Agency Architecture - see text description below


Text description for Figure 4-2: Linking the PRM to the AGA Reference Models and Agency Architecture - Figure 4-2 illustrates the connections between the five AGA Reference Models and Agency Architecture elements consisting of Programs and Capabilities, Services and Components, Technology and Data.

For example, of the PRM domains:

  • Outcomes link to Agency Programs and Capabilities
  • Outputs and Usage link to Agency Programs and Capabilities, and Agency Services and Components
  • Work activities link to Agency Services and Components
  • Inputs link to Agency Technology and Agency Data.

Of the other AGA reference models:

  • BRM domains link to Agency Programs and Capabilities
  • SRM domains link to Agency Services and Components
  • TRM domains link to Agency Technology
  • DRM domains link to Agency Data.

 

4.1.3.1 Business Reference Model

The Business Reference Model (BRM) provides a framework that facilitates a whole-of-government functional view of the Australian Government's lines of business (LoBs), independent of the agencies performing them, providing a clear view of what government does.

The BRM is structured into a tiered hierarchy representing the business functions of the Australian Government. Business areas are at the highest level, broken down into lines of business that comprise a collection of business capabilities at the lowest level of functionality in the BRM.

At the agency level, those business capabilities are represented by business services that are enacted through the business processes created by the agency. Business processes are, in turn, delivered and supported by service components that are described in the Service Reference Model.

The BRM outlines the government business required to enable the realisation of government outcomes and business objectives, and links the operation and structure of business to the business processes supported by service components that are described in the Service Reference Model.

Figure 4-3: Combining the BRM and the PRM - see text description below


Text description for Figure 4-3: Combining the BRM and the PRM -

BRM =

What services do we deliver?
What functions of government do we enable?
What is our organisation structure?
What are our business capabilities?

BRM + PRM =

How effectively do we deliver services?
How effective are we in securing outcomes for government?
Is our structure optimal for our business?
How well are our business capabilities performing?

The PRM outcomes measurement domain is intended to support the evaluation (success or failure) of the products and services the government delivers to citizens (program outcomes), and the evaluation of internal government agency operation (business outcomes). While the BRM is not the direct source of outcome sub-types and outcome themes in the PRM, there is a strong alignment between the two reference models because of their shared relationship with the Australian Government Interactive Functions Thesaurus (AGIFT). This strong alignment allows the straightforward coupling of the PRM to the BRM based on the themes of government and the themes of outcomes the government is seeking.

When architectures combine the PRM and the BRM in the definition and documentation of agency business operations, capabilities and functions, they create not only a picture of what an agency does, but also a map of how well the agency is performing.

4.1.3.2 Service Reference Model

The Service Reference Model (SRM) is a business-driven, functional framework classifying services according to how they support business and their performance objectives. It supports agency ICT investments and asset management activities by facilitating the re-use of business components and services.

The PRM supports the evaluation of new service creation and commissioning through a fit-for-purpose evaluation within the Outputs measurement domain, and the ongoing evaluation of service performance through efficiency measurements defined within the Operations and Processes segment of the Work measurement domain. The separation of fit-for-purpose measurement and efficiency-focused measurement set out by the PRM makes it simpler to target measurement based on organisational needs (i.e. performance in the Work domain, conformance in the Outputs domain).
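This separation can be made concrete with a small sketch. The Python below is illustrative only: the measure names and thresholds are invented, not drawn from the PRM itself, and simply show fit-for-purpose (Outputs domain) checks kept apart from efficiency (Work domain) checks for the same service.

```python
# Hypothetical measures for one service, kept in separate PRM domains.
service_measures = {
    "outputs": {            # fit-for-purpose / conformance measures
        "availability_pct": 99.2,
        "meets_accessibility_standard": True,
    },
    "work": {               # efficiency measures for the delivering process
        "avg_cost_per_transaction": 4.75,
        "transactions_per_staff_hour": 60.0,
    },
}

def fit_for_purpose(measures, min_availability=99.0):
    """Conformance check: evaluates only the Outputs-domain measures."""
    out = measures["outputs"]
    return (out["availability_pct"] >= min_availability
            and out["meets_accessibility_standard"])

def cost_per_transaction(measures):
    """Efficiency reading: evaluates only the Work-domain measures."""
    return measures["work"]["avg_cost_per_transaction"]
```

Because the two domains are held separately, an agency can tighten the conformance threshold (say, to 99.5% availability) without touching its efficiency measures, and vice versa.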

Figure 4-4: Combining the SRM and the PRM  - see text description below


Text description for Figure 4-4: Combining the SRM and the PRM -

SRM =

What are our business processes?
What shared services do we have?
What processes have we automated?
What are our service capabilities?

SRM + PRM =

How efficient are our business processes?
What processes should we be improving?
Do our shared service areas have capacity?
Are our service capabilities performing to the required level?  

While the SRM will help an agency to define and understand its functional capabilities, incorporation of the PRM allows an agency to determine the performance characteristics (when is it available, how efficient is it, what is the service level agreement?) of those capabilities and target operational improvements to areas where maximum benefit will be obtained.

4.1.3.3 Technical Reference Model

The Technical Reference Model (TRM) is a component-driven technical framework categorising the standards and technologies that support and enable the delivery of services and capabilities within an agency. It provides a foundation to advance the re-use and standardisation of technology and service components from a whole-of-government perspective by forming a catalogue of the component standards implemented in government agencies.

Figure 4-5: Combining the TRM and the PRM - see text description below


Text description for Figure 4-5: Combining the TRM and the PRM -

TRM =

What products do we use?
What infrastructure do we have?
What standards do we embrace?
How many products have we got?

TRM + PRM =

How much does our technology portfolio cost us?
How much capacity do we have on our infrastructure?
How readily can our systems interoperate with one another?
How reliable is the infrastructure we have?  

Technology exists as a domain sub-type of the Inputs measurement domain. Within the sub-type are attribute classifications that are the common measurable characteristics of all variants of technology.

When the PRM and TRM are combined, not only is the 'standard' of technology used by an agency and government captured, but so are its costs, capacity, utilisation, reliability and interoperability, leading to improved strategic investment decisions and more targeted tactical decisions. Combining the TRM and the PRM helps agencies and the Australian Government to benefit from economies of scale by identifying and re-using the best solutions and technologies to support business functions, agency strategies and government outcomes.

4.1.3.4 Data Reference Model

The Data Reference Model (DRM) is a flexible, standards-based framework that enables information sharing and re-use across the Australian Government via the standard description and discovery of common data and the promotion of uniform data management practices. The DRM provides a standard means by which data may be described, categorised and shared.

The PRM considers data to be an input of business, as well as a product of business, so the measurement and evaluation of data are addressed in both the Inputs and Outputs measurement domains. If data is an output of a government activity, such as the publication of the national accounts data, the dataset would be assessed against fit-for-purpose characteristics as defined by the program requirements. As an input to a business initiative, the PRM supports the evaluation of data attributes such as standardisation, quality and relevance.

Combining the DRM and the PRM is a simple process. Where the DRM defines a dataset by specifying its functional characteristics, format, interchange protocol and metadata, the PRM describes the non-functional characteristics of the same dataset, such as continuity, reliability, completeness and accuracy.
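As a sketch of that division of labour, the record below describes one dataset twice. All field names and values are assumptions for illustration, not the actual DRM or PRM schemas; "national accounts" is used only because the text above mentions it.

```python
# One dataset, described functionally (DRM-style) and non-functionally
# (PRM-style). All field names and values are illustrative assumptions.
dataset = {
    "name": "national_accounts",
    "drm": {  # functional characteristics: what the data is, how it moves
        "format": "CSV",
        "interchange_protocol": "HTTPS",
        "metadata_standard": "AGLS",
    },
    "prm": {  # non-functional characteristics: how good the data is
        "completeness_pct": 99.8,
        "accuracy_pct": 99.5,
        "continuity_days_without_gap": 365,
    },
}

def fit_for_purpose(ds, min_completeness=99.0, min_accuracy=99.0):
    """Assess the dataset as an output against program requirements,
    using only the PRM (non-functional) side of the description."""
    prm = ds["prm"]
    return (prm["completeness_pct"] >= min_completeness
            and prm["accuracy_pct"] >= min_accuracy)
```

The DRM half of the record would drive discovery and interchange; the PRM half drives the fit-for-purpose and input-quality evaluations described above.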

Figure 4-6 illustrates the domains of the PRM that are covered by each of the BRM, SRM, DRM and TRM.  For example: BRM covers the Outcomes, Usage and Outputs domains; SRM covers the Work domain and HR,  Technology and Fixed Assets sub-types of the Inputs domain; DRM covers the Data and Information sub-type of the Inputs domain; and TRM covers the Technology sub-type of the Inputs domain.

Figure 4-6: PRM coverage of the BRM, SRM, TRM and DRM

4.1.4 Related Models and Frameworks

4.1.4.1 The United States Government Federal Enterprise Architecture Framework

The United States Federal Enterprise Architecture Framework (FEAF) PRM was designed to operate at the whole-of-government level as an instrument to assist in effective ICT investment decisions. While the AGA Framework PRM is capable of supporting that type of activity, it also operates equally effectively within the agency context, supporting organisational design, project initiation, and internal investment and performance improvement.

While the AGA Framework PRM is based on the US FEAF PRM, there are some significant differences. The intent of the PRM in both architectural contexts is the same: to deliver a line of sight between inputs and outcomes; however, the way in which this is supported and realised is different. The AGA is built around a generic business pattern that is able to be applied flexibly. In other words, the AGA Framework PRM is not as prescriptive as the FEAF PRM.

The AGA PRM also has a stronger business focus than the FEAF PRM. This is reflected in the framework of the AGA Framework PRM, which places equal emphasis on all inputs to government business, not just ICT. It also sets out generic measurement concepts for measuring business processes and supports the measurement of any type of output.

4.1.4.2 The Outcome Process Model

Figure 4-7 shows an abstract Outcome Process Model (OPM) applicable to the transformation of inputs, also known as resources, into the realisation of outcomes. The model represents the most fundamental end-to-end business process. All other higher order business processes, such as grants management processes and service delivery processes, can be identified as instances of this process.


Figure 4-7: The Outcome Process Model

The relationship with the OPM must underpin the PRM if agencies are to define an effective measurement line of sight between inputs and outcomes. The line of sight will then allow agencies to understand the cause-and-effect relationship between their input resources, process performance and outcomes realisation.

Inputs: The resources that a government agency is able to direct towards the attainment of an outcome. Examples:
  • Financial resources (money)
  • Staffing resources (people)
  • Data (information)
  • Technology resources (ICT)
  • Other physical resources (assets)

Work: The act of applying labour resources (people and technology) to other supplied resources (money, information, assets and other materials) in accordance with business operational parameters and business processes. Examples:
  • Project-based business processes
  • Processes and operations (business as usual)
  • Ad hoc tasks and activities

Outputs: The direct results arising from the application of physical or technological labour (work) and the supply of resources to business processes within a government agency. Examples:
  • Social services
  • Infrastructure
  • Campaigns
  • Legislation/regulation

Usage: The access to and utilisation of the outputs produced through work, by the intended consumer of an agency/government initiative. Examples:
  • The consumption, by a targeted demographic, of a service provided by government
  • Utilisation of public infrastructure, such as a hospital

Effects: The observable, measurable impacts that the usage of outputs has on any state of the world.12 Examples:
  • Fewer arrests for drink-driving offences
  • A decline in concentrations of pollutants found in national waterways

Outcomes: The qualitative, cumulative impacts that can be inferred through the observation/measurement of the effects produced as outputs are used. Outcomes cannot be directly measured. Examples:
  • Greater efficiency in government functions
  • Reduced environmental impact of heavy industry in Australia
  • Reduction of long-term unemployment
4.1.4.3 Key concepts of the Outcome Process Model

While the model is robust in its application to the process of transforming inputs into outcomes, it does not include a formal representation of time. Time is an implicit attribute of the model components of work and usage. Work is effort applied over time, and usage is consumption conducted over time.

Determining the efficiency and effectiveness of an initiative is often contingent on the attribute of time. 'Were outputs delivered on time? Was work completed in a timely manner? Were deliveries made on time?' For this reason, efficiency and effectiveness are embedded in the model as attributes of work and usage.

Project management methodologies, such as PMBoK and Prince2, do not make a distinction between project outputs and outcomes and often use the two terms interchangeably. The OPM emphasises the difference between outputs and outcomes, and that separation carries through to the PRM framework. While the PRM is suitable for the effective measurement of project performance, care needs to be taken to ensure that stated project outcomes and outputs are correctly identified within scope statements in order for the model to deliver maximum value.

The three types of work that may be conducted within agencies—ad hoc tasks, projects and operational processes—are not directly represented in this model, as they possess common characteristics and fundamentally achieve the same result: the production of an output. For more information on the handling of work by the PRM, see the detailed PRM Framework and Implementation Guide.

Demonstrations of the OPM's applicability to business activity planning and operational management are provided in Section 4.9 Demonstrating the Outcome Process Model.

4.1.4.4 Department of Finance and Deregulation's 'Outcomes and Programs' Policy

The AGA Framework PRM is designed to support and reinforce the 'Outcomes and Programs' guidance issued by the Department of Finance and Deregulation for the development of agency outcome statements and the formation of effective program key performance indicators (KPIs).

This alignment makes the PRM:

  • explicitly applicable to all agencies within federal government, as all agencies have budget outcome statements that can easily be mapped to the PRM
  • simpler for agencies to implement, as many measurements will already be established
  • a logical choice of measurement framework for agencies.

The PRM is structured in a way that supports existing agency measurement and reporting obligations set out by the Department of Finance and Deregulation's Outcomes and Programs policy (Figure 4-8), and extends the traceability of measurement and accountability.

Figure 4-8 illustrates how the PRM supports agency measurement and reporting obligations set out by the Department of Finance and Deregulation's Outcomes and Programs policy, by linking reporting requirements to their legal source.  Further explanation can be found in the surrounding text.

Figure 4-8: The Department of Finance and Deregulation Outcomes and Programs Policy

Under the Outcomes and Programs arrangements, all General Government Sector agencies are required to report annually to parliament on the progress of program implementation and the success of programs in realising outcomes.

The intent of the Outcomes and Programs Policy was to introduce program-based reporting that was able to span traditional portfolio and agency boundaries and deliver transparency and accountability for government spending through an increased emphasis on performance management, measurement and reporting.

Outcomes are defined in outcome statements, and they articulate the objectives of government.13 In the financial operation of the federal government, outcome statements serve three main purposes:

  • to explain the purposes for which annual appropriations are approved by the parliament for use by agencies
  • to provide a basis for annual budgeting, including financial reporting against the use of appropriated funds and the control thereof
  • to measure and assess agency and program non-financial performance in contributing to government policy objectives.

Programs are the logical building blocks of government operation that provide a tangible link between the decisions of government, government activities, the deliverables of government and the observable impacts of those actions and deliverables.

The PRM and the Outcomes and Programs framework (Figure 4-9) align very closely when the PRM is applied at the government program level. The PRM is designed to apply at multiple levels and supports agency and business unit performance reporting.

Figure 4-9 illustrates how the PRM aligns with the Outcomes and Programs framework at the government program level. Further explanation can be found in the surrounding text.

Figure 4-9: The PRM's alignment with outcomes and programs

The process model that underpins the PRM can be applied at any level of an organisation or program. It can be applied effectively at lower functional levels of government (project, team, branch etc.) to support any initiative (social, financial, environmental), and is able to establish an effective performance measurement and evaluation line of sight extending beyond the program deliverables of government agencies.

In this way, the PRM is able to support not only Outcomes and Programs Policy implementation (Figure 4-10), but also more detailed reporting for agency managers.

Figure 4-10 illustrates how the PRM is able to support more detailed reporting for agency managers outside the Outcomes and Programs Policy implementation.  Further explanation can be found in the preceding text.

Figure 4-10: The PRM operating beyond outcomes and programs

4.1.4.5 Other References

  • Australian Government Information Management Office (2009). Australian Government Architecture Framework, Version 3.0, Commonwealth of Australia, Canberra, Australia.
  • Australian National Audit Office. Better practice guide: Performance management, Commonwealth of Australia, Canberra, Australia. http://anao.gov.au/Publications/Better-Practice-Guides
  • Australian National Audit Office (2007). Better practice guide: Corporate governance, Commonwealth of Australia, Canberra, Australia. http://anao.gov.au/Publications/Better-Practice-Guides
  • Department of Finance and Deregulation (2009). Outcomes and Programs Policy, Commonwealth of Australia.
  • Eccles RG (1991). 'The performance measurement manifesto', Harvard Business Review, January 1991, Harvard Business School Publishing Corporation, USA.
  • Federal Enterprise Architecture Program Management Office (2005). How to use the Performance Reference Model, United States Government, USA. http://www.cio.gov
  • Kaplan RS and Norton DP (2005). 'Using the balanced scorecard as a strategic management system', Harvard Business Review, Harvard Business School Publishing Corporation, USA.
  • Kaplan RS and Norton DP (2005). 'Having trouble with your strategy? Then map it', Harvard Business Review, Harvard Business School Publishing Corporation, USA.
  • National Archives of Australia (2005). Australian Government Interactive Functions Thesaurus (AGIFT), 2005, Commonwealth of Australia. http://www.naa.gov.au
  • PRINCE2 (Projects in controlled environments) website: http://www.prince2.com
  • Project Management Institute (2009). A guide to the project management body of knowledge (PMBoK Guide), 4th edition, Project Management Institute, USA. http://www.pmi.org
  • Smyrk J (1995). The ITO Model: a framework for developing and classifying performance indicators, Australasian Evaluation Society, International Conference, Sydney, Australia.

4.1.5 Use by Australian Government Agencies

While the PRM facilitates the development of measurement systems for all levels of and processes within government, it is not envisaged that agencies will define measurement indicators within every measurement grouping, of every measurement category, of each measurement area and domain. Agencies should make their own determinations as to what measurements are necessary and appropriate in meeting their operational and strategic needs, and implement them accordingly.

Many agencies are already measuring aspects of their business in order to support the reporting requirements and management of the programs that they support and deliver. The PRM may be useful to such agencies in identifying any gaps in measurement systems, suggesting alternative approaches to the application of measurement, and possibly in fine tuning existing measurement systems.

Direct benefits resulting from the implementation of the PRM are first realised at the agency level. As usage increases throughout government and agency implementations become more mature, indirect benefits are then realised at the whole-of-government level.

4.1.5.1 The PRM as an evaluation tool

The PRM is able to evaluate any form of business process and any form of business output (Figure 3-6). It operates effectively at a major government program and portfolio level and at the functional level of an individual team.

It helps define government programs by quantifying government outcomes and promoting the alignment of program outputs and activities. At the same time, it supports the delivery of program outcomes by defining and tracking relevant performance information that is needed for effective evaluation and management.

In the same way, it helps define agency business plans by translating the outcomes demanded by government programs into quantifiable agency-level outcomes and promoting the alignment of agency outputs and activities. The PRM supports the delivery of agency outcomes by defining and tracking the performance information needed for agency-based management and activity valuation.

This pattern continues down the organisational structure.

Figure 4-11 shows how the PRM: helps define government programs, performance information, and agency plans; and supports the delivery of government outcomes; down through the organisational structure, at the government, portfolio, agency/department, division, branch, section and team levels.  Further explanation can be found in the surrounding text.

Figure 4-11: PRM support for cascading planning and execution

The PRM is able to:

  • support the planning, execution and evaluation of business activities at both a macro and a micro level
  • support portfolio, program and project evaluation (P3M3), as well as individual project evaluation
  • assist agencies to develop strategic and business plans at the same time as assisting individual sections and teams to define work plans.

The PRM (Figure 4-7) also supports non-hierarchical business operation models that have an iterative process leading to outcome realisation, where the outputs of one process are intermediate and are not intended to realise an outcome in their own right.

Figure 4-12 shows how the outputs of lower order work processes are used as inputs to higher order work processes, along with additional resource inputs.  This applies to all levels of work from sub-projects to projects/processes to programs to branch to agency to portfolio.  Further explanation can be found in the surrounding text.

Figure 4-12: PRM aggregation

In a non-hierarchical business operation model:

  • the outputs of lower order work processes are taken as 'value-added' inputs to higher order work processes, along with additional resource inputs
  • usage of value-added inputs in higher order work processes represents usage of outputs produced in the lower order work processes
  • a portion of overall outcome realisation (benefit) can be attributed to the delivery of outputs at lower levels, and this line of sight can be established, traced and measured.

For example: a project has an ICT system as a deliverable. Delivering the ICT system will not in itself realise an outcome, but when the system is implemented to support a business process, it is used and an outcome can be realised.
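The aggregation described above can be sketched as nested work processes. This minimal Python model is one assumed structure for illustration, not part of the AGA: each process turns inputs into an output, and a lower-order process's output becomes a value-added input to the process above it.

```python
from dataclasses import dataclass, field

@dataclass
class WorkProcess:
    """A unit of work that turns inputs into a single named output."""
    name: str
    resource_inputs: list                         # direct resources (money, people, ...)
    feeders: list = field(default_factory=list)   # lower-order work processes

    def output(self) -> str:
        return f"output of {self.name}"

    def all_inputs(self) -> list:
        # Direct resources plus the value-added outputs of lower-order work.
        return self.resource_inputs + [p.output() for p in self.feeders]

    def trace(self, depth=0) -> list:
        # Line of sight from this process down through its feeders.
        lines = ["  " * depth + self.name]
        for p in self.feeders:
            lines += p.trace(depth + 1)
        return lines

# Invented example mirroring the ICT-system scenario in the text: the
# project's deliverable only contributes to an outcome once it is consumed
# as an input by the business process it supports.
project = WorkProcess("ICT system project", ["budget", "developers"])
business = WorkProcess("service delivery process", ["staff"], [project])
```

Walking `business.trace()` top-down reproduces the line of sight, and a share of the higher-order process's benefit can then be attributed back to the project whose output it consumed.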

This flexibility translates into implementation within a government agency in a manner consistent with the operational and strategic needs of the agency.

The PRM supports the operation and improvement of both core and extended business capabilities at any level within any government agency. Some examples of where the PRM can be of particular benefit to an agency include:

  • ongoing business process performance improvement
  • management of service level agreements
  • compliance with customer charter provisions
  • employee performance management
  • benchmarking and improving ICT performance
  • internal investment management
  • portfolio, program and project management
  • Portfolio, Programme and Project Management Maturity Model (P3M3) evaluation
  • ITIL-based service management processes.

For more information on how the PRM supports and extends these activities, consult the AGA How to Use Guide.

4.1.5.2 Direct Benefits of the Performance Reference Model

The direct benefits to be realised by an implementing agency include:

Direct Benefit: An increased ability to develop accurate cost models for ICT activities, and support for realistic agency ICT budgetary allocations. This is achieved by:

  • Establishing a framework that measures resource utilisation and value (costs) at all levels of agency operation: from the costs of resources consumed by ICT, to the costs attributable to business process execution, to the value of the outputs produced, and finally to the costs associated with promoting and ensuring output use by customers
  • Transparently demonstrating to business the true cost of ICT operations attributable to the realisation of specific business outcomes.

Direct Benefit: An increased effectiveness of internal capital investments. This is achieved by:

  • Supporting the development of transparent, objective agency business cases that are based on standardised language and objective measurements, and are able to clearly demonstrate the contribution of the proposal to overall business objectives.

Direct Benefit: An increased alignment between government outcomes, business initiatives and ICT operations. This is achieved by:

  • Promoting and supporting a planning framework that seeks clear and measurable definition of desired outcomes first, then outputs (deliverables), work (processes), and inputs (resources).

Direct Benefit: Increased strategic and tactical effectiveness. This is achieved by:

  • Providing decision makers with factual data on agency operations spanning input allocation through to outcome realisation, independent of hierarchical or functional boundaries, and by articulating the cause-and-effect relationships that exist between elements of the business.

Direct Benefit: Increased efficiency of business operations (management). This is achieved by:

  • Supporting the definition and gathering of measurement information that can be used to determine performance at all levels of business operation.

Direct Benefit: Increased transparency in operations and reporting on progress and performance. This is achieved by:

  • Providing a standardised language for measurement and delivering a framework for the definition of objective measurement indicators able to withstand scrutiny (these indicators are quantifiable, contextual and comprehensive, and are based on scientific method).
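The first direct benefit describes rolling costs up from ICT resources, through business process execution, to output use. A minimal sketch of that roll-up is shown below; the cost layers, labels and figures are invented for illustration and are not drawn from the PRM.

```python
# Hypothetical cost layers for a single business outcome; figures are invented.
ict_resource_costs = {"servers": 120_000, "licences": 30_000}
process_execution_costs = {"claims processing staff": 250_000}
output_promotion_costs = {"customer outreach": 40_000}


def total_attributable_cost(*cost_layers: dict[str, int]) -> int:
    """Sum every layer, from ICT resources consumed through to promoting
    output use, giving the total cost attributable to the outcome."""
    return sum(sum(layer.values()) for layer in cost_layers)


print(total_attributable_cost(ict_resource_costs,
                              process_execution_costs,
                              output_promotion_costs))  # 440000
```

Capturing costs per layer, rather than as a single agency-wide figure, is what allows the true cost of ICT operations to be attributed to specific business outcomes.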

4.1.5.3 Indirect Benefits

The indirect benefits that are cumulatively realised at the whole-of-government level as agency implementations mature include:

Indirect Benefit: Support for the Declaration of Open Government. This is achieved by:

  • Establishing a culture of transparent, objective measurement of government programs, and implementing a framework that supports the definition and gathering of a comprehensive dataset for government programs that can be shared with third parties.

Indirect Benefit: Increased interoperability across government. This is achieved by:

  • Standardising the language all agencies use to describe their measurement practices, and by establishing a baseline technique for defining measurement indicators that can be readily understood by all other government agencies, the public, and third-party industry and commercial enterprises.

Indirect Benefit: Improved business case development across government and increased value from government investment management. This is achieved by:

  • Facilitating the development of objective business cases based on transparent, factual data in all government agencies, allowing effective comparison between business case proposals and the directing of investment to proposals that represent the greatest value for money for government.

Indirect Benefit: Increased ability to design and deliver efficient and effective programs to support government outcomes, independent of APS portfolio boundaries. This is achieved by:

  • Maintaining a component and function performance repository that allows program designers to plan for and use the most efficient and effective processes and delivery mechanisms.

  1. http://www.finance.gov.au/presentations/docs/speaking-notes-for-David-Tune-presentation-18-08-2010.pdf
  2. J Smyrk, The ITO Model: a framework for developing and classifying performance indicators, Australasian Evaluation Society, International Conference, Sydney, Australia, 2005. The ITO Model is used under a Creative Commons Version 2.5 licence.
  3. The OPM is based on the Inputs–Transformation–Outcome (ITO) model of business process theory developed by John Smyrk. The model has been widely adapted within the economics and finance professions and is included here because it describes abstractly the unbreakable relationship that exists between outcomes and inputs.
  4. AGIMO, Australian Government Architecture Framework, version 2.0, Commonwealth of Australia, 2009, Canberra.
  5. A 'state of the world' is defined as the 'observable reality that surrounds government'.
  6. Department of Finance and Deregulation, 'Outcomes and Programs Policy', Commonwealth of Australia, 2009.

Contact for information on this page: AGA@finance.gov.au

Last updated: 19 September 2013