OPIF Manual (Philippines)
ORGANIZATIONAL PERFORMANCE INDICATOR FRAMEWORK
SECTIONS OF A MANUAL FOR IMPLEMENTATION
PURPOSE OF THIS MANUAL

The purpose of this manual is to provide the staff of departments and agencies of the Government of the Philippines with specific guidelines for implementing the Organizational Performance Indicator Framework (OPIF). The manual takes a step-by-step approach to the OPIF process, providing examples from OPIF experience to date and indications of how the components are intended to be implemented. While the manual functions as a stand-alone document, it is intended to be supplemented by training; in this regard, it also serves as a reference document. The manual also contains information from a readiness assessment carried out in two departments: the Department of Budget and Management (DBM) and the Department of Social Welfare and Development (DSWD). The readiness assessment covered two major dimensions: readiness for the type of change required for OPIF to be successfully implemented, and technical readiness to implement OPIF. This manual is not complete. Further work is required to bring it to the stage of a useful tool. Part of the required work is internal to the Government of the Philippines; part will be assisted by a new ADB RETA.
ORGANIZATION OF THIS MANUAL

This manual is organized into four main sections intended to guide departments through the initial OPIF processes. It starts with an overview of the context in which OPIF is being implemented, to give readers an understanding of the rationale behind OPIF. Sections 1.0 to 3.0 focus on the three key initial OPIF processes: identifying results, documenting them, and then developing indicators to measure them. The fourth section provides guidance on how departments can assess their readiness to undertake the OPIF process.
Organizational Performance Indicator Framework Context

1.0 Identifying Outputs and Outcomes
    1.1 Results Chain
    1.2 Rules of Thumb for Distinguishing Results
    1.3 Developing MFOs and Outcomes

2.0 Documenting Major Final Outputs and Outcomes
    2.1 What is a logframe?
    2.2 Example of a Logframe
    2.3 Components of a Logframe
    2.4 Why are logframes useful?
    Developing a Logframe
        Step 1: Planning for Logframe Development
        Step 2: Constructing the Logframe
        Step 3: Finalizing the Logframe

3.0 Developing Indicators
    3.1 What are indicators?
    3.2 Steps for Developing Indicators
    3.3 Data Collection
    Developing Indicators
        Step 1: Clarify Outputs and Results
        Step 2: Generate Lists of Indicators
        Step 3: Assess Indicators

4.0 Assessing Departmental Readiness for Organizational Performance Indicator Framework
    4.1 Change Readiness
    4.2 Technical Readiness
    4.3 Departmental Examples
ORGANIZATIONAL PERFORMANCE INDICATOR FRAMEWORK CONTEXT

OPIF is being undertaken in the context of public expenditure management (PEM) in the Philippines, which covers all levels of the public sector, from the national government to the corporate sector and local government. The overall goal of PEM is to reduce poverty in the long term by addressing two key dysfunctions of the current system:
• inadequacy of resources for basic services, and
• deteriorating fiscal position of government.

PEM uses the following mechanisms:
1. the budget as an instrument for ensuring desired results—that is, the principal tool to enforce the PEM program is the government budget;
2. strengthening existing incentive structures to advocate and implement reforms;
3. enlisting the assistance of civil society, particularly in monitoring results; and
4. establishing clear targets and assessment mechanisms to ensure transparency, in terms of both disseminating the desired results and monitoring them.

The PEM framework includes three objectives:
• fiscal discipline, or living within means;
• allocative efficiency, or spending money on the right things; and
• operational efficiency, or obtaining the best value for money.

OPIF is being implemented under the second objective, allocative efficiency. In terms of allocative efficiency, the government has refined the medium-term public investment program (MTPIP) to clearly define the public investment priorities of the public sector over the next 6 years. Within this context, OPIF is a process whereby programs and projects are ranked and funded according to their priority and relevance to the desired outcomes.
What is OPIF?

The key to a results-based approach to the budget is OPIF, the strategy whereby government is able to establish priority expenditures, identify targets, assess accomplishments, and report results.
OPIF is defined by the following five strategies:
• a shift to output/outcome results measured by performance indicators;
• clarification of the expected performance and accountability of government agencies through these results;
• encouragement to agencies to focus efforts on the delivery of outputs relevant to their goals;
• establishment of an integrated performance management system in which organizational performance targets are cascaded down to lower-level units and used as the basis of performance-based compensation; and
• reporting to the public and to Congress, in clear terms, the outputs of departments/agencies.
Initial OPIF process

The initial OPIF process involves the following steps:
1. identification of major final outputs (MFOs) and outcomes by both the line agency and the oversight agencies of government (i.e., DBM, NEDA, and DOF) on a parallel basis (both outputs and outcomes are referred to as “results”);
2. documenting MFOs and outcomes in a framework that shows the linkages between the different levels of results;
3. identification of programs/projects that contribute to the realization of MFOs and desired outcomes; and
4. determination of performance indicators for each MFO and outcome (these performance indicators become the gauge by which the performance of the program or agency will be measured).
Reporting to Congress

In addition to improving a department or agency’s performance through better management and monitoring of outputs and outcomes, OPIF serves an important reporting function. Each budget submission from departments and agencies will be formulated according to MFOs, and performance will be reported against MFOs and outcomes using indicators. These OPIF budgets will then be compiled by DBM and submitted to Congress for approval. Congress will thus be able to review, in concrete terms, the outputs of departments, the results delivered, and budget requests according to MFOs.
1.0 IDENTIFYING MAJOR FINAL OUTPUTS AND OUTCOMES The first step in the OPIF process is the identification of key results to be delivered by the department. These results are usually referred to as major final outputs (MFOs) and outcomes. To identify MFOs and outcomes, it is important to have a clear understanding of the differences between the types of results and how they are related to the activities and interventions that the department undertakes.
1.1 Results Chain
It may be helpful to think of results in a sequence or chain, leading from activities and processes to long-term goals such as poverty reduction. Each result in the chain is a “link” and is joined to the other results in the chain by causality. The chain starts with projects, activities, and programs (PAPs) and moves through MFOs to outcomes and finally to higher-level goals at the sectoral and societal levels. The diagram below shows the linkage between these different levels. The key level for OPIF is the MFO level. Each of the other levels can be defined in relation to MFOs:
• activities are “how” MFOs are produced;
• outcomes and higher-level goals are the reason or “why” MFOs are produced; and
• for the MFOs themselves, there is a need to know “what” is produced and for “whom.”
Why
• Societal Goal: societal benefits resulting from sectoral changes
• Sectoral Goals: longer-term benefits in the sector resulting from organizational changes
• Organizational Outcomes: changes external to the department that result from its services/products/goods

What and Who
• MFOs: products, goods, and services delivered to external clients/stakeholders. What is produced? Who receives the services/products?

How
• PAPs: the key activities and processes undertaken by the department to produce MFOs
How do we distinguish between PAPs, MFOs, and outcomes? To distinguish results, try to remember the following:

PAPs consist of the key activities and processes undertaken by the department/agency to achieve MFOs, outcomes, and goals.

MFOs are defined as “the goods and services that a department or agency is mandated to deliver to external clients through the implementation of programs, activities, and projects.” MFOs are tangible and can be more easily quantified than outcomes and goals.

Organizational outcomes are changes external to the department that result from the implementation of MFOs and are directly influenced by MFOs.

Sectoral goals are the longer-term sectoral changes/impacts that result from the organizational outcomes. These should be found in higher-level documents such as the Medium-Term Philippine Development Plan (MTPDP) or strategy planning matrices (SPMs).

The societal goal describes the societal benefit derived from sectoral changes. It, too, should be found in higher-level documents such as the MTPDP or SPMs.
1.2 Rules of Thumb for Distinguishing Results
Several rules of thumb can be used to differentiate between MFOs and the other levels of results and activities. The hardest part is to identify the difference between MFOs and outcomes. The following rules of thumb can help to understand the difference.
Table 1: Rules of Thumb for Distinguishing between Different Results Levels

Societal goal
• Relation to dep’t.: small contribution, along with a wide range of actors
• Control: no control
• Attribution: little or no attribution to the dep’t.
• Accountability: dep’t. accountable for documenting plausible logic for moving from MFOs to outcomes to goals
• Time frame: very long term
• Change: only changed when higher-level government strategy changes

Sectoral goal
• Relation to dep’t.: contributed to by the outcomes, along with other actors involved in the sector
• Control: no control
• Attribution: low level of attribution to the dep’t.
• Accountability: dep’t. accountable for documenting plausible logic for moving from MFOs to outcomes to goals
• Time frame: long term
• Change: only changed when the MTPDP changes

Organizational outcomes
• Relation to dep’t.: directly influenced by the dep’t.’s MFOs, but external to the dep’t. and also influenced by other sector actors and external events
• Control: controlled by stakeholders and partners external to the dep’t.
• Attribution: attributable to the influence of the dep’t.’s MFOs, and also attributable to other partners and stakeholders
• Accountability: dep’t. accountable for managing toward the outcomes and changing approach if outcomes are not forthcoming
• Time frame: medium term; should be realistically achievable by the end of a project or program
• Change: only changed when the MTPDP changes

MFOs
• Relation to dep’t.: produced by the dep’t.
• Control: controlled by the dep’t.
• Attribution: 100% attribution to the dep’t.
• Accountability: dep’t. accountable for producing outputs—quantity, quality, cost, and timing
• Time frame: produced annually
• Change: changed as required by the dep’t. to try to ensure outcomes are achieved

PAPs
• Relation to dep’t.: undertaken by the dep’t.
• Control: controlled by the dep’t.
• Attribution: 100% attribution to the dep’t.
• Accountability: dep’t. accountable for processes and interventions
• Time frame: undertaken continuously
• Change: changed as required by the dep’t.

MFOs = major final outputs; MTPDP = Medium-Term Philippine Development Plan; PAPs = projects, activities, and programs; % = percent
Source: Mcmillan, Liam, and Bernard Woods. Asian Development Bank Consultants.
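The control rule of thumb in Table 1 lends itself to a simple decision helper. The Python sketch below is illustrative only; the function name and return labels are assumptions for demonstration, not part of OPIF.

```python
def classify_by_control(controlled_by_dept, influenced_by_dept):
    """Rule of thumb from Table 1: results the department fully controls are
    PAPs or MFOs; results it can only influence are organizational outcomes;
    results it can neither control nor influence are higher-level goals."""
    if controlled_by_dept:
        return "PAP or MFO"
    if influenced_by_dept:
        return "organizational outcome"
    return "sectoral or societal goal"

# A delivered training course is fully within the department's control.
print(classify_by_control(True, False))    # PAP or MFO
# Improved client productivity is influenced by MFOs but not controlled.
print(classify_by_control(False, True))    # organizational outcome
# Poverty reduction is beyond both control and direct influence.
print(classify_by_control(False, False))   # sectoral or societal goal
```

Distinguishing between control and mere influence is the same test given in the tips for developing results later in this section.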
1.3 Developing MFOs and Outcomes
The key tasks involved in developing MFOs and outcomes for a department are as follows:
• identify PAPs, MFOs, organizational outcomes, and goals; and
• logically link the components.

As an aid to developing and linking the components, you can think about the relationships among PAPs, MFOs, outcomes, and goals as a series of if…then statements. Take the example of a training workshop. One of the PAPs might be training delivery. Using if…then logic, we can try to develop a logical sequence:
• if the training is delivered, then a certain number of training participants receive training (a certain number of people trained in particular skill areas is an example of an MFO for capacity building);
• if participants receive skills and knowledge, then they will transfer their knowledge and skills to the organization, resulting in improved organizational productivity (organizational outcome); and
• if the organization’s productivity improves, then this contributes to the achievement of sectoral goals, and so on.

Another way of developing a logical sequence is to work from the PAPs to the goals. This works well for existing departmental programs.
• You could start by asking “what is it that we do?” We deliver training (PAP).
• You can then ask “why”: Why are we delivering training? The answer to the question becomes the MFO—skills and knowledge for clients.
• Why do we provide training workshops to clients? The answer to this question could become the organizational outcome—i.e., to improve the capacity of the organization.
What? PAPs → Why? → MFOs → Why? → Outcomes → Why? → Goals
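The if…then chaining described above can be sketched programmatically. In this illustrative Python fragment, the level labels follow OPIF, but the specific results in the chain are assumed examples.

```python
# Illustrative results chain for the training example above (the level
# labels follow OPIF; the specific results are assumptions for demonstration).
chain = [
    ("PAP", "training is delivered"),
    ("MFO", "participants are trained in project management skills"),
    ("organizational outcome", "organizational productivity improves"),
    ("sectoral goal", "sector performance improves"),
]

def if_then_statements(chain):
    """Render each causal link in the chain as an 'if ... then ...' statement."""
    return [
        f"If {cur}, then {nxt} ({level})."
        for (_, cur), (level, nxt) in zip(chain, chain[1:])
    ]

for statement in if_then_statements(chain):
    print(statement)
```

Reading the generated statements aloud is a quick plausibility check: any link that sounds implausible marks a gap in the logic.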
You could also try to develop departmental results by starting with the goals and moving backward to the activities. This can work well when the initiative is in the planning stages. Starting with the societal goal, you could then ask “how do we achieve this goal?” The answer to the question becomes your sectoral goal statement. You can then ask how you will achieve the sectoral goal. The answer to this question becomes your organizational outcome statement, and so on.
Societal Goal → How? → Sectoral Goals → How? → Outcomes → How? → MFOs → How? → PAPs
Tips for Developing Results
• Brainstorm the PAPs, MFOs, outcomes, and goals.
• Select the key activities (which activities are most likely to lead to MFOs?).
• Alternatively, start with the goals and work back to the PAPs.
• Logically link the components. Consider which PAPs most plausibly lead to MFOs, and whether MFOs plausibly lead to organizational outcomes, and so on.
• Keep the results as focused as possible—i.e., balance detail with the need for clarity and focus.
• If you can control it, it is a PAP or an MFO; if you can only influence it, it is an outcome.
• Check the logic of your results by using if…then logic or by asking how and why questions.
2.0 DOCUMENTING MAJOR FINAL OUTPUTS AND OUTCOMES

How do I document departmental results? To document the results for a department, you need some way of writing them down in a format that shows the logical, causal connections between them. One such framework is called a logframe, or logical framework. Logframes can serve as a road map for monitoring performance. They help identify and distinguish between PAPs, MFOs, outcomes, and goals. This section describes the logframe, provides illustrative examples, and presents a step-by-step approach to developing one.
2.1 What is a logframe?
A logframe is essentially a graphical representation of a department, program, project, or any organization. It depicts what changes are expected to occur, showing the logical linkages among PAPs, MFOs, organizational outcomes, and goals (sectoral and societal). Logframes illustrate the theory or rationale behind the work of a department. When we talk about theory in this case, we are generally referring to people’s expectations of how department programs will work. The logframe can graphically show what the department does, the services and/or products it produces, and what it intends to achieve.
A logframe has many applications. It can be applied to a program, project, policy, organization, department, new initiative, etc. For the sake of simplicity, the term “department” will be used throughout this section of the manual. Please keep in mind all the different logframe applications.
A logframe is
• a graphical way of showing the logical links between the PAPs, MFOs, organizational outcomes, and sectoral and societal goals of a department; and
• a depiction of the process of change, starting with PAPs and leading to goals.

A logframe is not
• an organizational chart – it does not depict organizational structure and functions; or
• an action plan – it does not show task assignments, milestones, timelines, or resource allocations.

2.2 Example of a Logframe
There are different graphical formats for logframes. If you were to search the Internet for examples, you would see many different formats, such as the matrix style and the flow chart (box-and-wire) model. In addition, logframes can flow either horizontally or vertically. This chapter focuses on the flow chart logframe (see the example on the following page). This model has some advantages and unique features. The format allows the reader to see the logical flow of the work of the department. In addition, this type of logframe is typically contained on one page, thus encouraging a more focused and streamlined model.
[Example flow chart logframe]
2.3 Components of a Logframe
As shown in the previous example, a logframe is broken down into components that are linked sequentially: PAPs should lead to MFOs, MFOs should lead to organizational changes (organizational outcomes), and so on. Some variations in terminology are used for the various components of the logframe. The mapping below shows other terms commonly used for the components. In these cases, outcomes refer to the shorter- and medium-term changes that result from the activities and the outputs, while impacts refer to the broader changes to sectors, communities, or society at large. The terms have similar meanings to those used in OPIF.

OPIF Terms → Alternate Terminology
• Societal Goal and Sectoral Goals → Impacts
• Organizational Outcomes → Outcomes
• MFOs → Outputs
• PAPs → Activities

Source: Mcmillan, Liam, and Bernard Woods. Asian Development Bank Consultants.
2.4 Why are logframes useful?
Logframes are useful tools for planning and performance monitoring. They are also useful for communicating the work of the department to other stakeholders and partners. It is important to note that the process of developing a logframe can itself yield a number of benefits. Logframe development is generally a team effort with input from a variety of players, and this consultation process typically produces a clearer understanding of, and consensus around, key activities, outputs, and objectives.

Logframes also have the following uses:
• they provide a model or reference point against which to assess whether departmental programs are being implemented and whether they are achieving the desired outputs and results;
• they identify the limits, logical gaps, and potential of a department;
• they can help set priorities for allocating resources by identifying activities and outputs critical to goal attainment; and
• they provide direction with respect to key performance indicators for ongoing monitoring, as well as evaluation issues, questions, and methodologies.
Developing a Logframe

Step 1: Planning for logframe development
• identify intended users and key decision makers;
• review documentation;
• consult with key people; and
• identify appropriate strategies.

Step 2: Constructing the logframe
• build and logically link the components.

Step 3: Finalizing the logframe
• validate the draft logframe.
Step 1: Planning for Logframe Development

If you are preparing a logframe for your department, plan the tasks at hand. A task checklist is provided in this subsection as an aid to logframe preparation. Prior to reviewing documents and consulting with key players and stakeholders, the main purpose of the logframe must be clear. Under OPIF, the logframe is intended to position the MFOs of the department in relation to the PAPs and longer-term results. Once MFOs are specified, budget can be allocated to each of them.

A good logframe requires a full understanding of the department, its activities, products, and services, and the expected benefits of its programs and services. Subsequently, relevant documents must be reviewed and key players consulted. When reviewing documents, start with the most strategic ones first (relevant legislation, regulations, policies, etc.). Also review performance monitoring and/or evaluation reports, and any narrative descriptions of the department. Before actually constructing the logframe, it is useful to consult a few key people to ensure a clear understanding of the program. A number of questions can be asked at this stage.

Key Questions to Ask
• What is the mission/mandate of the department?
• What key benefits are expected from the department and its programs? (outcomes)
• Who are the clients? Who are the stakeholders? (reach)
• What are the key products/services/goods produced? (outputs)
• How should the programs be undertaken to achieve outcomes and goals? (activities)

Table 2: Planning Tasks for Logframe Development (√)
• Clarify the main use for the logframe (e.g., monitoring MFOs?).
• Identify the key users of the logframe (i.e., program staff, management, policy staff, codelivery partners, clients, and other beneficiaries?).
• Identify the key decision maker(s). Who has authority to sign off on or approve the logframe?
• Review relevant documents.
• Consult with key group representatives. Identify who should be consulted during the development of the logframe (i.e., potential users of the logframe, decision makers).
• Decide on a strategy for developing the logframe: develop the logframe during workshops, or present a draft logframe to relevant stakeholders?
Source: Mcmillan, Liam, and Bernard Woods. Asian Development Bank Consultants.
Developing the Logframe

Before the logframe is actually constructed, an approach or strategy for its development should be decided. Bringing key stakeholders together and brainstorming is recommended: identify the key components of the logframe and develop the draft logframe with the group. Advantages of this approach include
• increased buy-in,
• improved capacity of stakeholders to develop logframes, and
• stakeholder needs being taken into account.
Step 2: Constructing the Logframe

To construct the logframe, you must first develop all the results as described in Section 1.3. Several different methods can be used to construct the logframe with these results:
• Write the PAPs, MFOs, outcomes, and higher-level goals individually on pieces of card or paper. Then arrange them in sequence and link them. The individual results can be attached or pasted to a wall, or arranged in sequence in a table.
• On a whiteboard, divide the area into the five different result levels. Then write in the PAPs, MFOs, outcomes, and higher-level goals individually and draw the connections between them.
• In drawing software such as PowerPoint, create text boxes for the PAPs, MFOs, outcomes, and higher-level goals, and link them using connectors or arrows.
Assessing the Logframe

A good logframe has a number of characteristics. When reviewing a logframe, ask these questions:
• Is it logical? Check the logic with if…then statements, or with why and how questions.
• Is it focused? A key challenge in developing a logframe is to balance the need for detail with the need for focus. A focused logframe is a more useful communication and monitoring tool.
• Does the logframe build on the strategic direction and objectives of the department?
• Can external stakeholders understand the program by looking at the logframe?
• Do key departmental staff and other stakeholders agree with the model?
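The "is it logical?" check can be approximated mechanically: every component below the societal goal should link upward to a component at the next result level. The following Python sketch is a hypothetical aid, not an OPIF requirement; the data structure and function names are assumptions for illustration.

```python
LEVELS = ["PAP", "MFO", "organizational outcome", "sectoral goal", "societal goal"]

def check_logframe(components, links):
    """components: {result name: level}; links: {result name: higher result}.
    Flags any component (below the societal goal) without an upward link,
    and any link that skips or reverses levels."""
    problems = []
    for name, level in components.items():
        if level == "societal goal":
            continue
        target = links.get(name)
        if target is None:
            problems.append(f"{name!r} ({level}) has no upward link")
            continue
        expected = LEVELS[LEVELS.index(level) + 1]
        if components.get(target) != expected:
            problems.append(f"{name!r} should link to a {expected}")
    return problems

# Illustrative training example (the result names are assumed for demonstration).
components = {
    "deliver training": "PAP",
    "clients trained": "MFO",
    "client capacity improved": "organizational outcome",
    "sector productivity up": "sectoral goal",
    "poverty reduced": "societal goal",
}
links = {
    "deliver training": "clients trained",
    "clients trained": "client capacity improved",
    "client capacity improved": "sector productivity up",
    "sector productivity up": "poverty reduced",
}
print(check_logframe(components, links))  # [] -> the chain is logically complete
```

An empty result means every link in the draft logframe connects adjacent levels; any reported problem points to a gap to discuss with stakeholders.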
Step 3: Finalizing the Logframe

Once developed, the logframe must be finalized. This step will proceed more smoothly if the appropriate people were consulted during the planning and development of the logframe. During this step, the model must be validated with decision makers and other key stakeholders. Ensure that the people who will implement and use the logframe are consulted throughout the process; they should have an opportunity to review the final draft. If time permits, the draft should also be circulated to individuals who are very familiar with the work of the department but who were not part of the workshop development sessions.
3.0 DEVELOPING INDICATORS
Logframes help identify MFOs and the expected benefits of the program. Indicators help specify data to be collected to measure MFOs. This section provides a description of indicators and outlines a step-by-step guide to their development.
3.1 What are indicators?

An indicator is an element of information or data that helps measure whether progress or change has occurred. Indicators can be quantitative (a number, percentage, or proportion of something) or qualitative (e.g., indicators measuring perceptions or opinions).
Example of a quantitative or numerical indicator:
• percentage of an organization’s budget devoted to research and development expenditures

Example of a qualitative indicator:
• level of client satisfaction regarding service quality

Examples of MFO Indicators

The following table presents examples of MFO indicators.
Table 3: Examples of MFO Indicators

• MFO: Training participants involved in project management course
  Indicator: Percentage of training participants who have met a minimum pre-specified standard of achievement in the project management skills area
• MFO: Tourism marketing services
  Indicator: Number of clients who received promotional materials
• MFO: Public transit service provided
  Indicator: Number of passenger miles provided by public transit
• MFO: Outpatient health-care service provided
  Indicator: Outpatient health-care visits per 1,000 inhabitants

Source: Mcmillan, Liam, and Bernard Woods. Asian Development Bank Consultants.
3.2 Steps for Developing Indicators

A number of steps should be undertaken when developing indicators:
Step 1: Identify MFOs (use the logframe).
Step 2: Generate a list of possible indicators for each MFO.
Step 3: Assess the indicators.
Step 1: Clarify Outputs and Results

As discussed in the previous section, a logframe is a useful tool for identifying MFOs, outcomes, and goals.
Step 2: Generate a List of Indicators

For each MFO identified in the logframe, there are typically a number of possible indicators. Brainstorming and consultations with experts or other departments may help generate a list of indicators. It is important to be as inclusive as possible during this step and to look at the MFO from all perspectives. Indicator development can be challenging, so generate a flow of ideas and possible alternative indicators.

Proxy Indicators

A direct indicator is sometimes unavailable, so an indirect or proxy measure has to be developed. For example, data on the productivity levels of an organization may not be available; an indirect indicator might be the number of sick days taken by employees. The share of social expenditure in a government budget could be a “proxy” for the poverty orientation of national policies. While proxy indicators are not ideal, they can provide good information, particularly where they are shown to be associated with the more direct indicator. (Source: UNDP. Selecting Indicators for Impact Evaluation)

Step 3: Assess Indicators

Choosing the best indicators to measure MFOs can be complex. The best indicators should be selected by assessing each against a set of criteria. As a general principle, it is wise to limit the number of indicators: a few good indicators that measure something are better than many indicators that measure nothing.
1. Is the indicator a direct measure of the MFO? A key challenge in selecting indicators is ensuring that the indicator actually measures what it is intended to measure.
Consider the following: An anti-smoking campaign is launched with the aim of reducing smoking, particularly among young females. You want to measure the extent to which the anti-smoking campaign is being implemented. One indicator could be the number of anti-smoking pamphlets developed. However, a more direct indicator might be the number of anti-smoking materials distributed to the target population (e.g., female youth).
2. Is it practical to collect the data? Can the data be collected in a timely manner and at reasonable cost? Is the data available? The selection of “best” indicators has to be balanced with the need for practicality. Where improvements or adjustments in the existing data collection system are not feasible, careful consideration must be given to the types of data currently collected. 3. Will the information collected be useful for decision making?
Consider the following: You want to find out the extent and quality of the training being delivered to the clients of your department. You also want to increase your client’s organizational capacity with respect to marketing techniques. The department collects information on the number of participants trained and the number of training sessions delivered. How useful is this information? The indicators would not provide information on the quality of training, nor what types of knowledge or skills are being acquired. Other indicators that might be considered are: • number of participants who acquired training in marketing techniques; and • percentage of supervisors who report that training has increased skills/knowledge of participants.
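The three assessment questions can be combined into a simple scoring rubric for shortlisting candidate indicators. The sketch below is hypothetical; the 1-5 scale and the scores shown are illustrative assumptions rather than OPIF prescriptions.

```python
def score_indicator(direct, practical, useful):
    """Rate a candidate indicator on the three criteria above,
    each scored 1 (poor) to 5 (strong); higher totals are better."""
    return direct + practical + useful

# Hypothetical ratings for two candidate indicators from the
# anti-smoking example; the numbers are illustrative only.
candidates = {
    "number of pamphlets developed":
        score_indicator(direct=2, practical=5, useful=2),
    "number of materials distributed to female youth":
        score_indicator(direct=4, practical=3, useful=4),
}
best = max(candidates, key=candidates.get)
print(best)  # number of materials distributed to female youth
```

A rubric like this makes the trade-off between directness and practicality explicit when a department has to keep the indicator list short.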
3.3 Data Collection

The process does not stop at the selection of indicators. Indicators identify the types of information to be collected; decisions then have to be made as to how this information will be collected. A few key questions to consider for each indicator are the following:
• What sources will the data come from?
• What methods will be used to collect the data?
• How often will the data be collected?
• Who will be responsible for collecting the data?
As illustrated in the previous chapter, a matrix can be developed as a tool for planning the collection of information pertaining to MFOs.

Table 4: Logframe Framework
• MFOs
• Indicators
• Data sources (files, documents, people, clients, a tourist database?)
• Data collection methods (interviews? file review? database review? survey?)
• Frequency of collection (monthly, quarterly, annually?)
• Responsibility for collection

Source: Mcmillan, Liam, and Bernard Woods. Asian Development Bank Consultants.
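Each row of the matrix answers the four data collection questions for one MFO/indicator pair, so an incomplete row signals a decision still to be made. A minimal Python sketch, with illustrative field names and values (all assumptions for demonstration):

```python
# The four data collection questions, as matrix column names (assumed labels).
REQUIRED = ["data source", "collection method", "frequency", "responsibility"]

def open_decisions(plan):
    """plan: one row of the matrix as a dict. Returns the collection
    questions that have not yet been answered (blank or missing entries)."""
    return [q for q in REQUIRED if not plan.get(q)]

# Hypothetical row for one MFO/indicator pair; values are illustrative.
plan = {
    "mfo": "Outpatient health-care service provided",
    "indicator": "Outpatient health-care visits per 1,000 inhabitants",
    "data source": "patient records database",
    "collection method": "database review",
    "frequency": "quarterly",
    "responsibility": "",  # not yet assigned
}
print(open_decisions(plan))  # ['responsibility']
```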
4.0 ASSESSING DEPARTMENTAL READINESS FOR ORGANIZATIONAL PERFORMANCE INDICATOR FRAMEWORK

OPIF will be a significant undertaking for many departments. It involves new planning processes, the conceptual reorganization of a department’s PAPs in relation to newly formed MFOs, and the proper placement of those MFOs in the broader context of outcomes and higher-level goals. Indicators then need to be selected for MFOs and outcomes, and data collection strategies devised. Once the OPIF planning process has been completed and all MFOs and indicators have been developed, data collection and performance measurement will have to be undertaken. Some departments may be ready to take on the OPIF process, while other departments may find it more difficult. Two aspects of readiness need to be assessed by a department:
• how ready the organization is to make the organizational, behavioral, and management changes necessary to undertake OPIF (organizational change readiness); and
• how ready the department is in terms of the technical aspects of OPIF (technical readiness).

The following dimensions need to be considered for each level of readiness.

Change Readiness
• business case for change
• vision for change
• leadership
• communication practices
• individual and team capacity
• organizational design
• HR practices
• internal/external events
• culture

Technical Readiness
• strategic planning with results
• technical performance measurement
• data capability
• data collection, collation, and analysis
• use of performance information
Assessing change readiness should be carried out using a number of methods, including interviews with management and staff at all levels, focus group discussions, field visits to regional offices, questionnaires, and document/literature review. The change-assessment tools used in the development of this manual are attached in an appendix.
4.1 Organizational Change Readiness
Undertaking OPIF is a change process which extends beyond the technical tasks involved. Aligning the organization and the behaviors of the people behind OPIF requires clear understanding of the following:

• forces acting to promote OPIF,
• forces acting to prevent the proper implementation of performance budgeting, and
• behaviors and decision-making processes required.
As illustrated above, assessing the “readiness” for change on nine key criteria will identify the critical areas where work is required to provide the best possible foundation for OPIF and to maximize the possibility of success. How “ready” the organization is on these factors can be established using the change readiness questionnaire:
a. The Business Case for Change
Without management understanding of why OPIF is necessary and the communication of that understanding to the employees of the organization, there can be no real motivation for change. Change is always uncomfortable. The business case must be powerful enough to overcome the natural tendency of many people to resist, or at least be passive, in the face of change. Without a solid and articulated business case for OPIF there can be no translation of the general motivation for change into a personal motivation for behavior change. For OPIF to succeed, individual behaviors must change.
b. Vision for Change
Change programs require a vision: a communicated picture of a future state which acts as a guide to decision making and behavior. What will the effect of OPIF be upon the organization? How will it be different? How will employees be affected? What behaviors are required in employees? How should decisions be made? To effect large-scale change programs, people need to have a vision of where they are heading.
c. Change Leadership
Management have a critical role to play in making OPIF work. The behaviors and decision-making processes identified by the vision must be put into action by the management at all levels of the organization. Management have a collective responsibility to show the way to the vision through their activities, behaviors, decisions, priorities, and plans. Leadership is a critical aspect of the implementation of OPIF, yet it is one of the most difficult to promote. However, without good leadership by all managers, the OPIF change process will fail. Planning and managing the process of leading is as important as, if not more important than, planning the technicalities of implementing OPIF, and deserves a great deal of management time. Technical problems can always be identified and solved, but leadership and management problems are very difficult to identify. They are often complex to solve and have very far-reaching consequences. The starting point for an assessment of the role of leadership in the change process, whether it is acting to promote or to prevent change, is honesty. Honest feedback from employees and honest analysis on the part of managers regarding personal performance is critical. This is often difficult to achieve in government organizations.
d. Communications
The business case, the vision, and the alignment of leaders, together with the plans for change and the implementation of new OPIF procedures, mean nothing unless they are communicated clearly to staff in a form that they can readily internalize. To change, people must understand the need to change, be motivated to change, and clearly understand what it is they are expected to do under the new regime. Change plans often fail at this stage. Poorly communicated plans, expressed in the jargon of those responsible for designing the new system, often fail to provide employees with the information they need to work toward the vision.
e. Individual and Team Capacity
The assessment of the capacities of the staff to implement OPIF is a critical area of analysis. A balancing act is to be performed here between the requirements of the new system and the individual and team capabilities of the existing staff. In the short term, these capabilities are often the limiting factor in a major process of change. These capabilities can be improved in the short term through basic technical training, plans for which should be included in the implementation plan for OPIF. However, in addition, an honest assessment of organizational, team, and individual capacities must be made and plans for capacity development in the medium and long term included in the overall plan for change.
f. Organizational Design
The implementation of OPIF may require a new form of organization in terms of physical structure and location, staffing allocation, delegation procedures, responsibilities, and reporting arrangements. Aligning the organizational design with the proper functioning of OPIF is critical. Change processes in many organizations fail because the new ways of working are imposed upon the existing organization structures and practices without any attempt to reform them. Implementing OPIF is as much a question of organizational reform as it is of technical reform.
g. Human Resource Practices
Change is about people. Great plans, visions, intentions, and leadership behaviors all fail to produce the intended result if people do not decide to align themselves with the new practices and behaviors. In organizations as well established as those of government, it is often difficult to change long-standing practices. However, for change to work, behaviors have to change. This requires that management clearly identify which existing and new behaviors actively contribute to the success of OPIF and which do not. Human resource practices must be aligned behind the promotion of the behaviors and attitudes which enable the successful implementation of OPIF to proceed, and against those behaviors that are, in the new environment, reactionary. This requires a fundamental understanding of the current system of rewards and consequences operating in an organization. Often, change in government organizations is hampered by the inability to offer appreciable rewards for appropriate and positive behavior and to provide appreciable negative consequences for destructive behaviors. This situation demands that the human resource function be at the forefront of realignment efforts with innovative practices.
h. Internal and External Events
The environment for change is a critical factor in determining the success or failure of any change initiative. From an internal perspective, the success or failure of past change attempts has a critical impact upon present plans. Failure leaves a legacy and, often, a cynicism on the part of staff which must be identified and removed before any new change plan can be implemented. It is essential that previous change programs are analyzed and the causes of success and failure identified and incorporated into the new change process. From an external point of view, it often takes an externally generated crisis to promote change. In this context, an external crisis is not something to be minimized and kept from view, but rather to be honestly and accurately analyzed and used as a motivation for change. The most successful efforts for change have used external crises as the key drivers. Organizations which fail to change often hide from an external crisis or fail to use it positively.
i. Culture
It is a characteristic of well-established government bodies that stability and security are paramount. Governments are not designed to change fast nor radically, and the decision-making processes and patterns of behavior that are rewarded often reflect this. It may be that the culture of the organization is averse to risk and will actively seek to resist change. To a greater or lesser extent, this will be true in all departments seeking to implement OPIF. An accurate assessment of the culture and the identification of the aspects of the culture which will conspire to block change, together with those that will work to promote OPIF, are required. Organizational cultures change very slowly and are a consequence of people slowly beginning to work in different ways. It is almost always better to use the existing culture of an organization to change rather than to promote changes which actively go against existing cultural imperatives.

Over time, organizations develop their own culture: a unique set of beliefs, values, attitudes, and behaviors which define how decisions are made, interactions defined, and resources allocated. The culture of an organization renders some activities and behaviors possible and natural and others more difficult. With time and management attention, some aspects of the culture can be changed indirectly, while others become so ingrained that they defy all attempts at reform. Often unconscious, culture shapes the way the organization sees itself and perceives its environment. Violations of the cultural norms, often in the form of new initiatives, can cause disturbance and lead to resistance. The nature of the culture helps determine how change operates and how it should be planned and managed. Organizational culture is neither good nor bad. Rather, the critical question is whether the culture is appropriate or inappropriate with respect to the implementation of OPIF.
Change programs surrounding OPIF, which seek to reorient the essential drivers of the organization, are by implication culture changing and themselves require changes to the culture for their implementation. For an organization to succeed, its values, attitudes, behaviors, and systems must be aligned with its strategies and priorities. The reverse is also true: the strategies must be aligned—in the short term at least—with the culture and capabilities of the organization. Understanding the limitations and strengths of today’s organization and having a plan to align it with future strategic intent is the essence of change management. To place this analysis in the context of the
organizational change required by the implementation of OPIF, it is useful to examine a broad typology of organizational cultures identified in Table 5. This model identifies four general organizational types, each with its own strengths and weaknesses.

Table 5 – ORGANIZATIONAL TYPES

INSTITUTION
Character: “You are supposed to be doing what you are supposed to do.”
• Conservative and stable,
• Traditional and past orientation,
• Responsible and dutiful,
• Rules driven,
• Internal focus,
• Hierarchical and centralized,
• Risk minimizing,
• Change only when necessary, and
• Change at a steady pace, step by step.

REFORMER
Character: “We care deeply about doing the right thing for all.”
• Driven by vision and values,
• Work with people and for people,
• Future orientation,
• Big picture thinking,
• Participative decision making,
• Open communication,
• Interested in new ideas and concepts, and
• Need to identify the human impacts of any change.

IMPLEMENTOR
Character: “Explore the surroundings to seize any opportunity or opening.”
• Adaptable and open,
• Tactical and timely,
• Efficiency and effectiveness,
• Flexible and fast paced,
• Risk taking,
• Outward focus,
• Must identify the practical benefits of change, and
• Change must improve efficiency and effectiveness.

ARCHITECT
Character: “Whatever we do must be planned and make rational sense.”
• Strategic and visionary,
• Led by values,
• Future orientation,
• External focus,
• Analytical and rational,
• Driven to challenge the status quo,
• Open to change and planned risk,
• Interested in change, not in routine, and
• Oriented toward broad focus of change.

Adapted from Bridges (2000); Keirsey (1998); and Hirsh (1992).
Each type of organizational culture holds certain values, rewards different behaviors, and encourages different forms of thinking and decision making.
Initiatives out of step with the organizational type will meet with resistance and may be hard to achieve. All organizational types find change painful as people adjust to new working practices. Adding to the model above, it is also important to consider two further aspects of an organization’s ability to handle change.

1. Extent to which the organization is ready for change – Is the organization change ready or change resistant? This is the focus of the categories of change readiness discussed above.
2. Size, complexity, and importance of the change itself – Is the change incremental or fundamental? Small incremental changes to systems or objectives may present little difficulty, even to a change-resistant organization. However, large-scale change may pose problems for even the most change-ready organization and require careful change management.

For change-resistant organizations, fundamental change is the most difficult to implement. In terms of the model in Table 5, the institution type may find large-scale change most problematic: changing from a long-term stable state is inherently risky. It is important to regard change management as essentially a risk-management activity, integrated with other sophisticated risk-management techniques to identify and manage the risk of the change initiative upon the organization. Commonly, government organizations, in terms of the model, can be characterized as institutions, with all the strengths and weaknesses that implies:

• Orderly and controlled systems and behaviors;
• Respect for hierarchies and position;
• Tendency to be slow and careful in implementing change;
• Aversion to risk in systems, decision making, and attitudes;
• Great respect for hard work, experience, and qualification embodied in the system; and
• Functional organization, in vertical silos.
Such attributes are exactly what is required in certain environments and to achieve certain organizational objectives. Institutions are traditional, stable, and mature. They perform their tasks logically and follow time-tested ways of doing things, contributing to stability and historical success. Institutions have a tendency to be rational and linear in their thinking patterns. They are not swayed easily by emotional arguments or by personalities. They are often critical of new organizational ideas, forcing all plans through a rigorous analytical checking process. Institutions will tend to be uncomfortable with behaviors that operate in fundamentally different ways. For the culture of an institution to change, there must be a logical argument for change or a real sense of external threat or crisis.
In general, these arguments apply to most government organizations and provide a guide to change management for OPIF.
A General Model for OPIF Change

Common among organizations that are successful in the implementation of change is the development of a model which brings together aspects of the various categories of change readiness and identifies appropriate actions. The particular nature of the model required will differ from organization to organization, but all government organizations seem to share several aspects of model development. Organizations do not automatically progress in the appropriate direction. Change must be planned and managed. Change management programs must be appropriate to the culture of each implementing organization and evolve as the organization evolves. The establishment of new behaviors—supported by leadership, new systems, and new human resource practices—is the means by which the organization evolves and OPIF is successfully implemented.

To pursue the path of reform and renewal which is implied by OPIF, each organization must exhibit some operating characteristics of the reformer/implementor/architect (see Table 5), while still maintaining the systems built up on the path to maturity. The notion of balance between the institution and the reformer/architect is critical. Tipping the balance too much in favor of the institution can result in a stalled change process. Tipping the balance too far in favor of the reformer may result in a loss of control that is not sustainable. This openness at maturity—allowing movement from the established character of the mature institution to aspects of the reformer, implementor, and architect—is essentially what is required to successfully implement OPIF in the Government of the Philippines.

From best practice and research, it is clear that the use of a formal model of change is a growing prerequisite for success. Further, the evidence suggests that while the standard models of change can provide significant insight, it is essential that organizations develop their own models that fit their environment and character.
For change to work, the analysis of problems and success factors from previous initiatives must be incorporated into a change model designed for the unique operating environment of each department which is seeking to implement OPIF. The general OPIF model for change should be results based—that is, identifying activities, outputs, and outcomes of the change initiative. There are three phases to this process:
• Phase 1: Planning
• Phase 2: Implementation
• Phase 3: Mainstreaming
Each phase has its own results chain, providing the basis for effective management and monitoring.

Phase 1: Planning
Phase 1 of the model prepares the department for change and identifies the required medium-term outcomes.
Phase 1 of the General OPIF Change Management Model (diagram): the Planning phase comprises the vision, the change strategy and plan, the change team and resources, and the change sponsor and leadership.
Source: Mcmillan, Liam, and Bernard Woods. Asian Development Bank Consultants.
In this phase of the process, the tools and activities are designed to work together toward four critical medium-term outcomes:

• A clear and internalized vision for the change process, including the rationale and the motivation. This would include the business case for change and the contribution of external forces for change, including the communication of a sense of urgency.
• A detailed and monitorable plan, created through participation and discussion, which provides for the communication of the vision and the measurement of results.
• The identification of a team and resources that have been analyzed and planned to provide the maximum opportunity for the initiative to succeed.
• Focused and committed leaders and change sponsors working together to “walk the talk” and to lead the implementation process.
Phase 2: Implementation Implementation is the core of the change management process recommended for OPIF. There are two main required medium-term outcomes with a recommended communication focus.
Phase 2 of the General OPIF Change Management Model (diagram): the Implementing phase comprises the implementation processes, communications, and ongoing reporting and monitoring.
Source: Mcmillan, Liam, and Bernard Woods. Asian Development Bank Consultants.
In this phase of the process, the tools and activities discussed are designed to work together toward the critical medium-term outcomes of:

• creation of awareness regarding the nature of the change;
• building of a desire to be part of the change and to see the process through to full implementation;
• communication of the knowledge necessary to play a full role in the change initiative, including technical and process competencies;
• encouragement and application of the ability to perform the required activities and deliver the required outcomes of the initiative; and
• the ability to both give and receive feedback, closing the results-management feedback loop.
These outcomes are delivered through the coordinated application of a series of output plans for communication, capacity building, resistance management, monitoring and evaluation, and team and sponsor activities. These plans are created through a process of facilitated and participative discussion.
Phase 3: Mainstreaming
In the mainstreaming phase, the change initiative is institutionalized: the change becomes internalized in employee behaviors.
Phase 3 of the General OPIF Change Management Model (diagram): the Mainstreaming phase comprises analyzing gaps, taking corrective action, celebrating successes, and transferring ownership.
Source: Mcmillan, Liam, and Bernard Woods. Asian Development Bank Consultants.
Change management is a way of managing risk through the design and planning of initiatives. It is not only the function of a specialist but also a set of processes, behaviors, and skills which are useful for all staff. Any model for change management for OPIF must include a commitment to mainstreaming change-management principles, ensuring that staff plan and implement all initiatives in the same general way, adapting as necessary to the needs of individual initiatives. The mainstreaming activity is enabled through the process of:

• assessing change effectiveness through staff feedback;
• developing appropriate corrective action;
• rewarding and reinforcing the required behaviors and systems changes; and
• having an active plan for the transfer of ownership of both the initiative and the change-management process.
These outputs of Phase 3 will enable staff to refine the change-management process during implementation and improve the chances that the OPIF process will deliver its outcomes.
4.2 Technical Readiness
Technical readiness assessment provides a view of the current levels of capacity, from strategic planning to reporting. Five dimensions of technical readiness and performance need to be assessed:
a. Strategic Planning with Results
The first dimension of the technical assessment considers how the strategic planning takes place, the degree to which it is results based, and the extent to which those results are cascaded down to different parts of the organization and to programs.
b. Technical Performance Measurement
The second dimension deals with how performance is measured. Issues include the selection process for indicators and the determination of how those indicators will actually be measured.
c. Data Capability
The third dimension is an assessment of the capability that the department has to address the various facets of data collection. This includes the systems and processes for data collection and the levels of technology and automation that are used. Human resources—in terms of skills and knowledge—and the adequacy of financial resources to enable all the required data to be collected also need to be assessed. Data capability also considers internal departmental administrative data, results data (MFOs, outcomes, and higher-level goals), and any additional monitoring and evaluation programs. It also includes consideration of the data that the department receives—or should receive—from other partners and stakeholders, and the processes and systems to acquire it.
d. Data Collection, Collation, and Analysis
Moving from an assessment of capacity, the fourth dimension considers how performance measurement systems are used in practice. This includes the use of systems, accessibility/sharing of data, accuracy, and timeliness. Another aspect of this dimension is an assessment of how the data is collated from various sources and then analyzed.
e. Use of Performance Information
The final dimension of the assessment focuses upon the use of the performance information that is collected, collated, and analyzed through the various systems and processes. This includes an assessment of how the information is reported and captured for internal and external stakeholders; how it is used within the organization to understand the department’s successes and challenges, to learn about its performance, and to make changes as appropriate; and how it is used for accountability.
4.3 Application of the Change-Readiness Tool: Departmental Examples

4.3.1 Departmental Examples – Change Readiness. The change readiness questionnaire was distributed to 25 staff members in DBM, of whom nine were at the directorate level, and 45 staff in DSWD, of whom three were at the directorate level. The analysis identifies which of the nine categories of change readiness identified above need to be prioritized more than the others. Priority categories were identified as those scoring less than the average score of all nine categories in that department. The two sets of departmental results are not comparable with one another, since the ratings provided by respondents and the criteria used reflect the specific environment of each department. Rather, they indicate the areas of priority focus within each department that require most attention if the implementation of OPIF is to be successful. For DBM, the priority areas for management attention are:

• communications, in particular, regular communication from the leadership;
• alignment of human resource practices behind the implementation of OPIF, in particular the processes of performance management; and
• the cultural background and history against which OPIF is to be implemented, in particular that the present culture was too directive and rules based.
For DSWD, the priority areas for management attention are:

• development of the business case, in terms of a generally appreciated awareness and understanding of the objectives of OPIF;
• management of change and the development of a change model, given that the department had a poor historical track record of change management which lowered confidence in the OPIF change process; and
• the cultural background and history against which OPIF was to be implemented, in particular, that the present culture was too directive and rules based.
For the implementation of OPIF to be successful, urgent management attention is required in these areas.
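The scoring rule described in 4.3.1, under which a category becomes a priority when it scores below the average of all nine categories in that department, can be sketched as follows. The category names follow the nine change-readiness criteria above; the scores are invented for illustration and are not the DBM or DSWD survey results.

```python
def priority_categories(scores: dict[str, float]) -> list[str]:
    """Return the change-readiness categories scoring below the
    department-wide average, i.e., the priority areas for attention."""
    average = sum(scores.values()) / len(scores)
    return [name for name, score in scores.items() if score < average]

# Invented questionnaire averages for the nine categories (not survey data).
illustrative_scores = {
    "business case": 3.5,
    "vision": 3.6,
    "change leadership": 3.4,
    "communications": 2.8,
    "individual/team capacity": 3.3,
    "organizational design": 3.5,
    "HR practices": 2.9,
    "internal/external events": 3.4,
    "culture": 3.0,
}
print(priority_categories(illustrative_scores))
# → ['communications', 'HR practices', 'culture']
```

Because each department's average is computed from its own ratings, the flagged categories indicate relative priorities within a department and, as noted above, cannot be compared across departments.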
4.3.2 Departmental Examples – Technical Readiness. In the following paragraphs, the word “unit” is used generically to refer to the component part of the organizational structure in which people work. As indicated in 4.2, there are several aspects to an assessment of technical readiness in DBM and DSWD:
a. Strategic Planning with Results

Department of Budget and Management
In terms of understanding results statements and frameworks, the assessment found that DBM results were clear to DBM personnel at all levels, and the majority of personnel understood the link between their departmental MFOs and the organizational outcomes. In terms of understanding how their individual work and the work of their unit fitted with the departmental MFOs, most staff were clear, but a significant minority were unsure or did not know how their work or their unit’s work fitted.
With respect to developing results statements and frameworks, the assessment found that DBM directors could develop results for their bureaus and positions. However, more than half the DBM staff overall had doubts that there was sufficient knowledge to formulate new MFOs, organizational outcomes, or logframes. The majority of DBM staff also felt that current systems for staff development and training were not sufficient to address this skills and knowledge deficit.

Department of Social Welfare and Development
In terms of levels of understanding, the assessment found that there was widespread knowledge at DSWD of MFOs, outcomes, what they mean, and how bureaus and staff contribute to them. All DSWD staff who responded to the questionnaire reported that they understood how their work, and the work of their unit, contributed to department MFOs.
With respect to developing results statements and frameworks, the assessment found that DSWD has added internal MFOs and that there were high levels of confidence about the ability to formulate new MFOs and outcomes at the department, bureau, and individual levels. However, there was less confidence in the ability to develop logframes. In terms of training, less than half felt that current resources and systems were adequate to address any knowledge or skills gaps.
b. Technical Performance Measurement

Department of Budget and Management
With respect to indicators and determining how to measure performance, the new DBM OPIF submission now has indicators that cover MFOs well. The majority of the DBM staff indicated that they understood these indicators, but only half stated that they understood how they would actually be measured. As many of the indicators in the OPIF submission are new, they are without targets or baselines. In addition, outcome indicators are still to be discussed. In terms of attached agencies, the assessment found that it would be relatively straightforward to develop indicators for the procurement service.

Department of Social Welfare and Development
At DSWD, there is a mixture of output and outcome indicators, with some at the wrong level and in need of adjustment. Some result areas—specifically training and technical assistance undertaken by DSWD—were not sufficiently covered by existing indicators. In addition, some client-oriented MFOs did not have indicators for client satisfaction. DSWD staff have a clear understanding of the indicators, and over two thirds of survey respondents stated that they understood how the indicators would be measured.

c. Data Capability

Department of Budget and Management

Human Resource Capacity for Data Collection
With respect to both staff numbers and skills for data collection, there were a number of concerns at DBM. At the senior management level, there were widespread concerns that DBM lacked sufficient knowledge and skills in data collection and analysis. Senior management also expressed concerns that there were insufficient numbers of staff to operate data collection and analysis systems. Directors were also concerned that staff numbers to collect data were lacking. Moreover, just over half of all DBM staff surveyed indicated that they had concerns about the numbers of personnel and the skills required to collect the data required for OPIF. An associated identified need was for additional programmers and analysts to adapt and develop OPIF-related systems. At DBM, there was a lack of confidence in departmental training systems to address the above areas of concern in terms of capacity gaps. Over half the staff surveyed stated that they felt existing training systems were not adequate to develop these skills.

Financial Resources
In terms of the financial resources needed for data collection, there was a mixture of responses and a difference of views between senior management and staff. Half of the DBM staff surveyed agreed that their departments had adequate financial resources to collect data for the indicators, while about one third of staff doubted this was true.

Systems for Data Collection
With respect to the information systems available for data collection on OPIF indicators, a number of systems were suggested by the respondents. These include the document tracking system (DTS), the foreign-assisted projects (FAP) database, and the e-budget systems. The budget preparation management system, the government manpower information system, and the accounts payable system were also mentioned as having possible future linkages with OPIF data collection. In terms of staff perspectives, more than half of the staff surveyed agreed that the department had sufficient information technology (IT) systems to store, share, analyze, and report on OPIF data. However, a significant minority—about one third—disagreed.

External Data Acquisition
In terms of acquiring needed data from external sources, there was general confidence at all levels that DBM would be able to obtain needed data from outside the department.

Department of Social Welfare and Development

Human Resource Capacity for Data Collection
With respect to staff numbers and skills for data collection, there were general concerns that the staff were not properly equipped to undertake data collection and analysis. Over half of the staff surveyed indicated that there were adequate human resource numbers for collecting the data required for OPIF. However, almost a quarter of the staff stated that human resources were lacking in this area.
In terms of the ability of DSWD to address these capacity gaps, there was general confidence in departmental training systems at senior levels, but a significant minority of the staff surveyed felt that these systems were not adequate for this purpose.
Financial Resources
In terms of the financial resources needed for data collection, there was a level of concern. Just over one-third of DSWD staff surveyed agreed that their departments had adequate financial resources to collect data for the indicators, while about one-third doubted this was true.

Systems for Data Collection
In the area of information systems to support OPIF, the assessment found that major IT systems were being developed but were still incomplete. Based on existing system designs, it was determined that these systems under development would address only a portion of the data needs of OPIF. From the perspective of staff, over 50% of DSWD staff were generally positive about the adequacy of departmental IT systems to store, share, analyze, and report on OPIF data. However, over a third of the staff disagreed.

External Data Acquisition
In terms of acquiring needed data from external sources, there were serious concerns at many levels about the ability of the department to obtain all needed data residing outside the department.

d. Data Collection, Collation, and Analysis
Department of Budget and Management
With respect to how performance measurement systems were used in practice at DBM, the assessment was complicated by the fact that over half of the indicators were new and, therefore, not much data was being collected. For indicators for which data was collected, common comments were that it was not always calculated, compiled, and analyzed in ways useful for performance management. There were many issues with existing IT systems. A common problem was that the systems were not being used as intended, which would be a barrier to their use for OPIF. In addition, they did not capture all the data needed for OPIF. With respect to the DTS, which was cited as being especially relevant for OPIF, it was stated that there was no "champion" for DTS in DBM, which had the potential to limit its growth for OPIF. Furthermore, DTS had not been extended to the regional level, for which data was being collected.
Department of Social Welfare and Development
The assessment found a mixed picture at DSWD regarding how performance management systems were used in practice. Measurement at the outcome level was already taking place, which is usually a big challenge for organizations. However, there were many issues with existing systems. Many systems were manual, some with encoding in spreadsheets characterized by multiple points of data entry and manual compilation of data from the regions. There were also major issues in getting data from external parties (some reported receiving only 20–50% of required data, and much of it late). In addition, all data were reported to have errors and inconsistencies.

e. Use of Performance Information
Department of Budget and Management
Measurement of Performance
One primary intention of OPIF is to use the information to measure performance and to make decisions based on that information. The majority of DBM staff stated that it was clear what decisions would be made using OPIF information. A majority of the staff surveyed stated that information on their department's, their unit's, and their own performance was used as a basis for decision making, as a learning tool, and to improve performance.

Cascading Performance Measures
At DBM, there was significant support for the idea of cascading performance measures from the department level down to the unit and individual levels. This was seen as fair and appropriate. However, with respect to judging individual and unit performance, while the majority of DBM staff surveyed tended to agree that indicators of performance could be developed easily, a significant minority disagreed.

Reporting
The majority of DBM staff surveyed stated that the requirements of senior management regarding the format and content of OPIF information were clear to them.
Department of Social Welfare and Development
Performance Contracting
DSWD is fairly unique in that it has an established performance contracting system that is operational. At the top of the performance contracting system is the OPIF logframe with MFOs. This is then split into key results for the heads of offices, which are then cascaded to the staff to ensure accountability at each level. Performance contracting and performance assessment are then brought down to the level of the other staff. The basis of performance assessment is an approved performance contract that uses different templates depending on the level of the personnel. For directors and assistant bureau/service/field directors, including division chiefs, the performance contract has two components: technical performance and managerial competency. For staff below division chief, it is based on technical performance and "job-related behavior competencies." The effect of linking the performance contracting system at the highest level to the DSWD MFOs is to create a demand and requirement for performance information at every level of the organization. This information is then used for decision making, accountability, human resource management, planning, and reporting.

Reporting
Performance information is used in the performance contracting system at all levels. However, the system is currently aligned more with individual than organizational reporting. There is a significant reporting burden at DSWD, with over 200 different reports using many different formats; this is especially difficult at the regional level. Many reports contain the same information organized in different ways, leading to inconsistencies and difficulties in interpreting the data. This issue has been recognized, and steps are being taken to rationalize reporting requirements.