
WaSH M&E-MIS QUALITY

ASSESSMENT

Draft Report

Submitted to:
MINISTRY OF WATER AND ENERGY (MOWE)
WATER SUPPLY AND SANITATION DIRECTORATE

By:
DATA MANAGEMENT AND INFORMATION SYSTEM SUPPORT (D.M.I.S.S.) PLC.

June 2013
Addis Ababa
Table of Contents
Table of Contents.......................................................................................................................ii
List of Tables and Figures.........................................................................................................iii
Abbreviations and Acronyms....................................................................................................iv
Executive Summary..................................................................................................................vi
1. INTRODUCTION..........................................................................................................1
1.1 Background and Purpose.......................................................................................................3

1.2 Objectives..............................................................................................................................5

1.3 Significance...........................................................................................................................5

1.4 Scope and Organization of the Report...................................................................................6

2. WaSH M&E-MIS............................................................................................................. 7
3. METHODOLOGY........................................................................................................... 11
3.1 WaSH M&E-MIS Verification Approach............................................................................11

3.2 WaSH M&E-MIS DATA QUALITY ASSESSMENT (DQA) Approach............................17

4. DATA ANALYSIS AND FINDINGS.............................................................................. 22


4.1 Current Status of WaSH M&E-MIS....................................................................................23

4.2 The WASHCOM Application Software Testing..................................................................23

4.3 DQA of WaSH M&E-MIS...................................................................................................25

4.3.1 System Assessment Verification...................................................................................25

4.3.2 Data Reporting Verification..........................................................................................34

5. CONCLUSION AND RECOMMENDATIONS............................................................. 37


BIBLIOGRAPHY........................................................................................................................ 42
APPENDICES.............................................................................................................................. 43

List of Tables and Figures

Table 1: WaSH M&E-MIS performance across reporting levels

Figure 1: Overview of WaSH M&E System

Figure 2: Overview of WaSH Data Flow and Data Use by Different Government Levels

Figure 3: WaSH M&E-MIS Architecture

Figure 4: Application Layers

Figure 5: WASHCOM Architecture

Figure 6: Conceptual Framework of RDQA

Figure 7: Overall system performance per functional area


Figure 8: System Performance by Reporting Levels
Figure 9: System Performance at Woreda Level
Figure 10: System Performance at Zone Level
Figure 11: System Performance at Regional Level
Figure 12: Availability, Timeliness and Completeness of Reports across the Levels

Abbreviations and Acronyms

CSA Central Statistical Agency


DFID Department for International Development

DQA Data Quality Assessment
DQAF Data Quality Assurance Framework
EDQAF Ethiopian Data Quality Assessment Framework
EMIS Education Management Information System
FCA Functional Configuration Audit
GoE Government of Ethiopia
GTP Growth and Transformation Plan
HMIS Health Management Information System
IBEX Integrated Budget and Expenditure System
IT Information Technology
KPIs Key Performance Indicators
M&E Monitoring and Evaluation
MDG Millennium Development Goal
MIS Management Information System
MoE Ministry of Education
MoFED Ministry of Finance and Economic Development
MoH Ministry of Health
MoU Memorandum of Understanding
MoWE Ministry of Water and Energy
MoWR Ministry of Water Resources
MSF Multi-Stakeholder Forum
NGO Non-governmental Organization
NWCO National WASH Coordination Office
PASDEP Plan for Accelerated & Sustained Development to End Poverty
RDBMS Relational Database Management System
RDQA Routine Data Quality Assessment
RiPPLE Research-inspired Policy and Practice Learning in Ethiopia & Nile Region
SQL Structured Query Language
SRS Software Requirement Specifications
TDP Technical Data Package
UNDESA United Nations Department of Economic and Social Affairs
WaSH Water, Sanitation & Hygiene Program

WASHCOM WASH Committee (community level)
WEC WaSH Evaluation Checklist
WIF WaSH Implementation Framework

Executive Summary
 The success of any decision generally relies on a strong Monitoring and Evaluation (M&E) system, which is assumed to produce quality data. It is therefore important to identify and address the strengths and challenges associated with any M&E system in order to strengthen the sectoral MIS (WaSH in our case).


 Data Quality Assessment (DQA), on the other hand, is a systematic evaluation of data to determine whether they are correctly presented through the given M&E system and used appropriately in decision-making and planning. The DQA process also holds that data quality is meaningful only in relation to the intended use of the data, along with its reporting through a standard M&E system.


 In designing the Data Quality Assurance Framework (DQAF), it was deemed necessary to assess the WaSH M&E-MIS, which is currently not functioning well enough to report adequate and timely data. If the WaSH M&E-MIS is underutilized, it wastes resources on the one hand and is likely to seriously undermine the quality of the information the system produces on the other.


 Therefore, M&E system strengthening techniques can be developed under the premise that assessing data quality requires understanding the system through which data are generated, aggregated and reported. To this end, the web-based WASHCOM application was assessed for its capability to integrate with other line-ministry systems (e.g., EMIS, HMIS, CSA, MoFED/IBEX) using checklists, test cases and RDQA approaches.
 From the web-based system assessment, the findings reveal that documents required for the effective implementation of the system were either not developed or not delivered to the MoWE. Furthermore, the existing system requires certain modifications/customization to fit the MoWE/WaSH organizational needs.


 The WaSH M&E-MIS assessment draws attention to functional areas, such as data management processes and reporting requirements, data quality mechanisms and controls, and the training provided to M&E staff, that pose serious challenges to the system across all three levels (Woreda, Zone and Region).

 As the majority of these functional areas are directly linked to the training component, formal training must be provided to the concerned staff across the given levels.
 Data collection and reporting forms/tools, well-maintained links with the national reporting system, and indicator definitions are the functional areas where the system performs somewhat better; adequate arrangements should be made to at least sustain this trend and to improve the effectiveness of the existing system over time through experience.


 Internet connectivity proved a serious challenge to implementing a full-fledged WaSH M&E-MIS, in some cases putting the newly introduced web-based system at stake by pushing staff back to manual/paper-based reporting.
 Finally, specific gaps in the system assessment/functional areas and in the data reported through the system over the last two years (2011 and 2012) were identified within the WaSH M&E-MIS framework, as in the majority of cases the examined units were found to maintain only National WaSH Inventory data.

1. INTRODUCTION
Over the past few decades, the development and management of strong Monitoring and Evaluation (M&E) systems has become a critical concern for many governments and stakeholders of development programs/projects. Ethiopia is no exception, given the importance of M&E Management Information Systems (MIS) in facilitating effective decision-making by the authorities within the given organizational framework. However, reports generated through such systems must be accurate and timely to meet the requirements of decision-makers, implementers, and regional/national government bodies.


The Ethiopian Growth and Transformation Plan (GTP), the national planning document for the period 2010/11-2014/15, gives priority to improving sectoral data management by undertaking system-wide M&E-MIS assessments and data verification across line ministries and various development projects in the country. Given that improper development and execution of M&E systems may result in poor data quality, both in content and timeliness, such assessments are expected to bring quality to managerial/administrative decisions through effective reporting.


To achieve the ambitious goals laid out in the GTP for safe water and improved hygiene and sanitation, the Government of Ethiopia (GoE) is poised to launch the new National Water Supply, Sanitation & Hygiene (WaSH) Program. The strategies for achieving the GTP targets and the setup of the National WaSH Program are described in the WaSH Implementation Framework (WIF). A major feature of the WIF is its leadership by four government ministries (MoFED, MoE, MoH and MoWE), which are pledged through a Memorandum of Understanding (MoU) to support an integrated National WaSH program that addresses the needs of individuals, communities, schools and health posts more holistically and reduces bureaucratic compartmentalization of services.


Moreover, the development programs under the GTP have a strong focus on improving the quality of public services, which can only be realized through effective management and reporting systems. Accordingly, the Federal Government of Ethiopia has set ambitious GTP goals for safe water and improved hygiene and sanitation. The WaSH program holds that safe water and improved sanitation and hygiene are not separate pursuits, and that coordination of efforts is required among government agencies, civil society organizations and the private sector (NWIF, 2011). Furthermore, the program addresses the needs of individuals, communities, schools and health posts more holistically, and reduces bureaucratic compartmentalization of public services.

Furthermore, the establishment of an effective WaSH sector M&E system is one of the major components of the integrated National WaSH program (NWIF, 2011). The WaSH M&E system mainly deals with rural and urban water supply information, as well as sanitation and hygiene, covering institutions such as schools and health facilities and reaching down to the community level. It focuses on sources of water, public access to safe water, water quality, methods of diverting water and the management of utilities, along with the condition of each individual water scheme or point of service, the maintenance and expansion of such schemes, and the construction of new schemes (WaSH-MIS Manual, 2011).

The concerned ministries have expressed their interest in assessing the WaSH M&E-MIS to further strengthen its capacity to produce quality, timely and reliable data. To ensure and improve overall data quality, however, an effective Data Quality Assessment (DQA) methodology is needed. DQA is the systematic evaluation of data to determine whether they are correctly presented through the given reporting systems and used appropriately in decision-making and planning.

This document assesses the WaSH M&E-MIS by examining the associated strengths and challenges and by verifying the quality of the information the system produces. It also identifies feasible capacity strengthening and development actions to be implemented in order to mitigate the identified challenges across the reporting levels (e.g., Woreda, Regional and National M&E levels).

1.1 Background and Purpose

The National WASH Inventory (NWI) is recognized as one of the most important initiatives in the WaSH sector (UNDESA, 2011). The NWI is assumed to have considerable potential to improve service delivery, since better data can improve policy-making, planning and decision-making at all levels (RiPPLE, 2009). To this end, the MoWE has exerted various efforts, one of the major achievements being the establishment of a sectoral M&E-MIS, implemented in some pilot areas.

The newly introduced WaSH M&E-MIS is a web-based software solution designed to meet the M&E information management needs of the WaSH program. The system is intended to help the program work towards its objectives by making the necessary information available in a timely manner and in the required format, so as to improve the associated public services, e.g., building, expanding and enhancing water supply facilities at different levels. The system helps to accurately collect, aggregate, store, share, exchange, and analyze the multi-sectoral data of the program at every level (WaSH-MIS Manual, 2011).

Even if an organization selects the best indicators and develops the best protocols/tools, the risk remains that data will be of poor quality if the tools are not properly used and the reporting standards set by the organization are not respected (in both quality and timeliness). Thus, to be useful, data must not only be correct, reliable and complete, but also accessible and organized in a form appropriate to their intended use.

Following this, several recent attempts have been made by both Federal and Regional Governments to collect basic data about the WaSH sector, particularly an inventory of protected public water supply schemes and their functional status, which provided a snapshot of the country's water supply sector. However, the rapid pace of construction of new water points, combined with their dynamic nature and growing requirements for regular maintenance/repair, means that such data inventories may soon become outdated.

Also, a strong and up-to-date database would allow WaSH agencies to better understand the progress being made, the approaches taken and the associated costs, and would guide the planning and allocation of investments to help ensure equity in access. However, Moriarty et al. (2009, as cited in RiPPLE, 2009) characterized WaSH sector information in Ethiopia as unreliable, suboptimal, difficult to collect, and subject to under-, over- and otherwise contested reporting.

Furthermore, as the DQA methodology argues, data quality is meaningful only in relation to the intended use of the data, along with its reporting through a standard M&E system. For data to be credible for reporting, they should meet seven data quality dimensions, namely accuracy, completeness, reliability, timeliness, precision, integrity and coherence.
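Some of these dimensions can be checked mechanically for routine reports. As a hypothetical illustration (the field names, records and thresholds below are our own, not taken from the WaSH-MIS), a completeness and timeliness check over a batch of woreda reports might look like:

```python
from datetime import date

# Hypothetical woreda report records; field names are illustrative only.
reports = [
    {"woreda": "A", "schemes_total": 120, "schemes_functional": 95,
     "due": date(2012, 7, 7), "received": date(2012, 7, 5)},
    {"woreda": "B", "schemes_total": 80, "schemes_functional": None,
     "due": date(2012, 7, 7), "received": date(2012, 7, 20)},
]

REQUIRED = ["schemes_total", "schemes_functional"]

def completeness(report):
    """Share of required fields that are actually filled in."""
    filled = sum(1 for f in REQUIRED if report.get(f) is not None)
    return filled / len(REQUIRED)

def is_timely(report):
    """A report is timely if it was received on or before its due date."""
    return report["received"] <= report["due"]

for r in reports:
    print(r["woreda"], completeness(r), is_timely(r))
```

The other dimensions (accuracy, reliability, integrity, coherence) generally require comparison against source records or across levels and cannot be scored from a single report in this way.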

To this effect, the MoWE has given priority to improving sectoral data management systems by undertaking M&E system assessments and verification of the data collected through the WaSH M&E-MIS at the woreda, intermediary (region and zone) and national levels. This helps to determine the strengths and challenges (areas/causes of poor data quality) of the WaSH M&E-MIS, to suggest potential solutions for the problems causing ineffective and untimely data reporting, and to highlight the coherence and comparability of data across the reporting system.

1.2 Objectives

The basic objective of this study is to undertake a sector-wide WaSH M&E-MIS assessment, covering the reporting of Key Performance Indicators (KPIs) and the verification of data quality in the sector. This includes identifying strengths and potential challenges in the reporting process across the system levels (from woreda to central/national).

The specific objectives include:

 To verify the WaSH M&E-MIS (web-based) system;

 To assess the appropriateness of the WaSH M&E-MIS data management system across the reporting levels in generating quality data (i.e., DQA);

 To recommend measures for strengthening the WaSH M&E-MIS to report accurate and timely data.

1.3 Significance

Accurate and reliable data are required for multiple purposes across all organizational/governmental bodies, and good quality data are the foundation of effective decision-making. Organizations are therefore concerned about their progress and the optimal utilization of resources, and want to know how poor data affect the setting and achievement of their objectives and goals. This study is important for understanding any deviations that data in the WaSH sector undergo while being aggregated and reported through the M&E-MIS across different levels.


Furthermore, applying the M&E system assessment and DQA tools to judge the validity and reliability of the data will guide the MoWE in identifying challenges and weaknesses to improve upon in future courses of action. It will also serve in benchmarking and prioritizing emergent WaSH M&E-MIS system and data issues.


Finally, knowledge of the strengths of the existing MIS will help in concentrating and deploying resources to achieve better data quality and use across decision-making areas.

1.4 Scope and Organization of the Report

This assessment covers the WaSH M&E-MIS system quality and data reporting across the 10 regions of Ethiopia. The scope of the study is limited to the sector's WaSH M&E-MIS, and the findings cannot be generalized to other sectors/programs. Furthermore, the study covers only the data reporting aspect of the WaSH M&E-MIS and does not extend to information associated with the construction or rehabilitation of water schemes/facilities or to project-specific expenditures (financial data). The study used standard M&E system assessment and DQA checklists, adapted and modified to the sectoral context. Moreover, the study treats system and data quality assessment in a mainly quantitative manner, though understanding of the process, the language of the questionnaires (used across different levels), and the individual biases/backgrounds of data collectors and facilitators may pose certain limitations.

In terms of organization, the report consists of five sections: Introduction; WaSH M&E-MIS; Methodology; Data Analysis and Findings; and Conclusion and Recommendations.

2. WaSH M&E-MIS
Across the world, a growing number of governments are working to improve their decision-making by developing and sustaining systems that facilitate effective decision-making and performance measurement over time across various sectors. In other words, M&E systems are necessary for results-based policy-making, budget decisions, management and accountability.

Though there are many reasons for the increasing efforts to strengthen M&E systems in the public domain, doing more with less, and with quality, is perhaps one of the most significant. There are, indeed, technical aspects of M&E systems that need to be managed carefully. However, if an M&E system is underutilized, it wastes resources on the one hand and is likely to seriously undermine the quality of the information the system produces on the other.

The M&E systems also require the pooling of data and information routed through the pertinent sectoral ministries (e.g., Water & Energy, Education, Health), agencies (the CSA and regional branch offices), and a variety of other sources (the Ethics and Anti-corruption Commission, the Human Rights Commission, institutions, etc.). This involves bringing together administrative data generated by the concerned sectors and information obtained through formal statistical surveys and other similar research studies.

In line with this setup, the current Growth and Transformation Plan (GTP) was developed as the national planning document for the period 2010/11-2014/15, mainly directed at achieving the Millennium Development Goals (MDGs). As with the WaSH program, measuring the success and improving the management of this nationwide and highly resource-demanding initiative is predicated on strong M&E systems responsible for producing quality data.

To this effect, the GTP gives priority to improving sectoral data management systems by undertaking M&E system assessments and verification of the data collected through the established systems across the various reporting levels (e.g., Kebele, Woreda, Zone, Regional and Federal). To a great extent, however, the realization of the associated objectives depends on the systems reporting quality data for use in effective decision-making.

The conceptual model of WaSH M&E was proposed in November 2007 (see Figure 1). It describes the range of WaSH M&E instruments used by the kebele, town/woreda, regional and federal levels to regularly report on the progress of the sector.

Figure 1: Overview of WaSH M&E System

(Source: Ethiopia WaSH M&E Framework & Manual)

The above framework provides five rural water supply M&E instruments and four urban water supply and sewerage M&E instruments (given under different sections of the manual) to be set up by the MoWR in consultation with the Regional Water Resources Bureaus and Woreda Water Desks. To link the WaSH instruments to planning at the lowest administrative level, the development of Kebele WaSH action plans is put forward.

Modifications are made to the WaSH M&E instruments embedded in the education and health sector MISs, as well as to the WaSH M&E instruments contained in the household survey system (managed by the Central Statistical Agency) and in the GoE's financial management system (managed by MoFED).

Additionally, the manual lists the 15 key WaSH performance indicators on which the integrated WaSH M&E system is designed to report, showing which instrument(s) each is drawn from and describing the analytical narrative to be presented with each. These 15 indicators were derived from over 150 indicators (listed in the manual). As presented in Figure 2, each government level is expected to actively use, process and analyze the data generated by the WaSH M&E instruments/indicators.

Figure 2: Overview of WaSH Data Flow and Data Use by Different Government Levels
(Source: Ethiopia WaSH M&E Framework & Manual)
The remaining sections of the manual describe the annual calendar for the whole WaSH M&E system and the process for designing, testing and rolling out a national WaSH M&E system.

In a nutshell, as stated earlier, the WaSH M&E Management Information System (MIS) is a web-based software solution designed to meet the M&E information management needs of the WaSH program. The system is intended to help the program work towards its objectives by making the necessary information available in a timely manner and in the required format, so as to improve the provision of public services associated with water supply facilities at different levels. The system also helps to accurately collect, aggregate, store, share, exchange, and analyze the multi-sectoral data of the program at every level.

Finally, as shown in Figure 3, the WaSH M&E-MIS is an application accessed through a web browser (e.g., Microsoft Internet Explorer), and comprises three components with which users interact: input data forms, output forms, and administrative utilities.

Figure 3: WaSH M&E-MIS Architecture


(Source: Ethiopia WaSH M&E Framework & Manual)

3. METHODOLOGY
The primary objective of this study is to assess the functioning of the WaSH M&E-MIS and verify its capability to produce quality data (using the DQA methodology). For M&E system verification, an 'Integrated MIS Assessment Framework' is used, while for data quality assessment the RDQA approach is employed. M&E system strengthening techniques can thus be developed under the premise that, while it is important to understand the system through which data are generated, aggregated and reported in order to assess their quality, verification of the reported indicators/data is equally relevant.

3.1 WaSH M&E-MIS Verification Approach

An 'Integrated MIS Assessment Framework' approach was used to verify the system's capability to function effectively and thus produce correct and timely data. This approach breaks a typical MIS into four main components:

a) Governance and organizational structure: This provides an adequate business environment for an effective and efficient MIS, including (i) institutional arrangements and service agreements, (ii) good oversight, (iii) clearly defined roles and responsibilities, and (iv) an established process for business improvements.

b) Information management: This ensures the quality of information along the accuracy, correctness, timeliness, completeness, and relevance dimensions.

c) Infrastructure: The physical equipment used to operate the MIS, specifically the hardware, software, and network on which the MIS application runs. This component also covers the connections between the program's central office and the local and regional offices.

d) Application management: This prevents vulnerabilities in the day-to-day operations of the MIS program. MIS applications provide an interface between the user (at federal or regional levels) and the data input provider, controller, and supervisor.

After carefully examining the MoWE/WaSH Implementation Framework (WIF), we propose a three-tier (N-tier) application architecture for better future upgradeability (extensibility), scalability and maintainability of the proposed application databases.


The application layers are presented below (Figure 4):

i) User interface/presentation layer: The presentation layer presents data to the users. We advocate keeping business logic out of the presentation layer, because once business logic starts appearing there, maintenance problems follow. The presentation layer also catches unexpected application errors, handles them gracefully, and displays them to the user. We assume the presentation layer is linked to the business-logic layer by referencing one or more business-logic components.

Figure 4: Application Layers

ii) Business-logic layer: The business layer contains all of the application's logic. We assume the business layer validates all data entered into the system, and we also advocate putting some of the data validation into the database, because someone may go into the database directly to edit data, thereby bypassing the application logic. Although this is the least desirable situation, it does happen, and the application needs to be prepared to handle it. Likewise, if another application accesses the database directly (again, not ideal), some aspects of the data should still be checked, because such an application bypasses most of the business logic. We put these checks into the database as check constraints, referential integrity checks, triggers, and default values.
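The database-side safeguards just mentioned can be sketched briefly. The following is an illustrative example using SQLite; the table and column names are our own and are not taken from the actual WASHCOM schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# A check constraint keeps functional schemes from exceeding the total,
# and a default value documents the expected initial state.
conn.execute("""
    CREATE TABLE scheme_report (
        woreda TEXT NOT NULL,
        schemes_total INTEGER NOT NULL CHECK (schemes_total >= 0),
        schemes_functional INTEGER NOT NULL DEFAULT 0
            CHECK (schemes_functional BETWEEN 0 AND schemes_total)
    )
""")
conn.execute("INSERT INTO scheme_report VALUES ('A', 120, 95)")  # passes

try:
    # This row violates the check constraint even though the application
    # layer was bypassed entirely -- exactly the scenario described above.
    conn.execute("INSERT INTO scheme_report VALUES ('B', 80, 200)")
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```

The same principle carries over to any RDBMS that supports check constraints, referential integrity and defaults; the syntax above happens to be SQLite's.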

iii) Data layer: The database consists of tables of data, stored procedures, views, and various mechanisms to constrain the data entered into the tables. The only business logic contained in the database should be the logic associated with the table columns, as mentioned above. We must therefore take into account the likelihood of moving to a different RDBMS in the future and what the capabilities of that database might be.

Each of these components complements and constrains the others. The integrated framework provides a picture of the implementation of the MIS and helps evaluate and identify areas in need of strengthening. The framework does not, however, prioritize its various MIS components and functions; the priorities depend on the specific challenges, maturity level, and design of each program/sector.
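The layer separation described above can be illustrated with a minimal sketch. The class and function names here are our own, chosen for illustration, and do not come from the WASHCOM code:

```python
# Data layer: a trivial in-memory store standing in for the RDBMS.
class ReportStore:
    def __init__(self):
        self.rows = []

    def save(self, row):
        self.rows.append(row)

# Business-logic layer: all validation lives here, not in the UI.
class ReportService:
    def __init__(self, store):
        self.store = store

    def submit(self, woreda, total, functional):
        if total < 0 or not (0 <= functional <= total):
            raise ValueError("functional schemes must lie between 0 and total")
        self.store.save({"woreda": woreda, "total": total,
                         "functional": functional})

# Presentation layer: converts user input, reports errors gracefully,
# and contains no business rules of its own.
def submit_form(service, woreda, total, functional):
    try:
        service.submit(woreda, int(total), int(functional))
        return "saved"
    except ValueError as e:
        return f"error: {e}"
```

Because the presentation function only delegates, the validation rule exists in exactly one place and cannot be skipped by a different front end that uses the same service.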

Following the above framework, a bottom-up approach to MIS evaluation was adopted: the assessment began with the available infrastructure; the application was then assessed for its ability to leverage that infrastructure; information management practices were evaluated in the context of the application and business processes; and, finally, the business environment was assessed for its conduciveness to the effective operation of the WaSH M&E-MIS.

The main tasks were thus: a) the WASHCOM application software test (background), which identified the process for the physical and functional configuration; b) the WASHCOM web-based application software identification (the system configuration, including hardware, software and the Technical Data Package documentation); and c) the WASHCOM web-based application software overview (the overall design and functionality of the main components and sub-components of the WASHCOM web-based application software). To carry out these tasks, we used an integrated, three-phased approach, as given hereunder.

Phase 1. Diagnostic: In this phase we conducted two main activities:

 First, a scoping study of the existing WaSH M&E-MIS was carried out. The immediate purpose of this study was to analyze the present status of the WaSH M&E-MIS and to identify any weaknesses and gaps in the developed application software. At this stage, the system was assessed with respect to the four MIS components (governance and organizational structure, information management, application management, and infrastructure).

 Second, the present status of the web-based WaSH M&E-MIS application software, already developed and reported to be operational at the MoWE, was analyzed. This part assessed the integrity, proper functioning and functionality of the web-based software. Through this task, the present status, the technology used to develop the application, its functionality and other constraints of the system were identified.

Phase 2. Development of the checklist and testing scenario for evaluating the application software: In this phase, the checklist and test scenario were developed, based on the results of the diagnostic phase. The main purpose of this stage was to produce an evaluation document to guide the evaluation of the developed web-based application software in the next (third) phase.

Phase 3. Evaluation and testing: Based on the developed checklist and test scenarios, we conducted the evaluation and testing of the application software.

For the purpose of data collection pertaining to the study, a set of formal and informal methods was deployed, mainly:


 Review of important supporting documentation (e.g., Ethiopian WASH M&E-MIS system

manual, WASH Implementation Framework, etc.);


 Interviews with IT staff, operational staff, and program representatives at local and other offices;

 Rapid appraisal and analysis to understand the already developed web-based application software.

A list of key items to look for in each supporting document and a sample of interview questions were prepared to facilitate the execution of the plan mentioned above. With respect to the checklist and testing scenario for evaluating the application software, it was assumed that the features of the database should fit into the structure of the MoWE/WaSH program; integrate the databases for efficient communication of data among different departments, administrative levels, and sectors; be accessible to different applications to be developed in the future; be flexible enough to incorporate the structural changes that are likely to take place within the system; be user friendly; be capable of upgrading in the future with minimal disruption of the existing databases; and be sustainable.

A two-stage process was employed in developing the checklist and testing scenario. First, a broad checklist and test scenarios were identified, whereby the checklist and test scenario were developed for the main architecture of the WASH M&E-MIS (web-based application) and other related issues. This was followed by a detailed system checklist and test scenario development phase. The checklist and test cases to evaluate the application were developed in line with the Software Requirement Specifications (SRS), and further categorized into three sections, namely:

a) The WASHCOM application software test background: under this section, a physical configuration audit (PCA), which deals with the physical elements of the WASHCOM system (configuration) and document review, was carried out. Additionally, a functional configuration audit (FCA), an examination of the functional aspects of the WASHCOM system, was performed along with the review of test documents, FCA accuracy tests, and characteristics tests (determining usability, accessibility and maintainability).

b) The WASHCOM web based application software system identification: under this section, we identified the system configuration, including the hardware, software, and Technical Data Package documentation associated with the WASHCOM web-based application software. For this purpose, a detailed test case was developed with instructions to test the application and database in order to validate the operational status of the WASHCOM web based application system. Some of the testing areas include architecture, installation, technology, functionality, data validation, etc.

c) The WASHCOM web based application software characteristic: under this part, the consulting firm identified the overall design and functionality of the WASHCOM web-based application software in accordance with the SRS. It also considered the software functional features according to the following diagram (Figure 5).

On the part of role-based access security, as revealed in the SRS document, the WASHCOM application is expected to provide role-based data access/security during reporting. All users are therefore expected to maintain a username and password to log in, whereby each user is granted access to specific areas. Accordingly, a checklist covering the WASHCOM functionalities in this respect was prepared and administered, along with an inquiry into the physical/technical information on (detailed) water schemes, and the data management/documentation pertaining to sanitation and hygiene.

Figure 5: WASHCOM Architecture

Finally, as discussed earlier, the overall evaluation under this category was performed in two steps: i) based on the checklists and test cases developed (mainly focused on architecture, installation, technology, user interface, etc.); and ii) based on detailed checklists and test scenarios (testing was performed at the MoWE premises).

Additionally, the other part of this assignment includes data quality assessment and verification (using selected indicators) across the various reporting levels (Woreda to Regional/Federal). For this purpose, the Routine Data Quality Assessment (RDQA) approach was undertaken, as given hereunder.

3.2 WaSH M&E-MIS DATA QUALITY ASSESSMENT (DQA) Approach

The RDQA was developed for the purpose of assessing the M&E system and verifying data quality for pre-identified indicators. The conceptual framework of the RDQA is illustrated in Figure 6. For good quality data to be produced by and flow through a data management system, key functional components need to be in place across all levels: the service points/Woredas, the intermediate level(s) where the data are aggregated (e.g., Zones and Regions), and the M&E unit at the highest (Federal) level to which data are reported.

As can be seen from the Figure 6, RDQA is based on three constructs: reporting levels, data quality

dimensions and functional components of data management system. Therefore, the RDQA, for the

purpose of current assessment, has been developed based on a multidimensional concept of data

flows through an M&E system that operates at three (Woreda/Town, Zone/Region, and Federal)

levels, whereby the seven dimensions of data quality can pose certain challenges across the system.

Furthermore, the RDQA identifies nine functional areas that need to be addressed to strengthen the

WaSH M&E-MIS system, and to improve the quality of data produced through the system.

However, the tool may be divided into two core components: 1) assessment of data management

and reporting system, and 2) verification of reported data for key indicator(s) at selected points.

Accordingly, the RDQA questionnaire contains two parts for data collection:

Part 1- M&E System Assessment: Assessment of the relative strengths and weaknesses of 9

functional areas (as given below) of the data management and reporting system. The purpose of this

part is to identify possible threats to data quality posed by the design and implementation of the data

management and reporting system across the (three) stated levels.

[Figure 6 shows the RDQA conceptual framework: reporting levels (Woreda/Town, intermediate aggregation levels (Regions/Zones), and the M&E Unit) through which data flow to yield quality data, judged on seven quality dimensions (accuracy, completeness, reliability, timeliness, precision, integrity, coherence) and underpinned by the functional areas of the data management system for ensuring data quality: M&E structures, functions and capabilities; training; data reporting requirements; indicator definitions/reporting guidelines; data collection and reporting forms/tools; data management processes; data quality mechanisms and controls; links with the national reporting system; and data usage.]

Figure 6: Conceptual Framework of RDQA

Additionally, the questions for the system assessment are grouped as per the following functional

areas (see checklists for details):

1. M&E Capabilities, Roles and Responsibilities

2. Staff Training

3. Data Reporting Requirements

4. Indicator Definitions

5. Data-collection and Reporting Forms and Tools

6. Data Management Processes

7. Data Quality Mechanisms and Controls

8. Links with National Reporting System

9. Data Usage

Part 2- Data Verification: Quantitative comparison of recounted to reported data, and review of the timeliness, completeness and availability of reports. The RDQA approach, in our context, maintained three groups of data collection sheets to be completed: (1) at the Woreda/Town level, (2) at the intermediate aggregation sites (e.g., Zone and Regional offices), and (3) at the M&E (Federal level) unit. For the implementation of the RDQA approach, a total of 2 indicators were agreed upon for verification purposes.
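The quantitative comparison in Part 2 reduces to a recount-to-report ratio per indicator and site. The sketch below is illustrative only; the function name and the counts are assumptions, not survey figures:

```python
def verification_ratio(recounted: int, reported: int) -> float:
    """Recounted / reported: 1.0 means an exact match; values below 1.0
    indicate over-reporting, values above 1.0 under-reporting."""
    if reported == 0:
        raise ValueError("reported count must be non-zero")
    return recounted / reported

# Hypothetical example for one indicator (number of water schemes) at one site
ratio = verification_ratio(recounted=96, reported=100)
print(round(ratio, 2))  # 0.96, i.e. 4% over-reporting at this site
```

A ratio computed this way for each Woreda, Zone/Region, and the Federal M&E unit makes discrepancies directly comparable across the reporting levels.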

As the RDQA tool is designed to assess data related to indicators during a specific (selected) time period, the study confined data verification to the last two consecutive years (2011 and 2012), considering the non-availability of current-year (2013) data during the survey period.

For the purpose of the system assessment and the verification of reported data, the audit team worked with sampled sites/units representing locations where the WaSH M&E-MIS has already been piloted or is claimed to be functional. Accordingly, the consultant team randomly selected 26 piloted sites consisting of 40 woredas, 12 zones including the respective towns, 10 regions (excluding Addis Ababa), and the MoWE at the Federal level.

Additionally, site visits for the system assessment and data verification, in line with the stated objectives, were made by a team provided with the required training/skills for administering the questionnaires/checklists used for data collection from the sampled units. A full-day training session was delivered to the enumerators/data collectors to build understanding of the data collection approach across the various levels. The sampling units include individuals/respondents familiar with the WaSH data management and reporting (M&E) system across all levels (from Woreda/Town to Zone/Region to the Federal M&E level). The second part of the RDQA was to be filled through recounting the numbers/cases (associated with the selected indicators) across the reporting levels.

Similarly, at the Intermediate Aggregation Level (Zone/Region), recounting includes re-aggregation

of the numbers from reports received from all woredas/towns, and a comparison of that number

with the aggregated result that was contained in the summary report prepared by the intermediate

aggregation level and submitted to the next level.
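The re-aggregation check described above amounts to re-summing the constituent reports and comparing the total with the summary figure; an illustrative sketch (the woreda names and counts are hypothetical):

```python
# Re-aggregate the woreda/town reports received at an intermediate level and
# compare the recounted total with the figure in the zone's summary report.
woreda_reports = {"Woreda A": 34, "Woreda B": 21, "Woreda C": 18}  # hypothetical counts
zone_summary_figure = 70  # hypothetical figure submitted to the next level

recounted_total = sum(woreda_reports.values())
print(recounted_total, zone_summary_figure, recounted_total == zone_summary_figure)
# 73 70 False -> a discrepancy of 3 schemes to trace back to source documents
```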

Finally, at the M&E Unit, recounting includes re-aggregating the reported numbers from all reporting entities and comparing them to the summary report prepared by the M&E Unit for the stated reporting period(s). At this level, the reports should also be reviewed for availability, timeliness, and completeness with respect to the two selected indicators (i.e., number of water schemes and number of functional/non-functional water schemes), these being the measures of data quality.

Alternatively, the recounting was simplified by comparing the recounted results in the relevant database/register/document to the summary report. Under this approach, a sample of entries was drawn for 'cross checks': those entries in the database/register were compared to the same information in the source documents. Where errors were found, the full procedure of recounting from source documents described above was implemented.
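The 'cross check' shortcut with its fall-back to a full recount can be sketched as follows (the scheme identifiers and values are hypothetical):

```python
def cross_check(register: dict, sources: dict, sample_ids: list) -> bool:
    """True if every sampled register entry matches its source document."""
    return all(register.get(i) == sources.get(i) for i in sample_ids)

register = {"scheme_001": 12, "scheme_002": 7, "scheme_003": 5}  # summary register
sources  = {"scheme_001": 12, "scheme_002": 8, "scheme_003": 5}  # source documents

if cross_check(register, sources, ["scheme_001", "scheme_002"]):
    print("sample matches: accept the summary report")
else:
    print("discrepancy found: recount all entries from source documents")
```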

Based on the findings of the RDQA (first two sections), suggestions to guide the preparation of an action plan for strengthening the data management system and improving data quality were initiated under the 3rd section of the RDQA template (see Annex for details).

4. DATA ANALYSIS AND FINDINGS
The purpose of this section is to provide the assessment results for the web-based WASH M&E-MIS (the WASH M&E management system integrated with education/EMIS, health, CSA, and MoFED/Ibex). The assessment was undertaken following the methodology discussed in the previous section, using checklists and test cases, document reviews, and the RDQA approach. Additionally, a summary of discussions on the significant issues is presented, on the assumption that these should be resolved beforehand to ensure the effective/successful implementation of the WASH M&E-MIS system. In line with this, the data obtained through the stated approaches/methodologies were analyzed both qualitatively and quantitatively for the WaSH M&E-MIS, in line with the stated objectives: 1) assessment of the data management and reporting system; and 2) data verification pertaining to 2 indicators.

For the purpose of the web-based system assessment, a WASH M&E-MIS Evaluation Checklist (WEC) was developed for evaluating and testing the application software within its development context. As a tool, it helped keep track of the tasks involved in the evaluation and covered a wide array of criteria of importance to the evaluation. Basically, the checklist aims to:

(a) Evaluate WASH M&E MIS web based application software

(b) Provide recommendations/solutions for WASH M&E-MIS web-based software

sustainability, on the basis of evaluation results.

The checklist is built on the inception report submitted by DMISS Plc, taking into consideration the System Requirement Specification (SRS) and Terms of Reference (TOR) documents. It mainly consists of three parts:

(A) The WASH M&E MIS application software test background;

(B) The WASH web based application software system identification; and

(C) The WASH M&E MIS application software characteristic.

The WASH M&E-MIS application software test background mainly focuses on documentation and is subdivided into six sections: System Overview, System Functionality Description, Software Design and Specification, System Test and Verification Specification, System Operations Procedures, and System Maintenance Procedures. Following this, criteria for evaluating the software and hardware, and an assessment of the overall system and subsystems of the WASH M&E-MIS software (in connection with the first part), are provided.

4.1 Current Status of WaSH M&E-MIS

The WaSH M&E-MIS implementation project involves system study, design, development, pilot

test, and deployment of an integrated web-based application.


To date, the WASH M&E MIS software installation is at its initial stage, i.e., all modules have been installed. Pre-determined data were converted from the Access database system to the new WASH M&E-MIS SQL database system. The project involves the installation and implementation of twenty-one modules, including the WASH management system integrated with education, health, CSA, and MoFED/Ibex.

During the testing, the following major project tasks had been completed:

 A data collection instrument, consisting of ten pages, was developed.

 The WASH M&E MIS system was installed/configured.

 End user and technical personnel training was provided.

4.2 The WASHCOM Application Software Testing

The results and evaluations of the PCA and FCA reviews and tests are presented below. Detailed data regarding the acceptance/rejection criteria, reviews, and tests are found in the appendices.

PCA TDP Document Review

We reviewed all the submitted TDP documents of the WASH M&E-MIS system against the deliverables stated in the TOR and SRS requirements. Each submitted document was reviewed against the deliverables specified in the TOR and SRS documents. If the required content was present in one or more submitted documents, the results were summarized and the requirement was accepted; if it was not present, the requirement was rejected. Accordingly, five document reviews were conducted. Of the five Technical Data Package documents reviewed, three were found to meet the requirements and two did not.

FCA Functional and System Integration Testing

The consultants undertook a review of the WASH M&E MIS system functionality against the requirements of the SRS and TOR. Tests covering the system functional requirements were incorporated into three standard system-level integration test cases: two of the tests dealt with the broad characteristics and one with detailed module-level testing. We used the WASH M&E MIS software installed at the xxx premises; formats were prepared, and sample data entry for each sub-component/module of the system was conducted and reported, exercising the input controls, error content, and audit message content of the WASH M&E MIS system. Data manipulation functions (data entry, retrieval/viewing, editing, and deletion) were conducted for the 21 sub-components/modules of the WASH M&E MIS system, and data accuracy and reliability were also checked and tested. In addition, the effectiveness of the security access controls, system integrity, availability, confidentiality, and audit accountability were examined. The content and clarity of user instructions and processes were reviewed for usability.

During the WASHCOM web-based application software characteristic (system-level) testing, numerous documentation and functional defects were noted.


Evaluation of Web-based System (WASHCOM)

Upon completion of all evaluation and testing of the functional and system-level test cases, some of the WASH M&E-MIS sub-components were found to meet the Functional and System Integration requirements of the SRS (provided that modifications to the data edit function are made), while others failed to do so, as can be seen from the attached appendices.

4.3 DQA of WaSH M&E-MIS

4.3.1 System Assessment Verification

The nine functional areas (as described in the methodology section) were assessed using as many as 23 questions in order to assess the data collection and reporting system and to identify the potential risks (see Checklists for details). For the analysis, a standard RDQA application format was designed by the consulting team in MS-Excel for generating the outputs in line with the study objectives. In measuring the performance of the system, each value presented in the following figures represents the average (degree) of responses obtained on the different functional areas across all the reporting levels of the system, whereby a value approaching 1 indicates full compliance (100%) and a value towards 0 indicates absence.
Therefore, as presented in Figure 7, an average score was calculated for each of the 9 functional areas in order to assess the overall performance of the WaSH M&E-MIS across all the levels (Woreda, Zone/Region and central M&E). The overall performance result reveals that while data usage across the system, data collection and reporting forms/tools, links with the national reporting system, and organization and staffing are the major functional areas where the system performs better (over 80%), the remaining functional areas, such as staff training, data reporting requirements, and indicator definitions, score low compared with data management processes and data quality mechanisms.

Figure 7: Overall system performance per functional area


This indicates that the different levels lack trained professionals and have a limited understanding of the data reporting requirements, i.e., what to report, where, how, and when. This may be attributed to the lack of appropriate training given to staff. It also reveals, in some cases, the absence of guidelines/standards and either the non-availability or poor utilization of the computerized system/software for data collection/aggregation and/or reporting across the various levels.


Furthermore, Table 1 presents the data on the 9 functional areas across the 3 levels. As can be seen, on average, the Regional units perform better than those at the Woreda and Zone levels on six of the functional areas. While the lowest level of training provision is found at the Woreda level, on data reporting requirements the Zones perform poorly relative to the Woreda and Regional levels. The major reason for poor (formal) training appears to be the turnover of employees (with some knowledge of the M&E-MIS system) at almost all levels; this has most seriously affected Woreda-level reporting, followed by that of the Zones and Regions.
Table 1: WaSH M&E-MIS performance across reporting levels

Functional area                                 Wereda  Zone   Region  Average per functional area
I - Organization and Staffing                    0.82   0.79    0.90    0.84
II - Training                                    0.29   0.58    0.60    0.49
III - Data Reporting Requirements                0.71   0.33    0.50    0.51
IV - Indicator Definitions                       0.54   0.42    0.90    0.62
V - Data Collection & Reporting Forms & Tools    0.93   0.83    0.80    0.85
VI - Data Management Processes                   0.68   0.74    0.93    0.78
VII - Data Quality Mechanisms & Controls         0.66   0.59    0.84    0.70
VIII - Links with National Reporting System      0.93   0.67    0.89    0.83
IX - Data Usage                                  1.00   0.92    1.00    0.97
Overall performance                              0.73   0.65    0.82

Source: Survey data, 2013.
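The row and column averages in Table 1 follow directly from the per-level scores; the sketch below reproduces them (values transcribed from the table):

```python
# RDQA scores per functional area as (Wereda, Zone, Region), from Table 1.
scores = {
    "Organization and Staffing":                 (0.82, 0.79, 0.90),
    "Training":                                  (0.29, 0.58, 0.60),
    "Data Reporting Requirements":               (0.71, 0.33, 0.50),
    "Indicator Definitions":                     (0.54, 0.42, 0.90),
    "Data Collection & Reporting Forms & Tools": (0.93, 0.83, 0.80),
    "Data Management Processes":                 (0.68, 0.74, 0.93),
    "Data Quality Mechanisms & Controls":        (0.66, 0.59, 0.84),
    "Links with National Reporting System":      (0.93, 0.67, 0.89),
    "Data Usage":                                (1.00, 0.92, 1.00),
}

# Average per functional area (row means)
for area, vals in scores.items():
    print(f"{area}: {sum(vals) / len(vals):.2f}")

# Overall performance per level (column means)
for i, level in enumerate(["Wereda", "Zone", "Region"]):
    col = [vals[i] for vals in scores.values()]
    print(f"{level} overall: {sum(col) / len(col):.2f}")
# Wereda 0.73, Zone 0.65, Region 0.82 -- matching the last row of Table 1
```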
On the other hand, while indicator definitions, data management processes, and data quality mechanisms appear to be better understood by the concerned authorities at the regional level, both the zonal and woreda level units report challenges. However, data usage is well attested across all levels, compared with the other areas.


Furthermore, an attempt was made to compare the overall system performance across the three levels. As can be seen from Figure 8, the regions perform, cumulatively, better than the Woredas and Zones. This may be associated with the organization and staffing, and the comparatively better-trained staff at the regional level with regard to the WaSH M&E-MIS, which is reflected in somewhat better system utilization.

Figure 8: System Performance by Reporting Levels (Source: Survey data)
On the other hand, when it comes to system performance at the Woreda level, as can be seen from Figure 9, all the functional areas except training and indicator definitions perform fairly well. Once again, performing what is expected without training proves highly challenging: training holds the lowest performance score (29%), followed by indicator definitions (54%). However, data usage at this level is found to be perfect (100%), along with links with the national reporting system and data collection and reporting formats/tools (93% each). This can be attributed to the comparatively better organization and staffing (82%) and some informal knowledge of what to report to whom. With targeted (formal) training on the WaSH M&E-MIS, better results can be expected.

Figure 9: System Performance at Woreda Level (Source: Survey data)
With respect to the zone-level performance of the WaSH M&E-MIS, data reporting requirements, indicator definitions, training, and data quality mechanisms and controls perform weakly (Figure 10). However, on data usage, data collection and reporting forms/tools, and organization and staffing, the system performs better. The data reporting requirements, i.e., what to report to whom, how, and when, are reported to be the weak link, closely associated with the non-standard indicator definitions in use and the poor (formal) training provision. Similar to the woreda level, provision of training on the WaSH M&E-MIS and clarification of indicator definitions and data reporting requirements at the zonal level may be assumed to help bring up the system performance.

Figure 10: System Performance at Zone level (Source: Survey data)
Additionally, regarding the system's performance at the regional level, the areas related to data reporting requirements and training are once again found to be weaker (Figure 11), compared with data usage (100%), data management processes (93%), and indicator definitions and organization and staffing (90% each). Links with the national reporting system also perform well in the context of the WaSH M&E-MIS, which reflects the high data usage by decision-makers at the regional level. Nevertheless, formal training provision and clarification of the data reporting requirements under the WaSH M&E-MIS remain challenging.

Figure 11: System Performance at Regional level (Source: Survey data)
Moreover, a close assessment on a regional basis points to certain discrepancies and variations among the regions in the context of the WaSH M&E-MIS, the major differences being in their reporting mechanisms. In the Amhara region, while the Woredas and the town utility (Debre-Markos) report simultaneously to the zone (copied), the town also reports to the Water & Energy Bureau at the regional level. On the other hand, the Bahir Dar town utility does not report to any zone, but directly to the Regional Bureau. Moreover, at the woreda level in some parts of this region, no responsible/dedicated staff were found, though some representatives claimed to have taken part in the National WaSH Inventory/survey and thus to have received some formal training (once). Still, many of the units (at the woreda and zone levels) use the standard WaSH Inventory report format in the region.


Additionally, in order to prepare the WaSH M&E reports, a mostly paper-based approach is in use, with occasional support from applications like MS-Word and Excel to prepare the report at the zonal level; the same print-out is then used for reporting to the next (regional) level. This represents poor utilization of the web-based WaSH M&E-MIS, at least in the Amhara region.
Opposite to this, in the Tigrai region, in one of the sampled woredas, an EVDO/wireless-based internet connection was found to be used to report from the Woreda to the next level (Region); online reporting thus appears to be practiced. However, daily/weekly online reporting in the region also depends on network connectivity: in the absence of an internet connection, the manual/paper-based system takes over. Therefore, in the areas where staff follow online reporting under the WaSH M&E-MIS, the internet appears to be a big challenge, forcing the system to fall back to paper. Additionally, as reported before, in this region the Woreda is expected to report directly to the Region (without passing through a zone). Similar cases have been reported from Afar, Somali, Harari, and Dire-Dawa. In this region too, staff training appears to be a serious obstacle to adopting the web-based reporting system in a full-fledged manner, though partial training has been received by some staff at the regional level. The region also suffers from some logistical problems associated with visiting the Kebeles/sites from which data are obtained. Nevertheless, the region appears somewhat prepared to join the One-WaSH initiative, expected to be realized in 2014.


With respect to the Harari, Afar, and Somali regions, the woredas report directly to the region. However, Dire-Dawa is expected to report to the City Administration Bureau of Water & Energy, along with direct reporting to the Federal level. In these regions too, the majority of the reports are prepared manually (paper-based) for communication with the regional level; at the regional level, however, standard formats are maintained. Surprisingly, in the Jigjiga/Somali region, no numeric data pertaining to the WaSH M&E-MIS were found, though there are persons responsible for data collection and reporting, as in Dire-Dawa. In the Afar region, by contrast, the availability of data related to the M&E/reporting system is attested. In general, these regions claim to have been exposed to detailed training on the National WaSH Inventory, except for the Somali region.
With respect to Assosa, Gambella and some parts of Oromia, while the Woredas report to the Zone, the Towns are claimed to report both to the Zone and directly to the Region. Once again, a shortage of manpower trained on the M&E system and high employee turnover are witnessed as serious problems in these regions. Moreover, no manuals/guidelines on the WaSH M&E-MIS were found with the concerned units at the Woreda and Zone levels. In the Oromia region, data reports are prepared using MS-Excel, while the remaining (stated) regions maintain a manual/paper-based approach, whereby data are compiled using MS-Word and the print-outs are sent to the next levels.


Specifically, in Assosa, some discrepancy of data between the Woreda and Zone levels was witnessed for the verification indicators: the data were reported to have been re-sent to the Zone and Region with some modifications, but the update was not carried forward at the Zonal level before sending to the Region. In some Zones (in Oromia and Assosa), problems are reported with the re-submission of data by Woredas to Zones several times/on different days; as a result, the Zones are unable to capture the most recent data in their communication to the Region.
Also, in the Gambella region, a data discrepancy was witnessed whereby the Zones hold figures lower than those communicated by the Woredas. Such problems are quite common in all units where electronic (computerized/web-based) data management is absent. Although almost all the Woredas and Regions maintain monthly, quarterly and annual reports (in well-structured WaSH Inventory formats), mostly prepared in MS-Word, the soft copies are difficult to access when required from these units. However, the availability of responsible persons/staff is well attested in the Gambella (all three levels) and Assosa (regional level) regions.

4.3.2 Data Reporting Verification


Figure 12 presents the availability, timeliness and completeness of reports across the three levels. As can be seen from the figure, the zonal data appear comparatively better than those at the Woreda and Regional levels on both availability (92%) and timeliness (75%), though the WaSH M&E-MIS reports at the Woreda level appear more complete (86%) compared with the zone (83%) and region (67%). At the regional level, the system-associated reports are found to be the least available (69%), and are delayed (26% timely) and incomplete (67%).
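Availability, timeliness and completeness are each a share of the expected reports; a minimal sketch (the counts are hypothetical, chosen only to reproduce the zonal percentages cited above):

```python
def report_metrics(expected: int, available: int, on_time: int, complete: int) -> dict:
    """Availability, timeliness and completeness as whole percentages of expected reports."""
    pct = lambda n: round(100 * n / expected)
    return {"availability": pct(available),
            "timeliness": pct(on_time),
            "completeness": pct(complete)}

# Hypothetical zonal counts: 12 expected reports, 11 received, 9 on time, 10 complete
print(report_metrics(expected=12, available=11, on_time=9, complete=10))
# {'availability': 92, 'timeliness': 75, 'completeness': 83}
```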


This can be attributed to the fact that the Woredas and Zones are expected to report to the Regions and therefore, one way or another, complete the reports and forward them to the next level in a timely manner. The Regions, however, may not be required to report as frequently to the next (Federal) level in the context of the WaSH M&E-MIS, though they appear well placed on indicator definitions, on the organization and staff responsible for reporting through the system, and on (electronic) data management processes. This shows that, for most regions, reporting largely stops at the regional level, which is why their reports appear less complete compared with the Woreda and Zonal levels.

Figure 12: Availability, Timeliness and Completeness of Reports across the Levels
However, with respect to the other objective, data verification, serious challenges were faced in collecting the data reported through the WaSH M&E-MIS over the last 2 years. The main data that have been reported (once) through the system are associated with the National WaSH Inventory. From this perspective, therefore, the system appears to have failed, owing to the many problems discussed above (mainly related to training and data reporting requirements in most cases).
Specifically, in the case of the Amhara region, the number of water schemes (indicator 1) is reported only for the year 2011 and not for 2012 (at the regional level), with the argument that it will be compiled later on. This supports the claim made in Figure 12 that, at the regional level in general, timeliness in report preparation is not well attested. Moreover, different Woredas use different indicators, pertaining to the WaSH M&E-MIS, in their reporting to the next levels. The delayed reporting can be attributed to the manual reporting system/process, caused by various problems (e.g., internet connectivity, computer systems, training, logistics, etc.) across all the levels, particularly in the Amhara region. Among the stated problems, the inadequacy of formal training on the WaSH M&E-MIS and staff/employee turnover appear to be the most severe, though various units do report through the National WaSH Inventory forms.
On the other hand, in the Tigrai region, no policy/management documents for the WaSH M&E system were found, although the relevant manual (on WaSH administration and operation) was readily available with the concerned authorities. The reporting system also appears to be single-channel (woreda to region), and staff are not assigned specifically to WaSH M&E-MIS but handle it as a secondary duty, leaving the WaSH unit unorganized. Generally, data pertaining to 2011/12 (associated with the indicator selected for verification) were found at the regional level without any manipulation, but remained untimely.


Finally, in the Gambella region, most of the woredas met timeliness requirements, though some reports were submitted to the next levels beyond June, which is reported to be caused by frequent employee turnover (across all three levels).

5. CONCLUSION AND RECOMMENDATIONS

This study was carried out to assess the WaSH M&E-MIS from the perspective of its current deployment status, the associated facilities and staff, and the use of data maintained through the system for decision-making by interested stakeholders. Based on the assessment, it is found that the system is at an infant stage: even after installation of the web-based software at various units across the reporting levels and in different regions, there are still challenges associated with its use for reporting from one level to another. Additionally, there is high staff turnover among the reporting units (across almost all levels, from woreda to regional), which creates a shortage of trained staff able to deliver output in line with the system's requirements, even though such positions are formally provided for in the organizational/WaSH program structure.

On the part of the system (web-based application software) assessment, adequate documentation concerning WaSH M&E-MIS, which is expected to be maintained by the MoWE, was found to be missing. This poses serious challenges to effective implementation of the system for quality data reporting and dissemination. Therefore, documentation covering software design and specifications, system functionality, testing and verification, operating and maintenance procedures, and technical/functional user purposes should be made available or developed. Furthermore, additional modification/customization of the existing (web-based) program is required to fit the general MoWE/WaSH organizational requirements and framework. Moreover, it was found that there is no proper checking mechanism for the data being entered into the newly implemented (web-based) electronic system, which may make it difficult for the organization and concerned stakeholders to obtain reliable data/information from the system.

Additionally, basic utilities, computers in general and Internet connectivity in particular, pose further serious challenges to the proper functioning of the system (WaSH M&E-MIS). Many of the units (particularly at woreda and zone levels) were found not to maintain any guidelines on the use of the system, nor well-structured reporting formats. It has also been observed that the authorities treat WaSH M&E-MIS activities as a secondary rather than a primary function, without adequate training, and carry them out mostly manually instead of using the web-based software. Finally, the WaSH M&E-MIS appears to suffer poor connectivity across the reporting levels: reporting, in general, ends at the region rather than reaching the national/Federal level, as would be expected of any strong M&E system.

Based on these findings, the consultants put forward the following recommendations:

 Concerning the security loopholes (associated with data manipulation), it is strongly recommended to devise a control mechanism backed by an explicit security policy, so that data manipulation in any form can be prevented. Moreover, an auditing mechanism should be implemented within the existing system, so that any data manipulation is recorded in a log file.


 From the perspective of WaSH M&E-MIS in general, the findings of the assessment reveal that the functional areas of data management processes, data-reporting requirements, data quality mechanisms and controls, and training provided to M&E staff pose serious challenges to effective implementation of the system across almost all levels. Special attention is therefore required to handle these issues carefully, by deploying mechanisms to implement and sustain the newly piloted WaSH M&E-MIS effectively.


 As the majority of the functional areas are directly linked to the training component, a sound training program needs to be developed and effectively delivered across the reporting levels (from woredas to regions). All technical personnel should be well trained on the technical side of the WaSH M&E-MIS application software, in order to acquire the specific skills the system demands.


 Keeping in mind the existing system design and functioning, it is recommended that technical staff acquire more of the consultant's (the application software developer's) specific skills, including customization of the application engines and reporting tools.


 There also needs to be a transition plan for switching over to the new system, whereby the skills, knowledge and other resources held by staff can be protected on the one hand (in the areas of M&E-MIS) and developed further on the other, keeping staff motivated to discharge their duties with interest.


 Considering that standard data collection/reporting formats and tools (the National WaSH Inventory), well-maintained links with the national reporting system, and indicator definitions are the major functional areas where the system is expected to perform better, adequate arrangements should be made. As the sustainability of the system will be guided by standard reporting mechanisms and formats, standardization should be maintained so that the system functions effectively and efficiently through easy data aggregation across the levels.


 Given that different reporting levels are unaware of the data management process, or show non-availability or poor utilization of the web-based/computerized software (WaSH MIS) for data collection and/or reporting, efforts should be made to arrange adequate installations, supported by user manuals that enable learning by doing (over time), along with short-term, specialized training modules.
 Moreover, as the findings show gaps in data-reporting requirements, in terms of where and when to report the data, caused either by employee turnover or by the absence of related communications, documents or policy manuals, attempts should be made to initiate and promote a two-way communication channel, both upward and downward, to resolve queries as and when required, along with standard reporting procedures.
 Concerning the regional M&E unit assessment, challenges were observed in the areas of data management and of data quality mechanisms and controls. These require appropriate and immediate attention: the moment the regions are connected to the higher (Federal) level, problems of late and incomplete reporting may be encountered.
 In the zonal and woreda level assessments, once again, data-reporting requirements, data quality mechanisms and controls, and data management processes appear to pose challenges. Therefore, in order to strengthen the WaSH M&E-MIS at zonal and woreda levels, dedicated staff should be appointed within the organizational framework, with clear responsibilities, knowledge and sector-specific experience.


 Looking at WaSH M&E-MIS performance, even though the data collection and reporting forms/tools are available to the various units (across different levels), the system appears to be, to some extent, dysfunctional, owing to lack of training or the absence of computer systems or Internet connectivity. This requires immediate attention so that the associated benefits (quality data for decision-making) can be obtained from the system, for example by facilitating Internet connectivity through special arrangements with the service provider (Ethio Telecom) at facility/unit level.


 As the data management processes reflect poor use of the computerized system for data collection, effective reporting is found to be absent, along with gaps in the areas of training and data-reporting requirements. Steps are therefore expected to be taken to use the electronic system, where applicable, in the process of data collection and reporting, by providing specific software solutions for the purpose (WaSH M&E-MIS).


APPENDICES
APPENDIX-A

The WASHCOM web-based application software system identification

A. HARDWARE STANDARDS

HARDWARE STANDARDS CHECKLIST
 Performance Requirements (performance requirements address a broad range of parameters)
 National Datacenter Network Infrastructure
 MWR Datacenter Network Infrastructure
 RWB Network Infrastructure
 Mekelle RWB Network Infrastructure
 Afar RWB Network Infrastructure
 Amhara RWB Network Infrastructure
 Oromia RWB Network Infrastructure
 Somali RWB Network Infrastructure
 Hareri RWB Network Infrastructure
 Southern Peoples RWB Network Infrastructure
 Benishangul-Gumuz RWB Network Infrastructure
 Gambela RWB Network Infrastructure
 Diredawa City Administration WB Network Infrastructure
 Addis Ababa Administration WB Network Infrastructure
 Woreda Water Desk Network Infrastructure
 WaSH Stakeholder Sectors Network Infrastructure
 Stakeholders and Donors Network Infrastructure

B. General application software standards

SOFTWARE STANDARDS CHECKLIST
 Architecture: whether the web application is loosely coupled, encapsulated, scalable, extensible, and maintainable
 Installation: how easy it is to install
 Technology: what technology was used to develop the application software; the application architecture; how fast it is; and how concurrency issues are identified
 Speed and Accuracy: how fast the application is, and whether it supports more than one user updating the same record simultaneously (i.e., how the system addresses concurrency)
 User interface: how user-friendly the application software is

C. Detailed application software considerations

In this detailed checklist, the following points are taken into consideration to check each component/module of the WASH M&E MIS application software:
C1: Functionality: whether the component/module works properly
C2: Functionality: whether the component/module meets the user requirements
C3: Data manipulation: whether the user can enter, view, modify, and delete records
C4: Error free: whether all bugs are cleared for the component/module
C5: Data validation: whether the component/module supports basic data validation rules

Test results for each WASH M&E MIS application software characteristic against test cases C1-C5 defined above (a dash indicates that no result was recorded):

WASH M&E MIS application software characteristic      C1   C2   C3   C4   C5
Managing WASHCOM Role-based Access Security           No   No   No   No   Yes
Manage Information on "Reporting & Data Source"       Yes  Yes  Yes  Yes  Yes
Manage WASHCOM Instrument Information                 Yes  No   No   Yes  Yes
Consolidate WASHCOM at Kebele Level                   -    -    -    -    -
Manage Water Scheme & PTS Technical Details           Yes  No   No   Yes  Yes
Management of RWS Construction Projects               -    -    -    -    -
Manage UWS Facility One-Time Inventory                Yes  No   No   Yes  Yes
Management of UWS Facility Service Level              Yes  No   No   Yes  Yes
Management of UWS Construction Projects               Yes  No   No   Yes  Yes
Maintain Annual School Census Data                    Yes  Yes  Yes  Yes  Yes
Maintain MoH Health Household Register                Yes  Yes  Yes  Yes  Yes
Maintain Health Facility Inventory                    Yes  No   No   Yes  Yes
Maintain National Census Population Data              Yes  No   No   Yes  Yes
Maintain CSA Household Survey Data                    Yes  No   No   Yes  Yes
Maintain WASH Financials                              Yes  No   No   Yes  Yes
Produce Kebele WASH Action Plan - Supportive Report   No   No   No   No   No
Produce Aggregate Reports                             No   No   No   No   No
Advanced Direct Data Inquiry & Reporting              No   No   No   No   No
KPI Listings                                          No   No   No   No   No
KPI Cards                                             No   No   No   No   No
WaSH M&E-MIS Application Software Characteristics: Detailed Checklist

WASH M&E MIS application software characteristic items, with test result and remark:

Managing WASHCOM Role-based Access Security
 The system administrator defines system roles: Accepted
 The system administrator assigns access privileges to roles: Rejected (an unmanaged error page is displayed)
 The system administrator associates system users to the right role(s): Accepted
 The system assigns subscription users to the pre-determined subscription-members role: Accepted
 The system enforces the user role after asking the user, during login, to select a role of which the user is a member: Accepted
 The system administrator upgrades/changes roles' privileges: Rejected
 The system administrator removes users from roles and/or reassigns them to the same or different roles: Accepted
 The system administrator removes/deactivates unwanted roles: Accepted
 The system administrator removes/deactivates user accounts: Accepted
 The system logs system administrators' and system users' activities: Rejected

Manage Information on "Reporting & Data Source"
 Merging, deleting, or splitting any administrative structure is not working.
 The admin structure page contains a GPS Reference combo box listing different SHP files; there is no lookup form to add or remove these SHP files.
 The use of Woreda and Zone GPS codes is not clear, especially when creating a region or zone.
 Almost all pages that have a date and time field are not properly validated.
Appendix-B

DATA QUALITY ASSESSMENT (DQA) CHECKLIST TOOL

Introduction

The DQA tool comprises three complementary checklists implemented to assess the quality of data at each level currently involved in managing and reporting WaSH data (i.e., the National WaSH, Regional WaSH and Woreda/Town WaSH levels). The checklist for each of these levels includes questions to assess the data management and reporting system, and instructions for recounting data and comparing them with the data previously reported.

The checklist for each level contains a background information section and the following three main parts:

PART 1 - Systems Assessment: designed to identify potential challenges to data quality created by the design and implementation of data management and reporting systems at the level under consideration.
PART 2 - Data Verifications: designed to assess whether the level under consideration is collecting and reporting data to measure the indicator(s) accurately, completely and on time, and to cross-check the reported results with other data sources.
PART 3 - Reflection on/Implication to an Action Plan: designed to summarize key findings and needed improvements.

DATA QUALITY ASSESSMENT (DQA) TOOL

DQA Checklists for National WaSH Level

May 2013

This DATA QUALITY ASSESSMENT (DQA) TOOL comprises three complementary checklists that enable assessment of the quality of data at each level of the data-collection and reporting system (WaSH M&E-MIS); this checklist applies at the Central/Federal level.

This questionnaire should be used at the Federal WaSH M&E unit. It is divided into three parts:
 PART 1 - Systems Assessment. This assessment is designed to help identify potential risks to data quality linked to the data collection and reporting system at the Central level.
 PART 2 - Data Verifications. Review of the availability, completeness and accuracy of reported data for the given time period.
Indicators to be verified: 1) No. or % of water schemes; 2) No. or % of functional/non-functional water schemes.
 PART 3 - Reflection on/Implication to an Action Plan. Summarize key findings and needed improvements.

Background Information

Name of Sector WaSH


Name of Unit Federal
Indicator(s) Selected
Reporting Period (Verified)
Documentation Reviewed
Date of Verifications

PART 1 - Systems Assessment

This assessment is designed to help identify potential risks to data quality linked to the data-collection and reporting system at Federal level. Each question is answered Yes/No, with comments; the functional area* it covers is given in brackets.

1. Is there an organizational structure that clearly identifies positions with data-management responsibilities at the Central level? [I]
2. Are all staff positions dedicated to M&E and data management systems filled? [I]
3. Is there a senior staff member (e.g., the Program Manager) responsible for reviewing the aggregated numbers prior to the release of reports from the Central level? [I]
4. Are there designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness, timeliness, etc.) received from sub-reporting levels (e.g., regions, woredas)? [I]
5. Is there a training plan covering staff involved in data manipulation and release at the Central level? [II]
6. Have the relevant staff received training on data management, processing and tools? [II]
7. Is there a guideline that defines what to report, when to report, and to whom to report? [III]
8. Are there operational indicator definitions that meet relevant standards and are systematically followed at the Central level? [IV]
9. Is there a description of how to manipulate each indicator monitored by the Central level? [IV]
10. Are there standard data collection and reporting forms used by the Central level? [V]
11. Are source documents kept and made available in accordance with a written policy? [VI]
12. Does the Central level utilize a standard computerized system for data recording, manipulation and reporting? [VI]
13. Are all source documents and reporting forms/formats for measuring the indicator(s) available for auditing purposes (e.g., dated print-outs in the case of a computerized system)? [VI]
14. Is there a written back-up procedure for computerized data entry or data processing? [VI]
15. Are there quality-control mechanisms in place when entering data (e.g., double entry, post-data-entry verification)? [VII]
16. Are there clearly defined and followed procedures to periodically verify source data? [VII]
17. Are there clearly defined and followed procedures to identify and reconcile discrepancies in reports? [VII]
18. Is feedback systematically provided to all sub-reporting levels on the quality of their reporting (i.e., accuracy, completeness and timeliness)? [VII]
19. Are there mechanisms to identify and address data quality challenges? [VII]
20. Does the Central level undertake regular supervision of lower levels? [VII]
21. Are the reporting forms/tools uniformly standardized across all the reporting levels? [VIII]
22. Are all data reported through a single channel of the national reporting system? [VIII]
23. Are the collected and reported data used regularly for decision-making and planning? [IX]

* The 9 functional areas are:
I. Organization and Staffing; II. Training; III. Data Reporting Requirements; IV. Indicator Definitions; V. Data Collection and Reporting Forms and Tools; VI. Data Management Processes; VII. Data Quality Mechanisms and Controls; VIII. Links with the National Reporting System; IX. Data Usage.

PART 2 - Data Verifications

A - Re-aggregate reported numbers from the lower levels (regions). Reported results from all the regions should be re-aggregated and the total compared to the number contained in the summary report prepared by the Federal level. Each item below is recorded for Indicator 1 and Indicator 2, for Year 1 and Year 2, with comments.

1. What aggregated result was contained in the system/summary report prepared by the Federal WaSH Unit?
2. Re-aggregate the numbers received from the lower levels (regions). What is the re-aggregated number?
3. What are the reasons for any discrepancy observed (i.e., missing data, data entry errors, arithmetic errors, missing source documents, etc.)?
4. Is there any reason to over-report/misreport data intentionally? Yes/No
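The re-aggregation and discrepancy questions above amount to a simple arithmetic cross-check: sum the regional figures and compare them with the federal summary total. A minimal Python sketch (region names and figures are made up for illustration, not taken from the system):

```python
# Illustrative regional submissions for one indicator and one year.
regional_reports = {"Amhara": 5200, "Tigrai": 2100, "Gambella": 430}
federal_summary_total = 7800  # number contained in the federal summary report

# Re-aggregate the regional numbers and compute the discrepancy.
re_aggregated = sum(regional_reports.values())
discrepancy = federal_summary_total - re_aggregated

if discrepancy == 0:
    print("Totals match.")
else:
    print(f"Discrepancy of {discrepancy}: investigate missing data, "
          "data entry errors, arithmetic errors or missing source documents.")
```

Any non-zero discrepancy is then traced back to the causes listed in item 3.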

B - Review availability, completeness and timeliness of reports from all regions/zones. Each item below is recorded for Year 1 and Year 2, with comments.

1. How many reports should there have been from all the regions?
2. How many reports are there?
3. Check the dates on the reports received. How many reports were received on time (i.e., by the due date)?
4. How many reports were complete (i.e., contained all the required indicator data, submitted on time, etc.*)?
5. Do the data reported indicate sufficient detail (Indicator 1)? Y/N
6. Do the data reported indicate sufficient detail (Indicator 2)? Y/N
7. Are the currently reported data comparable with the previous year's data? Y/N
8. Are there any information sources other than the Central reporting system?
9. If yes, is the information from the other source(s) comparable with the Central-level information?

* For a report to be considered complete, it should at least include:
(1) the reported count relevant to the indicator; (2) the reporting period;
(3) the date of submission of the report; and (4) a signature from the staff member who submitted the report.
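Where reports are captured electronically, the four minimum completeness criteria above can be checked mechanically. A small Python sketch, assuming a hypothetical report record (the field names are illustrative, not taken from WaSH M&E-MIS):

```python
def report_is_complete(report: dict) -> bool:
    """Apply the four minimum completeness criteria from the checklist:
    (1) reported count, (2) reporting period, (3) submission date, (4) signature."""
    required = ("indicator_count", "reporting_period", "submission_date", "signature")
    return all(report.get(field) not in (None, "") for field in required)

# Illustrative report records (field names and values are assumptions).
ok = {"indicator_count": 412, "reporting_period": "2011/12",
      "submission_date": "2012-07-05", "signature": "Woreda Water Desk Head"}
missing_signature = {**ok, "signature": ""}

print(report_is_complete(ok))                 # True
print(report_is_complete(missing_signature))  # False
```

Counting the reports that pass such a check, against the number of reports expected, yields the completeness rate used in item 4 of section B.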

PART 3 - Recommendations for the Central M&E unit

Based on the findings of the systems review and data verification at the Central level, please describe any compliance requirements or recommend strengthening measures (see the systems assessment by functional area for a review of the system). Action points should also be discussed with the sector.

Description of Action Point | Requirement or Recommendation (please specify) | Timeline

DATA QUALITY ASSESSMENT (DQA) TOOL


DQA Checklists for Regional/Zonal WaSH Level

May 2013

This DATA QUALITY ASSESSMENT (DQA) TOOL comprises three complementary checklists that enable assessment of the quality of data at each level of the data-collection and reporting system; this checklist applies at the Regional/Zonal level.

This questionnaire should be used at the Regional/Zone level. It is divided into three parts:
 PART 1 - Systems Assessment. This assessment is designed to help identify potential risks to data quality linked to the data collection and reporting system at the intermediary (Regional/Zone) level.
 PART 2 - Data Verifications. Review of the availability, completeness and accuracy of reported data for the given time period.
Indicators to be verified: 1) No. or % of water schemes; 2) No. or % of functional/non-functional water schemes.
 PART 3 - Recommendations. Summarize key findings and needed improvements.

Background Information

Name of Sector WaSH


Unit Region/Zone
Indicator(s) Selected
Reporting Period Verified
Documentation Reviewed
Date of assessment

PART 1 - Systems Assessment

This assessment is designed to help identify potential risks to data quality linked to the data-collection and reporting system at Regional/Zone level. Each question is answered Yes/No, with comments; the functional area* it covers is given in brackets.

1. Are there designated staff responsible for collecting and aggregating regional/zone data? [I]
2. Is there a senior staff member responsible for reviewing the aggregated numbers prior to the submission of reports to the Central level? (If yes, who?) [I]
3. Is there a training plan for staff involved in data collection and aggregation at the regional/zone level? [II]
4. Have the relevant staff received training on data collection and aggregation through the (WaSH) MIS at regional/zone level? [II]
5. Is there a guideline that defines what to report, when to report, and to whom to report? [III]
6. Are there operational indicator definitions used by the region/zone? [IV]
7. Is there a description of how to aggregate each indicator monitored by the region/zone? [IV]
8. Are there standard data collection and reporting formats used by the region/zone? [V]
9. Does the region/zone utilize a computerized system for data recording, manipulation and reporting? [VI]
10. Are all source documents (e.g., dated print-outs in the case of a computerized system) and formats available for auditing purposes? [VI]
11. Is there a written back-up procedure for computerized data entry or data processing at regional/zone level? [VI]
12. Are there quality-control mechanisms in place when entering data (e.g., post-data-entry verification)? [VII]
13. Are there clearly defined and followed procedures to identify and reconcile discrepancies in reports? [VII]
14. Are there mechanisms to identify and address data quality challenges? [VII]
15. Is there a staff member to undertake regular supervision of data collection and aggregation at regional/zone level? [VII]
16. Is systematic feedback provided to the designated staff (for data collection/entry) by the reviewer on reporting quality (i.e., accuracy, completeness and timeliness)? [VII]
17. Are all data reported through a single channel of the national reporting system? [VIII]
18. Are the collected and reported data used regularly for decision-making and planning? [IX]

* The 9 functional areas are:
I. Organization and Staffing; II. Training; III. Data Reporting Requirements; IV. Indicator Definitions; V. Data Collection and Reporting Forms and Tools; VI. Data Management Processes; VII. Data Quality Mechanisms and Controls; VIII. Links with the National Reporting System; IX. Data Usage.

PART 2 - Data Verifications

A - Re-aggregate (cross-check) reported data at Regional/Zone level (cross-check by re-aggregating the numbers contained in the summary report). Each item below is recorded for Indicator 1 and Indicator 2, for Year 1 and Year 2, with comments.

1. What aggregated result was contained in the summary (system) report prepared at region/zone level?
2. Re-aggregate the numbers from the source system/documents. What is the re-aggregated number?
3. What are the reasons for any discrepancy observed (i.e., data entry errors, missing data, computing errors, etc.)?
4. Is there any reason to over-report/misreport data intentionally? Yes/No

B - Review availability, completeness and timeliness of reports. Each item below is recorded for Year 1 and Year 2, with comments.

1. Are all reports available at the region/zone level?
2. Were they prepared electronically and on time?
3. Are all reports complete?*
4. Do the data reported on indicator 1 have sufficient detail? Y/N
5. Do the data reported on indicator 2 have sufficient detail? Y/N
6. Is the prepared report for the indicators comparable with the previous year's data? Y/N
7. Are there any information sources on the indicators other than the WaSH M&E-MIS reporting system?
8. If yes, is the information from the other source(s) comparable with the WaSH (MIS) information?

* For a report to be considered complete, it should at least include:
(1) the reported count relevant to the indicator; (2) the reporting period;
(3) the date of submission of the report; and (4) a signature from the staff member who submitted the report.

PART 3 - Recommendations for Region/Zone

Based on the findings of the systems review and data verification at the intermediary (Regional/Zone) level, please describe any compliance requirements or recommend strengthening measures (see the assessment by functional area for a review of the system). Action points should also be discussed with the sector.

Description of Action Point | Requirement or Recommendation (please specify) | Timeline

DATA QUALITY ASSESSMENT (DQA) TOOL

DQA Checklists for Woreda/Town WaSH Level

May 2013

This DATA QUALITY ASSESSMENT (DQA) TOOL comprises three complementary checklists that enable assessment of the quality of data at each level of the data-collection and reporting system; this checklist applies at the Woreda/Town level.

This questionnaire should be used at the Woreda/Town level. It is divided into three parts:
 PART 1 - Systems Assessment. This assessment is designed to help identify potential risks to data quality linked to the data collection and reporting system at the Woreda/Town level.
 PART 2 - Data Verifications. Review of the availability, completeness and accuracy of reported data for the given time period.
Indicators to be verified: 1) No. or % of water schemes; 2) No. or % of functional/non-functional water schemes.
 PART 3 - Recommendations. Summarize key findings and needed improvements.

Background Information

Name of Sector WaSH
Unit Woreda/Town
Indicator(s) Selected
Reporting Period Verified
Documentation Reviewed
Date of assessment

PART 1 - Systems Assessment

This assessment is designed to help identify potential risks to data quality linked to the data-
collection and reporting system at Woreda/Town level.

No. | Question | Functional area* | Yes/No | Comments
1 | Are there designated staff responsible for collecting and aggregating data at Woreda/Town level? | I | |
2 | Is there a senior staff member responsible for reviewing the aggregated numbers prior to the submission of reports to the regional/zone level? | I | | Who?
3 | Is there a training plan for staff involved in data collection and aggregation at the Woreda/Town level? | II | |
4 | Have relevant staff received training on data collection and aggregation through the WaSH M&E-MIS at Woreda/Town level? | II | |
5 | Is there a guideline that defines what to report, when to report and to whom to report? | III | |
6 | Are there operational indicator definitions used by the Woreda/Town? | IV | |
7 | Is there a description of how to aggregate each indicator monitored by the Woreda/Town? | IV | |
8 | Are there standard data collection and reporting formats used by the Woreda/Town? | V | |
9 | Does the Woreda/Town use a computerized system for data recording, manipulation and reporting? | VI | |
10 | Are all source documents (e.g. dated print-outs in the case of a computerized system) and formats available for auditing purposes? | VI | |
11 | Is there a written back-up procedure for computerized data entry or data processing at Woreda level? | VI | |
12 | Are there quality control mechanisms in place when entering data (e.g. post-data-entry verification)? | VII | |
13 | Are there clearly defined and followed procedures to identify and reconcile discrepancies in reports? | VII | |
14 | Are there mechanisms to identify and address data quality challenges? | VII | |
15 | Are there staff who undertake regular supervision of data collection and aggregation at Woreda/Town level? | VII | |
16 | Is systematic feedback provided to the designated staff (for data collection/entry) by the reviewer on reporting quality (i.e., accuracy, completeness and timeliness)? | VII | |
17 | Are all data reported through a single channel? | VIII | |
18 | Are the collected and reported data used regularly for decision-making and planning? | IX | |
* The 9 functional areas are:

I. Organization and Staffing
II. Training
III. Data Reporting Requirements
IV. Indicator Definitions
V. Data Collection and Reporting Forms and Tools
VI. Data Management Processes
VII. Data Quality Mechanisms and Controls
VIII. Links with the National Reporting System
IX. Data Usage

PART 2 - Data Verifications

A- Re-aggregate (cross-check) reported data at Woreda level. (Should be cross-checked by re-aggregating the numbers contained in the summary report.)

No. | Item | Indicator 1 (Year 1 / Year 2) | Indicator 2 (Year 1 / Year 2) | Comments
1 | What aggregated result was contained in the summary (system) report prepared at Woreda/Town level? | | |
2 | Re-aggregate the numbers from the source system/documents/kebele. What is the re-aggregated number? | | |
4 | What are the reasons for the discrepancy (if any) observed (e.g., data entry errors, missing data, computing errors)? | | |
5 | Is there any reason to over-report/misreport data intentionally? (Yes/No) | | |
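The cross-check in Part 2A can be expressed as a simple calculation: re-aggregate the source counts, compare the total against the figure in the Woreda summary report, and flag the report if the discrepancy exceeds an accepted tolerance. The sketch below is illustrative only and is not part of the official tool; the 5% tolerance and all figures are assumptions for the example.

```python
# Illustrative sketch of the Part 2A cross-check: compare the aggregated
# figure in the Woreda summary report against a re-aggregation of the
# source (e.g. kebele-level) records. The 5% tolerance is an assumption.

def cross_check(reported_total, source_counts, tolerance_pct=5.0):
    """Return the re-aggregated total, the % discrepancy, and whether
    the reported figure falls within the accepted tolerance."""
    recounted = sum(source_counts)
    if recounted == 0:
        discrepancy_pct = 0.0 if reported_total == 0 else float("inf")
    else:
        discrepancy_pct = abs(reported_total - recounted) / recounted * 100
    return recounted, discrepancy_pct, discrepancy_pct <= tolerance_pct

# Example: kebele counts of functional water schemes vs the summary
# report figure (all numbers are made up for illustration).
recounted, pct, ok = cross_check(120, [30, 28, 31, 25])
print(recounted, round(pct, 1), ok)  # prints: 114 5.3 False
```

A result of `False` corresponds to rows 4 and 5 of the table above: the assessor then records the reasons for the discrepancy and whether there is any indication of intentional over-reporting.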

D- Review availability, completeness and timeliness of reports.

No. | Question | Number (Year 1) | Number (Year 2) | Comments
1 | Are all reports available at the Woreda/Town level? | | |
2 | Were they prepared electronically and on time? | | |
3 | Are all reports complete?* | | |
4 | Do the data reported on indicator 1 have sufficient detail? (Y/N) | | |
5 | Do the data reported on indicator 2 have sufficient detail? (Y/N) | | |
6 | Is the prepared report for the indicators comparable with the previous year's data? (Y/N) | | |
7 | Are there any information sources on the indicators other than the WaSH M&E-MIS reporting system? | | |
8 | If yes, is the information from the other source(s) comparable with the WaSH (MIS) information? | | |

* For a report to be considered complete, it should at least include: (1) the reported count relevant to the indicator; (2) the reporting period; (3) the date of submission of the report; and (4) a signature from the staff member who submitted the report.
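The four-element completeness rule above amounts to a simple presence check on each report record. The sketch below is illustrative only (not part of the official tool); the field names and sample values are assumptions for the example.

```python
# Illustrative check of the completeness rule: a report is "complete"
# only if all four required elements are present and non-empty.
# Field names are assumed for this example, not taken from the MIS.

REQUIRED_FIELDS = ("reported_count", "reporting_period",
                   "submission_date", "submitter_signature")

def is_complete(report: dict) -> bool:
    """True when every required element is present and non-empty."""
    return all(report.get(f) not in (None, "") for f in REQUIRED_FIELDS)

# Example report record (values are made up for illustration).
report = {"reported_count": 114,
          "reporting_period": "2004 EFY",
          "submission_date": "2012-07-15",
          "submitter_signature": "signed"}
print(is_complete(report))                             # prints: True
print(is_complete({**report, "submission_date": ""}))  # prints: False
```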

PART 3 – Recommendations for Woreda

Based on the findings of the systems review and data verification at the Woreda/Town level, describe any compliance requirements or recommended strengthening measures. See the assessment by functional area (table below) for the review of the system. Action points should also be discussed with the sector.

Description of Action Point | Requirement or Recommendation (please specify) | Timeline
