M&E framework

The Conceptual Hill: Our approach to M&E interventions is grounded in an organization's mission and philosophy, which we call its conceptual hill. In real terms, every organization rests on a theoretical or ideological foundation: its conceptual hill. Institutions are an expression of particular development concepts, which is why the conceptual hill of any organization is expressed in its mission and vision, or in the edicts establishing it. This conceptual foundation largely informs an organization's M&E approach, and our M&E processes are designed to support and achieve our client's conceptual hill.

Planning: We take considerable time at this stage to plan every M&E intervention, because careful, comprehensive planning is the most important first step towards success. A major part of the planning is to state which specific problems or objectives the program will address, and how the project will address them. We use the TenStep project management planning tool to design, construct, and test the project life cycle.

Baselines: One of the fundamental building blocks of effective M&E is to establish a baseline at an early stage of the intervention. Baselines describe the situation or conditions of the problem before the intervention takes place. We use various methods to develop baselines, which form the basis for measuring improvement and the extent to which the objectives of the program are achieved. See the figure below on how we use this information for early-stage analysis.
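As a minimal illustration (not RID's actual tooling), measuring improvement against a baseline can be sketched as the share of the baseline-to-target gap closed at measurement time; the indicator and figures below are hypothetical.

```python
def progress_against_baseline(baseline, current, target):
    """Return the share of the baseline-to-target gap closed so far.

    baseline: indicator value before the intervention
    current:  indicator value at measurement time
    target:   value the program aims to reach
    """
    gap = target - baseline
    if gap == 0:
        return 1.0  # target already met at baseline
    return (current - baseline) / gap

# Hypothetical example: enrolment rate with baseline 40%,
# current 55%, target 70% -> half of the gap has been closed
print(progress_against_baseline(40, 55, 70))  # 0.5
```

A value of 0 means no movement from the baseline, 1.0 means the target has been reached, and negative values flag deterioration.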

Indicators: Developing indicators is a critical exercise in RID's M&E process. Indicators are the criteria against which reform progress is measured: the observable evidence that the objectives are being achieved. They may be indicators of efforts (outputs), effects (outcomes), or impact (change). Setting indicators in these three dimensions supports institutions to move from simple, observable evidence to more analytical evidence, and in some cases from descriptive to analytical issues. Effort indicators ask ‘what was done?’ to show that institutional efforts are producing results. Outcome indicators refer to the immediate and observable changes in relation to outputs and ask ‘what happened?’ Impact indicators, as the name implies, ask ‘what changed?’, referring to the long-term and sustainable changes achieved as a result of the intervention. Setting indicators is an important aspect of M&E, but selecting the appropriate ones requires careful consideration. We use various methods to develop indicators, especially participatory tools.
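To make the three dimensions concrete, indicators can be recorded with their level and the guiding question that level answers. This is an illustrative sketch, not RID's template, and the example indicators are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str       # what is being measured
    level: str      # "output", "outcome", or "impact"
    question: str   # the guiding question for that level

# Hypothetical indicators for a staff-training intervention
indicators = [
    Indicator("Training sessions delivered", "output", "What was done?"),
    Indicator("Staff applying new skills", "outcome", "What happened?"),
    Indicator("Service quality improved", "impact", "What changed?"),
]

for i in indicators:
    print(f"{i.level:>7}: {i.name} ({i.question})")
```

Keeping all three levels in one structure makes it easy to check that every output indicator is paired with outcome and impact indicators further up the chain.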

Data Collection: Our M&E process is largely about data collection. At this stage, we and our clients concentrate on defining and selecting what data to collect and how to collect it. The purpose of selecting and gathering data is to answer a set of questions such as:

Are we doing what we planned to do (internal validity)?

Are we making any difference (impact assessment)?

Are these the right things to do (learning/strategic relevance)?

Data collection questions vary, but many of them address the efficiency, effectiveness, and impact of the project. We help our clients to prioritise the questions, because not all questions are equally relevant, and for some it is difficult to obtain the data needed to answer them.

We use qualitative and quantitative methods of data collection simultaneously to ensure that all data collected are reliable, valid, and credible. We achieve this through triangulation, i.e. the use of different methods and sources of information.
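Triangulation can be sketched as cross-checking the same indicator measured by different methods and flagging disagreement beyond a tolerance; the sources, figures, and tolerance below are hypothetical.

```python
def triangulate(estimates, tolerance=0.1):
    """Cross-check one indicator measured by several methods/sources.

    estimates: mapping of source name -> measured value
    tolerance: maximum acceptable relative spread around the mean
    Returns (mean, consistent_flag).
    """
    values = list(estimates.values())
    mean = sum(values) / len(values)
    spread = (max(values) - min(values)) / mean if mean else 0.0
    return mean, spread <= tolerance

# Hypothetical: household survey, focus groups, administrative records
mean, consistent = triangulate({"survey": 62, "focus_group": 60, "records": 64})
print(mean, consistent)  # 62.0 True
```

When the flag is False, the sources disagree too much to treat the figure as credible, which is exactly the signal triangulation is meant to provide.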

Evaluation: At this stage we conduct data analysis using tools ranging from the simple to the comprehensive, such as log frames, large-scale impact studies, case studies, timelines, and impact grids. This is the process of converting the raw information we have collected into a knowledge base that can inform decision-making, lesson learning, and action. Our evaluation processes measure both progress and impact, and our evaluation techniques continue to build chains of evidence about what is happening in people's lives. In terms of policies and strategies, our approach is to help our clients learn ‘what works’, ‘what does not work’, and ‘why’.
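A log frame links each objective level to its indicator, data source, and assumption. The rows below are a hypothetical sketch of that structure, not a RID template.

```python
# Minimal log frame sketch: one row per objective level (hypothetical entries)
logframe = [
    {"level": "Goal", "indicator": "Improved service quality",
     "source": "impact study", "assumption": "policy environment stable"},
    {"level": "Outcome", "indicator": "Staff applying new skills",
     "source": "follow-up survey", "assumption": "trained staff retained"},
    {"level": "Output", "indicator": "Training sessions delivered",
     "source": "attendance records", "assumption": "participants attend"},
]

for row in logframe:
    print(f'{row["level"]:>7}: {row["indicator"]} (source: {row["source"]})')
```

Reading the rows bottom-up gives the chain of evidence: outputs are verified first, then outcomes, then the goal-level impact.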

Our evaluation ultimately leads to Learning and Action, because we highlight what works and what does not work in terms of activities, approach, and policies. Each data analysis concludes with a set of actions to be taken as a result of the lessons learnt. This information is fed back into the overall framework to correct errors and establish good practices.

Principles of M&E: The practice of M&E rests on certain vital principles that must run through the whole process: participation, capacity building, communication, and reporting. Practitioners need to understand these principles in detail and apply them throughout the M&E program.