Monitoring SRS Implementation: Lessons from CRVS Monitoring
Daniel Cobos from the Swiss Tropical and Public Health Institute (Swiss TPH) presented at a webinar in November 2025. He discussed a previous project that aimed to improve monitoring of civil registration and vital statistics (CRVS) systems and how the lessons learned could be applied to sample registration system (SRS) monitoring.
Effective development of a sample registration system involves creating a plan to monitor and continuously improve the system’s performance. An approach to improving CRVS systems can be adapted for use with SRS.
Example from CRVS monitoring
Effective monitoring of CRVS system performance has been a persistent challenge that various projects have tried to address. Governments and CRVS stakeholders often do not use routine metrics to monitor the performance of their CRVS systems, and some metrics, although informative, are not actionable: they are not linked to decisions, or they are not relevant to a given administrative level or sector. Limited financial resources are a further challenge.
Swiss TPH conducted a project in Bolivia, Vietnam and Zambia to identify strategies to improve CRVS monitoring, in partnership with University of New South Wales (UNSW), Africa CDC, Vital Strategies, Bloomberg Philanthropies, the CDC Foundation, the World Health Organization, the Economic and Social Commission for Asia and the Pacific (ESCAP) and other regional and global experts.
Different lenses for approaching the challenge
The project approached the challenge through six orientations:
Taking a user-oriented “decision first” approach – determining what decisions need to be made, then identifying the best data to inform those decisions.
Attending to both operational and strategic considerations – operational considerations ask whether data collection is occurring as it should, while strategic considerations examine whether the data is having the intended impact.
Applying a systems thinking lens.
Using participatory techniques.
Building on existing frameworks and efforts.
Working with existing data rather than creating new data sources.
The methodologies used were document review, participatory workshops, process analysis and decision-space analysis.
Eight-step approach
The project unfolded over eight sequential phases:
1) Stakeholder engagement – initiating dialogue with key stakeholders to ensure alignment and support, and creating a technical working group.
2) Developing a shared understanding of stakeholder roles and an engagement strategy.
3) Defining priority areas – prioritizing specific areas within CRVS for focused monitoring; this further defined the scope of the project.
4) Identifying stakeholders – creating a comprehensive map of all stakeholders involved in the CRVS system.
5) Needs assessment – evaluating the stakeholders’ decision-making and information needs.
6) Identification of indicators – selecting appropriate indicators of performance and data quality, linking each indicator to specific CRVS stages or functions.
7) Data collection implementation – establishing systems for collecting data to monitor CRVS performance.
8) Data to action – converting the collected data into actionable insights to improve CRVS outcomes.
Sample Registration System (SRS) Monitoring
Applying this approach to SRS monitoring involves developing the questions to be answered, thinking through both program progress and operations, and then planning how and by whom the monitoring will occur.
Strategic questions could include:
• Do routine outputs (mortality rates, cause-of-death fractions) meet precision and timeliness targets at the required statistical domains? (A minimal precision check is sketched after this list.)
• Are decision-relevant products (dashboards, bulletins) reaching national/subnational users, and are there documented management actions that follow?
• Do internal consistency checks and triangulation with external sources indicate acceptable data quality (completeness, plausibility, stability)?
• Are governance, ownership and financing arrangements adequate for continuity (custodianship, data release, domestic budget lines, technical assistance roles)?
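To make the precision question concrete, here is a minimal sketch in Python of a design-based precision check, assuming cluster-level death counts and person-years are already compiled. The function name, sample figures and the 10% relative standard error (RSE) target are illustrative assumptions, not values from the webinar.

```python
import math

def rse_of_rate(cluster_deaths, cluster_person_years):
    """Crude death rate and its relative standard error (RSE) from
    cluster totals, treating clusters as primary sampling units."""
    n = len(cluster_deaths)
    total_py = sum(cluster_person_years)
    rate = sum(cluster_deaths) / total_py          # deaths per person-year
    # Linearized (ratio-estimator) variance across clusters; the residuals
    # d_i - rate * py_i sum to zero by construction.
    residuals = [d - rate * py
                 for d, py in zip(cluster_deaths, cluster_person_years)]
    var = n / (n - 1) * sum(r * r for r in residuals) / total_py ** 2
    return rate, math.sqrt(var) / rate             # (rate, RSE)

# Illustrative data: 5 clusters with observed deaths and person-years.
rate, rse = rse_of_rate([12, 8, 15, 10, 9], [4800, 5100, 5000, 4700, 4950])
verdict = "meets" if rse <= 0.10 else "misses"
print(f"CDR = {1000 * rate:.2f} per 1,000 person-years; "
      f"RSE = {100 * rse:.1f}% ({verdict} a 10% target)")
```

Treating clusters as primary sampling units keeps the check honest about the sample design without requiring full survey-weights machinery.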
Operational questions could include:
• Are all expected deaths in sampled clusters being detected, assigned, and followed up within the planned time windows?
• Is the field-to-server pipeline (device → server → analysis portal) functioning with acceptable latency and error rates? (A latency and error-rate sketch follows this list.)
• Are supervision and quality-assurance protocols (spot checks, re-visits, recapture) being executed as designed and documented?
• Do we have the operational capacity (human resources, equipment, connectivity) to keep pace with the planned workload and address backlogs?
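One way to answer the pipeline question routinely is to compute latency and error rates from the submission log. The following is a minimal sketch, assuming each record carries a device timestamp, a server-receipt timestamp and a status flag; the field names, records and service targets are illustrative assumptions.

```python
from datetime import datetime
from statistics import median

# Illustrative submission log: device capture time, server receipt time, status.
submissions = [
    {"device_ts": "2025-03-01T08:10:00", "server_ts": "2025-03-01T08:12:00", "status": "ok"},
    {"device_ts": "2025-03-01T09:00:00", "server_ts": "2025-03-02T10:30:00", "status": "ok"},
    {"device_ts": "2025-03-02T11:05:00", "server_ts": "2025-03-02T11:06:00", "status": "error"},
]

def hours_between(start, end):
    """Elapsed hours between two ISO-format timestamps."""
    return (datetime.fromisoformat(end) - datetime.fromisoformat(start)).total_seconds() / 3600

latencies = [hours_between(s["device_ts"], s["server_ts"])
             for s in submissions if s["status"] == "ok"]
error_rate = sum(s["status"] != "ok" for s in submissions) / len(submissions)

# Compare against hypothetical service targets (e.g. 24 h latency, 2% errors).
print(f"median device-to-server latency: {median(latencies):.1f} h")
print(f"submission error rate: {100 * error_rate:.1f}%")
```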
Monitoring progress and operations
Tracking progress is necessary to understand whether the SRS project is delivering the planned activities on time and within budget. Example indicators, illustrated in the sketch after this list, include:
· Achievement of milestones, such as finalizing the sampling frame;
· Budget execution;
· Interoperability arrangements, such as signing the memorandum of understanding; and
· Workforce capability, such as the percentage of staff recruited or trained.
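A progress snapshot over these indicators can be very lightweight. The following sketch assumes the team tracks milestones, budget and staffing in a simple structure; every name and figure is an illustrative placeholder.

```python
# Illustrative progress snapshot; every figure below is a placeholder.
milestones = {"sampling frame finalized": True,
              "interoperability MoU signed": True,
              "field staff trained": False}
budget = {"allocated": 500_000, "spent": 310_000}
staff = {"planned": 120, "recruited": 96}

print(f"milestones achieved: {sum(milestones.values())}/{len(milestones)}")
print(f"budget execution:    {100 * budget['spent'] / budget['allocated']:.0f}%")
print(f"staff recruited:     {100 * staff['recruited'] / staff['planned']:.0f}%")
```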
Monitoring routine SRS operations will ensure that the system produces timely, complete and high-quality data that is used to inform policy. Example operations indicators, computed in the sketch after this list, include:
· Completeness of data collection, such as percentage of vital events captured;
· Data quality, such as the percentage of verbal autopsies with an undetermined cause of death;
· Data timeliness; and
· Data integration and use, such as number of people with access to the dashboard.
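Each of these operations indicators reduces to a ratio over routinely available counts, as in this minimal sketch; all counts are illustrative placeholders, and the 30-day timeliness window is an assumed target.

```python
# Illustrative counts; in practice these come from the SRS database and an
# external benchmark (e.g. a demographic estimate of expected deaths).
captured_deaths, expected_deaths = 872, 940
va_completed, va_undetermined = 610, 49        # verbal autopsies
reported_within_30_days = 801
dashboard_users = 37

print(f"completeness of capture: {100 * captured_deaths / expected_deaths:.1f}%")
print(f"undetermined cause of death: {100 * va_undetermined / va_completed:.1f}% of VAs")
print(f"deaths reported within 30 days: {100 * reported_within_30_days / captured_deaths:.1f}%")
print(f"users with dashboard access: {dashboard_users}")
```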
To collect this information consistently, the team needs to create: governance mechanisms to anchor the monitoring system, such as a steering committee; a clear RACI matrix to define who is responsible and accountable for monthly dashboards and other products; and “learning loops” through which the information generated informs decisions.
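As an illustration of what such a RACI matrix might look like in machine-readable form (the products, roles and assignments below are assumptions for the sketch, not recommendations from the webinar):

```python
# Illustrative RACI matrix for monitoring products: R = responsible,
# A = accountable, C = consulted, I = informed.
raci = {
    "monthly dashboard":  {"R": "M&E officer", "A": "SRS coordinator",
                           "C": "statistics office", "I": "steering committee"},
    "quarterly bulletin": {"R": "analysis team", "A": "statistics office",
                           "C": "ministry of health", "I": "partners"},
}
for product, roles in raci.items():
    print(product.ljust(20), "  ".join(f"{k}: {v}" for k, v in roles.items()))
```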