Uttarakhand National Health Mission × IIT Bombay Virtual Labs · Multi-Department Scoping Internal · For Official Use

Process and Problem Analysis — Multi-Department Compilation

Consolidated scoping output from the 7 May 2026 working sessions held with the Uttarakhand NHM. Five department-level analyses are presented below as separate tabs. Each follows an identical five-deliverable template: information workflow, departmental process flow, episode swim-lane, problem statements, and a problem-mapping table.


Cross-Department Synthesis — Executive Summary

Consolidated view across the five 7 May 2026 working sessions held with the Uttarakhand National Health Mission. Each department was analysed independently using the same five-deliverable template; this summary draws out the recurring patterns, maps them against departments, and proposes a sequenced intervention roadmap.

Document Ref.: SUM-UK / SCOPE / 2026-05 / 000
Version: 1.0
Date of Issue: 07 May 2026
Pages: 1 of 1 (web)
Prepared For: Uttarakhand National Health Mission
Prepared By: IIT Bombay Virtual Labs Team
Source: Compiled from MCH, NCD, TB, ASHA, IMM analyses (7 May 2026)
Status: Draft for review
Purpose. Five sessions, ninety-six problems in total — but the problems are not ninety-six independent items. They cluster into a small number of cross-cutting themes that recur across departments. This summary identifies those themes, shows how they manifest in each department, and translates them into consolidated interventions sequenced across four time horizons. The detail behind every claim here lives in the per-department tabs; the synthesis below is meant to be read first and used as a navigation map.
Section 1

Aggregate Snapshot

High-level figures summarising the five departmental analyses. Five sessions in a single working day produced ninety-six diagnostic problem statements, the majority of which carry a High priority rating. Counts vary by department, reflecting the depth of issues raised in each session. The priority distribution makes the scale of the operational queue visible at a glance.

Departments engaged on 7 May 2026: 5
Problems identified across all sessions: 96
High-priority: 59 (61% of total)
Cross-cutting themes across departments: 7
Priority distribution across all 96 problems
59 High
31 Medium
5 Long-term
1 Defer
High — must address to unblock operations · Medium — should address in parallel · Long-term — strategic horizon · Defer
Note. Priority labels are assigned per-department against that department's own programme objectives; aggregation here is volumetric, not weighted by clinical or programmatic significance.
Section 2

Cross-Cutting Themes

Seven themes recur across the five departmental analyses. Each is grounded in specific problem statements (cited at the foot of every theme block) and is paired with a consolidated intervention that would address the theme across the departments where it manifests.

THEME 01

Cross-Programme Data Fragmentation

Manifested in: All five departments

Patient identity, vaccination records, NCD profiles, TB notifications and ASHA-captured family-level data live in vertical programme silos. Reconciling a single individual across portals — U-WIN, ABHA, NCD, RCH 2.0, HMIS — is a manual exercise where it is possible at all. Every department raised this from a different angle.

Consolidated intervention. A cross-programme patient identity layer with controlled fan-out to existing portals; deduplication at the centre; federation rather than replacement.
References: MCH P-04, P-10 · NCD P-12 · TB P-07 · ASHA P-12 · IMM P-06
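As an illustration of what "deduplication at the centre" could look like, the sketch below groups records from several portal feeds on normalised demographics before any shared identifier is fanned out. The field names, feed names, and matching rule are hypothetical assumptions for this sketch only, not the schemas of U-WIN, ABHA, or any other portal named above.

```python
from collections import defaultdict

def normalise(record):
    # Blocking key built from demographics most feeds share.
    # Field names are invented placeholders, not real portal fields.
    return (
        record["name"].strip().lower(),
        record["dob"],                       # ISO date string
        record["village"].strip().lower(),
    )

def reconcile(portal_feeds):
    # portal_feeds: dict of feed name -> list of record dicts.
    # Buckets records by normalised key; a bucket with entries from
    # more than one feed is a candidate for the same individual.
    index = defaultdict(list)
    for portal, records in portal_feeds.items():
        for rec in records:
            index[normalise(rec)].append((portal, rec))
    return {key: entries for key, entries in index.items() if len(entries) > 1}

feeds = {
    "u_win": [{"name": "Asha Devi", "dob": "1992-03-14", "village": "Rampur"}],
    "ncd":   [{"name": "asha devi ", "dob": "1992-03-14", "village": "rampur"}],
}
matches = reconcile(feeds)
print(len(matches))  # 1 — one cross-portal candidate
```

A production layer would use probabilistic matching and manual review queues rather than exact keys; the point of the sketch is only that federation can sit beside the existing portals rather than replace them.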
THEME 02

Paper-First Capture with Downstream Re-keying

Manifested in: All five departments

The capture surface in the field is overwhelmingly paper. Digital records exist only because someone else — a Block Entry Operator, an informal data entrant, a state-level analyst with Excel — re-types the same fact later. The same data moves through 2–3 hands before it becomes queryable, with quality and timeliness eroding at each step.

Consolidated intervention. Source-level digital capture replacing the paper diary; declare digital as authoritative; paper allowed only as a transitional fallback.
References: MCH P-02, P-12 · NCD P-09 · TB P-02, P-11 · ASHA P-02, P-10 · IMM P-10, P-12
THEME 03

Operational Dashboards Missing or Misaligned

Manifested in: MCH · NCD · TB · IMM

Existing national dashboards report cumulative figures suited to central oversight, not state operations. State teams cannot answer "which facility on which indicator is the worst performer this month?" without manual Excel work. The denominator dispute in immunization is a sharp version of the same gap.

Consolidated intervention. A state-tailored operational dashboard layer over existing portal data with daily / weekly granularity and facility-level drill-down.
References: MCH P-01, P-05 · NCD P-02, P-03 · TB P-09 · IMM P-04
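The question the state teams cannot currently answer ("which facility on which indicator is the worst performer this month?") is computationally trivial once the monthly counts sit in one structure. A minimal sketch, with invented facility names and indicator keys:

```python
def worst_performer(monthly_counts, indicator):
    # monthly_counts: dict of facility -> dict of indicator -> value.
    # Returns the facility with the lowest value on the given indicator;
    # facilities missing the indicator are treated as reporting zero.
    return min(monthly_counts, key=lambda f: monthly_counts[f].get(indicator, 0))

counts = {
    "CHC Rudrapur": {"anc_4_visits": 41, "inst_delivery": 22},
    "PHC Kotdwar":  {"anc_4_visits": 17, "inst_delivery": 30},
}
print(worst_performer(counts, "anc_4_visits"))  # PHC Kotdwar
```

The hard part, as the roadmap notes, is not this ranking but the KPI-to-portal-field mapping that feeds it.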
THEME 04

Incentive & Behaviour Misalignment

Manifested in: NCD · ASHA · TB

Field workers are paid by submission count. Data quality, visit-quality and care outcomes do not enter the incentive formula. The result is the rational worker behaviour everyone observes — fill the form, submit, move on. Behavioural inertia around digital tools is the same problem viewed from a different side.

Consolidated intervention. A hybrid incentive structure: completion plus a quality / outcome bonus tied to a clinical signal (growth, vaccination, alert resolution).
References: NCD P-06, P-07 · ASHA P-07, P-08 · TB P-10
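A hedged sketch of the hybrid formula, with illustrative rates and a 0-to-1 quality score standing in for whatever clinical signal is eventually chosen; none of these figures come from the programme:

```python
def payout(submissions, quality_score, base_rate=100, bonus_rate=50):
    # Completion component: paid per submission, as today.
    completion = submissions * base_rate
    # Quality component: scaled by the share of linked records showing
    # the chosen clinical signal (growth, vaccination, alert resolution).
    bonus = round(submissions * bonus_rate * quality_score)
    return completion + bonus

# A worker with 20 submissions, 80% of which show the clinical signal,
# keeps the full completion payment and earns most of the bonus.
print(payout(20, 0.8))  # 2800
```

The design choice is that the completion component is never reduced, so the quality layer only adds; this avoids punishing workers for outcomes outside their control.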
THEME 05

Workforce Burden & Role Overload

Manifested in: MCH · TB · ASHA · IMM

Multiple verticals converge at the same field worker, single coordinator, or single block operator. ASHA carries 10+ years of accumulated forms; the DPC plans, enters data, runs supply chain, and oversees handheld X-ray; immunization sees informal data entry by non-designated workers because the formal block operator role is unfilled.

Consolidated intervention. Role rationalisation paired with tooling automation that explicitly removes data-entry burden from the most overloaded role; formalise informal roles with certification.
References: MCH P-03, P-06, P-09 · TB P-08, P-10 · ASHA P-01, P-06 · IMM P-05, P-10
THEME 06

Decision-Point Information Gaps

Manifested in: MCH · NCD · TB · IMM

Critical clinical and operational data is not accessible at the moment a decision is being made. The doctor without the NCD-MO app open. The vaccinator without the lot number. The MO facing a possible AEFI without a system record to attribute it to a vaccine. The CHC without exact patient-level access to the people on its panel.

Consolidated intervention. Point-of-decision information surfacing — bring lot, ID, episode, and prior-care context to the worker device at the moment of the decision, not the day after.
References: MCH P-07 · NCD P-04, P-05, P-08 · TB P-04 · IMM P-07, P-08
THEME 07

End-to-End / Longitudinal Visibility Missing

Manifested in: MCH · NCD · TB · IMM

Per-case treatment trajectories are invisible (269 TB cases lost between notification and outcome). The chain from school dropout to early marriage to infant mortality cannot be traced. Referral opacity in MCH means a referred patient leaves the funnel as far as the source is concerned.

Consolidated intervention. A longitudinal patient / case view from notification through outcome; cohort linkage layer with controlled access for research and policy use.
References: MCH P-08 · NCD P-11 · TB P-01 · IMM P-11
Section 3

Department × Theme Matrix

A two-dimensional view of where each theme manifests and at what intensity. Themes that appear strongly across many departments — particularly cross-programme data fragmentation and paper-first capture — represent the highest-leverage intervention targets.

Cross-Cutting Theme · MCH · NCD · TB · ASHA · IMM
01. Cross-Programme Data Fragmentation●●●●●●●●●●●●●●
02. Paper-First Capture with Downstream Re-keying●●●●●●●●●●●●●
03. Operational Dashboards Missing or Misaligned●●●●●●●●●
04. Incentive & Behaviour Misalignment●●●●●
05. Workforce Burden & Role Overload●●●●●●●●●●
06. Decision-Point Information Gaps●●●●●●●●
07. End-to-End / Longitudinal Visibility Missing●●●●●●●
●●● Strong manifestation · ●● Present · ● Touched on · — Not raised
Table 1. Cross-cutting themes by department, with intensity markers reflecting the count and priority of problem statements in that department aligned to each theme.
Section 4

Per-Department Snapshots

A compact summary of each department analysis with priority breakdown and the three highest-priority problems. Each card links into the full per-department analysis tab for the supporting workflow, process flow, swim-lane, problem statements, and mapping table.

MCH

Maternal & Child Health

09:55 IST
20 issues · 12 High · 6 Medium · 1 Long-term · 1 Defer
Top three high-priority items
  • P-01 Information Latency. Real-time visibility does not exist above the sub-centre level.
  • P-02 Paper-First Capture. ANMs record on paper; digital entry is a downstream afterthought.
  • P-03 Workforce Constraint. ~50% specialist vacancy and no appetite for additional data-entry hires.
View full MCH analysis →
NCD

Non-Communicable Diseases

11:29 IST
18 issues · 9 High · 7 Medium · 2 Long-term
Top three high-priority items
  • P-01 Regulatory Compliance. Image-based screening device faces regulatory friction despite a no-local-storage design.
  • P-02 Dashboard Misalignment. National NCD dashboard not tailored to state operational needs.
  • P-03 Cumulative-Only View. Dashboard shows only 5–6 years of cumulative data; no day-to-day metrics.
View full NCD analysis →
TB

Tuberculosis

13:04 IST
20 issues · 12 High · 7 Medium · 1 Long-term
Top three high-priority items
  • P-01 Visibility Gap. 269 cases identified in the first 100-day round have no continued visibility through treatment.
  • P-02 Reporting Mechanics. Daily Google Forms reporting requires manual Excel work to derive performance categories.
  • P-04 Equipment Reliability. Biomedical equipment management issues affect 80–90% of facilities.
View full TB analysis →
ASHA

ASHA Diary & Forms

14:55 IST
18 issues · 12 High · 6 Medium
Top three high-priority items
  • P-01 Form Volume. The number of forms ASHA workers must complete is overwhelming; a 10+ year pain point.
  • P-02 Paper-First Capture. Paper diaries remain the primary capture surface; data is re-keyed downstream.
  • P-03 Source-of-Truth Ambiguity. Digital and paper formats coexist for the same data without a designated source of truth.
View full ASHA analysis →
IMM

Immunization

15:47 IST
20 issues · 14 High · 5 Medium · 1 Long-term
Top three high-priority items
  • P-01 Outreach Failure. ~50% of outreach immunization sessions fail in urban areas (weather, migration).
  • P-02 Urban Blind Spots. Urban areas function as system-level blind spots for immunization tracking.
  • P-03 Migration Instability. Migration in urban / peri-urban settings makes the denominator structurally unstable.
View full IMM analysis →
Section 5

Prioritised Intervention Roadmap

Consolidated interventions organised by time horizon. Each item lists the specific problem statements it addresses across the affected departments. Items are sequenced so that foundation work enables the major-programme interventions which in turn enable the strategic-horizon work.

Immediate: next sprint · sequenced first

Regulatory case-study compilation for image-based screening

Assemble a dossier of comparable imagery-based screening tools deployed under DPDP-aligned architectures. Required for the NCD committee presentation; unblocks the cervical-cancer screening device rollout.

NCD P-01

KPI ↔ data-parameter mapping (joint working session)

Co-author with state teams a formal mapping from analytical questions ("which facility / indicator is worst performing?") to the underlying portal fields. A prerequisite to any dashboard build, called out in three meetings.

NCD P-10 · TB P-12 · MCH P-05
Foundation: 3–6 months · enabling layer

Source-level digital capture pilot

Replace paper diary in one programme × one block as a pilot. Mobile-first ASHA app that absorbs HBNC, MCP, immunization due-list, and basic NCD screening capture. Aim: eliminate the Block Entry Operator step for the piloted programme.

ASHA P-02, P-10 · MCH P-02 · IMM P-12

State-operational dashboard layer

Bolt a state-tailored facility-ranking and indicator-drill-down view on top of existing portal data. Daily / weekly granularity, mobile-renderable, no CSV downloads required.

MCH P-01, P-05 · NCD P-02, P-03, P-04 · TB P-09 · IMM P-04

Form consolidation audit

Audit the form set ASHA, MO, and DIO each handle. Identify overlap, enforce the single-form-single-card principle, and stage merges where data is redundant.

ASHA P-01, P-09 · MCH P-06 · TB P-10
Major Programme: 6–12 months · structural change

Cross-programme patient identity & federation layer

A federation layer that reconciles the same individual across U-WIN, ABHA, NCD, RCH 2.0, HMIS, and TB notification. Not a replacement — a stitch. Unlocks longitudinal views and removes duplicate capture.

MCH P-04, P-10 · NCD P-12 · TB P-07 · ASHA P-12 · IMM P-06

Decision-point information surfacing

Worker / clinician devices that present lot numbers, episode history, eligibility status, and AEFI context at the moment of administration or decision. Closes the "I don't have what I need to act now" gap.

NCD P-05, P-08 · TB P-04 · IMM P-07, P-08 · MCH P-07

Hybrid incentive structure rollout

Layer a quality / outcome bonus on top of the existing submission-based incentive. Tied to a measurable clinical signal so that the field worker is rewarded for what the system is trying to achieve.

NCD P-06 · ASHA P-07, P-08

In-form source-level validation framework

Range checks, mandatory-field gates, GPS-based authenticity, deviation alerts at the point of capture. Moves quality enforcement upstream from the MCP-level Google form.

ASHA P-04 · TB P-11 · NCD P-11 · MCH P-12
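To make the framework concrete, here is a minimal sketch of point-of-capture validation combining the three gate types named above: mandatory-field checks, range checks, and a GPS plausibility gate. Field names, thresholds, and the distance approximation are illustrative assumptions, not any existing form schema.

```python
def validate_entry(entry, expected_location, max_km=2.0):
    # Returns a list of validation errors; an empty list means the
    # entry passes all point-of-capture gates.
    errors = []
    # Mandatory-field gate (field names are invented placeholders).
    for field in ("beneficiary_id", "weight_kg", "lat", "lon"):
        if entry.get(field) is None:
            errors.append(f"missing: {field}")
    # Range check on a clinical value.
    w = entry.get("weight_kg")
    if w is not None and not (0.5 <= w <= 200):
        errors.append("weight_kg out of plausible range")
    # GPS authenticity gate: crude flat-earth distance in km
    # (~111 km per degree), adequate at village scale.
    if entry.get("lat") is not None and entry.get("lon") is not None:
        dlat = (entry["lat"] - expected_location[0]) * 111.0
        dlon = (entry["lon"] - expected_location[1]) * 111.0
        if (dlat ** 2 + dlon ** 2) ** 0.5 > max_km:
            errors.append("capture location far from assigned area")
    return errors

entry = {"beneficiary_id": "B-102", "weight_kg": 3.1, "lat": 29.95, "lon": 78.16}
print(validate_entry(entry, expected_location=(29.95, 78.16)))  # []
```

Running these checks on the worker's device, before submission, is what moves quality enforcement upstream of the downstream Google-form stage.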
Strategic: 12+ months · long horizon

Longitudinal case / cohort view

Per-case treatment trajectories from notification through outcome (TB), per-mother MCH episode reconstruction, per-child immunization completeness. Built on top of the patient-identity federation layer.

TB P-01 · IMM P-11 · MCH P-08 · NCD P-11
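The visibility gap this item closes can be stated as a simple query once events are linked per case: which cases were notified but never reached a recorded outcome? A sketch under an invented event-record shape (not any portal's schema):

```python
def lost_between_notification_and_outcome(cases):
    # cases: dict of case id -> list of event dicts with a "kind" field.
    # Returns case ids with a notification event but no recorded outcome;
    # this is the query behind the "269 cases" figure cited for TB.
    return [
        cid for cid, events in cases.items()
        if any(e["kind"] == "notification" for e in events)
        and not any(e["kind"] == "outcome_recorded" for e in events)
    ]

cases = {
    "TB-001": [{"kind": "notification"}, {"kind": "treatment_start"},
               {"kind": "outcome_recorded"}],
    "TB-002": [{"kind": "notification"}],
}
print(lost_between_notification_and_outcome(cases))  # ['TB-002']
```

Today this query cannot be run because the events live in different systems under different identifiers; it becomes trivial once the patient-identity federation layer exists.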

End-to-end social-determinant analytical layer

Joining school dropout, early marriage, infant mortality, maternal malnutrition into a single research-ready analytical surface. Controlled access; for policy and academic use, not operational dashboards.

IMM P-11 · MCH P-05

Role formalisation across the field

Formalise informal data-entry roles across ASHA, Block Entry Operator, DIO; certification, training progression tracking, accountability. Removes the informal-data-entry risk and stabilises the data layer at the source.

IMM P-10 · ASHA P-06 · MCH P-03

Maternal & Child Health Programme — Process and Problem Analysis

Scoping output from the meeting between the Uttarakhand National Health Mission (MCH Division) and the IIT Bombay Virtual Labs team held on 7 May 2026.

Document Ref.: MCH-UK / SCOPE / 2026-05 / 001
Version: 1.0
Date of Issue: 07 May 2026
Pages: 1 of 1 (web)
Prepared For: Uttarakhand NHM — MCH Division
Prepared By: IIT Bombay Virtual Labs Team
Source: Meeting transcript, 09:55–11:07 IST
Status: Draft for review
Purpose. This document presents a structured analysis of the Uttarakhand MCH information ecosystem as discussed in the scoping meeting. Five deliverables are included: a workflow diagram of the current information flow; a process-flow diagram of the maternal death audit; an enumerated list of problem statements; a problem-mapping table linking causes, impacts, and candidate interventions; and a swim-lane view of a maternal care episode. The document is intended to inform subsequent requirements definition and platform-scoping discussions.
Section 1

Information Workflow — Current State

Information generated at the point of care travels through a four-tier hierarchy on paper, is re-keyed at each level, aggregated at the district, and surfaces at the state and central level only as end-of-month numerical indicators. The diagram below depicts the complete flow with characteristic time-delays at each hand-off and the failure modes introduced along the way.

FIG. 1 — INFORMATION FLOW: VILLAGE TO CENTRAL REPORTING
Stage 0 — Patient (community / home) → Stage 1 — ANM, sub-centre (paper): ~days, paper register.
Stage 1 → Stage 2 — Block PHC / CHC (paper to Excel): ~1 week, re-keyed to Excel.
Stage 2 → Stage 3 — District CMO office (aggregation): ~2 weeks, district aggregation.
Stage 3 → Stage 4 — State MCH, NHM Uttarakhand (review): end-of-month indicators, frozen.
Stage 4 → Stage 5 — Central HMIS / GoI portals: monthly upload, numerical only.
Failure modes introduced along the way: transcription errors · lost forms · illegible handwriting · Excel header drift · no qualitative context · no drill-down · missed course-correction window.
Decision feedback loop — currently broken: course correction, referral status, and resource re-allocation are delayed by ≥ 30 days.
What flows up: aggregated counts only — deliveries, immunisations, mortality totals. What does not flow: real-time referral availability, qualitative incident context, stock-out signals, daily case status.
Figure 1. End-to-end information flow from village service delivery to central reporting. Time annotations indicate typical delays at each hand-off.
Section 2

Process Flow — Maternal Death Audit

The maternal death audit was identified as the priority deliverable in the meeting due to its legal and clinical significance. The process branches by location category (community, facility, in-transit), produces a 40–50-page paper bundle, requires CMO-level inquiry and a legal certificate, and terminates in the existing Maternal Death Portal which captures approximately twenty-five aggregate indicators with no checklist or drill-down capability.

FIG. 2 — MATERNAL DEATH AUDIT PROCESS, CURRENT STATE
1. Maternal death reported; three-way classification by location of occurrence.
2. Community — ANM-led inquiry, verbal autopsy (form set A). Facility (govt / private) — CMS / facility audit committee (form set B). In-transit (ambulance) — referral chain reconstruction (form set C).
3. Audit form bundle compiled: 40–50 pages, multi-layer, paper, handwritten. Bottleneck — illegibility, missing fields, transit losses.
4. CMO review: inquiry plus legal certificate (district-level medico-legal review).
5. Maternal Death Portal: ~25 indicators only.
Pain points: no checklist enforcement · no daily reporting · free-text vs. synoptic · no drill-down analysis · indicator-only output.
Figure 2. Maternal death audit process from event to portal entry, showing the three location categories, the 40–50-page paper form bundle, and the indicator-only output.
Section 3

Swim-lane Diagram — Maternal Care Episode

A swim-lane representation of a single maternal-care episode from antenatal registration through delivery and postnatal care, with the audit branch shown for cases ending in maternal death. Each lane represents an actor; each box is a discrete activity. Markers indicate hand-offs at which information drops out of the system.

FIG. 3 — MATERNAL CARE EPISODE WITH AUDIT BRANCH
Lanes: Patient + Family · ASHA / ANM Sub-centre · Block PHC / CHC · District CMO Office · State / Central (NHM · GoI).
ANC registration: pregnancy disclosed; registered in MCP card (paper).
ANC visits: 4 ANC visits — BP, Hb, weight (paper); patient travels to sub-centre / PHC; USG and investigations if a specialist is available (~50% gap).
Delivery: onset of labour in the community; screen and decide home vs. refer; institutional delivery or onward referral (multi-refer risk).
Postnatal: mother and newborn return home; PNC home visits on days 3, 7, 14, 42 (paper diary); monthly aggregation to Excel and review (end of month); HMIS upload, numerical only (no drill-down).
Audit (if death): verbal autopsy (40–50-page form); facility audit by CMS-led committee (paper); CMO inquiry plus legal certificate; Maternal Death Portal (~25 indicators, no analysis).
Legend: activity · critical / failure point · paper / lossy hand-off.
Figure 3. Five-lane episode view showing antenatal registration, ANC visits, delivery, postnatal care, and the audit branch invoked in the event of maternal death.
Section 4

Problem Statements

The following twenty problem statements were derived from the meeting transcript. Each is grounded in a specific observation made during the discussion and is written as a diagnostic claim suitable for testing, prioritisation, and conversion into formal requirements.

P-01
Information Latency

Real-time visibility does not exist above the sub-centre level.

Indicators surface at the state level only at month-end. There is no daily, searchable, drill-down view of activity on the ground, which forecloses any timely course correction.

P-02
Paper-First Capture

ANMs record on paper; digital entry is a downstream afterthought.

Field staff are not tech-comfortable and do not treat data entry as part of their core role. Paper registers are the source of truth; everything digital is a transcription of those registers.

P-03
Workforce Constraint

Approximately 50% specialist vacancy and no appetite for additional data-entry hires.

Specialist services are filled at roughly half their sanctioned strength. NHM has explicitly ruled out hiring dedicated data-entry staff to operate any new portal; usability is therefore a non-negotiable design constraint.

P-04
Portal Fragmentation

Multiple disconnected portals require duplicate data entry.

Distributed systems exist for HMIS, maternal death tracking, NCD screening, ASHA diary, NRC reporting, and others. Eventual consolidation by the Government of India is years away; in the meantime, the same data is keyed multiple times.

P-05
Analytical Depth

The Maternal Death Portal functions as a counter, not as an analytical tool.

It carries approximately twenty-five overall indicators but offers no checklist, no daily granularity, no drill-down, and no top-N analysis. It records that a death has occurred but does not help decision-makers understand why or where the system broke.

P-06
Audit Form Burden

Maternal death audits run to forty to fifty pages of handwritten paperwork.

Multi-layer audit forms across community, facility, and transit categories must be filled by hand, then bundled, transported, and reviewed at the CMO level. Forms are commonly lost, illegible, or partially completed.

P-07
Patient Accessibility

Specialist services require patients to undertake 30+ km journeys through difficult terrain.

Travel times of approximately two hours for ultrasound, MRI, or specialist consultation are common. Patients frequently navigate multiple facilities for a single episode, and service availability shifts unpredictably with staff transfers.

P-08
Referral Opacity

Repeated referrals for delivery cases occur without root-cause visibility.

Cases bouncing between facilities are a recognised pattern, but the operational reasons — staff absence, equipment failure, capacity constraint — are not captured in any system that decision-makers can interrogate.

P-09
Quarterly NRC Reporting

Severe Acute Malnutrition cases at NRCs are reported only quarterly, on paper.

SAM admissions are a high-priority signal. Reporting them quarterly on paper means the state cannot respond to outbreaks, regional spikes, or facility-level performance variation in time.

P-10
NCD Portal Limitations

The NCD tracking portal is slow and does not provide state-level data extraction.

An existing system tracks non-communicable disease screening, but slow performance and the absence of state-level extraction make it unusable for dashboarding or strategic review.

P-11
Template Churn

Excel reporting templates change repeatedly, breaking historical continuity.

Headers and parameters drift over time as templates are revised. Historical comparison becomes difficult, automation breaks, and field staff lose confidence that this month's effort will remain valid next month.

P-12
Data Quality at Source

Form legibility is itself a barrier to reliable data.

Handwritten audit forms — particularly the longer ones — produce illegible or ambiguous entries that cannot be reliably re-keyed, leading to silent data quality erosion before any analysis is undertaken.

P-13
Single-MO PHC Bottleneck

Patient expectation of comprehensive primary care exceeds the staffing reality of a single Medical Officer per PHC.

Communities expect a Primary Health Centre to deliver every service on demand. With a single MO often the only clinician on site, that expectation is structurally unmet — orthopaedic complaints and other specialist needs cannot be addressed at the PHC and must be referred onwards.

P-14
Service Volatility

Specialist transfer policy moves services unpredictably; the field has no real-time view of what each facility currently offers.

A facility that was providing a particular specialist service last quarter may not be providing it this quarter due to a transfer. Workers and patients learn this only on arrival. Referral logistics and patient counselling depend on stale knowledge of facility capability.

P-15
End-of-Month Data Freeze

Portal data does not freeze until month-end; daily and weekly granularity is structurally absent.

The HMIS portal aggregates monthly. There is no searchable daily-numbers view, and reports become available only after the month closes. Course-correction within the cycle is therefore not possible from portal data alone.

P-16
Synoptic Reporting Gap

Free-text capture persists where structured synoptic reports would aggregate.

The team explicitly noted the preference for a "choice drop synoptic report" over free-text writing. Current capture for clinical events and audits remains free-text, which limits aggregation, comparison across cases, and downstream automation.

P-17
CMO Analysis Checklist Absent

The CMO conducting maternal death analysis has no standardised top-N analysis-question checklist.

Analytical depth on maternal death cases depends on the individual reviewer's framing. There is no codified set of "top 10" or "top 20" analytical questions to run against each case, so cross-case learning is not systematic.

P-18
PDF Upload Duplication

Upper-level Excel sheets require PDF uploads of the same physical paper data.

The same source data ends up in three places — the original paper form, the Excel sheet at the upper level, and a PDF upload of the physical sheet. Three places, three integrity risks, and three units of effort for one piece of information.

P-19
Hierarchy Depth Latency

Information traverses ANM → block → district → state → NHM/MD; each level adds to the time-to-action delay.

The state programme manager sits at the fourth level of a hierarchy whose source is the ANM's notebook. Each upward hop adds time, simplification, and aggregation; by the time a signal reaches the level that can act, the moment to act has often passed.

P-20
Outreach vs Facility Split

Outreach (provider-to-village) and facility-based services need separate platform handling, currently absent.

The MCH service model spans both fixed-facility care and outreach where the health provider goes to the village. The data and workflow needs of those two modes differ, but no platform currently treats them distinctly — outreach is bolted onto facility flows.

Section 5

Problem Mapping and Prioritisation

The table below maps each problem statement to its underlying root cause, the operational impact it produces, and a candidate intervention. Priority is graded by the extent to which the problem directly obstructs the programme's stated objective of timely and accurate maternal and child health monitoring.

Ref. | Problem | Root Cause | Operational Impact | Candidate Intervention | Priority
P-01 | Information latency | Four-tier hierarchy operating on a monthly aggregation cycle | State cannot intervene within the window where intervention is meaningful | Mobile-first capture layer with daily push to a state-readable index | High
P-02 | Paper-first capture | ANMs lack tech comfort; digital entry is not part of role expectation | Every digital number is a transcription artefact, not a primary record | Field-tested mobile UI optimised against the speed of paper; entry as part of role | High
P-03 | Workforce gap | ~50% specialist vacancy; no budget for data-entry staff | Any tool requiring additional personnel to operate is unsustainable | Zero-additional-headcount design constraint; usability as primary KPI | High
P-04 | Portal fragmentation | Vertical programmes built independent systems over years | Duplicate keying; absence of integrated view; staff fatigue | Unified capture layer with integration-based fan-out to existing portals | Medium
P-05 | No analytical depth | Maternal Death Portal designed as a register, not an analytical tool | Cannot answer "why" — only "how many" | Synoptic audit form with structured fields; analytical layer with top-N views | High
P-06 | Audit form burden | Legal-medical compliance demands multi-layer forms; no digital equivalent | 40–50 pages of paper per death; loss, illegibility, delay | Digital audit workflow with progressive disclosure and field validation | High
P-07 | Patient accessibility | Geography combined with thin specialist coverage | 30+ km journeys; multi-facility navigation; care abandonment risk | Outside platform scope; addressable indirectly through P-08 | Long-term
P-08 | Referral opacity | No real-time facility status feed; reasons not structurally captured | Repeated referrals; avoidable delays at delivery | Live referral-unit status board fed from facility-level check-ins | High
P-09 | Quarterly NRC reporting | Reporting cadence locked to a paper-based quarterly cycle | SAM signals invisible for approximately three months at a time | NRC mobile reporting module — viable as a quick-win pilot | High
P-10 | NCD portal limitations | Slow system performance; no state-level data extraction API | State unable to dashboard NCD screening progress | Outside MCH scope; flagged for separate engagement | Defer
P-11 | Template churn | Excel-based reporting with no schema governance | Historical comparability lost; field rework; trust erosion | Versioned data schema with backward-compatible field evolution | Medium
P-12 | Data quality at source | Source records are handwritten paper | Data quality eroded silently before analysis | Resolved by P-02 / P-06 — digital capture eliminates the handwriting step | Medium
P-13 | Single-MO PHC bottleneck | Specialist HR shortage; ~50% specialist vacancy | Patient expectation perpetually unmet at PHC; unnecessary referrals | Tele-consult specialist support layer at PHC; clear "what this PHC offers" signage and digital advisory | High
P-14 | Service volatility | Transfer policy churn; no facility-capability registry | Patients arrive at facilities that no longer offer the needed service | Real-time facility-capability registry tied to staffing roster; visible to ASHA / ANM / patient | Medium
P-15 | End-of-month data freeze | HMIS aggregation cadence is monthly by design | In-cycle course correction infeasible | Daily / weekly reporting layer over existing portal data with rolling indicators | High
P-16 | Synoptic reporting gap | Forms designed as free-text rather than structured synoptic templates | Aggregation, search, and cross-case learning hampered | Convert priority forms (maternal death audit, referral note) to structured synoptic templates with controlled vocabularies | High
P-17 | CMO analysis checklist absent | No codified analytical-question set for maternal death review | Cross-case learning depends on reviewer framing; not systematic | Co-author with state CMO a top-N analytical question checklist embedded in the audit workflow | High
P-18 | PDF upload duplication | Upper-level Excel + PDF uploads of physical paper coexist | Triple capture; integrity risk at each step | Eliminate PDF upload step once digital source-level capture is established | Medium
P-19 | Hierarchy depth latency | 4–5 level reporting hierarchy with manual transmission | Signal reaches actionable level too late | Direct field-to-state visibility for priority indicators bypassing intermediate aggregation | High
P-20 | Outreach vs facility split | Platform designed for facility flows; outreach grafted on | Outreach data captured inconsistently; visit-quality not tracked | Outreach-specific module within ASHA app — visit type, location, beneficiaries reached | Medium
Table 1. Mapping of problems to root causes, operational impacts, and candidate interventions, with assigned priority.
NCD Uttarakhand × IIT Bombay VLabs Internal · For Official Use

Non-Communicable Diseases Programme — Process and Problem Analysis

Scoping output from the meeting between the Uttarakhand National Health Mission (NCD Division) and the IIT Bombay Virtual Labs team held on 7 May 2026.

Document Ref.
NCD-UK / SCOPE / 2026-05 / 002
Version
1.0
Date of Issue
07 May 2026
Pages
1 of 1 (web)
Prepared For
Uttarakhand NHM — NCD Division
Prepared By
IIT Bombay Virtual Labs Team
Source
Meeting transcript, 11:29–12:28 IST
Status
Draft for review
Purpose. This document presents a structured analysis of the Uttarakhand NCD information ecosystem as discussed in the scoping meeting. The session focused on regulatory compliance for an image-based cervical cancer screening tool, the gap between the existing national NCD dashboard and operational state needs, and the workforce dynamics that shape data capture at the CHO and ASHA level. Five deliverables are included: a workflow diagram of the current information flow, a process-flow diagram of the image-based screening process, a swim-lane view of an NCD episode, a list of problem statements, and a problem-mapping table linking causes, impacts, and candidate interventions.
Section 1

Information Workflow — Current State

NCD information generated at the screening point — typically a community camp or sub-centre visit — is recorded on paper and partly into the NCD-MO application, aggregated manually at the district level via Excel, and ultimately surfaces in the National NCD Dashboard as cumulative five-to-six-year totals. The diagram below depicts the complete flow with characteristic delays and the failure modes introduced along the way.

FIG. 1 — NCD INFORMATION FLOW: COMMUNITY SCREENING TO NATIONAL DASHBOARD
Stage 0: Patient (community / camp) → Stage 1: ASHA / CHO, sub-centre screening (paper / app capture; ~days) → Stage 2: PHC MO, NCD-MO app entry (often skipped; MO delegates entry) → Stage 3: District, manual Excel review and aggregation (~weekly) → Stage 4: State NCD, NHM Uttarakhand review (no real-time view; no facility ranking) → Stage 5: National NCD Dashboard (5–6 yr cumulative; no day-to-day view).
Failure modes introduced along the way: doctors skip the NCD-MO app despite minimal click-burden; behavioural inertia (not digital literacy) blocks adoption; CSV downloads are unworkable on small tablets / phones; the cumulative-only dashboard hides current performance; no exact patient-level access for action; multiple disconnected portals require duplicate entry.
Decision feedback loop — currently broken: "Where am I vs. where should I be?" cannot be answered by CHOs or ASHAs at any meaningful cadence.
What flows up: aggregated NCD counts — screening volume, diagnosis totals, treatment initiation — on a national cumulative basis. What does not flow: facility-level day-to-day metrics, indicator-by-indicator drill-down, exact patient identifiers for follow-up action.
Figure 1. End-to-end NCD information flow from community screening to the national cumulative dashboard.
Section 2

Process Flow — Image-Based Cervical Cancer Screening

Image-based screening for cervical cancer was the priority deliverable identified in the meeting due to the regulatory friction the project is currently encountering. The device is non-invasive, holds no images locally, and transmits directly to a server. The process below traces the screening from patient eligibility through the DPDP-Act compliance gate to the NCD Portal record.

FIG. 2 — IMAGE-BASED CERVICAL CANCER SCREENING PROCESS, CURRENT STATE
Patient eligible for screening → screening category (three-way classification): cervical (current; image-based screening device, non-invasive, screening only), breast (planned; adjacent gynecological scope, same compliance regime), other (future; broader oncological screening, roadmap dependency).
Image transmission · DPDP-Act gate: no local storage on the device; images are sent directly to the server. Regulatory friction: committee compliance demonstration pending.
Server-side classification (suspect / normal) → result triggers referral or routine follow-up → NCD Portal record (audit trail, documentary evidence).
Pain points: DPDP-Act compliance unclear; no reference case studies; documentary evidence weak; committee decision pending; server-only model questioned.
Figure 2. Image-based screening from patient eligibility to portal record, with the DPDP-Act compliance gate highlighted as the current friction point.
Section 3

Swim-lane Diagram — NCD Episode

A swim-lane representation of a single NCD episode from community screening through diagnosis, treatment initiation, and reporting. Each lane represents an actor; each box is a discrete activity. Markers indicate hand-offs at which information drops out of the system or is captured in a form that cannot be acted upon.

FIG. 3 — NCD EPISODE: SCREENING TO DASHBOARD
Lanes: Patient (community); ASHA / CHO (sub-centre); MO (PHC); Specialist (CHC / DH); State / Dashboard (NHM · national).
Screening: patient walks into screening camp; vitals, history, eligibility check (paper); image capture on device, server only (DPDP gate).
Capture: NCD-MO app, often skipped (entry delegated).
Diagnosis: patient travels for CHC review; referred up if suspect; specialist confirmation (~50% vacancy gap).
Treatment: treatment plan starts; prescribe and counsel at CHC / DH; ASHA follow-up and adherence support (paper diary).
Reporting: manual weekly Excel aggregation (no real-time); National NCD Dashboard, 5–6 yr cumulative (no drill-down); CSV download on a small phone (unusable).
Legend: activity · critical / failure point · paper / lossy hand-off.
Figure 3. Five-lane episode view showing screening, capture, diagnosis, treatment, and reporting; failure markers identify points at which data quality or accessibility erodes.
Section 4

Problem Statements

The following eighteen problem statements were derived from the meeting transcript. Each is grounded in a specific observation and is written as a diagnostic claim suitable for testing, prioritisation, and conversion into formal requirements.

P-01
Regulatory Compliance

Image-based screening device faces regulatory friction despite a no-local-storage design.

The device is intended as a screening tool only and sends data directly to the server, holding nothing on the device. Even so, the project meets resistance from the committee, attributed to Digital Personal Data Protection Act compliance ambiguity. Reference case studies of similar image-based solutions are not yet assembled.

P-02
Dashboard Misalignment

The national NCD dashboard is not tailored to state operational needs.

It was built to serve national reporting interests rather than state-level decision-making. Day-to-day facility performance metrics are not available, and the analytical depth required to drive action at the field level is absent.

P-03
Cumulative-Only View

The available dashboard shows only five-to-six years of cumulative data.

No real-time analysis exists. CHOs and ASHA workers cannot see "where I am vs. where I should be" at any cadence shorter than half a decade, which renders the dashboard unsuitable for operational performance tracking.

P-04
Form Factor Mismatch

CHOs and ASHAs work on small tablets and phones; CSV downloads are unworkable.

Reports must be downloaded as CSV files. Field staff lack the device size and software environment to manipulate spreadsheets meaningfully, so the data they technically have access to remains practically inaccessible.

P-05
Patient-Level Access

Decision-makers cannot access exact patient-level data for action.

Privacy understandings — implicitly applied — prevent authorities who could intervene from seeing which specific patients are due for follow-up, are non-adherent, or have abandoned care. The blanket interpretation hides operational signals from those positioned to act on them.
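The remedy discussed in the session is role-based access rather than a blanket privacy posture: exact data for those who act, anonymised views for those who review. A minimal sketch of that tiering, assuming hypothetical role names, record fields, and a pseudonymisation scheme (the real role set and field list would come from NHM access policy):

```python
# Sketch only: role names, fields, and the pseudonymisation scheme are
# illustrative assumptions, not the NHM access policy.
import hashlib

ROLES_WITH_EXACT_ACCESS = {"asha", "cho", "mo"}  # assumed action-taker roles

def view_record(record: dict, role: str) -> dict:
    """Exact record for action-takers; pseudonymised view for reviewers."""
    if role in ROLES_WITH_EXACT_ACCESS:
        return dict(record)
    # Reviewers get a stable pseudonym plus the operational signal,
    # never the identity fields.
    ref = hashlib.sha256(record["patient_id"].encode()).hexdigest()[:8]
    return {"patient_ref": ref, "followup_due": record["followup_due"]}

patient = {"patient_id": "UK-0001", "name": "example-name", "followup_due": True}
field_view = view_record(patient, "cho")              # identity visible
review_view = view_record(patient, "state_reviewer")  # identity hidden
```

The stable pseudonym matters: a reviewer can still track a case across reports without ever seeing who it is, which is the operational signal this problem says is currently lost.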

P-06
Incentive Misalignment

ASHA workers are incentivised against targets, sometimes irrespective of data quality.

The implementation reality is that workers focus on meeting submission counts because that is what is rewarded. Quality of capture is a downstream concern that no single role is currently accountable for.

P-07
Behavioural Inertia

Resistance to digital forms is behavioural, not a digital-literacy gap.

Surveyed regions show approximately 70% smartphone use, WhatsApp adoption, and basic mobile fluency. Digital forms are demonstrably easier than handwritten ones. The barrier is comfort-zone preference, not capability — implying training has to address behaviour change, not technology.

P-08
NCD-MO App Underuse

Doctors do not fill the NCD-MO application despite minimal click-burden.

The app is designed so the medical officer answers only five or six questions. Doctors nonetheless avoid it, citing time constraints, and delegate the entry to staff. The result is data captured by people who are not the clinical decision-maker, with quality consequences.

P-09
Manual Excel Workflow

State analysis depends on manual Excel aggregation.

Compiled sheets are built by hand from source sheets, then compared period-over-period. This is the current state-level analytical workflow, and it constrains how often analysis can be refreshed and how reliably insights can be reproduced.
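The comparison itself is mechanical and straightforward to automate. A sketch of the period-over-period step in Python, assuming illustrative column names ("facility", "period", "screened") rather than the actual sheet layout:

```python
# Sketch: automate the hand-built period-over-period comparison.
# Row shape and column names are assumptions for illustration.
from collections import defaultdict

rows = [
    {"facility": "PHC-A", "period": "2026-03", "screened": 120},
    {"facility": "PHC-A", "period": "2026-04", "screened": 90},
    {"facility": "PHC-B", "period": "2026-03", "screened": 60},
    {"facility": "PHC-B", "period": "2026-04", "screened": 75},
]

def period_over_period(rows, prev, curr):
    """Change in screenings per facility between two reporting periods."""
    totals = defaultdict(dict)
    for r in rows:
        totals[r["facility"]][r["period"]] = r["screened"]
    return {f: p.get(curr, 0) - p.get(prev, 0) for f, p in totals.items()}

delta = period_over_period(rows, "2026-03", "2026-04")
# → {"PHC-A": -30, "PHC-B": 15}: PHC-A dropped, PHC-B improved
```

Once the logic lives in code rather than in an analyst's spreadsheet, refresh frequency is no longer gated on manual effort and the result is reproducible.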

P-10
Question–Parameter Mapping

No formal mapping exists from analytical questions to underlying portal parameters.

To plot meaningful dashboards, each analytical question — "which is the worst-performing facility on which indicator?" — must be linked to specific data fields. This mapping has not been done. Without it, dashboard development is open-ended and dashboards risk being unfocused.
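One workable shape for such a mapping is a small, machine-checkable artefact: each analytical question lists the portal fields it needs, and dashboard work starts only once every listed field exists. The KPI names and field names below are illustrative assumptions, not the actual portal schema:

```python
# Sketch of a KPI-to-parameter map; all names are assumptions.
KPI_MAP = {
    "worst_performing_facility_by_screening": {
        "question": "Which facility has the lowest screening completion?",
        "fields": ["facility_id", "screenings_done", "screening_target"],
    },
    "followup_dropout_rate": {
        "question": "What share of diagnosed patients missed follow-up?",
        "fields": ["patient_id", "diagnosis_date", "last_followup_date"],
    },
}

PORTAL_SCHEMA = {
    "facility_id", "screenings_done", "screening_target",
    "patient_id", "diagnosis_date", "last_followup_date",
}

# Fields a KPI needs but the portal does not carry: these gaps are exactly
# what a co-authoring session with the NCD team would surface.
gaps = {kpi: [f for f in spec["fields"] if f not in PORTAL_SCHEMA]
        for kpi, spec in KPI_MAP.items()}
```

The check is the point: an empty gap list per KPI is the exit criterion that turns "open-ended dashboard development" into a bounded build.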

P-11
ID Coverage

Patient profiles accept up to ten unique IDs, but missing IDs may mean missing benefits.

Aadhaar, PAN, voter ID, LPG ID, and others can each anchor a profile, with no single mandate. However, the absence of a particular ID can disqualify a patient from specific entitlement schemes downstream.

P-12
Portal Fragmentation

Multiple disconnected portals are slated for eventual GoI consolidation.

Patient-wise integration is the stated direction at the centre, but it is years away. In the interim, NCD data lives alongside infectious-disease screening, multiple outreach campaigns, and adjacent reporting systems with no unified view.

P-13
Programme Phasing

The cervical → breast → whole-cancer programme phasing is stated but not validated against capacity.

The current focus is cervical cancer screening, with breast cancer planned and broader oncological screening as a future scope. The progression assumes capacity will be available at each step; no capacity validation has been done against the current 50%-vacancy specialist baseline.

P-14
Outreach vs Facility Protocols

Outreach screening and facility-based screening differ operationally but use the same data forms.

Camp-based outreach screening and walk-in facility screening have materially different workflows — patient identification, consent, follow-up logistics. The data layer treats them identically, which loses signal about which mode produced which outcome.
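The candidate fix named in the mapping table is shared core fields with mode-specific extensions. A minimal sketch of mode-aware capture, with illustrative field names:

```python
# Sketch: one core record, mode-specific extensions, so outcomes can be
# disaggregated by screening mode. Field names are assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ScreeningRecord:
    patient_id: str
    result: str                              # e.g. "suspect" / "normal"
    mode: str                                # "outreach" or "facility"
    camp_location: Optional[str] = None      # outreach-only extension
    referral_facility: Optional[str] = None  # facility-only extension

records = [
    ScreeningRecord("UK-0001", "suspect", "outreach", camp_location="Ward 4"),
    ScreeningRecord("UK-0002", "normal", "facility", referral_facility="CHC-X"),
]

# The disaggregation the current single-form design cannot do:
by_mode = {}
for rec in records:
    by_mode.setdefault(rec.mode, []).append(rec)
```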

P-15
Contextual Data Absent

Population, environmental, and demographic context data are not integrated for risk-based targeting.

Current NCD data is target-specific; population density, water quality, soil quality, and other environmental risk factors are not joined to the NCD reporting layer. Risk-based targeting of screening therefore depends on intuition rather than evidence.

P-16
External Data Joining

NSS data, water and soil quality data, and state-specific cancer epidemiology are not joined to NCD reporting.

These external datasets exist but live separately. Without a joining layer, NCD performance interpretation defaults to national averages rather than state-specific context, blunting both targeting and evaluation.

P-17
Descriptive Baseline Missing

Descriptive analytical baseline is absent — comparative or temporal work is premature.

The team explicitly identified the need to establish a descriptive analytical foundation before moving to temporal or comparative analysis. Without that baseline, more sophisticated analytical work would build on uncalibrated ground.

P-18
Concurrent Campaign Load

Multiple concurrent screening campaigns concentrate load on the same field staff without de-duplication.

NCD screening, infectious-disease outreach, and other vertical campaigns frequently land on the same ASHA / CHO at the same time. Portal-side patient-wise integration is the direction of travel at the centre, but field-side workload de-duplication is not yet addressed.

Section 5

Problem Mapping and Prioritisation

The table below maps each problem statement to its underlying root cause, the operational impact it produces, and a candidate intervention. Priority is graded by the extent to which the problem directly obstructs the programme objective of timely and accurate NCD performance monitoring.

Ref. | Problem | Root Cause | Operational Impact | Candidate Intervention | Priority
P-01 | Regulatory friction | DPDP-Act compliance demonstration not yet assembled | Committee approval delayed; rollout blocked | Compile case-study dossier of comparable imagery-based screening tools with privacy-compliant architectures | High
P-02 | Dashboard misalignment | National dashboard built for central reporting, not state operations | State has no facility-level operational view | State-tailored dashboard layer over existing portal data | High
P-03 | Cumulative-only view | Aggregated reporting cadence with no daily granularity | Performance gaps invisible until they have compounded for years | Daily / weekly metric extraction with comparative views | High
P-04 | Form factor mismatch | CSV download model assumes desktop work environment | Data formally available is practically unusable for field staff | Mobile-rendered, query-based data access replacing CSV download | High
P-05 | Patient-level access | Blanket privacy posture without role-based access tiers | Authorities cannot identify whom to follow up with | Role-based access control with audit trail; exact data for action-takers, anonymised for review | High
P-06 | Incentive misalignment | Compensation tied to submission count, not quality or outcome | Data quality erodes silently at the source | Hybrid incentive — submission completion plus quality / outcome bonus | Medium
P-07 | Behavioural inertia | Habit anchored to paper despite digital being faster | Digitisation effort under-realised; data fragmented across formats | Behaviour-change programme paired with rollout; not just digital training | High
P-08 | NCD-MO app underuse | Doctor time pressure; delegation culture | Clinical signal captured by non-clinical staff; quality compromised | Auto-fill from upstream context; voice / one-tap entry; SBAR-style synoptic | High
P-09 | Manual Excel workflow | No automated state-level analytical layer | Insights gate-kept by the analyst with the spreadsheet | Automate the existing Excel logic as a state-level dashboard view | Medium
P-10 | Question–parameter mapping | No formal documentation linking KPIs to data fields | Dashboard development open-ended and unfocused | Co-author a KPI–parameter map with the NCD team; share Excel analysis as ground truth | High
P-11 | ID coverage | Multiple ID schemes with scheme-specific eligibility | Patients may miss entitlements they are eligible for | Coverage audit per ID type; user-facing prompt for missing IDs | Medium
P-12 | Portal fragmentation | Vertical-programme history; central consolidation distant | Duplicate entry; no unified patient view | Bridging layer — unified capture, fan-out to existing portals via integration | Medium
P-13 | Programme phasing | Cervical → breast → whole-cancer expansion stated without capacity validation | Risk of programme expansion outpacing specialist capacity | Capacity audit per phase prior to expansion; explicit specialist availability gates | Medium
P-14 | Outreach vs facility protocols | Same form set used for materially different workflows | Cannot disaggregate outcome by screening mode | Mode-aware capture (outreach vs facility) with shared core fields and mode-specific extensions | Medium
P-15 | Contextual data absent | NCD reporting layer is target-specific; no contextual data join | Risk-based targeting depends on intuition | Join layer for population, environmental, and demographic context data | Long-term
P-16 | External data joining | NSS, water/soil quality, cancer epidemiology data live in separate systems | Performance interpretation defaults to national averages | Cross-domain data joining layer with controlled access for analytical use | Long-term
P-17 | Descriptive baseline missing | No established descriptive analytical layer over current NCD data | More sophisticated analytical work would build on uncalibrated ground | Descriptive analytical layer first; temporal and comparative work sequenced after | High
P-18 | Concurrent campaign load | Multiple verticals trigger campaigns independently | Same field worker hit by multiple campaigns simultaneously | Campaign coordination layer with worker-load awareness | Medium
Table 1. Mapping of NCD problems to root causes, operational impacts, and candidate interventions, with assigned priority.
TB Uttarakhand × IIT Bombay VLabs Internal · For Official Use

Tuberculosis Programme — Process and Problem Analysis

Scoping output from the meeting between the Uttarakhand National Health Mission (TB Division) and the IIT Bombay Virtual Labs team held on 7 May 2026.

Document Ref.
TB-UK / SCOPE / 2026-05 / 003
Version
1.0
Date of Issue
07 May 2026
Pages
1 of 1 (web)
Prepared For
Uttarakhand NHM — TB Division
Prepared By
IIT Bombay Virtual Labs Team
Source
Meeting transcript, 13:04–14:34 IST
Status
Draft for review
Purpose. This document presents a structured analysis of the Uttarakhand TB programme as discussed in the scoping meeting. The session covered IDSP-aligned surveillance, the 100-day Active Case Finding campaign in high-density districts, household contact-tracing dynamics under stigma, private-sector reporting through DPS personnel, biomedical equipment reliability, and the integration potential between TB and NCD programmes. Five deliverables are included: a workflow diagram of the current surveillance flow, a process-flow diagram of the 100-day campaign, a swim-lane view of a TB patient journey, a list of problem statements, and a problem-mapping table linking causes, impacts, and candidate interventions.
Section 1

Information Workflow — Current State

TB surveillance information originates with a symptomatic patient — typically presenting with cough exceeding two weeks or surfaced through OPD diabetes screening — and travels through ASHA-led identification, PHC-level screening, district notification, and state-level aggregation before reaching the Central TB Division. The diagram below depicts the end-to-end flow with characteristic delays and the failure modes introduced along the way.

FIG. 1 — TB SURVEILLANCE FLOW: SYMPTOM ONSET TO CTD REPORTING
Stage 0: Symptomatic patient (cough > 2 weeks; OPD) → Stage 1: ASHA identification and household contacts (paper; stigma barrier in household tracing; ~days) → Stage 2: PHC / sub-centre screening (handheld X-ray, sputum; equipment-dependent) → Stage 3: DPC / DTO at district (notification, DBT enrolment) → Stage 4: State TB Cell aggregation (Google Forms, manual Excel) → Stage 5: Central TB Division (CTD) national report and review.
Failure modes introduced along the way: 269 cases identified in the first 100-day round have no continued visibility; stigma disrupts family-level contact tracing; biomedical equipment shows an 80–90% reliability gap; ~80% of plain-district reports flow through DPS personnel; NCD comorbidity is not integrated; no facility-ranking dashboard exists from the current Excel / Forms data.
Decision feedback loop — currently broken: visibility into treatment outcomes, contact-tracing yield, and equipment downtime is delayed and partial.
What flows up: notification counts, treatment categories (DS / DR / MDR / XDR), 100-day campaign aggregates, drug-supply confirmations. What does not flow: per-case treatment-trajectory visibility, household-contact tracing yield, equipment status, the NCD-TB comorbidity signal.
Figure 1. End-to-end TB surveillance flow from community symptom presentation to Central TB Division reporting.
Section 2

Process Flow — 100-Day Active Case Finding Campaign

The 100-day Active Case Finding campaign was identified as a priority focus area, having demonstrated success in a high-density district. The process branches by source of identification (community symptomatic, OPD diabetes screening, household contact), converges on handheld X-ray and sputum confirmation, and terminates in CTD outcome reporting. The biomedical equipment reliability gap remains the most significant operational bottleneck.

FIG. 2 — 100-DAY ACTIVE CASE FINDING CAMPAIGN, CURRENT STATE
Campaign launched → source of identification (three-way classification): Source A, community symptomatic (ASHA-led; cough > 2 weeks); Source B, OPD diabetes screening (NCD-comorbidity entry point); Source C, household contact (existing TB patient profile).
Handheld X-ray + sputum confirmation (mobile screening, molecular diagnosis). Bottleneck: biomedical equipment 80–90% reliability gap.
Notification, classification (DS / DR / MDR / XDR), and DBT enrolment (DPC-led district workflow) → treatment initiation and CTD outcome reporting.
Pain points: 269 cases lose visibility; stigma blocks contacts; equipment downtime; private sector opaque; no comorbidity link.
Figure 2. 100-day Active Case Finding campaign showing the three identification sources, the handheld X-ray and sputum confirmation bottleneck, and the path through DPC notification to CTD outcome reporting.
Section 3

Swim-lane Diagram — TB Patient Journey

A swim-lane representation of a TB patient journey from symptom onset through outcome. Each lane represents an actor; each box is a discrete activity. Markers indicate hand-offs at which information drops out of the system, equipment unreliability impacts throughput, or comorbidity signals are lost.

FIG. 3 — TB PATIENT JOURNEY: SYMPTOM ONSET TO OUTCOME
Lanes: Patient + family; ASHA (sub-centre); PHC / CHC (screening); DPC / DTO (district); State / CTD (national).
Symptom onset: cough > 2 weeks (stigma, delay); ASHA identifies symptomatic household contacts (paper); refers to PHC, accompanying if needed.
Screening: patient travels to PHC; handheld X-ray + sputum sample (equipment reliability ~80%).
Diagnosis: molecular confirmation, DS / DR classification; notification and DBT enrolment (DPC overload).
Treatment: drug regimen with monthly supply; ASHA adherence visits and psycho-social support (philanthropic); NCD comorbidity (HTN, DM) untracked.
Outcome and reporting: daily Google Forms with manual Excel aggregation (no real-time); CTD national aggregate (no ranking); 269 cases lost to follow-up (visibility gap); DPS personnel handle ~80% of private reporting (opaque); outcome reporting (cure / loss / death) with no analytics.
Legend: activity · critical / failure point · paper / lossy hand-off.
Figure 3. Five-lane patient journey covering symptom onset, screening, diagnosis, treatment, and outcome; failure markers identify visibility, equipment, and integration gaps.
Section 4

Problem Statements

The following twenty problem statements were derived from the meeting transcript. Each is grounded in a specific observation made during the discussion and is written as a diagnostic claim suitable for testing, prioritisation, and conversion into formal requirements.

P-01
Visibility Gap

269 cases identified in the first 100-day round have no continued visibility through treatment.

Cases reach notification but the system does not provide a per-case treatment-trajectory view. Whether these cases initiated treatment, completed it, or were lost to follow-up cannot currently be answered with confidence.
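The candidate intervention is a per-case longitudinal view, and its essence is small: every notified case carries an explicit status timeline, so "notified but never initiated" becomes a query rather than an unknown. A sketch with assumed status names and illustrative dates:

```python
# Sketch of a per-case status timeline; the status vocabulary and dates
# are illustrative assumptions, not the CTD data model.
from datetime import date

case_a = {"case_id": "TB-2026-0001",
          "timeline": [("notified", date(2026, 1, 10)),
                       ("treatment_initiated", date(2026, 1, 17))]}
case_b = {"case_id": "TB-2026-0002",
          "timeline": [("notified", date(2026, 1, 12))]}

def current_status(case):
    """Latest status event on the case timeline."""
    return max(case["timeline"], key=lambda event: event[1])[0]

def notified_without_treatment(cases):
    """The current blind spot: notified cases with no treatment initiation."""
    return [c["case_id"] for c in cases
            if not any(s == "treatment_initiated" for s, _ in c["timeline"])]

flagged = notified_without_treatment([case_a, case_b])
# → ["TB-2026-0002"]
```

Run over the 269 identified cases, the same query would answer directly which of them initiated treatment and which were lost after notification.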

P-02
Reporting Mechanics

Daily reporting via Google Forms requires manual Excel work to derive performance categories.

Performance categorisation as "good", "yellow", or "poor" is not produced automatically. State staff manually rank facilities from downloaded sheets, which limits both frequency and reliability of the exercise.
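The categorisation rule itself is simple enough to automate directly over the downloaded data. A sketch, with placeholder thresholds (80% and 50% of target) that the state TB cell would set:

```python
# Sketch: automatic "good / yellow / poor" categorisation. The thresholds
# are placeholder assumptions, not the state's actual cut-offs.
def categorise(achieved: int, target: int) -> str:
    ratio = achieved / target if target else 0.0
    if ratio >= 0.8:
        return "good"
    if ratio >= 0.5:
        return "yellow"
    return "poor"

# (achieved, target) per facility — illustrative numbers
facilities = {"PHC-A": (95, 100), "PHC-B": (60, 100), "PHC-C": (30, 100)}
ranking = {name: categorise(a, t) for name, (a, t) in facilities.items()}
# → {"PHC-A": "good", "PHC-B": "yellow", "PHC-C": "poor"}
```

Encoding the rule once removes both limits named above: the exercise can run at any frequency, and two runs over the same data always agree.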

P-03
Private-Sector Reporting

Approximately 80% of plain-district reports flow through DPS personnel.

The Dedicated Private Sector (DPS) channel concentrates a large share of district-level reporting in private-sector hands. State TB Forum meetings convey aggregate signals, but operational visibility into individual private providers is limited.

P-04
Equipment Reliability

Biomedical equipment management issues affect 80–90% of facilities.

Handheld X-ray devices, sputum analysers, and related biomedical kit suffer chronic uptime gaps. Without a tracking system, downtime translates directly into screening throughput loss.
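A tracking system need not be elaborate to make downtime visible. A sketch of uptime computation from status-change events, assuming a hypothetical event shape (device, facility, date, status):

```python
# Sketch of an equipment uptime tracker; the event shape is an assumption.
from datetime import date

events = [  # (device, facility, date, status)
    ("xray-01", "PHC-A", date(2026, 4, 1), "up"),
    ("xray-01", "PHC-A", date(2026, 4, 10), "down"),
    ("xray-01", "PHC-A", date(2026, 4, 13), "up"),
]

def uptime_days(events, start, end):
    """Days the device spent in 'up' status between start and end."""
    total, status, since = 0, None, start
    for _, _, day, new_status in sorted(events, key=lambda e: e[2]):
        if status == "up":
            total += (day - since).days
        status, since = new_status, day
    if status == "up":
        total += (end - since).days
    return total

up = uptime_days(events, date(2026, 4, 1), date(2026, 4, 30))
# 9 days up (1st–10th) + 17 days up (13th–30th) = 26 of 29 days
```

Rolled up per facility, this is the number that would feed a facility dashboard and convert "equipment issues" from anecdote into a tracked throughput constraint.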

P-05
Stigma

Stigma is a structural barrier to family-level case finding.

Household contact tracing depends on disclosure within the family, which stigma actively suppresses. Patient support groups exist but rely on philanthropic capacity rather than an institutional channel.

P-06
Patient Support

Psycho-social support depends on philanthropic groups, not stable institutional channels.

Government of India provides per-month financial support, but counselling, peer support, and adherence reinforcement are delivered by patient support groups whose availability varies by district.

P-07
NCD Integration

NCD comorbidities (hypertension, diabetes) are not integrated with TB monitoring.

The infrastructure already exists for NCD screening; TB and NCD share patient cohorts in many cases. Integration would surface comorbidity patterns and improve both treatment and follow-up but is not currently designed.

P-08
DPC Overload

The District Programme Coordinator role is overloaded across planning, data entry, and supply chain.

A single role covers programme planning, data entry, supply-chain coordination, and handheld X-ray oversight. Capacity is finite; quality on every front suffers.

P-09
Facility Ranking

No facility ranking dashboard exists from current Google Forms / Excel data.

Identifying poor-performing facilities, blocks, or indicators requires bespoke analysis each time. The dashboarding need is well understood but not yet implemented.

P-10
ASHA Burden

AFB reporting and NCD 20-point capture overburden the same ASHA at the source.

Multiple vertical reporting requirements land on the same field worker. Without consolidation, additions to the data stack continue to compound burden and erode quality.

P-11
Data Rectification

Data quality requires manual rectification downstream — "purification" via SBNC.

Source data quality is treated as something to be cleaned later rather than captured cleanly. SBNC involvement keeps the system functioning but is not scalable as case volume grows.

P-12
Mapping Gap

The mapping from existing forms and Juniper sheets to dashboard parameters has not been defined.

The next step requires deriving a clear mapping from existing forms and Juniper sheets to specific dashboard fields. Without this, dashboard scope is open-ended.

P-13
Household Contact Tracking

Household contact treatment protocol exists but compliance is not consistently captured.

Per protocol, household contacts of an existing TB patient should receive evaluation and, where appropriate, preventive treatment. The protocol is in place; whether contacts were actually evaluated, and whether treatment was completed, is not tracked at the case-record level.

P-14
Drug Supply Visibility

TB drug supply chain depends on DPS distribution channel with limited downstream visibility.

Monthly drug supply for TB patients runs through the dedicated private-sector channel. Whether each patient actually received the planned monthly supply, on time, is not visible at the district programme level except by exception.

P-15
Campaign Replicability

100-day campaign success in a high-density district has uncertain replicability across other districts.

The 100-day Active Case Finding campaign worked in one high-density district. The factors responsible for that success — population density, team composition, equipment availability — have not been explicitly identified, so replication elsewhere is empirical.

P-16
State TB Forum Loop

State TB Forum meetings function as a discussion forum, not a corrective-action loop.

The forum meets, best practices are discussed, and dashboarding intentions are stated. There is no closing-the-loop mechanism to verify that discussion resulted in operational change at the field level.

P-17
Multi-Profile Operator

A single individual handling multiple patient profiles creates audit and quality risk.

The team noted that one person is "handling multiple profiles" — useful for cross-cutting insight, but problematic for audit, accountability, and segregation-of-duties when the same individual touches data across many cases.

P-18
HIV-TB Comorbidity

HIV-TB comorbidity surveillance is not integrated with the main TB workflow.

HIV-TB co-infection has its own clinical and reporting pathway via DPS / mortality records. Integration with the standard TB workflow is partial; the comorbidity signal does not flow into the main TB case view.

P-19
Equipment Rollout Support

2024 handheld X-ray rollout lacks supporting equipment-management infrastructure.

New handheld X-ray equipment was rolled out as part of the 100-day campaign. Equipment uptime, calibration status, and lifecycle management for the new fleet are not yet in a managed system, exposing the campaign to the same 80–90% reliability gap as legacy equipment.

P-20
Awareness Activity Outcome

School-based TB awareness activity volume is not measured into outcome impact.

School-based awareness sessions are conducted and counted, but their effect on subsequent symptomatic identification, screening uptake, or contact-tracing yield is not measured. The activity exists; its causal contribution to the programme is not characterised.

Section 5

Problem Mapping and Prioritisation

The table below maps each problem statement to its underlying root cause, the operational impact it produces, and a candidate intervention. Priority is graded by the extent to which the problem directly obstructs case finding, treatment continuity, or outcome reporting.

Ref. | Problem | Root Cause | Operational Impact | Candidate Intervention | Priority
P-01 | Visibility gap | No per-case treatment-trajectory store; aggregate counts only | 269 identified cases of unknown post-notification status | Per-case longitudinal view from notification through outcome | High
P-02 | Reporting mechanics | Google Forms / Excel pipeline with no automation | Performance categorisation manual and inconsistent | Automated facility-ranking layer atop existing Forms data | High
P-03 | Private-sector reporting | 80% of plain-district reports flow through DPS | Reduced operational visibility into private providers | Standardised private-sector reporting interface; periodic reconciliation audits | Medium
P-04 | Equipment reliability | No biomedical-equipment management system | Screening throughput compromised at 80–90% of facilities | Equipment uptime tracker tied to facility dashboards | High
P-05 | Stigma | Social barrier; not amenable to direct system intervention | Household contact tracing yield depressed | Discreet, family-level outreach protocols; integration with NCD touchpoints to reduce stigma | Long-term
P-06 | Patient support | Psycho-social services rely on philanthropy | Adherence and retention vary geographically | Institutional patient-support layer; helpline + peer-support tooling | Medium
P-07 | NCD integration | TB and NCD programmes operate as parallel verticals | Comorbidity patterns invisible; duplicate touch with same patient | Cross-programme patient view; OPD diabetes-screening as TB entry point | High
P-08 | DPC overload | Single role spanning planning, data entry, supply chain, equipment | Each function delivered with reduced quality | Tooling that automates DPC data-entry burden; dashboard self-service for state | High
P-09 | Facility ranking | No analytical layer over reporting data | Cannot identify worst-performing facility / indicator on demand | Dashboard with "worst-performing facility · worst-performing indicator" drill-down | High
P-10 | ASHA burden | Multiple vertical programmes converge at the same field worker | Capture quality erodes as form count grows | Consolidated capture surface — one form instance per visit | High
P-11 | Data rectification | Source data not validated at entry | Manual cleanup necessary; not scalable | In-form validation; deviation alerts at source | Medium
P-12 | Mapping gap | Forms / sheets → dashboard fields not yet aligned | Dashboard scope open-ended | Joint working session to produce form ↔ field map before dashboard build | High
P-13 | Household contact tracking | Protocol exists; compliance not captured at case level | Contact-tracing yield uncertain; preventive opportunities missed | Per-case household-contact register linked to the index patient record | High
P-14 | Drug supply visibility | DPS distribution channel; no patient-level supply tracking | Adherence interruptions invisible until exception surfaces | Patient-level monthly supply confirmation tracker tied to notification record | High
P-15 | Campaign replicability | Success factors of the high-density campaign not codified | Replication across districts is empirical, not engineered | Post-campaign factor analysis; replication playbook | Medium
P-16 | State TB Forum loop | Forum lacks a closing-the-loop mechanism | Discussion does not consistently translate into field change | Action-tracker tied to each forum meeting with field-level verification | Medium
P-17 | Multi-profile operator | Single role spans many profiles; audit burden | Accountability and segregation-of-duties weakened | Role-based access control with audit trail; multi-profile use logged and reviewed | Medium
P-18 | HIV-TB comorbidity | HIV-TB pathway runs in parallel to main TB workflow | Comorbidity signal absent from primary TB case view | Integrate HIV status into the TB case record with controlled access | High
P-19 | Equipment rollout support | Handheld X-ray fleet rolled out without lifecycle management | New equipment exposed to legacy 80–90% reliability gap | Equipment-management module specifically for handheld diagnostic fleet | High
P-20 | Awareness activity outcome | Activity counted; outcome contribution unmeasured | Cannot evaluate activity ROI or refine targeting | Outcome-linkage layer connecting awareness activity to subsequent screening uptake | Medium
Table 1. Mapping of TB problems to root causes, operational impacts, and candidate interventions, with assigned priority.
ASHA Team Uttarakhand × IIT Bombay VLabs Internal · For Official Use

ASHA Diary and Forms — Process and Problem Analysis

Scoping output from the meeting between the Uttarakhand National Health Mission (ASHA Team) and the IIT Bombay Virtual Labs team held on 7 May 2026.

Document Ref.
ASHA-UK / SCOPE / 2026-05 / 004
Version
1.0
Date of Issue
07 May 2026
Pages
1 of 1 (web)
Prepared For
Uttarakhand NHM — ASHA Team (Community Process)
Prepared By
IIT Bombay Virtual Labs Team
Source
Meeting transcript, 14:55–15:42 IST
Status
Draft for review
Purpose. This document presents a structured analysis of the ASHA diary and forms ecosystem as discussed in the scoping meeting. The session focused on the volume of paper-based forms ASHA workers carry, the coexistence of paper and digital capture without a designated source of truth, the role of the Block Entry Operator as a single point of digitisation, the placement of authenticity verification at the MCP level downstream of capture, and the misalignment between an incentive structure paid on submission count and the quality of care delivered. Five deliverables are included: a workflow diagram of the current data capture flow, a process-flow diagram of HBNC reporting (42-day, 60-parameter), a swim-lane view of an HBNC episode, a list of problem statements, and a problem-mapping table linking causes, impacts, and candidate interventions.
Section 1

Information Workflow — Current State

ASHA workers capture data at the point of contact — typically a home visit for HBNC, MCH, or HBY — onto a paper diary, transcribe it into per-card forms, and rely on a Block Entry Operator to digitise the data into the HMI portal and RCH 2.0. Authenticity is verified downstream by the MCP via a Google form, and incentive disbursement is calculated from submission counts rather than data quality. The diagram below depicts the complete flow with characteristic delays and the failure modes introduced along the way.

FIG. 1 — ASHA DATA CAPTURE FLOW: HOME VISIT TO INCENTIVE DISBURSEMENT
Stage 0 · field: Home visit (mother · newborn); handwritten on-the-spot entry.
Stage 1 · paper: ASHA diary; 25 sheets per month; overwhelming.
Stage 2 · 60 params: Form filling, per card · per visit; multiple forms per visit.
Stage 3 · re-key: Block Entry Operator digitises; bottleneck; single point of digitisation; duplicate risk.
Stage 4 · digital: HMI portal · RCH 2.0 listing; paper and digital coexist.
Stage 5 · MCP: Verification + incentive; checked downstream, not at source.
FAILURE MODES INTRODUCED ALONG THE WAY: Overwhelming form count is a 10+ year pain point · Paper diary still primary; digital exists in parallel · Same data re-keyed by Block Entry Operator · Authenticity checked at MCP downstream, not at source · Latest PDFs circulate via Facebook / AFD / WhatsApp · Incentive tied to submission volume, not data quality.
DECISION FEEDBACK LOOP — CURRENTLY BROKEN: Quality signal does not return to the ASHA; submission count is the only feedback she experiences.
What flows up: Submission counts (HBNC visits, MCP cards, HBY routines), digitised parameter values via HMI portal and RCH 2.0.
What does not flow: Visit-quality signal, completeness flags at source, training-progression status, cross-form patient-level reconciliation.
Figure 1. End-to-end ASHA data capture flow from home visit through paper diary, block-level digitisation, MCP verification, and incentive disbursement.
Section 2

Process Flow — HBNC Reporting (42 Days, 60 Parameters)

Home-Based Newborn Care reporting was discussed as the canonical worked example of the ASHA forms burden. The protocol prescribes scheduled visits across a 42-day post-natal window, each requiring a 60-parameter capture. The process below traces the episode from newborn delivery through visit-type branching, per-visit form bundle capture, block-level digitisation, MCP verification, and incentive disbursement.

FIG. 2 — HBNC REPORTING PROCESS, 42 DAYS · 60 PARAMETERS · CURRENT STATE
Start: newborn delivery → Visit type? (three-way schedule branch)
Initial visit · Day 1–3 (Days 0 · 1 · 3): critical newborn window.
Mid visit · Day 7–21 (Days 7 · 14 · 21): recurrent capture.
Closure visit · Day 28–42 (Days 28 · 42): outcome closure.
Converge: per-visit form, 60 parameters; diary aggregation, paper-first capture. BOTTLENECK — heavy at the visit point; quality erodes under load.
Next: block entry to HMI portal; MCP verification; re-key by operator; authenticity check via Google Form.
End: incentive disbursed by the ministry, based on submission count.
PAIN POINTS: 60 parameters per visit · Form count overwhelming · Paper–digital duality · Re-keying duplicates · Incentive on volume.
Figure 2. HBNC reporting process across the 42-day window, with the per-visit 60-parameter form bundle highlighted as the source-level bottleneck.
Section 3

Swim-lane Diagram — HBNC Episode

A swim-lane representation of a single HBNC episode from birth through Day 42 closure and incentive disbursement. Each lane represents an actor; each box is a discrete activity. Markers indicate hand-offs at which information drops out of the system, the same data is re-keyed, or quality signal fails to reach the worker who generated it.

FIG. 3 — HBNC EPISODE: BIRTH TO INCENTIVE DISBURSEMENT
Phases: BIRTH → DAY 3–7 → DAY 14–28 → DAY 42 → VERIFY · INCENTIVE
Mother + Newborn lane: newborn delivered at home / facility; receives visits at home; continued monitoring.
ASHA Field lane: Day 1–3 visit, paper diary entry [PAPER]; Day 3 · 7 visits, 60-parameter form [HEAVY LOAD]; Day 14 · 21 · 28 recurrent capture [REPEAT FORM]; Day 42 closure, diary aggregation; closure check, outcome; HBY visit, compulsory routine [UNMEASURED].
Block Entry Operator lane: re-key from paper to HMI portal [DUPLICATE RISK]; block entry queue, single point [BOTTLENECK].
District / MCP Verifier lane: MCP verification via Google Form check [DOWNSTREAM]; authenticity check at MCP, not source [LATE GATE].
State / Ministry lane: incentive disbursed, submission-based [NOT QUALITY].
HMI · RCH 2.0 lane: HMI portal record; RCH 2.0 listing [VOLUME ONLY].
Legend: Activity · Critical / failure point · Paper / lossy hand-off.
Figure 3. Five-lane episode view showing birth, scheduled visits, Day 42 closure, MCP verification, and incentive; failure markers identify duplicate-entry risk, downstream authenticity gating, and incentive-quality misalignment.
Section 4

Problem Statements

The following eighteen problem statements were derived from the meeting transcript. Each is grounded in a specific observation made during the discussion and is written as a diagnostic claim suitable for testing, prioritisation, and conversion into formal requirements.

P-01
Form Volume

The number of forms ASHA workers must complete is overwhelming and has been a pain point for over a decade.

Multiple programmes converge at the same field worker, each with its own form set. The total reporting load is described in the meeting as overwhelming — a problem heard for at least the last ten years. Digitisation alone, without consolidation, will not solve the burden.

P-02
Paper-First Capture

Paper diaries remain the primary capture surface; data is re-keyed downstream.

ASHA workers currently fill a paper diary at the visit. Block-level operators then digitise the same data into the HMI portal and RCH 2.0. The same fact is recorded twice — once on paper, once digitally — creating duplicate-entry risk and obscuring the source of truth.

P-03
Source-of-Truth Ambiguity

Digital and paper formats coexist for the same data without a designated source of truth.

When the two records disagree, there is no rule for which wins. Verification is done at the MCP downstream rather than at source, so reconciliation is reactive rather than systemic.

P-04
Authenticity Gate

Authenticity is verified at the MCP downstream, not at the source of capture.

An MCP-level Google Form is the current authenticity-checking mechanism. By the time the check runs, the data has already moved through the paper diary, form filling, and block entry — meaning errors are caught late and source-level accountability is weak.
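The candidate fix raised in the session, validation at the point of capture rather than at the MCP, can be sketched minimally. The field names, ranges, and alert format below are invented for illustration; the actual 60-parameter HBNC schema was not enumerated in the meeting.

```python
# Sketch of in-form validation at capture: mandatory-field gates and
# range checks run before a record is accepted, instead of a downstream
# MCP-level authenticity review. Field names and ranges are illustrative.

RULES = {
    "visit_day":    {"mandatory": True, "min": 0,    "max": 42},
    "weight_kg":    {"mandatory": True, "min": 0.5,  "max": 8.0},
    "temp_celsius": {"mandatory": True, "min": 30.0, "max": 43.0},
    "danger_signs": {"mandatory": False},
}

def validate_at_source(record):
    """Return a list of deviation alerts; an empty list means the record passes."""
    alerts = []
    for field, rule in RULES.items():
        value = record.get(field)
        if value is None:
            if rule["mandatory"]:
                alerts.append(f"{field}: mandatory field missing")
            continue
        if "min" in rule and not (rule["min"] <= value <= rule["max"]):
            alerts.append(f"{field}: {value} outside [{rule['min']}, {rule['max']}]")
    return alerts

# A Day-3 visit with one out-of-range value and one missing mandatory field:
print(validate_at_source({"visit_day": 3, "weight_kg": 12.0}))
```

The point of the sketch is placement, not the rules themselves: a record that fails here never reaches the block-entry queue, so the downstream authenticity gate becomes a spot check rather than the only check.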

P-05
Document Distribution

The latest PDF documents are distributed informally — via Facebook, AFD, and WhatsApp.

Workers depend on social-channel distribution to obtain current versions. There is no canonical document repository tied to the worker's identity or role, which means version drift and circulation gaps are routine.

P-06
Training Fragmentation

Training is fragmented across an audio tutorial and Part 1 / Part 2 modules.

Training material exists for HBNC (42 days, 60 parameters) and the MCB / mobile-bag for senior citizens, but it is split across formats and parts. Workers do not have a consolidated path; refresher and progression are ad hoc.

P-07
Incentive Misalignment

The incentive structure rewards submission volume rather than data quality or care outcomes.

Incentives are paid by the ministry on the basis of submissions. Quality of capture and the actual care signal are not part of what the worker is paid for. Behavioural change at the source is therefore disincentivised.
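The candidate intervention for this problem is a hybrid structure: a completion payment plus a quality bonus. A minimal sketch of that arithmetic, with invented rates and an assumed quality score in [0, 1] (for example, the share of a worker's records passing source-level checks):

```python
# Sketch: hybrid incentive = completion payment + quality bonus, as an
# alternative to paying on submission count alone. The rates and the
# quality score are invented for illustration.

def hybrid_incentive(submissions, quality_score, rate=100, bonus_rate=50):
    """quality_score in [0, 1], e.g. share of records passing source checks."""
    completion = submissions * rate
    bonus = submissions * bonus_rate * quality_score
    return completion + bonus

# Same submission count, different capture quality:
print(hybrid_incentive(20, 0.9))   # careful capture earns more
print(hybrid_incentive(20, 0.3))   # volume alone no longer maximises pay
```

The design choice the sketch makes visible: the completion component preserves the existing volume signal, while the bonus makes quality enter the worker's pay for the first time.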

P-08
Compulsory Routine

HBY (home-based young child) visits are treated as a compulsory routine rather than a quality-measured activity.

Visits happen because they are scheduled, and submissions are recorded because they are required. Whether the visit produced clinical value is not a tracked dimension. The result is high volume with uncertain effect.

P-09
Single-Form Principle

The principle of "single form for single card for important work" is not yet enforced.

Multiple forms still cover overlapping data for the same MCP card or HBNC episode. The intent — one card, one form — was acknowledged in the meeting but is operationally absent today.

P-10
Block-Entry Bottleneck

The Block Entry Operator is a bottleneck and a transcription failure point.

All paper-to-digital conversion routes through a single role per block. Throughput is limited by that role's capacity, and any transcription error introduced there propagates into the HMI portal as authoritative data.

P-11
Visit-Level Form Weight

The 60-parameter HBNC form is heavy at the visit point.

Capturing 60 parameters at each home visit, on paper, in field conditions, is structurally unfavourable to data quality. Completion rates and consistency are predictably affected as visits accumulate.
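The decomposition candidate, per-visit-type subsets with mandatory fields first, can be sketched as metadata plus a filter. The catalogue entries below are invented, not the actual HBNC parameter list; only the structure is the point.

```python
# Sketch: decompose a flat 60-parameter form into per-visit-type subsets,
# capturing mandatory fields at the visit and deferring optional ones.
# Catalogue entries are illustrative, not the real HBNC schema.

CATALOGUE = [
    {"id": 1,  "name": "birth_weight",   "visits": {"initial"},                   "mandatory": True},
    {"id": 12, "name": "cord_condition", "visits": {"initial", "mid"},            "mandatory": True},
    {"id": 34, "name": "feeding_count",  "visits": {"initial", "mid", "closure"}, "mandatory": True},
    {"id": 47, "name": "mother_diet",    "visits": {"mid", "closure"},            "mandatory": False},
    {"id": 58, "name": "outcome_status", "visits": {"closure"},                   "mandatory": True},
]

def form_for_visit(visit_type, mandatory_only=True):
    """Fields to capture at this visit; optional fields can be deferred."""
    return [p["name"] for p in CATALOGUE
            if visit_type in p["visits"]
            and (p["mandatory"] or not mandatory_only)]

print(form_for_visit("mid"))          # mandatory subset at a mid visit
print(form_for_visit("mid", False))   # full subset if time allows
```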

P-12
Cross-Form Interoperability

Cross-form interoperability is not designed; the same family appears in multiple disconnected forms.

The MCB card, HBNC form, RCH 2.0 listing, and HBY visit record share patient-level data but are not linked. Reconciliation depends on the operator manually identifying the patient across forms — a step that is not part of the system.
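The candidate intervention is identifier-based linking. A minimal sketch of the join, assuming each form record already carries a shared beneficiary_id; the records are invented, and the hard part in practice (assigning that identifier consistently at capture) is out of scope here.

```python
# Sketch: link records from disconnected form systems into one
# patient-level view keyed on a shared beneficiary identifier.
# Source names mirror the systems discussed (MCB, HBNC, RCH 2.0, HBY);
# the rows themselves are illustrative.

from collections import defaultdict

sources = {
    "MCB":  [{"beneficiary_id": "UK-0042", "card_issued": "2026-01-12"}],
    "HBNC": [{"beneficiary_id": "UK-0042", "visit_day": 7, "weight_kg": 3.4}],
    "RCH":  [{"beneficiary_id": "UK-0042", "rch_listing": "active"}],
    "HBY":  [{"beneficiary_id": "UK-0099", "routine_visit": True}],
}

def patient_view(sources):
    """Group every record from every source under its beneficiary_id."""
    view = defaultdict(dict)
    for source, records in sources.items():
        for record in records:
            view[record["beneficiary_id"]].setdefault(source, []).append(record)
    return dict(view)

linked = patient_view(sources)
print(sorted(linked["UK-0042"]))   # which systems hold data on this patient
```

Once such a view exists, the reconciliation the operator currently performs by eye becomes a lookup.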

P-13
Onboarding Gap

The onboarding tutorial is short and informal, leaving new workers underprepared.

New ASHA workers receive a brief "welcome" tutorial and short online materials. There is no structured, role-specific onboarding programme that would prepare a new worker for the form burden and capture quality expectations they will face from day one.

P-14
Senior-Citizen Programme Load

Senior-citizen MCB / mobile-bag programme adds workload without incentive accounting.

The MCB card and mobile-bag-for-senior-citizens programmes are part of the ASHA workload but are not separately accounted for in the incentive structure. Time spent on these activities competes with HBNC and MCH duties.

P-15
Per-Month Form Volume

Approximately 25 sheets per month of form work compounds across multiple programmes.

The team explicitly cited a 25-page-per-month form volume per ASHA worker. Across HBNC, MCH, NCD screening, immunization due-list, and senior-citizen tracking, the total page count per worker is materially higher than commonly assumed.

P-16
Audio-Only Training

Audio tutorial format for HBNC training limits reference-back utility for the 60-parameter protocol.

HBNC training uses an audio tutorial covering the 42-day, 60-parameter protocol. Audio is hard to skim and hard to reference at the point of a visit; workers cannot quickly look up "what does parameter 47 require?" while at the home of a beneficiary.

P-17
Card System Fragmentation

Multiple disconnected card systems (MCB, NCB, MCH) require parallel workflow management.

Different beneficiary categories have different cards (MCB, NCB, MCH, HBY). Each carries its own form set and its own workflow. The single-form-per-card principle would consolidate this, but is not enforced today.

P-18
Month-End Batching

Increment / incentive verification dependence creates month-end batching pressure on the worker.

Incentive disbursement is calculated from monthly submission counts verified at the MCP level. Workers therefore rationally batch their submissions toward month-end, which reduces data freshness and concentrates digitisation load on the Block Entry Operator at the close of every cycle.

Section 5

Problem Mapping and Prioritisation

The table below maps each problem statement to its underlying root cause, the operational impact it produces, and a candidate intervention. Priority is graded by the extent to which the problem directly obstructs the programme objectives of clean source-level capture, single-source-of-truth records, and a quality-aligned incentive structure.

Ref. | Problem | Root Cause | Operational Impact | Candidate Intervention | Priority
P-01 | Form volume | Multiple verticals converge at the same worker without consolidation | Capture quality erodes; worker burden compounds over time | Form consolidation programme — single capture surface per visit, fan-out to existing portals | High
P-02 | Paper-first capture | Paper diary culturally entrenched; digital not deployed at source | Same fact recorded twice; duplicate-entry risk | Source-level digital capture replacing paper diary; mobile-first worker app | High
P-03 | Source-of-truth ambiguity | No designated authoritative record between paper and digital | Reconciliation reactive; data lineage opaque | Declare digital capture as authoritative; paper as transitional only | High
P-04 | Authenticity gate | Verification placed downstream at MCP rather than at capture | Errors caught late; source-level accountability weak | In-form validation at capture (range checks, mandatory-field gates, GPS-based authenticity) | High
P-05 | Document distribution | No canonical repository tied to worker identity | Version drift; circulation gaps | Role-based document delivery via the worker app; auto-update on policy change | Medium
P-06 | Training fragmentation | Material distributed across audio tutorial and Part 1 / Part 2 modules | No consolidated learning path; refresher ad hoc | Unified, role-based learning track with progression tracking | Medium
P-07 | Incentive misalignment | Ministry incentive paid on submission count alone | Quality signal not rewarded; behavioural change disincentivised | Hybrid incentive structure — completion plus quality / outcome bonus | High
P-08 | Compulsory routine | Visits scheduled and submitted; outcome impact not tracked | High visit volume with uncertain clinical value | Visit-quality marker tied to a clinical signal (e.g., growth, vaccination, alert resolution) | Medium
P-09 | Single-form principle | Programme verticals each retain their own forms | Overlapping data captured multiple times | Form-merge audit; enforce one-card-one-form rule with cross-programme reuse | High
P-10 | Block-entry bottleneck | Single operator per block performs all paper-to-digital conversion | Throughput limited; transcription errors propagate | Eliminate paper-to-digital re-keying via source-level digital capture | High
P-11 | Visit-level form weight | 60 parameters captured per visit on paper in field conditions | Completion and consistency suffer over the 42-day window | Form decomposition by visit type; mandatory fields only at the visit, optional fields deferred | High
P-12 | Cross-form interoperability | Forms share patient-level data but are not linked | Reconciliation manual; patient-level view absent | Patient identifier linking across MCB, HBNC, RCH 2.0, HBY records | High
P-13 | Onboarding gap | Welcome tutorial short and informal; no structured programme | New workers underprepared for actual form burden | Structured onboarding curriculum with progression tracking | Medium
P-14 | Senior-citizen programme load | MCB / mobile-bag programmes added without incentive recalibration | Additional time competes with core MCH / HBNC duties | Workload accounting in incentive formula; explicit time allocation | Medium
P-15 | Per-month form volume | 25+ sheets per month per worker across vertical programmes | Capture quality erodes with volume | Form decomposition + consolidation audit; mandatory-only fields per visit | High
P-16 | Audio-only training | Training material delivered as audio tutorial | Reference-back at the point of visit is impractical | Searchable text + structured checklist embedded in worker app | High
P-17 | Card system fragmentation | Different cards for different beneficiary categories with parallel workflows | Workflow burden multiplied by card-type count | Enforce single-form-per-card principle; consolidate where possible | High
P-18 | Month-end batching | Incentive tied to monthly submission count verified downstream | Data freshness reduced; digitisation load spikes month-end | Continuous-credit incentive structure decoupled from batch verification | Medium
Table 1. Mapping of ASHA Diary and Forms problems to root causes, operational impacts, and candidate interventions, with assigned priority.
Immunization Uttarakhand × IIT Bombay VLabs Internal · For Official Use

Immunization Programme — Process and Problem Analysis

Scoping output from the meeting between the Uttarakhand National Health Mission (Immunization Division) and the IIT Bombay Virtual Labs team held on 7 May 2026.

Document Ref.
IMM-UK / SCOPE / 2026-05 / 005
Version
1.0
Date of Issue
07 May 2026
Pages
1 of 1 (web)
Prepared For
Uttarakhand NHM — Immunization Division
Prepared By
IIT Bombay Virtual Labs Team
Source
Meeting transcript, 15:47–16:15 IST
Status
Draft for review
Purpose. This document presents a structured analysis of the Uttarakhand immunization programme as discussed in the scoping meeting. The session focused on the urban outreach failure rate, the structural denominator dispute with the centre, the absence of an ABHA linkage from U-WIN, the lack of a facility-level HMIS to attribute AEFI or outbreak signals to specific lots, the rotation of programme priorities that prevents a dedicated urban team, and the practice of informal data entry by workers other than the formal block data operator. Five deliverables are included: a workflow diagram of the current information flow, a process-flow diagram of outreach session execution, a swim-lane view of an immunization episode, a list of problem statements, and a problem-mapping table linking causes, impacts, and candidate interventions.
Section 1

Information Workflow — Current State

Immunization information originates with a beneficiary becoming due per the schedule, is captured on paper at the outreach session by an ANM or ASHA, transcribed into Excel and U-WIN at the district level, and surfaces at the centre as district-level coverage figures whose denominator is contested. The diagram below depicts the end-to-end flow with characteristic delays and the failure modes introduced along the way.

FIG. 1 — IMMUNIZATION INFORMATION FLOW: BENEFICIARY TO NATIONAL DATABASE
Stage 0 · community: Beneficiary (mother · child); due-list, overdue tracking.
Stage 1 · field: ANM / ASHA; due-list · outreach; 50% urban session failure (weather · migration).
Stage 2 · MO record: PHC / sub-centre; vaccinate · lot entry; lot recorded on paper, then Excel.
Stage 3 · district: DIO / district; Excel aggregation; no HMIS.
Stage 4 · portal: State · U-WIN entry; no ABHA link.
Stage 5 · disputed: National GoI database; denominator dispute; data rejected; 78% coverage disputed.
FAILURE MODES INTRODUCED ALONG THE WAY: 50% outreach session failure in urban areas (weather, migration) · Urban "blind spots" with no system-level visibility · U-WIN not linked to ABHA, fragmenting patient identity · No HMIS at facility level, so outbreak-to-vaccine attribution is impossible · GoI database rejects the state denominator · Informal data entry by non-designated workers in place of formal block operators.
DECISION FEEDBACK LOOP — CURRENTLY BROKEN: Coverage signal disputed at the centre; state and field cannot tell whether 78% is the floor or the ceiling.
What flows up: Dose counts, U-WIN entries, district-level coverage estimates against a contested denominator.
What does not flow: Lot-level traceability at the decision point, AEFI / outbreak-vaccine correlation, end-to-end social-determinant linkage (dropout, marriage, mortality).
Figure 1. End-to-end immunization information flow from beneficiary through outreach, vaccination, and district aggregation to the disputed national database.
Section 2

Process Flow — Outreach Session Execution

The outreach session is the operational unit at which immunization meets the field. The process branches by setting (rural / peri-urban / urban), converges on vaccine administration with lot recording, and terminates in U-WIN entry against a contested denominator. The bottleneck identified in the meeting is the inaccessibility of lot information at the point of clinical decision and the absence of a facility-level HMIS to support that decision.

FIG. 2 — OUTREACH IMMUNIZATION SESSION EXECUTION, CURRENT STATE
Start: outreach session scheduled → Setting? (three-way classification)
Rural village: ANM session at sub-centre; standard protocol.
Peri-urban: mixed migration · variable; tracking unstable.
Urban: "don't miss the opportunity"; 50% weather failure.
Converge: vaccine administered · lot recorded; BCG · routine schedule · paper-first capture. BOTTLENECK — lot number not accessible at point of decision; no HMIS link.
Next: Excel record · U-WIN entry; often by a non-designated worker, not the block data operator.
End: U-WIN record; GoI denominator dispute; no ABHA link.
PAIN POINTS: 50% urban session fail · Migration breaks tracking · Lot number missing · Don't-miss-it risk · Informal data entry.
Figure 2. Outreach session execution showing the urban failure mode (50% weather-related), the lot-recording bottleneck, and the path through U-WIN to GoI database with denominator dispute.
Section 3

Swim-lane Diagram — Immunization Episode

A swim-lane representation of a single immunization episode from beneficiary identification through outreach, vaccination, recording, and reporting. Each lane represents an actor; each box is a discrete activity. Markers indicate hand-offs at which information drops out of the system, urban sessions fail, lot-level traceability is lost, or the denominator dispute manifests at the centre.

FIG. 3 — IMMUNIZATION EPISODE: IDENTIFICATION TO REPORTING
Phases: IDENTIFICATION → OUTREACH → VACCINATION → RECORDING → REPORTING
Beneficiary lane (mother · child): becomes due per schedule; attends if weather permits.
ANM / ASHA field lane: maintain due-list, overdue tracking [PAPER]; conduct session, weather-dependent [50% URBAN FAIL]; mobilise, "don't miss it" [ANSWERABLE].
MO PHC lane: vaccinate, BCG · routine; lot record, no HMIS access [DECISION GAP]; AEFI risk [UNATTRIBUTABLE].
DIO / District Coordinator lane: informal entrant, not block operator [DANGEROUS]; Excel · U-WIN district aggregation [NO HMIS].
State / U-WIN lane: U-WIN ↛ ABHA, no patient link [FRAGMENTED]; priority shift every 2–3 months [NO DEDICATION].
National GoI lane: GoI denominator rejection · 78% [DISPUTED]; end-to-end study missing [NO CORRELATION].
Legend: Activity · Critical / failure point · Paper / lossy hand-off.
Figure 3. Five-lane episode view covering identification, outreach, vaccination, recording, and reporting; failure markers identify the urban session failure rate, informal data entry, U-WIN-ABHA fragmentation, and the GoI denominator dispute.
Section 4

Problem Statements

The following twenty problem statements were derived from the meeting transcript. Each is grounded in a specific observation made during the discussion and is written as a diagnostic claim suitable for testing, prioritisation, and conversion into formal requirements.

P-01
Outreach Failure

Approximately 50% of outreach immunization sessions fail in urban areas.

The dominant cause cited is weather, but underlying contributors include migration, the absence of a system-level tracking mechanism, and the difficulty of reaching beneficiaries in dense urban settings. The 50% failure rate is operationally significant and persistent.

P-02
Urban Blind Spots

Urban areas function as system-level "blind spots" for immunization tracking.

The HMI ecosystem assumes a stable rural denominator, which urban demographics break. Migration, informal housing, and private-sector vaccination reduce the share of immunizations the public system observes, and there is no compensating data layer.

P-03
Migration Instability

Migration in urban and peri-urban settings makes the denominator structurally unstable.

The eligible population shifts faster than the registration cadence. Beneficiaries enter and leave the catchment without being tracked across boundaries. Coverage estimates derived against a stale denominator are systematically biased.
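The direction and size of the bias follow from the coverage arithmetic: doses administered divided by the eligible population. A worked example with invented numbers:

```python
# Sketch: how a stale denominator biases coverage. All counts here are
# invented purely to show the arithmetic; they are not meeting figures.

doses_administered = 780
registered_eligible = 1000        # stale denominator from the last listing
unregistered_inmigrants = 150     # entered the catchment since that listing

reported = doses_administered / registered_eligible
true_cov = doses_administered / (registered_eligible + unregistered_inmigrants)

print(f"reported coverage: {reported:.0%}")
print(f"true coverage:     {true_cov:.0%}")
```

Out-migration biases the estimate in the opposite direction, which is why a contested denominator leaves it unclear whether a reported figure is a floor or a ceiling.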

P-04
Denominator Dispute

The Government of India database does not accept state-submitted denominator and coverage figures.

State coverage is reported at 78%. The centre disputes the denominator on which this is computed, despite the state team having taken multiple factors into account. The result is a chronic disagreement on whether the programme is performing well or poorly.

P-05
Priority Shifts

Programme priorities shift every two to three months, preventing a dedicated urban team.

Urban immunization needs sustained, dedicated capacity, but rotating priorities mean staff and attention move before the urban strategy can mature. The team identified this rotation as a structural barrier to fixing the urban gap.

P-06
U-WIN ↛ ABHA

U-WIN is not linked to the ABHA system.

A beneficiary's immunization record in U-WIN cannot be reconciled with their broader health identity in ABHA. This fragments patient identity, blocks longitudinal views, and prevents joining vaccination data with other clinical signals.

P-07
No Facility HMIS

Local facilities lack a Health Management Information System.

When an outreach session is followed by a possible AEFI or outbreak, there is no system that lets the facility test whether the events are vaccine-related. The clinical question cannot be answered with the data on hand.

P-08
Lot Number Access

BCG vaccine number and lot number are not accessible at the point of decision.

For a clinician facing a possible adverse event, the specific vaccine lot administered to a specific child should be the first datum. Currently it requires retrieval from paper-based records that are not at hand, delaying ground-level decisions.

P-09
"Don't Miss the Opportunity"

The urban strategy of vaccinate-and-be-answerable creates a downstream accountability burden.

Because formal eligibility cannot always be verified at the point of contact, the practical strategy is to vaccinate when the family is in front of you and accept the answerability later. This is rational at the worker level but accumulates risk at the system level.

P-10
Informal Data Entry

Data is often entered by an informal worker rather than the designated block data operator.

The role responsible for entering immunization data is not always the designated block data operator. Training, accountability, and standardisation suffer when the actual data entrant is informal, and the team flagged this configuration explicitly as dangerous.

P-11
End-to-End Analysis Gap

End-to-end analytical study correlating social determinants with health outcomes is missing.

The team wants to understand the chain from school dropout and early marriage to infant mortality and maternal malnutrition. The data exists in silos but is not joined, so the chain cannot be traced or interrupted with evidence.

P-12
Excel as De Facto HMIS

In the absence of a real HMIS, Excel sheets serve as the de facto management system.

Excel is what facility and district staff actually use to track sessions, lots, and coverage. As an analytical and audit surface it is brittle, but as a stop-gap it is universal. Any HMIS replacement must absorb the work that Excel currently does.

P-13
Overdue-List Tracking

The overdue list serves as the primary tracking mechanism instead of a real-time, forward-looking view of scheduled visits.

Workers track who has missed their schedule via an overdue list. This is a backward-looking signal — by the time a beneficiary appears on the overdue list, they have already missed the window. A forward-looking scheduled-visit view is not the operational primary.
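A forward-looking due list can be derived from the same birth dates and schedule that drive the overdue list. A sketch with a deliberately simplified schedule (the real Universal Immunization Programme schedule has more antigens and dose windows than shown here):

```python
# Sketch: derive a forward-looking due list from birth dates plus a
# vaccination schedule, instead of waiting for the backward-looking
# overdue list. Schedule offsets are simplified for illustration.

from datetime import date, timedelta

SCHEDULE_DAYS = {"BCG": 0, "OPV-1": 42, "OPV-2": 70, "Measles-1": 270}

def due_in_window(beneficiaries, today, horizon_days=14):
    """Doses falling due within `horizon_days`, before they can be missed."""
    due = []
    for b in beneficiaries:
        for dose, offset in SCHEDULE_DAYS.items():
            due_date = b["dob"] + timedelta(days=offset)
            if (dose not in b["received"]
                    and today <= due_date <= today + timedelta(days=horizon_days)):
                due.append((b["id"], dose, due_date))
    return sorted(due, key=lambda row: row[2])

kids = [{"id": "C-01", "dob": date(2026, 3, 30), "received": {"BCG"}}]
print(due_in_window(kids, today=date(2026, 5, 7)))
```

The same data, queried forward rather than backward, turns the tracking signal from "already missed" into "due next week".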

P-14
Access vs Uptake Gap

Physical house-facility coverage at 99% but vaccination coverage at 78% — uptake is the gap, not access.

Houses are reached at near-universal coverage; vaccinations are administered at 78%. The 21-percentage-point gap is therefore not an access problem but an uptake problem — a different intervention class than what access-focused programmes would address.

P-15
Demographic Capture

Reproductively-active demographic (~13.7 families per area) is not systematically captured for planning.

Planning for maternal and neonatal vaccination depends on knowing the reproductively-active subset of the catchment. The current data capture does not flag this subset, so planning runs against undifferentiated population counts.

P-16
Informal Eligibility Heuristic

Family connectivity is used as an informal eligibility heuristic in urban settings.

Where formal eligibility cannot be verified at the point of contact, "well-connected to family" is used as a proxy for legitimacy. The vaccinator assumes the answerability and proceeds. The heuristic is rational but uncodified; consistency varies by worker.

P-17
Mobility Data Integration

Migration and population mobility data are not integrated with U-WIN to follow beneficiaries across boundaries.

When a beneficiary moves between catchments, their vaccination record does not move with them automatically. Fresh registration is required; dose history may be lost or duplicated. Migration breaks the longitudinal record.

P-18
AEFI Lot Traceability

AEFI surveillance requires lot-level traceability that current architecture does not support.

When an Adverse Event Following Immunization is suspected, the first datum needed is which lot was administered to the affected individual and to which other individuals from the same lot. This linkage is not queryable from current systems.
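The missing capability is a single join from one administered dose to every other dose drawn from the same lot. A minimal sketch of that query, using an in-memory SQLite table (the table and column names are illustrative assumptions, not the U-WIN schema):

```python
import sqlite3

# Illustrative dose register; beneficiary IDs and lot numbers are made up.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE administered_doses (
    beneficiary_id TEXT, vaccine TEXT, lot_no TEXT, given_on TEXT)""")
con.executemany(
    "INSERT INTO administered_doses VALUES (?, ?, ?, ?)",
    [("B-001", "Penta", "LOT-42", "2026-05-01"),
     ("B-002", "Penta", "LOT-42", "2026-05-01"),
     ("B-003", "Penta", "LOT-77", "2026-05-02")])

def same_lot_cohort(con, case_id):
    """Given an AEFI case, return every beneficiary who received a dose
    from the same lot -- the first query an investigation needs."""
    rows = con.execute("""
        SELECT DISTINCT d2.beneficiary_id
        FROM administered_doses d1
        JOIN administered_doses d2 ON d1.lot_no = d2.lot_no
        WHERE d1.beneficiary_id = ?""", (case_id,)).fetchall()
    return sorted(row[0] for row in rows)

print(same_lot_cohort(con, "B-001"))  # ['B-001', 'B-002']
```

With lot numbers on paper rather than captured at administration, this join simply has no data to run against, which is the architectural gap P-18 names.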

P-19
Specialist Rotation

Specialist staffing rotation compounds the 2–3 month priority-shift problem.

In addition to programme priorities rotating, the specialist staffing assigned to those programmes also rotates. The combined effect is that institutional knowledge of urban immunization barriers does not accumulate in any single team.

P-20
Cohort Attrition Modelling

The 50% urban outreach session failure rate creates downstream cohort attrition that is not modelled in coverage figures.

Sessions that fail are typically rescheduled, but not all targeted beneficiaries are reached at the rescheduled session. The attrition between original target and eventual coverage is not separately tracked, so the true urban funnel is opaque.
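What a funnel-level view would expose can be sketched with illustrative numbers (the counts below are hypothetical, chosen only to make each attrition step visible):

```python
# Hypothetical cohort funnel for one urban outreach target group.
def cohort_funnel(target, reached_original, reached_reschedule, vaccinated):
    """Return each funnel stage with its count and cumulative attrition,
    making the gap between original target and eventual coverage explicit."""
    stages = [
        ("target", target),
        ("reached (original + reschedule)",
         reached_original + reached_reschedule),
        ("vaccinated", vaccinated),
    ]
    return [(name, count, target - count) for name, count in stages]

for name, count, lost in cohort_funnel(target=100, reached_original=50,
                                       reached_reschedule=30, vaccinated=70):
    print(f"{name}: {count} (attrition vs target: {lost})")
```

Today only the final coverage figure is reported; the intermediate stage, and therefore the attrition at each step, is invisible.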

Section 5

Problem Mapping and Prioritisation

The table below maps each problem statement to its underlying root cause, the operational impact it produces, and a candidate intervention. Priority is graded by the extent to which the problem directly obstructs the programme objectives of accurate urban coverage, lot-level traceability, and integration with the broader patient health record.

Ref. | Problem | Root Cause | Operational Impact | Candidate Intervention | Priority
P-01 | Outreach failure | Weather and migration disrupt urban sessions; no fallback mechanism | 50% of urban outreach sessions yield no vaccinations | Indoor / fixed-site urban immunization points; weather-resilient session design | High
P-02 | Urban blind spots | HMIS architecture assumes stable rural denominators | Urban coverage chronically under-counted | Urban-specific tracking layer with private-sector data ingestion | High
P-03 | Migration instability | Eligible population shifts faster than registration cadence | Coverage estimates systematically biased | Beneficiary-portable record (ABHA-linked) that survives location change | High
P-04 | Denominator dispute | State and centre use different population baselines | Coverage figures contested; programme performance unclear | Co-defined denominator methodology with documented assumptions | High
P-05 | Priority shifts | Rotating programme priorities at the state level | Urban immunization team cannot mature | Ring-fenced urban immunization capacity insulated from quarterly rotation | Medium
P-06 | U-WIN ↛ ABHA | Vertical-programme architecture; no cross-system identity link | Vaccination data isolated from broader health record | ABHA linkage at U-WIN registration; deduplication at the centre | High
P-07 | No facility HMIS | Investment in HMIS not yet reached facility level | AEFI and outbreak attribution not possible | Lightweight facility HMIS with vaccine-event linkage | High
P-08 | Lot number access | Lot data captured on paper; not at decision-point | Clinical decisions made without lot context | Lot-and-batch capture into U-WIN at administration; queryable from facility | High
P-09 | "Don't miss the opportunity" | Eligibility verification not feasible at urban point-of-contact | Worker-level rational behaviour accumulates system-level risk | Real-time eligibility check on the worker device; reduce ad-hoc decisions | Medium
P-10 | Informal data entry | Block data operator role unfilled in practice | Training, accountability, and standardisation compromised | Formalise data-entry role; certification and auditability | High
P-11 | End-to-end analysis gap | Data lives in vertical silos; no joining identifier | Social-determinant → outcome chain not traceable | Cross-domain linkage layer with controlled access for research and policy use | Long-term
P-12 | Excel as de facto HMIS | Real HMIS absent; staff fall back on what works | Brittle, non-auditable analytical surface | HMIS replacement that explicitly absorbs current Excel workflows | High
P-13 | Overdue-list tracking | Primary tracking is backward-looking (overdue) rather than forward-looking (scheduled) | Workers see misses after the fact; preventive nudge absent | Forward-looking scheduled-visit view with proactive reminders | High
P-14 | Access vs uptake gap | 21-point gap between house-facility coverage (99%) and vaccination coverage (78%) | Uptake-targeted interventions absent; access-focused interventions over-applied | Uptake-targeted programme distinct from access-targeted programme; behavioural / consent layer | High
P-15 | Demographic capture | Reproductively-active subset not flagged in capture | Maternal / neonatal planning runs against undifferentiated counts | Demographic flag in beneficiary record; planning view filtered accordingly | Medium
P-16 | Informal eligibility heuristic | Formal eligibility check infeasible at urban point-of-contact | Worker-discretion variability; inconsistent eligibility application | Real-time eligibility check on worker device; heuristic codified as fallback | Medium
P-17 | Mobility data integration | U-WIN record bound to original catchment; no cross-catchment portability | Dose history lost or duplicated when beneficiary moves | ABHA-linked beneficiary record with cross-catchment portability | High
P-18 | AEFI lot traceability | Lot data not joined to administered-dose records at query time | AEFI investigation cannot identify cohort exposed to same lot | Lot-and-batch capture at administration; queryable AEFI surveillance view | High
P-19 | Specialist rotation | Specialist staffing rotates alongside programme priorities | Institutional knowledge does not accumulate in one team | Ring-fenced urban immunization specialist team with longer assignment tenure | Medium
P-20 | Cohort attrition modelling | Failed-session attrition not separately tracked through reschedule | True urban funnel opaque; coverage figures hide attrition steps | Funnel-level cohort tracker showing target → reached → vaccinated by session | High
Table 1. Mapping of immunization problems to root causes, operational impacts, and candidate interventions, with assigned priority.