How to Log and Document SDR Findings and Annotations for Compliance

Documenting SDR Findings and Annotations: Best Practices for Compliance

Why Proper Logging of SDR Findings Matters

In remote monitoring setups, the effectiveness of Source Data Review (SDR) is directly linked to how findings are documented. SDR involves reviewing source data—either electronically or through scanned uploads—to identify inconsistencies, missing information, and potential protocol deviations. But unless these observations are logged, categorized, and traceable, they hold little regulatory value.

Regulators such as the FDA and EMA expect SDR logs to show not just that review occurred, but what was found, who reviewed it, and how it was handled. ICH E6(R2) also mandates appropriate documentation of monitoring activities, including centralized and risk-based methods like SDR. During inspections, poorly documented or untraceable SDR findings can result in major observations or CAPA requirements.

This article provides a practical guide to logging SDR findings, using reviewer annotations, categorizing issues, and mapping these entries to monitoring reports and TMF documentation.

Key Components of a Compliant SDR Finding Log

Whether maintained in Excel, eClinical systems, or within an SDR platform, a compliant finding log should include the following fields:

  • Finding ID: Unique identifier (e.g., SDR-F-2024-015)
  • Reviewer Name & Role: Tracks accountability and role (Central Monitor, CRA, etc.)
  • Site/Subject/Visit: Links the finding to specific patient data
  • Review Date: Timestamps the review activity
  • Finding Category: Predefined code (e.g., Missing Data, AE Inconsistency, Protocol Deviation)
  • Finding Description: Free-text summary of the issue
  • Action Taken: Outcome (e.g., Escalated to CRA, Site Contacted, CAPA Initiated)
  • Resolution Date: Date of closure or resolution (if applicable)
  • TMF Filing Reference: TMF section where related documentation is stored

This format ensures that each SDR finding is uniquely logged, traceable to a reviewer and subject, and linked to outcome documentation.
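The fields above map naturally onto a structured record. Below is a minimal sketch in Python of one finding-log row; the class name, field names, and the `is_open` helper are illustrative assumptions, not a standard schema, and any real system would add validation and audit-trail support.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class SDRFinding:
    """One row of an SDR finding log; fields mirror the list above (illustrative)."""
    finding_id: str          # unique identifier, e.g., "SDR-F-2024-015"
    reviewer: str            # name and role, e.g., "J. Doe (Central Monitor)"
    site_subject_visit: str  # links the finding to specific patient data
    review_date: date        # timestamped review activity
    category: str            # predefined code, e.g., "Missing Data"
    description: str         # free-text summary of the issue
    action_taken: str        # e.g., "Escalated to CRA"
    resolution_date: Optional[date] = None  # closure date, if applicable
    tmf_reference: str = ""  # TMF section where related documentation is stored

    def is_open(self) -> bool:
        """A finding with no resolution date is still open."""
        return self.resolution_date is None
```

Keeping `resolution_date` optional makes it trivial to filter open findings for follow-up while preserving a complete history of closed ones.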

Using Annotations to Capture Contextual Reviewer Input

Annotations are reviewer comments added directly to source documents during SDR—often via eSource platforms or scanned document tools. These annotations may highlight data discrepancies, unclear narratives, missing lab values, or incorrect visit dates.

Key best practices for annotations:

  • Always include reviewer initials and timestamp
  • Be specific: “Visit 4 labs not uploaded” is clearer than “Missing data”
  • Maintain a master list of annotation types for categorization
  • Annotate in-system if supported, or use external templates for scanned PDFs
  • Redact sensitive PHI when sharing annotated records externally

Annotation logs should be stored with SDR finding logs or attached to alerts in the monitoring dashboard. Some sponsors include them as appendix materials in SDR Summary Reports for quarterly monitoring reviews.
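The annotation practices above (reviewer initials, timestamp, controlled type list) can be enforced programmatically. The sketch below is an assumption-laden illustration: the annotation type codes and the `make_annotation` helper are hypothetical, and a real master list would be sponsor-defined.

```python
from datetime import datetime, timezone

# Hypothetical master list of annotation types for categorization;
# a sponsor would define and version-control its own controlled list.
ANNOTATION_TYPES = {"MISSING_DATA", "DATE_DISCREPANCY", "UNCLEAR_NARRATIVE", "LAB_VALUE"}

def make_annotation(initials: str, ann_type: str, note: str) -> dict:
    """Build an annotation entry with reviewer initials and a UTC timestamp."""
    if ann_type not in ANNOTATION_TYPES:
        raise ValueError(f"Unknown annotation type: {ann_type}")
    if not note.strip():
        raise ValueError("Annotation must be specific, e.g. 'Visit 4 labs not uploaded'")
    return {
        "initials": initials,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "type": ann_type,
        "note": note,
    }
```

Rejecting unknown type codes at entry time keeps the annotation archive consistent with the master list, which simplifies later categorization and reporting.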

Standard Finding Categories for SDR Logs

To ensure consistency, sponsors should define a controlled vocabulary or coding system for categorizing SDR findings. Common categories include:

  • Eligibility Error: Missing documentation of inclusion/exclusion criteria
  • Informed Consent Issue: Incomplete form, incorrect version, wrong date
  • AE/SAE Discrepancy: Mismatch between source and CRF
  • Visit Deviation: Out-of-window visits or unscheduled assessments
  • Lab Value Anomaly: Missing lab ranges or flagged abnormal values
  • eCRF Mismatch: Source does not match entered CRF values

Each category should be linked to an action template and escalation procedure. This allows for consistency across reviewers and sites.
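One way to link each category to an action template and escalation route is a simple lookup table. The mapping below is purely illustrative: the category codes, template wording, and escalation targets are assumptions standing in for whatever a sponsor's SOPs define.

```python
# Illustrative category-to-action mapping; codes and escalation
# routes are assumptions, not a standard.
CATEGORY_ACTIONS = {
    "ELIGIBILITY_ERROR":  {"template": "Request eligibility source documents", "escalate_to": "CRA"},
    "CONSENT_ISSUE":      {"template": "Verify consent version and date",      "escalate_to": "CRA"},
    "AE_SAE_DISCREPANCY": {"template": "Query site for reconciliation",        "escalate_to": "Medical Monitor"},
    "VISIT_DEVIATION":    {"template": "Log protocol deviation",               "escalate_to": "CTM"},
    "LAB_ANOMALY":        {"template": "Request lab ranges or clarification",  "escalate_to": "CRA"},
    "ECRF_MISMATCH":      {"template": "Issue data query",                     "escalate_to": "Data Management"},
}

def route_finding(category: str) -> dict:
    """Return the action template and escalation target for a finding category."""
    try:
        return CATEGORY_ACTIONS[category]
    except KeyError:
        raise ValueError(f"Uncategorized finding: {category!r}; add it to the controlled vocabulary")
```

Failing loudly on an unknown category forces reviewers to extend the controlled vocabulary rather than inventing ad-hoc labels, which is what keeps behavior consistent across reviewers and sites.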

Linking SDR Findings to CAPA and Monitoring Reports

Findings that indicate protocol non-compliance or recurring data issues should be escalated to the CRA or CTM and documented in the issue management system. Link each SDR finding to any CAPA, Protocol Deviation Form (PDF), or Monitoring Visit Report (MVR) using unique IDs.

Best practices:

  • Use a “Finding ID” as the anchor across SDR log, CAPA tracker, and MVR
  • Maintain a crosswalk table if multiple systems are used (e.g., CTMS + Excel)
  • File CAPA evidence and review confirmations in the TMF under section 5.2.1 or 5.4.1
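When multiple systems are in play (e.g., CTMS plus Excel), the crosswalk table amounts to a join on the shared Finding ID. The sketch below shows the idea with plain dictionaries; the key names (`finding_id`, `capa_id`, `mvr_id`) are hypothetical placeholders for whatever identifiers the actual systems use.

```python
def build_crosswalk(sdr_log, capa_tracker, mvr_entries):
    """Join three record lists on the shared 'finding_id' key.

    Returns one row per SDR finding with the matching CAPA and MVR
    references, or None where a finding was not escalated.
    """
    capa_by_id = {c["finding_id"]: c["capa_id"] for c in capa_tracker}
    mvr_by_id = {m["finding_id"]: m["mvr_id"] for m in mvr_entries}
    return [
        {
            "finding_id": f["finding_id"],
            "capa_id": capa_by_id.get(f["finding_id"]),
            "mvr_id": mvr_by_id.get(f["finding_id"]),
        }
        for f in sdr_log
    ]
```

Because the Finding ID is the single anchor, the same join works whether the source records live in a CTMS export, a spreadsheet, or an SDR platform.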

Example: SDR Documentation Process in a Multicenter Study

In a global Phase III diabetes trial, SDR was performed weekly by central monitors. The sponsor implemented a three-tier documentation system:

  • SDR Review Log – 350+ subject reviews documented by reviewer, subject, and date
  • SDR Finding Tracker – 92 findings logged, with 40 escalations and 12 linked to CAPAs
  • Annotation Archive – 60+ annotated screenshots filed with reviewer initials and timestamps

During an EMA inspection, reviewer logs and SDR findings for six subjects were sampled. The inspector confirmed complete documentation, traceability to CAPAs, and correct TMF indexing; no SDR-related findings were raised.

Conclusion: Build a Defensible SDR Documentation Framework

Properly documenting SDR findings and annotations is essential for demonstrating oversight and data quality. Without structured logs, clear reviewer notes, and TMF traceability, sponsors risk observations and delays during regulatory inspections.

Key takeaways:

  • Use standardized SDR finding log templates with unique IDs
  • Train reviewers on annotation expectations and finding categorization
  • Ensure every finding is linked to an action: escalation, CAPA, or comment
  • File logs and summaries in TMF for inspection readiness
  • Maintain reviewer accountability through name, date, and role tracking

By building strong documentation systems around SDR, sponsors can enhance remote oversight and ensure regulatory confidence in their monitoring programs.
