
CRO Testing Guide

Contract Research Organizations (CROs) can use OpenFactory’s testing system to validate clinical trial infrastructure and ensure FDA 21 CFR Part 11 compliance.

Overview

CROs spend months validating infrastructure for clinical trials. OpenFactory enables you to:

  • Define executable tests for data collection systems
  • Validate FDA compliance automatically
  • Ensure data integrity and audit trails
  • Test integration with EDC, LIMS, and safety databases

Why This Matters for CROs

Traditional validation requires:

  • Manual testing protocols
  • Extensive documentation
  • Months of validation cycles
  • Re-validation for every change

With OpenFactory:

  • Tests are code (version controlled)
  • Re-run validation in minutes
  • Reproduce environments exactly
  • Automated compliance verification
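
Because tests are code, assertion evaluation is easy to reason about. As a rough sketch (not OpenFactory's actual implementation), a `command_output` assertion with the `command`, `contains`, and `exit_code` parameters used throughout this guide could be evaluated like this:

```python
import subprocess

def check_command_output(params: dict) -> bool:
    """Evaluate a command_output-style assertion: run the command,
    then verify the expected exit code and/or output substring."""
    result = subprocess.run(
        params["command"], shell=True, capture_output=True, text=True
    )
    if "exit_code" in params and result.returncode != params["exit_code"]:
        return False
    if "contains" in params and params["contains"] not in result.stdout:
        return False
    return True

# A trivially true assertion, analogous to the JSON examples below
assert check_command_output({"command": "echo VALID", "contains": "VALID"})
```

Omitting a parameter simply skips that check, which matches how the JSON examples sometimes specify only `contains` and sometimes only `exit_code`.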

Clinical Trial Use Cases

Central Lab Data Collection

Validate HL7 message processing and lab result storage:

```json
{
  "test_config": {
    "custom_tests": [
      {
        "description": "Verify HL7 message queue operational",
        "category": "data-collection",
        "assertions": [
          { "type": "service_running", "description": "HL7 listener service", "params": { "service": "hl7-receiver" } },
          { "type": "port_listening", "description": "HL7 port 2575 listening", "params": { "port": "2575" } },
          { "type": "command_output", "description": "Parse sample HL7 message", "params": { "command": "python3 /opt/scripts/validate_hl7.py /opt/test-data/sample_lab.hl7", "contains": "VALID" } }
        ]
      },
      {
        "description": "Verify GxP compliance - audit trails",
        "category": "compliance",
        "assertions": [
          { "type": "service_running", "description": "Audit daemon running", "params": { "service": "auditd" } },
          { "type": "file_exists", "description": "Audit log exists", "params": { "path": "/var/log/audit/audit.log" } },
          { "type": "file_contains", "description": "Audit rules configured", "params": { "path": "/etc/audit/audit.rules", "content": "-w /var/lib/postgres" } }
        ]
      }
    ]
  }
}
```

What this validates:

  • HL7 receiver is running and accepting connections
  • Sample messages parse correctly
  • Audit logging is enabled for database access
  • Changes are tracked per 21 CFR Part 11
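
The `/opt/scripts/validate_hl7.py` script referenced above is site-specific; a minimal sketch of such a validator, assuming standard HL7 v2 pipe-delimited segments, might look like:

```python
import sys

def validate_hl7(text: str) -> bool:
    """Very small HL7 v2 sanity check: the message must start with an
    MSH segment and carry a message type (MSH-9, e.g. ORU^R01)."""
    segments = [s for s in text.replace("\r", "\n").split("\n") if s]
    if not segments or not segments[0].startswith("MSH|"):
        return False
    fields = segments[0].split("|")
    # After splitting on "|", MSH-9 (message type) lands at index 8
    return len(fields) > 8 and bool(fields[8])

if __name__ == "__main__":
    # Read the file passed on the command line, or fall back to a sample
    message = open(sys.argv[1]).read() if len(sys.argv) > 1 else (
        "MSH|^~\\&|LAB|CENTRAL|EDC|SITE01|20240115||ORU^R01|MSG001|P|2.5\n"
        "OBX|1|NM|GLU^Glucose||5.4|mmol/L|3.9-6.1|N|||F"
    )
    print("VALID" if validate_hl7(message) else "INVALID")
```

The `command_output` assertion above only checks for the string `VALID` in stdout, so a real validator would typically layer on stricter checks (encoding characters, required segments) while keeping that same output contract.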

Site Coordinator EDC Workstation

Locked-down workstation for accessing Electronic Data Capture systems:

```json
{
  "test_config": {
    "custom_tests": [
      {
        "description": "Verify EDC access restricted",
        "category": "security",
        "assertions": [
          { "type": "package_installed", "description": "2FA authentication", "params": { "package": "libpam-google-authenticator" } },
          { "type": "file_contains", "description": "Browser whitelisting", "params": { "path": "/etc/chromium/policies/managed/url_allowlist.json", "content": "trial.imedidata.com" } },
          { "type": "command_output", "description": "USB storage disabled", "params": { "command": "lsmod | grep usb_storage", "exit_code": 1 } }
        ]
      },
      {
        "description": "Verify 21 CFR Part 11 - time sync",
        "category": "compliance",
        "assertions": [
          { "type": "service_running", "description": "Time sync service", "params": { "service": "chrony" } },
          { "type": "command_output", "description": "Time synchronized", "params": { "command": "chronyc tracking", "contains": "System time" } }
        ]
      }
    ]
  }
}
```

What this validates:

  • Users can only access approved EDC sites
  • Two-factor authentication is configured
  • USB drives are blocked (prevent data exfiltration)
  • System clocks are synchronized (required for audit trails)

Safety Database Workstation

For safety physicians reviewing serious adverse events:

```json
{
  "test_config": {
    "custom_tests": [
      {
        "description": "Verify MedDRA dictionary loaded",
        "category": "data-validation",
        "assertions": [
          { "type": "command_output", "description": "MedDRA LLT table exists", "params": { "command": "psql -h safety-db -U safetyuser -d safetydb -c '\\dt meddra_llt'", "contains": "meddra_llt" } },
          { "type": "command_output", "description": "MedDRA version 26.1", "params": { "command": "psql -h safety-db -U safetyuser -d safetydb -tAc 'SELECT version FROM meddra_version'", "contains": "26.1" } }
        ]
      },
      {
        "description": "Verify CIOMS form generation",
        "category": "reporting",
        "assertions": [
          { "type": "command_output", "description": "Generate CIOMS I form", "params": { "command": "python3 /opt/safety/generate_cioms.py --sae-id TEST001 --output /tmp/test.pdf", "exit_code": 0 } },
          { "type": "file_exists", "description": "CIOMS form created", "params": { "path": "/tmp/test.pdf" } }
        ]
      }
    ]
  }
}
```

What this validates:

  • MedDRA dictionary is installed (required for SAE coding)
  • Correct MedDRA version is loaded
  • CIOMS form generation works
  • All reporting tools are functional

Biostatistics Workstation

CDISC SDTM/ADaM validation and statistical analysis:

```json
{
  "test_config": {
    "custom_tests": [
      {
        "description": "Verify CDISC validation tools",
        "category": "data-validation",
        "assertions": [
          { "type": "command_output", "description": "R SDTM validator package", "params": { "command": "Rscript -e 'library(sdtm.oak)'", "exit_code": 0 } },
          { "type": "command_output", "description": "Validate sample SDTM dataset", "params": { "command": "python3 /opt/cdisc/validate_sdtm.py /opt/test-data/dm.xpt", "contains": "PASS" } }
        ]
      },
      {
        "description": "Verify statistical software",
        "category": "analysis",
        "assertions": [
          { "type": "command_output", "description": "R can load SAS datasets", "params": { "command": "Rscript -e 'library(haven); read_sas(\"/opt/test-data/adsl.sas7bdat\")'", "exit_code": 0 } },
          { "type": "package_installed", "description": "SciPy installed", "params": { "package": "python3-scipy" } }
        ]
      }
    ]
  }
}
```

What this validates:

  • CDISC validation tools are installed
  • Sample datasets validate correctly
  • R can read SAS datasets (common in clinical trials)
  • Statistical libraries are available

DICOM Imaging Workstation

For radiologists reviewing medical images (CT, MRI, X-ray) in clinical trials:

```json
{
  "name": "dicom-imaging-station",
  "base_image": "ubuntu-24.04",
  "features": ["desktop", "gxp"],
  "packages": ["dcmtk", "weasis", "orthanc"],
  "test_config": {
    "custom_tests": [
      {
        "description": "Verify DICOM viewer launches and displays images",
        "category": "gui-validation",
        "assertions": [
          { "type": "gui_application_opens", "description": "Weasis DICOM viewer launches", "params": { "command": "weasis", "window_title": "Weasis", "timeout": 10 } },
          { "type": "gui_window_visible", "description": "DICOM viewer main window visible", "params": { "window_title": "Weasis Medical Viewer" } },
          { "type": "gui_application_process", "description": "Weasis process is running", "params": { "process_name": "weasis" } }
        ]
      },
      {
        "description": "Verify DICOM network services (PACS connectivity)",
        "category": "network",
        "assertions": [
          { "type": "service_running", "description": "Orthanc DICOM server running", "params": { "service": "orthanc" } },
          { "type": "port_listening", "description": "DICOM C-STORE port 4242 listening", "params": { "port": 4242 } },
          { "type": "command_output", "description": "Query PACS for test study", "params": { "command": "findscu -S -aec AE_TITLE -k QueryRetrieveLevel=STUDY -k StudyInstanceUID=1.2.3.4 localhost 4242", "exit_code": 0 } }
        ]
      },
      {
        "description": "Verify image rendering with screenshot validation",
        "category": "visual-validation",
        "assertions": [
          { "type": "gui_execute_command", "description": "Load test DICOM image and capture screenshot", "params": { "command": "weasis /opt/test-data/chest-xray.dcm", "wait_seconds": 5 } }
        ]
      }
    ]
  }
}
```

What this validates:

  • DICOM viewer application launches successfully
  • Main viewer window displays correctly
  • PACS network connectivity (C-STORE, C-FIND, C-MOVE protocols)
  • Test DICOM images load and render
  • Screenshot capture for the FDA audit trail: visual proof that images rendered correctly

:::info Screenshot Capture in Action

When `gui_execute_command` runs, OpenFactory:

  1. Executes the command via the guest agent (capturing stdout and exit code)
  2. Simultaneously captures a VNC screenshot of the desktop
  3. Stores both as a screenshot + metadata pair (PNG + JSON)
  4. Displays them in the HTML test report for visual verification

This provides visual proof that the DICOM image rendered correctly, which is critical for FDA validation of imaging systems used in clinical trials.

:::
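
The store step of that flow can be sketched in a few lines (illustrative only; the `assertion-<id>.png`/`.json` pairing follows the metadata format shown below, not OpenFactory internals):

```python
import json
import tempfile
from pathlib import Path

def store_assertion_artifacts(outdir: Path, assertion_id: int,
                              png_bytes: bytes, metadata: dict) -> None:
    """Persist a screenshot/metadata pair (assertion-<id>.png next to
    assertion-<id>.json) so the HTML report and later visual
    comparisons can find both files together."""
    outdir.mkdir(parents=True, exist_ok=True)
    (outdir / f"assertion-{assertion_id}.png").write_bytes(png_bytes)
    (outdir / f"assertion-{assertion_id}.json").write_text(
        json.dumps(metadata, indent=2)
    )

# Usage with placeholder PNG bytes and example-shaped metadata
out = Path(tempfile.mkdtemp())
store_assertion_artifacts(
    out, 123, b"\x89PNG\r\n\x1a\n",
    {"tags": ["edc-login", "passed"],
     "properties": {"description": "EDC login screen displays"}},
)
print(sorted(p.name for p in out.iterdir()))  # the PNG and its JSON sidecar
```

Keeping the metadata in a sidecar file rather than embedded in the PNG makes the pair easy to index, diff, and archive for audit purposes.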

Visual Validation & Screenshot Comparison

Beyond command-line testing, OpenFactory captures real VNC screenshots during test execution to validate visual interfaces. This is critical for CROs validating GUI-based clinical trial systems like EDC clients, CTMS interfaces, and medical imaging viewers.

Current Capabilities: Screenshot Capture

Every GUI assertion automatically captures a VNC screenshot alongside command output. This provides:

  • Visual proof of execution: Screenshots show exactly what was on screen during the test
  • Audit trail for FDA compliance: Timestamped visual evidence of system behavior
  • Debug assistance: See visual errors (dialog boxes, error messages, UI state)
  • Documentation: Screenshots stored with structured metadata for archival and future comparison

Screenshot + Metadata Format

Each screenshot is stored with structured metadata to enable future visual comparison. The system captures two files per assertion:

  • `assertion-123.png` - Screenshot captured via VNC during test execution
  • `assertion-123.json` - Metadata: match areas, tags, assertion details, command output

```json
{
  "area": [
    {
      "xpos": 100,
      "ypos": 50,
      "width": 300,
      "height": 200,
      "type": "match",
      "match": 95
    }
  ],
  "tags": ["edc-login", "passed"],
  "properties": {
    "description": "EDC login screen displays",
    "stdout": "Login form rendered successfully"
  }
}
```

Automated Screenshot Comparison (Available Now)

OpenFactory now supports automated comparison of screenshots against reference images, detecting visual regressions automatically.

gui_screenshot_matches Assertion

Compare captured screenshots to reference images, failing if visual differences exceed threshold:

```json
{
  "type": "gui_screenshot_matches",
  "description": "EDC form layout matches approved design",
  "params": {
    "reference_id": "edc-form-v3.2-approved",
    "threshold": 0.95,
    "ignore_regions": [[10, 10, 200, 30]],
    "match_area": [0, 0, 1920, 1080],
    "save_as_reference": false,
    "wait_before": 1.0
  }
}
```

Use case: Detect when a software update inadvertently changes EDC form layouts, button positions, or required field indicators — all visual regressions that could invalidate a validated system.

How it works:

  1. Capture current screenshot
  2. Load reference image from storage
  3. Apply ignore regions (mask dynamic content)
  4. Compare using perceptual image hashing
  5. Generate diff image if failed (highlights differences in red)
  6. Log comparison history for audit trail
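
The comparison logic (steps 1-4) can be illustrated with a pure-Python sketch; here a simple average hash over an 8x8 grayscale grid stands in for the imagehash library's perceptual hash, and ignore-region coordinates are grid cells rather than screen pixels:

```python
def average_hash(img):
    """64-bit average hash of an 8x8 grayscale grid (8 rows of 8 ints):
    each bit records whether that pixel is brighter than the mean."""
    pixels = [p for row in img for p in row]
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def similarity(a, b, ignore_regions=()):
    """Hamming similarity between two 8x8 images after masking
    ignore regions, each given as (x, y, width, height)."""
    a = [row[:] for row in a]
    for x, y, w, h in ignore_regions:  # mask dynamic content (timestamps, session IDs)
        for yy in range(y, min(y + h, 8)):
            for xx in range(x, min(x + w, 8)):
                a[yy][xx] = b[yy][xx]
    ha, hb = average_hash(a), average_hash(b)
    matches = sum(1 for p, q in zip(ha, hb) if p == q)
    return matches / len(ha)

reference = [[10 * (r + c) for c in range(8)] for r in range(8)]
assert similarity(reference, reference) == 1.0  # identical screens match fully
```

A threshold check such as `similarity(...) >= 0.95` then mirrors the `threshold` parameter of `gui_screenshot_matches`; the perceptual-hash approach tolerates minor rendering noise while still flagging layout changes.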

Advanced GUI Assertions (Available Now)

gui_click_element - Click UI elements at coordinates

```json
{
  "type": "gui_click_element",
  "params": {
    "x": 600,
    "y": 400,
    "button": "left",
    "double_click": false,
    "verify_click": true
  }
}
```

gui_form_fill - Automated form data entry

```json
{
  "type": "gui_form_fill",
  "params": {
    "fields": [
      { "name": "patient_id", "value": "PT-001", "method": "tab" },
      { "name": "visit_date", "value": "2024-01-15", "method": "click", "x": 300, "y": 250 }
    ],
    "submit_button": { "x": 600, "y": 500 }
  }
}
```

gui_text_visible - OCR-based text detection (requires tesseract)

```json
{
  "type": "gui_text_visible",
  "params": {
    "text": "Consent Form Approved",
    "region": [100, 100, 800, 600],
    "case_sensitive": false
  }
}
```

Visual Validation Features

Area-Based Matching

  • Define regions of interest with perceptual hashing comparison
  • Ignore dynamic content like timestamps or session IDs
  • Configurable similarity thresholds (default: 0.95)

Perceptual Image Comparison

  • Detects visual changes while tolerating minor rendering differences
  • Handles anti-aliasing, fonts, and subtle color shifts
  • Uses imagehash library for robust comparison

Reference Library Management

  • Maintain approved reference screenshots for each validated screen
  • Automatic diff image generation highlighting changes
  • Comparison history logging for audit trails
  • Update library when intentional design changes are validated

Clinical Trial Visual Validation Use Cases

EDC Form Validation

Verify that electronic Case Report Forms (eCRFs) display correctly after software updates:

  • Required field indicators (*) are visible
  • Date pickers use correct format (DD-MMM-YYYY)
  • Dropdown options match protocol specifications
  • Edit checks display appropriate error messages

Medical Imaging Viewer UI

Validate DICOM viewer interface for radiology endpoints:

  • Measurement tools display calibrated scale
  • Window/level presets render correctly (lung, bone, soft tissue)
  • Annotation overlays are visible and editable
  • Multi-frame sequences load in correct order

Safety Database Interface

Verify adverse event reporting forms and MedDRA coding interfaces:

  • SAE forms display all required fields per ICH E2B
  • MedDRA autocomplete suggests correct preferred terms
  • Seriousness criteria checkboxes are visible and functional
  • CIOMS form preview matches regulatory template

Why Visual Validation Matters for FDA Compliance

FDA 21 CFR Part 11 requires that computerized systems be validated to ensure accuracy, reliability, and consistent intended performance. Visual validation provides:

  • Audit Trail: Timestamped screenshots prove that system interfaces displayed correctly during validation testing (required for IQ/OQ/PQ documentation)
  • Change Control: Automated visual regression testing detects when software updates inadvertently modify validated interfaces
  • Protocol Compliance: Screenshots verify that EDC forms match protocol-specified data collection requirements
  • Inspection Readiness: Visual proof reduces reliance on manual documentation and human attestation during FDA inspections

Test Composability

Instead of 50 pre-baked variants, create one base image with different test suites:

Base Image: GxP Compliant Workstation

```json
{
  "name": "gxp-base-workstation",
  "base_image": "ubuntu-24.04",
  "features": ["desktop", "gxp", "audit-logging"],
  "packages": ["postgresql-client", "python3", "r-base"]
}
```

Test Suite 1: Central Lab

```json
{
  "test_config": {
    "custom_tests": [
      // HL7 validation tests
      // Database schema tests
      // Audit trail tests
    ]
  }
}
```

Test Suite 2: EDC Workstation

```json
{
  "test_config": {
    "custom_tests": [
      // Browser whitelisting tests
      // 2FA tests
      // USB blocking tests
    ]
  }
}
```

Test Suite 3: Safety Database

```json
{
  "test_config": {
    "custom_tests": [
      // MedDRA validation
      // CIOMS generation
      // Electronic signatures
    ]
  }
}
```

Benefits:

  • Single base image to maintain
  • Test suites validate different workflows
  • Update base → all tests re-run automatically
  • No duplication of effort
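
Composing a build is then just a matter of combining the base definition with a suite. A minimal sketch, assuming configs are plain JSON documents as in the examples above:

```python
import json

def compose(base: dict, test_suite: dict) -> dict:
    """Combine a base image definition with a workflow-specific
    test_config; the base dict is left untouched so suites can be
    swapped freely over the same image."""
    merged = dict(base)
    merged.update(test_suite)
    return merged

base = {"name": "gxp-base-workstation", "base_image": "ubuntu-24.04",
        "features": ["desktop", "gxp", "audit-logging"]}
central_lab_suite = {"test_config": {"custom_tests": [
    {"description": "Verify HL7 message queue operational",
     "category": "data-collection"},
]}}

build = compose(base, central_lab_suite)
print(json.dumps(build, indent=2))  # one image definition, one suite attached
```

Swapping `central_lab_suite` for an EDC or safety-database suite reuses the same base unchanged, which is the whole point of composability.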

FDA 21 CFR Part 11 Validation

Required elements for electronic records:

| Requirement | Assertion Type | Example |
| --- | --- | --- |
| Audit Trails | `service_running`, `file_exists` | Verify auditd logging database access |
| Time Synchronization | `service_running`, `command_output` | Ensure chrony/ntpd synchronized |
| User Authentication | `package_installed`, `file_contains` | Check PAM 2FA configuration |
| Electronic Signatures | `command_output`, `file_exists` | Validate signature libraries installed |
| Data Integrity | `command_output`, `file_contains` | Check database backups configured |

Complete 21 CFR Part 11 Test Suite

```json
{
  "test_config": {
    "custom_tests": [
      {
        "description": "21 CFR Part 11 - Audit Trails",
        "category": "compliance",
        "assertions": [
          { "type": "service_running", "params": { "service": "auditd" } },
          { "type": "service_enabled", "params": { "service": "auditd" } },
          { "type": "file_contains", "params": { "path": "/etc/audit/audit.rules", "content": "-w /var/lib/clinical-data" } }
        ]
      },
      {
        "description": "21 CFR Part 11 - Time Synchronization",
        "category": "compliance",
        "assertions": [
          { "type": "service_running", "params": { "service": "chrony" } },
          { "type": "command_output", "params": { "command": "chronyc sources", "contains": "^*" } }
        ]
      },
      {
        "description": "21 CFR Part 11 - Access Controls",
        "category": "compliance",
        "assertions": [
          { "type": "package_installed", "params": { "package": "libpam-google-authenticator" } },
          { "type": "file_contains", "params": { "path": "/etc/pam.d/common-auth", "content": "pam_google_authenticator.so" } }
        ]
      }
    ]
  }
}
```

Coming Soon

Future assertion types for clinical trials:

Data & API Validation

| Assertion | Description | Use Case |
| --- | --- | --- |
| `database_query` | Execute SQL and validate results | Check MedDRA tables, verify data integrity |
| `data_format` | Validate HL7, DICOM, CDISC formats | Ensure messages meet specifications |
| `api_responds` | Test REST API endpoints | EDC connectivity, CTMS integration |
| `file_age` | Verify data freshness | Ensure logs are recent, backups current |
| `signature_valid` | Validate electronic signatures | Verify signed CIOMS forms, audit logs |
| `certificate_valid` | Check SSL/TLS certificates | Ensure EDC/CTMS connections are secure |

Best Practices

1. Start with Core Requirements

Define tests for must-have functionality first:

  • System boots and users can log in
  • Critical services start
  • Network connectivity works

2. Add Compliance Tests

Layer on regulatory requirements:

  • Audit logging enabled
  • Time synchronization working
  • Access controls configured

3. Validate Workflows

Test end-to-end data flows:

  • HL7 messages → database
  • EDC data entry → validation
  • SAE reports → CIOMS generation

4. Use Test Categories

Organize tests by:

  • compliance - 21 CFR Part 11, GxP
  • data-collection - HL7, LIMS integration
  • data-validation - CDISC, format checks
  • security - Access controls, encryption
  • reporting - CIOMS, TLFs, CSRs

5. Document Expected Behavior

Add clear descriptions to every test:

```json
{
  "description": "Verify HL7 receiver processes ORM messages",
  "assertions": [...]
}
```

Not:

```json
{
  "description": "Test HL7",
  "assertions": [...]
}
```

Getting Help

For CRO-specific validation requirements:

  1. Book a demo - Discuss your trial infrastructure
  2. Review our examples - See pre-built validation tests
  3. Custom assertions - We can add new assertion types for your needs

Contact sales@openfactory.tech to discuss your clinical trial validation requirements.