CRP 106 Lecture 4 Flashcards

1
Q

Traditional data management methods

A

Paper source documents
* Paper Case Report Forms (CRFs)
* Monitor checks to ensure the source and CRF match
* Data queries are generated and addressed by the site
* Monitor "pulls" completed CRFs from the site and returns them to the Sponsor
* Data entered into the database by the data management team
* Double data entry can also be used to ensure accuracy of the data

2
Q

EDC in Risk-Based Monitoring

A
  • To most effectively implement an RBM program, study
    data must be quickly obtained from study sites
  • Algorithms assess and compare data across sites
  • Real-time assessments of site progress and compliance
3
Q

EDC Basics

A

Replaces paper-based CRFs with eCRFs
* System that allows for electronic entry of data from the study site
* Electronic transmission of data to the Sponsor
* A variety of Sponsor personnel can remotely access the data
* Data can be assessed to ensure data integrity and clean data

4
Q

EDC Data Entry Quality Checks

A

Preprogrammed checks
* Range verification
* Missing data not accepted
* Field verification/variable types
  * Only accepts numerical values or categorical variables
  * Reduces errors by not allowing inappropriate data types
On-demand reviews
* Quick assessment of data by Sponsor staff
* Queries can be generated and addressed quickly
* Some systems have auto-queries
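As an illustration of how preprogrammed checks work, here is a minimal Python sketch of range, missing-data, and variable-type verification. The field names, ranges, and query messages are hypothetical, not taken from any particular EDC system.

```python
# Minimal sketch of preprogrammed EDC edit checks; field specs are hypothetical.
FIELD_SPECS = {
    "systolic_bp": {"type": float, "range": (60, 250)},   # mmHg
    "sex": {"type": str, "allowed": {"M", "F"}},
}

def check_entry(field_name, value):
    """Return a list of query messages for one eCRF field entry."""
    spec = FIELD_SPECS[field_name]
    queries = []
    if value is None or value == "":
        queries.append(f"{field_name}: missing data not accepted")
        return queries
    try:
        value = spec["type"](value)          # field/variable type verification
    except (TypeError, ValueError):
        queries.append(f"{field_name}: expected a {spec['type'].__name__} value")
        return queries
    low, high = spec.get("range", (None, None))
    if low is not None and not (low <= value <= high):    # range verification
        queries.append(f"{field_name}: value {value} outside expected range {low}-{high}")
    allowed = spec.get("allowed")
    if allowed is not None and value not in allowed:      # categorical check
        queries.append(f"{field_name}: value {value!r} not an allowed category")
    return queries

# An out-of-range entry would trigger an auto-query immediately on entry.
print(check_entry("systolic_bp", "400"))
```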

5
Q

EDC Data Management Methods

A
  • Paper or electronic source documents
  • CRFs can serve as source if appropriately documented in the protocol
  • Electronic Case Report Forms (eCRFs)
    • Completed by site personnel
    • Double data entry can be requested to be completed by site personnel
  • Oversight by the Monitor (either Source Data Verification (SDV) or Source Data Review (SDR))
  • Data reviewed by the data management team and monitors
    • One or more rounds of data entry are removed
  • Allows resources to be focused on activities that have a greater impact on ensuring data
    integrity and participant safety
  • Greater emphasis on the Investigator's team and Sponsor personnel working in partnership
    to ensure accurate data are obtained
  • Historically, some study teams devoted fewer resources to this task, expecting the monitor
    to find issues and correct them.
6
Q

Title 21 CFR Part 11 Compliance

A
  • A requirement in US jurisdictions and a best practice in others without explicit
    requirements of their own
  • Outlines the process of validating electronic records and signatures
  • Only required when the electronic system will be used exclusively
  • Certain controls are required to ensure a valid system is in place
  • Validation occurs to ensure accuracy, reliability, consistency of performance, and
    the ability to discern altered records
  • Ability to produce records in both electronic and human-readable formats
  • System must maintain records for the appropriate record retention period
  • Access to the system is limited to authorized individuals only
  • Appropriate audit trails of all activities on the system exist (a sketch of an
    audit-trail record follows below)
  • Written policies and procedures exist regarding use of the electronic system
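To illustrate the audit-trail requirement, the sketch below models an append-only log that records who changed which field, when, and why. The record structure is a simplified assumption for teaching purposes, not the Part 11 text itself.

```python
# Hypothetical append-only audit trail for eCRF changes (illustration, not Part 11 text).
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import List, Optional

@dataclass(frozen=True)
class AuditEntry:
    user: str                 # authorized individual who made the change
    timestamp: str            # when the change occurred (UTC, ISO 8601)
    field_name: str           # which eCRF field was changed
    old_value: Optional[str]  # value before the change (None for first entry)
    new_value: str            # value after the change
    reason: str               # reason for change, e.g. in response to a query

class AuditTrail:
    """Entries can only be appended; nothing is ever edited or deleted."""

    def __init__(self) -> None:
        self._entries: List[AuditEntry] = []

    def record(self, user: str, field_name: str,
               old_value: Optional[str], new_value: str, reason: str) -> None:
        self._entries.append(AuditEntry(
            user=user,
            timestamp=datetime.now(timezone.utc).isoformat(),
            field_name=field_name,
            old_value=old_value,
            new_value=new_value,
            reason=reason,
        ))

    def history(self, field_name: str) -> List[AuditEntry]:
        return [e for e in self._entries if e.field_name == field_name]

trail = AuditTrail()
trail.record("jdoe", "weight_kg", None, "72", "initial entry")
trail.record("jdoe", "weight_kg", "72", "74", "corrected per source document")
print(trail.history("weight_kg"))
```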
7
Q

Annex 11: Computerised Systems

A

-Health Canada
-Applies to all forms of computerised systems used as part of GMP-regulated activities
-The application should be validated; IT infrastructure should be qualified
-Where a computerised system replaces a manual operation, there should be no decrease in
product quality, process control, or quality assurance, and no increase in the overall
risk of the process.

8
Q

EDC vs. CTMS

A

* The EDC system is exclusively concerned with data generated by the clinical trial
* A Clinical Trial Management System (CTMS) is like a project management tool that tracks
the progress of a clinical trial at each site:
  * Participant scheduling
  * Storing study essential documents (e.g., approvals)
  * Managing study workflow between all study team members (e.g., Investigator,
    Coordinators, Pharmacists)
  * Financial management of sites (a major role)
* A CTMS may interface with an EDC system but performs a different task

9
Q

Risk-Based Approaches

A

-Risk-based monitoring uses technology, data integration, and analytics to assess and
detect trends in large amounts of aggregated data

10
Q

Ideal Characteristics of RBM System
Planning

A

* Ability to apply a pre-determined risk algorithm based on the risk level identified
(low/medium/high); a hypothetical sketch follows below
* Ability to scan forms and extract data (e.g., protocol, CRFs, etc.)
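A pre-determined risk algorithm might, for example, score a handful of identified risk factors and map the total to a low/medium/high level. The factors, weights, and cut-offs below are placeholder assumptions, not values from any actual risk management plan.

```python
# Placeholder risk-scoring sketch; factors, weights, and cut-offs are hypothetical.
RISK_FACTOR_WEIGHTS = {
    "first_in_human": 3,
    "complex_protocol": 2,
    "inexperienced_site": 2,
    "vulnerable_population": 3,
}

def risk_level(identified_factors):
    """Map a set of identified risk factors to a low/medium/high risk level."""
    score = sum(RISK_FACTOR_WEIGHTS.get(f, 0) for f in identified_factors)
    if score >= 5:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

# The RBM plan would then tie each level to a monitoring approach.
print(risk_level({"complex_protocol", "inexperienced_site"}))  # -> medium
```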

11
Q

Ideal Characteristics of RBM System
Data Capture

A

Accommodate direct digital data capture

12
Q

Ideal Characteristics of RBM System
Data Aggregation

A
  • Ability to access and aggregate data from different
    sites/systems
  • Able to standardize data to enable analytics
  • Flexible platform that allows for data from multiple
    sources and formats to be aggregated
  • Near real-time data access
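Standardizing data from multiple sources usually means mapping each source's field names and formats onto one common schema before aggregation. The two source formats and field names below are assumed for illustration.

```python
# Sketch of standardizing records from two hypothetical source formats into one schema.
def from_edc_export(row):
    """Map one row of a (hypothetical) EDC export onto the common schema."""
    return {
        "site_id": row["SITE"],
        "subject_id": row["SUBJ"],
        "visit": row["VISIT"],
        "systolic_bp": float(row["SBP"]),
    }

def from_device_feed(row):
    """Map one row of a (hypothetical) digital device feed onto the common schema."""
    return {
        "site_id": row["center"],
        "subject_id": row["patient"],
        "visit": row["timepoint"],
        "systolic_bp": float(row["sbp_mmhg"]),
    }

def aggregate(*sources):
    """Combine already-standardized records from multiple sources into one dataset."""
    return [record for source in sources for record in source]

edc_rows = [from_edc_export({"SITE": "001", "SUBJ": "001-004", "VISIT": "W4", "SBP": "128"})]
device_rows = [from_device_feed({"center": "002", "patient": "002-011",
                                 "timepoint": "W4", "sbp_mmhg": 141.0})]
print(aggregate(edc_rows, device_rows))   # one standardized, analytics-ready list
```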
13
Q

Ideal Characteristics of RBM System
Analytics

A
  • Analytics can be applied to aggregated data
  • Can assess data for outliers and trends
  • Ability to set thresholds with respect to identified outliers
  • Ability to indicate potential quality risk
  • Automated reporting
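One simple way to apply analytics to aggregated data is to compare each site's value for a metric against the distribution across all sites and flag those beyond a configurable threshold. The metric (queries per subject) and the z-score threshold below are illustrative assumptions.

```python
# Illustrative cross-site outlier check using a z-score against the all-site distribution.
from statistics import mean, pstdev

def flag_outlier_sites(metric_by_site, z_threshold=2.0):
    """Return {site: z-score} for sites whose metric deviates beyond the threshold."""
    values = list(metric_by_site.values())
    mu, sigma = mean(values), pstdev(values)
    if sigma == 0:
        return {}   # no variability across sites, nothing to flag
    return {
        site: round((value - mu) / sigma, 2)
        for site, value in metric_by_site.items()
        if abs(value - mu) / sigma > z_threshold
    }

# Hypothetical queries-per-subject metric for eight sites; site "004" stands out.
queries_per_subject = {"001": 1.2, "002": 1.4, "003": 1.1, "004": 4.8,
                       "005": 1.3, "006": 1.0, "007": 1.5, "008": 1.2}
print(flag_outlier_sites(queries_per_subject))
```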
14
Q

Ideal Characteristics of RBM System
Reporting and Visualization

A
  • Can report on outliers and trends observed
  • Reports can be generated based on the issues identified
  • Ability to visually observe outliers and trends
  • Can have role specific views (study manager, monitor, etc)
  • Customized dashboards and viewing parameters
15
Q

Variable Assessment

A
  • How variables are to be assessed should be determined during setup
  • Determine whether comparisons occur across:
    • Program (i.e., all studies investigating the IP)
    • Protocol (i.e., all study sites implementing the protocol)
    • Country
  • This plan should be determined and documented within the Integrated Quality and
    Risk Management Plan
16
Q

Risk Indicator Assessment
Safety

A
  • Reporting times of SUSARs and SAEs to REBs
  • Timeliness of reporting to Sponsor
  • Incidence of potentially unreported SAEs based on review of
    submitted data
  • Assessment of trends or outliers at this site in comparison to other sites
17
Q

Risk Indicator Assessment
Investigational Product

A
  • Timeliness of acknowledgement of receipt of IP
  • Errors in dispensing
    -Assessed by comparing CRF entries to those provided by IVRS (Interactive
    Voice Response System)/IWRS, IRT (Interactive Response Technology)
  • Compliance
    -Amount assigned vs. amount administered
  • Incidence of temperature excursions
  • Frequency of IP administration interruptions compared to other sites
18
Q

Risk Indicator Assessment
Subject Recruitment and Discontinuation

A
  • Outlier in screen failure rate/enrollment rate
  • Number of screen failures compared to average across sites
  • Planned vs. actual recruitment
  • Inconsistent recruitment
  • Outlier or a trend observed in ratio of subjects
    discontinued to those randomized
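These recruitment and discontinuation indicators reduce to simple ratios that can then be compared against averages across sites. A minimal sketch, with hypothetical numbers, is below.

```python
# Recruitment/discontinuation indicators for one site (all numbers hypothetical).
def recruitment_indicators(planned, screened, enrolled, randomized, discontinued):
    """Return a few metrics a monitor might compare against the cross-site averages."""
    return {
        "screen_failure_rate": (screened - enrolled) / screened if screened else None,
        "actual_vs_planned_enrollment": enrolled / planned if planned else None,
        "discontinued_to_randomized": discontinued / randomized if randomized else None,
    }

# Example: 11 of 30 screened subjects failed screening, enrollment sits at about
# 63% of plan, and 3 of 19 randomized subjects discontinued.
print(recruitment_indicators(planned=30, screened=30, enrolled=19,
                             randomized=19, discontinued=3))
```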
19
Q

Risk Indicator Assessment
General Compliance Issues

A
  • Outliers/trends in number or type of protocol deviations
  • Number of deviations (per subject or site compared across sites)
  • Types of deviations (significant vs. non-significant)
  • Number and/or severity of issues observed
  • Number of issues (overall, in a particular category, by severity)
  • Number of unresolved issues
20
Q

Risk Indicator Assessment
Data Quality

A
  • Abnormal trends in data
  • Lack of variability in data submitted by site compared to other sites
  • CRF completion issues
  • Overdue data entry, substantial incomplete pages or fields, extended time
    between visit data and CRF completion date, timeliness of PI CRF
    approval.
  • Query issues
  • Substantial number of queries, overdue queries, queries requiring re-
    addressing, query response time.
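Several of these data quality indicators can be computed directly from page and query metadata, for example overdue data entry and open-query aging. The field names and the 14-day overdue threshold below are hypothetical.

```python
# Sketch of CRF-completion and query-aging indicators; field names and the
# 14-day overdue threshold are assumptions, not a standard.
from datetime import date

def data_quality_indicators(pages, queries, today, overdue_after_days=14):
    """Summarize overdue data entry and open-query aging for one site."""
    overdue_pages = [
        p for p in pages
        if p["entered_on"] is None and (today - p["visit_date"]).days > overdue_after_days
    ]
    open_queries = [q for q in queries if q["closed_on"] is None]
    return {
        "overdue_pages": len(overdue_pages),
        "open_queries": len(open_queries),
        "oldest_open_query_days": max(
            ((today - q["opened_on"]).days for q in open_queries), default=0
        ),
    }

pages = [
    {"visit_date": date(2024, 3, 1), "entered_on": None},                 # overdue
    {"visit_date": date(2024, 3, 20), "entered_on": date(2024, 3, 22)},   # entered on time
]
queries = [
    {"opened_on": date(2024, 3, 5), "closed_on": None},                   # still open
    {"opened_on": date(2024, 3, 18), "closed_on": date(2024, 3, 21)},     # resolved
]
print(data_quality_indicators(pages, queries, today=date(2024, 4, 1)))
```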
21
Q

Risk Indicator Assessment
Essential Documents

A
  • Processing or storage of essential documents
  • Large number of overdue or missing documents
  • REB annual renewal letters, protocol amendment approvals
22
Q

Risk Indicator Assessment
Staffing, Facilities and Supplies

A
  • Issues with site staffing, supplies or study equipment
  • Amount of staff turnover
  • Knowledge gap in study staff
  • Staff training requirements
  • Inappropriate delegation of responsibilities
  • Adequacy, maintenance, calibration, storage of
    supplies/equipment