There are multiple definitions of data integrity, yet the essence is the same: data integrity is the overall accuracy, completeness, and consistency of data.
Data integrity is the maintenance of, and assurance of, the accuracy and consistency of data over its entire life-cycle, and it is a critical aspect of the design, implementation, and use of any system that stores, processes, or retrieves data.
When we secure data integrity, we ensure the completeness, accuracy, and reliability of data regardless of how long it is stored or how many times it is accessed. Data integrity also ensures that data is protected from external threats.
Data integrity applies to every industry in which data are generated; here, however, we focus on the clinical research industry and, more specifically, on its bioanalytical phase.
Data integrity carries substantial importance in the healthcare industry as it is directly related to the safety and well-being of people.
Assurance of data integrity, security, and privacy is essential for the pharmaceutical industry and is becoming more difficult for today's bioanalytical laboratory owing to the complexity of typical datasets and the multiplicity of analytical techniques.
It is more important than ever that laboratory personnel be proactive in organizing, securing, and protecting their data.
This white paper focuses on data integrity across the stages of the bioanalytical phase, from data creation to final archival and destruction.
To ensure data integrity, it is essential to follow correct processes at each phase of the data lifecycle.
Issues to consider at each phase include:
A. Prerequisites: selection of appropriate instruments and operating software, with proper installation, qualification, and maintenance of the instruments and proper validation of the software.
B. Appropriate analytical method development and validation.
C. Correct use of the analytical instrumentation, execution of the analytical method, and contemporaneous (online) documentation.
Once these prerequisites are met, it is essential to develop a scientific approach with an adequate number of trials so that the method is properly optimized. Although bioanalytical laboratories generally follow an established protocol for method development and validation, it is essential to foresee, from the start of the clinical study, all conditions the study samples may present, and every effort must be made to ensure that those conditions do not introduce variation into the results.
During method development, validation, and study sample analysis, it is essential to have complete control over all sources that may lead to data integrity failures.
For example:
a. Receipt of samples: it is essential to cross-verify all samples against the study protocol and clinical updates for correct identity, to check their condition on receipt, and to confirm the temperature at the time of receipt and throughout the journey from collection to receipt.
b. Storage until completion of the study: samples must be kept within the temperature range specified in the protocol and must never exceed the temperature at which stability has been demonstrated. Freeze-thaw (FT) cycles must remain within all proven stability parameters.
c. Retrieval of samples for processing: accurate sample identity is essential. The analyst must be fully sure of the identity of the samples being processed; there must be no sample mix-up or interchange, and any sample of doubtful identity must not be processed.
d. Actual processing of samples: the analyst must be experienced and trained on the method SOP. Randomization of the processing sequence can be considered (with calibration curve (CC) and quality control (QC) samples interspersed) to avoid bias. Every step of the method should be followed as specified in the method SOPs, without compromising on time, temperature, speed, etc., and the analyst must document every step in the processing document as it is performed.
e. In-process supervision: every step must be cross-verified by QC from the beginning.
f. Sample loading in LC-MS/MS and analysis: in most cases, manual operations on LC modules are performed in isolation and leave no audit trail. It is high time that all manual operations on LC modules were mediated by software and left an audit trail behind (a minimal sketch of this idea appears after this list).
For example, opening the sample loader should be possible only through the software, with the justification for opening it captured in the audit trail; the same applies to opening the column oven, and mobile phase bottles should be held in a transparent enclosed chamber that can be opened only through the software and is backed by proper audit trails.
g. Integration of chromatograms: once the analysis of a batch is complete, it should be quantified with the pre-set integration parameters as soon as possible, after verification of proper chromatography. Identification of repeats must be done immediately and documented. Access to reintegration and manual integration must not be given to analysts.
h. Audit trail verification: audit trails must be enabled from the beginning, and QA should verify them.
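To illustrate items f and h, the sketch below shows one way a manual LC operation could be forced through software so that the action, the user, and the justification are all captured in an audit trail. The module, function, and field names are hypothetical assumptions, not any vendor's actual API; in practice this control would be provided by the instrument vendor's validated software.

```python
import json
from datetime import datetime, timezone

AUDIT_LOG = "lc_audit_trail.jsonl"  # hypothetical append-only audit file


def log_event(user: str, action: str, justification: str) -> None:
    """Append a timestamped, attributable record of a manual LC operation."""
    if not justification.strip():
        raise ValueError("A justification is required before the operation is permitted.")
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "action": action,
        "justification": justification,
    }
    with open(AUDIT_LOG, "a", encoding="utf-8") as fh:
        fh.write(json.dumps(record) + "\n")


def open_sample_loader(user: str, justification: str) -> None:
    """Unlock the autosampler door only after the reason is captured in the audit trail."""
    log_event(user, "OPEN_SAMPLE_LOADER", justification)
    # ... the vendor's validated API call to release the door lock would go here ...


# Example: the door cannot be opened without a documented reason.
open_sample_loader("analyst01", "Replacing vial 42 after a missed injection")
```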
Issues to consider during data processing and handling include:
a. Selection of appropriate processing methods that are robust enough to need little or no human-assisted processing.
b. Identification of iterative processing techniques (reprocessing) and retention of intermediate processing states.
c. Quantitation parameters (also called methods) should be robust enough that only minimal adjustments are required to produce peak areas, calibration curves, etc.
d. If a run of samples is acquired and processed and the quality control samples in the run fail, the user might be tempted to adjust the quantitation parameters repeatedly until the QCs pass and then report the results of the run using those "successful" parameters.
e. Systems should be configured so that such re-processing of data is prevented and any attempts are captured in the audit trails.
f. Manual integration must be controlled so that its misuse is prevented.
g. Special criteria apply to data that are generated and stored as electronic records by computerized systems. In these situations, laboratories must have policies and procedures in place to ensure that they meet the prevailing criteria for the acceptance of electronic records/signatures as the equivalent of paper records/signatures by the regulatory authorities. FDA regulation 21 CFR Part 11 allows for the use of electronic records and electronic signatures when appropriate.
h. These data will often subsequently be transferred to a laboratory information management system (LIMS) for further processing and evaluation, storage, and report generation. Further transfer of the data may then take place to allow statistical or pharmacokinetic analysis. While a LIMS is not an absolute requirement, it is unlikely that a modern bioanalytical laboratory could function efficiently without one.
i. Computerized systems used to generate, manipulate, modify, or store electronic data should be validated, and key instrumentation should be appropriately qualified before use.
j. When data are transferred between electronic systems, the link or transfer process should, ideally, be validated. If not, procedural and quality control processes will be needed to demonstrate that data integrity has been maintained (one simple procedural control is sketched after this list).
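Where a validated transfer link is not available (item j), one simple procedural control is to compare file checksums before and after the transfer and to investigate any mismatch. The sketch below is a minimal illustration of that idea, not a prescribed method; the file paths are placeholders.

```python
import hashlib
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Compute the SHA-256 digest of a file, read in fixed-size chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def verify_transfer(source: Path, destination: Path) -> bool:
    """Confirm that the transferred copy is bit-for-bit identical to the source."""
    return sha256_of(source) == sha256_of(destination)


# Example with placeholder paths: any mismatch is investigated before the
# source data are released or removed.
if not verify_transfer(Path("instrument/batch_001.raw"), Path("lims/batch_001.raw")):
    raise RuntimeError("Checksum mismatch: integrity of the transfer is not demonstrated.")
```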
Issues to consider during data review, reporting, and retention include:
a. Written procedures for data review and approval, including consideration of:
– QA/QC should be able to detect errors and misuse of processing tools when reviewing data
– Written procedures should identify how to detect and investigate occurrences of re-processing and manual integration (see the sketch after this list).
– Verification of meta-data
– Use of human-verified processing at every step
– Reprocessing of sampled data for validation
– Review of audit trails
– Detection and avoidance of "testing into compliance"
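One way the review procedures above could detect re-processing and manual integration is to scan an exported audit trail for the relevant event types and summarize them per user. The sketch below assumes a CSV export with "user" and "action" columns; real chromatography data system exports vary by vendor, so the column and event names here are illustrative assumptions.

```python
import csv
from collections import Counter

# Hypothetical event names; actual audit-trail vocabularies are vendor-specific.
FLAGGED_ACTIONS = {"REPROCESS", "REINTEGRATION", "MANUAL_INTEGRATION"}


def flag_audit_events(csv_path: str):
    """Return flagged audit-trail rows and a per-user count of flagged actions."""
    flagged, per_user = [], Counter()
    with open(csv_path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            if row.get("action", "").upper() in FLAGGED_ACTIONS:
                flagged.append(row)
                per_user[row.get("user", "unknown")] += 1
    return flagged, per_user


# Example: repeated re-processing of one batch by a single user stands out in
# the per-user counts and prompts a documented investigation.
events, counts = flag_audit_events("audit_trail_export.csv")
for user, n in counts.most_common():
    print(f"{user}: {n} flagged event(s)")
```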
b. Written procedures for reporting, including:
– Accuracy and security of reports
– Ability of users to determine which results are reported
– Ability of users to alter the presentation, including scaling, labelling, excluding observations, etc.
– Appropriate review and approval of reports
– Reports distributed to appropriate individuals
– Appropriate level of review and approval for data used for decision-making
c. Written procedures for records retention and archival (see the sketch after this list), including:
– Records retention period mandated by regulatory bodies
– Records retention period based on user and business needs
– Ability to view and re-process data at a future date
– Archive growth over time
– Redundancy of archived records
– Protection of the archive, e.g. usable life of storage media
– External and environmental threats to the archive
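Archive protection and redundancy can be supported by recording a checksum for every archived file at the time of archival and periodically re-verifying those checksums as storage media age. The following is a minimal sketch of such a fixity check; the manifest format and paths are assumptions for illustration, not a mandated approach.

```python
import hashlib
import json
from pathlib import Path


def file_digest(path: Path) -> str:
    """SHA-256 digest of an archived file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def build_manifest(archive_dir: Path, manifest: Path) -> None:
    """Record a checksum for every archived file at the time of archival."""
    entries = {
        str(p.relative_to(archive_dir)): file_digest(p)
        for p in sorted(archive_dir.rglob("*"))
        if p.is_file() and p != manifest
    }
    manifest.write_text(json.dumps(entries, indent=2), encoding="utf-8")


def verify_manifest(archive_dir: Path, manifest: Path) -> list:
    """Return the files whose current checksum no longer matches the manifest."""
    recorded = json.loads(manifest.read_text(encoding="utf-8"))
    return [name for name, digest in recorded.items()
            if file_digest(archive_dir / name) != digest]


# Example with placeholder paths: run periodically; any mismatch indicates media
# degradation or tampering and triggers restoration from a redundant copy.
failures = verify_manifest(Path("/archive/study_001"), Path("/archive/study_001/manifest.json"))
print("Fixity failures:", failures or "none")
```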
As a business grows, it becomes inevitable that the organization invests heavily in personnel, training, equipment, software, and entire systems to ensure the accuracy, correctness, reliability, and integrity of the data it generates. Achieving this level of data integrity takes a combination of quality orientation and well-directed effort. Deficiencies in data integrity can result not only in the loss of this investment but also, in the case of severe regulatory gaps, in the loss of the business's viability. By implementing policies and procedures to ensure data integrity and regulatory compliance, organizations can minimize these risks while still operating efficiently.