Pharmabiz

Data integrity - no testing into compliance!

Sivakumar V, Tuesday, June 5, 2018, 08:00 Hrs [IST]

The objective of this article is to provide the regulatory perspective and identify current best-practice thinking on what one can do, from a compliance and quality standpoint, to avoid and detect data integrity issues and broader data quality pitfalls. Data integrity is a current hot topic, but not a new one, within the pharmaceutical industry.

Data integrity is, at its core, assurance that information is true, accurate, complete, retrievable, honest, easily located and verifiable. Failures in data integrity break the essential trust that regulators place in manufacturers of medicinal products. Regulatory authorities cannot review all the data that firms generate during the development and commercial life cycle of every drug. Even during inspections, they review only a small fraction of the data generated. When regulators find that companies have falsified or manipulated data to achieve passing results, or have failed to document and investigate failing results, they lose confidence in all the data presented by the firm.

Data management that ensures the integrity of the associated data requires more than risk-based computer system validation. It requires understanding the events that precipitated the current focus, the intent of the governing regulations and guidance, and the related enforcement actions. If you have read the regulatory guides and rules, you have seen the generic and recurring phrase ‘quality and integrity of the data’: pharmaceutical products must meet certain quality attributes that bear on patients, such as strength, identity, safety, purity and quality, and the associated data must meet certain quality and integrity attributes, i.e., ALCOA+.

Data management that also ensures the security and reliability of the data must be effectively incorporated into the pharmaceutical quality system. Data governance should be established to ensure that procedures and processes are implemented and that staff are trained appropriately. The most senior management in the firm needs to support the effort and its potential cost, and lead the way in ensuring the data from the firm is always correct, valid, complete and secure.

Culture of ALCOA
Attributable
Who acquired the data or performed an action (or modification), and when. Separate user roles (based on record involvement) should be defined for all systems (e.g., creator/user with the ability to write records, reviewer with the ability to append/modify a record, and administrator with the ability to delete). Each user, regardless of role, must have a unique user ID to access the system. An assessment should be performed to verify all expected regulatory requirements, including audit trail and electronic record/signature attributes where applicable (e.g., secure). Security settings outside of the application must be designed and configured to allow only the minimum user permissions needed for the application to function. If the computer system is standalone, security controls need to be in place to limit users' ability to modify or delete raw data, metadata and audit trail information. Users of the system should not have more than one role. If possible, utilize a system administrator who is independent of the department responsible for the electronic records (e.g., IT) or who does not have a vested interest in the data results from the given system.
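To make these role and audit trail controls concrete, here is a minimal Python sketch, not taken from any regulation or specific vendor system, of an append-only audit trail entry that captures who performed an action and when, and permits deletion only for an independent administrator role. All class, field and function names are illustrative assumptions.

from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum

class Role(Enum):
    CREATOR = "creator"      # may write new records
    REVIEWER = "reviewer"    # may append/modify a record, with a reason
    ADMIN = "admin"          # may delete; should be independent of the data owners

@dataclass(frozen=True)      # frozen: an entry cannot be altered once written
class AuditEntry:
    user_id: str             # unique per individual, never shared
    role: Role
    action: str              # e.g., "create", "modify", "delete"
    record_id: str
    reason: str
    timestamp: str           # server UTC time, not supplied by the client

def log_action(trail: list, user_id: str, role: Role, action: str,
               record_id: str, reason: str) -> None:
    # Only the independent administrator role may delete; everyone else is refused.
    if action == "delete" and role is not Role.ADMIN:
        raise PermissionError("only the independent administrator may delete")
    trail.append(AuditEntry(user_id, role, action, record_id, reason,
                            datetime.now(timezone.utc).isoformat()))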

Legible
Data is permanent and easily read (by a human). Control the configuration and use of any record annotation tools in a manner that prevents data in displays and printouts from being obscured (where possible). Verify record/report output against the on-screen or originally entered (meta) data.

Implement SOP direction (and associated training) that identifies the importance of data integrity and defines procedural controls as necessary to ensure that no data is obscured during use and that output (e.g., hardcopy) is reviewed, where applicable, for legibility and consistency with good documentation practices.

Contemporaneous
Documented at the time of the activity (promptly). An assessment should be performed to verify all expected regulatory requirements, including audit trail and electronic record/signature attributes where applicable (e.g., secure). Verification needs to be made that users cannot change the system date, time and time zone on the computer that the application uses to stamp that information. If the system is an enterprise-level system whose use may span multiple time zones, verification needs to be made that the system applies a consistent, centralized time regardless of access point and that this time is synchronized to a traceable source. Users who are not administrators should not be in a local administrators group or a power/super users group.
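As a simple illustration of contemporaneous recording, the Python sketch below stamps each record with centralized UTC server time at the moment it is saved and discards any client-supplied timestamp, so a user's local clock or time zone cannot influence the stored value. The function name and record shape are assumptions made for illustration only.

from datetime import datetime, timezone

def stamp_record(payload: dict) -> dict:
    # Discard any timestamp supplied by the client rather than trusting it.
    payload.pop("timestamp", None)
    # Stamp with the server's UTC time at the moment the activity is recorded.
    payload["timestamp_utc"] = datetime.now(timezone.utc).isoformat()
    return payload

record = stamp_record({"sample_id": "S-001", "result": 99.2})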

Original
The first recording of data, raw or source data, or a certified true copy. If a standalone system, consideration needs to be given to the process/procedural data flow and whether automatic or manual functions will be needed. Folders may need to be set either for write permissions with deny append, or for read and execute with allow to write. Permission inheritance also needs to be accounted for. An Annex 11/Part 11 assessment should be performed to verify all expected regulatory requirements, including audit trail and electronic record/signature attributes where applicable (e.g., secure).
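One common way to demonstrate that a copy of raw data is a true copy of the original is to compare cryptographic checksums of the two files. The Python sketch below shows the idea; the function names and the use of SHA-256 are illustrative choices, not a requirement drawn from the regulations.

import hashlib

def sha256_of(path: str) -> str:
    # Read the file in chunks so large raw data files do not need to fit in memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_true_copy(original_path: str, copy_path: str) -> bool:
    # The copy is considered a true copy only if its checksum matches the original's.
    return sha256_of(original_path) == sha256_of(copy_path)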

Accurate
Data is correct, including context/meaning (e.g., metadata) and edits. A procedure or procedures (and associated training) should dictate the acceptable and consistent data management practices for the system, including how an original data record is processed/saved, how it may be reviewed, how it can have metadata associated with it (e.g., signed), and how it can be historically retrieved, backed up and restored.

Consistent
Data is created in a repeatable and comparable manner (traceable). A procedure or procedures (and associated training) should clearly dictate the acceptable and consistent data management practices for the system, including how an original data record is processed/saved, how it may be reviewed, how it can have metadata associated with it (e.g., signed), and how it can be historically retrieved, backed up and restored.

Enduring
Stored on media proven for the record retention period. A procedure or procedures (and associated training) should clearly dictate the acceptable and consistent data backup processes, schedules, media types, on-site and off-site schedules, and archive and restoration activities. A formal policy, plan or procedure document should exist (along with associated training) dictating the minimum retention period for the record types affected by the system.

Available
Readily accessible in human-readable form for review throughout the record's retention period. A procedure or procedures (and associated training) should clearly dictate the acceptable and consistent data backup processes, schedules, media types, on-site and off-site schedules, and archive and restoration activities. Procedural and periodic tests should be performed to verify the ability to retrieve archived electronic data from storage locations.
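Such periodic retrieval tests can be partly automated. The Python sketch below assumes an archive manifest that records a SHA-256 checksum for every archived file; it re-reads each file from the archive location and reports any whose current checksum no longer matches the value recorded at the time of archiving. The manifest format and file layout are assumptions for illustration, not a prescribed standard.

import hashlib
import json

def verify_archive(manifest_path: str) -> list:
    # The manifest is assumed to map archived file paths to their recorded SHA-256
    # checksums, e.g. {"archive/2018/batch-042.csv": "ab12..."}.
    with open(manifest_path) as f:
        manifest = json.load(f)
    failures = []
    for path, expected in manifest.items():
        digest = hashlib.sha256()
        with open(path, "rb") as fh:
            for chunk in iter(lambda: fh.read(8192), b""):
                digest.update(chunk)
        if digest.hexdigest() != expected:
            failures.append(path)   # file could not be retrieved intact
    return failures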

Avoiding data integrity issues
Firms must recognize that Part 11 requirements apply whenever electronic records and/or electronic signatures are used in GxP processes and activities. Part 11 is a regulation, just as Parts 210 and 211 are regulations. Firms that maintain they operate primarily paper-based systems should consider that their laboratories depend largely on the computer systems associated with laboratory instruments.

Quality system processes may need to be revised to address the use of computer systems and electronic records. Computer systems should be appropriately developed, qualified, tested and periodically assessed to ensure they remain in a validated state. A risk-based lifecycle approach should be taken from initial system development through production, decommissioning and data archiving, where appropriate. Changes made to computer systems must be adequately assessed for their impact on the GMP operations they support. Changes made to GMP computer systems should be reviewed and approved by the quality unit, which should have appropriate training and expertise.

As part of system validation/revalidation, firms should perform gap assessments of each GxP computer system against the requirements of Part 11, using the MHRA and WHO guidelines for additional explanation and examples of expectations. Documented evidence supporting the conclusions should be provided or referenced within the gap assessment; the simple result of “complies” is not sufficient. Where necessary, remediation activities should be identified and their progress tracked through the CAPA quality process.
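How a firm records such findings is for its quality system to define; as one possible illustration (not a prescribed format), the Python sketch below captures a gap assessment finding together with the documented evidence and the CAPA reference used to track remediation. All field names and example values are assumptions.

from dataclasses import dataclass

@dataclass
class GapFinding:
    system: str        # the GxP computer system assessed
    requirement: str   # the Part 11 / Annex 11 requirement evaluated
    status: str        # "complies" or "gap"; a bare "complies" still needs evidence
    evidence: str      # reference to the documented evidence supporting the conclusion
    capa_id: str = ""  # CAPA reference when remediation is required

findings = [
    GapFinding(system="Chromatography data system",
               requirement="21 CFR 11.10(e) audit trail",
               status="gap",
               evidence="Validation report VR-012, section 4.3",
               capa_id="CAPA-2018-044"),
]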

Internal audit programs should always incorporate assessments of data integrity, and internal audit staff should have documented training in performing such assessments. When audit functions are outsourced to a third party, the firm should confirm that the auditors have appropriate training in data integrity evaluations.

Conclusion
Data integrity is a component of data quality that is directly related to product quality. Technology alone will not solve the problem; it takes a hybrid (human and computerized) approach to address and improve overall data and product quality. We need to ensure we educate people on the expectations for data integrity and ensure our processes have the appropriate controls in place to protect against data integrity issues throughout the company. We are testing for compliance, not into compliance.

(The author is a pharmaceutical quality professional)