Prospective Validation - Deployment Phase
On successful completion of testing, the system is ready for installation into the final operational environment. During this phase the hardware and software installation is qualified and operational checks are performed. It is also necessary to ensure that arrangements for the use, ongoing support and maintenance of the system are either established, or that documented plans have been prepared to ensure that these arrangements are in place by the time the system becomes operational.
Plans, procedures and protocols should define the data load process. All GxP data should be at least double-checked to verify that it has been correctly loaded. Other checks should be put in place as required to verify that the original GxP data is correct. Statistical sampling may be used to check other data, commensurate with the business need to verify data integrity. Rationales justifying sampling regimes should be defined. Automated data load tools should themselves be validated.
Where data representative of the live environment is required for testing or where data load/migration routines need to be tested prior to final load, the data load/migration activity may be phased.
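The double-check and sampling approach above can be sketched in code. This is a minimal illustration, not a prescribed method: `record_digest`, the record structure and the sampling seed handling are all assumptions, and the sampling rationale would still need to be defined and justified in the data load plan.

```python
import hashlib
import random

def record_digest(record: dict) -> str:
    # Deterministic digest of a record's field values (order-independent).
    canonical = "|".join(f"{k}={record[k]}" for k in sorted(record))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def verify_full(source: dict, loaded: dict) -> list:
    # 100% check for GxP-critical records: compare digests one-to-one.
    return [rid for rid in source
            if record_digest(source[rid]) != record_digest(loaded.get(rid, {}))]

def verify_sample(source: dict, loaded: dict, sample_size: int, seed: int = 0) -> list:
    # Statistical sampling check for lower-risk data; sample size and
    # seed policy are illustrative and would be set by the defined rationale.
    rng = random.Random(seed)
    ids = rng.sample(sorted(source), min(sample_size, len(source)))
    return [rid for rid in ids
            if record_digest(source[rid]) != record_digest(loaded.get(rid, {}))]
```

Any identifier returned by either check would be raised as a data load discrepancy and resolved before the data set is accepted.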
Operational and Support Plans
Operational and support plans should be established to ensure that SOPs, training, service contracts, business continuity plans, etc. are established, reviewed, approved and, where appropriate, tested before the computerised system becomes operational. Support organisations (both internal and external) should be periodically audited to verify continued service and compliance with associated GxP regulatory expectations.
• Security Management
SOPs for managing security access (including adding and removing authorised users, virus management, password management and physical security measures) should be specified, tested, and approved before the system is approved for use. Security management procedures should apply to all users, including administrators, super users, maintainers and normal users.
• User Procedures
SOPs should be established to define the use of the computerised system.
SOPs should be approved before the computerised system is approved for use, and should be available, even if only in approved draft form, for operational qualification (OQ).
The following areas should be addressed (where appropriate):
− Recommended spares holding.
− Frequency of routine testing/calibration.
− Backup and restoration of software and data files.
− Performance monitoring.
Procedures covering maintenance activities should be specified, where practical tested, and approved before the system is handed over for use.
• Backup and Restoration
Backup copies of all software and relevant data should be taken, maintained and retained in safe and secure areas, e.g. protected in fire safes. Backup and restoration procedures should be verified.
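One common way to verify a restoration procedure is to restore the backup to a scratch area and compare file checksums against the live data set. The sketch below assumes a simple archive-based backup; the archive format and directory layout are illustrative only, and a real verification exercise would follow the approved backup SOP.

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def tree_digests(root: Path) -> dict:
    # Map each file's path (relative to root) to its SHA-256 digest.
    return {str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
            for p in sorted(root.rglob("*")) if p.is_file()}

def verify_restore(live: Path, backup_archive: Path) -> bool:
    # Restore the backup into a temporary scratch area and compare
    # digests with the live data set; True means byte-identical content.
    with tempfile.TemporaryDirectory() as scratch:
        shutil.unpack_archive(str(backup_archive), scratch)
        return tree_digests(live) == tree_digests(Path(scratch))
```

A failed comparison would be logged as an incident and investigated before the backup regime is relied upon.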
• Data Archiving and Retrieval
Archiving and retrieval procedures for data should be specified, tested, and approved before the system is approved for use. Careful consideration should be given to special requirements affecting the retention, preservation, protection and confidentiality of electronic records, including their associated audit trail information.
• Availability of Software and Reference Documentation
All software (e.g. source code, compilers, operating system) and reference (supplier) documentation should be available for inspection, and copies should be available for business continuity. Copies of the software should be retained in safe and secure areas, protected in fire safes. Where access to software is restricted, formal agreements should be established to ensure software can be accessed when needed, e.g. ESCROW accounts.
• Training
Training plans should be established for use and support of the computerised system.
• Business Continuity Plans
Procedures and plans supporting business continuity (disaster recovery plans and contingency plans) should be specified, tested, and approved before the system is approved for use. Topics for consideration should include catastrophic hardware and software failures, fire/flood/lightning strikes, and security breaches. Alternative means of operation should be available in case
of failure where critical data may be required at short notice (e.g. in case of drug product recalls).
Installation Qualification
Checks should establish that the installation has been completed in accordance with the system specification. Checks should be based on:
• Inventory checks (hardware models, software name and version, data, user manuals and SOPs).
• Operating environment checks (e.g. power, RFI/EMI, RH, temp).
• Diagnostic checks (installation diagnostics and software launch).
The boundary of the system and hence the scope of the IQ should be defined in the validation plan. An IQ summary report should be prepared and approved prior to OQ commencing.
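The inventory element of IQ can be supported by a simple automated comparison of the installed configuration against the system specification. This is a sketch only: the item names, version strings and finding wording are hypothetical, and the output would feed the IQ summary report rather than replace it.

```python
def iq_inventory_check(specified: dict, installed: dict) -> list:
    # Compare the installed inventory (item -> version) against the
    # system specification and return discrepancy findings.
    findings = []
    for item, version in specified.items():
        if item not in installed:
            findings.append(f"MISSING: {item} {version}")
        elif installed[item] != version:
            findings.append(
                f"VERSION MISMATCH: {item} expected {version}, found {installed[item]}")
    for item in installed:
        if item not in specified:
            findings.append(f"UNSPECIFIED ITEM PRESENT: {item}")
    return findings
```

An empty findings list supports the IQ conclusion; any finding would be resolved or justified before the IQ summary report is approved.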
Operational Qualification
OQ should only commence on successful completion of IQ and involves user acceptance testing of the base functionality of the computerised system.
System tests from the developer may be used to reduce the amount of OQ testing conducted. The suitability of such testing and the available documentation must be reviewed and approved by QA for this purpose. Testing should be designed to demonstrate that operations will function as specified under normal operating conditions and, where appropriate, under realistic stress conditions. OQ should cover:
• Checking required functionality.
• Checking de-selected functionality cannot be accessed.
• Checking execution flows/sequences.
• Checking calculations and algorithms.
• Checking alarm and alert messages.
• Checking timer accuracy.
• Conducting system loading tests.
• Verifying SOPs established to control the use and maintenance of the system.
Tests should reference appropriate functional specifications. It may be possible to train operators to help conduct testing.
An OQ summary report should be issued on completion of OQ activities.
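Scripted checks of calculations and alarm limits, each traced to a functional specification clause, are one way of documenting the OQ items listed above. The sketch below assumes a hypothetical specification in which potency is `assay_result / label_claim * 100` with an alert outside 95.0-105.0%; the clause numbers, formula and limits are illustrative, not taken from any real specification.

```python
def potency_percent(assay_result: float, label_claim: float) -> float:
    # Hypothetical spec calculation: potency as a percentage of label claim.
    return assay_result / label_claim * 100.0

def potency_alert(potency: float, low: float = 95.0, high: float = 105.0):
    # Hypothetical alert limits; returns an alert string or None.
    if potency < low or potency > high:
        return f"ALERT: potency {potency:.1f}% outside {low}-{high}%"
    return None

def run_oq_checks() -> list:
    # Each check references an (illustrative) functional spec clause,
    # so the OQ record traces back to the specification.
    return [
        ("FS-4.2 calculation", abs(potency_percent(49.0, 50.0) - 98.0) < 1e-9),
        ("FS-5.1 alert on low potency", potency_alert(94.0) is not None),
        ("FS-5.1 no alert in range", potency_alert(100.0) is None),
    ]
```

The pass/fail pairs returned by `run_oq_checks` would be transcribed, with evidence, into the OQ summary report.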
OQ may be conducted in a controlled off-line test environment. Alternatively
OQ may be conducted with the final system installed in-situ prior to its release for use in the live environment. Test environments should be subjected to IQ demonstrating they are, for testing purposes, equivalent to the intended live environment.
For simple computerised systems IQ and OQ may be combined into a single activity and documented accordingly. More complex computerised systems may be divided into sub-systems and each of those systems subjected to separate OQ. This should be complemented by a collective OQ demonstrating that the fully integrated system functions as intended.
System Release
Computerised systems are often released into the live environment following completion of OQ, i.e. in advance of performance qualification (PQ). Final evidence needs to be collected from the live environment to conclude that the system is fit for routine use; to do this, however, the system must be brought into use in the live environment. An interim validation report, or an alternative such as a system release note, should be prepared, reviewed and approved in order to authorise system release. The interim report should cover all aspects of the validation plan up to and including OQ. Multiple reports may be required in order to phase the roll-out of discrete aspects of the system, or where there is a phased roll-out to multiple sites.
Performance Qualification
PQ should only commence on successful completion of OQ and comprises product PQ and/or process PQ.
Product PQ establishes the confidence that the data-dependent output from the system consistently meets specification across the defined range of output variants. Examples of product PQ outputs include:
• Batch reports.
• Label variants.
• Pharmaceutical product packaging variants.
Process PQ ensures that operational and support processes are effective,
reproducible and reliable. Process PQ typically includes:
• Monitoring of user enquiries.
• Progressing data changes.
• System availability.
• System stability.
• Incident logging and problem resolution.
• Progressing system changes.
Manual processes, such as additional data checks and report verification, should be operated in parallel with the computerised system until the completion of PQ.
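The parallel manual checks described above amount to a field-by-field comparison between the system's output and independently derived manual values. A minimal sketch, with hypothetical report fields, is:

```python
def parallel_check(system_report: dict, manual_report: dict) -> list:
    # Compare system-generated values with manually verified values
    # field by field; each discrepancy feeds the PQ incident log.
    return [
        f"{field}: system={system_report.get(field)!r} manual={manual_report.get(field)!r}"
        for field in sorted(set(system_report) | set(manual_report))
        if system_report.get(field) != manual_report.get(field)
    ]
```

A sustained run of empty discrepancy lists across the PQ period is part of the evidence that the manual parallel process can be retired.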
A validation report should be prepared to conclude on the completion of the activities prescribed in the validation plan. The validation report should address each of the activities defined within the validation plan and confirm that these have been successfully completed with a clear statement that the system is validated and fit for purpose.
The validation report for a system should not be approved until all the relevant documents defined within its validation plan have been approved. Where there are deviations from the validation plan or unresolved incidents then these should be documented and justified to allow the computerised system to be used on an ongoing basis. Where critical unresolved issues
exist, then the system cannot be considered validated and should not be put into use for GxP applications. The validation report and supporting documentation should be lodged with the
relevant site validation manager.