Introduction
This guidance provides recommendations for the content of validation documentation, covering the validation program, planning, testing and reporting.
This guidance does not address documentation of science- and risk-based assessments or final design review.
Recommendations & Rationale
Current elements of the validation program
This may also be considered as the validation strategy for a site or Center function and is typically documented in a Validation Master Plan or similar policy-level document.
The documentation should be based on the scope of the site or Center function’s operations and/or functional areas. The program / strategy documentation may reference other documents addressing various aspects of validation, for example, different facilities within a site or different validation types.
It is recommended that the validation program / strategy documentation be reviewed at an appropriate interval to ensure that it remains current.
Current Status of Validation
To effectively manage validation activities, and for the purposes of both internal and external audits, it is recommended that a site or Center function be able to easily report or summarize the current validation status of its systems and/or processes. For example, this could include:
- List (may be maintained as a separate document) of qualified or validated systems and processes, including document references;
- Summary of systems and/or processes to be validated (may be maintained as a separate document);
- Validation scheduling plan (may be a separate document);
- Review or requalification period for systems / processes, where applicable.
Validation Planning
A validation planning document should be considered for larger-scale projects that encompass multiple systems and processes. A planning document may be a separate document or combined with other documents such as testing or change control documents.
The planning document should contain, or at least reference, the following information, as applicable:
- A description of systems (e.g., system boundaries, system level impact assessments) and/or processes included in the project;
- The validation approach that will be followed;
- Key roles and responsibilities;
- Testing strategy;
- Project documentation requirements; and
- Sequence of activities and execution.
Testing Documentation
Documentation, such as protocols or test scripts, should be developed that specifies how the validation study will be conducted. Testing documentation should contain or reference the following information, as applicable:
- Title and unique identification number;
- References to related documents such as the validation planning document and SOPs;
- Objectives and scope of the study;
- Prerequisites (e.g. qualified equipment for process validation, or Installation Qualification with no major deviations prior to Operational Qualification);
- Clear, precise definition of, or reference to, the system or process to be validated, for example:
- Summary and/or process flow diagram of critical processing steps included in the study;
- The Master Manufacturing Instructions or Device Master Record to be validated (i.e., that to be used in preparation of validation lots or batches);
- The critical process parameters (CPPs) for the process steps being validated;
- User requirements, functional specifications and/or design specifications, as applicable, including document identification and version number;
- Validation approach (e.g. prospective, retrospective or concurrent; traditional or Continuous Quality Verification) and justification;
- Detailed statement of actions to be taken in performing the study (or studies), for example:
- Identification or description of tests to be performed, for example, for a process this may include:
- Equivalence/comparability of validation batches to previously produced lots or batches (commercial, development, or biobatches, as appropriate);
- Demonstration of in-process and/or finished product homogeneity/uniformity;
- Demonstration of consistent heterogeneity profiles for a drug substance prepared by a biopharmaceutical process;
- Requirements to conduct hold time studies;
- Identification of calibration of critical equipment used specifically for the validation studies (such as that used for one-time studies or the use of portable measuring equipment);
- Methods for recording and evaluating results (such as statistical analysis of results; an illustrative sketch follows this list).
- Assignment and description of responsibilities for performing the study;
- Description of or reference to all test methodologies to be employed;
- Acceptance criteria for the study;
- Acceptable operating ranges for critical process parameters;
- Established quality attributes and specifications;
- The number of consecutive successful validation lots/batches needed to show consistent control of the process;
- Procedure for handling test document modifications and deviations; and
- Sampling plans, relevant diagrams, or tests, including a plan for the number of validation lots/batches to be included in a stability study, if applicable.
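As an illustration of the statistical evaluation of results mentioned in the list above, the sketch below computes summary statistics and a capability index for a hypothetical set of validation batch results against pre-defined acceptance limits. The data, limits, capability metric and use of Python are illustrative assumptions only; the statistical methods actually applied should be those specified in the protocol.

```python
# Illustrative sketch only: evaluates a hypothetical set of validation batch
# results against pre-defined acceptance limits using basic statistics.
from statistics import mean, stdev

# Hypothetical assay results (%) from validation batches and the acceptance
# limits that would be pre-defined in the protocol.
results = [99.1, 100.4, 99.8, 100.1, 99.5, 100.2]
lower_limit, upper_limit = 95.0, 105.0

avg = mean(results)
sd = stdev(results)

# Process capability index (Cpk) relative to the acceptance limits.
cpk = min(upper_limit - avg, avg - lower_limit) / (3 * sd)

print(f"Mean: {avg:.2f}  SD: {sd:.2f}  Cpk: {cpk:.2f}")
print("All results within limits:",
      all(lower_limit <= x <= upper_limit for x in results))
```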
When designing testing documentation, consider whether the test requirements and the report can be combined into a single document. This can be particularly effective for system qualification, but it does require two approval stages: one prior to execution and one following the review and summary of the test results. Where this approach is taken, the recommendations below on reporting should still be followed.
Validation Execution
Test documentation should be reviewed and approved prior to execution. Completed testing documentation should contain or reference the following information, as applicable:
- Test instrument calibration verification;
- Actual test results;
- Verification that acceptance criteria have been met; and
- List of document modifications and test deviations.
All test results, along with any test deviations, should be recorded and documented in a manner that permits objective pass/fail decisions to be made regarding the success or failure of the validation.
Actual values obtained should be stated and reported in appropriate significant figures (where applicable). Calculations should be shown, including application of rounding rules.
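As a minimal, purely illustrative sketch of applying a rounding rule, the example below rounds a raw result to a stated number of significant figures before it is reported and compared with an acceptance criterion. The half-up convention, the values shown and the use of Python are assumptions for illustration; the site's own rounding procedure governs in practice.

```python
# Illustrative sketch only: rounds a measured value to a stated number of
# significant figures before comparison with an acceptance criterion.
# Assumes a "round half up" convention; apply the site's own rounding SOP.
from decimal import Decimal, ROUND_HALF_UP
import math

def round_sig(value: float, sig_figs: int) -> Decimal:
    """Round value to sig_figs significant figures using half-up rounding."""
    if value == 0:
        return Decimal(0)
    # Exponent of the least significant digit to keep.
    exponent = math.floor(math.log10(abs(value))) - (sig_figs - 1)
    quantum = Decimal(1).scaleb(exponent)   # e.g. Decimal('0.01')
    return Decimal(str(value)).quantize(quantum, rounding=ROUND_HALF_UP)

# Hypothetical raw result, reported to four significant figures.
raw_result = 99.4501
reported = round_sig(raw_result, 4)         # -> Decimal('99.45')
print(f"Raw: {raw_result}  Reported: {reported}")
```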
All fields of a test document should be completed. If a field is intentionally left blank, it should be lined out, marked as “Not Applicable” or “N/A”, and initialed and dated.
Where appropriate, primary data should be placed in tables and/or plotted. Primary data and other attachments, including videotape and any type of electronic data storage medium, should follow good documentation practices. Attachments should be referenced in the relevant section of the test documentation.
The attachment itself should also be identified appropriately, for example, by protocol or report section, test number or attachment number. Where an attachment contains multiple pages or items, it is recommended that the number submitted also be recorded in the test documentation and/or on the attachment.
When transforming or plotting data, an individual other than the executor should verify the accuracy of the data transcribed. For data that is plotted, a printout of the data tables from the software used to generate the plot should be attached to the report. The data table printout may be signed instead of the graph.
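A minimal sketch of producing both the plot and the underlying data table for attachment is shown below; the choice of pandas/matplotlib and the file names are assumptions for illustration, not requirements of this guidance.

```python
# Illustrative sketch only: plots transcribed validation data and exports the
# underlying data table so the printout can be verified, signed and attached.
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical transcribed results for one validation run.
data = pd.DataFrame({
    "sample": [1, 2, 3, 4, 5],
    "assay_percent": [99.1, 100.4, 99.8, 100.1, 99.5],
})

# Export the data table used to generate the plot (for second-person
# verification and signature in place of the graph, if preferred).
data.to_csv("validation_run_data_table.csv", index=False)

# Generate the plot for the report attachment.
ax = data.plot(x="sample", y="assay_percent", marker="o", legend=False)
ax.set_xlabel("Sample number")
ax.set_ylabel("Assay (%)")
plt.savefig("validation_run_plot.png", dpi=300)
```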
The results of the test execution should be reviewed by QA and the system owner, at minimum, typically as part of the validation report.
Validation Reporting
Reports should describe the results from the planning and testing and should be approved by QA and the process/system owner, at minimum. The content of the report should include, or reference, the following:
- The planning or test document, including scope;
- Summary of the results obtained;
- Analysis of the results, where appropriate. For a process, this may include:
- Review of critical process parameters from validation batch/lot production records;
- Comparison with previously produced batches (commercial, development, or biobatches), where applicable;
- Summary and resolution of any manufacturing, laboratory or testing deviations observed;
- Report conclusions, including a clear statement of the validation status of the system/process; for example:
- “System qualification has been successfully completed without deviation and the system is suitable for use.”
- “Process validation has been successfully completed for product Y, all deviations have been resolved and the process is suitable for routine manufacturing.”
- “Cleaning validation has been successfully completed for 2 runs of product X. One further run is required to complete the validation study.”
- Recommendations or corrective actions needed; and
- Attachments (including raw data or summary of raw data).
When writing a report, consider the need for a “stand-alone” document, for example, one that is intended for an external reviewer or inspector. In this case, it may be of benefit if the report does not require the reviewer to frequently refer to the protocol or other referenced documents. This may require repetition of some portions of the planning and/or testing documentation (e.g. objective, scope, process flowchart and description, methodology) to ensure the report is comprehensive. (See also the Testing Documentation section above for consideration of a combined test/report document.)

It is recommended that significant outstanding actions from the report be entered into and managed through an appropriate CAPA system. This could include actions to close test deviations that have no significant impact on the validated status of the system or process, or actions to improve the performance of the system or process.