Goals
When you have completed this unit, you should be able to:
a. Identify computer systems with GMP implications within the scope of the GMP facility audit
b. Include in the audit an assessment of the computerized systems used to support a GMP facility
c. Understand and apply applicable GMP requirements to the audit
d. Recognize compliance or non-compliance of GMP facilities to applicable regulations for computerized systems
This Training Module is written at an introductory level. It aims to equip the GMP auditor to assess the computerized systems used to support a GMP process while performing a general GMP audit.
Where there is likely to be significant focus on the computerized systems, a specialist IS Compliance auditor should accompany the audit team. This may be the case if:
a. The GMP facility relies heavily on novel computerized systems
b. There are specific IS concerns prior to the audit
c. A previous audit has raised concerns about the Facility’s computerized systems
Definitions
Supervisory control and data acquisition (SCADA): Application software used for process control, alarm management and the collection of process data.
Explanation of Topic
Introduction
Any GMP facility, whether involved in contract testing, manufacturing, processing, packaging or distribution, will rely to a greater or lesser extent on computerized systems.
They are fundamental in ensuring that processes and data are reliable and secure. To achieve this, they should be maintained in a validated state.
Validating a computerized system is establishing documented evidence that provides a high degree of assurance that it will consistently function in accordance with its predetermined specifications and quality attributes throughout its lifecycle.
What is an audit of computerized systems at a GMP facility?
An audit of the computerized systems is a review or inspection of the practices, procedures, methods and standards of the GMP facility that are applied during the lifecycle of the computerized system.
It is intended to determine if the validation of the computerized system is adequate to satisfy regulatory and company expectations, and that the process is executed consistently and in accordance with established guidelines and procedures.
The audit also determines how computer systems are developed, maintained and used at the facility.
Computerized Systems Management
It is important to establish that computerized systems are managed appropriately within the GMP Facility. They should be covered by a quality management system. This may be the same as the rest of the facility or may be specific (and tailored) for the computerized systems.
Procedures and standards should exist addressing the areas listed below:
a. System Development Lifecycle
b. Functional Requirements
c. Design Specifications and Programming
d. Testing
e. Installation/Implementation
f. Support/Maintenance
g. Change Control/Configuration Management
h. Backup and Restoration
i. Business Continuity & Disaster Recovery
j. Security
k. Decommissioning
l. Management of Deviations
It is important to determine the role of Quality Assurance in the management of computerized systems. Quality Assurance may be involved with every step of the validation process or limited to independent approval of the validation deliverables. It is possible that a separate IS Quality Management function (within the IS organization) carries out some or all of the QA activities.
Procedures should describe what level of approval is required for each validation deliverable. As a minimum, QA approval would be expected for Validation Plans, Requirements, Validation Reports and Change Control documentation. If deviations from procedures and standards occur, confirm that they are documented, evaluated and approved.
A systems inventory should exist which identifies all systems owned by the facility detailing the system owner and GMP impact. Any system with GMP impact that is currently operational should be validated. The IT infrastructure should also be identified.
GMP critical systems may include:
a. Process Control Systems
Ø SCADA
Ø Autoclaves
b. Environmental Monitoring Systems
c. Laboratory Information Management System
d. Manufacturing Execution Systems
e. Document Management Systems
f. Enterprise Resource Planning (ERP) Systems e.g. SAP
It should be clear how a facility determines, evaluates and documents the GMP impact or criticality of a system.
Validation
Validation Deliverables
In general, the standard GMP documentation practices should be applied to all Computer Validation documentation. At a minimum this includes pagination, dating, revision history, document versioning, approvals, and other good documentation practices.
Validation Plan
The Validation Plan will provide specific planning information in regard to the validation activities that will be conducted for the system. The Validation Plan will describe the validation approach to be taken, state the validation deliverables and provide clear system acceptance criteria. The scope and depth of computer validation vary, depending on the nature and complexity of the system. A categorization approach may be taken where systems are grouped according to their complexity and use.
See below an example (from GAMP4):
Software Category Definitions
A. Operating Systems: Established, commercially available operating systems. Applications are developed to run under the control of these operating systems.
B. Firmware: Instrumentation and controllers often incorporate firmware. Configuration may be required to setup runtime environment and process parameters.
C. Standard Software Packages: Commercially available, standard packages providing an ‘off-the-shelf’ solution. Limited configuration is required to establish the runtime environment.
D. Configurable Software Packages: Configurable software packages that provide standard interfaces and functions enabling the configuration of user or business specific processes.
E. Custom (Bespoke) Software: Software developed to meet specific business needs.
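The idea behind a categorization approach — that validation rigor increases with software category — can be pictured as a simple lookup. This is an illustrative sketch only; the activities listed are assumptions for the example, not a set mandated by GAMP or any regulation.

```python
# Illustrative mapping from software category to example validation
# activities. Both the category labels and the activities are assumptions
# made for this sketch, not a prescribed list.
validation_activities = {
    "A (Operating System)": ["record name and version"],
    "B (Firmware)": ["record version", "verify configured parameters"],
    "C (Standard Package)": ["record version", "verify configuration",
                             "test against user requirements"],
    "D (Configurable Package)": ["assess supplier", "verify configuration",
                                 "test configured business processes"],
    "E (Custom/Bespoke)": ["assess supplier", "full lifecycle documentation",
                           "code review", "full structural and functional testing"],
}

def rigor(category):
    """More activities implies more validation effort for higher categories."""
    return len(validation_activities[category])

assert rigor("E (Custom/Bespoke)") > rigor("A (Operating System)")
```

The point of the sketch is only that an auditor should expect the depth of the validation package to increase from category A through E.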
The plan should indicate approval from the business process owner or system owner, technical resources and Quality Assurance.
Requirements Specification
The Requirements Specification (may be referred to as User Requirement Specification or User Requirements) describes what the system is to achieve or what the user of the system needs it to do. Requirements should be approved prior to finalization of the Design Specification unless an iterative development process is used (and included in the validation plan).
Requirements should be unique and prioritized. They should be written at a detailed level, precisely identifying the criteria for success from a user’s perspective.
Functional Specification
The Functional Specification (may be referred to as Design Specification) describes how the system is designed to achieve the Functional Requirements. This deliverable is a technical document that identifies the technical solution for the application, underlying software and the hardware necessary to support the system. There should be clear traceability between the requirements and the functional design. This may be achieved using a matrix, cross-referencing, common numbering or any other approach which makes it clear how each requirement is satisfied by the design. It is not necessary to include code or pseudo-code. Some development methodologies may require a formal approach, e.g. UML Use Cases. Depending on the complexity of the system, it may be appropriate to have a hierarchy of Functional or Design Specifications, with a high-level design specification complemented by more detailed, module-specific design specifications.
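A traceability matrix of the kind described above can be represented very simply, which also makes gaps easy to find. The sketch below is hypothetical; the requirement, design and test IDs are invented for illustration.

```python
# Hypothetical traceability matrix: requirement ID -> design elements and
# tests. All IDs are illustrative; real projects follow their own scheme.
traceability = {
    "URS-001": {"design": ["FS-4.1"], "tests": ["TC-010", "TC-011"]},
    "URS-002": {"design": ["FS-4.2", "FS-4.3"], "tests": ["TC-020"]},
    "URS-003": {"design": ["FS-5.1"], "tests": []},  # gap: no test coverage
}

def find_gaps(matrix):
    """Return requirement IDs lacking either a design element or a test."""
    return sorted(
        req for req, links in matrix.items()
        if not links["design"] or not links["tests"]
    )

print(find_gaps(traceability))  # -> ['URS-003']
```

An auditor performing the sampling exercise described later in this module is, in effect, checking a few rows of such a matrix by hand.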
System Programming
Coding should comply with programming standards for screens, menus, code annotations, etc. Where possible industry standard coding standards should be used.
Compliance with coding standards should be assessed through formal code review. A risk based approach can be applied to code review with code selected based on complexity, criticality or experience of the developer.
Defects found during code review should be logged with corrective and preventive actions as appropriate.
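The risk-based selection of code for review mentioned above can be sketched as a filter over module attributes. The module names, the complexity scores and the threshold are all invented for this illustration; a real facility would define its own criteria.

```python
# Hypothetical risk-based selection of modules for code review.
# Module names, scores and the threshold are illustrative assumptions.
modules = [
    {"name": "dose_calc",   "complexity": 9, "gmp_critical": True},
    {"name": "report_ui",   "complexity": 3, "gmp_critical": False},
    {"name": "audit_trail", "complexity": 5, "gmp_critical": True},
    {"name": "color_theme", "complexity": 2, "gmp_critical": False},
]

def select_for_review(mods, complexity_threshold=7):
    """Select all GMP-critical modules plus any highly complex module."""
    return [m["name"] for m in mods
            if m["gmp_critical"] or m["complexity"] >= complexity_threshold]

print(select_for_review(modules))  # -> ['dose_calc', 'audit_trail']
```

Whatever criteria are used, the auditor should be able to see how the sample was selected and that the rationale was documented.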
Procedures for maintaining and controlling multiple source code versions should be in place. In practice this is often achieved using version control software, e.g. CVS, Microsoft Visual SourceSafe or IBM Rational ClearCase.
Where possible separate environments should be used for development and testing. These should be as similar as possible to the production environment for the system in order to assure the validity of system testing. The Development environment is used for developing and unit testing the software. The Test environment is used to test completed components and perform functional and integration testing. The final Production environment is where the application is placed for production roll out.
Test specification
The Test Specification describes the tests to be performed to ensure the system meets the user requirements. Test scripts should be written such that they can be re-executed and the same results can be obtained. Tests should test limit, failure and stress conditions as well as the successful execution of the required functionality. There should be traceability from the requirements, through the Functional Specification to the Test Specification. It should be possible to demonstrate that all required functionality has been adequately tested.
It is good practice for tests to be written by someone other than the developer and executed by another independent person.
Expected results should be clearly identified and the results obtained should be documented with sufficient evidence to demonstrate success. Deviations from test steps or expected results should be documented.
Test failures should be logged and investigated. It is not necessary for every test to pass.
Depending on the impact of the failure and the criticality of the function, options include correcting the code and retesting the function (and any other implicated functions) or providing a procedural workaround.
Testing may be performed using automated test software or tools. Procedures should exist covering maintenance of the test tool and test scripts. Automated test tools and software should themselves be validated, though this may well be ‘light touch’ unless they are untried or used in a novel way.
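The requirement that tests cover limit and failure conditions as well as the expected path can be made concrete with a small example. The function below and its acceptable pH range of 6.5–7.5 are assumed purely for illustration.

```python
# A hypothetical check function and the expected-path, limit and failure
# cases a test script might exercise. The pH range is an assumption.
def validate_ph(value):
    """Return True if the set-point lies in the acceptable range."""
    if not isinstance(value, (int, float)):
        raise TypeError("pH must be numeric")
    return 6.5 <= value <= 7.5

assert validate_ph(7.0) is True     # expected path: nominal value
assert validate_ph(6.5) is True     # lower limit
assert validate_ph(7.5) is True     # upper limit
assert validate_ph(6.49) is False   # just below range
assert validate_ph(7.51) is False   # just above range
try:
    validate_ph("seven")            # failure condition: bad input type
except TypeError:
    print("non-numeric input rejected")  # -> non-numeric input rejected
```

Note that the limit cases sit exactly on and just beyond the boundaries; a test specification containing only nominal values would not satisfy the expectation stated above.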
Testing may exist at different levels:
Ø Unit (or module) testing – technical testing, sometimes performed by the developer, to prove that individual code modules or functions operate correctly
Ø Integration testing – ensures that the modules operate correctly together as a complete system. This phase may include the testing of any interfaces to other systems; alternatively, interface testing may be a separate phase.
Ø User acceptance testing – confirms that the user requirements have been met
It is acceptable for the formality applied to testing to increase through these phases, in terms of the degree of test documentation and the evidence collected.
Validation Report
The Validation Report summarizes the validation activities associated with the system development. It contains a summary of testing conclusions and reports on other validation requirements, for example, user training and updated procedures. It provides recommendations for acceptance of the system. The Report should also present a strategy for maintaining the system in a validated state. Where there are any exceptions to the deliverables required by the validation plan, these are detailed in the report together with a justification. Any actions to be completed after system implementation should be tracked to completion.
The Validation Report must be approved prior to the system entering operational use.
Release/Installation
Release and implementation of the system is based on fulfillment of the validation deliverables and other prerequisites required for management of the system. These may include: a confirmed build from the configuration management system, installation procedures, technical reference manuals, system reference manuals, system operation manuals, process procedures and training.
Support, Change Management and Maintenance Strategy
In order to maintain the validated state, the system should be supported and maintained with any changes managed appropriately. A mechanism should be in place for logging system problems whether reported by users or technical support. Problems should be logged, investigated and tracked to completion with corrective and preventative actions as appropriate.
Changes to the system may result from problems or changes in business process and system use. The impact of any proposed change on existing functionality should be assessed prior to approval. Testing done in support of a change should demonstrate that (a) the changed functionality works as expected and (b) there is no unexpected impact in related system functions.
On a periodic basis there should be a management review of the system to determine what needs to be done to maintain the validated state. Inputs to the management review may include:
a. Amount of system change
b. Common user problems
c. Technical changes to the infrastructure supporting the system
d. Continuing support for any commercially sourced parts of the system (hardware or software)
e. Changes in system use
f. Changes in business process supported by the system
The output recommendations of the periodic management review should be formally logged and actions tracked.
Where possible, the system should be maintained with current security patches for the operating system, database and application software in order to protect the process and data from unauthorized access and change (hacking).
Where the system can be accessed directly by the software supplier (by dial-in or internet) such access should be controlled so that any changes made are managed in the normal way.
Backup and Restore
In order to secure the system and its data, backup and restoration requirements should be identified for all GMP-impacted systems. The frequency with which backups are taken should be determined according to the criticality of the system. Backed-up data should be secured, and restore processes tested to ensure successful execution when required.
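One common way to demonstrate that a restore was successful is to compare a checksum of the restored data with one recorded at backup time. The sketch below illustrates the idea; the function names are hypothetical and a real backup regime would use its tool's own verification features.

```python
# Illustrative restore verification via checksums. The helper names are
# hypothetical; real backup software provides equivalent verification.
import hashlib

def sha256_of(path):
    """Compute the SHA-256 checksum of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_restore(checksum_at_backup, restored_path):
    """True if the restored file matches the checksum taken at backup time."""
    return sha256_of(restored_path) == checksum_at_backup
```

A documented restore test using a check of this kind provides the evidence of "successful execution" that the paragraph above calls for.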
Business Continuity/Disaster Recovery
Loss of computerized systems can occur for a wide variety of reasons, varying from local equipment failure, network or software failure to site disasters. It is anticipated that most computerized system or infrastructure problems will be resolved quickly by existing support processes to minimize impact on system end users. A serious problem can, however, develop into a disaster if it is left unresolved.
The immediate period surrounding a disaster can result in confusion, chaos and significant interruption to routine business operation. The steps taken in the first hours of a crisis have a significant bearing on the ultimate severity and speed to resume normal business operations. The creation of structured plans will assist management by providing clearly defined responsibilities and actions. These procedures, together with some pre-disaster preparation, will ensure that the impact on the business of any disaster can be managed and minimized.
Business continuity (BC) and disaster recovery (DR) are two separate, parallel processes that are planned for prior to, and executed after, a disaster. BC planning describes how business processes and activities are to continue in the absence of the supporting computerized systems. BC planning depends entirely on the criticality of the system. For non-critical systems the BC plan may be to do nothing until the system is restored.
Disaster Recovery (DR) planning describes how systems are restored to an operational state (including data) and must be in place for all computerized systems, infrastructure and computer centers. Again a risk-based approach is often taken and the DR planning for non-critical systems may be on a ‘best efforts’ basis.
To ensure the plans are effective in the event of a disaster they must be accurate, timely and complete. They must be reviewed and updated periodically and also when prompted by events, e.g. a significant reorganization, a disaster or ‘near miss’ incident, or the emergence of new risks or threats. Where practical the review should include a test or exercise of the plans.
Security Measures
Security measures should be in place to prevent unauthorized access to the system.
Physical security may involve controlling access to the data center or restricting system access to specific areas within the facility, e.g. the relevant laboratories or plant areas. Logical security may include controls on accessing the system software, restricting critical system functions to specified users and changing default passwords on operating systems, databases and application software. An important security consideration is keeping the system components updated with security patches provided by the vendor. The aim of both physical and logical security is to protect the system’s functionality and data from unauthorized change.
Electronic Records and Electronic Signatures
With computerized systems supporting the critical GMP processes at a facility and much of the raw data held electronically, it is important that the records and associated signatures are secure, reliable and attributable. It is important that the Facility’s employees understand that their electronic signatures are the equivalent of the handwritten signature. Audit trails should record all relevant detail including who created or modified a record and when.
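The minimum content of an audit-trail entry described above (who, what, when, and often why) can be sketched as a simple record. The field names and helper function are hypothetical, not a prescribed schema; real systems define their own.

```python
# Illustrative audit-trail entry for a record change. Field names and the
# helper are assumptions for this sketch, not a regulatory schema.
from datetime import datetime, timezone

def audit_entry(user, action, record_id, old_value, new_value, reason):
    return {
        "user": user,                  # who created or modified the record
        "action": action,              # create / modify / delete
        "record_id": record_id,        # which record was affected
        "old_value": old_value,        # value before the change
        "new_value": new_value,        # value after the change
        "reason": reason,              # why, where a reason is required
        "timestamp": datetime.now(timezone.utc).isoformat(),  # when
    }

entry = audit_entry("jsmith", "modify", "BATCH-042",
                    old_value="In Progress", new_value="Complete",
                    reason="Batch review finished")
print(entry["action"], entry["record_id"])  # -> modify BATCH-042
```

When reviewing a system's audit trail, the auditor is checking that every one of these elements is captured, time-stamped reliably, and protected from alteration.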
Advanced Topics
Advanced topics are presented for information only. It is unlikely that either of these topics will need to be covered during a routine GMP audit of a supplier. If a ‘for cause’ audit is required and needs to cover these areas in depth it is recommended that an IS Compliance specialist is included in the audit team.
System Retirement and Decommissioning
System retirement and decommissioning is the method by which a system is phased out or retired from production service. During this effort the code is archived, data is converted or migrated (if applicable) and the system is disassembled and placed out of service. The approach used for system retirement should maintain the integrity of the data and take account of any need to use the data in the future.
Data Migration
Data Migration is a one-time process of “moving” data from one system to another system. This can be due to a system upgrade or due to the replacement of an existing system with a new system. This is not an ongoing process between two systems.
The approach to data migration should consist of a migration plan, identification of the data being migrated, the actual process of migrating the data, and a migration verification process for determining that the migration process was accurate and successful.
Deviations from the Data Migration process should be documented and resolved according to their criticality.
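The migration verification step described above often amounts to comparing record counts and record content between source and target. The sketch below is illustrative; the record structure and key field are invented for the example.

```python
# Illustrative migration verification: flag records missing from the target
# or whose content differs from the source. Data is hypothetical.
import hashlib, json

def record_checksum(record):
    """Stable checksum of a record's content (key order normalized)."""
    payload = json.dumps(record, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def verify_migration(source, target, key="id"):
    """Return (id, problem) pairs for missing or altered records."""
    target_by_id = {r[key]: record_checksum(r) for r in target}
    problems = []
    for rec in source:
        rid = rec[key]
        if rid not in target_by_id:
            problems.append((rid, "missing"))
        elif record_checksum(rec) != target_by_id[rid]:
            problems.append((rid, "changed"))
    return problems

source = [{"id": 1, "lot": "A1"}, {"id": 2, "lot": "B2"}]
target = [{"id": 1, "lot": "A1"}]
print(verify_migration(source, target))  # -> [(2, 'missing')]
```

Any discrepancies found by such a check would be the documented deviations referred to above, to be resolved according to their criticality.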
Key Parameters in Auditing Computerized Systems at a GMP Facility
Prior to the audit
a. Determine which GMP related products or services the facility provides.
b. Determine which regulatory requirements apply.
c. Request and review the systems inventory and descriptions to identify which computerized systems to focus on during the audit.
d. Contact the Facility and notify them if computerized systems are likely to be included in the scope of the audit. This will enable them to make the relevant staff available.
e. Request and review the Facility’s policy or procedure on Computerized System Validation (or equivalent).
f. Consider whether an IS Compliance specialist should be included as part of the audit team.
During the Audit
The aim of the audit should be to determine that the computerized systems are developed/procured, implemented, operated and maintained in accordance with GMP principles. It is more important to confirm that the concepts are addressed than to insist on particular terminology.
Confirm that a systems inventory identifies all systems owned by the facility and details the system owner and which have GMP impact.
Ø Verify that any system with GMP impact, which is currently operational, is validated.
Ø Check that there is a clear rationale for which systems have been validated and which haven’t.
Ensure the review is focused on the system(s) most critical to the component, process or task(s) that the facility performs.
Determine whether the validation approach chosen is based on categorization of the system. GAMP categorization may be used.
Ø Verify that the choice of category is justified
Confirm that the Facility’s employees understand that their electronic signatures are the equivalent of the handwritten signature.
System Development
Ensure Suppliers have been assessed
Ø Confirm whether the system is developed in-house or provided by a commercial vendor.
Ø If the system is vendor supplied, confirm that the vendor has been evaluated.
Ø Verify that the facility has standards for assessing vendors and that they were followed.
Ø If the vendor assessment involved an audit, verify that any observations noted have been followed up.
Ø Where subcontractors are used, verify that they have the appropriate experience and qualifications and that evidence of this is available, e.g. training records.
Ø Consider how both technical and quality requirements are placed on vendors and subcontractors. Verify if there is a defined process for this and if it has been followed.
Ø Verify if there is a method in place for accepting the product and that it includes assessment of both technical and quality requirements.
Validation Deliverables
Ensure that the validation deliverables form a consistent set with a clear relationship between the documents describing the requirements, technical design and testing approach.
Ø Verify that at a detailed level this includes traceability between individual functional requirements, design, coding and testing. In practical terms this is often best established by selecting a sample of discrete functions, including those critical to the process, GMP compliance and access security. The links between the documents for the selected functions can then be examined.
Ø Confirm that the deliverables are stated in the Validation Plan, have been completed and are summarized in the validation report.
Ø Where the Validation report contains exceptions or actions to be completed after the system implementation, confirm that these have been appropriately actioned.
Requirements
Ensure that all GMP requirements of the business process, including ER/ES considerations, have been included in the stated requirements.
Ø Verify that it is clear which records are required to have an audit trail and that the audit trail contains the required information regarding record creation, change and deletion.
Ø Verify that Electronic Signatures include information associated with the signing that indicates the printed name of signer, date and time when signature was executed and the meaning associated with the signature (such as review, approval, or authorship).
Ø Determine that the appropriate individuals and departments reviewed the requirements in the Requirements Specification and that, as a minimum, the business or system owner responsible for the process and Quality Assurance have been involved in the development of the requirements.
Design & Coding
Confirm that design documents exist covering all the user requirements and that traceability exists between requirements and design elements.
Determine that the appropriate individuals and departments reviewed them.
Confirm that design specifications were approved prior to coding commencing.
Confirm that coding standards exist. If the facility is using industry standard software (e.g. Java, Delphi, .net etc.) determine whether industry standard coding standards are used. If not, clarify why not.
Confirm that code reviews are carried out.
Determine whether all code is reviewed or a sample. If a sample is used determine how the sample was selected.
Ensure source code reviewers are independent and suitably qualified.
Confirm that defects found during code reviews are corrected, tracked and trended.
Ensure preventative actions are in place to prevent recurrence.
Testing
Review the testing documentation associated with the system under audit.
Establish that a test strategy explains the testing done at each stage of development, e.g. module testing, integration testing, interface testing and acceptance testing.
Confirm through the testing process that the critical functional requirements have been tested.
Confirm that testing covers limit, failure and stress conditions as well as tests designed to confirm that the system works as expected. Also, ensure that any logical access security controls have been included in the test regime.
Establish that testing is carried out by people who are independent (of both the developer and test author) and suitably qualified.
Confirm that test scripts are written so that they will provide the same results if re-executed.
Confirm that expected results have been clearly identified and that the results obtained have been documented with sufficient evidence to demonstrate success.
Verify that deviations from test steps or expected results have been documented.
Establish that test failures have been evaluated for risk according to the criticality of the functional requirement.
Where a test of required functionality has failed, establish that the functionality is not critical to either the supported process or GMP compliance, and confirm that a suitable procedural workaround exists.
If automated testing tools are used, establish that:
Ø they have been tested/qualified to ensure their proper operation.
Ø procedures are available that identify how the tools are to be used.
Ø script approval signatures are captured.
Ø changes to the tools or electronic scripts are controlled and approved.
Release
Determine how the system is released for use.
Evaluate the controls and checks to ensure that all required activities are completed.
Ensure that roles and responsibilities for release are clear and have been complied with.
Ensure that all required deliverables were part of the release package. These may include:
Ø a confirmed build of the code
Ø release certificate or note describing the release content and any exceptions.
Ø user manuals
Ø training material
Confirm that all master data was in place and verified as correct at time of release.
Establish if a long-term archival strategy is documented for retaining versions of the software and its associated documentation.
Ø Determine what is archived, how long it is retained, where it is archived and how access is restricted.
System Management
Confirm a detailed system description exists and is maintained.
Establish how system support is provided.
Ø Confirm that the roles and responsibilities of helpdesk, super users and technical support are clear and defined.
Ø Ensure they have the training and experience to fulfill their role
Verify that the system is owned and managed by someone with sufficient authority and budget to maintain it.
Problem Handling
Confirm that there is a problem reporting mechanism in place that logs reports, tracks their investigation and resolution.
Verify that both corrective and preventative actions are implemented for faults.
Establish the link between incident reporting and change management for the system.
Change Management
Ensure procedures for initiating, authorizing and documenting system changes are in place.
Ø Confirm changes are assessed to ensure that the impact on related functionality is understood.
Ø Ensure that the management of change includes updates to system documentation, procedural instruction, training and master data as well as the technical functionality.
Ensure that testing done in support of system changes confirms that related functions are not adversely affected as well as ensuring that the required change is achieved.
Access & Security
Establish what physical security measures are defined for the system and if they are in effect.
Ø Establish how access to physical hardware is controlled.
Establish what software security measures are defined for the system.
Ø Confirm whether access to critical system functionality has been restricted to appropriate users or groups of users.
Ø Confirm whether the ability to amend master data has been restricted
Determine the process by which user access to the system is granted and removed. Determine that the process is current and followed.
Ø Confirm that user access is reviewed periodically to ensure users only have the access they require.
Determine whether access to powerful default accounts for system elements, such as the operating system and database, is controlled and that the default passwords have been changed.
Ensure development, test and production system environments are routinely screened for viruses.
Determine if system elements such as the operating system, database and application software are maintained with the latest available security patches.
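The periodic user-access review checked above can be pictured as a comparison of current accounts against an authorized-user list. The sketch below is hypothetical; the account data, roles and findings wording are invented for illustration.

```python
# Illustrative periodic access review: flag accounts with no authorization
# on file or with a role beyond what was authorized. Data is hypothetical.
authorized = {"jsmith": "analyst", "mlee": "qa_reviewer"}
accounts = [
    {"user": "jsmith", "role": "analyst"},
    {"user": "mlee",   "role": "admin"},      # role exceeds authorization
    {"user": "ghost",  "role": "analyst"},    # leaver, never deprovisioned
]

def review_access(accounts, authorized):
    """Return (user, finding) pairs for accounts needing follow-up."""
    findings = []
    for acct in accounts:
        expected = authorized.get(acct["user"])
        if expected is None:
            findings.append((acct["user"], "no authorization on file"))
        elif expected != acct["role"]:
            findings.append((acct["user"], "role exceeds authorization"))
    return findings

print(review_access(accounts, authorized))
# -> [('mlee', 'role exceeds authorization'), ('ghost', 'no authorization on file')]
```

Evidence that such a review is performed periodically, and that its findings are actioned, is what the auditor should look for.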
Master Data
Ensure that processes exist for maintaining the master data required by the system.
Confirm that roles and responsibilities are clear regarding the management and approval of master data.
Determine whether master data is checked periodically for accuracy.
BC/DR Planning (including backup and restore)
Confirm that the system has a suitable Business Continuity plan.
Confirm this is kept current with regard to changes to the system functionality, business process, support organization etc.
Establish that this takes into account the criticality of the system.
Confirm that the system has a suitable Disaster Recovery plan.
Confirm this is periodically reviewed and kept current with regard to changes to the system functionality, business process, support organization etc.
Establish that this takes into account the criticality of the system.
Establish that a backup and restoration regime exists for the system and that it is documented, maintained and tested.
Ø Determine if the regime is based on the criticality of the process supported by the system and the data it holds.
Ø Confirm that the backup regime is followed.
Ø Establish how backup media is maintained.
Ø Determine if it is held at a remote location and that procedures exist to give appropriate control over media held remotely
Ø Verify if controls allow the retrieval of specific historic data if required.
Periodic Review
Establish that the system is subject to periodic management review to determine what needs to be done to maintain the validated state.
Confirm that the output of reviews is logged and any actions tracked to completion.