Article

Computer System Validation: A Requisite Approach for Laboratory

Gaurav Tiwari*, Ruchi Tiwari, Kaushlesh Prasad and Awani K Rai

Department of Pharmaceutical Sciences, Pranveer Singh Institute of Technology, Kanpur (UP).
As quality is the most important aspect of any manufacturing process, it becomes necessary to validate all the peripherals connected to the manufacturing instruments used in the pharmaceutical industry. Among these peripherals, the computer is the central component, as it controls and handles all the activities of the manufacturing process from initial input to final output. The requirement for Computer System Validation (CSV) has therefore naturally expanded to encompass computer systems used in both the development and production of pharmaceutical products and medical devices. The US FDA and the UK MHRA issue the guidelines governing the use of computer systems in the pharmaceutical industry. Validation takes a structured life-cycle approach, from initial planning of the project through development, testing and release to final operation and maintenance of the computer. A 4Q model is recommended for the whole process, with just four phases: design qualification (DQ), installation qualification (IQ), operational qualification (OQ) and performance qualification (PQ). Validation produces documented evidence that a process or a system meets its previously specified parameters. Validation ends when the system is retired and all important quality data are successfully migrated to the new system. The ultimate goal of any CSV project is to achieve and sustain compliance while ensuring the peak performance and functionality of computer systems. Above all, the system must be shown to operate correctly, consistently, and according to its specifications.

Keywords: Validation, Computer System, Phases of Validation, Validation of Hardware, Validation of Software.
Introduction

Pharmaceutical product research, development, manufacturing, and distribution require considerable investment of both time and money, and computerization has become key to improving operational efficiency. Computer system applications are expected to support the fundamental requirement of minimizing risk to product identity, purity, strength, and efficacy by providing consistent and secure operation and reducing the potential for human error. In the last decade, computerized systems have become a vital part of the manufacture of Active Pharmaceutical Ingredients (APIs). Proper functioning and performance of software and computer systems play a major role in obtaining consistency, reliability and accuracy of data. Therefore, cGMP regulations require that the functionalities of those computerized systems which have an influence on the quality of an API be validated, and thus Computer System Validation (CSV) should be a part of any good development and manufacturing practice. Validation shall demonstrate that the parameters defined as critical for the system's operation and maintenance are properly (adequately) controlled and managed. It is essential that the validation is practical and achievable, adds value to the project, and is concentrated on the critical elements of the system [1].
The qualification of computer systems for a validation program requires the regulated pharmaceutical industry to provide guidance and reference on regulatory requirements, validation methodologies, and documentation. A pharmaceutical organization is required to follow the GxP regulations of the U.S. Code of Federal Regulations (CFRs) and the Food and Drug Administration (FDA) in order to conduct a proper CSV program, both for the company's regulatory obligations and for its quality policy.

*Email id: lda_mpharm@rediffmail.com
Validation

The word 'validate' is defined in the dictionary as to make 'valid', 'to legalize' or indeed 'to confirm'. But what does this mean exactly? Validation produces documented evidence that a process or a system meets its previously specified parameters. It is a scientific method for confirming the value of a system for a specific purpose. So, validation in the pharmaceutical and medical device industry is defined as the documented act of demonstrating that a procedure, process, or activity will consistently lead to the expected results. It often includes the qualification of systems and equipment.

In 1987 the US FDA defined validation as "Establishing documented evidence that provides a high degree of assurance that a specific process will consistently produce a product meeting its pre-determined specifications and quality attributes". This definition was originally applied to drug manufacturing processes, but validation is now applied to many aspects of healthcare and other regulated industries and businesses, including services, equipment, computer systems, processes and cleaning. It is thus the process by which all aspects of a process (including buildings, equipment, and computer systems) are shown to meet all quality requirements and to comply with the applicable rules and regulations regarding product quality, safety and traceability. In each case, the objective of validation is to produce documented evidence which provides a high degree of assurance that all parts of the facility will consistently work correctly when brought into use. In the case of computer systems there is an expectation that validation should allow quality to be built into all stages of a regulated system's life cycle in order to minimize system errors and problems. It is a requirement for Good Manufacturing Practices and other regulatory requirements. Since a wide variety of procedures, processes, and activities need to be validated, the field of validation is divided into a number of subsections as follows:

1. Cleaning Validation
2. Process Validation
3. Analytical Method Validation
4. Computer System Validation
The use of the word 'validation' in connection with computers originated in the USA. The term 'validation', as used in the USA, refers to any type of evidence about a state of affairs. 'Validated' is therefore not an adjective for an object, but rather an adjective for a property of an object. The terms validation and qualification are very often used interchangeably. The precise meaning of, and the difference between, the terms is discussed more in theory than in practical usage. Because a distinction is only possible in theory, the line between qualification and validation is indeed blurred; in practice, a clear distinction is not necessarily required. For completeness the terms are explained briefly below. The following definitions of the terms have been established as a type of standard:

• Hardware and devices are qualified.
• Methods and processes are validated.

The combination of qualified hardware and validated processes and methods results in a validated computer system [2].

Pharma Times - Vol. 44 - No. 02 - February 2012
Computer System Validation

Computer systems installed in corporations are validated to assure that they are of high quality, meet business needs, and are designed, implemented and managed in compliance with the appropriate regulatory requirements so as to perform in a manner consistent with their intended functions. The intent of validation is to ensure that regulated systems meet the criteria listed below:

1. Systems are developed according to quality software engineering principles.
2. Systems meet the business needs of their users.
3. Systems continue to operate correctly and reliably throughout their life cycle.

So in general, CSV is mostly just good software engineering practice in a formal setting: making sure that the right system is built and managing changes. CSV can be defined as "an ongoing process of establishing documented evidence to provide a high degree of assurance that a computerized system (and its components) will consistently perform to predetermined specifications". For a process supported by a computer system, we can say that CSV provides documented proof that the system (e.g. hardware, software, peripherals and networks) will repeatedly and reliably do what it is designed to do, is "fit-for-purpose", and complies with the applicable rules and regulations. CSV must show that the system operates predictably according to its specifications, and that conclusion must be supported by formal and documentary evidence. The ultimate goal of any CSV project is to achieve and sustain compliance while ensuring the peak performance and functionality of these systems. CSV is a sound business practice that supports quality assurance and promotes responsible and profitable operations. CSV provides the evidence that a computer system does what it is intended to do according to the system specifications and operating procedures [3].
CSV: Regulatory Requirements

In 1983, the FDA published a guide to the inspection of Computerized Systems in Pharmaceutical Processing, also known as the 'bluebook' (FDA 1983). More recently, both the American FDA and the UK Medicines and Healthcare products Regulatory Agency (MHRA) have added sections to their regulations specifically for the use of computer systems. The FDA introduced 21 CFR Part 11 for rules on the use of electronic records and electronic signatures (FDA 1997). The FDA regulation is harmonized with ISO 8402:1994 (ISO 1994), which treats "verification" and "validation" as separate and distinct terms. On the other hand, many software engineering journal articles use the terms "verification" and "validation" interchangeably, or in some cases refer to software "verification, validation, and testing (VV&T)" as if it were a single concept with no distinction among the three terms. The General Principles of Software Validation defines software verification as "that which provides objective evidence that the design outputs of a particular phase of the software development life cycle meet all of the specified requirements for that phase." The software validation guideline states: "The software development process should be sufficiently well planned, controlled, and documented to detect and correct unexpected results from software changes." U.S. regulations related to computer systems are listed in Table 1.

Many regulatory agencies worldwide pay increasing attention to computerized systems:

1. AIFA (Agenzia Italiana del Farmaco) www.agenziafarmaco.it
2. Eudralex www.ema.europa.eu
3. MHRA www.mhra.gov.uk
4. ICH www.ich.org
CSV: Overview

Validation of computer systems is not a one-off event. Validation should be considered as part of the complete life cycle of a computer system. This cycle includes the stages of planning, specification, programming, testing, commissioning, documentation, operation, monitoring and modifying. For new systems, validation starts when a user department has a need for a new computer system and thinks about how the system can solve an existing problem. For an existing system, it starts when the system owner is given the task of bringing the system into a validated state. Validation ends when the system is retired and all important quality data are successfully migrated to the new system. Important steps in between are validation planning, defining user requirements, functional specifications, design specifications, validation during development, vendor assessment for purchased systems, installation, initial and ongoing testing, and change control. In other words, computer systems should be validated during the entire life of the system. Because of the complexity and the long time span of computer validation, the process is typically broken down into life-cycle phases. The V-model, which describes the system development life cycle, is frequently used. This model comprises User Requirement Specifications (URS), Functional Specifications (FS), Design Specifications (DS), development and testing of code, Installation Qualification (IQ), Operational Qualification (OQ) and Performance Qualification (PQ). The V-model works well when the validation process also includes software development, but it is overly complex for a true commercial, off-the-shelf system with no code development for customization: phases like design specification or code development and code testing are not necessary. For such systems the 4Q model is recommended, with just four phases: design qualification (DQ), installation qualification (IQ), operational qualification (OQ) and performance qualification (PQ). Neither the 4Q model nor the V-model addresses the retirement phase. The 4Q model is also not suitable when systems need to be configured for specific applications, or when additional software is required that is not included in the standard product and is developed by the user's firm or by a third party. In such cases a life-cycle model that combines system development and system integration is preferred.
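The phase ordering that these life-cycle models impose can be sketched as a small data structure. The following is a minimal illustration assuming a strict 4Q sequence; the phase names come from the text, but the function and its behaviour are hypothetical, not part of any model's formal definition:

```python
# Hypothetical sketch: the 4Q life-cycle phases (DQ, IQ, OQ, PQ) as an
# ordered checklist, enforcing that no phase is signed off before its
# predecessors are complete.

PHASES = ["DQ", "IQ", "OQ", "PQ"]

def sign_off(completed, phase):
    """Return the updated sign-off list, enforcing 4Q ordering."""
    idx = PHASES.index(phase)
    if completed != PHASES[:idx]:
        raise ValueError(
            f"cannot sign off {phase}: prerequisites {PHASES[:idx]} incomplete")
    return completed + [phase]

# Walking the phases in order succeeds; skipping a phase raises an error.
state = []
for p in PHASES:
    state = sign_off(state, p)
print(state)  # ['DQ', 'IQ', 'OQ', 'PQ']
```

The same structure could be extended with per-phase deliverables (protocols, reports) or with a retirement phase, which, as noted above, neither standard model covers.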
User representatives define the User or System Requirement Specifications (URS, SRS). If no vendor offers a suitable commercial system, the software needs to be developed and validated by following the steps on the left side of Figure 3. Programmers develop the functional specifications, design specifications and code, and perform testing in all development phases under the supervision of quality assurance. Otherwise, the vendor that best meets the user's technical and business requirements is selected and qualified [5].
Table 1. U.S. Regulations Applicable to Computer Systems

People
• 21.CFR.211.25 (Personnel qualification): qualifications, training and experience for assigned functions.
• 21.CFR.211.34 (Consultants): qualifications, training and experience to provide the service; record of qualifications and work undertaken.

Hardware
• 21.CFR.211.63 (Equipment design, size and location): system design, capacity and operating environment.
• 21.CFR.211.67 (Equipment cleaning and maintenance): preventative maintenance program at appropriate intervals, to formal procedures identifying responsibilities, schedule and tasks.
• 21.CFR.211.68(a) (Automatic, mechanical and electronic equipment): system reliability with routine calibration, inspection or checks to formal maintenance procedures; results to be documented.

Software
• 21.CFR.211.68(a),(b) (Automatic, mechanical and electronic equipment): accuracy, repeatability and diagnostics; application software documentation; configuration agreement; access security; input/output signal accuracy and device calibration; data storage; software backup, archiving and retrieval.
• 21.CFR.211.100 (Written procedures: deviations): formally approved and documented procedures (software); deviation report.
• 21.CFR.211.101(d) (Charge-in of components): automated component-addition verification.
• 21.CFR.211.180(a),(c),(d),(e) (General requirements, Records and Reports): data record availability, retention, storage medium and reviews.
• 21.CFR.211.182 (Equipment cleaning and use log): maintenance records.
• 21.CFR.211.186(a),(b) (Master production and control records): application software documentation; data reproduction accuracy.
• 21.CFR.211.188(a),(b) (Batch production and control records): documented verification of process steps; operator identification.
• 21.CFR.211.192 (Production record review): data record review by quality control.
• 21.CFR.211 (Electronic records; electronic signatures): electronic record/electronic signature type, use, control and audit trail.
• FD and C Act, Section 704(a) (Inspection): access to the computer system.
Qualifications of CSV

Design Qualification (DQ): "Design qualification (DQ) defines the functional and operational specifications of the instrument and details the conscious decisions made in the selection of the supplier". DQ should ensure that computer systems have all the necessary functions and performance criteria that will enable them to be successfully implemented for the intended application and to meet business requirements. Errors in DQ can have a tremendous technical and business impact, and therefore a sufficient amount of time and resources should be invested in the DQ phase. For example, setting wrong functional specifications can substantially increase the workload for OQ testing; adding missing functions at a later stage will be much more expensive than including them in the initial specifications; and selecting a vendor with insufficient support capability can decrease instrument up-time, with a negative business impact. Steps for design specification normally include:

1. Description of the task the computer system is expected to perform.
2. Description of the intended use of the system.
3. Description of the intended environment (including the network environment).
4. Preliminary selection of the system requirement specifications, functional specifications and vendor.
5. Vendor assessment.
6. Final selection of the system requirement specifications and functional specifications.
7. Final selection of the supplier.
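The vendor assessment in step 5 is often formalized as a weighted scoring of candidate suppliers against the user's requirements. The sketch below shows one illustrative way to do this; the criteria, weights, scores and vendor names are all assumptions for the example, not values from the article:

```python
# Illustrative DQ vendor-assessment sketch: score candidate suppliers
# against weighted criteria (0-10 scale) and rank them.

CRITERIA = {
    "functional_fit": 0.4,          # coverage of required functions
    "support_capability": 0.3,      # relevant to instrument up-time
    "regulatory_track_record": 0.2,
    "cost": 0.1,                    # higher score = more favourable cost
}

def assess(vendors):
    """Return vendor names ranked by weighted score, best first."""
    def total(scores):
        return sum(CRITERIA[c] * scores[c] for c in CRITERIA)
    return sorted(vendors, key=lambda v: total(vendors[v]), reverse=True)

vendors = {
    "VendorA": {"functional_fit": 9, "support_capability": 6,
                "regulatory_track_record": 8, "cost": 5},
    "VendorB": {"functional_fit": 7, "support_capability": 9,
                "regulatory_track_record": 7, "cost": 8},
}
print(assess(vendors))  # ['VendorB', 'VendorA']
```

In a real DQ, the chosen criteria and weights would themselves be documented and justified, since they form part of the evidence for the supplier decision.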
Installation Qualification (IQ): Installation qualification establishes that the computer system is received as designed and specified, that it is properly installed in the selected environment, and that this environment is suitable for the operation and use of the instrument. The list below includes the steps recommended before and during installation.
1. Before installation
• Obtain the manufacturer's recommendations for installation and site requirements.
• Check the site for fulfillment of the manufacturer's recommendations (utilities such as electricity, water and gases, and environmental conditions such as humidity, temperature, vibration levels and dust).

2. During installation
• Compare computer hardware and software, as received, with the purchase order (including software, accessories and spare parts).
• Check documentation for completeness (operating manuals, maintenance instructions, standard operating procedures for testing, safety and validation certificates).
• Check computer hardware and peripherals for any damage.
• Install hardware (computer, peripherals, network devices, cables).
• Install software on the computer following the manufacturer's recommendations.
• Verify correct software installation, e.g., that all files are accurately copied to the computer hard disk. Utilities to do this should be included in the software itself.
• Make a back-up copy of the software.
• Configure network devices and peripherals, e.g. printers and equipment modules.
• Identify and make a list with a description of the hardware, including drawings where appropriate, e.g., for networked data systems.
• Make a list with a description of the software installed on the computer.
• Store configuration settings, either electronically or on paper.
• List equipment manuals and SOPs.
• Prepare an installation report.

Installation and installation qualification (IQ) of larger commercial systems are normally performed by a supplier's representative. Both the supplier's representative and a representative of the user should sign off the IQ documents [6, 7].
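The "verify correct software installation" step above can be automated by comparing checksums of the installed files against a manifest. A minimal sketch, assuming (hypothetically) that the vendor supplies, or the IQ team records, a manifest of expected SHA-256 hashes keyed by relative path:

```python
# IQ sketch: verify installed files against a checksum manifest.
import hashlib
from pathlib import Path

def sha256(path):
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_install(manifest, root):
    """manifest: {relative_path: expected_sha256}. Return a deviation list;
    an empty list means the installation matches the manifest."""
    deviations = []
    for rel, expected in manifest.items():
        p = Path(root) / rel
        if not p.exists():
            deviations.append(f"missing: {rel}")
        elif sha256(p) != expected:
            deviations.append(f"checksum mismatch: {rel}")
    return deviations
```

Any deviations returned would be recorded in the installation report rather than silently corrected, consistent with the documentation emphasis of IQ.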
Operational Qualification (OQ): "Operational qualification (OQ) is the process of demonstrating that a computer system will function according to its functional specifications in the selected environment." Before OQ testing is done, one should always consider what the computer system will be used for. There must be a clear link between testing as part of OQ and the requirement specifications developed in the DQ phase. Testing may be quite extensive if the computer system is complex and if there is little or no information from the supplier on what tests have been performed at the supplier's site. The extent of testing should be based on a justified and documented risk assessment. Criteria are: impact on product quality; impact on business continuity; complexity of the system; information from the vendor on the type of tests and the test environment; and level of customization.

The most extensive tests are necessary if the system has been developed for a specific user. In this case the user should test all functions. For commercial off-the-shelf systems that come with a validation certificate, only those functions that are highly critical for the operation, or that can be influenced by the environment, need be tested. Specific user configurations should also be tested; for example, the correct settings of the IP addresses of network devices should be verified through connectivity testing.
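The connectivity testing mentioned above can be scripted. The sketch below attempts a TCP connection to each configured device and records the outcome; the device names, addresses and ports would come from the user's configuration records and are illustrative assumptions here:

```python
# OQ sketch: verify configured network devices accept TCP connections.
import socket

def check_connectivity(devices, timeout=2.0):
    """devices: {name: (host, port)}. Return {name: True/False}
    indicating whether each device accepted a TCP connection."""
    results = {}
    for name, (host, port) in devices.items():
        try:
            with socket.create_connection((host, port), timeout=timeout):
                results[name] = True
        except OSError:
            results[name] = False
    return results
```

A failed check does not by itself localize the fault (device, cabling, or IP setting), but it flags the configuration for investigation and gives a documented pass/fail result for the OQ record.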
Performance Qualification (PQ): "Performance Qualification (PQ) is the process of demonstrating that a system consistently performs according to a specification appropriate for its routine use". Important here is the word 'consistently'. Important for consistent computer system performance are regular preventive maintenance (e.g., removal of temporary files), making changes to the system in a controlled manner, and regular testing. In practice, PQ can mean testing the system with the entire application. For a computerized analytical system this can mean, for example, running a system suitability test, in which critical key system performance characteristics are measured and compared with documented, preset limits. PQ activities normally can include:

• A complete system test to prove that the application works as intended. For a computerized analytical system, for example, this can mean running a well-characterized sample through the system and comparing the results with a result previously obtained.
• Regression testing: reprocessing data files and comparing the results with previous results.
• Regular removal of temporary files.
• Regular virus scans.
• Auditing of computer systems.
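The system suitability test described above amounts to comparing measured performance characteristics against documented, preset limits. A minimal sketch follows; the parameter names and limit values are hypothetical examples of chromatography-style characteristics, not prescribed acceptance criteria:

```python
# PQ sketch: compare measured system characteristics against preset limits.

# Hypothetical documented limits: {parameter: (lower, upper)}.
LIMITS = {
    "retention_time_rsd_pct": (0.0, 1.0),
    "plate_count": (2000, float("inf")),
    "tailing_factor": (0.8, 1.5),
}

def suitability(measured):
    """Return {parameter: 'pass'/'fail'} against the preset limits."""
    verdict = {}
    for param, (lo, hi) in LIMITS.items():
        value = measured[param]
        verdict[param] = "pass" if lo <= value <= hi else "fail"
    return verdict

print(suitability({"retention_time_rsd_pct": 0.6,
                   "plate_count": 3500,
                   "tailing_factor": 1.7}))
# {'retention_time_rsd_pct': 'pass', 'plate_count': 'pass', 'tailing_factor': 'fail'}
```

Because PQ is about consistency, such a check would be run at a defined routine interval and its results archived, so that drift over time is visible.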
The most efficient approach is to use software for automated regression testing. The software runs typical data sets through a series of applications and calculates and stores the final result using processing parameters as defined by the user. During regression testing the data are processed again and the results are compared with the previously recorded results. Normally such tests take no more than five minutes but give assurance that the key functions of the system work as intended [8].
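The automated regression test just described can be sketched as follows. The processing function here is a deliberately trivial stand-in for the application's own calculation (e.g., peak integration); the dataset names and tolerance are assumptions:

```python
# Sketch of automated regression testing: reprocess stored data sets and
# compare results with previously recorded values, within a tolerance.
import math

def process(dataset):
    # Stand-in for the application's processing; returns the mean here.
    return sum(dataset) / len(dataset)

def regression_test(datasets, recorded, rel_tol=1e-9):
    """datasets: {name: data}; recorded: {name: expected_result}.
    Return a list of (name, new_result, recorded_result) failures."""
    failures = []
    for name, data in datasets.items():
        result = process(data)
        if not math.isclose(result, recorded[name], rel_tol=rel_tol):
            failures.append((name, result, recorded[name]))
    return failures
```

An empty failure list documents that reprocessing still reproduces the recorded results; any entry flags a change in the system's calculations that needs investigation under change control.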
Validation of HARDWARE and SOFTWARE

Validation of Hardware

Hardware validation is essential when the hardware is to be used in complex systems deployed in cost-critical and life-critical applications. This motivates the need for a systematic approach to verifying functionality. Hardware verification complexity has increased to the point that it dominates the cost of design. In order to manage the complexity of the problem, we have to investigate simulation-based validation techniques, in which functionality is verified by simulating (or emulating) a system description with a given test input sequence. Formal techniques, by contrast, suffer from high complexity, so the verification of large designs using formal techniques alone is often intractable. The complexity of simulation-based validation can be made tractable by using a test sequence of reasonable length, and the degree of certainty provided can become arbitrarily close to 100%. A practical difficulty in the validation of large hardware systems is choosing the proper design abstraction level, which provides a trade-off between simulation complexity and error-modelling accuracy. In practice, validation is performed at all levels of abstraction, from behavioural down to layout. Behavioural hardware description languages, such as VHDL and Verilog, have only been fully accepted by industry for less than a decade, and research in behavioural validation is still developing.
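Simulation-based validation of the kind described above drives a device-under-test (DUT) and a trusted golden reference model with the same test input sequence and compares their outputs cycle by cycle. The sketch below is a toy illustration in Python rather than an HDL: both "models" are stand-in functions for a 4-bit counter, with a deliberate bug planted in the DUT so the comparison has something to find:

```python
# Toy simulation-based validation: compare a DUT model against a golden
# reference model over a shared stimulus sequence.

def golden_counter(inputs):
    """Reference model: 4-bit counter that saturates at 15."""
    count = 0
    for enable in inputs:
        if enable:
            count = min(count + 1, 15)
        yield count

def dut_counter(inputs):
    """Device under test: deliberately buggy -- wraps instead of saturating."""
    count = 0
    for enable in inputs:
        if enable:
            count = (count + 1) % 16
        yield count

def validate(stimulus):
    """Return the cycle indices where DUT and reference disagree."""
    return [i for i, (a, b) in
            enumerate(zip(dut_counter(stimulus), golden_counter(stimulus)))
            if a != b]

print(validate([1] * 20))  # mismatches appear once the count passes 15
```

The confidence such a run provides depends entirely on the stimulus: a sequence of fewer than 16 enables never exercises the saturation corner and reports no mismatches, which is the "test sequence of reasonable length" trade-off noted in the text.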
Validation of Software

As the pharmaceutical and chemical research and testing industries phase in, and start to comply with, GALP, we are more frequently called upon to provide information and documentation that may help to satisfy the reporting and compliance requirements.
The Software Life Cycle

Concept

An idea for a software improvement is usually expressed and formed in one of the following ways:

1. A user has a need for some feature or improvement, and makes a specific request.
2. A user or prospective customer raises an issue and expresses its importance.
3. Individuals close to the development process recognize and identify improvement possibilities.
Analysis and Design

A program structure analysis is conducted to determine the effects of the improvement:

1. Whether the underlying structure of the program can support the added functionality.
2. Whether backward compatibility can be maintained with data collected by previous program versions.
3. The extent to which modules, functions and procedures will be affected.
4. The extent to which additional functions, procedures and algorithms will be required.
5. A determination of the program branch points affected.
6. Specification of the variables and flow-control flags needed for implementation and control.
Coding and Implementation

During the coding and implementation phase, organizational issues are observed:

1. The coding languages, style and format are kept consistent with the rest of the application's modules, functions and procedures.
2. Variables and condition flags are assigned and named in accordance with Scintco's standard mnemonic and naming guidelines.
Test and Verification
The developer tests and verifies that:
1. The functionality of the improvement is in accordance with the requirements and specifications.
2. The user interface, input, branching, processing and output are as specified by the requirements.
3. The functionality within procedures affected by the improvement is not compromised when the improvement is bypassed.
4. In special request cases, a pre-release version may be available for beta testing by those making the request.
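Point 3, verifying that existing functionality is unchanged when the improvement is bypassed, lends itself to an automated regression check. A minimal sketch, assuming (hypothetically) that the improvement is guarded by a flag; the function, the weighting feature and the flag name are all illustrative:

```python
# Illustrative sketch: verify behaviour is unchanged when a new
# improvement is bypassed. The weighting feature and flag are hypothetical.

def compute_mean(values, use_weighting=False, weights=None):
    """Original behaviour: plain mean. Improvement: optional weighting."""
    if use_weighting and weights:
        return sum(v * w for v, w in zip(values, weights)) / sum(weights)
    return sum(values) / len(values)

# Regression check: with the improvement bypassed, the result must match
# the pre-improvement implementation exactly.
assert compute_mean([1.0, 2.0, 3.0]) == (1.0 + 2.0 + 3.0) / 3
# The improvement itself is exercised separately.
assert compute_mean([1.0, 3.0], use_weighting=True, weights=[3, 1]) == 1.5
print("regression checks passed")
```

Keeping the bypassed path byte-for-byte identical to the previous release is what makes such a check meaningful as validation evidence.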
Minor changes and cautionary notes are appended to the Software Advisory Bulletins, so they more accurately depict proper and useful operation and advise the user of issues requiring special consideration [2, 9].
Validation and Release
The distributor receives the program version package with the applicable Software Advisory Bulletins, then tests and verifies the system for proper operation, and when satisfied that everything is in order, prepares the version for final release.
Operation and Maintenance
User Documentation: The Software Advisory Bulletins serve as the raw material from which the distributor produces the necessary user documentation and incorporates it into the instruction manual. The Software Advisory Bulletins are normally included in an appendix of the instruction manual. The distributor and the developer provide recommendations and guidelines which a user may find useful when developing SOPs for a particular study.
Anomaly Report
An Anomaly Report form is provided for the purpose of reporting a software problem. It includes:
1. A description of the location within the software system where the anomaly is first encountered or observed, i.e. the sequence of events which leads to an expression of the anomaly.
2. A description of the effect and impact the anomaly has on the system or on the affected software function.
3. To the extent possible, the cause or perceived cause of the anomaly.
4. To the extent possible, the level of criticality the anomaly represents.
5. Recommendations and suggestions that represent a resolution of the anomaly.
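The fields above map naturally onto a structured record, which makes anomaly reports easier to file and triage consistently. A sketch with illustrative field names; the severity scale is an assumption, not something prescribed here:

```python
# Illustrative sketch of an Anomaly Report record mirroring the fields
# listed above. Field names and the severity scale are assumptions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class AnomalyReport:
    location: str                  # where in the system the anomaly appears
    reproduction_steps: List[str]  # sequence of events leading to it
    impact: str                    # effect on the system or affected function
    suspected_cause: str = ""      # best-effort, may be left empty
    criticality: str = "unknown"   # e.g. low / medium / high / unknown
    recommendations: List[str] = field(default_factory=list)

report = AnomalyReport(
    location="peak-integration module",
    reproduction_steps=["load run 42", "select manual baseline"],
    impact="integrated area differs from the audit-trail value",
    criticality="high",
)
print(report)
```

Making cause and criticality optional reflects points 3 and 4: the reporter supplies them only to the extent possible.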
Validation Master Plan and Project Plan
The Validation Master Plan is a document that describes how the validation program will be executed in a facility. It should be developed according to company policies and internal procedures, covering both infrastructure and applications. SOPs should be in place together with a formal System Life Cycle Concept which describes all the relevant activities for creating and maintaining qualified infrastructure and applications. All validation activities should be described in a validation master plan which should provide a framework for thorough and consistent validation. A validation master plan is officially required by Annex 15 of the European GMP directive. FDA regulations and guidelines do not mandate a validation master plan; however, inspectors want to know what the company’s approach towards validation is. The validation master plan is an ideal tool to communicate this approach both internally and to inspectors [10]. It also ensures consistent implementation of validation practices and makes validation activities much more efficient. If there are any questions as to why things have been done or not done, the validation master plan should give the answer. Computer Validation Master Plans should include:
1. Introduction with the scope of the plan, e.g., sites, systems, processes.
2. Responsibilities by function.
3. Related documents, e.g., risk management plans.
4. Products/processes to be validated and/or qualified.
5. Validation approach, e.g., system life cycle approach.
6. Risk management approach with examples of risk categories and recommended validation tasks for different categories.
7. Vendor management.
8. Steps for Computer System Validation with examples of the type and extent of testing, for example, for IQ, OQ and PQ.
9. Handling of existing computer systems.
10. Validation of macros and spreadsheet calculations.
11. Qualification of network infrastructure.
12. Configuration management and change control procedures and templates.
13. Back-up and recovery.
14. Error handling and corrective actions.
15. Requalification criteria.
16. Contingency planning and disaster recovery.
17. Maintenance and support.
18. System retirement.
19. Training plans (e.g., system operation, compliance).
20. Validation deliverables and other documentation.
21. Templates and references to SOPs.
22. Glossary.
23. Validation Report and other documents.
Validation Report
When the validation project is completed, a validation summary report should be
generated by the system owner. The report documents the outcome of the validation project [11]. The validation report should mirror the validation project plan and should include:
1. A brief description of the system.
2. Identification of the system and all software versions that were tested.
3. Description of the hardware used.
4. Major project activities.
5. Listing of test protocols, test results and conclusions.
6. Statement on system status prior to release.
7. List of all major or critical issues and deviations with risk assessment and corrective actions.
8. Statement that all tasks have been performed as defined in the project plan.
9. Statement that validation has been performed according to the documented procedures.
10. Listing of all deliverables.
11. Final approval or rejection statement.
The validation report should be reviewed, approved and signed by QA and the system owner.
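Since the report should mirror the project plan, a small completeness check can flag missing sections before the QA review. A sketch; the section names abbreviate the list above and are not a mandated vocabulary:

```python
# Illustrative sketch: flag required report sections missing from a draft.
# The section names abbreviate the list above; they are assumptions.
REQUIRED_SECTIONS = [
    "system description", "software versions", "hardware",
    "project activities", "test protocols and results", "system status",
    "issues and deviations", "task completion statement",
    "procedure compliance statement", "deliverables", "approval statement",
]

def missing_sections(report: dict) -> list:
    """Return the required sections absent from a draft report."""
    return [s for s in REQUIRED_SECTIONS if s not in report]

draft = {s: "..." for s in REQUIRED_SECTIONS if s != "approval statement"}
print(missing_sections(draft))
```

A check like this does not replace QA review; it only catches structural gaps early.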
Checklists
Checklists help to verify that validation tasks are identified and performed. However, some validation tasks are system-specific. Therefore, working through a checklist does not mean that everything is covered for each system, nor does it mean that all checklist items are applicable to every system.
Templates and Validation Examples
Templates are useful to effectively follow and document validation tasks and results. Validation examples help to get adequate information on how to conduct validation and to prepare deliverables [12].
Documentation
Basic Documentation
In addition to the basic GLP documentation (i.e. training records, job descriptions, and CVs), there should be an inventory of all computerized systems being used in the facility, listing system name, system owner, location and validation status.
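Such an inventory is tabular and easy to keep in machine-readable form. A sketch using the fields named above; the entries and status values are illustrative:

```python
# Illustrative sketch of a computerized-system inventory with the fields
# named above. Entries and status values are made up for the example.
import csv
import io

FIELDS = ["system_name", "system_owner", "location", "validation_status"]

inventory = [
    {"system_name": "HPLC-01 CDS", "system_owner": "QC Lab",
     "location": "Building A, Room 101", "validation_status": "validated"},
    {"system_name": "LIMS", "system_owner": "IT",
     "location": "Server room", "validation_status": "revalidation due"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(inventory)
print(buf.getvalue())
```

A flat CSV file is enough for an inspector-facing list; larger facilities would keep the same fields in a database.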
Standard Operating Procedures
GLP requires a set of standard operating procedures for the development and/or routine use of validated computerized systems addressing the following topics:
1. Operation: In addition to the User Manual, an SOP should describe how the computerized system will be used for its intended purpose.
2. Security: Two levels of security should be addressed:
   Physical security of the system (e.g. locked server room).
   Logical security of the system (e.g. user ID, password) including user rights.
3. Problem log: This should describe how to document and resolve problems encountered during routine operation of the system. Reference to change management procedures should be taken into account.
4. Maintenance: Regular and preventive maintenance should be described.
5. Change control: Changes to the computerized system, except regular and preventive maintenance, should be evaluated for their potential impact on the validation status. The procedure for performing change control should be described.
6. Backup and restore: Procedures for backup of the application and data should be defined, including their frequency, the period of retention for backup copies, the method and responsibility for periodic backups, and the process of restoration.
7. Periodic testing: The system needs to be monitored regularly for correct operation, including device checks. Basic functionality testing should be performed on a regular basis.
8. Contingency plan and disaster recovery: A contingency plan should specify procedures to be followed in case of system breakdown or failure. A detailed plan for disaster recovery should be available.
9. Archiving and retrieval: Procedures should describe how and where documents, software and data are archived, including the period of retention, retrieval mechanism, readability, and storage conditions.
10. Quality Assurance: Procedures describing how QA will review and inspect the system life cycle and the IT infrastructure in a GLP-regulated environment.
Apart from the SOP on operation of a system, these SOPs may be as generic as possible; i.e. they need not be written separately for each application [7, 13].
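The backup-and-restore SOP points above (frequency, retention period for copies, restoration process) can be sketched as a small script. The paths, the retention count and the timestamped naming scheme are all illustrative assumptions:

```python
# Illustrative sketch of a backup SOP: timestamped copies with a defined
# retention count, plus restoration of the most recent copy. Paths,
# retention and naming scheme are assumptions, not a prescribed procedure.
import shutil
import time
from pathlib import Path

RETENTION = 5  # keep the five most recent backup copies

def back_up(data_dir: Path, backup_root: Path) -> Path:
    """Copy the data directory to a timestamped backup, pruning old copies."""
    stamp = time.strftime("%Y%m%d-%H%M%S")
    dest = backup_root / f"backup-{stamp}"
    shutil.copytree(data_dir, dest)
    # Enforce the retention period: names sort chronologically, so the
    # oldest surplus copies come first.
    for old in sorted(backup_root.glob("backup-*"))[:-RETENTION]:
        shutil.rmtree(old)
    return dest

def restore(backup_root: Path, target: Path) -> None:
    """Restore the most recent backup into the target directory."""
    latest = sorted(backup_root.glob("backup-*"))[-1]
    shutil.copytree(latest, target, dirs_exist_ok=True)
```

In practice the frequency would be driven by a scheduler, and each run and each restoration test would be recorded, since the SOP also has to prove the backups are restorable.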
Additional System Specific Documents
1. Installation manual: A set of instructions that have to be followed when the system is installed. In addition, it defines the minimum hardware and operating system requirements.
2. User manual: Describes how to use the system; usually provided by the vendor.
3. Release notes: Contain information on changes and enhancements of the software compared to a previous version.
4. Vendor audit report: Describes the results of the inspection of the vendor concerning the software development life cycle (SDLC) and the quality system of the vendor. It also includes information about software design and, in particular, about software testing.
5. Logbook: A logbook should be established to record all actions (e.g. calibration, cleaning, maintenance, change control) on all components of a computerized system over the whole life cycle.
6. Source code: The test facility should have access to the source code of the application software. It is not necessary to have it available at the test facility, but the test facility should ensure that the vendor of the software maintains the source code for each version in a safe place.
Conclusion
Successful CSV is highly dependent upon a quality management or quality assurance system. CSV must establish a “level of confidence” that the system consistently meets all requirements and user expectations. CSV is a critical activity that should be pursued and formally documented for all systems with regulatory implications. CSV activities provide the controlled testing conditions necessary to ensure proactive identification and resolution of operational and regulatory issues. Above all, the system must be shown to operate correctly, consistently, and according to its specifications. Whilst the concepts and principles behind computer validation remain convincing and relevant, it is clear that computer validation practices need to be updated to reflect modern computer technology and development techniques.
“At the end of the day people make the difference. Good people deliver not just short-term results but results that hold up to scrutiny long-term too.”