Tuesday, 11 February 2014

Process Validation: Can We Now Get Back to Basics?

By Ali Afnan, PhD, Principal, Step Change Pharma, Inc.
It was around 2002. The message of change was in the breeze. At FDA's advisory committee meetings, the language of science was used to examine and discuss Good Manufacturing Practices (GMPs), both in practice and in law. To support and enhance the milieu for change, and to prevent GMPs from getting in the way of innovation, the "GMPs for the 21st Century" program was initiated.
But one area where industry remained unclear was process validation, whose roots go back to a brief part of Title 21 of the Code of Federal Regulations (21 CFR). Validation is the most frequently performed GMP activity. Yet, after GMPs became law, drug companies were unsure how to interpret the validation requirements.
FDA responded with a guidance in 1987 explaining its thinking about validation. Guidances are by no means law. It's not "FDA's way or the highway," as has been proven in U.S. courts (see, for example, a judge's decision regarding Utah Medical Products).
But the original process validation guidance was unclear on principles and requirements, and it led to misunderstandings. For instance, an example cited in that guidance was misinterpreted as requiring the "execution of three consecutive validation batches," and a string of engineering batches was typically run beforehand to assure the successful execution of those three consecutive batches.

Three-Batch Generation

Let's take a closer look at the original from 21 CFR 211.110(a), which states: "Control procedures shall be established to monitor the output and to validate the performance of those manufacturing processes that may be responsible for causing variability. . . ." Interestingly enough, "appropriate samples" is also a requirement of this same section of the regulations.
An entire industry has grown around the “three-batch practice.” Would any conscientious reader agree that three consecutive batches, even after a large number of engineering batches, satisfy either the spirit or the letter of the law?
Industry practitioners, as well as most of the professionals at FDA, agree that the "three consecutive batch" practice falls short of what it was intended to achieve, namely validating the performance of processes responsible for variability.
In January 2011, FDA published revised guidance on process validation. So, after 24 years, the Agency's thinking on process validation has changed. What does this mean? The routine of "three batches and you are done" is no longer valid; truly "statistically significant" sampling is now a reality.
This guidance is far less prescriptive than its predecessor, even though that one was not truly prescriptive either, just misinterpreted, with an example that hardened into practice.
The new guidance defines three stages (see "A Practical Roadmap to Pharmaceutical Process Validation"), and the third stage looks like a never-ending process! The reason is that a validated process is an outcome; in the past we treated validation as a discrete activity and an input. I can hear many complaining that the 2011 guidance raises the cost of compliance, and asking how many batches comprise the "validation batches."

Rocking the Boat

The FDA has rocked the boat. As a consumer, I am thankful that the Office of Compliance had the vision and drive to deliver a guidance of this nature—in line with common sense, science and good engineering practice.
Only reasonable guidance like this can bring the pharmaceutical industry’s quality and efficiency performance in line with that of consumer goods and other industries.
What “control procedures” have other industries like consumer goods established to “monitor the output and to validate the performance of those manufacturing processes that may be responsible for causing variability”?
Is a three-batch process validation part of their control and manufacturing strategies? What end of line testing strategy do they rely on? Or do they simply run their processes to control the quality of the material attributes?
What about the cost of compliance? Surely a culture of compliance is inferior to a culture of quality. Should the medications that our well-being depends on come from a compliant mindset or a quality mindset?
Yes, the 2011 guidance will drive up the cost of compliance, if the mindset and hence the practice is that of mere compliance.
This guidance facilitates understanding the manufacturing process as well as the relationship between the product and the process, based on R&D information, and then facilitates real time control of the process. It requires, primarily, an open mind and an approach very different from the old three-batch mindset.
Once the process and the product, and their interrelationship, are understood and controlled, the cost of quality will be reduced.
But how willing will a regulated industry be to change its practices, and how will the guidance be enforced?

A Practical Roadmap to Pharmaceutical Process Validation

By Wai Wong and Bikash Chatterjee, Pharmatech Associates
In January 2011, the FDA issued its new guidance regarding Process Validation. Based upon experience gathered by the agency since 1987, the new guidance reflects the principles of the 2004 FDA initiative, Pharmaceutical cGMPs for the 21st Century – A Risk-Based Approach.
This new definition of process validation is a significant paradigm shift from the original concept, embracing the basic principles of scientific understanding put forth in ICH Q8 and Q9 as a foundation for controlling process variability.
The challenge most organizations will face with this new guidance is assuming responsibility for defining what is scientifically acceptable when characterizing the sources of process variability. This article presents a roadmap, both practical and scientifically sound, for deploying a process validation program consistent with the new guidance.
In our experience, the biggest challenge facing organizations attempting to move beyond the classical paradigm of "three batches and we're done" is understanding how the new process validation stages work together to build the argument for process predictability.
The new guidance divides process validation into three stages:
  • Stage 1: Process Design: The commercial manufacturing process is defined during this stage based on knowledge gained through development and scale-up activities.
  • Stage 2: Process Qualification: During this stage, the process design is evaluated to determine if the process is capable of reproducible commercial manufacturing.
  • Stage 3: Continued Process Verification: Ongoing assurance is gained during routine production that the process remains in a state of control.
The Roadmap
A proposed approach for connecting the activities within each stage and implementing a manageable program is given in Figure 1.

Stage 1: Process Design
This initial stage is the most significant departure from the classical definition of what constitutes process validation. Stage 1 focuses on process characterization studies to identify the Key Process Input Variables (KPIV) that affect the Critical-to-Quality (CTQ) measurements for the product. These CTQs reflect the product's form, fit or function, and the characterization work is typically performed on small or intermediate-scale equipment.
  1. Product Design
One might ask, why go all the way back to product design? Process predictability relies on understanding what drives product performance, and a solid grasp of the formulation and product design rationale is essential to achieving that level of understanding. The formulation provides an early glimpse of which processing steps may become critical downstream and hence become sources of variation in the process. The product design rationale defines how the formulation, raw materials and processing steps relate to achieving the desired product performance. Without this understanding, it is difficult to know where the emphasis should be for the initial characterization studies in Stage 1 and the confirmatory studies in Stage 2.
  2. Process Risk Assessment
As the process is developed at small scale, a process risk management tool such as a Process Failure Modes and Effects Analysis (pFMEA) can be powerful in identifying which processing steps could affect process stability in Stage 2. Before conducting any risk management exercise it is a good idea to create a process map that captures all inputs, outputs and control variables. This map can be used to discuss which CTQs will be measured and provides a risk-based foundation for developing a sampling and testing strategy. Small-scale and scale-up data may be captured here if a comparability argument is part of the downstream scale-up exercise, to ensure parity between the critical output parameters as they relate to the identified CTQs.
At this point, the pFMEA can be used to prioritize which key process steps and KPIVs represent areas of risk to process predictability.  These areas will become the focus of characterization studies in Stage 1 and later in Stage 2.
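As a sketch of how such a pFMEA prioritization might be tabulated, the fragment below ranks process steps by Risk Priority Number (RPN = severity × occurrence × detectability). All step names and scores are illustrative, not taken from the article:

```python
# Hypothetical pFMEA scores on 1-10 scales: (severity, occurrence, detectability)
steps = {
    "blending":    (8, 6, 4),
    "granulation": (9, 5, 5),
    "compression": (7, 3, 3),
    "coating":     (5, 4, 6),
}

# Risk Priority Number = severity * occurrence * detectability
rpn = {step: s * o * d for step, (s, o, d) in steps.items()}

# Highest-RPN steps become the focus of Stage 1 characterization studies
ranked = sorted(rpn, key=rpn.get, reverse=True)
```

Steps at the top of `ranked` would then drive the sampling and testing strategy for the characterization work.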
Equipment/Process Characterization Studies
Before beginning any characterization study, it is essential to be sure the equipment performance is stable and reproducible.  Characterization studies performed on unstable equipment will introduce variability that will not be indicative of the final process.  While a formal qualification process is not required, fundamental engineering characterization studies should be performed on the equipment before beginning the process studies.
When looking at the basic principles behind ICH Q8, the guidance describes a tiered exercise in which process understanding and key parameter variability are methodically narrowed as the process definition moves from the knowledge space through the design space to the control space used for manufacturing. Characterization studies need to be balanced in their experimental design: early one-factor-at-a-time (OFAT) studies can serve as supportive data, but characterization studies should be balanced, or "orthogonal," when determining the contribution of critical input parameters to process stability. While the number of lots will increase during this phase, these smaller-scale studies provide the opportunity for larger sampling plans and greater process characterization than would be practical with full-scale batches. Effective Stage 1 characterization studies are based on several factors:
a) Sampling Plans
Designing a sampling plan with the appropriate resolution to describe process variability is important to building confidence as the process scales up and moves to validation. There is no single prescribed approach: the FDA does not legislate how a sampling plan must be established, but whatever approach is selected must have a clearly defined rationale behind it. Possible sources and approaches for developing a sampling plan include the PQRI recommendations for powder processes, ANSI Z1.4-2008, Acceptable Quality Level (AQL), Lot Tolerance Percent Defective (LTPD), and the Operating Characteristic (OC) curve. There is no right or wrong answer, but whatever sampling plan is developed must be defensible based upon the level of resolution necessary to see variation in the process.
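To illustrate how an Operating Characteristic curve describes a sampling plan's resolution, here is a minimal sketch. The n = 50, c = 1 plan is a made-up example for illustration, not a recommendation:

```python
from math import comb

def accept_prob(n, c, p):
    """P(accept) for a single-sample plan: accept the lot if at most
    c defectives are found in a random sample of n (binomial model)."""
    return sum(comb(n, d) * p**d * (1 - p)**(n - d) for d in range(c + 1))

# Sketch the OC curve for a hypothetical n=50, c=1 plan: acceptance
# probability falls as the true defect rate p rises.
for p in (0.005, 0.01, 0.02, 0.05):
    print(f"p={p:.3f}  Pa={accept_prob(50, 1, p):.3f}")
```

Plotting `accept_prob` against `p` gives the OC curve; comparing curves for candidate plans is one defensible way to justify the resolution of a chosen sampling plan.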
b) Sampling Technique
Although the equipment may not reflect the sampling challenges at full scale, demonstrating that the sampling and storage methodology does not introduce variability into the process is a precursor to performing characterization studies. A Gage Repeatability and Reproducibility (GRR) study is an effective way of demonstrating that the sampling technique is robust.
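A full GRR study crosses operators with parts; as a simplified illustration of the underlying idea, the sketch below estimates repeatability from repeat measurements of a few samples using a one-way variance decomposition. The data and the 10% acceptance threshold are illustrative assumptions, not values from the article:

```python
from statistics import mean, variance

# Hypothetical repeat measurements of three samples (one analyst, one instrument)
parts = {
    "A": [10.0, 10.1, 9.9],
    "B": [20.0, 20.1, 19.9],
    "C": [30.0, 30.1, 29.9],
}
n = 3  # repeats per sample

ms_within = mean(variance(v) for v in parts.values())   # repeatability (measurement noise)
part_means = [mean(v) for v in parts.values()]
ms_between = n * variance(part_means)

# One-way ANOVA variance-component estimates
var_part = max(0.0, (ms_between - ms_within) / n)       # sample-to-sample variation
var_total = var_part + ms_within

# %GRR: share of total observed spread attributable to measurement
pct_grr = 100 * (ms_within / var_total) ** 0.5
```

A small `pct_grr` (a common rule of thumb is under 10%) suggests the sampling and measurement technique is not masking the process variation one is trying to characterize.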
c) Method Robustness
Typically, analytical and in-process methods are validated at this stage, but it is important to ensure the accuracy and precision of the method itself. Making sure the measurement tool is capable of seeing the differences in the process performance being evaluated is fundamental to knowing you are characterizing process variability and not measuring noise.
Design Space Establishment
To identify the boundaries and variables that drive process stability, it is possible to focus only on the parameters that steer the process and the corresponding Key Process Output Variables (KPOV) that affect the product CTQs. The design space exploration will probe the boundary limits of the parameters that are critical to process stability. Identifying the KPIVs of interest can be achieved using a combination of a balanced Design of Experiments (DOE) approach and statistical analysis, such as Analysis of Variance (ANOVA), to summarize the contribution of each variable to the variation seen in the data. A high coefficient of determination (r²) means that most of the variation seen in the data can be explained by the variables evaluated.
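To make the orthogonality and variance-contribution ideas concrete, here is a small sketch: a balanced two-level full factorial in three hypothetical factors with a noise-free simulated response, from which each factor's percent contribution to the total variation is computed. With no noise the two active factors explain all the variation (r² = 1); everything here is illustrative:

```python
from itertools import product

# Balanced (orthogonal) 2-level full factorial in three factors A, B, C
runs = [dict(zip("ABC", levels)) for levels in product((-1, 1), repeat=3)]

def response(r):
    # Hypothetical noise-free response: A and B matter, C is inert
    return 5.0 + 2.0 * r["A"] - 1.0 * r["B"]

y = [response(r) for r in runs]
ybar = sum(y) / len(y)
ss_total = sum((v - ybar) ** 2 for v in y)

contribution = {}
for f in "ABC":
    hi = [v for r, v in zip(runs, y) if r[f] == 1]
    lo = [v for r, v in zip(runs, y) if r[f] == -1]
    effect = sum(hi) / len(hi) - sum(lo) / len(lo)   # difference of means
    ss = len(y) * (effect / 2) ** 2                  # sum of squares, balanced 2-level design
    contribution[f] = 100 * ss / ss_total            # percent of total variation
```

Because the design is orthogonal, each factor's contribution is estimated independently; summing the contributions of the modeled factors and dividing by 100 gives the r² of the corresponding linear model.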
Validation Master Plan
The end of Stage 1 should provide sufficient detail to develop the validation master plan, which will describe the approach, justification and rationale for moving to Process Performance Qualification.
Stage 2: Process Qualification
The demonstration phase of the process validation lifecycle occurs in Stage 2. Before moving to this phase there are several critical precursors. First, the facility and its supporting critical utilities must be in a state of control. Second, the equipment must be qualified, meaning the installation qualification, operational qualification and performance qualification are all complete. Finally, the in-process and release methods used for testing must be validated, and their accuracy and precision well understood in terms of the final control space being evaluated. These steps are essential to ensure that the unknown variability being evaluated is attributable to the process alone.
The new guidance introduces a new term, Process Performance Qualification (PPQ), in lieu of "process validation" for process demonstration. The PPQ is intended to subsume all of the known variability in the manufacturing process and demonstrate that process predictability is sufficient to ensure the product performs as claimed. The big departure from past process validation approaches is that the cumulative understanding from Stages 1 and 2 drives the decision that the process is predictable. The rigor applied in Stage 1 will dictate the level of characterization, sampling and testing required in Stage 2; dedicated focus in Stage 1 will result in reduced Stage 2 cost and timeline impact.
The PPQ exercise focuses on demonstrating process control. Data from platform formulations and unit operations can be used to manage the risk moving forward and to establish the level of characterization required in the PPQ protocol. Consequently, the old rule of "three lots and we are done" goes out the window. For simple processes with a low risk of process excursion, e.g. high-loaded-dose, direct-blend formulations, the PPQ may be three lots or fewer. For complex processes, e.g. low-dose controlled-release spray-drying processes or mammalian cell processing, the number of demonstration lots will likely be higher. Old paradigms supplied by FDA guidance for such things as media fills in aseptic validation will now require a risk-based statistical justification based upon lot size and risk tolerance. The PPQ will challenge the process control space, which represents the recommended manufacturing limits for the process. The control limits are typically established by moving away from the boundary limits of the design space, selecting parameter limits within the design space that ensure process predictability away from the edge of failure for each KPIV.
There are no sacrosanct evaluation parameters for demonstrating a successful PPQ. Process capability is a fundamental metric that can be used to compare process variability and process centering against the allowable specifications. It can be used to justify AQL or LTPD sampling levels at the commercial stage, which can yield substantial ongoing cost savings. If desired, this information will also support any PAT strategy the site may have for the process downstream.
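As a sketch of the process capability calculation, assuming made-up assay data and specification limits:

```python
from statistics import mean, stdev

# Hypothetical assay results (% label claim) and specification limits
data = [99.0, 100.0, 101.0, 100.0, 99.0, 101.0, 100.0, 100.0]
lsl, usl = 94.0, 106.0

mu, sigma = mean(data), stdev(data)

# Cp compares the spec window to the process spread;
# Cpk also penalizes off-center processes.
cp = (usl - lsl) / (6 * sigma)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)
```

For a perfectly centered process Cp equals Cpk; a Cpk well above 1 indicates the process comfortably fits within specification, which is the kind of evidence that can justify reduced commercial sampling.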
A good practice at the end of the PPQ is to go back to the risk management evaluation and demonstrate that the process risk elements identified at the outset of Stage 1 have been mitigated.  This data will be the basis of managing continued improvement on the process via the change control system.
Stage 3: Continued Process Verification
The goal of the third validation stage is continual assurance that the process remains in a state of control (the validated state) during commercial manufacture. The FDA is looking for a monitoring program capable of detecting gradual or unplanned departures from the process as designed. Historically, we have used the product stability program, the change control process and the Annual Product Review as vehicles for monitoring and assessing process stability. The challenge with this approach has always been the limited resolution of these systems, which makes proactive intervention difficult when dealing with process drift. For this stage the agency is looking for a program that builds upon the process understanding acquired in Stages 1 and 2.
Stage 3 will require a monitoring program that balances sampling and testing costs against process understanding. A matrix approach to sampling, focused on intra- and inter-batch variation of the KPIVs and CTQs for the process, is one way to cost-effectively monitor commercial process stability. Statistical Process Control tools such as moving range and Xbar-R charts are also simple ways to evaluate whether the process is wandering unacceptably. It is important to run a data-gathering phase before establishing alert and action limits, since the commercial process will subsume the totality of variation from raw materials, the process, and the testing methods. This data should drive a statistical analysis against the process characterization and PV lot performance. Understanding the intent behind each analysis is essential to reaching the right conclusion. Statistical software packages such as Minitab and JMP can make the analysis simple and reproducible and can apply, as required, data evaluation criteria such as the Western Electric (often called "Westinghouse") run rules, which, when used to discriminate aberrant data from true process variability, can determine if further action is required. As areas of further study are identified, the risk management tools should be revisited to ensure the impact of the process variation is evaluated consistently.
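As an illustration, the sketch below implements rule 1 of the Western Electric run rules (a point beyond three sigma) on an individuals chart, estimating sigma from the average moving range. The readings are fabricated for the example:

```python
from statistics import mean

def rule1_violations(xs):
    """Indices of points beyond mean +/- 3 sigma (Western Electric rule 1),
    with sigma estimated as MRbar / d2, where d2 = 1.128 for subgroups of 2."""
    mr = [abs(b - a) for a, b in zip(xs, xs[1:])]  # moving ranges
    sigma = (sum(mr) / len(mr)) / 1.128
    center = mean(xs)
    ucl, lcl = center + 3 * sigma, center - 3 * sigma
    return [i for i, x in enumerate(xs) if x > ucl or x < lcl]

# Fabricated in-process readings with one aberrant point
readings = [10.0, 10.2, 9.9, 10.1, 10.0, 9.8, 10.1, 13.0, 10.0, 10.1]
```

Here `rule1_violations(readings)` flags only the aberrant eighth reading; in practice the remaining run rules (e.g. two of three points beyond two sigma on the same side) would be layered on in the same fashion.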
Knowledge Management
Underlying the new guidance is the need to proactively establish a system for knowledge management. This means ensuring all parties involved in the development, analysis and evaluation of the data and process have a solid understanding of past performance and its implications for process stability and product performance. Consolidating the information in a central document or repository will ensure continuity of learning and will allow continuous improvement and CAPA activities to build upon the best practices of the past.
Quality Management System (QMS)
The largest paradigm shift within the new guidance is in the Quality function. Moving away from a product-centric QMS requires that Quality be intimately involved in the evaluation and decision-making criteria as the process moves through each stage. It will require a heightened level of scrutiny to make sure all supportive elements are in place. For example, ensuring critical monitoring systems are calibrated raises the question: "Is it single-point or three-point calibration?" Method capability will focus on accuracy, precision and interference points. Ensuring that controlling and measurement tools are capable will become the foundation for managing the QMS, rather than the QMS procedures and documentation audit trail. To facilitate both the knowledge management and QMS paradigm shifts, a milestone or stage-gate approach to process validation is an effective way to ensure all key stakeholders and decision makers remain on board with the new process-centric philosophy. An example of one possible approach is shown in Figure 2.
Figure 2: New Process Validation Stage Gate Approach
Conclusion

The new Process Validation guidance represents a dramatic shift from the 1987 FDA guidance issued to industry. While less prescriptive, it provides a sufficiently descriptive framework for industry to create a scientifically driven approach to demonstrating process predictability. There is no single answer to this guidance, and a structured roadmap, with clearly defined deliverables at each milestone, will ensure that the philosophical and technical components required to demonstrate process predictability are applied in a uniform fashion across the organization. In addition, this uniform approach to process validation will allow the organization to reap the benefits of a more focused validation effort, potentially reducing the cost of Stage 2 PPQ and resulting in products and processes that are both stable and predictable.
1.  FDA Guidance for Industry, Process Validation: General Principles and Practices, January 2011
2. The Westinghouse Rules for Identifying Aberrant Observations in the Statistical Quality Control Handbook, 1984, Section 1B

Tuesday, 4 February 2014


1.0        PURPOSE                                                     

1.1     This procedure defines how Quality System documents, data and records, including documents of external origin, are controlled. The Document and Record Control System explains how all documents are developed, issued and controlled: how documents are prepared, reviewed, approved and issued, and how changes to documents are controlled.

2.0        SCOPE

2.1         The Document and Record Control System applies to all departments of the company within the scope of the ISO 9001:2000 Quality System.

3.0        PROCEDURE

3.1        Document Control
3.1.1  Document and Data Approval and Issue:
·        The Company Quality Manual and Company Quality Procedures are printed on Quality System Paper. Quality System Paper bears the coloured company logo, and its footer reads "If the Logo on This Paper Appears Black It Is an Uncontrolled Copy, If in Doubt, Please Check". This automatically declares photocopies of the ISO 9000 system documents to be uncontrolled copies, since the logo turns black when photocopied.
·        All other documents are printed on plain paper and are controlled by stamping each page of the document with a green "CONTROLLED" stamp.
·        The Management Representative maintains a master copy of all controlled documents, including documents of external origin.
·        Quality System Paper is stored under lock and key.
·        The Management Representative ensures that appropriate documents are available at all locations where operations essential to the effective functioning of the Quality System are performed, and that invalid and/or obsolete documents are promptly removed from all such locations.

3.1.2  Document Creation / Change Request (DCCR):  Any employee may request a change to a controlled document, or request a new document, using the Document Creation / Change Request Form. All requests are submitted to the Management Representative, who reviews and authorizes them. If the suggested change or creation is agreed by the approving authority, the Management Representative incorporates the required change into the document. When a controlled document is changed, the related documents (e.g. the Document Master List, company quality plans) are also updated by the Management Representative.

3.1.3  Document Preparation:
                     The Management Representative prepares documents and assigns each a document number, by which it is identified, using the Document Master List. The issuing, approving and reviewing authorities shall be as given in the Master List. The syntax of the reference numbers for each kind of document is as follows:

Company Quality Manual

Company Quality Procedures

Job Description

Standard Operating Procedure

Standard Analytical Procedure Raw Materials

Manual for Raw Material Specification

Standard Manufacturing Procedure

Standard Analytical Procedure for Products

Product Specification

Packing Material Specification

Quality Plans

Manual for Preparation & Standardization of Reagent

Company Quality Records

The Document Master List identifies the current revision status of each document. It is maintained to control the current revision number and the distribution of documents, and it describes the reviewing and approving authority for each document.

Document Status Block: a block appearing at the top of each page of a document to control its issuance and validity. The Issue Status in the Document Status Block represents the current issue of the document. In the Quality Control department, a Review Date column is also added to all documents. To ensure that the pertinent issues of appropriate documents are available at all locations where they were originally issued, the designation to whom the document was originally issued is printed in the Document Status Block.

3.1.4  Approval and Issuance of Documents:  Quality System documents are controlled and issued by the Management Representative, who retains a copy of every current controlled document. The Management Representative prints the author's name in the Document Control Block; the author shall be technically competent and fully informed about the activity or process being described. The Reviewing Authority ("Reviewed By") for a document is responsible for its accuracy, relevance and language. The Approving Authority ("Approved By") is responsible for its accuracy, relevance, language and validity, and shall approve the document for adequacy prior to issue.

3.1.5  Distribution / Recall of Documents:  The Management Representative distributes documents to the personnel named in the Master List, preparing a Document Distribution Form and issuing documents against the recipient's signature on the Form. The Document Distribution Form is filed, designation-wise, with the MR copy and maintained by the Management Representative. The Management Representative recalls obsolete documents (documents of a previous revision) by reference to the Master List, using the Document Recall Form, and collects the previously issued (obsolete) documents against the recipient's signature on the Form. After recall, the Management Representative marks one copy of the document "Obsolete" and files it in the OBSOLETE DOCUMENT FILE; the particulars of obsolete documents are recorded in the List of Obsolete Documents.

3.2     Control Of Records

3.2.1  Company records are identified by a Quality Record number (QR No.). The syntax of the QR No. is as follows:
          QR / Department Name / Sequence No / Issue Status
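Purely as an illustration of the syntax above, a small helper could compose a QR number from its fields; the zero-padding widths are assumptions for the example, not specified by this procedure:

```python
def qr_number(department, sequence, issue):
    """Compose a Quality Record number per the syntax
    QR / Department Name / Sequence No / Issue Status.
    Two-digit zero-padding is an illustrative assumption."""
    return f"QR / {department} / {sequence:02d} / {issue:02d}"
```

For example, `qr_number("QC", 7, 1)` yields a record number for the Quality Control department with sequence 7 at issue 1.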

3.2.2    Management Representative is overall responsible for the maintenance of company quality records in all Departments.

3.2.3    The Management Representative maintains a List of Company Quality Records, which describes:

·                     Serial No.
·                     Department Name
·                     QR No.
·                     QR Description
·                     QR Type
·                     Application
·                     Issue Status
·                     Issue Date
·                     Retention Time

3.2.4    The Management Representative maintains a folder containing the current formats of the Company Quality Records, together with the List of Company Quality Records. Any addition or deletion of Company Quality Forms is processed according to this Procedure.
3.2.5    Records are labeled, indexed, and filed in date-wise sequence so that they are readily available.
3.2.6    The retention period of each Company Quality Record is stated in the List of Company Quality Records. After the retention period has passed, records may be removed from the file, or retained for a longer period if required.
3.2.7    Obsolete records are disposed of by appropriate means.
3.2.8    Records are properly stacked, segregated and stored so that they occupy minimal floor space and are protected from lubricants, dust, oil and other materials. Quality Records shall also be stored in a place or area where there is no risk of loss by fire or theft, or of access by unauthorized personnel.
3.2.9    Pertinent Subcontractor Quality Records: Records delivered by subcontractors form part of the Quality Records; they are filed in the appropriate departments' files.
Company Quality Records are submitted for evaluation to the customer, Regulatory Affairs, or government agencies whenever required, with the appropriate permission.

4.0     Quality Records / Forms:

4.1     The following Quality Records shall be generated and managed in accordance with this Procedure for Control of Documents & Quality Records.

Required Records (Form Reference No. as assigned in the Document Master List):

·        Data Back Up Log
·        Document Distribution Form
·        Document Generation / Change Form
·        Document Master List
·        Document Recall Form
·        Obsolete Document List
·        List of Company Quality Records