Qualification and Validation of Process Analytical Technologies (PAT)
Carl Anderson, Ph.D.
Duquesne University Center for Pharmaceutical Technology (DCPT)
PAT Conference, Philadelphia, PA, 24 March 2003
DCPT Activities
• PAT Method Development
• PAT Validation
• Pharmaceutical Formulation and Development
• Biotechnology Dosage Form Development
• Sensor Technology
  – NIR
  – Chemical Imaging
  – Acoustics
DCPT Assets
• Analytical equipment
  – NIR (FT, dispersive, AOTF, diode array)
  – Chemical imaging
• Laboratory equipment
  – HPLC, GC
  – SEM/EDX
  – Molecular spectroscopy
  – Mass spectrometry
  – Etc.
• Manufacturing equipment
  – 38-station tablet press
  – Fluidized-bed system
  – Blenders
  – Granulators
  – Mills
  – Spray dryer
  – Roller compactor
  – Etc.
Outline
• General perspective
  – Qualification
  – Validation
• Qualification of PAT-related sensors
• Validation of PAT methods
• Implementation of PAT
  – Immediate
  – Long-term
The “Pharm House”
[Diagram: the pharmaceutical business sits on top of cGMP, which in turn is built on validation and qualification, the topics of this session.]
This Session’s Topic
• PAT are systems for continuous analysis and control
• Qualification of PAT
  – Analytical instrumentation
  – Sensors
• Validation of PAT
  – Analytical methods
  – Control point inputs
• Qualification and validation are critical for the successful implementation of PAT
  – Identical, in principle, to current practices
This Session’s Topic – Elements of PAT
• Process analytical chemistry tools (sensors)
• Information management tools
• Process control strategies
• Product/process design
• Product/process optimization
(From Ajaz Hussain, opening remarks for the process analytical subcommittee meeting, 12 June 2002)
Qualification/Validation Theme
[Diagram: qualification and validation.]
Qualification – General
• Installation and operational qualification (IQ/OQ)
  – Verification that an analytical system is capable of performing a given measurement
• Performance qualification (PQ)
  – Demonstration that an analytical system performed as required during the course of a given measurement
  – System suitability requirements
Qualification of PAT Equipment for Off-line/At-line Measurements
• Rapid at-line or laboratory measurements
  – Rapid biological methods (for example, cytometry, bioluminescence)
  – Spectroscopic absorbance/transmittance methods (for example, NIR, IR)
  – Spectroscopic scattering methods (for example, low-angle laser light scattering, Raman)
• Questions and challenges for qualification
  – Upon what scientific principles does the technique operate?
  – How can those principles and that functionality be challenged?
  – What other reasonable factors might interfere with performance?
PAT Qualification Example – NIR
• Example
  – Qualification of an FT-NIR transmission system
  – Off-line quantitative determinations (e.g., water, API, hardness)
• Starting point – USP general chapter <1119>
  – ONLY a starting point
    • More testing may be required
    • Some tests may be omitted
USP <1119> Qualification
• IQ
  – Installation of hardware and software
  – Typically to manufacturer’s specifications
• OQ
  – Testing to verify the instrument’s capability of performing a measurement
  – Includes instrument “self-tests”
  – USP recommends external standards
  – Performed at installation and following major maintenance
• PQ
  – Verification that the instrument is operating correctly at the time measurements are performed
  – Typically a subset of the OQ testing
  – Performed regularly (frequency determined by method demands)
USP <1119> Qualification
• Wavelength uncertainty
  – SRM 1920a is inappropriate for a transmission instrument (it is a reflectance standard)
  – SRM 2035 is appropriate
• Photometric linearity
  – Reflectance standards are not appropriate
  – Neutral-density transmission filters are available
• Spectrophotometric noise (see the sketch below)
  – High-flux noise
  – Low-flux noise
  – Similar considerations as for photometric linearity
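To show how high-flux and low-flux spectrophotometric noise might be summarized during OQ or PQ, here is a minimal sketch that computes the RMS of the difference between two successive scans of the same reference. The reference materials, scan data, and any acceptance limits are assumptions for illustration, not USP <1119> values.

```python
import numpy as np

def rms_noise(scan_a, scan_b):
    """RMS of the point-by-point difference between two successive
    scans of the same reference, a common noise summary statistic."""
    diff = np.asarray(scan_a) - np.asarray(scan_b)
    return float(np.sqrt(np.mean(diff ** 2)))

# Hypothetical absorbance scans: a high-reflectance reference (high flux)
# and a strongly absorbing reference (low flux), two scans each.
rng = np.random.default_rng(0)
high_flux_scans = [rng.normal(0.02, 1e-4, 500) for _ in range(2)]
low_flux_scans = [rng.normal(1.00, 5e-4, 500) for _ in range(2)]

print("high-flux RMS noise:", rms_noise(*high_flux_scans))
print("low-flux RMS noise:", rms_noise(*low_flux_scans))
```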
Specifications Listed in USP <1119>
• Chapters in the 1000 series are taken as recommendations
  – It is the responsibility of the user to verify that testing is adequate
  – FDA is not legally bound to accept these USP tests
• Table 1
  – Note that the title reads “Recommended Near-IR Instrument Specifications”
  – Series of minimum acceptance criteria
  – User must justify appropriate tests and criteria
• The most important test is “suitable for intended use”
Examples of OQ Testing
• Contrast two instruments
  – High-resolution FT-NIR
  – Low-resolution diode-array instrument
• Wavelength uncertainty test (USP <1119>)
  – Table 1 specifications
• Two issues
  – Wavelength stability
  – Wavelength accuracy
SRM 1920a Recommended Acceptance Criteria

  Criteria in nm    Criteria in cm-1
  1261 ± 1          7930 ± 8
  1681 ± 1          5949 ± 4
  1935 ± 1.5        5168 ± 4
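As a sketch of how an OQ wavelength-accuracy check against the tolerances in the table above might be automated, the following compares measured SRM 1920a peak positions with the nominal values; the measured positions are invented for illustration.

```python
# Acceptance criteria from the table above: nominal peak (nm) -> tolerance (nm).
criteria_nm = {1261: 1.0, 1681: 1.0, 1935: 1.5}

# Hypothetical peak positions located in a measured SRM 1920a spectrum.
measured_nm = {1261: 1261.4, 1681: 1680.7, 1935: 1936.8}

for nominal, tol in criteria_nm.items():
    found = measured_nm[nominal]
    status = "PASS" if abs(found - nominal) <= tol else "FAIL"
    print(f"{nominal} nm peak found at {found:.1f} nm (tolerance ±{tol} nm): {status}")
```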
OQ Testing of Two Instruments
[Figure: overlaid log(1/R) spectra from the FT-NIR and AOTF-NIR instruments plotted against wavelength (nm), shown over the full 1100–2100 nm range and in a 1175–1275 nm detail view.]
OQ Questions
• What is the intended purpose of the instrument?
  – Type of method (quantitative, qualitative, whole spectrum, peak ratio, etc.)
  – Spectral features to be resolved
  – Dynamic range (log(1/R))
  – Necessary data collection rates
  – Environmental conditions
Relationship of Qualification and Validation
• Functional relationship
  – The requirements of an analytical method dictate the performance criteria of an analytical system
• Chronological relationship
  – A minimum level of qualification of instrumentation should be performed prior to attempting method development: does the instrument work?
  – Qualification of instrumentation is required prior to collection of data in support of method validation
PAT Qualification/Validation Flow
[Flow diagram, also presented as “Relationship of Qualification and Validation”: process/product risk assessment and the PAT need/opportunity lead to analytical method requirements, technology selection, and analytical system (AS) specifications; IQ, establishment of criteria, OQ, and PQ follow chronologically, supporting method development, method validation, and implementation. Feedback loops return to earlier steps when requirements are not met or AS performance is inadequate; the flow proceeds only when AS performance is adequate.]
Validation Strategies for PAT
• Definition of validation: “Establish documented evidence which provides a high degree of assurance that a specific process will consistently produce a product meeting pre-determined specifications and quality attributes”
• Traditional validation approach
  – USP <1225> [1]
  – ICH Q2A and Q2B [2]
• Non-traditional (method-specific)
  – Example: rapid microbial methods (RMM)
    • PDA Technical Report 33, Evaluation, Validation and Implementation of New Microbiological Testing Methods
    • USP <1223> [3]
[1] USP 25 <1225>, p. 2256, 2002
[2] Available at: http://www.ich.org/ich5q.html
[3] PF V29(1), p. 256, 2003
USP Guidelines for Validation of NIR and RMM Methods
• A need to bridge the gap between the traditional concept of pharmacopeial testing and the application needs of a dynamic industry
• Striving to meet high-value, quality- and compliance-driven objectives in a cost-effective manner
• Designed to encompass a broad range of potential applications/users
Validation Parameters
• Specificity
• Linearity
• Range
• Accuracy
• Precision
• Robustness
• Other criteria?
Acceptance criteria for these parameters MUST be consistent with the intended use of the method.
NIR Validation Requirements for Qualitative and Quantitative Methods

  NIR Validation Parameter             Qualitative    Quantitative
  Specificity                          +              +
  Linearity                            -              +
  Range                                -              +
  Accuracy                             -              +
  Precision: Repeatability             +              +
  Precision: Intermediate Precision    +              +
  Robustness                           +              +
Specificity
• Definition
  – The ability to assess unequivocally the analyte in the presence of components that may be expected to be present; typically these might include impurities, degradants, matrix, etc.
• Applied to NIR
  – The extent of specificity testing is dependent on the application and the risks being controlled
  – The specificity of a given NIR method is enhanced by other associated analytical procedures
Examples Demonstrating Specificity of a NIR Qualitative Method
• Demonstration of positive ID testing (sensitivity)
  – Samples of materials represented in the library, but not used to create it
  – Positive identification results when analyzed
• Demonstration of negative ID testing (selectivity)
  – Materials received on site that are similar to library members visually, chemically, and/or by name
  – Negative identification results when analyzed
(A minimal identification sketch follows below.)
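The sketch below illustrates one common way a library-based positive/negative identification check of this kind can be implemented: spectral correlation against library reference spectra with a pass/fail threshold. The library contents, threshold, and spectra are all assumptions for illustration, not a prescribed algorithm.

```python
import numpy as np

def correlation_id(sample, library, threshold=0.95):
    """Return the best-matching library entry, or None if no entry
    exceeds the correlation threshold (negative identification)."""
    best_name, best_r = None, -1.0
    for name, ref in library.items():
        r = np.corrcoef(sample, ref)[0, 1]
        if r > best_r:
            best_name, best_r = name, r
    return (best_name, best_r) if best_r >= threshold else (None, best_r)

# Hypothetical library and test spectra (pretreated, vectorized).
rng = np.random.default_rng(1)
lactose = rng.normal(size=200)
mcc = rng.normal(size=200)
library = {"lactose": lactose, "microcrystalline cellulose": mcc}

positive = lactose + rng.normal(scale=0.05, size=200)   # same material, new lot
negative = rng.normal(size=200)                         # chemically different material
print(correlation_id(positive, library))  # expect a positive ID
print(correlation_id(negative, library))  # expect (None, low correlation)
```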
Examples Demonstrating Specificity of a NIR Quantitative Method
• Demonstrate correlation between analyte signal and analysis
  – Wavelengths used in the calibration model can be compared to known bands of the analyte of interest and those of the matrix to verify that the bands of the analyte of interest are being used
  – Wavelengths and factors/regression coefficients (partial least squares regression) may be explained using the actual spectroscopic information from the analyte of interest
• Variations in matrix concentrations should not affect the quantitative measurements within the specified range
• Qualitative assessment of spectral data prior to data analysis
(A sketch of the loading-weight comparison follows below.)
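The loading-weight comparison described above can be made numerically. This sketch, built entirely on simulated data and an assumed two-band analyte, correlates the first PLS loading weight with the second-derivative spectrum of the pure analyte to check that the model is driven by analyte bands rather than matrix features.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)
wavelengths = np.linspace(1100, 2100, 500)

# Hypothetical pure-analyte and matrix spectra: Gaussian bands on the NIR axis.
analyte = (np.exp(-((wavelengths - 1450) / 20) ** 2)
           + 0.6 * np.exp(-((wavelengths - 1930) / 25) ** 2))
matrix = np.exp(-((wavelengths - 1700) / 60) ** 2)

# Simulated calibration set: varying analyte level in a nearly fixed matrix, plus noise.
conc = rng.uniform(0.5, 1.5, 40)
X = np.outer(conc, analyte) + np.outer(rng.uniform(0.9, 1.1, 40), matrix)
X += rng.normal(scale=0.002, size=X.shape)

# Second-derivative pretreatment (Savitzky-Golay), as in the figures that follow.
X_d2 = savgol_filter(X, window_length=15, polyorder=2, deriv=2, axis=1)
analyte_d2 = savgol_filter(analyte, window_length=15, polyorder=2, deriv=2)

pls = PLSRegression(n_components=2).fit(X_d2, conc)
w1 = pls.x_weights_[:, 0]

# Specificity evidence: the first loading weight should track the analyte bands.
r = np.corrcoef(w1, analyte_d2)[0, 1]
print(f"|correlation| of first loading weight with analyte 2nd derivative: {abs(r):.3f}")
```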
Specificity Example – Quantitative NIR Method
[Figure: second-derivative NIR spectrum of the analyte; second-derivative value (A.U.) vs. wavelength, 1100–2100 nm.]
Specificity Example – Quantitative NIR Method
[Figure: second-derivative NIR spectrum of the analyte (scaled) overlaid with the first PLS loading weight; second-derivative value (A.U.) vs. wavelength, 1100–2100 nm.]
Linearity
• Definition
  – Linearity is the ability (within a given range) to obtain test results that are directly proportional to the concentration of the analyte in the sample
• Applied to NIR
  – Demonstrate the correlation of NIR response to samples distributed throughout the defined range of the calibration model
Linearity
• Demonstration of linearity in NIR methods (see the sketch below)
  – Slope
  – y-intercept (bias), with a plot of the data
  – Plot of data and residuals
  – Auxiliary statistics from regression
    • Standard error of estimate (SEE)
    • Multiple correlation coefficient (R)
    • F for regression
    • Student’s t for the coefficients
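A sketch of how the regression diagnostics listed above might be computed for a set of NIR predictions versus reference values; the data are hypothetical, and the F statistic is derived from the slope t statistic, which is equivalent for a one-predictor regression.

```python
import numpy as np
from scipy import stats

# Hypothetical NIR results vs. reference-method values (e.g., % w/w water).
reference = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0])
nir = np.array([1.05, 1.48, 2.10, 2.46, 3.02, 3.55, 3.96, 4.52, 4.97])

res = stats.linregress(reference, nir)
predicted = res.intercept + res.slope * reference
residuals = nir - predicted

n = len(reference)
see = np.sqrt(np.sum(residuals ** 2) / (n - 2))   # standard error of estimate
t_slope = res.slope / res.stderr                  # Student's t for the slope
f_regression = t_slope ** 2                       # F for a one-predictor regression

print(f"slope = {res.slope:.3f}, intercept (bias) = {res.intercept:.3f}")
print(f"R = {res.rvalue:.4f}, SEE = {see:.3f}, F = {f_regression:.1f}")
print("residuals:", np.round(residuals, 3))
```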
Linearity
• R² is not a true measure of linearity in multivariate methods
• BUT it is a measure of the fraction of variability modeled by the equation
• R² is dependent on the standard error of the calibration equation (reference method) and the range of the calibration data
Range
• Definition
  – The interval between the upper and lower concentration of the analyte in the sample for which it has been demonstrated that the analytical procedure has a suitable level of accuracy, precision, and linearity
• Applied to NIR
  – The range of analyte values defines the range of the NIR method
  – Results outside the validated range are not valid (see the sketch below)
• Process consideration
  – It may not be possible/desirable to extend the validated range to cover the product specification/expected variability
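As a simple illustration of the rule above that results outside the validated range are not valid, here is a sketch of a range check; the validated range and the result values are assumptions.

```python
# Assumed validated range, e.g. % of label claim demonstrated during validation.
VALIDATED_RANGE = (90.0, 110.0)

def check_range(result, low=VALIDATED_RANGE[0], high=VALIDATED_RANGE[1]):
    """Flag NIR results that fall outside the validated range; such results
    cannot be reported from the NIR method alone."""
    return "valid" if low <= result <= high else "outside validated range"

for value in (95.2, 104.8, 112.3):
    print(value, "->", check_range(value))
```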
Validation of Range
[Diagram: results plotted against batches produced, showing the manufacturing target, the typical range of results, the specification range, and the wider desired calibration range.]
Accuracy
• Definition
  – Expresses the closeness of agreement between the value that is accepted, either as a conventional true value or an accepted reference value, and the value found
• Applied to NIR
  – Demonstrated by correlation of NIR results with analytical reference data
Accuracy
• Examples (see the sketch below)
  – Comparison of the standard error of prediction (SEP) to the standard error of the reference method used for validation (SEL)
  – Several statistical methods to compare NIR results with reference values
    • Paired t-test
    • Statistical bias
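A sketch of the statistical comparisons listed above, using hypothetical validation data; the SEL value and the definition of SEP as the bias-corrected standard deviation of residuals are assumptions for illustration.

```python
import numpy as np
from scipy import stats

# Hypothetical validation set: NIR predictions vs. reference-method results.
reference = np.array([98.2, 99.5, 100.1, 101.3, 100.8, 99.0, 100.4, 101.0])
nir = np.array([98.5, 99.3, 100.4, 101.1, 101.0, 99.2, 100.2, 101.4])

residuals = nir - reference
bias = residuals.mean()                 # statistical bias
sep = residuals.std(ddof=1)             # bias-corrected standard error of prediction
sel = 0.4                               # assumed standard error of the reference method

t_stat, p_value = stats.ttest_rel(nir, reference)   # paired t-test for bias
print(f"bias = {bias:.3f}, SEP = {sep:.3f} (reference SEL = {sel})")
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```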
Precision (USP <1225>/ICH)
• ICH Q2A definition
  – Expresses the closeness of agreement (degree of scatter) between a series of measurements obtained from multiple sampling of the same homogeneous sample under the prescribed conditions. Precision may be considered at three levels:
    • Repeatability
    • Intermediate precision
    • Reproducibility
  – Expressed as the relative standard deviation of a series of measurements
Precision (USP <1119>)
• Applied to NIR – same as the definition
  – Only two levels are applicable to NIR
    • Repeatability
      – a number of replicates of the same sample presentation
      – multiple sample presentations
    • Intermediate precision
      – different analysts
      – different days
The precision of the NIR method should be equivalent to or better than that of the reference method used for validation (see the sketch below).
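A minimal sketch of how repeatability and intermediate-precision RSDs might be computed from replicate measurements; the data, replicate scheme, and the simple pooling used here are assumptions for illustration.

```python
import numpy as np

def rsd_percent(values):
    """Relative standard deviation (%) of a series of measurements."""
    values = np.asarray(values, dtype=float)
    return 100.0 * np.std(values, ddof=1) / np.mean(values)

# Hypothetical repeatability data: six measurements of one sample presentation.
repeatability = [99.8, 100.1, 99.9, 100.2, 100.0, 99.7]

# Hypothetical intermediate-precision data: two analysts on two days.
intermediate = {
    ("analyst A", "day 1"): [99.9, 100.0, 100.2],
    ("analyst B", "day 2"): [100.4, 100.1, 100.3],
}

print(f"repeatability RSD: {rsd_percent(repeatability):.2f}%")
pooled = [v for series in intermediate.values() for v in series]
print(f"intermediate precision RSD (pooled): {rsd_percent(pooled):.2f}%")
```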
Robustness
• Definition
  – The robustness of an analytical method is a measure of the procedure’s capacity to remain unaffected by small but deliberate variations in method parameters, and provides an indication of its reliability during normal usage
• Applied to NIR
  – Same as the definition, but the variations in the method are dependent on the application and sampling technique
Robustness – Typical Challenges
• Environmental conditions
  – temperature
  – humidity
• Influence of instrument changes
  – lamp
  – warm-up time
• Sample temperature
• Sample handling
  – probe depth
  – compression of material
  – sample depth/thickness
  – sample position
(A sketch of how such challenge data might be summarized follows below.)
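One way robustness-challenge results of this kind might be summarized is shown in the sketch below: prediction errors collected under each deliberately varied condition are reduced to a bias and a scatter figure, and large shifts flag a lack of robustness. The conditions and error values are hypothetical.

```python
import numpy as np

# Hypothetical robustness challenge: prediction errors (NIR - reference)
# collected under deliberately varied conditions.
challenges = {
    "sample temperature 20 C": [0.1, -0.2, 0.0, 0.1],
    "sample temperature 30 C": [0.3, 0.2, 0.4, 0.3],
    "lamp replaced":           [0.0, 0.1, -0.1, 0.0],
    "shallow probe depth":     [-0.1, 0.0, 0.1, -0.2],
}

# Summarize bias and scatter per challenge condition.
for condition, errors in challenges.items():
    errors = np.asarray(errors)
    print(f"{condition:26s} bias = {errors.mean():+.2f}  SD = {errors.std(ddof=1):.2f}")
```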
Periodic Model Evaluation
• NIR methods require periodic performance evaluation
• Continuing acceptable performance of the method MUST be demonstrated
  – accuracy monitoring
  – periodic monitoring (parallel testing; see the sketch below)
• If unacceptable performance is indicated, corrective action will be necessary
  – investigation into the cause of the discrepancy
  – maintenance of the calibration model
  – re-validation of the method
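A sketch of how periodic parallel testing might be evaluated: NIR results are compared against reference results collected in parallel, and the bias and SEP are checked against limits. The limits and data here are assumptions; in practice they would be justified from the validation data.

```python
import numpy as np

def parallel_test(nir_results, reference_results, bias_limit=0.5, sep_limit=0.8):
    """Compare periodic NIR results against parallel reference testing and
    report whether corrective action appears necessary (assumed limits)."""
    residuals = np.asarray(nir_results) - np.asarray(reference_results)
    bias = residuals.mean()
    sep = residuals.std(ddof=1)
    ok = abs(bias) <= bias_limit and sep <= sep_limit
    return bias, sep, ("acceptable" if ok else "corrective action required")

# Hypothetical quarterly monitoring data.
nir = [100.2, 99.8, 101.0, 100.5, 99.6]
ref = [100.0, 99.9, 100.4, 100.3, 99.8]
print(parallel_test(nir, ref))
```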
Qualitative Revalidation May Be Necessary If…
• Addition of a new material to the library
• Changes in the physical properties of the material
• Changes in the source of supply
• Coverage of a wider range of characteristics of a material
Quantitative Revalidation May Be Necessary If…
• Changes in the composition of the material
• Changes in the manufacturing process
• Changes in the sources/grades of raw ingredients
• Changes in the analytical procedure or NIR instrumentation
Model Transfer
• The calibration should not be transferred to another instrument unless procedures and criteria are applied to demonstrate that the model remains valid on the second instrument (see the sketch below)
• Electronic calibration transfer is only recommended to another instrument of the same type and configuration (or when the relationship between the two instruments, and the effect of that relationship on the calibration, is well known)
• Ongoing method evaluation is a model transfer exercise
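A sketch of one possible model-transfer verification, under assumed acceptance limits: the same check samples are predicted with the same model on the primary and the second instrument, and the bias and SEP on the second instrument are compared against criteria tied to the primary-instrument performance.

```python
import numpy as np

def transfer_check(pred_primary, pred_secondary, reference,
                   bias_limit=0.5, sep_ratio_limit=1.3):
    """Compare model performance on a second instrument against the primary
    instrument using a common check set; limits are assumptions."""
    reference = np.asarray(reference)
    sep_primary = np.std(np.asarray(pred_primary) - reference, ddof=1)
    resid_2 = np.asarray(pred_secondary) - reference
    bias_2 = resid_2.mean()
    sep_2 = resid_2.std(ddof=1)
    ok = abs(bias_2) <= bias_limit and sep_2 <= sep_ratio_limit * sep_primary
    return bias_2, sep_2, ("model remains valid" if ok else "transfer not demonstrated")

# Hypothetical check-set results (reference values and predictions).
ref = [1.8, 2.4, 3.1, 3.9, 4.6]
primary = [1.9, 2.3, 3.2, 3.8, 4.7]
secondary = [2.0, 2.6, 3.3, 4.1, 4.9]
print(transfer_check(primary, secondary, ref))
```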
PAT Potential Realized
• Low – Replace lab methods (1:1) with faster lab methods
• Medium – Monitor isolated unit operations
• High – Apply PAT retroactively to an existing process; find and monitor primary critical control points (PCCP)
• Highest – Design the process and PAT together
Potential PAT Impact – Implementation Perspective
(Levels of application ordered from high to low PAT potential realized; for each level: implementation perspective, driving force, committed personnel, and required support base.)

• High: Design the process and PAT together
  – Implementation perspective: optimize process capabilities
  – Driving force: where do sensors maximize process capabilities?
  – Committed personnel: R&D, Regulatory, Manufacturing, Quality
  – Required support base: corporate-level commitment to PAT ideas
• Apply PAT retroactively to an existing process
  – Implementation perspective: optimize process capabilities
  – Driving force: where do sensors mitigate maximum risk?
  – Committed personnel: Regulatory, Manufacturing, Quality
  – Required support base: product team
• Monitor isolated unit operations
  – Implementation perspective: PAT added to an existing process
  – Driving force: what can we do with on-line PAT sensors?
  – Committed personnel: Regulatory, Manufacturing
  – Required support base: department
• Low: Replace lab methods (1:1)
  – Implementation perspective: PAT added to an existing process
  – Driving force: what can we do with off-line PAT sensors?
  – Committed personnel: Regulatory, Quality
  – Required support base: department
A Word of Caution
• PAT is a powerful tool
  – Optimal use in many places for many applications
  – Should be applied with great care, based upon:
    • Risk-based application
    • Risk-based validation
    • Good science
A Tale of Two (PAT) Applications
• It was the best of times…
  – Proper validation of PAT methods is achievable
  – Well-validated methods will improve manufacturing efficiency and the safety of products
  – Application of PAT facilitates continuous manufacturing improvement
• It was the worst of times…
  – The validation of PAT must be done well
  – Poorly validated methods could lead to the failure of a PAT-based method
  – Early failures will quash the current favorable environment
Thank You
QUESTIONS?
Important Considerations and Relationships for PAT
• Business
• Science
• Regulatory