Statistical design of experiments (DOE) is a powerful tool when both prior knowledge and mechanistic understanding of a system are lacking, but it is remarkable that some view it as a one-size-fits-all basis for QbD. While DOE can yield clear (normally linear) fitted relationships between outputs and inputs over the studied experimental range, even the originators of the technique would not claim that it delivers process understanding. In its favour, it encourages a systematic, multivariate exploration of variables, a good step forward relative to the sometimes blinkered, random walk of one-variable-at-a-time experimentation. This is why both users and regulators view the technique as a core element in QbD.
The full picture, however, is that experiments must be combined with thinking (not used as a substitute for it) in order to develop genuine process understanding and apply the scientific method. Experiments without thinking should be a last resort, and that kind of QbD effort should be seen for what it is: a pale shadow of the genuine article.
We are fortunate that in many areas of the chemical and physical sciences, our own and previous generations have made great strides in providing a mechanistic understanding of the transformations we use to make pharmaceutical products. In API / drug substance, the field of chemical reaction engineering, from Arrhenius (1889) through Levenspiel (1957) and others up to the present day, has given us an understanding of how temperature, pressure and concentration affect chemical reaction rates. We understand much about how crystallization, filtration, centrifugation and drying work. We are admittedly weaker in areas such as dry milling, but progress is evident in that field also. In biologics, much is understood about biochemical kinetics. In drug product, much progress has been made in understanding solids mixing and segregation and related operations such as compaction and coating; we understand many aspects of dissolution, shelf life / stability and even pharmacokinetics and pharmacodynamics. At this year's AIChE meeting, which I attended in Philadelphia, several industrial and academic papers addressed these operations based on sound chemical engineering principles and mechanistic understanding.
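As a minimal illustration of the Arrhenius point, a few lines of Python show how strongly a modest temperature change moves a rate constant. The pre-exponential factor and activation energy below are hypothetical values chosen for illustration, not data for any particular reaction:

```python
import math

def arrhenius_rate(A, Ea, T):
    """Rate constant k = A * exp(-Ea / (R*T)), with Ea in J/mol, T in K."""
    R = 8.314  # gas constant, J/(mol*K)
    return A * math.exp(-Ea / (R * T))

# Hypothetical values: A = 1e10 1/s, Ea = 80 kJ/mol
k_25C = arrhenius_rate(1e10, 80e3, 298.15)
k_35C = arrhenius_rate(1e10, 80e3, 308.15)
ratio = k_35C / k_25C  # a 10 K rise roughly triples this rate constant
```

With these assumed parameters a 10 K increase nearly triples the rate, which is why temperature so often dominates factor rankings in kinetic studies.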
Should statistical DOE prove necessary to deliver QbD for a process: i) factor selection should be informed and justified by mechanistic arguments and prior knowledge; ii) an attempt should be made to provide a mechanistic explanation of the results; and iii) the scaling of the results to pilot or production scale should be guided by mechanistic thinking (e.g. scale-up using dimensionless groups, or a first-principles mathematical model). Lab DOEs should not simply be reproduced on scale; this is the antithesis of process understanding and represents a simplistic extension of the old trial-and-error ways that industry and regulators are trying to leave behind. I agree with a leading practitioner at AIChE who believes that in future the purpose of experiments on scale will be only to confirm the results of mechanistically based predictions.
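As a sketch of what mechanistically guided scale-up can look like, consider one common rule for geometrically similar stirred tanks: holding power per unit volume constant, which in the turbulent regime implies N^3 * D^2 = constant. The impeller diameters and speed below are invented for illustration only:

```python
def scaled_impeller_speed(N1, D1, D2):
    """Impeller speed on scale at constant power per unit volume,
    assuming turbulent flow and geometric similarity:
    P/V ~ N^3 * D^2  =>  N2 = N1 * (D1/D2)**(2/3)."""
    return N1 * (D1 / D2) ** (2.0 / 3.0)

# Hypothetical example: 300 rpm with a 0.05 m lab impeller,
# scaled to a 0.5 m plant impeller
N2 = scaled_impeller_speed(300.0, 0.05, 0.5)  # roughly 65 rpm
```

The point is not this particular correlation (other criteria, such as constant tip speed or constant blend time, give different answers) but that the scale-dependence is carried by a physical argument rather than by repeating the lab DOE at plant scale.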
The experiments in a statistical DOE differ sharply from those informed by mechanistic understanding. They are typically more numerous (since prior knowledge is set aside for fear of introducing bias); they focus on end-point results (rather than following the process by taking multiple samples over time); and they aim to maximise the process result (rather than the understanding of the process). As a result they can miss, or lead to misinterpretation of, important effects, especially those that will change on scale.
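For concreteness, the kind of fitted relationship a two-level factorial DOE delivers can be sketched in a few lines (the yields below are invented for illustration). Note that the output is a local linear map of end-point results in coded units, not a rate law or any other mechanistic statement:

```python
# Two-level full factorial in coded units: x1 (e.g. temperature),
# x2 (e.g. concentration). Runs in standard order: (-,-), (+,-), (-,+), (+,+)
y = [70.0, 82.0, 74.0, 90.0]  # hypothetical end-point yields, %

b0  = sum(y) / 4                           # grand mean
b1  = (-y[0] + y[1] - y[2] + y[3]) / 4     # x1 coefficient (half-effect)
b2  = (-y[0] - y[1] + y[2] + y[3]) / 4     # x2 coefficient
b12 = ( y[0] - y[1] - y[2] + y[3]) / 4     # x1*x2 interaction coefficient

def predict(x1, x2):
    """Fitted first-order-plus-interaction response surface."""
    return b0 + b1 * x1 + b2 * x2 + b12 * x1 * x2
```

The fit reproduces the corner results exactly and interpolates between them, but it says nothing about why the factors act as they do, and extrapolating it outside the studied range, or to a different scale, is unjustified.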
The preferred approach for QbD should not be to start with statistical DOE, or even to promote DOE as a primary tool, for the reasons given above. Rather, experiments where necessary should be designed to maximize their information content, so that a mechanistic understanding of the process can be obtained. That mechanistic understanding should be framed in the context of factors that are scale-independent, so that the resulting design space is scalable. This requires use of mechanistic thinking and sound principles of chemical engineering. A subsequent post will address experimental design for the common case of multi-phase synthesis reactions.
Monday, November 24, 2008
Thursday, November 6, 2008
Design space based on probability of success
As companies use models to an increasing extent for design space development, there is rightly an increased focus on model statistics and model parameter uncertainty.
John Peterson and colleagues from the statistics group at GSK have published a useful article on how to factor uncertainty into a DOE-based design space with multiple responses, consistent with ICH Q8. This paper, just published, is available at http://dx.doi.org/10.1080/10543400802278197 and highlights limitations of the commonly used procedure of overlapping average responses to define a design space.
The DynoChem team has been looking at much the same issue with first principles / mechanistic models, i.e. how to carve out a design space based on the probability of successful operation there. A posting with example calculations will be made here shortly.
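One way such a probability-based calculation can be sketched (using an invented first-order conversion model and an assumed normal uncertainty in the rate constant, not the DynoChem implementation) is a simple Monte Carlo estimate of the chance that a given operating point meets its specification:

```python
import math
import random

def prob_of_success(k_mean, k_sd, t, spec=0.95, n=20000, seed=1):
    """Monte Carlo estimate of the probability that conversion of a
    first-order reaction, x = 1 - exp(-k*t), meets the spec, given
    uncertainty in k (normal, truncated at zero). All values hypothetical."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        k = max(rng.gauss(k_mean, k_sd), 0.0)
        if 1.0 - math.exp(-k * t) >= spec:
            hits += 1
    return hits / n

# Hypothetical operating point: k ~ N(0.05, 0.01) 1/min, 90 min hold time
p = prob_of_success(0.05, 0.01, 90.0)  # roughly 0.95 with these assumed values
```

Repeating this over a grid of process conditions, and keeping only points where the probability exceeds a chosen threshold, carves out a smaller (and more honest) region than overlapping the average responses.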
Labels: CPP, CQA, Design Space, DOE, PQLI, Probability, QbD