CONTINUED PROCESS VERIFICATION OF LEGACY PRODUCTS IN THE BIOPHARMACEUTICAL INDUSTRY


© BioPhorum Operations Group Ltd

Main Authors

Bayer: Paul Wong

Biogen: Andre Walker

BMS: Marcus Boyer

Genentech/Roche: Parag Shah

Lonza: Rob Grassi

Merck (now Tunnell Consulting): Julia O’Neill

Pfizer: Carly Cox

Shire (now Biopharm Designs): Bert Frohlich

BPOG: Robin Payne

The authors wish to acknowledge the following for their contribution in reviewing this paper:

Bayer: Sharif Ahmed, Edgar Sur

Biogen: Andrew Lenz, Sarah Yuan

BMS: Syama Adhibhatta

EMD Serono: Christian Menzel

Janssen: Russ Moser

Merck: Julie Beeman, Susan Byerley-Ricks, Beth Junker, Walt Manger

Shire: Carrie Fraga, Florian Jantscher

Contents

1 Introduction   4

2 Integrating CPV with Legacy Quality Systems   7

3 Design Space Knowledge and Parameter Selection   9

4 Legacy Products and Variability across Multiple Sites   11

5 Discovery, Disclosure and Regulatory Action   13

6 Data Integration and Legacy IT Infrastructure   15

7 The Effort and Cost of Deploying CPV for Legacy Processes   17

8 Conclusions and Recommendations   19

Appendices   21

Appendix 1   21

Appendix 2 – The BPOG CPV Maturity Assessment Tool (CMAT) and Conclusions Arising from Team Discussion   22

9 References   28



1.0 Introduction

The BioPhorum Operations Group (BPOG), an industry-wide collaboration, published an industry position paper on Continued Process Verification (CPV) for biologics in 2014, which included a detailed example CPV plan for a single biologic molecule based on the process defined in “A-Mab: A Case Study in Bioprocess Development” [9]. Over 100 pages in length, “Continued Process Verification: An Industry Position Paper with Example Plan” [1] provided a detailed methodology for selecting process parameters from the overall control strategy, defining sampling plans, and analyzing the data that flows through the manufacturing process. It provided insight into leveraging Quality by Design (QbD) concepts to enable CPV for a single biologic product. Since then, a number of other papers have been published on the topic of CPV plan development and deployment [10, 12].

This paper discusses the challenges of implementing a CPV program in a real-world scenario, where an already-licensed biopharmaceutical product has a manufacturing history, is produced at more than one site, and/or where resources are limited, requiring leadership to make difficult choices of where to focus on CPV implementation efforts. It highlights several different challenges encountered by member companies of the BPOG on their CPV journeys, and suggests appropriate strategies.

Although not a formal guideline, it is hoped that this publication can offer an approach based on the collective experience of the BPOG member companies with existing commercial manufacturing operations. This paper represents the consensus view of the authors and reviewers but is not specifically representative of the internal procedures of any particular company. It is also assumed that the reader has a general understanding of the concepts behind CPV and its implementation. Several references, in particular [1, 4, 10, 12], provide comprehensive and in-depth guidance on how to develop a CPV plan and program for biopharmaceutical processes. These processes present some unique challenges, perhaps the most notable of which are:

• significantly less data tends to flow from manufacturing, making the application of statistical methods challenging.

• significant raw material variability can be present.

• product quality measures can be complicated and exhibit high variability.

Where applicable, the authors point out the specific challenges biologic products create in implementation of a CPV program, but in general the concepts presented will be applicable to small molecule and emerging therapeutic technologies as well.

Background

The FDA, EMA and ICH have issued guidance stating that manufacturers are responsible for identifying sources of variation affecting process performance and product quality. With the issuance of FDA’s 2011 guidance to the industry: “Process Validation: General Principles and Practices” [2], the implementation of systems to ensure continued quality assurance is, for all practical purposes, a GMP requirement. These comprehensive control strategies will be expected to extend to existing commercial operations. The FDA guidance states:

“Pharmaceutical companies should plan and execute a system for the monitoring of process performance and product quality to ensure a state of control is maintained. An effective monitoring system provides assurance of the continued capability of processes and controls to meet product quality and to identify areas for continual improvement.”

“Adherence to the CGMP requirements, specifically, the collection and evaluation of information and data about the performance of the process, will allow detection of undesired process variability. Evaluating the performance of the process identifies problems and determines whether action must be taken to correct, anticipate, and prevent problems so that the process remains in control [2, section 211.180(e)]. An ongoing program to collect and analyze product and process data that relate to product quality must be established [2, section 211.180(e)].”

Complying with this guidance from the FDA is commonly referred to as implementing Continued Process Verification (CPV).

The European Union’s Annex 15 [6] refers to “Ongoing Process Verification” as ‘documented evidence a process remains in a state of control during commercial manufacture’.

As such, Europe’s Ongoing Process Verification is equivalent to the FDA’s CPV.

ICH Q10 [3] states that manufacturers should:

“Identify sources of variation affecting process performance and product quality for potential continual improvement activities to reduce or control variation.”

There are a number of reasons why implementing CPV for legacy products may be complex:

• The process design for legacy products may be less rigorous than what would now be deemed acceptable for launching into a commercial phase.

• Older control strategies may be less rigorously developed and documented and require significant research to properly classify and choose the relevant process parameters to include in a CPV program.


• Definitions of terms may not be aligned with the new, increasingly harmonized language for monitoring methodologies, and alignment can require significant change to existing quality management systems.

• Process knowledge generated by the CPV program may be inconsistent with regulatory filings or commitments.

• Legacy data acquisition and IT infrastructure may not be capable of assembling the data required for a CPV program.

• Well-established legacy commercial products are likely to have multiple manufacturing sites with differing infrastructure, essentially requiring multiple and substantial CPV implementation projects for one legacy product.

• A complex regulatory history for a legacy product licensed in multiple jurisdictions can make implementing CPV a challenge and slow the implementation of process improvements identified in the CPV program.

While the effort to implement CPV for legacy products may be substantial, the potential benefits are numerous and of high value:

• Reduction of the number of unexpected process variations, emergency investigations, and product quality variability through accumulation of greater process understanding and proactive corrective action.

• More efficient identification, justification and implementation of process improvements.

• Ability to leverage data to potentially reduce regulatory approval requirements for process changes.

• More consistent methods for manufacturing monitoring and process control.

In most cases, legacy products and their production processes are associated with one or more legacy systems and infrastructure. Many of these older facilities are now being asked to accommodate newer processes. Thus, some of the recommendations in this paper may also apply to these circumstances.

Current status of CPV

Drawing from the collective experience of the BPOG member companies, implementation of CPV processes and systems is at various stages of development, implementation and system maturity. In general, most have progressed significantly towards having provisional CPV plans in place for their commercial products, at a minimum, and are developing those plans towards high aspirational levels of CPV maturity. However, most are not yet realizing the full benefits that a mature CPV system can deliver. Legacy products may have a long manufacturing history where data mining and analysis can result in a rapid accumulation of process understanding, providing the added value of a CPV program: going beyond compliance to superior process awareness and control.

As a measure of current status, a maturity assessment was carried out across the BPOG member companies, the results of which are briefly summarized here. A more detailed description of the methodology and the survey results is given in Appendix 1.

Five ‘dimensions’ of maturity were assessed in the survey: business processes, automation, implementation, business benefit, and people and roles, using word models with five levels against each dimension. As shown in Appendix 1, Figure 1, the survey was undertaken in 2014 and again in 2016. There was progress towards maturity in all five dimensions, whilst the relative levels of maturity remained consistent across the five dimensions. Business processes were seen to be maturing, with significant levels of implementation across the manufacturing networks of the participating companies. People and roles form a key element of any business process, of course, and a slight lag here suggests an area for focus into the future. Levels of automation lag significantly behind the goal of getting fundamental business processes in place; this is logical, as processes would ideally be very clear before they are automated. Companies are aware they would ideally be measuring the benefits of introducing CPV systems, but maturity is relatively low on this dimension. Perhaps the ability to measure benefits will improve quickly where automation is being introduced.

Overall, the BPOG team recognized compliance as the key driver initially, with a number of other benefits coming to the fore as CPV programs are developed. These additional benefits arise from getting better at managing manufacturing processes. Explicitly they are: a reduction in observations from regulatory inspections, a reduction in investigation times, and a reduction in lost batches. It has proven difficult to quantify these benefits to date, as Business Benefits Management is currently one of the least mature aspects of CPV, but early indications suggest BPOG member companies are experiencing reductions of around 20% in each of these benefit areas. It is an aspiration that the means of assessing these benefits will improve and that levels of benefit will reach 30-40% in each area.

The team also shared experiences of regulatory interactions in confidence, as a measure of the level of importance regulators place on the topic of CPV. In about 30% of inspections, regulators have asked questions regarding the status of CPV. Typically, the questions are related to the business processes and methods by which CQAs and CPPs are identified, and how signals are generated and responded to. Another area of interest has been the linkage between CPV programs and Annual Product Quality Reports.

The bottom-line value is subtly present through increased batch success rates, and sometimes is dramatically proven when a rapid detection and response overtly prevents the loss of a batch.

For legacy products, a detailed review of existing data is an early step in designing a CPV program. This review can, for products with a long history of data, yield the anticipated benefits quickly; perhaps even before the formal CPV protocol is executed.

Challenges

Implementing CPV requires a significant commitment of resources, and the timing and scope should be in concert with the company’s overall Quality Assurance strategy so that resources are deployed for maximum patient benefit. When the decision is made to go forward with any CPV program, legacy or otherwise, the following questions should be considered and addressed as they are fundamental prerequisites to implementing CPV:

• Does the CPV program fit clearly into a comprehensive control strategy? Is the program’s scope clearly defined as a subcomponent of this strategy? Is it clear how the CPV component connects to other elements of the overall control strategy and in such a way that avoids redundancy?

• Do clear business process and risk-assessment methods exist to identify and justify which parameters to include and/or exclude?

• Is there clear direction on how CPV signals will be addressed and escalated if necessary?

• Does historical data fit with the current way in which CPV plans are defined?

Gaps in these areas are common when first implementing CPV, and remediation of these deficiencies must be part of the overall CPV launch plan. The BPOG paper ‘A Roadmap to the Implementation of CPV’ [4] provides succinct guidance on addressing these issues.

The focus of this paper is to highlight the unique challenges of implementing CPV for legacy products. These challenges have been organized into the following topics:

1. Integrating CPV with legacy quality systems

2. Design space knowledge and parameter selection

3. Legacy products and variability across multiple sites

4. Discovery, disclosure and regulatory action

5. Data integration and legacy IT infrastructure

6. The effort and costs associated with deploying CPV for legacy processes

Concerns, responses, and recommendations for each of these points are presented in the following sections of this document. It should be noted, however, that there is some overlap between these areas of concern. For example, an older IT infrastructure and multi-site manufacturing will certainly add to the complexity and challenge. Finally, there is no one-size-fits-all solution to CPV for legacy products, but this document highlights the unique challenges legacy products present to enable companies to make decisions on when and how to implement CPV.

Given the importance of CPV for compliance, manufacturing control and process improvement, the authors encourage readers to share and discuss the content of this paper with their colleagues. BPOG continues to collaborate on this topic: a paper on ‘Responses to Signals’ has been published in Pharmaceutical Engineering [5] and a paper on ‘The Validation of Informatics Systems for CPV’ is in draft form currently [7].

Page 7: CONTINUED PROCESS VERIFICATION OF LEGACY PRODUCTS IN …

Continued Process Verification of Legacy Products in the Biopharmaceutical Industry 7

CONTINUED PROCESS VERIFICATION©BioPhorum Operations Group Ltd

2.0 Integrating CPV with legacy quality systems

Concerns

CPV will likely require new concepts, definitions and procedures that will have to align with policies and procedures in the existing quality systems. The BPOG member companies have highlighted three challenges in this area:

1. Conflict over the definition of ‘validation’.

Process validation has had a specific meaning in the industry for decades, originally aligned to the 1987 FDA guidance: the specific batches produced to provide evidence that the process works reliably and predictably. These batches were produced under pre-approved protocols, were a one-time ‘event’, and were often more heavily monitored and tested than normal production batches. The more recent definition, derived from the FDA’s “Process Validation: General Principles and Practices, January 2011” [2], extends across the product lifecycle:

“For purposes of this guidance, process validation is defined as the collection and evaluation of data, from the process design stage through commercial production, which establishes scientific evidence that a process is capable of consistently delivering quality product.”

It will likely not be a trivial endeavor to modify dozens of existing quality system documents to align with this definition.

2. Differentiating between the CPV process and existing trending procedures.

All companies have trending procedures, at a minimum to provide for Annual Product Reports to regulators, but often also to trend PQ release data, critical in-process data, and environmental and clean utility monitoring data. It is important to create clarity between the existing trending procedures and the new CPV ‘trending’ procedures.

3. Integrating the CPV process with the formal exceptions (deviations) system.

The CPV system is designed to review data, take action under certain circumstances, and document the results. This clearly has overlap with the existing formal exceptions system. Defining when the CPV program triggers a formal exception is critically important to provide clarity and ensure the formal exceptions system is not overburdened [5].

Response and recommendations

CPV is focused on ‘continual assurance that the process remains in a state of control’ (i.e. the validated state). It is advisable to keep this fundamental concept in mind when navigating the integration of CPV with existing legacy quality systems.

For example, it may be tempting to rewrite existing procedures so they embrace the FDA concept that ‘validation’ now consists of three stages: Process Design, Process Performance Qualification, and CPV. A quick search through most companies’ documentation systems will find that the old definition of process validation is thoroughly embedded in hundreds of documents and regulatory filings. Modifying these will not deliver the intent of the agency, which is to have a CPV program that ensures a state of control.

A more reasonable approach is to focus resources on creating the CPV procedures and embrace the legacy definition of validation for expediency. Modifying or creating one high-level document to codify the vocabulary of the company against the 2011 FDA guidance and act as a 'key' for translating between old and new documents should suffice. This ‘key’ can be referenced in filings and presented during opening meetings at pre-approval inspections and regulatory audits.

Undoubtedly a company already trends multiple attributes of their products and parameters of their manufacturing operation, and includes many in their Annual Product Report. Product complaints, out-of-specifications (OOS), process deviations, environmental monitoring, and raw material quality data are all part of the existing quality system. The key to assessing the relevance of these existing trending procedures to CPV is identifying which are trending against statistical control limits, and which are not. CPV is meant to ensure that the process remains in a ‘state of control’.
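To make the distinction concrete, the sketch below (all values, limits and the attribute are illustrative assumptions, not any company's data) computes individuals-chart control limits from trended release results and shows how a lot can be well within its registered specification yet outside the statistical control limits that CPV monitors against:

```python
# Illustrative sketch: specification limits vs. statistical control limits.

def control_limits(values):
    """Shewhart individuals-chart limits: mean +/- 3 sigma, with sigma
    estimated from the average moving range (MRbar / d2, d2 = 1.128)."""
    n = len(values)
    mean = sum(values) / n
    mrbar = sum(abs(values[i] - values[i - 1]) for i in range(1, n)) / (n - 1)
    sigma = mrbar / 1.128
    return mean - 3 * sigma, mean + 3 * sigma

# Trended release results for a hypothetical quality attribute (% purity)
history = [98.1, 97.9, 98.4, 98.0, 98.2, 97.8, 98.3, 98.1, 98.0, 98.2]
lcl, ucl = control_limits(history)

lsl, usl = 95.0, 102.0  # registered specification, typically much wider

lot = 97.2  # new result: passes specification, but signals statistically
print(f"within spec: {lsl <= lot <= usl}")
print(f"within control limits: {lcl <= lot <= ucl}")
```

Trending against specification alone would pass this lot silently; trending against statistical control limits raises a signal worth evaluating, which is the ‘state of control’ focus that CPV adds.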

In order to clarify and reinforce the difference between specification-centric monitoring programs and the statistically focused CPV, the BPOG CPV group recommends carefully choosing the terms for issues that are detected and for the response to those issues. ‘Trend’ is the term commonly used to identify a shift or drift in quality attribute or process parameter values which may indicate the need to take action to ensure the quality of the product is maintained [5]. The BPOG member companies recommend that the CPV program generates ‘signals’ from the data. Likewise, ‘investigation’ is usually associated with the exceptions system and implies significant effort with possible product quality impact. CPV ‘signals’ are very unlikely to rise to that level of concern, so we recommend ‘evaluation of signals’. Note that since CPV programs evaluate data against SPC limits, the quantity of signals to be evaluated is expected to be greater than the typical rate of exceptions, although the effort to complete each one should be much less.
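As a sketch of how such signals might be generated, the following applies two common SPC run rules; the rules, data, center line and sigma are illustrative conventions and assumptions, not a BPOG requirement:

```python
# Illustrative sketch: generating CPV 'signals' from batch data with two
# common SPC run rules.

def spc_signals(values, center, sigma, run_length=8):
    """Return (index, rule) pairs: points beyond the 3-sigma limits, and
    the end points of runs of `run_length` consecutive results on one
    side of the center line (a common shift rule)."""
    signals = []
    for i, v in enumerate(values):
        if abs(v - center) > 3 * sigma:
            signals.append((i, "beyond 3-sigma"))
    side = [1 if v > center else -1 if v < center else 0 for v in values]
    run, prev = 0, 0
    for i, s in enumerate(side):
        run = run + 1 if s == prev and s != 0 else (1 if s != 0 else 0)
        prev = s
        if run >= run_length:
            signals.append((i, f"run of {run_length} on one side"))
            run = 0  # reset so one sustained shift yields one signal
    return signals

data = [10.1, 9.9, 10.0, 10.2, 10.3, 10.2, 10.4, 10.3, 10.2, 10.3, 10.4, 10.2]
signals = spc_signals(data, center=10.0, sigma=0.1)
print(signals)
```

One short trended dataset can raise several signals; each is a quick evaluation, not an exception-level investigation.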

In rare instances, a CPV signal may be evaluated and found to possibly impact safety and efficacy. In this case, action should escalate into the formal exceptions systems, where it will be recorded and thoroughly investigated. It is recommended that specific rules for escalation be established as part of the CPV program; this too will help to differentiate the CPV program from the formal exceptions system.

The escalation strategy is a critical component in a CPV program. Properly designed, it will align the CPV program with the exceptions system in a compliant manner, escalating only those CPV signals that deserve the greater attention and scrutiny of the exception system. Improperly configured, it can overburden the exception system with investigations into CPV signals that have little possibility of product or patient impact, or overlook important signals that should be investigated. Unfortunately, there is no one-size-fits-all approach to finding a balanced escalation strategy [5].

Trends identified as part of the CPV program and their associated investigations should be recorded and documented. The system used to store this information can vary depending on existing infrastructure and preference. The same quality system that is used to record GMP deviations (exceptions) and related investigations can be used as long as the system has the flexibility to allow differential responses that are appropriate for either case. Alternatively, separate systems can be used to keep the activities and the associated electronic records separate. The CPV team needs to estimate the impact to the business if CPV ‘events’ are recorded in the same exceptions management system as true non-conformances.

Clearly, these considerations and the potential impact on existing quality systems should be addressed well before a CPV program is implemented. A comprehensive evaluation of the existing quality systems should be conducted in light of recent regulatory expectations. If there are compliance gaps, how and when should they be addressed? The introduction of CPV may represent an opportunity to align all elements of the existing quality systems with the new approach. On the other hand, the time and expense has to be weighed against the expected life of the product and manufacturing facility and its compliance status.


3.0 Design space knowledge and parameter selection

Concerns

Legacy products that were developed years ago were likely licensed based on a level of process understanding and control-strategy definition that was appropriate at the time, but is less sophisticated than current regulatory expectations and industry standards. A sound approach towards process development (ICH Q11), perhaps following the principles of Quality by Design will lead to a comprehensive understanding of critical process parameters and their impact on product quality [8,9]. This knowledge enhances a company’s ability to define (and defend) a robust and efficient control strategy. The parameters to include in the CPV program are based upon this control strategy, ensuring an efficient and effective monitoring program.

Furthermore, analytical technologies continuously evolve, and parameters and attributes, which are routinely measured for products licensed today, may not have been included in the original control strategy for legacy products. This is particularly true for legacy biopharmaceutical products, as methods for analyzing protein attributes were insufficient to ensure acceptable CQAs*, so there was a high reliance on controlling all aspects of the process. This situation has led to products being licensed in the mode of ‘the process being the product’; as a consequence there may be a number of attributes in the license that are not critical to the quality of the product. Not including these ‘non-CQAs' in a CPV plan may be concerning, as they were part of the description of the licensed product. However, if data indicates they are not critical to quality, they may be justifiably excluded from a CPV plan.

There may be cases where a systematic classification of process parameters has not been developed, or must be updated, as a precursor to defining the CPV program. In other cases, gaps might exist where process data was not collected on important attributes and parameters that are needed in the CPV program. In some cases a product may have been manufactured very infrequently, so there is a lack of data to work with.

Response and recommendations

A reasonable strategy that enables relatively rapid and efficient deployment of a compliant CPV program is to implement the new monitoring program in phases, initially focusing on the product qualities of the legacy product that are measured at time of release. The data should be readily available and already trended against release specification in the Annual Product Review. The CPV program will increase the frequency of data review, establish trending against statistical control limits, and identify the capability (Cpk, Ppk) of the CQAs. It also permits the establishment of a CPV program with a manageable set of product attributes. These processes include CPV protocol generation, process review meetings, trend reporting, and response to signals procedures [5].
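A minimal sketch of the capability step follows; the data and specification are illustrative, and Ppk here follows the usual long-term (overall standard deviation) definition:

```python
# Illustrative sketch: estimating long-term capability (Ppk) of a release
# CQA against its registered specification. All numbers are hypothetical.
import statistics

def ppk(values, lsl, usl):
    """Ppk uses the overall (long-term) sample standard deviation;
    Cpk would instead use a within-/short-term estimate, e.g. derived
    from the average moving range."""
    mean = statistics.mean(values)
    s = statistics.stdev(values)
    return min(usl - mean, mean - lsl) / (3 * s)

lots = [98.1, 97.9, 98.4, 98.0, 98.2, 97.8, 98.3, 98.1, 98.0, 98.2]
value = ppk(lots, lsl=95.0, usl=102.0)
print(round(value, 2))
```

Ranking CQAs by capability in this way identifies the less capable attributes that the second implementation phase should target.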

A second phase of implementation could focus resources on the less capable CQAs, adding process parameters that impact these less capable CQAs to the CPV program. Consideration should also be given to the quality attributes of incoming raw materials, as these can be a cause of variation in the process. These second-phase additions to the CPV program will move it from a compliance exercise to a value-added program by deploying resources against high-risk processes.

A science- and risk-based Control Strategy is the foundation of an efficient and robust CPV plan. One should be created or updated to define CQAs, and the CPPs that impact CQAs, for legacy products by taking advantage of the rich manufacturing data that has been accumulated. Formal risk-assessment methods (FMEA etc.) are the fundamental tools used to classify parameter impact (e.g. Critical, Key, No Impact) and should be employed. Once complete, the Control Strategy guides the choice of parameters to include in the CPV program.

Despite a lack of sophisticated process design and modern control-strategy documentation, the extensive manufacturing history, copious process data, and existing regulatory filings can be useful sources of information unique to legacy products that should be leveraged. The likely lack of small-scale robustness studies for legacy products is countered by the extended availability of at-scale data, historical quality system events, and the experience SMEs (Subject Matter Experts) have with the process. These resources can enable the accurate assignment of risk. The significant dataset also permits exploratory data analysis (correlations, multivariate regression) to further contribute to the risk assessment. If significant uncertainty still exists, small-scale studies can be targeted at specific unit operations. For fuller descriptions of the risk-assignment process, see references 1 and 4.

Note that historic data may contain process results encompassing one or more process or assay changes, complicating the comparison of process performance over time. These and other anomalies need to be identified and incorporated into the trending program according to sound statistical considerations.

* See Appendix 1 for definition of terms


Careful consideration should be given to whether data from previously failed lots should be included in or excluded from the analysis. In general, excluding data from lot failures (process excursions, OOS, etc.) should be limited to instances where there is a known ‘special cause’ of the deviation; for example, a utility system failure may have taken place. If no such cause is known, the results may simply be part of the normal overall process variability.
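A minimal sketch of that exclusion logic, assuming a hypothetical lot-record structure: only lots with a documented special cause are removed from the baseline dataset, while unexplained excursions are retained as part of normal variability:

```python
# Illustrative sketch: special-cause exclusion before estimating trending
# limits. Lot identifiers, results and causes are hypothetical.

lots = [
    {"id": "L001", "result": 98.1, "special_cause": None},
    {"id": "L002", "result": 94.2, "special_cause": "utility failure (WFI loop)"},
    {"id": "L003", "result": 98.3, "special_cause": None},
    {"id": "L004", "result": 96.9, "special_cause": None},  # unexplained low: keep
]

# Keep unexplained excursions: absent a documented special cause, they
# belong in the estimate of normal overall process variability.
baseline = [lot["result"] for lot in lots if lot["special_cause"] is None]
excluded = [(lot["id"], lot["special_cause"]) for lot in lots if lot["special_cause"]]

print(baseline)
print(excluded)
```

Recording the excluded lots alongside their documented causes preserves an auditable justification for each exclusion.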

It is worth noting that highly potent biotechnology products and those with small patient populations can present situations where there is little historic data, as these products are manufactured infrequently. In these cases, there may be insufficient data to draw sound statistical conclusions in line with corporate standards. Typically, in these circumstances a company might establish a forum that reviews data for each manufacturing run, and create a CPV plan that includes a review of run data against action and/or acceptance limits on a regular basis, annually for example. In Section 5, there is a discussion of the use of IT and automation as an enabler of CPV systems where there are large amounts of data to handle. For infrequently manufactured products alone, there may be little or no justification for putting such technological solutions in place.


4.0 Legacy products and variability across multiple sites

Although having a manufacturing process that occurs on multiple sites is not unique to legacy products, it is arguably more often the case than for new products. Over time, additional sites may have been added as market demand expands and/or as a way to mitigate supply-chain risks. Companies may decide to employ third-party suppliers and thus some or all of the sites could be contract manufacturing organizations (CMOs).

Concerns

A chief concern is that a CPV program will highlight process differences between sites that are manufacturing the same product and executing the same manufacturing procedures. These differences may stem from facilities that have been built at different times and/or in different countries. Design standards and equipment performance may have changed over time, and facilities may be operated by different personnel or even by different companies (e.g. CMOs). Performance differences between sites are almost unavoidable, and thus the introduction of a cross-site CPV program may be perceived as only highlighting these differences. This can be particularly troubling for legacy biopharmaceutical products where the license was filed with many process parameters.

Also, process changes are typically implemented sequentially, and thus different facilities may be manufacturing with different versions of the process as it evolves along the product’s lifecycle. Different manufacturing scales and assays may also make it difficult to make a direct comparison of performance across sites.

Response and recommendations

It is useful to distinguish between a CPV plan, including the list of process inputs (parameters) and product quality attributes (outputs) to trend based on the control strategy, and the presentation of data that come from the different locations.

As a guiding principle, for processes being executed in multiple locations, the decision to have one or multiple CPV plans is based on the level of difference in process design (or unit operation design); it should not be based on process performance.

However, the decision of whether to present the data from the two locations combined should be based on process performance, i.e. a statistical assessment of whether location is a factor in the process's performance. It is advisable to involve someone with a good background in statistics when making this type of determination.

Identical processes (those with the same unit operations, CQAs and CPPs) being executed in different suites, sites, or CMOs are presumably all being controlled by the same science- and risk-based control strategy. Thus, the list of control inputs (parameters) and process outputs to be monitored should be the same. Conversely, newer versions of a process (or unit operation) with improved control strategies or different unit operations should be considered a new process with a separate CPV plan. There may be significant overlap in the old and new CPV plans, but the one-to-one relationship between process design and CPV plan should be maintained.

The decision on whether to combine the data between the multiple sites should be based upon the type of data and whether their combination is statistically valid, i.e. whether they are determined to be from the 'same process' (same mean and standard deviation). Location of manufacture is merely another factor, similar to raw material lot, time of day, or operator, all of which could inject variability into the process. Statistical methods should be used to test whether location is a factor in process performance; once it is excluded, the data from both locations can be analyzed together. This approach has the benefit of doubling the amount of data passing through CPV and increasing the ability to detect issues in the process no matter where it is being run.
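As a minimal sketch of such a statistical assessment, the following compares a hypothetical step yield from two sites using Welch's t-statistic. The data and the rule-of-thumb threshold are illustrative; in practice a statistician would apply a formal two-sample test or ANOVA (e.g. scipy.stats.ttest_ind).

```python
# Hedged sketch: testing whether manufacturing location is a factor before
# pooling CPV data from two sites. All values are illustrative.
from statistics import mean, variance
from math import sqrt

site_a = [98.2, 97.9, 98.5, 98.1, 98.3, 97.8, 98.4, 98.0]   # step yield, %
site_b = [98.0, 98.3, 97.9, 98.2, 98.1, 98.4, 97.8, 98.2]

na, nb = len(site_a), len(site_b)
ma, mb = mean(site_a), mean(site_b)
va, vb = variance(site_a), variance(site_b)   # sample variances

# Welch's t-statistic: difference in means scaled by its standard error.
t_stat = (ma - mb) / sqrt(va / na + vb / nb)

# A |t| well below ~2 (a rough 5% threshold at these sample sizes) gives
# no evidence that site is a factor, supporting pooled analysis.
pool = abs(t_stat) < 2.0
print(f"t = {t_stat:.2f}, pool data: {pool}")
```

If the test did flag location as a factor, the sites would be trended separately, as the surrounding text recommends.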

Note that if equipment capabilities are different, data from two locations should not be combined. For example, a site using inline probes and automation for pH adjustment is likely to have a very tightly controlled viral inactivation step, whereas a site where pH is controlled manually via off-line samples and titrations is likely to show more variability. The data from these two unit operations, although both executing the same process and both operating within the licensed specification limits, should not be combined, because they would have two inherently different variances.


Finally, it is also valid, based on statistical analysis and scientific rationale, to combine the data from an old and a new process, regardless of location. For example, if only a single unit operation was changed, the upstream unit operations remain identical and can be considered the same, and an assessment can be made of whether the altered unit operation 'changes' the downstream processes.

A CPV program configured as noted above will highlight the differences between sites, which should not be cause for concern, as both sites are licensed to manufacture against predetermined release criteria that ensure safety and efficacy. A properly defined CPV program should have no impact on safety and efficacy, but should provide early detection of process shifts and drifts and appropriately prompt an escalation of activities that result in greater process understanding and control. Knowledge that an identically designed process is performing differently at different locations should be evaluated the same as any other signal; it should be assessed for impact and a decision made on whether to expend energy to understand the root cause of the difference. In many cases it will be perfectly acceptable to simply acknowledge the difference and continue to monitor the processes separately. In others, further investigations may be undertaken in an effort to expand process understanding.

There are distinct advantages to combining data when statistically valid and scientifically justified:

• an integrated report can be presented to regulatory authorities

• combining data sets can increase the volume of available data and so improve statistics and process understanding and control

• process performance can be compared between sites

• this comparison can enable the identification of opportunities for improvement.

When CMOs are part of the mix of manufacturing sites, there is a significant additional challenge in obtaining the data and the level of engagement necessary to evaluate and investigate CPV signals. By their very nature, CPV signals do not impact product safety and efficacy, and it is unlikely that existing contractual agreements and quality agreements provide for assistance with these non-critical requests.

One approach for dealing with this is to develop a technical/quality agreement (TQA) for each site, documenting which measures will be taken, how they will be reported, and what technical assistance will be provided. One challenge when establishing a TQA with a CMO is that the basic service provided by the CMO may not meet the full needs of a single CPV plan. In some cases, when products are being manufactured at a CMO, there may be an existing supplier quality agreement that does not account for the access to data necessary to maintain a CPV program. It might be necessary to create a TQA beyond what is described in a typical CMO contract (the increased scope may be associated with an increase in cost). The TQA should determine whether the CMO or the license holder is responsible for completing CPV reports. If the license holder is responsible, they should ensure that the process data is generated and reviewed by the CMO.

It is increasingly common that CMOs are asked to provide comparative data for quality attributes controlled to similar levels on different sites. The details of the monitoring and review should be mutually agreed and a clear understanding of responsibilities is required. Collaborative procedures should be established to ensure the data is comparable between the sites and presented from the CMO to the license holder in a timely fashion.


5.0 Discovery, disclosure and regulatory action

Concerns

It is understandable that some reluctance may exist about implementing systems that will generate new (additional) data about an existing process. What if the data generated by the CPV effort is inconsistent with current process understanding and regulatory filings or commitments? What if unexpected levels of variation are discovered? How much of this information will have to be disclosed? What if, through the process of identifying attributes and parameters, new risks are discovered or parameters are found to be misclassified? Are we obliged to disclose them, and if so, how? Would the process need to be re-filed? What other actions may be expected or required? What is the responsibility of the manufacturer in disclosing newly found information?

When regulatory inspectors see control charts with ‘white space’ between the control limits and specifications, it has been the experience of many companies that a regulator may ask for the specifications to be tightened. How should such a request be managed?

Response and recommendations

Perhaps the most significant concern would be that through more thorough data review the CPV program uncovers an issue that indicates a risk to the safety and efficacy of product that was already released and distributed. If this were to occur, then the license holder would be required to follow regulations and appropriately notify the authorities, but this would truly be a rare event. CPV programs are designed to monitor the state of control of the process, to verify that the validated state has been maintained, and to generate data that enables process improvement. CPV trend limits are based on statistically derived control limits, not safety or specification limits. Even if a CPV program establishes that a given parameter has a sub-standard capability, safety and efficacy are ensured by following the process as it was filed, including all its in-process controls (IPCs), in-process tests (IPTs) and CQA release tests. By GMP regulations, each manufacturer has quality systems that ensure the process has been executed per the licensed process and that the final product has passed all release tests. These quality systems have been inspected before licensure and are audited at regular intervals by regulators, often from multiple jurisdictions, to ensure they are effective at ensuring patient safety.

Furthermore, it is an expectation of health agencies that manufacturing process issues (errors, out-of-specification results, environmental excursions, etc.) be identified and investigated, and corrective and preventive actions (CAPAs) initiated to continually improve the process. CPV activities are merely an extension of this continuous-improvement expectation, and no more likely to result in significant regulatory concerns than existing process monitoring systems. Although it is unlikely, it is possible that a significant gap could be found in the control strategy, or an error found in the release process of material that is no longer in the company's control. As such, it is advisable to agree beforehand with the quality and regulatory organizations on an escalation path for this scenario, with the response being proportionate to the potential impact on patient safety and product quality [5].

To reiterate, the purpose of reviewing historical data is not to re-release past production, nor is it likely that this data review will discover information that indicts previous lots. All data from released batches should be within the licensed range, and even if a process is evaluated as 'incapable' (through having a low Ppk), safety and efficacy were assured by the quality systems in place at the time of release.
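As an illustration of what an 'incapable' evaluation means, the following sketch computes Ppk, the long-term process performance index, for a hypothetical release attribute against assumed specification limits. All numbers are illustrative.

```python
# Hedged sketch: long-term process performance index (Ppk) for a release
# CQA. Ppk compares the process spread to the licensed specification
# limits; a low value flags weak capability, not a safety issue.
from statistics import mean, stdev

results = [99.1, 98.7, 99.4, 98.9, 99.2, 98.5, 99.0, 99.3, 98.8, 99.1]  # % purity
lsl, usl = 97.0, 100.5   # hypothetical licensed specification limits

mu = mean(results)
sigma = stdev(results)   # overall (long-term) standard deviation

# Ppk uses the nearer specification limit, so it penalizes off-center processes.
ppk = min(usl - mu, mu - lsl) / (3 * sigma)
print(f"Ppk = {ppk:.2f}")   # >= 1.33 is a common rule-of-thumb for 'capable'
```

A process with a low Ppk would generate CPV signals and attract improvement effort, but every released batch would still have met the licensed specifications.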

A clear articulation of the overall control strategy is a prerequisite to developing a CPV plan. This comprehensive document should provide assurance to the quality organization and external regulators that safety and efficacy have been assured in the past. For some legacy products a cogent, succinct control-strategy document may have to be created from existing filings, any available technical development documentation, batch records, and specifications. If in fact a gap is found in the existing control strategy, then it is the license holder’s responsibility to update procedures and specifications as appropriate, similar to any other issue that is detected by the quality system. The need for regulatory agency notification will be determined by the regulatory affairs organization just as it would with any process improvement.

The purpose of reviewing historical data is to inform the control strategy and CPV plan, and any investigations into shifts and drifts from past data should be in support of that task (i.e. to determine if a data point should be included or excluded from the statistical process history).

Once the CPV program is active, weakly capable processes will create signals that will be evaluated for impact and escalated per procedures. In this way the CPV program will focus process improvement efforts on the high-risk unit operations.

For highly capable processes it is common for regulatory agencies to request a tightening of specifications to reflect the manufacturing history, in some cases suggesting that they be set at or near the process's statistical control limits. Specifications should be based on scientific or clinical evidence that those limits are appropriate for ensuring safety and efficacy, and that evidence should be used to avoid collapsing specifications simply because the manufacturing process is highly capable. This clinical evidence should be in the original regulatory filing and should be used to defend the specifications.

In cases where specifications cannot be scientifically defended, legacy products offer the advantage of clinical experience gained through years of patient and product complaints overlaid against the long-term output of the process. Ensuring that agencies have visibility to the full range of past production, not just the recent history upon which CPV limits are based, could provide evidence to justify the existing specifications, or to suggest a less restrictive range, based on the full history.

Setting static 'alert' limits just inside the existing specification limits is sometimes suggested as a defense against tightening specification limits. This has been a historic approach, so such limits may already be present for legacy products. They are designed to prompt action should the process drift, which is often the concern of the regulator. The CPV program, through statistically justified control limits and assessment of signals, will be more effective at sensing process shifts and leading to appropriate action through a defined escalation process. Unfortunately, the removal of legacy action limits may require re-filing, and so the static action limits and dynamic CPV limits may have to coexist until a change is approved.
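A sketch of how statistically derived control limits differ from static alert limits set just inside specification, using an individuals (I) chart. The batch values and the specification range of 4.5 to 5.5 are hypothetical.

```python
# Hedged sketch: statistically derived control limits for an individuals
# (I) chart, contrasted with static alert limits. Values are illustrative.
from statistics import mean

batches = [5.02, 4.98, 5.05, 5.01, 4.97, 5.04, 5.00, 4.99, 5.03, 4.96]

# The average moving range estimates short-term variation for an I-chart.
moving_ranges = [abs(b - a) for a, b in zip(batches, batches[1:])]
mr_bar = mean(moving_ranges)
center = mean(batches)

# Standard I-chart estimate: sigma_hat = MRbar / d2, with d2 = 1.128 for n=2.
sigma_hat = mr_bar / 1.128
ucl = center + 3 * sigma_hat
lcl = center - 3 * sigma_hat

# Static alert limits just inside hypothetical specifications of 4.5-5.5
# would be far wider, and so much slower to flag a genuine shift.
print(f"control limits: {lcl:.3f} to {ucl:.3f}")
```

Because the control limits hug the process's actual variation, a shift is detected long before results approach the static limits near specification.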

In cases where the data and process performance review identify a parameter that was misclassified as non-critical, its status in the CPV protocol can be immediately increased. A filing change is not required to begin additional monitoring and trending that is normally reserved for critical parameters. Appropriate regulatory action can then follow internal procedures.

A parameter that was misclassified as critical can be dropped from the protocol or monitored at a reduced frequency with the appropriate justification. Its formal reclassification may have to wait until a filing amendment is introduced.


6.0 Data integration and legacy IT infrastructure

The new business processes implemented for CPV may need to be supported by a company’s IT infrastructure. Indeed, manual data collection and analysis for a large-scale operation may be impractical. Only under circumstances where a company has few, infrequently manufactured products is it unlikely that IT and automation will be advantageous in a CPV system (as mentioned in more detail at the end of Section 2). Thus automation of the collection, analysis, and reporting of a significant volume of data may be considered as a critical enabler for the practice of CPV.

Concerns

For older facilities, legacy data acquisition systems and IT infrastructure may not be able to readily access data required by the CPV protocol. Therefore, a substantial investment may be required to implement CPV. Whether the new procedures will require manual data consolidation or an automated system, a cost will be incurred either in terms of the ongoing labor, or in terms of hardware and software upgrades and the long-term IT support needed to enable reliable electronic information access.

In addition to cost, there are also technical challenges associated with applying a new information system to a manufacturing process that has a lengthy history. In many cases older products may have data that has been collected for years in paper batch records or electronic formats that are no longer compatible with more modern data systems. Other information may reside in separate, unconnected systems such as quality results recorded in Laboratory Information Management Systems (LIMS), batch information detailed in Enterprise Resource Planning (ERP) systems, and process performance data archived in Process Information (PI) systems. The effort to consolidate the data may be significant.

The challenges of monitoring process performance across multiple sites, including CMOs, present further complications. Unconnected electronic systems are especially challenging for products manufactured at multiple sites. Obtaining the data in a usable form may be complicated when interfacing older procedures with more recent data systems. Gaining access to the data electronically across a network may be impossible with the existing infrastructure, and sharing the information may require extensive collaboration through manual exchange between each site. The interaction may require a central place to store data so that it can be exchanged, creating the need for additional investments to upgrade software for monitoring and analysis. Even when connectivity exists, sharing data between different systems can be complicated by the need for extensive reformatting before the data can be analyzed.

Response and recommendations

There is no single best solution to address these challenges. In each case, a cost-benefit assessment should be developed to assess the value of establishing connected IT systems for the scale and scope of CPV envisioned. A full commitment to CPV across multiple plants and products will necessitate a highly connected and efficient data-collection infrastructure. Single-product deployment in one operating plant could be sustained with less inter-connectivity. Therefore, starting with a simple system at one site with the capability of expanding to a comprehensive, multi-site system is generally recommended.

A data lineage or hierarchy should be created that maps the various CPV parameters to the primary source of the data. Changes to the data source over the manufacturing history can also be noted so that multiple methods of assembling data can be readied. This exercise will also highlight data that is collected now but was not collected in the past.
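A data lineage map of this kind could be sketched as a simple structure. The parameter names, source systems and dates below are hypothetical.

```python
# Hedged sketch: a minimal data-lineage map from CPV parameters to their
# primary data sources, recording source changes over the product's
# history. System names and dates are hypothetical.
data_lineage = {
    "harvest_titer": [
        {"source": "paper batch record", "from": "2005-01", "to": "2014-06"},
        {"source": "LIMS", "from": "2014-07", "to": None},   # None = current
    ],
    "step_yield": [
        {"source": "ERP batch report", "from": "2005-01", "to": None},
    ],
}

def current_source(parameter: str) -> str:
    """Return the system currently holding a parameter's primary data."""
    return next(e["source"] for e in data_lineage[parameter] if e["to"] is None)

# Parameters whose source changed over time need multiple assembly methods
# when back-populating the historian.
changed = [p for p, entries in data_lineage.items() if len(entries) > 1]
print(current_source("harvest_titer"), changed)
```

Even this simple map makes the two collection problems visible: which system feeds the forward-looking flow, and which older sources must be mined for back-population.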

It is useful to consider two separate challenges: the backward-looking population of a data historian with prior product and process data, and the forward-looking system for continuously collecting and analyzing data from the manufacturing process. These can be considered as two separate sets of processes, united by the data hierarchy.

When back-populating data, a likely challenge will be adapting to changes in the type and format of data that was collected over the product's history as process improvements and batch record formats evolved, and as additional manufacturing sites were added. Manual data entry can readily deal with these variants through well-documented procedures. Automating these variants can be more challenging, and careful consideration should be given to the cost versus the benefit, noting that this data collection need only occur once to back-populate the historical database.

Automation and integration of the systems for maintaining the flow of data into the future also deserve a careful cost/benefit analysis. One concept to keep in mind is that the desire to track and trend data continues to grow once an organization has recognized the benefit of quick access to data and sophisticated analyses. The system you are building today will likely need to expand to collect more data from more processes being operated at more locations. Scalability is an important consideration.

A critical design decision for any CPV informatics system is whether the analysis software will draw data directly from source for each analysis, or if a data historian will be created that pulls from the original source once (and possibly refreshes at a defined frequency) and then makes that 'copied' data available for analysis. It is anticipated that for legacy data, a historian system will provide a valuable buffer between the multiple variants of source data and the analysis tools implemented to support CPV.

In all configurations, data integrity is fundamental and should be designed into the business processes and computer systems. For legacy products, there may be significant quantities of data in paper systems that need entering into the CPV informatics system for analysis. Manual data entry should be 'double blinded', with a procedure for reconciling discrepancies. Where data is in electronic form, new interfaces may be required, and as with any CPV informatics system, automated data conduits should be well documented, under change control, and validated for integrity of core data and metadata.
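A minimal sketch of double-blind entry reconciliation, assuming two independent keying passes over hypothetical paper batch records; any mismatch is held back for resolution against the source document before the value enters the CPV data set.

```python
# Hedged sketch: reconciling 'double blind' manual data entry, where two
# operators key the same paper records independently. Values illustrative.
entry_first = {"batch_001": 98.2, "batch_002": 97.9, "batch_003": 98.5}
entry_second = {"batch_001": 98.2, "batch_002": 97.4, "batch_003": 98.5}

# Mismatches are flagged for reconciliation; matching entries are accepted.
discrepancies = {
    batch: (entry_first[batch], entry_second[batch])
    for batch in entry_first
    if entry_first[batch] != entry_second[batch]
}
accepted = {b: v for b, v in entry_first.items() if b not in discrepancies}

print(f"accepted: {len(accepted)}, to reconcile: {sorted(discrepancies)}")
```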

Keeping this in mind, a phased rollout can balance the need to become compliant with CPV expectations against the learning environment necessary to design a sustainable system that can be launched quickly. An example of such an approach is to launch a CPV program focusing only on CQAs, which are likely to be very well defined and to have very accessible data in LIMS systems. This creates the opportunity to establish the analysis software and related business processes for data review and signal escalation, while minimizing the data-collection infrastructure investment. A second phase could then enable data collection and analysis for the CPPs impacting CQAs [4].

Automation of data collection and analysis should be phased for maximum efficiency and to enable a broader CPV scope if desired. Connecting the plant’s automation system to the data analysis software, for example, is likely a high-value connection that will eliminate significant manual data entry, and enable more ad-hoc analysis outside of the CPV program. This will greatly assist in troubleshooting and investigations. Low-tech methods for speeding manual data collection and accuracy should not be overlooked. Simple reformatting of batch records to highlight the data that must be keyed, and carefully matching the data descriptors in the batch record to those in the entry software, will greatly ease operator fatigue and reduce the risk of error. Some data entry software can have data checks to permit only reasonable entries. Creativity is the only limit to addressing human-factor issues.

Process analytic software is the heart of the CPV system and should be implemented early. It should have appropriate validation to ensure that the trends, limits, and signals are correct and accurate. This requires change control over the configurations responsible for generating those objects.

It is tempting to launch a nascent CPV program with common off-the-shelf software like Excel, SPSS, or JMP. This is a great learning environment, but long-term sustainability should be considered, as these highly configurable tools will be hard to control to cGMP standards.


7.0 The effort and cost of deploying CPV for legacy processes

Concerns

The previous sections discussed in detail the specific challenges that tend to make the implementation of CPV for a legacy product more complicated, and therefore costlier:

• the challenge of integrating CPV concepts within an existing quality system

• limited design space knowledge impacting appropriate parameter selection

• data collection and analysis across multiple sites

• discovery and disclosure of information not congruent with existing licenses

• data integration within a legacy IT infrastructure

These factors mean that the cost of implementing CPV for legacy products could outweigh the benefits, unless a sound strategy is developed to address the challenges based on cost and benefit.

Response and recommendations

Whether deployed against a legacy or newly licensed product, the implementation of a CPV program requires significant commitment from management to supply the necessary human and capital resources. Leadership must also drive the cultural change required to respond to signals from control charts, even as the process is performing within specification limits. Implementations without this level of support may be compliant, but will certainly not provide the expected improvement in productivity and efficiency that CPV programs have the potential to create.

If management commitment must be developed, the presence of new ‘guidelines’ from regulatory agencies is a powerful motivator to act on CPV, but developing a compelling business case is paramount to gaining the level of support for an implementation that will provide benefits beyond compliance.

BPOG member companies identify three significant benefits of a strong CPV program:

• sensing process shifts and proactively identifying and fixing underlying issues, which may prevent the loss of future batches

• sensing process shifts and proactively identifying and fixing underlying issues, thereby reducing deviations/exceptions and the associated workload

• improved process knowledge that can lead to improvement in process control, which will lead to increased process robustness and yield

The value of each of these benefits should be estimated and used to justify expenditures for the CPV program.

Annual Product Quality Reports (APQRs) are regulatory requirements and minimally include the CQAs for a given product. Their assembly can be time- and resource-intensive, particularly for business processes that depend on manual data transcription. A CPV program that assists in the compilation and interpretation of process data may well be partly justified on this aspect alone.

Ideally a company has a CPV ‘roadmap’ that commits the organization to implementing CPV within their new and licensed programs, and uses professional project-management techniques to ensure timely progress. In this situation, the prior sections have noted methods of minimizing cost and speeding implementation for the various challenges unique to legacy products, and the overall project plan can use this information to determine the best timing for CPV implementation for each program. The most common technique mentioned is a phased rollout, focused to provide the desired benefit.

Clear CPV program goal(s) in concert with a supportive deployment strategy will focus CPV deployment efforts for maximum benefit, but also provide a unifying message from management to the broader organization. Table 1 highlights three potential goals to address and indicates the basis of a roll-out strategy.


In the absence of company-wide CPV deployment effort, it is likely that a given site, department, or even individual will charter a CPV project. In these instances, although initial progress may be more rapid, the long-term expectations may be modest, because a well-deployed CPV effort will require efforts from across the organization and the support of senior leadership. The best outcome would be a localized effort that can demonstrate the value of CPV and thereby garner the global support necessary for a more comprehensive implementation. It is not an uncommon pitfall that a local champion department or individual may create a tool that is difficult to maintain and does not provide the expected benefit. Further, implementing local solutions may fail to provide the benefit of achieving a critical mass of data, sufficient to enable good statistical analysis to support decision-making.

Compliance:
• start with CQAs
• expand to CPPs and critical material attributes

Cost management, start with:
• high-cost product, or
• high-issue process

Process knowledge:
• start with legacy product with extensive historical data
• expect to include multiple levels of data (CQA, CPP, nCPP and critical material attributes) to achieve results

Table 1: CPV Roll-Out Concepts


8.0 Conclusions and recommendations

FDA, EMA and ICH have all issued guidance stating that manufacturers are responsible for identifying and controlling sources of variation affecting process performance and product quality. Definitions of the terms ‘control’ and ‘variation’ are consistent among the emerging regulations and guidelines, and as such the BPOG member companies recommend that a CPV program be based on statistical process control concepts.

A CPV program for legacy products, with their long manufacturing history and multiple filings spanning several years, presents unique challenges and possibilities:

• Integrating the new CPV program within the well-established quality systems will require significant effort.

• Definitions of terms may not be aligned with the new, increasingly harmonized language for monitoring methodologies, and alignment can require significant change to existing quality system infrastructure.

• Despite a copious manufacturing history, fundamental process knowledge (design space understanding) may be limited, and the Control Strategy may need to be updated before creating the CPV plan.

• Multiple sites and multiple versions of a process are likely to exist simultaneously, complicating the collection, analysis, and presentation of the data, as well as slowing the implementation of process improvements.

• Process knowledge generated by the CPV program may be inconsistent with regulatory filings or commitments.

• Access to the data will be challenging (i.e. expensive), as it likely exists in disparate systems that have evolved over the years.

However, legacy products have a long manufacturing history where data mining and analysis can result in a very rapid accumulation of process understanding, which is the true value of a CPV program, going beyond compliance to superior performance.

A primary recommendation of this paper is to implement CPV for legacy products in phases that are focused to provide the desired benefit (see Table 1: CPV Roll-Out Concepts). These phases should be defined in a comprehensive CPV roadmap that is agreed by management and supported by all impacted departments. Good project management will be required to control the significant resources involved and to coordinate across the involved departments and manufacturing sites. A CPV program requires the collection, identification, and analysis of data from multiple sources. Two distinct challenges exist: the backward-looking population of data from past production, and the forward-looking continuous flow of data into the system. Collecting historical information is a one-time, resource-intensive event that is difficult to automate and is likely to be achieved most efficiently by hand. The Information Technology function is a key partner in automating future data flow and creating a sustainable infrastructure for CPV. Data integrity is important in each case, and the effort expended to ensure it should be proportionate to risk.
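A toy sketch of the back-population challenge follows; the system names, lot IDs, and values are invented for illustration, and a real consolidation would be governed by the quality system with data-integrity checks proportionate to risk:

```python
# Hypothetical lot records from two disparate systems: a LIMS export
# (validated, preferred) and a manual batch-record transcription.
lims = {"LOT-001": 98.2, "LOT-003": 97.9}    # % purity, per lot
paper = {"LOT-002": 98.5, "LOT-003": 98.0}   # manually keyed

def consolidate(primary, secondary):
    """Merge two lot->result maps, preferring the primary source on
    conflicts and flagging disagreements for data-integrity review."""
    merged, conflicts = dict(secondary), []
    for lot, value in primary.items():
        if lot in secondary and secondary[lot] != value:
            conflicts.append(lot)
        merged[lot] = value
    return merged, conflicts

data, to_review = consolidate(lims, paper)
```

The flagged lots would be reconciled by hand, which is consistent with the observation that historical back-population is a one-time, largely manual effort.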

Conflict and confusion with existing quality system elements can be minimized by clearly differentiating the purpose, methods, and language of each. CPV is designed to demonstrate that the manufacturing system is in a state of control, and relies heavily on statistics and control chart techniques. Many similar (trending) systems within existing quality systems are instead focused on safety and efficacy against licensed specifications. BPOG recommends that CPV programs adopt language that provides a clear distinction: CPV charts generate 'signals' that are 'evaluated', in contrast to specification-based systems, which typically use the terms 'trend' and 'investigation'.
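To illustrate how a CPV chart might generate 'signals', here is a sketch applying two common Western Electric-style rules; the rule set and the run length of 8 are illustrative choices, not a mandated standard:

```python
def signals(values, center, sigma, run=8):
    """Flag indices that trigger either of two control-chart rules:
    (1) any point beyond the 3-sigma limits;
    (2) `run` consecutive points on the same side of the center line."""
    flagged = set()
    for i, v in enumerate(values):
        if abs(v - center) > 3 * sigma:
            flagged.add(i)
    side = [1 if v > center else -1 if v < center else 0 for v in values]
    for i in range(len(values) - run + 1):
        window = side[i:i + run]
        if all(s == window[0] and s != 0 for s in window):
            flagged.update(range(i, i + run))
    return sorted(flagged)
```

Each flagged index is a 'signal' to be 'evaluated', which may or may not warrant action; it is not, by itself, a deviation or an out-of-specification result.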

The detailed collection and review of historical data will undoubtedly expand process knowledge, but the concern that it also presents a compliance risk is largely unfounded. The purpose of the historical review is to back-populate data for statistical analysis, not to re-release product. All previously released batches were reviewed under licensed procedures that ensure safety and efficacy. New knowledge can be used to update the control strategy through normal regulatory processes. Although it is unlikely to be needed, it is prudent to have a pre-agreed escalation path should the data review identify an issue with possible patient impact.

A science- and risk-based control strategy is the foundation of an efficient and defensible CPV plan. One should be created or updated for the targeted CQAs. Formal risk-assessment methods (FMEA, etc.) are the fundamental tools used to classify parameter impact (critical or not, etc.) and should be liberally employed. The extensive manufacturing history, copious process data, and existing regulatory filings will be powerful sources of information unique to legacy products that should be leveraged.

The control strategy should be updated as the process evolves, and each version will have a corresponding CPV plan. Data from different versions should be combined if warranted by scientific and statistical rationale, as this provides greater power to the analysis. This also applies if the product is produced at multiple facilities with differing infrastructure.
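As a sketch of what a statistical rationale for combining data across process versions or sites might look like, the following is a rough screening check only (using an illustrative critical value of 2), not a substitute for a formal comparability assessment:

```python
import statistics

def comparable_for_pooling(a, b, t_crit=2.0):
    """Welch-style t-statistic between batch results from two process
    versions; suggest pooling only when |t| is below a critical value.
    Both samples must contain at least two distinct values."""
    se = (statistics.variance(a) / len(a)
          + statistics.variance(b) / len(b)) ** 0.5
    t = (statistics.mean(a) - statistics.mean(b)) / se
    return abs(t) < t_crit, t

# Two hypothetical process versions with similar results
ok, t = comparable_for_pooling([10.0, 10.1, 9.9, 10.0],
                               [10.05, 9.95, 10.0, 10.1])
```

A scientific rationale (same control strategy intent, no known process change affecting the attribute) should accompany any such statistical screen before data sets are pooled.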


For highly capable processes it is not uncommon for regulatory agencies to request a tightening of specifications to reflect the manufacturing history. It is the opinion of this team that an arbitrary tightening of specifications should be avoided, provided that the existing limits can be defended. Specifications should be based on scientific or clinical evidence that the limits are appropriate for ensuring safety and efficacy, and the long history of legacy product manufacturing should be used to defend existing specifications, or at least to define new limits. Finally, through statistically justified control limits, the CPV program will be more effective at sensing process shifts and driving appropriate action through a defined escalation process, which is often the underlying concern of the regulator.

It is hoped this paper will ultimately lead to a broader dialogue with regulatory agencies. Before this can efficiently occur, these authors believe a consensus by the industry should be established. A common interpretation of current regulatory guidelines and a common set of questions will greatly facilitate this dialogue. We hope that this paper has succeeded in taking this first step. To that end, we encourage feedback from readers with the objective of streamlining the dialogue within the industry to arrive at a common approach, and ultimately with the regulators to arrive at a common set of expectations for continued process verification in biopharmaceutical manufacturing.


Appendices

Appendix 1: Glossary

Critical process parameter (CPP): A process parameter whose variability has an impact on a critical quality attribute and therefore should be monitored or controlled to ensure the process produces the desired quality. (ICH Q8)

Critical quality attribute (CQA): A physical, chemical, biological, or microbiological property or characteristic that should be within an appropriate limit, range, or distribution to ensure the desired product quality. (ICH Q8)

In-process controls (IPCs): Checks performed during production in order to monitor and, if appropriate, to adjust the process and/or to ensure that the product conforms to its specifications. (ICH Q7)

Cpk and Ppk: Process capability and process performance indices.
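For reference, Ppk is sketched below using the usual formula; Cpk is computed the same way but uses a within-subgroup estimate of sigma rather than the overall standard deviation used here:

```python
import statistics

def ppk(values, lsl, usl):
    """Process performance index: distance from the mean to the nearer
    specification limit, in units of 3 overall standard deviations."""
    mu = statistics.mean(values)
    s = statistics.stdev(values)  # overall (long-term) std dev
    return min((usl - mu) / (3 * s), (mu - lsl) / (3 * s))
```

For example, a centered process with limits far from the mean yields a high index, which is the situation described later for highly capable legacy processes.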


Appendix 2: The BPOG CPV maturity assessment tool (CMAT) and conclusions arising from team discussion

Approximately 50 commercial products make up the collective experience of the member companies. Some 70% of these products were considered to be part of legacy-product CPV programs.

The maturity assessment was based on a model within the framework known as 'Control Objectives for Information and Related Technologies', or COBIT.

It has five dimensions:

• business processes

• automation

• implementation

• business benefit

• people and roles

Each dimension has five levels of maturity with associated word models to enable those completing the maturity assessment to discuss and agree which level applies for the current status of their organization.

The CMAT has been distributed to BPOG member companies on two occasions, first in 2014 and again in 2016. We therefore have three mean responses for each dimension: 2014 Level, 2016 Level and an aspirational level.
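The aggregation itself is straightforward; for example, with invented per-company responses for one dimension (the real survey data is not reproduced here), the mean level per survey round might be computed as:

```python
# Invented per-company maturity responses for a single dimension
responses = {
    "2014": [2, 3, 2, 3, 2.5],
    "2016": [3, 3.5, 3, 3, 3],
    "aspiration": [4, 5, 4, 4, 4.5],
}
# Mean maturity level per survey round, rounded to one decimal place
means = {k: round(sum(v) / len(v), 1) for k, v in responses.items()}
```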

Level Word Model

0 Non-existent

Complete lack of any recognisable CPV business processes. The enterprise has not even recognized that there is an issue to be addressed.

1 Initial/ad hoc

There is evidence that the enterprise has recognized that the issues exist and need to be addressed. There are, however, no standardized processes; instead, there are ad hoc approaches that tend to be applied on an individual or case-by-case basis. The overall approach to management is disorganized.

2 Repeatable but intuitive

Processes have developed to the stage where similar procedures are followed by different people undertaking the same task. There is no formal training or communication of standard procedures, and responsibility is left to the individual. There is a high degree of reliance on the knowledge of individuals and, therefore, errors are likely.

3 Defined process

Procedures have been standardized and documented and communicated through training. It is mandated that these processes should be followed; however, it is unlikely that all deviations will be detected. The procedures themselves are not sophisticated but are the formalization of existing practices.

4 Managed and measurable

Management monitors and measures compliance with procedures and takes action where processes appear not to be working effectively. Processes are under constant improvement and provide good practice.

5 Optimized

Processes have been refined to a level of good practice, based on the results of continuous improvement and maturity modelling with other enterprises.

Table 2: The word model for the CPV&I maturity assessment tool, business processes.

Dimension 1: Business Processes


Figure 1: Mean maturity

The radar diagram in Figure 1 contains the responses for this and the other four dimensions.

In terms of the first dimension, business processes, the mean maturity level in 2016 was found to be 3.1, compared to 2.5 in 2014. Procedures have been standardized, documented, and communicated through training. The procedures themselves are not sophisticated but are the formalization of existing practices. There is an increase in the number of companies managing and monitoring the process well and reacting when signals are observed in the process.


Level Word Model

0 Non-existent

There are no IT systems supporting CPV data collection and analysis

1 Initial/ad hoc

Data related to some CQAs and CPPs are attainable from different IT systems and these are extracted and reported manually.

There are limited procedures guiding this work.

Management of the IT systems is disconnected, with no single organization owning the analytical software.

Analytical work is undertaken on bespoke, unvalidated applications.

2 Repeatable but intuitive

Data for all CQAs and CPPs are available from IT systems and some data is extracted and reported automatically.

IT management processes have developed to the stage where similar procedures are followed by different people undertaking the same task. There is no formal training or communication of standard procedures for data analysts, and responsibility is left to the individual using bespoke applications. There is a high degree of reliance on the knowledge of individuals and, therefore, errors are likely.

3 Defined process

All CQA and CPP data can be extracted automatically, but reporting is still manual via skilled analysts.

The number of bespoke unmanaged systems has been reduced by efforts to drive standardization and consistency.

Analytical and IT management procedures have been standardized and documented and communicated through training.

It is mandated that these processes should be followed by a recognized owning function; however, it is unlikely that deviations will be detected.

The procedures themselves are not sophisticated but are the formalization of existing practices.

4 Managed and measurable

Additional data is available automatically, beyond CQAs and CPPs, enabling more sophisticated, timely investigations and corrective actions.

Management monitors and measures compliance with analytical and IT management procedures and takes action where processes appear not to be working effectively. Processes are under constant improvement and provide good practice.

Automation and tools are used in a limited or fragmented way.

5 Optimized

Correlation between data has been understood, and MVA approaches mean automatic alerts capture all potential excursions before they have an adverse impact on the process.

Analytical and IT management processes have been refined to a level of good practice, based on the results of continuous improvement and maturity modelling with other enterprises.

There is one standard, validated platform for CPV in which IT is used in an integrated way to fully automate the workflow.

Table 3: The word model for the CPV&I maturity assessment tool, level of automation.

Dimension 2: Level of automation

Here, the 2014 level was about 1.5, rising to 2.2 in 2016. Based on discussions and the word model for this dimension in Table 3, most companies are working with validated systems, though the level of automation varies widely between companies, sites and products. There is often an aspiration to introduce automation, but this may not be practical where legacy systems are paper-based and/or the frequency with which a product is manufactured is low. There is a significant degree of reliance on the knowledge of individuals, because statistical rules are difficult to apply by procedure or via automation, meaning process knowledge and judgement calls are necessary. Some opportunities to improve the process may be missed.


Level Word Model

0 Non-existent

No products have a CPV plan. Site and global policies and procedures simply do not exist.

1 Defined scope

One product has been selected as a pilot for the development of a CPV plan.

The process scope has been defined (Development, DS Manufacture, Fill Finish).

The data scope has been defined (CQAs, CPPs, other parameters, complaints).

The scope of analytical techniques has been defined (SPC, PpK, rules for alerts).

The scope of reports has been defined (e.g. quarterly CPV reports, monthly reviews, weekly reviews).

2 Pilot complete, rollout started

The CPV Plan for the first product has been delivered and is operational, fully integrated with other elements of the quality system.

This is being used as the basis for the development of CPV plans for several other different products, which may be on different sites.

Expectations of development are understood in terms of providing the foundations for CPV plans for new products.

25% portfolio completion.

3 Roll-out progressing towards 50%

CPV plans exist and are operational for multiple products on more than one site.

Site master plans take account of CPV and global policies and procedures are under development.

The organization is able to generate a CPV plan for new products.

50% portfolio completion.

4 Roll-out achieved for most products

The majority of products have a CPV Plan.

All sites have master plans accounting for CPV.

Global policies and procedures exist.

75% portfolio completion.

5 All products have a CPV Plan

Continuous improvement has led to optimization in product CPV plans, site and global policies and procedures are in place, and the organization manages the CPV process in a highly efficient way.

100% portfolio completion.

Table 4: The word model for the CPV&I maturity assessment tool, scope of implementation.

Dimension 3: Scope of implementation

This dimension is intended to indicate the extent to which companies have rolled out CPV plans across their portfolio of products. In 2014, the average score was 2.5 and this increased to 3.1 in 2016. When the team reflected on the word model for this dimension (Table 4), the results suggested the vast majority of companies across the membership have global CPV master plans and well in excess of 50% of products in the marketplace have CPV plans. In some companies, all products have CPV plans in place. The existence of CPV master plans means that processes and procedures are in place to develop CPV plans for new products.


Level Word Model

0 Non-existent

The business benefits of CPV are simply not known.

1 Benefit targets identified to align with business objectives

Benefits are known to include time, cost and quality (t, $, Q), typically:

• compliance and observations from inspections;

• number of out of specification events;

• number of lost batches;

• improved schedule adherence;

• reduced operating costs;

but have not been assigned an expected value to deliver.

2 Benefits quantified

The potential size of the benefits is known and the program will be measured against them.

3 Benefits measurable

The means of measuring the benefits are known, and measurements are being delivered and assessed.

4 Benefits accruing

The program is demonstrating deliverable benefits.

5 Benefits accrued

All anticipated benefits have been demonstrably delivered and the level of performance is sustainable.

Table 5: The word model for the CPV&I Maturity assessment tool, business performance benefits.

Dimension 4: Business performance benefits

On this dimension, the average score across companies was still rather low, having increased from 1.0 in 2014 to 1.4 in 2016. This suggests that companies are aware of the benefits CPV can bring but are unable to measure those benefits systematically. The BPOG CPV&I team members saw this low score as a weakness, as justifying the resources needed to improve CPV processes may be difficult if benefits cannot be shown. There was also an observation that a company's ability to demonstrate benefits will only occur after processes have been established, and may be related to the extent and sophistication of the automation system it has been able to implement. As a consequence, the measurement of business performance benefits is likely to lag behind the other dimensions in this model.


Level Word Model

0 Non-existent

No distinct roles and responsibilities have been defined.

1 Initial/ad hoc

Individuals are taking responsibility for making core processes work, but roles and responsibilities are not written down and communicated widely.

2 Repeatable but intuitive

All key processes including governance, process improvement and change control have been identified and local ownership has been established.

3 Defined process

The organization has recognized the need for global standards and these are under development.

4 Managed and measurable

Global and local processes and procedures, roles and responsibilities are clear for all key processes.

5 Optimized

Processes have been refined to a level of good practice, based on the results of continuous improvement and maturity modelling with other enterprises.

Process ownership at the global and local levels is clear and succession management is proven to work, showing the system is sustainable.

Table 6: The word model for the CPV&I maturity assessment tool, people and roles.

Dimension 5: People and Roles

Clearly, there are interdependencies among the five dimensions of the assessment tool. In this case, the team recognized the strong dependence between business processes (dimension 1) and people and roles, so it was not surprising to find that the results on these dimensions moved forward at about the same rate over time. In 2014 the average score was approximately 2.0 and in 2016 it was 2.7. In discussing the word model displayed in Table 6, the team felt that all the key processes have people in roles that own them, but there is still work to do in communicating the need for coherent governance across corporate networks, particularly at the interfaces between CPV and the systems that act as sources of data.

The team agreed that people in well-defined roles are key. Process knowledge and the interpretation of trend data sometimes relies upon a relatively small number of people, and this represents a risk. This risk emphasizes a need to develop knowledge management practices to capture process knowledge over time.


9.0 References

[1] Continued Process Verification: An Industry Position Paper with Example Plan, BioPhorum Operations Group (2014), http://www.biophorum.com/cpv-case-study---downloads

[2] Guidance For Industry, Process Validation: General Principles and Practices, U.S. Food and Drug Administration (2011). From http://www.fda.gov/BiologicsBloodVaccines/GuidanceComplianceRegulatoryInformation/Guidances/default.htm

[3] ICH Harmonised Tripartite Guideline: Pharmaceutical Quality System Q10, http://www.ich.org/products/guidelines/quality/quality-single/article/pharmaceutical-quality-system.html

[4] Roadmap for CPV Implementation, Marcus Boyer, Joerg Gampfer, Abdelqader Zamamiri and Robin Payne, PDA Journal, March 2016.

[5] CPV Signal Responses in the Biopharmaceutical Industry, Mark DiMartino, Abdelqader Zamamiri, Kevin Pipkins, Jim Heimbach, Eric Hamann, Syama Adhibhatta, Richard Falcon, Kevin Legg, and Robin Payne, Pharmaceutical Engineering, January/February 2017, Volume 37, Number 1.

[6] Guideline on process validation for the manufacture of biotechnology-derived active substances and data to be provided in the regulatory submission, EMA/CHMP/BWP/187338/2014. http://www.ema.europa.eu/docs/en_GB/document_library/Scientific_guideline/2016/04/WC500205447.pdf

[7] Continued Process Verification and the Validation of Informatics Systems, Rob Eames et al, in preparation for submission to ISPE Pharmaceutical Engineering Journal, September 2017.

[8] Pharmaceutical Quality by Design: Product and Process Development, Understanding, and Control, Lawrence X Yu, Director for Science, Office of Generic Drugs, Food and Drug Administration.

[9] A-Mab: A case study in Bioprocess Development, CMC Biotech Working Group, 2009, http://www.ispe.org/pqli/a-mab-case-study-version-2.1

[10] A Framework for Implementing Stage 3 Continued Process Verification for Legacy Products, Bikash Chatterjee and Wai Wong, ISPE Pharmaceutical Engineering Journal, September/October 2014.

[11] Stage 3 Process Validation: Applying Continued Process Verification Expectations to New and Existing Products, D Bika et al, ISPE Discussion Paper, August 2012, http://www.ispe.org/discussion-papers/stage-3-process-validation.pdf

[12] Process Validation Lifecycle Implementation for Existing (“Legacy”) Products, ISPE Web Site.


Permission to use

The contents of this report may be used unaltered as long as the copyright is acknowledged appropriately with correct source citation, as follows: "Entity, Author(s), Editor, Title, Location: Year"

Disclaimer

This document represents a consensus view, and as such it does not represent fully the internal policies of the contributing companies.

Neither BPOG nor any of the contributing companies accept any liability to any person arising from their use of this document.