Historic background

The concept of quality control and management is commonly divided into four quality eras, which differ in the level of organizational involvement: inspection, statistical quality control, quality assurance and strategic quality management. The beginning of a new era did not preclude the preceding one; i.e., they could coexist depending on the activity of an organization [1].

The inspection phase started with the rise of mass production. At first, products were randomly inspected for defects, without any defined statistical sampling plan. When a defective product was found, it was simply remade or discarded [1].

Statistical process control was inaugurated in 1930. Variability was recognized as an important attribute in quality control; therefore, only products outside established limits were discarded or reprocessed [1]. Statistical quality control based on sampling guaranteed the quality standards of manufactured products with a high degree of confidence.

Between 1940 and 1960, with quality assurance, quality ceased to be restricted to industrial production. New concepts were introduced, such as quantification of quality costs [2, 3], total quality control [4], reliability engineering [1] and zero defects [5]. Later on, Joseph Juran proposed a classification of quality costs and a “quality trilogy” of planning, control and improvement, which can be regarded as the basis of quality by design (QbD) [2, 3, 6, 7]. Around the same time, Armand Feigenbaum proposed total quality control: the business is managed to serve the client, beginning with product design [4]. Reliability engineering was intended to ensure the performance of the product for a specific period of time [1]. Philip Crosby initiated the zero defects program: to ensure quality, everybody has to work to do things right the first time, eliminating the costs associated with poor quality [5]. Several company segments started to integrate a collective effort to improve product quality standards, from raw material acquisition to final product commercialization [1, 2].

In 1970, the strategic quality management era began with the entry of high-quality Japanese products into the American market. Quality became a company strategy for survival and competitiveness, with a continuous search for quality improvement [1, 8]. The Toyota Production System, or lean manufacturing, began in the 1950s in Japan. This new approach was based on the elimination of waste: quality was built and audited during the process so that possible defects were fixed in the same step, avoiding returning the product to earlier stages [9, 10]. Lean manufacturing can also be understood as a strategic and integrated management model to achieve quality and productivity [11].

Quality on pharmaceutical industry

Strategic quality management has been adopted mainly by the automotive industry. The new vision, based on risk and scientific knowledge, was adopted later by the pharmaceutical industry because of its singularities, such as small batch sizes, numerous and complex production processes and the strict quality regulations mandated to ensure patient safety [12]. Quality management is undoubtedly an important driver for the transformation of drug discovery, development and manufacture [13].

The US Food and Drug Administration (FDA), before the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), recognized the need to expand the quality evaluation of drug products, which until then was based on good manufacturing practices (GMP) inspections and regulatory review of drug product applications. In 2000, the number of recalls and the amount of product waste resulting from manufacturing mistakes were very high. The information developed and submitted for registration seemed insufficient to guarantee scale-up and to identify the root causes of manufacturing failures. Such failures generated a great number of post-approval supplements for review, and, because regulatory review treated all products equally, without considering specific risks to the patient, a disconnection between product review and GMP inspections was observed. Furthermore, the global expansion of the pharmaceutical industry, with a greater number of products, together with advances in production technology and the pharmaceutical sciences, boosted the paradigm shift in quality management [14,15,16]. The FDA therefore proposed a reappraisal of its approaches to product quality regulation, encouraging the adoption of integrated quality systems, international cooperation, public health protection, a scientific and risk-based orientation, and innovation in development, production and quality assurance in order to anticipate technical and regulatory issues [16,17,18]. The importance of product development and its problems were emphasized by the FDA in 2004, when a guideline addressing the challenges and opportunities of the critical path of development was published [17]. In 2004, the FDA also published a final report that outlined QbD in the pharmaceutical industry and encouraged continuous improvement and risk management in the drug manufacturing process [16, 19].

In addition to these new concepts, three important guidance documents on the implementation of the new quality system approach were published by the ICH: pharmaceutical development (Q8) [20], quality risk management (Q9) [21] and pharmaceutical quality system (Q10) [22].

The ICH Q8 guideline recognizes and defines QbD, in agreement with Juran's view, as “a systematic approach to development that begins with predefined objectives and emphasizes product and process understanding and process control, based on sound science and quality risk management” [20]. Therefore, quality is built in by design, not tested into products.

Quality risk management, described in ICH Q9, aims to identify and control potential quality issues during development and manufacturing, to ensure that the product remains consistent with that tested in in vivo studies. Product quality should be based on scientific knowledge and focused on safety.

The Q10 guideline describes an effective pharmaceutical quality system model based on International Organization for Standardization (ISO) quality concepts and GMP. The aim is a system that enhances the quality and availability of medicines, promotes innovation and the continual improvement of processes, and strengthens the link between the development and manufacturing activities of pharmaceutical products throughout their life cycle [22].

The main differences between traditional and new quality approaches are listed in Table 1 [23, 24].

Table 1 Main differences between traditional and new quality approach

Briefly, in the traditional approach, quality is verified in the finished drug product by assessing whether it complies with approved specifications. In case of non-compliance, the product is reprocessed or discarded, which represents considerable waste. The manufacturing process and parameters, specifications and methods are determined using a few pilot batches manufactured for the marketing authorization application. All approved parameters are kept tight to assure manufacturing consistency. Many process, specification or other changes identified during scale-up or industrial production generate post-approval amendments that have to be reviewed by a regulatory agency before implementation. The whole process may involve substantial effort, coupled with considerable waste, for both regulators and industry [25].

In contrast, in the new framework, quality is built into the product. Testing of the final product then only confirms quality, which is assured through manufacturing consistency and process control. The manufacturing process and parameters, specifications and methods are understood and determined within a design space (Table 1). Parameters already tested and approved within the design space would not characterize a change and, consequently, would not need regulatory approval before implementation, reducing the number of post-approval submissions [6, 13, 20]. Patient safety and product efficacy are also considered through science-based risk assessment [6, 25, 26].

In 2011, the European Medicines Agency (EMA) and the US FDA, both members of the ICH, launched a pilot program for the parallel review of new-approach applications, to ensure consistent implementation of the ICH Q8, Q9 and Q10 guidelines and to facilitate shared decisions on new regulatory concepts [27]. The parallel review includes evaluation by each agency of the application parts relevant to QbD, such as development, design space and real-time release testing, with communication and consultation during the review [28]. A final report of this program was recently published, according to which a total of 14 applications were evaluated and approved. Based on the learnings from the pilot, three sets of question-and-answer documents were published, along with recommendations and further harmonization between the two agencies [29]. In addition, the US FDA Office of Generic Drugs has published two QbD examples for generic drugs, one for a modified-release and one for an immediate-release dosage form, to illustrate the types of pharmaceutical development studies that can be used to implement QbD [30, 31].

Another example of regulatory change based on the new quality paradigm is Ph.Eur. Method of Analysis 2.9.47, which describes the uniformity of dosage units using large sample sizes. This procedure is intended for the evaluation of products manufactured using process analytical technology (PAT), which is part of QbD [32, 33], demonstrating that regulatory organizations are willing to implement and keep pace with the principles of the new concept.

Statistical tools

Different strategies have been proposed to apply QbD principles in several areas [13, 19, 23, 26, 32, 34,35,36,37] in an attempt to replace the usual trial-and-error procedures in the development and production of pharmaceutical formulations [6, 13]. A well-planned experiment can bring more information about a manufacturing process with less effort on the part of researchers. To this end, several statistical tools can be employed, such as design of experiments, response surface methodologies, simulation experiments and emulators, multi-objective optimization and mixture designs. Design of experiments is an efficient tool for systematic experimental planning to determine the relationship between experimental variables (factors) and the expected output (response variable). Factorial designs are used when the response is simultaneously affected by more than one factor, and analysis of variance is applied to evaluate the main effects and their interactions [37]. Experiments can be organized in a full factorial design, in which trials are performed for all combinations of the factors at all of their levels. The choice of factors is based on previous experience. When the required number of experiments is not feasible, a fractional factorial design is recommended; this approach is still able to identify main effects and lower-order interactions [38]. Such designs rely on the assumption that higher-order interactions are negligible. Plackett–Burman and Box–Behnken designs are classic examples.
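As a minimal illustration of a full factorial design, the sketch below builds a two-level design for three hypothetical tablet-process factors and estimates their main effects from invented response values; factor names and data are assumptions for illustration only, not taken from the cited studies.

```python
# Minimal sketch: a 2^3 full factorial design with coded levels (-1, +1)
# and main-effect estimation. Factor names and response values are
# hypothetical, for illustration only.
from itertools import product

import numpy as np

factors = ["binder_pct", "mixing_time", "compression_force"]

# Full factorial: every combination of the two coded levels for each factor.
design = np.array(list(product([-1, 1], repeat=len(factors))))  # 8 runs x 3 factors

# Hypothetical measured response (e.g., tablet hardness) for each run.
response = np.array([52.1, 55.4, 58.3, 61.0, 53.2, 56.8, 59.1, 62.5])

# Main effect of a factor = mean response at +1 minus mean response at -1.
for j, name in enumerate(factors):
    effect = response[design[:, j] == 1].mean() - response[design[:, j] == -1].mean()
    print(f"Main effect of {name}: {effect:.2f}")
```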

Response surface methodologies graphically explore the relationship between controllable factors and important response variables using regression models. Polynomial regression techniques allow the development of an empirical model relating the response to those factors. They make it possible to investigate, from a mathematical point of view, the construction of a design space that considers the interaction effects of the factors involved [39].
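A minimal sketch of such a second-order response surface fit is given below, using scikit-learn and invented data for two coded factors; the layout, factor settings and responses are assumptions chosen only to illustrate the polynomial modeling step.

```python
# Minimal sketch: fitting a quadratic response surface to two coded factors
# by ordinary least squares. All data are invented for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Coded factor settings (x1, x2) from a central-composite-style layout.
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
              [-1.41, 0], [1.41, 0], [0, -1.41], [0, 1.41],
              [0, 0], [0, 0], [0, 0]])
# Hypothetical response (e.g., % dissolved at 30 min) for each run.
y = np.array([76.5, 78.0, 77.0, 79.5, 75.6, 78.4, 77.3, 78.9, 79.9, 80.3, 80.0])

# Expand to x1, x2, x1^2, x1*x2, x2^2 and fit the polynomial model.
quad = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(quad.fit_transform(X), y)

print("Coefficients (x1, x2, x1^2, x1*x2, x2^2):", model.coef_.round(3))
pred = model.predict(quad.transform([[0.5, 0.5]]))[0]
print(f"Predicted response at (0.5, 0.5): {pred:.2f}")
```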

The idea of replacing physical experimentation with in silico experiments using a computer program has been explored through simulation experiments and emulators. Experiments can be run on the simulator, reducing time, cost and effort [40]. Increasingly powerful statistical tools have been developed in this direction. Emulator models can combine several levels of simulation fidelity with practical experiments. Software has been developed to solve specific challenges of the pharmaceutical field, such as the simulation of mixing processes, the development of analytical methods and the prediction of chemical reactions [37].
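As one possible illustration (not the software referenced in [37, 40]), the sketch below trains a Gaussian-process emulator on a handful of runs of a toy "simulator" and then queries it cheaply at new settings; the simulator function and design points are assumptions.

```python
# Minimal sketch: a Gaussian-process emulator trained on a handful of
# simulator runs and then queried in place of the (expensive) simulation.
# The "simulator" here is a toy stand-in for a real process model.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def slow_simulator(x):
    """Stand-in for an expensive in silico process model."""
    return np.sin(3 * x) + 0.5 * x

# A small design of simulator runs.
X_train = np.linspace(0.0, 2.0, 8).reshape(-1, 1)
y_train = slow_simulator(X_train).ravel()

emulator = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
emulator.fit(X_train, y_train)

# The emulator predicts cheaply at new settings and reports its uncertainty.
X_new = np.array([[0.35], [1.25]])
mean, std = emulator.predict(X_new, return_std=True)
for x, m, s in zip(X_new.ravel(), mean, std):
    print(f"x = {x:.2f}: emulated response {m:.3f} +/- {s:.3f}")
```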

Complex processes such as those in the pharmaceutical field demand a multi-objective optimization framework; in this sense, multi-objective optimization has been proposed [37]. Recent implementations employ desirability functions to define a desirability index for each candidate solution based on the scores of the individual objectives, which is important, for example, in drug design [41].
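A minimal sketch of the desirability approach follows; the two objectives, their acceptable ranges and the geometric-mean aggregation are hypothetical choices used only to illustrate how individual scores combine into an overall index.

```python
# Minimal sketch: Derringer-style desirability functions combined into an
# overall index via the geometric mean. Targets and responses are hypothetical.
import numpy as np

def larger_is_better(y, low, high):
    """Desirability rising linearly from 0 at `low` to 1 at `high`."""
    return np.clip((y - low) / (high - low), 0.0, 1.0)

def smaller_is_better(y, low, high):
    """Desirability falling linearly from 1 at `low` to 0 at `high`."""
    return np.clip((high - y) / (high - low), 0.0, 1.0)

# Two hypothetical objectives for a candidate formulation:
dissolution = 82.0   # % released in 30 min -> to be maximized
impurity = 0.25      # % related substances -> to be minimized

d1 = larger_is_better(dissolution, low=70.0, high=90.0)
d2 = smaller_is_better(impurity, low=0.1, high=0.5)

# Overall desirability index: geometric mean of the individual scores.
D = (d1 * d2) ** 0.5
print(f"d1 = {d1:.2f}, d2 = {d2:.2f}, overall desirability D = {D:.2f}")
```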

Mixture designs are especially indicated to investigate the effects of mixture composition on a dosage form. A characteristic feature of this approach is that the sum of all components must equal 100%, meaning that mixture factors cannot be manipulated completely independently of one another [42]. The most widely used mixture designs are the quadratic or cubic simplex lattice designs. Mixture designs have also proved useful in preformulation studies [43].
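As a minimal sketch, the function below enumerates a {q, m} simplex lattice design in plain Python; the three component roles named in the comment are hypothetical examples.

```python
# Minimal sketch: generating a {q, m} simplex lattice design, in which each
# component proportion takes the values 0, 1/m, 2/m, ..., 1 and every blend
# sums to 1 (i.e., 100% of the mixture).
from itertools import product

def simplex_lattice(q, m):
    """Return all q-component blends on the {q, m} lattice (proportions sum to 1)."""
    levels = [i / m for i in range(m + 1)]
    return [pt for pt in product(levels, repeat=q) if abs(sum(pt) - 1.0) < 1e-9]

# Example: three components (e.g., filler, disintegrant, lubricant), quadratic lattice (m = 2).
for blend in simplex_lattice(q=3, m=2):
    print(blend)
```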

More recently, novel solutions have been introduced to improve modeling of pharmaceutical products, such as artificial neural networks, computational systems built to simulate biological neural networks [44]. This tool presents several advantages, such as the possibility of combining different types of data (continuous, discrete, binomial); the use of incomplete or even historical data; and the capacity to generate complex multi-dimensional models with easy and quick numerical solutions [44]. The recent use of this approach in the scale-up of a high-shear granulation process has demonstrated its applicability to important technical and quality barriers faced in the pharmaceutical field [45].
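A minimal sketch of such a network is shown below, mapping hypothetical granulation settings to an invented quality attribute with scikit-learn's multilayer perceptron; it is far smaller than the models used in the cited studies and serves only to illustrate the idea.

```python
# Minimal sketch: a small feed-forward neural network relating process
# settings to a quality attribute. All data are invented and far fewer than
# a real training set would require.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Hypothetical settings: [impeller speed (rpm), liquid amount (%), time (min)]
X = np.array([[300, 10, 3], [300, 14, 5], [450, 10, 5], [450, 14, 3],
              [600, 10, 3], [600, 14, 5], [450, 12, 4], [300, 12, 4]])
y = np.array([210, 320, 290, 260, 340, 430, 310, 265])  # granule size (um), invented

scaler = StandardScaler().fit(X)
net = MLPRegressor(hidden_layer_sizes=(8, 8), max_iter=5000, random_state=0)
net.fit(scaler.transform(X), y)

# Query the fitted model at a new, untested operating point.
print(net.predict(scaler.transform([[500, 12, 4]])).round(1))
```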

Quality in Brazil

In Brazil, Resolution RDC no. 17/2010, which currently governs GMP for drugs, determines the implementation of a quality policy within a pharmaceutical quality system. The quality policy comprises quality standard programs, policies and processes authorized by company management. The resolution also establishes quality assurance, which involves not only GMP but also the design and development of the product [46].

The pharmaceutical quality system and quality assurance, concepts introduced by RDC no. 17/2010, are aligned with strategic quality management. However, Brazil does not have specific regulations dealing with quality by design. Furthermore, as a recent regulatory member of the ICH [47], adoption of ICH guidelines is not yet the rule. As an exception, RDC no. 60/2014, which establishes criteria for the registration of new, generic and similar drugs, includes some points addressed in the ICH Q8 guideline, such as documentation of formulation development, encompassing the function of the excipients, the compatibility of the excipients with the active pharmaceutical ingredient (API) and the justification of any API overage, as well as a summary report of the manufacturing process, including the definition of the critical manufacturing steps and the parameters evaluated [20, 48]. The inclusion of such requirements may be an initiative by ANVISA (Brazilian Health Surveillance Agency) to make companies aware that drug product development and the definition of manufacturing parameters are essential to design a quality product, which is not guaranteed by GMP inspections or quality control alone.

In practice, several petitions analyzed under this new regulation were rejected owing to non-conformities related to the new quality concepts of product development and production [49]. Most petitions have not yet been analyzed under RDC no. 60/2014, but this reinforces that process and product conception is still empirical in Brazilian laboratories. The generic and similar drug industry in Brazil is mainly focused on quality control, which corresponds to the statistical process control era. The strategic approach and the adoption of lean manufacturing concepts have indeed been implemented in some companies, but the main focus lies on productivity gains rather than on quality from development onward. This is because RDC no. 17/2010 primarily addresses GMP, the quality aspect fulfilled by companies and inspected by ANVISA [46].

In 2015 in Brazil, 31% of the registration application refusals of generic and similar drugs were due to quality control non-conformities: 13.3% (67 reasons) related to the drug product; 8.2% (41 reasons) to API quality control by the drug product manufacturer; 5.0% (25 reasons) to API quality control by the API manufacturer; 3.2% (16 reasons) to excipients; and 1.6% (8 reasons) to importer quality control [49]. Although rigorously evaluated and questioned by ANVISA, statistical quality control is still a difficulty for the pharmaceutical industry. Even though it is a primary concept to safeguard quality nowadays, it is assessed on the basis of data from only the few batches required for registration. In addition, quality control and method validation appear as a key component of the quality documentation, being the main cause of registration refusal.

While concepts oriented toward strategic quality are being introduced, other regulations are being reviewed to reinforce quality control. An example of these two different ways of treating quality at ANVISA is the review of the rules for analytical method validation, RE no. 899/2003 [50], placed in public consultation in February 2016 [51]. The review proposes specific statistical methods for some parameters and models. RE no. 899/2003 also positions statistical quality control as a means of quality assurance: the more rigid the analytical methods, the higher the product quality. Moreover, by issuing regulations with increasingly rigid requirements, ANVISA takes on more responsibility for product quality, further increasing the agency's workload.

Issuing rules that introduce new requirements, such as product development and process validation in RDC no. 60/2014, compliance with which is mandatory for drug registration, may be a questionable way to introduce such complex concepts into the Brazilian pharmaceutical scenario. The FDA's background makes clear that a quality paradigm shift involves other concepts and must be harmonized internally within the regulatory agency itself and with the pharmaceutical industry before being announced as mandatory regulation [48].

Still, quality eras can coexist. The concern to improve and standardize analytical methods for quality control is not dispensable, although QbD can also be applied to analytical methods [23]. Registration denials, mainly attributed to problems in quality control and analytical method validation [49], reinforce that the pharmaceutical industry still falls short of meeting requirements. The challenge is to build statistical quality control and strategic management in parallel, with the first as a component of the second, introducing, for example, real-time process control and real-time quality assurance [18]. Investment in product, process and personnel development is also essential and should not be driven only by regulation. Medical advances demand new pharmaceutical technology. Thus, investing in an interdisciplinary team of chemistry, biology, physics, engineering and statistics professionals should be an industry initiative to apply new technologies and develop a robust QbD project that attends to patients' needs [32].

As in industry, the implementation of risk assessment is also important at ANVISA. Effective quality risk management can enable more effective and consistent risk-based decisions, owing to the capacity to deal with potential risks, and a better use of resources by both parties [20, 21]. The current regulatory review tends to treat all products equally, observing only whether they comply with current requirements, in some cases without considering specific risks to safety, efficacy and quality [25]. By establishing risk-based control points, it would be possible to optimize analysis time and decision making and to direct the necessary resources toward better control of high-risk products.

Patients demand better products. Thus, a reliable relationship among academia, companies, patients and the regulatory agency is strongly needed to support QbD implementation [52].

Challenges

Different industries have adopted a strategic vision of quality as a way to develop and continuously improve their products. Pharmaceutical laboratories, as well as international regulators, have recently adopted strategic management in order to achieve high-quality products at lower cost, with less risk and without extensive regulatory oversight. Pharmaceutical regulation needs to harmonize these concepts so that companies continue to benefit from the knowledge gained through the development of better products and processes that attend to patients' needs.

Rather than simply modifying current mandatory health regulations in line with the most recent quality standards adopted by the FDA and the ICH, Latin American health agencies should initiate a transition period engaging industry and academia in a joint discussion. Although the Latin American market is diverse in regulatory terms, discussing the quality paradigm is an opportunity to reduce the differences among the countries of this developing market [53].

Both the Brazilian regulatory agency and the pharmaceutical industry must also rise to the challenge of aligning with this new international practice. Companies must treat product design and process understanding as key steps to ensure quality and reduce risk. Therefore, appropriate investments should be made in multi-disciplinary teams that can guarantee the quality, efficacy and safety of a product using the tools of the new model: QbD, risk management and a quality system oriented toward innovation and continuous improvement.

Brazil is at a moment very similar to that experienced in the USA when the QbD idea was launched: absence of risk-based analysis, many registration petitions awaiting analysis, empirical information about the development step, stagnation of the national industry with respect to innovation, and the inclusion of more requirements to assure quality, which leads to higher costs and longer review times [14, 25]. It is time to think about and implement different practices to change this scenario. Brazil, Mexico, Colombia and Argentina are the most important pharmaceutical markets in Latin America. Therefore, Brazil can act as a model for the other countries in this area, as it has already moved ahead in GMP regulation [53].

The transition to QbD will not be simple and will require joint work to develop a strategy that fulfills the spirit of QbD in line with the international agencies, transforming regulatory review into a modern, science-based pharmaceutical quality assessment.