Today’s manufacturing and analytical testing environments are finally seeing a significant increase in digitalization efforts. These range in sophistication and technology maturity from early proofs of concept and pilots through to full-scale implementation planning, across use cases such as resource planning and scheduling, deviation reduction, predictive process design and real-time optimization, predictive equipment maintenance, and much more. In addition, mature technologies for process automation and robotics are already widely adopted in the industry, often linked together in Industry 4.0 concepts.
The underlying process and system enhancements are, from an organizational perspective, clearly transformational in nature: they demand significant investment, time, energy, and experimentation before results are achieved. Despite a broad focus on adopting artificial intelligence in end-to-end manufacturing operations (in fact, everywhere at work and in our daily lives), it is evident that many tools and technologies are built on core elements of machine learning and “augmented intelligence” concepts.
As we learn more about possibilities and areas of application for these data-enabled technologies, the groundwork and fundamentals needed for these solutions to work are equally important.
Excitement around new technologies and digital concepts can sometimes obscure the need to focus on the basics: harmonization, standardization, robust process understanding & development and the applicability of Quality Risk Management concepts.
It is clear that new technologies now provide the ability to mine available data in much more detailed and comprehensive ways, helping us to better understand process weaknesses, detect patterns, and identify clusters of potential failure that we could not see before.
This can be especially beneficial during the development stage and in process comparability studies, where subtle changes across multiple parameters are often detected late, if at all, by traditional process control strategies. With more in-line sensors and highly digitized, connected equipment, we now have an abundance of valuable data: the data “deluge”.
Furthermore, with advancements in web connectivity and cloud computing, as well as quantum leaps in computing performance, these data assets can now be more easily collected, stored, and analyzed.
With advanced data analytics techniques such as Multivariate Data Analytics (MVDA) and Natural Language Processing, coupled with Internet of Things (IoT) and “Digital Twin” technologies, it has become possible to find correlations across vast volumes of data, understand the true root causes of defects and material attribute issues, and detect them in ways that were never possible before: in real time and predictively.
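To make the MVDA idea concrete, the following is a minimal sketch of PCA-based multivariate batch monitoring using Hotelling's T² in the score space. The data, parameter set, and 99% empirical control limit are purely hypothetical placeholders, not a validated method or anyone's actual control strategy.

```python
# Minimal sketch (not a validated method): PCA-based multivariate monitoring
# of batch process parameters, in the spirit of MVDA. All data and thresholds
# below are hypothetical placeholders for illustration only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical historical batch data: rows = batches, columns = process
# parameters (e.g. temperature, pH, pressure, feed rate) - synthetic here.
historical = rng.normal(size=(200, 4))
new_batches = rng.normal(size=(10, 4))
new_batches[-1] += 3.0  # simulate one anomalous batch

scaler = StandardScaler().fit(historical)
pca = PCA(n_components=2).fit(scaler.transform(historical))

def hotelling_t2(X):
    """Hotelling's T^2 statistic in the PCA score space."""
    scores = pca.transform(scaler.transform(X))
    return np.sum(scores**2 / pca.explained_variance_, axis=1)

t2 = hotelling_t2(new_batches)
limit = np.percentile(hotelling_t2(historical), 99)  # empirical 99% limit

for i, value in enumerate(t2):
    flag = "REVIEW" if value > limit else "ok"
    print(f"batch {i}: T2={value:.2f} ({flag})")
```

A multivariate statistic like this can flag a batch whose individual parameters all look normal in isolation but whose combination is unusual, which is exactly the kind of subtle, cross-parameter shift traditional univariate controls tend to miss.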
With digitalization and automation, precious resources can be reallocated to higher value work that requires true human intelligence and creativity, like problem solving instead of data review and periodic trending.
Clearly, digitalization helps identify and even predict issues that we might not be able to see today and will enable us to become more precise and efficient. At the same time, if we want to take a meaningful readout and potentially even automate responses to certain situations or process conditions, we will need to embed key concepts of Quality Risk Management in the design of these augmented intelligence systems. This will require a certain degree of flexibility within the tightly regulated environment to allow a different form of decision-making, and a more adaptable change control concept to allow for assistive interventions in established processes and controls.
For example, digital technologies will enable faster analytical readouts and faster, far more sensitive test methods that can pick up trends in real time, including highly connected systems able to detect minor shifts in quality attributes and relay them into the quality system for comparison with historical trends and established control limits.
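As a rough illustration of what such a comparison could look like, here is a minimal sketch that checks incoming measurements of a single quality attribute against 3-sigma limits derived from historical data, plus a simple trend rule. The values, the attribute, and the "5 points on one side of the mean" rule are all assumptions for illustration, not a recommended control strategy.

```python
# Minimal sketch, assuming in-line measurements of one quality attribute
# arrive as a time series. Limits and rules are illustrative only.
import numpy as np

historical = np.array([99.8, 100.1, 100.0, 99.9, 100.2, 100.0, 99.7, 100.1])
mean, sd = historical.mean(), historical.std(ddof=1)
ucl, lcl = mean + 3 * sd, mean - 3 * sd  # classic 3-sigma control limits

def check_point(value, recent):
    """Flag a new measurement against limits and a simple trend rule."""
    alerts = []
    if value > ucl or value < lcl:
        alerts.append("outside 3-sigma control limits")
    # Hypothetical trend rule: 5 consecutive points on one side of the mean.
    window = list(recent[-4:]) + [value]
    if len(window) == 5 and (all(v > mean for v in window)
                             or all(v < mean for v in window)):
        alerts.append("sustained shift: 5 points on one side of the mean")
    return alerts

stream = [100.0, 100.3, 100.4, 100.3, 100.5, 100.6]
seen = []
for v in stream:
    for alert in check_point(v, seen):
        print(f"value {v}: {alert}")
    seen.append(v)
```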
In this way, the quality team moves away from a focus on reviewing records of batches that have already been produced to becoming a business partner for investigating issues and improving processes, as is seen in other industries. Strong Quality Risk Management foundations will need to be in place to keep up with all the procedural, parameter and process changes and data driven decisions that will surely result from new digital capabilities.
As long as digital and predictive efforts focus on process improvement, waste elimination, and highlighting clusters for further improvement work, we are well on track to getting there as an industry.
When it comes to concepts of “review by exception” (beyond comparing specifications and logged parameters with the actual batch data of runs), or automated assistive guidance for quality decision-making, a solid grasp of the basic concepts of Pharmaceutical Quality Risk Management will be key to developing such applications.
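The mechanical core of review by exception can be sketched very simply: compare logged batch parameters against registered specification ranges and surface only the exceptions for human review. The parameter names and limits below are hypothetical, and a real implementation would of course sit inside a validated quality system with risk-based rules rather than a flat range check.

```python
# Minimal sketch of a "review by exception" check. Parameter names and
# specification limits are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class SpecLimit:
    low: float
    high: float

specs = {
    "temperature_C": SpecLimit(20.0, 25.0),
    "pH": SpecLimit(6.8, 7.4),
    "fill_volume_mL": SpecLimit(9.8, 10.2),
}

batch_record = {
    "temperature_C": 22.3,
    "pH": 7.6,               # out of range -> exception
    "fill_volume_mL": 10.0,
}

def review_by_exception(record, specs):
    """Return only the parameters that fall outside their specification."""
    exceptions = []
    for name, value in record.items():
        limit = specs.get(name)
        if limit is None:
            exceptions.append((name, value, "no specification registered"))
        elif not (limit.low <= value <= limit.high):
            exceptions.append((name, value, f"outside {limit.low}-{limit.high}"))
    return exceptions

for name, value, reason in review_by_exception(batch_record, specs):
    print(f"EXCEPTION - {name}={value}: {reason}")
```

The risk-management questions sit around this simple core: which parameters warrant automated disposition, which exceptions must escalate to a human, and how the rules themselves are changed and controlled over time.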
Come and join us at the 2021 ISPE Asia Pacific Pharmaceutical Manufacturing Virtual Conference on 17-18 June 2021, with a main topic of ‘Sustainable Implementation of Quality Risk Management’, as we discuss with industry peers, SMEs, and global regulators (USFDA, MHRA) the importance of implementing QRM in manufacturing and supply operations, with an emphasis on critical review of quality and manufacturing data.
Learn More