In the recent update to ICH Q9(R1), regulators highlighted the key change:
"The application of digitalization and emerging technologies in the manufacture and control of drug (medicinal) products can lead to risk reduction, when such technologies are fit for their intended use. However, they can also introduce other risks that may need to be controlled. The application of quality risk management to the design, validation and technology transfer of advanced production processes and analytical methods, advanced data analysis methods and computerized systems is important."
Given this, it is critical that digitization software providers strengthen their data governance and apply a framework for data structure, so that new digitization tools do not “introduce other risks that may need to be controlled”.
The move from paper to digital validation
With the advancement of technology, there is a growing trend towards the use of paperless tools in the pharmaceutical industry. The digitization of paper forms and documents has been a crucial first step in this process, but now we are seeing a shift towards the electronic generation and management of data.
This shift towards paperless tools and electronic data has numerous benefits. For one, electronic data can be stored and accessed more easily than physical documents. It can also be analyzed and used more efficiently, leading to better decision-making and improved productivity.
Knowledge management is considered one of the two enablers, along with quality risk management, that are required to achieve the objectives of the Pharmaceutical Quality System (PQS, ICH Q10): “Use of knowledge management and quality risk management will enable a company to implement ICH Q10 effectively and successfully.”
Since data is an integral part of our knowledge, it is imperative that we treat it as an asset; investing in it helps organizations enhance its value. Standardizing the creation and structure of data facilitates seamless querying and analysis across systems and applications, enabling organizations to combine data from diverse sources to gain valuable insights and make well-informed decisions.
If we aim to utilize the most advanced tools and technologies available to streamline the use and management of data, we must embrace a paradigm shift in how we organize and structure information. The relational database model, combined with robust data governance, has emerged as a popular solution for storing and managing complex data sets efficiently.
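To make this concrete, here is a minimal sketch in Python, using the standard library's sqlite3 module, of how qualification records might be held in a standardized relational structure. The schema, table, and column names are illustrative assumptions, not a prescribed industry standard.

```python
import sqlite3

# Illustrative schema only: table and column names are assumptions,
# not a prescribed standard for validation data.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE asset (
    asset_id    TEXT PRIMARY KEY,   -- e.g. an equipment tag
    description TEXT NOT NULL
);
CREATE TABLE test_record (
    record_id   INTEGER PRIMARY KEY,
    asset_id    TEXT NOT NULL REFERENCES asset(asset_id),
    phase       TEXT NOT NULL,      -- design / commissioning / qualification
    result      TEXT NOT NULL,      -- pass / fail
    recorded_by TEXT NOT NULL,
    recorded_at TEXT NOT NULL       -- ISO 8601 timestamp
);
""")

conn.execute("INSERT INTO asset VALUES ('TK-101', 'Buffer prep tank')")
conn.execute(
    "INSERT INTO test_record (asset_id, phase, result, recorded_by, recorded_at) "
    "VALUES ('TK-101', 'qualification', 'pass', 'j.doe', '2023-05-01T09:30:00Z')"
)

# Because the structure is standardized, the same query works
# regardless of which system originally produced the records.
for row in conn.execute(
    "SELECT a.asset_id, t.phase, t.result "
    "FROM asset a JOIN test_record t ON t.asset_id = a.asset_id"
):
    print(row)
```

The specific schema matters far less than the principle: once the structure is standardized and governed, the same query can be run against records from any source.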
Current digital validation limitations
Although automation has advanced our manufacturing processes notably over the past decade, there are still mundane tasks within the design, commissioning, qualification, and validation phases of projects that would benefit from being streamlined.
Consider the amount of time spent recording and verifying the same information about a physical asset across various systems, such as the design firm's document management system, the construction contractor's records, or the CQ contractor's protocols. Recognizing this, it becomes evident that there is significant potential in recording each piece of information once and making it available for others to use, verify, or supplement, rather than duplicating the effort of data collection, recording, and access.
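As a hedged illustration of this “record once, then verify or supplement” idea, the Python sketch below models a single asset attribute with simple provenance fields. All names here (AssetRecord, verify, the party labels) are hypothetical and chosen only for illustration.

```python
import json
from dataclasses import dataclass, field, asdict

# Illustrative structure: field names are assumptions, not a standard.
@dataclass
class AssetRecord:
    asset_id: str
    attribute: str            # e.g. "design pressure"
    value: str
    recorded_by: str          # originating party (design firm, contractor, ...)
    verified_by: list[str] = field(default_factory=list)

    def verify(self, party: str) -> None:
        """A downstream party confirms the value instead of re-recording it."""
        self.verified_by.append(party)

# The design firm records the value once...
record = AssetRecord("TK-101", "design pressure", "3.5 barg", "design-firm")

# ...and the construction and CQ contractors verify it rather than
# duplicating the data collection effort.
record.verify("construction-contractor")
record.verify("cq-contractor")

# Serialized to a neutral format, the record can move between platforms.
print(json.dumps(asdict(record), indent=2))
```

Exported in a system-agnostic format such as JSON, one record can travel between the design firm's, the construction contractor's, and the CQ contractor's systems without re-entry.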
Benchmarking from other industries
The historical example of ASCII (1963), which allowed text to move seamlessly between different machines, and the independence of our phone numbers from service providers and devices both show the value of data being system-agnostic. It is therefore reasonable to expect that the data generated and managed throughout the construction or operation of a plant should possess the same level of interoperability and freedom from reliance on specific platforms or systems.
Decoupling data – the benefits
This blog post has named some instances in our everyday lives where key data assets have been decoupled from their originating systems, with obvious benefits such as porting a mobile phone number or moving a piece of validated information from one system to another. What is less obvious, and not spoken about as often, is the wider impact that the data moats software providers have built over time have on competition and innovation. Examples are everywhere in industry, from ERP to MES solutions, where the mere suggestion of moving away from the status quo is unthinkable due to the complexity of the underlying data structures. The owners of these incumbent solutions will tell you that their closed-system approach is critical for innovation, but evidence from other industries tells a very different story.
In 2015 the EU introduced open banking legislation (PSD2), which required banks to share customer data, with the customer's consent, with authorized third parties. The result was a boom in challenger banks and fintech start-ups providing superior services and niche products, leaving legacy banks, with their decades of technical debt and a distinct lack of innovation, struggling to keep up with challengers such as Revolut and N26, which dominated the market in 2022 and accounted for more than 50% of global revenue.
If similar logic were applied to the pharma sector, and established players were required to make their data available to new market entrants, we would see a comparable boom in innovation and competition, with the best products and services competing for customers and market share, rather than the current situation that rewards only incumbents.
Conclusion

We strongly believe that collaborative, open innovation among industry stakeholders is needed to establish the user requirements and expectations for data structures, data lifecycle management, and governance, and that these standards must be aligned with regulatory requirements and industry best practices. In today's unstandardized digital landscape, where decisions rely heavily on datasets, ensuring the reliability and accessibility of our data flow is of utmost importance: it is crucial for achieving cost reductions and overall quality improvements.