iSpeak Blog

Paperless Validation Systems: Truly Paperless?

Dori Gonzalez-Acevedo

Paperless validation – Oxymoron, perhaps? Is there really a truly paperless validation solution out there, one that fits the needs of the validation lifecycle? The ISPE Sub-Committee for Paperless Validation has defined ‘Paperless Validation Systems’ as paperless solutions that enable validation lifecycle deliverables to be generated and approved and, more importantly, testing to be completed without the need to print paper test documents. The term ‘Paperless Validation’ has been used both by the vendors who develop these software solutions and by the organizations that have aimed to adopt them, and it has been commonplace in our industry for two decades at this point. Yet how far have we come as an industry in making these systems ‘truly paperless’?


There are many unanswered questions. Have the tools developed by vendors delivered on the paperless validation promise? Have the organizations that adopted these solutions achieved a paperless model for validation and verification? Where a paperless validation system has been adopted, how widely is it actually used within the organization, and what has its true return on investment (ROI) been? Is there a gap in features on the vendor/technology side? Is organizations’ reluctance to give up paper a root cause of the lack of use? Is there an inability to adopt the full functionality of paperless systems due to fear of the unknown and/or a lack of ability to truly transform business processes?

In my experience, there is a variety of tools available today that can serve as paperless validation systems. Most are more than capable of enabling a truly paperless validation business process. Where I tend to see most organizations fall short is in failing to use and trust the technology they have bought and configured for that purpose. Let’s look at some key components of this failure to launch.

Failures on Multiple Counts

Multiple failures keep the industry stuck in outdated processes that add no value. For example, there is a common failure to trust the people creating the data that feeds into the tools and systems designed specifically to support validation or quality objectives. That is typically coupled with a failure to properly trust and leverage SMEs in their own domains. It all boils down to an inability or unwillingness to shed legacy mindsets around compliance and validation and instead trust well-defined business processes supported by extremely competent people and efficient technology. While this is a very nuanced conversation, I see struggles occur most often in organizations that disagree on what ‘documentation’ should exist, on ‘who and what’ constitutes a quality review, and, of course, on the demands for resourcing in time, people, and money.

If It Is Not Documented, It Did Not Happen.

If it is not ‘documented,’ it did not happen. How many times have we repeated this mantra? The fallacy is assuming that everything is ‘just fine’ as long as it is written down somewhere, regardless of what was written. That is not necessarily the case. We have seen plenty of examples over the years of insufficient documentation, risk assessments, requirements, testing, and more, whether documented on paper or in an electronic system.

In 1997 (25 years ago), 21 CFR Part 11: Electronic Records and Electronic Signatures was introduced in response to the emerging technology of the time. Software applications were just beginning to be deployed to capture data that supported predicate rules and regulatory submissions. With this change in technology, the proper use of electronic records and electronic signatures needed to be defined. The industry needed definitions and guardrails to ensure consistency and to provide assurance that the data being captured had integrity and the right control points (i.e., reviews and approvals) in place.

Fast forward 20+ years and yes, data is being captured electronically. Massive amounts of data, in fact. However, standard practice for many organizations is to take that data (in many cases far more data than the predicate rules/regulations require), print it all to paper, sign the paper, and then file the paper. All this time and effort is spent despite having used a ‘paperless,’ 21 CFR Part 11-compliant, validated system to capture the data in the first place. The act of printing all this information (either physically or electronically to PDF) is not done because organizations 'have' to, but because many Quality organizations are rooted in a fear-based, overly conservative mindset. It is a worry-based mindset that often takes the form of worst-case scenario planning. The ‘What If’ scenarios play out something like this...

  • “What if the 'system' is not available?”
  • “What if the inspector requests hard copies?”
  • “What if we can’t locate the information fast enough?”
  • “What system is the information actually in and is it the system of record?”

While it’s good that there are answers to these “what if” scenarios, the bad news is that the answers are not really the point. They are the micro portion of a macro issue. IT and Quality professionals need to evaluate why those questions are being asked in the first place. Yet instead of rethinking these questions or really challenging them, what we often see is that the most conservative quality voice wins out. To then resolve all these worries, what happens? Someone usually says...

We should print to PDF, store, and file in another 'electronic' document management system. Or print it and put the documents in a file folder in Document Control.

While all of these are a bit troubling, the last one is the most disturbing to me, as it perpetuates bad practices and the notion that 'data' is free or comes with no overhead. Of course, we need adequate backup and disaster recovery policies and processes in place to avoid a complete loss of data (especially product-related data), but taking data from one system only to recreate more 'static' data in another system serves only to feed technical debt and data expansion. This sprawl of data across the life sciences industry perpetuates the ever-growing archival and data retention issue that so many organizations struggle with, because it compounds data volumes at faster rates than ever before.

21 CFR 314.420 and 21 CFR 814.46 both specify that sponsors should maintain all data and information in the application for at least two years after the date of approval of the application, or after the device is discontinued from marketing. However, it is standard practice within the industry to have data retention policies of 15 years or more. Within many of these organizations, the archival processes and criteria are so broad and open to interpretation that companies cannot determine how to decommission their data, which forces them to hold onto systems years beyond their end of life, along with all associated data for many more years beyond that. In service of what? The one-in-a-million possibility that an auditor asks for a piece of outdated data that cannot otherwise be retrieved.

If that is indeed the fear, then we need to get to the root of why, and what can be done to assuage it. We need to rethink and objectively evaluate whether we are doing the right analysis of what data is even 'appropriate' to hold. Is it critical data? Will the data be needed for future assessments? I am not arguing that there are no cases where data needs to be kept. Absolutely there are! There is critical data to be maintained for the lifetime of the associated drugs, devices, products, etc. A fitting example is the data generated by Software as a Medical Device and other innovative technologies that have become much more prevalent in recent years.

Still, what is of utmost importance is to intelligently and appropriately parse out which attributes are 'critical to quality' and which are not. As an industry, we have not been assessing our data very well in this regard. We tend to apply broad-brush categories (High, Medium, Low, or Direct versus Indirect impact, etc.) without having nuanced conversations about predicate rules, Intended Use, and Fit for Purpose. Paperless validation systems were designed to capture large volumes of information across different requirement types (User, Business, and Design), and yet we add redundancies and manual processes to intervene, which indicates a lack of trust and confidence not only in the technology, but also in the teams that interface with that technology and, worse yet, within the organization itself.


Rethinking your approach to worries about your ‘Paperless Validation System’
Fear Statement: “What if the 'system' is not available?”
Factual Statement: Backup and recovery plans are in place for the ‘Paperless Validation System’.
Reimagined Response (to the internal QA audit team): There is always a risk to any given system at any time. ‘Paperless Validation Systems’ are GxP supporting systems and do not directly impact product quality, safety, or efficacy. The risk of being unable to access the system is mitigated by a variety of factors, including but not limited to backup and recovery plans, periodic audits/reviews, vendor assessments, administrative notifications, etc. Most organizations have ‘mock’ or ‘prep-audit’ days to ensure that key systems associated with products under inspection, or that are ‘typically’ audited, are easily accessible. ‘Paperless Validation Systems’ are not frequently a primary focus of an audit.

Fear Statement: “What if the inspector requests hard copies?”
Factual Statement: Hard copies of select data are only available on request.
Reimagined Response (to the regulator or internal QA team): We’d be happy to show you, within our system, the data that you’d like to review. What exactly do you want to review, so we can gather the key SMEs to assist with this request?

Fear Statement: “What if we can’t locate the information fast enough?”
Factual Statement: Engaging SMEs who understand not only the ‘Paperless Validation System’ but also the specific product/application/equipment being discussed is required to assist the organization’s Audit Lead with regulators.
Reimagined Response (to the regulator or internal QA team): We’d be happy to show you, within our system, the data that you’d like to review. We are gathering our experts on the system and the product so you can ask them directly about the data that you’d like to review.

Trust but Verify

Trust is a major issue for us in the life sciences industry. The lack of true trust within organizations drives an extremely counterproductive divide between the Quality side of the house and the IT/Business side of the house. While peer reviews are certainly important and should be a part of any process regardless of the technical control points a tool might provide, there is a right-sized approach that I too often see passed over in favor of overly burdensome practices. Instead of trusting the system, companies design manual processes that disregard both the technology’s capabilities and the people involved.

Paperless validation systems are designed to automate business process workflows and to enforce control points (i.e., reviews and approvals) within the system itself. However, due to the lack of trust between one part of the organization and another, we see multiple (too many) levels of review and approval, often at data collection points in which one party or another does not necessarily have subject matter expertise. This is a problem at the organizational level, not at the technological level.
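To make “system-enforced control points” concrete, here is a minimal sketch in Python. The states, roles, and function names are assumptions for illustration only and do not reflect any particular vendor’s product; the point is simply that the workflow itself refuses to advance a record until the required sign-off has occurred.

```python
# Minimal sketch of a workflow that enforces review/approval control points.
# States, roles, and names below are hypothetical illustrations.

ALLOWED_TRANSITIONS = {
    "Draft": "In Review",
    "In Review": "Approved",
    "Approved": "Executed",
}

# Control points: the role whose sign-off is required to leave each state
REQUIRED_APPROVER = {
    "In Review": "QA",
    "Approved": "System Owner",
}

def advance(record: dict, actor_role: str) -> dict:
    """Move a record to the next state only if the enforced control point is satisfied."""
    current = record["state"]
    if current not in ALLOWED_TRANSITIONS:
        raise ValueError(f"Record in state '{current}' cannot advance further")
    required = REQUIRED_APPROVER.get(current)
    if required and actor_role != required:
        raise PermissionError(f"Leaving '{current}' requires sign-off by {required}")
    record["state"] = ALLOWED_TRANSITIONS[current]
    record.setdefault("signatures", []).append((actor_role, current))
    return record

protocol = {"state": "Draft"}
advance(protocol, "Author")   # Draft -> In Review (no control point on Draft)
advance(protocol, "QA")       # In Review -> Approved (QA sign-off enforced by the system)
```

In a configuration like this, the sign-offs the organization actually needs are defined once, in the workflow itself, rather than re-checked manually after the fact.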

Often, the result of this lack of trust is a reluctance on the part of Quality to truly understand the applications (in this case, the paperless validation system) and an unwillingness to try something different. In fairness, this may be a symptom of the traditional tendency of these organizations to work within silos rather than more closely together. It could be a matter of a lack of cross-functional and cross-technical enablement shared with other parts of the organization. Again, this speaks to a problem at the organizational level rather than an issue with the technology itself. Additionally, even when these activities do take place to some degree, we can still see skepticism from Quality representatives who were not directly involved in the validation of the electronic system and do not fully trust that it was properly validated, even when that validation was approved by their colleagues in alignment with the organization’s approved procedures.

In many ways, Quality wants the systems to do something that is not physically possible, at least not just yet. Designing systems to enforce quality data is very difficult because it is not usually a ‘system failure’ but rather human error that makes the process most vulnerable. Failing to trust the correctness of the data being entered or created in the system, Quality wants the system itself to ensure the data is correct. Without 100% assurance that a system can prevent incorrect data, and lacking trust in a single person entering or creating data, the result tends to be overengineered processes and manual checkpoints with physical extractions. But there is a healthy middle ground. Let us look at an example.

A system can logically prevent a data entry error (such as entering a weight value) in a way that provides assurance. It is possible to have the data field systematically ensure that a) the data is transferred from the balance accurately, based on the balance’s precision, and/or b) the decimal figures associated with the weight match those of the system configuration. Still, if the data is entered manually, a typographical error could make it into the system (entries of 110.000 g and 110000.000 g could both be accepted). So, we add another pair of eyes to ensure the data entry is accurate. This second data entry check is commonly referred to as the “4-eyes” principle. When assessed and combined with technical controls, this principle should prove more than adequate.
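To show how the technical controls and the 4-eyes check fit together, here is a minimal sketch in Python. The field configuration, range limits, and function names are hypothetical and for illustration only; the idea is that the system rejects entries whose precision or magnitude cannot be correct, and then requires a second, different person to verify what remains.

```python
# Minimal sketch of field-level technical controls plus a second-person (4-eyes) check.
# The field configuration, limits, and names below are hypothetical illustrations.

from decimal import Decimal

# Hypothetical configuration for a single weight-entry field
FIELD_CONFIG = {
    "decimal_places": 3,            # must match the balance's configured precision
    "expected_range_g": (50, 200),  # plausible range; catches typos such as 110000.000 g
}

def validate_weight_entry(raw_value: str, config: dict = FIELD_CONFIG) -> Decimal:
    """Reject a manual entry whose precision or magnitude cannot be correct."""
    value = Decimal(raw_value)
    if -value.as_tuple().exponent != config["decimal_places"]:
        raise ValueError(f"Entry must have exactly {config['decimal_places']} decimal places")
    low, high = config["expected_range_g"]
    if not low <= value <= high:
        raise ValueError(f"Entry of {value} g is outside the expected range {low}-{high} g")
    return value

def four_eyes_check(entered_by: str, verified_by: str) -> None:
    """Second-person verification: the verifier must differ from the data entry author."""
    if entered_by == verified_by:
        raise PermissionError("Verifier must be a different user than the person who entered the data")

# Example: 110.000 g passes both controls; "110000.000" would fail the range check
weight = validate_weight_entry("110.000")
four_eyes_check(entered_by="analyst_a", verified_by="reviewer_b")
```

The value of combining the two is that the technical controls remove the errors a system can catch deterministically, leaving the human reviewer to confirm only what genuinely requires judgment.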

Unfortunately, in many cases, 'we' (Quality) are not satisfied with this. We layer in added control points, either immediately or further downstream, to review the data. Often, this means extracting and 'reporting' data in a physical format to review and/or approve. It also becomes quite common to insist that paper records be retained alongside the transfer to the electronic system; I would recommend reading the ISPE blog about True Copy for more on this. Again, the insistence on retaining paper records inhibits innovation and impedes the ability of our people and our technology to enhance how we manage validation. In the life sciences industry, we invest heavily in these resources and systems, yet prevent them from doing so effectively.

Exclusion of Experts

Typically, Quality organizations run both internal and external audits. It is common practice, and common knowledge within our industry, that internal audits tend to be more rigid and less flexible than audits performed by regulators. Having sat on both sides of audits, representing customers/sponsors as well as software vendors, I have observed a natural tendency toward dictating and controlling the narrative of the audit as much as possible. To achieve this control and allow for as little variance as possible, many companies and Quality teams would rather supply flat-file documents for auditors to review. They do not consider the option of providing access to a system, or to the teams that own that system. The truth is, Quality at many organizations rarely brings SMEs to the audit table, for fear of losing the narrative of the audit. Again, it comes back to an issue of trust in processes, systems, and their own resources.

Through this omission of experts, we often see an undue strain on tools and processes, wherein the 'reports' generated from the paperless validation system are heavily configured to deliver as much of the data as possible without having to give auditors access to the system. However, FDA and other regulatory bodies are demanding more access to systems across the board. This reluctance to grant access to systems and SMEs must be overcome. Not only should it be overcome, it should be embraced. Involving SMEs who understand the system under audit in audit readiness, and having them at the table during an audit, is not only beneficial to the outcome of the audit itself but also builds internal trust and connectedness across the organization. Moreover, it builds trust between the company and the regulators.

Conclusion

And that connectedness is really the crux of it. Cohesion, trust, and a unified organization are the key to all of this. The technology is there, and the tools are more than up to the task, but we as an industry need to embrace them on their terms, not force them to bend to the terms we defined decades ago. Trust the technology, trust your experts, and work together to define the right levels of process and the right measures for capturing and controlling necessary data. Reshape and commit to training and enablement across all teams to ensure proper adoption and understanding of the technology, and to the upfront effort of defining risk profiles, proper requirements, and a right-sized review process.

If we remake our processes, paperless validation solutions can be truly paperless; but only if we allow them to be. After all, a tool is only as good as the process it supports. And a process is only as good as the team that defines it: not rigidity, siloed work habits, or maintenance of a perceived status quo, but pooled knowledge and collaboration that allow for efficiency and innovation.

Disclaimer:

iSpeak Blog posts provide an opportunity for the dissemination of ideas and opinions on topics impacting the pharmaceutical industry. Ideas and opinions expressed in iSpeak Blog posts are those of the author(s) and publication thereof does not imply endorsement by ISPE.