Why We All Need Critical Thinking
It may seem that the only things the pharma industry loves more than acronyms are new buzz phrases. And that may be true. But sometimes, behind the buzz phrase, there are real, strong, achievable benefits to our work, our industry, and our end patients.
Critical thinking is one such buzz phrase. The concept originated outside our industry and has been enthusiastically embraced by academic institutions and consultants globally, yet most of the definitions are somewhat unintelligible and hard to relate back to real life.
For example, the Foundation for Critical Thinking1 introduces it thus: Critical thinking is the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action.
Wikipedia2 offers no less than eleven different definitions and three sub-definitions. Isn’t such complexity the antithesis of critical thinking itself?
Me, I like to simplify it down to what resonates with my own psyche:
- It’s common sense, if any of you can remember such an old-fashioned concept.
- It’s Vulcan logic, ably demonstrated by Spock in Star Trek and frequently ignored by the irrational Captain Kirk (know anyone like this in your organization?).
- It’s choosing wisely, excluding bureaucracy, hierarchy, ego, bias, and ambition from the decision-making process.
- It’s focusing on what matters, which has to be quality instead of compliance and documentation. Get the quality right and the compliance will be there anyway.
I’m sure at this point the many critical thinking experts are gnashing their teeth in anger at my irreverent attitude, but honestly, I don’t care. It seems to me more important to get industry using critical thinking than to debate the intricacies of the paradigm.
Let’s look at a few examples of critical thinking with GxP computerized systems; after all, I am a lifetime GAMPer.
Requirement Specifications
We all agree we need to capture what a computerized system must do when we are planning a new system, i.e., the functionality it will provide to support a particular business process. This means we need to define what regulations apply and what individual requirements must be met within the overall regulatory framework. We need requirements to define the specific functionality that the system needs to provide relating to the business process; how it will control an activity, analyse a sample, calculate a result, and store the data. There may be particular constraints or infrastructure requirements: cloud-hosting, operating in a cleanroom, connecting or interfacing to an existing system.
These are all tremendously important and need to be captured as requirements.
But requirements for how many levels of sub-menu are allowed in the system? Requirements for “fast, easy, user-friendly”? These are meaningless to the business process and useless to the end patient.
What about capturing the requirements in a Word document versus using a requirements management tool? What choice of logo, font, line spacing? Will the requirements be hand-signed or electronically approved? None of this has any impact on product quality or patient safety. As long as the requirements are controlled, approved, protected against unauthorized changes, available, and kept current throughout the life of the system, then the information format, media, and appearance also have no impact on data integrity.
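To illustrate that the principle is about control rather than format, here is a minimal sketch (hypothetical names and fields, Python chosen for brevity, not from any GAMP template) of a requirement record that is baselined at approval with a content hash, so any unauthorized change is detectable regardless of how the text is rendered or stored:

```python
import hashlib
from dataclasses import dataclass, field

@dataclass
class Requirement:
    """A controlled requirement; appearance and media are irrelevant to integrity."""
    req_id: str
    text: str
    approved_by: list = field(default_factory=list)
    _baseline: str = ""  # content hash captured at approval

    def approve(self, reviewer: str) -> None:
        """Record an approval and baseline the content at that moment."""
        self.approved_by.append(reviewer)
        self._baseline = hashlib.sha256(self.text.encode()).hexdigest()

    def is_intact(self) -> bool:
        """True if the text is unchanged since the last approval."""
        return self._baseline == hashlib.sha256(self.text.encode()).hexdigest()

req = Requirement("URS-001", "The system shall calculate assay results per the defined method.")
req.approve("process.owner")
assert req.is_intact()

req.text = "tampered"       # an uncontrolled change...
assert not req.is_intact()  # ...is immediately detectable
```

The same guarantee holds whether the record lives in a Word file under document control or in a requirements management tool; the control, not the container, is what matters.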
I mentioned ‘approved’ in the list above. Review by the process owner and system owner – absolutely. Review by four other individuals with no system or process knowledge who just wanted their name on the document – pointless. My personal record on a CSV project was a customer demanding a total of nine reviewers and approvers. If a reviewer is not going to read the content or understand the content, what does their signature represent: An autograph for posterity? A confirmation that they believe the earlier reviewer probably read it? Not helping….
Testing
There are major gains to be had by applying critical thinking to testing, and it’s therefore no surprise that FDA’s excellent instigation of Computer Software Assurance (CSA) approaches has a significant focus on testing.
I would conservatively estimate that click-by-click detailed test scripts take five times longer to write than to execute. The largest proportion of defects found by such test scripts is, without a doubt, errors in the scripts themselves. Let’s look at the motivation for such detailed test scripts.
| Motivation | Critical Thinking |
|---|---|
| You need to challenge a specific path or branch in the software; the detailed instructions ensure you get to the path or branch. | ✓ This brings value in defect detection and ultimate benefits to patient safety. |
| The tester doesn’t know how to operate the software, so they need the click-by-click instructions to compensate for the lack of system knowledge. | ✗ WRONG choice of tester. |
A detailed test script will only ever test the same paths and branches corresponding to the instructions. To get better coverage of the software, use unscripted testing techniques such as ad hoc, exploratory, error-guessing, and day-in-the-life testing. It is all documented – what was tested, by whom and when, any issues and their disposition, overall conclusion – and significantly improves the probability of proactively finding a defect that will negatively impact your business process in operation. Documentation alone is no defence against defects.
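To show that unscripted testing still yields a complete record, the sketch below (hypothetical field names, a simplified illustration rather than any official CSA or GAMP template) captures exactly the items listed above: what was tested, by whom and when, issues with their disposition, and an overall conclusion:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Issue:
    description: str
    disposition: str  # e.g. "fixed", "accepted", "deferred"

@dataclass
class SessionLog:
    """Record of one unscripted (e.g. exploratory) test session."""
    charter: str                  # what was tested: the mission of the session
    tester: str                   # by whom
    started: datetime             # when
    issues: list = field(default_factory=list)
    conclusion: str = ""          # overall conclusion

    def log_issue(self, description: str, disposition: str) -> None:
        self.issues.append(Issue(description, disposition))

# Example session record
session = SessionLog(
    charter="Explore result calculation around boundary values",
    tester="qa.analyst",
    started=datetime.now(timezone.utc),
)
session.log_issue("Rounding differs at the 0.5 boundary", "fixed in next build")
session.conclusion = "Boundary handling acceptable after fix"
```

The point is that an exploratory session is not undocumented; the charter and session log provide the evidence, while the tester's system knowledge drives the coverage.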
And there are plenty of other efficiency and quality gains to be had around streamlining test evidence. And eliminating forensic review of completed testing. And so on…
How to learn more about Critical Thinking for Computerized Systems?
Good news. By happy coincidence, we have just significantly revised the industry benchmark for managing GxP computerized systems – the ISPE GAMP® 5 Guide – into a Second Edition. Critical thinking has its own appendix providing detailed insight, and it is also woven through the guide’s main body and appendices. The testing approaches in Appendix D5 have been substantially revised from a documentation-heavy, detailed test-script approach to focus on information capture and assurance of intended use, leveraging vendor testing and combining scripted and unscripted testing to maximum advantage. Yes, CSA is built into the Second Edition.
Specific examples such as those in this blog and many more are contained in the Second Edition. What examples can you share on critical thinking gains in computerized systems?