
What Process Validation Really Means — And Why We Keep Getting It Wrong

  • Writer: Elizabeth Zybczynski
  • Apr 26
  • 6 min read


The Truth About Process Validation — And the Systemic Mistakes Holding Us Back

Misunderstanding how to execute effective Process Validation carries significant consequences—ranging from quality failures and regulatory findings to wasted capital and increased patient risk.


This misunderstanding starts with a somewhat confusing name. What we commonly refer to as Process Validation isn't actually validation at all; it is verification: a demonstration that the process can consistently produce conforming product. Validation is building the right thing; verification is building it right. Building the right thing includes the right automation, the right material flow, the right throughput, the right cost structure, and so on. Actual process validation is most closely described by the PPQ step in the FDA's Guidance for Industry: Process Validation: General Principles and Practices.


Process Verification involves a series of engineering activities to build confidence that the product rolling off the end of the line meets specifications. I know, I know: the industry lingo is that Process Validation is for when the product can't be 100% verified. Even if we called it Process Verification, these would still be two distinct activities: Process Verification versus Product Verification.


I recognize that the terminology is entrenched, and we will continue to use the regulatory term "Process Validation." However, it is essential to treat these as distinct concepts so the appropriate engineering and scientific activities occur. For the remainder of this article, I will use the term Process Validation, but the underlying activity remains the verification of process output.


The Three Stakeholders We Must Serve

Success in cGMP and manufacturing requires meeting the needs of three critical stakeholder groups:

1.      Patients, who depend on high‑quality, reliably available, life‑saving and life‑sustaining products.

2.      Regulators, who must be able to oversee products and the processes that produce them to protect public health.

3.      The business, which must manage cost, throughput, and supply predictably. As I often told my cGMP staff: You get zero points for being compliant if the plant doesn’t run.

 

This may challenge conventional thinking, but the point is critical. When we frame these stakeholder needs as competing—quality versus operations, regulators versus industry—everyone loses. The mindset must shift toward achieving Quality and Continuity and Operational Excellence and Compliance. A well‑designed, well‑validated, and well‑monitored process serves all stakeholders simultaneously. It does so by controlling defect levels and, fundamentally, by demonstrating robust process capability.


How do we get there?

Through Engineering and Science.


A process—any process—is a defined sequence of steps that transforms a set of inputs into an output. Those steps may be automated or manual, simple or complex, co‑located or distributed, but they all function as a process to transform inputs into outputs. In pharmaceutical and medical device manufacturing, processes use combinations of people, equipment, procedures, and training to convert incoming materials—APIs, water, components, resins, solvents, bonding agents (inputs)—into finished products or intermediate components that serve as inputs to subsequent processes (outputs).

Every process also operates with controls, which ensure the output meets requirements, and constraints, which the process must satisfy, such as throughput, documentation standards, or electrical and safety requirements.
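To make this anatomy concrete, here is a minimal sketch in Python. The names (`ProcessStep`, `impact_of_change`) and the granulation/compression example are my own illustrations, not an industry-standard schema; the point is that once inputs, outputs, controls, and constraints are written down explicitly, questions like "what does this change affect?" become mechanical to answer.

```python
from dataclasses import dataclass, field

@dataclass
class ProcessStep:
    """One step in a process: inputs in, outputs out, under controls and constraints."""
    name: str
    inputs: list[str] = field(default_factory=list)       # e.g. APIs, solvents, components
    outputs: list[str] = field(default_factory=list)      # finished product or intermediates
    controls: list[str] = field(default_factory=list)     # ensure the output meets requirements
    constraints: list[str] = field(default_factory=list)  # throughput, safety, documentation

def impact_of_change(steps: list[ProcessStep], changed_input: str) -> list[str]:
    """Trace which steps (in processing order) a changed input touches,
    directly or through the intermediates those steps produce."""
    affected, feed = [], {changed_input}
    for step in steps:
        if feed & set(step.inputs):
            affected.append(step.name)
            feed |= set(step.outputs)  # downstream steps consume these intermediates
    return affected

# Hypothetical two-step tablet process for illustration only.
granulation = ProcessStep("Granulation", inputs=["API", "binder"], outputs=["granules"],
                          controls=["blend time", "moisture NMT 2%"], constraints=["batch record"])
compression = ProcessStep("Compression", inputs=["granules"], outputs=["tablet cores"],
                          controls=["compression force"], constraints=["throughput"])

print(impact_of_change([granulation, compression], "binder"))
# → ['Granulation', 'Compression']  (the binder feeds granulation, whose granules feed compression)
```

A structure like this is also what makes the fast, defensible impact assessments described later possible: the affected scope falls out of the declared connections rather than from memory or tribal knowledge.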

 


Breaking down a process in this manner delivers several critical benefits:

1.      It creates a deliberate, scientifically designed process with purposeful, well‑understood controls. Documenting constraints ensures they can be planned for, monitored, and verified. I once visited a facility where several steps in the SOP were marked with “+++,” indicating a historical regulatory commitment. Yet no one could explain what the step accomplished or why it existed. The process had devolved into a collection of disconnected obligations rather than a coherent design. This does not promote quality, compliance, or operational excellence.

2.      It drives essential Process Validation activities—ensuring critical elements are included and non‑value‑added activities are eliminated.  A structured breakdown prevents validation from becoming a paperwork exercise and keeps the focus on what truly matters.

3.      It informs where process monitoring must occur, recognizing that validation is not a one‑time event.  Understanding the process architecture clarifies which controls require ongoing surveillance to maintain a state of control.

4.      It enables fast, accurate impact assessments when changes occur.  When the process is clearly defined, you immediately know what is affected—and what is not. 

5.      It equips you to clearly explain to reviewers and inspectors how you maintain an ongoing state of control.  A well‑structured process narrative demonstrates intentional design, scientific rationale, and disciplined execution—exactly what regulators expect to see.


And Some Math

Now that the inputs, outputs, controls, and constraints are connected, we can determine what to measure. The goal of Process Validation is to establish high confidence that the process can consistently produce conforming product. Unfortunately, there is a great deal of misunderstood sampling math in the industry—far more than I can unpack here—that routinely undermines effective validation.


What you should know is this: If you rely on attribute data and pull 200 samples with zero defects, you are effectively accepting a defect rate of ~1.2%—a process capability of roughly Cpk ≈ 0.8. At 400 samples, you only reach Cpk ≈ 0.9. These correspond to 3.5–4 sigma performance.
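For readers who want to check figures like these themselves, here is a sketch of the underlying zero-defect sampling math. It uses the exact binomial upper confidence bound for zero failures in n samples, p_upper = 1 − α^(1/n), and converts that defect rate to a rough one-sided capability via the normal quantile (Cpk ≈ Z/3). The 90% confidence level is my assumption; the exact numbers depend on the confidence level chosen, so treat the output as illustrative rather than a reproduction of the figures above.

```python
from statistics import NormalDist

def zero_defect_bound(n: int, confidence: float = 0.90) -> tuple[float, float]:
    """Upper confidence bound on the defect rate after n attribute samples with
    zero defects, plus the roughly equivalent one-sided Cpk (Cpk ~ Z/3)."""
    alpha = 1.0 - confidence
    p_upper = 1.0 - alpha ** (1.0 / n)       # exact binomial bound for 0 failures
    z = NormalDist().inv_cdf(1.0 - p_upper)  # defect rate -> sigma level
    return p_upper, z / 3.0

for n in (200, 400):
    p, cpk = zero_defect_bound(n)
    print(f"n={n}: accepted defect rate ≈ {p:.2%}, Cpk ≈ {cpk:.2f}")
```

Running this gives roughly a 1.1% defect rate and Cpk near 0.76 at n = 200, and about 0.57% and Cpk near 0.84 at n = 400, in the same territory as the figures quoted above. The broader point stands regardless of rounding: zero defects in a few hundred attribute samples is weak evidence of capability.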


That is not good. In fact, many pharmaceutical and medical device companies are validating processes at capability levels comparable to a discount airline’s baggage‑handling operation.

Process Validation should instead ensure the process is capable of achieving a Cpk of at least 1.33. Only a capable process can meet the needs of all stakeholders—and, in truth, the needs of any one of them.


Where Process Validation Goes Off the Rails

Bad science is always bad compliance. When compliance—the minimum acceptable standard—is used to justify weak science or inadequate engineering, organizations fail on both fronts. FDA’s Process Validation: General Principles and Practices outlines the conceptual framework but offers limited guidance on how to operationalize these principles within real manufacturing environments. Conversely, PIC/S Annex 15 provides extensive requirements and definitions, yet offers little insight into the underlying scientific and engineering principles that make a process truly capable.

And in truth, this gap is not the regulator’s responsibility to fill. It is the industry’s obligation to our patients, regulators, and business stakeholders to apply rigorous science and sound engineering, then articulate clearly how those practices achieve compliance and ensure safe, effective, and reliable products.

 

IQ/OQ/PQ frameworks often promote a box-checking mindset. The first step in shifting our approach is abandoning that mentality and re-engaging critical thinking. Too often, legacy IQ/OQ/PQ frameworks have reduced validation to a paperwork exercise rather than a scientific demonstration of process capability.

I once reviewed a validation report divided into three sections—each with the same cover page—where the only distinction was a literal checkbox indicating whether the section represented IQ, OQ, or PQ. When I asked how these activities connected to demonstrate a robust, capable process, the answer was simple: they didn’t; each was completed independently.

I call this working harder to be less compliant. Validation activities must be integrated, scientifically justified, and tied directly to process performance, not treated as isolated documents completed for the sake of completion. Done that way, validation becomes both lean and effective.


Getting Back on Track

Where it All Starts

Now that we’ve challenged decades of legacy Process Validation practices, the question becomes: where do we go from here?  We go back to the beginning.

Teams often hesitate to do this because they believe they “don’t have time.” The reality is the opposite: you don’t have the time not to do this. Without clearly defining process intent, process constraints, and process requirements, there is no meaningful finish line—only activity without direction.

Failing to establish these fundamentals early leads to late‑stage failures that are far more costly to correct than issues identified during early Process Validation activities. Worse, organizations may mistakenly believe they have “finished” validation, only to encounter post‑launch quality problems, operational disruptions, or regulatory observations that reveal the underlying gaps.

Returning to first principles is not rework—it is the only path to a capable, compliant, and sustainable process.


Creating a Holistic Validation Approach

Once you have a holistic description of the process, the next step is to design a validation approach that is intentional, science‑based, and aligned with process capability. This includes three core elements:

 

1. Verifying Constraints and Confirming Output Meets Specification

You must define how you will verify that:

 

·       Process constraints are satisfied, and

·       The output (finished product or intermediate) conforms to specification.

 

Where possible, invest in variable method development rather than relying solely on attribute methods. Variable methods reduce sample size, accelerate decision‑making, and provide higher statistical confidence—delivering significant cost and time advantages over the life of the process.
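A quick, hedged illustration of why variable methods pay off: with measured data you can estimate Cpk directly from the mean and standard deviation against the specification limits, rather than inferring capability from pass/fail counts. The fill-weight numbers and 9.0–11.0 g specification below are hypothetical.

```python
from statistics import mean, stdev

def cpk(data: list[float], lsl: float, usl: float) -> float:
    """Process capability from variable (measured) data: the distance from the
    mean to the nearer spec limit, in units of three standard deviations."""
    mu, sigma = mean(data), stdev(data)
    return min(usl - mu, mu - lsl) / (3 * sigma)

# Hypothetical fill-weight measurements (grams); spec 9.0 - 11.0 g.
weights = [9.9, 10.1, 10.0, 9.8, 10.2, 10.0, 9.9, 10.1, 10.0, 10.0]
print(f"Cpk = {cpk(weights, lsl=9.0, usl=11.0):.2f}")
# → Cpk = 2.89
```

Ten measurements here support a capability statement well above 1.33, whereas demonstrating anything comparable with attribute data would take hundreds of samples, which is exactly the cost and time advantage noted above.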

 

2. Defining Activities Across FAT, SAT, IQ, OQ, and PQ

The goal is not to abandon traditional phases, but to use them deliberately. Each phase should contribute meaningfully to building confidence in the process and detecting issues early.

 

Key principles:

·       Early issue detection must guide activity placement.

·       Activities should build cumulative confidence, not operate as isolated checkboxes.

·       Plan for all plausible outcomes, including failure modes.

 

Too often, organizations reach PQ without having considered what happens if PQ fails. When that occurs, it becomes clear that upstream risk‑reducing activities were never built into the plan. Thoughtful sequencing prevents discovering fundamental issues—such as improper equipment grounding—only after running multiple lots of saleable product.

 

3. Establishing Ongoing Monitoring to Maintain a State of Control

Process Validation represents a moment in time. Sustained control requires a monitoring strategy that focuses on the performance of process controls, not just the output. This is the foundation of Statistical Process Control (SPC).

Your monitoring plan should include:

 

·       Defined limits for process control parameters

·       Action plans for excursions

·       Integration into a formal Control Plan that governs ongoing operations

Monitoring controls—not just product—provides earlier signals, better diagnostics, and a more reliable state of control.
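As one concrete sketch of such a monitoring plan, here is a minimal individuals-type control chart in Python: limits are set from in-control baseline data, and any excursion outside them is flagged for the action plan. The sigma estimate uses the standard moving-range approach (average moving range divided by d2 = 1.128 for subgroups of two); the baseline values are hypothetical, and a real SPC implementation would add run rules and rational subgrouping.

```python
from statistics import mean

def control_limits(baseline: list[float]) -> tuple[float, float, float]:
    """Individuals-chart limits: center line ± 3σ, with σ estimated from the
    average moving range (MRbar / d2, d2 = 1.128 for n = 2)."""
    center = mean(baseline)
    mrbar = mean(abs(b - a) for a, b in zip(baseline, baseline[1:]))
    sigma = mrbar / 1.128
    return center - 3 * sigma, center, center + 3 * sigma

def excursions(values: list[float], lcl: float, ucl: float) -> list[int]:
    """Indices of points outside the control limits -> trigger the action plan."""
    return [i for i, v in enumerate(values) if not (lcl <= v <= ucl)]

baseline = [10.0, 10.1, 9.9, 10.0, 10.2, 9.9, 10.1, 10.0]  # hypothetical in-control data
lcl, cl, ucl = control_limits(baseline)
print(excursions([10.0, 10.1, 11.5, 9.9], lcl, ucl))
# → [2]  (the third point is outside the limits and triggers the action plan)
```

Note that the chart watches a process control parameter, not finished-product test results, so a drift shows up while there is still time to act.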


Join the conversation by commenting below, and follow A–Z Continuous Compliance for ongoing insights into designing capable processes and elevating validation practices across the industry.

 

 

 
 
 



© 2026 by A-Z Continuous Compliance, LLC.

Built on Science. Driven by Evidence. Ready Every Day.
