Excerpt from Special Edition: Computer Validation II
A Practical Approach to PLC Validation
By Jay H. King
formerly Medeva Pharmaceuticals
"My first experience with PLC validation was absolutely awful;
I knew that there had to be a better way."
How many of you have validated Programmable Logic Controllers (PLCs) or computers in the past? How many of you are going to have to do it again? And how many of you think that it is an awful, complicated, and terrible thing to have to do? If the answer is "Yes," I used to feel that way as well.
My first experience with PLC validation was absolutely awful; I knew there had to be a better way. I knew that it had to be much easier than I had made it and that I could somehow communicate that to others.
PLC validation is not hard, and it doesn't have to be painful. It can, at times, be tedious or frustrating, but it doesn't have to be hard. If you organize your validation program and have the proper things in place before you get started, you can get through it. And, when you're done, you can actually have a validation package that people can look at, understand, and get something from. It can be done.
Much of what I'm going to discuss is generally applicable to computer validation. PLCs and computers are interchangeable in terms of the way they are developed and the way they are put into place.
Have you been told that PLC validation is different from Installation Qualification (IQ) and Operational Qualification (OQ)? As far as I'm concerned, it's all the same. Many people have promulgated the belief that PLC validation is like black magic. They would have you believe that, in order to make it happen, you need a cloak, a pointed hat, and some crystals. They are wrong, though, because you can approach it with your standard validation tools and treat it in the same way.
If this is the first time you have had to deal with anything of this nature, then you probably feel like you've been thrown to the wolves.
My first experience with this was probably typical of what many people have experienced. A system was developed by engineering, and they threw the specification over the wall. "The equipment is there for validation. Go for it." Those of you who have been involved in computer validation know how this can go.
The system needed to be validated right away, but there was a shortage of time and resources. It fell to my department to fill the holes. There was a good deal of crying and gnashing of teeth along the way, but in the end, we came out of it with some very good tools.
How did I get through it? For starters, I went through the literature that was out there. Orlando López (a member of the Editorial Advisory Board for this Journal) has written some very good articles on a checklist approach to PLC validation. There have also been a number of articles that have appeared in Medical Device & Diagnostic Industry and Pharmaceutical Engineering over the last couple of years. These were all different places to start and see what the state of the art was.
Then I stepped back.
I looked at reports by other consultants, and the reports that they had written for PLCs didn't look like our other validation reports. I didn't feel that was necessarily the only way to do it, because I have found that validation can be broken down into a three-step process:
- You develop a methodology for how you're going to test something.
- You draft acceptance criteria.
- You perform your tests, and you get some results.
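To make the three-step shape concrete, here is a minimal sketch of it as a test record in Python. Every name, setpoint, and tolerance below is my own illustration, not something from the article:

```python
# Sketch of the three-step validation structure:
# methodology -> acceptance criteria -> executed test with results.

def run_test(name, method, acceptance, measure):
    """Execute one validation test: document the method, capture the
    result, and judge it against the pre-drafted acceptance criteria."""
    result = measure()
    return {
        "test": name,
        "method": method,          # step 1: the documented methodology
        "result": result,          # step 3: the recorded result
        "pass": acceptance(result) # step 2: the pre-agreed criteria applied
    }

# Hypothetical example: verify a mixer holds its speed setpoint.
record = run_test(
    name="Mixer speed hold",
    method="Set 120 RPM; read back actual speed after 60 s",
    acceptance=lambda rpm: abs(rpm - 120) <= 2,  # +/- 2 RPM criterion
    measure=lambda: 121.0,                       # stub for the real reading
)
```

The point of the structure is that the method and the criteria exist, in writing, before the measurement is taken.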
I didn't see any reason why the validation of PLCs couldn't be handled in the same manner.
It was very important to me to look at the validation of PLCs that way. In my facility, we had only just begun to get into validation in general. I wasn't ready to introduce a whole different way of looking at things. On top of that, I wanted to use the same nomenclature, the same terminology, and the same approach so that we could all understand one another.
I had a lot of fun asking people to give me a definition of what a PLC is. If you ask an engineer, you get a very exciting definition. If you ask somebody who works with one, you will probably get a more useful definition. The people working with them will tell you it's a relay replacer.
Up until the not-too-distant past, industrial equipment was generally controlled with relays. Electricians would wire them together, people would push buttons, and things would happen. During the '60s or thereabouts, companies began to implement some of the functionality of the relays with software.
In doing so, they invented a programming language called Ladder Logic. Most people will look at Ladder Logic and be mystified. Evidently it was designed so the electricians of that day could understand it; it mirrors the relay wiring diagrams they were already using.
The PLC, though, is a general-purpose computer that has been dedicated to input and output (I/O) functions: it runs simple algorithms that read inputs and drive outputs, and it is dedicated to control applications. It is possible to do other things with it, but I wouldn't recommend it.
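As a rough illustration of both points (the relay replacer and the dedicated I/O computer), here is the classic PLC scan cycle sketched in Python: read the inputs, evaluate the logic, write the outputs, repeat. The start/stop latch is the software equivalent of a simple relay seal-in circuit; the signal names are my own assumptions:

```python
# One pass of a PLC-style scan: inputs in, logic evaluated, outputs out.
# The "rung" below replaces a classic motor start/stop relay circuit.

def scan(inputs, state):
    """Evaluate one scan of the control logic."""
    start = inputs["start_pb"]   # momentary start pushbutton
    stop = inputs["stop_pb"]     # momentary stop pushbutton
    # Seal-in (latch) rung: the motor runs if start is pressed OR it was
    # already running, AND stop is not pressed -- exactly what a relay
    # holding contact used to do in hardware.
    state["motor"] = (start or state["motor"]) and not stop
    return {"motor_contactor": state["motor"]}

state = {"motor": False}
out = scan({"start_pb": True, "stop_pb": False}, state)   # start pressed: motor on
out = scan({"start_pb": False, "stop_pb": False}, state)  # start released: latch holds
out = scan({"start_pb": False, "stop_pb": True}, state)   # stop pressed: motor drops out
```

A real PLC runs this loop continuously, typically every few milliseconds, which is why each line of Ladder Logic behaves like a live relay rather than a one-shot statement.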
I've seen someone write a Ladder Logic program that did serial-to-parallel data conversion. It was horrible. This isn't the place to do that, and these aren't the right tools. However, PLCs are very robust. Even though industrial grade computers are on the market, PLCs are more suitable for plugging into the manufacturing environment.
That, like everything else, is changing. I suspect that over the next 20 years, PLCs will be replaced by general purpose computers. Why? Because computing power is becoming less and less expensive and more and more tools are being made available.
Even today, there are other languages available besides Ladder Logic, languages with which you can actually program a PLC with many of the same tools that would be used with a computer.
What Makes PLC Validation Unique?
There are some differences between PLC validation and "standard" validation, and these differences apply equally well to software validation in general. What makes software validation so different is the complete absence of physical artifacts. We're used to walking out, putting our hands on a piece of equipment, and applying a measuring stick, tachometer, thermometer, etc., to whatever that piece of equipment is doing.
That isn't the case with software. You can never actually touch, see, sense, or perceive the true software. Anytime you're looking at a screen display of the software, or a printout of the software, all you are seeing is an interpretation, by another device, of what the software actually looks like.
So the big challenge in PLC validation, and validation in general, is to somehow give reality to these artifacts. Our challenge as validation professionals is to document that these artifacts are valid and responsive to their user's needs.
Another significant difference in the validation of PLCs is that validation starts at the beginning of development. With traditional equipment, it is conceivable that the engineering department could design something and toss that design "over the wall." The facilities department could go out and buy it, plumb it up, install it, and then they could come to the validation department and say, "Oh, by the way, we have this thing we'd like you to validate." And you could generally be successful.
That won't work with PLC validation. Validation has to be part of the program from the beginning, or you're going to have a huge amount of work at the end. And you probably won't be happy with the results.
Even a simple PLC program and system is probably more complex than the most complex piece of equipment you'll ever validate. A tablet press may have a few thousand moving components, but a PLC program with many thousands of lines of Ladder Logic - if you consider each line to be a component - is immensely more complex.
That is another challenge for the people in validation: simplifying that complexity to a point where we can deal with it. We can't be expected to look at every component of the PLC. It certainly isn't done when you look at a piece of equipment. With equipment, you look at its functionality, and that's what we need to do with PLCs. We need to look at the functionality of the software.
Don't forget the expertise of the developer. Nowhere else in our industry does someone with so little understanding of the requirements have so great an impact on the outcome. The person who programs the PLC typically knows very little about the process, current Good Manufacturing Practice (cGMP), or anything else that matters to us as validation personnel.
The developer is an engineer, an electrician, or a software guy looking at a set of specifications (probably something that was drawn out on a napkin by somebody in the cafeteria), and as he is executing that software, points are going to be missed, very important points. Which is why validation has to start at the beginning and continue on through the entire process.
The Wrong Way to do PLC Validation
This is the wrong way to do PLC validation:
- The developer prepares the hardware and the Ladder Logic.
- Somebody designs a control system and writes some software.
- The system, which doesn’t perform as desired, gets tweaked to make it do what the user originally wanted.
- The system gets handed over to validation, who is told to get it validated “real quick.”
When you do it the wrong way, you discover problems during validation. That is the worst possible time to discover something amiss, because, as usual, the validation function is at the tail end of the time line and everybody is saying, “Validation won’t allow this system into production.” Then you have vice presidents and presidents banging on your door asking why you’re keeping them from making product.
Changes are required though, because you discover problems and, worst of all, the defects are hidden in the system. If the right development work hasn’t been done up front, you can’t avoid the fact that there are going to be hidden problems that you won’t find until much further down the road. It’s an inconvenience for validation, but it’s a potential disaster for the business (especially if it requires a recall).
Worst of all, the system will fail to meet the user’s needs. The opportunity is there, during the development phase, to make these software control systems meet the user’s needs. It only requires a little bit of paperwork up front and communication. It’s truly unfortunate when all of the effort to put a system in place is expended, and it doesn’t do what anybody really needed it to do.
One of the biggest challenges with retrospective validation of PLCs is to really understand what each piece of code means; Ladder Logic is not self-explanatory the way a conventional computer program can be. With PLCs, the developer has to sit down and explain the addresses, the alarms, the inputs and outputs, the interfaces, and so forth so we can understand. And many times, as you are going through the code line by line with the developer, he will say, “Wow, I didn’t know it was going to do that.”
One time when I asked a developer to walk through the code, it turned out to be the first time he had actually reviewed the code with a third party. I asked him, “What happens if you get a signal that’s greater than this conditional statement?” He shrugged his shoulders and said, “I think the program would crash. I think something disastrous would happen.”
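The kind of gap that walkthrough exposed can be closed with an explicit range check on every conditional that handles an external signal. This is a hypothetical sketch only; the 4-20 mA span, the names, and the fail-safe value are my assumptions, not anything from the article:

```python
# Defensive handling of an analog input: an out-of-range signal is
# clamped to a safe value and flagged as an alarm, instead of falling
# through a conditional into undefined behavior.

RAW_MIN, RAW_MAX = 4.0, 20.0  # assumed 4-20 mA span from the transmitter

def read_level(raw_ma):
    """Scale a 4-20 mA signal to 0-100%, with explicit out-of-range handling."""
    if raw_ma < RAW_MIN or raw_ma > RAW_MAX:
        # Signal greater (or less) than the conditional expects:
        # return a safe value and raise an alarm rather than "crash".
        return 0.0, True
    percent = (raw_ma - RAW_MIN) / (RAW_MAX - RAW_MIN) * 100.0
    return percent, False

level, alarm = read_level(12.0)  # mid-span reading: 50%, no alarm
bad, fault = read_level(25.0)    # over-range reading: safe value, alarm set
```

Whether the logic lives in Ladder Logic or anywhere else, the question "what happens when the signal exceeds this conditional?" should have a documented answer before validation ever sees the system.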
Involving validation experts early in the process will result in few changes to the system during development, and that’s always preferable. Hopefully, you’ll have no hidden defects. If the validation and testing plan has been carried out properly, there shouldn’t be any, and the system will meet the user’s needs.
It costs no more to do it the right way than to do it the wrong way. The challenge is getting the people to make the investment up front rather than correcting things later on.
This way of doing things requires the validation personnel to be pro-active in talking to the people who develop and create these things in their facilities. These people need to be convinced that it is in their best interests to be open with you, to tell you what’s going on, and spend time with you explaining the system.
Tools of the Trade
You need two things in-house to put together a PLC validation:
- An approved and signed-off system development methodology.
- An SOP (standard operating procedure) for the validation of computer-related systems.
The system development methodology describes how you develop, maintain, and cover change control of automated systems in your facility. I have run into some cases recently where systems were being developed and a manager argued that a systems development methodology would handcuff the creativity of his programmers. Now, there’s a facility that’s just about guaranteed to be out of compliance in short order.
What goes into the development methodology? First of all, the requirements for specifications. How you specify systems, and how you get people to sign off on those specifications, are critically important. There has to be a review of requirements and specifications for the system. You can’t afford to have somebody specifying a system by sketching it on a tablecloth and carrying it down to engineering. That’s not sufficient. Too many people who have an interest in the system are not being allowed to know what’s going on, and there are too many opportunities for failures or hidden defects to work their way in. (You’ll find a lot of warts in there that need to be corrected.)
Development practices also need to be discussed. How code will be developed, how it will be reviewed, where it’s going to be stored, what tools will be used, security and change control, among other things, should be addressed. Finally, maintenance and change control have to be dealt with. What happens when changes come along after development? Where are the revisions kept? Who’s informed, and how does validation find out about these changes to see if more work needs to be done? All of these questions need to be answered in the system development methodology.
Your system development methodology will probably be a very extensive document.
In addition to the system development methodology, you must have an SOP for the validation of computer-related systems. The SOP should be “computer-related,” so it can cover PLCs, computers, controllers, and anything else you care to include. These can all be handled with the same kind of tools. While the development methodology has probably been created by others, you, as a validation person, can prepare this validation SOP.
The SOP should work in conjunction with the development methodology. The same language should be used, and it should address specific phases. If you don’t get the kind of cooperation you need, then you can step back and just think of it in standard validation terms, which will give you a document that will work.
You need a commitment from management that this SOP is the tool that’s going to be used and that it is going to be followed. If you have that, then it becomes your big stick when it comes to internal and vendor developed systems.
This document should define your approach to the validation effort: what you consider an appropriate level of validation for these systems in your facility. It is very important to define the terms and how they relate to computer validation (e.g., what do IQ, OQ, and PQ mean in your facility?).
This is essential, because some of the people that come from the software industry have a very different spin on OQ and PQ than those who come from the validation side. In software, module testing would probably be considered OQ and system testing PQ. In our business, we’d consider system testing to be OQ and reserve PQ for the final “three batches” when product is actually manufactured. Because of those variances in terminology, these are things that have to be included in the approach and defined in the document.
The validation SOP needs to address how you plan to go about validation. But the validation plan should do more than just list the validation activities that are going to take place. It’s an opportunity to say what standards the different validation activities will be held to, who’s going to be involved in them, and what resources are required.
This is also a great place to let people see the scope of the project before you invest a lot of time writing protocols that you may or may not need. And it’s also a good tool when you’re going to vendors, before writing the purchase order, to see what resources they can supply to validate their part of the program.
An important part of the validation SOP specifically addresses how vendors will be audited. The extent of this audit depends on your level of sensitivity, so the audit could range from a very cursory telephone call (asking the vendor if he has systems in place and what they are) to actually having a certified auditor produce a formal report.
There are three key participants in this validation process:
- The end user.
- The developer.
- QA (quality assurance) validation.
You need to have these people (or groups representing them) involved, and, although it’s tempting, try not to let them fill more than one role. It is very important in the development and validation of these software systems that you have different sets of eyes looking at the documents. Everybody has a different take on it; they all have different needs, and they all have things to add to the process.
The end user is the expert on what he wants. (Even if, as often happens, what he says he wants is not what he needs.) The developer is the person who has to translate those wants and needs into system specifications. QA validation looks at those specifications to determine if it is a system that can be tested and will produce compliant product.
Many times, though, the developer is the person least qualified to develop the process because he has the most tenuous grasp on it. The relationship between end user and developer has to fill in that gap. There has to be a good line of communication to make sure that the developer understands the user’s needs. Remember, the developer is probably an electrical engineer or a contract programmer who doesn’t understand your safety or quality concerns. He is going to need some help.
QA validation (if your systems are working right) acts as an auditor. It shouldn’t be necessary for the validation person to be developing the specifications. Unfortunately, all too often, they end up doing that.
Most of your PLC systems will be developed by contractors. There aren’t many companies that keep a large staff in-house to develop software or design PLC systems.
If a piece of equipment or a large utility system is bought, then it’s important to negotiate the contracts so you can hand off a large part of the validation effort to the supplier. After all, they know what they’re doing better than you possibly can.
An important factor that seldom gets into the discussion of these audits is looking at the contractor from a business standpoint. If you’re having someone develop software on contract, it’s good to see if their business is solvent. That way you know that they will be there in a couple of years when changes need to be made to the system.
One way to hedge against a supplier’s dissolution is to have the developer’s source code placed into an escrow account (this has become a common requirement in the software industry). This isn’t a cure-all though, because anyone who has ever looked at several hundred lines of source code will tell you that this fallback in no way gives you the capability to understand what was written. There is no substitute for being able to go back to the original developer, who has the development documentation, in order to implement changes later on.
It is critical for the developer to have good quality programs in place. Many people who supply equipment to the pharmaceutical industry will not be excited about having you dig into the specific details of the software they develop. They will, on the other hand, be more open to your understanding their quality program and how they implement software quality assurance. That can give you a great deal of confidence in the product that they’re producing for you. If you know they have SOPs for software development, a software development methodology, and a validation program, you can feel a lot better about the product they’re producing.
Another thing to keep in mind is that many vendors say they will provide you with a certificate that says their system has been validated. The FDA will not accept that, and God help you if an investigator finds that certificate, then investigates the supplier and discovers that they were deficient. If that happens, it means your system is deficient as well, and there will be hell to pay...