
Computerized systems Periodic Review

Isaac Serra
QA Manager - IT Validations
Dear all.
How often do you need to perform a Periodic Review of a computerized system?
I think the timing will be based on the criticality of the system and the assigned GAMP category. The problem is that
GAMP 5 does not specify the periodicity of the system review according to its category (3, 4, 5).
So, what review interval would you assign to a GAMP category 4 system, and to a category 3 one?
David Stokes
Global Lead, Life Sciences at Venostic
Base it on the criticality of the system and the number of records generated over a given period of time. Clearly
you'll want to review higher risk records more frequently.

If lots of users generate lots of data you'll want to review more frequently. If few users generate few data/records,
then you can review less frequently.

The rationale is that if you sample and find a problem, you are left with a manageable number of data/records to
review in full should a 100% check for similar issues be needed. E.g. a GMP lab instrument used by just two users
for testing batches only manufactured twice a year could be reviewed every 2-3 years. A LIMS in the same lab, with
dozens of users generating data/records from dozens of other instruments every day, would need reviewing every
6-12 months.

This is a business issue and not a technical issue, so software category has little to do with the decision.
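The rule of thumb above (criticality and record volume drive the interval) could be sketched as a small decision function. The risk levels, volume threshold and intervals below are hypothetical illustrations for the sake of the example, not values mandated by GAMP or any regulation:

```python
def review_interval_months(risk: str, records_per_year: int) -> int:
    """Suggest a periodic-review interval in months from system risk
    and record volume. All thresholds here are illustrative only."""
    high_volume = records_per_year > 10_000  # hypothetical cut-off

    if risk == "high":
        return 6 if high_volume else 12      # e.g. a busy LIMS: 6-12 months
    if risk == "medium":
        return 12 if high_volume else 24
    return 24 if high_volume else 36         # low-risk, low-volume: 2-3 years

# A busy LIMS vs. a lab instrument testing two batches a year:
print(review_interval_months("high", 50_000))  # 6
print(review_interval_months("low", 20))       # 36
```

In practice such a table would only set the starting point; operating history (incidents, changes) would then move individual systems up or down the scale.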

Epitome Technologies Training


Computer System and Software Validation Training
Correctly conveyed by David. The frequency of performing Periodic Reviews should be dependent on the
Complexity, Criticality, Novelty, and Operating History of the (computer) system. The decision and rationale must
be documented.
GAMP suggests that the frequency of review can be defined with a minimum and maximum time between reviews;
for example, a scale of 1 to 4 years can be set for the review period. Only new GAMP category 4 and 5 computer
systems would have an annual review period, which would then be extended or reduced depending on whether
system operation has been shown to be stable since the initial validation.
For critical systems, you may decide to have a periodic review every 6 months initially, and once you feel that your
process has stabilized and the system remains in control, you may decide to extend the period to annual.

Randy Perez
Director, Information Governance and Management at Novartis

I think David's post is more in line with expectations than the subsequent one. My only quibble is with his proposed
frequency, which is probably too often. As he noted, the GAMP category of the system, and hence the complexity
and novelty of the system, are not really relevant.
What is relevant are the processes a firm has in place to keep a system in a state of control, the operating history of
the system (number of incidents, problems, and changes), and the criticality of the information in the system.
Many firms have a default value for periodic review, often 3 years. Reviews are only done more often than that if
warranted based on the behavior of the system. If even a critical system like a LIMS used for product release is
demonstrably stable and controlled, there is no need for a review as often as every 6-12 months. If on the other
hand the system has had multiple incidents and/or changes, an earlier review may be indicated.

Hector Olson
Senior Manager - QA Compliance at Gilead Sciences
I tend to agree with David on the business issue; however, the majority of the tasks (for collecting data) fall on IT's
shoulders. The interval is dependent on the impact to the business, not IT. And just as GAMP can be used to
develop the risk models for systems and separate the components (OS, application, configuration, support
programs, records...), it can also be used to evaluate the health of the system.

I will say that I find it interesting that creating a VMP is the new thing or expectation. However, equal attention is due
to the status of the validated system (when it was initially validated) and to the subsequent periodic review (some
refer to it as revalidation, which I consider an incorrect term). The periodic review has always been a requirement in
the GMPs; we just tended to consider only aspects like cleaning, equipment or process in that arena. With the much
greater use of computers in support of many operations (data management and automation), it has become a much
greater component of our GMP world and should be given the review that it deserves.

Steve O'Connor
Senior Project Manager GTS Life Science Global Regulatory Center of Excellence & Cyber Security at IBM
I agree with Randy. In a former organization we built into the periodic review SOP the ability to adjust the frequency
of the review of the system. For example, if there were a large number of problems reported or a number of
incidents with the system, we might shift the frequency from 3 years to 2 years, or perhaps yearly, based on the
criticality of the system and the problems or incidents. We also accounted for shifting the time frame out for
systems that were, for example, stable for a number of years, with very few if any changes, etc.
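An SOP of this kind might encode its adjustment rules roughly as follows. The default 3-year interval, the incident and change thresholds, and the stability criterion are all made-up examples of the idea, not values taken from any SOP or regulation:

```python
def next_review_years(base_years: int, incidents: int, changes: int,
                      stable_years: int) -> int:
    """Adjust a default periodic-review interval from operating history.
    All numeric thresholds are hypothetical illustrations."""
    if incidents >= 10 or changes >= 20:
        return 1                              # troubled system: review yearly
    if incidents >= 5 or changes >= 10:
        return max(1, base_years - 1)         # some trouble: review sooner
    if incidents == 0 and changes <= 2 and stable_years >= 3:
        return base_years + 1                 # demonstrably stable: extend
    return base_years                         # otherwise keep the default

print(next_review_years(3, incidents=12, changes=5, stable_years=0))  # 1
print(next_review_years(3, incidents=0, changes=1, stable_years=4))   # 4
```

The point is only that the adjustment logic is explicit and reproducible, so the rationale for each interval can be documented at review time.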

David Stokes
Global Lead, Life Sciences at Venostic
I do understand the 'economy of scale' argument for performing periodic reviews less frequently (2-3 years),
because if you're going to the bother of setting up a review, you might as well include a reasonable sample size.
This is certainly more efficient.

However, early in the operational phase of a system I prefer "little and often" - conduct initial periodic reviews more
frequently, with a smaller sample size to review. The problem with waiting 2-3 years before performing the initial
periodic review is that if you do find you have a problem, you potentially have an awful lot of records to review as
part of the CAPA investigation. If you do your initial review earlier, the scope of your corrective action is reduced.

6 months is indeed at one end of the scale - only suitable for a new, high risk system, with immature operational or
maintenance/support processes and high transactional volume. Once you'd established that the process was under
control, you'd look to reduce the frequency with a rationale that empirical evidence from the first periodic reviews
proved that the risk likelihood was tending towards medium-low.

As Steve says, never be afraid to adjust the periodic review frequency (or sample size) based upon what previous
reviews have told you.

Hector Olson
Senior Manager - QA Compliance at Gilead Sciences
Hi David,
Can you clarify your interpretation of periodic review? It appears that you are implying a review of data (with more
data reviewed where more data is processed), but I read the regulations (covering process, equipment, cleaning,
methods...) as requiring periodic review to ensure that the validated state is still appropriate and the system is
meeting its intended use. Since this is the premise of validation, why would you need to do such an intensive data
review (if that is what you are saying; I may be wrong in how I am reading it)?

I look for data reviews to be done periodically during operation, i.e., of audit trails. That's why they are there. As an
analogy, when you have paper logbooks, QA does a periodic review of the data in those logbooks to ensure that the
data is correct, appropriate and supports any data/information coming out of it. Just because you move to an
electronic system does not remove QA's responsibility to review data; except that now, since they can leverage a
validated system, I expect them to use the audit trail (with a sampling plan) to review records over a period of time,
so as to have a high level of confidence that they are accurate and meet the same criteria (predicate rule) as their
paper counterparts. By reviewing the audit trails (real ones), they can see that the records were approved/signed,
that any change in a record was recorded in the audit trail, and that the record was re-approved/authorized after
the change - the equivalent of the paper process.

Also, since our definition of validation keeps getting more refined, the periodic review allows you to assess your
current documentation to ensure you are keeping up with your interpretation of what validation is today. It also lets
you incorporate changes made since the last review period and gives you a chance to update the validation
documentation to draw a new "line in the sand" for the system - basically a statement of what the system is today.

David Stokes
Global Lead, Life Sciences at Venostic
I see periodic review as having two parts:
1. The review of the validated state, per regulatory expectations, which focuses to a large extent on change
controls and carries a certain element of data review, i.e. you may review a sample of change controls to ensure
that they have been properly implemented, tested, backed out (where appropriate) and closed in a timely manner,
with the associated documentation reviewed to ensure that it has been updated. Typically, you can't review
100% of the change controls for a large, complex system with hundreds of changes being applied every year, so a
sampling approach can be taken. To be statistically relevant and provide the necessary confidence level, the
sample size will vary to some extent with the number of changes made (and can also be varied based on the system
risk) and will cover different types of change record (planned, emergency etc). This approach can also be related to
relevant processes such as incident and problem management, repair activities, configuration management etc to
identify issues with other processes which support the maintenance of the validated state.

2. In some cases it may be relevant to review the transactional records in the system, which can also provide
evidence that the system remains fit for intended purpose. If the system has a lot of purchase requisitions that were
never cancelled or converted into purchase orders, or laboratory analyses that have been scheduled but with
results never entered, that can be a sign that the process or the system is no longer fit for purpose, regardless of
what the validation documentation says. Although not an approach applied to every system, it can be a useful sign
that change controls should have been raised to ensure that the system remains fit for purpose and meets evolving
requirements. Typically I would only take this approach if discussions with system users indicate that there are
problems.

In both cases it can be appropriate to sample operational, support and maintenance process records, rather than
just reviewing the SDLC documentation.
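One common way to pick a "statistically relevant" sample size of the kind mentioned in part 1 is zero-acceptance sampling: choose the smallest n that would, with the desired confidence, catch at least one bad record if defects occurred at an assumed rate. A sketch of that calculation; the 95% confidence and 5% assumed defect rate are illustrative assumptions, not regulatory requirements:

```python
import math

def sample_size(population: int, confidence: float = 0.95,
                assumed_defect_rate: float = 0.05) -> int:
    """Zero-acceptance sampling: smallest n such that, if defects occur
    at `assumed_defect_rate`, the sample contains at least one defect
    with probability `confidence`. Parameters are illustrative."""
    n = math.ceil(math.log(1 - confidence) / math.log(1 - assumed_defect_rate))
    return min(n, population)  # can't sample more records than exist

# For a system with 400 change controls in the review period:
print(sample_size(400))  # 59 records at 95% confidence, 5% defect rate
```

This simple form treats draws as independent; for small populations a finite-population (hypergeometric) correction would justify a slightly smaller sample, and the assumed defect rate can be tightened for higher-risk systems.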

David Stokes
Global Lead, Life Sciences at Venostic
I certainly agree that the review, and where necessary, the updating of validation / SDLC documentation must be a
key part of the periodic review. I've just found over the years that the data associated with the system helps to
complete the story (or sometimes presents a different story!) - I guess that's the auditor in me :o)

K R Vaghela
Founder Partner at KVS Technologies and Infra Control Systems
Everything is risk-based in today's world. The validation approach has to be based on complexity, novelty and risk.
Once a system is validated, the review approach is decided based on a risk assessment of the system, the
application's end use, and the incidents, CAPAs and change controls applied to the system. If the number of
incidents is high, that suggests the risk has not been mitigated properly and we should take another look at the
system.

A system can be small or big, single user or multi user. If it is stable, there is no need for a quick review, and the
interval to the next review (and revalidation, if needed) can be longer.

A sound risk assessment is the key thing, and very few people do it rigorously.
The CSV review also needs to take a look at audit trails and security breach events.
