Computerised System Validation in the QC Laboratory

Margaret Corduff asks the hard but necessary questions around Computerised System Validation in the Quality Control Laboratory.

All too often we assume that our process data within an organisation is safe. We assume that it stays within our organisation and is therefore protected from adulteration. But where does the data generated during Quality Control (QC) testing go?

A QC Manager is often the Process Owner for computerised systems in a QC laboratory. Process Owners are responsible for the process being managed, i.e. for ensuring that the computerised system and its operation are compliant and fit for intended use. As a QC Manager, do you know where your data goes and who has access to it? Have you seen the associated data flow map? Can you talk to your data?

A lot of test equipment and test applications are leased, licensed or serviced by outside vendors. While this allows for multiple efficiencies and benefits to the organisation, it also poses a challenge for the process owner.  

For example, as a process owner for microbial ID methods that utilise mass spectrometry and microbial libraries, you need to know where that library is stored.  

Do you know?

- Where are your sample results stored? Is this data (library and sample data) stored locally on the PC, in your company’s on-premise data centre, or in the cloud?
- What about backups? How often, and to where, does the data get backed up? If the data is saved in the cloud, has that cloud been qualified?
- Security and access: is your data secure in the cloud? Has this been verified? Who can access it? Can they make changes?
- Can your vendor log in remotely to assist you with troubleshooting? If so, what can they do? Can they log in without your knowledge? What controls do you have in place for remote login? Do external parties have access to your data on in-house isolates? Can this data be manipulated?
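
These questions lend themselves to a structured record per system. As a rough illustration only (a minimal sketch in Python; the system name, fields and wording below are hypothetical, not drawn from any standard or vendor), a process owner might track which of these questions remain open for each computerised system:

```python
# Illustrative sketch only: a simple per-system record a process owner
# might keep while answering the questions above. All names and fields
# here are hypothetical, not from any specific vendor or regulation.

from dataclasses import dataclass, field

@dataclass
class SystemDataAssessment:
    system: str
    data_locations: list[str] = field(default_factory=list)  # local PC, on-prem, cloud
    backup_schedule: str | None = None                       # e.g. "daily, off-site"
    cloud_qualified: bool | None = None                      # None = not yet answered
    access_verified: bool | None = None                      # who can read/change data
    remote_login_controls: str | None = None                 # e.g. "time-limited, logged"

    def open_questions(self) -> list[str]:
        """Return the questions the process owner has not yet answered."""
        unanswered = []
        if not self.data_locations:
            unanswered.append("Where is the data stored?")
        if self.backup_schedule is None:
            unanswered.append("How often and where is it backed up?")
        if self.cloud_qualified is None:
            unanswered.append("Has the cloud been qualified?")
        if self.access_verified is None:
            unanswered.append("Who can access or change the data?")
        if self.remote_login_controls is None:
            unanswered.append("What controls cover vendor remote login?")
        return unanswered

# Hypothetical example: a mass spectrometry microbial ID system,
# only partially assessed so far.
maldi = SystemDataAssessment(
    system="Microbial ID (mass spectrometry)",
    data_locations=["instrument PC", "vendor cloud"],
    backup_schedule="daily to vendor cloud",
)
for question in maldi.open_questions():
    print(f"[{maldi.system}] open: {question}")
```

In practice this record would live in a validation planning document rather than in code, but the principle is the same: every question left unanswered for a system is an open risk to the data.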

There are a lot of factors to consider, and the list can seem endless. Remember: the architecture hosting the data must be proven by the user to be fit for purpose and 21 CFR Part 11 compliant. Many cloud providers claim compliance, but it is up to the customer to demonstrate it.


All of this and more must be addressed during the test equipment validation. CSV (Computerised System Validation) ensures that not only is your system validated for its intended use (e.g. the computerised system is capable of microbial identification), but also that it (and associated data) is secure and protected. CSV considers not just validation of the application (e.g. mass spectrometry for microbial identification) but also the qualification of the infrastructure (architecture for hosting).   

CSV considers the system being validated and also its interaction with other systems, applications and hosting architecture. An example data flow map is shown below, covering everything from data generation through to storage and retrieval. A data flow map is a great way to understand and demonstrate where data goes and which systems it interacts with. This in turn will feed into the validation approach.

[Figure: Example of a data flow map]
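
As a plain-text stand-in for such a map, here is a minimal sketch that enumerates hypothetical flows for a generic microbial ID setup (every node name and flow below is an assumption, not taken from any real system), so that each node holding data can be checked for ownership and qualification:

```python
# A minimal sketch of a data flow map expressed as a directed graph.
# Every node and flow below is hypothetical, for a generic microbial ID
# setup; a real map would come from your own system inventory and
# vendor documentation.

# Each edge: (source, destination, what flows between them)
data_flows = [
    ("Mass spec instrument",  "Instrument PC",     "raw spectra"),
    ("Instrument PC",         "Microbial library", "library lookups"),
    ("Instrument PC",         "On-premise server", "sample results"),
    ("On-premise server",     "Cloud backup",      "nightly backup"),
    ("Vendor (remote login)", "Instrument PC",     "troubleshooting access"),
]

# For the validation approach, every node that holds or touches GxP data
# needs an answer to: who controls it, and has it been qualified?
nodes = {n for src, dst, _ in data_flows for n in (src, dst)}
for node in sorted(nodes):
    incoming = [f"{src} ({what})" for src, dst, what in data_flows if dst == node]
    print(f"{node} <- {', '.join(incoming) if incoming else 'no inbound flows (source)'}")
```

Even this toy listing makes one thing obvious: the vendor’s remote login is itself a data flow, and therefore something the validation must cover.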

Effective computerised system validation will not only result in a safe and effective computer system that meets your organisation’s needs. It will also give the process owner knowledge of where their data actually goes (the data flow) and assurance of its integrity, whether in-house or in the cloud. As a QC systems process owner, protect your test data by ensuring it is securely managed and stored.

Margaret Corduff

Margaret Corduff is QA & Compliance Lead with Odyssey VC and Compliant Cloud.
