Odyssey VC announces 100 new jobs

(L-R) CEO Oisín Curran, Fine Gael chairman Martin Heydon, COO Fionnán Friel and CMO Tom McKittrick at the announcement on Thursday January 16th

Odyssey VC, a global leader in IT compliance solutions, has announced that it is to create 100 new jobs at its headquarters in Sallins, Naas.

The Irish company will create highly skilled new roles spread across its technical, product development, service delivery, compliance and sales & marketing departments. The announcement coincides with the official launch of Compliant Cloud, Odyssey VC’s new online platform, which delivers market-leading compliant cloud solutions and features the world’s first self-care portal for the deployment of audit-ready compliant architecture. To find out more about Compliant Cloud, visit our Home page.

This announcement is the latest development for the company, which has achieved double-digit growth since its launch in 2015 and counts global Fortune 500 life science companies including Pfizer, Alexion, Amgen and Takeda among its clients.

The announcement took place yesterday (January 16th) in the Council Chamber at Kildare County Council. The event featured speeches from Fianna Fáil councillor and Mayor of Kildare Suzanne Doyle, Head of the High Potential Start-Ups Unit at Enterprise Ireland Jennifer Melia, CEO of Odyssey VC and Compliant Cloud Oisín Curran, and Fine Gael chairman Martin Heydon. The event was well attended and received coverage from numerous media outlets.

Odyssey VC offers Integrated GxP Cloud and Computerised System Lifecycle Management Services to Life Science companies that operate in highly regulated environments. To find out more, visit the website here.

Supplier Assessment – who’s in charge?

Fionnán Friel provides an in-depth analysis of supplier qualification and says that a company is only as compliant as the suppliers they outsource to.

Supplier qualification is more than just auditing. For Life Science companies in particular, supplier qualification is a risk assessment tool whose goal is to establish confidence that suppliers, vendors and contractors (referred to as suppliers from here on) can supply consistent, quality products and services in compliance with established company and regulatory requirements. It gives a company the confidence to outsource the delivery of critical products and services.

It makes perfect business sense for suppliers to ensure that their products and services are of a certain standard and quality to attract and maintain clients. However, it is the unfaltering view of regulatory authorities that the regulatory burden ultimately and always resides with the regulated company that has engaged the supplier.

“A company is only as compliant as the suppliers they outsource to”

 

Why Outsource? 

Historically, the two main reasons why organisations decided to outsource were to reduce costs and to free themselves to focus on core business goals and planning. Very often a company must outsource because it does not have, or cannot acquire, the expertise in-house.

But recent research shows a shift in industry thinking. Outsourcing is no longer just about saving money; it is increasingly seen as a critical tool for innovation. There are new and emerging reasons for outsourcing, including:

  • Enabling competitive advantage 
  • Improving speed / time to market
  • Embracing disruptive solutions (such as Cloud) 

Cloud is a perfect example of a new technology that is enabling and driving outsourcing. A recent survey from Deloitte ‘2018 global outsourcing survey – Disruptive outsourcing trends, technology, and innovation’ [1] identified new and emerging reasons for outsourcing and adopting Cloud. 

Deloitte Global Outsourcing Survey Results on Cloud Adoption[1] 

Why Audit? 

Regulatory bodies allow for outsourcing. However, they demand that the regulated entity audits potential suppliers to determine their level of compliance and ultimately their “fitness for intended use”.  

All companies – but especially heavily regulated companies such as those in the Life Science industry – depend on their suppliers for the delivery of critical activities, making them vulnerable to potentially catastrophic quality issues if they get it wrong. The governing regulatory bodies require regulated companies to execute this process: they must audit potential suppliers to ensure that those suppliers can meet company requirements and expectations in respect of the quality of their product, application or service.

The table below outlines at a high level what you must ensure when you audit a supplier to whom you are considering outsourcing:  

Problem statement: Need to increase capacity in packaging.
Solution: Outsource to a Contract Manufacturing Organisation (CMO).
Confirm in audit: Confirm that they have an equivalent QMS (processes, procedures, training, documentation etc.) reflective of how we do this in-house, to ensure the quality of the end product.

Problem statement: Need to modernise our IT infrastructure and move to a more cost-effective model.
Solution: Outsource to a company delivering cloud-based infrastructure.
Confirm in audit: Confirm that they have an equivalent QMS (processes, procedures, training, documentation etc.) reflective of how we do this in-house, to ensure the quality of the end product, remembering that Annex 11 states that infrastructure must be qualified.

What do the Regulations and Regulators Say? 

1.1 EudraLex – The Rules Governing Medicinal Products in the European Union [2] 

Volume 4, Annex 11 within Eudralex governing computerised systems states the following in respect of suppliers and service providers:  

Excerpt from Eudralex Vol 4, Annex 11[2] 

Key things to note: 

  1. IT departments should be considered analogous, meaning that the selected supplier’s IT department and practices should act and behave exactly as an internal IT department would (analogous – comparable, similar, equivalent). Remember that Annex 11 states that infrastructure must be qualified and software must be validated. If the selected supplier cannot demonstrate this, then they should not be used.  
  2. Competence and reliability of the supplier are key factors – suppliers need to be able to demonstrate these, ideally by having quality standards in place.  
  3. Audits and audit information must be available to inspectors on request – suppliers must be available for audits in the same manner that internal departments within a regulated company would be made available during an audit.

1.2 FDA (US Food and Drug Administration) 

21 CFR Part 820 [3] – Medical Devices (Section 820.50 Purchasing Controls) states that each manufacturer shall establish and maintain procedures to ensure that all purchased or otherwise received product and services conform to specified requirements. In summary, this involves:

  1. Establishing requirements, including quality requirements – e.g. if evaluating a potential supplier of a computerised service then infrastructure must be qualified, and software must be validated. 
  2. Evaluating and selecting potential suppliers, contractors, and consultants on the basis of their ability to meet specified requirements, including quality requirements. 
  3. Defining the type and extent of control to be exercised over the product, services, suppliers, contractors, and consultants, based on the evaluation results. 

FDA Guidance for Industry Q10 Pharmaceutical Quality System [4] 

Section G. Management of Outsourced Activities and Purchased Materials of the guidance document details the expected control and review of any outsourced activities and quality of purchased materials and critically states that “the pharmaceutical company is ultimately responsible to ensure processes are in place to assure the control of outsourced activities and quality of purchased materials”. These processes should incorporate quality risk management and must include:  

  1. Assessing, prior to outsourcing operations or selecting material suppliers, the suitability and competence of the external party to carry out the activity or provide the material. 
  2. Defining the responsibilities and communication processes for quality-related activities of the involved parties. 
  3. Monitoring and reviewing the performance of the external party.

Key things to note: 

  1. It is up to the regulated company to establish the requirements. This should be straightforward: apply the same requirements you had when the activity was performed in-house prior to outsourcing. If the requirements are not the same, you will have the additional work of explaining to regulators why they can be different out-of-house.  
  2. Assess and audit suppliers prior to selection on their ability to meet requirements, including quality requirements.  
  3. Monitor and review as you go.  

 

So, who’s in charge? 

Risk assessment and auditing, including supplier assessment, within regulated Life Science companies is the domain of the Quality department, and for good reason. They are the ultimate arbiters of the quality of the pharmaceutical product or medical device leaving the manufacturing plant, and ultimately of the health and safety of the end user.

No matter what you are assessing – be it a material, application or service – and regardless of how advantageous it is perceived to be from a technological and business perspective, if it does not meet the quality requirements and expectations of the regulated company, then it should not be used.  

A supplier assessment team should be made up of a variety of Subject Matter Experts (SMEs), with the goal of assessing the supplier on their ability to deliver on all requirements, including Quality system and User/Functional Requirements. It is crucial – and an expectation of the regulators – that Quality departments make the final decision on whether a supplier and their products and/or services are of acceptable quality.

  • It is not enough that the technology is new, state of the art, innovative and capable.   

  • It is not enough that the solution will save time, resources and money.  

Ultimately, without Quality oversight, it could end up costing a regulated company a lot more. With the emergence of new reasons for outsourcing (enabling competitive advantage and embracing disruptive solutions), quality departments need to maintain vigilance and not let the reasons for outsourcing overshadow the reasons for quality. 

 

References 

  1. Deloitte, 2018 Global Outsourcing Survey – Disruptive Outsourcing Trends, Technology, and Innovation
  2. EudraLex – The Rules Governing Medicinal Products in the European Union, Volume 4: Good Manufacturing Practice, Medicinal Products for Human and Veterinary Use, Annex 11: Computerised Systems
  3. Title 21 – Food and Drugs, Chapter I – Food and Drug Administration, Department of Health and Human Services, Subchapter H – Medical Devices, Part 820: Quality System Regulation, Subpart E – Purchasing Controls
  4. Guidance for Industry: Q10 Pharmaceutical Quality System, U.S. Department of Health and Human Services, Food and Drug Administration

The Future of Data Analytics

“In God we trust, all others must bring data.”

W. Edwards Deming

Odyssey VC and Compliant Cloud CEO Oisín Curran gives a high-level overview of data analytics and looks to the possibilities ahead


Engineer and statistician William Edwards Deming paved the way for how analytics plays a key role across the lifespan of a regulated product today, and he gets straight to the critical point. We rely on and collaborate with our data scientists to build robust analytical models that inform and control the supply chain and manufacturing processes. Without data we will flounder, and crucially it must be accurate and reliable data. Let’s remember our first principles – garbage in, garbage out. 

The foundation of an analytics model is a train, validate and test cycle, followed by continuous model maintenance and retirement procedures. As such, analytics processes are developed in what are often called analytics “sandboxes”, or development environments. Here they undergo multiple iterations of development and improvement and are robustly tested prior to deployment into a production environment. But what resides in these sandboxes? Who has access to them? And how can we be sure of the integrity of the data underpinning these models that we are becoming more and more reliant on in production environments?
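As a rough illustration of that cycle – a minimal Python sketch in which the snapshot file name, columns and split ratios are all hypothetical, not taken from any particular system – the split below holds a test set back for a final unbiased check and records a checksum of the raw snapshot so the sandbox copy can later be verified against its source:

```python
import hashlib

import pandas as pd
from sklearn.model_selection import train_test_split

# Hypothetical snapshot of process data pulled into an analytics sandbox.
SNAPSHOT = "batch_history_snapshot.csv"

# Record a checksum of the raw snapshot so the sandbox copy can later be
# verified against the source system it was extracted from.
with open(SNAPSHOT, "rb") as f:
    print("Snapshot SHA-256:", hashlib.sha256(f.read()).hexdigest())

df = pd.read_csv(SNAPSHOT)

# Train / validate / test cycle: fit on the training set, tune on the
# validation set, and keep the test set untouched for a final, unbiased check.
train, remainder = train_test_split(df, test_size=0.4, random_state=42)
validate, test = train_test_split(remainder, test_size=0.5, random_state=42)
print(len(train), len(validate), len(test))
```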

Believe it or not, in the past these sandboxes have existed on the data scientist’s machine. Yes, take a breath! Snapshots of data have been, and at times still are, gathered from varying sources such as the production historian, MES, LIMS and transferred by simple means to a single person. Processes are improving, and we now see analytics sandbox environments pointing to central and shared databases. However, the level of control of such environments is at best questionable. When dealing with the challenges that come with analytics processes, such as time alignment and cleansing of data, a large amount of data manipulation is required. When this is being undertaken on snapshots of data in relatively uncontrolled environments and by any number of data scientists across the enterprise, the opportunity for error is massive.

We need to get better at this. We need to ensure that the data which informs so much of a regulated product’s lifecycle is of the utmost integrity, whilst of course ensuring it is readily available to the teams and processes that need it most. Having data pertaining to the entire lifecycle in one space, ideally incorporating everything across R&D, Manufacturing, Quality and Supply Chain, means we can further drive efficiencies with reliable analytics. The centralisation of disparate data sources in a compliant and controlled environment opens a massive opportunity for efficient analytics and accurate, targeted decision making. The CompliantCloud.com team are passionate about data, data integrity in particular, and that drives the platform we deliver to our customers.

Imagine a world where data scientists are not just deployed to react but are continuously innovating and deploying analytical models to enhance operations. Imagine they had full end-to-end visibility of a product’s lifecycle in real time, predicting issues and informing preventative action. Imagine they were doing this in a controlled and compliant environment that satisfies regulatory requirements. Imagine no longer. It’s time to act. Then again, in the words of Deming: “It is not necessary to change. Survival is not mandatory.”

How TIM WOODS applies to paper-based systems

A major goal of the life sciences community is to move away from paper-based systems, and it’s easy to see why. Some of the challenges posed by and waste associated with paper-based systems can be summarised using the acronym TIM WOODS; not a real person, but full of real problems.

T – The “T” stands for “Transport”, which involves the physical movement of paper documentation around the office and around the business, starting at the printer where it is initially created. The documentation is passed around to testers, reviewers and approvers, transported from one person to the next, between functions and between physical locations. At the end of all of this, once all approvers have completed their approvals or the paper documentation has served its purpose, it is transported to its final destination, a folder perhaps or a document storage area. But in the life science industry we know that this paper document can be pulled at any time, for example during audits or to support investigations, and so the transportation starts again. Not only is the transportation a huge waste, but how do you protect the document while it is being transported from file to folder and person to person? How do you assure that it will not be mislaid? How do you preserve the integrity of the document in terms of completeness, availability and retrievability?

I – The “I” stands for “Inventory”, or in this case the amount of physical documentation associated with paper processes: for example, the physical retention of master copies of documentation as well as obsolete or superseded versions. The stack of physical paper doesn’t take long to become a mountain, which poses challenges when it comes to long-term storage and retention. Many companies within the Life Sciences sector end up outsourcing their long-term storage to a third party, which in itself introduces additional complexities around retention, retrievability and traceability.

M – The “M” represents “Motion”, which in the context of paper documentation relates not only to moving documentation around the business but also to the movement of people. For example, if I need to work on the same document as you, then either the document must move to me or I must move to the document, and the same applies for anyone else in the organisation who needs it.

W – The “W” stands for “Waiting”. Only one person can work on a physical document at a time. Even if staff are co-located, there is an amount of waiting required for one person to complete their activities before the next person can perform theirs. For example, the review of a physical executed test script for a validation exercise can only be performed by one reviewer at a time, so the next person in the chain has to wait for the previous person to complete their task.

O – The first “O” stands for Over-production and the second “O” stands for Over-processing. Paper processes, by their very nature, are often laden with inefficiencies. If you take the example of a physical documentation control process for standard operating procedures, the level of work associated with addressing a typo on a single page of a controlled SOP is often equivalent to the level of work associated with a more significant change.

D – The “D” is for “Defects”, which basically amounts to the waste associated with something going wrong. Take the example of the executed test script for a validation exercise.  If the tester makes errors when recording test details in the paper documentation it necessitates a level of additional documentation, explanation and sometimes investigation and rework which ultimately generates more paper.  

S – The “S” is for “Skills”. When paper processes are abundant in an organisation, it can often lead to the under-utilisation or poor utilisation of skills, with so much labour from highly skilled people spent waiting at printers, scanning documents, stamping documents, filing documents and so on.

We’ve all had our own experiences with paper and its compliance and data integrity challenges, but the question remains: what does the future really look like beyond paper? One potential solution is the introduction of validated workflows. The aim of validated workflows is to eliminate good documentation practice (GDP) errors and to build data integrity in from the start, so that you cannot progress to the next step until you satisfy specific workflow requirements, as sketched below. They have been, and can be, successfully used for the automation and management of validation activities, logbooks, documentation control – essentially anything that has an associated workflow.
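To make that gating idea concrete – and this is only a sketch of the general pattern, not a description of Pharma VIEW™ or any other product, with all class and field names invented for illustration – a validated workflow step can be modelled as something that simply refuses to hand over to the next step until its required entries and sign-off are in place:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class WorkflowStep:
    """One step in a workflow that blocks progression until its requirements are met."""
    name: str
    required_fields: List[str]
    data: Dict[str, str] = field(default_factory=dict)
    signed_off_by: Optional[str] = None

    def is_complete(self) -> bool:
        # Every required entry must be present and the step must be signed off.
        missing = [f for f in self.required_fields if not self.data.get(f)]
        return not missing and self.signed_off_by is not None

def advance(steps: List[WorkflowStep], current: int) -> int:
    """Move to the next step only if the current step satisfies its requirements."""
    if not steps[current].is_complete():
        raise ValueError(f"Step '{steps[current].name}' is incomplete; cannot progress.")
    return current + 1

# Example: an executed test step cannot pass to review until the tester has
# recorded a result and signed it off.
steps = [
    WorkflowStep("Execute test", ["result", "executed_on"]),
    WorkflowStep("Review", ["review_comment"]),
]
steps[0].data = {"result": "Pass", "executed_on": "2019-10-01"}
steps[0].signed_off_by = "tester01"
current = advance(steps, 0)  # succeeds; would raise if anything were missing
```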

This is exactly what we are planning to cover in our exclusive free webinar. On the 24th of October, Patrick Murray, the Compliant Cloud Technical SME for Pharma VIEW™ (Validated Integrated Enterprise Workflow), will discuss the merits of transforming regulated business from paper-based systems to validated workflows, using some common use cases as inspiration. Now is the best time to enter the world of validated workflows: the new paperless. Discover more here: https://compliantcloud.com/webinar/

 

Data archiving: the obstacles from lab to shelf

How can we embrace this digitisation of data to ensure that vital and essential data is preserved and accessible for as long as it needs to be, while protecting its integrity?

Given the nature of its products and its customers, it follows that the life sciences sector is highly regulated. In fact, the term “pharmaceutical” derives from the Greek “pharmakon”, which means both cure and poison. Hence, before being marketed, pharmaceutical drug products must pass an abundance of different tests and be subject to extensive rules and regulations in order to guarantee safety for customers and patients.

This is not a linear path. The life of a pharmaceutical drug product begins with its discovery, but it does not proceed immediately and quickly to distribution to those who need it most. From the moment of initial conception, the path it follows can be fragmented across different centres, universities and other educational institutes, and even across different pharmaceutical companies. This fragmented path results in a vast amount of data production and data sources, with complex data ownership, data custody and data management rights and requirements, as well as various data media types. These complexities, coupled with the difficulties associated with identifying and controlling data that requires long-term management and maintenance, represent a significant challenge for pharmaceutical companies today.

FDA (Food and Drug Administration) and European regulations prescribe requirements for data retention and data production, for example the requirement to retain relevant data across several generations of software and hardware. Another requirement relates to the retention of pharmaceutical drug product registration-related documentation for as long as a product is on the market plus 10-15 years. A typical registration submission for a pharmaceutical drug product to a Health Authority consists of a large amount of paper scanned to PDF format, generated from and/or summarising some of the source data.

According to Anita Paul (Roche, Basel, Switzerland) and Juerg Hagmann (Novartis, Basel, Switzerland), the future of pharmaceutical drug product registration is gradually becoming paperless and, very soon, paper submissions will no longer be accepted by major Health Authorities. But is the life science sector moving quickly enough in the same direction? The two authors discuss the challenges of digital preservation, which does not just mean the ability to read specific data in a preserved (rendition) format but also the ability to “readily retrieve” all pertinent raw data and metadata.

Digitisation of data is arguably the most effective way to preserve data content and context, and also to facilitate access and retrievability as required. Building digitisation of data in at every step along the fragmented path of a pharmaceutical drug product results in easy access and retrieval of accurate data by the right people, which contributes to sound quality decisions and ultimately safer products for patients.
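One simple way to picture the kind of control this implies – purely a sketch, with hypothetical file names, metadata fields and retention period – is to capture a checksum and basic metadata at the moment a record is digitised and archived, so that its integrity and retention status can be re-verified at any point during its life:

```python
import datetime
import hashlib
import json
from pathlib import Path

def archive_record(source: Path, archive_dir: Path, retention_years: int) -> Path:
    """Copy a digitised record into the archive alongside a checksum and
    retention metadata so its integrity can be re-verified later."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    payload = source.read_bytes()
    target = archive_dir / source.name
    target.write_bytes(payload)

    today = datetime.date.today()
    metadata = {
        "original_name": source.name,
        "sha256": hashlib.sha256(payload).hexdigest(),
        "archived_on": today.isoformat(),
        "retain_until": (today + datetime.timedelta(days=365 * retention_years)).isoformat(),
    }
    (archive_dir / (source.name + ".meta.json")).write_text(json.dumps(metadata, indent=2))
    return target

# Hypothetical usage: archive a scanned registration summary for 15 years.
# archive_record(Path("submission_summary.pdf"), Path("archive/2019"), retention_years=15)
```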

So, the question is, how can we embrace this digitisation of data to ensure that vital and essential data is preserved and accessible for as long as it needs to be while protecting its integrity?

Bibliography

Anita Paul (Roche, Basel, Switzerland); Juerg Hagmann (Novartis, Basel, Switzerland), Challenges of Long-Term Archiving in the Pharmaceutical Industry, 2008 http://www.imaging.org/site/PDFS/Reporter/Articles/Rep23_5_NIPDF2008_PAUL.pdf (last access 09/09/2019)

Periodic Review for outsourced cloud-based computerised systems, applications and infrastructure


By Nicola Brady

Periodic review of computerised systems is a regulatory requirement. EU GMP EudraLex Vol. 4 Annex 11 states: “Computerised systems should be periodically evaluated to confirm that they remain in a valid state and are compliant with GMP. Such evaluations should include, where appropriate, the current range of functionality, deviation records, incidents, problems, upgrade history, performance, reliability, security, and validation status reports.” This regulatory requirement applies to both validated computerised systems and qualified infrastructure. The periodic review process ensures that a system remains compliant with applicable regulations, is fit for its intended use and satisfies company policies and procedures. There are no exceptions to the performance of periodic reviews; however, the frequency, scope and depth may differ depending on the system under evaluation, and this should be determined using a risk-based approach.

Periodic review is often considered a challenging exercise as it requires a detailed, comprehensive, holistic review of all elements pertaining to a computerised system or computer infrastructure, for a defined period and at a defined frequency. This review represents an even bigger challenge when computerised systems, applications or infrastructure are outsourced, and in particular when they are outsourced to the cloud.

The primary requirements for periodic review are the same whether the computerised system or infrastructure is located in-house or outsourced to a service provider.  The table below summarises the particular challenges associated with outsourced cloud-based applications and infrastructure when it comes to periodic review:

The end goal of the periodic review exercise is to establish a clear understanding of the current state of the computerised system or infrastructure, in order to conclude that it remains in a compliant, validated (or qualified) state. So, what is the best way to assure this if you are utilising outsourced cloud-based applications or infrastructure? It is imperative that there is a clear understanding of the controls that are the responsibility of the subscriber versus those that have been delegated to the provider. Where controls are being delegated, the subscriber should ensure they are assessed, accepted and reflective of how they are currently managed. A contract should be established between both parties with clear details in relation to the service provision, responsibilities and controls, including but not limited to a commitment to supporting activities relating to periodic review. The contract should also establish the supplier support required for regulatory inspections, where applicable.
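A simple way to visualise that split – nothing more than an illustrative sketch, with made-up control names – is a small responsibility matrix that flags any delegated control with no contractual commitment behind it before the periodic review is planned:

```python
# Illustrative responsibility matrix for an outsourced cloud-based system.
controls = {
    "Change and configuration management": {"owner": "provider",   "in_contract": True},
    "Backup and restore":                  {"owner": "provider",   "in_contract": True},
    "Periodic review support":             {"owner": "provider",   "in_contract": False},
    "User access review":                  {"owner": "subscriber", "in_contract": True},
}

# Any control delegated to the provider but not covered by the contract is a
# gap that should be closed before relying on it in a periodic review.
gaps = [name for name, c in controls.items()
        if c["owner"] == "provider" and not c["in_contract"]]

if gaps:
    print("Delegated controls with no contractual commitment:", gaps)
```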

A comprehensive contract between the outsourced cloud-based application or infrastructure provider and the subscriber, where all required elements are clearly established and endorsed, will help the subscriber satisfy their periodic review requirements and assure the maintenance of the computer system or infrastructure in a compliant, validated (or qualified) state.

 

Ensuring IOT Data Integrity & Security with Identity and Access Management (IAM)


 

Modestas Jakuska focuses on the importance of using an Identity and Access Management (IAM) system in order to maintain data integrity and security in the context of IOT devices.

Ensuring data integrity means ensuring that data is complete, original, consistent, attributable and accurate. Data must be protected at all stages of its lifecycle: when it is created, transmitted, in use or at rest. Otherwise, there is no assurance that the integrity of the current data is maintained.

This is as important for IOT devices (computing devices that connect wirelessly to a network and have the ability to transmit data) as for any other device. IOT devices are used across a variety of industries, including the life sciences industry, where they are often employed in the control of drug product manufacturing or in equipment monitoring, e.g. IOT sensors monitoring temperature, humidity, light intensity, etc.

There are many considerations for ensuring data integrity for IOT devices including but not limited to:

  • Vendor / Supplier assessment.
  • Verification and definition of the ER (Entity-Relationship) model.
  • Definition of security protocols used by IOT devices.
  • Definition and verification of the use of cryptography for IOT communication.
  • Definition of procedures for good data management.
  • Identity and Access Management (IAM).

In this post, however, I will solely focus on the importance of using an Identity and Access Management (IAM) system in order to maintain data integrity and security. In the context of IOT devices, an IAM system is a set of policies and technologies that ensures that only specified IOT devices have access to specified resources with appropriate restrictions.
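Stripped to its essentials – and this is a deliberately simplified sketch of the concept rather than any particular IAM product, with all device and resource names invented – such a system keeps a registry of device identities and refuses any request from an unregistered device or for a resource it has not been granted:

```python
# Minimal illustration of IAM for IOT devices: only registered identities may
# access the resources explicitly granted to them (deny by default).
DEVICE_REGISTRY = {
    "temp-sensor-01":     {"permitted": {"telemetry/temperature"}},
    "humidity-sensor-02": {"permitted": {"telemetry/humidity"}},
}

def authorise(device_id: str, resource: str) -> bool:
    device = DEVICE_REGISTRY.get(device_id)
    if device is None:
        return False  # unknown device: refuse access
    return resource in device["permitted"]

assert authorise("temp-sensor-01", "telemetry/temperature")
assert not authorise("unregistered-raspberry-pi", "internal/gateway")
```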

The importance of IAM has been highlighted by the recent NASA hack which occurred specifically due to the mismanagement of IOT devices. According to NASA Office of Inspector General [1]: “JPL uses its Information Technology Security Database (ITSDB) to track and manage physical assets and applications on its network; however, we found the database inventory incomplete and inaccurate, placing at risk JPL’s ability to monitor, report effectively, and respond to security incidents.”(Note JPL = Jet Propulsion Laboratory).

No device or network is trivial, including even the most basic IOT devices. In fact, a Raspberry Pi (a credit-card sized computer that plugs into a computer monitor) was used to gain access to the network. Once access was gained, a network gateway was then used to reach other networks. This could all have been avoided if something like network segmentation had been implemented. According to BBC News [2]: “Once the attacker had won access, they then moved around the internal network by taking advantage of weak internal security controls that should have made it impossible to jump between different departmental systems … The stolen data came from 23 files, but little detail was given about the type of information that went astray.”

After this ‘hack’, NASA implemented measures to address the identified system weaknesses, including but not limited to a semi-annual assessment of its inventory to ensure that system components are registered in the Information Technology Security Database.

In conclusion, the implementation of and adherence to robust IAM policies and technologies is a crucial element in the preservation of data integrity and security for IOT devices.  Failure to do so exposes the data to the risk of corruption, alteration or destruction.

References

[1] “Cybersecurity Management and Oversight at the Jet Propulsion Laboratory”, Oig.nasa.gov, 2019. [Online]. Available: https://oig.nasa.gov/docs/IG-19-022.pdf. [Accessed: 03-Aug-2019].

[2] “Raspberry Pi used to steal data from Nasa”, BBC News, 2019. [Online]. Available: https://www.bbc.com/news/technology-48743043. [Accessed: 03-Aug-2019].

The Crossover of Data Integrity and Data Privacy in the Cloud


With the increased adoption of cloud-based applications in the life science sector, Compliant Cloud CSV Engineer Eliane Veiga details the fundamentals of data integrity and data privacy.

Data integrity (DI) and data privacy (DP) challenges have received increased regulatory attention in recent years. When considering GxP applications, a robust approach to risk-based computerized system lifecycle management requires well-defined processes, use of a qualified infrastructure, validated design and deployment of software, qualified personnel, rigorous change management and version control.

With the increased adoption of cloud-based applications in the life science sector, cloud computing solutions such as Software as a Service (SaaS) offer many advantages including enhanced cost-effectiveness, ease of implementation, and flexible, highly scalable platforms. However, assuring data integrity and data privacy in the cloud requires a well-informed, proactive approach by the regulated organization in planning and maintaining control of their data once it is hosted on the cloud provider’s site.

In Europe, protection of data privacy is now regulated under the General Data Protection Regulation (GDPR), which came into force on the 25th May 2018, replacing the existing data protection framework under the EU Data Protection Directive.

Data Integrity – The Fundamentals

The UK Medicines & Healthcare products Regulatory Agency (MHRA) defines data integrity as “the degree to which data are complete, consistent, accurate, trustworthy, reliable and that these characteristics of the data are maintained throughout the data lifecycle” (MHRA, 2018).

Assuring data integrity requires effective quality and risk management systems which enable consistent adherence to sound scientific principles and good documentation practices. The international regulators have defined the acronym ALCOA (Attributable, Legible, Contemporaneous, Original, Accurate) as the five elements necessary to assure data integrity throughout the data lifecycle. Even though ALCOA has been widely discussed in many publications, evidence from US FDA warning letters and EU Statements of Non-Compliance (SNCs) indicates that many still do not understand its fundamentals.

More recent publications, including the WHO Guidance on Good Data and Record Management Practices, have expanded these principles to describe ALCOA+ expectations, which puts additional emphasis on ensuring that data and records are “complete, consistent, enduring and available” (WHO, 2016).
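As a rough illustration of how the ALCOA attributes might map onto the fields of an electronic record – the field names below are invented for the example and are not taken from any regulation or guidance – consider a simple immutable record structure:

```python
import datetime
from dataclasses import dataclass, asdict

@dataclass(frozen=True)  # frozen: the captured record cannot be silently altered
class GxpRecord:
    recorded_by: str    # Attributable: who performed and recorded the activity
    value: str          # Legible and Original: the data as first captured, in readable form
    recorded_at: str    # Contemporaneous: captured at the time of the activity
    source_system: str  # Accurate: traceable back to the system that generated it

entry = GxpRecord(
    recorded_by="analyst01",
    value="pH 7.2",
    recorded_at=datetime.datetime.utcnow().isoformat() + "Z",
    source_system="LIMS",
)
print(asdict(entry))
```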

Data Privacy – The Fundamentals

The General Data Protection Regulation (GDPR) came into force in the EU on the 25th May 2018, replacing the existing data protection framework under the EU Data Protection Directive. The GDPR emphasizes transparency, security and accountability by both data controllers and data processors, while at the same time standardizing and strengthening the right of European citizens to data privacy.

From a health care and cloud-based solutions perspective, the GDPR brings some significant changes from the previous directive, including:

  • a definition of “sensitive personal data”
  • stricter obligations on both data controllers and data processors
  • the appointment of a Data Protection Officer (DPO)
  • the conduct of Data Protection Impact Assessments (DPIAs)
  • assurance of the security of data processing

Data controllers and processors have been allocated shared, stricter responsibilities under the GDPR, and the extent of the obligations on both has come as a surprise to the IT sector.

Under GDPR, the data controller must implement organizational and technical measures to demonstrate compliance of the processing activities undertaken on their behalf. Furthermore, data controllers have the responsibility for selection and oversight of their service providers (data processors).  The GDPR defines such a data processor as “a natural or legal person, public authority, agency or another body which processes personal data on behalf of the controller”. 

The compliance burden is now shared between processors and controllers. One of the significant requirements that GDPR imposes for processors is that if they intend to hire another processor to assist with data management, e.g. a cloud computing supplier, the data controller must approve this appointment prior to commencement. This requirement is intended to protect personal data from transfer to a third party, even to another country, without the controller’s prior authorization.

Conclusion

As the adoption of digital technology – such as cloud-based solutions – has increased in the life science sector, under the GDPR it will no longer be possible for cloud service providers (processors) to position themselves as mere processors and evade the reach of data protection rules. Recent publications have shown that, to achieve assurance of DI in the cloud, service providers must still learn how to satisfy the expectations of the GxP regulatory bodies.

What is 21 CFR Part 11?

 

Adam Lawler answers the big questions about one of the life science industry’s core tenets.


 

The life sciences industry would be nothing without regulatory consistency and control, and one of the most significant forces governing the manufacture of pharmaceuticals is 21 CFR Part 11. The question is, what is it exactly? For such an important regulation it seems to be constantly shrouded in mystery and more than a fair share of confusion, and in this article we hope to answer the big questions about one of the industry’s core tenets.

In short, 21 CFR Part 11 is a section of the Code of Federal Regulations (CFR) which outlines the Food and Drug Administration’s (FDA) code pertaining to electronic signatures and electronic records. What does this mean? Basically, it amounts to accountability, traceability, and transparency. 21 CFR Part 11 clearly lays out the checklist for accurate electronic records and signatures, with the express intention of making sure that companies adhere to good practices when it comes to electronic data logging and maintenance, guaranteeing accuracy, mitigating potential cases of human error, and ensuring that any alterations made to an electronic document can be traced.
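Part 11 does not prescribe how traceability must be implemented, but a common technical pattern is an append-only audit trail in which every change records who did what and when, with each entry chained to the previous one so that later tampering is detectable. The sketch below is purely illustrative; the field names and hashing scheme are assumptions, not requirements of the regulation:

```python
import datetime
import hashlib

audit_trail = []  # append-only list of change entries (illustrative)

def record_change(record_id: str, field: str, old: str, new: str, user: str) -> None:
    """Append an audit-trail entry chained to the previous entry by a hash."""
    previous_hash = audit_trail[-1]["entry_hash"] if audit_trail else ""
    entry = {
        "record_id": record_id,
        "field": field,
        "old_value": old,
        "new_value": new,
        "changed_by": user,
        "changed_at": datetime.datetime.utcnow().isoformat() + "Z",
        "previous_hash": previous_hash,
    }
    entry["entry_hash"] = hashlib.sha256(repr(sorted(entry.items())).encode()).hexdigest()
    audit_trail.append(entry)

record_change("SOP-042", "title", "Cleaning SOP", "Equipment Cleaning SOP", "qa_user")
print(audit_trail[-1]["entry_hash"])
```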

What counts as an electronic record? A simpler question would be what doesn’t; the definition is broad, encompassing everything from words to sound. More specifically, Part 11 defines an electronic record as “any combination of text, graphics, data, audio, pictorial, or other information representation in digital form that is created, modified, maintained, archived, retrieved, or distributed by a computer system.” Traditional analogue methods of record-keeping are not exempt: paper documents, when scanned into a computer, also fall under 21 CFR Part 11 as soon as they are digitised.

Where did this law come from? The law originated from meetings between the FDA and pharmaceutical companies regarding how to deal with record-keeping when entering the hitherto nebulous electronic sphere. Eventually, after much refinement, 21 CFR Part 11 emerged as the proffered solution in its core version in 1997. The most significant alteration made to the law since then was in 2000, when the FDA acknowledged further-developing digitisation by officially stating the equivalence of paper records and electronic records, as well as of electronic signatures and traditional ink signatures. It should also be noted that there is a European equivalent to 21 CFR Part 11 called Annex 11, and, while extensive similarities exist, the requirements of the two do not entirely correspond, something we will explore at length in a later article.

Considering the unwieldy and outmoded nature of paper record-keeping and the ever-shifting electronic landscape, such a law could always stand to be further amended. Over the years, 21 CFR Part 11 has been supplemented by guidance documents relating to data integrity and data management practices from various bodies – including PIC/S, MHRA and WHO, as well as the FDA themselves who issued a draft guidance ‘Data Integrity & Compliance with CGMP’ in April 2016 – but as it stands, 21 CFR Part 11 is one of the longest standing and most influential laws in the life sciences industry pertaining to electronic data.

In a world without 21 CFR Part 11, the impact of human error would be much more significant, and the electronic landscape would remain an untameable beast with no guarantee of accountability or traceability. With it, navigating the overwhelming breadth of the digital realm is that much easier, the outcomes more accurate, and technology becomes just another tool in the belt of regulated record-keeping.

From Compliant to Complaint: the human error minefield in the life sciences industry

In particular for the life sciences industry a human error, undetected or unresolved, poses significant risk to the end user of the product.

 

Nicola Brady tells you how to mitigate risks in the life sciences industry.

To err is human, to forgive divine. But this forgiveness is not usually forthcoming in industries where a human error can translate into a significant business impact. Human error imposes significant costs on a business: costs to the quality of the product or service being delivered, financial costs and, often, reputational costs.

In particular for the life sciences industry a human error, undetected or unresolved, poses significant risk to the end user of the product.  This is why life sciences companies invest so heavily in programs and policies to drive human error down to as low a level as possible. To eliminate it completely is impossible!  Although the world in which we live is moving rapidly towards automation and Artificial Intelligence (AI) technologies, people are still necessary and unfortunately fallible; where there are people there will always be the potential for error.

So, what can a company do to reduce the occurrence of human error, or reduce the impact when it occurs?

  • Allow time for training. Initial training and on-the-job training should be in place and appropriate time should be allocated to allow for training.
  • Put robust processes in place. Having comprehensive policies and procedures in place will ensure standard consistent processes are followed and make errors and deviations more detectable. ‘Error proof’ the process as far as practicable. Complex processes should be risk assessed and mitigation actions implemented as required.
  • Ensure the workplace environment is appropriate for the work required. Consider noise levels, lighting, temperature or other environmental factors that might cause distraction.
  • Document it, investigate it, learn from it. Effective investigation processes should be in place to determine root causes and implement corrective actions. The investigation should not stop when a root cause of ‘human error’ is determined; dig deeper and you might find that there was something else at play.  This will allow you to address the underlying causes that contributed to the human error in the first place and reduce the likelihood of its recurrence.
  • Adopt the right culture. There’s no use for ‘blame culture’.  A quality culture where employees are encouraged to ‘raise their hands’ when mistakes occur actually serves to drive the rate of mistakes down.

If a company applies the right focus and attention to training, processes, workplace environment, investigations and overall culture, it should find it easy to remain compliant and avoid that complaint!