How TIM WOODS applies to paper-based systems

A major goal of the life sciences community is to move away from paper-based systems, and it’s easy to see why. Some of the challenges and waste associated with paper-based systems can be summarised using the acronym TIM WOODS; not a real person, but full of real problems.

T – The “T” stands for “Transport”, which involves the physical movement of paper documentation around the office and around the business, starting at the printer where it is initially created. The documentation is passed around to testers, reviewers and approvers, transported from one person to the next, between functions and between physical locations. At the end of all of this, once all approvers have completed their approvals or the paper documentation has served its purpose, it is transported to its final destination, a folder perhaps or a document storage area. But in the life science industry we know that this paper document can be pulled at any time, for example during audits or to support investigations, and so the transportation starts again. Not only is the transportation a huge waste, but how do you protect the document while it is being transported from file to folder and person to person? How do you assure that it will not be mislaid? How do you preserve the integrity of the document in terms of completeness, availability and retrievability?

I – The “I” stands for “Inventory”, or in this case the amount of physical documentation associated with paper processes, for example the physical retention of master copies of documentation as well as obsolete or superseded versions. The stack of physical paper doesn’t take long to become a mountain, which poses challenges when it comes to long-term storage and retention. Many companies within the life sciences sector end up outsourcing their long-term storage to a third party, which in itself introduces additional complexities around retention, retrievability and traceability.

M – The “M” represents “Motion”, which in the context of paper documentation relates not only to moving documentation around the business but also to the movement of people. For example, if I need to work on the same document as you, then either the document has to move to me or I have to move to the document, and the same applies to anyone else in the organisation who needs it.

W – The “W” stands for “Waiting”. Only one person can work on a physical document at a time. Even if staff are co-located, there is an amount of waiting required for one person to complete their activities before the next person can perform theirs. For example, the review of a physical executed test script for a validation exercise can only be performed by one reviewer at a time, so the next person in the chain has to wait for the previous person to complete their task.

O – The first “O” stands for “Over-production” and the second “O” stands for “Over-processing”. Paper processes, by their very nature, are often laden with inefficiencies. If you take the example of a physical documentation control process for standard operating procedures, the level of work associated with addressing a typo on a single page of a controlled SOP is often equivalent to the level of work associated with a more significant change.

D – The “D” is for “Defects”, which basically amounts to the waste associated with something going wrong. Take the example of the executed test script for a validation exercise: if the tester makes errors when recording test details in the paper documentation, it necessitates additional documentation, explanation and sometimes investigation and rework, which ultimately generates more paper.

S – The “S” is for “Skills”. When paper processes are abundant in an organisation, it can often lead to the under-utilisation or poor utilisation of skills, with so much of the labour of highly skilled people spent waiting at printers, scanning documents, stamping documents, filing documents and so on.

We’ve all had our own experiences with paper and its compliance and data integrity challenges, but the question remains: what does the future really look like beyond paper? One potential solution is the introduction of validated workflows. The aim of validated workflows is to eliminate GDP errors and to have data integrity built in from the start, so that you cannot progress to the next step until you satisfy specific workflow requirements. They have been, and can be, successfully used for the automation and management of validation activities, logbooks, documentation control – essentially anything that has an associated workflow.
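To make the idea of a workflow “gate” concrete, here is a minimal sketch in Python. It is purely illustrative and does not represent Pharma VIEW™ or any particular product; the step name, required fields and approver roles are hypothetical. The point is simply that a step refuses to advance until every required entry has been captured and every required approver has signed.

```python
# Illustrative sketch of a "gated" workflow step (hypothetical names; not an
# actual product implementation): the step cannot advance until all required
# fields are captured and all required approvers have signed.
from dataclasses import dataclass, field


@dataclass
class WorkflowStep:
    name: str
    required_fields: set          # data that must be recorded at this step
    required_approvers: set       # roles that must sign before moving on
    captured: dict = field(default_factory=dict)
    signatures: set = field(default_factory=set)

    def capture(self, field_name, value, user):
        # In a real system every entry would be audit-trailed; here we simply
        # record the value and who entered it.
        self.captured[field_name] = {"value": value, "by": user}

    def sign(self, role):
        self.signatures.add(role)

    def can_advance(self):
        missing_fields = self.required_fields - set(self.captured)
        missing_approvals = self.required_approvers - self.signatures
        return not missing_fields and not missing_approvals


step = WorkflowStep(
    name="Execute test script TS-001",
    required_fields={"actual_result", "pass_fail"},
    required_approvers={"reviewer", "qa_approver"},
)
step.capture("actual_result", "Login screen displayed", user="tester_1")
print(step.can_advance())   # False: pass/fail not yet recorded, no approvals
step.capture("pass_fail", "pass", user="tester_1")
step.sign("reviewer")
step.sign("qa_approver")
print(step.can_advance())   # True: the step may now progress
```

Because the gate is enforced in the workflow itself rather than in a procedure, a missed entry or signature simply cannot slip through to the next step.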

This is exactly what we are planning to cover in our exclusive free webinar. On the 24th of October, Patrick Murray, the Compliant Cloud Technical SME for Pharma VIEW™ (Validated Integrated Enterprise Workflow), will discuss the merits of transforming regulated businesses from paper-based systems to validated workflows, using some common use cases as inspiration. Now is the best time to enter the world of validated workflows: the new paperless. Discover more here: https://compliantcloud.com/webinar/


Data archiving: the obstacles from lab to shelf

How can we embrace the digitisation of data to ensure that vital and essential data is preserved and accessible for as long as it needs to be, while protecting its integrity?

Given the nature of its products and its customers, it follows that the life sciences sector is highly regulated. In fact, the Greek root of the term “pharmaceutical”, “pharmakon”, means both remedy and poison. Hence, before being marketed, pharmaceutical drug products must pass an abundance of different tests and be subject to extensive rules and regulations in order to guarantee safety for customers and patients.

This is not a linear path. The life of a pharmaceutical drug product begins with its discovery, but it does not end with its immediate distribution to those who need it most. From the moment of initial conception, the path it follows can be fragmented across different centres, universities and other educational institutes, and even across different pharmaceutical companies. This fragmented path results in a vast amount of data production and data sources with complex data ownership, data custody and data management rights and requirements, as well as various data media types. These complexities, coupled with the difficulties associated with identifying and controlling data that requires long-term management and maintenance, represent a significant challenge for pharmaceutical companies today.

FDA (Food and Drug Administration) and European regulations prescribe requirements for data retention and data production, for example the requirement to retain relevant data across several generations of software and hardware. Another requirement relates to the retention of pharmaceutical drug product registration-related documentation for as long as a product is on the market plus 10-15 years. A typical registration submission for a pharmaceutical drug product to a Health Authority consists of a large amount of paper scanned to PDF format, generated from and/or summarising some of the source data.

According to Anita Paul (Roche, Basel, Switzerland) and Juerg Hagmann (Novartis, Basel, Switzerland), the future of pharmaceutical drug product registration is gradually becoming paperless and, very soon, paper submissions will no longer be accepted by major Health Authorities. But is the life science sector moving quickly enough in the same direction? The two authors discussed the challenges of digital preservation, which does not just mean the ability to read specific data in a preserved (rendition) format but also the ability to “readily retrieve” all pertinent raw data and metadata.

Digitisation of data is arguably the most effective way to preserve data content and context, and to facilitate access and retrievability as required. Building digitisation in at every step along the fragmented path of a pharmaceutical drug product results in easy access to, and retrieval of, accurate data by the right people, which contributes to sound quality decisions and, ultimately, safer products for patients.

So, the question is, how can we embrace this digitisation of data to ensure that vital and essential data is preserved and accessible for as long as it needs to be while protecting its integrity?

Bibliography

Anita Paul (Roche, Basel, Switzerland) and Juerg Hagmann (Novartis, Basel, Switzerland), “Challenges of Long-Term Archiving in the Pharmaceutical Industry”, 2008. http://www.imaging.org/site/PDFS/Reporter/Articles/Rep23_5_NIPDF2008_PAUL.pdf (last accessed 09/09/2019)

Where’s my data gone?

Understanding how data flows within an organisation is key to ensuring that it can be managed and analysed effectively.


Photo credit: Silvia Paola Lai

By Nicola Brady, QA Compliance Specialist
Data is already everywhere, and as information technologies evolve and the world in which we live becomes more automated, the available data gets even bigger and more prolific. Understanding how data flows within an organisation is key to ensuring that it can be managed and analysed effectively. The bigger data gets, the more complex it is to deal with, which is why understanding the supply chain of your data is so important. This is particularly true for the life science industry, where quality and GMP decisions are made every day based on data, and where data itself is a critical product because it underpins all products and processes. As such, it is imperative that we understand how and where data flows: do you know where your data is, who is accessing it at any given time during its lifecycle, and how you can assure the preservation of its integrity?

The data flow for a given process or system can be defined as the supply chain for all data, metadata, inputs and outputs for that process and system. All data goes through a process of creation, processing, review, reporting and use, retention and retrieval, and destruction. During the data lifecycle, the data may cross between different systems, between manual (paper) processes and computerised systems, and to cloud-based applications and storage. Data may also move across organisational boundaries, e.g. internally between departments, or externally between regulated companies and third parties. Understanding and controlling these handoffs between processes, systems and entities is already complex, and even more so where the data is moving in and out of cloud-based applications provisioned by a third party!
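To make the idea of a data supply chain a little more concrete, here is a small, hypothetical Python sketch of a chain-of-custody log that records each handoff of a record as it moves between systems and organisations. The system names, record IDs and actors are invented for illustration; a real implementation would sit on audit-trailed, access-controlled storage.

```python
# Hypothetical chain-of-custody log for a single record as it moves through
# its lifecycle and across system and organisational boundaries.
from datetime import datetime, timezone

custody_log = []


def record_handoff(record_id, from_system, to_system, actor):
    """Append one handoff event: who moved which record, from where to where."""
    custody_log.append({
        "record_id": record_id,
        "from": from_system,
        "to": to_system,
        "actor": actor,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })


record_handoff("BATCH-042", "LIMS", "QA review queue", "analyst_1")
record_handoff("BATCH-042", "QA review queue", "cloud archive (third party)", "qa_2")

# "Where is my data and who has handled it?" becomes a simple query:
for event in custody_log:
    print(f"{event['timestamp']}  {event['record_id']}: "
          f"{event['from']} -> {event['to']} (by {event['actor']})")
```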

Has your organisation made a decision to outsource activities, such as data storage, to external cloud service providers? Are you taking a risk handing over your data to an unknown entity? Do you understand how your data will be protected and controlled by the external service provider? Does the external service provider fully understand what is expected from a life-science regulatory perspective? Are they willing and able to demonstrate this?

To mitigate the potential risk to your data when outsourcing to a third party, you must have a clear understanding of exactly where your data will reside, whether other third-party suppliers or subcontractors will have access to it, and what security control measures will be implemented to safeguard it. This can only be achieved through appropriate vetting of your potential third-party supplier. Once you are satisfied with the supplier, you should ensure that an agreement and contract containing explicit requirements and controls is established and approved prior to using the supplier for the outsourced activity. Once the service is in use, you should periodically evaluate your third-party supplier to ensure that the requirements and controls per the contractual agreements are being adhered to.

So, irrespective of the process or system or its interfaces and boundaries, once an organisation can pinpoint where all associated data is at any given time during the data lifecycle, and understands the controls in place to protect that data, even when it is stored in a cloud-based application managed by a third-party supplier, it can be confident that data integrity can be assured.

My Cloud Service Provider is ISO certified – Isn’t that enough?

Quality Assurance and Compliance Specialist Nicola Brady details the best way to approach Cloud Service Providers, by taking their standards and certifications into account, but also taking responsibility as a subscriber to ensure that they meet requirements.

Cloud Service Providers (CSPs) more often than not hold a myriad of standards and certifications purporting to make them a better option than their competitors. A CSP may be certified to one or more quality standards, including ISO 27001 (managing information risk), ISO 9001 (quality management of business procedures), COBIT (Control Objectives for IT) or SSAE 16 (controls over security, availability and confidentiality). While the attainment of these standards and certifications goes a long way towards inspiring confidence in a prospective subscriber, they cannot, and should not, replace due diligence on the part of the prospective subscriber in establishing whether the cloud service provider will be able to deliver a service that meets their specific requirements.

But surely these standards and certifications count for something?

Absolutely! To achieve and maintain these standards and certifications, CSPs must have efficient and effective management and business practices, processes and controls in place. This can provide assurance to the prospective subscriber and inspire confidence in the ability of the CSP to deliver on the subscriber’s requirements. Any standards and certifications held by the CSP may also be leveraged, to an extent, to satisfy the subscriber’s requirements. However, they cannot replace the mandatory requirements of the subscriber, particularly where the subscriber is a regulated entity, e.g. a life science company. The regulated entity must meet specific regulations, including GMP regulations, and the standards and certifications held by the CSP will not, on their own, satisfy those regulations.

So, what should the subscriber do?

It is not up to the CSP to meet the prospective subscriber’s regulatory requirements. No, it is the subscriber’s responsibility to perform a thorough evaluation of the CSP to determine whether its processes and controls stand up to scrutiny. Once the CSP has been appropriately vetted and determined to be suitable for the service delivery required, clear responsibilities and accountabilities must be established via a comprehensive contract. At the end of the day, no standard or certification relieves the potential subscriber of the responsibility to meet the requirements of the regulated industry in which they operate.

Lost in Space – Navigating the new IT landscape in the Life Science Industry

A perspective on trying to adapt to the ever-changing technologies and tools at our disposal in the life sciences sector.

How do we keep up with a speeding car that’s only getting faster? QA and Compliance Specialist Nicola Brady gives an outsider’s perspective on trying to adapt to the ever-changing technologies and tools at our disposal in the life sciences sector.

I’m not naturally tech savvy. I’ll be the first to admit that I call the IT help desk whenever I encounter any issue with my computer – after I try turning it off and on again, of course – so you’d hardly believe that I studied some computer science elements as part of my undergraduate degree in science. I cast my mind back now to those classes on C++ programming, SQL and database design, biomedical imaging and emerging technologies, and I remember thinking, “how is this science?”. Now, the question is, “how wrong was I?”

Science and information technology do not exist independently of each other. Look at the life science industry, for example: how would drug product development or drug product manufacturing be possible without the symbiosis of science and information technology? In fact, life science companies these days are investing more time, resources and effort than ever before in implementing and maintaining information technologies to deliver safe, efficacious and affordable products to patients. The information technologies and tools that I encountered during my studies may even be obsolete now (I am not going to share how long ago that was!), as those technologies and tools are evolving so quickly that it is often difficult for organisations to keep pace. Should I have paid more attention? Perhaps, although I could argue that I know what I know, and I certainly know what I don’t know!

Over the course of my career in various Quality Control, Quality Assurance and Compliance roles in the life sciences sector, I have interacted as an end user with my fair share of information technologies and tools. I have supported validation and qualification activities, facilitated risk assessments, conducted investigations and audited the associated processes for these same technologies and tools. But these roles did not require me to be a software developer, a programmer or a system builder; there are many more skilled people suited to those technical tasks. No, my role is about looking at these information technologies and tools relative to their intended use and application, and asking the tough questions: how will we meet the regulatory requirements? How can we qualify or validate the technology for its intended use? How can we assure data integrity? What do we need to implement to monitor and control it?

So, while I will try my best to embrace new information technologies and tools as they emerge – tools including cloud computing, data analytics, blockchain, IOT (Internet of Things) devices and even AI (Artificial Intelligence) – I will most definitely stay on top of the changing regulatory landscape pertaining to their use in the life science industry.

Ensuring IOT Data Integrity & Security with Identity and Access Management (IAM)

Modestas Jakuska focuses on the importance of using an Identity and Access Management (IAM) system in order to maintain data integrity and security in the context of IOT devices.

Ensuring data integrity means ensuring that data is complete, original, consistent, attributable and accurate. Data must be protected at all stages of its lifecycle: when it is created, transmitted, in use and at rest. Otherwise, there is no assurance that the integrity of the data is maintained.
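One common, generic technique for detecting whether a record has been altered in transit or at rest is to store a cryptographic hash alongside it and re-verify the hash on retrieval. The sketch below is illustrative only (the record content is invented); in a regulated setting it would be layered with audit trails, access control and signatures rather than used on its own.

```python
# Illustrative integrity check: compare a stored SHA-256 fingerprint with the
# fingerprint of the record as retrieved (or as received over a network).
import hashlib


def fingerprint(payload: bytes) -> str:
    return hashlib.sha256(payload).hexdigest()


original = b"temperature=5.2C;probe=P-07;time=2019-08-03T10:15:00Z"
stored_hash = fingerprint(original)          # kept with the record when created

retrieved = b"temperature=5.2C;probe=P-07;time=2019-08-03T10:15:00Z"
if fingerprint(retrieved) == stored_hash:
    print("Integrity check passed: record is unchanged")
else:
    print("Integrity check FAILED: record has been altered or corrupted")
```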

This is as important for IOT devices (computing devices that connect wirelessly to a network and have the ability to transmit data) as for any other device. IOT devices are used across a variety of industries, including the life sciences industry, where they are often employed in the control of drug product manufacturing or in equipment monitoring, e.g. IOT sensors monitoring temperature, humidity, light intensity, etc.

There are many considerations for ensuring data integrity for IOT devices including but not limited to:

  • Vendor / Supplier assessment.
  • Verification and definition of the ER (Entity-Relationship) model.
  • Definition of the security protocols used by IOT devices.
  • Definition and verification of the use of cryptography for IOT communication.
  • Definition of procedures for good data management.
  • Identity and Access Management (IAM).

In this post, however, I will solely focus on the importance of using an Identity and Access Management (IAM) system in order to maintain data integrity and security. In the context of IOT devices, an IAM system is a set of policies and technologies that ensures that only specified IOT devices have access to specified resources with appropriate restrictions.
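As a rough illustration of that definition, the sketch below models an IAM policy as a deny-by-default mapping from a device identity to the resources and actions it is permitted. The device names, resources and policy format are hypothetical and do not reflect any particular IAM product.

```python
# Hypothetical deny-by-default IAM policy: each device identity is mapped to
# the resources it may touch and the actions it may perform on them.
IOT_POLICIES = {
    "temp-sensor-01":    {"telemetry/coldroom-1": {"write"}},
    "humidity-probe-07": {"telemetry/warehouse": {"write"}},
    "dashboard-service": {"telemetry/coldroom-1": {"read"},
                          "telemetry/warehouse": {"read"}},
}


def is_allowed(device_id, resource, action):
    """Unknown devices, resources or actions are denied by default."""
    return action in IOT_POLICIES.get(device_id, {}).get(resource, set())


print(is_allowed("temp-sensor-01", "telemetry/coldroom-1", "write"))      # True
print(is_allowed("temp-sensor-01", "telemetry/warehouse", "write"))       # False
print(is_allowed("unregistered-device", "telemetry/coldroom-1", "read"))  # False
```

The essential point is the inventory: if a device is not registered in the policy, it gets no access at all, which is exactly the control that was found lacking in the incident described below.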

The importance of IAM has been highlighted by the recent NASA hack, which occurred specifically due to the mismanagement of IOT devices. According to the NASA Office of Inspector General [1]: “JPL uses its Information Technology Security Database (ITSDB) to track and manage physical assets and applications on its network; however, we found the database inventory incomplete and inaccurate, placing at risk JPL’s ability to monitor, report effectively, and respond to security incidents.” (Note: JPL is the Jet Propulsion Laboratory.)

No device or network is trivial, and that includes even the most basic IOT devices. In fact, a Raspberry Pi (a credit-card-sized computer that plugs into a computer monitor) was used to gain access to the network. Once the network was accessed, a network gateway was then used to gain access to other networks. This could all have been avoided if something like network segmentation had been implemented. According to BBC News [2]: “Once the attacker had won access, they then moved around the internal network by taking advantage of weak internal security controls that should have made it impossible to jump between different departmental systems … The stolen data came from 23 files, but little detail was given about the type of information that went astray.”
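For illustration, network segmentation can be thought of as a deny-by-default rule set between network zones, so that a foothold gained in one segment (for example via an unauthorised device) does not automatically provide a route into the others. The segment names below are invented.

```python
# Hypothetical deny-by-default routing rules between network segments: only
# explicitly allowed segment-to-segment paths are permitted.
ALLOWED_ROUTES = {
    ("iot-segment", "telemetry-collector"),
    ("office-segment", "file-server"),
}


def may_route(src_segment, dst_segment):
    return (src_segment, dst_segment) in ALLOWED_ROUTES


print(may_route("iot-segment", "telemetry-collector"))  # True: permitted path
print(may_route("iot-segment", "mission-systems"))      # False: lateral move blocked
```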

After this hack, NASA implemented measures to address the identified system weaknesses, including, but not limited to, a semi-annual assessment of the inventory to ensure that system components are registered in the Information Technology Security Database (ITSDB).

In conclusion, the implementation of and adherence to robust IAM policies and technologies is a crucial element in the preservation of data integrity and security for IOT devices.  Failure to do so exposes the data to the risk of corruption, alteration or destruction.

References

[1] “Cybersecurity Management and Oversight at the Jet Propulsion Laboratory”, Oig.nasa.gov, 2019. [Online]. Available: https://oig.nasa.gov/docs/IG-19-022.pdf. [Accessed: 03-Aug-2019].

[2] “Raspberry Pi used to steal data from Nasa”, BBC News, 2019. [Online]. Available: https://www.bbc.com/news/technology-48743043. [Accessed: 03-Aug-2019].

The Crossover of Data Integrity and Data Privacy in the Cloud

With the increased adoption of cloud-based applications in the life science sector, Compliant Cloud CSV Engineer Eliane Veiga details the fundamentals of data integrity and data privacy.

Data integrity (DI) and data privacy (DP) challenges have received increased regulatory attention in recent years. When considering GxP applications, a robust approach to risk-based computerized system lifecycle management requires well-defined processes, use of a qualified infrastructure, validated design and deployment of software, qualified personnel, rigorous change management and version control.

With the increased adoption of cloud-based applications in the life science sector, cloud computing solutions such as Software as a Service (SaaS) offer many advantages including enhanced cost-effectiveness, ease of implementation, and flexible, highly scalable platforms. However, assuring data integrity and data privacy in the cloud requires a well-informed, proactive approach by the regulated organization in planning and maintaining control of their data once it is hosted on the cloud provider’s site.

In Europe, the protection of data privacy is now regulated under the General Data Protection Regulation (GDPR), which came into force on the 25th May 2018, replacing the existing data protection framework under the EU Data Protection Directive.

Data Integrity – The Fundamentals

The UK Medicines & Healthcare products Regulatory Agency (MHRA) defines data integrity as “the degree to which data are complete, consistent, accurate, trustworthy, reliable and that these characteristics of the data are maintained throughout the data lifecycle” (MHRA, 2018).

Assuring data integrity requires effective quality and risk management systems which enable consistent adherence to sound scientific principles and good documentation practices. International regulators use the acronym ALCOA (attributable, legible, contemporaneous, original, accurate) to describe the five elements necessary to assure data integrity throughout the data lifecycle. Even though ALCOA has been widely discussed in many publications, evidence from US FDA warning letters and EU Statements of Non-Compliance (SNCs) indicates that many still do not understand the fundamentals of ALCOA.

More recent publications, including the WHO Guidance on Good Data and Record Management Practices, have expanded these principles to describe ALCOA+ expectations, which puts additional emphasis on ensuring that data and records are “complete, consistent, enduring and available” (WHO, 2016).

Data Privacy – The Fundamentals

The General Data Protection Regulation (GDPR) came into force in the EU on the 25th May 2018, replacing the existing data protection framework under the EU Data Protection Directive. The GDPR emphasizes transparency, security and accountability by both data controllers and data processors, while at the same time standardizing and strengthening the right of European citizens to data privacy.

From a healthcare and cloud-based solutions perspective, the GDPR brings some significant changes from the previous directive, including:

  • the definition of “sensitive personal data”
  • stricter obligations on both data controllers and data processors
  • the appointment of a Data Protection Officer (DPO)
  • the conduct of Data Protection Impact Assessments (DPIAs)
  • assurance of the security of data processing

The GDPR allocates shared, and stricter, responsibilities to data controllers and data processors, and the extent of these obligations has come as a surprise to the IT sector.

Under the GDPR, the data controller must implement organizational and technical measures to demonstrate the compliance of the processing activities undertaken on its behalf. Furthermore, data controllers are responsible for the selection and oversight of their service providers (data processors). The GDPR defines a data processor as “a natural or legal person, public authority, agency or another body which processes personal data on behalf of the controller”.

The compliance burden is now shared between processors and controllers. One of the significant requirements that GDPR imposes for processors is that if they intend to hire another processor to assist with data management, e.g. a cloud computing supplier, the data controller must approve this appointment prior to commencement. This requirement is intended to protect personal data from transfer to a third party, even to another country, without the controller’s prior authorization.

Conclusion

As the adoption of digital technology – such as cloud-based services – has increased in the life science sector, under the GDPR it will no longer be possible for cloud service providers (processors) to position themselves as mere processors and evade the reach of data protection rules. Recent publications have shown that, to achieve assurance of DI in the cloud, service providers must still learn how to comply with the expectations of GxP regulatory bodies.

From Compliant to Complaint: the human error minefield in the life sciences industry

In particular for the life sciences industry, a human error, undetected or unresolved, poses significant risk to the end user of the product.


Nicola Brady tells you how to mitigate risks in the life sciences industry.

To err is human, to forgive divine. But this forgiveness is not usually forthcoming in industries where a human error can translate into significant business impact. Human error imposes significant costs on a business: costs to the quality of the product or service being delivered, financial costs and, often, reputational costs.

In particular for the life sciences industry, a human error, undetected or unresolved, poses significant risk to the end user of the product. This is why life sciences companies invest so heavily in programs and policies to drive human error down to as low a level as possible; to eliminate it completely is impossible! Although the world in which we live is moving rapidly towards automation and Artificial Intelligence (AI) technologies, people are still necessary and unfortunately fallible; where there are people there will always be the potential for error.

So, what can a company do to reduce the occurrence of human error, or reduce the impact when it occurs?

  • Allow time for training. Initial training and on-the-job training should be in place and appropriate time should be allocated to allow for training.
  • Put robust processes in place. Having comprehensive policies and procedures in place will ensure that standard, consistent processes are followed and will make errors and deviations more detectable. ‘Error proof’ the process as far as practicable. Complex processes should be risk assessed and mitigation actions implemented as required.
  • Ensure the workplace environment is appropriate for the work required. Consider noise levels, lighting, temperature or other environmental factors that might cause distraction.
  • Document it, investigate it, learn from it. Effective investigation processes should be in place to determine root causes and implement corrective actions. The investigation should not stop when a root cause of ‘human error’ is determined; dig deeper and you might find that there was something else at play.  This will allow you to address the underlying causes that contributed to the human error in the first place and reduce the likelihood of its recurrence.
  • Adopt the right culture. There’s no use for ‘blame culture’.  A quality culture where employees are encouraged to ‘raise their hands’ when mistakes occur actually serves to drive the rate of mistakes down.

If a company applies the right focus and attention to training, processes, workplace environment, investigations and overall culture, it should find it easy to remain compliant and avoid that complaint!

Moving to the cloud – Regulated companies’ business drivers and challenges for regulated applications and data


By Oisín Curran, CEO at Compliant Cloud and Odyssey VC

Almost every customer we speak with these days, regardless of whether they are in the regulated Life Science sector or not, has a clear IT strategy driven from senior management that is ‘cloud first’. In many cases these are throw-away statements made by management functions who perceive the move to cloud as the silver bullet for managing the IT and data challenges that lie in front of them.

Moving to cloud can be perceived to eliminate some of the basic problems of traditional on-premise installs, such as (and not limited to, of course!) the following:

  1. Datacentre build and maintenance is too costly. We don’t want to own datacentres anymore – we want to focus on our core business of making product X or delivering service Y
  2. We need to cut our headcount. Buying XaaS can reduce headcount and operating costs
  3. We need to cut our operating costs for application ownership
  4. We need to reduce the number of SLAs with third-party vendors


While the above makes sense of course, senior management should be aware that moving to the cloud creates new costs and headcount challenges and, in the case of Life Sciences, introduces potentially significant risks. Gartner, in their Hype Cycle for Software as a Service [1], state that SaaS can be a challenge in that ‘IT operational challenges, risks, support service model, and gaps in controls stand in the way of enterprises fully exploiting the potential of SaaS.’ Given these unknowns, businesses should define a cloud service strategy that fits the overarching company business strategy before making any IT decisions related to XaaS.

Life Science organisations have to pay particular attention to the new and untested risks introduced by moving to the cloud. This is evident in the fact that regulatory bodies consider XaaS an ‘Outsourced Activity’, which falls under the associated regulations governing outsourcing. The regulatory expectations in these cases are clear and mandate the following (with a specific focus on EudraLex):

  1. There must be written contracts:
      a. With clear responsibilities, communication processes and technical aspects, including who undertakes each step of the outsourced activity
  2. The Contract Giver must:
      a. Include control and review of any outsourced activities in their quality system
      b. Monitor and review the performance of the Contract Acceptor
  3. The Contract Acceptor must:
      a. Be able to carry out the outsourced activity satisfactorily
      b. Not subcontract to a third party without prior approval
      c. Not make unauthorised changes
      d. Be available for inspection

Beyond these regulatory expectations, there are a number of practical considerations for regulated companies moving to XaaS:

  1. Looking for validated XaaS creates significant supply and demand pressure on existing compliance service providers
      a. Software vendors generally do not have expertise in compliance. Pushing this responsibility onto them is likely to result in shortfalls on the quality and compliance side of the delivery. Gartner notes [1] that regulated companies should:
          i. Beware of vendors that claim to have a validated environment
          ii. Partner only with companies that are transparent, open for audits, and committed to compliance
  2. An ISO certification is not evidence of a Life Science Quality Management System (QMS)
      a. Remember, the regulators consider this an outsourced activity, so the regulated company must ensure the ability of the vendor to deliver the service in line with regulatory expectations.
      b. This requires the vendor to have a clear and demonstrable QMS and, critically, a level of integration with the customer’s QMS processes. This echoes the Gartner recommendation to partner only with providers that have demonstrated expertise in this vertical.
  3. Risk is a subjective term – make sure you’re clear with your supplier
      a. Remember the regulatory focus on Data Integrity. This requires a clear understanding of the risks to data integrity posed by the XaaS vendor and should be a guiding principle in their application design.
      b. Change management should include a clear callout of risks to data integrity, e.g. ALCOA+ risks, and not just reference business risks such as up-time and availability.

All considered, we are at a very exciting time in the evolution of cloud-based services in the Life Sciences sector. We are seeing more and more cloud-native application options that bring significant operational benefits in terms of cost, data mobility and integration. At the end of the day, suppliers in the Life Science vertical need to be hypersensitive to the regulated business’s need to ensure Patient Safety, Product Quality and Data Integrity. By aligning ourselves with the business drivers of the regulated business, we are best placed to play our part in delivering tomorrow’s health solutions.

[1] Hype Cycle for Software as a Service, 2018. Published: 31 July 2018. ID: G0036079.