As a quality management professional with over 20 years of experience working with pharmaceutical and medical device companies, there’s one thing that gives me chills and makes me run for the hills, figuratively speaking. That’s when someone in a life sciences organization—usually IT—reaches out and says, “We’re implementing a new system, and someone mentioned it might need validation.” Even worse is the past tense version: “We’ve implemented this awesome new system…” The latter changes my initial impulse to run from figurative to quite literal.
I once had a colleague approach me about validating a laboratory information management system (LIMS) they were using to manage HPLC (high-performance liquid chromatography) data. For those not familiar, both are incredibly configurable, and labeling them as complex is an understatement. I was told they were going live in two weeks. Oh, how I laughed. This was the funniest thing I’d heard in a long while … until I realized he was serious.
Whatever my initial reaction, my curiosity always wins out: What system? What’s the intended use? Is the vendor approved and qualified? Have we done any testing? If so, how much? Is it new or is it replacing an existing system? Is it 21 CFR Part 11 compliant? Is this production or R&D? What’s the intended use? I know, I know … I already asked that one, but it bears repeating.
The answers to all of these questions are the cornerstone of any GxP compliance effort, and they help define the scope. A qualified and approved vendor shows that the company making/selling the system meets a minimum set of quality standards. The amount of testing performed can help reduce the amount of additional effort required to meet specific regulatory requirements.
In my opinion, intended use is the most important aspect of any validation effort. It shapes and molds everything you do, and, more important, everything you don’t do. How a system or application is used drives the amount of risk as well as the amount (and possibly level) of testing required, not to mention the documentation, training, and everything in between.
Over the last decade or so, an increased emphasis has been placed on intended use and risk-based decision-making. It is no longer necessary to test every function a system has to offer; instead, we focus our efforts only on those functions that are reasonably within the scope of how the system is to be used. An example of this is an electronic quality management system (QMS). If you’re only going to implement corrective and preventive actions (CAPAs) in the system, there’s no reason to test modules for other quality events, document management, supplier qualification, audits, and so on.
In the current world of SaaS models and Snowflake’s Data Cloud, understanding the intended use is absolutely critical to having a system that enables compliance with current regulatory requirements. Are you developing applications powered by Snowflake? Are you connecting your system to Snowflake? What type of data are you storing? How will the data be used? Depending on the answers to these and other questions, the approach and scope of creating a compliant, validated environment will vary.
Ultimately, compliance is about showing control, with objective evidence, to reduce or eliminate risk and impact to patient safety, product quality, and data integrity. This applies not only to physical products like pharmaceuticals and medical devices, but also to software systems that hold records on how those products are made. Diagnostics and data analytics are also playing an ever-increasing role in patient care.
Ensuring a supplier can reliably provide goods and/or services is an industry-agnostic practice. Every consumer should ensure the services they purchase meet their predefined requirements. Don’t settle for less. For FDA-regulated industries, qualifying vendors is a requirement, not just best practice.
It is up to each organization to define what constitutes an acceptable level of quality. These standards can consist of specific policy and process documentation, third-party certifications or attestations, or specific security standards. The evaluation can be performed in any number of ways, such as bench audits, questionnaires, or on-site or remote audits.
Computer system validation
The process of testing and verifying that a computerized system or application works as intended is multifaceted and requires thoughtful planning.
Planning is the roadmap guiding you through predefined validation requirements, the tests that address those requirements, and results that either satisfy the intended use or do not. Planning will also identify what additional activities are required before production use can begin, such as data migration (if needed) and training.
Helping your compliance team understand how you are using Snowflake is critical. How are you going to ingest your data? Will you be migrating data from existing systems/platforms/applications? Is your data in spreadsheets? Is your system powered by Snowflake? Or is this a new implementation? Each of these scenarios has very specific requirements in the validation effort, and can also have very different regulatory requirements.
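To make one of those scenarios concrete, here is a minimal sketch of migrating legacy spreadsheet exports (CSV files) into Snowflake. All object names (csv_format, lims_stage, hplc_results) and the column layout are hypothetical placeholders; the statements themselves are standard Snowflake DDL and COPY syntax.

```sql
-- Hypothetical migration of spreadsheet exports into Snowflake.
-- All object names and columns are placeholders for illustration.
CREATE OR REPLACE FILE FORMAT csv_format
  TYPE = CSV
  SKIP_HEADER = 1
  FIELD_OPTIONALLY_ENCLOSED_BY = '"';

CREATE OR REPLACE STAGE lims_stage
  FILE_FORMAT = csv_format;

-- From SnowSQL, upload the exported file to the stage:
-- PUT file:///exports/hplc_results.csv @lims_stage;

CREATE OR REPLACE TABLE hplc_results (
  sample_id STRING,
  run_date  TIMESTAMP_NTZ,
  analyte   STRING,
  result    NUMBER(10,4)
);

COPY INTO hplc_results
  FROM @lims_stage/hplc_results.csv
  ON_ERROR = 'ABORT_STATEMENT';  -- fail fast so load errors surface during validation
```

A validated migration would also document record counts and reconciliation checks between the source files and the target table.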
21 CFR Part 11
Customers who use Snowflake to manage records in electronic form (e.g., those that are created, modified, maintained, archived, retrieved, or transmitted), under any records requirements set forth in agency regulations, are required to comply with applicable requirements within 21 CFR Part 11 Electronic Records; Electronic Signatures. Customers must determine if their intended use of Snowflake falls within this scope.
Snowflake has multiple functions that assist customers in demonstrating their compliance with 21 CFR Part 11 requirements, specifically around controls for closed systems and access control, audit logging, and records retention. Snowflake is designed as a closed system, where access to a customer’s data is solely governed and managed by the customer. Snowflake provides industry-leading features such as data encryption, MFA, key pair authentication and rotation, SSO, role-based access control, and more to ensure the highest levels of security for your account and users, as well as all the data you store in Snowflake.
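As an illustration of the role-based access control mentioned above, here is a minimal sketch; the role, database, schema, and user names are hypothetical, and a real deployment would layer on MFA or SSO.

```sql
-- Hypothetical role-based access control setup; all names are placeholders.
CREATE ROLE IF NOT EXISTS qa_reviewer;

-- Grant read-only access to the schema holding regulated records.
GRANT USAGE  ON DATABASE gxp_records            TO ROLE qa_reviewer;
GRANT USAGE  ON SCHEMA   gxp_records.batch_data TO ROLE qa_reviewer;
GRANT SELECT ON ALL TABLES IN SCHEMA gxp_records.batch_data TO ROLE qa_reviewer;

-- Each person gets a unique account; the role carries the privileges.
CREATE USER IF NOT EXISTS jdoe
  PASSWORD = 'replace-me'
  MUST_CHANGE_PASSWORD = TRUE;
GRANT ROLE qa_reviewer TO USER jdoe;
```

Granting privileges to roles rather than users keeps access reviews and offboarding auditable, which is exactly the kind of objective evidence a Part 11 assessment looks for.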
For personnel who access the systems and infrastructure that support the Snowflake service, each user must have a unique username and password. For each user, the Snowflake service logs all activity from the time an account logs in until it logs out. This activity can be found in the Snowflake database in the ACCOUNT_USAGE and READER_ACCOUNT_USAGE schemas. These views enable querying object metadata, as well as historical usage data. Lists of the ACCOUNT_USAGE and READER_ACCOUNT_USAGE views are available in the Snowflake documentation.
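For example, recent login activity can be pulled from the documented LOGIN_HISTORY view in ACCOUNT_USAGE; the 30-day window here is an arbitrary choice for illustration.

```sql
-- Review login activity for the last 30 days from ACCOUNT_USAGE.
-- Requires privileges on the shared SNOWFLAKE database.
SELECT user_name,
       event_timestamp,
       client_ip,
       is_success
FROM   snowflake.account_usage.login_history
WHERE  event_timestamp >= DATEADD(day, -30, CURRENT_TIMESTAMP())
ORDER BY event_timestamp DESC;
```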
It is important to note that the data contained in these views is retained for one year, unless otherwise specified. To demonstrate compliance with audit trail requirements, customers may ingest (duplicate) system logs into their environment, thereby preserving the audit history. Customers have complete control over their data and who has access to it. Records retention policies should be identified by the customer based on their intended use and needs.
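One way to preserve that audit history beyond the one-year window is a scheduled task that appends new rows to a customer-owned archive table. This is a sketch under assumptions, not a prescription: the archive table name, warehouse, and CRON schedule are all placeholders.

```sql
-- Create an empty archive table with the same shape as the source view.
CREATE TABLE IF NOT EXISTS login_history_archive AS
  SELECT * FROM snowflake.account_usage.login_history WHERE FALSE;

-- Nightly task that appends rows newer than anything already archived.
CREATE OR REPLACE TASK archive_login_history
  WAREHOUSE = compliance_wh                -- placeholder warehouse
  SCHEDULE  = 'USING CRON 0 2 * * * UTC'   -- daily at 02:00 UTC
AS
  INSERT INTO login_history_archive
  SELECT lh.*
  FROM   snowflake.account_usage.login_history lh
  WHERE  lh.event_timestamp >
         (SELECT COALESCE(MAX(event_timestamp),
                          '1970-01-01'::TIMESTAMP_LTZ)
          FROM login_history_archive);

ALTER TASK archive_login_history RESUME;  -- tasks are created suspended
```

The archive table then falls under the customer's own retention policy, access controls, and backup strategy, which is the point of duplicating the logs in the first place.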
There are several components and considerations when determining regulatory compliance for computerized systems. Identifying your intended use is arguably one of the most important, at least in my humble opinion. That said, it is only in harmony with other components, such as planning, vendor qualification, validation testing, and a host of other activities, that an organization can truly deliver a complete compliance effort. And starting with Snowflake gives you the foundation to support your compliance journey.