With more complex therapies come larger and more complex data sets. That complexity grows with advances in personalized medicine, which introduces new data points during apheresis (extracting and infusing a patient’s blood, cells, tissues, and/or regenerative medicinal compounds) and during cell and gene modification steps.

Although data integrity is critical to building supply chain trust and product quality, as well as to meeting FDA compliance obligations, many organizations continue to rely heavily on spreadsheets, manual data entry, hard-copy records, and email. This creates multiple opportunities for error and can result in FDA warning letters, fines, or recalls. And although data capture may begin early in biopharmaceutical R&D, a variety of disparate IT systems are often installed without regard for data consistency across process development and clinical and commercial manufacturing.

These challenges are compounded by the widespread reliance on external partners for significant process development and manufacturing operations.

To reduce the risks of delayed, incomplete, and inconsistent data, biopharmaceutical companies must establish a robust data management approach early in product development. Especially for startups that may not have a lot of IT experience or staff, this can be daunting.

The following items should be prioritized to better address and reduce enterprise risks around data integrity and reliability:

  • Creating a backbone for digital data throughout the product and process lifecycle and between internal and external teams, sites and partners
  • Careful review of quality and delivery agreements with CDMOs [contract development and manufacturing organizations] to provide data visibility, IP ownership, and process oversight

Create a single digital data backbone early

There are new business demands to process information faster. Building the digital data backbone early supports key downstream activities – late-stage process development, technology scale-up and transfer, and manufacturing, where quality assurance and compliance requirements apply.

New digital data systems preserve or establish the context and relative importance of data collected across the IT infrastructure. A cloud-based data backbone lets data be collected and organized on a central platform without losing that context. It can scale as the product and IT infrastructure evolve, and it remains relevant by integrating with systems such as LIMS [laboratory information management systems], historians, MES [manufacturing execution systems], and eBRs [electronic batch record software] to serve as the single verifiable source of truth for data critical to process control, monitoring, analysis, and reporting.
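As a rough illustration of the idea, the sketch below shows records from disparate systems (LIMS, MES, a historian) being normalized into one central, queryable store while keeping each record's source and context. The class and field names (`ProcessRecord`, `DataBackbone`, the `context` dictionary) are illustrative assumptions, not any particular vendor's schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any

@dataclass
class ProcessRecord:
    """One contextualized data point in the central backbone (illustrative schema)."""
    source_system: str          # e.g. "LIMS", "MES", "historian"
    batch_id: str
    parameter: str              # e.g. "bioreactor_pH", "titer"
    value: Any
    units: str
    recorded_at: datetime
    context: dict = field(default_factory=dict)  # site, process step, instrument, ...

class DataBackbone:
    """Minimal central store: records from any source keep their origin and context."""
    def __init__(self) -> None:
        self._records: list[ProcessRecord] = []

    def ingest(self, record: ProcessRecord) -> None:
        self._records.append(record)

    def by_batch(self, batch_id: str) -> list[ProcessRecord]:
        """One queryable view of a batch across all source systems."""
        return [r for r in self._records if r.batch_id == batch_id]

backbone = DataBackbone()
backbone.ingest(ProcessRecord("LIMS", "B-001", "titer", 2.1, "g/L",
                              datetime.now(timezone.utc),
                              context={"site": "CDMO-A", "step": "harvest"}))
backbone.ingest(ProcessRecord("MES", "B-001", "bioreactor_pH", 7.02, "pH",
                              datetime.now(timezone.utc),
                              context={"site": "CDMO-A", "step": "production"}))

print(len(backbone.by_batch("B-001")))  # 2
```

A real backbone would persist to a cloud data platform rather than an in-memory list, but the point is the same: every value arrives with its batch, source system, and process context attached, so nothing has to be reconstructed later from spreadsheets or email.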

With increasing demand for accelerated technology transfer, FDA filings, and commercialization, building the data backbone early yields significant time and cost benefits: fewer PPQ [process performance qualification] runs, right-first-time technology transfer, streamlined investigations, and earlier release of batches.

While a cloud-based data management solution is the first step, companies also need to be vigilant when partnering with manufacturers.

Build data visibility into quality and delivery agreements

Given the acceleration of new drug and therapy development, complex manufacturing requirements, and the associated capital investment, outsourcing is projected to continue growing for the foreseeable future.

Despite outsourcing manufacturing, the drug owner (sponsor) remains responsible for meeting FDA product quality standards, demonstrating control over the contract manufacturer and the drug manufacturing process, and establishing a complete process data set with a high degree of integrity and quality. The nearly universal reliance on contract manufacturers and the FDA’s focus on data integrity issues in drug manufacturing have generated unprecedented scrutiny of manufacturing operations by the FDA, strategic acquirers, and the SEC. As supply chains, process development, and manufacturing continue to grow in complexity, data management is an area that requires new approaches and innovations.

While data integrity challenges can lead to quality and performance issues, they can also create legal risks, such as loss of process intellectual property and an inability to demonstrate control over a CDMO, which can affect the enterprise value of the company.

While these challenges affect both large and small companies, data visibility is a key pain point for small biopharma companies: most are entirely reliant on CDMOs but often lack the expertise and/or negotiating power to hold their own against well-established CDMOs.

Despite FDA mandates to manage their CDMOs and manufacturing processes, drug owners struggle to meet these requirements because they are physically remote from the work and often lack IT systems designed to share data between the owner and contract partners. Failure to comply can result in FDA warning letters. In fact, approximately 50% of all FDA warning letters in 2019 were related to data integrity issues.
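One common building block behind data-integrity claims is a tamper-evident audit trail. The sketch below, a hash-chained log, is only an illustration of the principle (it is not how any specific FDA-regulated system works, and the `AuditTrail` class and record fields are invented for the example): each entry's hash incorporates the previous entry's hash, so silently editing an earlier record breaks verification of the whole chain.

```python
import hashlib
import json

def _entry_hash(entry: dict, prev_hash: str) -> str:
    """Hash an entry together with the previous entry's hash (chaining)."""
    payload = json.dumps(entry, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

class AuditTrail:
    """Append-only log where each record is chained to its predecessor."""
    GENESIS = "0" * 64

    def __init__(self) -> None:
        self._entries: list[tuple[dict, str]] = []  # (entry, stored hash)
        self._prev = self.GENESIS

    def append(self, entry: dict) -> None:
        h = _entry_hash(entry, self._prev)
        self._entries.append((entry, h))
        self._prev = h

    def verify(self) -> bool:
        """Recompute the chain; any edited entry invalidates everything after it."""
        prev = self.GENESIS
        for entry, stored in self._entries:
            if _entry_hash(entry, prev) != stored:
                return False
            prev = stored
        return True

trail = AuditTrail()
trail.append({"batch": "B-001", "parameter": "pH", "value": 7.02, "user": "analyst1"})
trail.append({"batch": "B-001", "parameter": "pH", "value": 7.05, "user": "analyst1"})
assert trail.verify()

# A silent edit to an earlier record breaks the chain:
trail._entries[0][0]["value"] = 6.5
assert not trail.verify()
```

Production systems layer signatures, timestamps, and access control on top of this kind of mechanism, but the core idea, that history is verifiable rather than merely recorded, is what regulators are probing when they cite data integrity.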

Delivery agreements should anticipate data needs and emphasize data visibility and ownership of critical information, including process control parameters.

Fortunately, a growing number of CDMOs are recognizing the compliance burden on their drug sponsors and realizing that the future of biopharma depends on collaboration and visibility into their manufacturing workflows. With state-of-the-art data management solutions and close collaboration with CDMOs, biopharmaceutical companies can become more confident in the quality of their products and better prepared to meet stringent compliance requirements.

Cloud-based data management solutions help industry address business and compliance challenges. These platforms should replace traditional data management methods and workflows for biopharma companies and CDMOs seeking competitive advantage.
