Additionally, many contract research organisations (CROs) now work with their customers as trusted advisors, shifting the relationship from a pure service model to a fully fledged partnership that delivers additional value.
R&D collaboration with CROs and external partners can speed up the innovation process and bring life-changing drugs to market sooner. However, collaborating remotely naturally increases the risks associated with data management and data integrity. Organisations need to be responsible for any data associated with them – whether it’s produced internally or externally – and make sure that it is compliant with industry regulations.
The challenges of collaborating with data
When an organisation collaborates with a third party or an external provider, what it is really paying for is data. But collaboration projects – with a number of different and disparate parties all needing to coordinate activities, transfer data and complete reports – can cause all kinds of logistical problems. And this presents a range of challenges relating to data integrity.
The shift away from in-house operations has brought a fresh set of obstacles for drug discovery companies. One of the most prominent hurdles is the consolidation of data from multiple organisations – and organisations are increasingly placed in the difficult position of gatekeeping data packets while finding a way to maintain both security and data integrity.
Technology is a particularly important consideration for collaborative work. If you are working with an external team, for example, it’s unlikely your IT teams will allow any third parties access to your network easily – so legacy systems, originally brought in to save time and speed up processes internally, could actually be doing the opposite.
Quality assurance is naturally a major concern. Any B2B collaboration environment must make document lifecycle management effortless and as transparent as possible. Good governance of the content, version and authorship throughout a project is essential – particularly for any document used to communicate primary data and results, such as final study reports that drive key decisions. When this data is spread across multiple systems and separate organisations (often sitting in an individual’s inbox), the ability to trace the source information is dramatically compromised, and it is difficult even to identify if and when critical information is missing.
Shared infrastructure systems, such as electronic lab notebooks (ELNs), lab information management systems (LIMS) and scientific data management systems (SDMS), are often the backbone of data capture, validation and retention in R&D organisations. Where they are absent, someone within the outsourcing organisation may need to process all the files and documents and distribute them to each relevant corporate data repository.
It’s a time-consuming process and, more concerningly, one that introduces opportunities for data loss and corruption. When paying external partners for data, organisations need to put measures in place to ensure that data integrity is as high as it can possibly be – something far harder to guarantee when each party is using a different system.
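One basic, widely used safeguard against corruption in transit is to verify a cryptographic checksum on every file exchanged between parties. The sketch below is a minimal illustration of the idea, not any specific vendor's implementation; the function names are hypothetical:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_transfer(path: Path, expected_digest: str) -> bool:
    """Return True if the received file matches the sender's published digest."""
    return sha256_of(path) == expected_digest
```

In practice the sending organisation would publish a manifest of digests alongside the data files, and the receiver would verify each file before loading anything into a corporate repository.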
Outsourcing support for complex data types
Long-term partnerships often require the harmonising of business processes, standard operating procedures (SOPs) and document templates for material preparation, method validation protocols, data analysis methodologies and data reporting. Such documents are also typically version controlled, approved for release and need to be frequently revised and referenced. A collaboration environment should make content governance between businesses simple and transparent.
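The version-controlled, approved-for-release lifecycle described above can be made concrete with a small sketch. This is an illustrative model only, with hypothetical class and field names, assuming a simple draft → approved → superseded status flow:

```python
from dataclasses import dataclass, field

@dataclass
class DocumentVersion:
    number: int
    author: str
    status: str = "draft"  # draft -> approved -> superseded

@dataclass
class ControlledDocument:
    """Minimal sketch of a version-controlled SOP or template."""
    title: str
    versions: list = field(default_factory=list)

    def revise(self, author: str) -> DocumentVersion:
        # Each revision creates a new draft; collaborators keep
        # referencing the latest approved version until release.
        version = DocumentVersion(len(self.versions) + 1, author)
        self.versions.append(version)
        return version

    def approve(self, number: int) -> None:
        # Releasing a new version supersedes the previous approval.
        for version in self.versions:
            if version.status == "approved":
                version.status = "superseded"
        self.versions[number - 1].status = "approved"

    def current(self):
        approved = [v for v in self.versions if v.status == "approved"]
        return approved[-1] if approved else None
```

The point of such a model is that both organisations always agree on which version of an SOP or template is in force, and superseded versions remain in the history for traceability.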
For scientific R&D and drug discovery, basic document exchange is only the most rudimentary level of collaboration: scientific processes typically involve the manufacture and testing of samples, which may contain material formulations and chemical and biological entities.
To complicate matters, this type of production and testing has specialist requirements that generic business collaboration platforms do not cater for. These materials have properties and proprietary data formats that require specialised tools to edit and visualise their sophisticated contextual data, provide registration services and enable searching capabilities.
Larger outsourced projects may have hundreds, or even thousands, of these objects to track and manage.
The complexity doesn’t stop with the testable objects. Scientific testing also involves sophisticated methodologies and the use of complex instrumentation. How a test is run and how an instrument is configured will have a profound effect on the testing outcomes and interpretation of data. Because of this, the ability to share scientific protocols, equipment configuration parameters and complete datasets is essential.
Taking advantage of new technologies
Despite all of the challenges discussed above, many organisations are still using legacy, on-premise software – or, worse still, email – for their collaborative projects, meaning they are unable to take advantage of technology enhancements that would support faster, leaner, more successful collaboration with data integrity at the core.
This creates a burden at both ends when collaborating with external organisations. To reduce this burden and eliminate data transcription errors, virtualised teams need to work together in a shared environment with ELN- and/or LIMS-like capabilities that can deal with both documents and data, so they can review and approve methodology, test parameters and interim data in real time.
R&D orientated collaboration tools
It might seem that there’s a balance to be struck between security, efficiency and data quality, but that should never have to be the case. With modern scientific collaboration software, organisations can provide partners with a neutral environment for collaboration – restricting the access levels available to third parties where required, so that private corporate content remains inaccessible and secure.
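Restricting third-party access levels usually comes down to some form of role-based permissioning. The sketch below is a deliberately simplified illustration, with hypothetical role and permission names, of how a shared workspace might keep private corporate content out of reach of external partners:

```python
# Hypothetical roles and permissions for a shared collaboration workspace.
PERMISSIONS = {
    "internal_scientist": {"read_private", "read_shared", "write_shared"},
    "external_partner":   {"read_shared", "write_shared"},
    "external_viewer":    {"read_shared"},
}

def can(role: str, action: str) -> bool:
    """Return True if the given role is allowed to perform the action."""
    return action in PERMISSIONS.get(role, set())
```

Under this scheme an external partner can contribute to shared project content but never see internal-only material, while unknown roles are denied everything by default.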
Modern spreadsheet technology can allow collaborators to enter their data directly into the system, eliminating the need for disparate files and formats to be sent via email, or ‘dropped’ into a generic company portal. With the right system, organisations are also able to create and share templates directly with their external collaboration partners and, when a study is completed, transfer data securely back to an organisation’s database with a click of a button.
Collaboration systems should also include built-in mechanisms for document and data review, social commenting and task alerts, meaning organisations can always be confident they have the full picture, even when their collaboration partners are on the other side of the world.
By implementing a modern data management system designed for scientists, R&D organisations can monitor processes remotely, while still benefiting from the flexibility, scalability and security needed to get collaboration projects moving. Establishing and implementing rules and overseeing processes doesn’t only ensure improved efficiency and compliance – it can protect a business’s data integrity and overall reputation, with minimal IT support requirements. In an ideal world, all studies are successful. But, in the real world, some studies need to be stopped due to unexpected results. Having near real-time insight into study progress can bring important decisions forward, reducing wasted time, effort and cost for all the parties involved.
In a time of increased outsourcing, indispensable specialisation, and diversified industries working together for a common purpose, an R&D orientated electronic solution for collaboration is more important than ever. In fact, when ensuring data integrity during an outsourced project, it’s fast becoming essential.
Graham Sanger, Director of Client Engagement (EMEA), IDBS