Cloud computing is no longer just an IT buzzword for research organisations in pharma/biotech, consumer packaged goods, energy/process/utilities, industrial equipment and other science- and process-based industries. In today’s increasingly collaborative project landscape, creating and managing a complex, high-performance computing infrastructure to help scientists handle, analyse and share information presents significant scientific, business and IT challenges. This is especially true as organisations strive simultaneously to lower costs, reduce risk, increase agility and improve collaborative innovation.
From the scientist’s point of view, it can be difficult to collaborate effectively with partners down the hall, let alone in different time zones, cultures and languages. Data exchange with partners is often manual and therefore error-prone. Because many collaborative projects involve different types of data representations, scientists can spend up to 50 per cent of their time manually processing and checking collaborator data prior to analysis. Access to collaboration data is often restricted to a few “gatekeepers,” which can further reduce operational efficiency and extend project timelines. On the business side, managing multiple partners, establishing key performance indicators, assessing results and protecting intellectual property (IP) across complex partnering networks can be costly, time-consuming and labour-intensive.
The challenges for IT departments tasked with building and managing partner workspaces are especially daunting. Existing in-house data management systems are typically not designed to support externalised research. Instead, the systems are built on the assumption that in-house scientists can, for the most part, freely access the data required to make informed decisions—and that the company owns all of its IP. Externalised collaboration networks can be much more complex. Many combine numerous partners with diverse objectives involving single or multiple research projects. Dealing with this complexity can sometimes tie up more than 50 per cent of a commissioning organisation’s IT budget. Common internet-based collaboration solutions such as email, SharePoint, VPN, Citrix, spreadsheets and other data exchange mechanisms are potentially insecure. They often involve incompatible data formats and can require organisations to prepare and curate files manually.
Other IT challenges include: (1) ensuring that external partners see only the data that applies to them, and (2) managing the ownership of collaborative IP, which sometimes remains with the commissioning company and sometimes needs to be distributed among several partners in accordance with contractual obligations. Moreover, collaboration networks are not static but evolve over time, with different partners involved at various stages. They often need to be spun up and down quickly, with partner data securely partitioned. Additional IT challenges include syncing incoming partner data with in-house legacy and on-premises systems—and, most importantly, making sure that data is secure.
Robust collaboration workspace
Faced with these data management/analytics, partnering and IT challenges, many research organisations are turning to the cloud as a scalable, secure, state-of-the-art informatics environment with zero local footprint that can speed analytic workflows, drive collaborative innovation and relieve administrative burdens. The cloud lets organisations set up a robust collaboration workspace quickly and easily with minimal IT support and no hardware to maintain. The system is available anywhere, anytime—and organisations only pay for what they use. Granular permissions built into the cloud platform should make it possible to limit the data and tools available to different partners based on the collaboration use case.
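To illustrate the granular-permissions idea, here is a minimal sketch in Python. It is purely hypothetical—the class names, partner IDs and dataset names are invented for illustration and do not correspond to any particular cloud platform's API—but it shows the principle of limiting each partner to an explicitly granted set of datasets and tools.

```python
# Hypothetical sketch of granular, per-partner permissions: each partner in a
# collaboration sees only the datasets and tools explicitly granted to it.
from dataclasses import dataclass, field


@dataclass
class PartnerPolicy:
    partner: str
    datasets: set = field(default_factory=set)  # dataset IDs visible to this partner
    tools: set = field(default_factory=set)     # analysis tools this partner may run


class Workspace:
    """A collaboration workspace with default-deny, per-partner access control."""

    def __init__(self):
        self.policies = {}

    def grant(self, partner, datasets=(), tools=()):
        policy = self.policies.setdefault(partner, PartnerPolicy(partner))
        policy.datasets.update(datasets)
        policy.tools.update(tools)

    def can_read(self, partner, dataset):
        policy = self.policies.get(partner)
        return policy is not None and dataset in policy.datasets


ws = Workspace()
ws.grant("cro-alpha", datasets={"assay-results"}, tools={"curve-fitting"})
print(ws.can_read("cro-alpha", "assay-results"))    # True
print(ws.can_read("cro-alpha", "in-house-library"))  # False: never granted
```

The key design choice is default-deny: a partner with no policy, or no grant for a given dataset, sees nothing—mirroring the requirement that external partners see only the data that applies to them.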
For many organisations, cloud adoption will be a staged process in which scientists will leverage cloud collaboration along with existing server-based or on-premises systems. Any hosted collaboration system needs to support this hybrid-cloud environment in which data flows between on-premises and cloud applications. For example, a design team can start an experiment in an on-premises electronic laboratory notebook and then transfer the information to the cloud for execution at a contract research organisation. Similarly, molecules registered in a cloud database can be automatically synchronised into a company’s on-premises corporate registration system. This bi-directional integration can greatly facilitate an organisation’s staged movement from existing legacy IT infrastructure to an emerging cloud-hosted environment. For research-sponsoring IT organisations, there is a benefit to keeping data in the cloud, because there can be a clear delineation between the hosted collaboration system and in-house, server-based systems and data.
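One leg of the bi-directional integration described above—mirroring molecules registered in a cloud database into an on-premises registry—can be sketched as follows. This is an illustrative assumption, not any vendor's actual synchronisation API: record shapes and identifiers are invented, and the sync is made idempotent by keying on a registration ID so repeated runs do not duplicate records.

```python
# Hypothetical sketch of one leg of a hybrid-cloud sync: new cloud
# registrations are copied into an on-premises registry keyed by ID.
# Records already present on-premises are skipped, so the sync is
# idempotent and safe to re-run.
def sync_cloud_to_onprem(cloud_records, onprem_registry):
    """Return the IDs of records newly copied into the on-prem registry."""
    synced = []
    for record in cloud_records:
        if record["id"] not in onprem_registry:
            onprem_registry[record["id"]] = record
            synced.append(record["id"])
    return synced


cloud = [
    {"id": "MOL-001", "smiles": "CCO"},       # already registered on-prem
    {"id": "MOL-002", "smiles": "c1ccccc1"},  # new cloud registration
]
onprem = {"MOL-001": {"id": "MOL-001", "smiles": "CCO"}}
print(sync_cloud_to_onprem(cloud, onprem))  # ['MOL-002']
```

A production system would add conflict resolution and change tracking in both directions, but the idempotent, ID-keyed transfer is the core of how a staged hybrid deployment keeps cloud and legacy systems consistent.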
A 2016 IDG survey of IT and business decision makers found that 68 per cent of respondents intend to investigate or deploy cloud analytics solutions over the coming year. In addition, 74 per cent said they expect to adopt a hybrid or cloud-only approach to analytics over the next three years. Respondents with currently deployed cloud analytics solutions cited the advantages of lower up-front costs (60 per cent) over on-premises solutions. They also cited greater agility and faster time to market (61 per cent), more rapid and cost-effective scaling for large data sets (60 per cent) and improved self-service capabilities for non-technical users (51 per cent). It is noteworthy that IT executives are assuming a leadership role in this emerging migration to the cloud, as confirmed by a 2016 ServiceNow survey of 1,850 executives and managers in which 52 per cent reported having “cloud-first” policies for new technology purchases, an adoption stance that will increase to 77 per cent within the next two years.
Passing through the cloud
Recent RightScale “State of the Cloud” reports have identified security as the number-one challenge of cloud adoption, although this concern dropped to second place in their 2016 report. In 2017, security concerns fell to 25 per cent compared with 29 per cent in 2016. This shift reflects the steady effort of cloud providers to adopt recognised security standards and build trust in their user communities. ISO 27001 is one of the most widely recognised and internationally accepted best-practice standards for cloud-based information security management. All cloud providers should hold ISO 27001 certification for the systems, technology, processes and data centres supporting their cloud environments.
Implementing a cloud-based research informatics system is a big step for an IT organisation. What are the best ways to measure the success of adopting cloud technology once it is implemented? Organisations can expect a number of benefits over time, including time and cost savings through automated data exchange and communication, reduced total cost of ownership (TCO) from infrastructure in the cloud, shorter project timelines and improved efficiency with cloud agility. Typical Key Performance Indicators (KPIs) measure (1) scientist productivity, (2) IT spending per user/application, (3) time spent implementing new software applications and (4) average time to start up, run and close down a project.
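Several of the KPIs listed above reduce to simple ratios over routinely collected project metrics. The sketch below is illustrative only—the field names and figures are invented, not drawn from any real deployment—but it shows how IT spend per user and average timelines might be rolled up.

```python
# Hypothetical KPI roll-up: field names and figures are illustrative only.
def kpis(metrics):
    """Compute a few of the KPIs named in the text from raw project metrics."""
    return {
        # (2) IT spending per user
        "it_spend_per_user": metrics["it_spend"] / metrics["users"],
        # (3) average time to implement a new software application
        "avg_deployment_days": sum(metrics["deployment_days"]) / len(metrics["deployment_days"]),
        # (4) average time to start up, run and close down a project
        "avg_project_days": sum(metrics["project_days"]) / len(metrics["project_days"]),
    }


report = kpis({
    "it_spend": 120_000,               # annual IT spend (illustrative)
    "users": 300,                      # active collaboration users
    "deployment_days": [14, 10, 21],   # days per application rollout
    "project_days": [90, 120, 75],     # days per collaboration project
})
print(report["it_spend_per_user"])   # 400.0
print(report["avg_deployment_days"])  # 15.0
```

Tracking the same ratios before and after a cloud migration gives a concrete baseline against which the claimed agility and cost benefits can be judged.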
By 2020, at least a third of all data will pass through the cloud. More than 43 per cent of organisations expect to deliver the majority of their IT capability through public cloud services by 2020, and they will access 78 per cent of IT resources through some form of cloud by 2018. There is no arguing with the numbers. Now is the time for science- and process-based research organisations to begin defining and implementing effective cloud strategies to improve agility, reduce costs and support end-to-end collaboration dynamics.
Frederic Bost, Senior Director, Cloud R&D, Dassault Systèmes BIOVIA
Image Credit: Rawpixel / Shutterstock