
Mission critical gets complicated: Managing enterprise apps in a multi-cloud & IoT environment

(Image credit: TZIDO SUN / Shutterstock)

Enterprise applications (EAs) are the heart of corporate IT. If one of them fails, connected solutions are affected and, in worst-case scenarios, entire systems fail, causing major financial and reputational damage.

Yet businesses are increasingly struggling to manage mission-critical apps. This is primarily down to their relocation to the cloud, multi-cloud interactions, and the progression of IoT and edge computing. With 70 per cent of enterprise apps expected to be developed natively for the cloud by this time next year, the stakes couldn’t be higher.

Moving to the cloud

Typically, companies have started by putting their backups in the cloud, gradually reducing the size of the on-premises environment. A popular model is the hybrid multi-cloud: a mixture of different public cloud services, private cloud and local systems. This enables companies to select the best option for each application.

But when enterprise applications are distributed across multiple systems in a hybrid landscape, it becomes difficult to connect them and to ensure secure, fast data transport between them. What companies need is a data fabric strategy.

A data fabric strategy unifies data management and ensures seamless data transfer between on-premises systems and private or public cloud services. Even in a heterogeneous IT landscape, data is always available where it is needed, at the required speed – regardless of where it comes from. This also enables companies to move workloads easily from one public cloud provider to another – for example, to take advantage of a more attractive offer.
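
To make the idea concrete, here is a minimal sketch of a data fabric as a single management layer over interchangeable storage backends. All class and method names are hypothetical illustrations, not a specific product API:

```python
# Minimal sketch of the data fabric idea: one management layer over
# interchangeable storage backends. All names here are hypothetical.
from abc import ABC, abstractmethod


class StorageBackend(ABC):
    """Anywhere a dataset can live: on-premises, private or public cloud."""

    @abstractmethod
    def read(self, key: str) -> bytes: ...

    @abstractmethod
    def write(self, key: str, data: bytes) -> None: ...


class InMemoryBackend(StorageBackend):
    """Stand-in for a real on-premises array or cloud object store."""

    def __init__(self) -> None:
        self._store: dict[str, bytes] = {}

    def read(self, key: str) -> bytes:
        return self._store[key]

    def write(self, key: str, data: bytes) -> None:
        self._store[key] = data


class DataFabric:
    """Applications talk to the fabric; the fabric decides where data lives."""

    def __init__(self, backends: dict[str, StorageBackend]) -> None:
        self.backends = backends
        self.placement: dict[str, str] = {}  # dataset key -> backend name

    def write(self, key: str, data: bytes, location: str) -> None:
        self.backends[location].write(key, data)
        self.placement[key] = location

    def read(self, key: str) -> bytes:
        return self.backends[self.placement[key]].read(key)

    def migrate(self, key: str, target: str) -> None:
        # Move data to another provider, e.g. for a more attractive offer;
        # the application keeps using the same read()/write() calls.
        self.write(key, self.read(key), target)


fabric = DataFabric({"on-prem": InMemoryBackend(), "cloud-a": InMemoryBackend()})
fabric.write("orders.db", b"...", location="on-prem")
fabric.migrate("orders.db", target="cloud-a")  # application code unchanged
```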

Enter IoT…

Collecting, merging and evaluating data is important for business success. On this basis, companies can optimise processes and develop new business models. 5G will provide the much-needed boost to valuable and widespread IoT adoption, enabling latency-critical applications that were previously impossible to implement. Analysts see edge and cloud not as competing but as complementary concepts. As such, truly effective deployment of IoT demands linking it up with enterprise applications rather than operating it in silos.

IoT sensors and endpoints will generate huge amounts of data in a very short time, and dealing with this flood is difficult. With local storage systems struggling to cope, cloud storage will play an important role in IoT’s success. At the same time, the data is created at many widely distributed locations, from which it must be transported safely and quickly.

What’s more, given the sheer volume of data, it would not be practical to transfer and store everything on a central platform: this would exceed available bandwidth and drive up costs. Latency-critical applications also require real-time analysis, so data must be filtered and analysed directly at the point of origin. This is why edge computing will become increasingly important in the coming years.
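
As a rough sketch of this edge-side filtering, the following assumes a temperature threshold and a placeholder upload function; only alerts and a compact summary ever leave the device:

```python
# Minimal sketch of edge-side filtering: raw sensor readings are reduced
# locally, and only significant events plus a periodic summary are sent
# upstream. The threshold and send target are illustrative assumptions.
import statistics

ALERT_THRESHOLD = 90.0  # e.g. temperature in Celsius; assumed value


def send_to_cloud(payload: dict) -> None:
    """Placeholder for a real upload (MQTT, HTTPS, ...)."""
    print("uploading:", payload)


def process_batch(readings: list[float]) -> None:
    # Real-time check: forward latency-critical events immediately.
    for value in readings:
        if value > ALERT_THRESHOLD:
            send_to_cloud({"type": "alert", "value": value})

    # Everything else is summarised locally, saving bandwidth and storage.
    send_to_cloud({
        "type": "summary",
        "count": len(readings),
        "mean": statistics.mean(readings),
        "max": max(readings),
    })


process_batch([71.2, 69.8, 93.5, 70.1])  # one alert and one summary leave the edge
```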

In order to ensure smooth interaction between IoT and enterprise applications, organisations require integrated data management that takes into account all levels in the complex, distributed IT environment – from the sensor through the intermediate layer of edge computing to cloud storage and the application that uses the data. Systems must be able to securely deliver the right data to the right destination at the right speed.

NetApp differentiates between five phases, each of which involves different tasks: Collect, Transport, Store, Analyse and Archive. These five phases show how complex IoT projects are from an IT perspective alone; for companies to implement IoT successfully, the various building blocks need to be interlinked. This is where the Data Fabric comes in, keeping all of these stages under control.
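
A simple way to picture the five phases is as a pipeline in which each stage hands its output to the next. In a real deployment, each placeholder function below would map to different infrastructure (sensors, edge gateways, cloud storage and archives):

```python
# Sketch of the five IoT data phases as a simple pipeline. The stage
# bodies are placeholders; only the structure is the point here.
def collect() -> list[float]:
    return [20.5, 21.0, 95.2]          # readings arriving from sensors

def transport(readings: list[float]) -> list[float]:
    return readings                     # secure, fast transfer edge -> cloud

def store(readings: list[float]) -> list[float]:
    print("stored", len(readings), "readings")
    return readings                     # cloud or on-premises storage

def analyse(readings: list[float]) -> dict:
    return {"max": max(readings), "alerts": [r for r in readings if r > 90]}

def archive(result: dict) -> None:
    print("archived:", result)          # long-term, low-cost retention

archive(analyse(store(transport(collect()))))
```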

Backup and recovery

Reliability for IoT and enterprise applications must be guaranteed in the cloud just as it is in the company’s own data centre. EAs in particular have strong high-availability demands, with organisations often requiring Service Level Agreements (SLAs) that guarantee 99.9999 per cent reliability. To meet these requirements, data management must also meet high standards.
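
To put that figure in context, standard availability arithmetic translates an SLA percentage into an annual downtime budget; six nines leave barely half a minute per year:

```python
# Downtime budget implied by an availability SLA: six nines allow
# roughly 31.5 seconds of unplanned downtime per year.
SECONDS_PER_YEAR = 365 * 24 * 60 * 60

for availability in (0.999, 0.9999, 0.999999):
    downtime = SECONDS_PER_YEAR * (1 - availability)
    print(f"{availability:.4%} -> {downtime:,.1f} s/year")
# 99.9000% -> 31,536.0 s/year  (~8.8 hours)
# 99.9900% -> 3,153.6 s/year   (~53 minutes)
# 99.9999% -> 31.5 s/year
```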

This can be difficult to implement in large database environments such as SAP HANA. To guarantee high performance for backup and restore in the cloud, one approach is to use a cloud storage solution delivered as a managed public cloud service.

As an enterprise, you receive powerful NAS file-service environments whose capacity and performance can be adapted to actual demand at any time through appropriate SLAs. The service provider takes care of configuration, operation and compliance with the SLAs. Another way to reduce downtime is automatic storage failover, which ensures that critical applications remain available.
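
As an illustration of the failover idea (the classes here are stand-ins, not a real storage API), writes are redirected to a synchronous replica the moment the primary stops responding:

```python
# Minimal sketch of automatic storage failover: writes go to a primary
# volume, and a failure redirects them to its replica. Names are
# illustrative, not a real storage API.
class Volume:
    def __init__(self, name: str, healthy: bool = True):
        self.name = name
        self.healthy = healthy

    def write(self, data: bytes) -> None:
        if not self.healthy:
            raise IOError(f"{self.name} unavailable")
        print(f"{self.name}: wrote {len(data)} bytes")


class FailoverPair:
    def __init__(self, primary: Volume, secondary: Volume):
        self.primary, self.secondary = primary, secondary

    def write(self, data: bytes) -> None:
        try:
            self.primary.write(data)
        except IOError:
            # Automatic failover: the replica takes over transparently,
            # so the application keeps running.
            self.primary, self.secondary = self.secondary, self.primary
            self.primary.write(data)


pair = FailoverPair(Volume("site-a"), Volume("site-b"))
pair.primary.healthy = False      # simulate a failure at site A
pair.write(b"critical record")    # served by site B, no application change
```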

But backup is only the first step in protecting data.

Disaster recovery in distributed environments also poses a challenge. If the entire data centre has problems, companies must be able to restore their backups. It's not just a question of not losing any data: the data must also fit together during recovery, otherwise the processes in the enterprise applications no longer function. This means that data distributed across different systems must be backed up consistently at the same moment – across data centre and cloud boundaries. That is highly complex.

One possible solution is to employ Consistency Groups. All systems that work together are combined in such a group and backed up under a common backup plan. The backup is performed automatically and synchronously. If a system fails, the same data state can be re-imported everywhere at the same time, so that consistency is guaranteed.
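
The following sketch shows the principle with a hypothetical snapshot API: every member of the group is quiesced first, then snapshotted under one label, so all systems can later be restored to the same point in time:

```python
# Sketch of a consistency group: all cooperating systems are snapshotted
# together under one label so they can be restored to the same point in
# time. The System class and its methods are hypothetical stand-ins.
import time


class System:
    def __init__(self, name: str):
        self.name = name
        self.snapshots: dict[str, float] = {}

    def quiesce(self) -> None:
        print(f"{self.name}: I/O paused")

    def snapshot(self, label: str) -> None:
        self.snapshots[label] = time.time()

    def resume(self) -> None:
        print(f"{self.name}: I/O resumed")

    def restore(self, label: str) -> None:
        print(f"{self.name}: restored to {label}")


class ConsistencyGroup:
    def __init__(self, members: list[System]):
        self.members = members

    def backup(self, label: str) -> None:
        # Pause all members first so no writes land between snapshots;
        # only then snapshot and resume. The snapshots are mutually
        # consistent because no member changed while any was captured.
        for m in self.members:
            m.quiesce()
        for m in self.members:
            m.snapshot(label)
        for m in self.members:
            m.resume()

    def restore(self, label: str) -> None:
        for m in self.members:
            m.restore(label)


group = ConsistencyGroup([System("erp-db"), System("app-server"), System("queue")])
group.backup("nightly")
group.restore("nightly")  # the same data state everywhere
```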

Conclusion

Enterprise applications will continue to play a central role in corporate IT – whether on-premises or in the cloud. However, due to the growing volume of data generated by new technologies, the share of cloud deployments will continue to rise. It remains uncertain whether companies will also be able to place their core SAP systems in the cloud, so a hybrid multi-cloud environment is set to prevail in the coming years. The major challenge is ensuring the required performance, security and high availability even in such complex, distributed IT landscapes with huge amounts of data. To achieve this, companies need powerful data management that works consistently across data centre and cloud boundaries.

Andreas Limpak, Director Solutions Engineering, NetApp UK&I