Many months have passed since Devoxx 2018, but the themes discussed there still resonate strongly across the industry. Some of the conversations I had with developers visiting our stand stick in my mind even today, probably because I found one or two of them so surprising.
One of the hottest topics of debate among developers who attended the event was building cloud native applications and the pervasive nature of this approach across today's development ecosystem. However, it was also interesting to note that there were a significant number of people at the event from organisations who had adopted a different approach.
Many of the developers I spoke to had two main things in common: they came from organisations that owned huge volumes of data, and they needed to create a process involving both analytical and transactional workloads with that data, using a variety of different technologies.
These analytical workflows were varied in nature. Many were finding that being able to run SQL, for example, was not enough to meet the functional and non-functional requirements of their queries adequately.
Consequently, several of these developers were building their own data management platform, usually running in a private cloud and often on specialist hardware. What I found particularly interesting at the event was the number of different organisations feeling the need to do this, given the wide variety of well-known cloud native and on-premises platforms out there, in both the SQL and NoSQL spaces. The issue, clearly articulated by a number of them, was that they could currently find nothing on the market to suit their needs.
I find this remarkable because, while it may seem the most cost-effective option to begin with, it is likely to turn out a much less economical one in the long term. What these developers are building is very specific to a particular problem, and as far as addressing their immediate challenge goes, it is likely to be an initial success. However, there is a significant risk inherent in this approach.
Knowledge vacuum is expensive
All will be well if the developers building the solution remain with their organisation. However, if they decide to leave, and competition for developers is strong, their employer faces the prospect of a knowledge vacuum surrounding the platform, with the likely consequence of having to bring in expensive consultants to the rescue further down the line.
The other issue is a matter of functionality. Once the organisation wants to do something extra with the platform, it will need to set up a data pipeline and replicate the data in another data store, reconstructing it along the way. Before they know where they are, they have the same data replicated in four or five different structures. Suddenly, what started out as a cost-effective platform developed for a specific purpose has become both expensive and complex.
Interestingly, this was also one of the reasons several developers told me they are not going cloud native. This ramping up of cost and complexity is generally not easy to manage. Besides, the lower-cost storage options on AWS, or any of the other cloud platforms, are not rapid mechanisms: a read might take several seconds, and there is no performance guarantee. If a business is data-intensive, that inevitably raises the question of whether cloud native is the right route for it.
This is especially the case if an organisation has specific hosting requirements. For example, a healthcare company may need to be connected to N3/HSCN. If so, the costs rise significantly, as the provisioning requires specialised infrastructure.
Of course, there is a plus side here, especially if a developer makes use of all the available services offered by the cloud provider. This can significantly reduce the time to build and deploy a solution, and ensures it is highly scalable. However, it also ties the organisation to a specific cloud provider. If a solution is built as a native application, for example in AWS, it can be difficult to refactor for another cloud environment. Then, as transaction volumes increase, data volumes grow, and complexity intensifies, the costs can again rise quite dramatically.
Another aspect discussed at Devoxx was the topic of 'ACID compliance'. In the cloud world, because of the way the infrastructure works, providers cannot offer this as we used to know it in the relational world. Instead, the big providers have redefined the concept so they can comply. While in some cases this may not matter, it can bring a wide range of issues to the fore when building applications that depend critically on the immediate consistency and quality of data, such as real-time clinical applications.
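The consistency problem described above can be illustrated with a toy simulation. This is a minimal sketch, not any vendor's API: all names here are hypothetical, and the point is simply that under asynchronous replication a read can return stale data for a window after a successful write, which is exactly what a strictly ACID relational database would never allow.

```python
import time

class Replica:
    """A toy replica that makes writes visible only after a replication delay."""
    def __init__(self):
        self._data = {}      # key/value pairs visible to readers
        self._pending = []   # (visible_at, key, value) not yet applied

    def enqueue(self, key, value, delay):
        self._pending.append((time.monotonic() + delay, key, value))

    def read(self, key):
        # Apply any replication events whose delay has elapsed, then read.
        now = time.monotonic()
        still_pending = []
        for visible_at, k, v in self._pending:
            if visible_at <= now:
                self._data[k] = v
            else:
                still_pending.append((visible_at, k, v))
        self._pending = still_pending
        return self._data.get(key)

class EventuallyConsistentStore:
    """Primary accepts writes immediately; the replica lags by `lag` seconds."""
    def __init__(self, lag=0.05):
        self.lag = lag
        self.primary = {}
        self.replica = Replica()

    def write(self, key, value):
        self.primary[key] = value                      # visible on primary at once
        self.replica.enqueue(key, value, self.lag)     # visible on replica later

store = EventuallyConsistentStore(lag=0.05)
store.write("patient:42:status", "discharged")

stale = store.replica.read("patient:42:status")   # stale: replication lag not elapsed
time.sleep(0.1)                                   # wait past the replication lag
fresh = store.replica.read("patient:42:status")   # now reflects the write
```

For a real-time clinical application, the `stale` read above (which returns nothing, because the write has not yet reached the replica) is precisely the failure mode that makes a redefined, weaker notion of consistency unacceptable.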
Yet despite all these drawbacks, developers are still building cloud native applications because they cannot find what they want on the market. This bodes well for solutions such as InterSystems' IRIS Data Platform which, with its flexible architecture, can meet varied transactional and analytical workloads and interrogate data in a number of different modes, rather than only via SQL, object, or document-based access.
What could also make IRIS so valuable in these cases is its interoperability: in particular, its ability to integrate data and applications into seamless, real-time business processes. It can also cut through the complexity, collapsing the tech stack and the need to manage multiple open source components, all in a highly cost-effective manner.
It is not really surprising that many developers attending Devoxx are building their own cloud native applications; they are rather a self-selecting band, given the nature of the event. What is surprising is that so many, from such a technically inquisitive and inventive group, are implementing their own data platform, forced to do so by the shortcomings of cloud native architectures. These architectures scale well for relatively simple use cases that can easily be divided into small, functional units, but are less of a good fit for more technically challenging requirements. Taken together, this all means that the new market developments now addressing these shortcomings in different ways are certainly moving in the right direction.
Jon Payne, database engineer, InterSystems
Image Credit: Melpomene / Shutterstock