
Stream processing and AI adoption in the enterprise: What are the major bottlenecks?

(Image credit: Geralt / Pixabay)

There has been much discussion about how artificial intelligence is adopted in the enterprise, especially as it relates to stream processing and the data streams produced by systems, devices and real-time applications. As enterprise adoption of artificial intelligence reaches a tipping point, it is worth exploring how the two technologies are connected and how they share similar adoption challenges.

Artificial intelligence and real-time data have formed a truly intertwined relationship; both are on the same path to bring major technological advances to the enterprise. AI relies on the availability of data to feed models with information, ideally in real time or with low latency. Real-time data platforms provide the fabric behind the scenes that delivers data to machine learning models. They also forward insight to users in real time so that they can generate value from the data. Analysts and industry influencers argue that every modern enterprise is essentially in the data business and will need AI to digest big data in order to make sense of it.

There are distinct connections between stream processing and artificial intelligence that make their marriage a unique opportunity for companies dealing with massive volumes of real-time events. Both are distributed and organised in logical units, both support incremental updates and iterative tasks, and, last but not least, both are asynchronous by nature.
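To make the notion of incremental updates concrete, here is a minimal, framework-free Python sketch (the class, values and field names are hypothetical, not taken from any particular engine): a running mean updated one event at a time, the same shape used by both streaming aggregations and online model training.

```python
class RunningMean:
    """Maintains a mean incrementally, one event at a time."""

    def __init__(self):
        self.count = 0
        self.mean = 0.0

    def update(self, value):
        # Incremental update: no need to re-scan past events,
        # which is what lets both stream processors and online
        # learners work on unbounded inputs.
        self.count += 1
        self.mean += (value - self.mean) / self.count
        return self.mean


rm = RunningMean()
for value in [2.0, 4.0, 6.0]:
    rm.update(value)
# rm.mean is now 4.0, computed without ever holding all events in memory
```

The same update rule applied to model parameters, rather than a simple aggregate, is the basis of iterative, incrementally trained machine learning.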

A recent O’Reilly Media study found that most organisations plan to increase IT budgets spent on Artificial Intelligence projects in 2019. More precisely, the study found that more than 60 per cent of organisations plan to spend at least 5 per cent of their IT budget over the next 12 months on AI, while 19 per cent of organisations plan to spend a significant portion (at least 20 per cent) of their IT budget on Artificial Intelligence.

The same survey predicts a growing gap between the leaders in this space and the ones falling behind in terms of Artificial Intelligence adoption. This is primarily due to the bottlenecks that some organisations face, especially with respect to the following areas:

  • Lack of data
  • Lack of skilled people
  • Company culture
  • Difficulties identifying relevant use cases

Interestingly, stream processing and artificial intelligence share not only the characteristics that make them so powerful in combination, but also the same challenges and bottlenecks in their adoption. So how can enterprises eliminate these bottlenecks by making the right investments and taking the necessary actions in their data architecture?


The lack of data cited by survey respondents refers either to insufficient volumes of data to effectively train and maintain machine learning and AI models in the enterprise or, in the case of streaming data, to data quality issues. Adopting an event-driven data strategy built on stream processing with Apache Flink allows companies to make better sense of their data: they can transform and enrich data in real time, gathering valuable insight from events as they are generated by systems, connected devices or website and mobile application interactions, rather than storing everything in a data lake and trying to make sense of what happened retrospectively. Stream processing with Apache Flink enables companies to react to information in real time, making AI models and applications iterative and responsive to change as it happens.
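As a simplified illustration of the event-driven pattern described above (this is not Flink's actual API; the device lookup table and field names are made up for the example), each event is enriched with reference data the moment it arrives, instead of first landing in a data lake for retrospective analysis:

```python
# Hypothetical reference data used to enrich raw events as they stream in.
REFERENCE = {"dev-1": "thermostat", "dev-2": "doorbell"}


def enrich(events):
    """Join each raw event with reference data as it arrives.

    A generator stands in for an unbounded stream: downstream consumers
    (alerts, dashboards, model features) see each enriched event
    immediately, without waiting for a batch job over stored data.
    """
    for event in events:
        yield {
            **event,
            "device_type": REFERENCE.get(event["device_id"], "unknown"),
        }


stream = [
    {"device_id": "dev-1", "reading": 21.5},
    {"device_id": "dev-2", "reading": 1.0},
]
enriched = list(enrich(stream))
# Each event now carries its device_type the moment it is processed.
```

In a real deployment this per-event enrichment would run inside a stream processor such as Flink, with the reference data held in managed state rather than a module-level dictionary.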

The skills gap seems to be a common denominator for the adoption of both AI and stream processing in the enterprise. The same O’Reilly study finds that the shortage of machine learning experts, data scientists, and data and infrastructure engineers is one of the most cited blockers of AI adoption. To alleviate this, organisations need to transform their teams at both a technical and a cultural level. Adopting AI or stream processing requires investing in people’s skills and training, and embracing a ‘sharing’ culture across the organisation, in which data silos are minimised and data is accessible to multiple teams working on different projects and applications, significantly reducing time-to-market and improving the odds of successful deployment.

The remaining two obstacles to the adoption of artificial intelligence in the enterprise relate to company culture and the lack of identified use cases. As with any new technology, data and analytics leaders need to embrace an open and inclusive culture in their teams, challenge the status quo and find ways of leveraging their teams’ potential to achieve outcomes faster and more cost-effectively. Stream processing and AI can become a catalyst for organisational change in IT and data departments: one that moves away from data silos and hierarchical structures and brings together the data, operations and product teams to significantly reduce the time needed to build real-time applications, AI and machine learning models, or deep learning algorithms.

As artificial intelligence adoption grows, identifying relevant use cases for the technology will become easier. Companies are reportedly using AI to power customer service operations as well as applications in finance, accounting, marketing, advertising, and R&D. Data and analytics leaders should focus on finding talent that can identify use cases for both real-time data and AI and drive those projects forward, enabling the business to keep up with, and even stay ahead of, the competition.

Seth Wiesman, Senior Solutions Architect, Ververica

Seth Wiesman is a Senior Solutions Architect at Ververica, where he helps clients maximise the benefits of real-time data processing for their business. He supports customers in application design, system integration and performance tuning. Prior to joining Ververica, he was a data engineer on the reporting team at MediaMath. He holds a Master’s in Computer Science from the University of Missouri.