
Data engineering inefficiencies lead to missed opportunities for two-thirds of companies

(Image credit: Flatfile)

Being ‘data-driven’ is no longer just an aspiration for forward-thinking businesses; it has become a necessity. As quantities of data continue to stack up, finding ways to efficiently serve up information and actionable insights for streamlined decision-making has become a crucial part of the day-to-day operations for organizations.

Agile companies are modernizing their business intelligence to provide a holistic view of data across the organization, where it can be put to work improving operational efficiency and driving innovation. However, as organizations transform their digital capabilities and implement different tools at different moments to enable a business strategy powered by data, there is a danger that most are not getting maximum returns on their business intelligence investments.

New research shows that inefficiencies in data engineering processes leave businesses unable to leverage the ever-increasing amount of data they have on hand. A global survey carried out by Dimensional Research has uncovered the most common stumbling blocks data engineers face across the board, which are leading to key data opportunities being missed at the majority of companies.

Challenges of building and maintaining pipelines are costing time 

Responsible for transforming multiple data sets into a format that can be easily analyzed, data engineers today play an instrumental role in nearly every organization’s data strategy. Indeed, 79 percent of the companies surveyed by Dimensional Research indicated that they would be hiring more personnel this year to help unlock the full value from their data. But regardless of the number of engineers in a team, these data professionals are coming up against the same roadblocks time and time again.

For example, nearly all participants in the survey (98 percent) say they have difficulty building pipelines to access their data, with pipeline-related frustrations widely recognized as a drain on the productivity of data engineers. Indeed, around half of all companies questioned indicated they required more than one business week to build a new pipeline.

Organizations are trying everything they can to accelerate these processes. For example, 86 percent of data professionals currently use multiple solutions to help solve these issues, with some deploying as many as six different tools. Yet, rather than speeding things up, using such a wide array of tools can be counterproductive: by adding complexity, it can slow the process of building a pipeline still further.

Even once a pipeline is built, it requires careful management, demanding further time from already hard-pressed data engineers. Nearly all data professionals (98 percent) say they have wrestled with data pipeline breakages, with over half stating that breakages happen every month or more frequently. More than 50 percent of these breakages take at least one working day to fix. Not only does this add to engineers’ workloads, it disconnects the organization from its data, which can have far-reaching consequences across the entire business.

For a business to be truly data-driven, the data underpinning its decision-making must always be up to date. Data from even just a few months ago may no longer be relevant to strategic business insight – in fact, decisions based on old data may actively mislead company direction. With pipelines taking a week or more to build, and regularly breaking thereafter, delays spread throughout the business, while time-sensitive processes such as training AI models often end up using information that is out of date.

Time for better decision-making 

While 54 percent of data professionals believe that the top metric for measuring data engineering success is decision-making, nearly half of those surveyed expressed concern that valuable data sources were not always integrated into their pipelines. And when business-critical data is unavailable to leaders, decision-making is little more than an educated guessing game.

Moreover, 68 percent of data professionals indicated that more insights could be extracted from existing data, if only they had the time. Time spent building, repairing and maintaining pipelines leaves too little time for creating new business insights. Delayed or inaccurate decisions result in lost opportunities and can put organizations at a competitive disadvantage, as well as having a knock-on effect on customer satisfaction.

Combining effective leadership with a strong data strategy for business success 

Although data engineers undoubtedly deliver significant business value, employing more of them might not be the answer for organizations seeking to improve their overall data strategy.

For one, the research suggests that there is a lack of direction for data professionals. Organizations’ data strategies are often unclear, and involve too many stakeholders with ill-defined needs. With no clear brief on how the data will be used, engineers frequently resort to adopting numerous pipeline tools in an attempt to accelerate and optimize their efforts. However, despite looking to technology to enable business intelligence, the survey found that key features are regularly missing from the arsenal of the data engineer, with end-to-end data governance, version control, model dependency mapping and self-healing pipelines (idempotent data replication) often cited as lacking. With an indistinct idea of what exactly is needed from them, data engineers are resorting to tools that fall short of their requirements.
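The “self-healing” property mentioned above comes from idempotency: if re-running a sync produces the same end state as running it once, a failed pipeline can simply be retried without creating duplicate rows. A minimal sketch in Python (the table and field names are hypothetical, and a real warehouse would use a keyed merge/upsert rather than an in-memory dict):

```python
# Minimal sketch of idempotent replication: applying the same batch of
# source records twice leaves the destination in the same state, so a
# broken sync can safely be retried ("self-healing"). Table and field
# names here are illustrative only.

def upsert_batch(destination: dict, records: list[dict], key: str = "id") -> dict:
    """Merge records into destination, keyed by a primary key.

    Re-applying the same batch is a no-op, which is what makes the
    operation safe to retry after a pipeline breakage.
    """
    for record in records:
        destination[record[key]] = record  # insert or overwrite by key
    return destination

# A batch of source rows, e.g. from an extract step.
batch = [
    {"id": 1, "name": "alice"},
    {"id": 2, "name": "bob"},
]

table: dict = {}
upsert_batch(table, batch)  # first sync
upsert_batch(table, batch)  # retried sync after a simulated failure
# The retry changes nothing: the table still holds exactly two rows.
```

By contrast, a pipeline that appends rows blindly would double its data on every retry, which is why append-only loaders need fragile bookkeeping to recover from partial failures.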

Implementing self-service technology to unburden data engineers 

Organizations hoping to make the most of the insights delivered by their data need to consider combining a strong data strategy with tools better suited to their purposes. By marrying a clear organizational data vision with tools that can create a reliable flow of up-to-the-minute data and fully manage data pipelines and SQL transformations, it becomes possible to free up data engineers’ time. In doing so, businesses allow these data professionals to focus on higher value tasks, enabling them to deliver up-to-date information to key senior executives and across the organization.

Automation of the data integration process provides the right foundation to any strong data strategy. Once unburdened by the limitations of brittle and hard-to-manage data pipelines, data engineers can be the driving force behind truly data-driven, revenue-impacting decisions.

Alex James, Senior Director EMEA Customer Success, Fivetran

Alex James is Senior Director EMEA Customer Success at data integration provider Fivetran.