
Data Governance: The Unsung Hero of Network Operations

(Image credit: Flickr / janneke staaks)

Data and data systems are the lifeblood of modern business. Innovations like cloud computing, business intelligence, analytics, and automation hold the promise of operational efficiency and strategic insight, but without proper visibility and control over data, all that digitization can get in the way of productivity by creating a management nightmare. Time spent wrestling with difficult systems is time not spent extracting useful insights from the information they contain, and it can discourage innovation.

Not so long ago, even large organizations handled data governance as a mostly discrete function. Some data and business applications were available company-wide, but specific business functions, and the data associated with them, were largely isolated and inaccessible to teams that might benefit from the knowledge they contained. Such isolation, often referred to as data silos, rewards routine and stifles innovation, preventing an organization from improving its processes and efficiency.

To change that dynamic, it’s vital that the silos be torn down and the data be interconnected. But without robust tools that are easy to use, it can be difficult to know what’s going on with the data flowing into, out of, and within your enterprise. And without that level of control and visibility, you run the risk of failing to meet service level agreements (SLAs), losing track of vital files, and even falling afoul of the many security and privacy regulations that today dictate the care of sensitive personal information and high-value intellectual property.

Solutions designed for today’s demanding data management landscape operate as highly evolved gateways, bridging the gap between an enterprise’s internal constituencies and the many external entities that constitute a network of partners, contractors, and customers. As these interconnected relationships grow, so does the need for efficient, easy-to-use means of analyzing the underlying data flows, giving employees and managers the insights necessary to make good decisions and IT managers the visibility to govern their systems.

This need is a driving factor behind the development of management and analytics platforms that are tightly integrated with the systems that manage data flow in an enterprise. By tying into business and operational intelligence, IT managers gain near real-time assurance that data is moving in and out of the organization correctly and efficiently. That means alerting network administrators to conditions that could put security, compliance, or operations in jeopardy.

IT teams are being tasked with greater responsibility over mission-critical systems and business functions. That responsibility means that when the C-suite calls asking for an update on the status of an important transaction, there’s an expectation that the answer will be readily available. Under such pressure, it’s vital that IT has a timely, accurate view of what is happening within its data environment. More to the point, advance knowledge of a circumstance that could jeopardize the transaction means prescriptive action can be taken so the transaction is never at risk in the first place.

Governing the flow of data correctly requires that an IT team have several capabilities at its disposal:

  • Analytics to gather and evaluate an organization’s data and view trends over time for business and operational intelligence purposes to support good decision making. 
  • Consolidated data to give a detailed view of the information underlying any operational issues requiring investigation along with a rich set of filters and an intuitive, responsive interface. 
  • Server health monitoring to track the state of your environment through customizable diagrams of your managed file transfer system and related connections, including summary statistics on individual components. 
  • Service level agreement (SLA) tracking and management to understand when mission-critical tasks are not being met because a transfer failed or was never initiated in the first place, including visibility into associated partner and other third-party data. 
  • Reporting to document the health of your transfers, provide that insight to key stakeholders, and demonstrate compliance should an audit be required. 
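To make the SLA-tracking capability above concrete, here is a minimal sketch in Python of how a monitoring tool might flag transfers that missed their deadlines, including those that failed or were never initiated. The `Transfer` record and its fields are illustrative assumptions, not any particular product's data model.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class Transfer:
    name: str
    deadline: datetime            # SLA deadline for this transfer
    completed_at: Optional[datetime]  # None = failed or never initiated

def sla_breaches(transfers, now):
    """Return names of transfers that have missed their SLA deadline."""
    breaches = []
    for t in transfers:
        if t.completed_at is not None and t.completed_at <= t.deadline:
            continue  # finished on time
        if now > t.deadline:
            # Finished late, or the deadline passed with no completion at all.
            breaches.append(t.name)
    return breaches

# Example: one on-time transfer, one late, one that never ran.
now = datetime(2024, 1, 1, 12, 0)
transfers = [
    Transfer("payroll", datetime(2024, 1, 1, 9, 0), datetime(2024, 1, 1, 8, 0)),
    Transfer("invoices", datetime(2024, 1, 1, 9, 0), datetime(2024, 1, 1, 10, 0)),
    Transfer("orders", datetime(2024, 1, 1, 9, 0), None),
]
print(sla_breaches(transfers, now))  # ['invoices', 'orders']
```

Note that the missing transfer ("orders") is flagged even though no failure event was ever generated, which is exactly the visibility gap the list above describes.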

Any data governance solution that doesn’t meet the above baseline of capabilities is obsolete—and may be putting your organization at risk of inefficient operations, breach of contract, regulatory non-compliance, or even a catastrophic data breach. Such incidents can result in lost revenue and a damaged brand reputation.

Tracking the health of your enterprise’s data management architecture is important to maintaining proper governance. This could include using customized data maps that reflect your infrastructure and related dependencies, including summary statistics on individual components and the capability to focus the view of an individual server or expand to see details across the entire enterprise. It can also mean ensuring proper levels of management access based on an individual’s job function or need-to-know.
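The "summary statistics on individual components" mentioned above can be illustrated with a small sketch. This is a hypothetical aggregation, assuming raw transfer events arrive as (server, status) pairs; it rolls them up into per-server counts and failure rates that a health dashboard could render, whether focused on one server or the whole enterprise.

```python
from collections import defaultdict

def server_health_summary(events):
    """Aggregate per-server transfer counts and failure rates.

    events: iterable of (server_name, status) pairs,
            where status is "ok" or "failed" (an assumed schema).
    """
    stats = defaultdict(lambda: {"total": 0, "failed": 0})
    for server, status in events:
        stats[server]["total"] += 1
        if status == "failed":
            stats[server]["failed"] += 1
    # Attach a failure rate so a dashboard can sort by component health.
    return {
        server: {**s, "failure_rate": s["failed"] / s["total"]}
        for server, s in stats.items()
    }

events = [("edge-1", "ok"), ("edge-1", "failed"), ("edge-2", "ok")]
summary = server_health_summary(events)
print(summary["edge-1"]["failure_rate"])  # 0.5
```

The same rollup supports both views described above: inspect one server's entry, or scan the whole dictionary for components whose failure rate crosses a threshold.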

With this level of insight, systems administrators can also properly manage automation commands and set triggers or alerts when problems arise. Areas where administrators need transparency include scheduled actions, violations of policy or process, and simple notification when tasks complete successfully—or don’t. Such alerts can be the difference between meeting and falling short of SLAs. The beauty of tracking event expectations in this way lies not only in knowing and rectifying which processes failed, but more importantly in recognizing which processes never happened at all. If a third party never sends a necessary file, for example, you can’t track the exception unless you have the insight necessary to create an expectation that the file is coming.
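The expectation-based detection described above can be sketched in a few lines. This is a simplified illustration, not any vendor's implementation: each expected inbound file is registered with a deadline, and anything whose arrival window has closed with no file is flagged, even though no failure event ever fired.

```python
from datetime import datetime

def missing_expected_files(expectations, received, now):
    """Flag expected inbound files whose window closed with no arrival.

    expectations: dict of file name -> datetime by which it should arrive
    received:     set of file names that have actually arrived
    """
    return sorted(
        name
        for name, due_by in expectations.items()
        if name not in received and now > due_by
    )

# A partner feed was due at 06:00 and never arrived; the end-of-day
# report isn't due until 23:00, so only the feed is flagged at noon.
expectations = {
    "partner_feed.csv": datetime(2024, 1, 1, 6, 0),
    "eod_report.csv": datetime(2024, 1, 1, 23, 0),
}
received = set()
print(missing_expected_files(expectations, received, datetime(2024, 1, 1, 12, 0)))
# ['partner_feed.csv']
```

The key design point is that the check is driven by the registered expectation, not by an error from the transfer system, so a file that was never sent still surfaces as an exception.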

When network operations are smooth, no one notices. But when one file out of a thousand goes missing, there can be serious repercussions for the business, its customers, and its partners. With that in mind, perhaps the biggest benefit of using state-of-the-art data management tools is that IT staff are empowered to add value to the business: they spend less time coping with the uncertainty of managing operations by the seat of their pants, and more time ensuring that IT systems and infrastructure are functioning as expected.

When that happens, your IT staff will spend a lot less time answering the phone when the boss calls wondering what went wrong, and spend more time being the IT hero that every business needs. 

Peter Merkulov, Vice President of Product Strategy and Technology Alliances, Globalscape 


Peter Merkulov
Peter Merkulov serves as Vice President of Product Strategy and Technology Alliances at Globalscape. He is responsible for leading and overseeing the product strategy, product management, product marketing, and technology alliances teams.