The business of DLP: A look at three common misconceptions

As we discussed in previous articles on ITProPortal, the Data Loss Prevention (DLP) market has had to overcome significant challenges since the concept was introduced. DLP is now undoubtedly a necessary and business-critical part of a modern company's IT infrastructure; however, misconceptions about how to implement a successful DLP programme are still plentiful.

Businesses choose to implement a DLP solution for a variety of reasons, but all with three common objectives: to increase productivity, to assert control over data, and to facilitate cost savings. Whilst DLP solutions can achieve these objectives, many DLP programmes fail because companies try to implement large, overarching systems that encompass too much information too soon. This is a perennial issue for DLP programmes: realistic targets are not set, costs spiral, and success becomes unattainable because the system is simply too big.

People assume that they cannot get value from their DLP deployment without answering the following questions: What is my sensitive data? Who owns this data? Where is my sensitive data?

While these questions are indeed important, businesses should be aware that gathering this information leads to a lengthy and costly process of data classification, data ownership and discovery without demonstrating any real value or risk reduction on the DLP side.

One customer in the European financial industry indicated that it had taken more than two and a half years to determine which technologies it needed to answer the above questions, when the actual business requirement was to protect intellectual property held in unstructured data (documents), something that could have been done without that process.

Let's look at these three processes in more detail.

Data Classification

Most people implementing a DLP solution assume that they need to classify their data to find out what portion is deemed sensitive. Data classification is a common procedure, but data can still leak out because new data is constantly being added.

Old data is usually classified through a long and expensive consultancy project, whilst new data is usually classified by employees as they create it, and all too often it is not classified correctly, particularly when the implemented system suggests a default classification. For example, if most documents are marked with the category "Internal Use Only", that designation becomes worthless, as the system will get clogged with too many false positives.

When classifying data, it is best to start with small steps and only on those projects that require absolute protection (prototypes, for example). Securing this small but high-profile first step will not only generate a real and quick return on investment but also build the momentum necessary to move on to the harder aspects of the programme.
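The idea of classifying by content rather than relying on a blanket default label can be sketched in a few lines. This is a minimal illustration only; the rules, patterns and labels below are assumptions for the example and not taken from any particular DLP product:

```python
import re

# Hypothetical content-based rules: a document is labelled by what it
# actually contains, and the low-value default is only the fallback.
RULES = [
    # Card-like 16-digit number (illustrative pattern, not exhaustive)
    ("Confidential", re.compile(r"\b\d{4}[ -]?\d{4}[ -]?\d{4}[ -]?\d{4}\b")),
    # High-protection project keyword, e.g. prototypes
    ("Confidential", re.compile(r"\bprototype\b", re.IGNORECASE)),
]

def classify(text: str) -> str:
    """Return the first matching label, or the default designation."""
    for label, pattern in RULES:
        if pattern.search(text):
            return label
    return "Internal Use Only"
```

Because only documents matching a specific rule are escalated, the default category stays meaningful instead of being applied to everything.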

Data Ownership

Many believe that in order to protect data without impacting the business, data ownership must be determined in advance. However, if planned correctly, data ownership can be determined using the DLP system's data identification/categorisation capabilities.

The process should identify owners from the various business units based simply on the data type itself: financial data owned by Finance, HR data owned by HR, and so on. Most systems that offer data ownership marking look at file and folder attributes and ignore the data itself. But how do you decide who owns a miscellaneous file? Is it the creator? The person who last modified it? The person who accesses the file the most?
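Mapping an owner from the data type itself, as described above, can be sketched as a simple content-to-business-unit lookup. The keyword patterns and unit names here are illustrative assumptions, not a real categorisation engine:

```python
import re

# Hypothetical mapping from detected data type to owning business unit,
# derived from the content itself rather than file/folder attributes.
TYPE_PATTERNS = {
    "Finance": re.compile(r"\b(invoice|ledger|IBAN)\b", re.IGNORECASE),
    "HR": re.compile(r"\b(payroll|employee record|national insurance)\b",
                     re.IGNORECASE),
}

def infer_owner(text: str) -> str:
    """Assign an owning unit by data type; leave miscellaneous files alone."""
    for unit, pattern in TYPE_PATTERNS.items():
        if pattern.search(text):
            return unit
    return "Unassigned"  # not everything needs an owner up front
```

The fallback reflects the point below: a file with no clear owner can stay unassigned until an incident makes its origin worth investigating.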

Most document management systems and file servers support the ability to investigate a file's attributes and help in determining data ownership. This is especially important after an incident, when you need to know where the data originated and how valuable it is. However, most people assume you must assign an owner to everything for ownership information to be useful.


Discovery

To answer the question of where sensitive data is located, many believe that they need to embark on a large-scale discovery programme. However, the process of discovery across networks can be compared to running an antivirus program for the first time: the number of events/alerts generated by the system will be enormous. Customers are usually concerned with how long it will take the discovery system to scan all their data; collecting that much data might be achieved in a short period, but processing it all can take much longer. Customers should also understand the value that this process will bring: knowing that certain data sits in a specific place is not enough. The investigation also needs to answer the questions of why it is there, who can access it, and whether it can be moved. Additionally, if remediation is required, a second scan is useful to determine whether actions were taken, which adds more time.

If, however, you focus on a specific data set or a specific file server/storage system, you can identify what data sits there and why and, if needed, identify the owner of the data. You can then recommend the necessary actions (move, encrypt or archive the data), check whether those actions were taken, and move on to another critical folder without tackling irrelevant data.
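This targeted approach, scanning one folder, remediating, then re-scanning the same folder to confirm, can be sketched as follows. The sensitive-data pattern and the `.txt`-only scope are simplifying assumptions for the example:

```python
import pathlib
import re

# Illustrative pattern for sensitive content (a US SSN-like number here);
# a real deployment would use its own detection rules.
SENSITIVE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def scan_folder(folder: str) -> list[str]:
    """Scan one specific folder, not the whole estate, and return the
    files that contain sensitive matches."""
    hits = []
    for path in pathlib.Path(folder).rglob("*.txt"):
        if SENSITIVE.search(path.read_text(errors="ignore")):
            hits.append(str(path))
    return sorted(hits)

# Usage: the first scan_folder() call finds the exposures; after the
# flagged files are moved/encrypted/archived, a second call on the same
# folder confirms the remediation actually happened before moving on.
```

Keeping the scope to a single critical folder is what makes the verification re-scan cheap enough to run routinely.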


These misconceptions lead many organisations to make the mistake of seeing a DLP programme as an "all in one" deal. The right approach to launching a DLP programme is one that is focused and targeted. Once the business-critical data is categorised/identified, the DLP programme can be rolled out to new data sets, whilst continuing to manage the existing policies. Company directors, after all, are more interested in the results of the projects and the financial returns.

The sensitive nature of the information handled by DLP programmes often means companies like to run the process in-house. Whilst there are many talented IT staff working within companies, DLP often requires a specialist security consultant/partner to support and properly manage the system.

Choosing the right partner for your DLP programme will help it deliver a successful ROI, ensure the core management tools are implemented to monitor data activity, and provide the proactive steps needed to keep information properly monitored and secured.

Taking into account this series of articles, I hope that you have a clearer idea of the challenges of the data loss prevention market, the best approach to implementing a DLP programme, how to bring business benefits to customers and how to make the programme a success. Whilst DLP will not solve every problem within your IT systems, it is a fundamental part of a modern IT infrastructure and will provide tangible results to your business if the right steps are taken.

Lior Arbel is the Chief Technical Officer of Performanta (UK) Limited. Performanta Technologies specialises in Information Security and Risk Management, offering enterprise clients end-to-end products, services and consulting capabilities.