
Date: January 1, 2010
Title: When to DLP and When Not to DLP?
Written by: Dr. Anton Chuvakin

Summary
DLP technology has emerged as the latest silver bullet for solving data security concerns. Many DLP vendors promise that this technology can finally realize the ultimate goal of automated data protection. How should an organization decide when to use this technology, and when not to?

The Issue

Data leak prevention (DLP), also sometimes called Content Monitoring and Filtering (CMF), is a recent addition to the information security toolset. DLP technology has rushed out of relative obscurity in recent years. Given today's security and compliance challenges, the need to protect confidential, regulated and customer data has created the perfect storm for DLP. This storm is made even more severe by the fact that many older security safeguards, such as encryption, have failed to turn the tide of losses and breaches.

Overall, DLP technology addresses the problems of discovering, monitoring and preventing leaks of sensitive data, in both structured and unstructured forms. While the industry debate on the effectiveness of DLP for preventing deliberate, malicious attacks rages on, its value for stopping negligent but still highly damaging leaks is largely unquestioned. And while DLP technology is not directly mentioned in any of the recent compliance mandates, laws and regulations, it can bring value to many information security projects that are driven by regulatory requirements.

However, as with any new security technology, many challenges await the enterprise that is planning to deploy data leak prevention. In this note, we will discuss the preconditions and criteria for deploying data leak protection technologies. We will also look at some common scenarios for when DLP must, can, and should not be used.


Discussion

Given the current rush to deploy DLP technology, many companies have discovered the limitations and challenges of this technology and its applicability to their problems. As most security professionals know, no individual security technology, no matter how innovative, will solve a significant portion of information security challenges, as silver bullets simply do not exist. In fact, DLP runs the risk of being one of the most overhyped technologies in the security domain. It is complex, often complicated to deploy, and expensive.

Despite all of the above, the main obstacle to DLP deployment success is a lack of clarity in deployment requirements, expectations and success criteria, as well as a failure to adapt the business processes and procedures that would allow DLP to do its work. For example, few admit that this technology is most effective at stopping accidental data leaks. Others believe that DLP can only be deployed after an extensive, and expensive, enterprise-wide role management project and a comprehensive information classification project. In reality, while these are extremely useful and can make your DLP deployment less painful, neither is a strict requirement for data loss protection success. On the other hand, creating monitoring and response procedures, as well as making information owners aware of (and, in fact, actively involved in) the DLP technology deployment, are essential.

Similarly, most modern DLP solutions deploy at both the network level and the system level in order to monitor and protect data at rest, data in motion, and data in use. Thus, obtaining the cooperation of network managers as well as desktop and server managers is absolutely crucial for project success. Let's review which technologies, places and procedures are mandatory, helpful or non-essential for DLP.

Having a mature identity management infrastructure in place is very helpful for DLP deployment. In fact, the common enterprise identity store can be queried by the DLP solution in order for it to make its protection decisions. If the identity management infrastructure can provide a DLP tool with roles and responsibilities for all the users in an organization, it will significantly contribute to project success. Moreover, in some cases, DLP data monitoring features have helped to refine the user roles and rules governing the use of sensitive information inside the organization.

Few enterprises nowadays can boast a comprehensive information classification effort; it appears that classifying data by sensitivity will remain primarily a government endeavor. Instead, modern DLP solutions can effectively fingerprint data in order to simplify telling sensitive data from public data.
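To illustrate one common approach to such fingerprinting, the sketch below reduces known sensitive documents to hashes of overlapping word "shingles" and flags outbound text that shares enough shingles with protected content. This is a minimal Python sketch under stated assumptions, not how any particular DLP product implements it; the shingle size, the match threshold and all function names are illustrative.

```python
# Minimal sketch of shingle-based content fingerprinting (illustrative only;
# real DLP products use far more sophisticated and tuned techniques).
import hashlib
import re

SHINGLE_SIZE = 8          # words per shingle (assumed value)
MATCH_THRESHOLD = 0.2     # fraction of shared shingles that triggers an alert (assumed)

def shingles(text: str, size: int = SHINGLE_SIZE) -> set[str]:
    """Return hashes of overlapping word n-grams ("shingles") of the text."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return set()
    grams = (" ".join(words[i:i + size]) for i in range(max(len(words) - size + 1, 1)))
    return {hashlib.sha256(g.encode()).hexdigest() for g in grams}

def fingerprint(sensitive_docs: list[str]) -> set[str]:
    """Build a fingerprint database from known sensitive documents."""
    db: set[str] = set()
    for doc in sensitive_docs:
        db |= shingles(doc)
    return db

def looks_sensitive(outbound_text: str, db: set[str]) -> bool:
    """Flag outbound content that shares too many shingles with protected data."""
    probe = shingles(outbound_text)
    if not probe:
        return False
    overlap = len(probe & db) / len(probe)
    return overlap >= MATCH_THRESHOLD

# Example usage: fingerprint an internal document, then test an outbound email body.
db = fingerprint(["Q3 acquisition plan: target company Acme, offer price confidential"])
print(looks_sensitive("Attached is the Q3 acquisition plan: target company Acme, offer price", db))
```

The design choice here is that fingerprints, not raw documents, are what the monitoring points need to hold, which is one reason fingerprinting can substitute for a full manual classification effort.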

Even though knowing where all the sensitive data is stored is not absolutely mandatory before starting a DLP project, utilizing the discovery tools bundled with DLP technology is an absolute must. Many of the commercial DLP tools will help you identify sensitive and regulated data across many systems with relatively little effort. Data crawling tools can often unveil regulated information that could cause major brand, reputation and financial damage if leaked.

On the compliance side, knowing which systems and applications house regulated data is absolutely essential. This allows users to create rules that minimize the chance of regulated data being leaked by mistake, causing not only a public relations nightmare but also possible fines from regulators. In addition, it is useful to configure the tool to detect the types of data that are commonly regulated. For example, most DLP tools can automatically detect credit card account numbers, social security numbers and other types of regulated data, as the sketch below illustrates.
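As a concrete illustration of this kind of pattern-based discovery, the following minimal Python sketch walks a directory, looks for credit-card-like numbers, and confirms candidates with the Luhn checksum to reduce false positives. The directory path, the regular expression and the reporting format are assumptions made for illustration; they are not the detection logic of any specific DLP product.

```python
# Minimal sketch of pattern-based discovery of regulated data (illustrative only).
# Walks a directory tree, looks for credit-card-like numbers, and confirms
# candidates with the Luhn checksum to cut down on false positives.
import os
import re

CARD_CANDIDATE = re.compile(r"\b(?:\d[ -]?){13,16}\b")  # loose PAN-like pattern (assumed)

def luhn_valid(number: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    digits = [int(d) for d in number if d.isdigit()]
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return len(digits) >= 13 and checksum % 10 == 0

def scan_file(path: str) -> list[str]:
    """Return Luhn-valid card-like numbers found in a single text file."""
    with open(path, errors="ignore") as f:
        text = f.read()
    return [m.group() for m in CARD_CANDIDATE.finditer(text) if luhn_valid(m.group())]

def crawl(root: str) -> None:
    """Report files that appear to contain regulated (cardholder) data."""
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            hits = scan_file(path)
            if hits:
                print(f"{path}: {len(hits)} possible card number(s)")

if __name__ == "__main__":
    crawl("/data/shares")   # hypothetical path to scan
```

Even a simple crawl like this makes the compliance point above concrete: it tells you which systems house regulated data before any blocking rules are written.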
Knowing where DLP can be used to satisfy regulatory mandates, or to serve as a compensating control for other, more complicated security technologies such as database encryption, is also very useful. This might provide a much-needed shortcut to achieving compliance and improving security without engaging in risky deployments of unproven technologies. On the other hand, DLP tools will not and cannot automatically solve a broad range of poorly defined information security challenges. Complex, distributed environments with large amounts of regulated, sensitive and invaluable data cannot be secured by simply dropping a DLP box into them.

Conclusion and next steps

To summarize, before initiating a DLP project, an enterprise must perform a careful assessment of what goals are to be accomplished by using DLP technology and what information security problems are to be solved. Next, it is important to be aware of DLP tool capabilities, requirements and limitations. It is also extremely useful to know which technologies, policies, procedures and people inside the organization can help (or, sometimes, hurt) the chances of a successful DLP project. All the hidden requirements, hidden assumptions and unspoken success criteria must be unveiled before engaging with DLP. Only after completing the steps listed above can it be concluded whether to seek deployment of data leak protection tools. This assessment will also define the value derived from the data discovery, data security monitoring and data leak blocking components of the DLP tools. Finally, it is critical to know which regulatory compliance mandates can be satisfied by deploying data leak protection tools and whether DLP is, in fact, the best way to address those regulatory requirements. Completing the steps listed above will increase the probability of success when DLP tools are implemented and will allow an organization to maximize its investment and control its information effectively and efficiently.

ABOUT THE AUTHOR:


This is an updated author bio, added to the paper at the time of reposting in 2011. Dr. Anton Chuvakin (www.chuvakin.org) is a recognized security expert in the field of log management and PCI DSS compliance. Anton leads his security consulting practice, www.securitywarriorconsulting.com, focusing on logging, SIEM, security strategy and compliance for security vendors and Fortune 500 organizations. He is an author of the books "Security Warrior" and "PCI Compliance" (www.pcicompliancebook.info) and a contributor to "Know Your Enemy II" and the "Information Security Management Handbook"; he is now working on a book about system logs. Anton has published dozens of papers on log management, correlation, data analysis, PCI DSS and security management (see the list at www.info-secure.org). His blog, www.securitywarrior.org, is one of the most popular in the industry. In addition, Anton teaches classes (including his own SANS class on log management) and presents at many security conferences across the world; he recently addressed audiences in the United States, the UK, Singapore, Spain, Russia and other countries. He works on emerging security standards and serves on the advisory boards of several security startups. Dr. Anton Chuvakin was formerly Director of PCI Compliance Solutions at Qualys. Previously, Anton worked at LogLogic as Chief Logging Evangelist, tasked with educating the world about the importance of logging for security, compliance and operations. Before LogLogic, Anton was employed by a security vendor in a strategic product management role. Anton earned his Ph.D. from Stony Brook University.
