15.4.5 Data Leakage

Data leakage involves siphoning or leaking information out of the computer. It includes dumping files to paper or stealing computer reports and tapes. Because data leakage leaves the original copy in place, it may go undetected.

Data Leak Prevention

Data leak prevention (DLP) is a suite of technologies and associated processes that locate, monitor and protect sensitive information from unauthorized disclosure. DLP facilitates the following three key objectives:

• Locate and catalog sensitive information stored throughout the enterprise.

• Monitor and control the movement of sensitive information across enterprise networks.

• Monitor and control the movement of sensitive information on end-user systems.

These objectives are associated with the following three primary states of information:

• Data at rest

• Data in motion

• Data in use

A basic function of DLP solutions is the ability to identify and log where specific types of information (e.g., credit card or social security numbers) are stored throughout the enterprise.

To accomplish this, most DLP systems use crawlers: applications that are deployed remotely, log on to each end system and "crawl" through data stores, searching for and logging the location of specific information sets based on a set of rules that have been entered into the DLP management console.
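
As a rough illustration of the idea (not any vendor's implementation), the Python sketch below walks a hypothetical file share and logs where rule patterns such as credit card or social security numbers appear; the rule names, patterns and mount point are illustrative assumptions.

```python
import os
import re

# Illustrative rule set; a real DLP console would push far richer policies.
RULES = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def crawl(root):
    """Walk a data store and log where rule patterns occur (data at rest)."""
    findings = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, errors="ignore") as f:
                    text = f.read()
            except OSError:
                continue  # unreadable file; skip and move on
            for rule, pattern in RULES.items():
                if pattern.search(text):
                    findings.append({"rule": rule, "location": path})
    return findings

if __name__ == "__main__":
    for hit in crawl("/mnt/fileshare"):  # hypothetical data store to crawl
        print(f"{hit['rule']} found at {hit['location']}")
```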

To monitor data movement on enterprise networks (data in motion), DLP solutions use specific network appliances or embedded technology to selectively capture and analyze network traffic.

To inspect the information being sent across the network, the DLP solution must be able to:

• passively monitor the network traffic;

• recognize the correct data streams to capture;

• assemble the collected packets;

• reconstruct the files carried in the data stream; and

• perform the same analysis that is done on the data at rest to determine whether any portion of the file contents is restricted by its rule set.

The core of this ability is a process known as deep packet inspection (DPI).

DPI goes beyond the basic header information of a packet to read the contents within the packet’s payload (akin to a letter within a postal envelope).
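
For illustration only, the sketch below uses the scapy packet library (an assumed tool, not one named here) to passively watch TCP traffic and apply the same pattern matching used for data at rest to each packet payload; the stream reassembly and file reconstruction that real appliances perform are omitted.

```python
import re
from scapy.all import sniff, IP, Raw  # scapy assumed to be installed

CARD_PATTERN = re.compile(rb"\b(?:\d[ -]?){13,16}\b")  # illustrative rule only

def inspect(pkt):
    """Look past the packet headers and into the payload (the 'letter')."""
    if pkt.haslayer(IP) and pkt.haslayer(Raw):
        payload = bytes(pkt[Raw].load)
        if CARD_PATTERN.search(payload):
            print(f"ALERT: possible card data {pkt[IP].src} -> {pkt[IP].dst}")

# Passively monitor TCP traffic; needs capture privileges on the monitoring host.
sniff(filter="tcp", prn=inspect, store=False)
```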

If sensitive data are detected flowing to an unauthorized destination, the DLP solution has the capability to alert and, optionally, block the data flows in real or near real time, again based on the rule set defined within its central management component. The solution may also quarantine or encrypt the data in question.
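
A minimal sketch of how such a response might be driven by the centrally managed rule set; the action names, rule keys and lookup structure are assumptions for illustration.

```python
# Hypothetical per-rule actions as they might be defined in the central
# management component; names and structure are illustrative only.
ACTIONS = {"credit_card": "block", "us_ssn": "alert"}

def respond(rule, destination, authorized_destinations):
    """Decide in (near) real time how to treat a flagged data flow."""
    if destination in authorized_destinations.get(rule, set()):
        return "allow"                     # sensitive, but destination is approved
    return ACTIONS.get(rule, "alert")      # otherwise alert, block, quarantine, ...

print(respond("credit_card", "198.51.100.7", {"credit_card": {"10.0.0.5"}}))
```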

Monitoring of data in use refers to tracking data movement stemming from actions taken by end users on their workstations, such as copying data to a flash drive, sending information to a printer or even cutting and pasting between applications.

DLP solutions typically accomplish this through the use of a software program known as an agent, ideally controlled by the same central management capabilities of the overall DLP solution.
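
The sketch below approximates what an endpoint agent does by polling a hypothetical removable-media mount point and checking newly copied files against a rule; a real agent hooks into operating system file, print and clipboard operations rather than polling, so this is only a stand-in.

```python
import os
import re
import time

SENSITIVE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")  # illustrative SSN-style rule
REMOVABLE = "/media/usb0"                          # hypothetical flash-drive mount point

def agent_loop():
    """Crude stand-in for a DLP endpoint agent watching data in use:
    poll a removable drive and report sensitive files copied onto it."""
    seen = set()
    while True:
        for dirpath, _dirs, files in os.walk(REMOVABLE):
            for name in files:
                path = os.path.join(dirpath, name)
                if path in seen:
                    continue
                seen.add(path)
                try:
                    with open(path, errors="ignore") as f:
                        if SENSITIVE.search(f.read()):
                            print(f"ALERT: sensitive content copied to {path}")
                except OSError:
                    pass
        time.sleep(5)  # a real agent hooks OS events instead of polling

if __name__ == "__main__":
    agent_loop()
```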

The range of services available in the management console varies between products, but many have functions in common, such as those outlined in the following sections.

Policy Creation and Management

Policies (rule sets) dictate the actions taken by the various DLP components.

Most DLP solutions come with preconfigured policies (rules) that map to common regulations.

It is important to be able to customize these policies or build completely custom policies. Policies should be built upon the asset management and data classification exercises performed by the enterprise.
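
A sketch of how policies might be represented as data pushed from the management console; the regulation mapping, patterns and classification labels below are illustrative assumptions, not shipped vendor content.

```python
import re

# Preconfigured policies shipped with many products map to common regulations;
# the entries below are illustrative only.
PRECONFIGURED = {
    "PCI-DSS-card-data": {
        "pattern": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
        "severity": "high",
        "action": "block",
    },
}

# A custom policy built on the enterprise's own asset-management and
# data-classification work (the project-code label is hypothetical).
CUSTOM = {
    "internal-project-codes": {
        "pattern": re.compile(r"\bPROJ-\d{4}\b"),
        "severity": "medium",
        "action": "alert",
    },
}

# What the console would push to crawlers, network modules and endpoint agents.
POLICIES = {**PRECONFIGURED, **CUSTOM}
```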

Directory Services Integration

Integration with directory services allows the DLP console to map a network address to a named end user.
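
A minimal sketch of the idea, assuming a pre-built cache of address-to-user mappings (e.g., derived from directory logon events); a real console queries the directory service directly rather than a static dictionary.

```python
# Hypothetical cache of network address -> named end user.
ADDRESS_TO_USER = {
    "10.0.4.17": "jdoe",
    "10.0.4.42": "asmith",
}

def enrich_incident(incident):
    """Attach a named end user to an incident that only carries an IP address."""
    incident["user"] = ADDRESS_TO_USER.get(incident["src_ip"], "unknown")
    return incident

print(enrich_incident({"src_ip": "10.0.4.17", "rule": "us_ssn"}))
```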

Workflow Management

Many DLP solutions provide the capacity to configure incident handling, allowing the central management system to route specific incidents to the appropriate parties.
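
A sketch of such routing, assuming a simple table keyed on violation type and severity; the addresses and keys are illustrative.

```python
# Illustrative routing table: which party handles which kind of incident.
ROUTING = {
    ("credit_card", "high"): "security-operations@example.com",
    ("us_ssn", "high"): "privacy-office@example.com",
}

def route(incident):
    """Route an incident to the appropriate party, defaulting to the DLP admins."""
    key = (incident["rule"], incident["severity"])
    return ROUTING.get(key, "dlp-admins@example.com")

print(route({"rule": "credit_card", "severity": "high", "user": "jdoe"}))
```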

Backup and Restore

Backup and restore features allow for the preservation of policies and other configuration settings.

Reporting

The reporting function may be internal or may leverage external reporting tools. Reports can be based on violation type, severity, user and other such criteria.
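
A minimal sketch of an internal report that tallies incidents by violation type, severity and user; the incident records are illustrative.

```python
from collections import Counter

incidents = [  # illustrative incident log entries
    {"rule": "credit_card", "severity": "high", "user": "jdoe"},
    {"rule": "credit_card", "severity": "high", "user": "jdoe"},
    {"rule": "us_ssn", "severity": "medium", "user": "asmith"},
]

# Simple counts by violation type, severity and user.
by_type = Counter(i["rule"] for i in incidents)
by_severity = Counter(i["severity"] for i in incidents)
by_user = Counter(i["user"] for i in incidents)

print(by_type, by_severity, by_user, sep="\n")
```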

DLP Risk, Limitations and Considerations

Excessive reporting and false positives

Trying to monitor too many data patterns or enabling too many detection points early on can quickly overwhelm resources.

It is important that the system be rolled out in phases, focusing on the highest-risk areas first.

The greatest feature of a DLP solution is the ability to customize rules or templates to specific organizational data patterns, which helps reduce false positives.

Improperly tuned network DLP modules

Proper tuning and testing of the DLP system should occur before enabling actual blocking of content. Enabling the system in monitor-only mode allows for tuning and provides the opportunity to alert users of out-of-compliance activities and procedures so they may adjust accordingly. It is also important to establish some means of accessibility in the event critical content is being blocked during off-hours when the team managing the DLP solution is not available. Involving the appropriate business and IT stakeholders in the planning and monitoring stages helps to ensure that disruptions to processes will be anticipated and mitigated.

Encryption

DLP solutions can only inspect encrypted information that they can first decrypt. This is not the case with personal encryption packages installed by employees, for which the DLP solution has no decryption keys. To mitigate this risk, policies should forbid the installation and use of encryption solutions that are not centrally managed, and users should be educated that anything that cannot be decrypted for inspection will ultimately be blocked.
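
As a loose illustration of that principle (only an assumption about how a product might implement it), the sketch below tries to decrypt intercepted content with centrally managed keys and blocks anything it cannot decrypt for inspection; it uses the Python cryptography package's Fernet scheme purely as a stand-in.

```python
from cryptography.fernet import Fernet, InvalidToken  # 'cryptography' package assumed

# Centrally managed keys; anything encrypted outside this scheme cannot be
# decrypted for inspection and is therefore blocked (placeholder key material).
MANAGED_KEYS = [Fernet(Fernet.generate_key())]

def handle_encrypted(blob):
    """Inspect content only if one of the managed keys can decrypt it."""
    for key in MANAGED_KEYS:
        try:
            plaintext = key.decrypt(blob)
            return ("inspect", plaintext)  # hand off to normal content analysis
        except InvalidToken:
            continue
    return ("block", None)                 # not decryptable -> block the transfer

print(handle_encrypted(b"not-centrally-encrypted-data"))  # -> ('block', None)
```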

Graphics

DLP solutions cannot intelligently interpret graphics files, so sensitive information scanned into a graphics file, or intellectual property that exists only in graphic form, cannot be detected by content analysis. The alternatives, manually inspecting all such information or simply blocking it, are generally not practical, resulting in a gap in control. The enterprise should develop strong policies that govern the use and dissemination of this information. However, DLP solutions can identify specific file types, as well as their source and destination; this capability, combined with well-defined traffic analysis, can flag uncharacteristic movement of this type of information and provide some level of control.
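
A sketch of file-type identification by magic bytes combined with a simple uncharacteristic-destination check; the signatures and the notion of "usual destinations" per host are illustrative assumptions.

```python
# Illustrative magic-byte signatures for common graphics formats.
SIGNATURES = {
    b"\x89PNG\r\n\x1a\n": "png",
    b"\xff\xd8\xff": "jpeg",
    b"GIF87a": "gif",
    b"GIF89a": "gif",
}

def file_type(data):
    """Identify a graphics file by its signature even though its contents
    cannot be intelligently interpreted."""
    for magic, name in SIGNATURES.items():
        if data.startswith(magic):
            return name
    return "unknown"

def flag_uncharacteristic(data, src, dst, usual_destinations):
    """Flag movement of graphics files toward destinations that are unusual
    for the source host (a crude form of traffic analysis)."""
    kind = file_type(data)
    if kind != "unknown" and dst not in usual_destinations.get(src, set()):
        print(f"FLAG: {kind} file moving {src} -> {dst}")

flag_uncharacteristic(b"\x89PNG\r\n\x1a\n...", "10.0.4.17", "203.0.113.9", {})
```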
