Targeted enterprise attacks by skilled actors are designed to evade your cyber defenses and fly under your anomaly detection radar. To enhance your enterprise’s stealth attack detection capabilities, look to rationalize and justify all network traffic by correlating and reconciling network and endpoint sensor data observations.
Skilled actors don’t draw attention to their cyber-attacks with clearly anomalous actions or careless reuse of blacklisted artifacts. They understand what is likely to raise alerts in enterprise cyber defense tools, and they practice stealthy evasion and deception. Initial penetration of a targeted enterprise has become almost child’s play, as any pen tester or red teamer will attest; after gaining a foothold, attackers maintain a low profile while maximizing their dwell time and their penetration of your network, system, and data assets. Their evasion tactics exploit weaknesses, seams, and cracks in your security framework and processes, wherever monitoring visibility is either deficient or insufficiently correlated to provide complete situational awareness.
A recent example is a major breach at NASA’s Jet Propulsion Laboratory where attackers exploited an unmanaged $25 Raspberry Pi computer to steal sensitive data. In its official report, the NASA Office of Inspector General castigated JPL’s cybersecurity management and oversight for these visibility gaps:
“Multiple IT security control weaknesses reduce JPL’s ability to prevent, detect, and mitigate attacks targeting its systems and networks, thereby exposing NASA systems and data to exploitation by cyber criminals. JPL uses its Information Technology Security Database (ITSDB) to track and manage physical assets and applications on its network; however, we found the database inventory incomplete and inaccurate, placing at risk JPL’s ability to effectively monitor, report, and respond to security incidents. Moreover, reduced visibility into devices connected to its networks hinders JPL’s ability to properly secure those networks.”
Cybersecurity Management and Oversight at the Jet Propulsion Laboratory, June 18, 2019
Network and Application Discovery
Only continuous monitoring of networks and endpoints can reveal an accurate real-time picture of what devices and applications are active on an enterprise network. Periodic static scans cannot provide that visibility and accuracy, and skilled attackers exploit this. Again, to quote the report:
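As a concrete illustration of this idea, the sketch below maintains a live device inventory from a stream of sensor observations and flags any device not in the authorized asset list, much as a complete inventory would have flagged the rogue Raspberry Pi at JPL. This is a minimal, hypothetical sketch: the `LiveInventory` class, the MAC-keyed event shape, and the authorized-list check are illustrative assumptions, not any particular product’s API.

```python
from dataclasses import dataclass


@dataclass
class Asset:
    """One observed device on the network (fields are illustrative)."""
    mac: str
    ip: str
    first_seen: float
    last_seen: float
    authorized: bool = False


class LiveInventory:
    """Maintain a real-time device inventory from streaming sensor events.

    Unlike a periodic static scan, every observation updates the inventory
    immediately, so a transient or unmanaged device cannot hide between scans.
    """

    def __init__(self, authorized_macs):
        self.authorized = set(authorized_macs)
        self.assets = {}  # mac -> Asset

    def observe(self, mac, ip, ts):
        """Record a sensor observation; return 'ALERT' for unknown devices."""
        asset = self.assets.get(mac)
        if asset is None:
            asset = Asset(mac, ip, ts, ts, mac in self.authorized)
            self.assets[mac] = asset
            if not asset.authorized:
                # New device never registered in the asset database,
                # e.g. an unmanaged Raspberry Pi.
                return "ALERT"
            return "OK"
        asset.last_seen = ts
        asset.ip = ip
        return "OK"
```

The key design point is that discovery is event-driven: the inventory is a side effect of continuous monitoring rather than a separately maintained (and inevitably stale) database.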
“The April 2018 attack on JPL’s network illustrates how sophisticated attackers can exploit weaknesses within JPL’s system of security controls. Advanced persistent threat attackers patiently and methodically move from system to system searching for weaknesses in a network to advance their attack. In this case the attacker, using an external user account, exploited weaknesses in JPL’s system of security controls to move undetected within the JPL network for approximately 10 months. Prior to detection and containment of the incident, the attacker exfiltrated approximately 500 megabytes of data from 23 files, 2 of which contained International Traffic in Arms Regulations information related to the Mars Science Laboratory mission.”
Cybersecurity Management and Oversight at the Jet Propulsion Laboratory, June 18, 2019
Network Traffic Rationalization
Once a complete real-time picture of devices and active applications is obtained from network and endpoint sensors, these observations must be correlated to provide full context in support of network traffic rationalization.
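One simple way to picture this correlation step is as a join between network flow records and endpoint socket observations, keyed on the source host and port. The sketch below assumes hypothetical record shapes (`src_ip`/`src_port` flow fields, a socket table keyed by `(host_ip, local_port)`); real sensors would supply richer keys and timestamps.

```python
def correlate(flows, endpoint_sockets):
    """Attach endpoint application context to each network flow record.

    flows:            list of dicts with at least 'src_ip' and 'src_port'
    endpoint_sockets: dict mapping (host_ip, local_port) -> process context,
                      e.g. {"process": ..., "user": ...}
    """
    enriched = []
    for flow in flows:
        key = (flow["src_ip"], flow["src_port"])
        ctx = endpoint_sockets.get(key)
        # ctx is None when no endpoint sensor claims this traffic --
        # exactly the traffic that cannot yet be rationalized.
        enriched.append({**flow, "endpoint": ctx})
    return enriched
```

Traffic that comes back with an empty endpoint context is itself a finding: it originates from a host or process the endpoint sensors cannot account for.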
Rationalize: to bring into accord with reason or cause something to seem reasonable
Any network traffic that cannot be adequately rationalized through this combination of network traffic context and endpoint application context demands further examination. Network traffic context may include the network address information (IP, port, domain) of the source and destination nodes, the protocols employed, transfer volumes and patterns, and temporal details. This is correlated with endpoint application context, which may include characterization of the communicating process and application, account and session information, other processes or open files on the communicating system, user presence and application focus tracking, and system resource consumption patterns.

Together these sensor data dimensions provide a holistic picture of the network traffic context, which informs the rationalization by justifying or questioning the appropriateness of the traffic. Machine learning can then classify traffic as justified or questionable, with this comprehensive context information serving as the feature vector in the risk scoring analysis.
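To make the risk-scoring idea concrete, here is a deliberately simplified sketch. It stands in for a real machine-learning model with a frequency baseline: it learns which combined network-and-endpoint context tuples occurred during a trusted baseline period, and scores new observations by their rarity. The class name, the four-field feature tuple, and the scoring formula are all illustrative assumptions.

```python
from collections import Counter


class TrafficRationalizer:
    """Learn 'good' behavior from a trusted baseline of correlated
    network + endpoint observations, then score new traffic by rarity.

    A stand-in for a real ML risk model: the feature vector here is a
    simple tuple of context fields, and rarity substitutes for a learned
    decision function.
    """

    def __init__(self):
        self.baseline = Counter()

    @staticmethod
    def _feature_vector(obs):
        # Combined network context (dst_port, protocol) and endpoint
        # application context (process, user). Field names are assumptions.
        return (obs["process"], obs["user"], obs["dst_port"], obs["protocol"])

    def fit(self, observations):
        """Count context tuples seen during a trusted baseline period."""
        for obs in observations:
            self.baseline[self._feature_vector(obs)] += 1

    def score(self, obs):
        """Return a risk score in [0, 1]; 1.0 means never seen in baseline."""
        seen = self.baseline[self._feature_vector(obs)]
        return 1.0 - seen / (seen + 1)
```

An attacker who has shaped their traffic to blend in at the network level (say, outbound TCP 443) would still produce an unfamiliar feature vector here, because the originating process and user context do not match any baseline tuple.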
The basic concept here is to judge the reasonableness of observed network traffic based upon its correlated network and application context. This is fundamentally different from traditional intrusion detection or traffic anomaly detection, which stealthy attackers understand and evade. Rather than applying rules that define “bad” behavior, it learns the rules of “good” behavior at the application and network context levels, which is very difficult for attackers to reproduce across both dimensions. Attackers may have surveilled your network traffic and shaped theirs to blend into this background, but they face a far harder challenge in replicating the endpoint application context sourcing that traffic. Network traffic rationalization takes your enterprise cyber defenses to the next level.
To learn more about how Ziften helps improve endpoint visibility read our Solution Brief “Unparalleled Endpoint Visibility” at https://ziften.com/unparalleled-endpoint-visibility/.