By Al Hartmann

Illumination Defined: The New Era of Endpoints

The traditional perimeter as we know it is quickly dissolving. So what does this mean for the endpoint?

Investment in perimeter security – firewalls, managed gateways, and intrusion detection/prevention systems (IDS/IPS) – is being questioned, as returns can no longer justify the cost and complexity of creating and maintaining these antiquated defenses.

More than that, the paradigm has changed – employees are no longer exclusively working in the office. Many people are logging hours from home or while traveling, and neither location is under the umbrella of a firewall. Instead of keeping the bad guys out, firewalls often have the opposite effect – they prevent the good guys from being productive. The irony? They create a safe haven for attackers, who breach the perimeter, hide for months, and then traverse to critical systems.

What Has Really Changed?

The endpoint has become the last line of defense. With the aforementioned failure in perimeter defense and a “mobile everywhere” workforce, we must now enforce trust at the endpoint. Easier said than done, however.

In the endpoint space, identity & access management (IAM) tools are not the silver bullet. Even innovative companies like Okta, OneLogin, and cloud proxy vendors such as Blue Coat and Zscaler cannot overcome one simple truth: trust goes beyond simple identification, authentication, and authorization.

Encryption is a second line of defense for protecting entire libraries and individual assets. Yet in the most recent (2016) Ponemon study on data breaches, encryption saved only 10% of the cost per breached record (from $158 to $142). This isn’t the panacea that some make it seem.

Everything is changing

Organizations must be prepared to embrace new paradigms and attack vectors. While organizations must provide access to trusted groups and individuals, they have to address this in a better way. Critical business systems are now accessed from anywhere, at any time, not just from desks in corporate office buildings. And the contingent workforce – contractors and other non-employees – is quickly growing to comprise more than half of the overall enterprise workforce.

On endpoint devices, the binary is predominantly the problem. Presumably benign incidents, such as an executable crash, could indicate something simple – like the Windows 10 Desktop Window Manager (DWM) restarting. Or it could be a much deeper problem, such as a malicious file or early indicators of an attack.

Trusted access doesn’t solve this vulnerability. According to the Ponemon Institute, between 70% and 90% of all attacks are caused by human error, social engineering, or other human factors. This requires more than simple IAM – it requires behavioral analysis.

Instead of making good better, perimeter and identity access companies made bad faster.

When and Where Does the Good News Begin?

Taking a step back: Google (Alphabet Inc.) announced a perimeter-less network model in late 2014, and has made significant progress since. Other enterprises – from corporations to governments – have attempted this quietly and less comprehensively, but with BeyondCorp, Google has done it and shown its efforts to the world. The key design concept: the endpoint plus the (public) cloud displaces the cloistered enterprise network.

This changes the entire conversation about the endpoint – be it a laptop, desktop, workstation, or server – which is no longer subservient to the corporate/enterprise/private/organization network. The endpoint truly is the last line of defense; it must be protected, yet it must also report its activity.

Unlike the conventional perimeter security model, BeyondCorp doesn’t gate access to services and tools based on a user’s physical location or the originating network; instead, access policies are based on information about a device, its state, and its associated user. BeyondCorp considers both internal networks and external networks to be completely untrusted, and gates access to applications by dynamically asserting and enforcing levels, or “tiers,” of access.
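To make the tiered-access idea concrete, here is a minimal sketch of gating access by device trust rather than network location. This is an illustration of the concept only – the tier names, device attributes, and thresholds are invented for the example, not Google’s actual policy.

```python
# Illustrative sketch: access is granted from observed device state,
# never from which network the request originated on.

# Access tiers, ordered from least to most trusted (names are hypothetical).
TIERS = ["untrusted", "basic", "privileged", "highly_privileged"]

def device_tier(device):
    """Assign a trust tier from the device's reported state."""
    if not device.get("managed") or not device.get("disk_encrypted"):
        return "untrusted"
    if device.get("os_patch_age_days", 999) > 30:   # stale patches lower trust
        return "basic"
    if device.get("screen_lock") and device.get("agent_healthy"):
        return "highly_privileged"
    return "privileged"

def allow(request, required_tier):
    """Grant access only if the device's tier meets the app's requirement."""
    tier = device_tier(request["device"])
    return TIERS.index(tier) >= TIERS.index(required_tier)

laptop = {"managed": True, "disk_encrypted": True, "os_patch_age_days": 7,
          "screen_lock": True, "agent_healthy": True}
print(allow({"device": laptop}, "privileged"))  # True: healthy managed laptop
```

Note that the source network never appears in the decision – exactly the shift BeyondCorp describes.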

By itself, this seems innocuous. But the reality is that this is a radical new model, albeit an imperfect one. The access criteria have shifted from network addresses to device trust levels, and the network is heavily segmented with VLANs – rather than the centralized model whose “soft chewy center” invites breaches, hacks, and threats at the human level.

The good news? Breaching the perimeter becomes extremely challenging for would-be attackers, and network pivoting – a common attacker technique today – becomes next to impossible once past the reverse proxy (proving that firewalls do a better job of keeping the bad guys in than of letting the good guys out). The inverse model further applies to Google cloud servers, presumably tightly managed inside the perimeter, versus client endpoints, which are all out in the wild.

Google has made some nice refinements to proven security approaches, notably 802.1X and RADIUS, and bundled them as the BeyondCorp architecture, including strong identity and access management (IAM).

Why is this important? What are the gaps?

Ziften believes in this approach because it emphasizes device trust over network trust. However, Google doesn’t specifically describe a device security agent or emphasize any form of client-side monitoring (apart from very strict configuration control). While there may be reporting and forensics, this is something every organization should be aware of, since it’s a matter of when – not if – bad things will happen.

As Google reports: “Since implementing the initial phases of the Device Inventory Service, we’ve ingested billions of deltas from over 15 data sources, at a typical rate of about three million per day, totaling over 80 terabytes. Retaining historical data is essential in allowing us to understand the end-to-end lifecycle of a given device, track and analyze fleet-wide trends, and perform security audits and forensic investigations.”

This is an expensive and data-heavy process with two shortcomings. First, bandwidth: on the ultra-high-speed networks used by the likes of Google, universities, and research organizations, ample headroom allows this type of communication to occur without flooding the pipes, but in more pedestrian corporate and government scenarios it would cause great user disruption.

Second, machines must have the horsepower to constantly collect and transmit data. While most employees would be delighted to have current developer-class workstations at their disposal, the expense of the devices and process of refreshing them on a regular basis makes this prohibitive.

A Lack of Lateral Visibility

Very few products actually generate ‘enhanced’ netflow, augmenting traditional network visibility with rich, contextual data.

Ziften’s patented ZFlow™ provides network flow details generated from the endpoint itself – visibility otherwise accomplished only through brute force (human labor) or expensive network devices.

ZFlow acts as a “connective tissue” of sorts, extending and completing the end-to-end network visibility cycle by adding context to on-network, off-network, and cloud servers/endpoints, allowing security teams to make faster, more informed, and more accurate decisions. In essence, investing in Ziften services results in labor savings, plus an increase in speed-to-discovery and a reduction in time-to-remediation, with technology acting as a substitute for people resources.
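The idea of ‘enhanced’ netflow can be sketched simply: take the classic 5-tuple a network device would export, and join it with context only the endpoint knows. The field names below are invented for illustration – this is the concept, not Ziften’s actual ZFlow schema.

```python
# Conceptual illustration: enriching a bare network-flow record with
# endpoint-side context (process, user, device). Field names are hypothetical.
from dataclasses import dataclass, asdict

@dataclass
class FlowRecord:                 # the classic 5-tuple, as a router sees it
    src_ip: str
    src_port: int
    dst_ip: str
    dst_port: int
    protocol: str

@dataclass
class EnrichedFlow(FlowRecord):   # the same flow, plus endpoint context
    process: str                  # which binary opened the socket
    user: str                     # which logged-in user owns that process
    device_id: str                # which managed endpoint generated the flow

def enrich(flow: FlowRecord, endpoint_context: dict) -> EnrichedFlow:
    """Join a bare flow with context collected on the endpoint itself."""
    return EnrichedFlow(**asdict(flow), **endpoint_context)

flow = FlowRecord("10.0.0.5", 49152, "203.0.113.9", 443, "tcp")
ctx = {"process": "powershell.exe", "user": "jdoe", "device_id": "LT-0042"}
print(enrich(flow, ctx))
```

The enriched record lets an analyst ask “which process and user drove this connection?” directly, instead of manually correlating flow logs with endpoint logs.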

For organizations moving/migrating to the public cloud (as 56% are planning to do by 2021 according to IDG Enterprise’s 2015 Cloud Survey), Ziften offers unmatched visibility into cloud servers to better monitor and secure the complete infrastructure.

In Google’s environment, only corporate-owned (COPE) devices are allowed, crowding out bring-your-own-device (BYOD). This works for a company like Google that can hand out new devices to all staff – phone, tablet, laptop, etc. Part of the reason is the vesting of identity in the device itself, on top of user authentication as usual. The device must meet Google requirements, having either a TPM or a software equivalent, to hold the X.509 certificate used to validate device identity and to facilitate device-specific traffic encryption. Several agents on each endpoint must verify the device validation predicates called out in the access policy – which is where Ziften would need to partner with the systems management agent provider, since agent cooperation is likely essential to the process.
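A hedged sketch of that flow – identity vested in the device via a certificate, cross-checked against an inventory and policy predicates. The inventory contents, predicate names, and fingerprint scheme here are all hypothetical, chosen only to illustrate the mechanism.

```python
# Illustrative sketch: validating a device's certificate fingerprint against
# an inventory of known devices, then checking policy predicates reported by
# the endpoint agents. All names and data are hypothetical.
import hashlib

DEVICE_INVENTORY = {
    # cert fingerprint -> last reported device state
    hashlib.sha256(b"laptop-0042-cert").hexdigest(): {
        "owner": "jdoe", "managed": True, "agent_reporting": True,
    },
}

REQUIRED_PREDICATES = ("managed", "agent_reporting")  # assumed policy

def validate_device(cert_der: bytes, username: str) -> bool:
    """Both device identity (cert) and user identity must check out."""
    state = DEVICE_INVENTORY.get(hashlib.sha256(cert_der).hexdigest())
    if state is None:                   # unknown device: deny outright
        return False
    if state["owner"] != username:      # cert not tied to this user
        return False
    return all(state.get(p) for p in REQUIRED_PREDICATES)

print(validate_device(b"laptop-0042-cert", "jdoe"))  # True: known, healthy
print(validate_device(b"stolen-cert", "jdoe"))       # False: not in inventory
```

The key point is the dual check: a valid user on an unknown or unhealthy device is still denied.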


In summary, Google has developed a world-class solution, but its applicability and practicality are limited to organizations like Alphabet.

Ziften offers the same level of operational visibility and security protection to the masses, using a lightweight agent, metadata/network flow monitoring (from the endpoint), and a best-in-class console. For organizations with specialized needs or incumbent tools, Ziften provides both an open REST API and an extension framework (to augment data ingest and trigger response actions).

This yields the benefits of the BeyondCorp model to the masses, while protecting network bandwidth and endpoint (machine) computing resources. As organizations will be slow to move completely away from the enterprise network, Ziften partners with firewall and SIEM vendors.

Finally, the security landscape is steadily shifting toward managed detection and response (MDR). Managed security service providers (MSSPs) offer traditional monitoring and management of firewalls, gateways, and perimeter intrusion detection, but this is not enough: they lack both the skills and the technology.

Ziften’s solution has been tested, integrated, approved, and implemented by a number of emerging MDR providers, illustrating the standardization and flexibility of the Ziften platform and its ability to play a key role in remediation and incident response.
