Tuesday, 29 September 2015

Three key challenges in vulnerability risk management

This vendor-written tech primer has been edited by Network World to eliminate product promotion, but readers should note it will likely favor the submitter’s approach.

Vulnerability risk management has re-introduced itself as a top challenge – and priority – for even the most savvy IT organizations. Despite the best detection technologies, organizations continue to get compromised on a daily basis. Vulnerability scanning provides visibility into potential land mines across the network, but often just results in data tracked in spreadsheets and independent remediation teams scrambling in different directions.

The recent Verizon Data Breach Investigations Report showed that 99.9% of vulnerabilities exploited in attacks were compromised more than a year after being published. This clearly demonstrates the need to shift from a “find” mentality to a “fix” mentality. Here are three key challenges to getting there:

* Vulnerability prioritization. Today, many organizations prioritize based on CVSS score and perform some level of asset importance classification within the process. However, this is still generating too much data for remediation teams to take targeted and informed action. In a larger organization, this process can result in tens of thousands – or even millions – of critical vulnerabilities detected. So the bigger question is – which vulnerabilities are actually critical?

Additional context is necessary to get a true picture of actual risk across the IT environment. Organizations might consider additional factors in threat prioritization, such as the exploitability or value of an asset, the availability of public exploits for the vulnerability, attacks and malware actively targeting the detected vulnerability, or the popularity of a vulnerability in social media conversations.
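
To make this concrete, here is a minimal sketch of what such context-aware scoring could look like. The Finding fields, weights, and sample CVEs are purely illustrative assumptions, not taken from any particular scanner or threat feed:

```python
# Hypothetical sketch of context-aware vulnerability prioritization.
# Field names and weights are illustrative, not taken from any product or feed.

from dataclasses import dataclass

@dataclass
class Finding:
    cve_id: str
    cvss_base: float      # 0.0-10.0 severity from the scanner
    asset_value: float    # 0.0-1.0 business criticality of the affected asset
    public_exploit: bool  # exploit code publicly available
    active_attacks: bool  # threat intel reports in-the-wild exploitation
    social_buzz: float    # 0.0-1.0 volume of social media chatter

def risk_score(f: Finding) -> float:
    """Blend the CVSS base score with asset and threat context."""
    score = f.cvss_base / 10.0          # normalize technical severity
    score *= 0.5 + 0.5 * f.asset_value  # weight by how much the asset matters
    if f.public_exploit:
        score *= 1.5                    # weaponized vulnerabilities move up
    if f.active_attacks:
        score *= 2.0                    # actively exploited ones move up most
    score += 0.1 * f.social_buzz        # small nudge for trending CVEs
    return min(score, 10.0)

findings = [
    Finding("CVE-2015-0001", 9.8, 0.2, False, False, 0.1),
    Finding("CVE-2015-0002", 7.5, 0.9, True, True, 0.8),
]
for f in sorted(findings, key=risk_score, reverse=True):
    print(f.cve_id, round(risk_score(f), 2))
```

In this toy example, the lower-CVSS vulnerability that is actively exploited on a critical asset outranks the higher-CVSS one sitting on a low-value system, which is exactly the reordering that context is meant to produce.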

* Remediation process. The second and perhaps most profound challenge is the remediation process itself. On average, organizations take 103 days to remediate a security vulnerability. Given the prevalence of zero-day exploits and the speed and agility with which malware developers operate, that leaves the window of opportunity wide open for attackers.

The remediation challenge is most often rooted in the process itself. While there is no technology that can easily and economically solve the problem, there are ways to enable better management through automation that can improve the process and influence user behavior. In some cases, simple adjustments can have a huge impact. For example, a CISO at a large enterprise recently stated that something as simple as the ability to establish deadlines and send automated reminder notifications as a deadline approaches could vastly improve communication between security and DevOps/sysadmin teams.

In other words, synchronizing communication between internal teams through workflow automation can help accelerate the remediation process. From simple ticket and task management to notifications and patch deployment, the ability to track the remediation process within a single unified view can eliminate the need to navigate and update multiple systems and potentially result in significant time savings.
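
As a small illustration of the kind of workflow automation the CISO described, the sketch below assumes a hypothetical list of remediation tickets and a notify() placeholder standing in for an email, chat, or ticketing-system alert; the field names and warning window are arbitrary:

```python
# Minimal sketch of automated deadline reminders for remediation tickets.
# Ticket fields and the notify() hook are hypothetical stand-ins.

from datetime import date

tickets = [
    {"id": "VULN-101", "owner": "sysadmin-team", "deadline": date(2015, 10, 15), "status": "open"},
    {"id": "VULN-102", "owner": "devops-team", "deadline": date(2015, 10, 2), "status": "open"},
]

def notify(owner: str, message: str) -> None:
    # Stand-in for an email, chat, or ticketing-system notification.
    print(f"[notify {owner}] {message}")

def send_reminders(today: date, warn_days: int = 5) -> None:
    """Warn ticket owners when a remediation deadline is near or has passed."""
    for t in tickets:
        if t["status"] != "open":
            continue
        days_left = (t["deadline"] - today).days
        if days_left < 0:
            notify(t["owner"], f"{t['id']} is {-days_left} day(s) past its deadline")
        elif days_left <= warn_days:
            notify(t["owner"], f"{t['id']} is due in {days_left} day(s)")

send_reminders(today=date(2015, 9, 29))
```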

* Program governance. The adage “You can’t manage it if you can’t measure it” holds true when it comes to evaluating the success of a vulnerability risk management program. In general, information security programs are hard to measure compared with other operational functions such as sales and engineering. One can create hard metrics, but it is often difficult to translate those metrics into measurable business value.

There is no definitive answer for declaring success. For most organizations, this will likely vary depending on the regulatory nature of their industry and overall risk management strategy. However, IT and security teams demonstrate greater value when they can show the level of risk removed from critical systems.

Establishing the right metrics is the key to any successful governance program, but the program must also have the flexibility to evolve with the changing threat landscape. In the case of vulnerability risk management, governance may start with establishing baseline metrics such as the number of days to patch critical systems or average ticket aging. As the program evolves, new and more specific metrics can be introduced, such as the number of days from discovery to resolution, or the time from when a patch becomes available to when it is actually applied.
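
As a rough example, baseline metrics like these can be computed directly from ticket records. The fields below are assumptions standing in for whatever a scanner or ticketing system actually exports:

```python
# Illustrative calculation of baseline remediation metrics from closed tickets.
# The record fields are assumptions; real data would come from a scanner or ticketing system.

from datetime import date
from statistics import mean

closed_tickets = [
    {"discovered": date(2015, 5, 1), "patch_available": date(2015, 5, 3), "resolved": date(2015, 8, 12)},
    {"discovered": date(2015, 6, 10), "patch_available": date(2015, 6, 10), "resolved": date(2015, 7, 1)},
]

avg_discovery_to_resolution = mean(
    (t["resolved"] - t["discovered"]).days for t in closed_tickets
)
avg_patch_to_applied = mean(
    (t["resolved"] - t["patch_available"]).days for t in closed_tickets
)

print(f"Average days from discovery to resolution: {avg_discovery_to_resolution:.0f}")
print(f"Average days from patch availability to application: {avg_patch_to_applied:.0f}")
```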

Practitioners can start improving the process by making some simple changes. For example, most vulnerability assessment tools offer standard prioritization of risks based on CVSS score and asset classification. However, this approach is still generating too much data for remediation teams. Some organizations have started to perform advanced correlation with threat intelligence feeds and exploit databases. Yet, this process can be a full-time job in itself, and is too taxing on resources.

Technologies exist today to help ease this process through automation by enriching vulnerability scan data with context beyond the CVSS score. Through correlation with external threat, exploit, malware, and social media feeds, as well as the IT environment, a prioritized list of vulnerabilities is delivered based on the systems most likely to be targeted in a data breach. Automating this part of the process with existing technologies can help cut the time spent on prioritization from days to hours.
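
A minimal sketch of that correlation step, assuming simplified stand-ins for the exploit and malware feeds rather than any real intelligence source, might look like this:

```python
# Rough sketch of enriching scanner output with external exploit and malware feeds.
# The feeds below are simplified stand-ins for real threat intelligence sources.

scan_results = [
    {"host": "web-01", "cve": "CVE-2015-0002", "cvss": 7.5},
    {"host": "db-03", "cve": "CVE-2015-0003", "cvss": 9.1},
]

exploit_feed = {"CVE-2015-0002"}            # CVEs with public exploit code
malware_feed = {"CVE-2015-0002": "ExKit"}   # CVEs used by known malware kits

def enrich(result: dict) -> dict:
    """Attach exploit and malware context to a single scan finding."""
    cve = result["cve"]
    return {
        **result,
        "public_exploit": cve in exploit_feed,
        "malware": malware_feed.get(cve),
    }

# Sort so findings with public exploits rise to the top, then by CVSS score.
enriched = sorted(
    (enrich(r) for r in scan_results),
    key=lambda r: (r["public_exploit"], r["cvss"]),
    reverse=True,
)
for r in enriched:
    print(r)
```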

Today, vulnerability management has become as much about people and process as it is about technology, and this is where many programs are failing. The problem is not detection. Prioritization, remediation, and program governance have become the new priorities. It is no longer a question of if you will be hacked, but rather when, and most importantly, how. The inevitable breach has become a commonly accepted reality. Vulnerability risk management calls for a new approach that moves beyond a simple exercise in patch management to one focused on risk reduction and tolerable incident response.


Sunday, 13 September 2015

Why (and how) VMware created a new type of virtualization just for containers

VMware says containers and virtual machines are better together

As the hype about containers has mounted over the past year, it has raised questions about what this technology – which is for packaging applications – means for traditional management and virtualization vendors. Some have wondered: Will containers kill the virtual machine?

VMware answered that question with a resounding no at its annual conference in San Francisco last week. But company officials say containers can benefit from a new type of management platform, and the company has built a whole new type of virtualization just for containers.

Virtualization for containers

A decade and a half ago, VMware helped revolutionize the technology industry with the introduction of enterprise-grade hypervisors that ushered in an era of server virtualization.

Last week the company revealed a redesigned version of its classic virtualization software named Project Photon. It’s a lightweight derivative of the company’s popular ESX hypervisor that has been engineered specifically to run application containers.

“At its core, it’s still got the virtualization base,” explains Kit Colbert, VMware’s vice president and CTO of Cloud Native Applications. Colbert calls Photon a “micro-visor” with “just enough” functionality to have the positive attributes of virtualization, while also being packaged in a lightweight format ideal for containers.

Project Photon includes two key pieces. One is named Photon Machine – hypervisor software born out of ESX that is installed directly onto physical servers. It creates miniature virtual machines in which containers are placed, and each of these VMs includes a guest operating system that the user can choose. By default, Photon Machine comes with VMware’s customized Linux distribution, Photon OS, which the company has also designed to be container friendly.

The second major piece is named Photon Controller, which is a multi-tenant control plane that can handle many dozens, if not hundreds or thousands of instances of Photon Machine. Photon Controller will provision the clusters of Photon Machines and ensure they have access to network and storage resources as needed.

The combination of Photon Machine and Photon Controller creates a blueprint for a scale-out environment that has no single point of failure and exposes a single logical API endpoint that developers can write to. In theory, IT operators can deploy Project Photon and developers can write applications that run on it.

Project Photon will integrate with various open source projects, such as Docker for the container run-time support, as well as Google Kubernetes and Pivotal’s Cloud Foundry for higher-level application management. (Photon manages infrastructure provisioning while Kubernetes and CF manage application deployments.)

VMware has not yet set pricing for either platform, but both will be available this year as a private beta.

The journey to containers

Not all customers are ready to go all-in on containers though. So, VMware is also integrating container support into its traditional management tools.

vSphere Integrated Containers, a second product VMware announced, is what Colbert says is a good starting point for organizations that want to get their feet wet with containers. For full-scale container build-outs, Colbert recommends transitioning to Project Photon.

vSphere Integrated Containers is a plugin for vSphere, the company’s venerable ESX management software. “It makes containers first-class citizens in vSphere,” Colbert explains. With the plugin, customers can deploy containers inside a virtual machine, allowing each container to be managed by vSphere just like any other VM.

By comparison, if a user wanted to deploy containers in vSphere today, they would likely deploy multiple containers inside a single virtual machine. Colbert says that has potentially harmful security implications, though: if one of the containers in the VM is compromised, the other containers in the VM could be impacted. Packaging one container inside each VM allows containers to be protected by the security isolation and built-in management features of vSphere.

Kurt Marko, an analyst at Marko Insights, says VMware’s approach to containers could be appealing to VMware admins who are being pressured to embrace containers. It could come with a downside, though.

“Wrapping Photon containers in a micro-VM makes it look like any other instance to the management stack and operators,” Marko wrote in an email. “Of course, the potential downside is lost efficiency since even micro-VMs will have more overhead than containers sharing the same kernel and libraries.” VMware says the VM-overhead is minute, but Marko says it will take independent analysis to determine if there is a tax for using containers inside VMs.

Hold your horses

As VMware attempts to position itself as a container company, there are headwinds. First, it is still very early on in the container market.

“The hype far outweighs the utilization” at this point, says IDC analyst Al Gillen, program vice president for servers and systems software. He estimates that fewer than 1/10 of 1% of enterprise applications are currently running in containers. It could be more than a decade before the technology reaches mainstream adoption with more than 40% of the market.

VMware also hasn’t traditionally been known as a company that leads the charge on cutting-edge open source projects, a perception the company is fighting. Sheng Liang, co-founder and CEO of Rancher Labs – a startup that was showcasing its container operating system and management platform at VMworld – said the container movement has thus far been driven largely by developers and open source platforms like Mesos, Docker and Kubernetes. He added that he hasn’t run into a single user who is running containers in VMware environments.

Forrester analyst Dave Bartoletti says that shouldn’t be surprising, though. VMware has strong relationships with IT operations managers, not the developers who have been most enthusiastically adopting containers. The announcements the company made at VMworld are about enabling those IT ops workers to embrace containers in their VMware environments. Other management vendors, such as Red Hat, Microsoft, and IBM, are embracing containers just as enthusiastically. VMware’s argument, though, is that containers and VMs are better together.