Not too many years ago, network security was a relatively low priority in most organizations. That has changed, driven by high-profile failures such as the recent breaches at Sony and Target. Closer to home, the University of Maryland is now incurring significant expenses on behalf of its students after a recent leak of personally identifiable student information.

There is much discussion about fewer federal dollars being available for defense IT spending. In reality, the shrinking pool of funding is causing government leaders to pause and think strategically about how they spend their security dollars. It is forcing them to consider the best possible ways to secure their data centers while remaining cost-effective.

What’s becoming increasingly clear is that the technology isn’t the core issue; the architectural approach to cybersecurity is. It’s time for organizations to improve their security postures by thinking about security differently.

This may be the most critical time in IT in the past 30 years. The data center must operate more efficiently, in a more automated fashion, and in a more secure manner. As hardwired devices are reconfigured as virtualized resources, there’s an opportunity to supply “a new security enforcement layer.” Thanks to network virtualization, defense agencies are seeing real benefits: access to new information and the ability to isolate the affected network when they experience an incident, all while saving money.

Virtualization and the broader infrastructure of the software-defined data center provide a unique opportunity to get it all: isolation, context, and a horizontal layer that provides near-ubiquitous coverage. Through virtualization, organizations can insert security in a location that provides end-to-end coverage, isolation, and the full context of application, user, and data. In a physical environment, it is much more difficult to isolate and remove the problem. This approach also prevents intruders who do break into the network from communicating between infected virtual machines.

In a recent Center for Digital Government (CDG) survey of state and local government IT leaders, 45 percent of respondents said network security is the most important IT issue to their organization. So security is clearly top of mind at agencies around the nation. Even so, respondents did not express great confidence in their organizations’ current defense systems to protect against threats. While 38 percent said cybersecurity ranked “very high” among their organization’s other top technology priorities, a quarter said they were either very unprepared or didn’t know if they were prepared to shield themselves from zero-day targeted attacks, advanced persistent threats, and unknown threats.

New solutions are allowing government customers to better isolate and manage security breaches and incidents. But as mentioned above, better technology alone is far from the complete answer to rising cybersecurity risks.

The traditional data center security architecture is overly perimeter-centric, with the majority of data center security investment spent on the north-south boundary. Why? Because putting security inside the data center turns out to be extremely difficult. On the perimeter you have a few egress points. Inside the data center, you have a complex web of data paths. The more controls you use, the more complex a distributed policy problem you have. The fewer controls you use, the more choke points you create.
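The asymmetry described above can be made concrete with a little arithmetic: a perimeter has only a handful of egress points, but the interior of a data center with n workloads has on the order of n(n-1)/2 potential east-west paths, each a candidate for a control. A minimal sketch (the specific numbers are illustrative assumptions, not figures from this article):

```python
# Illustrative only: contrast the handful of north-south egress points
# with the combinatorial number of east-west paths inside a data center.

def east_west_paths(workloads: int) -> int:
    """Distinct workload-to-workload paths: n choose 2."""
    return workloads * (workloads - 1) // 2

perimeter_egress_points = 4   # assumed: a few internet-facing choke points
workloads = 1000              # assumed: VMs inside the data center

print(perimeter_egress_points)      # a few controls cover the north-south boundary
print(east_west_paths(workloads))   # 499500 potential interior data paths
```

This quadratic growth is why "more controls" quickly becomes a distributed policy problem, while "fewer controls" funnels that interior traffic through a small number of choke points.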

The “Goldilocks Zone” is a term often used to describe a new approach that addresses these challenges. The term was originally coined to describe a planetary location where the conditions required to support life are simultaneously present: not too hot, not too cold. Security professionals borrowed it to describe the location for security controls that simultaneously provides context and isolation, the key characteristics required to create a secure information infrastructure.

When it comes to instrumenting IT infrastructure with security controls, defense IT historically had two choices: the network or the host. With those two choices, IT was forced to make a tradeoff between context (visibility into the application layer) and isolation (protection of the control itself).

If IT places controls in the network, there is isolation but no context. Visibility is limited to telemetry such as ports and protocols. These were never good proxies for applications, and in modern IT architectures such as the cloud, where workloads are mobile, these physical identifiers become even worse. Next-generation firewalls emerged precisely because of this issue.

If IT places controls on the host, there is context about the application, processes, files, and users, but no meaningful isolation: if the endpoint is compromised, so is the control. In both cases, what is lacking is ubiquity, a horizontal enforcement layer that places control everywhere.
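The tradeoff above can be sketched in code. The toy policy engine below is hypothetical (the class, rule names, and application labels are mine, not any vendor's API): it models an enforcement point that lives outside the guest, so a compromised VM cannot tamper with it (isolation), yet keys its decisions on application and user identity rather than ports and protocols alone (context).

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    """Traffic between two workloads, seen with full context (hypothetical model)."""
    src_app: str   # application identity of the sender
    dst_app: str   # application identity of the receiver
    user: str      # user on whose behalf the flow runs

# Hypothetical whitelist: only named app-to-app relationships may communicate.
# The rules live in the virtualization layer, not inside the guest OS.
ALLOWED = {
    ("web-frontend", "app-server"),
    ("app-server", "database"),
}

def permit(flow: Flow) -> bool:
    """Enforcement decision using context a port-based network control cannot see."""
    return (flow.src_app, flow.dst_app) in ALLOWED

print(permit(Flow("web-frontend", "app-server", "alice")))  # True: sanctioned path
print(permit(Flow("web-frontend", "database", "mallory")))  # False: lateral move blocked
```

Because the second flow is denied regardless of which port it uses, a compromised front-end VM cannot reach the database directly, which is the lateral-movement containment the article attributes to network virtualization.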

The next logical step for defense agencies is more virtualization of the network itself. We're seeing more and more adoption of this in the enterprise space right now because of enhanced security capabilities and the ability, via software, to build cloud functionality on top of existing infrastructure. Companies are tired of the constraints inherent in a siloed approach to compute, network, and storage, and can't afford those types of vulnerabilities in this new age.

Government IT leaders simply are no longer satisfied with the status quo, nor can they afford to be. Despite the increased risk profile today, it's a thrilling time to be involved in cybersecurity.

This article was written by Steven Coles, Vice President of Networking and Security for U.S. Public Sector, VMware (Palo Alto, CA).



This article first appeared in the April, 2015 issue of Aerospace & Defense Technology Magazine.
