Up until now, if you asked any security professional or consultant for a best-practice strategy for securing your enterprise, they would most likely recommend that you follow a Defense-in-Depth (DiD) strategy: use multiple computer security techniques so that the risk of one component of the defense being compromised or circumvented is mitigated by the others. Although this has proven to be a sound strategy for the most part, as a security practitioner I would have a tough time making that recommendation to customers who are adopting a Cloud Computing model.
As we move our resources, storage, services, and applications into the cloud, we are drastically changing our enterprise model. I would argue that we are turning the defense-in-depth model inside out. We are putting more and more on the edge of our network, if not directly into the cloud.
So let’s think about this for a second:
· How much of our existing investment in DiD strategies (firewalls, IDS, IPS, vulnerability management, NAC, anti-virus, anti-malware, etc.) can we leverage as we move our IT infrastructure further into the cloud?
· How can we ensure the confidentiality of our High Business Impact (HBI) data as we adopt more cloud computing services?
· Is a SAS 70 Type I or Type II report sufficient evidence for us to trust the confidentiality of our HBI data in the hands of our trusted Cloud vendor of choice?
· Who will be monitoring and protecting the confidentiality of such data as clients with questionable security postures interact with the Cloud service / application?
· The SaaS (Software-as-a-Service) provider? The cloud vendor? I was under the impression that they are not supposed to see into our confidential data streams. So if not them, then who?
· Who is responsible for ensuring the integrity of that data when users connect from a Starbucks or an internet kiosk without going through our corporate LAN? As far as I know, very few companies (if any) enforce Network Admission Control (NAC) on systems that are not connecting through their corporate LANs/VPNs.
· What if their system is infected with the latest worm, malware, or even worse, a rootkit? What if it is hit by a zero-day worm just before connecting to the SaaS vendor hosting the corporate secure document repository?
· Does that mean IT departments need to re-engineer their entire security architecture and operational models?
· What is the cost of doing that? Can that cost be justified by the perceived value you could expect from your Cloud Computing investment?
IT Security organizations need to be smart about this and start thinking about how to revamp, enhance, and adapt their existing security models and Risk Management strategies to keep up with the Cloud Computing “revolution,” and they need to do so quickly. I see these clouds moving toward us really fast.
As the IT infrastructure moves into the cloud, there is a lot less that we can control. Many of us use GPRS cards, hotspots, free Wi-Fi, and home broadband to do our daily work away from the corporate LAN. When it comes to the risks associated with Cloud Computing, there is very little companies can do unless they start extending their existing technology controls to be effective both inside and outside the corporate walls.
They need solutions that can control any system (managed, unmanaged, trusted, or untrusted) and any access point (internal, external, secured, or unsecured) that can be used to connect to the Cloud service hosting corporate high-business-impact data.
For those of us who have already begun to leverage Cloud services and infrastructure, the only thing we can still hope to control is the data itself.
What is being done?
On the vendor side:
Many solution vendors, especially DLP vendors like McAfee, have been thinking about this and are offering new complementary solutions, such as robust endpoint DLP agents that enforce DLP policies even when users are offline, and the ability to tag data and enforce policies and controls based on the content itself. There is also a lot of talk about SaaS DLP and other complementary technologies. I believe utilizing these would be a great step in the right direction.
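To make the content-based tagging idea concrete, here is a minimal sketch of what an endpoint agent could do: scan content against sensitivity patterns, tag it, and enforce a policy locally so the rules still apply when the user is offline. The pattern names, rules, and policy here are illustrative assumptions, not any vendor’s actual product behavior.

```python
import re

# Illustrative content patterns a hypothetical endpoint DLP agent might scan for.
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "CREDIT_CARD": re.compile(r"\b(?:\d{4}[- ]?){3}\d{4}\b"),
    "HBI_KEYWORD": re.compile(r"\b(confidential|internal only)\b", re.IGNORECASE),
}

def tag_content(text):
    """Return the set of sensitivity tags whose patterns match the content."""
    return {tag for tag, pattern in PATTERNS.items() if pattern.search(text)}

def is_upload_allowed(text, destination_trusted):
    """Simple local policy: tagged (sensitive) content may only leave the
    endpoint for a trusted destination. Because the rules live on the agent,
    they apply even with no connection back to the corporate LAN."""
    tags = tag_content(text)
    return not tags or destination_trusted

# Example: a document containing an SSN is blocked from an untrusted upload.
doc = "Internal only: employee SSN 123-45-6789"
print(sorted(tag_content(doc)))                           # ['HBI_KEYWORD', 'SSN']
print(is_upload_allowed(doc, destination_trusted=False))  # False
```

A real agent would of course use far richer detection (fingerprinting, exact data matching, machine learning) than regular expressions, but the enforcement shape is the same: classify first, then gate the action on the classification.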
On the customer side:
Some organizations have already realized that they can leverage and reuse some of the investments they have made as part of their DiD strategy. For instance, when users are traversing the corporate LAN, IT organizations should be able to leverage existing technologies like NAC (although few companies have rolled out internal NAC), HIPS, DLP, application firewalls, anti-malware, anti-spyware, antivirus, and more advanced proxies that can handle Web 2.0 applications and end-to-end HTTPS/SSL connections. These technologies, in conjunction with the right policies and processes, can help monitor and protect the integrity and confidentiality of sensitive data as users interact with the cloud from inside the corporate environment.
What is next?
In the meantime, coming up with a real solution will require a collective mind shift by all of us (security practitioners, consultants, advisors, vendors, and customers) away from system security (i.e., Defense in Depth) and toward data security, proper data classification, and Defense-at-the-Edge. Since data is really the only thing Cloud Computing users own and have control over (I know I am reaching here), perhaps that is where they should plan to invest the scarce security dollars available these days.
The focus should be on classifying and securing the data itself, as well as enhancing security at the edge. Unfortunately, that is a lot easier said than done.
· Today’s data is very dynamic and polymorphic; the same sensitive content can exist in many forms across the enterprise: DOC, XLS, PDF, ZIP, JPG, XML, WMV, MP3, SQL, encrypted, or protected by some form of DRM (Digital Rights Management)…
· Your sensitive, high-business-impact data can also live in many locations: SharePoint, secure vaults, client laptops, desktops, PDAs, servers, hosted repositories in the Cloud, partner websites, etc. No wonder e-Discovery is such an expensive and daunting effort these days.
· How do you define the edge of your network? Where are the boundaries? Are they limited to your internet gateways? I doubt it. Think about it:
a. We all use some sort of smartphone, whether an iPhone or a BlackBerry, every day to connect to our corporate and personal email and our favorite social network circle, to browse, and to check on our brokerage accounts.
b. Most of us use GPRS cards at Starbucks coffee houses (well, I go to Peet’s myself).
c. We rely on home broadband to connect to our corporate email using Outlook Web Access (OWA).
d. Some of us are even brave enough to tap into a neighbor’s Wi-Fi or jump on free Wi-Fi while taking the kids to the park.
You get the picture…
It’s time for security practitioners and consultants to collectively review and re-assess the Defense-in-Depth strategies used today, and to consider devising a complementary, more scalable, feasible, and effective Defense-at-the-Edge (DATE) strategy for tomorrow.
As I said before, we should strive to get a better handle on classifying and securing our High Business Impact data at the time of creation, and figure out a way to closely monitor and protect it throughout its lifecycle. To top it all off, we have to do this in probably one of the toughest economies we have seen in decades.
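The classify-at-creation idea can be sketched in a few lines: assign a label when the data is born, make derived artifacts inherit it, and check the label at every lifecycle decision point (storage, sharing, upload). The label names and clearance tiers below (HBI/MBI/LBI, in the Microsoft impact-tier style the post uses) and the class itself are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ClassifiedData:
    """A piece of content with a classification label attached at creation."""
    content: str
    label: str  # e.g. "HBI", "MBI", "LBI" (illustrative impact tiers)

    def derive(self, new_content):
        """Any derived artifact (excerpt, copy, export) inherits the label."""
        return ClassifiedData(new_content, self.label)

def can_store(data, location_clearance):
    """Lifecycle check: only allow storage in a location cleared for the label."""
    order = {"LBI": 0, "MBI": 1, "HBI": 2}
    return order[location_clearance] >= order[data.label]

original = ClassifiedData("Q3 acquisition plan", "HBI")
excerpt = original.derive("acquisition summary")  # label travels with the data

print(excerpt.label)              # HBI
print(can_store(excerpt, "MBI"))  # False: a share cleared only for MBI is refused
print(can_store(excerpt, "HBI"))  # True
```

The point of the sketch is that the decision follows the data rather than the network perimeter: wherever the excerpt ends up, whoever holds it can still evaluate the same label against the destination.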
Nobody said Security was easy.
Some people joke that security is just a cost center. I would encourage them to wait until they are hit by a lawsuit in which the judge orders them to perform an exhaustive e-Discovery within a 30-60 day window. Then let’s come back and compare the cost of that e-Discovery (where a good chunk of the data is dispersed across the globe, thanks in part to Cloud Computing) with the cost of proactively classifying and securing the data itself and closely monitoring and protecting the edge.
What are your thoughts?