Accelerating Organization-wide Cloud Computing

Greg Gardner, Chief Architect, Government and Defense Solutions, NetApp

The Department of Defense (DoD) and the aerospace and defense industry, two entities heavily reliant on collaboration, still grapple with siloed legacy systems, processes, and IT infrastructures. Both are now turning to cloud computing to address this issue.

Last year, Deltek forecast that federal agency demand for cloud computing services would balloon from $2.45 billion in FY2014 to $6.5 billion in FY2019, citing DoD movement to the cloud as a factor in this growth. However, accelerating organization-wide cloud adoption across the DoD, the largest potential consumer of cloud computing services in the U.S. federal government, is slowed by security concerns.

Every organization, public or private, that considers cloud computing must confront a conundrum. While cloud computing is arguably both more effective and more efficient than legacy alternatives, the fact remains that when data is moved off premise to some form of cloud storage, whether government, private, hybrid, or public, the organization cedes stewardship of that data to a third party. Unlike the entities that generate and own the data, cloud service providers are motivated to bring in and manage more clients, either to generate profit or to justify their existence. Further, clouds expose more attack surfaces (more data is accessed by more users through more applications, all co-located in the same cloud) and introduce more people (e.g., cloud administrators). That puts data even more at risk. Those considering a move to cloud computing find this very worrying.

Organizations deal with this challenge in a variety of ways. Many in the private sector have very specific rules limiting the use of “external” cloud services and expressly restrict data that may be stored off premise to the “low risk” category, with waivers requiring both legal review and Vice President approval.

Those in the public sector rely on cloud-related government policies. Under the umbrella of the Department of Defense Cloud Computing Strategy, for example, the Defense Information Systems Agency (DISA) recently released a Security Requirements Guide (SRG) for cloud computing, which is intended to make it easier—and quicker—for Defense Department agencies to procure commercial cloud services while still ensuring security. The new SRG closely follows the Federal Risk and Authorization Management Program (FedRAMP) used by civilian federal agencies while adding additional requirements in areas where extra security is needed.  

And yet an unchanging baseline of security standards would fail to keep pace with security threats, so government cloud security standards are destined to continually change. In fact, the Office of Management and Budget’s 2014 Federal Information Security Management Act report cited many agencies and contractors that were not NIST-compliant and did not meet evolving cloud security standards and guidelines.


Additionally, data sharing between cloud service providers must improve, and issues surrounding the liability of both government and private-sector data, and the ramifications if data is breached through the cloud, must be addressed. Technical and procedural guidelines continue to evolve as well. Where once the government spoke of “Cloud Access Points,” the means by which commercial cloud service providers were to connect to government networks, that concept is no longer even discussed.

And with all these policies and limitations, questions inevitably emerge about cloud computing’s true return on investment and whether its actual effectiveness will eventually allow government agencies to save money by repurposing some of their workforce.

But when cloud computing works, it can be exceptionally valuable. Take, for example, the case of the Army’s Logistics Support Activity (LOGSA) at Redstone Arsenal, Alabama. The Army employs hybrid cloud computing services from a commercial provider to bolster services provided by LOGSA. This cloud architecture is responsible for some 40 million unique Army data transactions per day. LOGSA houses the Logistics Information Warehouse, which serves as the Army’s storehouse for logistics data and also provides relevant business intelligence to ensure the Army’s materiel needs worldwide are met.

LOGSA transitioned to its hybrid cloud model in 2014. Today, the cloud service provider is securely adding new data analytics capabilities to LOGSA to maximize returns from the cloud environment. The hybrid cloud model allows the organization to better make use of the enormous amount of logistics data the Army collects. That, in turn, enables sharper insights and better services to units across the Army. Critical logistics data like equipment inventories and movement information are just a few of the data points LOGSA manages in the cloud.

LOGSA’s hybrid cloud model—which securely connects various on-premise IT systems and adds the provider’s capabilities to the mix—has slashed costs by 50 percent over its previous model. In addition, the new model is less prone to downtime; reliability has improved from 99 percent to 99.999 percent.
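The jump from two nines to five nines of reliability is larger than the figures suggest. A quick calculation (an illustrative sketch, not drawn from the article) makes the difference concrete:

```python
# Annual downtime implied by an availability figure. At 99 percent
# availability a system may be down for more than three days a year;
# at 99.999 percent ("five nines"), only about five minutes.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def downtime_minutes_per_year(availability: float) -> float:
    """Minutes of allowable downtime per year at a given availability."""
    return MINUTES_PER_YEAR * (1 - availability)

print(round(downtime_minutes_per_year(0.99)))     # about 5256 minutes (~3.65 days)
print(round(downtime_minutes_per_year(0.99999)))  # about 5 minutes
```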

All things considered, best practices for the use of cloud computing are best addressed in terms of a data-lifecycle management model. When data stores are first established and both technical and procedural issues are being hashed out, it makes sense to leverage cloud storage to hold down the costs of development and testing. When the system moves into production, however, logic dictates that critical data must be both well protected and provisioned close to compute to meet the demands of daily operations. This can be done on-premise or in some form of cloud structure, but the standards of security and performance are exceptionally high, as are implementation costs.

For disaster recovery or as data ages and has not been touched for many months or years, it makes sense to move that data to a less expensive but less functional form of cloud storage. But this data must also be well secured. Many organizations find that it makes the most sense to protect these workloads within the cloud and keep copies separated by region, login credentials, or even cloud providers.
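The lifecycle model above reduces to a simple tiering policy. The sketch below is a toy illustration of that policy (the tier names and threshold are invented for this example, not any agency's actual rules):

```python
def recommend_tier(days_since_access: int, is_production: bool,
                   archive_after_days: int = 365) -> str:
    """Toy data-lifecycle policy. Data untouched beyond the threshold
    moves to a cheaper, less functional archive-class cloud tier;
    production data stays close to compute; everything else can live
    in lower-cost cloud storage for development and testing."""
    if days_since_access > archive_after_days:
        return "archive-cloud"          # aged data: cheap, well secured
    if is_production:
        return "production-near-compute"  # critical data, close to compute
    return "dev-test-cloud"             # hold down development costs

print(recommend_tier(10, is_production=True))    # production-near-compute
print(recommend_tier(10, is_production=False))   # dev-test-cloud
print(recommend_tier(400, is_production=True))   # archive-cloud
```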

One approach to securing aging, untouched data or provisioning backup/recovery stores involves using physical or virtual appliances to compress, de-duplicate, encrypt, and then seamlessly and securely back up data and workloads to the cloud. For example, storage teams can now quickly spin up a cloud-based appliance with a cloud service provider and move their data to that cloud with enterprise-class speed and security. They pay only for what they use, and they retain the encryption keys. This approach is becoming increasingly popular among government organizations.
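The appliance pipeline described above, de-duplicate, then compress, then encrypt before upload, can be sketched in a few lines. This is a minimal stdlib-only illustration, not any vendor's product: chunks are fingerprinted for de-duplication and compressed, and the encryption step (performed with a customer-held key in a real appliance) is noted in a comment rather than implemented:

```python
import hashlib
import zlib

def prepare_chunk(data: bytes, seen: set):
    """Content-address a chunk for de-duplication, then compress it.

    Returns (fingerprint, payload). A payload of None means the chunk
    is a duplicate of one already stored and need not be re-uploaded.
    In a real appliance, encryption with a customer-retained key
    (e.g., AES-256) would follow compression, before upload.
    """
    fingerprint = hashlib.sha256(data).digest()
    if fingerprint in seen:
        return fingerprint, None      # duplicate: skip the upload
    seen.add(fingerprint)
    return fingerprint, zlib.compress(data)

# Usage: the second, identical chunk is recognized and skipped.
seen = set()
chunk = b"logistics record " * 100
fp1, payload1 = prepare_chunk(chunk, seen)
fp2, payload2 = prepare_chunk(chunk, seen)
print(payload2 is None)               # True: de-duplicated
print(len(payload1) < len(chunk))     # True: compressed before upload
```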

In sum, cloud computing is here to stay. It is functional, low cost, and enterprise-class. Nevertheless, security challenges persist and will continually evolve. Private sector and government must aggressively work to balance the security, usability, and performance of their data stores and workloads and cloud computing must be a key part of those deliberations. 
