
Room to Grow: Four Areas to Consider for Securely Handling Unstructured Data in the Cloud

Enterprises are dealing with explosive growth of unstructured file data, and, to cope, they’re turning to the cloud to store and collaborate on that data more cost-effectively across the globe. Still, while public and private clouds provide powerful new capabilities and the potential for financial and operational efficiencies, they can also broaden an organization’s risk profile.

The fact is, the cloud can provide better data security than traditional storage … when done correctly. But bringing hybrid cloud into the data path requires a different security approach than for files stored on-premises. To protect data in private and public clouds, enterprises need to employ a blend of strong encryption and local authentication, as well as the native capabilities of leading cloud storage solutions.

To ensure their unstructured data remains secure and highly available, IT teams should pay particular attention to the following four areas.

Encrypt it

Encryption provides a strong foundation for security, and file data and metadata, whether in transit or at rest, should be protected with the Advanced Encryption Standard using 256-bit keys (AES-256). Why? AES is the first publicly available cipher approved by the U.S. National Security Agency for protecting top-secret information. If it’s strong enough for sensitive government information, it’s a good choice for enterprises. Plus, as a symmetric cipher, AES-256 is fast enough to handle bulk file data.
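To make the recommendation concrete, here is a minimal sketch of AES-256 authenticated encryption using the widely used third-party `cryptography` package (not part of the Python standard library); the file contents and variable names are illustrative only.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# 256-bit key -> AES-256; GCM mode also authenticates the ciphertext
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

nonce = os.urandom(12)  # must be unique for every message under the same key
ciphertext = aesgcm.encrypt(nonce, b"quarterly-report.docx contents", None)

# Decryption fails loudly if the ciphertext was tampered with
plaintext = aesgcm.decrypt(nonce, ciphertext, None)
```

GCM is shown here because it provides integrity checking alongside confidentiality; whatever mode is chosen, the key point is a 256-bit key and a fresh nonce per object.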

What’s more, IT should be sure to “salt” keys and passwords. Passwords are typically protected not by encrypting them but by running them through a one-way hash function, so the original value is never stored and cannot be directly recovered. Salting adds random bits to each value before hashing, which adds an extra layer of security: even if the stored hashes are compromised, identical passwords produce different hashes, and precomputed rainbow-table attacks become impractical, making it far harder for unauthorized outsiders to gain access to data.

Open source public key encryption

Consider adopting the OpenPGP protocol for public key-based encryption and decryption. OpenPGP combines fast symmetric encryption to protect data with slower asymmetric encryption to protect the keys themselves. Not only does this provide strong data security at a higher level of granularity, but it also minimizes the performance impact. Plus, OpenPGP specifies details such as proper salting and cipher modes, and its cipher feedback (CFB) mode overcomes the weaknesses of alternatives such as electronic codebook (ECB) mode.
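The hybrid pattern OpenPGP uses can be sketched outside the protocol itself. The example below is not real OpenPGP packet handling; it illustrates the same idea (fast symmetric encryption of bulk data, slow asymmetric encryption of only the small session key) using the third-party `cryptography` package, with illustrative payloads.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Recipient's key pair (in OpenPGP this would live in a keyring)
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# 1. Fast symmetric encryption of the bulk data with a one-time session key
session_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, b"large file payload ...", None)

# 2. Slow asymmetric encryption of only the small session key
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = public_key.encrypt(session_key, oaep)

# Recipient: unwrap the session key, then decrypt the data
recovered = AESGCM(private_key.decrypt(wrapped_key, oaep)).decrypt(
    nonce, ciphertext, None)
```

The asymmetric operation touches only 32 bytes regardless of file size, which is why the hybrid approach keeps performance from being impacted.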

Further, IT shouldn’t neglect the encryption of metadata, which contains the file name, file size, timestamps, access control information and location within the directory tree. If this information is transmitted or stored in the clear, hackers can easily obtain and use it to launch sophisticated, targeted attacks.

Separate the paths

Separation of the data and control paths is essential. For most enterprises, the control path for data management and orchestration functions is handled by a component such as an operations center, which needs to be separated from the data path that encompasses all functions used for storage in public and private cloud environments. By keeping these paths separate, IT can prevent malware from affecting data stored in the cloud.

If using a private cloud, all file data and file system metadata should be encrypted and stored solely in the private cloud object storage. The control path can use public cloud services to provide orchestration and management functions at scale, but the data path should be kept entirely within the private cloud; file data should never be transmitted outside the enterprise security perimeter.
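The private-cloud rule above can be captured in configuration. The sketch below is a hypothetical YAML config for an edge appliance; every key name and endpoint is illustrative and does not reflect any vendor’s actual schema.

```yaml
# Hypothetical appliance config -- illustration of path separation only
control_path:
  orchestration_endpoint: https://ops.example.com    # management/telemetry only
  sends_file_data: false                             # never carries file contents
data_path:
  object_store: https://objects.private.example.net  # private-cloud storage
  encrypt_before_write: true                         # AES-256 on data and metadata
  allow_external_endpoints: false                    # data stays inside the perimeter
```

The essential property is that the control-path endpoint can live anywhere, while the data-path endpoint is pinned inside the enterprise security perimeter.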

In a hybrid situation, where on-premises appliances are deployed to cache data locally to ensure performance, the data path extends outside the enterprise security perimeter. However, so long as files and metadata are properly encrypted, and keys and passwords are salted and hashed, the data remains protected. If IT is using a cloud-only model, where appliances are deployed as virtual machines within the cloud, all file data and system metadata should be encrypted and kept in object storage.

Remember, data and metadata should never be made visible to anyone who is not authorized to possess a master key, even if they are the cloud storage provider or vendor.

Double-check the security of cloud instances and buckets

Enterprises are increasingly moving to the big three public cloud storage providers – Amazon Web Services, Google Cloud Platform and Microsoft Azure – and for valid reasons. These entities have invested billions in their data centers to ensure data reliability, availability, security and performance. By leveraging these platforms, enterprises can reduce capital investment and maintenance while staying current with relevant updates and emerging features.

Still, that doesn’t mean IT can forget about security concerns. Cloud providers operate on a shared responsibility model, which means they protect the overall infrastructure, but individual customers are responsible for securing access to cloud storage containers and instances, as well as for ensuring the data stored there is properly protected against data loss. Double-check all configurations and audit them regularly to ensure they are sound.
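As one concrete example of the customer’s side of the shared responsibility model, AWS S3 exposes Block Public Access settings that should be enabled at the account or bucket level; the JSON below shows the four real setting names (the values are the recommended lockdown posture, shown here as an illustration, not a complete bucket policy).

```json
{
  "BlockPublicAcls": true,
  "IgnorePublicAcls": true,
  "BlockPublicPolicy": true,
  "RestrictPublicBuckets": true
}
```

Equivalent controls exist on the other major platforms, and auditing them regularly is part of the configuration hygiene described above.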

But don’t take the cloud provider at their word that everything on their end is shipshape. Make sure cloud storage partners have geo-redundant storage and possess complete industry security and compliance certifications, including:

  • ISO 27001 certification
  • American Institute of Certified Public Accountants SOC 1 and SOC 2
  • CSA Security, Trust and Assurance Registry Certification (including the Consensus Assessments Initiative Questionnaire)
  • Payment Card Industry Data Security Standard Level 1
  • Health Insurance Portability and Accountability Act and Health Information Trust Alliance
  • FDA Code of Federal Regulations Title 21 Part 11

Room to grow

Unstructured file data will continue to grow. It’s clear that traditional, on-premises network-attached storage (NAS) and file server infrastructure can’t keep pace with demands to store, protect, synchronize and collaborate on files globally. Simply put, many enterprises are running out of space. But as organizations move their file storage infrastructure to the cloud, it’s critical that security not be compromised.


Keeping the above four areas at the forefront when planning and executing cloud storage migrations can ensure data is secure and protected, performance is strong and budgets are controlled. For enterprises, that means room to cost-efficiently grow and the ability to plan and pursue an even greater future.

 

Chief Product Officer at Nasuni