Integrity refers to ensuring that user data residing in the cloud is not altered or deleted and that cloud-run applications are operating properly.
In a conventional IT environment, user data and operational applications are hosted on the user's servers, and the responsibility for their security resides with the user. This responsibility encompasses many aspects, including but not limited to firewalls, encryption, access logs, monitoring, and many of the techniques described in the Access Control page. Implementing these methods in a conventional IT environment is obviously costly in time, effort, skills, and funding. The fact that most cloud environments provide these methods as a package at an attractive price is a major selling point for moving to the cloud. But users must remain cautious, because the ultimate responsibility for the security of their data and applications remains with them.
Data encryption is the first line of defense and probably the most important data protection method. Strong cryptographic keys must be required for access to data and administrative accounts; they must be generated when a new user is created and revoked when no longer required. As the user's conventional IT practice should already demand, a similar process must be maintained in a cloud environment, and it remains the responsibility of the data owner. Cloud computing providers offer hardware security modules (HSMs) to control cryptographic keys. Within a fully managed cloud service, their use eliminates the workload and overhead of managing an HSM, but the process for their use remains a continuing and critical part of your data encryption plan. An additional factor to consider is a data classification model, which eases the burden of encrypting everything at the highest level. Open/public information (e.g., training documents) is not worth encrypting. Commercial-in-confidence information (e.g., data shared with customers) is contractually sensitive and should be encrypted as early as possible. Highly sensitive information (e.g., financial data) should be encrypted before storage in the cloud.
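As an illustration of classification-driven encryption, the sketch below uses the Python cryptography package's Fernet primitive. The tier names and policy are assumptions for this example, not any particular provider's API, and in practice the key would be issued and held by a KMS/HSM rather than generated locally.

```python
from cryptography.fernet import Fernet

# Assumption: key management is handled by a KMS/HSM in production;
# a locally generated key is used here only to keep the sketch runnable.
key = Fernet.generate_key()
cipher = Fernet(key)

# Illustrative classification policy, following the tiers described above.
ENCRYPT_BY_TIER = {
    "public": False,                   # e.g., training documents
    "commercial_in_confidence": True,  # e.g., data shared with customers
    "highly_sensitive": True,          # e.g., financial data
}

def prepare_for_upload(data: bytes, tier: str) -> bytes:
    """Encrypt data according to its classification before it leaves your environment."""
    return cipher.encrypt(data) if ENCRYPT_BY_TIER[tier] else data

token = prepare_for_upload(b"Q3 revenue figures", "highly_sensitive")
```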
Another way for users to ensure data integrity is the use of a hash function. A hash function, or algorithm, takes data as input and returns a shorter, fixed-size alphanumeric string, or key, that represents the original data. The string is called the 'hash value', also known as a 'digital fingerprint' or 'checksum'.
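For example, Python's standard hashlib module shows the fixed-size property directly: inputs of any length produce a 64-character SHA-256 hex digest, and changing a single byte yields a completely different value.

```python
import hashlib

# Both digests are exactly 64 hex characters, regardless of input length.
print(hashlib.sha256(b"hello").hexdigest())
print(hashlib.sha256(b"hello world, a much longer input").hexdigest())
```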
Using this process, it is the user's responsibility to compute the hash value of a file, encrypt the file, and upload it to the cloud for storage and/or use. The calculated hash value is stored in the customer's local secure hash repository. To check the integrity of user data on a periodic basis, the file is decrypted and its hash value is recalculated. The new hash value is compared to the stored hash value. If both values are the same, the data has not been altered and its integrity is intact.
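A minimal sketch of that workflow follows. The local JSON file standing in for the customer's secure hash repository, and the surrounding encrypt/upload/download steps, are assumptions for illustration.

```python
import hashlib, json, pathlib

# Assumption: a local JSON file stands in for the customer's secure hash repository.
HASH_STORE = pathlib.Path("local_hash_repository.json")

def record_hash(name: str, plaintext: bytes) -> None:
    """Store the file's hash locally before encrypting and uploading it."""
    store = json.loads(HASH_STORE.read_text()) if HASH_STORE.exists() else {}
    store[name] = hashlib.sha256(plaintext).hexdigest()
    HASH_STORE.write_text(json.dumps(store))

def verify_integrity(name: str, decrypted: bytes) -> bool:
    """After download and decryption, recompute the hash and compare."""
    store = json.loads(HASH_STORE.read_text())
    return hashlib.sha256(decrypted).hexdigest() == store[name]
```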
A potential new way to ensure data integrity is blockchain. Blockchain is inherently secure, combining extensive encryption, hashing, and distributed ledger technology. It is not useful for static data storage or as a searchable database, but it is being employed appropriately as an enterprise application within the cloud.
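The tamper evidence that hashing provides can be seen in the toy chain below: each record commits to the hash of its predecessor, so altering any earlier record breaks every later link. This illustrates the principle only and is not production ledger code.

```python
import hashlib, json

def make_block(data: str, prev_hash: str) -> dict:
    """Each block's hash covers its data and its predecessor's hash."""
    block = {"data": data, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

genesis = make_block("genesis", "0" * 64)
second = make_block("record A", genesis["hash"])
# Tampering with genesis["data"] changes its recomputed hash,
# which no longer matches second["prev_hash"].
```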
From an operational viewpoint, access and storage logs provide another way to ensure data integrity. Log management can be defined as a service that collects, normalizes, stores, and allows the user to search log data. This process is complex but of great value to the organization. However, if implemented insecurely, log management can introduce substantial risk and liability resulting from the centralization of vast amounts of sensitive trade secrets, passwords and user information, customer records, and/or regulated personal data.
Log management in a conventional IT environment includes extracting and centralizing data from disparate sources; normalizing the disparate data into a standardized format; storing the data in a way that is secure, tamper-resistant, and retained for an adequate amount of time; and providing a suitable search, reporting, and extraction interface. This process is costly in hardware, talent, time, and effort. A cloud environment alleviates some of this cost.
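The normalization step can be pictured with a small sketch: two illustrative (assumed) source formats, an Apache-style access line and a JSON application log, are mapped into one standard schema.

```python
import json
import re
from datetime import datetime

def from_apache(line: str) -> dict:
    # e.g. '192.0.2.1 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html" 200'
    m = re.match(r'(\S+) \S+ \S+ \[([^\]]+)\] "([^"]+)" (\d+)', line)
    ts = datetime.strptime(m.group(2), "%d/%b/%Y:%H:%M:%S %z")
    return {"source": "apache", "time": ts.isoformat(),
            "actor": m.group(1), "event": m.group(3), "status": int(m.group(4))}

def from_json_app(line: str) -> dict:
    # e.g. '{"ts": "2023-10-10T13:55:36+00:00", "user": "alice", "action": "login"}'
    rec = json.loads(line)
    return {"source": "app", "time": rec["ts"],
            "actor": rec["user"], "event": rec["action"], "status": rec.get("status")}
```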
The process is the same in a cloud computing environment and is usually a less costly service. However, the user is responsible for ensuring the collection of data from the appropriate disparate log sources, ensuring encryption of that data during transport, and verifying that the data within the cloud-supported products is normalized properly.
As in a conventional IT environment, the most critical aspect is the security of the log data at rest, and encryption of the data, at the very least, is assumed. In the cloud, the user must determine how a provider secures the data at rest and prevents it from being tampered with or accidentally intermingled with that of another cloud customer. Additional considerations include: How does the provider sanitize data on decommissioned hardware? Can you produce three months of log data from multiple disparate sources on request? Can the customer search for text across all user log entries in numerous ways through a robust user interface?
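One common technique for tamper evidence at rest is to chain log entries with an HMAC, as in the sketch below. The hardcoded key is an assumption for illustration; in practice it would be held in a KMS/HSM.

```python
import hashlib
import hmac

KEY = b"illustrative-key"  # assumption: fetched from a KMS/HSM in practice

def append_entry(log: list, message: str) -> None:
    """Chain each entry to the previous MAC so any in-place edit is detectable."""
    prev = log[-1]["mac"] if log else "0" * 64
    mac = hmac.new(KEY, (prev + message).encode(), hashlib.sha256).hexdigest()
    log.append({"message": message, "mac": mac})

def verify_chain(log: list) -> bool:
    """Recompute the chain; a single altered entry fails every later check."""
    prev = "0" * 64
    for entry in log:
        expected = hmac.new(KEY, (prev + entry["message"]).encode(),
                            hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, entry["mac"]):
            return False
        prev = entry["mac"]
    return True
```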
There are two user focus areas for ensuring that cloud-run applications are operating properly. The first is monitoring the cloud operational infrastructure. Techniques for this process include: a) monitoring the cloud infrastructure from a single platform that consolidates the metrics collected via the cloud platform's Application Programming Interface (API); b) monitoring trends and alerting on cloud resource consumption; c) monitoring the end-user experience with tools that seamlessly sync with an end-user experience tool; and d) integrating metrics, flows, and logs for a complete view of all the data working together. A sophisticated monitoring platform turns all of this dissimilar data into uniform metrics, provides problem alerts for action, presents the user with information to baseline performance, and graphs trends for future planning.
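A stripped-down sketch of items a) and b) appears below. The endpoint, metric names, and thresholds are hypothetical placeholders for whatever the provider's monitoring API actually exposes.

```python
import json
import urllib.request

METRICS_URL = "https://api.example-cloud.com/v1/metrics"  # hypothetical endpoint
THRESHOLDS = {"cpu_percent": 85.0, "storage_percent": 90.0}  # illustrative limits

def poll_and_alert() -> list:
    """Poll the platform's metrics API once and flag resources over threshold."""
    with urllib.request.urlopen(METRICS_URL) as resp:
        metrics = json.load(resp)
    alerts = []
    for name, limit in THRESHOLDS.items():
        value = metrics.get(name)
        if value is not None and value > limit:
            alerts.append(f"ALERT: {name}={value} exceeds {limit}")
    return alerts
```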
The second is fully embracing user responsibility for all the Software as a Service (SaaS) applications employed for your specific business operations in the cloud (aka silo software). This includes: vetting apps before deployment, including security checks; controlling employees' downloads; documenting and managing SaaS portfolios; completing security maintenance tasks, such as patching and configuration; creating and maintaining an application inventory and periodically getting rid of the glut; and initiating security risk assessments and analyzing for vulnerabilities.
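One way to keep such an inventory honest is a record per application with a review date, so that unvetted or stale entries can be flagged automatically. The fields below are illustrative assumptions, not a prescribed schema.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class SaaSApp:
    name: str
    owner: str
    vetted: bool        # passed the pre-deployment security check
    last_reviewed: date

def stale_apps(inventory: list, max_age_days: int = 180) -> list:
    """Flag apps that are unvetted or overdue for review (the 'glut' to trim)."""
    cutoff = date.today() - timedelta(days=max_age_days)
    return [a for a in inventory if not a.vetted or a.last_reviewed < cutoff]
```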
An alternative to this customer effort is to employ a third-party cloud access security broker (CASB). The customer must weigh the hard choice between entrusting a third party with sensitive code/data and the in-house effort described above. An additional benefit of using a CASB is that it enables the business to upload and implement applications faster, which, in turn, spurs innovation and enhances competitiveness. Even with a CASB, in-house resources must remain involved in the process, bear responsibility, and respond quickly to CASB warnings and alerts.