Tuesday 28 March 2017

The Security Evangelist, Lesson IV: Integrity continued...

This post was originally posted on the Workshare blog at https://www.workshare.com/blog/the-security-evangelist-lesson-iii-integrity-continued.

Data integrity is quite a subject and couldn't be covered in a single post, so here's Part II: how to prove the integrity of your data...

Integrity guarantees do not prevent authorized modification of data, otherwise it would be impossible to add new versions of a document. What they do is provide users with ways of identifying all changes and who made them.

To be able to prove the integrity of your data, you first have to establish a baseline: the original data that you want to protect. A common assumption is that the first version of a document is the canonical one and that anything appearing later is a modification of that original.
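As a minimal sketch of recording such a baseline (in Python, which this post does not otherwise use; the function name `baseline_digest` is purely illustrative), you can take a cryptographic digest of the canonical version and compare later copies against it:

```python
import hashlib

def baseline_digest(data: bytes) -> str:
    """Return a SHA-256 digest that fingerprints the canonical version."""
    return hashlib.sha256(data).hexdigest()

original = b"Q3 contract draft, version 1"
digest = baseline_digest(original)

# Any change to the content produces a different digest,
# so comparing digests detects deviation from the baseline.
assert baseline_digest(original) == digest
assert baseline_digest(b"Q3 contract draft, version 2") != digest
```

The digest itself must of course be stored somewhere the author of a modification cannot reach, or it proves nothing.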

You then have to establish policies and controls around handling the data. This will ensure people understand how to manage it, whether it’s critical or not, and what to do when something goes wrong.

It is important to set permissions and tooling to prevent accidental modification and provide monitoring and metrics that enable you to assess events as they happen, depending on the criticality of the data in question.

There must be an established process to identify changes to data, including details about the changes themselves: the authors, modification times and any other information that will enable auditing. It should not be possible to modify data without leaving traces, and common policy forbids rolling back. If a version of a document is found to be faulty, a new version undoing the mistake should be added, and both versions kept for historical purposes.
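A hedged sketch of that policy, assuming a hypothetical `DocumentHistory` class (not any real product API): every change is appended with its author and timestamp, and a faulty change is corrected by appending a new version rather than deleting the old one.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class Version:
    content: str
    author: str
    modified_at: str

class DocumentHistory:
    """Append-only version history: mistakes are corrected by
    appending a new version, never by removing an old one."""

    def __init__(self):
        self._versions = []

    def append(self, content: str, author: str) -> Version:
        v = Version(content, author, datetime.now(timezone.utc).isoformat())
        self._versions.append(v)
        return v

    def versions(self) -> tuple:
        return tuple(self._versions)  # read-only view for auditors

history = DocumentHistory()
history.append("terms: net 30", "alice")
history.append("terms: net 60", "bob")    # faulty change
history.append("terms: net 30", "alice")  # correction is appended, not rolled back
assert len(history.versions()) == 3      # all three versions survive for auditing
```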

Logs must be kept for as long as legally required and must be protected against tampering. A common approach is to have systems export their logs to a separate system guarded by a completely different set of permissions, so that whoever can generate log entries cannot also alter them. The ideal approach is append-only logs, where data cannot be modified once it is in the log, no matter what level of permission you hold.
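One way to make an append-only log tamper-evident, sketched here with an illustrative `ChainedLog` class (an assumption of this post, not a named product), is to have each entry embed the hash of its predecessor; altering any earlier entry then breaks the chain:

```python
import hashlib
import json

def _entry_hash(entry: dict) -> str:
    # Canonical JSON encoding so the same entry always hashes the same way.
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

class ChainedLog:
    """Append-only log: each entry stores the hash of the previous
    entry, so modifying any earlier entry invalidates the chain."""

    def __init__(self):
        self.entries = []

    def append(self, message: str) -> None:
        prev = _entry_hash(self.entries[-1]) if self.entries else "0" * 64
        self.entries.append({"message": message, "prev_hash": prev})

    def verify(self) -> bool:
        return all(
            self.entries[i]["prev_hash"] == _entry_hash(self.entries[i - 1])
            for i in range(1, len(self.entries))
        )

log = ChainedLog()
log.append("doc 42 created by alice")
log.append("doc 42 edited by bob")
assert log.verify()

log.entries[0]["message"] = "doc 42 created by mallory"  # tamper with history
assert not log.verify()  # the broken chain exposes the tampering
```

This detects tampering but does not prevent it; pairing the chain with separate permissions, as above, covers both.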

Finally, you must set up policies and processes to deal with unauthorized modification. This must not be an afterthought: in the event of a major breach you should be working on resolving the issue, not trying to work out how to start an investigation or who you should be talking to. All requirements should be established in advance, including communications, reporting and crisis management. The criticality of this last step cannot be overstated, especially with new data protection legislation such as the GDPR, which requires timely reporting of any data breach to customers and the authorities.

The importance of data integrity must be fully understood across a business. A corrupt or incorrectly modified document in circulation will not only incur reputational loss, it may also cause large amounts of financial pain in the form of legal cases and external audits.

Once policies and controls are in place, it becomes a matter of regularly reviewing and verifying them, and of capturing the lessons from incidents and the way they are handled, whether successful or not.

In later posts, we will look at how blockchain can help create logs that are open and tamper-proof.


