When It Comes To Erasure Verification, Hope Is Not Our Friend

A close look at common QC practice suggests that your erasure verification process could be failing without your knowledge.


When the National Institute of Standards and Technology updated Special Publication 800-88 (Revision 1), they made it a point to expand Section 4.7.3, which deals with verification of electronic storage media erasure processes. You can read the whole document here if you are interested, but if you’re short on time the relevant section is brief:

4.7.3 Verification of Sanitization Results: "As part of the sanitization process, in addition to the verification performed on each piece of media following the sanitization operation, a subset of media items should be selected at random for secondary verification using a separate validation tool. The secondary validation tool should be from a separate developer..."

In most cases, guidelines adopted by non-governmental certifying organizations have been based wholly or partially on this updated NIST document.
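To make the recommendation concrete, here is a minimal sketch in Python of what a secondary sampling check might look like. It assumes a simple all-zeros overwrite policy and a Linux block device path such as /dev/sdb (both illustrative): it reads a random sample of sectors and flags any that still contain data. A real secondary validation tool would also need to handle pattern overwrites, SSD behavior, and hidden areas.

```python
import os
import random

SECTOR_SIZE = 512        # logical sector size; many modern drives report 4096
SAMPLE_COUNT = 10_000    # number of randomly chosen sectors to inspect


def verify_zero_fill(device_path: str, sample_count: int = SAMPLE_COUNT) -> list:
    """Read randomly sampled sectors and return offsets of any that are not all zeros."""
    bad_offsets = []
    fd = os.open(device_path, os.O_RDONLY)   # read-only: never write to the media under test
    try:
        device_size = os.lseek(fd, 0, os.SEEK_END)
        total_sectors = device_size // SECTOR_SIZE
        for _ in range(sample_count):
            offset = random.randrange(total_sectors) * SECTOR_SIZE
            os.lseek(fd, offset, os.SEEK_SET)
            data = os.read(fd, SECTOR_SIZE)
            if any(data):                    # any non-zero byte means residual data
                bad_offsets.append(offset)
    finally:
        os.close(fd)
    return bad_offsets


if __name__ == "__main__":
    # Device path is illustrative; reading a block device requires root privileges on Linux.
    failures = verify_zero_fill("/dev/sdb")
    print(f"{len(failures)} of {SAMPLE_COUNT} sampled sectors contained non-zero data")
```

Because the sample is random and is read through the operating system rather than through the erasure appliance itself, it provides a degree of independence from the tool that performed the overwrite, which is the point of the NIST recommendation.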

NIST’s job is to develop the guidelines, not to provide or even recommend solutions. That means QC procedures have evolved in a sometimes hit-or-miss manner.

In principle, verifying whether your media erasure process worked is simply a logical component of any media sanitization protocol. The fundamental quality control objective is not complicated: did the job get done correctly or not?

But how does that match up with a day-to-day reality in which the variables involved are numerous and constantly shifting? A sober analysis of those parameters suggests that your well-intentioned erasure QC routines could be failing without your knowing it.

In a typical scenario, an organization executes a "Verify" pass after performing an overwrite on storage media. But if, for example, the hardware/software combination that performed the overwrite didn’t detect a Device Configuration Overlay (DCO) on the original sanitization pass, it is unlikely to notice it on the verify pass either.
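To illustrate the kind of check an independent pass needs to make, here is a hedged Python sketch that shells out to hdparm to compare the host-visible sector count against what the drive itself reports. The -N and --dco-identify options are real hdparm features, but the output parsing, the /dev/sdb path, and the function name are assumptions for illustration only; treat this as a sketch of the idea, not a production validation tool.

```python
import re
import subprocess


def check_hidden_areas(device: str) -> None:
    """Compare host-visible capacity with what the drive reports for HPA and DCO.

    Shells out to hdparm; the output parsing below is illustrative and may need
    adjustment for your hdparm version.
    """
    # HPA status: "hdparm -N" prints a line like "max sectors = current/native, HPA is ..."
    hpa_out = subprocess.run(["hdparm", "-N", device],
                             capture_output=True, text=True).stdout
    m = re.search(r"max sectors\s*=\s*(\d+)/(\d+)", hpa_out)
    if m:
        current, native = (int(x) for x in m.groups())
        if current != native:
            print(f"HPA detected: {native - current} sectors hidden from the host")

    # DCO status: "hdparm --dco-identify" reports the drive's real maximum sector count.
    dco_out = subprocess.run(["hdparm", "--dco-identify", device],
                             capture_output=True, text=True).stdout
    m = re.search(r"Real max sectors:\s*(\d+)", dco_out)
    if m:
        print(f"DCO reports real max sectors: {m.group(1)} (compare with host-visible capacity)")


if __name__ == "__main__":
    check_hidden_areas("/dev/sdb")   # device path is illustrative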

In order to perform objective quality control, the sampling environment must be as separate as possible from the environment in which the original process was executed.  

You can read about this in detail in our white paper on the topic: Identifying & Connecting Failures in the Media Sanitization Process.

Distilled down, the white paper asks the following question:

Can YOUR data wiping verification protocol truly analyze whether YOUR version of YOUR erasure software running on YOUR hardware has erased YOUR specific HDD models running specific firmware versions according to YOUR specific data erasure policy?  

One solution is to send your drives out to a third-party company specializing in data recovery. These outfits by and large have the tools and expertise to perform the independent analysis recommended by NIST and the certifying bodies. Of course, there is a significant charge for this kind of service, not to mention the time lag it introduces.

The alternative is a handy little device called the Validator, which we developed in partnership with CPR Tools, one of the nation’s premier forensic data recovery outfits. This is a simple, handheld tool engineered specifically to do what SP 800-88 recommends for data wiping verification, letting you perform an independent in-house verification that is valid for any of your data erasure scenarios.

Of course, the wide range of volume requirements and the cyclical nature of ITAD processes mean that in-house QC needs vary widely. That’s why we have introduced a rental program for the Validator, an affordable short-term option for media erasure verification.

Read about the Validator Rental Program here.  

Sean O'Leary