A "One-Pass" Methodology for Sensitive Data Disk Wipes

Doug White, Alan Rea
Copyright © 2009 | Pages: 9
DOI: 10.4018/978-1-59904-855-0.ch016


Hard disk wipes are a crucial component of computing security. However, hard drives are often not adequately processed before they are disposed of or reused within an environment. When an organization does not follow a standard disk-wipe procedure, it risks exposing sensitive data. Most organizations do not wipe drives because of the intense time and resource commitment of a highly secure seven-pass DOD wipe. However, we posit that our one-pass methodology, verified with a zero checksum, is more than adequate for organizations wishing to protect against the loss of sensitive hard drive data.
Chapter Preview


There have always been concerns that data remaining on magnetic media could interfere with new data or create problems. Early ANS standards advocated wiping the entire width of the tape to ensure no residual data remained (Kerpelman, 1970). Moreover, there is a long-standing myth that in order to protect sensitive data from recovery, it is necessary to overwrite the data many times (Joukov, Papaxenopoulos, & Zadok, 2006). A common hacker term is the “DOD 99 wipe,” which advocates up to 99 overwrites to ensure media are unrecoverable.

Many other papers have examined the effectiveness of this type of practice (Garfinkel & Shelat, 2003; Joukov et al., 2006). In particular, Gutmann (1996) demonstrated that special equipment, such as Magnetic Force Microscopy (Rugar, Mamin, Guenther, Lambert, Stern, McFadyen, & Yogi, 1990) and other microscopic techniques (Gomez, Adly, Mayergoyz, & Burke, 1992; Gomez, Burke, Adly, Mayergoyz, & Gorczyca, 1993; Rice & Moreland, 1991), enabled the recovery of data from media wiped in the traditional sense, even after multiple passes.

Gutmann (1996) went on to demonstrate techniques to fully ensure the destruction of data using repeated writing along the lines of the DOD 99 wipe. Related works advocate physical techniques such as degaussing (NSA, 1985) or even physical destruction of the media. The seminal work for this type of approach is NIST Special Publication 800-88, which provides guidelines for media sanitization. This work advocates multiple passes—the DOD seven-pass wipe—only for the most critical data (Kissel, Scholl, Skolochenko, & Li, 2006).

Key Terms in this Chapter

IDE Write Blocker: Most operating systems automatically write some information to a disk when it is connected and the computer is turned on. This hardware device sits between an IDE drive and the computer and does not allow such information to be written to the drive. In forensic investigations, it is crucial to maintain disk image integrity, which includes preventing the operating system from writing any data to the disk.

One-Pass Methodology: A process by which all zeroes are written to a hard drive or other media in order to sufficiently sanitize the media for reuse within an organization. The one-pass methodology allows for an efficient and effective use of resources (time, personnel, and equipment) so that organizations can safely reuse media that once contained sensitive information.

Bitwise Forensic Image: An exact copy of every bit of data found on a disk image. This image is used in forensic investigations to track changes—no matter how minute—in the disk image.

Slack Space: The unused space in a particular file system cluster.

Disk Artifacts: In forensic investigations, this refers to leftover information that remains on media even after a wipe has been performed.

Hard Drive Sanitization: The process of securely wiping all data from a hard disk. With a properly sanitized drive it is impossible to recover any data.

Computer Forensics: A discipline that uses analytical and investigative techniques to identify, collect, examine, and preserve evidence or information found on computers or other devices.

Data Carving: An examination of the slack space and free space on a drive. A forensic investigator uses tools to “carve” out these sections of a drive and look for data.

Disk Clusters: A group of sectors on a disk. The operating system assigns designations to each cluster and uses them to store and access data.

Chksum: This command computes a checksum of the data on a piece of media. The checksum returns a numerical value based on the bits present.
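A minimal sketch of such a bit-based checksum (the function name is illustrative, not the actual Chksum command): counting the set bits in the data yields a checksum of exactly zero only when every byte on the medium is zero, which is the verification condition of the one-pass methodology.

```python
def bit_checksum(data: bytes) -> int:
    """Count the total number of set bits in the data.

    A fully zero-wiped medium yields a checksum of exactly zero;
    any leftover data produces a nonzero result.
    """
    return sum(bin(b).count("1") for b in data)
```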

Gutmann Wipe: A 35-pass wipe consisting of particular data patterns that mitigate the risk posed by magnetic microscopy techniques, which can be used to reveal previous bit patterns on magnetic media.

Disk Wiping Tool: Any software used to remove and/or overwrite data on a disk for security purposes. Wipes can simply write zeroes to a disk (e.g., Wiper) or perform complex writes for increased security (e.g., DOD wipes).

Hex Editor: Computer software that allows a forensics investigator to see the exact contents of a file instead of the data being interpreted by software (e.g., an operating system).

Unpartitioned Space: The unused portions of a disk that are not yet formatted for use.

DOD Wipe: The seven-pass Department of Defense disk wipe is the standard used for highly sensitive data. In the first pass, all zeroes are written to the disk. In the second pass, all ones are written. In subsequent passes a pseudo-random zero or one is written.
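The pass sequence described above can be sketched as a generator of overwrite patterns, one per pass (a hedged illustration, not an actual DOD-compliant tool; the function name and block size are assumptions):

```python
import secrets

def dod_seven_pass_patterns(block_size=4096):
    """Yield the seven overwrite patterns of the DOD wipe as described:
    all zeroes, then all ones, then five passes of pseudo-random data."""
    yield bytes(block_size)            # pass 1: all zeroes
    yield b"\xff" * block_size         # pass 2: all ones
    for _ in range(5):                 # passes 3-7: pseudo-random bits
        yield secrets.token_bytes(block_size)
```

Each yielded pattern would be written across the entire medium before the next pass begins, which is why the seven-pass wipe is roughly seven times as costly as the one-pass approach.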

Disk Cache: A portion of RAM used to speed up access times. The cache stores the most recently accessed data. When more data is requested, the cache is first checked before accessing the disk again.

Forensics Workstation: A computer specifically designed for forensic investigations. The forensic workstation includes tools to create pristine disk images, as well as a variety of analysis tools. Most workstations allow for multiple types of media to be connected so that information can be analyzed off of a variety of media, such as floppy drives, SCSI, or IDE drives, and so on.

Hash: An algorithm or function that translates data into a number. By “hashing” data, one can create a digital fingerprint of the data that can then be compared to see if the data matches. This digital fingerprint is often called a hash value.

Health Insurance Portability and Accountability Act of 1996: HIPAA provides several guidelines and regulations designed to protect individuals and their medical information. Not only does it cover privacy issues, but also a person’s right of access to all of his or her medical information. Most important to this discussion is the need for organizations to meet the Privacy Rule stipulations regarding the protection of data.

Sarbanes-Oxley Act of 2002: A United States federal law designed to make organizations accountable for their actions. It includes stipulations regarding external audits, governance, and financial disclosures. Most important to this discussion is the need for organizations to meet stipulations regarding the protection of employee, partner, and customer data.

Free Space: The unused portions of a disk that are already partitioned and ready for use.
