Data Acquisition & Duplication Flashcards
What is Data Acquisition?
The process of imaging or otherwise obtaining information from a digital device and its peripheral equipment and media.
Static - Non-volatile data; persists after the device is powered off (e.g., disk contents)
Live - Volatile data; lost when the device is powered off (e.g., RAM, running processes)
Order of Volatility
- Registers/Caches
- Routing tables/process table/memory
- Temporary files
- Disk and storage media
- Remote logging and monitoring data
- Physical configuration and network topology
- Archival media
First step in data collection?
Record the time, date, and command history of the system to establish an audit trail; note the date and time while executing each forensic tool or command.
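Audit-trail capture can be sketched in Python (the `run_logged` helper and its command list are illustrative, not a standard forensic tool; in practice examiners often use utilities such as `script` for this):

```python
# Sketch: timestamp each forensic command as it runs, building an audit trail.
import datetime
import subprocess

def run_logged(cmd: list[str], log: list[str]) -> str:
    """Run a command, record its start time and exit status in `log`."""
    started = datetime.datetime.now().isoformat()
    result = subprocess.run(cmd, capture_output=True, text=True)
    log.append(f"{started} {' '.join(cmd)} (exit {result.returncode})")
    return result.stdout
```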
Static Data Collection:
Bit Stream vs. Backups (What’s the big difference?)
A bit-stream copy duplicates every bit on the media - including deleted files, slack space, and unallocated space - while a backup copies only the active files. Two bit-stream methods:
- Bit-stream disk to image
- Bit-stream disk to disk
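A disk-to-image bit-stream copy can be sketched as follows (a regular file stands in for the source device here; real acquisitions use a write blocker and dedicated imaging tools such as dd or ewfacquire, and `acquire_image` is a hypothetical helper):

```python
# Minimal bit-stream "disk to image" sketch: copy every byte of the source
# into an image file, hashing as we go so the copy can be verified later.
import hashlib

def acquire_image(source: str, image: str, chunk_size: int = 1024 * 1024) -> str:
    sha256 = hashlib.sha256()
    with open(source, "rb") as src, open(image, "wb") as dst:
        while chunk := src.read(chunk_size):
            dst.write(chunk)        # every bit is copied, not just live files
            sha256.update(chunk)
    return sha256.hexdigest()       # record this hash in the chain of custody
```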
NIST SP 800-88 R1 guidance defines three sanitization methods:
- Clear: Logical techniques applied to sanitize data in all storage areas using the standard read and write commands.
- Purge: Applies physical or logical techniques that make recovery of the target data infeasible even with state-of-the-art laboratory techniques.
- Destroy: Makes recovery of the target data infeasible with state-of-the-art laboratory techniques and renders the media unusable for subsequent data storage.
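The "Clear" method amounts to overwriting all addressable storage locations using standard write commands. A minimal sketch, with a regular file standing in for a storage area (real sanitization targets whole devices, and `clear_file` is a hypothetical helper):

```python
# Sketch of a "Clear"-style overwrite: replace every addressable byte
# of the target with zeros via ordinary write commands.
import os

def clear_file(path: str) -> None:
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        f.write(b"\x00" * size)   # overwrite all addressable bytes
        f.flush()
        os.fsync(f.fileno())      # force the write to stable storage
```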
Data Acquisition Formats
- Raw - understood by everything
- Proprietary - (by tool vendor)
- Advanced Forensics Format (AFF) - open source; file extensions include .afm for AFF metadata and .afd for segmented image files. Supports two compression formats: zlib and LZMA
- Advanced Forensics Framework 4 (AFF4) - AFF4 supports image signing and cryptography. Adopts a scheme of globally unique identifiers for identifying and referring to all evidence.
Basic AFF4 object types include:
- Volumes: Store segments, which are indivisible blocks of data
- Streams: Data objects that support reading and writing
- Graphs: Collections of RDF statements
Generic Forensic Zip (gfzip)
provides an open file format for compressed, forensically complete, and signed disk-image data files. It is a set of tools and libraries for creating and accessing randomly accessible compressed files. It uses multi-level SHA-256 digests to safeguard the files and embeds the user’s metadata within the file metadata, with the data and metadata sections signed using X.509 certificates.
Logical vs. Sparse
Logical - Only specific types of files or specific files of interest to the case are captured
Sparse - Similar to logical, but also captures fragments of unallocated (deleted) data.
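A logical acquisition can be sketched as filtering by file type (the `logical_acquire` helper, the mount/destination paths, and the extension filter are all illustrative; real tools also preserve directory structure and handle name collisions):

```python
# Sketch of a logical acquisition: copy only files of interest
# (selected here by extension) from an evidence mount point.
import pathlib
import shutil

def logical_acquire(mount: str, dest: str, extensions: set[str]) -> list[str]:
    collected = []
    dest_dir = pathlib.Path(dest)
    dest_dir.mkdir(parents=True, exist_ok=True)
    for path in pathlib.Path(mount).rglob("*"):
        if path.is_file() and path.suffix.lower() in extensions:
            shutil.copy2(path, dest_dir / path.name)  # copy2 preserves timestamps
            collected.append(path.name)
    return sorted(collected)
```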
CRC-32 - 32-bit Cyclic Redundancy Check (CRC-32)
is a checksum function based on polynomial division. It detects accidental data corruption but is not collision-resistant, so it is not a substitute for a cryptographic hash.
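Python's standard library exposes CRC-32 via `zlib`; a quick illustration:

```python
# CRC-32 via the standard library: a 32-bit checksum for detecting
# accidental corruption (not a cryptographic integrity hash).
import zlib

data = b"forensic image data"
checksum = zlib.crc32(data)   # 32-bit unsigned integer
print(f"{checksum:08x}")      # conventionally shown as 8 hex digits
```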
MD5 - Message Digest 5
is an algorithm used to check data integrity by creating a 128-bit message digest from data input of any length.
SHA-1 - Secure Hash Algorithm 1
is a cryptographic hash function developed by the United States National Security Agency (NSA), and it is a US Federal Information Processing Standard (FIPS) issued by NIST. It creates a 160-bit (20-byte) hash value called a message digest. This hash value is a hexadecimal number, 40 digits long.
SHA-256
A cryptographic hash algorithm that creates a unique and fixed-size 256-bit (32-byte) hash.
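A quick check of the digest sizes above, using Python's `hashlib`:

```python
# Digest sizes of the hash functions above: MD5 (128-bit), SHA-1 (160-bit),
# and SHA-256 (256-bit), each shown as hexadecimal digits.
import hashlib

data = b"evidence"
md5    = hashlib.md5(data).hexdigest()     # 128 bits -> 32 hex digits
sha1   = hashlib.sha1(data).hexdigest()    # 160 bits -> 40 hex digits
sha256 = hashlib.sha256(data).hexdigest()  # 256 bits -> 64 hex digits
print(len(md5), len(sha1), len(sha256))    # 32 40 64
```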