
Incident Journal: Events & Experiences

A running record of hands-on security work: investigations, detections, and automation exercises. Each entry maps to a phase of the NIST Incident Response Lifecycle and documents what I did, what I used, and what I took away from it.


Entry 001: Phishing Investigation

Date: 2025-05-13
Phase: Detection & Analysis
Tools: Outlook, VirusTotal

An employee reported a suspected phishing email impersonating HR with an “updated benefits enrollment” prompt. The message used a lookalike sender address and a button linking to a credential-harvesting page. Several other recipients were identified via a broad internal distribution list.

Investigation:

  • Pulled full email headers in Outlook; confirmed sender domain was spoofed
  • Checked the embedded URL and its domain against VirusTotal; both were flagged as suspicious, and the domain was recently registered and unaffiliated with the organization
  • Searched mailboxes for identical messages to scope the distribution
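The header-analysis step above can be sketched with Python's standard `email` module. This is a minimal illustration, not the actual triage tooling: the raw message and its addresses are fabricated placeholders, and the check only compares the `From` domain against `Return-Path` and `Reply-To`, a common but not conclusive spoofing signal.

```python
from email import message_from_string
from email.utils import parseaddr

# Fabricated sample message for illustration only.
RAW = """\
From: "HR Benefits" <hr@examp1e-corp.com>
Return-Path: <bounce@mail.badhost.example>
Reply-To: attacker@badhost.example
Subject: Updated benefits enrollment

Please review your updated benefits.
"""

def spoof_indicators(raw: str) -> list[str]:
    """Flag simple header mismatches that often accompany spoofed mail."""
    msg = message_from_string(raw)
    from_domain = parseaddr(msg.get("From", ""))[1].rsplit("@", 1)[-1].lower()
    findings = []
    for header in ("Return-Path", "Reply-To"):
        addr = parseaddr(msg.get(header, ""))[1]
        if addr:
            domain = addr.rsplit("@", 1)[-1].lower()
            if domain != from_domain:
                findings.append(f"{header} domain ({domain}) differs from From domain ({from_domain})")
    return findings

for finding in spoof_indicators(RAW):
    print(finding)
```

A mismatch here is a triage signal, not proof; SPF/DKIM/DMARC results in the `Authentication-Results` header carry more weight.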

Containment:

  • Blocked sender domain and URL at the email gateway
  • Issued user advisory; reset credentials and forced MFA re-registration for the user who interacted with the link
  • Logged indicators of compromise (sender address, subject, URL) and filed the incident

Takeaway: Early reporting compressed the response window significantly. Header analysis and reputation lookups are fast, high-confidence triage steps that should be default practice.


Entry 002: Malware Detection with Suricata

Date: 2025-05-15
Phase: Detection & Analysis
Tools: Suricata, Wireshark

An internal host on the 192.168.1.x subnet was generating outbound traffic to a known malicious IP, consistent with command-and-control (C2) communication. The traffic was not blocked by existing rules.

Response:

  • Authored a custom Suricata rule targeting the observed traffic pattern and destination IP
  • Validated the rule trigger using Wireshark packet captures to confirm match accuracy
  • Isolated the affected host; performed a clean re-image
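A rule of roughly this shape covers the scenario described above. The destination IP (a documentation address), message text, and sid are placeholders, not the actual indicator from the incident:

```
alert tcp 192.168.1.0/24 any -> 203.0.113.50 any (msg:"Possible C2 beacon to known-bad IP"; flow:established,to_server; classtype:trojan-activity; sid:1000001; rev:1;)
```

Constraining direction with `flow:established,to_server` keeps the rule from firing on unrelated return traffic.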

Takeaway: Writing detection logic against real observed behavior, rather than relying on generic signatures, produces rules that actually fire. Pairing Suricata with Wireshark gives fast feedback on rule precision.


Entry 003: Brute-Force Detection in Splunk

Date: 2025-05-18
Phase: Detection & Analysis
Tools: Splunk

Practiced building detection and visibility for brute-force login attempts in Splunk, using a log dataset containing repeated authentication failures.

Work summary:

  • Queried failed login events grouped by src_ip and user to identify outlier patterns
  • Used time-series visualizations to surface burst activity across short windows
  • Configured threshold-based alerting for excessive failures within a defined period
  • Built a monitoring dashboard consolidating login failure metrics for operational review
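The correlation query above follows a common SPL pattern. This is a sketch: the index, sourcetype, field names, and threshold are assumptions about the dataset, not its actual schema:

```
index=auth sourcetype=linux_secure action=failure
| bin _time span=5m
| stats count AS failures, dc(user) AS users_targeted BY src_ip, _time
| where failures > 20
```

Bucketing with `bin _time span=5m` before `stats` is what surfaces burst activity in short windows rather than slow accumulation over hours.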

Takeaway: Correlation queries turn raw authentication logs into actionable signals. Building the dashboard reinforced how important good field normalization is before querying at scale.


Entry 004: File Hash Analysis

Date: 2025-05-21
Phase: Detection & Analysis
Tools: VirusTotal, Any.run, Hybrid Analysis

Investigated a SHA256 hash submitted for triage. Cross-referenced the hash across three sandbox and reputation platforms to build a risk profile.

Findings:

  • VirusTotal: flagged by 39 vendors
  • Any.run: dynamic analysis showed ransomware-like behavior including file enumeration and encryption activity
  • Hybrid Analysis: tracked external call patterns and file system modifications
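The VirusTotal vendor count above comes from the `last_analysis_stats` block of a v3 file report. A small helper like the following can turn that structure into a detection ratio for triage notes; the sample stats dict is illustrative, not the actual report:

```python
def detection_ratio(stats: dict) -> tuple[int, int]:
    """Return (detections, total engines) from a VirusTotal v3
    last_analysis_stats dict, counting malicious + suspicious as hits."""
    detections = stats.get("malicious", 0) + stats.get("suspicious", 0)
    total = sum(stats.values())
    return detections, total

# Illustrative stats, shaped like a v3 file report's last_analysis_stats.
sample_stats = {"malicious": 39, "suspicious": 2, "undetected": 30, "harmless": 0}
hits, total = detection_ratio(sample_stats)
print(f"{hits}/{total} engines flagged the sample")
```

Counting `suspicious` alongside `malicious` is a judgment call; for a stricter ratio, count only `malicious`.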

Takeaway: No single platform provides a complete picture. Cross-referencing multiple sources accelerates indicator enrichment and reduces the chance of a missed detection. This workflow maps cleanly to the early stages of a formal IOC vetting process.


Entry 005: IP Allow List Automation

Date: 2025-05-24
Phase: Preparation & Containment
Tools: Python, VS Code

Developed a Python script to automate cleanup of a text-based IP allow list, replacing a manual review process that was error-prone at scale.

Implementation:

  • Parsed the allow list file and loaded a defined remove_list of unauthorized entries
  • Filtered out any matching IPs and wrote the cleaned list back to disk
  • Designed to run on a schedule or integrate into a SOAR workflow
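The steps above can be sketched as follows. This is a minimal reconstruction, not the actual script: the file name and the `remove_list` entries are placeholders, and the core filtering is pulled into a pure function so it can be tested apart from the file I/O:

```python
ALLOW_LIST = "allow_list.txt"  # placeholder path

def clean_allow_list(entries: list[str], remove_list: list[str]) -> list[str]:
    """Return the allow list with unauthorized IPs filtered out."""
    remove = set(remove_list)  # set membership keeps the filter O(n)
    return [ip for ip in entries if ip not in remove]

def run(path: str = ALLOW_LIST, remove_list: tuple = ()) -> None:
    """Read the allow list file, drop unauthorized entries, write it back."""
    with open(path) as f:
        entries = f.read().split()
    cleaned = clean_allow_list(entries, list(remove_list))
    with open(path, "w") as f:
        f.write("\n".join(cleaned) + "\n")
```

Separating the filter from the I/O also makes it easy to drop the function into a scheduled job or a SOAR playbook step later.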

Takeaway: Automating access control maintenance reduces human error in a high-consequence operation. Even a small script applied consistently is more reliable than periodic manual audits.


Summary

These entries document practical application of the NIST IR Lifecycle across detection, containment, and preparation. The common thread is treating each exercise as a real investigation: scoping the problem, selecting the right tool, documenting findings, and extracting a reusable lesson.

The journal is ongoing. Future entries will cover more complex scenarios including lateral movement, log forensics, and response automation.

This post is licensed under CC BY 4.0 by the author.