
Detection-as-Code Pipeline

Interactive framework for authoring, validating, testing, and deploying detection rules using modern CI/CD practices. Write Sigma, KQL, or SPL rules and get instant feedback on syntax, coverage, and test results.



How Detection-as-Code Works

Detection-as-Code applies software engineering practices to security detection rules. Instead of manually creating and deploying rules through SIEM GUIs, teams manage detections as version-controlled code that flows through automated pipelines.

Core Principles

  • Version Control: Every detection rule is stored in Git with full change history
  • Peer Review: Rule changes require pull request review before deployment
  • Automated Testing: Unit tests validate rules against synthetic log data
  • CI/CD Deployment: Validated rules auto-deploy to SIEM environments
  • Coverage Tracking: ATT&CK matrix coverage is continuously measured
  • Rollback Capability: Any deployment can be reverted via Git revert
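The "Automated Testing" principle often starts with a structural check: does every rule carry the fields the pipeline expects? A minimal sketch of such a check in Python (the rule is shown inline as an already-parsed dict; in practice it would be loaded from YAML):

```python
# Sketch of a required-field check for Sigma-style rules.
# REQUIRED_FIELDS is an illustrative policy, not part of the Sigma spec.

REQUIRED_FIELDS = {"title", "id", "logsource", "detection", "level"}

def missing_fields(rule: dict) -> set:
    """Return the required top-level keys the rule lacks."""
    return REQUIRED_FIELDS - rule.keys()

rule = {
    "title": "Brute Force Login Attempt",
    "id": "a1b2c3d4-0000-0000-0000-000000000000",
    "logsource": {"product": "windows", "service": "security"},
    "detection": {"selection": {"EventID": 4625}, "condition": "selection"},
}

print(missing_fields(rule))  # → {'level'}
```

A CI job that runs this over every file under `detections/` and fails the build on a non-empty result enforces the policy before a human reviewer ever sees the pull request.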

Repository Structure

detections/
  sigma/
    credential-access/
      brute-force-login.yml
      credential-dumping.yml
    execution/
      suspicious-powershell.yml
    lateral-movement/
      psexec-detection.yml
  kql/
    sentinel-analytics/
      failed-signin-anomaly.kql
      process-injection.kql
  spl/
    splunk-searches/
      auth-anomaly.spl
      c2-beacon.spl
tests/
  detections/
    test_brute_force.py
    test_powershell.py
    sample-logs/
      windows-security.json
      proxy-access.json
scripts/
  check_fields.py
  coverage_report.py
  deploy_sentinel.py
  test_rules.py
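As an illustration, a file such as `detections/sigma/credential-access/brute-force-login.yml` could hold a rule of roughly this shape (all identifiers and values here are synthetic):

```yaml
title: Brute Force Login Attempt
id: a1b2c3d4-0000-0000-0000-000000000000   # illustrative UUID
status: experimental
description: Detects repeated failed logons from a single source host
logsource:
  product: windows
  service: security
detection:
  selection:
    EventID: 4625
  condition: selection | count() by IpAddress > 10
falsepositives:
  - Misconfigured service accounts retrying with stale credentials
level: medium
tags:
  - attack.credential_access
  - attack.t1110
```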

Detection Rule Lifecycle

Author --> Commit --> PR Review --> Validate --> Test --> Merge --> Deploy --> Monitor
  |                                   |           |                  |          |
  |                                   |           |                  |          |
  v                                   v           v                  v          v
 Write rule               Check syntax &    Run against        Push to     Track FP
 in Sigma/KQL/SPL         required fields   synthetic logs     SIEM API    rate & tune
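The Validate, Test, and Deploy stages above could be wired together with a CI workflow along these lines. The script names match the repository layout shown earlier, but the workflow itself is an illustrative sketch, not a drop-in template:

```yaml
name: detection-pipeline
on:
  pull_request:
    paths: ["detections/**"]
  push:
    branches: [main]

jobs:
  validate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Check syntax and required fields
        run: python scripts/check_fields.py detections/
      - name: Run unit tests against synthetic logs
        run: python scripts/test_rules.py --logs tests/detections/sample-logs/

  deploy:
    needs: validate
    if: github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Push validated rules to SIEM
        run: python scripts/deploy_sentinel.py detections/
```

Gating `deploy` on `needs: validate` plus the branch condition means rules only reach the SIEM after passing checks and merging to `main`, and any bad deployment can be undone with a `git revert` followed by a re-run of the pipeline.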

Key Metrics

Track these metrics to measure detection engineering maturity:

  • Coverage ratio: Percentage of relevant ATT&CK techniques with at least one detection rule
  • Mean time to detect (MTTD): Average time from adversary action to alert firing
  • False positive rate: Percentage of alerts that are not true threats
  • Rule deployment frequency: How often new/updated rules reach production
  • Test coverage: Percentage of rules with corresponding unit tests
  • Mean time to deploy: Time from rule authoring to production deployment (note this is distinct from MTTD above, which abbreviates mean time to detect)
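Two of these metrics reduce to simple ratios; a quick sketch with made-up counts:

```python
def coverage_ratio(covered: int, relevant: int) -> float:
    """Fraction of relevant ATT&CK techniques with at least one rule."""
    return covered / relevant

def false_positive_rate(false_alerts: int, total_alerts: int) -> float:
    """Fraction of fired alerts that were not true threats."""
    return false_alerts / total_alerts

# e.g. 120 of 180 relevant techniques covered; 35 of 500 alerts were benign
print(f"coverage: {coverage_ratio(120, 180):.1%}")     # → coverage: 66.7%
print(f"FP rate:  {false_positive_rate(35, 500):.1%}")  # → FP rate:  7.0%
```

Trending these numbers per sprint, rather than reading them once, is what makes them a maturity measure.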

Educational Use Only

All detection rules, log samples, and test data in this tool use synthetic data only. IP addresses follow RFC 5737/RFC 1918 ranges. Never deploy sample rules to production without thorough testing against your environment's baseline.