How to use

  1. Pick a question type from the left (8 types covering MCQ through full IR tabletop).
  2. Configure topic (paste chapter content or describe a domain), audience level, difficulty, count, constraints.
  3. Click Generate Prompt. The prompt is engineered for the question type + audience + difficulty combination, with explicit grading rubric and synthetic-data discipline.
  4. Click Copy. Paste into ChatGPT / Claude / Gemini.
  5. Review the LLM output for accuracy, voice quality, factual correctness, and vendor neutrality. Reject and regenerate as needed.
  6. Once satisfied, paste into your quiz file or assessment system.

All prompts enforce synthetic-data discipline: synthetic data only (RFC 5737 documentation / RFC 1918 private IPs, *.example.com hosts), no real CVE exploit details, vendor-neutral phrasing where possible, explicit grading criteria, and defender-side framing for any offensive content.
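When reviewing LLM output, the synthetic-data rules above can be partially automated. The sketch below is a hypothetical reviewer-side checker (not part of this project) that flags IP addresses outside the RFC 5737 / RFC 1918 ranges and hostnames outside *.example.com; the regexes and function name are illustrative assumptions.

```python
import ipaddress
import re

# Ranges considered safe for synthetic examples:
# RFC 5737 documentation networks and RFC 1918 private networks.
SAFE_NETS = [ipaddress.ip_network(n) for n in (
    "192.0.2.0/24", "198.51.100.0/24", "203.0.113.0/24",  # RFC 5737
    "10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16",      # RFC 1918
)]

IP_RE = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")
HOST_RE = re.compile(r"\b[\w.-]+\.(?:com|net|org|io)\b")  # rough heuristic

def synthetic_only(text: str) -> list[str]:
    """Return a list of violations: real-looking IPs or hostnames."""
    bad = []
    for ip_str in IP_RE.findall(text):
        try:
            ip = ipaddress.ip_address(ip_str)
        except ValueError:
            continue  # e.g. 999.1.1.1 matched the regex but is not an IP
        if not any(ip in net for net in SAFE_NETS):
            bad.append(ip_str)
    for host in HOST_RE.findall(text):
        if host != "example.com" and not host.endswith(".example.com"):
            bad.append(host)
    return bad
```

An empty return means no obvious violations; a manual read is still needed for vendor neutrality and defender-side framing, which are not mechanically checkable.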
Why no built-in LLM call? This project is 100% free, has no backend, and takes no external API dependencies. Embedding an LLM call would require either an API key (cost) or an in-browser model (a 4 GB+ download). The prompt-template approach keeps the project fully static while still letting educators leverage their own LLM access.