How to use
- Pick a question type from the left (8 types covering MCQ through full IR tabletop).
- Configure topic (paste chapter content or describe a domain), audience level, difficulty, count, constraints.
- Click Generate Prompt. The prompt is engineered for the question type + audience + difficulty combination, with explicit grading rubric and synthetic-data discipline.
- Click Copy. Paste into ChatGPT / Claude / Gemini.
- Review the LLM output for accuracy, voice quality, factual correctness, and vendor neutrality. Reject + regenerate as needed.
- Once satisfied, paste into your quiz file or assessment system.
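The assembly step above can be sketched in code. This is an illustrative mock-up, not the tool's actual implementation: the function name, parameters, and prompt wording are hypothetical, chosen only to show how the selected options and the fixed rules combine into one prompt string.

```python
def build_prompt(qtype: str, topic: str, audience: str,
                 difficulty: str, count: int) -> str:
    """Assemble a generation prompt from the selected options (illustrative)."""
    return "\n".join([
        f"Generate {count} {difficulty} {qtype} questions on: {topic}",
        f"Audience: {audience}.",
        "Rules: synthetic data only (RFC 5737 / RFC 1918 IPs, *.example.com hosts);",
        "no real CVE exploits; vendor-neutral phrasing where possible;",
        "defender-side framing for any offensive content.",
        "Include an explicit grading rubric for each question.",
    ])

print(build_prompt("multiple-choice", "TLS handshake basics",
                   "intermediate", "medium", 5))
```

The fixed rules travel with every prompt, so the per-generation choices only fill in the first two lines.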
All prompts are engineered to enforce: synthetic data only (RFC 5737/1918 IPs, *.example.com hosts), no real CVE exploits, vendor-neutral phrasing where possible, explicit grading criteria, defender-side framing for any offensive content.
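The synthetic-data rule is mechanically checkable during the review step. A minimal sketch, assuming you want to flag any IP address outside the RFC 5737 documentation ranges and RFC 1918 private ranges (the function and its name are illustrative, not part of the tool):

```python
import ipaddress
import re

# Allowed synthetic ranges: RFC 5737 documentation nets + RFC 1918 private nets
ALLOWED_NETS = [ipaddress.ip_network(n) for n in (
    "192.0.2.0/24", "198.51.100.0/24", "203.0.113.0/24",   # RFC 5737
    "10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16",       # RFC 1918
)]

IP_RE = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

def off_policy_ips(text: str) -> list[str]:
    """Return IPv4 addresses in `text` that fall outside the allowed ranges."""
    bad = []
    for candidate in IP_RE.findall(text):
        try:
            ip = ipaddress.ip_address(candidate)
        except ValueError:
            continue  # e.g. "999.1.1.1" matches the regex but isn't a valid IP
        if not any(ip in net for net in ALLOWED_NETS):
            bad.append(candidate)
    return bad

print(off_policy_ips("Scan 198.51.100.7, then pivot to 8.8.8.8"))
```

Here `198.51.100.7` passes (RFC 5737), while the public resolver `8.8.8.8` is flagged for rejection.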
Why no built-in LLM call? This project is 100% free + no backend + no external API dependencies. Embedding an LLM call would require an API key (cost) or in-browser model (4GB+ download). The prompt-template approach keeps the project static while still letting educators leverage their own LLM access.