AI Prompts
- Note: Bypassing chatbot safeguards (aka jailbreaking) is against the terms of service of most AI platforms, so use them at your own risk.
- BlackFriday GPTs Prompts: Prompt Directory
- Leaked Prompts: Prompt Directory
- Prompt Engineering Guide / Discord / GitHub, Google Whitepaper, Prompt_Engineering, LearnPrompting, OpenAI Guide or Claude Prompts / Discord: Prompting Guides
- ChatGPT System Prompt: Prompt Directory
- The Big Prompt Library: Prompt Directory
- Jailbreak Listings: Prompt Directory / Jailbreaks
- InjectPrompt Companion: AI Jailbreak Prompt Assistant
- Heretic: AI Jailbreak / Anti-Censorship Tool / Discord
- promptfoo: Prompt Playgrounds / Discord / GitHub
- Tensor Trust / GitHub or Gandalf / GitHub: Prompting Skill Games
- Gobble Bot: Generate Text Files for Chatbots