AI Red Teaming Prompt Engineering Tools
There are 5 AI red teaming tools tracked. The highest-rated is dronefreak/PromptScreen at 44/100 with 9 stars.
Get all 5 projects as JSON:

```shell
curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=prompt-engineering&subcategory=ai-red-teaming&limit=20"
```
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
| # | Tool | Description | Score | Tier |
|---|---|---|---|---|
| 1 | dronefreak/PromptScreen | Protect your LLMs from prompt injection and jailbreak attacks. Easy-to-use... | 44 | Emerging |
| 2 | anmolksachan/LLMInjector | Burp Suite Extension for LLM Prompt Injection Testing | | Emerging |
| 3 | moketchups/permanently-jailbroken | We asked 6 AIs about their own programming. All 6 said jailbreaking will... | | Emerging |
| 4 | AhsanAyub/malicious-prompt-detection | Detection of malicious prompts used to exploit large language models (LLMs)... | | Experimental |
| 5 | AdityaBhatt3010/When-LinkedIn-Gmail-Obey-Hidden-AI-Prompts-Lessons-in-Indirect-Prompt-Injection | A real-world look at how hidden instructions in profiles and emails trick AI... | | Experimental |