Automatic-LLM-RedTeaming-Model A redteaming model based on LLM refusal to answer to generate Jailbreak prompts.