Image safety analysis using a CLIP-based model to filter NSFW content for creators, AI artists, and developers.
The Safety Checker node is designed to ensure that the images you generate or process do not contain any Not Safe For Work (NSFW) content. This node leverages a pre-trained CLIP-based model to analyze images and detect inappropriate content, providing a safeguard for creators who want to maintain a family-friendly or professional environment. By integrating this node into your workflow, you can automatically filter out or replace NSFW images, ensuring that your outputs adhere to desired content standards. The Safety Checker is particularly useful for AI artists and developers who need to manage content sensitivity and maintain a high level of content appropriateness in their projects.
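The node's exact implementation is not shown here, but a comparable CLIP-based NSFW check can be sketched with the safety checker that ships with the diffusers Stable Diffusion pipelines. The model name and the helper function below are illustrative assumptions, not the node's confirmed internals.

```python
# Illustrative sketch (not the node's actual source) of a CLIP-based NSFW check
# using the safety checker bundled with diffusers' Stable Diffusion pipelines.
import numpy as np
from PIL import Image
from diffusers.pipelines.stable_diffusion.safety_checker import StableDiffusionSafetyChecker
from transformers import CLIPImageProcessor

checker = StableDiffusionSafetyChecker.from_pretrained("CompVis/stable-diffusion-safety-checker")
processor = CLIPImageProcessor.from_pretrained("CompVis/stable-diffusion-safety-checker")

def check_batch(images_np: np.ndarray):
    """images_np: float array of shape [batch, height, width, 3] with values in [0, 1]."""
    pil_images = [Image.fromarray((img * 255).astype(np.uint8)) for img in images_np]
    clip_input = processor(pil_images, return_tensors="pt").pixel_values
    # Returns the batch with flagged images zeroed out, plus a per-image list of
    # booleans marking NSFW detections.
    checked, has_nsfw = checker(images=images_np, clip_input=clip_input)
    return checked, has_nsfw
```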
This parameter accepts a batch of images that you want to check for NSFW content. The images should be in a format compatible with the node, typically as numpy arrays or tensors. The node processes these images to determine if they contain any inappropriate content.
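For context, ComfyUI nodes generally exchange IMAGE data as float torch tensors shaped [batch, height, width, channels] with values between 0 and 1; the small conversion sketch below assumes that convention and uses placeholder data.

```python
# Illustrative only: adapting a numpy batch to the tensor layout ComfyUI
# typically expects before it reaches the node.
import numpy as np
import torch

np_batch = np.random.rand(4, 512, 512, 3).astype(np.float32)  # placeholder batch of 4 images
image_tensor = torch.from_numpy(np_batch)                      # shape [4, 512, 512, 3], values in [0, 1]
```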
The sensitivity parameter controls the strictness of the NSFW content detection. It is a floating-point value ranging from 0.0 to 1.0, with a default value of 0.5. A lower sensitivity value means the checker will be less strict, potentially allowing borderline content, while a higher sensitivity value makes the checker more stringent, flagging more content as NSFW. Adjusting this parameter allows you to fine-tune the balance between false positives and false negatives according to your specific needs.
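The node's exact mapping from sensitivity to a decision threshold is not documented here, but one plausible scheme looks like the following sketch; the nsfw_score input and the threshold formula are assumptions for illustration only.

```python
# Hedged sketch: one way a 0.0-1.0 sensitivity could gate a model's per-image
# NSFW score. The node's real mapping may differ.
def is_nsfw(nsfw_score: float, sensitivity: float = 0.5) -> bool:
    # Higher sensitivity -> lower score needed to flag an image -> stricter check.
    threshold = 1.0 - sensitivity
    return nsfw_score >= threshold
```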
This output provides the processed images after the NSFW check. If any NSFW content is detected, the corresponding images are replaced with black images so that no inappropriate content is displayed or used further in your workflow.
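A minimal sketch of this replacement behaviour, assuming the batch arrives as a [batch, height, width, channels] tensor along with a per-image list of flags, might look like this:

```python
# Hedged sketch of the replacement step: flagged images are zeroed out (black)
# so nothing inappropriate leaves the node. Names here are illustrative.
import torch

def black_out_flagged(images: torch.Tensor, flags: list[bool]) -> torch.Tensor:
    out = images.clone()
    for i, flagged in enumerate(flags):
        if flagged:
            out[i] = torch.zeros_like(out[i])  # replace with a black image
    return out
```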
This boolean output indicates whether any NSFW content was detected in the batch of images. A value of true means that at least one image in the batch contained NSFW content and was replaced, while a value of false means that all images passed the safety check without any issues.
You can use the nsfw output to trigger additional actions in your workflow, such as logging incidents of NSFW content or notifying administrators.
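For example, a downstream step could branch on the flag to record an incident; the handle_result helper and logger name below are hypothetical, not part of the node.

```python
# Hedged sketch of acting on the boolean nsfw output in follow-up logic.
import logging

logger = logging.getLogger("safety_checker")

def handle_result(nsfw_detected: bool) -> None:
    if nsfw_detected:
        # e.g. record the incident or notify an administrator
        logger.warning("NSFW content detected and replaced in this batch.")
```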