Automatically detect and replace NSFW content in images using a pre-trained model; ideal for AI artists and content creators.
The NSFWDetection node is designed to automatically identify and handle Not Safe For Work (NSFW) content within images. It uses a pre-trained image classification model to detect NSFW content and replaces it with an alternative image when the detection confidence exceeds a specified threshold. This is particularly useful for AI artists and content creators who need their generated images to adhere to specific content guidelines or community standards. By integrating this node into your workflow, you can automate the filtering of inappropriate content, saving time and helping ensure compliance with content policies.
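The overall flow can be pictured roughly as follows. This is a minimal sketch, not the node's actual source; the model name, label string, and function name are illustrative assumptions.

```python
# Sketch of a classify-then-replace flow, assuming a Hugging Face
# image-classification model that reports an "nsfw" label. All names here
# (model id, label, function) are assumptions for illustration only.
from transformers import pipeline
from PIL import Image

classifier = pipeline("image-classification",
                      model="Falconsai/nsfw_image_detection")  # assumed model

def filter_nsfw(input_image: Image.Image,
                alternative_image: Image.Image,
                score: float = 0.9) -> Image.Image:
    results = classifier(input_image)
    # Confidence assigned to the NSFW class, 0.0 if the label is absent.
    nsfw_conf = next((r["score"] for r in results
                      if r["label"].lower() == "nsfw"), 0.0)
    if nsfw_conf > score:
        # Resize the replacement so downstream steps receive an image
        # with the same dimensions as the original.
        return alternative_image.resize(input_image.size)
    return input_image
```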
The image parameter accepts the input image(s) to analyze for NSFW content. The images should be in a format compatible with the node's processing capabilities. The node scans these images for NSFW content based on the specified threshold.
The score parameter sets the threshold for NSFW content detection. It is a floating-point value ranging from 0.0 to 1.0, with a default of 0.9. This threshold determines the confidence level the model must reach before content is classified as NSFW: a higher score requires greater confidence before an image is flagged and replaced, while a lower score flags content more readily. Adjusting this value lets you control the sensitivity of the NSFW detection process.
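As a rough illustration of how the threshold behaves (the confidence value below is made up):

```python
# Suppose the classifier reports an NSFW confidence of 0.85 for some image.
nsfw_conf = 0.85
print(nsfw_conf > 0.9)  # False -> image passes through unchanged
print(nsfw_conf > 0.8)  # True  -> image is replaced with alternative_image
```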
The alternative_image parameter accepts the image that will replace any detected NSFW content in the input images. It should be in a format compatible with the node's processing capabilities. When NSFW content is detected with a confidence score above the specified threshold, the node resizes this alternative image and substitutes it for the flagged input, keeping the output appropriate.
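The replacement step implies resizing the alternative image to the input's dimensions. A minimal sketch of that resize, assuming ComfyUI's usual IMAGE tensor layout of [batch, height, width, channels]; the function name is an assumption:

```python
import torch
import torch.nn.functional as F

def match_size(alternative: torch.Tensor, reference: torch.Tensor) -> torch.Tensor:
    """Resize 'alternative' to the spatial size of 'reference' ([B, H, W, C] floats)."""
    _, h, w, _ = reference.shape
    # F.interpolate expects [batch, channels, height, width], so permute around it.
    resized = F.interpolate(alternative.permute(0, 3, 1, 2), size=(h, w),
                            mode="bilinear", align_corners=False)
    return resized.permute(0, 2, 3, 1)
```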
The image output returns the processed images after NSFW detection and replacement. Any image whose NSFW confidence exceeded the specified threshold is replaced with the provided alternative image. The output images keep the same format as the input images, ensuring consistency in your workflow.
Adjust the score parameter to fine-tune the sensitivity of NSFW content detection based on your specific needs; a higher threshold will result in fewer false positives but may miss some NSFW content. Ensure the alternative_image is appropriate and visually consistent with the input images to maintain the overall aesthetic of your work.
If the score parameter is set outside the valid range of 0.0 to 1.0, the node will report an error. Set the score parameter to a value within the valid range (0.0 to 1.0) to ensure proper functioning of the node.
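The kind of check that catches this error early looks roughly like the sketch below; the exact message raised by the node may differ, and the function name is an assumption.

```python
def validate_score(score: float) -> float:
    # Reject values outside the supported threshold range before running detection.
    if not 0.0 <= score <= 1.0:
        raise ValueError(f"score must be between 0.0 and 1.0, got {score}")
    return score
```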