ComfyUI Node: NSFWDetection

Class Name

NSFWDetection

Category
NSFWDetection
Author
trumanwong (Account age: 3074 days)
Extension
ComfyUI-NSFW-Detection
Last Updated
2024-08-03
GitHub Stars
0.02K

How to Install ComfyUI-NSFW-Detection

Install this extension via the ComfyUI Manager by searching for ComfyUI-NSFW-Detection
  • 1. Click the Manager button in the main menu
  • 2. Select Custom Nodes Manager button
  • 3. Enter ComfyUI-NSFW-Detection in the search bar
After installation, click the Restart button to restart ComfyUI. Then, manually refresh your browser to clear the cache and access the updated list of nodes.

Visit ComfyUI Online for a ready-to-use ComfyUI environment

  • Free trial available
  • High-speed GPU machines
  • 200+ preloaded models/nodes
  • Freedom to upload custom models/nodes
  • 50+ ready-to-run workflows
  • 100% private workspace with up to 200GB storage
  • Dedicated Support

Run ComfyUI Online

NSFWDetection Description

Automatically detect and replace NSFW content in images using a pre-trained model, ideal for AI artists and content creators.

NSFWDetection:

The NSFWDetection node is designed to automatically identify and handle Not Safe For Work (NSFW) content within images. This node leverages a pre-trained image classification model to detect NSFW content and replace it with an alternative image if the detected content exceeds a specified threshold. This functionality is particularly useful for AI artists and content creators who need to ensure their generated images adhere to specific content guidelines or community standards. By integrating this node into your workflow, you can automate the process of filtering out inappropriate content, thereby saving time and ensuring compliance with content policies.
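The overall flow is easy to sketch. The following is a minimal illustration, not the extension's actual source: it assumes the node wraps a Hugging Face image-classification pipeline around the "Falconsai/nsfw_image_detection" checkpoint (the model named in the Errors section below) and that images arrive as ComfyUI-style [batch, height, width, channel] float tensors with values in [0, 1].

    import numpy as np
    import torch
    from PIL import Image
    from transformers import pipeline

    # Assumed checkpoint; it matches the one named under "Model not found" below.
    classifier = pipeline("image-classification", model="Falconsai/nsfw_image_detection")

    def filter_nsfw(images, score=0.9, alternative=None):
        """images, alternative: float tensors shaped [B, H, W, C] in [0, 1]."""
        results = []
        for img in images:
            pil = Image.fromarray((img.cpu().numpy() * 255).astype(np.uint8))
            preds = classifier(pil)  # e.g. [{"label": "nsfw", "score": 0.97}, ...]
            nsfw_score = next((p["score"] for p in preds if p["label"] == "nsfw"), 0.0)
            if nsfw_score > score and alternative is not None:
                # Resize the alternative image to the flagged image's size, then swap it in.
                alt = torch.nn.functional.interpolate(
                    alternative.permute(0, 3, 1, 2),  # -> [B, C, H, W] for interpolate
                    size=img.shape[:2], mode="bilinear",
                ).permute(0, 2, 3, 1)[0]
                results.append(alt)
            else:
                results.append(img)
        return torch.stack(results)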

NSFWDetection Input Parameters:

image

This parameter accepts the input image(s) that you want to analyze for NSFW content. The images should be in a format compatible with the node's processing capabilities. The node will scan these images to detect any NSFW content based on the specified threshold.

score

The score parameter sets the threshold for NSFW content detection. It is a floating-point value that ranges from 0.0 to 1.0, with a default value of 0.9, and represents the minimum confidence the classifier must report before an image is treated as NSFW. A higher score makes the node more conservative about flagging content (fewer images are replaced, at the risk of letting some NSFW content through), while a lower score flags content more aggressively. Adjusting this value lets you control the sensitivity of the NSFW detection process.
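For instance, given a hypothetical classifier output, the same image is replaced at the default threshold but kept if the threshold is raised:

    # Hypothetical scores for one image; label names follow the sketch above.
    preds = [{"label": "nsfw", "score": 0.93}, {"label": "normal", "score": 0.07}]
    nsfw_score = next(p["score"] for p in preds if p["label"] == "nsfw")
    print(nsfw_score > 0.90)  # True  -> replaced at the default score of 0.9
    print(nsfw_score > 0.95)  # False -> kept after raising the threshold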

alternative_image

This parameter accepts an alternative image that will replace any input image flagged as NSFW. The alternative image should be in a format compatible with the node's processing capabilities. When NSFW content is detected with a confidence score above the specified threshold, the node resizes this alternative image to match the input's dimensions and substitutes it, ensuring the output remains appropriate.
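As a hedged illustration of the resize step (file names are placeholders, and the node's actual resampling filter is not documented here):

    from PIL import Image

    flagged = Image.open("flagged.png")                    # placeholder input
    alt = Image.open("alternative.png").convert("RGB")     # placeholder replacement
    replacement = alt.resize(flagged.size, Image.LANCZOS)  # match the input's dimensions
    replacement.save("output.png")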

NSFWDetection Output Parameters:

image

The output parameter image returns the processed images after NSFW detection and replacement. If any NSFW content was detected and exceeded the specified threshold, it will be replaced with the provided alternative image. The output images will be in the same format as the input images, ensuring consistency in your workflow.

NSFWDetection Usage Tips:

  • Adjust the score parameter to fine-tune the sensitivity of NSFW content detection based on your specific needs. A higher threshold will result in fewer false positives but may miss some NSFW content.
  • Ensure the alternative_image is appropriate and visually consistent with the input images to maintain the overall aesthetic of your work.
  • Use high-quality input images to improve the accuracy of the NSFW detection model.

NSFWDetection Common Errors and Solutions:

"Model not found"

  • Explanation: This error occurs when the pre-trained NSFW detection model is not available or cannot be loaded.
  • Solution: Ensure that the model "Falconsai/nsfw_image_detection" is correctly installed and accessible in your environment. You may need to download or update the model.
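One quick way to check that the model loads, assuming a standard transformers installation (this downloads the checkpoint on first use):

    from transformers import pipeline

    classifier = pipeline("image-classification", model="Falconsai/nsfw_image_detection")
    print(classifier.model.config.id2label)  # expect labels such as 'normal' and 'nsfw'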

"Invalid image format"

  • Explanation: This error occurs when the input or alternative images are not in a compatible format.
  • Solution: Verify that the images are in a supported format (e.g., JPEG, PNG) and are correctly preprocessed before being fed into the node.
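A hedged pre-processing step is to normalize images to RGB before loading them, for example (file names are placeholders):

    from PIL import Image

    img = Image.open("input.webp").convert("RGB")  # drop alpha/palette modes
    img.save("input.png")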

"Threshold value out of range"

  • Explanation: This error occurs when the score parameter is set outside the valid range of 0.0 to 1.0.
  • Solution: Adjust the score parameter to a value within the valid range (0.0 to 1.0) to ensure proper functioning of the node.

NSFWDetection Related Nodes

Go back to the extension to check out more related nodes.
ComfyUI-NSFW-Detection