
ComfyUI Node: Safety Checker

Class Name: Safety Checker
Category: image
Author: 42lux (Account age: 3816 days)
Extension: ComfyUI-safety-checker
Last Updated: 5/22/2024
GitHub Stars: 0.0K

How to Install ComfyUI-safety-checker

Install this extension via the ComfyUI Manager by searching for ComfyUI-safety-checker:
  1. Click the Manager button in the main menu.
  2. Select the Custom Nodes Manager button.
  3. Enter ComfyUI-safety-checker in the search bar.
After installation, click the Restart button to restart ComfyUI. Then manually refresh your browser to clear the cache and access the updated list of nodes.


Safety Checker Description

Image safety analysis using a CLIP-based model to filter NSFW content, for creators, AI artists, and developers.

Safety Checker:

The Safety Checker node ensures that the images you generate or process do not contain Not Safe For Work (NSFW) content. It uses a pre-trained CLIP-based model to analyze images and detect inappropriate material, providing a safeguard for creators who want to maintain a family-friendly or professional environment. By integrating this node into your workflow, you can automatically filter out or replace NSFW images so that your outputs adhere to your content standards. It is particularly useful for AI artists and developers who need to manage content sensitivity in their projects.
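Under the hood, this style of check is typically performed by running CLIP image features through a dedicated safety-checker head. The sketch below shows how such a check is commonly wired up with the transformers and diffusers libraries; the checkpoint name and call pattern are assumptions for illustration and may not match the node's exact internals.

```python
# Minimal sketch of a CLIP-based NSFW check using the widely available
# stable-diffusion safety-checker weights (an assumed checkpoint, not
# necessarily the one this node ships with).
import numpy as np
from transformers import CLIPImageProcessor
from diffusers.pipelines.stable_diffusion.safety_checker import StableDiffusionSafetyChecker

checker = StableDiffusionSafetyChecker.from_pretrained(
    "CompVis/stable-diffusion-safety-checker"
)
processor = CLIPImageProcessor()  # default CLIP preprocessing (resize to 224x224, normalize)

def check_batch(pil_images):
    """Return (images with flagged frames blacked out, per-image NSFW flags)."""
    clip_input = processor(images=pil_images, return_tensors="pt").pixel_values
    np_images = [np.array(img, dtype=np.float32) / 255.0 for img in pil_images]
    checked, has_nsfw = checker(images=np_images, clip_input=clip_input)
    return checked, has_nsfw
```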

Safety Checker Input Parameters:

images

This parameter accepts the batch of images you want to check for NSFW content. The images should be in a format compatible with the node, typically numpy arrays or torch tensors. The node processes these images to determine whether they contain inappropriate content.
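As a point of reference, ComfyUI conventionally passes IMAGE data between nodes as float tensors in the layout sketched below; treat this as an assumption based on standard ComfyUI conventions rather than a statement about the node's internals.

```python
# ComfyUI IMAGE convention: float32 torch tensor of shape
# (batch, height, width, channels) with values in [0, 1].
import torch

batch = torch.rand(2, 512, 512, 3)   # two dummy 512x512 RGB frames
assert batch.dtype == torch.float32
assert 0.0 <= batch.min() and batch.max() <= 1.0
```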

sensitivity

The sensitivity parameter controls the strictness of the NSFW content detection. It is a floating-point value ranging from 0.0 to 1.0, with a default value of 0.5. A lower sensitivity value means the checker will be less strict, potentially allowing borderline content, while a higher sensitivity value makes the checker more stringent, flagging more content as NSFW. Adjusting this parameter allows you to fine-tune the balance between false positives and false negatives according to your specific needs.
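Exactly how the 0.0-1.0 slider maps onto the model's internal scores is an implementation detail of the node; the hypothetical sketch below only illustrates the general idea of shifting per-concept thresholds with a sensitivity value.

```python
# Hypothetical sensitivity-to-threshold mapping; the node's real scoring
# logic may differ. Each concept score is a similarity value produced by
# the CLIP-based checker for one NSFW concept.
def flag_nsfw(concept_scores, concept_thresholds, sensitivity=0.5):
    # At the default of 0.5 the stock thresholds apply; higher sensitivity
    # lowers the bar for flagging, lower sensitivity raises it.
    adjustment = sensitivity - 0.5
    return any(score - threshold + adjustment > 0
               for score, threshold in zip(concept_scores, concept_thresholds))

print(flag_nsfw([0.18], [0.20], sensitivity=0.5))  # False: below threshold
print(flag_nsfw([0.18], [0.20], sensitivity=0.6))  # True: stricter setting flags it
```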

Safety Checker Output Parameters:

IMAGE

This output provides the processed images after the NSFW check. If any NSFW content is detected, the corresponding images are replaced with a black output to ensure that no inappropriate content is displayed or used further in your workflow.

nsfw

This boolean output indicates whether any NSFW content was detected in the batch of images. A value of true means that at least one image in the batch contained NSFW content and was replaced, while a value of false means that all images passed the safety check without any issues.
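Taken together, the two outputs behave roughly like the sketch below, which assumes ComfyUI-style IMAGE tensors and a per-image flag list coming from the checker.

```python
# Replace flagged frames with black and reduce the per-image flags to a
# single batch-level nsfw value; assumes a (batch, H, W, C) float tensor.
import torch

def apply_safety(images: torch.Tensor, has_nsfw: list):
    images = images.clone()
    for i, flagged in enumerate(has_nsfw):
        if flagged:
            images[i] = torch.zeros_like(images[i])  # black output
    return images, any(has_nsfw)                     # (IMAGE, nsfw)
```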

Safety Checker Usage Tips:

  • Adjust the sensitivity parameter based on the context of your project. For professional or family-friendly environments, a higher sensitivity value is recommended to ensure stricter content filtering.
  • Regularly update the safety checker model to benefit from the latest improvements and enhancements in NSFW content detection.
  • Use the boolean nsfw output to trigger additional actions in your workflow, such as logging incidents of NSFW content or notifying administrators (see the sketch below).
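For the last tip, the toy example below shows one way to react to the flag when you drive ComfyUI from your own Python script; the logger name and message are placeholders, not part of the extension.

```python
# Act on the batch-level nsfw flag in a driving script (placeholder names).
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("safety-checker-workflow")

def handle_result(images, nsfw: bool):
    if nsfw:
        log.warning("NSFW content detected; flagged frames were blacked out.")
    return images
```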

Safety Checker Common Errors and Solutions:

Error initializing Safety_Checker: <error_message>

  • Explanation: This error occurs when the Safety Checker node fails to initialize, possibly due to missing model files or incorrect paths.
  • Solution: Ensure that the required model files are downloaded and placed in the correct directory. Verify the paths and permissions to the model files.

Error in numpy_to_pil: <error_message>

  • Explanation: This error happens when there is an issue converting numpy arrays to PIL images, which could be due to incompatible image formats or corrupted data.
  • Solution: Check the format and integrity of the input images. Ensure that the images are in a compatible format and not corrupted; a typical conversion is sketched below.
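The sketch below shows a typical conversion, assuming float arrays in [0, 1]; the function name matches the one in the error message, but the body is an illustration, not the extension's actual code. Out-of-range values or an unexpected dtype are common causes of this failure.

```python
# Illustrative numpy-to-PIL conversion for float image data in [0, 1].
import numpy as np
from PIL import Image

def numpy_to_pil(array: np.ndarray) -> Image.Image:
    array = np.clip(array, 0.0, 1.0)  # guard against out-of-range values
    return Image.fromarray((array * 255).round().astype(np.uint8))

img = numpy_to_pil(np.random.rand(64, 64, 3).astype(np.float32))
print(img.size, img.mode)  # (64, 64) RGB
```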

Safety Checker: NSFW content detected. Replaced with black output.

  • Explanation: This warning indicates that the Safety Checker has detected NSFW content in one or more images and has replaced them with a black output.
  • Solution: Review the input images to understand why they were flagged as NSFW. Adjust the sensitivity parameter if necessary to better suit your content standards.

Safety Checker Related Nodes

Go back to the ComfyUI-safety-checker extension page to check out more related nodes.