ComfyUI Extension: ComfyUI-safety-checker

  • Repo Name: ComfyUI-safety-checker
  • Author: 42lux (Account age: 3816 days)
  • Nodes: 1
  • Last Updated: 2024-05-22
  • GitHub Stars: 0.03K

How to Install ComfyUI-safety-checker

Install this extension via the ComfyUI Manager by searching for ComfyUI-safety-checker:
  1. Click the Manager button in the main menu.
  2. Select the Custom Nodes Manager.
  3. Enter ComfyUI-safety-checker in the search bar and install it.
After installation, click the Restart button to restart ComfyUI, then manually refresh your browser to clear the cache and load the updated list of nodes.

Visit ComfyUI Online for a ready-to-use ComfyUI environment

  • Free trial available
  • High-speed GPU machines
  • 200+ preloaded models/nodes
  • Freedom to upload custom models/nodes
  • 50+ ready-to-run workflows
  • 100% private workspace with up to 200GB storage
  • Dedicated Support

Run ComfyUI Online

ComfyUI-safety-checker Description

ComfyUI-safety-checker is a node for ComfyUI designed to identify and filter NSFW content, ensuring safety and appropriateness in user-generated outputs.

ComfyUI-safety-checker Introduction

ComfyUI-safety-checker is an extension designed to help AI artists ensure that their generated or processed images adhere to safety guidelines by identifying and handling Not Safe For Work (NSFW) content. The extension integrates a CLIP-based safety checker built on the pretrained CompVis Stable Diffusion Safety Checker model, which was trained to detect inappropriate content in images. With this tool, any image flagged as NSFW is automatically replaced with a solid black placeholder, ensuring that your work remains appropriate for all audiences.

How ComfyUI-safety-checker Works

The ComfyUI-safety-checker operates by analyzing input images to detect NSFW content. It uses a CLIP-based model trained to recognize explicit material. When an image is processed through the extension, the safety checker evaluates it against its detection threshold; if inappropriate content is detected, the entire image is replaced with a solid black placeholder. This keeps your artwork appropriate without requiring manual inspection.

Think of the safety checker as a vigilant guard that scans each image for any signs of inappropriate content. If it finds something that doesn't meet the safety standards, it swiftly covers it up to prevent it from being displayed.
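To make the mechanism concrete, here is a minimal, hedged sketch of how the underlying CompVis checker can be run on a single image with the diffusers library. This is not the extension's node code: the file paths, the batch handling, and the choice of the openai/clip-vit-large-patch14 image processor are assumptions for illustration.

```python
# Minimal sketch (not the extension's actual node code): run the CompVis
# safety checker from the diffusers library on one image and save the
# blacked-out result if it is flagged. Paths and the image-processor
# choice are assumptions.
import numpy as np
from PIL import Image
from transformers import CLIPImageProcessor
from diffusers.pipelines.stable_diffusion.safety_checker import StableDiffusionSafetyChecker

checker = StableDiffusionSafetyChecker.from_pretrained("CompVis/stable-diffusion-safety-checker")
processor = CLIPImageProcessor.from_pretrained("openai/clip-vit-large-patch14")  # ViT-L/14 preprocessing

image = Image.open("output.png").convert("RGB")              # image to screen (assumed path)
np_images = np.array(image, dtype=np.float32)[None] / 255.0  # batch of one, HWC in [0, 1]

clip_input = processor(images=image, return_tensors="pt").pixel_values
checked, has_nsfw = checker(images=np_images, clip_input=clip_input)

if has_nsfw[0]:
    # The checker zeroes flagged images, so this saves a solid black placeholder.
    Image.fromarray((checked[0] * 255).astype(np.uint8)).save("output_checked.png")
```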

ComfyUI-safety-checker Features

Safety Checker Node

The core feature of the ComfyUI-safety-checker is the Safety Checker Node. This node can be connected directly to an image output or placed after a VAE (Variational Autoencoder) Decode node. It scans each image for NSFW content and replaces any flagged image with a solid black placeholder.

Sensitivity Adjustment

One of the key features of the ComfyUI-safety-checker is the ability to adjust the sensitivity level of the NSFW detection. This allows you to customize how strict the safety checks are based on your needs.

  • Sensitivity Level 0: No filtering. This setting will not detect or replace any content.
  • Sensitivity Level 0.5: The standard detection threshold for explicit nudity, balancing between being too lenient and too strict.
  • Sensitivity Level 1.0: A stricter setting that also flags images with lingerie or underwear, providing a higher level of content filtering.
By adjusting the sensitivity, you can control the strictness of the safety checks to match the context of your work; a sketch of how such a threshold could be applied follows below.
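As a rough illustration of what a sensitivity knob can do, the sketch below computes the CompVis checker's per-concept scores and applies a threshold offset derived from the sensitivity value. The helper names (nsfw_scores, is_nsfw), the margin formula, and the exact mapping from sensitivity to threshold are assumptions, and the stock checker's special-care adjustment is omitted; the extension's node may implement its sensitivity input differently.

```python
# Hypothetical sketch of a sensitivity-adjusted threshold on top of the
# CompVis checker's concept scores. This is NOT the extension's actual
# implementation; helper names and the margin formula are illustrative.
import torch
import torch.nn.functional as F
from diffusers.pipelines.stable_diffusion.safety_checker import StableDiffusionSafetyChecker

@torch.no_grad()
def nsfw_scores(checker: StableDiffusionSafetyChecker, clip_pixel_values: torch.Tensor) -> torch.Tensor:
    """Per-image, per-concept NSFW scores; the stock checker flags scores > 0."""
    pooled = checker.vision_model(clip_pixel_values)[1]           # CLIP pooled output
    image_embeds = checker.visual_projection(pooled)              # project into CLIP joint space
    cos = F.normalize(image_embeds) @ F.normalize(checker.concept_embeds).t()
    return cos - checker.concept_embeds_weights                   # shift by per-concept thresholds

def is_nsfw(scores: torch.Tensor, sensitivity: float) -> torch.Tensor:
    """Boolean flag per image under an assumed sensitivity-to-margin mapping."""
    if sensitivity <= 0:
        # Sensitivity 0: no filtering at all.
        return torch.zeros(scores.shape[0], dtype=torch.bool)
    # Sensitivity 0.5 keeps the stock threshold; higher values loosen it so
    # borderline images (e.g. lingerie) are also flagged. 0.02 is illustrative.
    margin = 0.02 * (sensitivity - 0.5)
    return (scores + margin > 0).any(dim=-1)
```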

Example Node Setup

To help you get started, here is an example of how to set up the Safety Checker Node in your workflow:

[Screenshot: example workflow with the Safety Checker Node connected to the image output]

This visual guide shows how to connect the node to your image processing pipeline, ensuring that all images are checked for safety before being finalized.
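If you build workflows through ComfyUI's API (prompt) format rather than the graph editor, the tail of such a setup might look roughly like the sketch below, written as a Python dict. The class_type "Safety Checker" and its input names (image, sensitivity) are placeholders, since the node's exact registered name is not documented here; the upstream nodes ("3" = KSampler, "4" = CheckpointLoaderSimple) are omitted for brevity, so this fragment is for illustration only.

```python
# Hedged sketch of the tail of a ComfyUI workflow in API (prompt) format.
# The "Safety Checker" class_type and its input names are assumptions --
# confirm them against the node as it appears in your ComfyUI install.
workflow_tail = {
    "8": {  # decode latents to pixels
        "class_type": "VAEDecode",
        "inputs": {"samples": ["3", 0], "vae": ["4", 2]},
    },
    "9": {  # screen the decoded image before saving (hypothetical node name)
        "class_type": "Safety Checker",
        "inputs": {"image": ["8", 0], "sensitivity": 0.5},
    },
    "10": {  # save the (possibly blacked-out) result
        "class_type": "SaveImage",
        "inputs": {"images": ["9", 0], "filename_prefix": "checked"},
    },
}
```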

Troubleshooting ComfyUI-safety-checker

If you encounter any issues while using the ComfyUI-safety-checker, here are some common problems and their solutions:

Issue: The safety checker is too strict or too lenient.

Solution: Adjust the sensitivity level of the NSFW detection. If the checker is too strict, lower the sensitivity level. If it is too lenient, increase the sensitivity level to ensure more thorough filtering.

Issue: The safety checker is not detecting any NSFW content.

Solution: Ensure that the Safety Checker Node is correctly connected to the image input. Double-check the node setup and make sure the sensitivity level is set appropriately.

Issue: The processed images are not being replaced with black placeholders.

Solution: Verify that the Safety Checker Node is active and properly configured. If the node is correctly set up, try reprocessing the images to see if the issue persists.

Learn More about ComfyUI-safety-checker

For additional resources and support, consider exploring the following:

  • CompVis Stable Diffusion Safety Checker: Learn more about the pretrained model used by the ComfyUI-safety-checker.
  • Community Forums: Join online forums and communities where you can ask questions, share experiences, and get support from other AI artists and developers.
  • Tutorials and Documentation: Look for tutorials and detailed documentation that provide step-by-step guides and deeper insights into using the ComfyUI-safety-checker effectively.
By leveraging these resources, you can enhance your understanding and make the most of the ComfyUI-safety-checker extension.
