
ComfyUI Extension: ComfyUI-NSFW-Detection

Repo Name: ComfyUI-NSFW-Detection
Author: trumanwong (Account age: 3074 days)
Nodes: 1
Last Updated: 8/3/2024
GitHub Stars: 0.0K

How to Install ComfyUI-NSFW-Detection

Install this extension via the ComfyUI Manager by searching for ComfyUI-NSFW-Detection:
  1. Click the Manager button in the main menu.
  2. Click the Custom Nodes Manager button.
  3. Enter ComfyUI-NSFW-Detection in the search bar and install the extension.
After installation, click the Restart button to restart ComfyUI, then manually refresh your browser to clear the cache and load the updated list of nodes.


ComfyUI-NSFW-Detection Description

ComfyUI-NSFW-Detection integrates NSFW content detection into ComfyUI, enabling the identification and filtering of inappropriate material within the user interface.

ComfyUI-NSFW-Detection Introduction

ComfyUI-NSFW-Detection is an extension designed to help AI artists ensure that the images they generate are appropriate for all audiences. This tool automatically detects whether an image is Not Safe For Work (NSFW) using advanced machine learning models. If an image is classified as NSFW, the extension can replace it with an alternative image, ensuring that your content remains suitable for your intended audience. This can be particularly useful for artists who want to maintain a family-friendly portfolio or need to comply with specific content guidelines.

How ComfyUI-NSFW-Detection Works

At its core, ComfyUI-NSFW-Detection uses a machine learning model to analyze images and determine their suitability. Here’s a simple breakdown of how it works:

  1. Image Input: You provide an image that you want to check for NSFW content.
  2. Analysis: The extension uses a pre-trained machine learning model to analyze the image. This model has been trained on a large dataset of images labeled as either safe or NSFW.
  3. Classification: The model assigns a score to the image based on its content. If the score exceeds a certain threshold, the image is classified as NSFW.
  4. Output: If the image is classified as NSFW, the extension can automatically replace it with an alternative image that you provide. Otherwise, the original image is retained. This process helps you quickly and efficiently filter out inappropriate content without manual inspection.
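To make this flow concrete, here is a minimal sketch of the detect-and-replace logic described in the steps above. It assumes a Hugging Face image-classification pipeline as the classifier; the model name Falconsai/nsfw_image_detection is a publicly available NSFW classifier used here for illustration (not necessarily the exact model this extension ships with), and filter_nsfw is a hypothetical helper, not the extension's actual API.

```python
# A minimal sketch of the detect-and-replace flow, assuming a Hugging Face
# image-classification pipeline. The model name is illustrative.
from PIL import Image
from transformers import pipeline

classifier = pipeline("image-classification", model="Falconsai/nsfw_image_detection")

def filter_nsfw(image_path: str, alternative_path: str, threshold: float = 0.5) -> Image.Image:
    """Return the original image if it is safe, else the alternative image."""
    image = Image.open(image_path)
    # The pipeline returns a score per label, e.g.
    # [{'label': 'nsfw', 'score': 0.97}, {'label': 'normal', 'score': 0.03}]
    results = classifier(image)
    nsfw_score = next((r["score"] for r in results if r["label"] == "nsfw"), 0.0)
    if nsfw_score > threshold:
        return Image.open(alternative_path)  # step 4: replace the flagged image
    return image  # safe: keep the original
```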

ComfyUI-NSFW-Detection Features

ComfyUI-NSFW-Detection comes with several features designed to make it easy to use and highly effective:

  • Automatic Detection: The extension automatically detects NSFW content in images, saving you time and effort.
  • Customizable Threshold: You can set the threshold score for what constitutes NSFW content. This allows you to adjust the sensitivity of the detection to suit your needs.
  • Alternative Image Replacement: If an image is classified as NSFW, you can specify an alternative image to be used instead. This ensures that your content remains appropriate without any gaps.
  • Easy Integration: The extension integrates seamlessly with ComfyUI, making it easy to add to your existing workflow.

Customization Example

For instance, if you set a lower threshold score, the model will be more sensitive and classify more images as NSFW. Conversely, a higher threshold will make the model less sensitive, allowing more images to pass through as safe.
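Continuing the hypothetical filter_nsfw sketch from earlier, the same image can be checked at two sensitivities:

```python
# Lower threshold = more sensitive: more images get flagged and replaced.
strict = filter_nsfw("render.png", "placeholder.png", threshold=0.3)

# Higher threshold = less sensitive: more images pass through as safe.
lenient = filter_nsfw("render.png", "placeholder.png", threshold=0.8)
```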

ComfyUI-NSFW-Detection Models

The extension uses a single pre-trained model for NSFW detection. The model has been trained on a diverse dataset for accuracy and reliability, and although there is currently only one model, it is designed to handle a wide range of content and scenarios.

What's New with ComfyUI-NSFW-Detection

The extension is regularly updated to improve its performance and add new features. Here are some of the recent updates:

  • Version 1.1: Improved model accuracy and reduced false positives.
  • Version 1.2: Added support for customizable threshold scores.
  • Version 1.3: Enhanced integration with ComfyUI for a smoother user experience.

These updates ensure that the extension remains effective and easy to use, providing you with the best possible tool for managing your content.

Troubleshooting ComfyUI-NSFW-Detection

Here are some common issues you might encounter while using the extension and how to solve them:

Issue: The extension is not detecting any NSFW content.

  • Solution: Check the threshold score. If it is set too high, the model may not classify any images as NSFW. Lower the threshold and try again.

Issue: The extension is classifying too many images as NSFW.

  • Solution: Increase the threshold score to make the model less sensitive.

Issue: The alternative image is not being displayed.

  • Solution: Ensure that the path to the alternative image is correct and that the image file is accessible.

Frequently Asked Questions

Q: Can I use my own model for NSFW detection?
A: Currently, the extension uses a pre-trained model, but future updates may include support for custom models.

Q: How do I change the threshold score?
A: You can adjust the threshold score in the settings of the NSFWDetection class in the node.py file, as sketched below.
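For orientation, here is an illustrative sketch of how a ComfyUI node can expose such a threshold as an input. The class name NSFWDetection comes from the FAQ above, but the input names, defaults, and the classify helper are hypothetical and may differ from the extension's actual node.py.

```python
# Illustrative ComfyUI node sketch; input names and the classify() helper
# are hypothetical, not the extension's exact source.
class NSFWDetection:
    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {
                "image": ("IMAGE",),                  # image to check
                "score": ("FLOAT", {"default": 0.5,   # NSFW threshold
                                    "min": 0.0, "max": 1.0, "step": 0.01}),
                "alternative_image": ("IMAGE",),      # replacement if flagged
            }
        }

    RETURN_TYPES = ("IMAGE",)
    FUNCTION = "run"
    CATEGORY = "image"

    def run(self, image, score, alternative_image):
        nsfw_score = classify(image)  # hypothetical wrapper around the model
        result = alternative_image if nsfw_score > score else image
        return (result,)
```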

Learn More about ComfyUI-NSFW-Detection

To learn more about ComfyUI-NSFW-Detection, explore the project's GitHub repository (trumanwong/ComfyUI-NSFW-Detection), which hosts the source code, documentation, and an issue tracker where the community can help answer any questions you might have.
