
ComfyUI Node: ❌ Unload Models

Class Name: UnloadModels
Category: 🐺 VykosX-ControlFlowUtils
Author: VykosX (Account age: 2024 days)
Extension: ControlFlowUtils
Last Updated: 10/1/2024
GitHub Stars: 0.1K

How to Install ControlFlowUtils

Install this extension via the ComfyUI Manager by searching for ControlFlowUtils:
  1. Click the Manager button in the main menu.
  2. Select the Custom Nodes Manager button.
  3. Enter ControlFlowUtils in the search bar and install the matching entry.
After installation, click the Restart button to restart ComfyUI, then manually refresh your browser to clear the cache and load the updated list of nodes.


❌ Unload Models Description

Manages VRAM usage by unloading models in ComfyUI on demand, for smoother workflows and faster generation.

❌ Unload Models:

The UnloadModels node helps manage VRAM (video random-access memory) usage by forcibly unloading every model that ComfyUI currently holds in VRAM. This is particularly useful if you are running into VRAM limits, or if your workflow uses multiple models that do not all need to be loaded at the same time. By inserting this node into your workflow, you can clear VRAM on demand for smoother and faster generation. The node offers two modes of operation, selected by the ForceUnload parameter: unload all models immediately, which may disrupt the workflow if pending nodes still need access to them, or request unloading and let ComfyUI free the models at its earliest convenience. The latter is equivalent to manually clicking the Unload Models button in ComfyUI.
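For readers who want to see how a node like this can sit on top of ComfyUI's model-management API, below is a minimal sketch, not the extension's actual source. The calls to comfy.model_management.unload_all_models() and soft_empty_cache() are real ComfyUI functions; the class name, category string, wildcard-type helper, and the deferred-unload branch (setting the queue flag that ComfyUI's /free endpoint and the UI's Unload Models button use) are assumptions and may differ between ComfyUI versions.

import comfy.model_management as mm


class AnyType(str):
    # Common community workaround for a wildcard ("*") socket that matches any type.
    def __ne__(self, other):
        return False


any_type = AnyType("*")


class UnloadModelsSketch:  # hypothetical name, not the extension's class
    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {"ForceUnload": ("BOOLEAN", {"default": False})},
            "optional": {"Passthrough": (any_type,)},
        }

    RETURN_TYPES = (any_type,)
    RETURN_NAMES = ("Output",)
    FUNCTION = "run"
    CATEGORY = "Examples/Memory"  # placeholder category

    def run(self, ForceUnload=False, Passthrough=None):
        if ForceUnload:
            # Evict every loaded model right away and release cached VRAM.
            mm.unload_all_models()
            mm.soft_empty_cache()
        else:
            # Deferred path (assumption about the mechanism): set the queue flag
            # used by ComfyUI's /free endpoint, so models are unloaded between
            # queue items rather than mid-execution.
            try:
                from server import PromptServer
                PromptServer.instance.prompt_queue.set_flag("unload_models", True)
            except Exception:
                mm.unload_all_models()  # fallback: unload immediately
        # Forward whatever arrived on Passthrough so downstream nodes keep running.
        return (Passthrough,)


NODE_CLASS_MAPPINGS = {"UnloadModelsSketch": UnloadModelsSketch}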

❌ Unload Models Input Parameters:

Passthrough

This parameter accepts data of any type and forwards it to downstream nodes once the VRAM has been cleared, so the workflow continues without interruption. The data type for this parameter is any_type.

ForceUnload

This boolean parameter selects how models are unloaded. If set to True, all models are unloaded immediately, which can cause issues if pending nodes still need access to them. If set to False, model unloading is merely requested, and ComfyUI handles it at its earliest convenience, analogous to clicking the Unload Models button directly in ComfyUI. The default value is False.

❌ Unload Models Output Parameters:

Output

This output forwards whatever was supplied to Passthrough once the VRAM has been cleared, so the workflow continues without interruption. The data type for this parameter is any_type.

❌ Unload Models Usage Tips:

  • Insert the UnloadModels node between operations in your workflow to clear VRAM on demand, especially if you are running into VRAM limits or using multiple models that do not all need to be loaded at the same time (see the workflow fragment after this list for a typical placement).
  • Use the ForceUnload parameter with caution: setting it to True unloads all models immediately, which may disrupt the workflow if pending nodes still need access to them. It is generally safer to leave it at False and let ComfyUI handle the unloading at its earliest convenience.
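To make the first tip concrete, here is a hypothetical fragment of a ComfyUI API-format prompt, written as a Python dict. Only the UnloadModels class name and the Passthrough/ForceUnload input names come from this page; the node IDs, the surrounding KSampler and VAEDecode nodes, and the exact names exposed over the API are illustrative.

# The node sits between sampling and VAE decoding: it takes the sampler's latent
# on Passthrough, VRAM gets cleared, and the same latent is forwarded from
# Output to the decoder.
prompt_fragment = {
    "3": {"class_type": "KSampler", "inputs": {}},  # sampler inputs omitted; LATENT output at index 0
    "9": {
        "class_type": "UnloadModels",
        "inputs": {
            "Passthrough": ["3", 0],  # latent coming from the sampler
            "ForceUnload": False,     # let ComfyUI unload at its convenience
        },
    },
    "8": {
        "class_type": "VAEDecode",
        "inputs": {
            "samples": ["9", 0],  # Output forwards the same latent
            "vae": ["4", 2],      # VAE from a checkpoint loader (node "4", not shown)
        },
    },
}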

❌ Unload Models Common Errors and Solutions:

Models not unloading immediately

  • Explanation: This can occur if the ForceUnload parameter is set to False, causing the node to request model unloading rather than performing it immediately.
  • Solution: Set the ForceUnload parameter to True if you need the models to be unloaded immediately. However, be aware that this may disrupt workflows if any pending nodes still require access to the models.

Workflow disruption after unloading models

  • Explanation: If the ForceUnload parameter is set to True, all models are unloaded immediately, which can cause issues if any pending nodes still require access to the models.
  • Solution: Set the ForceUnload parameter to False to request model unloading, allowing ComfyUI to handle the unloading at its earliest convenience and avoid disrupting the workflow.

VRAM not freeing up as expected

  • Explanation: This can happen if there are still references to the models in the workflow, preventing them from being fully unloaded.
  • Solution: Ensure that no pending nodes still require access to the models; you may need to adjust your workflow so the models are genuinely no longer needed before unloading them. The snippet below shows how to check how much VRAM your ComfyUI process is actually holding.
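If you want to confirm whether VRAM was actually released, you can query PyTorch's allocator from the same Python environment that runs ComfyUI (for example, from a small custom node or an interactive console). This is a plain PyTorch check and is independent of this extension:

import torch

# "Allocated" is memory in active use by tensors; "reserved" is what PyTorch's
# caching allocator keeps around. VRAM held by other processes is not counted.
if torch.cuda.is_available():
    allocated_mib = torch.cuda.memory_allocated() / (1024 ** 2)
    reserved_mib = torch.cuda.memory_reserved() / (1024 ** 2)
    print(f"allocated: {allocated_mib:.0f} MiB, reserved: {reserved_mib:.0f} MiB")
else:
    print("No CUDA device available.")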

❌ Unload Models Related Nodes

Go back to the ControlFlowUtils extension page to check out more related nodes.