
ComfyUI Extension: comfyui_LLM_party

Repo Name: comfyui_LLM_party
Author: heshengtao (Account age: 2893 days)
Nodes: 64
Latest Updated: 2024-08-17
GitHub Stars: 0.44K

How to Install comfyui_LLM_party

Install this extension via the ComfyUI Manager by searching for comfyui_LLM_party
  1. Click the Manager button in the main menu.
  2. Select the Custom Nodes Manager button.
  3. Enter comfyui_LLM_party in the search bar and install the node from the results.
After installation, click the Restart button to restart ComfyUI. Then, manually refresh your browser to clear the cache and access the updated list of nodes.


comfyui_LLM_party Description

comfyui_LLM_party is a set of block-based LLM agent node libraries for ComfyUI, enabling users to efficiently construct and integrate LLM workflows into existing SD workflows.

comfyui_LLM_party Introduction

Welcome to comfyui_LLM_party, an extension designed to enhance your experience with ComfyUI by integrating Large Language Models (LLMs) into your workflows. ComfyUI is a minimalist user interface primarily used for AI drawing and other workflows based on the Stable Diffusion (SD) model. This extension allows you to build comprehensive LLM workflows quickly and seamlessly integrate them into your existing SD workflows. Whether you're creating interactive novel games, enabling voice input and output, or integrating with various APIs, comfyui_LLM_party provides the tools you need to expand your creative possibilities.

Workflow Example

How comfyui_LLM_party Works

At its core, comfyui_LLM_party operates by providing a set of nodes that you can use within the ComfyUI interface to build and manage LLM workflows. Think of nodes as building blocks that you can connect to create complex workflows. Each node represents a specific function or operation, such as text generation, voice input/output, or API integration. By connecting these nodes, you can create workflows that perform a wide range of tasks, from generating text based on user input to querying online databases.

For example, you might create a workflow where a user's voice input is converted to text, processed by an LLM to generate a response, and then converted back to speech. This modular approach allows you to customize and expand your workflows as needed, making it easy to adapt to new projects and creative challenges.
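
To make the building-block idea concrete, here is a minimal sketch of what a custom node looks like in ComfyUI's standard Python format. It is not one of comfyui_LLM_party's actual nodes; the class, category, and field names are illustrative only.

```python
# A minimal sketch of a ComfyUI custom node in Python. This is NOT an actual
# comfyui_LLM_party node; names and fields are illustrative only, following
# ComfyUI's standard custom-node conventions.

class SimpleLLMPromptNode:
    """Takes a text prompt and returns a (stubbed) model reply."""

    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {
                "prompt": ("STRING", {"multiline": True, "default": "Hello"}),
            }
        }

    RETURN_TYPES = ("STRING",)
    FUNCTION = "run"
    CATEGORY = "llm/example"  # hypothetical category for this sketch

    def run(self, prompt):
        # A real LLM node would call an API or a local model here.
        reply = f"(model reply to: {prompt})"
        return (reply,)


# ComfyUI discovers custom nodes through this mapping in a custom-node package.
NODE_CLASS_MAPPINGS = {"SimpleLLMPromptNode": SimpleLLMPromptNode}
```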

comfyui_LLM_party Features

Node Integration

  • Right-Click Menu Access: Easily access LLM nodes by right-clicking in the ComfyUI interface and selecting llm from the context menu.
  • API and Local Model Integration: Supports both API-based and local large model integrations, allowing for flexible tool invocation.
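
As a rough illustration of what an API-based loader ultimately does, the sketch below sends a chat request through the OpenAI Python client. The endpoint, key, and model name are placeholders; the extension's own loader nodes may expose these fields differently.

```python
# A hedged sketch of the kind of OpenAI-format call an API loader makes.
# Key point: base_url must end with /v1/, and api_key / model_name must match
# your provider (OpenAI, QWEN, GLM, DeepSeek, Moonshot, ollama, etc.).
from openai import OpenAI

client = OpenAI(
    base_url="https://api.openai.com/v1/",  # swap for your provider's endpoint
    api_key="sk-...",                        # your provider's key
)

response = client.chat.completions.create(
    model="gpt-4o",                          # the loader's model_name field
    messages=[{"role": "user", "content": "Say hello from ComfyUI"}],
)
print(response.choices[0].message.content)
```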

Knowledge Base and Code Interpreters

  • Local Knowledge Base Integration: Integrate local knowledge bases with Retrieval-Augmented Generation (RAG) support (a minimal retrieval sketch follows this list).
  • Code Interpreters: Invoke code interpreters to execute code within your workflows.
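
The retrieval step behind a RAG node can be pictured roughly as follows. This sketch uses a naive keyword-overlap scorer purely for illustration; the extension's knowledge-base nodes use proper embeddings, but the retrieve-then-prepend flow is the same idea.

```python
# A minimal sketch of the retrieval step in RAG, using keyword overlap as a
# stand-in for real embedding similarity.

def retrieve(query: str, documents: list[str], top_k: int = 1) -> list[str]:
    """Return the documents sharing the most words with the query."""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

docs = [
    "ComfyUI workflows are graphs of connected nodes.",
    "Stable Diffusion generates images from text prompts.",
]
context = retrieve("How do ComfyUI nodes connect?", docs)
prompt = f"Context: {context[0]}\n\nQuestion: How do ComfyUI nodes connect?"
# 'prompt' is what gets sent to the LLM instead of the raw question.
print(prompt)
```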

Online Queries and Conditional Statements

  • Online Queries: Perform online searches, including Google search support, directly within your workflows.
  • Conditional Statements: Implement conditional logic to categorize user queries and provide targeted responses.
  • Looping Links: Support looping connections so that, for example, two large models can engage in a debate (see the sketch after this list).
  • Persona Masks: Attach any persona mask and customize prompt templates to fit your needs.
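
The looping-link idea can be sketched as two alternating model calls that each see the conversation so far. The function below is a stand-in, not the extension's API.

```python
# A rough sketch of a "looping link": two personas alternate turns, each
# responding to the other's last message.

def call_model(persona: str, history: list[str]) -> str:
    """Stand-in for an LLM call; a real node would query an API or local model."""
    return f"[{persona}] replying to: {history[-1]}"

history: list[str] = ["Opening topic: should workflows be fully automated?"]
personas = ["Optimist", "Skeptic"]

for turn in range(4):                      # loop count = number of debate turns
    speaker = personas[turn % 2]           # alternate the persona mask each turn
    history.append(call_model(speaker, history))

print("\n".join(history))
```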

Tool Invocations

  • Various Tools: Invoke tools for weather lookup, time lookup, knowledge base queries, code execution, web search, and single-page search (see the tool-schema sketch below).
  • LLM as a Tool Node: Use LLMs as tool nodes within your workflows.
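
Tool invocation in OpenAI-format models is usually driven by a JSON schema like the one below. The weather tool here is hypothetical; comfyui_LLM_party's built-in tools may register different names and parameters.

```python
# A sketch of how a weather-lookup tool is typically described to an
# OpenAI-format model; the schema would be passed in the request's tools list.

weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",                       # hypothetical tool name
        "description": "Look up current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

def get_weather(city: str) -> str:
    """Stub implementation; a real tool node would call a weather API."""
    return f"Weather in {city}: sunny, 22 C"

# When the model returns a tool call for "get_weather", the workflow executes
# the function and feeds the result back to the model as a tool message.
print(get_weather("Shanghai"))
```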

Web Applications and Dangerous Nodes

  • Web Applications: Rapidly develop web applications using API + Streamlit.
  • Omnipotent Interpreter Node: A powerful node that allows the large model to perform any task.
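
A web front end in the "API + Streamlit" style can be as small as the sketch below, which forwards user input to an OpenAI-format endpoint (here assumed to be the fastapi.py interface described later on this page). The file name, port, and model name are assumptions, not the shipped app.

```python
# A minimal Streamlit sketch of the "API + Streamlit" idea: a tiny chat page
# that forwards user input to an OpenAI-format endpoint.
import streamlit as st
from openai import OpenAI

# Assumed endpoint: the local fastapi.py interface; any OpenAI-format URL works.
client = OpenAI(base_url="http://127.0.0.1:8817/v1/", api_key="not-needed")

st.title("LLM party demo")
user_input = st.text_input("Ask something")

if st.button("Send") and user_input:
    reply = client.chat.completions.create(
        model="comfyui-workflow",            # placeholder model name
        messages=[{"role": "user", "content": user_input}],
    )
    st.write(reply.choices[0].message.content)
```

Saved as app.py, this would be started with streamlit run app.py.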

Display and Visual Features

  • Display Output: Use the show_text node under the function submenu for displaying LLM node outputs.
  • Visual Features: Supports the vision capabilities of GPT-4o.

Workflow Intermediary

  • Workflow Intermediary: A new intermediary node that allows your workflow to call other workflows.

Model Adaptation

  • Model Adaptation: Adapted to models with OpenAI-compatible interfaces, such as Tongyi Qianwen/QWEN, Zhipu Qingyan/GLM, DeepSeek, and Kimi/Moonshot.

LVM Loader

  • A dedicated loader node for large vision models (LVMs).

FastAPI Integration

  • FastAPI: Run fastapi.py to get an OpenAI interface at http://127.0.0.1:8817/v1/, allowing any application that can call GPT to invoke your ComfyUI workflow.
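
Assuming that endpoint is running, any HTTP client can reach it with a standard OpenAI-format request. The model name below is a placeholder; check the repository for the value your workflow expects.

```python
# A hedged sketch of calling the local OpenAI-format endpoint exposed by
# fastapi.py with plain HTTP.
import requests

resp = requests.post(
    "http://127.0.0.1:8817/v1/chat/completions",
    json={
        "model": "comfyui",                   # placeholder model name
        "messages": [{"role": "user", "content": "Run my workflow"}],
    },
    timeout=120,
)
print(resp.json()["choices"][0]["message"]["content"])
```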

macOS and MPS Support

  • macOS and MPS devices are now supported, so the extension can run on Apple machines using the MPS backend.

comfyui_LLM_party Models

comfyui_LLM_party supports a variety of models, both API-based and local. Here are some of the supported models:

API-Based Models

  • OpenAI format: supports all API calls in the OpenAI format.
  • Ollama
  • Tongyi Qianwen / QWEN
  • Zhipu Qingyan / GLM (https://open.bigmodel.cn/dev/api#http_auth)
  • DeepSeek (https://platform.deepseek.com/api-docs/zh-cn/)
  • Kimi / Moonshot (https://platform.moonshot.cn/docs/api/chat#%E5%9F%BA%E6%9C%AC%E4%BF%A1%E6%81%AF)

Local Models

  • In addition to API-based models, local large models can be loaded and invoked directly within workflows; see the repository's Model support section for the current list.

Troubleshooting comfyui_LLM_party

Here are some common issues you might encounter while using comfyui_LLM_party and how to resolve them:

Common Issues and Solutions

  1. API Call Failures: If you encounter a 503 error when making API calls, try turning off the proxy server.
  2. Model Loading Issues: Ensure that the base_url ends with /v1/ and that you have entered the correct api_key and model_name (a quick configuration check follows this list).
  3. Voice Input/Output Issues: Make sure you have the necessary dependencies installed for voice input and output functionalities.
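
For the second issue, a quick sanity check like the one below (an illustrative helper, not part of the extension) can catch the most common configuration mistakes.

```python
# A quick sanity check for the loader settings mentioned above: base_url must
# end with /v1/ and api_key / model_name must be non-empty.
def check_loader_config(base_url: str, api_key: str, model_name: str) -> list[str]:
    problems = []
    if not base_url.endswith("/v1/"):
        problems.append("base_url should end with /v1/")
    if not api_key:
        problems.append("api_key is empty")
    if not model_name:
        problems.append("model_name is empty")
    return problems

# Placeholder values; substitute your provider's endpoint, key, and model.
print(check_loader_config("https://example-provider.com/v1/", "sk-...", "my-model"))
# -> [] means the basic settings look right
```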

Frequently Asked Questions

  • How do I integrate a new model? Follow the instructions in the Model support section to add new models.
  • Can I use comfyui_LLM_party on macOS? Yes, macOS and MPS devices are now supported.

Learn More about comfyui_LLM_party

To further enhance your experience with comfyui_LLM_party, here are some additional resources:

Tutorials and Documentation

Community Support

  • Join the QQ group for support and discussions: 931057213

By leveraging these resources, you can unlock the full potential of comfyui_LLM_party and take your AI artistry to the next level.

