Node for manually selecting and configuring an Ollama model for AI art projects.
The MZ_OllamaModelConfig_ManualSelect node lets you manually select and configure an Ollama model for use in your AI art projects. It provides a straightforward way to specify the model path and chat format, giving you precise control over the model configuration. With this node, you can integrate specific Ollama models into your workflow, increasing the flexibility and customization of your AI-generated art. Its primary goal is to simplify model selection and configuration, making the process accessible even without a deep technical background.
model_path: This parameter specifies the path to the Ollama model you wish to use. If the path starts with sha256:, the node automatically resolves the full path from the provided SHA-256 hash, ensuring the correct model file is located and used in your project. The default value is an empty string, meaning no model is selected by default. Ensure the specified path is correct to avoid errors.
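As a rough illustration of how a sha256: prefix might be resolved (a minimal sketch, not the node's actual implementation; the blob directory ~/.ollama/models/blobs and the sha256-&lt;hash&gt; file-naming scheme are assumptions about the local Ollama model store):

```python
import os

def resolve_model_path(model_path: str) -> str:
    # Hypothetical sketch: if the path is given as "sha256:<hash>",
    # map it to a blob file in the assumed local Ollama model store.
    if model_path.startswith("sha256:"):
        digest = model_path[len("sha256:"):]
        blob_dir = os.path.expanduser("~/.ollama/models/blobs")
        return os.path.join(blob_dir, f"sha256-{digest}")
    # Plain file paths are returned unchanged.
    return model_path
```

With this shape, a plain path such as /models/llama3.gguf passes through untouched, while sha256:abc123 resolves to a file inside the assumed blob directory.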
chat_format: This parameter defines the chat format for the model. The available options include auto and other specific formats supported by the Ollama model. The default value is auto, which lets the node automatically determine the appropriate chat format. If you have a specific format in mind, set it manually to ensure the model behaves as expected.
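One plausible way to handle the auto option is sketched below (a hedged illustration; translating auto to None so the underlying loader infers the format from model metadata is an assumption, not the node's confirmed behavior):

```python
def resolve_chat_format(chat_format: str):
    # Hypothetical: "auto" becomes None so the backend can pick a
    # chat format from the model's own metadata; any explicit format
    # string is passed through unchanged.
    if chat_format == "auto":
        return None
    return chat_format
```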
This output parameter provides the configured model details, including the selection type (ManualSelect), the resolved model path, and the chat format. This configuration is essential for integrating the selected model into your AI art generation workflow, ensuring the model is correctly set up and ready for use.
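The resulting configuration could be represented as a simple dictionary like the following (an illustrative sketch; the exact field names are assumptions based on the description above, not the node's actual schema):

```python
def build_model_config(model_path: str, chat_format: str = "auto") -> dict:
    # Illustrative only: bundle the selection type, model path, and
    # chat format into one configuration object, as described above.
    return {
        "type": "ManualSelect",    # this node only supports manual selection
        "model_path": model_path,  # assumed to be already resolved
        "chat_format": chat_format,
    }

config = build_model_config("/models/llama3.gguf")
```

Downstream nodes would then read this single object rather than three separate values, which keeps the selection type, path, and format consistent with one another.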
Usage tips:
- Ensure that the ollama_cpp_model path is correctly specified to avoid errors related to missing models.
- Use the auto option for chat_format if you are unsure about the specific format required; this allows the node to automatically determine the best format.

Common errors:
- Errors reporting a missing model at <model_path>: ensure that the ollama_cpp_model path is correct and that the model file is present at the specified location.
- Errors about a selection type other than ManualSelect: this node only supports manual selection of models.

© Copyright 2024 RunComfy. All Rights Reserved.