A versatile node that generates text content through the Zhipu AI API, using advanced language models to produce detailed and creative outputs.
ZhipuaiApi_Txt is a versatile node designed to interact with the Zhipu AI API, enabling you to generate text-based content using advanced language models like glm-4, glm-3-turbo, and cogview-3. This node is particularly useful for creating detailed and contextually rich text outputs based on a given prompt. It leverages the power of AI to produce coherent and creative text, making it an invaluable tool for AI artists looking to enhance their projects with sophisticated language generation capabilities. The node supports various configurations to fine-tune the output, including the number of tokens, temperature settings, and language options, ensuring that the generated content meets your specific needs and preferences.
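Under the hood, the node sends your prompt to the Zhipu AI chat endpoint. The following is a minimal sketch of an equivalent direct call, assuming the zhipuai Python SDK and its OpenAI-style chat interface; the API key value is a placeholder, not something the node provides.

```python
# Minimal sketch of the kind of request this node wraps, assuming the
# zhipuai Python SDK (pip install zhipuai); the key value is a placeholder.
from zhipuai import ZhipuAI

client = ZhipuAI(api_key="YOUR_ZHIPU_API_KEY")

response = client.chat.completions.create(
    model="glm-4",  # one of the models the node exposes
    messages=[{
        "role": "user",
        "content": "30 words describe a girl walking on the Moon.",  # node's default prompt
    }],
    max_tokens=1024,   # node default
    temperature=0.95,  # node default
)
print(response.choices[0].message.content)
```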
The prompt parameter is a string input where you provide the initial text or question that you want the AI to respond to. This can be a detailed description or a simple query. The default value is "30 words describe a girl walking on the Moon." This parameter supports multiline input, allowing you to craft more complex and nuanced prompts.
The model_name parameter allows you to select the AI model to be used for text generation. The available options are "glm-4", "glm-3-turbo", and "cogview-3". Each model has its own strengths and capabilities, so you can choose the one that best fits your needs.
The max_tokens parameter defines the maximum number of tokens (words or word pieces) that the AI can generate in response to your prompt. The default value is 1024, with a minimum of 128 and a maximum of 8192. This parameter is adjustable via a slider, allowing you to control the length of the generated text.
The temperature parameter controls the randomness of the AI's output. A higher value (closer to 0.99) makes the output more random and creative, while a lower value (closer to 0.01) makes it more focused and deterministic. The default value is 0.95, and it can be adjusted in increments of 0.01 using a slider.
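To see the effect of this setting in isolation, here is a small illustrative comparison using the same SDK call sketched above; the zhipuai SDK usage is an assumption, and the actual responses will differ from run to run.

```python
# Illustrative low- vs. high-temperature comparison with the same prompt,
# assuming the zhipuai SDK shown earlier; responses will differ per run.
from zhipuai import ZhipuAI

client = ZhipuAI(api_key="YOUR_ZHIPU_API_KEY")
messages = [{"role": "user", "content": "30 words describe a girl walking on the Moon."}]

for temp in (0.05, 0.95):  # near-deterministic vs. highly random
    reply = client.chat.completions.create(
        model="glm-4", messages=messages, max_tokens=128, temperature=temp,
    )
    print(f"temperature={temp}: {reply.choices[0].message.content}")
```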
The output_language parameter specifies the language in which the AI should generate the output. The options are "english" and "original_language". This allows you to receive the generated text in your preferred language.
The translate_to parameter allows you to specify whether the generated text should be translated into another language. The available options are "none", "english", "chinese", "russian", and "japanese". This is useful if you need the output in a different language than the one used in the prompt.
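Taken together, these inputs map onto a standard ComfyUI input declaration. The sketch below is purely illustrative (the class name and field layout are assumptions, not the node's actual source) but mirrors the defaults and ranges described above.

```python
# Hypothetical ComfyUI-style input declaration mirroring the parameters above;
# an illustrative sketch, not the node's actual source code.
class ZhipuaiApiTxtInputsSketch:
    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {
                "prompt": ("STRING", {"multiline": True,
                                      "default": "30 words describe a girl walking on the Moon."}),
                "model_name": (["glm-4", "glm-3-turbo", "cogview-3"],),
                "max_tokens": ("INT", {"default": 1024, "min": 128, "max": 8192,
                                       "display": "slider"}),
                "temperature": ("FLOAT", {"default": 0.95, "min": 0.01, "max": 0.99,
                                          "step": 0.01, "display": "slider"}),
                "output_language": (["english", "original_language"],),
                "translate_to": (["none", "english", "chinese", "russian", "japanese"],),
            }
        }
```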
The text output parameter provides the generated text based on the input prompt and selected configurations. This output is a string that contains the AI's response, which can be used directly in your projects or further processed as needed.
The image output parameter is currently set to None for this node, indicating that it does not generate any image output. The focus of this node is solely on text generation.
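The two outputs therefore form a simple (text, image) pair in which the image slot is always empty. Below is a hypothetical sketch of that return contract; the class and attribute names are assumptions and the API call is stubbed out for brevity.

```python
# Hypothetical sketch of the node's output contract: the text slot carries the
# generated string, the image slot is always None for this text-only node.
class ZhipuaiApiTxtOutputsSketch:
    RETURN_TYPES = ("STRING", "IMAGE")
    RETURN_NAMES = ("text", "image")
    FUNCTION = "generate"

    def generate(self, prompt, model_name="glm-4", max_tokens=1024, temperature=0.95):
        # A real implementation would call the Zhipu AI API here; stubbed for brevity.
        text = f"[{model_name}] response to: {prompt}"
        return (text, None)  # generated text plus a None image placeholder
```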
Usage tips:
- Experiment with the different model_name options to see which model produces the best results for your specific use case.
- Adjust the temperature setting to balance creativity and coherence in the generated text. Higher values can produce more imaginative responses, while lower values yield more predictable and structured outputs.
- Use the max_tokens parameter to control the length of the generated text, especially if you need concise or extended responses.
- Use the translate_to parameter to get the output in different languages, which can be particularly useful for multilingual projects.

Common errors and solutions:
- If the node cannot authenticate with the Zhipu AI API, make sure your API key is correctly set in the config.json file (see the sketch after this list).
- An error occurs when the prompt parameter is not provided; supply a non-empty prompt before running the node.
- If the generated text is not in the language you expect, set the output_language or translate_to parameters accordingly.
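For the API-key case above, here is a minimal sketch of loading the key from config.json; the file location and the "ZHIPUAI_API_KEY" field name are assumptions for illustration, not the node's documented schema.

```python
# Minimal sketch of reading a Zhipu AI API key from config.json; the field
# name "ZHIPUAI_API_KEY" and the file location are assumptions.
import json
from pathlib import Path

def load_zhipu_api_key(config_path: str = "config.json") -> str:
    config = json.loads(Path(config_path).read_text(encoding="utf-8"))
    api_key = config.get("ZHIPUAI_API_KEY", "")
    if not api_key:
        raise ValueError("Zhipu AI API key not found; set it in config.json")
    return api_key
```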