Mirror of https://github.com/langgenius/dify.git (synced 2026-01-10 16:34:15 +00:00)

Compare: 40 commits (newest f925266c1b … oldest 752cb9e4f4)
.gitignore (vendored): 1 line added

@@ -209,6 +209,7 @@ api/.vscode
.history
.idea/
web/migration/

# pnpm
/.pnpm-store
api/core/memory/README.md (new file, 434 lines)

@@ -0,0 +1,434 @@
# Memory Module

This module provides memory management for LLM conversations, enabling context retention across dialogue turns.

## Overview

The memory module contains two types of memory implementations:

1. **TokenBufferMemory** - Conversation-level memory (existing)
2. **NodeTokenBufferMemory** - Node-level memory (to be implemented, **Chatflow only**)

> **Note**: `NodeTokenBufferMemory` is only available in **Chatflow** (advanced-chat mode).
> This is because it requires both `conversation_id` and `node_id`, which are only present in Chatflow.
> Standard Workflow mode does not have `conversation_id` and therefore cannot use node-level memory.
```
┌─────────────────────────────────────────────────────────────────────────────┐
│                             Memory Architecture                             │
├─────────────────────────────────────────────────────────────────────────────┤
│                                                                             │
│  ┌──────────────────────────────────────────────────────────────────────┐   │
│  │  TokenBufferMemory                                                   │   │
│  │  Scope: Conversation                                                 │   │
│  │  Storage: Database (Message table)                                   │   │
│  │  Key: conversation_id                                                │   │
│  └──────────────────────────────────────────────────────────────────────┘   │
│                                                                             │
│  ┌──────────────────────────────────────────────────────────────────────┐   │
│  │  NodeTokenBufferMemory                                               │   │
│  │  Scope: Node within Conversation                                     │   │
│  │  Storage: Object Storage (JSON file)                                 │   │
│  │  Key: (app_id, conversation_id, node_id)                             │   │
│  └──────────────────────────────────────────────────────────────────────┘   │
│                                                                             │
└─────────────────────────────────────────────────────────────────────────────┘
```

---
## TokenBufferMemory (Existing)

### Purpose

`TokenBufferMemory` retrieves conversation history from the `Message` table and converts it to `PromptMessage` objects for LLM context.

### Key Features

- **Conversation-scoped**: All messages within a conversation are candidates
- **Thread-aware**: Uses `parent_message_id` to extract only the current thread (supports regeneration scenarios)
- **Token-limited**: Truncates history to fit within `max_token_limit`
- **File support**: Handles `MessageFile` attachments (images, documents, etc.)
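The token-limited behavior above amounts to a drop-oldest loop. A minimal illustrative sketch (not the actual implementation; `count_tokens` stands in for the model instance's token counter):

```python
def truncate_history(messages, count_tokens, max_token_limit):
    """Drop the oldest messages until the history fits the token budget."""
    msgs = list(messages)
    while msgs and count_tokens(msgs) > max_token_limit:
        msgs.pop(0)  # oldest message first
    return msgs
```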

### Data Flow
```
Message Table              TokenBufferMemory                    LLM
     │                            │                              │
     │  SELECT * FROM messages    │                              │
     │  WHERE conversation_id = ? │                              │
     │  ORDER BY created_at DESC  │                              │
     ├───────────────────────────▶│                              │
     │                            │                              │
     │              extract_thread_messages()                    │
     │                            │                              │
     │           build_prompt_message_with_files()               │
     │                            │                              │
     │              truncate by max_token_limit                  │
     │                            │                              │
     │                            │  Sequence[PromptMessage]     │
     │                            ├─────────────────────────────▶│
     │                            │                              │
```

### Thread Extraction

When a user regenerates a response, a new thread is created:

```
Message A (user)
├── Message A' (assistant)
│     └── Message B (user)
│           └── Message B' (assistant)
└── Message A'' (assistant, regenerated)   ← New thread
      └── Message C (user)
            └── Message C' (assistant)
```

`extract_thread_messages()` traces back from the latest message using `parent_message_id` to get only the current thread: `[A, A'', C, C']`.
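Conceptually, the tracing works like this hypothetical sketch (plain dicts stand in for `Message` rows; the real helper lives in `core.prompt.utils.extract_thread_messages`):

```python
def extract_thread(messages):
    """messages: newest-first; each has 'id' and 'parent_message_id'.
    Trace back from the latest message through parent links,
    then return the thread in chronological order."""
    by_id = {m["id"]: m for m in messages}
    thread = []
    current = messages[0] if messages else None
    while current is not None:
        thread.append(current)
        parent_id = current["parent_message_id"]
        current = by_id.get(parent_id) if parent_id else None
    return list(reversed(thread))
```

Messages on abandoned branches (the regenerated-over `B`/`B'` above) are never reached by the parent chain, so they drop out naturally.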

### Usage

```python
from core.memory.token_buffer_memory import TokenBufferMemory

memory = TokenBufferMemory(conversation=conversation, model_instance=model_instance)
history = memory.get_history_prompt_messages(max_token_limit=2000, message_limit=100)
```

---
## NodeTokenBufferMemory (To Be Implemented)

### Purpose

`NodeTokenBufferMemory` provides **node-scoped memory** within a conversation. Each LLM node in a workflow can maintain its own independent conversation history.

### Use Cases

1. **Multi-LLM Workflows**: Different LLM nodes need separate context
2. **Iterative Processing**: An LLM node in a loop needs to accumulate context across iterations
3. **Specialized Agents**: Each agent node maintains its own dialogue history

### Design Decisions

#### Storage: Object Storage for Messages (No New Database Table)

| Aspect                    | Database             | Object Storage     |
| ------------------------- | -------------------- | ------------------ |
| Cost                      | High                 | Low                |
| Query Flexibility         | High                 | Low                |
| Schema Changes            | Migration required   | None               |
| Consistency with existing | ConversationVariable | File uploads, logs |

**Decision**: Store message data in object storage, but still use existing database tables for file metadata.

**What is stored in Object Storage:**

- Message content (text)
- Message metadata (role, token_count, created_at)
- File references (upload_file_id, tool_file_id, etc.)
- Thread relationships (message_id, parent_message_id)

**What still requires Database queries:**

- File reconstruction: When reading node memory, file references are used to query the `UploadFile` / `ToolFile` tables via `file_factory.build_from_mapping()` to rebuild complete `File` objects with storage_key, mime_type, etc.

**Why this hybrid approach:**

- No database migration required (no new tables)
- Message data may be large; object storage is cost-effective
- File metadata is already in the database, so there is no need to duplicate it
- Aligns with existing storage patterns (file uploads, logs)

#### Storage Key Format

```
node_memory/{app_id}/{conversation_id}/{node_id}.json
```

#### Data Structure

```json
{
  "version": 1,
  "messages": [
    {
      "message_id": "msg-001",
      "parent_message_id": null,
      "role": "user",
      "content": "Analyze this image",
      "files": [
        {
          "type": "image",
          "transfer_method": "local_file",
          "upload_file_id": "file-uuid-123",
          "belongs_to": "user"
        }
      ],
      "token_count": 15,
      "created_at": "2026-01-07T10:00:00Z"
    },
    {
      "message_id": "msg-002",
      "parent_message_id": "msg-001",
      "role": "assistant",
      "content": "This is a landscape image...",
      "files": [],
      "token_count": 50,
      "created_at": "2026-01-07T10:00:01Z"
    }
  ]
}
```

### Thread Support

Node memory also supports thread extraction (for regeneration scenarios):

```python
def _extract_thread(
    self,
    messages: list[NodeMemoryMessage],
    current_message_id: str,
) -> list[NodeMemoryMessage]:
    """
    Extract messages belonging to the thread of current_message_id.
    Similar to extract_thread_messages() in TokenBufferMemory.
    """
    ...
```

### File Handling

Files are stored as references (not full metadata):

```python
class NodeMemoryFile(BaseModel):
    type: str                   # image, audio, video, document, custom
    transfer_method: str        # local_file, remote_url, tool_file
    upload_file_id: str | None  # for local_file
    tool_file_id: str | None    # for tool_file
    url: str | None             # for remote_url
    belongs_to: str             # user / assistant
```

When reading, files are rebuilt using `file_factory.build_from_mapping()`.
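As a hedged sketch of that rebuild step, the stored reference is converted to the mapping handed to `file_factory.build_from_mapping()`, keeping only the populated reference field (plain dicts stand in for the model here):

```python
def memory_file_to_mapping(memory_file: dict) -> dict:
    """Convert a stored file reference into a build_from_mapping() input."""
    mapping = {
        "type": memory_file["type"],
        "transfer_method": memory_file["transfer_method"],
    }
    # Only one of these is populated, depending on transfer_method.
    for key in ("upload_file_id", "tool_file_id", "url"):
        if memory_file.get(key):
            mapping[key] = memory_file[key]
    return mapping
```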

### API Design

```python
class NodeTokenBufferMemory:
    def __init__(
        self,
        app_id: str,
        conversation_id: str,
        node_id: str,
        model_instance: ModelInstance,
    ):
        """
        Initialize node-level memory.

        :param app_id: Application ID
        :param conversation_id: Conversation ID
        :param node_id: Node ID in the workflow
        :param model_instance: Model instance for token counting
        """
        ...

    def add_messages(
        self,
        message_id: str,
        parent_message_id: str | None,
        user_content: str,
        user_files: Sequence[File],
        assistant_content: str,
        assistant_files: Sequence[File],
    ) -> None:
        """
        Append a dialogue turn (user + assistant) to node memory.
        Call this after LLM node execution completes.

        :param message_id: Current message ID (from Message table)
        :param parent_message_id: Parent message ID (for thread tracking)
        :param user_content: User's text input
        :param user_files: Files attached by user
        :param assistant_content: Assistant's text response
        :param assistant_files: Files generated by assistant
        """
        ...

    def get_history_prompt_messages(
        self,
        current_message_id: str,
        tenant_id: str,
        max_token_limit: int = 2000,
        file_upload_config: FileUploadConfig | None = None,
    ) -> Sequence[PromptMessage]:
        """
        Retrieve history as PromptMessage sequence.

        :param current_message_id: Current message ID (for thread extraction)
        :param tenant_id: Tenant ID (for file reconstruction)
        :param max_token_limit: Maximum tokens for history
        :param file_upload_config: File upload configuration
        :return: Sequence of PromptMessage for LLM context
        """
        ...

    def flush(self) -> None:
        """
        Persist buffered changes to object storage.
        Call this at the end of node execution.
        """
        ...

    def clear(self) -> None:
        """
        Clear all messages in this node's memory.
        """
        ...
```

### Data Flow

```
Object Storage              NodeTokenBufferMemory                 LLM Node
     │                            │                                  │
     │                            │◀── get_history_prompt_messages() │
     │     storage.load(key)      │                                  │
     │◀───────────────────────────┤                                  │
     │                            │                                  │
     │        JSON data           │                                  │
     ├───────────────────────────▶│                                  │
     │                            │                                  │
     │            _extract_thread()                                  │
     │                            │                                  │
     │            _rebuild_files() via file_factory                  │
     │                            │                                  │
     │            _build_prompt_messages()                           │
     │                            │                                  │
     │            _truncate_by_tokens()                              │
     │                            │                                  │
     │                            │  Sequence[PromptMessage]         │
     │                            ├─────────────────────────────────▶│
     │                            │                                  │
     │                            │◀── LLM execution complete        │
     │                            │                                  │
     │                            │◀── add_messages()                │
     │                            │                                  │
     │   storage.save(key, data)  │                                  │
     │◀───────────────────────────┤                                  │
     │                            │                                  │
```

### Integration with LLM Node

```python
# In LLM Node execution

# 1. Fetch memory based on mode
if node_data.memory and node_data.memory.mode == MemoryMode.NODE:
    # Node-level memory (Chatflow only)
    memory = fetch_node_memory(
        variable_pool=variable_pool,
        app_id=app_id,
        node_id=self.node_id,
        node_data_memory=node_data.memory,
        model_instance=model_instance,
    )
elif node_data.memory and node_data.memory.mode == MemoryMode.CONVERSATION:
    # Conversation-level memory (existing behavior)
    memory = fetch_memory(
        variable_pool=variable_pool,
        app_id=app_id,
        node_data_memory=node_data.memory,
        model_instance=model_instance,
    )
else:
    memory = None

# 2. Get history for context
if memory:
    if isinstance(memory, NodeTokenBufferMemory):
        history = memory.get_history_prompt_messages(
            current_message_id=current_message_id,
            tenant_id=tenant_id,
            max_token_limit=max_token_limit,
        )
    else:  # TokenBufferMemory
        history = memory.get_history_prompt_messages(
            max_token_limit=max_token_limit,
        )
    prompt_messages = [*history, *current_messages]
else:
    prompt_messages = current_messages

# 3. Call LLM
response = model_instance.invoke(prompt_messages)

# 4. Append to node memory (only for NodeTokenBufferMemory)
if isinstance(memory, NodeTokenBufferMemory):
    memory.add_messages(
        message_id=message_id,
        parent_message_id=parent_message_id,
        user_content=user_input,
        user_files=user_files,
        assistant_content=response.content,
        assistant_files=response_files,
    )
    memory.flush()
```

### Configuration

Add to `MemoryConfig` in `core/workflow/nodes/llm/entities.py`:

```python
class MemoryMode(StrEnum):
    CONVERSATION = "conversation"  # Use TokenBufferMemory (default, existing behavior)
    NODE = "node"                  # Use NodeTokenBufferMemory (new, Chatflow only)


class MemoryConfig(BaseModel):
    # Existing fields
    role_prefix: RolePrefix | None = None
    window: MemoryWindowConfig | None = None
    query_prompt_template: str | None = None

    # Memory mode (new)
    mode: MemoryMode = MemoryMode.CONVERSATION
```

**Mode Behavior:**

| Mode           | Memory Class          | Scope                    | Availability  |
| -------------- | --------------------- | ------------------------ | ------------- |
| `conversation` | TokenBufferMemory     | Entire conversation      | All app modes |
| `node`         | NodeTokenBufferMemory | Per-node in conversation | Chatflow only |

> When `mode=node` is used in a non-Chatflow context (no conversation_id), it should
> fall back to no memory or raise a configuration error.

---

## Comparison

| Feature        | TokenBufferMemory        | NodeTokenBufferMemory     |
| -------------- | ------------------------ | ------------------------- |
| Scope          | Conversation             | Node within Conversation  |
| Storage        | Database (Message table) | Object Storage (JSON)     |
| Thread Support | Yes                      | Yes                       |
| File Support   | Yes (via MessageFile)    | Yes (via file references) |
| Token Limit    | Yes                      | Yes                       |
| Use Case       | Standard chat apps       | Complex workflows         |

---

## Future Considerations

1. **Cleanup Task**: Add a Celery task to clean up old node memory files
2. **Concurrency**: Consider Redis lock for concurrent node executions
3. **Compression**: Compress large memory files to reduce storage costs
4. **Extension**: Other nodes (Agent, Tool) may also benefit from node-level memory
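Consideration 3 could be as simple as a gzip round-trip around the JSON payload before it reaches object storage; a minimal sketch of the idea (assumed behavior, not implemented):

```python
import gzip
import json


def compress_payload(data: dict) -> bytes:
    """Compress the node-memory JSON before writing to object storage."""
    return gzip.compress(json.dumps(data).encode("utf-8"))


def decompress_payload(blob: bytes) -> dict:
    """Inverse of compress_payload, applied on read."""
    return json.loads(gzip.decompress(blob).decode("utf-8"))
```

Since the payload is repetitive JSON text, gzip typically shrinks it substantially at negligible CPU cost per node execution.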

api/core/memory/__init__.py (new file, 15 lines)

@@ -0,0 +1,15 @@
```python
from core.memory.base import BaseMemory
from core.memory.node_token_buffer_memory import (
    NodeMemoryData,
    NodeMemoryFile,
    NodeTokenBufferMemory,
)
from core.memory.token_buffer_memory import TokenBufferMemory

__all__ = [
    "BaseMemory",
    "NodeMemoryData",
    "NodeMemoryFile",
    "NodeTokenBufferMemory",
    "TokenBufferMemory",
]
```
api/core/memory/base.py (new file, 83 lines)

@@ -0,0 +1,83 @@
```python
"""
Base memory interfaces and types.

This module defines the common protocol for memory implementations.
"""

from abc import ABC, abstractmethod
from collections.abc import Sequence

from core.model_runtime.entities import ImagePromptMessageContent, PromptMessage


class BaseMemory(ABC):
    """
    Abstract base class for memory implementations.

    Provides a common interface for both conversation-level and node-level memory.
    """

    @abstractmethod
    def get_history_prompt_messages(
        self,
        *,
        max_token_limit: int = 2000,
        message_limit: int | None = None,
    ) -> Sequence[PromptMessage]:
        """
        Get history prompt messages.

        :param max_token_limit: Maximum tokens for history
        :param message_limit: Maximum number of messages
        :return: Sequence of PromptMessage for LLM context
        """
        pass

    def get_history_prompt_text(
        self,
        human_prefix: str = "Human",
        ai_prefix: str = "Assistant",
        max_token_limit: int = 2000,
        message_limit: int | None = None,
    ) -> str:
        """
        Get history prompt as formatted text.

        :param human_prefix: Prefix for human messages
        :param ai_prefix: Prefix for assistant messages
        :param max_token_limit: Maximum tokens for history
        :param message_limit: Maximum number of messages
        :return: Formatted history text
        """
        from core.model_runtime.entities import (
            PromptMessageRole,
            TextPromptMessageContent,
        )

        prompt_messages = self.get_history_prompt_messages(
            max_token_limit=max_token_limit,
            message_limit=message_limit,
        )

        string_messages = []
        for m in prompt_messages:
            if m.role == PromptMessageRole.USER:
                role = human_prefix
            elif m.role == PromptMessageRole.ASSISTANT:
                role = ai_prefix
            else:
                continue

            if isinstance(m.content, list):
                inner_msg = ""
                for content in m.content:
                    if isinstance(content, TextPromptMessageContent):
                        inner_msg += f"{content.data}\n"
                    elif isinstance(content, ImagePromptMessageContent):
                        inner_msg += "[image]\n"
                string_messages.append(f"{role}: {inner_msg.strip()}")
            else:
                message = f"{role}: {m.content}"
                string_messages.append(message)

        return "\n".join(string_messages)
```
api/core/memory/node_token_buffer_memory.py (new file, 353 lines)

@@ -0,0 +1,353 @@
|
||||
"""
|
||||
Node-level Token Buffer Memory for Chatflow.
|
||||
|
||||
This module provides node-scoped memory within a conversation.
|
||||
Each LLM node in a workflow can maintain its own independent conversation history.
|
||||
|
||||
Note: This is only available in Chatflow (advanced-chat mode) because it requires
|
||||
both conversation_id and node_id.
|
||||
|
||||
Design:
|
||||
- Storage is indexed by workflow_run_id (each execution stores one turn)
|
||||
- Thread tracking leverages Message table's parent_message_id structure
|
||||
- On read: query Message table for current thread, then filter Node Memory by workflow_run_ids
|
||||
"""
|
||||
|
||||
import logging
|
||||
from collections.abc import Sequence
|
||||
|
||||
from pydantic import BaseModel
|
||||
from sqlalchemy import select
|
||||
|
||||
from core.file import File, FileTransferMethod
|
||||
from core.memory.base import BaseMemory
|
||||
from core.model_manager import ModelInstance
|
||||
from core.model_runtime.entities import (
|
||||
AssistantPromptMessage,
|
||||
ImagePromptMessageContent,
|
||||
PromptMessage,
|
||||
TextPromptMessageContent,
|
||||
UserPromptMessage,
|
||||
)
|
||||
from core.prompt.utils.extract_thread_messages import extract_thread_messages
|
||||
from extensions.ext_database import db
|
||||
from extensions.ext_storage import storage
|
||||
from models.model import Message
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class NodeMemoryFile(BaseModel):
|
||||
"""File reference stored in node memory."""
|
||||
|
||||
type: str # image, audio, video, document, custom
|
||||
transfer_method: str # local_file, remote_url, tool_file
|
||||
upload_file_id: str | None = None
|
||||
tool_file_id: str | None = None
|
||||
url: str | None = None
|
||||
|
||||
|
||||
class NodeMemoryTurn(BaseModel):
|
||||
"""A single dialogue turn (user + assistant) in node memory."""
|
||||
|
||||
user_content: str = ""
|
||||
user_files: list[NodeMemoryFile] = []
|
||||
assistant_content: str = ""
|
||||
assistant_files: list[NodeMemoryFile] = []
|
||||
|
||||
|
||||
class NodeMemoryData(BaseModel):
|
||||
"""Root data structure for node memory storage."""
|
||||
|
||||
version: int = 1
|
||||
# Key: workflow_run_id, Value: dialogue turn
|
||||
turns: dict[str, NodeMemoryTurn] = {}
|
||||
|
||||
|
||||
class NodeTokenBufferMemory(BaseMemory):
|
||||
"""
|
||||
Node-level Token Buffer Memory.
|
||||
|
||||
Provides node-scoped memory within a conversation. Each LLM node can maintain
|
||||
its own independent conversation history, stored in object storage.
|
||||
|
||||
Key design: Thread tracking is delegated to Message table's parent_message_id.
|
||||
Storage is indexed by workflow_run_id for easy filtering.
|
||||
|
||||
Storage key format: node_memory/{app_id}/{conversation_id}/{node_id}.json
|
||||
"""
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
app_id: str,
|
||||
conversation_id: str,
|
||||
node_id: str,
|
||||
tenant_id: str,
|
||||
model_instance: ModelInstance,
|
||||
):
|
||||
"""
|
||||
Initialize node-level memory.
|
||||
|
||||
:param app_id: Application ID
|
||||
:param conversation_id: Conversation ID
|
||||
:param node_id: Node ID in the workflow
|
||||
:param tenant_id: Tenant ID for file reconstruction
|
||||
:param model_instance: Model instance for token counting
|
||||
"""
|
||||
self.app_id = app_id
|
||||
self.conversation_id = conversation_id
|
||||
self.node_id = node_id
|
||||
self.tenant_id = tenant_id
|
||||
self.model_instance = model_instance
|
||||
self._storage_key = f"node_memory/{app_id}/{conversation_id}/{node_id}.json"
|
||||
self._data: NodeMemoryData | None = None
|
||||
self._dirty = False
|
||||
|
||||
def _load(self) -> NodeMemoryData:
|
||||
"""Load data from object storage."""
|
||||
if self._data is not None:
|
||||
return self._data
|
||||
|
||||
try:
|
||||
raw = storage.load_once(self._storage_key)
|
||||
self._data = NodeMemoryData.model_validate_json(raw)
|
||||
except Exception:
|
||||
# File not found or parse error, start fresh
|
||||
self._data = NodeMemoryData()
|
||||
|
||||
return self._data
|
||||
|
||||
def _save(self) -> None:
|
||||
"""Save data to object storage."""
|
||||
if self._data is not None:
|
||||
storage.save(self._storage_key, self._data.model_dump_json().encode("utf-8"))
|
||||
self._dirty = False
|
||||
|
||||
def _file_to_memory_file(self, file: File) -> NodeMemoryFile:
|
||||
"""Convert File object to NodeMemoryFile reference."""
|
||||
return NodeMemoryFile(
|
||||
type=file.type.value if hasattr(file.type, "value") else str(file.type),
|
||||
transfer_method=(
|
||||
file.transfer_method.value if hasattr(file.transfer_method, "value") else str(file.transfer_method)
|
||||
),
|
||||
upload_file_id=file.related_id if file.transfer_method == FileTransferMethod.LOCAL_FILE else None,
|
||||
tool_file_id=file.related_id if file.transfer_method == FileTransferMethod.TOOL_FILE else None,
|
||||
url=file.remote_url if file.transfer_method == FileTransferMethod.REMOTE_URL else None,
|
||||
)
|
||||
|
||||
def _memory_file_to_mapping(self, memory_file: NodeMemoryFile) -> dict:
|
||||
"""Convert NodeMemoryFile to mapping for file_factory."""
|
||||
mapping: dict = {
|
||||
"type": memory_file.type,
|
||||
"transfer_method": memory_file.transfer_method,
|
||||
}
|
||||
if memory_file.upload_file_id:
|
||||
mapping["upload_file_id"] = memory_file.upload_file_id
|
||||
if memory_file.tool_file_id:
|
||||
mapping["tool_file_id"] = memory_file.tool_file_id
|
||||
if memory_file.url:
|
||||
mapping["url"] = memory_file.url
|
||||
return mapping
|
||||
|
||||
def _rebuild_files(self, memory_files: list[NodeMemoryFile]) -> list[File]:
|
||||
"""Rebuild File objects from NodeMemoryFile references."""
|
||||
if not memory_files:
|
||||
return []
|
||||
|
||||
from factories import file_factory
|
||||
|
||||
files = []
|
||||
for mf in memory_files:
|
||||
try:
|
||||
mapping = self._memory_file_to_mapping(mf)
|
||||
file = file_factory.build_from_mapping(mapping=mapping, tenant_id=self.tenant_id)
|
||||
files.append(file)
|
||||
except Exception as e:
|
||||
logger.warning("Failed to rebuild file from memory: %s", e)
|
||||
continue
|
||||
return files
|
||||
|
||||
def _build_prompt_message(
|
||||
self,
|
||||
role: str,
|
||||
content: str,
|
||||
files: list[File],
|
||||
detail: ImagePromptMessageContent.DETAIL = ImagePromptMessageContent.DETAIL.HIGH,
|
||||
) -> PromptMessage:
|
||||
"""Build PromptMessage from content and files."""
|
||||
from core.file import file_manager
|
||||
|
||||
if not files:
|
||||
if role == "user":
|
||||
return UserPromptMessage(content=content)
|
||||
else:
|
||||
return AssistantPromptMessage(content=content)
|
||||
|
||||
# Build multimodal content
|
||||
prompt_contents: list = []
|
||||
for file in files:
|
||||
try:
|
||||
prompt_content = file_manager.to_prompt_message_content(file, image_detail_config=detail)
|
||||
prompt_contents.append(prompt_content)
|
||||
except Exception as e:
|
||||
logger.warning("Failed to convert file to prompt content: %s", e)
|
||||
continue
|
||||
|
||||
prompt_contents.append(TextPromptMessageContent(data=content))
|
||||
|
||||
if role == "user":
|
||||
return UserPromptMessage(content=prompt_contents)
|
||||
else:
|
||||
return AssistantPromptMessage(content=prompt_contents)
|
||||
|
||||
def _get_thread_workflow_run_ids(self) -> list[str]:
|
||||
"""
|
||||
Get workflow_run_ids for the current thread by querying Message table.
|
||||
|
||||
Returns workflow_run_ids in chronological order (oldest first).
|
||||
"""
|
||||
# Query messages for this conversation
|
||||
stmt = (
|
||||
select(Message).where(Message.conversation_id == self.conversation_id).order_by(Message.created_at.desc())
|
||||
)
|
||||
messages = db.session.scalars(stmt.limit(500)).all()
|
||||
|
||||
if not messages:
|
||||
return []
|
||||
|
||||
# Extract thread messages using existing logic
|
||||
thread_messages = extract_thread_messages(messages)
|
||||
|
||||
# For newly created message, its answer is temporarily empty, skip it
|
||||
if thread_messages and not thread_messages[0].answer and thread_messages[0].answer_tokens == 0:
|
||||
thread_messages.pop(0)
|
||||
|
||||
# Reverse to get chronological order, extract workflow_run_ids
|
||||
workflow_run_ids = []
|
||||
for msg in reversed(thread_messages):
|
||||
if msg.workflow_run_id:
|
||||
workflow_run_ids.append(msg.workflow_run_id)
|
||||
|
||||
return workflow_run_ids
|
||||
|
||||
def add_messages(
|
||||
self,
|
||||
workflow_run_id: str,
|
||||
user_content: str,
|
||||
user_files: Sequence[File] | None = None,
|
||||
assistant_content: str = "",
|
||||
assistant_files: Sequence[File] | None = None,
|
||||
) -> None:
|
||||
"""
|
||||
Add a dialogue turn to node memory.
|
||||
Call this after LLM node execution completes.
|
||||
|
||||
:param workflow_run_id: Current workflow execution ID
|
||||
:param user_content: User's text input
|
||||
:param user_files: Files attached by user
|
||||
:param assistant_content: Assistant's text response
|
||||
:param assistant_files: Files generated by assistant
|
||||
"""
|
||||
data = self._load()
|
||||
|
||||
# Convert files to memory file references
|
||||
user_memory_files = [self._file_to_memory_file(f) for f in (user_files or [])]
|
||||
assistant_memory_files = [self._file_to_memory_file(f) for f in (assistant_files or [])]
|
||||
|
||||
# Store the turn indexed by workflow_run_id
|
||||
data.turns[workflow_run_id] = NodeMemoryTurn(
|
||||
user_content=user_content,
|
||||
user_files=user_memory_files,
|
||||
assistant_content=assistant_content,
|
||||
assistant_files=assistant_memory_files,
|
||||
)
|
||||
|
||||
self._dirty = True
|
||||
|
||||
def get_history_prompt_messages(
|
||||
self,
|
||||
*,
|
||||
max_token_limit: int = 2000,
|
||||
message_limit: int | None = None,
|
||||
) -> Sequence[PromptMessage]:
|
||||
"""
|
||||
Retrieve history as PromptMessage sequence.
|
||||
|
||||
Thread tracking is handled by querying Message table's parent_message_id structure.
|
||||
|
||||
:param max_token_limit: Maximum tokens for history
|
||||
:param message_limit: unused, for interface compatibility
|
||||
:return: Sequence of PromptMessage for LLM context
|
||||
"""
|
||||
# message_limit is unused in NodeTokenBufferMemory (uses token limit instead)
|
||||
_ = message_limit
|
||||
detail = ImagePromptMessageContent.DETAIL.HIGH
|
||||
data = self._load()
|
||||
|
||||
if not data.turns:
|
||||
return []
|
||||
|
||||
        # Get workflow_run_ids for current thread from Message table
        thread_workflow_run_ids = self._get_thread_workflow_run_ids()

        if not thread_workflow_run_ids:
            return []

        # Build prompt messages in thread order
        prompt_messages: list[PromptMessage] = []
        for wf_run_id in thread_workflow_run_ids:
            turn = data.turns.get(wf_run_id)
            if not turn:
                # This workflow execution didn't have node memory stored
                continue

            # Build user message
            user_files = self._rebuild_files(turn.user_files) if turn.user_files else []
            user_msg = self._build_prompt_message(
                role="user",
                content=turn.user_content,
                files=user_files,
                detail=detail,
            )
            prompt_messages.append(user_msg)

            # Build assistant message
            assistant_files = self._rebuild_files(turn.assistant_files) if turn.assistant_files else []
            assistant_msg = self._build_prompt_message(
                role="assistant",
                content=turn.assistant_content,
                files=assistant_files,
                detail=detail,
            )
            prompt_messages.append(assistant_msg)

        if not prompt_messages:
            return []

        # Truncate by token limit
        try:
            current_tokens = self.model_instance.get_llm_num_tokens(prompt_messages)
            while current_tokens > max_token_limit and len(prompt_messages) > 1:
                prompt_messages.pop(0)
                current_tokens = self.model_instance.get_llm_num_tokens(prompt_messages)
        except Exception as e:
            logger.warning("Failed to count tokens for truncation: %s", e)

        return prompt_messages

    def flush(self) -> None:
        """
        Persist buffered changes to object storage.
        Call this at the end of node execution.
        """
        if self._dirty:
            self._save()

    def clear(self) -> None:
        """Clear all messages in this node's memory."""
        self._data = NodeMemoryData()
        self._save()

    def exists(self) -> bool:
        """Check if node memory exists in storage."""
        return storage.exists(self._storage_key)
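The truncation loop above drops whole messages from the front of the history until the remaining messages fit the token budget, always keeping at least one. A minimal standalone sketch of the same strategy, using a hypothetical `count_tokens` that charges one token per whitespace-separated word (the real code uses `model_instance.get_llm_num_tokens`):

```python
def truncate_oldest(messages: list[str], max_tokens: int, count_tokens) -> list[str]:
    """Drop oldest messages until the total fits, always keeping at least one."""
    messages = list(messages)  # don't mutate the caller's list
    while count_tokens(messages) > max_tokens and len(messages) > 1:
        messages.pop(0)
    return messages


def count_tokens(messages: list[str]) -> int:
    # Illustrative stand-in: one token per word, summed over all messages.
    return sum(len(m.split()) for m in messages)


history = ["hello there", "how are you today", "fine thanks"]
print(truncate_oldest(history, max_tokens=7, count_tokens=count_tokens))
# → ['how are you today', 'fine thanks']
```

Because messages are stored as user/assistant pairs, dropping from the front discards the oldest turns first while the most recent context survives.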
@@ -5,12 +5,12 @@ from sqlalchemy.orm import sessionmaker

 from core.app.app_config.features.file_upload.manager import FileUploadConfigManager
 from core.file import file_manager
+from core.memory.base import BaseMemory
 from core.model_manager import ModelInstance
 from core.model_runtime.entities import (
     AssistantPromptMessage,
     ImagePromptMessageContent,
     PromptMessage,
     PromptMessageRole,
     TextPromptMessageContent,
     UserPromptMessage,
 )
@@ -24,7 +24,7 @@ from repositories.api_workflow_run_repository import APIWorkflowRunRepository
 from repositories.factory import DifyAPIRepositoryFactory


-class TokenBufferMemory:
+class TokenBufferMemory(BaseMemory):
     def __init__(
         self,
         conversation: Conversation,
@@ -115,10 +115,14 @@ class TokenBufferMemory:
         return AssistantPromptMessage(content=prompt_message_contents)

     def get_history_prompt_messages(
-        self, max_token_limit: int = 2000, message_limit: int | None = None
+        self,
+        *,
+        max_token_limit: int = 2000,
+        message_limit: int | None = None,
     ) -> Sequence[PromptMessage]:
         """
         Get history prompt messages.

         :param max_token_limit: max token limit
         :param message_limit: message limit
         """
@@ -200,44 +204,3 @@ class TokenBufferMemory:
         curr_message_tokens = self.model_instance.get_llm_num_tokens(prompt_messages)

         return prompt_messages
-
-    def get_history_prompt_text(
-        self,
-        human_prefix: str = "Human",
-        ai_prefix: str = "Assistant",
-        max_token_limit: int = 2000,
-        message_limit: int | None = None,
-    ) -> str:
-        """
-        Get history prompt text.
-        :param human_prefix: human prefix
-        :param ai_prefix: ai prefix
-        :param max_token_limit: max token limit
-        :param message_limit: message limit
-        :return:
-        """
-        prompt_messages = self.get_history_prompt_messages(max_token_limit=max_token_limit, message_limit=message_limit)
-
-        string_messages = []
-        for m in prompt_messages:
-            if m.role == PromptMessageRole.USER:
-                role = human_prefix
-            elif m.role == PromptMessageRole.ASSISTANT:
-                role = ai_prefix
-            else:
-                continue
-
-            if isinstance(m.content, list):
-                inner_msg = ""
-                for content in m.content:
-                    if isinstance(content, TextPromptMessageContent):
-                        inner_msg += f"{content.data}\n"
-                    elif isinstance(content, ImagePromptMessageContent):
-                        inner_msg += "[image]\n"
-
-                string_messages.append(f"{role}: {inner_msg.strip()}")
-            else:
-                message = f"{role}: {m.content}"
-                string_messages.append(message)
-
-        return "\n".join(string_messages)
@@ -1,3 +1,4 @@
+from enum import StrEnum
 from typing import Literal

 from pydantic import BaseModel
@@ -5,6 +6,13 @@ from pydantic import BaseModel
 from core.model_runtime.entities.message_entities import PromptMessageRole


+class MemoryMode(StrEnum):
+    """Memory mode for LLM nodes."""
+
+    CONVERSATION = "conversation"  # Use TokenBufferMemory (default, existing behavior)
+    NODE = "node"  # Use NodeTokenBufferMemory (Chatflow only)
+
+
 class ChatModelMessage(BaseModel):
     """
     Chat Message.
@@ -48,3 +56,4 @@ class MemoryConfig(BaseModel):
     role_prefix: RolePrefix | None = None
     window: WindowConfig
     query_prompt_template: str | None = None
+    mode: MemoryMode = MemoryMode.CONVERSATION
api/core/workflow/docs/variable_extraction_design.md (new file, 1418 lines)
File diff suppressed because it is too large
@@ -63,6 +63,7 @@ class NodeType(StrEnum):
     TRIGGER_SCHEDULE = "trigger-schedule"
     TRIGGER_PLUGIN = "trigger-plugin"
     HUMAN_INPUT = "human-input"
+    GROUP = "group"

     @property
     def is_trigger_node(self) -> bool:
@@ -307,7 +307,14 @@ class Graph:
         if not node_configs:
             raise ValueError("Graph must have at least one node")

-        node_configs = [node_config for node_config in node_configs if node_config.get("type", "") != "custom-note"]
+        # Filter out UI-only node types:
+        # - custom-note: top-level type (node_config.type == "custom-note")
+        # - group: data-level type (node_config.data.type == "group")
+        node_configs = [
+            node_config
+            for node_config in node_configs
+            if node_config.get("type", "") != "custom-note"
+            and node_config.get("data", {}).get("type", "") != "group"
+        ]

         # Parse node configurations
         node_configs_map = cls._parse_node_configs(node_configs)
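The rewritten filter drops canvas-only nodes before the graph is built: notes are marked at the top level (`type == "custom-note"`), groups inside `data` (`data.type == "group"`). The same comprehension run standalone over sample configs (node shapes assumed from the diff):

```python
node_configs = [
    {"id": "1", "type": "custom", "data": {"type": "llm"}},        # real runtime node
    {"id": "2", "type": "custom-note", "data": {}},                # UI note, dropped
    {"id": "3", "type": "custom", "data": {"type": "group"}},      # UI group, dropped
]

runtime_nodes = [
    cfg
    for cfg in node_configs
    if cfg.get("type", "") != "custom-note"
    and cfg.get("data", {}).get("type", "") != "group"
]

print([cfg["id"] for cfg in runtime_nodes])
# → ['1']
```

Using `.get(..., "")` at both levels keeps the filter safe on configs that omit `type` or `data` entirely.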
@@ -125,6 +125,11 @@ class EventHandler:
         Args:
             event: The node started event
         """
+        # Check if this is a virtual node (extraction node)
+        if self._is_virtual_node(event.node_id):
+            self._handle_virtual_node_started(event)
+            return
+
         # Track execution in domain model
         node_execution = self._graph_execution.get_or_create_node_execution(event.node_id)
         is_initial_attempt = node_execution.retry_count == 0
@@ -164,6 +169,11 @@ class EventHandler:
         Args:
             event: The node succeeded event
         """
+        # Check if this is a virtual node (extraction node)
+        if self._is_virtual_node(event.node_id):
+            self._handle_virtual_node_success(event)
+            return
+
         # Update domain model
         node_execution = self._graph_execution.get_or_create_node_execution(event.node_id)
         node_execution.mark_taken()
@@ -226,6 +236,11 @@ class EventHandler:
         Args:
             event: The node failed event
         """
+        # Check if this is a virtual node (extraction node)
+        if self._is_virtual_node(event.node_id):
+            self._handle_virtual_node_failed(event)
+            return
+
         # Update domain model
         node_execution = self._graph_execution.get_or_create_node_execution(event.node_id)
         node_execution.mark_failed(event.error)
@@ -345,3 +360,57 @@ class EventHandler:
             self._graph_runtime_state.set_output("answer", value)
         else:
             self._graph_runtime_state.set_output(key, value)
+
+    def _is_virtual_node(self, node_id: str) -> bool:
+        """
+        Check if node_id represents a virtual sub-node.
+
+        Virtual nodes have IDs in the format: {parent_node_id}.{local_id}
+        We check whether the part before the last '.' exists in graph nodes.
+        """
+        if "." in node_id:
+            parent_id = node_id.rsplit(".", 1)[0]
+            return parent_id in self._graph.nodes
+        return False
+
+    def _handle_virtual_node_started(self, event: NodeRunStartedEvent) -> None:
+        """
+        Handle virtual node started event.
+
+        Virtual nodes don't need full execution tracking; just collect the event.
+        """
+        # Track in response coordinator for stream ordering
+        self._response_coordinator.track_node_execution(event.node_id, event.id)
+
+        # Collect the event
+        self._event_collector.collect(event)
+
+    def _handle_virtual_node_success(self, event: NodeRunSucceededEvent) -> None:
+        """
+        Handle virtual node success event.
+
+        Virtual nodes (extraction nodes) need special handling:
+        - Store outputs in variable pool (for reference by other nodes)
+        - Accumulate token usage
+        - Collect the event for logging
+        - Do NOT process edges or enqueue next nodes (parent node handles that)
+        """
+        self._accumulate_node_usage(event.node_run_result.llm_usage)
+
+        # Store outputs in variable pool
+        self._store_node_outputs(event.node_id, event.node_run_result.outputs)
+
+        # Collect the event
+        self._event_collector.collect(event)
+
+    def _handle_virtual_node_failed(self, event: NodeRunFailedEvent) -> None:
+        """
+        Handle virtual node failed event.
+
+        Failures of virtual nodes (extraction nodes) are collected for logging,
+        but the parent node is responsible for handling the error.
+        """
+        self._accumulate_node_usage(event.node_run_result.llm_usage)
+
+        # Collect the event for logging
+        self._event_collector.collect(event)
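`_is_virtual_node` relies entirely on the `{parent_node_id}.{local_id}` naming convention: split on the *last* dot and check whether the prefix names a real graph node. The same check in isolation, with a plain set standing in for `self._graph.nodes`:

```python
def is_virtual_node(node_id: str, graph_node_ids: set[str]) -> bool:
    """A node id is 'virtual' if the part before the last '.' names a real node."""
    if "." in node_id:
        parent_id = node_id.rsplit(".", 1)[0]
        return parent_id in graph_node_ids
    return False


nodes = {"llm1", "tool1"}
print(is_virtual_node("llm1.ext_1", nodes))  # → True: virtual child of llm1
print(is_virtual_node("llm2.ext_1", nodes))  # → False: unknown parent
print(is_virtual_node("llm1", nodes))        # → False: plain node id, no dot
```

Using `rsplit(".", 1)` rather than `split(".")` means the convention stays well-defined even if a parent node id itself contains dots: only the final segment is treated as the local id.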
@@ -20,6 +20,12 @@ class NodeRunStartedEvent(GraphNodeEventBase):
     provider_type: str = ""
     provider_id: str = ""

+    # Virtual node fields for extraction
+    is_virtual: bool = False
+    parent_node_id: str | None = None
+    extraction_source: str | None = None  # e.g., "llm1.context"
+    extraction_prompt: str | None = None
+

 class NodeRunStreamChunkEvent(GraphNodeEventBase):
     # Spec-compliant fields
@@ -1,5 +1,13 @@
-from .entities import BaseIterationNodeData, BaseIterationState, BaseLoopNodeData, BaseLoopState, BaseNodeData
+from .entities import (
+    BaseIterationNodeData,
+    BaseIterationState,
+    BaseLoopNodeData,
+    BaseLoopState,
+    BaseNodeData,
+    VirtualNodeConfig,
+)
 from .usage_tracking_mixin import LLMUsageTrackingMixin
+from .virtual_node_executor import VirtualNodeExecutionError, VirtualNodeExecutor

 __all__ = [
     "BaseIterationNodeData",
@@ -8,4 +16,7 @@ __all__ = [
     "BaseLoopState",
     "BaseNodeData",
     "LLMUsageTrackingMixin",
+    "VirtualNodeConfig",
+    "VirtualNodeExecutionError",
+    "VirtualNodeExecutor",
 ]
@@ -167,6 +167,24 @@ class DefaultValue(BaseModel):
         return self


+class VirtualNodeConfig(BaseModel):
+    """Configuration for a virtual sub-node embedded within a parent node."""
+
+    # Local ID within parent node (e.g., "ext_1")
+    # Will be converted to global ID: "{parent_id}.{id}"
+    id: str
+
+    # Node type (e.g., "llm", "code", "tool")
+    type: str
+
+    # Full node data configuration
+    data: dict[str, Any] = {}
+
+    def get_global_id(self, parent_node_id: str) -> str:
+        """Get the global node ID by combining parent ID and local ID."""
+        return f"{parent_node_id}.{self.id}"
+
+
 class BaseNodeData(ABC, BaseModel):
     title: str
     desc: str | None = None
@@ -175,6 +193,9 @@ class BaseNodeData(ABC, BaseModel):
     default_value: list[DefaultValue] | None = None
     retry_config: RetryConfig = RetryConfig()

+    # Virtual sub-nodes that execute before the main node
+    virtual_nodes: list[VirtualNodeConfig] = []
+
     @property
     def default_value_dict(self) -> dict[str, Any]:
         if self.default_value:
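`get_global_id` is the one place the `{parent_id}.{local_id}` convention is produced (the event handler's `_is_virtual_node` consumes it). A dependency-free sketch of the same model — a dataclass stand-in for the pydantic `BaseModel` in the diff, behavior otherwise identical:

```python
from dataclasses import dataclass, field
from typing import Any


@dataclass
class VirtualNodeConfig:
    """Dataclass stand-in for the pydantic VirtualNodeConfig in the diff."""

    id: str                                   # local ID within the parent node
    type: str                                 # node type, e.g. "llm", "code", "tool"
    data: dict[str, Any] = field(default_factory=dict)

    def get_global_id(self, parent_node_id: str) -> str:
        """Combine parent ID and local ID into the graph-wide node ID."""
        return f"{parent_node_id}.{self.id}"


cfg = VirtualNodeConfig(id="ext_1", type="llm", data={"title": "Extract context"})
print(cfg.get_global_id("llm1"))
# → llm1.ext_1
```

Note one difference from the real pydantic version: mutable defaults like `data: dict = {}` are safe on a pydantic `BaseModel` (each instance gets a copy) but require `field(default_factory=dict)` on a plain dataclass.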
@@ -229,6 +229,7 @@ class Node(Generic[NodeDataT]):
         self._node_id = node_id
         self._node_execution_id: str = ""
         self._start_at = naive_utc_now()
+        self._virtual_node_outputs: dict[str, Any] = {}  # Outputs from virtual sub-nodes

         raw_node_data = config.get("data") or {}
         if not isinstance(raw_node_data, Mapping):
@@ -270,10 +271,52 @@ class Node(Generic[NodeDataT]):
         """Check if execution should be stopped."""
         return self.graph_runtime_state.stop_event.is_set()

+    def _execute_virtual_nodes(self) -> Generator[GraphNodeEventBase, None, dict[str, Any]]:
+        """
+        Execute all virtual sub-nodes defined in node configuration.
+
+        Virtual nodes are complete node definitions that execute before the main node.
+        Each virtual node:
+        - Has its own global ID: "{parent_id}.{local_id}"
+        - Generates standard node events
+        - Stores outputs in the variable pool (via event handling)
+        - Supports retry via parent node's retry config
+
+        Returns:
+            dict mapping local_id -> outputs dict
+        """
+        from .virtual_node_executor import VirtualNodeExecutor
+
+        virtual_nodes = self.node_data.virtual_nodes
+        if not virtual_nodes:
+            return {}
+
+        executor = VirtualNodeExecutor(
+            graph_init_params=self._graph_init_params,
+            graph_runtime_state=self.graph_runtime_state,
+            parent_node_id=self._node_id,
+            parent_retry_config=self.retry_config,
+        )
+
+        return (yield from executor.execute_virtual_nodes(virtual_nodes))
+
+    @property
+    def virtual_node_outputs(self) -> dict[str, Any]:
+        """
+        Get the outputs from virtual sub-nodes.
+
+        Returns:
+            dict mapping local_id -> outputs dict
+        """
+        return self._virtual_node_outputs
+
     def run(self) -> Generator[GraphNodeEventBase, None, None]:
         execution_id = self.ensure_execution_id()
         self._start_at = naive_utc_now()

+        # Step 1: Execute virtual sub-nodes before main node execution
+        self._virtual_node_outputs = yield from self._execute_virtual_nodes()
+
         # Create and push start event with required fields
         start_event = NodeRunStartedEvent(
             id=execution_id,
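Both `_execute_virtual_nodes` and `run` lean on a generator feature worth spelling out: `outputs = yield from gen` forwards every event the sub-generator yields to the caller, then binds the sub-generator's `return` value (carried in `StopIteration.value`) to `outputs`. A minimal demonstration:

```python
from collections.abc import Generator
from typing import Any


def sub() -> Generator[str, None, dict[str, Any]]:
    yield "event-1"
    yield "event-2"
    return {"ext_1": {"text": "done"}}  # becomes the value of `yield from sub()`


def parent() -> Generator[str, None, None]:
    outputs = yield from sub()  # events pass through; return value is captured
    yield f"outputs={outputs}"


print(list(parent()))
# → ['event-1', 'event-2', "outputs={'ext_1': {'text': 'done'}}"]
```

This is why the virtual-node machinery can stream node events to the engine while still handing the parent node a plain dict of sub-node outputs once streaming is done.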
api/core/workflow/nodes/base/virtual_node_executor.py (new file, 213 lines)
@@ -0,0 +1,213 @@
"""
|
||||
Virtual Node Executor for running embedded sub-nodes within a parent node.
|
||||
|
||||
This module handles the execution of virtual nodes defined in a parent node's
|
||||
`virtual_nodes` configuration. Virtual nodes are complete node definitions
|
||||
that execute before the parent node.
|
||||
|
||||
Example configuration:
|
||||
virtual_nodes:
|
||||
- id: ext_1
|
||||
type: llm
|
||||
data:
|
||||
model: {...}
|
||||
prompt_template: [...]
|
||||
"""
|
||||
|
||||
import time
|
||||
from collections.abc import Generator
|
||||
from typing import TYPE_CHECKING, Any
|
||||
from uuid import uuid4
|
||||
|
||||
from core.workflow.enums import NodeType
|
||||
from core.workflow.graph_events import (
|
||||
GraphNodeEventBase,
|
||||
NodeRunFailedEvent,
|
||||
NodeRunRetryEvent,
|
||||
NodeRunStartedEvent,
|
||||
NodeRunSucceededEvent,
|
||||
)
|
||||
from libs.datetime_utils import naive_utc_now
|
||||
|
||||
from .entities import RetryConfig, VirtualNodeConfig
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from core.workflow.entities import GraphInitParams
|
||||
from core.workflow.runtime import GraphRuntimeState
|
||||
|
||||
|
||||
class VirtualNodeExecutionError(Exception):
|
||||
"""Error during virtual node execution"""
|
||||
|
||||
def __init__(self, node_id: str, original_error: Exception):
|
||||
self.node_id = node_id
|
||||
self.original_error = original_error
|
||||
super().__init__(f"Virtual node {node_id} execution failed: {original_error}")
|
||||
|
||||
|
||||
class VirtualNodeExecutor:
|
||||
"""
|
||||
Executes virtual sub-nodes embedded within a parent node.
|
||||
|
||||
Virtual nodes are complete node definitions that execute before the parent node.
|
||||
Each virtual node:
|
||||
- Has its own global ID: "{parent_id}.{local_id}"
|
||||
- Generates standard node events
|
||||
- Stores outputs in the variable pool
|
||||
- Supports retry via parent node's retry config
|
||||
"""
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
*,
|
||||
graph_init_params: "GraphInitParams",
|
||||
graph_runtime_state: "GraphRuntimeState",
|
||||
parent_node_id: str,
|
||||
parent_retry_config: RetryConfig | None = None,
|
||||
):
|
||||
self._graph_init_params = graph_init_params
|
||||
self._graph_runtime_state = graph_runtime_state
|
||||
self._parent_node_id = parent_node_id
|
||||
self._parent_retry_config = parent_retry_config or RetryConfig()
|
||||
|
||||
def execute_virtual_nodes(
|
||||
self,
|
||||
virtual_nodes: list[VirtualNodeConfig],
|
||||
) -> Generator[GraphNodeEventBase, None, dict[str, Any]]:
|
||||
"""
|
||||
Execute all virtual nodes in order.
|
||||
|
||||
Args:
|
||||
virtual_nodes: List of virtual node configurations
|
||||
|
||||
Yields:
|
||||
Node events from each virtual node execution
|
||||
|
||||
Returns:
|
||||
dict mapping local_id -> outputs dict
|
||||
"""
|
||||
results: dict[str, Any] = {}
|
||||
|
||||
for vnode_config in virtual_nodes:
|
||||
global_id = vnode_config.get_global_id(self._parent_node_id)
|
||||
|
||||
# Execute with retry
|
||||
outputs = yield from self._execute_with_retry(vnode_config, global_id)
|
||||
results[vnode_config.id] = outputs
|
||||
|
||||
return results
|
||||
|
||||
def _execute_with_retry(
|
||||
self,
|
||||
vnode_config: VirtualNodeConfig,
|
||||
global_id: str,
|
||||
) -> Generator[GraphNodeEventBase, None, dict[str, Any]]:
|
||||
"""
|
||||
Execute virtual node with retry support.
|
||||
"""
|
||||
retry_config = self._parent_retry_config
|
||||
last_error: Exception | None = None
|
||||
|
||||
for attempt in range(retry_config.max_retries + 1):
|
||||
try:
|
||||
return (yield from self._execute_single_node(vnode_config, global_id))
|
||||
except Exception as e:
|
||||
last_error = e
|
||||
|
||||
if attempt < retry_config.max_retries:
|
||||
# Yield retry event
|
||||
yield NodeRunRetryEvent(
|
||||
id=str(uuid4()),
|
||||
node_id=global_id,
|
||||
node_type=self._get_node_type(vnode_config.type),
|
||||
node_title=vnode_config.data.get("title", f"Virtual: {vnode_config.id}"),
|
||||
start_at=naive_utc_now(),
|
||||
error=str(e),
|
||||
retry_index=attempt + 1,
|
||||
)
|
||||
|
||||
time.sleep(retry_config.retry_interval_seconds)
|
||||
continue
|
||||
|
||||
raise VirtualNodeExecutionError(global_id, e) from e
|
||||
|
||||
raise last_error or VirtualNodeExecutionError(global_id, Exception("Unknown error"))
|
||||
|
||||
def _execute_single_node(
|
||||
self,
|
||||
vnode_config: VirtualNodeConfig,
|
||||
global_id: str,
|
||||
) -> Generator[GraphNodeEventBase, None, dict[str, Any]]:
|
||||
"""
|
||||
Execute a single virtual node by instantiating and running it.
|
||||
"""
|
||||
from core.workflow.nodes.node_mapping import LATEST_VERSION, NODE_TYPE_CLASSES_MAPPING
|
||||
|
||||
# Build node config
|
||||
node_config: dict[str, Any] = {
|
||||
"id": global_id,
|
||||
"data": {
|
||||
**vnode_config.data,
|
||||
"title": vnode_config.data.get("title", f"Virtual: {vnode_config.id}"),
|
||||
},
|
||||
}
|
||||
|
||||
# Get the node class for this type
|
||||
node_type = self._get_node_type(vnode_config.type)
|
||||
node_mapping = NODE_TYPE_CLASSES_MAPPING.get(node_type)
|
||||
if not node_mapping:
|
||||
raise ValueError(f"No class mapping found for node type: {node_type}")
|
||||
|
||||
node_version = str(vnode_config.data.get("version", "1"))
|
||||
node_cls = node_mapping.get(node_version) or node_mapping.get(LATEST_VERSION)
|
||||
if not node_cls:
|
||||
raise ValueError(f"No class found for node type: {node_type}")
|
||||
|
||||
# Instantiate the node
|
||||
node = node_cls(
|
||||
id=global_id,
|
||||
config=node_config,
|
||||
graph_init_params=self._graph_init_params,
|
||||
graph_runtime_state=self._graph_runtime_state,
|
||||
)
|
||||
|
||||
# Run and collect events
|
||||
outputs: dict[str, Any] = {}
|
||||
|
||||
for event in node.run():
|
||||
# Mark event as coming from virtual node
|
||||
self._mark_event_as_virtual(event, vnode_config)
|
||||
yield event
|
||||
|
||||
if isinstance(event, NodeRunSucceededEvent):
|
||||
outputs = event.node_run_result.outputs or {}
|
||||
elif isinstance(event, NodeRunFailedEvent):
|
||||
raise Exception(event.error or "Virtual node execution failed")
|
||||
|
||||
return outputs
|
||||
|
||||
def _mark_event_as_virtual(
|
||||
self,
|
||||
event: GraphNodeEventBase,
|
||||
vnode_config: VirtualNodeConfig,
|
||||
) -> None:
|
||||
"""Mark event as coming from a virtual node."""
|
||||
if isinstance(event, NodeRunStartedEvent):
|
||||
event.is_virtual = True
|
||||
event.parent_node_id = self._parent_node_id
|
||||
|
||||
def _get_node_type(self, type_str: str) -> NodeType:
|
||||
"""Convert type string to NodeType enum."""
|
||||
type_mapping = {
|
||||
"llm": NodeType.LLM,
|
||||
"code": NodeType.CODE,
|
||||
"tool": NodeType.TOOL,
|
||||
"if-else": NodeType.IF_ELSE,
|
||||
"question-classifier": NodeType.QUESTION_CLASSIFIER,
|
||||
"parameter-extractor": NodeType.PARAMETER_EXTRACTOR,
|
||||
"template-transform": NodeType.TEMPLATE_TRANSFORM,
|
||||
"variable-assigner": NodeType.VARIABLE_ASSIGNER,
|
||||
"http-request": NodeType.HTTP_REQUEST,
|
||||
"knowledge-retrieval": NodeType.KNOWLEDGE_RETRIEVAL,
|
||||
}
|
||||
return type_mapping.get(type_str, NodeType.LLM)
|
||||
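`_execute_with_retry` runs each virtual node up to `max_retries` extra attempts, sleeping `retry_interval_seconds` and yielding a retry event between tries. Stripped of the node and event machinery, the control flow reduces to this sketch (function and parameter names are illustrative, not the repository's API):

```python
import time


def run_with_retry(fn, max_retries: int = 2, retry_interval_seconds: float = 0.0):
    """Call fn(); on failure, retry up to max_retries extra attempts."""
    last_error: Exception | None = None
    for attempt in range(max_retries + 1):
        try:
            return fn()
        except Exception as e:
            last_error = e
            if attempt < max_retries:
                # A real implementation would emit a retry event here.
                time.sleep(retry_interval_seconds)
                continue
            raise  # final attempt: propagate
    raise last_error  # defensive tail, mirrors the unreachable raise in the diff


calls = {"n": 0}


def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient")
    return "ok"


print(run_with_retry(flaky, max_retries=2))
# → ok (succeeds on the third attempt)
```

Note that `range(max_retries + 1)` means `max_retries` counts *extra* attempts beyond the first, matching the loop bound in the diff.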
@@ -8,12 +8,13 @@ from configs import dify_config
 from core.app.entities.app_invoke_entities import ModelConfigWithCredentialsEntity
 from core.entities.provider_entities import ProviderQuotaType, QuotaUnit
 from core.file.models import File
-from core.memory.token_buffer_memory import TokenBufferMemory
+from core.memory import NodeTokenBufferMemory, TokenBufferMemory
+from core.memory.base import BaseMemory
 from core.model_manager import ModelInstance, ModelManager
 from core.model_runtime.entities.llm_entities import LLMUsage
 from core.model_runtime.entities.model_entities import ModelType
 from core.model_runtime.model_providers.__base.large_language_model import LargeLanguageModel
-from core.prompt.entities.advanced_prompt_entities import MemoryConfig
+from core.prompt.entities.advanced_prompt_entities import MemoryConfig, MemoryMode
 from core.variables.segments import ArrayAnySegment, ArrayFileSegment, FileSegment, NoneSegment, StringSegment
 from core.workflow.enums import SystemVariableKey
 from core.workflow.nodes.llm.entities import ModelConfig
@@ -86,25 +87,56 @@ def fetch_files(variable_pool: VariablePool, selector: Sequence[str]) -> Sequenc


 def fetch_memory(
-    variable_pool: VariablePool, app_id: str, node_data_memory: MemoryConfig | None, model_instance: ModelInstance
-) -> TokenBufferMemory | None:
+    variable_pool: VariablePool,
+    app_id: str,
+    tenant_id: str,
+    node_data_memory: MemoryConfig | None,
+    model_instance: ModelInstance,
+    node_id: str = "",
+) -> BaseMemory | None:
+    """
+    Fetch memory based on configuration mode.
+
+    Returns TokenBufferMemory for conversation mode (default),
+    or NodeTokenBufferMemory for node mode (Chatflow only).
+
+    :param variable_pool: Variable pool containing system variables
+    :param app_id: Application ID
+    :param tenant_id: Tenant ID
+    :param node_data_memory: Memory configuration
+    :param model_instance: Model instance for token counting
+    :param node_id: Node ID in the workflow (required for node mode)
+    :return: Memory instance or None if not applicable
+    """
     if not node_data_memory:
         return None

-    # get conversation id
+    # Get conversation_id from variable pool (required for both modes in Chatflow)
     conversation_id_variable = variable_pool.get(["sys", SystemVariableKey.CONVERSATION_ID])
     if not isinstance(conversation_id_variable, StringSegment):
         return None
     conversation_id = conversation_id_variable.value

-    with Session(db.engine, expire_on_commit=False) as session:
-        stmt = select(Conversation).where(Conversation.app_id == app_id, Conversation.id == conversation_id)
-        conversation = session.scalar(stmt)
-        if not conversation:
-            return None
-
-        memory = TokenBufferMemory(conversation=conversation, model_instance=model_instance)
-    return memory
+    # Return appropriate memory type based on mode
+    if node_data_memory.mode == MemoryMode.NODE:
+        # Node-level memory (Chatflow only)
+        if not node_id:
+            return None
+
+        return NodeTokenBufferMemory(
+            app_id=app_id,
+            conversation_id=conversation_id,
+            node_id=node_id,
+            tenant_id=tenant_id,
+            model_instance=model_instance,
+        )
+    else:
+        # Conversation-level memory (default)
+        with Session(db.engine, expire_on_commit=False) as session:
+            stmt = select(Conversation).where(Conversation.app_id == app_id, Conversation.id == conversation_id)
+            conversation = session.scalar(stmt)
+            if not conversation:
+                return None
+            return TokenBufferMemory(conversation=conversation, model_instance=model_instance)


 def deduct_llm_quota(tenant_id: str, model_instance: ModelInstance, usage: LLMUsage):
@@ -16,7 +16,8 @@ from core.file import File, FileTransferMethod, FileType, file_manager
 from core.helper.code_executor import CodeExecutor, CodeLanguage
 from core.llm_generator.output_parser.errors import OutputParserError
 from core.llm_generator.output_parser.structured_output import invoke_llm_with_structured_output
-from core.memory.token_buffer_memory import TokenBufferMemory
+from core.memory.base import BaseMemory
+from core.memory.node_token_buffer_memory import NodeTokenBufferMemory
 from core.model_manager import ModelInstance, ModelManager
 from core.model_runtime.entities import (
     ImagePromptMessageContent,
@@ -208,8 +209,10 @@ class LLMNode(Node[LLMNodeData]):
         memory = llm_utils.fetch_memory(
             variable_pool=variable_pool,
             app_id=self.app_id,
+            tenant_id=self.tenant_id,
             node_data_memory=self.node_data.memory,
             model_instance=model_instance,
+            node_id=self._node_id,
         )

         query: str | None = None
@@ -301,12 +304,41 @@ class LLMNode(Node[LLMNodeData]):
             "reasoning_content": reasoning_content,
             "usage": jsonable_encoder(usage),
             "finish_reason": finish_reason,
+            "context": self._build_context(prompt_messages, clean_text, model_config.mode),
         }
         if structured_output:
             outputs["structured_output"] = structured_output.structured_output
         if self._file_outputs:
             outputs["files"] = ArrayFileSegment(value=self._file_outputs)

+        # Write to Node Memory if in node memory mode
+        if isinstance(memory, NodeTokenBufferMemory):
+            # Get workflow_run_id as the key for this execution
+            workflow_run_id_var = variable_pool.get(["sys", SystemVariableKey.WORKFLOW_EXECUTION_ID])
+            workflow_run_id = workflow_run_id_var.value if isinstance(workflow_run_id_var, StringSegment) else ""
+
+            if workflow_run_id:
+                # Resolve the query template to get actual user content
+                # query may be a template like "{{#sys.query#}}" or "{{#node_id.output#}}"
+                actual_query = variable_pool.convert_template(query or "").text
+
+                # Get user files from sys.files
+                user_files_var = variable_pool.get(["sys", SystemVariableKey.FILES])
+                user_files: list[File] = []
+                if isinstance(user_files_var, ArrayFileSegment):
+                    user_files = list(user_files_var.value)
+                elif isinstance(user_files_var, FileSegment):
+                    user_files = [user_files_var.value]
+
+                memory.add_messages(
+                    workflow_run_id=workflow_run_id,
+                    user_content=actual_query,
+                    user_files=user_files,
+                    assistant_content=clean_text,
+                    assistant_files=self._file_outputs,
+                )
+                memory.flush()
+
         # Send final chunk event to indicate streaming is complete
         yield StreamChunkEvent(
             selector=[self._node_id, "text"],
@@ -566,6 +598,22 @@ class LLMNode(Node[LLMNodeData]):
         # Separated mode: always return clean text and reasoning_content
         return clean_text, reasoning_content or ""

+    @staticmethod
+    def _build_context(
+        prompt_messages: Sequence[PromptMessage],
+        assistant_response: str,
+        model_mode: str,
+    ) -> list[dict[str, Any]]:
+        """
+        Build context from prompt messages and assistant response.
+        Excludes system messages and includes the current LLM response.
+        """
+        context_messages: list[PromptMessage] = [m for m in prompt_messages if m.role != PromptMessageRole.SYSTEM]
+        context_messages.append(AssistantPromptMessage(content=assistant_response))
+        return PromptMessageUtil.prompt_messages_to_prompt_for_saving(
+            model_mode=model_mode, prompt_messages=context_messages
+        )
+
     def _transform_chat_messages(
         self, messages: Sequence[LLMNodeChatModelMessage] | LLMNodeCompletionModelPromptTemplate, /
     ) -> Sequence[LLMNodeChatModelMessage] | LLMNodeCompletionModelPromptTemplate:
@@ -778,7 +826,7 @@ class LLMNode(Node[LLMNodeData]):
         sys_query: str | None = None,
         sys_files: Sequence[File],
         context: str | None = None,
-        memory: TokenBufferMemory | None = None,
+        memory: BaseMemory | None = None,
         model_config: ModelConfigWithCredentialsEntity,
         prompt_template: Sequence[LLMNodeChatModelMessage] | LLMNodeCompletionModelPromptTemplate,
         memory_config: MemoryConfig | None = None,
@@ -1337,7 +1385,7 @@ def _calculate_rest_token(

 def _handle_memory_chat_mode(
     *,
-    memory: TokenBufferMemory | None,
+    memory: BaseMemory | None,
     memory_config: MemoryConfig | None,
     model_config: ModelConfigWithCredentialsEntity,
 ) -> Sequence[PromptMessage]:
@@ -1354,7 +1402,7 @@ def _handle_memory_chat_mode(

 def _handle_memory_completion_mode(
     *,
-    memory: TokenBufferMemory | None,
+    memory: BaseMemory | None,
     memory_config: MemoryConfig | None,
     model_config: ModelConfigWithCredentialsEntity,
 ) -> str:
|
||||
)
|
        return

-        # get parameters
+        # get parameters (use virtual_node_outputs from base class)
        tool_parameters = tool_runtime.get_merged_runtime_parameters() or []
        parameters = self._generate_parameters(
            tool_parameters=tool_parameters,
            variable_pool=self.graph_runtime_state.variable_pool,
            node_data=self.node_data,
+            virtual_node_outputs=self.virtual_node_outputs,
        )
        parameters_for_log = self._generate_parameters(
            tool_parameters=tool_parameters,
            variable_pool=self.graph_runtime_state.variable_pool,
            node_data=self.node_data,
            for_log=True,
+            virtual_node_outputs=self.virtual_node_outputs,
        )
        # get conversation id
        conversation_id = self.graph_runtime_state.variable_pool.get(["sys", SystemVariableKey.CONVERSATION_ID])

@@ -176,6 +178,7 @@ class ToolNode(Node[ToolNodeData]):
        variable_pool: "VariablePool",
        node_data: ToolNodeData,
        for_log: bool = False,
+        virtual_node_outputs: dict[str, Any] | None = None,
    ) -> dict[str, Any]:
        """
        Generate parameters based on the given tool parameters, variable pool, and node data.

@@ -184,12 +187,17 @@ class ToolNode(Node[ToolNodeData]):
            tool_parameters (Sequence[ToolParameter]): The list of tool parameters.
            variable_pool (VariablePool): The variable pool containing the variables.
            node_data (ToolNodeData): The data associated with the tool node.
            for_log (bool): Whether to generate parameters for logging.
+            virtual_node_outputs (dict[str, Any] | None): Outputs from virtual sub-nodes.
+                Maps local_id -> outputs dict. Virtual node outputs are also in variable_pool
+                with global IDs like "{parent_id}.{local_id}".

        Returns:
            Mapping[str, Any]: A dictionary containing the generated parameters.

        """
        tool_parameters_dictionary = {parameter.name: parameter for parameter in tool_parameters}
+        virtual_node_outputs = virtual_node_outputs or {}

        result: dict[str, Any] = {}
        for parameter_name in node_data.tool_parameters:

@@ -199,14 +207,25 @@ class ToolNode(Node[ToolNodeData]):
                continue
            tool_input = node_data.tool_parameters[parameter_name]
            if tool_input.type == "variable":
-                variable = variable_pool.get(tool_input.value)
-                if variable is None:
-                    if parameter.required:
-                        raise ToolParameterError(f"Variable {tool_input.value} does not exist")
-                    continue
-                parameter_value = variable.value
+                # Check if this references a virtual node output (local ID like [ext_1, text])
+                selector = tool_input.value
+                if len(selector) >= 2 and selector[0] in virtual_node_outputs:
+                    # Reference to virtual node output
+                    local_id = selector[0]
+                    var_name = selector[1]
+                    outputs = virtual_node_outputs.get(local_id, {})
+                    parameter_value = outputs.get(var_name)
+                else:
+                    # Normal variable reference
+                    variable = variable_pool.get(selector)
+                    if variable is None:
+                        if parameter.required:
+                            raise ToolParameterError(f"Variable {selector} does not exist")
+                        continue
+                    parameter_value = variable.value
            elif tool_input.type in {"mixed", "constant"}:
-                segment_group = variable_pool.convert_template(str(tool_input.value))
+                template = str(tool_input.value)
+                segment_group = variable_pool.convert_template(template)
                parameter_value = segment_group.log if for_log else segment_group.text
            else:
                raise ToolParameterError(f"Unknown tool input type '{tool_input.type}'")
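Stripped of diff markup, the `variable` branch above boils down to one selector-resolution rule: a selector whose first segment matches a virtual sub-node's local ID (e.g. `["ext_1", "text"]`) is served from `virtual_node_outputs`; anything else falls back to the variable pool. A minimal standalone sketch of that rule (the dict-based pool here is illustrative, not Dify's `VariablePool`):

```python
def resolve_tool_parameter(selector, virtual_node_outputs, variable_pool):
    """Resolve a tool parameter selector.

    selector: list like ["ext_1", "text"] (virtual node) or ["sys", "query"].
    virtual_node_outputs: maps local_id -> outputs dict, as in the diff above.
    variable_pool: stand-in dict keyed by selector tuples (illustrative only).
    """
    if len(selector) >= 2 and selector[0] in virtual_node_outputs:
        # First segment is a virtual sub-node's local ID: read its outputs.
        return virtual_node_outputs[selector[0]].get(selector[1])
    # Otherwise treat it as a normal variable reference.
    return variable_pool.get(tuple(selector))


outputs = {"ext_1": {"text": "dify workflow"}}
pool = {("sys", "query"): "hello"}
print(resolve_tool_parameter(["ext_1", "text"], outputs, pool))  # dify workflow
print(resolve_tool_parameter(["sys", "query"], outputs, pool))   # hello
```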
api/tests/fixtures/pav-test-extraction.yml (vendored, new file, 266 lines)
@@ -0,0 +1,266 @@
app:
  description: Test for variable extraction feature
  icon: 🤖
  icon_background: '#FFEAD5'
  mode: advanced-chat
  name: pav-test-extraction
  use_icon_as_answer_icon: false
dependencies:
- current_identifier: null
  type: marketplace
  value:
    marketplace_plugin_unique_identifier: langgenius/google:0.0.8@3efcf55ffeef9d0f77715e0afb23534952ae0cb385c051d0637e86d71199d1a6
  version: null
- current_identifier: null
  type: marketplace
  value:
    marketplace_plugin_unique_identifier: langgenius/tongyi:0.1.16@d8bffbe45418f0c117fb3393e5e40e61faee98f9a2183f062e5a280e74b15d21
  version: null
kind: app
version: 0.5.0
workflow:
  conversation_variables: []
  environment_variables: []
  features:
    file_upload:
      allowed_file_extensions:
      - .JPG
      - .JPEG
      - .PNG
      - .GIF
      - .WEBP
      - .SVG
      allowed_file_types:
      - image
      allowed_file_upload_methods:
      - local_file
      - remote_url
      enabled: false
      image:
        enabled: false
        number_limits: 3
        transfer_methods:
        - local_file
        - remote_url
      number_limits: 3
    opening_statement: 你好!我是一个搜索助手,请告诉我你想搜索什么内容。
    retriever_resource:
      enabled: true
    sensitive_word_avoidance:
      enabled: false
    speech_to_text:
      enabled: false
    suggested_questions: []
    suggested_questions_after_answer:
      enabled: false
    text_to_speech:
      enabled: false
      language: ''
      voice: ''
  graph:
    edges:
    - data:
        sourceType: start
        targetType: llm
      id: 1767773675796-llm
      source: '1767773675796'
      sourceHandle: source
      target: llm
      targetHandle: target
      type: custom
    - data:
        isInIteration: false
        isInLoop: false
        sourceType: llm
        targetType: tool
      id: llm-source-1767773709491-target
      source: llm
      sourceHandle: source
      target: '1767773709491'
      targetHandle: target
      type: custom
      zIndex: 0
    - data:
        isInIteration: false
        isInLoop: false
        sourceType: tool
        targetType: answer
      id: tool-source-answer-target
      source: '1767773709491'
      sourceHandle: source
      target: answer
      targetHandle: target
      type: custom
      zIndex: 0
    nodes:
    - data:
        selected: false
        title: User Input
        type: start
        variables: []
      height: 73
      id: '1767773675796'
      position:
        x: 80
        y: 282
      positionAbsolute:
        x: 80
        y: 282
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 242
    - data:
        context:
          enabled: false
          variable_selector: []
        memory:
          query_prompt_template: ''
          role_prefix:
            assistant: ''
            user: ''
          window:
            enabled: true
            size: 10
        model:
          completion_params:
            temperature: 0.7
          mode: chat
          name: qwen-max
          provider: langgenius/tongyi/tongyi
        prompt_template:
        - id: 11d06d15-914a-4915-a5b1-0e35ab4fba51
          role: system
          text: '你是一个智能搜索助手。用户会告诉你他们想搜索的内容。

            请与用户进行对话,了解他们的搜索需求。

            当用户明确表达了想要搜索的内容后,你可以回复"好的,我来帮你搜索"。

            '
        selected: false
        title: LLM
        type: llm
        vision:
          enabled: false
      height: 88
      id: llm
      position:
        x: 380
        y: 282
      positionAbsolute:
        x: 380
        y: 282
      selected: false
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 242
    - data:
        is_team_authorization: true
        paramSchemas:
        - auto_generate: null
          default: null
          form: llm
          human_description:
            en_US: used for searching
            ja_JP: used for searching
            pt_BR: used for searching
            zh_Hans: 用于搜索网页内容
          label:
            en_US: Query string
            ja_JP: Query string
            pt_BR: Query string
            zh_Hans: 查询语句
          llm_description: key words for searching
          max: null
          min: null
          name: query
          options: []
          placeholder: null
          precision: null
          required: true
          scope: null
          template: null
          type: string
        params:
          query: ''
        plugin_id: langgenius/google
        plugin_unique_identifier: langgenius/google:0.0.8@3efcf55ffeef9d0f77715e0afb23534952ae0cb385c051d0637e86d71199d1a6
        provider_icon: http://localhost:5001/console/api/workspaces/current/plugin/icon?tenant_id=7217e801-f6f5-49ec-8103-d7de97a4b98f&filename=1c5871163478957bac64c3fe33d72d003f767497d921c74b742aad27a8344a74.svg
        provider_id: langgenius/google/google
        provider_name: langgenius/google/google
        provider_type: builtin
        selected: false
        title: GoogleSearch
        tool_configurations: {}
        tool_description: A tool for performing a Google SERP search and extracting
          snippets and webpages.Input should be a search query.
        tool_label: GoogleSearch
        tool_name: google_search
        tool_node_version: '2'
        tool_parameters:
          query:
            type: variable
            value:
            - ext_1
            - text
        type: tool
        virtual_nodes:
        - data:
            model:
              completion_params:
                temperature: 0.7
              mode: chat
              name: qwen-max
              provider: langgenius/tongyi/tongyi
            context:
              enabled: false
            prompt_template:
            - role: user
              text: '{{#llm.context#}}'
            - role: user
              text: 请从对话历史中提取用户想要搜索的关键词,只返回关键词本身,不要返回其他内容
            title: 提取搜索关键词
          id: ext_1
          type: llm
      height: 52
      id: '1767773709491'
      position:
        x: 682
        y: 282
      positionAbsolute:
        x: 682
        y: 282
      selected: false
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 242
    - data:
        answer: '搜索结果:

          {{#1767773709491.text#}}

          '
        selected: false
        title: Answer
        type: answer
      height: 103
      id: answer
      position:
        x: 984
        y: 282
      positionAbsolute:
        x: 984
        y: 282
      selected: true
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 242
    viewport:
      x: 151
      y: 141.5
      zoom: 1
  rag_pipeline_variables: []
@@ -0,0 +1,77 @@
"""
Unit tests for virtual node configuration.
"""

from core.workflow.nodes.base.entities import VirtualNodeConfig


class TestVirtualNodeConfig:
    """Tests for VirtualNodeConfig entity."""

    def test_create_basic_config(self):
        """Test creating a basic virtual node config."""
        config = VirtualNodeConfig(
            id="ext_1",
            type="llm",
            data={
                "title": "Extract keywords",
                "model": {"provider": "openai", "name": "gpt-4o-mini"},
            },
        )

        assert config.id == "ext_1"
        assert config.type == "llm"
        assert config.data["title"] == "Extract keywords"

    def test_get_global_id(self):
        """Test generating global ID from parent ID."""
        config = VirtualNodeConfig(
            id="ext_1",
            type="llm",
            data={},
        )

        global_id = config.get_global_id("tool1")
        assert global_id == "tool1.ext_1"

    def test_get_global_id_with_different_parents(self):
        """Test global ID generation with different parent IDs."""
        config = VirtualNodeConfig(id="sub_node", type="code", data={})

        assert config.get_global_id("parent1") == "parent1.sub_node"
        assert config.get_global_id("node_123") == "node_123.sub_node"

    def test_empty_data(self):
        """Test virtual node config with empty data."""
        config = VirtualNodeConfig(
            id="test",
            type="tool",
        )

        assert config.id == "test"
        assert config.type == "tool"
        assert config.data == {}

    def test_complex_data(self):
        """Test virtual node config with complex data."""
        config = VirtualNodeConfig(
            id="llm_1",
            type="llm",
            data={
                "title": "Generate summary",
                "model": {
                    "provider": "openai",
                    "name": "gpt-4",
                    "mode": "chat",
                    "completion_params": {"temperature": 0.7, "max_tokens": 500},
                },
                "prompt_template": [
                    {"role": "user", "text": "{{#llm1.context#}}"},
                    {"role": "user", "text": "Please summarize the conversation"},
                ],
            },
        )

        assert config.data["model"]["provider"] == "openai"
        assert len(config.data["prompt_template"]) == 2
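The tests above pin down the behavior expected of `VirtualNodeConfig`: an `id`, a `type`, a `data` dict defaulting to empty, and a `get_global_id` that namespaces the local ID under the parent node. A minimal stand-in that satisfies exactly those tests (a sketch only; the real entity in `core.workflow.nodes.base.entities` is presumably a richer model):

```python
from dataclasses import dataclass, field
from typing import Any


@dataclass
class VirtualNodeConfigSketch:
    """Illustrative stand-in for VirtualNodeConfig as exercised by the tests."""

    id: str
    type: str
    data: dict[str, Any] = field(default_factory=dict)

    def get_global_id(self, parent_id: str) -> str:
        # Global IDs namespace the local ID under the parent node:
        # "{parent_id}.{local_id}", e.g. "tool1.ext_1".
        return f"{parent_id}.{self.id}"


config = VirtualNodeConfigSketch(id="ext_1", type="llm")
print(config.get_global_id("tool1"))  # tool1.ext_1
```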
@@ -5,6 +5,7 @@ import type {
} from 'lexical'
import type { FC } from 'react'
import type {
+  AgentBlockType,
  ContextBlockType,
  CurrentBlockType,
  ErrorMessageBlockType,

@@ -103,6 +104,7 @@ export type PromptEditorProps = {
  currentBlock?: CurrentBlockType
  errorMessageBlock?: ErrorMessageBlockType
  lastRunBlock?: LastRunBlockType
+  agentBlock?: AgentBlockType
  isSupportFileVar?: boolean
}

@@ -128,6 +130,7 @@ const PromptEditor: FC<PromptEditorProps> = ({
  currentBlock,
  errorMessageBlock,
  lastRunBlock,
+  agentBlock,
  isSupportFileVar,
}) => {
  const { eventEmitter } = useEventEmitterContextContext()

@@ -139,6 +142,7 @@ const PromptEditor: FC<PromptEditorProps> = ({
      {
        replace: TextNode,
        with: (node: TextNode) => new CustomTextNode(node.__text),
        withKlass: CustomTextNode,
      },
      ContextBlockNode,
      HistoryBlockNode,

@@ -212,6 +216,22 @@ const PromptEditor: FC<PromptEditorProps> = ({
        lastRunBlock={lastRunBlock}
        isSupportFileVar={isSupportFileVar}
      />
+      {(!agentBlock || agentBlock.show) && (
+        <ComponentPickerBlock
+          triggerString="@"
+          contextBlock={contextBlock}
+          historyBlock={historyBlock}
+          queryBlock={queryBlock}
+          variableBlock={variableBlock}
+          externalToolBlock={externalToolBlock}
+          workflowVariableBlock={workflowVariableBlock}
+          currentBlock={currentBlock}
+          errorMessageBlock={errorMessageBlock}
+          lastRunBlock={lastRunBlock}
+          agentBlock={agentBlock}
+          isSupportFileVar={isSupportFileVar}
+        />
+      )}
      <ComponentPickerBlock
        triggerString="{"
        contextBlock={contextBlock}
@@ -1,6 +1,7 @@
import type { MenuRenderFn } from '@lexical/react/LexicalTypeaheadMenuPlugin'
import type { TextNode } from 'lexical'
import type {
+  AgentBlockType,
  ContextBlockType,
  CurrentBlockType,
  ErrorMessageBlockType,

@@ -20,7 +21,11 @@ import {
} from '@floating-ui/react'
import { useLexicalComposerContext } from '@lexical/react/LexicalComposerContext'
import { LexicalTypeaheadMenuPlugin } from '@lexical/react/LexicalTypeaheadMenuPlugin'
-import { KEY_ESCAPE_COMMAND } from 'lexical'
+import {
+  $getRoot,
+  $insertNodes,
+  KEY_ESCAPE_COMMAND,
+} from 'lexical'
import {
  Fragment,
  memo,

@@ -29,7 +34,9 @@ import {
} from 'react'
import ReactDOM from 'react-dom'
import { GeneratorType } from '@/app/components/app/configuration/config/automatic/types'
+import AgentNodeList from '@/app/components/workflow/nodes/_base/components/agent-node-list'
import VarReferenceVars from '@/app/components/workflow/nodes/_base/components/variable/var-reference-vars'
+import { BlockEnum } from '@/app/components/workflow/types'
import { useEventEmitterContextContext } from '@/context/event-emitter'
import { useBasicTypeaheadTriggerMatch } from '../../hooks'
import { $splitNodeContainingQuery } from '../../utils'

@@ -38,6 +45,7 @@ import { INSERT_ERROR_MESSAGE_BLOCK_COMMAND } from '../error-message-block'
import { INSERT_LAST_RUN_BLOCK_COMMAND } from '../last-run-block'
import { INSERT_VARIABLE_VALUE_BLOCK_COMMAND } from '../variable-block'
import { INSERT_WORKFLOW_VARIABLE_BLOCK_COMMAND } from '../workflow-variable-block'
+import { $createWorkflowVariableBlockNode } from '../workflow-variable-block/node'
import { useOptions } from './hooks'

type ComponentPickerProps = {

@@ -51,6 +59,7 @@ type ComponentPickerProps = {
  currentBlock?: CurrentBlockType
  errorMessageBlock?: ErrorMessageBlockType
  lastRunBlock?: LastRunBlockType
+  agentBlock?: AgentBlockType
  isSupportFileVar?: boolean
}
const ComponentPicker = ({

@@ -64,6 +73,7 @@ const ComponentPicker = ({
  currentBlock,
  errorMessageBlock,
  lastRunBlock,
+  agentBlock,
  isSupportFileVar,
}: ComponentPickerProps) => {
  const { eventEmitter } = useEventEmitterContextContext()

@@ -151,12 +161,41 @@ const ComponentPicker = ({
    editor.dispatchCommand(KEY_ESCAPE_COMMAND, escapeEvent)
  }, [editor])

+  const handleSelectAgent = useCallback((agent: { id: string, title: string }) => {
+    editor.update(() => {
+      const needRemove = $splitNodeContainingQuery(checkForTriggerMatch(triggerString, editor)!)
+      if (needRemove)
+        needRemove.remove()
+
+      const root = $getRoot()
+      const firstChild = root.getFirstChild()
+      if (firstChild) {
+        const selection = firstChild.selectStart()
+        if (selection) {
+          const workflowVariableBlockNode = $createWorkflowVariableBlockNode([agent.id, 'text'], {}, undefined)
+          $insertNodes([workflowVariableBlockNode])
+        }
+      }
+    })
+    agentBlock?.onSelect?.(agent)
+    handleClose()
+  }, [editor, checkForTriggerMatch, triggerString, agentBlock, handleClose])
+
+  const isAgentTrigger = triggerString === '@' && agentBlock?.show
+  const agentNodes = agentBlock?.agentNodes || []
+
  const renderMenu = useCallback<MenuRenderFn<PickerBlockMenuOption>>((
    anchorElementRef,
    { options, selectedIndex, selectOptionAndCleanUp, setHighlightedIndex },
  ) => {
-    if (!(anchorElementRef.current && (allFlattenOptions.length || workflowVariableBlock?.show)))
-      return null
+    if (isAgentTrigger) {
+      if (!(anchorElementRef.current && agentNodes.length))
+        return null
+    }
+    else {
+      if (!(anchorElementRef.current && (allFlattenOptions.length || workflowVariableBlock?.show)))
+        return null
+    }

    setTimeout(() => {
      if (anchorElementRef.current)

@@ -167,9 +206,6 @@ const ComponentPicker = ({
    <>
      {
        ReactDOM.createPortal(
-          // The `LexicalMenu` will try to calculate the position of the floating menu based on the first child.
-          // Since we use floating ui, we need to wrap it with a div to prevent the position calculation being affected.
-          // See https://github.com/facebook/lexical/blob/ac97dfa9e14a73ea2d6934ff566282d7f758e8bb/packages/lexical-react/src/shared/LexicalMenu.ts#L493
          <div className="h-0 w-0">
            <div
              className="w-[260px] rounded-lg border-[0.5px] border-components-panel-border bg-components-panel-bg-blur p-1 shadow-lg"

@@ -179,56 +215,73 @@ const ComponentPicker = ({
              }}
              ref={refs.setFloating}
            >
-              {
-                workflowVariableBlock?.show && (
-                  <div className="p-1">
-                    <VarReferenceVars
-                      searchBoxClassName="mt-1"
-                      vars={workflowVariableOptions}
-                      onChange={(variables: string[]) => {
-                        handleSelectWorkflowVariable(variables)
-                      }}
-                      maxHeightClass="max-h-[34vh]"
-                      isSupportFileVar={isSupportFileVar}
-                      onClose={handleClose}
-                      onBlur={handleClose}
-                      showManageInputField={workflowVariableBlock.showManageInputField}
-                      onManageInputField={workflowVariableBlock.onManageInputField}
-                      autoFocus={false}
-                      isInCodeGeneratorInstructionEditor={currentBlock?.generatorType === GeneratorType.code}
-                    />
-                  </div>
-                )
-              }
-              {
-                workflowVariableBlock?.show && !!options.length && (
-                  <div className="my-1 h-px w-full -translate-x-1 bg-divider-subtle"></div>
-                )
-              }
-              <div>
-                {
-                  options.map((option, index) => (
-                    <Fragment key={option.key}>
-                      {
-                        // Divider
-                        index !== 0 && options.at(index - 1)?.group !== option.group && (
-                          <div className="my-1 h-px w-full -translate-x-1 bg-divider-subtle"></div>
-                        )
-                      }
-                      {option.renderMenuOption({
-                        queryString,
-                        isSelected: selectedIndex === index,
-                        onSelect: () => {
-                          selectOptionAndCleanUp(option)
-                        },
-                        onSetHighlight: () => {
-                          setHighlightedIndex(index)
-                        },
-                      })}
-                    </Fragment>
-                  ))
-                }
-              </div>
+              {isAgentTrigger
+                ? (
+                  <AgentNodeList
+                    nodes={agentNodes.map(node => ({
+                      ...node,
+                      type: BlockEnum.Agent,
+                    }))}
+                    onSelect={handleSelectAgent}
+                    onClose={handleClose}
+                    maxHeightClass="max-h-[34vh]"
+                  />
+                )
+                : (
+                  <>
+                    {
+                      workflowVariableBlock?.show && (
+                        <div className="p-1">
+                          <VarReferenceVars
+                            searchBoxClassName="mt-1"
+                            vars={workflowVariableOptions}
+                            onChange={(variables: string[]) => {
+                              handleSelectWorkflowVariable(variables)
+                            }}
+                            maxHeightClass="max-h-[34vh]"
+                            isSupportFileVar={isSupportFileVar}
+                            onClose={handleClose}
+                            onBlur={handleClose}
+                            showManageInputField={workflowVariableBlock.showManageInputField}
+                            onManageInputField={workflowVariableBlock.onManageInputField}
+                            autoFocus={false}
+                            isInCodeGeneratorInstructionEditor={currentBlock?.generatorType === GeneratorType.code}
+                          />
+                        </div>
+                      )
+                    }
+                    {
+                      workflowVariableBlock?.show && !!options.length && (
+                        <div className="my-1 h-px w-full -translate-x-1 bg-divider-subtle"></div>
+                      )
+                    }
+                    <div>
+                      {
+                        options.map((option, index) => (
+                          <Fragment key={option.key}>
+                            {
+                              index !== 0 && options.at(index - 1)?.group !== option.group && (
+                                <div className="my-1 h-px w-full -translate-x-1 bg-divider-subtle"></div>
+                              )
+                            }
+                            {option.renderMenuOption({
+                              queryString,
+                              isSelected: selectedIndex === index,
+                              onSelect: () => {
+                                selectOptionAndCleanUp(option)
+                              },
+                              onSetHighlight: () => {
+                                setHighlightedIndex(index)
+                              },
+                            })}
+                          </Fragment>
+                        ))
+                      }
+                    </div>
+                  </>
+                )}
            </div>
          </div>,
          anchorElementRef.current,

@@ -236,7 +289,7 @@ const ComponentPicker = ({
      }
    </>
    )
-  }, [allFlattenOptions.length, workflowVariableBlock?.show, floatingStyles, isPositioned, refs, workflowVariableOptions, isSupportFileVar, handleClose, currentBlock?.generatorType, handleSelectWorkflowVariable, queryString, workflowVariableBlock?.showManageInputField, workflowVariableBlock?.onManageInputField])
+  }, [isAgentTrigger, agentNodes, allFlattenOptions.length, workflowVariableBlock?.show, floatingStyles, isPositioned, refs, handleSelectAgent, handleClose, workflowVariableOptions, isSupportFileVar, currentBlock?.generatorType, handleSelectWorkflowVariable, queryString, workflowVariableBlock?.showManageInputField, workflowVariableBlock?.onManageInputField])

  return (
    <LexicalTypeaheadMenuPlugin
@@ -21,6 +21,7 @@ import {
  VariableLabelInEditor,
} from '@/app/components/workflow/nodes/_base/components/variable/variable-label'
import { Type } from '@/app/components/workflow/nodes/llm/types'
+import { BlockEnum } from '@/app/components/workflow/types'
import { isExceptionVariable } from '@/app/components/workflow/utils'
import { useSelectOrDelete } from '../../hooks'
import {

@@ -66,6 +67,7 @@ const WorkflowVariableBlockComponent = ({
  )()
  const [localWorkflowNodesMap, setLocalWorkflowNodesMap] = useState<WorkflowNodesMap>(workflowNodesMap)
  const node = localWorkflowNodesMap![variables[isRagVar ? 1 : 0]]
+  const isAgentVariable = node?.type === BlockEnum.Agent

  const isException = isExceptionVariable(varName, node?.type)
  const variableValid = useMemo(() => {

@@ -134,6 +136,9 @@ const WorkflowVariableBlockComponent = ({
    })
  }, [node, reactflow, store])

+  if (isAgentVariable)
+    return <span className="hidden" ref={ref} />
+
  const Item = (
    <VariableLabelInEditor
      nodeType={node?.type}

@@ -73,6 +73,17 @@ export type WorkflowVariableBlockType = {
  onManageInputField?: () => void
}

+export type AgentNode = {
+  id: string
+  title: string
+}
+
+export type AgentBlockType = {
+  show?: boolean
+  agentNodes?: AgentNode[]
+  onSelect?: (agent: AgentNode) => void
+}
+
export type MenuTextMatch = {
  leadOffset: number
  matchingString: string
@@ -1,6 +1,7 @@
import type { FC } from 'react'
import { memo } from 'react'
import AppIcon from '@/app/components/base/app-icon'
+import { Folder as FolderLine } from '@/app/components/base/icons/src/vender/line/files'
import {
  Agent,
  Answer,

@@ -54,6 +55,7 @@ const DEFAULT_ICON_MAP: Record<BlockEnum, React.ComponentType<{ className: strin
  [BlockEnum.TemplateTransform]: TemplatingTransform,
  [BlockEnum.VariableAssigner]: VariableX,
  [BlockEnum.VariableAggregator]: VariableX,
+  [BlockEnum.Group]: FolderLine,
  [BlockEnum.Assigner]: Assigner,
  [BlockEnum.Tool]: VariableX,
  [BlockEnum.IterationStart]: VariableX,

@@ -97,6 +99,7 @@ const ICON_CONTAINER_BG_COLOR_MAP: Record<string, string> = {
  [BlockEnum.VariableAssigner]: 'bg-util-colors-blue-blue-500',
  [BlockEnum.VariableAggregator]: 'bg-util-colors-blue-blue-500',
  [BlockEnum.Tool]: 'bg-util-colors-blue-blue-500',
+  [BlockEnum.Group]: 'bg-util-colors-blue-blue-500',
  [BlockEnum.Assigner]: 'bg-util-colors-blue-blue-500',
  [BlockEnum.ParameterExtractor]: 'bg-util-colors-blue-blue-500',
  [BlockEnum.DocExtractor]: 'bg-util-colors-green-green-500',
@@ -25,7 +25,7 @@ import {
  useAvailableBlocks,
  useNodesInteractions,
} from './hooks'
-import { NodeRunningStatus } from './types'
+import { BlockEnum, NodeRunningStatus } from './types'
import { getEdgeColor } from './utils'

const CustomEdge = ({

@@ -136,7 +136,7 @@ const CustomEdge = ({
          stroke,
          strokeWidth: 2,
          opacity: data._dimmed ? 0.3 : (data._waitingRun ? 0.7 : 1),
-          strokeDasharray: data._isTemp ? '8 8' : undefined,
+          strokeDasharray: (data._isTemp && data.sourceType !== BlockEnum.Group && data.targetType !== BlockEnum.Group) ? '8 8' : undefined,
        }}
      />
      <EdgeLabelRenderer>
web/app/components/workflow/custom-group-node/constants.ts (new file, 11 lines)
@@ -0,0 +1,11 @@
export const CUSTOM_GROUP_NODE = 'custom-group'
export const CUSTOM_GROUP_INPUT_NODE = 'custom-group-input'
export const CUSTOM_GROUP_EXIT_PORT_NODE = 'custom-group-exit-port'

export const GROUP_CHILDREN_Z_INDEX = 1002

export const UI_ONLY_GROUP_NODE_TYPES = new Set([
  CUSTOM_GROUP_NODE,
  CUSTOM_GROUP_INPUT_NODE,
  CUSTOM_GROUP_EXIT_PORT_NODE,
])
@@ -0,0 +1,54 @@
'use client'

import type { FC } from 'react'
import type { CustomGroupExitPortNodeData } from './types'
import { memo } from 'react'
import { Handle, Position } from 'reactflow'
import { cn } from '@/utils/classnames'

type CustomGroupExitPortNodeProps = {
  id: string
  data: CustomGroupExitPortNodeData
}

const CustomGroupExitPortNode: FC<CustomGroupExitPortNodeProps> = ({ id: _id, data }) => {
  return (
    <div
      className={cn(
        'flex items-center justify-center',
        'h-8 w-8 rounded-full',
        'bg-util-colors-green-green-500 shadow-md',
        data.selected && 'ring-2 ring-primary-400',
      )}
    >
      {/* Target handle - receives internal connections from leaf nodes */}
      <Handle
        id="target"
        type="target"
        position={Position.Left}
        className="!h-2 !w-2 !border-0 !bg-white"
      />

      {/* Source handle - connects to external nodes */}
      <Handle
        id="source"
        type="source"
        position={Position.Right}
        className="!h-2 !w-2 !border-0 !bg-white"
      />

      {/* Icon */}
      <svg
        className="h-4 w-4 text-white"
        viewBox="0 0 24 24"
        fill="none"
        stroke="currentColor"
        strokeWidth={2}
      >
        <path d="M5 12h14M12 5l7 7-7 7" />
      </svg>
    </div>
  )
}

export default memo(CustomGroupExitPortNode)
@@ -0,0 +1,55 @@
'use client'

import type { FC } from 'react'
import type { CustomGroupInputNodeData } from './types'
import { memo } from 'react'
import { Handle, Position } from 'reactflow'
import { cn } from '@/utils/classnames'

type CustomGroupInputNodeProps = {
  id: string
  data: CustomGroupInputNodeData
}

const CustomGroupInputNode: FC<CustomGroupInputNodeProps> = ({ id: _id, data }) => {
  return (
    <div
      className={cn(
        'flex items-center justify-center',
        'h-8 w-8 rounded-full',
        'bg-util-colors-blue-blue-500 shadow-md',
        data.selected && 'ring-2 ring-primary-400',
      )}
    >
      {/* Target handle - receives external connections */}
      <Handle
        id="target"
        type="target"
        position={Position.Left}
        className="!h-2 !w-2 !border-0 !bg-white"
      />

      {/* Source handle - connects to entry nodes */}
      <Handle
        id="source"
        type="source"
        position={Position.Right}
        className="!h-2 !w-2 !border-0 !bg-white"
      />

      {/* Icon */}
      <svg
        className="h-4 w-4 text-white"
        viewBox="0 0 24 24"
        fill="none"
        stroke="currentColor"
        strokeWidth={2}
      >
        <path d="M9 12l2 2 4-4" />
        <circle cx="12" cy="12" r="10" />
      </svg>
    </div>
  )
}

export default memo(CustomGroupInputNode)
@@ -0,0 +1,94 @@
'use client'

import type { FC } from 'react'
import type { CustomGroupNodeData } from './types'
import { memo } from 'react'
import { Handle, Position } from 'reactflow'
import { Plus02 } from '@/app/components/base/icons/src/vender/line/general'
import { cn } from '@/utils/classnames'

type CustomGroupNodeProps = {
  id: string
  data: CustomGroupNodeData
}

const CustomGroupNode: FC<CustomGroupNodeProps> = ({ data }) => {
  const { group } = data
  const exitPorts = group.exitPorts ?? []
  const connectedSourceHandleIds = data._connectedSourceHandleIds ?? []

  return (
    <div
      className={cn(
        'bg-workflow-block-parma-bg/50 group relative rounded-2xl border-2 border-dashed border-components-panel-border',
        data.selected && 'border-primary-400',
      )}
      style={{
        width: data.width || 280,
        height: data.height || 200,
      }}
    >
      {/* Group Header */}
      <div className="absolute -top-7 left-0 flex items-center gap-1 px-2">
        <span className="text-xs font-medium text-text-tertiary">
          {group.title}
        </span>
      </div>

      {/* Target handle for incoming connections */}
      <Handle
        id="target"
        type="target"
        position={Position.Left}
        className={cn(
          '!h-4 !w-4 !rounded-none !border-none !bg-transparent !outline-none',
          'after:absolute after:left-1.5 after:top-1 after:h-2 after:w-0.5 after:bg-workflow-link-line-handle',
          'transition-all hover:scale-125',
        )}
        style={{ top: '50%' }}
      />

      <div className="px-3 pt-3">
        {exitPorts.map((port, index) => {
          const connected = connectedSourceHandleIds.includes(port.portNodeId)

          return (
            <div key={port.portNodeId} className="relative flex h-6 items-center px-1">
              <div className="w-full text-right text-xs font-semibold text-text-secondary">
                {port.name}
              </div>

              <Handle
                id={port.portNodeId}
                type="source"
                position={Position.Right}
                className={cn(
                  'group/handle z-[1] !h-4 !w-4 !rounded-none !border-none !bg-transparent !outline-none',
                  'after:absolute after:right-1.5 after:top-1 after:h-2 after:w-0.5 after:bg-workflow-link-line-handle',
                  'transition-all hover:scale-125',
                  !connected && 'after:opacity-0',
                  '!-right-[21px] !top-1/2 !-translate-y-1/2',
                )}
                isConnectable
              />

              {/* Visual "+" indicator (styling aligned with existing branch handles) */}
              <div
                className={cn(
                  'pointer-events-none absolute z-10 hidden h-4 w-4 items-center justify-center rounded-full bg-components-button-primary-bg text-text-primary-on-surface',
                  '-right-[21px] top-1/2 -translate-y-1/2',
                  'group-hover:flex',
                  data.selected && '!flex',
                )}
              >
                <Plus02 className="h-2.5 w-2.5" />
              </div>
            </div>
          )
        })}
      </div>
    </div>
  )
}

export default memo(CustomGroupNode)
19  web/app/components/workflow/custom-group-node/index.ts  Normal file
@@ -0,0 +1,19 @@
export {
  CUSTOM_GROUP_EXIT_PORT_NODE,
  CUSTOM_GROUP_INPUT_NODE,
  CUSTOM_GROUP_NODE,
  GROUP_CHILDREN_Z_INDEX,
  UI_ONLY_GROUP_NODE_TYPES,
} from './constants'

export { default as CustomGroupExitPortNode } from './custom-group-exit-port-node'

export { default as CustomGroupInputNode } from './custom-group-input-node'
export { default as CustomGroupNode } from './custom-group-node'
export type {
  CustomGroupExitPortNodeData,
  CustomGroupInputNodeData,
  CustomGroupNodeData,
  ExitPortInfo,
  GroupMember,
} from './types'
82  web/app/components/workflow/custom-group-node/types.ts  Normal file
@@ -0,0 +1,82 @@
import type { BlockEnum } from '../types'

/**
 * Exit port info stored in Group node
 */
export type ExitPortInfo = {
  portNodeId: string
  leafNodeId: string
  sourceHandle: string
  name: string
}

/**
 * Group node data structure
 * node.type = 'custom-group'
 * node.data.type = '' (empty string to bypass backend NodeType validation)
 */
export type CustomGroupNodeData = {
  type: '' // Empty string bypasses backend NodeType validation
  title: string
  desc?: string
  _connectedSourceHandleIds?: string[]
  _connectedTargetHandleIds?: string[]
  group: {
    groupId: string
    title: string
    memberNodeIds: string[]
    entryNodeIds: string[]
    inputNodeId: string
    exitPorts: ExitPortInfo[]
    collapsed: boolean
  }
  width?: number
  height?: number
  selected?: boolean
  _isTempNode?: boolean
}

/**
 * Group Input node data structure
 * node.type = 'custom-group-input'
 * node.data.type = ''
 */
export type CustomGroupInputNodeData = {
  type: ''
  title: string
  desc?: string
  groupInput: {
    groupId: string
    title: string
  }
  selected?: boolean
  _isTempNode?: boolean
}

/**
 * Exit Port node data structure
 * node.type = 'custom-group-exit-port'
 * node.data.type = ''
 */
export type CustomGroupExitPortNodeData = {
  type: ''
  title: string
  desc?: string
  exitPort: {
    groupId: string
    leafNodeId: string
    sourceHandle: string
    name: string
  }
  selected?: boolean
  _isTempNode?: boolean
}

/**
 * Member node info for display
 */
export type GroupMember = {
  id: string
  type: BlockEnum
  label?: string
}
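A note on the handler-id convention used throughout this branch: exit-port handler ids are built as `${nodeId}-${sourceHandle}`, and since node ids themselves contain dashes, parsing must split on the last dash. A standalone sketch of the round-trip follows; `buildHandlerId` and `parseHandlerId` are illustrative names (not the real Dify helpers), and the sketch assumes, like the real code, that the handle name itself contains no dash.

```typescript
// Illustrative round-trip for the `${nodeId}-${sourceHandle}` handler-id format.
function buildHandlerId(nodeId: string, sourceHandle: string): string {
  return `${nodeId}-${sourceHandle}`
}

function parseHandlerId(handlerId: string): { nodeId: string, sourceHandle: string } {
  // Node ids may contain dashes, so split on the LAST dash, not the first.
  const i = handlerId.lastIndexOf('-')
  return { nodeId: handlerId.slice(0, i), sourceHandle: handlerId.slice(i + 1) }
}

const id = buildHandlerId('llm-1730-2', 'source')
console.log(parseHandlerId(id)) // → { nodeId: 'llm-1730-2', sourceHandle: 'source' }
```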
@@ -10,6 +10,7 @@ import { useCallback } from 'react'
import {
  useStoreApi,
} from 'reactflow'
import { BlockEnum } from '../types'
import { getNodesConnectedSourceOrTargetHandleIdsMap } from '../utils'
import { useNodesSyncDraft } from './use-nodes-sync-draft'
import { useNodesReadOnly } from './use-workflow'
@@ -108,6 +109,50 @@ export const useEdgesInteractions = () => {
      return
    const currentEdge = edges[currentEdgeIndex]
    const nodes = getNodes()

    // collect edges to delete (including corresponding real edges for temp edges)
    const edgesToDelete: Set<string> = new Set([currentEdge.id])

    // if deleting a temp edge connected to a group, also delete the corresponding real hidden edge
    if (currentEdge.data?._isTemp) {
      const groupNode = nodes.find(n =>
        n.data.type === BlockEnum.Group
        && (n.id === currentEdge.source || n.id === currentEdge.target),
      )

      if (groupNode) {
        const memberIds = new Set((groupNode.data.members || []).map((m: { id: string }) => m.id))

        if (currentEdge.target === groupNode.id) {
          // inbound temp edge: find real edge with same source, target is a head node
          edges.forEach((edge) => {
            if (edge.source === currentEdge.source
              && memberIds.has(edge.target)
              && edge.sourceHandle === currentEdge.sourceHandle) {
              edgesToDelete.add(edge.id)
            }
          })
        }
        else if (currentEdge.source === groupNode.id) {
          // outbound temp edge: sourceHandle format is "leafNodeId-originalHandle"
          const sourceHandle = currentEdge.sourceHandle || ''
          const lastDashIndex = sourceHandle.lastIndexOf('-')
          if (lastDashIndex > 0) {
            const leafNodeId = sourceHandle.substring(0, lastDashIndex)
            const originalHandle = sourceHandle.substring(lastDashIndex + 1)

            edges.forEach((edge) => {
              if (edge.source === leafNodeId
                && edge.target === currentEdge.target
                && (edge.sourceHandle || 'source') === originalHandle) {
                edgesToDelete.add(edge.id)
              }
            })
          }
        }
      }
    }

    const nodesConnectedSourceOrTargetHandleIdsMap = getNodesConnectedSourceOrTargetHandleIdsMap(
      [
        { type: 'remove', edge: currentEdge },
@@ -126,7 +171,10 @@ export const useEdgesInteractions = () => {
    })
    setNodes(newNodes)
    const newEdges = produce(edges, (draft) => {
      draft.splice(currentEdgeIndex, 1)
      for (let i = draft.length - 1; i >= 0; i--) {
        if (edgesToDelete.has(draft[i].id))
          draft.splice(i, 1)
      }
    })
    setEdges(newEdges)
    handleSyncWorkflowDraft()
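The deletion cascade in the hunk above can be sketched in isolation: removing a UI-only temp edge attached to a group must also remove the hidden "real" edge that actually connects the inner member node, or the backend draft keeps a connection the canvas no longer shows. This is a simplified model; the real hook matches the hidden peer by source/handle rather than storing an explicit pointer, and the names below are illustrative.

```typescript
// Hypothetical edge shape: `hiddenPeerOf` stands in for the source/handle
// matching the real hook performs.
type Edge = { id: string, source: string, target: string, isTemp?: boolean, hiddenPeerOf?: string }

function edgesAfterDelete(edges: Edge[], deletedId: string): Edge[] {
  const toDelete = new Set([deletedId])
  for (const e of edges) {
    // Cascade: any hidden real edge paired with the deleted temp edge goes too.
    if (e.hiddenPeerOf === deletedId) toDelete.add(e.id)
  }
  return edges.filter(e => !toDelete.has(e.id))
}

const edges: Edge[] = [
  { id: 'ui', source: 'start', target: 'group', isTemp: true },
  { id: 'real', source: 'start', target: 'head-node', hiddenPeerOf: 'ui' },
  { id: 'other', source: 'head-node', target: 'end' },
]
console.log(edgesAfterDelete(edges, 'ui').map(e => e.id)) // → [ 'other' ]
```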
138  web/app/components/workflow/hooks/use-make-group.ts  Normal file
@@ -0,0 +1,138 @@
import type { PredecessorHandle } from '../utils'
import { useMemo } from 'react'
import { useStore as useReactFlowStore } from 'reactflow'
import { shallow } from 'zustand/shallow'
import { BlockEnum } from '../types'
import { getCommonPredecessorHandles } from '../utils'

export type MakeGroupAvailability = {
  canMakeGroup: boolean
  branchEntryNodeIds: string[]
  commonPredecessorHandle?: PredecessorHandle
}

type MinimalEdge = {
  id: string
  source: string
  sourceHandle: string
  target: string
}

/**
 * Pure function to check if the selected nodes can be grouped.
 * Can be called both from React hooks and imperatively.
 */
export const checkMakeGroupAvailability = (
  selectedNodeIds: string[],
  edges: MinimalEdge[],
  hasGroupNode = false,
): MakeGroupAvailability => {
  if (selectedNodeIds.length <= 1 || hasGroupNode) {
    return {
      canMakeGroup: false,
      branchEntryNodeIds: [],
      commonPredecessorHandle: undefined,
    }
  }

  const selectedNodeIdSet = new Set(selectedNodeIds)
  const inboundFromOutsideTargets = new Set<string>()
  const incomingEdgeCounts = new Map<string, number>()
  const incomingFromSelectedTargets = new Set<string>()

  edges.forEach((edge) => {
    // Only consider edges whose target is inside the selected subgraph.
    if (!selectedNodeIdSet.has(edge.target))
      return

    incomingEdgeCounts.set(edge.target, (incomingEdgeCounts.get(edge.target) ?? 0) + 1)

    if (selectedNodeIdSet.has(edge.source))
      incomingFromSelectedTargets.add(edge.target)
    else
      inboundFromOutsideTargets.add(edge.target)
  })

  // Branch head (entry) definition:
  // - has at least one incoming edge
  // - and all its incoming edges come from outside the selected subgraph
  const branchEntryNodeIds = selectedNodeIds.filter((nodeId) => {
    const incomingEdgeCount = incomingEdgeCounts.get(nodeId) ?? 0
    if (incomingEdgeCount === 0)
      return false

    return !incomingFromSelectedTargets.has(nodeId)
  })

  // No branch head means we cannot tell how many branches are represented by this selection.
  if (branchEntryNodeIds.length === 0) {
    return {
      canMakeGroup: false,
      branchEntryNodeIds,
      commonPredecessorHandle: undefined,
    }
  }

  // Guardrail: disallow side entrances into the selected subgraph.
  // If an outside node connects to a non-entry node inside the selection, the grouping boundary is ambiguous.
  const branchEntryNodeIdSet = new Set(branchEntryNodeIds)
  const hasInboundToNonEntryNode = Array.from(inboundFromOutsideTargets).some(nodeId => !branchEntryNodeIdSet.has(nodeId))

  if (hasInboundToNonEntryNode) {
    return {
      canMakeGroup: false,
      branchEntryNodeIds,
      commonPredecessorHandle: undefined,
    }
  }

  // Compare the branch heads by their common predecessor "handler" (source node + sourceHandle).
  // This is required for multi-handle nodes like If-Else / Classifier where different branches use different handles.
  const commonPredecessorHandles = getCommonPredecessorHandles(
    branchEntryNodeIds,
    // Only look at edges coming from outside the selected subgraph when determining the "pre" handler.
    edges.filter(edge => !selectedNodeIdSet.has(edge.source)),
  )

  if (commonPredecessorHandles.length !== 1) {
    return {
      canMakeGroup: false,
      branchEntryNodeIds,
      commonPredecessorHandle: undefined,
    }
  }

  return {
    canMakeGroup: true,
    branchEntryNodeIds,
    commonPredecessorHandle: commonPredecessorHandles[0],
  }
}

export const useMakeGroupAvailability = (selectedNodeIds: string[]): MakeGroupAvailability => {
  const edgeKeys = useReactFlowStore((state) => {
    const delimiter = '\u0000'
    const keys = state.edges.map(edge => `${edge.source}${delimiter}${edge.sourceHandle || 'source'}${delimiter}${edge.target}`)
    keys.sort()
    return keys
  }, shallow)

  const hasGroupNode = useReactFlowStore((state) => {
    return state.getNodes().some(node => node.selected && node.data.type === BlockEnum.Group)
  })

  return useMemo(() => {
    const delimiter = '\u0000'
    const edges = edgeKeys.map((key) => {
      const [source, handleId, target] = key.split(delimiter)
      return {
        id: key,
        source,
        sourceHandle: handleId || 'source',
        target,
      }
    })

    return checkMakeGroupAvailability(selectedNodeIds, edges, hasGroupNode)
  }, [edgeKeys, selectedNodeIds, hasGroupNode])
}
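The branch-entry rule at the heart of `checkMakeGroupAvailability` can be sketched standalone: an entry node has at least one incoming edge, and every incoming edge originates outside the selection. The sketch below omits the common-predecessor check (`getCommonPredecessorHandles`) and uses a pared-down edge shape; node names are illustrative.

```typescript
type DemoEdge = { source: string, target: string }

// Standalone sketch of the branch-entry rule: incoming count > 0, and no
// incoming edge from inside the selected subgraph.
function findBranchEntries(selectedIds: string[], edges: DemoEdge[]): string[] {
  const selected = new Set(selectedIds)
  const hasInsideIncoming = new Set<string>()
  const incoming = new Map<string, number>()
  for (const e of edges) {
    if (!selected.has(e.target)) continue
    incoming.set(e.target, (incoming.get(e.target) ?? 0) + 1)
    if (selected.has(e.source)) hasInsideIncoming.add(e.target)
  }
  return selectedIds.filter(id => (incoming.get(id) ?? 0) > 0 && !hasInsideIncoming.has(id))
}

// Two branches hanging off an outside if-else node 'cond':
const demoEdges: DemoEdge[] = [
  { source: 'cond', target: 'a' },
  { source: 'cond', target: 'b' },
  { source: 'a', target: 'a2' },
]
console.log(findBranchEntries(['a', 'a2', 'b'], demoEdges)) // → [ 'a', 'b' ]
```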
@@ -8,6 +8,7 @@ import type {
  ResizeParamsWithDirection,
} from 'reactflow'
import type { PluginDefaultValue } from '../block-selector/types'
import type { GroupHandler, GroupMember, GroupNodeData } from '../nodes/group/types'
import type { IterationNodeType } from '../nodes/iteration/types'
import type { LoopNodeType } from '../nodes/loop/types'
import type { VariableAssignerNodeType } from '../nodes/variable-assigner/types'
@@ -52,6 +53,7 @@ import { useWorkflowHistoryStore } from '../workflow-history-store'
import { useAutoGenerateWebhookUrl } from './use-auto-generate-webhook-url'
import { useHelpline } from './use-helpline'
import useInspectVarsCrud from './use-inspect-vars-crud'
import { checkMakeGroupAvailability } from './use-make-group'
import { useNodesMetaData } from './use-nodes-meta-data'
import { useNodesSyncDraft } from './use-nodes-sync-draft'
import {
@@ -73,6 +75,151 @@ const ENTRY_NODE_WRAPPER_OFFSET = {
  y: 21, // Adjusted based on visual testing feedback
} as const

/**
 * Parse group handler id to get original node id and sourceHandle
 * Handler id format: `${nodeId}-${sourceHandle}`
 */
function parseGroupHandlerId(handlerId: string): { originalNodeId: string, originalSourceHandle: string } {
  const lastDashIndex = handlerId.lastIndexOf('-')
  return {
    originalNodeId: handlerId.substring(0, lastDashIndex),
    originalSourceHandle: handlerId.substring(lastDashIndex + 1),
  }
}

/**
 * Create a pair of edges for group node connections:
 * - realEdge: hidden edge from original node to target (persisted to backend)
 * - uiEdge: visible temp edge from group to target (UI-only, not persisted)
 */
function createGroupEdgePair(params: {
  groupNodeId: string
  handlerId: string
  targetNodeId: string
  targetHandle: string
  nodes: Node[]
  baseEdgeData?: Partial<Edge['data']>
  zIndex?: number
}): { realEdge: Edge, uiEdge: Edge } | null {
  const { groupNodeId, handlerId, targetNodeId, targetHandle, nodes, baseEdgeData = {}, zIndex = 0 } = params

  const groupNode = nodes.find(node => node.id === groupNodeId)
  const groupData = groupNode?.data as GroupNodeData | undefined
  const handler = groupData?.handlers?.find(h => h.id === handlerId)

  let originalNodeId: string
  let originalSourceHandle: string

  if (handler?.nodeId && handler?.sourceHandle) {
    originalNodeId = handler.nodeId
    originalSourceHandle = handler.sourceHandle
  }
  else {
    const parsed = parseGroupHandlerId(handlerId)
    originalNodeId = parsed.originalNodeId
    originalSourceHandle = parsed.originalSourceHandle
  }

  const originalNode = nodes.find(node => node.id === originalNodeId)
  const targetNode = nodes.find(node => node.id === targetNodeId)

  if (!originalNode || !targetNode)
    return null

  // Create the real edge (from original node to target) - hidden because original node is in group
  const realEdge: Edge = {
    id: `${originalNodeId}-${originalSourceHandle}-${targetNodeId}-${targetHandle}`,
    type: CUSTOM_EDGE,
    source: originalNodeId,
    sourceHandle: originalSourceHandle,
    target: targetNodeId,
    targetHandle,
    hidden: true,
    data: {
      ...baseEdgeData,
      sourceType: originalNode.data.type,
      targetType: targetNode.data.type,
      _hiddenInGroupId: groupNodeId,
    },
    zIndex,
  }

  // Create the UI edge (from group to target) - temporary, not persisted to backend
  const uiEdge: Edge = {
    id: `${groupNodeId}-${handlerId}-${targetNodeId}-${targetHandle}`,
    type: CUSTOM_EDGE,
    source: groupNodeId,
    sourceHandle: handlerId,
    target: targetNodeId,
    targetHandle,
    data: {
      ...baseEdgeData,
      sourceType: BlockEnum.Group,
      targetType: targetNode.data.type,
      _isTemp: true,
    },
    zIndex,
  }

  return { realEdge, uiEdge }
}

function createGroupInboundEdges(params: {
  sourceNodeId: string
  sourceHandle: string
  groupNodeId: string
  groupData: GroupNodeData
  nodes: Node[]
  baseEdgeData?: Partial<Edge['data']>
  zIndex?: number
}): { realEdges: Edge[], uiEdge: Edge } | null {
  const { sourceNodeId, sourceHandle, groupNodeId, groupData, nodes, baseEdgeData = {}, zIndex = 0 } = params

  const sourceNode = nodes.find(node => node.id === sourceNodeId)
  const headNodeIds = groupData.headNodeIds || []

  if (!sourceNode || headNodeIds.length === 0)
    return null

  const realEdges: Edge[] = headNodeIds.map((headNodeId) => {
    const headNode = nodes.find(node => node.id === headNodeId)
    return {
      id: `${sourceNodeId}-${sourceHandle}-${headNodeId}-target`,
      type: CUSTOM_EDGE,
      source: sourceNodeId,
      sourceHandle,
      target: headNodeId,
      targetHandle: 'target',
      hidden: true,
      data: {
        ...baseEdgeData,
        sourceType: sourceNode.data.type,
        targetType: headNode?.data.type,
        _hiddenInGroupId: groupNodeId,
      },
      zIndex,
    } as Edge
  })

  const uiEdge: Edge = {
    id: `${sourceNodeId}-${sourceHandle}-${groupNodeId}-target`,
    type: CUSTOM_EDGE,
    source: sourceNodeId,
    sourceHandle,
    target: groupNodeId,
    targetHandle: 'target',
    data: {
      ...baseEdgeData,
      sourceType: sourceNode.data.type,
      targetType: BlockEnum.Group,
      _isTemp: true,
    },
    zIndex,
  }

  return { realEdges, uiEdge }
}

export const useNodesInteractions = () => {
  const { t } = useTranslation()
  const store = useStoreApi()
@@ -448,6 +595,146 @@ export const useNodesInteractions = () => {
        return
      }

      // Check if source is a group node - need special handling
      const isSourceGroup = sourceNode?.data.type === BlockEnum.Group

      if (isSourceGroup && sourceHandle && target && targetHandle) {
        const { originalNodeId, originalSourceHandle } = parseGroupHandlerId(sourceHandle)

        // Check if real edge already exists
        if (edges.find(edge =>
          edge.source === originalNodeId
          && edge.sourceHandle === originalSourceHandle
          && edge.target === target
          && edge.targetHandle === targetHandle,
        )) {
          return
        }

        const parentNode = nodes.find(node => node.id === targetNode?.parentId)
        const isInIteration = parentNode && parentNode.data.type === BlockEnum.Iteration
        const isInLoop = !!parentNode && parentNode.data.type === BlockEnum.Loop

        const edgePair = createGroupEdgePair({
          groupNodeId: source!,
          handlerId: sourceHandle,
          targetNodeId: target,
          targetHandle,
          nodes,
          baseEdgeData: {
            isInIteration,
            iteration_id: isInIteration ? targetNode?.parentId : undefined,
            isInLoop,
            loop_id: isInLoop ? targetNode?.parentId : undefined,
          },
        })

        if (!edgePair)
          return

        const { realEdge, uiEdge } = edgePair

        // Update connected handle ids for the original node
        const nodesConnectedSourceOrTargetHandleIdsMap
          = getNodesConnectedSourceOrTargetHandleIdsMap(
            [{ type: 'add', edge: realEdge }],
            nodes,
          )
        const newNodes = produce(nodes, (draft: Node[]) => {
          draft.forEach((node) => {
            if (nodesConnectedSourceOrTargetHandleIdsMap[node.id]) {
              node.data = {
                ...node.data,
                ...nodesConnectedSourceOrTargetHandleIdsMap[node.id],
              }
            }
          })
        })
        const newEdges = produce(edges, (draft) => {
          draft.push(realEdge)
          draft.push(uiEdge)
        })

        setNodes(newNodes)
        setEdges(newEdges)

        handleSyncWorkflowDraft()
        saveStateToHistory(WorkflowHistoryEvent.NodeConnect, {
          nodeId: targetNode?.id,
        })
        return
      }

      const isTargetGroup = targetNode?.data.type === BlockEnum.Group

      if (isTargetGroup && source && sourceHandle) {
        const groupData = targetNode.data as GroupNodeData
        const headNodeIds = groupData.headNodeIds || []

        if (edges.find(edge =>
          edge.source === source
          && edge.sourceHandle === sourceHandle
          && edge.target === target
          && edge.targetHandle === targetHandle,
        )) {
          return
        }

        const parentNode = nodes.find(node => node.id === sourceNode?.parentId)
        const isInIteration = parentNode && parentNode.data.type === BlockEnum.Iteration
        const isInLoop = !!parentNode && parentNode.data.type === BlockEnum.Loop

        const inboundResult = createGroupInboundEdges({
          sourceNodeId: source,
          sourceHandle,
          groupNodeId: target!,
          groupData,
          nodes,
          baseEdgeData: {
            isInIteration,
            iteration_id: isInIteration ? sourceNode?.parentId : undefined,
            isInLoop,
            loop_id: isInLoop ? sourceNode?.parentId : undefined,
          },
        })

        if (!inboundResult)
          return

        const { realEdges, uiEdge } = inboundResult

        const edgeChanges = realEdges.map(edge => ({ type: 'add' as const, edge }))
        const nodesConnectedSourceOrTargetHandleIdsMap
          = getNodesConnectedSourceOrTargetHandleIdsMap(edgeChanges, nodes)

        const newNodes = produce(nodes, (draft: Node[]) => {
          draft.forEach((node) => {
            if (nodesConnectedSourceOrTargetHandleIdsMap[node.id]) {
              node.data = {
                ...node.data,
                ...nodesConnectedSourceOrTargetHandleIdsMap[node.id],
              }
            }
          })
        })

        const newEdges = produce(edges, (draft) => {
          realEdges.forEach((edge) => {
            draft.push(edge)
          })
          draft.push(uiEdge)
        })

        setNodes(newNodes)
        setEdges(newEdges)

        handleSyncWorkflowDraft()
        saveStateToHistory(WorkflowHistoryEvent.NodeConnect, {
          nodeId: headNodeIds[0],
        })
        return
      }

      if (
        edges.find(
          edge =>
@@ -909,8 +1196,34 @@ export const useNodesInteractions = () => {
        }
      }

      let newEdge = null
      if (nodeType !== BlockEnum.DataSource) {
      // Check if prevNode is a group node - need special handling
      const isPrevNodeGroup = prevNode.data.type === BlockEnum.Group
      let newEdge: Edge | null = null
      let newUiEdge: Edge | null = null

      if (isPrevNodeGroup && prevNodeSourceHandle && nodeType !== BlockEnum.DataSource) {
        const edgePair = createGroupEdgePair({
          groupNodeId: prevNodeId,
          handlerId: prevNodeSourceHandle,
          targetNodeId: newNode.id,
          targetHandle,
          nodes: [...nodes, newNode],
          baseEdgeData: {
            isInIteration,
            isInLoop,
            iteration_id: isInIteration ? prevNode.parentId : undefined,
            loop_id: isInLoop ? prevNode.parentId : undefined,
            _connectedNodeIsSelected: true,
          },
        })

        if (edgePair) {
          newEdge = edgePair.realEdge
          newUiEdge = edgePair.uiEdge
        }
      }
      else if (nodeType !== BlockEnum.DataSource) {
        // Normal case: prevNode is not a group
        newEdge = {
          id: `${prevNodeId}-${prevNodeSourceHandle}-${newNode.id}-${targetHandle}`,
          type: CUSTOM_EDGE,
@@ -935,9 +1248,10 @@ export const useNodesInteractions = () => {
        }
      }

      const edgesToAdd = [newEdge, newUiEdge].filter(Boolean).map(edge => ({ type: 'add' as const, edge: edge! }))
      const nodesConnectedSourceOrTargetHandleIdsMap
        = getNodesConnectedSourceOrTargetHandleIdsMap(
          (newEdge ? [{ type: 'add', edge: newEdge }] : []),
          edgesToAdd,
          nodes,
        )
      const newNodes = produce(nodes, (draft: Node[]) => {
@@ -1006,6 +1320,8 @@ export const useNodesInteractions = () => {
        })
        if (newEdge)
          draft.push(newEdge)
        if (newUiEdge)
          draft.push(newUiEdge)
      })

      setNodes(newNodes)
@@ -1090,7 +1406,7 @@ export const useNodesInteractions = () => {

      const afterNodesInSameBranch = getAfterNodesInSameBranch(nextNodeId!)
      const afterNodesInSameBranchIds = afterNodesInSameBranch.map(
        node => node.id,
        (node: Node) => node.id,
      )
      const newNodes = produce(nodes, (draft) => {
        draft.forEach((node) => {
@@ -1200,37 +1516,113 @@ export const useNodesInteractions = () => {
        }
      }

      const currentEdgeIndex = edges.findIndex(
        edge => edge.source === prevNodeId && edge.target === nextNodeId,
      )
      let newPrevEdge = null
      // Check if prevNode is a group node - need special handling
      const isPrevNodeGroup = prevNode.data.type === BlockEnum.Group
      let newPrevEdge: Edge | null = null
      let newPrevUiEdge: Edge | null = null
      const edgesToRemove: string[] = []

      if (nodeType !== BlockEnum.DataSource) {
        newPrevEdge = {
          id: `${prevNodeId}-${prevNodeSourceHandle}-${newNode.id}-${targetHandle}`,
          type: CUSTOM_EDGE,
          source: prevNodeId,
          sourceHandle: prevNodeSourceHandle,
          target: newNode.id,
      if (isPrevNodeGroup && prevNodeSourceHandle && nodeType !== BlockEnum.DataSource) {
        const { originalNodeId, originalSourceHandle } = parseGroupHandlerId(prevNodeSourceHandle)

        // Find edges to remove: both hidden real edge and UI temp edge from group to nextNode
        const hiddenEdge = edges.find(
          edge => edge.source === originalNodeId
            && edge.sourceHandle === originalSourceHandle
            && edge.target === nextNodeId,
        )
        const uiTempEdge = edges.find(
          edge => edge.source === prevNodeId
            && edge.sourceHandle === prevNodeSourceHandle
            && edge.target === nextNodeId,
        )
        if (hiddenEdge)
          edgesToRemove.push(hiddenEdge.id)
        if (uiTempEdge)
          edgesToRemove.push(uiTempEdge.id)

        const edgePair = createGroupEdgePair({
          groupNodeId: prevNodeId,
          handlerId: prevNodeSourceHandle,
          targetNodeId: newNode.id,
          targetHandle,
          data: {
            sourceType: prevNode.data.type,
            targetType: newNode.data.type,
          nodes: [...nodes, newNode],
          baseEdgeData: {
            isInIteration,
            isInLoop,
            iteration_id: isInIteration ? prevNode.parentId : undefined,
            loop_id: isInLoop ? prevNode.parentId : undefined,
            _connectedNodeIsSelected: true,
          },
          zIndex: prevNode.parentId
            ? isInIteration
              ? ITERATION_CHILDREN_Z_INDEX
              : LOOP_CHILDREN_Z_INDEX
            : 0,
        })

        if (edgePair) {
          newPrevEdge = edgePair.realEdge
          newPrevUiEdge = edgePair.uiEdge
        }
      }
      else {
        const isNextNodeGroupForRemoval = nextNode.data.type === BlockEnum.Group

        if (isNextNodeGroupForRemoval) {
          const groupData = nextNode.data as GroupNodeData
          const headNodeIds = groupData.headNodeIds || []

          headNodeIds.forEach((headNodeId) => {
            const realEdge = edges.find(
              edge => edge.source === prevNodeId
                && edge.sourceHandle === prevNodeSourceHandle
                && edge.target === headNodeId,
            )
            if (realEdge)
              edgesToRemove.push(realEdge.id)
          })

          const uiEdge = edges.find(
            edge => edge.source === prevNodeId
              && edge.sourceHandle === prevNodeSourceHandle
              && edge.target === nextNodeId,
          )
          if (uiEdge)
            edgesToRemove.push(uiEdge.id)
        }
        else {
          const currentEdge = edges.find(
            edge => edge.source === prevNodeId && edge.target === nextNodeId,
          )
          if (currentEdge)
            edgesToRemove.push(currentEdge.id)
        }

        if (nodeType !== BlockEnum.DataSource) {
          newPrevEdge = {
            id: `${prevNodeId}-${prevNodeSourceHandle}-${newNode.id}-${targetHandle}`,
            type: CUSTOM_EDGE,
            source: prevNodeId,
            sourceHandle: prevNodeSourceHandle,
            target: newNode.id,
            targetHandle,
            data: {
              sourceType: prevNode.data.type,
              targetType: newNode.data.type,
              isInIteration,
              isInLoop,
              iteration_id: isInIteration ? prevNode.parentId : undefined,
              loop_id: isInLoop ? prevNode.parentId : undefined,
              _connectedNodeIsSelected: true,
            },
            zIndex: prevNode.parentId
              ? isInIteration
                ? ITERATION_CHILDREN_Z_INDEX
                : LOOP_CHILDREN_Z_INDEX
              : 0,
          }
        }
      }

      let newNextEdge: Edge | null = null
      let newNextUiEdge: Edge | null = null
      const newNextRealEdges: Edge[] = []

      const nextNodeParentNode
        = nodes.find(node => node.id === nextNode.parentId) || null
@@ -1241,49 +1633,113 @@ export const useNodesInteractions = () => {
|
||||
= !!nextNodeParentNode
|
||||
&& nextNodeParentNode.data.type === BlockEnum.Loop
|
||||
|
||||
const isNextNodeGroup = nextNode.data.type === BlockEnum.Group
|
||||
|
||||
if (
|
||||
nodeType !== BlockEnum.IfElse
|
||||
&& nodeType !== BlockEnum.QuestionClassifier
|
||||
&& nodeType !== BlockEnum.LoopEnd
|
||||
) {
|
||||
newNextEdge = {
|
||||
id: `${newNode.id}-${sourceHandle}-${nextNodeId}-${nextNodeTargetHandle}`,
|
||||
type: CUSTOM_EDGE,
|
||||
source: newNode.id,
|
||||
sourceHandle,
|
||||
target: nextNodeId,
|
||||
targetHandle: nextNodeTargetHandle,
|
||||
data: {
|
||||
sourceType: newNode.data.type,
|
||||
targetType: nextNode.data.type,
|
||||
isInIteration: isNextNodeInIteration,
|
||||
isInLoop: isNextNodeInLoop,
|
||||
iteration_id: isNextNodeInIteration
|
||||
? nextNode.parentId
|
||||
: undefined,
|
||||
loop_id: isNextNodeInLoop ? nextNode.parentId : undefined,
|
||||
_connectedNodeIsSelected: true,
|
||||
},
|
||||
zIndex: nextNode.parentId
|
||||
? isNextNodeInIteration
|
||||
? ITERATION_CHILDREN_Z_INDEX
|
||||
: LOOP_CHILDREN_Z_INDEX
|
||||
: 0,
|
||||
if (isNextNodeGroup) {
|
||||
const groupData = nextNode.data as GroupNodeData
|
||||
const headNodeIds = groupData.headNodeIds || []
|
||||
|
||||
headNodeIds.forEach((headNodeId) => {
|
||||
const headNode = nodes.find(node => node.id === headNodeId)
|
||||
newNextRealEdges.push({
|
||||
id: `${newNode.id}-${sourceHandle}-${headNodeId}-target`,
|
          type: CUSTOM_EDGE,
          source: newNode.id,
          sourceHandle,
          target: headNodeId,
          targetHandle: 'target',
          hidden: true,
          data: {
            sourceType: newNode.data.type,
            targetType: headNode?.data.type,
            isInIteration: isNextNodeInIteration,
            isInLoop: isNextNodeInLoop,
            iteration_id: isNextNodeInIteration ? nextNode.parentId : undefined,
            loop_id: isNextNodeInLoop ? nextNode.parentId : undefined,
            _hiddenInGroupId: nextNodeId,
            _connectedNodeIsSelected: true,
          },
          zIndex: nextNode.parentId
            ? isNextNodeInIteration
              ? ITERATION_CHILDREN_Z_INDEX
              : LOOP_CHILDREN_Z_INDEX
            : 0,
        } as Edge)
      })

      newNextUiEdge = {
        id: `${newNode.id}-${sourceHandle}-${nextNodeId}-target`,
        type: CUSTOM_EDGE,
        source: newNode.id,
        sourceHandle,
        target: nextNodeId,
        targetHandle: 'target',
        data: {
          sourceType: newNode.data.type,
          targetType: BlockEnum.Group,
          isInIteration: isNextNodeInIteration,
          isInLoop: isNextNodeInLoop,
          iteration_id: isNextNodeInIteration ? nextNode.parentId : undefined,
          loop_id: isNextNodeInLoop ? nextNode.parentId : undefined,
          _isTemp: true,
          _connectedNodeIsSelected: true,
        },
        zIndex: nextNode.parentId
          ? isNextNodeInIteration
            ? ITERATION_CHILDREN_Z_INDEX
            : LOOP_CHILDREN_Z_INDEX
          : 0,
      }
    }
    else {
      newNextEdge = {
        id: `${newNode.id}-${sourceHandle}-${nextNodeId}-${nextNodeTargetHandle}`,
        type: CUSTOM_EDGE,
        source: newNode.id,
        sourceHandle,
        target: nextNodeId,
        targetHandle: nextNodeTargetHandle,
        data: {
          sourceType: newNode.data.type,
          targetType: nextNode.data.type,
          isInIteration: isNextNodeInIteration,
          isInLoop: isNextNodeInLoop,
          iteration_id: isNextNodeInIteration
            ? nextNode.parentId
            : undefined,
          loop_id: isNextNodeInLoop ? nextNode.parentId : undefined,
          _connectedNodeIsSelected: true,
        },
        zIndex: nextNode.parentId
          ? isNextNodeInIteration
            ? ITERATION_CHILDREN_Z_INDEX
            : LOOP_CHILDREN_Z_INDEX
          : 0,
      }
    }
  }
  const edgeChanges = [
    ...edgesToRemove.map(id => ({ type: 'remove' as const, edge: edges.find(e => e.id === id)! })).filter(c => c.edge),
    ...(newPrevEdge ? [{ type: 'add' as const, edge: newPrevEdge }] : []),
    ...(newPrevUiEdge ? [{ type: 'add' as const, edge: newPrevUiEdge }] : []),
    ...(newNextEdge ? [{ type: 'add' as const, edge: newNextEdge }] : []),
    ...newNextRealEdges.map(edge => ({ type: 'add' as const, edge })),
    ...(newNextUiEdge ? [{ type: 'add' as const, edge: newNextUiEdge }] : []),
  ]
  const nodesConnectedSourceOrTargetHandleIdsMap
    = getNodesConnectedSourceOrTargetHandleIdsMap(
-     [
-       { type: 'remove', edge: edges[currentEdgeIndex] },
-       ...(newPrevEdge ? [{ type: 'add', edge: newPrevEdge }] : []),
-       ...(newNextEdge ? [{ type: 'add', edge: newNextEdge }] : []),
-     ],
+     edgeChanges,
      [...nodes, newNode],
    )
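The `edgeChanges` list above collects every pending edge mutation as a tagged `{ type, edge }` record before any state is touched. A minimal standalone sketch of that pattern (the record shape follows the diff; `applyChanges` is an illustrative reducer, not a helper from this PR):

```typescript
type MiniEdge = { id: string, source: string, target: string }
type EdgeChange = { type: 'add' | 'remove', edge: MiniEdge }

// Fold a list of tagged changes into a new edge array without mutating the input.
function applyChanges(edges: MiniEdge[], changes: EdgeChange[]): MiniEdge[] {
  let result = [...edges]
  for (const change of changes) {
    if (change.type === 'remove')
      result = result.filter(e => e.id !== change.edge.id)
    else
      result.push(change.edge)
  }
  return result
}

const base: MiniEdge[] = [{ id: 'e1', source: 'a', target: 'b' }]
const next = applyChanges(base, [
  { type: 'remove', edge: base[0] },
  { type: 'add', edge: { id: 'e2', source: 'a', target: 'c' } },
])
// next now holds only 'e2'
```

Keeping changes as data makes it easy to feed the same list to several consumers, which is what the diff does with `getNodesConnectedSourceOrTargetHandleIdsMap`.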
  const afterNodesInSameBranch = getAfterNodesInSameBranch(nextNodeId!)
  const afterNodesInSameBranchIds = afterNodesInSameBranch.map(
-   node => node.id,
+   (node: Node) => node.id,
  )
  const newNodes = produce(nodes, (draft) => {
    draft.forEach((node) => {
@@ -1342,7 +1798,10 @@ export const useNodesInteractions = () => {
    })
  }
  const newEdges = produce(edges, (draft) => {
    draft.splice(currentEdgeIndex, 1)
    const filteredDraft = draft.filter(edge => !edgesToRemove.includes(edge.id))
    draft.length = 0
    draft.push(...filteredDraft)

    draft.forEach((item) => {
      item.data = {
        ...item.data,
@@ -1351,9 +1810,15 @@ export const useNodesInteractions = () => {
    })
    if (newPrevEdge)
      draft.push(newPrevEdge)

    if (newPrevUiEdge)
      draft.push(newPrevUiEdge)
    if (newNextEdge)
      draft.push(newNextEdge)
    newNextRealEdges.forEach((edge) => {
      draft.push(edge)
    })
    if (newNextUiEdge)
      draft.push(newNextUiEdge)
  })
  setEdges(newEdges)
}
@@ -2078,6 +2543,302 @@ export const useNodesInteractions = () => {
    setEdges(newEdges)
  }, [store])
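The `newEdges` producer above filters the draft by clearing it with `draft.length = 0` and re-pushing the kept items. That keeps the same array reference alive, which is what an immer draft requires. The idiom in isolation:

```typescript
// Filter an array in place: the caller's reference stays valid,
// which matters when the array is an immer draft.
function filterInPlace<T>(arr: T[], keep: (item: T) => boolean): T[] {
  const kept = arr.filter(keep)
  arr.length = 0
  arr.push(...kept)
  return arr
}

const edges = ['e1', 'e2', 'e3']
const sameRef = filterInPlace(edges, id => id !== 'e2')
// sameRef === edges, and edges is now ['e1', 'e3']
```

Assigning a fresh array (`draft = draft.filter(...)`) would silently drop the mutation, because immer only tracks changes made through the original draft proxy.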
  // Check if there are any nodes selected via box selection
  const hasBundledNodes = useCallback(() => {
    const { getNodes } = store.getState()
    const nodes = getNodes()
    return nodes.some(node => node.data._isBundled)
  }, [store])

  const getCanMakeGroup = useCallback(() => {
    const { getNodes, edges } = store.getState()
    const nodes = getNodes()
    const bundledNodes = nodes.filter(node => node.data._isBundled)

    if (bundledNodes.length <= 1)
      return false

    const bundledNodeIds = bundledNodes.map(node => node.id)
    const minimalEdges = edges.map(edge => ({
      id: edge.id,
      source: edge.source,
      sourceHandle: edge.sourceHandle || 'source',
      target: edge.target,
    }))
    const hasGroupNode = bundledNodes.some(node => node.data.type === BlockEnum.Group)

    const { canMakeGroup } = checkMakeGroupAvailability(bundledNodeIds, minimalEdges, hasGroupNode)
    return canMakeGroup
  }, [store])
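`checkMakeGroupAvailability` is imported from elsewhere in the PR and its body is not part of this diff. Purely as an illustration of the kind of preconditions visible at the call sites (at least two nodes, no group nested inside a group), a hypothetical sketch — not the real implementation:

```typescript
type MiniEdge2 = { source: string, target: string }

// Hypothetical availability check; the real helper lives elsewhere in the PR
// and may apply additional connectivity rules.
function canGroup(nodeIds: string[], _edges: MiniEdge2[], hasGroupNode: boolean): boolean {
  if (hasGroupNode) // nested groups are rejected
    return false
  if (nodeIds.length <= 1) // grouping a single node is pointless
    return false
  return true
}
```

The call sites pass a minimal `{ id, source, sourceHandle, target }` edge projection, so whatever the real check does, it only needs topology, not full edge data.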
  const handleMakeGroup = useCallback(() => {
    const { getNodes, setNodes, edges, setEdges } = store.getState()
    const nodes = getNodes()
    const bundledNodes = nodes.filter(node => node.data._isBundled)

    if (bundledNodes.length <= 1)
      return

    const bundledNodeIds = bundledNodes.map(node => node.id)
    const minimalEdges = edges.map(edge => ({
      id: edge.id,
      source: edge.source,
      sourceHandle: edge.sourceHandle || 'source',
      target: edge.target,
    }))
    const hasGroupNode = bundledNodes.some(node => node.data.type === BlockEnum.Group)

    const { canMakeGroup } = checkMakeGroupAvailability(bundledNodeIds, minimalEdges, hasGroupNode)
    if (!canMakeGroup)
      return

    const bundledNodeIdSet = new Set(bundledNodeIds)
    const bundledNodeIdIsLeaf = new Set<string>()
    const inboundEdges = edges.filter(edge => !bundledNodeIdSet.has(edge.source) && bundledNodeIdSet.has(edge.target))
    const outboundEdges = edges.filter(edge => bundledNodeIdSet.has(edge.source) && !bundledNodeIdSet.has(edge.target))

    // leaf node: no outbound edges to other nodes in the selection
    const handlers: GroupHandler[] = []
    const leafNodeIdSet = new Set<string>()

    bundledNodes.forEach((node: Node) => {
      const targetBranches = node.data._targetBranches || [{ id: 'source', name: node.data.title }]
      targetBranches.forEach((branch) => {
        // A branch should be a handler if it's either:
        // 1. Connected to a node OUTSIDE the group
        // 2. NOT connected to any node INSIDE the group
        const isConnectedInside = edges.some(edge =>
          edge.source === node.id
          && (edge.sourceHandle === branch.id || (!edge.sourceHandle && branch.id === 'source'))
          && bundledNodeIdSet.has(edge.target),
        )
        const isConnectedOutside = edges.some(edge =>
          edge.source === node.id
          && (edge.sourceHandle === branch.id || (!edge.sourceHandle && branch.id === 'source'))
          && !bundledNodeIdSet.has(edge.target),
        )

        if (isConnectedOutside || !isConnectedInside) {
          const handlerId = `${node.id}-${branch.id}`
          handlers.push({
            id: handlerId,
            label: branch.name || node.data.title || node.id,
            nodeId: node.id,
            sourceHandle: branch.id,
          })
          leafNodeIdSet.add(node.id)
        }
      })
    })

    const leafNodeIds = Array.from(leafNodeIdSet)
    leafNodeIds.forEach(id => bundledNodeIdIsLeaf.add(id))
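The rule above — a branch becomes an exposed group handler when it connects to a node outside the selection, or connects to nothing inside it — can be isolated as a pure predicate:

```typescript
type BranchEdge = { source: string, sourceHandle: string, target: string }

// True when a node's branch must be surfaced on the group boundary.
function isHandlerBranch(nodeId: string, branchId: string, edges: BranchEdge[], inside: Set<string>): boolean {
  const fromBranch = edges.filter(e => e.source === nodeId && e.sourceHandle === branchId)
  const connectedInside = fromBranch.some(e => inside.has(e.target))
  const connectedOutside = fromBranch.some(e => !inside.has(e.target))
  return connectedOutside || !connectedInside
}

const insideIds = new Set(['a', 'b'])
const selEdges: BranchEdge[] = [
  { source: 'a', sourceHandle: 'source', target: 'b' }, // stays inside the group
  { source: 'b', sourceHandle: 'source', target: 'x' }, // leaves the group
]
// 'a' is fully internal: not a handler. 'b' exits the selection: a handler.
```

Note the "not connected inside" half also covers dangling branches, so a group always exposes every branch that could later be wired to something.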
    const members: GroupMember[] = bundledNodes.map((node) => {
      return {
        id: node.id,
        type: node.data.type,
        label: node.data.title,
      }
    })

    // head nodes: nodes that receive input from outside the group
    const headNodeIds = [...new Set(inboundEdges.map(edge => edge.target))]

    // put the group node at the top-left corner of the selection, slightly offset
    const { x: minX, y: minY } = getTopLeftNodePosition(bundledNodes)

    const groupNodeData: GroupNodeData = {
      title: t('operator.makeGroup', { ns: 'workflow' }),
      desc: '',
      type: BlockEnum.Group,
      members,
      handlers,
      headNodeIds,
      leafNodeIds,
      selected: true,
      _targetBranches: handlers.map(handler => ({
        id: handler.id,
        name: handler.label || handler.id,
      })),
    }

    const { newNode: groupNode } = generateNewNode({
      data: groupNodeData,
      position: {
        x: minX - 20,
        y: minY - 20,
      },
    })

    const nodeTypeMap = new Map(nodes.map(node => [node.id, node.data.type]))

    const newNodes = produce(nodes, (draft) => {
      draft.forEach((node) => {
        if (bundledNodeIdSet.has(node.id)) {
          node.data._isBundled = false
          node.selected = false
          node.hidden = true
          node.data._hiddenInGroupId = groupNode.id
        }
        else {
          node.data._isBundled = false
        }
      })
      draft.push(groupNode)
    })

    const newEdges = produce(edges, (draft) => {
      draft.forEach((edge) => {
        if (bundledNodeIdSet.has(edge.source) || bundledNodeIdSet.has(edge.target)) {
          edge.hidden = true
          edge.data = {
            ...edge.data,
            _hiddenInGroupId: groupNode.id,
            _isBundled: false,
          }
        }
        else if (edge.data?._isBundled) {
          edge.data._isBundled = false
        }
      })

      // re-add the external inbound edges to the group node as UI-only edges (not persisted to backend)
      inboundEdges.forEach((edge) => {
        draft.push({
          id: `${edge.id}__to-${groupNode.id}`,
          type: edge.type || CUSTOM_EDGE,
          source: edge.source,
          target: groupNode.id,
          sourceHandle: edge.sourceHandle,
          targetHandle: 'target',
          data: {
            ...edge.data,
            sourceType: nodeTypeMap.get(edge.source)!,
            targetType: BlockEnum.Group,
            _hiddenInGroupId: undefined,
            _isBundled: false,
            _isTemp: true, // UI-only edge, not persisted to backend
          },
          zIndex: edge.zIndex,
        })
      })

      // outbound edges of the group node as UI-only edges (not persisted to backend)
      outboundEdges.forEach((edge) => {
        if (!bundledNodeIdIsLeaf.has(edge.source))
          return

        // Use the same handler id format: nodeId-sourceHandle
        const originalSourceHandle = edge.sourceHandle || 'source'
        const handlerId = `${edge.source}-${originalSourceHandle}`

        draft.push({
          id: `${groupNode.id}-${edge.target}-${edge.targetHandle || 'target'}-${handlerId}`,
          type: edge.type || CUSTOM_EDGE,
          source: groupNode.id,
          target: edge.target,
          sourceHandle: handlerId,
          targetHandle: edge.targetHandle,
          data: {
            ...edge.data,
            sourceType: BlockEnum.Group,
            targetType: nodeTypeMap.get(edge.target)!,
            _hiddenInGroupId: undefined,
            _isBundled: false,
            _isTemp: true,
          },
          zIndex: edge.zIndex,
        })
      })
    })

    setNodes(newNodes)
    setEdges(newEdges)
    workflowStore.setState({
      selectionMenu: undefined,
    })
    handleSyncWorkflowDraft()
    saveStateToHistory(WorkflowHistoryEvent.NodeAdd, {
      nodeId: groupNode.id,
    })
  }, [handleSyncWorkflowDraft, saveStateToHistory, store, t, workflowStore])
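Outbound edges are re-pointed at the group through a composite source handle, `${nodeId}-${sourceHandle}`. Splitting that id back apart (as `isValidConnection` does later in this PR) relies on `lastIndexOf('-')`, since node ids may themselves contain dashes. A sketch of the round trip (helper names are illustrative, not from the PR):

```typescript
// Build the composite handle id used on the group's boundary.
function toHandlerId(nodeId: string, sourceHandle: string): string {
  return `${nodeId}-${sourceHandle}`
}

// Split on the LAST dash so dashes inside the node id survive.
function fromHandlerId(handlerId: string): { nodeId: string, sourceHandle: string } | undefined {
  const i = handlerId.lastIndexOf('-')
  if (i <= 0)
    return undefined
  return { nodeId: handlerId.slice(0, i), sourceHandle: handlerId.slice(i + 1) }
}

const handlerId = toHandlerId('node-17', 'source')
// 'node-17-source' → { nodeId: 'node-17', sourceHandle: 'source' }
```

The scheme is ambiguous if the original source handle itself contains a dash; taking only the last segment is the convention the diff settles on.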
  // check if the current selection can be ungrouped (single selected Group node)
  const getCanUngroup = useCallback(() => {
    const { getNodes } = store.getState()
    const nodes = getNodes()
    const selectedNodes = nodes.filter(node => node.selected)

    if (selectedNodes.length !== 1)
      return false

    return selectedNodes[0].data.type === BlockEnum.Group
  }, [store])

  // get the selected group node id for ungroup operation
  const getSelectedGroupId = useCallback(() => {
    const { getNodes } = store.getState()
    const nodes = getNodes()
    const selectedNodes = nodes.filter(node => node.selected)

    if (selectedNodes.length === 1 && selectedNodes[0].data.type === BlockEnum.Group)
      return selectedNodes[0].id

    return undefined
  }, [store])

  const handleUngroup = useCallback((groupId: string) => {
    const { getNodes, setNodes, edges, setEdges } = store.getState()
    const nodes = getNodes()
    const groupNode = nodes.find(n => n.id === groupId)

    if (!groupNode || groupNode.data.type !== BlockEnum.Group)
      return

    const memberIds = new Set((groupNode.data.members || []).map((m: { id: string }) => m.id))

    // restore hidden member nodes
    const newNodes = produce(nodes, (draft) => {
      draft.forEach((node) => {
        if (memberIds.has(node.id)) {
          node.hidden = false
          delete node.data._hiddenInGroupId
        }
      })
      // remove group node
      const groupIndex = draft.findIndex(n => n.id === groupId)
      if (groupIndex !== -1)
        draft.splice(groupIndex, 1)
    })

    // restore hidden edges and remove temp edges in single pass O(E)
    const newEdges = produce(edges, (draft) => {
      const indicesToRemove: number[] = []

      for (let i = 0; i < draft.length; i++) {
        const edge = draft[i]
        // restore hidden edges that involve member nodes
        if (edge.hidden && (memberIds.has(edge.source) || memberIds.has(edge.target)))
          edge.hidden = false
        // collect temp edges connected to group for removal
        if (edge.data?._isTemp && (edge.source === groupId || edge.target === groupId))
          indicesToRemove.push(i)
      }

      // remove collected indices in reverse order to avoid index shift
      for (let i = indicesToRemove.length - 1; i >= 0; i--)
        draft.splice(indicesToRemove[i], 1)
    })

    setNodes(newNodes)
    setEdges(newEdges)
    handleSyncWorkflowDraft()
    saveStateToHistory(WorkflowHistoryEvent.NodeDelete, {
      nodeId: groupId,
    })
  }, [handleSyncWorkflowDraft, saveStateToHistory, store])
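`handleUngroup` collects the temp-edge indices in ascending order, then splices them back-to-front so earlier removals never shift the indices still pending. The idiom in isolation:

```typescript
// Remove several indices from an array in one pass; iterating in reverse
// keeps the not-yet-removed indices valid after each splice.
function removeIndices<T>(arr: T[], indices: number[]): T[] {
  for (let i = indices.length - 1; i >= 0; i--)
    arr.splice(indices[i], 1)
  return arr
}

const items = ['e0', 'e1', 'e2', 'e3']
removeIndices(items, [1, 3]) // indices collected in ascending order
// items is now ['e0', 'e2']
```

Splicing front-to-back instead would remove index 1 ('e1'), shift 'e3' to index 2, and then wrongly leave it in place when index 3 is spliced.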
  return {
    handleNodeDragStart,
    handleNodeDrag,
@@ -2098,11 +2859,17 @@ export const useNodesInteractions = () => {
    handleNodesPaste,
    handleNodesDuplicate,
    handleNodesDelete,
+   handleMakeGroup,
+   handleUngroup,
    handleNodeResize,
    handleNodeDisconnect,
    handleHistoryBack,
    handleHistoryForward,
    dimOtherNodes,
    undimAllNodes,
+   hasBundledNodes,
+   getCanMakeGroup,
+   getCanUngroup,
+   getSelectedGroupId,
  }
}
@@ -1,8 +1,10 @@
import type { AvailableNodesMetaData } from '@/app/components/workflow/hooks-store'
import type { Node } from '@/app/components/workflow/types'
import { useMemo } from 'react'
+import { useTranslation } from 'react-i18next'
import { CollectionType } from '@/app/components/tools/types'
import { useHooksStore } from '@/app/components/workflow/hooks-store'
+import GroupDefault from '@/app/components/workflow/nodes/group/default'
import { useStore } from '@/app/components/workflow/store'
import { BlockEnum } from '@/app/components/workflow/types'
import { useGetLanguage } from '@/context/i18n'
@@ -25,6 +27,7 @@ export const useNodesMetaData = () => {
}

export const useNodeMetaData = (node: Node) => {
+  const { t } = useTranslation()
  const language = useGetLanguage()
  const { data: buildInTools } = useAllBuiltInTools()
  const { data: customTools } = useAllCustomTools()
@@ -34,6 +37,9 @@ export const useNodeMetaData = (node: Node) => {
  const { data } = node
  const nodeMetaData = availableNodesMetaData.nodesMap?.[data.type]
  const author = useMemo(() => {
+    if (data.type === BlockEnum.Group)
+      return GroupDefault.metaData.author
+
    if (data.type === BlockEnum.DataSource)
      return dataSourceList?.find(dataSource => dataSource.plugin_id === data.plugin_id)?.author

@@ -48,6 +54,9 @@ export const useNodeMetaData = (node: Node) => {
  }, [data, buildInTools, customTools, workflowTools, nodeMetaData, dataSourceList])

  const description = useMemo(() => {
+    if (data.type === BlockEnum.Group)
+      return t('blocksAbout.group', { ns: 'workflow' })
+
    if (data.type === BlockEnum.DataSource)
      return dataSourceList?.find(dataSource => dataSource.plugin_id === data.plugin_id)?.description[language]
    if (data.type === BlockEnum.Tool) {
@@ -58,7 +67,7 @@ export const useNodeMetaData = (node: Node) => {
      return customTools?.find(toolWithProvider => toolWithProvider.id === data.provider_id)?.description[language]
    }
    return nodeMetaData?.metaData.description
-  }, [data, buildInTools, customTools, workflowTools, nodeMetaData, dataSourceList, language])
+  }, [data, buildInTools, customTools, workflowTools, nodeMetaData, dataSourceList, language, t])

  return useMemo(() => {
    return {
@@ -27,6 +27,12 @@ export const useShortcuts = (): void => {
    handleHistoryForward,
    dimOtherNodes,
    undimAllNodes,
+   hasBundledNodes,
+   getCanMakeGroup,
+   handleMakeGroup,
+   getCanUngroup,
+   getSelectedGroupId,
+   handleUngroup,
  } = useNodesInteractions()
  const { shortcutsEnabled: workflowHistoryShortcutsEnabled } = useWorkflowHistoryStore()
  const { handleSyncWorkflowDraft } = useNodesSyncDraft()
@@ -78,7 +84,8 @@ export const useShortcuts = (): void => {

  useKeyPress(`${getKeyboardKeyCodeBySystem('ctrl')}.c`, (e) => {
    const { showDebugAndPreviewPanel } = workflowStore.getState()
-   if (shouldHandleShortcut(e) && shouldHandleCopy() && !showDebugAndPreviewPanel) {
+   // Only intercept when nodes are selected via box selection
+   if (shouldHandleShortcut(e) && shouldHandleCopy() && !showDebugAndPreviewPanel && hasBundledNodes()) {
      e.preventDefault()
      handleNodesCopy()
    }
@@ -99,6 +106,26 @@ export const useShortcuts = (): void => {
    }
  }, { exactMatch: true, useCapture: true })

+  useKeyPress(`${getKeyboardKeyCodeBySystem('ctrl')}.g`, (e) => {
+    // Only intercept when the selection can be grouped
+    if (shouldHandleShortcut(e) && getCanMakeGroup()) {
+      e.preventDefault()
+      // Close selection context menu if open
+      workflowStore.setState({ selectionMenu: undefined })
+      handleMakeGroup()
+    }
+  }, { exactMatch: true, useCapture: true })
+
+  useKeyPress(`${getKeyboardKeyCodeBySystem('ctrl')}.shift.g`, (e) => {
+    // Only intercept when the selection can be ungrouped
+    if (shouldHandleShortcut(e) && getCanUngroup()) {
+      e.preventDefault()
+      const groupId = getSelectedGroupId()
+      if (groupId)
+        handleUngroup(groupId)
+    }
+  }, { exactMatch: true, useCapture: true })
+
  useKeyPress(`${getKeyboardKeyCodeBySystem('alt')}.r`, (e) => {
    if (shouldHandleShortcut(e)) {
      e.preventDefault()
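The bindings above compose key strings such as `ctrl.g` and `ctrl.shift.g` around a platform-aware modifier. `getKeyboardKeyCodeBySystem` is a real helper in this repo, but its body is not shown in this diff, so the following is an assumed sketch of its likely behavior (ctrl mapping to the command key on macOS):

```typescript
// Assumption: 'ctrl' maps to 'meta' on macOS and stays 'ctrl' elsewhere.
function modifierFor(key: string, isMac: boolean): string {
  return key === 'ctrl' && isMac ? 'meta' : key
}

// Join keys into the dot-separated combo string the key-press hook expects.
function combo(isMac: boolean, ...keys: string[]): string {
  return keys.map(k => modifierFor(k, isMac)).join('.')
}

// combo(false, 'ctrl', 'g')          → 'ctrl.g'
// combo(true, 'ctrl', 'shift', 'g')  → 'meta.shift.g'
```

Whatever the exact mapping, `exactMatch: true` in the bindings means `ctrl.g` does not also fire for `ctrl.shift.g`, which is why the two group shortcuts can coexist.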
@@ -37,7 +37,10 @@ export const useWorkflowNodeFinished = () => {
    }))

    const newNodes = produce(nodes, (draft) => {
-     const currentNode = draft.find(node => node.id === data.node_id)!
+     const currentNode = draft.find(node => node.id === data.node_id)
+     // Skip if node not found (e.g., virtual extraction nodes)
+     if (!currentNode)
+       return
      currentNode.data._runningStatus = data.status
      if (data.status === NodeRunningStatus.Exception) {
        if (data.execution_metadata?.error_strategy === ErrorHandleTypeEnum.failBranch)
@@ -45,6 +45,11 @@ export const useWorkflowNodeStarted = () => {
      } = reactflow
      const currentNodeIndex = nodes.findIndex(node => node.id === data.node_id)
      const currentNode = nodes[currentNodeIndex]
+
+     // Skip if node not found (e.g., virtual extraction nodes)
+     if (!currentNode)
+       return
+
      const position = currentNode.position
      const zoom = transform[2]
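Both event handlers above replace a non-null assertion (`find(...)!`) with an explicit guard, because backend run events may reference virtual extraction nodes that have no canvas counterpart. The defensive shape, reduced to a pure function:

```typescript
type CanvasNode = { id: string, status?: string }

// Apply a status update, silently ignoring events for unknown node ids.
function markRunning(nodes: CanvasNode[], nodeId: string, status: string): CanvasNode[] {
  const node = nodes.find(n => n.id === nodeId)
  if (!node) // e.g. a virtual extraction node with no canvas counterpart
    return nodes
  return nodes.map(n => (n.id === nodeId ? { ...n, status } : n))
}

const nodes = [{ id: 'n1' }]
// markRunning(nodes, 'ghost', 'running') leaves the list unchanged;
// markRunning(nodes, 'n1', 'running') sets the status.
```

With the old `!` assertion, an event for an unknown id would have thrown inside the immer producer and aborted the whole state update.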
@@ -1,10 +1,10 @@
import type {
  Connection,
} from 'reactflow'
+import type { GroupNodeData } from '../nodes/group/types'
import type { IterationNodeType } from '../nodes/iteration/types'
import type { LoopNodeType } from '../nodes/loop/types'
import type {
-  BlockEnum,
  Edge,
  Node,
  ValueSelector,
@@ -28,14 +28,12 @@ import {
} from '../constants'
import { findUsedVarNodes, getNodeOutputVars, updateNodeVars } from '../nodes/_base/components/variable/utils'
import { CUSTOM_NOTE_NODE } from '../note-node/constants'

import {
  useStore,
  useWorkflowStore,
} from '../store'
-import {
-  WorkflowRunningStatus,
-} from '../types'
+import { BlockEnum, WorkflowRunningStatus } from '../types'
import {
  getWorkflowEntryNode,
  isWorkflowEntryNode,
@@ -381,7 +379,7 @@ export const useWorkflow = () => {
    return startNodes
  }, [nodesMap, getRootNodesById])

-  const isValidConnection = useCallback(({ source, sourceHandle: _sourceHandle, target }: Connection) => {
+  const isValidConnection = useCallback(({ source, sourceHandle, target }: Connection) => {
    const {
      edges,
      getNodes,
@@ -396,15 +394,42 @@ export const useWorkflow = () => {
    if (sourceNode.parentId !== targetNode.parentId)
      return false

+   // For Group nodes, use the leaf node's type for validation
+   // sourceHandle format: "${leafNodeId}-${originalSourceHandle}"
+   let actualSourceType = sourceNode.data.type
+   if (sourceNode.data.type === BlockEnum.Group && sourceHandle) {
+     const lastDashIndex = sourceHandle.lastIndexOf('-')
+     if (lastDashIndex > 0) {
+       const leafNodeId = sourceHandle.substring(0, lastDashIndex)
+       const leafNode = nodes.find(node => node.id === leafNodeId)
+       if (leafNode)
+         actualSourceType = leafNode.data.type
+     }
+   }
+
    if (sourceNode && targetNode) {
-     const sourceNodeAvailableNextNodes = getAvailableBlocks(sourceNode.data.type, !!sourceNode.parentId).availableNextBlocks
+     const sourceNodeAvailableNextNodes = getAvailableBlocks(actualSourceType, !!sourceNode.parentId).availableNextBlocks
      const targetNodeAvailablePrevNodes = getAvailableBlocks(targetNode.data.type, !!targetNode.parentId).availablePrevBlocks

      if (targetNode.data.type === BlockEnum.Group) {
        const groupData = targetNode.data as GroupNodeData
        const headNodeIds = groupData.headNodeIds || []
        if (headNodeIds.length > 0) {
          const headNode = nodes.find(node => node.id === headNodeIds[0])
          if (headNode) {
            const headNodeAvailablePrevNodes = getAvailableBlocks(headNode.data.type, !!targetNode.parentId).availablePrevBlocks
            if (!headNodeAvailablePrevNodes.includes(actualSourceType))
              return false
          }
        }
      }
      else {
        if (!sourceNodeAvailableNextNodes.includes(targetNode.data.type))
          return false

-       if (!targetNodeAvailablePrevNodes.includes(sourceNode.data.type))
-         return false
+       if (!targetNodeAvailablePrevNodes.includes(actualSourceType))
+         return false
      }
    }

    const hasCycle = (node: Node, visited = new Set()) => {
@@ -525,6 +550,7 @@ export const useIsNodeInLoop = (loopId: string) => {
      return false

    if (node.parentId === loopId)
      return true

    return false
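For a connection that starts on a group, `isValidConnection` above validates against the type of the leaf node encoded in the source handle rather than the group itself. A reduced version of that resolution, with the node shape simplified for illustration:

```typescript
type TypedNode = { id: string, type: string }

// Resolve the node type a group's source handle actually represents.
// Handle format: `${leafNodeId}-${originalSourceHandle}`
function resolveSourceType(sourceType: string, handle: string | undefined, nodes: TypedNode[]): string {
  if (sourceType !== 'group' || !handle)
    return sourceType
  const i = handle.lastIndexOf('-')
  if (i <= 0)
    return sourceType
  const leaf = nodes.find(n => n.id === handle.slice(0, i))
  return leaf ? leaf.type : sourceType
}

const canvasNodes = [{ id: 'llm-1', type: 'llm' }]
// resolveSourceType('group', 'llm-1-source', canvasNodes) → 'llm'
// resolveSourceType('code', undefined, canvasNodes)       → 'code'
```

Falling back to the group's own type when the leaf cannot be resolved keeps the check permissive rather than rejecting every connection from a malformed handle.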
@@ -54,6 +54,14 @@ import {
} from './constants'
import CustomConnectionLine from './custom-connection-line'
import CustomEdge from './custom-edge'
+import {
+  CUSTOM_GROUP_EXIT_PORT_NODE,
+  CUSTOM_GROUP_INPUT_NODE,
+  CUSTOM_GROUP_NODE,
+  CustomGroupExitPortNode,
+  CustomGroupInputNode,
+  CustomGroupNode,
+} from './custom-group-node'
import DatasetsDetailProvider from './datasets-detail-store/provider'
import HelpLine from './help-line'
import {
@@ -112,6 +120,9 @@ const nodeTypes = {
  [CUSTOM_ITERATION_START_NODE]: CustomIterationStartNode,
  [CUSTOM_LOOP_START_NODE]: CustomLoopStartNode,
  [CUSTOM_DATA_SOURCE_EMPTY_NODE]: CustomDataSourceEmptyNode,
+ [CUSTOM_GROUP_NODE]: CustomGroupNode,
+ [CUSTOM_GROUP_INPUT_NODE]: CustomGroupInputNode,
+ [CUSTOM_GROUP_EXIT_PORT_NODE]: CustomGroupExitPortNode,
}
const edgeTypes = {
  [CUSTOM_EDGE]: CustomEdge,
@@ -0,0 +1,135 @@
'use client'
import type { FC } from 'react'
import type { BlockEnum } from '@/app/components/workflow/types'
import * as React from 'react'
import { useState } from 'react'
import { useTranslation } from 'react-i18next'
import Input from '@/app/components/base/input'
import BlockIcon from '@/app/components/workflow/block-icon'
import { cn } from '@/utils/classnames'

export type AgentNode = {
  id: string
  title: string
  type: BlockEnum
}

type ItemProps = {
  node: AgentNode
  onSelect: (node: AgentNode) => void
}

const Item: FC<ItemProps> = ({ node, onSelect }) => {
  const [isHovering, setIsHovering] = useState(false)

  return (
    <button
      type="button"
      className={cn(
        'relative flex h-6 w-full cursor-pointer items-center rounded-md border-none bg-transparent px-3 text-left',
        isHovering && 'bg-state-base-hover',
      )}
      onMouseEnter={() => setIsHovering(true)}
      onMouseLeave={() => setIsHovering(false)}
      onClick={() => onSelect(node)}
      onMouseDown={e => e.preventDefault()}
    >
      <BlockIcon
        className="mr-1 shrink-0"
        type={node.type}
        size="xs"
      />
      <span
        className="system-sm-medium truncate text-text-secondary"
        title={node.title}
      >
        {node.title}
      </span>
    </button>
  )
}

type Props = {
  nodes: AgentNode[]
  onSelect: (node: AgentNode) => void
  onClose?: () => void
  onBlur?: () => void
  hideSearch?: boolean
  searchBoxClassName?: string
  maxHeightClass?: string
  autoFocus?: boolean
}

const AgentNodeList: FC<Props> = ({
  nodes,
  onSelect,
  onClose,
  onBlur,
  hideSearch,
  searchBoxClassName,
  maxHeightClass,
  autoFocus = true,
}) => {
  const { t } = useTranslation()
  const [searchText, setSearchText] = useState('')

  const handleKeyDown = (e: React.KeyboardEvent) => {
    if (e.key === 'Escape') {
      e.preventDefault()
      onClose?.()
    }
  }

  const filteredNodes = nodes.filter((node) => {
    if (!searchText)
      return true
    return node.title.toLowerCase().includes(searchText.toLowerCase())
  })

  return (
    <>
      {!hideSearch && (
        <>
          <div className={cn('mx-2 mb-2 mt-2', searchBoxClassName)}>
            <Input
              showLeftIcon
              showClearIcon
              value={searchText}
              placeholder={t('common.searchAgent', { ns: 'workflow' })}
              onChange={e => setSearchText(e.target.value)}
              onClick={e => e.stopPropagation()}
              onKeyDown={handleKeyDown}
              onClear={() => setSearchText('')}
              onBlur={onBlur}
              autoFocus={autoFocus}
            />
          </div>
          <div
            className="relative left-[-4px] h-[0.5px] bg-black/5"
            style={{ width: 'calc(100% + 8px)' }}
          />
        </>
      )}

      {filteredNodes.length > 0
        ? (
          <div className={cn('max-h-[85vh] overflow-y-auto py-1', maxHeightClass)}>
            {filteredNodes.map(node => (
              <Item
                key={node.id}
                node={node}
                onSelect={onSelect}
              />
            ))}
          </div>
        )
        : (
          <div className="py-2 pl-3 text-xs font-medium text-text-tertiary">
            {t('common.noAgentNodes', { ns: 'workflow' })}
          </div>
        )}
    </>
  )
}

export default React.memo(AgentNodeList)
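The list filtering in `AgentNodeList` above is a plain case-insensitive substring match, with an empty query keeping every item. Extracted as a reusable function for clarity:

```typescript
type Titled = { title: string }

// Empty query keeps everything; otherwise match the title case-insensitively.
function filterByTitle<T extends Titled>(items: T[], query: string): T[] {
  if (!query)
    return items
  const q = query.toLowerCase()
  return items.filter(item => item.title.toLowerCase().includes(q))
}

const agentItems = [{ title: 'Research Agent' }, { title: 'Writer' }]
// filterByTitle(agentItems, 'agent') keeps one item;
// filterByTitle(agentItems, '') keeps both.
```

Lowercasing the query once outside the loop avoids repeating the conversion per item, a small but common tidy-up over inlining it in the predicate.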
@@ -41,13 +41,14 @@ const PanelOperatorPopup = ({
    handleNodesDuplicate,
    handleNodeSelect,
    handleNodesCopy,
+   handleUngroup,
  } = useNodesInteractions()
  const { handleNodeDataUpdate } = useNodeDataUpdate()
  const { handleSyncWorkflowDraft } = useNodesSyncDraft()
  const { nodesReadOnly } = useNodesReadOnly()
  const edge = edges.find(edge => edge.target === id)
  const nodeMetaData = useNodeMetaData({ id, data } as Node)
- const showChangeBlock = !nodeMetaData.isTypeFixed && !nodesReadOnly
+ const showChangeBlock = !nodeMetaData.isTypeFixed && !nodesReadOnly && data.type !== BlockEnum.Group
  const isChildNode = !!(data.isInIteration || data.isInLoop)

  const { data: workflowTools } = useAllWorkflowTools()
@@ -61,6 +62,25 @@ const PanelOperatorPopup = ({

  return (
    <div className="w-[240px] rounded-lg border-[0.5px] border-components-panel-border bg-components-panel-bg shadow-xl">
+     {
+       !nodesReadOnly && data.type === BlockEnum.Group && (
+         <>
+           <div className="p-1">
+             <div
+               className="flex h-8 cursor-pointer items-center justify-between rounded-lg px-3 text-sm text-text-secondary hover:bg-state-base-hover"
+               onClick={() => {
+                 onClosePopup()
+                 handleUngroup(id)
+               }}
+             >
+               {t('panel.ungroup', { ns: 'workflow' })}
+               <ShortcutsName keys={['ctrl', 'shift', 'g']} />
+             </div>
+           </div>
+           <div className="h-px bg-divider-regular"></div>
+         </>
+       )
+     }
      {
        (showChangeBlock || canRunBySingle(data.type, isChildNode)) && (
          <>
@@ -594,7 +594,7 @@ const BasePanel: FC<BasePanelProps> = ({
          )
        }
        {
-         !needsToolAuth && !currentDataSource && !currentTriggerPlugin && (
+         !needsToolAuth && !currentDataSource && !currentTriggerPlugin && data.type !== BlockEnum.Group && (
            <div className="flex items-center justify-between pl-4 pr-3">
              <Tab
                value={tabType}
@@ -603,9 +603,9 @@ const BasePanel: FC<BasePanelProps> = ({
            </div>
          )
        }
-       <Split />
+       {data.type !== BlockEnum.Group && <Split />}
      </div>
-     {tabType === TabType.settings && (
+     {(tabType === TabType.settings || data.type === BlockEnum.Group) && (
        <div className="flex flex-1 flex-col overflow-y-auto">
          <div>
            {cloneElement(children as any, {
@@ -56,6 +56,7 @@ const singleRunFormParamsHooks: Record<BlockEnum, any> = {
  [BlockEnum.VariableAggregator]: useVariableAggregatorSingleRunFormParams,
  [BlockEnum.Assigner]: useVariableAssignerSingleRunFormParams,
  [BlockEnum.KnowledgeBase]: useKnowledgeBaseSingleRunFormParams,
+ [BlockEnum.Group]: undefined,
  [BlockEnum.VariableAssigner]: undefined,
  [BlockEnum.End]: undefined,
  [BlockEnum.Answer]: undefined,
@@ -103,6 +104,7 @@ const getDataForCheckMoreHooks: Record<BlockEnum, any> = {
  [BlockEnum.DataSource]: undefined,
  [BlockEnum.DataSourceEmpty]: undefined,
  [BlockEnum.KnowledgeBase]: undefined,
+ [BlockEnum.Group]: undefined,
  [BlockEnum.TriggerWebhook]: undefined,
  [BlockEnum.TriggerSchedule]: undefined,
  [BlockEnum.TriggerPlugin]: useTriggerPluginGetDataForCheckMore,
@@ -221,7 +221,7 @@ const BaseNode: FC<BaseNodeProps> = ({
        )
      }
      {
-       data.type !== BlockEnum.IfElse && data.type !== BlockEnum.QuestionClassifier && !data._isCandidate && (
+       data.type !== BlockEnum.IfElse && data.type !== BlockEnum.QuestionClassifier && data.type !== BlockEnum.Group && !data._isCandidate && (
          <NodeSourceHandle
            id={id}
            data={data}
@@ -14,6 +14,8 @@ import DocExtractorNode from './document-extractor/node'
import DocExtractorPanel from './document-extractor/panel'
import EndNode from './end/node'
import EndPanel from './end/panel'
+import GroupNode from './group/node'
+import GroupPanel from './group/panel'
import HttpNode from './http/node'
import HttpPanel from './http/panel'
import IfElseNode from './if-else/node'
@@ -75,6 +77,7 @@ export const NodeComponentMap: Record<string, ComponentType<any>> = {
  [BlockEnum.TriggerSchedule]: TriggerScheduleNode,
  [BlockEnum.TriggerWebhook]: TriggerWebhookNode,
  [BlockEnum.TriggerPlugin]: TriggerPluginNode,
+ [BlockEnum.Group]: GroupNode,
}

export const PanelComponentMap: Record<string, ComponentType<any>> = {
@@ -103,4 +106,5 @@ export const PanelComponentMap: Record<string, ComponentType<any>> = {
  [BlockEnum.TriggerSchedule]: TriggerSchedulePanel,
  [BlockEnum.TriggerWebhook]: TriggerWebhookPanel,
  [BlockEnum.TriggerPlugin]: TriggerPluginPanel,
+ [BlockEnum.Group]: GroupPanel,
}
26  web/app/components/workflow/nodes/group/default.ts  Normal file
@@ -0,0 +1,26 @@
+import type { NodeDefault } from '../../types'
+import type { GroupNodeData } from './types'
+import { BlockEnum } from '@/app/components/workflow/types'
+import { genNodeMetaData } from '@/app/components/workflow/utils'
+
+const metaData = genNodeMetaData({
+  sort: 100,
+  type: BlockEnum.Group,
+})
+
+const nodeDefault: NodeDefault<GroupNodeData> = {
+  metaData,
+  defaultValue: {
+    members: [],
+    handlers: [],
+    headNodeIds: [],
+    leafNodeIds: [],
+  },
+  checkValid() {
+    return {
+      isValid: true,
+    }
+  },
+}
+
+export default nodeDefault
94  web/app/components/workflow/nodes/group/node.tsx  Normal file
@@ -0,0 +1,94 @@
+import type { GroupHandler, GroupMember, GroupNodeData } from './types'
+import type { BlockEnum, NodeProps } from '@/app/components/workflow/types'
+import { RiArrowRightSLine } from '@remixicon/react'
+import { memo, useMemo } from 'react'
+import BlockIcon from '@/app/components/workflow/block-icon'
+import { cn } from '@/utils/classnames'
+import { NodeSourceHandle } from '../_base/components/node-handle'
+
+const MAX_MEMBER_ICONS = 12
+
+const GroupNode = (props: NodeProps<GroupNodeData>) => {
+  const { data } = props
+
+  // show the explicitly passed members first; otherwise use the _children information to fill the type
+  const members: GroupMember[] = useMemo(() => (
+    data.members?.length
+      ? data.members
+      : data._children?.length
+        ? data._children.map(child => ({
+            id: child.nodeId,
+            type: child.nodeType as BlockEnum,
+            label: child.nodeType,
+          }))
+        : []
+  ), [data._children, data.members])
+
+  const handlers: GroupHandler[] = useMemo(() => (
+    data.handlers?.length
+      ? data.handlers
+      : members.length
+        ? members.map(member => ({
+            id: `${member.id}-source`,
+            label: member.label || member.id,
+            nodeId: member.id,
+            sourceHandle: 'source',
+          }))
+        : []
+  ), [data.handlers, members])
+
+  return (
+    <div className="space-y-2 px-3 pb-3">
+      {members.length > 0 && (
+        <div className="flex items-center gap-1 overflow-hidden">
+          <div className="flex flex-wrap items-center gap-1 overflow-hidden">
+            {members.slice(0, MAX_MEMBER_ICONS).map(member => (
+              <div
+                key={member.id}
+                className="flex h-7 items-center rounded-full bg-components-input-bg-normal px-1.5 shadow-xs"
+              >
+                <BlockIcon
+                  type={member.type}
+                  size="xs"
+                  className="!shadow-none"
+                />
+              </div>
+            ))}
+            {members.length > MAX_MEMBER_ICONS && (
+              <div className="system-xs-medium rounded-full bg-components-input-bg-normal px-2 py-1 text-text-tertiary">
+                +
+                {members.length - MAX_MEMBER_ICONS}
+              </div>
+            )}
+          </div>
+          <RiArrowRightSLine className="ml-auto h-4 w-4 shrink-0 text-text-tertiary" />
+        </div>
+      )}
+      {handlers.length > 0 && (
+        <div className="space-y-1">
+          {handlers.map(handler => (
+            <div
+              key={handler.id}
+              className={cn(
+                'relative',
+                'system-sm-semibold uppercase',
+                'flex h-9 items-center rounded-md bg-components-panel-on-panel-item-bg px-3 text-text-primary shadow-xs',
+              )}
+            >
+              {handler.label || handler.id}
+              <NodeSourceHandle
+                {...props}
+                handleId={handler.id}
+                handleClassName="!top-1/2 !-translate-y-1/2 !-right-[21px]"
+              />
+            </div>
+          ))}
+        </div>
+      )}
+    </div>
+  )
+}
+
+GroupNode.displayName = 'GroupNode'
+
+export default memo(GroupNode)
9  web/app/components/workflow/nodes/group/panel.tsx  Normal file
@@ -0,0 +1,9 @@
+import { memo } from 'react'
+
+const GroupPanel = () => {
+  return null
+}
+
+GroupPanel.displayName = 'GroupPanel'
+
+export default memo(GroupPanel)
21  web/app/components/workflow/nodes/group/types.ts  Normal file
@@ -0,0 +1,21 @@
+import type { BlockEnum, CommonNodeType } from '../../types'
+
+export type GroupMember = {
+  id: string
+  type: BlockEnum
+  label?: string
+}
+
+export type GroupHandler = {
+  id: string
+  label?: string
+  nodeId?: string // leaf node id for multi-branch nodes
+  sourceHandle?: string // original sourceHandle (e.g., case_id for if-else)
+}
+
+export type GroupNodeData = CommonNodeType<{
+  members?: GroupMember[]
+  handlers?: GroupHandler[]
+  headNodeIds?: string[] // nodes that receive input from outside the group
+  leafNodeIds?: string[] // nodes that send output to outside the group
+}>
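The `GroupMember`/`GroupHandler` shapes above drive the group node's fallback rendering: when no explicit `handlers` are stored, one source handle is synthesized per member (see the `useMemo` in `node.tsx`). A standalone sketch of that fallback, with the two types inlined as plain structural types (this is illustrative code, not the component itself):

```typescript
// Inlined, simplified copies of the shapes from types.ts.
type GroupMember = { id: string; type: string; label?: string }
type GroupHandler = { id: string; label?: string; nodeId?: string; sourceHandle?: string }

// Mirrors the fallback in node.tsx: keep explicit handlers if present,
// otherwise derive one 'source' handle per group member.
function deriveHandlers(members: GroupMember[], explicit?: GroupHandler[]): GroupHandler[] {
  if (explicit?.length)
    return explicit
  return members.map(member => ({
    id: `${member.id}-source`,
    label: member.label || member.id,
    nodeId: member.id,
    sourceHandle: 'source',
  }))
}

const handlers = deriveHandlers([{ id: 'llm-1', type: 'llm', label: 'LLM' }])
console.log(handlers[0].id) // llm-1-source
```

The `label || id` fallback matches what the node renders when a member has no label.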
@@ -0,0 +1,52 @@
+import type { FC } from 'react'
+import { RiCloseLine, RiEqualizer2Line } from '@remixicon/react'
+import { memo } from 'react'
+import { useTranslation } from 'react-i18next'
+import { Agent } from '@/app/components/base/icons/src/vender/workflow'
+
+type AgentHeaderBarProps = {
+  agentName: string
+  onRemove: () => void
+  onViewInternals?: () => void
+}
+
+const AgentHeaderBar: FC<AgentHeaderBarProps> = ({
+  agentName,
+  onRemove,
+  onViewInternals,
+}) => {
+  const { t } = useTranslation()
+
+  return (
+    <div className="flex items-center justify-between px-2 py-1">
+      <div className="flex items-center gap-1">
+        <div className="flex items-center gap-1 rounded-md border-[0.5px] border-components-panel-border-subtle bg-components-badge-white-to-dark px-1.5 py-0.5 shadow-xs">
+          <div className="flex h-4 w-4 items-center justify-center rounded bg-util-colors-indigo-indigo-500">
+            <Agent className="h-3 w-3 text-text-primary-on-surface" />
+          </div>
+          <span className="system-xs-medium text-text-secondary">
+            @
+            {agentName}
+          </span>
+          <button
+            type="button"
+            className="flex h-4 w-4 items-center justify-center rounded hover:bg-state-base-hover"
+            onClick={onRemove}
+          >
+            <RiCloseLine className="h-3 w-3 text-text-tertiary" />
+          </button>
+        </div>
+      </div>
+      <button
+        type="button"
+        className="flex items-center gap-0.5 text-text-tertiary hover:text-text-secondary"
+        onClick={onViewInternals}
+      >
+        <RiEqualizer2Line className="h-3.5 w-3.5" />
+        <span className="system-xs-medium">{t('common.viewInternals', { ns: 'workflow' })}</span>
+      </button>
+    </div>
+  )
+}
+
+export default memo(AgentHeaderBar)
@@ -1,17 +1,29 @@
+import type { AgentBlockType } from '@/app/components/base/prompt-editor/types'
 import type {
   Node,
   NodeOutPutVar,
 } from '@/app/components/workflow/types'
 import {
   memo,
   useCallback,
   useEffect,
   useMemo,
   useState,
 } from 'react'
 import { useTranslation } from 'react-i18next'
 import PromptEditor from '@/app/components/base/prompt-editor'
 import { useStore } from '@/app/components/workflow/store'
 import { BlockEnum } from '@/app/components/workflow/types'
 import { cn } from '@/utils/classnames'
+import AgentHeaderBar from './agent-header-bar'
 import Placeholder from './placeholder'

+/**
+ * Matches workflow variable syntax: {{#nodeId.varName#}}
+ * Example: {{#agent-123.text#}} -> captures "agent-123.text"
+ */
+const WORKFLOW_VAR_PATTERN = /\{\{#([^#]+)#\}\}/g
+
 type MixedVariableTextInputProps = {
   readOnly?: boolean
   nodesOutputVars?: NodeOutPutVar[]
@@ -21,7 +33,9 @@ type MixedVariableTextInputProps = {
   showManageInputField?: boolean
   onManageInputField?: () => void
   disableVariableInsertion?: boolean
+  onViewInternals?: () => void
 }

 const MixedVariableTextInput = ({
   readOnly = false,
   nodesOutputVars,
@@ -31,43 +45,124 @@ const MixedVariableTextInput = ({
   showManageInputField,
   onManageInputField,
   disableVariableInsertion = false,
+  onViewInternals,
 }: MixedVariableTextInputProps) => {
   const { t } = useTranslation()
   const controlPromptEditorRerenderKey = useStore(s => s.controlPromptEditorRerenderKey)
   const setControlPromptEditorRerenderKey = useStore(s => s.setControlPromptEditorRerenderKey)

+  const nodesByIdMap = useMemo(() => {
+    return availableNodes.reduce((acc, node) => {
+      acc[node.id] = node
+      return acc
+    }, {} as Record<string, Node>)
+  }, [availableNodes])
+
+  const detectedAgentFromValue = useMemo(() => {
+    if (!value)
+      return null
+
+    const matches = value.matchAll(WORKFLOW_VAR_PATTERN)
+    for (const match of matches) {
+      const variablePath = match[1]
+      const nodeId = variablePath.split('.')[0]
+      const node = nodesByIdMap[nodeId]
+      if (node?.data.type === BlockEnum.Agent) {
+        return {
+          nodeId,
+          name: node.data.title,
+        }
+      }
+    }
+    return null
+  }, [value, nodesByIdMap])
+
+  const [selectedAgent, setSelectedAgent] = useState<{ id: string, title: string } | null>(null)
+
+  useEffect(() => {
+    if (!detectedAgentFromValue && selectedAgent)
+      setSelectedAgent(null)
+  }, [detectedAgentFromValue, selectedAgent])
+
+  const agentNodes = useMemo(() => {
+    return availableNodes
+      .filter(node => node.data.type === BlockEnum.Agent)
+      .map(node => ({
+        id: node.id,
+        title: node.data.title,
+      }))
+  }, [availableNodes])
+
+  const handleAgentSelect = useCallback((agent: { id: string, title: string }) => {
+    setSelectedAgent(agent)
+  }, [])
+
+  const handleAgentRemove = useCallback(() => {
+    const agentNodeId = detectedAgentFromValue?.nodeId || selectedAgent?.id
+    if (!agentNodeId || !onChange)
+      return
+
+    const pattern = /\{\{#([^#]+)#\}\}/g
+    const valueWithoutAgentVars = value.replace(pattern, (match, variablePath) => {
+      const nodeId = variablePath.split('.')[0]
+      return nodeId === agentNodeId ? '' : match
+    }).trim()
+
+    onChange(valueWithoutAgentVars)
+    setSelectedAgent(null)
+    setControlPromptEditorRerenderKey(Date.now())
+  }, [detectedAgentFromValue?.nodeId, selectedAgent?.id, value, onChange, setControlPromptEditorRerenderKey])
+
+  const displayedAgent = detectedAgentFromValue || (selectedAgent ? { nodeId: selectedAgent.id, name: selectedAgent.title } : null)
+
   return (
-    <PromptEditor
-      key={controlPromptEditorRerenderKey}
-      wrapperClassName={cn(
-        'min-h-8 w-full rounded-lg border border-transparent bg-components-input-bg-normal px-2 py-1',
-        'hover:border-components-input-border-hover hover:bg-components-input-bg-hover',
-        'focus-within:border-components-input-border-active focus-within:bg-components-input-bg-active focus-within:shadow-xs',
-      )}
-      className="caret:text-text-accent"
-      editable={!readOnly}
-      value={value}
-      workflowVariableBlock={{
-        show: !disableVariableInsertion,
-        variables: nodesOutputVars || [],
-        workflowNodesMap: availableNodes.reduce((acc, node) => {
-          acc[node.id] = {
-            title: node.data.title,
-            type: node.data.type,
-          }
-          if (node.data.type === BlockEnum.Start) {
-            acc.sys = {
-              title: t('blocks.start', { ns: 'workflow' }),
-              type: BlockEnum.Start,
-            }
-          }
-          return acc
-        }, {} as any),
-        showManageInputField,
-        onManageInputField,
-      }}
-      placeholder={<Placeholder disableVariableInsertion={disableVariableInsertion} />}
-      onChange={onChange}
-    />
+    <div className={cn(
+      'w-full rounded-lg border border-transparent bg-components-input-bg-normal',
+      'hover:border-components-input-border-hover hover:bg-components-input-bg-hover',
+      'focus-within:border-components-input-border-active focus-within:bg-components-input-bg-active focus-within:shadow-xs',
+    )}
+    >
+      {displayedAgent && (
+        <AgentHeaderBar
+          agentName={displayedAgent.name}
+          onRemove={handleAgentRemove}
+          onViewInternals={onViewInternals}
+        />
+      )}
+      <PromptEditor
+        key={controlPromptEditorRerenderKey}
+        wrapperClassName="min-h-8 px-2 py-1"
+        className="caret:text-text-accent"
+        editable={!readOnly}
+        value={value}
+        workflowVariableBlock={{
+          show: !disableVariableInsertion,
+          variables: nodesOutputVars || [],
+          workflowNodesMap: availableNodes.reduce((acc, node) => {
+            acc[node.id] = {
+              title: node.data.title,
+              type: node.data.type,
+            }
+            if (node.data.type === BlockEnum.Start) {
+              acc.sys = {
+                title: t('blocks.start', { ns: 'workflow' }),
+                type: BlockEnum.Start,
+              }
+            }
+            return acc
+          }, {} as any),
+          showManageInputField,
+          onManageInputField,
+        }}
+        agentBlock={{
+          show: agentNodes.length > 0 && !displayedAgent,
+          agentNodes,
+          onSelect: handleAgentSelect,
+        } as AgentBlockType}
+        placeholder={<Placeholder disableVariableInsertion={disableVariableInsertion} hasSelectedAgent={!!displayedAgent} />}
+        onChange={onChange}
+      />
+    </div>
   )
 }
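Both the agent detection and the removal handler in the diff above hinge on the same `WORKFLOW_VAR_PATTERN` regex. Its behavior can be checked in isolation; the sketch below mirrors the diff's logic (extracting the node id before the first `.`, and the `replace` + `trim` in `handleAgentRemove`) but is standalone illustrative code, not the component:

```typescript
// Same pattern as in the diff: matches {{#nodeId.varName#}} and captures "nodeId.varName".
const WORKFLOW_VAR_PATTERN = /\{\{#([^#]+)#\}\}/g

// Collect the node id of every variable reference in a prompt string.
function referencedNodeIds(value: string): string[] {
  return [...value.matchAll(WORKFLOW_VAR_PATTERN)].map(m => m[1].split('.')[0])
}

// Strip every reference pointing at one node, as handleAgentRemove does.
function stripNodeVars(value: string, nodeId: string): string {
  return value.replace(WORKFLOW_VAR_PATTERN, (match, path: string) =>
    path.split('.')[0] === nodeId ? '' : match).trim()
}

const prompt = 'Ask {{#agent-123.text#}} then use {{#llm-1.output#}}'
console.log(referencedNodeIds(prompt)) // ['agent-123', 'llm-1']
console.log(stripNodeVars(prompt, 'agent-123')) // the agent-123 reference is removed
```

Note that `matchAll` requires the `g` flag and does not advance the regex's `lastIndex`, so reusing the module-level pattern in both helpers is safe.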
@@ -7,9 +7,10 @@ import { CustomTextNode } from '@/app/components/base/prompt-editor/plugins/cust

 type PlaceholderProps = {
   disableVariableInsertion?: boolean
+  hasSelectedAgent?: boolean
 }

-const Placeholder = ({ disableVariableInsertion = false }: PlaceholderProps) => {
+const Placeholder = ({ disableVariableInsertion = false, hasSelectedAgent = false }: PlaceholderProps) => {
   const { t } = useTranslation()
   const [editor] = useLexicalComposerContext()

@@ -44,6 +45,21 @@ const Placeholder = ({ disableVariableInsertion = false }: PlaceholderProps) =>
           >
             {t('nodes.tool.insertPlaceholder2', { ns: 'workflow' })}
           </div>
+          {!hasSelectedAgent && (
+            <>
+              <div className="system-kbd mx-0.5 flex h-4 w-4 items-center justify-center rounded bg-components-kbd-bg-gray text-text-placeholder">@</div>
+              <div
+                className="system-sm-regular cursor-pointer text-components-input-text-placeholder underline decoration-dotted decoration-auto underline-offset-auto hover:text-text-tertiary"
+                onMouseDown={((e) => {
+                  e.preventDefault()
+                  e.stopPropagation()
+                  handleInsert('@')
+                })}
+              >
+                {t('nodes.tool.insertPlaceholder3', { ns: 'workflow' })}
+              </div>
+            </>
+          )}
         </>
       )}
     </div>
@@ -1,3 +1,5 @@
+import type { FC, ReactElement } from 'react'
+import type { I18nKeysByPrefix } from '@/types/i18n'
 import {
   RiAlignBottom,
   RiAlignCenter,
@@ -17,9 +19,13 @@
 } from 'react'
 import { useTranslation } from 'react-i18next'
 import { useStore as useReactFlowStore, useStoreApi } from 'reactflow'
-import { useNodesReadOnly, useNodesSyncDraft } from './hooks'
 import { shallow } from 'zustand/shallow'
+import Tooltip from '@/app/components/base/tooltip'
+import { useNodesInteractions, useNodesReadOnly, useNodesSyncDraft } from './hooks'
+import { useMakeGroupAvailability } from './hooks/use-make-group'
 import { useSelectionInteractions } from './hooks/use-selection-interactions'
 import { useWorkflowHistory, WorkflowHistoryEvent } from './hooks/use-workflow-history'
+import ShortcutsName from './shortcuts-name'
 import { useStore, useWorkflowStore } from './store'

 enum AlignType {
@@ -33,21 +39,67 @@ enum AlignType {
   DistributeVertical = 'distributeVertical',
 }

+type AlignButtonConfig = {
+  type: AlignType
+  icon: ReactElement
+  labelKey: I18nKeysByPrefix<'workflow', 'operator.'>
+}
+
+type AlignButtonProps = {
+  config: AlignButtonConfig
+  label: string
+  onClick: (type: AlignType) => void
+  position?: 'top' | 'bottom' | 'left' | 'right'
+}
+
+const AlignButton: FC<AlignButtonProps> = ({ config, label, onClick, position = 'bottom' }) => {
+  return (
+    <Tooltip position={position} popupContent={label}>
+      <div
+        className="flex h-7 w-7 cursor-pointer items-center justify-center rounded-md text-text-secondary hover:bg-state-base-hover"
+        onClick={() => onClick(config.type)}
+      >
+        {config.icon}
+      </div>
+    </Tooltip>
+  )
+}
+
+const ALIGN_BUTTONS: AlignButtonConfig[] = [
+  { type: AlignType.Left, icon: <RiAlignLeft className="h-4 w-4" />, labelKey: 'alignLeft' },
+  { type: AlignType.Center, icon: <RiAlignCenter className="h-4 w-4" />, labelKey: 'alignCenter' },
+  { type: AlignType.Right, icon: <RiAlignRight className="h-4 w-4" />, labelKey: 'alignRight' },
+  { type: AlignType.DistributeHorizontal, icon: <RiAlignJustify className="h-4 w-4" />, labelKey: 'distributeHorizontal' },
+  { type: AlignType.Top, icon: <RiAlignTop className="h-4 w-4" />, labelKey: 'alignTop' },
+  { type: AlignType.Middle, icon: <RiAlignCenter className="h-4 w-4 rotate-90" />, labelKey: 'alignMiddle' },
+  { type: AlignType.Bottom, icon: <RiAlignBottom className="h-4 w-4" />, labelKey: 'alignBottom' },
+  { type: AlignType.DistributeVertical, icon: <RiAlignJustify className="h-4 w-4 rotate-90" />, labelKey: 'distributeVertical' },
+]
+
 const SelectionContextmenu = () => {
   const { t } = useTranslation()
   const ref = useRef(null)
-  const { getNodesReadOnly } = useNodesReadOnly()
+  const { getNodesReadOnly, nodesReadOnly } = useNodesReadOnly()
   const { handleSelectionContextmenuCancel } = useSelectionInteractions()
+  const {
+    handleNodesCopy,
+    handleNodesDuplicate,
+    handleNodesDelete,
+    handleMakeGroup,
+  } = useNodesInteractions()
   const selectionMenu = useStore(s => s.selectionMenu)

   // Access React Flow methods
   const store = useStoreApi()
   const workflowStore = useWorkflowStore()

   // Get selected nodes for alignment logic
-  const selectedNodes = useReactFlowStore(state =>
-    state.getNodes().filter(node => node.selected),
-  )
+  const selectedNodeIds = useReactFlowStore((state) => {
+    const ids = state.getNodes().filter(node => node.selected).map(node => node.id)
+    ids.sort()
+    return ids
+  }, shallow)
+
+  const { canMakeGroup } = useMakeGroupAvailability(selectedNodeIds)

   const { handleSyncWorkflowDraft } = useNodesSyncDraft()
   const { saveStateToHistory } = useWorkflowHistory()
@@ -65,9 +117,9 @@ const SelectionContextmenu = () => {
     if (container) {
       const { width: containerWidth, height: containerHeight } = container.getBoundingClientRect()

-      const menuWidth = 240
+      const menuWidth = 244

-      const estimatedMenuHeight = 380
+      const estimatedMenuHeight = 203

       if (left + menuWidth > containerWidth)
         left = left - menuWidth
@@ -87,9 +139,9 @@ const SelectionContextmenu = () => {
   }, ref)

   useEffect(() => {
-    if (selectionMenu && selectedNodes.length <= 1)
+    if (selectionMenu && selectedNodeIds.length <= 1)
       handleSelectionContextmenuCancel()
-  }, [selectionMenu, selectedNodes.length, handleSelectionContextmenuCancel])
+  }, [selectionMenu, selectedNodeIds.length, handleSelectionContextmenuCancel])

   // Handle align nodes logic
   const handleAlignNode = useCallback((currentNode: any, nodeToAlign: any, alignType: AlignType, minX: number, maxX: number, minY: number, maxY: number) => {
@@ -248,7 +300,7 @@ const SelectionContextmenu = () => {
   }, [])

   const handleAlignNodes = useCallback((alignType: AlignType) => {
-    if (getNodesReadOnly() || selectedNodes.length <= 1) {
+    if (getNodesReadOnly() || selectedNodeIds.length <= 1) {
       handleSelectionContextmenuCancel()
       return
     }
@@ -259,9 +311,6 @@ const SelectionContextmenu = () => {
     // Get all current nodes
     const nodes = store.getState().getNodes()

-    // Get all selected nodes
-    const selectedNodeIds = selectedNodes.map(node => node.id)
-
     // Find container nodes and their children
     // Container nodes (like Iteration and Loop) have child nodes that should not be aligned independently
     // when the container is selected. This prevents child nodes from being moved outside their containers.
@@ -367,7 +416,7 @@ const SelectionContextmenu = () => {
     catch (err) {
       console.error('Failed to update nodes:', err)
     }
-  }, [store, workflowStore, selectedNodes, getNodesReadOnly, handleSyncWorkflowDraft, saveStateToHistory, handleSelectionContextmenuCancel, handleAlignNode, handleDistributeNodes])
+  }, [getNodesReadOnly, handleAlignNode, handleDistributeNodes, handleSelectionContextmenuCancel, handleSyncWorkflowDraft, saveStateToHistory, selectedNodeIds, store, workflowStore])

   if (!selectionMenu)
     return null
@@ -381,73 +430,75 @@ const SelectionContextmenu = () => {
       }}
       ref={ref}
     >
-      <div ref={menuRef} className="w-[240px] rounded-lg border-[0.5px] border-components-panel-border bg-components-panel-bg shadow-xl">
-        <div className="p-1">
-          <div className="system-xs-medium px-2 py-2 text-text-tertiary">
-            {t('operator.vertical', { ns: 'workflow' })}
-          </div>
-          <div
-            className="flex h-8 cursor-pointer items-center gap-2 rounded-lg px-3 text-sm text-text-secondary hover:bg-state-base-hover"
-            onClick={() => handleAlignNodes(AlignType.Top)}
-          >
-            <RiAlignTop className="h-4 w-4" />
-            {t('operator.alignTop', { ns: 'workflow' })}
-          </div>
-          <div
-            className="flex h-8 cursor-pointer items-center gap-2 rounded-lg px-3 text-sm text-text-secondary hover:bg-state-base-hover"
-            onClick={() => handleAlignNodes(AlignType.Middle)}
-          >
-            <RiAlignCenter className="h-4 w-4 rotate-90" />
-            {t('operator.alignMiddle', { ns: 'workflow' })}
-          </div>
-          <div
-            className="flex h-8 cursor-pointer items-center gap-2 rounded-lg px-3 text-sm text-text-secondary hover:bg-state-base-hover"
-            onClick={() => handleAlignNodes(AlignType.Bottom)}
-          >
-            <RiAlignBottom className="h-4 w-4" />
-            {t('operator.alignBottom', { ns: 'workflow' })}
-          </div>
-          <div
-            className="flex h-8 cursor-pointer items-center gap-2 rounded-lg px-3 text-sm text-text-secondary hover:bg-state-base-hover"
-            onClick={() => handleAlignNodes(AlignType.DistributeVertical)}
-          >
-            <RiAlignJustify className="h-4 w-4 rotate-90" />
-            {t('operator.distributeVertical', { ns: 'workflow' })}
-          </div>
-        </div>
-        <div className="h-px bg-divider-regular"></div>
-        <div className="p-1">
-          <div className="system-xs-medium px-2 py-2 text-text-tertiary">
-            {t('operator.horizontal', { ns: 'workflow' })}
-          </div>
-          <div
-            className="flex h-8 cursor-pointer items-center gap-2 rounded-lg px-3 text-sm text-text-secondary hover:bg-state-base-hover"
-            onClick={() => handleAlignNodes(AlignType.Left)}
-          >
-            <RiAlignLeft className="h-4 w-4" />
-            {t('operator.alignLeft', { ns: 'workflow' })}
-          </div>
-          <div
-            className="flex h-8 cursor-pointer items-center gap-2 rounded-lg px-3 text-sm text-text-secondary hover:bg-state-base-hover"
-            onClick={() => handleAlignNodes(AlignType.Center)}
-          >
-            <RiAlignCenter className="h-4 w-4" />
-            {t('operator.alignCenter', { ns: 'workflow' })}
-          </div>
-          <div
-            className="flex h-8 cursor-pointer items-center gap-2 rounded-lg px-3 text-sm text-text-secondary hover:bg-state-base-hover"
-            onClick={() => handleAlignNodes(AlignType.Right)}
-          >
-            <RiAlignRight className="h-4 w-4" />
-            {t('operator.alignRight', { ns: 'workflow' })}
-          </div>
-          <div
-            className="flex h-8 cursor-pointer items-center gap-2 rounded-lg px-3 text-sm text-text-secondary hover:bg-state-base-hover"
-            onClick={() => handleAlignNodes(AlignType.DistributeHorizontal)}
-          >
-            <RiAlignJustify className="h-4 w-4" />
-            {t('operator.distributeHorizontal', { ns: 'workflow' })}
-          </div>
+      <div ref={menuRef} className="w-[244px] rounded-lg border-[0.5px] border-components-panel-border bg-components-panel-bg shadow-xl">
+        {!nodesReadOnly && (
+          <>
+            <div className="p-1">
+              <div
+                className={`flex h-8 items-center justify-between rounded-lg px-3 text-sm ${
+                  canMakeGroup
+                    ? 'cursor-pointer text-text-secondary hover:bg-state-base-hover'
+                    : 'cursor-not-allowed text-text-disabled'
+                }`}
+                onClick={() => {
+                  if (!canMakeGroup)
+                    return
+                  handleMakeGroup()
+                  handleSelectionContextmenuCancel()
+                }}
+              >
+                {t('operator.makeGroup', { ns: 'workflow' })}
+                <ShortcutsName keys={['ctrl', 'g']} className={!canMakeGroup ? 'opacity-50' : ''} />
+              </div>
+            </div>
+            <div className="h-px bg-divider-regular" />
+            <div className="p-1">
+              <div
+                className="flex h-8 cursor-pointer items-center justify-between rounded-lg px-3 text-sm text-text-secondary hover:bg-state-base-hover"
+                onClick={() => {
+                  handleNodesCopy()
+                  handleSelectionContextmenuCancel()
+                }}
+              >
+                {t('common.copy', { ns: 'workflow' })}
+                <ShortcutsName keys={['ctrl', 'c']} />
+              </div>
+              <div
+                className="flex h-8 cursor-pointer items-center justify-between rounded-lg px-3 text-sm text-text-secondary hover:bg-state-base-hover"
+                onClick={() => {
+                  handleNodesDuplicate()
+                  handleSelectionContextmenuCancel()
+                }}
+              >
+                {t('common.duplicate', { ns: 'workflow' })}
+                <ShortcutsName keys={['ctrl', 'd']} />
+              </div>
+            </div>
+            <div className="h-px bg-divider-regular" />
+            <div className="p-1">
+              <div
+                className="flex h-8 cursor-pointer items-center justify-between rounded-lg px-3 text-sm text-text-secondary hover:bg-state-destructive-hover hover:text-text-destructive"
+                onClick={() => {
+                  handleNodesDelete()
+                  handleSelectionContextmenuCancel()
+                }}
+              >
+                {t('operation.delete', { ns: 'common' })}
+                <ShortcutsName keys={['del']} />
+              </div>
+            </div>
+            <div className="h-px bg-divider-regular" />
+          </>
+        )}
+        <div className="flex items-center justify-between p-1">
+          {ALIGN_BUTTONS.map(config => (
+            <AlignButton
+              key={config.type}
+              config={config}
+              label={t(`operator.${config.labelKey}`, { ns: 'workflow' })}
+              onClick={handleAlignNodes}
+            />
+          ))}
         </div>
       </div>
     </div>
@@ -30,6 +30,7 @@ export enum BlockEnum {
   Code = 'code',
   TemplateTransform = 'template-transform',
   HttpRequest = 'http-request',
+  Group = 'group',
   VariableAssigner = 'variable-assigner',
   VariableAggregator = 'variable-aggregator',
   Tool = 'tool',
@@ -79,6 +80,7 @@ export type CommonNodeType<T = {}> = {
   _isEntering?: boolean
   _showAddVariablePopup?: boolean
   _holdAddVariablePopup?: boolean
+  _hiddenInGroupId?: string
   _iterationLength?: number
   _iterationIndex?: number
   _waitingRun?: boolean
@@ -113,6 +115,7 @@ export type CommonEdgeType = {
   _connectedNodeIsHovering?: boolean
   _connectedNodeIsSelected?: boolean
   _isBundled?: boolean
+  _hiddenInGroupId?: string
   _sourceRunningStatus?: NodeRunningStatus
   _targetRunningStatus?: NodeRunningStatus
   _waitingRun?: boolean
@@ -1,21 +1,15 @@
|
||||
import type { CustomGroupNodeData } from '../custom-group-node'
|
||||
import type { GroupNodeData } from '../nodes/group/types'
|
||||
import type { IfElseNodeType } from '../nodes/if-else/types'
|
||||
import type { IterationNodeType } from '../nodes/iteration/types'
|
||||
import type { LoopNodeType } from '../nodes/loop/types'
|
||||
import type { QuestionClassifierNodeType } from '../nodes/question-classifier/types'
|
||||
import type { ToolNodeType } from '../nodes/tool/types'
|
||||
import type {
|
||||
Edge,
|
||||
Node,
|
||||
} from '../types'
|
||||
import type { Edge, Node } from '../types'
|
||||
import { cloneDeep } from 'es-toolkit/object'
import {
getConnectedEdges,
} from 'reactflow'
import { getConnectedEdges } from 'reactflow'
import { getIterationStartNode, getLoopStartNode } from '@/app/components/workflow/utils/node'
import { correctModelProvider } from '@/utils'
import {
getIterationStartNode,
getLoopStartNode,
} from '.'
import {
CUSTOM_NODE,
DEFAULT_RETRY_INTERVAL,
@@ -25,18 +19,22 @@ import {
NODE_WIDTH_X_OFFSET,
START_INITIAL_POSITION,
} from '../constants'
import { CUSTOM_GROUP_NODE, GROUP_CHILDREN_Z_INDEX } from '../custom-group-node'
import { branchNameCorrect } from '../nodes/if-else/utils'
import { CUSTOM_ITERATION_START_NODE } from '../nodes/iteration-start/constants'
import { CUSTOM_LOOP_START_NODE } from '../nodes/loop-start/constants'
import {
BlockEnum,
ErrorHandleMode,
} from '../types'
import { BlockEnum, ErrorHandleMode } from '../types'

const WHITE = 'WHITE'
const GRAY = 'GRAY'
const BLACK = 'BLACK'
const isCyclicUtil = (nodeId: string, color: Record<string, string>, adjList: Record<string, string[]>, stack: string[]) => {

const isCyclicUtil = (
nodeId: string,
color: Record<string, string>,
adjList: Record<string, string[]>,
stack: string[],
) => {
color[nodeId] = GRAY
stack.push(nodeId)

@@ -47,8 +45,12 @@ const isCyclicUtil = (nodeId: string, color: Record<string, string>, adjList: Re
stack.push(childId)
return true
}
if (color[childId] === WHITE && isCyclicUtil(childId, color, adjList, stack))
if (
color[childId] === WHITE
&& isCyclicUtil(childId, color, adjList, stack)
) {
return true
}
}
color[nodeId] = BLACK
if (stack.length > 0 && stack[stack.length - 1] === nodeId)
@@ -66,8 +68,7 @@ const getCycleEdges = (nodes: Node[], edges: Edge[]) => {
adjList[node.id] = []
}

for (const edge of edges)
adjList[edge.source]?.push(edge.target)
for (const edge of edges) adjList[edge.source]?.push(edge.target)

for (let i = 0; i < nodes.length; i++) {
if (color[nodes[i].id] === WHITE)
@@ -87,20 +88,34 @@ const getCycleEdges = (nodes: Node[], edges: Edge[]) => {
}

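The WHITE/GRAY/BLACK constants in the diff above implement a standard three-color DFS: GRAY marks nodes on the current DFS path, so reaching a GRAY child means a back-edge, i.e. a cycle. A minimal standalone sketch of that idea (boolean result only — an assumption for brevity; the real `isCyclicUtil` additionally records the cycle path in `stack`):

```typescript
type AdjList = Record<string, string[]>

const WHITE = 'WHITE'
const GRAY = 'GRAY'
const BLACK = 'BLACK'

// DFS from one node; GRAY = on the current path, BLACK = fully explored.
const visit = (nodeId: string, color: Record<string, string>, adjList: AdjList): boolean => {
  color[nodeId] = GRAY
  for (const childId of adjList[nodeId] ?? []) {
    if (color[childId] === GRAY)
      return true // back-edge to an ancestor: cycle found
    if (color[childId] === WHITE && visit(childId, color, adjList))
      return true
  }
  color[nodeId] = BLACK
  return false
}

const hasCycle = (nodeIds: string[], adjList: AdjList): boolean => {
  const color: Record<string, string> = {}
  for (const id of nodeIds)
    color[id] = WHITE
  return nodeIds.some(id => color[id] === WHITE && visit(id, color, adjList))
}
```

Running `hasCycle` over every still-WHITE node mirrors the outer loop in `getCycleEdges`, which then uses the recorded stack to decide which edges to drop.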
export const preprocessNodesAndEdges = (nodes: Node[], edges: Edge[]) => {
const hasIterationNode = nodes.some(node => node.data.type === BlockEnum.Iteration)
const hasIterationNode = nodes.some(
node => node.data.type === BlockEnum.Iteration,
)
const hasLoopNode = nodes.some(node => node.data.type === BlockEnum.Loop)
const hasGroupNode = nodes.some(node => node.type === CUSTOM_GROUP_NODE)
const hasBusinessGroupNode = nodes.some(
node => node.data.type === BlockEnum.Group,
)

if (!hasIterationNode && !hasLoopNode) {
if (
!hasIterationNode
&& !hasLoopNode
&& !hasGroupNode
&& !hasBusinessGroupNode
) {
return {
nodes,
edges,
}
}

const nodesMap = nodes.reduce((prev, next) => {
prev[next.id] = next
return prev
}, {} as Record<string, Node>)
const nodesMap = nodes.reduce(
(prev, next) => {
prev[next.id] = next
return prev
},
{} as Record<string, Node>,
)

const iterationNodesWithStartNode = []
const iterationNodesWithoutStartNode = []
@@ -112,8 +127,12 @@ export const preprocessNodesAndEdges = (nodes: Node[], edges: Edge[]) => {

if (currentNode.data.type === BlockEnum.Iteration) {
if (currentNode.data.start_node_id) {
if (nodesMap[currentNode.data.start_node_id]?.type !== CUSTOM_ITERATION_START_NODE)
if (
nodesMap[currentNode.data.start_node_id]?.type
!== CUSTOM_ITERATION_START_NODE
) {
iterationNodesWithStartNode.push(currentNode)
}
}
else {
iterationNodesWithoutStartNode.push(currentNode)
@@ -122,8 +141,12 @@ export const preprocessNodesAndEdges = (nodes: Node[], edges: Edge[]) => {

if (currentNode.data.type === BlockEnum.Loop) {
if (currentNode.data.start_node_id) {
if (nodesMap[currentNode.data.start_node_id]?.type !== CUSTOM_LOOP_START_NODE)
if (
nodesMap[currentNode.data.start_node_id]?.type
!== CUSTOM_LOOP_START_NODE
) {
loopNodesWithStartNode.push(currentNode)
}
}
else {
loopNodesWithoutStartNode.push(currentNode)
@@ -132,7 +155,10 @@ export const preprocessNodesAndEdges = (nodes: Node[], edges: Edge[]) => {
}

const newIterationStartNodesMap = {} as Record<string, Node>
const newIterationStartNodes = [...iterationNodesWithStartNode, ...iterationNodesWithoutStartNode].map((iterationNode, index) => {
const newIterationStartNodes = [
...iterationNodesWithStartNode,
...iterationNodesWithoutStartNode,
].map((iterationNode, index) => {
const newNode = getIterationStartNode(iterationNode.id)
newNode.id = newNode.id + index
newIterationStartNodesMap[iterationNode.id] = newNode
@@ -140,24 +166,34 @@ export const preprocessNodesAndEdges = (nodes: Node[], edges: Edge[]) => {
})

const newLoopStartNodesMap = {} as Record<string, Node>
const newLoopStartNodes = [...loopNodesWithStartNode, ...loopNodesWithoutStartNode].map((loopNode, index) => {
const newLoopStartNodes = [
...loopNodesWithStartNode,
...loopNodesWithoutStartNode,
].map((loopNode, index) => {
const newNode = getLoopStartNode(loopNode.id)
newNode.id = newNode.id + index
newLoopStartNodesMap[loopNode.id] = newNode
return newNode
})

const newEdges = [...iterationNodesWithStartNode, ...loopNodesWithStartNode].map((nodeItem) => {
const newEdges = [
...iterationNodesWithStartNode,
...loopNodesWithStartNode,
].map((nodeItem) => {
const isIteration = nodeItem.data.type === BlockEnum.Iteration
const newNode = (isIteration ? newIterationStartNodesMap : newLoopStartNodesMap)[nodeItem.id]
const newNode = (
isIteration ? newIterationStartNodesMap : newLoopStartNodesMap
)[nodeItem.id]
const startNode = nodesMap[nodeItem.data.start_node_id]
const source = newNode.id
const sourceHandle = 'source'
const target = startNode.id
const targetHandle = 'target'

const parentNode = nodes.find(node => node.id === startNode.parentId) || null
const isInIteration = !!parentNode && parentNode.data.type === BlockEnum.Iteration
const parentNode
= nodes.find(node => node.id === startNode.parentId) || null
const isInIteration
= !!parentNode && parentNode.data.type === BlockEnum.Iteration
const isInLoop = !!parentNode && parentNode.data.type === BlockEnum.Loop

return {
@@ -180,21 +216,159 @@ export const preprocessNodesAndEdges = (nodes: Node[], edges: Edge[]) => {
}
})
nodes.forEach((node) => {
if (node.data.type === BlockEnum.Iteration && newIterationStartNodesMap[node.id])
(node.data as IterationNodeType).start_node_id = newIterationStartNodesMap[node.id].id
if (
node.data.type === BlockEnum.Iteration
&& newIterationStartNodesMap[node.id]
) {
(node.data as IterationNodeType).start_node_id
= newIterationStartNodesMap[node.id].id
}

if (node.data.type === BlockEnum.Loop && newLoopStartNodesMap[node.id])
(node.data as LoopNodeType).start_node_id = newLoopStartNodesMap[node.id].id
if (node.data.type === BlockEnum.Loop && newLoopStartNodesMap[node.id]) {
(node.data as LoopNodeType).start_node_id
= newLoopStartNodesMap[node.id].id
}
})

// Derive Group internal edges (input → entries, leaves → exits)
const groupInternalEdges: Edge[] = []
const groupNodes = nodes.filter(node => node.type === CUSTOM_GROUP_NODE)

for (const groupNode of groupNodes) {
const groupData = groupNode.data as unknown as CustomGroupNodeData
const { group } = groupData

if (!group)
continue

const { inputNodeId, entryNodeIds, exitPorts } = group

// Derive edges: input → each entry node
for (const entryId of entryNodeIds) {
const entryNode = nodesMap[entryId]
if (entryNode) {
groupInternalEdges.push({
id: `group-internal-${inputNodeId}-source-${entryId}-target`,
type: 'custom',
source: inputNodeId,
sourceHandle: 'source',
target: entryId,
targetHandle: 'target',
data: {
sourceType: '' as any, // Group input has empty type
targetType: entryNode.data.type,
_isGroupInternal: true,
_groupId: groupNode.id,
},
zIndex: GROUP_CHILDREN_Z_INDEX,
} as Edge)
}
}

// Derive edges: each leaf node → exit port
for (const exitPort of exitPorts) {
const leafNode = nodesMap[exitPort.leafNodeId]
if (leafNode) {
groupInternalEdges.push({
id: `group-internal-${exitPort.leafNodeId}-${exitPort.sourceHandle}-${exitPort.portNodeId}-target`,
type: 'custom',
source: exitPort.leafNodeId,
sourceHandle: exitPort.sourceHandle,
target: exitPort.portNodeId,
targetHandle: 'target',
data: {
sourceType: leafNode.data.type,
targetType: '' as string, // Exit port has empty type
_isGroupInternal: true,
_groupId: groupNode.id,
},
zIndex: GROUP_CHILDREN_Z_INDEX,
} as Edge)
}
}
}

// Rebuild isTemp edges for business Group nodes (BlockEnum.Group)
// These edges connect the group node to external nodes for visual display
const groupTempEdges: Edge[] = []
const inboundEdgeIds = new Set<string>()

nodes.forEach((groupNode) => {
if (groupNode.data.type !== BlockEnum.Group)
return

const groupData = groupNode.data as GroupNodeData
const {
members = [],
headNodeIds = [],
leafNodeIds = [],
handlers = [],
} = groupData
const memberSet = new Set(members.map(m => m.id))
const headSet = new Set(headNodeIds)
const leafSet = new Set(leafNodeIds)

edges.forEach((edge) => {
// Inbound edge: source outside group, target is a head node
// Use Set to dedupe since multiple head nodes may share same external source
if (!memberSet.has(edge.source) && headSet.has(edge.target)) {
const sourceHandle = edge.sourceHandle || 'source'
const edgeId = `${edge.source}-${sourceHandle}-${groupNode.id}-target`
if (!inboundEdgeIds.has(edgeId)) {
inboundEdgeIds.add(edgeId)
groupTempEdges.push({
id: edgeId,
type: 'custom',
source: edge.source,
sourceHandle,
target: groupNode.id,
targetHandle: 'target',
data: {
sourceType: edge.data?.sourceType,
targetType: BlockEnum.Group,
_isTemp: true,
},
} as Edge)
}
}

// Outbound edge: source is a leaf node, target outside group
if (leafSet.has(edge.source) && !memberSet.has(edge.target)) {
const edgeSourceHandle = edge.sourceHandle || 'source'
const handler = handlers.find(
h =>
h.nodeId === edge.source && h.sourceHandle === edgeSourceHandle,
)
if (handler) {
groupTempEdges.push({
id: `${groupNode.id}-${handler.id}-${edge.target}-${edge.targetHandle}`,
type: 'custom',
source: groupNode.id,
sourceHandle: handler.id,
target: edge.target!,
targetHandle: edge.targetHandle,
data: {
sourceType: BlockEnum.Group,
targetType: edge.data?.targetType,
_isTemp: true,
},
} as Edge)
}
}
})
})

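The inbound-edge dedupe in the block above exists because several head nodes can share one external source, and the canvas should show a single edge into the group per (source, sourceHandle) pair. A sketch of just that dedupe, with illustrative names and edges reduced to the fields the logic reads:

```typescript
type InboundEdge = { source: string, sourceHandle?: string, target: string }

// One group-level edge id per (external source, sourceHandle) pair.
const inboundGroupEdgeIds = (
  groupId: string,
  memberIds: string[],
  headNodeIds: string[],
  edges: InboundEdge[],
): string[] => {
  const memberSet = new Set(memberIds)
  const headSet = new Set(headNodeIds)
  const seen = new Set<string>()
  for (const edge of edges) {
    // Only edges entering the group from outside count.
    if (memberSet.has(edge.source) || !headSet.has(edge.target))
      continue
    const sourceHandle = edge.sourceHandle || 'source'
    seen.add(`${edge.source}-${sourceHandle}-${groupId}-target`)
  }
  return Array.from(seen)
}
```

Because the id encodes the handle, two branches of the same external node (e.g. an If-Else true/false pair) still produce two distinct group-level edges, while two head nodes fed by the same branch collapse into one.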
return {
nodes: [...nodes, ...newIterationStartNodes, ...newLoopStartNodes],
edges: [...edges, ...newEdges],
edges: [...edges, ...newEdges, ...groupInternalEdges, ...groupTempEdges],
}
}

export const initialNodes = (originNodes: Node[], originEdges: Edge[]) => {
const { nodes, edges } = preprocessNodesAndEdges(cloneDeep(originNodes), cloneDeep(originEdges))
const { nodes, edges } = preprocessNodesAndEdges(
cloneDeep(originNodes),
cloneDeep(originEdges),
)
const firstNode = nodes[0]

if (!firstNode?.position) {
@@ -206,23 +380,35 @@ export const initialNodes = (originNodes: Node[], originEdges: Edge[]) => {
})
}

const iterationOrLoopNodeMap = nodes.reduce((acc, node) => {
if (node.parentId) {
if (acc[node.parentId])
acc[node.parentId].push({ nodeId: node.id, nodeType: node.data.type })
else
acc[node.parentId] = [{ nodeId: node.id, nodeType: node.data.type }]
}
return acc
}, {} as Record<string, { nodeId: string, nodeType: BlockEnum }[]>)
const iterationOrLoopNodeMap = nodes.reduce(
(acc, node) => {
if (node.parentId) {
if (acc[node.parentId]) {
acc[node.parentId].push({
nodeId: node.id,
nodeType: node.data.type,
})
}
else {
acc[node.parentId] = [{ nodeId: node.id, nodeType: node.data.type }]
}
}
return acc
},
{} as Record<string, { nodeId: string, nodeType: BlockEnum }[]>,
)

return nodes.map((node) => {
if (!node.type)
node.type = CUSTOM_NODE

const connectedEdges = getConnectedEdges([node], edges)
node.data._connectedSourceHandleIds = connectedEdges.filter(edge => edge.source === node.id).map(edge => edge.sourceHandle || 'source')
node.data._connectedTargetHandleIds = connectedEdges.filter(edge => edge.target === node.id).map(edge => edge.targetHandle || 'target')
node.data._connectedSourceHandleIds = connectedEdges
.filter(edge => edge.source === node.id)
.map(edge => edge.sourceHandle || 'source')
node.data._connectedTargetHandleIds = connectedEdges
.filter(edge => edge.target === node.id)
.map(edge => edge.targetHandle || 'target')

if (node.data.type === BlockEnum.IfElse) {
const nodeData = node.data as IfElseNodeType
@@ -237,49 +423,86 @@ export const initialNodes = (originNodes: Node[], originEdges: Edge[]) => {
]
}
node.data._targetBranches = branchNameCorrect([
...(node.data as IfElseNodeType).cases.map(item => ({ id: item.case_id, name: '' })),
...(node.data as IfElseNodeType).cases.map(item => ({
id: item.case_id,
name: '',
})),
{ id: 'false', name: '' },
])
// delete conditions and logical_operator if cases is not empty
if (nodeData.cases.length > 0 && nodeData.conditions && nodeData.logical_operator) {
if (
nodeData.cases.length > 0
&& nodeData.conditions
&& nodeData.logical_operator
) {
delete nodeData.conditions
delete nodeData.logical_operator
}
}

if (node.data.type === BlockEnum.QuestionClassifier) {
node.data._targetBranches = (node.data as QuestionClassifierNodeType).classes.map((topic) => {
node.data._targetBranches = (
node.data as QuestionClassifierNodeType
).classes.map((topic) => {
return topic
})
}

if (node.data.type === BlockEnum.Group) {
const groupData = node.data as GroupNodeData
if (groupData.handlers?.length) {
node.data._targetBranches = groupData.handlers.map(handler => ({
id: handler.id,
name: handler.label || handler.id,
}))
}
}

if (node.data.type === BlockEnum.Iteration) {
const iterationNodeData = node.data as IterationNodeType
iterationNodeData._children = iterationOrLoopNodeMap[node.id] || []
iterationNodeData.is_parallel = iterationNodeData.is_parallel || false
iterationNodeData.parallel_nums = iterationNodeData.parallel_nums || 10
iterationNodeData.error_handle_mode = iterationNodeData.error_handle_mode || ErrorHandleMode.Terminated
iterationNodeData.error_handle_mode
= iterationNodeData.error_handle_mode || ErrorHandleMode.Terminated
}

// TODO: loop error handle mode
if (node.data.type === BlockEnum.Loop) {
const loopNodeData = node.data as LoopNodeType
loopNodeData._children = iterationOrLoopNodeMap[node.id] || []
loopNodeData.error_handle_mode = loopNodeData.error_handle_mode || ErrorHandleMode.Terminated
loopNodeData.error_handle_mode
= loopNodeData.error_handle_mode || ErrorHandleMode.Terminated
}

// legacy provider handle
if (node.data.type === BlockEnum.LLM)
(node as any).data.model.provider = correctModelProvider((node as any).data.model.provider)
if (node.data.type === BlockEnum.LLM) {
(node as any).data.model.provider = correctModelProvider(
(node as any).data.model.provider,
)
}

if (node.data.type === BlockEnum.KnowledgeRetrieval && (node as any).data.multiple_retrieval_config?.reranking_model)
(node as any).data.multiple_retrieval_config.reranking_model.provider = correctModelProvider((node as any).data.multiple_retrieval_config?.reranking_model.provider)
if (
node.data.type === BlockEnum.KnowledgeRetrieval
&& (node as any).data.multiple_retrieval_config?.reranking_model
) {
(node as any).data.multiple_retrieval_config.reranking_model.provider
= correctModelProvider(
(node as any).data.multiple_retrieval_config?.reranking_model.provider,
)
}

if (node.data.type === BlockEnum.QuestionClassifier)
(node as any).data.model.provider = correctModelProvider((node as any).data.model.provider)
if (node.data.type === BlockEnum.QuestionClassifier) {
(node as any).data.model.provider = correctModelProvider(
(node as any).data.model.provider,
)
}

if (node.data.type === BlockEnum.ParameterExtractor)
(node as any).data.model.provider = correctModelProvider((node as any).data.model.provider)
if (node.data.type === BlockEnum.ParameterExtractor) {
(node as any).data.model.provider = correctModelProvider(
(node as any).data.model.provider,
)
}

if (node.data.type === BlockEnum.HttpRequest && !node.data.retry_config) {
node.data.retry_config = {
@@ -289,14 +512,21 @@ export const initialNodes = (originNodes: Node[], originEdges: Edge[]) => {
}
}

if (node.data.type === BlockEnum.Tool && !(node as Node<ToolNodeType>).data.version && !(node as Node<ToolNodeType>).data.tool_node_version) {
if (
node.data.type === BlockEnum.Tool
&& !(node as Node<ToolNodeType>).data.version
&& !(node as Node<ToolNodeType>).data.tool_node_version
) {
(node as Node<ToolNodeType>).data.tool_node_version = '2'

const toolConfigurations = (node as Node<ToolNodeType>).data.tool_configurations
if (toolConfigurations && Object.keys(toolConfigurations).length > 0) {
const newValues = { ...toolConfigurations }
Object.keys(toolConfigurations).forEach((key) => {
if (typeof toolConfigurations[key] !== 'object' || toolConfigurations[key] === null) {
if (
typeof toolConfigurations[key] !== 'object'
|| toolConfigurations[key] === null
) {
newValues[key] = {
type: 'constant',
value: toolConfigurations[key],
@@ -312,50 +542,62 @@ export const initialNodes = (originNodes: Node[], originEdges: Edge[]) => {
}

export const initialEdges = (originEdges: Edge[], originNodes: Node[]) => {
const { nodes, edges } = preprocessNodesAndEdges(cloneDeep(originNodes), cloneDeep(originEdges))
const { nodes, edges } = preprocessNodesAndEdges(
cloneDeep(originNodes),
cloneDeep(originEdges),
)
let selectedNode: Node | null = null
const nodesMap = nodes.reduce((acc, node) => {
acc[node.id] = node
const nodesMap = nodes.reduce(
(acc, node) => {
acc[node.id] = node

if (node.data?.selected)
selectedNode = node
if (node.data?.selected)
selectedNode = node

return acc
}, {} as Record<string, Node>)
return acc
},
{} as Record<string, Node>,
)

const cycleEdges = getCycleEdges(nodes, edges)
return edges.filter((edge) => {
return !cycleEdges.find(cycEdge => cycEdge.source === edge.source && cycEdge.target === edge.target)
}).map((edge) => {
edge.type = 'custom'
return edges
.filter((edge) => {
return !cycleEdges.find(
cycEdge =>
cycEdge.source === edge.source && cycEdge.target === edge.target,
)
})
.map((edge) => {
edge.type = 'custom'

if (!edge.sourceHandle)
edge.sourceHandle = 'source'
if (!edge.sourceHandle)
edge.sourceHandle = 'source'

if (!edge.targetHandle)
edge.targetHandle = 'target'
if (!edge.targetHandle)
edge.targetHandle = 'target'

if (!edge.data?.sourceType && edge.source && nodesMap[edge.source]) {
edge.data = {
...edge.data,
sourceType: nodesMap[edge.source].data.type!,
} as any
}
if (!edge.data?.sourceType && edge.source && nodesMap[edge.source]) {
edge.data = {
...edge.data,
sourceType: nodesMap[edge.source].data.type!,
} as any
}

if (!edge.data?.targetType && edge.target && nodesMap[edge.target]) {
edge.data = {
...edge.data,
targetType: nodesMap[edge.target].data.type!,
} as any
}
if (!edge.data?.targetType && edge.target && nodesMap[edge.target]) {
edge.data = {
...edge.data,
targetType: nodesMap[edge.target].data.type!,
} as any
}

if (selectedNode) {
edge.data = {
...edge.data,
_connectedNodeIsSelected: edge.source === selectedNode.id || edge.target === selectedNode.id,
} as any
}
if (selectedNode) {
edge.data = {
...edge.data,
_connectedNodeIsSelected:
edge.source === selectedNode.id || edge.target === selectedNode.id,
} as any
}

return edge
})
return edge
})
}

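The per-edge defaults applied in the `.map()` above (fall back to `'source'`/`'target'` handles, backfill endpoint types from the node map) can be summarized as one small pure function. A sketch with reduced node/edge shapes — `SlimNode`/`SlimEdge`/`normalizeEdge` are illustrative names, not the real types:

```typescript
type SlimNode = { id: string, type: string }
type SlimEdge = {
  source: string
  target: string
  sourceHandle?: string
  targetHandle?: string
  data?: { sourceType?: string, targetType?: string }
}

// Fill missing handles and backfill endpoint types from the node map.
const normalizeEdge = (edge: SlimEdge, nodesMap: Record<string, SlimNode>): SlimEdge => {
  const data = { ...edge.data }
  if (!data.sourceType && nodesMap[edge.source])
    data.sourceType = nodesMap[edge.source].type
  if (!data.targetType && nodesMap[edge.target])
    data.targetType = nodesMap[edge.target].type
  return {
    ...edge,
    sourceHandle: edge.sourceHandle || 'source',
    targetHandle: edge.targetHandle || 'target',
    data,
  }
}
```

Edges persisted by older versions of the app may lack handles or endpoint types, which is why this normalization runs on every load rather than only at edge creation.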
@@ -157,6 +157,95 @@ export const getValidTreeNodes = (nodes: Node[], edges: Edge[]) => {
}
}

export const getCommonPredecessorNodeIds = (selectedNodeIds: string[], edges: Edge[]) => {
const uniqSelectedNodeIds = Array.from(new Set(selectedNodeIds))
if (uniqSelectedNodeIds.length <= 1)
return []

const selectedNodeIdSet = new Set(uniqSelectedNodeIds)
const predecessorNodeIdsMap = new Map<string, Set<string>>()

edges.forEach((edge) => {
if (!selectedNodeIdSet.has(edge.target))
return

const predecessors = predecessorNodeIdsMap.get(edge.target) ?? new Set<string>()
predecessors.add(edge.source)
predecessorNodeIdsMap.set(edge.target, predecessors)
})

let commonPredecessorNodeIds: Set<string> | null = null

uniqSelectedNodeIds.forEach((nodeId) => {
const predecessors = predecessorNodeIdsMap.get(nodeId) ?? new Set<string>()

if (!commonPredecessorNodeIds) {
commonPredecessorNodeIds = new Set(predecessors)
return
}

Array.from(commonPredecessorNodeIds).forEach((predecessorNodeId) => {
if (!predecessors.has(predecessorNodeId))
commonPredecessorNodeIds!.delete(predecessorNodeId)
})
})

return Array.from(commonPredecessorNodeIds ?? []).sort()
}

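`getCommonPredecessorNodeIds` intersects the direct-predecessor sets of every selected node: a node qualifies only if it has an edge into each selection member. A standalone sketch of the same intersection, with edges reduced to `source`/`target` (all the algorithm reads) and illustrative names:

```typescript
type SimpleEdge = { source: string, target: string }

const commonPredecessors = (selectedNodeIds: string[], edges: SimpleEdge[]): string[] => {
  const selected = new Set(selectedNodeIds)
  if (selected.size <= 1)
    return []
  // target -> set of direct predecessor node ids
  const predsOf = new Map<string, Set<string>>()
  for (const edge of edges) {
    if (!selected.has(edge.target))
      continue
    const preds = predsOf.get(edge.target) ?? new Set<string>()
    preds.add(edge.source)
    predsOf.set(edge.target, preds)
  }
  // Intersect the predecessor sets of every selected node.
  let common: Set<string> | null = null
  for (const nodeId of selected) {
    const preds = predsOf.get(nodeId) ?? new Set<string>()
    common = common === null
      ? new Set(preds)
      : new Set(Array.from(common).filter(id => preds.has(id)))
  }
  return Array.from(common ?? []).sort()
}
```

Note the `?? new Set<string>()` fallback: a selected node with no incoming edges contributes an empty set, which makes the whole intersection empty, matching the source's behavior.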
export type PredecessorHandle = {
nodeId: string
handleId: string
}

export const getCommonPredecessorHandles = (targetNodeIds: string[], edges: Edge[]): PredecessorHandle[] => {
const uniqTargetNodeIds = Array.from(new Set(targetNodeIds))
if (uniqTargetNodeIds.length === 0)
return []

// Get the "direct predecessor handler", which is:
// - edge.source (predecessor node)
// - edge.sourceHandle (the specific output handle of the predecessor; defaults to 'source' if not set)
// Used to handle multi-handle branch scenarios like If-Else / Classifier.
const targetNodeIdSet = new Set(uniqTargetNodeIds)
const predecessorHandleMap = new Map<string, Set<string>>() // targetNodeId -> Set<`${source}\0${handleId}`>
const delimiter = '\u0000'

edges.forEach((edge) => {
if (!targetNodeIdSet.has(edge.target))
return

const predecessors = predecessorHandleMap.get(edge.target) ?? new Set<string>()
const handleId = edge.sourceHandle || 'source'
predecessors.add(`${edge.source}${delimiter}${handleId}`)
predecessorHandleMap.set(edge.target, predecessors)
})

// Intersect predecessor handlers of all targets, keeping only handlers common to all targets.
let commonKeys: Set<string> | null = null

uniqTargetNodeIds.forEach((nodeId) => {
const keys = predecessorHandleMap.get(nodeId) ?? new Set<string>()

if (!commonKeys) {
commonKeys = new Set(keys)
return
}

Array.from(commonKeys).forEach((key) => {
if (!keys.has(key))
commonKeys!.delete(key)
})
})

return Array.from<string>(commonKeys ?? [])
.map((key) => {
const [nodeId, handleId] = key.split(delimiter)
return { nodeId, handleId }
})
.sort((a, b) => a.nodeId.localeCompare(b.nodeId) || a.handleId.localeCompare(b.handleId))
}

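The handle-level variant joins source id and handle id with the `'\u0000'` delimiter so that different output branches of the same node (If-Else true/false, classifier classes) intersect independently. A standalone sketch of that intersection; names and the reduced `BranchEdge` shape are illustrative:

```typescript
type BranchEdge = { source: string, sourceHandle?: string, target: string }
type HandleRef = { nodeId: string, handleId: string }

const DELIMITER = '\u0000'

const commonPredecessorHandles = (targetNodeIds: string[], edges: BranchEdge[]): HandleRef[] => {
  const targets = new Set(targetNodeIds)
  // target -> set of "source\0handle" keys
  const keysOf = new Map<string, Set<string>>()
  for (const edge of edges) {
    if (!targets.has(edge.target))
      continue
    const keys = keysOf.get(edge.target) ?? new Set<string>()
    keys.add(`${edge.source}${DELIMITER}${edge.sourceHandle || 'source'}`)
    keysOf.set(edge.target, keys)
  }
  // Keep only keys shared by every target.
  let common: Set<string> | null = null
  for (const nodeId of targets) {
    const keys = keysOf.get(nodeId) ?? new Set<string>()
    common = common === null
      ? new Set(keys)
      : new Set(Array.from(common).filter(key => keys.has(key)))
  }
  return Array.from(common ?? []).map((key) => {
    const [nodeId, handleId] = key.split(DELIMITER)
    return { nodeId, handleId }
  })
}
```

The NUL delimiter is a safe join character here because node and handle ids never contain `'\u0000'`, so splitting the key recovers the original pair unambiguously.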
export const changeNodesAndEdgesId = (nodes: Node[], edges: Edge[]) => {
const idMap = nodes.reduce((acc, node) => {
acc[node.id] = uuid4()

@@ -7,6 +7,7 @@
"blocks.datasource-empty": "Empty Data Source",
"blocks.document-extractor": "Doc Extractor",
"blocks.end": "Output",
"blocks.group": "Group",
"blocks.http-request": "HTTP Request",
"blocks.if-else": "IF/ELSE",
"blocks.iteration": "Iteration",
@@ -37,6 +38,7 @@
"blocksAbout.datasource-empty": "Empty Data Source placeholder",
"blocksAbout.document-extractor": "Used to parse uploaded documents into text content that is easily understandable by LLM.",
"blocksAbout.end": "Define the output and result type of a workflow",
"blocksAbout.group": "Group multiple nodes together for better organization",
"blocksAbout.http-request": "Allow server requests to be sent over the HTTP protocol",
"blocksAbout.if-else": "Allows you to split the workflow into two branches based on if/else conditions",
"blocksAbout.iteration": "Perform multiple steps on a list object until all results are outputted.",
@@ -171,6 +173,7 @@
"common.needConnectTip": "This step is not connected to anything",
"common.needOutputNode": "The Output node must be added",
"common.needStartNode": "At least one start node must be added",
"common.noAgentNodes": "No agent nodes available",
"common.noHistory": "No History",
"common.noVar": "No variable",
"common.notRunning": "Not running yet",
@@ -202,6 +205,7 @@
"common.runApp": "Run App",
"common.runHistory": "Run History",
"common.running": "Running",
"common.searchAgent": "Search agent",
"common.searchVar": "Search variable",
"common.setVarValuePlaceholder": "Set variable",
"common.showRunHistory": "Show Run History",
@@ -213,6 +217,7 @@
"common.variableNamePlaceholder": "Variable name",
"common.versionHistory": "Version History",
"common.viewDetailInTracingPanel": "View details",
"common.viewInternals": "View internals",
"common.viewOnly": "View Only",
"common.viewRunHistory": "View run history",
"common.workflowAsTool": "Workflow as Tool",
@@ -762,6 +767,7 @@
"nodes.tool.inputVars": "Input Variables",
"nodes.tool.insertPlaceholder1": "Type or press",
"nodes.tool.insertPlaceholder2": "insert variable",
"nodes.tool.insertPlaceholder3": "add agent",
"nodes.tool.outputVars.files.title": "tool generated files",
"nodes.tool.outputVars.files.transfer_method": "Transfer method.Value is remote_url or local_file",
"nodes.tool.outputVars.files.type": "Support type. Now only support image",
@@ -936,6 +942,7 @@
"operator.distributeHorizontal": "Space Horizontally",
"operator.distributeVertical": "Space Vertically",
"operator.horizontal": "Horizontal",
"operator.makeGroup": "Make Group",
"operator.selectionAlignment": "Selection Alignment",
"operator.vertical": "Vertical",
"operator.zoomIn": "Zoom In",
@@ -964,6 +971,7 @@
"panel.scrollToSelectedNode": "Scroll to selected node",
"panel.selectNextStep": "Select Next Step",
"panel.startNode": "Start Node",
"panel.ungroup": "Ungroup",
"panel.userInputField": "User Input Field",
"publishLimit.startNodeDesc": "You’ve reached the limit of 2 triggers per workflow for this plan. Upgrade to publish this workflow.",
"publishLimit.startNodeTitlePrefix": "Upgrade to",

web/i18n/en-US/workflow.ts (Normal file, 1298 lines) — file diff suppressed because it is too large
@@ -171,6 +171,7 @@
"common.needConnectTip": "接続されていないステップがあります",
"common.needOutputNode": "出力ノードを追加する必要があります",
"common.needStartNode": "少なくとも1つのスタートノードを追加する必要があります",
"common.noAgentNodes": "利用可能なエージェントノードがありません",
"common.noHistory": "履歴がありません",
"common.noVar": "変数がありません",
"common.notRunning": "まだ実行されていません",
@@ -202,6 +203,7 @@
"common.runApp": "アプリを実行",
"common.runHistory": "実行履歴",
"common.running": "実行中",
"common.searchAgent": "エージェントを検索",
"common.searchVar": "変数を検索",
"common.setVarValuePlaceholder": "変数値を設定",
"common.showRunHistory": "実行履歴を表示",
@@ -213,6 +215,7 @@
"common.variableNamePlaceholder": "変数名を入力",
"common.versionHistory": "バージョン履歴",
"common.viewDetailInTracingPanel": "詳細を表示",
"common.viewInternals": "内部を表示",
"common.viewOnly": "閲覧のみ",
"common.viewRunHistory": "実行履歴を表示",
"common.workflowAsTool": "ワークフローをツールとして公開する",
@@ -762,6 +765,7 @@
"nodes.tool.inputVars": "入力変数",
"nodes.tool.insertPlaceholder1": "タイプするか押してください",
"nodes.tool.insertPlaceholder2": "変数を挿入する",
"nodes.tool.insertPlaceholder3": "エージェントを追加",
"nodes.tool.outputVars.files.title": "ツールが生成したファイル",
"nodes.tool.outputVars.files.transfer_method": "転送方法。値は remote_url または local_file です",
"nodes.tool.outputVars.files.type": "サポートタイプ。現在は画像のみサポートされています",
@@ -964,6 +968,7 @@
"panel.scrollToSelectedNode": "選択したノードまでスクロール",
"panel.selectNextStep": "次ノード選択",
"panel.startNode": "開始ノード",
"panel.ungroup": "グループ解除",
"panel.userInputField": "ユーザー入力欄",
"publishLimit.startNodeDesc": "このプランでは、各ワークフローのトリガー数は最大 2 個まで設定できます。公開するにはアップグレードが必要です。",
"publishLimit.startNodeTitlePrefix": "アップグレードして、",

web/i18n/ja-JP/workflow.ts (Normal file, 1298 lines) — file diff suppressed because it is too large
@@ -171,6 +171,7 @@
"common.needConnectTip": "此节点尚未连接到其他节点",
"common.needOutputNode": "必须添加输出节点",
"common.needStartNode": "必须添加至少一个开始节点",
"common.noAgentNodes": "没有可用的代理节点",
"common.noHistory": "没有历史版本",
"common.noVar": "没有变量",
"common.notRunning": "尚未运行",
@@ -202,6 +203,7 @@
"common.runApp": "运行",
"common.runHistory": "运行历史",
"common.running": "运行中",
"common.searchAgent": "搜索代理",
"common.searchVar": "搜索变量",
"common.setVarValuePlaceholder": "设置变量值",
"common.showRunHistory": "显示运行历史",
@@ -213,6 +215,7 @@
"common.variableNamePlaceholder": "变量名",
"common.versionHistory": "版本历史",
"common.viewDetailInTracingPanel": "查看详细信息",
"common.viewInternals": "查看内部",
"common.viewOnly": "只读",
"common.viewRunHistory": "查看运行历史",
"common.workflowAsTool": "发布为工具",
@@ -762,6 +765,7 @@
"nodes.tool.inputVars": "输入变量",
"nodes.tool.insertPlaceholder1": "键入",
"nodes.tool.insertPlaceholder2": "插入变量",
"nodes.tool.insertPlaceholder3": "添加代理",
"nodes.tool.outputVars.files.title": "工具生成的文件",
"nodes.tool.outputVars.files.transfer_method": "传输方式。值为 remote_url 或 local_file",
"nodes.tool.outputVars.files.type": "支持类型。现在只支持图片",
@@ -936,6 +940,7 @@
"operator.distributeHorizontal": "水平等间距",
"operator.distributeVertical": "垂直等间距",
"operator.horizontal": "水平方向",
"operator.makeGroup": "建立群组",
"operator.selectionAlignment": "选择对齐",
"operator.vertical": "垂直方向",
"operator.zoomIn": "放大",
@@ -964,6 +969,7 @@
"panel.scrollToSelectedNode": "滚动至选中节点",
"panel.selectNextStep": "选择下一个节点",
"panel.startNode": "开始节点",
"panel.ungroup": "取消编组",
"panel.userInputField": "用户输入字段",
"publishLimit.startNodeDesc": "您已达到此计划上每个工作流最多 2 个触发器的限制。请升级后再发布此工作流。",
"publishLimit.startNodeTitlePrefix": "升级以",

1298
web/i18n/zh-Hans/workflow.ts
Normal file
1298
web/i18n/zh-Hans/workflow.ts
Normal file
File diff suppressed because it is too large
Load Diff
@@ -171,6 +171,7 @@
"common.needConnectTip": "此節點尚未連接到其他節點",
"common.needOutputNode": "必須新增輸出節點",
"common.needStartNode": "至少必須新增一個起始節點",
"common.noAgentNodes": "沒有可用的代理節點",
"common.noHistory": "無歷史記錄",
"common.noVar": "沒有變數",
"common.notRunning": "尚未運行",
@@ -202,6 +203,7 @@
"common.runApp": "運行",
"common.runHistory": "運行歷史",
"common.running": "運行中",
"common.searchAgent": "搜索代理",
"common.searchVar": "搜索變數",
"common.setVarValuePlaceholder": "設置變數值",
"common.showRunHistory": "顯示運行歷史",
@@ -213,6 +215,7 @@
"common.variableNamePlaceholder": "變數名",
"common.versionHistory": "版本歷史",
"common.viewDetailInTracingPanel": "查看詳細信息",
"common.viewInternals": "查看內部",
"common.viewOnly": "只讀",
"common.viewRunHistory": "查看運行歷史",
"common.workflowAsTool": "發佈為工具",
@@ -762,6 +765,7 @@
"nodes.tool.inputVars": "輸入變數",
"nodes.tool.insertPlaceholder1": "輸入或按壓",
"nodes.tool.insertPlaceholder2": "插入變數",
"nodes.tool.insertPlaceholder3": "添加代理",
"nodes.tool.outputVars.files.title": "工具生成的文件",
"nodes.tool.outputVars.files.transfer_method": "傳輸方式。值為 remote_url 或 local_file",
"nodes.tool.outputVars.files.type": "支持類型。現在只支持圖片",
@@ -964,6 +968,7 @@
"panel.scrollToSelectedNode": "捲動至選取的節點",
"panel.selectNextStep": "選擇下一個節點",
"panel.startNode": "起始節點",
"panel.ungroup": "取消群組",
"panel.userInputField": "用戶輸入字段",
"publishLimit.startNodeDesc": "目前方案最多允許 2 個開始節點,升級後才能發布此工作流程。",
"publishLimit.startNodeTitlePrefix": "升級以",

web/i18n/zh-Hant/workflow.ts (1298 lines, Normal file; file diff suppressed because it is too large)
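The hunks above add flattened, dot-notation keys (e.g. `"common.needStartNode"`, `"publishLimit.startNodeDesc"`) to each locale's `workflow.ts`. As a minimal sketch of how such flat keys are typically resolved against a locale dictionary with an English fallback (the names `translate`, `zhHans`, and `enUS` are illustrative, not Dify's actual i18n API):

```typescript
// Locale dictionaries store keys flat ("section.name"), matching the diff above.
type LocaleDict = Record<string, string>;

const zhHans: LocaleDict = {
  "common.needStartNode": "必须添加至少一个开始节点",
};

const enUS: LocaleDict = {
  "common.needStartNode": "At least one start node is required",
  "common.noHistory": "No history",
};

function translate(key: string, locale: LocaleDict, fallback: LocaleDict = enUS): string {
  // Flat keys mean lookup is a direct index; fall back to English,
  // then to the key itself, when a translation is missing.
  return locale[key] ?? fallback[key] ?? key;
}

console.log(translate("common.needStartNode", zhHans)); // zh-Hans string
console.log(translate("common.noHistory", zhHans));     // falls back to en-US
```

Storing keys flat keeps each locale file a single object literal, which makes diffs like the ones above line-per-key and easy to review.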