Compare commits

...

8 Commits

Author SHA1 Message Date
Renzo
364d7ebc40 refactor: core/tools, agent, callback_handler, encrypter, llm_generator, plugin, inner_api (#34205)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Asuka Minato <i@asukaminato.eu.org>
2026-03-28 10:14:43 +00:00
YBoy
7cc81e9a43 test: migrate workspace service tests to testcontainers (#34218) 2026-03-28 07:50:26 +00:00
YBoy
3409c519e2 test: migrate tag service tests to testcontainers (#34219) 2026-03-28 07:49:27 +00:00
YBoy
5851b42af3 test: migrate metadata service tests to testcontainers (#34220) 2026-03-28 07:48:48 +00:00
Maa-Lee | odeili
c5eae67ac9 refactor: use select for API key auth lookups (#34146)
Co-authored-by: Asuka Minato <i@asukaminato.eu.org>
2026-03-28 00:01:05 +00:00
YBoy
865ee473ce test: migrate messages clean service retention tests to testcontainers (#34207) 2026-03-27 22:55:11 +00:00
dependabot[bot]
08e8145975 chore(deps): bump cryptography from 44.0.3 to 46.0.6 in /api (#34210)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-03-27 22:53:01 +00:00
tmimmanuel
ec0f20de03 refactor: use EnumText for prompt_type and customize_token_strategy (#34204) 2026-03-27 22:29:38 +00:00
30 changed files with 524 additions and 2950 deletions

View File

@@ -8,6 +8,7 @@ Go admin-api caller.
from flask import request
from flask_restx import Resource
from pydantic import BaseModel, Field
from sqlalchemy import select
from sqlalchemy.orm import Session
from controllers.common.schema import register_schema_model
@@ -87,7 +88,7 @@ class EnterpriseAppDSLExport(Resource):
"""Export an app's DSL as YAML."""
include_secret = request.args.get("include_secret", "false").lower() == "true"
app_model = db.session.query(App).filter_by(id=app_id).first()
app_model = db.session.get(App, app_id)
if not app_model:
return {"message": "app not found"}, 404
@@ -104,7 +105,7 @@ def _get_active_account(email: str) -> Account | None:
Workspace membership is already validated by the Go admin-api caller.
"""
account = db.session.query(Account).filter_by(email=email).first()
account = db.session.scalar(select(Account).where(Account.email == email).limit(1))
if account is None or account.status != AccountStatus.ACTIVE:
return None
return account

View File

@@ -18,7 +18,7 @@ from graphon.model_runtime.entities import (
from graphon.model_runtime.entities.message_entities import ImagePromptMessageContent, PromptMessageContentUnionTypes
from graphon.model_runtime.entities.model_entities import ModelFeature
from graphon.model_runtime.model_providers.__base.large_language_model import LargeLanguageModel
from sqlalchemy import select
from sqlalchemy import func, select
from core.agent.entities import AgentEntity, AgentToolEntity
from core.app.app_config.features.file_upload.manager import FileUploadConfigManager
@@ -104,11 +104,14 @@ class BaseAgentRunner(AppRunner):
)
# get how many agent thoughts have been created
self.agent_thought_count = (
db.session.query(MessageAgentThought)
.where(
MessageAgentThought.message_id == self.message.id,
db.session.scalar(
select(func.count())
.select_from(MessageAgentThought)
.where(
MessageAgentThought.message_id == self.message.id,
)
)
.count()
or 0
)
db.session.close()

View File

@@ -1,7 +1,7 @@
import logging
from collections.abc import Sequence
from sqlalchemy import select
from sqlalchemy import select, update
from core.app.apps.base_app_queue_manager import AppQueueManager, PublishFrom
from core.app.entities.app_invoke_entities import InvokeFrom
@@ -70,23 +70,21 @@ class DatasetIndexToolCallbackHandler:
)
child_chunk = db.session.scalar(child_chunk_stmt)
if child_chunk:
_ = (
db.session.query(DocumentSegment)
db.session.execute(
update(DocumentSegment)
.where(DocumentSegment.id == child_chunk.segment_id)
.update(
{DocumentSegment.hit_count: DocumentSegment.hit_count + 1}, synchronize_session=False
)
.values(hit_count=DocumentSegment.hit_count + 1)
)
else:
query = db.session.query(DocumentSegment).where(
DocumentSegment.index_node_id == document.metadata["doc_id"]
)
conditions = [DocumentSegment.index_node_id == document.metadata["doc_id"]]
if "dataset_id" in document.metadata:
query = query.where(DocumentSegment.dataset_id == document.metadata["dataset_id"])
conditions.append(DocumentSegment.dataset_id == document.metadata["dataset_id"])
# add hit count to document segment
query.update({DocumentSegment.hit_count: DocumentSegment.hit_count + 1}, synchronize_session=False)
db.session.execute(
update(DocumentSegment).where(*conditions).values(hit_count=DocumentSegment.hit_count + 1)
)
db.session.commit()

View File

@@ -19,7 +19,7 @@ def encrypt_token(tenant_id: str, token: str):
from extensions.ext_database import db
from models.account import Tenant
if not (tenant := db.session.query(Tenant).where(Tenant.id == tenant_id).first()):
if not (tenant := db.session.get(Tenant, tenant_id)):
raise ValueError(f"Tenant with id {tenant_id} not found")
assert tenant.encrypt_public_key is not None
encrypted_token = rsa.encrypt(token, tenant.encrypt_public_key)

View File

@@ -10,6 +10,7 @@ from graphon.model_runtime.entities.llm_entities import LLMResult
from graphon.model_runtime.entities.message_entities import PromptMessage, SystemPromptMessage, UserPromptMessage
from graphon.model_runtime.entities.model_entities import ModelType
from graphon.model_runtime.errors.invoke import InvokeAuthorizationError, InvokeError
from sqlalchemy import select
from core.app.app_config.entities import ModelConfig
from core.llm_generator.entities import RuleCodeGeneratePayload, RuleGeneratePayload, RuleStructuredOutputPayload
@@ -410,8 +411,8 @@ class LLMGenerator:
model_config: ModelConfig,
ideal_output: str | None,
):
last_run: Message | None = (
db.session.query(Message).where(Message.app_id == flow_id).order_by(Message.created_at.desc()).first()
last_run: Message | None = db.session.scalar(
select(Message).where(Message.app_id == flow_id).order_by(Message.created_at.desc()).limit(1)
)
if not last_run:
return LLMGenerator.__instruction_modify_common(

View File

@@ -227,7 +227,7 @@ class PluginAppBackwardsInvocation(BaseBackwardsInvocation):
get app
"""
try:
app = db.session.query(App).where(App.id == app_id).where(App.tenant_id == tenant_id).first()
app = db.session.scalar(select(App).where(App.id == app_id, App.tenant_id == tenant_id).limit(1))
except Exception:
raise ValueError("app not found")

View File

@@ -1,4 +1,4 @@
from sqlalchemy import select
from sqlalchemy import delete, select
from core.tools.__base.tool_provider import ToolProviderController
from core.tools.builtin_tool.provider import BuiltinToolProviderController
@@ -31,7 +31,7 @@ class ToolLabelManager:
raise ValueError("Unsupported tool type")
# delete old labels
db.session.query(ToolLabelBinding).where(ToolLabelBinding.tool_id == provider_id).delete()
db.session.execute(delete(ToolLabelBinding).where(ToolLabelBinding.tool_id == provider_id))
# insert new labels
for label in labels:

View File

@@ -255,11 +255,11 @@ class ToolManager:
if builtin_provider is None:
raise ToolProviderNotFoundError(f"no default provider for {provider_id}")
else:
builtin_provider = (
db.session.query(BuiltinToolProvider)
builtin_provider = db.session.scalar(
select(BuiltinToolProvider)
.where(BuiltinToolProvider.tenant_id == tenant_id, (BuiltinToolProvider.provider == provider_id))
.order_by(BuiltinToolProvider.is_default.desc(), BuiltinToolProvider.created_at.asc())
.first()
.limit(1)
)
if builtin_provider is None:
@@ -818,13 +818,13 @@ class ToolManager:
:return: the provider controller, the credentials
"""
provider: ApiToolProvider | None = (
db.session.query(ApiToolProvider)
provider: ApiToolProvider | None = db.session.scalar(
select(ApiToolProvider)
.where(
ApiToolProvider.id == provider_id,
ApiToolProvider.tenant_id == tenant_id,
)
.first()
.limit(1)
)
if provider is None:
@@ -872,13 +872,13 @@ class ToolManager:
get api provider
"""
provider_name = provider
provider_obj: ApiToolProvider | None = (
db.session.query(ApiToolProvider)
provider_obj: ApiToolProvider | None = db.session.scalar(
select(ApiToolProvider)
.where(
ApiToolProvider.tenant_id == tenant_id,
ApiToolProvider.name == provider,
)
.first()
.limit(1)
)
if provider_obj is None:
@@ -964,10 +964,10 @@ class ToolManager:
@classmethod
def generate_workflow_tool_icon_url(cls, tenant_id: str, provider_id: str) -> EmojiIconDict:
try:
workflow_provider: WorkflowToolProvider | None = (
db.session.query(WorkflowToolProvider)
workflow_provider: WorkflowToolProvider | None = db.session.scalar(
select(WorkflowToolProvider)
.where(WorkflowToolProvider.tenant_id == tenant_id, WorkflowToolProvider.id == provider_id)
.first()
.limit(1)
)
if workflow_provider is None:
@@ -981,10 +981,10 @@ class ToolManager:
@classmethod
def generate_api_tool_icon_url(cls, tenant_id: str, provider_id: str) -> EmojiIconDict:
try:
api_provider: ApiToolProvider | None = (
db.session.query(ApiToolProvider)
api_provider: ApiToolProvider | None = db.session.scalar(
select(ApiToolProvider)
.where(ApiToolProvider.tenant_id == tenant_id, ApiToolProvider.id == provider_id)
.first()
.limit(1)
)
if api_provider is None:

View File

@@ -110,7 +110,7 @@ class DatasetMultiRetrieverTool(DatasetRetrieverBaseTool):
context_list: list[RetrievalSourceMetadata] = []
resource_number = 1
for segment in sorted_segments:
dataset = db.session.query(Dataset).filter_by(id=segment.dataset_id).first()
dataset = db.session.get(Dataset, segment.dataset_id)
document_stmt = select(Document).where(
Document.id == segment.document_id,
Document.enabled == True,

View File

@@ -205,7 +205,7 @@ class DatasetRetrieverTool(DatasetRetrieverBaseTool):
if self.return_resource:
for record in records:
segment = record.segment
dataset = db.session.query(Dataset).filter_by(id=segment.dataset_id).first()
dataset = db.session.get(Dataset, segment.dataset_id)
dataset_document_stmt = select(DatasetDocument).where(
DatasetDocument.id == segment.document_id,
DatasetDocument.enabled == True,

View File

@@ -1,5 +1,6 @@
from events.app_event import app_was_created
from extensions.ext_database import db
from models.enums import CustomizeTokenStrategy
from models.model import Site
@@ -16,7 +17,7 @@ def handle(sender, **kwargs):
icon=app.icon,
icon_background=app.icon_background,
default_language=account.interface_language,
customize_token_strategy="not_allow",
customize_token_strategy=CustomizeTokenStrategy.NOT_ALLOW,
code=Site.generate_code(16),
created_by=app.created_by,
updated_by=app.updated_by,

View File

@@ -158,6 +158,15 @@ class FeedbackFromSource(StrEnum):
ADMIN = "admin"
class CustomizeTokenStrategy(StrEnum):
"""Site token customization strategy"""
MUST = "must"
ALLOW = "allow"
NOT_ALLOW = "not_allow"
UUID = "uuid"
class FeedbackRating(StrEnum):
"""MessageFeedback rating"""
@@ -314,6 +323,13 @@ class MessageChainType(StrEnum):
SYSTEM = "system"
class PromptType(StrEnum):
"""Prompt configuration type"""
SIMPLE = "simple"
ADVANCED = "advanced"
class ProviderQuotaType(StrEnum):
PAID = "paid"
"""hosted paid quota"""

View File

@@ -40,12 +40,14 @@ from .enums import (
ConversationFromSource,
ConversationStatus,
CreatorUserRole,
CustomizeTokenStrategy,
FeedbackFromSource,
FeedbackRating,
InvokeFrom,
MessageChainType,
MessageFileBelongsTo,
MessageStatus,
PromptType,
ProviderQuotaType,
TagType,
)
@@ -649,8 +651,11 @@ class AppModelConfig(TypeBase):
agent_mode: Mapped[str | None] = mapped_column(LongText, default=None)
sensitive_word_avoidance: Mapped[str | None] = mapped_column(LongText, default=None)
retriever_resource: Mapped[str | None] = mapped_column(LongText, default=None)
prompt_type: Mapped[str] = mapped_column(
String(255), nullable=False, server_default=sa.text("'simple'"), default="simple"
prompt_type: Mapped[PromptType] = mapped_column(
EnumText(PromptType, length=255),
nullable=False,
server_default=sa.text("'simple'"),
default=PromptType.SIMPLE,
)
chat_prompt_config: Mapped[str | None] = mapped_column(LongText, default=None)
completion_prompt_config: Mapped[str | None] = mapped_column(LongText, default=None)
@@ -802,7 +807,7 @@ class AppModelConfig(TypeBase):
"dataset_query_variable": self.dataset_query_variable,
"pre_prompt": self.pre_prompt,
"agent_mode": self.agent_mode_dict,
"prompt_type": self.prompt_type,
"prompt_type": self.prompt_type.value if isinstance(self.prompt_type, PromptType) else self.prompt_type,
"chat_prompt_config": self.chat_prompt_config_dict,
"completion_prompt_config": self.completion_prompt_config_dict,
"dataset_configs": self.dataset_configs_dict,
@@ -846,7 +851,7 @@ class AppModelConfig(TypeBase):
self.retriever_resource = (
json.dumps(model_config.get("retriever_resource")) if model_config.get("retriever_resource") else None
)
self.prompt_type = model_config.get("prompt_type", "simple")
self.prompt_type = PromptType(model_config.get("prompt_type", "simple"))
self.chat_prompt_config = (
json.dumps(model_config.get("chat_prompt_config")) if model_config.get("chat_prompt_config") else None
)
@@ -2084,7 +2089,9 @@ class Site(Base):
use_icon_as_answer_icon: Mapped[bool] = mapped_column(sa.Boolean, nullable=False, server_default=sa.text("false"))
_custom_disclaimer: Mapped[str] = mapped_column("custom_disclaimer", LongText, default="")
customize_domain = mapped_column(String(255))
customize_token_strategy: Mapped[str] = mapped_column(String(255), nullable=False)
customize_token_strategy: Mapped[CustomizeTokenStrategy] = mapped_column(
EnumText(CustomizeTokenStrategy, length=255), nullable=False
)
prompt_public: Mapped[bool] = mapped_column(sa.Boolean, nullable=False, server_default=sa.text("false"))
status: Mapped[AppStatus] = mapped_column(
EnumText(AppStatus, length=255), nullable=False, server_default=sa.text("'normal'"), default=AppStatus.NORMAL

View File

@@ -35,15 +35,13 @@ class ApiKeyAuthService:
@staticmethod
def get_auth_credentials(tenant_id: str, category: str, provider: str):
data_source_api_key_bindings = (
db.session.query(DataSourceApiKeyAuthBinding)
.where(
data_source_api_key_bindings = db.session.scalar(
select(DataSourceApiKeyAuthBinding).where(
DataSourceApiKeyAuthBinding.tenant_id == tenant_id,
DataSourceApiKeyAuthBinding.category == category,
DataSourceApiKeyAuthBinding.provider == provider,
DataSourceApiKeyAuthBinding.disabled.is_(False),
)
.first()
)
if not data_source_api_key_bindings:
return None
@@ -54,10 +52,11 @@ class ApiKeyAuthService:
@staticmethod
def delete_provider_auth(tenant_id: str, binding_id: str):
data_source_api_key_binding = (
db.session.query(DataSourceApiKeyAuthBinding)
.where(DataSourceApiKeyAuthBinding.tenant_id == tenant_id, DataSourceApiKeyAuthBinding.id == binding_id)
.first()
data_source_api_key_binding = db.session.scalar(
select(DataSourceApiKeyAuthBinding).where(
DataSourceApiKeyAuthBinding.tenant_id == tenant_id,
DataSourceApiKeyAuthBinding.id == binding_id,
)
)
if data_source_api_key_binding:
db.session.delete(data_source_api_key_binding)

View File

@@ -1,8 +1,10 @@
from __future__ import annotations
import datetime
import json
import uuid
from decimal import Decimal
from unittest.mock import patch
from unittest.mock import MagicMock, patch
import pytest
from faker import Faker
@@ -1169,3 +1171,66 @@ class TestMessagesCleanServiceIntegration:
# Verify all messages were deleted
assert db_session_with_containers.query(Message).where(Message.id.in_(msg_ids)).count() == 0
def test_from_time_range_validation(self):
"""Test that from_time_range raises ValueError for invalid inputs."""
policy = MagicMock(spec=BillingDisabledPolicy)
now = datetime.datetime.now()
with pytest.raises(ValueError, match="start_from .* must be less than end_before"):
MessagesCleanService.from_time_range(policy, now, now)
with pytest.raises(ValueError, match="batch_size .* must be greater than 0"):
MessagesCleanService.from_time_range(policy, now - datetime.timedelta(days=1), now, batch_size=0)
def test_from_time_range_success(self):
"""Test that from_time_range creates a service with correct parameters."""
policy = MagicMock(spec=BillingDisabledPolicy)
start = datetime.datetime(2024, 1, 1)
end = datetime.datetime(2024, 2, 1)
service = MessagesCleanService.from_time_range(policy, start, end)
assert service._start_from == start
assert service._end_before == end
def test_from_days_validation(self):
"""Test that from_days raises ValueError for invalid inputs."""
policy = MagicMock(spec=BillingDisabledPolicy)
with pytest.raises(ValueError, match="days .* must be greater than or equal to 0"):
MessagesCleanService.from_days(policy, days=-1)
with pytest.raises(ValueError, match="batch_size .* must be greater than 0"):
MessagesCleanService.from_days(policy, days=30, batch_size=0)
def test_from_days_success(self):
"""Test that from_days creates a service with correct parameters."""
policy = MagicMock(spec=BillingDisabledPolicy)
with patch("services.retention.conversation.messages_clean_service.naive_utc_now") as mock_now:
fixed_now = datetime.datetime(2024, 6, 1)
mock_now.return_value = fixed_now
service = MessagesCleanService.from_days(policy, days=10)
assert service._start_from is None
assert service._end_before == fixed_now - datetime.timedelta(days=10)
def test_batch_delete_message_relations_empty(self, db_session_with_containers: Session):
"""Test that batch_delete_message_relations with empty list does nothing."""
# Get execute call count before
MessagesCleanService._batch_delete_message_relations(db_session_with_containers, [])
# No exception means success — empty list is a no-op
def test_run_calls_clean_messages(self):
"""Test that run() delegates to _clean_messages_by_time_range."""
policy = MagicMock(spec=BillingDisabledPolicy)
service = MessagesCleanService(
policy=policy,
end_before=datetime.datetime.now(),
batch_size=10,
)
with patch.object(service, "_clean_messages_by_time_range") as mock_clean:
mock_clean.return_value = {"total_deleted": 5}
result = service.run()
assert result == {"total_deleted": 5}
mock_clean.assert_called_once()
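The `run()` delegation test above needs nothing beyond `unittest.mock`: `patch.object` swaps a bound method for a `MagicMock` for the duration of the block. A stdlib-only sketch with a hypothetical `Service` standing in for `MessagesCleanService`:

```python
from unittest.mock import patch

class Service:
    # Hypothetical stand-in for MessagesCleanService.
    def _clean_messages_by_time_range(self):
        raise RuntimeError("would touch the database")

    def run(self):
        return self._clean_messages_by_time_range()

service = Service()

# patch.object replaces the method on this instance only, and restores
# the original when the block exits — so run() is exercised without a DB.
with patch.object(service, "_clean_messages_by_time_range") as mock_clean:
    mock_clean.return_value = {"total_deleted": 5}
    result = service.run()
    mock_clean.assert_called_once()
```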

View File

@@ -1,4 +1,6 @@
from unittest.mock import patch
from __future__ import annotations
from unittest.mock import MagicMock, patch
import pytest
from faker import Faker
@@ -534,3 +536,283 @@ class TestWorkspaceService:
# Verify database state
db_session_with_containers.refresh(tenant)
assert tenant.id is not None
def test_get_tenant_info_should_raise_assertion_when_join_missing(
self, db_session_with_containers: Session, mock_external_service_dependencies
):
"""TenantAccountJoin must exist; missing join should raise AssertionError."""
fake = Faker()
account = Account(email=fake.email(), name=fake.name(), interface_language="en-US", status="active")
db_session_with_containers.add(account)
db_session_with_containers.commit()
tenant = Tenant(name=fake.company(), status="normal", plan="basic")
db_session_with_containers.add(tenant)
db_session_with_containers.commit()
# No TenantAccountJoin created
with patch("services.workspace_service.current_user", account):
with pytest.raises(AssertionError, match="TenantAccountJoin not found"):
WorkspaceService.get_tenant_info(tenant)
def test_get_tenant_info_should_set_replace_webapp_logo_to_none_when_flag_absent(
self, db_session_with_containers: Session, mock_external_service_dependencies
):
"""replace_webapp_logo should be None when custom_config_dict does not have the key."""
import json
fake = Faker()
account, tenant = self._create_test_account_and_tenant(
db_session_with_containers, mock_external_service_dependencies
)
tenant.custom_config = json.dumps({})
db_session_with_containers.commit()
mock_external_service_dependencies["feature_service"].get_features.return_value.can_replace_logo = True
mock_external_service_dependencies["tenant_service"].has_roles.return_value = True
with patch("services.workspace_service.current_user", account):
result = WorkspaceService.get_tenant_info(tenant)
assert result is not None
assert result["custom_config"]["replace_webapp_logo"] is None
def test_get_tenant_info_should_use_files_url_for_logo_url(
self, db_session_with_containers: Session, mock_external_service_dependencies
):
"""The logo URL should use dify_config.FILES_URL as the base."""
import json
fake = Faker()
account, tenant = self._create_test_account_and_tenant(
db_session_with_containers, mock_external_service_dependencies
)
tenant.custom_config = json.dumps({"replace_webapp_logo": True})
db_session_with_containers.commit()
custom_base = "https://cdn.mycompany.io"
mock_external_service_dependencies["dify_config"].FILES_URL = custom_base
mock_external_service_dependencies["feature_service"].get_features.return_value.can_replace_logo = True
mock_external_service_dependencies["tenant_service"].has_roles.return_value = True
with patch("services.workspace_service.current_user", account):
result = WorkspaceService.get_tenant_info(tenant)
assert result is not None
assert result["custom_config"]["replace_webapp_logo"].startswith(custom_base)
def test_get_tenant_info_should_not_include_cloud_fields_in_self_hosted(
self, db_session_with_containers: Session, mock_external_service_dependencies
):
"""next_credit_reset_date and trial_credits should NOT appear in SELF_HOSTED mode."""
fake = Faker()
account, tenant = self._create_test_account_and_tenant(
db_session_with_containers, mock_external_service_dependencies
)
mock_external_service_dependencies["dify_config"].EDITION = "SELF_HOSTED"
mock_external_service_dependencies["feature_service"].get_features.return_value.can_replace_logo = False
mock_external_service_dependencies["tenant_service"].has_roles.return_value = False
with patch("services.workspace_service.current_user", account):
result = WorkspaceService.get_tenant_info(tenant)
assert result is not None
assert "next_credit_reset_date" not in result
assert "trial_credits" not in result
assert "trial_credits_used" not in result
def test_get_tenant_info_cloud_credit_reset_date(
self, db_session_with_containers: Session, mock_external_service_dependencies
):
"""next_credit_reset_date should be present in CLOUD edition."""
fake = Faker()
account, tenant = self._create_test_account_and_tenant(
db_session_with_containers, mock_external_service_dependencies
)
mock_external_service_dependencies["dify_config"].EDITION = "CLOUD"
feature = mock_external_service_dependencies["feature_service"].get_features.return_value
feature.can_replace_logo = False
feature.next_credit_reset_date = "2025-02-01"
feature.billing.subscription.plan = "professional"
mock_external_service_dependencies["tenant_service"].has_roles.return_value = False
with (
patch("services.workspace_service.current_user", account),
patch("services.credit_pool_service.CreditPoolService.get_pool", return_value=None),
):
result = WorkspaceService.get_tenant_info(tenant)
assert result is not None
assert result["next_credit_reset_date"] == "2025-02-01"
def test_get_tenant_info_cloud_paid_pool_not_full(
self, db_session_with_containers: Session, mock_external_service_dependencies
):
"""trial_credits come from paid pool when plan is not sandbox and pool is not full."""
fake = Faker()
account, tenant = self._create_test_account_and_tenant(
db_session_with_containers, mock_external_service_dependencies
)
mock_external_service_dependencies["dify_config"].EDITION = "CLOUD"
feature = mock_external_service_dependencies["feature_service"].get_features.return_value
feature.can_replace_logo = False
feature.next_credit_reset_date = "2025-02-01"
feature.billing.subscription.plan = "professional"
mock_external_service_dependencies["tenant_service"].has_roles.return_value = False
paid_pool = MagicMock(quota_limit=1000, quota_used=200)
with (
patch("services.workspace_service.current_user", account),
patch("services.credit_pool_service.CreditPoolService.get_pool", return_value=paid_pool),
):
result = WorkspaceService.get_tenant_info(tenant)
assert result is not None
assert result["trial_credits"] == 1000
assert result["trial_credits_used"] == 200
def test_get_tenant_info_cloud_paid_pool_unlimited(
self, db_session_with_containers: Session, mock_external_service_dependencies
):
"""quota_limit == -1 means unlimited; service should use paid pool."""
fake = Faker()
account, tenant = self._create_test_account_and_tenant(
db_session_with_containers, mock_external_service_dependencies
)
mock_external_service_dependencies["dify_config"].EDITION = "CLOUD"
feature = mock_external_service_dependencies["feature_service"].get_features.return_value
feature.can_replace_logo = False
feature.next_credit_reset_date = "2025-02-01"
feature.billing.subscription.plan = "professional"
mock_external_service_dependencies["tenant_service"].has_roles.return_value = False
paid_pool = MagicMock(quota_limit=-1, quota_used=999)
with (
patch("services.workspace_service.current_user", account),
patch("services.credit_pool_service.CreditPoolService.get_pool", side_effect=[paid_pool, None]),
):
result = WorkspaceService.get_tenant_info(tenant)
assert result is not None
assert result["trial_credits"] == -1
assert result["trial_credits_used"] == 999
def test_get_tenant_info_cloud_fall_back_to_trial_when_paid_full(
self, db_session_with_containers: Session, mock_external_service_dependencies
):
"""When paid pool is exhausted, switch to trial pool."""
fake = Faker()
account, tenant = self._create_test_account_and_tenant(
db_session_with_containers, mock_external_service_dependencies
)
mock_external_service_dependencies["dify_config"].EDITION = "CLOUD"
feature = mock_external_service_dependencies["feature_service"].get_features.return_value
feature.can_replace_logo = False
feature.next_credit_reset_date = "2025-02-01"
feature.billing.subscription.plan = "professional"
mock_external_service_dependencies["tenant_service"].has_roles.return_value = False
paid_pool = MagicMock(quota_limit=500, quota_used=500)
trial_pool = MagicMock(quota_limit=100, quota_used=10)
with (
patch("services.workspace_service.current_user", account),
patch("services.credit_pool_service.CreditPoolService.get_pool", side_effect=[paid_pool, trial_pool]),
):
result = WorkspaceService.get_tenant_info(tenant)
assert result is not None
assert result["trial_credits"] == 100
assert result["trial_credits_used"] == 10
def test_get_tenant_info_cloud_fall_back_to_trial_when_paid_none(
self, db_session_with_containers: Session, mock_external_service_dependencies
):
"""When paid_pool is None, fall back to trial pool."""
fake = Faker()
account, tenant = self._create_test_account_and_tenant(
db_session_with_containers, mock_external_service_dependencies
)
mock_external_service_dependencies["dify_config"].EDITION = "CLOUD"
feature = mock_external_service_dependencies["feature_service"].get_features.return_value
feature.can_replace_logo = False
feature.next_credit_reset_date = "2025-02-01"
feature.billing.subscription.plan = "professional"
mock_external_service_dependencies["tenant_service"].has_roles.return_value = False
trial_pool = MagicMock(quota_limit=50, quota_used=5)
with (
patch("services.workspace_service.current_user", account),
patch("services.credit_pool_service.CreditPoolService.get_pool", side_effect=[None, trial_pool]),
):
result = WorkspaceService.get_tenant_info(tenant)
assert result is not None
assert result["trial_credits"] == 50
assert result["trial_credits_used"] == 5
def test_get_tenant_info_cloud_sandbox_uses_trial_pool(
self, db_session_with_containers: Session, mock_external_service_dependencies
):
"""When plan is SANDBOX, skip paid pool and use trial pool."""
from enums.cloud_plan import CloudPlan
fake = Faker()
account, tenant = self._create_test_account_and_tenant(
db_session_with_containers, mock_external_service_dependencies
)
mock_external_service_dependencies["dify_config"].EDITION = "CLOUD"
feature = mock_external_service_dependencies["feature_service"].get_features.return_value
feature.can_replace_logo = False
feature.next_credit_reset_date = "2025-02-01"
feature.billing.subscription.plan = CloudPlan.SANDBOX
mock_external_service_dependencies["tenant_service"].has_roles.return_value = False
paid_pool = MagicMock(quota_limit=1000, quota_used=0)
trial_pool = MagicMock(quota_limit=200, quota_used=20)
with (
patch("services.workspace_service.current_user", account),
patch("services.credit_pool_service.CreditPoolService.get_pool", side_effect=[paid_pool, trial_pool]),
):
result = WorkspaceService.get_tenant_info(tenant)
assert result is not None
assert result["trial_credits"] == 200
assert result["trial_credits_used"] == 20
def test_get_tenant_info_cloud_both_pools_none(
self, db_session_with_containers: Session, mock_external_service_dependencies
):
"""When both paid and trial pools are absent, trial_credits should not be set."""
fake = Faker()
account, tenant = self._create_test_account_and_tenant(
db_session_with_containers, mock_external_service_dependencies
)
mock_external_service_dependencies["dify_config"].EDITION = "CLOUD"
feature = mock_external_service_dependencies["feature_service"].get_features.return_value
feature.can_replace_logo = False
feature.next_credit_reset_date = "2025-02-01"
feature.billing.subscription.plan = "professional"
mock_external_service_dependencies["tenant_service"].has_roles.return_value = False
with (
patch("services.workspace_service.current_user", account),
patch("services.credit_pool_service.CreditPoolService.get_pool", side_effect=[None, None]),
):
result = WorkspaceService.get_tenant_info(tenant)
assert result is not None
assert "trial_credits" not in result
assert "trial_credits_used" not in result
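The pool tests above lean on `side_effect` taking a list, so successive calls to the patched `get_pool` return different values (paid pool first, trial pool second). A minimal, self-contained sketch of that pattern — the `CreditPoolService` class here is a local stand-in, not the real service:

```python
from unittest.mock import MagicMock, patch

class CreditPoolService:
    """Hypothetical stand-in for the real service; it would hit the DB."""
    @staticmethod
    def get_pool(tenant_id, pool_type):
        raise RuntimeError("not stubbed")

def resolve_pools(tenant_id):
    # Mirrors the call order the tests rely on: paid pool, then trial pool.
    paid = CreditPoolService.get_pool(tenant_id, "paid")
    trial = CreditPoolService.get_pool(tenant_id, "trial")
    return paid, trial

trial_pool = MagicMock(quota_limit=50, quota_used=5)
with patch.object(CreditPoolService, "get_pool", side_effect=[None, trial_pool]):
    paid, trial = resolve_pools("tenant-1")

assert paid is None           # first call consumed the first side_effect entry
assert trial.quota_limit == 50
```

If the code under test called `get_pool` in a different order, the `side_effect` list would silently hand back the wrong pool, which is why these tests pin the order explicitly.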

View File

@@ -64,18 +64,18 @@ class TestGetActiveAccount:
def test_returns_active_account(self, mock_db):
mock_account = MagicMock()
mock_account.status = "active"
mock_db.session.query.return_value.filter_by.return_value.first.return_value = mock_account
mock_db.session.scalar.return_value = mock_account
result = _get_active_account("user@example.com")
assert result is mock_account
mock_db.session.query.return_value.filter_by.assert_called_once_with(email="user@example.com")
mock_db.session.scalar.assert_called_once()
@patch("controllers.inner_api.app.dsl.db")
def test_returns_none_for_inactive_account(self, mock_db):
mock_account = MagicMock()
mock_account.status = "banned"
mock_db.session.query.return_value.filter_by.return_value.first.return_value = mock_account
mock_db.session.scalar.return_value = mock_account
result = _get_active_account("banned@example.com")
@@ -83,7 +83,7 @@ class TestGetActiveAccount:
@patch("controllers.inner_api.app.dsl.db")
def test_returns_none_for_nonexistent_email(self, mock_db):
mock_db.session.query.return_value.filter_by.return_value.first.return_value = None
mock_db.session.scalar.return_value = None
result = _get_active_account("missing@example.com")
@@ -205,7 +205,7 @@ class TestEnterpriseAppDSLExport:
@patch("controllers.inner_api.app.dsl.db")
def test_export_success_returns_200(self, mock_db, mock_dsl_cls, api_instance, app: Flask):
mock_app = MagicMock()
mock_db.session.query.return_value.filter_by.return_value.first.return_value = mock_app
mock_db.session.get.return_value = mock_app
mock_dsl_cls.export_dsl.return_value = "version: 0.6.0\nkind: app\n"
unwrapped = inspect.unwrap(api_instance.get)
@@ -221,7 +221,7 @@ class TestEnterpriseAppDSLExport:
@patch("controllers.inner_api.app.dsl.db")
def test_export_with_secret(self, mock_db, mock_dsl_cls, api_instance, app: Flask):
mock_app = MagicMock()
mock_db.session.query.return_value.filter_by.return_value.first.return_value = mock_app
mock_db.session.get.return_value = mock_app
mock_dsl_cls.export_dsl.return_value = "yaml-data"
unwrapped = inspect.unwrap(api_instance.get)
@@ -234,7 +234,7 @@ class TestEnterpriseAppDSLExport:
@patch("controllers.inner_api.app.dsl.db")
def test_export_app_not_found_returns_404(self, mock_db, api_instance, app: Flask):
mock_db.session.query.return_value.filter_by.return_value.first.return_value = None
mock_db.session.get.return_value = None
unwrapped = inspect.unwrap(api_instance.get)
with app.test_request_context("?include_secret=false"):
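These hunks track a production change from a 1.x-style `db.session.query(App).filter_by(id=...).first()` lookup to the SQLAlchemy 2.0-style `db.session.get(App, ...)`, so the mock shape has to follow the code shape. A sketch of the difference, using a bare `MagicMock` in place of a real session:

```python
from unittest.mock import MagicMock

session = MagicMock()
app_obj = object()

# Legacy 1.x-style lookup: every link in the attribute chain needs a stub.
session.query.return_value.filter_by.return_value.first.return_value = app_obj
assert session.query("App").filter_by(id="a1").first() is app_obj

# 2.0-style primary-key lookup: a single attribute to stub.
session.get.return_value = app_obj
assert session.get("App", "a1") is app_obj
session.get.assert_called_once_with("App", "a1")
```

The single-attribute stub is also stricter: `assert_called_once_with` can verify the model and key actually passed, which the long chain version never checked.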

View File

@@ -621,7 +621,7 @@ class TestConvertDatasetRetrieverTool:
class TestBaseAgentRunnerInit:
def test_init_sets_stream_tool_call_and_files(self, mocker):
session = mocker.MagicMock()
session.query.return_value.where.return_value.count.return_value = 2
session.scalar.return_value = 2
mocker.patch.object(module.db, "session", session)
mocker.patch.object(BaseAgentRunner, "organize_agent_history", return_value=[])

View File

@@ -114,13 +114,9 @@ class TestOnToolEnd:
document = mocker.Mock()
document.metadata = {"document_id": "doc-1", "doc_id": "node-1"}
mock_query = mocker.Mock()
mock_db.session.query.return_value = mock_query
mock_query.where.return_value = mock_query
handler.on_tool_end([document])
mock_query.update.assert_called_once()
mock_db.session.execute.assert_called_once()
mock_db.session.commit.assert_called_once()
def test_on_tool_end_non_parent_child_index(self, handler, mocker):
@@ -138,13 +134,9 @@ class TestOnToolEnd:
"dataset_id": "dataset-1",
}
mock_query = mocker.Mock()
mock_db.session.query.return_value = mock_query
mock_query.where.return_value = mock_query
handler.on_tool_end([document])
mock_query.update.assert_called_once()
mock_db.session.execute.assert_called_once()
mock_db.session.commit.assert_called_once()
def test_on_tool_end_empty_documents(self, handler):
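The callback-handler hunks replace per-query `update()` assertions with `session.execute` / `session.commit` call counts, matching a move to 2.0-style bulk updates. A hedged sketch of the shape being asserted (the function and SQL here are illustrative, not the real handler):

```python
from unittest.mock import MagicMock

def mark_documents_hit(session, document_ids):
    """Hypothetical 2.0-style bulk update: one execute, one commit."""
    if not document_ids:
        return
    # Stands in for session.execute(update(Model).where(...).values(...)).
    session.execute("UPDATE documents SET hit_count = hit_count + 1 WHERE id IN :ids")
    session.commit()

session = MagicMock()
mark_documents_hit(session, ["doc-1"])
session.execute.assert_called_once()
session.commit.assert_called_once()

# Empty input short-circuits: nothing is executed or committed.
empty_session = MagicMock()
mark_documents_hit(empty_session, [])
empty_session.execute.assert_not_called()
```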

View File

@@ -38,13 +38,13 @@ class TestObfuscatedToken:
class TestEncryptToken:
@patch("models.engine.db.session.query")
@patch("extensions.ext_database.db.session.get")
@patch("libs.rsa.encrypt")
def test_successful_encryption(self, mock_encrypt, mock_query):
"""Test successful token encryption"""
mock_tenant = MagicMock()
mock_tenant.encrypt_public_key = "mock_public_key"
mock_query.return_value.where.return_value.first.return_value = mock_tenant
mock_query.return_value = mock_tenant
mock_encrypt.return_value = b"encrypted_data"
result = encrypt_token("tenant-123", "test_token")
@@ -52,10 +52,10 @@ class TestEncryptToken:
assert result == base64.b64encode(b"encrypted_data").decode()
mock_encrypt.assert_called_with("test_token", "mock_public_key")
@patch("models.engine.db.session.query")
@patch("extensions.ext_database.db.session.get")
def test_tenant_not_found(self, mock_query):
"""Test error when tenant doesn't exist"""
mock_query.return_value.where.return_value.first.return_value = None
mock_query.return_value = None
with pytest.raises(ValueError) as exc_info:
encrypt_token("invalid-tenant", "test_token")
@@ -119,7 +119,7 @@ class TestGetDecryptDecoding:
class TestEncryptDecryptIntegration:
@patch("models.engine.db.session.query")
@patch("extensions.ext_database.db.session.get")
@patch("libs.rsa.encrypt")
@patch("libs.rsa.decrypt")
def test_should_encrypt_and_decrypt_consistently(self, mock_decrypt, mock_encrypt, mock_query):
@@ -127,7 +127,7 @@ class TestEncryptDecryptIntegration:
# Setup mock tenant
mock_tenant = MagicMock()
mock_tenant.encrypt_public_key = "mock_public_key"
mock_query.return_value.where.return_value.first.return_value = mock_tenant
mock_query.return_value = mock_tenant
# Setup mock encryption/decryption
original_token = "test_token_123"
@@ -146,14 +146,14 @@ class TestEncryptDecryptIntegration:
class TestSecurity:
"""Critical security tests for encryption system"""
@patch("models.engine.db.session.query")
@patch("extensions.ext_database.db.session.get")
@patch("libs.rsa.encrypt")
def test_cross_tenant_isolation(self, mock_encrypt, mock_query):
"""Ensure tokens encrypted for one tenant cannot be used by another"""
# Setup mock tenant
mock_tenant = MagicMock()
mock_tenant.encrypt_public_key = "tenant1_public_key"
mock_query.return_value.where.return_value.first.return_value = mock_tenant
mock_query.return_value = mock_tenant
mock_encrypt.return_value = b"encrypted_for_tenant1"
# Encrypt token for tenant1
@@ -181,12 +181,12 @@ class TestSecurity:
with pytest.raises(Exception, match="Decryption error"):
decrypt_token("tenant-123", tampered)
@patch("models.engine.db.session.query")
@patch("extensions.ext_database.db.session.get")
@patch("libs.rsa.encrypt")
def test_encryption_randomness(self, mock_encrypt, mock_query):
"""Ensure same plaintext produces different ciphertext"""
mock_tenant = MagicMock(encrypt_public_key="key")
mock_query.return_value.where.return_value.first.return_value = mock_tenant
mock_query.return_value = mock_tenant
# Different outputs for same input
mock_encrypt.side_effect = [b"enc1", b"enc2", b"enc3"]
@@ -205,13 +205,13 @@ class TestEdgeCases:
# Test empty string (which is a valid str type)
assert obfuscated_token("") == ""
@patch("models.engine.db.session.query")
@patch("extensions.ext_database.db.session.get")
@patch("libs.rsa.encrypt")
def test_should_handle_empty_token_encryption(self, mock_encrypt, mock_query):
"""Test encryption of empty token"""
mock_tenant = MagicMock()
mock_tenant.encrypt_public_key = "mock_public_key"
mock_query.return_value.where.return_value.first.return_value = mock_tenant
mock_query.return_value = mock_tenant
mock_encrypt.return_value = b"encrypted_empty"
result = encrypt_token("tenant-123", "")
@@ -219,13 +219,13 @@ class TestEdgeCases:
assert result == base64.b64encode(b"encrypted_empty").decode()
mock_encrypt.assert_called_with("", "mock_public_key")
@patch("models.engine.db.session.query")
@patch("extensions.ext_database.db.session.get")
@patch("libs.rsa.encrypt")
def test_should_handle_special_characters_in_token(self, mock_encrypt, mock_query):
"""Test tokens containing special/unicode characters"""
mock_tenant = MagicMock()
mock_tenant.encrypt_public_key = "mock_public_key"
mock_query.return_value.where.return_value.first.return_value = mock_tenant
mock_query.return_value = mock_tenant
mock_encrypt.return_value = b"encrypted_special"
# Test various special characters
@@ -242,13 +242,13 @@ class TestEdgeCases:
assert result == base64.b64encode(b"encrypted_special").decode()
mock_encrypt.assert_called_with(token, "mock_public_key")
@patch("models.engine.db.session.query")
@patch("extensions.ext_database.db.session.get")
@patch("libs.rsa.encrypt")
def test_should_handle_rsa_size_limits(self, mock_encrypt, mock_query):
"""Test behavior when token exceeds RSA encryption limits"""
mock_tenant = MagicMock()
mock_tenant.encrypt_public_key = "mock_public_key"
mock_query.return_value.where.return_value.first.return_value = mock_tenant
mock_query.return_value = mock_tenant
# RSA 2048-bit can only encrypt ~245 bytes
# The actual limit depends on padding scheme
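Throughout the encrypter tests the patch target moves from `models.engine.db.session.query` to `extensions.ext_database.db.session.get`, because the code under test now calls `session.get` directly. When stacking patches, remember that decorator-injected mock arguments bind bottom-up; the context-manager form below is equivalent and keeps the bindings explicit (the `svc` class is a stand-in for the two patched modules):

```python
from unittest.mock import patch

class svc:
    """Hypothetical stand-in for the patched db session and rsa module."""
    @staticmethod
    def get(model, pk):
        raise RuntimeError("would hit the database")
    @staticmethod
    def encrypt(data, key):
        raise RuntimeError("would do real RSA")

# Stacked patch.object contexts compose like stacked @patch decorators.
with patch.object(svc, "get", return_value="tenant") as mock_get, \
     patch.object(svc, "encrypt", return_value=b"cipher") as mock_encrypt:
    assert svc.get("Tenant", "t1") == "tenant"
    assert svc.encrypt("token", "key") == b"cipher"
    mock_get.assert_called_once_with("Tenant", "t1")
```

Note the parameter names in the diff (`mock_query`) are leftovers from the old target; the injected mock is whatever the innermost `@patch` produces, regardless of its name.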

View File

@@ -314,8 +314,8 @@ class TestLLMGenerator:
assert "An unexpected error occurred" in result["error"]
def test_instruction_modify_legacy_no_last_run(self, mock_model_instance, model_config_entity):
with patch("extensions.ext_database.db.session.query") as mock_query:
mock_query.return_value.where.return_value.order_by.return_value.first.return_value = None
with patch("extensions.ext_database.db.session.scalar") as mock_scalar:
mock_scalar.return_value = None
# Mock __instruction_modify_common call via invoke_llm
mock_response = MagicMock()
@@ -328,12 +328,12 @@ class TestLLMGenerator:
assert result == {"modified": "prompt"}
def test_instruction_modify_legacy_with_last_run(self, mock_model_instance, model_config_entity):
with patch("extensions.ext_database.db.session.query") as mock_query:
with patch("extensions.ext_database.db.session.scalar") as mock_scalar:
last_run = MagicMock()
last_run.query = "q"
last_run.answer = "a"
last_run.error = "e"
mock_query.return_value.where.return_value.order_by.return_value.first.return_value = last_run
mock_scalar.return_value = last_run
mock_response = MagicMock()
mock_response.message.get_text_content.return_value = '{"modified": "prompt"}'
@@ -483,8 +483,8 @@ class TestLLMGenerator:
def test_instruction_modify_common_placeholders(self, mock_model_instance, model_config_entity):
# Test placeholder replacement via instruction_modify_legacy for convenience
with patch("extensions.ext_database.db.session.query") as mock_query:
mock_query.return_value.where.return_value.order_by.return_value.first.return_value = None
with patch("extensions.ext_database.db.session.scalar") as mock_scalar:
mock_scalar.return_value = None
mock_response = MagicMock()
mock_response.message.get_text_content.return_value = '{"ok": true}'
@@ -504,8 +504,8 @@ class TestLLMGenerator:
assert "current_val" in user_msg_dict["instruction"]
def test_instruction_modify_common_no_braces(self, mock_model_instance, model_config_entity):
with patch("extensions.ext_database.db.session.query") as mock_query:
mock_query.return_value.where.return_value.order_by.return_value.first.return_value = None
with patch("extensions.ext_database.db.session.scalar") as mock_scalar:
mock_scalar.return_value = None
mock_response = MagicMock()
mock_response.message.get_text_content.return_value = "No braces here"
mock_model_instance.invoke_llm.return_value = mock_response
@@ -516,8 +516,8 @@ class TestLLMGenerator:
assert "Could not find a valid JSON object" in result["error"]
def test_instruction_modify_common_not_dict(self, mock_model_instance, model_config_entity):
with patch("extensions.ext_database.db.session.query") as mock_query:
mock_query.return_value.where.return_value.order_by.return_value.first.return_value = None
with patch("extensions.ext_database.db.session.scalar") as mock_scalar:
mock_scalar.return_value = None
mock_response = MagicMock()
mock_response.message.get_text_content.return_value = "[1, 2, 3]"
mock_model_instance.invoke_llm.return_value = mock_response
@@ -556,8 +556,8 @@ class TestLLMGenerator:
)
def test_instruction_modify_common_invoke_error(self, mock_model_instance, model_config_entity):
with patch("extensions.ext_database.db.session.query") as mock_query:
mock_query.return_value.where.return_value.order_by.return_value.first.return_value = None
with patch("extensions.ext_database.db.session.scalar") as mock_scalar:
mock_scalar.return_value = None
mock_model_instance.invoke_llm.side_effect = InvokeError("Invoke Failed")
result = LLMGenerator.instruction_modify_legacy(
@@ -566,8 +566,8 @@ class TestLLMGenerator:
assert "Failed to generate code" in result["error"]
def test_instruction_modify_common_exception(self, mock_model_instance, model_config_entity):
with patch("extensions.ext_database.db.session.query") as mock_query:
mock_query.return_value.where.return_value.order_by.return_value.first.return_value = None
with patch("extensions.ext_database.db.session.scalar") as mock_scalar:
mock_scalar.return_value = None
mock_model_instance.invoke_llm.side_effect = Exception("Random error")
result = LLMGenerator.instruction_modify_legacy(
@@ -576,8 +576,8 @@ class TestLLMGenerator:
assert "An unexpected error occurred" in result["error"]
def test_instruction_modify_common_json_error(self, mock_model_instance, model_config_entity):
with patch("extensions.ext_database.db.session.query") as mock_query:
mock_query.return_value.where.return_value.order_by.return_value.first.return_value = None
with patch("extensions.ext_database.db.session.scalar") as mock_scalar:
mock_scalar.return_value = None
mock_response = MagicMock()
mock_response.message.get_text_content.return_value = "No JSON here"
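All of the LLM-generator tests above stub `db.session.scalar` to return either `None` (no previous run) or a mock last-run record, and the code branches on that. A self-contained sketch of the branch shape they cover — `summarize_last_run` is illustrative, not the real generator code:

```python
from unittest.mock import MagicMock

def summarize_last_run(session):
    """Hypothetical sketch of the branch under test."""
    last_run = session.scalar("select-last-run")  # stands in for a Select
    if last_run is None:
        return {"query": "", "answer": "", "error": ""}
    return {"query": last_run.query, "answer": last_run.answer, "error": last_run.error}

session = MagicMock()

# No previous run: scalar returns None, placeholders stay empty.
session.scalar.return_value = None
assert summarize_last_run(session) == {"query": "", "answer": "", "error": ""}

# Previous run present: its fields flow into the result.
last_run = MagicMock(query="q", answer="a", error="e")
session.scalar.return_value = last_run
assert summarize_last_run(session) == {"query": "q", "answer": "a", "error": "e"}
```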

View File

@@ -332,27 +332,21 @@ class TestPluginAppBackwardsInvocation:
PluginAppBackwardsInvocation._get_user("uid")
def test_get_app_returns_app(self, mocker):
query_chain = MagicMock()
query_chain.where.return_value = query_chain
app_obj = MagicMock(id="app")
query_chain.first.return_value = app_obj
db = SimpleNamespace(session=MagicMock(query=MagicMock(return_value=query_chain)))
db = SimpleNamespace(session=MagicMock(scalar=MagicMock(return_value=app_obj)))
mocker.patch("core.plugin.backwards_invocation.app.db", db)
assert PluginAppBackwardsInvocation._get_app("app", "tenant") is app_obj
def test_get_app_raises_when_missing(self, mocker):
query_chain = MagicMock()
query_chain.where.return_value = query_chain
query_chain.first.return_value = None
db = SimpleNamespace(session=MagicMock(query=MagicMock(return_value=query_chain)))
db = SimpleNamespace(session=MagicMock(scalar=MagicMock(return_value=None)))
mocker.patch("core.plugin.backwards_invocation.app.db", db)
with pytest.raises(ValueError, match="app not found"):
PluginAppBackwardsInvocation._get_app("app", "tenant")
def test_get_app_raises_when_query_fails(self, mocker):
db = SimpleNamespace(session=MagicMock(query=MagicMock(side_effect=RuntimeError("db down"))))
db = SimpleNamespace(session=MagicMock(scalar=MagicMock(side_effect=RuntimeError("db down"))))
mocker.patch("core.plugin.backwards_invocation.app.db", db)
with pytest.raises(ValueError, match="app not found"):

View File

@@ -38,11 +38,9 @@ def test_tool_label_manager_filter_tool_labels():
def test_tool_label_manager_update_tool_labels_db():
controller = _api_controller("api-1")
with patch("core.tools.tool_label_manager.db") as mock_db:
delete_query = mock_db.session.query.return_value.where.return_value
delete_query.delete.return_value = None
ToolLabelManager.update_tool_labels(controller, ["search", "search", "invalid"])
delete_query.delete.assert_called_once()
mock_db.session.execute.assert_called_once()
# Only one valid unique label should be inserted.
assert mock_db.session.add.call_count == 1
mock_db.session.commit.assert_called_once()
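The label-manager test asserts one bulk delete via `session.execute`, exactly one `session.add` (duplicates and invalid labels filtered out), and one commit. A hedged sketch of the logic those counts imply — the allow-list and SQL are illustrative:

```python
from unittest.mock import MagicMock

VALID_LABELS = {"search", "image", "videos"}  # hypothetical allow-list

def update_tool_labels(session, labels):
    """Sketch: one bulk delete, then insert each unique valid label."""
    session.execute("DELETE FROM tool_label_bindings WHERE provider = :p")
    for label in set(labels):          # set() drops the duplicate "search"
        if label in VALID_LABELS:      # allow-list drops "invalid"
            session.add(label)
    session.commit()

session = MagicMock()
update_tool_labels(session, ["search", "search", "invalid"])
session.execute.assert_called_once()
assert session.add.call_count == 1
session.commit.assert_called_once()
```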

View File

@@ -220,9 +220,7 @@ def test_get_tool_runtime_builtin_with_credentials_decrypts_and_forks():
with patch.object(ToolManager, "get_builtin_provider", return_value=controller):
with patch("core.helper.credential_utils.check_credential_policy_compliance"):
with patch("core.tools.tool_manager.db") as mock_db:
mock_db.session.query.return_value.where.return_value.order_by.return_value.first.return_value = (
builtin_provider
)
mock_db.session.scalar.return_value = builtin_provider
encrypter = Mock()
encrypter.decrypt.return_value = {"api_key": "secret"}
cache = Mock()
@@ -274,7 +272,7 @@ def test_get_tool_runtime_builtin_refreshes_expired_oauth_credentials(
)
refreshed = SimpleNamespace(credentials={"token": "new"}, expires_at=123456)
mock_db.session.query.return_value.where.return_value.order_by.return_value.first.return_value = builtin_provider
mock_db.session.scalar.return_value = builtin_provider
encrypter = Mock()
encrypter.decrypt.return_value = {"token": "old"}
encrypter.encrypt.return_value = {"token": "encrypted"}
@@ -698,12 +696,10 @@ def test_get_api_provider_controller_returns_controller_and_credentials():
privacy_policy="privacy",
custom_disclaimer="disclaimer",
)
db_query = Mock()
db_query.where.return_value.first.return_value = provider
controller = Mock()
with patch("core.tools.tool_manager.db") as mock_db:
mock_db.session.query.return_value = db_query
mock_db.session.scalar.return_value = provider
with patch(
"core.tools.tool_manager.ApiToolProviderController.from_db", return_value=controller
) as mock_from_db:
@@ -730,12 +726,10 @@ def test_user_get_api_provider_masks_credentials_and_adds_labels():
privacy_policy="privacy",
custom_disclaimer="disclaimer",
)
db_query = Mock()
db_query.where.return_value.first.return_value = provider
controller = Mock()
with patch("core.tools.tool_manager.db") as mock_db:
mock_db.session.query.return_value = db_query
mock_db.session.scalar.return_value = provider
with patch("core.tools.tool_manager.ApiToolProviderController.from_db", return_value=controller):
encrypter = Mock()
encrypter.decrypt.return_value = {"api_key_value": "secret"}
@@ -750,7 +744,7 @@ def test_user_get_api_provider_masks_credentials_and_adds_labels():
def test_get_api_provider_controller_not_found_raises():
with patch("core.tools.tool_manager.db") as mock_db:
mock_db.session.query.return_value.where.return_value.first.return_value = None
mock_db.session.scalar.return_value = None
with pytest.raises(ToolProviderNotFoundError, match="api provider missing not found"):
ToolManager.get_api_provider_controller("tenant-1", "missing")
@@ -809,14 +803,14 @@ def test_generate_tool_icon_urls_for_workflow_and_api():
workflow_provider = SimpleNamespace(icon='{"background": "#222", "content": "W"}')
api_provider = SimpleNamespace(icon='{"background": "#333", "content": "A"}')
with patch("core.tools.tool_manager.db") as mock_db:
mock_db.session.query.return_value.where.return_value.first.side_effect = [workflow_provider, api_provider]
mock_db.session.scalar.side_effect = [workflow_provider, api_provider]
assert ToolManager.generate_workflow_tool_icon_url("tenant-1", "wf-1") == {"background": "#222", "content": "W"}
assert ToolManager.generate_api_tool_icon_url("tenant-1", "api-1") == {"background": "#333", "content": "A"}
def test_generate_tool_icon_urls_missing_workflow_and_api_use_default():
with patch("core.tools.tool_manager.db") as mock_db:
mock_db.session.query.return_value.where.return_value.first.return_value = None
mock_db.session.scalar.return_value = None
assert ToolManager.generate_workflow_tool_icon_url("tenant-1", "missing")["background"] == "#252525"
assert ToolManager.generate_api_tool_icon_url("tenant-1", "missing")["background"] == "#252525"

View File

@@ -263,7 +263,7 @@ def test_single_dataset_retriever_non_economy_run_sorts_context_and_resources():
)
db_session = Mock()
db_session.scalar.side_effect = [dataset, lookup_doc_low, lookup_doc_high]
db_session.query.return_value.filter_by.return_value.first.return_value = dataset
db_session.get.return_value = dataset
tool = SingleDatasetRetrieverTool(
tenant_id="tenant-1",
@@ -444,7 +444,7 @@ def test_multi_dataset_retriever_run_orders_segments_and_returns_resources():
)
db_session = Mock()
db_session.scalars.return_value.all.return_value = [segment_for_node_2, segment_for_node_1]
db_session.query.return_value.filter_by.return_value.first.side_effect = [
db_session.get.side_effect = [
SimpleNamespace(id="dataset-2", name="Dataset Two"),
SimpleNamespace(id="dataset-1", name="Dataset One"),
]

View File

@@ -1,311 +0,0 @@
import datetime
from unittest.mock import MagicMock, patch
import pytest
from services.retention.conversation.messages_clean_policy import (
BillingDisabledPolicy,
)
from services.retention.conversation.messages_clean_service import MessagesCleanService
class TestMessagesCleanService:
@pytest.fixture(autouse=True)
def mock_db_engine(self):
with patch("services.retention.conversation.messages_clean_service.db") as mock_db:
mock_db.engine = MagicMock()
yield mock_db.engine
@pytest.fixture
def mock_db_session(self, mock_db_engine):
with patch("services.retention.conversation.messages_clean_service.Session") as mock_session_cls:
mock_session = MagicMock()
mock_session_cls.return_value.__enter__.return_value = mock_session
yield mock_session
@pytest.fixture
def mock_policy(self):
policy = MagicMock(spec=BillingDisabledPolicy)
return policy
def test_run_calls_clean_messages(self, mock_policy):
service = MessagesCleanService(
policy=mock_policy,
end_before=datetime.datetime.now(),
batch_size=10,
)
with patch.object(service, "_clean_messages_by_time_range") as mock_clean:
mock_clean.return_value = {"total_deleted": 5}
result = service.run()
assert result == {"total_deleted": 5}
mock_clean.assert_called_once()
def test_clean_messages_by_time_range_basic(self, mock_db_session, mock_policy):
# Arrange
end_before = datetime.datetime(2024, 1, 1, 12, 0, 0)
service = MessagesCleanService(
policy=mock_policy,
end_before=end_before,
batch_size=10,
)
mock_db_session.execute.side_effect = [
MagicMock(all=lambda: [("msg1", "app1", datetime.datetime(2024, 1, 1, 10, 0, 0))]), # messages
MagicMock(all=lambda: [MagicMock(id="app1", tenant_id="tenant1")]), # apps
MagicMock(rowcount=1),  # delete relations (rowcount is unused by relation deletes; this list is replaced below anyway)
MagicMock(rowcount=1), # delete relations
MagicMock(rowcount=1), # delete relations
MagicMock(rowcount=1), # delete relations
MagicMock(rowcount=1), # delete relations
MagicMock(rowcount=1), # delete relations
MagicMock(rowcount=1), # delete relations
MagicMock(rowcount=1), # delete relations
MagicMock(rowcount=1), # delete messages
MagicMock(all=lambda: []), # next batch empty
]
# Reset side_effect to be more robust
# The service calls session.execute for:
# 1. Fetch messages
# 2. Fetch apps
# 3. Batch delete relations (8 calls if IDs exist)
# 4. Delete messages
mock_returns = [
MagicMock(all=lambda: [("msg1", "app1", datetime.datetime(2024, 1, 1, 10, 0, 0))]), # fetch messages
MagicMock(all=lambda: [MagicMock(id="app1", tenant_id="tenant1")]), # fetch apps
]
# 8 deletes for relations
mock_returns.extend([MagicMock() for _ in range(8)])
# 1 delete for messages
mock_returns.append(MagicMock(rowcount=1))
# Final fetch messages (empty)
mock_returns.append(MagicMock(all=lambda: []))
mock_db_session.execute.side_effect = mock_returns
mock_policy.filter_message_ids.return_value = ["msg1"]
# Act
with patch("services.retention.conversation.messages_clean_service.time.sleep"):
stats = service.run()
# Assert
assert stats["total_messages"] == 1
assert stats["total_deleted"] == 1
assert stats["batches"] == 2
def test_clean_messages_by_time_range_with_start_from(self, mock_db_session, mock_policy):
start_from = datetime.datetime(2024, 1, 1, 0, 0, 0)
end_before = datetime.datetime(2024, 1, 1, 12, 0, 0)
service = MessagesCleanService(
policy=mock_policy,
start_from=start_from,
end_before=end_before,
batch_size=10,
)
mock_db_session.execute.side_effect = [
MagicMock(all=lambda: []), # No messages
]
stats = service.run()
assert stats["total_messages"] == 0
def test_clean_messages_by_time_range_with_cursor(self, mock_db_session, mock_policy):
# Test pagination with cursor
end_before = datetime.datetime(2024, 1, 1, 12, 0, 0)
service = MessagesCleanService(
policy=mock_policy,
end_before=end_before,
batch_size=1,
)
msg1_time = datetime.datetime(2024, 1, 1, 10, 0, 0)
msg2_time = datetime.datetime(2024, 1, 1, 11, 0, 0)
mock_returns = []
# Batch 1
mock_returns.append(MagicMock(all=lambda: [("msg1", "app1", msg1_time)]))
mock_returns.append(MagicMock(all=lambda: [MagicMock(id="app1", tenant_id="tenant1")]))
mock_returns.extend([MagicMock() for _ in range(8)]) # relations
mock_returns.append(MagicMock(rowcount=1)) # messages
# Batch 2
mock_returns.append(MagicMock(all=lambda: [("msg2", "app1", msg2_time)]))
mock_returns.append(MagicMock(all=lambda: [MagicMock(id="app1", tenant_id="tenant1")]))
mock_returns.extend([MagicMock() for _ in range(8)]) # relations
mock_returns.append(MagicMock(rowcount=1)) # messages
# Batch 3
mock_returns.append(MagicMock(all=lambda: []))
mock_db_session.execute.side_effect = mock_returns
mock_policy.filter_message_ids.return_value = ["msg1"] # Simplified
with patch("services.retention.conversation.messages_clean_service.time.sleep"):
stats = service.run()
assert stats["batches"] == 3
assert stats["total_messages"] == 2
def test_clean_messages_by_time_range_dry_run(self, mock_db_session, mock_policy):
service = MessagesCleanService(
policy=mock_policy,
end_before=datetime.datetime.now(),
batch_size=10,
dry_run=True,
)
mock_db_session.execute.side_effect = [
MagicMock(all=lambda: [("msg1", "app1", datetime.datetime.now())]), # messages
MagicMock(all=lambda: [MagicMock(id="app1", tenant_id="tenant1")]), # apps
MagicMock(all=lambda: []), # next batch empty
]
mock_policy.filter_message_ids.return_value = ["msg1"]
with patch("services.retention.conversation.messages_clean_service.random.sample") as mock_sample:
mock_sample.return_value = ["msg1"]
stats = service.run()
assert stats["filtered_messages"] == 1
assert stats["total_deleted"] == 0 # Dry run
mock_sample.assert_called()
def test_clean_messages_by_time_range_no_apps_found(self, mock_db_session, mock_policy):
service = MessagesCleanService(
policy=mock_policy,
end_before=datetime.datetime.now(),
batch_size=10,
)
mock_db_session.execute.side_effect = [
MagicMock(all=lambda: [("msg1", "app1", datetime.datetime.now())]), # messages
MagicMock(all=lambda: []), # apps NOT found
MagicMock(all=lambda: []), # next batch empty
]
stats = service.run()
assert stats["total_messages"] == 1
assert stats["total_deleted"] == 0
def test_clean_messages_by_time_range_no_app_ids(self, mock_db_session, mock_policy):
service = MessagesCleanService(
policy=mock_policy,
end_before=datetime.datetime.now(),
batch_size=10,
)
mock_db_session.execute.side_effect = [
MagicMock(all=lambda: [("msg1", "app1", datetime.datetime.now())]), # messages
MagicMock(all=lambda: []), # next batch empty
]
# We need lines 228 and 229 of the service to execute successfully, then return empty at line 251.
# line 228: raw_messages = list(session.execute(msg_stmt).all())
# line 251: app_ids = list({msg.app_id for msg in messages})
calls = []
def list_side_effect(arg):
calls.append(arg)
if len(calls) == 2: # This is the second call to list() in the loop
return []
return list(arg)
with patch("services.retention.conversation.messages_clean_service.list", side_effect=list_side_effect):
stats = service.run()
assert stats["batches"] == 2
assert stats["total_messages"] == 1
def test_from_time_range_validation(self, mock_policy):
now = datetime.datetime.now()
# Test start_from >= end_before
with pytest.raises(ValueError, match="start_from .* must be less than end_before"):
MessagesCleanService.from_time_range(mock_policy, now, now)
# Test batch_size <= 0
with pytest.raises(ValueError, match="batch_size .* must be greater than 0"):
MessagesCleanService.from_time_range(mock_policy, now - datetime.timedelta(days=1), now, batch_size=0)
def test_from_time_range_success(self, mock_policy):
start = datetime.datetime(2024, 1, 1)
end = datetime.datetime(2024, 2, 1)
# No need to mock the logger; real logging is harmless here
service = MessagesCleanService.from_time_range(mock_policy, start, end)
assert service._start_from == start
assert service._end_before == end
def test_from_days_validation(self, mock_policy):
# Test days < 0
with pytest.raises(ValueError, match="days .* must be greater than or equal to 0"):
MessagesCleanService.from_days(mock_policy, days=-1)
# Test batch_size <= 0
with pytest.raises(ValueError, match="batch_size .* must be greater than 0"):
MessagesCleanService.from_days(mock_policy, days=30, batch_size=0)
def test_from_days_success(self, mock_policy):
with patch("services.retention.conversation.messages_clean_service.naive_utc_now") as mock_now:
fixed_now = datetime.datetime(2024, 6, 1)
mock_now.return_value = fixed_now
service = MessagesCleanService.from_days(mock_policy, days=10)
assert service._start_from is None
assert service._end_before == fixed_now - datetime.timedelta(days=10)
def test_clean_messages_by_time_range_no_messages_to_delete(self, mock_db_session, mock_policy):
service = MessagesCleanService(
policy=mock_policy,
end_before=datetime.datetime.now(),
batch_size=10,
)
mock_db_session.execute.side_effect = [
MagicMock(all=lambda: [("msg1", "app1", datetime.datetime.now())]), # messages
MagicMock(all=lambda: [MagicMock(id="app1", tenant_id="tenant1")]), # apps
MagicMock(all=lambda: []), # next batch empty
]
mock_policy.filter_message_ids.return_value = [] # Policy says NO
stats = service.run()
assert stats["total_messages"] == 1
assert stats["filtered_messages"] == 0
assert stats["total_deleted"] == 0
def test_batch_delete_message_relations_empty(self, mock_db_session):
MessagesCleanService._batch_delete_message_relations(mock_db_session, [])
mock_db_session.execute.assert_not_called()
def test_batch_delete_message_relations_with_ids(self, mock_db_session):
MessagesCleanService._batch_delete_message_relations(mock_db_session, ["msg1", "msg2"])
assert mock_db_session.execute.call_count == 8 # 8 tables to clean up
def test_clean_messages_interval_from_env(self, mock_db_session, mock_policy):
service = MessagesCleanService(
policy=mock_policy,
end_before=datetime.datetime.now(),
batch_size=10,
)
mock_returns = [
MagicMock(all=lambda: [("msg1", "app1", datetime.datetime.now())]), # messages
MagicMock(all=lambda: [MagicMock(id="app1", tenant_id="tenant1")]), # apps
]
mock_returns.extend([MagicMock() for _ in range(8)]) # relations
mock_returns.append(MagicMock(rowcount=1)) # messages
mock_returns.append(MagicMock(all=lambda: [])) # next batch empty
mock_db_session.execute.side_effect = mock_returns
mock_policy.filter_message_ids.return_value = ["msg1"]
with patch(
"services.retention.conversation.messages_clean_service.dify_config.SANDBOX_EXPIRED_RECORDS_CLEAN_BATCH_MAX_INTERVAL",
500,
):
with patch("services.retention.conversation.messages_clean_service.time.sleep") as mock_sleep:
with patch("services.retention.conversation.messages_clean_service.random.uniform") as mock_uniform:
mock_uniform.return_value = 300.0
service.run()
mock_uniform.assert_called_with(0, 500)
mock_sleep.assert_called_with(0.3)
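The tests above drive the service's read loop by queueing sequential results on the mocked session. A minimal standalone illustration of that `side_effect` pattern (names hypothetical, not from the code under test):

```python
from unittest.mock import MagicMock

# Each call to the mocked execute() consumes the next canned result,
# so an empty final batch is what ends the service's read loop.
session = MagicMock()
session.execute.side_effect = [
    MagicMock(all=lambda: [("msg1",)]),  # first batch: one row
    MagicMock(all=lambda: []),           # second batch: empty, loop stops
]

first = session.execute("SELECT ...").all()
second = session.execute("SELECT ...").all()
print(first, second)
```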

@@ -1,558 +0,0 @@
from __future__ import annotations
from dataclasses import dataclass
from datetime import UTC, datetime
from types import SimpleNamespace
from typing import Any, cast
from unittest.mock import MagicMock
import pytest
from pytest_mock import MockerFixture
from core.rag.index_processor.constant.built_in_field import BuiltInField, MetadataDataSource
from models.dataset import Dataset
from services.entities.knowledge_entities.knowledge_entities import (
DocumentMetadataOperation,
MetadataArgs,
MetadataDetail,
MetadataOperationData,
)
from services.metadata_service import MetadataService
@dataclass
class _DocumentStub:
id: str
name: str
uploader: str
upload_date: datetime
last_update_date: datetime
data_source_type: str
doc_metadata: dict[str, object] | None
@pytest.fixture
def mock_db(mocker: MockerFixture) -> MagicMock:
mocked_db = mocker.patch("services.metadata_service.db")
mocked_db.session = MagicMock()
return mocked_db
@pytest.fixture
def mock_redis_client(mocker: MockerFixture) -> MagicMock:
return mocker.patch("services.metadata_service.redis_client")
@pytest.fixture
def mock_current_account(mocker: MockerFixture) -> MagicMock:
mock_user = SimpleNamespace(id="user-1")
return mocker.patch("services.metadata_service.current_account_with_tenant", return_value=(mock_user, "tenant-1"))
def _build_document(document_id: str, doc_metadata: dict[str, object] | None = None) -> _DocumentStub:
now = datetime(2025, 1, 1, 10, 30, tzinfo=UTC)
return _DocumentStub(
id=document_id,
name=f"doc-{document_id}",
uploader="qa@example.com",
upload_date=now,
last_update_date=now,
data_source_type="upload_file",
doc_metadata=doc_metadata,
)
def _dataset(**kwargs: Any) -> Dataset:
return cast(Dataset, SimpleNamespace(**kwargs))
def test_create_metadata_should_raise_value_error_when_name_exceeds_limit() -> None:
# Arrange
metadata_args = MetadataArgs(type="string", name="x" * 256)
# Act + Assert
with pytest.raises(ValueError, match="cannot exceed 255"):
MetadataService.create_metadata("dataset-1", metadata_args)
def test_create_metadata_should_raise_value_error_when_metadata_name_already_exists(
mock_db: MagicMock,
mock_current_account: MagicMock,
) -> None:
# Arrange
metadata_args = MetadataArgs(type="string", name="priority")
mock_db.session.query.return_value.filter_by.return_value.first.return_value = object()
# Act + Assert
with pytest.raises(ValueError, match="already exists"):
MetadataService.create_metadata("dataset-1", metadata_args)
# Assert
mock_current_account.assert_called_once()
def test_create_metadata_should_raise_value_error_when_name_collides_with_builtin(
mock_db: MagicMock, mock_current_account: MagicMock
) -> None:
# Arrange
metadata_args = MetadataArgs(type="string", name=BuiltInField.document_name)
mock_db.session.query.return_value.filter_by.return_value.first.return_value = None
# Act + Assert
with pytest.raises(ValueError, match="Built-in fields"):
MetadataService.create_metadata("dataset-1", metadata_args)
def test_create_metadata_should_persist_metadata_when_input_is_valid(
mock_db: MagicMock, mock_current_account: MagicMock
) -> None:
# Arrange
metadata_args = MetadataArgs(type="number", name="score")
mock_db.session.query.return_value.filter_by.return_value.first.return_value = None
# Act
result = MetadataService.create_metadata("dataset-1", metadata_args)
# Assert
assert result.tenant_id == "tenant-1"
assert result.dataset_id == "dataset-1"
assert result.type == "number"
assert result.name == "score"
assert result.created_by == "user-1"
mock_db.session.add.assert_called_once_with(result)
mock_db.session.commit.assert_called_once()
mock_current_account.assert_called_once()
def test_update_metadata_name_should_raise_value_error_when_name_exceeds_limit() -> None:
# Arrange
too_long_name = "x" * 256
# Act + Assert
with pytest.raises(ValueError, match="cannot exceed 255"):
MetadataService.update_metadata_name("dataset-1", "metadata-1", too_long_name)
def test_update_metadata_name_should_raise_value_error_when_duplicate_name_exists(
mock_db: MagicMock, mock_current_account: MagicMock
) -> None:
# Arrange
mock_db.session.query.return_value.filter_by.return_value.first.return_value = object()
# Act + Assert
with pytest.raises(ValueError, match="already exists"):
MetadataService.update_metadata_name("dataset-1", "metadata-1", "duplicate")
# Assert
mock_current_account.assert_called_once()
def test_update_metadata_name_should_raise_value_error_when_name_collides_with_builtin(
mock_db: MagicMock,
mock_current_account: MagicMock,
) -> None:
# Arrange
mock_db.session.query.return_value.filter_by.return_value.first.return_value = None
# Act + Assert
with pytest.raises(ValueError, match="Built-in fields"):
MetadataService.update_metadata_name("dataset-1", "metadata-1", BuiltInField.source)
# Assert
mock_current_account.assert_called_once()
def test_update_metadata_name_should_update_bound_documents_and_return_metadata(
mock_db: MagicMock,
mock_redis_client: MagicMock,
mock_current_account: MagicMock,
mocker: MockerFixture,
) -> None:
# Arrange
mock_redis_client.get.return_value = None
fixed_now = datetime(2025, 2, 1, 0, 0, tzinfo=UTC)
mocker.patch("services.metadata_service.naive_utc_now", return_value=fixed_now)
metadata = SimpleNamespace(id="metadata-1", name="old_name", updated_by=None, updated_at=None)
bindings = [SimpleNamespace(document_id="doc-1"), SimpleNamespace(document_id="doc-2")]
query_duplicate = MagicMock()
query_duplicate.filter_by.return_value.first.return_value = None
query_metadata = MagicMock()
query_metadata.filter_by.return_value.first.return_value = metadata
query_bindings = MagicMock()
query_bindings.filter_by.return_value.all.return_value = bindings
mock_db.session.query.side_effect = [query_duplicate, query_metadata, query_bindings]
doc_1 = _build_document("1", {"old_name": "value", "other": "keep"})
doc_2 = _build_document("2", None)
mock_get_documents = mocker.patch("services.metadata_service.DocumentService.get_document_by_ids")
mock_get_documents.return_value = [doc_1, doc_2]
# Act
result = MetadataService.update_metadata_name("dataset-1", "metadata-1", "new_name")
# Assert
assert result is metadata
assert metadata.name == "new_name"
assert metadata.updated_by == "user-1"
assert metadata.updated_at == fixed_now
assert doc_1.doc_metadata == {"other": "keep", "new_name": "value"}
assert doc_2.doc_metadata == {"new_name": None}
mock_get_documents.assert_called_once_with(["doc-1", "doc-2"])
mock_db.session.commit.assert_called_once()
mock_redis_client.delete.assert_called_once_with("dataset_metadata_lock_dataset-1")
mock_current_account.assert_called_once()
def test_update_metadata_name_should_return_none_when_metadata_does_not_exist(
mock_db: MagicMock,
mock_redis_client: MagicMock,
mock_current_account: MagicMock,
mocker: MockerFixture,
) -> None:
# Arrange
mock_redis_client.get.return_value = None
mock_logger = mocker.patch("services.metadata_service.logger")
query_duplicate = MagicMock()
query_duplicate.filter_by.return_value.first.return_value = None
query_metadata = MagicMock()
query_metadata.filter_by.return_value.first.return_value = None
mock_db.session.query.side_effect = [query_duplicate, query_metadata]
# Act
result = MetadataService.update_metadata_name("dataset-1", "missing-id", "new_name")
# Assert
assert result is None
mock_logger.exception.assert_called_once()
mock_redis_client.delete.assert_called_once_with("dataset_metadata_lock_dataset-1")
mock_current_account.assert_called_once()
def test_delete_metadata_should_remove_metadata_and_related_document_fields(
mock_db: MagicMock,
mock_redis_client: MagicMock,
mocker: MockerFixture,
) -> None:
# Arrange
mock_redis_client.get.return_value = None
metadata = SimpleNamespace(id="metadata-1", name="obsolete")
bindings = [SimpleNamespace(document_id="doc-1")]
query_metadata = MagicMock()
query_metadata.filter_by.return_value.first.return_value = metadata
query_bindings = MagicMock()
query_bindings.filter_by.return_value.all.return_value = bindings
mock_db.session.query.side_effect = [query_metadata, query_bindings]
document = _build_document("1", {"obsolete": "legacy", "remaining": "value"})
mocker.patch("services.metadata_service.DocumentService.get_document_by_ids", return_value=[document])
# Act
result = MetadataService.delete_metadata("dataset-1", "metadata-1")
# Assert
assert result is metadata
assert document.doc_metadata == {"remaining": "value"}
mock_db.session.delete.assert_called_once_with(metadata)
mock_db.session.commit.assert_called_once()
mock_redis_client.delete.assert_called_once_with("dataset_metadata_lock_dataset-1")
def test_delete_metadata_should_return_none_when_metadata_is_missing(
mock_db: MagicMock,
mock_redis_client: MagicMock,
mocker: MockerFixture,
) -> None:
# Arrange
mock_redis_client.get.return_value = None
mock_db.session.query.return_value.filter_by.return_value.first.return_value = None
mock_logger = mocker.patch("services.metadata_service.logger")
# Act
result = MetadataService.delete_metadata("dataset-1", "missing-id")
# Assert
assert result is None
mock_logger.exception.assert_called_once()
mock_redis_client.delete.assert_called_once_with("dataset_metadata_lock_dataset-1")
def test_get_built_in_fields_should_return_all_expected_fields() -> None:
# Arrange
expected_names = {
BuiltInField.document_name,
BuiltInField.uploader,
BuiltInField.upload_date,
BuiltInField.last_update_date,
BuiltInField.source,
}
# Act
result = MetadataService.get_built_in_fields()
# Assert
assert {item["name"] for item in result} == expected_names
assert [item["type"] for item in result] == ["string", "string", "time", "time", "string"]
def test_enable_built_in_field_should_return_immediately_when_already_enabled(
mock_db: MagicMock,
mocker: MockerFixture,
) -> None:
# Arrange
dataset = _dataset(id="dataset-1", built_in_field_enabled=True)
get_docs = mocker.patch("services.metadata_service.DocumentService.get_working_documents_by_dataset_id")
# Act
MetadataService.enable_built_in_field(dataset)
# Assert
get_docs.assert_not_called()
mock_db.session.commit.assert_not_called()
def test_enable_built_in_field_should_populate_documents_and_enable_flag(
mock_db: MagicMock,
mock_redis_client: MagicMock,
mocker: MockerFixture,
) -> None:
# Arrange
mock_redis_client.get.return_value = None
dataset = _dataset(id="dataset-1", built_in_field_enabled=False)
doc_1 = _build_document("1", {"custom": "value"})
doc_2 = _build_document("2", None)
mocker.patch(
"services.metadata_service.DocumentService.get_working_documents_by_dataset_id",
return_value=[doc_1, doc_2],
)
# Act
MetadataService.enable_built_in_field(dataset)
# Assert
assert dataset.built_in_field_enabled is True
assert doc_1.doc_metadata is not None
assert doc_1.doc_metadata[BuiltInField.document_name] == "doc-1"
assert doc_1.doc_metadata[BuiltInField.source] == MetadataDataSource.upload_file
assert doc_2.doc_metadata is not None
assert doc_2.doc_metadata[BuiltInField.uploader] == "qa@example.com"
mock_db.session.commit.assert_called_once()
mock_redis_client.delete.assert_called_once_with("dataset_metadata_lock_dataset-1")
def test_disable_built_in_field_should_return_immediately_when_already_disabled(
mock_db: MagicMock,
mocker: MockerFixture,
) -> None:
# Arrange
dataset = _dataset(id="dataset-1", built_in_field_enabled=False)
get_docs = mocker.patch("services.metadata_service.DocumentService.get_working_documents_by_dataset_id")
# Act
MetadataService.disable_built_in_field(dataset)
# Assert
get_docs.assert_not_called()
mock_db.session.commit.assert_not_called()
def test_disable_built_in_field_should_remove_builtin_keys_and_disable_flag(
mock_db: MagicMock,
mock_redis_client: MagicMock,
mocker: MockerFixture,
) -> None:
# Arrange
mock_redis_client.get.return_value = None
dataset = _dataset(id="dataset-1", built_in_field_enabled=True)
document = _build_document(
"1",
{
BuiltInField.document_name: "doc",
BuiltInField.uploader: "user",
BuiltInField.upload_date: 1.0,
BuiltInField.last_update_date: 2.0,
BuiltInField.source: MetadataDataSource.upload_file,
"custom": "keep",
},
)
mocker.patch(
"services.metadata_service.DocumentService.get_working_documents_by_dataset_id",
return_value=[document],
)
# Act
MetadataService.disable_built_in_field(dataset)
# Assert
assert dataset.built_in_field_enabled is False
assert document.doc_metadata == {"custom": "keep"}
mock_db.session.commit.assert_called_once()
mock_redis_client.delete.assert_called_once_with("dataset_metadata_lock_dataset-1")
def test_update_documents_metadata_should_replace_metadata_and_create_bindings_on_full_update(
mock_db: MagicMock,
mock_redis_client: MagicMock,
mock_current_account: MagicMock,
mocker: MockerFixture,
) -> None:
# Arrange
mock_redis_client.get.return_value = None
dataset = _dataset(id="dataset-1", built_in_field_enabled=False)
document = _build_document("1", {"legacy": "value"})
mocker.patch("services.metadata_service.DocumentService.get_document", return_value=document)
delete_chain = mock_db.session.query.return_value.filter_by.return_value
delete_chain.delete.return_value = 1
operation = DocumentMetadataOperation(
document_id="1",
metadata_list=[MetadataDetail(id="meta-1", name="priority", value="high")],
partial_update=False,
)
metadata_args = MetadataOperationData(operation_data=[operation])
# Act
MetadataService.update_documents_metadata(dataset, metadata_args)
# Assert
assert document.doc_metadata == {"priority": "high"}
delete_chain.delete.assert_called_once()
assert mock_db.session.commit.call_count == 1
mock_redis_client.delete.assert_called_once_with("document_metadata_lock_1")
mock_current_account.assert_called_once()
def test_update_documents_metadata_should_skip_existing_binding_and_preserve_existing_fields_on_partial_update(
mock_db: MagicMock,
mock_redis_client: MagicMock,
mock_current_account: MagicMock,
mocker: MockerFixture,
) -> None:
# Arrange
mock_redis_client.get.return_value = None
dataset = _dataset(id="dataset-1", built_in_field_enabled=True)
document = _build_document("1", {"existing": "value"})
mocker.patch("services.metadata_service.DocumentService.get_document", return_value=document)
mock_db.session.query.return_value.filter_by.return_value.first.return_value = object()
operation = DocumentMetadataOperation(
document_id="1",
metadata_list=[MetadataDetail(id="meta-1", name="new_key", value="new_value")],
partial_update=True,
)
metadata_args = MetadataOperationData(operation_data=[operation])
# Act
MetadataService.update_documents_metadata(dataset, metadata_args)
# Assert
assert document.doc_metadata is not None
assert document.doc_metadata["existing"] == "value"
assert document.doc_metadata["new_key"] == "new_value"
assert document.doc_metadata[BuiltInField.source] == MetadataDataSource.upload_file
assert mock_db.session.commit.call_count == 1
assert mock_db.session.add.call_count == 1
mock_redis_client.delete.assert_called_once_with("document_metadata_lock_1")
mock_current_account.assert_called_once()
def test_update_documents_metadata_should_raise_and_rollback_when_document_not_found(
mock_db: MagicMock,
mock_redis_client: MagicMock,
mocker: MockerFixture,
) -> None:
# Arrange
mock_redis_client.get.return_value = None
dataset = _dataset(id="dataset-1", built_in_field_enabled=False)
mocker.patch("services.metadata_service.DocumentService.get_document", return_value=None)
operation = DocumentMetadataOperation(document_id="404", metadata_list=[], partial_update=True)
metadata_args = MetadataOperationData(operation_data=[operation])
# Act + Assert
with pytest.raises(ValueError, match="Document not found"):
MetadataService.update_documents_metadata(dataset, metadata_args)
# Assert
mock_db.session.rollback.assert_called_once()
mock_redis_client.delete.assert_called_once_with("document_metadata_lock_404")
@pytest.mark.parametrize(
("dataset_id", "document_id", "expected_key"),
[
("dataset-1", None, "dataset_metadata_lock_dataset-1"),
(None, "doc-1", "document_metadata_lock_doc-1"),
],
)
def test_knowledge_base_metadata_lock_check_should_set_lock_when_not_already_locked(
dataset_id: str | None,
document_id: str | None,
expected_key: str,
mock_redis_client: MagicMock,
) -> None:
# Arrange
mock_redis_client.get.return_value = None
# Act
MetadataService.knowledge_base_metadata_lock_check(dataset_id, document_id)
# Assert
mock_redis_client.set.assert_called_once_with(expected_key, 1, ex=3600)
def test_knowledge_base_metadata_lock_check_should_raise_when_dataset_lock_exists(
mock_redis_client: MagicMock,
) -> None:
# Arrange
mock_redis_client.get.return_value = 1
# Act + Assert
with pytest.raises(ValueError, match="knowledge base metadata operation is running"):
MetadataService.knowledge_base_metadata_lock_check("dataset-1", None)
def test_knowledge_base_metadata_lock_check_should_raise_when_document_lock_exists(
mock_redis_client: MagicMock,
) -> None:
# Arrange
mock_redis_client.get.return_value = 1
# Act + Assert
with pytest.raises(ValueError, match="document metadata operation is running"):
MetadataService.knowledge_base_metadata_lock_check(None, "doc-1")
def test_get_dataset_metadatas_should_exclude_builtin_and_include_binding_counts(mock_db: MagicMock) -> None:
# Arrange
dataset = _dataset(
id="dataset-1",
built_in_field_enabled=True,
doc_metadata=[
{"id": "meta-1", "name": "priority", "type": "string"},
{"id": "built-in", "name": "ignored", "type": "string"},
{"id": "meta-2", "name": "score", "type": "number"},
],
)
count_chain = mock_db.session.query.return_value.filter_by.return_value
count_chain.count.side_effect = [3, 1]
# Act
result = MetadataService.get_dataset_metadatas(dataset)
# Assert
assert result["built_in_field_enabled"] is True
assert result["doc_metadata"] == [
{"id": "meta-1", "name": "priority", "type": "string", "count": 3},
{"id": "meta-2", "name": "score", "type": "number", "count": 1},
]
def test_get_dataset_metadatas_should_return_empty_list_when_no_metadata(mock_db: MagicMock) -> None:
# Arrange
dataset = _dataset(id="dataset-1", built_in_field_enabled=False, doc_metadata=None)
# Act
result = MetadataService.get_dataset_metadatas(dataset)
# Assert
assert result == {"doc_metadata": [], "built_in_field_enabled": False}
mock_db.session.query.assert_not_called()
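The lock-check tests above fully specify a small Redis-based mutual-exclusion pattern. A sketch of that behaviour, inferred from the assertions (the `FakeRedis` stand-in and function name are hypothetical):

```python
class FakeRedis:
    """In-memory stand-in for the redis client the tests patch."""
    def __init__(self):
        self.store = {}
    def get(self, key):
        return self.store.get(key)
    def set(self, key, value, ex=None):
        self.store[key] = value

def metadata_lock_check(redis, dataset_id=None, document_id=None):
    # Mirrors what the tests assert: build the lock key, raise if a
    # lock already exists, otherwise set it with a one-hour TTL.
    if dataset_id:
        key = f"dataset_metadata_lock_{dataset_id}"
        msg = "knowledge base metadata operation is running"
    else:
        key = f"document_metadata_lock_{document_id}"
        msg = "document metadata operation is running"
    if redis.get(key):
        raise ValueError(msg)
    redis.set(key, 1, ex=3600)

r = FakeRedis()
metadata_lock_check(r, dataset_id="dataset-1")
print(r.store)  # {'dataset_metadata_lock_dataset-1': 1}
```

A second call with the same `dataset_id` would then raise `ValueError`, which is exactly the path the two "should_raise_when_…_lock_exists" tests cover.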

File diff suppressed because it is too large

@@ -1,576 +0,0 @@
from __future__ import annotations
from types import SimpleNamespace
from typing import Any, cast
from unittest.mock import MagicMock
import pytest
from pytest_mock import MockerFixture
from models.account import Tenant
# ---------------------------------------------------------------------------
# Constants used throughout the tests
# ---------------------------------------------------------------------------
TENANT_ID = "tenant-abc"
ACCOUNT_ID = "account-xyz"
FILES_BASE_URL = "https://files.example.com"
DB_PATH = "services.workspace_service.db"
FEATURE_SERVICE_PATH = "services.workspace_service.FeatureService.get_features"
TENANT_SERVICE_PATH = "services.workspace_service.TenantService.has_roles"
DIFY_CONFIG_PATH = "services.workspace_service.dify_config"
CURRENT_USER_PATH = "services.workspace_service.current_user"
CREDIT_POOL_SERVICE_PATH = "services.credit_pool_service.CreditPoolService.get_pool"
# ---------------------------------------------------------------------------
# Helpers / factories
# ---------------------------------------------------------------------------
def _make_tenant(
tenant_id: str = TENANT_ID,
name: str = "My Workspace",
plan: str = "sandbox",
status: str = "active",
custom_config: dict | None = None,
) -> Tenant:
"""Create a minimal Tenant-like namespace."""
return cast(
Tenant,
SimpleNamespace(
id=tenant_id,
name=name,
plan=plan,
status=status,
created_at="2024-01-01T00:00:00Z",
custom_config_dict=custom_config or {},
),
)
def _make_feature(
can_replace_logo: bool = False,
next_credit_reset_date: str | None = None,
billing_plan: str = "sandbox",
) -> MagicMock:
"""Create a feature namespace matching what FeatureService.get_features returns."""
feature = MagicMock()
feature.can_replace_logo = can_replace_logo
feature.next_credit_reset_date = next_credit_reset_date
feature.billing.subscription.plan = billing_plan
return feature
def _make_pool(quota_limit: int, quota_used: int) -> MagicMock:
pool = MagicMock()
pool.quota_limit = quota_limit
pool.quota_used = quota_used
return pool
def _make_tenant_account_join(role: str = "normal") -> SimpleNamespace:
return SimpleNamespace(role=role)
def _tenant_info(result: object) -> dict[str, Any] | None:
return cast(dict[str, Any] | None, result)
# ---------------------------------------------------------------------------
# Shared fixtures
# ---------------------------------------------------------------------------
@pytest.fixture
def mock_current_user() -> SimpleNamespace:
"""Return a lightweight current_user stand-in."""
return SimpleNamespace(id=ACCOUNT_ID)
@pytest.fixture
def basic_mocks(mocker: MockerFixture, mock_current_user: SimpleNamespace) -> dict:
"""
Patch the common external boundaries used by WorkspaceService.get_tenant_info.
Returns a dict of named mocks so individual tests can customise them.
"""
mocker.patch(CURRENT_USER_PATH, mock_current_user)
mock_db_session = mocker.patch(f"{DB_PATH}.session")
mock_query_chain = MagicMock()
mock_db_session.query.return_value = mock_query_chain
mock_query_chain.where.return_value = mock_query_chain
mock_query_chain.first.return_value = _make_tenant_account_join(role="owner")
mock_feature = mocker.patch(FEATURE_SERVICE_PATH, return_value=_make_feature())
mock_has_roles = mocker.patch(TENANT_SERVICE_PATH, return_value=False)
mock_config = mocker.patch(DIFY_CONFIG_PATH)
mock_config.EDITION = "SELF_HOSTED"
mock_config.FILES_URL = FILES_BASE_URL
return {
"db_session": mock_db_session,
"query_chain": mock_query_chain,
"get_features": mock_feature,
"has_roles": mock_has_roles,
"config": mock_config,
}
# ---------------------------------------------------------------------------
# 1. None Tenant Handling
# ---------------------------------------------------------------------------
def test_get_tenant_info_should_return_none_when_tenant_is_none() -> None:
"""get_tenant_info should short-circuit and return None for a falsy tenant."""
from services.workspace_service import WorkspaceService
# Arrange
tenant = None
# Act
result = WorkspaceService.get_tenant_info(cast(Tenant, tenant))
# Assert
assert result is None
def test_get_tenant_info_should_return_none_when_tenant_is_falsy() -> None:
"""get_tenant_info treats any falsy value as absent (e.g. empty string, 0)."""
from services.workspace_service import WorkspaceService
# Arrange / Act / Assert
assert WorkspaceService.get_tenant_info("") is None # type: ignore[arg-type]
# ---------------------------------------------------------------------------
# 2. Basic Tenant Info — happy path
# ---------------------------------------------------------------------------
def test_get_tenant_info_should_return_base_fields(
mocker: MockerFixture,
basic_mocks: dict,
) -> None:
"""get_tenant_info should always return the six base scalar fields."""
from services.workspace_service import WorkspaceService
# Arrange
tenant = _make_tenant()
# Act
result = _tenant_info(WorkspaceService.get_tenant_info(tenant))
# Assert
assert result is not None
assert result["id"] == TENANT_ID
assert result["name"] == "My Workspace"
assert result["plan"] == "sandbox"
assert result["status"] == "active"
assert result["created_at"] == "2024-01-01T00:00:00Z"
assert result["trial_end_reason"] is None
def test_get_tenant_info_should_populate_role_from_tenant_account_join(
mocker: MockerFixture,
basic_mocks: dict,
) -> None:
"""The 'role' field should be taken from TenantAccountJoin, not the default."""
from services.workspace_service import WorkspaceService
# Arrange
basic_mocks["query_chain"].first.return_value = _make_tenant_account_join(role="admin")
tenant = _make_tenant()
# Act
result = _tenant_info(WorkspaceService.get_tenant_info(tenant))
# Assert
assert result is not None
assert result["role"] == "admin"
def test_get_tenant_info_should_raise_assertion_when_tenant_account_join_missing(
mocker: MockerFixture,
basic_mocks: dict,
) -> None:
"""
The service asserts that TenantAccountJoin exists.
Missing join should raise AssertionError.
"""
from services.workspace_service import WorkspaceService
# Arrange
basic_mocks["query_chain"].first.return_value = None
tenant = _make_tenant()
# Act + Assert
with pytest.raises(AssertionError, match="TenantAccountJoin not found"):
WorkspaceService.get_tenant_info(tenant)
# ---------------------------------------------------------------------------
# 3. Logo Customisation
# ---------------------------------------------------------------------------
def test_get_tenant_info_should_include_custom_config_when_logo_allowed_and_admin(
mocker: MockerFixture,
basic_mocks: dict,
) -> None:
"""custom_config block should appear for OWNER/ADMIN when can_replace_logo is True."""
from services.workspace_service import WorkspaceService
# Arrange
basic_mocks["get_features"].return_value = _make_feature(can_replace_logo=True)
basic_mocks["has_roles"].return_value = True
tenant = _make_tenant(
custom_config={
"replace_webapp_logo": True,
"remove_webapp_brand": True,
}
)
# Act
result = _tenant_info(WorkspaceService.get_tenant_info(tenant))
# Assert
assert result is not None
assert "custom_config" in result
assert result["custom_config"]["remove_webapp_brand"] is True
expected_logo_url = f"{FILES_BASE_URL}/files/workspaces/{TENANT_ID}/webapp-logo"
assert result["custom_config"]["replace_webapp_logo"] == expected_logo_url
def test_get_tenant_info_should_set_replace_webapp_logo_to_none_when_flag_absent(
mocker: MockerFixture,
basic_mocks: dict,
) -> None:
"""replace_webapp_logo should be None when custom_config_dict does not have the key."""
from services.workspace_service import WorkspaceService
# Arrange
basic_mocks["get_features"].return_value = _make_feature(can_replace_logo=True)
basic_mocks["has_roles"].return_value = True
tenant = _make_tenant(custom_config={}) # no replace_webapp_logo key
# Act
result = _tenant_info(WorkspaceService.get_tenant_info(tenant))
# Assert
assert result is not None
assert result["custom_config"]["replace_webapp_logo"] is None
def test_get_tenant_info_should_not_include_custom_config_when_logo_not_allowed(
mocker: MockerFixture,
basic_mocks: dict,
) -> None:
"""custom_config should be absent when can_replace_logo is False."""
from services.workspace_service import WorkspaceService
# Arrange
basic_mocks["get_features"].return_value = _make_feature(can_replace_logo=False)
basic_mocks["has_roles"].return_value = True
tenant = _make_tenant()
# Act
result = _tenant_info(WorkspaceService.get_tenant_info(tenant))
# Assert
assert result is not None
assert "custom_config" not in result
def test_get_tenant_info_should_not_include_custom_config_when_user_not_admin(
mocker: MockerFixture,
basic_mocks: dict,
) -> None:
"""custom_config block is gated on OWNER or ADMIN role."""
from services.workspace_service import WorkspaceService
# Arrange
basic_mocks["get_features"].return_value = _make_feature(can_replace_logo=True)
basic_mocks["has_roles"].return_value = False # regular member
tenant = _make_tenant()
# Act
result = _tenant_info(WorkspaceService.get_tenant_info(tenant))
# Assert
assert result is not None
assert "custom_config" not in result
def test_get_tenant_info_should_use_files_url_for_logo_url(
mocker: MockerFixture,
basic_mocks: dict,
) -> None:
"""The logo URL should use dify_config.FILES_URL as the base."""
from services.workspace_service import WorkspaceService
# Arrange
custom_base = "https://cdn.mycompany.io"
basic_mocks["config"].FILES_URL = custom_base
basic_mocks["get_features"].return_value = _make_feature(can_replace_logo=True)
basic_mocks["has_roles"].return_value = True
tenant = _make_tenant(custom_config={"replace_webapp_logo": True})
# Act
result = _tenant_info(WorkspaceService.get_tenant_info(tenant))
# Assert
assert result is not None
assert result["custom_config"]["replace_webapp_logo"].startswith(custom_base)
# ---------------------------------------------------------------------------
# 4. Cloud-Edition Credit Features
# ---------------------------------------------------------------------------
CLOUD_BILLING_PLAN_NON_SANDBOX = "professional" # any plan that is not SANDBOX
@pytest.fixture
def cloud_mocks(mocker: MockerFixture, mock_current_user: SimpleNamespace) -> dict:
"""Patches for CLOUD edition tests, billing plan = professional by default."""
mocker.patch(CURRENT_USER_PATH, mock_current_user)
mock_db_session = mocker.patch(f"{DB_PATH}.session")
mock_query_chain = MagicMock()
mock_db_session.query.return_value = mock_query_chain
mock_query_chain.where.return_value = mock_query_chain
mock_query_chain.first.return_value = _make_tenant_account_join(role="owner")
mock_feature = mocker.patch(
FEATURE_SERVICE_PATH,
return_value=_make_feature(
can_replace_logo=False,
next_credit_reset_date="2025-02-01",
billing_plan=CLOUD_BILLING_PLAN_NON_SANDBOX,
),
)
mocker.patch(TENANT_SERVICE_PATH, return_value=False)
mock_config = mocker.patch(DIFY_CONFIG_PATH)
mock_config.EDITION = "CLOUD"
mock_config.FILES_URL = FILES_BASE_URL
return {
"db_session": mock_db_session,
"query_chain": mock_query_chain,
"get_features": mock_feature,
"config": mock_config,
}
def test_get_tenant_info_should_add_next_credit_reset_date_in_cloud_edition(
mocker: MockerFixture,
cloud_mocks: dict,
) -> None:
"""next_credit_reset_date should be present in CLOUD edition."""
from services.workspace_service import WorkspaceService
# Arrange
mocker.patch(
CREDIT_POOL_SERVICE_PATH,
side_effect=[None, None], # both paid and trial pools absent
)
tenant = _make_tenant()
# Act
result = _tenant_info(WorkspaceService.get_tenant_info(tenant))
# Assert
assert result is not None
assert result["next_credit_reset_date"] == "2025-02-01"
def test_get_tenant_info_should_use_paid_pool_when_plan_is_not_sandbox_and_pool_not_full(
mocker: MockerFixture,
cloud_mocks: dict,
) -> None:
"""trial_credits/trial_credits_used come from the paid pool when conditions are met."""
from services.workspace_service import WorkspaceService
# Arrange
paid_pool = _make_pool(quota_limit=1000, quota_used=200)
mocker.patch(CREDIT_POOL_SERVICE_PATH, return_value=paid_pool)
tenant = _make_tenant()
# Act
result = _tenant_info(WorkspaceService.get_tenant_info(tenant))
# Assert
assert result is not None
assert result["trial_credits"] == 1000
assert result["trial_credits_used"] == 200
def test_get_tenant_info_should_use_paid_pool_when_quota_limit_is_infinite(
mocker: MockerFixture,
cloud_mocks: dict,
) -> None:
"""quota_limit == -1 means unlimited; service should still use the paid pool."""
from services.workspace_service import WorkspaceService
# Arrange
paid_pool = _make_pool(quota_limit=-1, quota_used=999)
mocker.patch(CREDIT_POOL_SERVICE_PATH, side_effect=[paid_pool, None])
tenant = _make_tenant()
# Act
result = _tenant_info(WorkspaceService.get_tenant_info(tenant))
# Assert
assert result is not None
assert result["trial_credits"] == -1
assert result["trial_credits_used"] == 999
def test_get_tenant_info_should_fall_back_to_trial_pool_when_paid_pool_is_full(
mocker: MockerFixture,
cloud_mocks: dict,
) -> None:
"""When paid pool is exhausted (used >= limit), switch to trial pool."""
from services.workspace_service import WorkspaceService
# Arrange
paid_pool = _make_pool(quota_limit=500, quota_used=500) # exactly full
trial_pool = _make_pool(quota_limit=100, quota_used=10)
mocker.patch(CREDIT_POOL_SERVICE_PATH, side_effect=[paid_pool, trial_pool])
tenant = _make_tenant()
# Act
result = _tenant_info(WorkspaceService.get_tenant_info(tenant))
# Assert
assert result is not None
assert result["trial_credits"] == 100
assert result["trial_credits_used"] == 10
def test_get_tenant_info_should_fall_back_to_trial_pool_when_paid_pool_is_none(
mocker: MockerFixture,
cloud_mocks: dict,
) -> None:
"""When paid_pool is None, fall back to trial pool."""
from services.workspace_service import WorkspaceService
# Arrange
trial_pool = _make_pool(quota_limit=50, quota_used=5)
mocker.patch(CREDIT_POOL_SERVICE_PATH, side_effect=[None, trial_pool])
tenant = _make_tenant()
# Act
result = _tenant_info(WorkspaceService.get_tenant_info(tenant))
# Assert
assert result is not None
assert result["trial_credits"] == 50
    assert result["trial_credits_used"] == 5


def test_get_tenant_info_should_fall_back_to_trial_pool_for_sandbox_plan(
mocker: MockerFixture,
cloud_mocks: dict,
) -> None:
"""
When the subscription plan IS SANDBOX, the paid pool branch is skipped
entirely and we fall back to the trial pool.
"""
from enums.cloud_plan import CloudPlan
from services.workspace_service import WorkspaceService
# Arrange — override billing plan to SANDBOX
cloud_mocks["get_features"].return_value = _make_feature(
next_credit_reset_date="2025-02-01",
billing_plan=CloudPlan.SANDBOX,
)
paid_pool = _make_pool(quota_limit=1000, quota_used=0)
trial_pool = _make_pool(quota_limit=200, quota_used=20)
mocker.patch(CREDIT_POOL_SERVICE_PATH, side_effect=[paid_pool, trial_pool])
tenant = _make_tenant()
# Act
result = _tenant_info(WorkspaceService.get_tenant_info(tenant))
# Assert
assert result is not None
assert result["trial_credits"] == 200
    assert result["trial_credits_used"] == 20


def test_get_tenant_info_should_omit_trial_credits_when_both_pools_are_none(
mocker: MockerFixture,
cloud_mocks: dict,
) -> None:
"""When both paid and trial pools are absent, trial_credits should not be set."""
from services.workspace_service import WorkspaceService
# Arrange
mocker.patch(CREDIT_POOL_SERVICE_PATH, side_effect=[None, None])
tenant = _make_tenant()
# Act
result = _tenant_info(WorkspaceService.get_tenant_info(tenant))
# Assert
assert result is not None
assert "trial_credits" not in result
assert "trial_credits_used" not in result
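

# The six tests above pin down the order in which get_tenant_info picks the
# credit pool that backs trial_credits / trial_credits_used. As a reading aid,
# a minimal sketch of that selection logic (hypothetical helper; attribute
# names are inferred from the tests, not taken from the actual
# WorkspaceService code):
def _select_credit_pool_sketch(plan_is_sandbox: bool, paid_pool, trial_pool):
    """Return the pool whose quota is reported, or None (fields omitted)."""
    if not plan_is_sandbox and paid_pool is not None:
        # quota_limit == -1 means unlimited, so the paid pool never counts as full.
        if paid_pool.quota_limit == -1 or paid_pool.quota_used < paid_pool.quota_limit:
            return paid_pool
    # Sandbox plan, missing paid pool, or exhausted paid pool: trial pool,
    # which may itself be None.
    return trial_pool
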
# ---------------------------------------------------------------------------
# 5. Self-hosted / Non-Cloud Edition
# ---------------------------------------------------------------------------
def test_get_tenant_info_should_not_include_cloud_fields_in_self_hosted(
mocker: MockerFixture,
basic_mocks: dict,
) -> None:
"""next_credit_reset_date and trial_credits should NOT appear in SELF_HOSTED mode."""
from services.workspace_service import WorkspaceService
# Arrange (basic_mocks already sets EDITION = "SELF_HOSTED")
tenant = _make_tenant()
# Act
result = _tenant_info(WorkspaceService.get_tenant_info(tenant))
# Assert
assert result is not None
assert "next_credit_reset_date" not in result
assert "trial_credits" not in result
    assert "trial_credits_used" not in result


# ---------------------------------------------------------------------------
# 6. DB query integrity
# ---------------------------------------------------------------------------
def test_get_tenant_info_should_query_tenant_account_join_with_correct_ids(
mocker: MockerFixture,
basic_mocks: dict,
) -> None:
"""
The DB query for TenantAccountJoin must be scoped to the correct
tenant_id and current_user.id.
"""
from services.workspace_service import WorkspaceService
# Arrange
tenant = _make_tenant(tenant_id="my-special-tenant")
mock_current_user = mocker.patch(CURRENT_USER_PATH)
mock_current_user.id = "special-user-id"
# Act
WorkspaceService.get_tenant_info(tenant)
# Assert — db.session.query was invoked (at least once)
basic_mocks["db_session"].query.assert_called()

api/uv.lock (generated)

@@ -204,7 +204,7 @@ sdist = { url = "https://files.pythonhosted.org/packages/9a/7d/b22cb9a0d4f396ee0
[[package]]
name = "alibabacloud-tea-openapi"
-version = "0.4.3"
+version = "0.4.4"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "alibabacloud-credentials" },
@@ -213,9 +213,9 @@ dependencies = [
{ name = "cryptography" },
{ name = "darabonba-core" },
]
-sdist = { url = "https://files.pythonhosted.org/packages/91/4f/b5288eea8f4d4b032c9a8f2cd1d926d5017977d10b874956f31e5343f299/alibabacloud_tea_openapi-0.4.3.tar.gz", hash = "sha256:12aef036ed993637b6f141abbd1de9d6199d5516f4a901588bb65d6a3768d41b", size = 21864, upload-time = "2026-01-15T07:55:16.744Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/30/93/138bcdc8fc596add73e37cf2073798f285284d1240bda9ee02f9384fc6be/alibabacloud_tea_openapi-0.4.4.tar.gz", hash = "sha256:1b0917bc03cd49417da64945e92731716d53e2eb8707b235f54e45b7473221ce", size = 21960, upload-time = "2026-03-26T10:16:16.792Z" }
wheels = [
-    { url = "https://files.pythonhosted.org/packages/a5/37/48ee5468ecad19c6d44cf3b9629d77078e836ee3ec760f0366247f307b7c/alibabacloud_tea_openapi-0.4.3-py3-none-any.whl", hash = "sha256:d0b3a373b760ef6278b25fc128c73284301e07888977bf97519e7636d47bdf0a", size = 26159, upload-time = "2026-01-15T07:55:15.72Z" },
+    { url = "https://files.pythonhosted.org/packages/f5/5a/6bfc4506438c1809c486f66217ad11eab78157192b3d5707b4e2f4212f6c/alibabacloud_tea_openapi-0.4.4-py3-none-any.whl", hash = "sha256:cea6bc1fe35b0319a8752cb99eb0ecb0dab7ca1a71b99c12970ba0867410995f", size = 26236, upload-time = "2026-03-26T10:16:15.861Z" },
]
[[package]]
@@ -1308,43 +1308,47 @@ wheels = [
[[package]]
name = "cryptography"
-version = "44.0.3"
+version = "46.0.6"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "cffi", marker = "platform_python_implementation != 'PyPy'" },
]
-sdist = { url = "https://files.pythonhosted.org/packages/53/d6/1411ab4d6108ab167d06254c5be517681f1e331f90edf1379895bcb87020/cryptography-44.0.3.tar.gz", hash = "sha256:fe19d8bc5536a91a24a8133328880a41831b6c5df54599a8417b62fe015d3053", size = 711096, upload-time = "2025-05-02T19:36:04.667Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/a4/ba/04b1bd4218cbc58dc90ce967106d51582371b898690f3ae0402876cc4f34/cryptography-46.0.6.tar.gz", hash = "sha256:27550628a518c5c6c903d84f637fbecf287f6cb9ced3804838a1295dc1fd0759", size = 750542, upload-time = "2026-03-25T23:34:53.396Z" }
wheels = [
-    { url = "https://files.pythonhosted.org/packages/08/53/c776d80e9d26441bb3868457909b4e74dd9ccabd182e10b2b0ae7a07e265/cryptography-44.0.3-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:962bc30480a08d133e631e8dfd4783ab71cc9e33d5d7c1e192f0b7c06397bb88", size = 6670281, upload-time = "2025-05-02T19:34:50.665Z" },
-    { url = "https://files.pythonhosted.org/packages/6a/06/af2cf8d56ef87c77319e9086601bef621bedf40f6f59069e1b6d1ec498c5/cryptography-44.0.3-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4ffc61e8f3bf5b60346d89cd3d37231019c17a081208dfbbd6e1605ba03fa137", size = 3959305, upload-time = "2025-05-02T19:34:53.042Z" },
-    { url = "https://files.pythonhosted.org/packages/ae/01/80de3bec64627207d030f47bf3536889efee8913cd363e78ca9a09b13c8e/cryptography-44.0.3-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:58968d331425a6f9eedcee087f77fd3c927c88f55368f43ff7e0a19891f2642c", size = 4171040, upload-time = "2025-05-02T19:34:54.675Z" },
-    { url = "https://files.pythonhosted.org/packages/bd/48/bb16b7541d207a19d9ae8b541c70037a05e473ddc72ccb1386524d4f023c/cryptography-44.0.3-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:e28d62e59a4dbd1d22e747f57d4f00c459af22181f0b2f787ea83f5a876d7c76", size = 3963411, upload-time = "2025-05-02T19:34:56.61Z" },
-    { url = "https://files.pythonhosted.org/packages/42/b2/7d31f2af5591d217d71d37d044ef5412945a8a8e98d5a2a8ae4fd9cd4489/cryptography-44.0.3-cp37-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:af653022a0c25ef2e3ffb2c673a50e5a0d02fecc41608f4954176f1933b12359", size = 3689263, upload-time = "2025-05-02T19:34:58.591Z" },
-    { url = "https://files.pythonhosted.org/packages/25/50/c0dfb9d87ae88ccc01aad8eb93e23cfbcea6a6a106a9b63a7b14c1f93c75/cryptography-44.0.3-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:157f1f3b8d941c2bd8f3ffee0af9b049c9665c39d3da9db2dc338feca5e98a43", size = 4196198, upload-time = "2025-05-02T19:35:00.988Z" },
-    { url = "https://files.pythonhosted.org/packages/66/c9/55c6b8794a74da652690c898cb43906310a3e4e4f6ee0b5f8b3b3e70c441/cryptography-44.0.3-cp37-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:c6cd67722619e4d55fdb42ead64ed8843d64638e9c07f4011163e46bc512cf01", size = 3966502, upload-time = "2025-05-02T19:35:03.091Z" },
-    { url = "https://files.pythonhosted.org/packages/b6/f7/7cb5488c682ca59a02a32ec5f975074084db4c983f849d47b7b67cc8697a/cryptography-44.0.3-cp37-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:b424563394c369a804ecbee9b06dfb34997f19d00b3518e39f83a5642618397d", size = 4196173, upload-time = "2025-05-02T19:35:05.018Z" },
-    { url = "https://files.pythonhosted.org/packages/d2/0b/2f789a8403ae089b0b121f8f54f4a3e5228df756e2146efdf4a09a3d5083/cryptography-44.0.3-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:c91fc8e8fd78af553f98bc7f2a1d8db977334e4eea302a4bfd75b9461c2d8904", size = 4087713, upload-time = "2025-05-02T19:35:07.187Z" },
-    { url = "https://files.pythonhosted.org/packages/1d/aa/330c13655f1af398fc154089295cf259252f0ba5df93b4bc9d9c7d7f843e/cryptography-44.0.3-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:25cd194c39fa5a0aa4169125ee27d1172097857b27109a45fadc59653ec06f44", size = 4299064, upload-time = "2025-05-02T19:35:08.879Z" },
-    { url = "https://files.pythonhosted.org/packages/10/a8/8c540a421b44fd267a7d58a1fd5f072a552d72204a3f08194f98889de76d/cryptography-44.0.3-cp37-abi3-win32.whl", hash = "sha256:3be3f649d91cb182c3a6bd336de8b61a0a71965bd13d1a04a0e15b39c3d5809d", size = 2773887, upload-time = "2025-05-02T19:35:10.41Z" },
-    { url = "https://files.pythonhosted.org/packages/b9/0d/c4b1657c39ead18d76bbd122da86bd95bdc4095413460d09544000a17d56/cryptography-44.0.3-cp37-abi3-win_amd64.whl", hash = "sha256:3883076d5c4cc56dbef0b898a74eb6992fdac29a7b9013870b34efe4ddb39a0d", size = 3209737, upload-time = "2025-05-02T19:35:12.12Z" },
-    { url = "https://files.pythonhosted.org/packages/34/a3/ad08e0bcc34ad436013458d7528e83ac29910943cea42ad7dd4141a27bbb/cryptography-44.0.3-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:5639c2b16764c6f76eedf722dbad9a0914960d3489c0cc38694ddf9464f1bb2f", size = 6673501, upload-time = "2025-05-02T19:35:13.775Z" },
-    { url = "https://files.pythonhosted.org/packages/b1/f0/7491d44bba8d28b464a5bc8cc709f25a51e3eac54c0a4444cf2473a57c37/cryptography-44.0.3-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f3ffef566ac88f75967d7abd852ed5f182da252d23fac11b4766da3957766759", size = 3960307, upload-time = "2025-05-02T19:35:15.917Z" },
-    { url = "https://files.pythonhosted.org/packages/f7/c8/e5c5d0e1364d3346a5747cdcd7ecbb23ca87e6dea4f942a44e88be349f06/cryptography-44.0.3-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:192ed30fac1728f7587c6f4613c29c584abdc565d7417c13904708db10206645", size = 4170876, upload-time = "2025-05-02T19:35:18.138Z" },
-    { url = "https://files.pythonhosted.org/packages/73/96/025cb26fc351d8c7d3a1c44e20cf9a01e9f7cf740353c9c7a17072e4b264/cryptography-44.0.3-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:7d5fe7195c27c32a64955740b949070f21cba664604291c298518d2e255931d2", size = 3964127, upload-time = "2025-05-02T19:35:19.864Z" },
-    { url = "https://files.pythonhosted.org/packages/01/44/eb6522db7d9f84e8833ba3bf63313f8e257729cf3a8917379473fcfd6601/cryptography-44.0.3-cp39-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:3f07943aa4d7dad689e3bb1638ddc4944cc5e0921e3c227486daae0e31a05e54", size = 3689164, upload-time = "2025-05-02T19:35:21.449Z" },
-    { url = "https://files.pythonhosted.org/packages/68/fb/d61a4defd0d6cee20b1b8a1ea8f5e25007e26aeb413ca53835f0cae2bcd1/cryptography-44.0.3-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:cb90f60e03d563ca2445099edf605c16ed1d5b15182d21831f58460c48bffb93", size = 4198081, upload-time = "2025-05-02T19:35:23.187Z" },
-    { url = "https://files.pythonhosted.org/packages/1b/50/457f6911d36432a8811c3ab8bd5a6090e8d18ce655c22820994913dd06ea/cryptography-44.0.3-cp39-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:ab0b005721cc0039e885ac3503825661bd9810b15d4f374e473f8c89b7d5460c", size = 3967716, upload-time = "2025-05-02T19:35:25.426Z" },
-    { url = "https://files.pythonhosted.org/packages/35/6e/dca39d553075980ccb631955c47b93d87d27f3596da8d48b1ae81463d915/cryptography-44.0.3-cp39-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:3bb0847e6363c037df8f6ede57d88eaf3410ca2267fb12275370a76f85786a6f", size = 4197398, upload-time = "2025-05-02T19:35:27.678Z" },
-    { url = "https://files.pythonhosted.org/packages/9b/9d/d1f2fe681eabc682067c66a74addd46c887ebacf39038ba01f8860338d3d/cryptography-44.0.3-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:b0cc66c74c797e1db750aaa842ad5b8b78e14805a9b5d1348dc603612d3e3ff5", size = 4087900, upload-time = "2025-05-02T19:35:29.312Z" },
-    { url = "https://files.pythonhosted.org/packages/c4/f5/3599e48c5464580b73b236aafb20973b953cd2e7b44c7c2533de1d888446/cryptography-44.0.3-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:6866df152b581f9429020320e5eb9794c8780e90f7ccb021940d7f50ee00ae0b", size = 4301067, upload-time = "2025-05-02T19:35:31.547Z" },
-    { url = "https://files.pythonhosted.org/packages/a7/6c/d2c48c8137eb39d0c193274db5c04a75dab20d2f7c3f81a7dcc3a8897701/cryptography-44.0.3-cp39-abi3-win32.whl", hash = "sha256:c138abae3a12a94c75c10499f1cbae81294a6f983b3af066390adee73f433028", size = 2775467, upload-time = "2025-05-02T19:35:33.805Z" },
-    { url = "https://files.pythonhosted.org/packages/c9/ad/51f212198681ea7b0deaaf8846ee10af99fba4e894f67b353524eab2bbe5/cryptography-44.0.3-cp39-abi3-win_amd64.whl", hash = "sha256:5d186f32e52e66994dce4f766884bcb9c68b8da62d61d9d215bfe5fb56d21334", size = 3210375, upload-time = "2025-05-02T19:35:35.369Z" },
-    { url = "https://files.pythonhosted.org/packages/8d/4b/c11ad0b6c061902de5223892d680e89c06c7c4d606305eb8de56c5427ae6/cryptography-44.0.3-pp311-pypy311_pp73-macosx_10_9_x86_64.whl", hash = "sha256:896530bc9107b226f265effa7ef3f21270f18a2026bc09fed1ebd7b66ddf6375", size = 3390230, upload-time = "2025-05-02T19:35:49.062Z" },
-    { url = "https://files.pythonhosted.org/packages/58/11/0a6bf45d53b9b2290ea3cec30e78b78e6ca29dc101e2e296872a0ffe1335/cryptography-44.0.3-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:9b4d4a5dbee05a2c390bf212e78b99434efec37b17a4bff42f50285c5c8c9647", size = 3895216, upload-time = "2025-05-02T19:35:51.351Z" },
-    { url = "https://files.pythonhosted.org/packages/0a/27/b28cdeb7270e957f0077a2c2bfad1b38f72f1f6d699679f97b816ca33642/cryptography-44.0.3-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:02f55fb4f8b79c1221b0961488eaae21015b69b210e18c386b69de182ebb1259", size = 4115044, upload-time = "2025-05-02T19:35:53.044Z" },
-    { url = "https://files.pythonhosted.org/packages/35/b0/ec4082d3793f03cb248881fecefc26015813199b88f33e3e990a43f79835/cryptography-44.0.3-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:dd3db61b8fe5be220eee484a17233287d0be6932d056cf5738225b9c05ef4fff", size = 3898034, upload-time = "2025-05-02T19:35:54.72Z" },
-    { url = "https://files.pythonhosted.org/packages/0b/7f/adf62e0b8e8d04d50c9a91282a57628c00c54d4ae75e2b02a223bd1f2613/cryptography-44.0.3-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:978631ec51a6bbc0b7e58f23b68a8ce9e5f09721940933e9c217068388789fe5", size = 4114449, upload-time = "2025-05-02T19:35:57.139Z" },
-    { url = "https://files.pythonhosted.org/packages/87/62/d69eb4a8ee231f4bf733a92caf9da13f1c81a44e874b1d4080c25ecbb723/cryptography-44.0.3-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:5d20cc348cca3a8aa7312f42ab953a56e15323800ca3ab0706b8cd452a3a056c", size = 3134369, upload-time = "2025-05-02T19:35:58.907Z" },
+    { url = "https://files.pythonhosted.org/packages/47/23/9285e15e3bc57325b0a72e592921983a701efc1ee8f91c06c5f0235d86d9/cryptography-46.0.6-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:64235194bad039a10bb6d2d930ab3323baaec67e2ce36215fd0952fad0930ca8", size = 7176401, upload-time = "2026-03-25T23:33:22.096Z" },
+    { url = "https://files.pythonhosted.org/packages/60/f8/e61f8f13950ab6195b31913b42d39f0f9afc7d93f76710f299b5ec286ae6/cryptography-46.0.6-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:26031f1e5ca62fcb9d1fcb34b2b60b390d1aacaa15dc8b895a9ed00968b97b30", size = 4275275, upload-time = "2026-03-25T23:33:23.844Z" },
+    { url = "https://files.pythonhosted.org/packages/19/69/732a736d12c2631e140be2348b4ad3d226302df63ef64d30dfdb8db7ad1c/cryptography-46.0.6-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:9a693028b9cbe51b5a1136232ee8f2bc242e4e19d456ded3fa7c86e43c713b4a", size = 4425320, upload-time = "2026-03-25T23:33:25.703Z" },
+    { url = "https://files.pythonhosted.org/packages/d4/12/123be7292674abf76b21ac1fc0e1af50661f0e5b8f0ec8285faac18eb99e/cryptography-46.0.6-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:67177e8a9f421aa2d3a170c3e56eca4e0128883cf52a071a7cbf53297f18b175", size = 4278082, upload-time = "2026-03-25T23:33:27.423Z" },
+    { url = "https://files.pythonhosted.org/packages/5b/ba/d5e27f8d68c24951b0a484924a84c7cdaed7502bac9f18601cd357f8b1d2/cryptography-46.0.6-cp311-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:d9528b535a6c4f8ff37847144b8986a9a143585f0540fbcb1a98115b543aa463", size = 4926514, upload-time = "2026-03-25T23:33:29.206Z" },
+    { url = "https://files.pythonhosted.org/packages/34/71/1ea5a7352ae516d5512d17babe7e1b87d9db5150b21f794b1377eac1edc0/cryptography-46.0.6-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:22259338084d6ae497a19bae5d4c66b7ca1387d3264d1c2c0e72d9e9b6a77b97", size = 4457766, upload-time = "2026-03-25T23:33:30.834Z" },
+    { url = "https://files.pythonhosted.org/packages/01/59/562be1e653accee4fdad92c7a2e88fced26b3fdfce144047519bbebc299e/cryptography-46.0.6-cp311-abi3-manylinux_2_31_armv7l.whl", hash = "sha256:760997a4b950ff00d418398ad73fbc91aa2894b5c1db7ccb45b4f68b42a63b3c", size = 3986535, upload-time = "2026-03-25T23:33:33.02Z" },
+    { url = "https://files.pythonhosted.org/packages/d6/8b/b1ebfeb788bf4624d36e45ed2662b8bd43a05ff62157093c1539c1288a18/cryptography-46.0.6-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:3dfa6567f2e9e4c5dceb8ccb5a708158a2a871052fa75c8b78cb0977063f1507", size = 4277618, upload-time = "2026-03-25T23:33:34.567Z" },
+    { url = "https://files.pythonhosted.org/packages/dd/52/a005f8eabdb28df57c20f84c44d397a755782d6ff6d455f05baa2785bd91/cryptography-46.0.6-cp311-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:cdcd3edcbc5d55757e5f5f3d330dd00007ae463a7e7aa5bf132d1f22a4b62b19", size = 4890802, upload-time = "2026-03-25T23:33:37.034Z" },
+    { url = "https://files.pythonhosted.org/packages/ec/4d/8e7d7245c79c617d08724e2efa397737715ca0ec830ecb3c91e547302555/cryptography-46.0.6-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:d4e4aadb7fc1f88687f47ca20bb7227981b03afaae69287029da08096853b738", size = 4457425, upload-time = "2026-03-25T23:33:38.904Z" },
+    { url = "https://files.pythonhosted.org/packages/1d/5c/f6c3596a1430cec6f949085f0e1a970638d76f81c3ea56d93d564d04c340/cryptography-46.0.6-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:2b417edbe8877cda9022dde3a008e2deb50be9c407eef034aeeb3a8b11d9db3c", size = 4405530, upload-time = "2026-03-25T23:33:40.842Z" },
+    { url = "https://files.pythonhosted.org/packages/7e/c9/9f9cea13ee2dbde070424e0c4f621c091a91ffcc504ffea5e74f0e1daeff/cryptography-46.0.6-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:380343e0653b1c9d7e1f55b52aaa2dbb2fdf2730088d48c43ca1c7c0abb7cc2f", size = 4667896, upload-time = "2026-03-25T23:33:42.781Z" },
+    { url = "https://files.pythonhosted.org/packages/ad/b5/1895bc0821226f129bc74d00eccfc6a5969e2028f8617c09790bf89c185e/cryptography-46.0.6-cp311-abi3-win32.whl", hash = "sha256:bcb87663e1f7b075e48c3be3ecb5f0b46c8fc50b50a97cf264e7f60242dca3f2", size = 3026348, upload-time = "2026-03-25T23:33:45.021Z" },
+    { url = "https://files.pythonhosted.org/packages/c3/f8/c9bcbf0d3e6ad288b9d9aa0b1dee04b063d19e8c4f871855a03ab3a297ab/cryptography-46.0.6-cp311-abi3-win_amd64.whl", hash = "sha256:6739d56300662c468fddb0e5e291f9b4d084bead381667b9e654c7dd81705124", size = 3483896, upload-time = "2026-03-25T23:33:46.649Z" },
+    { url = "https://files.pythonhosted.org/packages/c4/cc/f330e982852403da79008552de9906804568ae9230da8432f7496ce02b71/cryptography-46.0.6-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:12cae594e9473bca1a7aceb90536060643128bb274fcea0fc459ab90f7d1ae7a", size = 7162776, upload-time = "2026-03-25T23:34:13.308Z" },
+    { url = "https://files.pythonhosted.org/packages/49/b3/dc27efd8dcc4bff583b3f01d4a3943cd8b5821777a58b3a6a5f054d61b79/cryptography-46.0.6-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:639301950939d844a9e1c4464d7e07f902fe9a7f6b215bb0d4f28584729935d8", size = 4270529, upload-time = "2026-03-25T23:34:15.019Z" },
+    { url = "https://files.pythonhosted.org/packages/e6/05/e8d0e6eb4f0d83365b3cb0e00eb3c484f7348db0266652ccd84632a3d58d/cryptography-46.0.6-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:ed3775295fb91f70b4027aeba878d79b3e55c0b3e97eaa4de71f8f23a9f2eb77", size = 4414827, upload-time = "2026-03-25T23:34:16.604Z" },
+    { url = "https://files.pythonhosted.org/packages/2f/97/daba0f5d2dc6d855e2dcb70733c812558a7977a55dd4a6722756628c44d1/cryptography-46.0.6-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:8927ccfbe967c7df312ade694f987e7e9e22b2425976ddbf28271d7e58845290", size = 4271265, upload-time = "2026-03-25T23:34:18.586Z" },
+    { url = "https://files.pythonhosted.org/packages/89/06/fe1fce39a37ac452e58d04b43b0855261dac320a2ebf8f5260dd55b201a9/cryptography-46.0.6-cp38-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:b12c6b1e1651e42ab5de8b1e00dc3b6354fdfd778e7fa60541ddacc27cd21410", size = 4916800, upload-time = "2026-03-25T23:34:20.561Z" },
+    { url = "https://files.pythonhosted.org/packages/ff/8a/b14f3101fe9c3592603339eb5d94046c3ce5f7fc76d6512a2d40efd9724e/cryptography-46.0.6-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:063b67749f338ca9c5a0b7fe438a52c25f9526b851e24e6c9310e7195aad3b4d", size = 4448771, upload-time = "2026-03-25T23:34:22.406Z" },
+    { url = "https://files.pythonhosted.org/packages/01/b3/0796998056a66d1973fd52ee89dc1bb3b6581960a91ad4ac705f182d398f/cryptography-46.0.6-cp38-abi3-manylinux_2_31_armv7l.whl", hash = "sha256:02fad249cb0e090b574e30b276a3da6a149e04ee2f049725b1f69e7b8351ec70", size = 3978333, upload-time = "2026-03-25T23:34:24.281Z" },
+    { url = "https://files.pythonhosted.org/packages/c5/3d/db200af5a4ffd08918cd55c08399dc6c9c50b0bc72c00a3246e099d3a849/cryptography-46.0.6-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:7e6142674f2a9291463e5e150090b95a8519b2fb6e6aaec8917dd8d094ce750d", size = 4271069, upload-time = "2026-03-25T23:34:25.895Z" },
+    { url = "https://files.pythonhosted.org/packages/d7/18/61acfd5b414309d74ee838be321c636fe71815436f53c9f0334bf19064fa/cryptography-46.0.6-cp38-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:456b3215172aeefb9284550b162801d62f5f264a081049a3e94307fe20792cfa", size = 4878358, upload-time = "2026-03-25T23:34:27.67Z" },
+    { url = "https://files.pythonhosted.org/packages/8b/65/5bf43286d566f8171917cae23ac6add941654ccf085d739195a4eacf1674/cryptography-46.0.6-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:341359d6c9e68834e204ceaf25936dffeafea3829ab80e9503860dcc4f4dac58", size = 4448061, upload-time = "2026-03-25T23:34:29.375Z" },
+    { url = "https://files.pythonhosted.org/packages/e0/25/7e49c0fa7205cf3597e525d156a6bce5b5c9de1fd7e8cb01120e459f205a/cryptography-46.0.6-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:9a9c42a2723999a710445bc0d974e345c32adfd8d2fac6d8a251fa829ad31cfb", size = 4399103, upload-time = "2026-03-25T23:34:32.036Z" },
+    { url = "https://files.pythonhosted.org/packages/44/46/466269e833f1c4718d6cd496ffe20c56c9c8d013486ff66b4f69c302a68d/cryptography-46.0.6-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:6617f67b1606dfd9fe4dbfa354a9508d4a6d37afe30306fe6c101b7ce3274b72", size = 4659255, upload-time = "2026-03-25T23:34:33.679Z" },
+    { url = "https://files.pythonhosted.org/packages/0a/09/ddc5f630cc32287d2c953fc5d32705e63ec73e37308e5120955316f53827/cryptography-46.0.6-cp38-abi3-win32.whl", hash = "sha256:7f6690b6c55e9c5332c0b59b9c8a3fb232ebf059094c17f9019a51e9827df91c", size = 3010660, upload-time = "2026-03-25T23:34:35.418Z" },
+    { url = "https://files.pythonhosted.org/packages/1b/82/ca4893968aeb2709aacfb57a30dec6fa2ab25b10fa9f064b8882ce33f599/cryptography-46.0.6-cp38-abi3-win_amd64.whl", hash = "sha256:79e865c642cfc5c0b3eb12af83c35c5aeff4fa5c672dc28c43721c2c9fdd2f0f", size = 3471160, upload-time = "2026-03-25T23:34:37.191Z" },
+    { url = "https://files.pythonhosted.org/packages/2e/84/7ccff00ced5bac74b775ce0beb7d1be4e8637536b522b5df9b73ada42da2/cryptography-46.0.6-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:2ea0f37e9a9cf0df2952893ad145fd9627d326a59daec9b0802480fa3bcd2ead", size = 3475444, upload-time = "2026-03-25T23:34:38.944Z" },
+    { url = "https://files.pythonhosted.org/packages/bc/1f/4c926f50df7749f000f20eede0c896769509895e2648db5da0ed55db711d/cryptography-46.0.6-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:a3e84d5ec9ba01f8fd03802b2147ba77f0c8f2617b2aff254cedd551844209c8", size = 4218227, upload-time = "2026-03-25T23:34:40.871Z" },
+    { url = "https://files.pythonhosted.org/packages/c6/65/707be3ffbd5f786028665c3223e86e11c4cda86023adbc56bd72b1b6bab5/cryptography-46.0.6-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:12f0fa16cc247b13c43d56d7b35287ff1569b5b1f4c5e87e92cc4fcc00cd10c0", size = 4381399, upload-time = "2026-03-25T23:34:42.609Z" },
+    { url = "https://files.pythonhosted.org/packages/f3/6d/73557ed0ef7d73d04d9aba745d2c8e95218213687ee5e76b7d236a5030fc/cryptography-46.0.6-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:50575a76e2951fe7dbd1f56d181f8c5ceeeb075e9ff88e7ad997d2f42af06e7b", size = 4217595, upload-time = "2026-03-25T23:34:44.205Z" },
+    { url = "https://files.pythonhosted.org/packages/9e/c5/e1594c4eec66a567c3ac4400008108a415808be2ce13dcb9a9045c92f1a0/cryptography-46.0.6-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:90e5f0a7b3be5f40c3a0a0eafb32c681d8d2c181fc2a1bdabe9b3f611d9f6b1a", size = 4380912, upload-time = "2026-03-25T23:34:46.328Z" },
+    { url = "https://files.pythonhosted.org/packages/1a/89/843b53614b47f97fe1abc13f9a86efa5ec9e275292c457af1d4a60dc80e0/cryptography-46.0.6-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:6728c49e3b2c180ef26f8e9f0a883a2c585638db64cf265b49c9ba10652d430e", size = 3409955, upload-time = "2026-03-25T23:34:48.465Z" },
]
[[package]]