Compare commits

..

394 Commits

Author SHA1 Message Date
-LAN-
1cf788c43b Merge branch 'main' into feat/queue-based-graph-engine 2025-09-17 12:46:08 +08:00
-LAN-
73a7756350 feat(graph_engine): allow to dumps and loads RSC 2025-09-17 12:45:51 +08:00
-LAN-
02d15ebd5a feat(graph_engine): support dumps and loads in GraphExecution 2025-09-16 19:38:10 +08:00
-LAN-
b5a7e64e19 Fix incorrect API endpoint routing from PR #25628 (#25778)
2025-09-16 19:20:26 +08:00
Jiang
b283b10d3e Fix/lindorm vdb optimize (#25748)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-16 16:54:18 +08:00
-LAN-
976b3b5e83 Merge branch 'main' into feat/queue-based-graph-engine 2025-09-16 15:21:36 +08:00
-LAN-
ecb22226d6 refactor: remove Claude-specific references from documentation files (#25760) 2025-09-16 14:22:14 +08:00
Xiyuan Chen
8635aacb46 Enhance LLM model configuration validation to include active status c… (#25759)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-15 23:15:53 -07:00
-LAN-
b5684f1992 refactor(graph_engine): remove unused parameters from Engine 2025-09-16 14:11:42 +08:00
-LAN-
bd13cf05eb Merge branch 'main' into feat/queue-based-graph-engine 2025-09-16 12:59:26 +08:00
Asuka Minato
bdd85b36a4 ruff check preview (#25653)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-16 12:58:12 +08:00
znn
a0c7713494 chat remove transparency from chat bubble in dark mode (#24921) 2025-09-16 12:57:53 +08:00
-LAN-
5f263147f9 fix: make mypy happy 2025-09-16 12:51:11 +08:00
-LAN-
b68afdfa64 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-16 12:32:16 +08:00
NeatGuyCoding
abf4955c26 Feature: add test containers document indexing task (#25684)
Signed-off-by: NeatGuyCoding <15627489+NeatGuyCoding@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-16 09:47:28 +08:00
miwa
74340e3c04 Bugfix: When i change the loop variable, 'Loop Termination Condition' wi… (#25695)
Co-authored-by: fengminhua <fengminhua@52tt.com>
2025-09-16 09:46:44 +08:00
-LAN-
b98b389baf fix(tests): resolve order dependency in disable_segments_from_index_task tests (#25737) 2025-09-16 08:26:52 +08:00
-LAN-
da87fce751 feat(graph_engine): dump and load ready queue 2025-09-16 04:19:46 +08:00
-LAN-
d5342927d0 chore: change _outputs type to dict[str, object] 2025-09-16 01:53:25 +08:00
github-actions[bot]
877806c34d chore: translate i18n files and update type definitions (#25713)
Co-authored-by: GarfieldDai <28395549+GarfieldDai@users.noreply.github.com>
2025-09-15 21:22:57 +08:00
湛露先生
0bbf4fb66a correct typos . (#25717)
Signed-off-by: zhanluxianshen <zhanluxianshen@163.com>
2025-09-15 21:22:40 +08:00
chengjoey
169ce71e59 fix(web): custom-tool output_schema.properties missing type (#25731)
Co-authored-by: joeyczheng <joeyczheng@tencent.com>
2025-09-15 21:21:25 +08:00
quicksand
bdbe078630 fix(mcp): prevent masked headers from overwriting real values (#25722) 2025-09-15 19:24:12 +08:00
autofix-ci[bot]
754d790c89 [autofix.ci] apply automated fixes (attempt 2/3) 2025-09-15 07:58:44 +00:00
autofix-ci[bot]
a099a35e51 [autofix.ci] apply automated fixes 2025-09-15 07:56:51 +00:00
-LAN-
2dd893e60d Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-15 15:54:42 +08:00
Garfield Dai
88d5e27fe8 Release/e-1.8.1 (#25613)
Co-authored-by: zxhlyh <jasonapring2015@outlook.com>
Co-authored-by: GareArc <chen4851@purdue.edu>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: hjlarry <hjlarry@163.com>
2025-09-15 14:49:23 +08:00
-LAN-
bb5b8d2902 fix: resolve devalue prototype pollution vulnerability (#25709) 2025-09-15 13:26:36 +08:00
-LAN-
bab4975809 chore: add ast-grep rule to convert Optional[T] to T | None (#25560)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-15 13:06:33 +08:00
-LAN-
b8ee1d4697 Merge branch 'main' into feat/queue-based-graph-engine 2025-09-15 12:21:18 +08:00
dependabot[bot]
2e44ebe98d chore(deps): bump @lexical/text from 0.30.0 to 0.35.0 in /web (#25705)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-15 12:55:48 +09:00
dependabot[bot]
a1961cc37a chore(deps-dev): bump @next/bundle-analyzer from 15.5.0 to 15.5.3 in /web (#25704)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-15 12:55:17 +09:00
dependabot[bot]
727e1d3743 chore(deps): bump scheduler from 0.23.2 to 0.26.0 in /web (#25699)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-15 12:51:47 +09:00
dependabot[bot]
4e3b16c5f4 chore(deps-dev): bump sass from 1.89.2 to 1.92.1 in /web (#25698)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-15 12:50:49 +09:00
dependabot[bot]
6c36bf28d7 chore(deps): bump clickzetta-connector-python from 0.8.102 to 0.8.104 in /api (#25697)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-15 12:50:12 +09:00
dependabot[bot]
5548b22fe7 chore(deps): bump transformers from 4.53.3 to 4.56.1 in /api (#25696)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-15 12:49:26 +09:00
Asuka Minato
03664d6b51 dependabot (#25677) 2025-09-15 10:59:34 +08:00
Guangdong Liu
07d383ffaa refactor: update API routes and documentation for app and datasets endpoints (#25628) 2025-09-15 10:59:11 +08:00
Joel
9bb7bcf52e feat: user message support generate prompt (#25689) 2025-09-15 10:17:19 +08:00
Ritoban Dutta
67a686cf98 [Chore/Refactor] use __all__ to specify export member. (#25681) 2025-09-15 09:45:35 +08:00
ChasePassion
a3f2c05632 optimize _merge_splits function by using enumerate instead of manual index tracking (#25680)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-09-15 09:41:16 +08:00
-LAN-
b4ef1de30f feat(graph_engine): add ready_queue state persistence to GraphRuntimeState
- Add ReadyQueueState TypedDict for type-safe queue serialization
- Add ready_queue attribute to GraphRuntimeState for initializing with pre-existing queue state
- Update GraphEngine to load ready_queue from GraphRuntimeState on initialization
- Implement proper type hints using ReadyQueueState for better type safety
- Add comprehensive tests for ready_queue loading functionality

The ready_queue is read-only after initialization and allows resuming workflow
execution with a pre-populated queue of nodes ready to execute.
2025-09-15 03:05:10 +08:00
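The commit above introduces a ReadyQueueState TypedDict and a read-only ready_queue attribute on GraphRuntimeState. The actual fields are not reproduced in this log; a minimal sketch of the described pattern, with hypothetical field names, might look like this:

```python
from typing import TypedDict


class ReadyQueueState(TypedDict):
    """Hypothetical serialized form of the ready queue (field names are illustrative)."""
    type: str          # which queue implementation produced the snapshot
    items: list[str]   # IDs of nodes that are ready to execute


class GraphRuntimeState:
    """Sketch: the queue snapshot is accepted at construction and read-only afterwards."""

    def __init__(self, ready_queue: ReadyQueueState | None = None) -> None:
        self._ready_queue = ready_queue

    @property
    def ready_queue(self) -> ReadyQueueState | None:
        return self._ready_queue


# Resuming a run from a previously dumped queue state
state = GraphRuntimeState(ready_queue={"type": "fifo", "items": ["node_a", "node_b"]})
assert state.ready_queue is not None and state.ready_queue["items"] == ["node_a", "node_b"]
```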
lyzno1
efcf052004 chore: bump pnpm version to v10.16.0 (#25640) 2025-09-14 18:44:35 +08:00
Timo
9234a2293d improve type hints using typing.Literal and add type annotations (#25641)
Co-authored-by: EchterTimo <EchterTimo@users.noreply.github.com>
2025-09-14 18:44:23 +08:00
Guangdong Liu
7a626747cf bugfix: The randomly generated email by Faker actually corresponded to an existing account in the test database, causing the test to fail. (#25646)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-09-14 18:41:35 +08:00
github-actions[bot]
db01cbb63d chore: translate i18n files and update type definitions (#25645)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-14 18:41:15 +08:00
Asuka Minato
4f868275a9 example for __all__ (#25666) 2025-09-14 18:40:06 +08:00
-LAN-
ed20d14d01 feat: enhance Makefile with code quality commands and default help (#25655) 2025-09-14 18:39:42 +08:00
NeatGuyCoding
0add1af1c8 feat: add test containers based tests for disable segments from index task (#25660) 2025-09-14 14:12:52 +08:00
yo
5c50c3aa70 fix: allow empty values in Variable Inspector (#25644) 2025-09-14 14:10:12 +08:00
lyzno1
9e7328abfb feat: add circular scrolling to GotoAnything command menu (#25662) 2025-09-14 14:07:10 +08:00
autofix-ci[bot]
0f15a2baca [autofix.ci] apply automated fixes 2025-09-13 20:20:53 +00:00
-LAN-
4cdc19fd05 feat(graph_engine): add abstract layer and dump / load methods for ready queue. 2025-09-14 04:19:24 +08:00
-LAN-
efa5f35277 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-14 01:48:06 +08:00
Yongtao Huang
188eb838c5 [Test] speed up Hypothesis strategies to avoid too_slow (#25623)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-13 21:05:19 +08:00
lyzno1
36ab9974d2 fix: Multiple UX improvements for GotoAnything command palette (#25637) 2025-09-13 21:03:42 +08:00
-LAN-
766fda395b Merge branch 'main' into feat/queue-based-graph-engine 2025-09-13 19:37:52 +08:00
NeatGuyCoding
a825f0f2b2 Feature add test containers disable segment from index task (#25631)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-13 14:28:10 +08:00
-LAN-
b0e815c3c7 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-13 01:31:17 +08:00
-LAN-
1b0f92a331 feat(stress-test): add comprehensive stress testing suite using Locust (#25617)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-12 22:25:05 +08:00
Krito.
a13d7987e0 chore: adopt StrEnum and auto() for some string-typed enums (#25129)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-12 21:14:26 +08:00
17hz
635e7d3e70 fix: Cannot modify values when startNode has defaultValue (#25595) 2025-09-12 21:11:24 +08:00
chengjoey
c78ef79995 fix close button cannot be clicked when the browser page is zoomed out (#25584)
Co-authored-by: joeyczheng <joeyczheng@tencent.com>
2025-09-12 21:11:00 +08:00
Tianyi Jing
c3f9a7ed9b feat: add type integer to VarType (#25500)
Signed-off-by: jingfelix <jingfelix@outlook.com>
2025-09-12 21:09:41 +08:00
kenwoodjw
c91253d05d fix segment deletion race condition (#24408)
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-12 15:29:57 +08:00
Guangdong Liu
285291f545 refactor: update API routes and documentation for console endpoints (#25554)
2025-09-12 11:51:24 +08:00
JQSevenMiao
c0e1015c6e fix: filter temporary edges from workflow draft sync (#25442)
Co-authored-by: jiasiqi <jiasiqi3@tal.com>
2025-09-12 11:19:57 +08:00
github-actions[bot]
12d1bcc545 chore: translate i18n files and update type definitions (#25575)
Co-authored-by: iamjoel <2120155+iamjoel@users.noreply.github.com>
2025-09-12 10:39:38 +08:00
Yeuoly
ec808f3fe8 refactor: centralize default end user session ID constant (#25416)
This PR refactors the handling of the default end user session ID by centralizing it as an enum in the models module where the `EndUser` model is defined. This improves code organization and makes the relationship between the constant and the model clearer.

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-12 10:27:16 +08:00
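The PR description above says the default end-user session ID is centralized as an enum next to the EndUser model. A minimal sketch of that pattern, with illustrative names that may differ from the actual Dify definitions:

```python
from enum import StrEnum


class DefaultEndUserSessionID(StrEnum):
    """Hypothetical enum colocated with the EndUser model; member name and value are illustrative."""
    DEFAULT = "DEFAULT-USER"


def resolve_session_id(session_id: str | None) -> str:
    # Fall back to the centralized constant instead of a scattered string literal.
    return session_id or DefaultEndUserSessionID.DEFAULT


assert resolve_session_id(None) == "DEFAULT-USER"
```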
Joel
394b0ac9c0 fix: login security issue frontend (#25571)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-09-12 10:25:06 +08:00
zyssyz123
c2fcd2895b Feat/email register refactor (#25369)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Joel <iamjoel007@gmail.com>
2025-09-12 10:24:54 +08:00
Ganondorf
bb1514be2d Force update search method to keyword_search (#25464)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-12 10:12:25 +08:00
Kurumi1997
8ffb9b6aed fix: Support passing the default app mode when creating an app (#25142)
Co-authored-by: 王博 <wangbo@localhost.com>
2025-09-12 10:06:07 +08:00
Matri Qi
33afa7c84a Fix/disable no unsafe optional chaining (#25553) 2025-09-12 10:03:34 +08:00
L
69aad38d03 fix(date-picker): handle string date to avoid crash (#25522)
Co-authored-by: 刘佳佳 <liujiajia@nanjingwanhui.com>
Co-authored-by: crazywoola <427733928@qq.com>
2025-09-12 10:01:26 +08:00
Novice
17b5309e47 fix: single step system file error (#25533)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-12 09:47:45 +08:00
Asuka Minato
05af23f88f use autospec=True in mock (#25497) 2025-09-12 09:46:02 +08:00
Yongtao Huang
4511f4f537 Remove redundant parse_args call in WorkflowByIdApi.patch (#25498) 2025-09-12 09:40:41 +08:00
dependabot[bot]
bdacc4da36 chore(deps): bump mermaid from 11.4.1 to 11.10.0 in /web (#25521)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-12 09:40:18 +08:00
15
1a078657d8 Fixes #25530 (#25531) 2025-09-12 09:39:17 +08:00
Asuka Minato
77ba3e8f26 add autofix pnpm (#25557)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-09-12 09:37:54 +08:00
Wu Tianwei
84e3571ec3 fix: delete get upload file endpoint (#25543)
Co-authored-by: jyong <718720800@qq.com>
2025-09-12 09:36:53 +08:00
NeatGuyCoding
de18b14372 feat: add test containers based tests for delete segment from index task (#25564) 2025-09-12 09:33:39 +08:00
Yongtao Huang
a1322ddb5d Fix: correct has_more pagination logic in get_conversational_variable (#25484)
Signed-off-by: Yongtao Huang<yongtaoh2022@gmail.com>
2025-09-12 09:32:22 +08:00
GuanMu
c7868fb176 test: remove print code (#25481)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-12 09:30:56 +08:00
椰子糖
4b6687db6b Fix log time display bug (#25475)
Co-authored-by: wxliqigang <wxliqigang@gfpartner.com.cn>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-09-12 02:46:04 +09:00
-LAN-
462ba354a4 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-12 00:21:06 +08:00
JeeekXY
f1d5bc58b0 fix: app name overflow (#25551)
Co-authored-by: luxiaoyu1 <luxiaoyu1@xiaomi.com>
2025-09-11 21:19:55 +08:00
NeatGuyCoding
99f4cd1cfa feat: add test containers based tests for deal dataset vector index (#25545) 2025-09-11 21:12:53 +08:00
-LAN-
3c668e4a5c fix: update test assertions for ToolProviderApiEntity validation
- Fixed test_repack_provider_entity_no_dark_icon to use empty string instead of None for icon_dark field
- Updated test_builtin_provider_to_user_provider_no_credentials assertion to match actual implementation behavior where masked_credentials always contains empty strings for schema fields
2025-09-11 16:41:10 +08:00
-LAN-
872cff7bab chore(iteration_node): convert some Any to object
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-11 15:40:12 +08:00
-LAN-
8fb69429f9 feat(graph_engine): support parallel mode in iteration node
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-11 15:37:46 +08:00
-LAN-
85064bd8cf Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-11 15:13:31 +08:00
-LAN-
ba5df3612b fix: tests
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-11 15:13:18 +08:00
-LAN-
a923ab1ab8 fix: type errors
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-11 15:01:16 +08:00
QuantumGhost
874406d934 security(api): fix privilege escalation vulnerability in model config and chat message APIs (#25518)
The `ChatMessageApi` (`POST /console/api/apps/{app_id}/chat-messages`) and 
`ModelConfigResource` (`POST /console/api/apps/{app_id}/model-config`) 
endpoints do not properly validate user permissions, allowing users without `editor` 
permission to access restricted functionality.

This PR addresses this issue by adding proper permission check.
2025-09-11 14:53:35 +08:00
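The PR body above states that the fix adds a proper permission check to these console endpoints. As a rough illustration only (Dify's actual decorators and user model are not shown in this log), an editor-permission guard on a Flask view might look like:

```python
from functools import wraps

from flask import abort
from flask_login import current_user  # assumes the app uses Flask-Login


def editor_required(view):
    """Hypothetical guard: reject users without editor permission before the view runs."""

    @wraps(view)
    def wrapper(*args, **kwargs):
        if not getattr(current_user, "has_edit_permission", False):
            abort(403)
        return view(*args, **kwargs)

    return wrapper


@editor_required
def post_model_config(app_id: str):
    # ... validate and persist the model config for app_id ...
    return {"result": "success"}
```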
Nite Knite
07d067d828 chore: support Zendesk widget (#25517)
2025-09-11 13:17:50 +08:00
Xiyuan Chen
af7f67dc9c Feat/enteprise cd (#25508) 2025-09-10 20:53:42 -07:00
Xiyuan Chen
34e55028ae Feat/enteprise cd (#25485) 2025-09-10 19:01:32 -07:00
-LAN-
b4c1766932 fix: type errors
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-10 21:48:05 +08:00
-LAN-
00a1af8506 refactor(graph_engine): use singledispatch in Node
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-10 20:59:34 +08:00
Eric Guo
70e4d6be34 Fix 500 in dataset page. (#25474)
2025-09-10 15:57:04 +08:00
Wu Tianwei
b690ac4e2a fix: Remove sticky positioning from workflow component fields (#25470) 2025-09-10 15:17:49 +08:00
quicksand
f56fccee9d fix: workflow knowledge query raise error (#25465) 2025-09-10 13:47:47 +08:00
Asuka Minato
cbc0e639e4 update sql in batch (#24801)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2025-09-10 13:00:17 +08:00
Guangdong Liu
b51c724a94 refactor: Migrate part of the console basic API module to Flask-RESTX (#24732)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2025-09-10 12:15:47 +08:00
GuanMu
26a9abef64 test: imporve (#25461)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-09-10 11:36:22 +08:00
Will
fecdb9554d fix: inner_api get_user_tenant (#25462) 2025-09-10 11:31:16 +08:00
NeatGuyCoding
45ef177809 Feature add test containers create segment to index task (#25450)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-10 10:02:53 +08:00
Newton José
6574e9f0b2 Fix: Add Password Validation to Account Creation (#25382) 2025-09-10 08:58:39 +08:00
Asuka Minato
cce13750ad add rule for strenum (#25445) 2025-09-10 08:51:21 +08:00
17hz
928bef9d82 fix: imporve the condition for stopping the think timer. (#25365) 2025-09-10 08:45:00 +08:00
-LAN-
b6b98a2c8e Merge branch 'feat/dispatch-method' into feat/queue-based-graph-engine 2025-09-10 03:12:59 +08:00
-LAN-
7e69403dda refactor(graph_engine): use singledispatchmethod in event_handler
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-10 03:12:33 +08:00
-LAN-
9796cede72 fix: add missing type field to node configurations in integration tests
- Added 'type' field to all node data configurations in test files
- Fixed test_code.py: added 'type: code' to all code node configs
- Fixed test_http.py: added 'type: http-request' to all HTTP node configs
- Fixed test_template_transform.py: added 'type: template-transform' to template node config
- Fixed test_tool.py: added 'type: tool' to all tool node configs
- Added setup_code_executor_mock fixture to test_execute_code_scientific_notation

These changes fix the ValueError: 'Node X missing or invalid type information' errors
that were occurring due to changes in the node factory validation requirements.
2025-09-10 02:54:01 +08:00
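The fix described above adds a type field to every node data configuration used in the integration tests so the node factory can validate it. A minimal illustration of the difference (the dict structure here is illustrative, not the exact test fixture):

```python
# Before: the node factory raises
# ValueError: "Node X missing or invalid type information"
broken_code_node = {
    "title": "Run code",
    "code": "def main():\n    return {'result': 1}",
}

# After: the node declares its type explicitly
fixed_code_node = {
    "type": "code",  # required by the node factory validation
    "title": "Run code",
    "code": "def main():\n    return {'result': 1}",
}
```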
-LAN-
836ed1f380 refactor(graph_engine): Move ErrorHandler into a single file package
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-10 02:35:05 +08:00
-LAN-
80f39963f1 chore: add import lint to CI
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-10 02:32:24 +08:00
-LAN-
9cf2b2b231 fix: type errors
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-10 02:22:58 +08:00
-LAN-
2a97a69825 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-10 02:03:45 +08:00
-LAN-
f17c71e08a refactor(graph_engine): Move GraphStateManager to single file package.
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-10 01:55:30 +08:00
-LAN-
08dd3f7b50 Fix basedpyright type errors (#25435)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-10 01:54:26 +08:00
-LAN-
d52621fce3 refactor(graph_engine): Merge error strategies into error_handler.py
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-10 01:49:46 +08:00
-LAN-
e060d7c28c refactor(graph_engine): remove Optional
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-10 01:49:15 +08:00
-LAN-
ea5dfe41d5 chore: ignore comment
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-10 01:36:11 +08:00
-LAN-
a23c8fcb1a refactor: move execution limits from engine core to layer
Remove max_execution_time and max_execution_steps from ExecutionContext and GraphEngine since these limits are now handled by ExecutionLimitsLayer. This follows the separation of concerns principle by keeping execution limits as a cross-cutting concern handled by layers rather than embedded in core engine components.

Changes:
- Remove max_execution_time and max_execution_steps from ExecutionContext
- Remove these parameters from GraphEngine.__init__()
- Remove max_execution_time from Dispatcher
- Update workflow_entry.py to no longer pass these parameters
- Update all tests to remove these parameters
2025-09-10 01:32:45 +08:00
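The commit above moves max-step and max-time enforcement out of the engine core into ExecutionLimitsLayer. The layer API itself is not reproduced in this log; a sketch of the general idea, with hypothetical hook names, might be:

```python
import time


class ExecutionLimitsLayer:
    """Hypothetical cross-cutting layer: stops a run once a step or time budget is exceeded."""

    def __init__(self, max_steps: int, max_seconds: float) -> None:
        self.max_steps = max_steps
        self.max_seconds = max_seconds
        self._steps = 0
        self._started_at = 0.0

    def on_run_start(self) -> None:
        self._steps = 0
        self._started_at = time.monotonic()

    def on_node_finished(self) -> None:
        self._steps += 1
        if self._steps > self.max_steps:
            raise RuntimeError("max execution steps exceeded")
        if time.monotonic() - self._started_at > self.max_seconds:
            raise RuntimeError("max execution time exceeded")
```

Keeping the limits in a layer rather than in ExecutionContext or GraphEngine means the core engine stays unaware of the budgets and callers opt in per run.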
-LAN-
e0e82fbfaa refactor: extract _run method into smaller focused methods in IterationNode
- Extract iterator variable retrieval and validation logic
- Separate empty iteration handling
- Create dedicated methods for iteration execution and result handling
- Improve type hints and use modern Python syntax
- Enhance code readability and maintainability
2025-09-10 01:15:36 +08:00
-LAN-
1c9f40f92a Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-09 22:16:59 +08:00
-LAN-
6ffa2ebabf feat: improve error handling in graph node creation
- Replace ValueError catch with generic Exception
- Use logger.exception for automatic traceback logging
- Abort on node creation failure instead of continuing
2025-09-09 22:16:42 +08:00
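The three bullets above describe the new failure behaviour for node creation: catch broadly, log the traceback, and stop rather than continue with a partial graph. A small sketch of that pattern (the surrounding engine code is not shown here):

```python
import logging

logger = logging.getLogger(__name__)


def create_nodes(node_configs, node_factory):
    """Sketch: abort graph construction on the first node that fails to build."""
    nodes = []
    for config in node_configs:
        try:
            nodes.append(node_factory(config))
        except Exception:
            # logger.exception records the full traceback automatically
            logger.exception("Failed to create node from config: %s", config.get("id"))
            raise  # abort instead of continuing with a partial graph
    return nodes
```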
Yongtao Huang
2ac7a9c8fc Chore: thanks to bump-pydantic (#25437)
2025-09-09 20:07:17 +08:00
Novice
240b65b980 fix(mcp): properly handle arrays containing both numbers and strings (#25430)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-09-09 20:06:35 +08:00
-LAN-
95dc1e2fe8 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-09 17:13:16 +08:00
-LAN-
7443c5a6fc refactor: update pyrightconfig to scan all API files (#25429) 2025-09-09 17:12:45 +08:00
GuanMu
a1cf48f84e Add lib test (#25410) 2025-09-09 17:11:49 +08:00
-LAN-
6fe7cf5ebf Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-09 17:11:46 +08:00
-LAN-
e5122945fe Fix: Use --fix flag instead of --fix-only in autofix workflow (#25425) 2025-09-09 17:00:00 +08:00
KVOJJJin
22cd97e2e0 Fix: judgement of open in explore (#25420) 2025-09-09 16:49:22 +08:00
Asuka Minato
38057b1b0e add typing to all wraps (#25405)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-09 16:48:33 +08:00
crazywoola
eb52216a9c Revert "example of remove useEffect" (#25418) 2025-09-09 16:23:44 +08:00
Joel
4c92e63b0b fix: avatar is not updated after setted (#25414) 2025-09-09 16:00:50 +08:00
-LAN-
a1e8ac4c96 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-09 15:49:09 +08:00
XiamuSanhua
ac2aa967c4 feat: change history by supplementary node information (#25294)
Co-authored-by: alleschen <alleschen@tencent.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-09 15:18:42 +08:00
ttz12345
d2e50a508c Fix:About the error problem of creating an empty knowledge base interface in service_api (#25398)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-09 15:18:31 +08:00
Wu Tianwei
37975319f2 feat: Add customized json schema validation (#25408) 2025-09-09 15:15:32 +08:00
Yongtao Huang
4aba570fa8 Fix flask response: 200 -> {}, 200 (#25404) 2025-09-09 15:06:18 +08:00
Novice
e180c19cca fix(mcp): current_user not being set in MCP requests (#25393) 2025-09-09 14:58:14 +08:00
zxhlyh
c595c03452 fix: credential not allow to use in load balancing (#25401) 2025-09-09 14:52:50 +08:00
Xiyuan Chen
64c9a2f678 Feat/credential policy (#25151)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-08 23:45:05 -07:00
Novice
566e0fd3e5 fix(container-test): batch create segment position sort (#25394) 2025-09-09 13:47:29 +08:00
-LAN-
b46858d87d Merge branch 'main' into feat/queue-based-graph-engine 2025-09-09 13:33:17 +08:00
NeatGuyCoding
7dfb72e381 feat: add test containers based tests for clean notion document task (#25385)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-09-09 11:02:19 +08:00
Asuka Minato
649242f82b example of uuid (#25380)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-09 10:45:08 +08:00
yinyu
cf1ee3162f Support Anchor Scroll In The Output Node (#25364)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-09 10:35:07 +08:00
NeatGuyCoding
bf6485fab4 minor fix: some translation mismatch (#25386) 2025-09-09 10:30:04 +08:00
Yeuoly
720ecea737 fix: tenant_id was not specific when retrieval end-user in plugin backwards invocation wraps (#25377)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-09-09 09:49:35 +08:00
HuDenghui
d5e86d9180 fix: Fixed the X-axis scroll bar issue in the LLM node settings panel (#25357) 2025-09-09 09:47:27 +08:00
Yongtao Huang
cab1272bb1 Fix: use correct maxLength prop for verification code input (#25371)
Signed-off-by: Yongtao Huang <yongtaoh2022@gmail.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-09-08 20:44:48 +08:00
Matri Qi
563a5af9e7 Fix/disable no constant binary expression (#25311)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-08 20:44:20 +08:00
-LAN-
5ab6838849 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-08 19:55:43 +08:00
-LAN-
ec0800eb1a refactor: update pyrightconfig.json to use ignore field for better type checking configuration (#25373) 2025-09-08 19:55:25 +08:00
zyssyz123
ea61420441 Revert "feat: email register refactor" (#25367) 2025-09-08 19:20:09 +08:00
kenwoodjw
598ec07c91 feat: enable dsl export encrypt dataset id or not (#25102)
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
2025-09-08 18:03:24 +08:00
Debin.Meng
a932413314 fix: Incorrect URL Parameter Parsing Causes user_id Retrieval Error (#25261)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-08 18:00:33 +08:00
NeatGuyCoding
aff2482436 Feature add test containers batch create segment to index (#25306)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-08 17:55:57 +08:00
zyssyz123
860ee20c71 feat: email register refactor (#25344)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-08 17:51:43 +08:00
-LAN-
ef974e484b fix: handle None env vars
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-08 16:43:47 +08:00
Krito.
74be2087b5 fix: ensure Performance Tracing button visible when no tracing provid… (#25351) 2025-09-08 16:38:09 +08:00
github-actions[bot]
57f1822213 chore: translate i18n files and update type definitions (#25349)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-08 16:37:20 +08:00
Yongtao Huang
cdfdf324e8 Minor fix: correct PrecessRule typo (#25346) 2025-09-08 15:08:56 +08:00
Cluas
f891c67eca feat: add MCP server headers support #22718 (#24760)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Novice <novice12185727@gmail.com>
2025-09-08 14:10:55 +08:00
-LAN-
299141ae01 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-08 13:56:45 +08:00
NeatGuyCoding
5d0a50042f feat: add test containers based tests for clean dataset task (#25341)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-08 13:09:53 +08:00
ZalterCitty
4ee49f3550 chore: remove weird account login (#22247)
Co-authored-by: zhuqingchao <zhuqingchao@xiaomi.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-08 10:44:36 +08:00
Asuka Minato
f6059ef389 add more typing (#24949)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-08 10:40:00 +08:00
Ding
ce2281d31b Fix: Parameter Extractor Uses Correct Prompt for Prompt Mode in Chat Models (#24636)
Co-authored-by: -LAN- <laipz8200@outlook.com>
2025-09-08 10:29:12 +08:00
github-actions[bot]
3d16767fb3 chore: translate i18n files and update type definitions (#25334)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-08 10:05:25 +08:00
qxo
593f7989b8 fix: 'curr_message_tokens' where it is not associated with a value #25307 (#25308)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-08 09:59:53 +08:00
Asuka Minato
16a3e21410 more assert (#24996)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-08 09:59:43 +08:00
zyileven
98204d78fb Refactor:upgrade react19 ref as props (#25225)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-08 09:46:02 +08:00
Asuka Minato
27bf244b3b keep add and remove the same (#25277) 2025-09-08 09:42:39 +08:00
-LAN-
9b8a03b53b [Chore/Refactor] Improve type annotations in models module (#25281)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-08 09:42:27 +08:00
Krito.
e1f871fefe fix: ensure consistent DSL export behavior across UI entry (#25317)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-08 09:41:51 +08:00
Yongtao Huang
beaa8de648 Fix: correct queryKey in useBatchUpdateDocMetadata and add test case (#25327) 2025-09-08 09:34:04 +08:00
-LAN-
7e629fd783 fix: update iteration node to use correct variable segment types (#25315)
Some checks are pending
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/amd64, build-api-amd64) (push) Waiting to run
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/arm64, build-api-arm64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/amd64, build-web-amd64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/arm64, build-web-arm64) (push) Waiting to run
Build and Push API & Web / create-manifest (api, DIFY_API_IMAGE_NAME, merge-api-images) (push) Blocked by required conditions
Build and Push API & Web / create-manifest (web, DIFY_WEB_IMAGE_NAME, merge-web-images) (push) Blocked by required conditions
Main CI Pipeline / Check Changed Files (push) Waiting to run
Main CI Pipeline / API Tests (push) Blocked by required conditions
Main CI Pipeline / Web Tests (push) Blocked by required conditions
Main CI Pipeline / Style Check (push) Waiting to run
Main CI Pipeline / VDB Tests (push) Blocked by required conditions
Main CI Pipeline / DB Migration Test (push) Blocked by required conditions
2025-09-07 21:31:41 +08:00
lyzno1
b623224d07 fix: remove workflow file preview docs (#25318) 2025-09-07 21:31:05 +08:00
-LAN-
92a939c401 chore: ignore PWA generated files in version control (#25313)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-07 21:29:59 +08:00
-LAN-
cc1d437dc1 fix: correct indentation in TokenBufferMemory get_history_prompt_messages method 2025-09-07 12:48:50 +08:00
-LAN-
7aef0b54e5 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-07 12:34:54 +08:00
NeatGuyCoding
afa7228076 fix: a failed index to be marked as created (#25290)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-06 22:53:26 +08:00
Asuka Minato
bbc43ca50d example of no-unstable-context-value (#25279) 2025-09-06 22:53:01 +08:00
-LAN-
3c28936796 fix: test
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-06 16:21:28 +08:00
NeatGuyCoding
9964cc202d Feature add test containers batch clean document (#25287)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-06 16:18:26 +08:00
-LAN-
81fdc7c54b fix: type errors
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-06 16:09:59 +08:00
-LAN-
b05245eab0 fix: resolve typing errors in configs module (#25268)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-06 16:08:14 +08:00
github-actions[bot]
e41e23481c chore: translate i18n files and update type definitions (#25260)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-06 16:06:09 +08:00
-LAN-
abb53f11ad Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-06 16:05:13 +08:00
-LAN-
30e5c197cb fix: standardize text color in install form to text-secondary (#25272) 2025-09-06 16:05:01 +08:00
-LAN-
52b1ac5f54 feat(web): add Progressive Web App (PWA) support (#25274) 2025-09-06 16:04:24 +08:00
Asuka Minato
a78339a040 remove bare list, dict, Sequence, None, Any (#25058)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2025-09-06 03:32:23 +08:00
Asuka Minato
2b0695bdde add more dataclass (#25039)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-09-06 03:20:13 +08:00
-LAN-
d9aa0ec046 fix: resolve mypy type errors in http_request and list_operator nodes
- Fix str | bytes union type handling in http_request executor
- Add type guard for boolean filter value in list_operator node
2025-09-05 21:17:18 +08:00
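The two fixes above are standard type-narrowing patterns; a minimal illustration of each, unrelated to the actual Dify code:

```python
def as_text(body: str | bytes) -> str:
    # Narrow the str | bytes union before calling str-only methods.
    if isinstance(body, bytes):
        return body.decode("utf-8", errors="replace")
    return body


def filter_flags(values: list[object]) -> list[bool]:
    # Type guard: only genuine booleans pass through the filter.
    return [v for v in values if isinstance(v, bool)]


assert as_text(b"ok") == "ok"
assert filter_flags([True, "yes", 0, False]) == [True, False]
```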
-LAN-
6c3302a192 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-05 21:13:07 +08:00
-LAN-
7ba1f0a046 chore: improve typing
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-05 20:57:11 +08:00
NeatGuyCoding
917d60a1cb Feature add test containers add document to index (#25251)
2025-09-05 19:20:37 +08:00
taewoong Kim
edf4a1b652 feat: add reasoning format processing to LLMNode for <think> tag handling (#23313)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-05 18:15:35 +08:00
Timo
05cd7e2d8a add type annotations for Python SDK ChatClient Class (#24018)
Co-authored-by: EchterTimo <EchterTimo@users.noreply.github.com>
2025-09-05 18:12:46 +08:00
Asuka Minato
a9da8edbde example of remove useEffect (#25212) 2025-09-05 17:35:59 +08:00
Asuka Minato
d03d3518d7 example of lazy (#25216) 2025-09-05 17:35:50 +08:00
coolfinish
cd95237ae4 fix: loop node doesn't exit when it react the condition #24717 (#24844)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-05 14:38:52 +08:00
kenwoodjw
1ba69b8abf fix: child chunk API 404 due to UUID type comparison (#25234)
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
2025-09-05 14:00:28 +08:00
Asuka Minato
95eac7f7f0 example of readonly (#25220) 2025-09-05 12:41:54 +08:00
Asuka Minato
f84b9fd5ef example of type button (#25224) 2025-09-05 12:41:36 +08:00
-LAN-
e78f1cdc6a refactor: improve plugin version validation to support full semantic versioning (#25161)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-05 12:39:48 +08:00
Yongtao Huang
432f89cf33 Chore: clean some # type: ignore (#25157) 2025-09-05 11:30:04 +08:00
Asuka Minato
f0561c0c3b to RefObject (#25192)
Some checks are pending
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/amd64, build-api-amd64) (push) Waiting to run
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/arm64, build-api-arm64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/amd64, build-web-amd64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/arm64, build-web-arm64) (push) Waiting to run
Build and Push API & Web / create-manifest (api, DIFY_API_IMAGE_NAME, merge-api-images) (push) Blocked by required conditions
Build and Push API & Web / create-manifest (web, DIFY_WEB_IMAGE_NAME, merge-web-images) (push) Blocked by required conditions
Main CI Pipeline / Check Changed Files (push) Waiting to run
Main CI Pipeline / API Tests (push) Blocked by required conditions
Main CI Pipeline / Web Tests (push) Blocked by required conditions
Main CI Pipeline / Style Check (push) Waiting to run
Main CI Pipeline / VDB Tests (push) Blocked by required conditions
Main CI Pipeline / DB Migration Test (push) Blocked by required conditions
2025-09-05 10:14:13 +08:00
墨绿色
64e338133c fix: chunk detail modal answer not wrap line (#25203)
Co-authored-by: lijiezhao <lijiezhao@perfect99.com>
2025-09-05 10:11:49 +08:00
Yoshio Sugiyama
4966e4e1fb fix: Remove invalid key from firecrawl request payload. (#25190)
Signed-off-by: SUGIYAMA Yoshio <nenegi.01mo@gmail.com>
2025-09-05 10:10:56 +08:00
Asuka Minato
19e1cbd033 example regexp exec (#25200) 2025-09-05 09:53:01 +08:00
Anubhav Singh
f721c778ad fix: Ensure the order of execution steps are correct when logging with Weave by W&B (#25183) 2025-09-05 09:24:59 +08:00
-LAN-
a2e0f80c01 [Chore/Refactor] Improve type checking configuration (#25185)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-05 08:34:18 +08:00
-LAN-
2adf5d0eee docs: remove outdated document 2025-09-05 02:09:53 +08:00
-LAN-
103a9a4e67 fix(graph_engine): add type hint for workers_to_remove 2025-09-05 01:59:11 +08:00
-LAN-
15b3443e9e fix(debug_logging_layer): remove access for variable pool 2025-09-05 01:52:19 +08:00
Yongtao Huang
334218a62c Remove unused mypy script (#25177)
Some checks are pending
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/amd64, build-api-amd64) (push) Waiting to run
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/arm64, build-api-arm64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/amd64, build-web-amd64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/arm64, build-web-arm64) (push) Waiting to run
Build and Push API & Web / create-manifest (api, DIFY_API_IMAGE_NAME, merge-api-images) (push) Blocked by required conditions
Build and Push API & Web / create-manifest (web, DIFY_WEB_IMAGE_NAME, merge-web-images) (push) Blocked by required conditions
Main CI Pipeline / Check Changed Files (push) Waiting to run
Main CI Pipeline / API Tests (push) Blocked by required conditions
Main CI Pipeline / Web Tests (push) Blocked by required conditions
Main CI Pipeline / Style Check (push) Waiting to run
Main CI Pipeline / VDB Tests (push) Blocked by required conditions
Main CI Pipeline / DB Migration Test (push) Blocked by required conditions
2025-09-05 00:22:38 +08:00
-LAN-
81e9d6f63a fix: correct type checking for None values in code node output validation
- Fixed isinstance() checks to properly handle None values by checking None separately
- Fixed typo in STRING type validation where 'output_name' was hardcoded as string instead of variable
- Updated error message format to be consistent and more informative
- Updated test assertion to match new error message format
2025-09-04 20:39:37 +08:00
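The fix above boils down to ordering the checks correctly: treat `None` as its own case before any `isinstance()` test, and build the error message from the actual output name rather than a hardcoded string. A hypothetical sketch of that pattern:

```python
# Illustrative only: not the actual code node validation logic.
def validate_string_output(output_name: str, value: object) -> str | None:
    if value is None:
        return None  # None is allowed and short-circuits the type check
    if not isinstance(value, str):
        raise ValueError(
            f"Output '{output_name}' must be a string, got {type(value).__name__}"
        )
    return value
```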
Will
de768af099 fix: reset password (#25172) 2025-09-04 20:34:56 +08:00
-LAN-
9c2943183e test: fix code node
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-04 20:17:28 +08:00
-LAN-
f6a2a09815 test: fix code node
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-04 20:04:29 +08:00
-LAN-
e229510e73 perf: eliminate lock contention in worker pool by removing callbacks
Remove worker idle/active callbacks that caused severe lock contention.
Instead, use sampling-based monitoring where worker states are queried
on-demand during scaling decisions. This eliminates the performance
bottleneck caused by workers acquiring locks 10+ times per second.

Changes:
- Remove callback parameters from Worker class
- Add properties to expose worker idle state directly
- Update WorkerPool to query worker states without callbacks
- Maintain scaling functionality with better performance
2025-09-04 19:37:31 +08:00
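The change described above replaces push-style callbacks with pull-style sampling. A minimal sketch of the idea with invented names (the real Worker/WorkerPool classes differ):

```python
import queue
import threading
import time

# Illustrative sketch of sampling-based monitoring: each worker exposes its idle
# state as a cheap property, and the pool reads those properties only when it
# makes a scaling decision, so workers never contend on a shared lock to report
# idle/active transitions.
class Worker(threading.Thread):
    def __init__(self, tasks: queue.Queue) -> None:
        super().__init__(daemon=True)
        self._tasks = tasks
        self._last_active = time.monotonic()

    @property
    def idle_seconds(self) -> float:
        # Lock-free read; a slightly stale value is acceptable for scaling.
        return time.monotonic() - self._last_active

    def run(self) -> None:
        while True:
            task = self._tasks.get()
            self._last_active = time.monotonic()
            _ = task  # placeholder for real task execution


class WorkerPool:
    def __init__(self, workers: list[Worker], scale_down_idle_time: float = 5.0) -> None:
        self._workers = workers
        self._scale_down_idle_time = scale_down_idle_time

    def workers_to_remove(self) -> list[Worker]:
        # Query worker states on demand instead of reacting to callbacks.
        return [w for w in self._workers if w.idle_seconds > self._scale_down_idle_time]
```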
-LAN-
36048d1526 feat(graph_engine): allow to scale down without lock
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-04 19:32:07 +08:00
-LAN-
aff7ca12b8 fix(code_node): type checking bypass
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-04 19:25:08 +08:00
-LAN-
ad9eed2551 fix: disable scale for performance
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-04 19:11:22 +08:00
Will
d36ce782b7 fix: update account profile (#25150)
Some checks are pending
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/amd64, build-api-amd64) (push) Waiting to run
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/arm64, build-api-arm64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/amd64, build-web-amd64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/arm64, build-web-arm64) (push) Waiting to run
Build and Push API & Web / create-manifest (api, DIFY_API_IMAGE_NAME, merge-api-images) (push) Blocked by required conditions
Build and Push API & Web / create-manifest (web, DIFY_WEB_IMAGE_NAME, merge-web-images) (push) Blocked by required conditions
Main CI Pipeline / Check Changed Files (push) Waiting to run
Main CI Pipeline / API Tests (push) Blocked by required conditions
Main CI Pipeline / Web Tests (push) Blocked by required conditions
Main CI Pipeline / Style Check (push) Waiting to run
Main CI Pipeline / VDB Tests (push) Blocked by required conditions
Main CI Pipeline / DB Migration Test (push) Blocked by required conditions
2025-09-04 18:32:51 +08:00
-LAN-
07109846e0 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-04 17:48:08 +08:00
-LAN-
2aeaefccec test: fix test 2025-09-04 17:47:36 +08:00
-LAN-
4d63bd2083 refactor(graph_engine): rename SimpleWorkerPool to WorkerPool 2025-09-04 17:47:13 +08:00
lyzno1
fb307ae128 feat: add TypeScript type safety for i18next with automated maintenance (#25152) 2025-09-04 17:12:48 +08:00
-LAN-
226f14a20f feat(graph_engine): implement scale down worker
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-04 15:35:20 +08:00
CrabSAMA
8d5f788f2b feat(workflow): Allow paste node into nested block (#24234)
Co-authored-by: crab.huang <crab.huang@huolala.cn>
2025-09-04 15:21:43 +08:00
Will
804e599598 fix: EndUser not bound to Session when plugin invokes callback (#25132) 2025-09-04 13:59:34 +08:00
autofix-ci[bot]
2b28aed4e2 [autofix.ci] apply automated fixes 2025-09-04 04:50:21 +00:00
-LAN-
938a845852 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-04 12:48:58 +08:00
-LAN-
ead8568bfc fix: some errors reported by basedpyright
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-04 11:58:54 +08:00
Davide Delbianco
cdf9b674dc chore: Bump weaviate-client to latest v3 version (#25096) 2025-09-04 11:15:36 +08:00
Tonlo
d5aaee614f fix recommended apps reading from db logic (#25071) 2025-09-04 11:14:37 +08:00
Yongtao Huang
865ba8bb4f Minor fix: correct get_app_model mode for delete() (#25082)
Signed-off-by: Yongtao Huang <yongtaoh2022@gmail.com>
2025-09-04 11:08:31 +08:00
znn
ebbb4a5d0b fix png jpeg export (#25110) 2025-09-04 11:05:45 +08:00
17hz
9040b534c8 fix: TypeSelector component style (#25124) 2025-09-04 10:53:00 +08:00
非法操作
0a0ae16bd6 fix: old custom model not display credential name (#25112)
Some checks are pending
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/amd64, build-api-amd64) (push) Waiting to run
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/arm64, build-api-arm64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/amd64, build-web-amd64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/arm64, build-web-arm64) (push) Waiting to run
Build and Push API & Web / create-manifest (api, DIFY_API_IMAGE_NAME, merge-api-images) (push) Blocked by required conditions
Build and Push API & Web / create-manifest (web, DIFY_WEB_IMAGE_NAME, merge-web-images) (push) Blocked by required conditions
Main CI Pipeline / Check Changed Files (push) Waiting to run
Main CI Pipeline / API Tests (push) Blocked by required conditions
Main CI Pipeline / Web Tests (push) Blocked by required conditions
Main CI Pipeline / Style Check (push) Waiting to run
Main CI Pipeline / VDB Tests (push) Blocked by required conditions
Main CI Pipeline / DB Migration Test (push) Blocked by required conditions
2025-09-04 10:46:10 +08:00
fenglin
c22b325c31 fix: align text color in dark mode for config var type selector (#25121) 2025-09-04 10:45:30 +08:00
NeatGuyCoding
c0d82a412d feat: add test containers based tests for workflow converter (#25115)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-04 10:30:24 +08:00
Yongtao Huang
ac057a2d40 Chore: remove dead code in class Graph (#22791)
Co-authored-by: Yongtao Huang <99629139+hyongtao-db@users.noreply.github.com>
2025-09-04 10:30:04 +08:00
Will
3427f19a01 chore: improved trace info for generating conversation name (#25118) 2025-09-04 10:29:12 +08:00
znn
8effbaf101 make icon consistent in dropdown (#25109) 2025-09-04 10:03:13 +08:00
-LAN-
53c4a8787f [Chore/Refactor] Improve type safety and resolve type checking issues (#25104) 2025-09-04 09:35:32 +08:00
-LAN-
017a75aa44 chore: enhance basedpyright-check script to support path arguments (#25108) 2025-09-04 09:34:50 +08:00
-LAN-
ed22d04ea0 test: remove outdated test case 2025-09-04 02:42:36 +08:00
-LAN-
04bbf540d9 chore: code format
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-04 02:33:53 +08:00
-LAN-
657c27ec75 feat(graph_engine): make runtime state read-only in layer
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-04 02:30:40 +08:00
-LAN-
16e9cd5ac5 feat(graph_runtime_state): prevent to set variable pool after initialized. 2025-09-04 02:20:19 +08:00
-LAN-
61c79b0013 test: correct imported name 2025-09-04 02:15:46 +08:00
-LAN-
8332472944 refactor(graph_engine): rename Layer to GraphEngineLayer
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-04 02:11:31 +08:00
-LAN-
fe3f03e50a feat: add property-based access control to GraphRuntimeState
- Replace direct field access with private attributes and property decorators
- Implement deep copy protection for mutable objects (dict, LLMUsage)
- Add helper methods: set_output(), get_output(), update_outputs()
- Add increment_node_run_steps() and add_tokens() convenience methods
- Update loop_node and event_handlers to use new accessor methods
- Add comprehensive unit tests for immutability and validation
- Ensure backward compatibility with existing property access patterns
2025-09-04 02:08:58 +08:00
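A rough sketch of the accessor pattern this commit describes: private attributes, defensive deep copies on read and write, and small helper methods. Names are illustrative rather than the actual GraphRuntimeState fields:

```python
from copy import deepcopy
from typing import Any

# Illustrative only: not the real GraphRuntimeState class.
class RuntimeState:
    def __init__(self) -> None:
        self._outputs: dict[str, Any] = {}
        self._node_run_steps = 0
        self._total_tokens = 0

    @property
    def outputs(self) -> dict[str, Any]:
        # Return a deep copy so callers cannot mutate internal state in place.
        return deepcopy(self._outputs)

    def set_output(self, key: str, value: Any) -> None:
        self._outputs[key] = deepcopy(value)

    def get_output(self, key: str, default: Any = None) -> Any:
        return deepcopy(self._outputs.get(key, default))

    def increment_node_run_steps(self) -> None:
        self._node_run_steps += 1

    def add_tokens(self, tokens: int) -> None:
        if tokens < 0:
            raise ValueError("tokens must be non-negative")
        self._total_tokens += tokens
```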
-LAN-
9c96b23d55 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-04 00:27:08 +08:00
zz_xu
56afb3fd64 db internal server error (#24947)
Some checks are pending
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/amd64, build-api-amd64) (push) Waiting to run
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/arm64, build-api-arm64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/amd64, build-web-amd64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/arm64, build-web-arm64) (push) Waiting to run
Build and Push API & Web / create-manifest (api, DIFY_API_IMAGE_NAME, merge-api-images) (push) Blocked by required conditions
Build and Push API & Web / create-manifest (web, DIFY_WEB_IMAGE_NAME, merge-web-images) (push) Blocked by required conditions
Main CI Pipeline / Check Changed Files (push) Waiting to run
Main CI Pipeline / API Tests (push) Blocked by required conditions
Main CI Pipeline / Web Tests (push) Blocked by required conditions
Main CI Pipeline / Style Check (push) Waiting to run
Main CI Pipeline / VDB Tests (push) Blocked by required conditions
Main CI Pipeline / DB Migration Test (push) Blocked by required conditions
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-03 22:44:22 +08:00
NeatGuyCoding
a9c7669c16 chore: comply to RFC 6750 and improve bearer token split (#24955) 2025-09-03 22:29:08 +08:00
17hz
aae792a9dd chore: Updated pnpm version to 10.15.1 (#25065) 2025-09-03 22:28:03 +08:00
Yongtao Huang
db53656a45 Fix jsonschema compliance: use number instead of float (#25049)
Signed-off-by: Yongtao Huang<yongtaoh2022@gmail.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-03 22:27:41 +08:00
GuanMu
ff7a0e3170 fix: improve error logging for vector search operation in MyScale (#25087) 2025-09-03 22:24:45 +08:00
-LAN-
c7700ac176 chore(docker): bump version (#25092)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-03 20:25:44 +08:00
Stream
d011ddfc64 chore(version): bump version to 1.8.1 (#25060) 2025-09-03 18:54:07 +08:00
zxhlyh
67cc70ad61 fix: model credential name (#25081)
Some checks failed
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/amd64, build-api-amd64) (push) Waiting to run
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/arm64, build-api-arm64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/amd64, build-web-amd64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/arm64, build-web-arm64) (push) Waiting to run
Build and Push API & Web / create-manifest (api, DIFY_API_IMAGE_NAME, merge-api-images) (push) Blocked by required conditions
Build and Push API & Web / create-manifest (web, DIFY_WEB_IMAGE_NAME, merge-web-images) (push) Blocked by required conditions
Main CI Pipeline / Check Changed Files (push) Waiting to run
Main CI Pipeline / API Tests (push) Blocked by required conditions
Main CI Pipeline / Web Tests (push) Blocked by required conditions
Main CI Pipeline / Style Check (push) Waiting to run
Main CI Pipeline / VDB Tests (push) Blocked by required conditions
Main CI Pipeline / DB Migration Test (push) Blocked by required conditions
Check i18n Files and Create PR / check-and-update (push) Has been cancelled
Co-authored-by: hjlarry <hjlarry@163.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-03 18:23:57 +08:00
-LAN-
a384ae9140 Fix advanced chat workflow event handler signature mismatch (#25078)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-03 16:22:13 +08:00
17hz
a7627882a7 fix: Boolean type control is not displayed (#25031)
Co-authored-by: WTW0313 <twwu@dify.ai>
2025-09-03 15:39:09 +08:00
NeatGuyCoding
8eae7a95be Hotfix translation error (#25035) 2025-09-03 15:23:04 +08:00
dswl23
dabf266048 Fix: handle 204 No Content response in MCP client (#25040) 2025-09-03 15:22:42 +08:00
Asuka Minato
462e764a3c typevar example (#25064)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-03 14:54:38 +08:00
github-actions[bot]
0e8a37dca8 chore: translate i18n files (#25061)
Co-authored-by: zxhlyh <16177003+zxhlyh@users.noreply.github.com>
2025-09-03 14:48:53 +08:00
zyileven
bffbe54120 fix: Solve the problem of opening remarks appearing in the chat cont… (#25067) 2025-09-03 14:48:30 +08:00
-LAN-
8c97937cae Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-03 13:53:43 +08:00
非法操作
b673560b92 feat: improve multi model credentials (#25009)
Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 13:52:31 +08:00
zxhlyh
9e125e2029 Refactor/model credential (#24994) 2025-09-03 13:36:59 +08:00
-LAN-
b88146c443 chore: consolidate type checking in style workflow (#25053) 2025-09-03 13:34:43 +08:00
-LAN-
c40cb7fd59 [Chore/Refactor] Update .gitignore to exclude pyrightconfig.json while preserving api/pyrightconfig.json (#25055) 2025-09-03 13:34:07 +08:00
-LAN-
f6acff4cce chore: remove unused variables
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-03 12:12:27 +08:00
-LAN-
3fa48cb5cf chore: remove ty-check from Python style check.
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-03 12:05:41 +08:00
-LAN-
b81745aed8 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-03 11:56:05 +08:00
-LAN-
9d5956cef8 [Chore/Refactor] Switch from MyPy to Basedpyright for type checking (#25047)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-03 11:52:26 +08:00
湛露先生
1fff4620e6 clean console apis and rag cleans. (#25042)
Signed-off-by: zhanluxianshen <zhanluxianshen@163.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-03 11:25:18 +08:00
-LAN-
8c41d95d03 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-03 11:06:42 +08:00
-LAN-
c3820f55f4 chore: translate Chinese comments to English in ClickZetta Volume storage module (#25037) 2025-09-03 10:57:58 +08:00
17hz
60c5bdd62f fix: remove redundant z-index from Field component (#25034) 2025-09-03 10:39:07 +08:00
Will
5092e5f631 fix: workflow not published (#25030)
Some checks are pending
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/amd64, build-api-amd64) (push) Waiting to run
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/arm64, build-api-arm64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/amd64, build-web-amd64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/arm64, build-web-arm64) (push) Waiting to run
Build and Push API & Web / create-manifest (api, DIFY_API_IMAGE_NAME, merge-api-images) (push) Blocked by required conditions
Build and Push API & Web / create-manifest (web, DIFY_WEB_IMAGE_NAME, merge-web-images) (push) Blocked by required conditions
Main CI Pipeline / Check Changed Files (push) Waiting to run
Main CI Pipeline / API Tests (push) Blocked by required conditions
Main CI Pipeline / Web Tests (push) Blocked by required conditions
Main CI Pipeline / Style Check (push) Waiting to run
Main CI Pipeline / VDB Tests (push) Blocked by required conditions
Main CI Pipeline / DB Migration Test (push) Blocked by required conditions
2025-09-03 10:07:31 +08:00
NeatGuyCoding
c0bd35594e feat: add test containers based tests for tools manage service (#25028) 2025-09-03 09:20:16 +08:00
Yongtao Huang
bc9efa7ea8 Refactor: use DatasourceType.XX.value instead of hardcoded (#25015)
Signed-off-by: Yongtao Huang <yongtaoh2022@gmail.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-03 08:56:48 +08:00
-LAN-
f540d0b747 chore: remove ty type checker from reformat script and pre-commit hooks (#25021) 2025-09-03 08:56:23 +08:00
-LAN-
7bcaa513fa chore: remove duplicate test helper classes from api root directory (#25024) 2025-09-03 08:56:00 +08:00
-LAN-
9d004a0971 test: fix test
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-03 02:11:37 +08:00
autofix-ci[bot]
02fcd08c08 [autofix.ci] apply automated fixes 2025-09-02 17:34:07 +00:00
-LAN-
77a9a73d0d Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-03 01:33:17 +08:00
Will
d33dfee8a3 fix: EndUser is not bound to a Session (#25010)
Some checks are pending
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/amd64, build-api-amd64) (push) Waiting to run
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/arm64, build-api-arm64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/amd64, build-web-amd64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/arm64, build-web-arm64) (push) Waiting to run
Build and Push API & Web / create-manifest (api, DIFY_API_IMAGE_NAME, merge-api-images) (push) Blocked by required conditions
Build and Push API & Web / create-manifest (web, DIFY_WEB_IMAGE_NAME, merge-web-images) (push) Blocked by required conditions
Main CI Pipeline / Check Changed Files (push) Waiting to run
Main CI Pipeline / API Tests (push) Blocked by required conditions
Main CI Pipeline / Web Tests (push) Blocked by required conditions
Main CI Pipeline / Style Check (push) Waiting to run
Main CI Pipeline / VDB Tests (push) Blocked by required conditions
Main CI Pipeline / DB Migration Test (push) Blocked by required conditions
2025-09-02 21:37:21 +08:00
Will
b5216df4fe fix: xxx is not bound to a Session (#24966) 2025-09-02 21:37:06 +08:00
GuanMu
25a11bfafc Export DSL from history (#24939)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-02 21:36:52 +08:00
Yongtao Huang
8fcc864fb7 Post fix of #23224 (#25007) 2025-09-02 20:59:08 +08:00
NeatGuyCoding
ed5ed0306e minor fix: fix the check of subscription capacity limit (#24991)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-02 19:14:30 +08:00
Asuka Minato
a418c43d32 example add more type check (#24999)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-02 19:13:43 +08:00
17hz
5aa8c9c8df fix: refresh UI after user profile change (#24998) 2025-09-02 18:57:35 +08:00
17hz
32972b45db fix: remove unnecessary modal visibility toggle on error in name save (#25001) 2025-09-02 18:57:24 +08:00
17hz
af351b1723 fix: ensure the modal closed by level (#24984)
Some checks are pending
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/amd64, build-api-amd64) (push) Waiting to run
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/arm64, build-api-arm64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/amd64, build-web-amd64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/arm64, build-web-arm64) (push) Waiting to run
Build and Push API & Web / create-manifest (api, DIFY_API_IMAGE_NAME, merge-api-images) (push) Blocked by required conditions
Build and Push API & Web / create-manifest (web, DIFY_WEB_IMAGE_NAME, merge-web-images) (push) Blocked by required conditions
Main CI Pipeline / Check Changed Files (push) Waiting to run
Main CI Pipeline / API Tests (push) Blocked by required conditions
Main CI Pipeline / Web Tests (push) Blocked by required conditions
Main CI Pipeline / Style Check (push) Waiting to run
Main CI Pipeline / VDB Tests (push) Blocked by required conditions
Main CI Pipeline / DB Migration Test (push) Blocked by required conditions
Check i18n Files and Create PR / check-and-update (push) Waiting to run
2025-09-02 17:06:10 +08:00
Bowen Liang
af88266212 chore: run ty check CI action only when api code changed (#24986) 2025-09-02 16:59:11 +08:00
-LAN-
b14119b531 feat: add development environment setup commands to Makefile (#24976) 2025-09-02 16:24:21 +08:00
Novice
68c75f221b fix: workflow log status filter add partial success status (#24977) 2025-09-02 16:24:03 +08:00
Bowen Liang
7b379e2a61 chore: apply ty checks on api code with script and ci action (#24653) 2025-09-02 16:05:13 +08:00
17hz
c373b734bc feat: make secretInput type field prevent browser auto-fill (#24971) 2025-09-02 16:04:12 +08:00
-LAN-
1770b93e5b chore(graph_engine): Add a TODO comment in _update_response_outputs in event_handlers
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-02 15:20:03 +08:00
17hz
2ac8f8003f refactor: update radio component to handle boolean values instead of numeric (#24956) 2025-09-02 15:11:42 +08:00
-LAN-
d8ff4aa9ba feat(graph_engine): Handle NodeRunAgentLogEvent
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-02 15:02:07 +08:00
-LAN-
9f8f21bf87 chore: remove backup files
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-02 15:01:58 +08:00
17hz
d6b3df8f6f fix: API Key Authorization Configuration Model Form render default value (#24963) 2025-09-02 14:52:05 +08:00
湛露先生
deea07e905 make clean() function in index_processor_base abstractmethod (#24959)
Signed-off-by: zhanluxianshen <zhanluxianshen@163.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-02 14:48:45 +08:00
lyzno1
0caa94bd1c fix: add Indonesian (id-ID) language support and improve language selector (#24951) 2025-09-02 14:44:59 +08:00
-LAN-
a32dde5428 Fix: Resolve workflow_node_execution primary key conflicts with UUID v7 (#24643)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-02 14:18:29 +08:00
Yongtao Huang
067b0d07c4 Fix: ensure InstalledApp deletion uses model instances instead of Row (#24942)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-02 11:59:38 +08:00
17hz
044f96bd93 feat: LLM prompt Jinja2 template now support more variables (#24944) 2025-09-02 11:59:31 +08:00
-LAN-
0b0dc63f29 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-02 11:52:25 +08:00
Novice
ca96350707 chore: optimize SQL queries that perform partial full table scans (#24786) 2025-09-02 11:46:11 +08:00
Yongtao Huang
be3af1e234 Migrate SQLAlchemy from 1.x to 2.0 with automated and manual adjustments (#23224)
Co-authored-by: Yongtao Huang <99629139+hyongtao-db@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-02 10:30:19 +08:00
github-actions[bot]
2e89d29c87 chore: translate i18n files (#24934)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-02 10:16:14 +08:00
Zhedong Cen
e4eb9f7c55 fix(i18n): align zh-Hant indexMethodEconomyTip with zh-Hans (#24933) 2025-09-02 09:57:39 +08:00
znn
dd6547de06 downvote with reason (#24922) 2025-09-02 09:57:04 +08:00
Atif
84d09b8b8a fix: API key input uses password type and no autocomplete (#24864)
Some checks are pending
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/amd64, build-api-amd64) (push) Waiting to run
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/arm64, build-api-arm64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/amd64, build-web-amd64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/arm64, build-web-arm64) (push) Waiting to run
Build and Push API & Web / create-manifest (api, DIFY_API_IMAGE_NAME, merge-api-images) (push) Blocked by required conditions
Build and Push API & Web / create-manifest (web, DIFY_WEB_IMAGE_NAME, merge-web-images) (push) Blocked by required conditions
Main CI Pipeline / Check Changed Files (push) Waiting to run
Main CI Pipeline / API Tests (push) Blocked by required conditions
Main CI Pipeline / Web Tests (push) Blocked by required conditions
Main CI Pipeline / Style Check (push) Waiting to run
Main CI Pipeline / VDB Tests (push) Blocked by required conditions
Main CI Pipeline / DB Migration Test (push) Blocked by required conditions
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-09-02 09:37:24 +08:00
17hz
2c462154f7 fix: email input cannot scroll (#24930) 2025-09-02 09:35:53 +08:00
NeatGuyCoding
b810efdb3f Feature add test containers tool transform service (#24927) 2025-09-02 09:30:55 +08:00
17hz
ae04ccc445 fix: npx typo error (#24929) 2025-09-02 09:20:51 +08:00
Charles Liu
f7ac1192ae replace the secret field from obfuscated to full-masked value (#24800)
Co-authored-by: charles liu <dearcharles.liu@gmail.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-02 09:19:20 +08:00
jiangbo721
e048588a88 fix: remove duplicated code (#24893) 2025-09-02 08:58:31 +08:00
Frederick2313072
2042353526 fix:score threshold (#24897) 2025-09-02 08:58:14 +08:00
wlleiiwang
9486715929 FEAT: Tencent Vector optimize BM25 initialization to reduce loading time (#24915)
Some checks are pending
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/amd64, build-api-amd64) (push) Waiting to run
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/arm64, build-api-arm64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/amd64, build-web-amd64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/arm64, build-web-arm64) (push) Waiting to run
Build and Push API & Web / create-manifest (api, DIFY_API_IMAGE_NAME, merge-api-images) (push) Blocked by required conditions
Build and Push API & Web / create-manifest (web, DIFY_WEB_IMAGE_NAME, merge-web-images) (push) Blocked by required conditions
Main CI Pipeline / Check Changed Files (push) Waiting to run
Main CI Pipeline / API Tests (push) Blocked by required conditions
Main CI Pipeline / Web Tests (push) Blocked by required conditions
Main CI Pipeline / Style Check (push) Waiting to run
Main CI Pipeline / VDB Tests (push) Blocked by required conditions
Main CI Pipeline / DB Migration Test (push) Blocked by required conditions
Co-authored-by: wlleiiwang <wlleiiwang@tencent.com>
2025-09-01 21:08:41 +08:00
湛露先生
64319c0d56 fix close session twice. (#24917)
Signed-off-by: zhanluxianshen <zhanluxianshen@163.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-01 21:08:01 +08:00
耐小心
acd209a890 fix: prevent database connection leaks in chatflow mode by using Session-managed queries (#24656)
Co-authored-by: 王锶奇 <wangsiqi2@tal.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-01 18:22:42 +08:00
ZalterCitty
bd482eb8ef fix wrong filter handle for saved messages (#24891)
Some checks are pending
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/amd64, build-api-amd64) (push) Waiting to run
Build and Push API & Web / build (api, DIFY_API_IMAGE_NAME, linux/arm64, build-api-arm64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/amd64, build-web-amd64) (push) Waiting to run
Build and Push API & Web / build (web, DIFY_WEB_IMAGE_NAME, linux/arm64, build-web-arm64) (push) Waiting to run
Build and Push API & Web / create-manifest (api, DIFY_API_IMAGE_NAME, merge-api-images) (push) Blocked by required conditions
Build and Push API & Web / create-manifest (web, DIFY_WEB_IMAGE_NAME, merge-web-images) (push) Blocked by required conditions
Main CI Pipeline / Check Changed Files (push) Waiting to run
Main CI Pipeline / API Tests (push) Blocked by required conditions
Main CI Pipeline / Web Tests (push) Blocked by required conditions
Main CI Pipeline / Style Check (push) Waiting to run
Main CI Pipeline / VDB Tests (push) Blocked by required conditions
Main CI Pipeline / DB Migration Test (push) Blocked by required conditions
Check i18n Files and Create PR / check-and-update (push) Waiting to run
Co-authored-by: zhuqingchao <zhuqingchao@xiaomi.com>
2025-09-01 16:32:08 +08:00
Frederick2313072
5b3cc560d5 fix:hard-coded top-k fallback issue. (#24879) 2025-09-01 15:46:37 +08:00
Asuka Minato
d41d4deaac example enum to StrEnum (#24877)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-01 15:40:26 +08:00
Yongtao Huang
208ce4e774 CI: add TS indentation check via esLint (#24810) 2025-09-01 15:31:59 +08:00
Tianyi Jing
414ee51975 fix: add missing form for boolean types (#24812)
Signed-off-by: jingfelix <jingfelix@outlook.com>
2025-09-01 15:21:36 +08:00
耐小心
d5a521eef2 fix: Fix database connection leak in EasyUIBasedGenerateTaskPipeline (#24815) 2025-09-01 14:48:56 +08:00
17hz
1b401063e8 chore: pnpx deprecation (#24868) 2025-09-01 14:45:44 +08:00
木之本澪
60d9d0584a refactor: migrate marketplace.py from requests to httpx (#24015) 2025-09-01 14:28:21 +08:00
willzhao
ffba341258 [CHORE]: remove redundant-cast (#24807) 2025-09-01 14:05:32 +08:00
-LAN-
8433cf4437 refactor(graph_engine): Merge event_collector and event_emitter into event_manager
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-01 13:15:58 +08:00
-LAN-
bb5d52539c refactor(graph_engine): Merge branch_handler into edge_processor
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-01 12:53:06 +08:00
-LAN-
88622f70fb refactor(graph_engine): Move setup methods into __init__
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-01 12:08:03 +08:00
-LAN-
0fdb1b2bc9 refactor(graph_engine): Correct private attributes and private methods naming
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-01 04:37:23 +08:00
-LAN-
a5cb9d2b73 refactor(graph_engine): inline output_registry into response_coordinator
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-01 03:59:53 +08:00
-LAN-
64c1234724 refactor(graph_engine): Merge worker management into one WorkerPool
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-01 03:23:47 +08:00
-LAN-
202fdfcb81 refactor(graph_engine): Remove backward compatibility code
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-01 02:41:16 +08:00
-LAN-
e2f4c9ba8d refactor(graph_engine): Merge state managers into unified_state_manager
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-01 02:08:08 +08:00
-LAN-
546d75d84d Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-01 00:29:28 +08:00
-LAN-
a8fe4ea802 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-08-30 16:36:10 +08:00
-LAN-
82193580de chore: improve typing
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-30 16:35:57 +08:00
-LAN-
1fd27cf3ad Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-08-30 00:13:45 +08:00
-LAN-
11d32ca87d test: fix web test
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-29 23:20:28 +08:00
-LAN-
5415d0c6d1 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-08-29 23:17:30 +08:00
-LAN-
d8af8ae4e6 fix: update workflow service tests for new graph engine
- Update method calls from _handle_node_run_result to _handle_single_step_result
- Add required fields (id, node_id, node_type, start_at) to graph events
- Use proper NodeType enum values instead of strings
- Fix imports to use correct modules (Node instead of BaseNode)
- Ensure event generators return proper generator objects

These tests were failing because the internal implementation changed
with the new graph engine architecture.
2025-08-29 23:04:33 +08:00
-LAN-
04e5d4692f Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-08-29 22:34:47 +08:00
-LAN-
3aa48efd0a test(test_workflow_service): Use new engine's method.
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-29 22:06:10 +08:00
-LAN-
8eb78c04b2 chore(token_buffer_memory): code format
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-29 17:02:51 +08:00
-LAN-
22ee318cf8 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-08-29 17:01:42 +08:00
-LAN-
f2bc4f5d87 fix: resolve type error in node_factory by using type guard for node_type_str 2025-08-29 16:16:58 +08:00
-LAN-
d7d456349d Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-08-29 16:14:04 +08:00
-LAN-
dce4d0ff80 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-08-29 13:22:13 +08:00
-LAN-
3dee8064ba feat: enhance typing 2025-08-29 13:17:02 +08:00
-LAN-
bfbb36756a feat(graph_engine): Add NodeExecutionType.ROOT and auto mark skipped in Graph.init
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 16:41:51 +08:00
-LAN-
d7e0c5f759 chore: use 'XXX | None' instead of Optional[XXX] in graph.py 2025-08-28 15:45:22 +08:00
-LAN-
c396788128 chore(graph_engine): add final mark to classes
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 15:38:35 +08:00
-LAN-
e3a7b1f691 fix: type hints
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 05:24:18 +08:00
-LAN-
8aab7f49c3 chore(graph_engine): Use XXX | None instead of Optional[XXX] 2025-08-28 05:09:33 +08:00
autofix-ci[bot]
1e12c1cbf2 [autofix.ci] apply automated fixes 2025-08-27 21:00:36 +00:00
-LAN-
affedd6ce4 chore(graph_engine): Use XXX | None instead of Optional[XXX] 2025-08-28 04:59:49 +08:00
-LAN-
ef21097774 refactor(graph_engine): Remove unnecessary check from SkipPropagator
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 04:45:26 +08:00
-LAN-
1d377fe994 refactor(graph_engine): Use _ to mark unused variable in BranchHandler
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 04:44:45 +08:00
-LAN-
c82697f267 refactor(graph_engine): Remove node_id from SkipPropagator.skip_branch_paths
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 04:43:56 +08:00
-LAN-
98b25c0bbc refactor(graph_engine): Convert attrs to private in error_handler
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 04:42:37 +08:00
-LAN-
1cd0792606 chore(graph_events): Improve type hints
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 04:41:48 +08:00
-LAN-
7cbf4093f4 chore(graph_engine): Use TYPE | None instead of Optional
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 04:30:50 +08:00
-LAN-
8129ca7c05 chore(graph_engine): Move error_strategy.py to protocols/
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 04:29:32 +08:00
-LAN-
65617f000d feat(event_collector): Update to use ReadWriteLock 2025-08-28 03:26:42 +08:00
-LAN-
635eff2e25 test(graph_engine): remove outdated tests
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 02:53:19 +08:00
-LAN-
55085a9ca2 chore(graph_engine): add type hint for event_queue
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 02:38:56 +08:00
-LAN-
9dc1e9724e Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-08-28 02:26:40 +08:00
-LAN-
c3f66e2901 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-08-27 18:05:35 +08:00
autofix-ci[bot]
86e7cb713c [autofix.ci] apply automated fixes 2025-08-27 07:38:26 +00:00
-LAN-
0f29244459 fix: test
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-27 15:37:37 +08:00
autofix-ci[bot]
48cbf4c78f [autofix.ci] apply automated fixes 2025-08-27 15:33:30 +08:00
-LAN-
8c35663220 feat: queue-based graph engine
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-27 15:33:28 +08:00
1883 changed files with 68099 additions and 24465 deletions

12
.github/dependabot.yml vendored Normal file
View File

@@ -0,0 +1,12 @@
version: 2
updates:
- package-ecosystem: "npm"
directory: "/web"
schedule:
interval: "weekly"
open-pull-requests-limit: 2
- package-ecosystem: "uv"
directory: "/api"
schedule:
interval: "weekly"
open-pull-requests-limit: 2

View File

@@ -42,11 +42,7 @@ jobs:
      - name: Run Unit tests
        run: |
          uv run --project api bash dev/pytest/pytest_unit_tests.sh
-      - name: Run ty check
-        run: |
-          cd api
-          uv add --dev ty
-          uv run ty check || true
      - name: Run pyrefly check
        run: |
          cd api
@@ -66,15 +62,6 @@ jobs:
      - name: Run dify config tests
        run: uv run --project api dev/pytest/pytest_config_tests.py
-      - name: MyPy Cache
-        uses: actions/cache@v4
-        with:
-          path: api/.mypy_cache
-          key: mypy-${{ matrix.python-version }}-${{ runner.os }}-${{ hashFiles('api/uv.lock') }}
-      - name: Run MyPy Checks
-        run: dev/mypy-check
      - name: Set up dotenvs
        run: |
          cp docker/.env.example docker/.env

View File

@@ -20,14 +20,60 @@ jobs:
          cd api
          uv sync --dev
          # Fix lint errors
-          uv run ruff check --fix-only .
          uv run ruff check --fix .
          # Format code
          uv run ruff format .
      - name: ast-grep
        run: |
          uvx --from ast-grep-cli sg --pattern 'db.session.query($WHATEVER).filter($HERE)' --rewrite 'db.session.query($WHATEVER).where($HERE)' -l py --update-all
          uvx --from ast-grep-cli sg --pattern 'session.query($WHATEVER).filter($HERE)' --rewrite 'session.query($WHATEVER).where($HERE)' -l py --update-all
# Convert Optional[T] to T | None (ignoring quoted types)
cat > /tmp/optional-rule.yml << 'EOF'
id: convert-optional-to-union
language: python
rule:
kind: generic_type
all:
- has:
kind: identifier
pattern: Optional
- has:
kind: type_parameter
has:
kind: type
pattern: $T
fix: $T | None
EOF
uvx --from ast-grep-cli sg scan --inline-rules "$(cat /tmp/optional-rule.yml)" --update-all
# Fix forward references that were incorrectly converted (Python doesn't support "Type" | None syntax)
find . -name "*.py" -type f -exec sed -i.bak -E 's/"([^"]+)" \| None/Optional["\1"]/g; s/'"'"'([^'"'"']+)'"'"' \| None/Optional['"'"'\1'"'"']/g' {} \;
find . -name "*.py.bak" -type f -delete
      - name: mdformat
        run: |
          uvx mdformat .
- name: Install pnpm
uses: pnpm/action-setup@v4
with:
package_json_file: web/package.json
run_install: false
- name: Setup NodeJS
uses: actions/setup-node@v4
with:
node-version: 22
cache: pnpm
cache-dependency-path: ./web/package.json
- name: Web dependencies
working-directory: ./web
run: pnpm install --frozen-lockfile
- name: oxlint
working-directory: ./web
run: |
pnpx oxlint --fix
      - uses: autofix-ci/action@635ffb0c9798bd160680f18fd73371e355b85f27
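To make the new auto-fix step concrete, this is what the rewrite does to annotations, and why quoted forward references have to be reverted afterwards (illustrative code only, not taken from the repository):

```python
from typing import Optional


# Before the ast-grep rule runs:
def parse_port(raw: str) -> Optional[int]:
    return int(raw) if raw.isdigit() else None


# After: Optional[T] becomes T | None for ordinary types...
def parse_port_fixed(raw: str) -> int | None:
    return int(raw) if raw.isdigit() else None


# ...but quoted forward references must stay wrapped in Optional, because the
# expression "User" | None raises a TypeError at import time; that is what the
# follow-up sed command restores.
def find_user(user_id: str) -> Optional["User"]:
    return None


class User:  # defined later, hence the forward reference above
    ...
```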

View File

@@ -19,11 +19,23 @@ jobs:
      github.event.workflow_run.head_branch == 'deploy/enterprise'
    steps:
-      - name: Deploy to server
-        uses: appleboy/ssh-action@v0.1.8
-        with:
-          host: ${{ secrets.ENTERPRISE_SSH_HOST }}
-          username: ${{ secrets.ENTERPRISE_SSH_USER }}
-          password: ${{ secrets.ENTERPRISE_SSH_PASSWORD }}
-          script: |
-            ${{ vars.ENTERPRISE_SSH_SCRIPT || secrets.ENTERPRISE_SSH_SCRIPT }}
      - name: trigger deployments
        env:
          DEV_ENV_ADDRS: ${{ vars.DEV_ENV_ADDRS }}
          DEPLOY_SECRET: ${{ secrets.DEPLOY_SECRET }}
        run: |
          IFS=',' read -ra ENDPOINTS <<< "${DEV_ENV_ADDRS:-}"
          BODY='{"project":"dify-api","tag":"deploy-enterprise"}'
          for ENDPOINT in "${ENDPOINTS[@]}"; do
            ENDPOINT="$(echo "$ENDPOINT" | xargs)"
            [ -z "$ENDPOINT" ] && continue
            API_SIGNATURE=$(printf '%s' "$BODY" | openssl dgst -sha256 -hmac "$DEPLOY_SECRET" | awk '{print "sha256="$2}')
            curl -sSf -X POST \
              -H "Content-Type: application/json" \
              -H "X-Hub-Signature-256: $API_SIGNATURE" \
              -d "$BODY" \
              "$ENDPOINT"
          done
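The new trigger signs the request body the same way GitHub webhooks do: HMAC-SHA256 carried in an `X-Hub-Signature-256` header. A sketch of the verification a receiving deploy endpoint would perform; the endpoint itself is outside this repository, so this only mirrors the openssl invocation above:

```python
import hashlib
import hmac

# Illustrative verification of the X-Hub-Signature-256 header produced by the
# workflow step above.
def verify_signature(body: bytes, header_value: str, secret: str) -> bool:
    expected = "sha256=" + hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, header_value)


body = b'{"project":"dify-api","tag":"deploy-enterprise"}'
signature = "sha256=" + hmac.new(b"example-secret", body, hashlib.sha256).hexdigest()
assert verify_signature(body, signature, "example-secret")
```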

View File

@@ -12,7 +12,6 @@ permissions:
  statuses: write
  contents: read
jobs:
  python-style:
    name: Python Style
@@ -44,6 +43,18 @@ jobs:
        if: steps.changed-files.outputs.any_changed == 'true'
        run: uv sync --project api --dev
- name: Run Import Linter
if: steps.changed-files.outputs.any_changed == 'true'
run: uv run --directory api --dev lint-imports
- name: Run Basedpyright Checks
if: steps.changed-files.outputs.any_changed == 'true'
run: dev/basedpyright-check
- name: Run Mypy Type Checks
if: steps.changed-files.outputs.any_changed == 'true'
run: uv --directory api run mypy --exclude-gitignore --exclude 'tests/' --exclude 'migrations/' --check-untyped-defs --disable-error-code=import-untyped .
      - name: Dotenv check
        if: steps.changed-files.outputs.any_changed == 'true'
        run: uv run --project api dotenv-linter ./api/.env.example ./web/.env.example
@@ -89,7 +100,9 @@ jobs:
      - name: Web style check
        if: steps.changed-files.outputs.any_changed == 'true'
        working-directory: ./web
-        run: pnpm run lint
        run: |
pnpm run lint
pnpm run eslint
  docker-compose-template:
    name: Docker Compose Template

View File

@@ -67,12 +67,22 @@ jobs:
        working-directory: ./web
        run: pnpm run auto-gen-i18n ${{ env.FILE_ARGS }}
- name: Generate i18n type definitions
if: env.FILES_CHANGED == 'true'
working-directory: ./web
run: pnpm run gen:i18n-types
      - name: Create Pull Request
        if: env.FILES_CHANGED == 'true'
        uses: peter-evans/create-pull-request@v6
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
-          commit-message: Update i18n files based on en-US changes
          commit-message: Update i18n files and type definitions based on en-US changes
-          title: 'chore: translate i18n files'
          title: 'chore: translate i18n files and update type definitions'
-          body: This PR was automatically created to update i18n files based on changes in en-US locale.
          body: |
This PR was automatically created to update i18n files and TypeScript type definitions based on changes in en-US locale.
**Changes included:**
- Updated translation files for all locales
- Regenerated TypeScript type definitions for type safety
          branch: chore/automated-i18n-updates

View File

@@ -47,6 +47,11 @@ jobs:
        working-directory: ./web
        run: pnpm install --frozen-lockfile
- name: Check i18n types synchronization
if: steps.changed-files.outputs.any_changed == 'true'
working-directory: ./web
run: pnpm run check:i18n-types
      - name: Run tests
        if: steps.changed-files.outputs.any_changed == 'true'
        working-directory: ./web

17
.gitignore vendored
View File

@@ -123,10 +123,12 @@ venv.bak/
# mkdocs documentation
/site
-# mypy
# type checking
.mypy_cache/
.dmypy.json
dmypy.json
pyrightconfig.json
!api/pyrightconfig.json
# Pyre type checker
.pyre/
@@ -195,8 +197,8 @@ sdks/python-client/dify_client.egg-info
.vscode/*
!.vscode/launch.json.template
!.vscode/README.md
-pyrightconfig.json
api/.vscode
web/.vscode
# vscode Code History Extension
.history
@@ -214,7 +216,18 @@ mise.toml
# Next.js build output
.next/
# PWA generated files
web/public/sw.js
web/public/sw.js.map
web/public/workbox-*.js
web/public/workbox-*.js.map
web/public/fallback-*.js
# AI Assistant
.roo/
api/.env.backup
/clickzetta
# Benchmark
scripts/stress-test/setup/config/
scripts/stress-test/reports/

View File

@@ -1 +0,0 @@
CLAUDE.md

87
AGENTS.md Normal file
View File

@@ -0,0 +1,87 @@
# AGENTS.md
## Project Overview
Dify is an open-source platform for developing LLM applications with an intuitive interface combining agentic AI workflows, RAG pipelines, agent capabilities, and model management.
The codebase consists of:
- **Backend API** (`/api`): Python Flask application with Domain-Driven Design architecture
- **Frontend Web** (`/web`): Next.js 15 application with TypeScript and React 19
- **Docker deployment** (`/docker`): Containerized deployment configurations
## Development Commands
### Backend (API)
All Python commands must be prefixed with `uv run --project api`:
```bash
# Start development servers
./dev/start-api # Start API server
./dev/start-worker # Start Celery worker
# Run tests
uv run --project api pytest # Run all tests
uv run --project api pytest tests/unit_tests/ # Unit tests only
uv run --project api pytest tests/integration_tests/ # Integration tests
# Code quality
./dev/reformat # Run all formatters and linters
uv run --project api ruff check --fix ./ # Fix linting issues
uv run --project api ruff format ./ # Format code
uv run --directory api basedpyright # Type checking
```
### Frontend (Web)
```bash
cd web
pnpm lint # Run ESLint
pnpm eslint-fix # Fix ESLint issues
pnpm test # Run Jest tests
```
## Testing Guidelines
### Backend Testing
- Use `pytest` for all backend tests
- Write tests first (TDD approach)
- Test structure: Arrange-Act-Assert
## Code Style Requirements
### Python
- Use type hints for all functions and class attributes
- No `Any` types unless absolutely necessary
- Implement special methods (`__repr__`, `__str__`) appropriately
### TypeScript/JavaScript
- Strict TypeScript configuration
- ESLint with Prettier integration
- Avoid `any` type
## Important Notes
- **Environment Variables**: Always use UV for Python commands: `uv run --project api <command>`
- **Comments**: Only write meaningful comments that explain "why", not "what"
- **File Creation**: Always prefer editing existing files over creating new ones
- **Documentation**: Don't create documentation files unless explicitly requested
- **Code Quality**: Always run `./dev/reformat` before committing backend changes
## Common Development Tasks
### Adding a New API Endpoint
1. Create controller in `/api/controllers/`
1. Add service logic in `/api/services/`
1. Update routes in controller's `__init__.py`
1. Write tests in `/api/tests/`
## Project-Specific Conventions
- All async tasks use Celery with Redis as broker
- **Internationalization**: Frontend supports multiple languages with English (`web/i18n/en-US/`) as the source. All user-facing text must use i18n keys, no hardcoded strings. Edit corresponding module files in `en-US/` directory for translations.

View File

@@ -1,89 +0,0 @@
# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Project Overview
Dify is an open-source platform for developing LLM applications with an intuitive interface combining agentic AI workflows, RAG pipelines, agent capabilities, and model management.
The codebase consists of:
- **Backend API** (`/api`): Python Flask application with Domain-Driven Design architecture
- **Frontend Web** (`/web`): Next.js 15 application with TypeScript and React 19
- **Docker deployment** (`/docker`): Containerized deployment configurations
## Development Commands
### Backend (API)
All Python commands must be prefixed with `uv run --project api`:
```bash
# Start development servers
./dev/start-api # Start API server
./dev/start-worker # Start Celery worker
# Run tests
uv run --project api pytest # Run all tests
uv run --project api pytest tests/unit_tests/ # Unit tests only
uv run --project api pytest tests/integration_tests/ # Integration tests
# Code quality
./dev/reformat # Run all formatters and linters
uv run --project api ruff check --fix ./ # Fix linting issues
uv run --project api ruff format ./ # Format code
uv run --project api mypy . # Type checking
```
### Frontend (Web)
```bash
cd web
pnpm lint # Run ESLint
pnpm eslint-fix # Fix ESLint issues
pnpm test # Run Jest tests
```
## Testing Guidelines
### Backend Testing
- Use `pytest` for all backend tests
- Write tests first (TDD approach)
- Test structure: Arrange-Act-Assert
## Code Style Requirements
### Python
- Use type hints for all functions and class attributes
- No `Any` types unless absolutely necessary
- Implement special methods (`__repr__`, `__str__`) appropriately
### TypeScript/JavaScript
- Strict TypeScript configuration
- ESLint with Prettier integration
- Avoid `any` type
## Important Notes
- **Environment Variables**: Always use UV for Python commands: `uv run --project api <command>`
- **Comments**: Only write meaningful comments that explain "why", not "what"
- **File Creation**: Always prefer editing existing files over creating new ones
- **Documentation**: Don't create documentation files unless explicitly requested
- **Code Quality**: Always run `./dev/reformat` before committing backend changes
## Common Development Tasks
### Adding a New API Endpoint
1. Create controller in `/api/controllers/`
1. Add service logic in `/api/services/`
1. Update routes in controller's `__init__.py`
1. Write tests in `/api/tests/`
## Project-Specific Conventions
- All async tasks use Celery with Redis as broker
- **Internationalization**: Frontend supports multiple languages with English (`web/i18n/en-US/`) as the source. All user-facing text must use i18n keys, no hardcoded strings. Edit corresponding module files in `en-US/` directory for translations.

1
CLAUDE.md Symbolic link
View File

@@ -0,0 +1 @@
AGENTS.md

View File

@@ -4,6 +4,72 @@ WEB_IMAGE=$(DOCKER_REGISTRY)/dify-web
API_IMAGE=$(DOCKER_REGISTRY)/dify-api
VERSION=latest
# Default target - show help
.DEFAULT_GOAL := help
# Backend Development Environment Setup
.PHONY: dev-setup prepare-docker prepare-web prepare-api
# Dev setup target
dev-setup: prepare-docker prepare-web prepare-api
@echo "✅ Backend development environment setup complete!"
# Step 1: Prepare Docker middleware
prepare-docker:
@echo "🐳 Setting up Docker middleware..."
@cp -n docker/middleware.env.example docker/middleware.env 2>/dev/null || echo "Docker middleware.env already exists"
@cd docker && docker compose -f docker-compose.middleware.yaml --env-file middleware.env -p dify-middlewares-dev up -d
@echo "✅ Docker middleware started"
# Step 2: Prepare web environment
prepare-web:
@echo "🌐 Setting up web environment..."
@cp -n web/.env.example web/.env 2>/dev/null || echo "Web .env already exists"
@cd web && pnpm install
@cd web && pnpm build
@echo "✅ Web environment prepared (not started)"
# Step 3: Prepare API environment
prepare-api:
@echo "🔧 Setting up API environment..."
@cp -n api/.env.example api/.env 2>/dev/null || echo "API .env already exists"
@cd api && uv sync --dev
@cd api && uv run flask db upgrade
@echo "✅ API environment prepared (not started)"
# Clean dev environment
dev-clean:
@echo "⚠️ Stopping Docker containers..."
@cd docker && docker compose -f docker-compose.middleware.yaml --env-file middleware.env -p dify-middlewares-dev down
@echo "🗑️ Removing volumes..."
@rm -rf docker/volumes/db
@rm -rf docker/volumes/redis
@rm -rf docker/volumes/plugin_daemon
@rm -rf docker/volumes/weaviate
@rm -rf api/storage
@echo "✅ Cleanup complete"
# Backend Code Quality Commands
format:
@echo "🎨 Running ruff format..."
@uv run --project api --dev ruff format ./api
@echo "✅ Code formatting complete"
check:
@echo "🔍 Running ruff check..."
@uv run --project api --dev ruff check ./api
@echo "✅ Code check complete"
lint:
@echo "🔧 Running ruff format and check with fixes..."
@uv run --directory api --dev sh -c 'ruff format ./api && ruff check --fix ./api'
@echo "✅ Linting complete"
type-check:
@echo "📝 Running type check with basedpyright..."
@uv run --directory api --dev basedpyright
@echo "✅ Type check complete"
# Build Docker images
build-web:
	@echo "Building web Docker image: $(WEB_IMAGE):$(VERSION)..."
@@ -39,5 +105,27 @@ build-push-web: build-web push-web
build-push-all: build-all push-all
	@echo "All Docker images have been built and pushed."
# Help target
help:
@echo "Development Setup Targets:"
@echo " make dev-setup - Run all setup steps for backend dev environment"
@echo " make prepare-docker - Set up Docker middleware"
@echo " make prepare-web - Set up web environment"
@echo " make prepare-api - Set up API environment"
@echo " make dev-clean - Stop Docker middleware containers"
@echo ""
@echo "Backend Code Quality:"
@echo " make format - Format code with ruff"
@echo " make check - Check code with ruff"
@echo " make lint - Format and fix code with ruff"
@echo " make type-check - Run type checking with basedpyright"
@echo ""
@echo "Docker Build Targets:"
@echo " make build-web - Build web Docker image"
@echo " make build-api - Build API Docker image"
@echo " make build-all - Build all Docker images"
@echo " make push-all - Push all Docker images"
@echo " make build-push-all - Build and push all Docker images"
# Phony targets # Phony targets
.PHONY: build-web build-api push-web push-api build-all push-all build-push-all .PHONY: build-web build-api push-web push-api build-all push-all build-push-all dev-setup prepare-docker prepare-web prepare-api dev-clean help format check lint type-check


@@ -75,6 +75,7 @@ DB_PASSWORD=difyai123456
DB_HOST=localhost DB_HOST=localhost
DB_PORT=5432 DB_PORT=5432
DB_DATABASE=dify DB_DATABASE=dify
SQLALCHEMY_POOL_PRE_PING=true
# Storage configuration # Storage configuration
# use for store upload files, private keys... # use for store upload files, private keys...
@@ -327,7 +328,7 @@ MATRIXONE_DATABASE=dify
LINDORM_URL=http://ld-*******************-proxy-search-pub.lindorm.aliyuncs.com:30070 LINDORM_URL=http://ld-*******************-proxy-search-pub.lindorm.aliyuncs.com:30070
LINDORM_USERNAME=admin LINDORM_USERNAME=admin
LINDORM_PASSWORD=admin LINDORM_PASSWORD=admin
USING_UGC_INDEX=False LINDORM_USING_UGC=True
LINDORM_QUERY_TIMEOUT=1 LINDORM_QUERY_TIMEOUT=1
# OceanBase Vector configuration # OceanBase Vector configuration
@@ -460,6 +461,16 @@ WORKFLOW_CALL_MAX_DEPTH=5
WORKFLOW_PARALLEL_DEPTH_LIMIT=3 WORKFLOW_PARALLEL_DEPTH_LIMIT=3
MAX_VARIABLE_SIZE=204800 MAX_VARIABLE_SIZE=204800
# GraphEngine Worker Pool Configuration
# Minimum number of workers per GraphEngine instance (default: 1)
GRAPH_ENGINE_MIN_WORKERS=1
# Maximum number of workers per GraphEngine instance (default: 10)
GRAPH_ENGINE_MAX_WORKERS=10
# Queue depth threshold that triggers worker scale up (default: 3)
GRAPH_ENGINE_SCALE_UP_THRESHOLD=3
# Seconds of idle time before scaling down workers (default: 5.0)
GRAPH_ENGINE_SCALE_DOWN_IDLE_TIME=5.0
# Workflow storage configuration # Workflow storage configuration
# Options: rdbms, hybrid # Options: rdbms, hybrid
# rdbms: Use only the relational database (default) # rdbms: Use only the relational database (default)
@@ -529,6 +540,7 @@ ENDPOINT_URL_TEMPLATE=http://localhost:5002/e/{hook_id}
# Reset password token expiry minutes # Reset password token expiry minutes
RESET_PASSWORD_TOKEN_EXPIRY_MINUTES=5 RESET_PASSWORD_TOKEN_EXPIRY_MINUTES=5
EMAIL_REGISTER_TOKEN_EXPIRY_MINUTES=5
CHANGE_EMAIL_TOKEN_EXPIRY_MINUTES=5 CHANGE_EMAIL_TOKEN_EXPIRY_MINUTES=5
OWNER_TRANSFER_TOKEN_EXPIRY_MINUTES=5 OWNER_TRANSFER_TOKEN_EXPIRY_MINUTES=5
@@ -568,3 +580,7 @@ QUEUE_MONITOR_INTERVAL=30
# Swagger UI configuration # Swagger UI configuration
SWAGGER_UI_ENABLED=true SWAGGER_UI_ENABLED=true
SWAGGER_UI_PATH=/swagger-ui.html SWAGGER_UI_PATH=/swagger-ui.html
# Whether to encrypt dataset IDs when exporting DSL files (default: true)
# Set to false to export dataset IDs as plain text for easier cross-environment import
DSL_EXPORT_ENCRYPT_DATASET_ID=true

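Among the additions above, the four `GRAPH_ENGINE_*` variables read like a queue-depth autoscaling policy: grow the pool when the queue backs up, shrink it after sustained idleness, bounded by min/max workers. The sketch below is only an illustrative interpretation of those knobs; `WorkerPoolPolicy` is a made-up class, not the actual GraphEngine worker pool.

```python
# Illustrative policy only -- not the real GraphEngine implementation.
import time
from dataclasses import dataclass, field


@dataclass
class WorkerPoolPolicy:
    min_workers: int = 1               # GRAPH_ENGINE_MIN_WORKERS
    max_workers: int = 10              # GRAPH_ENGINE_MAX_WORKERS
    scale_up_threshold: int = 3        # GRAPH_ENGINE_SCALE_UP_THRESHOLD
    scale_down_idle_time: float = 5.0  # GRAPH_ENGINE_SCALE_DOWN_IDLE_TIME
    workers: int = 1
    _idle_since: float | None = field(default=None, repr=False)

    def tick(self, queue_depth: int, now: float | None = None) -> int:
        now = time.monotonic() if now is None else now
        if queue_depth > self.scale_up_threshold and self.workers < self.max_workers:
            self.workers += 1                  # queue is backing up: add a worker
            self._idle_since = None
        elif queue_depth == 0:
            self._idle_since = self._idle_since or now
            if now - self._idle_since >= self.scale_down_idle_time and self.workers > self.min_workers:
                self.workers -= 1              # idle long enough: drop a worker
                self._idle_since = now
        else:
            self._idle_since = None
        return self.workers


policy = WorkerPoolPolicy()
print(policy.tick(queue_depth=5))  # 2 -- queue depth 5 exceeds the threshold of 3
```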
api/.importlinter (new file, 105 lines)

@@ -0,0 +1,105 @@
[importlinter]
root_packages =
core
configs
controllers
models
tasks
services
[importlinter:contract:workflow]
name = Workflow
type=layers
layers =
graph_engine
graph_events
graph
nodes
node_events
entities
containers =
core.workflow
ignore_imports =
core.workflow.nodes.base.node -> core.workflow.graph_events
core.workflow.nodes.iteration.iteration_node -> core.workflow.graph_events
core.workflow.nodes.loop.loop_node -> core.workflow.graph_events
core.workflow.nodes.node_factory -> core.workflow.graph
core.workflow.nodes.iteration.iteration_node -> core.workflow.graph_engine
core.workflow.nodes.iteration.iteration_node -> core.workflow.graph
core.workflow.nodes.iteration.iteration_node -> core.workflow.graph_engine.command_channels
core.workflow.nodes.loop.loop_node -> core.workflow.graph_engine
core.workflow.nodes.loop.loop_node -> core.workflow.graph
core.workflow.nodes.loop.loop_node -> core.workflow.graph_engine.command_channels
[importlinter:contract:rsc]
name = RSC
type = layers
layers =
graph_engine
response_coordinator
containers =
core.workflow.graph_engine
[importlinter:contract:worker]
name = Worker
type = layers
layers =
graph_engine
worker
containers =
core.workflow.graph_engine
[importlinter:contract:graph-engine-architecture]
name = Graph Engine Architecture
type = layers
layers =
graph_engine
orchestration
command_processing
event_management
error_handler
graph_traversal
graph_state_manager
worker_management
domain
containers =
core.workflow.graph_engine
[importlinter:contract:domain-isolation]
name = Domain Model Isolation
type = forbidden
source_modules =
core.workflow.graph_engine.domain
forbidden_modules =
core.workflow.graph_engine.worker_management
core.workflow.graph_engine.command_channels
core.workflow.graph_engine.layers
core.workflow.graph_engine.protocols
[importlinter:contract:worker-management]
name = Worker Management
type = forbidden
source_modules =
core.workflow.graph_engine.worker_management
forbidden_modules =
core.workflow.graph_engine.orchestration
core.workflow.graph_engine.command_processing
core.workflow.graph_engine.event_management
[importlinter:contract:graph-traversal-components]
name = Graph Traversal Components
type = layers
layers =
edge_processor
skip_propagator
containers =
core.workflow.graph_engine.graph_traversal
[importlinter:contract:command-channels]
name = Command Channels Independence
type = independence
modules =
core.workflow.graph_engine.command_channels.in_memory_channel
core.workflow.graph_engine.command_channels.redis_channel


@@ -5,7 +5,7 @@ line-length = 120
quote-style = "double" quote-style = "double"
[lint] [lint]
preview = false preview = true
select = [ select = [
"B", # flake8-bugbear rules "B", # flake8-bugbear rules
"C4", # flake8-comprehensions "C4", # flake8-comprehensions
@@ -45,6 +45,7 @@ select = [
"G001", # don't use str format to logging messages "G001", # don't use str format to logging messages
"G003", # don't use + in logging messages "G003", # don't use + in logging messages
"G004", # don't use f-strings to format logging messages "G004", # don't use f-strings to format logging messages
"UP042", # use StrEnum
] ]
ignore = [ ignore = [
@@ -64,6 +65,7 @@ ignore = [
"B006", # mutable-argument-default "B006", # mutable-argument-default
"B007", # unused-loop-control-variable "B007", # unused-loop-control-variable
"B026", # star-arg-unpacking-after-keyword-arg "B026", # star-arg-unpacking-after-keyword-arg
"B901", # allow return in yield
"B903", # class-as-data-structure "B903", # class-as-data-structure
"B904", # raise-without-from-inside-except "B904", # raise-without-from-inside-except
"B905", # zip-without-explicit-strict "B905", # zip-without-explicit-strict

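The ruff changes above turn on preview mode and add `UP042`, which rewrites `class Foo(str, Enum)` into `enum.StrEnum` subclasses (Python 3.11+). A small illustration with an invented enum -- members of both forms compare and serialize as plain strings, so no `.value` access is needed:

```python
# Illustration of rule UP042 (hypothetical enum, not a Dify class).
from enum import Enum, StrEnum  # StrEnum is available from Python 3.11


class LegacyMode(str, Enum):  # the pre-UP042 pattern: mix str into Enum
    CHAT = "chat"


class Mode(StrEnum):          # what UP042 suggests instead
    CHAT = "chat"


assert Mode.CHAT == "chat" and LegacyMode.CHAT == "chat"
```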

@@ -108,5 +108,5 @@ uv run celery -A app.celery beat
../dev/reformat # Run all formatters and linters ../dev/reformat # Run all formatters and linters
uv run ruff check --fix ./ # Fix linting issues uv run ruff check --fix ./ # Fix linting issues
uv run ruff format ./ # Format code uv run ruff format ./ # Format code
uv run mypy . # Type checking uv run basedpyright . # Type checking
``` ```


@@ -25,6 +25,9 @@ def create_flask_app_with_configs() -> DifyApp:
# add an unique identifier to each request # add an unique identifier to each request
RecyclableContextVar.increment_thread_recycles() RecyclableContextVar.increment_thread_recycles()
# Capture the decorator's return value to avoid pyright reportUnusedFunction
_ = before_request
return dify_app return dify_app

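The `_ = before_request` line added above exists purely to satisfy basedpyright: the nested hook is registered by its decorator for side effects and is never called by name, which would otherwise trigger `reportUnusedFunction`. A standalone sketch of the same pattern with a generic Flask factory (not Dify's actual `create_flask_app_with_configs`):

```python
# Generic illustration of the "bind the decorated hook once" pattern.
from flask import Flask


def create_app() -> Flask:
    app = Flask(__name__)

    @app.before_request
    def before_request() -> None:
        # per-request setup would go here
        ...

    # The decorator already registered the hook; referencing the name once
    # keeps basedpyright/pyright from flagging the nested function as unused.
    _ = before_request
    return app
```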

@@ -1,11 +0,0 @@
from tests.integration_tests.utils.parent_class import ParentClass
class ChildClass(ParentClass):
"""Test child class for module import helper tests"""
def __init__(self, name):
super().__init__(name)
def get_name(self):
return f"Child: {self.name}"


@@ -1,8 +1,9 @@
import base64 import base64
import json import json
import logging import logging
import operator
import secrets import secrets
from typing import Any, Optional from typing import Any
import click import click
import sqlalchemy as sa import sqlalchemy as sa
@@ -13,7 +14,6 @@ from sqlalchemy.exc import SQLAlchemyError
from configs import dify_config from configs import dify_config
from constants.languages import languages from constants.languages import languages
from core.plugin.entities.plugin import ToolProviderID
from core.rag.datasource.vdb.vector_factory import Vector from core.rag.datasource.vdb.vector_factory import Vector
from core.rag.datasource.vdb.vector_type import VectorType from core.rag.datasource.vdb.vector_type import VectorType
from core.rag.index_processor.constant.built_in_field import BuiltInField from core.rag.index_processor.constant.built_in_field import BuiltInField
@@ -31,6 +31,7 @@ from models.dataset import Dataset, DatasetCollectionBinding, DatasetMetadata, D
from models.dataset import Document as DatasetDocument from models.dataset import Document as DatasetDocument
from models.model import Account, App, AppAnnotationSetting, AppMode, Conversation, MessageAnnotation from models.model import Account, App, AppAnnotationSetting, AppMode, Conversation, MessageAnnotation
from models.provider import Provider, ProviderModel from models.provider import Provider, ProviderModel
from models.provider_ids import ToolProviderID
from models.tools import ToolOAuthSystemClient from models.tools import ToolOAuthSystemClient
from services.account_service import AccountService, RegisterService, TenantService from services.account_service import AccountService, RegisterService, TenantService
from services.clear_free_plan_tenant_expired_logs import ClearFreePlanTenantExpiredLogs from services.clear_free_plan_tenant_expired_logs import ClearFreePlanTenantExpiredLogs
@@ -212,7 +213,9 @@ def migrate_annotation_vector_database():
if not dataset_collection_binding: if not dataset_collection_binding:
click.echo(f"App annotation collection binding not found: {app.id}") click.echo(f"App annotation collection binding not found: {app.id}")
continue continue
annotations = db.session.query(MessageAnnotation).where(MessageAnnotation.app_id == app.id).all() annotations = db.session.scalars(
select(MessageAnnotation).where(MessageAnnotation.app_id == app.id)
).all()
dataset = Dataset( dataset = Dataset(
id=app.id, id=app.id,
tenant_id=app.tenant_id, tenant_id=app.tenant_id,
@@ -367,29 +370,25 @@ def migrate_knowledge_vector_database():
) )
raise e raise e
dataset_documents = ( dataset_documents = db.session.scalars(
db.session.query(DatasetDocument) select(DatasetDocument).where(
.where(
DatasetDocument.dataset_id == dataset.id, DatasetDocument.dataset_id == dataset.id,
DatasetDocument.indexing_status == "completed", DatasetDocument.indexing_status == "completed",
DatasetDocument.enabled == True, DatasetDocument.enabled == True,
DatasetDocument.archived == False, DatasetDocument.archived == False,
) )
.all() ).all()
)
documents = [] documents = []
segments_count = 0 segments_count = 0
for dataset_document in dataset_documents: for dataset_document in dataset_documents:
segments = ( segments = db.session.scalars(
db.session.query(DocumentSegment) select(DocumentSegment).where(
.where(
DocumentSegment.document_id == dataset_document.id, DocumentSegment.document_id == dataset_document.id,
DocumentSegment.status == "completed", DocumentSegment.status == "completed",
DocumentSegment.enabled == True, DocumentSegment.enabled == True,
) )
.all() ).all()
)
for segment in segments: for segment in segments:
document = Document( document = Document(
@@ -479,12 +478,12 @@ def convert_to_agent_apps():
click.echo(f"Converting app: {app.id}") click.echo(f"Converting app: {app.id}")
try: try:
app.mode = AppMode.AGENT_CHAT.value app.mode = AppMode.AGENT_CHAT
db.session.commit() db.session.commit()
# update conversation mode to agent # update conversation mode to agent
db.session.query(Conversation).where(Conversation.app_id == app.id).update( db.session.query(Conversation).where(Conversation.app_id == app.id).update(
{Conversation.mode: AppMode.AGENT_CHAT.value} {Conversation.mode: AppMode.AGENT_CHAT}
) )
db.session.commit() db.session.commit()
@@ -511,7 +510,7 @@ def add_qdrant_index(field: str):
from qdrant_client.http.exceptions import UnexpectedResponse from qdrant_client.http.exceptions import UnexpectedResponse
from qdrant_client.http.models import PayloadSchemaType from qdrant_client.http.models import PayloadSchemaType
from core.rag.datasource.vdb.qdrant.qdrant_vector import QdrantConfig from core.rag.datasource.vdb.qdrant.qdrant_vector import PathQdrantParams, QdrantConfig
for binding in bindings: for binding in bindings:
if dify_config.QDRANT_URL is None: if dify_config.QDRANT_URL is None:
@@ -525,7 +524,21 @@ def add_qdrant_index(field: str):
prefer_grpc=dify_config.QDRANT_GRPC_ENABLED, prefer_grpc=dify_config.QDRANT_GRPC_ENABLED,
) )
try: try:
client = qdrant_client.QdrantClient(**qdrant_config.to_qdrant_params()) params = qdrant_config.to_qdrant_params()
# Check the type before using
if isinstance(params, PathQdrantParams):
# PathQdrantParams case
client = qdrant_client.QdrantClient(path=params.path)
else:
# UrlQdrantParams case - params is UrlQdrantParams
client = qdrant_client.QdrantClient(
url=params.url,
api_key=params.api_key,
timeout=int(params.timeout),
verify=params.verify,
grpc_port=params.grpc_port,
prefer_grpc=params.prefer_grpc,
)
# create payload index # create payload index
client.create_payload_index(binding.collection_name, field, field_schema=PayloadSchemaType.KEYWORD) client.create_payload_index(binding.collection_name, field, field_schema=PayloadSchemaType.KEYWORD)
create_count += 1 create_count += 1
@@ -571,7 +584,7 @@ def old_metadata_migration():
for document in documents: for document in documents:
if document.doc_metadata: if document.doc_metadata:
doc_metadata = document.doc_metadata doc_metadata = document.doc_metadata
for key, value in doc_metadata.items(): for key in doc_metadata:
for field in BuiltInField: for field in BuiltInField:
if field.value == key: if field.value == key:
break break
@@ -627,7 +640,7 @@ def old_metadata_migration():
@click.option("--email", prompt=True, help="Tenant account email.") @click.option("--email", prompt=True, help="Tenant account email.")
@click.option("--name", prompt=True, help="Workspace name.") @click.option("--name", prompt=True, help="Workspace name.")
@click.option("--language", prompt=True, help="Account language, default: en-US.") @click.option("--language", prompt=True, help="Account language, default: en-US.")
def create_tenant(email: str, language: Optional[str] = None, name: Optional[str] = None): def create_tenant(email: str, language: str | None = None, name: str | None = None):
""" """
Create tenant account Create tenant account
""" """
@@ -941,7 +954,7 @@ def clear_orphaned_file_records(force: bool):
click.echo(click.style("- Deleting orphaned message_files records", fg="white")) click.echo(click.style("- Deleting orphaned message_files records", fg="white"))
query = "DELETE FROM message_files WHERE id IN :ids" query = "DELETE FROM message_files WHERE id IN :ids"
with db.engine.begin() as conn: with db.engine.begin() as conn:
conn.execute(sa.text(query), {"ids": tuple([record["id"] for record in orphaned_message_files])}) conn.execute(sa.text(query), {"ids": tuple(record["id"] for record in orphaned_message_files)})
click.echo( click.echo(
click.style(f"Removed {len(orphaned_message_files)} orphaned message_files records.", fg="green") click.style(f"Removed {len(orphaned_message_files)} orphaned message_files records.", fg="green")
) )
@@ -1295,7 +1308,7 @@ def cleanup_orphaned_draft_variables(
if dry_run: if dry_run:
logger.info("DRY RUN: Would delete the following:") logger.info("DRY RUN: Would delete the following:")
for app_id, count in sorted(stats["orphaned_by_app"].items(), key=lambda x: x[1], reverse=True)[ for app_id, count in sorted(stats["orphaned_by_app"].items(), key=operator.itemgetter(1), reverse=True)[
:10 :10
]: # Show top 10 ]: # Show top 10
logger.info(" App %s: %s variables", app_id, count) logger.info(" App %s: %s variables", app_id, count)

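Several hunks above migrate from the legacy `session.query(...)` API to 2.0-style `select()` statements executed via `Session.scalars()`. A self-contained sketch of that pattern with a toy model and an in-memory SQLite database (not Dify's models or session setup):

```python
# Minimal SQLAlchemy 2.0 sketch of the query-style migration shown above.
from sqlalchemy import create_engine, select
from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column


class Base(DeclarativeBase):
    pass


class Annotation(Base):
    __tablename__ = "annotations"
    id: Mapped[int] = mapped_column(primary_key=True)
    app_id: Mapped[str]


engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    # Legacy 1.x style:
    #     session.query(Annotation).where(Annotation.app_id == "app-1").all()
    # 2.0 style used in the diff: build a select() and fetch ORM objects
    # through Session.scalars().
    rows = session.scalars(
        select(Annotation).where(Annotation.app_id == "app-1")
    ).all()
    print(rows)  # [] -- nothing inserted; the point is the query shape
```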

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,28 +7,28 @@ class NotionConfig(BaseSettings):
Configuration settings for Notion integration Configuration settings for Notion integration
""" """
NOTION_CLIENT_ID: Optional[str] = Field( NOTION_CLIENT_ID: str | None = Field(
description="Client ID for Notion API authentication. Required for OAuth 2.0 flow.", description="Client ID for Notion API authentication. Required for OAuth 2.0 flow.",
default=None, default=None,
) )
NOTION_CLIENT_SECRET: Optional[str] = Field( NOTION_CLIENT_SECRET: str | None = Field(
description="Client secret for Notion API authentication. Required for OAuth 2.0 flow.", description="Client secret for Notion API authentication. Required for OAuth 2.0 flow.",
default=None, default=None,
) )
NOTION_INTEGRATION_TYPE: Optional[str] = Field( NOTION_INTEGRATION_TYPE: str | None = Field(
description="Type of Notion integration." description="Type of Notion integration."
" Set to 'internal' for internal integrations, or None for public integrations.", " Set to 'internal' for internal integrations, or None for public integrations.",
default=None, default=None,
) )
NOTION_INTERNAL_SECRET: Optional[str] = Field( NOTION_INTERNAL_SECRET: str | None = Field(
description="Secret key for internal Notion integrations. Required when NOTION_INTEGRATION_TYPE is 'internal'.", description="Secret key for internal Notion integrations. Required when NOTION_INTEGRATION_TYPE is 'internal'.",
default=None, default=None,
) )
NOTION_INTEGRATION_TOKEN: Optional[str] = Field( NOTION_INTEGRATION_TOKEN: str | None = Field(
description="Integration token for Notion API access. Used for direct API calls without OAuth flow.", description="Integration token for Notion API access. Used for direct API calls without OAuth flow.",
default=None, default=None,
) )

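From here on, most configuration diffs make one mechanical change: `Optional[X]` becomes the PEP 604 union `X | None` and the now-unused `typing` import is dropped. A tiny example of the resulting field style; `ExampleConfig` and its field are invented for illustration and are not part of the codebase:

```python
# Hedged illustration of the Optional[str] -> str | None field style.
from pydantic import Field
from pydantic_settings import BaseSettings


class ExampleConfig(BaseSettings):
    # Before: EXAMPLE_API_KEY: Optional[str] = Field(default=None, ...)
    EXAMPLE_API_KEY: str | None = Field(
        description="API key for an example service (illustrative field only)",
        default=None,
    )


print(ExampleConfig().EXAMPLE_API_KEY)  # None unless EXAMPLE_API_KEY is set in the environment
```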

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, NonNegativeFloat from pydantic import Field, NonNegativeFloat
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,7 +7,7 @@ class SentryConfig(BaseSettings):
Configuration settings for Sentry error tracking and performance monitoring Configuration settings for Sentry error tracking and performance monitoring
""" """
SENTRY_DSN: Optional[str] = Field( SENTRY_DSN: str | None = Field(
description="Sentry Data Source Name (DSN)." description="Sentry Data Source Name (DSN)."
" This is the unique identifier of your Sentry project, used to send events to the correct project.", " This is the unique identifier of your Sentry project, used to send events to the correct project.",
default=None, default=None,


@@ -1,4 +1,4 @@
from typing import Literal, Optional from typing import Literal
from pydantic import ( from pydantic import (
AliasChoices, AliasChoices,
@@ -31,6 +31,12 @@ class SecurityConfig(BaseSettings):
description="Duration in minutes for which a password reset token remains valid", description="Duration in minutes for which a password reset token remains valid",
default=5, default=5,
) )
EMAIL_REGISTER_TOKEN_EXPIRY_MINUTES: PositiveInt = Field(
description="Duration in minutes for which a email register token remains valid",
default=5,
)
CHANGE_EMAIL_TOKEN_EXPIRY_MINUTES: PositiveInt = Field( CHANGE_EMAIL_TOKEN_EXPIRY_MINUTES: PositiveInt = Field(
description="Duration in minutes for which a change email token remains valid", description="Duration in minutes for which a change email token remains valid",
default=5, default=5,
@@ -51,7 +57,7 @@ class SecurityConfig(BaseSettings):
default=False, default=False,
) )
ADMIN_API_KEY: Optional[str] = Field( ADMIN_API_KEY: str | None = Field(
description="admin api key for authentication", description="admin api key for authentication",
default=None, default=None,
) )
@@ -91,17 +97,17 @@ class CodeExecutionSandboxConfig(BaseSettings):
default="dify-sandbox", default="dify-sandbox",
) )
CODE_EXECUTION_CONNECT_TIMEOUT: Optional[float] = Field( CODE_EXECUTION_CONNECT_TIMEOUT: float | None = Field(
description="Connection timeout in seconds for code execution requests", description="Connection timeout in seconds for code execution requests",
default=10.0, default=10.0,
) )
CODE_EXECUTION_READ_TIMEOUT: Optional[float] = Field( CODE_EXECUTION_READ_TIMEOUT: float | None = Field(
description="Read timeout in seconds for code execution requests", description="Read timeout in seconds for code execution requests",
default=60.0, default=60.0,
) )
CODE_EXECUTION_WRITE_TIMEOUT: Optional[float] = Field( CODE_EXECUTION_WRITE_TIMEOUT: float | None = Field(
description="Write timeout in seconds for code execution request", description="Write timeout in seconds for code execution request",
default=10.0, default=10.0,
) )
@@ -362,17 +368,17 @@ class HttpConfig(BaseSettings):
default=3, default=3,
) )
SSRF_PROXY_ALL_URL: Optional[str] = Field( SSRF_PROXY_ALL_URL: str | None = Field(
description="Proxy URL for HTTP or HTTPS requests to prevent Server-Side Request Forgery (SSRF)", description="Proxy URL for HTTP or HTTPS requests to prevent Server-Side Request Forgery (SSRF)",
default=None, default=None,
) )
SSRF_PROXY_HTTP_URL: Optional[str] = Field( SSRF_PROXY_HTTP_URL: str | None = Field(
description="Proxy URL for HTTP requests to prevent Server-Side Request Forgery (SSRF)", description="Proxy URL for HTTP requests to prevent Server-Side Request Forgery (SSRF)",
default=None, default=None,
) )
SSRF_PROXY_HTTPS_URL: Optional[str] = Field( SSRF_PROXY_HTTPS_URL: str | None = Field(
description="Proxy URL for HTTPS requests to prevent Server-Side Request Forgery (SSRF)", description="Proxy URL for HTTPS requests to prevent Server-Side Request Forgery (SSRF)",
default=None, default=None,
) )
@@ -414,7 +420,7 @@ class InnerAPIConfig(BaseSettings):
default=False, default=False,
) )
INNER_API_KEY: Optional[str] = Field( INNER_API_KEY: str | None = Field(
description="API key for accessing the internal API", description="API key for accessing the internal API",
default=None, default=None,
) )
@@ -430,7 +436,7 @@ class LoggingConfig(BaseSettings):
default="INFO", default="INFO",
) )
LOG_FILE: Optional[str] = Field( LOG_FILE: str | None = Field(
description="File path for log output.", description="File path for log output.",
default=None, default=None,
) )
@@ -450,12 +456,12 @@ class LoggingConfig(BaseSettings):
default="%(asctime)s.%(msecs)03d %(levelname)s [%(threadName)s] [%(filename)s:%(lineno)d] - %(message)s", default="%(asctime)s.%(msecs)03d %(levelname)s [%(threadName)s] [%(filename)s:%(lineno)d] - %(message)s",
) )
LOG_DATEFORMAT: Optional[str] = Field( LOG_DATEFORMAT: str | None = Field(
description="Date format string for log timestamps", description="Date format string for log timestamps",
default=None, default=None,
) )
LOG_TZ: Optional[str] = Field( LOG_TZ: str | None = Field(
description="Timezone for log timestamps (e.g., 'America/New_York')", description="Timezone for log timestamps (e.g., 'America/New_York')",
default="UTC", default="UTC",
) )
@@ -529,6 +535,28 @@ class WorkflowConfig(BaseSettings):
default=200 * 1024, default=200 * 1024,
) )
# GraphEngine Worker Pool Configuration
GRAPH_ENGINE_MIN_WORKERS: PositiveInt = Field(
description="Minimum number of workers per GraphEngine instance",
default=1,
)
GRAPH_ENGINE_MAX_WORKERS: PositiveInt = Field(
description="Maximum number of workers per GraphEngine instance",
default=10,
)
GRAPH_ENGINE_SCALE_UP_THRESHOLD: PositiveInt = Field(
description="Queue depth threshold that triggers worker scale up",
default=3,
)
GRAPH_ENGINE_SCALE_DOWN_IDLE_TIME: float = Field(
description="Seconds of idle time before scaling down workers",
default=5.0,
ge=0.1,
)
class WorkflowNodeExecutionConfig(BaseSettings): class WorkflowNodeExecutionConfig(BaseSettings):
""" """
@@ -589,22 +617,22 @@ class AuthConfig(BaseSettings):
default="/console/api/oauth/authorize", default="/console/api/oauth/authorize",
) )
GITHUB_CLIENT_ID: Optional[str] = Field( GITHUB_CLIENT_ID: str | None = Field(
description="GitHub OAuth client ID", description="GitHub OAuth client ID",
default=None, default=None,
) )
GITHUB_CLIENT_SECRET: Optional[str] = Field( GITHUB_CLIENT_SECRET: str | None = Field(
description="GitHub OAuth client secret", description="GitHub OAuth client secret",
default=None, default=None,
) )
GOOGLE_CLIENT_ID: Optional[str] = Field( GOOGLE_CLIENT_ID: str | None = Field(
description="Google OAuth client ID", description="Google OAuth client ID",
default=None, default=None,
) )
GOOGLE_CLIENT_SECRET: Optional[str] = Field( GOOGLE_CLIENT_SECRET: str | None = Field(
description="Google OAuth client secret", description="Google OAuth client secret",
default=None, default=None,
) )
@@ -639,6 +667,11 @@ class AuthConfig(BaseSettings):
default=86400, default=86400,
) )
EMAIL_REGISTER_LOCKOUT_DURATION: PositiveInt = Field(
description="Time (in seconds) a user must wait before retrying email register after exceeding the rate limit.",
default=86400,
)
class ModerationConfig(BaseSettings): class ModerationConfig(BaseSettings):
""" """
@@ -667,42 +700,42 @@ class MailConfig(BaseSettings):
Configuration for email services Configuration for email services
""" """
MAIL_TYPE: Optional[str] = Field( MAIL_TYPE: str | None = Field(
description="Email service provider type ('smtp' or 'resend' or 'sendGrid), default to None.", description="Email service provider type ('smtp' or 'resend' or 'sendGrid), default to None.",
default=None, default=None,
) )
MAIL_DEFAULT_SEND_FROM: Optional[str] = Field( MAIL_DEFAULT_SEND_FROM: str | None = Field(
description="Default email address to use as the sender", description="Default email address to use as the sender",
default=None, default=None,
) )
RESEND_API_KEY: Optional[str] = Field( RESEND_API_KEY: str | None = Field(
description="API key for Resend email service", description="API key for Resend email service",
default=None, default=None,
) )
RESEND_API_URL: Optional[str] = Field( RESEND_API_URL: str | None = Field(
description="API URL for Resend email service", description="API URL for Resend email service",
default=None, default=None,
) )
SMTP_SERVER: Optional[str] = Field( SMTP_SERVER: str | None = Field(
description="SMTP server hostname", description="SMTP server hostname",
default=None, default=None,
) )
SMTP_PORT: Optional[int] = Field( SMTP_PORT: int | None = Field(
description="SMTP server port number", description="SMTP server port number",
default=465, default=465,
) )
SMTP_USERNAME: Optional[str] = Field( SMTP_USERNAME: str | None = Field(
description="Username for SMTP authentication", description="Username for SMTP authentication",
default=None, default=None,
) )
SMTP_PASSWORD: Optional[str] = Field( SMTP_PASSWORD: str | None = Field(
description="Password for SMTP authentication", description="Password for SMTP authentication",
default=None, default=None,
) )
@@ -722,7 +755,7 @@ class MailConfig(BaseSettings):
default=50, default=50,
) )
SENDGRID_API_KEY: Optional[str] = Field( SENDGRID_API_KEY: str | None = Field(
description="API key for SendGrid service", description="API key for SendGrid service",
default=None, default=None,
) )
@@ -745,17 +778,17 @@ class RagEtlConfig(BaseSettings):
default="database", default="database",
) )
UNSTRUCTURED_API_URL: Optional[str] = Field( UNSTRUCTURED_API_URL: str | None = Field(
description="API URL for Unstructured.io service", description="API URL for Unstructured.io service",
default=None, default=None,
) )
UNSTRUCTURED_API_KEY: Optional[str] = Field( UNSTRUCTURED_API_KEY: str | None = Field(
description="API key for Unstructured.io service", description="API key for Unstructured.io service",
default="", default="",
) )
SCARF_NO_ANALYTICS: Optional[str] = Field( SCARF_NO_ANALYTICS: str | None = Field(
description="This is about whether to disable Scarf analytics in Unstructured library.", description="This is about whether to disable Scarf analytics in Unstructured library.",
default="false", default="false",
) )
@@ -796,6 +829,11 @@ class DataSetConfig(BaseSettings):
default=30, default=30,
) )
DSL_EXPORT_ENCRYPT_DATASET_ID: bool = Field(
description="Enable or disable dataset ID encryption when exporting DSL files",
default=True,
)
class WorkspaceConfig(BaseSettings): class WorkspaceConfig(BaseSettings):
""" """


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, NonNegativeInt from pydantic import Field, NonNegativeInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -40,17 +38,17 @@ class HostedOpenAiConfig(BaseSettings):
Configuration for hosted OpenAI service Configuration for hosted OpenAI service
""" """
HOSTED_OPENAI_API_KEY: Optional[str] = Field( HOSTED_OPENAI_API_KEY: str | None = Field(
description="API key for hosted OpenAI service", description="API key for hosted OpenAI service",
default=None, default=None,
) )
HOSTED_OPENAI_API_BASE: Optional[str] = Field( HOSTED_OPENAI_API_BASE: str | None = Field(
description="Base URL for hosted OpenAI API", description="Base URL for hosted OpenAI API",
default=None, default=None,
) )
HOSTED_OPENAI_API_ORGANIZATION: Optional[str] = Field( HOSTED_OPENAI_API_ORGANIZATION: str | None = Field(
description="Organization ID for hosted OpenAI service", description="Organization ID for hosted OpenAI service",
default=None, default=None,
) )
@@ -110,12 +108,12 @@ class HostedAzureOpenAiConfig(BaseSettings):
default=False, default=False,
) )
HOSTED_AZURE_OPENAI_API_KEY: Optional[str] = Field( HOSTED_AZURE_OPENAI_API_KEY: str | None = Field(
description="API key for hosted Azure OpenAI service", description="API key for hosted Azure OpenAI service",
default=None, default=None,
) )
HOSTED_AZURE_OPENAI_API_BASE: Optional[str] = Field( HOSTED_AZURE_OPENAI_API_BASE: str | None = Field(
description="Base URL for hosted Azure OpenAI API", description="Base URL for hosted Azure OpenAI API",
default=None, default=None,
) )
@@ -131,12 +129,12 @@ class HostedAnthropicConfig(BaseSettings):
Configuration for hosted Anthropic service Configuration for hosted Anthropic service
""" """
HOSTED_ANTHROPIC_API_BASE: Optional[str] = Field( HOSTED_ANTHROPIC_API_BASE: str | None = Field(
description="Base URL for hosted Anthropic API", description="Base URL for hosted Anthropic API",
default=None, default=None,
) )
HOSTED_ANTHROPIC_API_KEY: Optional[str] = Field( HOSTED_ANTHROPIC_API_KEY: str | None = Field(
description="API key for hosted Anthropic service", description="API key for hosted Anthropic service",
default=None, default=None,
) )


@@ -1,5 +1,5 @@
import os import os
from typing import Any, Literal, Optional from typing import Any, Literal
from urllib.parse import parse_qsl, quote_plus from urllib.parse import parse_qsl, quote_plus
from pydantic import Field, NonNegativeFloat, NonNegativeInt, PositiveFloat, PositiveInt, computed_field from pydantic import Field, NonNegativeFloat, NonNegativeInt, PositiveFloat, PositiveInt, computed_field
@@ -78,18 +78,18 @@ class StorageConfig(BaseSettings):
class VectorStoreConfig(BaseSettings): class VectorStoreConfig(BaseSettings):
VECTOR_STORE: Optional[str] = Field( VECTOR_STORE: str | None = Field(
description="Type of vector store to use for efficient similarity search." description="Type of vector store to use for efficient similarity search."
" Set to None if not using a vector store.", " Set to None if not using a vector store.",
default=None, default=None,
) )
VECTOR_STORE_WHITELIST_ENABLE: Optional[bool] = Field( VECTOR_STORE_WHITELIST_ENABLE: bool | None = Field(
description="Enable whitelist for vector store.", description="Enable whitelist for vector store.",
default=False, default=False,
) )
VECTOR_INDEX_NAME_PREFIX: Optional[str] = Field( VECTOR_INDEX_NAME_PREFIX: str | None = Field(
description="Prefix used to create collection name in vector database", description="Prefix used to create collection name in vector database",
default="Vector_index", default="Vector_index",
) )
@@ -225,26 +225,26 @@ class CeleryConfig(DatabaseConfig):
default="redis", default="redis",
) )
CELERY_BROKER_URL: Optional[str] = Field( CELERY_BROKER_URL: str | None = Field(
description="URL of the message broker for Celery tasks.", description="URL of the message broker for Celery tasks.",
default=None, default=None,
) )
CELERY_USE_SENTINEL: Optional[bool] = Field( CELERY_USE_SENTINEL: bool | None = Field(
description="Whether to use Redis Sentinel for high availability.", description="Whether to use Redis Sentinel for high availability.",
default=False, default=False,
) )
CELERY_SENTINEL_MASTER_NAME: Optional[str] = Field( CELERY_SENTINEL_MASTER_NAME: str | None = Field(
description="Name of the Redis Sentinel master.", description="Name of the Redis Sentinel master.",
default=None, default=None,
) )
CELERY_SENTINEL_PASSWORD: Optional[str] = Field( CELERY_SENTINEL_PASSWORD: str | None = Field(
description="Password of the Redis Sentinel master.", description="Password of the Redis Sentinel master.",
default=None, default=None,
) )
CELERY_SENTINEL_SOCKET_TIMEOUT: Optional[PositiveFloat] = Field( CELERY_SENTINEL_SOCKET_TIMEOUT: PositiveFloat | None = Field(
description="Timeout for Redis Sentinel socket operations in seconds.", description="Timeout for Redis Sentinel socket operations in seconds.",
default=0.1, default=0.1,
) )
@@ -268,12 +268,12 @@ class InternalTestConfig(BaseSettings):
Configuration settings for Internal Test Configuration settings for Internal Test
""" """
AWS_SECRET_ACCESS_KEY: Optional[str] = Field( AWS_SECRET_ACCESS_KEY: str | None = Field(
description="Internal test AWS secret access key", description="Internal test AWS secret access key",
default=None, default=None,
) )
AWS_ACCESS_KEY_ID: Optional[str] = Field( AWS_ACCESS_KEY_ID: str | None = Field(
description="Internal test AWS access key ID", description="Internal test AWS access key ID",
default=None, default=None,
) )
@@ -284,15 +284,15 @@ class DatasetQueueMonitorConfig(BaseSettings):
Configuration settings for Dataset Queue Monitor Configuration settings for Dataset Queue Monitor
""" """
QUEUE_MONITOR_THRESHOLD: Optional[NonNegativeInt] = Field( QUEUE_MONITOR_THRESHOLD: NonNegativeInt | None = Field(
description="Threshold for dataset queue monitor", description="Threshold for dataset queue monitor",
default=200, default=200,
) )
QUEUE_MONITOR_ALERT_EMAILS: Optional[str] = Field( QUEUE_MONITOR_ALERT_EMAILS: str | None = Field(
description="Emails for dataset queue monitor alert, separated by commas", description="Emails for dataset queue monitor alert, separated by commas",
default=None, default=None,
) )
QUEUE_MONITOR_INTERVAL: Optional[NonNegativeFloat] = Field( QUEUE_MONITOR_INTERVAL: NonNegativeFloat | None = Field(
description="Interval for dataset queue monitor in minutes", description="Interval for dataset queue monitor in minutes",
default=30, default=30,
) )
@@ -300,8 +300,7 @@ class DatasetQueueMonitorConfig(BaseSettings):
class MiddlewareConfig( class MiddlewareConfig(
# place the configs in alphabet order # place the configs in alphabet order
CeleryConfig, CeleryConfig, # Note: CeleryConfig already inherits from DatabaseConfig
DatabaseConfig,
KeywordStoreConfig, KeywordStoreConfig,
RedisConfig, RedisConfig,
# configs of storage and storage providers # configs of storage and storage providers

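The MiddlewareConfig hunk above drops the explicit `DatabaseConfig` base because `CeleryConfig` already inherits from it, so `DatabaseConfig` stays in the method resolution order either way. A minimal illustration with empty stand-in classes (only the names are borrowed from the diff):

```python
# Stand-in classes to show the MRO; bodies are intentionally empty.
class DatabaseConfig: ...
class CeleryConfig(DatabaseConfig): ...


class MiddlewareConfig(CeleryConfig):  # DatabaseConfig arrives via CeleryConfig
    ...


print([cls.__name__ for cls in MiddlewareConfig.__mro__])
# ['MiddlewareConfig', 'CeleryConfig', 'DatabaseConfig', 'object']
```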

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, NonNegativeInt, PositiveFloat, PositiveInt from pydantic import Field, NonNegativeInt, PositiveFloat, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -19,12 +17,12 @@ class RedisConfig(BaseSettings):
default=6379, default=6379,
) )
REDIS_USERNAME: Optional[str] = Field( REDIS_USERNAME: str | None = Field(
description="Username for Redis authentication (if required)", description="Username for Redis authentication (if required)",
default=None, default=None,
) )
REDIS_PASSWORD: Optional[str] = Field( REDIS_PASSWORD: str | None = Field(
description="Password for Redis authentication (if required)", description="Password for Redis authentication (if required)",
default=None, default=None,
) )
@@ -44,47 +42,47 @@ class RedisConfig(BaseSettings):
default="CERT_NONE", default="CERT_NONE",
) )
REDIS_SSL_CA_CERTS: Optional[str] = Field( REDIS_SSL_CA_CERTS: str | None = Field(
description="Path to the CA certificate file for SSL verification", description="Path to the CA certificate file for SSL verification",
default=None, default=None,
) )
REDIS_SSL_CERTFILE: Optional[str] = Field( REDIS_SSL_CERTFILE: str | None = Field(
description="Path to the client certificate file for SSL authentication", description="Path to the client certificate file for SSL authentication",
default=None, default=None,
) )
REDIS_SSL_KEYFILE: Optional[str] = Field( REDIS_SSL_KEYFILE: str | None = Field(
description="Path to the client private key file for SSL authentication", description="Path to the client private key file for SSL authentication",
default=None, default=None,
) )
REDIS_USE_SENTINEL: Optional[bool] = Field( REDIS_USE_SENTINEL: bool | None = Field(
description="Enable Redis Sentinel mode for high availability", description="Enable Redis Sentinel mode for high availability",
default=False, default=False,
) )
REDIS_SENTINELS: Optional[str] = Field( REDIS_SENTINELS: str | None = Field(
description="Comma-separated list of Redis Sentinel nodes (host:port)", description="Comma-separated list of Redis Sentinel nodes (host:port)",
default=None, default=None,
) )
REDIS_SENTINEL_SERVICE_NAME: Optional[str] = Field( REDIS_SENTINEL_SERVICE_NAME: str | None = Field(
description="Name of the Redis Sentinel service to monitor", description="Name of the Redis Sentinel service to monitor",
default=None, default=None,
) )
REDIS_SENTINEL_USERNAME: Optional[str] = Field( REDIS_SENTINEL_USERNAME: str | None = Field(
description="Username for Redis Sentinel authentication (if required)", description="Username for Redis Sentinel authentication (if required)",
default=None, default=None,
) )
REDIS_SENTINEL_PASSWORD: Optional[str] = Field( REDIS_SENTINEL_PASSWORD: str | None = Field(
description="Password for Redis Sentinel authentication (if required)", description="Password for Redis Sentinel authentication (if required)",
default=None, default=None,
) )
REDIS_SENTINEL_SOCKET_TIMEOUT: Optional[PositiveFloat] = Field( REDIS_SENTINEL_SOCKET_TIMEOUT: PositiveFloat | None = Field(
description="Socket timeout in seconds for Redis Sentinel connections", description="Socket timeout in seconds for Redis Sentinel connections",
default=0.1, default=0.1,
) )
@@ -94,12 +92,12 @@ class RedisConfig(BaseSettings):
default=False, default=False,
) )
REDIS_CLUSTERS: Optional[str] = Field( REDIS_CLUSTERS: str | None = Field(
description="Comma-separated list of Redis Clusters nodes (host:port)", description="Comma-separated list of Redis Clusters nodes (host:port)",
default=None, default=None,
) )
REDIS_CLUSTERS_PASSWORD: Optional[str] = Field( REDIS_CLUSTERS_PASSWORD: str | None = Field(
description="Password for Redis Clusters authentication (if required)", description="Password for Redis Clusters authentication (if required)",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,37 +7,37 @@ class AliyunOSSStorageConfig(BaseSettings):
Configuration settings for Aliyun Object Storage Service (OSS) Configuration settings for Aliyun Object Storage Service (OSS)
""" """
ALIYUN_OSS_BUCKET_NAME: Optional[str] = Field( ALIYUN_OSS_BUCKET_NAME: str | None = Field(
description="Name of the Aliyun OSS bucket to store and retrieve objects", description="Name of the Aliyun OSS bucket to store and retrieve objects",
default=None, default=None,
) )
ALIYUN_OSS_ACCESS_KEY: Optional[str] = Field( ALIYUN_OSS_ACCESS_KEY: str | None = Field(
description="Access key ID for authenticating with Aliyun OSS", description="Access key ID for authenticating with Aliyun OSS",
default=None, default=None,
) )
ALIYUN_OSS_SECRET_KEY: Optional[str] = Field( ALIYUN_OSS_SECRET_KEY: str | None = Field(
description="Secret access key for authenticating with Aliyun OSS", description="Secret access key for authenticating with Aliyun OSS",
default=None, default=None,
) )
ALIYUN_OSS_ENDPOINT: Optional[str] = Field( ALIYUN_OSS_ENDPOINT: str | None = Field(
description="URL of the Aliyun OSS endpoint for your chosen region", description="URL of the Aliyun OSS endpoint for your chosen region",
default=None, default=None,
) )
ALIYUN_OSS_REGION: Optional[str] = Field( ALIYUN_OSS_REGION: str | None = Field(
description="Aliyun OSS region where your bucket is located (e.g., 'oss-cn-hangzhou')", description="Aliyun OSS region where your bucket is located (e.g., 'oss-cn-hangzhou')",
default=None, default=None,
) )
ALIYUN_OSS_AUTH_VERSION: Optional[str] = Field( ALIYUN_OSS_AUTH_VERSION: str | None = Field(
description="Version of the authentication protocol to use with Aliyun OSS (e.g., 'v4')", description="Version of the authentication protocol to use with Aliyun OSS (e.g., 'v4')",
default=None, default=None,
) )
ALIYUN_OSS_PATH: Optional[str] = Field( ALIYUN_OSS_PATH: str | None = Field(
description="Base path within the bucket to store objects (e.g., 'my-app-data/')", description="Base path within the bucket to store objects (e.g., 'my-app-data/')",
default=None, default=None,
) )


@@ -1,4 +1,4 @@
from typing import Literal, Optional from typing import Literal
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,27 +9,27 @@ class S3StorageConfig(BaseSettings):
Configuration settings for S3-compatible object storage Configuration settings for S3-compatible object storage
""" """
S3_ENDPOINT: Optional[str] = Field( S3_ENDPOINT: str | None = Field(
description="URL of the S3-compatible storage endpoint (e.g., 'https://s3.amazonaws.com')", description="URL of the S3-compatible storage endpoint (e.g., 'https://s3.amazonaws.com')",
default=None, default=None,
) )
S3_REGION: Optional[str] = Field( S3_REGION: str | None = Field(
description="Region where the S3 bucket is located (e.g., 'us-east-1')", description="Region where the S3 bucket is located (e.g., 'us-east-1')",
default=None, default=None,
) )
S3_BUCKET_NAME: Optional[str] = Field( S3_BUCKET_NAME: str | None = Field(
description="Name of the S3 bucket to store and retrieve objects", description="Name of the S3 bucket to store and retrieve objects",
default=None, default=None,
) )
S3_ACCESS_KEY: Optional[str] = Field( S3_ACCESS_KEY: str | None = Field(
description="Access key ID for authenticating with the S3 service", description="Access key ID for authenticating with the S3 service",
default=None, default=None,
) )
S3_SECRET_KEY: Optional[str] = Field( S3_SECRET_KEY: str | None = Field(
description="Secret access key for authenticating with the S3 service", description="Secret access key for authenticating with the S3 service",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,22 +7,22 @@ class AzureBlobStorageConfig(BaseSettings):
Configuration settings for Azure Blob Storage Configuration settings for Azure Blob Storage
""" """
AZURE_BLOB_ACCOUNT_NAME: Optional[str] = Field( AZURE_BLOB_ACCOUNT_NAME: str | None = Field(
description="Name of the Azure Storage account (e.g., 'mystorageaccount')", description="Name of the Azure Storage account (e.g., 'mystorageaccount')",
default=None, default=None,
) )
AZURE_BLOB_ACCOUNT_KEY: Optional[str] = Field( AZURE_BLOB_ACCOUNT_KEY: str | None = Field(
description="Access key for authenticating with the Azure Storage account", description="Access key for authenticating with the Azure Storage account",
default=None, default=None,
) )
AZURE_BLOB_CONTAINER_NAME: Optional[str] = Field( AZURE_BLOB_CONTAINER_NAME: str | None = Field(
description="Name of the Azure Blob container to store and retrieve objects", description="Name of the Azure Blob container to store and retrieve objects",
default=None, default=None,
) )
AZURE_BLOB_ACCOUNT_URL: Optional[str] = Field( AZURE_BLOB_ACCOUNT_URL: str | None = Field(
description="URL of the Azure Blob storage endpoint (e.g., 'https://mystorageaccount.blob.core.windows.net')", description="URL of the Azure Blob storage endpoint (e.g., 'https://mystorageaccount.blob.core.windows.net')",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,22 +7,22 @@ class BaiduOBSStorageConfig(BaseSettings):
Configuration settings for Baidu Object Storage Service (OBS) Configuration settings for Baidu Object Storage Service (OBS)
""" """
BAIDU_OBS_BUCKET_NAME: Optional[str] = Field( BAIDU_OBS_BUCKET_NAME: str | None = Field(
description="Name of the Baidu OBS bucket to store and retrieve objects (e.g., 'my-obs-bucket')", description="Name of the Baidu OBS bucket to store and retrieve objects (e.g., 'my-obs-bucket')",
default=None, default=None,
) )
BAIDU_OBS_ACCESS_KEY: Optional[str] = Field( BAIDU_OBS_ACCESS_KEY: str | None = Field(
description="Access Key ID for authenticating with Baidu OBS", description="Access Key ID for authenticating with Baidu OBS",
default=None, default=None,
) )
BAIDU_OBS_SECRET_KEY: Optional[str] = Field( BAIDU_OBS_SECRET_KEY: str | None = Field(
description="Secret Access Key for authenticating with Baidu OBS", description="Secret Access Key for authenticating with Baidu OBS",
default=None, default=None,
) )
BAIDU_OBS_ENDPOINT: Optional[str] = Field( BAIDU_OBS_ENDPOINT: str | None = Field(
description="URL of the Baidu OSS endpoint for your chosen region (e.g., 'https://.bj.bcebos.com')", description="URL of the Baidu OSS endpoint for your chosen region (e.g., 'https://.bj.bcebos.com')",
default=None, default=None,
) )


@@ -1,7 +1,5 @@
"""ClickZetta Volume Storage Configuration""" """ClickZetta Volume Storage Configuration"""
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,17 +7,17 @@ from pydantic_settings import BaseSettings
class ClickZettaVolumeStorageConfig(BaseSettings): class ClickZettaVolumeStorageConfig(BaseSettings):
"""Configuration for ClickZetta Volume storage.""" """Configuration for ClickZetta Volume storage."""
CLICKZETTA_VOLUME_USERNAME: Optional[str] = Field( CLICKZETTA_VOLUME_USERNAME: str | None = Field(
description="Username for ClickZetta Volume authentication", description="Username for ClickZetta Volume authentication",
default=None, default=None,
) )
CLICKZETTA_VOLUME_PASSWORD: Optional[str] = Field( CLICKZETTA_VOLUME_PASSWORD: str | None = Field(
description="Password for ClickZetta Volume authentication", description="Password for ClickZetta Volume authentication",
default=None, default=None,
) )
CLICKZETTA_VOLUME_INSTANCE: Optional[str] = Field( CLICKZETTA_VOLUME_INSTANCE: str | None = Field(
description="ClickZetta instance identifier", description="ClickZetta instance identifier",
default=None, default=None,
) )
@@ -49,7 +47,7 @@ class ClickZettaVolumeStorageConfig(BaseSettings):
default="user", default="user",
) )
CLICKZETTA_VOLUME_NAME: Optional[str] = Field( CLICKZETTA_VOLUME_NAME: str | None = Field(
description="ClickZetta volume name for external volumes", description="ClickZetta volume name for external volumes",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,12 +7,12 @@ class GoogleCloudStorageConfig(BaseSettings):
Configuration settings for Google Cloud Storage Configuration settings for Google Cloud Storage
""" """
GOOGLE_STORAGE_BUCKET_NAME: Optional[str] = Field( GOOGLE_STORAGE_BUCKET_NAME: str | None = Field(
description="Name of the Google Cloud Storage bucket to store and retrieve objects (e.g., 'my-gcs-bucket')", description="Name of the Google Cloud Storage bucket to store and retrieve objects (e.g., 'my-gcs-bucket')",
default=None, default=None,
) )
GOOGLE_STORAGE_SERVICE_ACCOUNT_JSON_BASE64: Optional[str] = Field( GOOGLE_STORAGE_SERVICE_ACCOUNT_JSON_BASE64: str | None = Field(
description="Base64-encoded JSON key file for Google Cloud service account authentication", description="Base64-encoded JSON key file for Google Cloud service account authentication",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,22 +7,22 @@ class HuaweiCloudOBSStorageConfig(BaseSettings):
Configuration settings for Huawei Cloud Object Storage Service (OBS) Configuration settings for Huawei Cloud Object Storage Service (OBS)
""" """
HUAWEI_OBS_BUCKET_NAME: Optional[str] = Field( HUAWEI_OBS_BUCKET_NAME: str | None = Field(
description="Name of the Huawei Cloud OBS bucket to store and retrieve objects (e.g., 'my-obs-bucket')", description="Name of the Huawei Cloud OBS bucket to store and retrieve objects (e.g., 'my-obs-bucket')",
default=None, default=None,
) )
HUAWEI_OBS_ACCESS_KEY: Optional[str] = Field( HUAWEI_OBS_ACCESS_KEY: str | None = Field(
description="Access Key ID for authenticating with Huawei Cloud OBS", description="Access Key ID for authenticating with Huawei Cloud OBS",
default=None, default=None,
) )
HUAWEI_OBS_SECRET_KEY: Optional[str] = Field( HUAWEI_OBS_SECRET_KEY: str | None = Field(
description="Secret Access Key for authenticating with Huawei Cloud OBS", description="Secret Access Key for authenticating with Huawei Cloud OBS",
default=None, default=None,
) )
HUAWEI_OBS_SERVER: Optional[str] = Field( HUAWEI_OBS_SERVER: str | None = Field(
description="Endpoint URL for Huawei Cloud OBS (e.g., 'https://obs.cn-north-4.myhuaweicloud.com')", description="Endpoint URL for Huawei Cloud OBS (e.g., 'https://obs.cn-north-4.myhuaweicloud.com')",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,27 +7,27 @@ class OCIStorageConfig(BaseSettings):
Configuration settings for Oracle Cloud Infrastructure (OCI) Object Storage Configuration settings for Oracle Cloud Infrastructure (OCI) Object Storage
""" """
OCI_ENDPOINT: Optional[str] = Field( OCI_ENDPOINT: str | None = Field(
description="URL of the OCI Object Storage endpoint (e.g., 'https://objectstorage.us-phoenix-1.oraclecloud.com')", description="URL of the OCI Object Storage endpoint (e.g., 'https://objectstorage.us-phoenix-1.oraclecloud.com')",
default=None, default=None,
) )
OCI_REGION: Optional[str] = Field( OCI_REGION: str | None = Field(
description="OCI region where the bucket is located (e.g., 'us-phoenix-1')", description="OCI region where the bucket is located (e.g., 'us-phoenix-1')",
default=None, default=None,
) )
OCI_BUCKET_NAME: Optional[str] = Field( OCI_BUCKET_NAME: str | None = Field(
description="Name of the OCI Object Storage bucket to store and retrieve objects (e.g., 'my-oci-bucket')", description="Name of the OCI Object Storage bucket to store and retrieve objects (e.g., 'my-oci-bucket')",
default=None, default=None,
) )
OCI_ACCESS_KEY: Optional[str] = Field( OCI_ACCESS_KEY: str | None = Field(
description="Access key (also known as API key) for authenticating with OCI Object Storage", description="Access key (also known as API key) for authenticating with OCI Object Storage",
default=None, default=None,
) )
OCI_SECRET_KEY: Optional[str] = Field( OCI_SECRET_KEY: str | None = Field(
description="Secret key associated with the access key for authenticating with OCI Object Storage", description="Secret key associated with the access key for authenticating with OCI Object Storage",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,17 +7,17 @@ class SupabaseStorageConfig(BaseSettings):
Configuration settings for Supabase Object Storage Service Configuration settings for Supabase Object Storage Service
""" """
SUPABASE_BUCKET_NAME: Optional[str] = Field( SUPABASE_BUCKET_NAME: str | None = Field(
description="Name of the Supabase bucket to store and retrieve objects (e.g., 'dify-bucket')", description="Name of the Supabase bucket to store and retrieve objects (e.g., 'dify-bucket')",
default=None, default=None,
) )
SUPABASE_API_KEY: Optional[str] = Field( SUPABASE_API_KEY: str | None = Field(
description="API KEY for authenticating with Supabase", description="API KEY for authenticating with Supabase",
default=None, default=None,
) )
SUPABASE_URL: Optional[str] = Field( SUPABASE_URL: str | None = Field(
description="URL of the Supabase", description="URL of the Supabase",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,27 +7,27 @@ class TencentCloudCOSStorageConfig(BaseSettings):
Configuration settings for Tencent Cloud Object Storage (COS) Configuration settings for Tencent Cloud Object Storage (COS)
""" """
TENCENT_COS_BUCKET_NAME: Optional[str] = Field( TENCENT_COS_BUCKET_NAME: str | None = Field(
description="Name of the Tencent Cloud COS bucket to store and retrieve objects", description="Name of the Tencent Cloud COS bucket to store and retrieve objects",
default=None, default=None,
) )
TENCENT_COS_REGION: Optional[str] = Field( TENCENT_COS_REGION: str | None = Field(
description="Tencent Cloud region where the COS bucket is located (e.g., 'ap-guangzhou')", description="Tencent Cloud region where the COS bucket is located (e.g., 'ap-guangzhou')",
default=None, default=None,
) )
TENCENT_COS_SECRET_ID: Optional[str] = Field( TENCENT_COS_SECRET_ID: str | None = Field(
description="SecretId for authenticating with Tencent Cloud COS (part of API credentials)", description="SecretId for authenticating with Tencent Cloud COS (part of API credentials)",
default=None, default=None,
) )
TENCENT_COS_SECRET_KEY: Optional[str] = Field( TENCENT_COS_SECRET_KEY: str | None = Field(
description="SecretKey for authenticating with Tencent Cloud COS (part of API credentials)", description="SecretKey for authenticating with Tencent Cloud COS (part of API credentials)",
default=None, default=None,
) )
TENCENT_COS_SCHEME: Optional[str] = Field( TENCENT_COS_SCHEME: str | None = Field(
description="Protocol scheme for COS requests: 'https' (recommended) or 'http'", description="Protocol scheme for COS requests: 'https' (recommended) or 'http'",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,27 +7,27 @@ class VolcengineTOSStorageConfig(BaseSettings):
Configuration settings for Volcengine Tinder Object Storage (TOS) Configuration settings for Volcengine Tinder Object Storage (TOS)
""" """
VOLCENGINE_TOS_BUCKET_NAME: Optional[str] = Field( VOLCENGINE_TOS_BUCKET_NAME: str | None = Field(
description="Name of the Volcengine TOS bucket to store and retrieve objects (e.g., 'my-tos-bucket')", description="Name of the Volcengine TOS bucket to store and retrieve objects (e.g., 'my-tos-bucket')",
default=None, default=None,
) )
VOLCENGINE_TOS_ACCESS_KEY: Optional[str] = Field( VOLCENGINE_TOS_ACCESS_KEY: str | None = Field(
description="Access Key ID for authenticating with Volcengine TOS", description="Access Key ID for authenticating with Volcengine TOS",
default=None, default=None,
) )
VOLCENGINE_TOS_SECRET_KEY: Optional[str] = Field( VOLCENGINE_TOS_SECRET_KEY: str | None = Field(
description="Secret Access Key for authenticating with Volcengine TOS", description="Secret Access Key for authenticating with Volcengine TOS",
default=None, default=None,
) )
VOLCENGINE_TOS_ENDPOINT: Optional[str] = Field( VOLCENGINE_TOS_ENDPOINT: str | None = Field(
description="URL of the Volcengine TOS endpoint (e.g., 'https://tos-cn-beijing.volces.com')", description="URL of the Volcengine TOS endpoint (e.g., 'https://tos-cn-beijing.volces.com')",
default=None, default=None,
) )
VOLCENGINE_TOS_REGION: Optional[str] = Field( VOLCENGINE_TOS_REGION: str | None = Field(
description="Volcengine region where the TOS bucket is located (e.g., 'cn-beijing')", description="Volcengine region where the TOS bucket is located (e.g., 'cn-beijing')",
default=None, default=None,
) )

View File

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -11,37 +9,37 @@ class AnalyticdbConfig(BaseSettings):
https://www.alibabacloud.com/help/en/analyticdb-for-postgresql/getting-started/create-an-instance-instances-with-vector-engine-optimization-enabled https://www.alibabacloud.com/help/en/analyticdb-for-postgresql/getting-started/create-an-instance-instances-with-vector-engine-optimization-enabled
""" """
ANALYTICDB_KEY_ID: Optional[str] = Field( ANALYTICDB_KEY_ID: str | None = Field(
default=None, description="The Access Key ID provided by Alibaba Cloud for API authentication." default=None, description="The Access Key ID provided by Alibaba Cloud for API authentication."
) )
ANALYTICDB_KEY_SECRET: Optional[str] = Field( ANALYTICDB_KEY_SECRET: str | None = Field(
default=None, description="The Secret Access Key corresponding to the Access Key ID for secure API access." default=None, description="The Secret Access Key corresponding to the Access Key ID for secure API access."
) )
ANALYTICDB_REGION_ID: Optional[str] = Field( ANALYTICDB_REGION_ID: str | None = Field(
default=None, default=None,
description="The region where the AnalyticDB instance is deployed (e.g., 'cn-hangzhou', 'ap-southeast-1').", description="The region where the AnalyticDB instance is deployed (e.g., 'cn-hangzhou', 'ap-southeast-1').",
) )
ANALYTICDB_INSTANCE_ID: Optional[str] = Field( ANALYTICDB_INSTANCE_ID: str | None = Field(
default=None, default=None,
description="The unique identifier of the AnalyticDB instance you want to connect to.", description="The unique identifier of the AnalyticDB instance you want to connect to.",
) )
ANALYTICDB_ACCOUNT: Optional[str] = Field( ANALYTICDB_ACCOUNT: str | None = Field(
default=None, default=None,
description="The account name used to log in to the AnalyticDB instance" description="The account name used to log in to the AnalyticDB instance"
" (usually the initial account created with the instance).", " (usually the initial account created with the instance).",
) )
ANALYTICDB_PASSWORD: Optional[str] = Field( ANALYTICDB_PASSWORD: str | None = Field(
default=None, description="The password associated with the AnalyticDB account for database authentication." default=None, description="The password associated with the AnalyticDB account for database authentication."
) )
ANALYTICDB_NAMESPACE: Optional[str] = Field( ANALYTICDB_NAMESPACE: str | None = Field(
default=None, description="The namespace within AnalyticDB for schema isolation (if using namespace feature)." default=None, description="The namespace within AnalyticDB for schema isolation (if using namespace feature)."
) )
ANALYTICDB_NAMESPACE_PASSWORD: Optional[str] = Field( ANALYTICDB_NAMESPACE_PASSWORD: str | None = Field(
default=None, default=None,
description="The password for accessing the specified namespace within the AnalyticDB instance" description="The password for accessing the specified namespace within the AnalyticDB instance"
" (if namespace feature is enabled).", " (if namespace feature is enabled).",
) )
ANALYTICDB_HOST: Optional[str] = Field( ANALYTICDB_HOST: str | None = Field(
default=None, description="The host of the AnalyticDB instance you want to connect to." default=None, description="The host of the AnalyticDB instance you want to connect to."
) )
ANALYTICDB_PORT: PositiveInt = Field( ANALYTICDB_PORT: PositiveInt = Field(

View File

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, NonNegativeInt, PositiveInt from pydantic import Field, NonNegativeInt, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,7 +7,7 @@ class BaiduVectorDBConfig(BaseSettings):
Configuration settings for Baidu Vector Database Configuration settings for Baidu Vector Database
""" """
BAIDU_VECTOR_DB_ENDPOINT: Optional[str] = Field( BAIDU_VECTOR_DB_ENDPOINT: str | None = Field(
description="URL of the Baidu Vector Database service (e.g., 'http://vdb.bj.baidubce.com')", description="URL of the Baidu Vector Database service (e.g., 'http://vdb.bj.baidubce.com')",
default=None, default=None,
) )
@@ -19,17 +17,17 @@ class BaiduVectorDBConfig(BaseSettings):
default=30000, default=30000,
) )
BAIDU_VECTOR_DB_ACCOUNT: Optional[str] = Field( BAIDU_VECTOR_DB_ACCOUNT: str | None = Field(
description="Account for authenticating with the Baidu Vector Database", description="Account for authenticating with the Baidu Vector Database",
default=None, default=None,
) )
BAIDU_VECTOR_DB_API_KEY: Optional[str] = Field( BAIDU_VECTOR_DB_API_KEY: str | None = Field(
description="API key for authenticating with the Baidu Vector Database service", description="API key for authenticating with the Baidu Vector Database service",
default=None, default=None,
) )
BAIDU_VECTOR_DB_DATABASE: Optional[str] = Field( BAIDU_VECTOR_DB_DATABASE: str | None = Field(
description="Name of the specific Baidu Vector Database to connect to", description="Name of the specific Baidu Vector Database to connect to",
default=None, default=None,
) )

View File

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,7 +7,7 @@ class ChromaConfig(BaseSettings):
Configuration settings for Chroma vector database Configuration settings for Chroma vector database
""" """
CHROMA_HOST: Optional[str] = Field( CHROMA_HOST: str | None = Field(
description="Hostname or IP address of the Chroma server (e.g., 'localhost' or '192.168.1.100')", description="Hostname or IP address of the Chroma server (e.g., 'localhost' or '192.168.1.100')",
default=None, default=None,
) )
@@ -19,22 +17,22 @@ class ChromaConfig(BaseSettings):
default=8000, default=8000,
) )
CHROMA_TENANT: Optional[str] = Field( CHROMA_TENANT: str | None = Field(
description="Tenant identifier for multi-tenancy support in Chroma", description="Tenant identifier for multi-tenancy support in Chroma",
default=None, default=None,
) )
CHROMA_DATABASE: Optional[str] = Field( CHROMA_DATABASE: str | None = Field(
description="Name of the Chroma database to connect to", description="Name of the Chroma database to connect to",
default=None, default=None,
) )
CHROMA_AUTH_PROVIDER: Optional[str] = Field( CHROMA_AUTH_PROVIDER: str | None = Field(
description="Authentication provider for Chroma (e.g., 'basic', 'token', or a custom provider)", description="Authentication provider for Chroma (e.g., 'basic', 'token', or a custom provider)",
default=None, default=None,
) )
CHROMA_AUTH_CREDENTIALS: Optional[str] = Field( CHROMA_AUTH_CREDENTIALS: str | None = Field(
description="Authentication credentials for Chroma (format depends on the auth provider)", description="Authentication credentials for Chroma (format depends on the auth provider)",
default=None, default=None,
) )

View File

@@ -1,69 +1,68 @@
-from typing import Optional
-from pydantic import BaseModel, Field
+from pydantic import Field
+from pydantic_settings import BaseSettings
-class ClickzettaConfig(BaseModel):
+class ClickzettaConfig(BaseSettings):
     """
     Clickzetta Lakehouse vector database configuration
     """
-    CLICKZETTA_USERNAME: Optional[str] = Field(
+    CLICKZETTA_USERNAME: str | None = Field(
         description="Username for authenticating with Clickzetta Lakehouse",
         default=None,
     )
-    CLICKZETTA_PASSWORD: Optional[str] = Field(
+    CLICKZETTA_PASSWORD: str | None = Field(
         description="Password for authenticating with Clickzetta Lakehouse",
         default=None,
     )
-    CLICKZETTA_INSTANCE: Optional[str] = Field(
+    CLICKZETTA_INSTANCE: str | None = Field(
         description="Clickzetta Lakehouse instance ID",
         default=None,
     )
-    CLICKZETTA_SERVICE: Optional[str] = Field(
+    CLICKZETTA_SERVICE: str | None = Field(
         description="Clickzetta API service endpoint (e.g., 'api.clickzetta.com')",
         default="api.clickzetta.com",
     )
-    CLICKZETTA_WORKSPACE: Optional[str] = Field(
+    CLICKZETTA_WORKSPACE: str | None = Field(
         description="Clickzetta workspace name",
         default="default",
     )
-    CLICKZETTA_VCLUSTER: Optional[str] = Field(
+    CLICKZETTA_VCLUSTER: str | None = Field(
         description="Clickzetta virtual cluster name",
         default="default_ap",
     )
-    CLICKZETTA_SCHEMA: Optional[str] = Field(
+    CLICKZETTA_SCHEMA: str | None = Field(
         description="Database schema name in Clickzetta",
         default="public",
     )
-    CLICKZETTA_BATCH_SIZE: Optional[int] = Field(
+    CLICKZETTA_BATCH_SIZE: int | None = Field(
         description="Batch size for bulk insert operations",
         default=100,
     )
-    CLICKZETTA_ENABLE_INVERTED_INDEX: Optional[bool] = Field(
+    CLICKZETTA_ENABLE_INVERTED_INDEX: bool | None = Field(
         description="Enable inverted index for full-text search capabilities",
         default=True,
     )
-    CLICKZETTA_ANALYZER_TYPE: Optional[str] = Field(
+    CLICKZETTA_ANALYZER_TYPE: str | None = Field(
         description="Analyzer type for full-text search: keyword, english, chinese, unicode",
         default="chinese",
     )
-    CLICKZETTA_ANALYZER_MODE: Optional[str] = Field(
+    CLICKZETTA_ANALYZER_MODE: str | None = Field(
         description="Analyzer mode for tokenization: max_word (fine-grained) or smart (intelligent)",
         default="smart",
     )
-    CLICKZETTA_VECTOR_DISTANCE_FUNCTION: Optional[str] = Field(
+    CLICKZETTA_VECTOR_DISTANCE_FUNCTION: str | None = Field(
         description="Distance function for vector similarity: l2_distance or cosine_distance",
         default="cosine_distance",
     )
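Besides the Optional rewrite, ClickzettaConfig switches its base class from BaseModel to BaseSettings, so these CLICKZETTA_* values can now be populated from environment variables (or a .env file) instead of only from explicit constructor arguments. A small hedged sketch of that difference, with illustrative names:

# Hedged sketch (illustrative names, not from this PR): where default values come from.
import os

from pydantic import BaseModel, Field
from pydantic_settings import BaseSettings

os.environ["DEMO_WORKSPACE"] = "analytics"


class DemoModel(BaseModel):
    DEMO_WORKSPACE: str | None = Field(default="default")


class DemoSettings(BaseSettings):
    DEMO_WORKSPACE: str | None = Field(default="default")


print(DemoModel().DEMO_WORKSPACE)     # "default" -- BaseModel ignores the environment
print(DemoSettings().DEMO_WORKSPACE)  # "analytics" -- BaseSettings reads the env var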

View File

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,27 +7,27 @@ class CouchbaseConfig(BaseSettings):
Couchbase configs Couchbase configs
""" """
COUCHBASE_CONNECTION_STRING: Optional[str] = Field( COUCHBASE_CONNECTION_STRING: str | None = Field(
description="COUCHBASE connection string", description="COUCHBASE connection string",
default=None, default=None,
) )
COUCHBASE_USER: Optional[str] = Field( COUCHBASE_USER: str | None = Field(
description="COUCHBASE user", description="COUCHBASE user",
default=None, default=None,
) )
COUCHBASE_PASSWORD: Optional[str] = Field( COUCHBASE_PASSWORD: str | None = Field(
description="COUCHBASE password", description="COUCHBASE password",
default=None, default=None,
) )
COUCHBASE_BUCKET_NAME: Optional[str] = Field( COUCHBASE_BUCKET_NAME: str | None = Field(
description="COUCHBASE bucket name", description="COUCHBASE bucket name",
default=None, default=None,
) )
COUCHBASE_SCOPE_NAME: Optional[str] = Field( COUCHBASE_SCOPE_NAME: str | None = Field(
description="COUCHBASE scope name", description="COUCHBASE scope name",
default=None, default=None,
) )

View File

@@ -1,5 +1,3 @@
-from typing import Optional
 from pydantic import Field, PositiveInt, model_validator
 from pydantic_settings import BaseSettings
@@ -10,7 +8,7 @@ class ElasticsearchConfig(BaseSettings):
     Can load from environment variables or .env files.
     """
-    ELASTICSEARCH_HOST: Optional[str] = Field(
+    ELASTICSEARCH_HOST: str | None = Field(
         description="Hostname or IP address of the Elasticsearch server (e.g., 'localhost' or '192.168.1.100')",
         default="127.0.0.1",
     )
@@ -20,30 +18,28 @@ class ElasticsearchConfig(BaseSettings):
         default=9200,
     )
-    ELASTICSEARCH_USERNAME: Optional[str] = Field(
+    ELASTICSEARCH_USERNAME: str | None = Field(
         description="Username for authenticating with Elasticsearch (default is 'elastic')",
         default="elastic",
     )
-    ELASTICSEARCH_PASSWORD: Optional[str] = Field(
+    ELASTICSEARCH_PASSWORD: str | None = Field(
         description="Password for authenticating with Elasticsearch (default is 'elastic')",
         default="elastic",
     )
     # Elastic Cloud (optional)
-    ELASTICSEARCH_USE_CLOUD: Optional[bool] = Field(
+    ELASTICSEARCH_USE_CLOUD: bool | None = Field(
         description="Set to True to use Elastic Cloud instead of self-hosted Elasticsearch", default=False
     )
-    ELASTICSEARCH_CLOUD_URL: Optional[str] = Field(
+    ELASTICSEARCH_CLOUD_URL: str | None = Field(
         description="Full URL for Elastic Cloud deployment (e.g., 'https://example.es.region.aws.found.io:443')",
         default=None,
     )
-    ELASTICSEARCH_API_KEY: Optional[str] = Field(
-        description="API key for authenticating with Elastic Cloud", default=None
-    )
+    ELASTICSEARCH_API_KEY: str | None = Field(description="API key for authenticating with Elastic Cloud", default=None)
     # Common options
-    ELASTICSEARCH_CA_CERTS: Optional[str] = Field(
+    ELASTICSEARCH_CA_CERTS: str | None = Field(
         description="Path to CA certificate file for SSL verification", default=None
     )
     ELASTICSEARCH_VERIFY_CERTS: bool = Field(

View File

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,17 +7,17 @@ class HuaweiCloudConfig(BaseSettings):
Configuration settings for Huawei cloud search service Configuration settings for Huawei cloud search service
""" """
HUAWEI_CLOUD_HOSTS: Optional[str] = Field( HUAWEI_CLOUD_HOSTS: str | None = Field(
description="Hostname or IP address of the Huawei cloud search service instance", description="Hostname or IP address of the Huawei cloud search service instance",
default=None, default=None,
) )
HUAWEI_CLOUD_USER: Optional[str] = Field( HUAWEI_CLOUD_USER: str | None = Field(
description="Username for authenticating with Huawei cloud search service", description="Username for authenticating with Huawei cloud search service",
default=None, default=None,
) )
HUAWEI_CLOUD_PASSWORD: Optional[str] = Field( HUAWEI_CLOUD_PASSWORD: str | None = Field(
description="Password for authenticating with Huawei cloud search service", description="Password for authenticating with Huawei cloud search service",
default=None, default=None,
) )

View File

@@ -1,5 +1,3 @@
-from typing import Optional
 from pydantic import Field
 from pydantic_settings import BaseSettings
@@ -9,27 +7,27 @@ class LindormConfig(BaseSettings):
     Lindorm configs
     """
-    LINDORM_URL: Optional[str] = Field(
+    LINDORM_URL: str | None = Field(
         description="Lindorm url",
         default=None,
     )
-    LINDORM_USERNAME: Optional[str] = Field(
+    LINDORM_USERNAME: str | None = Field(
         description="Lindorm user",
         default=None,
     )
-    LINDORM_PASSWORD: Optional[str] = Field(
+    LINDORM_PASSWORD: str | None = Field(
         description="Lindorm password",
         default=None,
     )
-    DEFAULT_INDEX_TYPE: Optional[str] = Field(
+    LINDORM_INDEX_TYPE: str | None = Field(
         description="Lindorm Vector Index Type, hnsw or flat is available in dify",
         default="hnsw",
     )
-    DEFAULT_DISTANCE_TYPE: Optional[str] = Field(
+    LINDORM_DISTANCE_TYPE: str | None = Field(
         description="Vector Distance Type, support l2, cosinesimil, innerproduct", default="l2"
     )
-    USING_UGC_INDEX: Optional[bool] = Field(
-        description="Using UGC index will store the same type of Index in a single index but can retrieve separately.",
-        default=False,
+    LINDORM_USING_UGC: bool | None = Field(
+        description="Using UGC index will store indexes with the same IndexType/Dimension in a single big index.",
+        default=True,
     )
-    LINDORM_QUERY_TIMEOUT: Optional[float] = Field(description="The lindorm search request timeout (s)", default=2.0)
+    LINDORM_QUERY_TIMEOUT: float | None = Field(description="The lindorm search request timeout (s)", default=2.0)
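Note that three Lindorm settings are renamed here, not just retyped, and the UGC flag's default flips from False to True, so existing deployments need to update their configuration. A hedged sketch of the mapping, assuming the values are supplied as environment variables (old names are shown in the trailing comments):

# Hedged sketch (not from the PR): the renamed settings as environment variables.
import os

os.environ["LINDORM_INDEX_TYPE"] = "hnsw"     # previously DEFAULT_INDEX_TYPE
os.environ["LINDORM_DISTANCE_TYPE"] = "l2"    # previously DEFAULT_DISTANCE_TYPE
os.environ["LINDORM_USING_UGC"] = "true"      # previously USING_UGC_INDEX; default flips False -> True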

View File

@@ -1,7 +1,8 @@
-from pydantic import BaseModel, Field
+from pydantic import Field
+from pydantic_settings import BaseSettings
-class MatrixoneConfig(BaseModel):
+class MatrixoneConfig(BaseSettings):
     """Matrixone vector database configuration."""
     MATRIXONE_HOST: str = Field(default="localhost", description="Host address of the Matrixone server")

View File

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,22 +7,22 @@ class MilvusConfig(BaseSettings):
Configuration settings for Milvus vector database Configuration settings for Milvus vector database
""" """
MILVUS_URI: Optional[str] = Field( MILVUS_URI: str | None = Field(
description="URI for connecting to the Milvus server (e.g., 'http://localhost:19530' or 'https://milvus-instance.example.com:19530')", description="URI for connecting to the Milvus server (e.g., 'http://localhost:19530' or 'https://milvus-instance.example.com:19530')",
default="http://127.0.0.1:19530", default="http://127.0.0.1:19530",
) )
MILVUS_TOKEN: Optional[str] = Field( MILVUS_TOKEN: str | None = Field(
description="Authentication token for Milvus, if token-based authentication is enabled", description="Authentication token for Milvus, if token-based authentication is enabled",
default=None, default=None,
) )
MILVUS_USER: Optional[str] = Field( MILVUS_USER: str | None = Field(
description="Username for authenticating with Milvus, if username/password authentication is enabled", description="Username for authenticating with Milvus, if username/password authentication is enabled",
default=None, default=None,
) )
MILVUS_PASSWORD: Optional[str] = Field( MILVUS_PASSWORD: str | None = Field(
description="Password for authenticating with Milvus, if username/password authentication is enabled", description="Password for authenticating with Milvus, if username/password authentication is enabled",
default=None, default=None,
) )
@@ -40,7 +38,7 @@ class MilvusConfig(BaseSettings):
default=True, default=True,
) )
MILVUS_ANALYZER_PARAMS: Optional[str] = Field( MILVUS_ANALYZER_PARAMS: str | None = Field(
description='Milvus text analyzer parameters, e.g., {"type": "chinese"} for Chinese segmentation support.', description='Milvus text analyzer parameters, e.g., {"type": "chinese"} for Chinese segmentation support.',
default=None, default=None,
) )

View File

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,27 +7,27 @@ class OceanBaseVectorConfig(BaseSettings):
Configuration settings for OceanBase Vector database Configuration settings for OceanBase Vector database
""" """
OCEANBASE_VECTOR_HOST: Optional[str] = Field( OCEANBASE_VECTOR_HOST: str | None = Field(
description="Hostname or IP address of the OceanBase Vector server (e.g. 'localhost')", description="Hostname or IP address of the OceanBase Vector server (e.g. 'localhost')",
default=None, default=None,
) )
OCEANBASE_VECTOR_PORT: Optional[PositiveInt] = Field( OCEANBASE_VECTOR_PORT: PositiveInt | None = Field(
description="Port number on which the OceanBase Vector server is listening (default is 2881)", description="Port number on which the OceanBase Vector server is listening (default is 2881)",
default=2881, default=2881,
) )
OCEANBASE_VECTOR_USER: Optional[str] = Field( OCEANBASE_VECTOR_USER: str | None = Field(
description="Username for authenticating with the OceanBase Vector database", description="Username for authenticating with the OceanBase Vector database",
default=None, default=None,
) )
OCEANBASE_VECTOR_PASSWORD: Optional[str] = Field( OCEANBASE_VECTOR_PASSWORD: str | None = Field(
description="Password for authenticating with the OceanBase Vector database", description="Password for authenticating with the OceanBase Vector database",
default=None, default=None,
) )
OCEANBASE_VECTOR_DATABASE: Optional[str] = Field( OCEANBASE_VECTOR_DATABASE: str | None = Field(
description="Name of the OceanBase Vector database to connect to", description="Name of the OceanBase Vector database to connect to",
default=None, default=None,
) )

View File

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,7 +7,7 @@ class OpenGaussConfig(BaseSettings):
Configuration settings for OpenGauss Configuration settings for OpenGauss
""" """
OPENGAUSS_HOST: Optional[str] = Field( OPENGAUSS_HOST: str | None = Field(
description="Hostname or IP address of the OpenGauss server(e.g., 'localhost')", description="Hostname or IP address of the OpenGauss server(e.g., 'localhost')",
default=None, default=None,
) )
@@ -19,17 +17,17 @@ class OpenGaussConfig(BaseSettings):
default=6600, default=6600,
) )
OPENGAUSS_USER: Optional[str] = Field( OPENGAUSS_USER: str | None = Field(
description="Username for authenticating with the OpenGauss database", description="Username for authenticating with the OpenGauss database",
default=None, default=None,
) )
OPENGAUSS_PASSWORD: Optional[str] = Field( OPENGAUSS_PASSWORD: str | None = Field(
description="Password for authenticating with the OpenGauss database", description="Password for authenticating with the OpenGauss database",
default=None, default=None,
) )
OPENGAUSS_DATABASE: Optional[str] = Field( OPENGAUSS_DATABASE: str | None = Field(
description="Name of the OpenGauss database to connect to", description="Name of the OpenGauss database to connect to",
default=None, default=None,
) )

View File

@@ -1,5 +1,5 @@
-import enum
-from typing import Literal, Optional
+from enum import Enum
+from typing import Literal
 from pydantic import Field, PositiveInt
 from pydantic_settings import BaseSettings
@@ -10,7 +10,7 @@ class OpenSearchConfig(BaseSettings):
     Configuration settings for OpenSearch
     """
-    class AuthMethod(enum.StrEnum):
+    class AuthMethod(Enum):
         """
         Authentication method for OpenSearch
         """
@@ -18,7 +18,7 @@ class OpenSearchConfig(BaseSettings):
         BASIC = "basic"
         AWS_MANAGED_IAM = "aws_managed_iam"
-    OPENSEARCH_HOST: Optional[str] = Field(
+    OPENSEARCH_HOST: str | None = Field(
         description="Hostname or IP address of the OpenSearch server (e.g., 'localhost' or 'opensearch.example.com')",
         default=None,
     )
@@ -43,21 +43,21 @@ class OpenSearchConfig(BaseSettings):
         default=AuthMethod.BASIC,
     )
-    OPENSEARCH_USER: Optional[str] = Field(
+    OPENSEARCH_USER: str | None = Field(
         description="Username for authenticating with OpenSearch",
         default=None,
     )
-    OPENSEARCH_PASSWORD: Optional[str] = Field(
+    OPENSEARCH_PASSWORD: str | None = Field(
         description="Password for authenticating with OpenSearch",
         default=None,
     )
-    OPENSEARCH_AWS_REGION: Optional[str] = Field(
+    OPENSEARCH_AWS_REGION: str | None = Field(
         description="AWS region for OpenSearch (e.g. 'us-west-2')",
         default=None,
     )
-    OPENSEARCH_AWS_SERVICE: Optional[Literal["es", "aoss"]] = Field(
+    OPENSEARCH_AWS_SERVICE: Literal["es", "aoss"] | None = Field(
         description="AWS service for OpenSearch (e.g. 'aoss' for OpenSearch Serverless)", default=None
     )
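One behavioral subtlety in this block: AuthMethod moves from enum.StrEnum to a plain Enum, and plain Enum members do not compare equal to raw strings. Whether any call site relied on that is not visible in this diff; the sketch below (illustrative, Python 3.11+ for StrEnum) only shows the language-level difference:

# Hedged sketch of the StrEnum vs. Enum comparison semantics (not code from the PR).
import enum
from enum import Enum


class AuthMethodStr(enum.StrEnum):
    BASIC = "basic"


class AuthMethodPlain(Enum):
    BASIC = "basic"


print(AuthMethodStr.BASIC == "basic")          # True: StrEnum members are str instances
print(AuthMethodPlain.BASIC == "basic")        # False: plain Enum members never equal raw strings
print(AuthMethodPlain.BASIC.value == "basic")  # True: callers must compare .value instead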

View File

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,33 +7,33 @@ class OracleConfig(BaseSettings):
Configuration settings for Oracle database Configuration settings for Oracle database
""" """
ORACLE_USER: Optional[str] = Field( ORACLE_USER: str | None = Field(
description="Username for authenticating with the Oracle database", description="Username for authenticating with the Oracle database",
default=None, default=None,
) )
ORACLE_PASSWORD: Optional[str] = Field( ORACLE_PASSWORD: str | None = Field(
description="Password for authenticating with the Oracle database", description="Password for authenticating with the Oracle database",
default=None, default=None,
) )
ORACLE_DSN: Optional[str] = Field( ORACLE_DSN: str | None = Field(
description="Oracle database connection string. For traditional database, use format 'host:port/service_name'. " description="Oracle database connection string. For traditional database, use format 'host:port/service_name'. "
"For autonomous database, use the service name from tnsnames.ora in the wallet", "For autonomous database, use the service name from tnsnames.ora in the wallet",
default=None, default=None,
) )
ORACLE_CONFIG_DIR: Optional[str] = Field( ORACLE_CONFIG_DIR: str | None = Field(
description="Directory containing the tnsnames.ora configuration file. Only used in thin mode connection", description="Directory containing the tnsnames.ora configuration file. Only used in thin mode connection",
default=None, default=None,
) )
ORACLE_WALLET_LOCATION: Optional[str] = Field( ORACLE_WALLET_LOCATION: str | None = Field(
description="Oracle wallet directory path containing the wallet files for secure connection", description="Oracle wallet directory path containing the wallet files for secure connection",
default=None, default=None,
) )
ORACLE_WALLET_PASSWORD: Optional[str] = Field( ORACLE_WALLET_PASSWORD: str | None = Field(
description="Password to decrypt the Oracle wallet, if it is encrypted", description="Password to decrypt the Oracle wallet, if it is encrypted",
default=None, default=None,
) )

View File

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,7 +7,7 @@ class PGVectorConfig(BaseSettings):
Configuration settings for PGVector (PostgreSQL with vector extension) Configuration settings for PGVector (PostgreSQL with vector extension)
""" """
PGVECTOR_HOST: Optional[str] = Field( PGVECTOR_HOST: str | None = Field(
description="Hostname or IP address of the PostgreSQL server with PGVector extension (e.g., 'localhost')", description="Hostname or IP address of the PostgreSQL server with PGVector extension (e.g., 'localhost')",
default=None, default=None,
) )
@@ -19,17 +17,17 @@ class PGVectorConfig(BaseSettings):
default=5433, default=5433,
) )
PGVECTOR_USER: Optional[str] = Field( PGVECTOR_USER: str | None = Field(
description="Username for authenticating with the PostgreSQL database", description="Username for authenticating with the PostgreSQL database",
default=None, default=None,
) )
PGVECTOR_PASSWORD: Optional[str] = Field( PGVECTOR_PASSWORD: str | None = Field(
description="Password for authenticating with the PostgreSQL database", description="Password for authenticating with the PostgreSQL database",
default=None, default=None,
) )
PGVECTOR_DATABASE: Optional[str] = Field( PGVECTOR_DATABASE: str | None = Field(
description="Name of the PostgreSQL database to connect to", description="Name of the PostgreSQL database to connect to",
default=None, default=None,
) )

View File

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,7 +7,7 @@ class PGVectoRSConfig(BaseSettings):
Configuration settings for PGVecto.RS (Rust-based vector extension for PostgreSQL) Configuration settings for PGVecto.RS (Rust-based vector extension for PostgreSQL)
""" """
PGVECTO_RS_HOST: Optional[str] = Field( PGVECTO_RS_HOST: str | None = Field(
description="Hostname or IP address of the PostgreSQL server with PGVecto.RS extension (e.g., 'localhost')", description="Hostname or IP address of the PostgreSQL server with PGVecto.RS extension (e.g., 'localhost')",
default=None, default=None,
) )
@@ -19,17 +17,17 @@ class PGVectoRSConfig(BaseSettings):
default=5431, default=5431,
) )
PGVECTO_RS_USER: Optional[str] = Field( PGVECTO_RS_USER: str | None = Field(
description="Username for authenticating with the PostgreSQL database using PGVecto.RS", description="Username for authenticating with the PostgreSQL database using PGVecto.RS",
default=None, default=None,
) )
PGVECTO_RS_PASSWORD: Optional[str] = Field( PGVECTO_RS_PASSWORD: str | None = Field(
description="Password for authenticating with the PostgreSQL database using PGVecto.RS", description="Password for authenticating with the PostgreSQL database using PGVecto.RS",
default=None, default=None,
) )
PGVECTO_RS_DATABASE: Optional[str] = Field( PGVECTO_RS_DATABASE: str | None = Field(
description="Name of the PostgreSQL database with PGVecto.RS extension to connect to", description="Name of the PostgreSQL database with PGVecto.RS extension to connect to",
default=None, default=None,
) )

View File

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, NonNegativeInt, PositiveInt from pydantic import Field, NonNegativeInt, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,12 +7,12 @@ class QdrantConfig(BaseSettings):
Configuration settings for Qdrant vector database Configuration settings for Qdrant vector database
""" """
QDRANT_URL: Optional[str] = Field( QDRANT_URL: str | None = Field(
description="URL of the Qdrant server (e.g., 'http://localhost:6333' or 'https://qdrant.example.com')", description="URL of the Qdrant server (e.g., 'http://localhost:6333' or 'https://qdrant.example.com')",
default=None, default=None,
) )
QDRANT_API_KEY: Optional[str] = Field( QDRANT_API_KEY: str | None = Field(
description="API key for authenticating with the Qdrant server", description="API key for authenticating with the Qdrant server",
default=None, default=None,
) )

View File

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,7 +7,7 @@ class RelytConfig(BaseSettings):
Configuration settings for Relyt database Configuration settings for Relyt database
""" """
RELYT_HOST: Optional[str] = Field( RELYT_HOST: str | None = Field(
description="Hostname or IP address of the Relyt server (e.g., 'localhost' or 'relyt.example.com')", description="Hostname or IP address of the Relyt server (e.g., 'localhost' or 'relyt.example.com')",
default=None, default=None,
) )
@@ -19,17 +17,17 @@ class RelytConfig(BaseSettings):
default=9200, default=9200,
) )
RELYT_USER: Optional[str] = Field( RELYT_USER: str | None = Field(
description="Username for authenticating with the Relyt database", description="Username for authenticating with the Relyt database",
default=None, default=None,
) )
RELYT_PASSWORD: Optional[str] = Field( RELYT_PASSWORD: str | None = Field(
description="Password for authenticating with the Relyt database", description="Password for authenticating with the Relyt database",
default=None, default=None,
) )
RELYT_DATABASE: Optional[str] = Field( RELYT_DATABASE: str | None = Field(
description="Name of the Relyt database to connect to (default is 'default')", description="Name of the Relyt database to connect to (default is 'default')",
default="default", default="default",
) )

View File

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,22 +7,22 @@ class TableStoreConfig(BaseSettings):
Configuration settings for TableStore. Configuration settings for TableStore.
""" """
TABLESTORE_ENDPOINT: Optional[str] = Field( TABLESTORE_ENDPOINT: str | None = Field(
description="Endpoint address of the TableStore server (e.g. 'https://instance-name.cn-hangzhou.ots.aliyuncs.com')", description="Endpoint address of the TableStore server (e.g. 'https://instance-name.cn-hangzhou.ots.aliyuncs.com')",
default=None, default=None,
) )
TABLESTORE_INSTANCE_NAME: Optional[str] = Field( TABLESTORE_INSTANCE_NAME: str | None = Field(
description="Instance name to access TableStore server (eg. 'instance-name')", description="Instance name to access TableStore server (eg. 'instance-name')",
default=None, default=None,
) )
TABLESTORE_ACCESS_KEY_ID: Optional[str] = Field( TABLESTORE_ACCESS_KEY_ID: str | None = Field(
description="AccessKey id for the instance name", description="AccessKey id for the instance name",
default=None, default=None,
) )
TABLESTORE_ACCESS_KEY_SECRET: Optional[str] = Field( TABLESTORE_ACCESS_KEY_SECRET: str | None = Field(
description="AccessKey secret for the instance name", description="AccessKey secret for the instance name",
default=None, default=None,
) )

View File

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, NonNegativeInt, PositiveInt from pydantic import Field, NonNegativeInt, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,12 +7,12 @@ class TencentVectorDBConfig(BaseSettings):
Configuration settings for Tencent Vector Database Configuration settings for Tencent Vector Database
""" """
TENCENT_VECTOR_DB_URL: Optional[str] = Field( TENCENT_VECTOR_DB_URL: str | None = Field(
description="URL of the Tencent Vector Database service (e.g., 'https://vectordb.tencentcloudapi.com')", description="URL of the Tencent Vector Database service (e.g., 'https://vectordb.tencentcloudapi.com')",
default=None, default=None,
) )
TENCENT_VECTOR_DB_API_KEY: Optional[str] = Field( TENCENT_VECTOR_DB_API_KEY: str | None = Field(
description="API key for authenticating with the Tencent Vector Database service", description="API key for authenticating with the Tencent Vector Database service",
default=None, default=None,
) )
@@ -24,12 +22,12 @@ class TencentVectorDBConfig(BaseSettings):
default=30, default=30,
) )
TENCENT_VECTOR_DB_USERNAME: Optional[str] = Field( TENCENT_VECTOR_DB_USERNAME: str | None = Field(
description="Username for authenticating with the Tencent Vector Database (if required)", description="Username for authenticating with the Tencent Vector Database (if required)",
default=None, default=None,
) )
TENCENT_VECTOR_DB_PASSWORD: Optional[str] = Field( TENCENT_VECTOR_DB_PASSWORD: str | None = Field(
description="Password for authenticating with the Tencent Vector Database (if required)", description="Password for authenticating with the Tencent Vector Database (if required)",
default=None, default=None,
) )
@@ -44,7 +42,7 @@ class TencentVectorDBConfig(BaseSettings):
default=2, default=2,
) )
TENCENT_VECTOR_DB_DATABASE: Optional[str] = Field( TENCENT_VECTOR_DB_DATABASE: str | None = Field(
description="Name of the specific Tencent Vector Database to connect to", description="Name of the specific Tencent Vector Database to connect to",
default=None, default=None,
) )

View File

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, NonNegativeInt, PositiveInt from pydantic import Field, NonNegativeInt, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,12 +7,12 @@ class TidbOnQdrantConfig(BaseSettings):
Tidb on Qdrant configs Tidb on Qdrant configs
""" """
TIDB_ON_QDRANT_URL: Optional[str] = Field( TIDB_ON_QDRANT_URL: str | None = Field(
description="Tidb on Qdrant url", description="Tidb on Qdrant url",
default=None, default=None,
) )
TIDB_ON_QDRANT_API_KEY: Optional[str] = Field( TIDB_ON_QDRANT_API_KEY: str | None = Field(
description="Tidb on Qdrant api key", description="Tidb on Qdrant api key",
default=None, default=None,
) )
@@ -34,37 +32,37 @@ class TidbOnQdrantConfig(BaseSettings):
default=6334, default=6334,
) )
TIDB_PUBLIC_KEY: Optional[str] = Field( TIDB_PUBLIC_KEY: str | None = Field(
description="Tidb account public key", description="Tidb account public key",
default=None, default=None,
) )
TIDB_PRIVATE_KEY: Optional[str] = Field( TIDB_PRIVATE_KEY: str | None = Field(
description="Tidb account private key", description="Tidb account private key",
default=None, default=None,
) )
TIDB_API_URL: Optional[str] = Field( TIDB_API_URL: str | None = Field(
description="Tidb API url", description="Tidb API url",
default=None, default=None,
) )
TIDB_IAM_API_URL: Optional[str] = Field( TIDB_IAM_API_URL: str | None = Field(
description="Tidb IAM API url", description="Tidb IAM API url",
default=None, default=None,
) )
TIDB_REGION: Optional[str] = Field( TIDB_REGION: str | None = Field(
description="Tidb serverless region", description="Tidb serverless region",
default="regions/aws-us-east-1", default="regions/aws-us-east-1",
) )
TIDB_PROJECT_ID: Optional[str] = Field( TIDB_PROJECT_ID: str | None = Field(
description="Tidb project id", description="Tidb project id",
default=None, default=None,
) )
TIDB_SPEND_LIMIT: Optional[int] = Field( TIDB_SPEND_LIMIT: int | None = Field(
description="Tidb spend limit", description="Tidb spend limit",
default=100, default=100,
) )

View File

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,27 +7,27 @@ class TiDBVectorConfig(BaseSettings):
Configuration settings for TiDB Vector database Configuration settings for TiDB Vector database
""" """
TIDB_VECTOR_HOST: Optional[str] = Field( TIDB_VECTOR_HOST: str | None = Field(
description="Hostname or IP address of the TiDB Vector server (e.g., 'localhost' or 'tidb.example.com')", description="Hostname or IP address of the TiDB Vector server (e.g., 'localhost' or 'tidb.example.com')",
default=None, default=None,
) )
TIDB_VECTOR_PORT: Optional[PositiveInt] = Field( TIDB_VECTOR_PORT: PositiveInt | None = Field(
description="Port number on which the TiDB Vector server is listening (default is 4000)", description="Port number on which the TiDB Vector server is listening (default is 4000)",
default=4000, default=4000,
) )
TIDB_VECTOR_USER: Optional[str] = Field( TIDB_VECTOR_USER: str | None = Field(
description="Username for authenticating with the TiDB Vector database", description="Username for authenticating with the TiDB Vector database",
default=None, default=None,
) )
TIDB_VECTOR_PASSWORD: Optional[str] = Field( TIDB_VECTOR_PASSWORD: str | None = Field(
description="Password for authenticating with the TiDB Vector database", description="Password for authenticating with the TiDB Vector database",
default=None, default=None,
) )
TIDB_VECTOR_DATABASE: Optional[str] = Field( TIDB_VECTOR_DATABASE: str | None = Field(
description="Name of the TiDB Vector database to connect to", description="Name of the TiDB Vector database to connect to",
default=None, default=None,
) )

View File

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,12 +7,12 @@ class UpstashConfig(BaseSettings):
Configuration settings for Upstash vector database Configuration settings for Upstash vector database
""" """
UPSTASH_VECTOR_URL: Optional[str] = Field( UPSTASH_VECTOR_URL: str | None = Field(
description="URL of the upstash server (e.g., 'https://vector.upstash.io')", description="URL of the upstash server (e.g., 'https://vector.upstash.io')",
default=None, default=None,
) )
UPSTASH_VECTOR_TOKEN: Optional[str] = Field( UPSTASH_VECTOR_TOKEN: str | None = Field(
description="Token for authenticating with the upstash server", description="Token for authenticating with the upstash server",
default=None, default=None,
) )

View File

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,7 +7,7 @@ class VastbaseVectorConfig(BaseSettings):
Configuration settings for Vector (Vastbase with vector extension) Configuration settings for Vector (Vastbase with vector extension)
""" """
VASTBASE_HOST: Optional[str] = Field( VASTBASE_HOST: str | None = Field(
description="Hostname or IP address of the Vastbase server with Vector extension (e.g., 'localhost')", description="Hostname or IP address of the Vastbase server with Vector extension (e.g., 'localhost')",
default=None, default=None,
) )
@@ -19,17 +17,17 @@ class VastbaseVectorConfig(BaseSettings):
default=5432, default=5432,
) )
VASTBASE_USER: Optional[str] = Field( VASTBASE_USER: str | None = Field(
description="Username for authenticating with the Vastbase database", description="Username for authenticating with the Vastbase database",
default=None, default=None,
) )
VASTBASE_PASSWORD: Optional[str] = Field( VASTBASE_PASSWORD: str | None = Field(
description="Password for authenticating with the Vastbase database", description="Password for authenticating with the Vastbase database",
default=None, default=None,
) )
VASTBASE_DATABASE: Optional[str] = Field( VASTBASE_DATABASE: str | None = Field(
description="Name of the Vastbase database to connect to", description="Name of the Vastbase database to connect to",
default=None, default=None,
) )

View File

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -11,14 +9,14 @@ class VikingDBConfig(BaseSettings):
https://www.volcengine.com/docs/6291/65568 https://www.volcengine.com/docs/6291/65568
""" """
VIKINGDB_ACCESS_KEY: Optional[str] = Field( VIKINGDB_ACCESS_KEY: str | None = Field(
description="The Access Key provided by Volcengine VikingDB for API authentication." description="The Access Key provided by Volcengine VikingDB for API authentication."
"Refer to the following documentation for details on obtaining credentials:" "Refer to the following documentation for details on obtaining credentials:"
"https://www.volcengine.com/docs/6291/65568", "https://www.volcengine.com/docs/6291/65568",
default=None, default=None,
) )
VIKINGDB_SECRET_KEY: Optional[str] = Field( VIKINGDB_SECRET_KEY: str | None = Field(
description="The Secret Key provided by Volcengine VikingDB for API authentication.", description="The Secret Key provided by Volcengine VikingDB for API authentication.",
default=None, default=None,
) )

View File

@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,12 +7,12 @@ class WeaviateConfig(BaseSettings):
Configuration settings for Weaviate vector database Configuration settings for Weaviate vector database
""" """
WEAVIATE_ENDPOINT: Optional[str] = Field( WEAVIATE_ENDPOINT: str | None = Field(
description="URL of the Weaviate server (e.g., 'http://localhost:8080' or 'https://weaviate.example.com')", description="URL of the Weaviate server (e.g., 'http://localhost:8080' or 'https://weaviate.example.com')",
default=None, default=None,
) )
WEAVIATE_API_KEY: Optional[str] = Field( WEAVIATE_API_KEY: str | None = Field(
description="API key for authenticating with the Weaviate server", description="API key for authenticating with the Weaviate server",
default=None, default=None,
) )

View File

@@ -1,6 +1,6 @@
 from pydantic import Field
-from configs.packaging.pyproject import PyProjectConfig, PyProjectTomlConfig
+from configs.packaging.pyproject import PyProjectTomlConfig
 class PackagingInfo(PyProjectTomlConfig):

View File

@@ -1,5 +1,5 @@
 from collections.abc import Mapping
-from typing import Any, Optional
+from typing import Any
 from pydantic import Field
 from pydantic.fields import FieldInfo
@@ -15,22 +15,22 @@ class ApolloSettingsSourceInfo(BaseSettings):
     Packaging build information
     """
-    APOLLO_APP_ID: Optional[str] = Field(
+    APOLLO_APP_ID: str | None = Field(
         description="apollo app_id",
         default=None,
     )
-    APOLLO_CLUSTER: Optional[str] = Field(
+    APOLLO_CLUSTER: str | None = Field(
         description="apollo cluster",
         default=None,
     )
-    APOLLO_CONFIG_URL: Optional[str] = Field(
+    APOLLO_CONFIG_URL: str | None = Field(
         description="apollo config url",
         default=None,
    )
-    APOLLO_NAMESPACE: Optional[str] = Field(
+    APOLLO_NAMESPACE: str | None = Field(
         description="apollo namespace",
         default=None,
     )

View File

@@ -4,8 +4,9 @@ import logging
import os import os
import threading import threading
import time import time
from collections.abc import Mapping from collections.abc import Callable, Mapping
from pathlib import Path from pathlib import Path
from typing import Any
from .python_3x import http_request, makedirs_wrapper from .python_3x import http_request, makedirs_wrapper
from .utils import ( from .utils import (
@@ -25,13 +26,13 @@ logger = logging.getLogger(__name__)
class ApolloClient: class ApolloClient:
def __init__( def __init__(
self, self,
config_url, config_url: str,
app_id, app_id: str,
cluster="default", cluster: str = "default",
secret="", secret: str = "",
start_hot_update=True, start_hot_update: bool = True,
change_listener=None, change_listener: Callable[[str, str, str, Any], None] | None = None,
_notification_map=None, _notification_map: dict[str, int] | None = None,
): ):
# Core routing parameters # Core routing parameters
self.config_url = config_url self.config_url = config_url
@@ -47,17 +48,17 @@ class ApolloClient:
# Private control variables # Private control variables
self._cycle_time = 5 self._cycle_time = 5
self._stopping = False self._stopping = False
self._cache = {} self._cache: dict[str, dict[str, Any]] = {}
self._no_key = {} self._no_key: dict[str, str] = {}
self._hash = {} self._hash: dict[str, str] = {}
self._pull_timeout = 75 self._pull_timeout = 75
self._cache_file_path = os.path.expanduser("~") + "/.dify/config/remote-settings/apollo/cache/" self._cache_file_path = os.path.expanduser("~") + "/.dify/config/remote-settings/apollo/cache/"
self._long_poll_thread = None self._long_poll_thread: threading.Thread | None = None
self._change_listener = change_listener # "add" "delete" "update" self._change_listener = change_listener # "add" "delete" "update"
if _notification_map is None: if _notification_map is None:
_notification_map = {"application": -1} _notification_map = {"application": -1}
self._notification_map = _notification_map self._notification_map = _notification_map
self.last_release_key = None self.last_release_key: str | None = None
# Private startup method # Private startup method
self._path_checker() self._path_checker()
if start_hot_update: if start_hot_update:
@@ -68,7 +69,7 @@ class ApolloClient:
heartbeat.daemon = True heartbeat.daemon = True
heartbeat.start() heartbeat.start()
def get_json_from_net(self, namespace="application"): def get_json_from_net(self, namespace: str = "application") -> dict[str, Any] | None:
url = "{}/configs/{}/{}/{}?releaseKey={}&ip={}".format( url = "{}/configs/{}/{}/{}?releaseKey={}&ip={}".format(
self.config_url, self.app_id, self.cluster, namespace, "", self.ip self.config_url, self.app_id, self.cluster, namespace, "", self.ip
) )
@@ -88,7 +89,7 @@ class ApolloClient:
logger.exception("an error occurred in get_json_from_net") logger.exception("an error occurred in get_json_from_net")
return None return None
def get_value(self, key, default_val=None, namespace="application"): def get_value(self, key: str, default_val: Any = None, namespace: str = "application") -> Any:
try: try:
# read memory configuration # read memory configuration
namespace_cache = self._cache.get(namespace) namespace_cache = self._cache.get(namespace)
@@ -104,7 +105,8 @@ class ApolloClient:
namespace_data = self.get_json_from_net(namespace) namespace_data = self.get_json_from_net(namespace)
val = get_value_from_dict(namespace_data, key) val = get_value_from_dict(namespace_data, key)
if val is not None: if val is not None:
self._update_cache_and_file(namespace_data, namespace) if namespace_data is not None:
self._update_cache_and_file(namespace_data, namespace)
return val return val
# read the file configuration # read the file configuration
@@ -126,23 +128,23 @@ class ApolloClient:
# to ensure the real-time correctness of the function call. # to ensure the real-time correctness of the function call.
# If the user does not have the same default val twice # If the user does not have the same default val twice
# and the default val is used here, there may be a problem. # and the default val is used here, there may be a problem.
def _set_local_cache_none(self, namespace, key): def _set_local_cache_none(self, namespace: str, key: str) -> None:
no_key = no_key_cache_key(namespace, key) no_key = no_key_cache_key(namespace, key)
self._no_key[no_key] = key self._no_key[no_key] = key
def _start_hot_update(self): def _start_hot_update(self) -> None:
self._long_poll_thread = threading.Thread(target=self._listener) self._long_poll_thread = threading.Thread(target=self._listener)
# When the asynchronous thread is started, the daemon thread will automatically exit # When the asynchronous thread is started, the daemon thread will automatically exit
# when the main thread is launched. # when the main thread is launched.
self._long_poll_thread.daemon = True self._long_poll_thread.daemon = True
self._long_poll_thread.start() self._long_poll_thread.start()
def stop(self): def stop(self) -> None:
self._stopping = True
logger.info("Stopping listener...")
# Call the set callback function, and if it is abnormal, try it out
def _call_listener(self, namespace, old_kv, new_kv):
def _call_listener(self, namespace: str, old_kv: dict[str, Any] | None, new_kv: dict[str, Any] | None) -> None:
if self._change_listener is None:
return
if old_kv is None:
@@ -168,12 +170,12 @@ class ApolloClient:
except BaseException as e:
logger.warning(str(e))
def _path_checker(self):
def _path_checker(self) -> None:
if not os.path.isdir(self._cache_file_path):
makedirs_wrapper(self._cache_file_path)
# update the local cache and file cache
def _update_cache_and_file(self, namespace_data, namespace="application"):
def _update_cache_and_file(self, namespace_data: dict[str, Any], namespace: str = "application") -> None:
# update the local cache
self._cache[namespace] = namespace_data
# update the file cache
@@ -187,7 +189,7 @@ class ApolloClient:
self._hash[namespace] = new_hash
# get the configuration from the local file
def _get_local_cache(self, namespace="application"):
def _get_local_cache(self, namespace: str = "application") -> dict[str, Any]:
cache_file_path = os.path.join(self._cache_file_path, f"{self.app_id}_configuration_{namespace}.txt")
if os.path.isfile(cache_file_path):
with open(cache_file_path) as f:
@@ -195,8 +197,8 @@ class ApolloClient:
return result
return {}
def _long_poll(self):
def _long_poll(self) -> None:
notifications = []
notifications: list[dict[str, Any]] = []
for key in self._cache:
namespace_data = self._cache[key]
notification_id = -1
@@ -236,7 +238,7 @@ class ApolloClient:
except Exception as e:
logger.warning(str(e))
def _get_net_and_set_local(self, namespace, n_id, call_change=False):
def _get_net_and_set_local(self, namespace: str, n_id: int, call_change: bool = False) -> None:
namespace_data = self.get_json_from_net(namespace)
if not namespace_data:
return
@@ -248,7 +250,7 @@ class ApolloClient:
new_kv = namespace_data.get(CONFIGURATIONS)
self._call_listener(namespace, old_kv, new_kv)
def _listener(self):
def _listener(self) -> None:
logger.info("start long_poll")
while not self._stopping:
self._long_poll()
@@ -266,13 +268,13 @@ class ApolloClient:
headers["Timestamp"] = time_unix_now
return headers
def _heart_beat(self):
def _heart_beat(self) -> None:
while not self._stopping:
for namespace in self._notification_map:
self._do_heart_beat(namespace)
time.sleep(60 * 10) # 10 minutes
def _do_heart_beat(self, namespace):
def _do_heart_beat(self, namespace: str) -> None:
url = f"{self.config_url}/configs/{self.app_id}/{self.cluster}/{namespace}?ip={self.ip}"
try:
code, body = http_request(url, timeout=3, headers=self._sign_headers(url))
@@ -292,7 +294,7 @@ class ApolloClient:
logger.exception("an error occurred in _do_heart_beat")
return None
def get_all_dicts(self, namespace):
def get_all_dicts(self, namespace: str) -> dict[str, Any] | None:
namespace_data = self._cache.get(namespace)
if namespace_data is None:
net_namespace_data = self.get_json_from_net(namespace)

View File

@@ -2,6 +2,8 @@ import logging
import os
import ssl
import urllib.request
from collections.abc import Mapping
from typing import Any
from urllib import parse
from urllib.error import HTTPError
@@ -19,9 +21,9 @@ urllib.request.install_opener(opener)
logger = logging.getLogger(__name__)
def http_request(url, timeout, headers={}):
def http_request(url: str, timeout: int | float, headers: Mapping[str, str] = {}) -> tuple[int, str | None]:
try:
request = urllib.request.Request(url, headers=headers)
request = urllib.request.Request(url, headers=dict(headers))
res = urllib.request.urlopen(request, timeout=timeout)
body = res.read().decode("utf-8")
return res.code, body
@@ -33,9 +35,9 @@ def http_request(url, timeout, headers={}):
raise e
def url_encode(params):
def url_encode(params: dict[str, Any]) -> str:
return parse.urlencode(params)
def makedirs_wrapper(path):
def makedirs_wrapper(path: str) -> None:
os.makedirs(path, exist_ok=True)

View File

@@ -1,5 +1,6 @@
import hashlib
import socket
from typing import Any
from .python_3x import url_encode
@@ -10,7 +11,7 @@ NAMESPACE_NAME = "namespaceName"
# add timestamps uris and keys
def signature(timestamp, uri, secret):
def signature(timestamp: str, uri: str, secret: str) -> str:
import base64
import hmac
@@ -19,16 +20,16 @@ def signature(timestamp, uri, secret):
return base64.b64encode(hmac_code).decode()
def url_encode_wrapper(params):
def url_encode_wrapper(params: dict[str, Any]) -> str:
return url_encode(params)
def no_key_cache_key(namespace, key):
def no_key_cache_key(namespace: str, key: str) -> str:
return f"{namespace}{len(namespace)}{key}"
# Returns whether the obtained value is obtained, and None if it does not
def get_value_from_dict(namespace_cache, key):
def get_value_from_dict(namespace_cache: dict[str, Any] | None, key: str) -> Any:
if namespace_cache:
kv_data = namespace_cache.get(CONFIGURATIONS)
if kv_data is None:
@@ -38,7 +39,7 @@ def get_value_from_dict(namespace_cache, key):
return None
def init_ip():
def init_ip() -> str:
ip = ""
s = None
try:

View File

@@ -11,5 +11,5 @@ class RemoteSettingsSource:
def get_field_value(self, field: FieldInfo, field_name: str) -> tuple[Any, str, bool]:
raise NotImplementedError
def prepare_field_value(self, field_name: str, field: FieldInfo, value: Any, value_is_complex: bool) -> Any:
def prepare_field_value(self, field_name: str, field: FieldInfo, value: Any, value_is_complex: bool):
return value

View File

@@ -11,16 +11,16 @@ logger = logging.getLogger(__name__)
from configs.remote_settings_sources.base import RemoteSettingsSource
from .utils import _parse_config
from .utils import parse_config
class NacosSettingsSource(RemoteSettingsSource):
def __init__(self, configs: Mapping[str, Any]):
self.configs = configs
self.remote_configs: dict[str, Any] = {}
self.remote_configs: dict[str, str] = {}
self.async_init()
def async_init(self):
def async_init(self) -> None:
data_id = os.getenv("DIFY_ENV_NACOS_DATA_ID", "dify-api-env.properties")
group = os.getenv("DIFY_ENV_NACOS_GROUP", "nacos-dify")
tenant = os.getenv("DIFY_ENV_NACOS_NAMESPACE", "")
@@ -29,22 +29,19 @@ class NacosSettingsSource(RemoteSettingsSource):
try:
content = NacosHttpClient().http_request("/nacos/v1/cs/configs", method="GET", headers={}, params=params)
self.remote_configs = self._parse_config(content)
except Exception as e:
except Exception:
logger.exception("[get-access-token] exception occurred")
raise
def _parse_config(self, content: str) -> dict:
def _parse_config(self, content: str) -> dict[str, str]:
if not content:
return {}
try:
return _parse_config(self, content)
return parse_config(content)
except Exception as e:
raise RuntimeError(f"Failed to parse config: {e}")
def get_field_value(self, field: FieldInfo, field_name: str) -> tuple[Any, str, bool]:
if not isinstance(self.remote_configs, dict):
raise ValueError(f"remote configs is not dict, but {type(self.remote_configs)}")
field_value = self.remote_configs.get(field_name)
if field_value is None:
return None, field_name, False

View File

@@ -17,20 +17,26 @@ class NacosHttpClient:
self.ak = os.getenv("DIFY_ENV_NACOS_ACCESS_KEY")
self.sk = os.getenv("DIFY_ENV_NACOS_SECRET_KEY")
self.server = os.getenv("DIFY_ENV_NACOS_SERVER_ADDR", "localhost:8848")
self.token = None
self.token: str | None = None
self.token_ttl = 18000
self.token_expire_time: float = 0
def http_request(self, url, method="GET", headers=None, params=None):
def http_request(
self, url: str, method: str = "GET", headers: dict[str, str] | None = None, params: dict[str, str] | None = None
) -> str:
if headers is None:
headers = {}
if params is None:
params = {}
try:
self._inject_auth_info(headers, params)
response = requests.request(method, url="http://" + self.server + url, headers=headers, params=params)
response.raise_for_status()
return response.text
except requests.exceptions.RequestException as e:
except requests.RequestException as e:
return f"Request to Nacos failed: {e}"
def _inject_auth_info(self, headers, params, module="config"):
def _inject_auth_info(self, headers: dict[str, str], params: dict[str, str], module: str = "config") -> None:
headers.update({"User-Agent": "Nacos-Http-Client-In-Dify:v0.0.1"})
if module == "login":
@@ -45,16 +51,17 @@ class NacosHttpClient:
headers["timeStamp"] = ts
if self.username and self.password:
self.get_access_token(force_refresh=False)
params["accessToken"] = self.token
if self.token is not None:
params["accessToken"] = self.token
def __do_sign(self, sign_str, sk):
def __do_sign(self, sign_str: str, sk: str) -> str:
return (
base64.encodebytes(hmac.new(sk.encode(), sign_str.encode(), digestmod=hashlib.sha1).digest())
.decode()
.strip()
)
def get_sign_str(self, group, tenant, ts):
def get_sign_str(self, group: str, tenant: str, ts: str) -> str:
sign_str = ""
if tenant:
sign_str = tenant + "+"
@@ -63,7 +70,7 @@ class NacosHttpClient:
sign_str += ts # Directly concatenate ts without conditional checks, because the nacos auth header forced it.
return sign_str
def get_access_token(self, force_refresh=False):
def get_access_token(self, force_refresh: bool = False) -> str | None:
current_time = time.time()
if self.token and not force_refresh and self.token_expire_time > current_time:
return self.token
@@ -77,6 +84,7 @@ class NacosHttpClient:
self.token = response_data.get("accessToken")
self.token_ttl = response_data.get("tokenTtl", 18000)
self.token_expire_time = current_time + self.token_ttl - 10
except Exception as e:
return self.token
except Exception:
logger.exception("[get-access-token] exception occur")
raise

View File

@@ -1,4 +1,4 @@
def _parse_config(self, content: str) -> dict[str, str]:
def parse_config(content: str) -> dict[str, str]:
config: dict[str, str] = {}
if not content:
return config

View File

@@ -16,14 +16,14 @@ AUDIO_EXTENSIONS = ["mp3", "m4a", "wav", "amr", "mpga"]
AUDIO_EXTENSIONS.extend([ext.upper() for ext in AUDIO_EXTENSIONS])
_doc_extensions: list[str]
if dify_config.ETL_TYPE == "Unstructured":
DOCUMENT_EXTENSIONS = ["txt", "markdown", "md", "mdx", "pdf", "html", "htm", "xlsx", "xls", "vtt", "properties"]
_doc_extensions = ["txt", "markdown", "md", "mdx", "pdf", "html", "htm", "xlsx", "xls", "vtt", "properties"]
DOCUMENT_EXTENSIONS.extend(("doc", "docx", "csv", "eml", "msg", "pptx", "xml", "epub"))
_doc_extensions.extend(("doc", "docx", "csv", "eml", "msg", "pptx", "xml", "epub"))
if dify_config.UNSTRUCTURED_API_URL:
DOCUMENT_EXTENSIONS.append("ppt")
_doc_extensions.append("ppt")
DOCUMENT_EXTENSIONS.extend([ext.upper() for ext in DOCUMENT_EXTENSIONS])
else:
DOCUMENT_EXTENSIONS = [
_doc_extensions = [
"txt",
"markdown",
"md",
@@ -38,4 +38,4 @@ else:
"vtt",
"properties",
]
DOCUMENT_EXTENSIONS.extend([ext.upper() for ext in DOCUMENT_EXTENSIONS])
DOCUMENT_EXTENSIONS = _doc_extensions + [ext.upper() for ext in _doc_extensions]

View File

@@ -19,6 +19,7 @@ language_timezone_mapping = {
"fa-IR": "Asia/Tehran",
"sl-SI": "Europe/Ljubljana",
"th-TH": "Asia/Bangkok",
"id-ID": "Asia/Jakarta",
}
languages = list(language_timezone_mapping.keys())

View File

@@ -7,7 +7,7 @@ default_app_templates: Mapping[AppMode, Mapping] = {
# workflow default mode
AppMode.WORKFLOW: {
"app": {
"mode": AppMode.WORKFLOW.value,
"mode": AppMode.WORKFLOW,
"enable_site": True,
"enable_api": True,
}
@@ -15,7 +15,7 @@ default_app_templates: Mapping[AppMode, Mapping] = {
# completion default mode
AppMode.COMPLETION: {
"app": {
"mode": AppMode.COMPLETION.value,
"mode": AppMode.COMPLETION,
"enable_site": True,
"enable_api": True,
},
@@ -44,7 +44,7 @@ default_app_templates: Mapping[AppMode, Mapping] = {
# chat default mode
AppMode.CHAT: {
"app": {
"mode": AppMode.CHAT.value,
"mode": AppMode.CHAT,
"enable_site": True,
"enable_api": True,
},
@@ -60,7 +60,7 @@ default_app_templates: Mapping[AppMode, Mapping] = {
# advanced-chat default mode
AppMode.ADVANCED_CHAT: {
"app": {
"mode": AppMode.ADVANCED_CHAT.value,
"mode": AppMode.ADVANCED_CHAT,
"enable_site": True,
"enable_api": True,
},
@@ -68,7 +68,7 @@ default_app_templates: Mapping[AppMode, Mapping] = {
# agent-chat default mode
AppMode.AGENT_CHAT: {
"app": {
"mode": AppMode.AGENT_CHAT.value,
"mode": AppMode.AGENT_CHAT,
"enable_site": True,
"enable_api": True,
},

View File

@@ -8,7 +8,6 @@ if TYPE_CHECKING:
from core.model_runtime.entities.model_entities import AIModelEntity
from core.plugin.entities.plugin_daemon import PluginModelProviderEntity
from core.tools.plugin_tool.provider import PluginToolProviderController
from core.workflow.entities.variable_pool import VariablePool
"""

View File

@@ -1,4 +1,5 @@
from flask import Blueprint
from flask_restx import Namespace
from libs.external_api import ExternalApi
@@ -26,7 +27,16 @@ from .files import FileApi, FilePreviewApi, FileSupportTypeApi
from .remote_files import RemoteFileInfoApi, RemoteFileUploadApi
bp = Blueprint("console", __name__, url_prefix="/console/api")
api = ExternalApi(bp)
api = ExternalApi(
bp,
version="1.0",
title="Console API",
description="Console management APIs for app configuration, monitoring, and administration",
)
# Create namespace
console_ns = Namespace("console", description="Console management API operations", path="/")
# File
api.add_resource(FileApi, "/files/upload")
@@ -43,7 +53,16 @@ api.add_resource(AppImportConfirmApi, "/apps/imports/<string:import_id>/confirm"
api.add_resource(AppImportCheckDependenciesApi, "/apps/imports/<string:app_id>/check-dependencies")
# Import other controllers
from . import admin, apikey, extension, feature, ping, setup, version
from . import (
admin,
apikey,
extension,
feature,
init_validate,
ping,
setup,
version,
)
# Import app controllers
from .app import (
@@ -70,7 +89,16 @@ from .app import (
)
# Import auth controllers
from .auth import activate, data_source_bearer_auth, data_source_oauth, forgot_password, login, oauth, oauth_server
from .auth import (
activate,
data_source_bearer_auth,
data_source_oauth,
email_register,
forgot_password,
login,
oauth,
oauth_server,
)
# Import billing controllers
from .billing import billing, compliance
@@ -95,6 +123,23 @@ from .explore import (
saved_message,
)
# Import tag controllers
from .tag import tags
# Import workspace controllers
from .workspace import (
account,
agent_providers,
endpoint,
load_balancing_config,
members,
model_providers,
models,
plugin,
tool_providers,
workspace,
)
# Explore Audio
api.add_resource(ChatAudioApi, "/installed-apps/<uuid:installed_app_id>/audio-to-text", endpoint="installed_app_audio")
api.add_resource(ChatTextApi, "/installed-apps/<uuid:installed_app_id>/text-to-audio", endpoint="installed_app_text")
@@ -166,19 +211,71 @@ api.add_resource(
InstalledAppWorkflowTaskStopApi, "/installed-apps/<uuid:installed_app_id>/workflows/tasks/<string:task_id>/stop"
)
# Import tag controllers
from .tag import tags
# Import workspace controllers
from .workspace import (
account,
agent_providers,
endpoint,
load_balancing_config,
members,
model_providers,
models,
plugin,
tool_providers,
workspace,
)
api.add_namespace(console_ns)
__all__ = [
"account",
"activate",
"admin",
"advanced_prompt_template",
"agent",
"agent_providers",
"annotation",
"api",
"apikey",
"app",
"audio",
"billing",
"bp",
"completion",
"compliance",
"console_ns",
"conversation",
"conversation_variables",
"data_source",
"data_source_bearer_auth",
"data_source_oauth",
"datasets",
"datasets_document",
"datasets_segments",
"email_register",
"endpoint",
"extension",
"external",
"feature",
"forgot_password",
"generator",
"hit_testing",
"init_validate",
"installed_app",
"load_balancing_config",
"login",
"mcp_server",
"members",
"message",
"metadata",
"model_config",
"model_providers",
"models",
"oauth",
"oauth_server",
"ops_trace",
"parameter",
"ping",
"plugin",
"recommended_app",
"saved_message",
"setup",
"site",
"statistic",
"tags",
"tool_providers",
"version",
"website",
"workflow",
"workflow_app_log",
"workflow_draft_variable",
"workflow_run",
"workflow_statistic",
"workspace",
]

View File

@@ -1,22 +1,26 @@
from collections.abc import Callable
from functools import wraps
from typing import ParamSpec, TypeVar
from flask import request
from flask_restx import Resource, reqparse
from flask_restx import Resource, fields, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session
from werkzeug.exceptions import NotFound, Unauthorized
P = ParamSpec("P")
R = TypeVar("R")
from configs import dify_config
from constants.languages import supported_language
from controllers.console import api
from controllers.console import api, console_ns
from controllers.console.wraps import only_edition_cloud
from extensions.ext_database import db
from models.model import App, InstalledApp, RecommendedApp
def admin_required(view):
def admin_required(view: Callable[P, R]):
@wraps(view)
def decorated(*args, **kwargs):
def decorated(*args: P.args, **kwargs: P.kwargs):
if not dify_config.ADMIN_API_KEY:
raise Unauthorized("API key is invalid.")
@@ -41,7 +45,28 @@ def admin_required(view):
return decorated
@console_ns.route("/admin/insert-explore-apps")
class InsertExploreAppListApi(Resource): class InsertExploreAppListApi(Resource):
@api.doc("insert_explore_app")
@api.doc(description="Insert or update an app in the explore list")
@api.expect(
api.model(
"InsertExploreAppRequest",
{
"app_id": fields.String(required=True, description="Application ID"),
"desc": fields.String(description="App description"),
"copyright": fields.String(description="Copyright information"),
"privacy_policy": fields.String(description="Privacy policy"),
"custom_disclaimer": fields.String(description="Custom disclaimer"),
"language": fields.String(required=True, description="Language code"),
"category": fields.String(required=True, description="App category"),
"position": fields.Integer(required=True, description="Display position"),
},
)
)
@api.response(200, "App updated successfully")
@api.response(201, "App inserted successfully")
@api.response(404, "App not found")
@only_edition_cloud @only_edition_cloud
@admin_required @admin_required
def post(self): def post(self):
@@ -111,7 +136,12 @@ class InsertExploreAppListApi(Resource):
return {"result": "success"}, 200 return {"result": "success"}, 200
@console_ns.route("/admin/insert-explore-apps/<uuid:app_id>")
class InsertExploreAppApi(Resource): class InsertExploreAppApi(Resource):
@api.doc("delete_explore_app")
@api.doc(description="Remove an app from the explore list")
@api.doc(params={"app_id": "Application ID to remove"})
@api.response(204, "App removed successfully")
@only_edition_cloud @only_edition_cloud
@admin_required @admin_required
def delete(self, app_id): def delete(self, app_id):
@@ -130,21 +160,21 @@ class InsertExploreAppApi:
app.is_public = False
with Session(db.engine) as session:
installed_apps = session.execute(
select(InstalledApp).where(
InstalledApp.app_id == recommended_app.app_id,
InstalledApp.tenant_id != InstalledApp.app_owner_tenant_id,
)
).all()
installed_apps = (
session.execute(
select(InstalledApp).where(
InstalledApp.app_id == recommended_app.app_id,
InstalledApp.tenant_id != InstalledApp.app_owner_tenant_id,
)
)
.scalars()
.all()
)
for installed_app in installed_apps:
db.session.delete(installed_app)
session.delete(installed_app)
db.session.delete(recommended_app)
db.session.commit()
return {"result": "success"}, 204
api.add_resource(InsertExploreAppListApi, "/admin/insert-explore-apps")
api.add_resource(InsertExploreAppApi, "/admin/insert-explore-apps/<uuid:app_id>")

View File

@@ -1,8 +1,7 @@
from typing import Any, Optional
import flask_restx
from flask_login import current_user
from flask_restx import Resource, fields, marshal_with
from flask_restx._http import HTTPStatus
from sqlalchemy import select
from sqlalchemy.orm import Session
from werkzeug.exceptions import Forbidden
@@ -13,7 +12,7 @@ from libs.login import login_required
from models.dataset import Dataset
from models.model import ApiToken, App
from . import api
from . import api, console_ns
from .wraps import account_initialization_required, setup_required
api_key_fields = {
@@ -40,7 +39,7 @@ def _get_resource(resource_id, tenant_id, resource_model):
).scalar_one_or_none()
if resource is None:
flask_restx.abort(404, message=f"{resource_model.__name__} not found.")
flask_restx.abort(HTTPStatus.NOT_FOUND, message=f"{resource_model.__name__} not found.")
return resource
@@ -49,7 +48,7 @@ class BaseApiKeyListResource(Resource):
method_decorators = [account_initialization_required, login_required, setup_required]
resource_type: str | None = None
resource_model: Optional[Any] = None
resource_model: type | None = None
resource_id_field: str | None = None resource_id_field: str | None = None
token_prefix: str | None = None token_prefix: str | None = None
max_keys = 10 max_keys = 10
@@ -59,11 +58,11 @@
assert self.resource_id_field is not None, "resource_id_field must be set"
resource_id = str(resource_id)
_get_resource(resource_id, current_user.current_tenant_id, self.resource_model)
keys = (
db.session.query(ApiToken)
.where(ApiToken.type == self.resource_type, getattr(ApiToken, self.resource_id_field) == resource_id)
.all()
)
keys = db.session.scalars(
select(ApiToken).where(
ApiToken.type == self.resource_type, getattr(ApiToken, self.resource_id_field) == resource_id
)
).all()
return {"items": keys}
@marshal_with(api_key_fields)
@@ -82,12 +81,12 @@
if current_key_count >= self.max_keys:
flask_restx.abort(
400,
HTTPStatus.BAD_REQUEST,
message=f"Cannot create more than {self.max_keys} API keys for this resource type.",
code="max_keys_exceeded",
custom="max_keys_exceeded",
)
key = ApiToken.generate_api_key(self.token_prefix, 24)
key = ApiToken.generate_api_key(self.token_prefix or "", 24)
api_token = ApiToken()
setattr(api_token, self.resource_id_field, resource_id)
api_token.tenant_id = current_user.current_tenant_id
@@ -102,7 +101,7 @@ class BaseApiKeyResource(Resource):
method_decorators = [account_initialization_required, login_required, setup_required]
resource_type: str | None = None
resource_model: Optional[Any] = None
resource_model: type | None = None
resource_id_field: str | None = None
def delete(self, resource_id, api_key_id):
@@ -126,7 +125,7 @@ class BaseApiKeyResource(Resource):
)
if key is None:
flask_restx.abort(404, message="API key not found")
flask_restx.abort(HTTPStatus.NOT_FOUND, message="API key not found")
db.session.query(ApiToken).where(ApiToken.id == api_key_id).delete()
db.session.commit()
@@ -134,7 +133,25 @@ class BaseApiKeyResource(Resource):
return {"result": "success"}, 204 return {"result": "success"}, 204
@console_ns.route("/apps/<uuid:resource_id>/api-keys")
class AppApiKeyListResource(BaseApiKeyListResource): class AppApiKeyListResource(BaseApiKeyListResource):
@api.doc("get_app_api_keys")
@api.doc(description="Get all API keys for an app")
@api.doc(params={"resource_id": "App ID"})
@api.response(200, "Success", api_key_list)
def get(self, resource_id):
"""Get all API keys for an app"""
return super().get(resource_id)
@api.doc("create_app_api_key")
@api.doc(description="Create a new API key for an app")
@api.doc(params={"resource_id": "App ID"})
@api.response(201, "API key created successfully", api_key_fields)
@api.response(400, "Maximum keys exceeded")
def post(self, resource_id):
"""Create a new API key for an app"""
return super().post(resource_id)
def after_request(self, resp): def after_request(self, resp):
resp.headers["Access-Control-Allow-Origin"] = "*" resp.headers["Access-Control-Allow-Origin"] = "*"
resp.headers["Access-Control-Allow-Credentials"] = "true" resp.headers["Access-Control-Allow-Credentials"] = "true"
@@ -146,7 +163,16 @@ class AppApiKeyListResource(BaseApiKeyListResource):
token_prefix = "app-" token_prefix = "app-"
@console_ns.route("/apps/<uuid:resource_id>/api-keys/<uuid:api_key_id>")
class AppApiKeyResource(BaseApiKeyResource): class AppApiKeyResource(BaseApiKeyResource):
@api.doc("delete_app_api_key")
@api.doc(description="Delete an API key for an app")
@api.doc(params={"resource_id": "App ID", "api_key_id": "API key ID"})
@api.response(204, "API key deleted successfully")
def delete(self, resource_id, api_key_id):
"""Delete an API key for an app"""
return super().delete(resource_id, api_key_id)
def after_request(self, resp): def after_request(self, resp):
resp.headers["Access-Control-Allow-Origin"] = "*" resp.headers["Access-Control-Allow-Origin"] = "*"
resp.headers["Access-Control-Allow-Credentials"] = "true" resp.headers["Access-Control-Allow-Credentials"] = "true"
@@ -157,7 +183,25 @@ class AppApiKeyResource(BaseApiKeyResource):
resource_id_field = "app_id" resource_id_field = "app_id"
@console_ns.route("/datasets/<uuid:resource_id>/api-keys")
class DatasetApiKeyListResource(BaseApiKeyListResource): class DatasetApiKeyListResource(BaseApiKeyListResource):
@api.doc("get_dataset_api_keys")
@api.doc(description="Get all API keys for a dataset")
@api.doc(params={"resource_id": "Dataset ID"})
@api.response(200, "Success", api_key_list)
def get(self, resource_id):
"""Get all API keys for a dataset"""
return super().get(resource_id)
@api.doc("create_dataset_api_key")
@api.doc(description="Create a new API key for a dataset")
@api.doc(params={"resource_id": "Dataset ID"})
@api.response(201, "API key created successfully", api_key_fields)
@api.response(400, "Maximum keys exceeded")
def post(self, resource_id):
"""Create a new API key for a dataset"""
return super().post(resource_id)
def after_request(self, resp): def after_request(self, resp):
resp.headers["Access-Control-Allow-Origin"] = "*" resp.headers["Access-Control-Allow-Origin"] = "*"
resp.headers["Access-Control-Allow-Credentials"] = "true" resp.headers["Access-Control-Allow-Credentials"] = "true"
@@ -169,7 +213,16 @@ class DatasetApiKeyListResource(BaseApiKeyListResource):
token_prefix = "ds-" token_prefix = "ds-"
@console_ns.route("/datasets/<uuid:resource_id>/api-keys/<uuid:api_key_id>")
class DatasetApiKeyResource(BaseApiKeyResource): class DatasetApiKeyResource(BaseApiKeyResource):
@api.doc("delete_dataset_api_key")
@api.doc(description="Delete an API key for a dataset")
@api.doc(params={"resource_id": "Dataset ID", "api_key_id": "API key ID"})
@api.response(204, "API key deleted successfully")
def delete(self, resource_id, api_key_id):
"""Delete an API key for a dataset"""
return super().delete(resource_id, api_key_id)
def after_request(self, resp): def after_request(self, resp):
resp.headers["Access-Control-Allow-Origin"] = "*" resp.headers["Access-Control-Allow-Origin"] = "*"
resp.headers["Access-Control-Allow-Credentials"] = "true" resp.headers["Access-Control-Allow-Credentials"] = "true"
@@ -178,9 +231,3 @@ class DatasetApiKeyResource(BaseApiKeyResource):
resource_type = "dataset" resource_type = "dataset"
resource_model = Dataset resource_model = Dataset
resource_id_field = "dataset_id" resource_id_field = "dataset_id"
api.add_resource(AppApiKeyListResource, "/apps/<uuid:resource_id>/api-keys")
api.add_resource(AppApiKeyResource, "/apps/<uuid:resource_id>/api-keys/<uuid:api_key_id>")
api.add_resource(DatasetApiKeyListResource, "/datasets/<uuid:resource_id>/api-keys")
api.add_resource(DatasetApiKeyResource, "/datasets/<uuid:resource_id>/api-keys/<uuid:api_key_id>")

View File

@@ -1,12 +1,26 @@
from flask_restx import Resource, reqparse
from flask_restx import Resource, fields, reqparse
from controllers.console import api
from controllers.console import api, console_ns
from controllers.console.wraps import account_initialization_required, setup_required
from libs.login import login_required
from services.advanced_prompt_template_service import AdvancedPromptTemplateService
@console_ns.route("/app/prompt-templates")
class AdvancedPromptTemplateList(Resource): class AdvancedPromptTemplateList(Resource):
@api.doc("get_advanced_prompt_templates")
@api.doc(description="Get advanced prompt templates based on app mode and model configuration")
@api.expect(
api.parser()
.add_argument("app_mode", type=str, required=True, location="args", help="Application mode")
.add_argument("model_mode", type=str, required=True, location="args", help="Model mode")
.add_argument("has_context", type=str, default="true", location="args", help="Whether has context")
.add_argument("model_name", type=str, required=True, location="args", help="Model name")
)
@api.response(
200, "Prompt templates retrieved successfully", fields.List(fields.Raw(description="Prompt template data"))
)
@api.response(400, "Invalid request parameters")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -19,6 +33,3 @@ class AdvancedPromptTemplateList(Resource):
args = parser.parse_args() args = parser.parse_args()
return AdvancedPromptTemplateService.get_prompt(args) return AdvancedPromptTemplateService.get_prompt(args)
api.add_resource(AdvancedPromptTemplateList, "/app/prompt-templates")

View File

@@ -1,6 +1,6 @@
from flask_restx import Resource, reqparse
from flask_restx import Resource, fields, reqparse
from controllers.console import api
from controllers.console import api, console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from libs.helper import uuid_value
@@ -9,7 +9,18 @@ from models.model import AppMode
from services.agent_service import AgentService
@console_ns.route("/apps/<uuid:app_id>/agent/logs")
class AgentLogApi(Resource): class AgentLogApi(Resource):
@api.doc("get_agent_logs")
@api.doc(description="Get agent execution logs for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("message_id", type=str, required=True, location="args", help="Message UUID")
.add_argument("conversation_id", type=str, required=True, location="args", help="Conversation UUID")
)
@api.response(200, "Agent logs retrieved successfully", fields.List(fields.Raw(description="Agent log entries")))
@api.response(400, "Invalid request parameters")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -23,6 +34,3 @@ class AgentLogApi(Resource):
args = parser.parse_args() args = parser.parse_args()
return AgentService.get_agent_logs(app_model, args["conversation_id"], args["message_id"]) return AgentService.get_agent_logs(app_model, args["conversation_id"], args["message_id"])
api.add_resource(AgentLogApi, "/apps/<uuid:app_id>/agent/logs")

View File

@@ -2,11 +2,11 @@ from typing import Literal
from flask import request
from flask_login import current_user
from flask_restx import Resource, marshal, marshal_with, reqparse
from flask_restx import Resource, fields, marshal, marshal_with, reqparse
from werkzeug.exceptions import Forbidden
from controllers.common.errors import NoFileUploadedError, TooManyFilesError
from controllers.console import api
from controllers.console import api, console_ns
from controllers.console.wraps import (
account_initialization_required,
cloud_edition_billing_resource_check,
@@ -21,7 +21,23 @@ from libs.login import login_required
from services.annotation_service import AppAnnotationService
@console_ns.route("/apps/<uuid:app_id>/annotation-reply/<string:action>")
class AnnotationReplyActionApi(Resource): class AnnotationReplyActionApi(Resource):
@api.doc("annotation_reply_action")
@api.doc(description="Enable or disable annotation reply for an app")
@api.doc(params={"app_id": "Application ID", "action": "Action to perform (enable/disable)"})
@api.expect(
api.model(
"AnnotationReplyActionRequest",
{
"score_threshold": fields.Float(required=True, description="Score threshold for annotation matching"),
"embedding_provider_name": fields.String(required=True, description="Embedding provider name"),
"embedding_model_name": fields.String(required=True, description="Embedding model name"),
},
)
)
@api.response(200, "Action completed successfully")
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -43,7 +59,13 @@ class AnnotationReplyActionApi(Resource):
return result, 200 return result, 200
@console_ns.route("/apps/<uuid:app_id>/annotation-setting")
class AppAnnotationSettingDetailApi(Resource): class AppAnnotationSettingDetailApi(Resource):
@api.doc("get_annotation_setting")
@api.doc(description="Get annotation settings for an app")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Annotation settings retrieved successfully")
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -56,7 +78,23 @@ class AppAnnotationSettingDetailApi(Resource):
return result, 200 return result, 200
@console_ns.route("/apps/<uuid:app_id>/annotation-settings/<uuid:annotation_setting_id>")
class AppAnnotationSettingUpdateApi(Resource): class AppAnnotationSettingUpdateApi(Resource):
@api.doc("update_annotation_setting")
@api.doc(description="Update annotation settings for an app")
@api.doc(params={"app_id": "Application ID", "annotation_setting_id": "Annotation setting ID"})
@api.expect(
api.model(
"AnnotationSettingUpdateRequest",
{
"score_threshold": fields.Float(required=True, description="Score threshold"),
"embedding_provider_name": fields.String(required=True, description="Embedding provider"),
"embedding_model_name": fields.String(required=True, description="Embedding model"),
},
)
)
@api.response(200, "Settings updated successfully")
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -75,7 +113,13 @@ class AppAnnotationSettingUpdateApi(Resource):
return result, 200 return result, 200
@console_ns.route("/apps/<uuid:app_id>/annotation-reply/<string:action>/status/<uuid:job_id>")
class AnnotationReplyActionStatusApi(Resource): class AnnotationReplyActionStatusApi(Resource):
@api.doc("get_annotation_reply_action_status")
@api.doc(description="Get status of annotation reply action job")
@api.doc(params={"app_id": "Application ID", "job_id": "Job ID", "action": "Action type"})
@api.response(200, "Job status retrieved successfully")
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -99,7 +143,19 @@ class AnnotationReplyActionStatusApi(Resource):
return {"job_id": job_id, "job_status": job_status, "error_msg": error_msg}, 200 return {"job_id": job_id, "job_status": job_status, "error_msg": error_msg}, 200
@console_ns.route("/apps/<uuid:app_id>/annotations")
class AnnotationApi(Resource): class AnnotationApi(Resource):
@api.doc("list_annotations")
@api.doc(description="Get annotations for an app with pagination")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("page", type=int, location="args", default=1, help="Page number")
.add_argument("limit", type=int, location="args", default=20, help="Page size")
.add_argument("keyword", type=str, location="args", default="", help="Search keyword")
)
@api.response(200, "Annotations retrieved successfully")
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -122,6 +178,21 @@ class AnnotationApi(Resource):
} }
return response, 200 return response, 200
@api.doc("create_annotation")
@api.doc(description="Create a new annotation for an app")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"CreateAnnotationRequest",
{
"question": fields.String(required=True, description="Question text"),
"answer": fields.String(required=True, description="Answer text"),
"annotation_reply": fields.Raw(description="Annotation reply data"),
},
)
)
@api.response(201, "Annotation created successfully", annotation_fields)
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -168,7 +239,13 @@ class AnnotationApi(Resource):
return {"result": "success"}, 204 return {"result": "success"}, 204
@console_ns.route("/apps/<uuid:app_id>/annotations/export")
class AnnotationExportApi(Resource): class AnnotationExportApi(Resource):
@api.doc("export_annotations")
@api.doc(description="Export all annotations for an app")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Annotations exported successfully", fields.List(fields.Nested(annotation_fields)))
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -182,7 +259,14 @@ class AnnotationExportApi(Resource):
return response, 200 return response, 200
@console_ns.route("/apps/<uuid:app_id>/annotations/<uuid:annotation_id>")
class AnnotationUpdateDeleteApi(Resource): class AnnotationUpdateDeleteApi(Resource):
@api.doc("update_delete_annotation")
@api.doc(description="Update or delete an annotation")
@api.doc(params={"app_id": "Application ID", "annotation_id": "Annotation ID"})
@api.response(200, "Annotation updated successfully", annotation_fields)
@api.response(204, "Annotation deleted successfully")
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -214,7 +298,14 @@ class AnnotationUpdateDeleteApi(Resource):
return {"result": "success"}, 204 return {"result": "success"}, 204
@console_ns.route("/apps/<uuid:app_id>/annotations/batch-import")
class AnnotationBatchImportApi(Resource): class AnnotationBatchImportApi(Resource):
@api.doc("batch_import_annotations")
@api.doc(description="Batch import annotations from CSV file")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Batch import started successfully")
@api.response(403, "Insufficient permissions")
@api.response(400, "No file uploaded or too many files")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -239,7 +330,13 @@ class AnnotationBatchImportApi(Resource):
return AppAnnotationService.batch_import_app_annotations(app_id, file) return AppAnnotationService.batch_import_app_annotations(app_id, file)
@console_ns.route("/apps/<uuid:app_id>/annotations/batch-import-status/<uuid:job_id>")
class AnnotationBatchImportStatusApi(Resource): class AnnotationBatchImportStatusApi(Resource):
@api.doc("get_batch_import_status")
@api.doc(description="Get status of batch import job")
@api.doc(params={"app_id": "Application ID", "job_id": "Job ID"})
@api.response(200, "Job status retrieved successfully")
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -262,7 +359,20 @@ class AnnotationBatchImportStatusApi(Resource):
return {"job_id": job_id, "job_status": job_status, "error_msg": error_msg}, 200 return {"job_id": job_id, "job_status": job_status, "error_msg": error_msg}, 200
@console_ns.route("/apps/<uuid:app_id>/annotations/<uuid:annotation_id>/hit-histories")
class AnnotationHitHistoryListApi(Resource): class AnnotationHitHistoryListApi(Resource):
@api.doc("list_annotation_hit_histories")
@api.doc(description="Get hit histories for an annotation")
@api.doc(params={"app_id": "Application ID", "annotation_id": "Annotation ID"})
@api.expect(
api.parser()
.add_argument("page", type=int, location="args", default=1, help="Page number")
.add_argument("limit", type=int, location="args", default=20, help="Page size")
)
@api.response(
200, "Hit histories retrieved successfully", fields.List(fields.Nested(annotation_hit_history_fields))
)
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -285,17 +395,3 @@ class AnnotationHitHistoryListApi(Resource):
"page": page,
}
return response
api.add_resource(AnnotationReplyActionApi, "/apps/<uuid:app_id>/annotation-reply/<string:action>")
api.add_resource(
AnnotationReplyActionStatusApi, "/apps/<uuid:app_id>/annotation-reply/<string:action>/status/<uuid:job_id>"
)
api.add_resource(AnnotationApi, "/apps/<uuid:app_id>/annotations")
api.add_resource(AnnotationExportApi, "/apps/<uuid:app_id>/annotations/export")
api.add_resource(AnnotationUpdateDeleteApi, "/apps/<uuid:app_id>/annotations/<uuid:annotation_id>")
api.add_resource(AnnotationBatchImportApi, "/apps/<uuid:app_id>/annotations/batch-import")
api.add_resource(AnnotationBatchImportStatusApi, "/apps/<uuid:app_id>/annotations/batch-import-status/<uuid:job_id>")
api.add_resource(AnnotationHitHistoryListApi, "/apps/<uuid:app_id>/annotations/<uuid:annotation_id>/hit-histories")
api.add_resource(AppAnnotationSettingDetailApi, "/apps/<uuid:app_id>/annotation-setting")
api.add_resource(AppAnnotationSettingUpdateApi, "/apps/<uuid:app_id>/annotation-settings/<uuid:annotation_setting_id>")

View File

@@ -2,12 +2,12 @@ import uuid
from typing import cast
from flask_login import current_user
from flask_restx import Resource, inputs, marshal, marshal_with, reqparse
from flask_restx import Resource, fields, inputs, marshal, marshal_with, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session
from werkzeug.exceptions import BadRequest, Forbidden, abort
from controllers.console import api
from controllers.console import api, console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import (
account_initialization_required,
cloud_edition_billing_resource_check,
@@ -34,7 +34,27 @@ def _validate_description_length(description):
return description
@console_ns.route("/apps")
class AppListApi(Resource): class AppListApi(Resource):
@api.doc("list_apps")
@api.doc(description="Get list of applications with pagination and filtering")
@api.expect(
api.parser()
.add_argument("page", type=int, location="args", help="Page number (1-99999)", default=1)
.add_argument("limit", type=int, location="args", help="Page size (1-100)", default=20)
.add_argument(
"mode",
type=str,
location="args",
choices=["completion", "chat", "advanced-chat", "workflow", "agent-chat", "channel", "all"],
default="all",
help="App mode filter",
)
.add_argument("name", type=str, location="args", help="Filter by app name")
.add_argument("tag_ids", type=str, location="args", help="Comma-separated tag IDs")
.add_argument("is_created_by_me", type=bool, location="args", help="Filter by creator")
)
@api.response(200, "Success", app_pagination_fields)
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -91,6 +111,24 @@ class AppListApi(Resource):
return marshal(app_pagination, app_pagination_fields), 200 return marshal(app_pagination, app_pagination_fields), 200
@api.doc("create_app")
@api.doc(description="Create a new application")
@api.expect(
api.model(
"CreateAppRequest",
{
"name": fields.String(required=True, description="App name"),
"description": fields.String(description="App description (max 400 chars)"),
"mode": fields.String(required=True, enum=ALLOW_CREATE_APP_MODES, description="App mode"),
"icon_type": fields.String(description="Icon type"),
"icon": fields.String(description="Icon"),
"icon_background": fields.String(description="Icon background color"),
},
)
)
@api.response(201, "App created successfully", app_detail_fields)
@api.response(403, "Insufficient permissions")
@api.response(400, "Invalid request parameters")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -115,12 +153,21 @@
raise BadRequest("mode is required")
app_service = AppService()
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
if current_user.current_tenant_id is None:
raise ValueError("current_user.current_tenant_id cannot be None")
app = app_service.create_app(current_user.current_tenant_id, args, current_user)
return app, 201
@console_ns.route("/apps/<uuid:app_id>")
class AppApi(Resource): class AppApi(Resource):
@api.doc("get_app_detail")
@api.doc(description="Get application details")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Success", app_detail_fields_with_site)
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -139,6 +186,26 @@ class AppApi(Resource):
return app_model return app_model
@api.doc("update_app")
@api.doc(description="Update application details")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"UpdateAppRequest",
{
"name": fields.String(required=True, description="App name"),
"description": fields.String(description="App description (max 400 chars)"),
"icon_type": fields.String(description="Icon type"),
"icon": fields.String(description="Icon"),
"icon_background": fields.String(description="Icon background color"),
"use_icon_as_answer_icon": fields.Boolean(description="Use icon as answer icon"),
"max_active_requests": fields.Integer(description="Maximum active requests"),
},
)
)
@api.response(200, "App updated successfully", app_detail_fields_with_site)
@api.response(403, "Insufficient permissions")
@api.response(400, "Invalid request parameters")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -161,14 +228,31 @@
args = parser.parse_args()
app_service = AppService()
app_model = app_service.update_app(app_model, args)
# Construct ArgsDict from parsed arguments
from services.app_service import AppService as AppServiceType
args_dict: AppServiceType.ArgsDict = {
"name": args["name"],
"description": args.get("description", ""),
"icon_type": args.get("icon_type", ""),
"icon": args.get("icon", ""),
"icon_background": args.get("icon_background", ""),
"use_icon_as_answer_icon": args.get("use_icon_as_answer_icon", False),
"max_active_requests": args.get("max_active_requests", 0),
}
app_model = app_service.update_app(app_model, args_dict)
return app_model
@api.doc("delete_app")
@api.doc(description="Delete application")
@api.doc(params={"app_id": "Application ID"})
@api.response(204, "App deleted successfully")
@api.response(403, "Insufficient permissions")
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def delete(self, app_model):
"""Delete app"""
# The role of the current user in the ta table must be admin, owner, or editor
@@ -181,7 +265,25 @@ class AppApi(Resource):
return {"result": "success"}, 204 return {"result": "success"}, 204
@console_ns.route("/apps/<uuid:app_id>/copy")
class AppCopyApi(Resource): class AppCopyApi(Resource):
@api.doc("copy_app")
@api.doc(description="Create a copy of an existing application")
@api.doc(params={"app_id": "Application ID to copy"})
@api.expect(
api.model(
"CopyAppRequest",
{
"name": fields.String(description="Name for the copied app"),
"description": fields.String(description="Description for the copied app"),
"icon_type": fields.String(description="Icon type"),
"icon": fields.String(description="Icon"),
"icon_background": fields.String(description="Icon background color"),
},
)
)
@api.response(201, "App copied successfully", app_detail_fields_with_site)
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -223,11 +325,26 @@ class AppCopyApi(Resource):
return app, 201 return app, 201
@console_ns.route("/apps/<uuid:app_id>/export")
class AppExportApi(Resource): class AppExportApi(Resource):
@api.doc("export_app")
@api.doc(description="Export application configuration as DSL")
@api.doc(params={"app_id": "Application ID to export"})
@api.expect(
api.parser()
.add_argument("include_secret", type=bool, location="args", default=False, help="Include secrets in export")
.add_argument("workflow_id", type=str, location="args", help="Specific workflow ID to export")
)
@api.response(
200,
"App exported successfully",
api.model("AppExportResponse", {"data": fields.String(description="DSL export data")}),
)
@api.response(403, "Insufficient permissions")
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def get(self, app_model):
"""Export app"""
# The role of the current user in the ta table must be admin, owner, or editor
@@ -237,12 +354,23 @@
# Add include_secret params
parser = reqparse.RequestParser()
parser.add_argument("include_secret", type=inputs.boolean, default=False, location="args")
parser.add_argument("workflow_id", type=str, location="args")
args = parser.parse_args()
return {"data": AppDslService.export_dsl(app_model=app_model, include_secret=args["include_secret"])}
return {
"data": AppDslService.export_dsl(
app_model=app_model, include_secret=args["include_secret"], workflow_id=args.get("workflow_id")
)
}
@console_ns.route("/apps/<uuid:app_id>/name")
class AppNameApi(Resource): class AppNameApi(Resource):
@api.doc("check_app_name")
@api.doc(description="Check if app name is available")
@api.doc(params={"app_id": "Application ID"})
@api.expect(api.parser().add_argument("name", type=str, required=True, location="args", help="Name to check"))
@api.response(200, "Name availability checked")
@setup_required
@login_required
@account_initialization_required
@@ -258,12 +386,28 @@ class AppNameApi(Resource):
args = parser.parse_args()
app_service = AppService()
app_model = app_service.update_app_name(app_model, args["name"])
return app_model
@console_ns.route("/apps/<uuid:app_id>/icon")
class AppIconApi(Resource):
@api.doc("update_app_icon")
@api.doc(description="Update application icon")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"AppIconRequest",
{
"icon": fields.String(required=True, description="Icon data"),
"icon_type": fields.String(description="Icon type"),
"icon_background": fields.String(description="Icon background color"),
},
)
)
@api.response(200, "Icon updated successfully")
@api.response(403, "Insufficient permissions")
@setup_required
@login_required
@account_initialization_required
@@ -280,12 +424,23 @@ class AppIconApi(Resource):
args = parser.parse_args()
app_service = AppService()
app_model = app_service.update_app_icon(app_model, args.get("icon") or "", args.get("icon_background") or "")
return app_model
@console_ns.route("/apps/<uuid:app_id>/site-enable")
class AppSiteStatus(Resource):
@api.doc("update_app_site_status")
@api.doc(description="Enable or disable app site")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"AppSiteStatusRequest", {"enable_site": fields.Boolean(required=True, description="Enable or disable site")}
)
)
@api.response(200, "Site status updated successfully", app_detail_fields)
@api.response(403, "Insufficient permissions")
@setup_required
@login_required
@account_initialization_required
@@ -301,12 +456,23 @@ class AppSiteStatus(Resource):
args = parser.parse_args()
app_service = AppService()
app_model = app_service.update_app_site_status(app_model, args["enable_site"])
return app_model
@console_ns.route("/apps/<uuid:app_id>/api-enable")
class AppApiStatus(Resource):
@api.doc("update_app_api_status")
@api.doc(description="Enable or disable app API")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"AppApiStatusRequest", {"enable_api": fields.Boolean(required=True, description="Enable or disable API")}
)
)
@api.response(200, "API status updated successfully", app_detail_fields)
@api.response(403, "Insufficient permissions")
@setup_required
@login_required
@account_initialization_required
@@ -322,12 +488,17 @@ class AppApiStatus(Resource):
args = parser.parse_args()
app_service = AppService()
app_model = app_service.update_app_api_status(app_model, args["enable_api"])
return app_model
@console_ns.route("/apps/<uuid:app_id>/trace")
class AppTraceApi(Resource):
@api.doc("get_app_trace")
@api.doc(description="Get app tracing configuration")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Trace configuration retrieved successfully")
@setup_required
@login_required
@account_initialization_required
@@ -337,6 +508,20 @@ class AppTraceApi(Resource):
return app_trace_config
@api.doc("update_app_trace")
@api.doc(description="Update app tracing configuration")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"AppTraceRequest",
{
"enabled": fields.Boolean(required=True, description="Enable or disable tracing"),
"tracing_provider": fields.String(required=True, description="Tracing provider"),
},
)
)
@api.response(200, "Trace configuration updated successfully")
@api.response(403, "Insufficient permissions")
@setup_required
@login_required
@account_initialization_required
@@ -356,14 +541,3 @@ class AppTraceApi(Resource):
)
return {"result": "success"}
api.add_resource(AppListApi, "/apps")
api.add_resource(AppApi, "/apps/<uuid:app_id>")
api.add_resource(AppCopyApi, "/apps/<uuid:app_id>/copy")
api.add_resource(AppExportApi, "/apps/<uuid:app_id>/export")
api.add_resource(AppNameApi, "/apps/<uuid:app_id>/name")
api.add_resource(AppIconApi, "/apps/<uuid:app_id>/icon")
api.add_resource(AppSiteStatus, "/apps/<uuid:app_id>/site-enable")
api.add_resource(AppApiStatus, "/apps/<uuid:app_id>/api-enable")
api.add_resource(AppTraceApi, "/apps/<uuid:app_id>/trace")
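Throughout this file the trailing `api.add_resource(...)` registrations are replaced by `@console_ns.route(...)` decorators plus `@api.doc` / `@api.expect` / `@api.response` metadata. A minimal, self-contained sketch of that registration style (the namespace, model, and resource names below are illustrative assumptions, not Dify's actual modules):

```python
# Illustrative only: a tiny flask-restx app using the same decorator-based
# registration pattern shown above.
from flask import Flask
from flask_restx import Api, Namespace, Resource, fields

app = Flask(__name__)
api = Api(app)
console_ns = Namespace("console", description="Console APIs", path="/console/api")
api.add_namespace(console_ns)

app_fields = console_ns.model("AppStub", {"id": fields.String, "name": fields.String})


@console_ns.route("/apps/<uuid:app_id>")
class AppStubApi(Resource):
    @console_ns.doc("get_app_stub", description="Get app detail", params={"app_id": "Application ID"})
    @console_ns.response(200, "Success", app_fields)
    @console_ns.marshal_with(app_fields)
    def get(self, app_id):
        # The real controllers resolve the record via @get_app_model; this stub just echoes the ID.
        return {"id": str(app_id), "name": "demo"}


if __name__ == "__main__":
    app.run(debug=True)
```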


@@ -1,11 +1,11 @@
import logging

from flask import request
from flask_restx import Resource, fields, reqparse
from werkzeug.exceptions import InternalServerError

import services
from controllers.console import api, console_ns
from controllers.console.app.error import (
    AppUnavailableError,
    AudioTooLargeError,
@@ -34,7 +34,18 @@ from services.errors.audio import (
logger = logging.getLogger(__name__)

@console_ns.route("/apps/<uuid:app_id>/audio-to-text")
class ChatMessageAudioApi(Resource):
@api.doc("chat_message_audio_transcript")
@api.doc(description="Transcript audio to text for chat messages")
@api.doc(params={"app_id": "App ID"})
@api.response(
200,
"Audio transcription successful",
api.model("AudioTranscriptResponse", {"text": fields.String(description="Transcribed text from audio")}),
)
@api.response(400, "Bad request - No audio uploaded or unsupported type")
@api.response(413, "Audio file too large")
@setup_required
@login_required
@account_initialization_required
@@ -76,11 +87,28 @@ class ChatMessageAudioApi(Resource):
raise InternalServerError()
@console_ns.route("/apps/<uuid:app_id>/text-to-audio")
class ChatMessageTextApi(Resource):
@api.doc("chat_message_text_to_speech")
@api.doc(description="Convert text to speech for chat messages")
@api.doc(params={"app_id": "App ID"})
@api.expect(
api.model(
"TextToSpeechRequest",
{
"message_id": fields.String(description="Message ID"),
"text": fields.String(required=True, description="Text to convert to speech"),
"voice": fields.String(description="Voice to use for TTS"),
"streaming": fields.Boolean(description="Whether to stream the audio"),
},
)
)
@api.response(200, "Text to speech conversion successful")
@api.response(400, "Bad request - Invalid parameters")
@get_app_model
@setup_required
@login_required
@account_initialization_required
def post(self, app_model: App):
    try:
        parser = reqparse.RequestParser()
@@ -124,11 +152,18 @@ class ChatMessageTextApi(Resource):
raise InternalServerError()

@console_ns.route("/apps/<uuid:app_id>/text-to-audio/voices")
class TextModesApi(Resource):
@api.doc("get_text_to_speech_voices")
@api.doc(description="Get available TTS voices for a specific language")
@api.doc(params={"app_id": "App ID"})
@api.expect(api.parser().add_argument("language", type=str, required=True, location="args", help="Language code"))
@api.response(200, "TTS voices retrieved successfully", fields.List(fields.Raw(description="Available voices")))
@api.response(400, "Invalid language parameter")
@get_app_model
@setup_required
@login_required
@account_initialization_required
def get(self, app_model):
    try:
        parser = reqparse.RequestParser()
@@ -164,8 +199,3 @@ class TextModesApi(Resource):
except Exception as e:
    logger.exception("Failed to handle get request to TextModesApi")
    raise InternalServerError()
api.add_resource(ChatMessageAudioApi, "/apps/<uuid:app_id>/audio-to-text")
api.add_resource(ChatMessageTextApi, "/apps/<uuid:app_id>/text-to-audio")
api.add_resource(TextModesApi, "/apps/<uuid:app_id>/text-to-audio/voices")
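The voices endpoint documents its `language` query parameter with a flask-restx parser attached via `@api.expect`. A minimal sketch of that parser-based documentation pattern (the namespace, route, and returned voice list are illustrative assumptions):

```python
# Illustrative parser-based documentation for a query parameter, mirroring the
# text-to-audio/voices endpoint above.
from flask import Flask
from flask_restx import Api, Namespace, Resource

app = Flask(__name__)
api = Api(app)
ns = Namespace("audio", path="/audio")
api.add_namespace(ns)

voices_parser = ns.parser()
voices_parser.add_argument("language", type=str, required=True, location="args", help="Language code")


@ns.route("/text-to-audio/voices")
class VoicesApi(Resource):
    @ns.expect(voices_parser)
    @ns.response(200, "TTS voices retrieved successfully")
    def get(self):
        args = voices_parser.parse_args()
        # A real implementation would ask the configured TTS model for its voices.
        return [{"name": "default", "language": args["language"]}]
```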


@@ -1,12 +1,11 @@
import logging

from flask import request
from flask_restx import Resource, fields, reqparse
from werkzeug.exceptions import Forbidden, InternalServerError, NotFound

import services
from controllers.console import api, console_ns
from controllers.console.app.error import (
    AppUnavailableError,
    CompletionRequestError,
@@ -29,7 +28,8 @@ from core.helper.trace_id_helper import get_external_trace_id
from core.model_runtime.errors.invoke import InvokeError
from libs import helper
from libs.helper import uuid_value
from libs.login import current_user, login_required
from models import Account
from models.model import AppMode
from services.app_generate_service import AppGenerateService
from services.errors.llm import InvokeRateLimitError
@@ -38,7 +38,27 @@ logger = logging.getLogger(__name__)
# define completion message api for user
@console_ns.route("/apps/<uuid:app_id>/completion-messages")
class CompletionMessageApi(Resource):
@api.doc("create_completion_message")
@api.doc(description="Generate completion message for debugging")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"CompletionMessageRequest",
{
"inputs": fields.Raw(required=True, description="Input variables"),
"query": fields.String(description="Query text", default=""),
"files": fields.List(fields.Raw(), description="Uploaded files"),
"model_config": fields.Raw(required=True, description="Model configuration"),
"response_mode": fields.String(enum=["blocking", "streaming"], description="Response mode"),
"retriever_from": fields.String(default="dev", description="Retriever source"),
},
)
)
@api.response(200, "Completion generated successfully")
@api.response(400, "Invalid request parameters")
@api.response(404, "App not found")
@setup_required
@login_required
@account_initialization_required
@@ -56,11 +76,11 @@ class CompletionMessageApi(Resource):
streaming = args["response_mode"] != "blocking"
args["auto_generate_name"] = False

try:
    if not isinstance(current_user, Account):
        raise ValueError("current_user must be an Account or EndUser instance")
    response = AppGenerateService.generate(
        app_model=app_model, user=current_user, args=args, invoke_from=InvokeFrom.DEBUGGER, streaming=streaming
    )

    return helper.compact_generate_response(response)
@@ -86,25 +106,58 @@ class CompletionMessageApi(Resource):
raise InternalServerError()

@console_ns.route("/apps/<uuid:app_id>/completion-messages/<string:task_id>/stop")
class CompletionMessageStopApi(Resource):
@api.doc("stop_completion_message")
@api.doc(description="Stop a running completion message generation")
@api.doc(params={"app_id": "Application ID", "task_id": "Task ID to stop"})
@api.response(200, "Task stopped successfully")
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.COMPLETION)
def post(self, app_model, task_id):
    if not isinstance(current_user, Account):
        raise ValueError("current_user must be an Account instance")
    AppQueueManager.set_stop_flag(task_id, InvokeFrom.DEBUGGER, current_user.id)

    return {"result": "success"}, 200
@console_ns.route("/apps/<uuid:app_id>/chat-messages")
class ChatMessageApi(Resource):
@api.doc("create_chat_message")
@api.doc(description="Generate chat message for debugging")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"ChatMessageRequest",
{
"inputs": fields.Raw(required=True, description="Input variables"),
"query": fields.String(required=True, description="User query"),
"files": fields.List(fields.Raw(), description="Uploaded files"),
"model_config": fields.Raw(required=True, description="Model configuration"),
"conversation_id": fields.String(description="Conversation ID"),
"parent_message_id": fields.String(description="Parent message ID"),
"response_mode": fields.String(enum=["blocking", "streaming"], description="Response mode"),
"retriever_from": fields.String(default="dev", description="Retriever source"),
},
)
)
@api.response(200, "Chat message generated successfully")
@api.response(400, "Invalid request parameters")
@api.response(404, "App or conversation not found")
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT])
def post(self, app_model):
    if not isinstance(current_user, Account):
        raise Forbidden()
    if not current_user.has_edit_permission:
        raise Forbidden()
    parser = reqparse.RequestParser()
    parser.add_argument("inputs", type=dict, required=True, location="json")
    parser.add_argument("query", type=str, required=True, location="json")
@@ -123,11 +176,11 @@ class ChatMessageApi(Resource):
if external_trace_id:
    args["external_trace_id"] = external_trace_id

try:
    if not isinstance(current_user, Account):
        raise ValueError("current_user must be an Account or EndUser instance")
    response = AppGenerateService.generate(
        app_model=app_model, user=current_user, args=args, invoke_from=InvokeFrom.DEBUGGER, streaming=streaming
    )

    return helper.compact_generate_response(response)
@@ -155,20 +208,19 @@ class ChatMessageApi(Resource):
raise InternalServerError()

@console_ns.route("/apps/<uuid:app_id>/chat-messages/<string:task_id>/stop")
class ChatMessageStopApi(Resource):
@api.doc("stop_chat_message")
@api.doc(description="Stop a running chat message generation")
@api.doc(params={"app_id": "Application ID", "task_id": "Task ID to stop"})
@api.response(200, "Task stopped successfully")
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT])
def post(self, app_model, task_id):
    if not isinstance(current_user, Account):
        raise ValueError("current_user must be an Account instance")
    AppQueueManager.set_stop_flag(task_id, InvokeFrom.DEBUGGER, current_user.id)

    return {"result": "success"}, 200
api.add_resource(CompletionMessageApi, "/apps/<uuid:app_id>/completion-messages")
api.add_resource(CompletionMessageStopApi, "/apps/<uuid:app_id>/completion-messages/<string:task_id>/stop")
api.add_resource(ChatMessageApi, "/apps/<uuid:app_id>/chat-messages")
api.add_resource(ChatMessageStopApi, "/apps/<uuid:app_id>/chat-messages/<string:task_id>/stop")
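The debugger endpoints above switch from capturing `flask_login.current_user` into a local `account` to importing `current_user` from `libs.login` and narrowing it with an `isinstance(current_user, Account)` check before use. A toy sketch of that guard, with minimal stand-in models (not the real `models` package):

```python
# Toy sketch of the isinstance guard; Account and EndUser are minimal stand-ins.
from dataclasses import dataclass


@dataclass
class Account:
    id: str


@dataclass
class EndUser:
    id: str


def stop_debug_task(current_user, task_id: str) -> dict:
    # Console debugger endpoints only accept workspace accounts, so anything else
    # (e.g. an EndUser resolved by the login proxy) is rejected up front.
    if not isinstance(current_user, Account):
        raise ValueError("current_user must be an Account instance")
    print(f"stop flag set for task {task_id} by account {current_user.id}")
    return {"result": "success"}


print(stop_debug_task(Account(id="acc-1"), "task-123"))
```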


@@ -8,7 +8,7 @@ from sqlalchemy import func, or_
from sqlalchemy.orm import joinedload
from werkzeug.exceptions import Forbidden, NotFound

from controllers.console import api, console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from core.app.entities.app_invoke_entities import InvokeFrom
@@ -22,13 +22,35 @@ from fields.conversation_fields import (
from libs.datetime_utils import naive_utc_now
from libs.helper import DatetimeString
from libs.login import login_required
from models import Account, Conversation, EndUser, Message, MessageAnnotation
from models.model import AppMode
from services.conversation_service import ConversationService
from services.errors.conversation import ConversationNotExistsError
@console_ns.route("/apps/<uuid:app_id>/completion-conversations")
class CompletionConversationApi(Resource): class CompletionConversationApi(Resource):
@api.doc("list_completion_conversations")
@api.doc(description="Get completion conversations with pagination and filtering")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("keyword", type=str, location="args", help="Search keyword")
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
.add_argument(
"annotation_status",
type=str,
location="args",
choices=["annotated", "not_annotated", "all"],
default="all",
help="Annotation status filter",
)
.add_argument("page", type=int, location="args", default=1, help="Page number")
.add_argument("limit", type=int, location="args", default=20, help="Page size (1-100)")
)
@api.response(200, "Success", conversation_pagination_fields)
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -101,7 +123,14 @@ class CompletionConversationApi(Resource):
return conversations return conversations
@console_ns.route("/apps/<uuid:app_id>/completion-conversations/<uuid:conversation_id>")
class CompletionConversationDetailApi(Resource): class CompletionConversationDetailApi(Resource):
@api.doc("get_completion_conversation")
@api.doc(description="Get completion conversation details with messages")
@api.doc(params={"app_id": "Application ID", "conversation_id": "Conversation ID"})
@api.response(200, "Success", conversation_message_detail_fields)
@api.response(403, "Insufficient permissions")
@api.response(404, "Conversation not found")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -114,16 +143,24 @@ class CompletionConversationDetailApi(Resource):
return _get_conversation(app_model, conversation_id) return _get_conversation(app_model, conversation_id)
@api.doc("delete_completion_conversation")
@api.doc(description="Delete a completion conversation")
@api.doc(params={"app_id": "Application ID", "conversation_id": "Conversation ID"})
@api.response(204, "Conversation deleted successfully")
@api.response(403, "Insufficient permissions")
@api.response(404, "Conversation not found")
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.COMPLETION)
def delete(self, app_model, conversation_id):
    if not current_user.is_editor:
        raise Forbidden()

    conversation_id = str(conversation_id)

    try:
        if not isinstance(current_user, Account):
            raise ValueError("current_user must be an Account instance")
        ConversationService.delete(app_model, conversation_id, current_user)
    except ConversationNotExistsError:
        raise NotFound("Conversation Not Exists.")
@@ -131,7 +168,38 @@ class CompletionConversationDetailApi(Resource):
return {"result": "success"}, 204 return {"result": "success"}, 204
@console_ns.route("/apps/<uuid:app_id>/chat-conversations")
class ChatConversationApi(Resource): class ChatConversationApi(Resource):
@api.doc("list_chat_conversations")
@api.doc(description="Get chat conversations with pagination, filtering and summary")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("keyword", type=str, location="args", help="Search keyword")
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
.add_argument(
"annotation_status",
type=str,
location="args",
choices=["annotated", "not_annotated", "all"],
default="all",
help="Annotation status filter",
)
.add_argument("message_count_gte", type=int, location="args", help="Minimum message count")
.add_argument("page", type=int, location="args", default=1, help="Page number")
.add_argument("limit", type=int, location="args", default=20, help="Page size (1-100)")
.add_argument(
"sort_by",
type=str,
location="args",
choices=["created_at", "-created_at", "updated_at", "-updated_at"],
default="-updated_at",
help="Sort field and direction",
)
)
@api.response(200, "Success", conversation_with_summary_pagination_fields)
@api.response(403, "Insufficient permissions")
@setup_required
@login_required
@account_initialization_required
@@ -239,7 +307,7 @@ class ChatConversationApi(Resource):
        .having(func.count(Message.id) >= args["message_count_gte"])
    )

if app_model.mode == AppMode.ADVANCED_CHAT:
    query = query.where(Conversation.invoke_from != InvokeFrom.DEBUGGER.value)

match args["sort_by"]:
@@ -259,7 +327,14 @@ class ChatConversationApi(Resource):
return conversations
@console_ns.route("/apps/<uuid:app_id>/chat-conversations/<uuid:conversation_id>")
class ChatConversationDetailApi(Resource): class ChatConversationDetailApi(Resource):
@api.doc("get_chat_conversation")
@api.doc(description="Get chat conversation details")
@api.doc(params={"app_id": "Application ID", "conversation_id": "Conversation ID"})
@api.response(200, "Success", conversation_detail_fields)
@api.response(403, "Insufficient permissions")
@api.response(404, "Conversation not found")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -272,6 +347,12 @@ class ChatConversationDetailApi(Resource):
return _get_conversation(app_model, conversation_id) return _get_conversation(app_model, conversation_id)
@api.doc("delete_chat_conversation")
@api.doc(description="Delete a chat conversation")
@api.doc(params={"app_id": "Application ID", "conversation_id": "Conversation ID"})
@api.response(204, "Conversation deleted successfully")
@api.response(403, "Insufficient permissions")
@api.response(404, "Conversation not found")
@setup_required
@login_required
@get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT])
@@ -282,6 +363,8 @@ class ChatConversationDetailApi(Resource):
conversation_id = str(conversation_id)

try:
    if not isinstance(current_user, Account):
        raise ValueError("current_user must be an Account instance")
    ConversationService.delete(app_model, conversation_id, current_user)
except ConversationNotExistsError:
    raise NotFound("Conversation Not Exists.")
@@ -289,12 +372,6 @@ class ChatConversationDetailApi(Resource):
return {"result": "success"}, 204 return {"result": "success"}, 204
api.add_resource(CompletionConversationApi, "/apps/<uuid:app_id>/completion-conversations")
api.add_resource(CompletionConversationDetailApi, "/apps/<uuid:app_id>/completion-conversations/<uuid:conversation_id>")
api.add_resource(ChatConversationApi, "/apps/<uuid:app_id>/chat-conversations")
api.add_resource(ChatConversationDetailApi, "/apps/<uuid:app_id>/chat-conversations/<uuid:conversation_id>")
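The chat-conversation listing documents its `sort_by` choices (`created_at`, `-created_at`, `updated_at`, `-updated_at`); the controller itself dispatches on them with a `match` statement. One way to map those choices onto an `order_by`, sketched with a stand-in SQLAlchemy model (an assumption, not the project's `Conversation`):

```python
# Sketch of turning the documented sort_by values into an ORDER BY.
from sqlalchemy import Column, DateTime, String, asc, desc
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class Conversation(Base):
    __tablename__ = "conversations"
    id = Column(String, primary_key=True)
    created_at = Column(DateTime)
    updated_at = Column(DateTime)


def apply_sort(query, sort_by: str = "-updated_at"):
    # A leading "-" selects descending order on the named column.
    descending = sort_by.startswith("-")
    column = getattr(Conversation, sort_by.lstrip("-"))
    return query.order_by(desc(column) if descending else asc(column))
```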
def _get_conversation(app_model, conversation_id): def _get_conversation(app_model, conversation_id):
conversation = ( conversation = (
db.session.query(Conversation) db.session.query(Conversation)


@@ -2,7 +2,7 @@ from flask_restx import Resource, marshal_with, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session

from controllers.console import api, console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from extensions.ext_database import db
@@ -12,7 +12,17 @@ from models import ConversationVariable
from models.model import AppMode from models.model import AppMode
@console_ns.route("/apps/<uuid:app_id>/conversation-variables")
class ConversationVariablesApi(Resource): class ConversationVariablesApi(Resource):
@api.doc("get_conversation_variables")
@api.doc(description="Get conversation variables for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser().add_argument(
"conversation_id", type=str, location="args", help="Conversation ID to filter variables"
)
)
@api.response(200, "Conversation variables retrieved successfully", paginated_conversation_variable_fields)
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -55,6 +65,3 @@ class ConversationVariablesApi(Resource):
for row in rows for row in rows
], ],
} }
api.add_resource(ConversationVariablesApi, "/apps/<uuid:app_id>/conversation-variables")
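The conversation-variables endpoint returns a paginated payload built from the fetched rows. A rough sketch of that page/limit/total shape (field names and dummy rows here are assumptions, not the exact `paginated_conversation_variable_fields` definition):

```python
# Rough, assumed shape of a paginated payload; not the real marshalling fields.
def paginate(rows: list[dict], page: int = 1, limit: int = 20) -> dict:
    start = (page - 1) * limit
    return {
        "page": page,
        "limit": limit,
        "total": len(rows),
        "has_more": start + limit < len(rows),
        "data": rows[start : start + limit],
    }


variables = [{"id": f"var-{i}", "name": f"v{i}"} for i in range(45)]
print(paginate(variables, page=2, limit=20)["has_more"])  # True: a third page remains
```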


@@ -1,9 +1,9 @@
from collections.abc import Sequence

from flask_login import current_user
from flask_restx import Resource, fields, reqparse

from controllers.console import api, console_ns
from controllers.console.app.error import (
    CompletionRequestError,
    ProviderModelCurrentlyNotSupportError,
@@ -16,10 +16,29 @@ from core.helper.code_executor.javascript.javascript_code_provider import Javasc
from core.helper.code_executor.python3.python3_code_provider import Python3CodeProvider
from core.llm_generator.llm_generator import LLMGenerator
from core.model_runtime.errors.invoke import InvokeError
from extensions.ext_database import db
from libs.login import login_required
from models import App
from services.workflow_service import WorkflowService
@console_ns.route("/rule-generate")
class RuleGenerateApi(Resource): class RuleGenerateApi(Resource):
@api.doc("generate_rule_config")
@api.doc(description="Generate rule configuration using LLM")
@api.expect(
api.model(
"RuleGenerateRequest",
{
"instruction": fields.String(required=True, description="Rule generation instruction"),
"model_config": fields.Raw(required=True, description="Model configuration"),
"no_variable": fields.Boolean(required=True, default=False, description="Whether to exclude variables"),
},
)
)
@api.response(200, "Rule configuration generated successfully")
@api.response(400, "Invalid request parameters")
@api.response(402, "Provider quota exceeded")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -50,7 +69,26 @@ class RuleGenerateApi(Resource):
return rules return rules
@console_ns.route("/rule-code-generate")
class RuleCodeGenerateApi(Resource): class RuleCodeGenerateApi(Resource):
@api.doc("generate_rule_code")
@api.doc(description="Generate code rules using LLM")
@api.expect(
api.model(
"RuleCodeGenerateRequest",
{
"instruction": fields.String(required=True, description="Code generation instruction"),
"model_config": fields.Raw(required=True, description="Model configuration"),
"no_variable": fields.Boolean(required=True, default=False, description="Whether to exclude variables"),
"code_language": fields.String(
default="javascript", description="Programming language for code generation"
),
},
)
)
@api.response(200, "Code rules generated successfully")
@api.response(400, "Invalid request parameters")
@api.response(402, "Provider quota exceeded")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -82,7 +120,22 @@ class RuleCodeGenerateApi(Resource):
return code_result return code_result
@console_ns.route("/rule-structured-output-generate")
class RuleStructuredOutputGenerateApi(Resource): class RuleStructuredOutputGenerateApi(Resource):
@api.doc("generate_structured_output")
@api.doc(description="Generate structured output rules using LLM")
@api.expect(
api.model(
"StructuredOutputGenerateRequest",
{
"instruction": fields.String(required=True, description="Structured output generation instruction"),
"model_config": fields.Raw(required=True, description="Model configuration"),
},
)
)
@api.response(200, "Structured output generated successfully")
@api.response(400, "Invalid request parameters")
@api.response(402, "Provider quota exceeded")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -111,7 +164,27 @@ class RuleStructuredOutputGenerateApi(Resource):
return structured_output return structured_output
@console_ns.route("/instruction-generate")
class InstructionGenerateApi(Resource): class InstructionGenerateApi(Resource):
@api.doc("generate_instruction")
@api.doc(description="Generate instruction for workflow nodes or general use")
@api.expect(
api.model(
"InstructionGenerateRequest",
{
"flow_id": fields.String(required=True, description="Workflow/Flow ID"),
"node_id": fields.String(description="Node ID for workflow context"),
"current": fields.String(description="Current instruction text"),
"language": fields.String(default="javascript", description="Programming language (javascript/python)"),
"instruction": fields.String(required=True, description="Instruction for generation"),
"model_config": fields.Raw(required=True, description="Model configuration"),
"ideal_output": fields.String(description="Expected ideal output"),
},
)
)
@api.response(200, "Instruction generated successfully")
@api.response(400, "Invalid request parameters or flow/workflow not found")
@api.response(402, "Provider quota exceeded")
@setup_required
@login_required
@account_initialization_required
@@ -135,9 +208,6 @@ class InstructionGenerateApi(Resource):
try:
    # Generate from nothing for a workflow node
    if (args["current"] == code_template or args["current"] == "") and args["node_id"] != "":
        app = db.session.query(App).where(App.id == args["flow_id"]).first()
        if not app:
            return {"error": f"app {args['flow_id']} not found"}, 400
@@ -203,11 +273,25 @@ class InstructionGenerateApi(Resource):
raise CompletionRequestError(e.description) raise CompletionRequestError(e.description)
@console_ns.route("/instruction-generate/template")
class InstructionGenerationTemplateApi(Resource): class InstructionGenerationTemplateApi(Resource):
@api.doc("get_instruction_template")
@api.doc(description="Get instruction generation template")
@api.expect(
api.model(
"InstructionTemplateRequest",
{
"instruction": fields.String(required=True, description="Template instruction"),
"ideal_output": fields.String(description="Expected ideal output"),
},
)
)
@api.response(200, "Template retrieved successfully")
@api.response(400, "Invalid request parameters")
@setup_required
@login_required
@account_initialization_required
def post(self):
    parser = reqparse.RequestParser()
    parser.add_argument("type", type=str, required=True, default=False, location="json")
    args = parser.parse_args()
@@ -222,10 +306,3 @@ class InstructionGenerationTemplateApi(Resource):
            return {"data": INSTRUCTION_GENERATE_TEMPLATE_CODE}
        case _:
            raise ValueError(f"Invalid type: {args['type']}")
api.add_resource(RuleGenerateApi, "/rule-generate")
api.add_resource(RuleCodeGenerateApi, "/rule-code-generate")
api.add_resource(RuleStructuredOutputGenerateApi, "/rule-structured-output-generate")
api.add_resource(InstructionGenerateApi, "/instruction-generate")
api.add_resource(InstructionGenerationTemplateApi, "/instruction-generate/template")
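`InstructionGenerationTemplateApi` dispatches on the posted `type` with a `match` statement and raises `ValueError` for unknown values. A standalone sketch of that dispatch (the branch names and template constants are placeholders; only the `INSTRUCTION_GENERATE_TEMPLATE_CODE` branch and the fallback are visible in the diff):

```python
# Standalone sketch of the type dispatch; constants and branch names are assumed.
INSTRUCTION_GENERATE_TEMPLATE_PROMPT = "<prompt template placeholder>"
INSTRUCTION_GENERATE_TEMPLATE_CODE = "<code template placeholder>"


def get_template(template_type: str) -> dict:
    match template_type:
        case "prompt":
            return {"data": INSTRUCTION_GENERATE_TEMPLATE_PROMPT}
        case "code":
            return {"data": INSTRUCTION_GENERATE_TEMPLATE_CODE}
        case _:
            raise ValueError(f"Invalid type: {template_type}")


print(get_template("code"))
```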


@@ -2,10 +2,10 @@ import json
from enum import StrEnum

from flask_login import current_user
from flask_restx import Resource, fields, marshal_with, reqparse
from werkzeug.exceptions import NotFound

from controllers.console import api, console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from extensions.ext_database import db
@@ -19,7 +19,12 @@ class AppMCPServerStatus(StrEnum):
INACTIVE = "inactive" INACTIVE = "inactive"
@console_ns.route("/apps/<uuid:app_id>/server")
class AppMCPServerController(Resource): class AppMCPServerController(Resource):
@api.doc("get_app_mcp_server")
@api.doc(description="Get MCP server configuration for an application")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "MCP server configuration retrieved successfully", app_server_fields)
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -29,6 +34,20 @@ class AppMCPServerController(Resource):
server = db.session.query(AppMCPServer).where(AppMCPServer.app_id == app_model.id).first() server = db.session.query(AppMCPServer).where(AppMCPServer.app_id == app_model.id).first()
return server return server
@api.doc("create_app_mcp_server")
@api.doc(description="Create MCP server configuration for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"MCPServerCreateRequest",
{
"description": fields.String(description="Server description"),
"parameters": fields.Raw(required=True, description="Server parameters configuration"),
},
)
)
@api.response(201, "MCP server configuration created successfully", app_server_fields)
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -59,6 +78,23 @@ class AppMCPServerController(Resource):
db.session.commit() db.session.commit()
return server return server
@api.doc("update_app_mcp_server")
@api.doc(description="Update MCP server configuration for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"MCPServerUpdateRequest",
{
"id": fields.String(required=True, description="Server ID"),
"description": fields.String(description="Server description"),
"parameters": fields.Raw(required=True, description="Server parameters configuration"),
"status": fields.String(description="Server status"),
},
)
)
@api.response(200, "MCP server configuration updated successfully", app_server_fields)
@api.response(403, "Insufficient permissions")
@api.response(404, "Server not found")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -94,7 +130,14 @@ class AppMCPServerController(Resource):
return server return server
@console_ns.route("/apps/<uuid:server_id>/server/refresh")
class AppMCPServerRefreshController(Resource): class AppMCPServerRefreshController(Resource):
@api.doc("refresh_app_mcp_server")
@api.doc(description="Refresh MCP server configuration and regenerate server code")
@api.doc(params={"server_id": "Server ID"})
@api.response(200, "MCP server refreshed successfully", app_server_fields)
@api.response(403, "Insufficient permissions")
@api.response(404, "Server not found")
@setup_required
@login_required
@account_initialization_required
@@ -113,7 +156,3 @@ class AppMCPServerRefreshController(Resource):
server.server_code = AppMCPServer.generate_server_code(16)
db.session.commit()
return server
api.add_resource(AppMCPServerController, "/apps/<uuid:app_id>/server")
api.add_resource(AppMCPServerRefreshController, "/apps/<uuid:server_id>/server/refresh")
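The refresh endpoint regenerates the server's access code via `AppMCPServer.generate_server_code(16)`. A plausible helper in that spirit, assuming it only needs a random alphanumeric token of the requested length (the real implementation may differ):

```python
# Assumed stand-in for a generate_server_code(16)-style helper.
import secrets
import string

_ALPHABET = string.ascii_letters + string.digits


def generate_server_code(length: int = 16) -> str:
    # Cryptographically secure random code used to address the MCP server endpoint.
    return "".join(secrets.choice(_ALPHABET) for _ in range(length))


print(generate_server_code(16))
```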


@@ -1,12 +1,11 @@
import logging

from flask_restx import Resource, fields, marshal_with, reqparse
from flask_restx.inputs import int_range
from sqlalchemy import exists, select
from werkzeug.exceptions import Forbidden, InternalServerError, NotFound

from controllers.console import api, console_ns
from controllers.console.app.error import (
    CompletionRequestError,
    ProviderModelCurrentlyNotSupportError,
@@ -27,7 +26,8 @@ from extensions.ext_database import db
from fields.conversation_fields import annotation_fields, message_detail_fields
from libs.helper import uuid_value
from libs.infinite_scroll_pagination import InfiniteScrollPagination
from libs.login import current_user, login_required
from models.account import Account
from models.model import AppMode, Conversation, Message, MessageAnnotation, MessageFeedback
from services.annotation_service import AppAnnotationService
from services.errors.conversation import ConversationNotExistsError
@@ -37,6 +37,7 @@ from services.message_service import MessageService
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
@console_ns.route("/apps/<uuid:app_id>/chat-messages")
class ChatMessageListApi(Resource): class ChatMessageListApi(Resource):
message_infinite_scroll_pagination_fields = { message_infinite_scroll_pagination_fields = {
"limit": fields.Integer, "limit": fields.Integer,
@@ -44,6 +45,17 @@ class ChatMessageListApi(Resource):
"data": fields.List(fields.Nested(message_detail_fields)), "data": fields.List(fields.Nested(message_detail_fields)),
} }
@api.doc("list_chat_messages")
@api.doc(description="Get chat messages for a conversation with pagination")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("conversation_id", type=str, required=True, location="args", help="Conversation ID")
.add_argument("first_id", type=str, location="args", help="First message ID for pagination")
.add_argument("limit", type=int, location="args", default=20, help="Number of messages to return (1-100)")
)
@api.response(200, "Success", message_infinite_scroll_pagination_fields)
@api.response(404, "Conversation not found")
@setup_required @setup_required
@login_required @login_required
@get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT]) @get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT])
@@ -117,12 +129,31 @@ class ChatMessageListApi(Resource):
return InfiniteScrollPagination(data=history_messages, limit=args["limit"], has_more=has_more) return InfiniteScrollPagination(data=history_messages, limit=args["limit"], has_more=has_more)
@console_ns.route("/apps/<uuid:app_id>/feedbacks")
class MessageFeedbackApi(Resource): class MessageFeedbackApi(Resource):
@api.doc("create_message_feedback")
@api.doc(description="Create or update message feedback (like/dislike)")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"MessageFeedbackRequest",
{
"message_id": fields.String(required=True, description="Message ID"),
"rating": fields.String(enum=["like", "dislike"], description="Feedback rating"),
},
)
)
@api.response(200, "Feedback updated successfully")
@api.response(404, "Message not found")
@api.response(403, "Insufficient permissions")
@get_app_model
@setup_required
@login_required
@account_initialization_required
def post(self, app_model):
    if current_user is None:
        raise Forbidden()
    parser = reqparse.RequestParser()
    parser.add_argument("message_id", required=True, type=uuid_value, location="json")
    parser.add_argument("rating", type=str, choices=["like", "dislike", None], location="json")
@@ -159,7 +190,24 @@ class MessageFeedbackApi(Resource):
return {"result": "success"} return {"result": "success"}
@console_ns.route("/apps/<uuid:app_id>/annotations")
class MessageAnnotationApi(Resource): class MessageAnnotationApi(Resource):
@api.doc("create_message_annotation")
@api.doc(description="Create message annotation")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"MessageAnnotationRequest",
{
"message_id": fields.String(description="Message ID"),
"question": fields.String(required=True, description="Question text"),
"answer": fields.String(required=True, description="Answer text"),
"annotation_reply": fields.Raw(description="Annotation reply"),
},
)
)
@api.response(200, "Annotation created successfully", annotation_fields)
@api.response(403, "Insufficient permissions")
@setup_required
@login_required
@account_initialization_required
@@ -167,7 +215,9 @@ class MessageAnnotationApi(Resource):
@get_app_model
@marshal_with(annotation_fields)
def post(self, app_model):
    if not isinstance(current_user, Account):
        raise Forbidden()
    if not current_user.has_edit_permission:
        raise Forbidden()

    parser = reqparse.RequestParser()
@@ -181,18 +231,37 @@ class MessageAnnotationApi(Resource):
return annotation return annotation
@console_ns.route("/apps/<uuid:app_id>/annotations/count")
class MessageAnnotationCountApi(Resource): class MessageAnnotationCountApi(Resource):
@api.doc("get_annotation_count")
@api.doc(description="Get count of message annotations for the app")
@api.doc(params={"app_id": "Application ID"})
@api.response(
200,
"Annotation count retrieved successfully",
api.model("AnnotationCountResponse", {"count": fields.Integer(description="Number of annotations")}),
)
@get_app_model
@setup_required
@login_required
@account_initialization_required
def get(self, app_model):
    count = db.session.query(MessageAnnotation).where(MessageAnnotation.app_id == app_model.id).count()

    return {"count": count}
@console_ns.route("/apps/<uuid:app_id>/chat-messages/<uuid:message_id>/suggested-questions")
class MessageSuggestedQuestionApi(Resource): class MessageSuggestedQuestionApi(Resource):
@api.doc("get_message_suggested_questions")
@api.doc(description="Get suggested questions for a message")
@api.doc(params={"app_id": "Application ID", "message_id": "Message ID"})
@api.response(
200,
"Suggested questions retrieved successfully",
api.model("SuggestedQuestionsResponse", {"data": fields.List(fields.String(description="Suggested question"))}),
)
@api.response(404, "Message or conversation not found")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -225,7 +294,13 @@ class MessageSuggestedQuestionApi(Resource):
return {"data": questions} return {"data": questions}
@console_ns.route("/apps/<uuid:app_id>/messages/<uuid:message_id>")
class MessageApi(Resource): class MessageApi(Resource):
@api.doc("get_message")
@api.doc(description="Get message details by ID")
@api.doc(params={"app_id": "Application ID", "message_id": "Message ID"})
@api.response(200, "Message retrieved successfully", message_detail_fields)
@api.response(404, "Message not found")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -240,11 +315,3 @@ class MessageApi(Resource):
raise NotFound("Message Not Exists.") raise NotFound("Message Not Exists.")
return message return message
api.add_resource(MessageSuggestedQuestionApi, "/apps/<uuid:app_id>/chat-messages/<uuid:message_id>/suggested-questions")
api.add_resource(ChatMessageListApi, "/apps/<uuid:app_id>/chat-messages", endpoint="console_chat_messages")
api.add_resource(MessageFeedbackApi, "/apps/<uuid:app_id>/feedbacks")
api.add_resource(MessageAnnotationApi, "/apps/<uuid:app_id>/annotations")
api.add_resource(MessageAnnotationCountApi, "/apps/<uuid:app_id>/annotations/count")
api.add_resource(MessageApi, "/apps/<uuid:app_id>/messages/<uuid:message_id>", endpoint="console_message")
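`ChatMessageListApi` pages messages with a `first_id` cursor and returns the `limit` / `has_more` / `data` shape declared in `message_infinite_scroll_pagination_fields`. A toy in-memory sketch of that cursor pagination (the exact ordering semantics in the real message query may differ):

```python
# Toy in-memory version of first_id cursor pagination; ordering is an assumption.
def paginate_messages(messages: list[dict], first_id: str | None = None, limit: int = 20) -> dict:
    start = 0
    if first_id is not None:
        # first_id is the last message the client already has; continue after it.
        start = [m["id"] for m in messages].index(first_id) + 1
    page = messages[start : start + limit]
    return {"limit": limit, "has_more": start + limit < len(messages), "data": page}


history = [{"id": f"msg-{i}"} for i in range(50)]
print(paginate_messages(history, first_id="msg-19", limit=20)["has_more"])  # True
```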


@@ -3,9 +3,10 @@ from typing import cast
from flask import request
from flask_login import current_user
from flask_restx import Resource, fields
from werkzeug.exceptions import Forbidden

from controllers.console import api, console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from core.agent.entities import AgentToolEntity
@@ -14,17 +15,51 @@ from core.tools.utils.configuration import ToolParameterConfigurationManager
from events.app_event import app_model_config_was_updated
from extensions.ext_database import db
from libs.login import login_required
from models.account import Account
from models.model import AppMode, AppModelConfig
from services.app_model_config_service import AppModelConfigService
@console_ns.route("/apps/<uuid:app_id>/model-config")
class ModelConfigResource(Resource): class ModelConfigResource(Resource):
@api.doc("update_app_model_config")
@api.doc(description="Update application model configuration")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"ModelConfigRequest",
{
"provider": fields.String(description="Model provider"),
"model": fields.String(description="Model name"),
"configs": fields.Raw(description="Model configuration parameters"),
"opening_statement": fields.String(description="Opening statement"),
"suggested_questions": fields.List(fields.String(), description="Suggested questions"),
"more_like_this": fields.Raw(description="More like this configuration"),
"speech_to_text": fields.Raw(description="Speech to text configuration"),
"text_to_speech": fields.Raw(description="Text to speech configuration"),
"retrieval_model": fields.Raw(description="Retrieval model configuration"),
"tools": fields.List(fields.Raw(), description="Available tools"),
"dataset_configs": fields.Raw(description="Dataset configurations"),
"agent_mode": fields.Raw(description="Agent mode configuration"),
},
)
)
@api.response(200, "Model configuration updated successfully")
@api.response(400, "Invalid configuration")
@api.response(404, "App not found")
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.AGENT_CHAT, AppMode.CHAT, AppMode.COMPLETION])
def post(self, app_model):
    """Modify app model config"""
    if not isinstance(current_user, Account):
        raise Forbidden()
    if not current_user.has_edit_permission:
        raise Forbidden()
    assert current_user.current_tenant_id is not None, "The tenant information should be loaded."

    # validate config
    model_configuration = AppModelConfigService.validate_configuration(
        tenant_id=current_user.current_tenant_id,
@@ -39,7 +74,7 @@ class ModelConfigResource(Resource):
    )
    new_app_model_config = new_app_model_config.from_model_config_dict(model_configuration)

    if app_model.mode == AppMode.AGENT_CHAT or app_model.is_agent:
        # get original app model config
        original_app_model_config = (
            db.session.query(AppModelConfig).where(AppModelConfig.id == app_model.app_model_config_id).first()
@@ -142,6 +177,3 @@ class ModelConfigResource(Resource):
app_model_config_was_updated.send(app_model, app_model_config=new_app_model_config) app_model_config_was_updated.send(app_model, app_model_config=new_app_model_config)
return {"result": "success"} return {"result": "success"}
api.add_resource(ModelConfigResource, "/apps/<uuid:app_id>/model-config")
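Both here (`app_model.mode == AppMode.AGENT_CHAT or app_model.is_agent`) and in the conversation listing above, comparisons drop the `.value` suffix. That is safe if `AppMode` is a `StrEnum`, whose members compare equal to their raw string values; a quick demonstration (the member values below are assumptions, not the project's definitions):

```python
# Why dropping ".value" works for StrEnum members.
from enum import StrEnum  # Python 3.11+


class AppMode(StrEnum):
    CHAT = "chat"
    AGENT_CHAT = "agent-chat"
    COMPLETION = "completion"


mode_from_db = "agent-chat"  # columns store the plain string
print(mode_from_db == AppMode.AGENT_CHAT)              # True, no .value needed
print(AppMode.AGENT_CHAT == AppMode.AGENT_CHAT.value)  # also True
```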


@@ -1,18 +1,31 @@
from flask_restx import Resource, fields, reqparse
from werkzeug.exceptions import BadRequest

from controllers.console import api, console_ns
from controllers.console.app.error import TracingConfigCheckError, TracingConfigIsExist, TracingConfigNotExist
from controllers.console.wraps import account_initialization_required, setup_required
from libs.login import login_required
from services.ops_service import OpsService
@console_ns.route("/apps/<uuid:app_id>/trace-config")
class TraceAppConfigApi(Resource): class TraceAppConfigApi(Resource):
""" """
Manage trace app configurations Manage trace app configurations
""" """
@api.doc("get_trace_app_config")
@api.doc(description="Get tracing configuration for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser().add_argument(
"tracing_provider", type=str, required=True, location="args", help="Tracing provider name"
)
)
@api.response(
200, "Tracing configuration retrieved successfully", fields.Raw(description="Tracing configuration data")
)
@api.response(400, "Invalid request parameters")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -29,6 +42,22 @@ class TraceAppConfigApi(Resource):
except Exception as e:
raise BadRequest(str(e))
@api.doc("create_trace_app_config")
@api.doc(description="Create a new tracing configuration for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"TraceConfigCreateRequest",
{
"tracing_provider": fields.String(required=True, description="Tracing provider name"),
"tracing_config": fields.Raw(required=True, description="Tracing configuration data"),
},
)
)
@api.response(
201, "Tracing configuration created successfully", fields.Raw(description="Created configuration data")
)
@api.response(400, "Invalid request parameters or configuration already exists")
@setup_required
@login_required
@account_initialization_required
@@ -51,6 +80,20 @@ class TraceAppConfigApi(Resource):
except Exception as e:
raise BadRequest(str(e))
@api.doc("update_trace_app_config")
@api.doc(description="Update an existing tracing configuration for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"TraceConfigUpdateRequest",
{
"tracing_provider": fields.String(required=True, description="Tracing provider name"),
"tracing_config": fields.Raw(required=True, description="Updated tracing configuration data"),
},
)
)
@api.response(200, "Tracing configuration updated successfully", fields.Raw(description="Success response"))
@api.response(400, "Invalid request parameters or configuration not found")
@setup_required
@login_required
@account_initialization_required
@@ -71,6 +114,16 @@ class TraceAppConfigApi(Resource):
except Exception as e:
raise BadRequest(str(e))
@api.doc("delete_trace_app_config")
@api.doc(description="Delete an existing tracing configuration for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser().add_argument(
"tracing_provider", type=str, required=True, location="args", help="Tracing provider name"
)
)
@api.response(204, "Tracing configuration deleted successfully")
@api.response(400, "Invalid request parameters or configuration not found")
@setup_required
@login_required
@account_initialization_required
@@ -87,6 +140,3 @@ class TraceAppConfigApi(Resource):
return {"result": "success"}, 204
except Exception as e:
raise BadRequest(str(e))
api.add_resource(TraceAppConfigApi, "/apps/<uuid:app_id>/trace-config")
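For reference, the trace-config endpoints documented above can be exercised roughly as follows; the base URL, auth header, provider name, and config keys are placeholders, and the /console/api prefix is an assumption about how the console blueprint is mounted.

# Hypothetical client calls for the trace-config endpoints (all values are placeholders).
import requests

BASE = "http://localhost:5001/console/api"             # assumed console API prefix
HEADERS = {"Authorization": "Bearer <console-token>"}   # assumed auth scheme
APP_ID = "<app-uuid>"

# GET the current tracing configuration for one provider
r = requests.get(f"{BASE}/apps/{APP_ID}/trace-config",
                 params={"tracing_provider": "<provider>"}, headers=HEADERS)
print(r.status_code, r.json())

# POST a new configuration (mirrors the TraceConfigCreateRequest model)
r = requests.post(f"{BASE}/apps/{APP_ID}/trace-config",
                  json={"tracing_provider": "<provider>", "tracing_config": {"api_key": "..."}},
                  headers=HEADERS)
print(r.status_code, r.json())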

View File

@@ -1,16 +1,16 @@
from flask_login import current_user
- from flask_restx import Resource, marshal_with, reqparse
+ from flask_restx import Resource, fields, marshal_with, reqparse
from werkzeug.exceptions import Forbidden, NotFound
from constants.languages import supported_language
- from controllers.console import api
+ from controllers.console import api, console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from extensions.ext_database import db
from fields.app_fields import app_site_fields
from libs.datetime_utils import naive_utc_now
from libs.login import login_required
- from models import Site
+ from models import Account, Site
def parse_app_site_args():
@@ -36,7 +36,39 @@ def parse_app_site_args():
return parser.parse_args()
@console_ns.route("/apps/<uuid:app_id>/site")
class AppSite(Resource):
@api.doc("update_app_site")
@api.doc(description="Update application site configuration")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"AppSiteRequest",
{
"title": fields.String(description="Site title"),
"icon_type": fields.String(description="Icon type"),
"icon": fields.String(description="Icon"),
"icon_background": fields.String(description="Icon background color"),
"description": fields.String(description="Site description"),
"default_language": fields.String(description="Default language"),
"chat_color_theme": fields.String(description="Chat color theme"),
"chat_color_theme_inverted": fields.Boolean(description="Inverted chat color theme"),
"customize_domain": fields.String(description="Custom domain"),
"copyright": fields.String(description="Copyright text"),
"privacy_policy": fields.String(description="Privacy policy"),
"custom_disclaimer": fields.String(description="Custom disclaimer"),
"customize_token_strategy": fields.String(
enum=["must", "allow", "not_allow"], description="Token strategy"
),
"prompt_public": fields.Boolean(description="Make prompt public"),
"show_workflow_steps": fields.Boolean(description="Show workflow steps"),
"use_icon_as_answer_icon": fields.Boolean(description="Use icon as answer icon"),
},
)
)
@api.response(200, "Site configuration updated successfully", app_site_fields)
@api.response(403, "Insufficient permissions")
@api.response(404, "App not found")
@setup_required
@login_required
@account_initialization_required
@@ -75,6 +107,8 @@ class AppSite(Resource):
if value is not None:
setattr(site, attr_name, value)
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
site.updated_by = current_user.id
site.updated_at = naive_utc_now()
db.session.commit()
@@ -82,7 +116,14 @@ class AppSite(Resource):
return site
@console_ns.route("/apps/<uuid:app_id>/site/access-token-reset")
class AppSiteAccessTokenReset(Resource):
@api.doc("reset_app_site_access_token")
@api.doc(description="Reset access token for application site")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Access token reset successfully", app_site_fields)
@api.response(403, "Insufficient permissions (admin/owner required)")
@api.response(404, "App or site not found")
@setup_required
@login_required
@account_initialization_required
@@ -99,12 +140,10 @@ class AppSiteAccessTokenReset(Resource):
raise NotFound
site.code = Site.generate_code(16)
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
site.updated_by = current_user.id
site.updated_at = naive_utc_now()
db.session.commit()
return site
api.add_resource(AppSite, "/apps/<uuid:app_id>/site")
api.add_resource(AppSiteAccessTokenReset, "/apps/<uuid:app_id>/site/access-token-reset")
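The isinstance(current_user, Account) guards added before writing updated_by act both as a runtime check and as type narrowing, since flask-login's current_user may also be an end user or anonymous proxy. Below is a stand-alone sketch of the same pattern; the Account and EndUser classes here are simplified stand-ins, not the project's models.

# Simplified illustration of the narrowing pattern (placeholder types).
from dataclasses import dataclass
from typing import Union

@dataclass
class Account:
    id: str

@dataclass
class EndUser:
    session_id: str

def audit_update(current_user: Union[Account, EndUser]) -> str:
    if not isinstance(current_user, Account):
        # Same failure mode as the controller: refuse to attribute the edit.
        raise ValueError("current_user must be an Account instance")
    # After the guard, type checkers know .id exists.
    return current_user.id

print(audit_update(Account(id="acc-123")))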

View File

@@ -5,9 +5,9 @@ import pytz
import sqlalchemy as sa
from flask import jsonify
from flask_login import current_user
- from flask_restx import Resource, reqparse
+ from flask_restx import Resource, fields, reqparse
- from controllers.console import api
+ from controllers.console import api, console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from core.app.entities.app_invoke_entities import InvokeFrom
@@ -17,11 +17,25 @@ from libs.login import login_required
from models import AppMode, Message
@console_ns.route("/apps/<uuid:app_id>/statistics/daily-messages")
class DailyMessageStatistic(Resource):
@api.doc("get_daily_message_statistics")
@api.doc(description="Get daily message statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
200,
"Daily message statistics retrieved successfully",
fields.List(fields.Raw(description="Daily message count data")),
)
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def get(self, app_model):
account = current_user
@@ -74,11 +88,25 @@ WHERE
return jsonify({"data": response_data})
@console_ns.route("/apps/<uuid:app_id>/statistics/daily-conversations")
class DailyConversationStatistic(Resource):
@api.doc("get_daily_conversation_statistics")
@api.doc(description="Get daily conversation statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
200,
"Daily conversation statistics retrieved successfully",
fields.List(fields.Raw(description="Daily conversation count data")),
)
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def get(self, app_model):
account = current_user
@@ -126,11 +154,25 @@ class DailyConversationStatistic(Resource):
return jsonify({"data": response_data})
@console_ns.route("/apps/<uuid:app_id>/statistics/daily-end-users")
class DailyTerminalsStatistic(Resource):
@api.doc("get_daily_terminals_statistics")
@api.doc(description="Get daily terminal/end-user statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
200,
"Daily terminal statistics retrieved successfully",
fields.List(fields.Raw(description="Daily terminal count data")),
)
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def get(self, app_model):
account = current_user
@@ -183,11 +225,25 @@ WHERE
return jsonify({"data": response_data})
@console_ns.route("/apps/<uuid:app_id>/statistics/token-costs")
class DailyTokenCostStatistic(Resource):
@api.doc("get_daily_token_cost_statistics")
@api.doc(description="Get daily token cost statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
200,
"Daily token cost statistics retrieved successfully",
fields.List(fields.Raw(description="Daily token cost data")),
)
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def get(self, app_model):
account = current_user
@@ -243,7 +299,21 @@ WHERE
return jsonify({"data": response_data})
@console_ns.route("/apps/<uuid:app_id>/statistics/average-session-interactions")
class AverageSessionInteractionStatistic(Resource):
@api.doc("get_average_session_interaction_statistics")
@api.doc(description="Get average session interaction statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
200,
"Average session interaction statistics retrieved successfully",
fields.List(fields.Raw(description="Average session interaction data")),
)
@setup_required
@login_required
@account_initialization_required
@@ -319,11 +389,25 @@ ORDER BY
return jsonify({"data": response_data})
@console_ns.route("/apps/<uuid:app_id>/statistics/user-satisfaction-rate")
class UserSatisfactionRateStatistic(Resource):
@api.doc("get_user_satisfaction_rate_statistics")
@api.doc(description="Get user satisfaction rate statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
200,
"User satisfaction rate statistics retrieved successfully",
fields.List(fields.Raw(description="User satisfaction rate data")),
)
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def get(self, app_model):
account = current_user
@@ -385,7 +469,21 @@ WHERE
return jsonify({"data": response_data})
@console_ns.route("/apps/<uuid:app_id>/statistics/average-response-time")
class AverageResponseTimeStatistic(Resource):
@api.doc("get_average_response_time_statistics")
@api.doc(description="Get average response time statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
200,
"Average response time statistics retrieved successfully",
fields.List(fields.Raw(description="Average response time data")),
)
@setup_required
@login_required
@account_initialization_required
@@ -442,11 +540,25 @@ WHERE
return jsonify({"data": response_data})
@console_ns.route("/apps/<uuid:app_id>/statistics/tokens-per-second")
class TokensPerSecondStatistic(Resource):
@api.doc("get_tokens_per_second_statistics")
@api.doc(description="Get tokens per second statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
200,
"Tokens per second statistics retrieved successfully",
fields.List(fields.Raw(description="Tokens per second data")),
)
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def get(self, app_model):
account = current_user
@@ -500,13 +612,3 @@ WHERE
response_data.append({"date": str(i.date), "tps": round(i.tokens_per_second, 4)})
return jsonify({"data": response_data})
api.add_resource(DailyMessageStatistic, "/apps/<uuid:app_id>/statistics/daily-messages")
api.add_resource(DailyConversationStatistic, "/apps/<uuid:app_id>/statistics/daily-conversations")
api.add_resource(DailyTerminalsStatistic, "/apps/<uuid:app_id>/statistics/daily-end-users")
api.add_resource(DailyTokenCostStatistic, "/apps/<uuid:app_id>/statistics/token-costs")
api.add_resource(AverageSessionInteractionStatistic, "/apps/<uuid:app_id>/statistics/average-session-interactions")
api.add_resource(UserSatisfactionRateStatistic, "/apps/<uuid:app_id>/statistics/user-satisfaction-rate")
api.add_resource(AverageResponseTimeStatistic, "/apps/<uuid:app_id>/statistics/average-response-time")
api.add_resource(TokensPerSecondStatistic, "/apps/<uuid:app_id>/statistics/tokens-per-second")
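All of the statistics resources above accept the same optional start/end query arguments; a hedged example of querying one of them follows (URL prefix, auth, and anything beyond the documented {"data": [...]} envelope are assumptions).

# Illustrative query against the daily-messages statistics endpoint (placeholders throughout).
import requests

BASE = "http://localhost:5001/console/api"
HEADERS = {"Authorization": "Bearer <console-token>"}
APP_ID = "<app-uuid>"

r = requests.get(
    f"{BASE}/apps/{APP_ID}/statistics/daily-messages",
    params={"start": "2025-09-01 00:00", "end": "2025-09-16 00:00"},  # documented YYYY-MM-DD HH:MM format
    headers=HEADERS,
)
for row in r.json()["data"]:
    print(row)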

View File

@@ -4,18 +4,14 @@ from collections.abc import Sequence
from typing import cast
from flask import abort, request
- from flask_restx import Resource, inputs, marshal_with, reqparse
+ from flask_restx import Resource, fields, inputs, marshal_with, reqparse
from sqlalchemy.orm import Session
from werkzeug.exceptions import Forbidden, InternalServerError, NotFound
import services
from configs import dify_config
- from controllers.console import api
+ from controllers.console import api, console_ns
- from controllers.console.app.error import (
- ConversationCompletedError,
- DraftWorkflowNotExist,
- DraftWorkflowNotSync,
- )
+ from controllers.console.app.error import ConversationCompletedError, DraftWorkflowNotExist, DraftWorkflowNotSync
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.web.error import InvokeRateLimitError as InvokeRateLimitHttpError
@@ -24,6 +20,7 @@ from core.app.apps.base_app_queue_manager import AppQueueManager
from core.app.entities.app_invoke_entities import InvokeFrom
from core.file.models import File
from core.helper.trace_id_helper import get_external_trace_id
+ from core.workflow.graph_engine.manager import GraphEngineManager
from extensions.ext_database import db
from factories import file_factory, variable_factory
from fields.workflow_fields import workflow_fields, workflow_pagination_fields
@@ -61,7 +58,13 @@ def _parse_file(workflow: Workflow, files: list[dict] | None = None) -> Sequence
return file_objs
@console_ns.route("/apps/<uuid:app_id>/workflows/draft")
class DraftWorkflowApi(Resource):
@api.doc("get_draft_workflow")
@api.doc(description="Get draft workflow for an application")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Draft workflow retrieved successfully", workflow_fields)
@api.response(404, "Draft workflow not found")
@setup_required
@login_required
@account_initialization_required
@@ -73,7 +76,7 @@ class DraftWorkflowApi(Resource):
"""
# The role of the current user in the ta table must be admin, owner, or editor
assert isinstance(current_user, Account)
- if not current_user.is_editor:
+ if not current_user.has_edit_permission:
raise Forbidden()
# fetch draft workflow by app_model
@@ -90,13 +93,30 @@ class DraftWorkflowApi(Resource):
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.ADVANCED_CHAT, AppMode.WORKFLOW])
@api.doc("sync_draft_workflow")
@api.doc(description="Sync draft workflow configuration")
@api.expect(
api.model(
"SyncDraftWorkflowRequest",
{
"graph": fields.Raw(required=True, description="Workflow graph configuration"),
"features": fields.Raw(required=True, description="Workflow features configuration"),
"hash": fields.String(description="Workflow hash for validation"),
"environment_variables": fields.List(fields.Raw, required=True, description="Environment variables"),
"conversation_variables": fields.List(fields.Raw, description="Conversation variables"),
},
)
)
@api.response(200, "Draft workflow synced successfully", workflow_fields)
@api.response(400, "Invalid workflow configuration")
@api.response(403, "Permission denied")
def post(self, app_model: App):
"""
Sync draft workflow
"""
# The role of the current user in the ta table must be admin, owner, or editor
assert isinstance(current_user, Account)
- if not current_user.is_editor:
+ if not current_user.has_edit_permission:
raise Forbidden()
content_type = request.headers.get("Content-Type", "")
@@ -163,7 +183,25 @@ class DraftWorkflowApi(Resource):
}
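A sketch of the sync-draft-workflow call described by the SyncDraftWorkflowRequest model above; the graph and features payloads are app-specific, so empty placeholders are used, and the URL prefix and auth header are assumptions.

# Hypothetical draft-workflow sync request (placeholder payload).
import requests

BASE = "http://localhost:5001/console/api"
HEADERS = {"Authorization": "Bearer <console-token>"}
APP_ID = "<app-uuid>"

payload = {
    "graph": {"nodes": [], "edges": []},       # real graphs come from the workflow editor
    "features": {},
    "environment_variables": [],
    "conversation_variables": [],
}
r = requests.post(f"{BASE}/apps/{APP_ID}/workflows/draft", json=payload, headers=HEADERS)
print(r.status_code, r.json())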
@console_ns.route("/apps/<uuid:app_id>/advanced-chat/workflows/draft/run")
class AdvancedChatDraftWorkflowRunApi(Resource):
@api.doc("run_advanced_chat_draft_workflow")
@api.doc(description="Run draft workflow for advanced chat application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"AdvancedChatWorkflowRunRequest",
{
"query": fields.String(required=True, description="User query"),
"inputs": fields.Raw(description="Input variables"),
"files": fields.List(fields.Raw, description="File uploads"),
"conversation_id": fields.String(description="Conversation ID"),
},
)
)
@api.response(200, "Workflow run started successfully")
@api.response(400, "Invalid request parameters")
@api.response(403, "Permission denied")
@setup_required
@login_required
@account_initialization_required
@@ -174,7 +212,7 @@ class AdvancedChatDraftWorkflowRunApi(Resource):
"""
# The role of the current user in the ta table must be admin, owner, or editor
assert isinstance(current_user, Account)
- if not current_user.is_editor:
+ if not current_user.has_edit_permission:
raise Forbidden()
if not isinstance(current_user, Account):
@@ -212,7 +250,23 @@ class AdvancedChatDraftWorkflowRunApi(Resource):
raise InternalServerError()
@console_ns.route("/apps/<uuid:app_id>/advanced-chat/workflows/draft/iteration/nodes/<string:node_id>/run")
class AdvancedChatDraftRunIterationNodeApi(Resource):
@api.doc("run_advanced_chat_draft_iteration_node")
@api.doc(description="Run draft workflow iteration node for advanced chat")
@api.doc(params={"app_id": "Application ID", "node_id": "Node ID"})
@api.expect(
api.model(
"IterationNodeRunRequest",
{
"task_id": fields.String(required=True, description="Task ID"),
"inputs": fields.Raw(description="Input variables"),
},
)
)
@api.response(200, "Iteration node run started successfully")
@api.response(403, "Permission denied")
@api.response(404, "Node not found")
@setup_required
@login_required
@account_initialization_required
@@ -224,7 +278,7 @@ class AdvancedChatDraftRunIterationNodeApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# The role of the current user in the ta table must be admin, owner, or editor
- if not current_user.is_editor:
+ if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser()
@@ -248,7 +302,23 @@ class AdvancedChatDraftRunIterationNodeApi(Resource):
raise InternalServerError()
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/iteration/nodes/<string:node_id>/run")
class WorkflowDraftRunIterationNodeApi(Resource):
@api.doc("run_workflow_draft_iteration_node")
@api.doc(description="Run draft workflow iteration node")
@api.doc(params={"app_id": "Application ID", "node_id": "Node ID"})
@api.expect(
api.model(
"WorkflowIterationNodeRunRequest",
{
"task_id": fields.String(required=True, description="Task ID"),
"inputs": fields.Raw(description="Input variables"),
},
)
)
@api.response(200, "Workflow iteration node run started successfully")
@api.response(403, "Permission denied")
@api.response(404, "Node not found")
@setup_required
@login_required
@account_initialization_required
@@ -260,7 +330,7 @@ class WorkflowDraftRunIterationNodeApi(Resource):
# The role of the current user in the ta table must be admin, owner, or editor
if not isinstance(current_user, Account):
raise Forbidden()
- if not current_user.is_editor:
+ if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser()
@@ -284,7 +354,23 @@ class WorkflowDraftRunIterationNodeApi(Resource):
raise InternalServerError()
@console_ns.route("/apps/<uuid:app_id>/advanced-chat/workflows/draft/loop/nodes/<string:node_id>/run")
class AdvancedChatDraftRunLoopNodeApi(Resource):
@api.doc("run_advanced_chat_draft_loop_node")
@api.doc(description="Run draft workflow loop node for advanced chat")
@api.doc(params={"app_id": "Application ID", "node_id": "Node ID"})
@api.expect(
api.model(
"LoopNodeRunRequest",
{
"task_id": fields.String(required=True, description="Task ID"),
"inputs": fields.Raw(description="Input variables"),
},
)
)
@api.response(200, "Loop node run started successfully")
@api.response(403, "Permission denied")
@api.response(404, "Node not found")
@setup_required
@login_required
@account_initialization_required
@@ -297,7 +383,7 @@ class AdvancedChatDraftRunLoopNodeApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# The role of the current user in the ta table must be admin, owner, or editor
- if not current_user.is_editor:
+ if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser()
@@ -321,7 +407,23 @@ class AdvancedChatDraftRunLoopNodeApi(Resource):
raise InternalServerError()
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/loop/nodes/<string:node_id>/run")
class WorkflowDraftRunLoopNodeApi(Resource):
@api.doc("run_workflow_draft_loop_node")
@api.doc(description="Run draft workflow loop node")
@api.doc(params={"app_id": "Application ID", "node_id": "Node ID"})
@api.expect(
api.model(
"WorkflowLoopNodeRunRequest",
{
"task_id": fields.String(required=True, description="Task ID"),
"inputs": fields.Raw(description="Input variables"),
},
)
)
@api.response(200, "Workflow loop node run started successfully")
@api.response(403, "Permission denied")
@api.response(404, "Node not found")
@setup_required
@login_required
@account_initialization_required
@@ -334,7 +436,7 @@ class WorkflowDraftRunLoopNodeApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# The role of the current user in the ta table must be admin, owner, or editor
- if not current_user.is_editor:
+ if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser()
@@ -358,7 +460,22 @@ class WorkflowDraftRunLoopNodeApi(Resource):
raise InternalServerError()
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/run")
class DraftWorkflowRunApi(Resource):
@api.doc("run_draft_workflow")
@api.doc(description="Run draft workflow")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"DraftWorkflowRunRequest",
{
"inputs": fields.Raw(required=True, description="Input variables"),
"files": fields.List(fields.Raw, description="File uploads"),
},
)
)
@api.response(200, "Draft workflow run started successfully")
@api.response(403, "Permission denied")
@setup_required
@login_required
@account_initialization_required
@@ -371,7 +488,7 @@ class DraftWorkflowRunApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# The role of the current user in the ta table must be admin, owner, or editor
- if not current_user.is_editor:
+ if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser()
@@ -397,7 +514,14 @@ class DraftWorkflowRunApi(Resource):
raise InvokeRateLimitHttpError(ex.description)
@console_ns.route("/apps/<uuid:app_id>/workflow-runs/tasks/<string:task_id>/stop")
class WorkflowTaskStopApi(Resource):
@api.doc("stop_workflow_task")
@api.doc(description="Stop running workflow task")
@api.doc(params={"app_id": "Application ID", "task_id": "Task ID"})
@api.response(200, "Task stopped successfully")
@api.response(404, "Task not found")
@api.response(403, "Permission denied")
@setup_required
@login_required
@account_initialization_required
@@ -410,15 +534,35 @@ class WorkflowTaskStopApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# The role of the current user in the ta table must be admin, owner, or editor
- if not current_user.is_editor:
+ if not current_user.has_edit_permission:
raise Forbidden()
- AppQueueManager.set_stop_flag(task_id, InvokeFrom.DEBUGGER, current_user.id)
+ # Stop using both mechanisms for backward compatibility
+ # Legacy stop flag mechanism (without user check)
+ AppQueueManager.set_stop_flag_no_user_check(task_id)
+ # New graph engine command channel mechanism
+ GraphEngineManager.send_stop_command(task_id)
return {"result": "success"}
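The stop handler now fans out to two mechanisms: the legacy per-task stop flag and a command channel consumed by the new graph engine. The snippet below illustrates the command-channel idea only, using an in-process queue; it is not the actual GraphEngineManager implementation.

# Toy command channel: a per-task queue that a running worker polls for "stop".
import queue
import threading
import time

_channels: dict[str, queue.Queue] = {}

def send_stop_command(task_id: str) -> None:
    _channels.setdefault(task_id, queue.Queue()).put("stop")

def run_task(task_id: str) -> None:
    chan = _channels.setdefault(task_id, queue.Queue())
    for step in range(100):
        time.sleep(0.01)  # simulate one unit of graph-engine work
        try:
            if chan.get_nowait() == "stop":
                print(f"{task_id}: stopped at step {step}")
                return
        except queue.Empty:
            continue
    print(f"{task_id}: completed")

worker = threading.Thread(target=run_task, args=("task-123",))
worker.start()
send_stop_command("task-123")
worker.join()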
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/nodes/<string:node_id>/run")
class DraftWorkflowNodeRunApi(Resource):
@api.doc("run_draft_workflow_node")
@api.doc(description="Run draft workflow node")
@api.doc(params={"app_id": "Application ID", "node_id": "Node ID"})
@api.expect(
api.model(
"DraftWorkflowNodeRunRequest",
{
"inputs": fields.Raw(description="Input variables"),
},
)
)
@api.response(200, "Node run started successfully", workflow_run_node_execution_fields)
@api.response(403, "Permission denied")
@api.response(404, "Node not found")
@setup_required
@login_required
@account_initialization_required
@@ -432,7 +576,7 @@ class DraftWorkflowNodeRunApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# The role of the current user in the ta table must be admin, owner, or editor
- if not current_user.is_editor:
+ if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser()
@@ -466,7 +610,13 @@ class DraftWorkflowNodeRunApi(Resource):
return workflow_node_execution
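Single-node debugging, as documented by DraftWorkflowNodeRunRequest above, can be driven with a request like the following; node_id and the inputs mapping depend entirely on the draft graph, and the prefix and auth values are placeholders.

# Hypothetical single-node debug run (placeholders throughout).
import requests

BASE = "http://localhost:5001/console/api"
HEADERS = {"Authorization": "Bearer <console-token>"}
APP_ID = "<app-uuid>"
NODE_ID = "<node-id>"

r = requests.post(
    f"{BASE}/apps/{APP_ID}/workflows/draft/nodes/{NODE_ID}/run",
    json={"inputs": {"query": "hello"}},
    headers=HEADERS,
)
print(r.status_code, r.json())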
@console_ns.route("/apps/<uuid:app_id>/workflows/publish")
class PublishedWorkflowApi(Resource):
@api.doc("get_published_workflow")
@api.doc(description="Get published workflow for an application")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Published workflow retrieved successfully", workflow_fields)
@api.response(404, "Published workflow not found")
@setup_required
@login_required
@account_initialization_required
@@ -480,7 +630,7 @@ class PublishedWorkflowApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# The role of the current user in the ta table must be admin, owner, or editor
- if not current_user.is_editor:
+ if not current_user.has_edit_permission:
raise Forbidden()
# fetch published workflow by app_model
@@ -501,7 +651,7 @@ class PublishedWorkflowApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# The role of the current user in the ta table must be admin, owner, or editor
- if not current_user.is_editor:
+ if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser()
@@ -526,7 +676,7 @@ class PublishedWorkflowApi(Resource):
)
app_model.workflow_id = workflow.id
- db.session.commit()
+ db.session.commit()  # NOTE: this is necessary for update app_model.workflow_id
workflow_created_at = TimestampField().format(workflow.created_at)
@@ -538,7 +688,12 @@ class PublishedWorkflowApi(Resource):
}
@console_ns.route("/apps/<uuid:app_id>/workflows/default-workflow-block-configs")
class DefaultBlockConfigsApi(Resource):
@api.doc("get_default_block_configs")
@api.doc(description="Get default block configurations for workflow")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Default block configurations retrieved successfully")
@setup_required
@login_required
@account_initialization_required
@@ -551,7 +706,7 @@ class DefaultBlockConfigsApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# The role of the current user in the ta table must be admin, owner, or editor
- if not current_user.is_editor:
+ if not current_user.has_edit_permission:
raise Forbidden()
# Get default block configs
@@ -559,7 +714,13 @@ class DefaultBlockConfigsApi(Resource):
return workflow_service.get_default_block_configs()
@console_ns.route("/apps/<uuid:app_id>/workflows/default-workflow-block-configs/<string:block_type>")
class DefaultBlockConfigApi(Resource):
@api.doc("get_default_block_config")
@api.doc(description="Get default block configuration by type")
@api.doc(params={"app_id": "Application ID", "block_type": "Block type"})
@api.response(200, "Default block configuration retrieved successfully")
@api.response(404, "Block type not found")
@setup_required
@login_required
@account_initialization_required
@@ -571,7 +732,7 @@ class DefaultBlockConfigApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# The role of the current user in the ta table must be admin, owner, or editor
- if not current_user.is_editor:
+ if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser()
@@ -592,7 +753,14 @@ class DefaultBlockConfigApi(Resource):
return workflow_service.get_default_block_config(node_type=block_type, filters=filters)
@console_ns.route("/apps/<uuid:app_id>/convert-to-workflow")
class ConvertToWorkflowApi(Resource):
@api.doc("convert_to_workflow")
@api.doc(description="Convert application to workflow mode")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Application converted to workflow successfully")
@api.response(400, "Application cannot be converted")
@api.response(403, "Permission denied")
@setup_required
@login_required
@account_initialization_required
@@ -606,7 +774,7 @@ class ConvertToWorkflowApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# The role of the current user in the ta table must be admin, owner, or editor
- if not current_user.is_editor:
+ if not current_user.has_edit_permission:
raise Forbidden()
if request.data:
@@ -629,9 +797,14 @@ class ConvertToWorkflowApi(Resource):
}
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/config")
class WorkflowConfigApi(Resource):
"""Resource for workflow configuration."""
@api.doc("get_workflow_config")
@api.doc(description="Get workflow configuration")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Workflow configuration retrieved successfully")
@setup_required
@login_required
@account_initialization_required
@@ -642,7 +815,12 @@ class WorkflowConfigApi(Resource):
}
@console_ns.route("/apps/<uuid:app_id>/workflows")
class PublishedAllWorkflowApi(Resource):
@api.doc("get_all_published_workflows")
@api.doc(description="Get all published workflows for an application")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Published workflows retrieved successfully", workflow_pagination_fields)
@setup_required
@login_required
@account_initialization_required
@@ -655,7 +833,7 @@ class PublishedAllWorkflowApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
- if not current_user.is_editor:
+ if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser()
@@ -693,7 +871,23 @@ class PublishedAllWorkflowApi(Resource):
}
@console_ns.route("/apps/<uuid:app_id>/workflows/<string:workflow_id>")
class WorkflowByIdApi(Resource):
@api.doc("update_workflow_by_id")
@api.doc(description="Update workflow by ID")
@api.doc(params={"app_id": "Application ID", "workflow_id": "Workflow ID"})
@api.expect(
api.model(
"UpdateWorkflowRequest",
{
"environment_variables": fields.List(fields.Raw, description="Environment variables"),
"conversation_variables": fields.List(fields.Raw, description="Conversation variables"),
},
)
)
@api.response(200, "Workflow updated successfully", workflow_fields)
@api.response(404, "Workflow not found")
@api.response(403, "Permission denied")
@setup_required
@login_required
@account_initialization_required
@@ -706,7 +900,7 @@ class WorkflowByIdApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# Check permission
- if not current_user.is_editor:
+ if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser()
@@ -719,7 +913,6 @@ class WorkflowByIdApi(Resource):
raise ValueError("Marked name cannot exceed 20 characters")
if args.marked_comment and len(args.marked_comment) > 100:
raise ValueError("Marked comment cannot exceed 100 characters")
- args = parser.parse_args()
# Prepare update data
update_data = {}
@@ -762,7 +955,7 @@ class WorkflowByIdApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# Check permission
- if not current_user.is_editor:
+ if not current_user.has_edit_permission:
raise Forbidden()
workflow_service = WorkflowService()
@@ -785,7 +978,14 @@ class WorkflowByIdApi(Resource):
return None, 204
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/nodes/<string:node_id>/last-run")
class DraftWorkflowNodeLastRunApi(Resource):
@api.doc("get_draft_workflow_node_last_run")
@api.doc(description="Get last run result for draft workflow node")
@api.doc(params={"app_id": "Application ID", "node_id": "Node ID"})
@api.response(200, "Node last run retrieved successfully", workflow_run_node_execution_fields)
@api.response(404, "Node last run not found")
@api.response(403, "Permission denied")
@setup_required
@login_required
@account_initialization_required
@@ -804,73 +1004,3 @@ class DraftWorkflowNodeLastRunApi(Resource):
if node_exec is None:
raise NotFound("last run not found")
return node_exec
api.add_resource(
DraftWorkflowApi,
"/apps/<uuid:app_id>/workflows/draft",
)
api.add_resource(
WorkflowConfigApi,
"/apps/<uuid:app_id>/workflows/draft/config",
)
api.add_resource(
AdvancedChatDraftWorkflowRunApi,
"/apps/<uuid:app_id>/advanced-chat/workflows/draft/run",
)
api.add_resource(
DraftWorkflowRunApi,
"/apps/<uuid:app_id>/workflows/draft/run",
)
api.add_resource(
WorkflowTaskStopApi,
"/apps/<uuid:app_id>/workflow-runs/tasks/<string:task_id>/stop",
)
api.add_resource(
DraftWorkflowNodeRunApi,
"/apps/<uuid:app_id>/workflows/draft/nodes/<string:node_id>/run",
)
api.add_resource(
AdvancedChatDraftRunIterationNodeApi,
"/apps/<uuid:app_id>/advanced-chat/workflows/draft/iteration/nodes/<string:node_id>/run",
)
api.add_resource(
WorkflowDraftRunIterationNodeApi,
"/apps/<uuid:app_id>/workflows/draft/iteration/nodes/<string:node_id>/run",
)
api.add_resource(
AdvancedChatDraftRunLoopNodeApi,
"/apps/<uuid:app_id>/advanced-chat/workflows/draft/loop/nodes/<string:node_id>/run",
)
api.add_resource(
WorkflowDraftRunLoopNodeApi,
"/apps/<uuid:app_id>/workflows/draft/loop/nodes/<string:node_id>/run",
)
api.add_resource(
PublishedWorkflowApi,
"/apps/<uuid:app_id>/workflows/publish",
)
api.add_resource(
PublishedAllWorkflowApi,
"/apps/<uuid:app_id>/workflows",
)
api.add_resource(
DefaultBlockConfigsApi,
"/apps/<uuid:app_id>/workflows/default-workflow-block-configs",
)
api.add_resource(
DefaultBlockConfigApi,
"/apps/<uuid:app_id>/workflows/default-workflow-block-configs/<string:block_type>",
)
api.add_resource(
ConvertToWorkflowApi,
"/apps/<uuid:app_id>/convert-to-workflow",
)
api.add_resource(
WorkflowByIdApi,
"/apps/<uuid:app_id>/workflows/<string:workflow_id>",
)
api.add_resource(
DraftWorkflowNodeLastRunApi,
"/apps/<uuid:app_id>/workflows/draft/nodes/<string:node_id>/last-run",
)
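The recurring is_editor to has_edit_permission change suggests the permission check was widened beyond the editor role. Below is a minimal sketch of such a property on an account-like model, purely to illustrate the pattern; the role names and their semantics are assumptions, not the project's definition.

# Hypothetical permission property; roles and their meaning are illustrative only.
from enum import Enum

class TenantRole(str, Enum):
    OWNER = "owner"
    ADMIN = "admin"
    EDITOR = "editor"
    NORMAL = "normal"

class DemoAccount:
    def __init__(self, role: TenantRole):
        self.role = role

    @property
    def has_edit_permission(self) -> bool:
        # Broader than a strict "is editor" check: owners and admins can edit too.
        return self.role in {TenantRole.OWNER, TenantRole.ADMIN, TenantRole.EDITOR}

print(DemoAccount(TenantRole.ADMIN).has_edit_permission)   # True
print(DemoAccount(TenantRole.NORMAL).has_edit_permission)  # False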

View File

@@ -3,10 +3,10 @@ from flask_restx import Resource, marshal_with, reqparse
from flask_restx.inputs import int_range
from sqlalchemy.orm import Session
- from controllers.console import api
+ from controllers.console import api, console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
- from core.workflow.entities.workflow_execution import WorkflowExecutionStatus
+ from core.workflow.enums import WorkflowExecutionStatus
from extensions.ext_database import db
from fields.workflow_app_log_fields import workflow_app_log_pagination_fields
from libs.login import login_required
@@ -15,7 +15,7 @@ from models.model import AppMode
from services.workflow_app_service import WorkflowAppService
@console_ns.route("/apps/<uuid:app_id>/workflow-app-logs")
class WorkflowAppLogApi(Resource):
@api.doc("get_workflow_app_logs")
@api.doc(description="Get workflow application execution logs")
@api.doc(params={"app_id": "Application ID"})
@api.doc(
params={
"keyword": "Search keyword for filtering logs",
"status": "Filter by execution status (succeeded, failed, stopped, partial-succeeded)",
"created_at__before": "Filter logs created before this timestamp",
"created_at__after": "Filter logs created after this timestamp",
"created_by_end_user_session_id": "Filter by end user session ID",
"created_by_account": "Filter by account",
"page": "Page number (1-99999)",
"limit": "Number of items per page (1-100)",
}
)
@api.response(200, "Workflow app logs retrieved successfully", workflow_app_log_pagination_fields)
@setup_required
@login_required
@account_initialization_required
@@ -27,7 +44,9 @@ class WorkflowAppLogApi(Resource):
"""
parser = reqparse.RequestParser()
parser.add_argument("keyword", type=str, location="args")
- parser.add_argument("status", type=str, choices=["succeeded", "failed", "stopped"], location="args")
+ parser.add_argument(
+ "status", type=str, choices=["succeeded", "failed", "stopped", "partial-succeeded"], location="args"
+ )
parser.add_argument(
"created_at__before", type=str, location="args", help="Filter logs created before this timestamp"
)
@@ -76,6 +95,3 @@ class WorkflowAppLogApi(Resource):
)
return workflow_app_log_pagination
api.add_resource(WorkflowAppLogApi, "/apps/<uuid:app_id>/workflow-app-logs")
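With the added partial-succeeded choice, log queries can filter on that status as well; a hedged example follows (prefix, auth, and pagination values are placeholders).

# Illustrative workflow app log query using the new status choice (placeholders throughout).
import requests

BASE = "http://localhost:5001/console/api"
HEADERS = {"Authorization": "Bearer <console-token>"}
APP_ID = "<app-uuid>"

r = requests.get(
    f"{BASE}/apps/{APP_ID}/workflow-app-logs",
    params={"status": "partial-succeeded", "page": 1, "limit": 20},
    headers=HEADERS,
)
print(r.json())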

View File

@@ -1,12 +1,12 @@
import logging
- from typing import Any, NoReturn
+ from typing import NoReturn
from flask import Response
from flask_restx import Resource, fields, inputs, marshal, marshal_with, reqparse
from sqlalchemy.orm import Session
from werkzeug.exceptions import Forbidden
- from controllers.console import api
+ from controllers.console import api, console_ns
from controllers.console.app.error import (
DraftWorkflowNotExist,
)
@@ -17,10 +17,11 @@ from core.variables.segment_group import SegmentGroup
from core.variables.segments import ArrayFileSegment, FileSegment, Segment
from core.variables.types import SegmentType
from core.workflow.constants import CONVERSATION_VARIABLE_NODE_ID, SYSTEM_VARIABLE_NODE_ID
+ from extensions.ext_database import db
from factories.file_factory import build_from_mapping, build_from_mappings
from factories.variable_factory import build_segment_with_type
from libs.login import current_user, login_required
- from models import App, AppMode, db
+ from models import App, AppMode
from models.account import Account
from models.workflow import WorkflowDraftVariable
from services.workflow_draft_variable_service import WorkflowDraftVariableList, WorkflowDraftVariableService
@@ -29,7 +30,7 @@ from services.workflow_service import WorkflowService
logger = logging.getLogger(__name__)
- def _convert_values_to_json_serializable_object(value: Segment) -> Any:
+ def _convert_values_to_json_serializable_object(value: Segment):
if isinstance(value, FileSegment):
return value.value.model_dump()
elif isinstance(value, ArrayFileSegment):
@@ -40,7 +41,7 @@ def _convert_values_to_json_serializable_object(value: Segment) -> Any:
return value.value
- def _serialize_var_value(variable: WorkflowDraftVariable) -> Any:
+ def _serialize_var_value(variable: WorkflowDraftVariable):
value = variable.get_value()
# create a copy of the value to avoid affecting the model cache.
value = value.model_copy(deep=True)
@@ -137,14 +138,20 @@ def _api_prerequisite(f):
@get_app_model(mode=[AppMode.ADVANCED_CHAT, AppMode.WORKFLOW])
def wrapper(*args, **kwargs):
assert isinstance(current_user, Account)
- if not current_user.is_editor:
+ if not current_user.has_edit_permission:
raise Forbidden()
return f(*args, **kwargs)
return wrapper
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/variables")
class WorkflowVariableCollectionApi(Resource):
@api.doc("get_workflow_variables")
@api.doc(description="Get draft workflow variables")
@api.doc(params={"app_id": "Application ID"})
@api.doc(params={"page": "Page number (1-100000)", "limit": "Number of items per page (1-100)"})
@api.response(200, "Workflow variables retrieved successfully", _WORKFLOW_DRAFT_VARIABLE_LIST_WITHOUT_VALUE_FIELDS)
@_api_prerequisite
@marshal_with(_WORKFLOW_DRAFT_VARIABLE_LIST_WITHOUT_VALUE_FIELDS)
def get(self, app_model: App):
@@ -173,6 +180,9 @@ class WorkflowVariableCollectionApi(Resource):
return workflow_vars
@api.doc("delete_workflow_variables")
@api.doc(description="Delete all draft workflow variables")
@api.response(204, "Workflow variables deleted successfully")
@_api_prerequisite
def delete(self, app_model: App):
draft_var_srv = WorkflowDraftVariableService(
@@ -201,7 +211,12 @@ def validate_node_id(node_id: str) -> NoReturn | None:
return None
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/nodes/<string:node_id>/variables")
class NodeVariableCollectionApi(Resource):
@api.doc("get_node_variables")
@api.doc(description="Get variables for a specific node")
@api.doc(params={"app_id": "Application ID", "node_id": "Node ID"})
@api.response(200, "Node variables retrieved successfully", _WORKFLOW_DRAFT_VARIABLE_LIST_FIELDS)
@_api_prerequisite
@marshal_with(_WORKFLOW_DRAFT_VARIABLE_LIST_FIELDS)
def get(self, app_model: App, node_id: str):
@@ -214,6 +229,9 @@ class NodeVariableCollectionApi(Resource):
return node_vars
@api.doc("delete_node_variables")
@api.doc(description="Delete all variables for a specific node")
@api.response(204, "Node variables deleted successfully")
@_api_prerequisite
def delete(self, app_model: App, node_id: str):
validate_node_id(node_id)
@@ -223,10 +241,16 @@ class NodeVariableCollectionApi(Resource):
return Response("", 204)
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/variables/<uuid:variable_id>")
class VariableApi(Resource):
_PATCH_NAME_FIELD = "name"
_PATCH_VALUE_FIELD = "value"
@api.doc("get_variable")
@api.doc(description="Get a specific workflow variable")
@api.doc(params={"app_id": "Application ID", "variable_id": "Variable ID"})
@api.response(200, "Variable retrieved successfully", _WORKFLOW_DRAFT_VARIABLE_FIELDS)
@api.response(404, "Variable not found")
@_api_prerequisite
@marshal_with(_WORKFLOW_DRAFT_VARIABLE_FIELDS)
def get(self, app_model: App, variable_id: str):
@@ -240,6 +264,19 @@ class VariableApi(Resource):
raise NotFoundError(description=f"variable not found, id={variable_id}")
return variable
@api.doc("update_variable")
@api.doc(description="Update a workflow variable")
@api.expect(
api.model(
"UpdateVariableRequest",
{
"name": fields.String(description="Variable name"),
"value": fields.Raw(description="Variable value"),
},
)
)
@api.response(200, "Variable updated successfully", _WORKFLOW_DRAFT_VARIABLE_FIELDS)
@api.response(404, "Variable not found")
@_api_prerequisite @_api_prerequisite
@marshal_with(_WORKFLOW_DRAFT_VARIABLE_FIELDS) @marshal_with(_WORKFLOW_DRAFT_VARIABLE_FIELDS)
def patch(self, app_model: App, variable_id: str): def patch(self, app_model: App, variable_id: str):
@@ -302,6 +339,10 @@ class VariableApi(Resource):
db.session.commit()
return variable
@api.doc("delete_variable")
@api.doc(description="Delete a workflow variable")
@api.response(204, "Variable deleted successfully")
@api.response(404, "Variable not found")
@_api_prerequisite
def delete(self, app_model: App, variable_id: str):
draft_var_srv = WorkflowDraftVariableService(
@@ -317,7 +358,14 @@ class VariableApi(Resource):
return Response("", 204)
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/variables/<uuid:variable_id>/reset")
class VariableResetApi(Resource):
@api.doc("reset_variable")
@api.doc(description="Reset a workflow variable to its default value")
@api.doc(params={"app_id": "Application ID", "variable_id": "Variable ID"})
@api.response(200, "Variable reset successfully", _WORKFLOW_DRAFT_VARIABLE_FIELDS)
@api.response(204, "Variable reset (no content)")
@api.response(404, "Variable not found")
@_api_prerequisite
def put(self, app_model: App, variable_id: str):
draft_var_srv = WorkflowDraftVariableService(
@@ -358,7 +406,13 @@ def _get_variable_list(app_model: App, node_id) -> WorkflowDraftVariableList:
return draft_vars
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/conversation-variables")
class ConversationVariableCollectionApi(Resource):
@api.doc("get_conversation_variables")
@api.doc(description="Get conversation variables for workflow")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Conversation variables retrieved successfully", _WORKFLOW_DRAFT_VARIABLE_LIST_FIELDS)
@api.response(404, "Draft workflow not found")
@_api_prerequisite
@marshal_with(_WORKFLOW_DRAFT_VARIABLE_LIST_FIELDS)
def get(self, app_model: App):
@@ -374,14 +428,25 @@ class ConversationVariableCollectionApi(Resource):
return _get_variable_list(app_model, CONVERSATION_VARIABLE_NODE_ID)
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/system-variables")
class SystemVariableCollectionApi(Resource):
@api.doc("get_system_variables")
@api.doc(description="Get system variables for workflow")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "System variables retrieved successfully", _WORKFLOW_DRAFT_VARIABLE_LIST_FIELDS)
@_api_prerequisite
@marshal_with(_WORKFLOW_DRAFT_VARIABLE_LIST_FIELDS)
def get(self, app_model: App):
return _get_variable_list(app_model, SYSTEM_VARIABLE_NODE_ID)
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/environment-variables")
class EnvironmentVariableCollectionApi(Resource):
@api.doc("get_environment_variables")
@api.doc(description="Get environment variables for workflow")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Environment variables retrieved successfully")
@api.response(404, "Draft workflow not found")
@_api_prerequisite
def get(self, app_model: App):
"""
@@ -413,16 +478,3 @@ class EnvironmentVariableCollectionApi(Resource):
)
return {"items": env_vars_list}
api.add_resource(
WorkflowVariableCollectionApi,
"/apps/<uuid:app_id>/workflows/draft/variables",
)
api.add_resource(NodeVariableCollectionApi, "/apps/<uuid:app_id>/workflows/draft/nodes/<string:node_id>/variables")
api.add_resource(VariableApi, "/apps/<uuid:app_id>/workflows/draft/variables/<uuid:variable_id>")
api.add_resource(VariableResetApi, "/apps/<uuid:app_id>/workflows/draft/variables/<uuid:variable_id>/reset")
api.add_resource(ConversationVariableCollectionApi, "/apps/<uuid:app_id>/workflows/draft/conversation-variables")
api.add_resource(SystemVariableCollectionApi, "/apps/<uuid:app_id>/workflows/draft/system-variables")
api.add_resource(EnvironmentVariableCollectionApi, "/apps/<uuid:app_id>/workflows/draft/environment-variables")
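For context, the pattern this diff adopts replaces standalone api.add_resource(...) registration with flask-restx's namespace route decorator plus @api.doc/@api.response metadata. A minimal sketch, using an illustrative app, namespace, and response model rather than the actual Dify modules:

```python
from flask import Flask
from flask_restx import Api, Namespace, Resource, fields

app = Flask(__name__)
api = Api(app, doc="/docs")  # Swagger UI path (illustrative)

# One namespace registration replaces many api.add_resource(...) calls.
console_ns = Namespace("console", path="/console/api")
api.add_namespace(console_ns)

# Illustrative response model; the real field sets live in the fields modules.
variable_fields = api.model(
    "DraftVariable",
    {"id": fields.String, "name": fields.String, "value": fields.Raw},
)


@console_ns.route("/apps/<uuid:app_id>/workflows/draft/variables")
class WorkflowVariableCollectionApi(Resource):
    @api.doc("get_workflow_variables")
    @api.doc(description="Get draft workflow variables")
    @api.response(200, "Variables retrieved successfully", variable_fields)
    def get(self, app_id):
        # The route decorator binds the resource to the namespace path;
        # @api.doc/@api.response only feed the generated OpenAPI spec.
        return {"items": []}
```

Registered this way, the endpoint and its documentation metadata appear in the generated Swagger spec without a separate add_resource call at the bottom of the module.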

View File

@@ -4,7 +4,7 @@ from flask_login import current_user
from flask_restx import Resource, marshal_with, reqparse
from flask_restx.inputs import int_range
from controllers.console import api
from controllers.console import api, console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from fields.workflow_run_fields import (
@@ -19,7 +19,13 @@ from models import Account, App, AppMode, EndUser
from services.workflow_run_service import WorkflowRunService
@console_ns.route("/apps/<uuid:app_id>/advanced-chat/workflow-runs")
class AdvancedChatAppWorkflowRunListApi(Resource):
@api.doc("get_advanced_chat_workflow_runs")
@api.doc(description="Get advanced chat workflow run list")
@api.doc(params={"app_id": "Application ID"})
@api.doc(params={"last_id": "Last run ID for pagination", "limit": "Number of items per page (1-100)"})
@api.response(200, "Workflow runs retrieved successfully", advanced_chat_workflow_run_pagination_fields)
@setup_required
@login_required
@account_initialization_required
@@ -40,7 +46,13 @@ class AdvancedChatAppWorkflowRunListApi(Resource):
return result
@console_ns.route("/apps/<uuid:app_id>/workflow-runs")
class WorkflowRunListApi(Resource):
@api.doc("get_workflow_runs")
@api.doc(description="Get workflow run list")
@api.doc(params={"app_id": "Application ID"})
@api.doc(params={"last_id": "Last run ID for pagination", "limit": "Number of items per page (1-100)"})
@api.response(200, "Workflow runs retrieved successfully", workflow_run_pagination_fields)
@setup_required
@login_required
@account_initialization_required
@@ -61,7 +73,13 @@ class WorkflowRunListApi(Resource):
return result
@console_ns.route("/apps/<uuid:app_id>/workflow-runs/<uuid:run_id>")
class WorkflowRunDetailApi(Resource):
@api.doc("get_workflow_run_detail")
@api.doc(description="Get workflow run detail")
@api.doc(params={"app_id": "Application ID", "run_id": "Workflow run ID"})
@api.response(200, "Workflow run detail retrieved successfully", workflow_run_detail_fields)
@api.response(404, "Workflow run not found")
@setup_required
@login_required
@account_initialization_required
@@ -79,7 +97,13 @@ class WorkflowRunDetailApi(Resource):
return workflow_run
@console_ns.route("/apps/<uuid:app_id>/workflow-runs/<uuid:run_id>/node-executions")
class WorkflowRunNodeExecutionListApi(Resource):
@api.doc("get_workflow_run_node_executions")
@api.doc(description="Get workflow run node execution list")
@api.doc(params={"app_id": "Application ID", "run_id": "Workflow run ID"})
@api.response(200, "Node executions retrieved successfully", workflow_run_node_execution_list_fields)
@api.response(404, "Workflow run not found")
@setup_required
@login_required
@account_initialization_required
@@ -100,9 +124,3 @@ class WorkflowRunNodeExecutionListApi(Resource):
)
return {"data": node_executions}
api.add_resource(AdvancedChatAppWorkflowRunListApi, "/apps/<uuid:app_id>/advanced-chat/workflow-runs")
api.add_resource(WorkflowRunListApi, "/apps/<uuid:app_id>/workflow-runs")
api.add_resource(WorkflowRunDetailApi, "/apps/<uuid:app_id>/workflow-runs/<uuid:run_id>")
api.add_resource(WorkflowRunNodeExecutionListApi, "/apps/<uuid:app_id>/workflow-runs/<uuid:run_id>/node-executions")
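The documented last_id/limit parameters describe the cursor-style pagination of these list endpoints. A rough sketch of how such query arguments can be validated with the reqparse and int_range helpers imported above; the helper name and default page size are assumptions, not the actual Dify implementation:

```python
from flask_restx import reqparse
from flask_restx.inputs import int_range


def build_pagination_parser() -> reqparse.RequestParser:
    """Assumed helper validating the last_id/limit query-string arguments."""
    parser = reqparse.RequestParser()
    # Opaque cursor: the id of the last item returned on the previous page.
    parser.add_argument("last_id", type=str, required=False, location="args")
    # Page size clamped to 1-100, matching the documented range.
    parser.add_argument("limit", type=int_range(1, 100), default=20, location="args")
    return parser


# Inside a request handler: args = build_pagination_parser().parse_args()
```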

View File

@@ -7,7 +7,7 @@ from flask import jsonify
from flask_login import current_user
from flask_restx import Resource, reqparse
from controllers.console import api
from controllers.console import api, console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from extensions.ext_database import db
@@ -17,11 +17,17 @@ from models.enums import WorkflowRunTriggeredFrom
from models.model import AppMode
@console_ns.route("/apps/<uuid:app_id>/workflow/statistics/daily-conversations")
class WorkflowDailyRunsStatistic(Resource):
@api.doc("get_workflow_daily_runs_statistic")
@api.doc(description="Get workflow daily runs statistics")
@api.doc(params={"app_id": "Application ID"})
@api.doc(params={"start": "Start date and time (YYYY-MM-DD HH:MM)", "end": "End date and time (YYYY-MM-DD HH:MM)"})
@api.response(200, "Daily runs statistics retrieved successfully")
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def get(self, app_model):
account = current_user
@@ -79,11 +85,17 @@ WHERE
return jsonify({"data": response_data})
@console_ns.route("/apps/<uuid:app_id>/workflow/statistics/daily-terminals")
class WorkflowDailyTerminalsStatistic(Resource):
@api.doc("get_workflow_daily_terminals_statistic")
@api.doc(description="Get workflow daily terminals statistics")
@api.doc(params={"app_id": "Application ID"})
@api.doc(params={"start": "Start date and time (YYYY-MM-DD HH:MM)", "end": "End date and time (YYYY-MM-DD HH:MM)"})
@api.response(200, "Daily terminals statistics retrieved successfully")
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def get(self, app_model):
account = current_user
@@ -141,11 +153,17 @@ WHERE
return jsonify({"data": response_data})
@console_ns.route("/apps/<uuid:app_id>/workflow/statistics/token-costs")
class WorkflowDailyTokenCostStatistic(Resource):
@api.doc("get_workflow_daily_token_cost_statistic")
@api.doc(description="Get workflow daily token cost statistics")
@api.doc(params={"app_id": "Application ID"})
@api.doc(params={"start": "Start date and time (YYYY-MM-DD HH:MM)", "end": "End date and time (YYYY-MM-DD HH:MM)"})
@api.response(200, "Daily token cost statistics retrieved successfully")
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def get(self, app_model):
account = current_user
@@ -208,7 +226,13 @@ WHERE
return jsonify({"data": response_data})
@console_ns.route("/apps/<uuid:app_id>/workflow/statistics/average-app-interactions")
class WorkflowAverageAppInteractionStatistic(Resource):
@api.doc("get_workflow_average_app_interaction_statistic")
@api.doc(description="Get workflow average app interaction statistics")
@api.doc(params={"app_id": "Application ID"})
@api.doc(params={"start": "Start date and time (YYYY-MM-DD HH:MM)", "end": "End date and time (YYYY-MM-DD HH:MM)"})
@api.response(200, "Average app interaction statistics retrieved successfully")
@setup_required
@login_required
@account_initialization_required
@@ -285,11 +309,3 @@ GROUP BY
)
return jsonify({"data": response_data})
api.add_resource(WorkflowDailyRunsStatistic, "/apps/<uuid:app_id>/workflow/statistics/daily-conversations")
api.add_resource(WorkflowDailyTerminalsStatistic, "/apps/<uuid:app_id>/workflow/statistics/daily-terminals")
api.add_resource(WorkflowDailyTokenCostStatistic, "/apps/<uuid:app_id>/workflow/statistics/token-costs")
api.add_resource(
WorkflowAverageAppInteractionStatistic, "/apps/<uuid:app_id>/workflow/statistics/average-app-interactions"
)
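Note that @get_app_model now sits at the top of each decorator stack in this file. In Python, the decorator listed closest to the function wraps it first, while the topmost wrapper runs first on each call, so reordering changes when the app model is loaded relative to the other checks. A toy illustration of that ordering rule (labels are made up, not Dify decorators):

```python
from functools import wraps


def tag(label):
    """Toy decorator that records which wrapper runs first."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            print(f"enter {label}")
            return fn(*args, **kwargs)
        return wrapper
    return decorator


@tag("outermost")  # listed first: runs first on each call
@tag("innermost")  # listed last: wraps the function directly
def handler():
    print("handler body")


handler()  # prints: enter outermost / enter innermost / handler body
```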

View File

@@ -1,6 +1,6 @@
from collections.abc import Callable
from functools import wraps
from typing import Optional, Union
from typing import ParamSpec, TypeVar, Union
from controllers.console.app.error import AppNotFoundError
from extensions.ext_database import db
@@ -8,8 +8,11 @@ from libs.login import current_user
from models import App, AppMode
from models.account import Account
P = ParamSpec("P")
R = TypeVar("R")
def _load_app_model(app_id: str) -> Optional[App]:
def _load_app_model(app_id: str) -> App | None:
assert isinstance(current_user, Account)
app_model = (
db.session.query(App)
@@ -19,10 +22,10 @@ def _load_app_model(app_id: str) -> Optional[App]:
return app_model
def get_app_model(view: Optional[Callable] = None, *, mode: Union[AppMode, list[AppMode], None] = None):
def get_app_model(view: Callable[P, R] | None = None, *, mode: Union[AppMode, list[AppMode], None] = None):
def decorator(view_func):
def decorator(view_func: Callable[P, R]):
@wraps(view_func)
def decorated_view(*args, **kwargs):
def decorated_view(*args: P.args, **kwargs: P.kwargs):
if not kwargs.get("app_id"):
raise ValueError("missing app_id in path parameters")
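The typing change above lets type checkers see the wrapped view's exact signature through the decorator. A minimal sketch of the same ParamSpec/TypeVar pattern, with a made-up decorator name rather than the actual get_app_model logic:

```python
from collections.abc import Callable
from functools import wraps
from typing import ParamSpec, TypeVar

P = ParamSpec("P")
R = TypeVar("R")


def traced(view_func: Callable[P, R]) -> Callable[P, R]:
    """Hypothetical decorator: the wrapper preserves the wrapped
    function's parameter and return types for static analysis."""
    @wraps(view_func)
    def wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
        return view_func(*args, **kwargs)
    return wrapper


@traced
def load_app(app_id: str) -> dict:
    return {"id": app_id}
```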

Some files were not shown because too many files have changed in this diff.