Compare commits

...

262 Commits

Author SHA1 Message Date
-LAN-
1cf788c43b Merge branch 'main' into feat/queue-based-graph-engine 2025-09-17 12:46:08 +08:00
-LAN-
73a7756350 feat(graph_engine): add dumps and loads for RSC 2025-09-17 12:45:51 +08:00
-LAN-
02d15ebd5a feat(graph_engine): support dumps and loads in GraphExecution 2025-09-16 19:38:10 +08:00
-LAN-
b5a7e64e19 Fix incorrect API endpoint routing from PR #25628 (#25778)
2025-09-16 19:20:26 +08:00
Jiang
b283b10d3e Fix/lindorm vdb optimize (#25748)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-16 16:54:18 +08:00
-LAN-
976b3b5e83 Merge branch 'main' into feat/queue-based-graph-engine 2025-09-16 15:21:36 +08:00
-LAN-
ecb22226d6 refactor: remove Claude-specific references from documentation files (#25760) 2025-09-16 14:22:14 +08:00
Xiyuan Chen
8635aacb46 Enhance LLM model configuration validation to include active status c… (#25759)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-15 23:15:53 -07:00
-LAN-
b5684f1992 refactor(graph_engine): remove unused parameters from Engine 2025-09-16 14:11:42 +08:00
-LAN-
bd13cf05eb Merge branch 'main' into feat/queue-based-graph-engine 2025-09-16 12:59:26 +08:00
Asuka Minato
bdd85b36a4 ruff check preview (#25653)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-16 12:58:12 +08:00
znn
a0c7713494 chat remove transparency from chat bubble in dark mode (#24921) 2025-09-16 12:57:53 +08:00
-LAN-
5f263147f9 fix: make mypy happy 2025-09-16 12:51:11 +08:00
-LAN-
b68afdfa64 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-16 12:32:16 +08:00
NeatGuyCoding
abf4955c26 Feature: add test containers document indexing task (#25684)
Signed-off-by: NeatGuyCoding <15627489+NeatGuyCoding@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-16 09:47:28 +08:00
miwa
74340e3c04 Bugfix: When i change the loop variable, 'Loop Termination Condition' wi… (#25695)
Co-authored-by: fengminhua <fengminhua@52tt.com>
2025-09-16 09:46:44 +08:00
-LAN-
b98b389baf fix(tests): resolve order dependency in disable_segments_from_index_task tests (#25737) 2025-09-16 08:26:52 +08:00
-LAN-
da87fce751 feat(graph_engine): dump and load ready queue 2025-09-16 04:19:46 +08:00
-LAN-
d5342927d0 chore: change _outputs type to dict[str, object] 2025-09-16 01:53:25 +08:00
github-actions[bot]
877806c34d chore: translate i18n files and update type definitions (#25713)
Co-authored-by: GarfieldDai <28395549+GarfieldDai@users.noreply.github.com>
2025-09-15 21:22:57 +08:00
湛露先生
0bbf4fb66a correct typos. (#25717)
Signed-off-by: zhanluxianshen <zhanluxianshen@163.com>
2025-09-15 21:22:40 +08:00
chengjoey
169ce71e59 fix(web): custom-tool output_schema.properties missing type (#25731)
Co-authored-by: joeyczheng <joeyczheng@tencent.com>
2025-09-15 21:21:25 +08:00
quicksand
bdbe078630 fix(mcp): prevent masked headers from overwriting real values (#25722) 2025-09-15 19:24:12 +08:00
autofix-ci[bot]
754d790c89 [autofix.ci] apply automated fixes (attempt 2/3) 2025-09-15 07:58:44 +00:00
autofix-ci[bot]
a099a35e51 [autofix.ci] apply automated fixes 2025-09-15 07:56:51 +00:00
-LAN-
2dd893e60d Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-15 15:54:42 +08:00
Garfield Dai
88d5e27fe8 Release/e-1.8.1 (#25613)
Co-authored-by: zxhlyh <jasonapring2015@outlook.com>
Co-authored-by: GareArc <chen4851@purdue.edu>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: hjlarry <hjlarry@163.com>
2025-09-15 14:49:23 +08:00
-LAN-
bb5b8d2902 fix: resolve devalue prototype pollution vulnerability (#25709) 2025-09-15 13:26:36 +08:00
-LAN-
bab4975809 chore: add ast-grep rule to convert Optional[T] to T | None (#25560)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-15 13:06:33 +08:00
-LAN-
b8ee1d4697 Merge branch 'main' into feat/queue-based-graph-engine 2025-09-15 12:21:18 +08:00
dependabot[bot]
2e44ebe98d chore(deps): bump @lexical/text from 0.30.0 to 0.35.0 in /web (#25705)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-15 12:55:48 +09:00
dependabot[bot]
a1961cc37a chore(deps-dev): bump @next/bundle-analyzer from 15.5.0 to 15.5.3 in /web (#25704)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-15 12:55:17 +09:00
dependabot[bot]
727e1d3743 chore(deps): bump scheduler from 0.23.2 to 0.26.0 in /web (#25699)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-15 12:51:47 +09:00
dependabot[bot]
4e3b16c5f4 chore(deps-dev): bump sass from 1.89.2 to 1.92.1 in /web (#25698)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-15 12:50:49 +09:00
dependabot[bot]
6c36bf28d7 chore(deps): bump clickzetta-connector-python from 0.8.102 to 0.8.104 in /api (#25697)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-15 12:50:12 +09:00
dependabot[bot]
5548b22fe7 chore(deps): bump transformers from 4.53.3 to 4.56.1 in /api (#25696)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-15 12:49:26 +09:00
Asuka Minato
03664d6b51 dependabot (#25677) 2025-09-15 10:59:34 +08:00
Guangdong Liu
07d383ffaa refactor: update API routes and documentation for app and datasets endpoints (#25628) 2025-09-15 10:59:11 +08:00
Joel
9bb7bcf52e feat: user message support generate prompt (#25689) 2025-09-15 10:17:19 +08:00
Ritoban Dutta
67a686cf98 [Chore/Refactor] use __all__ to specify export member. (#25681) 2025-09-15 09:45:35 +08:00
ChasePassion
a3f2c05632 optimize _merge_splits function by using enumerate instead of manual index tracking (#25680)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-09-15 09:41:16 +08:00
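As a quick illustration of the change this commit describes (the real _merge_splits logic is more involved; the names below are only for the example), replacing manual index tracking with enumerate looks like this:

splits = ["chunk-a", "chunk-b", "chunk-c"]

# Before: manual index tracking
index = 0
numbered = []
for split in splits:
    numbered.append((index, split))
    index += 1

# After: enumerate yields the index alongside each item
numbered_with_enumerate = [(i, split) for i, split in enumerate(splits)]

assert numbered == numbered_with_enumerate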
-LAN-
b4ef1de30f feat(graph_engine): add ready_queue state persistence to GraphRuntimeState
- Add ReadyQueueState TypedDict for type-safe queue serialization
- Add ready_queue attribute to GraphRuntimeState for initializing with pre-existing queue state
- Update GraphEngine to load ready_queue from GraphRuntimeState on initialization
- Implement proper type hints using ReadyQueueState for better type safety
- Add comprehensive tests for ready_queue loading functionality

The ready_queue is read-only after initialization and allows resuming workflow
execution with a pre-populated queue of nodes ready to execute.
2025-09-15 03:05:10 +08:00
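A minimal sketch of the serialization round-trip described above, assuming a ReadyQueueState TypedDict with illustrative field names (the actual schema lives in the graph engine and may differ):

from typing import TypedDict


class ReadyQueueState(TypedDict):
    """Illustrative snapshot format for the ready queue."""
    type: str
    version: str
    items: list[str]  # IDs of nodes that are ready to execute


def dump_ready_queue(node_ids: list[str]) -> ReadyQueueState:
    # Capture the queue so a workflow can be resumed with it pre-populated.
    return {"type": "fifo", "version": "1", "items": list(node_ids)}


def load_ready_queue(state: ReadyQueueState) -> list[str]:
    # The queue is read-only after initialization; it is only rebuilt here.
    return list(state["items"])


snapshot = dump_ready_queue(["node_a", "node_b"])
assert load_ready_queue(snapshot) == ["node_a", "node_b"]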
lyzno1
efcf052004 chore: bump pnpm version to v10.16.0 (#25640) 2025-09-14 18:44:35 +08:00
Timo
9234a2293d improve type hints using typing.Literal and add type annotations (#25641)
Co-authored-by: EchterTimo <EchterTimo@users.noreply.github.com>
2025-09-14 18:44:23 +08:00
Guangdong Liu
7a626747cf bugfix: The randomly generated email by Faker actually corresponded to an existing account in the test database, causing the test to fail. (#25646)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-09-14 18:41:35 +08:00
github-actions[bot]
db01cbb63d chore: translate i18n files and update type definitions (#25645)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-14 18:41:15 +08:00
Asuka Minato
4f868275a9 example for __all__ (#25666) 2025-09-14 18:40:06 +08:00
-LAN-
ed20d14d01 feat: enhance Makefile with code quality commands and default help (#25655) 2025-09-14 18:39:42 +08:00
NeatGuyCoding
0add1af1c8 feat: add test containers based tests for disable segments from index task (#25660) 2025-09-14 14:12:52 +08:00
yo
5c50c3aa70 fix: allow empty values in Variable Inspector (#25644) 2025-09-14 14:10:12 +08:00
lyzno1
9e7328abfb feat: add circular scrolling to GotoAnything command menu (#25662) 2025-09-14 14:07:10 +08:00
autofix-ci[bot]
0f15a2baca [autofix.ci] apply automated fixes 2025-09-13 20:20:53 +00:00
-LAN-
4cdc19fd05 feat(graph_engine): add abstract layer and dump / load methods for ready queue. 2025-09-14 04:19:24 +08:00
-LAN-
efa5f35277 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-14 01:48:06 +08:00
Yongtao Huang
188eb838c5 [Test] speed up Hypothesis strategies to avoid too_slow (#25623)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-13 21:05:19 +08:00
lyzno1
36ab9974d2 fix: Multiple UX improvements for GotoAnything command palette (#25637) 2025-09-13 21:03:42 +08:00
-LAN-
766fda395b Merge branch 'main' into feat/queue-based-graph-engine 2025-09-13 19:37:52 +08:00
NeatGuyCoding
a825f0f2b2 Feature add test containers disable segment from index task (#25631)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-13 14:28:10 +08:00
-LAN-
b0e815c3c7 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-13 01:31:17 +08:00
-LAN-
1b0f92a331 feat(stress-test): add comprehensive stress testing suite using Locust (#25617)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-12 22:25:05 +08:00
Krito.
a13d7987e0 chore: adopt StrEnum and auto() for some string-typed enums (#25129)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-12 21:14:26 +08:00
17hz
635e7d3e70 fix: Cannot modify values when startNode has defaultValue (#25595) 2025-09-12 21:11:24 +08:00
chengjoey
c78ef79995 fix close button cannot be clicked when the browser page is zoomed out (#25584)
Co-authored-by: joeyczheng <joeyczheng@tencent.com>
2025-09-12 21:11:00 +08:00
Tianyi Jing
c3f9a7ed9b feat: add type integer to VarType (#25500)
Signed-off-by: jingfelix <jingfelix@outlook.com>
2025-09-12 21:09:41 +08:00
kenwoodjw
c91253d05d fix segment deletion race condition (#24408)
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-12 15:29:57 +08:00
Guangdong Liu
285291f545 refactor: update API routes and documentation for console endpoints (#25554)
2025-09-12 11:51:24 +08:00
JQSevenMiao
c0e1015c6e fix: filter temporary edges from workflow draft sync (#25442)
Co-authored-by: jiasiqi <jiasiqi3@tal.com>
2025-09-12 11:19:57 +08:00
github-actions[bot]
12d1bcc545 chore: translate i18n files and update type definitions (#25575)
Co-authored-by: iamjoel <2120155+iamjoel@users.noreply.github.com>
2025-09-12 10:39:38 +08:00
Yeuoly
ec808f3fe8 refactor: centralize default end user session ID constant (#25416)
This PR refactors the handling of the default end user session ID by centralizing it as an enum in the models module where the `EndUser` model is defined. This improves code organization and makes the relationship between the constant and the model clearer.

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-12 10:27:16 +08:00
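A rough sketch of the centralization described in this PR, assuming a Python 3.11 StrEnum placed next to the EndUser model (the member and value names here are made up for illustration):

from enum import StrEnum


class DefaultEndUserSessionID(StrEnum):
    # One shared constant instead of string literals scattered across callers.
    DEFAULT = "DEFAULT-USER"


def resolve_session_id(session_id: str | None) -> str:
    return session_id or DefaultEndUserSessionID.DEFAULT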
Joel
394b0ac9c0 fix: login security issue frontend (#25571)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-09-12 10:25:06 +08:00
zyssyz123
c2fcd2895b Feat/email register refactor (#25369)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Joel <iamjoel007@gmail.com>
2025-09-12 10:24:54 +08:00
Ganondorf
bb1514be2d Force update search method to keyword_search (#25464)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-12 10:12:25 +08:00
Kurumi1997
8ffb9b6aed fix: Support passing the default app mode when creating an app (#25142)
Co-authored-by: 王博 <wangbo@localhost.com>
2025-09-12 10:06:07 +08:00
Matri Qi
33afa7c84a Fix/disable no unsafe optional chaining (#25553) 2025-09-12 10:03:34 +08:00
L
69aad38d03 fix(date-picker): handle string date to avoid crash (#25522)
Co-authored-by: 刘佳佳 <liujiajia@nanjingwanhui.com>
Co-authored-by: crazywoola <427733928@qq.com>
2025-09-12 10:01:26 +08:00
Novice
17b5309e47 fix: single step system file error (#25533)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-12 09:47:45 +08:00
Asuka Minato
05af23f88f use autospec=True in mock (#25497) 2025-09-12 09:46:02 +08:00
Yongtao Huang
4511f4f537 Remove redundant parse_args call in WorkflowByIdApi.patch (#25498) 2025-09-12 09:40:41 +08:00
dependabot[bot]
bdacc4da36 chore(deps): bump mermaid from 11.4.1 to 11.10.0 in /web (#25521)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-12 09:40:18 +08:00
15
1a078657d8 Fixes #25530 (#25531) 2025-09-12 09:39:17 +08:00
Asuka Minato
77ba3e8f26 add autofix pnpm (#25557)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-09-12 09:37:54 +08:00
Wu Tianwei
84e3571ec3 fix: delete get upload file endpoint (#25543)
Co-authored-by: jyong <718720800@qq.com>
2025-09-12 09:36:53 +08:00
NeatGuyCoding
de18b14372 feat: add test containers based tests for delete segment from index task (#25564) 2025-09-12 09:33:39 +08:00
Yongtao Huang
a1322ddb5d Fix: correct has_more pagination logic in get_conversational_variable (#25484)
Signed-off-by: Yongtao Huang<yongtaoh2022@gmail.com>
2025-09-12 09:32:22 +08:00
GuanMu
c7868fb176 test: remove print code (#25481)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-12 09:30:56 +08:00
椰子糖
4b6687db6b Fix log time display bug (#25475)
Co-authored-by: wxliqigang <wxliqigang@gfpartner.com.cn>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-09-12 02:46:04 +09:00
-LAN-
462ba354a4 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-12 00:21:06 +08:00
JeeekXY
f1d5bc58b0 fix: app name overflow (#25551)
Co-authored-by: luxiaoyu1 <luxiaoyu1@xiaomi.com>
2025-09-11 21:19:55 +08:00
NeatGuyCoding
99f4cd1cfa feat: add test containers based tests for deal dataset vector index (#25545) 2025-09-11 21:12:53 +08:00
-LAN-
3c668e4a5c fix: update test assertions for ToolProviderApiEntity validation
- Fixed test_repack_provider_entity_no_dark_icon to use empty string instead of None for icon_dark field
- Updated test_builtin_provider_to_user_provider_no_credentials assertion to match actual implementation behavior where masked_credentials always contains empty strings for schema fields
2025-09-11 16:41:10 +08:00
-LAN-
872cff7bab chore(iteration_node): convert some Any to object
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-11 15:40:12 +08:00
-LAN-
8fb69429f9 feat(graph_engine): support parallel mode in iteration node
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-11 15:37:46 +08:00
-LAN-
85064bd8cf Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-11 15:13:31 +08:00
-LAN-
ba5df3612b fix: tests
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-11 15:13:18 +08:00
-LAN-
a923ab1ab8 fix: type errors
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-11 15:01:16 +08:00
QuantumGhost
874406d934 security(api): fix privilege escalation vulnerability in model config and chat message APIs (#25518)
The `ChatMessageApi` (`POST /console/api/apps/{app_id}/chat-messages`) and 
`ModelConfigResource` (`POST /console/api/apps/{app_id}/model-config`) 
endpoints do not properly validate user permissions, allowing users without `editor` 
permission to access restricted functionality.

This PR addresses the issue by adding a proper permission check.
2025-09-11 14:53:35 +08:00
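A simplified sketch of the kind of guard this fix adds; the role names are hypothetical, and the real check is tied to Dify's account and permission model:

class Forbidden(Exception):
    """Raised when the current user may not edit the app."""


EDIT_ROLES = {"owner", "admin", "editor"}


def ensure_can_edit_app(user_role: str) -> None:
    # Reject the request before any model-config or chat-message handling runs.
    if user_role not in EDIT_ROLES:
        raise Forbidden("editor permission required")


ensure_can_edit_app("editor")   # passes
# ensure_can_edit_app("normal") # would raise Forbidden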
Nite Knite
07d067d828 chore: support Zendesk widget (#25517)
2025-09-11 13:17:50 +08:00
Xiyuan Chen
af7f67dc9c Feat/enteprise cd (#25508) 2025-09-10 20:53:42 -07:00
Xiyuan Chen
34e55028ae Feat/enteprise cd (#25485) 2025-09-10 19:01:32 -07:00
-LAN-
b4c1766932 fix: type errors
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-10 21:48:05 +08:00
-LAN-
00a1af8506 refactor(graph_engine): use singledispatch in Node
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-10 20:59:34 +08:00
Eric Guo
70e4d6be34 Fix 500 in dataset page. (#25474)
2025-09-10 15:57:04 +08:00
Wu Tianwei
b690ac4e2a fix: Remove sticky positioning from workflow component fields (#25470) 2025-09-10 15:17:49 +08:00
quicksand
f56fccee9d fix: workflow knowledge query raise error (#25465) 2025-09-10 13:47:47 +08:00
Asuka Minato
cbc0e639e4 update sql in batch (#24801)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2025-09-10 13:00:17 +08:00
Guangdong Liu
b51c724a94 refactor: Migrate part of the console basic API module to Flask-RESTX (#24732)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2025-09-10 12:15:47 +08:00
GuanMu
26a9abef64 test: improve (#25461)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-09-10 11:36:22 +08:00
Will
fecdb9554d fix: inner_api get_user_tenant (#25462) 2025-09-10 11:31:16 +08:00
NeatGuyCoding
45ef177809 Feature add test containers create segment to index task (#25450)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-10 10:02:53 +08:00
Newton José
6574e9f0b2 Fix: Add Password Validation to Account Creation (#25382) 2025-09-10 08:58:39 +08:00
Asuka Minato
cce13750ad add rule for strenum (#25445) 2025-09-10 08:51:21 +08:00
17hz
928bef9d82 fix: improve the condition for stopping the think timer. (#25365) 2025-09-10 08:45:00 +08:00
-LAN-
b6b98a2c8e Merge branch 'feat/dispatch-method' into feat/queue-based-graph-engine 2025-09-10 03:12:59 +08:00
-LAN-
7e69403dda refactor(graph_engine): use singledispatchmethod in event_handler
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-10 03:12:33 +08:00
-LAN-
9796cede72 fix: add missing type field to node configurations in integration tests
- Added 'type' field to all node data configurations in test files
- Fixed test_code.py: added 'type: code' to all code node configs
- Fixed test_http.py: added 'type: http-request' to all HTTP node configs
- Fixed test_template_transform.py: added 'type: template-transform' to template node config
- Fixed test_tool.py: added 'type: tool' to all tool node configs
- Added setup_code_executor_mock fixture to test_execute_code_scientific_notation

These changes fix the ValueError: 'Node X missing or invalid type information' errors
that were occurring due to changes in the node factory validation requirements.
2025-09-10 02:54:01 +08:00
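The quoted error comes from the node factory rejecting configs without a type field; a minimal stand-in for that validation (names are assumptions) looks like:

def require_node_type(node_id: str, config: dict[str, object]) -> str:
    node_type = config.get("type")
    if not isinstance(node_type, str) or not node_type:
        raise ValueError(f"Node {node_id} missing or invalid type information")
    return node_type


# Test fixtures now carry an explicit type, e.g. a code node config:
code_config = {"type": "code", "title": "Run code", "code": "print('hi')"}
assert require_node_type("code_1", code_config) == "code"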
-LAN-
836ed1f380 refactor(graph_engine): Move ErrorHandler into a single file package
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-10 02:35:05 +08:00
-LAN-
80f39963f1 chore: add import lint to CI
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-10 02:32:24 +08:00
-LAN-
9cf2b2b231 fix: type errors
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-10 02:22:58 +08:00
-LAN-
2a97a69825 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-10 02:03:45 +08:00
-LAN-
f17c71e08a refactor(graph_engine): Move GraphStateManager to single file package.
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-10 01:55:30 +08:00
-LAN-
08dd3f7b50 Fix basedpyright type errors (#25435)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-10 01:54:26 +08:00
-LAN-
d52621fce3 refactor(graph_engine): Merge error strategies into error_handler.py
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-10 01:49:46 +08:00
-LAN-
e060d7c28c refactor(graph_engine): remove Optional
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-10 01:49:15 +08:00
-LAN-
ea5dfe41d5 chore: ignore comment
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-10 01:36:11 +08:00
-LAN-
a23c8fcb1a refactor: move execution limits from engine core to layer
Remove max_execution_time and max_execution_steps from ExecutionContext and GraphEngine since these limits are now handled by ExecutionLimitsLayer. This follows the separation of concerns principle by keeping execution limits as a cross-cutting concern handled by layers rather than embedded in core engine components.

Changes:
- Remove max_execution_time and max_execution_steps from ExecutionContext
- Remove these parameters from GraphEngine.__init__()
- Remove max_execution_time from Dispatcher
- Update workflow_entry.py to no longer pass these parameters
- Update all tests to remove these parameters
2025-09-10 01:32:45 +08:00
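A sketch of how a cross-cutting limits layer can enforce what the engine core no longer tracks; the hook name and constructor below are illustrative, not the actual ExecutionLimitsLayer API:

import time


class LimitExceededError(RuntimeError):
    pass


class ExecutionLimitsLayer:
    def __init__(self, max_steps: int, max_seconds: float) -> None:
        self._max_steps = max_steps
        self._max_seconds = max_seconds
        self._steps = 0
        self._started = time.monotonic()

    def on_node_started(self) -> None:
        # Called by the engine for every node; the core itself stays limit-free.
        self._steps += 1
        if self._steps > self._max_steps:
            raise LimitExceededError("max execution steps exceeded")
        if time.monotonic() - self._started > self._max_seconds:
            raise LimitExceededError("max execution time exceeded")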
-LAN-
e0e82fbfaa refactor: extract _run method into smaller focused methods in IterationNode
- Extract iterator variable retrieval and validation logic
- Separate empty iteration handling
- Create dedicated methods for iteration execution and result handling
- Improve type hints and use modern Python syntax
- Enhance code readability and maintainability
2025-09-10 01:15:36 +08:00
-LAN-
1c9f40f92a Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-09 22:16:59 +08:00
-LAN-
6ffa2ebabf feat: improve error handling in graph node creation
- Replace ValueError catch with generic Exception
- Use logger.exception for automatic traceback logging
- Abort on node creation failure instead of continuing
2025-09-09 22:16:42 +08:00
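In spirit, the change amounts to the pattern below (function and factory names are placeholders): catch broadly, record the traceback with logger.exception, and abort instead of building a partial graph:

import logging

logger = logging.getLogger(__name__)


def create_nodes(node_configs, node_factory):
    nodes = []
    for config in node_configs:
        try:
            nodes.append(node_factory(config))
        except Exception:
            # logger.exception attaches the current traceback automatically.
            logger.exception("Failed to create node %r, aborting graph build", config.get("id"))
            raise
    return nodes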
Yongtao Huang
2ac7a9c8fc Chore: thanks to bump-pydantic (#25437)
2025-09-09 20:07:17 +08:00
Novice
240b65b980 fix(mcp): properly handle arrays containing both numbers and strings (#25430)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-09-09 20:06:35 +08:00
-LAN-
95dc1e2fe8 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-09 17:13:16 +08:00
-LAN-
7443c5a6fc refactor: update pyrightconfig to scan all API files (#25429) 2025-09-09 17:12:45 +08:00
GuanMu
a1cf48f84e Add lib test (#25410) 2025-09-09 17:11:49 +08:00
-LAN-
6fe7cf5ebf Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-09 17:11:46 +08:00
-LAN-
e5122945fe Fix: Use --fix flag instead of --fix-only in autofix workflow (#25425) 2025-09-09 17:00:00 +08:00
KVOJJJin
22cd97e2e0 Fix: judgement of open in explore (#25420) 2025-09-09 16:49:22 +08:00
Asuka Minato
38057b1b0e add typing to all wraps (#25405)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-09 16:48:33 +08:00
crazywoola
eb52216a9c Revert "example of remove useEffect" (#25418) 2025-09-09 16:23:44 +08:00
Joel
4c92e63b0b fix: avatar is not updated after being set (#25414) 2025-09-09 16:00:50 +08:00
-LAN-
a1e8ac4c96 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-09 15:49:09 +08:00
XiamuSanhua
ac2aa967c4 feat: change history by supplementary node information (#25294)
Co-authored-by: alleschen <alleschen@tencent.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-09 15:18:42 +08:00
ttz12345
d2e50a508c Fix: error when creating an empty knowledge base via the service_api interface (#25398)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-09 15:18:31 +08:00
Wu Tianwei
37975319f2 feat: Add customized json schema validation (#25408) 2025-09-09 15:15:32 +08:00
Yongtao Huang
4aba570fa8 Fix flask response: 200 -> {}, 200 (#25404) 2025-09-09 15:06:18 +08:00
Novice
e180c19cca fix(mcp): current_user not being set in MCP requests (#25393) 2025-09-09 14:58:14 +08:00
zxhlyh
c595c03452 fix: credential not allow to use in load balancing (#25401) 2025-09-09 14:52:50 +08:00
Xiyuan Chen
64c9a2f678 Feat/credential policy (#25151)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-08 23:45:05 -07:00
Novice
566e0fd3e5 fix(container-test): batch create segment position sort (#25394) 2025-09-09 13:47:29 +08:00
-LAN-
b46858d87d Merge branch 'main' into feat/queue-based-graph-engine 2025-09-09 13:33:17 +08:00
NeatGuyCoding
7dfb72e381 feat: add test containers based tests for clean notion document task (#25385)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-09-09 11:02:19 +08:00
Asuka Minato
649242f82b example of uuid (#25380)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-09 10:45:08 +08:00
yinyu
cf1ee3162f Support Anchor Scroll In The Output Node (#25364)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-09 10:35:07 +08:00
NeatGuyCoding
bf6485fab4 minor fix: some translation mismatch (#25386) 2025-09-09 10:30:04 +08:00
Yeuoly
720ecea737 fix: tenant_id was not specified when retrieving the end-user in plugin backwards-invocation wraps (#25377)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-09-09 09:49:35 +08:00
HuDenghui
d5e86d9180 fix: Fixed the X-axis scroll bar issue in the LLM node settings panel (#25357) 2025-09-09 09:47:27 +08:00
Yongtao Huang
cab1272bb1 Fix: use correct maxLength prop for verification code input (#25371)
Signed-off-by: Yongtao Huang <yongtaoh2022@gmail.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-09-08 20:44:48 +08:00
Matri Qi
563a5af9e7 Fix/disable no constant binary expression (#25311)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-08 20:44:20 +08:00
-LAN-
5ab6838849 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-08 19:55:43 +08:00
-LAN-
ec0800eb1a refactor: update pyrightconfig.json to use ignore field for better type checking configuration (#25373) 2025-09-08 19:55:25 +08:00
zyssyz123
ea61420441 Revert "feat: email register refactor" (#25367) 2025-09-08 19:20:09 +08:00
kenwoodjw
598ec07c91 feat: enable dsl export encrypt dataset id or not (#25102)
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
2025-09-08 18:03:24 +08:00
Debin.Meng
a932413314 fix: Incorrect URL Parameter Parsing Causes user_id Retrieval Error (#25261)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-08 18:00:33 +08:00
NeatGuyCoding
aff2482436 Feature add test containers batch create segment to index (#25306)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-08 17:55:57 +08:00
zyssyz123
860ee20c71 feat: email register refactor (#25344)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-08 17:51:43 +08:00
-LAN-
ef974e484b fix: handle None env vars
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-08 16:43:47 +08:00
Krito.
74be2087b5 fix: ensure Performance Tracing button visible when no tracing provid… (#25351) 2025-09-08 16:38:09 +08:00
github-actions[bot]
57f1822213 chore: translate i18n files and update type definitions (#25349)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-09-08 16:37:20 +08:00
Yongtao Huang
cdfdf324e8 Minor fix: correct PrecessRule typo (#25346) 2025-09-08 15:08:56 +08:00
Cluas
f891c67eca feat: add MCP server headers support #22718 (#24760)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Novice <novice12185727@gmail.com>
2025-09-08 14:10:55 +08:00
-LAN-
299141ae01 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-08 13:56:45 +08:00
NeatGuyCoding
5d0a50042f feat: add test containers based tests for clean dataset task (#25341)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-08 13:09:53 +08:00
-LAN-
cc1d437dc1 fix: correct indentation in TokenBufferMemory get_history_prompt_messages method 2025-09-07 12:48:50 +08:00
-LAN-
7aef0b54e5 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-07 12:34:54 +08:00
-LAN-
3c28936796 fix: test
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-06 16:21:28 +08:00
-LAN-
81fdc7c54b fix: type errors
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-06 16:09:59 +08:00
-LAN-
abb53f11ad Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-06 16:05:13 +08:00
-LAN-
d9aa0ec046 fix: resolve mypy type errors in http_request and list_operator nodes
- Fix str | bytes union type handling in http_request executor
- Add type guard for boolean filter value in list_operator node
2025-09-05 21:17:18 +08:00
-LAN-
6c3302a192 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-05 21:13:07 +08:00
-LAN-
7ba1f0a046 chore: improve typing
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-05 20:57:11 +08:00
-LAN-
2adf5d0eee docs: remove outdated document 2025-09-05 02:09:53 +08:00
-LAN-
103a9a4e67 fix(graph_engine): add type hint for workers_to_remove 2025-09-05 01:59:11 +08:00
-LAN-
15b3443e9e fix(debug_logging_layer): remove access for variable pool 2025-09-05 01:52:19 +08:00
-LAN-
81e9d6f63a fix: correct type checking for None values in code node output validation
- Fixed isinstance() checks to properly handle None values by checking None separately
- Fixed typo in STRING type validation where 'output_name' was hardcoded as string instead of variable
- Updated error message format to be consistent and more informative
- Updated test assertion to match new error message format
2025-09-04 20:39:37 +08:00
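A small sketch of the corrected validation: None is rejected explicitly and the actual output name is interpolated into the message (the real validator covers more types):

def check_string_output(output_name: str, value: object) -> str:
    if value is None:
        raise ValueError(f"Output variable `{output_name}` must not be None")
    if not isinstance(value, str):
        raise ValueError(
            f"Output variable `{output_name}` must be a string, got {type(value).__name__}"
        )
    return value


assert check_string_output("result", "ok") == "ok"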
-LAN-
9c2943183e test: fix code node
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-04 20:17:28 +08:00
-LAN-
f6a2a09815 test: fix code node
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-04 20:04:29 +08:00
-LAN-
e229510e73 perf: eliminate lock contention in worker pool by removing callbacks
Remove worker idle/active callbacks that caused severe lock contention.
Instead, use sampling-based monitoring where worker states are queried
on-demand during scaling decisions. This eliminates the performance
bottleneck caused by workers acquiring locks 10+ times per second.

Changes:
- Remove callback parameters from Worker class
- Add properties to expose worker idle state directly
- Update WorkerPool to query worker states without callbacks
- Maintain scaling functionality with better performance
2025-09-04 19:37:31 +08:00
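A condensed sketch of the sampling approach described above, with illustrative names: workers expose an idle property and the pool inspects it only when making a scaling decision, so no locks are taken on the hot path:

import threading
import time


class Worker(threading.Thread):
    def __init__(self) -> None:
        super().__init__(daemon=True)
        self._last_active = time.monotonic()

    def mark_active(self) -> None:
        self._last_active = time.monotonic()

    @property
    def idle_seconds(self) -> float:
        # Read on demand during scaling; no callback, no lock contention.
        return time.monotonic() - self._last_active


def workers_to_remove(workers: list[Worker], idle_threshold: float) -> list[Worker]:
    return [w for w in workers if w.idle_seconds > idle_threshold]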
-LAN-
36048d1526 feat(graph_engine): allow to scale down without lock
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-04 19:32:07 +08:00
-LAN-
aff7ca12b8 fix(code_node): type checking bypass
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-04 19:25:08 +08:00
-LAN-
ad9eed2551 fix: disable scale for performance
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-04 19:11:22 +08:00
-LAN-
07109846e0 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-04 17:48:08 +08:00
-LAN-
2aeaefccec test: fix test 2025-09-04 17:47:36 +08:00
-LAN-
4d63bd2083 refactor(graph_engine): rename SimpleWorkerPool to WorkerPool 2025-09-04 17:47:13 +08:00
-LAN-
226f14a20f feat(graph_engine): implement scale down worker
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-04 15:35:20 +08:00
autofix-ci[bot]
2b28aed4e2 [autofix.ci] apply automated fixes 2025-09-04 04:50:21 +00:00
-LAN-
938a845852 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-04 12:48:58 +08:00
-LAN-
ead8568bfc fix: some errors reported by basedpyright
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-04 11:58:54 +08:00
-LAN-
ed22d04ea0 test: remove outdated test case 2025-09-04 02:42:36 +08:00
-LAN-
04bbf540d9 chore: code format
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-04 02:33:53 +08:00
-LAN-
657c27ec75 feat(graph_engine): make runtime state read-only in layer
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-04 02:30:40 +08:00
-LAN-
16e9cd5ac5 feat(graph_runtime_state): prevent setting the variable pool after initialization. 2025-09-04 02:20:19 +08:00
-LAN-
61c79b0013 test: correct imported name 2025-09-04 02:15:46 +08:00
-LAN-
8332472944 refactor(graph_engine): rename Layer to GraphEngineLayer
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-04 02:11:31 +08:00
-LAN-
fe3f03e50a feat: add property-based access control to GraphRuntimeState
- Replace direct field access with private attributes and property decorators
- Implement deep copy protection for mutable objects (dict, LLMUsage)
- Add helper methods: set_output(), get_output(), update_outputs()
- Add increment_node_run_steps() and add_tokens() convenience methods
- Update loop_node and event_handlers to use new accessor methods
- Add comprehensive unit tests for immutability and validation
- Ensure backward compatibility with existing property access patterns
2025-09-04 02:08:58 +08:00
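A reduced sketch of the property-guarded state described above; the real GraphRuntimeState has more fields, but the pattern is private attributes plus copy-on-read accessors:

from copy import deepcopy


class GraphRuntimeState:
    def __init__(self) -> None:
        self._outputs: dict[str, object] = {}
        self._total_tokens = 0

    @property
    def outputs(self) -> dict[str, object]:
        # Hand back a deep copy so callers cannot mutate internal state in place.
        return deepcopy(self._outputs)

    def set_output(self, key: str, value: object) -> None:
        self._outputs[key] = deepcopy(value)

    def add_tokens(self, tokens: int) -> None:
        if tokens < 0:
            raise ValueError("tokens must be non-negative")
        self._total_tokens += tokens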
-LAN-
9c96b23d55 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-04 00:27:08 +08:00
-LAN-
8c97937cae Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-03 13:53:43 +08:00
-LAN-
f6acff4cce chore: remove unused variables
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-03 12:12:27 +08:00
-LAN-
3fa48cb5cf chore: remove ty-check from Python style check.
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-03 12:05:41 +08:00
-LAN-
b81745aed8 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-03 11:56:05 +08:00
-LAN-
8c41d95d03 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-03 11:06:42 +08:00
-LAN-
9d004a0971 test: fix test
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-03 02:11:37 +08:00
autofix-ci[bot]
02fcd08c08 [autofix.ci] apply automated fixes 2025-09-02 17:34:07 +00:00
-LAN-
77a9a73d0d Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-03 01:33:17 +08:00
-LAN-
1770b93e5b chore(graph_engine): Add a TODO comment in _update_response_outputs in event_handlers
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-02 15:20:03 +08:00
-LAN-
d8ff4aa9ba feat(graph_engine): Handle NodeRunAgentLogEvent
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-02 15:02:07 +08:00
-LAN-
9f8f21bf87 chore: remove backup files
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-02 15:01:58 +08:00
-LAN-
0b0dc63f29 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-02 11:52:25 +08:00
-LAN-
8433cf4437 refactor(graph_engine): Merge event_collector and event_emitter into event_manager
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-01 13:15:58 +08:00
-LAN-
bb5d52539c refactor(graph_engine): Merge branch_handler into edge_processor
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-01 12:53:06 +08:00
-LAN-
88622f70fb refactor(graph_engine): Move setup methods into __init__
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-01 12:08:03 +08:00
-LAN-
0fdb1b2bc9 refactor(graph_engine): Correct private attributes and private methods naming
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-01 04:37:23 +08:00
-LAN-
a5cb9d2b73 refactor(graph_engine): inline output_registry into response_coordinator
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-01 03:59:53 +08:00
-LAN-
64c1234724 refactor(graph_engine): Merge worker management into one WorkerPool
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-01 03:23:47 +08:00
-LAN-
202fdfcb81 refactor(graph_engine): Remove backward compatibility code
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-01 02:41:16 +08:00
-LAN-
e2f4c9ba8d refactor(graph_engine): Merge state managers into unified_state_manager
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-09-01 02:08:08 +08:00
-LAN-
546d75d84d Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-09-01 00:29:28 +08:00
-LAN-
a8fe4ea802 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-08-30 16:36:10 +08:00
-LAN-
82193580de chore: improve typing
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-30 16:35:57 +08:00
-LAN-
1fd27cf3ad Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-08-30 00:13:45 +08:00
-LAN-
11d32ca87d test: fix web test
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-29 23:20:28 +08:00
-LAN-
5415d0c6d1 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-08-29 23:17:30 +08:00
-LAN-
d8af8ae4e6 fix: update workflow service tests for new graph engine
- Update method calls from _handle_node_run_result to _handle_single_step_result
- Add required fields (id, node_id, node_type, start_at) to graph events
- Use proper NodeType enum values instead of strings
- Fix imports to use correct modules (Node instead of BaseNode)
- Ensure event generators return proper generator objects

These tests were failing because the internal implementation changed
with the new graph engine architecture.
2025-08-29 23:04:33 +08:00
-LAN-
04e5d4692f Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-08-29 22:34:47 +08:00
-LAN-
3aa48efd0a test(test_workflow_service): Use new engine's method.
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-29 22:06:10 +08:00
-LAN-
8eb78c04b2 chore(token_buffer_memory): code format
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-29 17:02:51 +08:00
-LAN-
22ee318cf8 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-08-29 17:01:42 +08:00
-LAN-
f2bc4f5d87 fix: resolve type error in node_factory by using type guard for node_type_str 2025-08-29 16:16:58 +08:00
-LAN-
d7d456349d Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-08-29 16:14:04 +08:00
-LAN-
dce4d0ff80 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-08-29 13:22:13 +08:00
-LAN-
3dee8064ba feat: enhance typing 2025-08-29 13:17:02 +08:00
-LAN-
bfbb36756a feat(graph_engine): Add NodeExecutionType.ROOT and auto mark skipped in Graph.init
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 16:41:51 +08:00
-LAN-
d7e0c5f759 chore: use 'XXX | None' instead of Optional[XXX] in graph.py 2025-08-28 15:45:22 +08:00
-LAN-
c396788128 chore(graph_engine): add final mark to classes
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 15:38:35 +08:00
-LAN-
e3a7b1f691 fix: type hints
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 05:24:18 +08:00
-LAN-
8aab7f49c3 chore(graph_engine): Use XXX | None instead of Optional[XXX] 2025-08-28 05:09:33 +08:00
autofix-ci[bot]
1e12c1cbf2 [autofix.ci] apply automated fixes 2025-08-27 21:00:36 +00:00
-LAN-
affedd6ce4 chore(graph_engine): Use XXX | None instead of Optional[XXX] 2025-08-28 04:59:49 +08:00
-LAN-
ef21097774 refactor(graph_engine): Remove unnecessary check from SkipPropagator
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 04:45:26 +08:00
-LAN-
1d377fe994 refactor(graph_engine): Use _ to mark unused variable in BranchHandler
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 04:44:45 +08:00
-LAN-
c82697f267 refactor(graph_engine): Remove node_id from SkipPropagator.skip_branch_paths
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 04:43:56 +08:00
-LAN-
98b25c0bbc refactor(graph_engine): Convert attrs to private in error_handler
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 04:42:37 +08:00
-LAN-
1cd0792606 chore(graph_events): Improve type hints
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 04:41:48 +08:00
-LAN-
7cbf4093f4 chore(graph_engine): Use TYPE | None instead of Optional
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 04:30:50 +08:00
-LAN-
8129ca7c05 chore(graph_engine): Move error_strategy.py to protocols/
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 04:29:32 +08:00
-LAN-
65617f000d feat(event_collector): Update to use ReadWriteLock 2025-08-28 03:26:42 +08:00
-LAN-
635eff2e25 test(graph_engine): remove outdated tests
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 02:53:19 +08:00
-LAN-
55085a9ca2 chore(graph_engine): add type hint for event_queue
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-28 02:38:56 +08:00
-LAN-
9dc1e9724e Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-08-28 02:26:40 +08:00
-LAN-
c3f66e2901 Merge remote-tracking branch 'origin/main' into feat/queue-based-graph-engine 2025-08-27 18:05:35 +08:00
autofix-ci[bot]
86e7cb713c [autofix.ci] apply automated fixes 2025-08-27 07:38:26 +00:00
-LAN-
0f29244459 fix: test
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-27 15:37:37 +08:00
autofix-ci[bot]
48cbf4c78f [autofix.ci] apply automated fixes 2025-08-27 15:33:30 +08:00
-LAN-
8c35663220 feat: queue-based graph engine
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-08-27 15:33:28 +08:00
1105 changed files with 49202 additions and 18038 deletions

12
.github/dependabot.yml vendored Normal file

@@ -0,0 +1,12 @@
version: 2
updates:
- package-ecosystem: "npm"
directory: "/web"
schedule:
interval: "weekly"
open-pull-requests-limit: 2
- package-ecosystem: "uv"
directory: "/api"
schedule:
interval: "weekly"
open-pull-requests-limit: 2


@@ -20,14 +20,60 @@ jobs:
cd api
uv sync --dev
# Fix lint errors
-uv run ruff check --fix-only .
+uv run ruff check --fix .
# Format code
-uv run ruff format .
+uv run ruff format ..
- name: ast-grep
run: |
uvx --from ast-grep-cli sg --pattern 'db.session.query($WHATEVER).filter($HERE)' --rewrite 'db.session.query($WHATEVER).where($HERE)' -l py --update-all
uvx --from ast-grep-cli sg --pattern 'session.query($WHATEVER).filter($HERE)' --rewrite 'session.query($WHATEVER).where($HERE)' -l py --update-all
+# Convert Optional[T] to T | None (ignoring quoted types)
+cat > /tmp/optional-rule.yml << 'EOF'
+id: convert-optional-to-union
+language: python
+rule:
+kind: generic_type
+all:
+- has:
+kind: identifier
+pattern: Optional
+- has:
+kind: type_parameter
+has:
+kind: type
+pattern: $T
+fix: $T | None
+EOF
+uvx --from ast-grep-cli sg scan --inline-rules "$(cat /tmp/optional-rule.yml)" --update-all
+# Fix forward references that were incorrectly converted (Python doesn't support "Type" | None syntax)
+find . -name "*.py" -type f -exec sed -i.bak -E 's/"([^"]+)" \| None/Optional["\1"]/g; s/'"'"'([^'"'"']+)'"'"' \| None/Optional['"'"'\1'"'"']/g' {} \;
+find . -name "*.py.bak" -type f -delete
- name: mdformat
run: |
uvx mdformat .
+- name: Install pnpm
+uses: pnpm/action-setup@v4
+with:
+package_json_file: web/package.json
+run_install: false
+- name: Setup NodeJS
+uses: actions/setup-node@v4
+with:
+node-version: 22
+cache: pnpm
+cache-dependency-path: ./web/package.json
+- name: Web dependencies
+working-directory: ./web
+run: pnpm install --frozen-lockfile
+- name: oxlint
+working-directory: ./web
+run: |
+pnpx oxlint --fix
- uses: autofix-ci/action@635ffb0c9798bd160680f18fd73371e355b85f27
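The ast-grep rule added above rewrites `Optional[T]` annotations into PEP 604 unions, and the follow-up `sed` pass reverts any quoted forward references, since a string type inside `"X" | None` fails if the annotation is ever evaluated (for example by pydantic). A minimal before/after sketch of the transformation, using made-up function names:

```python
from typing import Optional


# Before the rewrite: typing.Optional spellings.
def read_timeout(value: Optional[float]) -> Optional[float]:
    return value


# After the rewrite: plain types become PEP 604 unions.
def read_timeout_new(value: float | None) -> float | None:
    return value


# Quoted forward references are kept as Optional[...] by the sed fix-up,
# because "SomeModel" | None cannot be evaluated at runtime.
def load_model(name: str) -> Optional["SomeModel"]:
    ...
```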


@@ -19,11 +19,23 @@ jobs:
github.event.workflow_run.head_branch == 'deploy/enterprise'
steps:
-- name: Deploy to server
-uses: appleboy/ssh-action@v0.1.8
-with:
-host: ${{ secrets.ENTERPRISE_SSH_HOST }}
-username: ${{ secrets.ENTERPRISE_SSH_USER }}
-password: ${{ secrets.ENTERPRISE_SSH_PASSWORD }}
-script: |
-${{ vars.ENTERPRISE_SSH_SCRIPT || secrets.ENTERPRISE_SSH_SCRIPT }}
+- name: trigger deployments
+env:
+DEV_ENV_ADDRS: ${{ vars.DEV_ENV_ADDRS }}
+DEPLOY_SECRET: ${{ secrets.DEPLOY_SECRET }}
+run: |
+IFS=',' read -ra ENDPOINTS <<< "${DEV_ENV_ADDRS:-}"
+BODY='{"project":"dify-api","tag":"deploy-enterprise"}'
+for ENDPOINT in "${ENDPOINTS[@]}"; do
+ENDPOINT="$(echo "$ENDPOINT" | xargs)"
+[ -z "$ENDPOINT" ] && continue
+API_SIGNATURE=$(printf '%s' "$BODY" | openssl dgst -sha256 -hmac "$DEPLOY_SECRET" | awk '{print "sha256="$2}')
+curl -sSf -X POST \
+-H "Content-Type: application/json" \
+-H "X-Hub-Signature-256: $API_SIGNATURE" \
+-d "$BODY" \
+"$ENDPOINT"
+done
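The replacement deploy step posts a signed JSON body to each endpoint, with an HMAC-SHA256 digest of the body carried in the `X-Hub-Signature-256` header (the same scheme GitHub webhooks use). A rough sketch of what a receiving endpoint would do to verify it; the Flask handler below is hypothetical and not part of this PR:

```python
import hashlib
import hmac
import os

from flask import Flask, abort, request

app = Flask(__name__)
DEPLOY_SECRET = os.environ["DEPLOY_SECRET"]  # shared secret, same value the workflow uses


@app.post("/deploy")
def deploy():
    # Recompute the signature over the raw request body and compare in constant time.
    expected = "sha256=" + hmac.new(
        DEPLOY_SECRET.encode(), request.get_data(), hashlib.sha256
    ).hexdigest()
    if not hmac.compare_digest(expected, request.headers.get("X-Hub-Signature-256", "")):
        abort(401)
    payload = request.get_json()  # e.g. {"project": "dify-api", "tag": "deploy-enterprise"}
    # ... trigger the actual deployment for payload["project"] here ...
    return {"status": "accepted"}
```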


@@ -12,7 +12,6 @@ permissions:
statuses: write
contents: read
jobs:
python-style:
name: Python Style
@@ -44,6 +43,10 @@ jobs:
if: steps.changed-files.outputs.any_changed == 'true'
run: uv sync --project api --dev
+- name: Run Import Linter
+if: steps.changed-files.outputs.any_changed == 'true'
+run: uv run --directory api --dev lint-imports
- name: Run Basedpyright Checks
if: steps.changed-files.outputs.any_changed == 'true'
run: dev/basedpyright-check

4
.gitignore vendored

@@ -227,3 +227,7 @@ web/public/fallback-*.js
.roo/
api/.env.backup
/clickzetta
+# Benchmark
+scripts/stress-test/setup/config/
+scripts/stress-test/reports/


@@ -1 +0,0 @@
CLAUDE.md

87
AGENTS.md Normal file

@@ -0,0 +1,87 @@
# AGENTS.md
## Project Overview
Dify is an open-source platform for developing LLM applications with an intuitive interface combining agentic AI workflows, RAG pipelines, agent capabilities, and model management.
The codebase consists of:
- **Backend API** (`/api`): Python Flask application with Domain-Driven Design architecture
- **Frontend Web** (`/web`): Next.js 15 application with TypeScript and React 19
- **Docker deployment** (`/docker`): Containerized deployment configurations
## Development Commands
### Backend (API)
All Python commands must be prefixed with `uv run --project api`:
```bash
# Start development servers
./dev/start-api # Start API server
./dev/start-worker # Start Celery worker
# Run tests
uv run --project api pytest # Run all tests
uv run --project api pytest tests/unit_tests/ # Unit tests only
uv run --project api pytest tests/integration_tests/ # Integration tests
# Code quality
./dev/reformat # Run all formatters and linters
uv run --project api ruff check --fix ./ # Fix linting issues
uv run --project api ruff format ./ # Format code
uv run --directory api basedpyright # Type checking
```
### Frontend (Web)
```bash
cd web
pnpm lint # Run ESLint
pnpm eslint-fix # Fix ESLint issues
pnpm test # Run Jest tests
```
## Testing Guidelines
### Backend Testing
- Use `pytest` for all backend tests
- Write tests first (TDD approach)
- Test structure: Arrange-Act-Assert
## Code Style Requirements
### Python
- Use type hints for all functions and class attributes
- No `Any` types unless absolutely necessary
- Implement special methods (`__repr__`, `__str__`) appropriately
### TypeScript/JavaScript
- Strict TypeScript configuration
- ESLint with Prettier integration
- Avoid `any` type
## Important Notes
- **Environment Variables**: Always use UV for Python commands: `uv run --project api <command>`
- **Comments**: Only write meaningful comments that explain "why", not "what"
- **File Creation**: Always prefer editing existing files over creating new ones
- **Documentation**: Don't create documentation files unless explicitly requested
- **Code Quality**: Always run `./dev/reformat` before committing backend changes
## Common Development Tasks
### Adding a New API Endpoint
1. Create controller in `/api/controllers/`
1. Add service logic in `/api/services/`
1. Update routes in controller's `__init__.py`
1. Write tests in `/api/tests/`
## Project-Specific Conventions
- All async tasks use Celery with Redis as broker
- **Internationalization**: Frontend supports multiple languages with English (`web/i18n/en-US/`) as the source. All user-facing text must use i18n keys, no hardcoded strings. Edit corresponding module files in `en-US/` directory for translations.


@@ -1,89 +0,0 @@
# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Project Overview
Dify is an open-source platform for developing LLM applications with an intuitive interface combining agentic AI workflows, RAG pipelines, agent capabilities, and model management.
The codebase consists of:
- **Backend API** (`/api`): Python Flask application with Domain-Driven Design architecture
- **Frontend Web** (`/web`): Next.js 15 application with TypeScript and React 19
- **Docker deployment** (`/docker`): Containerized deployment configurations
## Development Commands
### Backend (API)
All Python commands must be prefixed with `uv run --project api`:
```bash
# Start development servers
./dev/start-api # Start API server
./dev/start-worker # Start Celery worker
# Run tests
uv run --project api pytest # Run all tests
uv run --project api pytest tests/unit_tests/ # Unit tests only
uv run --project api pytest tests/integration_tests/ # Integration tests
# Code quality
./dev/reformat # Run all formatters and linters
uv run --project api ruff check --fix ./ # Fix linting issues
uv run --project api ruff format ./ # Format code
uv run --directory api basedpyright # Type checking
```
### Frontend (Web)
```bash
cd web
pnpm lint # Run ESLint
pnpm eslint-fix # Fix ESLint issues
pnpm test # Run Jest tests
```
## Testing Guidelines
### Backend Testing
- Use `pytest` for all backend tests
- Write tests first (TDD approach)
- Test structure: Arrange-Act-Assert
## Code Style Requirements
### Python
- Use type hints for all functions and class attributes
- No `Any` types unless absolutely necessary
- Implement special methods (`__repr__`, `__str__`) appropriately
### TypeScript/JavaScript
- Strict TypeScript configuration
- ESLint with Prettier integration
- Avoid `any` type
## Important Notes
- **Environment Variables**: Always use UV for Python commands: `uv run --project api <command>`
- **Comments**: Only write meaningful comments that explain "why", not "what"
- **File Creation**: Always prefer editing existing files over creating new ones
- **Documentation**: Don't create documentation files unless explicitly requested
- **Code Quality**: Always run `./dev/reformat` before committing backend changes
## Common Development Tasks
### Adding a New API Endpoint
1. Create controller in `/api/controllers/`
1. Add service logic in `/api/services/`
1. Update routes in controller's `__init__.py`
1. Write tests in `/api/tests/`
## Project-Specific Conventions
- All async tasks use Celery with Redis as broker
- **Internationalization**: Frontend supports multiple languages with English (`web/i18n/en-US/`) as the source. All user-facing text must use i18n keys, no hardcoded strings. Edit corresponding module files in `en-US/` directory for translations.

1
CLAUDE.md Symbolic link

@@ -0,0 +1 @@
AGENTS.md


@@ -4,10 +4,13 @@ WEB_IMAGE=$(DOCKER_REGISTRY)/dify-web
API_IMAGE=$(DOCKER_REGISTRY)/dify-api
VERSION=latest
+# Default target - show help
+.DEFAULT_GOAL := help
# Backend Development Environment Setup
.PHONY: dev-setup prepare-docker prepare-web prepare-api
-# Default dev setup target
+# Dev setup target
dev-setup: prepare-docker prepare-web prepare-api
@echo "✅ Backend development environment setup complete!"
@@ -46,6 +49,27 @@ dev-clean:
@rm -rf api/storage
@echo "✅ Cleanup complete"
+# Backend Code Quality Commands
+format:
+@echo "🎨 Running ruff format..."
+@uv run --project api --dev ruff format ./api
+@echo "✅ Code formatting complete"
+check:
+@echo "🔍 Running ruff check..."
+@uv run --project api --dev ruff check ./api
+@echo "✅ Code check complete"
+lint:
+@echo "🔧 Running ruff format and check with fixes..."
+@uv run --directory api --dev sh -c 'ruff format ./api && ruff check --fix ./api'
+@echo "✅ Linting complete"
+type-check:
+@echo "📝 Running type check with basedpyright..."
+@uv run --directory api --dev basedpyright
+@echo "✅ Type check complete"
# Build Docker images
build-web:
@echo "Building web Docker image: $(WEB_IMAGE):$(VERSION)..."
@@ -90,6 +114,12 @@ help:
@echo " make prepare-api - Set up API environment"
@echo " make dev-clean - Stop Docker middleware containers"
@echo ""
+@echo "Backend Code Quality:"
+@echo " make format - Format code with ruff"
+@echo " make check - Check code with ruff"
+@echo " make lint - Format and fix code with ruff"
+@echo " make type-check - Run type checking with basedpyright"
+@echo ""
@echo "Docker Build Targets:"
@echo " make build-web - Build web Docker image"
@echo " make build-api - Build API Docker image"
@@ -98,4 +128,4 @@ help:
@echo " make build-push-all - Build and push all Docker images"
# Phony targets
-.PHONY: build-web build-api push-web push-api build-all push-all build-push-all dev-setup prepare-docker prepare-web prepare-api dev-clean help
+.PHONY: build-web build-api push-web push-api build-all push-all build-push-all dev-setup prepare-docker prepare-web prepare-api dev-clean help format check lint type-check


@@ -328,7 +328,7 @@ MATRIXONE_DATABASE=dify
LINDORM_URL=http://ld-*******************-proxy-search-pub.lindorm.aliyuncs.com:30070
LINDORM_USERNAME=admin
LINDORM_PASSWORD=admin
-USING_UGC_INDEX=False
+LINDORM_USING_UGC=True
LINDORM_QUERY_TIMEOUT=1
# OceanBase Vector configuration
@@ -461,6 +461,16 @@ WORKFLOW_CALL_MAX_DEPTH=5
WORKFLOW_PARALLEL_DEPTH_LIMIT=3
MAX_VARIABLE_SIZE=204800
+# GraphEngine Worker Pool Configuration
+# Minimum number of workers per GraphEngine instance (default: 1)
+GRAPH_ENGINE_MIN_WORKERS=1
+# Maximum number of workers per GraphEngine instance (default: 10)
+GRAPH_ENGINE_MAX_WORKERS=10
+# Queue depth threshold that triggers worker scale up (default: 3)
+GRAPH_ENGINE_SCALE_UP_THRESHOLD=3
+# Seconds of idle time before scaling down workers (default: 5.0)
+GRAPH_ENGINE_SCALE_DOWN_IDLE_TIME=5.0
# Workflow storage configuration
# Options: rdbms, hybrid
# rdbms: Use only the relational database (default)
@@ -530,6 +540,7 @@ ENDPOINT_URL_TEMPLATE=http://localhost:5002/e/{hook_id}
# Reset password token expiry minutes
RESET_PASSWORD_TOKEN_EXPIRY_MINUTES=5
+EMAIL_REGISTER_TOKEN_EXPIRY_MINUTES=5
CHANGE_EMAIL_TOKEN_EXPIRY_MINUTES=5
OWNER_TRANSFER_TOKEN_EXPIRY_MINUTES=5
@@ -569,3 +580,7 @@ QUEUE_MONITOR_INTERVAL=30
# Swagger UI configuration
SWAGGER_UI_ENABLED=true
SWAGGER_UI_PATH=/swagger-ui.html
+# Whether to encrypt dataset IDs when exporting DSL files (default: true)
+# Set to false to export dataset IDs as plain text for easier cross-environment import
+DSL_EXPORT_ENCRYPT_DATASET_ID=true
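Taken together, the four `GRAPH_ENGINE_*` variables describe a worker pool that adds a worker when the ready queue backs up past the threshold and removes one after a stretch of idleness, bounded by the minimum and maximum counts. A simplified, illustrative policy under these defaults; the engine's actual implementation is not shown in this diff:

```python
import time
from dataclasses import dataclass, field


@dataclass
class WorkerPoolPolicy:
    min_workers: int = 1                 # GRAPH_ENGINE_MIN_WORKERS
    max_workers: int = 10                # GRAPH_ENGINE_MAX_WORKERS
    scale_up_threshold: int = 3          # GRAPH_ENGINE_SCALE_UP_THRESHOLD
    scale_down_idle_time: float = 5.0    # GRAPH_ENGINE_SCALE_DOWN_IDLE_TIME
    workers: int = 1
    _idle_since: float | None = field(default=None, repr=False)

    def adjust(self, queue_depth: int) -> int:
        """Return the worker count after reacting to the current queue depth."""
        now = time.monotonic()
        if queue_depth >= self.scale_up_threshold and self.workers < self.max_workers:
            self.workers += 1
            self._idle_since = None
        elif queue_depth == 0:
            if self._idle_since is None:
                self._idle_since = now
            elif now - self._idle_since >= self.scale_down_idle_time and self.workers > self.min_workers:
                self.workers -= 1
                self._idle_since = now
        else:
            self._idle_since = None
        return self.workers
```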

105
api/.importlinter Normal file

@@ -0,0 +1,105 @@
[importlinter]
root_packages =
core
configs
controllers
models
tasks
services
[importlinter:contract:workflow]
name = Workflow
type=layers
layers =
graph_engine
graph_events
graph
nodes
node_events
entities
containers =
core.workflow
ignore_imports =
core.workflow.nodes.base.node -> core.workflow.graph_events
core.workflow.nodes.iteration.iteration_node -> core.workflow.graph_events
core.workflow.nodes.loop.loop_node -> core.workflow.graph_events
core.workflow.nodes.node_factory -> core.workflow.graph
core.workflow.nodes.iteration.iteration_node -> core.workflow.graph_engine
core.workflow.nodes.iteration.iteration_node -> core.workflow.graph
core.workflow.nodes.iteration.iteration_node -> core.workflow.graph_engine.command_channels
core.workflow.nodes.loop.loop_node -> core.workflow.graph_engine
core.workflow.nodes.loop.loop_node -> core.workflow.graph
core.workflow.nodes.loop.loop_node -> core.workflow.graph_engine.command_channels
[importlinter:contract:rsc]
name = RSC
type = layers
layers =
graph_engine
response_coordinator
containers =
core.workflow.graph_engine
[importlinter:contract:worker]
name = Worker
type = layers
layers =
graph_engine
worker
containers =
core.workflow.graph_engine
[importlinter:contract:graph-engine-architecture]
name = Graph Engine Architecture
type = layers
layers =
graph_engine
orchestration
command_processing
event_management
error_handler
graph_traversal
graph_state_manager
worker_management
domain
containers =
core.workflow.graph_engine
[importlinter:contract:domain-isolation]
name = Domain Model Isolation
type = forbidden
source_modules =
core.workflow.graph_engine.domain
forbidden_modules =
core.workflow.graph_engine.worker_management
core.workflow.graph_engine.command_channels
core.workflow.graph_engine.layers
core.workflow.graph_engine.protocols
[importlinter:contract:worker-management]
name = Worker Management
type = forbidden
source_modules =
core.workflow.graph_engine.worker_management
forbidden_modules =
core.workflow.graph_engine.orchestration
core.workflow.graph_engine.command_processing
core.workflow.graph_engine.event_management
[importlinter:contract:graph-traversal-components]
name = Graph Traversal Components
type = layers
layers =
edge_processor
skip_propagator
containers =
core.workflow.graph_engine.graph_traversal
[importlinter:contract:command-channels]
name = Command Channels Independence
type = independence
modules =
core.workflow.graph_engine.command_channels.in_memory_channel
core.workflow.graph_engine.command_channels.redis_channel
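In the layers contracts above, modules listed higher may import modules listed lower, but not the other way around; `lint-imports` (wired into CI earlier in this diff) fails on violations that are not listed under `ignore_imports`. A hypothetical example of the direction the `Workflow` contract forbids:

```python
# core/workflow/entities/example.py (hypothetical module in the lowest "entities" layer)

# Fine: the lowest layer can depend on the standard library or on itself.
from dataclasses import dataclass


@dataclass
class ExampleEntity:
    name: str


# Forbidden by the "Workflow" layers contract: a lower layer importing upward
# into graph_engine would be reported by lint-imports.
# from core.workflow.graph_engine import GraphEngine
```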


@@ -5,7 +5,7 @@ line-length = 120
quote-style = "double"
[lint]
-preview = false
+preview = true
select = [
"B", # flake8-bugbear rules
"C4", # flake8-comprehensions
@@ -45,6 +45,7 @@ select = [
"G001", # don't use str format to logging messages
"G003", # don't use + in logging messages
"G004", # don't use f-strings to format logging messages
+"UP042", # use StrEnum
]
ignore = [
@@ -64,6 +65,7 @@ ignore = [
"B006", # mutable-argument-default
"B007", # unused-loop-control-variable
"B026", # star-arg-unpacking-after-keyword-arg
+"B901", # allow return in yield
"B903", # class-as-data-structure
"B904", # raise-without-from-inside-except
"B905", # zip-without-explicit-strict


@@ -1,8 +1,9 @@
import base64
import json
import logging
+import operator
import secrets
-from typing import Any, Optional
+from typing import Any
import click
import sqlalchemy as sa
@@ -13,7 +14,6 @@ from sqlalchemy.exc import SQLAlchemyError
from configs import dify_config
from constants.languages import languages
-from core.plugin.entities.plugin import ToolProviderID
from core.rag.datasource.vdb.vector_factory import Vector
from core.rag.datasource.vdb.vector_type import VectorType
from core.rag.index_processor.constant.built_in_field import BuiltInField
@@ -31,6 +31,7 @@ from models.dataset import Dataset, DatasetCollectionBinding, DatasetMetadata, D
from models.dataset import Document as DatasetDocument
from models.model import Account, App, AppAnnotationSetting, AppMode, Conversation, MessageAnnotation
from models.provider import Provider, ProviderModel
+from models.provider_ids import ToolProviderID
from models.tools import ToolOAuthSystemClient
from services.account_service import AccountService, RegisterService, TenantService
from services.clear_free_plan_tenant_expired_logs import ClearFreePlanTenantExpiredLogs
@@ -212,7 +213,9 @@ def migrate_annotation_vector_database():
if not dataset_collection_binding:
click.echo(f"App annotation collection binding not found: {app.id}")
continue
-annotations = db.session.query(MessageAnnotation).where(MessageAnnotation.app_id == app.id).all()
+annotations = db.session.scalars(
+select(MessageAnnotation).where(MessageAnnotation.app_id == app.id)
+).all()
dataset = Dataset(
id=app.id,
tenant_id=app.tenant_id,
@@ -367,29 +370,25 @@ def migrate_knowledge_vector_database():
)
raise e
-dataset_documents = (
-db.session.query(DatasetDocument)
-.where(
+dataset_documents = db.session.scalars(
+select(DatasetDocument).where(
DatasetDocument.dataset_id == dataset.id,
DatasetDocument.indexing_status == "completed",
DatasetDocument.enabled == True,
DatasetDocument.archived == False,
)
-.all()
-)
+).all()
documents = []
segments_count = 0
for dataset_document in dataset_documents:
-segments = (
-db.session.query(DocumentSegment)
-.where(
+segments = db.session.scalars(
+select(DocumentSegment).where(
DocumentSegment.document_id == dataset_document.id,
DocumentSegment.status == "completed",
DocumentSegment.enabled == True,
)
-.all()
-)
+).all()
for segment in segments:
document = Document(
@@ -479,12 +478,12 @@ def convert_to_agent_apps():
click.echo(f"Converting app: {app.id}")
try:
-app.mode = AppMode.AGENT_CHAT.value
+app.mode = AppMode.AGENT_CHAT
db.session.commit()
# update conversation mode to agent
db.session.query(Conversation).where(Conversation.app_id == app.id).update(
-{Conversation.mode: AppMode.AGENT_CHAT.value}
+{Conversation.mode: AppMode.AGENT_CHAT}
)
db.session.commit()
@@ -511,7 +510,7 @@ def add_qdrant_index(field: str):
from qdrant_client.http.exceptions import UnexpectedResponse
from qdrant_client.http.models import PayloadSchemaType
-from core.rag.datasource.vdb.qdrant.qdrant_vector import QdrantConfig
+from core.rag.datasource.vdb.qdrant.qdrant_vector import PathQdrantParams, QdrantConfig
for binding in bindings:
if dify_config.QDRANT_URL is None:
@@ -525,7 +524,21 @@ def add_qdrant_index(field: str):
prefer_grpc=dify_config.QDRANT_GRPC_ENABLED,
)
try:
-client = qdrant_client.QdrantClient(**qdrant_config.to_qdrant_params())
+params = qdrant_config.to_qdrant_params()
+# Check the type before using
+if isinstance(params, PathQdrantParams):
+# PathQdrantParams case
+client = qdrant_client.QdrantClient(path=params.path)
+else:
+# UrlQdrantParams case - params is UrlQdrantParams
+client = qdrant_client.QdrantClient(
+url=params.url,
+api_key=params.api_key,
+timeout=int(params.timeout),
+verify=params.verify,
+grpc_port=params.grpc_port,
+prefer_grpc=params.prefer_grpc,
+)
# create payload index
client.create_payload_index(binding.collection_name, field, field_schema=PayloadSchemaType.KEYWORD)
create_count += 1
@@ -627,7 +640,7 @@ def old_metadata_migration():
@click.option("--email", prompt=True, help="Tenant account email.")
@click.option("--name", prompt=True, help="Workspace name.")
@click.option("--language", prompt=True, help="Account language, default: en-US.")
-def create_tenant(email: str, language: Optional[str] = None, name: Optional[str] = None):
+def create_tenant(email: str, language: str | None = None, name: str | None = None):
"""
Create tenant account
"""
@@ -941,7 +954,7 @@ def clear_orphaned_file_records(force: bool):
click.echo(click.style("- Deleting orphaned message_files records", fg="white"))
query = "DELETE FROM message_files WHERE id IN :ids"
with db.engine.begin() as conn:
-conn.execute(sa.text(query), {"ids": tuple([record["id"] for record in orphaned_message_files])})
+conn.execute(sa.text(query), {"ids": tuple(record["id"] for record in orphaned_message_files)})
click.echo(
click.style(f"Removed {len(orphaned_message_files)} orphaned message_files records.", fg="green")
)
@@ -1295,7 +1308,7 @@ def cleanup_orphaned_draft_variables(
if dry_run:
logger.info("DRY RUN: Would delete the following:")
-for app_id, count in sorted(stats["orphaned_by_app"].items(), key=lambda x: x[1], reverse=True)[
+for app_id, count in sorted(stats["orphaned_by_app"].items(), key=operator.itemgetter(1), reverse=True)[
:10
]: # Show top 10
logger.info(" App %s: %s variables", app_id, count)


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,28 +7,28 @@ class NotionConfig(BaseSettings):
Configuration settings for Notion integration Configuration settings for Notion integration
""" """
NOTION_CLIENT_ID: Optional[str] = Field( NOTION_CLIENT_ID: str | None = Field(
description="Client ID for Notion API authentication. Required for OAuth 2.0 flow.", description="Client ID for Notion API authentication. Required for OAuth 2.0 flow.",
default=None, default=None,
) )
NOTION_CLIENT_SECRET: Optional[str] = Field( NOTION_CLIENT_SECRET: str | None = Field(
description="Client secret for Notion API authentication. Required for OAuth 2.0 flow.", description="Client secret for Notion API authentication. Required for OAuth 2.0 flow.",
default=None, default=None,
) )
NOTION_INTEGRATION_TYPE: Optional[str] = Field( NOTION_INTEGRATION_TYPE: str | None = Field(
description="Type of Notion integration." description="Type of Notion integration."
" Set to 'internal' for internal integrations, or None for public integrations.", " Set to 'internal' for internal integrations, or None for public integrations.",
default=None, default=None,
) )
NOTION_INTERNAL_SECRET: Optional[str] = Field( NOTION_INTERNAL_SECRET: str | None = Field(
description="Secret key for internal Notion integrations. Required when NOTION_INTEGRATION_TYPE is 'internal'.", description="Secret key for internal Notion integrations. Required when NOTION_INTEGRATION_TYPE is 'internal'.",
default=None, default=None,
) )
NOTION_INTEGRATION_TOKEN: Optional[str] = Field( NOTION_INTEGRATION_TOKEN: str | None = Field(
description="Integration token for Notion API access. Used for direct API calls without OAuth flow.", description="Integration token for Notion API access. Used for direct API calls without OAuth flow.",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, NonNegativeFloat from pydantic import Field, NonNegativeFloat
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,7 +7,7 @@ class SentryConfig(BaseSettings):
Configuration settings for Sentry error tracking and performance monitoring Configuration settings for Sentry error tracking and performance monitoring
""" """
SENTRY_DSN: Optional[str] = Field( SENTRY_DSN: str | None = Field(
description="Sentry Data Source Name (DSN)." description="Sentry Data Source Name (DSN)."
" This is the unique identifier of your Sentry project, used to send events to the correct project.", " This is the unique identifier of your Sentry project, used to send events to the correct project.",
default=None, default=None,


@@ -1,4 +1,4 @@
from typing import Literal, Optional from typing import Literal
from pydantic import ( from pydantic import (
AliasChoices, AliasChoices,
@@ -31,6 +31,12 @@ class SecurityConfig(BaseSettings):
description="Duration in minutes for which a password reset token remains valid", description="Duration in minutes for which a password reset token remains valid",
default=5, default=5,
) )
EMAIL_REGISTER_TOKEN_EXPIRY_MINUTES: PositiveInt = Field(
description="Duration in minutes for which a email register token remains valid",
default=5,
)
CHANGE_EMAIL_TOKEN_EXPIRY_MINUTES: PositiveInt = Field( CHANGE_EMAIL_TOKEN_EXPIRY_MINUTES: PositiveInt = Field(
description="Duration in minutes for which a change email token remains valid", description="Duration in minutes for which a change email token remains valid",
default=5, default=5,
@@ -51,7 +57,7 @@ class SecurityConfig(BaseSettings):
default=False, default=False,
) )
ADMIN_API_KEY: Optional[str] = Field( ADMIN_API_KEY: str | None = Field(
description="admin api key for authentication", description="admin api key for authentication",
default=None, default=None,
) )
@@ -91,17 +97,17 @@ class CodeExecutionSandboxConfig(BaseSettings):
default="dify-sandbox", default="dify-sandbox",
) )
CODE_EXECUTION_CONNECT_TIMEOUT: Optional[float] = Field( CODE_EXECUTION_CONNECT_TIMEOUT: float | None = Field(
description="Connection timeout in seconds for code execution requests", description="Connection timeout in seconds for code execution requests",
default=10.0, default=10.0,
) )
CODE_EXECUTION_READ_TIMEOUT: Optional[float] = Field( CODE_EXECUTION_READ_TIMEOUT: float | None = Field(
description="Read timeout in seconds for code execution requests", description="Read timeout in seconds for code execution requests",
default=60.0, default=60.0,
) )
CODE_EXECUTION_WRITE_TIMEOUT: Optional[float] = Field( CODE_EXECUTION_WRITE_TIMEOUT: float | None = Field(
description="Write timeout in seconds for code execution request", description="Write timeout in seconds for code execution request",
default=10.0, default=10.0,
) )
@@ -362,17 +368,17 @@ class HttpConfig(BaseSettings):
default=3, default=3,
) )
SSRF_PROXY_ALL_URL: Optional[str] = Field( SSRF_PROXY_ALL_URL: str | None = Field(
description="Proxy URL for HTTP or HTTPS requests to prevent Server-Side Request Forgery (SSRF)", description="Proxy URL for HTTP or HTTPS requests to prevent Server-Side Request Forgery (SSRF)",
default=None, default=None,
) )
SSRF_PROXY_HTTP_URL: Optional[str] = Field( SSRF_PROXY_HTTP_URL: str | None = Field(
description="Proxy URL for HTTP requests to prevent Server-Side Request Forgery (SSRF)", description="Proxy URL for HTTP requests to prevent Server-Side Request Forgery (SSRF)",
default=None, default=None,
) )
SSRF_PROXY_HTTPS_URL: Optional[str] = Field( SSRF_PROXY_HTTPS_URL: str | None = Field(
description="Proxy URL for HTTPS requests to prevent Server-Side Request Forgery (SSRF)", description="Proxy URL for HTTPS requests to prevent Server-Side Request Forgery (SSRF)",
default=None, default=None,
) )
@@ -414,7 +420,7 @@ class InnerAPIConfig(BaseSettings):
default=False, default=False,
) )
INNER_API_KEY: Optional[str] = Field( INNER_API_KEY: str | None = Field(
description="API key for accessing the internal API", description="API key for accessing the internal API",
default=None, default=None,
) )
@@ -430,7 +436,7 @@ class LoggingConfig(BaseSettings):
default="INFO", default="INFO",
) )
LOG_FILE: Optional[str] = Field( LOG_FILE: str | None = Field(
description="File path for log output.", description="File path for log output.",
default=None, default=None,
) )
@@ -450,12 +456,12 @@ class LoggingConfig(BaseSettings):
default="%(asctime)s.%(msecs)03d %(levelname)s [%(threadName)s] [%(filename)s:%(lineno)d] - %(message)s", default="%(asctime)s.%(msecs)03d %(levelname)s [%(threadName)s] [%(filename)s:%(lineno)d] - %(message)s",
) )
LOG_DATEFORMAT: Optional[str] = Field( LOG_DATEFORMAT: str | None = Field(
description="Date format string for log timestamps", description="Date format string for log timestamps",
default=None, default=None,
) )
LOG_TZ: Optional[str] = Field( LOG_TZ: str | None = Field(
description="Timezone for log timestamps (e.g., 'America/New_York')", description="Timezone for log timestamps (e.g., 'America/New_York')",
default="UTC", default="UTC",
) )
@@ -529,6 +535,28 @@ class WorkflowConfig(BaseSettings):
default=200 * 1024, default=200 * 1024,
) )
# GraphEngine Worker Pool Configuration
GRAPH_ENGINE_MIN_WORKERS: PositiveInt = Field(
description="Minimum number of workers per GraphEngine instance",
default=1,
)
GRAPH_ENGINE_MAX_WORKERS: PositiveInt = Field(
description="Maximum number of workers per GraphEngine instance",
default=10,
)
GRAPH_ENGINE_SCALE_UP_THRESHOLD: PositiveInt = Field(
description="Queue depth threshold that triggers worker scale up",
default=3,
)
GRAPH_ENGINE_SCALE_DOWN_IDLE_TIME: float = Field(
description="Seconds of idle time before scaling down workers",
default=5.0,
ge=0.1,
)
class WorkflowNodeExecutionConfig(BaseSettings): class WorkflowNodeExecutionConfig(BaseSettings):
""" """
@@ -589,22 +617,22 @@ class AuthConfig(BaseSettings):
default="/console/api/oauth/authorize", default="/console/api/oauth/authorize",
) )
GITHUB_CLIENT_ID: Optional[str] = Field( GITHUB_CLIENT_ID: str | None = Field(
description="GitHub OAuth client ID", description="GitHub OAuth client ID",
default=None, default=None,
) )
GITHUB_CLIENT_SECRET: Optional[str] = Field( GITHUB_CLIENT_SECRET: str | None = Field(
description="GitHub OAuth client secret", description="GitHub OAuth client secret",
default=None, default=None,
) )
GOOGLE_CLIENT_ID: Optional[str] = Field( GOOGLE_CLIENT_ID: str | None = Field(
description="Google OAuth client ID", description="Google OAuth client ID",
default=None, default=None,
) )
GOOGLE_CLIENT_SECRET: Optional[str] = Field( GOOGLE_CLIENT_SECRET: str | None = Field(
description="Google OAuth client secret", description="Google OAuth client secret",
default=None, default=None,
) )
@@ -639,6 +667,11 @@ class AuthConfig(BaseSettings):
default=86400, default=86400,
) )
EMAIL_REGISTER_LOCKOUT_DURATION: PositiveInt = Field(
description="Time (in seconds) a user must wait before retrying email register after exceeding the rate limit.",
default=86400,
)
class ModerationConfig(BaseSettings): class ModerationConfig(BaseSettings):
""" """
@@ -667,42 +700,42 @@ class MailConfig(BaseSettings):
Configuration for email services Configuration for email services
""" """
MAIL_TYPE: Optional[str] = Field( MAIL_TYPE: str | None = Field(
description="Email service provider type ('smtp' or 'resend' or 'sendGrid), default to None.", description="Email service provider type ('smtp' or 'resend' or 'sendGrid), default to None.",
default=None, default=None,
) )
MAIL_DEFAULT_SEND_FROM: Optional[str] = Field( MAIL_DEFAULT_SEND_FROM: str | None = Field(
description="Default email address to use as the sender", description="Default email address to use as the sender",
default=None, default=None,
) )
RESEND_API_KEY: Optional[str] = Field( RESEND_API_KEY: str | None = Field(
description="API key for Resend email service", description="API key for Resend email service",
default=None, default=None,
) )
RESEND_API_URL: Optional[str] = Field( RESEND_API_URL: str | None = Field(
description="API URL for Resend email service", description="API URL for Resend email service",
default=None, default=None,
) )
SMTP_SERVER: Optional[str] = Field( SMTP_SERVER: str | None = Field(
description="SMTP server hostname", description="SMTP server hostname",
default=None, default=None,
) )
SMTP_PORT: Optional[int] = Field( SMTP_PORT: int | None = Field(
description="SMTP server port number", description="SMTP server port number",
default=465, default=465,
) )
SMTP_USERNAME: Optional[str] = Field( SMTP_USERNAME: str | None = Field(
description="Username for SMTP authentication", description="Username for SMTP authentication",
default=None, default=None,
) )
SMTP_PASSWORD: Optional[str] = Field( SMTP_PASSWORD: str | None = Field(
description="Password for SMTP authentication", description="Password for SMTP authentication",
default=None, default=None,
) )
@@ -722,7 +755,7 @@ class MailConfig(BaseSettings):
default=50, default=50,
) )
SENDGRID_API_KEY: Optional[str] = Field( SENDGRID_API_KEY: str | None = Field(
description="API key for SendGrid service", description="API key for SendGrid service",
default=None, default=None,
) )
@@ -745,17 +778,17 @@ class RagEtlConfig(BaseSettings):
default="database", default="database",
) )
UNSTRUCTURED_API_URL: Optional[str] = Field( UNSTRUCTURED_API_URL: str | None = Field(
description="API URL for Unstructured.io service", description="API URL for Unstructured.io service",
default=None, default=None,
) )
UNSTRUCTURED_API_KEY: Optional[str] = Field( UNSTRUCTURED_API_KEY: str | None = Field(
description="API key for Unstructured.io service", description="API key for Unstructured.io service",
default="", default="",
) )
SCARF_NO_ANALYTICS: Optional[str] = Field( SCARF_NO_ANALYTICS: str | None = Field(
description="This is about whether to disable Scarf analytics in Unstructured library.", description="This is about whether to disable Scarf analytics in Unstructured library.",
default="false", default="false",
) )
@@ -796,6 +829,11 @@ class DataSetConfig(BaseSettings):
default=30, default=30,
) )
DSL_EXPORT_ENCRYPT_DATASET_ID: bool = Field(
description="Enable or disable dataset ID encryption when exporting DSL files",
default=True,
)
class WorkspaceConfig(BaseSettings): class WorkspaceConfig(BaseSettings):
""" """


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, NonNegativeInt from pydantic import Field, NonNegativeInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -40,17 +38,17 @@ class HostedOpenAiConfig(BaseSettings):
Configuration for hosted OpenAI service Configuration for hosted OpenAI service
""" """
HOSTED_OPENAI_API_KEY: Optional[str] = Field( HOSTED_OPENAI_API_KEY: str | None = Field(
description="API key for hosted OpenAI service", description="API key for hosted OpenAI service",
default=None, default=None,
) )
HOSTED_OPENAI_API_BASE: Optional[str] = Field( HOSTED_OPENAI_API_BASE: str | None = Field(
description="Base URL for hosted OpenAI API", description="Base URL for hosted OpenAI API",
default=None, default=None,
) )
HOSTED_OPENAI_API_ORGANIZATION: Optional[str] = Field( HOSTED_OPENAI_API_ORGANIZATION: str | None = Field(
description="Organization ID for hosted OpenAI service", description="Organization ID for hosted OpenAI service",
default=None, default=None,
) )
@@ -110,12 +108,12 @@ class HostedAzureOpenAiConfig(BaseSettings):
default=False, default=False,
) )
HOSTED_AZURE_OPENAI_API_KEY: Optional[str] = Field( HOSTED_AZURE_OPENAI_API_KEY: str | None = Field(
description="API key for hosted Azure OpenAI service", description="API key for hosted Azure OpenAI service",
default=None, default=None,
) )
HOSTED_AZURE_OPENAI_API_BASE: Optional[str] = Field( HOSTED_AZURE_OPENAI_API_BASE: str | None = Field(
description="Base URL for hosted Azure OpenAI API", description="Base URL for hosted Azure OpenAI API",
default=None, default=None,
) )
@@ -131,12 +129,12 @@ class HostedAnthropicConfig(BaseSettings):
Configuration for hosted Anthropic service Configuration for hosted Anthropic service
""" """
HOSTED_ANTHROPIC_API_BASE: Optional[str] = Field( HOSTED_ANTHROPIC_API_BASE: str | None = Field(
description="Base URL for hosted Anthropic API", description="Base URL for hosted Anthropic API",
default=None, default=None,
) )
HOSTED_ANTHROPIC_API_KEY: Optional[str] = Field( HOSTED_ANTHROPIC_API_KEY: str | None = Field(
description="API key for hosted Anthropic service", description="API key for hosted Anthropic service",
default=None, default=None,
) )


@@ -1,5 +1,5 @@
import os import os
from typing import Any, Literal, Optional from typing import Any, Literal
from urllib.parse import parse_qsl, quote_plus from urllib.parse import parse_qsl, quote_plus
from pydantic import Field, NonNegativeFloat, NonNegativeInt, PositiveFloat, PositiveInt, computed_field from pydantic import Field, NonNegativeFloat, NonNegativeInt, PositiveFloat, PositiveInt, computed_field
@@ -78,18 +78,18 @@ class StorageConfig(BaseSettings):
class VectorStoreConfig(BaseSettings): class VectorStoreConfig(BaseSettings):
VECTOR_STORE: Optional[str] = Field( VECTOR_STORE: str | None = Field(
description="Type of vector store to use for efficient similarity search." description="Type of vector store to use for efficient similarity search."
" Set to None if not using a vector store.", " Set to None if not using a vector store.",
default=None, default=None,
) )
VECTOR_STORE_WHITELIST_ENABLE: Optional[bool] = Field( VECTOR_STORE_WHITELIST_ENABLE: bool | None = Field(
description="Enable whitelist for vector store.", description="Enable whitelist for vector store.",
default=False, default=False,
) )
VECTOR_INDEX_NAME_PREFIX: Optional[str] = Field( VECTOR_INDEX_NAME_PREFIX: str | None = Field(
description="Prefix used to create collection name in vector database", description="Prefix used to create collection name in vector database",
default="Vector_index", default="Vector_index",
) )
@@ -225,26 +225,26 @@ class CeleryConfig(DatabaseConfig):
default="redis", default="redis",
) )
CELERY_BROKER_URL: Optional[str] = Field( CELERY_BROKER_URL: str | None = Field(
description="URL of the message broker for Celery tasks.", description="URL of the message broker for Celery tasks.",
default=None, default=None,
) )
CELERY_USE_SENTINEL: Optional[bool] = Field( CELERY_USE_SENTINEL: bool | None = Field(
description="Whether to use Redis Sentinel for high availability.", description="Whether to use Redis Sentinel for high availability.",
default=False, default=False,
) )
CELERY_SENTINEL_MASTER_NAME: Optional[str] = Field( CELERY_SENTINEL_MASTER_NAME: str | None = Field(
description="Name of the Redis Sentinel master.", description="Name of the Redis Sentinel master.",
default=None, default=None,
) )
CELERY_SENTINEL_PASSWORD: Optional[str] = Field( CELERY_SENTINEL_PASSWORD: str | None = Field(
description="Password of the Redis Sentinel master.", description="Password of the Redis Sentinel master.",
default=None, default=None,
) )
CELERY_SENTINEL_SOCKET_TIMEOUT: Optional[PositiveFloat] = Field( CELERY_SENTINEL_SOCKET_TIMEOUT: PositiveFloat | None = Field(
description="Timeout for Redis Sentinel socket operations in seconds.", description="Timeout for Redis Sentinel socket operations in seconds.",
default=0.1, default=0.1,
) )
@@ -268,12 +268,12 @@ class InternalTestConfig(BaseSettings):
Configuration settings for Internal Test Configuration settings for Internal Test
""" """
AWS_SECRET_ACCESS_KEY: Optional[str] = Field( AWS_SECRET_ACCESS_KEY: str | None = Field(
description="Internal test AWS secret access key", description="Internal test AWS secret access key",
default=None, default=None,
) )
AWS_ACCESS_KEY_ID: Optional[str] = Field( AWS_ACCESS_KEY_ID: str | None = Field(
description="Internal test AWS access key ID", description="Internal test AWS access key ID",
default=None, default=None,
) )
@@ -284,15 +284,15 @@ class DatasetQueueMonitorConfig(BaseSettings):
Configuration settings for Dataset Queue Monitor Configuration settings for Dataset Queue Monitor
""" """
QUEUE_MONITOR_THRESHOLD: Optional[NonNegativeInt] = Field( QUEUE_MONITOR_THRESHOLD: NonNegativeInt | None = Field(
description="Threshold for dataset queue monitor", description="Threshold for dataset queue monitor",
default=200, default=200,
) )
QUEUE_MONITOR_ALERT_EMAILS: Optional[str] = Field( QUEUE_MONITOR_ALERT_EMAILS: str | None = Field(
description="Emails for dataset queue monitor alert, separated by commas", description="Emails for dataset queue monitor alert, separated by commas",
default=None, default=None,
) )
QUEUE_MONITOR_INTERVAL: Optional[NonNegativeFloat] = Field( QUEUE_MONITOR_INTERVAL: NonNegativeFloat | None = Field(
description="Interval for dataset queue monitor in minutes", description="Interval for dataset queue monitor in minutes",
default=30, default=30,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, NonNegativeInt, PositiveFloat, PositiveInt from pydantic import Field, NonNegativeInt, PositiveFloat, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -19,12 +17,12 @@ class RedisConfig(BaseSettings):
default=6379, default=6379,
) )
REDIS_USERNAME: Optional[str] = Field( REDIS_USERNAME: str | None = Field(
description="Username for Redis authentication (if required)", description="Username for Redis authentication (if required)",
default=None, default=None,
) )
REDIS_PASSWORD: Optional[str] = Field( REDIS_PASSWORD: str | None = Field(
description="Password for Redis authentication (if required)", description="Password for Redis authentication (if required)",
default=None, default=None,
) )
@@ -44,47 +42,47 @@ class RedisConfig(BaseSettings):
default="CERT_NONE", default="CERT_NONE",
) )
REDIS_SSL_CA_CERTS: Optional[str] = Field( REDIS_SSL_CA_CERTS: str | None = Field(
description="Path to the CA certificate file for SSL verification", description="Path to the CA certificate file for SSL verification",
default=None, default=None,
) )
REDIS_SSL_CERTFILE: Optional[str] = Field( REDIS_SSL_CERTFILE: str | None = Field(
description="Path to the client certificate file for SSL authentication", description="Path to the client certificate file for SSL authentication",
default=None, default=None,
) )
REDIS_SSL_KEYFILE: Optional[str] = Field( REDIS_SSL_KEYFILE: str | None = Field(
description="Path to the client private key file for SSL authentication", description="Path to the client private key file for SSL authentication",
default=None, default=None,
) )
REDIS_USE_SENTINEL: Optional[bool] = Field( REDIS_USE_SENTINEL: bool | None = Field(
description="Enable Redis Sentinel mode for high availability", description="Enable Redis Sentinel mode for high availability",
default=False, default=False,
) )
REDIS_SENTINELS: Optional[str] = Field( REDIS_SENTINELS: str | None = Field(
description="Comma-separated list of Redis Sentinel nodes (host:port)", description="Comma-separated list of Redis Sentinel nodes (host:port)",
default=None, default=None,
) )
REDIS_SENTINEL_SERVICE_NAME: Optional[str] = Field( REDIS_SENTINEL_SERVICE_NAME: str | None = Field(
description="Name of the Redis Sentinel service to monitor", description="Name of the Redis Sentinel service to monitor",
default=None, default=None,
) )
REDIS_SENTINEL_USERNAME: Optional[str] = Field( REDIS_SENTINEL_USERNAME: str | None = Field(
description="Username for Redis Sentinel authentication (if required)", description="Username for Redis Sentinel authentication (if required)",
default=None, default=None,
) )
REDIS_SENTINEL_PASSWORD: Optional[str] = Field( REDIS_SENTINEL_PASSWORD: str | None = Field(
description="Password for Redis Sentinel authentication (if required)", description="Password for Redis Sentinel authentication (if required)",
default=None, default=None,
) )
REDIS_SENTINEL_SOCKET_TIMEOUT: Optional[PositiveFloat] = Field( REDIS_SENTINEL_SOCKET_TIMEOUT: PositiveFloat | None = Field(
description="Socket timeout in seconds for Redis Sentinel connections", description="Socket timeout in seconds for Redis Sentinel connections",
default=0.1, default=0.1,
) )
@@ -94,12 +92,12 @@ class RedisConfig(BaseSettings):
default=False, default=False,
) )
REDIS_CLUSTERS: Optional[str] = Field( REDIS_CLUSTERS: str | None = Field(
description="Comma-separated list of Redis Clusters nodes (host:port)", description="Comma-separated list of Redis Clusters nodes (host:port)",
default=None, default=None,
) )
REDIS_CLUSTERS_PASSWORD: Optional[str] = Field( REDIS_CLUSTERS_PASSWORD: str | None = Field(
description="Password for Redis Clusters authentication (if required)", description="Password for Redis Clusters authentication (if required)",
default=None, default=None,
) )
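The Sentinel-related fields map fairly directly onto `redis.sentinel.Sentinel` from redis-py. A rough sketch of how settings like these are typically consumed (illustrative host names and passwords; the repository's actual Redis wiring is not part of this hunk):

```python
from redis.sentinel import Sentinel

# Hypothetical values standing in for REDIS_SENTINELS, REDIS_SENTINEL_SERVICE_NAME,
# REDIS_SENTINEL_PASSWORD and REDIS_SENTINEL_SOCKET_TIMEOUT.
sentinel = Sentinel(
    [("sentinel-1", 26379), ("sentinel-2", 26379)],
    socket_timeout=0.1,
    sentinel_kwargs={"password": "sentinel-password"},
)

# Resolve the current master of the monitored service and use it like a normal client.
master = sentinel.master_for("mymaster", socket_timeout=0.1, password="redis-password")
master.set("healthcheck", "ok")
```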


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,37 +7,37 @@ class AliyunOSSStorageConfig(BaseSettings):
Configuration settings for Aliyun Object Storage Service (OSS) Configuration settings for Aliyun Object Storage Service (OSS)
""" """
ALIYUN_OSS_BUCKET_NAME: Optional[str] = Field( ALIYUN_OSS_BUCKET_NAME: str | None = Field(
description="Name of the Aliyun OSS bucket to store and retrieve objects", description="Name of the Aliyun OSS bucket to store and retrieve objects",
default=None, default=None,
) )
ALIYUN_OSS_ACCESS_KEY: Optional[str] = Field( ALIYUN_OSS_ACCESS_KEY: str | None = Field(
description="Access key ID for authenticating with Aliyun OSS", description="Access key ID for authenticating with Aliyun OSS",
default=None, default=None,
) )
ALIYUN_OSS_SECRET_KEY: Optional[str] = Field( ALIYUN_OSS_SECRET_KEY: str | None = Field(
description="Secret access key for authenticating with Aliyun OSS", description="Secret access key for authenticating with Aliyun OSS",
default=None, default=None,
) )
ALIYUN_OSS_ENDPOINT: Optional[str] = Field( ALIYUN_OSS_ENDPOINT: str | None = Field(
description="URL of the Aliyun OSS endpoint for your chosen region", description="URL of the Aliyun OSS endpoint for your chosen region",
default=None, default=None,
) )
ALIYUN_OSS_REGION: Optional[str] = Field( ALIYUN_OSS_REGION: str | None = Field(
description="Aliyun OSS region where your bucket is located (e.g., 'oss-cn-hangzhou')", description="Aliyun OSS region where your bucket is located (e.g., 'oss-cn-hangzhou')",
default=None, default=None,
) )
ALIYUN_OSS_AUTH_VERSION: Optional[str] = Field( ALIYUN_OSS_AUTH_VERSION: str | None = Field(
description="Version of the authentication protocol to use with Aliyun OSS (e.g., 'v4')", description="Version of the authentication protocol to use with Aliyun OSS (e.g., 'v4')",
default=None, default=None,
) )
ALIYUN_OSS_PATH: Optional[str] = Field( ALIYUN_OSS_PATH: str | None = Field(
description="Base path within the bucket to store objects (e.g., 'my-app-data/')", description="Base path within the bucket to store objects (e.g., 'my-app-data/')",
default=None, default=None,
) )


@@ -1,4 +1,4 @@
from typing import Literal, Optional from typing import Literal
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,27 +9,27 @@ class S3StorageConfig(BaseSettings):
Configuration settings for S3-compatible object storage Configuration settings for S3-compatible object storage
""" """
S3_ENDPOINT: Optional[str] = Field( S3_ENDPOINT: str | None = Field(
description="URL of the S3-compatible storage endpoint (e.g., 'https://s3.amazonaws.com')", description="URL of the S3-compatible storage endpoint (e.g., 'https://s3.amazonaws.com')",
default=None, default=None,
) )
S3_REGION: Optional[str] = Field( S3_REGION: str | None = Field(
description="Region where the S3 bucket is located (e.g., 'us-east-1')", description="Region where the S3 bucket is located (e.g., 'us-east-1')",
default=None, default=None,
) )
S3_BUCKET_NAME: Optional[str] = Field( S3_BUCKET_NAME: str | None = Field(
description="Name of the S3 bucket to store and retrieve objects", description="Name of the S3 bucket to store and retrieve objects",
default=None, default=None,
) )
S3_ACCESS_KEY: Optional[str] = Field( S3_ACCESS_KEY: str | None = Field(
description="Access key ID for authenticating with the S3 service", description="Access key ID for authenticating with the S3 service",
default=None, default=None,
) )
S3_SECRET_KEY: Optional[str] = Field( S3_SECRET_KEY: str | None = Field(
description="Secret access key for authenticating with the S3 service", description="Secret access key for authenticating with the S3 service",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,22 +7,22 @@ class AzureBlobStorageConfig(BaseSettings):
Configuration settings for Azure Blob Storage Configuration settings for Azure Blob Storage
""" """
AZURE_BLOB_ACCOUNT_NAME: Optional[str] = Field( AZURE_BLOB_ACCOUNT_NAME: str | None = Field(
description="Name of the Azure Storage account (e.g., 'mystorageaccount')", description="Name of the Azure Storage account (e.g., 'mystorageaccount')",
default=None, default=None,
) )
AZURE_BLOB_ACCOUNT_KEY: Optional[str] = Field( AZURE_BLOB_ACCOUNT_KEY: str | None = Field(
description="Access key for authenticating with the Azure Storage account", description="Access key for authenticating with the Azure Storage account",
default=None, default=None,
) )
AZURE_BLOB_CONTAINER_NAME: Optional[str] = Field( AZURE_BLOB_CONTAINER_NAME: str | None = Field(
description="Name of the Azure Blob container to store and retrieve objects", description="Name of the Azure Blob container to store and retrieve objects",
default=None, default=None,
) )
AZURE_BLOB_ACCOUNT_URL: Optional[str] = Field( AZURE_BLOB_ACCOUNT_URL: str | None = Field(
description="URL of the Azure Blob storage endpoint (e.g., 'https://mystorageaccount.blob.core.windows.net')", description="URL of the Azure Blob storage endpoint (e.g., 'https://mystorageaccount.blob.core.windows.net')",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,22 +7,22 @@ class BaiduOBSStorageConfig(BaseSettings):
Configuration settings for Baidu Object Storage Service (OBS) Configuration settings for Baidu Object Storage Service (OBS)
""" """
BAIDU_OBS_BUCKET_NAME: Optional[str] = Field( BAIDU_OBS_BUCKET_NAME: str | None = Field(
description="Name of the Baidu OBS bucket to store and retrieve objects (e.g., 'my-obs-bucket')", description="Name of the Baidu OBS bucket to store and retrieve objects (e.g., 'my-obs-bucket')",
default=None, default=None,
) )
BAIDU_OBS_ACCESS_KEY: Optional[str] = Field( BAIDU_OBS_ACCESS_KEY: str | None = Field(
description="Access Key ID for authenticating with Baidu OBS", description="Access Key ID for authenticating with Baidu OBS",
default=None, default=None,
) )
BAIDU_OBS_SECRET_KEY: Optional[str] = Field( BAIDU_OBS_SECRET_KEY: str | None = Field(
description="Secret Access Key for authenticating with Baidu OBS", description="Secret Access Key for authenticating with Baidu OBS",
default=None, default=None,
) )
BAIDU_OBS_ENDPOINT: Optional[str] = Field( BAIDU_OBS_ENDPOINT: str | None = Field(
description="URL of the Baidu OSS endpoint for your chosen region (e.g., 'https://.bj.bcebos.com')", description="URL of the Baidu OSS endpoint for your chosen region (e.g., 'https://.bj.bcebos.com')",
default=None, default=None,
) )


@@ -1,7 +1,5 @@
"""ClickZetta Volume Storage Configuration""" """ClickZetta Volume Storage Configuration"""
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,17 +7,17 @@ from pydantic_settings import BaseSettings
class ClickZettaVolumeStorageConfig(BaseSettings): class ClickZettaVolumeStorageConfig(BaseSettings):
"""Configuration for ClickZetta Volume storage.""" """Configuration for ClickZetta Volume storage."""
CLICKZETTA_VOLUME_USERNAME: Optional[str] = Field( CLICKZETTA_VOLUME_USERNAME: str | None = Field(
description="Username for ClickZetta Volume authentication", description="Username for ClickZetta Volume authentication",
default=None, default=None,
) )
CLICKZETTA_VOLUME_PASSWORD: Optional[str] = Field( CLICKZETTA_VOLUME_PASSWORD: str | None = Field(
description="Password for ClickZetta Volume authentication", description="Password for ClickZetta Volume authentication",
default=None, default=None,
) )
CLICKZETTA_VOLUME_INSTANCE: Optional[str] = Field( CLICKZETTA_VOLUME_INSTANCE: str | None = Field(
description="ClickZetta instance identifier", description="ClickZetta instance identifier",
default=None, default=None,
) )
@@ -49,7 +47,7 @@ class ClickZettaVolumeStorageConfig(BaseSettings):
default="user", default="user",
) )
CLICKZETTA_VOLUME_NAME: Optional[str] = Field( CLICKZETTA_VOLUME_NAME: str | None = Field(
description="ClickZetta volume name for external volumes", description="ClickZetta volume name for external volumes",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,12 +7,12 @@ class GoogleCloudStorageConfig(BaseSettings):
Configuration settings for Google Cloud Storage Configuration settings for Google Cloud Storage
""" """
GOOGLE_STORAGE_BUCKET_NAME: Optional[str] = Field( GOOGLE_STORAGE_BUCKET_NAME: str | None = Field(
description="Name of the Google Cloud Storage bucket to store and retrieve objects (e.g., 'my-gcs-bucket')", description="Name of the Google Cloud Storage bucket to store and retrieve objects (e.g., 'my-gcs-bucket')",
default=None, default=None,
) )
GOOGLE_STORAGE_SERVICE_ACCOUNT_JSON_BASE64: Optional[str] = Field( GOOGLE_STORAGE_SERVICE_ACCOUNT_JSON_BASE64: str | None = Field(
description="Base64-encoded JSON key file for Google Cloud service account authentication", description="Base64-encoded JSON key file for Google Cloud service account authentication",
default=None, default=None,
) )
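GOOGLE_STORAGE_SERVICE_ACCOUNT_JSON_BASE64 carries the service-account key as base64-encoded JSON. A small sketch of how such a value is typically decoded before being handed to a Google client library; the payload below is a stand-in rather than a real key, and this is not necessarily the exact consuming code:

import base64
import json

# Stand-in payload; a real value would be the base64 of the downloaded key file
encoded = base64.b64encode(json.dumps({"type": "service_account"}).encode()).decode()

credentials_info = json.loads(base64.b64decode(encoded))
print(credentials_info["type"])  # -> service_account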


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,22 +7,22 @@ class HuaweiCloudOBSStorageConfig(BaseSettings):
Configuration settings for Huawei Cloud Object Storage Service (OBS) Configuration settings for Huawei Cloud Object Storage Service (OBS)
""" """
HUAWEI_OBS_BUCKET_NAME: Optional[str] = Field( HUAWEI_OBS_BUCKET_NAME: str | None = Field(
description="Name of the Huawei Cloud OBS bucket to store and retrieve objects (e.g., 'my-obs-bucket')", description="Name of the Huawei Cloud OBS bucket to store and retrieve objects (e.g., 'my-obs-bucket')",
default=None, default=None,
) )
HUAWEI_OBS_ACCESS_KEY: Optional[str] = Field( HUAWEI_OBS_ACCESS_KEY: str | None = Field(
description="Access Key ID for authenticating with Huawei Cloud OBS", description="Access Key ID for authenticating with Huawei Cloud OBS",
default=None, default=None,
) )
HUAWEI_OBS_SECRET_KEY: Optional[str] = Field( HUAWEI_OBS_SECRET_KEY: str | None = Field(
description="Secret Access Key for authenticating with Huawei Cloud OBS", description="Secret Access Key for authenticating with Huawei Cloud OBS",
default=None, default=None,
) )
HUAWEI_OBS_SERVER: Optional[str] = Field( HUAWEI_OBS_SERVER: str | None = Field(
description="Endpoint URL for Huawei Cloud OBS (e.g., 'https://obs.cn-north-4.myhuaweicloud.com')", description="Endpoint URL for Huawei Cloud OBS (e.g., 'https://obs.cn-north-4.myhuaweicloud.com')",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,27 +7,27 @@ class OCIStorageConfig(BaseSettings):
Configuration settings for Oracle Cloud Infrastructure (OCI) Object Storage Configuration settings for Oracle Cloud Infrastructure (OCI) Object Storage
""" """
OCI_ENDPOINT: Optional[str] = Field( OCI_ENDPOINT: str | None = Field(
description="URL of the OCI Object Storage endpoint (e.g., 'https://objectstorage.us-phoenix-1.oraclecloud.com')", description="URL of the OCI Object Storage endpoint (e.g., 'https://objectstorage.us-phoenix-1.oraclecloud.com')",
default=None, default=None,
) )
OCI_REGION: Optional[str] = Field( OCI_REGION: str | None = Field(
description="OCI region where the bucket is located (e.g., 'us-phoenix-1')", description="OCI region where the bucket is located (e.g., 'us-phoenix-1')",
default=None, default=None,
) )
OCI_BUCKET_NAME: Optional[str] = Field( OCI_BUCKET_NAME: str | None = Field(
description="Name of the OCI Object Storage bucket to store and retrieve objects (e.g., 'my-oci-bucket')", description="Name of the OCI Object Storage bucket to store and retrieve objects (e.g., 'my-oci-bucket')",
default=None, default=None,
) )
OCI_ACCESS_KEY: Optional[str] = Field( OCI_ACCESS_KEY: str | None = Field(
description="Access key (also known as API key) for authenticating with OCI Object Storage", description="Access key (also known as API key) for authenticating with OCI Object Storage",
default=None, default=None,
) )
OCI_SECRET_KEY: Optional[str] = Field( OCI_SECRET_KEY: str | None = Field(
description="Secret key associated with the access key for authenticating with OCI Object Storage", description="Secret key associated with the access key for authenticating with OCI Object Storage",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,17 +7,17 @@ class SupabaseStorageConfig(BaseSettings):
Configuration settings for Supabase Object Storage Service Configuration settings for Supabase Object Storage Service
""" """
SUPABASE_BUCKET_NAME: Optional[str] = Field( SUPABASE_BUCKET_NAME: str | None = Field(
description="Name of the Supabase bucket to store and retrieve objects (e.g., 'dify-bucket')", description="Name of the Supabase bucket to store and retrieve objects (e.g., 'dify-bucket')",
default=None, default=None,
) )
SUPABASE_API_KEY: Optional[str] = Field( SUPABASE_API_KEY: str | None = Field(
description="API KEY for authenticating with Supabase", description="API KEY for authenticating with Supabase",
default=None, default=None,
) )
SUPABASE_URL: Optional[str] = Field( SUPABASE_URL: str | None = Field(
description="URL of the Supabase", description="URL of the Supabase",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,27 +7,27 @@ class TencentCloudCOSStorageConfig(BaseSettings):
Configuration settings for Tencent Cloud Object Storage (COS) Configuration settings for Tencent Cloud Object Storage (COS)
""" """
TENCENT_COS_BUCKET_NAME: Optional[str] = Field( TENCENT_COS_BUCKET_NAME: str | None = Field(
description="Name of the Tencent Cloud COS bucket to store and retrieve objects", description="Name of the Tencent Cloud COS bucket to store and retrieve objects",
default=None, default=None,
) )
TENCENT_COS_REGION: Optional[str] = Field( TENCENT_COS_REGION: str | None = Field(
description="Tencent Cloud region where the COS bucket is located (e.g., 'ap-guangzhou')", description="Tencent Cloud region where the COS bucket is located (e.g., 'ap-guangzhou')",
default=None, default=None,
) )
TENCENT_COS_SECRET_ID: Optional[str] = Field( TENCENT_COS_SECRET_ID: str | None = Field(
description="SecretId for authenticating with Tencent Cloud COS (part of API credentials)", description="SecretId for authenticating with Tencent Cloud COS (part of API credentials)",
default=None, default=None,
) )
TENCENT_COS_SECRET_KEY: Optional[str] = Field( TENCENT_COS_SECRET_KEY: str | None = Field(
description="SecretKey for authenticating with Tencent Cloud COS (part of API credentials)", description="SecretKey for authenticating with Tencent Cloud COS (part of API credentials)",
default=None, default=None,
) )
TENCENT_COS_SCHEME: Optional[str] = Field( TENCENT_COS_SCHEME: str | None = Field(
description="Protocol scheme for COS requests: 'https' (recommended) or 'http'", description="Protocol scheme for COS requests: 'https' (recommended) or 'http'",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,27 +7,27 @@ class VolcengineTOSStorageConfig(BaseSettings):
Configuration settings for Volcengine Tinder Object Storage (TOS) Configuration settings for Volcengine Tinder Object Storage (TOS)
""" """
VOLCENGINE_TOS_BUCKET_NAME: Optional[str] = Field( VOLCENGINE_TOS_BUCKET_NAME: str | None = Field(
description="Name of the Volcengine TOS bucket to store and retrieve objects (e.g., 'my-tos-bucket')", description="Name of the Volcengine TOS bucket to store and retrieve objects (e.g., 'my-tos-bucket')",
default=None, default=None,
) )
VOLCENGINE_TOS_ACCESS_KEY: Optional[str] = Field( VOLCENGINE_TOS_ACCESS_KEY: str | None = Field(
description="Access Key ID for authenticating with Volcengine TOS", description="Access Key ID for authenticating with Volcengine TOS",
default=None, default=None,
) )
VOLCENGINE_TOS_SECRET_KEY: Optional[str] = Field( VOLCENGINE_TOS_SECRET_KEY: str | None = Field(
description="Secret Access Key for authenticating with Volcengine TOS", description="Secret Access Key for authenticating with Volcengine TOS",
default=None, default=None,
) )
VOLCENGINE_TOS_ENDPOINT: Optional[str] = Field( VOLCENGINE_TOS_ENDPOINT: str | None = Field(
description="URL of the Volcengine TOS endpoint (e.g., 'https://tos-cn-beijing.volces.com')", description="URL of the Volcengine TOS endpoint (e.g., 'https://tos-cn-beijing.volces.com')",
default=None, default=None,
) )
VOLCENGINE_TOS_REGION: Optional[str] = Field( VOLCENGINE_TOS_REGION: str | None = Field(
description="Volcengine region where the TOS bucket is located (e.g., 'cn-beijing')", description="Volcengine region where the TOS bucket is located (e.g., 'cn-beijing')",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -11,37 +9,37 @@ class AnalyticdbConfig(BaseSettings):
https://www.alibabacloud.com/help/en/analyticdb-for-postgresql/getting-started/create-an-instance-instances-with-vector-engine-optimization-enabled https://www.alibabacloud.com/help/en/analyticdb-for-postgresql/getting-started/create-an-instance-instances-with-vector-engine-optimization-enabled
""" """
ANALYTICDB_KEY_ID: Optional[str] = Field( ANALYTICDB_KEY_ID: str | None = Field(
default=None, description="The Access Key ID provided by Alibaba Cloud for API authentication." default=None, description="The Access Key ID provided by Alibaba Cloud for API authentication."
) )
ANALYTICDB_KEY_SECRET: Optional[str] = Field( ANALYTICDB_KEY_SECRET: str | None = Field(
default=None, description="The Secret Access Key corresponding to the Access Key ID for secure API access." default=None, description="The Secret Access Key corresponding to the Access Key ID for secure API access."
) )
ANALYTICDB_REGION_ID: Optional[str] = Field( ANALYTICDB_REGION_ID: str | None = Field(
default=None, default=None,
description="The region where the AnalyticDB instance is deployed (e.g., 'cn-hangzhou', 'ap-southeast-1').", description="The region where the AnalyticDB instance is deployed (e.g., 'cn-hangzhou', 'ap-southeast-1').",
) )
ANALYTICDB_INSTANCE_ID: Optional[str] = Field( ANALYTICDB_INSTANCE_ID: str | None = Field(
default=None, default=None,
description="The unique identifier of the AnalyticDB instance you want to connect to.", description="The unique identifier of the AnalyticDB instance you want to connect to.",
) )
ANALYTICDB_ACCOUNT: Optional[str] = Field( ANALYTICDB_ACCOUNT: str | None = Field(
default=None, default=None,
description="The account name used to log in to the AnalyticDB instance" description="The account name used to log in to the AnalyticDB instance"
" (usually the initial account created with the instance).", " (usually the initial account created with the instance).",
) )
ANALYTICDB_PASSWORD: Optional[str] = Field( ANALYTICDB_PASSWORD: str | None = Field(
default=None, description="The password associated with the AnalyticDB account for database authentication." default=None, description="The password associated with the AnalyticDB account for database authentication."
) )
ANALYTICDB_NAMESPACE: Optional[str] = Field( ANALYTICDB_NAMESPACE: str | None = Field(
default=None, description="The namespace within AnalyticDB for schema isolation (if using namespace feature)." default=None, description="The namespace within AnalyticDB for schema isolation (if using namespace feature)."
) )
ANALYTICDB_NAMESPACE_PASSWORD: Optional[str] = Field( ANALYTICDB_NAMESPACE_PASSWORD: str | None = Field(
default=None, default=None,
description="The password for accessing the specified namespace within the AnalyticDB instance" description="The password for accessing the specified namespace within the AnalyticDB instance"
" (if namespace feature is enabled).", " (if namespace feature is enabled).",
) )
ANALYTICDB_HOST: Optional[str] = Field( ANALYTICDB_HOST: str | None = Field(
default=None, description="The host of the AnalyticDB instance you want to connect to." default=None, description="The host of the AnalyticDB instance you want to connect to."
) )
ANALYTICDB_PORT: PositiveInt = Field( ANALYTICDB_PORT: PositiveInt = Field(


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, NonNegativeInt, PositiveInt from pydantic import Field, NonNegativeInt, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,7 +7,7 @@ class BaiduVectorDBConfig(BaseSettings):
Configuration settings for Baidu Vector Database Configuration settings for Baidu Vector Database
""" """
BAIDU_VECTOR_DB_ENDPOINT: Optional[str] = Field( BAIDU_VECTOR_DB_ENDPOINT: str | None = Field(
description="URL of the Baidu Vector Database service (e.g., 'http://vdb.bj.baidubce.com')", description="URL of the Baidu Vector Database service (e.g., 'http://vdb.bj.baidubce.com')",
default=None, default=None,
) )
@@ -19,17 +17,17 @@ class BaiduVectorDBConfig(BaseSettings):
default=30000, default=30000,
) )
BAIDU_VECTOR_DB_ACCOUNT: Optional[str] = Field( BAIDU_VECTOR_DB_ACCOUNT: str | None = Field(
description="Account for authenticating with the Baidu Vector Database", description="Account for authenticating with the Baidu Vector Database",
default=None, default=None,
) )
BAIDU_VECTOR_DB_API_KEY: Optional[str] = Field( BAIDU_VECTOR_DB_API_KEY: str | None = Field(
description="API key for authenticating with the Baidu Vector Database service", description="API key for authenticating with the Baidu Vector Database service",
default=None, default=None,
) )
BAIDU_VECTOR_DB_DATABASE: Optional[str] = Field( BAIDU_VECTOR_DB_DATABASE: str | None = Field(
description="Name of the specific Baidu Vector Database to connect to", description="Name of the specific Baidu Vector Database to connect to",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,7 +7,7 @@ class ChromaConfig(BaseSettings):
Configuration settings for Chroma vector database Configuration settings for Chroma vector database
""" """
CHROMA_HOST: Optional[str] = Field( CHROMA_HOST: str | None = Field(
description="Hostname or IP address of the Chroma server (e.g., 'localhost' or '192.168.1.100')", description="Hostname or IP address of the Chroma server (e.g., 'localhost' or '192.168.1.100')",
default=None, default=None,
) )
@@ -19,22 +17,22 @@ class ChromaConfig(BaseSettings):
default=8000, default=8000,
) )
CHROMA_TENANT: Optional[str] = Field( CHROMA_TENANT: str | None = Field(
description="Tenant identifier for multi-tenancy support in Chroma", description="Tenant identifier for multi-tenancy support in Chroma",
default=None, default=None,
) )
CHROMA_DATABASE: Optional[str] = Field( CHROMA_DATABASE: str | None = Field(
description="Name of the Chroma database to connect to", description="Name of the Chroma database to connect to",
default=None, default=None,
) )
CHROMA_AUTH_PROVIDER: Optional[str] = Field( CHROMA_AUTH_PROVIDER: str | None = Field(
description="Authentication provider for Chroma (e.g., 'basic', 'token', or a custom provider)", description="Authentication provider for Chroma (e.g., 'basic', 'token', or a custom provider)",
default=None, default=None,
) )
CHROMA_AUTH_CREDENTIALS: Optional[str] = Field( CHROMA_AUTH_CREDENTIALS: str | None = Field(
description="Authentication credentials for Chroma (format depends on the auth provider)", description="Authentication credentials for Chroma (format depends on the auth provider)",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,62 +7,62 @@ class ClickzettaConfig(BaseSettings):
Clickzetta Lakehouse vector database configuration Clickzetta Lakehouse vector database configuration
""" """
CLICKZETTA_USERNAME: Optional[str] = Field( CLICKZETTA_USERNAME: str | None = Field(
description="Username for authenticating with Clickzetta Lakehouse", description="Username for authenticating with Clickzetta Lakehouse",
default=None, default=None,
) )
CLICKZETTA_PASSWORD: Optional[str] = Field( CLICKZETTA_PASSWORD: str | None = Field(
description="Password for authenticating with Clickzetta Lakehouse", description="Password for authenticating with Clickzetta Lakehouse",
default=None, default=None,
) )
CLICKZETTA_INSTANCE: Optional[str] = Field( CLICKZETTA_INSTANCE: str | None = Field(
description="Clickzetta Lakehouse instance ID", description="Clickzetta Lakehouse instance ID",
default=None, default=None,
) )
CLICKZETTA_SERVICE: Optional[str] = Field( CLICKZETTA_SERVICE: str | None = Field(
description="Clickzetta API service endpoint (e.g., 'api.clickzetta.com')", description="Clickzetta API service endpoint (e.g., 'api.clickzetta.com')",
default="api.clickzetta.com", default="api.clickzetta.com",
) )
CLICKZETTA_WORKSPACE: Optional[str] = Field( CLICKZETTA_WORKSPACE: str | None = Field(
description="Clickzetta workspace name", description="Clickzetta workspace name",
default="default", default="default",
) )
CLICKZETTA_VCLUSTER: Optional[str] = Field( CLICKZETTA_VCLUSTER: str | None = Field(
description="Clickzetta virtual cluster name", description="Clickzetta virtual cluster name",
default="default_ap", default="default_ap",
) )
CLICKZETTA_SCHEMA: Optional[str] = Field( CLICKZETTA_SCHEMA: str | None = Field(
description="Database schema name in Clickzetta", description="Database schema name in Clickzetta",
default="public", default="public",
) )
CLICKZETTA_BATCH_SIZE: Optional[int] = Field( CLICKZETTA_BATCH_SIZE: int | None = Field(
description="Batch size for bulk insert operations", description="Batch size for bulk insert operations",
default=100, default=100,
) )
CLICKZETTA_ENABLE_INVERTED_INDEX: Optional[bool] = Field( CLICKZETTA_ENABLE_INVERTED_INDEX: bool | None = Field(
description="Enable inverted index for full-text search capabilities", description="Enable inverted index for full-text search capabilities",
default=True, default=True,
) )
CLICKZETTA_ANALYZER_TYPE: Optional[str] = Field( CLICKZETTA_ANALYZER_TYPE: str | None = Field(
description="Analyzer type for full-text search: keyword, english, chinese, unicode", description="Analyzer type for full-text search: keyword, english, chinese, unicode",
default="chinese", default="chinese",
) )
CLICKZETTA_ANALYZER_MODE: Optional[str] = Field( CLICKZETTA_ANALYZER_MODE: str | None = Field(
description="Analyzer mode for tokenization: max_word (fine-grained) or smart (intelligent)", description="Analyzer mode for tokenization: max_word (fine-grained) or smart (intelligent)",
default="smart", default="smart",
) )
CLICKZETTA_VECTOR_DISTANCE_FUNCTION: Optional[str] = Field( CLICKZETTA_VECTOR_DISTANCE_FUNCTION: str | None = Field(
description="Distance function for vector similarity: l2_distance or cosine_distance", description="Distance function for vector similarity: l2_distance or cosine_distance",
default="cosine_distance", default="cosine_distance",
) )
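Several of these fields are typed int | None or bool | None yet keep concrete defaults (100, True, "smart", and so on). The union only widens what the field accepts; it does not change the default. A short sketch with a hypothetical stand-in class:

from pydantic import Field
from pydantic_settings import BaseSettings

class ClickzettaDemo(BaseSettings):  # hypothetical stand-in for the real config class
    CLICKZETTA_BATCH_SIZE: int | None = Field(default=100)
    CLICKZETTA_ENABLE_INVERTED_INDEX: bool | None = Field(default=True)

settings = ClickzettaDemo()
print(settings.CLICKZETTA_BATCH_SIZE)             # -> 100, not None
print(settings.CLICKZETTA_ENABLE_INVERTED_INDEX)  # -> True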


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,27 +7,27 @@ class CouchbaseConfig(BaseSettings):
Couchbase configs Couchbase configs
""" """
COUCHBASE_CONNECTION_STRING: Optional[str] = Field( COUCHBASE_CONNECTION_STRING: str | None = Field(
description="COUCHBASE connection string", description="COUCHBASE connection string",
default=None, default=None,
) )
COUCHBASE_USER: Optional[str] = Field( COUCHBASE_USER: str | None = Field(
description="COUCHBASE user", description="COUCHBASE user",
default=None, default=None,
) )
COUCHBASE_PASSWORD: Optional[str] = Field( COUCHBASE_PASSWORD: str | None = Field(
description="COUCHBASE password", description="COUCHBASE password",
default=None, default=None,
) )
COUCHBASE_BUCKET_NAME: Optional[str] = Field( COUCHBASE_BUCKET_NAME: str | None = Field(
description="COUCHBASE bucket name", description="COUCHBASE bucket name",
default=None, default=None,
) )
COUCHBASE_SCOPE_NAME: Optional[str] = Field( COUCHBASE_SCOPE_NAME: str | None = Field(
description="COUCHBASE scope name", description="COUCHBASE scope name",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, PositiveInt, model_validator from pydantic import Field, PositiveInt, model_validator
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -10,7 +8,7 @@ class ElasticsearchConfig(BaseSettings):
Can load from environment variables or .env files. Can load from environment variables or .env files.
""" """
ELASTICSEARCH_HOST: Optional[str] = Field( ELASTICSEARCH_HOST: str | None = Field(
description="Hostname or IP address of the Elasticsearch server (e.g., 'localhost' or '192.168.1.100')", description="Hostname or IP address of the Elasticsearch server (e.g., 'localhost' or '192.168.1.100')",
default="127.0.0.1", default="127.0.0.1",
) )
@@ -20,30 +18,28 @@ class ElasticsearchConfig(BaseSettings):
default=9200, default=9200,
) )
ELASTICSEARCH_USERNAME: Optional[str] = Field( ELASTICSEARCH_USERNAME: str | None = Field(
description="Username for authenticating with Elasticsearch (default is 'elastic')", description="Username for authenticating with Elasticsearch (default is 'elastic')",
default="elastic", default="elastic",
) )
ELASTICSEARCH_PASSWORD: Optional[str] = Field( ELASTICSEARCH_PASSWORD: str | None = Field(
description="Password for authenticating with Elasticsearch (default is 'elastic')", description="Password for authenticating with Elasticsearch (default is 'elastic')",
default="elastic", default="elastic",
) )
# Elastic Cloud (optional) # Elastic Cloud (optional)
ELASTICSEARCH_USE_CLOUD: Optional[bool] = Field( ELASTICSEARCH_USE_CLOUD: bool | None = Field(
description="Set to True to use Elastic Cloud instead of self-hosted Elasticsearch", default=False description="Set to True to use Elastic Cloud instead of self-hosted Elasticsearch", default=False
) )
ELASTICSEARCH_CLOUD_URL: Optional[str] = Field( ELASTICSEARCH_CLOUD_URL: str | None = Field(
description="Full URL for Elastic Cloud deployment (e.g., 'https://example.es.region.aws.found.io:443')", description="Full URL for Elastic Cloud deployment (e.g., 'https://example.es.region.aws.found.io:443')",
default=None, default=None,
) )
ELASTICSEARCH_API_KEY: Optional[str] = Field( ELASTICSEARCH_API_KEY: str | None = Field(description="API key for authenticating with Elastic Cloud", default=None)
description="API key for authenticating with Elastic Cloud", default=None
)
# Common options # Common options
ELASTICSEARCH_CA_CERTS: Optional[str] = Field( ELASTICSEARCH_CA_CERTS: str | None = Field(
description="Path to CA certificate file for SSL verification", default=None description="Path to CA certificate file for SSL verification", default=None
) )
ELASTICSEARCH_VERIFY_CERTS: bool = Field( ELASTICSEARCH_VERIFY_CERTS: bool = Field(


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,17 +7,17 @@ class HuaweiCloudConfig(BaseSettings):
Configuration settings for Huawei cloud search service Configuration settings for Huawei cloud search service
""" """
HUAWEI_CLOUD_HOSTS: Optional[str] = Field( HUAWEI_CLOUD_HOSTS: str | None = Field(
description="Hostname or IP address of the Huawei cloud search service instance", description="Hostname or IP address of the Huawei cloud search service instance",
default=None, default=None,
) )
HUAWEI_CLOUD_USER: Optional[str] = Field( HUAWEI_CLOUD_USER: str | None = Field(
description="Username for authenticating with Huawei cloud search service", description="Username for authenticating with Huawei cloud search service",
default=None, default=None,
) )
HUAWEI_CLOUD_PASSWORD: Optional[str] = Field( HUAWEI_CLOUD_PASSWORD: str | None = Field(
description="Password for authenticating with Huawei cloud search service", description="Password for authenticating with Huawei cloud search service",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,27 +7,27 @@ class LindormConfig(BaseSettings):
Lindorm configs Lindorm configs
""" """
LINDORM_URL: Optional[str] = Field( LINDORM_URL: str | None = Field(
description="Lindorm url", description="Lindorm url",
default=None, default=None,
) )
LINDORM_USERNAME: Optional[str] = Field( LINDORM_USERNAME: str | None = Field(
description="Lindorm user", description="Lindorm user",
default=None, default=None,
) )
LINDORM_PASSWORD: Optional[str] = Field( LINDORM_PASSWORD: str | None = Field(
description="Lindorm password", description="Lindorm password",
default=None, default=None,
) )
DEFAULT_INDEX_TYPE: Optional[str] = Field( LINDORM_INDEX_TYPE: str | None = Field(
description="Lindorm Vector Index Type, hnsw or flat is available in dify", description="Lindorm Vector Index Type, hnsw or flat is available in dify",
default="hnsw", default="hnsw",
) )
DEFAULT_DISTANCE_TYPE: Optional[str] = Field( LINDORM_DISTANCE_TYPE: str | None = Field(
description="Vector Distance Type, support l2, cosinesimil, innerproduct", default="l2" description="Vector Distance Type, support l2, cosinesimil, innerproduct", default="l2"
) )
USING_UGC_INDEX: Optional[bool] = Field( LINDORM_USING_UGC: bool | None = Field(
description="Using UGC index will store the same type of Index in a single index but can retrieve separately.", description="Using UGC index will store indexes with the same IndexType/Dimension in a single big index.",
default=False, default=True,
) )
LINDORM_QUERY_TIMEOUT: Optional[float] = Field(description="The lindorm search request timeout (s)", default=2.0) LINDORM_QUERY_TIMEOUT: float | None = Field(description="The lindorm search request timeout (s)", default=2.0)
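Beyond the Optional rewrite, this file renames three fields: DEFAULT_INDEX_TYPE becomes LINDORM_INDEX_TYPE, DEFAULT_DISTANCE_TYPE becomes LINDORM_DISTANCE_TYPE, and USING_UGC_INDEX becomes LINDORM_USING_UGC (whose default flips from False to True). Because pydantic-settings resolves environment variables by field name, deployments would need to rename the corresponding variables. A sketch with a hypothetical stand-in class:

import os
from pydantic import Field
from pydantic_settings import BaseSettings

class LindormDemo(BaseSettings):  # hypothetical stand-in for the real LindormConfig
    LINDORM_INDEX_TYPE: str | None = Field(default="hnsw")
    LINDORM_USING_UGC: bool | None = Field(default=True)  # default flips to True in this diff

os.environ["LINDORM_USING_UGC"] = "false"  # the env var name follows the renamed field
print(LindormDemo().LINDORM_USING_UGC)     # -> False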


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,22 +7,22 @@ class MilvusConfig(BaseSettings):
Configuration settings for Milvus vector database Configuration settings for Milvus vector database
""" """
MILVUS_URI: Optional[str] = Field( MILVUS_URI: str | None = Field(
description="URI for connecting to the Milvus server (e.g., 'http://localhost:19530' or 'https://milvus-instance.example.com:19530')", description="URI for connecting to the Milvus server (e.g., 'http://localhost:19530' or 'https://milvus-instance.example.com:19530')",
default="http://127.0.0.1:19530", default="http://127.0.0.1:19530",
) )
MILVUS_TOKEN: Optional[str] = Field( MILVUS_TOKEN: str | None = Field(
description="Authentication token for Milvus, if token-based authentication is enabled", description="Authentication token for Milvus, if token-based authentication is enabled",
default=None, default=None,
) )
MILVUS_USER: Optional[str] = Field( MILVUS_USER: str | None = Field(
description="Username for authenticating with Milvus, if username/password authentication is enabled", description="Username for authenticating with Milvus, if username/password authentication is enabled",
default=None, default=None,
) )
MILVUS_PASSWORD: Optional[str] = Field( MILVUS_PASSWORD: str | None = Field(
description="Password for authenticating with Milvus, if username/password authentication is enabled", description="Password for authenticating with Milvus, if username/password authentication is enabled",
default=None, default=None,
) )
@@ -40,7 +38,7 @@ class MilvusConfig(BaseSettings):
default=True, default=True,
) )
MILVUS_ANALYZER_PARAMS: Optional[str] = Field( MILVUS_ANALYZER_PARAMS: str | None = Field(
description='Milvus text analyzer parameters, e.g., {"type": "chinese"} for Chinese segmentation support.', description='Milvus text analyzer parameters, e.g., {"type": "chinese"} for Chinese segmentation support.',
default=None, default=None,
) )
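MILVUS_ANALYZER_PARAMS is documented as a JSON string such as {"type": "chinese"}. A minimal sketch of how a consumer would parse it; this is illustrative, not the exact downstream code:

import json

analyzer_params = json.loads('{"type": "chinese"}')  # example value taken from the field description
print(analyzer_params["type"])  # -> chinese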


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,27 +7,27 @@ class OceanBaseVectorConfig(BaseSettings):
Configuration settings for OceanBase Vector database Configuration settings for OceanBase Vector database
""" """
OCEANBASE_VECTOR_HOST: Optional[str] = Field( OCEANBASE_VECTOR_HOST: str | None = Field(
description="Hostname or IP address of the OceanBase Vector server (e.g. 'localhost')", description="Hostname or IP address of the OceanBase Vector server (e.g. 'localhost')",
default=None, default=None,
) )
OCEANBASE_VECTOR_PORT: Optional[PositiveInt] = Field( OCEANBASE_VECTOR_PORT: PositiveInt | None = Field(
description="Port number on which the OceanBase Vector server is listening (default is 2881)", description="Port number on which the OceanBase Vector server is listening (default is 2881)",
default=2881, default=2881,
) )
OCEANBASE_VECTOR_USER: Optional[str] = Field( OCEANBASE_VECTOR_USER: str | None = Field(
description="Username for authenticating with the OceanBase Vector database", description="Username for authenticating with the OceanBase Vector database",
default=None, default=None,
) )
OCEANBASE_VECTOR_PASSWORD: Optional[str] = Field( OCEANBASE_VECTOR_PASSWORD: str | None = Field(
description="Password for authenticating with the OceanBase Vector database", description="Password for authenticating with the OceanBase Vector database",
default=None, default=None,
) )
OCEANBASE_VECTOR_DATABASE: Optional[str] = Field( OCEANBASE_VECTOR_DATABASE: str | None = Field(
description="Name of the OceanBase Vector database to connect to", description="Name of the OceanBase Vector database to connect to",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,7 +7,7 @@ class OpenGaussConfig(BaseSettings):
Configuration settings for OpenGauss Configuration settings for OpenGauss
""" """
OPENGAUSS_HOST: Optional[str] = Field( OPENGAUSS_HOST: str | None = Field(
description="Hostname or IP address of the OpenGauss server(e.g., 'localhost')", description="Hostname or IP address of the OpenGauss server(e.g., 'localhost')",
default=None, default=None,
) )
@@ -19,17 +17,17 @@ class OpenGaussConfig(BaseSettings):
default=6600, default=6600,
) )
OPENGAUSS_USER: Optional[str] = Field( OPENGAUSS_USER: str | None = Field(
description="Username for authenticating with the OpenGauss database", description="Username for authenticating with the OpenGauss database",
default=None, default=None,
) )
OPENGAUSS_PASSWORD: Optional[str] = Field( OPENGAUSS_PASSWORD: str | None = Field(
description="Password for authenticating with the OpenGauss database", description="Password for authenticating with the OpenGauss database",
default=None, default=None,
) )
OPENGAUSS_DATABASE: Optional[str] = Field( OPENGAUSS_DATABASE: str | None = Field(
description="Name of the OpenGauss database to connect to", description="Name of the OpenGauss database to connect to",
default=None, default=None,
) )


@@ -1,5 +1,5 @@
import enum from enum import Enum
from typing import Literal, Optional from typing import Literal
from pydantic import Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -10,7 +10,7 @@ class OpenSearchConfig(BaseSettings):
Configuration settings for OpenSearch Configuration settings for OpenSearch
""" """
class AuthMethod(enum.StrEnum): class AuthMethod(Enum):
""" """
Authentication method for OpenSearch Authentication method for OpenSearch
""" """
@@ -18,7 +18,7 @@ class OpenSearchConfig(BaseSettings):
BASIC = "basic" BASIC = "basic"
AWS_MANAGED_IAM = "aws_managed_iam" AWS_MANAGED_IAM = "aws_managed_iam"
OPENSEARCH_HOST: Optional[str] = Field( OPENSEARCH_HOST: str | None = Field(
description="Hostname or IP address of the OpenSearch server (e.g., 'localhost' or 'opensearch.example.com')", description="Hostname or IP address of the OpenSearch server (e.g., 'localhost' or 'opensearch.example.com')",
default=None, default=None,
) )
@@ -43,21 +43,21 @@ class OpenSearchConfig(BaseSettings):
default=AuthMethod.BASIC, default=AuthMethod.BASIC,
) )
OPENSEARCH_USER: Optional[str] = Field( OPENSEARCH_USER: str | None = Field(
description="Username for authenticating with OpenSearch", description="Username for authenticating with OpenSearch",
default=None, default=None,
) )
OPENSEARCH_PASSWORD: Optional[str] = Field( OPENSEARCH_PASSWORD: str | None = Field(
description="Password for authenticating with OpenSearch", description="Password for authenticating with OpenSearch",
default=None, default=None,
) )
OPENSEARCH_AWS_REGION: Optional[str] = Field( OPENSEARCH_AWS_REGION: str | None = Field(
description="AWS region for OpenSearch (e.g. 'us-west-2')", description="AWS region for OpenSearch (e.g. 'us-west-2')",
default=None, default=None,
) )
OPENSEARCH_AWS_SERVICE: Optional[Literal["es", "aoss"]] = Field( OPENSEARCH_AWS_SERVICE: Literal["es", "aoss"] | None = Field(
description="AWS service for OpenSearch (e.g. 'aoss' for OpenSearch Serverless)", default=None description="AWS service for OpenSearch (e.g. 'aoss' for OpenSearch Serverless)", default=None
) )
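One change in this file goes beyond typing syntax: AuthMethod moves from enum.StrEnum to a plain Enum. The sketch below (Python 3.11+ for StrEnum; class names are hypothetical) shows the practical difference; whether it matters here depends on how OPENSEARCH_AUTH_METHOD is compared elsewhere, which this hunk does not show:

from enum import Enum, StrEnum

class AuthBefore(StrEnum):  # behaviour before this diff (sketch)
    BASIC = "basic"

class AuthAfter(Enum):      # behaviour after this diff (sketch)
    BASIC = "basic"

print(AuthBefore.BASIC == "basic")       # True: StrEnum members compare equal to their string value
print(AuthAfter.BASIC == "basic")        # False: plain Enum members do not
print(AuthAfter.BASIC.value == "basic")  # True: the value must be compared explicitly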


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,33 +7,33 @@ class OracleConfig(BaseSettings):
Configuration settings for Oracle database Configuration settings for Oracle database
""" """
ORACLE_USER: Optional[str] = Field( ORACLE_USER: str | None = Field(
description="Username for authenticating with the Oracle database", description="Username for authenticating with the Oracle database",
default=None, default=None,
) )
ORACLE_PASSWORD: Optional[str] = Field( ORACLE_PASSWORD: str | None = Field(
description="Password for authenticating with the Oracle database", description="Password for authenticating with the Oracle database",
default=None, default=None,
) )
ORACLE_DSN: Optional[str] = Field( ORACLE_DSN: str | None = Field(
description="Oracle database connection string. For traditional database, use format 'host:port/service_name'. " description="Oracle database connection string. For traditional database, use format 'host:port/service_name'. "
"For autonomous database, use the service name from tnsnames.ora in the wallet", "For autonomous database, use the service name from tnsnames.ora in the wallet",
default=None, default=None,
) )
ORACLE_CONFIG_DIR: Optional[str] = Field( ORACLE_CONFIG_DIR: str | None = Field(
description="Directory containing the tnsnames.ora configuration file. Only used in thin mode connection", description="Directory containing the tnsnames.ora configuration file. Only used in thin mode connection",
default=None, default=None,
) )
ORACLE_WALLET_LOCATION: Optional[str] = Field( ORACLE_WALLET_LOCATION: str | None = Field(
description="Oracle wallet directory path containing the wallet files for secure connection", description="Oracle wallet directory path containing the wallet files for secure connection",
default=None, default=None,
) )
ORACLE_WALLET_PASSWORD: Optional[str] = Field( ORACLE_WALLET_PASSWORD: str | None = Field(
description="Password to decrypt the Oracle wallet, if it is encrypted", description="Password to decrypt the Oracle wallet, if it is encrypted",
default=None, default=None,
) )
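The ORACLE_DSN description distinguishes two connection styles: a plain 'host:port/service_name' string for a traditional database, or a tnsnames.ora alias resolved through the wallet for an autonomous database. A hedged sketch assuming the python-oracledb driver; host, credentials, and service name are placeholders:

import oracledb  # assumption: python-oracledb is the driver consuming these settings

# Traditional database: ORACLE_DSN in 'host:port/service_name' form (placeholder values)
conn = oracledb.connect(user="app_user", password="app_password", dsn="dbhost.example.com:1521/orclpdb1")

# Autonomous database: ORACLE_DSN would instead be a tnsnames.ora alias, with
# ORACLE_CONFIG_DIR and ORACLE_WALLET_LOCATION pointing at the wallet directory.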


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,7 +7,7 @@ class PGVectorConfig(BaseSettings):
Configuration settings for PGVector (PostgreSQL with vector extension) Configuration settings for PGVector (PostgreSQL with vector extension)
""" """
PGVECTOR_HOST: Optional[str] = Field( PGVECTOR_HOST: str | None = Field(
description="Hostname or IP address of the PostgreSQL server with PGVector extension (e.g., 'localhost')", description="Hostname or IP address of the PostgreSQL server with PGVector extension (e.g., 'localhost')",
default=None, default=None,
) )
@@ -19,17 +17,17 @@ class PGVectorConfig(BaseSettings):
default=5433, default=5433,
) )
PGVECTOR_USER: Optional[str] = Field( PGVECTOR_USER: str | None = Field(
description="Username for authenticating with the PostgreSQL database", description="Username for authenticating with the PostgreSQL database",
default=None, default=None,
) )
PGVECTOR_PASSWORD: Optional[str] = Field( PGVECTOR_PASSWORD: str | None = Field(
description="Password for authenticating with the PostgreSQL database", description="Password for authenticating with the PostgreSQL database",
default=None, default=None,
) )
PGVECTOR_DATABASE: Optional[str] = Field( PGVECTOR_DATABASE: str | None = Field(
description="Name of the PostgreSQL database to connect to", description="Name of the PostgreSQL database to connect to",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,7 +7,7 @@ class PGVectoRSConfig(BaseSettings):
Configuration settings for PGVecto.RS (Rust-based vector extension for PostgreSQL) Configuration settings for PGVecto.RS (Rust-based vector extension for PostgreSQL)
""" """
PGVECTO_RS_HOST: Optional[str] = Field( PGVECTO_RS_HOST: str | None = Field(
description="Hostname or IP address of the PostgreSQL server with PGVecto.RS extension (e.g., 'localhost')", description="Hostname or IP address of the PostgreSQL server with PGVecto.RS extension (e.g., 'localhost')",
default=None, default=None,
) )
@@ -19,17 +17,17 @@ class PGVectoRSConfig(BaseSettings):
default=5431, default=5431,
) )
PGVECTO_RS_USER: Optional[str] = Field( PGVECTO_RS_USER: str | None = Field(
description="Username for authenticating with the PostgreSQL database using PGVecto.RS", description="Username for authenticating with the PostgreSQL database using PGVecto.RS",
default=None, default=None,
) )
PGVECTO_RS_PASSWORD: Optional[str] = Field( PGVECTO_RS_PASSWORD: str | None = Field(
description="Password for authenticating with the PostgreSQL database using PGVecto.RS", description="Password for authenticating with the PostgreSQL database using PGVecto.RS",
default=None, default=None,
) )
PGVECTO_RS_DATABASE: Optional[str] = Field( PGVECTO_RS_DATABASE: str | None = Field(
description="Name of the PostgreSQL database with PGVecto.RS extension to connect to", description="Name of the PostgreSQL database with PGVecto.RS extension to connect to",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, NonNegativeInt, PositiveInt from pydantic import Field, NonNegativeInt, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,12 +7,12 @@ class QdrantConfig(BaseSettings):
Configuration settings for Qdrant vector database Configuration settings for Qdrant vector database
""" """
QDRANT_URL: Optional[str] = Field( QDRANT_URL: str | None = Field(
description="URL of the Qdrant server (e.g., 'http://localhost:6333' or 'https://qdrant.example.com')", description="URL of the Qdrant server (e.g., 'http://localhost:6333' or 'https://qdrant.example.com')",
default=None, default=None,
) )
QDRANT_API_KEY: Optional[str] = Field( QDRANT_API_KEY: str | None = Field(
description="API key for authenticating with the Qdrant server", description="API key for authenticating with the Qdrant server",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,7 +7,7 @@ class RelytConfig(BaseSettings):
Configuration settings for Relyt database Configuration settings for Relyt database
""" """
RELYT_HOST: Optional[str] = Field( RELYT_HOST: str | None = Field(
description="Hostname or IP address of the Relyt server (e.g., 'localhost' or 'relyt.example.com')", description="Hostname or IP address of the Relyt server (e.g., 'localhost' or 'relyt.example.com')",
default=None, default=None,
) )
@@ -19,17 +17,17 @@ class RelytConfig(BaseSettings):
default=9200, default=9200,
) )
RELYT_USER: Optional[str] = Field( RELYT_USER: str | None = Field(
description="Username for authenticating with the Relyt database", description="Username for authenticating with the Relyt database",
default=None, default=None,
) )
RELYT_PASSWORD: Optional[str] = Field( RELYT_PASSWORD: str | None = Field(
description="Password for authenticating with the Relyt database", description="Password for authenticating with the Relyt database",
default=None, default=None,
) )
RELYT_DATABASE: Optional[str] = Field( RELYT_DATABASE: str | None = Field(
description="Name of the Relyt database to connect to (default is 'default')", description="Name of the Relyt database to connect to (default is 'default')",
default="default", default="default",
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,22 +7,22 @@ class TableStoreConfig(BaseSettings):
Configuration settings for TableStore. Configuration settings for TableStore.
""" """
TABLESTORE_ENDPOINT: Optional[str] = Field( TABLESTORE_ENDPOINT: str | None = Field(
description="Endpoint address of the TableStore server (e.g. 'https://instance-name.cn-hangzhou.ots.aliyuncs.com')", description="Endpoint address of the TableStore server (e.g. 'https://instance-name.cn-hangzhou.ots.aliyuncs.com')",
default=None, default=None,
) )
TABLESTORE_INSTANCE_NAME: Optional[str] = Field( TABLESTORE_INSTANCE_NAME: str | None = Field(
description="Instance name to access TableStore server (eg. 'instance-name')", description="Instance name to access TableStore server (eg. 'instance-name')",
default=None, default=None,
) )
TABLESTORE_ACCESS_KEY_ID: Optional[str] = Field( TABLESTORE_ACCESS_KEY_ID: str | None = Field(
description="AccessKey id for the instance name", description="AccessKey id for the instance name",
default=None, default=None,
) )
TABLESTORE_ACCESS_KEY_SECRET: Optional[str] = Field( TABLESTORE_ACCESS_KEY_SECRET: str | None = Field(
description="AccessKey secret for the instance name", description="AccessKey secret for the instance name",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, NonNegativeInt, PositiveInt from pydantic import Field, NonNegativeInt, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,12 +7,12 @@ class TencentVectorDBConfig(BaseSettings):
Configuration settings for Tencent Vector Database Configuration settings for Tencent Vector Database
""" """
TENCENT_VECTOR_DB_URL: Optional[str] = Field( TENCENT_VECTOR_DB_URL: str | None = Field(
description="URL of the Tencent Vector Database service (e.g., 'https://vectordb.tencentcloudapi.com')", description="URL of the Tencent Vector Database service (e.g., 'https://vectordb.tencentcloudapi.com')",
default=None, default=None,
) )
TENCENT_VECTOR_DB_API_KEY: Optional[str] = Field( TENCENT_VECTOR_DB_API_KEY: str | None = Field(
description="API key for authenticating with the Tencent Vector Database service", description="API key for authenticating with the Tencent Vector Database service",
default=None, default=None,
) )
@@ -24,12 +22,12 @@ class TencentVectorDBConfig(BaseSettings):
default=30, default=30,
) )
TENCENT_VECTOR_DB_USERNAME: Optional[str] = Field( TENCENT_VECTOR_DB_USERNAME: str | None = Field(
description="Username for authenticating with the Tencent Vector Database (if required)", description="Username for authenticating with the Tencent Vector Database (if required)",
default=None, default=None,
) )
TENCENT_VECTOR_DB_PASSWORD: Optional[str] = Field( TENCENT_VECTOR_DB_PASSWORD: str | None = Field(
description="Password for authenticating with the Tencent Vector Database (if required)", description="Password for authenticating with the Tencent Vector Database (if required)",
default=None, default=None,
) )
@@ -44,7 +42,7 @@ class TencentVectorDBConfig(BaseSettings):
default=2, default=2,
) )
TENCENT_VECTOR_DB_DATABASE: Optional[str] = Field( TENCENT_VECTOR_DB_DATABASE: str | None = Field(
description="Name of the specific Tencent Vector Database to connect to", description="Name of the specific Tencent Vector Database to connect to",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, NonNegativeInt, PositiveInt from pydantic import Field, NonNegativeInt, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,12 +7,12 @@ class TidbOnQdrantConfig(BaseSettings):
Tidb on Qdrant configs Tidb on Qdrant configs
""" """
TIDB_ON_QDRANT_URL: Optional[str] = Field( TIDB_ON_QDRANT_URL: str | None = Field(
description="Tidb on Qdrant url", description="Tidb on Qdrant url",
default=None, default=None,
) )
TIDB_ON_QDRANT_API_KEY: Optional[str] = Field( TIDB_ON_QDRANT_API_KEY: str | None = Field(
description="Tidb on Qdrant api key", description="Tidb on Qdrant api key",
default=None, default=None,
) )
@@ -34,37 +32,37 @@ class TidbOnQdrantConfig(BaseSettings):
default=6334, default=6334,
) )
TIDB_PUBLIC_KEY: Optional[str] = Field( TIDB_PUBLIC_KEY: str | None = Field(
description="Tidb account public key", description="Tidb account public key",
default=None, default=None,
) )
TIDB_PRIVATE_KEY: Optional[str] = Field( TIDB_PRIVATE_KEY: str | None = Field(
description="Tidb account private key", description="Tidb account private key",
default=None, default=None,
) )
TIDB_API_URL: Optional[str] = Field( TIDB_API_URL: str | None = Field(
description="Tidb API url", description="Tidb API url",
default=None, default=None,
) )
TIDB_IAM_API_URL: Optional[str] = Field( TIDB_IAM_API_URL: str | None = Field(
description="Tidb IAM API url", description="Tidb IAM API url",
default=None, default=None,
) )
TIDB_REGION: Optional[str] = Field( TIDB_REGION: str | None = Field(
description="Tidb serverless region", description="Tidb serverless region",
default="regions/aws-us-east-1", default="regions/aws-us-east-1",
) )
TIDB_PROJECT_ID: Optional[str] = Field( TIDB_PROJECT_ID: str | None = Field(
description="Tidb project id", description="Tidb project id",
default=None, default=None,
) )
TIDB_SPEND_LIMIT: Optional[int] = Field( TIDB_SPEND_LIMIT: int | None = Field(
description="Tidb spend limit", description="Tidb spend limit",
default=100, default=100,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,27 +7,27 @@ class TiDBVectorConfig(BaseSettings):
Configuration settings for TiDB Vector database Configuration settings for TiDB Vector database
""" """
TIDB_VECTOR_HOST: Optional[str] = Field( TIDB_VECTOR_HOST: str | None = Field(
description="Hostname or IP address of the TiDB Vector server (e.g., 'localhost' or 'tidb.example.com')", description="Hostname or IP address of the TiDB Vector server (e.g., 'localhost' or 'tidb.example.com')",
default=None, default=None,
) )
TIDB_VECTOR_PORT: Optional[PositiveInt] = Field( TIDB_VECTOR_PORT: PositiveInt | None = Field(
description="Port number on which the TiDB Vector server is listening (default is 4000)", description="Port number on which the TiDB Vector server is listening (default is 4000)",
default=4000, default=4000,
) )
TIDB_VECTOR_USER: Optional[str] = Field( TIDB_VECTOR_USER: str | None = Field(
description="Username for authenticating with the TiDB Vector database", description="Username for authenticating with the TiDB Vector database",
default=None, default=None,
) )
TIDB_VECTOR_PASSWORD: Optional[str] = Field( TIDB_VECTOR_PASSWORD: str | None = Field(
description="Password for authenticating with the TiDB Vector database", description="Password for authenticating with the TiDB Vector database",
default=None, default=None,
) )
TIDB_VECTOR_DATABASE: Optional[str] = Field( TIDB_VECTOR_DATABASE: str | None = Field(
description="Name of the TiDB Vector database to connect to", description="Name of the TiDB Vector database to connect to",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,12 +7,12 @@ class UpstashConfig(BaseSettings):
Configuration settings for Upstash vector database Configuration settings for Upstash vector database
""" """
UPSTASH_VECTOR_URL: Optional[str] = Field( UPSTASH_VECTOR_URL: str | None = Field(
description="URL of the upstash server (e.g., 'https://vector.upstash.io')", description="URL of the upstash server (e.g., 'https://vector.upstash.io')",
default=None, default=None,
) )
UPSTASH_VECTOR_TOKEN: Optional[str] = Field( UPSTASH_VECTOR_TOKEN: str | None = Field(
description="Token for authenticating with the upstash server", description="Token for authenticating with the upstash server",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,7 +7,7 @@ class VastbaseVectorConfig(BaseSettings):
Configuration settings for Vector (Vastbase with vector extension) Configuration settings for Vector (Vastbase with vector extension)
""" """
VASTBASE_HOST: Optional[str] = Field( VASTBASE_HOST: str | None = Field(
description="Hostname or IP address of the Vastbase server with Vector extension (e.g., 'localhost')", description="Hostname or IP address of the Vastbase server with Vector extension (e.g., 'localhost')",
default=None, default=None,
) )
@@ -19,17 +17,17 @@ class VastbaseVectorConfig(BaseSettings):
default=5432, default=5432,
) )
VASTBASE_USER: Optional[str] = Field( VASTBASE_USER: str | None = Field(
description="Username for authenticating with the Vastbase database", description="Username for authenticating with the Vastbase database",
default=None, default=None,
) )
VASTBASE_PASSWORD: Optional[str] = Field( VASTBASE_PASSWORD: str | None = Field(
description="Password for authenticating with the Vastbase database", description="Password for authenticating with the Vastbase database",
default=None, default=None,
) )
VASTBASE_DATABASE: Optional[str] = Field( VASTBASE_DATABASE: str | None = Field(
description="Name of the Vastbase database to connect to", description="Name of the Vastbase database to connect to",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field from pydantic import Field
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -11,14 +9,14 @@ class VikingDBConfig(BaseSettings):
https://www.volcengine.com/docs/6291/65568 https://www.volcengine.com/docs/6291/65568
""" """
VIKINGDB_ACCESS_KEY: Optional[str] = Field( VIKINGDB_ACCESS_KEY: str | None = Field(
description="The Access Key provided by Volcengine VikingDB for API authentication." description="The Access Key provided by Volcengine VikingDB for API authentication."
"Refer to the following documentation for details on obtaining credentials:" "Refer to the following documentation for details on obtaining credentials:"
"https://www.volcengine.com/docs/6291/65568", "https://www.volcengine.com/docs/6291/65568",
default=None, default=None,
) )
VIKINGDB_SECRET_KEY: Optional[str] = Field( VIKINGDB_SECRET_KEY: str | None = Field(
description="The Secret Key provided by Volcengine VikingDB for API authentication.", description="The Secret Key provided by Volcengine VikingDB for API authentication.",
default=None, default=None,
) )


@@ -1,5 +1,3 @@
from typing import Optional
from pydantic import Field, PositiveInt from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings from pydantic_settings import BaseSettings
@@ -9,12 +7,12 @@ class WeaviateConfig(BaseSettings):
Configuration settings for Weaviate vector database Configuration settings for Weaviate vector database
""" """
WEAVIATE_ENDPOINT: Optional[str] = Field( WEAVIATE_ENDPOINT: str | None = Field(
description="URL of the Weaviate server (e.g., 'http://localhost:8080' or 'https://weaviate.example.com')", description="URL of the Weaviate server (e.g., 'http://localhost:8080' or 'https://weaviate.example.com')",
default=None, default=None,
) )
WEAVIATE_API_KEY: Optional[str] = Field( WEAVIATE_API_KEY: str | None = Field(
description="API key for authenticating with the Weaviate server", description="API key for authenticating with the Weaviate server",
default=None, default=None,
) )


@@ -1,5 +1,5 @@
from collections.abc import Mapping from collections.abc import Mapping
from typing import Any, Optional from typing import Any
from pydantic import Field from pydantic import Field
from pydantic.fields import FieldInfo from pydantic.fields import FieldInfo
@@ -15,22 +15,22 @@ class ApolloSettingsSourceInfo(BaseSettings):
Packaging build information Packaging build information
""" """
APOLLO_APP_ID: Optional[str] = Field( APOLLO_APP_ID: str | None = Field(
description="apollo app_id", description="apollo app_id",
default=None, default=None,
) )
APOLLO_CLUSTER: Optional[str] = Field( APOLLO_CLUSTER: str | None = Field(
description="apollo cluster", description="apollo cluster",
default=None, default=None,
) )
APOLLO_CONFIG_URL: Optional[str] = Field( APOLLO_CONFIG_URL: str | None = Field(
description="apollo config url", description="apollo config url",
default=None, default=None,
) )
APOLLO_NAMESPACE: Optional[str] = Field( APOLLO_NAMESPACE: str | None = Field(
description="apollo namespace", description="apollo namespace",
default=None, default=None,
) )

View File

@@ -29,7 +29,7 @@ def no_key_cache_key(namespace: str, key: str) -> str:
# Returns whether the obtained value is obtained, and None if it does not # Returns whether the obtained value is obtained, and None if it does not
def get_value_from_dict(namespace_cache: dict[str, Any] | None, key: str) -> Any | None: def get_value_from_dict(namespace_cache: dict[str, Any] | None, key: str) -> Any:
if namespace_cache: if namespace_cache:
kv_data = namespace_cache.get(CONFIGURATIONS) kv_data = namespace_cache.get(CONFIGURATIONS)
if kv_data is None: if kv_data is None:
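
In the Apollo helper the return annotation is trimmed to plain `Any`; since `Any` already admits `None`, the extra `| None` carried no information. A rough sketch of how such a lookup helper reads (the `CONFIGURATIONS` key and the fallback behaviour here are illustrative, not the Apollo client's actual code):

```python
from typing import Any

CONFIGURATIONS = "configurations"  # hypothetical cache key, for illustration only


def get_value_from_dict(namespace_cache: dict[str, Any] | None, key: str) -> Any:
    # `Any` already covers None, so `Any | None` would say nothing extra.
    if not namespace_cache:
        return None
    kv_data = namespace_cache.get(CONFIGURATIONS)
    return None if kv_data is None else kv_data.get(key)
```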

View File

@@ -16,14 +16,14 @@ AUDIO_EXTENSIONS = ["mp3", "m4a", "wav", "amr", "mpga"]
AUDIO_EXTENSIONS.extend([ext.upper() for ext in AUDIO_EXTENSIONS]) AUDIO_EXTENSIONS.extend([ext.upper() for ext in AUDIO_EXTENSIONS])
_doc_extensions: list[str]
if dify_config.ETL_TYPE == "Unstructured": if dify_config.ETL_TYPE == "Unstructured":
DOCUMENT_EXTENSIONS = ["txt", "markdown", "md", "mdx", "pdf", "html", "htm", "xlsx", "xls", "vtt", "properties"] _doc_extensions = ["txt", "markdown", "md", "mdx", "pdf", "html", "htm", "xlsx", "xls", "vtt", "properties"]
DOCUMENT_EXTENSIONS.extend(("doc", "docx", "csv", "eml", "msg", "pptx", "xml", "epub")) _doc_extensions.extend(("doc", "docx", "csv", "eml", "msg", "pptx", "xml", "epub"))
if dify_config.UNSTRUCTURED_API_URL: if dify_config.UNSTRUCTURED_API_URL:
DOCUMENT_EXTENSIONS.append("ppt") _doc_extensions.append("ppt")
DOCUMENT_EXTENSIONS.extend([ext.upper() for ext in DOCUMENT_EXTENSIONS])
else: else:
DOCUMENT_EXTENSIONS = [ _doc_extensions = [
"txt", "txt",
"markdown", "markdown",
"md", "md",
@@ -38,4 +38,4 @@ else:
"vtt", "vtt",
"properties", "properties",
] ]
DOCUMENT_EXTENSIONS.extend([ext.upper() for ext in DOCUMENT_EXTENSIONS]) DOCUMENT_EXTENSIONS = _doc_extensions + [ext.upper() for ext in _doc_extensions]
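
The extensions refactor builds the branch-specific list in a temporary `_doc_extensions` and derives the exported constant in a single final expression, so the upper-case variants are added exactly once instead of inside each branch. A small sketch of the shape, with a stand-in boolean where the real code checks `dify_config.ETL_TYPE`:

```python
# Stand-in flag; the real code branches on dify_config.ETL_TYPE == "Unstructured".
use_unstructured = True

_doc_extensions: list[str]
if use_unstructured:
    _doc_extensions = ["txt", "markdown", "md", "pdf", "html"]
    _doc_extensions.extend(("doc", "docx", "csv"))
else:
    _doc_extensions = ["txt", "markdown", "md", "pdf", "html", "xlsx"]

# Lower-case entries first, then their upper-case twins, computed in one place.
DOCUMENT_EXTENSIONS = _doc_extensions + [ext.upper() for ext in _doc_extensions]

print(DOCUMENT_EXTENSIONS[:3], DOCUMENT_EXTENSIONS[-3:])
```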

View File

@@ -7,7 +7,7 @@ default_app_templates: Mapping[AppMode, Mapping] = {
# workflow default mode # workflow default mode
AppMode.WORKFLOW: { AppMode.WORKFLOW: {
"app": { "app": {
"mode": AppMode.WORKFLOW.value, "mode": AppMode.WORKFLOW,
"enable_site": True, "enable_site": True,
"enable_api": True, "enable_api": True,
} }
@@ -15,7 +15,7 @@ default_app_templates: Mapping[AppMode, Mapping] = {
# completion default mode # completion default mode
AppMode.COMPLETION: { AppMode.COMPLETION: {
"app": { "app": {
"mode": AppMode.COMPLETION.value, "mode": AppMode.COMPLETION,
"enable_site": True, "enable_site": True,
"enable_api": True, "enable_api": True,
}, },
@@ -44,7 +44,7 @@ default_app_templates: Mapping[AppMode, Mapping] = {
# chat default mode # chat default mode
AppMode.CHAT: { AppMode.CHAT: {
"app": { "app": {
"mode": AppMode.CHAT.value, "mode": AppMode.CHAT,
"enable_site": True, "enable_site": True,
"enable_api": True, "enable_api": True,
}, },
@@ -60,7 +60,7 @@ default_app_templates: Mapping[AppMode, Mapping] = {
# advanced-chat default mode # advanced-chat default mode
AppMode.ADVANCED_CHAT: { AppMode.ADVANCED_CHAT: {
"app": { "app": {
"mode": AppMode.ADVANCED_CHAT.value, "mode": AppMode.ADVANCED_CHAT,
"enable_site": True, "enable_site": True,
"enable_api": True, "enable_api": True,
}, },
@@ -68,7 +68,7 @@ default_app_templates: Mapping[AppMode, Mapping] = {
# agent-chat default mode # agent-chat default mode
AppMode.AGENT_CHAT: { AppMode.AGENT_CHAT: {
"app": { "app": {
"mode": AppMode.AGENT_CHAT.value, "mode": AppMode.AGENT_CHAT,
"enable_site": True, "enable_site": True,
"enable_api": True, "enable_api": True,
}, },
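
Dropping `.value` from the template dicts works if `AppMode` is a str-backed enum, where the member compares and serializes like its string value. A sketch under that assumption (the real enum lives in `models.model`):

```python
import json
from enum import StrEnum


class AppMode(StrEnum):  # assumption: the real AppMode behaves like a str-backed enum
    WORKFLOW = "workflow"
    COMPLETION = "completion"
    CHAT = "chat"


# With a str subclass, the member itself is usable where the raw string was.
template = {"app": {"mode": AppMode.WORKFLOW, "enable_site": True}}
assert template["app"]["mode"] == "workflow"
print(json.dumps(template))  # {"app": {"mode": "workflow", "enable_site": true}}
```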

View File

@@ -8,7 +8,6 @@ if TYPE_CHECKING:
from core.model_runtime.entities.model_entities import AIModelEntity from core.model_runtime.entities.model_entities import AIModelEntity
from core.plugin.entities.plugin_daemon import PluginModelProviderEntity from core.plugin.entities.plugin_daemon import PluginModelProviderEntity
from core.tools.plugin_tool.provider import PluginToolProviderController from core.tools.plugin_tool.provider import PluginToolProviderController
from core.workflow.entities.variable_pool import VariablePool
""" """

View File

@@ -1,4 +1,5 @@
from flask import Blueprint from flask import Blueprint
from flask_restx import Namespace
from libs.external_api import ExternalApi from libs.external_api import ExternalApi
@@ -26,7 +27,16 @@ from .files import FileApi, FilePreviewApi, FileSupportTypeApi
from .remote_files import RemoteFileInfoApi, RemoteFileUploadApi from .remote_files import RemoteFileInfoApi, RemoteFileUploadApi
bp = Blueprint("console", __name__, url_prefix="/console/api") bp = Blueprint("console", __name__, url_prefix="/console/api")
api = ExternalApi(bp)
api = ExternalApi(
bp,
version="1.0",
title="Console API",
description="Console management APIs for app configuration, monitoring, and administration",
)
# Create namespace
console_ns = Namespace("console", description="Console management API operations", path="/")
# File # File
api.add_resource(FileApi, "/files/upload") api.add_resource(FileApi, "/files/upload")
@@ -43,7 +53,16 @@ api.add_resource(AppImportConfirmApi, "/apps/imports/<string:import_id>/confirm"
api.add_resource(AppImportCheckDependenciesApi, "/apps/imports/<string:app_id>/check-dependencies") api.add_resource(AppImportCheckDependenciesApi, "/apps/imports/<string:app_id>/check-dependencies")
# Import other controllers # Import other controllers
from . import admin, apikey, extension, feature, ping, setup, version from . import (
admin,
apikey,
extension,
feature,
init_validate,
ping,
setup,
version,
)
# Import app controllers # Import app controllers
from .app import ( from .app import (
@@ -70,7 +89,16 @@ from .app import (
) )
# Import auth controllers # Import auth controllers
from .auth import activate, data_source_bearer_auth, data_source_oauth, forgot_password, login, oauth, oauth_server from .auth import (
activate,
data_source_bearer_auth,
data_source_oauth,
email_register,
forgot_password,
login,
oauth,
oauth_server,
)
# Import billing controllers # Import billing controllers
from .billing import billing, compliance from .billing import billing, compliance
@@ -95,6 +123,23 @@ from .explore import (
saved_message, saved_message,
) )
# Import tag controllers
from .tag import tags
# Import workspace controllers
from .workspace import (
account,
agent_providers,
endpoint,
load_balancing_config,
members,
model_providers,
models,
plugin,
tool_providers,
workspace,
)
# Explore Audio # Explore Audio
api.add_resource(ChatAudioApi, "/installed-apps/<uuid:installed_app_id>/audio-to-text", endpoint="installed_app_audio") api.add_resource(ChatAudioApi, "/installed-apps/<uuid:installed_app_id>/audio-to-text", endpoint="installed_app_audio")
api.add_resource(ChatTextApi, "/installed-apps/<uuid:installed_app_id>/text-to-audio", endpoint="installed_app_text") api.add_resource(ChatTextApi, "/installed-apps/<uuid:installed_app_id>/text-to-audio", endpoint="installed_app_text")
@@ -166,19 +211,71 @@ api.add_resource(
InstalledAppWorkflowTaskStopApi, "/installed-apps/<uuid:installed_app_id>/workflows/tasks/<string:task_id>/stop" InstalledAppWorkflowTaskStopApi, "/installed-apps/<uuid:installed_app_id>/workflows/tasks/<string:task_id>/stop"
) )
# Import tag controllers api.add_namespace(console_ns)
from .tag import tags
# Import workspace controllers __all__ = [
from .workspace import ( "account",
account, "activate",
agent_providers, "admin",
endpoint, "advanced_prompt_template",
load_balancing_config, "agent",
members, "agent_providers",
model_providers, "annotation",
models, "api",
plugin, "apikey",
tool_providers, "app",
workspace, "audio",
) "billing",
"bp",
"completion",
"compliance",
"console_ns",
"conversation",
"conversation_variables",
"data_source",
"data_source_bearer_auth",
"data_source_oauth",
"datasets",
"datasets_document",
"datasets_segments",
"email_register",
"endpoint",
"extension",
"external",
"feature",
"forgot_password",
"generator",
"hit_testing",
"init_validate",
"installed_app",
"load_balancing_config",
"login",
"mcp_server",
"members",
"message",
"metadata",
"model_config",
"model_providers",
"models",
"oauth",
"oauth_server",
"ops_trace",
"parameter",
"ping",
"plugin",
"recommended_app",
"saved_message",
"setup",
"site",
"statistic",
"tags",
"tool_providers",
"version",
"website",
"workflow",
"workflow_app_log",
"workflow_draft_variable",
"workflow_run",
"workflow_statistic",
"workspace",
]
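
The console package now constructs `ExternalApi` with OpenAPI metadata, declares a shared `console_ns` namespace, lists the controller modules in `__all__` so the side-effect imports are not flagged as unused, and attaches the namespace once at the end. A condensed sketch of that wiring, treating `ExternalApi` as a thin wrapper over flask_restx's `Api` (an assumption about `libs.external_api`):

```python
from flask import Blueprint, Flask
from flask_restx import Namespace, Resource
from flask_restx import Api as ExternalApi  # stand-in for libs.external_api.ExternalApi

bp = Blueprint("console", __name__, url_prefix="/console/api")
api = ExternalApi(
    bp,
    version="1.0",
    title="Console API",
    description="Console management APIs for app configuration, monitoring, and administration",
)
console_ns = Namespace("console", description="Console management API operations", path="/")


@console_ns.route("/ping")
class PingApi(Resource):
    def get(self):
        return {"result": "pong"}


# Registering the namespace after the controller modules are imported picks up
# every Resource decorated with @console_ns.route(...).
api.add_namespace(console_ns)

app = Flask(__name__)
app.register_blueprint(bp)
```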

View File

@@ -3,7 +3,7 @@ from functools import wraps
from typing import ParamSpec, TypeVar from typing import ParamSpec, TypeVar
from flask import request from flask import request
from flask_restx import Resource, reqparse from flask_restx import Resource, fields, reqparse
from sqlalchemy import select from sqlalchemy import select
from sqlalchemy.orm import Session from sqlalchemy.orm import Session
from werkzeug.exceptions import NotFound, Unauthorized from werkzeug.exceptions import NotFound, Unauthorized
@@ -12,7 +12,7 @@ P = ParamSpec("P")
R = TypeVar("R") R = TypeVar("R")
from configs import dify_config from configs import dify_config
from constants.languages import supported_language from constants.languages import supported_language
from controllers.console import api from controllers.console import api, console_ns
from controllers.console.wraps import only_edition_cloud from controllers.console.wraps import only_edition_cloud
from extensions.ext_database import db from extensions.ext_database import db
from models.model import App, InstalledApp, RecommendedApp from models.model import App, InstalledApp, RecommendedApp
@@ -45,7 +45,28 @@ def admin_required(view: Callable[P, R]):
return decorated return decorated
@console_ns.route("/admin/insert-explore-apps")
class InsertExploreAppListApi(Resource): class InsertExploreAppListApi(Resource):
@api.doc("insert_explore_app")
@api.doc(description="Insert or update an app in the explore list")
@api.expect(
api.model(
"InsertExploreAppRequest",
{
"app_id": fields.String(required=True, description="Application ID"),
"desc": fields.String(description="App description"),
"copyright": fields.String(description="Copyright information"),
"privacy_policy": fields.String(description="Privacy policy"),
"custom_disclaimer": fields.String(description="Custom disclaimer"),
"language": fields.String(required=True, description="Language code"),
"category": fields.String(required=True, description="App category"),
"position": fields.Integer(required=True, description="Display position"),
},
)
)
@api.response(200, "App updated successfully")
@api.response(201, "App inserted successfully")
@api.response(404, "App not found")
@only_edition_cloud @only_edition_cloud
@admin_required @admin_required
def post(self): def post(self):
@@ -115,7 +136,12 @@ class InsertExploreAppListApi(Resource):
return {"result": "success"}, 200 return {"result": "success"}, 200
@console_ns.route("/admin/insert-explore-apps/<uuid:app_id>")
class InsertExploreAppApi(Resource): class InsertExploreAppApi(Resource):
@api.doc("delete_explore_app")
@api.doc(description="Remove an app from the explore list")
@api.doc(params={"app_id": "Application ID to remove"})
@api.response(204, "App removed successfully")
@only_edition_cloud @only_edition_cloud
@admin_required @admin_required
def delete(self, app_id): def delete(self, app_id):
@@ -152,7 +178,3 @@ class InsertExploreAppApi(Resource):
db.session.commit() db.session.commit()
return {"result": "success"}, 204 return {"result": "success"}, 204
api.add_resource(InsertExploreAppListApi, "/admin/insert-explore-apps")
api.add_resource(InsertExploreAppApi, "/admin/insert-explore-apps/<uuid:app_id>")
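
With the namespace in place, resources are routed by decorating the class with `@console_ns.route(...)` and documented through `doc`/`expect`/`response`, which replaces the trailing `api.add_resource(...)` calls. A minimal sketch of that decorator stack; the model fields and handler body are illustrative only:

```python
from flask_restx import Namespace, Resource, fields

ns = Namespace("console", path="/")

insert_app_model = ns.model(
    "InsertExploreAppRequest",
    {
        "app_id": fields.String(required=True, description="Application ID"),
        "position": fields.Integer(required=True, description="Display position"),
    },
)


@ns.route("/admin/insert-explore-apps")
class InsertExploreAppListApi(Resource):
    @ns.doc("insert_explore_app", description="Insert or update an app in the explore list")
    @ns.expect(insert_app_model)
    @ns.response(200, "App updated successfully")
    @ns.response(201, "App inserted successfully")
    def post(self):
        payload = ns.payload or {}  # parsed JSON body
        return {"result": "success", "app_id": payload.get("app_id")}, 201
```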

View File

@@ -1,8 +1,7 @@
from typing import Any, Optional
import flask_restx import flask_restx
from flask_login import current_user from flask_login import current_user
from flask_restx import Resource, fields, marshal_with from flask_restx import Resource, fields, marshal_with
from flask_restx._http import HTTPStatus
from sqlalchemy import select from sqlalchemy import select
from sqlalchemy.orm import Session from sqlalchemy.orm import Session
from werkzeug.exceptions import Forbidden from werkzeug.exceptions import Forbidden
@@ -13,7 +12,7 @@ from libs.login import login_required
from models.dataset import Dataset from models.dataset import Dataset
from models.model import ApiToken, App from models.model import ApiToken, App
from . import api from . import api, console_ns
from .wraps import account_initialization_required, setup_required from .wraps import account_initialization_required, setup_required
api_key_fields = { api_key_fields = {
@@ -40,7 +39,7 @@ def _get_resource(resource_id, tenant_id, resource_model):
).scalar_one_or_none() ).scalar_one_or_none()
if resource is None: if resource is None:
flask_restx.abort(404, message=f"{resource_model.__name__} not found.") flask_restx.abort(HTTPStatus.NOT_FOUND, message=f"{resource_model.__name__} not found.")
return resource return resource
@@ -49,7 +48,7 @@ class BaseApiKeyListResource(Resource):
method_decorators = [account_initialization_required, login_required, setup_required] method_decorators = [account_initialization_required, login_required, setup_required]
resource_type: str | None = None resource_type: str | None = None
resource_model: Optional[Any] = None resource_model: type | None = None
resource_id_field: str | None = None resource_id_field: str | None = None
token_prefix: str | None = None token_prefix: str | None = None
max_keys = 10 max_keys = 10
@@ -59,11 +58,11 @@ class BaseApiKeyListResource(Resource):
assert self.resource_id_field is not None, "resource_id_field must be set" assert self.resource_id_field is not None, "resource_id_field must be set"
resource_id = str(resource_id) resource_id = str(resource_id)
_get_resource(resource_id, current_user.current_tenant_id, self.resource_model) _get_resource(resource_id, current_user.current_tenant_id, self.resource_model)
keys = ( keys = db.session.scalars(
db.session.query(ApiToken) select(ApiToken).where(
.where(ApiToken.type == self.resource_type, getattr(ApiToken, self.resource_id_field) == resource_id) ApiToken.type == self.resource_type, getattr(ApiToken, self.resource_id_field) == resource_id
.all() )
) ).all()
return {"items": keys} return {"items": keys}
@marshal_with(api_key_fields) @marshal_with(api_key_fields)
@@ -82,7 +81,7 @@ class BaseApiKeyListResource(Resource):
if current_key_count >= self.max_keys: if current_key_count >= self.max_keys:
flask_restx.abort( flask_restx.abort(
400, HTTPStatus.BAD_REQUEST,
message=f"Cannot create more than {self.max_keys} API keys for this resource type.", message=f"Cannot create more than {self.max_keys} API keys for this resource type.",
custom="max_keys_exceeded", custom="max_keys_exceeded",
) )
@@ -102,7 +101,7 @@ class BaseApiKeyResource(Resource):
method_decorators = [account_initialization_required, login_required, setup_required] method_decorators = [account_initialization_required, login_required, setup_required]
resource_type: str | None = None resource_type: str | None = None
resource_model: Optional[Any] = None resource_model: type | None = None
resource_id_field: str | None = None resource_id_field: str | None = None
def delete(self, resource_id, api_key_id): def delete(self, resource_id, api_key_id):
@@ -126,7 +125,7 @@ class BaseApiKeyResource(Resource):
) )
if key is None: if key is None:
flask_restx.abort(404, message="API key not found") flask_restx.abort(HTTPStatus.NOT_FOUND, message="API key not found")
db.session.query(ApiToken).where(ApiToken.id == api_key_id).delete() db.session.query(ApiToken).where(ApiToken.id == api_key_id).delete()
db.session.commit() db.session.commit()
@@ -134,7 +133,25 @@ class BaseApiKeyResource(Resource):
return {"result": "success"}, 204 return {"result": "success"}, 204
@console_ns.route("/apps/<uuid:resource_id>/api-keys")
class AppApiKeyListResource(BaseApiKeyListResource): class AppApiKeyListResource(BaseApiKeyListResource):
@api.doc("get_app_api_keys")
@api.doc(description="Get all API keys for an app")
@api.doc(params={"resource_id": "App ID"})
@api.response(200, "Success", api_key_list)
def get(self, resource_id):
"""Get all API keys for an app"""
return super().get(resource_id)
@api.doc("create_app_api_key")
@api.doc(description="Create a new API key for an app")
@api.doc(params={"resource_id": "App ID"})
@api.response(201, "API key created successfully", api_key_fields)
@api.response(400, "Maximum keys exceeded")
def post(self, resource_id):
"""Create a new API key for an app"""
return super().post(resource_id)
def after_request(self, resp): def after_request(self, resp):
resp.headers["Access-Control-Allow-Origin"] = "*" resp.headers["Access-Control-Allow-Origin"] = "*"
resp.headers["Access-Control-Allow-Credentials"] = "true" resp.headers["Access-Control-Allow-Credentials"] = "true"
@@ -146,7 +163,16 @@ class AppApiKeyListResource(BaseApiKeyListResource):
token_prefix = "app-" token_prefix = "app-"
@console_ns.route("/apps/<uuid:resource_id>/api-keys/<uuid:api_key_id>")
class AppApiKeyResource(BaseApiKeyResource): class AppApiKeyResource(BaseApiKeyResource):
@api.doc("delete_app_api_key")
@api.doc(description="Delete an API key for an app")
@api.doc(params={"resource_id": "App ID", "api_key_id": "API key ID"})
@api.response(204, "API key deleted successfully")
def delete(self, resource_id, api_key_id):
"""Delete an API key for an app"""
return super().delete(resource_id, api_key_id)
def after_request(self, resp): def after_request(self, resp):
resp.headers["Access-Control-Allow-Origin"] = "*" resp.headers["Access-Control-Allow-Origin"] = "*"
resp.headers["Access-Control-Allow-Credentials"] = "true" resp.headers["Access-Control-Allow-Credentials"] = "true"
@@ -157,7 +183,25 @@ class AppApiKeyResource(BaseApiKeyResource):
resource_id_field = "app_id" resource_id_field = "app_id"
@console_ns.route("/datasets/<uuid:resource_id>/api-keys")
class DatasetApiKeyListResource(BaseApiKeyListResource): class DatasetApiKeyListResource(BaseApiKeyListResource):
@api.doc("get_dataset_api_keys")
@api.doc(description="Get all API keys for a dataset")
@api.doc(params={"resource_id": "Dataset ID"})
@api.response(200, "Success", api_key_list)
def get(self, resource_id):
"""Get all API keys for a dataset"""
return super().get(resource_id)
@api.doc("create_dataset_api_key")
@api.doc(description="Create a new API key for a dataset")
@api.doc(params={"resource_id": "Dataset ID"})
@api.response(201, "API key created successfully", api_key_fields)
@api.response(400, "Maximum keys exceeded")
def post(self, resource_id):
"""Create a new API key for a dataset"""
return super().post(resource_id)
def after_request(self, resp): def after_request(self, resp):
resp.headers["Access-Control-Allow-Origin"] = "*" resp.headers["Access-Control-Allow-Origin"] = "*"
resp.headers["Access-Control-Allow-Credentials"] = "true" resp.headers["Access-Control-Allow-Credentials"] = "true"
@@ -169,7 +213,16 @@ class DatasetApiKeyListResource(BaseApiKeyListResource):
token_prefix = "ds-" token_prefix = "ds-"
@console_ns.route("/datasets/<uuid:resource_id>/api-keys/<uuid:api_key_id>")
class DatasetApiKeyResource(BaseApiKeyResource): class DatasetApiKeyResource(BaseApiKeyResource):
@api.doc("delete_dataset_api_key")
@api.doc(description="Delete an API key for a dataset")
@api.doc(params={"resource_id": "Dataset ID", "api_key_id": "API key ID"})
@api.response(204, "API key deleted successfully")
def delete(self, resource_id, api_key_id):
"""Delete an API key for a dataset"""
return super().delete(resource_id, api_key_id)
def after_request(self, resp): def after_request(self, resp):
resp.headers["Access-Control-Allow-Origin"] = "*" resp.headers["Access-Control-Allow-Origin"] = "*"
resp.headers["Access-Control-Allow-Credentials"] = "true" resp.headers["Access-Control-Allow-Credentials"] = "true"
@@ -178,9 +231,3 @@ class DatasetApiKeyResource(BaseApiKeyResource):
resource_type = "dataset" resource_type = "dataset"
resource_model = Dataset resource_model = Dataset
resource_id_field = "dataset_id" resource_id_field = "dataset_id"
api.add_resource(AppApiKeyListResource, "/apps/<uuid:resource_id>/api-keys")
api.add_resource(AppApiKeyResource, "/apps/<uuid:resource_id>/api-keys/<uuid:api_key_id>")
api.add_resource(DatasetApiKeyListResource, "/datasets/<uuid:resource_id>/api-keys")
api.add_resource(DatasetApiKeyResource, "/datasets/<uuid:resource_id>/api-keys/<uuid:api_key_id>")
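
Two smaller changes recur in this file: `flask_restx.abort` is called with `HTTPStatus` members rather than bare integers (the diff imports them from `flask_restx._http`; the stdlib `http.HTTPStatus` is the same enum), and the legacy `db.session.query(...).all()` chain becomes the SQLAlchemy 2.0 style `db.session.scalars(select(...)).all()`. A self-contained sketch with a stand-in model:

```python
from http import HTTPStatus

import flask_restx
from sqlalchemy import String, select
from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column


class Base(DeclarativeBase):
    pass


class ApiToken(Base):  # minimal stand-in for models.model.ApiToken
    __tablename__ = "api_tokens"
    id: Mapped[int] = mapped_column(primary_key=True)
    type: Mapped[str] = mapped_column(String(16))
    app_id: Mapped[str] = mapped_column(String(36))


def list_app_tokens(session: Session, resource_id: str) -> list[ApiToken]:
    # 2.0-style query: scalars() unwraps single-entity rows, .all() materializes them.
    stmt = select(ApiToken).where(ApiToken.type == "app", ApiToken.app_id == resource_id)
    return list(session.scalars(stmt).all())


def get_token_or_404(session: Session, token_id: int) -> ApiToken:
    token = session.get(ApiToken, token_id)
    if token is None:
        # The enum member still serializes to 404 but reads better than a magic number.
        flask_restx.abort(HTTPStatus.NOT_FOUND, message="API key not found")
    return token
```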

View File

@@ -1,12 +1,26 @@
from flask_restx import Resource, reqparse from flask_restx import Resource, fields, reqparse
from controllers.console import api from controllers.console import api, console_ns
from controllers.console.wraps import account_initialization_required, setup_required from controllers.console.wraps import account_initialization_required, setup_required
from libs.login import login_required from libs.login import login_required
from services.advanced_prompt_template_service import AdvancedPromptTemplateService from services.advanced_prompt_template_service import AdvancedPromptTemplateService
@console_ns.route("/app/prompt-templates")
class AdvancedPromptTemplateList(Resource): class AdvancedPromptTemplateList(Resource):
@api.doc("get_advanced_prompt_templates")
@api.doc(description="Get advanced prompt templates based on app mode and model configuration")
@api.expect(
api.parser()
.add_argument("app_mode", type=str, required=True, location="args", help="Application mode")
.add_argument("model_mode", type=str, required=True, location="args", help="Model mode")
.add_argument("has_context", type=str, default="true", location="args", help="Whether has context")
.add_argument("model_name", type=str, required=True, location="args", help="Model name")
)
@api.response(
200, "Prompt templates retrieved successfully", fields.List(fields.Raw(description="Prompt template data"))
)
@api.response(400, "Invalid request parameters")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -19,6 +33,3 @@ class AdvancedPromptTemplateList(Resource):
args = parser.parse_args() args = parser.parse_args()
return AdvancedPromptTemplateService.get_prompt(args) return AdvancedPromptTemplateService.get_prompt(args)
api.add_resource(AdvancedPromptTemplateList, "/app/prompt-templates")

View File

@@ -1,6 +1,6 @@
from flask_restx import Resource, reqparse from flask_restx import Resource, fields, reqparse
from controllers.console import api from controllers.console import api, console_ns
from controllers.console.app.wraps import get_app_model from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required from controllers.console.wraps import account_initialization_required, setup_required
from libs.helper import uuid_value from libs.helper import uuid_value
@@ -9,7 +9,18 @@ from models.model import AppMode
from services.agent_service import AgentService from services.agent_service import AgentService
@console_ns.route("/apps/<uuid:app_id>/agent/logs")
class AgentLogApi(Resource): class AgentLogApi(Resource):
@api.doc("get_agent_logs")
@api.doc(description="Get agent execution logs for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("message_id", type=str, required=True, location="args", help="Message UUID")
.add_argument("conversation_id", type=str, required=True, location="args", help="Conversation UUID")
)
@api.response(200, "Agent logs retrieved successfully", fields.List(fields.Raw(description="Agent log entries")))
@api.response(400, "Invalid request parameters")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -23,6 +34,3 @@ class AgentLogApi(Resource):
args = parser.parse_args() args = parser.parse_args()
return AgentService.get_agent_logs(app_model, args["conversation_id"], args["message_id"]) return AgentService.get_agent_logs(app_model, args["conversation_id"], args["message_id"])
api.add_resource(AgentLogApi, "/apps/<uuid:app_id>/agent/logs")

View File

@@ -2,11 +2,11 @@ from typing import Literal
from flask import request from flask import request
from flask_login import current_user from flask_login import current_user
from flask_restx import Resource, marshal, marshal_with, reqparse from flask_restx import Resource, fields, marshal, marshal_with, reqparse
from werkzeug.exceptions import Forbidden from werkzeug.exceptions import Forbidden
from controllers.common.errors import NoFileUploadedError, TooManyFilesError from controllers.common.errors import NoFileUploadedError, TooManyFilesError
from controllers.console import api from controllers.console import api, console_ns
from controllers.console.wraps import ( from controllers.console.wraps import (
account_initialization_required, account_initialization_required,
cloud_edition_billing_resource_check, cloud_edition_billing_resource_check,
@@ -21,7 +21,23 @@ from libs.login import login_required
from services.annotation_service import AppAnnotationService from services.annotation_service import AppAnnotationService
@console_ns.route("/apps/<uuid:app_id>/annotation-reply/<string:action>")
class AnnotationReplyActionApi(Resource): class AnnotationReplyActionApi(Resource):
@api.doc("annotation_reply_action")
@api.doc(description="Enable or disable annotation reply for an app")
@api.doc(params={"app_id": "Application ID", "action": "Action to perform (enable/disable)"})
@api.expect(
api.model(
"AnnotationReplyActionRequest",
{
"score_threshold": fields.Float(required=True, description="Score threshold for annotation matching"),
"embedding_provider_name": fields.String(required=True, description="Embedding provider name"),
"embedding_model_name": fields.String(required=True, description="Embedding model name"),
},
)
)
@api.response(200, "Action completed successfully")
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -43,7 +59,13 @@ class AnnotationReplyActionApi(Resource):
return result, 200 return result, 200
@console_ns.route("/apps/<uuid:app_id>/annotation-setting")
class AppAnnotationSettingDetailApi(Resource): class AppAnnotationSettingDetailApi(Resource):
@api.doc("get_annotation_setting")
@api.doc(description="Get annotation settings for an app")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Annotation settings retrieved successfully")
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -56,7 +78,23 @@ class AppAnnotationSettingDetailApi(Resource):
return result, 200 return result, 200
@console_ns.route("/apps/<uuid:app_id>/annotation-settings/<uuid:annotation_setting_id>")
class AppAnnotationSettingUpdateApi(Resource): class AppAnnotationSettingUpdateApi(Resource):
@api.doc("update_annotation_setting")
@api.doc(description="Update annotation settings for an app")
@api.doc(params={"app_id": "Application ID", "annotation_setting_id": "Annotation setting ID"})
@api.expect(
api.model(
"AnnotationSettingUpdateRequest",
{
"score_threshold": fields.Float(required=True, description="Score threshold"),
"embedding_provider_name": fields.String(required=True, description="Embedding provider"),
"embedding_model_name": fields.String(required=True, description="Embedding model"),
},
)
)
@api.response(200, "Settings updated successfully")
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -75,7 +113,13 @@ class AppAnnotationSettingUpdateApi(Resource):
return result, 200 return result, 200
@console_ns.route("/apps/<uuid:app_id>/annotation-reply/<string:action>/status/<uuid:job_id>")
class AnnotationReplyActionStatusApi(Resource): class AnnotationReplyActionStatusApi(Resource):
@api.doc("get_annotation_reply_action_status")
@api.doc(description="Get status of annotation reply action job")
@api.doc(params={"app_id": "Application ID", "job_id": "Job ID", "action": "Action type"})
@api.response(200, "Job status retrieved successfully")
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -99,7 +143,19 @@ class AnnotationReplyActionStatusApi(Resource):
return {"job_id": job_id, "job_status": job_status, "error_msg": error_msg}, 200 return {"job_id": job_id, "job_status": job_status, "error_msg": error_msg}, 200
@console_ns.route("/apps/<uuid:app_id>/annotations")
class AnnotationApi(Resource): class AnnotationApi(Resource):
@api.doc("list_annotations")
@api.doc(description="Get annotations for an app with pagination")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("page", type=int, location="args", default=1, help="Page number")
.add_argument("limit", type=int, location="args", default=20, help="Page size")
.add_argument("keyword", type=str, location="args", default="", help="Search keyword")
)
@api.response(200, "Annotations retrieved successfully")
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -122,6 +178,21 @@ class AnnotationApi(Resource):
} }
return response, 200 return response, 200
@api.doc("create_annotation")
@api.doc(description="Create a new annotation for an app")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"CreateAnnotationRequest",
{
"question": fields.String(required=True, description="Question text"),
"answer": fields.String(required=True, description="Answer text"),
"annotation_reply": fields.Raw(description="Annotation reply data"),
},
)
)
@api.response(201, "Annotation created successfully", annotation_fields)
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -168,7 +239,13 @@ class AnnotationApi(Resource):
return {"result": "success"}, 204 return {"result": "success"}, 204
@console_ns.route("/apps/<uuid:app_id>/annotations/export")
class AnnotationExportApi(Resource): class AnnotationExportApi(Resource):
@api.doc("export_annotations")
@api.doc(description="Export all annotations for an app")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Annotations exported successfully", fields.List(fields.Nested(annotation_fields)))
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -182,7 +259,14 @@ class AnnotationExportApi(Resource):
return response, 200 return response, 200
@console_ns.route("/apps/<uuid:app_id>/annotations/<uuid:annotation_id>")
class AnnotationUpdateDeleteApi(Resource): class AnnotationUpdateDeleteApi(Resource):
@api.doc("update_delete_annotation")
@api.doc(description="Update or delete an annotation")
@api.doc(params={"app_id": "Application ID", "annotation_id": "Annotation ID"})
@api.response(200, "Annotation updated successfully", annotation_fields)
@api.response(204, "Annotation deleted successfully")
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -214,7 +298,14 @@ class AnnotationUpdateDeleteApi(Resource):
return {"result": "success"}, 204 return {"result": "success"}, 204
@console_ns.route("/apps/<uuid:app_id>/annotations/batch-import")
class AnnotationBatchImportApi(Resource): class AnnotationBatchImportApi(Resource):
@api.doc("batch_import_annotations")
@api.doc(description="Batch import annotations from CSV file")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Batch import started successfully")
@api.response(403, "Insufficient permissions")
@api.response(400, "No file uploaded or too many files")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -239,7 +330,13 @@ class AnnotationBatchImportApi(Resource):
return AppAnnotationService.batch_import_app_annotations(app_id, file) return AppAnnotationService.batch_import_app_annotations(app_id, file)
@console_ns.route("/apps/<uuid:app_id>/annotations/batch-import-status/<uuid:job_id>")
class AnnotationBatchImportStatusApi(Resource): class AnnotationBatchImportStatusApi(Resource):
@api.doc("get_batch_import_status")
@api.doc(description="Get status of batch import job")
@api.doc(params={"app_id": "Application ID", "job_id": "Job ID"})
@api.response(200, "Job status retrieved successfully")
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -262,7 +359,20 @@ class AnnotationBatchImportStatusApi(Resource):
return {"job_id": job_id, "job_status": job_status, "error_msg": error_msg}, 200 return {"job_id": job_id, "job_status": job_status, "error_msg": error_msg}, 200
@console_ns.route("/apps/<uuid:app_id>/annotations/<uuid:annotation_id>/hit-histories")
class AnnotationHitHistoryListApi(Resource): class AnnotationHitHistoryListApi(Resource):
@api.doc("list_annotation_hit_histories")
@api.doc(description="Get hit histories for an annotation")
@api.doc(params={"app_id": "Application ID", "annotation_id": "Annotation ID"})
@api.expect(
api.parser()
.add_argument("page", type=int, location="args", default=1, help="Page number")
.add_argument("limit", type=int, location="args", default=20, help="Page size")
)
@api.response(
200, "Hit histories retrieved successfully", fields.List(fields.Nested(annotation_hit_history_fields))
)
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -285,17 +395,3 @@ class AnnotationHitHistoryListApi(Resource):
"page": page, "page": page,
} }
return response return response
api.add_resource(AnnotationReplyActionApi, "/apps/<uuid:app_id>/annotation-reply/<string:action>")
api.add_resource(
AnnotationReplyActionStatusApi, "/apps/<uuid:app_id>/annotation-reply/<string:action>/status/<uuid:job_id>"
)
api.add_resource(AnnotationApi, "/apps/<uuid:app_id>/annotations")
api.add_resource(AnnotationExportApi, "/apps/<uuid:app_id>/annotations/export")
api.add_resource(AnnotationUpdateDeleteApi, "/apps/<uuid:app_id>/annotations/<uuid:annotation_id>")
api.add_resource(AnnotationBatchImportApi, "/apps/<uuid:app_id>/annotations/batch-import")
api.add_resource(AnnotationBatchImportStatusApi, "/apps/<uuid:app_id>/annotations/batch-import-status/<uuid:job_id>")
api.add_resource(AnnotationHitHistoryListApi, "/apps/<uuid:app_id>/annotations/<uuid:annotation_id>/hit-histories")
api.add_resource(AppAnnotationSettingDetailApi, "/apps/<uuid:app_id>/annotation-setting")
api.add_resource(AppAnnotationSettingUpdateApi, "/apps/<uuid:app_id>/annotation-settings/<uuid:annotation_setting_id>")
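
Where the request data lives in the query string rather than a JSON body, the documentation uses `api.parser()` instead of `api.model(...)`; the `add_argument` calls chain because the parser returns itself. A sketch of the pagination-parser pattern with a stand-in namespace:

```python
from flask_restx import Namespace, Resource

ns = Namespace("console", path="/")

# RequestParser.add_argument returns the parser, so the calls chain cleanly.
pagination_parser = (
    ns.parser()
    .add_argument("page", type=int, location="args", default=1, help="Page number")
    .add_argument("limit", type=int, location="args", default=20, help="Page size")
    .add_argument("keyword", type=str, location="args", default="", help="Search keyword")
)


@ns.route("/apps/<uuid:app_id>/annotations")
class AnnotationListApi(Resource):
    @ns.expect(pagination_parser)
    @ns.response(200, "Annotations retrieved successfully")
    def get(self, app_id):
        args = pagination_parser.parse_args()
        return {"page": args["page"], "limit": args["limit"], "data": []}, 200
```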

View File

@@ -2,12 +2,12 @@ import uuid
from typing import cast from typing import cast
from flask_login import current_user from flask_login import current_user
from flask_restx import Resource, inputs, marshal, marshal_with, reqparse from flask_restx import Resource, fields, inputs, marshal, marshal_with, reqparse
from sqlalchemy import select from sqlalchemy import select
from sqlalchemy.orm import Session from sqlalchemy.orm import Session
from werkzeug.exceptions import BadRequest, Forbidden, abort from werkzeug.exceptions import BadRequest, Forbidden, abort
from controllers.console import api from controllers.console import api, console_ns
from controllers.console.app.wraps import get_app_model from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import ( from controllers.console.wraps import (
account_initialization_required, account_initialization_required,
@@ -34,7 +34,27 @@ def _validate_description_length(description):
return description return description
@console_ns.route("/apps")
class AppListApi(Resource): class AppListApi(Resource):
@api.doc("list_apps")
@api.doc(description="Get list of applications with pagination and filtering")
@api.expect(
api.parser()
.add_argument("page", type=int, location="args", help="Page number (1-99999)", default=1)
.add_argument("limit", type=int, location="args", help="Page size (1-100)", default=20)
.add_argument(
"mode",
type=str,
location="args",
choices=["completion", "chat", "advanced-chat", "workflow", "agent-chat", "channel", "all"],
default="all",
help="App mode filter",
)
.add_argument("name", type=str, location="args", help="Filter by app name")
.add_argument("tag_ids", type=str, location="args", help="Comma-separated tag IDs")
.add_argument("is_created_by_me", type=bool, location="args", help="Filter by creator")
)
@api.response(200, "Success", app_pagination_fields)
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -91,6 +111,24 @@ class AppListApi(Resource):
return marshal(app_pagination, app_pagination_fields), 200 return marshal(app_pagination, app_pagination_fields), 200
@api.doc("create_app")
@api.doc(description="Create a new application")
@api.expect(
api.model(
"CreateAppRequest",
{
"name": fields.String(required=True, description="App name"),
"description": fields.String(description="App description (max 400 chars)"),
"mode": fields.String(required=True, enum=ALLOW_CREATE_APP_MODES, description="App mode"),
"icon_type": fields.String(description="Icon type"),
"icon": fields.String(description="Icon"),
"icon_background": fields.String(description="Icon background color"),
},
)
)
@api.response(201, "App created successfully", app_detail_fields)
@api.response(403, "Insufficient permissions")
@api.response(400, "Invalid request parameters")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -115,12 +153,21 @@ class AppListApi(Resource):
raise BadRequest("mode is required") raise BadRequest("mode is required")
app_service = AppService() app_service = AppService()
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
if current_user.current_tenant_id is None:
raise ValueError("current_user.current_tenant_id cannot be None")
app = app_service.create_app(current_user.current_tenant_id, args, current_user) app = app_service.create_app(current_user.current_tenant_id, args, current_user)
return app, 201 return app, 201
@console_ns.route("/apps/<uuid:app_id>")
class AppApi(Resource): class AppApi(Resource):
@api.doc("get_app_detail")
@api.doc(description="Get application details")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Success", app_detail_fields_with_site)
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -139,6 +186,26 @@ class AppApi(Resource):
return app_model return app_model
@api.doc("update_app")
@api.doc(description="Update application details")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"UpdateAppRequest",
{
"name": fields.String(required=True, description="App name"),
"description": fields.String(description="App description (max 400 chars)"),
"icon_type": fields.String(description="Icon type"),
"icon": fields.String(description="Icon"),
"icon_background": fields.String(description="Icon background color"),
"use_icon_as_answer_icon": fields.Boolean(description="Use icon as answer icon"),
"max_active_requests": fields.Integer(description="Maximum active requests"),
},
)
)
@api.response(200, "App updated successfully", app_detail_fields_with_site)
@api.response(403, "Insufficient permissions")
@api.response(400, "Invalid request parameters")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -161,14 +228,31 @@ class AppApi(Resource):
args = parser.parse_args() args = parser.parse_args()
app_service = AppService() app_service = AppService()
app_model = app_service.update_app(app_model, args) # Construct ArgsDict from parsed arguments
from services.app_service import AppService as AppServiceType
args_dict: AppServiceType.ArgsDict = {
"name": args["name"],
"description": args.get("description", ""),
"icon_type": args.get("icon_type", ""),
"icon": args.get("icon", ""),
"icon_background": args.get("icon_background", ""),
"use_icon_as_answer_icon": args.get("use_icon_as_answer_icon", False),
"max_active_requests": args.get("max_active_requests", 0),
}
app_model = app_service.update_app(app_model, args_dict)
return app_model return app_model
@api.doc("delete_app")
@api.doc(description="Delete application")
@api.doc(params={"app_id": "Application ID"})
@api.response(204, "App deleted successfully")
@api.response(403, "Insufficient permissions")
@get_app_model
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@get_app_model
def delete(self, app_model): def delete(self, app_model):
"""Delete app""" """Delete app"""
# The role of the current user in the ta table must be admin, owner, or editor # The role of the current user in the ta table must be admin, owner, or editor
@@ -181,7 +265,25 @@ class AppApi(Resource):
return {"result": "success"}, 204 return {"result": "success"}, 204
@console_ns.route("/apps/<uuid:app_id>/copy")
class AppCopyApi(Resource): class AppCopyApi(Resource):
@api.doc("copy_app")
@api.doc(description="Create a copy of an existing application")
@api.doc(params={"app_id": "Application ID to copy"})
@api.expect(
api.model(
"CopyAppRequest",
{
"name": fields.String(description="Name for the copied app"),
"description": fields.String(description="Description for the copied app"),
"icon_type": fields.String(description="Icon type"),
"icon": fields.String(description="Icon"),
"icon_background": fields.String(description="Icon background color"),
},
)
)
@api.response(201, "App copied successfully", app_detail_fields_with_site)
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -223,11 +325,26 @@ class AppCopyApi(Resource):
return app, 201 return app, 201
@console_ns.route("/apps/<uuid:app_id>/export")
class AppExportApi(Resource): class AppExportApi(Resource):
@api.doc("export_app")
@api.doc(description="Export application configuration as DSL")
@api.doc(params={"app_id": "Application ID to export"})
@api.expect(
api.parser()
.add_argument("include_secret", type=bool, location="args", default=False, help="Include secrets in export")
.add_argument("workflow_id", type=str, location="args", help="Specific workflow ID to export")
)
@api.response(
200,
"App exported successfully",
api.model("AppExportResponse", {"data": fields.String(description="DSL export data")}),
)
@api.response(403, "Insufficient permissions")
@get_app_model
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@get_app_model
def get(self, app_model): def get(self, app_model):
"""Export app""" """Export app"""
# The role of the current user in the ta table must be admin, owner, or editor # The role of the current user in the ta table must be admin, owner, or editor
@@ -247,7 +364,13 @@ class AppExportApi(Resource):
} }
@console_ns.route("/apps/<uuid:app_id>/name")
class AppNameApi(Resource): class AppNameApi(Resource):
@api.doc("check_app_name")
@api.doc(description="Check if app name is available")
@api.doc(params={"app_id": "Application ID"})
@api.expect(api.parser().add_argument("name", type=str, required=True, location="args", help="Name to check"))
@api.response(200, "Name availability checked")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -263,12 +386,28 @@ class AppNameApi(Resource):
args = parser.parse_args() args = parser.parse_args()
app_service = AppService() app_service = AppService()
app_model = app_service.update_app_name(app_model, args.get("name")) app_model = app_service.update_app_name(app_model, args["name"])
return app_model return app_model
@console_ns.route("/apps/<uuid:app_id>/icon")
class AppIconApi(Resource): class AppIconApi(Resource):
@api.doc("update_app_icon")
@api.doc(description="Update application icon")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"AppIconRequest",
{
"icon": fields.String(required=True, description="Icon data"),
"icon_type": fields.String(description="Icon type"),
"icon_background": fields.String(description="Icon background color"),
},
)
)
@api.response(200, "Icon updated successfully")
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -285,12 +424,23 @@ class AppIconApi(Resource):
args = parser.parse_args() args = parser.parse_args()
app_service = AppService() app_service = AppService()
app_model = app_service.update_app_icon(app_model, args.get("icon"), args.get("icon_background")) app_model = app_service.update_app_icon(app_model, args.get("icon") or "", args.get("icon_background") or "")
return app_model return app_model
@console_ns.route("/apps/<uuid:app_id>/site-enable")
class AppSiteStatus(Resource): class AppSiteStatus(Resource):
@api.doc("update_app_site_status")
@api.doc(description="Enable or disable app site")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"AppSiteStatusRequest", {"enable_site": fields.Boolean(required=True, description="Enable or disable site")}
)
)
@api.response(200, "Site status updated successfully", app_detail_fields)
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -306,12 +456,23 @@ class AppSiteStatus(Resource):
args = parser.parse_args() args = parser.parse_args()
app_service = AppService() app_service = AppService()
app_model = app_service.update_app_site_status(app_model, args.get("enable_site")) app_model = app_service.update_app_site_status(app_model, args["enable_site"])
return app_model return app_model
@console_ns.route("/apps/<uuid:app_id>/api-enable")
class AppApiStatus(Resource): class AppApiStatus(Resource):
@api.doc("update_app_api_status")
@api.doc(description="Enable or disable app API")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"AppApiStatusRequest", {"enable_api": fields.Boolean(required=True, description="Enable or disable API")}
)
)
@api.response(200, "API status updated successfully", app_detail_fields)
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -327,12 +488,17 @@ class AppApiStatus(Resource):
args = parser.parse_args() args = parser.parse_args()
app_service = AppService() app_service = AppService()
app_model = app_service.update_app_api_status(app_model, args.get("enable_api")) app_model = app_service.update_app_api_status(app_model, args["enable_api"])
return app_model return app_model
@console_ns.route("/apps/<uuid:app_id>/trace")
class AppTraceApi(Resource): class AppTraceApi(Resource):
@api.doc("get_app_trace")
@api.doc(description="Get app tracing configuration")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Trace configuration retrieved successfully")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -342,6 +508,20 @@ class AppTraceApi(Resource):
return app_trace_config return app_trace_config
@api.doc("update_app_trace")
@api.doc(description="Update app tracing configuration")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"AppTraceRequest",
{
"enabled": fields.Boolean(required=True, description="Enable or disable tracing"),
"tracing_provider": fields.String(required=True, description="Tracing provider"),
},
)
)
@api.response(200, "Trace configuration updated successfully")
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -361,14 +541,3 @@ class AppTraceApi(Resource):
) )
return {"result": "success"} return {"result": "success"}
api.add_resource(AppListApi, "/apps")
api.add_resource(AppApi, "/apps/<uuid:app_id>")
api.add_resource(AppCopyApi, "/apps/<uuid:app_id>/copy")
api.add_resource(AppExportApi, "/apps/<uuid:app_id>/export")
api.add_resource(AppNameApi, "/apps/<uuid:app_id>/name")
api.add_resource(AppIconApi, "/apps/<uuid:app_id>/icon")
api.add_resource(AppSiteStatus, "/apps/<uuid:app_id>/site-enable")
api.add_resource(AppApiStatus, "/apps/<uuid:app_id>/api-enable")
api.add_resource(AppTraceApi, "/apps/<uuid:app_id>/trace")
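
The update path now normalises the reqparse result into an explicitly shaped dict before calling `update_app`; the diff types it as `AppService.ArgsDict`, so the service presumably declares a TypedDict with these keys. A sketch of the idea with a hypothetical service class (the coercion defaults are illustrative):

```python
from typing import Any, TypedDict


class AppService:
    class ArgsDict(TypedDict):
        name: str
        description: str
        icon_type: str
        icon: str
        icon_background: str
        use_icon_as_answer_icon: bool
        max_active_requests: int

    def update_app(self, app_model: object, args: "AppService.ArgsDict") -> dict:
        # A real service would persist the changes; here we just echo them back.
        return {"app": app_model, **args}


# reqparse hands back a loosely typed mapping; reshaping it into ArgsDict gives
# the type checker a concrete structure to verify against update_app's signature.
raw_args: dict[str, Any] = {"name": "My App", "description": None, "icon": "bot"}
args_dict: AppService.ArgsDict = {
    "name": raw_args["name"],
    "description": raw_args.get("description") or "",
    "icon_type": raw_args.get("icon_type") or "",
    "icon": raw_args.get("icon") or "",
    "icon_background": raw_args.get("icon_background") or "",
    "use_icon_as_answer_icon": bool(raw_args.get("use_icon_as_answer_icon")),
    "max_active_requests": int(raw_args.get("max_active_requests") or 0),
}
print(AppService().update_app(app_model="app-123", args=args_dict))
```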

View File

@@ -1,11 +1,11 @@
import logging import logging
from flask import request from flask import request
from flask_restx import Resource, reqparse from flask_restx import Resource, fields, reqparse
from werkzeug.exceptions import InternalServerError from werkzeug.exceptions import InternalServerError
import services import services
from controllers.console import api from controllers.console import api, console_ns
from controllers.console.app.error import ( from controllers.console.app.error import (
AppUnavailableError, AppUnavailableError,
AudioTooLargeError, AudioTooLargeError,
@@ -34,7 +34,18 @@ from services.errors.audio import (
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
@console_ns.route("/apps/<uuid:app_id>/audio-to-text")
class ChatMessageAudioApi(Resource): class ChatMessageAudioApi(Resource):
@api.doc("chat_message_audio_transcript")
@api.doc(description="Transcript audio to text for chat messages")
@api.doc(params={"app_id": "App ID"})
@api.response(
200,
"Audio transcription successful",
api.model("AudioTranscriptResponse", {"text": fields.String(description="Transcribed text from audio")}),
)
@api.response(400, "Bad request - No audio uploaded or unsupported type")
@api.response(413, "Audio file too large")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -76,11 +87,28 @@ class ChatMessageAudioApi(Resource):
raise InternalServerError() raise InternalServerError()
@console_ns.route("/apps/<uuid:app_id>/text-to-audio")
class ChatMessageTextApi(Resource): class ChatMessageTextApi(Resource):
@api.doc("chat_message_text_to_speech")
@api.doc(description="Convert text to speech for chat messages")
@api.doc(params={"app_id": "App ID"})
@api.expect(
api.model(
"TextToSpeechRequest",
{
"message_id": fields.String(description="Message ID"),
"text": fields.String(required=True, description="Text to convert to speech"),
"voice": fields.String(description="Voice to use for TTS"),
"streaming": fields.Boolean(description="Whether to stream the audio"),
},
)
)
@api.response(200, "Text to speech conversion successful")
@api.response(400, "Bad request - Invalid parameters")
@get_app_model
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@get_app_model
def post(self, app_model: App): def post(self, app_model: App):
try: try:
parser = reqparse.RequestParser() parser = reqparse.RequestParser()
@@ -124,11 +152,18 @@ class ChatMessageTextApi(Resource):
raise InternalServerError() raise InternalServerError()
@console_ns.route("/apps/<uuid:app_id>/text-to-audio/voices")
class TextModesApi(Resource): class TextModesApi(Resource):
@api.doc("get_text_to_speech_voices")
@api.doc(description="Get available TTS voices for a specific language")
@api.doc(params={"app_id": "App ID"})
@api.expect(api.parser().add_argument("language", type=str, required=True, location="args", help="Language code"))
@api.response(200, "TTS voices retrieved successfully", fields.List(fields.Raw(description="Available voices")))
@api.response(400, "Invalid language parameter")
@get_app_model
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@get_app_model
def get(self, app_model): def get(self, app_model):
try: try:
parser = reqparse.RequestParser() parser = reqparse.RequestParser()
@@ -164,8 +199,3 @@ class TextModesApi(Resource):
except Exception as e: except Exception as e:
logger.exception("Failed to handle get request to TextModesApi") logger.exception("Failed to handle get request to TextModesApi")
raise InternalServerError() raise InternalServerError()
api.add_resource(ChatMessageAudioApi, "/apps/<uuid:app_id>/audio-to-text")
api.add_resource(ChatMessageTextApi, "/apps/<uuid:app_id>/text-to-audio")
api.add_resource(TextModesApi, "/apps/<uuid:app_id>/text-to-audio/voices")
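
Several handlers also move `@get_app_model` from the bottom of the decorator stack to the top. Python applies decorators bottom-up, so the one nearest the `def` wraps first and the topmost wrapper runs first on each call; whatever the motivation for the reorder, those are the mechanics it changes. A small trace with stand-in decorators:

```python
from functools import wraps


def tag(label):
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            print(f"enter {label}")
            result = fn(*args, **kwargs)
            print(f"exit {label}")
            return result
        return wrapper
    return decorator


@tag("get_app_model")    # topmost: applied last, runs first on each call
@tag("setup_required")
@tag("login_required")   # nearest the def: applied first, runs last before the body
def handler():
    print("handler body")


handler()
# enter get_app_model -> enter setup_required -> enter login_required -> handler body
```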

View File

@@ -1,12 +1,11 @@
import logging import logging
import flask_login
from flask import request from flask import request
from flask_restx import Resource, reqparse from flask_restx import Resource, fields, reqparse
from werkzeug.exceptions import InternalServerError, NotFound from werkzeug.exceptions import Forbidden, InternalServerError, NotFound
import services import services
from controllers.console import api from controllers.console import api, console_ns
from controllers.console.app.error import ( from controllers.console.app.error import (
AppUnavailableError, AppUnavailableError,
CompletionRequestError, CompletionRequestError,
@@ -29,7 +28,8 @@ from core.helper.trace_id_helper import get_external_trace_id
from core.model_runtime.errors.invoke import InvokeError from core.model_runtime.errors.invoke import InvokeError
from libs import helper from libs import helper
from libs.helper import uuid_value from libs.helper import uuid_value
from libs.login import login_required from libs.login import current_user, login_required
from models import Account
from models.model import AppMode from models.model import AppMode
from services.app_generate_service import AppGenerateService from services.app_generate_service import AppGenerateService
from services.errors.llm import InvokeRateLimitError from services.errors.llm import InvokeRateLimitError
@@ -38,7 +38,27 @@ logger = logging.getLogger(__name__)
# define completion message api for user # define completion message api for user
@console_ns.route("/apps/<uuid:app_id>/completion-messages")
class CompletionMessageApi(Resource): class CompletionMessageApi(Resource):
@api.doc("create_completion_message")
@api.doc(description="Generate completion message for debugging")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"CompletionMessageRequest",
{
"inputs": fields.Raw(required=True, description="Input variables"),
"query": fields.String(description="Query text", default=""),
"files": fields.List(fields.Raw(), description="Uploaded files"),
"model_config": fields.Raw(required=True, description="Model configuration"),
"response_mode": fields.String(enum=["blocking", "streaming"], description="Response mode"),
"retriever_from": fields.String(default="dev", description="Retriever source"),
},
)
)
@api.response(200, "Completion generated successfully")
@api.response(400, "Invalid request parameters")
@api.response(404, "App not found")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -56,11 +76,11 @@ class CompletionMessageApi(Resource):
streaming = args["response_mode"] != "blocking"
args["auto_generate_name"] = False
-account = flask_login.current_user
try:
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account or EndUser instance")
response = AppGenerateService.generate(
-app_model=app_model, user=account, args=args, invoke_from=InvokeFrom.DEBUGGER, streaming=streaming
+app_model=app_model, user=current_user, args=args, invoke_from=InvokeFrom.DEBUGGER, streaming=streaming
)
return helper.compact_generate_response(response)
@@ -86,25 +106,58 @@ class CompletionMessageApi(Resource):
raise InternalServerError() raise InternalServerError()
@console_ns.route("/apps/<uuid:app_id>/completion-messages/<string:task_id>/stop")
class CompletionMessageStopApi(Resource): class CompletionMessageStopApi(Resource):
@api.doc("stop_completion_message")
@api.doc(description="Stop a running completion message generation")
@api.doc(params={"app_id": "Application ID", "task_id": "Task ID to stop"})
@api.response(200, "Task stopped successfully")
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.COMPLETION)
def post(self, app_model, task_id):
-account = flask_login.current_user
+if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
-AppQueueManager.set_stop_flag(task_id, InvokeFrom.DEBUGGER, account.id)
+AppQueueManager.set_stop_flag(task_id, InvokeFrom.DEBUGGER, current_user.id)
return {"result": "success"}, 200
@console_ns.route("/apps/<uuid:app_id>/chat-messages")
class ChatMessageApi(Resource): class ChatMessageApi(Resource):
@api.doc("create_chat_message")
@api.doc(description="Generate chat message for debugging")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"ChatMessageRequest",
{
"inputs": fields.Raw(required=True, description="Input variables"),
"query": fields.String(required=True, description="User query"),
"files": fields.List(fields.Raw(), description="Uploaded files"),
"model_config": fields.Raw(required=True, description="Model configuration"),
"conversation_id": fields.String(description="Conversation ID"),
"parent_message_id": fields.String(description="Parent message ID"),
"response_mode": fields.String(enum=["blocking", "streaming"], description="Response mode"),
"retriever_from": fields.String(default="dev", description="Retriever source"),
},
)
)
@api.response(200, "Chat message generated successfully")
@api.response(400, "Invalid request parameters")
@api.response(404, "App or conversation not found")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT]) @get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT])
def post(self, app_model): def post(self, app_model):
if not isinstance(current_user, Account):
raise Forbidden()
if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser() parser = reqparse.RequestParser()
parser.add_argument("inputs", type=dict, required=True, location="json") parser.add_argument("inputs", type=dict, required=True, location="json")
parser.add_argument("query", type=str, required=True, location="json") parser.add_argument("query", type=str, required=True, location="json")
@@ -123,11 +176,11 @@ class ChatMessageApi(Resource):
if external_trace_id:
args["external_trace_id"] = external_trace_id
-account = flask_login.current_user
try:
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account or EndUser instance")
response = AppGenerateService.generate(
-app_model=app_model, user=account, args=args, invoke_from=InvokeFrom.DEBUGGER, streaming=streaming
+app_model=app_model, user=current_user, args=args, invoke_from=InvokeFrom.DEBUGGER, streaming=streaming
)
return helper.compact_generate_response(response)
@@ -155,20 +208,19 @@ class ChatMessageApi(Resource):
raise InternalServerError() raise InternalServerError()
@console_ns.route("/apps/<uuid:app_id>/chat-messages/<string:task_id>/stop")
class ChatMessageStopApi(Resource): class ChatMessageStopApi(Resource):
@api.doc("stop_chat_message")
@api.doc(description="Stop a running chat message generation")
@api.doc(params={"app_id": "Application ID", "task_id": "Task ID to stop"})
@api.response(200, "Task stopped successfully")
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT])
def post(self, app_model, task_id):
-account = flask_login.current_user
+if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
-AppQueueManager.set_stop_flag(task_id, InvokeFrom.DEBUGGER, account.id)
+AppQueueManager.set_stop_flag(task_id, InvokeFrom.DEBUGGER, current_user.id)
return {"result": "success"}, 200
-api.add_resource(CompletionMessageApi, "/apps/<uuid:app_id>/completion-messages")
-api.add_resource(CompletionMessageStopApi, "/apps/<uuid:app_id>/completion-messages/<string:task_id>/stop")
-api.add_resource(ChatMessageApi, "/apps/<uuid:app_id>/chat-messages")
-api.add_resource(ChatMessageStopApi, "/apps/<uuid:app_id>/chat-messages/<string:task_id>/stop")
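The completion and chat handlers above also swap the flask_login.current_user alias for libs.login.current_user and guard it with an isinstance check before handing it to AppGenerateService. A rough, runnable sketch of that guard pattern, with stand-ins for Account and the generate service:

class Account:  # stand-in for models.Account
    id = "account-id"

current_user = Account()  # stand-in for libs.login.current_user

def generate(user):  # stand-in for AppGenerateService.generate
    return {"user": user.id}

def post_handler():
    # Fail fast with a clear error instead of letting an anonymous or None
    # user propagate into the generation service.
    if not isinstance(current_user, Account):
        raise ValueError("current_user must be an Account instance")
    return generate(user=current_user)

print(post_handler())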

View File

@@ -8,7 +8,7 @@ from sqlalchemy import func, or_
from sqlalchemy.orm import joinedload
from werkzeug.exceptions import Forbidden, NotFound
-from controllers.console import api
+from controllers.console import api, console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from core.app.entities.app_invoke_entities import InvokeFrom
@@ -22,13 +22,35 @@ from fields.conversation_fields import (
from libs.datetime_utils import naive_utc_now
from libs.helper import DatetimeString
from libs.login import login_required
-from models import Conversation, EndUser, Message, MessageAnnotation
+from models import Account, Conversation, EndUser, Message, MessageAnnotation
from models.model import AppMode
from services.conversation_service import ConversationService
from services.errors.conversation import ConversationNotExistsError
@console_ns.route("/apps/<uuid:app_id>/completion-conversations")
class CompletionConversationApi(Resource): class CompletionConversationApi(Resource):
@api.doc("list_completion_conversations")
@api.doc(description="Get completion conversations with pagination and filtering")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("keyword", type=str, location="args", help="Search keyword")
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
.add_argument(
"annotation_status",
type=str,
location="args",
choices=["annotated", "not_annotated", "all"],
default="all",
help="Annotation status filter",
)
.add_argument("page", type=int, location="args", default=1, help="Page number")
.add_argument("limit", type=int, location="args", default=20, help="Page size (1-100)")
)
@api.response(200, "Success", conversation_pagination_fields)
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -101,7 +123,14 @@ class CompletionConversationApi(Resource):
return conversations return conversations
@console_ns.route("/apps/<uuid:app_id>/completion-conversations/<uuid:conversation_id>")
class CompletionConversationDetailApi(Resource): class CompletionConversationDetailApi(Resource):
@api.doc("get_completion_conversation")
@api.doc(description="Get completion conversation details with messages")
@api.doc(params={"app_id": "Application ID", "conversation_id": "Conversation ID"})
@api.response(200, "Success", conversation_message_detail_fields)
@api.response(403, "Insufficient permissions")
@api.response(404, "Conversation not found")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -114,6 +143,12 @@ class CompletionConversationDetailApi(Resource):
return _get_conversation(app_model, conversation_id) return _get_conversation(app_model, conversation_id)
@api.doc("delete_completion_conversation")
@api.doc(description="Delete a completion conversation")
@api.doc(params={"app_id": "Application ID", "conversation_id": "Conversation ID"})
@api.response(204, "Conversation deleted successfully")
@api.response(403, "Insufficient permissions")
@api.response(404, "Conversation not found")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -124,6 +159,8 @@ class CompletionConversationDetailApi(Resource):
conversation_id = str(conversation_id) conversation_id = str(conversation_id)
try: try:
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
ConversationService.delete(app_model, conversation_id, current_user) ConversationService.delete(app_model, conversation_id, current_user)
except ConversationNotExistsError: except ConversationNotExistsError:
raise NotFound("Conversation Not Exists.") raise NotFound("Conversation Not Exists.")
@@ -131,7 +168,38 @@ class CompletionConversationDetailApi(Resource):
return {"result": "success"}, 204 return {"result": "success"}, 204
@console_ns.route("/apps/<uuid:app_id>/chat-conversations")
class ChatConversationApi(Resource): class ChatConversationApi(Resource):
@api.doc("list_chat_conversations")
@api.doc(description="Get chat conversations with pagination, filtering and summary")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("keyword", type=str, location="args", help="Search keyword")
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
.add_argument(
"annotation_status",
type=str,
location="args",
choices=["annotated", "not_annotated", "all"],
default="all",
help="Annotation status filter",
)
.add_argument("message_count_gte", type=int, location="args", help="Minimum message count")
.add_argument("page", type=int, location="args", default=1, help="Page number")
.add_argument("limit", type=int, location="args", default=20, help="Page size (1-100)")
.add_argument(
"sort_by",
type=str,
location="args",
choices=["created_at", "-created_at", "updated_at", "-updated_at"],
default="-updated_at",
help="Sort field and direction",
)
)
@api.response(200, "Success", conversation_with_summary_pagination_fields)
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -239,7 +307,7 @@ class ChatConversationApi(Resource):
.having(func.count(Message.id) >= args["message_count_gte"])
)
-if app_model.mode == AppMode.ADVANCED_CHAT.value:
+if app_model.mode == AppMode.ADVANCED_CHAT:
query = query.where(Conversation.invoke_from != InvokeFrom.DEBUGGER.value)
match args["sort_by"]:
@@ -259,7 +327,14 @@ class ChatConversationApi(Resource):
return conversations
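The comparison above drops `.value` because AppMode behaves as a str-backed enum, so its members compare equal to plain strings such as the column value loaded from the database. A small illustration (this AppMode is a stand-in for models.model.AppMode):

from enum import StrEnum

class AppMode(StrEnum):  # stand-in for models.model.AppMode
    CHAT = "chat"
    ADVANCED_CHAT = "advanced-chat"

mode_from_db = "advanced-chat"
assert mode_from_db == AppMode.ADVANCED_CHAT        # holds without .value
assert mode_from_db == AppMode.ADVANCED_CHAT.value  # the older, equivalent check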
@console_ns.route("/apps/<uuid:app_id>/chat-conversations/<uuid:conversation_id>")
class ChatConversationDetailApi(Resource): class ChatConversationDetailApi(Resource):
@api.doc("get_chat_conversation")
@api.doc(description="Get chat conversation details")
@api.doc(params={"app_id": "Application ID", "conversation_id": "Conversation ID"})
@api.response(200, "Success", conversation_detail_fields)
@api.response(403, "Insufficient permissions")
@api.response(404, "Conversation not found")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -272,6 +347,12 @@ class ChatConversationDetailApi(Resource):
return _get_conversation(app_model, conversation_id) return _get_conversation(app_model, conversation_id)
@api.doc("delete_chat_conversation")
@api.doc(description="Delete a chat conversation")
@api.doc(params={"app_id": "Application ID", "conversation_id": "Conversation ID"})
@api.response(204, "Conversation deleted successfully")
@api.response(403, "Insufficient permissions")
@api.response(404, "Conversation not found")
@setup_required @setup_required
@login_required @login_required
@get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT]) @get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT])
@@ -282,6 +363,8 @@ class ChatConversationDetailApi(Resource):
conversation_id = str(conversation_id) conversation_id = str(conversation_id)
try: try:
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
ConversationService.delete(app_model, conversation_id, current_user) ConversationService.delete(app_model, conversation_id, current_user)
except ConversationNotExistsError: except ConversationNotExistsError:
raise NotFound("Conversation Not Exists.") raise NotFound("Conversation Not Exists.")
@@ -289,12 +372,6 @@ class ChatConversationDetailApi(Resource):
return {"result": "success"}, 204 return {"result": "success"}, 204
api.add_resource(CompletionConversationApi, "/apps/<uuid:app_id>/completion-conversations")
api.add_resource(CompletionConversationDetailApi, "/apps/<uuid:app_id>/completion-conversations/<uuid:conversation_id>")
api.add_resource(ChatConversationApi, "/apps/<uuid:app_id>/chat-conversations")
api.add_resource(ChatConversationDetailApi, "/apps/<uuid:app_id>/chat-conversations/<uuid:conversation_id>")
def _get_conversation(app_model, conversation_id): def _get_conversation(app_model, conversation_id):
conversation = ( conversation = (
db.session.query(Conversation) db.session.query(Conversation)

View File

@@ -2,7 +2,7 @@ from flask_restx import Resource, marshal_with, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session
-from controllers.console import api
+from controllers.console import api, console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from extensions.ext_database import db
@@ -12,7 +12,17 @@ from models import ConversationVariable
from models.model import AppMode from models.model import AppMode
@console_ns.route("/apps/<uuid:app_id>/conversation-variables")
class ConversationVariablesApi(Resource): class ConversationVariablesApi(Resource):
@api.doc("get_conversation_variables")
@api.doc(description="Get conversation variables for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser().add_argument(
"conversation_id", type=str, location="args", help="Conversation ID to filter variables"
)
)
@api.response(200, "Conversation variables retrieved successfully", paginated_conversation_variable_fields)
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -55,6 +65,3 @@ class ConversationVariablesApi(Resource):
for row in rows
],
}
-api.add_resource(ConversationVariablesApi, "/apps/<uuid:app_id>/conversation-variables")
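Note that the @api.expect(api.parser()...) blocks added in these controllers feed the generated Swagger page, while the handlers keep parsing the request with their own reqparse.RequestParser at runtime (flask_restx only enforces expectations when validation is switched on). A compressed sketch of that documentation/runtime split, assuming an Api called api and a Namespace called console_ns:

from flask import Flask
from flask_restx import Api, Namespace, Resource, reqparse

app = Flask(__name__)
api = Api(app)
console_ns = Namespace("console")
api.add_namespace(console_ns)

# Documentation-only parser: this is what shows up in Swagger UI.
doc_parser = api.parser()
doc_parser.add_argument("conversation_id", type=str, location="args", help="Conversation ID to filter variables")

@console_ns.route("/conversation-variables-demo")
class ConversationVariablesDemoApi(Resource):
    @api.expect(doc_parser)
    def get(self):
        # Runtime parser: what the endpoint actually reads and validates.
        parser = reqparse.RequestParser()
        parser.add_argument("conversation_id", type=str, location="args")
        args = parser.parse_args()
        return {"conversation_id": args["conversation_id"], "data": []}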

View File

@@ -1,9 +1,9 @@
from collections.abc import Sequence
from flask_login import current_user
-from flask_restx import Resource, reqparse
+from flask_restx import Resource, fields, reqparse
-from controllers.console import api
+from controllers.console import api, console_ns
from controllers.console.app.error import (
CompletionRequestError,
ProviderModelCurrentlyNotSupportError,
@@ -16,10 +16,29 @@ from core.helper.code_executor.javascript.javascript_code_provider import Javasc
from core.helper.code_executor.python3.python3_code_provider import Python3CodeProvider from core.helper.code_executor.python3.python3_code_provider import Python3CodeProvider
from core.llm_generator.llm_generator import LLMGenerator from core.llm_generator.llm_generator import LLMGenerator
from core.model_runtime.errors.invoke import InvokeError from core.model_runtime.errors.invoke import InvokeError
from extensions.ext_database import db
from libs.login import login_required from libs.login import login_required
from models import App
from services.workflow_service import WorkflowService
@console_ns.route("/rule-generate")
class RuleGenerateApi(Resource): class RuleGenerateApi(Resource):
@api.doc("generate_rule_config")
@api.doc(description="Generate rule configuration using LLM")
@api.expect(
api.model(
"RuleGenerateRequest",
{
"instruction": fields.String(required=True, description="Rule generation instruction"),
"model_config": fields.Raw(required=True, description="Model configuration"),
"no_variable": fields.Boolean(required=True, default=False, description="Whether to exclude variables"),
},
)
)
@api.response(200, "Rule configuration generated successfully")
@api.response(400, "Invalid request parameters")
@api.response(402, "Provider quota exceeded")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -50,7 +69,26 @@ class RuleGenerateApi(Resource):
return rules return rules
@console_ns.route("/rule-code-generate")
class RuleCodeGenerateApi(Resource): class RuleCodeGenerateApi(Resource):
@api.doc("generate_rule_code")
@api.doc(description="Generate code rules using LLM")
@api.expect(
api.model(
"RuleCodeGenerateRequest",
{
"instruction": fields.String(required=True, description="Code generation instruction"),
"model_config": fields.Raw(required=True, description="Model configuration"),
"no_variable": fields.Boolean(required=True, default=False, description="Whether to exclude variables"),
"code_language": fields.String(
default="javascript", description="Programming language for code generation"
),
},
)
)
@api.response(200, "Code rules generated successfully")
@api.response(400, "Invalid request parameters")
@api.response(402, "Provider quota exceeded")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -82,7 +120,22 @@ class RuleCodeGenerateApi(Resource):
return code_result return code_result
@console_ns.route("/rule-structured-output-generate")
class RuleStructuredOutputGenerateApi(Resource): class RuleStructuredOutputGenerateApi(Resource):
@api.doc("generate_structured_output")
@api.doc(description="Generate structured output rules using LLM")
@api.expect(
api.model(
"StructuredOutputGenerateRequest",
{
"instruction": fields.String(required=True, description="Structured output generation instruction"),
"model_config": fields.Raw(required=True, description="Model configuration"),
},
)
)
@api.response(200, "Structured output generated successfully")
@api.response(400, "Invalid request parameters")
@api.response(402, "Provider quota exceeded")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -111,7 +164,27 @@ class RuleStructuredOutputGenerateApi(Resource):
return structured_output return structured_output
@console_ns.route("/instruction-generate")
class InstructionGenerateApi(Resource): class InstructionGenerateApi(Resource):
@api.doc("generate_instruction")
@api.doc(description="Generate instruction for workflow nodes or general use")
@api.expect(
api.model(
"InstructionGenerateRequest",
{
"flow_id": fields.String(required=True, description="Workflow/Flow ID"),
"node_id": fields.String(description="Node ID for workflow context"),
"current": fields.String(description="Current instruction text"),
"language": fields.String(default="javascript", description="Programming language (javascript/python)"),
"instruction": fields.String(required=True, description="Instruction for generation"),
"model_config": fields.Raw(required=True, description="Model configuration"),
"ideal_output": fields.String(description="Expected ideal output"),
},
)
)
@api.response(200, "Instruction generated successfully")
@api.response(400, "Invalid request parameters or flow/workflow not found")
@api.response(402, "Provider quota exceeded")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -135,9 +208,6 @@ class InstructionGenerateApi(Resource):
try:
# Generate from nothing for a workflow node
if (args["current"] == code_template or args["current"] == "") and args["node_id"] != "":
-from models import App, db
-from services.workflow_service import WorkflowService
app = db.session.query(App).where(App.id == args["flow_id"]).first()
if not app:
return {"error": f"app {args['flow_id']} not found"}, 400
@@ -203,7 +273,21 @@ class InstructionGenerateApi(Resource):
raise CompletionRequestError(e.description) raise CompletionRequestError(e.description)
@console_ns.route("/instruction-generate/template")
class InstructionGenerationTemplateApi(Resource): class InstructionGenerationTemplateApi(Resource):
@api.doc("get_instruction_template")
@api.doc(description="Get instruction generation template")
@api.expect(
api.model(
"InstructionTemplateRequest",
{
"instruction": fields.String(required=True, description="Template instruction"),
"ideal_output": fields.String(description="Expected ideal output"),
},
)
)
@api.response(200, "Template retrieved successfully")
@api.response(400, "Invalid request parameters")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -222,10 +306,3 @@ class InstructionGenerationTemplateApi(Resource):
return {"data": INSTRUCTION_GENERATE_TEMPLATE_CODE} return {"data": INSTRUCTION_GENERATE_TEMPLATE_CODE}
case _: case _:
raise ValueError(f"Invalid type: {args['type']}") raise ValueError(f"Invalid type: {args['type']}")
api.add_resource(RuleGenerateApi, "/rule-generate")
api.add_resource(RuleCodeGenerateApi, "/rule-code-generate")
api.add_resource(RuleStructuredOutputGenerateApi, "/rule-structured-output-generate")
api.add_resource(InstructionGenerateApi, "/instruction-generate")
api.add_resource(InstructionGenerationTemplateApi, "/instruction-generate/template")
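The request bodies documented on the generator endpoints use api.model(...), which registers a named schema with the flask_restx Api and is what Swagger renders for @api.expect. A minimal sketch of declaring and reusing such a model; the model and route names below are illustrative, not the real Dify ones:

from flask import Flask
from flask_restx import Api, Namespace, Resource, fields

app = Flask(__name__)
api = Api(app)
console_ns = Namespace("console")
api.add_namespace(console_ns)

rule_generate_request = api.model(
    "RuleGenerateRequestDemo",
    {
        "instruction": fields.String(required=True, description="Rule generation instruction"),
        "model_config": fields.Raw(required=True, description="Model configuration"),
        "no_variable": fields.Boolean(required=True, default=False, description="Whether to exclude variables"),
    },
)

@console_ns.route("/rule-generate-demo")
class RuleGenerateDemoApi(Resource):
    @api.expect(rule_generate_request)
    @api.response(200, "Rule configuration generated successfully")
    def post(self):
        return {"rules": []}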

View File

@@ -2,10 +2,10 @@ import json
from enum import StrEnum
from flask_login import current_user
-from flask_restx import Resource, marshal_with, reqparse
+from flask_restx import Resource, fields, marshal_with, reqparse
from werkzeug.exceptions import NotFound
-from controllers.console import api
+from controllers.console import api, console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from extensions.ext_database import db
@@ -19,7 +19,12 @@ class AppMCPServerStatus(StrEnum):
INACTIVE = "inactive" INACTIVE = "inactive"
@console_ns.route("/apps/<uuid:app_id>/server")
class AppMCPServerController(Resource): class AppMCPServerController(Resource):
@api.doc("get_app_mcp_server")
@api.doc(description="Get MCP server configuration for an application")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "MCP server configuration retrieved successfully", app_server_fields)
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -29,6 +34,20 @@ class AppMCPServerController(Resource):
server = db.session.query(AppMCPServer).where(AppMCPServer.app_id == app_model.id).first() server = db.session.query(AppMCPServer).where(AppMCPServer.app_id == app_model.id).first()
return server return server
@api.doc("create_app_mcp_server")
@api.doc(description="Create MCP server configuration for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"MCPServerCreateRequest",
{
"description": fields.String(description="Server description"),
"parameters": fields.Raw(required=True, description="Server parameters configuration"),
},
)
)
@api.response(201, "MCP server configuration created successfully", app_server_fields)
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -59,6 +78,23 @@ class AppMCPServerController(Resource):
db.session.commit() db.session.commit()
return server return server
@api.doc("update_app_mcp_server")
@api.doc(description="Update MCP server configuration for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"MCPServerUpdateRequest",
{
"id": fields.String(required=True, description="Server ID"),
"description": fields.String(description="Server description"),
"parameters": fields.Raw(required=True, description="Server parameters configuration"),
"status": fields.String(description="Server status"),
},
)
)
@api.response(200, "MCP server configuration updated successfully", app_server_fields)
@api.response(403, "Insufficient permissions")
@api.response(404, "Server not found")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -94,7 +130,14 @@ class AppMCPServerController(Resource):
return server return server
@console_ns.route("/apps/<uuid:server_id>/server/refresh")
class AppMCPServerRefreshController(Resource): class AppMCPServerRefreshController(Resource):
@api.doc("refresh_app_mcp_server")
@api.doc(description="Refresh MCP server configuration and regenerate server code")
@api.doc(params={"server_id": "Server ID"})
@api.response(200, "MCP server refreshed successfully", app_server_fields)
@api.response(403, "Insufficient permissions")
@api.response(404, "Server not found")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -113,7 +156,3 @@ class AppMCPServerRefreshController(Resource):
server.server_code = AppMCPServer.generate_server_code(16)
db.session.commit()
return server
-api.add_resource(AppMCPServerController, "/apps/<uuid:app_id>/server")
-api.add_resource(AppMCPServerRefreshController, "/apps/<uuid:server_id>/server/refresh")
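The refresh endpoint regenerates the MCP server code with AppMCPServer.generate_server_code(16) and commits; the helper itself is not part of this diff. A plausible sketch of a 16-character alphanumeric code generator, offered purely as an assumption about what such a helper might look like:

import secrets
import string

def generate_server_code(length: int = 16) -> str:
    # Assumed behaviour: a random alphanumeric code of the requested length.
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_server_code(16))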

View File

@@ -1,12 +1,11 @@
import logging
-from flask_login import current_user
from flask_restx import Resource, fields, marshal_with, reqparse
from flask_restx.inputs import int_range
from sqlalchemy import exists, select
from werkzeug.exceptions import Forbidden, InternalServerError, NotFound
-from controllers.console import api
+from controllers.console import api, console_ns
from controllers.console.app.error import (
CompletionRequestError,
ProviderModelCurrentlyNotSupportError,
@@ -27,7 +26,8 @@ from extensions.ext_database import db
from fields.conversation_fields import annotation_fields, message_detail_fields
from libs.helper import uuid_value
from libs.infinite_scroll_pagination import InfiniteScrollPagination
-from libs.login import login_required
+from libs.login import current_user, login_required
from models.account import Account
from models.model import AppMode, Conversation, Message, MessageAnnotation, MessageFeedback
from services.annotation_service import AppAnnotationService
from services.errors.conversation import ConversationNotExistsError
@@ -37,6 +37,7 @@ from services.message_service import MessageService
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
@console_ns.route("/apps/<uuid:app_id>/chat-messages")
class ChatMessageListApi(Resource): class ChatMessageListApi(Resource):
message_infinite_scroll_pagination_fields = { message_infinite_scroll_pagination_fields = {
"limit": fields.Integer, "limit": fields.Integer,
@@ -44,6 +45,17 @@ class ChatMessageListApi(Resource):
"data": fields.List(fields.Nested(message_detail_fields)), "data": fields.List(fields.Nested(message_detail_fields)),
} }
@api.doc("list_chat_messages")
@api.doc(description="Get chat messages for a conversation with pagination")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("conversation_id", type=str, required=True, location="args", help="Conversation ID")
.add_argument("first_id", type=str, location="args", help="First message ID for pagination")
.add_argument("limit", type=int, location="args", default=20, help="Number of messages to return (1-100)")
)
@api.response(200, "Success", message_infinite_scroll_pagination_fields)
@api.response(404, "Conversation not found")
@setup_required @setup_required
@login_required @login_required
@get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT]) @get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT])
@@ -117,12 +129,31 @@ class ChatMessageListApi(Resource):
return InfiniteScrollPagination(data=history_messages, limit=args["limit"], has_more=has_more) return InfiniteScrollPagination(data=history_messages, limit=args["limit"], has_more=has_more)
@console_ns.route("/apps/<uuid:app_id>/feedbacks")
class MessageFeedbackApi(Resource): class MessageFeedbackApi(Resource):
@api.doc("create_message_feedback")
@api.doc(description="Create or update message feedback (like/dislike)")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"MessageFeedbackRequest",
{
"message_id": fields.String(required=True, description="Message ID"),
"rating": fields.String(enum=["like", "dislike"], description="Feedback rating"),
},
)
)
@api.response(200, "Feedback updated successfully")
@api.response(404, "Message not found")
@api.response(403, "Insufficient permissions")
@get_app_model
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@get_app_model
def post(self, app_model): def post(self, app_model):
if current_user is None:
raise Forbidden()
parser = reqparse.RequestParser() parser = reqparse.RequestParser()
parser.add_argument("message_id", required=True, type=uuid_value, location="json") parser.add_argument("message_id", required=True, type=uuid_value, location="json")
parser.add_argument("rating", type=str, choices=["like", "dislike", None], location="json") parser.add_argument("rating", type=str, choices=["like", "dislike", None], location="json")
@@ -159,7 +190,24 @@ class MessageFeedbackApi(Resource):
return {"result": "success"} return {"result": "success"}
@console_ns.route("/apps/<uuid:app_id>/annotations")
class MessageAnnotationApi(Resource): class MessageAnnotationApi(Resource):
@api.doc("create_message_annotation")
@api.doc(description="Create message annotation")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"MessageAnnotationRequest",
{
"message_id": fields.String(description="Message ID"),
"question": fields.String(required=True, description="Question text"),
"answer": fields.String(required=True, description="Answer text"),
"annotation_reply": fields.Raw(description="Annotation reply"),
},
)
)
@api.response(200, "Annotation created successfully", annotation_fields)
@api.response(403, "Insufficient permissions")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -167,7 +215,9 @@ class MessageAnnotationApi(Resource):
@get_app_model
@marshal_with(annotation_fields)
def post(self, app_model):
-if not current_user.is_editor:
+if not isinstance(current_user, Account):
raise Forbidden()
if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser()
@@ -181,18 +231,37 @@ class MessageAnnotationApi(Resource):
return annotation return annotation
@console_ns.route("/apps/<uuid:app_id>/annotations/count")
class MessageAnnotationCountApi(Resource): class MessageAnnotationCountApi(Resource):
@api.doc("get_annotation_count")
@api.doc(description="Get count of message annotations for the app")
@api.doc(params={"app_id": "Application ID"})
@api.response(
200,
"Annotation count retrieved successfully",
api.model("AnnotationCountResponse", {"count": fields.Integer(description="Number of annotations")}),
)
@get_app_model
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@get_app_model
def get(self, app_model): def get(self, app_model):
count = db.session.query(MessageAnnotation).where(MessageAnnotation.app_id == app_model.id).count() count = db.session.query(MessageAnnotation).where(MessageAnnotation.app_id == app_model.id).count()
return {"count": count} return {"count": count}
@console_ns.route("/apps/<uuid:app_id>/chat-messages/<uuid:message_id>/suggested-questions")
class MessageSuggestedQuestionApi(Resource): class MessageSuggestedQuestionApi(Resource):
@api.doc("get_message_suggested_questions")
@api.doc(description="Get suggested questions for a message")
@api.doc(params={"app_id": "Application ID", "message_id": "Message ID"})
@api.response(
200,
"Suggested questions retrieved successfully",
api.model("SuggestedQuestionsResponse", {"data": fields.List(fields.String(description="Suggested question"))}),
)
@api.response(404, "Message or conversation not found")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -225,7 +294,13 @@ class MessageSuggestedQuestionApi(Resource):
return {"data": questions} return {"data": questions}
@console_ns.route("/apps/<uuid:app_id>/messages/<uuid:message_id>")
class MessageApi(Resource): class MessageApi(Resource):
@api.doc("get_message")
@api.doc(description="Get message details by ID")
@api.doc(params={"app_id": "Application ID", "message_id": "Message ID"})
@api.response(200, "Message retrieved successfully", message_detail_fields)
@api.response(404, "Message not found")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -240,11 +315,3 @@ class MessageApi(Resource):
raise NotFound("Message Not Exists.") raise NotFound("Message Not Exists.")
return message return message
api.add_resource(MessageSuggestedQuestionApi, "/apps/<uuid:app_id>/chat-messages/<uuid:message_id>/suggested-questions")
api.add_resource(ChatMessageListApi, "/apps/<uuid:app_id>/chat-messages", endpoint="console_chat_messages")
api.add_resource(MessageFeedbackApi, "/apps/<uuid:app_id>/feedbacks")
api.add_resource(MessageAnnotationApi, "/apps/<uuid:app_id>/annotations")
api.add_resource(MessageAnnotationCountApi, "/apps/<uuid:app_id>/annotations/count")
api.add_resource(MessageApi, "/apps/<uuid:app_id>/messages/<uuid:message_id>", endpoint="console_message")
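ChatMessageListApi documents first_id/limit query arguments and wraps results in InfiniteScrollPagination, i.e. cursor-style paging where the client sends the id of the last message it already holds. A simplified, list-based sketch of that fetch-one-extra pattern (the real handler queries Message rows and the ids are UUIDs):

def paginate(messages, first_id=None, limit=20):
    # messages are assumed newest-first; first_id is the last item the client already has.
    if first_id is not None:
        ids = [m["id"] for m in messages]
        start = ids.index(first_id) + 1
    else:
        start = 0
    # Fetch one extra row to learn whether another page exists.
    window = messages[start : start + limit + 1]
    has_more = len(window) > limit
    return {"limit": limit, "has_more": has_more, "data": window[:limit]}

history = [{"id": str(i)} for i in range(50)]
page1 = paginate(history, limit=20)
page2 = paginate(history, first_id=page1["data"][-1]["id"], limit=20)
print(page1["has_more"], len(page2["data"]))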

View File

@@ -3,9 +3,10 @@ from typing import cast
from flask import request
from flask_login import current_user
-from flask_restx import Resource
+from flask_restx import Resource, fields
from werkzeug.exceptions import Forbidden
-from controllers.console import api
+from controllers.console import api, console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from core.agent.entities import AgentToolEntity
@@ -14,17 +15,51 @@ from core.tools.utils.configuration import ToolParameterConfigurationManager
from events.app_event import app_model_config_was_updated from events.app_event import app_model_config_was_updated
from extensions.ext_database import db from extensions.ext_database import db
from libs.login import login_required from libs.login import login_required
from models.account import Account
from models.model import AppMode, AppModelConfig from models.model import AppMode, AppModelConfig
from services.app_model_config_service import AppModelConfigService from services.app_model_config_service import AppModelConfigService
@console_ns.route("/apps/<uuid:app_id>/model-config")
class ModelConfigResource(Resource): class ModelConfigResource(Resource):
@api.doc("update_app_model_config")
@api.doc(description="Update application model configuration")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"ModelConfigRequest",
{
"provider": fields.String(description="Model provider"),
"model": fields.String(description="Model name"),
"configs": fields.Raw(description="Model configuration parameters"),
"opening_statement": fields.String(description="Opening statement"),
"suggested_questions": fields.List(fields.String(), description="Suggested questions"),
"more_like_this": fields.Raw(description="More like this configuration"),
"speech_to_text": fields.Raw(description="Speech to text configuration"),
"text_to_speech": fields.Raw(description="Text to speech configuration"),
"retrieval_model": fields.Raw(description="Retrieval model configuration"),
"tools": fields.List(fields.Raw(), description="Available tools"),
"dataset_configs": fields.Raw(description="Dataset configurations"),
"agent_mode": fields.Raw(description="Agent mode configuration"),
},
)
)
@api.response(200, "Model configuration updated successfully")
@api.response(400, "Invalid configuration")
@api.response(404, "App not found")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@get_app_model(mode=[AppMode.AGENT_CHAT, AppMode.CHAT, AppMode.COMPLETION]) @get_app_model(mode=[AppMode.AGENT_CHAT, AppMode.CHAT, AppMode.COMPLETION])
def post(self, app_model): def post(self, app_model):
"""Modify app model config""" """Modify app model config"""
if not isinstance(current_user, Account):
raise Forbidden()
if not current_user.has_edit_permission:
raise Forbidden()
assert current_user.current_tenant_id is not None, "The tenant information should be loaded."
# validate config # validate config
model_configuration = AppModelConfigService.validate_configuration( model_configuration = AppModelConfigService.validate_configuration(
tenant_id=current_user.current_tenant_id, tenant_id=current_user.current_tenant_id,
@@ -39,7 +74,7 @@ class ModelConfigResource(Resource):
)
new_app_model_config = new_app_model_config.from_model_config_dict(model_configuration)
-if app_model.mode == AppMode.AGENT_CHAT.value or app_model.is_agent:
+if app_model.mode == AppMode.AGENT_CHAT or app_model.is_agent:
# get original app model config
original_app_model_config = (
db.session.query(AppModelConfig).where(AppModelConfig.id == app_model.app_model_config_id).first()
@@ -142,6 +177,3 @@ class ModelConfigResource(Resource):
app_model_config_was_updated.send(app_model, app_model_config=new_app_model_config)
return {"result": "success"}
-api.add_resource(ModelConfigResource, "/apps/<uuid:app_id>/model-config")

View File

@@ -1,18 +1,31 @@
-from flask_restx import Resource, reqparse
+from flask_restx import Resource, fields, reqparse
from werkzeug.exceptions import BadRequest
-from controllers.console import api
+from controllers.console import api, console_ns
from controllers.console.app.error import TracingConfigCheckError, TracingConfigIsExist, TracingConfigNotExist
from controllers.console.wraps import account_initialization_required, setup_required
from libs.login import login_required
from services.ops_service import OpsService
@console_ns.route("/apps/<uuid:app_id>/trace-config")
class TraceAppConfigApi(Resource): class TraceAppConfigApi(Resource):
""" """
Manage trace app configurations Manage trace app configurations
""" """
@api.doc("get_trace_app_config")
@api.doc(description="Get tracing configuration for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser().add_argument(
"tracing_provider", type=str, required=True, location="args", help="Tracing provider name"
)
)
@api.response(
200, "Tracing configuration retrieved successfully", fields.Raw(description="Tracing configuration data")
)
@api.response(400, "Invalid request parameters")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -29,6 +42,22 @@ class TraceAppConfigApi(Resource):
except Exception as e: except Exception as e:
raise BadRequest(str(e)) raise BadRequest(str(e))
@api.doc("create_trace_app_config")
@api.doc(description="Create a new tracing configuration for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"TraceConfigCreateRequest",
{
"tracing_provider": fields.String(required=True, description="Tracing provider name"),
"tracing_config": fields.Raw(required=True, description="Tracing configuration data"),
},
)
)
@api.response(
201, "Tracing configuration created successfully", fields.Raw(description="Created configuration data")
)
@api.response(400, "Invalid request parameters or configuration already exists")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -51,6 +80,20 @@ class TraceAppConfigApi(Resource):
except Exception as e: except Exception as e:
raise BadRequest(str(e)) raise BadRequest(str(e))
@api.doc("update_trace_app_config")
@api.doc(description="Update an existing tracing configuration for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"TraceConfigUpdateRequest",
{
"tracing_provider": fields.String(required=True, description="Tracing provider name"),
"tracing_config": fields.Raw(required=True, description="Updated tracing configuration data"),
},
)
)
@api.response(200, "Tracing configuration updated successfully", fields.Raw(description="Success response"))
@api.response(400, "Invalid request parameters or configuration not found")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -71,6 +114,16 @@ class TraceAppConfigApi(Resource):
except Exception as e: except Exception as e:
raise BadRequest(str(e)) raise BadRequest(str(e))
@api.doc("delete_trace_app_config")
@api.doc(description="Delete an existing tracing configuration for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser().add_argument(
"tracing_provider", type=str, required=True, location="args", help="Tracing provider name"
)
)
@api.response(204, "Tracing configuration deleted successfully")
@api.response(400, "Invalid request parameters or configuration not found")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -87,6 +140,3 @@ class TraceAppConfigApi(Resource):
return {"result": "success"}, 204 return {"result": "success"}, 204
except Exception as e: except Exception as e:
raise BadRequest(str(e)) raise BadRequest(str(e))
api.add_resource(TraceAppConfigApi, "/apps/<uuid:app_id>/trace-config")

View File

@@ -1,16 +1,16 @@
from flask_login import current_user
-from flask_restx import Resource, marshal_with, reqparse
+from flask_restx import Resource, fields, marshal_with, reqparse
from werkzeug.exceptions import Forbidden, NotFound
from constants.languages import supported_language
-from controllers.console import api
+from controllers.console import api, console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from extensions.ext_database import db
from fields.app_fields import app_site_fields
from libs.datetime_utils import naive_utc_now
from libs.login import login_required
-from models import Site
+from models import Account, Site
def parse_app_site_args(): def parse_app_site_args():
@@ -36,7 +36,39 @@ def parse_app_site_args():
return parser.parse_args() return parser.parse_args()
@console_ns.route("/apps/<uuid:app_id>/site")
class AppSite(Resource): class AppSite(Resource):
@api.doc("update_app_site")
@api.doc(description="Update application site configuration")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"AppSiteRequest",
{
"title": fields.String(description="Site title"),
"icon_type": fields.String(description="Icon type"),
"icon": fields.String(description="Icon"),
"icon_background": fields.String(description="Icon background color"),
"description": fields.String(description="Site description"),
"default_language": fields.String(description="Default language"),
"chat_color_theme": fields.String(description="Chat color theme"),
"chat_color_theme_inverted": fields.Boolean(description="Inverted chat color theme"),
"customize_domain": fields.String(description="Custom domain"),
"copyright": fields.String(description="Copyright text"),
"privacy_policy": fields.String(description="Privacy policy"),
"custom_disclaimer": fields.String(description="Custom disclaimer"),
"customize_token_strategy": fields.String(
enum=["must", "allow", "not_allow"], description="Token strategy"
),
"prompt_public": fields.Boolean(description="Make prompt public"),
"show_workflow_steps": fields.Boolean(description="Show workflow steps"),
"use_icon_as_answer_icon": fields.Boolean(description="Use icon as answer icon"),
},
)
)
@api.response(200, "Site configuration updated successfully", app_site_fields)
@api.response(403, "Insufficient permissions")
@api.response(404, "App not found")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -75,6 +107,8 @@ class AppSite(Resource):
if value is not None: if value is not None:
setattr(site, attr_name, value) setattr(site, attr_name, value)
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
site.updated_by = current_user.id site.updated_by = current_user.id
site.updated_at = naive_utc_now() site.updated_at = naive_utc_now()
db.session.commit() db.session.commit()
@@ -82,7 +116,14 @@ class AppSite(Resource):
return site return site
@console_ns.route("/apps/<uuid:app_id>/site/access-token-reset")
class AppSiteAccessTokenReset(Resource): class AppSiteAccessTokenReset(Resource):
@api.doc("reset_app_site_access_token")
@api.doc(description="Reset access token for application site")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Access token reset successfully", app_site_fields)
@api.response(403, "Insufficient permissions (admin/owner required)")
@api.response(404, "App or site not found")
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@@ -99,12 +140,10 @@ class AppSiteAccessTokenReset(Resource):
raise NotFound
site.code = Site.generate_code(16)
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
site.updated_by = current_user.id
site.updated_at = naive_utc_now()
db.session.commit()
return site
-api.add_resource(AppSite, "/apps/<uuid:app_id>/site")
-api.add_resource(AppSiteAccessTokenReset, "/apps/<uuid:app_id>/site/access-token-reset")
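AppSite.post copies each non-None parsed argument onto the Site row with setattr before stamping updated_by and updated_at, as the hunks above show. A stripped-down sketch of that update loop over an allow-list of attributes, using stand-ins for Site and naive_utc_now:

from datetime import UTC, datetime

class Site:  # stand-in for models.Site
    title = "old title"
    copyright = ""
    updated_by = None
    updated_at = None

def naive_utc_now():  # stand-in for libs.datetime_utils.naive_utc_now
    return datetime.now(UTC).replace(tzinfo=None)

def update_site(site, args, user_id):
    allowed = ("title", "icon_type", "icon", "description", "copyright")
    for attr_name in allowed:
        value = args.get(attr_name)
        if value is not None:  # skip fields the request did not send
            setattr(site, attr_name, value)
    site.updated_by = user_id
    site.updated_at = naive_utc_now()
    return site

site = update_site(Site(), {"title": "New title", "copyright": None}, user_id="account-id")
print(site.title, site.updated_by)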

View File

@@ -5,9 +5,9 @@ import pytz
import sqlalchemy as sa
from flask import jsonify
from flask_login import current_user
-from flask_restx import Resource, reqparse
+from flask_restx import Resource, fields, reqparse
-from controllers.console import api
+from controllers.console import api, console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from core.app.entities.app_invoke_entities import InvokeFrom
@@ -17,11 +17,25 @@ from libs.login import login_required
from models import AppMode, Message from models import AppMode, Message
@console_ns.route("/apps/<uuid:app_id>/statistics/daily-messages")
class DailyMessageStatistic(Resource): class DailyMessageStatistic(Resource):
@api.doc("get_daily_message_statistics")
@api.doc(description="Get daily message statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
200,
"Daily message statistics retrieved successfully",
fields.List(fields.Raw(description="Daily message count data")),
)
@get_app_model
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@get_app_model
def get(self, app_model): def get(self, app_model):
account = current_user account = current_user
@@ -74,11 +88,25 @@ WHERE
return jsonify({"data": response_data}) return jsonify({"data": response_data})
@console_ns.route("/apps/<uuid:app_id>/statistics/daily-conversations")
class DailyConversationStatistic(Resource): class DailyConversationStatistic(Resource):
@api.doc("get_daily_conversation_statistics")
@api.doc(description="Get daily conversation statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
200,
"Daily conversation statistics retrieved successfully",
fields.List(fields.Raw(description="Daily conversation count data")),
)
@get_app_model
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@get_app_model
def get(self, app_model): def get(self, app_model):
account = current_user account = current_user
@@ -126,11 +154,25 @@ class DailyConversationStatistic(Resource):
return jsonify({"data": response_data}) return jsonify({"data": response_data})
@console_ns.route("/apps/<uuid:app_id>/statistics/daily-end-users")
class DailyTerminalsStatistic(Resource): class DailyTerminalsStatistic(Resource):
@api.doc("get_daily_terminals_statistics")
@api.doc(description="Get daily terminal/end-user statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
200,
"Daily terminal statistics retrieved successfully",
fields.List(fields.Raw(description="Daily terminal count data")),
)
@get_app_model
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@get_app_model
def get(self, app_model): def get(self, app_model):
account = current_user account = current_user
@@ -183,11 +225,25 @@ WHERE
return jsonify({"data": response_data}) return jsonify({"data": response_data})
@console_ns.route("/apps/<uuid:app_id>/statistics/token-costs")
class DailyTokenCostStatistic(Resource): class DailyTokenCostStatistic(Resource):
@api.doc("get_daily_token_cost_statistics")
@api.doc(description="Get daily token cost statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
200,
"Daily token cost statistics retrieved successfully",
fields.List(fields.Raw(description="Daily token cost data")),
)
@get_app_model
@setup_required @setup_required
@login_required @login_required
@account_initialization_required @account_initialization_required
@get_app_model
def get(self, app_model): def get(self, app_model):
account = current_user account = current_user
@@ -243,7 +299,21 @@ WHERE
return jsonify({"data": response_data})
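Each statistics endpoint documents optional start/end arguments in "YYYY-MM-DD HH:MM" form; judging from the surrounding code (pytz plus account = current_user), the handlers interpret those timestamps in the account's timezone and convert them to UTC before filtering. A hedged sketch of that conversion step:

from datetime import datetime

import pytz

def to_utc(raw: str, account_timezone: str) -> datetime:
    # raw is the "YYYY-MM-DD HH:MM" string documented on the start/end args.
    local_tz = pytz.timezone(account_timezone)
    naive = datetime.strptime(raw, "%Y-%m-%d %H:%M")
    return local_tz.localize(naive).astimezone(pytz.utc)

print(to_utc("2025-09-16 00:00", "Asia/Shanghai"))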
@console_ns.route("/apps/<uuid:app_id>/statistics/average-session-interactions")
class AverageSessionInteractionStatistic(Resource): class AverageSessionInteractionStatistic(Resource):
@api.doc("get_average_session_interaction_statistics")
@api.doc(description="Get average session interaction statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
200,
"Average session interaction statistics retrieved successfully",
fields.List(fields.Raw(description="Average session interaction data")),
)
@setup_required
@login_required
@account_initialization_required
@@ -319,11 +389,25 @@ ORDER BY
return jsonify({"data": response_data})
@console_ns.route("/apps/<uuid:app_id>/statistics/user-satisfaction-rate")
class UserSatisfactionRateStatistic(Resource):
@api.doc("get_user_satisfaction_rate_statistics")
@api.doc(description="Get user satisfaction rate statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
200,
"User satisfaction rate statistics retrieved successfully",
fields.List(fields.Raw(description="User satisfaction rate data")),
)
@get_app_model
@setup_required
@login_required
@account_initialization_required
-@get_app_model
def get(self, app_model):
account = current_user
@@ -385,7 +469,21 @@ WHERE
return jsonify({"data": response_data})
@console_ns.route("/apps/<uuid:app_id>/statistics/average-response-time")
class AverageResponseTimeStatistic(Resource):
@api.doc("get_average_response_time_statistics")
@api.doc(description="Get average response time statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
200,
"Average response time statistics retrieved successfully",
fields.List(fields.Raw(description="Average response time data")),
)
@setup_required
@login_required
@account_initialization_required
@@ -442,11 +540,25 @@ WHERE
return jsonify({"data": response_data})
@console_ns.route("/apps/<uuid:app_id>/statistics/tokens-per-second")
class TokensPerSecondStatistic(Resource):
@api.doc("get_tokens_per_second_statistics")
@api.doc(description="Get tokens per second statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
200,
"Tokens per second statistics retrieved successfully",
fields.List(fields.Raw(description="Tokens per second data")),
)
@get_app_model
@setup_required
@login_required
@account_initialization_required
-@get_app_model
def get(self, app_model):
account = current_user
@@ -500,13 +612,3 @@ WHERE
response_data.append({"date": str(i.date), "tps": round(i.tokens_per_second, 4)})
return jsonify({"data": response_data})
api.add_resource(DailyMessageStatistic, "/apps/<uuid:app_id>/statistics/daily-messages")
api.add_resource(DailyConversationStatistic, "/apps/<uuid:app_id>/statistics/daily-conversations")
api.add_resource(DailyTerminalsStatistic, "/apps/<uuid:app_id>/statistics/daily-end-users")
api.add_resource(DailyTokenCostStatistic, "/apps/<uuid:app_id>/statistics/token-costs")
api.add_resource(AverageSessionInteractionStatistic, "/apps/<uuid:app_id>/statistics/average-session-interactions")
api.add_resource(UserSatisfactionRateStatistic, "/apps/<uuid:app_id>/statistics/user-satisfaction-rate")
api.add_resource(AverageResponseTimeStatistic, "/apps/<uuid:app_id>/statistics/average-response-time")
api.add_resource(TokensPerSecondStatistic, "/apps/<uuid:app_id>/statistics/tokens-per-second")
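Editor's note: the pattern repeated in the hunks above is that explicit api.add_resource(...) registrations are dropped in favour of @console_ns.route(...) decorators, with api.doc/api.expect/api.response supplying the OpenAPI metadata. A minimal, self-contained sketch of the same idea follows; the Api/Namespace wiring shown here is illustrative only, not the actual Dify setup, though the decorator usage mirrors the diff.

# Sketch: Namespace.route replaces api.add_resource, and api.doc/api.response
# attach Swagger metadata to the Resource. App wiring below is hypothetical.
from flask import Flask
from flask_restx import Api, Namespace, Resource, fields

app = Flask(__name__)
api = Api(app)
console_ns = Namespace("console", path="/console/api")
api.add_namespace(console_ns)

stats_fields = api.model("DailyStat", {"date": fields.String(), "count": fields.Integer()})


@console_ns.route("/apps/<uuid:app_id>/statistics/daily-conversations")
class DailyConversationStatistic(Resource):
    @api.doc("get_daily_conversation_statistics")
    @api.doc(params={"app_id": "Application ID"})
    @api.response(200, "OK", fields.List(fields.Nested(stats_fields)))
    def get(self, app_id):
        # The real handlers run the SQL aggregation shown in the diff; this
        # stub only demonstrates where the route and docs are declared.
        return {"data": []}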

View File

@@ -4,18 +4,14 @@ from collections.abc import Sequence
from typing import cast
from flask import abort, request
-from flask_restx import Resource, inputs, marshal_with, reqparse
+from flask_restx import Resource, fields, inputs, marshal_with, reqparse
from sqlalchemy.orm import Session
from werkzeug.exceptions import Forbidden, InternalServerError, NotFound
import services
from configs import dify_config
-from controllers.console import api
+from controllers.console import api, console_ns
-from controllers.console.app.error import (
-    ConversationCompletedError,
-    DraftWorkflowNotExist,
-    DraftWorkflowNotSync,
-)
+from controllers.console.app.error import ConversationCompletedError, DraftWorkflowNotExist, DraftWorkflowNotSync
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.web.error import InvokeRateLimitError as InvokeRateLimitHttpError
@@ -24,6 +20,7 @@ from core.app.apps.base_app_queue_manager import AppQueueManager
from core.app.entities.app_invoke_entities import InvokeFrom
from core.file.models import File
from core.helper.trace_id_helper import get_external_trace_id
from core.workflow.graph_engine.manager import GraphEngineManager
from extensions.ext_database import db
from factories import file_factory, variable_factory
from fields.workflow_fields import workflow_fields, workflow_pagination_fields
@@ -61,7 +58,13 @@ def _parse_file(workflow: Workflow, files: list[dict] | None = None) -> Sequence
return file_objs
@console_ns.route("/apps/<uuid:app_id>/workflows/draft")
class DraftWorkflowApi(Resource):
@api.doc("get_draft_workflow")
@api.doc(description="Get draft workflow for an application")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Draft workflow retrieved successfully", workflow_fields)
@api.response(404, "Draft workflow not found")
@setup_required
@login_required
@account_initialization_required
@@ -73,7 +76,7 @@ class DraftWorkflowApi(Resource):
""" """
# The role of the current user in the ta table must be admin, owner, or editor # The role of the current user in the ta table must be admin, owner, or editor
assert isinstance(current_user, Account) assert isinstance(current_user, Account)
if not current_user.is_editor: if not current_user.has_edit_permission:
raise Forbidden() raise Forbidden()
# fetch draft workflow by app_model # fetch draft workflow by app_model
@@ -90,13 +93,30 @@ class DraftWorkflowApi(Resource):
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.ADVANCED_CHAT, AppMode.WORKFLOW])
@api.doc("sync_draft_workflow")
@api.doc(description="Sync draft workflow configuration")
@api.expect(
api.model(
"SyncDraftWorkflowRequest",
{
"graph": fields.Raw(required=True, description="Workflow graph configuration"),
"features": fields.Raw(required=True, description="Workflow features configuration"),
"hash": fields.String(description="Workflow hash for validation"),
"environment_variables": fields.List(fields.Raw, required=True, description="Environment variables"),
"conversation_variables": fields.List(fields.Raw, description="Conversation variables"),
},
)
)
@api.response(200, "Draft workflow synced successfully", workflow_fields)
@api.response(400, "Invalid workflow configuration")
@api.response(403, "Permission denied")
def post(self, app_model: App):
"""
Sync draft workflow
"""
# The role of the current user in the ta table must be admin, owner, or editor
assert isinstance(current_user, Account)
-if not current_user.is_editor:
+if not current_user.has_edit_permission:
raise Forbidden()
content_type = request.headers.get("Content-Type", "")
@@ -163,7 +183,25 @@ class DraftWorkflowApi(Resource):
}
@console_ns.route("/apps/<uuid:app_id>/advanced-chat/workflows/draft/run")
class AdvancedChatDraftWorkflowRunApi(Resource):
@api.doc("run_advanced_chat_draft_workflow")
@api.doc(description="Run draft workflow for advanced chat application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"AdvancedChatWorkflowRunRequest",
{
"query": fields.String(required=True, description="User query"),
"inputs": fields.Raw(description="Input variables"),
"files": fields.List(fields.Raw, description="File uploads"),
"conversation_id": fields.String(description="Conversation ID"),
},
)
)
@api.response(200, "Workflow run started successfully")
@api.response(400, "Invalid request parameters")
@api.response(403, "Permission denied")
@setup_required
@login_required
@account_initialization_required
@@ -174,7 +212,7 @@ class AdvancedChatDraftWorkflowRunApi(Resource):
""" """
# The role of the current user in the ta table must be admin, owner, or editor # The role of the current user in the ta table must be admin, owner, or editor
assert isinstance(current_user, Account) assert isinstance(current_user, Account)
if not current_user.is_editor: if not current_user.has_edit_permission:
raise Forbidden() raise Forbidden()
if not isinstance(current_user, Account): if not isinstance(current_user, Account):
@@ -212,7 +250,23 @@ class AdvancedChatDraftWorkflowRunApi(Resource):
raise InternalServerError()
@console_ns.route("/apps/<uuid:app_id>/advanced-chat/workflows/draft/iteration/nodes/<string:node_id>/run")
class AdvancedChatDraftRunIterationNodeApi(Resource):
@api.doc("run_advanced_chat_draft_iteration_node")
@api.doc(description="Run draft workflow iteration node for advanced chat")
@api.doc(params={"app_id": "Application ID", "node_id": "Node ID"})
@api.expect(
api.model(
"IterationNodeRunRequest",
{
"task_id": fields.String(required=True, description="Task ID"),
"inputs": fields.Raw(description="Input variables"),
},
)
)
@api.response(200, "Iteration node run started successfully")
@api.response(403, "Permission denied")
@api.response(404, "Node not found")
@setup_required
@login_required
@account_initialization_required
@@ -224,7 +278,7 @@ class AdvancedChatDraftRunIterationNodeApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# The role of the current user in the ta table must be admin, owner, or editor
-if not current_user.is_editor:
+if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser()
@@ -248,7 +302,23 @@ class AdvancedChatDraftRunIterationNodeApi(Resource):
raise InternalServerError()
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/iteration/nodes/<string:node_id>/run")
class WorkflowDraftRunIterationNodeApi(Resource):
@api.doc("run_workflow_draft_iteration_node")
@api.doc(description="Run draft workflow iteration node")
@api.doc(params={"app_id": "Application ID", "node_id": "Node ID"})
@api.expect(
api.model(
"WorkflowIterationNodeRunRequest",
{
"task_id": fields.String(required=True, description="Task ID"),
"inputs": fields.Raw(description="Input variables"),
},
)
)
@api.response(200, "Workflow iteration node run started successfully")
@api.response(403, "Permission denied")
@api.response(404, "Node not found")
@setup_required
@login_required
@account_initialization_required
@@ -260,7 +330,7 @@ class WorkflowDraftRunIterationNodeApi(Resource):
# The role of the current user in the ta table must be admin, owner, or editor
if not isinstance(current_user, Account):
raise Forbidden()
-if not current_user.is_editor:
+if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser()
@@ -284,7 +354,23 @@ class WorkflowDraftRunIterationNodeApi(Resource):
raise InternalServerError()
@console_ns.route("/apps/<uuid:app_id>/advanced-chat/workflows/draft/loop/nodes/<string:node_id>/run")
class AdvancedChatDraftRunLoopNodeApi(Resource):
@api.doc("run_advanced_chat_draft_loop_node")
@api.doc(description="Run draft workflow loop node for advanced chat")
@api.doc(params={"app_id": "Application ID", "node_id": "Node ID"})
@api.expect(
api.model(
"LoopNodeRunRequest",
{
"task_id": fields.String(required=True, description="Task ID"),
"inputs": fields.Raw(description="Input variables"),
},
)
)
@api.response(200, "Loop node run started successfully")
@api.response(403, "Permission denied")
@api.response(404, "Node not found")
@setup_required
@login_required
@account_initialization_required
@@ -297,7 +383,7 @@ class AdvancedChatDraftRunLoopNodeApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# The role of the current user in the ta table must be admin, owner, or editor
-if not current_user.is_editor:
+if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser()
@@ -321,7 +407,23 @@ class AdvancedChatDraftRunLoopNodeApi(Resource):
raise InternalServerError()
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/loop/nodes/<string:node_id>/run")
class WorkflowDraftRunLoopNodeApi(Resource):
@api.doc("run_workflow_draft_loop_node")
@api.doc(description="Run draft workflow loop node")
@api.doc(params={"app_id": "Application ID", "node_id": "Node ID"})
@api.expect(
api.model(
"WorkflowLoopNodeRunRequest",
{
"task_id": fields.String(required=True, description="Task ID"),
"inputs": fields.Raw(description="Input variables"),
},
)
)
@api.response(200, "Workflow loop node run started successfully")
@api.response(403, "Permission denied")
@api.response(404, "Node not found")
@setup_required
@login_required
@account_initialization_required
@@ -334,7 +436,7 @@ class WorkflowDraftRunLoopNodeApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# The role of the current user in the ta table must be admin, owner, or editor
-if not current_user.is_editor:
+if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser()
@@ -358,7 +460,22 @@ class WorkflowDraftRunLoopNodeApi(Resource):
raise InternalServerError()
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/run")
class DraftWorkflowRunApi(Resource):
@api.doc("run_draft_workflow")
@api.doc(description="Run draft workflow")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"DraftWorkflowRunRequest",
{
"inputs": fields.Raw(required=True, description="Input variables"),
"files": fields.List(fields.Raw, description="File uploads"),
},
)
)
@api.response(200, "Draft workflow run started successfully")
@api.response(403, "Permission denied")
@setup_required
@login_required
@account_initialization_required
@@ -371,7 +488,7 @@ class DraftWorkflowRunApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# The role of the current user in the ta table must be admin, owner, or editor
-if not current_user.is_editor:
+if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser()
@@ -397,7 +514,14 @@ class DraftWorkflowRunApi(Resource):
raise InvokeRateLimitHttpError(ex.description)
@console_ns.route("/apps/<uuid:app_id>/workflow-runs/tasks/<string:task_id>/stop")
class WorkflowTaskStopApi(Resource):
@api.doc("stop_workflow_task")
@api.doc(description="Stop running workflow task")
@api.doc(params={"app_id": "Application ID", "task_id": "Task ID"})
@api.response(200, "Task stopped successfully")
@api.response(404, "Task not found")
@api.response(403, "Permission denied")
@setup_required
@login_required
@account_initialization_required
@@ -410,15 +534,35 @@ class WorkflowTaskStopApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# The role of the current user in the ta table must be admin, owner, or editor
-if not current_user.is_editor:
+if not current_user.has_edit_permission:
raise Forbidden()
-AppQueueManager.set_stop_flag(task_id, InvokeFrom.DEBUGGER, current_user.id)
+# Stop using both mechanisms for backward compatibility
+# Legacy stop flag mechanism (without user check)
+AppQueueManager.set_stop_flag_no_user_check(task_id)
+# New graph engine command channel mechanism
+GraphEngineManager.send_stop_command(task_id)
return {"result": "success"}
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/nodes/<string:node_id>/run")
class DraftWorkflowNodeRunApi(Resource):
@api.doc("run_draft_workflow_node")
@api.doc(description="Run draft workflow node")
@api.doc(params={"app_id": "Application ID", "node_id": "Node ID"})
@api.expect(
api.model(
"DraftWorkflowNodeRunRequest",
{
"inputs": fields.Raw(description="Input variables"),
},
)
)
@api.response(200, "Node run started successfully", workflow_run_node_execution_fields)
@api.response(403, "Permission denied")
@api.response(404, "Node not found")
@setup_required
@login_required
@account_initialization_required
@@ -432,7 +576,7 @@ class DraftWorkflowNodeRunApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# The role of the current user in the ta table must be admin, owner, or editor
-if not current_user.is_editor:
+if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser()
@@ -466,7 +610,13 @@ class DraftWorkflowNodeRunApi(Resource):
return workflow_node_execution
@console_ns.route("/apps/<uuid:app_id>/workflows/publish")
class PublishedWorkflowApi(Resource):
@api.doc("get_published_workflow")
@api.doc(description="Get published workflow for an application")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Published workflow retrieved successfully", workflow_fields)
@api.response(404, "Published workflow not found")
@setup_required
@login_required
@account_initialization_required
@@ -480,7 +630,7 @@ class PublishedWorkflowApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# The role of the current user in the ta table must be admin, owner, or editor
-if not current_user.is_editor:
+if not current_user.has_edit_permission:
raise Forbidden()
# fetch published workflow by app_model
@@ -501,7 +651,7 @@ class PublishedWorkflowApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# The role of the current user in the ta table must be admin, owner, or editor
-if not current_user.is_editor:
+if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser()
@@ -538,7 +688,12 @@ class PublishedWorkflowApi(Resource):
}
@console_ns.route("/apps/<uuid:app_id>/workflows/default-workflow-block-configs")
class DefaultBlockConfigsApi(Resource):
@api.doc("get_default_block_configs")
@api.doc(description="Get default block configurations for workflow")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Default block configurations retrieved successfully")
@setup_required
@login_required
@account_initialization_required
@@ -551,7 +706,7 @@ class DefaultBlockConfigsApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# The role of the current user in the ta table must be admin, owner, or editor
-if not current_user.is_editor:
+if not current_user.has_edit_permission:
raise Forbidden()
# Get default block configs
@@ -559,7 +714,13 @@ class DefaultBlockConfigsApi(Resource):
return workflow_service.get_default_block_configs()
@console_ns.route("/apps/<uuid:app_id>/workflows/default-workflow-block-configs/<string:block_type>")
class DefaultBlockConfigApi(Resource):
@api.doc("get_default_block_config")
@api.doc(description="Get default block configuration by type")
@api.doc(params={"app_id": "Application ID", "block_type": "Block type"})
@api.response(200, "Default block configuration retrieved successfully")
@api.response(404, "Block type not found")
@setup_required
@login_required
@account_initialization_required
@@ -571,7 +732,7 @@ class DefaultBlockConfigApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# The role of the current user in the ta table must be admin, owner, or editor
-if not current_user.is_editor:
+if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser()
@@ -592,7 +753,14 @@ class DefaultBlockConfigApi(Resource):
return workflow_service.get_default_block_config(node_type=block_type, filters=filters)
@console_ns.route("/apps/<uuid:app_id>/convert-to-workflow")
class ConvertToWorkflowApi(Resource):
@api.doc("convert_to_workflow")
@api.doc(description="Convert application to workflow mode")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Application converted to workflow successfully")
@api.response(400, "Application cannot be converted")
@api.response(403, "Permission denied")
@setup_required
@login_required
@account_initialization_required
@@ -606,7 +774,7 @@ class ConvertToWorkflowApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# The role of the current user in the ta table must be admin, owner, or editor
-if not current_user.is_editor:
+if not current_user.has_edit_permission:
raise Forbidden()
if request.data:
@@ -629,9 +797,14 @@ class ConvertToWorkflowApi(Resource):
}
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/config")
class WorkflowConfigApi(Resource):
"""Resource for workflow configuration."""
@api.doc("get_workflow_config")
@api.doc(description="Get workflow configuration")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Workflow configuration retrieved successfully")
@setup_required
@login_required
@account_initialization_required
@@ -642,7 +815,12 @@ class WorkflowConfigApi(Resource):
}
@console_ns.route("/apps/<uuid:app_id>/workflows")
class PublishedAllWorkflowApi(Resource):
@api.doc("get_all_published_workflows")
@api.doc(description="Get all published workflows for an application")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Published workflows retrieved successfully", workflow_pagination_fields)
@setup_required
@login_required
@account_initialization_required
@@ -655,7 +833,7 @@ class PublishedAllWorkflowApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
-if not current_user.is_editor:
+if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser()
@@ -693,7 +871,23 @@ class PublishedAllWorkflowApi(Resource):
}
@console_ns.route("/apps/<uuid:app_id>/workflows/<string:workflow_id>")
class WorkflowByIdApi(Resource):
@api.doc("update_workflow_by_id")
@api.doc(description="Update workflow by ID")
@api.doc(params={"app_id": "Application ID", "workflow_id": "Workflow ID"})
@api.expect(
api.model(
"UpdateWorkflowRequest",
{
"environment_variables": fields.List(fields.Raw, description="Environment variables"),
"conversation_variables": fields.List(fields.Raw, description="Conversation variables"),
},
)
)
@api.response(200, "Workflow updated successfully", workflow_fields)
@api.response(404, "Workflow not found")
@api.response(403, "Permission denied")
@setup_required
@login_required
@account_initialization_required
@@ -706,7 +900,7 @@ class WorkflowByIdApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# Check permission
-if not current_user.is_editor:
+if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser()
@@ -719,7 +913,6 @@ class WorkflowByIdApi(Resource):
raise ValueError("Marked name cannot exceed 20 characters") raise ValueError("Marked name cannot exceed 20 characters")
if args.marked_comment and len(args.marked_comment) > 100: if args.marked_comment and len(args.marked_comment) > 100:
raise ValueError("Marked comment cannot exceed 100 characters") raise ValueError("Marked comment cannot exceed 100 characters")
args = parser.parse_args()
# Prepare update data # Prepare update data
update_data = {} update_data = {}
@@ -762,7 +955,7 @@ class WorkflowByIdApi(Resource):
if not isinstance(current_user, Account):
raise Forbidden()
# Check permission
-if not current_user.is_editor:
+if not current_user.has_edit_permission:
raise Forbidden()
workflow_service = WorkflowService()
@@ -785,7 +978,14 @@ class WorkflowByIdApi(Resource):
return None, 204
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/nodes/<string:node_id>/last-run")
class DraftWorkflowNodeLastRunApi(Resource):
@api.doc("get_draft_workflow_node_last_run")
@api.doc(description="Get last run result for draft workflow node")
@api.doc(params={"app_id": "Application ID", "node_id": "Node ID"})
@api.response(200, "Node last run retrieved successfully", workflow_run_node_execution_fields)
@api.response(404, "Node last run not found")
@api.response(403, "Permission denied")
@setup_required
@login_required
@account_initialization_required
@@ -804,73 +1004,3 @@ class DraftWorkflowNodeLastRunApi(Resource):
if node_exec is None:
raise NotFound("last run not found")
return node_exec
api.add_resource(
DraftWorkflowApi,
"/apps/<uuid:app_id>/workflows/draft",
)
api.add_resource(
WorkflowConfigApi,
"/apps/<uuid:app_id>/workflows/draft/config",
)
api.add_resource(
AdvancedChatDraftWorkflowRunApi,
"/apps/<uuid:app_id>/advanced-chat/workflows/draft/run",
)
api.add_resource(
DraftWorkflowRunApi,
"/apps/<uuid:app_id>/workflows/draft/run",
)
api.add_resource(
WorkflowTaskStopApi,
"/apps/<uuid:app_id>/workflow-runs/tasks/<string:task_id>/stop",
)
api.add_resource(
DraftWorkflowNodeRunApi,
"/apps/<uuid:app_id>/workflows/draft/nodes/<string:node_id>/run",
)
api.add_resource(
AdvancedChatDraftRunIterationNodeApi,
"/apps/<uuid:app_id>/advanced-chat/workflows/draft/iteration/nodes/<string:node_id>/run",
)
api.add_resource(
WorkflowDraftRunIterationNodeApi,
"/apps/<uuid:app_id>/workflows/draft/iteration/nodes/<string:node_id>/run",
)
api.add_resource(
AdvancedChatDraftRunLoopNodeApi,
"/apps/<uuid:app_id>/advanced-chat/workflows/draft/loop/nodes/<string:node_id>/run",
)
api.add_resource(
WorkflowDraftRunLoopNodeApi,
"/apps/<uuid:app_id>/workflows/draft/loop/nodes/<string:node_id>/run",
)
api.add_resource(
PublishedWorkflowApi,
"/apps/<uuid:app_id>/workflows/publish",
)
api.add_resource(
PublishedAllWorkflowApi,
"/apps/<uuid:app_id>/workflows",
)
api.add_resource(
DefaultBlockConfigsApi,
"/apps/<uuid:app_id>/workflows/default-workflow-block-configs",
)
api.add_resource(
DefaultBlockConfigApi,
"/apps/<uuid:app_id>/workflows/default-workflow-block-configs/<string:block_type>",
)
api.add_resource(
ConvertToWorkflowApi,
"/apps/<uuid:app_id>/convert-to-workflow",
)
api.add_resource(
WorkflowByIdApi,
"/apps/<uuid:app_id>/workflows/<string:workflow_id>",
)
api.add_resource(
DraftWorkflowNodeLastRunApi,
"/apps/<uuid:app_id>/workflows/draft/nodes/<string:node_id>/last-run",
)
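Editor's note: the WorkflowTaskStopApi hunk above replaces the single AppQueueManager stop flag with two parallel stop paths. The sketch below shows the shape of that change; the real AppQueueManager and GraphEngineManager live in the Dify codebase and only their method names come from the diff, so their internals here are stand-ins, not the actual implementation.

# Illustrative sketch: set the legacy flag AND send a command on the new
# graph-engine channel, mirroring the handler body in the diff.
class AppQueueManager:
    _stop_flags: set[str] = set()

    @classmethod
    def set_stop_flag_no_user_check(cls, task_id: str) -> None:
        # Legacy mechanism: consumers poll this flag between steps.
        cls._stop_flags.add(task_id)


class GraphEngineManager:
    _commands: dict[str, list[str]] = {}

    @classmethod
    def send_stop_command(cls, task_id: str) -> None:
        # New mechanism: push an explicit stop command for the queue-based
        # graph engine to read from its command channel.
        cls._commands.setdefault(task_id, []).append("stop")


def stop_workflow_task(task_id: str) -> dict[str, str]:
    # Fire both mechanisms so older consumers and the new graph engine both
    # observe the stop request (backward compatibility).
    AppQueueManager.set_stop_flag_no_user_check(task_id)
    GraphEngineManager.send_stop_command(task_id)
    return {"result": "success"}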

View File

@@ -3,10 +3,10 @@ from flask_restx import Resource, marshal_with, reqparse
from flask_restx.inputs import int_range
from sqlalchemy.orm import Session
-from controllers.console import api
+from controllers.console import api, console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
-from core.workflow.entities.workflow_execution import WorkflowExecutionStatus
+from core.workflow.enums import WorkflowExecutionStatus
from extensions.ext_database import db
from fields.workflow_app_log_fields import workflow_app_log_pagination_fields
from libs.login import login_required
@@ -15,7 +15,24 @@ from models.model import AppMode
from services.workflow_app_service import WorkflowAppService
@console_ns.route("/apps/<uuid:app_id>/workflow-app-logs")
class WorkflowAppLogApi(Resource):
@api.doc("get_workflow_app_logs")
@api.doc(description="Get workflow application execution logs")
@api.doc(params={"app_id": "Application ID"})
@api.doc(
params={
"keyword": "Search keyword for filtering logs",
"status": "Filter by execution status (succeeded, failed, stopped, partial-succeeded)",
"created_at__before": "Filter logs created before this timestamp",
"created_at__after": "Filter logs created after this timestamp",
"created_by_end_user_session_id": "Filter by end user session ID",
"created_by_account": "Filter by account",
"page": "Page number (1-99999)",
"limit": "Number of items per page (1-100)",
}
)
@api.response(200, "Workflow app logs retrieved successfully", workflow_app_log_pagination_fields)
@setup_required
@login_required
@account_initialization_required
@@ -78,6 +95,3 @@ class WorkflowAppLogApi(Resource):
)
return workflow_app_log_pagination
api.add_resource(WorkflowAppLogApi, "/apps/<uuid:app_id>/workflow-app-logs")
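Editor's note: the query parameters documented in the api.doc block above are parsed with reqparse inside the handler, which this hunk does not show. A hedged sketch of what such a parser could look like follows; the argument names mirror the api.doc params and the int_range import at the top of the file, while the defaults are assumptions.

# Hypothetical parser for the documented log filters (not the actual handler).
from flask_restx import reqparse
from flask_restx.inputs import int_range

log_args = reqparse.RequestParser()
log_args.add_argument("keyword", type=str, location="args")
log_args.add_argument("status", type=str, choices=["succeeded", "failed", "stopped", "partial-succeeded"], location="args")
log_args.add_argument("created_at__before", type=str, location="args")
log_args.add_argument("created_at__after", type=str, location="args")
log_args.add_argument("created_by_end_user_session_id", type=str, location="args")
log_args.add_argument("created_by_account", type=str, location="args")
log_args.add_argument("page", type=int_range(1, 99999), default=1, location="args")
log_args.add_argument("limit", type=int_range(1, 100), default=20, location="args")
# args = log_args.parse_args()  # would run inside the Resource.get handler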

View File

@@ -6,7 +6,7 @@ from flask_restx import Resource, fields, inputs, marshal, marshal_with, reqpars
from sqlalchemy.orm import Session
from werkzeug.exceptions import Forbidden
-from controllers.console import api
+from controllers.console import api, console_ns
from controllers.console.app.error import (
DraftWorkflowNotExist,
)
@@ -17,10 +17,11 @@ from core.variables.segment_group import SegmentGroup
from core.variables.segments import ArrayFileSegment, FileSegment, Segment
from core.variables.types import SegmentType
from core.workflow.constants import CONVERSATION_VARIABLE_NODE_ID, SYSTEM_VARIABLE_NODE_ID
from extensions.ext_database import db
from factories.file_factory import build_from_mapping, build_from_mappings
from factories.variable_factory import build_segment_with_type
from libs.login import current_user, login_required
-from models import App, AppMode, db
+from models import App, AppMode
from models.account import Account
from models.workflow import WorkflowDraftVariable
from services.workflow_draft_variable_service import WorkflowDraftVariableList, WorkflowDraftVariableService
@@ -137,14 +138,20 @@ def _api_prerequisite(f):
@get_app_model(mode=[AppMode.ADVANCED_CHAT, AppMode.WORKFLOW])
def wrapper(*args, **kwargs):
assert isinstance(current_user, Account)
-if not current_user.is_editor:
+if not current_user.has_edit_permission:
raise Forbidden()
return f(*args, **kwargs)
return wrapper
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/variables")
class WorkflowVariableCollectionApi(Resource):
@api.doc("get_workflow_variables")
@api.doc(description="Get draft workflow variables")
@api.doc(params={"app_id": "Application ID"})
@api.doc(params={"page": "Page number (1-100000)", "limit": "Number of items per page (1-100)"})
@api.response(200, "Workflow variables retrieved successfully", _WORKFLOW_DRAFT_VARIABLE_LIST_WITHOUT_VALUE_FIELDS)
@_api_prerequisite
@marshal_with(_WORKFLOW_DRAFT_VARIABLE_LIST_WITHOUT_VALUE_FIELDS)
def get(self, app_model: App):
@@ -173,6 +180,9 @@ class WorkflowVariableCollectionApi(Resource):
return workflow_vars
@api.doc("delete_workflow_variables")
@api.doc(description="Delete all draft workflow variables")
@api.response(204, "Workflow variables deleted successfully")
@_api_prerequisite
def delete(self, app_model: App):
draft_var_srv = WorkflowDraftVariableService(
@@ -201,7 +211,12 @@ def validate_node_id(node_id: str) -> NoReturn | None:
return None
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/nodes/<string:node_id>/variables")
class NodeVariableCollectionApi(Resource):
@api.doc("get_node_variables")
@api.doc(description="Get variables for a specific node")
@api.doc(params={"app_id": "Application ID", "node_id": "Node ID"})
@api.response(200, "Node variables retrieved successfully", _WORKFLOW_DRAFT_VARIABLE_LIST_FIELDS)
@_api_prerequisite
@marshal_with(_WORKFLOW_DRAFT_VARIABLE_LIST_FIELDS)
def get(self, app_model: App, node_id: str):
@@ -214,6 +229,9 @@ class NodeVariableCollectionApi(Resource):
return node_vars
@api.doc("delete_node_variables")
@api.doc(description="Delete all variables for a specific node")
@api.response(204, "Node variables deleted successfully")
@_api_prerequisite
def delete(self, app_model: App, node_id: str):
validate_node_id(node_id)
@@ -223,10 +241,16 @@ class NodeVariableCollectionApi(Resource):
return Response("", 204)
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/variables/<uuid:variable_id>")
class VariableApi(Resource):
_PATCH_NAME_FIELD = "name"
_PATCH_VALUE_FIELD = "value"
@api.doc("get_variable")
@api.doc(description="Get a specific workflow variable")
@api.doc(params={"app_id": "Application ID", "variable_id": "Variable ID"})
@api.response(200, "Variable retrieved successfully", _WORKFLOW_DRAFT_VARIABLE_FIELDS)
@api.response(404, "Variable not found")
@_api_prerequisite
@marshal_with(_WORKFLOW_DRAFT_VARIABLE_FIELDS)
def get(self, app_model: App, variable_id: str):
@@ -240,6 +264,19 @@ class VariableApi(Resource):
raise NotFoundError(description=f"variable not found, id={variable_id}")
return variable
@api.doc("update_variable")
@api.doc(description="Update a workflow variable")
@api.expect(
api.model(
"UpdateVariableRequest",
{
"name": fields.String(description="Variable name"),
"value": fields.Raw(description="Variable value"),
},
)
)
@api.response(200, "Variable updated successfully", _WORKFLOW_DRAFT_VARIABLE_FIELDS)
@api.response(404, "Variable not found")
@_api_prerequisite
@marshal_with(_WORKFLOW_DRAFT_VARIABLE_FIELDS)
def patch(self, app_model: App, variable_id: str):
@@ -302,6 +339,10 @@ class VariableApi(Resource):
db.session.commit()
return variable
@api.doc("delete_variable")
@api.doc(description="Delete a workflow variable")
@api.response(204, "Variable deleted successfully")
@api.response(404, "Variable not found")
@_api_prerequisite
def delete(self, app_model: App, variable_id: str):
draft_var_srv = WorkflowDraftVariableService(
@@ -317,7 +358,14 @@ class VariableApi(Resource):
return Response("", 204)
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/variables/<uuid:variable_id>/reset")
class VariableResetApi(Resource):
@api.doc("reset_variable")
@api.doc(description="Reset a workflow variable to its default value")
@api.doc(params={"app_id": "Application ID", "variable_id": "Variable ID"})
@api.response(200, "Variable reset successfully", _WORKFLOW_DRAFT_VARIABLE_FIELDS)
@api.response(204, "Variable reset (no content)")
@api.response(404, "Variable not found")
@_api_prerequisite
def put(self, app_model: App, variable_id: str):
draft_var_srv = WorkflowDraftVariableService(
@@ -358,7 +406,13 @@ def _get_variable_list(app_model: App, node_id) -> WorkflowDraftVariableList:
return draft_vars
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/conversation-variables")
class ConversationVariableCollectionApi(Resource):
@api.doc("get_conversation_variables")
@api.doc(description="Get conversation variables for workflow")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Conversation variables retrieved successfully", _WORKFLOW_DRAFT_VARIABLE_LIST_FIELDS)
@api.response(404, "Draft workflow not found")
@_api_prerequisite
@marshal_with(_WORKFLOW_DRAFT_VARIABLE_LIST_FIELDS)
def get(self, app_model: App):
@@ -374,14 +428,25 @@ class ConversationVariableCollectionApi(Resource):
return _get_variable_list(app_model, CONVERSATION_VARIABLE_NODE_ID)
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/system-variables")
class SystemVariableCollectionApi(Resource):
@api.doc("get_system_variables")
@api.doc(description="Get system variables for workflow")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "System variables retrieved successfully", _WORKFLOW_DRAFT_VARIABLE_LIST_FIELDS)
@_api_prerequisite
@marshal_with(_WORKFLOW_DRAFT_VARIABLE_LIST_FIELDS)
def get(self, app_model: App):
return _get_variable_list(app_model, SYSTEM_VARIABLE_NODE_ID)
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/environment-variables")
class EnvironmentVariableCollectionApi(Resource):
@api.doc("get_environment_variables")
@api.doc(description="Get environment variables for workflow")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Environment variables retrieved successfully")
@api.response(404, "Draft workflow not found")
@_api_prerequisite
def get(self, app_model: App):
"""
@@ -413,16 +478,3 @@ class EnvironmentVariableCollectionApi(Resource):
)
return {"items": env_vars_list}
api.add_resource(
WorkflowVariableCollectionApi,
"/apps/<uuid:app_id>/workflows/draft/variables",
)
api.add_resource(NodeVariableCollectionApi, "/apps/<uuid:app_id>/workflows/draft/nodes/<string:node_id>/variables")
api.add_resource(VariableApi, "/apps/<uuid:app_id>/workflows/draft/variables/<uuid:variable_id>")
api.add_resource(VariableResetApi, "/apps/<uuid:app_id>/workflows/draft/variables/<uuid:variable_id>/reset")
api.add_resource(ConversationVariableCollectionApi, "/apps/<uuid:app_id>/workflows/draft/conversation-variables")
api.add_resource(SystemVariableCollectionApi, "/apps/<uuid:app_id>/workflows/draft/system-variables")
api.add_resource(EnvironmentVariableCollectionApi, "/apps/<uuid:app_id>/workflows/draft/environment-variables")
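Editor's note: the draft-variable endpoints above all hang off a single composed `_api_prerequisite` decorator that stacks the setup/login/account checks, get_app_model, and the new has_edit_permission guard. A minimal sketch of that composition pattern follows; the `get_user` callable and the sample function are stand-ins for flask_login's current_user proxy and a real handler, not the actual Dify helpers.

from functools import wraps

from werkzeug.exceptions import Forbidden


def require_edit_permission(get_user):
    """Build a guard decorator; get_user stands in for the current_user proxy."""

    def _api_prerequisite(f):
        @wraps(f)
        def wrapper(*args, **kwargs):
            user = get_user()
            # Same check the diff switches to: has_edit_permission replaces
            # the older is_editor property.
            if not getattr(user, "has_edit_permission", False):
                raise Forbidden()
            return f(*args, **kwargs)

        return wrapper

    return _api_prerequisite


class _Editor:
    has_edit_permission = True


@require_edit_permission(lambda: _Editor())
def delete_variables(app_id: str) -> str:
    return f"deleted variables for {app_id}"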

View File

@@ -4,7 +4,7 @@ from flask_login import current_user
from flask_restx import Resource, marshal_with, reqparse
from flask_restx.inputs import int_range
-from controllers.console import api
+from controllers.console import api, console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from fields.workflow_run_fields import (
@@ -19,7 +19,13 @@ from models import Account, App, AppMode, EndUser
from services.workflow_run_service import WorkflowRunService
@console_ns.route("/apps/<uuid:app_id>/advanced-chat/workflow-runs")
class AdvancedChatAppWorkflowRunListApi(Resource):
@api.doc("get_advanced_chat_workflow_runs")
@api.doc(description="Get advanced chat workflow run list")
@api.doc(params={"app_id": "Application ID"})
@api.doc(params={"last_id": "Last run ID for pagination", "limit": "Number of items per page (1-100)"})
@api.response(200, "Workflow runs retrieved successfully", advanced_chat_workflow_run_pagination_fields)
@setup_required
@login_required
@account_initialization_required
@@ -40,7 +46,13 @@ class AdvancedChatAppWorkflowRunListApi(Resource):
return result
@console_ns.route("/apps/<uuid:app_id>/workflow-runs")
class WorkflowRunListApi(Resource):
@api.doc("get_workflow_runs")
@api.doc(description="Get workflow run list")
@api.doc(params={"app_id": "Application ID"})
@api.doc(params={"last_id": "Last run ID for pagination", "limit": "Number of items per page (1-100)"})
@api.response(200, "Workflow runs retrieved successfully", workflow_run_pagination_fields)
@setup_required
@login_required
@account_initialization_required
@@ -61,7 +73,13 @@ class WorkflowRunListApi(Resource):
return result
@console_ns.route("/apps/<uuid:app_id>/workflow-runs/<uuid:run_id>")
class WorkflowRunDetailApi(Resource):
@api.doc("get_workflow_run_detail")
@api.doc(description="Get workflow run detail")
@api.doc(params={"app_id": "Application ID", "run_id": "Workflow run ID"})
@api.response(200, "Workflow run detail retrieved successfully", workflow_run_detail_fields)
@api.response(404, "Workflow run not found")
@setup_required
@login_required
@account_initialization_required
@@ -79,7 +97,13 @@ class WorkflowRunDetailApi(Resource):
return workflow_run
@console_ns.route("/apps/<uuid:app_id>/workflow-runs/<uuid:run_id>/node-executions")
class WorkflowRunNodeExecutionListApi(Resource):
@api.doc("get_workflow_run_node_executions")
@api.doc(description="Get workflow run node execution list")
@api.doc(params={"app_id": "Application ID", "run_id": "Workflow run ID"})
@api.response(200, "Node executions retrieved successfully", workflow_run_node_execution_list_fields)
@api.response(404, "Workflow run not found")
@setup_required
@login_required
@account_initialization_required
@@ -100,9 +124,3 @@ class WorkflowRunNodeExecutionListApi(Resource):
)
return {"data": node_executions}
api.add_resource(AdvancedChatAppWorkflowRunListApi, "/apps/<uuid:app_id>/advanced-chat/workflow-runs")
api.add_resource(WorkflowRunListApi, "/apps/<uuid:app_id>/workflow-runs")
api.add_resource(WorkflowRunDetailApi, "/apps/<uuid:app_id>/workflow-runs/<uuid:run_id>")
api.add_resource(WorkflowRunNodeExecutionListApi, "/apps/<uuid:app_id>/workflow-runs/<uuid:run_id>/node-executions")

View File

@@ -7,7 +7,7 @@ from flask import jsonify
from flask_login import current_user
from flask_restx import Resource, reqparse
-from controllers.console import api
+from controllers.console import api, console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from extensions.ext_database import db
@@ -17,11 +17,17 @@ from models.enums import WorkflowRunTriggeredFrom
from models.model import AppMode
@console_ns.route("/apps/<uuid:app_id>/workflow/statistics/daily-conversations")
class WorkflowDailyRunsStatistic(Resource):
@api.doc("get_workflow_daily_runs_statistic")
@api.doc(description="Get workflow daily runs statistics")
@api.doc(params={"app_id": "Application ID"})
@api.doc(params={"start": "Start date and time (YYYY-MM-DD HH:MM)", "end": "End date and time (YYYY-MM-DD HH:MM)"})
@api.response(200, "Daily runs statistics retrieved successfully")
@get_app_model
@setup_required
@login_required
@account_initialization_required
-@get_app_model
def get(self, app_model):
account = current_user
@@ -79,11 +85,17 @@ WHERE
return jsonify({"data": response_data})
@console_ns.route("/apps/<uuid:app_id>/workflow/statistics/daily-terminals")
class WorkflowDailyTerminalsStatistic(Resource):
@api.doc("get_workflow_daily_terminals_statistic")
@api.doc(description="Get workflow daily terminals statistics")
@api.doc(params={"app_id": "Application ID"})
@api.doc(params={"start": "Start date and time (YYYY-MM-DD HH:MM)", "end": "End date and time (YYYY-MM-DD HH:MM)"})
@api.response(200, "Daily terminals statistics retrieved successfully")
@get_app_model
@setup_required
@login_required
@account_initialization_required
-@get_app_model
def get(self, app_model):
account = current_user
@@ -141,11 +153,17 @@ WHERE
return jsonify({"data": response_data})
@console_ns.route("/apps/<uuid:app_id>/workflow/statistics/token-costs")
class WorkflowDailyTokenCostStatistic(Resource):
@api.doc("get_workflow_daily_token_cost_statistic")
@api.doc(description="Get workflow daily token cost statistics")
@api.doc(params={"app_id": "Application ID"})
@api.doc(params={"start": "Start date and time (YYYY-MM-DD HH:MM)", "end": "End date and time (YYYY-MM-DD HH:MM)"})
@api.response(200, "Daily token cost statistics retrieved successfully")
@get_app_model
@setup_required
@login_required
@account_initialization_required
-@get_app_model
def get(self, app_model):
account = current_user
@@ -208,7 +226,13 @@ WHERE
return jsonify({"data": response_data})
@console_ns.route("/apps/<uuid:app_id>/workflow/statistics/average-app-interactions")
class WorkflowAverageAppInteractionStatistic(Resource):
@api.doc("get_workflow_average_app_interaction_statistic")
@api.doc(description="Get workflow average app interaction statistics")
@api.doc(params={"app_id": "Application ID"})
@api.doc(params={"start": "Start date and time (YYYY-MM-DD HH:MM)", "end": "End date and time (YYYY-MM-DD HH:MM)"})
@api.response(200, "Average app interaction statistics retrieved successfully")
@setup_required
@login_required
@account_initialization_required
@@ -285,11 +309,3 @@ GROUP BY
)
return jsonify({"data": response_data})
api.add_resource(WorkflowDailyRunsStatistic, "/apps/<uuid:app_id>/workflow/statistics/daily-conversations")
api.add_resource(WorkflowDailyTerminalsStatistic, "/apps/<uuid:app_id>/workflow/statistics/daily-terminals")
api.add_resource(WorkflowDailyTokenCostStatistic, "/apps/<uuid:app_id>/workflow/statistics/token-costs")
api.add_resource(
WorkflowAverageAppInteractionStatistic, "/apps/<uuid:app_id>/workflow/statistics/average-app-interactions"
)
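Editor's note: several hunks in this file and in the app statistics controller move @get_app_model above the auth decorators. The order matters mechanically because decorators listed first wrap last, so the topmost decorator becomes the outermost wrapper and its code runs first on each request. A tiny neutral illustration of that mechanic follows; the names are generic and not Dify's.

from functools import wraps


def tag(label):
    def decorator(f):
        @wraps(f)
        def wrapper(*args, **kwargs):
            print(f"enter {label}")
            try:
                return f(*args, **kwargs)
            finally:
                print(f"exit {label}")
        return wrapper
    return decorator


@tag("outer")  # listed first -> outermost wrapper, runs first
@tag("inner")  # listed last  -> innermost wrapper, closest to the function
def handler():
    print("handler body")


handler()
# enter outer
# enter inner
# handler body
# exit inner
# exit outer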

View File

@@ -1,6 +1,6 @@
from collections.abc import Callable
from functools import wraps
-from typing import Optional, Union
+from typing import ParamSpec, TypeVar, Union
from controllers.console.app.error import AppNotFoundError
from extensions.ext_database import db
@@ -8,8 +8,11 @@ from libs.login import current_user
from models import App, AppMode
from models.account import Account
P = ParamSpec("P")
R = TypeVar("R")
-def _load_app_model(app_id: str) -> Optional[App]:
+def _load_app_model(app_id: str) -> App | None:
assert isinstance(current_user, Account)
app_model = (
db.session.query(App)
@@ -19,10 +22,10 @@ def _load_app_model(app_id: str) -> Optional[App]:
return app_model
-def get_app_model(view: Optional[Callable] = None, *, mode: Union[AppMode, list[AppMode], None] = None):
+def get_app_model(view: Callable[P, R] | None = None, *, mode: Union[AppMode, list[AppMode], None] = None):
-def decorator(view_func):
+def decorator(view_func: Callable[P, R]):
@wraps(view_func)
-def decorated_view(*args, **kwargs):
+def decorated_view(*args: P.args, **kwargs: P.kwargs):
if not kwargs.get("app_id"):
raise ValueError("missing app_id in path parameters")

View File

@@ -1,8 +1,8 @@
from flask import request
-from flask_restx import Resource, reqparse
+from flask_restx import Resource, fields, reqparse
from constants.languages import supported_language
-from controllers.console import api
+from controllers.console import api, console_ns
from controllers.console.error import AlreadyActivateError
from extensions.ext_database import db
from libs.datetime_utils import naive_utc_now
@@ -10,14 +10,36 @@ from libs.helper import StrLen, email, extract_remote_ip, timezone
from models.account import AccountStatus
from services.account_service import AccountService, RegisterService
active_check_parser = reqparse.RequestParser()
active_check_parser.add_argument(
"workspace_id", type=str, required=False, nullable=True, location="args", help="Workspace ID"
)
active_check_parser.add_argument(
"email", type=email, required=False, nullable=True, location="args", help="Email address"
)
active_check_parser.add_argument(
"token", type=str, required=True, nullable=False, location="args", help="Activation token"
)
@console_ns.route("/activate/check")
class ActivateCheckApi(Resource):
@api.doc("check_activation_token")
@api.doc(description="Check if activation token is valid")
@api.expect(active_check_parser)
@api.response(
200,
"Success",
api.model(
"ActivationCheckResponse",
{
"is_valid": fields.Boolean(description="Whether token is valid"),
"data": fields.Raw(description="Activation data if valid"),
},
),
)
def get(self):
parser = reqparse.RequestParser()
args = active_check_parser.parse_args()
parser.add_argument("workspace_id", type=str, required=False, nullable=True, location="args")
parser.add_argument("email", type=email, required=False, nullable=True, location="args")
parser.add_argument("token", type=str, required=True, nullable=False, location="args")
args = parser.parse_args()
workspaceId = args["workspace_id"]
reg_email = args["email"]
@@ -38,18 +60,36 @@ class ActivateCheckApi(Resource):
return {"is_valid": False}
active_parser = reqparse.RequestParser()
active_parser.add_argument("workspace_id", type=str, required=False, nullable=True, location="json")
active_parser.add_argument("email", type=email, required=False, nullable=True, location="json")
active_parser.add_argument("token", type=str, required=True, nullable=False, location="json")
active_parser.add_argument("name", type=StrLen(30), required=True, nullable=False, location="json")
active_parser.add_argument(
"interface_language", type=supported_language, required=True, nullable=False, location="json"
)
active_parser.add_argument("timezone", type=timezone, required=True, nullable=False, location="json")
@console_ns.route("/activate")
class ActivateApi(Resource):
@api.doc("activate_account")
@api.doc(description="Activate account with invitation token")
@api.expect(active_parser)
@api.response(
200,
"Account activated successfully",
api.model(
"ActivationResponse",
{
"result": fields.String(description="Operation result"),
"data": fields.Raw(description="Login token data"),
},
),
)
@api.response(400, "Already activated or invalid token")
def post(self):
parser = reqparse.RequestParser()
args = active_parser.parse_args()
parser.add_argument("workspace_id", type=str, required=False, nullable=True, location="json")
parser.add_argument("email", type=email, required=False, nullable=True, location="json")
parser.add_argument("token", type=str, required=True, nullable=False, location="json")
parser.add_argument("name", type=StrLen(30), required=True, nullable=False, location="json")
parser.add_argument(
"interface_language", type=supported_language, required=True, nullable=False, location="json"
)
parser.add_argument("timezone", type=timezone, required=True, nullable=False, location="json")
args = parser.parse_args()
invitation = RegisterService.get_invitation_if_token_valid(args["workspace_id"], args["email"], args["token"])
if invitation is None:
@@ -70,7 +110,3 @@ class ActivateApi(Resource):
token_pair = AccountService.login(account, ip_address=extract_remote_ip(request))
return {"result": "success", "data": token_pair.model_dump()}
api.add_resource(ActivateCheckApi, "/activate/check")
api.add_resource(ActivateApi, "/activate")
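The activation endpoints above illustrate the migration pattern used throughout this change set: request parsers move to module level, resources are registered with @console_ns.route instead of api.add_resource, and handlers call parse_args() on the shared parser. A minimal sketch under assumed names (example_ns and ExampleCheckApi are illustrative only):

from flask_restx import Namespace, Resource, reqparse

example_ns = Namespace("example", path="/")

example_parser = reqparse.RequestParser()
example_parser.add_argument("token", type=str, required=True, nullable=False, location="args")

@example_ns.route("/example/check")
class ExampleCheckApi(Resource):
    @example_ns.expect(example_parser)
    def get(self):
        # The parser is built once at import time; parsing still happens per request.
        args = example_parser.parse_args()
        return {"is_valid": bool(args["token"])}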

View File

@@ -3,11 +3,11 @@ import logging
import requests
from flask import current_app, redirect, request
from flask_login import current_user
from flask_restx import Resource
from flask_restx import Resource, fields
from werkzeug.exceptions import Forbidden
from configs import dify_config
from controllers.console import api
from controllers.console import api, console_ns
from libs.login import login_required
from libs.oauth_data_source import NotionOAuth
@@ -28,7 +28,21 @@ def get_oauth_providers():
return OAUTH_PROVIDERS
@console_ns.route("/oauth/data-source/<string:provider>")
class OAuthDataSource(Resource):
@api.doc("oauth_data_source")
@api.doc(description="Get OAuth authorization URL for data source provider")
@api.doc(params={"provider": "Data source provider name (notion)"})
@api.response(
200,
"Authorization URL or internal setup success",
api.model(
"OAuthDataSourceResponse",
{"data": fields.Raw(description="Authorization URL or 'internal' for internal setup")},
),
)
@api.response(400, "Invalid provider")
@api.response(403, "Admin privileges required")
def get(self, provider: str):
# The role of the current user in the table must be admin or owner
if not current_user.is_admin_or_owner:
@@ -49,7 +63,19 @@ class OAuthDataSource(Resource):
return {"data": auth_url}, 200
@console_ns.route("/oauth/data-source/callback/<string:provider>")
class OAuthDataSourceCallback(Resource):
@api.doc("oauth_data_source_callback")
@api.doc(description="Handle OAuth callback from data source provider")
@api.doc(
params={
"provider": "Data source provider name (notion)",
"code": "Authorization code from OAuth provider",
"error": "Error message from OAuth provider",
}
)
@api.response(302, "Redirect to console with result")
@api.response(400, "Invalid provider")
def get(self, provider: str):
OAUTH_DATASOURCE_PROVIDERS = get_oauth_providers()
with current_app.app_context():
@@ -68,7 +94,19 @@ class OAuthDataSourceCallback(Resource):
return redirect(f"{dify_config.CONSOLE_WEB_URL}?type=notion&error=Access denied")
@console_ns.route("/oauth/data-source/binding/<string:provider>")
class OAuthDataSourceBinding(Resource):
@api.doc("oauth_data_source_binding")
@api.doc(description="Bind OAuth data source with authorization code")
@api.doc(
params={"provider": "Data source provider name (notion)", "code": "Authorization code from OAuth provider"}
)
@api.response(
200,
"Data source binding success",
api.model("OAuthDataSourceBindingResponse", {"result": fields.String(description="Operation result")}),
)
@api.response(400, "Invalid provider or code")
def get(self, provider: str):
OAUTH_DATASOURCE_PROVIDERS = get_oauth_providers()
with current_app.app_context():
@@ -90,7 +128,17 @@ class OAuthDataSourceBinding(Resource):
return {"result": "success"}, 200
@console_ns.route("/oauth/data-source/<string:provider>/<uuid:binding_id>/sync")
class OAuthDataSourceSync(Resource):
@api.doc("oauth_data_source_sync")
@api.doc(description="Sync data from OAuth data source")
@api.doc(params={"provider": "Data source provider name (notion)", "binding_id": "Data source binding ID"})
@api.response(
200,
"Data source sync success",
api.model("OAuthDataSourceSyncResponse", {"result": fields.String(description="Operation result")}),
)
@api.response(400, "Invalid provider or sync failed")
@setup_required
@login_required
@account_initialization_required
@@ -111,9 +159,3 @@ class OAuthDataSourceSync(Resource):
return {"error": "OAuth data source process failed"}, 400
return {"result": "success"}, 200
api.add_resource(OAuthDataSource, "/oauth/data-source/<string:provider>")
api.add_resource(OAuthDataSourceCallback, "/oauth/data-source/callback/<string:provider>")
api.add_resource(OAuthDataSourceBinding, "/oauth/data-source/binding/<string:provider>")
api.add_resource(OAuthDataSourceSync, "/oauth/data-source/<string:provider>/<uuid:binding_id>/sync")
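The api.doc / api.expect / api.response decorators added in this file are flask-restx's Swagger documentation layer: api.model declares a named response schema and api.response binds it to a status code. A small self-contained sketch (the app, namespace, and endpoint names are assumptions for illustration, not the project's wiring):

from flask import Flask
from flask_restx import Api, Namespace, Resource, fields

app = Flask(__name__)
api = Api(app)
demo_ns = Namespace("demo", path="/")
api.add_namespace(demo_ns)

status_model = api.model("DemoStatusResponse", {"result": fields.String(description="Operation result")})

@demo_ns.route("/demo/status")
class DemoStatus(Resource):
    @api.doc("get_demo_status")
    @api.doc(description="Return a fixed success payload")
    @api.response(200, "Status retrieved successfully", status_model)
    def get(self):
        # The decorators only affect the generated OpenAPI document, not the handler logic.
        return {"result": "success"}, 200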

View File

@@ -0,0 +1,155 @@
from flask import request
from flask_restx import Resource, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session
from configs import dify_config
from constants.languages import languages
from controllers.console import api
from controllers.console.auth.error import (
EmailAlreadyInUseError,
EmailCodeError,
EmailRegisterLimitError,
InvalidEmailError,
InvalidTokenError,
PasswordMismatchError,
)
from controllers.console.error import AccountInFreezeError, EmailSendIpLimitError
from controllers.console.wraps import email_password_login_enabled, email_register_enabled, setup_required
from extensions.ext_database import db
from libs.helper import email, extract_remote_ip
from libs.password import valid_password
from models.account import Account
from services.account_service import AccountService
from services.billing_service import BillingService
from services.errors.account import AccountNotFoundError, AccountRegisterError
class EmailRegisterSendEmailApi(Resource):
@setup_required
@email_password_login_enabled
@email_register_enabled
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("email", type=email, required=True, location="json")
parser.add_argument("language", type=str, required=False, location="json")
args = parser.parse_args()
ip_address = extract_remote_ip(request)
if AccountService.is_email_send_ip_limit(ip_address):
raise EmailSendIpLimitError()
language = "en-US"
if args["language"] in languages:
language = args["language"]
if dify_config.BILLING_ENABLED and BillingService.is_email_in_freeze(args["email"]):
raise AccountInFreezeError()
with Session(db.engine) as session:
account = session.execute(select(Account).filter_by(email=args["email"])).scalar_one_or_none()
token = None
token = AccountService.send_email_register_email(email=args["email"], account=account, language=language)
return {"result": "success", "data": token}
class EmailRegisterCheckApi(Resource):
@setup_required
@email_password_login_enabled
@email_register_enabled
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("email", type=str, required=True, location="json")
parser.add_argument("code", type=str, required=True, location="json")
parser.add_argument("token", type=str, required=True, nullable=False, location="json")
args = parser.parse_args()
user_email = args["email"]
is_email_register_error_rate_limit = AccountService.is_email_register_error_rate_limit(args["email"])
if is_email_register_error_rate_limit:
raise EmailRegisterLimitError()
token_data = AccountService.get_email_register_data(args["token"])
if token_data is None:
raise InvalidTokenError()
if user_email != token_data.get("email"):
raise InvalidEmailError()
if args["code"] != token_data.get("code"):
AccountService.add_email_register_error_rate_limit(args["email"])
raise EmailCodeError()
# Verified, revoke the first token
AccountService.revoke_email_register_token(args["token"])
# Refresh token data by generating a new token
_, new_token = AccountService.generate_email_register_token(
user_email, code=args["code"], additional_data={"phase": "register"}
)
AccountService.reset_email_register_error_rate_limit(args["email"])
return {"is_valid": True, "email": token_data.get("email"), "token": new_token}
class EmailRegisterResetApi(Resource):
@setup_required
@email_password_login_enabled
@email_register_enabled
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("token", type=str, required=True, nullable=False, location="json")
parser.add_argument("new_password", type=valid_password, required=True, nullable=False, location="json")
parser.add_argument("password_confirm", type=valid_password, required=True, nullable=False, location="json")
args = parser.parse_args()
# Validate passwords match
if args["new_password"] != args["password_confirm"]:
raise PasswordMismatchError()
# Validate token and get register data
register_data = AccountService.get_email_register_data(args["token"])
if not register_data:
raise InvalidTokenError()
# Must use token in reset phase
if register_data.get("phase", "") != "register":
raise InvalidTokenError()
# Revoke token to prevent reuse
AccountService.revoke_email_register_token(args["token"])
email = register_data.get("email", "")
with Session(db.engine) as session:
account = session.execute(select(Account).filter_by(email=email)).scalar_one_or_none()
if account:
raise EmailAlreadyInUseError()
else:
account = self._create_new_account(email, args["password_confirm"])
if not account:
raise AccountNotFoundError()
token_pair = AccountService.login(account=account, ip_address=extract_remote_ip(request))
AccountService.reset_login_error_rate_limit(email)
return {"result": "success", "data": token_pair.model_dump()}
def _create_new_account(self, email, password) -> Account | None:
# Create new account if allowed
account = None
try:
account = AccountService.create_account_and_tenant(
email=email,
name=email,
password=password,
interface_language=languages[0],
)
except AccountRegisterError:
raise AccountInFreezeError()
return account
api.add_resource(EmailRegisterSendEmailApi, "/email-register/send-email")
api.add_resource(EmailRegisterCheckApi, "/email-register/validity")
api.add_resource(EmailRegisterResetApi, "/email-register")

View File

@@ -27,21 +27,43 @@ class InvalidTokenError(BaseHTTPException):
class PasswordResetRateLimitExceededError(BaseHTTPException):
error_code = "password_reset_rate_limit_exceeded"
description = "Too many password reset emails have been sent. Please try again in 1 minute."
description = "Too many password reset emails have been sent. Please try again in {minutes} minutes."
code = 429
def __init__(self, minutes: int = 1):
description = self.description.format(minutes=int(minutes)) if self.description else None
super().__init__(description=description)
class EmailRegisterRateLimitExceededError(BaseHTTPException):
error_code = "email_register_rate_limit_exceeded"
description = "Too many email register emails have been sent. Please try again in {minutes} minutes."
code = 429
def __init__(self, minutes: int = 1):
description = self.description.format(minutes=int(minutes)) if self.description else None
super().__init__(description=description)
class EmailChangeRateLimitExceededError(BaseHTTPException):
error_code = "email_change_rate_limit_exceeded"
description = "Too many email change emails have been sent. Please try again in 1 minute."
description = "Too many email change emails have been sent. Please try again in {minutes} minutes."
code = 429
def __init__(self, minutes: int = 1):
description = self.description.format(minutes=int(minutes)) if self.description else None
super().__init__(description=description)
class OwnerTransferRateLimitExceededError(BaseHTTPException):
error_code = "owner_transfer_rate_limit_exceeded"
description = "Too many owner transfer emails have been sent. Please try again in 1 minute."
description = "Too many owner transfer emails have been sent. Please try again in {minutes} minutes."
code = 429
def __init__(self, minutes: int = 1):
description = self.description.format(minutes=int(minutes)) if self.description else None
super().__init__(description=description)
class EmailCodeError(BaseHTTPException):
error_code = "email_code_error"
@@ -69,15 +91,23 @@ class EmailPasswordLoginLimitError(BaseHTTPException):
class EmailCodeLoginRateLimitExceededError(BaseHTTPException):
error_code = "email_code_login_rate_limit_exceeded"
description = "Too many login emails have been sent. Please try again in 5 minutes."
description = "Too many login emails have been sent. Please try again in {minutes} minutes."
code = 429
def __init__(self, minutes: int = 5):
description = self.description.format(minutes=int(minutes)) if self.description else None
super().__init__(description=description)
class EmailCodeAccountDeletionRateLimitExceededError(BaseHTTPException):
error_code = "email_code_account_deletion_rate_limit_exceeded"
description = "Too many account deletion emails have been sent. Please try again in 5 minutes."
description = "Too many account deletion emails have been sent. Please try again in {minutes} minutes."
code = 429
def __init__(self, minutes: int = 5):
description = self.description.format(minutes=int(minutes)) if self.description else None
super().__init__(description=description)
class EmailPasswordResetLimitError(BaseHTTPException):
error_code = "email_password_reset_limit"
@@ -85,6 +115,12 @@ class EmailPasswordResetLimitError(BaseHTTPException):
code = 429
class EmailRegisterLimitError(BaseHTTPException):
error_code = "email_register_limit"
description = "Too many failed email register attempts. Please try again in 24 hours."
code = 429
class EmailChangeLimitError(BaseHTTPException):
error_code = "email_change_limit"
description = "Too many failed email change attempts. Please try again in 24 hours."
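The rate-limit errors above share one pattern: the class-level description becomes a template and __init__ formats the retry window into it. A standalone sketch of the pattern (plain Exception is used here instead of the project's BaseHTTPException):

class ExampleRateLimitExceededError(Exception):
    error_code = "example_rate_limit_exceeded"
    description = "Too many emails have been sent. Please try again in {minutes} minutes."
    code = 429

    def __init__(self, minutes: int = 1):
        # Fill the shared template with the concrete wait time for this occurrence.
        super().__init__(self.description.format(minutes=int(minutes)))

# raise ExampleRateLimitExceededError(minutes=5)  -> "... try again in 5 minutes."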

View File

@@ -2,12 +2,11 @@ import base64
import secrets
from flask import request
from flask_restx import Resource, reqparse
from flask_restx import Resource, fields, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session
from constants.languages import languages
from controllers.console import api
from controllers.console import api, console_ns
from controllers.console.auth.error import (
EmailCodeError,
EmailPasswordResetLimitError,
@@ -15,7 +14,7 @@ from controllers.console.auth.error import (
InvalidTokenError,
PasswordMismatchError,
)
from controllers.console.error import AccountInFreezeError, AccountNotFound, EmailSendIpLimitError
from controllers.console.error import AccountNotFound, EmailSendIpLimitError
from controllers.console.wraps import email_password_login_enabled, setup_required
from events.tenant_event import tenant_was_created
from extensions.ext_database import db
@@ -23,12 +22,35 @@ from libs.helper import email, extract_remote_ip
from libs.password import hash_password, valid_password
from models.account import Account
from services.account_service import AccountService, TenantService
from services.errors.account import AccountRegisterError
from services.errors.workspace import WorkSpaceNotAllowedCreateError, WorkspacesLimitExceededError
from services.feature_service import FeatureService
@console_ns.route("/forgot-password")
class ForgotPasswordSendEmailApi(Resource):
@api.doc("send_forgot_password_email")
@api.doc(description="Send password reset email")
@api.expect(
api.model(
"ForgotPasswordEmailRequest",
{
"email": fields.String(required=True, description="Email address"),
"language": fields.String(description="Language for email (zh-Hans/en-US)"),
},
)
)
@api.response(
200,
"Email sent successfully",
api.model(
"ForgotPasswordEmailResponse",
{
"result": fields.String(description="Operation result"),
"data": fields.String(description="Reset token"),
"code": fields.String(description="Error code if account not found"),
},
),
)
@api.response(400, "Invalid email or rate limit exceeded")
@setup_required
@email_password_login_enabled
def post(self):
@@ -48,20 +70,44 @@ class ForgotPasswordSendEmailApi(Resource):
with Session(db.engine) as session:
account = session.execute(select(Account).filter_by(email=args["email"])).scalar_one_or_none()
token = None
if account is None:
if FeatureService.get_system_features().is_allow_register:
token = AccountService.send_reset_password_email(email=args["email"], language=language)
return {"result": "fail", "data": token, "code": "account_not_found"}
else:
raise AccountNotFound()
else:
token = AccountService.send_reset_password_email(account=account, email=args["email"], language=language)
token = AccountService.send_reset_password_email(
account=account,
email=args["email"],
language=language,
is_allow_register=FeatureService.get_system_features().is_allow_register,
)
return {"result": "success", "data": token}
@console_ns.route("/forgot-password/validity")
class ForgotPasswordCheckApi(Resource):
@api.doc("check_forgot_password_code")
@api.doc(description="Verify password reset code")
@api.expect(
api.model(
"ForgotPasswordCheckRequest",
{
"email": fields.String(required=True, description="Email address"),
"code": fields.String(required=True, description="Verification code"),
"token": fields.String(required=True, description="Reset token"),
},
)
)
@api.response(
200,
"Code verified successfully",
api.model(
"ForgotPasswordCheckResponse",
{
"is_valid": fields.Boolean(description="Whether code is valid"),
"email": fields.String(description="Email address"),
"token": fields.String(description="New reset token"),
},
),
)
@api.response(400, "Invalid code or token")
@setup_required
@email_password_login_enabled
def post(self):
@@ -100,7 +146,26 @@ class ForgotPasswordCheckApi(Resource):
return {"is_valid": True, "email": token_data.get("email"), "token": new_token}
@console_ns.route("/forgot-password/resets")
class ForgotPasswordResetApi(Resource):
@api.doc("reset_password")
@api.doc(description="Reset password with verification token")
@api.expect(
api.model(
"ForgotPasswordResetRequest",
{
"token": fields.String(required=True, description="Verification token"),
"new_password": fields.String(required=True, description="New password"),
"password_confirm": fields.String(required=True, description="Password confirmation"),
},
)
)
@api.response(
200,
"Password reset successfully",
api.model("ForgotPasswordResetResponse", {"result": fields.String(description="Operation result")}),
)
@api.response(400, "Invalid token or password mismatch")
@setup_required
@email_password_login_enabled
def post(self):
@@ -137,7 +202,7 @@ class ForgotPasswordResetApi(Resource):
if account:
self._update_existing_account(account, password_hashed, salt, session)
else:
self._create_new_account(email, args["password_confirm"])
raise AccountNotFound()
return {"result": "success"}
@@ -157,22 +222,6 @@ class ForgotPasswordResetApi(Resource):
account.current_tenant = tenant
tenant_was_created.send(tenant)
def _create_new_account(self, email, password):
# Create new account if allowed
try:
AccountService.create_account_and_tenant(
email=email,
name=email,
password=password,
interface_language=languages[0],
)
except WorkSpaceNotAllowedCreateError:
pass
except WorkspacesLimitExceededError:
pass
except AccountRegisterError:
raise AccountInFreezeError()
api.add_resource(ForgotPasswordSendEmailApi, "/forgot-password")
api.add_resource(ForgotPasswordCheckApi, "/forgot-password/validity")

View File

@@ -26,7 +26,6 @@ from controllers.console.error import (
from controllers.console.wraps import email_password_login_enabled, setup_required
from events.tenant_event import tenant_was_created
from libs.helper import email, extract_remote_ip
from libs.password import valid_password
from models.account import Account
from services.account_service import AccountService, RegisterService, TenantService
from services.billing_service import BillingService
@@ -44,10 +43,9 @@ class LoginApi(Resource):
"""Authenticate user and login."""
parser = reqparse.RequestParser()
parser.add_argument("email", type=email, required=True, location="json")
parser.add_argument("password", type=valid_password, required=True, location="json")
parser.add_argument("password", type=str, required=True, location="json")
parser.add_argument("remember_me", type=bool, required=False, default=False, location="json")
parser.add_argument("invite_token", type=str, required=False, default=None, location="json")
parser.add_argument("language", type=str, required=False, default="en-US", location="json")
args = parser.parse_args()
if dify_config.BILLING_ENABLED and BillingService.is_email_in_freeze(args["email"]):
@@ -61,11 +59,6 @@ class LoginApi(Resource):
if invitation:
invitation = RegisterService.get_invitation_if_token_valid(None, args["email"], invitation)
if args["language"] is not None and args["language"] == "zh-Hans":
language = "zh-Hans"
else:
language = "en-US"
try:
if invitation:
data = invitation.get("data", {})
@@ -80,12 +73,6 @@ class LoginApi(Resource):
except services.errors.account.AccountPasswordError:
AccountService.add_login_error_rate_limit(args["email"])
raise AuthenticationFailedError()
except services.errors.account.AccountNotFoundError:
if FeatureService.get_system_features().is_allow_register:
token = AccountService.send_reset_password_email(email=args["email"], language=language)
return {"result": "fail", "data": token, "code": "account_not_found"}
else:
raise AccountNotFound()
# SELF_HOSTED only have one workspace
tenants = TenantService.get_join_tenants(account)
if len(tenants) == 0:
@@ -133,13 +120,12 @@ class ResetPasswordSendEmailApi(Resource):
except AccountRegisterError:
raise AccountInFreezeError()
if account is None:
if FeatureService.get_system_features().is_allow_register:
token = AccountService.send_reset_password_email(email=args["email"], language=language)
else:
raise AccountNotFound()
else:
token = AccountService.send_reset_password_email(account=account, language=language)
token = AccountService.send_reset_password_email(
email=args["email"],
account=account,
language=language,
is_allow_register=FeatureService.get_system_features().is_allow_register,
)
return {"result": "success", "data": token}

View File

@@ -1,5 +1,4 @@
import logging
from typing import Optional
import requests
from flask import current_app, redirect, request
@@ -18,11 +17,12 @@ from libs.oauth import GitHubOAuth, GoogleOAuth, OAuthUserInfo
from models import Account
from models.account import AccountStatus
from services.account_service import AccountService, RegisterService, TenantService
from services.billing_service import BillingService
from services.errors.account import AccountNotFoundError, AccountRegisterError
from services.errors.workspace import WorkSpaceNotAllowedCreateError, WorkSpaceNotFoundError
from services.feature_service import FeatureService
from .. import api
from .. import api, console_ns
logger = logging.getLogger(__name__)
@@ -50,7 +50,13 @@ def get_oauth_providers():
return OAUTH_PROVIDERS
@console_ns.route("/oauth/login/<provider>")
class OAuthLogin(Resource):
@api.doc("oauth_login")
@api.doc(description="Initiate OAuth login process")
@api.doc(params={"provider": "OAuth provider name (github/google)", "invite_token": "Optional invitation token"})
@api.response(302, "Redirect to OAuth authorization URL")
@api.response(400, "Invalid provider")
def get(self, provider: str):
invite_token = request.args.get("invite_token") or None
OAUTH_PROVIDERS = get_oauth_providers()
@@ -63,7 +69,19 @@ class OAuthLogin(Resource):
return redirect(auth_url)
@console_ns.route("/oauth/authorize/<provider>")
class OAuthCallback(Resource):
@api.doc("oauth_callback")
@api.doc(description="Handle OAuth callback and complete login process")
@api.doc(
params={
"provider": "OAuth provider name (github/google)",
"code": "Authorization code from OAuth provider",
"state": "Optional state parameter (used for invite token)",
}
)
@api.response(302, "Redirect to console with access token")
@api.response(400, "OAuth process failed")
def get(self, provider: str):
OAUTH_PROVIDERS = get_oauth_providers()
with current_app.app_context():
@@ -77,6 +95,9 @@ class OAuthCallback(Resource):
if state:
invite_token = state
if not code:
return {"error": "Authorization code is required"}, 400
try:
token = oauth_provider.get_access_token(code)
user_info = oauth_provider.get_user_info(token)
@@ -86,7 +107,7 @@ class OAuthCallback(Resource):
return {"error": "OAuth process failed"}, 400
if invite_token and RegisterService.is_valid_invite_token(invite_token):
invitation = RegisterService._get_invitation_by_token(token=invite_token)
invitation = RegisterService.get_invitation_by_token(token=invite_token)
if invitation:
invitation_email = invitation.get("email", None)
if invitation_email != user_info.email:
@@ -135,8 +156,8 @@ class OAuthCallback(Resource):
)
def _get_account_by_openid_or_email(provider: str, user_info: OAuthUserInfo) -> Optional[Account]:
def _get_account_by_openid_or_email(provider: str, user_info: OAuthUserInfo) -> Account | None:
account: Optional[Account] = Account.get_by_openid(provider, user_info.id)
account: Account | None = Account.get_by_openid(provider, user_info.id)
if not account:
with Session(db.engine) as session:
@@ -162,7 +183,15 @@ def _generate_account(provider: str, user_info: OAuthUserInfo):
if not account:
if not FeatureService.get_system_features().is_allow_register:
raise AccountNotFoundError()
if dify_config.BILLING_ENABLED and BillingService.is_email_in_freeze(user_info.email):
raise AccountRegisterError(
description=(
"This email account has been deleted within the past "
"30 days and is temporarily unavailable for new account registration"
)
)
else:
raise AccountRegisterError(description=("Invalid email or password"))
account_name = user_info.name or "Dify"
account = RegisterService.register(
email=user_info.email, name=account_name, password=None, open_id=user_info.id, provider=provider
@@ -181,7 +210,3 @@ def _generate_account(provider: str, user_info: OAuthUserInfo):
AccountService.link_account_integrate(provider, user_info.id, account)
return account
api.add_resource(OAuthLogin, "/oauth/login/<provider>")
api.add_resource(OAuthCallback, "/oauth/authorize/<provider>")

View File

@@ -29,14 +29,12 @@ class DataSourceApi(Resource):
@marshal_with(integrate_list_fields)
def get(self):
# get workspace data source integrates
data_source_integrates = (
db.session.query(DataSourceOauthBinding)
.where(
DataSourceOauthBinding.tenant_id == current_user.current_tenant_id,
DataSourceOauthBinding.disabled == False,
)
.all()
)
data_source_integrates = db.session.scalars(
select(DataSourceOauthBinding).where(
DataSourceOauthBinding.tenant_id == current_user.current_tenant_id,
DataSourceOauthBinding.disabled == False,
)
).all()
base_url = request.url_root.rstrip("/")
data_source_oauth_base_path = "/console/api/oauth/data-source"
@@ -249,7 +247,7 @@ class DataSourceNotionDatasetSyncApi(Resource):
documents = DocumentService.get_document_by_dataset_id(dataset_id_str)
for document in documents:
document_indexing_sync_task.delay(dataset_id_str, document.id)
return 200
return {"result": "success"}, 200
class DataSourceNotionDocumentSyncApi(Resource):
@@ -267,7 +265,7 @@ class DataSourceNotionDocumentSyncApi(Resource):
if document is None:
raise NotFound("Document not found.")
document_indexing_sync_task.delay(dataset_id_str, document_id_str)
return 200
return {"result": "success"}, 200
api.add_resource(DataSourceApi, "/data-source/integrates", "/data-source/integrates/<uuid:binding_id>/<string:action>")
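The query rewrites in this file and in datasets.py below replace the legacy Query API with SQLAlchemy 2.0 style: build a select() statement and execute it with Session.scalars(), which yields ORM instances just like .query(...).all(). A runnable sketch against an assumed toy model (Example is illustrative, not a Dify model):

from sqlalchemy import Boolean, String, create_engine, select
from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column

class Base(DeclarativeBase):
    pass

class Example(Base):
    __tablename__ = "example"
    id: Mapped[int] = mapped_column(primary_key=True)
    tenant_id: Mapped[str] = mapped_column(String(36))
    disabled: Mapped[bool] = mapped_column(Boolean, default=False)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    # scalars() returns model instances rather than Row tuples, matching the old .query(...).all()
    rows = session.scalars(
        select(Example).where(Example.tenant_id == "t1", Example.disabled == False)  # noqa: E712
    ).all()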

View File

@@ -1,12 +1,13 @@
import flask_restx
from flask import request
from flask_login import current_user
from flask_restx import Resource, marshal, marshal_with, reqparse
from flask_restx import Resource, fields, marshal, marshal_with, reqparse
from sqlalchemy import select
from werkzeug.exceptions import Forbidden, NotFound
import services
from configs import dify_config
from controllers.console import api
from controllers.console import api, console_ns
from controllers.console.apikey import api_key_fields, api_key_list
from controllers.console.app.error import ProviderNotInitializeError
from controllers.console.datasets.error import DatasetInUseError, DatasetNameDuplicateError, IndexingEstimateError
@@ -19,7 +20,6 @@ from controllers.console.wraps import (
from core.errors.error import LLMBadRequestError, ProviderTokenNotInitError
from core.indexing_runner import IndexingRunner
from core.model_runtime.entities.model_entities import ModelType
from core.plugin.entities.plugin import ModelProviderID
from core.provider_manager import ProviderManager
from core.rag.datasource.vdb.vector_type import VectorType
from core.rag.extractor.entity.datasource_type import DatasourceType
@@ -32,6 +32,7 @@ from fields.document_fields import document_status_fields
from libs.login import login_required
from models import ApiToken, Dataset, Document, DocumentSegment, UploadFile
from models.dataset import DatasetPermissionEnum
from models.provider_ids import ModelProviderID
from services.dataset_service import DatasetPermissionService, DatasetService, DocumentService
@@ -47,7 +48,21 @@ def _validate_description_length(description):
return description
@console_ns.route("/datasets")
class DatasetListApi(Resource):
@api.doc("get_datasets")
@api.doc(description="Get list of datasets")
@api.doc(
params={
"page": "Page number (default: 1)",
"limit": "Number of items per page (default: 20)",
"ids": "Filter by dataset IDs (list)",
"keyword": "Search keyword",
"tag_ids": "Filter by tag IDs (list)",
"include_all": "Include all datasets (default: false)",
}
)
@api.response(200, "Datasets retrieved successfully")
@setup_required
@login_required
@account_initialization_required
@@ -99,6 +114,24 @@ class DatasetListApi(Resource):
response = {"data": data, "has_more": len(datasets) == limit, "limit": limit, "total": total, "page": page}
return response, 200
@api.doc("create_dataset")
@api.doc(description="Create a new dataset")
@api.expect(
api.model(
"CreateDatasetRequest",
{
"name": fields.String(required=True, description="Dataset name (1-40 characters)"),
"description": fields.String(description="Dataset description (max 400 characters)"),
"indexing_technique": fields.String(description="Indexing technique"),
"permission": fields.String(description="Dataset permission"),
"provider": fields.String(description="Provider"),
"external_knowledge_api_id": fields.String(description="External knowledge API ID"),
"external_knowledge_id": fields.String(description="External knowledge ID"),
},
)
)
@api.response(201, "Dataset created successfully")
@api.response(400, "Invalid request parameters")
@setup_required
@login_required
@account_initialization_required
@@ -171,7 +204,14 @@ class DatasetListApi(Resource):
return marshal(dataset, dataset_detail_fields), 201
@console_ns.route("/datasets/<uuid:dataset_id>")
class DatasetApi(Resource):
@api.doc("get_dataset")
@api.doc(description="Get dataset details")
@api.doc(params={"dataset_id": "Dataset ID"})
@api.response(200, "Dataset retrieved successfully", dataset_detail_fields)
@api.response(404, "Dataset not found")
@api.response(403, "Permission denied")
@setup_required
@login_required
@account_initialization_required
@@ -214,6 +254,23 @@ class DatasetApi(Resource):
return data, 200
@api.doc("update_dataset")
@api.doc(description="Update dataset details")
@api.expect(
api.model(
"UpdateDatasetRequest",
{
"name": fields.String(description="Dataset name"),
"description": fields.String(description="Dataset description"),
"permission": fields.String(description="Dataset permission"),
"indexing_technique": fields.String(description="Indexing technique"),
"external_retrieval_model": fields.Raw(description="External retrieval model settings"),
},
)
)
@api.response(200, "Dataset updated successfully", dataset_detail_fields)
@api.response(404, "Dataset not found")
@api.response(403, "Permission denied")
@setup_required
@login_required
@account_initialization_required
@@ -343,7 +400,12 @@ class DatasetApi(Resource):
raise DatasetInUseError()
@console_ns.route("/datasets/<uuid:dataset_id>/use-check")
class DatasetUseCheckApi(Resource):
@api.doc("check_dataset_use")
@api.doc(description="Check if dataset is in use")
@api.doc(params={"dataset_id": "Dataset ID"})
@api.response(200, "Dataset use status retrieved successfully")
@setup_required
@login_required
@account_initialization_required
@@ -354,7 +416,12 @@ class DatasetUseCheckApi(Resource):
return {"is_using": dataset_is_using}, 200
@console_ns.route("/datasets/<uuid:dataset_id>/queries")
class DatasetQueryApi(Resource):
@api.doc("get_dataset_queries")
@api.doc(description="Get dataset query history")
@api.doc(params={"dataset_id": "Dataset ID"})
@api.response(200, "Query history retrieved successfully", dataset_query_detail_fields)
@setup_required
@login_required
@account_initialization_required
@@ -384,7 +451,11 @@ class DatasetQueryApi(Resource):
return response, 200
@console_ns.route("/datasets/indexing-estimate")
class DatasetIndexingEstimateApi(Resource):
@api.doc("estimate_dataset_indexing")
@api.doc(description="Estimate dataset indexing cost")
@api.response(200, "Indexing estimate calculated successfully")
@setup_required
@login_required
@account_initialization_required
@@ -411,11 +482,11 @@ class DatasetIndexingEstimateApi(Resource):
extract_settings = []
if args["info_list"]["data_source_type"] == "upload_file":
file_ids = args["info_list"]["file_info_list"]["file_ids"]
file_details = (
db.session.query(UploadFile)
.where(UploadFile.tenant_id == current_user.current_tenant_id, UploadFile.id.in_(file_ids))
.all()
)
file_details = db.session.scalars(
select(UploadFile).where(
UploadFile.tenant_id == current_user.current_tenant_id, UploadFile.id.in_(file_ids)
)
).all()
if file_details is None:
raise NotFound("File not found.")
@@ -485,7 +556,12 @@ class DatasetIndexingEstimateApi(Resource):
return response.model_dump(), 200
@console_ns.route("/datasets/<uuid:dataset_id>/related-apps")
class DatasetRelatedAppListApi(Resource):
@api.doc("get_dataset_related_apps")
@api.doc(description="Get applications related to dataset")
@api.doc(params={"dataset_id": "Dataset ID"})
@api.response(200, "Related apps retrieved successfully", related_app_list)
@setup_required
@login_required
@account_initialization_required
@@ -512,17 +588,22 @@ class DatasetRelatedAppListApi(Resource):
return {"data": related_apps, "total": len(related_apps)}, 200
@console_ns.route("/datasets/<uuid:dataset_id>/indexing-status")
class DatasetIndexingStatusApi(Resource):
@api.doc("get_dataset_indexing_status")
@api.doc(description="Get dataset indexing status")
@api.doc(params={"dataset_id": "Dataset ID"})
@api.response(200, "Indexing status retrieved successfully")
@setup_required
@login_required
@account_initialization_required
def get(self, dataset_id):
dataset_id = str(dataset_id)
documents = (
db.session.query(Document)
.where(Document.dataset_id == dataset_id, Document.tenant_id == current_user.current_tenant_id)
.all()
)
documents = db.session.scalars(
select(Document).where(
Document.dataset_id == dataset_id, Document.tenant_id == current_user.current_tenant_id
)
).all()
documents_status = []
for document in documents:
completed_segments = (
@@ -559,21 +640,25 @@ class DatasetIndexingStatusApi(Resource):
return data, 200
@console_ns.route("/datasets/api-keys")
class DatasetApiKeyApi(Resource):
max_keys = 10
token_prefix = "dataset-"
resource_type = "dataset"
@api.doc("get_dataset_api_keys")
@api.doc(description="Get dataset API keys")
@api.response(200, "API keys retrieved successfully", api_key_list)
@setup_required
@login_required
@account_initialization_required
@marshal_with(api_key_list)
def get(self):
keys = (
db.session.query(ApiToken)
.where(ApiToken.type == self.resource_type, ApiToken.tenant_id == current_user.current_tenant_id)
.all()
)
keys = db.session.scalars(
select(ApiToken).where(
ApiToken.type == self.resource_type, ApiToken.tenant_id == current_user.current_tenant_id
)
).all()
return {"items": keys}
@setup_required
@@ -608,9 +693,14 @@ class DatasetApiKeyApi(Resource):
return api_token, 200
@console_ns.route("/datasets/api-keys/<uuid:api_key_id>")
class DatasetApiDeleteApi(Resource):
resource_type = "dataset"
@api.doc("delete_dataset_api_key")
@api.doc(description="Delete dataset API key")
@api.doc(params={"api_key_id": "API key ID"})
@api.response(204, "API key deleted successfully")
@setup_required
@login_required
@account_initialization_required
@@ -640,7 +730,11 @@ class DatasetApiDeleteApi(Resource):
return {"result": "success"}, 204
@console_ns.route("/datasets/api-base-info")
class DatasetApiBaseUrlApi(Resource):
@api.doc("get_dataset_api_base_info")
@api.doc(description="Get dataset API base information")
@api.response(200, "API base info retrieved successfully")
@setup_required
@login_required
@account_initialization_required
@@ -648,7 +742,11 @@ class DatasetApiBaseUrlApi(Resource):
return {"api_base_url": (dify_config.SERVICE_API_URL or request.host_url.rstrip("/")) + "/v1"}
@console_ns.route("/datasets/retrieval-setting")
class DatasetRetrievalSettingApi(Resource):
@api.doc("get_dataset_retrieval_setting")
@api.doc(description="Get dataset retrieval settings")
@api.response(200, "Retrieval settings retrieved successfully")
@setup_required
@login_required
@account_initialization_required
@@ -699,7 +797,12 @@ class DatasetRetrievalSettingApi(Resource):
raise ValueError(f"Unsupported vector db type {vector_type}.")
@console_ns.route("/datasets/retrieval-setting/<string:vector_type>")
class DatasetRetrievalSettingMockApi(Resource):
@api.doc("get_dataset_retrieval_setting_mock")
@api.doc(description="Get mock dataset retrieval settings by vector type")
@api.doc(params={"vector_type": "Vector store type"})
@api.response(200, "Mock retrieval settings retrieved successfully")
@setup_required
@login_required
@account_initialization_required
@@ -748,7 +851,13 @@ class DatasetRetrievalSettingMockApi(Resource):
raise ValueError(f"Unsupported vector db type {vector_type}.")
@console_ns.route("/datasets/<uuid:dataset_id>/error-docs")
class DatasetErrorDocs(Resource):
@api.doc("get_dataset_error_docs")
@api.doc(description="Get dataset error documents")
@api.doc(params={"dataset_id": "Dataset ID"})
@api.response(200, "Error documents retrieved successfully")
@api.response(404, "Dataset not found")
@setup_required
@login_required
@account_initialization_required
@@ -762,7 +871,14 @@ class DatasetErrorDocs(Resource):
return {"data": [marshal(item, document_status_fields) for item in results], "total": len(results)}, 200
@console_ns.route("/datasets/<uuid:dataset_id>/permission-part-users")
class DatasetPermissionUserListApi(Resource):
@api.doc("get_dataset_permission_users")
@api.doc(description="Get dataset permission user list")
@api.doc(params={"dataset_id": "Dataset ID"})
@api.response(200, "Permission users retrieved successfully")
@api.response(404, "Dataset not found")
@api.response(403, "Permission denied")
@setup_required
@login_required
@account_initialization_required
@@ -783,7 +899,13 @@ class DatasetPermissionUserListApi(Resource):
}, 200
@console_ns.route("/datasets/<uuid:dataset_id>/auto-disable-logs")
class DatasetAutoDisableLogApi(Resource):
@api.doc("get_dataset_auto_disable_logs")
@api.doc(description="Get dataset auto disable logs")
@api.doc(params={"dataset_id": "Dataset ID"})
@api.response(200, "Auto disable logs retrieved successfully")
@api.response(404, "Dataset not found")
@setup_required
@login_required
@account_initialization_required
@@ -793,20 +915,3 @@ class DatasetAutoDisableLogApi(Resource):
if dataset is None:
raise NotFound("Dataset not found.")
return DatasetService.get_dataset_auto_disable_logs(dataset_id_str), 200
api.add_resource(DatasetListApi, "/datasets")
api.add_resource(DatasetApi, "/datasets/<uuid:dataset_id>")
api.add_resource(DatasetUseCheckApi, "/datasets/<uuid:dataset_id>/use-check")
api.add_resource(DatasetQueryApi, "/datasets/<uuid:dataset_id>/queries")
api.add_resource(DatasetErrorDocs, "/datasets/<uuid:dataset_id>/error-docs")
api.add_resource(DatasetIndexingEstimateApi, "/datasets/indexing-estimate")
api.add_resource(DatasetRelatedAppListApi, "/datasets/<uuid:dataset_id>/related-apps")
api.add_resource(DatasetIndexingStatusApi, "/datasets/<uuid:dataset_id>/indexing-status")
api.add_resource(DatasetApiKeyApi, "/datasets/api-keys")
api.add_resource(DatasetApiDeleteApi, "/datasets/api-keys/<uuid:api_key_id>")
api.add_resource(DatasetApiBaseUrlApi, "/datasets/api-base-info")
api.add_resource(DatasetRetrievalSettingApi, "/datasets/retrieval-setting")
api.add_resource(DatasetRetrievalSettingMockApi, "/datasets/retrieval-setting/<string:vector_type>")
api.add_resource(DatasetPermissionUserListApi, "/datasets/<uuid:dataset_id>/permission-part-users")
api.add_resource(DatasetAutoDisableLogApi, "/datasets/<uuid:dataset_id>/auto-disable-logs")

View File

@@ -1,15 +1,16 @@
import logging
from argparse import ArgumentTypeError
from collections.abc import Sequence
from typing import Literal, cast
from flask import request
from flask_login import current_user
from flask_restx import Resource, marshal, marshal_with, reqparse
from flask_restx import Resource, fields, marshal, marshal_with, reqparse
from sqlalchemy import asc, desc, select
from werkzeug.exceptions import Forbidden, NotFound
import services
from controllers.console import api
from controllers.console import api, console_ns
from controllers.console.app.error import (
ProviderModelCurrentlyNotSupportError,
ProviderNotInitializeError,
@@ -79,7 +80,7 @@ class DocumentResource(Resource):
return document return document
def get_batch_documents(self, dataset_id: str, batch: str) -> list[Document]: def get_batch_documents(self, dataset_id: str, batch: str) -> Sequence[Document]:
dataset = DatasetService.get_dataset(dataset_id) dataset = DatasetService.get_dataset(dataset_id)
if not dataset: if not dataset:
raise NotFound("Dataset not found.") raise NotFound("Dataset not found.")
@@ -97,7 +98,12 @@ class DocumentResource(Resource):
return documents return documents
@console_ns.route("/datasets/process-rule")
class GetProcessRuleApi(Resource):
@api.doc("get_process_rule")
@api.doc(description="Get dataset document processing rules")
@api.doc(params={"document_id": "Document ID (optional)"})
@api.response(200, "Process rules retrieved successfully")
@setup_required
@login_required
@account_initialization_required
@@ -139,7 +145,21 @@ class GetProcessRuleApi(Resource):
return {"mode": mode, "rules": rules, "limits": limits}
@console_ns.route("/datasets/<uuid:dataset_id>/documents")
class DatasetDocumentListApi(Resource):
@api.doc("get_dataset_documents")
@api.doc(description="Get documents in a dataset")
@api.doc(
params={
"dataset_id": "Dataset ID",
"page": "Page number (default: 1)",
"limit": "Number of items per page (default: 20)",
"keyword": "Search keyword",
"sort": "Sort order (default: -created_at)",
"fetch": "Fetch full details (default: false)",
}
)
@api.response(200, "Documents retrieved successfully")
@setup_required
@login_required
@account_initialization_required
@@ -323,7 +343,23 @@ class DatasetDocumentListApi(Resource):
return {"result": "success"}, 204
@console_ns.route("/datasets/init")
class DatasetInitApi(Resource):
@api.doc("init_dataset")
@api.doc(description="Initialize dataset with documents")
@api.expect(
api.model(
"DatasetInitRequest",
{
"upload_file_id": fields.String(required=True, description="Upload file ID"),
"indexing_technique": fields.String(description="Indexing technique"),
"process_rule": fields.Raw(description="Processing rules"),
"data_source": fields.Raw(description="Data source configuration"),
},
)
)
@api.response(201, "Dataset initialized successfully", dataset_and_document_fields)
@api.response(400, "Invalid request parameters")
@setup_required
@login_required
@account_initialization_required
@@ -393,7 +429,14 @@ class DatasetInitApi(Resource):
return response
@console_ns.route("/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/indexing-estimate")
class DocumentIndexingEstimateApi(DocumentResource):
@api.doc("estimate_document_indexing")
@api.doc(description="Estimate document indexing cost")
@api.doc(params={"dataset_id": "Dataset ID", "document_id": "Document ID"})
@api.response(200, "Indexing estimate calculated successfully")
@api.response(404, "Document not found")
@api.response(400, "Document already finished")
@setup_required
@login_required
@account_initialization_required
@@ -456,6 +499,7 @@ class DocumentIndexingEstimateApi(DocumentResource):
return response, 200
@console_ns.route("/datasets/<uuid:dataset_id>/batch/<string:batch>/indexing-estimate")
class DocumentBatchIndexingEstimateApi(DocumentResource):
@setup_required
@login_required
@@ -548,6 +592,7 @@ class DocumentBatchIndexingEstimateApi(DocumentResource):
raise IndexingEstimateError(str(e))
@console_ns.route("/datasets/<uuid:dataset_id>/batch/<string:batch>/indexing-status")
class DocumentBatchIndexingStatusApi(DocumentResource):
@setup_required
@login_required
@@ -592,7 +637,13 @@ class DocumentBatchIndexingStatusApi(DocumentResource):
return data
@console_ns.route("/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/indexing-status")
class DocumentIndexingStatusApi(DocumentResource):
@api.doc("get_document_indexing_status")
@api.doc(description="Get document indexing status")
@api.doc(params={"dataset_id": "Dataset ID", "document_id": "Document ID"})
@api.response(200, "Indexing status retrieved successfully")
@api.response(404, "Document not found")
@setup_required
@login_required
@account_initialization_required
@@ -634,9 +685,21 @@ class DocumentIndexingStatusApi(DocumentResource):
return marshal(document_dict, document_status_fields)
@console_ns.route("/datasets/<uuid:dataset_id>/documents/<uuid:document_id>")
class DocumentApi(DocumentResource):
METADATA_CHOICES = {"all", "only", "without"}
@api.doc("get_document")
@api.doc(description="Get document details")
@api.doc(
params={
"dataset_id": "Dataset ID",
"document_id": "Document ID",
"metadata": "Metadata inclusion (all/only/without)",
}
)
@api.response(200, "Document retrieved successfully")
@api.response(404, "Document not found")
@setup_required
@login_required
@account_initialization_required
@@ -745,7 +808,16 @@ class DocumentApi(DocumentResource):
return {"result": "success"}, 204
@console_ns.route("/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/processing/<string:action>")
class DocumentProcessingApi(DocumentResource):
@api.doc("update_document_processing")
@api.doc(description="Update document processing status (pause/resume)")
@api.doc(
params={"dataset_id": "Dataset ID", "document_id": "Document ID", "action": "Action to perform (pause/resume)"}
)
@api.response(200, "Processing status updated successfully")
@api.response(404, "Document not found")
@api.response(400, "Invalid action")
@setup_required
@login_required
@account_initialization_required
@@ -780,7 +852,23 @@ class DocumentProcessingApi(DocumentResource):
return {"result": "success"}, 200
@console_ns.route("/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/metadata")
class DocumentMetadataApi(DocumentResource):
@api.doc("update_document_metadata")
@api.doc(description="Update document metadata")
@api.doc(params={"dataset_id": "Dataset ID", "document_id": "Document ID"})
@api.expect(
api.model(
"UpdateDocumentMetadataRequest",
{
"doc_type": fields.String(description="Document type"),
"doc_metadata": fields.Raw(description="Document metadata"),
},
)
)
@api.response(200, "Document metadata updated successfully")
@api.response(404, "Document not found")
@api.response(403, "Permission denied")
@setup_required
@login_required
@account_initialization_required
@@ -824,6 +912,7 @@ class DocumentMetadataApi(DocumentResource):
return {"result": "success", "message": "Document metadata updated."}, 200
@console_ns.route("/datasets/<uuid:dataset_id>/documents/status/<string:action>/batch")
class DocumentStatusApi(DocumentResource):
@setup_required
@login_required
@@ -860,6 +949,7 @@ class DocumentStatusApi(DocumentResource):
return {"result": "success"}, 200
@console_ns.route("/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/processing/pause")
class DocumentPauseApi(DocumentResource):
@setup_required
@login_required
@@ -893,6 +983,7 @@ class DocumentPauseApi(DocumentResource):
return {"result": "success"}, 204
@console_ns.route("/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/processing/resume")
class DocumentRecoverApi(DocumentResource):
@setup_required
@login_required
@@ -923,6 +1014,7 @@ class DocumentRecoverApi(DocumentResource):
return {"result": "success"}, 204
@console_ns.route("/datasets/<uuid:dataset_id>/retry")
class DocumentRetryApi(DocumentResource):
@setup_required
@login_required
@@ -966,6 +1058,7 @@ class DocumentRetryApi(DocumentResource):
return {"result": "success"}, 204
@console_ns.route("/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/rename")
class DocumentRenameApi(DocumentResource):
@setup_required
@login_required
@@ -989,6 +1082,7 @@ class DocumentRenameApi(DocumentResource):
return document
@console_ns.route("/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/website-sync")
class WebsiteDocumentSyncApi(DocumentResource):
@setup_required
@login_required
@@ -1014,26 +1108,3 @@ class WebsiteDocumentSyncApi(DocumentResource):
DocumentService.sync_website_document(dataset_id, document)
return {"result": "success"}, 200
api.add_resource(GetProcessRuleApi, "/datasets/process-rule")
api.add_resource(DatasetDocumentListApi, "/datasets/<uuid:dataset_id>/documents")
api.add_resource(DatasetInitApi, "/datasets/init")
api.add_resource(
DocumentIndexingEstimateApi, "/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/indexing-estimate"
)
api.add_resource(DocumentBatchIndexingEstimateApi, "/datasets/<uuid:dataset_id>/batch/<string:batch>/indexing-estimate")
api.add_resource(DocumentBatchIndexingStatusApi, "/datasets/<uuid:dataset_id>/batch/<string:batch>/indexing-status")
api.add_resource(DocumentIndexingStatusApi, "/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/indexing-status")
api.add_resource(DocumentApi, "/datasets/<uuid:dataset_id>/documents/<uuid:document_id>")
api.add_resource(
DocumentProcessingApi, "/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/processing/<string:action>"
)
api.add_resource(DocumentMetadataApi, "/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/metadata")
api.add_resource(DocumentStatusApi, "/datasets/<uuid:dataset_id>/documents/status/<string:action>/batch")
api.add_resource(DocumentPauseApi, "/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/processing/pause")
api.add_resource(DocumentRecoverApi, "/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/processing/resume")
api.add_resource(DocumentRetryApi, "/datasets/<uuid:dataset_id>/retry")
api.add_resource(DocumentRenameApi, "/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/rename")
api.add_resource(WebsiteDocumentSyncApi, "/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/website-sync")
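The hunks above also attach OpenAPI metadata directly on the handlers via @api.doc, @api.response, and @api.expect with inline api.model definitions built from flask_restx fields. Below is a minimal sketch of that decorator stack in isolation, assuming a plain flask-restx Api and Namespace; the endpoint, model, and field names are illustrative only, not the project's real ones.

```python
# Minimal sketch of the documentation decorators added above; names are
# illustrative stand-ins for the project's actual endpoints and models.
from flask import Flask, request
from flask_restx import Api, Namespace, Resource, fields

app = Flask(__name__)
api = Api(app)
console_ns = Namespace("console")
api.add_namespace(console_ns)

# Request schema shown in the Swagger UI and reusable for payload docs.
example_request = api.model(
    "ExampleInitRequest",
    {
        "upload_file_id": fields.String(required=True, description="Upload file ID"),
        "process_rule": fields.Raw(description="Processing rules"),
    },
)


@console_ns.route("/example/init")
class ExampleInitApi(Resource):
    @api.doc("init_example")
    @api.doc(description="Initialize an example resource")
    @api.expect(example_request)
    @api.response(201, "Created successfully")
    @api.response(400, "Invalid request parameters")
    def post(self):
        # Echo the JSON payload back; a real handler would validate it.
        payload = request.get_json(silent=True) or {}
        return {"received": payload}, 201
```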

View File

@@ -1,10 +1,10 @@
from flask import request
from flask_login import current_user
-from flask_restx import Resource, marshal, reqparse
+from flask_restx import Resource, fields, marshal, reqparse
from werkzeug.exceptions import Forbidden, InternalServerError, NotFound
import services
-from controllers.console import api
+from controllers.console import api, console_ns
from controllers.console.datasets.error import DatasetNameDuplicateError
from controllers.console.wraps import account_initialization_required, setup_required
from fields.dataset_fields import dataset_detail_fields
@@ -21,7 +21,18 @@ def _validate_name(name):
return name
@console_ns.route("/datasets/external-knowledge-api")
class ExternalApiTemplateListApi(Resource):
@api.doc("get_external_api_templates")
@api.doc(description="Get external knowledge API templates")
@api.doc(
params={
"page": "Page number (default: 1)",
"limit": "Number of items per page (default: 20)",
"keyword": "Search keyword",
}
)
@api.response(200, "External API templates retrieved successfully")
@setup_required
@login_required
@account_initialization_required
@@ -79,7 +90,13 @@ class ExternalApiTemplateListApi(Resource):
return external_knowledge_api.to_dict(), 201
@console_ns.route("/datasets/external-knowledge-api/<uuid:external_knowledge_api_id>")
class ExternalApiTemplateApi(Resource):
@api.doc("get_external_api_template")
@api.doc(description="Get external knowledge API template details")
@api.doc(params={"external_knowledge_api_id": "External knowledge API ID"})
@api.response(200, "External API template retrieved successfully")
@api.response(404, "Template not found")
@setup_required
@login_required
@account_initialization_required
@@ -138,7 +155,12 @@ class ExternalApiTemplateApi(Resource):
return {"result": "success"}, 204
@console_ns.route("/datasets/external-knowledge-api/<uuid:external_knowledge_api_id>/use-check")
class ExternalApiUseCheckApi(Resource):
@api.doc("check_external_api_usage")
@api.doc(description="Check if external knowledge API is being used")
@api.doc(params={"external_knowledge_api_id": "External knowledge API ID"})
@api.response(200, "Usage check completed successfully")
@setup_required
@login_required
@account_initialization_required
@@ -151,7 +173,24 @@ class ExternalApiUseCheckApi(Resource):
return {"is_using": external_knowledge_api_is_using, "count": count}, 200
@console_ns.route("/datasets/external")
class ExternalDatasetCreateApi(Resource):
@api.doc("create_external_dataset")
@api.doc(description="Create external knowledge dataset")
@api.expect(
api.model(
"CreateExternalDatasetRequest",
{
"external_knowledge_api_id": fields.String(required=True, description="External knowledge API ID"),
"external_knowledge_id": fields.String(required=True, description="External knowledge ID"),
"name": fields.String(required=True, description="Dataset name"),
"description": fields.String(description="Dataset description"),
},
)
)
@api.response(201, "External dataset created successfully", dataset_detail_fields)
@api.response(400, "Invalid parameters")
@api.response(403, "Permission denied")
@setup_required
@login_required
@account_initialization_required
@@ -191,7 +230,24 @@ class ExternalDatasetCreateApi(Resource):
return marshal(dataset, dataset_detail_fields), 201
@console_ns.route("/datasets/<uuid:dataset_id>/external-hit-testing")
class ExternalKnowledgeHitTestingApi(Resource):
@api.doc("test_external_knowledge_retrieval")
@api.doc(description="Test external knowledge retrieval for dataset")
@api.doc(params={"dataset_id": "Dataset ID"})
@api.expect(
api.model(
"ExternalHitTestingRequest",
{
"query": fields.String(required=True, description="Query text for testing"),
"retrieval_model": fields.Raw(description="Retrieval model configuration"),
"external_retrieval_model": fields.Raw(description="External retrieval model configuration"),
},
)
)
@api.response(200, "External hit testing completed successfully")
@api.response(404, "Dataset not found")
@api.response(400, "Invalid parameters")
@setup_required
@login_required
@account_initialization_required
@@ -228,8 +284,22 @@ class ExternalKnowledgeHitTestingApi(Resource):
raise InternalServerError(str(e))
@console_ns.route("/test/retrieval")
class BedrockRetrievalApi(Resource):
# this api is only for internal testing
@api.doc("bedrock_retrieval_test")
@api.doc(description="Bedrock retrieval test (internal use only)")
@api.expect(
api.model(
"BedrockRetrievalTestRequest",
{
"retrieval_setting": fields.Raw(required=True, description="Retrieval settings"),
"query": fields.String(required=True, description="Query text"),
"knowledge_id": fields.String(required=True, description="Knowledge ID"),
},
)
)
@api.response(200, "Bedrock retrieval test completed")
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("retrieval_setting", nullable=False, required=True, type=dict, location="json")
@@ -247,12 +317,3 @@ class BedrockRetrievalApi(Resource):
args["retrieval_setting"], args["query"], args["knowledge_id"]
)
return result, 200
api.add_resource(ExternalKnowledgeHitTestingApi, "/datasets/<uuid:dataset_id>/external-hit-testing")
api.add_resource(ExternalDatasetCreateApi, "/datasets/external")
api.add_resource(ExternalApiTemplateListApi, "/datasets/external-knowledge-api")
api.add_resource(ExternalApiTemplateApi, "/datasets/external-knowledge-api/<uuid:external_knowledge_api_id>")
api.add_resource(ExternalApiUseCheckApi, "/datasets/external-knowledge-api/<uuid:external_knowledge_api_id>/use-check")
# this api is only for internal test
api.add_resource(BedrockRetrievalApi, "/test/retrieval")

View File

@@ -1,6 +1,6 @@
-from flask_restx import Resource
+from flask_restx import Resource, fields
-from controllers.console import api
+from controllers.console import api, console_ns
from controllers.console.datasets.hit_testing_base import DatasetsHitTestingBase
from controllers.console.wraps import (
account_initialization_required,
@@ -10,7 +10,25 @@ from controllers.console.wraps import (
from libs.login import login_required
@console_ns.route("/datasets/<uuid:dataset_id>/hit-testing")
class HitTestingApi(Resource, DatasetsHitTestingBase):
@api.doc("test_dataset_retrieval")
@api.doc(description="Test dataset knowledge retrieval")
@api.doc(params={"dataset_id": "Dataset ID"})
@api.expect(
api.model(
"HitTestingRequest",
{
"query": fields.String(required=True, description="Query text for testing"),
"retrieval_model": fields.Raw(description="Retrieval model configuration"),
"top_k": fields.Integer(description="Number of top results to return"),
"score_threshold": fields.Float(description="Score threshold for filtering results"),
},
)
)
@api.response(200, "Hit testing completed successfully")
@api.response(404, "Dataset not found")
@api.response(400, "Invalid parameters")
@setup_required
@login_required
@account_initialization_required
@@ -23,6 +41,3 @@ class HitTestingApi(Resource, DatasetsHitTestingBase):
self.hit_testing_args_check(args)
return self.perform_hit_testing(dataset, args)
api.add_resource(HitTestingApi, "/datasets/<uuid:dataset_id>/hit-testing")

View File

@@ -113,7 +113,7 @@ class DatasetMetadataBuiltInFieldActionApi(Resource):
MetadataService.enable_built_in_field(dataset)
elif action == "disable":
MetadataService.disable_built_in_field(dataset)
-return 200
+return {"result": "success"}, 200
class DocumentMetadataEditApi(Resource):
@@ -135,7 +135,7 @@ class DocumentMetadataEditApi(Resource):
MetadataService.update_documents_metadata(dataset, metadata_args)
-return 200
+return {"result": "success"}, 200
api.add_resource(DatasetMetadataCreateApi, "/datasets/<uuid:dataset_id>/metadata")
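The two `return 200` fixes above address a flask-restx quirk: a resource method's return value is treated as the response payload, so a bare `return 200` ends up serialized as the body (or rejected by Flask) rather than interpreted as an empty 200 response. Returning a `(body, status)` tuple makes both explicit. A small illustrative sketch follows; the route and class names are made up, not the actual metadata controllers.

```python
# Illustrative sketch of the return-style fix; assumes flask-restx's default
# JSON representation. Route and class names are hypothetical.
from flask import Flask
from flask_restx import Api, Resource

app = Flask(__name__)
api = Api(app)


@api.route("/example/built-in-field/<string:action>")
class ExampleActionApi(Resource):
    def post(self, action: str):
        # Before the fix: `return 200` leaves the status implicit and puts
        # the bare integer into the response body.
        # After the fix: an explicit (body, status) tuple.
        return {"result": "success"}, 200
```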

View File

@@ -1,13 +1,32 @@
-from flask_restx import Resource, reqparse
+from flask_restx import Resource, fields, reqparse
-from controllers.console import api
+from controllers.console import api, console_ns
from controllers.console.datasets.error import WebsiteCrawlError
from controllers.console.wraps import account_initialization_required, setup_required
from libs.login import login_required
from services.website_service import WebsiteCrawlApiRequest, WebsiteCrawlStatusApiRequest, WebsiteService
@console_ns.route("/website/crawl")
class WebsiteCrawlApi(Resource):
@api.doc("crawl_website")
@api.doc(description="Crawl website content")
@api.expect(
api.model(
"WebsiteCrawlRequest",
{
"provider": fields.String(
required=True,
description="Crawl provider (firecrawl/watercrawl/jinareader)",
enum=["firecrawl", "watercrawl", "jinareader"],
),
"url": fields.String(required=True, description="URL to crawl"),
"options": fields.Raw(required=True, description="Crawl options"),
},
)
)
@api.response(200, "Website crawl initiated successfully")
@api.response(400, "Invalid crawl parameters")
@setup_required
@login_required
@account_initialization_required
@@ -39,7 +58,14 @@ class WebsiteCrawlApi(Resource):
return result, 200
@console_ns.route("/website/crawl/status/<string:job_id>")
class WebsiteCrawlStatusApi(Resource):
@api.doc("get_crawl_status")
@api.doc(description="Get website crawl status")
@api.doc(params={"job_id": "Crawl job ID", "provider": "Crawl provider (firecrawl/watercrawl/jinareader)"})
@api.response(200, "Crawl status retrieved successfully")
@api.response(404, "Crawl job not found")
@api.response(400, "Invalid provider")
@setup_required
@login_required
@account_initialization_required
@@ -62,7 +88,3 @@ class WebsiteCrawlStatusApi(Resource):
except Exception as e:
raise WebsiteCrawlError(str(e))
return result, 200
api.add_resource(WebsiteCrawlApi, "/website/crawl")
api.add_resource(WebsiteCrawlStatusApi, "/website/crawl/status/<string:job_id>")
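The WebsiteCrawlRequest model above constrains `provider` with an enum. The sketch below shows how such an enum field can be paired with a reqparse `choices` check so the documented values are also enforced at parse time; the route, model, and handler are illustrative only and not the actual WebsiteCrawlApi implementation.

```python
# Minimal sketch of an enum-constrained request field plus matching
# reqparse validation; all names here are hypothetical.
from flask import Flask
from flask_restx import Api, Resource, fields, reqparse

app = Flask(__name__)
api = Api(app)

crawl_request = api.model(
    "ExampleCrawlRequest",
    {
        "provider": fields.String(
            required=True,
            description="Crawl provider",
            enum=["firecrawl", "watercrawl", "jinareader"],
        ),
        "url": fields.String(required=True, description="URL to crawl"),
    },
)


@api.route("/example/crawl")
class ExampleCrawlApi(Resource):
    @api.expect(crawl_request)
    def post(self):
        # The enum documents the allowed values; `choices` enforces the same
        # constraint when the request body is parsed.
        parser = reqparse.RequestParser()
        parser.add_argument(
            "provider",
            type=str,
            required=True,
            choices=["firecrawl", "watercrawl", "jinareader"],
            location="json",
        )
        parser.add_argument("url", type=str, required=True, location="json")
        args = parser.parse_args()
        return {"provider": args["provider"], "url": args["url"]}, 200
```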

View File

@@ -1,6 +1,5 @@
import logging
from flask_login import current_user
from flask_restx import reqparse
from werkzeug.exceptions import InternalServerError, NotFound
@@ -28,6 +27,8 @@ from extensions.ext_database import db
from libs import helper
from libs.datetime_utils import naive_utc_now
from libs.helper import uuid_value
from libs.login import current_user
from models import Account
from models.model import AppMode
from services.app_generate_service import AppGenerateService
from services.errors.llm import InvokeRateLimitError
@@ -57,6 +58,8 @@ class CompletionApi(InstalledAppResource):
db.session.commit()
try:
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
response = AppGenerateService.generate(
app_model=app_model, user=current_user, args=args, invoke_from=InvokeFrom.EXPLORE, streaming=streaming
)
@@ -90,6 +93,8 @@ class CompletionStopApi(InstalledAppResource):
if app_model.mode != "completion":
raise NotCompletionAppError()
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
AppQueueManager.set_stop_flag(task_id, InvokeFrom.EXPLORE, current_user.id)
return {"result": "success"}, 200
@@ -117,6 +122,8 @@ class ChatApi(InstalledAppResource):
db.session.commit()
try:
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
response = AppGenerateService.generate(
app_model=app_model, user=current_user, args=args, invoke_from=InvokeFrom.EXPLORE, streaming=True
)
@@ -153,6 +160,8 @@ class ChatStopApi(InstalledAppResource):
if app_mode not in {AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT}:
raise NotChatAppError()
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
AppQueueManager.set_stop_flag(task_id, InvokeFrom.EXPLORE, current_user.id)
return {"result": "success"}, 200
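The explore controller hunks above switch `current_user` from the flask_login proxy to the project's `libs.login` proxy and add an explicit `isinstance(current_user, Account)` guard before calling account-only services. The following is a standalone illustration of that narrowing pattern; `Account`, `EndUser`, and `set_stop_flag` are simplified stand-ins, not the real models or the AppQueueManager API.

```python
# Illustrative sketch of the isinstance guard: narrow a union-typed
# "current user" before calling code that requires an Account.
from dataclasses import dataclass
from typing import Union


@dataclass
class Account:
    id: str


@dataclass
class EndUser:
    id: str


CurrentUser = Union[Account, EndUser]


def set_stop_flag(task_id: str, user_id: str) -> None:
    # Stand-in for the account-only service call.
    print(f"stop flag set for task {task_id} by {user_id}")


def stop_task(task_id: str, current_user: CurrentUser) -> None:
    # Console/explore endpoints are meant for workspace accounts, so reject
    # anything else before touching account-only services.
    if not isinstance(current_user, Account):
        raise ValueError("current_user must be an Account instance")
    set_stop_flag(task_id, current_user.id)


if __name__ == "__main__":
    stop_task("task-1", Account(id="acc-1"))
```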

Some files were not shown because too many files have changed in this diff Show More