Compare commits

734 Commits

Author SHA1 Message Date
-LAN-
24d06eb741 Merge branch 'feat/enhance-multi-modal-support' into release/0.10.0-beta 2024-10-15 15:34:42 +08:00
-LAN-
a36ef8430e refactor(core): improve type annotations and file handling consistency
- Use more precise type annotations with Sequence and Mapping for task entities.
- Ensure raw_prompt is assigned properly after replacement in advanced prompt transform.
- Remove unused generator return type from _fetch_context method.
- Refactor tool node file handling to retrieve more comprehensive file attributes, ensuring file existence validation in the database.
2024-10-15 15:34:14 +08:00
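
A minimal sketch of the annotation style this commit describes — the entity and field names below are illustrative, not Dify's actual classes:

```python
from collections.abc import Mapping, Sequence

from pydantic import BaseModel


class TaskEntity(BaseModel):  # hypothetical entity, for illustration only
    # Sequence/Mapping signal read-only intent and accept any conforming
    # container (list or tuple, dict or other mapping), unlike the
    # concrete list/dict annotations they replace.
    file_ids: Sequence[str] = ()
    inputs: Mapping[str, str] = {}
```
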
-LAN-
6e9129a333 Update build-push.yml to include release/0.10.0-beta branch in the workflow trigger 2024-10-15 14:24:27 +08:00
-LAN-
86427468fd Update version to 0.10.0-beta3 in packaging, docker, and web 2024-10-15 14:23:02 +08:00
-LAN-
638d9250e4 Merge branch 'feat/enhance-multi-modal-support' into release/0.10.0-beta 2024-10-15 12:01:16 +08:00
-LAN-
dba67cd87a fix(memory): filter non-image file types in prompt message content
- Skip non-image files when converting file objects to prompt message content.
- Ensures only image files are processed, improving the accuracy and relevance of prompt messages.
2024-10-15 12:01:00 +08:00
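
The fix amounts to a type check in the file-to-content conversion step; a self-contained sketch, where the `File` class is a stand-in rather than Dify's real entity:

```python
from dataclasses import dataclass


@dataclass
class File:  # stand-in for the real file entity
    type: str  # e.g. "image", "document", "audio"
    url: str


def to_prompt_message_contents(files: list[File]) -> list[dict]:
    """Convert files to prompt message content, keeping images only."""
    return [
        {"type": "image_url", "image_url": {"url": f.url}}
        for f in files
        if f.type == "image"  # non-image files are skipped
    ]
```
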
-LAN-
c3811c4c23 Merge branch 'feat/attachments' into release/0.10.0-beta 2024-10-15 11:29:54 +08:00
Joel
ff7a467a1a fix: iteration not support file item 2024-10-15 11:20:52 +08:00
-LAN-
c73b5e2410 Merge branch 'feat/attachments' into release/0.10.0-beta 2024-10-15 11:05:45 +08:00
-LAN-
421fde0e85 Merge branch 'feat/enhance-multi-modal-support' into release/0.10.0-beta 2024-10-15 11:05:41 +08:00
-LAN-
b3fdd618a1 refactor(core): simplify role handling and improve usability
- Replaced explicit string usage with `CreatedByRole` enum for better maintainability.
- Removed duplicate `CreatedByRole` class definition, improving codebase consistency.
- Increased file number limits from 6 to 10 to allow more file uploads.
- Transitioned `AppMode` to a string enum for consistent type usage.
- Refactored `extract_thread_messages` function argument for flexibility.
- Removed file extension limitation in file service to support custom extensions.
- Improved enum import statements across multiple modules for clarity and consistency.
2024-10-15 11:04:28 +08:00
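
The string-enum transition works because a `str`-based enum compares equal to its raw value, so existing string checks and serialization keep working; a sketch with assumed member values:

```python
from enum import Enum


class CreatedByRole(str, Enum):  # member values assumed for illustration
    ACCOUNT = "account"
    END_USER = "end_user"


# Members behave like plain strings where it matters:
assert CreatedByRole.ACCOUNT == "account"
assert CreatedByRole("end_user") is CreatedByRole.END_USER
```
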
StyleZhang
81e57bcc73 fix: remove image item 2024-10-15 10:53:27 +08:00
-LAN-
7156753a99 Merge branch 'feat/attachments' into release/0.10.0-beta 2024-10-14 23:57:44 +08:00
-LAN-
0d310b503b refactor(prompt): improve handling of variable templates in advanced prompt transform 2024-10-14 16:47:03 +08:00
-LAN-
03018823d8 fix(workflow): handle special values for process data consistently
- Apply `handle_special_values` to `process_data` in workflow cycle management.
- Improve template processing in `AdvancedPromptTransform` with `VariablePool`.
- Make `system_variables` and `user_inputs` optional in `VariablePool` initialization.
2024-10-14 16:47:03 +08:00
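
Making `system_variables` and `user_inputs` optional is a default-argument change; a simplified sketch of the signature, not the real implementation:

```python
from collections.abc import Mapping
from typing import Any


class VariablePool:  # simplified sketch of the optional-argument change
    def __init__(
        self,
        system_variables: Mapping[str, Any] | None = None,
        user_inputs: Mapping[str, Any] | None = None,
    ) -> None:
        # Fall back to empty mappings so callers may omit either argument.
        self.system_variables = dict(system_variables or {})
        self.user_inputs = dict(user_inputs or {})
```
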
-LAN-
885192db38 feat(podcast_generator): add new podcast generation tools
- Introduced podcast generator with text-to-speech functionality using OpenAI's API.
- Implemented credential validation for TTS services and API keys.
- Added support for generating podcast audio with alternating host voices.
- Included user-friendly setup with internationalized YAML configuration.
- Added SVG icon to enhance visual identification.
2024-10-14 16:47:03 +08:00
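
Alternating host voices reduces to cycling through a voice list while calling a TTS endpoint; a sketch against the OpenAI SDK, where the voice choices and script format are assumptions rather than the tool's actual configuration:

```python
from itertools import cycle

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def generate_podcast(script_lines: list[str]) -> list[bytes]:
    """Synthesize each dialogue line, alternating between two host voices."""
    voices = cycle(["alloy", "onyx"])
    segments = []
    for line, voice in zip(script_lines, voices):
        response = client.audio.speech.create(model="tts-1", voice=voice, input=line)
        segments.append(response.content)  # raw audio bytes for this line
    return segments
```
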
-LAN-
ea18dd1571 feat(api): Enhance multi modal support. 2024-10-14 16:47:03 +08:00
-LAN-
7838f9f3a3 fix: Add new Milvus Lite wheel for manylinux2014_aarch64 (#9316) 2024-10-14 16:27:26 +08:00
StyleZhang
8fe5028f74 Merge branch 'main' into feat/attachments 2024-10-14 16:09:05 +08:00
我有一把妖刀
de3c5751db chore: add reopen preview btn (#9279)
Co-authored-by: billsyli <billsyli@tencent.com>
2024-10-14 13:32:52 +08:00
我有一把妖刀
5ee7e03c1b chore: Optimize operations in Q&A mode (#9274)
Co-authored-by: billsyli <billsyli@tencent.com>
2024-10-14 13:32:13 +08:00
zhuhao
7a405b86c9 refactor: Refactor the service for retrieving the recommended app (#9302) 2024-10-14 13:26:21 +08:00
非法操作
ffc3f33670 chore: remove the copied zhipu_ai sdk (#9270) 2024-10-14 10:53:45 +08:00
kurokobo
857055b797 fix: remove the latest message from the user that does not have any answer yet (#9297) 2024-10-13 23:25:50 +08:00
ice yao
d15ba3939d Add Volcengine VikingDB as new vector provider (#9287) 2024-10-13 21:26:05 +08:00
github-actions[bot]
1ec83e4969 chore: translate i18n files (#9288)
Co-authored-by: douxc <7553076+douxc@users.noreply.github.com>
Co-authored-by: crazywoola <427733928@qq.com>
2024-10-13 14:56:26 +08:00
zhuhao
9275760599 chore: add baidu-obs and supabase for .env.example (#9289) 2024-10-13 09:44:53 +08:00
zhuhao
d97d3ff5fc chore: add abstract decorator and output log when query embedding fails (#9264) 2024-10-12 23:58:41 +08:00
NFish
ea6734f550 Feat/new account page (#9236) 2024-10-12 23:49:18 +08:00
-LAN-
f73751843f Feat/implement-refresh-tokens (#9233) 2024-10-12 23:46:30 +08:00
Wu Tianwei
dbfbc56de7 feat: refresh-token (#9286)
Co-authored-by: NFish <douxc512@gmail.com>
2024-10-12 23:40:38 +08:00
github-actions[bot]
70c5b23089 chore: translate i18n files (#9284)
Co-authored-by: YIXIAO0 <54782454+YIXIAO0@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-10-12 23:27:11 +08:00
Shili Cao
2ec6ffe478 feat:support baidu vector db (#9185) 2024-10-12 23:24:17 +08:00
Yi Xiao
793205afc5 Feat: rerank model verification in front end (#9271) 2024-10-12 21:24:43 +08:00
Garfield Dai
c6b74daa0a Fix/s3 iam add region name (#7819) 2024-10-12 18:47:59 +08:00
takatost
23ce1fb1ba chore: optimize the trace ops slow queries on node executions. (#9282) 2024-10-12 18:30:46 +08:00
takatost
29188e0562 chore: use cache instead of re-querying node record during workflow execution (#9280) 2024-10-12 17:48:59 +08:00
zhuhao
d9773c963f chore: fix the misclassification of the opensearch-py package (#9266) 2024-10-12 17:37:01 +08:00
NFish
1206b1eb96 fix: add new domain to whitelist (#9265) 2024-10-12 11:32:40 +08:00
crazywoola
93af87a9e0 fix: move exception to debug mode (#9258) 2024-10-12 09:28:45 +08:00
zhuhao
7a6970e570 feat: add supabase object storage (#9229) 2024-10-11 22:48:57 +08:00
Likename Haojie
ea584e94bd fix: dialog box cannot correctly display LaTeX formulas (#9242) 2024-10-11 22:46:44 +08:00
Jyong
42b02b3a5f Fix/agent external knowledge retrieval (#9241) 2024-10-11 19:21:03 +08:00
Yi Xiao
44f6a536d2 fix: reference issue in external knowledge api (#9240) 2024-10-11 19:17:22 +08:00
AkisAya
d7b8e071dd fix: #9222 create or update custom tools error (#9228) 2024-10-11 17:12:30 +08:00
dependabot[bot]
3f1aa1f9e2 chore(deps): bump dompurify from 3.0.5 to 3.1.7 in /web (#9232)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-11 16:57:07 +08:00
dependabot[bot]
82024a65cd chore(deps): bump micromatch from 4.0.5 to 4.0.8 in /web (#9234)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-11 16:56:32 +08:00
NFish
f4ce08211d feat: support csp (#9111)
Co-authored-by: Joel <iamjoel007@gmail.com>
2024-10-11 16:14:56 +08:00
Bowen Liang
7c6ae96a09 chore: check all dependencies groups in pyproject.toml (#9224) 2024-10-11 16:08:36 +08:00
Jyong
80b62d50f5 Fix/add es num_candidates (#9225) 2024-10-11 16:04:23 +08:00
zxhlyh
d498f4e55e Chore/model icon url (#9218)
Co-authored-by: takatost <takatost@gmail.com>
Co-authored-by: jyong <718720800@qq.com>
2024-10-11 12:33:34 +08:00
Hash Brown
1c1e008dcf fix: reduce webapp icon displayed on browser tab flickering when page is loading (#9212) 2024-10-11 12:07:39 +08:00
AAEE86
fe41e8bc18 feat: add siliconflow custom add model interface (#8745) 2024-10-11 11:56:11 +08:00
massif-01
ccc6723a8e Update docker-compose.yaml (#9183)
Co-authored-by: crazywoola <427733928@qq.com>
2024-10-11 11:55:16 +08:00
Joe
1e7ef46e9c fix: update inner api proxies (#9174) 2024-10-11 11:35:01 +08:00
crazywoola
e0c8189f1a feat: add NotFound error for dataset service (#9215) 2024-10-11 11:28:09 +08:00
Wu Tianwei
8bcad002df fix: add loading to 'delete' button & 'save' button (#9214) 2024-10-11 11:22:59 +08:00
Fei He
5c76131d3d feat: add gte rerank for tongyi (#9153) 2024-10-11 10:35:56 +08:00
StyleZhang
9f492d527a fix: chat input file list remove button style 2024-10-11 10:33:19 +08:00
Jyong
cabdb4ef17 fix retrieval resource position missed (#9155)
Co-authored-by: Bowen Liang <liangbowen@gf.com.cn>
2024-10-11 10:32:42 +08:00
-LAN-
a34891851b fix(log list): prevent duplicate data fetch (#9190) 2024-10-11 10:32:27 +08:00
zhuhao
05c1ef75c4 feat: add allow_llm_to_see_data flag for vanna (#9156) 2024-10-10 17:39:26 +08:00
Xiaoguang Sun
be2f1e59f2 improve: Refresh updated_at field of DataSourceOauthBinding model (#9173)
Signed-off-by: Xiaoguang Sun <sunxiaoguang@gmail.com>
2024-10-10 17:21:28 +08:00
Sa Zhang
75f0a5e36a fix: Typo Correction in jina.yaml (#9161) 2024-10-10 17:00:34 +08:00
Jyong
68107fe355 downgrade elastic to 8.14 (#9171) 2024-10-10 16:43:11 +08:00
StyleZhang
faec1c50f9 Merge branch 'main' into feat/attachments 2024-10-10 16:36:25 +08:00
takatost
477beae3bb fix: missing usage of metadata in the chatflow app (#9167) 2024-10-10 16:34:06 +08:00
StyleZhang
2888c068c1 fix: chat log 2024-10-10 16:07:35 +08:00
Yi Xiao
6689b592ff fix: delete offline "how to write a good knowledge description" helper (#9165) 2024-10-10 15:48:26 +08:00
Charlie.Wei
6b6e94da08 Fix code indentation errors (#9164) 2024-10-10 15:26:38 +08:00
Ziyu Huang
fc60b554a1 Fixes #9159: Modify to make it work with the llama.cpp rerank API (#9160) 2024-10-10 15:18:07 +08:00
dai
bffb0919cc fix: Twitter to X(Twitter) (#9157) 2024-10-10 14:13:20 +08:00
Joe
e947103b6d Feat/add workflow sys params (#9108)
Co-authored-by: Joel <iamjoel007@gmail.com>
2024-10-10 11:15:52 +08:00
ronaksingh27
62051d5171 Corrected type annotation from "any" to "Any" in all files in the "model_providers" folder (#9135) 2024-10-10 10:34:25 +08:00
StyleZhang
d77521e65f Merge branch 'main' into feat/attachments 2024-10-10 10:28:56 +08:00
crazywoola
783a6b866b fix: #9076 (#9150) 2024-10-10 10:25:45 +08:00
luckylhb90
2024a6c941 fix: vertex ai remote url error(Error: not enough values to unpack) (#9134)
Co-authored-by: hobo.l <hobo.l@binance.com>
2024-10-10 10:16:42 +08:00
-LAN-
37d5c166ca fix(migrations): correct parent_message_id for service-api records (#9132) 2024-10-10 10:16:24 +08:00
呆萌闷油瓶
060897b25b chore:add azure openai api version 2024-09-01-preview (#9141) 2024-10-10 10:07:49 +08:00
非法操作
499cc57082 fix: response_format of model_parameters will not be removed (#9148) 2024-10-10 10:07:21 +08:00
StyleZhang
b463468a14 Merge branch 'main' into feat/attachments 2024-10-10 10:02:35 +08:00
Bowen Liang
b279d19040 chore: avoid star imports usage (#9123) 2024-10-09 21:12:50 +08:00
非法操作
cb35e84f51 fix: docker env typo (#9119) 2024-10-09 21:11:13 +08:00
Hash Brown
511ffa4698 fix: follow-up (suggested questions) does not refer to the most recen… (#9122) 2024-10-09 21:10:55 +08:00
pinsily
7a1da2409d add generate conversation name error log (#9117) 2024-10-09 21:09:36 +08:00
Yi Xiao
cbd3ffe056 Fix the margin for the rerank switch in retrieval setting (#9124) 2024-10-09 18:26:38 +08:00
StyleZhang
820076beec Merge branch 'main' into feat/attachments 2024-10-09 17:48:05 +08:00
StyleZhang
fd6476b941 fix: start chat check & chat send button theme 2024-10-09 17:45:38 +08:00
StyleZhang
7abe76f07e fix: basic chat 2024-10-09 17:24:43 +08:00
Alter-xyz
ec0aa32cd4 Add customizable size for chatbot embed with default values (#9115) 2024-10-09 16:56:09 +08:00
Charlie.Wei
55679b4389 azure add o1-mini、o1-preview models (#9088)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-10-09 16:15:03 +08:00
kurokobo
c0b71f8286 feat: respect x-* headers for redirections (#9054) 2024-10-09 14:42:30 +08:00
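
For a Flask app behind a reverse proxy, honoring `X-Forwarded-*` headers is commonly done with Werkzeug's `ProxyFix`; whether this commit uses it is an assumption, but the idea looks like:

```python
from flask import Flask
from werkzeug.middleware.proxy_fix import ProxyFix

app = Flask(__name__)
# Trust one proxy hop for scheme and host, so redirects are built from
# X-Forwarded-Proto / X-Forwarded-Host instead of the internal address.
app.wsgi_app = ProxyFix(app.wsgi_app, x_proto=1, x_host=1)
```
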
Bowen Liang
240b66d737 chore: avoid implicit optional in type annotations of method (#8727) 2024-10-09 14:36:43 +08:00
StyleZhang
dba24766e8 fix: file download 2024-10-09 14:22:46 +08:00
Bowen Liang
b360feb4c1 refactor: introduce storage factory and speed up api startup by importing storage client on demand (#9086) 2024-10-09 14:15:27 +08:00
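
The startup speedup here comes from deferring heavyweight SDK imports until a storage backend is actually selected; a hedged sketch of the factory idea, with illustrative provider names:

```python
def get_storage_client(provider: str):
    """Return a storage client, importing the backend only on demand."""
    # Keeping imports inside the branches keeps unused SDKs
    # (boto3, oss2, ...) off the API startup path entirely.
    if provider == "s3":
        import boto3

        return boto3.client("s3")
    if provider == "local":
        from pathlib import Path

        return Path("./storage")
    raise ValueError(f"unknown storage provider: {provider}")
```
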
zhuhao
5fcd614186 feat: output the execution results of tools only in debug mode (#9104) 2024-10-09 12:53:55 +08:00
zhuhao
4be1aa516c feat: add tags for vanna tools (#9102) 2024-10-09 12:53:03 +08:00
zhuhao
5e250403d3 chore: add the missing tool names to the _position file (#9099) 2024-10-09 11:04:26 +08:00
Joel
12492c0d5d fix: answer node support choose file 2024-10-09 10:31:59 +08:00
SebastjanPrachovskij
f8af9c6ad0 refactor: Update countries & languages list for SearchApi engines (#9090) 2024-10-09 10:24:09 +08:00
ybalbert001
57994e4a24 fix s3 presign url check problem, support two versions(v2,v4) (#9093)
Co-authored-by: Yuanbo Li <ybalbert@amazon.com>
2024-10-09 10:23:21 +08:00
Pika
0540995e5c fix: prompt-editor regex.lastIndex needed to reset (#9097)
Co-authored-by: Chen(MAC) <chenchen404@outlook.com>
2024-10-09 10:22:20 +08:00
crazywoola
3a0734d94c Feat/9081 add support for llamaguard through groq provider (#9083) 2024-10-09 01:00:10 +08:00
zhuhao
5b7de7705e feat: add timestamp conversion and timestamp retrieval for time tool (#9085) 2024-10-08 23:25:11 +08:00
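
Timestamp retrieval and conversion in Python is a couple of `datetime` calls; a minimal sketch of what such a time tool does:

```python
from datetime import datetime, timezone


def current_timestamp() -> int:
    """Current Unix timestamp in seconds (UTC)."""
    return int(datetime.now(timezone.utc).timestamp())


def timestamp_to_iso(ts: int) -> str:
    """Convert a Unix timestamp back to an ISO-8601 string."""
    return datetime.fromtimestamp(ts, tz=timezone.utc).isoformat()
```
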
Dongsheng Zhao
7121afdd44 fix: #8969 (#9076) 2024-10-08 19:58:03 +08:00
zhuhao
5213650fed feat: update the xinf tool's API key to optional (#9073) 2024-10-08 19:07:53 +08:00
Joel
2740d68cd1 chore: prompt not support file var 2024-10-08 18:32:30 +08:00
Joel
7218f718ad fix: value check 2024-10-08 18:05:55 +08:00
gaocarri
8204e0e14a fix: Count exception occurs when searching conversations (#8754)
Co-authored-by: zheng.gao <zheng.gao@amh-group.com>
2024-10-08 17:25:33 +08:00
Joel
2ae0aceef8 chore: fix doc options 2024-10-08 17:23:57 +08:00
Joel
a5c1132087 chore: prompt support array string number 2024-10-08 17:13:51 +08:00
Infinitnet
e741ee2f45 Correct max_tokens for OpenRouter Sonnet 3.5 (#9068) 2024-10-08 16:06:47 +08:00
Sota Oizumi
0564e8a284 docs: Add Google Cloud Terraform to README (#9065) 2024-10-08 15:39:47 +08:00
非法操作
966e65bb66 fix: zhipu ai web_search not work (#9058) 2024-10-08 15:36:31 +08:00
Bowen Liang
896998ef3f chore: refine python dependency list and check dependencies in order (#9061) 2024-10-08 15:11:45 +08:00
Joel
691a7f727a fix: iteration output var type 2024-10-08 14:54:21 +08:00
Joel
7e2984b6e2 fix: array operator render 2024-10-08 14:45:28 +08:00
NFish
4abca8614f fix: Ignore the error toast notification if the status is 401 and isPublicAPI is true (#9062) 2024-10-08 11:48:22 +08:00
Bowen Liang
7c0b159a81 chore: removing unused imports in tests (#9049) 2024-10-08 11:13:11 +08:00
Sergio Sacristán
a8b4d1ac2a feat: Improvement- use non root user for Web container (#8928) 2024-10-08 11:12:21 +08:00
Bowen Liang
b933c9d206 chore: move testing env variables from pyproject.toml to pytest.ini (#9019) 2024-10-08 08:40:29 +08:00
呆萌闷油瓶
f45042aa8e fix:ddg ratelimit 202 (#9047) 2024-10-07 22:13:41 +08:00
aiscrm
2ab8bc679f fix: Missing model information in llm span of Langfuse #9029 (#9030)
Co-authored-by: corel <corelchen@qq.com>
2024-10-07 18:03:30 +08:00
zhuhao
2571b0c4e3 feat: add baidu obs storage (#9024) 2024-10-07 11:09:27 +08:00
zhuhao
959a81a41b refactor: remove the duplicate definitions across different modules (#9022) 2024-10-07 11:08:06 +08:00
Bowen Liang
4480b469a6 chore: fix the yanked dependency version aiohappyeyeballs 2.4.2 (#9020) 2024-10-07 11:07:34 +08:00
zg0d233
fcfa1252a0 fix bug when adding openai or openai-compatible stt model instance (#9006) 2024-10-07 11:06:38 +08:00
zhuhao
e1e2d0b364 fix: failed to open links to images generated by QR code tool when using Huawei OBS (#9034) 2024-10-07 11:06:08 +08:00
crazywoola
9815a0911b fix: tools description is missing (#8999) 2024-10-03 21:53:11 +08:00
Giannis Kepas
dc5839b6bb feat: Update AWS Bedrock supported regions (#8992) 2024-10-03 15:17:28 +08:00
zhaoyi233
4373777871 Update json_in_md_parser.py (#8983)
Co-authored-by: crazywoola <427733928@qq.com>
2024-10-03 10:20:56 +08:00
ice yao
415d27c8bf feat(Tools): add discord incoming webhook for sending messages (#7852) 2024-10-02 13:18:35 +08:00
omr
5366820a2f fix: corrected typo (#8979) 2024-10-02 12:54:22 +08:00
Hash Brown
5f8a27074e fix: chat API is not bringing the conversation/session history (#8965) 2024-10-01 12:10:36 +08:00
zhuhao
24ba9fdf6c feat: enhance stepfun image generation tool (#8954) 2024-10-01 10:55:54 +08:00
zhuhao
824a0dd63e feat: add qwen2.5-72b and llama3.2 for openrouter (#8956) 2024-10-01 10:55:51 +08:00
ice yao
c2d606d587 chore: remove unexecuted scripts to avoid ambiguity (#8946) 2024-10-01 09:15:18 +08:00
omr
2deaece7e2 refactor: remove unnecessary comment (#8949) 2024-10-01 09:14:49 +08:00
CXwudi
0d84221b2c chore: sort Gemini models (#8951) 2024-10-01 09:14:36 +08:00
CXwudi
cdd7e55a88 chore: add missing models from Voyage (#8950) 2024-10-01 09:14:21 +08:00
-LAN-
1f5cc071f8 chore(version): bump to 0.9.1 (#8945) 2024-09-30 23:22:21 +08:00
Jyong
625e4c4c72 fix multiple retrieval in knowledge node (#8942) 2024-09-30 23:07:04 +08:00
-LAN-
7850a28ec8 Revert "chore(version): bump to 0.9.1" (#8944) 2024-09-30 22:53:32 +08:00
-LAN-
730d3a6d7c chore(version): bump to 0.9.1 (#8938) 2024-09-30 22:13:38 +08:00
Yi Xiao
d6a44e9990 fix: request params for internal dataset (#8940) 2024-09-30 22:10:27 +08:00
Jyong
3069b5cf57 original dataset update remove unused parameters (#8939) 2024-09-30 22:01:32 +08:00
NFish
7873e455bb fix: Fix the error when importing web pages using jina (#8937) 2024-09-30 21:27:11 +08:00
Jyong
a651b73db0 original dataset update issue (#8935) 2024-09-30 21:17:12 +08:00
-LAN-
d2ce4960f1 chore(versioning): bump version to 0.9.0 (#8911) 2024-09-30 18:33:20 +08:00
JzoNg
44e81dbbc8 Merge branch 'main' into jzh 2024-09-30 17:19:35 +08:00
KVOJJJin
1af4ca344e Feat: add debounce for search in logs (#8924) 2024-09-30 17:18:47 +08:00
zhuhao
fa837b2dfd fix: fix the issue with the system model configuration update (#8923) 2024-09-30 17:14:13 +08:00
Joel
944cfd2b68 chore: merge main 2024-09-30 16:43:06 +08:00
Joel
6d2682c751 fix: help link url 2024-09-30 16:36:16 +08:00
github-actions[bot]
824a71388a chore: translate i18n files (#8917)
Co-authored-by: JohnJyong <76649700+JohnJyong@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-09-30 16:35:00 +08:00
Aurelius Huang
4585cffce1 fix: Compatible with special characters in pg full-text search. (#8921)
Co-authored-by: Aurelius Huang <cm.huang@aftership.com>
2024-09-30 16:32:23 +08:00
Yi Xiao
13046709a9 fix: line in iteration node is not straight (#8918) 2024-09-30 16:04:51 +08:00
Jyong
9d221a5e19 external knowledge api (#8913)
Co-authored-by: Yi <yxiaoisme@gmail.com>
2024-09-30 15:38:43 +08:00
StyleZhang
d2971e84bb fix: node handle tooltip 2024-09-30 15:23:04 +08:00
StyleZhang
b67b81bf8f fix: file upload tip 2024-09-30 14:52:16 +08:00
Joel
c05902404d chore: add help link 2024-09-30 14:00:51 +08:00
Joel
1e6d5f2c48 feat: add support doc extract file types 2024-09-30 13:56:24 +08:00
zhuhao
77aef9ff1d refactor: optimize the calculation of rerank threshold and the logic for forbidden characters in model_uid (#8879) 2024-09-30 12:55:01 +08:00
zhuhao
503561f464 fix: fix the data validation consistency issue in keyword content review (#8908) 2024-09-30 12:52:18 +08:00
-LAN-
ada9d408ac refactor(api/variables): VariableError as a ValueError. (#8554) 2024-09-30 12:48:58 +08:00
-LAN-
3af65b2f45 feat(api): add version comparison logic (#8902) 2024-09-30 11:12:26 +08:00
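
Version comparison has to be numeric per dotted segment, since plain string comparison ranks "0.9.1" above "0.10.0"; a minimal sketch:

```python
def is_newer(latest: str, current: str) -> bool:
    """Return True if `latest` is a higher dotted version than `current`."""
    def parts(version: str) -> tuple[int, ...]:
        return tuple(int(p) for p in version.split("."))

    return parts(latest) > parts(current)


assert is_newer("0.10.0", "0.9.1")  # lexicographic comparison gets this wrong
```
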
Zhaofeng Miao
369e1e6f58 feat(website-crawl): add jina reader as additional alternative for website crawling (#8761) 2024-09-30 09:57:19 +08:00
JzoNg
e2b1464db2 fix features update by DSL import 2024-09-29 17:15:39 +08:00
zhuhao
fb49413a41 feat: add voyage ai as a new model provider (#8747) 2024-09-29 16:55:59 +08:00
zhuhao
42dfde6546 docs: add english versions for the files customizable_model_scale_out and predefined_model_scale_out (#8871) 2024-09-29 16:16:56 +08:00
chenxu9741
c531b4a911 fix: #8843 event: tts_message_end always return in api streaming resp… (#8846) 2024-09-29 16:13:20 +08:00
longzhihun
e4ed916baa Add Jamba and Llama3.2 model support (#8878) 2024-09-29 16:12:56 +08:00
-LAN-
4ec977eaba fix(workflow): update tagging logic in GitHub Actions (#8882) 2024-09-29 16:12:42 +08:00
JzoNg
f0285a53d2 fix style of textarea in retrieval test 2024-09-29 15:53:10 +08:00
Joel
00f91b5dc4 chore: list filter i18n 2024-09-29 14:57:38 +08:00
StyleZhang
dc3e86b82a fix: embedded chat file form 2024-09-29 14:56:13 +08:00
Joel
d239c5b54d fix: list filter first name 2024-09-29 14:47:02 +08:00
StyleZhang
23abccd3a6 fix: lint 2024-09-29 14:43:17 +08:00
JzoNg
2520e40059 fix: feature panel content height in safari 2024-09-29 14:17:27 +08:00
JzoNg
1d4ed3d9e7 fix: feature panel content height in safari 2024-09-29 14:06:07 +08:00
JzoNg
eed8ab9348 fix style of conversation variable 2024-09-29 11:58:23 +08:00
JzoNg
112aaf6e1b fix completion 2024-09-29 11:26:35 +08:00
StyleZhang
094a1a1458 remove chat thumb 2024-09-29 10:49:00 +08:00
Bowen Liang
74f58f29f9 chore: bump ruff to 0.6.8 for fixing violation in SIM910 (#8869) 2024-09-29 00:29:59 +08:00
zhuhao
f97607370a refactor: update Callback to an abstract class (#8868) 2024-09-28 21:41:02 +08:00
zhuhao
850492dafa feat: deprecate gte-Qwen2-7B-instruct embedding model (#8866) 2024-09-28 21:40:27 +08:00
zhuhao
61c89a9168 feat: add internlm2.5-20b and qwen2.5-coder-7b model (#8862) 2024-09-28 16:31:02 +08:00
takatost
49af18fbd6 fix: customize model credentials were invalid despite the provider credentials being active (#8864) 2024-09-28 15:54:26 +08:00
zhuhao
6cd22f3bca fix: update qwen2.5-coder-7b model name (#8861) 2024-09-28 15:01:27 +08:00
Kevin9703
a2e2f8a8c9 fix(workflow/nodes/knowledge-retrieval/use-config): Preserve rerankin… (#8842) 2024-09-28 10:54:50 +08:00
ice yao
27e33fb15c chore: fix wrong VectorType match case (#8857) 2024-09-28 10:54:04 +08:00
zhuhao
55e6123db9 feat: add min-connection and max-connection for pgvector (#8841) 2024-09-27 18:16:20 +08:00
JzoNg
955fa4345a fix output of run 2024-09-27 17:37:26 +08:00
走在修行的大街上
c828a5dfdf feat(Tools): add feishu tools (#8800)
Co-authored-by: 黎斌 <libin.23@bytedance.com>
2024-09-27 17:31:45 +08:00
Joel
ac5e381a1a chore: change list filter writing 2024-09-27 17:09:12 +08:00
Joel
ae9b9f867a chore: add new node description 2024-09-27 16:37:14 +08:00
JzoNg
8fd04e5313 merge main 2024-09-27 16:27:00 +08:00
CXwudi
0603359e2d fix: delete harm catalog settings for gemini (#8829) 2024-09-27 13:49:03 +08:00
HowardChan
bb781764b8 Add Llama3.2 models in Groq provider (#8831) 2024-09-27 12:13:00 +08:00
zhuhao
29275c7447 feat: deprecate mistral model for siliconflow (#8828) 2024-09-27 12:11:56 +08:00
8bitpd
4c1063e1c5 fix: AnalyticdbVector retrieval scores (#8803) 2024-09-27 12:05:21 +08:00
Joel
3904782647 fix: add upload i18n 2024-09-27 11:16:02 +08:00
非法操作
d6b9587a97 fix: close log status option raise error (#8826) 2024-09-27 11:13:40 +08:00
zhuhao
6fbaabc1bc feat: add pgvecto-rs and analyticdb in docker/.env.example (#8823) 2024-09-27 11:13:29 +08:00
StyleZhang
288be3fbd8 fix: chat message end 2024-09-27 10:30:36 +08:00
Shai Perednik
a36117e12d Updated the YouTube channel to Dify's (#8817) 2024-09-27 09:15:33 +08:00
CXwudi
e5efd09ebb chore: massive update of the Gemini models based on latest documentation (#8822) 2024-09-27 09:14:33 +08:00
wenmeng zhou
ecc951609d add more detailed doc for models of qwen series (#8799)
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-26 22:32:33 +08:00
ice yao
063474f408 Add llama3.2 model in fireworks provider (#8809) 2024-09-26 22:21:01 +08:00
Hash Brown
3dfbc348e3 feat: improved SVG output UX (#8765) 2024-09-26 19:41:59 +08:00
AAEE86
9a4b53a212 feat: add stream for Gemini (#8678) 2024-09-26 19:08:59 +08:00
AAEE86
03edfbe6f5 feat: add qwen to add custom model parameters (#8759) 2024-09-26 19:04:25 +08:00
Joel
3d2cb25a67 fix: change wrong company name (#8801) 2024-09-26 17:53:11 +08:00
非法操作
6df14e50b2 fix: workflow as tool always outdated (#8798) 2024-09-26 17:50:36 +08:00
JzoNg
f7f836d6f1 fix workflow output 2024-09-26 17:16:08 +08:00
JzoNg
5dedcb74a5 fix form of chat in webapp 2024-09-26 17:01:17 +08:00
StyleZhang
b95d0fa9a9 fix: file upload limits in web app 2024-09-26 16:41:44 +08:00
zhuhao
008e0efeb0 refactor: update delete method as an abstract method (#8794) 2024-09-26 16:36:21 +08:00
cx
128a66f7fe fix: Ollama model feature set vision, and an exception occurred at the… (#8783) 2024-09-26 16:34:40 +08:00
StyleZhang
543503c398 fix: file progress 2024-09-26 16:33:37 +08:00
非法操作
62406991df fix: start node input config modal raise 'variable name is required' (#8793) 2024-09-26 16:28:20 +08:00
JzoNg
3f16caf244 show file list in generation result 2024-09-26 16:10:28 +08:00
JzoNg
54133dfbde files in log 2024-09-26 15:50:19 +08:00
StyleZhang
b491c93b1c image download 2024-09-26 15:49:53 +08:00
StyleZhang
2a6d9c3211 fix: chat send 2024-09-26 15:33:01 +08:00
StyleZhang
c6691bd297 fix: webapp chat embedded chat 2024-09-26 15:26:53 +08:00
StyleZhang
2a0b30de5c fix: image.enable 2024-09-26 14:39:56 +08:00
非法操作
d1173a69f8 fix: the Image-1X tool (#8787) 2024-09-26 13:48:06 +08:00
StyleZhang
a7d53abba9 webapp chat embedded chat 2024-09-26 13:47:36 +08:00
StyleZhang
296253a365 debug chat 2024-09-26 11:54:42 +08:00
Joel
c89cefe526 chore: remove log 2024-09-26 11:50:52 +08:00
Shenghang Tsai
a0b0809b1c Add more models for SiliconFlow (#8779) 2024-09-26 11:29:53 +08:00
Aaron Ji
4c9ef6e830 fix: update usage for Jina Embeddings v3 (#8771) 2024-09-26 11:29:35 +08:00
非法操作
0c96f0aa51 fix: credential *** should be string (#8785) 2024-09-26 11:24:03 +08:00
zhuhao
ac73763726 chore: add input_type param desc for the _invoke method of text_embedding (#8778) 2024-09-26 11:23:09 +08:00
非法操作
5ba19d64e9 fix: TavilySearch tool get api link (#8780) 2024-09-26 11:22:18 +08:00
StyleZhang
1d027fa065 fix: chat check inputs form 2024-09-26 11:07:08 +08:00
Joel
9ce9a52a86 fix: text not string 2024-09-26 10:15:20 +08:00
JzoNg
c74424ed85 fix text-generation files 2024-09-26 08:45:50 +08:00
Qun
fefbc43fb0 chore: fix comfyui tool doc url (#8775) 2024-09-26 08:18:13 +08:00
Bowen Liang
a8b837c4a9 dep: bump ElasticSearch from 8.14.x to 8.15.x (#8197) 2024-09-25 22:55:24 +08:00
Pan, Wen-Ming
02ff6cca70 feat: add support for Vertex AI Gemini 1.5 002 and experimental models (#8767) 2024-09-25 21:27:26 +08:00
JzoNg
719ef9cef9 text generation run 2024-09-25 20:34:48 +08:00
NFish
ef47f68e4a fix: the translation result may cause a different meaning (#8763) 2024-09-25 18:25:06 +08:00
StyleZhang
0ab525a691 fix: file size 2024-09-25 16:38:20 +08:00
StyleZhang
6fdcf6ee21 file-uploader 2024-09-25 16:24:02 +08:00
Hash Brown
2ef8b187fa Add GitHub Actions Workflow for Web Tests (#8753) 2024-09-25 15:50:51 +08:00
Joel
d01e97c1fc fix: tiny select ui 2024-09-25 15:44:02 +08:00
Joel
87e560de8a chore: type value change to array 2024-09-25 15:19:34 +08:00
zhuiyue132
b0927c39fb fix: expose the configuration of HTTP request node to Docker (#8716)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-09-25 15:06:54 +08:00
JzoNg
f8d26e46ac text-generation support file type 2024-09-25 15:01:56 +08:00
cherryhuahua
d0e0111f88 fix:Spark's large language model token calculation error #7911 (#8755) 2024-09-25 14:51:42 +08:00
zhuhao
2328944987 chore: apply ruff reformat for python-client sdk (#8752) 2024-09-25 14:48:06 +08:00
Joel
195ac19774 chore: list filter transfer value to array 2024-09-25 13:41:56 +08:00
Joel
0281eb796d chore: transfer value to string array 2024-09-25 12:59:19 +08:00
StyleZhang
9fe2f321ae fix: chatflow check start node form 2024-09-25 12:21:53 +08:00
非法操作
cb1942c242 chore: make url display in the middle of http node (#8741) 2024-09-25 11:27:17 +08:00
StyleZhang
5f76e665a1 fix: file extension 2024-09-25 11:24:14 +08:00
crazywoola
bf64ff215b fix: . is missing in file_extension (#8736) 2024-09-25 10:09:20 +08:00
ybalbert001
68c7e68a8a Fix Issue: switch LLM of SageMaker endpoint doesn't take effect (#8737)
Co-authored-by: Yuanbo Li <ybalbert@amazon.com>
2024-09-25 09:12:35 +08:00
ice yao
91f70d0bd9 Add embedding models in fireworks provider (#8728) 2024-09-25 08:47:11 +08:00
Jyong
4669eb24be add embedding input type parameter (#8724) 2024-09-24 21:53:50 +08:00
Sa Zhang
debe5953a8 Fix/update jina ai products labels and descriptions (#8730)
Co-authored-by: sa zhang <sa.zhang@jina.ai>
2024-09-24 21:19:49 +08:00
Shota Totsuka
1c7877b048 fix: remove harm category setting from vertex ai (#8721) 2024-09-24 20:53:26 +08:00
StyleZhang
81568752c0 fix: file from link 2024-09-24 17:47:01 +08:00
非法操作
9ca2e2c968 chore: remove windows platform timezone set (#8712) 2024-09-24 17:33:29 +08:00
JzoNg
ceb1dde714 Merge branch 'main' into jzh 2024-09-24 17:30:56 +08:00
zxhlyh
f42ef0624d fix: embedded chat on ios (#8718) 2024-09-24 17:23:11 +08:00
JzoNg
3209fdca53 legacy for sys.files 2024-09-24 17:08:23 +08:00
JzoNg
dc5010d833 fix step run of file type 2024-09-24 16:24:46 +08:00
Joel
8b26ae6532 fix: http file node not added 2024-09-24 15:18:39 +08:00
ice yao
64baedb484 fix: update nomic model provider token calculation (#8705) 2024-09-24 14:04:07 +08:00
Joel
66953d57a2 chore: use new add sub variables 2024-09-24 13:56:06 +08:00
Benjamin
4638f99aaa fix: change model provider name issue Ref #8691 (#8710) 2024-09-24 13:26:58 +08:00
AAEE86
aebe5fc68c fix: Remove unsupported parameters in qwen model (#8699) 2024-09-24 13:06:21 +08:00
StyleZhang
afc9630cd0 merge main 2024-09-24 11:44:38 +08:00
zhuhao
1ecf70dca0 feat: add mixedbread as a new model provider (#8523) 2024-09-24 11:20:15 +08:00
Joel
7e8bafe186 feat: support file size unit 2024-09-24 11:12:41 +08:00
Joel
6c5fcd1ffc fix: "all of" not shown in the right place 2024-09-24 10:55:37 +08:00
ybalbert001
7c485f8bb8 fix llm integration problem: It doesn't work on docker env (#8701)
Co-authored-by: Yuanbo Li <ybalbert@amazon.com>
2024-09-24 10:33:30 +08:00
themanforfree
21e9608b23 feat: add xinference sd web ui api tool (#8385)
Signed-off-by: themanforfree <themanforfree@gmail.com>
2024-09-24 10:20:06 +08:00
Sa Zhang
7f1b028840 fix: change the brand name to Jina AI (#8691)
Co-authored-by: sa zhang <sa.zhang@jina.ai>
2024-09-23 21:39:26 +08:00
Nam Vu
bef83a4d2e fix: typos and improve naming conventions (#8687) 2024-09-23 21:32:58 +08:00
crazywoola
8cc9e68363 fix: prompt for the follow-up suggestions (#8685) 2024-09-23 20:00:34 +08:00
ice yao
d7aada38a1 Add nomic embedding model provider (#8640) 2024-09-23 19:57:21 +08:00
Vikey Chen
4f69adc8ab fix: document_create_args_validate (#8569) 2024-09-23 18:45:10 +08:00
Likename Haojie
52da5b16e7 fix bug: tts (stream) not working on iOS Safari (17.1+) (#8645)
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-23 18:44:24 +08:00
Hash Brown
11d09a92d0 fix: send message error when last sent message not succeeded (#8682) 2024-09-23 18:44:09 +08:00
Nam Vu
c7eacd1aac chore: Optimize I18nObject class for better performance and readability (#8681) 2024-09-23 18:40:40 +08:00
Joel
7602d22133 chore: files name 2024-09-23 18:33:14 +08:00
Joel
5ec91e8507 feat: if node render files sub vars 2024-09-23 18:25:11 +08:00
StyleZhang
466966f027 file input error tip 2024-09-23 17:27:06 +08:00
AAEE86
a126d535cf add Spark Max-32K (#8676) 2024-09-23 16:39:46 +08:00
AAEE86
3554a803e7 add zhipuai web search (#8668) 2024-09-23 16:19:42 +08:00
AAEE86
c66cecaa55 add Qwen model translate (#8674) 2024-09-23 16:18:55 +08:00
非法操作
b37954b966 fix: png avatar upload as jpeg (#8665) 2024-09-23 15:33:06 +08:00
Bowen Liang
86f90fd9ff chore: skip PLR6201 linter rule (#8666) 2024-09-23 15:28:57 +08:00
haike-1213
4c7beb9d7b fix: Assignment exception (#8663)
Co-authored-by: fum <fum@investoday.com.cn>
2024-09-23 15:23:52 +08:00
Aaron Ji
3618a97c20 feat: extend api params for Jina Embeddings V3 (#8657) 2024-09-23 13:45:09 +08:00
JzoNg
212d04ea27 form inputs hide handle 2024-09-23 10:25:35 +08:00
Shota Totsuka
03fdf5e7f8 chore: Enable Japanese descriptions for Tools (#8646) 2024-09-23 09:06:01 +08:00
Hiroshi Fujita
cae73b9a32 Make WORKFLOW_* configurable as environment variables. (#8644) 2024-09-23 09:05:02 +08:00
zhuhao
e34f04380d feat: add deepseek-v2.5 for model provider siliconflow (#8639) 2024-09-22 21:44:06 +08:00
zhuhao
6df77038a2 docs: fix predefined_model_scale_out.md redirect error (#8633) 2024-09-22 16:45:45 +08:00
zhuhao
45c0a44411 feat: add qwen2.5 for model provider siliconflow (#8630) 2024-09-22 16:42:34 +08:00
Hash Brown
2d869d6831 fix: send message error when chatting with opening statement (#8627) 2024-09-22 16:41:40 +08:00
Nam Vu
eaa7e9b1f0 fix: llm_generator.py JSONDecodeError (#8504) 2024-09-22 14:02:12 +08:00
Nam Vu
6e37750fbd fix: commands.py (#8483) 2024-09-22 13:41:09 +08:00
omr
8fd297f8b4 fix: redundant check for available_document_count (#8491) 2024-09-22 13:39:41 +08:00
Nam Vu
ddf6569dc5 chore: enhance configuration descriptions (#8624) 2024-09-22 13:38:41 +08:00
CXwudi
97895ec41a chore: add Gemini newest experimental models (close #7121) (#8621) 2024-09-22 13:38:08 +08:00
sino
6d56d5c1f6 feat: support o1 series models for openrouter (#8358) 2024-09-22 10:23:50 +08:00
HJY
6c2fa8defc fix: form input add tabIndex (#8478) 2024-09-22 10:14:43 +08:00
AAEE86
c9f1e18df1 Add model parameter translation (#8509)
Co-authored-by: swingchen01 <swings@126.com>
Co-authored-by: 陈长君 <chenchangjun@shuwen.com>
2024-09-22 10:14:33 +08:00
Waffle
740fad06c1 feat(tools/cogview): Updated cogview tool to support cogview-3 and the latest cogview-3-plus (#8382) 2024-09-22 10:14:14 +08:00
ice yao
0665268578 Add Fireworks AI as new model provider (#8428) 2024-09-22 10:13:00 +08:00
呆萌闷油瓶
c8b9bdebfe feat:use xinference tts stream mode (#8616) 2024-09-22 10:08:35 +08:00
Shota Totsuka
a587f0d3f1 docs: Add Japanese documentation for tools (#8469) 2024-09-22 09:04:00 +08:00
Hash Brown
8c51d06222 feat: regenerate in Chat, agent and Chatflow app (#7661) 2024-09-22 03:15:11 +08:00
Joe
b32a7713e0 feat: update pyproject.toml (#8368) 2024-09-21 23:59:50 +08:00
zhuhao
831c5a93af refactor(ops): Optimize the iteration for filter_none_values and use logging.error to record logs when an exception occurs (#8461) 2024-09-21 22:56:37 +08:00
AAEE86
1a8dcae10e add Qwen custom add model interface (#8565) 2024-09-21 22:52:10 +08:00
Nam Vu
8219f9e090 fix: api/core/ops/ops_trace_manager.py (#8501) 2024-09-21 20:49:01 +08:00
AAEE86
5ddb601e43 add MixtralAI Model (#8517) 2024-09-21 18:08:07 +08:00
Hongbin
5541248264 Update the PerfXCloud provider model list, update PerfXCloudProvider validate_provider_credentials method. (#8587)
Co-authored-by: xhb <466010723@qq.com>
2024-09-21 17:33:15 +08:00
WalterMitty
b3cb97f0ad docs: Update ssrf_proxy related doc link in docker-compose file (#8516) 2024-09-21 17:31:49 +08:00
方程
e75c33a561 Enhance Readme Documentation to Clarify the Importance of Celery Service (#8558) 2024-09-21 17:30:58 +08:00
github-actions[bot]
483ead55d5 chore: translate i18n files (#8557)
Co-authored-by: iamjoel <2120155+iamjoel@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-09-21 17:30:43 +08:00
非法操作
d63a5a1c3c fix: a helper link error (#8508) 2024-09-21 17:30:30 +08:00
takatost
e0a3307563 fix(workflow): "Max submit count reached" error occurred when executing workflow as tool in iteration (#8595) 2024-09-20 19:47:25 +08:00
-LAN-
7f3282ec04 Update version to 0.8.3 in packaging and docker-compose files (#8590) 2024-09-20 18:24:03 +08:00
非法操作
b773ebdab1 chore: fix webpack dependencies order (#8542) 2024-09-20 18:09:35 +08:00
StyleZhang
0cb50dd4a5 fix: workflow inputs panel 2024-09-20 18:00:35 +08:00
Qun
1583283635 ComfyUI tool use the new internal enumeration class "VariableKey" (#8533) 2024-09-20 17:42:47 +08:00
StyleZhang
ab19fccf3d single file 2024-09-20 17:20:28 +08:00
Su Yang
c87f710d58 Fix: update qwen model and model config (#8584)
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-09-20 17:05:57 +08:00
StyleZhang
4ed46e3fed fix: uploader 2024-09-20 15:44:49 +08:00
Su Yang
1568c5cae9 fix: fix qwen series model type (#8580) 2024-09-20 15:29:33 +08:00
Joel
9fd2f798ff feat: add type placeholder in picker 2024-09-20 15:19:38 +08:00
Joel
146be41b1d fix: if show var types 2024-09-20 14:59:46 +08:00
Joel
ce6ae5732a fix: not show the right output type 2024-09-20 14:42:53 +08:00
StyleZhang
edf462c640 fix: uploader 2024-09-20 12:08:03 +08:00
JzoNg
d580fc1e9d fix features initialization 2024-09-20 11:03:45 +08:00
MuYu
a03919c3b3 feat: add hunyuan-vision (#8529) 2024-09-19 18:08:01 +08:00
Joel
5544791031 feat: doc extract support both file and file array 2024-09-19 17:55:58 +08:00
JzoNg
099746dd59 fix style of tracing 2024-09-19 17:40:27 +08:00
Joel
7411bcf167 chore: improve delimiter (#8552) 2024-09-19 17:40:20 +08:00
Jyong
d96f5ba1ca add storage error log (#8556) 2024-09-19 17:34:12 +08:00
StyleZhang
c6f53c9030 Merge branch 'main' into feat/attachments 2024-09-19 17:11:28 +08:00
Su Yang
d6de96c4b4 feat: sync Qwen API with Aliyun Bailian (#8538) 2024-09-19 17:08:59 +08:00
StyleZhang
8236f8fed8 fix: file uploader 2024-09-19 17:08:02 +08:00
StyleZhang
2b0c39ed3f file in chat question 2024-09-19 16:55:14 +08:00
JzoNg
396a240e68 test run 2024-09-19 16:49:30 +08:00
JzoNg
8bd9d8f6ba remove chat input 2024-09-19 15:45:30 +08:00
JzoNg
aa7ae4c5f1 chore: remove console 2024-09-19 15:45:30 +08:00
takatost
ffd2f61dd9 fix: thread_pool submit count in parallel workflow not releasing (#8549) 2024-09-19 15:34:56 +08:00
StyleZhang
49b7acf52e fix: file uploader 2024-09-19 14:52:22 +08:00
takatost
54b9e1f6d1 fix: ci issues(missing duckduckgo-search==6.2.11, ruff lint issue) (#8543) 2024-09-19 11:43:00 +08:00
StyleZhang
466ac987f5 fix: file type icon 2024-09-19 11:12:18 +08:00
StyleZhang
49972939a9 file icon 2024-09-19 11:06:38 +08:00
HJY
2721cb8dee feat: add format util unit and add pre-commit unit check (#8427) 2024-09-19 10:39:27 +08:00
StyleZhang
80f167ca02 file upload limit 2024-09-18 18:12:03 +08:00
JzoNg
f652ae0d98 step run 2024-09-18 18:07:03 +08:00
Joel
4dbf56675a fix: change to backend doc extractor 2024-09-18 17:35:14 +08:00
NFish
41bea4cafa validate user permission before enter app detail page (#8527) 2024-09-18 16:54:04 +08:00
StyleZhang
f5d1f5a20a file uploader 2024-09-18 16:50:53 +08:00
StyleZhang
fd9b71c4d7 file uploader 2024-09-18 16:36:55 +08:00
Joel
1df41cef4c fix: doc extract not export text 2024-09-18 15:23:53 +08:00
Joel
602d2486bd fix: iteration file array type 2024-09-18 15:03:02 +08:00
Wang Bo
6f222b49f2 refactor: rename task_type to task for jina embeddings v3 (#8488) 2024-09-18 14:53:15 +08:00
-LAN-
8dfe8c773a chore: Deprecate gpt-3.5-turbo-0613 and gpt-3.5-turbo-16k-0613 models (#8500) 2024-09-18 14:38:09 +08:00
JzoNg
403fede432 fix basic app publish 2024-09-18 14:01:20 +08:00
JzoNg
9f66e6e357 fix feature bar in basic chatbot 2024-09-18 13:36:11 +08:00
JzoNg
affb2e38a1 fix typo of icon path 2024-09-18 13:14:18 +08:00
Joel
31d87f85b8 merge 2024-09-18 11:57:48 +08:00
JzoNg
54105e85ff fix icon 2024-09-18 11:21:38 +08:00
Joel
5ec604500c chore: change field to backend 2024-09-18 11:15:36 +08:00
Qun
cf645c3ba1 feat: Add ComfyUI tool for Stable Diffusion (#8160) 2024-09-18 10:56:29 +08:00
JzoNg
96d2582d89 file var in form 2024-09-18 10:44:59 +08:00
JzoNg
a10b0db102 vision setting 2024-09-18 10:44:59 +08:00
zhuhao
e896d1e9d7 chore: update the .gitignore file to include opensearch,pgvector,and myscale (#8470) 2024-09-17 22:54:22 +08:00
Xiao Ley
6dba68f62d feat: Add base URL settings and secure_ascii options to the Brave search tool (#8463)
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-15 17:38:43 +08:00
非法操作
3d083b758f feat: add flux dev of siliconflow image-gen tool (#8450) 2024-09-15 17:14:12 +08:00
Nam Vu
aa5b2db10a chore: workflow BRANCH, PARALLEL i18n (#8452) 2024-09-15 17:13:39 +08:00
Hirotaka Miyagi
b73faae0d0 fix(RunOnce): change to form submission instead of onKeyDown and onClick (#8460) 2024-09-15 17:09:47 +08:00
Ying Wang
4788e1c8c8 [Python SDK] Add KnowledgeBaseClient and the corresponding test cases. (#8465)
Co-authored-by: Wang Ying <wangying@xkool.org>
2024-09-15 17:08:52 +08:00
takatost
bf16de50fe fix: internal error when tool authorization (#8449) 2024-09-14 21:50:02 +08:00
Jyong
7e611ffbf3 multi-retrival use dataset's top-k (#8416) 2024-09-14 21:48:44 +08:00
zhuhao
65162a87b6 fix: docker-compose.middleware.yaml starts the Weaviate container by default (#8446) (#8447) 2024-09-14 21:48:24 +08:00
Charlie.Wei
445497cf89 add svg render & Image preview optimization (#8387)
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-14 19:24:53 +08:00
Ying Wang
fa1af8e47b add WorkflowClient.get_result, increase version number (#8435)
Co-authored-by: wangying <wangying@xkool.org>
2024-09-14 19:06:37 +08:00
Nam Vu
624331472a fix: Improve scrolling behavior for Conversation Opener (#8437)
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-14 19:05:19 +08:00
走在修行的大街上
72b7f8a949 Bugfix/fix feishu plugins (#8443)
Co-authored-by: 黎斌 <libin.23@bytedance.com>
2024-09-14 18:59:06 +08:00
takatost
88c9834ef2 chore(workflow): Optimize the iteration when selecting a variable from a branch in the output variable causes iteration index err (#8440) 2024-09-14 18:02:43 +08:00
Yi Xiao
d882348f39 fix: delete the delay for the tooltips inside the add tool panel (#8436) 2024-09-14 17:24:31 +08:00
ybalbert001
b6ad7a1e06 Fix: https://github.com/langgenius/dify/issues/8190 (Update Model nam… (#8426)
Co-authored-by: Yuanbo Li <ybalbert@amazon.com>
2024-09-14 17:14:18 +08:00
Aaron Ji
6f7625fa47 chore: update Jina embedding model (#8376) 2024-09-14 16:21:17 +08:00
yanxiyue
de7bc22649 fix: sys_var starts with 'sys.' not 'sys' #8421 (#8422)
Co-authored-by: wuling <wuling@ke.com>
2024-09-14 15:16:12 +08:00
kurokobo
52857dc0a6 feat: allow users to specify timeout for text generations and workflows by environment variable (#8395) 2024-09-14 14:11:45 +08:00
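
Exposing a timeout as an environment variable is a read-with-default at startup; a sketch, where the variable name follows the commit's description but is an assumption here:

```python
import os

# Milliseconds to wait for a text-generation task before giving up;
# the variable name and default value are assumptions for illustration.
TEXT_GENERATION_TIMEOUT_MS = int(os.environ.get("TEXT_GENERATION_TIMEOUT_MS", "60000"))
```
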
KVOJJJin
032dd93b2f Fix: operation postion of answer in logs (#8411)
Co-authored-by: Yi <yxiaoisme@gmail.com>
2024-09-14 14:08:31 +08:00
Pika
5b18e851d2 fix: when the variable does not exist, an error should be prompted (#8413)
Co-authored-by: Chen(MAC) <chenchen404@outlook.com>
2024-09-14 14:08:10 +08:00
takatost
f01602b570 fix(workflow): the answer node after the iteration node containing the answer was output prematurely (#8419) 2024-09-14 14:02:09 +08:00
HowardChan
0123498452 fix:logs and rm unused codes in CacheEmbedding (#8409) 2024-09-14 12:56:45 +08:00
swingchen01
f55e06d8bf fix: resolve runtime error when self.folder is None (#8401)
Co-authored-by: 陈长君 <chenchangjun@shuwen.com>
2024-09-14 11:07:16 +08:00
ybalbert001
b613b11422 Fix: Support Bedrock cross region inference #8190 (Update Model name to distinguish between different region groups) (#8402)
Co-authored-by: Yuanbo Li <ybalbert@amazon.com>
2024-09-14 11:06:20 +08:00
Incca
8efae1cba2 fix(docker): aliyun oss path env key (#8394) 2024-09-14 09:52:59 +08:00
Nam Vu
bf55b1910f fix: pyproject.toml typo (#8396) 2024-09-14 09:45:49 +08:00
crazywoola
71b4480c4a fix: o1-mini 65563 -> 65536 (#8388) 2024-09-14 02:39:58 +08:00
Yeuoly
b6b1057a18 fix: sandbox issue related httpx and requests (#8397) 2024-09-14 02:02:55 +08:00
Bowen Liang
5b98acde2f chore: improve usage of striping prefix or suffix of string with Ruff 0.6.5 (#8392) 2024-09-13 23:34:39 +08:00
Bowen Liang
aad6f340b3 fix (#8322 followup): resolve the violation of pylint rules (#8391) 2024-09-13 23:19:36 +08:00
Bowen Liang
a1104ab97e chore: refurbish python code by applying Pylint linter rules (#8322) 2024-09-13 22:42:08 +08:00
xiandan-erizo
1ab81b4972 support hunyuan-turbo (#8372)
Co-authored-by: sunkesi <sunkesi@hosecloud.com>
2024-09-13 20:21:48 +08:00
非法操作
06b66216d7 chore: update firecrawl scrape to V1 api (#8367) 2024-09-13 20:02:00 +08:00
takatost
cd3eaed335 fix(workflow): both parallel and single branch errors occur in if-else (#8378) 2024-09-13 19:55:54 +08:00
StyleZhang
5dd556b4c8 file uploader 2024-09-13 17:31:10 +08:00
Joel
9d80d7def7 fix: edit load balancing not pass id (#8370) 2024-09-13 17:15:03 +08:00
StyleZhang
a4c6d0b94b file uploader 2024-09-13 16:46:16 +08:00
Joe
84ac5ccc8f fix: add before send to remove langfuse defaultErrorResponse (#8361) 2024-09-13 16:08:08 +08:00
Joel
5dfd7abb2b fix: when edit load balancing config not pass the empty filed value hidden (#8366) 2024-09-13 16:05:26 +08:00
takatost
24af4b9313 fix: o1-series model encounters an error when the generate mode is blocking (#8363) 2024-09-13 15:37:54 +08:00
Bowen Liang
6613b8f2e0 chore: fix unnecessary string concatenation in single line (#8311) 2024-09-13 14:24:49 +08:00
-LAN-
08c486452f fix: score_threshold handling in vector search methods (#8356) 2024-09-13 14:24:35 +08:00
sino
a45ac6ab98 fix: ark token usage is none (#8351) 2024-09-13 14:19:24 +08:00
-LAN-
80a322aaa2 chore: update version to 0.8.2 in packaging and docker-compose files (#8352) 2024-09-13 13:45:13 +08:00
Joe
82f7875a52 feat: add langfuse sentry ignore error (#8353) 2024-09-13 13:44:19 +08:00
takatost
4637ddaa7f feat: add o1-series models support in Agent App (ReACT only) (#8350) 2024-09-13 13:08:27 +08:00
Yi Xiao
8d2269f762 fix: copy and paste shortcut in the textarea of the workflow run panel (#8345) 2024-09-13 12:20:56 +08:00
fanlia
5f03e66489 Feature/service api workflow logs (#8323) 2024-09-13 11:03:57 +08:00
Pika
a9c1f1a041 fix(workflow): fix var-selector not update when edges change (#8259)
Co-authored-by: Chen(MAC) <chenchen404@outlook.com>
2024-09-13 11:03:39 +08:00
Jyong
49cee773c5 fixed score threshold is none (#8342) 2024-09-13 10:21:58 +08:00
-LAN-
c78828ab7c chore: update Dify version to 0.8.1 (#8329) 2024-09-13 02:48:24 +08:00
takatost
e90d3c29ab feat: add OpenAI o1 series models support (#8328) 2024-09-13 02:15:19 +08:00
Nam Vu
153807f243 fix: response_format label (#8326) 2024-09-12 23:17:29 +08:00
Ikko Eltociear Ashimine
5db0b56c5b docs: update lambda_translate_utils.yaml (#8293) 2024-09-12 20:33:07 +08:00
Tamer
404db1ae5b Fix VariableEntityType Bug external-data-tool -> external_data_tool (#8299) 2024-09-12 20:27:55 +08:00
呆萌闷油瓶
02c4b1af71 chore:add Azure openai api version 2024-08-01-preview (#8291) 2024-09-12 20:22:57 +08:00
crazywoola
aa11659062 Revert "Feat: update app published time after clicking publish button" (#8320) 2024-09-12 20:06:06 +08:00
ybalbert001
d4985fb3aa Fix: Support Bedrock cross region inference [#8190](https://github.com/langgenius/dify/issues/8190) (#8317) 2024-09-12 19:15:20 +08:00
Bowen Liang
8815511ccb chore: apply flake8-pytest-style linter rules (#8307) 2024-09-12 18:09:16 +08:00
Bowen Liang
40fb4d16ef chore: refurbish Python code by applying refurb linter rules (#8296) 2024-09-12 15:50:49 +08:00
Bowen Liang
c69f5b07ba chore: apply ruff E501 line-too-long linter rule (#8275)
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-09-12 14:00:36 +08:00
takatost
56c90e212a fix(workflow): missing content in the answer node stream output during iterations (#8292)
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-09-12 13:59:48 +08:00
StyleZhang
323a835de9 Merge branch 'main' into feat/attachments 2024-09-12 13:53:41 +08:00
Bowen Liang
0f14873255 chore: cleanup ruff flake8-simplify linter rules (#8286)
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-09-12 12:55:45 +08:00
zxhlyh
0bb7569d46 fix: markdown paragraph margin (#8289) 2024-09-12 11:28:14 +08:00
Kevin9703
ec57922bb6 fix(workflow/hooks/use-shortcuts): resolve issue of copy shortcut not working in workflow debug and preview panel (#8249)
Co-authored-by: Yi <yxiaoisme@gmail.com>
2024-09-12 10:39:18 +08:00
Bowen Liang
781d294f49 chore: cleanup pycodestyle E rules (#8269) 2024-09-11 18:55:00 +08:00
StyleZhang
0076577764 file uploader 2024-09-11 18:25:49 +08:00
yalei
f515af2232 let claude models in bedrock support the response_format parameter (#8220)
Co-authored-by: duyalei <>
2024-09-11 18:24:50 +08:00
DDDDD12138
fe8191b899 enhance: improve empty data display for detail panel (#8266) 2024-09-11 18:24:18 +08:00
crazywoola
4d2cd6703b chore: remove useless code (#8198) 2024-09-11 18:19:34 +08:00
Bowen Liang
292220c596 chore: apply pep8-naming rules for naming convention (#8261) 2024-09-11 16:40:52 +08:00
HowardChan
53f37a6704 fix:ollama text embedding 500 error (#8252) 2024-09-11 16:23:19 +08:00
Leo.Wang
75c1a82556 Update Gitlab query field, add query by path (#8244) 2024-09-11 16:09:53 +08:00
Jason Tan
c5b3777d93 editor can also create api key (#8214) 2024-09-11 16:07:15 +08:00
非法操作
678bbf8fe8 fix: upload img icon mis-align in the chat input area (#8263) 2024-09-11 15:58:20 +08:00
Nam Vu
342607f4a4 fix: truthy value (#8208) 2024-09-11 15:44:53 +08:00
Joel
9a3b7345c4 fix: split line too long 2024-09-11 14:40:38 +08:00
StyleZhang
2ebf5f5ffa merge main 2024-09-11 13:40:36 +08:00
takatost
5f4cdd66fa fix(workflow): IF-ELSE nodes connected to the same subsequent node cause execution to stop (#8247) 2024-09-11 12:28:32 +08:00
zxhlyh
91942e37ff fix: workflow parallel limit in ifelse node (#8242) 2024-09-11 11:30:33 +08:00
Nam Vu
60913970dc fix: CHECK_UPDATE_URL comment (#8235) 2024-09-11 10:58:35 +08:00
HowardChan
82c42b9ec5 fix:error when adding the ollama embedding model (#8236)
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-11 10:25:45 +08:00
Thales Salazar
2a3d8c25bc fix: improving the regionalization of translation (#8231)
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-11 08:55:32 +08:00
takatost
cee0c51dbb feat: add from_variable_selector for stream chunk / message event (#8228) 2024-09-10 22:15:50 +08:00
takatost
fdbbdb706f fix(workflow): answers are output simultaneously across different branches in the question classifier node. (#8225) 2024-09-10 21:11:35 +08:00
takatost
f6dfe23cf8 fix(workflow): in multi-parallel execution with multiple conditional branches (#8221) 2024-09-10 21:09:18 +08:00
-LAN-
ffd4bf8bf0 fix(docker/docker-compose.yaml): Set default value for REDIS_SENTINEL_SOCKET_TIMEOUT and CELERY_SENTINEL_SOCKET_TIMEOUT (#8218) 2024-09-10 18:47:59 +08:00
Jyong
bb3002b173 revert page column (#8217) 2024-09-10 18:21:22 +08:00
Yi Xiao
d4dc54447a fix the tooltip in tool nodes (#8215) 2024-09-10 17:53:44 +08:00
Bowen Liang
d109881410 chore(api/models): apply ruff reformatting (#7600)
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-09-10 17:08:06 +08:00
Joel
d1605952b0 fix: input chat input wrong padding (#8207) 2024-09-10 17:01:32 +08:00
Bowen Liang
2cf1187b32 chore(api/core): apply ruff reformatting (#7624) 2024-09-10 17:00:20 +08:00
StyleZhang
02f494c0de merge main 2024-09-10 16:38:32 +08:00
github-actions[bot]
178730266d chore: translate i18n files (#8202)
Co-authored-by: takatost <5485478+takatost@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-09-10 16:13:26 +08:00
takatost
dabfd74622 feat: Parallel Execution of Nodes in Workflows (#8192)
Co-authored-by: StyleZhang <jasonapring2015@outlook.com>
Co-authored-by: Yi <yxiaoisme@gmail.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-09-10 15:23:16 +08:00
Joel
f0e81e3918 fix: doc extract and node isconversation item 2024-09-10 15:22:44 +08:00
Joel
aa8499efac fix: doc extract var type 2024-09-10 15:16:27 +08:00
Joel
ea40b1dcb2 fix: two scrollbar 2024-09-10 15:05:54 +08:00
crazywoola
5da0182800 docs: replace docker-compose with docker compose (#8195) 2024-09-10 15:02:52 +08:00
-LAN-
ed37439ef7 refactor(api/core): Improve type hints and apply ruff formatter in agent runner and model manager. (#8166) 2024-09-10 15:00:25 +08:00
Jyong
af92f19291 filter excel empty sheet (#8194) 2024-09-10 14:55:08 +08:00
Joel
a689cd6fd4 chore: var reference support theme 2024-09-10 14:49:27 +08:00
StyleZhang
32b6c7063a file uploader 2024-09-10 14:17:56 +08:00
zhuhao
86f7f245e4 fix: The length of the tag should between 1 and 50 (#8187) (#8188) 2024-09-10 14:07:06 +08:00
Jyong
2d690801d1 nvidia rerank top n missed (#8185) 2024-09-10 13:17:48 +08:00
Nam Vu
fede54be77 fix: Version '2.6.2-2' for 'expat' was not found (#8182) 2024-09-10 13:00:37 +08:00
Jyong
85ff82a694 code merge error (#8183)
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-10 12:52:50 +08:00
Joel
97056dad30 fix: file type var match page crash 2024-09-10 11:54:34 +08:00
Joel
264f7c2139 fix: file show type error 2024-09-10 10:59:13 +08:00
ice yao
c8df92d0eb add volcengine tos storage (#8164) 2024-09-10 09:19:47 +08:00
Bowen Liang
144d30d7ef chore: bump super-linter to v7 (#8148) 2024-09-10 09:13:48 +08:00
-LAN-
4313d92e6b feat(api/core/model_runtime/entities/defaults.py): Add TOP_K in default parameters. (#8167) 2024-09-10 09:11:31 +08:00
Nam Vu
0695543f63 Fix variable typo (cont) (#8161) 2024-09-09 23:46:13 +08:00
crazywoola
0bec6a037c update qwen-long (#8157) 2024-09-09 19:09:42 +08:00
Chenhe Gu
3ff9a1f24a Update LICENSE - remove 'SaaS' from restriction term definition (#8143) 2024-09-09 16:52:55 +08:00
crazywoola
a771eea4f6 fix: html raw render (#8138) 2024-09-09 16:12:59 +08:00
github-actions[bot]
61a0ca9e0d chore: translate i18n files (#8135)
Co-authored-by: zxhlyh <16177003+zxhlyh@users.noreply.github.com>
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-09 15:54:00 +08:00
cr-zhichen
551b33c8e5 fix: user-select style and pre-create iframe in embed.js (#8093) 2024-09-09 15:40:56 +08:00
AAEE86
fa34b9aed6 Modify model parameters in Spark LLMs and zhipuai LLMs (#8078)
Co-authored-by: Charlie.Wei <luowei@cvte.com>
2024-09-09 15:36:47 +08:00
zxhlyh
bbb609179f chore: offline n to 1 retrieval (#8134) 2024-09-09 15:32:02 +08:00
crazywoola
a27d4d58ec fix: ollama text embedding 500 error (#8131) 2024-09-09 15:27:49 +08:00
Joel
007a6fd14a chore: other file types placeholder add + 2024-09-09 15:24:50 +08:00
Joel
c159b7a781 chore: transform field type css to tailwind and multi theme 2024-09-09 15:15:08 +08:00
zhuhao
50d92f0fd4 add dify-sandbox health check in docker-compose.yaml (#8121) (#8124) 2024-09-09 14:39:06 +08:00
Joel
6c9c3faf78 fix: allow file extensions remove . 2024-09-09 14:36:56 +08:00
StyleZhang
d933ebb845 file input 2024-09-09 11:27:35 +08:00
邹成卓
a15791e788 Fix: tongyi code wrapper is not stable (#7871)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-09 11:15:17 +08:00
ybalbert001
954580a4af feat: support more model types and builtin tools on aws/sagemaker (#8061)
Co-authored-by: Yuanbo Li <ybalbert@amazon.com>
2024-09-09 10:34:11 +08:00
crazywoola
ab7d79275e fix: Claude cannot validate credentials (#8109) 2024-09-09 10:22:42 +08:00
Thales Salazar
d3658166fb Translate billing to PT-BR (#8105)
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-09 10:16:22 +08:00
Su Yang
54b72bdd0a chore: keep dify compose file consistent format (#8102) 2024-09-09 08:30:03 +08:00
呆萌闷油瓶
d28446301f feat:add fishaudio in xinference (#8100) 2024-09-08 23:58:02 +08:00
crazywoola
9050f92e5b fix: parameter input (#8076) 2024-09-08 15:43:55 +08:00
Charlie.Wei
feefeb44d7 fix LangSmith project config error (#7996) 2024-09-08 13:25:27 +08:00
Zhi
d542b15cc0 feat: support redis sentinel mode (#7756) 2024-09-08 13:23:51 +08:00
Nam Vu
2d7954c7da Fix variable typo (#8084) 2024-09-08 13:14:11 +08:00
crazywoola
b1918dae5e fix: knowledge input (#8065) 2024-09-07 17:53:39 +08:00
Nam Vu
031a0b576d fix: i18n typo (#8077) 2024-09-07 16:59:38 +08:00
AAEE86
0cef25ef8c Revert "fix: parameter rule" (#8070) 2024-09-07 10:44:56 +08:00
Yi Xiao
cdb08be951 fix: overflow issues in chat history (#8062) 2024-09-06 19:20:18 +08:00
crazywoola
900fd82a92 fix: parameter rule (#8064) 2024-09-06 19:15:24 +08:00
crazywoola
44f963f281 If else add regexmatch (#8059)
Co-authored-by: 罗威 <luowei@cvte.com>
2024-09-06 18:35:51 +08:00
Charlie.Wei
01858e1caf ifEsle node add regex match (#8007) 2024-09-06 17:44:09 +08:00
ChengZi
2060db8e11 fix: change milvus init args from (host, port) to (url, token) (#8019)
Signed-off-by: ChengZi <chen.zhang@zilliz.com>
2024-09-06 17:32:48 +08:00
Nam Vu
9ded063417 chore: #7348, support query conversations by updated_at (#8047) 2024-09-06 17:31:51 +08:00
Yi Xiao
d72da2777c fix the tooltip in tools node (#8055) 2024-09-06 17:28:22 +08:00
tmuife
89aede80cc Add OCI(Oracle Cloud Infrastructure) Generative AI Service as a Model Provider (#7775)
Co-authored-by: Walter Jin <jinshuhaicc@gmail.com>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: walter from vm <walter.jin@oracle.com>
2024-09-06 14:15:40 +08:00
zhuhao
e0d3cd91c6 support huawei cloud obs storage (#7980) (#7981) 2024-09-06 14:00:47 +08:00
winsonwhe
1a054ac1f4 Update milvus-standalone version and expose required ports for the container. (#7709)
Co-authored-by: Jyong <76649700+JohnJyong@users.noreply.github.com>
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-06 12:01:59 +08:00
Charlie.Wei
3230f4a0ec Message rendering (#6868)
Co-authored-by: luowei <glpat-EjySCyNjWiLqAED-YmwM>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-09-05 21:00:09 +08:00
Designerxsh
dadca0f91a Fix/datasets api description error (#8025) 2024-09-05 16:45:44 +08:00
Byeongjin Kang
d489b8b3e0 feat: return page number of pdf documents upon retrieval (#7749) 2024-09-05 16:43:26 +08:00
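Returning page numbers with retrieved chunks means carrying the page index in each chunk's metadata at extraction time. A minimal sketch with pypdf, which is illustrative and not necessarily the extractor Dify uses:

```python
from pypdf import PdfReader

reader = PdfReader("manual.pdf")

# Keep the 1-based page number next to each page's text so retrieval
# results can cite it later.
documents = [
    {"page_content": page.extract_text() or "", "metadata": {"page": number}}
    for number, page in enumerate(reader.pages, start=1)
]
print(documents[0]["metadata"])  # {'page': 1}
```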
Leng Yue
bd0992275c feat: support fish audio TTS (#7982) 2024-09-05 14:18:39 +08:00
非法操作
3e7597f2bd feat: add gpt-4o-2024-08-06 and json_schema for azure openAI service (#7648) 2024-09-04 21:56:08 +08:00
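The json_schema response format constrains gpt-4o-2024-08-06 to emit JSON matching a declared schema. A minimal sketch against the Azure OpenAI API; the endpoint, key, and deployment name are placeholders:

```python
import json

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",  # placeholder
    api_key="...",                                          # placeholder
    api_version="2024-08-01-preview",
)

completion = client.chat.completions.create(
    model="gpt-4o-2024-08-06",  # Azure deployment name
    messages=[{"role": "user", "content": "Extract: Alice is 30."}],
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": "person",
            "strict": True,
            "schema": {
                "type": "object",
                "properties": {"name": {"type": "string"}, "age": {"type": "integer"}},
                "required": ["name", "age"],
                "additionalProperties": False,
            },
        },
    },
)
print(json.loads(completion.choices[0].message.content))
```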
Jyong
0e71f6db84 fix missing splitter length (#7987) 2024-09-04 21:47:12 +08:00
wochuideng
f6b9982c23 Fix exception when obtaining the token during concurrent calls to the Wenxin model (#7976)
Co-authored-by: puqs1 <puqs1@lenovo.com>
2024-09-04 21:44:57 +08:00
github-actions[bot]
fb113a9479 chore: translate i18n files (#7965)
Co-authored-by: JohnJyong <76649700+JohnJyong@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Hanqing Zhao <sherry9277@gmail.com>
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-04 17:45:12 +08:00
Jyong
15791510c8 fix wrong error message (#7972) 2024-09-04 16:46:41 +08:00
非法操作
0f72a8e89d chore: refactor the baichuan model (#7953) 2024-09-04 16:22:31 +08:00
JzoNg
b60c7a5826 vision config 2024-09-04 15:45:33 +08:00
KVOJJJin
14af87527f Feat: remove estimation of embedding cost (#7950)
Co-authored-by: jyong <718720800@qq.com>
2024-09-04 14:41:47 +08:00
zhuhao
83e84865be feat: add health check for pg and redis in docker-compose.middleware.yaml (#7961) (#7962) 2024-09-04 14:25:46 +08:00
Joel
c2a3c5a748 fix: get commit sha failed in translate action (#7959) 2024-09-04 13:13:21 +08:00
呆萌闷油瓶
83494cb4f5 fix: empty voice when using the xinference CosyVoice TTS model (#7958) 2024-09-04 13:04:31 +08:00
Vico Chu
0bc19c3fbf Feat: update app published time after clicking publish button (#7801) 2024-09-04 13:03:06 +08:00
Sumkor
571415d1a4 fix: split text keep separator (#7930) 2024-09-04 12:59:10 +08:00
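One common reading of "keep separator" is that each separator stays attached to the chunk before it, so rejoining the chunks reproduces the original text. A minimal sketch; Dify's actual splitter may differ in details:

```python
import re


def split_keep_separator(text: str, separator: str) -> list[str]:
    # A capturing group makes re.split return the separators too.
    parts = re.split(f"({re.escape(separator)})", text)
    # Re-attach each separator to the chunk before it.
    chunks = ["".join(pair) for pair in zip(parts[0::2], parts[1::2])]
    if len(parts) % 2 == 1 and parts[-1]:
        chunks.append(parts[-1])  # trailing chunk with no separator
    return chunks


print(split_keep_separator("a. b. c", ". "))  # ['a. ', 'b. ', 'c']
```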
Yuki Oshima
7b2cf8215f chore: fix inverted index japanese translation (#7957) 2024-09-04 12:44:59 +08:00
JzoNg
0b94218378 remove unused components 2024-09-04 11:47:29 +08:00
Joel
97cc9a5615 feat: check for unset file item key 2024-09-04 11:28:36 +08:00
Joel
f6d0fd9848 feat: add check list filter value 2024-09-04 11:19:09 +08:00
Joel
b863dd7de2 fix: list filter init value 2024-09-04 10:47:26 +08:00
Joe
fee4d3f6ca feat: ops trace add llm model (#7306) 2024-09-04 10:39:00 +08:00
takatost
161cc0cda9 Revert "fix: an issue of keyword search feature in application log list" (#7949) 2024-09-04 10:00:55 +08:00
Nam Vu
71bff9fcf3 chore: #7943 i18n (#7948) 2024-09-04 09:42:25 +08:00
陳鈞
80d14c9b22 fix(api): Code-Based Extension cause error on position map sorting (#7934)
Signed-off-by: 陳鈞 <jim60105@gmail.com>
2024-09-04 08:41:12 +08:00
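Position-map sorting breaks when a code-based extension has no entry in the map. A minimal sketch of a tolerant variant that sorts unknown names last; the map shape here is illustrative:

```python
import math

# Hypothetical position map and extension list.
position_map = {"moderation": 1, "external_data_tool": 2}
extensions = ["external_data_tool", "moderation", "custom_ext"]

# Extensions absent from the map sort last instead of raising.
ordered = sorted(extensions, key=lambda name: position_map.get(name, math.inf))
print(ordered)  # ['moderation', 'external_data_tool', 'custom_ext']
```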
crazywoola
c5bdf08558 Chore/add roadmap (#7943) 2024-09-04 08:33:02 +08:00
crazywoola
596f160a1e Chore/add default step 1x url (#7933) 2024-09-04 08:32:22 +08:00
Jyong
d8b6c053a2 fix rerank model value is empty string (#7937) 2024-09-03 21:25:21 +08:00
Nam Vu
4b262cae58 chore: #7603 i18n (#7931) 2024-09-03 19:19:52 +08:00
Jyong
1a5116cba0 Fix/segment create with api (#7928) 2024-09-03 18:14:47 +08:00
Jyong
01581dd35f improve the Notion table extraction (#7925) 2024-09-03 17:52:07 +08:00
JzoNg
b0e7a22a27 annotation reply 2024-09-03 17:19:11 +08:00
Joel
7fdd964379 fix: frontend handles cases where the server generates a malformed follow-up data struct (#7916) 2024-09-03 14:09:46 +08:00
Joel
0cfcc97e9d feat: support auto generate i18n translate (#6964)
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-03 10:17:05 +08:00
-LAN-
8986be0aab chore: Update versions to 0.7.3 (#7895) 2024-09-03 09:49:32 +08:00
-LAN-
f76bbbf5e6 chore(Dockerfile): Bump expat to 2.6.2-2 (#7904) 2024-09-03 09:48:30 +08:00
kurokobo
fe217da05c fix: correct typo in the setting screen (#7897) 2024-09-02 22:49:56 +08:00
kurokobo
80aa7c4019 feat: allow users to use the app icon as the answer icon (#7888)
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-02 20:00:41 +08:00
Jyong
6f33351eb3 ignore linked images when image id is none (#7890) 2024-09-02 19:37:05 +08:00
Alex
35f13c7327 Add Russian language (#7860)
Co-authored-by: d8rt8v <alex@ydertev.ru>
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-02 19:09:41 +08:00
JzoNg
565a835947 conversation opener 2024-09-01 15:00:23 +08:00
JzoNg
fe94c876fb multiple model message sending 2024-09-01 13:37:21 +08:00
JzoNg
67a34bdd7a app publish with new features 2024-09-01 13:10:40 +08:00
JzoNg
8c785e268b completion debug & preview 2024-09-01 11:24:54 +08:00
JzoNg
65a6265ff6 new features in chat app configuration 2024-08-30 18:58:08 +08:00
Joel
08d3cb1912 fix: filter file and file sub variable 2024-08-30 17:25:32 +08:00
Joel
48d8b01d81 fix: http node value var rename 2024-08-30 15:30:12 +08:00
Joel
38edb06897 feat: list filter output 2024-08-30 15:21:37 +08:00
Joel
dc919c2a6c feat: output item var type and filter condition trigger 2024-08-30 15:06:57 +08:00
Joel
e7a6a0ab01 chore: list filter operate ui 2024-08-30 14:44:16 +08:00
Joel
61d989f413 feat: support order change 2024-08-30 14:34:46 +08:00
Joel
976efd93a1 feat: support filter variable var data sync 2024-08-30 14:14:42 +08:00
JzoNg
0e2f78b3a6 features in workflow 2024-08-29 22:54:36 +08:00
JzoNg
b3529d3ccc file upload 2024-08-29 20:20:28 +08:00
JzoNg
d69b453729 conversation opener 2024-08-29 20:20:28 +08:00
JzoNg
2f658de155 moderation 2024-08-29 20:20:28 +08:00
JzoNg
a691700b48 text2speech 2024-08-29 20:20:28 +08:00
JzoNg
c5317d8f58 feature card 2024-08-29 20:20:28 +08:00
JzoNg
822f03f3cd text to speech card 2024-08-29 20:20:28 +08:00
JzoNg
101e56baaa follow up & citations & speech-to-text 2024-08-29 20:20:28 +08:00
JzoNg
3a8f516dfc more like this 2024-08-29 20:20:28 +08:00
JzoNg
912030c9a1 update style 2024-08-29 20:20:28 +08:00
JzoNg
687661eef7 new style of feature panel 2024-08-29 20:20:28 +08:00
Joel
8efc63a705 feat: handle value picker in body file selector 2024-08-29 16:36:18 +08:00
Joel
dca4f9fe9c feat: support file values in body 2024-08-29 16:18:16 +08:00
Joel
51597629b1 fix: http binary node not valid 2024-08-29 14:32:36 +08:00
Joel
76a07513ba fix: prompt editor not updating data 2024-08-29 14:25:04 +08:00
Joel
dae62bef78 fix: changing to key-value type does not show initial key values 2024-08-29 11:55:18 +08:00
Joel
2a6629d435 feat: binary files 2024-08-29 11:50:42 +08:00
Joel
41f0ce1012 feat: support http body to new data struct 2024-08-28 16:56:31 +08:00
Joel
e90b055c47 fix: ts problems 2024-08-28 14:40:13 +08:00
Joel
94e40d4ed9 feat: default set vision var value 2024-08-28 14:35:49 +08:00
Joel
c34fc071e0 feat: vision file to api definition 2024-08-28 10:57:26 +08:00
Joel
c014ae43e1 feat: if node supports file exists check 2024-08-27 17:37:39 +08:00
Joel
9851153d38 chore: file type checkbox 2024-08-27 17:15:02 +08:00
Joel
cfbabb8383 feat: file validation 2024-08-27 17:09:34 +08:00
Joel
b78e90679d fix: choose file type problems 2024-08-27 16:55:24 +08:00
Joel
ec1bfdc723 feat: change to new start file struct 2024-08-27 16:37:15 +08:00
Joel
e20019f6e9 chore: merge main 2024-08-27 14:23:35 +08:00
Joel
2122cfb152 chore: list filter field 2024-08-27 10:43:03 +08:00
Joel
c2b8beffac feat: global variables 2024-08-26 17:32:46 +08:00
StyleZhang
985651454a progress circle 2024-08-26 10:30:26 +08:00
Joel
f9c1d06e91 chore: tools node 2024-08-23 18:08:54 +08:00
Joel
657f1d2de8 chore: http request 2024-08-23 17:56:34 +08:00
Joel
6e2192c1e0 chore: variable aggregator 2024-08-23 16:40:39 +08:00
Joel
e05b20eb91 chore: parameter extractor ui 2024-08-23 16:03:01 +08:00
Joel
5117e08def chore: question classify 2024-08-22 17:00:20 +08:00
Joel
34691ca6c9 chore: knowledge node ui 2024-08-22 16:51:44 +08:00
Joel
aa40047b08 chore: knowledge 2024-08-22 15:10:49 +08:00
StyleZhang
eca17767fe chat style 2024-08-22 15:10:09 +08:00
Joel
51cec1b9ba chore: llm upgrade 2024-08-22 14:34:58 +08:00
Joel
651547c3ef fix: number var picker and other tiny css problems 2024-08-22 10:49:56 +08:00
Joel
8fbdaa604c feat: file array variable choose vars 2024-08-21 17:29:03 +08:00
Joel
1bcb30647f chore: select ui 2024-08-21 17:29:03 +08:00
JzoNg
bc245a25bf new style of tables 2024-08-21 17:07:14 +08:00
Joel
85b25ebe1b chore: file select trigger 2024-08-21 16:45:18 +08:00
Joel
b50e94d681 feat: file array sub variable select 2024-08-21 16:22:58 +08:00
Joel
91c0657cf6 fix: select default trigger problem 2024-08-21 15:35:28 +08:00
StyleZhang
0da06128e3 agent tool in chat 2024-08-21 15:32:13 +08:00
Joel
0c4af3a1d2 feat: support sub variable operate changes with key and value support 2024-08-21 15:27:08 +08:00
Joel
5628b293f8 feat: sub var if condition position 2024-08-21 14:54:06 +08:00
JzoNg
fff40aae58 Merge branch 'main' into jzh 2024-08-21 13:40:04 +08:00
Joel
b3b87b3e4c chore: sub variable trigger 2024-08-21 11:07:13 +08:00
Joel
9a23cd08d8 fix: sub variable select trigger 2024-08-19 16:49:52 +08:00
JzoNg
cf61ca24e3 new style of table 2024-08-19 16:05:37 +08:00
Joel
58a56add9c feat: can support value 2024-08-19 15:15:44 +08:00
JzoNg
b362031baf chip 2024-08-19 14:11:25 +08:00
Joel
7ad409b3d9 fix: update operate and value 2024-08-19 13:53:23 +08:00
Joel
876ea90fe9 feat: support update sub variable value 2024-08-19 11:43:52 +08:00
JzoNg
0eb442f954 new style of user inputs 2024-08-16 17:48:31 +08:00
Joel
4554ac3ef8 feat: can add sub variable 2024-08-16 17:12:44 +08:00
Joel
eaa7d114dc feat: file array not sub vars 2024-08-16 11:39:23 +08:00
Joel
581228be74 feat: abstract condition logic to components 2024-08-16 10:14:43 +08:00
StyleZhang
02da0219ff workflow debug and preview panel style 2024-08-15 17:30:17 +08:00
JzoNg
d0bbe43dab chore: fix type 2024-08-15 16:36:47 +08:00
JzoNg
16acdc9be4 new style of workflow process 2024-08-15 16:33:33 +08:00
Joel
a6999b5d02 fix: handle old data without vision data set 2024-08-15 11:37:02 +08:00
StyleZhang
33bfa4758e Merge branch 'main' into feat/attachments 2024-08-15 10:57:59 +08:00
JzoNg
db63c2c219 new style of status 2024-08-14 18:40:11 +08:00
JzoNg
bea4ec5998 style update of log 2024-08-13 17:11:17 +08:00
JzoNg
74333db4c8 update input in env & conversation var 2024-08-13 16:12:12 +08:00
JzoNg
0019fb9f8b Merge branch 'main' into jzh 2024-08-13 15:59:19 +08:00
JzoNg
47615ac8fb meta data style update 2024-08-13 15:25:02 +08:00
StyleZhang
d7c8bced9b file uploader 2024-08-13 15:24:32 +08:00
Joel
57f178902f feat: if node select value 2024-08-13 14:14:38 +08:00
Joel
4586de48d6 feat: default var type 2024-08-13 14:04:43 +08:00
Joel
6549519fa5 feat: files attr select 2024-08-13 13:59:19 +08:00
Joel
ae098ad121 feat: condition operation 2024-08-13 11:16:27 +08:00
Joel
20922fde1c feat: detect file type 2024-08-12 18:22:07 +08:00
StyleZhang
079c802b5c file uploader 2024-08-12 16:24:13 +08:00
JzoNg
efcd462a69 fix style of switch 2024-08-12 13:15:36 +08:00
Joel
843c8ad306 feat: file obj 2024-08-09 17:46:33 +08:00
StyleZhang
594bf96922 file uploader hooks 2024-08-09 16:48:58 +08:00
JzoNg
ade385c9c1 replace input in workflow blocks 2024-08-09 16:19:10 +08:00
JzoNg
baed068231 replace form input 2024-08-09 12:35:01 +08:00
Joel
42f5334ae4 feat: iteration file array input 2024-08-09 11:45:14 +08:00
Joel
3c4ab0632d feat: tool support file type 2024-08-09 11:38:40 +08:00
Joel
bc5f109308 feat: http support body binary 2024-08-09 10:53:59 +08:00
Joel
97b2a42cc3 feat: form data support file type 2024-08-09 10:29:29 +08:00
JzoNg
939df16655 refactor input and replace search input 2024-08-08 17:39:09 +08:00
JzoNg
9362ae045c fix textarea onchange 2024-08-08 17:39:09 +08:00
Joel
257c515178 fix: handle old data with no vision data 2024-08-08 15:47:55 +08:00
Joel
6b7520ccc2 fix: old llm code 2024-08-08 15:44:39 +08:00
Joel
85eeaee95a feat: vision validation 2024-08-08 14:40:08 +08:00
Joel
99bf3ff565 feat: params support vision 2024-08-08 14:22:54 +08:00
Joel
36ae154ca2 feat: classify support vision 2024-08-08 11:57:07 +08:00
Joel
ef93d60534 chore: vision logic hooks 2024-08-08 11:36:44 +08:00
JzoNg
6c9a6b99e0 refactor textarea 2024-08-08 11:14:52 +08:00
JzoNg
b73f05fdf0 new style of textarea 2024-08-08 11:14:52 +08:00
StyleZhang
26bca75884 file uploader 2024-08-08 10:27:43 +08:00
Joel
e2962da1b8 chore: add vision disabled tip check 2024-08-07 11:33:19 +08:00
NFish
1b9ebb8037 feat: add disabled support to tooltip-plus component (#7036) 2024-08-07 11:33:19 +08:00
crazywoola
a945a45b06 doc: correct typos in mdx files (#7029) 2024-08-07 11:33:19 +08:00
Achim
be829a8103 Provide output data also in json property of workflow tool (#6924) (#7027) 2024-08-07 11:33:19 +08:00
crazywoola
9432d41e60 fix: typos in wenxin llm (#7021) 2024-08-07 11:33:19 +08:00
Sa Zhang
0beeb4ab3e fix: incorrect context size for jina-reranker-v2 model (#7006) 2024-08-07 11:33:19 +08:00
Bryan
d7e057be44 fix: tran list issue (#7009)
Co-authored-by: libing <libing@healink.cn>
2024-08-07 11:33:19 +08:00
Jyong
81b11c08d0 Fix/reranking mode is null (#7012) 2024-08-07 11:33:19 +08:00
Joel
83a5cdfff9 feat: agent app support generate prompt (#7007) 2024-08-07 11:33:19 +08:00
yanghx
c837218bc9 fix #6902 .docx handles images within tables and handles cross-column tables (#6951) 2024-08-07 11:33:19 +08:00
crazywoola
68552893ef fix: code-block-missing-checks (#7002) 2024-08-07 11:33:19 +08:00
灰灰
5ba93ed064 fix: code tool fails when null property exists in object (#6988) 2024-08-07 11:33:19 +08:00
Yi Xiao
959107f553 Feat/new confirm (#6984) 2024-08-07 11:33:19 +08:00
Yefori
443d929137 feat: add function calling for deepseek models (#6990) 2024-08-07 11:33:19 +08:00
Vico Chu
1e04418023 Chores: fix name typo (#6987) 2024-08-07 11:33:19 +08:00
小羽
aeda8869bc feat: nvidia add nemotron4-340b and microsoft/phi-3 (#6973) 2024-08-07 11:33:19 +08:00
非法操作
10eed02ec4 chore: update duckduckgo tool (#6983) 2024-08-07 11:33:19 +08:00
Dr. Artificial曾小健
2472c4f890 fix doc (#6974) 2024-08-07 11:33:19 +08:00
Joel
0455e4e1a5 feat: llm support vision 2024-08-07 11:33:19 +08:00
StyleZhang
251ab5418f file-uploader i18n 2024-08-06 18:03:38 +08:00
Joel
38e6e40900 feat: config vision comp 2024-08-06 17:33:02 +08:00
Joel
b3a3672857 chore: new required field 2024-08-06 15:42:22 +08:00
Joel
53a3c199ec chore: input slider 2024-08-06 15:33:38 +08:00
Joel
fca5af5073 feat: max number with slider 2024-08-06 15:25:53 +08:00
Joel
77d0aac1d3 feat: support custom file type 2024-08-06 14:59:26 +08:00
StyleZhang
fd0f8f33b5 Merge branch 'main' into feat/attachments 2024-08-06 09:58:53 +08:00
Joel
0be99ad01c feat: select file types 2024-08-02 18:17:13 +08:00
StyleZhang
a05d16375e Merge branch 'main' into feat/attachments 2024-08-02 16:30:10 +08:00
Joel
0480bb03c3 feat: new input types 2024-08-02 11:43:01 +08:00
StyleZhang
19dfc6d9a8 file uploader 2024-08-02 10:21:20 +08:00
Joel
d361675159 chore: some select style 2024-08-01 17:54:33 +08:00
Joel
23ae150298 feat: filter condition 2024-08-01 17:10:02 +08:00
Joel
81383d7c74 feat: sub var picker 2024-08-01 14:43:17 +08:00
Joel
573f653789 feat: order by 2024-08-01 11:40:55 +08:00
Joel
f1b61861b6 Merge branch 'main' into feat/attachments 2024-07-31 17:59:10 +08:00
Joel
8ecee8abce fix: ts problem 2024-07-31 17:02:11 +08:00
Joel
e9ce9c1f47 feat: limit config 2024-07-31 17:01:26 +08:00
Joel
944fea4cc9 feat: list filter type and output var 2024-07-31 16:31:51 +08:00
StyleZhang
25c029877a progress circle 2024-07-31 15:15:26 +08:00
StyleZhang
9c31c56115 file uploader 2024-07-30 16:19:20 +08:00
StyleZhang
56507c9f7a chat input area 2024-07-30 13:48:39 +08:00
StyleZhang
b322dda3f6 Merge branch 'main' into feat/attachments 2024-07-30 10:06:40 +08:00
StyleZhang
52d69dd55b file uploader 2024-07-29 17:22:11 +08:00
StyleZhang
0451c5590c Merge branch 'main' into feat/attachments 2024-07-29 10:19:11 +08:00
StyleZhang
2498c238b2 file-uploader 2024-07-26 16:54:45 +08:00
Joel
6e15d7f777 feat: doc extract inputs 2024-07-26 15:00:29 +08:00
Joel
f6caf0915b chore: block bg to utils color 2024-07-26 14:23:34 +08:00
Joel
09aa14ca82 feat: node icons 2024-07-26 14:11:03 +08:00
Joel
394f06a27a feat: list filter 2024-07-26 11:55:09 +08:00
Joel
6fafd410d2 feat: doc extract struct 2024-07-26 11:21:17 +08:00
StyleZhang
1668df104f Merge branch 'main' into feat/attachments 2024-07-26 08:49:17 +08:00
StyleZhang
d376b8540e add file-uploader 2024-07-25 16:41:09 +08:00
2634 changed files with 100231 additions and 46926 deletions

View File

@@ -39,7 +39,7 @@ jobs:
api/pyproject.toml
api/poetry.lock
- name: Poetry check
- name: Check Poetry lockfile
run: |
poetry check -C api --lock
poetry show -C api
@@ -47,6 +47,9 @@ jobs:
- name: Install dependencies
run: poetry install -C api --with dev
- name: Check dependencies in pyproject.toml
run: poetry run -C api bash dev/pytest/pytest_artifacts.sh
- name: Run Unit tests
run: poetry run -C api bash dev/pytest/pytest_unit_tests.sh

View File

@@ -5,6 +5,7 @@ on:
branches:
- "main"
- "deploy/dev"
- "release/0.10.0-beta"
release:
types: [published]
@@ -125,7 +126,7 @@ jobs:
with:
images: ${{ env[matrix.image_name_env] }}
tags: |
type=raw,value=latest,enable=${{ startsWith(github.ref, 'refs/tags/') }}
type=raw,value=latest,enable=${{ startsWith(github.ref, 'refs/tags/') && !contains(github.ref, '-') }}
type=ref,event=branch
type=sha,enable=true,priority=100,prefix=,suffix=,format=long
type=raw,value=${{ github.ref_name }},enable=${{ startsWith(github.ref, 'refs/tags/') }}

View File

@@ -20,7 +20,7 @@ jobs:
- name: Check changed files
id: changed-files
uses: tj-actions/changed-files@v44
uses: tj-actions/changed-files@v45
with:
files: api/**
@@ -66,7 +66,7 @@ jobs:
- name: Check changed files
id: changed-files
uses: tj-actions/changed-files@v44
uses: tj-actions/changed-files@v45
with:
files: web/**
@@ -97,7 +97,7 @@ jobs:
- name: Check changed files
id: changed-files
uses: tj-actions/changed-files@v44
uses: tj-actions/changed-files@v45
with:
files: |
**.sh
@@ -107,7 +107,7 @@ jobs:
dev/**
- name: Super-linter
uses: super-linter/super-linter/slim@v6
uses: super-linter/super-linter/slim@v7
if: steps.changed-files.outputs.any_changed == 'true'
env:
BASH_SEVERITY: warning

View File

@@ -0,0 +1,54 @@
name: Check i18n Files and Create PR
on:
pull_request:
types: [closed]
branches: [main]
jobs:
check-and-update:
if: github.event.pull_request.merged == true
runs-on: ubuntu-latest
defaults:
run:
working-directory: web
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 2 # last 2 commits
- name: Check for file changes in i18n/en-US
id: check_files
run: |
recent_commit_sha=$(git rev-parse HEAD)
second_recent_commit_sha=$(git rev-parse HEAD~1)
changed_files=$(git diff --name-only $recent_commit_sha $second_recent_commit_sha -- 'i18n/en-US/*.ts')
echo "Changed files: $changed_files"
if [ -n "$changed_files" ]; then
echo "FILES_CHANGED=true" >> $GITHUB_ENV
else
echo "FILES_CHANGED=false" >> $GITHUB_ENV
fi
- name: Set up Node.js
if: env.FILES_CHANGED == 'true'
uses: actions/setup-node@v2
with:
node-version: 'lts/*'
- name: Install dependencies
if: env.FILES_CHANGED == 'true'
run: yarn install --frozen-lockfile
- name: Run npm script
if: env.FILES_CHANGED == 'true'
run: npm run auto-gen-i18n
- name: Create Pull Request
if: env.FILES_CHANGED == 'true'
uses: peter-evans/create-pull-request@v6
with:
commit-message: Update i18n files based on en-US changes
title: 'chore: translate i18n files'
body: This PR was automatically created to update i18n files based on changes in en-US locale.
branch: chore/automated-i18n-updates

.github/workflows/web-tests.yml (new file, 46 lines)
View File

@@ -0,0 +1,46 @@
name: Web Tests
on:
pull_request:
branches:
- main
paths:
- web/**
concurrency:
group: web-tests-${{ github.head_ref || github.run_id }}
cancel-in-progress: true
jobs:
test:
name: Web Tests
runs-on: ubuntu-latest
defaults:
run:
working-directory: ./web
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Check changed files
id: changed-files
uses: tj-actions/changed-files@v45
with:
files: web/**
- name: Setup Node.js
uses: actions/setup-node@v4
if: steps.changed-files.outputs.any_changed == 'true'
with:
node-version: 20
cache: yarn
cache-dependency-path: ./web/package.json
- name: Install dependencies
if: steps.changed-files.outputs.any_changed == 'true'
run: yarn install --frozen-lockfile
- name: Run tests
if: steps.changed-files.outputs.any_changed == 'true'
run: yarn test

.gitignore (9 changed lines)
View File

@@ -153,6 +153,9 @@ docker-legacy/volumes/etcd/*
docker-legacy/volumes/minio/*
docker-legacy/volumes/milvus/*
docker-legacy/volumes/chroma/*
docker-legacy/volumes/opensearch/data/*
docker-legacy/volumes/pgvectors/data/*
docker-legacy/volumes/pgvector/data/*
docker/volumes/app/storage/*
docker/volumes/certbot/*
@@ -164,6 +167,12 @@ docker/volumes/etcd/*
docker/volumes/minio/*
docker/volumes/milvus/*
docker/volumes/chroma/*
docker/volumes/opensearch/data/*
docker/volumes/myscale/data/*
docker/volumes/myscale/log/*
docker/volumes/unstructured/*
docker/volumes/pgvector/data/*
docker/volumes/pgvecto_rs/data/*
docker/nginx/conf.d/default.conf
docker/middleware.env

View File

@@ -36,7 +36,7 @@
| 被团队成员标记为高优先级的功能 | 高优先级 |
| 在 [community feedback board](https://github.com/langgenius/dify/discussions/categories/feedbacks) 内反馈的常见功能请求 | 中等优先级 |
| 非核心功能和小幅改进 | 低优先级 |
| 有价值不紧急 | 未来功能 |
| 有价值不紧急 | 未来功能 |
### 其他任何事情(例如 bug 报告、性能优化、拼写错误更正):
* 立即开始编码。
@@ -138,7 +138,7 @@ Dify 的后端使用 Python 编写,使用 [Flask](https://flask.palletsproject
├── models // 描述数据模型和 API 响应的形状
├── public // 如 favicon 等元资源
├── service // 定义 API 操作的形状
├── test
├── test
├── types // 函数参数和返回值的描述
└── utils // 共享的实用函数
```

View File

@@ -4,7 +4,7 @@ Dify is licensed under the Apache License 2.0, with the following additional con
1. Dify may be utilized commercially, including as a backend service for other applications or as an application development platform for enterprises. Should the conditions below be met, a commercial license must be obtained from the producer:
a. Multi-tenant SaaS service: Unless explicitly authorized by Dify in writing, you may not use the Dify source code to operate a multi-tenant environment.
a. Multi-tenant service: Unless explicitly authorized by Dify in writing, you may not use the Dify source code to operate a multi-tenant environment.
- Tenant Definition: Within the context of Dify, one tenant corresponds to one workspace. The workspace provides a separated area for each tenant's data and configurations.
b. LOGO and copyright information: In the process of using Dify's frontend components, you may not remove or modify the LOGO or copyright information in the Dify console or applications. This restriction is inapplicable to uses of Dify that do not involve its frontend components.

View File

@@ -17,7 +17,7 @@
alt="chat on Discord"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="follow on Twitter"></a>
alt="follow on X(Twitter)"></a>
<a href="https://hub.docker.com/u/langgenius" target="_blank">
<img alt="Docker Pulls" src="https://img.shields.io/docker/pulls/langgenius/dify-web?labelColor=%20%23FDB062&color=%20%23f79009"></a>
<a href="https://github.com/langgenius/dify/graphs/commit-activity" target="_blank">
@@ -196,10 +196,14 @@ If you'd like to configure a highly-available setup, there are community-contrib
#### Using Terraform for Deployment
Deploy Dify to Cloud Platform with a single click using [terraform](https://www.terraform.io/)
##### Azure Global
Deploy Dify to Azure with a single click using [terraform](https://www.terraform.io/).
- [Azure Terraform by @nikawang](https://github.com/nikawang/dify-azure-terraform)
##### Google Cloud
- [Google Cloud Terraform by @sotazum](https://github.com/DeNA/dify-google-cloud-terraform)
## Contributing
For those who'd like to contribute code, see our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
@@ -219,7 +223,7 @@ At the same time, please consider supporting Dify by sharing it on social media
* [Github Discussion](https://github.com/langgenius/dify/discussions). Best for: sharing feedback and asking questions.
* [GitHub Issues](https://github.com/langgenius/dify/issues). Best for: bugs you encounter using Dify.AI, and feature proposals. See our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
* [Discord](https://discord.gg/FngNHpbcY7). Best for: sharing your applications and hanging out with the community.
* [Twitter](https://twitter.com/dify_ai). Best for: sharing your applications and hanging out with the community.
* [X(Twitter)](https://twitter.com/dify_ai). Best for: sharing your applications and hanging out with the community.
## Star history

View File

@@ -17,7 +17,7 @@
alt="chat on Discord"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="follow on Twitter"></a>
alt="follow on X(Twitter)"></a>
<a href="https://hub.docker.com/u/langgenius" target="_blank">
<img alt="Docker Pulls" src="https://img.shields.io/docker/pulls/langgenius/dify-web?labelColor=%20%23FDB062&color=%20%23f79009"></a>
<a href="https://github.com/langgenius/dify/graphs/commit-activity" target="_blank">
@@ -179,10 +179,13 @@ docker compose up -d
#### استخدام Terraform للتوزيع
انشر Dify إلى منصة السحابة بنقرة واحدة باستخدام [terraform](https://www.terraform.io/)
##### Azure Global
استخدم [terraform](https://www.terraform.io/) لنشر Dify على Azure بنقرة واحدة.
- [Azure Terraform بواسطة @nikawang](https://github.com/nikawang/dify-azure-terraform)
##### Google Cloud
- [Google Cloud Terraform بواسطة @sotazum](https://github.com/DeNA/dify-google-cloud-terraform)
## المساهمة

View File

@@ -17,7 +17,7 @@
alt="chat on Discord"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="follow on Twitter"></a>
alt="follow on X(Twitter)"></a>
<a href="https://hub.docker.com/u/langgenius" target="_blank">
<img alt="Docker Pulls" src="https://img.shields.io/docker/pulls/langgenius/dify-web?labelColor=%20%23FDB062&color=%20%23f79009"></a>
<a href="https://github.com/langgenius/dify/graphs/commit-activity" target="_blank">
@@ -202,10 +202,14 @@ docker compose up -d
#### 使用 Terraform 部署
使用 [terraform](https://www.terraform.io/) 一键将 Dify 部署到云平台
##### Azure Global
使用 [terraform](https://www.terraform.io/) 一键部署 Dify 到 Azure。
- [Azure Terraform by @nikawang](https://github.com/nikawang/dify-azure-terraform)
##### Google Cloud
- [Google Cloud Terraform by @sotazum](https://github.com/DeNA/dify-google-cloud-terraform)
## Star History
[![Star History Chart](https://api.star-history.com/svg?repos=langgenius/dify&type=Date)](https://star-history.com/#langgenius/dify&Date)
@@ -232,7 +236,7 @@ docker compose up -d
- [GitHub Issues](https://github.com/langgenius/dify/issues)。👉:使用 Dify.AI 时遇到的错误和问题,请参阅[贡献指南](CONTRIBUTING.md)。
- [电子邮件支持](mailto:hello@dify.ai?subject=[GitHub]Questions%20About%20Dify)。👉:关于使用 Dify.AI 的问题。
- [Discord](https://discord.gg/FngNHpbcY7)。👉:分享您的应用程序并与社区交流。
- [Twitter](https://twitter.com/dify_ai)。👉:分享您的应用程序并与社区交流。
- [X(Twitter)](https://twitter.com/dify_ai)。👉:分享您的应用程序并与社区交流。
- [商业许可](mailto:business@dify.ai?subject=[GitHub]Business%20License%20Inquiry)。👉:有关商业用途许可 Dify.AI 的商业咨询。
- [微信]() 👉:扫描下方二维码,添加微信好友,备注 Dify,我们将邀请您加入 Dify 社区。
<img src="./images/wechat.png" alt="wechat" width="100"/>

View File

@@ -17,7 +17,7 @@
alt="chat en Discord"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="seguir en Twitter"></a>
alt="seguir en X(Twitter)"></a>
<a href="https://hub.docker.com/u/langgenius" target="_blank">
<img alt="Descargas de Docker" src="https://img.shields.io/docker/pulls/langgenius/dify-web?labelColor=%20%23FDB062&color=%20%23f79009"></a>
<a href="https://github.com/langgenius/dify/graphs/commit-activity" target="_blank">
@@ -204,10 +204,13 @@ Si desea configurar una configuración de alta disponibilidad, la comunidad prop
#### Uso de Terraform para el despliegue
Despliega Dify en una plataforma en la nube con un solo clic utilizando [terraform](https://www.terraform.io/)
##### Azure Global
Utiliza [terraform](https://www.terraform.io/) para desplegar Dify en Azure con un solo clic.
- [Azure Terraform por @nikawang](https://github.com/nikawang/dify-azure-terraform)
##### Google Cloud
- [Google Cloud Terraform por @sotazum](https://github.com/DeNA/dify-google-cloud-terraform)
## Contribuir
@@ -228,7 +231,7 @@ Al mismo tiempo, considera apoyar a Dify compartiéndolo en redes sociales y en
* [Discusión en GitHub](https://github.com/langgenius/dify/discussions). Lo mejor para: compartir comentarios y hacer preguntas.
* [Reporte de problemas en GitHub](https://github.com/langgenius/dify/issues). Lo mejor para: errores que encuentres usando Dify.AI y propuestas de características. Consulta nuestra [Guía de contribución](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
* [Discord](https://discord.gg/FngNHpbcY7). Lo mejor para: compartir tus aplicaciones y pasar el rato con la comunidad.
* [Twitter](https://twitter.com/dify_ai). Lo mejor para: compartir tus aplicaciones y pasar el rato con la comunidad.
* [X(Twitter)](https://twitter.com/dify_ai). Lo mejor para: compartir tus aplicaciones y pasar el rato con la comunidad.
## Historial de Estrellas

View File

@@ -17,7 +17,7 @@
alt="chat sur Discord"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="suivre sur Twitter"></a>
alt="suivre sur X(Twitter)"></a>
<a href="https://hub.docker.com/u/langgenius" target="_blank">
<img alt="Tirages Docker" src="https://img.shields.io/docker/pulls/langgenius/dify-web?labelColor=%20%23FDB062&color=%20%23f79009"></a>
<a href="https://github.com/langgenius/dify/graphs/commit-activity" target="_blank">
@@ -202,10 +202,13 @@ Si vous souhaitez configurer une configuration haute disponibilité, la communau
#### Utilisation de Terraform pour le déploiement
Déployez Dify sur une plateforme cloud en un clic en utilisant [terraform](https://www.terraform.io/)
##### Azure Global
Utilisez [terraform](https://www.terraform.io/) pour déployer Dify sur Azure en un clic.
- [Azure Terraform par @nikawang](https://github.com/nikawang/dify-azure-terraform)
##### Google Cloud
- [Google Cloud Terraform par @sotazum](https://github.com/DeNA/dify-google-cloud-terraform)
## Contribuer
@@ -226,7 +229,7 @@ Dans le même temps, veuillez envisager de soutenir Dify en le partageant sur le
* [Discussion GitHub](https://github.com/langgenius/dify/discussions). Meilleur pour: partager des commentaires et poser des questions.
* [Problèmes GitHub](https://github.com/langgenius/dify/issues). Meilleur pour: les bogues que vous rencontrez en utilisant Dify.AI et les propositions de fonctionnalités. Consultez notre [Guide de contribution](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
* [Discord](https://discord.gg/FngNHpbcY7). Meilleur pour: partager vos applications et passer du temps avec la communauté.
* [Twitter](https://twitter.com/dify_ai). Meilleur pour: partager vos applications et passer du temps avec la communauté.
* [X(Twitter)](https://twitter.com/dify_ai). Meilleur pour: partager vos applications et passer du temps avec la communauté.
## Historique des étoiles

View File

@@ -17,7 +17,7 @@
alt="Discordでチャット"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="Twitterでフォロー"></a>
alt="X(Twitter)でフォロー"></a>
<a href="https://hub.docker.com/u/langgenius" target="_blank">
<img alt="Docker Pulls" src="https://img.shields.io/docker/pulls/langgenius/dify-web?labelColor=%20%23FDB062&color=%20%23f79009"></a>
<a href="https://github.com/langgenius/dify/graphs/commit-activity" target="_blank">
@@ -68,7 +68,7 @@ DifyはオープンソースのLLMアプリケーション開発プラットフ
プロンプトの作成、モデルパフォーマンスの比較が行え、チャットベースのアプリに音声合成などの機能も追加できます。
**4. RAGパイプライン**:
ドキュメントの取り込みから検索までをカバーする広範なRAG機能ができます。ほかにもPDF、PPT、その他の一般的なドキュメントフォーマットからのテキスト抽出のサーポイントも提供します。
ドキュメントの取り込みから検索までをカバーする広範なRAG機能ができます。ほかにもPDF、PPT、その他の一般的なドキュメントフォーマットからのテキスト抽出のサポートも提供します。
**5. エージェント機能**:
LLM Function CallingやReActに基づくエージェントの定義が可能で、AIエージェント用のプリビルトまたはカスタムツールを追加できます。Difyには、Google検索、DALL·E、Stable Diffusion、WolframAlphaなどのAIエージェント用の50以上の組み込みツールが提供します。
@@ -201,10 +201,13 @@ docker compose up -d
#### Terraformを使用したデプロイ
##### Azure Global
[terraform](https://www.terraform.io/) を使用して、AzureにDifyをワンクリックでデプロイします。
- [nikawangのAzure Terraform](https://github.com/nikawang/dify-azure-terraform)
[terraform](https://www.terraform.io/) を使用して、ワンクリックでDifyをクラウドプラットフォームにデプロイします
##### Azure Global
- [@nikawangによるAzure Terraform](https://github.com/nikawang/dify-azure-terraform)
##### Google Cloud
- [@sotazumによるGoogle Cloud Terraform](https://github.com/DeNA/dify-google-cloud-terraform)
## 貢献
@@ -225,7 +228,7 @@ docker compose up -d
* [Github Discussion](https://github.com/langgenius/dify/discussions). 主に: フィードバックの共有や質問。
* [GitHub Issues](https://github.com/langgenius/dify/issues). 主に: Dify.AIを使用する際に発生するエラーや問題については、[貢献ガイド](CONTRIBUTING_JA.md)を参照してください
* [Discord](https://discord.gg/FngNHpbcY7). 主に: アプリケーションの共有やコミュニティとの交流。
* [Twitter](https://twitter.com/dify_ai). 主に: アプリケーションの共有やコミュニティとの交流。
* [X(Twitter)](https://twitter.com/dify_ai). 主に: アプリケーションの共有やコミュニティとの交流。

View File

@@ -17,7 +17,7 @@
alt="chat on Discord"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="follow on Twitter"></a>
alt="follow on X(Twitter)"></a>
<a href="https://hub.docker.com/u/langgenius" target="_blank">
<img alt="Docker Pulls" src="https://img.shields.io/docker/pulls/langgenius/dify-web?labelColor=%20%23FDB062&color=%20%23f79009"></a>
<a href="https://github.com/langgenius/dify/graphs/commit-activity" target="_blank">
@@ -202,10 +202,13 @@ If you'd like to configure a highly-available setup, there are community-contrib
#### Terraform atorlugu pilersitsineq
##### Azure Global
Atoruk [terraform](https://www.terraform.io/) Dify-mik Azure-mut ataatsikkut ikkussuilluarlugu.
- [Azure Terraform atorlugu @nikawang](https://github.com/nikawang/dify-azure-terraform)
wa'logh nIqHom neH ghun deployment toy'wI' [terraform](https://www.terraform.io/) lo'laH.
##### Azure Global
- [Azure Terraform mung @nikawang](https://github.com/nikawang/dify-azure-terraform)
##### Google Cloud
- [Google Cloud Terraform qachlot @sotazum](https://github.com/DeNA/dify-google-cloud-terraform)
## Contributing
@@ -228,7 +231,7 @@ At the same time, please consider supporting Dify by sharing it on social media
* [Github Discussion](https://github.com/langgenius/dify/discussions). Best for: sharing feedback and asking questions.
* [GitHub Issues](https://github.com/langgenius/dify/issues). Best for: bugs you encounter using Dify.AI, and feature proposals. See our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
* [Discord](https://discord.gg/FngNHpbcY7). Best for: sharing your applications and hanging out with the community.
* [Twitter](https://twitter.com/dify_ai). Best for: sharing your applications and hanging out with the community.
* [X(Twitter)](https://twitter.com/dify_ai). Best for: sharing your applications and hanging out with the community.
## Star History

View File

@@ -17,7 +17,7 @@
alt="chat on Discord"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="follow on Twitter"></a>
alt="follow on X(Twitter)"></a>
<a href="https://hub.docker.com/u/langgenius" target="_blank">
<img alt="Docker Pulls" src="https://img.shields.io/docker/pulls/langgenius/dify-web?labelColor=%20%23FDB062&color=%20%23f79009"></a>
<a href="https://github.com/langgenius/dify/graphs/commit-activity" target="_blank">
@@ -39,7 +39,6 @@
<a href="./README_AR.md"><img alt="README بالعربية" src="https://img.shields.io/badge/العربية-d9d9d9"></a>
<a href="./README_TR.md"><img alt="Türkçe README" src="https://img.shields.io/badge/Türkçe-d9d9d9"></a>
<a href="./README_VI.md"><img alt="README Tiếng Việt" src="https://img.shields.io/badge/Ti%E1%BA%BFng%20Vi%E1%BB%87t-d9d9d9"></a>
</p>
@@ -195,10 +194,14 @@ Dify를 Kubernetes에 배포하고 프리미엄 스케일링 설정을 구성했
#### Terraform을 사용한 배포
[terraform](https://www.terraform.io/)을 사용하여 단 한 번의 클릭으로 Dify를 클라우드 플랫폼에 배포하십시오
##### Azure Global
[terraform](https://www.terraform.io/)을 사용하여 Azure에 Dify를 원클릭으로 배포하세요.
- [nikawang의 Azure Terraform](https://github.com/nikawang/dify-azure-terraform)
##### Google Cloud
- [sotazum의 Google Cloud Terraform](https://github.com/DeNA/dify-google-cloud-terraform)
## 기여
코드에 기여하고 싶은 분들은 [기여 가이드](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md)를 참조하세요.

View File

@@ -17,7 +17,7 @@
alt="Discord'da sohbet et"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="Twitter'da takip et"></a>
alt="X(Twitter)'da takip et"></a>
<a href="https://hub.docker.com/u/langgenius" target="_blank">
<img alt="Docker Çekmeleri" src="https://img.shields.io/docker/pulls/langgenius/dify-web?labelColor=%20%23FDB062&color=%20%23f79009"></a>
<a href="https://github.com/langgenius/dify/graphs/commit-activity" target="_blank">
@@ -200,9 +200,13 @@ Yüksek kullanılabilirliğe sahip bir kurulum yapılandırmak isterseniz, Dify'
#### Dağıtım için Terraform Kullanımı
Dify'ı bulut platformuna tek tıklamayla dağıtın [terraform](https://www.terraform.io/) kullanarak
##### Azure Global
[Terraform](https://www.terraform.io/) kullanarak Dify'ı Azure'a tek tıklamayla dağıtın.
- [@nikawang tarafından Azure Terraform](https://github.com/nikawang/dify-azure-terraform)
- [Azure Terraform tarafından @nikawang](https://github.com/nikawang/dify-azure-terraform)
##### Google Cloud
- [Google Cloud Terraform tarafından @sotazum](https://github.com/DeNA/dify-google-cloud-terraform)
## Katkıda Bulunma
@@ -222,7 +226,7 @@ Aynı zamanda, lütfen Dify'ı sosyal medyada, etkinliklerde ve konferanslarda p
* [Github Tartışmaları](https://github.com/langgenius/dify/discussions). En uygun: geri bildirim paylaşmak ve soru sormak için.
* [GitHub Sorunları](https://github.com/langgenius/dify/issues). En uygun: Dify.AI kullanırken karşılaştığınız hatalar ve özellik önerileri için. [Katkı Kılavuzumuza](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md) bakın.
* [Discord](https://discord.gg/FngNHpbcY7). En uygun: uygulamalarınızı paylaşmak ve toplulukla vakit geçirmek için.
* [Twitter](https://twitter.com/dify_ai). En uygun: uygulamalarınızı paylaşmak ve toplulukla vakit geçirmek için.
* [X(Twitter)](https://twitter.com/dify_ai). En uygun: uygulamalarınızı paylaşmak ve toplulukla vakit geçirmek için.
## Star history

View File

@@ -17,7 +17,7 @@
alt="chat trên Discord"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="theo dõi trên Twitter"></a>
alt="theo dõi trên X(Twitter)"></a>
<a href="https://hub.docker.com/u/langgenius" target="_blank">
<img alt="Docker Pulls" src="https://img.shields.io/docker/pulls/langgenius/dify-web?labelColor=%20%23FDB062&color=%20%23f79009"></a>
<a href="https://github.com/langgenius/dify/graphs/commit-activity" target="_blank">
@@ -196,10 +196,14 @@ Nếu bạn muốn cấu hình một cài đặt có độ sẵn sàng cao, có
#### Sử dụng Terraform để Triển khai
Triển khai Dify lên nền tảng đám mây với một cú nhấp chuột bằng cách sử dụng [terraform](https://www.terraform.io/)
##### Azure Global
Triển khai Dify lên Azure chỉ với một cú nhấp chuột bằng cách sử dụng [terraform](https://www.terraform.io/).
- [Azure Terraform bởi @nikawang](https://github.com/nikawang/dify-azure-terraform)
##### Google Cloud
- [Google Cloud Terraform bởi @sotazum](https://github.com/DeNA/dify-google-cloud-terraform)
## Đóng góp
Đối với những người muốn đóng góp mã, xem [Hướng dẫn Đóng góp](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md) của chúng tôi.
@@ -219,7 +223,7 @@ Triển khai Dify lên Azure chỉ với một cú nhấp chuột bằng cách s
* [Thảo luận GitHub](https://github.com/langgenius/dify/discussions). Tốt nhất cho: chia sẻ phản hồi và đặt câu hỏi.
* [Vấn đề GitHub](https://github.com/langgenius/dify/issues). Tốt nhất cho: lỗi bạn gặp phải khi sử dụng Dify.AI và đề xuất tính năng. Xem [Hướng dẫn Đóng góp](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md) của chúng tôi.
* [Discord](https://discord.gg/FngNHpbcY7). Tốt nhất cho: chia sẻ ứng dụng của bạn và giao lưu với cộng đồng.
* [Twitter](https://twitter.com/dify_ai). Tốt nhất cho: chia sẻ ứng dụng của bạn và giao lưu với cộng đồng.
* [X(Twitter)](https://twitter.com/dify_ai). Tốt nhất cho: chia sẻ ứng dụng của bạn và giao lưu với cộng đồng.
## Lịch sử Yêu thích

View File

@@ -20,6 +20,9 @@ FILES_URL=http://127.0.0.1:5001
# The time in seconds after the signature is rejected
FILES_ACCESS_TIMEOUT=300
# Access token expiration time in minutes
ACCESS_TOKEN_EXPIRE_MINUTES=60
# celery configuration
CELERY_BROKER_URL=redis://:difyai123456@localhost:6379/1
@@ -39,7 +42,7 @@ DB_DATABASE=dify
# Storage configuration
# use for store upload files, private keys...
# storage type: local, s3, azure-blob, google-storage
# storage type: local, s3, azure-blob, google-storage, tencent-cos, huawei-obs, volcengine-tos, baidu-obs, supabase
STORAGE_TYPE=local
STORAGE_LOCAL_PATH=storage
S3_USE_AWS_MANAGED_IAM=false
@@ -73,6 +76,18 @@ TENCENT_COS_SECRET_ID=your-secret-id
TENCENT_COS_REGION=your-region
TENCENT_COS_SCHEME=your-scheme
# Huawei OBS Storage Configuration
HUAWEI_OBS_BUCKET_NAME=your-bucket-name
HUAWEI_OBS_SECRET_KEY=your-secret-key
HUAWEI_OBS_ACCESS_KEY=your-access-key
HUAWEI_OBS_SERVER=your-server-url
# Baidu OBS Storage Configuration
BAIDU_OBS_BUCKET_NAME=your-bucket-name
BAIDU_OBS_SECRET_KEY=your-secret-key
BAIDU_OBS_ACCESS_KEY=your-access-key
BAIDU_OBS_ENDPOINT=your-server-url
# OCI Storage configuration
OCI_ENDPOINT=your-endpoint
OCI_BUCKET_NAME=your-bucket-name
@@ -80,11 +95,23 @@ OCI_ACCESS_KEY=your-access-key
OCI_SECRET_KEY=your-secret-key
OCI_REGION=your-region
# Volcengine tos Storage configuration
VOLCENGINE_TOS_ENDPOINT=your-endpoint
VOLCENGINE_TOS_BUCKET_NAME=your-bucket-name
VOLCENGINE_TOS_ACCESS_KEY=your-access-key
VOLCENGINE_TOS_SECRET_KEY=your-secret-key
VOLCENGINE_TOS_REGION=your-region
# Supabase Storage Configuration
SUPABASE_BUCKET_NAME=your-bucket-name
SUPABASE_API_KEY=your-access-key
SUPABASE_URL=your-server-url
# CORS configuration
WEB_API_CORS_ALLOW_ORIGINS=http://127.0.0.1:3000,*
CONSOLE_CORS_ALLOW_ORIGINS=http://127.0.0.1:3000,*
# Vector database configuration, support: weaviate, qdrant, milvus, myscale, relyt, pgvecto_rs, pgvector, pgvector, chroma, opensearch, tidb_vector
# Vector database configuration, support: weaviate, qdrant, milvus, myscale, relyt, pgvecto_rs, pgvector, pgvector, chroma, opensearch, tidb_vector, vikingdb
VECTOR_STORE=weaviate
# Weaviate configuration
@@ -101,11 +128,10 @@ QDRANT_GRPC_ENABLED=false
QDRANT_GRPC_PORT=6334
# Milvus configuration
MILVUS_HOST=127.0.0.1
MILVUS_PORT=19530
MILVUS_URI=http://127.0.0.1:19530
MILVUS_TOKEN=
MILVUS_USER=root
MILVUS_PASSWORD=Milvus
MILVUS_SECURE=false
# MyScale configuration
MYSCALE_HOST=127.0.0.1
@@ -150,6 +176,8 @@ PGVECTOR_PORT=5433
PGVECTOR_USER=postgres
PGVECTOR_PASSWORD=postgres
PGVECTOR_DATABASE=postgres
PGVECTOR_MIN_CONNECTION=1
PGVECTOR_MAX_CONNECTION=5
# Tidb Vector configuration
TIDB_VECTOR_HOST=xxx.eu-central-1.xxx.aws.tidbcloud.com
@@ -183,10 +211,30 @@ OPENSEARCH_USER=admin
OPENSEARCH_PASSWORD=admin
OPENSEARCH_SECURE=true
# Baidu configuration
BAIDU_VECTOR_DB_ENDPOINT=http://127.0.0.1:5287
BAIDU_VECTOR_DB_CONNECTION_TIMEOUT_MS=30000
BAIDU_VECTOR_DB_ACCOUNT=root
BAIDU_VECTOR_DB_API_KEY=dify
BAIDU_VECTOR_DB_DATABASE=dify
BAIDU_VECTOR_DB_SHARD=1
BAIDU_VECTOR_DB_REPLICAS=3
# ViKingDB configuration
VIKINGDB_ACCESS_KEY=your-ak
VIKINGDB_SECRET_KEY=your-sk
VIKINGDB_REGION=cn-shanghai
VIKINGDB_HOST=api-vikingdb.xxx.volces.com
VIKINGDB_SCHEMA=http
VIKINGDB_CONNECTION_TIMEOUT=30
VIKINGDB_SOCKET_TIMEOUT=30
# Upload configuration
UPLOAD_FILE_SIZE_LIMIT=15
UPLOAD_FILE_BATCH_LIMIT=5
UPLOAD_IMAGE_FILE_SIZE_LIMIT=10
UPLOAD_VIDEO_FILE_SIZE_LIMIT=100
UPLOAD_AUDIO_FILE_SIZE_LIMIT=50
# Model Configuration
MULTIMODAL_SEND_IMAGE_FORMAT=base64
@@ -251,6 +299,9 @@ HTTP_REQUEST_MAX_WRITE_TIMEOUT=600
HTTP_REQUEST_NODE_MAX_BINARY_SIZE=10485760
HTTP_REQUEST_NODE_MAX_TEXT_SIZE=1048576
# Respect X-* headers to redirect clients
RESPECT_XFORWARD_HEADERS_ENABLED=false
# Log file path
LOG_FILE=
@@ -261,6 +312,7 @@ INDEXING_MAX_SEGMENTATION_TOKENS_LENGTH=1000
WORKFLOW_MAX_EXECUTION_STEPS=500
WORKFLOW_MAX_EXECUTION_TIME=1200
WORKFLOW_CALL_MAX_DEPTH=5
MAX_VARIABLE_SIZE=204800
# App configuration
APP_MAX_EXECUTION_TIME=1200
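The RESPECT_XFORWARD_HEADERS_ENABLED flag added above pairs with the ext_proxy_fix extension registered in the app factory diff below. Honouring X-Forwarded-* headers behind a reverse proxy is usually done with Werkzeug's ProxyFix; a minimal sketch, where wiring it to this exact env var is an assumption for illustration:

```python
import os

from flask import Flask
from werkzeug.middleware.proxy_fix import ProxyFix

app = Flask(__name__)

# Only trust X-Forwarded-* headers when the flag is explicitly enabled.
if os.environ.get("RESPECT_XFORWARD_HEADERS_ENABLED", "false") == "true":
    # Trust one proxy hop for client IP, scheme, and host headers.
    app.wsgi_app = ProxyFix(app.wsgi_app, x_for=1, x_proto=1, x_host=1)
```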

View File

@@ -1,8 +1,15 @@
{
"version": "0.2.0",
"compounds": [
{
"name": "Launch Flask and Celery",
"configurations": ["Python: Flask", "Python: Celery"]
}
],
"configurations": [
{
"name": "Python: Flask",
"consoleName": "Flask",
"type": "debugpy",
"request": "launch",
"python": "${workspaceFolder}/.venv/bin/python",
@@ -17,12 +24,12 @@
},
"args": [
"run",
"--host=0.0.0.0",
"--port=5001"
]
},
{
"name": "Python: Celery",
"consoleName": "Celery",
"type": "debugpy",
"request": "launch",
"python": "${workspaceFolder}/.venv/bin/python",
@@ -45,10 +52,10 @@
"-c",
"1",
"--loglevel",
"info",
"DEBUG",
"-Q",
"dataset,generation,mail,ops_trace,app_deletion"
]
},
}
]
}
}

View File

@@ -55,7 +55,7 @@ RUN apt-get update \
&& echo "deb http://deb.debian.org/debian testing main" > /etc/apt/sources.list \
&& apt-get update \
# For Security
&& apt-get install -y --no-install-recommends zlib1g=1:1.3.dfsg+really1.3.1-1 expat=2.6.2-1 libldap-2.5-0=2.5.18+dfsg-3 perl=5.38.2-5 libsqlite3-0=3.46.0-1 \
&& apt-get install -y --no-install-recommends zlib1g=1:1.3.dfsg+really1.3.1-1 expat=2.6.3-1 libldap-2.5-0=2.5.18+dfsg-3 perl=5.38.2-5 libsqlite3-0=3.46.0-1 \
&& apt-get autoremove -y \
&& rm -rf /var/lib/apt/lists/*

View File

@@ -65,14 +65,12 @@
8. Start Dify [web](../web) service.
9. Setup your application by visiting `http://localhost:3000`...
10. If you need to debug local async processing, please start the worker service.
10. If you need to handle and debug the async tasks (e.g. dataset importing and documents indexing), please start the worker service.
```bash
poetry run python -m celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace,app_deletion
```
The started celery app handles the async tasks, e.g. dataset importing and documents indexing.
## Testing
1. Install dependencies for both the backend and the test environment

View File

@@ -26,7 +26,7 @@ from commands import register_commands
from configs import dify_config
# DO NOT REMOVE BELOW
from events import event_handlers
from events import event_handlers # noqa: F401
from extensions import (
ext_celery,
ext_code_based_extension,
@@ -36,6 +36,7 @@ from extensions import (
ext_login,
ext_mail,
ext_migrate,
ext_proxy_fix,
ext_redis,
ext_sentry,
ext_storage,
@@ -45,7 +46,7 @@ from extensions.ext_login import login_manager
from libs.passport import PassportService
# TODO: Find a way to avoid importing models here
from models import account, dataset, model, source, task, tool, tools, web
from models import account, dataset, model, source, task, tool, tools, web # noqa: F401
from services.account_service import AccountService
# DO NOT REMOVE ABOVE
@@ -53,11 +54,9 @@ from services.account_service import AccountService
warnings.simplefilter("ignore", ResourceWarning)
# fix windows platform
if os.name == "nt":
os.system('tzutil /s "UTC"')
else:
os.environ["TZ"] = "UTC"
os.environ["TZ"] = "UTC"
# windows platform not support tzset
if hasattr(time, "tzset"):
time.tzset()
@@ -119,7 +118,7 @@ def create_app() -> Flask:
logging.basicConfig(
level=app.config.get("LOG_LEVEL"),
format=app.config.get("LOG_FORMAT"),
format=app.config["LOG_FORMAT"],
datefmt=app.config.get("LOG_DATEFORMAT"),
handlers=log_handlers,
force=True,
@@ -136,6 +135,7 @@ def create_app() -> Flask:
return datetime.utcfromtimestamp(seconds).astimezone(timezone).timetuple()
for handler in logging.root.handlers:
assert handler.formatter
handler.formatter.converter = time_converter
initialize_extensions(app)
register_blueprints(app)
@@ -158,13 +158,14 @@ def initialize_extensions(app):
ext_mail.init_app(app)
ext_hosting_provider.init_app(app)
ext_sentry.init_app(app)
ext_proxy_fix.init_app(app)
# Flask-Login configuration
@login_manager.request_loader
def load_user_from_request(request_from_flask_login):
"""Load user based on the request."""
if request.blueprint not in ["console", "inner_api"]:
if request.blueprint not in {"console", "inner_api"}:
return None
# Check if the user_id contains a dot, indicating the old format
auth_header = request.headers.get("Authorization", "")
@@ -183,10 +184,10 @@ def load_user_from_request(request_from_flask_login):
decoded = PassportService().verify(auth_token)
user_id = decoded.get("user_id")
account = AccountService.load_logged_in_account(account_id=user_id, token=auth_token)
if account:
contexts.tenant_id.set(account.current_tenant_id)
return account
logged_in_account = AccountService.load_logged_in_account(account_id=user_id)
if logged_in_account:
contexts.tenant_id.set(logged_in_account.current_tenant_id)
return logged_in_account
@login_manager.unauthorized_handler

View File

@@ -19,7 +19,7 @@ from extensions.ext_redis import redis_client
from libs.helper import email as email_validate
from libs.password import hash_password, password_pattern, valid_password
from libs.rsa import generate_key_pair
from models.account import Tenant
from models import Tenant
from models.dataset import Dataset, DatasetCollectionBinding, DocumentSegment
from models.dataset import Document as DatasetDocument
from models.model import Account, App, AppAnnotationSetting, AppMode, Conversation, MessageAnnotation
@@ -28,28 +28,28 @@ from services.account_service import RegisterService, TenantService
@click.command("reset-password", help="Reset the account password.")
@click.option("--email", prompt=True, help="The email address of the account whose password you need to reset")
@click.option("--new-password", prompt=True, help="the new password.")
@click.option("--password-confirm", prompt=True, help="the new password confirm.")
@click.option("--email", prompt=True, help="Account email to reset password for")
@click.option("--new-password", prompt=True, help="New password")
@click.option("--password-confirm", prompt=True, help="Confirm new password")
def reset_password(email, new_password, password_confirm):
"""
Reset password of owner account
Only available in SELF_HOSTED mode
"""
if str(new_password).strip() != str(password_confirm).strip():
click.echo(click.style("sorry. The two passwords do not match.", fg="red"))
click.echo(click.style("Passwords do not match.", fg="red"))
return
account = db.session.query(Account).filter(Account.email == email).one_or_none()
if not account:
click.echo(click.style("sorry. the account: [{}] not exist .".format(email), fg="red"))
click.echo(click.style("Account not found for email: {}".format(email), fg="red"))
return
try:
valid_password(new_password)
except:
click.echo(click.style("sorry. The passwords must match {} ".format(password_pattern), fg="red"))
click.echo(click.style("Invalid password. Must match {}".format(password_pattern), fg="red"))
return
# generate password salt
@@ -62,37 +62,37 @@ def reset_password(email, new_password, password_confirm):
account.password = base64_password_hashed
account.password_salt = base64_salt
db.session.commit()
click.echo(click.style("Congratulations! Password has been reset.", fg="green"))
click.echo(click.style("Password reset successfully.", fg="green"))
@click.command("reset-email", help="Reset the account email.")
@click.option("--email", prompt=True, help="The old email address of the account whose email you need to reset")
@click.option("--new-email", prompt=True, help="the new email.")
@click.option("--email-confirm", prompt=True, help="the new email confirm.")
@click.option("--email", prompt=True, help="Current account email")
@click.option("--new-email", prompt=True, help="New email")
@click.option("--email-confirm", prompt=True, help="Confirm new email")
def reset_email(email, new_email, email_confirm):
"""
Replace account email
:return:
"""
if str(new_email).strip() != str(email_confirm).strip():
click.echo(click.style("Sorry, new email and confirm email do not match.", fg="red"))
click.echo(click.style("New emails do not match.", fg="red"))
return
account = db.session.query(Account).filter(Account.email == email).one_or_none()
if not account:
click.echo(click.style("sorry. the account: [{}] not exist .".format(email), fg="red"))
click.echo(click.style("Account not found for email: {}".format(email), fg="red"))
return
try:
email_validate(new_email)
except:
click.echo(click.style("sorry. {} is not a valid email. ".format(email), fg="red"))
click.echo(click.style("Invalid email: {}".format(new_email), fg="red"))
return
account.email = new_email
db.session.commit()
click.echo(click.style("Congratulations!, email has been reset.", fg="green"))
click.echo(click.style("Email updated successfully.", fg="green"))
@click.command(
@@ -104,7 +104,7 @@ def reset_email(email, new_email, email_confirm):
)
@click.confirmation_option(
prompt=click.style(
"Are you sure you want to reset encrypt key pair?" " this operation cannot be rolled back!", fg="red"
"Are you sure you want to reset encrypt key pair? This operation cannot be rolled back!", fg="red"
)
)
def reset_encrypt_key_pair():
@@ -114,13 +114,13 @@ def reset_encrypt_key_pair():
Only support SELF_HOSTED mode.
"""
if dify_config.EDITION != "SELF_HOSTED":
click.echo(click.style("Sorry, only support SELF_HOSTED mode.", fg="red"))
click.echo(click.style("This command is only for SELF_HOSTED installations.", fg="red"))
return
tenants = db.session.query(Tenant).all()
for tenant in tenants:
if not tenant:
click.echo(click.style("Sorry, no workspace found. Please enter /install to initialize.", fg="red"))
click.echo(click.style("No workspaces found. Run /install first.", fg="red"))
return
tenant.encrypt_public_key = generate_key_pair(tenant.id)
@@ -131,18 +131,18 @@ def reset_encrypt_key_pair():
click.echo(
click.style(
"Congratulations! " "the asymmetric key pair of workspace {} has been reset.".format(tenant.id),
"Congratulations! The asymmetric key pair of workspace {} has been reset.".format(tenant.id),
fg="green",
)
)
@click.command("vdb-migrate", help="migrate vector db.")
@click.command("vdb-migrate", help="Migrate vector db.")
@click.option("--scope", default="all", prompt=False, help="The scope of vector database to migrate, Default is All.")
def vdb_migrate(scope: str):
if scope in ["knowledge", "all"]:
if scope in {"knowledge", "all"}:
migrate_knowledge_vector_database()
if scope in ["annotation", "all"]:
if scope in {"annotation", "all"}:
migrate_annotation_vector_database()
@@ -150,7 +150,7 @@ def migrate_annotation_vector_database():
"""
Migrate annotation datas to target vector database .
"""
click.echo(click.style("Start migrate annotation data.", fg="green"))
click.echo(click.style("Starting annotation data migration.", fg="green"))
create_count = 0
skipped_count = 0
total_count = 0
@@ -174,14 +174,14 @@ def migrate_annotation_vector_database():
f"Processing the {total_count} app {app.id}. " + f"{create_count} created, {skipped_count} skipped."
)
try:
click.echo("Create app annotation index: {}".format(app.id))
click.echo("Creating app annotation index: {}".format(app.id))
app_annotation_setting = (
db.session.query(AppAnnotationSetting).filter(AppAnnotationSetting.app_id == app.id).first()
)
if not app_annotation_setting:
skipped_count = skipped_count + 1
click.echo("App annotation setting is disabled: {}".format(app.id))
click.echo("App annotation setting disabled: {}".format(app.id))
continue
# get dataset_collection_binding info
dataset_collection_binding = (
@@ -190,7 +190,7 @@ def migrate_annotation_vector_database():
.first()
)
if not dataset_collection_binding:
click.echo("App annotation collection binding is not exist: {}".format(app.id))
click.echo("App annotation collection binding not found: {}".format(app.id))
continue
annotations = db.session.query(MessageAnnotation).filter(MessageAnnotation.app_id == app.id).all()
dataset = Dataset(
@@ -211,11 +211,11 @@ def migrate_annotation_vector_database():
documents.append(document)
vector = Vector(dataset, attributes=["doc_id", "annotation_id", "app_id"])
click.echo(f"Start to migrate annotation, app_id: {app.id}.")
click.echo(f"Migrating annotations for app: {app.id}.")
try:
vector.delete()
click.echo(click.style(f"Successfully delete vector index for app: {app.id}.", fg="green"))
click.echo(click.style(f"Deleted vector index for app {app.id}.", fg="green"))
except Exception as e:
click.echo(click.style(f"Failed to delete vector index for app {app.id}.", fg="red"))
raise e
@@ -223,12 +223,12 @@ def migrate_annotation_vector_database():
try:
click.echo(
click.style(
f"Start to created vector index with {len(documents)} annotations for app {app.id}.",
f"Creating vector index with {len(documents)} annotations for app {app.id}.",
fg="green",
)
)
vector.create(documents)
click.echo(click.style(f"Successfully created vector index for app {app.id}.", fg="green"))
click.echo(click.style(f"Created vector index for app {app.id}.", fg="green"))
except Exception as e:
click.echo(click.style(f"Failed to created vector index for app {app.id}.", fg="red"))
raise e
@@ -237,14 +237,14 @@ def migrate_annotation_vector_database():
except Exception as e:
click.echo(
click.style(
"Create app annotation index error: {} {}".format(e.__class__.__name__, str(e)), fg="red"
"Error creating app annotation index: {} {}".format(e.__class__.__name__, str(e)), fg="red"
)
)
continue
click.echo(
click.style(
f"Congratulations! Create {create_count} app annotation indexes, and skipped {skipped_count} apps.",
f"Migration complete. Created {create_count} app annotation indexes. Skipped {skipped_count} apps.",
fg="green",
)
)
@@ -254,7 +254,7 @@ def migrate_knowledge_vector_database():
"""
Migrate vector database data to the target vector database.
"""
click.echo(click.style("Start migrate vector db.", fg="green"))
click.echo(click.style("Starting vector database migration.", fg="green"))
create_count = 0
skipped_count = 0
total_count = 0
@@ -275,11 +275,10 @@ def migrate_knowledge_vector_database():
for dataset in datasets:
total_count = total_count + 1
click.echo(
f"Processing the {total_count} dataset {dataset.id}. "
+ f"{create_count} created, {skipped_count} skipped."
f"Processing the {total_count} dataset {dataset.id}. {create_count} created, {skipped_count} skipped."
)
try:
click.echo("Create dataset vdb index: {}".format(dataset.id))
click.echo("Creating dataset vector database index: {}".format(dataset.id))
if dataset.index_struct_dict:
if dataset.index_struct_dict["type"] == vector_type:
skipped_count = skipped_count + 1
@@ -300,7 +299,7 @@ def migrate_knowledge_vector_database():
if dataset_collection_binding:
collection_name = dataset_collection_binding.collection_name
else:
raise ValueError("Dataset Collection Bindings is not exist!")
raise ValueError("Dataset Collection Binding not found")
else:
dataset_id = dataset.id
collection_name = Dataset.gen_collection_name_by_id(dataset_id)
@@ -348,18 +347,24 @@ def migrate_knowledge_vector_database():
index_name = Dataset.gen_collection_name_by_id(dataset_id)
index_struct_dict = {"type": "elasticsearch", "vector_store": {"class_prefix": index_name}}
dataset.index_struct = json.dumps(index_struct_dict)
elif vector_type == VectorType.BAIDU:
dataset_id = dataset.id
collection_name = Dataset.gen_collection_name_by_id(dataset_id)
index_struct_dict = {
"type": VectorType.BAIDU,
"vector_store": {"class_prefix": collection_name},
}
dataset.index_struct = json.dumps(index_struct_dict)
else:
raise ValueError(f"Vector store {vector_type} is not supported.")
vector = Vector(dataset)
click.echo(f"Start to migrate dataset {dataset.id}.")
click.echo(f"Migrating dataset {dataset.id}.")
try:
vector.delete()
click.echo(
click.style(
f"Successfully delete vector index {collection_name} for dataset {dataset.id}.", fg="green"
)
click.style(f"Deleted vector index {collection_name} for dataset {dataset.id}.", fg="green")
)
except Exception as e:
click.echo(
@@ -411,14 +416,13 @@ def migrate_knowledge_vector_database():
try:
click.echo(
click.style(
f"Start to created vector index with {len(documents)} documents of {segments_count} segments for dataset {dataset.id}.",
f"Creating vector index with {len(documents)} documents of {segments_count}"
f" segments for dataset {dataset.id}.",
fg="green",
)
)
vector.create(documents)
click.echo(
click.style(f"Successfully created vector index for dataset {dataset.id}.", fg="green")
)
click.echo(click.style(f"Created vector index for dataset {dataset.id}.", fg="green"))
except Exception as e:
click.echo(click.style(f"Failed to created vector index for dataset {dataset.id}.", fg="red"))
raise e
@@ -429,13 +433,13 @@ def migrate_knowledge_vector_database():
except Exception as e:
db.session.rollback()
click.echo(
click.style("Create dataset index error: {} {}".format(e.__class__.__name__, str(e)), fg="red")
click.style("Error creating dataset index: {} {}".format(e.__class__.__name__, str(e)), fg="red")
)
continue
click.echo(
click.style(
f"Congratulations! Create {create_count} dataset indexes, and skipped {skipped_count} datasets.", fg="green"
f"Migration complete. Created {create_count} dataset indexes. Skipped {skipped_count} datasets.", fg="green"
)
)
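For reference, the index_struct serialized by the new BAIDU branch above would look roughly like this; a sketch with a placeholder collection name, assuming VectorType.BAIDU serializes to the string "baidu":

import json

index_struct_dict = {
    "type": "baidu",  # VectorType.BAIDU as its string value (assumed)
    "vector_store": {"class_prefix": "<collection-name-from-gen_collection_name_by_id>"},
}
print(json.dumps(index_struct_dict))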
@@ -445,7 +449,7 @@ def convert_to_agent_apps():
"""
Convert Agent Assistant to Agent App.
"""
click.echo(click.style("Start convert to agent apps.", fg="green"))
click.echo(click.style("Starting convert to agent apps.", fg="green"))
proceeded_app_ids = []
@@ -453,14 +457,14 @@ def convert_to_agent_apps():
# fetch first 1000 apps
sql_query = """SELECT a.id AS id FROM apps a
INNER JOIN app_model_configs am ON a.app_model_config_id=am.id
WHERE a.mode = 'chat'
AND am.agent_mode is not null
AND (
am.agent_mode like '%"strategy": "function_call"%'
OR am.agent_mode like '%"strategy": "react"%'
)
AND (
am.agent_mode like '{"enabled": true%'
OR am.agent_mode like '{"max_iteration": %'
) ORDER BY a.created_at DESC LIMIT 1000
"""
@@ -496,23 +500,23 @@ def convert_to_agent_apps():
except Exception as e:
click.echo(click.style("Convert app error: {} {}".format(e.__class__.__name__, str(e)), fg="red"))
click.echo(click.style("Congratulations! Converted {} agent apps.".format(len(proceeded_app_ids)), fg="green"))
click.echo(click.style("Conversion complete. Converted {} agent apps.".format(len(proceeded_app_ids)), fg="green"))
@click.command("add-qdrant-doc-id-index", help="add qdrant doc_id index.")
@click.option("--field", default="metadata.doc_id", prompt=False, help="index field , default is metadata.doc_id.")
@click.command("add-qdrant-doc-id-index", help="Add Qdrant doc_id index.")
@click.option("--field", default="metadata.doc_id", prompt=False, help="Index field , default is metadata.doc_id.")
def add_qdrant_doc_id_index(field: str):
click.echo(click.style("Start add qdrant doc_id index.", fg="green"))
click.echo(click.style("Starting Qdrant doc_id index creation.", fg="green"))
vector_type = dify_config.VECTOR_STORE
if vector_type != "qdrant":
click.echo(click.style("Sorry, only support qdrant vector store.", fg="red"))
click.echo(click.style("This command only supports Qdrant vector store.", fg="red"))
return
create_count = 0
try:
bindings = db.session.query(DatasetCollectionBinding).all()
if not bindings:
click.echo(click.style("Sorry, no dataset collection bindings found.", fg="red"))
click.echo(click.style("No dataset collection bindings found.", fg="red"))
return
import qdrant_client
from qdrant_client.http.exceptions import UnexpectedResponse
@@ -522,7 +526,7 @@ def add_qdrant_doc_id_index(field: str):
for binding in bindings:
if dify_config.QDRANT_URL is None:
raise ValueError("Qdrant url is required.")
raise ValueError("Qdrant URL is required.")
qdrant_config = QdrantConfig(
endpoint=dify_config.QDRANT_URL,
api_key=dify_config.QDRANT_API_KEY,
@@ -539,41 +543,39 @@ def add_qdrant_doc_id_index(field: str):
except UnexpectedResponse as e:
# Collection does not exist, so skip this binding
if e.status_code == 404:
click.echo(
click.style(f"Collection not found, collection_name:{binding.collection_name}.", fg="red")
)
click.echo(click.style(f"Collection not found: {binding.collection_name}.", fg="red"))
continue
# Some other error occurred, so re-raise the exception
else:
click.echo(
click.style(
f"Failed to create qdrant index, collection_name:{binding.collection_name}.", fg="red"
f"Failed to create Qdrant index for collection: {binding.collection_name}.", fg="red"
)
)
except Exception as e:
click.echo(click.style("Failed to create qdrant client.", fg="red"))
click.echo(click.style("Failed to create Qdrant client.", fg="red"))
click.echo(click.style(f"Congratulations! Create {create_count} collection indexes.", fg="green"))
click.echo(click.style(f"Index creation complete. Created {create_count} collection indexes.", fg="green"))
@click.command("create-tenant", help="Create account and tenant.")
@click.option("--email", prompt=True, help="The email address of the tenant account.")
@click.option("--name", prompt=True, help="The workspace name of the tenant account.")
@click.option("--email", prompt=True, help="Tenant account email.")
@click.option("--name", prompt=True, help="Workspace name.")
@click.option("--language", prompt=True, help="Account language, default: en-US.")
def create_tenant(email: str, language: Optional[str] = None, name: Optional[str] = None):
"""
Create tenant account
"""
if not email:
click.echo(click.style("Sorry, email is required.", fg="red"))
click.echo(click.style("Email is required.", fg="red"))
return
# Create account
email = email.strip()
if "@" not in email:
click.echo(click.style("Sorry, invalid email address.", fg="red"))
click.echo(click.style("Invalid email address.", fg="red"))
return
account_name = email.split("@")[0]
@@ -593,19 +595,19 @@ def create_tenant(email: str, language: Optional[str] = None, name: Optional[str
click.echo(
click.style(
"Congratulations! Account and tenant created.\n" "Account: {}\nPassword: {}".format(email, new_password),
"Account and tenant created.\nAccount: {}\nPassword: {}".format(email, new_password),
fg="green",
)
)
@click.command("upgrade-db", help="upgrade the database")
@click.command("upgrade-db", help="Upgrade the database")
def upgrade_db():
click.echo("Preparing database migration...")
lock = redis_client.lock(name="db_upgrade_lock", timeout=60)
if lock.acquire(blocking=False):
try:
click.echo(click.style("Start database migration.", fg="green"))
click.echo(click.style("Starting database migration.", fg="green"))
# run db migration
import flask_migrate
@@ -615,7 +617,7 @@ def upgrade_db():
click.echo(click.style("Database migration successful!", fg="green"))
except Exception as e:
logging.exception(f"Database migration failed, error: {e}")
logging.exception(f"Database migration failed: {e}")
finally:
lock.release()
else:
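Condensed, the locking pattern above looks like the following; a sketch assuming a reachable Redis and an active Flask app context for the migration call:

import flask_migrate
import redis

r = redis.Redis(host="localhost", port=6379, db=0)  # placeholder connection
lock = r.lock(name="db_upgrade_lock", timeout=60)   # auto-expires after 60 seconds
if lock.acquire(blocking=False):  # non-blocking: bail out if another worker holds it
    try:
        flask_migrate.upgrade()   # requires an active Flask application context
    finally:
        lock.release()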
@@ -627,7 +629,7 @@ def fix_app_site_missing():
"""
Fix app related site missing issue.
"""
click.echo(click.style("Start fix app related site missing issue.", fg="green"))
click.echo(click.style("Starting fix for missing app-related sites.", fg="green"))
failed_app_ids = []
while True:
@@ -650,22 +652,22 @@ where sites.id is null limit 1000"""
if tenant:
accounts = tenant.get_accounts()
if not accounts:
print("Fix app {} failed.".format(app.id))
print("Fix failed for app {}".format(app.id))
continue
account = accounts[0]
print("Fix app {} related site missing issue.".format(app.id))
print("Fixing missing site for app {}".format(app.id))
app_was_created.send(app, account=account)
except Exception as e:
failed_app_ids.append(app_id)
click.echo(click.style("Fix app {} related site missing issue failed!".format(app_id), fg="red"))
click.echo(click.style("Failed to fix missing site for app {}".format(app_id), fg="red"))
logging.exception(f"Fix app related site missing issue failed, error: {e}")
continue
if not processed_count:
break
click.echo(click.style("Congratulations! Fix app related site missing issue successful!", fg="green"))
click.echo(click.style("Fix for missing app-related sites completed successfully!", fg="green"))
def register_commands(app):


@@ -4,30 +4,30 @@ from pydantic_settings import BaseSettings
class DeploymentConfig(BaseSettings):
"""
Deployment configs
Configuration settings for application deployment
"""
APPLICATION_NAME: str = Field(
description="application name",
description="Name of the application, used for identification and logging purposes",
default="langgenius/dify",
)
DEBUG: bool = Field(
description="whether to enable debug mode.",
description="Enable debug mode for additional logging and development features",
default=False,
)
TESTING: bool = Field(
description="",
description="Enable testing mode for running automated tests",
default=False,
)
EDITION: str = Field(
description="deployment edition",
description="Deployment edition of the application (e.g., 'SELF_HOSTED', 'CLOUD')",
default="SELF_HOSTED",
)
DEPLOY_ENV: str = Field(
description="deployment environment, default to PRODUCTION.",
description="Deployment environment (e.g., 'PRODUCTION', 'DEVELOPMENT'), default to PRODUCTION",
default="PRODUCTION",
)


@@ -4,17 +4,17 @@ from pydantic_settings import BaseSettings
class EnterpriseFeatureConfig(BaseSettings):
"""
Enterprise feature configs.
Configuration for enterprise-level features.
**Before using, please contact business@dify.ai by email to inquire about licensing matters.**
"""
ENTERPRISE_ENABLED: bool = Field(
description="whether to enable enterprise features."
description="Enable or disable enterprise-level features."
"Before using, please contact business@dify.ai by email to inquire about licensing matters.",
default=False,
)
CAN_REPLACE_LOGO: bool = Field(
description="whether to allow replacing enterprise logo.",
description="Allow customization of the enterprise logo.",
default=False,
)


@@ -6,30 +6,31 @@ from pydantic_settings import BaseSettings
class NotionConfig(BaseSettings):
"""
Notion integration configs
Configuration settings for Notion integration
"""
NOTION_CLIENT_ID: Optional[str] = Field(
description="Notion client ID",
description="Client ID for Notion API authentication. Required for OAuth 2.0 flow.",
default=None,
)
NOTION_CLIENT_SECRET: Optional[str] = Field(
description="Notion client secret key",
description="Client secret for Notion API authentication. Required for OAuth 2.0 flow.",
default=None,
)
NOTION_INTEGRATION_TYPE: Optional[str] = Field(
description="Notion integration type, default to None, available values: internal.",
description="Type of Notion integration."
" Set to 'internal' for internal integrations, or None for public integrations.",
default=None,
)
NOTION_INTERNAL_SECRET: Optional[str] = Field(
description="Notion internal secret key",
description="Secret key for internal Notion integrations. Required when NOTION_INTEGRATION_TYPE is 'internal'.",
default=None,
)
NOTION_INTEGRATION_TOKEN: Optional[str] = Field(
description="Notion integration token",
description="Integration token for Notion API access. Used for direct API calls without OAuth flow.",
default=None,
)


@@ -6,20 +6,23 @@ from pydantic_settings import BaseSettings
class SentryConfig(BaseSettings):
"""
Sentry configs
Configuration settings for Sentry error tracking and performance monitoring
"""
SENTRY_DSN: Optional[str] = Field(
description="Sentry DSN",
description="Sentry Data Source Name (DSN)."
" This is the unique identifier of your Sentry project, used to send events to the correct project.",
default=None,
)
SENTRY_TRACES_SAMPLE_RATE: NonNegativeFloat = Field(
description="Sentry trace sample rate",
description="Sample rate for Sentry performance monitoring traces."
" Value between 0.0 and 1.0, where 1.0 means 100% of traces are sent to Sentry.",
default=1.0,
)
SENTRY_PROFILES_SAMPLE_RATE: NonNegativeFloat = Field(
description="Sentry profiles sample rate",
description="Sample rate for Sentry profiling."
" Value between 0.0 and 1.0, where 1.0 means 100% of profiles are sent to Sentry.",
default=1.0,
)
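For context, these three settings map directly onto the Sentry SDK's init call; a sketch with a placeholder DSN:

import sentry_sdk

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    traces_sample_rate=1.0,    # send 100% of performance traces
    profiles_sample_rate=1.0,  # profile 100% of sampled transactions
)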


@@ -1,4 +1,4 @@
from typing import Annotated, Optional
from typing import Annotated, Literal, Optional
from pydantic import AliasChoices, Field, HttpUrl, NegativeInt, NonNegativeInt, PositiveInt, computed_field
from pydantic_settings import BaseSettings
@@ -8,145 +8,143 @@ from configs.feature.hosted_service import HostedServiceConfig
class SecurityConfig(BaseSettings):
"""
Secret Key configs
Security-related configurations for the application
"""
SECRET_KEY: Optional[str] = Field(
description="Your App secret key will be used for securely signing the session cookie"
SECRET_KEY: str = Field(
description="Secret key for secure session cookie signing."
"Make sure you are changing this key for your deployment with a strong key."
"You can generate a strong key using `openssl rand -base64 42`."
"Alternatively you can set it with `SECRET_KEY` environment variable.",
default=None,
"Generate a strong key using `openssl rand -base64 42` or set via the `SECRET_KEY` environment variable.",
default="",
)
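As the description suggests, a strong key is easy to generate; a Python equivalent of `openssl rand -base64 42`:

import base64
import secrets

# 42 random bytes, base64-encoded: same shape as `openssl rand -base64 42`.
print(base64.b64encode(secrets.token_bytes(42)).decode())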
RESET_PASSWORD_TOKEN_EXPIRY_HOURS: PositiveInt = Field(
description="Expiry time in hours for reset token",
description="Duration in hours for which a password reset token remains valid",
default=24,
)
class AppExecutionConfig(BaseSettings):
"""
App Execution configs
Configuration parameters for application execution
"""
APP_MAX_EXECUTION_TIME: PositiveInt = Field(
description="execution timeout in seconds for app execution",
description="Maximum allowed execution time for the application in seconds",
default=1200,
)
APP_MAX_ACTIVE_REQUESTS: NonNegativeInt = Field(
description="max active request per app, 0 means unlimited",
description="Maximum number of concurrent active requests per app (0 for unlimited)",
default=0,
)
class CodeExecutionSandboxConfig(BaseSettings):
"""
Code Execution Sandbox configs
Configuration for the code execution sandbox environment
"""
CODE_EXECUTION_ENDPOINT: HttpUrl = Field(
description="endpoint URL of code execution servcie",
description="URL endpoint for the code execution service",
default="http://sandbox:8194",
)
CODE_EXECUTION_API_KEY: str = Field(
description="API key for code execution service",
description="API key for accessing the code execution service",
default="dify-sandbox",
)
CODE_EXECUTION_CONNECT_TIMEOUT: Optional[float] = Field(
description="connect timeout in seconds for code execution request",
description="Connection timeout in seconds for code execution requests",
default=10.0,
)
CODE_EXECUTION_READ_TIMEOUT: Optional[float] = Field(
description="read timeout in seconds for code execution request",
description="Read timeout in seconds for code execution requests",
default=60.0,
)
CODE_EXECUTION_WRITE_TIMEOUT: Optional[float] = Field(
description="write timeout in seconds for code execution request",
description="Write timeout in seconds for code execution request",
default=10.0,
)
CODE_MAX_NUMBER: PositiveInt = Field(
description="max depth for code execution",
description="Maximum allowed numeric value in code execution",
default=9223372036854775807,
)
CODE_MIN_NUMBER: NegativeInt = Field(
description="",
description="Minimum allowed numeric value in code execution",
default=-9223372036854775807,
)
CODE_MAX_DEPTH: PositiveInt = Field(
description="max depth for code execution",
description="Maximum allowed depth for nested structures in code execution",
default=5,
)
CODE_MAX_PRECISION: PositiveInt = Field(
description="max precision digits for float type in code execution",
description="mMaximum number of decimal places for floating-point numbers in code execution",
default=20,
)
CODE_MAX_STRING_LENGTH: PositiveInt = Field(
description="max string length for code execution",
description="Maximum allowed length for strings in code execution",
default=80000,
)
CODE_MAX_STRING_ARRAY_LENGTH: PositiveInt = Field(
description="",
description="Maximum allowed length for string arrays in code execution",
default=30,
)
CODE_MAX_OBJECT_ARRAY_LENGTH: PositiveInt = Field(
description="",
description="Maximum allowed length for object arrays in code execution",
default=30,
)
CODE_MAX_NUMBER_ARRAY_LENGTH: PositiveInt = Field(
description="",
description="Maximum allowed length for numeric arrays in code execution",
default=1000,
)
class EndpointConfig(BaseSettings):
"""
Module URL configs
Configuration for various application endpoints and URLs
"""
CONSOLE_API_URL: str = Field(
description="The backend URL prefix of the console API."
"used to concatenate the login authorization callback or notion integration callback.",
description="Base URL for the console API,"
"used for login authentication callback or notion integration callbacks",
default="",
)
CONSOLE_WEB_URL: str = Field(
description="The front-end URL prefix of the console web."
"used to concatenate some front-end addresses and for CORS configuration use.",
description="Base URL for the console web interface," "used for frontend references and CORS configuration",
default="",
)
SERVICE_API_URL: str = Field(
description="Service API Url prefix." "used to display Service API Base Url to the front-end.",
description="Base URL for the service API, displayed to users for API access",
default="",
)
APP_WEB_URL: str = Field(
description="WebApp Url prefix." "used to display WebAPP API Base Url to the front-end.",
description="Base URL for the web application, used for frontend references",
default="",
)
class FileAccessConfig(BaseSettings):
"""
File Access configs
Configuration for file access and handling
"""
FILES_URL: str = Field(
description="File preview or download Url prefix."
" used to display File preview or download Url to the front-end or as Multi-model inputs;"
description="Base URL for file preview or download,"
" used for frontend display and multi-model inputs"
"Url is signed and has expiration time.",
validation_alias=AliasChoices("FILES_URL", "CONSOLE_API_URL"),
alias_priority=1,
@@ -154,49 +152,59 @@ class FileAccessConfig(BaseSettings):
)
FILES_ACCESS_TIMEOUT: int = Field(
description="timeout in seconds for file accessing",
description="Expiration time in seconds for file access URLs",
default=300,
)
class FileUploadConfig(BaseSettings):
"""
File Uploading configs
Configuration for file upload limitations
"""
UPLOAD_FILE_SIZE_LIMIT: NonNegativeInt = Field(
description="size limit in Megabytes for uploading files",
description="Maximum allowed file size for uploads in megabytes",
default=15,
)
UPLOAD_FILE_BATCH_LIMIT: NonNegativeInt = Field(
description="batch size limit for uploading files",
description="Maximum number of files allowed in a single upload batch",
default=5,
)
UPLOAD_IMAGE_FILE_SIZE_LIMIT: NonNegativeInt = Field(
description="image file size limit in Megabytes for uploading files",
description="Maximum allowed image file size for uploads in megabytes",
default=10,
)
UPLOAD_VIDEO_FILE_SIZE_LIMIT: NonNegativeInt = Field(
description="video file size limit in Megabytes for uploading files",
default=100,
)
UPLOAD_AUDIO_FILE_SIZE_LIMIT: NonNegativeInt = Field(
description="audio file size limit in Megabytes for uploading files",
default=50,
)
BATCH_UPLOAD_LIMIT: NonNegativeInt = Field(
description="", # todo: to be clarified
description="Maximum number of files allowed in a batch upload operation",
default=20,
)
class HttpConfig(BaseSettings):
"""
HTTP configs
HTTP-related configurations for the application
"""
API_COMPRESSION_ENABLED: bool = Field(
description="whether to enable HTTP response compression of gzip",
description="Enable or disable gzip compression for HTTP responses",
default=False,
)
inner_CONSOLE_CORS_ALLOW_ORIGINS: str = Field(
description="",
description="Comma-separated list of allowed origins for CORS in the console",
validation_alias=AliasChoices("CONSOLE_CORS_ALLOW_ORIGINS", "CONSOLE_WEB_URL"),
default="",
)
@@ -218,359 +226,372 @@ class HttpConfig(BaseSettings):
return self.inner_WEB_API_CORS_ALLOW_ORIGINS.split(",")
HTTP_REQUEST_MAX_CONNECT_TIMEOUT: Annotated[
PositiveInt, Field(ge=10, description="connect timeout in seconds for HTTP request")
PositiveInt, Field(ge=10, description="Maximum connection timeout in seconds for HTTP requests")
] = 10
HTTP_REQUEST_MAX_READ_TIMEOUT: Annotated[
PositiveInt, Field(ge=60, description="read timeout in seconds for HTTP request")
PositiveInt, Field(ge=60, description="Maximum read timeout in seconds for HTTP requests")
] = 60
HTTP_REQUEST_MAX_WRITE_TIMEOUT: Annotated[
PositiveInt, Field(ge=10, description="read timeout in seconds for HTTP request")
PositiveInt, Field(ge=10, description="Maximum write timeout in seconds for HTTP requests")
] = 20
HTTP_REQUEST_NODE_MAX_BINARY_SIZE: PositiveInt = Field(
description="",
description="Maximum allowed size in bytes for binary data in HTTP requests",
default=10 * 1024 * 1024,
)
HTTP_REQUEST_NODE_MAX_TEXT_SIZE: PositiveInt = Field(
description="",
description="Maximum allowed size in bytes for text data in HTTP requests",
default=1 * 1024 * 1024,
)
SSRF_PROXY_HTTP_URL: Optional[str] = Field(
description="HTTP URL for SSRF proxy",
description="Proxy URL for HTTP requests to prevent Server-Side Request Forgery (SSRF)",
default=None,
)
SSRF_PROXY_HTTPS_URL: Optional[str] = Field(
description="HTTPS URL for SSRF proxy",
description="Proxy URL for HTTPS requests to prevent Server-Side Request Forgery (SSRF)",
default=None,
)
RESPECT_XFORWARD_HEADERS_ENABLED: bool = Field(
description="Enable or disable the X-Forwarded-For Proxy Fix middleware from Werkzeug"
" to respect X-* headers to redirect clients",
default=False,
)
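The inner_* field plus computed-property pattern used for the CORS settings above can be illustrated in isolation; a self-contained sketch with simplified names, assuming pydantic v2 and pydantic-settings:

import os

from pydantic import AliasChoices, Field, computed_field
from pydantic_settings import BaseSettings

class CorsDemo(BaseSettings):
    inner_CONSOLE_CORS_ALLOW_ORIGINS: str = Field(
        validation_alias=AliasChoices("CONSOLE_CORS_ALLOW_ORIGINS", "CONSOLE_WEB_URL"),
        default="",
    )

    @computed_field
    @property
    def CONSOLE_CORS_ALLOW_ORIGINS(self) -> list[str]:
        # Expose the comma-separated env value as a list of origins.
        return self.inner_CONSOLE_CORS_ALLOW_ORIGINS.split(",")

os.environ["CONSOLE_CORS_ALLOW_ORIGINS"] = "https://a.example,https://b.example"
print(CorsDemo().CONSOLE_CORS_ALLOW_ORIGINS)  # ['https://a.example', 'https://b.example']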
class InnerAPIConfig(BaseSettings):
"""
Inner API configs
Configuration for internal API functionality
"""
INNER_API: bool = Field(
description="whether to enable the inner API",
description="Enable or disable the internal API",
default=False,
)
INNER_API_KEY: Optional[str] = Field(
description="The inner API key is used to authenticate the inner API",
description="API key for accessing the internal API",
default=None,
)
class LoggingConfig(BaseSettings):
"""
Logging configs
Configuration for application logging
"""
LOG_LEVEL: str = Field(
description="Log output level, default to INFO." "It is recommended to set it to ERROR for production.",
description="Logging level, default to INFO. Set to ERROR for production environments.",
default="INFO",
)
LOG_FILE: Optional[str] = Field(
description="logging output file path",
description="File path for log output.",
default=None,
)
LOG_FORMAT: str = Field(
description="log format",
description="Format string for log messages",
default="%(asctime)s.%(msecs)03d %(levelname)s [%(threadName)s] [%(filename)s:%(lineno)d] - %(message)s",
)
LOG_DATEFORMAT: Optional[str] = Field(
description="log date format",
description="Date format string for log timestamps",
default=None,
)
LOG_TZ: Optional[str] = Field(
description="specify log timezone, eg: America/New_York",
description="Timezone for log timestamps (e.g., 'America/New_York')",
default=None,
)
class ModelLoadBalanceConfig(BaseSettings):
"""
Model load balance configs
Configuration for model load balancing
"""
MODEL_LB_ENABLED: bool = Field(
description="whether to enable model load balancing",
description="Enable or disable load balancing for models",
default=False,
)
class BillingConfig(BaseSettings):
"""
Platform Billing Configurations
Configuration for platform billing features
"""
BILLING_ENABLED: bool = Field(
description="whether to enable billing",
description="Enable or disable billing functionality",
default=False,
)
class UpdateConfig(BaseSettings):
"""
Update configs
Configuration for application update checks
"""
CHECK_UPDATE_URL: str = Field(
description="url for checking updates",
description="URL to check for application updates",
default="https://updates.dify.ai",
)
class WorkflowConfig(BaseSettings):
"""
Workflow feature configs
Configuration for workflow execution
"""
WORKFLOW_MAX_EXECUTION_STEPS: PositiveInt = Field(
description="max execution steps in single workflow execution",
description="Maximum number of steps allowed in a single workflow execution",
default=500,
)
WORKFLOW_MAX_EXECUTION_TIME: PositiveInt = Field(
description="max execution time in seconds in single workflow execution",
description="Maximum execution time in seconds for a single workflow",
default=1200,
)
WORKFLOW_CALL_MAX_DEPTH: PositiveInt = Field(
description="max depth of calling in single workflow execution",
description="Maximum allowed depth for nested workflow calls",
default=5,
)
MAX_VARIABLE_SIZE: PositiveInt = Field(
description="The maximum size in bytes of a variable. default to 5KB.",
default=5 * 1024,
description="Maximum size in bytes for a single variable in workflows. Default to 200 KB.",
default=200 * 1024,
)
class OAuthConfig(BaseSettings):
class AuthConfig(BaseSettings):
"""
oauth configs
Configuration for authentication and OAuth
"""
OAUTH_REDIRECT_PATH: str = Field(
description="redirect path for OAuth",
description="Redirect path for OAuth authentication callbacks",
default="/console/api/oauth/authorize",
)
GITHUB_CLIENT_ID: Optional[str] = Field(
description="GitHub client id for OAuth",
description="GitHub OAuth client ID",
default=None,
)
GITHUB_CLIENT_SECRET: Optional[str] = Field(
description="GitHub client secret key for OAuth",
description="GitHub OAuth client secret",
default=None,
)
GOOGLE_CLIENT_ID: Optional[str] = Field(
description="Google client id for OAuth",
description="Google OAuth client ID",
default=None,
)
GOOGLE_CLIENT_SECRET: Optional[str] = Field(
description="Google client secret key for OAuth",
description="Google OAuth client secret",
default=None,
)
ACCESS_TOKEN_EXPIRE_MINUTES: PositiveInt = Field(
description="Expiration time for access tokens in minutes",
default=60,
)
class ModerationConfig(BaseSettings):
"""
Moderation in app configs.
Configuration for content moderation
"""
MODERATION_BUFFER_SIZE: PositiveInt = Field(
description="buffer size for moderation",
description="Size of the buffer for content moderation processing",
default=300,
)
class ToolConfig(BaseSettings):
"""
Tool configs
Configuration for tool management
"""
TOOL_ICON_CACHE_MAX_AGE: PositiveInt = Field(
description="max age in seconds for tool icon caching",
description="Maximum age in seconds for caching tool icons",
default=3600,
)
class MailConfig(BaseSettings):
"""
Mail Configurations
Configuration for email services
"""
MAIL_TYPE: Optional[str] = Field(
description="Mail provider type name, default to None, availabile values are `smtp` and `resend`.",
description="Email service provider type ('smtp' or 'resend'), default to None.",
default=None,
)
MAIL_DEFAULT_SEND_FROM: Optional[str] = Field(
description="default email address for sending from ",
description="Default email address to use as the sender",
default=None,
)
RESEND_API_KEY: Optional[str] = Field(
description="API key for Resend",
description="API key for Resend email service",
default=None,
)
RESEND_API_URL: Optional[str] = Field(
description="API URL for Resend",
description="API URL for Resend email service",
default=None,
)
SMTP_SERVER: Optional[str] = Field(
description="smtp server host",
description="SMTP server hostname",
default=None,
)
SMTP_PORT: Optional[int] = Field(
description="smtp server port",
description="SMTP server port number",
default=465,
)
SMTP_USERNAME: Optional[str] = Field(
description="smtp server username",
description="Username for SMTP authentication",
default=None,
)
SMTP_PASSWORD: Optional[str] = Field(
description="smtp server password",
description="Password for SMTP authentication",
default=None,
)
SMTP_USE_TLS: bool = Field(
description="whether to use TLS connection to smtp server",
description="Enable TLS encryption for SMTP connections",
default=False,
)
SMTP_OPPORTUNISTIC_TLS: bool = Field(
description="whether to use opportunistic TLS connection to smtp server",
description="Enable opportunistic TLS for SMTP connections",
default=False,
)
class RagEtlConfig(BaseSettings):
"""
RAG ETL Configurations.
Configuration for RAG ETL processes
"""
# TODO: This config is not only for rag etl, it is also for file upload, we should move it to file upload config
ETL_TYPE: str = Field(
description="RAG ETL type name, default to `dify`, available values are `dify` and `Unstructured`. ",
description="RAG ETL type ('dify' or 'Unstructured'), default to 'dify'",
default="dify",
)
KEYWORD_DATA_SOURCE_TYPE: str = Field(
description="source type for keyword data, default to `database`, available values are `database` .",
description="Data source type for keyword extraction"
" ('database' or other supported types), default to 'database'",
default="database",
)
UNSTRUCTURED_API_URL: Optional[str] = Field(
description="API URL for Unstructured",
description="API URL for Unstructured.io service",
default=None,
)
UNSTRUCTURED_API_KEY: Optional[str] = Field(
description="API key for Unstructured",
description="API key for Unstructured.io service",
default=None,
)
class DataSetConfig(BaseSettings):
"""
Dataset configs
Configuration for dataset management
"""
CLEAN_DAY_SETTING: PositiveInt = Field(
description="interval in days for cleaning up dataset",
description="Interval in days for dataset cleanup operations",
default=30,
)
DATASET_OPERATOR_ENABLED: bool = Field(
description="whether to enable dataset operator",
description="Enable or disable dataset operator functionality",
default=False,
)
class WorkspaceConfig(BaseSettings):
"""
Workspace configs
Configuration for workspace management
"""
INVITE_EXPIRY_HOURS: PositiveInt = Field(
description="workspaces invitation expiration in hours",
description="Expiration time in hours for workspace invitation links",
default=72,
)
class IndexingConfig(BaseSettings):
"""
Indexing configs.
Configuration for indexing operations
"""
INDEXING_MAX_SEGMENTATION_TOKENS_LENGTH: PositiveInt = Field(
description="max segmentation token length for indexing",
description="Maximum token length for text segmentation during indexing",
default=1000,
)
class ImageFormatConfig(BaseSettings):
MULTIMODAL_SEND_IMAGE_FORMAT: str = Field(
description="multi model send image format, support base64, url, default is base64",
MULTIMODAL_SEND_IMAGE_FORMAT: Literal["base64", "url"] = Field(
description="Format for sending images in multimodal contexts ('base64' or 'url'), default is base64",
default="base64",
)
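The switch to Literal means an invalid value now fails when the settings load rather than later at call sites; a self-contained sketch of that behavior:

import os
from typing import Literal

from pydantic import Field, ValidationError
from pydantic_settings import BaseSettings

class ImageFormatDemo(BaseSettings):
    MULTIMODAL_SEND_IMAGE_FORMAT: Literal["base64", "url"] = Field(default="base64")

os.environ["MULTIMODAL_SEND_IMAGE_FORMAT"] = "binary"  # not an allowed literal
try:
    ImageFormatDemo()
except ValidationError as exc:
    print("rejected:", exc.errors()[0]["msg"])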
class CeleryBeatConfig(BaseSettings):
CELERY_BEAT_SCHEDULER_TIME: int = Field(
description="the time of the celery scheduler, default to 1 day",
description="Interval in days for Celery Beat scheduler execution, default to 1 day",
default=1,
)
class PositionConfig(BaseSettings):
POSITION_PROVIDER_PINS: str = Field(
description="The heads of model providers",
description="Comma-separated list of pinned model providers",
default="",
)
POSITION_PROVIDER_INCLUDES: str = Field(
description="The included model providers",
description="Comma-separated list of included model providers",
default="",
)
POSITION_PROVIDER_EXCLUDES: str = Field(
description="The excluded model providers",
description="Comma-separated list of excluded model providers",
default="",
)
POSITION_TOOL_PINS: str = Field(
description="The heads of tools",
description="Comma-separated list of pinned tools",
default="",
)
POSITION_TOOL_INCLUDES: str = Field(
description="The included tools",
description="Comma-separated list of included tools",
default="",
)
POSITION_TOOL_EXCLUDES: str = Field(
description="The excluded tools",
description="Comma-separated list of excluded tools",
default="",
)
@@ -602,6 +623,7 @@ class PositionConfig(BaseSettings):
class FeatureConfig(
# place the configs in alphabet order
AppExecutionConfig,
AuthConfig,
BillingConfig,
CodeExecutionSandboxConfig,
DataSetConfig,
@@ -616,14 +638,13 @@ class FeatureConfig(
MailConfig,
ModelLoadBalanceConfig,
ModerationConfig,
OAuthConfig,
PositionConfig,
RagEtlConfig,
SecurityConfig,
ToolConfig,
UpdateConfig,
WorkflowConfig,
WorkspaceConfig,
# hosted services config
HostedServiceConfig,
CeleryBeatConfig,
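FeatureConfig composes all of these BaseSettings mixins through multiple inheritance, so a single settings object exposes every field, each overridable via its environment variable; a usage sketch with a hypothetical import path:

from configs import dify_config  # hypothetical import path

print(dify_config.APP_MAX_EXECUTION_TIME)  # from AppExecutionConfig
print(dify_config.OAUTH_REDIRECT_PATH)     # from AuthConfig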


@@ -6,31 +6,31 @@ from pydantic_settings import BaseSettings
class HostedOpenAiConfig(BaseSettings):
"""
Hosted OpenAI service config
Configuration for hosted OpenAI service
"""
HOSTED_OPENAI_API_KEY: Optional[str] = Field(
description="",
description="API key for hosted OpenAI service",
default=None,
)
HOSTED_OPENAI_API_BASE: Optional[str] = Field(
description="",
description="Base URL for hosted OpenAI API",
default=None,
)
HOSTED_OPENAI_API_ORGANIZATION: Optional[str] = Field(
description="",
description="Organization ID for hosted OpenAI service",
default=None,
)
HOSTED_OPENAI_TRIAL_ENABLED: bool = Field(
description="",
description="Enable trial access to hosted OpenAI service",
default=False,
)
HOSTED_OPENAI_TRIAL_MODELS: str = Field(
description="",
description="Comma-separated list of available models for trial access",
default="gpt-3.5-turbo,"
"gpt-3.5-turbo-1106,"
"gpt-3.5-turbo-instruct,"
@@ -42,17 +42,17 @@ class HostedOpenAiConfig(BaseSettings):
)
HOSTED_OPENAI_QUOTA_LIMIT: NonNegativeInt = Field(
description="",
description="Quota limit for hosted OpenAI service usage",
default=200,
)
HOSTED_OPENAI_PAID_ENABLED: bool = Field(
description="",
description="Enable paid access to hosted OpenAI service",
default=False,
)
HOSTED_OPENAI_PAID_MODELS: str = Field(
description="",
description="Comma-separated list of available models for paid access",
default="gpt-4,"
"gpt-4-turbo-preview,"
"gpt-4-turbo-2024-04-09,"
@@ -71,124 +71,122 @@ class HostedOpenAiConfig(BaseSettings):
class HostedAzureOpenAiConfig(BaseSettings):
"""
Hosted OpenAI service config
Configuration for hosted Azure OpenAI service
"""
HOSTED_AZURE_OPENAI_ENABLED: bool = Field(
description="",
description="Enable hosted Azure OpenAI service",
default=False,
)
HOSTED_AZURE_OPENAI_API_KEY: Optional[str] = Field(
description="",
description="API key for hosted Azure OpenAI service",
default=None,
)
HOSTED_AZURE_OPENAI_API_BASE: Optional[str] = Field(
description="",
description="Base URL for hosted Azure OpenAI API",
default=None,
)
HOSTED_AZURE_OPENAI_QUOTA_LIMIT: NonNegativeInt = Field(
description="",
description="Quota limit for hosted Azure OpenAI service usage",
default=200,
)
class HostedAnthropicConfig(BaseSettings):
"""
Hosted Azure OpenAI service config
Configuration for hosted Anthropic service
"""
HOSTED_ANTHROPIC_API_BASE: Optional[str] = Field(
description="",
description="Base URL for hosted Anthropic API",
default=None,
)
HOSTED_ANTHROPIC_API_KEY: Optional[str] = Field(
description="",
description="API key for hosted Anthropic service",
default=None,
)
HOSTED_ANTHROPIC_TRIAL_ENABLED: bool = Field(
description="",
description="Enable trial access to hosted Anthropic service",
default=False,
)
HOSTED_ANTHROPIC_QUOTA_LIMIT: NonNegativeInt = Field(
description="",
description="Quota limit for hosted Anthropic service usage",
default=600000,
)
HOSTED_ANTHROPIC_PAID_ENABLED: bool = Field(
description="",
description="Enable paid access to hosted Anthropic service",
default=False,
)
class HostedMinmaxConfig(BaseSettings):
"""
Hosted Minmax service config
Configuration for hosted MiniMax service
"""
HOSTED_MINIMAX_ENABLED: bool = Field(
description="",
description="Enable hosted Minmax service",
default=False,
)
class HostedSparkConfig(BaseSettings):
"""
Hosted Spark service config
Configuration for hosted Spark service
"""
HOSTED_SPARK_ENABLED: bool = Field(
description="",
description="Enable hosted Spark service",
default=False,
)
class HostedZhipuAIConfig(BaseSettings):
"""
Hosted Minmax service config
Configuration for hosted ZhipuAI service
"""
HOSTED_ZHIPUAI_ENABLED: bool = Field(
description="",
description="Enable hosted ZhipuAI service",
default=False,
)
class HostedModerationConfig(BaseSettings):
"""
Hosted Moderation service config
Configuration for hosted Moderation service
"""
HOSTED_MODERATION_ENABLED: bool = Field(
description="",
description="Enable hosted Moderation service",
default=False,
)
HOSTED_MODERATION_PROVIDERS: str = Field(
description="",
description="Comma-separated list of moderation providers",
default="",
)
class HostedFetchAppTemplateConfig(BaseSettings):
"""
Hosted Moderation service config
Configuration for fetching app templates
"""
HOSTED_FETCH_APP_TEMPLATES_MODE: str = Field(
description="the mode for fetching app templates,"
" default to remote,"
" available values: remote, db, builtin",
description="Mode for fetching app templates: remote, db, or builtin" " default to remote,",
default="remote",
)
HOSTED_FETCH_APP_TEMPLATES_REMOTE_DOMAIN: str = Field(
description="the domain for fetching remote app templates",
description="Domain for fetching remote app templates",
default="https://tmpl.dify.ai",
)


@@ -1,16 +1,20 @@
from typing import Any, Optional
from urllib.parse import quote_plus
from pydantic import Field, NonNegativeInt, PositiveInt, computed_field
from pydantic import Field, NonNegativeInt, PositiveFloat, PositiveInt, computed_field
from pydantic_settings import BaseSettings
from configs.middleware.cache.redis_config import RedisConfig
from configs.middleware.storage.aliyun_oss_storage_config import AliyunOSSStorageConfig
from configs.middleware.storage.amazon_s3_storage_config import S3StorageConfig
from configs.middleware.storage.azure_blob_storage_config import AzureBlobStorageConfig
from configs.middleware.storage.baidu_obs_storage_config import BaiduOBSStorageConfig
from configs.middleware.storage.google_cloud_storage_config import GoogleCloudStorageConfig
from configs.middleware.storage.huawei_obs_storage_config import HuaweiCloudOBSStorageConfig
from configs.middleware.storage.oci_storage_config import OCIStorageConfig
from configs.middleware.storage.supabase_storage_config import SupabaseStorageConfig
from configs.middleware.storage.tencent_cos_storage_config import TencentCloudCOSStorageConfig
from configs.middleware.storage.volcengine_tos_storage_config import VolcengineTOSStorageConfig
from configs.middleware.vdb.analyticdb_config import AnalyticdbConfig
from configs.middleware.vdb.chroma_config import ChromaConfig
from configs.middleware.vdb.elasticsearch_config import ElasticsearchConfig
@@ -24,75 +28,77 @@ from configs.middleware.vdb.qdrant_config import QdrantConfig
from configs.middleware.vdb.relyt_config import RelytConfig
from configs.middleware.vdb.tencent_vector_config import TencentVectorDBConfig
from configs.middleware.vdb.tidb_vector_config import TiDBVectorConfig
from configs.middleware.vdb.vikingdb_config import VikingDBConfig
from configs.middleware.vdb.weaviate_config import WeaviateConfig
class StorageConfig(BaseSettings):
STORAGE_TYPE: str = Field(
description="storage type,"
" default to `local`,"
" available values are `local`, `s3`, `azure-blob`, `aliyun-oss`, `google-storage`.",
description="Type of storage to use."
" Options: 'local', 's3', 'azure-blob', 'aliyun-oss', 'google-storage'. Default is 'local'.",
default="local",
)
STORAGE_LOCAL_PATH: str = Field(
description="local storage path",
description="Path for local storage when STORAGE_TYPE is set to 'local'.",
default="storage",
)
class VectorStoreConfig(BaseSettings):
VECTOR_STORE: Optional[str] = Field(
description="vector store type",
description="Type of vector store to use for efficient similarity search."
" Set to None if not using a vector store.",
default=None,
)
class KeywordStoreConfig(BaseSettings):
KEYWORD_STORE: str = Field(
description="keyword store type",
description="Method for keyword extraction and storage."
" Default is 'jieba', a Chinese text segmentation library.",
default="jieba",
)
class DatabaseConfig(BaseSettings):
DB_HOST: str = Field(
description="db host",
description="Hostname or IP address of the database server.",
default="localhost",
)
DB_PORT: PositiveInt = Field(
description="db port",
description="Port number for database connection.",
default=5432,
)
DB_USERNAME: str = Field(
description="db username",
description="Username for database authentication.",
default="postgres",
)
DB_PASSWORD: str = Field(
description="db password",
description="Password for database authentication.",
default="",
)
DB_DATABASE: str = Field(
description="db database",
description="Name of the database to connect to.",
default="dify",
)
DB_CHARSET: str = Field(
description="db charset",
description="Character set for database connection.",
default="",
)
DB_EXTRAS: str = Field(
description="db extras options. Example: keepalives_idle=60&keepalives=1",
description="Additional database connection parameters. Example: 'keepalives_idle=60&keepalives=1'",
default="",
)
SQLALCHEMY_DATABASE_URI_SCHEME: str = Field(
description="db uri scheme",
description="Database URI scheme for SQLAlchemy connection.",
default="postgresql",
)
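These fields are typically composed into a SQLAlchemy URI, with quote_plus (imported at the top of this file) guarding special characters in the password; an illustrative sketch, not the exact computed property:

from urllib.parse import quote_plus

user, password, host, port, name = "postgres", "p@ss:word", "localhost", 5432, "dify"
uri = f"postgresql://{user}:{quote_plus(password)}@{host}:{port}/{name}"
print(uri)  # postgresql://postgres:p%40ss%3Aword@localhost:5432/dify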
@@ -110,27 +116,27 @@ class DatabaseConfig:
)
SQLALCHEMY_POOL_SIZE: NonNegativeInt = Field(
description="pool size of SqlAlchemy",
description="Maximum number of database connections in the pool.",
default=30,
)
SQLALCHEMY_MAX_OVERFLOW: NonNegativeInt = Field(
description="max overflows for SqlAlchemy",
description="Maximum number of connections that can be created beyond the pool_size.",
default=10,
)
SQLALCHEMY_POOL_RECYCLE: NonNegativeInt = Field(
description="SqlAlchemy pool recycle",
description="Number of seconds after which a connection is automatically recycled.",
default=3600,
)
SQLALCHEMY_POOL_PRE_PING: bool = Field(
description="whether to enable pool pre-ping in SqlAlchemy",
description="If True, enables connection pool pre-ping feature to check connections.",
default=False,
)
SQLALCHEMY_ECHO: bool | str = Field(
description="whether to enable SqlAlchemy echo",
description="If True, SQLAlchemy will log all SQL statements.",
default=False,
)
@@ -148,15 +154,30 @@ class DatabaseConfig:
class CeleryConfig(DatabaseConfig):
CELERY_BACKEND: str = Field(
description="Celery backend, available values are `database`, `redis`",
description="Backend for Celery task results. Options: 'database', 'redis'.",
default="database",
)
CELERY_BROKER_URL: Optional[str] = Field(
description="CELERY_BROKER_URL",
description="URL of the message broker for Celery tasks.",
default=None,
)
CELERY_USE_SENTINEL: Optional[bool] = Field(
description="Whether to use Redis Sentinel for high availability.",
default=False,
)
CELERY_SENTINEL_MASTER_NAME: Optional[str] = Field(
description="Name of the Redis Sentinel master.",
default=None,
)
CELERY_SENTINEL_SOCKET_TIMEOUT: Optional[PositiveFloat] = Field(
description="Timeout for Redis Sentinel socket operations in seconds.",
default=0.1,
)
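For context, these new settings correspond to Celery's Redis Sentinel transport; a sketch with placeholder hosts and the conventional Kombu option names:

from celery import Celery

app = Celery(broker="sentinel://sentinel-1:26379;sentinel://sentinel-2:26379")
app.conf.broker_transport_options = {
    "master_name": "mymaster",                   # CELERY_SENTINEL_MASTER_NAME
    "sentinel_kwargs": {"socket_timeout": 0.1},  # CELERY_SENTINEL_SOCKET_TIMEOUT
}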
@computed_field
@property
def CELERY_RESULT_BACKEND(self) -> str | None:
@@ -172,6 +193,22 @@ class CeleryConfig(DatabaseConfig):
return self.CELERY_BROKER_URL.startswith("rediss://") if self.CELERY_BROKER_URL else False
class InternalTestConfig(BaseSettings):
"""
Configuration settings for Internal Test
"""
AWS_SECRET_ACCESS_KEY: Optional[str] = Field(
description="Internal test AWS secret access key",
default=None,
)
AWS_ACCESS_KEY_ID: Optional[str] = Field(
description="Internal test AWS access key ID",
default=None,
)
class MiddlewareConfig(
# place the configs in alphabet order
CeleryConfig,
@@ -182,10 +219,14 @@ class MiddlewareConfig(
StorageConfig,
AliyunOSSStorageConfig,
AzureBlobStorageConfig,
BaiduOBSStorageConfig,
GoogleCloudStorageConfig,
HuaweiCloudOBSStorageConfig,
OCIStorageConfig,
S3StorageConfig,
SupabaseStorageConfig,
TencentCloudCOSStorageConfig,
VolcengineTOSStorageConfig,
# configs of vdb and vdb providers
VectorStoreConfig,
AnalyticdbConfig,
@@ -202,5 +243,7 @@ class MiddlewareConfig(
TiDBVectorConfig,
WeaviateConfig,
ElasticsearchConfig,
InternalTestConfig,
VikingDBConfig,
):
pass


@@ -1,40 +1,70 @@
from typing import Optional
from pydantic import Field, NonNegativeInt, PositiveInt
from pydantic import Field, NonNegativeInt, PositiveFloat, PositiveInt
from pydantic_settings import BaseSettings
class RedisConfig(BaseSettings):
"""
Redis configs
Configuration settings for Redis connection
"""
REDIS_HOST: str = Field(
description="Redis host",
description="Hostname or IP address of the Redis server",
default="localhost",
)
REDIS_PORT: PositiveInt = Field(
description="Redis port",
description="Port number on which the Redis server is listening",
default=6379,
)
REDIS_USERNAME: Optional[str] = Field(
description="Redis username",
description="Username for Redis authentication (if required)",
default=None,
)
REDIS_PASSWORD: Optional[str] = Field(
description="Redis password",
description="Password for Redis authentication (if required)",
default=None,
)
REDIS_DB: NonNegativeInt = Field(
description="Redis database id, default to 0",
description="Redis database number to use (0-15)",
default=0,
)
REDIS_USE_SSL: bool = Field(
description="whether to use SSL for Redis connection",
description="Enable SSL/TLS for the Redis connection",
default=False,
)
REDIS_USE_SENTINEL: Optional[bool] = Field(
description="Enable Redis Sentinel mode for high availability",
default=False,
)
REDIS_SENTINELS: Optional[str] = Field(
description="Comma-separated list of Redis Sentinel nodes (host:port)",
default=None,
)
REDIS_SENTINEL_SERVICE_NAME: Optional[str] = Field(
description="Name of the Redis Sentinel service to monitor",
default=None,
)
REDIS_SENTINEL_USERNAME: Optional[str] = Field(
description="Username for Redis Sentinel authentication (if required)",
default=None,
)
REDIS_SENTINEL_PASSWORD: Optional[str] = Field(
description="Password for Redis Sentinel authentication (if required)",
default=None,
)
REDIS_SENTINEL_SOCKET_TIMEOUT: Optional[PositiveFloat] = Field(
description="Socket timeout in seconds for Redis Sentinel connections",
default=0.1,
)
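A sketch of how these Sentinel settings map onto redis-py; hosts and the service name are placeholders:

from redis.sentinel import Sentinel

sentinels = [("sentinel-1", 26379), ("sentinel-2", 26379)]  # parsed from REDIS_SENTINELS
sentinel = Sentinel(sentinels, socket_timeout=0.1)          # REDIS_SENTINEL_SOCKET_TIMEOUT
master = sentinel.master_for("mymaster", db=0)              # REDIS_SENTINEL_SERVICE_NAME, REDIS_DB
master.ping()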


@@ -6,40 +6,40 @@ from pydantic_settings import BaseSettings
class AliyunOSSStorageConfig(BaseSettings):
"""
Aliyun storage configs
Configuration settings for Aliyun Object Storage Service (OSS)
"""
ALIYUN_OSS_BUCKET_NAME: Optional[str] = Field(
description="Aliyun OSS bucket name",
description="Name of the Aliyun OSS bucket to store and retrieve objects",
default=None,
)
ALIYUN_OSS_ACCESS_KEY: Optional[str] = Field(
description="Aliyun OSS access key",
description="Access key ID for authenticating with Aliyun OSS",
default=None,
)
ALIYUN_OSS_SECRET_KEY: Optional[str] = Field(
description="Aliyun OSS secret key",
description="Secret access key for authenticating with Aliyun OSS",
default=None,
)
ALIYUN_OSS_ENDPOINT: Optional[str] = Field(
description="Aliyun OSS endpoint URL",
description="URL of the Aliyun OSS endpoint for your chosen region",
default=None,
)
ALIYUN_OSS_REGION: Optional[str] = Field(
description="Aliyun OSS region",
description="Aliyun OSS region where your bucket is located (e.g., 'oss-cn-hangzhou')",
default=None,
)
ALIYUN_OSS_AUTH_VERSION: Optional[str] = Field(
description="Aliyun OSS authentication version",
description="Version of the authentication protocol to use with Aliyun OSS (e.g., 'v4')",
default=None,
)
ALIYUN_OSS_PATH: Optional[str] = Field(
description="Aliyun OSS path",
description="Base path within the bucket to store objects (e.g., 'my-app-data/')",
default=None,
)


@@ -6,40 +6,40 @@ from pydantic_settings import BaseSettings
class S3StorageConfig(BaseSettings):
"""
S3 storage configs
Configuration settings for S3-compatible object storage
"""
S3_ENDPOINT: Optional[str] = Field(
description="S3 storage endpoint",
description="URL of the S3-compatible storage endpoint (e.g., 'https://s3.amazonaws.com')",
default=None,
)
S3_REGION: Optional[str] = Field(
description="S3 storage region",
description="Region where the S3 bucket is located (e.g., 'us-east-1')",
default=None,
)
S3_BUCKET_NAME: Optional[str] = Field(
description="S3 storage bucket name",
description="Name of the S3 bucket to store and retrieve objects",
default=None,
)
S3_ACCESS_KEY: Optional[str] = Field(
description="S3 storage access key",
description="Access key ID for authenticating with the S3 service",
default=None,
)
S3_SECRET_KEY: Optional[str] = Field(
description="S3 storage secret key",
description="Secret access key for authenticating with the S3 service",
default=None,
)
S3_ADDRESS_STYLE: str = Field(
description="S3 storage address style",
description="S3 addressing style: 'auto', 'path', or 'virtual'",
default="auto",
)
S3_USE_AWS_MANAGED_IAM: bool = Field(
description="whether to use aws managed IAM for S3",
description="Use AWS managed IAM roles for authentication instead of access/secret keys",
default=False,
)


@@ -6,25 +6,25 @@ from pydantic_settings import BaseSettings
class AzureBlobStorageConfig(BaseSettings):
"""
Azure Blob storage configs
Configuration settings for Azure Blob Storage
"""
AZURE_BLOB_ACCOUNT_NAME: Optional[str] = Field(
description="Azure Blob account name",
description="Name of the Azure Storage account (e.g., 'mystorageaccount')",
default=None,
)
AZURE_BLOB_ACCOUNT_KEY: Optional[str] = Field(
description="Azure Blob account key",
description="Access key for authenticating with the Azure Storage account",
default=None,
)
AZURE_BLOB_CONTAINER_NAME: Optional[str] = Field(
description="Azure Blob container name",
description="Name of the Azure Blob container to store and retrieve objects",
default=None,
)
AZURE_BLOB_ACCOUNT_URL: Optional[str] = Field(
description="Azure Blob account URL",
description="URL of the Azure Blob storage endpoint (e.g., 'https://mystorageaccount.blob.core.windows.net')",
default=None,
)


@@ -0,0 +1,29 @@
from typing import Optional
from pydantic import BaseModel, Field
class BaiduOBSStorageConfig(BaseModel):
"""
Configuration settings for Baidu Object Storage Service (OBS)
"""
BAIDU_OBS_BUCKET_NAME: Optional[str] = Field(
description="Name of the Baidu OBS bucket to store and retrieve objects (e.g., 'my-obs-bucket')",
default=None,
)
BAIDU_OBS_ACCESS_KEY: Optional[str] = Field(
description="Access Key ID for authenticating with Baidu OBS",
default=None,
)
BAIDU_OBS_SECRET_KEY: Optional[str] = Field(
description="Secret Access Key for authenticating with Baidu OBS",
default=None,
)
BAIDU_OBS_ENDPOINT: Optional[str] = Field(
description="URL of the Baidu OSS endpoint for your chosen region (e.g., 'https://.bj.bcebos.com')",
default=None,
)


@@ -6,15 +6,15 @@ from pydantic_settings import BaseSettings
class GoogleCloudStorageConfig(BaseSettings):
"""
Google Cloud storage configs
Configuration settings for Google Cloud Storage
"""
GOOGLE_STORAGE_BUCKET_NAME: Optional[str] = Field(
description="Google Cloud storage bucket name",
description="Name of the Google Cloud Storage bucket to store and retrieve objects (e.g., 'my-gcs-bucket')",
default=None,
)
GOOGLE_STORAGE_SERVICE_ACCOUNT_JSON_BASE64: Optional[str] = Field(
description="Google Cloud storage service account json base64",
description="Base64-encoded JSON key file for Google Cloud service account authentication",
default=None,
)


@@ -0,0 +1,29 @@
from typing import Optional
from pydantic import BaseModel, Field
class HuaweiCloudOBSStorageConfig(BaseModel):
"""
Configuration settings for Huawei Cloud Object Storage Service (OBS)
"""
HUAWEI_OBS_BUCKET_NAME: Optional[str] = Field(
description="Name of the Huawei Cloud OBS bucket to store and retrieve objects (e.g., 'my-obs-bucket')",
default=None,
)
HUAWEI_OBS_ACCESS_KEY: Optional[str] = Field(
description="Access Key ID for authenticating with Huawei Cloud OBS",
default=None,
)
HUAWEI_OBS_SECRET_KEY: Optional[str] = Field(
description="Secret Access Key for authenticating with Huawei Cloud OBS",
default=None,
)
HUAWEI_OBS_SERVER: Optional[str] = Field(
description="Endpoint URL for Huawei Cloud OBS (e.g., 'https://obs.cn-north-4.myhuaweicloud.com')",
default=None,
)


@@ -6,30 +6,30 @@ from pydantic_settings import BaseSettings
class OCIStorageConfig(BaseSettings):
"""
OCI storage configs
Configuration settings for Oracle Cloud Infrastructure (OCI) Object Storage
"""
OCI_ENDPOINT: Optional[str] = Field(
description="OCI storage endpoint",
description="URL of the OCI Object Storage endpoint (e.g., 'https://objectstorage.us-phoenix-1.oraclecloud.com')",
default=None,
)
OCI_REGION: Optional[str] = Field(
description="OCI storage region",
description="OCI region where the bucket is located (e.g., 'us-phoenix-1')",
default=None,
)
OCI_BUCKET_NAME: Optional[str] = Field(
description="OCI storage bucket name",
description="Name of the OCI Object Storage bucket to store and retrieve objects (e.g., 'my-oci-bucket')",
default=None,
)
OCI_ACCESS_KEY: Optional[str] = Field(
description="OCI storage access key",
description="Access key (also known as API key) for authenticating with OCI Object Storage",
default=None,
)
OCI_SECRET_KEY: Optional[str] = Field(
description="OCI storage secret key",
description="Secret key associated with the access key for authenticating with OCI Object Storage",
default=None,
)

View File

@@ -0,0 +1,24 @@
from typing import Optional
from pydantic import BaseModel, Field
class SupabaseStorageConfig(BaseModel):
"""
Configuration settings for Supabase Object Storage Service
"""
SUPABASE_BUCKET_NAME: Optional[str] = Field(
description="Name of the Supabase bucket to store and retrieve objects (e.g., 'dify-bucket')",
default=None,
)
SUPABASE_API_KEY: Optional[str] = Field(
description="API KEY for authenticating with Supabase",
default=None,
)
SUPABASE_URL: Optional[str] = Field(
description="URL of the Supabase",
default=None,
)

View File

@@ -6,30 +6,30 @@ from pydantic_settings import BaseSettings
class TencentCloudCOSStorageConfig(BaseSettings):
"""
Tencent Cloud COS storage configs
Configuration settings for Tencent Cloud Object Storage (COS)
"""
TENCENT_COS_BUCKET_NAME: Optional[str] = Field(
description="Tencent Cloud COS bucket name",
description="Name of the Tencent Cloud COS bucket to store and retrieve objects",
default=None,
)
TENCENT_COS_REGION: Optional[str] = Field(
description="Tencent Cloud COS region",
description="Tencent Cloud region where the COS bucket is located (e.g., 'ap-guangzhou')",
default=None,
)
TENCENT_COS_SECRET_ID: Optional[str] = Field(
description="Tencent Cloud COS secret id",
description="SecretId for authenticating with Tencent Cloud COS (part of API credentials)",
default=None,
)
TENCENT_COS_SECRET_KEY: Optional[str] = Field(
description="Tencent Cloud COS secret key",
description="SecretKey for authenticating with Tencent Cloud COS (part of API credentials)",
default=None,
)
TENCENT_COS_SCHEME: Optional[str] = Field(
description="Tencent Cloud COS scheme",
description="Protocol scheme for COS requests: 'https' (recommended) or 'http'",
default=None,
)

View File

@@ -0,0 +1,34 @@
from typing import Optional
from pydantic import BaseModel, Field
class VolcengineTOSStorageConfig(BaseModel):
"""
Configuration settings for Volcengine Tinder Object Storage (TOS)
"""
VOLCENGINE_TOS_BUCKET_NAME: Optional[str] = Field(
description="Name of the Volcengine TOS bucket to store and retrieve objects (e.g., 'my-tos-bucket')",
default=None,
)
VOLCENGINE_TOS_ACCESS_KEY: Optional[str] = Field(
description="Access Key ID for authenticating with Volcengine TOS",
default=None,
)
VOLCENGINE_TOS_SECRET_KEY: Optional[str] = Field(
description="Secret Access Key for authenticating with Volcengine TOS",
default=None,
)
VOLCENGINE_TOS_ENDPOINT: Optional[str] = Field(
description="URL of the Volcengine TOS endpoint (e.g., 'https://tos-cn-beijing.volces.com')",
default=None,
)
VOLCENGINE_TOS_REGION: Optional[str] = Field(
description="Volcengine region where the TOS bucket is located (e.g., 'cn-beijing')",
default=None,
)

View File

@@ -5,33 +5,38 @@ from pydantic import BaseModel, Field
class AnalyticdbConfig(BaseModel):
"""
Configuration for connecting to AnalyticDB.
Configuration for connecting to Alibaba Cloud AnalyticDB for PostgreSQL.
Refer to the following documentation for details on obtaining credentials:
https://www.alibabacloud.com/help/en/analyticdb-for-postgresql/getting-started/create-an-instance-instances-with-vector-engine-optimization-enabled
"""
ANALYTICDB_KEY_ID: Optional[str] = Field(
default=None, description="The Access Key ID provided by Alibaba Cloud for authentication."
default=None, description="The Access Key ID provided by Alibaba Cloud for API authentication."
)
ANALYTICDB_KEY_SECRET: Optional[str] = Field(
default=None, description="The Secret Access Key corresponding to the Access Key ID for secure access."
default=None, description="The Secret Access Key corresponding to the Access Key ID for secure API access."
)
ANALYTICDB_REGION_ID: Optional[str] = Field(
default=None, description="The region where the AnalyticDB instance is deployed (e.g., 'cn-hangzhou')."
default=None,
description="The region where the AnalyticDB instance is deployed (e.g., 'cn-hangzhou', 'ap-southeast-1').",
)
ANALYTICDB_INSTANCE_ID: Optional[str] = Field(
default=None,
description="The unique identifier of the AnalyticDB instance you want to connect to (e.g., 'gp-ab123456')..",
description="The unique identifier of the AnalyticDB instance you want to connect to.",
)
ANALYTICDB_ACCOUNT: Optional[str] = Field(
default=None, description="The account name used to log in to the AnalyticDB instance."
default=None,
description="The account name used to log in to the AnalyticDB instance"
" (usually the initial account created with the instance).",
)
ANALYTICDB_PASSWORD: Optional[str] = Field(
default=None, description="The password associated with the AnalyticDB account for authentication."
default=None, description="The password associated with the AnalyticDB account for database authentication."
)
ANALYTICDB_NAMESPACE: Optional[str] = Field(
default=None, description="The namespace within AnalyticDB for schema isolation."
default=None, description="The namespace within AnalyticDB for schema isolation (if using namespace feature)."
)
ANALYTICDB_NAMESPACE_PASSWORD: Optional[str] = Field(
default=None, description="The password for accessing the specified namespace within the AnalyticDB instance."
default=None,
description="The password for accessing the specified namespace within the AnalyticDB instance"
" (if namespace feature is enabled).",
)

View File

@@ -0,0 +1,45 @@
from typing import Optional
from pydantic import Field, NonNegativeInt, PositiveInt
from pydantic_settings import BaseSettings
class BaiduVectorDBConfig(BaseSettings):
"""
Configuration settings for Baidu Vector Database
"""
BAIDU_VECTOR_DB_ENDPOINT: Optional[str] = Field(
description="URL of the Baidu Vector Database service (e.g., 'http://vdb.bj.baidubce.com')",
default=None,
)
BAIDU_VECTOR_DB_CONNECTION_TIMEOUT_MS: PositiveInt = Field(
description="Timeout in milliseconds for Baidu Vector Database operations (default is 30000 milliseconds)",
default=30000,
)
BAIDU_VECTOR_DB_ACCOUNT: Optional[str] = Field(
description="Account for authenticating with the Baidu Vector Database",
default=None,
)
BAIDU_VECTOR_DB_API_KEY: Optional[str] = Field(
description="API key for authenticating with the Baidu Vector Database service",
default=None,
)
BAIDU_VECTOR_DB_DATABASE: Optional[str] = Field(
description="Name of the specific Baidu Vector Database to connect to",
default=None,
)
BAIDU_VECTOR_DB_SHARD: PositiveInt = Field(
description="Number of shards for the Baidu Vector Database (default is 1)",
default=1,
)
BAIDU_VECTOR_DB_REPLICAS: NonNegativeInt = Field(
description="Number of replicas for the Baidu Vector Database (default is 3)",
default=3,
)

View File
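
The shard and replica fields above use pydantic's constrained integer types rather than plain int. A minimal sketch of how PositiveInt and NonNegativeInt enforce their ranges at construction time:

from pydantic import BaseModel, NonNegativeInt, PositiveInt, ValidationError


class Demo(BaseModel):
    shard: PositiveInt = 1        # must be >= 1
    replicas: NonNegativeInt = 3  # 0 is allowed


print(Demo(shard=2, replicas=0))  # replicas=0 passes (non-negative)
try:
    Demo(shard=0)  # rejected: PositiveInt requires a value > 0
except ValidationError:
    print("shard must be positive")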

@@ -6,35 +6,35 @@ from pydantic_settings import BaseSettings
class ChromaConfig(BaseSettings):
"""
Chroma configs
Configuration settings for Chroma vector database
"""
CHROMA_HOST: Optional[str] = Field(
description="Chroma host",
description="Hostname or IP address of the Chroma server (e.g., 'localhost' or '192.168.1.100')",
default=None,
)
CHROMA_PORT: PositiveInt = Field(
description="Chroma port",
description="Port number on which the Chroma server is listening (default is 8000)",
default=8000,
)
CHROMA_TENANT: Optional[str] = Field(
description="Chroma database",
description="Tenant identifier for multi-tenancy support in Chroma",
default=None,
)
CHROMA_DATABASE: Optional[str] = Field(
description="Chroma database",
description="Name of the Chroma database to connect to",
default=None,
)
CHROMA_AUTH_PROVIDER: Optional[str] = Field(
description="Chroma authentication provider",
description="Authentication provider for Chroma (e.g., 'basic', 'token', or a custom provider)",
default=None,
)
CHROMA_AUTH_CREDENTIALS: Optional[str] = Field(
description="Chroma authentication credentials",
description="Authentication credentials for Chroma (format depends on the auth provider)",
default=None,
)

View File

@@ -6,25 +6,25 @@ from pydantic_settings import BaseSettings
class ElasticsearchConfig(BaseSettings):
"""
Elasticsearch configs
Configuration settings for Elasticsearch
"""
ELASTICSEARCH_HOST: Optional[str] = Field(
description="Elasticsearch host",
description="Hostname or IP address of the Elasticsearch server (e.g., 'localhost' or '192.168.1.100')",
default="127.0.0.1",
)
ELASTICSEARCH_PORT: PositiveInt = Field(
description="Elasticsearch port",
description="Port number on which the Elasticsearch server is listening (default is 9200)",
default=9200,
)
ELASTICSEARCH_USERNAME: Optional[str] = Field(
description="Elasticsearch username",
description="Username for authenticating with Elasticsearch (default is 'elastic')",
default="elastic",
)
ELASTICSEARCH_PASSWORD: Optional[str] = Field(
description="Elasticsearch password",
description="Password for authenticating with Elasticsearch (default is 'elastic')",
default="elastic",
)

View File

@@ -1,40 +1,35 @@
from typing import Optional
from pydantic import Field, PositiveInt
from pydantic import Field
from pydantic_settings import BaseSettings
class MilvusConfig(BaseSettings):
"""
Milvus configs
Configuration settings for Milvus vector database
"""
MILVUS_HOST: Optional[str] = Field(
description="Milvus host",
MILVUS_URI: Optional[str] = Field(
description="URI for connecting to the Milvus server (e.g., 'http://localhost:19530' or 'https://milvus-instance.example.com:19530')",
default="http://127.0.0.1:19530",
)
MILVUS_TOKEN: Optional[str] = Field(
description="Authentication token for Milvus, if token-based authentication is enabled",
default=None,
)
MILVUS_PORT: PositiveInt = Field(
description="Milvus RestFul API port",
default=9091,
)
MILVUS_USER: Optional[str] = Field(
description="Milvus user",
description="Username for authenticating with Milvus, if username/password authentication is enabled",
default=None,
)
MILVUS_PASSWORD: Optional[str] = Field(
description="Milvus password",
description="Password for authenticating with Milvus, if username/password authentication is enabled",
default=None,
)
MILVUS_SECURE: bool = Field(
description="whether to use SSL connection for Milvus",
default=False,
)
MILVUS_DATABASE: str = Field(
description="Milvus database, default to `default`",
description="Name of the Milvus database to connect to (default is 'default')",
default="default",
)

View File
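
This hunk replaces the separate MILVUS_HOST / MILVUS_PORT / MILVUS_SECURE settings with a single MILVUS_URI (plus an optional MILVUS_TOKEN). For deployments still carrying the old values, a hedged migration sketch; the helper name is illustrative, not part of the diff:

import os


def uri_from_legacy(host: str, port: int, secure: bool) -> str:
    # Collapse the removed host/port/secure triple into the new single URI.
    scheme = "https" if secure else "http"
    return f"{scheme}://{host}:{port}"


os.environ.setdefault("MILVUS_URI", uri_from_legacy("127.0.0.1", 19530, False))
print(os.environ["MILVUS_URI"])  # -> http://127.0.0.1:19530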

@@ -3,35 +3,35 @@ from pydantic import BaseModel, Field, PositiveInt
class MyScaleConfig(BaseModel):
"""
MyScale configs
Configuration settings for MyScale vector database
"""
MYSCALE_HOST: str = Field(
description="MyScale host",
description="Hostname or IP address of the MyScale server (e.g., 'localhost' or 'myscale.example.com')",
default="localhost",
)
MYSCALE_PORT: PositiveInt = Field(
description="MyScale port",
description="Port number on which the MyScale server is listening (default is 8123)",
default=8123,
)
MYSCALE_USER: str = Field(
description="MyScale user",
description="Username for authenticating with MyScale (default is 'default')",
default="default",
)
MYSCALE_PASSWORD: str = Field(
description="MyScale password",
description="Password for authenticating with MyScale (default is an empty string)",
default="",
)
MYSCALE_DATABASE: str = Field(
description="MyScale database name",
description="Name of the MyScale database to connect to (default is 'default')",
default="default",
)
MYSCALE_FTS_PARAMS: str = Field(
description="MyScale fts index parameters",
description="Additional parameters for MyScale Full Text Search index)",
default="",
)

View File

@@ -6,30 +6,30 @@ from pydantic_settings import BaseSettings
class OpenSearchConfig(BaseSettings):
"""
OpenSearch configs
Configuration settings for OpenSearch
"""
OPENSEARCH_HOST: Optional[str] = Field(
description="OpenSearch host",
description="Hostname or IP address of the OpenSearch server (e.g., 'localhost' or 'opensearch.example.com')",
default=None,
)
OPENSEARCH_PORT: PositiveInt = Field(
description="OpenSearch port",
description="Port number on which the OpenSearch server is listening (default is 9200)",
default=9200,
)
OPENSEARCH_USER: Optional[str] = Field(
description="OpenSearch user",
description="Username for authenticating with OpenSearch",
default=None,
)
OPENSEARCH_PASSWORD: Optional[str] = Field(
description="OpenSearch password",
description="Password for authenticating with OpenSearch",
default=None,
)
OPENSEARCH_SECURE: bool = Field(
description="whether to use SSL connection for OpenSearch",
description="Whether to use SSL/TLS encrypted connection for OpenSearch (True for HTTPS, False for HTTP)",
default=False,
)

View File

@@ -6,30 +6,30 @@ from pydantic_settings import BaseSettings
class OracleConfig(BaseSettings):
"""
ORACLE configs
Configuration settings for Oracle database
"""
ORACLE_HOST: Optional[str] = Field(
description="ORACLE host",
description="Hostname or IP address of the Oracle database server (e.g., 'localhost' or 'oracle.example.com')",
default=None,
)
ORACLE_PORT: Optional[PositiveInt] = Field(
description="ORACLE port",
description="Port number on which the Oracle database server is listening (default is 1521)",
default=1521,
)
ORACLE_USER: Optional[str] = Field(
description="ORACLE user",
description="Username for authenticating with the Oracle database",
default=None,
)
ORACLE_PASSWORD: Optional[str] = Field(
description="ORACLE password",
description="Password for authenticating with the Oracle database",
default=None,
)
ORACLE_DATABASE: Optional[str] = Field(
description="ORACLE database",
description="Name of the Oracle database or service to connect to (e.g., 'ORCL' or 'pdborcl')",
default=None,
)

View File

@@ -6,30 +6,40 @@ from pydantic_settings import BaseSettings
class PGVectorConfig(BaseSettings):
"""
PGVector configs
Configuration settings for PGVector (PostgreSQL with vector extension)
"""
PGVECTOR_HOST: Optional[str] = Field(
description="PGVector host",
description="Hostname or IP address of the PostgreSQL server with PGVector extension (e.g., 'localhost')",
default=None,
)
PGVECTOR_PORT: Optional[PositiveInt] = Field(
description="PGVector port",
description="Port number on which the PostgreSQL server is listening (default is 5433)",
default=5433,
)
PGVECTOR_USER: Optional[str] = Field(
description="PGVector user",
description="Username for authenticating with the PostgreSQL database",
default=None,
)
PGVECTOR_PASSWORD: Optional[str] = Field(
description="PGVector password",
description="Password for authenticating with the PostgreSQL database",
default=None,
)
PGVECTOR_DATABASE: Optional[str] = Field(
description="PGVector database",
description="Name of the PostgreSQL database to connect to",
default=None,
)
PGVECTOR_MIN_CONNECTION: PositiveInt = Field(
description="Min connection of the PostgreSQL database",
default=1,
)
PGVECTOR_MAX_CONNECTION: PositiveInt = Field(
description="Max connection of the PostgreSQL database",
default=5,
)

View File

@@ -6,30 +6,30 @@ from pydantic_settings import BaseSettings
class PGVectoRSConfig(BaseSettings):
"""
PGVectoRS configs
Configuration settings for PGVecto.RS (Rust-based vector extension for PostgreSQL)
"""
PGVECTO_RS_HOST: Optional[str] = Field(
description="PGVectoRS host",
description="Hostname or IP address of the PostgreSQL server with PGVecto.RS extension (e.g., 'localhost')",
default=None,
)
PGVECTO_RS_PORT: Optional[PositiveInt] = Field(
description="PGVectoRS port",
description="Port number on which the PostgreSQL server with PGVecto.RS is listening (default is 5431)",
default=5431,
)
PGVECTO_RS_USER: Optional[str] = Field(
description="PGVectoRS user",
description="Username for authenticating with the PostgreSQL database using PGVecto.RS",
default=None,
)
PGVECTO_RS_PASSWORD: Optional[str] = Field(
description="PGVectoRS password",
description="Password for authenticating with the PostgreSQL database using PGVecto.RS",
default=None,
)
PGVECTO_RS_DATABASE: Optional[str] = Field(
description="PGVectoRS database",
description="Name of the PostgreSQL database with PGVecto.RS extension to connect to",
default=None,
)

View File

@@ -6,30 +6,30 @@ from pydantic_settings import BaseSettings
class QdrantConfig(BaseSettings):
"""
Qdrant configs
Configuration settings for Qdrant vector database
"""
QDRANT_URL: Optional[str] = Field(
description="Qdrant url",
description="URL of the Qdrant server (e.g., 'http://localhost:6333' or 'https://qdrant.example.com')",
default=None,
)
QDRANT_API_KEY: Optional[str] = Field(
description="Qdrant api key",
description="API key for authenticating with the Qdrant server",
default=None,
)
QDRANT_CLIENT_TIMEOUT: NonNegativeInt = Field(
description="Qdrant client timeout in seconds",
description="Timeout in seconds for Qdrant client operations (default is 20 seconds)",
default=20,
)
QDRANT_GRPC_ENABLED: bool = Field(
description="whether enable grpc support for Qdrant connection",
description="Whether to enable gRPC support for Qdrant connection (True for gRPC, False for HTTP)",
default=False,
)
QDRANT_GRPC_PORT: PositiveInt = Field(
description="Qdrant grpc port",
description="Port number for gRPC connection to Qdrant server (default is 6334)",
default=6334,
)

View File

@@ -6,30 +6,30 @@ from pydantic_settings import BaseSettings
class RelytConfig(BaseSettings):
"""
Relyt configs
Configuration settings for Relyt database
"""
RELYT_HOST: Optional[str] = Field(
description="Relyt host",
description="Hostname or IP address of the Relyt server (e.g., 'localhost' or 'relyt.example.com')",
default=None,
)
RELYT_PORT: PositiveInt = Field(
description="Relyt port",
description="Port number on which the Relyt server is listening (default is 9200)",
default=9200,
)
RELYT_USER: Optional[str] = Field(
description="Relyt user",
description="Username for authenticating with the Relyt database",
default=None,
)
RELYT_PASSWORD: Optional[str] = Field(
description="Relyt password",
description="Password for authenticating with the Relyt database",
default=None,
)
RELYT_DATABASE: Optional[str] = Field(
description="Relyt database",
description="Name of the Relyt database to connect to (default is 'default')",
default="default",
)

View File

@@ -6,45 +6,45 @@ from pydantic_settings import BaseSettings
class TencentVectorDBConfig(BaseSettings):
"""
Tencent Vector configs
Configuration settings for Tencent Vector Database
"""
TENCENT_VECTOR_DB_URL: Optional[str] = Field(
description="Tencent Vector URL",
description="URL of the Tencent Vector Database service (e.g., 'https://vectordb.tencentcloudapi.com')",
default=None,
)
TENCENT_VECTOR_DB_API_KEY: Optional[str] = Field(
description="Tencent Vector API key",
description="API key for authenticating with the Tencent Vector Database service",
default=None,
)
TENCENT_VECTOR_DB_TIMEOUT: PositiveInt = Field(
description="Tencent Vector timeout in seconds",
description="Timeout in seconds for Tencent Vector Database operations (default is 30 seconds)",
default=30,
)
TENCENT_VECTOR_DB_USERNAME: Optional[str] = Field(
description="Tencent Vector username",
description="Username for authenticating with the Tencent Vector Database (if required)",
default=None,
)
TENCENT_VECTOR_DB_PASSWORD: Optional[str] = Field(
description="Tencent Vector password",
description="Password for authenticating with the Tencent Vector Database (if required)",
default=None,
)
TENCENT_VECTOR_DB_SHARD: PositiveInt = Field(
description="Tencent Vector sharding number",
description="Number of shards for the Tencent Vector Database (default is 1)",
default=1,
)
TENCENT_VECTOR_DB_REPLICAS: NonNegativeInt = Field(
description="Tencent Vector replicas",
description="Number of replicas for the Tencent Vector Database (default is 2)",
default=2,
)
TENCENT_VECTOR_DB_DATABASE: Optional[str] = Field(
description="Tencent Vector Database",
description="Name of the specific Tencent Vector Database to connect to",
default=None,
)

View File

@@ -6,30 +6,30 @@ from pydantic_settings import BaseSettings
class TiDBVectorConfig(BaseSettings):
"""
TiDB Vector configs
Configuration settings for TiDB Vector database
"""
TIDB_VECTOR_HOST: Optional[str] = Field(
description="TiDB Vector host",
description="Hostname or IP address of the TiDB Vector server (e.g., 'localhost' or 'tidb.example.com')",
default=None,
)
TIDB_VECTOR_PORT: Optional[PositiveInt] = Field(
description="TiDB Vector port",
description="Port number on which the TiDB Vector server is listening (default is 4000)",
default=4000,
)
TIDB_VECTOR_USER: Optional[str] = Field(
description="TiDB Vector user",
description="Username for authenticating with the TiDB Vector database",
default=None,
)
TIDB_VECTOR_PASSWORD: Optional[str] = Field(
description="TiDB Vector password",
description="Password for authenticating with the TiDB Vector database",
default=None,
)
TIDB_VECTOR_DATABASE: Optional[str] = Field(
description="TiDB Vector database",
description="Name of the TiDB Vector database to connect to",
default=None,
)

View File

@@ -0,0 +1,37 @@
from typing import Optional
from pydantic import BaseModel, Field
class VikingDBConfig(BaseModel):
"""
Configuration for connecting to Volcengine VikingDB.
Refer to the following documentation for details on obtaining credentials:
https://www.volcengine.com/docs/6291/65568
"""
VIKINGDB_ACCESS_KEY: Optional[str] = Field(
default=None, description="The Access Key provided by Volcengine VikingDB for API authentication."
)
VIKINGDB_SECRET_KEY: Optional[str] = Field(
default=None, description="The Secret Key provided by Volcengine VikingDB for API authentication."
)
VIKINGDB_REGION: Optional[str] = Field(
default="cn-shanghai",
description="The region of the Volcengine VikingDB service.(e.g., 'cn-shanghai', 'cn-beijing').",
)
VIKINGDB_HOST: Optional[str] = Field(
default="api-vikingdb.mlp.cn-shanghai.volces.com",
description="The host of the Volcengine VikingDB service.(e.g., 'api-vikingdb.volces.com', \
'api-vikingdb.mlp.cn-shanghai.volces.com')",
)
VIKINGDB_SCHEME: Optional[str] = Field(
default="http",
description="The scheme of the Volcengine VikingDB service.(e.g., 'http', 'https').",
)
VIKINGDB_CONNECTION_TIMEOUT: Optional[int] = Field(
default=30, description="The connection timeout of the Volcengine VikingDB service."
)
VIKINGDB_SOCKET_TIMEOUT: Optional[int] = Field(
default=30, description="The socket timeout of the Volcengine VikingDB service."
)

View File

@@ -6,25 +6,25 @@ from pydantic_settings import BaseSettings
class WeaviateConfig(BaseSettings):
"""
Weaviate configs
Configuration settings for Weaviate vector database
"""
WEAVIATE_ENDPOINT: Optional[str] = Field(
description="Weaviate endpoint URL",
description="URL of the Weaviate server (e.g., 'http://localhost:8080' or 'https://weaviate.example.com')",
default=None,
)
WEAVIATE_API_KEY: Optional[str] = Field(
description="Weaviate API key",
description="API key for authenticating with the Weaviate server",
default=None,
)
WEAVIATE_GRPC_ENABLED: bool = Field(
description="whether to enable gRPC for Weaviate connection",
description="Whether to enable gRPC for Weaviate connection (True for gRPC, False for HTTP)",
default=True,
)
WEAVIATE_BATCH_SIZE: PositiveInt = Field(
description="Weaviate batch size",
description="Number of objects to be processed in a single batch operation (default is 100)",
default=100,
)

View File

@@ -9,7 +9,7 @@ class PackagingInfo(BaseSettings):
CURRENT_VERSION: str = Field(
description="Dify version",
default="0.7.2",
default="0.10.0-beta3",
)
COMMIT_SHA: str = Field(

View File

@@ -1 +1,21 @@
from configs import dify_config
HIDDEN_VALUE = "[__HIDDEN__]"
UUID_NIL = "00000000-0000-0000-0000-000000000000"
IMAGE_EXTENSIONS = ["jpg", "jpeg", "png", "webp", "gif", "svg"]
IMAGE_EXTENSIONS.extend([ext.upper() for ext in IMAGE_EXTENSIONS])
VIDEO_EXTENSIONS = ["mp4", "mov", "mpeg", "mpga"]
VIDEO_EXTENSIONS.extend([ext.upper() for ext in VIDEO_EXTENSIONS])
AUDIO_EXTENSIONS = ["mp3", "m4a", "wav", "webm", "amr"]
AUDIO_EXTENSIONS.extend([ext.upper() for ext in AUDIO_EXTENSIONS])
DOCUMENT_EXTENSIONS = ["txt", "markdown", "md", "pdf", "html", "htm", "xlsx", "xls", "docx", "csv"]
DOCUMENT_EXTENSIONS.extend([ext.upper() for ext in DOCUMENT_EXTENSIONS])
if dify_config.ETL_TYPE == "Unstructured":
DOCUMENT_EXTENSIONS = ["txt", "markdown", "md", "pdf", "html", "htm", "xlsx", "xls"]
DOCUMENT_EXTENSIONS.extend(("docx", "csv", "eml", "msg", "pptx", "ppt", "xml", "epub"))
DOCUMENT_EXTENSIONS.extend([ext.upper() for ext in DOCUMENT_EXTENSIONS])
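
Worth noting the list-building idiom used throughout this file: each extension list is declared in lowercase and then extended with uppercased copies of itself, so later membership checks are case-insensitive without normalizing at every call site. A minimal sketch:

# Same idiom as above: start lowercase, append the uppercase variants.
exts = ["jpg", "png"]
exts.extend([ext.upper() for ext in exts])
print(exts)  # -> ['jpg', 'png', 'JPG', 'PNG']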

File diff suppressed because one or more lines are too long

View File

@@ -1,7 +1,9 @@
from contextvars import ContextVar
from typing import TYPE_CHECKING
from core.workflow.entities.variable_pool import VariablePool
if TYPE_CHECKING:
from core.workflow.entities.variable_pool import VariablePool
tenant_id: ContextVar[str] = ContextVar("tenant_id")
workflow_variable_pool: ContextVar[VariablePool] = ContextVar("workflow_variable_pool")
workflow_variable_pool: ContextVar["VariablePool"] = ContextVar("workflow_variable_pool")

View File
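
This hunk is the standard recipe for breaking a circular import: the VariablePool import moves under TYPE_CHECKING, so it only runs for static type checkers, and the annotation becomes the string "VariablePool", which is never evaluated at runtime. The same pattern with a stand-in type:

from contextvars import ContextVar
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Only imported during type checking; never executed at runtime.
    from decimal import Decimal

# The quoted annotation keeps the name out of runtime scope entirely.
price: ContextVar["Decimal"] = ContextVar("price")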

@@ -37,7 +37,16 @@ from .auth import activate, data_source_bearer_auth, data_source_oauth, forgot_p
from .billing import billing
# Import datasets controllers
from .datasets import data_source, datasets, datasets_document, datasets_segments, file, hit_testing, website
from .datasets import (
data_source,
datasets,
datasets_document,
datasets_segments,
external,
file,
hit_testing,
website,
)
# Import explore controllers
from .explore import (

View File

@@ -60,23 +60,15 @@ class InsertExploreAppListApi(Resource):
site = app.site
if not site:
desc = args["desc"] if args["desc"] else ""
copy_right = args["copyright"] if args["copyright"] else ""
privacy_policy = args["privacy_policy"] if args["privacy_policy"] else ""
custom_disclaimer = args["custom_disclaimer"] if args["custom_disclaimer"] else ""
desc = args["desc"] or ""
copy_right = args["copyright"] or ""
privacy_policy = args["privacy_policy"] or ""
custom_disclaimer = args["custom_disclaimer"] or ""
else:
desc = site.description if site.description else args["desc"] if args["desc"] else ""
copy_right = site.copyright if site.copyright else args["copyright"] if args["copyright"] else ""
privacy_policy = (
site.privacy_policy if site.privacy_policy else args["privacy_policy"] if args["privacy_policy"] else ""
)
custom_disclaimer = (
site.custom_disclaimer
if site.custom_disclaimer
else args["custom_disclaimer"]
if args["custom_disclaimer"]
else ""
)
desc = site.description or args["desc"] or ""
copy_right = site.copyright or args["copyright"] or ""
privacy_policy = site.privacy_policy or args["privacy_policy"] or ""
custom_disclaimer = site.custom_disclaimer or args["custom_disclaimer"] or ""
recommended_app = RecommendedApp.query.filter(RecommendedApp.app_id == args["app_id"]).first()

View File
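
The rewrite above collapses nested `x if x else y` chains into `or` chains. Both forms return the first truthy operand, which is equivalent here because an empty string and None should both fall through to the next fallback:

# `a if a else b` and `a or b` agree when the goal is "first truthy value":
desc = None
print(desc or "")               # -> '' (None falls through to the default)
print("" or "from args" or "")  # -> 'from args'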

@@ -57,7 +57,7 @@ class BaseApiKeyListResource(Resource):
def post(self, resource_id):
resource_id = str(resource_id)
_get_resource(resource_id, current_user.current_tenant_id, self.resource_model)
if not current_user.is_admin_or_owner:
if not current_user.is_editor:
raise Forbidden()
current_key_count = (

View File

@@ -174,6 +174,7 @@ class AppApi(Resource):
parser.add_argument("icon", type=str, location="json")
parser.add_argument("icon_background", type=str, location="json")
parser.add_argument("max_active_requests", type=int, location="json")
parser.add_argument("use_icon_as_answer_icon", type=bool, location="json")
args = parser.parse_args()
app_service = AppService()

View File

@@ -94,19 +94,15 @@ class ChatMessageTextApi(Resource):
message_id = args.get("message_id", None)
text = args.get("text", None)
if (
app_model.mode in [AppMode.ADVANCED_CHAT.value, AppMode.WORKFLOW.value]
app_model.mode in {AppMode.ADVANCED_CHAT.value, AppMode.WORKFLOW.value}
and app_model.workflow
and app_model.workflow.features_dict
):
text_to_speech = app_model.workflow.features_dict.get("text_to_speech")
voice = args.get("voice") if args.get("voice") else text_to_speech.get("voice")
voice = args.get("voice") or text_to_speech.get("voice")
else:
try:
voice = (
args.get("voice")
if args.get("voice")
else app_model.app_model_config.text_to_speech_dict.get("voice")
)
voice = args.get("voice") or app_model.app_model_config.text_to_speech_dict.get("voice")
except Exception:
voice = None
response = AudioService.transcript_tts(app_model=app_model, text=text, message_id=message_id, voice=voice)

View File

@@ -109,6 +109,7 @@ class ChatMessageApi(Resource):
parser.add_argument("files", type=list, required=False, location="json")
parser.add_argument("model_config", type=dict, required=True, location="json")
parser.add_argument("conversation_id", type=uuid_value, location="json")
parser.add_argument("parent_message_id", type=uuid_value, required=False, location="json")
parser.add_argument("response_mode", type=str, choices=["blocking", "streaming"], location="json")
parser.add_argument("retriever_from", type=str, required=False, default="dev", location="json")
args = parser.parse_args()

View File

@@ -20,9 +20,10 @@ from fields.conversation_fields import (
conversation_pagination_fields,
conversation_with_summary_pagination_fields,
)
from libs.helper import datetime_string
from libs.helper import DatetimeString
from libs.login import login_required
from models.model import AppMode, Conversation, EndUser, Message, MessageAnnotation
from models import Conversation, EndUser, Message, MessageAnnotation
from models.model import AppMode
class CompletionConversationApi(Resource):
@@ -36,8 +37,8 @@ class CompletionConversationApi(Resource):
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("keyword", type=str, location="args")
parser.add_argument("start", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("start", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument(
"annotation_status", type=str, choices=["annotated", "not_annotated", "all"], default="all", location="args"
)
@@ -143,8 +144,8 @@ class ChatConversationApi(Resource):
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("keyword", type=str, location="args")
parser.add_argument("start", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("start", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument(
"annotation_status", type=str, choices=["annotated", "not_annotated", "all"], default="all", location="args"
)
@@ -173,18 +174,22 @@ class ChatConversationApi(Resource):
if args["keyword"]:
keyword_filter = "%{}%".format(args["keyword"])
message_subquery = (
db.session.query(Message.conversation_id)
.filter(or_(Message.query.ilike(keyword_filter), Message.answer.ilike(keyword_filter)))
.subquery()
)
query = query.join(subquery, subquery.c.conversation_id == Conversation.id).filter(
or_(
Conversation.id.in_(message_subquery),
Conversation.name.ilike(keyword_filter),
Conversation.introduction.ilike(keyword_filter),
subquery.c.from_end_user_session_id.ilike(keyword_filter),
),
query = (
query.join(
Message,
Message.conversation_id == Conversation.id,
)
.join(subquery, subquery.c.conversation_id == Conversation.id)
.filter(
or_(
Message.query.ilike(keyword_filter),
Message.answer.ilike(keyword_filter),
Conversation.name.ilike(keyword_filter),
Conversation.introduction.ilike(keyword_filter),
subquery.c.from_end_user_session_id.ilike(keyword_filter),
),
)
.group_by(Conversation.id)
)
account = current_user
@@ -198,7 +203,11 @@ class ChatConversationApi(Resource):
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
query = query.where(Conversation.created_at >= start_datetime_utc)
match args["sort_by"]:
case "updated_at" | "-updated_at":
query = query.where(Conversation.updated_at >= start_datetime_utc)
case "created_at" | "-created_at" | _:
query = query.where(Conversation.created_at >= start_datetime_utc)
if args["end"]:
end_datetime = datetime.strptime(args["end"], "%Y-%m-%d %H:%M")
@@ -207,7 +216,11 @@ class ChatConversationApi(Resource):
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
query = query.where(Conversation.created_at < end_datetime_utc)
match args["sort_by"]:
case "updated_at" | "-updated_at":
query = query.where(Conversation.updated_at <= end_datetime_utc)
case "created_at" | "-created_at" | _:
query = query.where(Conversation.created_at <= end_datetime_utc)
if args["annotation_status"] == "annotated":
query = query.options(joinedload(Conversation.message_annotations)).join(

View File
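
The new match blocks decide which timestamp column a date filter applies to: or-patterns group the ascending and descending spellings of each sort key, and the trailing `_` wildcard makes created_at the default branch. A standalone sketch of the same dispatch (Python 3.10+):

def filter_column(sort_by: str) -> str:
    match sort_by:
        case "updated_at" | "-updated_at":
            return "updated_at"
        case "created_at" | "-created_at" | _:
            # Wildcard is last, so any unknown sort key lands here.
            return "created_at"


print(filter_column("-updated_at"))  # -> updated_at
print(filter_column("anything"))     # -> created_at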

@@ -105,8 +105,6 @@ class ChatMessageListApi(Resource):
if rest_count > 0:
has_more = True
history_messages = list(reversed(history_messages))
return InfiniteScrollPagination(data=history_messages, limit=args["limit"], has_more=has_more)

View File

@@ -12,7 +12,7 @@ from controllers.console.wraps import account_initialization_required
from extensions.ext_database import db
from fields.app_fields import app_site_fields
from libs.login import login_required
from models.model import Site
from models import Site
def parse_app_site_args():
@@ -34,6 +34,7 @@ def parse_app_site_args():
)
parser.add_argument("prompt_public", type=bool, required=False, location="json")
parser.add_argument("show_workflow_steps", type=bool, required=False, location="json")
parser.add_argument("use_icon_as_answer_icon", type=bool, required=False, location="json")
return parser.parse_args()
@@ -68,6 +69,7 @@ class AppSite(Resource):
"customize_token_strategy",
"prompt_public",
"show_workflow_steps",
"use_icon_as_answer_icon",
]:
value = args.get(attr_name)
if value is not None:

View File

@@ -11,7 +11,7 @@ from controllers.console.app.wraps import get_app_model
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from extensions.ext_database import db
from libs.helper import datetime_string
from libs.helper import DatetimeString
from libs.login import login_required
from models.model import AppMode
@@ -25,14 +25,17 @@ class DailyMessageStatistic(Resource):
account = current_user
parser = reqparse.RequestParser()
parser.add_argument("start", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("start", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
args = parser.parse_args()
sql_query = """
SELECT date(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date, count(*) AS message_count
FROM messages where app_id = :app_id
"""
sql_query = """SELECT
DATE(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
COUNT(*) AS message_count
FROM
messages
WHERE
app_id = :app_id"""
arg_dict = {"tz": account.timezone, "app_id": app_model.id}
timezone = pytz.timezone(account.timezone)
@@ -45,7 +48,7 @@ class DailyMessageStatistic(Resource):
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
sql_query += " and created_at >= :start"
sql_query += " AND created_at >= :start"
arg_dict["start"] = start_datetime_utc
if args["end"]:
@@ -55,10 +58,10 @@ class DailyMessageStatistic(Resource):
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
sql_query += " and created_at < :end"
sql_query += " AND created_at < :end"
arg_dict["end"] = end_datetime_utc
sql_query += " GROUP BY date order by date"
sql_query += " GROUP BY date ORDER BY date"
response_data = []
@@ -79,14 +82,17 @@ class DailyConversationStatistic(Resource):
account = current_user
parser = reqparse.RequestParser()
parser.add_argument("start", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("start", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
args = parser.parse_args()
sql_query = """
SELECT date(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date, count(distinct messages.conversation_id) AS conversation_count
FROM messages where app_id = :app_id
"""
sql_query = """SELECT
DATE(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
COUNT(DISTINCT messages.conversation_id) AS conversation_count
FROM
messages
WHERE
app_id = :app_id"""
arg_dict = {"tz": account.timezone, "app_id": app_model.id}
timezone = pytz.timezone(account.timezone)
@@ -99,7 +105,7 @@ class DailyConversationStatistic(Resource):
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
sql_query += " and created_at >= :start"
sql_query += " AND created_at >= :start"
arg_dict["start"] = start_datetime_utc
if args["end"]:
@@ -109,10 +115,10 @@ class DailyConversationStatistic(Resource):
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
sql_query += " and created_at < :end"
sql_query += " AND created_at < :end"
arg_dict["end"] = end_datetime_utc
sql_query += " GROUP BY date order by date"
sql_query += " GROUP BY date ORDER BY date"
response_data = []
@@ -133,14 +139,17 @@ class DailyTerminalsStatistic(Resource):
account = current_user
parser = reqparse.RequestParser()
parser.add_argument("start", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("start", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
args = parser.parse_args()
sql_query = """
SELECT date(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date, count(distinct messages.from_end_user_id) AS terminal_count
FROM messages where app_id = :app_id
"""
sql_query = """SELECT
DATE(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
COUNT(DISTINCT messages.from_end_user_id) AS terminal_count
FROM
messages
WHERE
app_id = :app_id"""
arg_dict = {"tz": account.timezone, "app_id": app_model.id}
timezone = pytz.timezone(account.timezone)
@@ -153,7 +162,7 @@ class DailyTerminalsStatistic(Resource):
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
sql_query += " and created_at >= :start"
sql_query += " AND created_at >= :start"
arg_dict["start"] = start_datetime_utc
if args["end"]:
@@ -163,10 +172,10 @@ class DailyTerminalsStatistic(Resource):
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
sql_query += " and created_at < :end"
sql_query += " AND created_at < :end"
arg_dict["end"] = end_datetime_utc
sql_query += " GROUP BY date order by date"
sql_query += " GROUP BY date ORDER BY date"
response_data = []
@@ -187,16 +196,18 @@ class DailyTokenCostStatistic(Resource):
account = current_user
parser = reqparse.RequestParser()
parser.add_argument("start", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("start", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
args = parser.parse_args()
sql_query = """
SELECT date(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
(sum(messages.message_tokens) + sum(messages.answer_tokens)) as token_count,
sum(total_price) as total_price
FROM messages where app_id = :app_id
"""
sql_query = """SELECT
DATE(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
(SUM(messages.message_tokens) + SUM(messages.answer_tokens)) AS token_count,
SUM(total_price) AS total_price
FROM
messages
WHERE
app_id = :app_id"""
arg_dict = {"tz": account.timezone, "app_id": app_model.id}
timezone = pytz.timezone(account.timezone)
@@ -209,7 +220,7 @@ class DailyTokenCostStatistic(Resource):
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
sql_query += " and created_at >= :start"
sql_query += " AND created_at >= :start"
arg_dict["start"] = start_datetime_utc
if args["end"]:
@@ -219,10 +230,10 @@ class DailyTokenCostStatistic(Resource):
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
sql_query += " and created_at < :end"
sql_query += " AND created_at < :end"
arg_dict["end"] = end_datetime_utc
sql_query += " GROUP BY date order by date"
sql_query += " GROUP BY date ORDER BY date"
response_data = []
@@ -245,16 +256,26 @@ class AverageSessionInteractionStatistic(Resource):
account = current_user
parser = reqparse.RequestParser()
parser.add_argument("start", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("start", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
args = parser.parse_args()
sql_query = """SELECT date(DATE_TRUNC('day', c.created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
AVG(subquery.message_count) AS interactions
FROM (SELECT m.conversation_id, COUNT(m.id) AS message_count
FROM conversations c
JOIN messages m ON c.id = m.conversation_id
WHERE c.override_model_configs IS NULL AND c.app_id = :app_id"""
sql_query = """SELECT
DATE(DATE_TRUNC('day', c.created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
AVG(subquery.message_count) AS interactions
FROM
(
SELECT
m.conversation_id,
COUNT(m.id) AS message_count
FROM
conversations c
JOIN
messages m
ON c.id = m.conversation_id
WHERE
c.override_model_configs IS NULL
AND c.app_id = :app_id"""
arg_dict = {"tz": account.timezone, "app_id": app_model.id}
timezone = pytz.timezone(account.timezone)
@@ -267,7 +288,7 @@ FROM (SELECT m.conversation_id, COUNT(m.id) AS message_count
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
sql_query += " and c.created_at >= :start"
sql_query += " AND c.created_at >= :start"
arg_dict["start"] = start_datetime_utc
if args["end"]:
@@ -277,14 +298,19 @@ FROM (SELECT m.conversation_id, COUNT(m.id) AS message_count
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
sql_query += " and c.created_at < :end"
sql_query += " AND c.created_at < :end"
arg_dict["end"] = end_datetime_utc
sql_query += """
GROUP BY m.conversation_id) subquery
LEFT JOIN conversations c on c.id=subquery.conversation_id
GROUP BY date
ORDER BY date"""
GROUP BY m.conversation_id
) subquery
LEFT JOIN
conversations c
ON c.id = subquery.conversation_id
GROUP BY
date
ORDER BY
date"""
response_data = []
@@ -307,17 +333,21 @@ class UserSatisfactionRateStatistic(Resource):
account = current_user
parser = reqparse.RequestParser()
parser.add_argument("start", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("start", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
args = parser.parse_args()
sql_query = """
SELECT date(DATE_TRUNC('day', m.created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
COUNT(m.id) as message_count, COUNT(mf.id) as feedback_count
FROM messages m
LEFT JOIN message_feedbacks mf on mf.message_id=m.id and mf.rating='like'
WHERE m.app_id = :app_id
"""
sql_query = """SELECT
DATE(DATE_TRUNC('day', m.created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
COUNT(m.id) AS message_count,
COUNT(mf.id) AS feedback_count
FROM
messages m
LEFT JOIN
message_feedbacks mf
ON mf.message_id=m.id AND mf.rating='like'
WHERE
m.app_id = :app_id"""
arg_dict = {"tz": account.timezone, "app_id": app_model.id}
timezone = pytz.timezone(account.timezone)
@@ -330,7 +360,7 @@ class UserSatisfactionRateStatistic(Resource):
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
sql_query += " and m.created_at >= :start"
sql_query += " AND m.created_at >= :start"
arg_dict["start"] = start_datetime_utc
if args["end"]:
@@ -340,10 +370,10 @@ class UserSatisfactionRateStatistic(Resource):
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
sql_query += " and m.created_at < :end"
sql_query += " AND m.created_at < :end"
arg_dict["end"] = end_datetime_utc
sql_query += " GROUP BY date order by date"
sql_query += " GROUP BY date ORDER BY date"
response_data = []
@@ -369,16 +399,17 @@ class AverageResponseTimeStatistic(Resource):
account = current_user
parser = reqparse.RequestParser()
parser.add_argument("start", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("start", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
args = parser.parse_args()
sql_query = """
SELECT date(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
AVG(provider_response_latency) as latency
FROM messages
WHERE app_id = :app_id
"""
sql_query = """SELECT
DATE(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
AVG(provider_response_latency) AS latency
FROM
messages
WHERE
app_id = :app_id"""
arg_dict = {"tz": account.timezone, "app_id": app_model.id}
timezone = pytz.timezone(account.timezone)
@@ -391,7 +422,7 @@ class AverageResponseTimeStatistic(Resource):
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
sql_query += " and created_at >= :start"
sql_query += " AND created_at >= :start"
arg_dict["start"] = start_datetime_utc
if args["end"]:
@@ -401,10 +432,10 @@ class AverageResponseTimeStatistic(Resource):
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
sql_query += " and created_at < :end"
sql_query += " AND created_at < :end"
arg_dict["end"] = end_datetime_utc
sql_query += " GROUP BY date order by date"
sql_query += " GROUP BY date ORDER BY date"
response_data = []
@@ -425,17 +456,20 @@ class TokensPerSecondStatistic(Resource):
account = current_user
parser = reqparse.RequestParser()
parser.add_argument("start", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("start", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
args = parser.parse_args()
sql_query = """SELECT date(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
CASE
sql_query = """SELECT
DATE(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
CASE
WHEN SUM(provider_response_latency) = 0 THEN 0
ELSE (SUM(answer_tokens) / SUM(provider_response_latency))
END as tokens_per_second
FROM messages
WHERE app_id = :app_id"""
FROM
messages
WHERE
app_id = :app_id"""
arg_dict = {"tz": account.timezone, "app_id": app_model.id}
timezone = pytz.timezone(account.timezone)
@@ -448,7 +482,7 @@ WHERE app_id = :app_id"""
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
sql_query += " and created_at >= :start"
sql_query += " AND created_at >= :start"
arg_dict["start"] = start_datetime_utc
if args["end"]:
@@ -458,10 +492,10 @@ WHERE app_id = :app_id"""
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
sql_query += " and created_at < :end"
sql_query += " AND created_at < :end"
arg_dict["end"] = end_datetime_utc
sql_query += " GROUP BY date order by date"
sql_query += " GROUP BY date ORDER BY date"
response_data = []

View File
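
All of the statistics endpoints in this file assemble one SQL string with named placeholders (:tz, :app_id, :start, :end) and bind values via arg_dict, so these reformats change layout only, never the parameterization. A hedged sketch of that execution pattern with SQLAlchemy; the in-memory engine and trivial query stand in for the app's Postgres connection:

from sqlalchemy import create_engine, text

engine = create_engine("sqlite://")  # stand-in for the real database engine
sql_query = "SELECT :app_id AS app_id"
arg_dict = {"app_id": "demo-app"}

with engine.connect() as conn:
    # text() plus a dict of bind parameters mirrors how these queries run.
    print(conn.execute(text(sql_query), arg_dict).scalar())  # -> demo-app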

@@ -13,14 +13,14 @@ from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from core.app.apps.base_app_queue_manager import AppQueueManager
from core.app.entities.app_invoke_entities import InvokeFrom
from core.app.segments import factory
from core.errors.error import AppInvokeQuotaExceededError
from factories import variable_factory
from fields.workflow_fields import workflow_fields
from fields.workflow_run_fields import workflow_run_node_execution_fields
from libs import helper
from libs.helper import TimestampField, uuid_value
from libs.login import current_user, login_required
from models.model import App, AppMode
from models import App
from models.model import AppMode
from services.app_dsl_service import AppDslService
from services.app_generate_service import AppGenerateService
from services.errors.app import WorkflowHashNotEqualError
@@ -101,9 +101,13 @@ class DraftWorkflowApi(Resource):
try:
environment_variables_list = args.get("environment_variables") or []
environment_variables = [factory.build_variable_from_mapping(obj) for obj in environment_variables_list]
environment_variables = [
variable_factory.build_variable_from_mapping(obj) for obj in environment_variables_list
]
conversation_variables_list = args.get("conversation_variables") or []
conversation_variables = [factory.build_variable_from_mapping(obj) for obj in conversation_variables_list]
conversation_variables = [
variable_factory.build_variable_from_mapping(obj) for obj in conversation_variables_list
]
workflow = workflow_service.sync_draft_workflow(
app_model=app_model,
graph=args["graph"],
@@ -166,6 +170,8 @@ class AdvancedChatDraftWorkflowRunApi(Resource):
parser.add_argument("query", type=str, required=True, location="json", default="")
parser.add_argument("files", type=list, location="json")
parser.add_argument("conversation_id", type=uuid_value, location="json")
parser.add_argument("parent_message_id", type=uuid_value, required=False, location="json")
args = parser.parse_args()
try:
@@ -271,17 +277,15 @@ class DraftWorkflowRunApi(Resource):
parser.add_argument("files", type=list, required=False, location="json")
args = parser.parse_args()
try:
response = AppGenerateService.generate(
app_model=app_model, user=current_user, args=args, invoke_from=InvokeFrom.DEBUGGER, streaming=True
)
response = AppGenerateService.generate(
app_model=app_model,
user=current_user,
args=args,
invoke_from=InvokeFrom.DEBUGGER,
streaming=True,
)
return helper.compact_generate_response(response)
except (ValueError, AppInvokeQuotaExceededError) as e:
raise e
except Exception as e:
logging.exception("internal server error.")
raise InternalServerError()
return helper.compact_generate_response(response)
class WorkflowTaskStopApi(Resource):
@@ -465,6 +469,6 @@ api.add_resource(
api.add_resource(PublishedWorkflowApi, "/apps/<uuid:app_id>/workflows/publish")
api.add_resource(DefaultBlockConfigsApi, "/apps/<uuid:app_id>/workflows/default-workflow-block-configs")
api.add_resource(
DefaultBlockConfigApi, "/apps/<uuid:app_id>/workflows/default-workflow-block-configs" "/<string:block_type>"
DefaultBlockConfigApi, "/apps/<uuid:app_id>/workflows/default-workflow-block-configs/<string:block_type>"
)
api.add_resource(ConvertToWorkflowApi, "/apps/<uuid:app_id>/convert-to-workflow")

View File
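
The route change in the last hunk fixes readability rather than behavior: Python's implicit literal concatenation meant the two adjacent strings already formed a single path, and the rewrite just merges them into one literal:

# Adjacent string literals concatenate at compile time, so both spellings
# produce the identical route.
route = "/default-workflow-block-configs" "/<string:block_type>"
assert route == "/default-workflow-block-configs/<string:block_type>"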

@@ -7,7 +7,8 @@ from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from fields.workflow_app_log_fields import workflow_app_log_pagination_fields
from libs.login import login_required
from models.model import App, AppMode
from models import App
from models.model import AppMode
from services.workflow_app_service import WorkflowAppService

View File

@@ -13,7 +13,8 @@ from fields.workflow_run_fields import (
)
from libs.helper import uuid_value
from libs.login import login_required
from models.model import App, AppMode
from models import App
from models.model import AppMode
from services.workflow_run_service import WorkflowRunService

View File

@@ -10,11 +10,11 @@ from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from enums import WorkflowRunTriggeredFrom
from extensions.ext_database import db
from libs.helper import datetime_string
from libs.helper import DatetimeString
from libs.login import login_required
from models.model import AppMode
from models.workflow import WorkflowRunTriggeredFrom
class WorkflowDailyRunsStatistic(Resource):
@@ -26,16 +26,18 @@ class WorkflowDailyRunsStatistic(Resource):
account = current_user
parser = reqparse.RequestParser()
parser.add_argument("start", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("start", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
args = parser.parse_args()
sql_query = """
SELECT date(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date, count(id) AS runs
FROM workflow_runs
WHERE app_id = :app_id
AND triggered_from = :triggered_from
"""
sql_query = """SELECT
DATE(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
COUNT(id) AS runs
FROM
workflow_runs
WHERE
app_id = :app_id
AND triggered_from = :triggered_from"""
arg_dict = {
"tz": account.timezone,
"app_id": app_model.id,
@@ -52,7 +54,7 @@ class WorkflowDailyRunsStatistic(Resource):
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
sql_query += " and created_at >= :start"
sql_query += " AND created_at >= :start"
arg_dict["start"] = start_datetime_utc
if args["end"]:
@@ -62,10 +64,10 @@ class WorkflowDailyRunsStatistic(Resource):
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
sql_query += " and created_at < :end"
sql_query += " AND created_at < :end"
arg_dict["end"] = end_datetime_utc
sql_query += " GROUP BY date order by date"
sql_query += " GROUP BY date ORDER BY date"
response_data = []
@@ -86,16 +88,18 @@ class WorkflowDailyTerminalsStatistic(Resource):
account = current_user
parser = reqparse.RequestParser()
parser.add_argument("start", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("start", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
args = parser.parse_args()
sql_query = """
SELECT date(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date, count(distinct workflow_runs.created_by) AS terminal_count
FROM workflow_runs
WHERE app_id = :app_id
AND triggered_from = :triggered_from
"""
sql_query = """SELECT
DATE(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
COUNT(DISTINCT workflow_runs.created_by) AS terminal_count
FROM
workflow_runs
WHERE
app_id = :app_id
AND triggered_from = :triggered_from"""
arg_dict = {
"tz": account.timezone,
"app_id": app_model.id,
@@ -112,7 +116,7 @@ class WorkflowDailyTerminalsStatistic(Resource):
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
sql_query += " and created_at >= :start"
sql_query += " AND created_at >= :start"
arg_dict["start"] = start_datetime_utc
if args["end"]:
@@ -122,10 +126,10 @@ class WorkflowDailyTerminalsStatistic(Resource):
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
sql_query += " and created_at < :end"
sql_query += " AND created_at < :end"
arg_dict["end"] = end_datetime_utc
sql_query += " GROUP BY date order by date"
sql_query += " GROUP BY date ORDER BY date"
response_data = []
@@ -146,18 +150,18 @@ class WorkflowDailyTokenCostStatistic(Resource):
account = current_user
parser = reqparse.RequestParser()
parser.add_argument("start", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("start", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
args = parser.parse_args()
sql_query = """
SELECT
date(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
SUM(workflow_runs.total_tokens) as token_count
FROM workflow_runs
WHERE app_id = :app_id
AND triggered_from = :triggered_from
"""
sql_query = """SELECT
DATE(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
SUM(workflow_runs.total_tokens) AS token_count
FROM
workflow_runs
WHERE
app_id = :app_id
AND triggered_from = :triggered_from"""
arg_dict = {
"tz": account.timezone,
"app_id": app_model.id,
@@ -174,7 +178,7 @@ class WorkflowDailyTokenCostStatistic(Resource):
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
sql_query += " and created_at >= :start"
sql_query += " AND created_at >= :start"
arg_dict["start"] = start_datetime_utc
if args["end"]:
@@ -184,10 +188,10 @@ class WorkflowDailyTokenCostStatistic(Resource):
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
sql_query += " and created_at < :end"
sql_query += " AND created_at < :end"
arg_dict["end"] = end_datetime_utc
sql_query += " GROUP BY date order by date"
sql_query += " GROUP BY date ORDER BY date"
response_data = []
@@ -213,27 +217,31 @@ class WorkflowAverageAppInteractionStatistic(Resource):
account = current_user
parser = reqparse.RequestParser()
parser.add_argument("start", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=datetime_string("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("start", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
args = parser.parse_args()
sql_query = """
SELECT
AVG(sub.interactions) as interactions,
sub.date
FROM
(SELECT
date(DATE_TRUNC('day', c.created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
c.created_by,
COUNT(c.id) AS interactions
FROM workflow_runs c
WHERE c.app_id = :app_id
AND c.triggered_from = :triggered_from
{{start}}
{{end}}
GROUP BY date, c.created_by) sub
GROUP BY sub.date
"""
sql_query = """SELECT
AVG(sub.interactions) AS interactions,
sub.date
FROM
(
SELECT
DATE(DATE_TRUNC('day', c.created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
c.created_by,
COUNT(c.id) AS interactions
FROM
workflow_runs c
WHERE
c.app_id = :app_id
AND c.triggered_from = :triggered_from
{{start}}
{{end}}
GROUP BY
date, c.created_by
) sub
GROUP BY
sub.date"""
arg_dict = {
"tz": account.timezone,
"app_id": app_model.id,
@@ -262,7 +270,7 @@ class WorkflowAverageAppInteractionStatistic(Resource):
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
sql_query = sql_query.replace("{{end}}", " and c.created_at < :end")
sql_query = sql_query.replace("{{end}}", " AND c.created_at < :end")
arg_dict["end"] = end_datetime_utc
else:
sql_query = sql_query.replace("{{end}}", "")


@@ -5,7 +5,8 @@ from typing import Optional, Union
from controllers.console.app.error import AppNotFoundError
from extensions.ext_database import db
from libs.login import current_user
from models.model import App, AppMode
from models import App
from models.model import AppMode
def get_app_model(view: Optional[Callable] = None, *, mode: Union[AppMode, list[AppMode]] = None):


@@ -8,7 +8,7 @@ from constants.languages import supported_language
from controllers.console import api
from controllers.console.error import AlreadyActivateError
from extensions.ext_database import db
from libs.helper import email, str_len, timezone
from libs.helper import StrLen, email, timezone
from libs.password import hash_password, valid_password
from models.account import AccountStatus
from services.account_service import RegisterService
@@ -37,7 +37,7 @@ class ActivateApi(Resource):
parser.add_argument("workspace_id", type=str, required=False, nullable=True, location="json")
parser.add_argument("email", type=email, required=False, nullable=True, location="json")
parser.add_argument("token", type=str, required=True, nullable=False, location="json")
parser.add_argument("name", type=str_len(30), required=True, nullable=False, location="json")
parser.add_argument("name", type=StrLen(30), required=True, nullable=False, location="json")
parser.add_argument("password", type=valid_password, required=True, nullable=False, location="json")
parser.add_argument(
"interface_language", type=supported_language, required=True, nullable=False, location="json"


@@ -15,7 +15,7 @@ from controllers.console.setup import setup_required
from extensions.ext_database import db
from libs.helper import email as email_validate
from libs.password import hash_password, valid_password
from models.account import Account
from models import Account
from services.account_service import AccountService
from services.errors.account import RateLimitExceededError


@@ -7,9 +7,9 @@ from flask_restful import Resource, reqparse
import services
from controllers.console import api
from controllers.console.setup import setup_required
from libs.helper import email, get_remote_ip
from libs.helper import email, extract_remote_ip
from libs.password import valid_password
from models.account import Account
from models import Account
from services.account_service import AccountService, TenantService
@@ -40,17 +40,16 @@ class LoginApi(Resource):
"data": "workspace not found, please contact system admin to invite you to join in a workspace",
}
token = AccountService.login(account, ip_address=get_remote_ip(request))
token_pair = AccountService.login(account=account, ip_address=extract_remote_ip(request))
return {"result": "success", "data": token}
return {"result": "success", "data": token_pair.model_dump()}
class LogoutApi(Resource):
@setup_required
def get(self):
account = cast(Account, flask_login.current_user)
token = request.headers.get("Authorization", "").split(" ")[1]
AccountService.logout(account=account, token=token)
AccountService.logout(account=account)
flask_login.logout_user()
return {"result": "success"}
@@ -106,5 +105,19 @@ class ResetPasswordApi(Resource):
return {"result": "success"}
class RefreshTokenApi(Resource):
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("refresh_token", type=str, required=True, location="json")
args = parser.parse_args()
try:
new_token_pair = AccountService.refresh_token(args["refresh_token"])
return {"result": "success", "data": new_token_pair.model_dump()}
except Exception as e:
return {"result": "fail", "data": str(e)}, 401
api.add_resource(LoginApi, "/login")
api.add_resource(LogoutApi, "/logout")
api.add_resource(RefreshTokenApi, "/refresh-token")
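Login now returns `token_pair.model_dump()` instead of a single token, and the new `/refresh-token` route exchanges a refresh token for a fresh pair. A hedged client-side sketch (base URL hypothetical; the `access_token`/`refresh_token` keys are inferred from the OAuth redirect below):

import requests

BASE = "https://dify.example.com/console/api"  # hypothetical deployment

def login(email: str, password: str) -> dict:
    resp = requests.post(f"{BASE}/login", json={"email": email, "password": password})
    resp.raise_for_status()
    return resp.json()["data"]  # {"access_token": ..., "refresh_token": ...}

def refresh(refresh_token: str) -> dict:
    # A 401 here means the refresh token itself is invalid or expired,
    # matching the except branch in RefreshTokenApi above.
    resp = requests.post(f"{BASE}/refresh-token", json={"refresh_token": refresh_token})
    resp.raise_for_status()
    return resp.json()["data"]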


@@ -9,9 +9,10 @@ from flask_restful import Resource
from configs import dify_config
from constants.languages import languages
from extensions.ext_database import db
from libs.helper import get_remote_ip
from libs.helper import extract_remote_ip
from libs.oauth import GitHubOAuth, GoogleOAuth, OAuthUserInfo
from models.account import Account, AccountStatus
from models import Account
from models.account import AccountStatus
from services.account_service import AccountService, RegisterService, TenantService
from .. import api
@@ -71,7 +72,7 @@ class OAuthCallback(Resource):
account = _generate_account(provider, user_info)
# Check account status
if account.status == AccountStatus.BANNED.value or account.status == AccountStatus.CLOSED.value:
if account.status in {AccountStatus.BANNED.value, AccountStatus.CLOSED.value}:
return {"error": "Account is banned or closed."}, 403
if account.status == AccountStatus.PENDING.value:
@@ -81,9 +82,14 @@ class OAuthCallback(Resource):
TenantService.create_owner_tenant_if_not_exist(account)
token = AccountService.login(account, ip_address=get_remote_ip(request))
token_pair = AccountService.login(
account=account,
ip_address=extract_remote_ip(request),
)
return redirect(f"{dify_config.CONSOLE_WEB_URL}?console_token={token}")
return redirect(
f"{dify_config.CONSOLE_WEB_URL}?access_token={token_pair.access_token}&refresh_token={token_pair.refresh_token}"
)
def _get_account_by_openid_or_email(provider: str, user_info: OAuthUserInfo) -> Optional[Account]:
@@ -101,7 +107,7 @@ def _generate_account(provider: str, user_info: OAuthUserInfo):
if not account:
# Create account
account_name = user_info.name if user_info.name else "Dify"
account_name = user_info.name or "Dify"
account = RegisterService.register(
email=user_info.email, name=account_name, password=None, open_id=user_info.id, provider=provider
)
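`get_remote_ip` is renamed to `extract_remote_ip` across the auth controllers. A typical implementation of such a helper checks proxy headers before falling back to `remote_addr`; a sketch under that assumption (the actual `libs.helper` version may read different headers):

from flask import Request

def extract_remote_ip(request: Request) -> str:
    # Behind a proxy, the first X-Forwarded-For entry is the original client.
    if request.headers.get("X-Forwarded-For"):
        return request.headers["X-Forwarded-For"].split(",")[0].strip()
    return request.remote_addr or ""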


@@ -15,8 +15,7 @@ from core.rag.extractor.notion_extractor import NotionExtractor
from extensions.ext_database import db
from fields.data_source_fields import integrate_list_fields, integrate_notion_info_list_fields
from libs.login import login_required
from models.dataset import Document
from models.source import DataSourceOauthBinding
from models import DataSourceOauthBinding, Document
from services.dataset_service import DatasetService, DocumentService
from tasks.document_indexing_sync_task import document_indexing_sync_task


@@ -18,14 +18,14 @@ from core.model_runtime.entities.model_entities import ModelType
from core.provider_manager import ProviderManager
from core.rag.datasource.vdb.vector_type import VectorType
from core.rag.extractor.entity.extract_setting import ExtractSetting
from core.rag.retrieval.retrival_methods import RetrievalMethod
from core.rag.retrieval.retrieval_methods import RetrievalMethod
from extensions.ext_database import db
from fields.app_fields import related_app_list
from fields.dataset_fields import dataset_detail_fields, dataset_query_detail_fields
from fields.document_fields import document_status_fields
from libs.login import login_required
from models.dataset import Dataset, DatasetPermissionEnum, Document, DocumentSegment
from models.model import ApiToken, UploadFile
from models import ApiToken, Dataset, Document, DocumentSegment, UploadFile
from models.dataset import DatasetPermissionEnum
from services.dataset_service import DatasetPermissionService, DatasetService, DocumentService
@@ -49,7 +49,7 @@ class DatasetListApi(Resource):
page = request.args.get("page", default=1, type=int)
limit = request.args.get("limit", default=20, type=int)
ids = request.args.getlist("ids")
provider = request.args.get("provider", default="vendor")
# provider = request.args.get("provider", default="vendor")
search = request.args.get("keyword", default=None, type=str)
tag_ids = request.args.getlist("tag_ids")
@@ -57,7 +57,7 @@ class DatasetListApi(Resource):
datasets, total = DatasetService.get_datasets_by_ids(ids, current_user.current_tenant_id)
else:
datasets, total = DatasetService.get_datasets(
page, limit, provider, current_user.current_tenant_id, current_user, search, tag_ids
page, limit, current_user.current_tenant_id, current_user, search, tag_ids
)
# check embedding setting
@@ -110,6 +110,26 @@ class DatasetListApi(Resource):
nullable=True,
help="Invalid indexing technique.",
)
parser.add_argument(
"external_knowledge_api_id",
type=str,
nullable=True,
required=False,
)
parser.add_argument(
"provider",
type=str,
nullable=True,
choices=Dataset.PROVIDER_LIST,
required=False,
default="vendor",
)
parser.add_argument(
"external_knowledge_id",
type=str,
nullable=True,
required=False,
)
args = parser.parse_args()
# The current user's role in the ta table must be admin, owner, editor, or dataset_operator
@@ -123,6 +143,9 @@ class DatasetListApi(Resource):
indexing_technique=args["indexing_technique"],
account=current_user,
permission=DatasetPermissionEnum.ONLY_ME,
provider=args["provider"],
external_knowledge_api_id=args["external_knowledge_api_id"],
external_knowledge_id=args["external_knowledge_id"],
)
except services.errors.dataset.DatasetNameDuplicateError:
raise DatasetNameDuplicateError()
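`provider` moves from a list-endpoint query arg (now commented out above) to a creation-time field constrained by `Dataset.PROVIDER_LIST`, alongside the two external-knowledge bindings. A hypothetical creation payload for an externally backed dataset (field names from the parser above; values invented):

payload = {
    "name": "support-kb",
    "provider": "external",                    # must be in Dataset.PROVIDER_LIST
    "external_knowledge_api_id": "<api-id>",   # hypothetical IDs
    "external_knowledge_id": "kb-123",
}
# requests.post(f"{BASE}/datasets", json=payload, headers=auth_headers)  # assumed route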
@@ -211,6 +234,33 @@ class DatasetApi(Resource):
)
parser.add_argument("retrieval_model", type=dict, location="json", help="Invalid retrieval model.")
parser.add_argument("partial_member_list", type=list, location="json", help="Invalid parent user list.")
parser.add_argument(
"external_retrieval_model",
type=dict,
required=False,
nullable=True,
location="json",
help="Invalid external retrieval model.",
)
parser.add_argument(
"external_knowledge_id",
type=str,
required=False,
nullable=True,
location="json",
help="Invalid external knowledge id.",
)
parser.add_argument(
"external_knowledge_api_id",
type=str,
required=False,
nullable=True,
location="json",
help="Invalid external knowledge api id.",
)
args = parser.parse_args()
data = request.get_json()
@@ -399,7 +449,7 @@ class DatasetIndexingEstimateApi(Resource):
)
except LLMBadRequestError:
raise ProviderNotInitializeError(
"No Embedding Model available. Please configure a valid provider " "in the Settings -> Model Provider."
"No Embedding Model available. Please configure a valid provider in the Settings -> Model Provider."
)
except ProviderTokenNotInitError as ex:
raise ProviderNotInitializeError(ex.description)
@@ -550,12 +600,7 @@ class DatasetApiBaseUrlApi(Resource):
@login_required
@account_initialization_required
def get(self):
return {
"api_base_url": (
dify_config.SERVICE_API_URL if dify_config.SERVICE_API_URL else request.host_url.rstrip("/")
)
+ "/v1"
}
return {"api_base_url": (dify_config.SERVICE_API_URL or request.host_url.rstrip("/")) + "/v1"}
class DatasetRetrievalSettingApi(Resource):
@@ -568,10 +613,12 @@ class DatasetRetrievalSettingApi(Resource):
case (
VectorType.MILVUS
| VectorType.RELYT
| VectorType.PGVECTOR
| VectorType.TIDB_VECTOR
| VectorType.CHROMA
| VectorType.TENCENT
| VectorType.PGVECTO_RS
| VectorType.BAIDU
| VectorType.VIKINGDB
):
return {"retrieval_method": [RetrievalMethod.SEMANTIC_SEARCH.value]}
case (
@@ -582,6 +629,7 @@ class DatasetRetrievalSettingApi(Resource):
| VectorType.MYSCALE
| VectorType.ORACLE
| VectorType.ELASTICSEARCH
| VectorType.PGVECTOR
):
return {
"retrieval_method": [
@@ -607,6 +655,8 @@ class DatasetRetrievalSettingMockApi(Resource):
| VectorType.CHROMA
| VectorType.TENCENT
| VectorType.PGVECTO_RS
| VectorType.BAIDU
| VectorType.VIKINGDB
):
return {"retrieval_method": [RetrievalMethod.SEMANTIC_SEARCH.value]}
case (
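Both retrieval-setting endpoints gain `BAIDU` and `VIKINGDB` in the semantic-only group, and `PGVECTOR` graduates to the full-featured group. The `match`/`case` alternation pattern in isolation, with a trimmed hypothetical enum standing in for the real `VectorType`:

from enum import Enum

class VectorType(str, Enum):  # trimmed stand-in for the full enum
    MILVUS = "milvus"
    BAIDU = "baidu"
    PGVECTOR = "pgvector"
    ELASTICSEARCH = "elasticsearch"

def retrieval_methods(vector_type: VectorType) -> list[str]:
    match vector_type:
        case VectorType.MILVUS | VectorType.BAIDU:
            # stores without built-in keyword search: semantic only
            return ["semantic_search"]
        case VectorType.PGVECTOR | VectorType.ELASTICSEARCH:
            return ["semantic_search", "full_text_search", "hybrid_search"]
        case _:
            raise ValueError(f"Unsupported vector db type {vector_type}.")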


@@ -46,8 +46,7 @@ from fields.document_fields import (
document_with_segments_fields,
)
from libs.login import login_required
from models.dataset import Dataset, DatasetProcessRule, Document, DocumentSegment
from models.model import UploadFile
from models import Dataset, DatasetProcessRule, Document, DocumentSegment, UploadFile
from services.dataset_service import DatasetService, DocumentService
from tasks.add_document_to_index_task import add_document_to_index_task
from tasks.remove_document_from_index_task import remove_document_from_index_task
@@ -302,6 +301,8 @@ class DatasetInitApi(Resource):
"doc_language", type=str, default="English", required=False, nullable=False, location="json"
)
parser.add_argument("retrieval_model", type=dict, required=False, nullable=False, location="json")
parser.add_argument("embedding_model", type=str, required=False, nullable=True, location="json")
parser.add_argument("embedding_model_provider", type=str, required=False, nullable=True, location="json")
args = parser.parse_args()
# The current user's role in the ta table must be admin, owner, editor, or dataset_operator
@@ -309,6 +310,8 @@ class DatasetInitApi(Resource):
raise Forbidden()
if args["indexing_technique"] == "high_quality":
if args["embedding_model"] is None or args["embedding_model_provider"] is None:
raise ValueError("embedding model and embedding model provider are required for high quality indexing.")
try:
model_manager = ModelManager()
model_manager.get_default_model_instance(
@@ -350,7 +353,7 @@ class DocumentIndexingEstimateApi(DocumentResource):
document_id = str(document_id)
document = self.get_document(dataset_id, document_id)
if document.indexing_status in ["completed", "error"]:
if document.indexing_status in {"completed", "error"}:
raise DocumentAlreadyFinishedError()
data_process_rule = document.dataset_process_rule
@@ -417,7 +420,7 @@ class DocumentBatchIndexingEstimateApi(DocumentResource):
info_list = []
extract_settings = []
for document in documents:
if document.indexing_status in ["completed", "error"]:
if document.indexing_status in {"completed", "error"}:
raise DocumentAlreadyFinishedError()
data_source_info = document.data_source_info_dict
# format document files info
@@ -661,7 +664,7 @@ class DocumentProcessingApi(DocumentResource):
db.session.commit()
elif action == "resume":
if document.indexing_status not in ["paused", "error"]:
if document.indexing_status not in {"paused", "error"}:
raise InvalidActionError("Document not in paused or error state.")
document.paused_by = None
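This hunk and many below swap list literals for set literals in membership tests. Behavior is identical for these hashable strings; the set form is what linters commonly suggest for literal membership tests, and lookup is a constant-time hash check:

status = "paused"
if status not in {"paused", "error"}:  # set literal: hashed lookup, clearer intent
    raise ValueError("Document not in paused or error state.")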


@@ -24,7 +24,7 @@ from extensions.ext_database import db
from extensions.ext_redis import redis_client
from fields.segment_fields import segment_fields
from libs.login import login_required
from models.dataset import DocumentSegment
from models import DocumentSegment
from services.dataset_service import DatasetService, DocumentService, SegmentService
from tasks.batch_create_segment_to_index_task import batch_create_segment_to_index_task
from tasks.disable_segment_from_index_task import disable_segment_from_index_task


@@ -0,0 +1,263 @@
from flask import request
from flask_login import current_user
from flask_restful import Resource, marshal, reqparse
from werkzeug.exceptions import Forbidden, InternalServerError, NotFound
import services
from controllers.console import api
from controllers.console.datasets.error import DatasetNameDuplicateError
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from fields.dataset_fields import dataset_detail_fields
from libs.login import login_required
from services.dataset_service import DatasetService
from services.external_knowledge_service import ExternalDatasetService
from services.hit_testing_service import HitTestingService
from services.knowledge_service import ExternalDatasetTestService
def _validate_name(name):
if not name or len(name) < 1 or len(name) > 100:
raise ValueError("Name must be between 1 to 100 characters.")
return name
def _validate_description_length(description):
if description and len(description) > 400:
raise ValueError("Description cannot exceed 400 characters.")
return description
class ExternalApiTemplateListApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self):
page = request.args.get("page", default=1, type=int)
limit = request.args.get("limit", default=20, type=int)
search = request.args.get("keyword", default=None, type=str)
external_knowledge_apis, total = ExternalDatasetService.get_external_knowledge_apis(
page, limit, current_user.current_tenant_id, search
)
response = {
"data": [item.to_dict() for item in external_knowledge_apis],
"has_more": len(external_knowledge_apis) == limit,
"limit": limit,
"total": total,
"page": page,
}
return response, 200
@setup_required
@login_required
@account_initialization_required
def post(self):
parser = reqparse.RequestParser()
parser.add_argument(
"name",
nullable=False,
required=True,
help="Name is required. Name must be between 1 to 100 characters.",
type=_validate_name,
)
parser.add_argument(
"settings",
type=dict,
location="json",
nullable=False,
required=True,
)
args = parser.parse_args()
ExternalDatasetService.validate_api_list(args["settings"])
# The current user's role in the ta table must be admin, owner, editor, or dataset_operator
if not current_user.is_dataset_editor:
raise Forbidden()
try:
external_knowledge_api = ExternalDatasetService.create_external_knowledge_api(
tenant_id=current_user.current_tenant_id, user_id=current_user.id, args=args
)
except services.errors.dataset.DatasetNameDuplicateError:
raise DatasetNameDuplicateError()
return external_knowledge_api.to_dict(), 201
class ExternalApiTemplateApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self, external_knowledge_api_id):
external_knowledge_api_id = str(external_knowledge_api_id)
external_knowledge_api = ExternalDatasetService.get_external_knowledge_api(external_knowledge_api_id)
if external_knowledge_api is None:
raise NotFound("API template not found.")
return external_knowledge_api.to_dict(), 200
@setup_required
@login_required
@account_initialization_required
def patch(self, external_knowledge_api_id):
external_knowledge_api_id = str(external_knowledge_api_id)
parser = reqparse.RequestParser()
parser.add_argument(
"name",
nullable=False,
required=True,
help="type is required. Name must be between 1 to 100 characters.",
type=_validate_name,
)
parser.add_argument(
"settings",
type=dict,
location="json",
nullable=False,
required=True,
)
args = parser.parse_args()
ExternalDatasetService.validate_api_list(args["settings"])
external_knowledge_api = ExternalDatasetService.update_external_knowledge_api(
tenant_id=current_user.current_tenant_id,
user_id=current_user.id,
external_knowledge_api_id=external_knowledge_api_id,
args=args,
)
return external_knowledge_api.to_dict(), 200
@setup_required
@login_required
@account_initialization_required
def delete(self, external_knowledge_api_id):
external_knowledge_api_id = str(external_knowledge_api_id)
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor or current_user.is_dataset_operator:
raise Forbidden()
ExternalDatasetService.delete_external_knowledge_api(current_user.current_tenant_id, external_knowledge_api_id)
return {"result": "success"}, 200
class ExternalApiUseCheckApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self, external_knowledge_api_id):
external_knowledge_api_id = str(external_knowledge_api_id)
external_knowledge_api_is_using, count = ExternalDatasetService.external_knowledge_api_use_check(
external_knowledge_api_id
)
return {"is_using": external_knowledge_api_is_using, "count": count}, 200
class ExternalDatasetCreateApi(Resource):
@setup_required
@login_required
@account_initialization_required
def post(self):
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("external_knowledge_api_id", type=str, required=True, nullable=False, location="json")
parser.add_argument("external_knowledge_id", type=str, required=True, nullable=False, location="json")
parser.add_argument(
"name",
nullable=False,
required=True,
help="name is required. Name must be between 1 to 100 characters.",
type=_validate_name,
)
parser.add_argument("description", type=str, required=False, nullable=True, location="json")
parser.add_argument("external_retrieval_model", type=dict, required=False, location="json")
args = parser.parse_args()
# The current user's role in the ta table must be admin, owner, editor, or dataset_operator
if not current_user.is_dataset_editor:
raise Forbidden()
try:
dataset = ExternalDatasetService.create_external_dataset(
tenant_id=current_user.current_tenant_id,
user_id=current_user.id,
args=args,
)
except services.errors.dataset.DatasetNameDuplicateError:
raise DatasetNameDuplicateError()
return marshal(dataset, dataset_detail_fields), 201
class ExternalKnowledgeHitTestingApi(Resource):
@setup_required
@login_required
@account_initialization_required
def post(self, dataset_id):
dataset_id_str = str(dataset_id)
dataset = DatasetService.get_dataset(dataset_id_str)
if dataset is None:
raise NotFound("Dataset not found.")
try:
DatasetService.check_dataset_permission(dataset, current_user)
except services.errors.account.NoPermissionError as e:
raise Forbidden(str(e))
parser = reqparse.RequestParser()
parser.add_argument("query", type=str, location="json")
parser.add_argument("external_retrieval_model", type=dict, required=False, location="json")
args = parser.parse_args()
HitTestingService.hit_testing_args_check(args)
try:
response = HitTestingService.external_retrieve(
dataset=dataset,
query=args["query"],
account=current_user,
external_retrieval_model=args["external_retrieval_model"],
)
return response
except Exception as e:
raise InternalServerError(str(e))
class BedrockRetrievalApi(Resource):
# this api is only for internal testing
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("retrieval_setting", nullable=False, required=True, type=dict, location="json")
parser.add_argument(
"query",
nullable=False,
required=True,
type=str,
)
parser.add_argument("knowledge_id", nullable=False, required=True, type=str)
args = parser.parse_args()
# Call the knowledge retrieval service
result = ExternalDatasetTestService.knowledge_retrieval(
args["retrieval_setting"], args["query"], args["knowledge_id"]
)
return result, 200
api.add_resource(ExternalKnowledgeHitTestingApi, "/datasets/<uuid:dataset_id>/external-hit-testing")
api.add_resource(ExternalDatasetCreateApi, "/datasets/external")
api.add_resource(ExternalApiTemplateListApi, "/datasets/external-knowledge-api")
api.add_resource(ExternalApiTemplateApi, "/datasets/external-knowledge-api/<uuid:external_knowledge_api_id>")
api.add_resource(ExternalApiUseCheckApi, "/datasets/external-knowledge-api/<uuid:external_knowledge_api_id>/use-check")
# this api is only for internal testing
api.add_resource(BedrockRetrievalApi, "/test/retrieval")
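End to end, the new routes let a client register an external knowledge API, bind a dataset to it, and hit-test retrieval. A hedged walkthrough (`BASE` and `auth_headers` are hypothetical; the `settings` schema is assumed, since the diff only shows it validated as a dict):

import requests

BASE = "https://dify.example.com/console/api"       # hypothetical
auth_headers = {"Authorization": "Bearer <token>"}  # hypothetical

# 1. Register the external knowledge API template.
tpl = requests.post(
    f"{BASE}/datasets/external-knowledge-api",
    json={"name": "my-retriever", "settings": {"endpoint": "https://kb.example.com", "api_key": "..."}},
    headers=auth_headers,
).json()

# 2. Create an external dataset bound to that template (assuming to_dict() exposes "id").
dataset = requests.post(
    f"{BASE}/datasets/external",
    json={
        "name": "support-kb",
        "external_knowledge_api_id": tpl["id"],
        "external_knowledge_id": "kb-123",
    },
    headers=auth_headers,
).json()

# 3. Hit-test retrieval through the external provider.
hits = requests.post(
    f"{BASE}/datasets/{dataset['id']}/external-hit-testing",
    json={"query": "How do I reset my password?"},
    headers=auth_headers,
).json()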


@@ -1,9 +1,12 @@
import urllib.parse
from flask import request
from flask_login import current_user
from flask_restful import Resource, marshal_with
import services
from configs import dify_config
from constants import DOCUMENT_EXTENSIONS
from controllers.console import api
from controllers.console.datasets.error import (
FileTooLargeError,
@@ -13,9 +16,10 @@ from controllers.console.datasets.error import (
)
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required, cloud_edition_billing_resource_check
from fields.file_fields import file_fields, upload_config_fields
from core.helper import ssrf_proxy
from fields.file_fields import file_fields, remote_file_info_fields, upload_config_fields
from libs.login import login_required
from services.file_service import ALLOWED_EXTENSIONS, UNSTRUCTURED_ALLOWED_EXTENSIONS, FileService
from services.file_service import FileService
PREVIEW_WORDS_LIMIT = 3000
@@ -51,7 +55,7 @@ class FileApi(Resource):
if len(request.files) > 1:
raise TooManyFilesError()
try:
upload_file = FileService.upload_file(file, current_user)
upload_file = FileService.upload_file(file=file, user=current_user)
except services.errors.file.FileTooLargeError as file_too_large_error:
raise FileTooLargeError(file_too_large_error.description)
except services.errors.file.UnsupportedFileTypeError:
@@ -75,11 +79,24 @@ class FileSupportTypeApi(Resource):
@login_required
@account_initialization_required
def get(self):
etl_type = dify_config.ETL_TYPE
allowed_extensions = UNSTRUCTURED_ALLOWED_EXTENSIONS if etl_type == "Unstructured" else ALLOWED_EXTENSIONS
return {"allowed_extensions": allowed_extensions}
return {"allowed_extensions": DOCUMENT_EXTENSIONS}
class RemoteFileInfoApi(Resource):
@marshal_with(remote_file_info_fields)
def get(self, url):
decoded_url = urllib.parse.unquote(url)
try:
response = ssrf_proxy.head(decoded_url)
return {
"file_type": response.headers.get("Content-Type", "application/octet-stream"),
"file_length": int(response.headers.get("Content-Length", 0)),
}
except Exception as e:
return {"error": str(e)}, 400
api.add_resource(FileApi, "/files/upload")
api.add_resource(FilePreviewApi, "/files/<uuid:file_id>/preview")
api.add_resource(FileSupportTypeApi, "/files/support-type")
api.add_resource(RemoteFileInfoApi, "/remote-files/<path:url>")
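`RemoteFileInfoApi` takes the target URL as a path segment and `unquote`s it, so callers must percent-encode the whole URL; `ssrf_proxy.head` keeps the outbound request behind Dify's egress proxy. A usage sketch:

import urllib.parse

remote_url = "https://example.com/spec.pdf"
encoded = urllib.parse.quote(remote_url, safe="")  # the route decodes with unquote()

# GET {BASE}/remote-files/{encoded} ->
# {"file_type": "application/pdf", "file_length": 123456}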


@@ -47,6 +47,7 @@ class HitTestingApi(Resource):
parser = reqparse.RequestParser()
parser.add_argument("query", type=str, location="json")
parser.add_argument("retrieval_model", type=dict, required=False, location="json")
parser.add_argument("external_retrieval_model", type=dict, required=False, location="json")
args = parser.parse_args()
HitTestingService.hit_testing_args_check(args)
@@ -57,6 +58,7 @@ class HitTestingApi(Resource):
query=args["query"],
account=current_user,
retrieval_model=args["retrieval_model"],
external_retrieval_model=args["external_retrieval_model"],
limit=10,
)


@@ -14,7 +14,9 @@ class WebsiteCrawlApi(Resource):
@account_initialization_required
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("provider", type=str, choices=["firecrawl"], required=True, nullable=True, location="json")
parser.add_argument(
"provider", type=str, choices=["firecrawl", "jinareader"], required=True, nullable=True, location="json"
)
parser.add_argument("url", type=str, required=True, nullable=True, location="json")
parser.add_argument("options", type=dict, required=True, nullable=True, location="json")
args = parser.parse_args()
@@ -33,7 +35,7 @@ class WebsiteCrawlStatusApi(Resource):
@account_initialization_required
def get(self, job_id: str):
parser = reqparse.RequestParser()
parser.add_argument("provider", type=str, choices=["firecrawl"], required=True, location="args")
parser.add_argument("provider", type=str, choices=["firecrawl", "jinareader"], required=True, location="args")
args = parser.parse_args()
# get crawl status
try:
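With `jinareader` accepted next to `firecrawl`, a crawl submission and status poll look roughly like this (routes assumed from this controller's resources, which fall outside the hunk; options schema hypothetical):

import requests

BASE = "https://dify.example.com/console/api"       # hypothetical
auth_headers = {"Authorization": "Bearer <token>"}  # hypothetical

job = requests.post(
    f"{BASE}/website/crawl",  # assumed route
    json={"provider": "jinareader", "url": "https://example.com", "options": {"limit": 10}},
    headers=auth_headers,
).json()

# status = requests.get(
#     f"{BASE}/website/crawl/status/{job['job_id']}",  # assumed route and response key
#     params={"provider": "jinareader"},
#     headers=auth_headers,
# ).json()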


@@ -18,9 +18,7 @@ class NotSetupError(BaseHTTPException):
class NotInitValidateError(BaseHTTPException):
error_code = "not_init_validated"
description = (
"Init validation has not been completed yet. " "Please proceed with the init validation process first."
)
description = "Init validation has not been completed yet. Please proceed with the init validation process first."
code = 401


@@ -81,19 +81,15 @@ class ChatTextApi(InstalledAppResource):
message_id = args.get("message_id", None)
text = args.get("text", None)
if (
app_model.mode in [AppMode.ADVANCED_CHAT.value, AppMode.WORKFLOW.value]
app_model.mode in {AppMode.ADVANCED_CHAT.value, AppMode.WORKFLOW.value}
and app_model.workflow
and app_model.workflow.features_dict
):
text_to_speech = app_model.workflow.features_dict.get("text_to_speech")
voice = args.get("voice") if args.get("voice") else text_to_speech.get("voice")
voice = args.get("voice") or text_to_speech.get("voice")
else:
try:
voice = (
args.get("voice")
if args.get("voice")
else app_model.app_model_config.text_to_speech_dict.get("voice")
)
voice = args.get("voice") or app_model.app_model_config.text_to_speech_dict.get("voice")
except Exception:
voice = None
response = AudioService.transcript_tts(app_model=app_model, message_id=message_id, voice=voice, text=text)
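The voice fallback collapses a conditional expression into `or`. The two are equivalent here because any falsy voice (None or an empty string) should fall through to the feature default, and the `or` form evaluates `args.get("voice")` only once:

args = {"voice": ""}
voice = args.get("voice") or "alloy"  # empty string falls through to the default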


@@ -92,7 +92,7 @@ class ChatApi(InstalledAppResource):
def post(self, installed_app):
app_model = installed_app.app
app_mode = AppMode.value_of(app_model.mode)
if app_mode not in [AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT]:
if app_mode not in {AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT}:
raise NotChatAppError()
parser = reqparse.RequestParser()
@@ -100,6 +100,7 @@ class ChatApi(InstalledAppResource):
parser.add_argument("query", type=str, required=True, location="json")
parser.add_argument("files", type=list, required=False, location="json")
parser.add_argument("conversation_id", type=uuid_value, location="json")
parser.add_argument("parent_message_id", type=uuid_value, required=False, location="json")
parser.add_argument("retriever_from", type=str, required=False, default="explore_app", location="json")
args = parser.parse_args()
@@ -140,7 +141,7 @@ class ChatStopApi(InstalledAppResource):
def post(self, installed_app, task_id):
app_model = installed_app.app
app_mode = AppMode.value_of(app_model.mode)
if app_mode not in [AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT]:
if app_mode not in {AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT}:
raise NotChatAppError()
AppQueueManager.set_stop_flag(task_id, InvokeFrom.EXPLORE, current_user.id)


@@ -20,7 +20,7 @@ class ConversationListApi(InstalledAppResource):
def get(self, installed_app):
app_model = installed_app.app
app_mode = AppMode.value_of(app_model.mode)
if app_mode not in [AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT]:
if app_mode not in {AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT}:
raise NotChatAppError()
parser = reqparse.RequestParser()
@@ -50,7 +50,7 @@ class ConversationApi(InstalledAppResource):
def delete(self, installed_app, c_id):
app_model = installed_app.app
app_mode = AppMode.value_of(app_model.mode)
if app_mode not in [AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT]:
if app_mode not in {AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT}:
raise NotChatAppError()
conversation_id = str(c_id)
@@ -68,7 +68,7 @@ class ConversationRenameApi(InstalledAppResource):
def post(self, installed_app, c_id):
app_model = installed_app.app
app_mode = AppMode.value_of(app_model.mode)
if app_mode not in [AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT]:
if app_mode not in {AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT}:
raise NotChatAppError()
conversation_id = str(c_id)
@@ -90,7 +90,7 @@ class ConversationPinApi(InstalledAppResource):
def patch(self, installed_app, c_id):
app_model = installed_app.app
app_mode = AppMode.value_of(app_model.mode)
if app_mode not in [AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT]:
if app_mode not in {AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT}:
raise NotChatAppError()
conversation_id = str(c_id)
@@ -107,7 +107,7 @@ class ConversationUnPinApi(InstalledAppResource):
def patch(self, installed_app, c_id):
app_model = installed_app.app
app_mode = AppMode.value_of(app_model.mode)
if app_mode not in [AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT]:
if app_mode not in {AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT}:
raise NotChatAppError()
conversation_id = str(c_id)


@@ -11,7 +11,7 @@ from controllers.console.wraps import account_initialization_required, cloud_edi
from extensions.ext_database import db
from fields.installed_app_fields import installed_app_list_fields
from libs.login import login_required
from models.model import App, InstalledApp, RecommendedApp
from models import App, InstalledApp, RecommendedApp
from services.account_service import TenantService
@@ -31,7 +31,7 @@ class InstalledAppsListApi(Resource):
"app_owner_tenant_id": installed_app.app_owner_tenant_id,
"is_pinned": installed_app.is_pinned,
"last_used_at": installed_app.last_used_at,
"editable": current_user.role in ["owner", "admin"],
"editable": current_user.role in {"owner", "admin"},
"uninstallable": current_tenant_id == installed_app.app_owner_tenant_id,
}
for installed_app in installed_apps


@@ -40,7 +40,7 @@ class MessageListApi(InstalledAppResource):
app_model = installed_app.app
app_mode = AppMode.value_of(app_model.mode)
if app_mode not in [AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT]:
if app_mode not in {AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT}:
raise NotChatAppError()
parser = reqparse.RequestParser()
@@ -51,7 +51,7 @@ class MessageListApi(InstalledAppResource):
try:
return MessageService.pagination_by_first_id(
app_model, current_user, args["conversation_id"], args["first_id"], args["limit"]
app_model, current_user, args["conversation_id"], args["first_id"], args["limit"], "desc"
)
except services.errors.conversation.ConversationNotExistsError:
raise NotFound("Conversation Not Exists.")
@@ -125,7 +125,7 @@ class MessageSuggestedQuestionApi(InstalledAppResource):
def get(self, installed_app, message_id):
app_model = installed_app.app
app_mode = AppMode.value_of(app_model.mode)
if app_mode not in [AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT]:
if app_mode not in {AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT}:
raise NotChatAppError()
message_id = str(message_id)


@@ -43,7 +43,7 @@ class AppParameterApi(InstalledAppResource):
"""Retrieve app parameters."""
app_model = installed_app.app
if app_model.mode in [AppMode.ADVANCED_CHAT.value, AppMode.WORKFLOW.value]:
if app_model.mode in {AppMode.ADVANCED_CHAT.value, AppMode.WORKFLOW.value}:
workflow = app_model.workflow
if workflow is None:
raise AppUnavailableError()


@@ -18,7 +18,7 @@ message_fields = {
"inputs": fields.Raw,
"query": fields.String,
"answer": fields.String,
"message_files": fields.List(fields.Nested(message_file_fields), attribute="files"),
"message_files": fields.List(fields.Nested(message_file_fields)),
"feedback": fields.Nested(feedback_fields, attribute="user_feedback", allow_null=True),
"created_at": TimestampField,
}


@@ -7,7 +7,7 @@ from werkzeug.exceptions import NotFound
from controllers.console.wraps import account_initialization_required
from extensions.ext_database import db
from libs.login import login_required
from models.model import InstalledApp
from models import InstalledApp
def installed_app_required(view=None):


@@ -4,7 +4,7 @@ from flask import session
from flask_restful import Resource, reqparse
from configs import dify_config
from libs.helper import str_len
from libs.helper import StrLen
from models.model import DifySetup
from services.account_service import TenantService
@@ -28,7 +28,7 @@ class InitValidateAPI(Resource):
raise AlreadySetupError()
parser = reqparse.RequestParser()
parser.add_argument("password", type=str_len(30), required=True, location="json")
parser.add_argument("password", type=StrLen(30), required=True, location="json")
input_password = parser.parse_args()["password"]
if input_password != os.environ.get("INIT_PASSWORD"):


@@ -4,7 +4,7 @@ from flask import request
from flask_restful import Resource, reqparse
from configs import dify_config
from libs.helper import email, get_remote_ip, str_len
from libs.helper import StrLen, email, extract_remote_ip
from libs.password import valid_password
from models.model import DifySetup
from services.account_service import RegisterService, TenantService
@@ -40,13 +40,13 @@ class SetupApi(Resource):
parser = reqparse.RequestParser()
parser.add_argument("email", type=email, required=True, location="json")
parser.add_argument("name", type=str_len(30), required=True, location="json")
parser.add_argument("name", type=StrLen(30), required=True, location="json")
parser.add_argument("password", type=valid_password, required=True, location="json")
args = parser.parse_args()
# setup
RegisterService.setup(
email=args["email"], name=args["name"], password=args["password"], ip_address=get_remote_ip(request)
email=args["email"], name=args["name"], password=args["password"], ip_address=extract_remote_ip(request)
)
return {"result": "success"}, 201

Some files were not shown because too many files have changed in this diff.