Compare commits

...

597 Commits

Author SHA1 Message Date
-LAN-
926546b153 chore: bump version to 0.14.1 (#11784)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-18 16:35:54 +08:00
xander-art
56434db4f5 feat: add hunyuan model (hunyuan-role, hunyuan-large, hunyuan-large-rol… (#11766)
Co-authored-by: xanderdong <xanderdong@tencent.com>
2024-12-18 15:25:53 +08:00
-LAN-
688292e6ff chore(opendal_storage): remove unused comment (#11783)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-18 15:20:54 +08:00
Shun Miyazawa
f7415e1ca4 feat: Disable the "Forgot your password?" button when the mail server setup is incomplete (#11653) 2024-12-18 15:20:41 +08:00
-LAN-
2961fa0e08 chore(.env.example): add comments for opendal (#11778)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-18 15:20:03 +08:00
Jiang
ad17ff9a92 Lindorm vdb bug-fix (#11790)
Co-authored-by: jiangzhijie <jiangzhijie.jzj@alibaba-inc.com>
2024-12-18 15:19:20 +08:00
Benjamin
558ab25f51 fix: imperfect service-api introduction text (#11782) 2024-12-18 13:43:34 +08:00
-LAN-
a5db7c9acb feat: add openai o1 & update pricing and max_token of other models (#11780)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-18 12:15:11 +08:00
Joe
580297e290 fix: file upload auth (#11774) 2024-12-18 11:02:40 +08:00
DDDDD12138
79d11ea709 feat: add parameters for JinaReaderTool (#11613) 2024-12-18 09:08:06 +08:00
-LAN-
99f40a9682 feat: full support for opendal and sync configurations between .env and docker-compose (#11754)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-18 09:05:54 +08:00
-LAN-
e86756cb39 feat(app_factory): speed up api startup (#11762)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-18 09:05:31 +08:00
barabicu
1325246da8 fix: Prevent redirection to /overview when accessing /workflow. (#11733) 2024-12-18 08:37:22 +08:00
Hiroshi Fujita
dfa9a91906 (doc) fix: update cURL examples to include Authorization header (#11750) 2024-12-17 17:44:40 +08:00
Charlie.Wei
5e2926a207 Fix explore app icon (#11742)
Co-authored-by: luowei <glpat-EjySCyNjWiLqAED-YmwM>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-12-17 17:42:44 +08:00
非法操作
9048832a9a chore: improve gemini models (#11745) 2024-12-17 17:42:21 +08:00
Shota Totsuka
7d5a385811 feat: use Gemini response metadata for token counting (#11743) 2024-12-17 17:42:05 +08:00
-LAN-
900e93f758 chore: update comments in docker env file (#11705)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-17 15:45:00 +08:00
sino
99430a5931 feat(ark): support doubao vision series models (#11740) 2024-12-17 15:43:11 +08:00
非法操作
c9b4029ce7 chore: the consistency of MultiModalPromptMessageContent (#11721) 2024-12-17 15:01:38 +08:00
Bowen Liang
78c3051585 fix: make tidb service optional with proper profile in docker compose yaml (#11729) 2024-12-17 14:25:15 +08:00
呆萌闷油瓶
cd4310df25 chore: update azure api version (#11711) 2024-12-17 13:39:56 +08:00
-LAN-
259cff9f22 fix(api/ops_trace): avoid raise exception directly (#11732)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-17 13:38:57 +08:00
Hanqing Zhao
7b7eb00385 Modify translation for error branch (#11731) 2024-12-17 13:28:13 +08:00
-LAN-
62b9e5a6f9 feat(knowledge_retrieval_node): Suppress exceptions thrown by DatasetRetrieval (#11728)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-17 13:12:29 +08:00
NFish
a399502ecd Dark Mode: Workflow darkmode style (#11695) 2024-12-17 12:20:49 +08:00
-LAN-
92a840f1b2 feat(tool_node): Suppress exceptions thrown by the Tool (#11724)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-17 12:11:50 +08:00
非法操作
74fdc16bd1 feat: enhance gemini models (#11497) 2024-12-17 12:05:13 +08:00
yihong
56cfdce453 chore: update docker env close #11703 (#11706)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-12-17 09:01:23 +08:00
yihong
efa8eb379f fix: memory leak by pypdfium2 close(maybe) #11510 (#11700)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-12-17 00:42:01 +08:00
crazywoola
7f095bdc42 fix: image icon can not display (#11701) 2024-12-16 19:15:23 +08:00
Kazuhisa Wada
e20161b3de make login lockout duration configurable (#11699) 2024-12-16 19:05:27 +08:00
方程
fc8fdbacb4 feat: add gitee ai vl models (#11697)
Co-authored-by: 方程 <fangcheng@oschina.cn>
2024-12-16 18:45:26 +08:00
longfengpili
7fde638556 fix: fix proxy for docker (#11681) 2024-12-16 18:43:59 +08:00
非法操作
be93c19b7e chore: remove duplicate folder with case sensitivity issue (#11687) 2024-12-16 17:59:00 +08:00
-LAN-
967eb81112 chore: bump version to 0.14.0 (#11679)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-16 15:49:17 +08:00
zxhlyh
9f602f73eb fix: workflow continue on error edge color (#11689) 2024-12-16 15:39:53 +08:00
Joel
41de7e76ec fix: iteration output array type causes always outputting string array (#11686) 2024-12-16 15:06:03 +08:00
Joel
607a22ad12 fix: tool constant params change cause page crashed (#11682) 2024-12-16 14:33:00 +08:00
wangbin77
4b402c4041 fix: enhance workflow.tool_published performance (#11640)
Co-authored-by: wangbin <wangbin35@xiaomi.com>
2024-12-16 13:05:38 +08:00
zhongliliu-butterfly
daccb10d8c fix: volcengine_maas and baichuan message error (#11625)
Co-authored-by: zhongliliu <liuzlx@digitalchina.com>
2024-12-16 13:05:27 +08:00
Kazuhisa Wada
63f1dd7877 Make max_submit_count configurable via Config (#11673) 2024-12-16 12:59:37 +08:00
zhaobingshuang
79801f5c30 fix: deepseek reports an error when using Response Format #11677 (#11678)
Co-authored-by: zhaobs <zhaobs@cailian.net>
2024-12-16 12:58:03 +08:00
非法操作
9c7a1bc067 fix: change http node params from dict to list tuple (#11665) 2024-12-15 21:27:39 +08:00
非法操作
cf0ff88120 feat: add grok-2-1212 and grok-2-vision-1212 (#11672) 2024-12-15 21:18:24 +08:00
Novice
e0b67536e0 fix: remove the unused QueueWorkflowPartialSuccessEvent handle in workflow (#11669)
Co-authored-by: Novice Lee <novicelee@NoviPro.local>
2024-12-15 21:18:14 +08:00
github-actions[bot]
94c7dcc7f1 chore: translate i18n files (#11639)
Co-authored-by: douxc <7553076+douxc@users.noreply.github.com>
2024-12-15 17:22:45 +08:00
luckylhb90
38e155d819 feat: log add trace id (#11599)
Co-authored-by: hobo.l <hobo.l@binance.com>
2024-12-15 17:22:25 +08:00
yihong
efd5575683 fix: _handle_workflow_run_partial_success args is wrong (#11562)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-12-15 17:22:13 +08:00
yihong
1a7c213405 fix: ExternalDatasetService.process_external_api wrong args (#11586)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-12-15 17:22:03 +08:00
yihong
8e3d60c359 fix: account.id should be account_id (#11628)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-12-15 17:18:17 +08:00
Bowen Liang
924b4fe742 test: run vdb tests on TiDB Vector with docker in CI tests (#11645) 2024-12-15 17:16:40 +08:00
yihong
7e154a467b fix: better error message for stream (#11635)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-12-15 17:16:04 +08:00
IWAI, Masaharu
b90f1581be Update translate to Japanese: natural Japanese expression (#11647)
Co-authored-by: IWAI, Masaharu <iwai_masaharu@funkit.co.jp>
2024-12-15 17:15:24 +08:00
yihong
821992e21f fix: langfuse does not have created_at args and fix the typing in the file (#11648)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-12-15 17:13:46 +08:00
IWAI, Masaharu
f0c0ce9db1 fix: rename README filename: Japanese language code is 'JA' (#11651) 2024-12-15 17:13:34 +08:00
Junyan Qin
8ecb9aaa91 fix: remove unnecessary curly braces in wf api doc (#11658) 2024-12-15 17:12:26 +08:00
yihong
22258fb0bf fix: filter bug for keyword that made code unreachable (#11666)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-12-15 17:12:06 +08:00
NFish
a725b8bb6e Feat: new entry point for app creation (#10847) 2024-12-13 17:29:09 +08:00
Kevin9703
bdfdccd511 fix: app log filter value error (#11624) 2024-12-13 16:40:34 +08:00
yihong
194bc60429 fix: split dir for opendal tests (#11627)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-12-13 16:31:00 +08:00
-LAN-
430ca3322b chore(dependency): bump gunicorn to 23.0 (#11560)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-13 16:16:58 +08:00
zxhlyh
3d803c2e80 Fix/pdf preview in build (#11621) 2024-12-13 11:01:53 +08:00
Hiroshi Fujita
fa3dcbb3bc feat(devcontainer): add alias to stop Docker containers (#11616) 2024-12-13 10:03:58 +08:00
yihong
ee342063d8 ci: better print version for ruff to check the change (#11587)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-12-12 21:44:00 +08:00
JasonVV
bb3bc60f83 feat(model): add vertex_ai Gemini 2.0 Flash Exp (#11604) 2024-12-12 20:20:49 +08:00
crazywoola
e7a4cfac4d fix: name of llama-3.3-70b-specdec (#11596) 2024-12-12 16:33:49 +08:00
Alok Shrivastwa
6478aa1c9d Added new models and removed deleted ones for Groq #11455 (#11456)
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: Alok Shrivastwa <Alok.Shrivastwa@microland.com>
2024-12-12 14:11:30 +08:00
Warren Chen
7b5839335a [ref] use one method to get boto client for aws bedrock (#11506) 2024-12-12 13:56:52 +08:00
github-actions[bot]
a360af8687 chore: translate i18n files (#11577)
Co-authored-by: JzoNgKVO <27049666+JzoNgKVO@users.noreply.github.com>
2024-12-12 13:47:39 +08:00
yihong
36cb25b341 fix: support mdx files close #11557 (#11565)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-12-12 13:37:56 +08:00
Joe
e565ecdaef fix: change workflow trace id (#11585) 2024-12-12 13:37:29 +08:00
KVOJJJin
f96fdc2970 Feat: dark mode for logs and annotations (#11575) 2024-12-12 10:09:48 +08:00
Jiang
0d04cdc323 Lindorm vdb (#11574)
Co-authored-by: jiangzhijie <jiangzhijie.jzj@alibaba-inc.com>
2024-12-12 09:43:27 +08:00
非法操作
926f604f09 feat: add gemini-2.0-flash-exp (#11570) 2024-12-12 09:33:39 +08:00
yihong
180743612c fix: better opendal tests (#11569)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-12-12 09:33:30 +08:00
liuzhenghua
d05f189049 Fix: RateLimit requests were not released when a streaming generation exception occurred (#11540) 2024-12-11 19:16:35 +08:00
github-actions[bot]
ceaa9f1101 chore: translate i18n files (#11545)
Co-authored-by: zxhlyh <16177003+zxhlyh@users.noreply.github.com>
2024-12-11 18:04:14 +08:00
zxhlyh
6f4cbe0bde fix: workflow continue on error doc link (#11554) 2024-12-11 18:03:41 +08:00
-LAN-
8d4bb9b40d feat: integrate opendal storage (#11508)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-11 14:50:54 +08:00
Novice
1765fe2a29 fix: iteration node in parallel mode token count error (#11539)
Co-authored-by: Novice Lee <novicelee@NoviPro.local>
2024-12-11 14:23:01 +08:00
Novice
79a710ce98 Feat: continue on error (#11458)
Co-authored-by: Novice Lee <novicelee@NovicedeMacBook-Pro.local>
Co-authored-by: Novice Lee <novicelee@NoviPro.local>
2024-12-11 14:22:42 +08:00
zxhlyh
bec5451f12 feat: workflow continue on error (#11474) 2024-12-11 14:21:38 +08:00
Yi Xiao
86dfdcb8ec chore: update thai lang in app page (#11541) 2024-12-11 12:08:09 +08:00
Tommy
42d986b96d [Pixtral] Add new model ; add vision (#11231) 2024-12-11 10:14:16 +08:00
zkyTech
fbc4ca980c fix: Remove duplicate 'response_format' parameter from model YAML files (#11531)
Co-authored-by: zhangkunyuan <zhangkunyuan@cmhi.chinamobile.com>
2024-12-11 10:10:53 +08:00
Paul van Oorschot
80c52e0ea4 feat: Add llama-3.3 models for Groq (#11533) 2024-12-11 09:59:46 +08:00
yihong
50b76dd5a2 fix: better error message for url add external knowledge (#11537)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-12-11 09:55:48 +08:00
yihong
225fcd5e41 Revert "fix: total tokens is wrong which is zero in inter way, close … (#11536) 2024-12-11 09:54:46 +08:00
yihong
afffd345bc fix: can not start local by REMOTE_SETTINGS_SOURCE_NAME change it to … (#11535)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-12-11 09:35:25 +08:00
yihong
716576043d fix: issue 11247 where Completion mode content may be list or str (#11504)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-12-10 23:22:14 +08:00
Wei Mingzhi
28231d39a4 Remove the processing of single quote when testing API tools. (#11390) 2024-12-10 19:53:38 +08:00
非法操作
9e23c3d625 chore: LOCAL_FILE also try to use remote_url as Prompt message (#11443) 2024-12-10 10:56:49 +08:00
Charlie.Wei
bdd5869244 Msg file preview (#11466)
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-12-10 10:53:37 +08:00
barabicu
fc1415d705 chore: fix typo in Japanese localization (#11502) 2024-12-10 09:29:16 +08:00
문정현
8218f62478 chore: fix translation typo in ko-KR localization (#11509) 2024-12-10 09:09:26 +08:00
-LAN-
fd354d999d fix(app_generator_service): overload type hints (#11507)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-10 09:06:34 +08:00
orangeclk
ec00b25793 feat: add siliconflow qwq and llama3.3 model (#11492) 2024-12-10 08:49:45 +08:00
huanshare
967b7d89e3 feat: add apollo configuration to load env file (#11210)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: huanshare <liuhuan101@longfor.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-12-10 02:51:20 +08:00
Yingchun Lai
32f8439143 fix: add the missing abab6.5t-chat model of Minimax (#11484) 2024-12-09 17:59:20 +08:00
-LAN-
0ff8bd2aa9 chore: bump version to 0.13.2 (#11489)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-09 17:57:23 +08:00
Hash Brown
2866383228 fix: cannot close notification manually (#11490) 2024-12-09 17:55:06 +08:00
Jyong
00ac7edeb3 improve message clean logic (#11487) 2024-12-09 16:12:30 +08:00
-LAN-
537068cfde refactor(iteration_node): use Sequence and Mapping in parameters (#11483)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-09 15:41:20 +08:00
suzuki.sh
c3c6a48059 Fix the token count at the iteration node (#11235)
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-12-09 15:02:04 +08:00
zhaobingshuang
5c166b3f40 fix: tags could not be saved when the Workflow Tool was created (#11481)
Co-authored-by: zhaobs <zhaobs@cailian.net>
2024-12-09 14:38:02 +08:00
kurokobo
230fa3286b feat: add 'Open in Explore' link for each apps on studio (#11402) 2024-12-09 12:04:03 +08:00
Muneyuki Noguchi
061c0b10fd Fix the Japanese translation for 'Detail' (#11476) 2024-12-09 11:18:28 +08:00
Yi Xiao
32f8a98cf8 feat: ifelse condition variable editable after selection (#11431) 2024-12-09 11:06:47 +08:00
Charlie.Wei
6c60ecb237 Refactor: Remove redundant style and simplify Mermaid component (#11472) 2024-12-09 09:47:58 +08:00
xiandan-erizo
c3fae5e801 Update ext_redis.py (#11214) 2024-12-09 09:35:52 +08:00
VoidIsVoid
a594e256ae remove mermaid render cache (#11470)
Co-authored-by: Gimling <huangjl@ruyi.ai>
2024-12-09 09:33:18 +08:00
Trey Dong
41d90c2408 fix(api): throw error when notion block cannot be found (#11433) 2024-12-09 09:10:59 +08:00
yihong
7ff42b1b7a fix: unit tests env needs clearing too (#11445)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-12-09 09:04:11 +08:00
Kazuki Takamatsu
4d7cfd0de5 Fix model provider of vertex ai (#11437) 2024-12-08 08:44:49 +08:00
Hash Brown
266d32bd77 fix: cannot upload animated webp image as app icon (#11453) 2024-12-08 08:37:21 +08:00
非法操作
7e1184c071 feat: support json_schema for ollama models (#11449) 2024-12-08 08:36:12 +08:00
非法操作
1ce51e57ab feat: add gemini exp 1206 (#11444) 2024-12-07 22:28:10 +08:00
非法操作
142b4fd699 feat: add zhipu glm_4v_flash (#11440) 2024-12-07 22:27:57 +08:00
Hash Brown
cc8feaa483 style: EmojiPicker component top padding (#11452) 2024-12-07 22:26:28 +08:00
yihong
d9d5d35a77 fix: issue #10596 by making the iteration node outputs right (#11394)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-12-07 16:28:15 +08:00
Huỳnh Gia Bôi
9277156b6c fix(document_extractor): pptx file type and missing metadata_filename UnstructuredIO (#11364)
Co-authored-by: Julian Huynh <julian.huynh@immersio.io>
2024-12-06 18:55:59 +08:00
KVOJJJin
1490a19fa1 Fix: compatible with outputs data structure (#11432) 2024-12-06 17:35:35 +08:00
Jyong
9b7adcd4d9 update tidb batch get endpoint to basic mode (#11426) 2024-12-06 17:06:46 +08:00
Jyong
a8d32f9964 fix external retrieval without segment id (#11423) 2024-12-06 14:45:15 +08:00
shirochan
5093337de1 FEAT: cohere rerank 3.5 model added (#11289) 2024-12-06 09:58:55 +08:00
Matsuda
f54225568c fix(model_runtime): add vision to Amazon Nova Lite and Pro (#11398) 2024-12-06 09:15:32 +08:00
crazywoola
255ff446ba use md table syntax in pr template (#11412) 2024-12-06 09:14:15 +08:00
kurokobo
9a0dc4bfdc fix: add elkjs (#11404) 2024-12-06 09:00:48 +08:00
huayaoyue6
9d975750bc fix: update DocumentIsPausedError (#11405) 2024-12-06 08:59:23 +08:00
Charlie.Wei
7c979e6490 Update mermaid (#11356)
Co-authored-by: luowei <glpat-EjySCyNjWiLqAED-YmwM>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-12-05 20:43:42 +08:00
github-actions[bot]
d60ca1661c chore: translate i18n files (#11389)
Co-authored-by: JzoNgKVO <27049666+JzoNgKVO@users.noreply.github.com>
2024-12-05 17:55:44 +08:00
eux
bb62391a4c fix: broken link to knowledge base guide (#11387) 2024-12-05 17:47:11 +08:00
KVOJJJin
0b25c0b677 Fix: support file download in workflow result (#11338) 2024-12-05 16:58:39 +08:00
-LAN-
a5d6082418 chore: bump version to 0.13.1 (#11382)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-05 15:11:55 +08:00
Warren Chen
631cbcd781 [fix] rename yaml files to fit windows (#11379) 2024-12-05 14:38:12 +08:00
Yi Xiao
20c4633d2a fix: empty object (conversation variable) editable (#11352) 2024-12-05 13:59:59 +08:00
yihong
5669cac16d fix: some typos using typos (#11374)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-12-05 13:24:06 +08:00
Yi Xiao
6180762160 fix: bg typo in variable aggregator node (#11376) 2024-12-05 11:46:12 +08:00
Warren Chen
376726cf90 [feat] Add AWS Bedrock rerank (#11349)
Co-authored-by: crazywoola <427733928@qq.com>
2024-12-05 11:31:43 +08:00
Joel
284bb7ac71 fix: ref attribute in markdown causes page crash (#11369)
Co-authored-by: crazywoola <427733928@qq.com>
2024-12-05 10:15:21 +08:00
Akira Noda
eca466bdaa chore: fix typo (#11359) 2024-12-05 09:04:30 +08:00
crazywoola
d56abec195 Revert "Fix: iteration not in main thread pool" (#11358) 2024-12-04 21:22:22 +08:00
yihong
961e25f608 fix: better bedrock message handler close #10976 (#11317)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-12-04 19:46:40 +08:00
github-actions[bot]
138bf698b0 chore: translate i18n files (#11353)
Co-authored-by: douxc <7553076+douxc@users.noreply.github.com>
2024-12-04 19:24:03 +08:00
NFish
e5bb4cca12 fix: Correct category of 'Workflow' used in Explore Apps. (#11351) 2024-12-04 18:19:12 +08:00
AkaraChen
5e2cb0e3a8 feat: add base skeleton component (#11339) 2024-12-04 17:34:55 +08:00
Hash Brown
16a65cb367 fix: cannot send message when debug with multiple model with conversa… (#11333) 2024-12-04 16:17:11 +08:00
ybalbert001
1bae9b8ff7 update pricing for bedrock nova LLM models (#11336)
Co-authored-by: Yuanbo Li <ybalbert@amazon.com>
2024-12-04 16:16:41 +08:00
Jyong
d7c1f43b49 fix tidb full-text-search vector missed (#11337) 2024-12-04 16:13:23 +08:00
Yi Xiao
f933af9f57 fix: check valid for number variable (#11334) 2024-12-04 15:46:54 +08:00
非法操作
91e1ff5e30 chore: improve zhipu LLM (#11321) 2024-12-04 15:14:30 +08:00
ybalbert001
5908e10549 integrate amazon nova llms to dify (#11324)
Co-authored-by: Yuanbo Li <ybalbert@amazon.com>
2024-12-04 15:13:08 +08:00
-LAN-
464e6354c5 feat: correct the prompt grammar. (#11328)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-04 15:12:47 +08:00
非法操作
d470e55f8c fix: http node download file always image type (#11319) 2024-12-04 12:15:26 +08:00
zxhlyh
98a1b01b0c fix: file download in chat (#11322) 2024-12-04 11:10:56 +08:00
Joel
e240424be5 fix: number variable can not input constant type value in tool config form (#11320) 2024-12-04 10:46:03 +08:00
DDDDD12138
1cb5a12abb fix: resolve scrolling issue in workflow-log table (#11302) 2024-12-03 21:29:42 +08:00
KVOJJJin
ff2a4a6fcd Fix: model params in logs (#11298) 2024-12-03 21:17:55 +08:00
Jyong
c58d2fce89 roll back rerank topn setting (#11297) 2024-12-03 17:34:56 +08:00
-LAN-
7a962b9f03 chore: bump version to 0.13.0 (#11284)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-03 16:01:12 +08:00
Joel
a679079a1d fix: auto translate fail (#11286) 2024-12-03 14:21:59 +08:00
yihong
e39e776d03 fix: better wenxin rerank handler, close #11252 (#11283)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-12-03 13:57:16 +08:00
Yi Xiao
e135ffc2c1 Feat: upgrade variable assigner (#11285)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-12-03 13:56:40 +08:00
Bowen Liang
e79eac688a chore(lint): sort __all__ definitions (#11243) 2024-12-03 13:26:33 +08:00
-LAN-
643a90c48d fix: use removeprefix() instead of lstrip() to remove the data: prefix (#11272)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-03 09:16:25 +08:00
Novice
2a448a899d Fix: iteration not in main thread pool (#11271)
Co-authored-by: Novice Lee <novicelee@NovicedeMacBook-Pro.local>
2024-12-03 09:16:03 +08:00
yihong
7b86f8f024 fix: double split error on redis port and some type hint (#11270)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-12-03 09:15:51 +08:00
yihong
e686f12317 fix: better handle error (#11265)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-12-03 09:15:38 +08:00
kurokobo
a86f1eca79 docs: add api docs for /v1/info (#11269) 2024-12-03 09:14:13 +08:00
dependabot[bot]
668c1c0792 chore(deps): bump cross-spawn from 7.0.3 to 7.0.6 in /web (#11262)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-02 17:30:52 +08:00
Hash Brown
c4fad66f2a fix: dialogue_count incorrect in chatflow when there's... (#11175) 2024-12-02 16:09:26 +08:00
yihong
02572e8cca fix: claude can not handle empty string (#11238)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-12-02 16:00:40 +08:00
Hiroshi Fujita
1d8385f7ac Sync INDEXING_MAX_SEGMENTATION_TOKENS_LENGTH between API and Web (#11230) 2024-12-02 15:29:25 +08:00
-LAN-
f8c966c39c fix(workflow_tool): Rename stream to streaming (#11258)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-02 15:00:26 +08:00
-LAN-
3c8efe7c0a fix(workflow_cycle_manage): Handle special values in the process_data. (#11253)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-02 13:53:43 +08:00
Garfield Dai
dbc10e0feb fix: license str parser. (#11248) 2024-12-02 11:38:18 +08:00
yihong
239bf97b47 fix: nvidia special embedding model payload close #11193 (#11239)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-12-02 10:25:15 +08:00
Hiroshi Fujita
858db2f239 feat(api): include tags in app information response (#11242) 2024-12-02 10:25:01 +08:00
-LAN-
c34bdb74e6 Fix/type-error (#11240)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-02 10:24:21 +08:00
-LAN-
9601102885 fix(word_extractor): Fix type error and remove stream in ssrf_proxy (#11241)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-02 10:24:03 +08:00
kazuya-awano
56c2d1cc55 feat: add pagination support for Notion search (#11194) 2024-12-01 21:49:34 +08:00
Bowen Liang
a67b0d4771 chore(lint): extract ruff configs into .ruff.toml file keeping pyproject.toml clean (#11222) 2024-12-01 12:51:28 +08:00
-LAN-
ef204817ae chore(api/Dockerfile): Bump perl to 0.40.0-8 (#11234)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-12-01 09:39:02 +08:00
Hiroshi Fujita
9bc5bc2548 feat: Increase the number of Opening Questions in the Conversation Opener (#11233) 2024-12-01 09:38:45 +08:00
yihong
fd4be36991 fix: total tokens is wrong which is zero in inter way, close #11221 (#11224)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-11-30 23:18:24 +08:00
Bowen Liang
9b46b02717 refactor: assembling the app features in modular way (#9129)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-11-30 23:05:22 +08:00
非法操作
3bc4dc58d7 fix: search model not work as expected (#11225) 2024-11-30 17:31:15 +08:00
Shota Totsuka
594666eb61 fix: use Gemini response metadata for token counting (#11226) 2024-11-30 17:30:55 +08:00
朱晓兵
e80f41a701 fix: support setting variables in url (#10676) 2024-11-30 11:15:17 +08:00
Cling_o3
f9c2aa7689 feat: add retireval_top_n to config in env (#11132) 2024-11-30 11:14:45 +08:00
fengjiajie
9dd4bf5574 fix: Correct inputs field type in API documentation (#11198) 2024-11-30 11:13:32 +08:00
yihong
5a9b785773 fix: excel in node only reads one sheet, close #9661 (#11215)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-11-30 11:11:08 +08:00
catusax
d96a28487a fix: 'validation error for ToolInvokeMessage' when blob_message meta is None (#11212) 2024-11-29 17:35:13 +08:00
-LAN-
0554898b5d fix(file_factory): Remove transfer_method validation (#11207)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-11-29 17:26:31 +08:00
liujiamingtiny
6f9ce6a199 fix: azure gpt-4o-2024-08-06 can't process content = "" when json schema is enabled (#11204)
Co-authored-by: jiaming.liu <jiaming.liu@zkh.com>
2024-11-29 17:26:07 +08:00
Yi Xiao
e3119112a6 chore: add Thai GUI (#11201) 2024-11-29 14:20:48 +08:00
非法操作
d3af0e9090 fix: handleLoadFileFromLink's transfer method incorrect (#11197) 2024-11-29 09:37:50 +08:00
Bowen Liang
2feb44e2c5 chore(dep): bump flask from 3.0.1 to 3.1.0 and flask-compress to 1.17 (#11195) 2024-11-29 09:28:53 +08:00
ybalbert001
cc0b92bc75 Update aws tools (#11174)
Co-authored-by: Yuanbo Li <ybalbert@amazon.com>
2024-11-29 09:28:28 +08:00
非法操作
e576d32fb6 chore: improve conversation list and rename docs (#11187) 2024-11-29 09:22:08 +08:00
kazuya-awano
2d6865d421 Ensure consistent float type for cached embedding return values (#10185) 2024-11-29 09:18:41 +08:00
Ethan
0f1133729f feat: introduce a new environment variable that is supposed to disable Scarf analytics (#11179) 2024-11-28 15:21:04 +08:00
yihong
d7160ee563 fix: typo in upstashVector that made the id check always true, also fix some type hints (#11183)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-11-28 14:05:25 +08:00
github-actions[bot]
18add94a31 chore: translate i18n files (#11182)
Co-authored-by: JzoNgKVO <27049666+JzoNgKVO@users.noreply.github.com>
2024-11-28 13:21:04 +08:00
KVOJJJin
18d3ffc194 Feat: new pagination (#11170) 2024-11-28 12:26:02 +08:00
NFish
0a30a5b077 Feat: remove github star and community links if it is enterprise version (#11180) 2024-11-28 11:02:25 +08:00
jiangbo721
9049dd7725 fix: code linting (#11143)
Co-authored-by: 刘江波 <jiangbo721@163.com>
2024-11-27 23:44:51 +08:00
Jinzhou Zhang
6f418da388 Fixes #11065: tenant_id not found when login via ADMIN_KEY (#11066) 2024-11-27 19:50:56 +08:00
Jyong
41c6bf5fe4 update the scheduler of update_tidb_serverless_status_task to 1/10min (#11135) 2024-11-27 17:41:00 +08:00
Kevin Zhao
33d6d26bbf Adding AWS CDK deploy link in README in multi-language (#11166) 2024-11-27 17:40:40 +08:00
-LAN-
787285d58f fix(file_factory): convert tool file correctly. (#11167)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-11-27 17:28:01 +08:00
yihong
40fc6f529e fix: gitee ai wrong default model, and better para (#11168)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-11-27 17:27:11 +08:00
Novice
baef18cedd fix: Incorrect iteration log display in workflow with multiple parallel mode iteration nodes (#11158)
Co-authored-by: Novice Lee <novicelee@NovicedeMacBook-Pro.local>
2024-11-27 13:42:28 +08:00
Hiroshi Fujita
a918cea2fe feat: add VTT file support to Document Extractor (#11148) 2024-11-27 11:42:42 +08:00
-LAN-
9789905a1f chore(*): Removes debugging print statements (#11145)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-11-26 22:03:19 +08:00
Charlie.Wei
f458580dee fix parameter extractor function call Expected str (#11142) 2024-11-26 21:46:56 +08:00
-LAN-
223a30401c fix: LLM invoke error should not be raised (#11141)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-11-26 20:56:48 +08:00
yihong
2927493cf3 fix: better way to handle github dsl url close #11113 (#11125)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-11-26 19:39:55 +08:00
Joel
79db920fa7 fix: enabling memory after it was disabled does not pass the user query (#11136) 2024-11-26 17:55:11 +08:00
NFish
b3d65cc7df Feat: Divider component now supports gradient background (#11130) 2024-11-26 17:44:56 +08:00
-LAN-
208d6d6d94 chore: bump to 0.12.1 (#11122) 2024-11-26 15:46:17 +08:00
Tao Wang
aa135a3780 Add TTS to OpenAI_API_Compatible (#11071) 2024-11-26 15:14:02 +08:00
-LAN-
044e7b63c2 fix(llm_node): Ignore file if not supported. (#11114) 2024-11-26 14:14:14 +08:00
-LAN-
5b7b328193 feat: Allow files in the system prompt even if the model does not support them. (#11111) 2024-11-26 13:45:49 +08:00
-LAN-
8d5a1be227 fix: Cannot use files in the user inputs. (#11112) 2024-11-26 13:43:38 +08:00
非法操作
90d5765fb6 fix: app copy raise error (#11108) 2024-11-26 13:42:13 +08:00
-LAN-
1db14793fa fix(anthropic_llm): Ignore non-text parts in the system prompt. (#11107) 2024-11-26 13:31:40 +08:00
-LAN-
cbb4e95928 fix(llm_node): Ignore user query when memory is disabled. (#11106) 2024-11-26 13:07:32 +08:00
-LAN-
20c091a5e7 fix: user query be ignored if query_prompt_template is an empty string (#11103) 2024-11-26 12:47:59 +08:00
NFish
e9c098d024 Fix regenerate themes (#11101) 2024-11-26 11:33:04 +08:00
horochx
9f75970347 fix: ops_trace_manager from_end_user_id (#11077) 2024-11-26 10:29:00 +08:00
非法操作
f1366e8e19 fix #11091 raise redirect issue (#11092) 2024-11-26 10:25:42 +08:00
Hash Brown
0f85e3557b fix: site icon not showing (#11094) 2024-11-26 10:23:03 +08:00
SebastjanPrachovskij
17ee731546 SearchApi - Return error message instead of raising a ValueError (#11083) 2024-11-26 09:34:51 +08:00
Tao Wang
af2461cccc Add query_prefix + Return TED Transcript URL for Downstream Scraping Tasks (#11090) 2024-11-26 09:32:37 +08:00
非法操作
60c1549771 fix: import Explore Apps raise error (#11091) 2024-11-26 09:32:08 +08:00
fengjiajie
ab6dcf7032 fix: update the max tokens configuration for Azure GPT-4o (2024-08-06) to 16384 (#11074) 2024-11-25 21:13:02 +08:00
yihong
8aae235a71 fix: int None will cause error for context size (#11055)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-11-25 21:04:16 +08:00
-LAN-
c032574491 fix: timezone not imported in conversation service. (#11076) 2024-11-25 20:53:55 +08:00
Tao Wang
1065917872 Add grok-vision-beta to xAI + Update grok-beta Features (#11004) 2024-11-25 20:53:03 +08:00
非法操作
56e361ac44 fix: chart tool chinese font display and raise error (#11058) 2024-11-25 19:50:33 +08:00
yihong
2e00829b1e fix: drop useless and wrong code for zhipu embedding (#11069)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-11-25 19:50:23 +08:00
-LAN-
625aaceb00 chore: bump version to 0.12.0 (#11056) 2024-11-25 19:17:59 +08:00
-LAN-
98d85e6b74 fix: WorkflowNodeExecution.created_at may be earlier than WorkflowRun.created_at (#11070) 2024-11-25 18:16:55 +08:00
Pedro Gomes
319d49084b fix: ignore empty outputs in Tool node (#10988) 2024-11-25 18:00:42 +08:00
Joel
eb542067af feat: add cookie management (#11061) 2024-11-25 16:31:49 +08:00
yihong
04b9a2c605 fix: better path trigger for vdb and fix the version (#11057)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-11-25 13:50:03 +08:00
KVOJJJin
8028e75fbb Improvement: update api doc of workflow (#11054) 2024-11-25 12:48:36 +08:00
-LAN-
3eb51d85da fix(workflow_entry): Support receive File and FileList in single step run. (#10947)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: JzoNg <jzongcode@gmail.com>
2024-11-25 12:46:50 +08:00
nomi3
79a35c2fe6 feat(i18n): update Japanese translation for login page (#10993) 2024-11-25 12:02:56 +08:00
Joel
2dd4c34423 fix: llm node do not pass sys.query in chatflow app init (#11053) 2024-11-25 12:01:57 +08:00
Kalo Chin
684f6b2299 fix: slidespeak text output is not the download link (#10997) 2024-11-25 11:28:52 +08:00
yihong
b791a80b75 chore: update chromadb version to 0.5.20 (#11038)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-11-25 11:14:04 +08:00
Jiang
13006f94e2 fix the wrong LINDORM_PASSWORD variable name in docker-compose.yaml (#11052)
Co-authored-by: jiangzhijie <jiangzhijie.jzj@alibaba-inc.com>
2024-11-25 11:13:06 +08:00
Dr.MerdanBay
41772c325f Feat/add admin check (#11050) 2024-11-25 11:11:00 +08:00
SiliconFlow, Inc
a4fc057a1c ISSUE=11042: add tts model in siliconflow (#11043) 2024-11-25 11:04:13 +08:00
Tao Wang
aae29e72ae Fix Deepseek Function/Tool Calling (#11023) 2024-11-25 11:03:53 +08:00
cyflhn
87c831e5dd make tool parameter parsing compatible with the glm4 model response in the xinference provider when function tool calling is integrated (#11049) 2024-11-25 11:02:58 +08:00
Matsuda
40a5f1c80a fix: wrong param name (#11039) 2024-11-25 11:02:45 +08:00
-LAN-
04f1e18342 fix: Validate file only when file type is set to custom (#11036)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-11-24 21:10:01 +08:00
TakakiMoriguchi
365a40d11f fix: Japanese typo (#11034) 2024-11-24 21:09:30 +08:00
-LAN-
60b5dac3ab fix: query will be None if the query_prompt_template not exists (#11031)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-11-24 21:06:51 +08:00
-LAN-
8565c18e84 feat(file_factory): Standardize custom file type into known types (#11028)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-11-24 15:29:43 +08:00
cyflhn
03ba4bc760 fix error with xinference tool calling with qwen2-instruct and add timeout retry settings for xinference (#11012)
Co-authored-by: crazywoola <427733928@qq.com>
2024-11-24 15:29:30 +08:00
litterGuy
ae3a2cb272 fix: json parse err when http node send request (#11001) 2024-11-24 14:19:48 +08:00
Bowen Liang
6c8e208ef3 chore: bump minimum supported Python version to 3.11 (#10386) 2024-11-24 13:28:46 +08:00
yihong
0181f1c08c fix: wrong convert in PromptTemplateConfigManager (#11016)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-11-24 12:18:19 +08:00
yihong
7f00c5a02e fix: uuid not import bug (#11014)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-11-24 11:17:55 +08:00
johnpccd
d0648e27e2 Fix typo (#11024) 2024-11-24 11:15:46 +08:00
Hiroshi Fujita
31348af2e3 doc: Updated Python version requirements to match English version (#11015) 2024-11-24 11:15:24 +08:00
kenwoodjw
096c0ad564 feat: Add support for TEI API key authentication (#11006)
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
Co-authored-by: crazywoola <427733928@qq.com>
2024-11-23 23:55:35 +08:00
Kazuhisa Wada
16c41585e1 Fixing #11005: Incorrect max_tokens in yaml file for AWS Bedrock US Cross Region Inference version of 3.5 Sonnet v2 and 3.5 Haiku (#11013) 2024-11-23 23:46:25 +08:00
AkisAya
566ab9261d fix: gitlab file url not correctly encoded (#10996) 2024-11-23 23:44:17 +08:00
Hiroshi Fujita
1cdadfdece chore(devcontainer): upgrade Python version to 3.12 in Dockerfile and configuration (#11017) 2024-11-23 23:40:09 +08:00
yihong
448a19bf54 fix: fish audio wrong validate credentials interface (#11019)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-11-23 23:39:41 +08:00
Bowen Liang
d3051eed48 chore (dep): bump gevent from v23 to v24 for better support for Python 3.11 and 3.12 (#10387) 2024-11-23 00:07:07 +08:00
yihong
ed55de888a fix: rules should not be None for in (#10977)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-11-22 23:04:20 +08:00
-LAN-
da601f0bef chore: update base image to Python 3.12 in Dockerfile (#10358) 2024-11-22 19:43:19 +08:00
非法操作
08ac36812b feat: support LLM process document file (#10966)
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-11-22 19:32:44 +08:00
-LAN-
556de444e8 chore(app_dsl_service): Downgrade DSL Version (#10979) 2024-11-22 16:36:16 +08:00
crazywoola
3750200c5e feat: add a meta(mac) ctrl(windows) key (#10978) 2024-11-22 16:30:34 +08:00
-LAN-
c5f7d650b5 feat: Allow using file variables directly in the LLM node and support more file types. (#10679)
Co-authored-by: Joel <iamjoel007@gmail.com>
2024-11-22 16:30:22 +08:00
-LAN-
535c72cad7 fix(model): make sure AppModelConfig.model_dict returns a dict. (#10972) 2024-11-22 15:48:50 +08:00
NFish
8a83edc1b5 Feat: update icon and Divider components (#10975) 2024-11-22 15:44:42 +08:00
github-actions[bot]
5b415a6227 chore: translate i18n files (#10970)
Co-authored-by: laipz8200 <16485841+laipz8200@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-11-22 15:24:11 +08:00
-LAN-
5172f0bf39 feat: Check and compare the DSL version before importing an app (#10969)
Co-authored-by: Yi <yxiaoisme@gmail.com>
2024-11-22 15:05:04 +08:00
CXwudi
d9579f418d chore: Added the new gemini exp-1121 and learnlm-1.5 models (#10963) 2024-11-22 13:14:20 +08:00
Wu Tianwei
3579bbd1c4 refactor: Split linear-gradient and color (#10961) 2024-11-22 10:55:42 +08:00
Kalo Chin
817b85001f feat: slidespeak slides generation (#10955) 2024-11-22 10:30:21 +08:00
Agung Besti
e8868a7fb9 feat: add gpt-4o-2024-11-20 (#10951)
Co-authored-by: akubesti <agung.besti@insignia.co.id>
2024-11-22 10:29:20 +08:00
Kalo Chin
2cd9ac60f1 fix: unstructured io credential environment variables missing (#10953) 2024-11-22 10:15:17 +08:00
yihong
464f384cea fix: tiny lora bug found by mypy (#10959)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-11-22 10:01:44 +08:00
非法操作
8b16f07eb0 feat: add cURL import for http request node (#8656) 2024-11-21 22:25:18 +08:00
marvin-season
fefda40acf fix: bugs in frontend workflow panel operator (#10945)
Co-authored-by: marvin <sea-son@foxmail.com>
2024-11-21 19:07:02 +08:00
Xu Song
8c2f62fb92 Feat: support json output for bing-search (#10904) 2024-11-21 18:32:54 +08:00
LastHopeOfGPNU
1a6b961b5f Resolve #8475: support rerank model from infinity (#10939)
Co-authored-by: linyanxu <linyanxu2@qq.com>
2024-11-21 18:03:49 +08:00
cooper.wu
01014a6a84 fix: external dataset missing score_threshold_enabled (#10943) 2024-11-21 18:01:47 +08:00
AkisAya
cb0c55daa7 fix weight rerank of knowledge retrieval (#10931) 2024-11-21 17:53:20 +08:00
-LAN-
82575a7aea fix(gpt-4o-audio-preview): Remove the vision feature (#10932) 2024-11-21 16:42:48 +08:00
yihong
80da0c5830 fix: default max_chunks set to 1 as other providers (#10937)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-11-21 16:36:05 +08:00
Pedro Gomes
83b6abf4ad Update parse.py to handle empty list result (#10915)
Co-authored-by: crazywoola <427733928@qq.com>
2024-11-21 14:14:07 +08:00
Hash Brown
ea0ebc020c fix: chat history might be empty in log detail view (#10905) 2024-11-21 14:12:01 +08:00
Kota-Yamaguchi
f358db9f02 feat: Add Japanese translations for API documentation: chat, advanced-chat, completion, and workflow (#10927) 2024-11-21 14:02:46 +08:00
wy96f
94c9cadbd8 fix image files not deleted on indexing_estimate #9541 (#10798)
Co-authored-by: root <root@localhost.localdomain>
2024-11-21 13:03:16 +08:00
Steven sun
2ae6460f46 Add googlenews tools from rapidapi (#10877)
Co-authored-by: steven <sunzwj@digitalchina.com>
2024-11-21 10:39:49 +08:00
yihong
0067b16d1e fix: refactor all 'or []' and 'or {}' logic to make code more clear (#10883)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-11-21 10:34:43 +08:00
yihong
ec9f6220c9 doc: improve the api development doc, dropping a dead hint (#10906)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-11-21 10:34:23 +08:00
鬼頭拓海
af53e2b6b0 Fix: Add a process to fetch the mime type from the file name for signed url in remote_url #10872 version2 (#10908) 2024-11-20 22:57:49 +08:00
shisaru292
b42b333a72 fix: handle redis authentication for healthcheck command (#10907) 2024-11-20 20:10:51 +08:00
方程
99b0369f1b Gitee AI embedding tool (#10903) 2024-11-20 17:40:34 +08:00
llinvokerl
d6ea1e2f12 fix: explicitly use new token when retrying ssePost after refresh (#10864)
Co-authored-by: liusurong.lsr <liusurong.lsr@alibaba-inc.com>
2024-11-20 16:11:33 +08:00
-LAN-
4d6b45427c Support streaming output for OpenAI o1-preview and o1-mini (#10890) 2024-11-20 15:10:41 +08:00
-LAN-
1be8365684 Fix/input-value-type-in-moderation (#10893) 2024-11-20 15:10:12 +08:00
ybalbert001
c3d11c8ff6 fix: aws presigned url is not a workable remote url (#10884)
Co-authored-by: Yuanbo Li <ybalbert@amazon.com>
2024-11-20 14:24:41 +08:00
liuhaoran
8ff65abbc6 ext_redis.py support redis clusters --- Fixes #9538 (#9789)
Signed-off-by: root <root@localhost.localdomain>
Co-authored-by: root <root@localhost.localdomain>
Co-authored-by: Bowen Liang <bowenliang@apache.org>
2024-11-20 13:44:35 +08:00
非法操作
bf4b6e5f80 feat: support custom tool upload file (#10796) 2024-11-20 13:26:42 +08:00
-LAN-
25fda7adc5 fix(http_request): allow content type application/x-javascript (#10862) 2024-11-20 12:55:06 +08:00
非法操作
f3af7b5f35 fix: tool's file input display string (#10887) 2024-11-20 12:54:24 +08:00
Muntaser Abuzaid
33cfc56ad0 fix: update email validation regex to allow periods in local part (#10868) 2024-11-20 12:33:02 +08:00
鬼頭拓海
464cc26ccf Fix: Add a process to fetch the mime type from the file name for signed url in remote_url (#10872) 2024-11-20 12:30:25 +08:00
Jason Tan
d18754afdd feat: admin can also change member role (#10651) 2024-11-20 11:29:49 +08:00
非法操作
beb7953d38 feat: enhance the custom note (#8885) 2024-11-20 11:24:45 +08:00
GeorgeCaoJ
fbfc811a44 feat: support function call for ollama block chat api (#10784) 2024-11-20 11:15:19 +08:00
非法操作
7e66e5a713 feat: make toc panel can collapse (#10875) 2024-11-20 10:07:30 +08:00
kurokobo
07b5bbae06 feat: add a minimal separator between pinned apps and unpinned apps in the explore page (#10871) 2024-11-20 09:32:59 +08:00
Ding Jiatong
3087913b74 Fix the situation where output_tokens/input_tokens may be None in response.usage (#10728) 2024-11-19 21:19:13 +08:00
非法操作
904ea05bf6 fix: download some remote files raise error (#10781) 2024-11-19 21:18:53 +08:00
Rhys
6f4885d86d Encode invitee email in the invitation link (#10842) 2024-11-19 21:08:37 +08:00
Joe
2dc29cfee3 Feat/add langsmith dotted order (#10856) 2024-11-19 21:08:23 +08:00
Jyong
bd05df5cc5 fix tongyi embedding endpoint returning None output (#10857) 2024-11-19 21:04:17 +08:00
Jyong
ee1f14621a fix httpx doesn't support stream parameter (#10859) 2024-11-19 21:03:01 +08:00
yihong
58a9d9eb9a fix: better WeightRerankRunner run logic use O(1) and delete unused code (#10849)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-11-19 20:12:13 +08:00
非法操作
bc1013dacf feat: support json schema for gemini models (#10835) 2024-11-19 17:49:58 +08:00
Tao Wang
9f195df103 Support Video Proxy and TED Embedding (#10819) 2024-11-19 17:49:14 +08:00
AkaraChen
1cc7dc6360 style: refactor fetch and context (#10795) 2024-11-19 17:16:06 +08:00
KVOJJJin
328965ed7c Fix: crash of workflow file upload (#10831)
Co-authored-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: StyleZhang <jasonapring2015@outlook.com>
2024-11-19 14:15:18 +08:00
zxhlyh
133de9a087 fix: upload file component support multiple (#10817) 2024-11-19 14:00:54 +08:00
kimjion
7261384655 fix: close child modal on log drawer close (#10839) 2024-11-19 12:09:55 +08:00
dajianguo
4718071cbb feat: knowledge base api get/post method text error #10836 (#10837) 2024-11-19 12:08:10 +08:00
非法操作
22be0816aa feat: add TOC to app develop doc (#10799) 2024-11-19 09:06:12 +08:00
孙茂胤 (Sun, Maoyin)
49e88322de doc: add clarification for length limit of init password (#10824) 2024-11-19 09:05:05 +08:00
Zane
14f3d44c37 refactor: improve handling of leading punctuation removal (#10761) 2024-11-18 21:32:33 +08:00
孙茂胤 (Sun, Maoyin)
0ba17ec116 fix: correct typo in ETL type comment in .env.example (#10822) 2024-11-18 20:58:43 +08:00
Benjamin
79d59c004b chore: update .gitignore to include mise.toml (#10778) 2024-11-18 19:35:12 +08:00
8bitpd
873e9720e9 feat: AnalyticDB vector store supports invocation via SQL. (#10802)
Co-authored-by: 璟义 <yangshangpo.ysp@alibaba-inc.com>
2024-11-18 19:29:54 +08:00
zxhlyh
de6d3e493c fix: script rendering in message (#10807)
Co-authored-by: crazywoola <427733928@qq.com>
2024-11-18 19:19:10 +08:00
-LAN-
7f1fdb774c chore: bump version to 0.11.2 (#10805) 2024-11-18 17:52:53 +08:00
Jyong
128efc3193 Feat/clean message records (#10588) 2024-11-18 16:57:39 +08:00
Garfield Dai
c49efc0c22 Feat/account not found (#10804) 2024-11-18 16:14:39 +08:00
KVOJJJin
3e2b8a8d02 Fix: legacy image upload compatible (#10803) 2024-11-18 15:57:48 +08:00
zxhlyh
9861279395 fix: upload custom file extension (#10801) 2024-11-18 15:57:32 +08:00
dajianguo
538a5df9d5 feat: Optimize usability during debugging #10641 (#10793) 2024-11-18 11:13:52 +08:00
Tao Wang
90d6ebc879 Add youtube-transcript-api as tool (#10772)
Co-authored-by: crazywoola <427733928@qq.com>
2024-11-18 10:58:16 +08:00
Kalo Chin
6de1f8c770 Feat(tools) add tavily extract tool and enhance tavily search implementation (#10786) 2024-11-18 09:51:34 +08:00
壮士
6d532bfc02 fix: Resolve the issue of Docker startup documents being queued all th… (#10791) 2024-11-18 09:49:33 +08:00
非法操作
ba537d657f feat: add gemini-exp-1114 (#10779) 2024-11-18 09:49:22 +08:00
Kalo Chin
305fbc7c92 fix: fal ai wizper also return text msg (#10789) 2024-11-18 09:45:59 +08:00
Tao Wang
29341d60aa Add DuckDuckGo Video Search and News Search (#10771) 2024-11-17 13:59:48 +08:00
呆萌闷油瓶
c170862de7 fix: custom file extension not supported (#10759)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-11-16 22:37:25 +08:00
Jason Tan
ca6efd73f3 fix: date filter key not unique (#10645) 2024-11-16 14:43:55 +08:00
github-actions[bot]
d05fee1182 chore: translate i18n files (#10754)
Co-authored-by: douxc <7553076+douxc@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: crazywoola <427733928@qq.com>
2024-11-15 19:31:15 +08:00
NFish
1f87676d52 Supports display license status (#10408)
Co-authored-by: Garfield Dai <dai.hai@foxmail.com>
2024-11-15 17:59:48 +08:00
Garfield Dai
c2ce2f88c7 feat: add license. (#10403) 2024-11-15 17:59:36 +08:00
crazywoola
2fed55ae6b Fix: number maybe empty string (#10743) 2024-11-15 16:31:10 +08:00
Bowen Liang
51db59622c chore(lint): cleanup repeated cause exception in logging.exception replaced by helpful message (#10425) 2024-11-15 15:41:40 +08:00
crazywoola
db1d2aaff5 Feat/add Slovensko (Slovenija) (#10731)
Co-authored-by: XHorizont.com <johnny@xhorizont.com>
2024-11-15 13:59:08 +08:00
Steven Lynn
4322fdc910 Feat/add reddit icon (#10733) 2024-11-15 13:55:46 +08:00
非法操作
2a5c5a4e15 fix: remove default model selection for audio tool (#10729) 2024-11-15 12:40:41 +08:00
非法操作
4b2abf8ac2 fix: create_blob_message of tool will always create image type file (#10701) 2024-11-15 10:38:12 +08:00
Bowen Liang
365cb4b368 chore(lint): bump ruff from 0.6.9 to 0.7.3 (#10714) 2024-11-15 09:19:41 +08:00
GeorgeCaoJ
c85bff235d fix(i18n): handle key naming error (#10713) 2024-11-15 09:01:38 +08:00
Kalo Chin
ad16180b1a feat(tool): fal ai wizper ASR built-in tool (#10716) 2024-11-15 09:01:07 +08:00
jarvis2f
5ff02b469f fix: position error when creating segments (#10706) 2024-11-14 21:25:15 +08:00
Bowen Liang
44f57ad9a8 chore: Bump Alpine Linux to 3.20 in web dockerfile (#10671) 2024-11-14 20:57:01 +08:00
yihong
94fd6f6901 fix: typo in test (#10707)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-11-14 20:54:13 +08:00
SiliconFlow, Inc
e61242a337 feat: add vlm models from siliconflow (#10704) 2024-11-14 20:53:35 +08:00
yihong
722964667f fix: non-UTF-8 code decode, close #10691 (#10698)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2024-11-14 17:29:49 +08:00
Xiao Ley
fbb9c1c249 fixed the Base URL usage issue in Podcast Generator tool verification (#10697) 2024-11-14 17:24:42 +08:00
非法操作
15f341b655 feat: add the audio tool (#10695) 2024-11-14 16:37:15 +08:00
crazywoola
b358490607 chore: update issue template (#10693) 2024-11-14 16:12:27 +08:00
crazywoola
f9e4196fd5 Update pull_request_template.md (#10692) 2024-11-14 15:56:37 +08:00
crazywoola
751525802d feat: update pr template (#10690) 2024-11-14 15:52:15 +08:00
lz
2abacd2a2d export configuration 'CODE_EXECUTION_TIMEOUT' to .env (#10688)
Co-authored-by: liuzhu <liuzhu@fridaycloud.com.cn>
2024-11-14 15:34:34 +08:00
Nam Vu
a3155e0613 Update expat version (#10686) 2024-11-14 15:30:55 +08:00
Jyong
70b9e4caf5 check dataset is none (#10682) 2024-11-14 14:07:19 +08:00
orangeclk
317ae9233e feat: add json response format for siliconflow models (#10657) 2024-11-14 08:58:22 +08:00
xiandan-erizo
5b8f03cd9d add abab7-chat-preview model (#10654)
Co-authored-by: xiandan-erizo <xiandan-erizo@outlook.com>
2024-11-13 19:30:42 +08:00
Kalo Chin
2a4783307a Feat(tool): fal ai flux image generation (#10606) 2024-11-13 17:41:58 +08:00
非法操作
bddecba9ed fix: mp3 file upload does not work (#10650) 2024-11-13 17:37:29 +08:00
jiangbo721
931e76e3d1 fix: remove unused queue generation (#10532)
Co-authored-by: 刘江波 <jiangbo721@163.com>
2024-11-13 15:52:52 +08:00
-LAN-
70c2ec8ed5 feat(variable-handling): enhance variable and segment conversion (#10483) 2024-11-12 21:51:09 +08:00
wakaka6
9c7edb9242 feat: add builtin tools for send email (#10493) 2024-11-12 21:48:36 +08:00
Benjamin
0867821ae7 fix: update conversation session naming and API path in documentation (#10589) 2024-11-12 21:44:04 +08:00
Jyong
0b2d51d859 add the index field for elasticsearch (#10592) 2024-11-12 21:43:16 +08:00
方程
ef8022f715 Gitee AI Qwen2.5-72B model (#10595) 2024-11-12 21:40:32 +08:00
Kevin9703
e03ec0032b fix: Azure OpenAI o1 max_completion_token error (#10593) 2024-11-12 21:40:13 +08:00
dependabot[bot]
62642443ef chore(deps): bump elliptic from 6.5.7 to 6.6.0 in /web (#10587)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-12 16:43:11 +08:00
-LAN-
3e04c92ff9 chore(api): remove setting of expired remember_token cookie in after_request (#10582) 2024-11-12 15:53:55 +08:00
zxhlyh
b77628c458 fix: text-generation webapp file form (#10578) 2024-11-12 15:35:12 +08:00
NFish
40c5e6d67a fix: Page may lock if user close the page when refresh access_token (#10550) 2024-11-12 15:18:19 +08:00
zxhlyh
e4d175780e fix: retrieval setting validate (#10454) 2024-11-12 14:38:24 +08:00
-LAN-
16b9665033 refactor(api): improve handling of tools field and cleanup variable usage (#10553) 2024-11-12 00:08:04 +08:00
Benjamin
b7238caea5 chore(vanna): update form parameter from 'form' to 'llm' in vanna.yaml (#10548) 2024-11-12 00:00:27 +08:00
Hiroshi Fujita
e63c0e3cbb feat(settings): add chat color theme inverted toggle in settings modal (#10558) 2024-11-11 23:53:43 +08:00
fdb02983rhy
16db2c4e57 Fix: Set Celery LOG_File only when available, always log to console (#10563) 2024-11-11 23:53:12 +08:00
-LAN-
bd4a61addd fix: set default factory for extract_by in ListOperatorNodeData (#10561) 2024-11-11 23:32:40 +08:00
smyhw
f19c18dc14 Fixes you have not added provider None (#10501) 2024-11-11 21:50:32 +08:00
liuhaoran
570f10d91c fix issues: Image file not deleted when a doc is removed #9541 (#10465)
Signed-off-by: root <root@localhost.localdomain>
Co-authored-by: root <root@localhost.localdomain>
2024-11-11 21:43:37 +08:00
-LAN-
9550b884f7 chore: update version to 0.11.1 across all configurations and Docker images (#10539) 2024-11-11 18:32:28 +08:00
Novice
4b45ef62ed fix: iteration invalid output selector doesn't throw an error (#10544) 2024-11-11 17:34:48 +08:00
-LAN-
a1543b7da0 fix(extractor): temporary file (#10543) 2024-11-11 17:31:27 +08:00
Benjamin
90087160c6 chore (vanna): update form parameter from 'form' to 'llm' in vanna.yaml (#10488) 2024-11-11 16:41:47 +08:00
-LAN-
be33875199 fix(gitee_ai): update English description for clarity and accuracy (#10540) 2024-11-11 16:23:11 +08:00
-LAN-
867bf70f1a fix(model_runtime): ensure compatibility with O1 models by adjusting token parameters (#10537) 2024-11-11 16:06:53 +08:00
Novice
9018ef30fe chore: (dockerfile) upgrade perl version (#10534) 2024-11-11 15:02:33 +08:00
zxhlyh
508f84893f fix: workflow start node form optional value (#10529) 2024-11-11 14:57:28 +08:00
Novice
f414d241c1 Feat/iteration single run time (#10512) 2024-11-11 14:47:52 +08:00
Jyong
0c1307b083 add jina rerank http timeout parameter (#10476) 2024-11-11 13:28:11 +08:00
-LAN-
b8b6cd409a refactor(code_executor): update input type annotations to use Mapping for better type safety (#10478) 2024-11-11 13:10:39 +08:00
Charlie.Wei
fbee41f8c7 The list action node adds methods to extract specific list objects (#10421)
Co-authored-by: luowei <glpat-EjySCyNjWiLqAED-YmwM>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-11-11 12:10:21 +08:00
Charlie.Wei
55edd5047e Support for incoming value modification (#10525) 2024-11-11 11:52:32 +08:00
非法操作
0587e24fdb feat: tool search can also match the toolProvider's name (#10518) 2024-11-11 11:32:41 +08:00
Xiao Ley
451ccb778d feat(tools/podcast_generator): add support for setting openai base url with the podcast_generator tool (#10517) 2024-11-11 11:31:47 +08:00
crazywoola
5656f81bde Revert "fix the error of unable to retrieve url from file" (#10511) 2024-11-11 08:56:19 +08:00
Xiao Ley
b07ea5055b feat(tools/podcast_generator): add support for setting openai base url with the podcast_generator tool (#10496) 2024-11-11 08:48:36 +08:00
Xiao Ley
5eb27afd63 fix the error of unable to retrieve url from file (#10498) 2024-11-11 08:47:47 +08:00
fdb02983rhy
05d43a4074 Fix: Correct the max tokens of Claude-3.5-Sonnet-20241022 for Bedrock and VertexAI (#10508) 2024-11-11 08:41:43 +08:00
larcane97
aa895cfa9b fix: [VESSL-AI] edit some words in vessl_ai.yaml (#10417)
Co-authored-by: moon <moon@vessl.ai>
2024-11-11 08:38:26 +08:00
-LAN-
172c7eb270 fix(file_upload): correct validation method and add unit tests (#10477) 2024-11-08 21:55:01 +08:00
crazywoola
eb6c0b8027 Fix/log tz (#10473) 2024-11-08 20:24:22 +08:00
-LAN-
06d2520db2 fix(api): replace Raw field with FilesContainedField in MessageListApi inputs (#10472) 2024-11-08 19:48:34 +08:00
-LAN-
bf31a3efbc feat(workflow-nodes): handle missing variables without failure (#10471) 2024-11-08 19:48:05 +08:00
Jyong
445dcfe4d0 add create tidb serverless job control (#10467)
Co-authored-by: crazywoola <427733928@qq.com>
2024-11-08 18:48:12 +08:00
-LAN-
25ca0278dd refactor(core): Remove extra_config from File. (#10203) 2024-11-08 18:13:24 +08:00
-LAN-
78a380bcc4 fix(migrations): correct schema reference in service API history migration (#10452) 2024-11-08 17:47:57 +08:00
Jyong
4f1a56f0f0 update document and segment word count (#10449) 2024-11-08 17:32:27 +08:00
Bowen Liang
754bfb181c chore(ci): avoid reinstall pipx and pin poetry version aligned with in api dockerfile (#10426) 2024-11-08 17:30:26 +08:00
非法操作
7903ba0297 chore: make comfy workflow can generate image with a random seed (#10462) 2024-11-08 17:21:16 +08:00
QuietlyChan
c1b2243adb feat: Add support for complete domain names in the new URL prefix. (#8893)
Co-authored-by: crazywoola <427733928@qq.com>
2024-11-08 17:17:34 +08:00
Jyong
d52c750942 embedding model check when init the knowledge (#10463) 2024-11-08 17:14:56 +08:00
liuhaoran
7c2a9b0744 celery worker log format following LOG_FORMAT env #9404 (#10016)
Signed-off-by: root <root@localhost.localdomain>
Co-authored-by: root <root@localhost.localdomain>
2024-11-08 17:12:09 +08:00
Jyong
888d7e6422 fix segment enable service api (#10445) 2024-11-08 17:09:05 +08:00
Benjamin
919275cc58 Fix conversation response issue (#10450) 2024-11-08 17:04:43 +08:00
非法操作
4fe5297e35 feat: add cogVideo tool (#10456) 2024-11-08 17:04:05 +08:00
非法操作
22dee4f6f3 chore: add MULTIMODAL_SEND_VIDEO_FORMAT to docker's env (#10458) 2024-11-08 17:03:55 +08:00
Novice
a7dbe58c85 fix: correct output order in parallel mode for iteration nodes (#10323) 2024-11-08 15:32:40 +08:00
Joe
aa3da0e24c fix(ops_tracing): enhance error handle in celery tasks. (#10401) 2024-11-08 14:43:47 +08:00
非法操作
033ab5490b feat: support LLM understand video (#9828) 2024-11-08 13:22:52 +08:00
Leo.Wang
c9f785e00f Feat/tools/gitlab (#10407) 2024-11-08 09:53:03 +08:00
Bowen Liang
0e8ab0588f fix: (#10437 followup) fix conditions with DEBUG config (#10438) 2024-11-08 09:42:53 +08:00
Bowen Liang
0ebe198ff1 chore: use DEBUG in dify_config instead of parsing raw system environment variable in place (#10437) 2024-11-08 09:34:11 +08:00
-LAN-
438ad8148b fix(http_request): send form data (#10431) 2024-11-08 09:33:40 +08:00
Bowen Liang
a60133bfb3 fix: config violations when running db migration ci tests (#10428) 2024-11-08 09:33:12 +08:00
katsuma
98b3e37144 fix: simplify Enter key handling and remove unused ref (#10413) 2024-11-07 21:21:50 +08:00
Benjamin
6e23903c63 Conversation delete issue (#10423) 2024-11-07 21:13:23 +08:00
Bowen Liang
574c4a264f chore(lint): Use logging.exception instead of logging.error (#10415) 2024-11-07 21:13:02 +08:00
ice yao
dd5ffaf058 chore: use posixpath to wrapper filepath (#9976) 2024-11-07 19:31:49 +08:00
huangyafei
0b16270b88 fix typo: Retrieve Chunks API Docs (#10412) 2024-11-07 18:11:36 +08:00
走在修行的大街上
f562a88249 feat(Tools): add lark tools (#10117)
Co-authored-by: 黎斌 <libin.23@bytedance.com>
2024-11-07 18:11:25 +08:00
非法操作
59f8d116af chore: improve custom tool's display (#10410) 2024-11-07 18:10:41 +08:00
Benjamin
a5558f8fcc fix(conversation-service): return success response after conversation… (#10416) 2024-11-07 18:07:05 +08:00
-LAN-
823ae03a08 fix(remote-files): fall back to GET when the remote server does not support the HEAD method (#10370) 2024-11-07 14:35:58 +08:00
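A hedged sketch of the HEAD-then-GET fallback pattern for probing remote files; httpx and the specific status handling are assumptions for illustration, not the exact implementation in the PR:
```python
import httpx

def probe_remote_file(url: str) -> tuple[str, int]:
    # Try a cheap HEAD request first; some servers answer 405/501 for HEAD.
    response = httpx.head(url, follow_redirects=True)
    if response.status_code >= 400:
        # Fall back to GET when HEAD is not supported.
        response = httpx.get(url, follow_redirects=True)
    response.raise_for_status()
    content_type = response.headers.get("Content-Type", "application/octet-stream")
    content_length = int(response.headers.get("Content-Length", "0"))
    return content_type, content_length
```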
-LAN-
f8c958a409 refactor(iteration): introduce specific exceptions for iteration errors (#10366) 2024-11-07 14:02:55 +08:00
-LAN-
25785d8c3f refactor(knowledge-retrieval): improve error handling with custom exceptions (#10385) 2024-11-07 14:02:46 +08:00
-LAN-
35d3da9697 refactor(tool-node): introduce specific exceptions for tool node errors (#10357) 2024-11-07 14:02:38 +08:00
-LAN-
d3e9930235 refactor(question_classifier): improve error handling with custom exceptions (#10365) 2024-11-07 14:02:30 +08:00
luckylhb90
1ccca7cc68 fixed: web api remote urls error (#10383)
Co-authored-by: hobo.l <hobo.l@binance.com>
2024-11-07 13:55:19 +08:00
powerfool
12a9e2972a Adjusted docker manifests and environment variables for OceanBase vector database (#10395) 2024-11-07 13:22:09 +08:00
omr
444c1f170a fix typo: mMaximum -> Maximum (#10389) 2024-11-07 10:40:57 +08:00
非法操作
3cb2fb8250 fix: remove duplicated category “recommended” (#10375) 2024-11-06 19:06:55 +08:00
Matsuda
1e8457441d fix(model_runtime): remove vision from features for Claude 3.5 Haiku (#10360) 2024-11-06 17:42:18 +08:00
Infinitnet
5a9448245b fix: remove unsupported vision in OpenRouter Haiku 3.5 (#10364) 2024-11-06 17:41:48 +08:00
Bowen Liang
eafe5a9d8f chore(ci): bring back poetry cache to speed up CI jobs (#10347) 2024-11-06 13:55:29 +08:00
Bowen Liang
d45d90e8ae chore: lazy import sagemaker (#10342) 2024-11-06 12:45:22 +08:00
comfuture
42a9374e71 chore: update translation for 'account' from '계좌' to '계정' (#10350) 2024-11-06 12:44:44 +08:00
-LAN-
82a775eca3 chore(ci): separate vector store tests into new workflow (#10354) 2024-11-06 12:43:55 +08:00
-LAN-
1dae1a71fc fix(api): remove fixed source attribute from FileApi (#10353) 2024-11-06 12:29:58 +08:00
Nam Vu
ac0fed6402 feat: support png, gif, webp (#7947)
Co-authored-by: xuanson9699 <84961581+xuanson9699@users.noreply.github.com>
2024-11-06 09:05:05 +08:00
Chenhe Gu
fb656d480e Update README.md (#10332) 2024-11-06 08:57:49 +08:00
方程
2b7341af57 Gitee AI tools (#10314) 2024-11-06 08:51:13 +08:00
Summer-Gu
ce1f9d935d feat: add an SSRF request timeout configuration item (#10292) 2024-11-06 08:50:57 +08:00
Infinitnet
bdadca1a65 feat: add support for anthropic/claude-3-5-haiku through OpenRouter (#10331) 2024-11-06 08:26:44 +08:00
Benjamin
d7b4d0756e feat(vannaai): add base_url configuration (#10294) 2024-11-05 20:58:49 +08:00
-LAN-
1279e27825 docs: remove the TOC part (#10324) 2024-11-05 04:48:14 -08:00
非法操作
d92e3bd620 fix: special prompt does not work for the comfyUI tool (#10307) 2024-11-05 18:21:41 +08:00
-LAN-
7f583ec1ac chore: update version to 0.11.0 across all relevant files (#10278) 2024-11-05 17:53:56 +08:00
Novice
7962101e5e fix: iteration none output error (#10295) 2024-11-05 16:31:49 +08:00
-LAN-
ae254f0a10 fix(http_request): improve parameter initialization and reorganize tests (#10297) 2024-11-05 16:30:23 +08:00
Matsuda
68e0b0ac84 fix typo: writeOpner to writeOpener (#10290) 2024-11-05 16:09:53 +08:00
pinsily
5f21d13572 fix: handle KeyError when accessing rules in CleanProcessor.clean (#10258) 2024-11-05 14:47:15 +08:00
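A minimal sketch of the defensive-access pattern such a KeyError fix typically applies; the function shape and rule names here are illustrative, not Dify's actual CleanProcessor code:
```python
def clean(text: str, process_rule: dict) -> str:
    # process_rule may not contain a "rules" key at all, so fall back to an
    # empty dict instead of indexing and raising KeyError.
    rules = process_rule.get("rules") or {}
    for pre_rule in rules.get("pre_processing_rules", []):
        if pre_rule.get("id") == "remove_extra_spaces" and pre_rule.get("enabled"):
            text = " ".join(text.split())
    return text
```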
eux
233bffdb7d fix: broken FAQ URL in CONTRIBUTING.md (#10275) 2024-11-05 14:42:59 +08:00
非法操作
bf9349c4dc feat: add xAI model provider (#10272) 2024-11-05 14:42:47 +08:00
Matsuda
4847548779 feat(model_runtime): add new model 'claude-3-5-haiku-20241022' (#10285) 2024-11-05 14:41:39 +08:00
Matsuda
cb245b5435 fix(model_runtime): fix wrong max_tokens for Claude 3.5 Haiku on Amazon Bedrock (#10286) 2024-11-05 14:41:15 +08:00
-LAN-
249b897872 feat(model): add validation for custom disclaimer length (#10287) 2024-11-05 14:40:57 +08:00
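A minimal sketch of a disclaimer length guard; the 512-character limit and function name are assumptions for illustration, not the values used in the PR:
```python
def validate_custom_disclaimer(disclaimer: str) -> str:
    max_length = 512  # assumed limit for illustration; the real maximum may differ
    if len(disclaimer) > max_length:
        raise ValueError(f"Custom disclaimer cannot exceed {max_length} characters.")
    return disclaimer
```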
-LAN-
08c731fd84 fix(node): correct file property name in function switch (#10284) 2024-11-05 14:23:18 +08:00
NFish
302f4407f6 refactor the logic of refreshing access_token (#10068) 2024-11-05 12:38:31 +08:00
github-actions[bot]
de5dfd99f6 chore: translate i18n files (#10273)
Co-authored-by: laipz8200 <16485841+laipz8200@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-11-05 10:57:32 +08:00
Benjamin
acb22f0fde Updates: Add mplfonts library for customizing matplotlib fonts and Va… (#9903) 2024-11-05 10:34:28 +08:00
Novice
d1505b15c4 feat: Iteration node supports parallel mode (#9493) 2024-11-05 10:32:49 +08:00
GeorgeCaoJ
cca2e7876d fix(workflow): handle else condition branch addition error in if-else node (#10257) 2024-11-05 09:56:41 +08:00
-LAN-
2c4d8dbe9b feat(document_extractor): support tool file in document extractor (#10217) 2024-11-05 09:49:43 +08:00
Matsuda
9305ad2102 feat: support Claude 3.5 Haiku on Amazon Bedrock (#10265) 2024-11-05 09:42:51 +08:00
-LAN-
7a98dab6a4 refactor(parameter_extractor): implement custom error classes (#10260) 2024-11-05 09:27:51 +08:00
guogeer
971defbbbd fix: builtin tool aippt (#10234)
Co-authored-by: jinqi.guo <jinqi.guo@ubtrobot.com>
2024-11-04 18:46:39 +08:00
-LAN-
6b0de08157 fix(validation): allow using 0 in the inputs form (#10255) 2024-11-04 18:34:55 +08:00
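The usual root cause of a "0 is rejected" validation bug is a truthiness check; a hedged sketch of the distinction, not the actual Dify validation code:
```python
def validate_required(value):
    # Buggy pattern: `if not value` also rejects 0, 0.0, "" and False.
    # if not value:
    #     raise ValueError("field is required")

    # Safer pattern: only a genuinely missing value is rejected.
    if value is None:
        raise ValueError("field is required")
    return value

assert validate_required(0) == 0  # 0 is accepted as a legitimate input
```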
-LAN-
87c1de66f2 chore(Dockerfile): upgrade zlib arm64 (#10244) 2024-11-04 17:48:10 +08:00
方程
2aa171c348 Using a dedicated interface to obtain the token credential for the gitee.ai provider (#10243) 2024-11-04 17:22:02 +08:00
-LAN-
6452342222 feat(workflow): add configurable workflow file upload limit (#10176)
Co-authored-by: JzoNg <jzongcode@gmail.com>
2024-11-04 15:55:34 +08:00
shisaru292
da204c131d fix: missing working directory parameter in script (#10226) 2024-11-04 15:23:18 +08:00
-LAN-
9369cc44e6 refactor(list_operator): replace ValueError with InvalidKeyError (#10222) 2024-11-04 15:23:08 +08:00
-LAN-
38bca6731c refactor(workflow): introduce specific error handling for LLM nodes (#10221) 2024-11-04 15:22:58 +08:00
-LAN-
2adab7f71a refactor(http_request): add custom exception handling for HTTP request nodes (#10219) 2024-11-04 15:22:50 +08:00
-LAN-
be96f6e62d refactor(workflow): introduce specific exceptions for code validation (#10218) 2024-11-04 15:22:41 +08:00
-LAN-
8b5ea39916 chore(llm_node): remove unnecessary type ignore for context assignment (#10216) 2024-11-04 15:22:31 +08:00
Jyong
1024fc623e fix the ssrf of docx file extractor external images (#10237) 2024-11-04 15:22:07 +08:00
Hanqing Zhao
8ab05d4c36 Modify translation (#10213) 2024-11-04 09:11:15 +08:00
Jiang
0c9e79cd67 Add Lindorm as a VDB choice (#10202)
Co-authored-by: jiangzhijie <jiangzhijie.jzj@alibaba-inc.com>
2024-11-04 09:10:26 +08:00
crazywoola
2ed6bb86c1 Fix/10199 application error a client side exception has occurred see the browser console for more information (#10211) 2024-11-03 12:53:49 +08:00
-LAN-
61da0f08dd refactor(validation): improve input validation logic (#10175) 2024-11-03 11:55:46 +08:00
-LAN-
1432c268a8 chore(list_operator): refine exception handling for error specificity (#10206) 2024-11-03 11:55:19 +08:00
-LAN-
ec6a03afdd fix(document_extractor): update base exception class (#10208) 2024-11-03 11:55:07 +08:00
Kota-Yamaguchi
bf371a6e5d Feat : add LLM model indicator in prompt generator (#10187) 2024-11-02 19:46:28 +08:00
Xiao Ley
b28cf68097 chore: enable vision support for models in OpenRouter that should have supported vision (#10191) 2024-11-02 19:45:20 +08:00
Kota-Yamaguchi
a0af7a51ed chore : code generator preview hint (#10188) 2024-11-02 19:45:07 +08:00
zxhlyh
dfa3ef0564 fix: webapp upload file (#10195) 2024-11-02 17:03:14 +08:00
-LAN-
0066531266 fix(api): replace current_user with end_user in file upload (#10194) 2024-11-02 17:03:00 +08:00
-LAN-
53a7cb0e9d feat(document_extractor): integrate unstructured API for PPTX extraction (#10180) 2024-11-01 23:19:11 +08:00
-LAN-
86739bea8b fix(tools): suppress RuntimeWarnings in podcast audio generator (#10182) 2024-11-01 20:59:40 +08:00
Cling_o3
ab127ba92e [fix] fix the bug that modifying a document name did not take effect (#10154) 2024-11-01 18:59:15 +08:00
-LAN-
6a2a9460e9 fix(workflow model): ensure consistent timestamp updating (#10172) 2024-11-01 18:58:54 +08:00
jiangbo721
07ad362854 fix: Cannot find declaration to go to CLEAN_DAY_SETTING (#10157)
Co-authored-by: 刘江波 <liujiangbo1@xiaomi.com>
2024-11-01 17:25:31 +08:00
Lawrence Li
76b0328eb1 feat: add gpustack model provider (#10158) 2024-11-01 17:23:30 +08:00
-LAN-
3c85136279 refactor(tools): Avoid warnings. (#10161) 2024-11-01 17:17:27 +08:00
-LAN-
bf048b8d7c refactor(migration/model): update column types for workflow schema (#10160) 2024-11-01 16:10:55 +08:00
-LAN-
9ac2bb30f4 Feat/add-remote-file-upload-api (#9906) 2024-11-01 15:51:22 +08:00
zxhlyh
78b74cce8e fix: upload remote image preview (#9952) 2024-11-01 15:45:27 +08:00
Jyong
82033af097 clean disallowed special characters when doing the indexing estimate (#10153) 2024-11-01 15:09:22 +08:00
-LAN-
951308b5f3 refactor(service): handle unsupported DSL version with warning (#10151) 2024-11-01 15:04:54 +08:00
larcane97
8d5456b6d0 Add VESSL AI OpenAI API-compatible model provider and LLM model (#9474)
Co-authored-by: moon <moon@vessl.ai>
2024-11-01 13:38:52 +08:00
Kota-Yamaguchi
f674de4f5d feat: synchronize input/output variables in the panel with generated code by the code generator (#10150) 2024-11-01 11:39:32 +08:00
Zixuan Cheng
fafa5938da Refined README for better reading experience. (#10143) 2024-11-01 10:17:06 +08:00
Coal Pigeon
4d5546953a add llm: ernie-4.0-turbo-128k of wenxin (#10135)
Co-authored-by: Pigeon姚宏锋 <pigeon.yhf@galaxyoversea.com>
2024-10-31 21:49:04 +08:00
Shili Cao
b61baa87ec fix: avoid unexpected error when creating a knowledge base with the Baidu vector database and Wenxin embedding model (#10130) 2024-10-31 21:34:23 +08:00
llinvokerl
805c701767 fix: bar chart issue with duplicate x-axis labels being incorrectly ignored (#10134)
Co-authored-by: liusurong.lsr <liusurong.lsr@alibaba-inc.com>
2024-10-31 21:25:47 +08:00
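For the bar-chart fix above (#10134): when duplicate categorical labels are passed directly as x values, plotting libraries typically merge them into a single category. A hedged matplotlib sketch of the common workaround, plotting by position and labeling the ticks afterwards (illustrative, not the tool's actual code):
```python
import matplotlib.pyplot as plt

labels = ["Q1", "Q1", "Q2"]   # duplicate x-axis labels coming from the data
values = [10, 15, 7]

# Plot against explicit positions so duplicate labels are not merged into one bar.
positions = range(len(labels))
fig, ax = plt.subplots()
ax.bar(positions, values)
ax.set_xticks(positions)
ax.set_xticklabels(labels)
fig.savefig("bar_chart.png")
```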
Jyong
dad041c49f fix issue: query is none when doing retrieval (#10129) 2024-10-31 21:25:00 +08:00
zxhlyh
2ecdc54b0b Fix/rerank validation issue (#10131)
Co-authored-by: Yi <yxiaoisme@gmail.com>
2024-10-31 20:20:46 +08:00
Jyong
ce260f79d2 Feat/update knowledge api url (#10102)
Co-authored-by: nite-knite <nkCoding@gmail.com>
2024-10-31 18:29:12 +08:00
omr
11ca1bec0b fix: optimize unique document filtering with set (#10082) 2024-10-31 16:32:58 +08:00
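The optimization named here is the standard switch from list membership tests to a set; a minimal sketch in which the metadata field name is an assumption:
```python
def deduplicate_documents(documents: list) -> list:
    # A set gives O(1) membership checks versus O(n) scans over a list,
    # while the result list keeps the original ordering.
    seen_ids: set[str] = set()
    unique_documents = []
    for document in documents:
        doc_id = document.metadata["doc_id"]  # field name is an assumption
        if doc_id not in seen_ids:
            seen_ids.add(doc_id)
            unique_documents.append(document)
    return unique_documents
```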
-LAN-
05d9adeb99 fix(Dockerfile): conditionally install zlib1g based on architecture (#10118) 2024-10-31 16:07:39 +08:00
Hash Brown
73f29484e7 fix: log detail panel not showing any message when total count greate… (#10119) 2024-10-31 16:02:20 +08:00
Jyong
0154a26e0b fix issue: update document segment setting failed (#10107) 2024-10-31 15:51:33 +08:00
Nam Vu
cee1c4f63d fix: Version '1:1.3.dfsg+really1.3.1-1' for 'zlib1g' was not found (#10096) 2024-10-31 15:49:28 +08:00
-LAN-
e5397c5ec2 feat(app_dsl_service): enhance error handling and DSL version management (#10108) 2024-10-31 15:16:34 +08:00
非法操作
e36f5cb366 chore: save uploaded file extension as lower case (#10111) 2024-10-31 15:16:25 +08:00
-LAN-
8b9fed75f3 refactor(version): simplify version comparison logic (#10109) 2024-10-31 15:15:32 +08:00
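Hand-rolled version comparisons are easy to get wrong, since plain string comparison puts "0.11.0" before "0.9.1"; a hedged sketch of one common simplification using numeric tuples, not necessarily the approach taken in the PR:
```python
def parse_version(version: str) -> tuple[int, ...]:
    # "0.11.0" -> (0, 11, 0); numeric tuples compare element-wise.
    return tuple(int(part) for part in version.split("."))

assert parse_version("0.11.0") > parse_version("0.9.1")
assert "0.11.0" < "0.9.1"  # plain string comparison gets this backwards
```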
beginnerZhang
66e9bd90eb fix: view logs in prompt, no response when clicked (#10093)
Co-authored-by: zhanganguo <zhanganguo@lixiang.com>
2024-10-31 10:49:14 +08:00
非法操作
b29c1224c1 chore: remove an unnecessary link (#10088) 2024-10-31 10:35:45 +08:00
非法操作
bd6175157c feat: enhance comfyui workflow (#10085) 2024-10-31 10:00:22 +08:00
AkaraChen
6692e8c508 build: update docker login action (#10050) 2024-10-31 09:53:45 +08:00
Kota-Yamaguchi
6c25131964 chore: update type definition to resolve lint error in Base usage at text-editor.tsx (#10083) 2024-10-31 09:52:59 +08:00
Bowen Liang
0bdae34b5e improve: significantly speed up server launch time by asynchronously preloading tool providers (#9146) 2024-10-31 00:21:01 +08:00
Charlie.Wei
f6fecb957e fix azure chatgpt o1 parameter error (#10067) 2024-10-30 22:08:56 +08:00
crazywoola
0a3d51e9cf Revert "chore: improve validation and handler of logging timezone with TimezoneName" (#10077) 2024-10-30 22:06:10 +08:00
sacryu
a69513c044 fix the typos in the hit testing template (#10072) 2024-10-30 22:01:22 +08:00
JasonVV
219f5d9845 Fixed the issue where recalling the knowledge base inside a workflow iteration reported errors during execution (#10060) 2024-10-30 21:56:38 +08:00
Hiroshi Fujita
ba60e0f692 chore: Set file size limits for video and audio uploads from docker env (#10063) 2024-10-30 21:55:01 +08:00
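A hedged sketch of wiring an upload size limit from a Docker environment variable into a validation check; the variable name and 100 MB default are assumptions for illustration:
```python
import os

# Variable name and default are assumptions, not the values used in the PR.
UPLOAD_VIDEO_FILE_SIZE_LIMIT_MB = int(os.environ.get("UPLOAD_VIDEO_FILE_SIZE_LIMIT", "100"))

def validate_video_upload(size_in_bytes: int) -> None:
    limit_in_bytes = UPLOAD_VIDEO_FILE_SIZE_LIMIT_MB * 1024 * 1024
    if size_in_bytes > limit_in_bytes:
        raise ValueError(
            f"Video file exceeds the {UPLOAD_VIDEO_FILE_SIZE_LIMIT_MB} MB upload limit."
        )
```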
Fog3211
18424dd82f fix: prevent onChange during IME composition (#10059) 2024-10-30 16:59:40 +08:00
22mSqRi
32ebea91ff fix: fix poetry install command in devcontainer (#9507) 2024-10-30 16:27:17 +08:00
-LAN-
3b53e06e0d fix(workflow): refine variable type checks in LLMNode (#10051) 2024-10-30 16:23:12 +08:00
非法操作
4d38798dd5 chore: mount config file of sandbox (#8576) 2024-10-30 15:45:51 +08:00
zhuhao
92a3898540 fix: resolve the incorrect model name of hunyuan-standard-256k (#10052) 2024-10-30 15:43:29 +08:00
zhuhao
7433095240 chore: use dify_config.TIDB_SPEND_LIMIT instead of constant value (#10038) 2024-10-30 15:43:07 +08:00
郭伟伟
190b6a2aa6 feat: add 'update_at' field to the /conversations API response, and add the sort_by parameter to the API docs (#10043) 2024-10-30 15:41:15 +08:00
zhuhao
0095896051 feat: add YAML type in document extractor node (#9997) 2024-10-30 13:47:19 +08:00
Xiao Ley
c647e4307a add PROMPT_GENERATION_MAX_TOKENS and CODE_GENERATION_MAX_TOKENS in docker environment (#10040) 2024-10-30 12:48:56 +08:00
Bowen Liang
bab5c54219 chore: improve validation and handler of logging timezone with TimezoneName (#9595) 2024-10-30 11:18:23 +08:00
Jyong
e74479717a fix update_by_api batch field issue (#10001) 2024-10-30 11:17:46 +08:00
Jyong
9ebd453b87 add rerank check when doing multi-retrieval (#9998) 2024-10-30 11:17:39 +08:00
ice yao
5ad5d0cff4 chore: Add aliyun oss tests (#10031) 2024-10-30 11:17:30 +08:00
Mab
68cb382242 Fix #10023: error in docker-compose.yaml about TIDB_ON_QDRANT_CLIENT… (#10025) 2024-10-30 11:15:55 +08:00
Lucas Rezende
f5d1c7cc0a Added: README_PT.md in Brazilian Portuguese (#10026)
Co-authored-by: Lucas Rezende <lucasrezende@MacBook-Pro-de-Lucas.local>
2024-10-30 11:12:31 +08:00
非法操作
c7fb8a4f20 fix: conversation variable may not change in the answer node (#10034) 2024-10-30 11:10:31 +08:00
-LAN-
eb87e690ed fix(llm-node): handle NoneSegment variables properly (#9978) 2024-10-30 08:46:11 +08:00
Hiroshi Fujita
539fc8b760 Fix content-type header case sensitivity (#9961) 2024-10-30 02:11:18 +08:00
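HTTP header field names are case-insensitive (RFC 9110), so comparing Content-Type against a literal string is fragile; a minimal sketch of a tolerant lookup, not the exact code changed in the PR:
```python
def get_content_type(headers: dict[str, str]) -> str:
    # Header names are case-insensitive, so normalize before comparing.
    lowered = {name.lower(): value for name, value in headers.items()}
    return lowered.get("content-type", "")

assert get_content_type({"CONTENT-TYPE": "application/json"}) == "application/json"
```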
zhuhao
c6e54c83c8 chore: add tidb-on-qdrant configuration in env and docker-compose file (#10015) 2024-10-29 21:11:10 +08:00
powerfool
878d13ef42 Added OceanBase as an option for the vector store in Dify (#10010) 2024-10-29 21:10:18 +08:00
Jyong
5580bcf870 add tidb spend limit config (#9999) 2024-10-29 17:51:13 +08:00
非法操作
12adcf8925 fix: gemini model use some tools raise error (#9993) 2024-10-29 16:09:29 +08:00
roadgoat19
c8ef9223e5 feat: couchbase integration (#6165)
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: Elliot Scribner <elliot.scribner@couchbase.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Bowen Liang <bowenliang@apache.org>
2024-10-29 15:00:23 +08:00
crazywoola
fc37e654fc Feat/support form in conversation (#9980) 2024-10-29 13:32:50 +08:00
非法操作
eb69896355 fix: allow the external knowledge API to use a simple host (#9966) 2024-10-29 10:33:15 +08:00
ice yao
61ff2fd0f3 chore: Enable tencent cos test to run (#9971) 2024-10-29 10:33:00 +08:00
Wu Tianwei
b6d045cebf fix: Fix page logout issue due to refresh-token (#9970) 2024-10-29 09:55:14 +08:00
Jyong
f47177ecb4 add top_k for es full text search (#9963) 2024-10-28 23:04:54 +08:00
crazywoola
de850262b8 fix: button rendering when using streaming (#9957) 2024-10-28 19:23:31 +08:00
1570 changed files with 59999 additions and 14966 deletions

View File

@@ -1,5 +1,5 @@
FROM mcr.microsoft.com/devcontainers/python:3.10
FROM mcr.microsoft.com/devcontainers/python:3.12
# [Optional] Uncomment this section to install additional OS packages.
# RUN apt-get update && export DEBIAN_FRONTEND=noninteractive \
# && apt-get -y install --no-install-recommends <your-package-list-here>
# && apt-get -y install --no-install-recommends <your-package-list-here>

View File

@@ -1,7 +1,7 @@
// For format details, see https://aka.ms/devcontainer.json. For config options, see the
// README at: https://github.com/devcontainers/templates/tree/main/src/anaconda
{
"name": "Python 3.10",
"name": "Python 3.12",
"build": {
"context": "..",
"dockerfile": "Dockerfile"

View File

@@ -7,5 +7,6 @@ echo 'alias start-api="cd /workspaces/dify/api && poetry run python -m flask run
echo 'alias start-worker="cd /workspaces/dify/api && poetry run python -m celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace,app_deletion"' >> ~/.bashrc
echo 'alias start-web="cd /workspaces/dify/web && npm run dev"' >> ~/.bashrc
echo 'alias start-containers="cd /workspaces/dify/docker && docker-compose -f docker-compose.middleware.yaml -p dify up -d"' >> ~/.bashrc
echo 'alias stop-containers="cd /workspaces/dify/docker && docker-compose -f docker-compose.middleware.yaml -p dify down"' >> ~/.bashrc
source /home/vscode/.bashrc
source /home/vscode/.bashrc

View File

@@ -1,3 +1,3 @@
#!/bin/bash
poetry install -C api
cd api && poetry install

36
.github/actions/setup-poetry/action.yml vendored Normal file
View File

@@ -0,0 +1,36 @@
name: Setup Poetry and Python
inputs:
python-version:
description: Python version to use and the Poetry installed with
required: true
default: '3.11'
poetry-version:
description: Poetry version to set up
required: true
default: '1.8.4'
poetry-lockfile:
description: Path to the Poetry lockfile to restore cache from
required: true
default: ''
runs:
using: composite
steps:
- name: Set up Python ${{ inputs.python-version }}
uses: actions/setup-python@v5
with:
python-version: ${{ inputs.python-version }}
cache: pip
- name: Install Poetry
shell: bash
run: pip install poetry==${{ inputs.poetry-version }}
- name: Restore Poetry cache
if: ${{ inputs.poetry-lockfile != '' }}
uses: actions/setup-python@v5
with:
python-version: ${{ inputs.python-version }}
cache: poetry
cache-dependency-path: ${{ inputs.poetry-lockfile }}

View File

@@ -1,34 +1,25 @@
# Checklist:
# Summary
Please include a summary of the change and which issue is fixed. Please also include relevant motivation and context. List any dependencies that are required for this change.
> [!Tip]
> Close issue syntax: `Fixes #<issue number>` or `Resolves #<issue number>`, see [documentation](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword) for more details.
# Screenshots
| Before | After |
|--------|-------|
| ... | ... |
# Checklist
> [!IMPORTANT]
> Please review the checklist below before submitting your pull request.
- [ ] Please open an issue before creating a PR or link to an existing issue
- [ ] I have performed a self-review of my own code
- [ ] I have commented my code, particularly in hard-to-understand areas
- [ ] I ran `dev/reformat`(backend) and `cd web && npx lint-staged`(frontend) to appease the lint gods
# Description
Describe the big picture of your changes here to communicate to the maintainers why we should accept this pull request. If it fixes a bug or resolves a feature request, be sure to link to that issue. Close issue syntax: `Fixes #<issue number>`, see [documentation](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword) for more details.
Fixes
## Type of Change
- [ ] Bug fix (non-breaking change which fixes an issue)
- [ ] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
- [ ] This change requires a documentation update, included: [Dify Document](https://github.com/langgenius/dify-docs)
- [ ] Improvement, including but not limited to code refactoring, performance optimization, and UI/UX improvement
- [ ] Dependency upgrade
# Testing Instructions
Please describe the tests that you ran to verify your changes. Provide instructions so we can reproduce. Please also list any relevant details for your test configuration
- [ ] Test A
- [ ] Test B
- [x] I understand that this PR may be closed in case there was no previous discussion or issues. (This doesn't apply to typos!)
- [x] I've added a test for each change that was introduced, and I tried as much as possible to make a single atomic change.
- [x] I've updated the documentation accordingly.
- [x] I ran `dev/reformat`(backend) and `cd web && npx lint-staged`(frontend) to appease the lint gods

View File

@@ -7,6 +7,7 @@ on:
paths:
- api/**
- docker/**
- .github/workflows/api-tests.yml
concurrency:
group: api-tests-${{ github.head_ref || github.run_id }}
@@ -19,7 +20,6 @@ jobs:
strategy:
matrix:
python-version:
- "3.10"
- "3.11"
- "3.12"
@@ -27,16 +27,11 @@ jobs:
- name: Checkout code
uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
- name: Setup Poetry and Python ${{ matrix.python-version }}
uses: ./.github/actions/setup-poetry
with:
python-version: ${{ matrix.python-version }}
cache-dependency-path: |
api/pyproject.toml
api/poetry.lock
- name: Install Poetry
uses: abatilo/actions-poetry@v3
poetry-lockfile: api/poetry.lock
- name: Check Poetry lockfile
run: |
@@ -67,7 +62,7 @@ jobs:
run: sh .github/workflows/expose_service_ports.sh
- name: Set up Sandbox
uses: hoverkraft-tech/compose-action@v2.0.0
uses: hoverkraft-tech/compose-action@v2.0.2
with:
compose-file: |
docker/docker-compose.middleware.yaml
@@ -77,21 +72,3 @@ jobs:
- name: Run Workflow
run: poetry run -C api bash dev/pytest/pytest_workflow.sh
- name: Set up Vector Stores (Weaviate, Qdrant, PGVector, Milvus, PgVecto-RS, Chroma, MyScale, ElasticSearch)
uses: hoverkraft-tech/compose-action@v2.0.0
with:
compose-file: |
docker/docker-compose.yaml
services: |
weaviate
qdrant
etcd
minio
milvus-standalone
pgvecto-rs
pgvector
chroma
elasticsearch
- name: Test Vector Stores
run: poetry run -C api bash dev/pytest/pytest_vdb.sh

View File

@@ -49,7 +49,7 @@ jobs:
echo "PLATFORM_PAIR=${platform//\//-}" >> $GITHUB_ENV
- name: Login to Docker Hub
uses: docker/login-action@v2
uses: docker/login-action@v3
with:
username: ${{ env.DOCKERHUB_USER }}
password: ${{ env.DOCKERHUB_TOKEN }}
@@ -114,7 +114,7 @@ jobs:
merge-multiple: true
- name: Login to Docker Hub
uses: docker/login-action@v2
uses: docker/login-action@v3
with:
username: ${{ env.DOCKERHUB_USER }}
password: ${{ env.DOCKERHUB_TOKEN }}

View File

@@ -6,6 +6,7 @@ on:
- main
paths:
- api/migrations/**
- .github/workflows/db-migration-test.yml
concurrency:
group: db-migration-test-${{ github.ref }}
@@ -14,25 +15,15 @@ concurrency:
jobs:
db-migration-test:
runs-on: ubuntu-latest
strategy:
matrix:
python-version:
- "3.10"
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
- name: Setup Poetry and Python
uses: ./.github/actions/setup-poetry
with:
python-version: ${{ matrix.python-version }}
cache-dependency-path: |
api/pyproject.toml
api/poetry.lock
- name: Install Poetry
uses: abatilo/actions-poetry@v3
poetry-lockfile: api/poetry.lock
- name: Install dependencies
run: poetry install -C api
@@ -43,7 +34,7 @@ jobs:
cp middleware.env.example middleware.env
- name: Set up Middlewares
uses: hoverkraft-tech/compose-action@v2.0.0
uses: hoverkraft-tech/compose-action@v2.0.2
with:
compose-file: |
docker/docker-compose.middleware.yaml
@@ -57,6 +48,8 @@ jobs:
cp .env.example .env
- name: Run DB Migration
env:
DEBUG: true
run: |
cd api
poetry run python -m flask upgrade-db

View File

@@ -7,5 +7,8 @@ yq eval '.services["milvus-standalone"].ports += ["19530:19530"]' -i docker/dock
yq eval '.services.pgvector.ports += ["5433:5432"]' -i docker/docker-compose.yaml
yq eval '.services["pgvecto-rs"].ports += ["5431:5432"]' -i docker/docker-compose.yaml
yq eval '.services["elasticsearch"].ports += ["9200:9200"]' -i docker/docker-compose.yaml
yq eval '.services.couchbase-server.ports += ["8091-8096:8091-8096"]' -i docker/docker-compose.yaml
yq eval '.services.couchbase-server.ports += ["11210:11210"]' -i docker/docker-compose.yaml
yq eval '.services.tidb.ports += ["4000:4000"]' -i docker/docker-compose.yaml
echo "Ports exposed for sandbox, weaviate, qdrant, chroma, milvus, pgvector, pgvecto-rs, elasticsearch"
echo "Ports exposed for sandbox, weaviate, tidb, qdrant, chroma, milvus, pgvector, pgvecto-rs, elasticsearch, couchbase"

View File

@@ -22,34 +22,29 @@ jobs:
id: changed-files
uses: tj-actions/changed-files@v45
with:
files: api/**
files: |
api/**
.github/workflows/style.yml
- name: Set up Python
uses: actions/setup-python@v5
- name: Setup Poetry and Python
if: steps.changed-files.outputs.any_changed == 'true'
with:
python-version: '3.10'
uses: ./.github/actions/setup-poetry
- name: Install Poetry
if: steps.changed-files.outputs.any_changed == 'true'
uses: abatilo/actions-poetry@v3
- name: Python dependencies
- name: Install dependencies
if: steps.changed-files.outputs.any_changed == 'true'
run: poetry install -C api --only lint
- name: Ruff check
if: steps.changed-files.outputs.any_changed == 'true'
run: poetry run -C api ruff check ./api
run: |
poetry run -C api ruff --version
poetry run -C api ruff check ./api
poetry run -C api ruff format --check ./api
- name: Dotenv check
if: steps.changed-files.outputs.any_changed == 'true'
run: poetry run -C api dotenv-linter ./api/.env.example ./web/.env.example
- name: Ruff formatter check
if: steps.changed-files.outputs.any_changed == 'true'
run: poetry run -C api ruff format --check ./api
- name: Lint hints
if: failure()
run: echo "Please run 'dev/reformat' to fix the fixable linting errors."

73
.github/workflows/vdb-tests.yml vendored Normal file
View File

@@ -0,0 +1,73 @@
name: Run VDB Tests
on:
pull_request:
branches:
- main
paths:
- api/core/rag/datasource/**
- docker/**
- .github/workflows/vdb-tests.yml
- api/poetry.lock
- api/pyproject.toml
concurrency:
group: vdb-tests-${{ github.head_ref || github.run_id }}
cancel-in-progress: true
jobs:
test:
name: VDB Tests
runs-on: ubuntu-latest
strategy:
matrix:
python-version:
- "3.11"
- "3.12"
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Setup Poetry and Python ${{ matrix.python-version }}
uses: ./.github/actions/setup-poetry
with:
python-version: ${{ matrix.python-version }}
poetry-lockfile: api/poetry.lock
- name: Check Poetry lockfile
run: |
poetry check -C api --lock
poetry show -C api
- name: Install dependencies
run: poetry install -C api --with dev
- name: Set up dotenvs
run: |
cp docker/.env.example docker/.env
cp docker/middleware.env.example docker/middleware.env
- name: Expose Service Ports
run: sh .github/workflows/expose_service_ports.sh
- name: Set up Vector Stores (TiDB, Weaviate, Qdrant, PGVector, Milvus, PgVecto-RS, Chroma, MyScale, ElasticSearch, Couchbase)
uses: hoverkraft-tech/compose-action@v2.0.2
with:
compose-file: |
docker/docker-compose.yaml
services: |
weaviate
qdrant
couchbase-server
etcd
minio
milvus-standalone
pgvecto-rs
pgvector
chroma
elasticsearch
tidb
- name: Test Vector Stores
run: poetry run -C api bash dev/pytest/pytest_vdb.sh

5
.gitignore vendored
View File

@@ -173,6 +173,9 @@ docker/volumes/myscale/log/*
docker/volumes/unstructured/*
docker/volumes/pgvector/data/*
docker/volumes/pgvecto_rs/data/*
docker/volumes/couchbase/*
docker/volumes/oceanbase/*
!docker/volumes/oceanbase/init.d
docker/nginx/conf.d/default.conf
docker/nginx/ssl/*
@@ -189,4 +192,4 @@ pyrightconfig.json
api/.vscode
.idea/
.vscode
.vscode

View File

@@ -1,6 +1,8 @@
# CONTRIBUTING
So you're looking to contribute to Dify - that's awesome, we can't wait to see what you do. As a startup with limited headcount and funding, we have grand ambitions to design the most intuitive workflow for building and managing LLM applications. Any help from the community counts, truly.
We need to be nimble and ship fast given where we are, but we also want to make sure that contributors like you get as smooth an experience at contributing as possible. We've assembled this contribution guide for that purpose, aiming at getting you familiarized with the codebase & how we work with contributors, so you could quickly jump to the fun part.
We need to be nimble and ship fast given where we are, but we also want to make sure that contributors like you get as smooth an experience at contributing as possible. We've assembled this contribution guide for that purpose, aiming at getting you familiarized with the codebase & how we work with contributors, so you could quickly jump to the fun part.
This guide, like Dify itself, is a constant work in progress. We highly appreciate your understanding if at times it lags behind the actual project, and welcome any feedback for us to improve.
@@ -10,14 +12,12 @@ In terms of licensing, please take a minute to read our short [License and Contr
[Find](https://github.com/langgenius/dify/issues?q=is:issue+is:open) an existing issue, or [open](https://github.com/langgenius/dify/issues/new/choose) a new one. We categorize issues into 2 types:
### Feature requests:
### Feature requests
* If you're opening a new feature request, we'd like you to explain what the proposed feature achieves, and include as much context as possible. [@perzeusss](https://github.com/perzeuss) has made a solid [Feature Request Copilot](https://udify.app/chat/MK2kVSnw1gakVwMX) that helps you draft out your needs. Feel free to give it a try.
* If you want to pick one up from the existing issues, simply drop a comment below it saying so.
A team member working in the related direction will be looped in. If all looks good, they will give the go-ahead for you to start coding. We ask that you hold off working on the feature until then, so none of your work goes to waste should we propose changes.
Depending on whichever area the proposed feature falls under, you might talk to different team members. Here's rundown of the areas each our team members are working on at the moment:
@@ -40,7 +40,7 @@ In terms of licensing, please take a minute to read our short [License and Contr
| Non-core features and minor enhancements | Low Priority |
| Valuable but not immediate | Future-Feature |
### Anything else (e.g. bug report, performance optimization, typo correction):
### Anything else (e.g. bug report, performance optimization, typo correction)
* Start coding right away.
@@ -52,7 +52,6 @@ In terms of licensing, please take a minute to read our short [License and Contr
| Non-critical bugs, performance boosts | Medium Priority |
| Minor fixes (typos, confusing but working UI) | Low Priority |
## Installing
Here are the steps to set up Dify for development:
@@ -63,7 +62,7 @@ Here are the steps to set up Dify for development:
Clone the forked repository from your terminal:
```
```shell
git clone git@github.com:<github_username>/dify.git
```
@@ -71,21 +70,21 @@ git clone git@github.com:<github_username>/dify.git
Dify requires the following dependencies to build, make sure they're installed on your system:
- [Docker](https://www.docker.com/)
- [Docker Compose](https://docs.docker.com/compose/install/)
- [Node.js v18.x (LTS)](http://nodejs.org)
- [npm](https://www.npmjs.com/) version 8.x.x or [Yarn](https://yarnpkg.com/)
- [Python](https://www.python.org/) version 3.10.x
* [Docker](https://www.docker.com/)
* [Docker Compose](https://docs.docker.com/compose/install/)
* [Node.js v18.x (LTS)](http://nodejs.org)
* [npm](https://www.npmjs.com/) version 8.x.x or [Yarn](https://yarnpkg.com/)
* [Python](https://www.python.org/) version 3.11.x or 3.12.x
### 4. Installations
Dify is composed of a backend and a frontend. Navigate to the backend directory by `cd api/`, then follow the [Backend README](api/README.md) to install it. In a separate terminal, navigate to the frontend directory by `cd web/`, then follow the [Frontend README](web/README.md) to install.
Check the [installation FAQ](https://docs.dify.ai/learn-more/faq/self-host-faq) for a list of common issues and steps to troubleshoot.
Check the [installation FAQ](https://docs.dify.ai/learn-more/faq/install-faq) for a list of common issues and steps to troubleshoot.
### 5. Visit dify in your browser
To validate your set up, head over to [http://localhost:3000](http://localhost:3000) (the default, or your self-configured URL and port) in your browser. You should now see Dify up and running.
To validate your set up, head over to [http://localhost:3000](http://localhost:3000) (the default, or your self-configured URL and port) in your browser. You should now see Dify up and running.
## Developing
@@ -97,9 +96,9 @@ To help you quickly navigate where your contribution fits, a brief, annotated ou
### Backend
Dify's backend is written in Python using [Flask](https://flask.palletsprojects.com/en/3.0.x/). It uses [SQLAlchemy](https://www.sqlalchemy.org/) for ORM and [Celery](https://docs.celeryq.dev/en/stable/getting-started/introduction.html) for task queueing. Authorization logic goes via Flask-login.
Dify's backend is written in Python using [Flask](https://flask.palletsprojects.com/en/3.0.x/). It uses [SQLAlchemy](https://www.sqlalchemy.org/) for ORM and [Celery](https://docs.celeryq.dev/en/stable/getting-started/introduction.html) for task queueing. Authorization logic goes via Flask-login.
```
```text
[api/]
├── constants // Constant settings used throughout code base.
├── controllers // API route definitions and request handling logic.
@@ -121,7 +120,7 @@ Dify's backend is written in Python using [Flask](https://flask.palletsproject
The website is bootstrapped on [Next.js](https://nextjs.org/) boilerplate in Typescript and uses [Tailwind CSS](https://tailwindcss.com/) for styling. [React-i18next](https://react.i18next.com/) is used for internationalization.
```
```text
[web/]
├── app // layouts, pages, and components
│ ├── (commonLayout) // common layout used throughout the app
@@ -149,10 +148,10 @@ The website is bootstrapped on [Next.js](https://nextjs.org/) boilerplate in Typ
## Submitting your PR
At last, time to open a pull request (PR) to our repo. For major features, we first merge them into the `deploy/dev` branch for testing, before they go into the `main` branch. If you run into issues like merge conflicts or don't know how to open a pull request, check out [GitHub's pull request tutorial](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests).
At last, time to open a pull request (PR) to our repo. For major features, we first merge them into the `deploy/dev` branch for testing, before they go into the `main` branch. If you run into issues like merge conflicts or don't know how to open a pull request, check out [GitHub's pull request tutorial](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests).
And that's it! Once your PR is merged, you will be featured as a contributor in our [README](https://github.com/langgenius/dify/blob/main/README.md).
## Getting Help
If you ever get stuck or got a burning question while contributing, simply shoot your queries our way via the related GitHub issue, or hop onto our [Discord](https://discord.gg/8Tpq4AcN9c) for a quick chat.
If you ever get stuck or got a burning question while contributing, simply shoot your queries our way via the related GitHub issue, or hop onto our [Discord](https://discord.gg/8Tpq4AcN9c) for a quick chat.

View File

@@ -71,7 +71,7 @@ Dify 依赖以下工具和库:
- [Docker Compose](https://docs.docker.com/compose/install/)
- [Node.js v18.x (LTS)](http://nodejs.org)
- [npm](https://www.npmjs.com/) version 8.x.x or [Yarn](https://yarnpkg.com/)
- [Python](https://www.python.org/) version 3.10.x
- [Python](https://www.python.org/) version 3.11.x or 3.12.x
### 4. 安装

View File

@@ -74,7 +74,7 @@ Dify を構築するには次の依存関係が必要です。それらがシス
- [Docker Compose](https://docs.docker.com/compose/install/)
- [Node.js v18.x (LTS)](http://nodejs.org)
- [npm](https://www.npmjs.com/) version 8.x.x or [Yarn](https://yarnpkg.com/)
- [Python](https://www.python.org/) version 3.10.x
- [Python](https://www.python.org/) version 3.11.x or 3.12.x
### 4. インストール

View File

@@ -73,13 +73,13 @@ Dify yêu cầu các phụ thuộc sau để build, hãy đảm bảo chúng đ
- [Docker Compose](https://docs.docker.com/compose/install/)
- [Node.js v18.x (LTS)](http://nodejs.org)
- [npm](https://www.npmjs.com/) phiên bản 8.x.x hoặc [Yarn](https://yarnpkg.com/)
- [Python](https://www.python.org/) phiên bản 3.10.x
- [Python](https://www.python.org/) phiên bản 3.11.x hoặc 3.12.x
### 4. Cài đặt
Dify bao gồm một backend và một frontend. Đi đến thư mục backend bằng lệnh `cd api/`, sau đó làm theo hướng dẫn trong [README của Backend](api/README.md) để cài đặt. Trong một terminal khác, đi đến thư mục frontend bằng lệnh `cd web/`, sau đó làm theo hướng dẫn trong [README của Frontend](web/README.md) để cài đặt.
Kiểm tra [FAQ về cài đặt](https://docs.dify.ai/learn-more/faq/self-host-faq) để xem danh sách các vấn đề thường gặp và các bước khắc phục.
Kiểm tra [FAQ về cài đặt](https://docs.dify.ai/learn-more/faq/install-faq) để xem danh sách các vấn đề thường gặp và các bước khắc phục.
### 5. Truy cập Dify trong trình duyệt của bạn
@@ -153,4 +153,4 @@ Và thế là xong! Khi PR của bạn được merge, bạn sẽ được giớ
## Nhận trợ giúp
Nếu bạn gặp khó khăn hoặc có câu hỏi cấp bách trong quá trình đóng góp, hãy đặt câu hỏi của bạn trong vấn đề GitHub liên quan, hoặc tham gia [Discord](https://discord.gg/8Tpq4AcN9c) của chúng tôi để trò chuyện nhanh chóng.
Nếu bạn gặp khó khăn hoặc có câu hỏi cấp bách trong quá trình đóng góp, hãy đặt câu hỏi của bạn trong vấn đề GitHub liên quan, hoặc tham gia [Discord](https://discord.gg/8Tpq4AcN9c) của chúng tôi để trò chuyện nhanh chóng.

141
README.md
View File

@@ -19,6 +19,9 @@
<a href="https://discord.gg/FngNHpbcY7" target="_blank">
<img src="https://img.shields.io/discord/1082486657678311454?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb"
alt="chat on Discord"></a>
<a href="https://reddit.com/r/difyai" target="_blank">
<img src="https://img.shields.io/reddit/subreddit-subscribers/difyai?style=plastic&logo=reddit&label=r%2Fdifyai&labelColor=white"
alt="join Reddit"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="follow on X(Twitter)"></a>
@@ -46,9 +49,33 @@
</p>
Dify is an open-source LLM app development platform. Its intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production. Here's a list of the core features:
</br> </br>
Dify is an open-source LLM app development platform. Its intuitive interface combines agentic AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
## Quick start
> Before installing Dify, make sure your machine meets the following minimum system requirements:
>
>- CPU >= 2 Core
>- RAM >= 4 GiB
</br>
The easiest way to start the Dify server is through [docker compose](docker/docker-compose.yaml). Before running Dify with the following commands, make sure that [Docker](https://docs.docker.com/get-docker/) and [Docker Compose](https://docs.docker.com/compose/install/) are installed on your machine:
```bash
cd dify
cd docker
cp .env.example .env
docker compose up -d
```
After running, you can access the Dify dashboard in your browser at [http://localhost/install](http://localhost/install) and start the initialization process.
#### Seeking help
Please refer to our [FAQ](https://docs.dify.ai/getting-started/install-self-hosted/faqs) if you encounter problems setting up Dify. Reach out to [the community and us](#community--contact) if you are still having issues.
> If you'd like to contribute to Dify or do additional development, refer to our [guide to deploying from source code](https://docs.dify.ai/getting-started/install-self-hosted/local-source-code)
## Key features
**1. Workflow**:
Build and test powerful AI workflows on a visual canvas, leveraging all the following features and beyond.
@@ -79,73 +106,6 @@ Dify is an open-source LLM app development platform. Its intuitive interface com
All of Dify's offerings come with corresponding APIs, so you could effortlessly integrate Dify into your own business logic.
## Feature comparison
<table style="width: 100%;">
<tr>
<th align="center">Feature</th>
<th align="center">Dify.AI</th>
<th align="center">LangChain</th>
<th align="center">Flowise</th>
<th align="center">OpenAI Assistants API</th>
</tr>
<tr>
<td align="center">Programming Approach</td>
<td align="center">API + App-oriented</td>
<td align="center">Python Code</td>
<td align="center">App-oriented</td>
<td align="center">API-oriented</td>
</tr>
<tr>
<td align="center">Supported LLMs</td>
<td align="center">Rich Variety</td>
<td align="center">Rich Variety</td>
<td align="center">Rich Variety</td>
<td align="center">OpenAI-only</td>
</tr>
<tr>
<td align="center">RAG Engine</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">✅</td>
</tr>
<tr>
<td align="center">Agent</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">❌</td>
<td align="center">✅</td>
</tr>
<tr>
<td align="center">Workflow</td>
<td align="center">✅</td>
<td align="center">❌</td>
<td align="center">✅</td>
<td align="center">❌</td>
</tr>
<tr>
<td align="center">Observability</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">❌</td>
<td align="center">❌</td>
</tr>
<tr>
<td align="center">Enterprise Features (SSO/Access control)</td>
<td align="center">✅</td>
<td align="center">❌</td>
<td align="center">❌</td>
<td align="center">❌</td>
</tr>
<tr>
<td align="center">Local Deployment</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">❌</td>
</tr>
</table>
## Using Dify
- **Cloud </br>**
@@ -167,28 +127,7 @@ Star Dify on GitHub and be instantly notified of new releases.
![star-us](https://github.com/langgenius/dify/assets/13230914/b823edc1-6388-4e25-ad45-2f6b187adbb4)
## Quick start
> Before installing Dify, make sure your machine meets the following minimum system requirements:
>
>- CPU >= 2 Core
>- RAM >= 4 GiB
</br>
The easiest way to start the Dify server is to run our [docker-compose.yml](docker/docker-compose.yaml) file. Before running the installation command, make sure that [Docker](https://docs.docker.com/get-docker/) and [Docker Compose](https://docs.docker.com/compose/install/) are installed on your machine:
```bash
cd docker
cp .env.example .env
docker compose up -d
```
After running, you can access the Dify dashboard in your browser at [http://localhost/install](http://localhost/install) and start the initialization process.
> If you'd like to contribute to Dify or do additional development, refer to our [guide to deploying from source code](https://docs.dify.ai/getting-started/install-self-hosted/local-source-code)
## Next steps
## Advanced Setup
If you need to customize the configuration, please refer to the comments in our [.env.example](docker/.env.example) file and update the corresponding values in your `.env` file. Additionally, you might need to make adjustments to the `docker-compose.yaml` file itself, such as changing image versions, port mappings, or volume mounts, based on your specific deployment environment and requirements. After making any changes, please re-run `docker-compose up -d`. You can find the full list of available environment variables [here](https://docs.dify.ai/getting-started/install-self-hosted/environments).
@@ -208,6 +147,13 @@ Deploy Dify to Cloud Platform with a single click using [terraform](https://www.
##### Google Cloud
- [Google Cloud Terraform by @sotazum](https://github.com/DeNA/dify-google-cloud-terraform)
#### Using AWS CDK for Deployment
Deploy Dify to AWS with [CDK](https://aws.amazon.com/cdk/)
##### AWS
- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
## Contributing
For those who'd like to contribute code, see our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
@@ -216,12 +162,6 @@ At the same time, please consider supporting Dify by sharing it on social media
> We are looking for contributors to help with translating Dify to languages other than Mandarin or English. If you are interested in helping, please see the [i18n README](https://github.com/langgenius/dify/blob/main/web/i18n/README.md) for more information, and leave us a comment in the `global-users` channel of our [Discord Community Server](https://discord.gg/8Tpq4AcN9c).
**Contributors**
<a href="https://github.com/langgenius/dify/graphs/contributors">
<img src="https://contrib.rocks/image?repo=langgenius/dify" />
</a>
## Community & contact
* [Github Discussion](https://github.com/langgenius/dify/discussions). Best for: sharing feedback and asking questions.
@@ -229,6 +169,12 @@ At the same time, please consider supporting Dify by sharing it on social media
* [Discord](https://discord.gg/FngNHpbcY7). Best for: sharing your applications and hanging out with the community.
* [X(Twitter)](https://twitter.com/dify_ai). Best for: sharing your applications and hanging out with the community.
**Contributors**
<a href="https://github.com/langgenius/dify/graphs/contributors">
<img src="https://contrib.rocks/image?repo=langgenius/dify" />
</a>
## Star history
[![Star History Chart](https://api.star-history.com/svg?repos=langgenius/dify&type=Date)](https://star-history.com/#langgenius/dify&Date)
@@ -241,3 +187,4 @@ To protect your privacy, please avoid posting security issues on GitHub. Instead
## License
This repository is available under the [Dify Open Source License](LICENSE), which is essentially Apache 2.0 with a few additional restrictions.

View File

@@ -15,6 +15,9 @@
<a href="https://discord.gg/FngNHpbcY7" target="_blank">
<img src="https://img.shields.io/discord/1082486657678311454?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb"
alt="chat on Discord"></a>
<a href="https://reddit.com/r/difyai" target="_blank">
<img src="https://img.shields.io/reddit/subreddit-subscribers/difyai?style=plastic&logo=reddit&label=r%2Fdifyai&labelColor=white"
alt="join Reddit"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="follow on X(Twitter)"></a>
@@ -187,6 +190,13 @@ docker compose up -d
##### Google Cloud
- [Google Cloud Terraform بواسطة @sotazum](https://github.com/DeNA/dify-google-cloud-terraform)
#### استخدام AWS CDK للنشر
انشر Dify على AWS باستخدام [CDK](https://aws.amazon.com/cdk/)
##### AWS
- [AWS CDK بواسطة @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
## المساهمة
لأولئك الذين يرغبون في المساهمة، انظر إلى [دليل المساهمة](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md) لدينا.
@@ -219,3 +229,10 @@ docker compose up -d
## الرخصة
هذا المستودع متاح تحت [رخصة البرنامج الحر Dify](LICENSE)، والتي تعتبر بشكل أساسي Apache 2.0 مع بعض القيود الإضافية.
## الكشف عن الأمان
لحماية خصوصيتك، يرجى تجنب نشر مشكلات الأمان على GitHub. بدلاً من ذلك، أرسل أسئلتك إلى security@dify.ai وسنقدم لك إجابة أكثر تفصيلاً.
## الرخصة
هذا المستودع متاح تحت [رخصة البرنامج الحر Dify](LICENSE)، والتي تعتبر بشكل أساسي Apache 2.0 مع بعض القيود الإضافية.

View File

@@ -15,6 +15,9 @@
<a href="https://discord.gg/FngNHpbcY7" target="_blank">
<img src="https://img.shields.io/discord/1082486657678311454?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb"
alt="chat on Discord"></a>
<a href="https://reddit.com/r/difyai" target="_blank">
<img src="https://img.shields.io/reddit/subreddit-subscribers/difyai?style=plastic&logo=reddit&label=r%2Fdifyai&labelColor=white"
alt="join Reddit"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="follow on X(Twitter)"></a>
@@ -210,6 +213,13 @@ docker compose up -d
##### Google Cloud
- [Google Cloud Terraform by @sotazum](https://github.com/DeNA/dify-google-cloud-terraform)
#### 使用 AWS CDK 部署
使用 [CDK](https://aws.amazon.com/cdk/) 将 Dify 部署到 AWS
##### AWS
- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
## Star History
[![Star History Chart](https://api.star-history.com/svg?repos=langgenius/dify&type=Date)](https://star-history.com/#langgenius/dify&Date)

View File

@@ -15,6 +15,9 @@
<a href="https://discord.gg/FngNHpbcY7" target="_blank">
<img src="https://img.shields.io/discord/1082486657678311454?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb"
alt="chat en Discord"></a>
<a href="https://reddit.com/r/difyai" target="_blank">
<img src="https://img.shields.io/reddit/subreddit-subscribers/difyai?style=plastic&logo=reddit&label=r%2Fdifyai&labelColor=white"
alt="join Reddit"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="seguir en X(Twitter)"></a>
@@ -212,6 +215,13 @@ Despliega Dify en una plataforma en la nube con un solo clic utilizando [terrafo
##### Google Cloud
- [Google Cloud Terraform por @sotazum](https://github.com/DeNA/dify-google-cloud-terraform)
#### Usando AWS CDK para el Despliegue
Despliegue Dify en AWS usando [CDK](https://aws.amazon.com/cdk/)
##### AWS
- [AWS CDK por @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
## Contribuir
Para aquellos que deseen contribuir con código, consulten nuestra [Guía de contribución](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
@@ -245,3 +255,10 @@ Para proteger tu privacidad, evita publicar problemas de seguridad en GitHub. En
## Licencia
Este repositorio está disponible bajo la [Licencia de Código Abierto de Dify](LICENSE), que es esencialmente Apache 2.0 con algunas restricciones adicionales.
## Divulgación de Seguridad
Para proteger tu privacidad, evita publicar problemas de seguridad en GitHub. En su lugar, envía tus preguntas a security@dify.ai y te proporcionaremos una respuesta más detallada.
## Licencia
Este repositorio está disponible bajo la [Licencia de Código Abierto de Dify](LICENSE), que es esencialmente Apache 2.0 con algunas restricciones adicionales.

View File

@@ -15,6 +15,9 @@
<a href="https://discord.gg/FngNHpbcY7" target="_blank">
<img src="https://img.shields.io/discord/1082486657678311454?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb"
alt="chat sur Discord"></a>
<a href="https://reddit.com/r/difyai" target="_blank">
<img src="https://img.shields.io/reddit/subreddit-subscribers/difyai?style=plastic&logo=reddit&label=r%2Fdifyai&labelColor=white"
alt="join Reddit"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="suivre sur X(Twitter)"></a>
@@ -210,6 +213,13 @@ Déployez Dify sur une plateforme cloud en un clic en utilisant [terraform](http
##### Google Cloud
- [Google Cloud Terraform par @sotazum](https://github.com/DeNA/dify-google-cloud-terraform)
#### Utilisation d'AWS CDK pour le déploiement
Déployez Dify sur AWS en utilisant [CDK](https://aws.amazon.com/cdk/)
##### AWS
- [AWS CDK par @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
## Contribuer
Pour ceux qui souhaitent contribuer du code, consultez notre [Guide de contribution](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
@@ -243,3 +253,10 @@ Pour protéger votre vie privée, veuillez éviter de publier des problèmes de
## Licence
Ce référentiel est disponible sous la [Licence open source Dify](LICENSE), qui est essentiellement l'Apache 2.0 avec quelques restrictions supplémentaires.
## Divulgation de sécurité
Pour protéger votre vie privée, veuillez éviter de publier des problèmes de sécurité sur GitHub. Au lieu de cela, envoyez vos questions à security@dify.ai et nous vous fournirons une réponse plus détaillée.
## Licence
Ce référentiel est disponible sous la [Licence open source Dify](LICENSE), qui est essentiellement l'Apache 2.0 avec quelques restrictions supplémentaires.

View File

@@ -15,6 +15,9 @@
<a href="https://discord.gg/FngNHpbcY7" target="_blank">
<img src="https://img.shields.io/discord/1082486657678311454?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb"
alt="Discordでチャット"></a>
<a href="https://reddit.com/r/difyai" target="_blank">
<img src="https://img.shields.io/reddit/subreddit-subscribers/difyai?style=plastic&logo=reddit&label=r%2Fdifyai&labelColor=white"
alt="Reddit"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="X(Twitter)でフォロー"></a>
@@ -209,6 +212,13 @@ docker compose up -d
##### Google Cloud
- [@sotazumによるGoogle Cloud Terraform](https://github.com/DeNA/dify-google-cloud-terraform)
#### AWS CDK を使用したデプロイ
[CDK](https://aws.amazon.com/cdk/) を使用して、DifyをAWSにデプロイします
##### AWS
- [@KevinZhaoによるAWS CDK](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
## 貢献
コードに貢献したい方は、[Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md)を参照してください。

View File

@@ -15,6 +15,9 @@
<a href="https://discord.gg/FngNHpbcY7" target="_blank">
<img src="https://img.shields.io/discord/1082486657678311454?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb"
alt="chat on Discord"></a>
<a href="https://reddit.com/r/difyai" target="_blank">
<img src="https://img.shields.io/reddit/subreddit-subscribers/difyai?style=plastic&logo=reddit&label=r%2Fdifyai&labelColor=white"
alt="Follow Reddit"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="follow on X(Twitter)"></a>
@@ -210,6 +213,13 @@ wa'logh nIqHom neH ghun deployment toy'wI' [terraform](https://www.terraform.io/
##### Google Cloud
- [Google Cloud Terraform qachlot @sotazum](https://github.com/DeNA/dify-google-cloud-terraform)
#### AWS CDK atorlugh pilersitsineq
wa'logh nIqHom neH ghun deployment toy'wI' [CDK](https://aws.amazon.com/cdk/) lo'laH.
##### AWS
- [AWS CDK qachlot @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
## Contributing
For those who'd like to contribute code, see our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).

View File

@@ -15,6 +15,9 @@
<a href="https://discord.gg/FngNHpbcY7" target="_blank">
<img src="https://img.shields.io/discord/1082486657678311454?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb"
alt="chat on Discord"></a>
<a href="https://reddit.com/r/difyai" target="_blank">
<img src="https://img.shields.io/reddit/subreddit-subscribers/difyai?style=plastic&logo=reddit&label=r%2Fdifyai&labelColor=white"
alt="Follow Reddit"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="follow on X(Twitter)"></a>
@@ -202,6 +205,13 @@ Dify를 Kubernetes에 배포하고 프리미엄 스케일링 설정을 구성했
##### Google Cloud
- [sotazum의 Google Cloud Terraform](https://github.com/DeNA/dify-google-cloud-terraform)
#### AWS CDK를 사용한 배포
[CDK](https://aws.amazon.com/cdk/)를 사용하여 AWS에 Dify 배포
##### AWS
- [KevinZhao의 AWS CDK](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
## 기여
코드에 기여하고 싶은 분들은 [기여 가이드](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md)를 참조하세요.

251
README_PT.md Normal file
View File

@@ -0,0 +1,251 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
<p align="center">
📌 <a href="https://dify.ai/blog/introducing-dify-workflow-file-upload-a-demo-on-ai-podcast">Introduzindo o Dify Workflow com Upload de Arquivo: Recrie o Podcast Google NotebookLM</a>
</p>
<p align="center">
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">Auto-hospedagem</a> ·
<a href="https://docs.dify.ai">Documentação</a> ·
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">Consultas empresariais</a>
</p>
<p align="center">
<a href="https://dify.ai" target="_blank">
<img alt="Static Badge" src="https://img.shields.io/badge/Product-F04438"></a>
<a href="https://dify.ai/pricing" target="_blank">
<img alt="Static Badge" src="https://img.shields.io/badge/free-pricing?logo=free&color=%20%23155EEF&label=pricing&labelColor=%20%23528bff"></a>
<a href="https://discord.gg/FngNHpbcY7" target="_blank">
<img src="https://img.shields.io/discord/1082486657678311454?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb"
alt="chat on Discord"></a>
<a href="https://reddit.com/r/difyai" target="_blank">
<img src="https://img.shields.io/reddit/subreddit-subscribers/difyai?style=plastic&logo=reddit&label=r%2Fdifyai&labelColor=white"
alt="Follow Reddit"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="follow on X(Twitter)"></a>
<a href="https://hub.docker.com/u/langgenius" target="_blank">
<img alt="Docker Pulls" src="https://img.shields.io/docker/pulls/langgenius/dify-web?labelColor=%20%23FDB062&color=%20%23f79009"></a>
<a href="https://github.com/langgenius/dify/graphs/commit-activity" target="_blank">
<img alt="Commits last month" src="https://img.shields.io/github/commit-activity/m/langgenius/dify?labelColor=%20%2332b583&color=%20%2312b76a"></a>
<a href="https://github.com/langgenius/dify/" target="_blank">
<img alt="Issues closed" src="https://img.shields.io/github/issues-search?query=repo%3Alanggenius%2Fdify%20is%3Aclosed&label=issues%20closed&labelColor=%20%237d89b0&color=%20%235d6b98"></a>
<a href="https://github.com/langgenius/dify/discussions/" target="_blank">
<img alt="Discussion posts" src="https://img.shields.io/github/discussions/langgenius/dify?labelColor=%20%239b8afb&color=%20%237a5af8"></a>
</p>
<p align="center">
<a href="./README.md"><img alt="README em Inglês" src="https://img.shields.io/badge/English-d9d9d9"></a>
<a href="./README_CN.md"><img alt="简体中文版自述文件" src="https://img.shields.io/badge/简体中文-d9d9d9"></a>
<a href="./README_JA.md"><img alt="日本語のREADME" src="https://img.shields.io/badge/日本語-d9d9d9"></a>
<a href="./README_ES.md"><img alt="README em Espanhol" src="https://img.shields.io/badge/Español-d9d9d9"></a>
<a href="./README_FR.md"><img alt="README em Francês" src="https://img.shields.io/badge/Français-d9d9d9"></a>
<a href="./README_KL.md"><img alt="README tlhIngan Hol" src="https://img.shields.io/badge/Klingon-d9d9d9"></a>
<a href="./README_KR.md"><img alt="README em Coreano" src="https://img.shields.io/badge/한국어-d9d9d9"></a>
<a href="./README_AR.md"><img alt="README em Árabe" src="https://img.shields.io/badge/العربية-d9d9d9"></a>
<a href="./README_TR.md"><img alt="README em Turco" src="https://img.shields.io/badge/Türkçe-d9d9d9"></a>
<a href="./README_VI.md"><img alt="README em Vietnamita" src="https://img.shields.io/badge/Ti%E1%BA%BFng%20Vi%E1%BB%87t-d9d9d9"></a>
<a href="./README_PT.md"><img alt="README em Português - BR" src="https://img.shields.io/badge/Portugu%C3%AAs-BR?style=flat&label=BR&color=d9d9d9"></a>
</p>
Dify é uma plataforma de desenvolvimento de aplicativos LLM de código aberto. Sua interface intuitiva combina workflow de IA, pipeline RAG, capacidades de agente, gerenciamento de modelos, recursos de observabilidade e muito mais, permitindo que você vá rapidamente do protótipo à produção. Aqui está uma lista das principais funcionalidades:
</br> </br>
**1. Workflow**:
Construa e teste workflows poderosos de IA em uma interface visual, aproveitando todos os recursos a seguir e muito mais.
https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa
**2. Suporte abrangente a modelos**:
Integração perfeita com centenas de LLMs proprietários e de código aberto de diversas provedoras e soluções auto-hospedadas, abrangendo GPT, Mistral, Llama3 e qualquer modelo compatível com a API da OpenAI. A lista completa de provedores suportados pode ser encontrada [aqui](https://docs.dify.ai/getting-started/readme/model-providers).
![providers-v5](https://github.com/langgenius/dify/assets/13230914/5a17bdbe-097a-4100-8363-40255b70f6e3)
**3. IDE de Prompt**:
Interface intuitiva para criação de prompts, comparação de desempenho de modelos e adição de recursos como conversão de texto para fala em um aplicativo baseado em chat.
**4. Pipeline RAG**:
Extensas capacidades de RAG que cobrem desde a ingestão de documentos até a recuperação, com suporte nativo para extração de texto de PDFs, PPTs e outros formatos de documentos comuns.
**5. Capacidades de agente**:
Você pode definir agentes com base em LLM Function Calling ou ReAct e adicionar ferramentas pré-construídas ou personalizadas para o agente. O Dify oferece mais de 50 ferramentas integradas para agentes de IA, como Google Search, DALL·E, Stable Diffusion e WolframAlpha.
**6. LLMOps**:
Monitore e analise os registros e o desempenho do aplicativo ao longo do tempo. É possível melhorar continuamente prompts, conjuntos de dados e modelos com base nos dados de produção e anotações.
**7. Backend como Serviço**:
Todos os recursos do Dify vêm com APIs correspondentes, permitindo que você integre o Dify sem esforço na lógica de negócios da sua empresa.
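As a concrete illustration of the Backend-as-a-Service point above, here is a hedged sketch of calling an app's chat endpoint with curl. It assumes you have already created a chat app and issued an API key in the Dify console; the base URL applies to Dify Cloud, while self-hosted installs expose the same `/v1` routes on their own domain. Verify the exact payload against the API reference page shown for your app in the console.
```bash
# Illustrative only: send one chat message to a Dify app via the service API.
# YOUR_API_KEY is a placeholder for a key issued in the Dify console.
curl -X POST 'https://api.dify.ai/v1/chat-messages' \
  --header 'Authorization: Bearer YOUR_API_KEY' \
  --header 'Content-Type: application/json' \
  --data-raw '{
    "inputs": {},
    "query": "Hello, Dify!",
    "response_mode": "blocking",
    "user": "example-user"
  }'
```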
## Comparação de recursos
<table style="width: 100%;">
<tr>
<th align="center">Recurso</th>
<th align="center">Dify.AI</th>
<th align="center">LangChain</th>
<th align="center">Flowise</th>
<th align="center">OpenAI Assistants API</th>
</tr>
<tr>
<td align="center">Abordagem de Programação</td>
<td align="center">Orientada a API + Aplicativo</td>
<td align="center">Código Python</td>
<td align="center">Orientada a Aplicativo</td>
<td align="center">Orientada a API</td>
</tr>
<tr>
<td align="center">LLMs Suportados</td>
<td align="center">Variedade Rica</td>
<td align="center">Variedade Rica</td>
<td align="center">Variedade Rica</td>
<td align="center">Apenas OpenAI</td>
</tr>
<tr>
<td align="center">RAG Engine</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">✅</td>
</tr>
<tr>
<td align="center">Agente</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">❌</td>
<td align="center">✅</td>
</tr>
<tr>
<td align="center">Workflow</td>
<td align="center">✅</td>
<td align="center">❌</td>
<td align="center">✅</td>
<td align="center">❌</td>
</tr>
<tr>
<td align="center">Observabilidade</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">❌</td>
<td align="center">❌</td>
</tr>
<tr>
<td align="center">Recursos Empresariais (SSO/Controle de Acesso)</td>
<td align="center">✅</td>
<td align="center">❌</td>
<td align="center">❌</td>
<td align="center">❌</td>
</tr>
<tr>
<td align="center">Implantação Local</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">❌</td>
</tr>
</table>
## Usando o Dify
- **Nuvem </br>**
Oferecemos o serviço [Dify Cloud](https://dify.ai) para qualquer pessoa experimentar sem nenhuma configuração. Ele fornece todas as funcionalidades da versão auto-hospedada, incluindo 200 chamadas GPT-4 gratuitas no plano sandbox.
- **Auto-hospedagem do Dify Community Edition</br>**
Configure rapidamente o Dify no seu ambiente com este [guia inicial](#quick-start).
Use nossa [documentação](https://docs.dify.ai) para referências adicionais e instruções mais detalhadas.
- **Dify para empresas/organizações</br>**
Oferecemos recursos adicionais voltados para empresas. [Envie suas perguntas através deste chatbot](https://udify.app/chat/22L1zSxg6yW1cWQg) ou [envie-nos um e-mail](mailto:business@dify.ai?subject=[GitHub]Business%20License%20Inquiry) para discutir necessidades empresariais. </br>
> Para startups e pequenas empresas que utilizam AWS, confira o [Dify Premium no AWS Marketplace](https://aws.amazon.com/marketplace/pp/prodview-t22mebxzwjhu6) e implemente no seu próprio AWS VPC com um clique. É uma oferta AMI acessível com a opção de criar aplicativos com logotipo e marca personalizados.
## Mantendo-se atualizado
Dê uma estrela no Dify no GitHub e seja notificado imediatamente sobre novos lançamentos.
![star-us](https://github.com/langgenius/dify/assets/13230914/b823edc1-6388-4e25-ad45-2f6b187adbb4)
## Início rápido
> Antes de instalar o Dify, certifique-se de que sua máquina atenda aos seguintes requisitos mínimos de sistema:
>
>- CPU >= 2 Núcleos
>- RAM >= 4 GiB
</br>
A maneira mais fácil de iniciar o servidor Dify é executar nosso arquivo [docker-compose.yml](docker/docker-compose.yaml). Antes de rodar o comando de instalação, certifique-se de que o [Docker](https://docs.docker.com/get-docker/) e o [Docker Compose](https://docs.docker.com/compose/install/) estão instalados na sua máquina:
```bash
cd docker
cp .env.example .env
docker compose up -d
```
Após a execução, você pode acessar o painel do Dify no navegador em [http://localhost/install](http://localhost/install) e iniciar o processo de inicialização.
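If you prefer to confirm that the stack is healthy before opening the browser, the standard Docker Compose commands below work from the same `docker/` directory; this is a generic sketch, and the `api` service name is only an assumption based on the project's docker-compose.yaml.
```bash
# Optional sanity check: list service status and follow the API logs.
docker compose ps
docker compose logs -f api
```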
> Se você deseja contribuir com o Dify ou fazer desenvolvimento adicional, consulte nosso [guia para implantar a partir do código fonte](https://docs.dify.ai/getting-started/install-self-hosted/local-source-code).
## Próximos passos
Se precisar personalizar a configuração, consulte os comentários no nosso arquivo [.env.example](docker/.env.example) e atualize os valores correspondentes no seu arquivo `.env`. Além disso, talvez seja necessário fazer ajustes no próprio arquivo `docker-compose.yaml`, como alterar versões de imagem, mapeamentos de portas ou montagens de volumes, com base no seu ambiente de implantação específico e nas suas necessidades. Após fazer quaisquer alterações, execute novamente `docker-compose up -d`. Você pode encontrar a lista completa de variáveis de ambiente disponíveis [aqui](https://docs.dify.ai/getting-started/install-self-hosted/environments).
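As a hedged example of that workflow, the snippet below overrides one variable that also appears elsewhere in this diff (`UPLOAD_FILE_SIZE_LIMIT`) and then restarts the stack; treat the variable and the value as placeholders for whatever you actually need to change.
```bash
# Example only: raise the upload size limit (in MB) in docker/.env, then re-apply.
cd docker
sed -i 's/^UPLOAD_FILE_SIZE_LIMIT=.*/UPLOAD_FILE_SIZE_LIMIT=50/' .env  # GNU sed; on macOS use: sed -i ''
docker compose up -d  # recreates only the containers whose environment changed
```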
Se deseja configurar uma instalação de alta disponibilidade, há [Helm Charts](https://helm.sh/) e arquivos YAML contribuídos pela comunidade que permitem a implantação do Dify no Kubernetes.
- [Helm Chart de @LeoQuote](https://github.com/douban/charts/tree/master/charts/dify)
- [Helm Chart de @BorisPolonsky](https://github.com/BorisPolonsky/dify-helm)
- [Arquivo YAML de @Winson-030](https://github.com/Winson-030/dify-kubernetes)
#### Usando o Terraform para Implantação
Implante o Dify na Plataforma Cloud com um único clique usando [terraform](https://www.terraform.io/)
##### Azure Global
- [Azure Terraform por @nikawang](https://github.com/nikawang/dify-azure-terraform)
##### Google Cloud
- [Google Cloud Terraform por @sotazum](https://github.com/DeNA/dify-google-cloud-terraform)
#### Usando AWS CDK para Implantação
Implante o Dify na AWS usando [CDK](https://aws.amazon.com/cdk/)
##### AWS
- [AWS CDK por @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
## Contribuindo
Para aqueles que desejam contribuir com código, veja nosso [Guia de Contribuição](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
Ao mesmo tempo, considere apoiar o Dify compartilhando-o nas redes sociais e em eventos e conferências.
> Estamos buscando contribuidores para ajudar na tradução do Dify para idiomas além de Mandarim e Inglês. Se você tiver interesse em ajudar, consulte o [README i18n](https://github.com/langgenius/dify/blob/main/web/i18n/README.md) para mais informações e deixe-nos um comentário no canal `global-users` em nosso [Servidor da Comunidade no Discord](https://discord.gg/8Tpq4AcN9c).
**Contribuidores**
<a href="https://github.com/langgenius/dify/graphs/contributors">
<img src="https://contrib.rocks/image?repo=langgenius/dify" />
</a>
## Comunidade e contato
* [Discussões no GitHub](https://github.com/langgenius/dify/discussions). Melhor para: compartilhar feedback e fazer perguntas.
* [Problemas no GitHub](https://github.com/langgenius/dify/issues). Melhor para: relatar bugs encontrados no Dify.AI e propor novos recursos. Veja nosso [Guia de Contribuição](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
* [Discord](https://discord.gg/FngNHpbcY7). Melhor para: compartilhar suas aplicações e interagir com a comunidade.
* [X(Twitter)](https://twitter.com/dify_ai). Melhor para: compartilhar suas aplicações e interagir com a comunidade.
## Histórico de estrelas
[![Gráfico de Histórico de Estrelas](https://api.star-history.com/svg?repos=langgenius/dify&type=Date)](https://star-history.com/#langgenius/dify&Date)
## Divulgação de segurança
Para proteger sua privacidade, evite postar problemas de segurança no GitHub. Em vez disso, envie suas perguntas para security@dify.ai e forneceremos uma resposta mais detalhada.
## Licença
Este repositório está disponível sob a [Licença de Código Aberto Dify](LICENSE), que é essencialmente Apache 2.0 com algumas restrições adicionais.

README_SI.md (new file, 187 lines)

@@ -0,0 +1,187 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
<p align="center">
📌 <a href="https://dify.ai/blog/introducing-dify-workflow-file-upload-a-demo-on-ai-podcast">Predstavljamo nalaganje datotek Dify Workflow: znova ustvarite Google NotebookLM Podcast</a>
</p>
<p align="center">
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">Samostojno gostovanje</a> ·
<a href="https://docs.dify.ai">Dokumentacija</a> ·
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">Povpraševanje za podjetja</a>
</p>
<p align="center">
<a href="https://dify.ai" target="_blank">
<img alt="Static Badge" src="https://img.shields.io/badge/Product-F04438"></a>
<a href="https://dify.ai/pricing" target="_blank">
<img alt="Static Badge" src="https://img.shields.io/badge/free-pricing?logo=free&color=%20%23155EEF&label=pricing&labelColor=%20%23528bff"></a>
<a href="https://discord.gg/FngNHpbcY7" target="_blank">
<img src="https://img.shields.io/discord/1082486657678311454?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb"
alt="chat on Discord"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="follow on X(Twitter)"></a>
<a href="https://hub.docker.com/u/langgenius" target="_blank">
<img alt="Docker Pulls" src="https://img.shields.io/docker/pulls/langgenius/dify-web?labelColor=%20%23FDB062&color=%20%23f79009"></a>
<a href="https://github.com/langgenius/dify/graphs/commit-activity" target="_blank">
<img alt="Commits last month" src="https://img.shields.io/github/commit-activity/m/langgenius/dify?labelColor=%20%2332b583&color=%20%2312b76a"></a>
<a href="https://github.com/langgenius/dify/" target="_blank">
<img alt="Issues closed" src="https://img.shields.io/github/issues-search?query=repo%3Alanggenius%2Fdify%20is%3Aclosed&label=issues%20closed&labelColor=%20%237d89b0&color=%20%235d6b98"></a>
<a href="https://github.com/langgenius/dify/discussions/" target="_blank">
<img alt="Discussion posts" src="https://img.shields.io/github/discussions/langgenius/dify?labelColor=%20%239b8afb&color=%20%237a5af8"></a>
</p>
<p align="center">
<a href="./README.md"><img alt="README in English" src="https://img.shields.io/badge/English-d9d9d9"></a>
<a href="./README_CN.md"><img alt="简体中文版自述文件" src="https://img.shields.io/badge/简体中文-d9d9d9"></a>
<a href="./README_JA.md"><img alt="日本語のREADME" src="https://img.shields.io/badge/日本語-d9d9d9"></a>
<a href="./README_ES.md"><img alt="README en Español" src="https://img.shields.io/badge/Español-d9d9d9"></a>
<a href="./README_FR.md"><img alt="README en Français" src="https://img.shields.io/badge/Français-d9d9d9"></a>
<a href="./README_KL.md"><img alt="README tlhIngan Hol" src="https://img.shields.io/badge/Klingon-d9d9d9"></a>
<a href="./README_KR.md"><img alt="README in Korean" src="https://img.shields.io/badge/한국어-d9d9d9"></a>
<a href="./README_AR.md"><img alt="README بالعربية" src="https://img.shields.io/badge/العربية-d9d9d9"></a>
<a href="./README_TR.md"><img alt="Türkçe README" src="https://img.shields.io/badge/Türkçe-d9d9d9"></a>
<a href="./README_VI.md"><img alt="README Tiếng Việt" src="https://img.shields.io/badge/Ti%E1%BA%BFng%20Vi%E1%BB%87t-d9d9d9"></a>
<a href="./README_SI.md"><img alt="README Slovenščina" src="https://img.shields.io/badge/Sloven%C5%A1%C4%8Dina-d9d9d9"></a>
</p>
Dify je odprtokodna platforma za razvoj aplikacij LLM. Njegov intuitivni vmesnik združuje agentski potek dela z umetno inteligenco, cevovod RAG, zmogljivosti agentov, upravljanje modelov, funkcije opazovanja in več, kar vam omogoča hiter prehod od prototipa do proizvodnje.
## Hitri začetek
> Preden namestite Dify, se prepričajte, da vaša naprava izpolnjuje naslednje minimalne sistemske zahteve:
>
>- CPU >= 2 Core
>- RAM >= 4 GiB
</br>
Najlažji način za zagon strežnika Dify je prek docker compose. Preden zaženete Dify z naslednjimi ukazi, se prepričajte, da sta Docker in Docker Compose nameščena na vašem računalniku:
```bash
cd dify
cd docker
cp .env.example .env
docker compose up -d
```
Po zagonu lahko dostopate do nadzorne plošče Dify v brskalniku na [http://localhost/install](http://localhost/install) in začnete postopek inicializacije.
#### Iskanje pomoči
Prosimo, glejte naša pogosta vprašanja [FAQ](https://docs.dify.ai/getting-started/install-self-hosted/faqs) če naletite na težave pri nastavitvi Dify. Če imate še vedno težave, se obrnite na [skupnost ali nas](#community--contact).
> Če želite prispevati k Difyju ali narediti dodaten razvoj, glejte naš vodnik za [uvajanje iz izvorne kode](https://docs.dify.ai/getting-started/install-self-hosted/local-source-code)
## Ključne značilnosti
**1. Potek dela**:
Zgradite in preizkusite zmogljive poteke dela AI na vizualnem platnu, pri čemer izkoristite vse naslednje funkcije in več.
https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa
**2. Celovita podpora za modele**:
Brezhibna integracija s stotinami lastniških/odprtokodnih LLM-jev ducatov ponudnikov sklepanja in samostojnih rešitev, ki pokrivajo GPT, Mistral, Llama3 in vse modele, združljive z API-jem OpenAI. Celoten seznam podprtih ponudnikov modelov najdete [tukaj](https://docs.dify.ai/getting-started/readme/model-providers).
![providers-v5](https://github.com/langgenius/dify/assets/13230914/5a17bdbe-097a-4100-8363-40255b70f6e3)
**3. Prompt IDE**:
intuitivni vmesnik za ustvarjanje pozivov, primerjavo zmogljivosti modela in dodajanje dodatnih funkcij, kot je pretvorba besedila v govor, aplikaciji, ki temelji na klepetu.
**4. RAG Pipeline**:
Obsežne zmogljivosti RAG, ki pokrivajo vse od vnosa dokumenta do priklica, s podporo za ekstrakcijo besedila iz datotek PDF, PPT in drugih običajnih formatov dokumentov.
**5. Agent capabilities**:
definirate lahko agente, ki temeljijo na klicanju funkcij LLM ali ReAct, in dodate vnaprej izdelana orodja ali orodja po meri za agenta. Dify ponuja več kot 50 vgrajenih orodij za agente AI, kot so Google Search, DALL·E, Stable Diffusion in WolframAlpha.
**6. LLMOps**:
Spremljajte in analizirajte dnevnike aplikacij in učinkovitost skozi čas. Pozive, nabore podatkov in modele lahko nenehno izboljšujete na podlagi proizvodnih podatkov in opomb.
**7. Backend-as-a-Service**:
Vse ponudbe Difyja so opremljene z ustreznimi API-ji, tako da lahko Dify brez težav integrirate v svojo poslovno logiko.
## Uporaba Dify
- **Cloud </br>**
Gostimo storitev Dify Cloud za vsakogar, ki jo lahko preizkusite brez nastavitev. Zagotavlja vse zmožnosti različice za samostojno namestitev in vključuje 200 brezplačnih klicev GPT-4 v načrtu peskovnika.
- **Self-hosting Dify Community Edition</br>**
Hitro zaženite Dify v svojem okolju s tem [začetnim vodnikom](#quick-start). Za dodatne reference in podrobnejša navodila uporabite našo [dokumentacijo](https://docs.dify.ai).
- **Dify za podjetja/organizacije</br>**
Ponujamo dodatne funkcije, osredotočene na podjetja. Zabeležite svoja vprašanja prek tega klepetalnega robota ali nam pošljite e-pošto, da se pogovorimo o potrebah podjetja. </br>
> Za novoustanovljena podjetja in mala podjetja, ki uporabljajo AWS, si oglejte Dify Premium na AWS Marketplace in ga z enim klikom uvedite v svoj AWS VPC. To je cenovno ugodna ponudba AMI z možnostjo ustvarjanja aplikacij z logotipom in blagovno znamko po meri.
## Staying ahead
Star Dify on GitHub and be instantly notified of new releases.
![star-us](https://github.com/langgenius/dify/assets/13230914/b823edc1-6388-4e25-ad45-2f6b187adbb4)
## Napredne nastavitve
Če morate prilagoditi konfiguracijo, si oglejte komentarje v naši datoteki .env.example in posodobite ustrezne vrednosti v svoji .env datoteki. Poleg tega boste morda morali prilagoditi samo datoteko docker-compose.yaml, na primer spremeniti različice slike, preslikave vrat ali namestitve nosilca, glede na vaše specifično okolje in zahteve za uvajanje. Po kakršnih koli spremembah ponovno zaženite docker-compose up -d. Celoten seznam razpoložljivih spremenljivk okolja najdete tukaj.
Če želite konfigurirati visoko razpoložljivo nastavitev, so na voljo Helm Charts in datoteke YAML, ki jih prispeva skupnost, ki omogočajo uvedbo Difyja v Kubernetes.
- [Helm Chart by @LeoQuote](https://github.com/douban/charts/tree/master/charts/dify)
- [Helm Chart by @BorisPolonsky](https://github.com/BorisPolonsky/dify-helm)
- [YAML file by @Winson-030](https://github.com/Winson-030/dify-kubernetes)
#### Uporaba Terraform za uvajanje
namestite Dify v Cloud Platform z enim klikom z uporabo [terraform](https://www.terraform.io/)
##### Azure Global
- [Azure Terraform by @nikawang](https://github.com/nikawang/dify-azure-terraform)
##### Google Cloud
- [Google Cloud Terraform by @sotazum](https://github.com/DeNA/dify-google-cloud-terraform)
#### Uporaba AWS CDK za uvajanje
Uvedite Dify v AWS z uporabo [CDK](https://aws.amazon.com/cdk/)
##### AWS
- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
## Prispevam
Za tiste, ki bi radi prispevali kodo, si oglejte naš vodnik za prispevke. Hkrati vas prosimo, da podprete Dify tako, da ga delite na družbenih medijih ter na dogodkih in konferencah.
> Iščemo sodelavce za pomoč pri prevajanju Difyja v jezike, ki niso mandarinščina ali angleščina. Če želite pomagati, si oglejte i18n README za več informacij in nam pustite komentar v kanalu global-users našega strežnika skupnosti Discord.
## Skupnost in stik
* [Github Discussion](https://github.com/langgenius/dify/discussions). Najboljše za: izmenjavo povratnih informacij in postavljanje vprašanj.
* [GitHub Issues](https://github.com/langgenius/dify/issues). Najboljše za: hrošče, na katere naletite pri uporabi Dify.AI, in predloge funkcij. Oglejte si naš [vodnik za prispevke](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
* [Discord](https://discord.gg/FngNHpbcY7). Najboljše za: deljenje vaših aplikacij in druženje s skupnostjo.
* [X(Twitter)](https://twitter.com/dify_ai). Najboljše za: deljenje vaših aplikacij in druženje s skupnostjo.
**Contributors**
<a href="https://github.com/langgenius/dify/graphs/contributors">
<img src="https://contrib.rocks/image?repo=langgenius/dify" />
</a>
## Star history
[![Star History Chart](https://api.star-history.com/svg?repos=langgenius/dify&type=Date)](https://star-history.com/#langgenius/dify&Date)
## Varnostno razkritje
Zaradi zaščite vaše zasebnosti se izogibajte objavljanju varnostnih vprašanj na GitHub. Namesto tega pošljite vprašanja na security@dify.ai in zagotovili vam bomo podrobnejši odgovor.
## Licenca
To skladišče je na voljo pod [odprtokodno licenco Dify](LICENSE) , ki je v bistvu Apache 2.0 z nekaj dodatnimi omejitvami.


@@ -15,6 +15,9 @@
<a href="https://discord.gg/FngNHpbcY7" target="_blank">
<img src="https://img.shields.io/discord/1082486657678311454?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb"
alt="Discord'da sohbet et"></a>
<a href="https://reddit.com/r/difyai" target="_blank">
<img src="https://img.shields.io/reddit/subreddit-subscribers/difyai?style=plastic&logo=reddit&label=r%2Fdifyai&labelColor=white"
alt="Follow Reddit"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="X(Twitter)'da takip et"></a>
@@ -208,6 +211,13 @@ Dify'ı bulut platformuna tek tıklamayla dağıtın [terraform](https://www.ter
##### Google Cloud
- [Google Cloud Terraform tarafından @sotazum](https://github.com/DeNA/dify-google-cloud-terraform)
#### AWS CDK ile Dağıtım
[CDK](https://aws.amazon.com/cdk/) kullanarak Dify'ı AWS'ye dağıtın
##### AWS
- [AWS CDK tarafından @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
## Katkıda Bulunma
Kod katkısında bulunmak isteyenler için [Katkı Kılavuzumuza](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md) bakabilirsiniz.


@@ -15,6 +15,9 @@
<a href="https://discord.gg/FngNHpbcY7" target="_blank">
<img src="https://img.shields.io/discord/1082486657678311454?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb"
alt="chat trên Discord"></a>
<a href="https://reddit.com/r/difyai" target="_blank">
<img src="https://img.shields.io/reddit/subreddit-subscribers/difyai?style=plastic&logo=reddit&label=r%2Fdifyai&labelColor=white"
alt="Follow Reddit"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="theo dõi trên X(Twitter)"></a>
@@ -204,6 +207,13 @@ Triển khai Dify lên nền tảng đám mây với một cú nhấp chuột b
##### Google Cloud
- [Google Cloud Terraform bởi @sotazum](https://github.com/DeNA/dify-google-cloud-terraform)
#### Sử dụng AWS CDK để Triển khai
Triển khai Dify trên AWS bằng [CDK](https://aws.amazon.com/cdk/)
##### AWS
- [AWS CDK bởi @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
## Đóng góp
Đối với những người muốn đóng góp mã, xem [Hướng dẫn Đóng góp](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md) của chúng tôi.
@@ -235,4 +245,4 @@ Triển khai Dify lên nền tảng đám mây với một cú nhấp chuột b
## Giấy phép
Kho lưu trữ này có sẵn theo [Giấy phép Mã nguồn Mở Dify](LICENSE), về cơ bản là Apache 2.0 với một vài hạn chế bổ sung.


@@ -42,6 +42,11 @@ REDIS_SENTINEL_USERNAME=
REDIS_SENTINEL_PASSWORD=
REDIS_SENTINEL_SOCKET_TIMEOUT=0.1
# redis Cluster configuration.
REDIS_USE_CLUSTERS=false
REDIS_CLUSTERS=
REDIS_CLUSTERS_PASSWORD=
# PostgreSQL database configuration
DB_USERNAME=postgres
DB_PASSWORD=difyai123456
@@ -51,20 +56,27 @@ DB_DATABASE=dify
# Storage configuration
# use for store upload files, private keys...
# storage type: local, s3, aliyun-oss, azure-blob, baidu-obs, google-storage, huawei-obs, oci-storage, tencent-cos, volcengine-tos, supabase
STORAGE_TYPE=local
STORAGE_LOCAL_PATH=storage
# storage type: opendal, s3, aliyun-oss, azure-blob, baidu-obs, google-storage, huawei-obs, oci-storage, tencent-cos, volcengine-tos, supabase
STORAGE_TYPE=opendal
# Apache OpenDAL storage configuration, refer to https://github.com/apache/opendal
OPENDAL_SCHEME=fs
OPENDAL_FS_ROOT=storage
# S3 Storage configuration
S3_USE_AWS_MANAGED_IAM=false
S3_ENDPOINT=https://your-bucket-name.storage.s3.cloudflare.com
S3_BUCKET_NAME=your-bucket-name
S3_ACCESS_KEY=your-access-key
S3_SECRET_KEY=your-secret-key
S3_REGION=your-region
# Azure Blob Storage configuration
AZURE_BLOB_ACCOUNT_NAME=your-account-name
AZURE_BLOB_ACCOUNT_KEY=your-account-key
AZURE_BLOB_CONTAINER_NAME=your-container-name
AZURE_BLOB_ACCOUNT_URL=https://<your_account_name>.blob.core.windows.net
# Aliyun oss Storage configuration
ALIYUN_OSS_BUCKET_NAME=your-bucket-name
ALIYUN_OSS_ACCESS_KEY=your-access-key
@@ -74,6 +86,7 @@ ALIYUN_OSS_AUTH_VERSION=v1
ALIYUN_OSS_REGION=your-region
# Don't start with '/'. OSS doesn't support leading slash in object names.
ALIYUN_OSS_PATH=your-path
# Google Storage configuration
GOOGLE_STORAGE_BUCKET_NAME=your-bucket-name
GOOGLE_STORAGE_SERVICE_ACCOUNT_JSON_BASE64=your-google-service-account-json-base64-string
@@ -120,7 +133,8 @@ SUPABASE_URL=your-server-url
WEB_API_CORS_ALLOW_ORIGINS=http://127.0.0.1:3000,*
CONSOLE_CORS_ALLOW_ORIGINS=http://127.0.0.1:3000,*
# Vector database configuration, support: weaviate, qdrant, milvus, myscale, relyt, pgvecto_rs, pgvector, pgvector, chroma, opensearch, tidb_vector, vikingdb, upstash
# Vector database configuration
# support: weaviate, qdrant, milvus, myscale, relyt, pgvecto_rs, pgvector, chroma, opensearch, tidb_vector, couchbase, vikingdb, upstash, lindorm, oceanbase
VECTOR_STORE=weaviate
# Weaviate configuration
@@ -136,6 +150,13 @@ QDRANT_CLIENT_TIMEOUT=20
QDRANT_GRPC_ENABLED=false
QDRANT_GRPC_PORT=6334
#Couchbase configuration
COUCHBASE_CONNECTION_STRING=127.0.0.1
COUCHBASE_USER=Administrator
COUCHBASE_PASSWORD=password
COUCHBASE_BUCKET_NAME=Embeddings
COUCHBASE_SCOPE_NAME=_default
# Milvus configuration
MILVUS_URI=http://127.0.0.1:19530
MILVUS_TOKEN=
@@ -195,6 +216,20 @@ TIDB_VECTOR_USER=xxx.root
TIDB_VECTOR_PASSWORD=xxxxxx
TIDB_VECTOR_DATABASE=dify
# Tidb on qdrant configuration
TIDB_ON_QDRANT_URL=http://127.0.0.1
TIDB_ON_QDRANT_API_KEY=dify
TIDB_ON_QDRANT_CLIENT_TIMEOUT=20
TIDB_ON_QDRANT_GRPC_ENABLED=false
TIDB_ON_QDRANT_GRPC_PORT=6334
TIDB_PUBLIC_KEY=dify
TIDB_PRIVATE_KEY=dify
TIDB_API_URL=http://127.0.0.1
TIDB_IAM_API_URL=http://127.0.0.1
TIDB_REGION=regions/aws-us-east-1
TIDB_PROJECT_ID=dify
TIDB_SPEND_LIMIT=100
# Chroma configuration
CHROMA_HOST=127.0.0.1
CHROMA_PORT=8000
@@ -212,6 +247,10 @@ ANALYTICDB_ACCOUNT=testaccount
ANALYTICDB_PASSWORD=testpassword
ANALYTICDB_NAMESPACE=dify
ANALYTICDB_NAMESPACE_PASSWORD=difypassword
ANALYTICDB_HOST=gp-test.aliyuncs.com
ANALYTICDB_PORT=5432
ANALYTICDB_MIN_CONNECTION=1
ANALYTICDB_MAX_CONNECTION=5
# OpenSearch configuration
OPENSEARCH_HOST=127.0.0.1
@@ -242,6 +281,21 @@ VIKINGDB_SCHEMA=http
VIKINGDB_CONNECTION_TIMEOUT=30
VIKINGDB_SOCKET_TIMEOUT=30
# Lindorm configuration
LINDORM_URL=http://ld-*******************-proxy-search-pub.lindorm.aliyuncs.com:30070
LINDORM_USERNAME=admin
LINDORM_PASSWORD=admin
USING_UGC_INDEX=False
# OceanBase Vector configuration
OCEANBASE_VECTOR_HOST=127.0.0.1
OCEANBASE_VECTOR_PORT=2881
OCEANBASE_VECTOR_USER=root@test
OCEANBASE_VECTOR_PASSWORD=difyai123456
OCEANBASE_VECTOR_DATABASE=test
OCEANBASE_MEMORY_LIMIT=6G
# Upload configuration
UPLOAD_FILE_SIZE_LIMIT=15
UPLOAD_FILE_BATCH_LIMIT=5
@@ -249,8 +303,8 @@ UPLOAD_IMAGE_FILE_SIZE_LIMIT=10
UPLOAD_VIDEO_FILE_SIZE_LIMIT=100
UPLOAD_AUDIO_FILE_SIZE_LIMIT=50
# Model Configuration
MULTIMODAL_SEND_IMAGE_FORMAT=base64
# Model configuration
MULTIMODAL_SEND_FORMAT=base64
PROMPT_GENERATION_MAX_TOKENS=512
CODE_GENERATION_MAX_TOKENS=1024
@@ -283,14 +337,23 @@ NOTION_INTERNAL_SECRET=you-internal-secret
ETL_TYPE=dify
UNSTRUCTURED_API_URL=
UNSTRUCTURED_API_KEY=
SCARF_NO_ANALYTICS=true
#ssrf
SSRF_PROXY_HTTP_URL=
SSRF_PROXY_HTTPS_URL=
SSRF_DEFAULT_MAX_RETRIES=3
SSRF_DEFAULT_TIME_OUT=5
SSRF_DEFAULT_CONNECT_TIME_OUT=5
SSRF_DEFAULT_READ_TIME_OUT=5
SSRF_DEFAULT_WRITE_TIME_OUT=5
BATCH_UPLOAD_LIMIT=10
KEYWORD_DATA_SOURCE_TYPE=database
# Workflow file upload limit
WORKFLOW_FILE_UPLOAD_LIMIT=10
# CODE EXECUTION CONFIGURATION
CODE_EXECUTION_ENDPOINT=http://127.0.0.1:8194
CODE_EXECUTION_API_KEY=dify-sandbox
@@ -322,9 +385,15 @@ LOG_FILE=
LOG_FILE_MAX_SIZE=20
# Log file max backup count
LOG_FILE_BACKUP_COUNT=5
# Log dateformat
LOG_DATEFORMAT=%Y-%m-%d %H:%M:%S
# Log Timezone
LOG_TZ=UTC
# Log format
LOG_FORMAT=%(asctime)s,%(msecs)d %(levelname)-2s [%(filename)s:%(lineno)d] %(req_id)s %(message)s
# Indexing configuration
INDEXING_MAX_SEGMENTATION_TOKENS_LENGTH=1000
INDEXING_MAX_SEGMENTATION_TOKENS_LENGTH=4000
# Workflow runtime configuration
WORKFLOW_MAX_EXECUTION_STEPS=500
@@ -351,3 +420,10 @@ POSITION_PROVIDER_EXCLUDES=
# Reset password token expiry minutes
RESET_PASSWORD_TOKEN_EXPIRY_MINUTES=5
CREATE_TIDB_SERVICE_JOB_ENABLED=false
# Maximum number of submitted thread count in a ThreadPool for parallel node execution
MAX_SUBMIT_COUNT=100
# Lockout duration in seconds
LOGIN_LOCKOUT_DURATION=86400

api/.ruff.toml (new file, 96 lines)

@@ -0,0 +1,96 @@
exclude = [
"migrations/*",
]
line-length = 120
[format]
quote-style = "double"
[lint]
preview = true
select = [
"B", # flake8-bugbear rules
"C4", # flake8-comprehensions
"E", # pycodestyle E rules
"F", # pyflakes rules
"FURB", # refurb rules
"I", # isort rules
"N", # pep8-naming
"PT", # flake8-pytest-style rules
"PLC0208", # iteration-over-set
"PLC2801", # unnecessary-dunder-call
"PLC0414", # useless-import-alias
"PLE0604", # invalid-all-object
"PLE0605", # invalid-all-format
"PLR0402", # manual-from-import
"PLR1711", # useless-return
"PLR1714", # repeated-equality-comparison
"RUF013", # implicit-optional
"RUF019", # unnecessary-key-check
"RUF100", # unused-noqa
"RUF101", # redirected-noqa
"RUF200", # invalid-pyproject-toml
"RUF022", # unsorted-dunder-all
"S506", # unsafe-yaml-load
"SIM", # flake8-simplify rules
"TRY400", # error-instead-of-exception
"TRY401", # verbose-log-message
"UP", # pyupgrade rules
"W191", # tab-indentation
"W605", # invalid-escape-sequence
]
ignore = [
"E402", # module-import-not-at-top-of-file
"E711", # none-comparison
"E712", # true-false-comparison
"E721", # type-comparison
"E722", # bare-except
"E731", # lambda-assignment
"F821", # undefined-name
"F841", # unused-variable
"FURB113", # repeated-append
"FURB152", # math-constant
"UP007", # non-pep604-annotation
"UP032", # f-string
"B005", # strip-with-multi-characters
"B006", # mutable-argument-default
"B007", # unused-loop-control-variable
"B026", # star-arg-unpacking-after-keyword-arg
"B904", # raise-without-from-inside-except
"B905", # zip-without-explicit-strict
"N806", # non-lowercase-variable-in-function
"N815", # mixed-case-variable-in-class-scope
"PT011", # pytest-raises-too-broad
"SIM102", # collapsible-if
"SIM103", # needless-bool
"SIM105", # suppressible-exception
"SIM107", # return-in-try-except-finally
"SIM108", # if-else-block-instead-of-if-exp
"SIM113", # eumerate-for-loop
"SIM117", # multiple-with-statements
"SIM210", # if-expr-with-true-false
"SIM300", # yoda-conditions,
]
[lint.per-file-ignores]
"__init__.py" = [
"F401", # unused-import
"F811", # redefined-while-unused
]
"configs/*" = [
"N802", # invalid-function-name
]
"libs/gmpy2_pkcs10aep_cipher.py" = [
"N803", # invalid-argument-name
]
"tests/*" = [
"F811", # redefined-while-unused
"F401", # unused-import
]
[lint.pyflakes]
extend-generics = [
"_pytest.monkeypatch",
"tests.integration_tests",
]
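For context, a configuration like this is picked up automatically by the `ruff` CLI when it is run from the `api/` directory; the commands below are a generic usage sketch rather than part of the diff.
```bash
# Lint (with autofixes for the selected rules) and format against this config.
ruff check . --fix
ruff format .
```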


@@ -1,10 +1,10 @@
# base image
FROM python:3.10-slim-bookworm AS base
FROM python:3.12-slim-bookworm AS base
WORKDIR /app/api
# Install Poetry
ENV POETRY_VERSION=1.8.3
ENV POETRY_VERSION=1.8.4
# if you located in China, you can use aliyun mirror to speed up
# RUN pip install --no-cache-dir poetry==${POETRY_VERSION} -i https://mirrors.aliyun.com/pypi/simple/
@@ -55,7 +55,7 @@ RUN apt-get update \
&& echo "deb http://deb.debian.org/debian testing main" > /etc/apt/sources.list \
&& apt-get update \
# For Security
&& apt-get install -y --no-install-recommends zlib1g=1:1.3.dfsg+really1.3.1-1 expat=2.6.3-1 libldap-2.5-0=2.5.18+dfsg-3+b1 perl=5.40.0-6 libsqlite3-0=3.46.1-1 \
&& apt-get install -y --no-install-recommends expat=2.6.4-1 libldap-2.5-0=2.5.18+dfsg-3+b1 perl=5.40.0-8 libsqlite3-0=3.46.1-1 zlib1g=1:1.3.dfsg+really1.3.1-1+b1 \
# install a chinese font to support the use of tools like matplotlib
&& apt-get install -y fonts-noto-cjk \
&& apt-get autoremove -y \


@@ -18,12 +18,17 @@
```
2. Copy `.env.example` to `.env`
```cli
cp .env.example .env
```
3. Generate a `SECRET_KEY` in the `.env` file.
bash for Linux
```bash for Linux
sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env
```
bash for Mac
```bash for Mac
secret_key=$(openssl rand -base64 42)
sed -i '' "/^SECRET_KEY=/c\\
@@ -37,18 +42,10 @@
5. Install dependencies
```bash
poetry env use 3.10
poetry env use 3.12
poetry install
```
In case of contributors missing to update dependencies for `pyproject.toml`, you can perform the following shell instead.
```bash
poetry shell # activate current environment
poetry add $(cat requirements.txt) # install dependencies of production and update pyproject.toml
poetry add $(cat requirements-dev.txt) --group dev # install dependencies of development and update pyproject.toml
```
6. Run migrate
Before the first launch, migrate the database to the latest version.
@@ -76,13 +73,11 @@
1. Install dependencies for both the backend and the test environment
```bash
poetry install --with dev
poetry install -C api --with dev
```
2. Run the tests locally with mocked system environment variables in `tool.pytest_env` section in `pyproject.toml`
```bash
cd ../
poetry run -C api bash dev/pytest/pytest_all_tests.sh
```


@@ -1,108 +1,30 @@
import os
from libs import version_utils
from configs import dify_config
if os.environ.get("DEBUG", "false").lower() != "true":
from gevent import monkey
monkey.patch_all()
import grpc.experimental.gevent
grpc.experimental.gevent.init_gevent()
import json
import threading
import time
import warnings
from flask import Response
from app_factory import create_app
# DO NOT REMOVE BELOW
from events import event_handlers # noqa: F401
from extensions.ext_database import db
# TODO: Find a way to avoid importing models here
from models import account, dataset, model, source, task, tool, tools, web # noqa: F401
# DO NOT REMOVE ABOVE
# preparation before creating app
version_utils.check_supported_python_version()
warnings.simplefilter("ignore", ResourceWarning)
def is_db_command():
import sys
os.environ["TZ"] = "UTC"
# windows platform not support tzset
if hasattr(time, "tzset"):
time.tzset()
if len(sys.argv) > 1 and sys.argv[0].endswith("flask") and sys.argv[1] == "db":
return True
return False
# create app
app = create_app()
celery = app.extensions["celery"]
if is_db_command():
from app_factory import create_migrations_app
if dify_config.TESTING:
print("App is running in TESTING mode")
app = create_migrations_app()
else:
from app_factory import create_app
from libs import threadings_utils
threadings_utils.apply_gevent_threading_patch()
@app.after_request
def after_request(response):
"""Add Version headers to the response."""
response.set_cookie("remember_token", "", expires=0)
response.headers.add("X-Version", dify_config.CURRENT_VERSION)
response.headers.add("X-Env", dify_config.DEPLOY_ENV)
return response
@app.route("/health")
def health():
return Response(
json.dumps({"pid": os.getpid(), "status": "ok", "version": dify_config.CURRENT_VERSION}),
status=200,
content_type="application/json",
)
@app.route("/threads")
def threads():
num_threads = threading.active_count()
threads = threading.enumerate()
thread_list = []
for thread in threads:
thread_name = thread.name
thread_id = thread.ident
is_alive = thread.is_alive()
thread_list.append(
{
"name": thread_name,
"id": thread_id,
"is_alive": is_alive,
}
)
return {
"pid": os.getpid(),
"thread_num": num_threads,
"threads": thread_list,
}
@app.route("/db-pool-stat")
def pool_stat():
engine = db.engine
return {
"pid": os.getpid(),
"pool_size": engine.pool.size(),
"checked_in_connections": engine.pool.checkedin(),
"checked_out_connections": engine.pool.checkedout(),
"overflow_connections": engine.pool.overflow(),
"connection_timeout": engine.pool.timeout(),
"recycle_time": db.engine.pool._recycle,
}
app = create_app()
celery = app.extensions["celery"]
if __name__ == "__main__":
app.run(host="0.0.0.0", port=5001)


@@ -1,52 +1,14 @@
import os
import logging
import time
if os.environ.get("DEBUG", "false").lower() != "true":
from gevent import monkey
monkey.patch_all()
import grpc.experimental.gevent
grpc.experimental.gevent.init_gevent()
import json
from flask import Flask, Response, request
from flask_cors import CORS
from werkzeug.exceptions import Unauthorized
import contexts
from commands import register_commands
from configs import dify_config
from extensions import (
ext_celery,
ext_code_based_extension,
ext_compress,
ext_database,
ext_hosting_provider,
ext_logging,
ext_login,
ext_mail,
ext_migrate,
ext_proxy_fix,
ext_redis,
ext_sentry,
ext_storage,
)
from extensions.ext_database import db
from extensions.ext_login import login_manager
from libs.passport import PassportService
from services.account_service import AccountService
class DifyApp(Flask):
pass
from dify_app import DifyApp
# ----------------------------
# Application Factory Function
# ----------------------------
def create_flask_app_with_configs() -> Flask:
def create_flask_app_with_configs() -> DifyApp:
"""
create a raw flask app
with configs loaded from .env file
@@ -54,123 +16,86 @@ def create_flask_app_with_configs() -> Flask:
dify_app = DifyApp(__name__)
dify_app.config.from_mapping(dify_config.model_dump())
# populate configs into system environment variables
for key, value in dify_app.config.items():
if isinstance(value, str):
os.environ[key] = value
elif isinstance(value, int | float | bool):
os.environ[key] = str(value)
elif value is None:
os.environ[key] = ""
return dify_app
def create_app() -> Flask:
def create_app() -> DifyApp:
start_time = time.perf_counter()
app = create_flask_app_with_configs()
app.secret_key = dify_config.SECRET_KEY
initialize_extensions(app)
register_blueprints(app)
register_commands(app)
end_time = time.perf_counter()
if dify_config.DEBUG:
logging.info(f"Finished create_app ({round((end_time - start_time) * 1000, 2)} ms)")
return app
def initialize_extensions(app):
# Since the application instance is now created, pass it to each Flask
# extension instance to bind it to the Flask application instance (app)
ext_logging.init_app(app)
ext_compress.init_app(app)
ext_code_based_extension.init()
def initialize_extensions(app: DifyApp):
from extensions import (
ext_app_metrics,
ext_blueprints,
ext_celery,
ext_code_based_extension,
ext_commands,
ext_compress,
ext_database,
ext_hosting_provider,
ext_import_modules,
ext_logging,
ext_login,
ext_mail,
ext_migrate,
ext_proxy_fix,
ext_redis,
ext_sentry,
ext_set_secretkey,
ext_storage,
ext_timezone,
ext_warnings,
)
extensions = [
ext_timezone,
ext_logging,
ext_warnings,
ext_import_modules,
ext_set_secretkey,
ext_compress,
ext_code_based_extension,
ext_database,
ext_app_metrics,
ext_migrate,
ext_redis,
ext_storage,
ext_celery,
ext_login,
ext_mail,
ext_hosting_provider,
ext_sentry,
ext_proxy_fix,
ext_blueprints,
ext_commands,
]
for ext in extensions:
short_name = ext.__name__.split(".")[-1]
is_enabled = ext.is_enabled() if hasattr(ext, "is_enabled") else True
if not is_enabled:
if dify_config.DEBUG:
logging.info(f"Skipped {short_name}")
continue
start_time = time.perf_counter()
ext.init_app(app)
end_time = time.perf_counter()
if dify_config.DEBUG:
logging.info(f"Loaded {short_name} ({round((end_time - start_time) * 1000, 2)} ms)")
def create_migrations_app():
app = create_flask_app_with_configs()
from extensions import ext_database, ext_migrate
# Initialize only required extensions
ext_database.init_app(app)
ext_migrate.init(app, db)
ext_redis.init_app(app)
ext_storage.init_app(app)
ext_celery.init_app(app)
ext_login.init_app(app)
ext_mail.init_app(app)
ext_hosting_provider.init_app(app)
ext_sentry.init_app(app)
ext_proxy_fix.init_app(app)
ext_migrate.init_app(app)
# Flask-Login configuration
@login_manager.request_loader
def load_user_from_request(request_from_flask_login):
"""Load user based on the request."""
if request.blueprint not in {"console", "inner_api"}:
return None
# Check if the user_id contains a dot, indicating the old format
auth_header = request.headers.get("Authorization", "")
if not auth_header:
auth_token = request.args.get("_token")
if not auth_token:
raise Unauthorized("Invalid Authorization token.")
else:
if " " not in auth_header:
raise Unauthorized("Invalid Authorization header format. Expected 'Bearer <api-key>' format.")
auth_scheme, auth_token = auth_header.split(None, 1)
auth_scheme = auth_scheme.lower()
if auth_scheme != "bearer":
raise Unauthorized("Invalid Authorization header format. Expected 'Bearer <api-key>' format.")
decoded = PassportService().verify(auth_token)
user_id = decoded.get("user_id")
logged_in_account = AccountService.load_logged_in_account(account_id=user_id)
if logged_in_account:
contexts.tenant_id.set(logged_in_account.current_tenant_id)
return logged_in_account
@login_manager.unauthorized_handler
def unauthorized_handler():
"""Handle unauthorized requests."""
return Response(
json.dumps({"code": "unauthorized", "message": "Unauthorized."}),
status=401,
content_type="application/json",
)
# register blueprint routers
def register_blueprints(app):
from controllers.console import bp as console_app_bp
from controllers.files import bp as files_bp
from controllers.inner_api import bp as inner_api_bp
from controllers.service_api import bp as service_api_bp
from controllers.web import bp as web_bp
CORS(
service_api_bp,
allow_headers=["Content-Type", "Authorization", "X-App-Code"],
methods=["GET", "PUT", "POST", "DELETE", "OPTIONS", "PATCH"],
)
app.register_blueprint(service_api_bp)
CORS(
web_bp,
resources={r"/*": {"origins": dify_config.WEB_API_CORS_ALLOW_ORIGINS}},
supports_credentials=True,
allow_headers=["Content-Type", "Authorization", "X-App-Code"],
methods=["GET", "PUT", "POST", "DELETE", "OPTIONS", "PATCH"],
expose_headers=["X-Version", "X-Env"],
)
app.register_blueprint(web_bp)
CORS(
console_app_bp,
resources={r"/*": {"origins": dify_config.CONSOLE_CORS_ALLOW_ORIGINS}},
supports_credentials=True,
allow_headers=["Content-Type", "Authorization"],
methods=["GET", "PUT", "POST", "DELETE", "OPTIONS", "PATCH"],
expose_headers=["X-Version", "X-Env"],
)
app.register_blueprint(console_app_bp)
CORS(files_bp, allow_headers=["Content-Type"], methods=["GET", "PUT", "POST", "DELETE", "OPTIONS", "PATCH"])
app.register_blueprint(files_bp)
app.register_blueprint(inner_api_bp)
return app


@@ -259,7 +259,7 @@ def migrate_knowledge_vector_database():
skipped_count = 0
total_count = 0
vector_type = dify_config.VECTOR_STORE
upper_colletion_vector_types = {
upper_collection_vector_types = {
VectorType.MILVUS,
VectorType.PGVECTOR,
VectorType.RELYT,
@@ -267,7 +267,7 @@ def migrate_knowledge_vector_database():
VectorType.ORACLE,
VectorType.ELASTICSEARCH,
}
lower_colletion_vector_types = {
lower_collection_vector_types = {
VectorType.ANALYTICDB,
VectorType.CHROMA,
VectorType.MYSCALE,
@@ -278,6 +278,8 @@ def migrate_knowledge_vector_database():
VectorType.BAIDU,
VectorType.VIKINGDB,
VectorType.UPSTASH,
VectorType.COUCHBASE,
VectorType.OCEANBASE,
}
page = 1
while True:
@@ -305,7 +307,7 @@ def migrate_knowledge_vector_database():
continue
collection_name = ""
dataset_id = dataset.id
if vector_type in upper_colletion_vector_types:
if vector_type in upper_collection_vector_types:
collection_name = Dataset.gen_collection_name_by_id(dataset_id)
elif vector_type == VectorType.QDRANT:
if dataset.collection_binding_id:
@@ -321,7 +323,7 @@ def migrate_knowledge_vector_database():
else:
collection_name = Dataset.gen_collection_name_by_id(dataset_id)
elif vector_type in lower_colletion_vector_types:
elif vector_type in lower_collection_vector_types:
collection_name = Dataset.gen_collection_name_by_id(dataset_id).lower()
else:
raise ValueError(f"Vector store {vector_type} is not supported.")
@@ -587,7 +589,7 @@ def upgrade_db():
click.echo(click.style("Database migration successful!", fg="green"))
except Exception as e:
logging.exception(f"Database migration failed: {e}")
logging.exception("Failed to execute database migration")
finally:
lock.release()
else:
@@ -631,22 +633,10 @@ where sites.id is null limit 1000"""
except Exception as e:
failed_app_ids.append(app_id)
click.echo(click.style("Failed to fix missing site for app {}".format(app_id), fg="red"))
logging.exception(f"Fix app related site missing issue failed, error: {e}")
logging.exception(f"Failed to fix app related site missing issue, app_id: {app_id}")
continue
if not processed_count:
break
click.echo(click.style("Fix for missing app-related sites completed successfully!", fg="green"))
def register_commands(app):
app.cli.add_command(reset_password)
app.cli.add_command(reset_email)
app.cli.add_command(reset_encrypt_key_pair)
app.cli.add_command(vdb_migrate)
app.cli.add_command(convert_to_agent_apps)
app.cli.add_command(add_qdrant_doc_id_index)
app.cli.add_command(create_tenant)
app.cli.add_command(upgrade_db)
app.cli.add_command(fix_app_site_missing)


@@ -1,11 +1,51 @@
from pydantic_settings import SettingsConfigDict
import logging
from typing import Any
from configs.deploy import DeploymentConfig
from configs.enterprise import EnterpriseFeatureConfig
from configs.extra import ExtraServiceConfig
from configs.feature import FeatureConfig
from configs.middleware import MiddlewareConfig
from configs.packaging import PackagingInfo
from pydantic.fields import FieldInfo
from pydantic_settings import BaseSettings, PydanticBaseSettingsSource, SettingsConfigDict
from .deploy import DeploymentConfig
from .enterprise import EnterpriseFeatureConfig
from .extra import ExtraServiceConfig
from .feature import FeatureConfig
from .middleware import MiddlewareConfig
from .packaging import PackagingInfo
from .remote_settings_sources import RemoteSettingsSource, RemoteSettingsSourceConfig, RemoteSettingsSourceName
from .remote_settings_sources.apollo import ApolloSettingsSource
logger = logging.getLogger(__name__)
class RemoteSettingsSourceFactory(PydanticBaseSettingsSource):
def __init__(self, settings_cls: type[BaseSettings]):
super().__init__(settings_cls)
def get_field_value(self, field: FieldInfo, field_name: str) -> tuple[Any, str, bool]:
raise NotImplementedError
def __call__(self) -> dict[str, Any]:
current_state = self.current_state
remote_source_name = current_state.get("REMOTE_SETTINGS_SOURCE_NAME")
if not remote_source_name:
return {}
remote_source: RemoteSettingsSource | None = None
match remote_source_name:
case RemoteSettingsSourceName.APOLLO:
remote_source = ApolloSettingsSource(current_state)
case _:
logger.warning(f"Unsupported remote source: {remote_source_name}")
return {}
d: dict[str, Any] = {}
for field_name, field in self.settings_cls.model_fields.items():
field_value, field_key, value_is_complex = remote_source.get_field_value(field, field_name)
field_value = remote_source.prepare_field_value(field_name, field, field_value, value_is_complex)
if field_value is not None:
d[field_key] = field_value
return d
class DifyConfig(
@@ -19,6 +59,8 @@ class DifyConfig(
MiddlewareConfig,
# Extra service configs
ExtraServiceConfig,
# Remote source configs
RemoteSettingsSourceConfig,
# Enterprise feature configs
# **Before using, please contact business@dify.ai by email to inquire about licensing matters.**
EnterpriseFeatureConfig,
@@ -27,7 +69,6 @@ class DifyConfig(
# read from dotenv format config file
env_file=".env",
env_file_encoding="utf-8",
frozen=True,
# ignore extra attributes
extra="ignore",
)
@@ -36,3 +77,20 @@ class DifyConfig(
# please consider to arrange it in the proper config group of existed or added
# for better readability and maintainability.
# Thanks for your concentration and consideration.
@classmethod
def settings_customise_sources(
cls,
settings_cls: type[BaseSettings],
init_settings: PydanticBaseSettingsSource,
env_settings: PydanticBaseSettingsSource,
dotenv_settings: PydanticBaseSettingsSource,
file_secret_settings: PydanticBaseSettingsSource,
) -> tuple[PydanticBaseSettingsSource, ...]:
return (
init_settings,
env_settings,
RemoteSettingsSourceFactory(settings_cls),
dotenv_settings,
file_secret_settings,
)


@@ -17,11 +17,6 @@ class DeploymentConfig(BaseSettings):
default=False,
)
TESTING: bool = Field(
description="Enable testing mode for running automated tests",
default=False,
)
EDITION: str = Field(
description="Deployment edition of the application (e.g., 'SELF_HOSTED', 'CLOUD')",
default="SELF_HOSTED",


@@ -109,7 +109,7 @@ class CodeExecutionSandboxConfig(BaseSettings):
)
CODE_MAX_PRECISION: PositiveInt = Field(
description="mMaximum number of decimal places for floating-point numbers in code execution",
description="Maximum number of decimal places for floating-point numbers in code execution",
default=20,
)
@@ -216,6 +216,11 @@ class FileUploadConfig(BaseSettings):
default=20,
)
WORKFLOW_FILE_UPLOAD_LIMIT: PositiveInt = Field(
description="Maximum number of files allowed in a workflow upload operation",
default=10,
)
class HttpConfig(BaseSettings):
"""
@@ -271,6 +276,16 @@ class HttpConfig(BaseSettings):
default=1 * 1024 * 1024,
)
SSRF_DEFAULT_MAX_RETRIES: PositiveInt = Field(
description="Maximum number of retries for network requests (SSRF)",
default=3,
)
SSRF_PROXY_ALL_URL: Optional[str] = Field(
description="Proxy URL for HTTP or HTTPS requests to prevent Server-Side Request Forgery (SSRF)",
default=None,
)
SSRF_PROXY_HTTP_URL: Optional[str] = Field(
description="Proxy URL for HTTP requests to prevent Server-Side Request Forgery (SSRF)",
default=None,
@@ -281,6 +296,26 @@ class HttpConfig(BaseSettings):
default=None,
)
SSRF_DEFAULT_TIME_OUT: PositiveFloat = Field(
description="The default timeout period used for network requests (SSRF)",
default=5,
)
SSRF_DEFAULT_CONNECT_TIME_OUT: PositiveFloat = Field(
description="The default connect timeout period used for network requests (SSRF)",
default=5,
)
SSRF_DEFAULT_READ_TIME_OUT: PositiveFloat = Field(
description="The default read timeout period used for network requests (SSRF)",
default=5,
)
SSRF_DEFAULT_WRITE_TIME_OUT: PositiveFloat = Field(
description="The default write timeout period used for network requests (SSRF)",
default=5,
)
RESPECT_XFORWARD_HEADERS_ENABLED: bool = Field(
description="Enable or disable the X-Forwarded-For Proxy Fix middleware from Werkzeug"
" to respect X-* headers to redirect clients",
@@ -341,7 +376,7 @@ class LoggingConfig(BaseSettings):
LOG_TZ: Optional[str] = Field(
description="Timezone for log timestamps (e.g., 'America/New_York')",
default=None,
default="UTC",
)
@@ -404,6 +439,17 @@ class WorkflowConfig(BaseSettings):
)
class WorkflowNodeExecutionConfig(BaseSettings):
"""
Configuration for workflow node execution
"""
MAX_SUBMIT_COUNT: PositiveInt = Field(
description="Maximum number of submitted thread count in a ThreadPool for parallel node execution",
default=100,
)
class AuthConfig(BaseSettings):
"""
Configuration for authentication and OAuth
@@ -439,6 +485,11 @@ class AuthConfig(BaseSettings):
default=60,
)
LOGIN_LOCKOUT_DURATION: PositiveInt = Field(
description="Time (in seconds) a user must wait before retrying login after exceeding the rate limit.",
default=86400,
)
class ModerationConfig(BaseSettings):
"""
@@ -550,6 +601,11 @@ class RagEtlConfig(BaseSettings):
default=None,
)
SCARF_NO_ANALYTICS: Optional[str] = Field(
description="This is about whether to disable Scarf analytics in Unstructured library.",
default="false",
)
class DataSetConfig(BaseSettings):
"""
@@ -576,6 +632,16 @@ class DataSetConfig(BaseSettings):
default=500,
)
CREATE_TIDB_SERVICE_JOB_ENABLED: bool = Field(
description="Enable or disable create tidb service job",
default=False,
)
PLAN_SANDBOX_CLEAN_MESSAGE_DAY_SETTING: PositiveInt = Field(
description="Interval in days for message cleanup operations - plan: sandbox",
default=30,
)
class WorkspaceConfig(BaseSettings):
"""
@@ -595,13 +661,13 @@ class IndexingConfig(BaseSettings):
INDEXING_MAX_SEGMENTATION_TOKENS_LENGTH: PositiveInt = Field(
description="Maximum token length for text segmentation during indexing",
default=1000,
default=4000,
)
class ImageFormatConfig(BaseSettings):
MULTIMODAL_SEND_IMAGE_FORMAT: Literal["base64", "url"] = Field(
description="Format for sending images in multimodal contexts ('base64' or 'url'), default is base64",
class MultiModalTransferConfig(BaseSettings):
MULTIMODAL_SEND_FORMAT: Literal["base64", "url"] = Field(
description="Format for sending files in multimodal contexts ('base64' or 'url'), default is base64",
default="base64",
)
@@ -707,19 +773,20 @@ class FeatureConfig(
FileAccessConfig,
FileUploadConfig,
HttpConfig,
ImageFormatConfig,
InnerAPIConfig,
IndexingConfig,
LoggingConfig,
MailConfig,
ModelLoadBalanceConfig,
ModerationConfig,
MultiModalTransferConfig,
PositionConfig,
RagEtlConfig,
SecurityConfig,
ToolConfig,
UpdateConfig,
WorkflowConfig,
WorkflowNodeExecutionConfig,
WorkspaceConfig,
LoginConfig,
# hosted services config

View File

@@ -1,50 +1,69 @@
from typing import Any, Optional
from typing import Any, Literal, Optional
from urllib.parse import quote_plus
from pydantic import Field, NonNegativeInt, PositiveFloat, PositiveInt, computed_field
from pydantic_settings import BaseSettings
from configs.middleware.cache.redis_config import RedisConfig
from configs.middleware.storage.aliyun_oss_storage_config import AliyunOSSStorageConfig
from configs.middleware.storage.amazon_s3_storage_config import S3StorageConfig
from configs.middleware.storage.azure_blob_storage_config import AzureBlobStorageConfig
from configs.middleware.storage.baidu_obs_storage_config import BaiduOBSStorageConfig
from configs.middleware.storage.google_cloud_storage_config import GoogleCloudStorageConfig
from configs.middleware.storage.huawei_obs_storage_config import HuaweiCloudOBSStorageConfig
from configs.middleware.storage.oci_storage_config import OCIStorageConfig
from configs.middleware.storage.supabase_storage_config import SupabaseStorageConfig
from configs.middleware.storage.tencent_cos_storage_config import TencentCloudCOSStorageConfig
from configs.middleware.storage.volcengine_tos_storage_config import VolcengineTOSStorageConfig
from configs.middleware.vdb.analyticdb_config import AnalyticdbConfig
from configs.middleware.vdb.chroma_config import ChromaConfig
from configs.middleware.vdb.elasticsearch_config import ElasticsearchConfig
from configs.middleware.vdb.milvus_config import MilvusConfig
from configs.middleware.vdb.myscale_config import MyScaleConfig
from configs.middleware.vdb.opensearch_config import OpenSearchConfig
from configs.middleware.vdb.oracle_config import OracleConfig
from configs.middleware.vdb.pgvector_config import PGVectorConfig
from configs.middleware.vdb.pgvectors_config import PGVectoRSConfig
from configs.middleware.vdb.qdrant_config import QdrantConfig
from configs.middleware.vdb.relyt_config import RelytConfig
from configs.middleware.vdb.tencent_vector_config import TencentVectorDBConfig
from configs.middleware.vdb.tidb_on_qdrant_config import TidbOnQdrantConfig
from configs.middleware.vdb.tidb_vector_config import TiDBVectorConfig
from configs.middleware.vdb.upstash_config import UpstashConfig
from configs.middleware.vdb.vikingdb_config import VikingDBConfig
from configs.middleware.vdb.weaviate_config import WeaviateConfig
from .cache.redis_config import RedisConfig
from .storage.aliyun_oss_storage_config import AliyunOSSStorageConfig
from .storage.amazon_s3_storage_config import S3StorageConfig
from .storage.azure_blob_storage_config import AzureBlobStorageConfig
from .storage.baidu_obs_storage_config import BaiduOBSStorageConfig
from .storage.google_cloud_storage_config import GoogleCloudStorageConfig
from .storage.huawei_obs_storage_config import HuaweiCloudOBSStorageConfig
from .storage.oci_storage_config import OCIStorageConfig
from .storage.opendal_storage_config import OpenDALStorageConfig
from .storage.supabase_storage_config import SupabaseStorageConfig
from .storage.tencent_cos_storage_config import TencentCloudCOSStorageConfig
from .storage.volcengine_tos_storage_config import VolcengineTOSStorageConfig
from .vdb.analyticdb_config import AnalyticdbConfig
from .vdb.baidu_vector_config import BaiduVectorDBConfig
from .vdb.chroma_config import ChromaConfig
from .vdb.couchbase_config import CouchbaseConfig
from .vdb.elasticsearch_config import ElasticsearchConfig
from .vdb.lindorm_config import LindormConfig
from .vdb.milvus_config import MilvusConfig
from .vdb.myscale_config import MyScaleConfig
from .vdb.oceanbase_config import OceanBaseVectorConfig
from .vdb.opensearch_config import OpenSearchConfig
from .vdb.oracle_config import OracleConfig
from .vdb.pgvector_config import PGVectorConfig
from .vdb.pgvectors_config import PGVectoRSConfig
from .vdb.qdrant_config import QdrantConfig
from .vdb.relyt_config import RelytConfig
from .vdb.tencent_vector_config import TencentVectorDBConfig
from .vdb.tidb_on_qdrant_config import TidbOnQdrantConfig
from .vdb.tidb_vector_config import TiDBVectorConfig
from .vdb.upstash_config import UpstashConfig
from .vdb.vikingdb_config import VikingDBConfig
from .vdb.weaviate_config import WeaviateConfig
class StorageConfig(BaseSettings):
STORAGE_TYPE: str = Field(
STORAGE_TYPE: Literal[
"opendal",
"s3",
"aliyun-oss",
"azure-blob",
"baidu-obs",
"google-storage",
"huawei-obs",
"oci-storage",
"tencent-cos",
"volcengine-tos",
"supabase",
"local",
] = Field(
description="Type of storage to use."
" Options: 'local', 's3', 'aliyun-oss', 'azure-blob', 'baidu-obs', 'google-storage', 'huawei-obs', "
"'oci-storage', 'tencent-cos', 'volcengine-tos', 'supabase'. Default is 'local'.",
default="local",
" Options: 'opendal', '(deprecated) local', 's3', 'aliyun-oss', 'azure-blob', 'baidu-obs', 'google-storage', "
"'huawei-obs', 'oci-storage', 'tencent-cos', 'volcengine-tos', 'supabase'. Default is 'opendal'.",
default="opendal",
)
STORAGE_LOCAL_PATH: str = Field(
description="Path for local storage when STORAGE_TYPE is set to 'local'.",
default="storage",
deprecated=True,
)
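As a small illustration of what the move to a Literal type buys (a sketch, not code from this PR): an unsupported storage type now fails validation when the settings object is constructed instead of surfacing later at runtime.

from pydantic import ValidationError

try:
    StorageConfig(STORAGE_TYPE="ftp")  # hypothetical unsupported value
except ValidationError as exc:
    # pydantic reports that 'ftp' is not one of the permitted storage types
    print(exc)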
@@ -69,7 +88,7 @@ class KeywordStoreConfig(BaseSettings):
)
class DatabaseConfig:
class DatabaseConfig(BaseSettings):
DB_HOST: str = Field(
description="Hostname or IP address of the database server.",
default="localhost",
@@ -231,6 +250,7 @@ class MiddlewareConfig(
GoogleCloudStorageConfig,
HuaweiCloudOBSStorageConfig,
OCIStorageConfig,
OpenDALStorageConfig,
S3StorageConfig,
SupabaseStorageConfig,
TencentCloudCOSStorageConfig,
@@ -251,9 +271,13 @@ class MiddlewareConfig(
TiDBVectorConfig,
WeaviateConfig,
ElasticsearchConfig,
CouchbaseConfig,
InternalTestConfig,
VikingDBConfig,
UpstashConfig,
TidbOnQdrantConfig,
LindormConfig,
OceanBaseVectorConfig,
BaiduVectorDBConfig,
):
pass

View File

@@ -68,3 +68,18 @@ class RedisConfig(BaseSettings):
description="Socket timeout in seconds for Redis Sentinel connections",
default=0.1,
)
REDIS_USE_CLUSTERS: bool = Field(
description="Enable Redis Clusters mode for high availability",
default=False,
)
REDIS_CLUSTERS: Optional[str] = Field(
description="Comma-separated list of Redis Clusters nodes (host:port)",
default=None,
)
REDIS_CLUSTERS_PASSWORD: Optional[str] = Field(
description="Password for Redis Clusters authentication (if required)",
default=None,
)
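A minimal sketch, assuming redis-py 4.1+ and hypothetical node addresses, of how the comma-separated REDIS_CLUSTERS value can be turned into cluster startup nodes.

from redis.cluster import ClusterNode, RedisCluster

redis_clusters = "10.0.0.1:6379,10.0.0.2:6379"  # hypothetical REDIS_CLUSTERS value
startup_nodes = [
    ClusterNode(host, int(port))
    for host, port in (node.split(":") for node in redis_clusters.split(","))
]
# REDIS_CLUSTERS_PASSWORD would be passed here if authentication is required.
client = RedisCluster(startup_nodes=startup_nodes, password=None)
client.ping()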

View File

@@ -1,9 +1,10 @@
from typing import Optional
from pydantic import BaseModel, Field
from pydantic import Field
from pydantic_settings import BaseSettings
class BaiduOBSStorageConfig(BaseModel):
class BaiduOBSStorageConfig(BaseSettings):
"""
Configuration settings for Baidu Object Storage Service (OBS)
"""

View File

@@ -1,9 +1,10 @@
from typing import Optional
from pydantic import BaseModel, Field
from pydantic import Field
from pydantic_settings import BaseSettings
class HuaweiCloudOBSStorageConfig(BaseModel):
class HuaweiCloudOBSStorageConfig(BaseSettings):
"""
Configuration settings for Huawei Cloud Object Storage Service (OBS)
"""

View File

@@ -0,0 +1,9 @@
from pydantic import Field
from pydantic_settings import BaseSettings
class OpenDALStorageConfig(BaseSettings):
OPENDAL_SCHEME: str = Field(
default="fs",
description="OpenDAL scheme.",
)
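A minimal usage sketch, assuming the opendal Python binding: with the default scheme "fs", data is written beneath a local root directory (the root option shown here is an assumed fs-scheme setting, not a field from this PR).

import opendal

op = opendal.Operator("fs", root="storage")  # assumed fs-scheme option
op.write("upload/hello.txt", b"hello dify")
print(op.read("upload/hello.txt"))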

View File

@@ -1,9 +1,10 @@
from typing import Optional
from pydantic import BaseModel, Field
from pydantic import Field
from pydantic_settings import BaseSettings
class SupabaseStorageConfig(BaseModel):
class SupabaseStorageConfig(BaseSettings):
"""
Configuration settings for Supabase Object Storage Service
"""

View File

@@ -1,9 +1,10 @@
from typing import Optional
from pydantic import BaseModel, Field
from pydantic import Field
from pydantic_settings import BaseSettings
class VolcengineTOSStorageConfig(BaseModel):
class VolcengineTOSStorageConfig(BaseSettings):
"""
Configuration settings for Volcengine Tinder Object Storage (TOS)
"""

View File

@@ -1,9 +1,10 @@
from typing import Optional
from pydantic import BaseModel, Field
from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class AnalyticdbConfig(BaseModel):
class AnalyticdbConfig(BaseSettings):
"""
Configuration for connecting to Alibaba Cloud AnalyticDB for PostgreSQL.
Refer to the following documentation for details on obtaining credentials:
@@ -40,3 +41,11 @@ class AnalyticdbConfig(BaseModel):
description="The password for accessing the specified namespace within the AnalyticDB instance"
" (if namespace feature is enabled).",
)
ANALYTICDB_HOST: Optional[str] = Field(
default=None, description="The host of the AnalyticDB instance you want to connect to."
)
ANALYTICDB_PORT: PositiveInt = Field(
default=5432, description="The port of the AnalyticDB instance you want to connect to."
)
ANALYTICDB_MIN_CONNECTION: PositiveInt = Field(default=1, description="Min connection of the AnalyticDB database.")
ANALYTICDB_MAX_CONNECTION: PositiveInt = Field(default=5, description="Max connection of the AnalyticDB database.")

View File

@@ -0,0 +1,35 @@
from typing import Optional
from pydantic import Field
from pydantic_settings import BaseSettings
class CouchbaseConfig(BaseSettings):
"""
Couchbase configs
"""
COUCHBASE_CONNECTION_STRING: Optional[str] = Field(
description="COUCHBASE connection string",
default=None,
)
COUCHBASE_USER: Optional[str] = Field(
description="COUCHBASE user",
default=None,
)
COUCHBASE_PASSWORD: Optional[str] = Field(
description="COUCHBASE password",
default=None,
)
COUCHBASE_BUCKET_NAME: Optional[str] = Field(
description="COUCHBASE bucket name",
default=None,
)
COUCHBASE_SCOPE_NAME: Optional[str] = Field(
description="COUCHBASE scope name",
default=None,
)

View File

@@ -0,0 +1,34 @@
from typing import Optional
from pydantic import Field
from pydantic_settings import BaseSettings
class LindormConfig(BaseSettings):
"""
Lindorm configs
"""
LINDORM_URL: Optional[str] = Field(
description="Lindorm url",
default=None,
)
LINDORM_USERNAME: Optional[str] = Field(
description="Lindorm user",
default=None,
)
LINDORM_PASSWORD: Optional[str] = Field(
description="Lindorm password",
default=None,
)
DEFAULT_INDEX_TYPE: Optional[str] = Field(
description="Lindorm Vector Index Type, hnsw or flat is available in dify",
default="hnsw",
)
DEFAULT_DISTANCE_TYPE: Optional[str] = Field(
description="Vector Distance Type, support l2, cosinesimil, innerproduct", default="l2"
)
USING_UGC_INDEX: Optional[bool] = Field(
description="Using UGC index will store the same type of Index in a single index but can retrieve separately.",
default=False,
)

View File

@@ -1,7 +1,8 @@
from pydantic import BaseModel, Field, PositiveInt
from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class MyScaleConfig(BaseModel):
class MyScaleConfig(BaseSettings):
"""
Configuration settings for MyScale vector database
"""

View File

@@ -0,0 +1,35 @@
from typing import Optional
from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class OceanBaseVectorConfig(BaseSettings):
"""
Configuration settings for OceanBase Vector database
"""
OCEANBASE_VECTOR_HOST: Optional[str] = Field(
description="Hostname or IP address of the OceanBase Vector server (e.g. 'localhost')",
default=None,
)
OCEANBASE_VECTOR_PORT: Optional[PositiveInt] = Field(
description="Port number on which the OceanBase Vector server is listening (default is 2881)",
default=2881,
)
OCEANBASE_VECTOR_USER: Optional[str] = Field(
description="Username for authenticating with the OceanBase Vector database",
default=None,
)
OCEANBASE_VECTOR_PASSWORD: Optional[str] = Field(
description="Password for authenticating with the OceanBase Vector database",
default=None,
)
OCEANBASE_VECTOR_DATABASE: Optional[str] = Field(
description="Name of the OceanBase Vector database to connect to",
default=None,
)

View File

@@ -63,3 +63,8 @@ class TidbOnQdrantConfig(BaseSettings):
description="Tidb project id",
default=None,
)
TIDB_SPEND_LIMIT: Optional[int] = Field(
description="Tidb spend limit",
default=100,
)

View File

@@ -1,9 +1,10 @@
from typing import Optional
from pydantic import BaseModel, Field
from pydantic import Field
from pydantic_settings import BaseSettings
class VikingDBConfig(BaseModel):
class VikingDBConfig(BaseSettings):
"""
Configuration for connecting to Volcengine VikingDB.
Refer to the following documentation for details on obtaining credentials:

View File

@@ -9,7 +9,7 @@ class PackagingInfo(BaseSettings):
CURRENT_VERSION: str = Field(
description="Dify version",
default="0.10.2",
default="0.14.1",
)
COMMIT_SHA: str = Field(

View File

@@ -0,0 +1,17 @@
from typing import Optional
from pydantic import Field
from .apollo import ApolloSettingsSourceInfo
from .base import RemoteSettingsSource
from .enums import RemoteSettingsSourceName
class RemoteSettingsSourceConfig(ApolloSettingsSourceInfo):
REMOTE_SETTINGS_SOURCE_NAME: RemoteSettingsSourceName | str = Field(
description="name of remote config source",
default="",
)
__all__ = ["RemoteSettingsSource", "RemoteSettingsSourceConfig", "RemoteSettingsSourceName"]

View File

@@ -0,0 +1,55 @@
from collections.abc import Mapping
from typing import Any, Optional
from pydantic import Field
from pydantic.fields import FieldInfo
from pydantic_settings import BaseSettings
from configs.remote_settings_sources.base import RemoteSettingsSource
from .client import ApolloClient
class ApolloSettingsSourceInfo(BaseSettings):
"""
Apollo remote settings source connection information
"""
APOLLO_APP_ID: Optional[str] = Field(
description="apollo app_id",
default=None,
)
APOLLO_CLUSTER: Optional[str] = Field(
description="apollo cluster",
default=None,
)
APOLLO_CONFIG_URL: Optional[str] = Field(
description="apollo config url",
default=None,
)
APOLLO_NAMESPACE: Optional[str] = Field(
description="apollo namespace",
default=None,
)
class ApolloSettingsSource(RemoteSettingsSource):
def __init__(self, configs: Mapping[str, Any]):
self.client = ApolloClient(
app_id=configs["APOLLO_APP_ID"],
cluster=configs["APOLLO_CLUSTER"],
config_url=configs["APOLLO_CONFIG_URL"],
start_hot_update=False,
_notification_map={configs["APOLLO_NAMESPACE"]: -1},
)
self.namespace = configs["APOLLO_NAMESPACE"]
self.remote_configs = self.client.get_all_dicts(self.namespace)
def get_field_value(self, field: FieldInfo, field_name: str) -> tuple[Any, str, bool]:
if not isinstance(self.remote_configs, dict):
raise ValueError(f"remote configs is not dict, but {type(self.remote_configs)}")
field_value = self.remote_configs.get(field_name)
return field_value, field_name, False
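A minimal usage sketch with hypothetical connection values; the mapping keys mirror the APOLLO_* fields above, and field can be None here because get_field_value only looks at field_name.

source = ApolloSettingsSource(
    {
        "APOLLO_APP_ID": "dify",
        "APOLLO_CLUSTER": "default",
        "APOLLO_CONFIG_URL": "http://apollo.example.internal:8080",  # hypothetical URL
        "APOLLO_NAMESPACE": "application",
    }
)
value, name, _ = source.get_field_value(field=None, field_name="LOG_LEVEL")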

View File

@@ -0,0 +1,303 @@
import hashlib
import json
import logging
import os
import threading
import time
from pathlib import Path
from .python_3x import http_request, makedirs_wrapper
from .utils import (
CONFIGURATIONS,
NAMESPACE_NAME,
NOTIFICATION_ID,
get_value_from_dict,
init_ip,
no_key_cache_key,
signature,
url_encode_wrapper,
)
logger = logging.getLogger(__name__)
class ApolloClient:
def __init__(
self,
config_url,
app_id,
cluster="default",
secret="",
start_hot_update=True,
change_listener=None,
_notification_map=None,
):
# Core routing parameters
self.config_url = config_url
self.cluster = cluster
self.app_id = app_id
# Non-core parameters
self.ip = init_ip()
self.secret = secret
# Check the parameter variables
# Private control variables
self._cycle_time = 5
self._stopping = False
self._cache = {}
self._no_key = {}
self._hash = {}
self._pull_timeout = 75
self._cache_file_path = os.path.expanduser("~") + "/.dify/config/remote-settings/apollo/cache/"
self._long_poll_thread = None
self._change_listener = change_listener # "add" "delete" "update"
if _notification_map is None:
_notification_map = {"application": -1}
self._notification_map = _notification_map
self.last_release_key = None
# Private startup method
self._path_checker()
if start_hot_update:
self._start_hot_update()
# start the heartbeat thread
heartbeat = threading.Thread(target=self._heart_beat)
heartbeat.daemon = True
heartbeat.start()
def get_json_from_net(self, namespace="application"):
url = "{}/configs/{}/{}/{}?releaseKey={}&ip={}".format(
self.config_url, self.app_id, self.cluster, namespace, "", self.ip
)
try:
code, body = http_request(url, timeout=3, headers=self._sign_headers(url))
if code == 200:
if not body:
logger.error(f"get_json_from_net load configs failed, body is {body}")
return None
data = json.loads(body)
data = data["configurations"]
return_data = {CONFIGURATIONS: data}
return return_data
else:
return None
except Exception:
logger.exception("an error occurred in get_json_from_net")
return None
def get_value(self, key, default_val=None, namespace="application"):
try:
# read memory configuration
namespace_cache = self._cache.get(namespace)
val = get_value_from_dict(namespace_cache, key)
if val is not None:
return val
no_key = no_key_cache_key(namespace, key)
if no_key in self._no_key:
return default_val
# read the network configuration
namespace_data = self.get_json_from_net(namespace)
val = get_value_from_dict(namespace_data, key)
if val is not None:
self._update_cache_and_file(namespace_data, namespace)
return val
# read the file configuration
namespace_cache = self._get_local_cache(namespace)
val = get_value_from_dict(namespace_cache, key)
if val is not None:
self._update_cache_and_file(namespace_cache, namespace)
return val
# If all of them are not obtained, the default value is returned
# and the local cache is set to None
self._set_local_cache_none(namespace, key)
return default_val
except Exception:
logger.exception("get_value has error, [key is %s], [namespace is %s]", key, namespace)
return default_val
# Record the key as missing for this namespace without caching the default value,
# so each call stays correct in real time.
# If the default value were cached here, callers passing a different default
# for the same key could receive a stale or wrong result.
def _set_local_cache_none(self, namespace, key):
no_key = no_key_cache_key(namespace, key)
self._no_key[no_key] = key
def _start_hot_update(self):
self._long_poll_thread = threading.Thread(target=self._listener)
# The polling thread runs as a daemon, so it exits automatically
# when the main thread terminates.
self._long_poll_thread.daemon = True
self._long_poll_thread.start()
def stop(self):
self._stopping = True
logger.info("Stopping listener...")
# Invoke the registered change listener; any exception it raises is caught and logged.
def _call_listener(self, namespace, old_kv, new_kv):
if self._change_listener is None:
return
if old_kv is None:
old_kv = {}
if new_kv is None:
new_kv = {}
try:
for key in old_kv:
new_value = new_kv.get(key)
old_value = old_kv.get(key)
if new_value is None:
# If new_value is None, the key and its value have been deleted.
self._change_listener("delete", namespace, key, old_value)
continue
if new_value != old_value:
self._change_listener("update", namespace, key, new_value)
continue
for key in new_kv:
new_value = new_kv.get(key)
old_value = old_kv.get(key)
if old_value is None:
self._change_listener("add", namespace, key, new_value)
except BaseException as e:
logger.warning(str(e))
def _path_checker(self):
if not os.path.isdir(self._cache_file_path):
makedirs_wrapper(self._cache_file_path)
# update the local cache and file cache
def _update_cache_and_file(self, namespace_data, namespace="application"):
# update the local cache
self._cache[namespace] = namespace_data
# update the file cache
new_string = json.dumps(namespace_data)
new_hash = hashlib.md5(new_string.encode("utf-8")).hexdigest()
if self._hash.get(namespace) == new_hash:
pass
else:
file_path = Path(self._cache_file_path) / f"{self.app_id}_configuration_{namespace}.txt"
file_path.write_text(new_string)
self._hash[namespace] = new_hash
# get the configuration from the local file
def _get_local_cache(self, namespace="application"):
cache_file_path = os.path.join(self._cache_file_path, f"{self.app_id}_configuration_{namespace}.txt")
if os.path.isfile(cache_file_path):
with open(cache_file_path) as f:
result = json.loads(f.readline())
return result
return {}
def _long_poll(self):
notifications = []
for key in self._cache:
namespace_data = self._cache[key]
notification_id = -1
if NOTIFICATION_ID in namespace_data:
notification_id = self._cache[key][NOTIFICATION_ID]
notifications.append({NAMESPACE_NAME: key, NOTIFICATION_ID: notification_id})
try:
# nothing to poll for; return immediately
if len(notifications) == 0:
return
url = "{}/notifications/v2".format(self.config_url)
params = {
"appId": self.app_id,
"cluster": self.cluster,
"notifications": json.dumps(notifications, ensure_ascii=False),
}
param_str = url_encode_wrapper(params)
url = url + "?" + param_str
code, body = http_request(url, self._pull_timeout, headers=self._sign_headers(url))
http_code = code
if http_code == 304:
logger.debug("No change, loop...")
return
if http_code == 200:
if not body:
logger.error(f"_long_poll load configs failed,body is {body}")
return
data = json.loads(body)
for entry in data:
namespace = entry[NAMESPACE_NAME]
n_id = entry[NOTIFICATION_ID]
logger.info("%s has changes: notificationId=%d", namespace, n_id)
self._get_net_and_set_local(namespace, n_id, call_change=True)
return
else:
logger.warning("Sleep...")
except Exception as e:
logger.warning(str(e))
def _get_net_and_set_local(self, namespace, n_id, call_change=False):
namespace_data = self.get_json_from_net(namespace)
if not namespace_data:
return
namespace_data[NOTIFICATION_ID] = n_id
old_namespace = self._cache.get(namespace)
self._update_cache_and_file(namespace_data, namespace)
if self._change_listener is not None and call_change and old_namespace:
old_kv = old_namespace.get(CONFIGURATIONS)
new_kv = namespace_data.get(CONFIGURATIONS)
self._call_listener(namespace, old_kv, new_kv)
def _listener(self):
logger.info("start long_poll")
while not self._stopping:
self._long_poll()
time.sleep(self._cycle_time)
logger.info("stopped, long_poll")
# build the signed authorization headers when a secret is configured
def _sign_headers(self, url):
headers = {}
if self.secret == "":
return headers
uri = url[len(self.config_url) : len(url)]
time_unix_now = str(int(round(time.time() * 1000)))
headers["Authorization"] = "Apollo " + self.app_id + ":" + signature(time_unix_now, uri, self.secret)
headers["Timestamp"] = time_unix_now
return headers
def _heart_beat(self):
while not self._stopping:
for namespace in self._notification_map:
self._do_heart_beat(namespace)
time.sleep(60 * 10)  # 10 minutes
def _do_heart_beat(self, namespace):
url = "{}/configs/{}/{}/{}?ip={}".format(self.config_url, self.app_id, self.cluster, namespace, self.ip)
try:
code, body = http_request(url, timeout=3, headers=self._sign_headers(url))
if code == 200:
if not body:
logger.error(f"_do_heart_beat load configs failed,body is {body}")
return None
data = json.loads(body)
if self.last_release_key == data["releaseKey"]:
return None
self.last_release_key = data["releaseKey"]
data = data["configurations"]
self._update_cache_and_file(data, namespace)
else:
return None
except Exception:
logger.exception("an error occurred in _do_heart_beat")
return None
def get_all_dicts(self, namespace):
namespace_data = self._cache.get(namespace)
if namespace_data is None:
net_namespace_data = self.get_json_from_net(namespace)
if not net_namespace_data:
return namespace_data
namespace_data = net_namespace_data.get(CONFIGURATIONS)
if namespace_data:
self._update_cache_and_file(namespace_data, namespace)
return namespace_data
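A minimal usage sketch for the client above (hypothetical server address); hot updates are disabled, matching how ApolloSettingsSource constructs the client earlier in this diff.

client = ApolloClient(
    config_url="http://apollo.example.internal:8080",  # hypothetical
    app_id="dify",
    cluster="default",
    start_hot_update=False,
)
print(client.get_value("LOG_LEVEL", default_val="INFO"))
print(client.get_all_dicts("application"))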

View File

@@ -0,0 +1,41 @@
import logging
import os
import ssl
import urllib.request
from urllib import parse
from urllib.error import HTTPError
# Create an SSL context that allows for a lower level of security
ssl_context = ssl.create_default_context()
ssl_context.set_ciphers("HIGH:!DH:!aNULL")
ssl_context.check_hostname = False
ssl_context.verify_mode = ssl.CERT_NONE
# Create an opener object and pass in a custom SSL context
opener = urllib.request.build_opener(urllib.request.HTTPSHandler(context=ssl_context))
urllib.request.install_opener(opener)
logger = logging.getLogger(__name__)
def http_request(url, timeout, headers={}):
try:
request = urllib.request.Request(url, headers=headers)
res = urllib.request.urlopen(request, timeout=timeout)
body = res.read().decode("utf-8")
return res.code, body
except HTTPError as e:
if e.code == 304:
logger.warning("http_request error,code is 304, maybe you should check secret")
return 304, None
logger.warning("http_request error,code is %d, msg is %s", e.code, e.msg)
raise e
def url_encode(params):
return parse.urlencode(params)
def makedirs_wrapper(path):
os.makedirs(path, exist_ok=True)
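A quick usage sketch for http_request with a hypothetical URL. Note that the module-level SSL context installed above disables certificate verification for every urllib request made through this opener.

code, body = http_request(
    "http://apollo.example.internal:8080/configs/dify/default/application",  # hypothetical
    timeout=3,
)
if code == 200 and body:
    print(body)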

View File

@@ -0,0 +1,51 @@
import hashlib
import socket
from .python_3x import url_encode
# define constants
CONFIGURATIONS = "configurations"
NOTIFICATION_ID = "notificationId"
NAMESPACE_NAME = "namespaceName"
# sign the timestamp and uri with the secret key
def signature(timestamp, uri, secret):
import base64
import hmac
string_to_sign = "" + timestamp + "\n" + uri
hmac_code = hmac.new(secret.encode(), string_to_sign.encode(), hashlib.sha1).digest()
return base64.b64encode(hmac_code).decode()
def url_encode_wrapper(params):
return url_encode(params)
def no_key_cache_key(namespace, key):
return "{}{}{}".format(namespace, len(namespace), key)
# Return the value for the key from the namespace cache, or None if it is not present
def get_value_from_dict(namespace_cache, key):
if namespace_cache:
kv_data = namespace_cache.get(CONFIGURATIONS)
if kv_data is None:
return None
if key in kv_data:
return kv_data[key]
return None
def init_ip():
ip = ""
s = None
try:
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s.connect(("8.8.8.8", 53))
ip = s.getsockname()[0]
finally:
if s:
s.close()
return ip
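A small sketch of how these helpers combine into Apollo's signed request headers, mirroring ApolloClient._sign_headers earlier in this diff (the secret and app id are hypothetical).

import time

secret = "my-apollo-secret"  # hypothetical
uri = "/configs/dify/default/application"
timestamp = str(int(round(time.time() * 1000)))
headers = {
    "Authorization": "Apollo dify:" + signature(timestamp, uri, secret),
    "Timestamp": timestamp,
}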

View File

@@ -0,0 +1,15 @@
from collections.abc import Mapping
from typing import Any
from pydantic.fields import FieldInfo
class RemoteSettingsSource:
def __init__(self, configs: Mapping[str, Any]):
pass
def get_field_value(self, field: FieldInfo, field_name: str) -> tuple[Any, str, bool]:
raise NotImplementedError
def prepare_field_value(self, field_name: str, field: FieldInfo, value: Any, value_is_complex: bool) -> Any:
return value

View File

@@ -0,0 +1,5 @@
from enum import StrEnum
class RemoteSettingsSourceName(StrEnum):
APOLLO = "apollo"

View File

@@ -14,11 +14,11 @@ AUDIO_EXTENSIONS.extend([ext.upper() for ext in AUDIO_EXTENSIONS])
if dify_config.ETL_TYPE == "Unstructured":
DOCUMENT_EXTENSIONS = ["txt", "markdown", "md", "pdf", "html", "htm", "xlsx", "xls"]
DOCUMENT_EXTENSIONS = ["txt", "markdown", "md", "mdx", "pdf", "html", "htm", "xlsx", "xls"]
DOCUMENT_EXTENSIONS.extend(("docx", "csv", "eml", "msg", "pptx", "xml", "epub"))
if dify_config.UNSTRUCTURED_API_URL:
DOCUMENT_EXTENSIONS.append("ppt")
DOCUMENT_EXTENSIONS.extend([ext.upper() for ext in DOCUMENT_EXTENSIONS])
else:
DOCUMENT_EXTENSIONS = ["txt", "markdown", "md", "pdf", "html", "htm", "xlsx", "xls", "docx", "csv"]
DOCUMENT_EXTENSIONS = ["txt", "markdown", "md", "mdx", "pdf", "html", "htm", "xlsx", "xls", "docx", "csv"]
DOCUMENT_EXTENSIONS.extend([ext.upper() for ext in DOCUMENT_EXTENSIONS])

View File

@@ -17,6 +17,8 @@ language_timezone_mapping = {
"hi-IN": "Asia/Kolkata",
"tr-TR": "Europe/Istanbul",
"fa-IR": "Asia/Tehran",
"sl-SI": "Europe/Ljubljana",
"th-TH": "Asia/Bangkok",
}
languages = list(language_timezone_mapping.keys())

View File

@@ -0,0 +1,6 @@
from werkzeug.exceptions import HTTPException
class FilenameNotExistsError(HTTPException):
code = 400
description = "The specified filename does not exist."

View File

@@ -0,0 +1,24 @@
from flask_restful import fields
parameters__system_parameters = {
"image_file_size_limit": fields.Integer,
"video_file_size_limit": fields.Integer,
"audio_file_size_limit": fields.Integer,
"file_size_limit": fields.Integer,
"workflow_file_upload_limit": fields.Integer,
}
parameters_fields = {
"opening_statement": fields.String,
"suggested_questions": fields.Raw,
"suggested_questions_after_answer": fields.Raw,
"speech_to_text": fields.Raw,
"text_to_speech": fields.Raw,
"retriever_resource": fields.Raw,
"annotation_reply": fields.Raw,
"more_like_this": fields.Raw,
"user_input_form": fields.Raw,
"sensitive_word_avoidance": fields.Raw,
"file_upload": fields.Raw,
"system_parameters": fields.Nested(parameters__system_parameters),
}

View File

@@ -0,0 +1,97 @@
import mimetypes
import os
import re
import urllib.parse
from collections.abc import Mapping
from typing import Any
from uuid import uuid4
import httpx
from pydantic import BaseModel
from configs import dify_config
class FileInfo(BaseModel):
filename: str
extension: str
mimetype: str
size: int
def guess_file_info_from_response(response: httpx.Response):
url = str(response.url)
# Try to extract filename from URL
parsed_url = urllib.parse.urlparse(url)
url_path = parsed_url.path
filename = os.path.basename(url_path)
# If filename couldn't be extracted, use Content-Disposition header
if not filename:
content_disposition = response.headers.get("Content-Disposition")
if content_disposition:
filename_match = re.search(r'filename="?(.+)"?', content_disposition)
if filename_match:
filename = filename_match.group(1)
# If still no filename, generate a unique one
if not filename:
unique_name = str(uuid4())
filename = f"{unique_name}"
# Guess MIME type from filename first, then URL
mimetype, _ = mimetypes.guess_type(filename)
if mimetype is None:
mimetype, _ = mimetypes.guess_type(url)
if mimetype is None:
# If guessing fails, use Content-Type from response headers
mimetype = response.headers.get("Content-Type", "application/octet-stream")
extension = os.path.splitext(filename)[1]
# Ensure filename has an extension
if not extension:
extension = mimetypes.guess_extension(mimetype) or ".bin"
filename = f"{filename}{extension}"
return FileInfo(
filename=filename,
extension=extension,
mimetype=mimetype,
size=int(response.headers.get("Content-Length", -1)),
)
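A usage sketch for the helper above with a hypothetical URL; any httpx response works, since only the response URL and headers are inspected.

import httpx

response = httpx.get("https://example.com/files/report.pdf", follow_redirects=True)
info = guess_file_info_from_response(response)
print(info.filename, info.extension, info.mimetype, info.size)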
def get_parameters_from_feature_dict(*, features_dict: Mapping[str, Any], user_input_form: list[dict[str, Any]]):
return {
"opening_statement": features_dict.get("opening_statement"),
"suggested_questions": features_dict.get("suggested_questions", []),
"suggested_questions_after_answer": features_dict.get("suggested_questions_after_answer", {"enabled": False}),
"speech_to_text": features_dict.get("speech_to_text", {"enabled": False}),
"text_to_speech": features_dict.get("text_to_speech", {"enabled": False}),
"retriever_resource": features_dict.get("retriever_resource", {"enabled": False}),
"annotation_reply": features_dict.get("annotation_reply", {"enabled": False}),
"more_like_this": features_dict.get("more_like_this", {"enabled": False}),
"user_input_form": user_input_form,
"sensitive_word_avoidance": features_dict.get(
"sensitive_word_avoidance", {"enabled": False, "type": "", "configs": []}
),
"file_upload": features_dict.get(
"file_upload",
{
"image": {
"enabled": False,
"number_limits": 3,
"detail": "high",
"transfer_methods": ["remote_url", "local_file"],
}
},
),
"system_parameters": {
"image_file_size_limit": dify_config.UPLOAD_IMAGE_FILE_SIZE_LIMIT,
"video_file_size_limit": dify_config.UPLOAD_VIDEO_FILE_SIZE_LIMIT,
"audio_file_size_limit": dify_config.UPLOAD_AUDIO_FILE_SIZE_LIMIT,
"file_size_limit": dify_config.UPLOAD_FILE_SIZE_LIMIT,
"workflow_file_upload_limit": dify_config.WORKFLOW_FILE_UPLOAD_LIMIT,
},
}

View File

@@ -2,9 +2,26 @@ from flask import Blueprint
from libs.external_api import ExternalApi
from .app.app_import import AppImportApi, AppImportConfirmApi
from .files import FileApi, FilePreviewApi, FileSupportTypeApi
from .remote_files import RemoteFileInfoApi, RemoteFileUploadApi
bp = Blueprint("console", __name__, url_prefix="/console/api")
api = ExternalApi(bp)
# File
api.add_resource(FileApi, "/files/upload")
api.add_resource(FilePreviewApi, "/files/<uuid:file_id>/preview")
api.add_resource(FileSupportTypeApi, "/files/support-type")
# Remote files
api.add_resource(RemoteFileInfoApi, "/remote-files/<path:url>")
api.add_resource(RemoteFileUploadApi, "/remote-files/upload")
# Import App
api.add_resource(AppImportApi, "/apps/imports")
api.add_resource(AppImportConfirmApi, "/apps/imports/<string:import_id>/confirm")
# Import other controllers
from . import admin, apikey, extension, feature, ping, setup, version
@@ -43,7 +60,6 @@ from .datasets import (
datasets_document,
datasets_segments,
external,
file,
hit_testing,
website,
)

View File

@@ -10,8 +10,7 @@ from models.dataset import Dataset
from models.model import ApiToken, App
from . import api
from .setup import setup_required
from .wraps import account_initialization_required
from .wraps import account_initialization_required, setup_required
api_key_fields = {
"id": fields.String,

View File

@@ -1,8 +1,7 @@
from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from controllers.console.wraps import account_initialization_required, setup_required
from libs.login import login_required
from services.advanced_prompt_template_service import AdvancedPromptTemplateService

View File

@@ -2,8 +2,7 @@ from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from controllers.console.wraps import account_initialization_required, setup_required
from libs.helper import uuid_value
from libs.login import login_required
from models.model import AppMode

View File

@@ -6,8 +6,11 @@ from werkzeug.exceptions import Forbidden
from controllers.console import api
from controllers.console.app.error import NoFileUploadedError
from controllers.console.datasets.error import TooManyFilesError
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required, cloud_edition_billing_resource_check
from controllers.console.wraps import (
account_initialization_required,
cloud_edition_billing_resource_check,
setup_required,
)
from extensions.ext_redis import redis_client
from fields.annotation_fields import (
annotation_fields,

View File

@@ -1,21 +1,30 @@
import uuid
from typing import cast
from flask_login import current_user
from flask_restful import Resource, inputs, marshal, marshal_with, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session
from werkzeug.exceptions import BadRequest, Forbidden, abort
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required, cloud_edition_billing_resource_check
from controllers.console.wraps import (
account_initialization_required,
cloud_edition_billing_resource_check,
enterprise_license_required,
setup_required,
)
from core.ops.ops_trace_manager import OpsTraceManager
from extensions.ext_database import db
from fields.app_fields import (
app_detail_fields,
app_detail_fields_with_site,
app_pagination_fields,
)
from libs.login import login_required
from services.app_dsl_service import AppDslService
from models import Account, App
from services.app_dsl_service import AppDslService, ImportMode
from services.app_service import AppService
ALLOW_CREATE_APP_MODES = ["chat", "agent-chat", "advanced-chat", "workflow", "completion"]
@@ -25,6 +34,7 @@ class AppListApi(Resource):
@setup_required
@login_required
@account_initialization_required
@enterprise_license_required
def get(self):
"""Get app list"""
@@ -87,65 +97,11 @@ class AppListApi(Resource):
return app, 201
class AppImportApi(Resource):
@setup_required
@login_required
@account_initialization_required
@marshal_with(app_detail_fields_with_site)
@cloud_edition_billing_resource_check("apps")
def post(self):
"""Import app"""
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("data", type=str, required=True, nullable=False, location="json")
parser.add_argument("name", type=str, location="json")
parser.add_argument("description", type=str, location="json")
parser.add_argument("icon_type", type=str, location="json")
parser.add_argument("icon", type=str, location="json")
parser.add_argument("icon_background", type=str, location="json")
args = parser.parse_args()
app = AppDslService.import_and_create_new_app(
tenant_id=current_user.current_tenant_id, data=args["data"], args=args, account=current_user
)
return app, 201
class AppImportFromUrlApi(Resource):
@setup_required
@login_required
@account_initialization_required
@marshal_with(app_detail_fields_with_site)
@cloud_edition_billing_resource_check("apps")
def post(self):
"""Import app from url"""
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("url", type=str, required=True, nullable=False, location="json")
parser.add_argument("name", type=str, location="json")
parser.add_argument("description", type=str, location="json")
parser.add_argument("icon", type=str, location="json")
parser.add_argument("icon_background", type=str, location="json")
args = parser.parse_args()
app = AppDslService.import_and_create_new_app_from_url(
tenant_id=current_user.current_tenant_id, url=args["url"], args=args, account=current_user
)
return app, 201
class AppApi(Resource):
@setup_required
@login_required
@account_initialization_required
@enterprise_license_required
@get_app_model
@marshal_with(app_detail_fields_with_site)
def get(self, app_model):
@@ -218,10 +174,24 @@ class AppCopyApi(Resource):
parser.add_argument("icon_background", type=str, location="json")
args = parser.parse_args()
data = AppDslService.export_dsl(app_model=app_model, include_secret=True)
app = AppDslService.import_and_create_new_app(
tenant_id=current_user.current_tenant_id, data=data, args=args, account=current_user
)
with Session(db.engine) as session:
import_service = AppDslService(session)
yaml_content = import_service.export_dsl(app_model=app_model, include_secret=True)
account = cast(Account, current_user)
result = import_service.import_app(
account=account,
import_mode=ImportMode.YAML_CONTENT.value,
yaml_content=yaml_content,
name=args.get("name"),
description=args.get("description"),
icon_type=args.get("icon_type"),
icon=args.get("icon"),
icon_background=args.get("icon_background"),
)
session.commit()
stmt = select(App).where(App.id == result.app_id)
app = session.scalar(stmt)
return app, 201
@@ -362,8 +332,6 @@ class AppTraceApi(Resource):
api.add_resource(AppListApi, "/apps")
api.add_resource(AppImportApi, "/apps/import")
api.add_resource(AppImportFromUrlApi, "/apps/import/url")
api.add_resource(AppApi, "/apps/<uuid:app_id>")
api.add_resource(AppCopyApi, "/apps/<uuid:app_id>/copy")
api.add_resource(AppExportApi, "/apps/<uuid:app_id>/export")

View File

@@ -0,0 +1,90 @@
from typing import cast
from flask_login import current_user
from flask_restful import Resource, marshal_with, reqparse
from sqlalchemy.orm import Session
from werkzeug.exceptions import Forbidden
from controllers.console.wraps import (
account_initialization_required,
setup_required,
)
from extensions.ext_database import db
from fields.app_fields import app_import_fields
from libs.login import login_required
from models import Account
from services.app_dsl_service import AppDslService, ImportStatus
class AppImportApi(Resource):
@setup_required
@login_required
@account_initialization_required
@marshal_with(app_import_fields)
def post(self):
# Check user role first
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("mode", type=str, required=True, location="json")
parser.add_argument("yaml_content", type=str, location="json")
parser.add_argument("yaml_url", type=str, location="json")
parser.add_argument("name", type=str, location="json")
parser.add_argument("description", type=str, location="json")
parser.add_argument("icon_type", type=str, location="json")
parser.add_argument("icon", type=str, location="json")
parser.add_argument("icon_background", type=str, location="json")
parser.add_argument("app_id", type=str, location="json")
args = parser.parse_args()
# Create service with session
with Session(db.engine) as session:
import_service = AppDslService(session)
# Import app
account = cast(Account, current_user)
result = import_service.import_app(
account=account,
import_mode=args["mode"],
yaml_content=args.get("yaml_content"),
yaml_url=args.get("yaml_url"),
name=args.get("name"),
description=args.get("description"),
icon_type=args.get("icon_type"),
icon=args.get("icon"),
icon_background=args.get("icon_background"),
app_id=args.get("app_id"),
)
session.commit()
# Return appropriate status code based on result
status = result.status
if status == ImportStatus.FAILED.value:
return result.model_dump(mode="json"), 400
elif status == ImportStatus.PENDING.value:
return result.model_dump(mode="json"), 202
return result.model_dump(mode="json"), 200
class AppImportConfirmApi(Resource):
@setup_required
@login_required
@account_initialization_required
@marshal_with(app_import_fields)
def post(self, import_id):
# Check user role first
if not current_user.is_editor:
raise Forbidden()
# Create service with session
with Session(db.engine) as session:
import_service = AppDslService(session)
# Confirm import
account = cast(Account, current_user)
result = import_service.confirm_import(import_id=import_id, account=account)
session.commit()
# Return appropriate status code based on result
if result.status == ImportStatus.FAILED.value:
return result.model_dump(mode="json"), 400
return result.model_dump(mode="json"), 200

View File

@@ -18,8 +18,7 @@ from controllers.console.app.error import (
UnsupportedAudioTypeError,
)
from controllers.console.app.wraps import get_app_model
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from controllers.console.wraps import account_initialization_required, setup_required
from core.errors.error import ModelCurrentlyNotSupportError, ProviderTokenNotInitError, QuotaExceededError
from core.model_runtime.errors.invoke import InvokeError
from libs.login import login_required
@@ -71,7 +70,7 @@ class ChatMessageAudioApi(Resource):
except ValueError as e:
raise e
except Exception as e:
logging.exception(f"internal server error, {str(e)}.")
logging.exception("Failed to handle post request to ChatMessageAudioApi")
raise InternalServerError()
@@ -129,7 +128,7 @@ class ChatMessageTextApi(Resource):
except ValueError as e:
raise e
except Exception as e:
logging.exception(f"internal server error, {str(e)}.")
logging.exception("Failed to handle post request to ChatMessageTextApi")
raise InternalServerError()
@@ -171,7 +170,7 @@ class TextModesApi(Resource):
except ValueError as e:
raise e
except Exception as e:
logging.exception(f"internal server error, {str(e)}.")
logging.exception("Failed to handle get request to TextModesApi")
raise InternalServerError()

View File

@@ -15,8 +15,7 @@ from controllers.console.app.error import (
ProviderQuotaExceededError,
)
from controllers.console.app.wraps import get_app_model
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.web.error import InvokeRateLimitError as InvokeRateLimitHttpError
from core.app.apps.base_app_queue_manager import AppQueueManager
from core.app.entities.app_invoke_entities import InvokeFrom

View File

@@ -1,4 +1,4 @@
from datetime import datetime, timezone
from datetime import UTC, datetime
import pytz
from flask_login import current_user
@@ -10,8 +10,7 @@ from werkzeug.exceptions import Forbidden, NotFound
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from controllers.console.wraps import account_initialization_required, setup_required
from core.app.entities.app_invoke_entities import InvokeFrom
from extensions.ext_database import db
from fields.conversation_fields import (
@@ -315,7 +314,7 @@ def _get_conversation(app_model, conversation_id):
raise NotFound("Conversation Not Exists.")
if not conversation.read_at:
conversation.read_at = datetime.now(timezone.utc).replace(tzinfo=None)
conversation.read_at = datetime.now(UTC).replace(tzinfo=None)
conversation.read_account_id = current_user.id
db.session.commit()

View File

@@ -4,8 +4,7 @@ from sqlalchemy.orm import Session
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from controllers.console.wraps import account_initialization_required, setup_required
from extensions.ext_database import db
from fields.conversation_variable_fields import paginated_conversation_variable_fields
from libs.login import login_required

View File

@@ -10,8 +10,7 @@ from controllers.console.app.error import (
ProviderNotInitializeError,
ProviderQuotaExceededError,
)
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from controllers.console.wraps import account_initialization_required, setup_required
from core.errors.error import ModelCurrentlyNotSupportError, ProviderTokenNotInitError, QuotaExceededError
from core.llm_generator.llm_generator import LLMGenerator
from core.model_runtime.errors.invoke import InvokeError

View File

@@ -14,8 +14,11 @@ from controllers.console.app.error import (
)
from controllers.console.app.wraps import get_app_model
from controllers.console.explore.error import AppSuggestedQuestionsAfterAnswerDisabledError
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required, cloud_edition_billing_resource_check
from controllers.console.wraps import (
account_initialization_required,
cloud_edition_billing_resource_check,
setup_required,
)
from core.app.entities.app_invoke_entities import InvokeFrom
from core.errors.error import ModelCurrentlyNotSupportError, ProviderTokenNotInitError, QuotaExceededError
from core.model_runtime.errors.invoke import InvokeError

View File

@@ -6,8 +6,7 @@ from flask_restful import Resource
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from controllers.console.wraps import account_initialization_required, setup_required
from core.agent.entities import AgentToolEntity
from core.tools.tool_manager import ToolManager
from core.tools.utils.configuration import ToolParameterConfigurationManager
@@ -66,7 +65,7 @@ class ModelConfigResource(Resource):
provider_type=agent_tool_entity.provider_type,
identity_id=f"AGENT.{app_model.id}",
)
except Exception as e:
except Exception:
continue
# get decrypted parameters
@@ -98,7 +97,7 @@ class ModelConfigResource(Resource):
app_id=app_model.id,
agent_tool=agent_tool_entity,
)
except Exception as e:
except Exception:
continue
manager = ToolParameterConfigurationManager(

View File

@@ -1,9 +1,9 @@
from flask_restful import Resource, reqparse
from werkzeug.exceptions import BadRequest
from controllers.console import api
from controllers.console.app.error import TracingConfigCheckError, TracingConfigIsExist, TracingConfigNotExist
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from controllers.console.wraps import account_initialization_required, setup_required
from libs.login import login_required
from services.ops_service import OpsService
@@ -27,7 +27,7 @@ class TraceAppConfigApi(Resource):
return {"has_not_configured": True}
return trace_config
except Exception as e:
raise e
raise BadRequest(str(e))
@setup_required
@login_required
@@ -49,7 +49,7 @@ class TraceAppConfigApi(Resource):
raise TracingConfigCheckError()
return result
except Exception as e:
raise e
raise BadRequest(str(e))
@setup_required
@login_required
@@ -69,7 +69,7 @@ class TraceAppConfigApi(Resource):
raise TracingConfigNotExist()
return {"result": "success"}
except Exception as e:
raise e
raise BadRequest(str(e))
@setup_required
@login_required
@@ -86,7 +86,7 @@ class TraceAppConfigApi(Resource):
raise TracingConfigNotExist()
return {"result": "success"}
except Exception as e:
raise e
raise BadRequest(str(e))
api.add_resource(TraceAppConfigApi, "/apps/<uuid:app_id>/trace-config")

View File

@@ -1,4 +1,4 @@
from datetime import datetime, timezone
from datetime import UTC, datetime
from flask_login import current_user
from flask_restful import Resource, marshal_with, reqparse
@@ -7,8 +7,7 @@ from werkzeug.exceptions import Forbidden, NotFound
from constants.languages import supported_language
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from controllers.console.wraps import account_initialization_required, setup_required
from extensions.ext_database import db
from fields.app_fields import app_site_fields
from libs.login import login_required
@@ -76,7 +75,7 @@ class AppSite(Resource):
setattr(site, attr_name, value)
site.updated_by = current_user.id
site.updated_at = datetime.now(timezone.utc).replace(tzinfo=None)
site.updated_at = datetime.now(UTC).replace(tzinfo=None)
db.session.commit()
return site
@@ -100,7 +99,7 @@ class AppSiteAccessTokenReset(Resource):
site.code = Site.generate_code(16)
site.updated_by = current_user.id
site.updated_at = datetime.now(timezone.utc).replace(tzinfo=None)
site.updated_at = datetime.now(UTC).replace(tzinfo=None)
db.session.commit()
return site

View File

@@ -8,8 +8,7 @@ from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from controllers.console.wraps import account_initialization_required, setup_required
from extensions.ext_database import db
from libs.helper import DatetimeString
from libs.login import login_required

View File

@@ -9,8 +9,7 @@ import services
from controllers.console import api
from controllers.console.app.error import ConversationCompletedError, DraftWorkflowNotExist, DraftWorkflowNotSync
from controllers.console.app.wraps import get_app_model
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from controllers.console.wraps import account_initialization_required, setup_required
from core.app.apps.base_app_queue_manager import AppQueueManager
from core.app.entities.app_invoke_entities import InvokeFrom
from factories import variable_factory
@@ -21,7 +20,6 @@ from libs.helper import TimestampField, uuid_value
from libs.login import current_user, login_required
from models import App
from models.model import AppMode
from services.app_dsl_service import AppDslService
from services.app_generate_service import AppGenerateService
from services.errors.app import WorkflowHashNotEqualError
from services.workflow_service import WorkflowService
@@ -102,11 +100,11 @@ class DraftWorkflowApi(Resource):
try:
environment_variables_list = args.get("environment_variables") or []
environment_variables = [
variable_factory.build_variable_from_mapping(obj) for obj in environment_variables_list
variable_factory.build_environment_variable_from_mapping(obj) for obj in environment_variables_list
]
conversation_variables_list = args.get("conversation_variables") or []
conversation_variables = [
variable_factory.build_variable_from_mapping(obj) for obj in conversation_variables_list
variable_factory.build_conversation_variable_from_mapping(obj) for obj in conversation_variables_list
]
workflow = workflow_service.sync_draft_workflow(
app_model=app_model,
@@ -127,31 +125,6 @@ class DraftWorkflowApi(Resource):
}
class DraftWorkflowImportApi(Resource):
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.ADVANCED_CHAT, AppMode.WORKFLOW])
@marshal_with(workflow_fields)
def post(self, app_model: App):
"""
Import draft workflow
"""
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("data", type=str, required=True, nullable=False, location="json")
args = parser.parse_args()
workflow = AppDslService.import_and_overwrite_workflow(
app_model=app_model, data=args["data"], account=current_user
)
return workflow
class AdvancedChatDraftWorkflowRunApi(Resource):
@setup_required
@login_required
@@ -409,7 +382,7 @@ class DefaultBlockConfigApi(Resource):
filters = None
if args.get("q"):
try:
filters = json.loads(args.get("q"))
filters = json.loads(args.get("q", ""))
except json.JSONDecodeError:
raise ValueError("Invalid filters")
@@ -454,7 +427,6 @@ class ConvertToWorkflowApi(Resource):
api.add_resource(DraftWorkflowApi, "/apps/<uuid:app_id>/workflows/draft")
api.add_resource(DraftWorkflowImportApi, "/apps/<uuid:app_id>/workflows/draft/import")
api.add_resource(AdvancedChatDraftWorkflowRunApi, "/apps/<uuid:app_id>/advanced-chat/workflows/draft/run")
api.add_resource(DraftWorkflowRunApi, "/apps/<uuid:app_id>/workflows/draft/run")
api.add_resource(WorkflowTaskStopApi, "/apps/<uuid:app_id>/workflow-runs/tasks/<string:task_id>/stop")

View File

@@ -3,8 +3,7 @@ from flask_restful.inputs import int_range
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from controllers.console.wraps import account_initialization_required, setup_required
from fields.workflow_app_log_fields import workflow_app_log_pagination_fields
from libs.login import login_required
from models import App

View File

@@ -3,8 +3,7 @@ from flask_restful.inputs import int_range
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from controllers.console.wraps import account_initialization_required, setup_required
from fields.workflow_run_fields import (
advanced_chat_workflow_run_pagination_fields,
workflow_run_detail_fields,

View File

@@ -8,8 +8,7 @@ from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from controllers.console.wraps import account_initialization_required, setup_required
from extensions.ext_database import db
from libs.helper import DatetimeString
from libs.login import login_required

View File

@@ -65,7 +65,7 @@ class ActivateApi(Resource):
account.timezone = args["timezone"]
account.interface_theme = "light"
account.status = AccountStatus.ACTIVE.value
account.initialized_at = datetime.datetime.now(datetime.timezone.utc).replace(tzinfo=None)
account.initialized_at = datetime.datetime.now(datetime.UTC).replace(tzinfo=None)
db.session.commit()
token_pair = AccountService.login(account, ip_address=extract_remote_ip(request))

View File

@@ -7,8 +7,7 @@ from controllers.console.auth.error import ApiKeyAuthFailedError
from libs.login import login_required
from services.auth.api_key_auth_service import ApiKeyAuthService
from ..setup import setup_required
from ..wraps import account_initialization_required
from ..wraps import account_initialization_required, setup_required
class ApiKeyAuthDataSource(Resource):

View File

@@ -11,8 +11,7 @@ from controllers.console import api
from libs.login import login_required
from libs.oauth_data_source import NotionOAuth
from ..setup import setup_required
from ..wraps import account_initialization_required
from ..wraps import account_initialization_required, setup_required
def get_oauth_providers():
@@ -35,7 +34,6 @@ class OAuthDataSource(Resource):
OAUTH_DATASOURCE_PROVIDERS = get_oauth_providers()
with current_app.app_context():
oauth_provider = OAUTH_DATASOURCE_PROVIDERS.get(provider)
print(vars(oauth_provider))
if not oauth_provider:
return {"error": "Invalid provider"}, 400
if dify_config.NOTION_INTEGRATION_TYPE == "internal":

View File
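Besides consolidating the setup_required import, the hunks above drop a leftover print(vars(oauth_provider)) debug statement; it ran before the None check, so an unknown provider would have raised TypeError from vars(None) instead of returning the 400 response. If that output is still wanted, a module-level logger at DEBUG level is the usual replacement, sketched below (the lookup function is illustrative, not the real controller):

import logging

logger = logging.getLogger(__name__)


def resolve_oauth_provider(providers: dict, name: str):
    provider = providers.get(name)
    # %r handles None safely, and the record is dropped unless DEBUG logging is enabled.
    logger.debug("resolved oauth provider %r -> %r", name, provider)
    if not provider:
        return {"error": "Invalid provider"}, 400
    return provider


if __name__ == "__main__":
    logging.basicConfig(level=logging.DEBUG)
    print(resolve_oauth_provider({"notion": object()}, "github"))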

@@ -12,8 +12,8 @@ from controllers.console.auth.error import (
InvalidTokenError,
PasswordMismatchError,
)
-from controllers.console.error import EmailSendIpLimitError, NotAllowedRegister
-from controllers.console.setup import setup_required
+from controllers.console.error import AccountNotFound, EmailSendIpLimitError
+from controllers.console.wraps import setup_required
from events.tenant_event import tenant_was_created
from extensions.ext_database import db
from libs.helper import email, extract_remote_ip
@@ -48,7 +48,7 @@ class ForgotPasswordSendEmailApi(Resource):
token = AccountService.send_reset_password_email(email=args["email"], language=language)
return {"result": "fail", "data": token, "code": "account_not_found"}
else:
-raise NotAllowedRegister()
+raise AccountNotFound()
else:
token = AccountService.send_reset_password_email(account=account, email=args["email"], language=language)

View File
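The forgot-password controller (and the login controller in the next file) now raises AccountNotFound instead of NotAllowedRegister when the address has no account and self-registration is disabled, which names the error actually being reported. A condensed sketch of that branch using the names visible in the hunk; the exception and the service call are stand-ins, not the real AccountService/FeatureService implementations:

class AccountNotFound(Exception):
    """No account exists for this e-mail address."""


def send_reset_password_email(email: str, account=None, language: str = "en-US") -> str:
    return f"reset-token:{email}"  # placeholder for the AccountService call


def forgot_password(account, email: str, allow_register: bool, language: str = "en-US") -> dict:
    if account is None:
        if allow_register:
            token = send_reset_password_email(email=email, language=language)
            return {"result": "fail", "data": token, "code": "account_not_found"}
        # Previously NotAllowedRegister; AccountNotFound states the real condition.
        raise AccountNotFound()
    token = send_reset_password_email(account=account, email=email, language=language)
    return {"result": "success", "data": token}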

@@ -16,11 +16,11 @@ from controllers.console.auth.error import (
)
from controllers.console.error import (
AccountBannedError,
+AccountNotFound,
EmailSendIpLimitError,
NotAllowedCreateWorkspace,
-NotAllowedRegister,
)
-from controllers.console.setup import setup_required
+from controllers.console.wraps import setup_required
from events.tenant_event import tenant_was_created
from libs.helper import email, extract_remote_ip
from libs.password import valid_password
@@ -76,7 +76,7 @@ class LoginApi(Resource):
token = AccountService.send_reset_password_email(email=args["email"], language=language)
return {"result": "fail", "data": token, "code": "account_not_found"}
else:
-raise NotAllowedRegister()
+raise AccountNotFound()
# SELF_HOSTED only have one workspace
tenants = TenantService.get_join_tenants(account)
if len(tenants) == 0:
@@ -119,7 +119,7 @@ class ResetPasswordSendEmailApi(Resource):
if FeatureService.get_system_features().is_allow_register:
token = AccountService.send_reset_password_email(email=args["email"], language=language)
else:
-raise NotAllowedRegister()
+raise AccountNotFound()
else:
token = AccountService.send_reset_password_email(account=account, language=language)
@@ -148,7 +148,7 @@ class EmailCodeLoginSendEmailApi(Resource):
if FeatureService.get_system_features().is_allow_register:
token = AccountService.send_email_code_login_email(email=args["email"], language=language)
else:
-raise NotAllowedRegister()
+raise AccountNotFound()
else:
token = AccountService.send_email_code_login_email(account=account, language=language)

View File

@@ -1,5 +1,5 @@
import logging
-from datetime import datetime, timezone
+from datetime import UTC, datetime
from typing import Optional
import requests
@@ -52,7 +52,6 @@ class OAuthLogin(Resource):
OAUTH_PROVIDERS = get_oauth_providers()
with current_app.app_context():
oauth_provider = OAUTH_PROVIDERS.get(provider)
-print(vars(oauth_provider))
if not oauth_provider:
return {"error": "Invalid provider"}, 400
@@ -106,7 +105,7 @@ class OAuthCallback(Resource):
if account.status == AccountStatus.PENDING.value:
account.status = AccountStatus.ACTIVE.value
-account.initialized_at = datetime.now(timezone.utc).replace(tzinfo=None)
+account.initialized_at = datetime.now(UTC).replace(tzinfo=None)
db.session.commit()
try:

View File

@@ -2,8 +2,7 @@ from flask_login import current_user
from flask_restful import Resource, reqparse
from controllers.console import api
-from controllers.console.setup import setup_required
-from controllers.console.wraps import account_initialization_required, only_edition_cloud
+from controllers.console.wraps import account_initialization_required, only_edition_cloud, setup_required
from libs.login import login_required
from services.billing_service import BillingService

View File

@@ -7,8 +7,7 @@ from flask_restful import Resource, marshal_with, reqparse
from werkzeug.exceptions import NotFound
from controllers.console import api
-from controllers.console.setup import setup_required
-from controllers.console.wraps import account_initialization_required
+from controllers.console.wraps import account_initialization_required, setup_required
from core.indexing_runner import IndexingRunner
from core.rag.extractor.entity.extract_setting import ExtractSetting
from core.rag.extractor.notion_extractor import NotionExtractor
@@ -84,7 +83,7 @@ class DataSourceApi(Resource):
if action == "enable":
if data_source_binding.disabled:
data_source_binding.disabled = False
-data_source_binding.updated_at = datetime.datetime.now(datetime.timezone.utc).replace(tzinfo=None)
+data_source_binding.updated_at = datetime.datetime.now(datetime.UTC).replace(tzinfo=None)
db.session.add(data_source_binding)
db.session.commit()
else:
@@ -93,7 +92,7 @@ class DataSourceApi(Resource):
if action == "disable":
if not data_source_binding.disabled:
data_source_binding.disabled = True
-data_source_binding.updated_at = datetime.datetime.now(datetime.timezone.utc).replace(tzinfo=None)
+data_source_binding.updated_at = datetime.datetime.now(datetime.UTC).replace(tzinfo=None)
db.session.add(data_source_binding)
db.session.commit()
else:

View File

@@ -10,8 +10,7 @@ from controllers.console import api
from controllers.console.apikey import api_key_fields, api_key_list
from controllers.console.app.error import ProviderNotInitializeError
from controllers.console.datasets.error import DatasetInUseError, DatasetNameDuplicateError, IndexingEstimateError
-from controllers.console.setup import setup_required
-from controllers.console.wraps import account_initialization_required
+from controllers.console.wraps import account_initialization_required, enterprise_license_required, setup_required
from core.errors.error import LLMBadRequestError, ProviderTokenNotInitError
from core.indexing_runner import IndexingRunner
from core.model_runtime.entities.model_entities import ModelType
@@ -45,6 +44,7 @@ class DatasetListApi(Resource):
@setup_required
@login_required
@account_initialization_required
+@enterprise_license_required
def get(self):
page = request.args.get("page", default=1, type=int)
limit = request.args.get("limit", default=20, type=int)
@@ -457,7 +457,7 @@ class DatasetIndexingEstimateApi(Resource):
)
except LLMBadRequestError:
raise ProviderNotInitializeError(
"No Embedding Model available. Please configure a valid provider in the Settings -> Model Provider."
"No Embedding Model available. Please configure a valid provider " "in the Settings -> Model Provider."
)
except ProviderTokenNotInitError as ex:
raise ProviderNotInitializeError(ex.description)
@@ -621,6 +621,7 @@ class DatasetRetrievalSettingApi(Resource):
case (
VectorType.MILVUS
| VectorType.RELYT
+| VectorType.PGVECTOR
| VectorType.TIDB_VECTOR
| VectorType.CHROMA
| VectorType.TENCENT
@@ -628,6 +629,7 @@ class DatasetRetrievalSettingApi(Resource):
| VectorType.BAIDU
| VectorType.VIKINGDB
| VectorType.UPSTASH
+| VectorType.OCEANBASE
):
return {"retrieval_method": [RetrievalMethod.SEMANTIC_SEARCH.value]}
case (
@@ -640,6 +642,8 @@ class DatasetRetrievalSettingApi(Resource):
| VectorType.ELASTICSEARCH
| VectorType.PGVECTOR
| VectorType.TIDB_ON_QDRANT
+| VectorType.LINDORM
+| VectorType.COUCHBASE
):
return {
"retrieval_method": [
@@ -668,6 +672,7 @@ class DatasetRetrievalSettingMockApi(Resource):
| VectorType.BAIDU
| VectorType.VIKINGDB
| VectorType.UPSTASH
+| VectorType.OCEANBASE
):
return {"retrieval_method": [RetrievalMethod.SEMANTIC_SEARCH.value]}
case (
@@ -678,7 +683,9 @@ class DatasetRetrievalSettingMockApi(Resource):
| VectorType.MYSCALE
| VectorType.ORACLE
| VectorType.ELASTICSEARCH
+| VectorType.COUCHBASE
| VectorType.PGVECTOR
+| VectorType.LINDORM
):
return {
"retrieval_method": [

View File
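The retrieval-setting endpoints above rely on Python 3.10 structural pattern matching: one case arm can match several VectorType members through | (or) patterns, and arms are tried top to bottom, so the first arm that lists a member decides its retrieval methods. The additions slot the new back ends (PGVECTOR, OCEANBASE, LINDORM, COUCHBASE) into those arms. A minimal sketch of the technique with a stand-in enum; the grouping below is arbitrary and not a statement about what each real vector store supports:

from enum import Enum


class VectorType(str, Enum):
    MILVUS = "milvus"
    OCEANBASE = "oceanbase"
    LINDORM = "lindorm"
    COUCHBASE = "couchbase"


def retrieval_methods(vector_type: VectorType) -> list[str]:
    match vector_type:
        # One arm can cover several enum members via | patterns.
        case VectorType.MILVUS | VectorType.OCEANBASE:
            return ["semantic_search"]
        case VectorType.LINDORM | VectorType.COUCHBASE:
            return ["semantic_search", "full_text_search", "hybrid_search"]
        case _:
            raise ValueError(f"Unsupported vector db type {vector_type}.")


print(retrieval_methods(VectorType.LINDORM))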

@@ -1,6 +1,6 @@
import logging
from argparse import ArgumentTypeError
-from datetime import datetime, timezone
+from datetime import UTC, datetime
from flask import request
from flask_login import current_user
@@ -24,8 +24,11 @@ from controllers.console.datasets.error import (
InvalidActionError,
InvalidMetadataError,
)
-from controllers.console.setup import setup_required
-from controllers.console.wraps import account_initialization_required, cloud_edition_billing_resource_check
+from controllers.console.wraps import (
+account_initialization_required,
+cloud_edition_billing_resource_check,
+setup_required,
+)
from core.errors.error import (
LLMBadRequestError,
ModelCurrentlyNotSupportError,
@@ -103,6 +106,7 @@ class GetProcessRuleApi(Resource):
# get default rules
mode = DocumentService.DEFAULT_RULES["mode"]
rules = DocumentService.DEFAULT_RULES["rules"]
+limits = DocumentService.DEFAULT_RULES["limits"]
if document_id:
# get the latest process rule
document = Document.query.get_or_404(document_id)
@@ -129,7 +133,7 @@ class GetProcessRuleApi(Resource):
mode = dataset_process_rule.mode
rules = dataset_process_rule.rules_dict
return {"mode": mode, "rules": rules}
return {"mode": mode, "rules": rules, "limits": limits}
class DatasetDocumentListApi(Resource):
@@ -314,8 +318,11 @@ class DatasetInitApi(Resource):
raise ValueError("embedding model and embedding model provider are required for high quality indexing.")
try:
model_manager = ModelManager()
-model_manager.get_default_model_instance(
-tenant_id=current_user.current_tenant_id, model_type=ModelType.TEXT_EMBEDDING
+model_manager.get_model_instance(
+tenant_id=current_user.current_tenant_id,
+provider=args["embedding_model_provider"],
+model_type=ModelType.TEXT_EMBEDDING,
+model=args["embedding_model"],
)
except InvokeAuthorizationError:
raise ProviderNotInitializeError(
@@ -659,7 +666,7 @@ class DocumentProcessingApi(DocumentResource):
raise InvalidActionError("Document not in indexing state.")
document.paused_by = current_user.id
-document.paused_at = datetime.now(timezone.utc).replace(tzinfo=None)
+document.paused_at = datetime.now(UTC).replace(tzinfo=None)
document.is_paused = True
db.session.commit()
@@ -739,7 +746,7 @@ class DocumentMetadataApi(DocumentResource):
document.doc_metadata[key] = value
document.doc_type = doc_type
-document.updated_at = datetime.now(timezone.utc).replace(tzinfo=None)
+document.updated_at = datetime.now(UTC).replace(tzinfo=None)
db.session.commit()
return {"result": "success", "message": "Document metadata updated."}, 200
@@ -781,7 +788,7 @@ class DocumentStatusApi(DocumentResource):
document.enabled = True
document.disabled_at = None
document.disabled_by = None
-document.updated_at = datetime.now(timezone.utc).replace(tzinfo=None)
+document.updated_at = datetime.now(UTC).replace(tzinfo=None)
db.session.commit()
# Set cache to prevent indexing the same document multiple times
@@ -798,9 +805,9 @@ class DocumentStatusApi(DocumentResource):
raise InvalidActionError("Document already disabled.")
document.enabled = False
-document.disabled_at = datetime.now(timezone.utc).replace(tzinfo=None)
+document.disabled_at = datetime.now(UTC).replace(tzinfo=None)
document.disabled_by = current_user.id
-document.updated_at = datetime.now(timezone.utc).replace(tzinfo=None)
+document.updated_at = datetime.now(UTC).replace(tzinfo=None)
db.session.commit()
# Set cache to prevent indexing the same document multiple times
@@ -815,9 +822,9 @@ class DocumentStatusApi(DocumentResource):
raise InvalidActionError("Document already archived.")
document.archived = True
-document.archived_at = datetime.now(timezone.utc).replace(tzinfo=None)
+document.archived_at = datetime.now(UTC).replace(tzinfo=None)
document.archived_by = current_user.id
-document.updated_at = datetime.now(timezone.utc).replace(tzinfo=None)
+document.updated_at = datetime.now(UTC).replace(tzinfo=None)
db.session.commit()
if document.enabled:
@@ -834,7 +841,7 @@ class DocumentStatusApi(DocumentResource):
document.archived = False
document.archived_at = None
document.archived_by = None
-document.updated_at = datetime.now(timezone.utc).replace(tzinfo=None)
+document.updated_at = datetime.now(UTC).replace(tzinfo=None)
db.session.commit()
# Set cache to prevent indexing the same document multiple times
@@ -941,8 +948,8 @@ class DocumentRetryApi(DocumentResource):
if document.indexing_status == "completed":
raise DocumentAlreadyFinishedError()
retry_documents.append(document)
-except Exception as e:
-logging.error(f"Document {document_id} retry failed: {str(e)}")
+except Exception:
+logging.exception(f"Failed to retry document, document id: {document_id}")
continue
# retry document
DocumentService.retry_document(dataset_id, retry_documents)

View File
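The document-retry hunk at the end of the file above switches from logging.error(f"... {str(e)}") to logging.exception(...). Called inside an except block, logging.exception logs at ERROR level and appends the active traceback automatically, so the handler no longer needs to bind the exception with "as e". A small sketch of the difference (the retry logic is a stand-in):

import logging

logging.basicConfig(level=logging.INFO)


def retry_documents(document_ids: list[str]) -> list[str]:
    retried = []
    for document_id in document_ids:
        try:
            if document_id == "bad-id":
                raise ValueError("document is not retryable")
            retried.append(document_id)
        except Exception:
            # logging.error(f"...{e}") recorded only the message;
            # logging.exception also records the full traceback.
            logging.exception(f"Failed to retry document, document id: {document_id}")
            continue
    return retried


print(retry_documents(["ok-1", "bad-id", "ok-2"]))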

@@ -1,5 +1,5 @@
import uuid
-from datetime import datetime, timezone
+from datetime import UTC, datetime
import pandas as pd
from flask import request
@@ -11,11 +11,11 @@ import services
from controllers.console import api
from controllers.console.app.error import ProviderNotInitializeError
from controllers.console.datasets.error import InvalidActionError, NoFileUploadedError, TooManyFilesError
-from controllers.console.setup import setup_required
from controllers.console.wraps import (
account_initialization_required,
cloud_edition_billing_knowledge_limit_check,
cloud_edition_billing_resource_check,
+setup_required,
)
from core.errors.error import LLMBadRequestError, ProviderTokenNotInitError
from core.model_manager import ModelManager
@@ -188,7 +188,7 @@ class DatasetDocumentSegmentApi(Resource):
raise InvalidActionError("Segment is already disabled.")
segment.enabled = False
-segment.disabled_at = datetime.now(timezone.utc).replace(tzinfo=None)
+segment.disabled_at = datetime.now(UTC).replace(tzinfo=None)
segment.disabled_by = current_user.id
db.session.commit()

View File

@@ -6,8 +6,7 @@ from werkzeug.exceptions import Forbidden, InternalServerError, NotFound
import services
from controllers.console import api
from controllers.console.datasets.error import DatasetNameDuplicateError
-from controllers.console.setup import setup_required
-from controllers.console.wraps import account_initialization_required
+from controllers.console.wraps import account_initialization_required, setup_required
from fields.dataset_fields import dataset_detail_fields
from libs.login import login_required
from services.dataset_service import DatasetService

View File

@@ -2,8 +2,7 @@ from flask_restful import Resource
from controllers.console import api
from controllers.console.datasets.hit_testing_base import DatasetsHitTestingBase
-from controllers.console.setup import setup_required
-from controllers.console.wraps import account_initialization_required
+from controllers.console.wraps import account_initialization_required, setup_required
from libs.login import login_required

Some files were not shown because too many files have changed in this diff.