Compare commits

..

317 Commits

Author SHA1 Message Date
-LAN-
24d06eb741 Merge branch 'feat/enhance-multi-modal-support' into release/0.10.0-beta 2024-10-15 15:34:42 +08:00
-LAN-
a36ef8430e refactor(core): improve type annotations and file handling consistency
- Use more precise type annotations with Sequence and Mapping for task entities.
- Ensure raw_prompt is assigned properly after replacement in advanced prompt transform.
- Remove unused generator return type from _fetch_context method.
- Refactor tool node file handling to retrieve more comprehensive file attributes, ensuring file existence validation in the database.
2024-10-15 15:34:14 +08:00
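A minimal sketch of the annotation tightening described in the commit above, using a hypothetical entity rather than the real task entities:

```python
from collections.abc import Mapping, Sequence
from dataclasses import dataclass, field


@dataclass(frozen=True)
class TaskEntity:
    # Hypothetical example: annotate with the abstract Sequence/Mapping types
    # instead of concrete list/dict, so callers may pass tuples, read-only
    # mappings, etc., and the entity does not promise mutability.
    inputs: Mapping[str, str] = field(default_factory=dict)
    files: Sequence[str] = ()
```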
-LAN-
6e9129a333 Update build-push.yml to include release/0.10.0-beta branch in the workflow trigger 2024-10-15 14:24:27 +08:00
-LAN-
86427468fd Update version to 0.10.0-beta3 in packaging, docker, and web 2024-10-15 14:23:02 +08:00
-LAN-
638d9250e4 Merge branch 'feat/enhance-multi-modal-support' into release/0.10.0-beta 2024-10-15 12:01:16 +08:00
-LAN-
dba67cd87a fix(memory): filter non-image file types in prompt message content
- Skip non-image files when converting file objects to prompt message content.
- Ensures only image files are processed, improving the accuracy and relevance of prompt messages.
2024-10-15 12:01:00 +08:00
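A rough illustration of the filtering rule from the commit above; `FileType` and the file attributes here are hypothetical stand-ins, not the actual Dify classes:

```python
from enum import Enum


class FileType(str, Enum):
    IMAGE = "image"
    DOCUMENT = "document"
    AUDIO = "audio"


def files_to_prompt_contents(files):
    """Convert file objects to prompt message contents, skipping non-images."""
    contents = []
    for file in files:
        if file.type != FileType.IMAGE:
            # Only image files are valid vision inputs for the prompt.
            continue
        contents.append({"type": "image_url", "url": file.url})
    return contents
```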
-LAN-
c3811c4c23 Merge branch 'feat/attachments' into release/0.10.0-beta 2024-10-15 11:29:54 +08:00
Joel
ff7a467a1a fix: iteration does not support file item 2024-10-15 11:20:52 +08:00
-LAN-
c73b5e2410 Merge branch 'feat/attachments' into release/0.10.0-beta 2024-10-15 11:05:45 +08:00
-LAN-
421fde0e85 Merge branch 'feat/enhance-multi-modal-support' into release/0.10.0-beta 2024-10-15 11:05:41 +08:00
-LAN-
b3fdd618a1 refactor(core): simplify role handling and improve usability
- Replaced explicit string usage with `CreatedByRole` enum for better maintainability.
- Removed duplicate `CreatedByRole` class definition, improving codebase consistency.
- Increased file number limits from 6 to 10 to allow more file uploads.
- Transitioned `AppMode` to a string enum for consistent type usage.
- Refactored `extract_thread_messages` function argument for flexibility.
- Removed file extension limitation in file service to support custom extensions.
- Improved enum import statements across multiple modules for clarity and consistency.
2024-10-15 11:04:28 +08:00
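For readers unfamiliar with the pattern, replacing raw strings with string-valued enums looks roughly like this; the member values are illustrative, not the exact Dify definitions:

```python
from enum import Enum


class CreatedByRole(str, Enum):
    ACCOUNT = "account"
    END_USER = "end_user"


class AppMode(str, Enum):
    CHAT = "chat"
    WORKFLOW = "workflow"


# Because the enums subclass str, members compare equal to plain strings,
# so existing string comparisons and JSON serialization keep working.
assert CreatedByRole.ACCOUNT == "account"
assert AppMode("workflow") is AppMode.WORKFLOW
```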
StyleZhang
81e57bcc73 fix: remove image item 2024-10-15 10:53:27 +08:00
-LAN-
7156753a99 Merge branch 'feat/attachments' into release/0.10.0-beta 2024-10-14 23:57:44 +08:00
-LAN-
0d310b503b refactor(prompt): improve handling of variable templates in advanced prompt transform 2024-10-14 16:47:03 +08:00
-LAN-
03018823d8 fix(workflow): handle special values for process data consistently
- Apply `handle_special_values` to `process_data` in workflow cycle management.
- Improve template processing in `AdvancedPromptTransform` with `VariablePool`.
- Make `system_variables` and `user_inputs` optional in `VariablePool` initialization.
2024-10-14 16:47:03 +08:00
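A hedged sketch of what making those constructor arguments optional amounts to; the signature is illustrative rather than the real `VariablePool` API:

```python
from collections.abc import Mapping
from typing import Any, Optional


class VariablePool:
    def __init__(
        self,
        system_variables: Optional[Mapping[str, Any]] = None,
        user_inputs: Optional[Mapping[str, Any]] = None,
    ) -> None:
        # Default to empty mappings so callers with no system variables or
        # user inputs can construct a pool without extra boilerplate.
        self.system_variables = dict(system_variables or {})
        self.user_inputs = dict(user_inputs or {})
```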
-LAN-
885192db38 feat(podcast_generator): add new podcast generation tools
- Introduced podcast generator with text-to-speech functionality using OpenAI's API.
- Implemented credential validation for TTS services and API keys.
- Added support for generating podcast audio with alternating host voices.
- Included user-friendly setup with internationalized YAML configuration.
- Added SVG icon to enhance visual identification.
2024-10-14 16:47:03 +08:00
-LAN-
ea18dd1571 feat(api): Enhance multi-modal support. 2024-10-14 16:47:03 +08:00
StyleZhang
8fe5028f74 Merge branch 'main' into feat/attachments 2024-10-14 16:09:05 +08:00
StyleZhang
9f492d527a fix: chat input file list remove button style 2024-10-11 10:33:19 +08:00
StyleZhang
faec1c50f9 Merge branch 'main' into feat/attachments 2024-10-10 16:36:25 +08:00
StyleZhang
2888c068c1 fix: chat log 2024-10-10 16:07:35 +08:00
StyleZhang
d77521e65f Merge branch 'main' into feat/attachments 2024-10-10 10:28:56 +08:00
StyleZhang
b463468a14 Merge branch 'main' into feat/attachments 2024-10-10 10:02:35 +08:00
StyleZhang
820076beec Merge branch 'main' into feat/attachments 2024-10-09 17:48:05 +08:00
StyleZhang
fd6476b941 fix: start chat check & chat send button theme 2024-10-09 17:45:38 +08:00
StyleZhang
7abe76f07e fix: basic chat 2024-10-09 17:24:43 +08:00
StyleZhang
dba24766e8 fix: file download 2024-10-09 14:22:46 +08:00
Joel
12492c0d5d fix: answer node support choose file 2024-10-09 10:31:59 +08:00
Joel
2740d68cd1 chore: prompt not support file var 2024-10-08 18:32:30 +08:00
Joel
7218f718ad fix: value check 2024-10-08 18:05:55 +08:00
Joel
2ae0aceef8 chore: fix doc options 2024-10-08 17:23:57 +08:00
Joel
a5c1132087 chore: prompt support array string number 2024-10-08 17:13:51 +08:00
Joel
691a7f727a fix: iteration output var type 2024-10-08 14:54:21 +08:00
Joel
7e2984b6e2 fix: array operator render 2024-10-08 14:45:28 +08:00
JzoNg
44e81dbbc8 Merge branch 'main' into jzh 2024-09-30 17:19:35 +08:00
Joel
944cfd2b68 chore: merge main 2024-09-30 16:43:06 +08:00
Joel
6d2682c751 fix: help link url 2024-09-30 16:36:16 +08:00
StyleZhang
d2971e84bb fix: node handle tooltip 2024-09-30 15:23:04 +08:00
StyleZhang
b67b81bf8f fix: file upload tip 2024-09-30 14:52:16 +08:00
Joel
c05902404d chore: add help link 2024-09-30 14:00:51 +08:00
Joel
1e6d5f2c48 feat: add support doc extract file types 2024-09-30 13:56:24 +08:00
JzoNg
e2b1464db2 fix features update by DSL import 2024-09-29 17:15:39 +08:00
JzoNg
f0285a53d2 fix style of textarea in retrieval test 2024-09-29 15:53:10 +08:00
Joel
00f91b5dc4 chore: list filter i18n 2024-09-29 14:57:38 +08:00
StyleZhang
dc3e86b82a fix: embedded chat file form 2024-09-29 14:56:13 +08:00
Joel
d239c5b54d fix: list filter first name 2024-09-29 14:47:02 +08:00
StyleZhang
23abccd3a6 fix: lint 2024-09-29 14:43:17 +08:00
JzoNg
2520e40059 fix: feature panel content height in safari 2024-09-29 14:17:27 +08:00
JzoNg
1d4ed3d9e7 fix: feature panel content height in safari 2024-09-29 14:06:07 +08:00
JzoNg
eed8ab9348 fix style of conversation variable 2024-09-29 11:58:23 +08:00
JzoNg
112aaf6e1b fix completion 2024-09-29 11:26:35 +08:00
StyleZhang
094a1a1458 remove chat thumb 2024-09-29 10:49:00 +08:00
JzoNg
955fa4345a fix output of run 2024-09-27 17:37:26 +08:00
Joel
ac5e381a1a chore: change list filter writing 2024-09-27 17:09:12 +08:00
Joel
ae9b9f867a chore: add new node description 2024-09-27 16:37:14 +08:00
JzoNg
8fd04e5313 merge main 2024-09-27 16:27:00 +08:00
Joel
3904782647 fix: add upload i18n 2024-09-27 11:16:02 +08:00
StyleZhang
288be3fbd8 fix: chat message end 2024-09-27 10:30:36 +08:00
JzoNg
f7f836d6f1 fix workflow output 2024-09-26 17:16:08 +08:00
JzoNg
5dedcb74a5 fix form of chat in webapp 2024-09-26 17:01:17 +08:00
StyleZhang
b95d0fa9a9 fix: file upload limits in web app 2024-09-26 16:41:44 +08:00
StyleZhang
543503c398 fix: file progress 2024-09-26 16:33:37 +08:00
JzoNg
3f16caf244 show file list in generation result 2024-09-26 16:10:28 +08:00
JzoNg
54133dfbde files in log 2024-09-26 15:50:19 +08:00
StyleZhang
b491c93b1c image download 2024-09-26 15:49:53 +08:00
StyleZhang
2a6d9c3211 fix: chat send 2024-09-26 15:33:01 +08:00
StyleZhang
c6691bd297 fix: webapp chat embedded chat 2024-09-26 15:26:53 +08:00
StyleZhang
2a0b30de5c fix: image.enable 2024-09-26 14:39:56 +08:00
StyleZhang
a7d53abba9 webapp chat embedded chat 2024-09-26 13:47:36 +08:00
StyleZhang
296253a365 debug chat 2024-09-26 11:54:42 +08:00
Joel
c89cefe526 chore: remove log 2024-09-26 11:50:52 +08:00
StyleZhang
1d027fa065 fix: chat check inputs form 2024-09-26 11:07:08 +08:00
Joel
9ce9a52a86 fix: text not string 2024-09-26 10:15:20 +08:00
JzoNg
c74424ed85 fix text-generation files 2024-09-26 08:45:50 +08:00
JzoNg
719ef9cef9 text generation run 2024-09-25 20:34:48 +08:00
StyleZhang
0ab525a691 fix: file size 2024-09-25 16:38:20 +08:00
StyleZhang
6fdcf6ee21 file-uploader 2024-09-25 16:24:02 +08:00
Joel
d01e97c1fc fix: tiny select ui 2024-09-25 15:44:02 +08:00
Joel
87e560de8a chore: type value change to array 2024-09-25 15:19:34 +08:00
JzoNg
f8d26e46ac text-generation support file type 2024-09-25 15:01:56 +08:00
Joel
195ac19774 chore: list filter transfer value to array 2024-09-25 13:41:56 +08:00
Joel
0281eb796d chore: transfer value to string array 2024-09-25 12:59:19 +08:00
StyleZhang
9fe2f321ae fix: chatflow check start node form 2024-09-25 12:21:53 +08:00
StyleZhang
5f76e665a1 fix: file extension 2024-09-25 11:24:14 +08:00
StyleZhang
81568752c0 fix: file from link 2024-09-24 17:47:01 +08:00
JzoNg
ceb1dde714 Merge branch 'main' into jzh 2024-09-24 17:30:56 +08:00
JzoNg
3209fdca53 legacy for sys.files 2024-09-24 17:08:23 +08:00
JzoNg
dc5010d833 fix step run of file type 2024-09-24 16:24:46 +08:00
Joel
8b26ae6532 fix: http file node not added 2024-09-24 15:18:39 +08:00
Joel
66953d57a2 chore: use new add sub variables 2024-09-24 13:56:06 +08:00
StyleZhang
afc9630cd0 merge main 2024-09-24 11:44:38 +08:00
Joel
7e8bafe186 feat: support file size unit 2024-09-24 11:12:41 +08:00
Joel
6c5fcd1ffc fix: 'all of' not shown in the right place 2024-09-24 10:55:37 +08:00
Joel
7602d22133 chore: files name 2024-09-23 18:33:14 +08:00
Joel
5ec91e8507 feat: if node render files sub vars 2024-09-23 18:25:11 +08:00
StyleZhang
466966f027 file input error tip 2024-09-23 17:27:06 +08:00
JzoNg
212d04ea27 form inputs hide handle 2024-09-23 10:25:35 +08:00
StyleZhang
0cb50dd4a5 fix: workflow inputs panel 2024-09-20 18:00:35 +08:00
StyleZhang
ab19fccf3d single file 2024-09-20 17:20:28 +08:00
StyleZhang
4ed46e3fed fix: uploader 2024-09-20 15:44:49 +08:00
Joel
9fd2f798ff feat: add type placeholder in picker 2024-09-20 15:19:38 +08:00
Joel
146be41b1d fix: if show var types 2024-09-20 14:59:46 +08:00
Joel
ce6ae5732a fix: does not show the right output type 2024-09-20 14:42:53 +08:00
StyleZhang
edf462c640 fix: uploader 2024-09-20 12:08:03 +08:00
JzoNg
d580fc1e9d fix features initialization 2024-09-20 11:03:45 +08:00
Joel
5544791031 feat: doc extract support both file and file array 2024-09-19 17:55:58 +08:00
JzoNg
099746dd59 fix style of tracing 2024-09-19 17:40:27 +08:00
StyleZhang
c6f53c9030 Merge branch 'main' into feat/attachments 2024-09-19 17:11:28 +08:00
StyleZhang
8236f8fed8 fix: file uploader 2024-09-19 17:08:02 +08:00
StyleZhang
2b0c39ed3f file in chat question 2024-09-19 16:55:14 +08:00
JzoNg
396a240e68 test run 2024-09-19 16:49:30 +08:00
JzoNg
8bd9d8f6ba remove chat input 2024-09-19 15:45:30 +08:00
JzoNg
aa7ae4c5f1 chore: remove console 2024-09-19 15:45:30 +08:00
StyleZhang
49b7acf52e fix: file uploader 2024-09-19 14:52:22 +08:00
StyleZhang
466ac987f5 fix: file type icon 2024-09-19 11:12:18 +08:00
StyleZhang
49972939a9 file icon 2024-09-19 11:06:38 +08:00
StyleZhang
80f167ca02 file upload limit 2024-09-18 18:12:03 +08:00
JzoNg
f652ae0d98 step run 2024-09-18 18:07:03 +08:00
Joel
4dbf56675a fix: change to backend doc extractor 2024-09-18 17:35:14 +08:00
StyleZhang
f5d1f5a20a file uploader 2024-09-18 16:50:53 +08:00
StyleZhang
fd9b71c4d7 file uploader 2024-09-18 16:36:55 +08:00
Joel
1df41cef4c fix: doc extract not export text 2024-09-18 15:23:53 +08:00
Joel
602d2486bd fix: iteration file array type 2024-09-18 15:03:02 +08:00
JzoNg
403fede432 fix basic app publish 2024-09-18 14:01:20 +08:00
JzoNg
9f66e6e357 fix feature bar in basic chatbot 2024-09-18 13:36:11 +08:00
JzoNg
affb2e38a1 fix typo of icon path 2024-09-18 13:14:18 +08:00
Joel
31d87f85b8 merge 2024-09-18 11:57:48 +08:00
JzoNg
54105e85ff fix icon 2024-09-18 11:21:38 +08:00
Joel
5ec604500c chore: change field to backend 2024-09-18 11:15:36 +08:00
JzoNg
96d2582d89 file var in form 2024-09-18 10:44:59 +08:00
JzoNg
a10b0db102 vision setting 2024-09-18 10:44:59 +08:00
StyleZhang
5dd556b4c8 file uploader 2024-09-13 17:31:10 +08:00
StyleZhang
a4c6d0b94b file uploader 2024-09-13 16:46:16 +08:00
StyleZhang
323a835de9 Merge branch 'main' into feat/attachments 2024-09-12 13:53:41 +08:00
StyleZhang
0076577764 file uploader 2024-09-11 18:25:49 +08:00
Joel
9a3b7345c4 fix: split line too long 2024-09-11 14:40:38 +08:00
StyleZhang
2ebf5f5ffa merge main 2024-09-11 13:40:36 +08:00
StyleZhang
02f494c0de merge main 2024-09-10 16:38:32 +08:00
Joel
f0e81e3918 fix: doc extract and node isconversation item 2024-09-10 15:22:44 +08:00
Joel
aa8499efac fix: doc extract var type 2024-09-10 15:16:27 +08:00
Joel
ea40b1dcb2 fix: two scrollbar 2024-09-10 15:05:54 +08:00
Joel
a689cd6fd4 chore: var reference support theme 2024-09-10 14:49:27 +08:00
StyleZhang
32b6c7063a file uploader 2024-09-10 14:17:56 +08:00
Joel
97056dad30 fix: file type var match page crash 2024-09-10 11:54:34 +08:00
Joel
264f7c2139 fix: file show type error 2024-09-10 10:59:13 +08:00
Joel
007a6fd14a chore: other file types placeholder add + 2024-09-09 15:24:50 +08:00
Joel
c159b7a781 chore: transform field type css to tailwind and multi theme 2024-09-09 15:15:08 +08:00
Joel
6c9c3faf78 fix: remove '.' from allowed file extensions 2024-09-09 14:36:56 +08:00
StyleZhang
d933ebb845 file input 2024-09-09 11:27:35 +08:00
JzoNg
b60c7a5826 vision config 2024-09-04 15:45:33 +08:00
JzoNg
0b94218378 remove unused components 2024-09-04 11:47:29 +08:00
Joel
97cc9a5615 feat: check file item key not set 2024-09-04 11:28:36 +08:00
Joel
f6d0fd9848 feat: add check list filter value 2024-09-04 11:19:09 +08:00
Joel
b863dd7de2 fix: list filter init value 2024-09-04 10:47:26 +08:00
JzoNg
b0e7a22a27 annotation reply 2024-09-03 17:19:11 +08:00
JzoNg
565a835947 conversation opener 2024-09-01 15:00:23 +08:00
JzoNg
fe94c876fb multiple model message sending 2024-09-01 13:37:21 +08:00
JzoNg
67a34bdd7a app publish with new features 2024-09-01 13:10:40 +08:00
JzoNg
8c785e268b completion debug & preview 2024-09-01 11:24:54 +08:00
JzoNg
65a6265ff6 new features in chat app configuration 2024-08-30 18:58:08 +08:00
Joel
08d3cb1912 fix: filter file and file sub variable 2024-08-30 17:25:32 +08:00
Joel
48d8b01d81 fix: http node value var rename 2024-08-30 15:30:12 +08:00
Joel
38edb06897 feat: list filter output 2024-08-30 15:21:37 +08:00
Joel
dc919c2a6c feat: output item var type and filter condition trigger 2024-08-30 15:06:57 +08:00
Joel
e7a6a0ab01 chore: list filter operate ui 2024-08-30 14:44:16 +08:00
Joel
61d989f413 feat: support order change 2024-08-30 14:34:46 +08:00
Joel
976efd93a1 feat: support filter variable var data sync 2024-08-30 14:14:42 +08:00
JzoNg
0e2f78b3a6 features in workflow 2024-08-29 22:54:36 +08:00
JzoNg
b3529d3ccc file upload 2024-08-29 20:20:28 +08:00
JzoNg
d69b453729 conversation opener 2024-08-29 20:20:28 +08:00
JzoNg
2f658de155 moderation 2024-08-29 20:20:28 +08:00
JzoNg
a691700b48 text2speech 2024-08-29 20:20:28 +08:00
JzoNg
c5317d8f58 feature card 2024-08-29 20:20:28 +08:00
JzoNg
822f03f3cd text to speech card 2024-08-29 20:20:28 +08:00
JzoNg
101e56baaa follow up & citations & speech-to-text 2024-08-29 20:20:28 +08:00
JzoNg
3a8f516dfc more like this 2024-08-29 20:20:28 +08:00
JzoNg
912030c9a1 update style 2024-08-29 20:20:28 +08:00
JzoNg
687661eef7 new style of feature panel 2024-08-29 20:20:28 +08:00
Joel
8efc63a705 feat: handle value picker in body file selector 2024-08-29 16:36:18 +08:00
Joel
dca4f9fe9c feat: support file values in body 2024-08-29 16:18:16 +08:00
Joel
51597629b1 fix: http binary node not valid 2024-08-29 14:32:36 +08:00
Joel
76a07513ba fix: prompt editor not update data 2024-08-29 14:25:04 +08:00
Joel
dae62bef78 fix: changing to key-value type does not show initial key values 2024-08-29 11:55:18 +08:00
Joel
2a6629d435 feat: binary files 2024-08-29 11:50:42 +08:00
Joel
41f0ce1012 feat: support http body to new data struct 2024-08-28 16:56:31 +08:00
Joel
e90b055c47 fix: ts problems 2024-08-28 14:40:13 +08:00
Joel
94e40d4ed9 feat: default set vision var value 2024-08-28 14:35:49 +08:00
Joel
c34fc071e0 feat: vision file to API definition 2024-08-28 10:57:26 +08:00
Joel
c014ae43e1 feat: if node supports file exists check 2024-08-27 17:37:39 +08:00
Joel
9851153d38 chore: file type checkbox 2024-08-27 17:15:02 +08:00
Joel
cfbabb8383 feat: file valid 2024-08-27 17:09:34 +08:00
Joel
b78e90679d fix: choose file type problems 2024-08-27 16:55:24 +08:00
Joel
ec1bfdc723 feat: change to new start file struct 2024-08-27 16:37:15 +08:00
Joel
e20019f6e9 chore: merge main 2024-08-27 14:23:35 +08:00
Joel
2122cfb152 chore: list filter field 2024-08-27 10:43:03 +08:00
Joel
c2b8beffac feat: global variables 2024-08-26 17:32:46 +08:00
StyleZhang
985651454a progress circle 2024-08-26 10:30:26 +08:00
Joel
f9c1d06e91 chore: tools node 2024-08-23 18:08:54 +08:00
Joel
657f1d2de8 chore: http request 2024-08-23 17:56:34 +08:00
Joel
6e2192c1e0 chore: variable aggregator 2024-08-23 16:40:39 +08:00
Joel
e05b20eb91 chore: parameter extractor ui 2024-08-23 16:03:01 +08:00
Joel
5117e08def chore: question classify 2024-08-22 17:00:20 +08:00
Joel
34691ca6c9 chore: knowledge node ui 2024-08-22 16:51:44 +08:00
Joel
aa40047b08 chore: knowledge 2024-08-22 15:10:49 +08:00
StyleZhang
eca17767fe chat style 2024-08-22 15:10:09 +08:00
Joel
51cec1b9ba chore: llm upgrade 2024-08-22 14:34:58 +08:00
Joel
651547c3ef fix: number var picker and other tiny css problem 2024-08-22 10:49:56 +08:00
Joel
8fbdaa604c feat: file array variable choose vars 2024-08-21 17:29:03 +08:00
Joel
1bcb30647f chore: select ui 2024-08-21 17:29:03 +08:00
JzoNg
bc245a25bf new style of tables 2024-08-21 17:07:14 +08:00
Joel
85b25ebe1b chore: file select trigger 2024-08-21 16:45:18 +08:00
Joel
b50e94d681 feat: file array sub variable select 2024-08-21 16:22:58 +08:00
Joel
91c0657cf6 fix: select default trigger problem 2024-08-21 15:35:28 +08:00
StyleZhang
0da06128e3 agent tool in chat 2024-08-21 15:32:13 +08:00
Joel
0c4af3a1d2 feat: support sub variable operate changes with key and value support 2024-08-21 15:27:08 +08:00
Joel
5628b293f8 feat: sub var if condition position 2024-08-21 14:54:06 +08:00
JzoNg
fff40aae58 Merge branch 'main' into jzh 2024-08-21 13:40:04 +08:00
Joel
b3b87b3e4c chore: sub variable trigger 2024-08-21 11:07:13 +08:00
Joel
9a23cd08d8 fix: sub variable select trigger 2024-08-19 16:49:52 +08:00
JzoNg
cf61ca24e3 new style of table 2024-08-19 16:05:37 +08:00
Joel
58a56add9c feat: can support value 2024-08-19 15:15:44 +08:00
JzoNg
b362031baf chip 2024-08-19 14:11:25 +08:00
Joel
7ad409b3d9 fix: update operate and value 2024-08-19 13:53:23 +08:00
Joel
876ea90fe9 feat: support update sub variable value 2024-08-19 11:43:52 +08:00
JzoNg
0eb442f954 new style of user inputs 2024-08-16 17:48:31 +08:00
Joel
4554ac3ef8 feat: can add sub variable 2024-08-16 17:12:44 +08:00
Joel
eaa7d114dc feat: file array not sub vars 2024-08-16 11:39:23 +08:00
Joel
581228be74 feat: abstract condition logic to components 2024-08-16 10:14:43 +08:00
StyleZhang
02da0219ff workflow debug and preview panel style 2024-08-15 17:30:17 +08:00
JzoNg
d0bbe43dab chore: fix type 2024-08-15 16:36:47 +08:00
JzoNg
16acdc9be4 new style of workflow process 2024-08-15 16:33:33 +08:00
Joel
a6999b5d02 fix: old not set vision data 2024-08-15 11:37:02 +08:00
StyleZhang
33bfa4758e Merge branch 'main' into feat/attachments 2024-08-15 10:57:59 +08:00
JzoNg
db63c2c219 new style of status 2024-08-14 18:40:11 +08:00
JzoNg
bea4ec5998 style update of log 2024-08-13 17:11:17 +08:00
JzoNg
74333db4c8 update input in env & conversation var 2024-08-13 16:12:12 +08:00
JzoNg
0019fb9f8b Merge branch 'main' into jzh 2024-08-13 15:59:19 +08:00
JzoNg
47615ac8fb metadata style update 2024-08-13 15:25:02 +08:00
StyleZhang
d7c8bced9b file uploader 2024-08-13 15:24:32 +08:00
Joel
57f178902f feat: if node select value 2024-08-13 14:14:38 +08:00
Joel
4586de48d6 feat: default var type 2024-08-13 14:04:43 +08:00
Joel
6549519fa5 feat: files attr select 2024-08-13 13:59:19 +08:00
Joel
ae098ad121 feat: condition operation 2024-08-13 11:16:27 +08:00
Joel
20922fde1c feat: detect file type 2024-08-12 18:22:07 +08:00
StyleZhang
079c802b5c file uploader 2024-08-12 16:24:13 +08:00
JzoNg
efcd462a69 fix style of switch 2024-08-12 13:15:36 +08:00
Joel
843c8ad306 feat: file obj 2024-08-09 17:46:33 +08:00
StyleZhang
594bf96922 file uploader hooks 2024-08-09 16:48:58 +08:00
JzoNg
ade385c9c1 replace input in workflow blocks 2024-08-09 16:19:10 +08:00
JzoNg
baed068231 replace form input 2024-08-09 12:35:01 +08:00
Joel
42f5334ae4 feat: iteration file array input 2024-08-09 11:45:14 +08:00
Joel
3c4ab0632d feat: tool support file type 2024-08-09 11:38:40 +08:00
Joel
bc5f109308 feat: http support body binary 2024-08-09 10:53:59 +08:00
Joel
97b2a42cc3 feat: form data support file type 2024-08-09 10:29:29 +08:00
JzoNg
939df16655 refactor input and replace search input 2024-08-08 17:39:09 +08:00
JzoNg
9362ae045c fix textarea onchange 2024-08-08 17:39:09 +08:00
Joel
257c515178 fix: old no vision data 2024-08-08 15:47:55 +08:00
Joel
6b7520ccc2 fix: old llm code 2024-08-08 15:44:39 +08:00
Joel
85eeaee95a feat: vision valid 2024-08-08 14:40:08 +08:00
Joel
99bf3ff565 feat: params support vision 2024-08-08 14:22:54 +08:00
Joel
36ae154ca2 feat: classify support vision 2024-08-08 11:57:07 +08:00
Joel
ef93d60534 chore: vision logic hooks 2024-08-08 11:36:44 +08:00
JzoNg
6c9a6b99e0 refactor textarea 2024-08-08 11:14:52 +08:00
JzoNg
b73f05fdf0 new style of textarea 2024-08-08 11:14:52 +08:00
StyleZhang
26bca75884 file uploader 2024-08-08 10:27:43 +08:00
Joel
e2962da1b8 chore: add vision disabled tip check 2024-08-07 11:33:19 +08:00
NFish
1b9ebb8037 feat: add disabled support to tooltip-plus component (#7036) 2024-08-07 11:33:19 +08:00
crazywoola
a945a45b06 doc: correct typos in mdx files (#7029) 2024-08-07 11:33:19 +08:00
Achim
be829a8103 Provide output data also in json property of workflow tool (#6924) (#7027) 2024-08-07 11:33:19 +08:00
crazywoola
9432d41e60 fix: typos in wenxin llm (#7021) 2024-08-07 11:33:19 +08:00
Sa Zhang
0beeb4ab3e fix: Fix incorrect context size for jina-reranker-v2 model (#7006) 2024-08-07 11:33:19 +08:00
Bryan
d7e057be44 fix: tran list issue (#7009)
Co-authored-by: libing <libing@healink.cn>
2024-08-07 11:33:19 +08:00
Jyong
81b11c08d0 Fix/reranking mode is null (#7012) 2024-08-07 11:33:19 +08:00
Joel
83a5cdfff9 feat: agent app support generate prompt (#7007) 2024-08-07 11:33:19 +08:00
yanghx
c837218bc9 fix #6902 .docx handles images within tables and handles cross-column tables (#6951) 2024-08-07 11:33:19 +08:00
crazywoola
68552893ef fix: code-block-missing-checks (#7002) 2024-08-07 11:33:19 +08:00
灰灰
5ba93ed064 fix: code tool fails when null property exists in object (#6988) 2024-08-07 11:33:19 +08:00
Yi Xiao
959107f553 Feat/new confirm (#6984) 2024-08-07 11:33:19 +08:00
Yefori
443d929137 feat: add function calling for deepseek models (#6990) 2024-08-07 11:33:19 +08:00
Vico Chu
1e04418023 Chores: fix name typo (#6987) 2024-08-07 11:33:19 +08:00
小羽
aeda8869bc feat:nvidia add nemotron4-340b and microsoft/phi-3 (#6973) 2024-08-07 11:33:19 +08:00
非法操作
10eed02ec4 chore: update duckduckgo tool (#6983) 2024-08-07 11:33:19 +08:00
Dr. Artificial曾小健
2472c4f890 fix doc (#6974) 2024-08-07 11:33:19 +08:00
Joel
0455e4e1a5 feat: llm support vision 2024-08-07 11:33:19 +08:00
StyleZhang
251ab5418f file-uploader i18n 2024-08-06 18:03:38 +08:00
Joel
38e6e40900 feat: config vision comp 2024-08-06 17:33:02 +08:00
Joel
b3a3672857 chore: new required field 2024-08-06 15:42:22 +08:00
Joel
53a3c199ec chore: input slider 2024-08-06 15:33:38 +08:00
Joel
fca5af5073 feat: max number with slider 2024-08-06 15:25:53 +08:00
Joel
77d0aac1d3 feat: support custom file type 2024-08-06 14:59:26 +08:00
StyleZhang
fd0f8f33b5 Merge branch 'main' into feat/attachments 2024-08-06 09:58:53 +08:00
Joel
0be99ad01c feat: select file types 2024-08-02 18:17:13 +08:00
StyleZhang
a05d16375e Merge branch 'main' into feat/attachments 2024-08-02 16:30:10 +08:00
Joel
0480bb03c3 feat: new input types 2024-08-02 11:43:01 +08:00
StyleZhang
19dfc6d9a8 file uploader 2024-08-02 10:21:20 +08:00
Joel
d361675159 chore: some select style 2024-08-01 17:54:33 +08:00
Joel
23ae150298 feat: filter condition 2024-08-01 17:10:02 +08:00
Joel
81383d7c74 feat: sub var picker 2024-08-01 14:43:17 +08:00
Joel
573f653789 feat: order by 2024-08-01 11:40:55 +08:00
Joel
f1b61861b6 Merge branch 'main' into feat/attachments 2024-07-31 17:59:10 +08:00
Joel
8ecee8abce fix: ts problem 2024-07-31 17:02:11 +08:00
Joel
e9ce9c1f47 feat: limit config 2024-07-31 17:01:26 +08:00
Joel
944fea4cc9 feat: list filter type and output var 2024-07-31 16:31:51 +08:00
StyleZhang
25c029877a progress circle 2024-07-31 15:15:26 +08:00
StyleZhang
9c31c56115 file uploader 2024-07-30 16:19:20 +08:00
StyleZhang
56507c9f7a chat input area 2024-07-30 13:48:39 +08:00
StyleZhang
b322dda3f6 Merge branch 'main' into feat/attachments 2024-07-30 10:06:40 +08:00
StyleZhang
52d69dd55b file uploader 2024-07-29 17:22:11 +08:00
StyleZhang
0451c5590c Merge branch 'main' into feat/attachments 2024-07-29 10:19:11 +08:00
StyleZhang
2498c238b2 file-uploader 2024-07-26 16:54:45 +08:00
Joel
6e15d7f777 feat: doc extract inputs 2024-07-26 15:00:29 +08:00
Joel
f6caf0915b chore: block bg to utils color 2024-07-26 14:23:34 +08:00
Joel
09aa14ca82 feat: node icons 2024-07-26 14:11:03 +08:00
Joel
394f06a27a feat: list filter 2024-07-26 11:55:09 +08:00
Joel
6fafd410d2 feat: doc extract struct 2024-07-26 11:21:17 +08:00
StyleZhang
1668df104f Merge branch 'main' into feat/attachments 2024-07-26 08:49:17 +08:00
StyleZhang
d376b8540e add file-uploader 2024-07-25 16:41:09 +08:00
1020 changed files with 7570 additions and 33714 deletions

View File

@@ -1,3 +1,3 @@
#!/bin/bash
cd api && poetry install
poetry install -C api

View File

@@ -27,17 +27,18 @@ jobs:
- name: Checkout code
uses: actions/checkout@v4
- name: Install Poetry
uses: abatilo/actions-poetry@v3
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
cache: 'poetry'
cache-dependency-path: |
api/pyproject.toml
api/poetry.lock
- name: Install Poetry
uses: abatilo/actions-poetry@v3
- name: Check Poetry lockfile
run: |
poetry check -C api --lock
@@ -78,7 +79,7 @@ jobs:
- name: Run Workflow
run: poetry run -C api bash dev/pytest/pytest_workflow.sh
- name: Set up Vector Stores (Weaviate, Qdrant, PGVector, Milvus, PgVecto-RS, Chroma, MyScale, ElasticSearch, Couchbase)
- name: Set up Vector Stores (Weaviate, Qdrant, PGVector, Milvus, PgVecto-RS, Chroma, MyScale, ElasticSearch)
uses: hoverkraft-tech/compose-action@v2.0.0
with:
compose-file: |
@@ -86,7 +87,6 @@ jobs:
services: |
weaviate
qdrant
couchbase-server
etcd
minio
milvus-standalone

View File

@@ -5,6 +5,7 @@ on:
branches:
- "main"
- "deploy/dev"
- "release/0.10.0-beta"
release:
types: [published]
@@ -49,7 +50,7 @@ jobs:
echo "PLATFORM_PAIR=${platform//\//-}" >> $GITHUB_ENV
- name: Login to Docker Hub
uses: docker/login-action@v3
uses: docker/login-action@v2
with:
username: ${{ env.DOCKERHUB_USER }}
password: ${{ env.DOCKERHUB_TOKEN }}
@@ -114,7 +115,7 @@ jobs:
merge-multiple: true
- name: Login to Docker Hub
uses: docker/login-action@v3
uses: docker/login-action@v2
with:
username: ${{ env.DOCKERHUB_USER }}
password: ${{ env.DOCKERHUB_TOKEN }}

View File

@@ -23,17 +23,18 @@ jobs:
- name: Checkout code
uses: actions/checkout@v4
- name: Install Poetry
uses: abatilo/actions-poetry@v3
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
cache: 'poetry'
cache-dependency-path: |
api/pyproject.toml
api/poetry.lock
- name: Install Poetry
uses: abatilo/actions-poetry@v3
- name: Install dependencies
run: poetry install -C api

View File

@@ -7,7 +7,5 @@ yq eval '.services["milvus-standalone"].ports += ["19530:19530"]' -i docker/dock
yq eval '.services.pgvector.ports += ["5433:5432"]' -i docker/docker-compose.yaml
yq eval '.services["pgvecto-rs"].ports += ["5431:5432"]' -i docker/docker-compose.yaml
yq eval '.services["elasticsearch"].ports += ["9200:9200"]' -i docker/docker-compose.yaml
yq eval '.services.couchbase-server.ports += ["8091-8096:8091-8096"]' -i docker/docker-compose.yaml
yq eval '.services.couchbase-server.ports += ["11210:11210"]' -i docker/docker-compose.yaml
echo "Ports exposed for sandbox, weaviate, qdrant, chroma, milvus, pgvector, pgvecto-rs, elasticsearch, couchbase"
echo "Ports exposed for sandbox, weaviate, qdrant, chroma, milvus, pgvector, pgvecto-rs, elasticsearch"

View File

@@ -24,16 +24,15 @@ jobs:
with:
files: api/**
- name: Install Poetry
uses: abatilo/actions-poetry@v3
- name: Set up Python
uses: actions/setup-python@v5
if: steps.changed-files.outputs.any_changed == 'true'
with:
python-version: '3.10'
- name: Install Poetry
if: steps.changed-files.outputs.any_changed == 'true'
uses: abatilo/actions-poetry@v3
- name: Python dependencies
if: steps.changed-files.outputs.any_changed == 'true'
run: poetry install -C api --only lint

6
.gitignore vendored
View File

@@ -173,12 +173,8 @@ docker/volumes/myscale/log/*
docker/volumes/unstructured/*
docker/volumes/pgvector/data/*
docker/volumes/pgvecto_rs/data/*
docker/volumes/couchbase/*
docker/volumes/oceanbase/*
docker/nginx/conf.d/default.conf
docker/nginx/ssl/*
!docker/nginx/ssl/.gitkeep
docker/middleware.env
sdks/python-client/build
@@ -191,4 +187,4 @@ pyrightconfig.json
api/.vscode
.idea/
.vscode
.vscode

View File

@@ -6,9 +6,8 @@ Dify is licensed under the Apache License 2.0, with the following additional con
a. Multi-tenant service: Unless explicitly authorized by Dify in writing, you may not use the Dify source code to operate a multi-tenant environment.
- Tenant Definition: Within the context of Dify, one tenant corresponds to one workspace. The workspace provides a separated area for each tenant's data and configurations.
b. LOGO and copyright information: In the process of using Dify's frontend, you may not remove or modify the LOGO or copyright information in the Dify console or applications. This restriction is inapplicable to uses of Dify that do not involve its frontend.
- Frontend Definition: For the purposes of this license, the "frontend" of Dify includes all components located in the `web/` directory when running Dify from the raw source code, or the "web" image when running Dify with Docker.
b. LOGO and copyright information: In the process of using Dify's frontend components, you may not remove or modify the LOGO or copyright information in the Dify console or applications. This restriction is inapplicable to uses of Dify that do not involve its frontend components.
Please contact business@dify.ai by email to inquire about licensing matters.

157
README.md
View File

@@ -1,9 +1,5 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
<p align="center">
📌 <a href="https://dify.ai/blog/introducing-dify-workflow-file-upload-a-demo-on-ai-podcast">Introducing Dify Workflow File Upload: Recreate Google NotebookLM Podcast</a>
</p>
<p align="center">
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">Self-hosting</a> ·
@@ -46,56 +42,6 @@
</p>
## Table of Contents
0. [Quick-Start🚀](https://github.com/langgenius/dify?tab=readme-ov-file#quick-start)
1. [Intro📖](https://github.com/langgenius/dify?tab=readme-ov-file#intro)
2. [How to use🔧](https://github.com/langgenius/dify?tab=readme-ov-file#using-dify)
3. [Stay Ahead🏃](https://github.com/langgenius/dify?tab=readme-ov-file#staying-ahead)
4. [Next Steps🏹](https://github.com/langgenius/dify?tab=readme-ov-file#next-steps)
5. [Contributing💪](https://github.com/langgenius/dify?tab=readme-ov-file#contributing)
6. [Community and Contact🏠](https://github.com/langgenius/dify?tab=readme-ov-file#community--contact)
7. [Star-History📈](https://github.com/langgenius/dify?tab=readme-ov-file#star-history)
8. [Security🔒](https://github.com/langgenius/dify?tab=readme-ov-file#security-disclosure)
9. [License🤝](https://github.com/langgenius/dify?tab=readme-ov-file#license)
> Make sure you read through this README before you start utilizing Dify😊
## Quick start
The quickest way to deploy Dify locally is to run our [docker-compose.yml](https://github.com/langgenius/dify/blob/main/docker/docker-compose.yaml). Follow the instructions to start in 5 minutes.
> Before installing Dify, make sure your machine meets the following minimum system requirements:
>
>- CPU >= 2 Core
>- RAM >= 4 GiB
>- Docker and Docker Compose Installed
</br>
Run the following command in your terminal to clone the whole repo.
```bash
git clone https://github.com/langgenius/dify.git
```
After cloning, run the following commands one by one.
```bash
cd dify
cd docker
cp .env.example .env
docker compose up -d
```
After running, you can access the Dify dashboard in your browser at [http://localhost/install](http://localhost/install) and start the initialization process. You will be asked to set up an admin account.
For more information on the quick setup, check [here](https://docs.dify.ai/getting-started/install-self-hosted/docker-compose)
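If the dashboard is not reachable, a quick sanity check with standard Docker Compose commands (service names below are assumed from the default docker-compose.yaml):

```bash
docker compose ps            # every service should be running/healthy
docker compose logs -f api   # tail the API logs; adjust the service name if needed
```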
## Intro
Dify is an open-source LLM app development platform. Its intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production. Here's a list of the core features:
</br> </br>
@@ -129,6 +75,73 @@ Dify is an open-source LLM app development platform. Its intuitive interface com
All of Dify's offerings come with corresponding APIs, so you could effortlessly integrate Dify into your own business logic.
## Feature comparison
<table style="width: 100%;">
<tr>
<th align="center">Feature</th>
<th align="center">Dify.AI</th>
<th align="center">LangChain</th>
<th align="center">Flowise</th>
<th align="center">OpenAI Assistants API</th>
</tr>
<tr>
<td align="center">Programming Approach</td>
<td align="center">API + App-oriented</td>
<td align="center">Python Code</td>
<td align="center">App-oriented</td>
<td align="center">API-oriented</td>
</tr>
<tr>
<td align="center">Supported LLMs</td>
<td align="center">Rich Variety</td>
<td align="center">Rich Variety</td>
<td align="center">Rich Variety</td>
<td align="center">OpenAI-only</td>
</tr>
<tr>
<td align="center">RAG Engine</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">✅</td>
</tr>
<tr>
<td align="center">Agent</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">❌</td>
<td align="center">✅</td>
</tr>
<tr>
<td align="center">Workflow</td>
<td align="center">✅</td>
<td align="center">❌</td>
<td align="center">✅</td>
<td align="center">❌</td>
</tr>
<tr>
<td align="center">Observability</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">❌</td>
<td align="center">❌</td>
</tr>
<tr>
<td align="center">Enterprise Features (SSO/Access control)</td>
<td align="center">✅</td>
<td align="center">❌</td>
<td align="center">❌</td>
<td align="center">❌</td>
</tr>
<tr>
<td align="center">Local Deployment</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">❌</td>
</tr>
</table>
## Using Dify
- **Cloud </br>**
@@ -149,21 +162,30 @@ Star Dify on GitHub and be instantly notified of new releases.
![star-us](https://github.com/langgenius/dify/assets/13230914/b823edc1-6388-4e25-ad45-2f6b187adbb4)
## Quick start
> Before installing Dify, make sure your machine meets the following minimum system requirements:
>
>- CPU >= 2 Core
>- RAM >= 4GB
</br>
The easiest way to start the Dify server is to run our [docker-compose.yml](docker/docker-compose.yaml) file. Before running the installation command, make sure that [Docker](https://docs.docker.com/get-docker/) and [Docker Compose](https://docs.docker.com/compose/install/) are installed on your machine:
```bash
cd docker
cp .env.example .env
docker compose up -d
```
After running, you can access the Dify dashboard in your browser at [http://localhost/install](http://localhost/install) and start the initialization process.
> If you'd like to contribute to Dify or do additional development, refer to our [guide to deploying from source code](https://docs.dify.ai/getting-started/install-self-hosted/local-source-code)
## Next steps
Go to [quick-start](https://github.com/langgenius/dify?tab=readme-ov-file#quick-start) to set up your Dify instance, or set it up from source code.
#### If you......
If you forget your admin account, you can refer to this [guide](https://docs.dify.ai/getting-started/install-self-hosted/faqs#id-4.-how-to-reset-the-password-of-the-admin-account) to reset the password.
> Use docker compose up without "-d" to print logs to your terminal. This can be useful if you run into unknown problems when using Dify.
If you encounter a system error and would like to get help via GitHub issues, make sure you always paste the error logs in the request to accelerate the conversation. Go to [Community & contact](https://github.com/langgenius/dify?tab=readme-ov-file#community--contact) for more information.
> Please read the [Dify Documentation](https://docs.dify.ai/) for detailed how-to-use guidance. Most of the potential problems are explained in the doc.
> If you'd like to contribute to Dify or do additional development, refer to our [guide to deploying from source code](https://docs.dify.ai/getting-started/install-self-hosted/local-source-code)
If you need to customize the configuration, please refer to the comments in our [.env.example](docker/.env.example) file and update the corresponding values in your `.env` file. Additionally, you might need to make adjustments to the `docker-compose.yaml` file itself, such as changing image versions, port mappings, or volume mounts, based on your specific deployment environment and requirements. After making any changes, please re-run `docker-compose up -d`. You can find the full list of available environment variables [here](https://docs.dify.ai/getting-started/install-self-hosted/environments).
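As a concrete example of that workflow (the variable chosen here is just for illustration):

```bash
cd docker
# Raise the upload size limit; any variable documented in .env.example works the same way.
echo "UPLOAD_FILE_SIZE_LIMIT=100" >> .env
docker compose up -d   # recreate the containers that depend on the changed value
```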
If you'd like to configure a highly-available setup, there are community-contributed [Helm Charts](https://helm.sh/) and YAML files which allow Dify to be deployed on Kubernetes.
@@ -202,7 +224,6 @@ At the same time, please consider supporting Dify by sharing it on social media
* [GitHub Issues](https://github.com/langgenius/dify/issues). Best for: bugs you encounter using Dify.AI, and feature proposals. See our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
* [Discord](https://discord.gg/FngNHpbcY7). Best for: sharing your applications and hanging out with the community.
* [X(Twitter)](https://twitter.com/dify_ai). Best for: sharing your applications and hanging out with the community.
* Make sure a log, if possible, is attached to any reported error to maximize resolution efficiency.
## Star history

View File

@@ -154,7 +154,7 @@ Dify 是一个开源的 LLM 应用开发平台。其直观的界面结合了 AI
我们提供[ Dify 云服务](https://dify.ai),任何人都可以零设置尝试。它提供了自部署版本的所有功能,并在沙盒计划中包含 200 次免费的 GPT-4 调用。
- **自托管 Dify 社区版</br>**
使用这个[入门指南](#快速启动)快速在您的环境中运行 Dify。
使用这个[入门指南](#quick-start)快速在您的环境中运行 Dify。
使用我们的[文档](https://docs.dify.ai)进行进一步的参考和更深入的说明。
- **面向企业/组织的 Dify</br>**
@@ -174,7 +174,7 @@ Dify 是一个开源的 LLM 应用开发平台。其直观的界面结合了 AI
在安装 Dify 之前,请确保您的机器满足以下最低系统要求:
- CPU >= 2 Core
- RAM >= 4 GiB
- RAM >= 4GB
### 快速启动

View File

@@ -1,241 +0,0 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
<p align="center">
📌 <a href="https://dify.ai/blog/introducing-dify-workflow-file-upload-a-demo-on-ai-podcast">Introduzindo o Dify Workflow com Upload de Arquivo: Recrie o Podcast Google NotebookLM</a>
</p>
<p align="center">
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">Auto-hospedagem</a> ·
<a href="https://docs.dify.ai">Documentação</a> ·
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">Consultas empresariais</a>
</p>
<p align="center">
<a href="https://dify.ai" target="_blank">
<img alt="Static Badge" src="https://img.shields.io/badge/Product-F04438"></a>
<a href="https://dify.ai/pricing" target="_blank">
<img alt="Static Badge" src="https://img.shields.io/badge/free-pricing?logo=free&color=%20%23155EEF&label=pricing&labelColor=%20%23528bff"></a>
<a href="https://discord.gg/FngNHpbcY7" target="_blank">
<img src="https://img.shields.io/discord/1082486657678311454?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb"
alt="chat on Discord"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="follow on X(Twitter)"></a>
<a href="https://hub.docker.com/u/langgenius" target="_blank">
<img alt="Docker Pulls" src="https://img.shields.io/docker/pulls/langgenius/dify-web?labelColor=%20%23FDB062&color=%20%23f79009"></a>
<a href="https://github.com/langgenius/dify/graphs/commit-activity" target="_blank">
<img alt="Commits last month" src="https://img.shields.io/github/commit-activity/m/langgenius/dify?labelColor=%20%2332b583&color=%20%2312b76a"></a>
<a href="https://github.com/langgenius/dify/" target="_blank">
<img alt="Issues closed" src="https://img.shields.io/github/issues-search?query=repo%3Alanggenius%2Fdify%20is%3Aclosed&label=issues%20closed&labelColor=%20%237d89b0&color=%20%235d6b98"></a>
<a href="https://github.com/langgenius/dify/discussions/" target="_blank">
<img alt="Discussion posts" src="https://img.shields.io/github/discussions/langgenius/dify?labelColor=%20%239b8afb&color=%20%237a5af8"></a>
</p>
<p align="center">
<a href="./README.md"><img alt="README em Inglês" src="https://img.shields.io/badge/English-d9d9d9"></a>
<a href="./README_CN.md"><img alt="简体中文版自述文件" src="https://img.shields.io/badge/简体中文-d9d9d9"></a>
<a href="./README_JA.md"><img alt="日本語のREADME" src="https://img.shields.io/badge/日本語-d9d9d9"></a>
<a href="./README_ES.md"><img alt="README em Espanhol" src="https://img.shields.io/badge/Español-d9d9d9"></a>
<a href="./README_FR.md"><img alt="README em Francês" src="https://img.shields.io/badge/Français-d9d9d9"></a>
<a href="./README_KL.md"><img alt="README tlhIngan Hol" src="https://img.shields.io/badge/Klingon-d9d9d9"></a>
<a href="./README_KR.md"><img alt="README em Coreano" src="https://img.shields.io/badge/한국어-d9d9d9"></a>
<a href="./README_AR.md"><img alt="README em Árabe" src="https://img.shields.io/badge/العربية-d9d9d9"></a>
<a href="./README_TR.md"><img alt="README em Turco" src="https://img.shields.io/badge/Türkçe-d9d9d9"></a>
<a href="./README_VI.md"><img alt="README em Vietnamita" src="https://img.shields.io/badge/Ti%E1%BA%BFng%20Vi%E1%BB%87t-d9d9d9"></a>
<a href="./README_PT.md"><img alt="README em Português - BR" src="https://img.shields.io/badge/Portugu%C3%AAs-BR?style=flat&label=BR&color=d9d9d9"></a>
</p>
Dify é uma plataforma de desenvolvimento de aplicativos LLM de código aberto. Sua interface intuitiva combina workflow de IA, pipeline RAG, capacidades de agente, gerenciamento de modelos, recursos de observabilidade e muito mais, permitindo que você vá rapidamente do protótipo à produção. Aqui está uma lista das principais funcionalidades:
</br> </br>
**1. Workflow**:
Construa e teste workflows poderosos de IA em uma interface visual, aproveitando todos os recursos a seguir e muito mais.
https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa
**2. Suporte abrangente a modelos**:
Integração perfeita com centenas de LLMs proprietários e de código aberto de diversas provedoras e soluções auto-hospedadas, abrangendo GPT, Mistral, Llama3 e qualquer modelo compatível com a API da OpenAI. A lista completa de provedores suportados pode ser encontrada [aqui](https://docs.dify.ai/getting-started/readme/model-providers).
![providers-v5](https://github.com/langgenius/dify/assets/13230914/5a17bdbe-097a-4100-8363-40255b70f6e3)
**3. IDE de Prompt**:
Interface intuitiva para criação de prompts, comparação de desempenho de modelos e adição de recursos como conversão de texto para fala em um aplicativo baseado em chat.
**4. Pipeline RAG**:
Extensas capacidades de RAG que cobrem desde a ingestão de documentos até a recuperação, com suporte nativo para extração de texto de PDFs, PPTs e outros formatos de documentos comuns.
**5. Capacidades de agente**:
Você pode definir agentes com base em LLM Function Calling ou ReAct e adicionar ferramentas pré-construídas ou personalizadas para o agente. O Dify oferece mais de 50 ferramentas integradas para agentes de IA, como Google Search, DALL·E, Stable Diffusion e WolframAlpha.
**6. LLMOps**:
Monitore e analise os registros e o desempenho do aplicativo ao longo do tempo. É possível melhorar continuamente prompts, conjuntos de dados e modelos com base nos dados de produção e anotações.
**7. Backend como Serviço**:
Todas os recursos do Dify vêm com APIs correspondentes, permitindo que você integre o Dify sem esforço na lógica de negócios da sua empresa.
## Comparação de recursos
<table style="width: 100%;">
<tr>
<th align="center">Recurso</th>
<th align="center">Dify.AI</th>
<th align="center">LangChain</th>
<th align="center">Flowise</th>
<th align="center">OpenAI Assistants API</th>
</tr>
<tr>
<td align="center">Abordagem de Programação</td>
<td align="center">Orientada a API + Aplicativo</td>
<td align="center">Código Python</td>
<td align="center">Orientada a Aplicativo</td>
<td align="center">Orientada a API</td>
</tr>
<tr>
<td align="center">LLMs Suportados</td>
<td align="center">Variedade Rica</td>
<td align="center">Variedade Rica</td>
<td align="center">Variedade Rica</td>
<td align="center">Apenas OpenAI</td>
</tr>
<tr>
<td align="center">RAG Engine</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">✅</td>
</tr>
<tr>
<td align="center">Agente</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">❌</td>
<td align="center">✅</td>
</tr>
<tr>
<td align="center">Workflow</td>
<td align="center">✅</td>
<td align="center">❌</td>
<td align="center">✅</td>
<td align="center">❌</td>
</tr>
<tr>
<td align="center">Observabilidade</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">❌</td>
<td align="center">❌</td>
</tr>
<tr>
<td align="center">Recursos Empresariais (SSO/Controle de Acesso)</td>
<td align="center">✅</td>
<td align="center">❌</td>
<td align="center">❌</td>
<td align="center">❌</td>
</tr>
<tr>
<td align="center">Implantação Local</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">✅</td>
<td align="center">❌</td>
</tr>
</table>
## Usando o Dify
- **Nuvem </br>**
Oferecemos o serviço [Dify Cloud](https://dify.ai) para qualquer pessoa experimentar sem nenhuma configuração. Ele fornece todas as funcionalidades da versão auto-hospedada, incluindo 200 chamadas GPT-4 gratuitas no plano sandbox.
- **Auto-hospedagem do Dify Community Edition</br>**
Configure rapidamente o Dify no seu ambiente com este [guia inicial](#quick-start).
Use nossa [documentação](https://docs.dify.ai) para referências adicionais e instruções mais detalhadas.
- **Dify para empresas/organizações</br>**
Oferecemos recursos adicionais voltados para empresas. [Envie suas perguntas através deste chatbot](https://udify.app/chat/22L1zSxg6yW1cWQg) ou [envie-nos um e-mail](mailto:business@dify.ai?subject=[GitHub]Business%20License%20Inquiry) para discutir necessidades empresariais. </br>
> Para startups e pequenas empresas que utilizam AWS, confira o [Dify Premium no AWS Marketplace](https://aws.amazon.com/marketplace/pp/prodview-t22mebxzwjhu6) e implemente no seu próprio AWS VPC com um clique. É uma oferta AMI acessível com a opção de criar aplicativos com logotipo e marca personalizados.
## Mantendo-se atualizado
Dê uma estrela no Dify no GitHub e seja notificado imediatamente sobre novos lançamentos.
![star-us](https://github.com/langgenius/dify/assets/13230914/b823edc1-6388-4e25-ad45-2f6b187adbb4)
## Início rápido
> Antes de instalar o Dify, certifique-se de que sua máquina atenda aos seguintes requisitos mínimos de sistema:
>
>- CPU >= 2 Núcleos
>- RAM >= 4 GiB
</br>
A maneira mais fácil de iniciar o servidor Dify é executar nosso arquivo [docker-compose.yml](docker/docker-compose.yaml). Antes de rodar o comando de instalação, certifique-se de que o [Docker](https://docs.docker.com/get-docker/) e o [Docker Compose](https://docs.docker.com/compose/install/) estão instalados na sua máquina:
```bash
cd docker
cp .env.example .env
docker compose up -d
```
Após a execução, você pode acessar o painel do Dify no navegador em [http://localhost/install](http://localhost/install) e iniciar o processo de inicialização.
> Se você deseja contribuir com o Dify ou fazer desenvolvimento adicional, consulte nosso [guia para implantar a partir do código fonte](https://docs.dify.ai/getting-started/install-self-hosted/local-source-code).
## Próximos passos
Se precisar personalizar a configuração, consulte os comentários no nosso arquivo [.env.example](docker/.env.example) e atualize os valores correspondentes no seu arquivo `.env`. Além disso, talvez seja necessário fazer ajustes no próprio arquivo `docker-compose.yaml`, como alterar versões de imagem, mapeamentos de portas ou montagens de volumes, com base no seu ambiente de implantação específico e nas suas necessidades. Após fazer quaisquer alterações, execute novamente `docker-compose up -d`. Você pode encontrar a lista completa de variáveis de ambiente disponíveis [aqui](https://docs.dify.ai/getting-started/install-self-hosted/environments).
Se deseja configurar uma instalação de alta disponibilidade, há [Helm Charts](https://helm.sh/) e arquivos YAML contribuídos pela comunidade que permitem a implantação do Dify no Kubernetes.
- [Helm Chart de @LeoQuote](https://github.com/douban/charts/tree/master/charts/dify)
- [Helm Chart de @BorisPolonsky](https://github.com/BorisPolonsky/dify-helm)
- [Arquivo YAML de @Winson-030](https://github.com/Winson-030/dify-kubernetes)
#### Usando o Terraform para Implantação
Implante o Dify na Plataforma Cloud com um único clique usando [terraform](https://www.terraform.io/)
##### Azure Global
- [Azure Terraform por @nikawang](https://github.com/nikawang/dify-azure-terraform)
##### Google Cloud
- [Google Cloud Terraform por @sotazum](https://github.com/DeNA/dify-google-cloud-terraform)
## Contribuindo
Para aqueles que desejam contribuir com código, veja nosso [Guia de Contribuição](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
Ao mesmo tempo, considere apoiar o Dify compartilhando-o nas redes sociais e em eventos e conferências.
> Estamos buscando contribuidores para ajudar na tradução do Dify para idiomas além de Mandarim e Inglês. Se você tiver interesse em ajudar, consulte o [README i18n](https://github.com/langgenius/dify/blob/main/web/i18n/README.md) para mais informações e deixe-nos um comentário no canal `global-users` em nosso [Servidor da Comunidade no Discord](https://discord.gg/8Tpq4AcN9c).
**Contribuidores**
<a href="https://github.com/langgenius/dify/graphs/contributors">
<img src="https://contrib.rocks/image?repo=langgenius/dify" />
</a>
## Comunidade e contato
* [Discussões no GitHub](https://github.com/langgenius/dify/discussions). Melhor para: compartilhar feedback e fazer perguntas.
* [Problemas no GitHub](https://github.com/langgenius/dify/issues). Melhor para: relatar bugs encontrados no Dify.AI e propor novos recursos. Veja nosso [Guia de Contribuição](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
* [Discord](https://discord.gg/FngNHpbcY7). Melhor para: compartilhar suas aplicações e interagir com a comunidade.
* [X(Twitter)](https://twitter.com/dify_ai). Melhor para: compartilhar suas aplicações e interagir com a comunidade.
## Histórico de estrelas
[![Gráfico de Histórico de Estrelas](https://api.star-history.com/svg?repos=langgenius/dify&type=Date)](https://star-history.com/#langgenius/dify&Date)
## Divulgação de segurança
Para proteger sua privacidade, evite postar problemas de segurança no GitHub. Em vez disso, envie suas perguntas para security@dify.ai e forneceremos uma resposta mais detalhada.
## Licença
Este repositório está disponível sob a [Licença de Código Aberto Dify](LICENSE), que é essencialmente Apache 2.0 com algumas restrições adicionais.

View File

@@ -31,17 +31,8 @@ REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_USERNAME=
REDIS_PASSWORD=difyai123456
REDIS_USE_SSL=false
REDIS_DB=0
# redis Sentinel configuration.
REDIS_USE_SENTINEL=false
REDIS_SENTINELS=
REDIS_SENTINEL_SERVICE_NAME=
REDIS_SENTINEL_USERNAME=
REDIS_SENTINEL_PASSWORD=
REDIS_SENTINEL_SOCKET_TIMEOUT=0.1
# PostgreSQL database configuration
DB_USERNAME=postgres
DB_PASSWORD=difyai123456
@@ -51,7 +42,7 @@ DB_DATABASE=dify
# Storage configuration
# use for store upload files, private keys...
# storage type: local, s3, aliyun-oss, azure-blob, baidu-obs, google-storage, huawei-obs, oci-storage, tencent-cos, volcengine-tos, supabase
# storage type: local, s3, azure-blob, google-storage, tencent-cos, huawei-obs, volcengine-tos, baidu-obs, supabase
STORAGE_TYPE=local
STORAGE_LOCAL_PATH=storage
S3_USE_AWS_MANAGED_IAM=false
@@ -120,8 +111,7 @@ SUPABASE_URL=your-server-url
WEB_API_CORS_ALLOW_ORIGINS=http://127.0.0.1:3000,*
CONSOLE_CORS_ALLOW_ORIGINS=http://127.0.0.1:3000,*
# Vector database configuration, support: weaviate, qdrant, milvus, myscale, relyt, pgvecto_rs, pgvector, pgvector, chroma, opensearch, tidb_vector, couchbase, vikingdb, upstash, lindorm
# Vector database configuration, support: weaviate, qdrant, milvus, myscale, relyt, pgvecto_rs, pgvector, pgvector, chroma, opensearch, tidb_vector, vikingdb
VECTOR_STORE=weaviate
# Weaviate configuration
@@ -137,13 +127,6 @@ QDRANT_CLIENT_TIMEOUT=20
QDRANT_GRPC_ENABLED=false
QDRANT_GRPC_PORT=6334
#Couchbase configuration
COUCHBASE_CONNECTION_STRING=127.0.0.1
COUCHBASE_USER=Administrator
COUCHBASE_PASSWORD=password
COUCHBASE_BUCKET_NAME=Embeddings
COUCHBASE_SCOPE_NAME=_default
# Milvus configuration
MILVUS_URI=http://127.0.0.1:19530
MILVUS_TOKEN=
@@ -203,20 +186,6 @@ TIDB_VECTOR_USER=xxx.root
TIDB_VECTOR_PASSWORD=xxxxxx
TIDB_VECTOR_DATABASE=dify
# Tidb on qdrant configuration
TIDB_ON_QDRANT_URL=http://127.0.0.1
TIDB_ON_QDRANT_API_KEY=dify
TIDB_ON_QDRANT_CLIENT_TIMEOUT=20
TIDB_ON_QDRANT_GRPC_ENABLED=false
TIDB_ON_QDRANT_GRPC_PORT=6334
TIDB_PUBLIC_KEY=dify
TIDB_PRIVATE_KEY=dify
TIDB_API_URL=http://127.0.0.1
TIDB_IAM_API_URL=http://127.0.0.1
TIDB_REGION=regions/aws-us-east-1
TIDB_PROJECT_ID=dify
TIDB_SPEND_LIMIT=100
# Chroma configuration
CHROMA_HOST=127.0.0.1
CHROMA_PORT=8000
@@ -251,10 +220,6 @@ BAIDU_VECTOR_DB_DATABASE=dify
BAIDU_VECTOR_DB_SHARD=1
BAIDU_VECTOR_DB_REPLICAS=3
# Upstash configuration
UPSTASH_VECTOR_URL=your-server-url
UPSTASH_VECTOR_TOKEN=your-access-token
# ViKingDB configuration
VIKINGDB_ACCESS_KEY=your-ak
VIKINGDB_SECRET_KEY=your-sk
@@ -264,20 +229,6 @@ VIKINGDB_SCHEMA=http
VIKINGDB_CONNECTION_TIMEOUT=30
VIKINGDB_SOCKET_TIMEOUT=30
# Lindorm configuration
LINDORM_URL=http://ld-*******************-proxy-search-pub.lindorm.aliyuncs.com:30070
LINDORM_USERNAME=admin
LINDORM_PASSWORD=admin
# OceanBase Vector configuration
OCEANBASE_VECTOR_HOST=127.0.0.1
OCEANBASE_VECTOR_PORT=2881
OCEANBASE_VECTOR_USER=root@test
OCEANBASE_VECTOR_PASSWORD=
OCEANBASE_VECTOR_DATABASE=test
OCEANBASE_MEMORY_LIMIT=6G
# Upload configuration
UPLOAD_FILE_SIZE_LIMIT=15
UPLOAD_FILE_BATCH_LIMIT=5
@@ -288,7 +239,6 @@ UPLOAD_AUDIO_FILE_SIZE_LIMIT=50
# Model Configuration
MULTIMODAL_SEND_IMAGE_FORMAT=base64
PROMPT_GENERATION_MAX_TOKENS=512
CODE_GENERATION_MAX_TOKENS=1024
# Mail configuration, support: resend, smtp
MAIL_TYPE=
@@ -354,10 +304,6 @@ RESPECT_XFORWARD_HEADERS_ENABLED=false
# Log file path
LOG_FILE=
# Log file max size, the unit is MB
LOG_FILE_MAX_SIZE=20
# Log file max backup count
LOG_FILE_BACKUP_COUNT=5
# Indexing configuration
INDEXING_MAX_SEGMENTATION_TOKENS_LENGTH=1000
@@ -384,6 +330,3 @@ POSITION_TOOL_EXCLUDES=
POSITION_PROVIDER_PINS=
POSITION_PROVIDER_INCLUDES=
POSITION_PROVIDER_EXCLUDES=
# Reset password token expiry minutes
RESET_PASSWORD_TOKEN_EXPIRY_MINUTES=5

View File

@@ -55,14 +55,7 @@ RUN apt-get update \
&& echo "deb http://deb.debian.org/debian testing main" > /etc/apt/sources.list \
&& apt-get update \
# For Security
&& apt-get install -y --no-install-recommends expat=2.6.3-2 libldap-2.5-0=2.5.18+dfsg-3+b1 perl=5.40.0-6 libsqlite3-0=3.46.1-1 \
&& if [ "$(dpkg --print-architecture)" = "amd64" ]; then \
apt-get install -y --no-install-recommends zlib1g=1:1.3.dfsg+really1.3.1-1+b1; \
else \
apt-get install -y --no-install-recommends zlib1g=1:1.3.dfsg+really1.3.1-1; \
fi \
# install a chinese font to support the use of tools like matplotlib
&& apt-get install -y fonts-noto-cjk \
&& apt-get install -y --no-install-recommends zlib1g=1:1.3.dfsg+really1.3.1-1 expat=2.6.3-1 libldap-2.5-0=2.5.18+dfsg-3 perl=5.38.2-5 libsqlite3-0=3.46.0-1 \
&& apt-get autoremove -y \
&& rm -rf /var/lib/apt/lists/*

View File

@@ -85,4 +85,3 @@
cd ../
poetry run -C api bash dev/pytest/pytest_all_tests.sh
```

View File

@@ -1,7 +1,5 @@
import os
from configs import dify_config
if os.environ.get("DEBUG", "false").lower() != "true":
from gevent import monkey
@@ -12,20 +10,44 @@ if os.environ.get("DEBUG", "false").lower() != "true":
grpc.experimental.gevent.init_gevent()
import json
import logging
import sys
import threading
import time
import warnings
from logging.handlers import RotatingFileHandler
from flask import Response
from flask import Flask, Response, request
from flask_cors import CORS
from werkzeug.exceptions import Unauthorized
from app_factory import create_app
import contexts
from commands import register_commands
from configs import dify_config
# DO NOT REMOVE BELOW
from events import event_handlers # noqa: F401
from extensions import (
ext_celery,
ext_code_based_extension,
ext_compress,
ext_database,
ext_hosting_provider,
ext_login,
ext_mail,
ext_migrate,
ext_proxy_fix,
ext_redis,
ext_sentry,
ext_storage,
)
from extensions.ext_database import db
from extensions.ext_login import login_manager
from libs.passport import PassportService
# TODO: Find a way to avoid importing models here
from models import account, dataset, model, source, task, tool, tools, web # noqa: F401
from services.account_service import AccountService
# DO NOT REMOVE ABOVE
@@ -38,11 +60,194 @@ if hasattr(time, "tzset"):
time.tzset()
class DifyApp(Flask):
pass
# -------------
# Configuration
# -------------
config_type = os.getenv("EDITION", default="SELF_HOSTED") # ce edition first
# ----------------------------
# Application Factory Function
# ----------------------------
def create_flask_app_with_configs() -> Flask:
"""
create a raw flask app
with configs loaded from .env file
"""
dify_app = DifyApp(__name__)
dify_app.config.from_mapping(dify_config.model_dump())
# populate configs into system environment variables
for key, value in dify_app.config.items():
if isinstance(value, str):
os.environ[key] = value
elif isinstance(value, int | float | bool):
os.environ[key] = str(value)
elif value is None:
os.environ[key] = ""
return dify_app
def create_app() -> Flask:
app = create_flask_app_with_configs()
app.secret_key = app.config["SECRET_KEY"]
log_handlers = None
log_file = app.config.get("LOG_FILE")
if log_file:
log_dir = os.path.dirname(log_file)
os.makedirs(log_dir, exist_ok=True)
log_handlers = [
RotatingFileHandler(
filename=log_file,
maxBytes=1024 * 1024 * 1024,
backupCount=5,
),
logging.StreamHandler(sys.stdout),
]
logging.basicConfig(
level=app.config.get("LOG_LEVEL"),
format=app.config["LOG_FORMAT"],
datefmt=app.config.get("LOG_DATEFORMAT"),
handlers=log_handlers,
force=True,
)
log_tz = app.config.get("LOG_TZ")
if log_tz:
from datetime import datetime
import pytz
timezone = pytz.timezone(log_tz)
def time_converter(seconds):
return datetime.utcfromtimestamp(seconds).astimezone(timezone).timetuple()
for handler in logging.root.handlers:
assert handler.formatter
handler.formatter.converter = time_converter
initialize_extensions(app)
register_blueprints(app)
register_commands(app)
return app
def initialize_extensions(app):
# Since the application instance is now created, pass it to each Flask
# extension instance to bind it to the Flask application instance (app)
ext_compress.init_app(app)
ext_code_based_extension.init()
ext_database.init_app(app)
ext_migrate.init(app, db)
ext_redis.init_app(app)
ext_storage.init_app(app)
ext_celery.init_app(app)
ext_login.init_app(app)
ext_mail.init_app(app)
ext_hosting_provider.init_app(app)
ext_sentry.init_app(app)
ext_proxy_fix.init_app(app)
# Flask-Login configuration
@login_manager.request_loader
def load_user_from_request(request_from_flask_login):
"""Load user based on the request."""
if request.blueprint not in {"console", "inner_api"}:
return None
# Check if the user_id contains a dot, indicating the old format
auth_header = request.headers.get("Authorization", "")
if not auth_header:
auth_token = request.args.get("_token")
if not auth_token:
raise Unauthorized("Invalid Authorization token.")
else:
if " " not in auth_header:
raise Unauthorized("Invalid Authorization header format. Expected 'Bearer <api-key>' format.")
auth_scheme, auth_token = auth_header.split(None, 1)
auth_scheme = auth_scheme.lower()
if auth_scheme != "bearer":
raise Unauthorized("Invalid Authorization header format. Expected 'Bearer <api-key>' format.")
decoded = PassportService().verify(auth_token)
user_id = decoded.get("user_id")
logged_in_account = AccountService.load_logged_in_account(account_id=user_id)
if logged_in_account:
contexts.tenant_id.set(logged_in_account.current_tenant_id)
return logged_in_account
@login_manager.unauthorized_handler
def unauthorized_handler():
"""Handle unauthorized requests."""
return Response(
json.dumps({"code": "unauthorized", "message": "Unauthorized."}),
status=401,
content_type="application/json",
)
# register blueprint routers
def register_blueprints(app):
from controllers.console import bp as console_app_bp
from controllers.files import bp as files_bp
from controllers.inner_api import bp as inner_api_bp
from controllers.service_api import bp as service_api_bp
from controllers.web import bp as web_bp
CORS(
service_api_bp,
allow_headers=["Content-Type", "Authorization", "X-App-Code"],
methods=["GET", "PUT", "POST", "DELETE", "OPTIONS", "PATCH"],
)
app.register_blueprint(service_api_bp)
CORS(
web_bp,
resources={r"/*": {"origins": app.config["WEB_API_CORS_ALLOW_ORIGINS"]}},
supports_credentials=True,
allow_headers=["Content-Type", "Authorization", "X-App-Code"],
methods=["GET", "PUT", "POST", "DELETE", "OPTIONS", "PATCH"],
expose_headers=["X-Version", "X-Env"],
)
app.register_blueprint(web_bp)
CORS(
console_app_bp,
resources={r"/*": {"origins": app.config["CONSOLE_CORS_ALLOW_ORIGINS"]}},
supports_credentials=True,
allow_headers=["Content-Type", "Authorization"],
methods=["GET", "PUT", "POST", "DELETE", "OPTIONS", "PATCH"],
expose_headers=["X-Version", "X-Env"],
)
app.register_blueprint(console_app_bp)
CORS(files_bp, allow_headers=["Content-Type"], methods=["GET", "PUT", "POST", "DELETE", "OPTIONS", "PATCH"])
app.register_blueprint(files_bp)
app.register_blueprint(inner_api_bp)
# create app
app = create_app()
celery = app.extensions["celery"]
if dify_config.TESTING:
if app.config.get("TESTING"):
print("App is running in TESTING mode")
@@ -50,15 +255,15 @@ if dify_config.TESTING:
def after_request(response):
"""Add Version headers to the response."""
response.set_cookie("remember_token", "", expires=0)
response.headers.add("X-Version", dify_config.CURRENT_VERSION)
response.headers.add("X-Env", dify_config.DEPLOY_ENV)
response.headers.add("X-Version", app.config["CURRENT_VERSION"])
response.headers.add("X-Env", app.config["DEPLOY_ENV"])
return response
@app.route("/health")
def health():
return Response(
json.dumps({"pid": os.getpid(), "status": "ok", "version": dify_config.CURRENT_VERSION}),
json.dumps({"pid": os.getpid(), "status": "ok", "version": app.config["CURRENT_VERSION"]}),
status=200,
content_type="application/json",
)
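For orientation, a hedged sketch of how an application factory like the `create_app()` shown above is typically served. The `app_factory` import path comes from this diff; the gunicorn flags and port are assumptions, not part of the change.

```python
# Hedged sketch: serving the Flask app factory shown above.
# Production-style (flags and port are illustrative assumptions):
#
#   gunicorn "app_factory:create_app()" --bind 0.0.0.0:5001 --worker-class gevent
#
# Quick local run:
from app_factory import create_app

flask_app = create_app()

if __name__ == "__main__":
    flask_app.run(host="0.0.0.0", port=5001)
```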

View File

@@ -1,176 +0,0 @@
import os
if os.environ.get("DEBUG", "false").lower() != "true":
from gevent import monkey
monkey.patch_all()
import grpc.experimental.gevent
grpc.experimental.gevent.init_gevent()
import json
from flask import Flask, Response, request
from flask_cors import CORS
from werkzeug.exceptions import Unauthorized
import contexts
from commands import register_commands
from configs import dify_config
from extensions import (
ext_celery,
ext_code_based_extension,
ext_compress,
ext_database,
ext_hosting_provider,
ext_logging,
ext_login,
ext_mail,
ext_migrate,
ext_proxy_fix,
ext_redis,
ext_sentry,
ext_storage,
)
from extensions.ext_database import db
from extensions.ext_login import login_manager
from libs.passport import PassportService
from services.account_service import AccountService
class DifyApp(Flask):
pass
# ----------------------------
# Application Factory Function
# ----------------------------
def create_flask_app_with_configs() -> Flask:
"""
create a raw flask app
with configs loaded from .env file
"""
dify_app = DifyApp(__name__)
dify_app.config.from_mapping(dify_config.model_dump())
# populate configs into system environment variables
for key, value in dify_app.config.items():
if isinstance(value, str):
os.environ[key] = value
elif isinstance(value, int | float | bool):
os.environ[key] = str(value)
elif value is None:
os.environ[key] = ""
return dify_app
def create_app() -> Flask:
app = create_flask_app_with_configs()
app.secret_key = dify_config.SECRET_KEY
initialize_extensions(app)
register_blueprints(app)
register_commands(app)
return app
def initialize_extensions(app):
# Since the application instance is now created, pass it to each Flask
# extension instance to bind it to the Flask application instance (app)
ext_logging.init_app(app)
ext_compress.init_app(app)
ext_code_based_extension.init()
ext_database.init_app(app)
ext_migrate.init(app, db)
ext_redis.init_app(app)
ext_storage.init_app(app)
ext_celery.init_app(app)
ext_login.init_app(app)
ext_mail.init_app(app)
ext_hosting_provider.init_app(app)
ext_sentry.init_app(app)
ext_proxy_fix.init_app(app)
# Flask-Login configuration
@login_manager.request_loader
def load_user_from_request(request_from_flask_login):
"""Load user based on the request."""
if request.blueprint not in {"console", "inner_api"}:
return None
# Check if the user_id contains a dot, indicating the old format
auth_header = request.headers.get("Authorization", "")
if not auth_header:
auth_token = request.args.get("_token")
if not auth_token:
raise Unauthorized("Invalid Authorization token.")
else:
if " " not in auth_header:
raise Unauthorized("Invalid Authorization header format. Expected 'Bearer <api-key>' format.")
auth_scheme, auth_token = auth_header.split(None, 1)
auth_scheme = auth_scheme.lower()
if auth_scheme != "bearer":
raise Unauthorized("Invalid Authorization header format. Expected 'Bearer <api-key>' format.")
decoded = PassportService().verify(auth_token)
user_id = decoded.get("user_id")
logged_in_account = AccountService.load_logged_in_account(account_id=user_id)
if logged_in_account:
contexts.tenant_id.set(logged_in_account.current_tenant_id)
return logged_in_account
@login_manager.unauthorized_handler
def unauthorized_handler():
"""Handle unauthorized requests."""
return Response(
json.dumps({"code": "unauthorized", "message": "Unauthorized."}),
status=401,
content_type="application/json",
)
# register blueprint routers
def register_blueprints(app):
from controllers.console import bp as console_app_bp
from controllers.files import bp as files_bp
from controllers.inner_api import bp as inner_api_bp
from controllers.service_api import bp as service_api_bp
from controllers.web import bp as web_bp
CORS(
service_api_bp,
allow_headers=["Content-Type", "Authorization", "X-App-Code"],
methods=["GET", "PUT", "POST", "DELETE", "OPTIONS", "PATCH"],
)
app.register_blueprint(service_api_bp)
CORS(
web_bp,
resources={r"/*": {"origins": dify_config.WEB_API_CORS_ALLOW_ORIGINS}},
supports_credentials=True,
allow_headers=["Content-Type", "Authorization", "X-App-Code"],
methods=["GET", "PUT", "POST", "DELETE", "OPTIONS", "PATCH"],
expose_headers=["X-Version", "X-Env"],
)
app.register_blueprint(web_bp)
CORS(
console_app_bp,
resources={r"/*": {"origins": dify_config.CONSOLE_CORS_ALLOW_ORIGINS}},
supports_credentials=True,
allow_headers=["Content-Type", "Authorization"],
methods=["GET", "PUT", "POST", "DELETE", "OPTIONS", "PATCH"],
expose_headers=["X-Version", "X-Env"],
)
app.register_blueprint(console_app_bp)
CORS(files_bp, allow_headers=["Content-Type"], methods=["GET", "PUT", "POST", "DELETE", "OPTIONS", "PATCH"])
app.register_blueprint(files_bp)
app.register_blueprint(inner_api_bp)

View File

@@ -259,28 +259,6 @@ def migrate_knowledge_vector_database():
skipped_count = 0
total_count = 0
vector_type = dify_config.VECTOR_STORE
upper_colletion_vector_types = {
VectorType.MILVUS,
VectorType.PGVECTOR,
VectorType.RELYT,
VectorType.WEAVIATE,
VectorType.ORACLE,
VectorType.ELASTICSEARCH,
}
lower_colletion_vector_types = {
VectorType.ANALYTICDB,
VectorType.CHROMA,
VectorType.MYSCALE,
VectorType.PGVECTO_RS,
VectorType.TIDB_VECTOR,
VectorType.OPENSEARCH,
VectorType.TENCENT,
VectorType.BAIDU,
VectorType.VIKINGDB,
VectorType.UPSTASH,
VectorType.COUCHBASE,
VectorType.OCEANBASE,
}
page = 1
while True:
try:
@@ -306,9 +284,11 @@ def migrate_knowledge_vector_database():
skipped_count = skipped_count + 1
continue
collection_name = ""
dataset_id = dataset.id
if vector_type in upper_colletion_vector_types:
if vector_type == VectorType.WEAVIATE:
dataset_id = dataset.id
collection_name = Dataset.gen_collection_name_by_id(dataset_id)
index_struct_dict = {"type": VectorType.WEAVIATE, "vector_store": {"class_prefix": collection_name}}
dataset.index_struct = json.dumps(index_struct_dict)
elif vector_type == VectorType.QDRANT:
if dataset.collection_binding_id:
dataset_collection_binding = (
@@ -321,15 +301,63 @@ def migrate_knowledge_vector_database():
else:
raise ValueError("Dataset Collection Binding not found")
else:
dataset_id = dataset.id
collection_name = Dataset.gen_collection_name_by_id(dataset_id)
index_struct_dict = {"type": VectorType.QDRANT, "vector_store": {"class_prefix": collection_name}}
dataset.index_struct = json.dumps(index_struct_dict)
elif vector_type in lower_colletion_vector_types:
collection_name = Dataset.gen_collection_name_by_id(dataset_id).lower()
elif vector_type == VectorType.MILVUS:
dataset_id = dataset.id
collection_name = Dataset.gen_collection_name_by_id(dataset_id)
index_struct_dict = {"type": VectorType.MILVUS, "vector_store": {"class_prefix": collection_name}}
dataset.index_struct = json.dumps(index_struct_dict)
elif vector_type == VectorType.RELYT:
dataset_id = dataset.id
collection_name = Dataset.gen_collection_name_by_id(dataset_id)
index_struct_dict = {"type": "relyt", "vector_store": {"class_prefix": collection_name}}
dataset.index_struct = json.dumps(index_struct_dict)
elif vector_type == VectorType.TENCENT:
dataset_id = dataset.id
collection_name = Dataset.gen_collection_name_by_id(dataset_id)
index_struct_dict = {"type": VectorType.TENCENT, "vector_store": {"class_prefix": collection_name}}
dataset.index_struct = json.dumps(index_struct_dict)
elif vector_type == VectorType.PGVECTOR:
dataset_id = dataset.id
collection_name = Dataset.gen_collection_name_by_id(dataset_id)
index_struct_dict = {"type": VectorType.PGVECTOR, "vector_store": {"class_prefix": collection_name}}
dataset.index_struct = json.dumps(index_struct_dict)
elif vector_type == VectorType.OPENSEARCH:
dataset_id = dataset.id
collection_name = Dataset.gen_collection_name_by_id(dataset_id)
index_struct_dict = {
"type": VectorType.OPENSEARCH,
"vector_store": {"class_prefix": collection_name},
}
dataset.index_struct = json.dumps(index_struct_dict)
elif vector_type == VectorType.ANALYTICDB:
dataset_id = dataset.id
collection_name = Dataset.gen_collection_name_by_id(dataset_id)
index_struct_dict = {
"type": VectorType.ANALYTICDB,
"vector_store": {"class_prefix": collection_name},
}
dataset.index_struct = json.dumps(index_struct_dict)
elif vector_type == VectorType.ELASTICSEARCH:
dataset_id = dataset.id
index_name = Dataset.gen_collection_name_by_id(dataset_id)
index_struct_dict = {"type": "elasticsearch", "vector_store": {"class_prefix": index_name}}
dataset.index_struct = json.dumps(index_struct_dict)
elif vector_type == VectorType.BAIDU:
dataset_id = dataset.id
collection_name = Dataset.gen_collection_name_by_id(dataset_id)
index_struct_dict = {
"type": VectorType.BAIDU,
"vector_store": {"class_prefix": collection_name},
}
dataset.index_struct = json.dumps(index_struct_dict)
else:
raise ValueError(f"Vector store {vector_type} is not supported.")
index_struct_dict = {"type": vector_type, "vector_store": {"class_prefix": collection_name}}
dataset.index_struct = json.dumps(index_struct_dict)
vector = Vector(dataset)
click.echo(f"Migrating dataset {dataset.id}.")
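The migration above splits vector stores into those that keep the generated collection name as-is and those that lower-case it. A standalone sketch of that convention; the name format and the set membership are assumptions for illustration, not the authoritative Dify implementation.

```python
# Illustrative sketch of the collection-name casing convention the migration relies on.
# The "Vector_index_<id>_Node" format and the set below are assumptions for illustration.
LOWERCASE_VECTOR_TYPES = {
    "analyticdb", "chroma", "myscale", "pgvecto_rs", "tidb_vector",
    "opensearch", "tencent", "baidu", "vikingdb", "upstash", "couchbase", "oceanbase",
}

def collection_name_for(dataset_id: str, vector_type: str) -> str:
    # Collection names are derived from the dataset id; some backends want them lower-cased.
    name = f"Vector_index_{dataset_id.replace('-', '_')}_Node"
    return name.lower() if vector_type in LOWERCASE_VECTOR_TYPES else name

print(collection_name_for("123e4567-e89b-12d3-a456-426614174000", "chroma"))
```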

View File

@@ -1,15 +1,6 @@
from typing import Annotated, Literal, Optional
from pydantic import (
AliasChoices,
Field,
HttpUrl,
NegativeInt,
NonNegativeInt,
PositiveFloat,
PositiveInt,
computed_field,
)
from pydantic import AliasChoices, Field, HttpUrl, NegativeInt, NonNegativeInt, PositiveInt, computed_field
from pydantic_settings import BaseSettings
from configs.feature.hosted_service import HostedServiceConfig
@@ -27,24 +18,9 @@ class SecurityConfig(BaseSettings):
default="",
)
RESET_PASSWORD_TOKEN_EXPIRY_MINUTES: PositiveInt = Field(
description="Duration in minutes for which a password reset token remains valid",
default=5,
)
LOGIN_DISABLED: bool = Field(
description="Whether to disable login checks",
default=False,
)
ADMIN_API_KEY_ENABLE: bool = Field(
description="Whether to enable admin api key for authentication",
default=False,
)
ADMIN_API_KEY: Optional[str] = Field(
description="admin api key for authentication",
default=None,
RESET_PASSWORD_TOKEN_EXPIRY_HOURS: PositiveInt = Field(
description="Duration in hours for which a password reset token remains valid",
default=24,
)
@@ -319,16 +295,6 @@ class LoggingConfig(BaseSettings):
default=None,
)
LOG_FILE_MAX_SIZE: PositiveInt = Field(
description="Maximum file size for file rotation retention, the unit is megabytes (MB)",
default=20,
)
LOG_FILE_BACKUP_COUNT: PositiveInt = Field(
description="Maximum file backup count for file rotation retention",
default=5,
)
LOG_FORMAT: str = Field(
description="Format string for log messages",
default="%(asctime)s.%(msecs)03d %(levelname)s [%(threadName)s] [%(filename)s:%(lineno)d] - %(message)s",
@@ -517,11 +483,6 @@ class MailConfig(BaseSettings):
default=False,
)
EMAIL_SEND_IP_LIMIT_PER_MINUTE: PositiveInt = Field(
description="Maximum number of emails allowed to be sent from the same IP address in a minute",
default=50,
)
class RagEtlConfig(BaseSettings):
"""
@@ -556,26 +517,16 @@ class DataSetConfig(BaseSettings):
Configuration for dataset management
"""
PLAN_SANDBOX_CLEAN_DAY_SETTING: PositiveInt = Field(
description="Interval in days for dataset cleanup operations - plan: sandbox",
CLEAN_DAY_SETTING: PositiveInt = Field(
description="Interval in days for dataset cleanup operations",
default=30,
)
PLAN_PRO_CLEAN_DAY_SETTING: PositiveInt = Field(
description="Interval in days for dataset cleanup operations - plan: pro and team",
default=7,
)
DATASET_OPERATOR_ENABLED: bool = Field(
description="Enable or disable dataset operator functionality",
default=False,
)
TIDB_SERVERLESS_NUMBER: PositiveInt = Field(
description="number of tidb serverless cluster",
default=500,
)
class WorkspaceConfig(BaseSettings):
"""
@@ -669,33 +620,6 @@ class PositionConfig(BaseSettings):
return {item.strip() for item in self.POSITION_TOOL_EXCLUDES.split(",") if item.strip() != ""}
class LoginConfig(BaseSettings):
ENABLE_EMAIL_CODE_LOGIN: bool = Field(
description="whether to enable email code login",
default=False,
)
ENABLE_EMAIL_PASSWORD_LOGIN: bool = Field(
description="whether to enable email password login",
default=True,
)
ENABLE_SOCIAL_OAUTH_LOGIN: bool = Field(
description="whether to enable github/google oauth login",
default=False,
)
EMAIL_CODE_LOGIN_TOKEN_EXPIRY_MINUTES: PositiveInt = Field(
description="expiry time in minutes for email code login token",
default=5,
)
ALLOW_REGISTER: bool = Field(
description="whether to enable register",
default=False,
)
ALLOW_CREATE_WORKSPACE: bool = Field(
description="whether to enable create workspace",
default=False,
)
class FeatureConfig(
# place the configs in alphabet order
AppExecutionConfig,
@@ -721,7 +645,6 @@ class FeatureConfig(
UpdateConfig,
WorkflowConfig,
WorkspaceConfig,
LoginConfig,
# hosted services config
HostedServiceConfig,
CeleryBeatConfig,
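The `FeatureConfig` block above composes many small `BaseSettings` classes by multiple inheritance. A minimal sketch of that pattern with hypothetical stand-in classes, not the real Dify settings:

```python
# Sketch of the composition pattern used by FeatureConfig above.
# SecurityBits / LoggingBits are illustrative stand-ins, not Dify classes.
from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings

class SecurityBits(BaseSettings):
    RESET_PASSWORD_TOKEN_EXPIRY_MINUTES: PositiveInt = Field(default=5)

class LoggingBits(BaseSettings):
    LOG_FILE_MAX_SIZE: PositiveInt = Field(default=20)  # megabytes

class CombinedConfig(SecurityBits, LoggingBits):
    pass

cfg = CombinedConfig()  # each field resolves from an env var, else falls back to its default
print(cfg.RESET_PASSWORD_TOKEN_EXPIRY_MINUTES, cfg.LOG_FILE_MAX_SIZE)
```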

View File

@@ -16,14 +16,10 @@ from configs.middleware.storage.supabase_storage_config import SupabaseStorageCo
from configs.middleware.storage.tencent_cos_storage_config import TencentCloudCOSStorageConfig
from configs.middleware.storage.volcengine_tos_storage_config import VolcengineTOSStorageConfig
from configs.middleware.vdb.analyticdb_config import AnalyticdbConfig
from configs.middleware.vdb.baidu_vector_config import BaiduVectorDBConfig
from configs.middleware.vdb.chroma_config import ChromaConfig
from configs.middleware.vdb.couchbase_config import CouchbaseConfig
from configs.middleware.vdb.elasticsearch_config import ElasticsearchConfig
from configs.middleware.vdb.lindorm_config import LindormConfig
from configs.middleware.vdb.milvus_config import MilvusConfig
from configs.middleware.vdb.myscale_config import MyScaleConfig
from configs.middleware.vdb.oceanbase_config import OceanBaseVectorConfig
from configs.middleware.vdb.opensearch_config import OpenSearchConfig
from configs.middleware.vdb.oracle_config import OracleConfig
from configs.middleware.vdb.pgvector_config import PGVectorConfig
@@ -31,9 +27,7 @@ from configs.middleware.vdb.pgvectors_config import PGVectoRSConfig
from configs.middleware.vdb.qdrant_config import QdrantConfig
from configs.middleware.vdb.relyt_config import RelytConfig
from configs.middleware.vdb.tencent_vector_config import TencentVectorDBConfig
from configs.middleware.vdb.tidb_on_qdrant_config import TidbOnQdrantConfig
from configs.middleware.vdb.tidb_vector_config import TiDBVectorConfig
from configs.middleware.vdb.upstash_config import UpstashConfig
from configs.middleware.vdb.vikingdb_config import VikingDBConfig
from configs.middleware.vdb.weaviate_config import WeaviateConfig
@@ -41,8 +35,7 @@ from configs.middleware.vdb.weaviate_config import WeaviateConfig
class StorageConfig(BaseSettings):
STORAGE_TYPE: str = Field(
description="Type of storage to use."
" Options: 'local', 's3', 'aliyun-oss', 'azure-blob', 'baidu-obs', 'google-storage', 'huawei-obs', "
"'oci-storage', 'tencent-cos', 'volcengine-tos', 'supabase'. Default is 'local'.",
" Options: 'local', 's3', 'azure-blob', 'aliyun-oss', 'google-storage'. Default is 'local'.",
default="local",
)
@@ -59,11 +52,6 @@ class VectorStoreConfig(BaseSettings):
default=None,
)
VECTOR_STORE_WHITELIST_ENABLE: Optional[bool] = Field(
description="Enable whitelist for vector store.",
default=False,
)
class KeywordStoreConfig(BaseSettings):
KEYWORD_STORE: str = Field(
@@ -255,13 +243,7 @@ class MiddlewareConfig(
TiDBVectorConfig,
WeaviateConfig,
ElasticsearchConfig,
CouchbaseConfig,
InternalTestConfig,
VikingDBConfig,
UpstashConfig,
TidbOnQdrantConfig,
LindormConfig,
OceanBaseVectorConfig,
BaiduVectorDBConfig,
):
pass

View File

@@ -1,34 +0,0 @@
from typing import Optional
from pydantic import BaseModel, Field
class CouchbaseConfig(BaseModel):
"""
Couchbase configs
"""
COUCHBASE_CONNECTION_STRING: Optional[str] = Field(
description="COUCHBASE connection string",
default=None,
)
COUCHBASE_USER: Optional[str] = Field(
description="COUCHBASE user",
default=None,
)
COUCHBASE_PASSWORD: Optional[str] = Field(
description="COUCHBASE password",
default=None,
)
COUCHBASE_BUCKET_NAME: Optional[str] = Field(
description="COUCHBASE bucket name",
default=None,
)
COUCHBASE_SCOPE_NAME: Optional[str] = Field(
description="COUCHBASE scope name",
default=None,
)

View File

@@ -1,23 +0,0 @@
from typing import Optional
from pydantic import Field
from pydantic_settings import BaseSettings
class LindormConfig(BaseSettings):
"""
Lindorm configs
"""
LINDORM_URL: Optional[str] = Field(
description="Lindorm url",
default=None,
)
LINDORM_USERNAME: Optional[str] = Field(
description="Lindorm user",
default=None,
)
LINDORM_PASSWORD: Optional[str] = Field(
description="Lindorm password",
default=None,
)

View File

@@ -1,35 +0,0 @@
from typing import Optional
from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class OceanBaseVectorConfig(BaseSettings):
"""
Configuration settings for OceanBase Vector database
"""
OCEANBASE_VECTOR_HOST: Optional[str] = Field(
description="Hostname or IP address of the OceanBase Vector server (e.g. 'localhost')",
default=None,
)
OCEANBASE_VECTOR_PORT: Optional[PositiveInt] = Field(
description="Port number on which the OceanBase Vector server is listening (default is 2881)",
default=2881,
)
OCEANBASE_VECTOR_USER: Optional[str] = Field(
description="Username for authenticating with the OceanBase Vector database",
default=None,
)
OCEANBASE_VECTOR_PASSWORD: Optional[str] = Field(
description="Password for authenticating with the OceanBase Vector database",
default=None,
)
OCEANBASE_VECTOR_DATABASE: Optional[str] = Field(
description="Name of the OceanBase Vector database to connect to",
default=None,
)

View File

@@ -14,7 +14,7 @@ class OracleConfig(BaseSettings):
default=None,
)
ORACLE_PORT: PositiveInt = Field(
ORACLE_PORT: Optional[PositiveInt] = Field(
description="Port number on which the Oracle database server is listening (default is 1521)",
default=1521,
)

View File

@@ -14,7 +14,7 @@ class PGVectorConfig(BaseSettings):
default=None,
)
PGVECTOR_PORT: PositiveInt = Field(
PGVECTOR_PORT: Optional[PositiveInt] = Field(
description="Port number on which the PostgreSQL server is listening (default is 5433)",
default=5433,
)

View File

@@ -14,7 +14,7 @@ class PGVectoRSConfig(BaseSettings):
default=None,
)
PGVECTO_RS_PORT: PositiveInt = Field(
PGVECTO_RS_PORT: Optional[PositiveInt] = Field(
description="Port number on which the PostgreSQL server with PGVecto.RS is listening (default is 5431)",
default=5431,
)

View File

@@ -1,70 +0,0 @@
from typing import Optional
from pydantic import Field, NonNegativeInt, PositiveInt
from pydantic_settings import BaseSettings
class TidbOnQdrantConfig(BaseSettings):
"""
Tidb on Qdrant configs
"""
TIDB_ON_QDRANT_URL: Optional[str] = Field(
description="Tidb on Qdrant url",
default=None,
)
TIDB_ON_QDRANT_API_KEY: Optional[str] = Field(
description="Tidb on Qdrant api key",
default=None,
)
TIDB_ON_QDRANT_CLIENT_TIMEOUT: NonNegativeInt = Field(
description="Tidb on Qdrant client timeout in seconds",
default=20,
)
TIDB_ON_QDRANT_GRPC_ENABLED: bool = Field(
description="whether to enable grpc support for Tidb on Qdrant connection",
default=False,
)
TIDB_ON_QDRANT_GRPC_PORT: PositiveInt = Field(
description="Tidb on Qdrant grpc port",
default=6334,
)
TIDB_PUBLIC_KEY: Optional[str] = Field(
description="Tidb account public key",
default=None,
)
TIDB_PRIVATE_KEY: Optional[str] = Field(
description="Tidb account private key",
default=None,
)
TIDB_API_URL: Optional[str] = Field(
description="Tidb API url",
default=None,
)
TIDB_IAM_API_URL: Optional[str] = Field(
description="Tidb IAM API url",
default=None,
)
TIDB_REGION: Optional[str] = Field(
description="Tidb serverless region",
default="regions/aws-us-east-1",
)
TIDB_PROJECT_ID: Optional[str] = Field(
description="Tidb project id",
default=None,
)
TIDB_SPEND_LIMIT: Optional[int] = Field(
description="Tidb spend limit",
default=100,
)

View File

@@ -1,20 +0,0 @@
from typing import Optional
from pydantic import Field
from pydantic_settings import BaseSettings
class UpstashConfig(BaseSettings):
"""
Configuration settings for Upstash vector database
"""
UPSTASH_VECTOR_URL: Optional[str] = Field(
description="URL of the upstash server (e.g., 'https://vector.upstash.io')",
default=None,
)
UPSTASH_VECTOR_TOKEN: Optional[str] = Field(
description="Token for authenticating with the upstash server",
default=None,
)

View File

@@ -11,39 +11,27 @@ class VikingDBConfig(BaseModel):
"""
VIKINGDB_ACCESS_KEY: Optional[str] = Field(
description="The Access Key provided by Volcengine VikingDB for API authentication."
"Refer to the following documentation for details on obtaining credentials:"
"https://www.volcengine.com/docs/6291/65568",
default=None,
default=None, description="The Access Key provided by Volcengine VikingDB for API authentication."
)
VIKINGDB_SECRET_KEY: Optional[str] = Field(
description="The Secret Key provided by Volcengine VikingDB for API authentication.",
default=None,
default=None, description="The Secret Key provided by Volcengine VikingDB for API authentication."
)
VIKINGDB_REGION: str = Field(
description="The region of the Volcengine VikingDB service.(e.g., 'cn-shanghai', 'cn-beijing').",
VIKINGDB_REGION: Optional[str] = Field(
default="cn-shanghai",
description="The region of the Volcengine VikingDB service.(e.g., 'cn-shanghai', 'cn-beijing').",
)
VIKINGDB_HOST: str = Field(
VIKINGDB_HOST: Optional[str] = Field(
default="api-vikingdb.mlp.cn-shanghai.volces.com",
description="The host of the Volcengine VikingDB service.(e.g., 'api-vikingdb.volces.com', \
'api-vikingdb.mlp.cn-shanghai.volces.com')",
default="api-vikingdb.mlp.cn-shanghai.volces.com",
)
VIKINGDB_SCHEME: str = Field(
description="The scheme of the Volcengine VikingDB service.(e.g., 'http', 'https').",
VIKINGDB_SCHEME: Optional[str] = Field(
default="http",
description="The scheme of the Volcengine VikingDB service.(e.g., 'http', 'https').",
)
VIKINGDB_CONNECTION_TIMEOUT: int = Field(
description="The connection timeout of the Volcengine VikingDB service.",
default=30,
VIKINGDB_CONNECTION_TIMEOUT: Optional[int] = Field(
default=30, description="The connection timeout of the Volcengine VikingDB service."
)
VIKINGDB_SOCKET_TIMEOUT: int = Field(
description="The socket timeout of the Volcengine VikingDB service.",
default=30,
VIKINGDB_SOCKET_TIMEOUT: Optional[int] = Field(
default=30, description="The socket timeout of the Volcengine VikingDB service."
)

View File

@@ -9,7 +9,7 @@ class PackagingInfo(BaseSettings):
CURRENT_VERSION: str = Field(
description="Dify version",
default="0.10.2",
default="0.10.0-beta3",
)
COMMIT_SHA: str = Field(

View File

@@ -12,13 +12,10 @@ VIDEO_EXTENSIONS.extend([ext.upper() for ext in VIDEO_EXTENSIONS])
AUDIO_EXTENSIONS = ["mp3", "m4a", "wav", "webm", "amr"]
AUDIO_EXTENSIONS.extend([ext.upper() for ext in AUDIO_EXTENSIONS])
DOCUMENT_EXTENSIONS = ["txt", "markdown", "md", "pdf", "html", "htm", "xlsx", "xls", "docx", "csv"]
DOCUMENT_EXTENSIONS.extend([ext.upper() for ext in DOCUMENT_EXTENSIONS])
if dify_config.ETL_TYPE == "Unstructured":
DOCUMENT_EXTENSIONS = ["txt", "markdown", "md", "pdf", "html", "htm", "xlsx", "xls"]
DOCUMENT_EXTENSIONS.extend(("docx", "csv", "eml", "msg", "pptx", "xml", "epub"))
if dify_config.UNSTRUCTURED_API_URL:
DOCUMENT_EXTENSIONS.append("ppt")
DOCUMENT_EXTENSIONS.extend([ext.upper() for ext in DOCUMENT_EXTENSIONS])
else:
DOCUMENT_EXTENSIONS = ["txt", "markdown", "md", "pdf", "html", "htm", "xlsx", "xls", "docx", "csv"]
DOCUMENT_EXTENSIONS.extend(("docx", "csv", "eml", "msg", "pptx", "ppt", "xml", "epub"))
DOCUMENT_EXTENSIONS.extend([ext.upper() for ext in DOCUMENT_EXTENSIONS])
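A small sketch of how the mixed lower/upper-case extension list above is typically used for an upload check; the helper below is hypothetical, only the list construction mirrors the snippet.

```python
# Sketch: checking an uploaded filename against the extension list built as above.
# is_supported_document is a hypothetical helper for illustration.
DOCUMENT_EXTENSIONS = ["txt", "markdown", "md", "pdf", "html", "htm", "xlsx", "xls", "docx", "csv"]
DOCUMENT_EXTENSIONS.extend([ext.upper() for ext in DOCUMENT_EXTENSIONS])

def is_supported_document(filename: str) -> bool:
    # The list contains both cases, so the raw extension can be matched directly.
    return filename.rsplit(".", 1)[-1] in DOCUMENT_EXTENSIONS

print(is_supported_document("report.PDF"))  # True
print(is_supported_document("archive.zip"))  # False
```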

View File

@@ -1,6 +0,0 @@
from werkzeug.exceptions import HTTPException
class FilenameNotExistsError(HTTPException):
code = 400
description = "The specified filename does not exist."

View File

@@ -1,58 +0,0 @@
import mimetypes
import os
import re
import urllib.parse
from uuid import uuid4
import httpx
from pydantic import BaseModel
class FileInfo(BaseModel):
filename: str
extension: str
mimetype: str
size: int
def guess_file_info_from_response(response: httpx.Response):
url = str(response.url)
# Try to extract filename from URL
parsed_url = urllib.parse.urlparse(url)
url_path = parsed_url.path
filename = os.path.basename(url_path)
# If filename couldn't be extracted, use Content-Disposition header
if not filename:
content_disposition = response.headers.get("Content-Disposition")
if content_disposition:
filename_match = re.search(r'filename="?(.+)"?', content_disposition)
if filename_match:
filename = filename_match.group(1)
# If still no filename, generate a unique one
if not filename:
unique_name = str(uuid4())
filename = f"{unique_name}"
# Guess MIME type from filename first, then URL
mimetype, _ = mimetypes.guess_type(filename)
if mimetype is None:
mimetype, _ = mimetypes.guess_type(url)
if mimetype is None:
# If guessing fails, use Content-Type from response headers
mimetype = response.headers.get("Content-Type", "application/octet-stream")
extension = os.path.splitext(filename)[1]
# Ensure filename has an extension
if not extension:
extension = mimetypes.guess_extension(mimetype) or ".bin"
filename = f"{filename}{extension}"
return FileInfo(
filename=filename,
extension=extension,
mimetype=mimetype,
size=int(response.headers.get("Content-Length", -1)),
)
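A usage sketch for `guess_file_info_from_response` defined above, built on a canned `httpx.Response` so it runs without network access; the URL is a placeholder.

```python
# Usage sketch for guess_file_info_from_response (defined above).
# The request/response pair is canned; the URL is a placeholder.
import httpx

request = httpx.Request("GET", "https://example.com/files/report.pdf")
response = httpx.Response(200, request=request, headers={"Content-Type": "application/pdf"})

info = guess_file_info_from_response(response)
print(info.filename, info.extension, info.mimetype)  # report.pdf .pdf application/pdf
print(info.size)  # taken from the Content-Length header, or -1 if absent
```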

View File

@@ -2,21 +2,9 @@ from flask import Blueprint
from libs.external_api import ExternalApi
from .files import FileApi, FilePreviewApi, FileSupportTypeApi
from .remote_files import RemoteFileInfoApi, RemoteFileUploadApi
bp = Blueprint("console", __name__, url_prefix="/console/api")
api = ExternalApi(bp)
# File
api.add_resource(FileApi, "/files/upload")
api.add_resource(FilePreviewApi, "/files/<uuid:file_id>/preview")
api.add_resource(FileSupportTypeApi, "/files/support-type")
# Remote files
api.add_resource(RemoteFileInfoApi, "/remote-files/<path:url>")
api.add_resource(RemoteFileUploadApi, "/remote-files/upload")
# Import other controllers
from . import admin, apikey, extension, feature, ping, setup, version
@@ -55,6 +43,7 @@ from .datasets import (
datasets_document,
datasets_segments,
external,
file,
hit_testing,
website,
)

View File

@@ -1,10 +1,10 @@
import os
from functools import wraps
from flask import request
from flask_restful import Resource, reqparse
from werkzeug.exceptions import NotFound, Unauthorized
from configs import dify_config
from constants.languages import supported_language
from controllers.console import api
from controllers.console.wraps import only_edition_cloud
@@ -15,7 +15,7 @@ from models.model import App, InstalledApp, RecommendedApp
def admin_required(view):
@wraps(view)
def decorated(*args, **kwargs):
if not dify_config.ADMIN_API_KEY:
if not os.getenv("ADMIN_API_KEY"):
raise Unauthorized("API key is invalid.")
auth_header = request.headers.get("Authorization")
@@ -31,7 +31,7 @@ def admin_required(view):
if auth_scheme != "bearer":
raise Unauthorized("Invalid Authorization header format. Expected 'Bearer <api-key>' format.")
if dify_config.ADMIN_API_KEY != auth_token:
if os.getenv("ADMIN_API_KEY") != auth_token:
raise Unauthorized("API key is invalid.")
return view(*args, **kwargs)
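A standalone sketch of the Bearer-token parsing pattern the `admin_required` decorator above applies; the helper is hypothetical, and the real code raises werkzeug's `Unauthorized` instead of `ValueError`.

```python
# Sketch of the "Bearer <api-key>" header check used by admin_required above.
# parse_bearer_token is a hypothetical helper; real code raises Unauthorized.
def parse_bearer_token(auth_header: str) -> str:
    if " " not in auth_header:
        raise ValueError("Invalid Authorization header format. Expected 'Bearer <api-key>' format.")
    scheme, token = auth_header.split(None, 1)
    if scheme.lower() != "bearer":
        raise ValueError("Invalid Authorization header format. Expected 'Bearer <api-key>' format.")
    return token

print(parse_bearer_token("Bearer my-admin-key"))  # my-admin-key
```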

View File

@@ -10,7 +10,8 @@ from models.dataset import Dataset
from models.model import ApiToken, App
from . import api
from .wraps import account_initialization_required, setup_required
from .setup import setup_required
from .wraps import account_initialization_required
api_key_fields = {
"id": fields.String,

View File

@@ -1,7 +1,8 @@
from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from libs.login import login_required
from services.advanced_prompt_template_service import AdvancedPromptTemplateService

View File

@@ -2,7 +2,8 @@ from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from libs.helper import uuid_value
from libs.login import login_required
from models.model import AppMode

View File

@@ -6,11 +6,8 @@ from werkzeug.exceptions import Forbidden
from controllers.console import api
from controllers.console.app.error import NoFileUploadedError
from controllers.console.datasets.error import TooManyFilesError
from controllers.console.wraps import (
account_initialization_required,
cloud_edition_billing_resource_check,
setup_required,
)
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required, cloud_edition_billing_resource_check
from extensions.ext_redis import redis_client
from fields.annotation_fields import (
annotation_fields,

View File

@@ -6,11 +6,8 @@ from werkzeug.exceptions import BadRequest, Forbidden, abort
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import (
account_initialization_required,
cloud_edition_billing_resource_check,
setup_required,
)
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required, cloud_edition_billing_resource_check
from core.ops.ops_trace_manager import OpsTraceManager
from fields.app_fields import (
app_detail_fields,

View File

@@ -18,7 +18,8 @@ from controllers.console.app.error import (
UnsupportedAudioTypeError,
)
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from core.errors.error import ModelCurrentlyNotSupportError, ProviderTokenNotInitError, QuotaExceededError
from core.model_runtime.errors.invoke import InvokeError
from libs.login import login_required

View File

@@ -15,7 +15,8 @@ from controllers.console.app.error import (
ProviderQuotaExceededError,
)
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from controllers.web.error import InvokeRateLimitError as InvokeRateLimitHttpError
from core.app.apps.base_app_queue_manager import AppQueueManager
from core.app.entities.app_invoke_entities import InvokeFrom

View File

@@ -10,7 +10,8 @@ from werkzeug.exceptions import Forbidden, NotFound
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from core.app.entities.app_invoke_entities import InvokeFrom
from extensions.ext_database import db
from fields.conversation_fields import (

View File

@@ -4,7 +4,8 @@ from sqlalchemy.orm import Session
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from extensions.ext_database import db
from fields.conversation_variable_fields import paginated_conversation_variable_fields
from libs.login import login_required

View File

@@ -10,7 +10,8 @@ from controllers.console.app.error import (
ProviderNotInitializeError,
ProviderQuotaExceededError,
)
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from core.errors.error import ModelCurrentlyNotSupportError, ProviderTokenNotInitError, QuotaExceededError
from core.llm_generator.llm_generator import LLMGenerator
from core.model_runtime.errors.invoke import InvokeError
@@ -51,39 +52,4 @@ class RuleGenerateApi(Resource):
return rules
class RuleCodeGenerateApi(Resource):
@setup_required
@login_required
@account_initialization_required
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("instruction", type=str, required=True, nullable=False, location="json")
parser.add_argument("model_config", type=dict, required=True, nullable=False, location="json")
parser.add_argument("no_variable", type=bool, required=True, default=False, location="json")
parser.add_argument("code_language", type=str, required=False, default="javascript", location="json")
args = parser.parse_args()
account = current_user
CODE_GENERATION_MAX_TOKENS = int(os.getenv("CODE_GENERATION_MAX_TOKENS", "1024"))
try:
code_result = LLMGenerator.generate_code(
tenant_id=account.current_tenant_id,
instruction=args["instruction"],
model_config=args["model_config"],
code_language=args["code_language"],
max_tokens=CODE_GENERATION_MAX_TOKENS,
)
except ProviderTokenNotInitError as ex:
raise ProviderNotInitializeError(ex.description)
except QuotaExceededError:
raise ProviderQuotaExceededError()
except ModelCurrentlyNotSupportError:
raise ProviderModelCurrentlyNotSupportError()
except InvokeError as e:
raise CompletionRequestError(e.description)
return code_result
api.add_resource(RuleGenerateApi, "/rule-generate")
api.add_resource(RuleCodeGenerateApi, "/rule-code-generate")

View File

@@ -14,11 +14,8 @@ from controllers.console.app.error import (
)
from controllers.console.app.wraps import get_app_model
from controllers.console.explore.error import AppSuggestedQuestionsAfterAnswerDisabledError
from controllers.console.wraps import (
account_initialization_required,
cloud_edition_billing_resource_check,
setup_required,
)
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required, cloud_edition_billing_resource_check
from core.app.entities.app_invoke_entities import InvokeFrom
from core.errors.error import ModelCurrentlyNotSupportError, ProviderTokenNotInitError, QuotaExceededError
from core.model_runtime.errors.invoke import InvokeError
@@ -108,8 +105,6 @@ class ChatMessageListApi(Resource):
if rest_count > 0:
has_more = True
history_messages = list(reversed(history_messages))
return InfiniteScrollPagination(data=history_messages, limit=args["limit"], has_more=has_more)

View File

@@ -6,7 +6,8 @@ from flask_restful import Resource
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from core.agent.entities import AgentToolEntity
from core.tools.tool_manager import ToolManager
from core.tools.utils.configuration import ToolParameterConfigurationManager

View File

@@ -2,7 +2,8 @@ from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.app.error import TracingConfigCheckError, TracingConfigIsExist, TracingConfigNotExist
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from libs.login import login_required
from services.ops_service import OpsService

View File

@@ -7,7 +7,8 @@ from werkzeug.exceptions import Forbidden, NotFound
from constants.languages import supported_language
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from extensions.ext_database import db
from fields.app_fields import app_site_fields
from libs.login import login_required

View File

@@ -8,7 +8,8 @@ from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from extensions.ext_database import db
from libs.helper import DatetimeString
from libs.login import login_required

View File

@@ -9,7 +9,8 @@ import services
from controllers.console import api
from controllers.console.app.error import ConversationCompletedError, DraftWorkflowNotExist, DraftWorkflowNotSync
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from core.app.apps.base_app_queue_manager import AppQueueManager
from core.app.entities.app_invoke_entities import InvokeFrom
from factories import variable_factory

View File

@@ -3,7 +3,8 @@ from flask_restful.inputs import int_range
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from fields.workflow_app_log_fields import workflow_app_log_pagination_fields
from libs.login import login_required
from models import App

View File

@@ -3,7 +3,8 @@ from flask_restful.inputs import int_range
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from fields.workflow_run_fields import (
advanced_chat_workflow_run_pagination_fields,
workflow_run_detail_fields,

View File

@@ -8,11 +8,12 @@ from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from enums import WorkflowRunTriggeredFrom
from extensions.ext_database import db
from libs.helper import DatetimeString
from libs.login import login_required
from models.enums import WorkflowRunTriggeredFrom
from models.model import AppMode

View File

@@ -1,15 +1,17 @@
import base64
import datetime
import secrets
from flask import request
from flask_restful import Resource, reqparse
from constants.languages import supported_language
from controllers.console import api
from controllers.console.error import AlreadyActivateError
from extensions.ext_database import db
from libs.helper import StrLen, email, extract_remote_ip, timezone
from models.account import AccountStatus, Tenant
from services.account_service import AccountService, RegisterService
from libs.helper import StrLen, email, timezone
from libs.password import hash_password, valid_password
from models.account import AccountStatus
from services.account_service import RegisterService
class ActivateCheckApi(Resource):
@@ -25,18 +27,8 @@ class ActivateCheckApi(Resource):
token = args["token"]
invitation = RegisterService.get_invitation_if_token_valid(workspaceId, reg_email, token)
if invitation:
data = invitation.get("data", {})
tenant: Tenant = invitation.get("tenant", None)
workspace_name = tenant.name if tenant else None
workspace_id = tenant.id if tenant else None
invitee_email = data.get("email") if data else None
return {
"is_valid": invitation is not None,
"data": {"workspace_name": workspace_name, "workspace_id": workspace_id, "email": invitee_email},
}
else:
return {"is_valid": False}
return {"is_valid": invitation is not None, "workspace_name": invitation["tenant"].name if invitation else None}
class ActivateApi(Resource):
@@ -46,6 +38,7 @@ class ActivateApi(Resource):
parser.add_argument("email", type=email, required=False, nullable=True, location="json")
parser.add_argument("token", type=str, required=True, nullable=False, location="json")
parser.add_argument("name", type=StrLen(30), required=True, nullable=False, location="json")
parser.add_argument("password", type=valid_password, required=True, nullable=False, location="json")
parser.add_argument(
"interface_language", type=supported_language, required=True, nullable=False, location="json"
)
@@ -61,6 +54,15 @@ class ActivateApi(Resource):
account = invitation["account"]
account.name = args["name"]
# generate password salt
salt = secrets.token_bytes(16)
base64_salt = base64.b64encode(salt).decode()
# encrypt password with salt
password_hashed = hash_password(args["password"], salt)
base64_password_hashed = base64.b64encode(password_hashed).decode()
account.password = base64_password_hashed
account.password_salt = base64_salt
account.interface_language = args["interface_language"]
account.timezone = args["timezone"]
account.interface_theme = "light"
@@ -68,9 +70,7 @@ class ActivateApi(Resource):
account.initialized_at = datetime.datetime.now(datetime.timezone.utc).replace(tzinfo=None)
db.session.commit()
token_pair = AccountService.login(account, ip_address=extract_remote_ip(request))
return {"result": "success", "data": token_pair.model_dump()}
return {"result": "success"}
api.add_resource(ActivateCheckApi, "/activate/check")
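The activation flow above salts, hashes, and base64-encodes the password before storing it. A self-contained sketch of that sequence, with `hashlib.pbkdf2_hmac` standing in for Dify's internal `hash_password` helper (an assumption, not the real implementation).

```python
# Sketch of the salt + hash + base64 sequence used in ActivateApi above.
# hashlib.pbkdf2_hmac is a stand-in for the project's hash_password helper (assumption).
import base64
import hashlib
import secrets

def hash_with_salt(password: str) -> tuple[str, str]:
    salt = secrets.token_bytes(16)                      # generate password salt
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return base64.b64encode(digest).decode(), base64.b64encode(salt).decode()

password_b64, salt_b64 = hash_with_salt("correct horse battery staple")
print(password_b64, salt_b64)
```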

View File

@@ -7,7 +7,8 @@ from controllers.console.auth.error import ApiKeyAuthFailedError
from libs.login import login_required
from services.auth.api_key_auth_service import ApiKeyAuthService
from ..wraps import account_initialization_required, setup_required
from ..setup import setup_required
from ..wraps import account_initialization_required
class ApiKeyAuthDataSource(Resource):

View File

@@ -11,7 +11,8 @@ from controllers.console import api
from libs.login import login_required
from libs.oauth_data_source import NotionOAuth
from ..wraps import account_initialization_required, setup_required
from ..setup import setup_required
from ..wraps import account_initialization_required
def get_oauth_providers():

View File

@@ -27,29 +27,5 @@ class InvalidTokenError(BaseHTTPException):
class PasswordResetRateLimitExceededError(BaseHTTPException):
error_code = "password_reset_rate_limit_exceeded"
description = "Too many password reset emails have been sent. Please try again in 1 minute."
code = 429
class EmailCodeError(BaseHTTPException):
error_code = "email_code_error"
description = "Email code is invalid or expired."
code = 400
class EmailOrPasswordMismatchError(BaseHTTPException):
error_code = "email_or_password_mismatch"
description = "The email or password is mismatched."
code = 400
class EmailPasswordLoginLimitError(BaseHTTPException):
error_code = "email_code_login_limit"
description = "Too many incorrect password attempts. Please try again later."
code = 429
class EmailCodeLoginRateLimitExceededError(BaseHTTPException):
error_code = "email_code_login_rate_limit_exceeded"
description = "Too many login emails have been sent. Please try again in 5 minutes."
description = "Password reset rate limit exceeded. Try again later."
code = 429

View File

@@ -1,82 +1,65 @@
import base64
import logging
import secrets
from flask import request
from flask_restful import Resource, reqparse
from constants.languages import languages
from controllers.console import api
from controllers.console.auth.error import (
EmailCodeError,
InvalidEmailError,
InvalidTokenError,
PasswordMismatchError,
PasswordResetRateLimitExceededError,
)
from controllers.console.error import EmailSendIpLimitError, NotAllowedRegister
from controllers.console.wraps import setup_required
from events.tenant_event import tenant_was_created
from controllers.console.setup import setup_required
from extensions.ext_database import db
from libs.helper import email, extract_remote_ip
from libs.helper import email as email_validate
from libs.password import hash_password, valid_password
from models.account import Account
from services.account_service import AccountService, TenantService
from services.errors.workspace import WorkSpaceNotAllowedCreateError
from services.feature_service import FeatureService
from models import Account
from services.account_service import AccountService
from services.errors.account import RateLimitExceededError
class ForgotPasswordSendEmailApi(Resource):
@setup_required
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("email", type=email, required=True, location="json")
parser.add_argument("language", type=str, required=False, location="json")
parser.add_argument("email", type=str, required=True, location="json")
args = parser.parse_args()
ip_address = extract_remote_ip(request)
if AccountService.is_email_send_ip_limit(ip_address):
raise EmailSendIpLimitError()
email = args["email"]
if args["language"] is not None and args["language"] == "zh-Hans":
language = "zh-Hans"
if not email_validate(email):
raise InvalidEmailError()
account = Account.query.filter_by(email=email).first()
if account:
try:
AccountService.send_reset_password_email(account=account)
except RateLimitExceededError:
logging.warning(f"Rate limit exceeded for email: {account.email}")
raise PasswordResetRateLimitExceededError()
else:
language = "en-US"
# Return success to avoid revealing email registration status
logging.warning(f"Attempt to reset password for unregistered email: {email}")
account = Account.query.filter_by(email=args["email"]).first()
token = None
if account is None:
if FeatureService.get_system_features().is_allow_register:
token = AccountService.send_reset_password_email(email=args["email"], language=language)
return {"result": "fail", "data": token, "code": "account_not_found"}
else:
raise NotAllowedRegister()
else:
token = AccountService.send_reset_password_email(account=account, email=args["email"], language=language)
return {"result": "success", "data": token}
return {"result": "success"}
class ForgotPasswordCheckApi(Resource):
@setup_required
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("email", type=str, required=True, location="json")
parser.add_argument("code", type=str, required=True, location="json")
parser.add_argument("token", type=str, required=True, nullable=False, location="json")
args = parser.parse_args()
token = args["token"]
user_email = args["email"]
reset_data = AccountService.get_reset_password_data(token)
token_data = AccountService.get_reset_password_data(args["token"])
if token_data is None:
raise InvalidTokenError()
if user_email != token_data.get("email"):
raise InvalidEmailError()
if args["code"] != token_data.get("code"):
raise EmailCodeError()
return {"is_valid": True, "email": token_data.get("email")}
if reset_data is None:
return {"is_valid": False, "email": None}
return {"is_valid": True, "email": reset_data.get("email")}
class ForgotPasswordResetApi(Resource):
@@ -109,26 +92,9 @@ class ForgotPasswordResetApi(Resource):
base64_password_hashed = base64.b64encode(password_hashed).decode()
account = Account.query.filter_by(email=reset_data.get("email")).first()
if account:
account.password = base64_password_hashed
account.password_salt = base64_salt
db.session.commit()
tenant = TenantService.get_join_tenants(account)
if not tenant and not FeatureService.get_system_features().is_allow_create_workspace:
tenant = TenantService.create_tenant(f"{account.name}'s Workspace")
TenantService.create_tenant_member(tenant, account, role="owner")
account.current_tenant = tenant
tenant_was_created.send(tenant)
else:
try:
account = AccountService.create_account_and_tenant(
email=reset_data.get("email"),
name=reset_data.get("email"),
password=password_confirm,
interface_language=languages[0],
)
except WorkSpaceNotAllowedCreateError:
pass
account.password = base64_password_hashed
account.password_salt = base64_salt
db.session.commit()
return {"result": "success"}

View File

@@ -5,29 +5,12 @@ from flask import request
from flask_restful import Resource, reqparse
import services
from constants.languages import languages
from controllers.console import api
from controllers.console.auth.error import (
EmailCodeError,
EmailOrPasswordMismatchError,
EmailPasswordLoginLimitError,
InvalidEmailError,
InvalidTokenError,
)
from controllers.console.error import (
AccountBannedError,
EmailSendIpLimitError,
NotAllowedCreateWorkspace,
NotAllowedRegister,
)
from controllers.console.wraps import setup_required
from events.tenant_event import tenant_was_created
from controllers.console.setup import setup_required
from libs.helper import email, extract_remote_ip
from libs.password import valid_password
from models.account import Account
from services.account_service import AccountService, RegisterService, TenantService
from services.errors.workspace import WorkSpaceNotAllowedCreateError
from services.feature_service import FeatureService
from models import Account
from services.account_service import AccountService, TenantService
class LoginApi(Resource):
@@ -40,43 +23,15 @@ class LoginApi(Resource):
parser.add_argument("email", type=email, required=True, location="json")
parser.add_argument("password", type=valid_password, required=True, location="json")
parser.add_argument("remember_me", type=bool, required=False, default=False, location="json")
parser.add_argument("invite_token", type=str, required=False, default=None, location="json")
parser.add_argument("language", type=str, required=False, default="en-US", location="json")
args = parser.parse_args()
is_login_error_rate_limit = AccountService.is_login_error_rate_limit(args["email"])
if is_login_error_rate_limit:
raise EmailPasswordLoginLimitError()
invitation = args["invite_token"]
if invitation:
invitation = RegisterService.get_invitation_if_token_valid(None, args["email"], invitation)
if args["language"] is not None and args["language"] == "zh-Hans":
language = "zh-Hans"
else:
language = "en-US"
# todo: Verify the recaptcha
try:
if invitation:
data = invitation.get("data", {})
invitee_email = data.get("email") if data else None
if invitee_email != args["email"]:
raise InvalidEmailError()
account = AccountService.authenticate(args["email"], args["password"], args["invite_token"])
else:
account = AccountService.authenticate(args["email"], args["password"])
except services.errors.account.AccountLoginError:
raise AccountBannedError()
except services.errors.account.AccountPasswordError:
AccountService.add_login_error_rate_limit(args["email"])
raise EmailOrPasswordMismatchError()
except services.errors.account.AccountNotFoundError:
if FeatureService.get_system_features().is_allow_register:
token = AccountService.send_reset_password_email(email=args["email"], language=language)
return {"result": "fail", "data": token, "code": "account_not_found"}
else:
raise NotAllowedRegister()
account = AccountService.authenticate(args["email"], args["password"])
except services.errors.account.AccountLoginError as e:
return {"code": "unauthorized", "message": str(e)}, 401
# SELF_HOSTED only have one workspace
tenants = TenantService.get_join_tenants(account)
if len(tenants) == 0:
@@ -86,7 +41,7 @@ class LoginApi(Resource):
}
token_pair = AccountService.login(account=account, ip_address=extract_remote_ip(request))
AccountService.reset_login_error_rate_limit(args["email"])
return {"result": "success", "data": token_pair.model_dump()}
@@ -94,111 +49,60 @@ class LogoutApi(Resource):
@setup_required
def get(self):
account = cast(Account, flask_login.current_user)
if isinstance(account, flask_login.AnonymousUserMixin):
return {"result": "success"}
AccountService.logout(account=account)
flask_login.logout_user()
return {"result": "success"}
class ResetPasswordSendEmailApi(Resource):
class ResetPasswordApi(Resource):
@setup_required
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("email", type=email, required=True, location="json")
parser.add_argument("language", type=str, required=False, location="json")
args = parser.parse_args()
def get(self):
# parser = reqparse.RequestParser()
# parser.add_argument('email', type=email, required=True, location='json')
# args = parser.parse_args()
if args["language"] is not None and args["language"] == "zh-Hans":
language = "zh-Hans"
else:
language = "en-US"
# import mailchimp_transactional as MailchimpTransactional
# from mailchimp_transactional.api_client import ApiClientError
account = AccountService.get_user_through_email(args["email"])
if account is None:
if FeatureService.get_system_features().is_allow_register:
token = AccountService.send_reset_password_email(email=args["email"], language=language)
else:
raise NotAllowedRegister()
else:
token = AccountService.send_reset_password_email(account=account, language=language)
# account = {'email': args['email']}
# account = AccountService.get_by_email(args['email'])
# if account is None:
# raise ValueError('Email not found')
# new_password = AccountService.generate_password()
# AccountService.update_password(account, new_password)
return {"result": "success", "data": token}
# todo: Send email
# MAILCHIMP_API_KEY = dify_config.MAILCHIMP_TRANSACTIONAL_API_KEY
# mailchimp = MailchimpTransactional(MAILCHIMP_API_KEY)
# message = {
# 'from_email': 'noreply@example.com',
# 'to': [{'email': account['email']}],
# 'subject': 'Reset your Dify password',
# 'html': """
# <p>Dear User,</p>
# <p>The Dify team has generated a new password for you, details as follows:</p>
# <p><strong>{new_password}</strong></p>
# <p>Please change your password to log in as soon as possible.</p>
# <p>Regards,</p>
# <p>The Dify Team</p>
# """
# }
class EmailCodeLoginSendEmailApi(Resource):
@setup_required
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("email", type=email, required=True, location="json")
parser.add_argument("language", type=str, required=False, location="json")
args = parser.parse_args()
# response = mailchimp.messages.send({
# 'message': message,
# # required for transactional email
# ' settings': {
# 'sandbox_mode': dify_config.MAILCHIMP_SANDBOX_MODE,
# },
# })
ip_address = extract_remote_ip(request)
if AccountService.is_email_send_ip_limit(ip_address):
raise EmailSendIpLimitError()
# Check if MSG was sent
# if response.status_code != 200:
# # handle error
# pass
if args["language"] is not None and args["language"] == "zh-Hans":
language = "zh-Hans"
else:
language = "en-US"
account = AccountService.get_user_through_email(args["email"])
if account is None:
if FeatureService.get_system_features().is_allow_register:
token = AccountService.send_email_code_login_email(email=args["email"], language=language)
else:
raise NotAllowedRegister()
else:
token = AccountService.send_email_code_login_email(account=account, language=language)
return {"result": "success", "data": token}
class EmailCodeLoginApi(Resource):
@setup_required
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("email", type=str, required=True, location="json")
parser.add_argument("code", type=str, required=True, location="json")
parser.add_argument("token", type=str, required=True, location="json")
args = parser.parse_args()
user_email = args["email"]
token_data = AccountService.get_email_code_login_data(args["token"])
if token_data is None:
raise InvalidTokenError()
if token_data["email"] != args["email"]:
raise InvalidEmailError()
if token_data["code"] != args["code"]:
raise EmailCodeError()
AccountService.revoke_email_code_login_token(args["token"])
account = AccountService.get_user_through_email(user_email)
if account:
tenant = TenantService.get_join_tenants(account)
if not tenant:
if not FeatureService.get_system_features().is_allow_create_workspace:
raise NotAllowedCreateWorkspace()
else:
tenant = TenantService.create_tenant(f"{account.name}'s Workspace")
TenantService.create_tenant_member(tenant, account, role="owner")
account.current_tenant = tenant
tenant_was_created.send(tenant)
if account is None:
try:
account = AccountService.create_account_and_tenant(
email=user_email, name=user_email, interface_language=languages[0]
)
except WorkSpaceNotAllowedCreateError:
return NotAllowedCreateWorkspace()
token_pair = AccountService.login(account, ip_address=extract_remote_ip(request))
AccountService.reset_login_error_rate_limit(args["email"])
return {"result": "success", "data": token_pair.model_dump()}
return {"result": "success"}
class RefreshTokenApi(Resource):
@@ -216,7 +120,4 @@ class RefreshTokenApi(Resource):
api.add_resource(LoginApi, "/login")
api.add_resource(LogoutApi, "/logout")
api.add_resource(EmailCodeLoginSendEmailApi, "/email-code-login")
api.add_resource(EmailCodeLoginApi, "/email-code-login/validity")
api.add_resource(ResetPasswordSendEmailApi, "/reset-password")
api.add_resource(RefreshTokenApi, "/refresh-token")

View File

@@ -5,20 +5,15 @@ from typing import Optional
import requests
from flask import current_app, redirect, request
from flask_restful import Resource
from werkzeug.exceptions import Unauthorized
from configs import dify_config
from constants.languages import languages
from events.tenant_event import tenant_was_created
from extensions.ext_database import db
from libs.helper import extract_remote_ip
from libs.oauth import GitHubOAuth, GoogleOAuth, OAuthUserInfo
from models import Account
from models.account import AccountStatus
from services.account_service import AccountService, RegisterService, TenantService
from services.errors.account import AccountNotFoundError
from services.errors.workspace import WorkSpaceNotAllowedCreateError, WorkSpaceNotFoundError
from services.feature_service import FeatureService
from .. import api
@@ -48,7 +43,6 @@ def get_oauth_providers():
class OAuthLogin(Resource):
def get(self, provider: str):
invite_token = request.args.get("invite_token") or None
OAUTH_PROVIDERS = get_oauth_providers()
with current_app.app_context():
oauth_provider = OAUTH_PROVIDERS.get(provider)
@@ -56,7 +50,7 @@ class OAuthLogin(Resource):
if not oauth_provider:
return {"error": "Invalid provider"}, 400
auth_url = oauth_provider.get_authorization_url(invite_token=invite_token)
auth_url = oauth_provider.get_authorization_url()
return redirect(auth_url)
@@ -69,11 +63,6 @@ class OAuthCallback(Resource):
return {"error": "Invalid provider"}, 400
code = request.args.get("code")
state = request.args.get("state")
invite_token = None
if state:
invite_token = state
try:
token = oauth_provider.get_access_token(code)
user_info = oauth_provider.get_user_info(token)
@@ -81,43 +70,17 @@ class OAuthCallback(Resource):
logging.exception(f"An error occurred during the OAuth process with {provider}: {e.response.text}")
return {"error": "OAuth process failed"}, 400
if invite_token and RegisterService.is_valid_invite_token(invite_token):
invitation = RegisterService._get_invitation_by_token(token=invite_token)
if invitation:
invitation_email = invitation.get("email", None)
if invitation_email != user_info.email:
return redirect(f"{dify_config.CONSOLE_WEB_URL}/signin?message=Invalid invitation token.")
return redirect(f"{dify_config.CONSOLE_WEB_URL}/signin/invite-settings?invite_token={invite_token}")
try:
account = _generate_account(provider, user_info)
except AccountNotFoundError:
return redirect(f"{dify_config.CONSOLE_WEB_URL}/signin?message=Account not found.")
except (WorkSpaceNotFoundError, WorkSpaceNotAllowedCreateError):
return redirect(
f"{dify_config.CONSOLE_WEB_URL}/signin"
"?message=Workspace not found, please contact system admin to invite you to join in a workspace."
)
account = _generate_account(provider, user_info)
# Check account status
if account.status == AccountStatus.BANNED.value:
return redirect(f"{dify_config.CONSOLE_WEB_URL}/signin?message=Account is banned.")
if account.status in {AccountStatus.BANNED.value, AccountStatus.CLOSED.value}:
return {"error": "Account is banned or closed."}, 403
if account.status == AccountStatus.PENDING.value:
account.status = AccountStatus.ACTIVE.value
account.initialized_at = datetime.now(timezone.utc).replace(tzinfo=None)
db.session.commit()
try:
TenantService.create_owner_tenant_if_not_exist(account)
except Unauthorized:
return redirect(f"{dify_config.CONSOLE_WEB_URL}/signin?message=Workspace not found.")
except WorkSpaceNotAllowedCreateError:
return redirect(
f"{dify_config.CONSOLE_WEB_URL}/signin"
"?message=Workspace not found, please contact system admin to invite you to join in a workspace."
)
TenantService.create_owner_tenant_if_not_exist(account)
token_pair = AccountService.login(
account=account,
@@ -142,20 +105,8 @@ def _generate_account(provider: str, user_info: OAuthUserInfo):
# Get account by openid or email.
account = _get_account_by_openid_or_email(provider, user_info)
if account:
tenant = TenantService.get_join_tenants(account)
if not tenant:
if not FeatureService.get_system_features().is_allow_create_workspace:
raise WorkSpaceNotAllowedCreateError()
else:
tenant = TenantService.create_tenant(f"{account.name}'s Workspace")
TenantService.create_tenant_member(tenant, account, role="owner")
account.current_tenant = tenant
tenant_was_created.send(tenant)
if not account:
if not FeatureService.get_system_features().is_allow_register:
raise AccountNotFoundError()
# Create account
account_name = user_info.name or "Dify"
account = RegisterService.register(
email=user_info.email, name=account_name, password=None, open_id=user_info.id, provider=provider

View File

@@ -2,7 +2,8 @@ from flask_login import current_user
from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.wraps import account_initialization_required, only_edition_cloud, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required, only_edition_cloud
from libs.login import login_required
from services.billing_service import BillingService

View File

@@ -7,7 +7,8 @@ from flask_restful import Resource, marshal_with, reqparse
from werkzeug.exceptions import NotFound
from controllers.console import api
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from core.indexing_runner import IndexingRunner
from core.rag.extractor.entity.extract_setting import ExtractSetting
from core.rag.extractor.notion_extractor import NotionExtractor

View File

@@ -10,7 +10,8 @@ from controllers.console import api
from controllers.console.apikey import api_key_fields, api_key_list
from controllers.console.app.error import ProviderNotInitializeError
from controllers.console.datasets.error import DatasetInUseError, DatasetNameDuplicateError, IndexingEstimateError
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from core.errors.error import LLMBadRequestError, ProviderTokenNotInitError
from core.indexing_runner import IndexingRunner
from core.model_runtime.entities.model_entities import ModelType
@@ -101,13 +102,6 @@ class DatasetListApi(Resource):
help="type is required. Name must be between 1 to 40 characters.",
type=_validate_name,
)
parser.add_argument(
"description",
type=str,
nullable=True,
required=False,
default="",
)
parser.add_argument(
"indexing_technique",
type=str,
@@ -146,7 +140,6 @@ class DatasetListApi(Resource):
dataset = DatasetService.create_empty_dataset(
tenant_id=current_user.current_tenant_id,
name=args["name"],
description=args["description"],
indexing_technique=args["indexing_technique"],
account=current_user,
permission=DatasetPermissionEnum.ONLY_ME,
@@ -456,7 +449,7 @@ class DatasetIndexingEstimateApi(Resource):
)
except LLMBadRequestError:
raise ProviderNotInitializeError(
"No Embedding Model available. Please configure a valid provider " "in the Settings -> Model Provider."
"No Embedding Model available. Please configure a valid provider in the Settings -> Model Provider."
)
except ProviderTokenNotInitError as ex:
raise ProviderNotInitializeError(ex.description)
@@ -620,15 +613,12 @@ class DatasetRetrievalSettingApi(Resource):
case (
VectorType.MILVUS
| VectorType.RELYT
| VectorType.PGVECTOR
| VectorType.TIDB_VECTOR
| VectorType.CHROMA
| VectorType.TENCENT
| VectorType.PGVECTO_RS
| VectorType.BAIDU
| VectorType.VIKINGDB
| VectorType.UPSTASH
| VectorType.OCEANBASE
):
return {"retrieval_method": [RetrievalMethod.SEMANTIC_SEARCH.value]}
case (
@@ -640,9 +630,6 @@ class DatasetRetrievalSettingApi(Resource):
| VectorType.ORACLE
| VectorType.ELASTICSEARCH
| VectorType.PGVECTOR
| VectorType.TIDB_ON_QDRANT
| VectorType.LINDORM
| VectorType.COUCHBASE
):
return {
"retrieval_method": [
@@ -670,8 +657,6 @@ class DatasetRetrievalSettingMockApi(Resource):
| VectorType.PGVECTO_RS
| VectorType.BAIDU
| VectorType.VIKINGDB
| VectorType.UPSTASH
| VectorType.OCEANBASE
):
return {"retrieval_method": [RetrievalMethod.SEMANTIC_SEARCH.value]}
case (
@@ -682,9 +667,7 @@ class DatasetRetrievalSettingMockApi(Resource):
| VectorType.MYSCALE
| VectorType.ORACLE
| VectorType.ELASTICSEARCH
| VectorType.COUCHBASE
| VectorType.PGVECTOR
| VectorType.LINDORM
):
return {
"retrieval_method": [

View File
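
The retrieval-setting hunks above pick retrieval methods by matching on VectorType with match/case and "|" alternatives. A minimal sketch of that dispatch pattern (Python 3.10+), with simplified stand-in members and method names:

from enum import Enum


class VectorType(str, Enum):
    MILVUS = "milvus"
    CHROMA = "chroma"
    QDRANT = "qdrant"
    WEAVIATE = "weaviate"


def retrieval_methods(vector_type: VectorType) -> list[str]:
    match vector_type:
        case VectorType.MILVUS | VectorType.CHROMA:
            return ["semantic_search"]
        case VectorType.QDRANT | VectorType.WEAVIATE:
            return ["semantic_search", "full_text_search", "hybrid_search"]
        case _:
            raise ValueError(f"Unsupported vector db type {vector_type}.")


retrieval_methods(VectorType.CHROMA)  # ['semantic_search']
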

@@ -24,11 +24,8 @@ from controllers.console.datasets.error import (
InvalidActionError,
InvalidMetadataError,
)
from controllers.console.wraps import (
account_initialization_required,
cloud_edition_billing_resource_check,
setup_required,
)
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required, cloud_edition_billing_resource_check
from core.errors.error import (
LLMBadRequestError,
ModelCurrentlyNotSupportError,

View File

@@ -11,11 +11,11 @@ import services
from controllers.console import api
from controllers.console.app.error import ProviderNotInitializeError
from controllers.console.datasets.error import InvalidActionError, NoFileUploadedError, TooManyFilesError
from controllers.console.setup import setup_required
from controllers.console.wraps import (
account_initialization_required,
cloud_edition_billing_knowledge_limit_check,
cloud_edition_billing_resource_check,
setup_required,
)
from core.errors.error import LLMBadRequestError, ProviderTokenNotInitError
from core.model_manager import ModelManager

View File

@@ -6,7 +6,8 @@ from werkzeug.exceptions import Forbidden, InternalServerError, NotFound
import services
from controllers.console import api
from controllers.console.datasets.error import DatasetNameDuplicateError
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from fields.dataset_fields import dataset_detail_fields
from libs.login import login_required
from services.dataset_service import DatasetService

View File

@@ -1,3 +1,5 @@
import urllib.parse
from flask import request
from flask_login import current_user
from flask_restful import Resource, marshal_with
@@ -5,22 +7,19 @@ from flask_restful import Resource, marshal_with
import services
from configs import dify_config
from constants import DOCUMENT_EXTENSIONS
from controllers.common.errors import FilenameNotExistsError
from controllers.console.wraps import (
account_initialization_required,
cloud_edition_billing_resource_check,
setup_required,
)
from fields.file_fields import file_fields, upload_config_fields
from libs.login import login_required
from services.file_service import FileService
from .errors import (
from controllers.console import api
from controllers.console.datasets.error import (
FileTooLargeError,
NoFileUploadedError,
TooManyFilesError,
UnsupportedFileTypeError,
)
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required, cloud_edition_billing_resource_check
from core.helper import ssrf_proxy
from fields.file_fields import file_fields, remote_file_info_fields, upload_config_fields
from libs.login import login_required
from services.file_service import FileService
PREVIEW_WORDS_LIMIT = 3000
@@ -31,12 +30,13 @@ class FileApi(Resource):
@account_initialization_required
@marshal_with(upload_config_fields)
def get(self):
file_size_limit = dify_config.UPLOAD_FILE_SIZE_LIMIT
batch_count_limit = dify_config.UPLOAD_FILE_BATCH_LIMIT
image_file_size_limit = dify_config.UPLOAD_IMAGE_FILE_SIZE_LIMIT
return {
"file_size_limit": dify_config.UPLOAD_FILE_SIZE_LIMIT,
"batch_count_limit": dify_config.UPLOAD_FILE_BATCH_LIMIT,
"image_file_size_limit": dify_config.UPLOAD_IMAGE_FILE_SIZE_LIMIT,
"video_file_size_limit": dify_config.UPLOAD_VIDEO_FILE_SIZE_LIMIT,
"audio_file_size_limit": dify_config.UPLOAD_AUDIO_FILE_SIZE_LIMIT,
"file_size_limit": file_size_limit,
"batch_count_limit": batch_count_limit,
"image_file_size_limit": image_file_size_limit,
}, 200
@setup_required
@@ -45,29 +45,17 @@ class FileApi(Resource):
@marshal_with(file_fields)
@cloud_edition_billing_resource_check("documents")
def post(self):
# get file from request
file = request.files["file"]
source = request.form.get("source")
# check file
if "file" not in request.files:
raise NoFileUploadedError()
if len(request.files) > 1:
raise TooManyFilesError()
if not file.filename:
raise FilenameNotExistsError
if source not in ("datasets", None):
source = None
try:
upload_file = FileService.upload_file(
filename=file.filename,
content=file.read(),
mimetype=file.mimetype,
user=current_user,
source=source,
)
upload_file = FileService.upload_file(file=file, user=current_user)
except services.errors.file.FileTooLargeError as file_too_large_error:
raise FileTooLargeError(file_too_large_error.description)
except services.errors.file.UnsupportedFileTypeError:
@@ -92,3 +80,23 @@ class FileSupportTypeApi(Resource):
@account_initialization_required
def get(self):
return {"allowed_extensions": DOCUMENT_EXTENSIONS}
class RemoteFileInfoApi(Resource):
@marshal_with(remote_file_info_fields)
def get(self, url):
decoded_url = urllib.parse.unquote(url)
try:
response = ssrf_proxy.head(decoded_url)
return {
"file_type": response.headers.get("Content-Type", "application/octet-stream"),
"file_length": int(response.headers.get("Content-Length", 0)),
}
except Exception as e:
return {"error": str(e)}, 400
api.add_resource(FileApi, "/files/upload")
api.add_resource(FilePreviewApi, "/files/<uuid:file_id>/preview")
api.add_resource(FileSupportTypeApi, "/files/support-type")
api.add_resource(RemoteFileInfoApi, "/remote-files/<path:url>")
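
RemoteFileInfoApi above answers with just the Content-Type and Content-Length obtained from a HEAD request made through the ssrf_proxy helper. A self-contained sketch of that lookup, using plain requests in place of the proxy (an assumption for illustration):

import urllib.parse

import requests


def fetch_remote_file_info(encoded_url: str) -> dict:
    # Decode the path-encoded URL, then ask the remote server for headers only.
    decoded_url = urllib.parse.unquote(encoded_url)
    response = requests.head(decoded_url, timeout=10, allow_redirects=True)
    return {
        "file_type": response.headers.get("Content-Type", "application/octet-stream"),
        "file_length": int(response.headers.get("Content-Length", 0)),
    }
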

View File

@@ -1,23 +1,88 @@
from flask_restful import Resource
import logging
from flask_login import current_user
from flask_restful import Resource, marshal, reqparse
from werkzeug.exceptions import Forbidden, InternalServerError, NotFound
import services
from controllers.console import api
from controllers.console.datasets.hit_testing_base import DatasetsHitTestingBase
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.app.error import (
CompletionRequestError,
ProviderModelCurrentlyNotSupportError,
ProviderNotInitializeError,
ProviderQuotaExceededError,
)
from controllers.console.datasets.error import DatasetNotInitializedError
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from core.errors.error import (
LLMBadRequestError,
ModelCurrentlyNotSupportError,
ProviderTokenNotInitError,
QuotaExceededError,
)
from core.model_runtime.errors.invoke import InvokeError
from fields.hit_testing_fields import hit_testing_record_fields
from libs.login import login_required
from services.dataset_service import DatasetService
from services.hit_testing_service import HitTestingService
class HitTestingApi(Resource, DatasetsHitTestingBase):
class HitTestingApi(Resource):
@setup_required
@login_required
@account_initialization_required
def post(self, dataset_id):
dataset_id_str = str(dataset_id)
dataset = self.get_and_validate_dataset(dataset_id_str)
args = self.parse_args()
self.hit_testing_args_check(args)
dataset = DatasetService.get_dataset(dataset_id_str)
if dataset is None:
raise NotFound("Dataset not found.")
return self.perform_hit_testing(dataset, args)
try:
DatasetService.check_dataset_permission(dataset, current_user)
except services.errors.account.NoPermissionError as e:
raise Forbidden(str(e))
parser = reqparse.RequestParser()
parser.add_argument("query", type=str, location="json")
parser.add_argument("retrieval_model", type=dict, required=False, location="json")
parser.add_argument("external_retrieval_model", type=dict, required=False, location="json")
args = parser.parse_args()
HitTestingService.hit_testing_args_check(args)
try:
response = HitTestingService.retrieve(
dataset=dataset,
query=args["query"],
account=current_user,
retrieval_model=args["retrieval_model"],
external_retrieval_model=args["external_retrieval_model"],
limit=10,
)
return {"query": response["query"], "records": marshal(response["records"], hit_testing_record_fields)}
except services.errors.index.IndexNotInitializedError:
raise DatasetNotInitializedError()
except ProviderTokenNotInitError as ex:
raise ProviderNotInitializeError(ex.description)
except QuotaExceededError:
raise ProviderQuotaExceededError()
except ModelCurrentlyNotSupportError:
raise ProviderModelCurrentlyNotSupportError()
except LLMBadRequestError:
raise ProviderNotInitializeError(
"No Embedding Model or Reranking Model available. Please configure a valid provider "
"in the Settings -> Model Provider."
)
except InvokeError as e:
raise CompletionRequestError(e.description)
except ValueError as e:
raise ValueError(str(e))
except Exception as e:
logging.exception("Hit testing failed.")
raise InternalServerError(str(e))
api.add_resource(HitTestingApi, "/datasets/<uuid:dataset_id>/hit-testing")
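
A hypothetical client call against the hit-testing route registered above; the host, console access token, and dataset id are placeholders, and only the "query" field from the request parser is sent:

import requests

resp = requests.post(
    "https://dify.example.com/console/api/datasets/<dataset-id>/hit-testing",  # placeholders
    headers={"Authorization": "Bearer <console-access-token>"},
    json={"query": "How do I reset my password?"},
    timeout=30,
)
resp.raise_for_status()
for record in resp.json()["records"]:
    print(record)
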

View File

@@ -1,85 +0,0 @@
import logging
from flask_login import current_user
from flask_restful import marshal, reqparse
from werkzeug.exceptions import Forbidden, InternalServerError, NotFound
import services.dataset_service
from controllers.console.app.error import (
CompletionRequestError,
ProviderModelCurrentlyNotSupportError,
ProviderNotInitializeError,
ProviderQuotaExceededError,
)
from controllers.console.datasets.error import DatasetNotInitializedError
from core.errors.error import (
LLMBadRequestError,
ModelCurrentlyNotSupportError,
ProviderTokenNotInitError,
QuotaExceededError,
)
from core.model_runtime.errors.invoke import InvokeError
from fields.hit_testing_fields import hit_testing_record_fields
from services.dataset_service import DatasetService
from services.hit_testing_service import HitTestingService
class DatasetsHitTestingBase:
@staticmethod
def get_and_validate_dataset(dataset_id: str):
dataset = DatasetService.get_dataset(dataset_id)
if dataset is None:
raise NotFound("Dataset not found.")
try:
DatasetService.check_dataset_permission(dataset, current_user)
except services.errors.account.NoPermissionError as e:
raise Forbidden(str(e))
return dataset
@staticmethod
def hit_testing_args_check(args):
HitTestingService.hit_testing_args_check(args)
@staticmethod
def parse_args():
parser = reqparse.RequestParser()
parser.add_argument("query", type=str, location="json")
parser.add_argument("retrieval_model", type=dict, required=False, location="json")
parser.add_argument("external_retrieval_model", type=dict, required=False, location="json")
return parser.parse_args()
@staticmethod
def perform_hit_testing(dataset, args):
try:
response = HitTestingService.retrieve(
dataset=dataset,
query=args["query"],
account=current_user,
retrieval_model=args["retrieval_model"],
external_retrieval_model=args["external_retrieval_model"],
limit=10,
)
return {"query": response["query"], "records": marshal(response["records"], hit_testing_record_fields)}
except services.errors.index.IndexNotInitializedError:
raise DatasetNotInitializedError()
except ProviderTokenNotInitError as ex:
raise ProviderNotInitializeError(ex.description)
except QuotaExceededError:
raise ProviderQuotaExceededError()
except ModelCurrentlyNotSupportError:
raise ProviderModelCurrentlyNotSupportError()
except LLMBadRequestError:
raise ProviderNotInitializeError(
"No Embedding Model or Reranking Model available. Please configure a valid provider "
"in the Settings -> Model Provider."
)
except InvokeError as e:
raise CompletionRequestError(e.description)
except ValueError as e:
raise ValueError(str(e))
except Exception as e:
logging.exception("Hit testing failed.")
raise InternalServerError(str(e))

View File

@@ -2,7 +2,8 @@ from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.datasets.error import WebsiteCrawlError
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from libs.login import login_required
from services.website_service import WebsiteService

View File

@@ -38,27 +38,3 @@ class AlreadyActivateError(BaseHTTPException):
error_code = "already_activate"
description = "Auth Token is invalid or account already activated, please check again."
code = 403
class NotAllowedCreateWorkspace(BaseHTTPException):
error_code = "not_allowed_create_workspace"
description = "Workspace not found, please contact system admin to invite you to join in a workspace."
code = 400
class AccountBannedError(BaseHTTPException):
error_code = "account_banned"
description = "Account is banned."
code = 400
class NotAllowedRegister(BaseHTTPException):
error_code = "unauthorized"
description = "Account not found."
code = 400
class EmailSendIpLimitError(BaseHTTPException):
error_code = "email_send_ip_limit"
description = "Too many emails have been sent from this IP address recently. Please try again later."
code = 429

View File

@@ -21,12 +21,7 @@ class AppParameterApi(InstalledAppResource):
"options": fields.List(fields.String),
}
system_parameters_fields = {
"image_file_size_limit": fields.Integer,
"video_file_size_limit": fields.Integer,
"audio_file_size_limit": fields.Integer,
"file_size_limit": fields.Integer,
}
system_parameters_fields = {"image_file_size_limit": fields.String}
parameters_fields = {
"opening_statement": fields.String,
@@ -87,12 +82,7 @@ class AppParameterApi(InstalledAppResource):
}
},
),
"system_parameters": {
"image_file_size_limit": dify_config.UPLOAD_IMAGE_FILE_SIZE_LIMIT,
"video_file_size_limit": dify_config.UPLOAD_VIDEO_FILE_SIZE_LIMIT,
"audio_file_size_limit": dify_config.UPLOAD_AUDIO_FILE_SIZE_LIMIT,
"file_size_limit": dify_config.UPLOAD_FILE_SIZE_LIMIT,
},
"system_parameters": {"image_file_size_limit": dify_config.UPLOAD_IMAGE_FILE_SIZE_LIMIT},
}

View File

@@ -3,7 +3,8 @@ from flask_restful import Resource, marshal_with, reqparse
from constants import HIDDEN_VALUE
from controllers.console import api
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from fields.api_based_extension_fields import api_based_extension_fields
from libs.login import login_required
from models.api_based_extension import APIBasedExtension

View File

@@ -5,7 +5,8 @@ from libs.login import login_required
from services.feature_service import FeatureService
from . import api
from .wraps import account_initialization_required, cloud_utm_record, setup_required
from .setup import setup_required
from .wraps import account_initialization_required, cloud_utm_record
class FeatureApi(Resource):

View File

@@ -1,25 +0,0 @@
from libs.exception import BaseHTTPException
class FileTooLargeError(BaseHTTPException):
error_code = "file_too_large"
description = "File size exceeded. {message}"
code = 413
class UnsupportedFileTypeError(BaseHTTPException):
error_code = "unsupported_file_type"
description = "File type not allowed."
code = 415
class TooManyFilesError(BaseHTTPException):
error_code = "too_many_files"
description = "Only one file is allowed."
code = 400
class NoFileUploadedError(BaseHTTPException):
error_code = "no_file_uploaded"
description = "Please upload your file."
code = 400

View File

@@ -1,71 +0,0 @@
import urllib.parse
from typing import cast
from flask_login import current_user
from flask_restful import Resource, marshal_with, reqparse
from controllers.common import helpers
from core.file import helpers as file_helpers
from core.helper import ssrf_proxy
from fields.file_fields import file_fields_with_signed_url, remote_file_info_fields
from models.account import Account
from services.file_service import FileService
class RemoteFileInfoApi(Resource):
@marshal_with(remote_file_info_fields)
def get(self, url):
decoded_url = urllib.parse.unquote(url)
try:
response = ssrf_proxy.head(decoded_url)
return {
"file_type": response.headers.get("Content-Type", "application/octet-stream"),
"file_length": int(response.headers.get("Content-Length", 0)),
}
except Exception as e:
return {"error": str(e)}, 400
class RemoteFileUploadApi(Resource):
@marshal_with(file_fields_with_signed_url)
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("url", type=str, required=True, help="URL is required")
args = parser.parse_args()
url = args["url"]
response = ssrf_proxy.head(url)
response.raise_for_status()
file_info = helpers.guess_file_info_from_response(response)
if not FileService.is_file_size_within_limit(extension=file_info.extension, file_size=file_info.size):
return {"error": "File size exceeded"}, 400
response = ssrf_proxy.get(url)
response.raise_for_status()
content = response.content
try:
user = cast(Account, current_user)
upload_file = FileService.upload_file(
filename=file_info.filename,
content=content,
mimetype=file_info.mimetype,
user=user,
source_url=url,
)
except Exception as e:
return {"error": str(e)}, 400
return {
"id": upload_file.id,
"name": upload_file.name,
"size": upload_file.size,
"extension": upload_file.extension,
"url": file_helpers.get_signed_file_url(upload_file_id=upload_file.id),
"mime_type": upload_file.mime_type,
"created_by": upload_file.created_by,
"created_at": upload_file.created_at,
}, 201
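
RemoteFileUploadApi above checks the size reported by a HEAD request before downloading the content with a GET, both routed through ssrf_proxy. A minimal sketch of that two-step fetch, with plain requests standing in for the proxy and an assumed size cap:

import requests

FILE_SIZE_LIMIT = 15 * 1024 * 1024  # assumed 15 MiB cap, for illustration only


def fetch_remote_file(url: str) -> bytes:
    head = requests.head(url, timeout=10, allow_redirects=True)
    head.raise_for_status()
    if int(head.headers.get("Content-Length", 0)) > FILE_SIZE_LIMIT:
        raise ValueError("File size exceeded")
    download = requests.get(url, timeout=30)
    download.raise_for_status()
    return download.content
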

View File

@@ -1,3 +1,5 @@
from functools import wraps
from flask import request
from flask_restful import Resource, reqparse
@@ -8,7 +10,7 @@ from models.model import DifySetup
from services.account_service import RegisterService, TenantService
from . import api
from .error import AlreadySetupError, NotInitValidateError
from .error import AlreadySetupError, NotInitValidateError, NotSetupError
from .init_validate import get_init_validate_status
from .wraps import only_edition_self_hosted
@@ -50,10 +52,26 @@ class SetupApi(Resource):
return {"result": "success"}, 201
def setup_required(view):
@wraps(view)
def decorated(*args, **kwargs):
# check setup
if not get_init_validate_status():
raise NotInitValidateError()
elif not get_setup_status():
raise NotSetupError()
return view(*args, **kwargs)
return decorated
def get_setup_status():
if dify_config.EDITION == "SELF_HOSTED":
return DifySetup.query.first()
return True
else:
return True
api.add_resource(SetupApi, "/setup")
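
The setup_required decorator added above is a view guard: it runs precondition checks and raises before the wrapped handler executes. A stand-alone sketch of the pattern with a stand-in check function:

from functools import wraps


class NotSetupError(Exception):
    pass


def is_setup_complete() -> bool:
    return True  # stand-in for the real DifySetup / self-hosted edition check


def setup_required(view):
    @wraps(view)
    def decorated(*args, **kwargs):
        if not is_setup_complete():
            raise NotSetupError("Dify has not been set up yet; call /setup first.")
        return view(*args, **kwargs)

    return decorated


@setup_required
def protected_view():
    return {"result": "success"}


protected_view()
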

View File

@@ -4,7 +4,8 @@ from flask_restful import Resource, marshal_with, reqparse
from werkzeug.exceptions import Forbidden
from controllers.console import api
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from fields.tag_fields import tag_fields
from libs.login import login_required
from models.model import Tag

View File

@@ -3,7 +3,6 @@ import logging
import requests
from flask_restful import Resource, reqparse
from packaging import version
from configs import dify_config
@@ -48,15 +47,43 @@ class VersionApi(Resource):
def _has_new_version(*, latest_version: str, current_version: str) -> bool:
try:
latest = version.parse(latest_version)
current = version.parse(current_version)
def parse_version(version: str) -> tuple:
# Split version into parts and pre-release suffix if any
parts = version.split("-")
version_parts = parts[0].split(".")
pre_release = parts[1] if len(parts) > 1 else None
# Compare versions
return latest > current
except version.InvalidVersion:
logging.warning(f"Invalid version format: latest={latest_version}, current={current_version}")
# Validate version format
if len(version_parts) != 3:
raise ValueError(f"Invalid version format: {version}")
try:
# Convert version parts to integers
major, minor, patch = map(int, version_parts)
return (major, minor, patch, pre_release)
except ValueError:
raise ValueError(f"Invalid version format: {version}")
latest = parse_version(latest_version)
current = parse_version(current_version)
# Compare major, minor, and patch versions
for latest_part, current_part in zip(latest[:3], current[:3]):
if latest_part > current_part:
return True
elif latest_part < current_part:
return False
# If versions are equal, check pre-release suffixes
if latest[3] is None and current[3] is not None:
return True
elif latest[3] is not None and current[3] is None:
return False
elif latest[3] is not None and current[3] is not None:
# Simple string comparison for pre-release versions
return latest[3] > current[3]
return False
api.add_resource(VersionApi, "/version")
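
The hand-rolled comparison above splits "major.minor.patch[-pre]" and treats a final release as newer than any pre-release of the same numeric version. A compact re-implementation for illustration (not the project's actual helper), with a few checks of the intended semantics:

def parse_version(value: str) -> tuple:
    base, _, pre_release = value.partition("-")
    major, minor, patch = (int(part) for part in base.split("."))
    return (major, minor, patch, pre_release or None)


def has_new_version(latest: str, current: str) -> bool:
    latest_parsed, current_parsed = parse_version(latest), parse_version(current)
    if latest_parsed[:3] != current_parsed[:3]:
        return latest_parsed[:3] > current_parsed[:3]
    # Same numeric version: a final release (no suffix) beats any pre-release.
    if latest_parsed[3] is None and current_parsed[3] is not None:
        return True
    if latest_parsed[3] is not None and current_parsed[3] is None:
        return False
    if latest_parsed[3] and current_parsed[3]:
        return latest_parsed[3] > current_parsed[3]  # plain string comparison
    return False


assert has_new_version("0.10.0", "0.10.0-beta3")
assert not has_new_version("0.10.0-beta3", "0.10.0")
assert has_new_version("0.10.1", "0.10.0")
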

View File

@@ -8,13 +8,14 @@ from flask_restful import Resource, fields, marshal_with, reqparse
from configs import dify_config
from constants.languages import supported_language
from controllers.console import api
from controllers.console.setup import setup_required
from controllers.console.workspace.error import (
AccountAlreadyInitedError,
CurrentPasswordIncorrectError,
InvalidInvitationCodeError,
RepeatPasswordNotMatchError,
)
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.wraps import account_initialization_required
from extensions.ext_database import db
from fields.member_fields import account_fields
from libs.helper import TimestampField, timezone

View File

@@ -2,7 +2,8 @@ from flask_restful import Resource, reqparse
from werkzeug.exceptions import Forbidden
from controllers.console import api
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from core.model_runtime.entities.model_entities import ModelType
from core.model_runtime.errors.validate import CredentialsValidateFailedError
from libs.login import current_user, login_required

View File

@@ -4,11 +4,8 @@ from flask_restful import Resource, abort, marshal_with, reqparse
import services
from configs import dify_config
from controllers.console import api
from controllers.console.wraps import (
account_initialization_required,
cloud_edition_billing_resource_check,
setup_required,
)
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required, cloud_edition_billing_resource_check
from extensions.ext_database import db
from fields.member_fields import account_with_role_list_fields
from libs.login import login_required

View File

@@ -6,7 +6,8 @@ from flask_restful import Resource, reqparse
from werkzeug.exceptions import Forbidden
from controllers.console import api
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from core.model_runtime.entities.model_entities import ModelType
from core.model_runtime.errors.validate import CredentialsValidateFailedError
from core.model_runtime.utils.encoders import jsonable_encoder

View File

@@ -5,7 +5,8 @@ from flask_restful import Resource, reqparse
from werkzeug.exceptions import Forbidden
from controllers.console import api
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from core.model_runtime.entities.model_entities import ModelType
from core.model_runtime.errors.validate import CredentialsValidateFailedError
from core.model_runtime.utils.encoders import jsonable_encoder

View File

@@ -7,7 +7,8 @@ from werkzeug.exceptions import Forbidden
from configs import dify_config
from controllers.console import api
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from core.model_runtime.utils.encoders import jsonable_encoder
from libs.helper import alphanumeric, uuid_value
from libs.login import login_required

View File

@@ -6,7 +6,6 @@ from flask_restful import Resource, fields, inputs, marshal, marshal_with, reqpa
from werkzeug.exceptions import Unauthorized
import services
from controllers.common.errors import FilenameNotExistsError
from controllers.console import api
from controllers.console.admin import admin_required
from controllers.console.datasets.error import (
@@ -16,11 +15,8 @@ from controllers.console.datasets.error import (
UnsupportedFileTypeError,
)
from controllers.console.error import AccountNotLinkTenantError
from controllers.console.wraps import (
account_initialization_required,
cloud_edition_billing_resource_check,
setup_required,
)
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required, cloud_edition_billing_resource_check
from extensions.ext_database import db
from libs.helper import TimestampField
from libs.login import login_required
@@ -197,20 +193,12 @@ class WebappLogoWorkspaceApi(Resource):
if len(request.files) > 1:
raise TooManyFilesError()
if not file.filename:
raise FilenameNotExistsError
extension = file.filename.split(".")[-1]
if extension.lower() not in {"svg", "png"}:
raise UnsupportedFileTypeError()
try:
upload_file = FileService.upload_file(
filename=file.filename,
content=file.read(),
mimetype=file.mimetype,
user=current_user,
)
upload_file = FileService.upload_file(file=file, user=current_user)
except services.errors.file.FileTooLargeError as file_too_large_error:
raise FileTooLargeError(file_too_large_error.description)

View File

@@ -1,5 +1,4 @@
import json
import os
from functools import wraps
from flask import abort, request
@@ -7,12 +6,9 @@ from flask_login import current_user
from configs import dify_config
from controllers.console.workspace.error import AccountNotInitializedError
from models.model import DifySetup
from services.feature_service import FeatureService
from services.operation_service import OperationService
from .error import NotInitValidateError, NotSetupError
def account_initialization_required(view):
@wraps(view)
@@ -128,17 +124,3 @@ def cloud_utm_record(view):
return view(*args, **kwargs)
return decorated
def setup_required(view):
@wraps(view)
def decorated(*args, **kwargs):
# check setup
if dify_config.EDITION == "SELF_HOSTED" and os.environ.get("INIT_PASSWORD") and not DifySetup.query.first():
raise NotInitValidateError()
elif dify_config.EDITION == "SELF_HOSTED" and not DifySetup.query.first():
raise NotSetupError()
return view(*args, **kwargs)
return decorated

View File

@@ -1,5 +1,5 @@
from flask import Response, request
from flask_restful import Resource, reqparse
from flask_restful import Resource
from werkzeug.exceptions import NotFound
import services
@@ -10,10 +10,6 @@ from services.file_service import FileService
class ImagePreviewApi(Resource):
"""
Deprecated
"""
def get(self, file_id):
file_id = str(file_id)
@@ -41,39 +37,24 @@ class FilePreviewApi(Resource):
def get(self, file_id):
file_id = str(file_id)
parser = reqparse.RequestParser()
parser.add_argument("timestamp", type=str, required=True, location="args")
parser.add_argument("nonce", type=str, required=True, location="args")
parser.add_argument("sign", type=str, required=True, location="args")
parser.add_argument("as_attachment", type=bool, required=False, default=False, location="args")
timestamp = request.args.get("timestamp")
nonce = request.args.get("nonce")
sign = request.args.get("sign")
args = parser.parse_args()
if not args["timestamp"] or not args["nonce"] or not args["sign"]:
if not timestamp or not nonce or not sign:
return {"content": "Invalid request."}, 400
try:
generator, upload_file = FileService.get_file_generator_by_file_id(
generator, mimetype = FileService.get_signed_file_preview(
file_id=file_id,
timestamp=args["timestamp"],
nonce=args["nonce"],
sign=args["sign"],
timestamp=timestamp,
nonce=nonce,
sign=sign,
)
except services.errors.file.UnsupportedFileTypeError:
raise UnsupportedFileTypeError()
response = Response(
generator,
mimetype=upload_file.mime_type,
direct_passthrough=True,
headers={},
)
if upload_file.size > 0:
response.headers["Content-Length"] = str(upload_file.size)
if args["as_attachment"]:
response.headers["Content-Disposition"] = f"attachment; filename={upload_file.name}"
return response
return Response(generator, mimetype=mimetype)
class WorkspaceWebappLogoApi(Resource):

View File

@@ -16,7 +16,6 @@ class ToolFilePreviewApi(Resource):
parser.add_argument("timestamp", type=str, required=True, location="args")
parser.add_argument("nonce", type=str, required=True, location="args")
parser.add_argument("sign", type=str, required=True, location="args")
parser.add_argument("as_attachment", type=bool, required=False, default=False, location="args")
args = parser.parse_args()
@@ -29,27 +28,18 @@ class ToolFilePreviewApi(Resource):
raise Forbidden("Invalid request.")
try:
stream, tool_file = ToolFileManager.get_file_generator_by_tool_file_id(
result = ToolFileManager.get_file_generator_by_tool_file_id(
file_id,
)
if not stream or not tool_file:
if not result:
raise NotFound("file is not found")
generator, mimetype = result
except Exception:
raise UnsupportedFileTypeError()
response = Response(
stream,
mimetype=tool_file.mimetype,
direct_passthrough=True,
headers={},
)
if tool_file.size > 0:
response.headers["Content-Length"] = str(tool_file.size)
if args["as_attachment"]:
response.headers["Content-Disposition"] = f"attachment; filename={tool_file.name}"
return response
return Response(generator, mimetype=mimetype)
api.add_resource(ToolFilePreviewApi, "/files/tools/<uuid:file_id>.<string:extension>")
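
The preview endpoints above stream the stored file back through a Flask Response, setting Content-Length and, when requested, a Content-Disposition attachment header. A minimal sketch of that response construction with a hypothetical loader:

from flask import Flask, Response

app = Flask(__name__)


def load_file_stream(file_id: str):
    """Hypothetical loader returning (chunk_generator, mimetype, size, name)."""
    data = b"example content"
    return iter([data]), "application/octet-stream", len(data), f"{file_id}.bin"


@app.route("/files/<file_id>/download")
def download(file_id: str):
    generator, mimetype, size, name = load_file_stream(file_id)
    response = Response(generator, mimetype=mimetype, direct_passthrough=True)
    if size > 0:
        response.headers["Content-Length"] = str(size)
    response.headers["Content-Disposition"] = f"attachment; filename={name}"
    return response
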

View File

@@ -1,6 +1,6 @@
from flask_restful import Resource, reqparse
from controllers.console.wraps import setup_required
from controllers.console.setup import setup_required
from controllers.inner_api import api
from controllers.inner_api.wraps import inner_api_only
from events.tenant_event import tenant_was_created
@@ -21,7 +21,7 @@ class EnterpriseWorkspace(Resource):
if account is None:
return {"message": "owner account not found."}, 404
tenant = TenantService.create_tenant(args["name"], is_from_dashboard=True)
tenant = TenantService.create_tenant(args["name"])
TenantService.create_tenant_member(tenant, account, role="owner")
tenant_was_created.send(tenant)

View File

@@ -5,6 +5,7 @@ from libs.external_api import ExternalApi
bp = Blueprint("service_api", __name__, url_prefix="/v1")
api = ExternalApi(bp)
from . import index
from .app import app, audio, completion, conversation, file, message, workflow
from .dataset import dataset, document, hit_testing, segment
from .dataset import dataset, document, segment

View File

@@ -21,12 +21,7 @@ class AppParameterApi(Resource):
"options": fields.List(fields.String),
}
system_parameters_fields = {
"image_file_size_limit": fields.Integer,
"video_file_size_limit": fields.Integer,
"audio_file_size_limit": fields.Integer,
"file_size_limit": fields.Integer,
}
system_parameters_fields = {"image_file_size_limit": fields.String}
parameters_fields = {
"opening_statement": fields.String,
@@ -86,12 +81,7 @@ class AppParameterApi(Resource):
}
},
),
"system_parameters": {
"image_file_size_limit": dify_config.UPLOAD_IMAGE_FILE_SIZE_LIMIT,
"video_file_size_limit": dify_config.UPLOAD_VIDEO_FILE_SIZE_LIMIT,
"audio_file_size_limit": dify_config.UPLOAD_AUDIO_FILE_SIZE_LIMIT,
"file_size_limit": dify_config.UPLOAD_FILE_SIZE_LIMIT,
},
"system_parameters": {"image_file_size_limit": dify_config.UPLOAD_IMAGE_FILE_SIZE_LIMIT},
}

View File

@@ -4,6 +4,7 @@ from flask_restful import Resource, reqparse
from werkzeug.exceptions import InternalServerError, NotFound
import services
from constants import UUID_NIL
from controllers.service_api import api
from controllers.service_api.app.error import (
AppUnavailableError,
@@ -107,6 +108,7 @@ class ChatApi(Resource):
parser.add_argument("conversation_id", type=uuid_value, location="json")
parser.add_argument("retriever_from", type=str, required=False, default="dev", location="json")
parser.add_argument("auto_generate_name", type=bool, required=False, default=True, location="json")
parser.add_argument("parent_message_id", type=uuid_value, required=False, default=UUID_NIL, location="json")
args = parser.parse_args()

View File

@@ -2,7 +2,6 @@ from flask import request
from flask_restful import Resource, marshal_with
import services
from controllers.common.errors import FilenameNotExistsError
from controllers.service_api import api
from controllers.service_api.app.error import (
FileTooLargeError,
@@ -32,17 +31,8 @@ class FileApi(Resource):
if len(request.files) > 1:
raise TooManyFilesError()
if not file.filename:
raise FilenameNotExistsError
try:
upload_file = FileService.upload_file(
filename=file.filename,
content=file.read(),
mimetype=file.mimetype,
user=end_user,
source="datasets",
)
upload_file = FileService.upload_file(file, end_user)
except services.errors.file.FileTooLargeError as file_too_large_error:
raise FileTooLargeError(file_too_large_error.description)
except services.errors.file.UnsupportedFileTypeError:

View File

@@ -48,7 +48,7 @@ class MessageListApi(Resource):
"tool_input": fields.String,
"created_at": TimestampField,
"observation": fields.String,
"message_files": fields.List(fields.Nested(message_file_fields)),
"message_files": fields.List(fields.String),
}
message_fields = {

View File

@@ -66,13 +66,6 @@ class DatasetListApi(DatasetApiResource):
help="type is required. Name must be between 1 to 40 characters.",
type=_validate_name,
)
parser.add_argument(
"description",
type=str,
nullable=True,
required=False,
default="",
)
parser.add_argument(
"indexing_technique",
type=str,
@@ -115,7 +108,6 @@ class DatasetListApi(DatasetApiResource):
dataset = DatasetService.create_empty_dataset(
tenant_id=tenant_id,
name=args["name"],
description=args["description"],
indexing_technique=args["indexing_technique"],
account=current_user,
permission=args["permission"],

View File

@@ -6,7 +6,6 @@ from sqlalchemy import desc
from werkzeug.exceptions import NotFound
import services.dataset_service
from controllers.common.errors import FilenameNotExistsError
from controllers.service_api import api
from controllers.service_api.app.error import ProviderNotInitializeError
from controllers.service_api.dataset.error import (
@@ -56,12 +55,7 @@ class DocumentAddByTextApi(DatasetApiResource):
if not dataset.indexing_technique and not args["indexing_technique"]:
raise ValueError("indexing_technique is required.")
text = args.get("text")
name = args.get("name")
if text is None or name is None:
raise ValueError("Both 'text' and 'name' must be non-null values.")
upload_file = FileService.upload_text(text=str(text), text_name=str(name))
upload_file = FileService.upload_text(args.get("text"), args.get("name"))
data_source = {
"type": "upload_file",
"info_list": {"data_source_type": "upload_file", "file_info_list": {"file_ids": [upload_file.id]}},
@@ -110,11 +104,7 @@ class DocumentUpdateByTextApi(DatasetApiResource):
raise ValueError("Dataset is not exist.")
if args["text"]:
text = args.get("text")
name = args.get("name")
if text is None or name is None:
raise ValueError("Both text and name must be strings.")
upload_file = FileService.upload_text(text=str(text), text_name=str(name))
upload_file = FileService.upload_text(args.get("text"), args.get("name"))
data_source = {
"type": "upload_file",
"info_list": {"data_source_type": "upload_file", "file_info_list": {"file_ids": [upload_file.id]}},
@@ -173,16 +163,7 @@ class DocumentAddByFileApi(DatasetApiResource):
if len(request.files) > 1:
raise TooManyFilesError()
if not file.filename:
raise FilenameNotExistsError
upload_file = FileService.upload_file(
filename=file.filename,
content=file.read(),
mimetype=file.mimetype,
user=current_user,
source="datasets",
)
upload_file = FileService.upload_file(file, current_user)
data_source = {"type": "upload_file", "info_list": {"file_info_list": {"file_ids": [upload_file.id]}}}
args["data_source"] = data_source
# validate args
@@ -231,16 +212,7 @@ class DocumentUpdateByFileApi(DatasetApiResource):
if len(request.files) > 1:
raise TooManyFilesError()
if not file.filename:
raise FilenameNotExistsError
upload_file = FileService.upload_file(
filename=file.filename,
content=file.read(),
mimetype=file.mimetype,
user=current_user,
source="datasets",
)
upload_file = FileService.upload_file(file, current_user)
data_source = {"type": "upload_file", "info_list": {"file_info_list": {"file_ids": [upload_file.id]}}}
args["data_source"] = data_source
# validate args
@@ -258,7 +230,7 @@ class DocumentUpdateByFileApi(DatasetApiResource):
except ProviderTokenNotInitError as ex:
raise ProviderNotInitializeError(ex.description)
document = documents[0]
documents_and_batch_fields = {"document": marshal(document, document_fields), "batch": document.batch}
documents_and_batch_fields = {"document": marshal(document, document_fields), "batch": batch}
return documents_and_batch_fields, 200
@@ -359,26 +331,10 @@ class DocumentIndexingStatusApi(DatasetApiResource):
return data
api.add_resource(
DocumentAddByTextApi,
"/datasets/<uuid:dataset_id>/document/create_by_text",
"/datasets/<uuid:dataset_id>/document/create-by-text",
)
api.add_resource(
DocumentAddByFileApi,
"/datasets/<uuid:dataset_id>/document/create_by_file",
"/datasets/<uuid:dataset_id>/document/create-by-file",
)
api.add_resource(
DocumentUpdateByTextApi,
"/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/update_by_text",
"/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/update-by-text",
)
api.add_resource(
DocumentUpdateByFileApi,
"/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/update_by_file",
"/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/update-by-file",
)
api.add_resource(DocumentAddByTextApi, "/datasets/<uuid:dataset_id>/document/create_by_text")
api.add_resource(DocumentAddByFileApi, "/datasets/<uuid:dataset_id>/document/create_by_file")
api.add_resource(DocumentUpdateByTextApi, "/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/update_by_text")
api.add_resource(DocumentUpdateByFileApi, "/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/update_by_file")
api.add_resource(DocumentDeleteApi, "/datasets/<uuid:dataset_id>/documents/<uuid:document_id>")
api.add_resource(DocumentListApi, "/datasets/<uuid:dataset_id>/documents")
api.add_resource(DocumentIndexingStatusApi, "/datasets/<uuid:dataset_id>/documents/<string:batch>/indexing-status")
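
A hypothetical call to the create_by_text route registered above; the dataset id and API key are placeholders, the body fields mirror the text-document parser, and the "high_quality" indexing_technique value is an assumption:

import requests

resp = requests.post(
    "https://dify.example.com/v1/datasets/<dataset-id>/document/create_by_text",
    headers={"Authorization": "Bearer <dataset-api-key>"},
    json={"name": "faq.md", "text": "Q: How do I log in?\nA: ...", "indexing_technique": "high_quality"},
    timeout=30,
)
resp.raise_for_status()
payload = resp.json()
print(payload["document"]["id"], payload["batch"])
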

View File

@@ -1,17 +0,0 @@
from controllers.console.datasets.hit_testing_base import DatasetsHitTestingBase
from controllers.service_api import api
from controllers.service_api.wraps import DatasetApiResource
class HitTestingApi(DatasetApiResource, DatasetsHitTestingBase):
def post(self, tenant_id, dataset_id):
dataset_id_str = str(dataset_id)
dataset = self.get_and_validate_dataset(dataset_id_str)
args = self.parse_args()
self.hit_testing_args_check(args)
return self.perform_hit_testing(dataset, args)
api.add_resource(HitTestingApi, "/datasets/<uuid:dataset_id>/hit-testing", "/datasets/<uuid:dataset_id>/retrieve")

View File

@@ -2,17 +2,8 @@ from flask import Blueprint
from libs.external_api import ExternalApi
from .files import FileApi
from .remote_files import RemoteFileInfoApi, RemoteFileUploadApi
bp = Blueprint("web", __name__, url_prefix="/api")
api = ExternalApi(bp)
# Files
api.add_resource(FileApi, "/files/upload")
# Remote files
api.add_resource(RemoteFileInfoApi, "/remote-files/<path:url>")
api.add_resource(RemoteFileUploadApi, "/remote-files/upload")
from . import app, audio, completion, conversation, feature, message, passport, saved_message, site, workflow
from . import app, audio, completion, conversation, feature, file, message, passport, saved_message, site, workflow

View File

@@ -21,12 +21,7 @@ class AppParameterApi(WebApiResource):
"options": fields.List(fields.String),
}
system_parameters_fields = {
"image_file_size_limit": fields.Integer,
"video_file_size_limit": fields.Integer,
"audio_file_size_limit": fields.Integer,
"file_size_limit": fields.Integer,
}
system_parameters_fields = {"image_file_size_limit": fields.String}
parameters_fields = {
"opening_statement": fields.String,
@@ -85,12 +80,7 @@ class AppParameterApi(WebApiResource):
}
},
),
"system_parameters": {
"image_file_size_limit": dify_config.UPLOAD_IMAGE_FILE_SIZE_LIMIT,
"video_file_size_limit": dify_config.UPLOAD_VIDEO_FILE_SIZE_LIMIT,
"audio_file_size_limit": dify_config.UPLOAD_AUDIO_FILE_SIZE_LIMIT,
"file_size_limit": dify_config.UPLOAD_FILE_SIZE_LIMIT,
},
"system_parameters": {"image_file_size_limit": dify_config.UPLOAD_IMAGE_FILE_SIZE_LIMIT},
}

Some files were not shown because too many files have changed in this diff.