caeb27c3a5 | 2025-02-08 19:39:58 -05:00 | Pam
    res_multistep: Fix cfgpp and add ancestral samplers (#6731)

3d06e1c555 | 2025-02-08 18:57:24 -05:00 | comfyanonymous
    Make error more clear to user.

43a74c0de1 | 2025-02-08 17:00:56 -05:00 | catboxanon
    Allow FP16 accumulation with --fast (#6453)
    Currently only applies to PyTorch nightly releases. (>=20250208)

af93c8d1ee | 2025-02-08 06:57:25 -05:00 | comfyanonymous
    Document which text encoder to use for lumina 2.

832e3f5ca3 | 2025-02-07 14:44:43 -05:00 | Raphael Walker
    Fix another small bug in attention_bias redux (#6737)
    * fix a bug in the attn_masked redux code when using weight=1.0
    * oh shit wait there was another bug

079eccc92a | 2025-02-07 03:29:21 -05:00 | comfyanonymous
    Don't compress http response by default.
    Remove argument to disable it.
    Add new --enable-compress-response-body argument to enable it.
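The tradeoff behind making compression opt-in (CPU time spent compressing vs. bytes on the wire) can be sketched with the standard library. This is a toy illustration only; ComfyUI serves responses through aiohttp, and the `maybe_compress` helper and flag semantics here are assumptions, not the project's actual middleware:

```python
import gzip

def maybe_compress(body: bytes, enabled: bool = False) -> tuple[bytes, dict]:
    """Compress a response body only when explicitly enabled,
    mirroring an opt-in --enable-compress-response-body style flag."""
    if not enabled:
        return body, {}
    return gzip.compress(body), {"Content-Encoding": "gzip"}

# Repetitive JSON-like payloads compress well, but compressing
# costs CPU per response, hence the off-by-default choice.
payload = b'{"images": ["..."], "status": "ok"}' * 1000

raw, raw_headers = maybe_compress(payload, enabled=False)
small, gz_headers = maybe_compress(payload, enabled=True)

print(len(small) < len(raw))                    # True
print(gzip.decompress(small) == payload)        # True
```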
b6951768c4 | 2025-02-06 16:51:16 -05:00 | Raphael Walker
    fix a bug in the attn_masked redux code when using weight=1.0 (#6721)

fca304debf | 2025-02-06 10:43:10 -05:00 | Comfy Org PR Bot
    Update frontend to v1.8.14 (#6724)
    Co-authored-by: huchenlei <20929282+huchenlei@users.noreply.github.com>

14880e6dba | 2025-02-06 05:00:37 -05:00 | comfyanonymous
    Remove some useless code.

f1059b0b82 | 2025-02-05 18:48:36 -05:00 | Chenlei Hu
    Remove unused GET /files API endpoint (#6714)

debabccb84 | 2025-02-05 15:48:13 -05:00 | comfyanonymous
    Bump ComfyUI version to v0.3.14

37cd448529 | 2025-02-05 14:49:52 -05:00 | comfyanonymous
    Set the shift for Lumina back to 6.

94f21f9301 | 2025-02-05 04:32:47 -05:00 | comfyanonymous
    Upcasting rope to fp32 seems to make no difference in this model.

60653004e5 | 2025-02-05 04:17:25 -05:00 | comfyanonymous
    Use regular numbers for rope in lumina model.

a57d635c5f | 2025-02-04 21:48:11 -05:00 | comfyanonymous
    Fix lumina 2 batches.

016b219dcc | 2025-02-04 08:08:36 -05:00 | comfyanonymous
    Add Lumina Image 2.0 to Readme.

8ac2dddeed | 2025-02-04 06:50:37 -05:00 | comfyanonymous
    Lower the default shift of lumina to reduce artifacts.

3e880ac709 | 2025-02-04 04:20:56 -05:00 | comfyanonymous
    Fix on python 3.9

e5ea112a90 | 2025-02-04 04:16:30 -05:00 | comfyanonymous
    Support Lumina 2 model.

8d88bfaff9 | 2025-02-03 17:07:35 -05:00 | Raphael Walker
    allow searching for new .pt2 extension, which can contain AOTI compiled modules (#6689)

ed4d92b721 | 2025-02-03 03:31:39 -05:00 | comfyanonymous
    Model merging nodes for cosmos.

932ae8d9ca | 2025-02-02 17:54:44 -05:00 | Comfy Org PR Bot
    Update frontend to v1.8.13 (#6682)
    Co-authored-by: huchenlei <20929282+huchenlei@users.noreply.github.com>

44e19a28d3 | 2025-02-02 09:46:00 -05:00 | comfyanonymous
    Use maximum negative value instead of -inf for masks in text encoders.
    This is probably more correct.
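The motivation for preferring the most negative finite value over -inf can be shown with a toy softmax in plain Python. When every position in a row is masked with -inf, the max-subtraction step produces `-inf - -inf = nan` and the whole row becomes NaN; a large finite value instead degrades gracefully to a uniform distribution. (The real code would use something like `torch.finfo(dtype).min`; the helper below is purely illustrative.)

```python
import math

# Stand-in for torch.finfo(dtype).min: a very large negative finite value.
BIG_NEG = -3.4e38

def softmax(scores):
    """Numerically standard softmax over one row of attention scores."""
    m = max(scores)                       # subtract the max for stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# A fully masked row with -inf poisons the output with NaN:
print(softmax([float("-inf")] * 4))   # [nan, nan, nan, nan]
# The same row masked with a finite minimum stays well-defined:
print(softmax([BIG_NEG] * 4))         # [0.25, 0.25, 0.25, 0.25]
```

In the partially masked case both choices behave the same, since `exp` of a huge negative difference underflows to exactly 0.0.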
0a0df5f136 | 2025-02-02 09:26:47 -05:00 | Dr.Lt.Data
    better guide message for sageattention (#6634)

24d6871e47 | 2025-02-02 09:24:55 -05:00 | KarryCharon
    add disable-compres-response-body cli args; add compress middleware; (#6672)

9e1d301129 | 2025-02-01 06:35:22 -05:00 | comfyanonymous
    Only use stable cascade lora format with cascade model.

768e035868 | 2025-01-31 10:09:07 -08:00 | Terry Jia
    Add node for preview 3d animation (#6594)
    * Add node for preview 3d animation
    * remove bg_color param
    * remove animation_speed param

669e0497ea | 2025-01-31 10:07:37 -08:00 | Comfy Org PR Bot
    Update frontend to v1.8.12 (#6662)
    Co-authored-by: huchenlei <20929282+huchenlei@users.noreply.github.com>

541dc08547 | 2025-01-31 08:35:48 -05:00 | comfyanonymous
    Update Readme.

8d8dc9a262 | 2025-01-30 06:49:52 -05:00 | comfyanonymous
    Allow batch of different sigmas when noise scaling.

2f98c24360 | 2025-01-30 02:12:43 -05:00 | comfyanonymous
    Update Readme with link to instruction for Nvidia 50 series.

ef85058e97 | 2025-01-29 16:07:12 -05:00 | comfyanonymous
    Bump ComfyUI version to v0.3.13

f9230bd357 | 2025-01-29 15:54:13 -05:00 | comfyanonymous
    Update the python version in some workflows.

537c27cbf3 | 2025-01-29 08:13:33 -05:00 | comfyanonymous
    Bump default cuda version in standalone package to 126.

6ff2e4d550 | 2025-01-29 08:08:01 -05:00 | comfyanonymous
    Remove logging call added in last commit.
    This is called before the logging is set up so it messes up some things.

222f48c0f2 | 2025-01-29 08:06:28 -05:00 | filtered
    Allow changing folder_paths.base_path via command line argument. (#6600)
    * Reimpl. CLI arg directly inside folder_paths.
    * Update tests to use CLI arg mocking.
    * Revert last-minute refactor.
    * Fix test state polution.
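The shape of a base-path override like this can be sketched with argparse. The flag name, default, and fallback below are illustrative assumptions, not the exact wiring that PR #6600 added to folder_paths:

```python
import argparse
import os

# Hypothetical sketch: let a CLI flag override the default base directory
# that model/input/output folders are resolved against.
parser = argparse.ArgumentParser()
parser.add_argument(
    "--base-directory",
    type=str,
    default=None,
    help="Override the root directory used to resolve model folders.",
)

# Parse an example command line rather than sys.argv, for demonstration.
args = parser.parse_args(["--base-directory", "/tmp/comfy"])

# Fall back to the current working directory when the flag is absent.
base_path = args.base_directory or os.getcwd()
print(base_path)  # /tmp/comfy
```

Resolving the override once, at startup, is what lets the rest of the code keep reading a single `base_path` value instead of re-checking the CLI everywhere.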
13fd4d6e45 | 2025-01-28 09:41:09 -05:00 | comfyanonymous
    More friendly error messages for corrupted safetensors files.

1210d094c7 | 2025-01-28 08:22:54 -05:00 | Bradley Reynolds
    Convert latents_ubyte to 8-bit unsigned int before converting to CPU (#6300)
    * Convert latents_ubyte to 8-bit unsigned int before converting to CPU
    * Only convert to unint8 if directml_enabled

255edf2246 | 2025-01-27 05:26:51 -05:00 | comfyanonymous
    Lower minimum ratio of loaded weights on Nvidia.

4f011b9a00 | 2025-01-26 06:04:57 -05:00 | comfyanonymous
    Better CLIPTextEncode error when clip input is None.

67feb05299 | 2025-01-25 19:04:53 -05:00 | comfyanonymous
    Remove redundant code.

6d21740346 | 2025-01-25 15:03:57 -05:00 | comfyanonymous
    Print ComfyUI version.

7fbf4b72fe | 2025-01-24 06:15:54 -05:00 | comfyanonymous
    Update nightly pytorch ROCm command in Readme.

14ca5f5a10 | 2025-01-24 06:15:54 -05:00 | comfyanonymous
    Remove useless code.

ce557cfb88 | 2025-01-23 05:57:41 -05:00 | filtered
    Remove redundant code (#6576)

96e2a45193 | 2025-01-23 05:56:23 -05:00 | comfyanonymous
    Remove useless code.

dfa2b6d129 | 2025-01-23 05:54:09 -05:00 | Chenlei Hu
    Remove unused function lcm in conds.py (#6572)

f3566f0894 | 2025-01-22 17:23:51 -05:00 | Terry Jia
    remove some params from load 3d node (#6436)

ca69b41cee | 2025-01-22 17:16:54 -05:00 | Chenlei Hu
    Add utils/ to web server developer codeowner (#6570)

a058f52090 | 2025-01-22 17:15:45 -05:00 | Chenlei Hu
    [i18n] Add /i18n endpoint to provide all custom node translations (#6558)
    * [i18n] Add /i18n endpoint to provide all custom node translations
    * Sort glob result for deterministic ordering
    * Update comment