Commit Graph

65 Commits

Author SHA1 Message Date
comfyanonymous
5d8898c056 Fix some performance issues with weight loading and unloading.
Lower peak memory usage when changing model.

Fix case where model weights would be unloaded and reloaded.
2024-03-28 18:04:42 -04:00
comfyanonymous
6a32c06f06 Move cleanup_models to improve performance. 2024-03-23 17:27:10 -04:00
comfyanonymous
314d28c251 Pass extra_pnginfo as None when not in input data. 2024-03-07 15:07:47 -05:00
Rick Love
f81dbe26e2
FIX recursive_will_execute performance (simple ~300x performance increase) (#2852)
* FIX recursive_will_execute performance

* Minimize code changes

* memo must be created outside lambda
2024-02-21 20:21:24 -05:00
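
The fix amounts to memoizing the recursive dependency walk so each node is counted once rather than once per path through the graph. A minimal sketch of the idea, with a made-up graph and function name rather than the actual execution.py code:

```python
# Hypothetical prompt graph: node id -> {"inputs": {...}}; a link is
# written as [source_node_id, output_slot], mirroring the prompt format.
prompt = {
    "1": {"inputs": {}},
    "2": {"inputs": {"latent": ["1", 0]}},
    "3": {"inputs": {"samples": ["2", 0], "pixels": ["1", 0]}},
}

def will_execute(node_id, memo):
    """Return the set of upstream nodes needed for node_id, visiting each node once."""
    if node_id in memo:
        return memo[node_id]
    needed = {node_id}
    for value in prompt[node_id]["inputs"].values():
        if isinstance(value, list):  # this input is a link to another node's output
            needed |= will_execute(value[0], memo)
    memo[node_id] = needed
    return needed

# The memo must be created once, outside the sort-key lambda, so every
# call shares it instead of rebuilding it per comparison.
memo = {}
order = sorted(prompt, key=lambda nid: len(will_execute(nid, memo)))
print(order)  # nodes with fewer dependencies first, e.g. ['1', '2', '3']
```
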
comfyanonymous
2d105066df Cleanups. 2024-01-26 21:31:13 -05:00
realazthat
fad02dc2df Don't use PEP 604 type hints, to stay compatible with Python<3.10. 2024-01-17 17:16:34 -05:00
comfyanonymous
56d9496b18 Rename status notes to status messages.
I think message describes them better.
2024-01-12 18:17:06 -05:00
comfyanonymous
bcc0bde2af Clear status notes on execution start. 2024-01-12 17:21:22 -05:00
realazthat
1b3d65bd84 Add error, status to /history endpoint 2024-01-11 10:16:42 -05:00
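
A hedged client-side sketch of reading those fields back; the server address and the exact field names inside "status" are assumptions, not taken from the commit:

```python
import json
import urllib.request

# Fetch the run history and print the status information added here.
# Field names such as "status_str", "completed" and "messages" are an
# assumption about the response shape, not guaranteed by the commit.
with urllib.request.urlopen("http://127.0.0.1:8188/history") as resp:
    history = json.load(resp)

for prompt_id, entry in history.items():
    status = entry.get("status", {})
    print(prompt_id, status.get("status_str"), "completed:", status.get("completed"))
    for message in status.get("messages", []):
        print("   ", message)
```
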
comfyanonymous
6d281b4ff4 Add a /free route to unload models or free all memory.
A POST request to /free with: {"unload_models":true}
will unload models from vram.

A POST request to /free with: {"free_memory":true}
will unload models and free all cached data from the last run workflow.
2024-01-04 17:15:22 -05:00
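
A minimal client sketch for the /free route described above; the server address and the use of urllib are assumptions, and any HTTP client works the same way:

```python
import json
import urllib.request

def post_free(payload, server="http://127.0.0.1:8188"):
    """POST a JSON body to the /free route."""
    req = urllib.request.Request(
        server + "/free",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

post_free({"unload_models": True})   # unload models from vram
post_free({"free_memory": True})     # also free cached data from the last run workflow
```
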
comfyanonymous
04b713dda1 Fix VALIDATE_INPUTS getting called multiple times.
Allow VALIDATE_INPUTS to only validate specific inputs.
2023-12-29 17:36:40 -05:00
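
As a hedged illustration of the second line: a custom node can name only the inputs it wants to check in its VALIDATE_INPUTS signature. The node class below is hypothetical, not part of the repository:

```python
class ClampedInt:
    """Hypothetical node used only to illustrate validating specific inputs."""

    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {"value": ("INT", {}), "label": ("STRING", {})}}

    RETURN_TYPES = ("INT",)
    FUNCTION = "run"
    CATEGORY = "example"

    @classmethod
    def VALIDATE_INPUTS(cls, value):
        # Only "value" is named here, so only it gets custom validation;
        # returning True means the input passed, a string reports the error.
        if not 0 <= value <= 100:
            return f"value must be between 0 and 100, got {value}"
        return True

    def run(self, value, label):
        return (value,)
```
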
comfyanonymous
a252963f95 --disable-smart-memory now unloads everything like it did originally. 2023-12-23 04:25:06 -05:00
comfyanonymous
6b769bca01 Do a garbage collect after the interval even if nothing is running. 2023-11-30 15:22:32 -05:00
comfyanonymous
2dd5b4dd78 Only show last 200 elements in the UI history tab. 2023-11-20 16:56:29 -05:00
comfyanonymous
a03dde190e Cap maximum history size at 10000. Delete oldest entry when reached. 2023-11-20 16:38:39 -05:00
comfyanonymous
20d3852aa1 Pull some small changes from the other repo. 2023-10-11 20:38:48 -04:00
pythongosssss
62799c8585 fix crash on node with VALIDATE_INPUTS and actual inputs 2023-09-07 18:42:21 +01:00
comfyanonymous
89a0767abf Smarter memory management.
Try to keep models on the vram when possible.

Better lowvram mode for controlnets.
2023-08-17 01:06:34 -04:00
Michael Poutre
90b0163524
fix(execution): Fix support for input-less nodes 2023-08-01 12:29:01 -07:00
Michael Poutre
7785d073f0
chore: Fix typo 2023-08-01 12:27:50 -07:00
comfyanonymous
09386a3697 Fix issue with lora in some cases when combined with model merging. 2023-07-21 21:27:27 -04:00
comfyanonymous
6e9f28401f Persist node instances between executions instead of deleting them.
If the same node id with the same class exists between two executions the
same instance will be used.

This means you can now cache things in nodes for more efficiency.
2023-06-29 23:38:56 -04:00
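
Because the same instance is reused when the node id and class match across executions, a node can keep expensive state on `self`. A hypothetical sketch of that pattern:

```python
class CachedTextFile:
    """Hypothetical node that re-reads a file only when the path changes,
    relying on the instance surviving between executions."""

    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {"path": ("STRING", {"default": ""})}}

    RETURN_TYPES = ("STRING",)
    FUNCTION = "load"
    CATEGORY = "example"

    def __init__(self):
        self._path = None
        self._text = None

    def load(self, path):
        if path != self._path:  # cache hit on repeated runs with the same path
            with open(path, "r", encoding="utf-8") as f:
                self._text = f.read()
            self._path = path
        return (self._text,)
```
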
comfyanonymous
d52ed407a7 Send websocket message only when prompt is actually done executing. 2023-06-13 13:38:43 -04:00
comfyanonymous
af91df85c2 Add a /history/{prompt_id} endpoint. 2023-06-12 14:34:30 -04:00
comfyanonymous
ad81fd682a Fix issue with cancelling prompt. 2023-05-28 00:32:26 -04:00
space-nuko
03f2d0a764 Rename exception message field 2023-05-27 21:06:07 -05:00
space-nuko
52c9590b7b Exception message 2023-05-27 21:06:07 -05:00
space-nuko
62bdd9d26a Catch typecast errors 2023-05-27 21:06:07 -05:00
space-nuko
a9e7e23724 Fix 2023-05-27 21:06:07 -05:00
space-nuko
e2d080b694 Return null for value format 2023-05-27 21:06:07 -05:00
space-nuko
6b2a8a3845 Show message in the frontend if prompt execution raises an exception 2023-05-27 21:06:07 -05:00
space-nuko
ffec815257 Send back more information about exceptions that happen during execution 2023-05-27 21:06:07 -05:00
space-nuko
0d834e3a2b Add missing input name/config 2023-05-27 21:06:07 -05:00
space-nuko
c33b7c5549 Improve invalid prompt error message 2023-05-27 21:06:07 -05:00
space-nuko
73e85fb3f4 Improve error output for failed nodes 2023-05-27 21:06:07 -05:00
comfyanonymous
48fcc5b777 Parsing error crash. 2023-05-22 20:51:30 -04:00
comfyanonymous
ffc56c53c9 Add a node_errors to the /prompt error json response.
"node_errors" contains a dict keyed by node ids. The contents are a message
and a list of dependent outputs.
2023-05-22 13:22:38 -04:00
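
A hedged sketch of inspecting that response from a client; the server address is an assumption and the per-node field layout is only loosely described by the commit message:

```python
import json
import urllib.error
import urllib.request

# Submit an (intentionally empty) prompt and print the per-node errors
# returned by the /prompt route when validation fails.
req = urllib.request.Request(
    "http://127.0.0.1:8188/prompt",
    data=json.dumps({"prompt": {}}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
try:
    urllib.request.urlopen(req)
except urllib.error.HTTPError as err:
    body = json.load(err)  # the error response body is JSON
    for node_id, info in body.get("node_errors", {}).items():
        print(node_id, info)  # each entry carries a message and its dependent outputs
```
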
comfyanonymous
516119ad83 Print min and max values in validation error message. 2023-05-21 00:24:28 -04:00
comfyanonymous
1dd846a7ba Fix outputs gone from history. 2023-05-15 00:27:28 -04:00
comfyanonymous
9bf67c4c5a Print prompt execution time. 2023-05-14 01:34:25 -04:00
comfyanonymous
44f9f9baf1 Add the prompt id to some websocket messages. 2023-05-13 11:17:16 -04:00
BlenderNeko
1201d2eae5
Make nodes map over input lists (#579)
* allow nodes to map over lists

* make work with IS_CHANGED and VALIDATE_INPUTS

* give list outputs distinct socket shape

* add rebatch node

* add batch index logic

* add repeat latent batch

* deal with noise mask edge cases in latentfrombatch
2023-05-13 11:15:45 -04:00
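
A conceptual sketch of what "map over input lists" means for execution: when a node receives a list where it expects a single value, it is called once per element and the results are collected. Names and details below are illustrative, not the actual executor code:

```python
def map_node_over_list(func, inputs):
    """Call func once per element of any list-valued input, broadcasting scalars."""
    lengths = [len(v) for v in inputs.values() if isinstance(v, list)]
    if not lengths:
        return [func(**inputs)]
    results = []
    for i in range(max(lengths)):
        call = {k: (v[i % len(v)] if isinstance(v, list) else v)
                for k, v in inputs.items()}
        results.append(func(**call))
    return results

# A node expecting one "scale" value, mapped over three of them:
print(map_node_over_list(lambda scale, base: base * scale,
                         {"scale": [1, 2, 3], "base": 10}))
# -> [10, 20, 30]
```
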
comfyanonymous
dfc74c19d9 Add the prompt_id to some websocket messages. 2023-05-11 01:22:40 -04:00
comfyanonymous
3a7c3acc72 Send websocket message with list of cached nodes right before execution. 2023-05-10 15:59:24 -04:00
comfyanonymous
602095f614 Send execution_error message on websocket on execution exception. 2023-05-10 15:49:49 -04:00
comfyanonymous
d6dee8af1d Only validate each input once. 2023-05-10 00:29:31 -04:00
comfyanonymous
02ca1c67f8 Don't print traceback when processing interrupted. 2023-05-09 23:51:52 -04:00
comfyanonymous
3a1f9dba20 If IS_CHANGED returns exception delete the output instead of crashing. 2023-04-26 02:13:56 -04:00
comfyanonymous
951c0c2bbe Don't keep cached outputs for removed nodes. 2023-04-26 02:05:57 -04:00
comfyanonymous
0ac319fd81 Don't delete all outputs when execution gets interrupted. 2023-04-23 22:44:38 -04:00