Partially incompat with Prompt Control #751

Open
miasik opened this issue Sep 23, 2024 · 1 comment

miasik commented Sep 23, 2024

I use the extension below for prompt-editing features like [::] or [||]:
https://github.com/asagi4/comfyui-prompt-control

Node "PCSplitSampling" is used to use steps instead of time steps. It's very useful to me.
The node is the last in the schedule chain and other nodes using a model should be connected to it. KSamplers work well with it(and IP adapter in my other workflow too).
At least nodes "Iterative Upscale (Image)" and "FaceDetailer" don't work with it and throw an error below.

There's a way to avoid the error: if I connect those nodes to "ScheduleToModel" (the node before "PCSplitSampling" in the chain) instead, they work, but then the schedules use timesteps rather than steps.
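
Roughly, the wiring described above (a summary, not the full workflow):

MODEL -> ScheduleToModel -> PCSplitSampling -> KSampler: works
MODEL -> ScheduleToModel -> PCSplitSampling -> Iterative Upscale (Image) / FaceDetailer: error below
MODEL -> ScheduleToModel -> Iterative Upscale (Image) / FaceDetailer: works, but with timesteps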

My workflow to demonstrate the issue:
SD15 alone Test.json

IterativeLatentUpscale[Final]: 864.0x864.0 (scale:1.50)
[INFO] PromptControl: SamplerCustom detected, number of steps not available. LoRA schedules will be calculated based on the number of sigmas (21)
[ERROR] PromptControl: Exception occurred during callback, unpatching model.
[INFO] PromptControl: Unpatching model
!!! Exception during processing !!! 'last_step'
Traceback (most recent call last):
  File "g:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "g:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "g:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "g:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "G:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_pack.py", line 1311, in doit
    refined_latent = IterativeLatentUpscale().doit(latent, upscale_factor, steps, temp_prefix, upscaler, step_mode, unique_id)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "G:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_pack.py", line 1273, in doit
    current_latent = upscaler.upscale_shape(step_info, current_latent, new_w, new_h, temp_prefix)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "G:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\core.py", line 1712, in upscale_shape
    refined_latent = self.sample(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, upscaled_latent, denoise, upscaled_images)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "G:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\core.py", line 1653, in sample
    refined_latent = impact_sampling.impact_sample(model, seed, steps, cfg, sampler_name, scheduler,
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "G:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_sampling.py", line 226, in impact_sample
    return separated_sample(model, True, seed, advanced_steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "G:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_sampling.py", line 214, in separated_sample
    res = sample_with_custom_noise(model, add_noise, seed, cfg, positive, negative, impact_sampler, sigmas, latent_image, noise=noise, callback=callback)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "G:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_sampling.py", line 158, in sample_with_custom_noise
    samples = comfy.sample.sample_custom(model, noise, cfg, sampler, sigmas, positive, negative, latent_image,
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "G:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-prompt-control\prompt_control\hijack.py", line 39, in pc_sample
    r = cb(orig_sampler, is_custom, *args, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "G:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-prompt-control\prompt_control\node_lora.py", line 90, in sampler_cb
    actual_end_step = kwargs["last_step"] or steps
                      ~~~~~~^^^^^^^^^^^^^
KeyError: 'last_step'

Prompt executed in 2.14 seconds
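
For reference, the failing line in prompt_control/node_lora.py indexes kwargs["last_step"] directly, but the custom-sampling call shown in the traceback (comfy.sample.sample_custom) doesn't pass a last_step argument at all, so the key is missing. A minimal defensive sketch, assuming the rest of sampler_cb stays unchanged:

# Hypothetical one-line patch in prompt_control/node_lora.py, sampler_cb():
# "last_step" is absent on the custom-sampling path, so fall back to the
# total step count instead of indexing the key directly.
actual_end_step = kwargs.get("last_step") or steps
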
miasik changed the title from "Partially incompat with Propmt Control" to "Partially incompat with Prompt Control" on Sep 23, 2024

asagi4 commented Sep 23, 2024

This might also be a bug in how PCSplitSampling is implemented; I don't really use it myself because it's very hacky and usually not that useful, so it may have suffered some code decay.

ComfyUI's function signatures differ between custom sampling and the "normal" sampling function, and PC needs to deal with that. I suspect some parameters are getting mixed up here.
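
For illustration, here's a rough sketch of how a callback like sampler_cb could normalize the two entry points. The argument order for sample_custom is taken from the Impact Pack call in the traceback above; the argument order assumed for the non-custom path and the surrounding structure are guesses, not the actual prompt-control code:

# Illustrative sketch only, not the real prompt-control implementation.
# comfy.sample.sample_custom(...) receives a sigma schedule but no
# steps/last_step, while the regular sampling entry point exposes step
# indices directly, so the callback has to handle both shapes.
def sampler_cb(orig_sampler, is_custom, *args, **kwargs):
    if is_custom:
        # sample_custom(model, noise, cfg, sampler, sigmas, positive, negative, latent_image, ...)
        sigmas = kwargs.get("sigmas") or (args[4] if len(args) > 4 else ())
        steps = len(sigmas)      # matches the INFO line: schedule computed from the number of sigmas
        actual_end_step = steps  # no "last_step" kwarg on this path
    else:
        # assumed order: sample(model, noise, steps, cfg, sampler_name, scheduler, ...)
        steps = kwargs.get("steps", args[2] if len(args) > 2 else 0)
        actual_end_step = kwargs.get("last_step") or steps
    # ...compute the LoRA schedule from steps / actual_end_step, then run the sampler...
    return orig_sampler(*args, **kwargs)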
