Template for prompt travel + openpose controlnet
Description
This workflow is shared purely as a reference for prompt travel + ControlNet animations.
Motion controlnet: https://huggingface.co/crishhh/animatediff_controlnet/resolve/main/controlnet_checkpoint.ckpt?download=true
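If you prefer to fetch the motion ControlNet from a script, a minimal sketch is below (the destination folder is an assumption based on a standard ComfyUI layout; adjust it to your install):

    # download_motion_controlnet.py -- hedged sketch; the destination folder is an assumption
    import os
    import urllib.request

    URL = ("https://huggingface.co/crishhh/animatediff_controlnet/"
           "resolve/main/controlnet_checkpoint.ckpt?download=true")

    # Typical ComfyUI location for ControlNet models; adjust to your install.
    dest_dir = os.path.join("ComfyUI", "models", "controlnet")
    os.makedirs(dest_dir, exist_ok=True)
    dest = os.path.join(dest_dir, "controlnet_checkpoint.ckpt")

    if not os.path.isfile(dest):
        print(f"Downloading to {dest} ...")
        urllib.request.urlretrieve(URL, dest)
    print("Done.")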
Node Diagram
Discussion
Where do we download ad/motion.ckpt from?
I think it's from here: https://huggingface.co/crishhh/animatediff_controlnet/blob/main/controlnet_checkpoint.ckpt
Amazing workflow! Can you share the input video used to drive the workflow? Would like to compare input with derived output.
@matt3o can you please share the motion gif/mp4 so we can all start from the exact same point zero? >>>> Oh man, I'm dumb as a brick! Haha, I literally went to smoke a cigarette right after posting this and then the flash from the clear sky came! (For the person above me: please don't take the "dumb as a brick" too seriously.) All you have to do is click the gif above to expand it, then right-click and "Save as..." ;-) Then go to the ezgif page and convert it into an mp4.
Does anyone know how to solve "DWPreprocessor: Connection error, and we cannot find the requested files in the disk cache"? This keeps coming up.
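That message usually comes from huggingface_hub when the DWPreprocessor tries to auto-download its detector/pose models without a working connection and finds nothing in the local cache. One workaround is to pre-fetch the models while you do have connectivity; a hedged sketch follows (the repo and file names are assumptions, check the node's bbox_detector/pose_estimator settings for the exact files it expects):

    # prefetch_dwpose.py -- hedged sketch; repo and file names are assumptions
    from huggingface_hub import hf_hub_download

    for repo_id, filename in [
        ("yzd-v/DWPose", "yolox_l.onnx"),          # bbox detector (assumed)
        ("yzd-v/DWPose", "dw-ll_ucoco_384.onnx"),  # pose estimator (assumed)
    ]:
        path = hf_hub_download(repo_id=repo_id, filename=filename)
        print("cached at", path)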
I changed the prompt to be set in a city (instead of at sea), but unfortunately it isn't keeping the background consistent: buildings keep changing and disappearing throughout the clip. Any recommendations for better maintaining the background?
What is the difference between the two prompt boxes in the Batch Prompt Schedule node (below the one where you set the frames)? It doesn't look like the workflow is using negative prompts.
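For reference, the keyframed prompt box in Batch Prompt Schedule usually looks like the sketch below (frame numbers and wording are purely illustrative). The two smaller boxes underneath are, as far as I can tell, a prefix and an append text that get added to every keyframe, which is also a handy place to pin a constant scene description for the background-consistency issue mentioned above:

    "0"  : "standing on the shore at sunrise, calm sea",
    "24" : "standing on the shore at noon, light waves",
    "48" : "standing on the shore at sunset, stormy sea"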
I'm getting an error that the video could not be loaded with cv.
This problem has been solved: it was caused by the path separators (/ vs \), not by a plug-in error.
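In other words, the video path handed to VHS_LoadVideoPath needs consistent separators before it can be opened; a minimal sketch (the file name is illustrative):

    import os

    # Illustrative path with mixed "/" and "\" separators.
    raw = r"C:\Users\MSI\Videos/input_dance.mp4"
    # normpath rewrites the separators consistently for the current OS.
    print(os.path.normpath(raw))  # C:\Users\MSI\Videos\input_dance.mp4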
Using artificial intelligence can be very helpful at different stages of film or animation production. Here are some free AI tools that can help you in the process:
File "C:\Users\MSI\Desktop\ComfyUI_windows_portable\ComfyUI\execution.py", line 152, in recursive_execute output_data, output_ui = get_output_data(obj, input_data_all) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\MSI\Desktop\ComfyUI_windows_portable\ComfyUI\execution.py", line 82, in get_output_data return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\MSI\Desktop\ComfyUI_windows_portable\ComfyUI\execution.py", line 75, in map_node_over_list results.append(getattr(obj, func)(**slice_dict(input_data_all, i))) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\MSI\Desktop\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-AnimateDiff-Evolved\animatediff\nodes_gen1.py", line 146, in load_mm_and_inject_params motion_model = load_motion_module_gen1(model_name, model, motion_lora=motion_lora, motion_model_settings=motion_model_settings) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\MSI\Desktop\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-AnimateDiff-Evolved\animatediff\model_injection.py", line 1242, in load_motion_module_gen1 mm_state_dict = comfy.utils.load_torch_file(model_path, safe_load=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\MSI\Desktop\ComfyUI_windows_portable\ComfyUI\comfy\utils.py", line 14, in load_torch_file if ckpt.lower().endswith(".safetensors") or ckpt.lower().endswith(".sft"): ^^^^^^^^^^
Node Details
Primitive Nodes (1)
- PrimitiveNode
Custom Nodes (23)
- ADE_AnimateDiffLoaderWithContext
- ADE_AnimateDiffUniformContextOptions
ComfyUI
- VAEDecode
- CheckpointLoaderSimple
- EmptyLatentImage
- VAELoader
- CLIPTextEncode
- FreeU_V2
- ControlNetApplyAdvanced
- KSampler
- LoraLoaderModelOnly
- ImageCASharpening+
- RIFE VFI
- DWPreprocessor
- ControlNetLoaderAdvanced
- VHS_VideoCombine
- VHS_LoadVideoPath
- BatchPromptSchedule
Model Details
Checkpoints (1)
sd15/dreamshaper_8.safetensors
LoRAs (1)
v3_sd15_adapter.ckpt
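A minimal sketch for checking that the files this workflow expects are in the usual ComfyUI folders (the folder layout is an assumption; adjust it for your install or your extra_model_paths.yaml):

    import os

    expected = {
        os.path.join("ComfyUI", "models", "checkpoints", "sd15", "dreamshaper_8.safetensors"): "checkpoint",
        os.path.join("ComfyUI", "models", "loras", "v3_sd15_adapter.ckpt"): "AnimateDiff v3 adapter LoRA",
        os.path.join("ComfyUI", "models", "controlnet", "controlnet_checkpoint.ckpt"): "motion ControlNet",
    }
    for path, description in expected.items():
        status = "OK     " if os.path.isfile(path) else "MISSING"
        print(f"{status} {description}: {path}")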