Update UPDATES.md
Fannovel16 committed Aug 16, 2024
1 parent 835e85e commit e69e739
Showing 4 changed files with 6 additions and 74 deletions.
77 changes: 4 additions & 73 deletions README.md
@@ -2,22 +2,13 @@
Plug-and-play [ComfyUI](https://github.com/comfyanonymous/ComfyUI) node sets for making [ControlNet](https://github.com/lllyasviel/ControlNet/) hint images

"anime style, a protest in the street, cyberpunk city, a woman with pink hair and golden eyes (looking at the viewer) is holding a sign with the text "ComfyUI ControlNet Aux" in bold, neon pink" on Flux.1 Dev
[Original image with workflow](./examples/ExecuteAll.png)
![](./examples/ExecuteAll1.jpg)
![](./examples/ExecuteAll2.jpg)

![](./CNAuxBanner.jpg)

The code is copy-pasted from the respective folders in https://github.com/lllyasviel/ControlNet/tree/main/annotator and connected to [the 🤗 Hub](https://huggingface.co/lllyasviel/Annotators).

All credit & copyright goes to https://github.com/lllyasviel.
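
As a rough illustration of that Hub connection, here is a minimal sketch of pulling an annotator checkpoint with `huggingface_hub`; the filename below is only an assumed example, since each preprocessor node resolves and caches its own weights on first use.

```python
# Minimal sketch: download an annotator checkpoint from the Hub repo credited above.
# The filename is an assumed example; real nodes fetch whatever weights they need.
from huggingface_hub import hf_hub_download

ckpt_path = hf_hub_download(
    repo_id="lllyasviel/Annotators",
    filename="ControlNetHED.pth",  # assumed example weight file
)
print(ckpt_path)  # local path in the Hugging Face cache
```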

# Marigold
Check out the Marigold Depth Estimator, which can generate very detailed and sharp depth maps from high-resolution still images. The mesh it creates is even 3D-printable. Because it relies on diffusers, it can't be implemented in this extension, but there is a Comfy implementation by Kijai:
https://github.com/kijai/ComfyUI-Marigold

![](./examples/example_marigold_flat.jpg)
![](./examples/example_marigold.png)

# Updates
Go to the [Updates page](./UPDATES.md) to follow updates.

@@ -184,71 +175,11 @@ for o in history['outputs']:
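
The collapsed hunk above cuts into the README's API example, which iterates over `history['outputs']`. As a hedged sketch of that pattern (assuming a local ComfyUI server on the default port 8188 and an already-known `prompt_id`), a finished prompt's outputs can be walked like this:

```python
# Sketch of walking a finished prompt's outputs via ComfyUI's HTTP API.
# Assumes ComfyUI runs locally on port 8188 and `prompt_id` is known.
import json
from urllib import request

prompt_id = "replace-with-a-real-prompt-id"  # placeholder, not a real id
with request.urlopen(f"http://127.0.0.1:8188/history/{prompt_id}") as resp:
    history = json.loads(resp.read())[prompt_id]

for o in history["outputs"]:                  # node ids that produced outputs
    for image in history["outputs"][o].get("images", []):
        print(image["filename"], image.get("subfolder", ""), image["type"])
```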
# Examples
> A picture is worth a thousand words
Credit to https://huggingface.co/thibaud/controlnet-sd21 for most of the examples below. You can get the same kind of results from this repo's preprocessor nodes.
## Line Extractors
### Canny Edge
![](https://huggingface.co/thibaud/controlnet-sd21/resolve/main/example_canny.png)
### HED Lines
![](https://huggingface.co/thibaud/controlnet-sd21/resolve/main/example_hed.png)
### Realistic Lineart
![](https://huggingface.co/thibaud/controlnet-sd21/resolve/main/example_lineart.png)
### Scribble/Fake Scribble
![](https://huggingface.co/thibaud/controlnet-sd21/resolve/main/example_scribble.png)
### TEED Soft-Edge Lines
![](./examples/example_teed.png)
### Anyline Lineart
![](./examples/example_anyline.png)

## Normal and Depth Map
### Depth (the exact preprocessor used here is unknown)
![](https://huggingface.co/thibaud/controlnet-sd21/resolve/main/example_depth.png)
### Zoe - Depth Map
![](https://huggingface.co/thibaud/controlnet-sd21/resolve/main/example_zoedepth.png)
### BAE - Normal Map
![](https://huggingface.co/thibaud/controlnet-sd21/resolve/main/example_normalbae.png)
### MeshGraphormer
![](./examples/example_mesh_graphormer.png)
### Depth Anything & Zoe Depth Anything
![](./examples/example_depth_anything.png)
### DSINE
![](./examples/example_dsine.png)
### Metric3D
![](./examples/example_metric3d.png)
### Depth Anything V2
![](./examples/example_depth_anything_v2.png)

## Faces and Poses
### OpenPose
![](https://huggingface.co/thibaud/controlnet-sd21/resolve/main/example_openpose.png)
![](https://huggingface.co/thibaud/controlnet-sd21/resolve/main/example_openposev2.png)

### Animal Pose (AP-10K)
![](./examples/example_animal_pose.png)

### DensePose
![](./examples/example_densepose.png)

## Semantic Segmentation
### OneFormer ADE20K Segmentor
![](https://huggingface.co/thibaud/controlnet-sd21/resolve/main/example_ade20k.png)

### Anime Face Segmentor
![](./examples/example_anime_face_segmentor.png)

## T2IAdapter-only
### Color Palette for T2I-Adapter
![](https://huggingface.co/thibaud/controlnet-sd21/resolve/main/example_color.png)

## Optical Flow
### Unimatch
![](./examples/example_unimatch.png)

## Recolor
![](./examples/example_recolor.png)
![](./examples/ExecuteAll1.jpg)
![](./examples/ExecuteAll2.jpg)

# Testing workflow
https://github.com/Fannovel16/comfyui_controlnet_aux/blob/master/tests/test_cn_aux_full.json
![](https://github.com/Fannovel16/comfyui_controlnet_aux/blob/master/tests/pose.png?raw=true)
https://github.com/Fannovel16/comfyui_controlnet_aux/blob/main/examples/ExecuteAll.png
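
A minimal sketch of queueing that test workflow against a local ComfyUI instance; this assumes the JSON is already in ComfyUI's API prompt format and that the server is listening on the default 127.0.0.1:8188.

```python
# Sketch: queue the full test workflow on a local ComfyUI server.
# Assumes tests/test_cn_aux_full.json is in API prompt format and that
# ComfyUI is reachable at the default 127.0.0.1:8188.
import json
from urllib import request

with open("tests/test_cn_aux_full.json") as f:
    prompt_graph = json.load(f)

req = request.Request(
    "http://127.0.0.1:8188/prompt",
    data=json.dumps({"prompt": prompt_graph}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with request.urlopen(req) as resp:
    print(json.loads(resp.read()))  # response includes the queued prompt_id
```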

# Q&A:
## Why don't some nodes appear after I install this repo?
1 change: 1 addition & 0 deletions UPDATES.md
@@ -40,3 +40,4 @@
* Added Depth Anything V2 (16/06/2024)
* Added Union model of ControlNet and preprocessors
![345832280-edf41dab-7619-494c-9f60-60ec1f8789cb](https://github.com/user-attachments/assets/aa55f57c-cad7-48e6-84d3-8f506d847989)
* Refactored `INPUT_TYPES` and added an Execute All node while learning about [Execution Model Inversion](https://github.com/comfyanonymous/ComfyUI/pull/2666) (see the sketch below)
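
For context, `INPUT_TYPES` is the classmethod every ComfyUI node defines to declare its inputs. A hypothetical, minimal node skeleton (not this repo's actual implementation) looks roughly like this:

```python
# Hypothetical skeleton of a ComfyUI custom node showing INPUT_TYPES;
# it is an illustration only, not this repo's actual implementation.
class ExamplePreprocessor:
    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {
                "image": ("IMAGE",),
                "resolution": ("INT", {"default": 512, "min": 64, "max": 2048}),
            }
        }

    RETURN_TYPES = ("IMAGE",)
    FUNCTION = "execute"
    CATEGORY = "ControlNet Preprocessors"

    def execute(self, image, resolution):
        # A real preprocessor would run its annotator model here.
        return (image,)

NODE_CLASS_MAPPINGS = {"ExamplePreprocessor": ExamplePreprocessor}
```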
Binary file added examples/CNAuxBanner.jpg
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,7 +1,7 @@
[project]
name = "comfyui_controlnet_aux"
description = "Plug-and-play ComfyUI node sets for making ControlNet hint images"
version = "1.0.4-alpha.2"
version = "1.0.4-alpha.3"
license = "LICENSE"
dependencies = ["torch", "importlib_metadata", "huggingface_hub", "scipy", "opencv-python>=4.7.0.72", "filelock", "numpy", "Pillow", "einops", "torchvision", "pyyaml", "scikit-image", "python-dateutil", "mediapipe", "svglib", "fvcore", "yapf", "omegaconf", "ftfy", "addict", "yacs", "trimesh[easy]", "albumentations", "scikit-learn"]

