BrushNet - The Best InPainting Method Yet? FREE Local Install!

  • Published 16 May 2024
  • Inpainting with the latest BrushNet models for AI generation using stable diffusion is lots of fun! Get great results quickly thanks to both random and segmentation mask models - all for free on your own computer! Installation is quick and easy, so why not give it a go today?
    Update: BrushNet SDXL is now also available 😀
    Want to support the channel?
    / nerdyrodent
    Links:
    github.com/TencentARC/BrushNet
    huggingface.co/spaces/Tencent...
    pytorch.org/
    huggingface.co/spaces/abhishe...
    Installing Anaconda for MS Windows Beginners - • Anaconda - Python Inst...
    pixabay.com/photos/teddy-bear...
    == More Stable Diffusion Stuff! ==
    * Faster Stable Diffusions with the LCM LoRA - • LCM LoRA = Speedy Stab...
    * Video-to-Video AI using AnimateDiff - • How To Use AnimateDiff...
    * How to make Consistent Characters From 1 Picture In ComfyUI - • Reposer = Consistent S...
    0:00 Inpainting Introduction
    1:19 BrushNet 0-Install Option
    1:37 BrushNet Local Install Option
    7:26 BrushNet Examples
    12:03 ComfyUI Inpainting Example
  • Science & Technology

Comments • 61

  • @ducttapebattleship • a month ago • +10

    Thanks for still explaining everything in detail (conda, git, etc.) even after you've explained everything a dozen times already. This really helps.

  • @loubakalouba • a month ago • +7

    I see a new Nerdy Rodent video I click instantly, Never failed me. Thank you!

  • @forfreeiran8749 • a month ago • +4

    You got the best vids man

  • @knightride9635 • a month ago • +1

    This is really good, thanks !

  • @PianoCinematix • a month ago • +1

    Excellent work as always mate. One quick question though. Where is the default output "dir" after generating the images?

  • @MrSporf • a month ago • +2

    I thought Fooocus was good, but this thing is next level!

  • @DivinityIsPurity • a month ago • +1

    Great tut

  • @rudeasmith5098 • a month ago • +1

    That shining suit of armor is suss! Love your videos though. 👍

    • @NerdyRodent • a month ago

      I like the way it does the reflections too!

  • @Lamson777 • a month ago • +3

    What a funny intro 😂

  • @tonywhite4476 • a month ago • +3

    Brushnet or Invoke?

  • @MarcSpctr • a month ago • +1

    Hey, can you do a video on AnyDoor, please?
    I read their GitHub and it seems like such a cool tool for injecting objects into new locations and more, but sadly the Gradio demo only supports transferring an object from one image to another.
    Can you make a tutorial on how to use all the other tools that AnyDoor provides?

  • @industrialvectors • a month ago

    What file explorer and image viewer are you using?
    Your setup looks like a mix of Linux and Windows in a good way.

    • @NerdyRodent • a month ago • +1

      I’m using Caja with the default image viewer 😉

  • @arashsohi8551 • a month ago

    Thank you so much!

    • @NerdyRodent • a month ago • +1

      You're welcome!

    • @arashsohi8551 • a month ago

      ​@@NerdyRodent Could you help me with this if you can please?
      I tried to find out what the problem is, but found nothing :(
      I have the whole data structure and checkpoints, including "sam_vit_h_4b8939.pth", but I don't know why this happens when I run app_brushnet.py:
      Traceback (most recent call last):
        File "D:\AI-Programs\BrushNet\examples\brushnet\app_brushnet.py", line 13, in <module>
          mobile_sam = sam_model_registry['vit_h'](checkpoint='data/ckpt/sam_vit_h_4b8939.pth').to("cuda")
        File "C:\ProgramData\miniconda3\Lib\site-packages\segment_anything\build_sam.py", line 15, in build_sam_vit_h
          return _build_sam(
        File "C:\ProgramData\miniconda3\Lib\site-packages\segment_anything\build_sam.py", line 104, in _build_sam
          with open(checkpoint, "rb") as f:
      FileNotFoundError: [Errno 2] No such file or directory: 'data/ckpt/sam_vit_h_4b8939.pth'
      [process exited with code 1 (0x00000001)]

    • @NerdyRodent • a month ago

      Looks like you forgot to download that model!
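That FileNotFoundError simply means the relative path data/ckpt/sam_vit_h_4b8939.pth did not resolve: the file must sit under data/ckpt/ relative to the directory the script is launched from. A minimal guard (a hypothetical helper, not part of BrushNet) fails early with a clearer message:

```python
import os

def require_checkpoint(path: str) -> str:
    """Fail early with an actionable message if a model file is missing."""
    if not os.path.isfile(path):
        raise FileNotFoundError(
            f"Checkpoint not found: {path} (current directory: {os.getcwd()}). "
            "Place sam_vit_h_4b8939.pth under data/ckpt/ and run the app from "
            "the BrushNet repository root so the relative path resolves."
        )
    return path

# Hypothetical use inside app_brushnet.py:
# sam_model_registry['vit_h'](checkpoint=require_checkpoint('data/ckpt/sam_vit_h_4b8939.pth'))
```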

  • @Elwaves2925 • a month ago • +2

    Careful with those puns, Sebastian Kamph will become jealous. 🙂
    The results look great, but the one time I tried Anaconda it didn't work and seemed to be more of a mess. Then there's the hassle with the models, so I think I'll wait until it hits Forge, which shouldn't be too long. Cheers NR.

    • @NerdyRodent • a month ago • +2

      Anaconda is the way forward and has never let me down in over four years!

    • @Elwaves2925 • a month ago

      @@NerdyRodent That's cool, it may have been something on my end, like a misunderstanding on my part. I also realised afterwards that it might have been Miniconda, not Anaconda, that I was using.

  • @worldwidewebcap • a month ago

    I wasn't able to get it running. I keep getting a ton of errors, like "OSError: Error no file named diffusion_pytorch_model.bin found in directory data/ckpt/segmentation_mask_brushnet_ckpt."
    But the file there ends in .safetensors, not .bin.

    • @NerdyRodent • a month ago • +1

      You can download the required files from their Google Drive

  • @Sandy5of5 • a month ago • +3

    If only it could fill in the blank spots in our lives, 'eh? 😉🐭

  • @vi6ddarkking • a month ago • +3

    Ok so I am going to ask the obvious question.
    How well does it fix hands and feet?

    • @JonnyCrackers • a month ago

      It's stable diffusion so probably not very well at all.

    • @zacharyshort384 • 29 days ago

      @@JonnyCrackers ? You can *fix* hands and feet quite easily in SD with inpainting. People do it all the time...

  • @bobbyboe • 28 days ago

    I wonder which is more powerful: IOPaint (which I noticed through your video last month) or this BrushNet. Judging from both of your videos, I would tend to use IOPaint. But I am sure you must have compared both, or at least have much better insight?

    • @NerdyRodent • 28 days ago • +2

      There’s a brushnet inspired PowerPaint for IOPaint now - best of both worlds!

    • @bobbyboe • 18 days ago

      @@NerdyRodent Maybe we can expect a video from you about how to use this "PowerPaint" feature you described? A comparison of different models and how you use them in IOPaint would also be very interesting. I cannot find much material on how IOPaint can be power-used...

  • @lockos • a month ago

    @NerdyRodent I completed every installation step without issue, but in the end the local URL is unavailable; it fails to connect. Any idea why?

    • @NerdyRodent • a month ago

      If you’ve got extra security set up, you may need to modify it to allow local connectivity

    • @lockos • a month ago

      @@NerdyRodent What do you mean by extra security? I tried with the firewall temporarily disabled, checked my browser settings and flushed my DNS; nothing changed whether I use Chrome, Firefox or Edge. Maybe there is a security setting within BrushNet that I'm not aware of? If that is the case, could you please tell me?

    • @NerdyRodent • a month ago

      If you’ve not got any extra security running, it could be that you’re using Microsoft Windows, which will fail on basic routing tasks. If that’s your case, you’ll need to use localhost as the address instead!
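In Gradio apps the advertised URL depends on how `launch()` is called; if the printed 0.0.0.0 address fails, browsing to http://127.0.0.1:7860 usually works, or the demo script can be told to bind the loopback interface explicitly. A hypothetical one-line change, assuming the app object in app_brushnet.py is named `demo` (unverified):

```python
# Bind the Gradio server to loopback so http://127.0.0.1:7860 connects
# directly on Windows; server_name and server_port are standard launch() options.
demo.launch(server_name="127.0.0.1", server_port=7860)
```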

  • @MilesBellas • a month ago

    Automatic1111 isn't deprecated?

    • @NerdyRodent • a month ago • +1

      Does kinda seem deprecated at this point, yes 🫤

    • @zacharyshort384 • 29 days ago • +1

      I'm using Forge which is just another implementation of it.

  • @LouisGedo • a month ago

    👋

  • @Shadowman0 • a month ago

    I think your differential diffusion workflow doesn't use the full potential of its blending capabilities. Ideally, after segmenting the bear you could expand the mask and blur the expanded area so you get a gradient around the bear (or even blur without expanding, to give it a bit of bear to change). Depending on the blur parameters and the added padding, your results should be much better. Differential diffusion lets you use non-binary masks, where the grey level defines the allowed change, AFAIK.
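The soft-mask idea described above can be sketched in a few lines. This is an illustrative helper (the function name and default values are my own, not from the video or any ComfyUI node), assuming NumPy and SciPy are available:

```python
import numpy as np
from scipy.ndimage import binary_dilation, gaussian_filter

def soften_mask(mask: np.ndarray, expand: int = 8, sigma: float = 6.0) -> np.ndarray:
    """Expand a binary mask, then blur it into a grey-level gradient.

    Differential diffusion reads the mask as continuous: a pixel's grey
    level sets how much it is allowed to change, so a blurred rim blends
    the inpainted region into its surroundings instead of leaving a seam.
    """
    grown = binary_dilation(mask > 0.5, iterations=expand)   # widen the region
    soft = gaussian_filter(grown.astype(np.float32), sigma)  # fade the edge
    return np.clip(soft, 0.0, 1.0)
```

Setting `expand=0` gives the "blur without expanding" variant the comment mentions, which lets the edit eat slightly into the bear itself.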

    • @NerdyRodent • a month ago • +1

      Yup, I’ve got a node in there for you to set the blur amount though I find the depth maps are often pretty good without extra blur!

  • @eammon7144 • a month ago • +1

    Are all the output images low resolution?

  • @testales • a month ago

    The standard models are not made for inpainting! With SD 1.5 models you can do an on-the-fly merge by basically subtracting the regular SD 1.5 base model from the SD 1.5 inpainting checkpoint and adding that difference to the checkpoint of your liking. Then you get way better results when inpainting. This can probably be done with SDXL too, but since I still use SD 1.5 checkpoints most of the time, I can't tell for sure.
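The merge described above is the classic "add difference" recipe (the same arithmetic behind the "Add difference" mode in A1111's checkpoint merger): take the delta that the SD 1.5 inpainting checkpoint adds on top of the SD 1.5 base, and graft it onto a custom checkpoint. A minimal sketch of the per-tensor arithmetic (illustrative function name; real merges operate on full state dicts loaded from checkpoint files):

```python
import numpy as np

def add_difference(custom: dict, inpaint: dict, base: dict, alpha: float = 1.0) -> dict:
    """Per-tensor add-difference merge: custom + alpha * (inpaint - base).

    custom:  the SD 1.5 fine-tune you actually want to inpaint with
    inpaint: the official SD 1.5 inpainting checkpoint
    base:    the official SD 1.5 base checkpoint
    Keys present only in `custom` (nothing to diff against) pass through unchanged.
    """
    merged = {}
    for key, weight in custom.items():
        if key in inpaint and key in base:
            merged[key] = weight + alpha * (inpaint[key] - base[key])
        else:
            merged[key] = weight
    return merged
```

Note that real SD 1.5 inpainting UNets also have extra input channels for the mask and masked image, which a naive key-wise merge cannot create; merger tools special-case that first convolution layer.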

    • @zacharyshort384 • 29 days ago

      Hmm. Interesting. Do you have a vid link or tutorial you can point me to? :)

  • @Pauluz_The_Web_Gnome • a month ago

    I noticed that the generated images are really low quality...

  • @relaxandlearn7996 • a month ago

    Don't see any difference between this and normal inpainting models.