8GB VRAM GPU? Here are some Automatic1111 Stable Diffusion performance tests to see what works best

  • Published 23. 07. 2023
  • I ran through a series of config arguments to see which performance settings work best on this RTX 3060 Ti 8GB. I hope this helps you in your own tweaking.
    Settings tested (a sketch of the matching launch flags follows below):
    No performance arguments
    xformers
    medvram
    lowvram
    medvram and xformers
    lowvram and xformers
    opt-sdp-attention
    opt-sdp-attention and xformers
  • Science & Technology
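  • For reference, a minimal sketch of how the settings above map to Automatic1111 launch flags, assuming the stock webui-user.bat on Windows (on Linux the same flags go in webui-user.sh); pick exactly one line per test run, and note the flag is spelled --opt-sdp-attention:

    :: webui-user.bat -- set exactly one of these lines before launching
    set COMMANDLINE_ARGS=
    set COMMANDLINE_ARGS=--xformers
    set COMMANDLINE_ARGS=--medvram
    set COMMANDLINE_ARGS=--lowvram
    set COMMANDLINE_ARGS=--medvram --xformers
    set COMMANDLINE_ARGS=--lowvram --xformers
    set COMMANDLINE_ARGS=--opt-sdp-attention
    set COMMANDLINE_ARGS=--opt-sdp-attention --xformers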

Comments • 26

  • @tonkonton • 10 months ago +4

    This was a great no-frills, straightforward video. Thanks for making this!

  • @magno5157 • 10 months ago +1

    Have you tried optimising Automatic1111 with Microsoft Olive? Would love to see what it's like for Nvidia cards with the latest driver.

  • @shambhangal438 • 2 months ago

    Reducing or removing the refresh period for live render updates and reducing the update quality helps immensely for SDXL on Automatic1111, and was the biggest single optimization for me.
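
    In current Automatic1111 builds these live-preview options are under Settings → Live previews. A hypothetical config.json fragment showing the kind of values meant here (key names vary between versions, so treat them as assumptions rather than the exact schema):

      {
        "live_previews_enable": false,
        "show_progress_every_n_steps": 20,
        "show_progress_type": "Approx cheap"
      }

    Disabling previews entirely, refreshing them less often, or picking the cheapest preview method all cut per-step overhead.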

  • @ForsakenSanityGaming • 4 months ago

    Yep, been running solely xformers on the 3060 Ti from the start.
    Med and low should only be used if you're running into stability issues, from what I understand, but I could be wrong.
    Either way, the 3060 Ti is an amazing value card for SD.

  • @SavetheRepublic • 9 months ago +2

    I'm having this very issue. I have a 3070, and it takes over a minute, sometimes 3 to 5 minutes, to run SDXL 1.0.

  • @MattLind-oi5kg • 7 months ago

    Helpful for a 3060ti peasant. Cheers.

  • @darkskyx • 7 months ago +2

    I have 16 GB of RAM and I have a lot of issues, I don't know why. I had to go for --lowvram + xformers, plus --lowram if I want to merge models. I also had to use --skip-torch-cuda-test even though I have an NVIDIA card, or I get errors every time. I can't generate at 1024 resolution if I don't do this. (The full launch line is sketched after this thread.)

    • @alexs2195 • 3 months ago

      What matters is VRAM, video RAM, the RAM of your graphics card.

    • @yoyosingh1088 • 1 month ago

      I've merged models on a 1060 3GB card, so I don't know what you're on about.
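
    For reference, the combination described in the parent comment as a single webui-user.bat launch line (all four flags exist in Automatic1111; needing --skip-torch-cuda-test on a working NVIDIA card usually points to a broken torch/CUDA install rather than a RAM limit):

      set COMMANDLINE_ARGS=--lowvram --xformers --lowram --skip-torch-cuda-test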

  • @zoulouad5706 • 4 months ago

    What about a resolution of 1920×1080? How long does it take to generate?

  • @apnavu007 • 25 days ago

    I'm thinking of buying a laptop with 8GB VRAM. Will I be able to run a Stable Diffusion XL model?

  • @crobinso2010 • 10 months ago +1

    I thought the low VRAM flags weren't meant to speed things up but to avoid out-of-memory errors.

    • @renaissancelaboratories5645 • 10 months ago +1

      Very true, I may try a similar test suite with upscaling to see what crashes. Thanks for watching.

  • @SilentPrayerCG • 9 months ago +1

    What's the point of testing medvram and lowvram on an 8GB card with a 512 model? It has enough VRAM for that, so medvram and lowvram only produce a performance hit.
    You should test with an SDXL model. There's a huge difference between just xformers and xformers + medvram.
    On my 1070, with just xformers it takes more than 10 minutes, while with medvram it's 2 minutes (if you don't run any VRAM-consuming programs alongside it).

    • @shambhangal438 • 5 months ago

      It depends on the VRAM *and* the output resolution *and* the model *and* what extensions you are using (e.g. ControlNet), so 8GB can very easily be too low and in need of a memory flag.
      I set --medvram on my laptop 3080 8GB by default because I use SDXL, and for that model 8GB needs at least --medvram, and because I render at greater than 1024x1024 rather than using a refiner.
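
      Worth noting: newer Automatic1111 releases (1.6.0 and later) add a middle ground, --medvram-sdxl, which applies the medvram memory savings only while an SDXL checkpoint is loaded, so SD 1.5 generations keep full speed. A sketch, assuming a recent install:

        set COMMANDLINE_ARGS=--xformers --medvram-sdxl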

  • @ShakilRenesh • 2 months ago

    How much VRAM should I have? Mine is 6 GB.

  • @wwk279 • 9 months ago

    Everything is still OK with 6GB VRAM, but I will upgrade soon.

  • @SyamsQbattar • 1 month ago

    Which one is better for AI, the RTX 3060 12GB or the 3060 Ti?

    • @aryaman9254 • 5 days ago

      If you get an answer to this, do let me know, OK? @SyamsQbattar

    • @SyamsQbattar • 4 days ago

      @aryaman9254 If you do deep learning, the 3060 12GB is better. But if you only generate images, the 3060 Ti is better.

  • @craigmcmeechan5899 • 8 months ago +2

    I'm going to test this on an Asus Vivobook Pro X16 with an i9-13980HX and an RTX 4070 8GB. If anyone is interested, let me know and I'll post a video of the results. Any advice would be appreciated.

    • @jacjankowski6180 • 6 months ago

      Did you try it? And how were the results?

    • @craigmcmeechan5899 • 5 months ago

      @jacjankowski6180 My experience of running any transformer-based LLM purely unaccelerated (non-CUDA, i.e. without TensorRT under WSL2) hasn't been enjoyable once VRAM requirements get close to the system's 8GB limit. The DEMUCS Windows C++ CUDA-accelerated build is impressive, though: running on battery, audio feature extraction/separation takes around 35 seconds for a 5-minute track. Plug the system in and it completes in 16 seconds.

    • @craigmcmeechan5899 • 5 months ago

      @jacjankowski6180 When I chose the Vivobook Pro X16 OLED with the i9-13980HX and 4070 8GB, I had the option of a very similarly priced Asus ROG model: the prior year's model, with a maxed-out 12th-gen i9 and only a standard 1440p LCD, but it came with a 16GB 3080 Ti. Now that I think about both, neither was a great selection. I've always bought Dell Precision workstations as laptops, and the Asus build quality doesn't come close; but in tight financial times I couldn't find a Dell that came near that spec without spending 1200-1500 more. I'll know next time not to compromise.