8GB VRAM GPU? Here are some Automatic1111 Stable Diffusion performance tests to see what works best
- Published 23 Jul 2023
- I ran through a series of config arguments to see which performance settings work best on this RTX 3060 TI 8GB. I hope this helps you in your own tweaking.
Settings tested:
No performance arguments
xformers
medvram
lowvram
medvram and xformers
lowvram and xformers
opt-sdp-attention
opt-sdp-attention and xformers
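For anyone wanting to reproduce these configurations, the flags above are set via `COMMANDLINE_ARGS` in A1111's launcher script. A minimal sketch for `webui-user.sh` (on Windows, edit `webui-user.bat` with `set` instead of `export`) — uncomment one combination per run:

```shell
#!/bin/bash
# webui-user.sh — pick ONE combination per run; these mirror the settings tested.

# Baseline: no performance arguments
export COMMANDLINE_ARGS=""

# xformers memory-efficient attention
# export COMMANDLINE_ARGS="--xformers"

# Split model across VRAM/RAM to reduce VRAM pressure
# export COMMANDLINE_ARGS="--medvram"
# export COMMANDLINE_ARGS="--lowvram"

# Combinations
# export COMMANDLINE_ARGS="--medvram --xformers"
# export COMMANDLINE_ARGS="--lowvram --xformers"

# PyTorch 2.x scaled-dot-product attention, alone or with xformers
# export COMMANDLINE_ARGS="--opt-sdp-attention"
# export COMMANDLINE_ARGS="--opt-sdp-attention --xformers"
```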
This was a great no frills, straightforward video. Thanks for making this!
Have you tried optimising Automatic1111 with Microsoft Olive? Would love to see what it's like for Nvidia cards with the latest driver.
Reducing or removing the refresh period for live render updates and reducing the update quality helps immensely for SDXL on Automatic1111, and was the biggest single optimization for me.
Yep, been running solely xformers on the 3060ti from the start.
Med and low should only be used if running into stability issues, from what I understood, but I could be wrong.
Either way, the 3060ti is an amazing value card for SD.
I'm having this very issue. I have a 3070 and it takes over a minute, sometimes 3 to 5 minutes, to run SDXL 1.0.
Helpful for a 3060ti peasant. Cheers.
I have 16 GB of RAM and a lot of issues, idk why. I had to go with --lowvram + xformers, plus --lowram if I want to merge models. I also had to use --skip-torch-cuda-test even though I have an NVIDIA card, or I get errors every time. I can't get to 1024 resolution if I don't do this.
What matters is VRAM (video RAM), the RAM on your graphics card.
I've merged models on a 1060 3GB card, so I don't know what you're on about.
What about a resolution of 1920×1080? How much time does it take to generate?
I'm thinking of buying a laptop with 8GB VRAM. Will I be able to run a Stable Diffusion XL model?
I thought lowvram wasn't there to speed things up but to avoid out-of-memory errors.
Very true, I may try a similar test suite with upscaling to see what crashes. Thanks for watching.
What's the point of testing medvram and lowvram on an 8GB card with a 512 model? It has enough VRAM for that, so medvram and lowvram only produce a performance hit.
You should try testing it with an SDXL model. There's a huuuuuge difference between just xformers and xformers+medvram.
On my 1070, with just xformers it takes more than 10 minutes, while with medvram it's 2 min (as long as you don't run any VRAM-consuming programs alongside it).
It depends on the VRAM *and* the output resolution *and* the model *and* what extensions you are using (e.g. ControlNet), so 8GB can very easily be too low and in need of a memory flag.
I set --medvram on my laptop 3080-8GB by default because I use sdxl and for that model, 8GB needs at least a --medvram, and because I render at greater than 1024x1024 rather than use a refiner.
How much VRAM should I have? Mine is 6 GB.
Everything is still OK with 6GB VRAM, but I will upgrade soon.
Good to hear, and thanks for watching!
Which one is better for AI, the RTX 3060 12GB or the 3060Ti?
If you get an answer to this, do let me know, ok? @SyamsQbattar
@@aryaman9254 If you do deep learning, the 3060 12GB is better. But if you only generate images, the 3060Ti is better.
I'm gonna test this on an Asus Vivobook Pro x16 with an i9 13980HX and RTX 4070 8GB. If anyone is interested, let me know, and I'll post a video of the results. ANY ADVICE WOULD BE APPRECIATED
Did you try it? And how were the results?
@jacjankowski6180 My experience running any transformer-based LLM purely unaccelerated (non-CUDA, i.e. without TensorRT under WSL2) hasn't been enjoyable once VRAM requirements get close to the system's 8GB limit. The DEMUCS Windows C++ CUDA-accelerated build is impressive... running on battery, doing 5-stem audio feature extraction/separation takes around 35 seconds for a 5-min track. Plug the system in and it's completed in 16 seconds.
@jacjankowski6180 When I chose the Vivobook Pro x16 OLED with the i9 13980HX and 4070 8GB, I had the option of a very similarly priced Asus ROG model. It was the prior year's model, had a maxed-out 12th-gen i9 and only a standard 1440p LCD, but came with a 16GB 3080 Ti... Now I think about both, and neither was a great selection... I've always bought Dell Precision workstations as laptops... The Asus build quality doesn't come close... but with tight financial times I couldn't find a Dell that came near that spec without spending 1200-1500 more. I'll know next time not to compromise.