submitted 1 year ago* (last edited 1 year ago) by Ganbat@lemmyonline.com to c/stable_diffusion@lemmy.dbzer0.com

Since the shutdown of SD on Colab, is there any option for running SD without disposable income?

I know about StableHorde, but it doesn't seem to really... well, work. At least not for people without GPUs to earn Kudos with. It always gives me a 5+ minute queue and then errors out before that time runs out.

EDIT: It took me a while to set up, but as it turns out, my best option is in fact my 10-year-old computer with a 2GB AMD card. Using the DirectML fork of the WebUI with --lowvram runs pretty damn well for me. It's not as fast as Colab was, but it's not slow by any means. I guess the best advice in the end is: even if you're on a shitbox, try it, your shitbox might surprise you. Take note, though, that running on 2GB of VRAM doesn't work for everyone; only the luckiest of broke mfs can do that, it seems.
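For anyone wanting to try the same setup, here's a sketch of the launch config. `--lowvram` is the flag used above; the other flags are assumptions on my part that commonly help on low-VRAM AMD cards, not something from this post:

```shell
rem Hypothetical webui-user.bat contents for the DirectML fork of the WebUI
rem on a 2GB AMD card (Windows). --lowvram is what the OP used; the rest
rem are optional low-VRAM extras (an assumption, adjust to taste).
set COMMANDLINE_ARGS=--lowvram --opt-sub-quad-attention --no-half
```

Then just run webui-user.bat as usual and the flags get picked up.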

[-] domi@lemmy.secnd.me 2 points 1 year ago* (last edited 1 year ago)

Good to hear you got it working.

If you want to speed it up even further and you're willing to boot Linux from a USB, ROCm is much faster than DirectML right now.

edit: Also, you can run without UI, saving even more VRAM.
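For the UI-less route, one sketch (flag and endpoint names are from the standard A1111 WebUI, so I'm assuming the DirectML fork keeps them): launch the backend API only, then request an image over HTTP.

```shell
# Start the backend with no web UI; by default the API listens on port 7861.
./webui.sh --nowebui --lowvram

# Ask for an image via the txt2img API endpoint and decode the
# base64-encoded result into out.png.
curl -s http://127.0.0.1:7861/sdapi/v1/txt2img \
  -H "Content-Type: application/json" \
  -d '{"prompt": "a lighthouse at dusk", "steps": 20}' \
  | python3 -c "import sys, json, base64; open('out.png','wb').write(base64.b64decode(json.load(sys.stdin)['images'][0]))"
```

This skips loading the Gradio frontend entirely, which is where the small VRAM/RAM saving comes from.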

[-] Ganbat@lemmyonline.com 1 points 1 year ago

Interesting about ROCm, I'll have to look into that. As for running without the UI, I honestly don't think I know enough to do that right now, lol.

this post was submitted on 15 Aug 2023
20 points (95.5% liked)

Stable Diffusion

4304 readers

Discuss matters related to our favourite AI Art generation technology

founded 1 year ago