Like Ra's Naughty Forum

Full Version: AI generated fetish images
Pages: 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27
For Mac M1/M2 users, Stable Diffusion got a lot easier but you need at least 16GB ram...
Just download and install DiffusionBee from https://diffusionbee.com
Be aware: the safety check on generated pics is disabled.

Source code at https://github.com/divamgupta/diffusionb...ffusion-ui
(21 Sep 2022, 12:14 )Anne Wrote: [ -> ]This colab workbook lets you experiment without having to install anything:
Sounds interesting. Trying to understand how it works...
This appears to be the top-of-the-line "run@home" version of Stable Diffusion right now.
Lots of features, easy to install, and supposedly the best WebUI at this point.

Haven't tried it myself yet; I'm still using the one from my original installation post, because I have set up a certain workflow and feel quite comfortable with the command line interface.

[url=https://www.youtube.com/watch?v=vg8-NSbaWZI]Full installation tutorial.[/url]
I'm trying this one: https://github.com/divamgupta/stable-dif...tensorflow

Made a Python virtual environment and installed it. It's slow, as it uses the CPU instead of the graphics card, and it has some ... interesting results 😕

[attachment=56205][attachment=56206][attachment=56207][attachment=56208][attachment=56209][attachment=56210][attachment=56211][attachment=56212]

I thought they maybe used a smaller set of weights, but I got similar results when trying out Stable Diffusion itself, so it seems to be the real deal. Only text-to-image though; no upscaling, inpainting, or face changing ...
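For reference, here is a rough sketch of driving that repo from Python. The module path and the generate() keywords below are assumptions recalled from the repo's README, not verified against the current code, so double-check them before use; the prompt is just an example.

```python
# Sketch of a text-to-image run with the stable-diffusion-tensorflow repo.
# ASSUMPTIONS: the import path `stable_diffusion_tf.stable_diffusion` and the
# generate() keyword names are taken from memory of the README -- verify them.

def generation_kwargs(prompt, num_steps=50, batch_size=1):
    """Collect the keyword arguments for one text-to-image run."""
    return {
        "prompt": prompt,
        "num_steps": num_steps,               # diffusion steps; fewer = faster but rougher
        "unconditional_guidance_scale": 7.5,  # how strongly to follow the prompt
        "temperature": 1,
        "batch_size": batch_size,
    }

if __name__ == "__main__":
    kwargs = generation_kwargs("a red latex catsuit, studio photo", num_steps=50)
    try:
        # Hypothetical import path -- adjust to the installed package layout.
        from stable_diffusion_tf.stable_diffusion import StableDiffusion
        from PIL import Image

        generator = StableDiffusion(img_height=512, img_width=512)
        images = generator.generate(kwargs.pop("prompt"), **kwargs)
        Image.fromarray(images[0]).save("out.png")
    except ImportError:
        print("stable-diffusion-tensorflow not installed; would run with:", kwargs)
```

On CPU the same call just runs much slower; the arguments don't change.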
(25 Sep 2022, 13:45 )Anne Wrote: [ -> ]it uses the cpu instead of the graphics card.
Hm.... Why? TensorFlow should support both CPU and GPU. Or has the model been converted to run on the CPU and system RAM, to avoid a GPU VRAM shortage?
Speaking of red latex...

With a little added effort (running more variations until you're happy, replacing deformities with better parts from other variations, or some good ol' manual photoshopping), these could be pretty convincing.
(25 Sep 2022, 15:58 )Like Ra Wrote: [ -> ]
(25 Sep 2022, 13:45 )Anne Wrote: [ -> ]it uses the cpu instead of the graphics card.
Hm.... Why? TensorFlow should support both CPU and GPU. Or has the model been converted to run on the CPU and system RAM, to avoid a GPU VRAM shortage?

I get a message

Code:
tensorflow/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libcudart.so.11.0'; dlerror: libcudart.so.11.0: cannot open shared object file: No such file or directory
2022-09-25 20:08:11.783454: I tensorflow/stream_executor/cuda/cudart_stub.cc:29] Ignore above cudart dlerror if you do not have a GPU set up on your machine.

So I guess if I install TensorFlow for my AMD card I can use GPU acceleration, but I only have 8 GB of VRAM, and running on the CPU it now takes about 14 GB of RAM. That would confirm what I saw last week: GPU acceleration working, but not enough VRAM.
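The dlerror in the log above just means the loader can't find the CUDA runtime shared library (libcudart). A quick stdlib-only way to check whether that library is discoverable at all, without involving TensorFlow, is:

```python
# Check whether the CUDA runtime library (libcudart) that TensorFlow is
# complaining about can be located on this machine. Pure stdlib; returns
# False on machines with no CUDA installed, which matches the log above.
from ctypes.util import find_library

def cuda_runtime_available() -> bool:
    """Return True if a libcudart shared library can be found by the loader."""
    return find_library("cudart") is not None

print("CUDA runtime found:", cuda_runtime_available())
```

If this prints False, TensorFlow will fall back to the CPU exactly as described in the quoted message.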

Edit: I'm setting num_steps to 5 or 10 instead of 50 and batch_size to 1 to get a crappy image fast; if it has potential, I set num_steps to 50 and generate a batch of 8. Generating 8 images takes about an hour.
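The preview-then-refine workflow above implies a rough per-step cost you can budget with. From the reported numbers (8 images x 50 steps in about an hour on CPU), that's about 9 seconds per step per image; this figure is machine-specific, not a benchmark.

```python
# Back-of-envelope runtime estimator for the preview-then-refine workflow.
# SEC_PER_STEP is derived from the post's own numbers (8 images x 50 steps
# in ~3600 s on CPU); it only applies to that machine.
SEC_PER_STEP = 3600 / (8 * 50)  # = 9.0 seconds per step per image

def estimated_seconds(num_steps, batch_size, sec_per_step=SEC_PER_STEP):
    """Rough wall-clock estimate for one generation run."""
    return num_steps * batch_size * sec_per_step

# A 10-step single-image preview costs ~90 s; the full 50-step batch of 8
# costs ~3600 s, so cheap previews pay off whenever most prompts get discarded.
```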
(25 Sep 2022, 19:14 )Anne Wrote: [ -> ]libcudart.so.11.0: cannot open shared object file
Isn't it related to NVIDIA's CUDA runtime library? Some years ago, only old CUDA versions were supported by TF.

(25 Sep 2022, 19:14 )Anne Wrote: [ -> ]I install Tensorflow for my AMD card
Does TF support AMD (via OpenCL) nowadays?
Under Windows you can use DirectML for AMD GPUs: https://community.amd.com/t5/radeon-pro-...a-p/488595

For Linux, ROCm builds of TF and PyTorch are the answer.