How to install Stable Diffusion in a Docker container

2024-07-09 17:12:13 Tuesday

In this video we look at how to install Stable Diffusion in a Docker container. This entails making our GPU available inside Docker containers and running Stable Diffusion in listening mode. We also enable extensions to be installed from inside the container.

NVIDIA Docker

First off, we need to make the GPU available inside the Docker container.

This can be done by following the guide at https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html
or by running the commands below.

First we add the apt repository that contains the NVIDIA Container Toolkit.

distribution=$(. /etc/os-release;echo $ID$VERSION_ID) \
      && curl -fsSL https://nvidia.github.io/libnvidia-container/gpgkey | sudo gpg --dearmor -o /usr/share/keyrings/nvidia-container-toolkit-keyring.gpg \
      && curl -s -L https://nvidia.github.io/libnvidia-container/$distribution/libnvidia-container.list | \
            sed 's#deb https://#deb [signed-by=/usr/share/keyrings/nvidia-container-toolkit-keyring.gpg] https://#g' | \
            sudo tee /etc/apt/sources.list.d/nvidia-container-toolkit.list

Next up we update our package index and install the toolkit together with nvidia-docker2.

sudo apt-get update
sudo apt-get install -y nvidia-container-toolkit nvidia-docker2

Next we run the nvidia-ctk command from the toolkit to configure the Docker runtime, then restart Docker.

sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker
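If you are curious what the configure step actually did, it registers an nvidia runtime in /etc/docker/daemon.json. The file should contain an entry roughly like the following (the exact contents may vary with the toolkit version):

```json
{
    "runtimes": {
        "nvidia": {
            "args": [],
            "path": "nvidia-container-runtime"
        }
    }
}
```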

When Docker has restarted, we want to verify that the GPU is available inside a container. Running the command below should print information about your GPU and no errors.

sudo docker run --rm --runtime=nvidia --gpus all nvidia/cuda:11.6.2-base-ubuntu20.04 nvidia-smi

Stable Diffusion

Once we have confirmed the GPU is available, we can clone the Git repository for the Stable Diffusion web UI.

git clone https://github.com/AUTOMATIC1111/stable-diffusion-webui.git
cd stable-diffusion-webui

Next we need to edit the file webui-user.sh inside the cloned repository and add the command line arguments below. The --listen flag makes the service reachable from outside the container, and --enable-insecure-extension-access allows extensions to be installed even though the UI is reachable beyond localhost.

export COMMANDLINE_ARGS="--listen --enable-insecure-extension-access"
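If you prefer to script the edit, a small sketch like this appends the line only when it is not already present. The WEBUI_SH variable is an assumption of this sketch, defaulting to webui-user.sh in the current directory; point it at your cloned repository.

```shell
# Append the COMMANDLINE_ARGS export to webui-user.sh, but only once.
# WEBUI_SH is an assumed variable; adjust it to where you cloned the repo.
WEBUI_SH="${WEBUI_SH:-webui-user.sh}"
LINE='export COMMANDLINE_ARGS="--listen --enable-insecure-extension-access"'
# grep flags: -q quiet, -x whole-line match, -F fixed string (no regex)
grep -qxF "$LINE" "$WEBUI_SH" 2>/dev/null || printf '%s\n' "$LINE" >> "$WEBUI_SH"
```

Running it a second time leaves a single copy of the line in place.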

Lastly we start the Docker container, mapping a number of directories so that models, outputs, and settings created inside the container persist on the host. Any directory you want to use for training should also be added to this command. Once the container is running, the GUI should be available at http://localhost:8090.

docker run -it --rm \
    --name stable-diffusion-webui \
    -p 8090:7860 \
    -e NVIDIA_DRIVER_CAPABILITIES=all \
    -e UID=$(id -u) \
    -e GID=$(id -g) \
    -v /home/woden/github/stable-diffusion-webui/models/Stable-diffusion:/home/user/stable-diffusion-webui/models/Stable-diffusion \
    -v /home/woden/github/stable-diffusion-webui/outputs:/home/user/stable-diffusion-webui/outputs \
    -v /home/woden/github/stable-diffusion-webui/styles:/home/user/stable-diffusion-webui/styles \
    -v /home/woden/github/stable-diffusion-webui/extensions:/home/user/stable-diffusion-webui/extensions \
    -v /home/woden/github/stable-diffusion-webui/models/extensions:/home/user/stable-diffusion-webui/models/extensions \
    -v /home/woden/github/stable-diffusion-webui/models/VAE:/home/user/stable-diffusion-webui/models/VAE \
    -v /home/woden/github/stable-diffusion-webui/config.json:/home/user/stable-diffusion-webui/config.json \
    -v /home/woden/ui-config.json.bak:/home/user/ui-config.json.bak \
    -v /home/woden/github/stable-diffusion-webui/webui-user.sh:/home/user/stable-diffusion-webui/webui-user.sh \
    --gpus all \
    --privileged \
    kestr3l/stable-diffusion-webui:1.2.2
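A long docker run invocation like this can also be captured in a Docker Compose file, which is easier to maintain and rerun. Below is a sketch assuming the same image and host paths, abbreviated to two representative volume mappings; note that Compose cannot evaluate $(id -u), so the UID/GID variables would need to be set in the environment or an .env file.

```yaml
services:
  stable-diffusion-webui:
    image: kestr3l/stable-diffusion-webui:1.2.2
    ports:
      - "8090:7860"
    environment:
      - NVIDIA_DRIVER_CAPABILITIES=all
    privileged: true
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
    volumes:
      # Repeat the remaining host:container pairs from the docker run command.
      - /home/woden/github/stable-diffusion-webui/models/Stable-diffusion:/home/user/stable-diffusion-webui/models/Stable-diffusion
      - /home/woden/github/stable-diffusion-webui/outputs:/home/user/stable-diffusion-webui/outputs
```

With this in place, docker compose up starts the container and the UI is again served at http://localhost:8090.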
