[Bug]: VRAM usage is way higher #6307

@shimizu-izumi

Description

Is there an existing issue for this?

  • I have searched the existing issues and checked the recent builds/commits

What happened?

I updated the WebUI a few minutes ago and now the VRAM usage when generating an image is way higher. I have 3 monitors (2x 1920x1080 & 1x 2560x1440) and I use Wallpaper Engine on all of them, but I have Discord open on one of them nearly 24/7, so Wallpaper Engine is only active for two monitors. About 1.5 GB of VRAM is used when I am on the desktop without the WebUI running.
Web Browser: Microsoft Edge (Chromium)
OS: Windows 11 (Build number: 22621.963)
GPU: NVIDIA GeForce RTX 3070 Ti (KFA2)
CPU: Intel Core i7-11700K
RAM: Corsair VENGEANCE LPX 32 GB (2 x 16 GB) DDR4 DRAM 3200 MHz C16

Steps to reproduce the problem

  1. Start the WebUI
  2. Use the following settings to generate an image

Positive prompt:
masterpiece, best quality, 1girl, brown hair, green eyes, colorful, autumn, cumulonimbus clouds, lighting, blue sky, falling leaves, garden
Negative prompt:
lowres, bad anatomy, bad hands, text, error, missing fingers, extra digit, fewer digits, cropped, worst quality, low quality, normal quality, jpeg artifacts, signature, watermark, username, blurry, artist name,
Steps: 50, Sampler: Euler a, CFG scale: 12, Seed: 3607441108, Size: 512x768, Model hash: 8d9aaa54, Model: Anything V3 (non pruned with vae), Denoising strength: 0.69, Clip skip: 2, Hires upscale: 2, Hires upscaler: R-ESRGAN AnimeVideo
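Not part of the original report, but the settings above make the memory pressure easy to see: a hires upscale of 2 doubles each dimension, so the second pass renders four times as many pixels as the 512x768 base image. A quick arithmetic sketch (illustrative values taken from the generation parameters above):

```python
# Illustrative arithmetic only: hires fix at 2x quadruples the pixel count,
# which is typically the step that exhausts an 8 GiB card.
base_w, base_h = 512, 768   # base resolution from the report
scale = 2                   # "Hires upscale: 2"
hires_w, hires_h = base_w * scale, base_h * scale
print(hires_w, hires_h)                          # 1024 1536
print((hires_w * hires_h) / (base_w * base_h))   # 4.0
```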

What should have happened?

The generation should complete without any errors

Commit where the problem happens

1cfd8ae

What platforms do you use to access UI ?

Windows

What browsers do you use to access the UI ?

Microsoft Edge

Command Line Arguments

--xformers

Additional information, context and logs

I have the config for animefull from the NovelAI leak in the configs folder under the name Anything V3.0.yaml, but I still get this error when I remove it from the configs folder and completely restart the WebUI. This is the error I get:

RuntimeError: CUDA out of memory. Tried to allocate 1.50 GiB (GPU 0; 8.00 GiB total capacity; 4.70 GiB already allocated; 0 bytes free; 5.96 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
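For anyone hitting the same traceback: the error message itself points at the `PYTORCH_CUDA_ALLOC_CONF` allocator hint. A minimal sketch of setting it before launching the WebUI (the value 512 is only an illustrative starting point, not a recommendation from this report):

```shell
# Hypothetical mitigation suggested by the error message itself:
# cap the allocator's split size to reduce fragmentation.
# On Windows this would go in webui-user.bat as
#   set PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:512
export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:512
echo "$PYTORCH_CUDA_ALLOC_CONF"
```

Whether this helps depends on how fragmented the 5.96 GiB reserved by PyTorch actually is; it does not reduce the total memory the hires pass needs.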

Labels

bug-report: Report of a bug, yet to be confirmed