
VRAM Calculator

Estimate how much VRAM your AI workflow needs. Adjust the model, resolution, batch size, and LoRA settings to see the estimate update in real time.

Workflow modes: Image · Video · Animation

Batch size: 1
LoRA count: 0 (range 0–5)

Estimated VRAM Required

13.6GB

FLUX Dev FP8 · 1024×1024 · Batch 1

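The calculator's exact formula isn't published; the sketch below is a simple additive model with illustrative coefficients tuned so the FLUX Dev FP8 example (12B parameters at 1 byte each, 1024×1024, batch 1, no LoRAs) lands on the 13.6GB figure shown above. The activation, LoRA, and overhead terms are assumptions, not the tool's real numbers.

```python
def estimate_vram_gb(params_billions, bytes_per_param, width, height,
                     batch_size, lora_count):
    """Rough additive VRAM estimate in GB (illustrative coefficients)."""
    # Model weights: 1e9 params at 1 byte each is roughly 1 GB (FP8).
    weights_gb = params_billions * bytes_per_param
    # Activations assumed to scale with pixel count and batch size.
    activations_gb = 1.2 * (width * height) / (1024 * 1024) * batch_size
    # Assumed flat per-LoRA cost.
    loras_gb = 0.2 * lora_count
    # Assumed fixed overhead for VAE / text-encoder remnants.
    overhead_gb = 0.4
    return weights_gb + activations_gb + loras_gb + overhead_gb

# FLUX Dev FP8, 1024×1024, batch 1, no LoRAs → about 13.6 GB
print(round(estimate_vram_gb(12, 1.0, 1024, 1024, 1, 0), 1))
```

Doubling the batch size or stacking LoRAs moves the estimate up linearly under this model, which matches how the sliders above behave directionally even if the coefficients differ.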

Your Rigs

RTX 5080 (16GB VRAM): ✓ Runs · suggested flags: --gpu-only
RTX 3080 16GB (16GB VRAM): ✓ Runs · suggested flags: --gpu-only
GTX 1660 Ti (6GB VRAM): ✗ OOM · suggested flags: --lowvram --cpu-vae --disable-smart-memory
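The flag suggestions above can be sketched as a simple VRAM check. The flags are the real ComfyUI launch options shown for each rig; the threshold logic and the "python main.py" entry point (ComfyUI's standard launcher) are this sketch's assumptions, not the tool's documented rule.

```python
def suggest_flags(vram_gb, required_gb=13.6):
    """Pick ComfyUI launch flags based on available VRAM.

    Assumed rule: if the estimate fits, keep everything on the GPU;
    otherwise fall back to the low-VRAM flags shown above.
    """
    if vram_gb >= required_gb:
        return ["--gpu-only"]
    return ["--lowvram", "--cpu-vae", "--disable-smart-memory"]

# A 6GB card gets the low-VRAM fallback:
print("python main.py " + " ".join(suggest_flags(6)))
# prints: python main.py --lowvram --cpu-vae --disable-smart-memory
```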

Full GPU Compatibility

RTX 5090: 32GB
RTX 5070 Ti: 16GB
RTX 5070: 12GB
RTX 4090: 24GB
RTX 4080 Super: 16GB
RTX 4080: 16GB
RTX 4070 Ti Super: 16GB
RTX 4070 Ti: 12GB
RTX 4070 Super: 12GB
RTX 4070: 12GB
RTX 4060 Ti 16GB: 16GB
RTX 4060 Ti: 8GB
RTX 4060: 8GB
RTX 3090 Ti: 24GB
RTX 3090: 24GB
RTX 3080 Ti: 12GB
RTX 3080 10GB: 10GB
RTX 3070 Ti: 8GB
RTX 3070: 8GB
RTX 3060 Ti: 8GB
RTX 3060: 12GB
RTX 3060M: 6GB
RTX 2080 Ti: 11GB
RTX 2080 Super: 8GB
RTX 2070 Super: 8GB
GTX 1080 Ti: 11GB
Legend: Runs fine · Tight fit · OOM risk
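The three legend buckets can be reproduced with a headroom check against the estimate. The 15% safety margin is an assumed threshold for illustration; the page does not state its actual cutoffs.

```python
def classify_gpu(vram_gb, required_gb, headroom=0.15):
    """Bucket a GPU into the legend's three categories.

    Assumed rule: comfortable fit needs the estimate plus 15% headroom;
    anything between the bare estimate and that margin is a tight fit.
    """
    if vram_gb >= required_gb * (1 + headroom):
        return "Runs fine"
    if vram_gb >= required_gb:
        return "Tight fit"
    return "OOM risk"

# Against the 13.6GB estimate above:
print(classify_gpu(24, 13.6))  # Runs fine
print(classify_gpu(14, 13.6))  # Tight fit
print(classify_gpu(8, 13.6))   # OOM risk
```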

Recommended Build for 13.6GB

Mid-Range Rig

12–16GB VRAM

GPU: RTX 4070 Ti / RTX 3080 16GB / RTX 4080
CPU: Ryzen 7 7700X or Intel i7-13700K
RAM: 64GB DDR5
Note: Handles FLUX FP8, SDXL at full resolution, and LTX Video. The sweet spot for most workflows.
Plan this build on ComputeAtlas →