ComfyUI GGUF: How to Install, Load, and Troubleshoot GGUF Models
Use GGUF models in ComfyUI with the ComfyUI-GGUF custom node, Unet Loader GGUF, CLIPLoader GGUF, and the right model folders.
GGUF models in ComfyUI usually need the ComfyUI-GGUF custom node.
The short version: install the custom node, put diffusion model .gguf files in ComfyUI/models/unet or ComfyUI/models/diffusion_models, then replace the normal diffusion model loader with Unet Loader (GGUF).
Quick Answer
| Goal | What to Do |
|---|---|
| Load a Flux GGUF model | Install ComfyUI-GGUF, then use Unet Loader (GGUF) |
| Find the right folder | Start with ComfyUI/models/unet or ComfyUI/models/diffusion_models |
| Load a GGUF T5 text encoder | Use a CLIPLoader (GGUF) node when the workflow calls for it |
| Fix missing GGUF loader nodes | Check that ComfyUI-GGUF is installed under custom_nodes and restart |
| Fix empty model dropdown | Confirm the file extension, folder, and active ComfyUI environment |
What GGUF Means in ComfyUI
GGUF is a binary file format for storing model weights, widely used for quantized models. In ComfyUI, it is most often seen with large transformer or DiT-based diffusion models, such as the models used in Flux-style workflows.
The reason people use GGUF is practical: quantized files can reduce VRAM requirements compared with full precision model files. The tradeoff is that setup becomes more specific. You need the right custom node and the right loader.
Step 1: Install ComfyUI-GGUF
For a manual install, open a terminal in your ComfyUI folder and clone the custom node:
cd ComfyUI/custom_nodes
git clone https://github.com/city96/ComfyUI-GGUF
Then install its requirements in the same Python environment used by ComfyUI:
cd ComfyUI-GGUF
python -m pip install -r requirements.txt
For a Windows portable package, run from the portable root:
git clone https://github.com/city96/ComfyUI-GGUF ComfyUI\custom_nodes\ComfyUI-GGUF
.\python_embeded\python.exe -s -m pip install -r .\ComfyUI\custom_nodes\ComfyUI-GGUF\requirements.txt
Restart ComfyUI after installing the node.
Install into the right Python
If you use portable ComfyUI, system pip is usually the wrong pip. Use python_embeded\python.exe from the portable package.
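If you are unsure which environment a plain `pip install` would touch, a quick sanity check is to ask the interpreter itself. This is a sketch: `python` here is a stand-in for whatever interpreter actually launches ComfyUI, and on Windows portable it should be `.\python_embeded\python.exe` instead.

```shell
# Stand-in for the interpreter ComfyUI actually runs with; on Windows portable
# replace this with .\python_embeded\python.exe.
PY=$(command -v python || command -v python3)

# Which interpreter is this really?
"$PY" -c "import sys; print(sys.executable)"

# Which environment would "pip install" write into?
"$PY" -m pip --version
```

If the printed path is not the one ComfyUI starts with, invoke the right interpreter explicitly, for example `.\python_embeded\python.exe -m pip install -r requirements.txt`.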
Step 2: Put GGUF Files in the Right Folder
The ComfyUI-GGUF README says to place .gguf model files in:
ComfyUI/models/unet
Current ComfyUI folder configuration also maps legacy unet paths to:
ComfyUI/models/diffusion_models
So if one guide says models/unet and another says models/diffusion_models, they may both be describing the same model type. The important part is that GGUF diffusion models are not normal checkpoint files.
Do not put a Flux GGUF diffusion model here:
ComfyUI/models/checkpoints
That folder is for regular checkpoint loader workflows.
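The expected layout can be sketched in a throwaway directory tree. The quant file name below (`flux1-dev-Q4_K_S.gguf`) is hypothetical; the point is only where a `.gguf` diffusion model should sit so the loader can find it.

```shell
# Build a temporary copy of the relevant ComfyUI folder layout.
ROOT=$(mktemp -d)
mkdir -p "$ROOT/ComfyUI/models/unet" \
         "$ROOT/ComfyUI/models/diffusion_models" \
         "$ROOT/ComfyUI/models/checkpoints"

# A hypothetical quantized Flux diffusion model goes under models/unet
# (or models/diffusion_models), never under models/checkpoints.
touch "$ROOT/ComfyUI/models/unet/flux1-dev-Q4_K_S.gguf"

# The GGUF loader dropdown is populated from *.gguf files in these folders:
find "$ROOT/ComfyUI/models" -name '*.gguf'
```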
For a broader folder map, see Where to Put Safetensors in ComfyUI.
Step 3: Use Unet Loader GGUF
After installing the custom node, search for a node like:
Unet Loader (GGUF)
Use it in place of the normal Load Diffusion Model node when the workflow expects a GGUF diffusion model.
This is the most common fix for searches like:
- comfyui unet loader gguf
- unet loader gguf comfyui
- comfyui gguf loader
- comfyui load gguf model
If the node does not appear, ComfyUI-GGUF did not load correctly. Check the startup terminal for an import error.
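Startup output scrolls past quickly, so it can help to save it to a file and search that instead. This is a sketch: `comfyui.log` is a hypothetical filename for wherever you redirected the terminal output, for example `python main.py 2>&1 | tee comfyui.log`.

```shell
# Search a saved copy of the startup output for the usual failure markers.
# comfyui.log is a hypothetical file containing your captured terminal output.
LOG=comfyui.log
[ -f "$LOG" ] && grep -nE 'IMPORT FAILED|ModuleNotFoundError|No module named' "$LOG" || true
```

The first matching line is usually the real cause; later errors are often just fallout from it.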
Step 4: Know When You Need CLIPLoader GGUF
Some GGUF workflows also use quantized text encoders, such as T5 GGUF files.
In that case, the workflow may use a GGUF CLIP loader node instead of the normal CLIP loader. The ComfyUI-GGUF README notes that its CLIP loader can handle GGUF and regular text encoder files.
If your workflow has a missing CLIP loader node:
- Confirm ComfyUI-GGUF is installed.
- Confirm the text encoder file is in ComfyUI/models/text_encoders (or the legacy models/clip folder).
- Confirm the workflow actually expects a GGUF text encoder, not just a GGUF diffusion model.
Step 5: Restart and Check the Terminal
After installing ComfyUI-GGUF and moving files, restart ComfyUI.
If the loader node is still missing, check the first real error in the terminal. Look for:
IMPORT FAILED
ModuleNotFoundError
No module named 'gguf'
If gguf is missing, install the custom node requirements in the correct Python environment.
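A quick way to confirm the diagnosis is to try the import with the same interpreter that runs ComfyUI. This is a sketch: `python` is a stand-in, and on Windows portable you would use `.\python_embeded\python.exe` instead.

```shell
# Stand-in for the interpreter ComfyUI runs with; on Windows portable use
# .\python_embeded\python.exe instead.
PY=$(command -v python || command -v python3)

# If the import succeeds, the loader's dependency exists in this environment.
if "$PY" -c "import gguf" 2>/dev/null; then
  echo "gguf package found"
else
  echo "gguf package missing - install the ComfyUI-GGUF requirements with this interpreter"
fi
```

If the check prints "missing" even though you ran pip, you almost certainly installed into a different Python than the one ComfyUI starts with.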
Common Problems
| Problem | Cause | Fix |
|---|---|---|
| Unet Loader (GGUF) is missing | Custom node did not load | Reinstall ComfyUI-GGUF and restart |
| GGUF file not in dropdown | File is in the wrong model folder | Move it to models/unet or models/diffusion_models |
| Workflow uses normal Load Checkpoint | Wrong loader for GGUF diffusion model | Replace it with Unet Loader (GGUF) |
| No module named 'gguf' | Dependency missing | Install ComfyUI-GGUF requirements |
| Out of memory still happens | Quantization helps, but does not make every workflow fit | Use a smaller quant, lower resolution, or smaller workflow |
GGUF vs Checkpoint vs Safetensors
| Format | Typical Folder | Typical Use |
|---|---|---|
| .safetensors checkpoint | models/checkpoints | SD 1.5, SDXL, many all-in-one models |
| .safetensors diffusion model | models/diffusion_models | Flux and newer split-model workflows |
| .gguf diffusion model | models/unet or models/diffusion_models | Quantized Flux or other transformer models |
| .gguf text encoder | text encoder or clip folder, depending on workflow | Quantized T5 or text encoder loaders |
The file extension alone is not enough. The workflow and loader node decide where the file belongs.
When GGUF Is a Good Choice
GGUF is worth trying when:
- your GPU is short on VRAM
- the workflow specifically asks for a GGUF model
- you are running Flux-style or other large transformer models
- a community workflow provides a tested GGUF setup
It is not the best first choice when:
- you are still learning basic checkpoint workflows
- the custom node fails to import
- you have enough VRAM for a standard model
- the workflow instructions are unclear or mix several model formats
How Wonderful Launcher Helps
GGUF setup can touch custom nodes, model folders, Python packages, and workflow loaders at the same time. Wonderful Launcher helps when you want to:
- keep a stable non-GGUF environment separate from a GGUF experiment
- preserve working workflows before adding custom nodes
- collect startup logs when loader nodes fail
- recover models and workflows before rebuilding ComfyUI
If the GGUF install breaks other nodes, start with ComfyUI Dependency Conflicts.
Related Guides
- Where to Put Safetensors in ComfyUI
- Download Models
- Install ComfyUI Custom Nodes
- ComfyUI Dependency Conflicts
- GPU Compatibility
Source References
ComfyUI Model Types Explained: Checkpoint, LoRA, VAE, ControlNet & More
Understand the different model types used in ComfyUI — what each does, where to install it, and how they work together in a workflow.
ComfyUI Flux Guide: Setup, Workflows & VRAM Options
Complete guide to running Flux.1 in ComfyUI — model versions compared, download links, text-to-image and image-to-image workflows, and low-VRAM solutions.
Wonderful Launcher Docs