This is a direct GGUF conversion of AIDC-AI/Ovis-Image-7B.

## Usage

The model files can be used in ComfyUI with the ComfyUI-GGUF custom node. Place the required model(s) in the folders listed below; a scripted download example follows the table.

| Type | Name | Location | Download |
| --- | --- | --- | --- |
| Main Model | Ovis-Image-7B | `ComfyUI/models/unet` | GGUF (this repo) |
| Text Encoder | Ovis2.5 | `ComfyUI/models/text_encoders` | Safetensors |
| VAE | ae.safetensors | `ComfyUI/models/vae` | Safetensors |
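
If you prefer to script the download, the minimal sketch below uses `huggingface_hub` to fetch the main model into the expected ComfyUI folder. The GGUF filename shown (`Ovis-Image-7B-Q4_K_M.gguf`) is an assumption, so pick an actual file from this repo's file list, and fetch the Ovis2.5 text encoder and the VAE from their respective repos in the same way.

```python
# Sketch: download the quantized main model into the ComfyUI folder layout
# described in the table above, using huggingface_hub.
from huggingface_hub import hf_hub_download

COMFYUI_DIR = "ComfyUI"  # adjust to your ComfyUI installation path

# Main model (this repo) -> ComfyUI/models/unet
# NOTE: the filename below is an assumed example; choose the quantization
# level you actually want from the repo's file list.
hf_hub_download(
    repo_id="QuantStack/Ovis-Image-7B-GGUF",
    filename="Ovis-Image-7B-Q4_K_M.gguf",
    local_dir=f"{COMFYUI_DIR}/models/unet",
)

# Repeat with the appropriate repo_id/filename for the Ovis2.5 text encoder
# (-> ComfyUI/models/text_encoders) and ae.safetensors (-> ComfyUI/models/vae).
```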

Since this is a quantized version of the original model, all of its licensing terms and usage restrictions remain in effect.
