ONNX Model Library

No documentation available.

Anton-Ding/model

Author: Anton-Ding

diffusers
Downloads: 150 · Likes: 0

Created: 2026-03-18 15:05:34+00:00

Updated: 2026-03-19 17:45:56+00:00

View on Hugging Face

Files (359)

.gitattributes
LLM/Florence-2-base-PromptGen-v1.5/.gitattributes
LLM/Florence-2-base-PromptGen-v1.5/README.md
LLM/Florence-2-base-PromptGen-v1.5/added_tokens.json
LLM/Florence-2-base-PromptGen-v1.5/config.json
LLM/Florence-2-base-PromptGen-v1.5/configuration_florence2.py
LLM/Florence-2-base-PromptGen-v1.5/generation_config.json
LLM/Florence-2-base-PromptGen-v1.5/merges.txt
LLM/Florence-2-base-PromptGen-v1.5/model.safetensors
LLM/Florence-2-base-PromptGen-v1.5/modeling_florence2.py
LLM/Florence-2-base-PromptGen-v1.5/preprocessor_config.json
LLM/Florence-2-base-PromptGen-v1.5/processing_florence2.py
LLM/Florence-2-base-PromptGen-v1.5/special_tokens_map.json
LLM/Florence-2-base-PromptGen-v1.5/tokenizer.json
LLM/Florence-2-base-PromptGen-v1.5/tokenizer_config.json
LLM/Florence-2-base-PromptGen-v1.5/vocab.json
LLM/Florence-2-base-ft/.gitattributes
LLM/Florence-2-base-ft/CODE_OF_CONDUCT.md
LLM/Florence-2-base-ft/LICENSE
LLM/Florence-2-base-ft/README.md
LLM/Florence-2-base-ft/SECURITY.md
LLM/Florence-2-base-ft/SUPPORT.md
LLM/Florence-2-base-ft/config.json
LLM/Florence-2-base-ft/configuration_florence2.py
LLM/Florence-2-base-ft/model.safetensors
LLM/Florence-2-base-ft/modeling_florence2.py
LLM/Florence-2-base-ft/preprocessor_config.json
LLM/Florence-2-base-ft/processing_florence2.py
LLM/Florence-2-base-ft/pytorch_model.bin
LLM/Florence-2-base-ft/tokenizer.json
LLM/Florence-2-base-ft/tokenizer_config.json
LLM/Florence-2-base-ft/vocab.json
LLM/Florence-2-base/.gitattributes
LLM/Florence-2-base/CODE_OF_CONDUCT.md
LLM/Florence-2-base/LICENSE
LLM/Florence-2-base/README.md
LLM/Florence-2-base/SECURITY.md
LLM/Florence-2-base/SUPPORT.md
LLM/Florence-2-base/config.json
LLM/Florence-2-base/configuration_florence2.py
LLM/Florence-2-base/modeling_florence2.py
LLM/Florence-2-base/preprocessor_config.json
LLM/Florence-2-base/processing_florence2.py
LLM/Florence-2-base/pytorch_model.bin
LLM/Florence-2-base/tokenizer.json
LLM/Florence-2-base/tokenizer_config.json
LLM/Florence-2-base/vocab.json
LLM/Florence-2-large-PromptGen-v2.0/.gitattributes
LLM/Florence-2-large-PromptGen-v2.0/README.md
LLM/Florence-2-large-PromptGen-v2.0/added_tokens.json
LLM/Florence-2-large-PromptGen-v2.0/config.json
LLM/Florence-2-large-PromptGen-v2.0/configuration_florence2.py
LLM/Florence-2-large-PromptGen-v2.0/generation_config.json
LLM/Florence-2-large-PromptGen-v2.0/merges.txt
LLM/Florence-2-large-PromptGen-v2.0/model.safetensors
LLM/Florence-2-large-PromptGen-v2.0/modeling_florence2.py
LLM/Florence-2-large-PromptGen-v2.0/preprocessor_config.json
LLM/Florence-2-large-PromptGen-v2.0/processing_florence2.py
LLM/Florence-2-large-PromptGen-v2.0/special_tokens_map.json
LLM/Florence-2-large-PromptGen-v2.0/tokenizer.json
LLM/Florence-2-large-PromptGen-v2.0/tokenizer_config.json
LLM/Florence-2-large-PromptGen-v2.0/vocab.json
LLM/Florence-2-large-ft/.gitattributes
LLM/Florence-2-large-ft/CODE_OF_CONDUCT.md
LLM/Florence-2-large-ft/LICENSE
LLM/Florence-2-large-ft/README.md
LLM/Florence-2-large-ft/SECURITY.md
LLM/Florence-2-large-ft/SUPPORT.md
LLM/Florence-2-large-ft/config.json
LLM/Florence-2-large-ft/configuration_florence2.py
LLM/Florence-2-large-ft/generation_config.json
LLM/Florence-2-large-ft/model.safetensors
LLM/Florence-2-large-ft/modeling_florence2.py
LLM/Florence-2-large-ft/preprocessor_config.json
LLM/Florence-2-large-ft/processing_florence2.py
LLM/Florence-2-large-ft/pytorch_model.bin
LLM/Florence-2-large-ft/tokenizer.json
LLM/Florence-2-large-ft/tokenizer_config.json
LLM/Florence-2-large-ft/vocab.json
LLM/Florence-2-large/.gitattributes
LLM/Florence-2-large/CODE_OF_CONDUCT.md
LLM/Florence-2-large/LICENSE
LLM/Florence-2-large/README.md
LLM/Florence-2-large/SECURITY.md
LLM/Florence-2-large/SUPPORT.md
LLM/Florence-2-large/config.json
LLM/Florence-2-large/configuration_florence2.py
LLM/Florence-2-large/generation_config.json
LLM/Florence-2-large/modeling_florence2.py
LLM/Florence-2-large/preprocessor_config.json
LLM/Florence-2-large/processing_florence2.py
LLM/Florence-2-large/pytorch_model.bin
LLM/Florence-2-large/sample_inference.ipynb
LLM/Florence-2-large/tokenizer.json
LLM/Florence-2-large/tokenizer_config.json
LLM/Florence-2-large/vocab.json
LLM/MiniCPMv2_6-prompt-generator/.gitattributes
LLM/MiniCPMv2_6-prompt-generator/.mdl
LLM/MiniCPMv2_6-prompt-generator/.msc
LLM/MiniCPMv2_6-prompt-generator/.mv
LLM/MiniCPMv2_6-prompt-generator/README.md
LLM/MiniCPMv2_6-prompt-generator/added_tokens.json
LLM/MiniCPMv2_6-prompt-generator/config.json
LLM/MiniCPMv2_6-prompt-generator/configuration.json
LLM/MiniCPMv2_6-prompt-generator/configuration_minicpm.py
LLM/MiniCPMv2_6-prompt-generator/generation_config.json
LLM/MiniCPMv2_6-prompt-generator/image_processing_minicpmv.py
LLM/MiniCPMv2_6-prompt-generator/merges.txt
LLM/MiniCPMv2_6-prompt-generator/modeling_minicpmv.py
LLM/MiniCPMv2_6-prompt-generator/modeling_navit_siglip.py
LLM/MiniCPMv2_6-prompt-generator/preprocessor_config.json
LLM/MiniCPMv2_6-prompt-generator/processing_minicpmv.py
LLM/MiniCPMv2_6-prompt-generator/pytorch_model-00001-of-00002.bin
LLM/MiniCPMv2_6-prompt-generator/pytorch_model-00002-of-00002.bin
LLM/MiniCPMv2_6-prompt-generator/pytorch_model.bin.index.json
LLM/MiniCPMv2_6-prompt-generator/resampler.py
LLM/MiniCPMv2_6-prompt-generator/special_tokens_map.json
LLM/MiniCPMv2_6-prompt-generator/test.py
LLM/MiniCPMv2_6-prompt-generator/tokenization_minicpmv_fast.py
LLM/MiniCPMv2_6-prompt-generator/tokenizer.json
LLM/MiniCPMv2_6-prompt-generator/tokenizer_config.json
LLM/MiniCPMv2_6-prompt-generator/vocab.json
LLM/microsoftFlorence-2-base
LLM/unsloth--Meta-Llama-3.1-8B-Instruct/.gitattributes
LLM/unsloth--Meta-Llama-3.1-8B-Instruct/README.md
LLM/unsloth--Meta-Llama-3.1-8B-Instruct/config.json
LLM/unsloth--Meta-Llama-3.1-8B-Instruct/generation_config.json
LLM/unsloth--Meta-Llama-3.1-8B-Instruct/model-00001-of-00004.safetensors
LLM/unsloth--Meta-Llama-3.1-8B-Instruct/model-00002-of-00004.safetensors
LLM/unsloth--Meta-Llama-3.1-8B-Instruct/model-00003-of-00004.safetensors
LLM/unsloth--Meta-Llama-3.1-8B-Instruct/model-00004-of-00004.safetensors
LLM/unsloth--Meta-Llama-3.1-8B-Instruct/model.safetensors.index.json
LLM/unsloth--Meta-Llama-3.1-8B-Instruct/special_tokens_map.json
LLM/unsloth--Meta-Llama-3.1-8B-Instruct/tokenizer.json
LLM/unsloth--Meta-Llama-3.1-8B-Instruct/tokenizer_config.json
LLavacheckpoints/llama-joycaption-beta-one-hf-llava/.gitattributes
LLavacheckpoints/llama-joycaption-beta-one-hf-llava/LLAMA_LICENSE
LLavacheckpoints/llama-joycaption-beta-one-hf-llava/LLAMA_USE_POLICY.md
LLavacheckpoints/llama-joycaption-beta-one-hf-llava/README.md
LLavacheckpoints/llama-joycaption-beta-one-hf-llava/chat_template.json
LLavacheckpoints/llama-joycaption-beta-one-hf-llava/config.json
LLavacheckpoints/llama-joycaption-beta-one-hf-llava/generation_config.json
LLavacheckpoints/llama-joycaption-beta-one-hf-llava/model-00001-of-00004.safetensors
LLavacheckpoints/llama-joycaption-beta-one-hf-llava/model-00002-of-00004.safetensors
LLavacheckpoints/llama-joycaption-beta-one-hf-llava/model-00003-of-00004.safetensors
LLavacheckpoints/llama-joycaption-beta-one-hf-llava/model-00004-of-00004.safetensors
LLavacheckpoints/llama-joycaption-beta-one-hf-llava/model.safetensors.index.json
LLavacheckpoints/llama-joycaption-beta-one-hf-llava/preprocessor_config.json
LLavacheckpoints/llama-joycaption-beta-one-hf-llava/processor_config.json
LLavacheckpoints/llama-joycaption-beta-one-hf-llava/special_tokens_map.json
LLavacheckpoints/llama-joycaption-beta-one-hf-llava/tokenizer.json
LLavacheckpoints/llama-joycaption-beta-one-hf-llava/tokenizer_config.json
SEEDVR2/ema_vae_fp16.safetensors
SEEDVR2/seedvr2_ema_3b-Q4_K_M.gguf
SEEDVR2/seedvr2_ema_3b-Q5_K_M.gguf
SEEDVR2/seedvr2_ema_3b_fp8_e4m3fn.safetensors
audio_encoders/put_audio_encoder_models_here
checkpoints/F.1 Inpainting-Alpha.safetensors
checkpoints/Qwen-Image-Edit-2509-Q3_K_M.gguf
checkpoints/RealVisXL_V4.0.safetensors
checkpoints/SDv2-1_768-ema-pruned.safetensors
checkpoints/flux1-dev-fp8.safetensors
checkpoints/flux1-dev.safetensors
checkpoints/flux1-fill-dev.safetensors
checkpoints/flux1-kontext-dev.safetensors
checkpoints/flux1-schnell.safetensors
checkpoints/hunyuan_3d_v2.1.safetensors
checkpoints/ltx-2-19b-dev-fp8.safetensors
checkpoints/ltx-video-2b-v0.9.safetensors
checkpoints/put_checkpoints_here
checkpoints/qwen_image_edit_fp8_e4m3fn.safetensors
checkpoints/realvisxlV50_v50LightningBakedvae.safetensors
checkpoints/sd3.5_large_fp8_scaled.safetensors
checkpoints/sd_xl_base_1.0.safetensors
checkpoints/sd_xl_refiner_1.0.safetensors
checkpoints/stable_zero123.ckpt
checkpoints/v1-5-pruned.safetensors
clip/clip_g.safetensors
clip/clip_l.safetensors
clip/clip_model.pt
clip/model.safetensors
clip/put_clip_or_text_encoder_models_here
clip/pytorch_model.bin
clip/qwen_2.5_vl_7b_fp8_scaled.safetensors
clip/siglip-so400m-patch14-384/.gitattributes
clip/siglip-so400m-patch14-384/README.md
clip/siglip-so400m-patch14-384/config.json
clip/siglip-so400m-patch14-384/model.safetensors
clip/siglip-so400m-patch14-384/preprocessor_config.json
clip/siglip-so400m-patch14-384/special_tokens_map.json
clip/siglip-so400m-patch14-384/spiece.model
clip/siglip-so400m-patch14-384/tokenizer.json
clip/siglip-so400m-patch14-384/tokenizer_config.json
clip/t5xxl_fp16.safetensors
clip/t5xxl_fp8_e4m3fn.safetensors
clip_vision/CLIP-ViT-H-14-laion2B-s32B-b79K.safetensors
clip_vision/CLIP-ViT-bigG-14-laion2B-39B-b160k.safetensors
clip_vision/clip-vit-large-patch14-336.bin
clip_vision/google--siglip-so400m-patch14-384/.gitattributes
clip_vision/google--siglip-so400m-patch14-384/README.md
clip_vision/google--siglip-so400m-patch14-384/config.json
clip_vision/google--siglip-so400m-patch14-384/model.safetensors
clip_vision/google--siglip-so400m-patch14-384/preprocessor_config.json
clip_vision/google--siglip-so400m-patch14-384/special_tokens_map.json
clip_vision/google--siglip-so400m-patch14-384/spiece.model
clip_vision/google--siglip-so400m-patch14-384/tokenizer.json
clip_vision/google--siglip-so400m-patch14-384/tokenizer_config.json
clip_vision/googlesiglip-so400m-patch14-384.safetensors
clip_vision/image_encoder.safetensors
clip_vision/put_clip_vision_models_here
clip_vision/sigclip_vision_patch14_384.safetensors
configs/anything_v3.yaml
configs/v1-inference.yaml
configs/v1-inference_clip_skip_2.yaml
configs/v1-inference_clip_skip_2_fp16.yaml
configs/v1-inference_fp16.yaml
configs/v1-inpainting-inference.yaml
configs/v2-inference-v.yaml
configs/v2-inference-v_fp32.yaml
configs/v2-inference.yaml
configs/v2-inference_fp32.yaml
configs/v2-inpainting-inference.yaml
controlnet/1.5/control_boxdepth_LooseControlfp16.safetensors
controlnet/1.5/control_v11p_sd15_lineart_fp16.safetensors
controlnet/FLUX.1-dev.safetensors
controlnet/FLUX.1/Shakker-Labs-ControlNet-Union-Pro/diffusion_pytorch_model.safetensors
controlnet/Flux.1-dev-Controlnet-Upscaler-JasperAI.safetensors
controlnet/InstantXQwen-Image-ControlNet-Inpaintingl.safetensors
controlnet/Qwen-Image-InstantX-ControlNet-Inpainting.safetensors
controlnet/SDXL/controlnet-canny-sdxl-1.0/diffusion_pytorch_model_V2.safetensors
controlnet/SDXL/instantid/diffusion_pytorch_model.safetensors
controlnet/controlnet-depth-sdxl-1.0.safetensors
controlnet/controlnet-union-sdxl-1.0.safetensors
controlnet/instantid/diffusion_pytorch_model.safetensors
controlnet/instantx_flux_canny.safetensors
controlnet/instantx_flux_depth.safetensors
controlnet/instantx_flux_union.safetensors
controlnet/mistoLine_rank256.safetensors
controlnet/put_controlnets_and_t2i_here
controlnet/sd3.5_large_controlnet_canny.safetensors
depthanything/depth_anything_v2_vitl_fp32.safetensors
diffusers/put_diffusers_models_here
diffusion_models/flux1-canny-dev.safetensors
diffusion_models/flux1-depth-dev.safetensors
diffusion_models/flux1-fill-dev.safetensors
diffusion_models/put_diffusion_model_files_here
diffusion_models/qwen_image_fp8_e4m3fn.safetensors
diffusion_models/wan2.2_i2v_high_noise_14B_fp8_scaled.safetensors
diffusion_models/wan2.2_i2v_low_noise_14B_fp8_scaled.safetensors
embeddings/put_embeddings_or_textual_inversion_concepts_here
gligen/put_gligen_models_here
hypernetworks/put_hypernetworks_here
inpaint/config.json
inpaint/diffusion_pytorch_model.safetensors
insightface/inswapper_128.onnx
insightface/models/buffalo_l/1k3d68.onnx
insightface/models/buffalo_l/2d106det.onnx
insightface/models/buffalo_l/det_10g.onnx
insightface/models/buffalo_l/genderage.onnx
insightface/models/buffalo_l/w600k_r50.onnx
instantid/SDXL/ip-adapter.bin
ipadapter-flux/InstantXFLUX.1-dev-IP-Adapter.bin
ipadapter/ip-adapter-full-face_sd15.safetensors
ipadapter/ip-adapter-plus-face_sd15.safetensors
ipadapter/ip-adapter-plus-face_sdxl_vit-h.safetensors
ipadapter/ip-adapter-plus_sd15.safetensors
ipadapter/ip-adapter-plus_sdxl_vit-h.safetensors
ipadapter/ip-adapter_sd15.safetensors
ipadapter/ip-adapter_sd15_light.safetensors
ipadapter/ip-adapter_sd15_light_v11.bin
ipadapter/ip-adapter_sd15_vit-G.safetensors
ipadapter/ip-adapter_sdxl.safetensors
ipadapter/ip-adapter_sdxl_vit-h.safetensors
latent_upscale_models/ltx-2-spatial-upscaler-x2-1.0.safetensors
latent_upscale_models/put_latent_upscale_models_here
llm_gguf/Mistral-7B-Instruct-v0.3-Q4_K_M.gguf
llm_gguf/Mistral-7B-Instruct-v0.3.IQ1_M.gguf
llm_gguf/Mistral-7B-Instruct-v0.3.IQ1_S.gguf
llm_gguf/Mistral-7B-Instruct-v0.3.IQ2_XS.gguf
loras/F.1 Booth design.safetensors
loras/F.1 JJsExhibition.safetensors
loras/F.1 S&P style 450.safetensors
loras/F.1 S&P style500.safetensors
loras/F.1 Simple linear booth.safetensors
loras/F.1 araminta.safetensors
loras/F.1 booth design V2.safetensors
loras/F.1 spiral structure exhibition.safetensors
loras/F.1 straight line chamfering - LED strip lighting .safetensors
loras/F.1-4-step-Lora.safetensors
loras/Flux/Flux Dev_lora_Flux Dev 4-step.safetensors
loras/Flux/flux-RealismLora.safetensors
loras/Flux/flux1-canny-dev-lora.safetensors
loras/Flux/flux1-depth-dev-lora.safetensors
loras/Qwen-Image-Lightning-4steps-V1.0.safetensors
loras/SDXL.v2 Hand-drawn flat illustration style large model.safetensors
loras/SDXL_concept_1.0.safetensors
loras/SDXL_illustrati.safetensors
loras/Wan2.2_Lightning_T2V_High.safetensors
loras/Wan2.2_Lightning_T2V_Low.safetensors
loras/ltx-2-19b-distilled-lora-384.safetensors
loras/ltx-2-19b-lora-camera-control-dolly-left.safetensors
loras/put_loras_here
loras/qwen/Qwen-Image-Lightning-4steps-V1.0.safetensors
loras/qwen/qwen_image_union_diffsynth_lora.safetensors
loras/wan2.2_i2v_lightx2v_4steps_lora_v1_high_noise.safetensors
loras/wan2.2_i2v_lightx2v_4steps_lora_v1_low_noise.safetensors
model_patches/DiffSynth-StudioQwen-Image-Blockwise-ControlNet-Depth.safetensors
model_patches/put_model_patches_here
model_patches/qwen_image_canny_diffsynth_controlnet.safetensors
photomaker/put_photomaker_models_here
prompt_generator/MiniCPM-V-2_6/README.md
rembg/u2net.onnx
sams/sam_vit_b_01ec64.pth
style_models/flux1-redux-dev.safetensors
style_models/put_t2i_style_model_here
style_models/sigclip_vision_patch14_384.safetensors
text_encoders/gemma_3_12B_it_fp4_mixed.safetensors
text_encoders/put_text_encoder_files_here
text_encoders/qwen_2.5_vl_7b_fp8_scaled.safetensors
text_encoders/t5xxl_fp8_e4m3fn.safetensors
text_encoders/umt5-xxl-enc-bf16.safetensors
text_encoders/umt5_xxl_fp8_e4m3fn_scaled.safetensors
ultralytics/bbox/face_yolov8m.pt
ultralytics/bbox/hand_yolov8s.pt
ultralytics/segm/person_yolov8m-seg.pt
unet/F.1 Inpainting-Alpha.safetensors
unet/Qwen-Image-Edit-2509-Q3_K_M.gguf
unet/diffusion_pytorch_model.fp16.safetensors
unet/flux1-dev-fp8.safetensors
unet/flux1-kontext-dev.safetensors
unet/flux1-schnell.safetensors
unet/fluxFillFP8_v10.safetensors
unet/hunyuan_3d_v2.1.safetensors
unet/put_unet_files_here
unet/qwen_image_edit_fp8_e4m3fn.safetensors
upscale_models/4x-UltraSharp.pth
upscale_models/4xRealWebPhoto_v4.pth
upscale_models/4x_NMKD-Siax_200k.pth
upscale_models/put_esrgan_and_other_upscale_models_here
vae/SDXL/sdxl_vae.safetensors
vae/Wan2_2_VAE_bf16.safetensors
vae/ae.safetensors
vae/ae.sft
vae/diffusion_pytorch_model.fp16 (1).safetensors
vae/flux-vae.sft
vae/flux1-dev.safetensors
vae/put_vae_here
vae/qwen_image_vae.safetensors
vae/wan22-vae.safetensors
vae/wan_2.1_vae.safetensors
vae_approx/put_taesd_encoder_pth_and_taesd_decoder_pth_here
vae_approx/taef1_decoder.safetensors
vae_approx/taef1_encoder.safetensors
vae_approx/taesd3_decoder.safetensors
vae_approx/taesd3_encoder.safetensors
vae_approx/taesd_decoder.safetensors
vae_approx/taesd_encoder.safetensors
vae_approx/taesdxl_decoder.safetensors
vae_approx/taesdxl_encoder.safetensors
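Any file path in the listing above can be turned into a direct download URL. The sketch below assumes the standard Hugging Face `resolve` URL layout and the repo id `Anton-Ding/model` shown in the header; the branch name `main` is an assumption.

```python
from urllib.parse import quote


def hf_file_url(repo_id: str, path: str, revision: str = "main") -> str:
    """Build the https URL that serves a raw file from a Hugging Face repo.

    quote() percent-encodes spaces and other unsafe characters in the
    path (several lora filenames above contain spaces) while leaving
    the '/' separators intact.
    """
    return f"https://huggingface.co/{repo_id}/resolve/{quote(revision)}/{quote(path)}"


url = hf_file_url("Anton-Ding/model", "vae/ae.safetensors")
print(url)
# → https://huggingface.co/Anton-Ding/model/resolve/main/vae/ae.safetensors
```

For large weights, the `huggingface_hub` library's `hf_hub_download` is the more robust route, since it handles caching and resumable transfers.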