No model card has been provided yet.
niobures/video-models
Author: niobures
Tags: tf-keras
Downloads: 86
Likes: 0
Created: 2025-06-26 19:04:18+00:00
Updated: 2025-06-26 21:06:48+00:00
Files on Hugging Face (117):
.gitattributes
CLIP/ViT-B-16.pt
CLIP/ViT-B-32.pt
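The two CLIP checkpoints appear to be OpenAI's original JIT-scripted releases of ViT-B/16 and ViT-B/32 (an assumption based on the file names). A minimal sketch using the openai/CLIP package, whose `clip.load()` accepts a local checkpoint path as well as a model name; `frame.jpg` is a placeholder:

```python
import clip
import torch
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
# clip.load() takes either a model name or a path to a downloaded checkpoint
model, preprocess = clip.load("CLIP/ViT-B-32.pt", device=device)

image = preprocess(Image.open("frame.jpg")).unsqueeze(0).to(device)
text = clip.tokenize(["a dog", "a cat"]).to(device)

with torch.no_grad():
    logits_per_image, logits_per_text = model(image, text)
    probs = logits_per_image.softmax(dim=-1)  # image-to-text match probabilities
print(probs)
```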
LLaVA/Llava-v1.5-7B-GGUF/llava-v1.5-7b-Q2_K.gguf
LLaVA/Llava-v1.5-7B-GGUF/llava-v1.5-7b-mmproj-model-f16.gguf
LLaVA/lava_phi/.gitattributes
LLaVA/lava_phi/README.md
LLaVA/lava_phi/added_tokens.json
LLaVA/lava_phi/config.json
LLaVA/lava_phi/generation_config.json
LLaVA/lava_phi/merges.txt
LLaVA/lava_phi/model.safetensors
LLaVA/lava_phi/special_tokens_map.json
LLaVA/lava_phi/tokenizer.json
LLaVA/lava_phi/tokenizer_config.json
LLaVA/lava_phi/vocab.json
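The Q2_K GGUF plus the f16 mmproj file is the pairing that llama.cpp-style runtimes expect for LLaVA 1.5. A minimal sketch with llama-cpp-python's LLaVA 1.5 chat handler; the image path is a placeholder:

```python
import base64
from llama_cpp import Llama
from llama_cpp.llama_chat_format import Llava15ChatHandler

def to_data_uri(path: str) -> str:
    # llama-cpp-python accepts images as URLs or base64 data URIs
    with open(path, "rb") as f:
        return "data:image/jpeg;base64," + base64.b64encode(f.read()).decode()

chat_handler = Llava15ChatHandler(
    clip_model_path="LLaVA/Llava-v1.5-7B-GGUF/llava-v1.5-7b-mmproj-model-f16.gguf"
)
llm = Llama(
    model_path="LLaVA/Llava-v1.5-7B-GGUF/llava-v1.5-7b-Q2_K.gguf",
    chat_handler=chat_handler,
    n_ctx=4096,  # image tokens take context space, so leave headroom
)
out = llm.create_chat_completion(messages=[
    {"role": "user", "content": [
        {"type": "image_url", "image_url": {"url": to_data_uri("frame.jpg")}},
        {"type": "text", "text": "Describe what is happening in this frame."},
    ]},
])
print(out["choices"][0]["message"]["content"])
```

The MobileVLM and Qwen2-VL GGUF directories below follow the same model-plus-mmproj layout, but each needs a matching chat handler or CLI in the runtime you use; the LLaVA 1.5 handler is not guaranteed to fit them. The lava_phi folder is a separate checkpoint in Hugging Face safetensors format.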
LaViLa/EK-100_MIR/TSF-B/clip_openai_timesformer_base.ft_ek100_mir.ep_0085.md5sum_c67d95.pth
MobileVLM/MobileVLM-1.7B-GGUF/.gitattributes
MobileVLM/MobileVLM-1.7B-GGUF/MobileVLM-1.7B-Q4_K.gguf
MobileVLM/MobileVLM-1.7B-GGUF/MobileVLM-1.7B-Q5_K.gguf
MobileVLM/MobileVLM-1.7B-GGUF/MobileVLM-1.7B-Q6_K.gguf
MobileVLM/MobileVLM-1.7B-GGUF/MobileVLM-1.7B-mmproj-f16.gguf
MobileVLM/MobileVLM_V2-1.7B-GGUF/.gitattributes
MobileVLM/MobileVLM_V2-1.7B-GGUF/README.md
MobileVLM/MobileVLM_V2-1.7B-GGUF/ggml-model-q4_k.gguf
MobileVLM/MobileVLM_V2-1.7B-GGUF/mmproj-model-f16.gguf
Qwen2-VL/Qwen2-VL-2B-Instruct-GGUF/.gitattributes
Qwen2-VL/Qwen2-VL-2B-Instruct-GGUF/Qwen2-VL-2B-Instruct-Q3_K_L.gguf
Qwen2-VL/Qwen2-VL-2B-Instruct-GGUF/Qwen2-VL-2B-Instruct-Q4_K_M.gguf
Qwen2-VL/Qwen2-VL-2B-Instruct-GGUF/Qwen2-VL-2B-Instruct-Q6_K.gguf
Qwen2-VL/Qwen2-VL-2B-Instruct-GGUF/Qwen2-VL-2B-Instruct-Q8_0.gguf
Qwen2-VL/Qwen2-VL-2B-Instruct-GGUF/README.md
Qwen2-VL/Qwen2-VL-2B-Instruct-GGUF/mmproj-model-f32.gguf
YOLO/darknet19_448.conv.23
YOLO/yolo-v11/yolo11n.onnx
YOLO/yolo-v11/yolo11n.pt
YOLO/yolo-v11/yolo11n_saved_model/fingerprint.pb
YOLO/yolo-v11/yolo11n_saved_model/metadata.yaml
YOLO/yolo-v11/yolo11n_saved_model/saved_model.pb
YOLO/yolo-v11/yolo11n_saved_model/variables/variables.data-00000-of-00001
YOLO/yolo-v11/yolo11n_saved_model/variables/variables.index
YOLO/yolo-v11/yolo11n_saved_model/yolo11n_float16.tflite
YOLO/yolo-v11/yolo11n_saved_model/yolo11n_float32.tflite
YOLO/yolo-v11/yolo11s-cls.onnx
YOLO/yolo-v11/yolo11s-cls.pt
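The yolo-v11 folder carries the same nano detector in several export formats (PyTorch, ONNX, TF SavedModel, float16/float32 TFLite) plus a small classification variant. A minimal sketch with the ultralytics package, which runs the .pt and .onnx files directly; `frame.jpg` is a placeholder:

```python
from ultralytics import YOLO

det = YOLO("YOLO/yolo-v11/yolo11n.pt")         # PyTorch detection checkpoint
results = det("frame.jpg")
results[0].show()                              # draw boxes on the frame

det_onnx = YOLO("YOLO/yolo-v11/yolo11n.onnx")  # same model, ONNX export
cls = YOLO("YOLO/yolo-v11/yolo11s-cls.pt")     # classification variant
print(cls("frame.jpg")[0].probs.top5)          # top-5 class indices
```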
YOLO/yolo-v3/modelSelect.txt
YOLO/yolo-v3/openimages/openimages.names
YOLO/yolo-v3/openimages/yolov3-openimages.cfg
YOLO/yolo-v3/openimages/yolov3-openimages.weights
YOLO/yolo-v3/v3/coco.names
YOLO/yolo-v3/v3/yolov3.cfg
YOLO/yolo-v3/v3/yolov3.weights
YOLO/yolo-v3/yolov3-tiny/coco.names
YOLO/yolo-v3/yolov3-tiny/yolov3-tiny.cfg
YOLO/yolo-v3/yolov3-tiny/yolov3-tiny.weights
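These Darknet cfg/weights/names triples load with OpenCV's DNN module; the yolov3-tiny triple above and the yolov2-tiny-voc triples below follow the same pattern. A minimal sketch:

```python
import cv2

net = cv2.dnn.readNetFromDarknet("YOLO/yolo-v3/v3/yolov3.cfg",
                                 "YOLO/yolo-v3/v3/yolov3.weights")
with open("YOLO/yolo-v3/v3/coco.names") as f:
    classes = [line.strip() for line in f]

img = cv2.imread("frame.jpg")  # placeholder input
blob = cv2.dnn.blobFromImage(img, 1 / 255.0, (416, 416), swapRB=True, crop=False)
net.setInput(blob)
outs = net.forward(net.getUnconnectedOutLayersNames())
# each detection row: cx, cy, w, h, objectness, then one score per class
```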
YOLO/yolo-v8l (world-v2)/yolov8l-worldv2.pt
YOLO/yolo-world/yolo-world-s.pt
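yolov8l-worldv2.pt matches the open-vocabulary YOLOWorld class in ultralytics, where the class list is set as a text prompt at run time; yolo-world-s.pt looks like a checkpoint from the original YOLO-World codebase (an assumption) and may not load the same way. A sketch for the worldv2 file:

```python
from ultralytics import YOLOWorld

model = YOLOWorld("YOLO/yolo-v8l (world-v2)/yolov8l-worldv2.pt")
model.set_classes(["person", "bicycle", "traffic light"])  # open-vocabulary prompt
results = model.predict("frame.jpg")  # placeholder input
```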
YOLO/yoloface-8n/yoloface_8n.onnx
YOLO/yolov2-tiny-voc/voc.names
YOLO/yolov2-tiny-voc/yolov2-tiny-voc.cfg
YOLO/yolov2-tiny-voc/yolov2-tiny-voc.weights
YOLO/yolov2-tiny/voc.names
YOLO/yolov2-tiny/yolov2-tiny-voc.cfg
YOLO/yolov2-tiny/yolov2-tiny-voc.weights
YOLO/yolov8n_silu_coco_640x640_quant_tflite_edgetpu_1/labels_yolov8n_silu_coco.json
YOLO/yolov8n_silu_coco_640x640_quant_tflite_edgetpu_1/yolov8n_silu_coco_640x640_quant_tflite_edgetpu_1.json
YOLO/yolov8n_silu_coco_640x640_quant_tflite_edgetpu_1/yolov8n_silu_coco_640x640_quant_tflite_edgetpu_1.tflite
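This folder pairs a quantized TFLite model compiled for a Coral Edge TPU (the _edgetpu suffix) with JSON label and model manifests; the naming resembles a DeGirum model-zoo layout, which is an assumption. On a Linux host with libedgetpu installed, a minimal sketch:

```python
import json
from tflite_runtime.interpreter import Interpreter, load_delegate

base = "YOLO/yolov8n_silu_coco_640x640_quant_tflite_edgetpu_1"
interpreter = Interpreter(
    model_path=f"{base}/yolov8n_silu_coco_640x640_quant_tflite_edgetpu_1.tflite",
    experimental_delegates=[load_delegate("libedgetpu.so.1")],  # Edge TPU runtime
)
interpreter.allocate_tensors()
print(interpreter.get_input_details()[0]["shape"])  # expect a 640x640 input

with open(f"{base}/labels_yolov8n_silu_coco.json") as f:
    labels = json.load(f)
```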
microsoft-git/pytorch_model.bin
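microsoft-git holds only a bare pytorch_model.bin with no config, so it cannot be loaded with from_pretrained as-is; the weights presumably belong to one of the microsoft/git-* captioning models (an assumption). The state dict can at least be inspected:

```python
import torch

state_dict = torch.load("microsoft-git/pytorch_model.bin", map_location="cpu")
print(len(state_dict), next(iter(state_dict)))  # tensor count and first key
```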
onnx-community/Qwen2-VL-2B-Instruct/.gitattributes
onnx-community/Qwen2-VL-2B-Instruct/README.md
onnx-community/Qwen2-VL-2B-Instruct/added_tokens.json
onnx-community/Qwen2-VL-2B-Instruct/chat_template.json
onnx-community/Qwen2-VL-2B-Instruct/config.json
onnx-community/Qwen2-VL-2B-Instruct/generation_config.json
onnx-community/Qwen2-VL-2B-Instruct/merges.txt
onnx-community/Qwen2-VL-2B-Instruct/onnx/decoder_model_merged.onnx
onnx-community/Qwen2-VL-2B-Instruct/onnx/decoder_model_merged.onnx_data
onnx-community/Qwen2-VL-2B-Instruct/onnx/decoder_model_merged_bnb4.onnx
onnx-community/Qwen2-VL-2B-Instruct/onnx/decoder_model_merged_fp16.onnx
onnx-community/Qwen2-VL-2B-Instruct/onnx/decoder_model_merged_fp16.onnx_data
onnx-community/Qwen2-VL-2B-Instruct/onnx/decoder_model_merged_int8.onnx
onnx-community/Qwen2-VL-2B-Instruct/onnx/decoder_model_merged_q4.onnx
onnx-community/Qwen2-VL-2B-Instruct/onnx/decoder_model_merged_q4f16.onnx
onnx-community/Qwen2-VL-2B-Instruct/onnx/decoder_model_merged_quantized.onnx
onnx-community/Qwen2-VL-2B-Instruct/onnx/decoder_model_merged_uint8.onnx
onnx-community/Qwen2-VL-2B-Instruct/onnx/embed_tokens.onnx
onnx-community/Qwen2-VL-2B-Instruct/onnx/embed_tokens_bnb4.onnx
onnx-community/Qwen2-VL-2B-Instruct/onnx/embed_tokens_fp16.onnx
onnx-community/Qwen2-VL-2B-Instruct/onnx/embed_tokens_int8.onnx
onnx-community/Qwen2-VL-2B-Instruct/onnx/embed_tokens_q4.onnx
onnx-community/Qwen2-VL-2B-Instruct/onnx/embed_tokens_q4f16.onnx
onnx-community/Qwen2-VL-2B-Instruct/onnx/embed_tokens_quantized.onnx
onnx-community/Qwen2-VL-2B-Instruct/onnx/embed_tokens_uint8.onnx
onnx-community/Qwen2-VL-2B-Instruct/onnx/vision_encoder.onnx
onnx-community/Qwen2-VL-2B-Instruct/onnx/vision_encoder.onnx_data
onnx-community/Qwen2-VL-2B-Instruct/onnx/vision_encoder_bnb4.onnx
onnx-community/Qwen2-VL-2B-Instruct/onnx/vision_encoder_bnb4.onnx_data
onnx-community/Qwen2-VL-2B-Instruct/onnx/vision_encoder_fp16.onnx
onnx-community/Qwen2-VL-2B-Instruct/onnx/vision_encoder_int8.onnx
onnx-community/Qwen2-VL-2B-Instruct/onnx/vision_encoder_q4.onnx
onnx-community/Qwen2-VL-2B-Instruct/onnx/vision_encoder_q4.onnx_data
onnx-community/Qwen2-VL-2B-Instruct/onnx/vision_encoder_q4f16.onnx
onnx-community/Qwen2-VL-2B-Instruct/onnx/vision_encoder_quantized.onnx
onnx-community/Qwen2-VL-2B-Instruct/onnx/vision_encoder_uint8.onnx
onnx-community/Qwen2-VL-2B-Instruct/preprocessor_config.json
onnx-community/Qwen2-VL-2B-Instruct/tokenizer.json
onnx-community/Qwen2-VL-2B-Instruct/tokenizer_config.json
onnx-community/Qwen2-VL-2B-Instruct/vocab.json
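This is onnx-community's multi-precision export of Qwen2-VL-2B-Instruct (fp16, int8/uint8, q4, q4f16, and bnb4 variants), primarily aimed at Transformers.js. The .onnx_data files hold external weights and must stay next to their matching .onnx file. A minimal onnxruntime sketch that just opens a session and inspects its inputs:

```python
import onnxruntime as ort

# the matching .onnx_data external-weight file (if any) must sit in the
# same directory for the session to resolve it
sess = ort.InferenceSession(
    "onnx-community/Qwen2-VL-2B-Instruct/onnx/decoder_model_merged_q4.onnx",
    providers=["CPUExecutionProvider"],
)
for i in sess.get_inputs():
    print(i.name, i.shape)
```

Driving the full pipeline (embed_tokens, then vision_encoder, then the merged decoder with its KV cache) takes considerably more plumbing; Transformers.js does that wiring for you.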
transnet-v2/test_data/bbc_02_clip.mp4
transnet-v2/test_data/bbc_02_clip.mp4.truth.txt
transnet-v2/test_data/shot_boundaries.txt
transnet-v2/transnetv2-weights/latest.pth
transnet-v2/transnetv2-weights/saved_model.pb
transnet-v2/transnetv2-weights/transnetv2-pytorch-weights.pth
transnet-v2/transnetv2-weights/transnetv2.onnx
transnet-v2/transnetv2-weights/variables/variables.data-00000-of-00001
transnet-v2/transnetv2-weights/variables/variables.index
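transnet-v2 bundles the TransNetV2 shot-boundary detector as a TF SavedModel, PyTorch weights, and an ONNX export, together with a small test clip and its ground-truth boundaries. A sketch against the SavedModel, assuming the published TransNetV2 input convention (windows of 100 RGB frames resized to 48x27):

```python
import numpy as np
import tensorflow as tf

model = tf.saved_model.load("transnet-v2/transnetv2-weights")
# [batch, time, height, width, channels]; real use slides overlapping
# 100-frame windows over the decoded video
frames = np.zeros([1, 100, 27, 48, 3], dtype=np.uint8)
logits, extras = model(tf.cast(frames, tf.float32))
probs = tf.sigmoid(logits)  # per-frame shot-boundary probability
```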