README
license: mit

Note: the README.md for dmmagdal/distilgpt2-onnx-js-quantized contains only a YAML frontmatter with the license (MIT) and no other content; the Hugging Face page states that the file exists but is otherwise empty.
dmmagdal/distilgpt2-onnx-js-quantized
Author: dmmagdal
text-generation
transformers
Downloads: 0
Likes: 0
Created: 2024-01-06 19:27:10+00:00
Updated: 2024-01-06 19:30:02+00:00
Files (16) (view on Hugging Face)
.gitattributes
README.md
config.json
generation_config.json
merges.txt
onnx/decoder_model.onnx
onnx/decoder_model_merged.onnx
onnx/decoder_model_merged_quantized.onnx
onnx/decoder_model_quantized.onnx
onnx/decoder_with_past_model.onnx
onnx/decoder_with_past_model_quantized.onnx
quantize_config.json
special_tokens_map.json
tokenizer.json
tokenizer_config.json
vocab.json
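The onnx/ directory layout above (plain, `_with_past`, `_merged`, and `*_quantized` decoder variants) is the structure Transformers.js looks for when loading a text-generation model in JavaScript. A minimal sketch of how this repo might be used, assuming the `@xenova/transformers` package (Transformers.js v2 `pipeline` API) and network access to the Hub; the repo id comes from this page, while the `decoderFile` helper is purely illustrative (the library resolves the file itself):

```javascript
// Illustrative helper: maps options to the decoder file this repo ships
// under onnx/. Transformers.js performs this resolution internally; this
// just documents the naming convention visible in the file list above.
function decoderFile({ quantized = true, withPast = false } = {}) {
  const base = withPast ? 'decoder_with_past_model' : 'decoder_model';
  return `onnx/${base}${quantized ? '_quantized' : ''}.onnx`;
}

// Downloads the model from the Hugging Face Hub and generates text
// (requires network; assumes the '@xenova/transformers' package is installed).
async function generate(prompt) {
  const { pipeline } = await import('@xenova/transformers');
  const generator = await pipeline(
    'text-generation',
    'dmmagdal/distilgpt2-onnx-js-quantized',
    { quantized: true } // load the *_quantized.onnx weights
  );
  const [result] = await generator(prompt, { max_new_tokens: 20 });
  return result.generated_text;
}
```

The quantized files trade a small amount of generation quality for a much smaller download, which is the usual choice for running DistilGPT-2 in a browser.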