# ONNX export of adapter AdapterHub/bert-base-uncased-pf-hotpotqa (for bert-base-uncased)

Conversion of AdapterHub/bert-base-uncased-pf-hotpotqa for UKP SQuARE.
## Usage
```python
from huggingface_hub import hf_hub_download
from onnxruntime import InferenceSession

# Download the exported model (use filename='model_quant.onnx' for the quantized version)
onnx_path = hf_hub_download(repo_id='UKP-SQuARE/bert-base-uncased-pf-hotpotqa-onnx', filename='model.onnx')
onnx_model = InferenceSession(onnx_path, providers=['CPUExecutionProvider'])
```
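A standard extractive-QA head, as used for span-based tasks like HotpotQA, produces per-token start and end logits; the predicted answer is the highest-scoring valid (start, end) token pair. Below is a minimal NumPy sketch of that post-processing step, assuming the exported model returns start/end logits in the usual format; the function name `decode_answer_span` and the `max_answer_len` cutoff are illustrative, not part of the exported model's API:

```python
import numpy as np

def decode_answer_span(start_logits, end_logits, max_answer_len=30):
    """Pick the (start, end) token pair with the highest combined score,
    subject to start <= end and end - start < max_answer_len."""
    start_logits = np.asarray(start_logits, dtype=np.float64)
    end_logits = np.asarray(end_logits, dtype=np.float64)
    n = len(start_logits)
    # Outer sum: scores[i, j] = start_logits[i] + end_logits[j]
    scores = start_logits[:, None] + end_logits[None, :]
    # Keep only spans with end >= start that are shorter than max_answer_len
    valid = np.triu(np.ones((n, n), dtype=bool))
    valid &= ~np.triu(np.ones((n, n), dtype=bool), k=max_answer_len)
    scores = np.where(valid, scores, -np.inf)
    start, end = np.unravel_index(np.argmax(scores), scores.shape)
    return int(start), int(end)
```

The argmax over the outer sum is the same greedy span selection used by common QA pipelines; the token indices it returns are then mapped back to text via the tokenizer's offsets.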
## Architecture & Training
The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, the training configurations for all tasks can be found [here](https://github.com/adapter-hub/efficient-task-transfer/tree/master/run_configs).
## Evaluation results

For more information on the results, refer to [the paper](https://arxiv.org/pdf/2104.08247).
## Citation

If you use this adapter, please cite our paper ["What to Pre-Train on? Efficient Intermediate Task Selection"](https://arxiv.org/pdf/2104.08247):
```bibtex
@inproceedings{poth-etal-2021-pre,
    title = "{W}hat to Pre-Train on? {E}fficient Intermediate Task Selection",
    author = {Poth, Clifton and
      Pfeiffer, Jonas and
      R{\"u}ckl{\'e}, Andreas and
      Gurevych, Iryna},
    booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
    month = nov,
    year = "2021",
    address = "Online and Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.emnlp-main.827",
    pages = "10585--10605",
}
```