cross-encoder/ms-marco-TinyBERT-L-2-v2


Cross-Encoder for MS Marco

This model was trained on the MS Marco Passage Ranking task.
The model can be used for Information Retrieval: given a query, score the query against all candidate passages (e.g., retrieved with ElasticSearch), then sort the passages in decreasing order of score. See SBERT.net Retrieve & Re-rank for more details. The training code is available here: SBERT.net Training MS Marco


Usage with Transformers

from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

model = AutoModelForSequenceClassification.from_pretrained('model_name')
tokenizer = AutoTokenizer.from_pretrained('model_name')

# Tokenize (query, passage) pairs: the first list holds the queries, the second the passages.
features = tokenizer(
    ['How many people live in Berlin?', 'How many people live in Berlin?'],
    ['Berlin has a population of 3,520,031 registered inhabitants in an area of 891.82 square kilometers.',
     'New York City is famous for the Metropolitan Museum of Art.'],
    padding=True, truncation=True, return_tensors="pt")

model.eval()
with torch.no_grad():
    # One relevance logit per (query, passage) pair; higher means more relevant.
    scores = model(**features).logits
    print(scores)
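The logits above are unnormalized relevance scores. If scores in (0, 1) are preferred, a sigmoid can be applied, a common convention for single-logit cross-encoders; the values in this sketch are hypothetical:

import torch

# Hypothetical logits, shaped like the output of the block above (one logit per pair).
scores = torch.tensor([[9.1], [-8.3]])

# Sigmoid is monotonic, so it rescales to (0, 1) without changing the ranking.
print(torch.sigmoid(scores))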


Usage with SentenceTransformers

The usage is easier when SentenceTransformers is installed. Then, you can use the pre-trained model like this:

from sentence_transformers import CrossEncoder

model = CrossEncoder('model_name', max_length=512)
# Each item is a (query, passage) pair; predict returns one relevance score per pair.
scores = model.predict([('Query', 'Paragraph1'), ('Query', 'Paragraph2'), ('Query', 'Paragraph3')])
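Putting the pieces together, the typical end-to-end pattern is retrieve-then-re-rank: a fast first-stage retriever (e.g., ElasticSearch) produces candidate passages, and the cross-encoder re-scores and sorts them, as described above. A minimal sketch of the re-ranking step, assuming candidates were already retrieved (the query and passages below are illustrative):

from sentence_transformers import CrossEncoder

model = CrossEncoder('cross-encoder/ms-marco-TinyBERT-L-2-v2', max_length=512)

query = 'How many people live in Berlin?'
# In practice these would come from a first-stage retriever such as ElasticSearch.
candidate_passages = [
    'New York City is famous for the Metropolitan Museum of Art.',
    'Berlin has a population of 3,520,031 registered inhabitants.',
    'Berlin is the capital of Germany.',
]

# Score every (query, passage) pair, then sort passages in decreasing order of score.
scores = model.predict([(query, passage) for passage in candidate_passages])
reranked = sorted(zip(candidate_passages, scores), key=lambda pair: pair[1], reverse=True)

for passage, score in reranked:
    print(f'{score:.2f}\t{passage}')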


Performance

In the following table, we provide various pre-trained Cross-Encoders together with their performance on the TREC Deep Learning 2019 benchmark and the MS Marco Passage Reranking dev set.
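For reference, MRR@10 (used on the MS Marco dev set) is the mean, over all queries Q, of the reciprocal rank of the first relevant passage, counted only if it appears in the top 10:

\mathrm{MRR@10} = \frac{1}{|Q|} \sum_{q \in Q} \frac{1}{\mathrm{rank}_q}, \qquad \text{with } \frac{1}{\mathrm{rank}_q} := 0 \text{ if } \mathrm{rank}_q > 10

NDCG@10 similarly rewards placing relevant passages near the top of the ranking, with a logarithmic discount by position.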

| Model-Name | NDCG@10 (TREC DL 19) | MRR@10 (MS Marco Dev) | Docs / Sec |
| --- | --- | --- | --- |
| Version 2 models | | | |
| cross-encoder/ms-marco-TinyBERT-L-2-v2 | 69.84 | 32.56 | 9000 |
| cross-encoder/ms-marco-MiniLM-L-2-v2 | 71.01 | 34.85 | 4100 |
| cross-encoder/ms-marco-MiniLM-L-4-v2 | 73.04 | 37.70 | 2500 |
| cross-encoder/ms-marco-MiniLM-L-6-v2 | 74.30 | 39.01 | 1800 |
| cross-encoder/ms-marco-MiniLM-L-12-v2 | 74.31 | 39.02 | 960 |
| Version 1 models | | | |
| cross-encoder/ms-marco-TinyBERT-L-2 | 67.43 | 30.15 | 9000 |
| cross-encoder/ms-marco-TinyBERT-L-4 | 68.09 | 34.50 | 2900 |
| cross-encoder/ms-marco-TinyBERT-L-6 | 69.57 | 36.13 | 680 |
| cross-encoder/ms-marco-electra-base | 71.99 | 36.41 | 340 |
| Other models | | | |
| nboost/pt-tinybert-msmarco | 63.63 | 28.80 | 2900 |
| nboost/pt-bert-base-uncased-msmarco | 70.94 | 34.75 | 340 |
| nboost/pt-bert-large-msmarco | 73.36 | 36.48 | 100 |
| Capreolus/electra-base-msmarco | 71.23 | 36.89 | 340 |
| amberoad/bert-multilingual-passage-reranking-msmarco | 68.40 | 35.54 | 330 |
| sebastian-hofstaetter/distilbert-cat-margin_mse-T2-msmarco | 72.82 | 37.88 | 720 |

