
deepset/gbert-base-germandpr-question_encoder



Overview

Language model: gbert-base-germandpr
Language: German
Training data: GermanDPR train set (~ 56MB)
Eval data: GermanDPR test set (~ 6MB)
Infrastructure: 4x V100 GPU
Published: Apr 26th, 2021


Details

  • We trained a dense passage retrieval model with two gbert-base models as encoders of questions and passages.
  • The dataset is GermanDPR, a new, German language dataset, which we hand-annotated and published online.
  • It comprises 9275 question/answer pairs in the training set and 1025 pairs in the test set.
    For each pair, there are one positive context and three hard negative contexts (see the example record sketched after this list).
  • As the basis of the training data, we used our hand-annotated GermanQuAD dataset as positive samples and generated hard negative samples from the latest German Wikipedia dump (6GB of raw txt files).
  • The data dump was cleaned with tailored scripts, leading to 2.8 million indexed passages from German Wikipedia.

See https://deepset.ai/germanquad for more details and dataset download.
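
For orientation, a single training record follows the usual DPR-style JSON layout, with one positive context and three hard negative contexts per question. The sketch below (written as a Python literal) illustrates that layout; the field names follow the common DPR convention and the values are made-up placeholders, not an excerpt from the dataset.

# Illustrative GermanDPR-style training record (placeholder values, DPR-style field names).
example_record = {
    "question": "Wie hoch ist die Zugspitze?",
    "answers": ["2962 Meter"],
    "positive_ctxs": [
        {"title": "Zugspitze", "text": "Die Zugspitze ist mit 2962 m der höchste Berg Deutschlands ..."},
    ],
    "hard_negative_ctxs": [
        {"title": "Watzmann", "text": "Der Watzmann ist ein Bergmassiv in den Berchtesgadener Alpen ..."},
        {"title": "Feldberg", "text": "Der Feldberg ist der höchste Berg des Schwarzwaldes ..."},
        {"title": "Brocken", "text": "Der Brocken ist der höchste Berg des Harzes ..."},
    ],
}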


Hyperparameters

batch_size = 40
n_epochs = 20
num_training_steps = 4640
num_warmup_steps = 460
max_seq_len = 32 tokens for question encoder and 300 tokens for passage encoder
learning_rate = 1e-6
lr_schedule = LinearWarmup
embeds_dropout_prob = 0.1
num_hard_negatives = 2
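
As a rough illustration of how these settings map onto code, a training run via Haystack's DensePassageRetriever could look like the sketch below. The data directory and file names are placeholders, and the parameter names reflect the Haystack v1 train() API; treat the snippet as an assumption-laden sketch, not the exact script used for this model.

from haystack.document_stores import InMemoryDocumentStore
from haystack.nodes import DensePassageRetriever

# Start from two German gbert-base checkpoints as question and passage encoders.
retriever = DensePassageRetriever(
    document_store=InMemoryDocumentStore(),
    query_embedding_model="deepset/gbert-base",
    passage_embedding_model="deepset/gbert-base",
    max_seq_len_query=32,     # max_seq_len for the question encoder
    max_seq_len_passage=300,  # max_seq_len for the passage encoder
)

# Train on GermanDPR with the hyperparameters listed above (paths and file names are placeholders).
retriever.train(
    data_dir="data/germandpr",
    train_filename="GermanDPR_train.json",
    dev_filename="GermanDPR_dev.json",
    n_epochs=20,
    batch_size=40,
    num_hard_negatives=2,
    learning_rate=1e-6,
    num_warmup_steps=460,
    save_dir="saved_models/gbert-base-germandpr",
)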


Performance

During training, we monitored the in-batch average rank and the loss, and we evaluated different batch sizes, numbers of epochs, and numbers of hard negatives on a dev set split from the train set.
The dev split contained 1030 question/answer pairs.
Even without thorough hyperparameter tuning, we observed stable learning; multiple restarts with different seeds produced very similar results.
Note that the in-batch average rank is influenced by the settings for batch size and number of hard negatives: a smaller number of hard negatives makes the task easier.
After fixing the hyperparameters, we trained the model on the full GermanDPR train set.
We further evaluated the retrieval performance of the trained model on the full German Wikipedia with the GermanDPR test set as labels. To this end, we converted the GermanDPR test set to SQuAD format. The DPR model drastically outperforms the BM25 baseline with regard to recall@k.
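
Here, recall@k is simply the fraction of test questions for which at least one gold passage appears among the top-k retrieved passages. A minimal sketch of that computation, with hypothetical inputs, is:

def recall_at_k(retrieved_ids, gold_ids, k):
    # retrieved_ids: one ranked list of passage ids per question
    # gold_ids: one set of gold (positive) passage ids per question
    hits = sum(
        1 for ranked, gold in zip(retrieved_ids, gold_ids)
        if gold.intersection(ranked[:k])
    )
    return hits / len(retrieved_ids)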


Usage


In haystack

You can load the model in haystack as a retriever for doing QA at scale:
retriever = DensePassageRetriever(
    document_store=document_store,
    query_embedding_model="deepset/gbert-base-germandpr-question_encoder",
    passage_embedding_model="deepset/gbert-base-germandpr-ctx_encoder",
)
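
Once a document store holds your passages, a minimal end-to-end flow with this retriever might look like the sketch below. An in-memory store, placeholder passages, and a placeholder query stand in for a real setup, and the names and parameters follow the Haystack v1 API as an assumption:

from haystack.document_stores import InMemoryDocumentStore
from haystack.nodes import DensePassageRetriever

# Placeholder passages; in practice you would index your own corpus.
document_store = InMemoryDocumentStore(embedding_dim=768)
document_store.write_documents([
    {"content": "Berlin ist die Hauptstadt von Deutschland."},
    {"content": "Die Zugspitze ist der höchste Berg Deutschlands."},
])

retriever = DensePassageRetriever(
    document_store=document_store,
    query_embedding_model="deepset/gbert-base-germandpr-question_encoder",
    passage_embedding_model="deepset/gbert-base-germandpr-ctx_encoder",
)

# Compute and store passage embeddings, then retrieve the top passages for a question.
document_store.update_embeddings(retriever)
results = retriever.retrieve(query="Was ist die Hauptstadt von Deutschland?", top_k=2)
for doc in results:
    print(doc.content, doc.score)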


Authors

  • Timo Möller: timo.moeller [at] deepset.ai
  • Julian Risch: julian.risch [at] deepset.ai
  • Malte Pietsch: malte.pietsch [at] deepset.ai


About us

We bring NLP to the industry via open source!
Our focus: Industry-specific language models & large-scale QA systems.
Some of our work:

  • German BERT (aka “bert-base-german-cased”)
  • GermanQuAD and GermanDPR datasets and models (aka “gelectra-base-germanquad”, “gbert-base-germandpr”)
  • FARM
  • Haystack

Get in touch:
Twitter | LinkedIn | Website
By the way: we’re hiring!

