[2010.06467] Pretrained Transformers for Text Ranking: BERT and Beyond



Computer Science > Information Retrieval
arXiv:2010.06467 (cs)
[Submitted on 13 Oct 2020 (v1), last revised 19 Aug 2021 (this version, v3)]

Title: Pretrained Transformers for Text Ranking: BERT and Beyond
Authors: Jimmy Lin, Rodrigo Nogueira, Andrew Yates

Abstract: The goal of text ranking is to generate an ordered list of texts retrieved from a corpus in response to a query.
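The ranking task the survey covers is commonly realized with a pretrained transformer used as a cross-encoder re-ranker: the query and each candidate text are scored jointly, and candidates are sorted by score. The sketch below illustrates this pattern, assuming the Hugging Face `transformers` library and an MS MARCO cross-encoder checkpoint; the specific model name and example texts are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of cross-encoder re-ranking with a pretrained transformer.
# Assumes the Hugging Face `transformers` library; the checkpoint name is an
# assumed example, not one prescribed by the survey.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "cross-encoder/ms-marco-MiniLM-L-6-v2"  # assumed checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

query = "what causes rainbows"
candidates = [
    "Rainbows are caused by refraction and dispersion of sunlight in droplets.",
    "The stock market closed higher on Tuesday.",
]

# Encode each (query, candidate) pair jointly; the model emits one relevance
# logit per pair, so higher score means more relevant.
inputs = tokenizer(
    [query] * len(candidates),
    candidates,
    padding=True,
    truncation=True,
    return_tensors="pt",
)
with torch.no_grad():
    scores = model(**inputs).logits.squeeze(-1)

# Sort candidates by descending score to produce the ordered list.
ranked = sorted(zip(candidates, scores.tolist()), key=lambda x: -x[1])
for doc, score in ranked:
    print(f"{score:8.3f}  {doc}")
```

In practice such a re-ranker is applied only to a short candidate list produced by a cheaper first-stage retriever, since scoring every corpus document with a transformer is prohibitively expensive.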
