Transformers for Information Retrieval
People involved:
About this project
To be updated
Methods for Effective Passage Ranking: TILDE and TILDEv2
To be updated
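At a high level, TILDE precomputes token-level query likelihoods for each passage at indexing time, so re-ranking at query time reduces to looking up and summing the stored log-likelihoods of the query's terms. The sketch below illustrates only that scoring step; the `index` mapping and the floor value for unseen terms are illustrative assumptions, not the released implementation.

```python
from typing import Dict, List


def tilde_score(query_terms: List[str], token_log_likelihoods: Dict[str, float]) -> float:
    """Sum the precomputed log-likelihoods of the query's terms for one passage."""
    # A flat floor stands in for terms missing from the stored likelihoods;
    # in practice the precomputed likelihoods cover the model's whole vocabulary.
    return sum(token_log_likelihoods.get(t, -10.0) for t in query_terms)


def tilde_rerank(query_terms: List[str], index: Dict[str, Dict[str, float]]) -> List[str]:
    """Re-rank passage ids by their summed query-term log-likelihoods."""
    scores = {pid: tilde_score(query_terms, tll) for pid, tll in index.items()}
    return sorted(scores, key=scores.get, reverse=True)


# Toy usage with made-up precomputed likelihoods for two passages.
ranking = tilde_rerank(
    ["passage", "ranking"],
    {
        "p1": {"passage": -1.2, "ranking": -0.8},
        "p2": {"passage": -2.0, "retrieval": -1.0},
    },
)
print(ranking)  # ['p1', 'p2']
```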
Methods for Effective Passage Ranking: QLM-T5
To be updated
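The deep query likelihood model ranks a passage by how likely a sequence-to-sequence language model is to generate the query from it. A minimal sketch using Hugging Face's T5 is below; the `t5-base` checkpoint, the absence of any prompt, and the use of the negative mean cross-entropy as the score are assumptions for illustration, not necessarily the project's exact setup.

```python
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

# "t5-base" is only a placeholder checkpoint; the project's model may differ.
tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base").eval()


def query_log_likelihood(passage: str, query: str) -> float:
    """Negative mean cross-entropy of generating the query from the passage."""
    inputs = tokenizer(passage, return_tensors="pt", truncation=True)
    labels = tokenizer(query, return_tensors="pt", truncation=True).input_ids
    with torch.no_grad():
        loss = model(**inputs, labels=labels).loss
    return -loss.item()


passages = [
    "TILDE precomputes term likelihoods so re-ranking needs no query-time inference.",
    "BM25 is a classic bag-of-words ranking function.",
]
query = "fast passage re-ranking"
ranked = sorted(passages, key=lambda p: query_log_likelihood(p, query), reverse=True)
```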
Exploiting Relevance Feedback with Transformer-based Pre-trained LMs
To be updated
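One common way to exploit (pseudo-)relevance feedback with transformer encoders is a Rocchio-style update that shifts the dense query representation towards the embeddings of the feedback passages. The sketch below shows that update on toy vectors; the function name, weights, and simple averaging are illustrative assumptions rather than this project's method.

```python
import numpy as np


def rocchio_prf(query_emb: np.ndarray, feedback_embs: np.ndarray,
                alpha: float = 1.0, beta: float = 0.5) -> np.ndarray:
    """Move the query embedding towards the centroid of the feedback passages."""
    return alpha * query_emb + beta * feedback_embs.mean(axis=0)


# Toy vectors standing in for transformer-encoded query and passage embeddings.
q = np.array([0.1, 0.9, 0.0])
fb = np.array([[0.2, 0.8, 0.1], [0.0, 1.0, 0.2]])
new_q = rocchio_prf(q, fb)
```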
Do Dense Retrievers Require Interpolation with Bag-of-Words?
To be updated
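The ICTIR 2021 paper listed under Relevant Publications studies interpolating dense-retriever scores with BM25. A minimal sketch of such a linear interpolation is below; the min-max normalization and the default alpha are illustrative choices and may not match the paper's exact configuration.

```python
from typing import Dict


def min_max(scores: Dict[str, float]) -> Dict[str, float]:
    """Rescale one system's scores to [0, 1] so the two systems are comparable."""
    lo, hi = min(scores.values()), max(scores.values())
    return {pid: (s - lo) / (hi - lo) if hi > lo else 0.0 for pid, s in scores.items()}


def interpolate(dense: Dict[str, float], bm25: Dict[str, float],
                alpha: float = 0.5) -> Dict[str, float]:
    """alpha * dense + (1 - alpha) * BM25, over the union of retrieved passages."""
    d, b = min_max(dense), min_max(bm25)
    return {pid: alpha * d.get(pid, 0.0) + (1 - alpha) * b.get(pid, 0.0)
            for pid in set(d) | set(b)}


# Toy run-time fusion of two (made-up) result lists.
fused = interpolate({"p1": 12.3, "p2": 10.1}, {"p1": 5.2, "p3": 7.9}, alpha=0.6)
```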
Relevant Publications
Transformer-based Rankers
- Shengyao Zhuang and Guido Zuccon. 2021. TILDE: Term Independent Likelihood moDEl for Passage Re-ranking. In The 44th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR '21).
- Shengyao Zhuang and Guido Zuccon. 2021. Fast Passage Re-ranking with Contextualized Exact Term Matching and Efficient Passage Expansion. arXiv preprint.
- Shengyao Zhuang and Guido Zuccon. 2021. Dealing with Typos for BERT-based Passage Retrieval and Ranking. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP'21).
- Shengyao Zhuang, Hang Li, and Guido Zuccon. 2021. Deep Query Likelihood Model for Information Retrieval. In The 43rd European Conference on IR Research (ECIR 2021).
Dense Retrievers
- Shuai Wang, Shengyao Zhuang, and Guido Zuccon. 2021. BERT-based Dense Retrievers Require Interpolation with BM25 for Effective Passage Retrieval. In Proceedings of the 2021 ACM SIGIR International Conference on Theory of Information Retrieval (ICTIR 2021).