Abstract
This tutorial covers both theoretical and practical aspects of improving the efficiency and robustness of transformer-based approaches so that they can be used effectively in practical, high-scale, high-volume information retrieval (IR) scenarios. The tutorial is inspired and informed by our work with massive narrative datasets (8.5 billion medical notes), and by our basic research and academic experience with transformer-based IR tasks. Additionally, the tutorial covers techniques for making transformer-based IR robust against adversarial (AI) exploitation. This is a recent concern in the IR domain that we needed to take into account, and we want to share some of the lessons learned and applicable principles with our audience. Finally, an important, if not critical, element of this tutorial is its focus on didacticism: delivering tutorial content in a clear, intuitive, plain-spoken fashion. Transformers are a challenging subject, and through our teaching experience we have observed great value in, and a great need for, explaining all relevant aspects of this architecture and related principles in the most straightforward, precise, and intuitive manner. That is the defining style of our proposed tutorial.
| Original language | English |
|---|---|
| Title of host publication | SIGIR 2022 - Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval |
| Publisher | Association for Computing Machinery, Inc |
| Pages | 3433-3435 |
| Number of pages | 3 |
| ISBN (Electronic) | 9781450387323 |
| DOIs | |
| State | Published - Jul 6 2022 |
| Event | 45th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2022 - Madrid, Spain. Duration: Jul 11 2022 → Jul 15 2022 |
Publication series

| Name | SIGIR 2022 - Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval |
|---|---|
Conference

| Conference | 45th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2022 |
|---|---|
| Country/Territory | Spain |
| City | Madrid |
| Period | 07/11/22 → 07/15/22 |
Funding
This manuscript has been co-authored in part by UT-Battelle, LLC under Contract No. DE-AC05-00OR22725 with the U.S. Department of Energy, and under joint programs (MVP CHAMPION and VICTOR) between the U.S. Department of Energy (DOE) and the U.S. Department of Veterans Affairs (VA). Part of this (academic) research was supported by Google Research resources made available to Dr. Edmon Begoli.
Keywords
- attention-based computing
- information retrieval efficiency
- robustness
- transformer neural networks