Spring 2023 Reading Group
A reading group on Transformers held in the lab during Spring 2023
Reading Group Tips
In a reading group, everyone takes turns leading the discussion of a paper each week. Leading a discussion can be as simple as sharing your own annotated notes on Hypothes.is to seed discussion as we go through the paper together, or as involved as making slides that present your overview of the paper's contributions, highlights, and weak points.
Spring 2023 - Transformers Reading Group
Motivation
AI research has been underway since the dawn of computer science itself, and Deep Learning has seen an uninterrupted, accelerating wave of advancing capabilities for over a decade since the public breakthrough of CNNs in 2012. Yet many people, including AI/ML researchers, have been surprised by the abilities of the generative models released since summer 2022 by OpenAI, Facebook, Google, and others. These recent systems all rely, in various ways, on the Transformer model (Vaswani et al., 2017).
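As a brief refresher before diving into the readings, the core computation shared by all of these systems is scaled dot-product attention, as defined in Vaswani et al. (2017):

$$\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d_k}}\right) V$$

where $Q$, $K$, and $V$ are the query, key, and value matrices and $d_k$ is the key dimension; the $\sqrt{d_k}$ scaling keeps the softmax inputs well-behaved as the dimensionality grows.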
Resources
This GitHub page has a fairly extensive list of papers and references on the topic, so it seems as good a place as any to start:
See the links and notes for papers we have covered in previous meetings, find the link for the next paper, or look at the planned upcoming and potential future papers; feel free to suggest others or changes to the order.
Jump to stage: next ~ done ~ upcoming ~ potential
next
done
- [7] Yi Tay, Mostafa Dehghani, Jai Prakash Gupta, Vamsi Aribandi, Dara Bahri, Zhen Qin, and Donald Metzler. Are Pretrained Convolutions Better than Pretrained Transformers? In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers). Association for Computational Linguistics, Online. Aug 2021.