BERT Annotated Paper

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. The revolutionary paper from Google that raised state-of-the-art performance across a variety of NLP tasks and served as a stepping stone for many later architectures. This paper led the way and set a direction for the entire domain, showing clear benefits of pre-trained models (trained on huge datasets) and transfer learning, independent of the downstream task. Please feel free to read along with the paper with my notes and highlights. ...

June 18, 2021 · 2 min · Akshay Uppal

MLP-Mixer Annotated Paper

MLP-MIXER: An all-MLP Architecture for Vision. This very recent paper challenges the need for complicated transformer-based models on huge datasets and questions the inductive biases built into current image-recognition architectures. It argues that, given a huge dataset (100M+ images), traditional CNN-based architectures and the newer transformer-based architectures are only marginally better than a classic MLP-based architecture, thus calling into question the inductive biases of both CNNs and Transformers for images. ...
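To make the idea concrete, here is a minimal, untrained NumPy sketch of one Mixer layer: a token-mixing MLP applied across patches, then a channel-mixing MLP applied across features, each with a skip connection and layer norm. The dimensions and random weights are illustrative only, and ReLU stands in for the paper's GELU for brevity.

```python
import numpy as np

def layer_norm(x, eps=1e-6):
    # normalize each row (each token's channel vector)
    mu = x.mean(-1, keepdims=True)
    sd = x.std(-1, keepdims=True)
    return (x - mu) / (sd + eps)

def mlp(x, w1, w2):
    # two-layer MLP; ReLU used here instead of the paper's GELU
    return np.maximum(x @ w1, 0) @ w2

def mixer_layer(x, tok_w1, tok_w2, ch_w1, ch_w2):
    # x: (tokens, channels) table of patch embeddings
    # token-mixing: MLP shared across channels, applied along the token axis
    y = x + mlp(layer_norm(x).T, tok_w1, tok_w2).T
    # channel-mixing: MLP shared across tokens, applied along the channel axis
    return y + mlp(layer_norm(y), ch_w1, ch_w2)

# illustrative sizes, not from the paper
tokens, channels, hidden = 16, 32, 64
rng = np.random.default_rng(0)
x = rng.standard_normal((tokens, channels))
out = mixer_layer(
    x,
    0.02 * rng.standard_normal((tokens, hidden)),
    0.02 * rng.standard_normal((hidden, tokens)),
    0.02 * rng.standard_normal((channels, hidden)),
    0.02 * rng.standard_normal((hidden, channels)),
)
print(out.shape)  # (16, 32) — same shape in, same shape out
```

Note that neither MLP looks at spatial structure directly: the only "mixing" across patches is the transpose before the token-mixing MLP, which is exactly the paper's point about doing without convolution and attention.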

May 26, 2021 · 1 min · Akshay Uppal

PICK Annotated Paper

PICK: Processing Key Information Extraction from Documents using Improved Graph Learning-Convolutional Networks. This paper addresses KIE (Key Information Extraction): extracting the text of a number of key fields from given documents and saving it in structured form. It proposes a solution that overcomes existing problems by fully and efficiently exploiting both the textual and visual features of documents, yielding a richer semantic representation that is crucial for extraction. ...

May 24, 2021 · 1 min · Akshay Uppal

Attention Is All You Need Annotated Paper

Attention Is All You Need. One of the most game-changing papers of recent times. This paper introduces the transformer architecture and explains the self-attention mechanism. Please feel free to read along with the paper with my notes and highlights.

Color legend:
- Green: Topics about the current paper
- Yellow: Topics about other relevant references
- Blue: Implementation details / maths
- Red: My thoughts, questions, and understandings

Citation:
@misc{vaswani2017attention,
  title={Attention Is All You Need},
  author={Ashish Vaswani and Noam Shazeer and Niki Parmar and Jakob Uszkoreit and Llion Jones and Aidan N. Gomez and Lukasz Kaiser and Illia Polosukhin},
  year={2017},
  eprint={1706.03762},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
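Since the entry mentions the self-attention mechanism, here is a minimal NumPy sketch of the paper's scaled dot-product attention, softmax(QK^T / sqrt(d_k))V. The sequence length, dimension, and random inputs are illustrative, not from the paper.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # (seq_q, seq_k) similarity scores
    # numerically stable row-wise softmax
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output row is a weighted sum of value rows

# toy example: 3 query/key/value vectors of dimension 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4)
```

The 1/sqrt(d_k) scaling keeps the dot products from growing with dimension, which would otherwise push the softmax into regions with vanishing gradients; multi-head attention in the paper simply runs several of these in parallel on projected inputs.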

May 23, 2021 · 1 min · Akshay Uppal