# Learning Sequential Data

## Overview

Human lives are but unsolved sequences, and of course we want AI to learn them!

## Outline

### Previous Talks

- Spring 2019 : Sequences and Language Models
- Fall 2019 : Graphs and Stuff

#### Resources

##### Attention and Language Models

###### Papers

- Transformers : Attention is all you need
- The Annotated Transformer
- Pretraining Language Models
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

###### Blogs and Tutorials

- Understanding Attention Mechanism
- Lil'Log: Attention
- Pytorch Transformer Tutorial
- Attention Mechanism and Memory Networks: skymind.ai

###### Vids

- Attention Model: deeplearning.ai
- Attention and Memory in Deep Learning: DeepMind, Google
- Attention Paper Review
- BERT Explained
- BERT: TDLS
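As a companion to the attention resources above, here is a minimal sketch of scaled dot-product attention, the core operation in "Attention Is All You Need", written in NumPy; the shapes and variable names are illustrative, not taken from any of the linked tutorials.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n_q, n_k) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # each row sums to 1
    return weights @ V, weights

# toy example: 3 queries attend over 4 key/value pairs of dimension 8
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, 8)), rng.normal(size=(4, 8)), rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(Q, K, V)
```

The output is a weighted average of the value vectors, where each query's weights form a probability distribution over the keys.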

##### Graphs

###### Papers

- Graphs Survey 1
- Graphs Survey 2
- Graphs Survey 3
- Representation Learning on Graphs: Methods and Applications
- More Graph papers

###### Blogs and Tutorials

- Graph Convolution Neural Network
- Stanford Network Representations
- What is network representation learning and why is it important?
- Learning low-dimensional embeddings of nodes in complex networks (e.g., DeepWalk and node2vec).
- Techniques for deep learning on network/graph structured data (e.g., graph convolutional networks and GraphSAGE).
- Applications of network representation learning for recommender systems and computational biology.
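The DeepWalk and node2vec approaches listed above learn node embeddings by feeding truncated random walks into a word2vec-style model. The first ingredient is a walk generator like this pure-Python sketch; the toy graph is hypothetical.

```python
import random

def random_walk(graph, start, length, seed=None):
    """Truncated random walk over an adjacency-list graph, as in DeepWalk."""
    rng = random.Random(seed)
    walk = [start]
    for _ in range(length - 1):
        neighbors = graph[walk[-1]]
        if not neighbors:          # dead end: stop the walk early
            break
        walk.append(rng.choice(neighbors))
    return walk

# toy undirected graph as adjacency lists
graph = {"a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b", "d"], "d": ["c"]}
walks = [random_walk(graph, node, length=5, seed=i)
         for i, node in enumerate(graph)]
```

The walks are then treated as "sentences" of node IDs and passed to a skip-gram model, exactly as in word2vec.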

###### Vids

### Recurrent Neural Nets

- Original Recurrent Net
- RNN in Numpy and Passage Generation
- Pytorch RNN Beginner 1
- Pytorch RNN Beginner 2
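In the spirit of the "RNN in Numpy" entry above, the forward pass of a vanilla RNN is just one tanh update per time step. A minimal sketch with illustrative sizes:

```python
import numpy as np

def rnn_forward(xs, h0, Wxh, Whh, bh):
    """Vanilla RNN: h_t = tanh(Wxh @ x_t + Whh @ h_{t-1} + bh)."""
    h, hs = h0, []
    for x in xs:                      # iterate over time steps
        h = np.tanh(Wxh @ x + Whh @ h + bh)
        hs.append(h)
    return np.stack(hs)               # (T, hidden_size) hidden states

rng = np.random.default_rng(0)
T, in_size, hid = 6, 4, 5
xs = rng.normal(size=(T, in_size))
params = (rng.normal(size=(hid, in_size)) * 0.1,   # input-to-hidden weights
          rng.normal(size=(hid, hid)) * 0.1,       # hidden-to-hidden weights
          np.zeros(hid))                           # hidden bias
hs = rnn_forward(xs, np.zeros(hid), *params)
```

Because the same weights are reused at every step, gradients flow through many repeated matrix products, which is exactly the vanishing-gradient problem LSTMs (next section) were designed to address.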

### Long Short Term Memory

- Original LSTM Tutorial - Jürgen Schmidhuber
- Understanding LSTMs
- LSTM in Numpy from Scratch
- Pytorch LSTM Beginner
- Pytorch LSTM Seq2Seq
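To accompany the "LSTM in Numpy from Scratch" entry, here is a sketch of a single LSTM step: three sigmoid gates (input, forget, output) plus a tanh candidate, all computed from one stacked weight matrix. Sizes and initialization are illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, b):
    """One LSTM step: input/forget/output gates plus a candidate cell update."""
    z = W @ np.concatenate([x, h]) + b      # all four pre-activations in one matmul
    H = h.shape[0]
    i, f, o = sigmoid(z[:H]), sigmoid(z[H:2*H]), sigmoid(z[2*H:3*H])
    g = np.tanh(z[3*H:])                    # candidate cell state
    c = f * c + i * g                       # forget old memory, add new
    h = o * np.tanh(c)                      # expose gated view of the cell
    return h, c

rng = np.random.default_rng(0)
in_size, H = 3, 4
W = rng.normal(size=(4 * H, in_size + H)) * 0.1   # stacked gate weights
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.normal(size=(5, in_size)):           # run 5 time steps
    h, c = lstm_step(x, h, c, W, b)
```

The additive cell update `c = f * c + i * g` is the key design choice: it gives gradients a path that avoids the repeated squashing of the vanilla RNN.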

### Language Models

### Word Vectors

- Word2vec - Original Paper
- Training Millions of Embedding Models
- What are word embeddings
- Understanding word2vec
- Word Embedding Applications
- Word2vec using Gensim - Tutorial
- How to use Pre-Trained Word2vec Embeddings in Pytorch
- ELMo - Deep contextualized word representations
- CoVe - Learned in Translation: Contextualized Word Vectors
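Most of the word2vec material above starts from the same preprocessing step: turning a corpus into (center, context) pairs within a sliding window for skip-gram training. A minimal pure-Python sketch (the window size and sentence are illustrative):

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs for a skip-gram model."""
    pairs = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:                       # skip the center word itself
                pairs.append((center, tokens[j]))
    return pairs

pairs = skipgram_pairs("the quick brown fox".split(), window=1)
# → [('the', 'quick'), ('quick', 'the'), ('quick', 'brown'),
#    ('brown', 'quick'), ('brown', 'fox'), ('fox', 'brown')]
```

A skip-gram model is then trained to predict the context word from the center word; libraries such as Gensim handle this whole pipeline given tokenized sentences.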