Deep Learning with Applications in NLP

The course takes a tour of the main topics in Natural Language Processing (NLP), presenting both traditional and deep learning algorithmic approaches for text processing and analysis.

The grade is based on three components; to pass the course, each of these components must be completed in a proportion larger than 50%.

Week Course Lab Readings/Useful links
w1 Intro to NLP, basic text processing
(tokens, lemmas, stemming, edit distance, POS tagging),
lexical databases, corpora
slides Practice with WordNet and NLTK
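One of the w1 lab topics, edit distance, fits in a few lines of pure Python. A minimal sketch of Levenshtein distance via dynamic programming (the example words are illustrative, not from the course materials):

```python
def edit_distance(a: str, b: str) -> int:
    # Classic Levenshtein distance: prev[j] holds the distance
    # between the prefix a[:i-1] and the prefix b[:j].
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

print(edit_distance("kitten", "sitting"))  # 3 edits
```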
w2 Syntactic structure and dependency parsing slides Get familiar with syntactic and dependency parsing
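To get a feel for the w2 material, transition-based dependency parsing can be demonstrated by executing a hand-written arc-standard transition sequence (a hypothetical oracle for a toy sentence; real parsers learn to predict these transitions):

```python
def arc_standard(words, transitions):
    """Execute an arc-standard transition sequence and return
    the dependency arcs as (head, dependent) pairs."""
    stack, buffer, arcs = ["ROOT"], list(words), []
    for t in transitions:
        if t == "SHIFT":
            stack.append(buffer.pop(0))
        elif t == "LEFT-ARC":    # second-from-top depends on top
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        elif t == "RIGHT-ARC":   # top depends on second-from-top
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs

# Toy Romanian sentence "ana are mere" with a hand-written oracle.
arcs = arc_standard(["ana", "are", "mere"],
                    ["SHIFT", "SHIFT", "LEFT-ARC",
                     "SHIFT", "RIGHT-ARC", "RIGHT-ARC"])
```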
w3 Language modeling: statistical approaches slides Implement an N-gram language model for the Romanian language
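The w3 lab (an N-gram model for Romanian) reduces to counting and smoothing. A minimal bigram sketch with add-one smoothing, over a tiny illustrative corpus rather than a real Romanian one:

```python
from collections import Counter

def train_bigram_lm(sentences):
    """Count unigrams and bigrams over tokenized sentences,
    padding each sentence with <s> and </s> markers."""
    unigrams, bigrams, vocab = Counter(), Counter(), set()
    for sent in sentences:
        tokens = ["<s>"] + sent + ["</s>"]
        vocab.update(tokens)
        unigrams.update(tokens[:-1])
        bigrams.update(zip(tokens, tokens[1:]))
    return unigrams, bigrams, vocab

def bigram_prob(w_prev, w, unigrams, bigrams, vocab):
    # Add-one (Laplace) smoothed conditional probability P(w | w_prev).
    return (bigrams[(w_prev, w)] + 1) / (unigrams[w_prev] + len(vocab))

corpus = [["ana", "are", "mere"], ["ana", "are", "pere"]]
u, b, v = train_bigram_lm(corpus)
```

An unseen bigram like ("ana", "pere") still gets nonzero probability, which is the whole point of smoothing.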
w4 Machine translation slides Machine translation - compute BLEU
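The w4 lab asks for BLEU. A simplified single-reference, sentence-level sketch (geometric mean of modified n-gram precisions times a brevity penalty; libraries like NLTK add smoothing and multi-reference support):

```python
import math
from collections import Counter

def ngrams(tokens, n):
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU with uniform weights, single reference."""
    precisions = []
    for n in range(1, max_n + 1):
        cand, ref = ngrams(candidate, n), ngrams(reference, n)
        overlap = sum(min(c, ref[g]) for g, c in cand.items())
        precisions.append(overlap / max(sum(cand.values()), 1))
    if min(precisions) == 0:
        return 0.0
    # Brevity penalty: punish candidates shorter than the reference.
    bp = 1.0 if len(candidate) > len(reference) else \
        math.exp(1 - len(reference) / len(candidate))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)
```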
w5 Topic modeling slides Practice with LSA (SVD, NMF) and LDA for topic modeling An introduction to Latent Semantic Analysis
LSA web site
David Blei's web page on topic modeling
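The SVD step behind LSA can be sketched without any library by power iteration, recovering only the single strongest latent dimension (a rank-1 LSA on a made-up term-document matrix; the lab itself would use numpy/scikit-learn):

```python
def top_singular_docs(A, iters=50):
    """Power iteration on B = A^T A (A is a term-by-document count
    matrix, given as a list of rows) to get each document's coordinate
    on the dominant latent dimension."""
    n_docs = len(A[0])
    B = [[sum(A[t][i] * A[t][j] for t in range(len(A)))
          for j in range(n_docs)] for i in range(n_docs)]
    v = [1.0] * n_docs
    for _ in range(iters):
        w = [sum(B[i][j] * v[j] for j in range(n_docs)) for i in range(n_docs)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

A = [  # rows: cat, pet, stock, market; columns: four toy documents
    [2, 2, 0, 0],
    [2, 2, 0, 0],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
v = top_singular_docs(A)
```

The two "cat" documents end up with equal, large coordinates on the dominant dimension, while the weaker "finance" topic is squeezed toward zero; a full LSA keeps several such dimensions.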
w6 Text vectorization slides Visualizing word embeddings Multilingual universal encoder
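The simplest text vectorization from w6 is a count vector over a fixed vocabulary, compared with cosine similarity. A minimal sketch (the vocabulary and documents are illustrative):

```python
import math
from collections import Counter

def vectorize(tokens, vocab):
    # Bag-of-words count vector over a fixed vocabulary order.
    counts = Counter(tokens)
    return [counts[w] for w in vocab]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

vocab = ["ana", "are", "mere", "pere"]
d1 = vectorize(["ana", "are", "mere"], vocab)
d2 = vectorize(["ana", "are", "mere"], vocab)
d3 = vectorize(["pere"], vocab)
```

Dense learned embeddings (word2vec, the multilingual universal encoder linked above) replace these sparse counts but are compared the same way.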
w7 Recurrent Neural Networks slides
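The core recurrence from w7 is small enough to write out by hand. A pure-Python Elman RNN step with toy sizes and fixed (untrained) weights, just to show the state update h' = tanh(W_xh x + W_hh h + b):

```python
import math

def rnn_step(x, h, W_xh, W_hh, b):
    """One Elman RNN step, with matrices as lists of rows."""
    def matvec(M, v):
        return [sum(mi * vi for mi, vi in zip(row, v)) for row in M]
    pre = [p + q + r for p, q, r in zip(matvec(W_xh, x), matvec(W_hh, h), b)]
    return [math.tanh(p) for p in pre]

def run_rnn(xs, h0, W_xh, W_hh, b):
    # Unroll over the input sequence, carrying the hidden state.
    h = h0
    for x in xs:
        h = rnn_step(x, h, W_xh, W_hh, b)
    return h

W_xh = [[0.5, 0.0], [0.0, 0.5]]   # toy input-to-hidden weights
W_hh = [[0.1, 0.0], [0.0, 0.1]]   # toy hidden-to-hidden weights
h = run_rnn([[1, 0], [0, 1]], [0.0, 0.0], W_xh, W_hh, [0.0, 0.0])
```

Training (backpropagation through time) and the gating mechanisms of the w9 variants are layered on top of exactly this step.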
w9 RNN variants slides Project planning
Project list
An Empirical Exploration of Recurrent Network Architectures
w10 Text classification slides Practice with (Multinomial) NB and (Multinomial) LR
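The w10 lab's Multinomial Naive Bayes is a few lines of counting plus log-probabilities. A minimal sketch with add-one smoothing (the spam/ham toy data is illustrative):

```python
import math
from collections import Counter, defaultdict

def train_mnb(docs, labels):
    """Multinomial NB with add-one smoothing.
    docs: list of token lists; labels: parallel list of class names."""
    class_counts = Counter(labels)
    word_counts = defaultdict(Counter)
    vocab = set()
    for doc, y in zip(docs, labels):
        word_counts[y].update(doc)
        vocab.update(doc)
    priors = {y: math.log(c / len(docs)) for y, c in class_counts.items()}
    return priors, word_counts, vocab

def predict_mnb(doc, priors, word_counts, vocab):
    def score(y):
        total = sum(word_counts[y].values())
        return priors[y] + sum(
            math.log((word_counts[y][w] + 1) / (total + len(vocab)))
            for w in doc if w in vocab)
    return max(priors, key=score)

docs = [["buy", "cheap", "pills"], ["meeting", "at", "noon"],
        ["cheap", "pills", "now"], ["lunch", "meeting"]]
priors, wc, vocab = train_mnb(docs, ["spam", "ham", "spam", "ham"])
```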
w11 Transformers slides Project development
w12 Transformers, part 2 slides Project development
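The building block of the w11/w12 material, scaled dot-product attention, can be written out in pure Python for one head (toy vectors, no learned projections):

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V,
    with Q, K, V given as lists of row vectors."""
    d_k = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)  # one distribution per query
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# A query aligned with the first key attends almost entirely to V[0].
out = attention([[10.0, 0.0]], [[10.0, 0.0], [0.0, 10.0]],
                [[1.0, 0.0], [0.0, 1.0]])
```

A full transformer layer adds learned Q/K/V projections, multiple heads, residual connections, and feed-forward sublayers around this kernel.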
w13 Explainability in NLP slides Project development
w14 Project presentations