50.040 Natural Language Processing

Fall 2025 (Term 7) | Singapore University of Technology and Design (SUTD)

Announcements

Announcements will be made on the eDimension platform and via email.

Teaching Team

Instructor: Prof. Wenxuan Zhang
  • Email: wxzhang@sutd.edu.sg
  • Office: 1.602-17
Teaching Assistants:
  • Chen Huang (chen_huang@mymail.sutd.edu.sg)
  • Shaoyang Xu (shaoyang_xu@mymail.sutd.edu.sg)
  • Yuhao Wu (wu_yuhao@mymail.sutd.edu.sg)

Grading Policy

Note: More details regarding the grading policy will be discussed in the first lecture.
  • Attendance & Participation: 8% (+2% survey bonus)
  • Assignments (Individual): 30% (10% each)
  • In-class Quizzes: 30% (15% each)
  • Final (Group) Project: 30%

Class Information

Lectures: Room 2.505
  • Monday: 1:00PM – 3:00PM
  • Tuesday: 1:30PM – 3:30PM
  • Note: Lectures will not be recorded or streamed, so please attend in person!
Cohorts: Room 1.416
  • CI01: Tuesday 9:30AM – 10:30AM
  • CI02: Thursday 4:00PM – 5:00PM
  • Note: Cohort sessions run only in certain weeks; check the schedule below.

Course Schedule

Note: The schedule is tentative and subject to change!
Materials: Lecture slides and cohort materials can be found on the eDimension platform.
Week 1 (15-Sep)
  • Topics: Course Logistics; Introduction to NLP

Week 2 (22-Sep)
  • Topics: Recap on ML/NN; Word Vectors
  • Readings:
      1. Stanford CS231n note on neural network basics and backpropagation
      2. Efficient Estimation of Word Representations in Vector Space (original word2vec paper)
      3. The Illustrated Word2vec
      4. word2vec Parameter Learning Explained
  • Deadline: Release A1

Week 3 (29-Sep)
  • Topics: Words and Tokens; Language Models
  • Readings:
      1. BPE tutorial by Hugging Face
      2. N-gram Language Models
  • Cohort: Word2Vec Tutorial
  • Deadline: A1 Due

Week 4 (6-Oct)
  • Topics: Quiz-1; RNNs and Variants
  • Readings:
      1. The Unreasonable Effectiveness of Recurrent Neural Networks
      2. Understanding LSTM Networks

Week 5 (13-Oct)
  • Topics: Seq2Seq & Attention 1; Seq2Seq & Attention 2
  • Readings:
      1. Sequence to Sequence (seq2seq) and Attention (with good visualizations)
  • Deadline: Release A2

Week 6 (20-Oct)
  • Topics: Practical Tips for NLP Projects; Project Announcement
  • Readings:
      1. Practical Methodology (Deep Learning book chapter)
  • Cohort: Transformer Tutorial
  • Deadline: A2 Due

Week 7 (27-Oct)
  • Recess Week

Week 8 (3-Nov)
  • Topics: Transformer; LLM Pre-training
  • Readings:
      1. Attention Is All You Need
      2. The Illustrated Transformer
      3. The Annotated Transformer
      4. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

Week 9 (10-Nov)
  • Topics: Project Proposal Presentations; Post-training (SFT/RLHF)
  • Readings:
      1. Aligning language models to follow instructions (InstructGPT)
      2. Scaling Instruction-Finetuned Language Models (Flan-T5)
  • Cohort: LLM Tutorial

Week 10 (17-Nov)
  • Topics: Efficient Adaptation; Quiz-2
  • Readings:
      1. Language Models are Few-Shot Learners
      2. Chain-of-Thought Prompting Elicits Reasoning in Large Language Models
      3. LoRA: Low-Rank Adaptation of Large Language Models
  • Deadline: Release A3

Week 11 (24-Nov)
  • Topics: Evaluation; Guest Lecture on Test-Time Compute and RL
  • Cohort: Help on Project
  • Deadline: A3 Due

Week 12 (1-Dec)
  • Topics: Advanced Topic 3; Advanced Topic 4

Week 13 (8-Dec)
  • Topics: Final Project Presentations