🗓️ Week 07 - Introduction to unsupervised learning and dimensionality reduction
Theme: Unsupervised Learning
Important
🚧 This website is under construction. Pages will keep being updated over the next few weeks!
Welcome to the seventh week of this course!
This week, we start our exploration of unsupervised learning:
- we discover what differentiates it from supervised learning and in which cases it is used
- we learn about the first family of unsupervised learning techniques: dimensionality reduction. We'll explain what dimensionality reduction is and what it is used for, before introducing the most common and arguably best-known dimensionality reduction algorithm: PCA (Principal Component Analysis). We also take a look at a non-linear dimensionality reduction algorithm: UMAP.
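To make the idea of dimensionality reduction concrete, here is a minimal sketch of PCA using only NumPy, computed via the singular value decomposition. The synthetic dataset, variable names, and the choice of 2 components are illustrative assumptions, not part of the course material; the lecture notebook may use a library implementation such as scikit-learn instead.

```python
import numpy as np

# Synthetic toy data (assumption, for illustration only):
# 100 samples of 5 observed features driven by 2 underlying factors.
rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 2))       # 2 hidden factors
mixing = rng.normal(size=(2, 5))         # map factors to 5 observed features
X = latent @ mixing + 0.05 * rng.normal(size=(100, 5))  # small noise

# PCA via SVD: center the data, then the rows of Vt are the
# principal components (orthogonal directions of maximal variance).
X_centered = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)

# Project the data onto the first 2 principal components.
X_2d = X_centered @ Vt[:2].T

# Fraction of total variance explained by each component.
explained = (S ** 2) / np.sum(S ** 2)

print(X_2d.shape)  # (100, 2)
print(explained)   # sorted in decreasing order; first 2 dominate here
```

Because the toy data is generated from 2 latent factors plus small noise, the first two components capture almost all of the variance; this is exactly the situation where projecting from 5 dimensions down to 2 loses very little information.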
Your second formative is due at the very beginning of this week (04 Mar at 5pm)! Your first summative (due week ~~09~~ 10) will also be released this week (after the lecture).
👩🏻‍🏫 Lecture Material
🎥 Looking for lecture recordings? You can only find those on Moodle, typically a day after the lecture. If you can't find the recordings, please contact 📧.
Download the Jupyter notebook for this lecture:
Download the financial crises dataset for this lecture: