Week 6: Classification basics (kNN)
Learning Objectives
By the end of this week, students will be able to:
- Frame a supervised classification problem and identify inputs and outputs
- Explain how k-nearest neighbors (kNN) makes predictions
- Evaluate a classifier using accuracy, confusion matrix, precision, and recall
- Describe how the choice of k affects model complexity (small k fits the training data closely and can overfit; large k smooths predictions and can underfit)
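The prediction rule in the objectives above can be sketched in a few lines: find the k training points nearest the query (here by Euclidean distance) and take a majority vote of their labels. The data below is a made-up toy example for illustration, not the lab dataset.

```python
from collections import Counter
import math

def knn_predict(X_train, y_train, x, k=3):
    # Distance from the query x to every training point (Euclidean).
    dists = sorted(
        (math.dist(x, xt), yt) for xt, yt in zip(X_train, y_train)
    )
    # Majority vote among the labels of the k nearest neighbors.
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy 2-D data: class 0 clusters near the origin, class 1 near (5, 5).
X_train = [(0, 0), (1, 0), (0, 1), (5, 5), (6, 5), (5, 6)]
y_train = [0, 0, 0, 1, 1, 1]

print(knn_predict(X_train, y_train, (0.5, 0.5), k=3))  # → 0
print(knn_predict(X_train, y_train, (5.5, 5.5), k=3))  # → 1
```

Note there is no training step: kNN simply stores the data, and all the work happens at prediction time. Re-running with k equal to the full training set size makes every prediction the overall majority class, which is the "large k underfits" end of the complexity trade-off.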
Perspectival Reading
Reading: TBD
Reflection Questions
- kNN defines similarity by distance in feature space — whose definition of similarity does that embed?
- Classification assigns people or things to categories. What is the cost of a wrong assignment, and who bears it?
- “Accuracy” treats all errors as equal. When is that assumption unjustified?
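The last question above can be made concrete with a small worked example (invented numbers, for illustration only): on an imbalanced dataset, a classifier that never predicts the positive class can score high accuracy while its recall is zero.

```python
# Imbalanced toy data: 95 negatives, 5 positives.
# A "classifier" that predicts negative (0) for every example.
y_true = [0] * 95 + [1] * 5
y_pred = [0] * 100

# Confusion-matrix cells.
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)

accuracy = (tp + tn) / len(y_true)             # 0.95 — looks strong
recall = tp / (tp + fn) if tp + fn else 0.0    # 0.0 — every positive missed
precision = tp / (tp + fp) if tp + fp else 0.0

print(accuracy, recall)  # → 0.95 0.0
```

If the positives are, say, patients with a disease, the 95% accuracy hides the fact that every case was missed; which metric matters depends on who bears the cost of each kind of error.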
Slides
Notebook Demo
Open in Google Colab (link TBD)
Lab Assignment
Week 6 Lab — GitHub Classroom (link TBD)