Intro to ML: K-Nearest Neighbours 2024

Assessment • Josiah Wang • Computers • University • Medium

10 questions
1.
Multiple Choice
In K-NN, is the query time longer than the training time?
Yes
No
Answer explanation
Recall K-NN algorithms are lazy learners!
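A minimal sketch of why query time dominates in a lazy learner (class and method names are our own, purely illustrative): "training" merely stores the data, while every distance computation, sort, and vote happens at prediction time.

```python
import math
from collections import Counter

class LazyKNN:
    """Illustrative k-NN classifier: a lazy learner does no work at fit time."""

    def fit(self, X, y):
        # "Training" is just storing the data -- no model is built.
        self.X, self.y = list(X), list(y)
        return self

    def predict(self, q, k=3):
        # All the real work happens at query time: a distance to every stored
        # point, a sort, and a majority vote over the k nearest neighbours.
        dists = sorted(
            (math.dist(q, x), label) for x, label in zip(self.X, self.y)
        )
        votes = Counter(label for _, label in dists[:k])
        return votes.most_common(1)[0][0]
```

With n stored points, each query costs O(n) distance computations, which is why query time exceeds (essentially zero) training time.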
2.
Multiple Choice
James is trying to decide which machine learning algorithm to use for his project. Which of the following options is true about the k-NN algorithm?
It can only be used for classification
It can only be used for regression
It can be used for both classification and regression
Answer explanation
Regression: the prediction is the average VALUE of the k nearest neighbours
Classification: the prediction is the majority CLASS among the k nearest neighbours
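Both modes share the same neighbour search and differ only in how the neighbours' targets are combined. A small sketch (function names are our own, not from the quiz):

```python
import math

def k_nearest(X, y, q, k):
    # Targets of the k training points closest to query q (Euclidean distance).
    order = sorted(range(len(X)), key=lambda i: math.dist(X[i], q))
    return [y[i] for i in order[:k]]

def knn_classify(X, y, q, k=3):
    # Classification: majority CLASS among the k neighbours.
    labels = k_nearest(X, y, q, k)
    return max(set(labels), key=labels.count)

def knn_regress(X, y, q, k=3):
    # Regression: mean VALUE of the k neighbours.
    values = k_nearest(X, y, q, k)
    return sum(values) / len(values)
```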
3.
Multiple Choice
Which of the following is true about Manhattan distances?
It can be used for continuous variables
It can be used for categorical variables
It can be used for categorical as well as continuous variables
Answer explanation
Manhattan distance is the L1-norm and therefore cannot be used for categorical variables. Hamming distance would be a good choice for categorical variables.
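The contrast is easy to see in code: the L1-norm sums absolute numeric differences (meaningless for categories), while Hamming distance just counts mismatching positions. A quick sketch:

```python
def manhattan(a, b):
    # L1-norm: sum of absolute coordinate differences (continuous features).
    return sum(abs(x - y) for x, y in zip(a, b))

def hamming(a, b):
    # Count of positions where two categorical vectors disagree.
    return sum(x != y for x, y in zip(a, b))
```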
4.
Multiple Choice
Poppy is working on a machine learning project using k-NN and has a dataset with a lot of noise. What are the appropriate things to do with k-NN in this situation?
Increase the value of K
Decrease the value of K
K does not depend on the noise
None of these
Answer explanation
Recall that when K=1, the decision boundary perfectly fits the data, including its noise, which results in overfitting. Increasing K makes K-NN more resilient to overfitting.
5.
Multiple Choice
In K-NN, what is the effect of increasing/decreasing the value of K?
The boundary becomes smoother with increasing values of K
Smoothness of the boundary does not depend on the value of K
The boundary becomes smoother with decreasing values of K
None of these
Answer explanation
Think about what happens when K=1. Increasing K makes K-NN less affected by noise, since a larger number of neighbours is considered when making a decision.
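The smoothing effect of a larger K can be demonstrated with a tiny self-contained example (the data and helper below are our own illustration): a single mislabelled "noise" point flips the K=1 prediction, but a majority over five neighbours recovers the true class.

```python
import math
from collections import Counter

def knn_predict(X, y, q, k):
    # Majority class among the k points of X nearest to q (Euclidean).
    order = sorted(range(len(X)), key=lambda i: math.dist(X[i], q))
    votes = Counter(y[i] for i in order[:k])
    return votes.most_common(1)[0][0]

# Two well-separated clusters, plus one mislabelled "noise" point
# (labelled 1) sitting in the middle of cluster 0.
X = [[0, 0], [0, 1], [1, 0], [1, 1], [0.6, 0.6], [5, 5], [5, 6], [6, 5]]
y = [0,      0,      0,      0,      1,           1,      1,      1]

q = [0.5, 0.5]                       # query deep inside cluster 0
jagged = knn_predict(X, y, q, k=1)   # nearest point is the noise point
smooth = knn_predict(X, y, q, k=5)   # majority of 5 neighbours outvotes it
```

With K=1 the boundary wraps tightly around the noise point; with K=5 the vote of the surrounding cluster dominates, which is exactly the "smoother boundary" described above.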