Intro to ML: K-Nearest Neighbours 2024
Quiz • Josiah Wang • Computers • University
10 questions • Medium
1.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
In K-NN, is the query time longer than the training time?
Yes
No
Answer explanation
Recall that k-NN algorithms are lazy learners: training merely stores the data, while answering a query requires computing the distance to every stored point.
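To make this concrete, here is a minimal timing sketch (an illustration, not from the quiz; it assumes NumPy and scikit-learn, uses synthetic data, and forces the brute-force backend so that fit really does no work beyond storing the data):

import time
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = rng.normal(size=(50_000, 20)), rng.integers(0, 2, 50_000)  # synthetic data
queries = rng.normal(size=(1_000, 20))

model = KNeighborsClassifier(n_neighbors=5, algorithm="brute")
t0 = time.perf_counter()
model.fit(X, y)                      # lazy: just stores the training set
t1 = time.perf_counter()
model.predict(queries)               # computes distances to all 50,000 points
t2 = time.perf_counter()
print(f"fit: {t1 - t0:.4f}s   predict: {t2 - t1:.4f}s")  # predict >> fit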
2.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
James is trying to decide which machine learning algorithm to use for his project. Which of the following options is true about the k-NN algorithm?
It can only be used for classification
It can only be used for regression
It can be used for both classification and regression
Answer explanation
Regression: predicts the average of the VALUES of the k nearest neighbours.
Classification: predicts the majority CLASS among the k nearest neighbours.
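A minimal sketch covering both modes (illustrative only, assuming NumPy; knn_predict is a hypothetical helper, not code from the lecture):

import numpy as np

def knn_predict(X, y, query, k=3, task="classification"):
    # Indices of the k training points nearest to the query
    idx = np.argsort(np.linalg.norm(np.asarray(X, dtype=float) - query, axis=1))[:k]
    neighbours = np.asarray(y)[idx]
    if task == "regression":
        return neighbours.mean()                   # average of neighbour VALUES
    labels, counts = np.unique(neighbours, return_counts=True)
    return labels[np.argmax(counts)]               # majority neighbour CLASS

X = [[0.0], [1.0], [2.0], [3.0]]
print(knn_predict(X, ["a", "a", "b", "b"], np.array([0.4])))                       # 'a'
print(knn_predict(X, [10.0, 12.0, 20.0, 22.0], np.array([0.4]), task="regression"))  # 14.0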
3.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
Which of the following is true about Manhattan distances?
It can be used for continuous variables
It can be used for categorical variables
It can be used for categorical as well as continuous variables
Answer explanation
Manhattan distance is the L1-norm and therefore cannot be used for categorical variables. Hamming distance would be a good choice for categorical variables.
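For illustration, both distances in a few lines (assuming NumPy; these helpers are sketches, not library code):

import numpy as np

def manhattan(a, b):
    # L1 norm: sum of absolute coordinate differences (continuous features)
    return np.abs(np.asarray(a, dtype=float) - np.asarray(b, dtype=float)).sum()

def hamming(a, b):
    # Count of positions where two categorical vectors disagree
    return sum(ai != bi for ai, bi in zip(a, b))

print(manhattan([1.0, 2.5], [4.0, 0.5]))             # |1-4| + |2.5-0.5| = 5.0
print(hamming(["red", "round"], ["blue", "round"]))  # 1 mismatch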
4.
MULTIPLE CHOICE QUESTION
2 mins • 1 pt
Poppy is working on a machine learning project using k-NN and has a dataset with a lot of noise. What are the appropriate things to do with k-NN in this situation?
Increase the value of K
Decrease the value of K
K does not depend on the noise
None of these
Answer explanation
Recall that when K=1 the decision boundary perfectly matches the data, noise included, which results in overfitting. Increasing K makes k-NN more resilient to noise and therefore to overfitting.
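A tiny illustration of the noisy-label case (made-up 1-D data, assuming NumPy):

import numpy as np

# 1-D toy set with one mislabeled ("noisy") point at x = 0.9
X = np.array([[0.0], [0.2], [0.8], [0.9], [1.0], [1.1]])
y = np.array([0, 0, 1, 0, 1, 1])          # the 0 at x = 0.9 is label noise

def knn_vote(q, k):
    idx = np.argsort(np.abs(X[:, 0] - q))[:k]
    return np.bincount(y[idx]).argmax()    # majority class among k neighbours

print(knn_vote(0.9, k=1))  # 0: K=1 copies the noisy label (overfits)
print(knn_vote(0.9, k=5))  # 1: the noisy label is outvoted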
5.
MULTIPLE CHOICE QUESTION
2 mins • 1 pt
In K-NN, what is the effect of increasing/decreasing the value of K?
The boundary becomes smoother with increasing values of K
Smoothness of the boundary does not depend on the value of K
The boundary becomes smoother with decreasing values of K
None of these
Answer explanation
Think about what happens when K=1. Increasing K makes k-NN less affected by noise, since a larger number of neighbours is consulted for each decision, and the boundary becomes smoother as a result.
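One way to see the smoothing effect (a sketch on synthetic data, assuming NumPy and scikit-learn): probe predictions along a line through the overlap between two noisy clusters and count how often the predicted class flips; fewer flips means a smoother boundary.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, (100, 2)),    # class 0 cluster
               rng.normal(+1.0, 1.0, (100, 2))])   # class 1 cluster
y = np.array([0] * 100 + [1] * 100)

probe = np.column_stack([np.linspace(-3, 3, 500), np.zeros(500)])
for k in (1, 25):
    pred = KNeighborsClassifier(n_neighbors=k).fit(X, y).predict(probe)
    flips = int((pred[1:] != pred[:-1]).sum())
    print(f"k = {k:2d}: predicted class flips {flips} time(s) along the probe")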
6.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
For embedded applications (i.e. running on a smartphone), what is the most appropriate family of algorithm?
Eager learners
Lazy learners
Answer explanation
Eager learners are desirable because all the heavy computation occurs at training time, so the algorithm is time- and compute-efficient at inference time.
7.
MULTIPLE CHOICE QUESTION
2 mins • 1 pt
Given d the distance between a point of the dataset and the query point, which of the following weight functions is appropriate for Distance-Weighted k-NN?
w = exp( -d )
w = log ( min ( 0.25 * d, 1 ) )
w = -d
Answer explanation
Recall that the weight assigned to a neighbour should decrease as its distance grows and should stay non-negative. Both w = -d and w = log(min(0.25*d, 1)) yield negative (or at best zero) weights, which does not make sense. w = exp(-d) is a good weight function: it is always positive and decays exponentially with distance, so it favours points close by and quickly ignores points far away.
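A sketch of distance-weighted voting with w = exp(-d) (illustrative, assuming NumPy; weighted_knn_classify is a hypothetical helper):

import numpy as np

def weighted_knn_classify(X, y, query, k=5):
    d = np.linalg.norm(np.asarray(X, dtype=float) - query, axis=1)
    idx = np.argsort(d)[:k]
    weights = np.exp(-d[idx])              # always positive, decays with distance
    votes = {}
    for label, w in zip(np.asarray(y)[idx], weights):
        votes[label] = votes.get(label, 0.0) + w
    return max(votes, key=votes.get)       # class with the largest total weight

Note that a single very close neighbour can outweigh several distant ones, which is exactly the point of the weighting.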
8.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
Which of the following statements is true for k-NN classifiers?
The classification accuracy is better with larger values of k
The decision boundary is linear
The decision boundary is smoother with smaller values of k
k-NN does not require an explicit training step
Answer explanation
The true statement is that k-NN does not require an explicit training step: it is a lazy learner that simply stores the training data. The other options are false: accuracy does not always improve with larger k, the decision boundary is generally non-linear, and the boundary becomes smoother (not rougher) as k increases.
9.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
The curse of dimensionality only affects k-NN.
True
False
Answer explanation
The curse of dimensionality plagues all machine learning algorithms, not just k-NN. Hence the need for feature extractors and engineered priors.
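The classic demonstration (a sketch with synthetic uniform data, assuming NumPy): as the dimension grows, the nearest neighbour stops being meaningfully nearer than the farthest one, which undermines any distance-based method.

import numpy as np

rng = np.random.default_rng(0)
for d in (2, 10, 100, 1000):
    X = rng.random((1000, d))                       # uniform points in [0, 1]^d
    dist = np.linalg.norm(X[1:] - X[0], axis=1)     # distances from one point
    print(f"d = {d:4d}: nearest/farthest ratio = {dist.min() / dist.max():.3f}")
# The ratio climbs towards 1 with d: distances concentrate, so the "nearest"
# neighbours become barely nearer than anything else.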
10.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
Benjamin is using k-NN to classify types of flowers based on petal measurements. How does the choice of distance metric affect the performance of k-NN?
It does not affect the performance
It affects the accuracy of the model
It only affects the speed of the algorithm
It affects both accuracy and speed
Answer explanation
The choice of distance metric in k-NN directly influences how distances between data points are calculated, impacting both the accuracy of classifications and the computational speed of the algorithm.
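For Benjamin's petal-measurement setting, the effect on accuracy is easy to check empirically (a sketch assuming scikit-learn; the Iris dataset stands in for his flowers):

from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
for metric in ("euclidean", "manhattan", "chebyshev"):
    knn = KNeighborsClassifier(n_neighbors=5, metric=metric)
    acc = cross_val_score(knn, X, y, cv=5).mean()   # 5-fold CV accuracy
    print(f"{metric:>9s}: mean accuracy = {acc:.3f}")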