Intro to ML: K-Nearest Neighbours 2024


Assessment: Quiz

Created by: Josiah Wang

Subject: Computers

Level: University

Plays: 54

Difficulty: Medium

Questions: 10


1.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

In K-NN, is the query time longer than the training time?

Yes

No

Answer explanation

Recall that K-NN algorithms are lazy learners: training essentially just stores the data, while each query has to compute distances to the stored points.
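To make the lazy-learner asymmetry concrete, here is a minimal, illustrative NumPy sketch (the class name and structure are my own, not the lecture's reference implementation): fit only memorises the data, while predict_one pays for a distance computation against every stored point.

```python
import numpy as np

class LazyKNN:
    """Bare-bones k-NN classifier, only to show where the work happens."""

    def __init__(self, k=3):
        self.k = k

    def fit(self, X, y):
        # "Training" a lazy learner: nothing is computed, the data is just memorised.
        self.X_train = np.asarray(X, dtype=float)
        self.y_train = np.asarray(y)
        return self

    def predict_one(self, x):
        # Query time: distances to *every* stored point, i.e. O(n * d) work per query.
        dists = np.linalg.norm(self.X_train - np.asarray(x, dtype=float), axis=1)
        nearest = np.argsort(dists)[: self.k]
        labels, counts = np.unique(self.y_train[nearest], return_counts=True)
        return labels[np.argmax(counts)]
```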

2.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

James is trying to decide which machine learning algorithm to use for his project. Which of the following options is true about the k-NN algorithm?

It can only be used for classification

It can only be used for regression

It can be used for both classification and regression

Answer explanation

Regression: predicts the average of the VALUES of the k nearest neighbours.

Classification: predicts the majority CLASS among the k nearest neighbours.
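A small, hedged sketch of the two prediction rules, assuming the targets of a query's k nearest neighbours have already been collected (the example values are made up):

```python
import numpy as np

# Targets of the k=5 nearest neighbours of some query point (illustrative values).
neighbour_classes = np.array(["cat", "dog", "cat", "cat", "dog"])
neighbour_values  = np.array([3.1, 2.8, 3.5, 3.0, 2.9])

# Classification: majority CLASS among the k neighbours.
labels, counts = np.unique(neighbour_classes, return_counts=True)
print(labels[np.argmax(counts)])      # -> "cat"

# Regression: average of the neighbours' VALUES.
print(neighbour_values.mean())        # -> 3.06
```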

3.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

Which of the following is true about Manhattan distances?

It can be used for continuous variables

It can be used for categorical variables

It can be used for categorical as well as continuous variables

Answer explanation

Manhattan distance is the L1-norm and therefore cannot be used for categorical variables. Hamming distance would be a good choice for categorical variables.
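As a quick illustration (the feature values are made up), the two distances can be computed directly in NumPy:

```python
import numpy as np

# Continuous features: Manhattan (L1) distance = sum of absolute differences.
a = np.array([1.0, 2.5, 0.0])
b = np.array([2.0, 0.5, 1.0])
print(np.abs(a - b).sum())        # |1-2| + |2.5-0.5| + |0-1| = 4.0

# Categorical features: Hamming distance = fraction of positions that differ.
u = np.array(["red", "small", "round"])
v = np.array(["red", "large", "round"])
print((u != v).mean())            # 1 mismatch out of 3 features, roughly 0.33
```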

4.

MULTIPLE CHOICE QUESTION

2 mins • 1 pt

Poppy is working on a machine learning project using k-NN and has a dataset with a lot of noise. What are the appropriate things to do with k-NN in this situation?

Increase the value of K

Decrease the value of K

K does not depend on the noise

None of these

Answer explanation

Recall that when K=1 the decision boundary perfectly matches the training data and its noise, which results in overfitting. Increasing K makes k-NN more resilient to noise and less prone to overfitting.
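In practice a suitably large K is often picked by validation. A hedged sklearn sketch on synthetic data with deliberately flipped labels (the dataset and the candidate values of K are illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Synthetic dataset with deliberately noisy labels (flip_y adds label noise).
X, y = make_classification(n_samples=500, n_features=10, flip_y=0.2, random_state=0)

for k in (1, 5, 15, 31):
    scores = cross_val_score(KNeighborsClassifier(n_neighbors=k), X, y, cv=5)
    print(f"k={k:>2}: mean CV accuracy = {scores.mean():.3f}")
# On noisy data, k=1 typically scores worst (it fits the noise); larger k is usually more robust.
```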

5.

MULTIPLE CHOICE QUESTION

2 mins • 1 pt

In K-NN, what is the effect of increasing/decreasing the value of K?

The boundary becomes smoother with increasing values of K

Smoothness of the boundary does not depend on the value of K

The boundary becomes smoother with decreasing values of K

None of these

Answer explanation

Think about what happens when K=1. Increasing K makes k-NN less affected by noise, because more neighbours are considered when making a decision, so the decision boundary becomes smoother.
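To see this sensitivity concretely, here is an illustrative sketch (synthetic data, not from the lecture) in which a single mislabeled point flips the k=1 prediction but is outvoted at k=15:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
# Two well-separated clusters around (-2, -2) and (2, 2)...
X = np.vstack([rng.normal(-2, 0.5, size=(50, 2)), rng.normal(2, 0.5, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)
# ...plus one mislabeled outlier near (-3.5, -3.5) that carries the wrong class.
X = np.vstack([X, [[-3.5, -3.5]]])
y = np.append(y, 1)

query = np.array([[-3.4, -3.4]])   # sits right next to the mislabeled point
for k in (1, 15):
    pred = KNeighborsClassifier(n_neighbors=k).fit(X, y).predict(query)
    print(f"k={k:>2} -> class {pred[0]}")
# k=1 copies the noisy neighbour's label (1); k=15 outvotes it with the true cluster (0).
```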

6.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

For embedded applications (e.g. running on a smartphone), which is the most appropriate family of algorithms?

Eager learners

Lazy learners

Answer explanation

Eager learners are desirable because all of the heavy computation happens at training time, so the algorithm is time- and compute-efficient at inference time.
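A rough timing sketch contrasting an eager learner with lazy k-NN; the decision tree is just one example of an eager learner, and the data and timings are purely illustrative:

```python
import time
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X_train = rng.normal(size=(20_000, 30))
y_train = rng.integers(0, 2, size=20_000)
X_query = rng.normal(size=(2_000, 30))

for name, model in [("eager (decision tree)", DecisionTreeClassifier(max_depth=8)),
                    ("lazy (k-NN)", KNeighborsClassifier(n_neighbors=5))]:
    t0 = time.perf_counter(); model.fit(X_train, y_train); t1 = time.perf_counter()
    model.predict(X_query); t2 = time.perf_counter()
    print(f"{name:22s} train {t1 - t0:.3f}s | predict {t2 - t1:.3f}s")
# The eager learner pays its cost up front and predicts cheaply regardless of training-set
# size; k-NN's query cost grows with the stored data, which matters on embedded devices.
```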

7.

MULTIPLE CHOICE QUESTION

2 mins • 1 pt

Given d the distance between a point of the dataset and the query point, which of the following weight functions is appropriate for Distance-Weighted k-NN?

w = exp( -d )

w = log ( min ( 0.25 * d, 1 ) )

w = -d

Answer explanation


Recall that the weight assigned to a neighbour should be non-negative and should decrease as the distance grows. Both w = -d and w = log(min(0.25 * d, 1)) yield negative (or zero) weights, which does not make sense for weighting votes. w = exp(-d) is a good weight function because it decays exponentially with distance: it favours nearby points and quickly discounts points far away.
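One way such a weight function could be plugged into a distance-weighted vote, sketched with illustrative names (this is not the only possible formulation):

```python
import numpy as np

def weighted_knn_classify(X_train, y_train, x_query, k=5):
    """Distance-weighted k-NN vote using w = exp(-d) (illustrative sketch)."""
    dists = np.linalg.norm(X_train - x_query, axis=1)
    nearest = np.argsort(dists)[:k]
    weights = np.exp(-dists[nearest])      # always positive, shrinks with distance

    # Sum the weights per class and return the class with the largest total.
    votes = {}
    for label, w in zip(y_train[nearest], weights):
        votes[label] = votes.get(label, 0.0) + w
    return max(votes, key=votes.get)
```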

8.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

Which of the following statements is true for k-NN classifiers?

The classification accuracy is better with larger values of k

The decision boundary is linear

The decision boundary is smoother with smaller values of k

k-NN does not require an explicit training step

Answer explanation

k-NN is a lazy learner, so it does not require an explicit training step. The other statements are false: larger values of k do not always improve accuracy, the decision boundary is generally non-linear, and the boundary becomes smoother with larger (not smaller) values of k. Please refer to the lecture slides for details.

9.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

The curse of dimensionality only affects k-NN.

True

False

Answer explanation

The curse of dimensionality plagues all machine learning algorithms. Hence the need for feature extractors and engineered priors.
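One concrete symptom, shown with an illustrative NumPy experiment: as the number of dimensions grows, the nearest and farthest neighbours of a random query become almost equally far away, so nearest-neighbour distances stop being informative.

```python
import numpy as np

rng = np.random.default_rng(0)
for d in (2, 10, 100, 1000):
    X = rng.uniform(size=(1000, d))          # 1000 random points in the unit hypercube
    query = rng.uniform(size=d)
    dists = np.linalg.norm(X - query, axis=1)
    print(f"d={d:>4}: nearest/farthest distance ratio = {dists.min() / dists.max():.3f}")
# The ratio creeps towards 1.0 as d grows: one face of the curse of dimensionality.
```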

10.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

Benjamin is using k-NN to classify types of flowers based on petal measurements. How does the choice of distance metric affect the performance of k-NN?

It does not affect the performance

It affects the accuracy of the model

It only affects the speed of the algorithm

It affects both accuracy and speed

Answer explanation

The choice of distance metric in k-NN directly influences how distances between data points are calculated, impacting both the accuracy of classifications and the computational speed of the algorithm.
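A hedged sklearn sketch on the iris dataset (the metrics compared here are just examples) showing that the chosen metric changes cross-validated accuracy, while also determining the cost of each distance computation:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Compare a few metrics on the classic iris petal/sepal measurements.
X, y = load_iris(return_X_y=True)
for metric in ("euclidean", "manhattan", "chebyshev"):
    knn = KNeighborsClassifier(n_neighbors=5, metric=metric)
    scores = cross_val_score(knn, X, y, cv=5)
    print(f"{metric:>10}: mean CV accuracy = {scores.mean():.3f}")
# Accuracy varies with the metric; the metric also determines how much work each
# distance computation costs and which indexing speed-ups are available.
```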
