Intro to ML: K-Nearest Neighbours 2024

Assessment

Created by

Josiah Wang

Computers

University

Medium

10 questions

1.

Multiple Choice

1 min

1 pt

In K-NN, is the query time longer than the training time?

Yes

No

Answer explanation

Recall that k-NN algorithms are lazy learners: training simply stores the data, so almost all of the computation is deferred to query time.
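A minimal sketch (illustrative only, not a reference implementation; it assumes NumPy) of why this holds: `fit` merely memorises the data, while every query computes a distance to each stored point.

```python
import numpy as np

class KNNClassifier:
    """Minimal k-NN sketch: training is cheap, querying is not."""

    def fit(self, X, y):
        # A lazy learner "trains" by just memorising the data.
        self.X = np.asarray(X, dtype=float)
        self.y = np.asarray(y)
        return self

    def predict_one(self, q, k=3):
        # The real work happens at query time: a distance to every stored point.
        d = np.linalg.norm(self.X - q, axis=1)
        nearest = self.y[np.argsort(d)[:k]]
        labels, counts = np.unique(nearest, return_counts=True)
        return labels[np.argmax(counts)]

X = [[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]]
y = [0, 0, 0, 1, 1, 1]
model = KNNClassifier().fit(X, y)
print(model.predict_one(np.array([0.5, 0.5])))  # → 0 (near the class-0 cluster)
```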

2.

Multiple Choice

1 min

1 pt

James is trying to decide which machine learning algorithm to use for his project. Which of the following options is true about the k-NN algorithm?

It can only be used for classification

It can only be used for regression

It can be used for both classification and regression

Answer explanation

Regression: predicts the average VALUE of the k nearest neighbours.

Classification: predicts the majority CLASS among the k nearest neighbours.
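A hedged sketch (using NumPy; not the quiz's reference code) showing that the neighbour search is identical for both tasks and only the final aggregation step differs:

```python
import numpy as np

def knn_predict(X, y, q, k=3, task="classification"):
    """k-NN sketch: same neighbour search, different aggregation per task."""
    d = np.linalg.norm(np.asarray(X, dtype=float) - q, axis=1)
    neigh = np.asarray(y)[np.argsort(d)[:k]]
    if task == "classification":
        labels, counts = np.unique(neigh, return_counts=True)
        return labels[np.argmax(counts)]   # majority class
    return float(neigh.mean())             # average value

X = [[0], [1], [2], [10], [11], [12]]
print(knn_predict(X, [0, 0, 0, 1, 1, 1], np.array([1.5])))                       # → 0
print(knn_predict(X, [0.0, 1.0, 2.0, 10.0, 11.0, 12.0], np.array([1.5]),
                  task="regression"))                                            # → 1.0
```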

3.

Multiple Choice

1 min

1 pt

Which of the following is true about the Manhattan distance?

It can be used for continuous variables

It can be used for categorical variables

It can be used for categorical as well as continuous variables

Answer explanation

Manhattan distance is the L1 norm, i.e. the sum of absolute coordinate differences, so it is defined for continuous variables but not for categorical ones. Hamming distance, which counts the positions at which two vectors differ, is a good choice for categorical variables.
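A short illustrative sketch of the two distances mentioned in the explanation (plain Python, not a library implementation):

```python
def manhattan(a, b):
    # L1 norm: sum of absolute coordinate differences (continuous features).
    return sum(abs(x - y) for x, y in zip(a, b))

def hamming(a, b):
    # Number of positions at which the two vectors differ (categorical features).
    return sum(x != y for x, y in zip(a, b))

print(manhattan([1.0, 2.0], [4.0, 6.0]))        # → 7.0 (|1-4| + |2-6|)
print(hamming(["red", "dog"], ["red", "cat"]))  # → 1 (only the second slot differs)
```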

4.

Multiple Choice

2 mins

1 pt

Poppy is working on a machine learning project using k-NN and has a dataset with a lot of noise. What is the appropriate thing to do with k-NN in this situation?

Increase the value of K

Decrease the value of K

K does not depend on the noise

None of these

Answer explanation

Recall that when K=1 the decision boundary fits the training data, noise included, exactly, which results in overfitting. Increasing K makes k-NN more resilient to noise and therefore to overfitting.
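A toy demonstration of this point (a hypothetical dataset, using NumPy): a single mislabelled point dominates the prediction at K=1 but is outvoted at K=5.

```python
import numpy as np

def knn_class(X, y, q, k):
    # Predict the majority class among the k nearest neighbours of q.
    d = np.linalg.norm(np.asarray(X, dtype=float) - q, axis=1)
    neigh = np.asarray(y)[np.argsort(d)[:k]]
    labels, counts = np.unique(neigh, return_counts=True)
    return labels[np.argmax(counts)]

# A tight class-0 cluster with one mislabelled (noisy) point at its centre.
X = [[0, 0], [0.1, 0], [0, 0.1], [0.1, 0.1], [0.05, 0.05]]
y = [0, 0, 0, 0, 1]  # the last label is noise
q = np.array([0.05, 0.05])
print(knn_class(X, y, q, k=1))  # → 1: K=1 parrots the noisy point
print(knn_class(X, y, q, k=5))  # → 0: a larger K outvotes the noise
```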

5.

Multiple Choice

2 mins

1 pt

In K-NN, what is the effect of increasing/decreasing the value of K?

The boundary becomes smoother with increasing values of K

Smoothness of the boundary does not depend on the value of K

The boundary becomes smoother with decreasing values of K

None of these

Answer explanation

Think about what happens when K=1: the boundary traces every training point, noise included. Increasing K makes k-NN less affected by noise, because a larger number of neighbours is considered for each decision, so the boundary becomes smoother.
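One way to see the smoothing effect, as an illustrative sketch on 1-D data (hypothetical noisy labels, using NumPy): count how often the predicted label flips as a query slides across the input range; fewer flips means a smoother decision boundary.

```python
import numpy as np

def knn_label(X, y, q, k):
    # Majority vote over the k nearest 1-D training points.
    idx = np.argsort(np.abs(X - q))[:k]
    return int(np.asarray(y)[idx].sum() * 2 >= k)

X = np.arange(10.0)
y = [0, 0, 0, 1, 0, 1, 1, 0, 1, 1]  # noisy binary labels along a line
grid = np.linspace(0.0, 9.0, 500)
for k in (1, 5):
    preds = [knn_label(X, y, q, k) for q in grid]
    flips = sum(a != b for a, b in zip(preds, preds[1:]))
    print(f"k={k}: boundary crossings = {flips}")
```

With these labels, k=1 yields 5 boundary crossings while k=5 yields only 1: the larger K averages the noisy labels away, smoothing the boundary.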
