Intro to ML: The ML Revision Quiz

Assessment • Quiz

Created by Josiah Wang

Computers • University

20 plays • Hard

11 questions

1.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

If we predict every observation to be True, what will our model precision be?

100%

0%

The proportion of True values in the dataset

Not enough information

Answer explanation

Think of all the False Positives
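The hint can be checked by hand: precision = TP / (TP + FP), and an all-True predictor turns every False observation into a false positive. The helper below is an illustrative sketch (not from the quiz), assuming binary labels and at least one positive prediction:

```python
# Precision of a classifier that predicts True for everything.
# precision = TP / (TP + FP); toy labels are made up for illustration.

def precision(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if p and t)
    fp = sum(1 for t, p in zip(y_true, y_pred) if p and not t)
    return tp / (tp + fp)

y_true = [True, False, True, False, False]   # 2 of 5 labels are True (40%)
y_pred = [True] * len(y_true)                # predict everything True

print(precision(y_true, y_pred))  # 0.4 == proportion of True labels
```

With every observation predicted positive, TP + FP is the whole dataset, so precision collapses to the proportion of True labels.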

2.

MULTIPLE SELECT QUESTION

1 min • 1 pt

James, Amelia, and George are participating in a machine learning competition. They have to choose an algorithm for their project. Select which of the following algorithms they should consider if they want to use eager learners:

K-nearest neighbours

Decision trees

Neural networks

Linear regression

Answer explanation

Recall that K-nn is a lazy learner. At training time the algorithm simply stores the training data - i.e. no calculations or training occur. It is not until inference time that the algorithm finds the K points nearest to the unseen datapoint in question. All calculations occur at inference time, hence the name lazy rather than eager.
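The lazy/eager distinction can be made concrete with a minimal 1-nearest-neighbour sketch. The class and method names below are illustrative assumptions, not anything from the lecture; note that `fit` does no computation at all:

```python
# Minimal sketch of a lazy learner: 1-nearest-neighbour.
# "Training" just stores the data; all distance computation
# happens at inference time.

class OneNN:
    def fit(self, X, y):
        self.X, self.y = X, y        # no computation: just store the data
        return self

    def predict(self, x):
        # all the work happens now, at inference time
        dists = [sum((a - b) ** 2 for a, b in zip(x, xi)) for xi in self.X]
        return self.y[dists.index(min(dists))]

model = OneNN().fit([(0, 0), (5, 5)], ["a", "b"])
print(model.predict((1, 1)))  # "a": nearest stored point is (0, 0)
```

An eager learner (a decision tree, a neural network, linear regression) would instead do its heavy computation inside `fit` and make `predict` cheap.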

3.

MULTIPLE SELECT QUESTION

1 min • 1 pt

Which of the following statements are True:

Performance on the validation set can be used to see if a model is overfitting to the training data

We cannot tell from the training performance alone if a model is overfitting or not

Underfitting implies better generalisation to other datasets

Answer explanation

Underfitting is when the model lacks the capacity to fit the underlying pattern/trend of the data. A model that underfits a training set will perform no better on unseen data.
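The need for a validation set can be shown with a deliberately overfit "model". The lookup-table learner and tiny datasets below are toy constructions for illustration only: training accuracy is perfect no matter how badly the model generalises, and only held-out data exposes the gap.

```python
# A model that memorises its training data scores 100% on the
# training set regardless of generalisation; the validation set
# reveals the overfitting. Data and labels are made up.

train = {(0, 0): 1, (1, 0): 0, (0, 1): 0, (1, 1): 1}
val = {(2, 2): 1, (3, 0): 0}

def memoriser(x):
    # return the stored label if seen in training, else a constant guess
    return train.get(x, 0)

train_acc = sum(memoriser(x) == y for x, y in train.items()) / len(train)
val_acc = sum(memoriser(x) == y for x, y in val.items()) / len(val)
print(train_acc, val_acc)  # 1.0 on training, 0.5 on validation
```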

4.

MULTIPLE SELECT QUESTION

1 min • 1 pt

Scarlett is working on a machine learning project and she is worried about underfitting. Which of the following actions may cause underfitting in her model?

Reducing the max. depth of a decision tree

Increasing the value of K in K-nn

Adding more layers to a neural network

Increasing the size of the training data

Increasing the value of K in K-means

Answer explanation

Underfitting is caused when the model lacks the capacity to fit the underlying trend/pattern of the data.
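One of the listed actions, increasing K in K-NN, is easy to demonstrate: as K approaches the dataset size, every prediction collapses to the overall majority label, ignoring local structure. The 1-D data and helper below are illustrative assumptions:

```python
# Sketch: large K in K-NN underfits. With K equal to the dataset
# size, the prediction is always the majority class. Toy 1-D data.

from collections import Counter

X = [0, 1, 2, 10, 11]          # two clusters
y = ["a", "a", "a", "b", "b"]  # overall majority class is "a"

def knn_predict(x, k):
    nearest = sorted(range(len(X)), key=lambda i: abs(X[i] - x))[:k]
    return Counter(y[i] for i in nearest).most_common(1)[0][0]

print(knn_predict(10.5, 1))  # "b": local structure respected
print(knn_predict(10.5, 5))  # "a": K = N collapses to the majority class
```

Reducing a decision tree's maximum depth restricts capacity in an analogous way.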

5.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

True or False:

If we use grid search to test different hyper-parameter values, we can use each of these results to compute a confidence interval for the model error.

True

False

Answer explanation

Confidence intervals should be reported for the final model architecture: they are a prediction of how the final model will perform on unseen data. An interval computed from results across a range of different models will clearly not be an accurate prediction for any single one.
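A correct interval is estimated from the final model alone, for example from its per-example losses on a held-out test set. The numbers below are made up, and the normal-approximation formula is one common choice among several:

```python
# Sketch: 95% confidence interval for the final model's error,
# from per-example test losses of that one model only. Pooling
# scores from many grid-search candidates would mix distributions
# from different models. Error values are fabricated for illustration.

import math

errors = [0.12, 0.08, 0.15, 0.10, 0.09, 0.11, 0.14, 0.07]  # final model only
n = len(errors)
mean = sum(errors) / n
std = math.sqrt(sum((e - mean) ** 2 for e in errors) / (n - 1))
half = 1.96 * std / math.sqrt(n)          # normal approximation
print(f"95% CI: {mean - half:.3f} to {mean + half:.3f}")
```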

6.

MULTIPLE SELECT QUESTION

1 min • 1 pt

Which of the following algorithms will produce different results given different random seeds:

Neural networks

K-nearest neighbours (K = 1, with no ties)

Decision trees

K-means

Evolution Algorithms using simple tournament

Answer explanation

Think about which methods are deterministic.
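The contrast can be sketched in a few lines: neural-network weights are randomly initialised, so the trained model depends on the seed, whereas 1-NN (K = 1, no ties) involves no randomness at all. The tiny "weight init" helper below is illustrative, not a real network:

```python
# Seed-dependent vs deterministic methods. Random weight
# initialisation changes with the seed; 1-NN does not use
# randomness anywhere. Helpers are toy illustrations.

import random

def init_weights(seed, n=3):
    rng = random.Random(seed)
    return [rng.gauss(0, 1) for _ in range(n)]

def one_nn(X, y, query):
    return y[min(range(len(X)), key=lambda i: abs(X[i] - query))]

print(init_weights(0) == init_weights(1))   # False: the seed matters
print(one_nn([0, 10], ["a", "b"], 2))       # "a", whatever the seed
```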

7.

MULTIPLE SELECT QUESTION

1 min • 1 pt

Which of the statements below correctly describe the differences between Gradient Descent, Stochastic Gradient Descent, and Mini-batch Gradient Descent:

Gradient Descent is faster to compute than Stochastic Gradient Descent

Stochastic Gradient Descent is faster to compute than Mini-batch Gradient Descent

There is less noise in the gradients when using Mini-batch Gradient Descent compared to Stochastic Gradient Descent

Answer explanation

Gradient descent: gradients are calculated and a step is taken based on the whole training set. This is computationally heavy, as opposed to calculating the gradients and updating the parameters based on a single sample. Stochastic gradient descent, however, produces a very noisy learning signal, as the gradient is sensitive to the variability of each individual datapoint. The learning signal can be smoothed out by sampling groups of datapoints in mini-batch gradient descent, where the gradient is averaged over the datapoints in each batch.
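The three schemes can be sketched on a toy one-parameter least-squares problem. Everything here (data, learning rate, step count) is made up for illustration; the only difference between the variants is how many samples feed each gradient estimate:

```python
# Full-batch GD, SGD, and mini-batch GD differ only in the number
# of samples used per update. Fit w in y = w * x (true w = 2).

import random

X = [1.0, 2.0, 3.0, 4.0]
Y = [2.0, 4.0, 6.0, 8.0]

def grad(w, idxs):
    # mean gradient of 0.5 * (w*x - y)^2 over the chosen samples
    return sum((w * X[i] - Y[i]) * X[i] for i in idxs) / len(idxs)

def train(batch_size, steps=200, lr=0.05, seed=0):
    rng, w = random.Random(seed), 0.0
    for _ in range(steps):
        idxs = rng.sample(range(len(X)), batch_size)  # sample a batch
        w -= lr * grad(w, idxs)
    return w

print(train(len(X)))  # full-batch GD: most compute per step
print(train(1))       # SGD: cheapest, noisiest gradients
print(train(2))       # mini-batch: the usual middle ground
```

All three converge near w = 2 on this noiseless toy problem; on real data the single-sample gradients would visibly bounce around the full-batch trajectory.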

8.

MULTIPLE SELECT QUESTION

1 min • 1 pt

Which of the following statements about K-means are True:

The algorithm always converges

The algorithm always converges to a global optimum

The algorithm doesn’t always converge

If the algorithm does converge, it will converge to a global optimum

Answer explanation

Think about local optima. Is there a mechanism in K-means to help it get out of local minima once the algorithm has converged?
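A minimal Lloyd's-algorithm sketch makes the point: K-means always converges, but the fixed point depends on initialisation, and nothing in the update rule can escape a local optimum once reached. The 1-D data and the two hand-picked initialisations below are illustrative assumptions:

```python
# K-means (Lloyd's algorithm): assign points to nearest centroid,
# recompute centroids, repeat. Converges, but only locally.

def kmeans(data, centroids, iters=20):
    for _ in range(iters):
        clusters = {c: [] for c in range(len(centroids))}
        for x in data:
            best = min(range(len(centroids)),
                       key=lambda c: (x - centroids[c]) ** 2)
            clusters[best].append(x)
        # empty clusters keep their old centroid
        centroids = [sum(pts) / len(pts) if pts else centroids[c]
                     for c, pts in clusters.items()]
    return sorted(centroids)

data = [0.0, 1.0, 5.0, 6.0, 10.0, 11.0]     # three obvious clusters
print(kmeans(data, [0.5, 5.5, 10.5]))        # good init: one centroid per cluster
print(kmeans(data, [0.0, 1.0, 8.0]))         # bad init: stuck splitting one cluster
```

Both runs converge, but the second settles on a much higher within-cluster cost; in practice this is why K-means is restarted from several random initialisations.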

9.

MULTIPLE SELECT QUESTION

30 sec • 1 pt

For a Gaussian Mixture Model, which of the following statements are True:

The responsibilities r_ik for the ith data point sum to 1 (summing over the components k)

The responsibilities r_ik for the kth mixture component sum to 1 (summing over the data points i)

None of the above are True

Answer explanation

The responsibilities r_ik for a particular data point i represent the probabilities of that datapoint having been generated by each of the K mixture components. This is a probability distribution over the components and therefore must sum to 1. A single component k, however, can be responsible for generating the vast majority of data points, so its total responsibility summed over all datapoints can easily exceed 1.
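This is easy to verify numerically on a toy 1-D mixture. The parameters and data below are made-up illustrations; the responsibility of component k for point x is its weighted density divided by the total weighted density:

```python
# Responsibilities in a toy 1-D Gaussian mixture: each data point's
# responsibilities sum to 1 across components, but a component's
# responsibilities summed over the data need not. Parameters invented.

import math

weights = [0.5, 0.5]
means = [0.0, 5.0]
stds = [1.0, 1.0]

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def responsibilities(x):
    unnorm = [w * gauss_pdf(x, m, s) for w, m, s in zip(weights, means, stds)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

data = [-0.5, 0.2, 0.9, 4.8]
R = [responsibilities(x) for x in data]
for row in R:
    print(round(sum(row), 10))      # each data point's row sums to 1
print(sum(row[0] for row in R))     # component 0's column sum: well above 1
```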

10.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

[Image: plots of three activation functions, labelled a), b), and c)]

Which one of the following is correct?

a) sigmoid b) tanh c) ReLU

a) tanh b) sigmoid c) Linear

a) sigmoid b) tanh c) Linear

a) softmax b) tanh c) ReLU

None of the above

Answer explanation

Check definitions
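For reference, the activations named in the options can be written out directly. These are plain-Python sketches of the standard definitions, useful for matching against each plot's shape and range:

```python
# Standard definitions of the activation functions in the options.

import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))   # S-shaped, range (0, 1)

def tanh(x):
    return math.tanh(x)                 # S-shaped, range (-1, 1), zero-centred

def relu(x):
    return max(0.0, x)                  # 0 for x < 0, identity for x >= 0

def linear(x):
    return x                            # identity everywhere

print(sigmoid(0.0), tanh(0.0), relu(-2.0), linear(-2.0))  # 0.5 0.0 0.0 -2.0
```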
