Intro to ML: Neural Networks Lecture 2 Part 2

Assessment

Created by

Josiah Wang

Mathematics, Computers, Fun

University

15 plays

Hard

6 questions

1.

Multiple Choice

2 mins

1 pt

Is the following statement True or False? A neural network’s weights can be randomly initialised because gradient descent will always eventually find the optimal set of parameters, so the result is invariant to the initial set of parameters.

True

False

Answer explanation

A neural network's weights are indeed often randomly initialised. What is wrong is the claim that gradient descent will always find the optimal set of parameters: there is no theoretical guarantee of this, and in practice it often converges to a local optimum instead (recall local optima from the lecture slides).
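
As an illustrative aside (not part of the original quiz), here is a minimal NumPy sketch of plain gradient descent on a toy non-convex function: different random initialisations can settle in different minima, so the result is not invariant to the starting point.

```python
import numpy as np

def f(w):
    # Toy non-convex "loss" with two minima: a global one and a local one.
    return w**4 - 3 * w**2 + w

def grad_f(w):
    return 4 * w**3 - 6 * w + 1

rng = np.random.default_rng(0)
for trial in range(3):
    w = rng.uniform(-2.0, 2.0)        # random initialisation
    for _ in range(200):              # plain gradient descent
        w -= 0.01 * grad_f(w)
    print(f"trial {trial}: converged to w = {w:.2f}, f(w) = {f(w):.2f}")
```

Runs that start on different sides of the hump between the two minima settle in different places, which is the failure mode the explanation refers to.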

2.

Multiple Choice

1 min

1 pt

L1 regularisation favours a few non-zero weights, whereas L2 regularisation favours small values around zero.

True

False

Answer explanation

L1 regularisation favours a sparse solution by encouraging a few non-zero weights, while L2 regularisation favours small values spread around zero. In practice this means L1 regularisation tends to select only the most important features, while L2 regularisation spreads the weight across all features. The correct answer is therefore 'True'.
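
As an illustrative aside (not part of the original quiz), a minimal NumPy sketch of how the two penalty terms are typically written, using a hypothetical weight vector and regularisation strength lam:

```python
import numpy as np

def l1_penalty(w, lam):
    # L1 regularisation: lam * sum(|w_i|) -- favours a sparse solution
    # with a few non-zero weights.
    return lam * np.sum(np.abs(w))

def l2_penalty(w, lam):
    # L2 regularisation: lam * sum(w_i^2) -- favours small weights
    # spread around zero.
    return lam * np.sum(w ** 2)

w = np.array([0.8, -0.05, 0.0, 1.2])   # hypothetical weight vector
data_loss = 0.40                       # hypothetical unregularised loss value
print("L1-regularised loss:", data_loss + l1_penalty(w, lam=0.01))
print("L2-regularised loss:", data_loss + l2_penalty(w, lam=0.01))
```

Intuitively, the L1 penalty's gradient has constant magnitude lam no matter how small a weight already is, so it keeps pushing small weights all the way to zero, whereas the L2 gradient 2 * lam * w vanishes as the weight shrinks, which is why L2 yields small but mostly non-zero weights.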

3.

Multiple Select

1 min

1 pt

Which of the following statements about dropout are correct?

Dropout prevents complex co-adaptations in which a feature detector is only helpful in the context of several other specific feature detectors.

Dropout is more similar to L2 regularisation than L1 regularisation during training.

Dropout is active during training and testing.

Dropout can be viewed as a form of ensemble learning.

The amount of dropout, p, can be optimised through standard stochastic gradient descent (SGD) methods.
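
As a point of reference for these statements (not part of the original quiz), a minimal NumPy sketch of inverted dropout: it is applied only in training mode, rescales the surviving activations by 1/(1-p), and treats p as a hand-set hyperparameter rather than something learned by SGD.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, p, training=True):
    # Inverted dropout: drop each unit with probability p during training,
    # rescale the survivors by 1/(1-p) so the expected activation is unchanged,
    # and do nothing at test time. p is a hand-set hyperparameter.
    if not training:
        return activations
    mask = rng.random(activations.shape) >= p
    return activations * mask / (1.0 - p)

h = np.ones((2, 5))
print(dropout(h, p=0.5, training=True))    # roughly half the units zeroed
print(dropout(h, p=0.5, training=False))   # unchanged at test time
```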

4.

Multiple Choice

2 mins

1 pt

A dense multi-layer perceptron layer has 300 input values and 200 output values. Assuming no bias, how many parameters does the layer contain?

500

250

6000

60000

3000

Answer explanation

Since there is no bias term, each of the 300 input values is connected to each of the 200 output values by its own weight, giving 300 * 200 = 60000 parameters to be learned during training.
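
As an illustrative check (not part of the original quiz), a short PyTorch sketch that counts the parameters of such a layer; nn.Linear with bias=False holds exactly one weight per input-output pair.

```python
import torch.nn as nn

# A dense layer mapping 300 inputs to 200 outputs with no bias term.
layer = nn.Linear(in_features=300, out_features=200, bias=False)

# One weight per input-output pair: 300 * 200 = 60000.
print(sum(p.numel() for p in layer.parameters()))   # 60000
```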

5.

Multiple Choice

1 min

1 pt

If a neural network is overfitting, which of the following would not help?

Introducing dropout

Reducing the number of layers in the model

Increasing the learning rate

Increasing the size of the training data

None of the above

Answer explanation

Several techniques help address overfitting in a neural network: introducing dropout, reducing the number of layers in the model, and increasing the size of the training data are all effective. Increasing the learning rate, however, does not address overfitting: it only changes the size of the gradient descent steps, and making it larger tends to cause the optimiser to overshoot good solutions rather than to improve generalisation. Increasing the learning rate is therefore the option that would not help.
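
As an illustrative aside (not part of the original quiz), a short PyTorch sketch of two of the remedies mentioned above, a deliberately small model with dropout; the layer sizes are hypothetical, and the learning rate appears only as an optimiser setting, which is why raising it does not regularise the model.

```python
import torch.nn as nn
import torch.optim as optim

# A deliberately small network with dropout -- two of the remedies above.
model = nn.Sequential(
    nn.Linear(300, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),     # only active in model.train() mode
    nn.Linear(64, 10),
)

# The learning rate lives in the optimiser; it controls the step size of
# gradient descent and is not a regulariser.
optimizer = optim.SGD(model.parameters(), lr=0.01)
```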
