Intro to ML: Neural Networks Lecture 2 Part 1

Assessment

Created by

Josiah Wang

Mathematics, Computers, Fun

University

42 plays

Hard

6 questions

1.

Multiple Choice

30 sec

1 pt

Mean squared error is a common loss function for which task?

Regression

Classification

None of these

Regression and Classification

Answer explanation

Mean squared error is the expected (squared) distance between the model's predictions and the true values. It is therefore well suited to regression tasks, where the label space is continuous. MSE assumes the data is normally distributed; in binary classification tasks, however, the labels follow a Bernoulli distribution.
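The definition above can be sketched directly in code. This is a minimal illustration (the function name `mse` and the sample values are my own, not from the quiz):

```python
import numpy as np

def mse(y_true, y_pred):
    """Average squared distance between predictions and true values."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return np.mean((y_true - y_pred) ** 2)

# Continuous targets, as in a regression task:
print(mse([1.0, 2.0, 3.0], [1.5, 2.0, 2.0]))  # (0.25 + 0.0 + 1.0) / 3
```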

2.

Multiple Choice

30 sec

1 pt

Is the following statement True or False? Multi-class and Multi-label classification are the same thing.

True

False

Answer explanation

Multiclass classification - a classification task where each instance is assigned to exactly one of two or more classes.

Multilabel classification - a task where each instance is assigned a set of target labels. Think of this as predicting a series of properties of an instance which are not mutually exclusive.
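The distinction shows up directly in how targets are encoded. A small sketch (array contents are illustrative, not from the quiz):

```python
import numpy as np

# Multi-class: each instance belongs to exactly one of 3 classes,
# so each one-hot row has exactly one 1.
multiclass_targets = np.array([
    [1, 0, 0],
    [0, 0, 1],
])
assert (multiclass_targets.sum(axis=1) == 1).all()

# Multi-label: each instance carries a set of non-exclusive labels,
# so a row may have any number of 1s (including none).
multilabel_targets = np.array([
    [1, 1, 0],   # instance has labels 0 and 1
    [0, 0, 0],   # an empty label set is allowed
])
print(multilabel_targets.sum(axis=1))  # rows need not sum to 1
```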

3.

Multiple Choice

2 mins

1 pt

The shape of the weight matrix W of a neural network linear layer is (x, y). A forward pass through this layer can be represented as follows:

Z=XW

where X is the batched data of dimensions (batch size, input features) and W is the weight matrix. Select the correct assignments for (x, y).

x = batch size, y = number of input features to that layer

x = number of input features to that layer, y = batch size

x = number of input features to that layer, y = number of neurons in that layer

x = batch size, y = number of neurons in that layer

None of the above

Answer explanation

Media Image

Every neuron in a fully connected layer is connected to every input via a weight. This creates, for every neuron in the current layer, a weight vector whose dimension equals the input dimension; stacking these vectors gives a matrix as deep as the number of neurons in the current layer. For Z = XW, the neighbouring (inner) dimensions need to match.
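The shape bookkeeping can be checked in a few lines of numpy. The sizes below are made up for illustration:

```python
import numpy as np

batch_size, in_features, n_neurons = 4, 3, 5

X = np.random.randn(batch_size, in_features)   # (batch size, input features)
W = np.random.randn(in_features, n_neurons)    # (x, y) = (input features, neurons)

Z = X @ W                                      # inner dimensions (3 and 3) match
print(Z.shape)                                 # (4, 5)
```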

4.

Multiple Choice

30 sec

1 pt

If a neural network has a single output neuron, then the model may be used for:

Binary classification

Regression

Binary classification or regression

None of these
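A single output neuron can serve either task depending on how its output is interpreted; a hedged sketch (all names and numbers here are illustrative, not from the quiz):

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.standard_normal(3)    # weights for 3 input features
b = 0.1

x = np.array([0.5, -1.0, 2.0])
z = x @ w + b                 # single scalar output

regression_prediction = z                         # use the raw value directly
binary_class_probability = 1 / (1 + np.exp(-z))   # squash into (0, 1) with a sigmoid

print(regression_prediction, binary_class_probability)
```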

5.

Multiple Choice

5 mins

1 pt

Media Image

Here we have a computational graph representing a series of operations. The green text (above the lines) shows the forward pass, i.e. the values at each stage of the graph during forward propagation. The red values (underneath the lines) are gradient signals passed back through the computational graph after some loss has been calculated. Calculate the missing gradients a, b and c using backpropagation (slides 20-31 onwards).

a= -0.2, b= 0.2, c=0.4

a= 0.2, b= -0.2, c=0.4

a= 0.4, b= 0.4, c=0.2

a= -0.2, b= -0.2, c=0.2

Answer explanation

Media Image

The key with this computational graph is to apply the chain rule at each node. Please see the attached solution for more information.
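The quiz's graph is in the attached image, so the specific values of a, b and c cannot be reproduced here; as a generic illustration of applying the chain rule at a single node, consider a multiply node (numbers below are made up):

```python
# Forward: z = x * y, with some loss L computed downstream of z.
x, y = 3.0, -2.0
z = x * y

# Backward: suppose the upstream gradient dL/dz arriving at this node is 0.5.
# The chain rule multiplies it by each local gradient.
dL_dz = 0.5
dL_dx = dL_dz * y   # local gradient dz/dx = y
dL_dy = dL_dz * x   # local gradient dz/dy = x
print(dL_dx, dL_dy)  # -1.0 1.5
```

Each node in a larger graph is handled the same way: multiply the incoming gradient by the node's local derivative and pass the result back along each input edge.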
