6 questions
A neuron with 3 inputs has weight vector w = [0.2, -0.1, 0.1]^T, bias b = 0, and a ReLU activation function. If the input vector is x = [0.2, 0.4, 0.2]^T, what is the output value of the neuron?
0.2
0.1
0.02
-0.1
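One way to check an answer to this kind of question is to compute the pre-activation w·x + b and then apply ReLU; a minimal sketch using the weights and input from the question:

```python
import numpy as np

# Weights, bias, and input taken from the question above.
w = np.array([0.2, -0.1, 0.1])
b = 0.0
x = np.array([0.2, 0.4, 0.2])

# Pre-activation: z = w . x + b
z = np.dot(w, x) + b

# ReLU activation: max(0, z)
output = max(0.0, z)
print(output)
```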
A neural network consisting of only linear activations is underfitting a dataset. A student has time to change only one feature of the network. Which of the following is the best option for increasing the model's accuracy?
Add more layers to the network
Train for longer
Introduce non-linear activations
Acquire more data
None of these
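The reasoning behind this question: without non-linear activations, any stack of linear layers collapses into a single linear map, so adding layers cannot increase expressiveness. A small demonstration (the weights here are arbitrary illustrative values, not from any question):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two purely linear "layers" with illustrative random weights.
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))

x = rng.normal(size=3)

# A forward pass through both linear layers...
deep_out = W2 @ (W1 @ x)

# ...is identical to one linear layer with combined weights W2 @ W1.
collapsed = (W2 @ W1) @ x

print(np.allclose(deep_out, collapsed))
```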
Which is the best output configuration for a model tasked with predicting a patient's age given their brain MRI image?
One neuron with sigmoid
Multiple neurons with softmax
One neuron with linear output
Multiple neurons with Tanh
Which is the best output configuration for a model tasked with classifying a person's mood (e.g. angry, happy, sad) from the tone of their voice?
One neuron with sigmoid
Multiple neurons with softmax
One neuron with linear output
None of these
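For a multi-class problem like mood classification, one output neuron per class followed by a softmax turns raw scores into a probability distribution. A minimal sketch (the three classes and logit values are illustrative assumptions):

```python
import numpy as np

# Hypothetical logits from three output neurons: angry, happy, sad.
logits = np.array([2.0, 1.0, 0.1])

# Numerically stable softmax: subtract the max before exponentiating.
exp = np.exp(logits - logits.max())
probs = exp / exp.sum()

print(probs)        # probabilities over the three classes
print(probs.sum())  # the probabilities sum to 1
```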
The derivatives of several common activation functions are plotted. Match each label to the correct activation function.
P: purple dotted line
B: blue solid line
G: green dashed line
R: red solid line
P: step, B: Tanh, G: sigmoid, R: ReLU
P: ReLU, B: Tanh, G: sigmoid, R: step
P: step, B: sigmoid, G: Tanh, R: ReLU
P: Tanh, B: step, G: sigmoid, R: ReLU
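The derivatives in question have standard closed forms, which make the plots easy to reproduce; a sketch (the shapes, not the plot colours, are what this shows):

```python
import numpy as np

def d_sigmoid(x):
    # sigmoid'(x) = s(x) * (1 - s(x)); peaks at 0.25 when x = 0
    s = 1.0 / (1.0 + np.exp(-x))
    return s * (1.0 - s)

def d_tanh(x):
    # tanh'(x) = 1 - tanh(x)^2; peaks at 1.0 when x = 0
    return 1.0 - np.tanh(x) ** 2

def d_relu(x):
    # ReLU'(x) = 0 for x < 0, 1 for x > 0
    return (x > 0).astype(float)

def d_step(x):
    # The step function has zero derivative everywhere (undefined at 0).
    return np.zeros_like(x)

x = np.array([0.0])
print(d_sigmoid(x), d_tanh(x), d_relu(np.array([1.0])))
```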
The following diagram represents a feedforward neural network with its corresponding weights; each layer has ReLU activations. The weight connecting node i to node j is w_ij. Calculate the output of one forward pass through the network with the input shown in the diagram.
[0,1]
[0,4]
[-6,6]
[6,6]
None of the above
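Since the diagram is not reproduced here, a generic ReLU forward pass can still be sketched; the weight matrices and input below are placeholders, to be replaced with the values from the diagram:

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def forward(x, weights):
    """Apply each weight matrix followed by ReLU, layer by layer."""
    a = x
    for W in weights:
        a = relu(W @ a)
    return a

# Placeholder weights and input -- substitute the diagram's values.
W1 = np.array([[1.0, -1.0],
               [2.0,  0.0]])
W2 = np.array([[1.0, 1.0],
               [0.0, 1.0]])
x = np.array([1.0, 2.0])

print(forward(x, [W1, W2]))
```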