Mathematics, Computers, Fun

University


Intro to ML: Neural Networks Lecture 2 Part 2

15 plays · 6 questions

  • 1. Multiple Choice
    2 minutes
    1 pt

    Is the following statement True or False? A neural network's weights can be randomly initialised because gradient descent will always eventually find the optimal set of parameters; the result is therefore invariant to the initial parameters.

    True

    False
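
    The claim in question 1 can be probed with a toy experiment. Below is a minimal sketch (the loss function and step sizes are made up for illustration): plain gradient descent on a non-convex loss started from two different points settles into two different local minima, so the final parameters are not invariant to the initialisation.

    ```python
    import math

    # Hypothetical non-convex toy loss: f(w) = sin(3w) + 0.1*w**2,
    # chosen only because it has several local minima.
    def grad(w):
        return 3 * math.cos(3 * w) + 0.2 * w

    def descend(w, lr=0.01, steps=2000):
        # plain gradient descent from starting point w
        for _ in range(steps):
            w -= lr * grad(w)
        return w

    # Two different initialisations converge to different local minima.
    w_a = descend(-2.0)
    w_b = descend(2.0)
    print(w_a, w_b)  # two distinct stationary points
    ```

    Both runs reach points where the gradient is essentially zero, yet the points differ, which is why initialisation schemes for neural networks matter.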

  • 2. Multiple Choice
    1 minute
    1 pt

    L1 regularisation favours few non-zero weights, whereas L2 regularisation favours small values around zero.

    True

    False
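
    The contrast in question 2 can be seen directly from the per-weight shrinkage each penalty induces. A minimal sketch (function names are illustrative): the L1 proximal step is soft-thresholding, which sets small weights exactly to zero and produces sparsity, while the L2 step merely rescales every weight toward zero, leaving them all non-zero.

    ```python
    lam = 0.5  # regularisation strength (arbitrary for illustration)

    def l1_step(w):
        # soft-thresholding: proximal operator of lam*|w|
        m = max(abs(w) - lam, 0.0)
        return m if w >= 0 else -m

    def l2_step(w):
        # proximal operator of (lam/2)*w**2: pure rescaling
        return w / (1 + lam)

    weights = [0.3, -0.2, 2.0]
    print([l1_step(w) for w in weights])  # small weights zeroed, large one shrunk
    print([l2_step(w) for w in weights])  # all shrunk, but none exactly zero
    ```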

  • 3. Multiple Choice
    1 minute
    1 pt

    Which of the following statements about dropout are correct?

    Dropout prevents complex co-adaptations in which a feature detector is only helpful in the context of several other specific feature detectors.

    Dropout is more similar to L2 regularisation than L1 regularisation during training.

    Dropout is active during training and testing.

    Dropout can be viewed as a form of ensemble learning.

    The amount of dropout, p, can be optimised through standard stochastic gradient descent (SGD) methods.
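
    Several of the options in question 3 hinge on when dropout is active. Below is a minimal sketch of inverted dropout (one standard formulation; the function and variable names are illustrative): during training each unit is kept with probability 1 − p and scaled by 1/(1 − p), while at test time the layer is the identity, so the single test network approximates averaging the ensemble of thinned sub-networks.

    ```python
    import random

    def dropout(activations, p, training):
        if not training:
            # dropout is inactive at test time: identity map
            return list(activations)
        keep = 1.0 - p
        # keep each unit with probability `keep`, scaling survivors
        # by 1/keep so the expected activation is unchanged
        return [a / keep if random.random() < keep else 0.0
                for a in activations]

    random.seed(0)
    h = [0.5, 1.0, -0.3, 2.0]
    print(dropout(h, p=0.5, training=True))   # some units zeroed, rest scaled x2
    print(dropout(h, p=0.5, training=False))  # unchanged
    ```

    Note that p here is a hyperparameter: the zero/keep mask is not differentiable in p, which is why it is tuned by validation rather than by SGD.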
