Data Fundamentals (H) - Week 07 Quiz
1. Simulated annealing uses what metaheuristic to help avoid getting trapped in local minima?
Crossover rules.
A population of solutions.
A temperature schedule.
Randomised restart.
Hill climbing.
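For reference, the temperature-schedule idea behind this question can be sketched as a minimal simulated-annealing loop. The objective, neighbourhood step, and geometric cooling rate below are illustrative assumptions, not part of the quiz:

```python
import math
import random

random.seed(0)

def anneal(f, x0, steps=5000, t0=1.0, cooling=0.999):
    """Minimise f by random local search with a temperature schedule."""
    x, fx = x0, f(x0)
    t = t0
    for _ in range(steps):
        x_new = x + random.gauss(0, 0.5)      # propose a nearby solution
        f_new = f(x_new)
        # Always accept improvements; accept worsening moves with
        # probability exp(-delta / t), so a high temperature lets the
        # search climb out of local minima early on.
        if f_new < fx or random.random() < math.exp(-(f_new - fx) / t):
            x, fx = x_new, f_new
        t *= cooling                           # geometric cooling schedule
    return x, fx

best_x, best_f = anneal(lambda x: (x - 3.0) ** 2, x0=-10.0)
```

As the temperature decays, the acceptance rule tends towards pure greedy search, so the schedule trades early exploration for late refinement.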
2. A hyperparameter of an optimisation algorithm is:
The determinant of the Hessian.
A measure of how good a solution is.
A value that is used to impose constraints on the solution.
A value that affects how a solution is searched for.
A direction in hyperspace.
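As a concrete illustration (not part of the quiz), the step size in a gradient-descent loop is a typical hyperparameter: it affects how the solution is searched for, without being part of the solution itself. The function and values here are assumptions chosen for the sketch:

```python
def gradient_descent(grad, x0, step_size=0.1, steps=100):
    """step_size is a hyperparameter: it shapes the search for the
    solution but does not appear in the solution itself."""
    x = x0
    for _ in range(steps):
        x = x - step_size * grad(x)   # move against the gradient
    return x

# Minimise L(x) = x^2, whose gradient is 2x, from the same start point.
x_small = gradient_descent(lambda x: 2 * x, x0=5.0, step_size=0.01)
x_large = gradient_descent(lambda x: 2 * x, x0=5.0, step_size=0.9)
# Same problem, same algorithm; the hyperparameter alone changes
# how quickly the search approaches the minimum at 0.
```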
3. First-order optimisation requires that objective functions be:
one-dimensional
monotonic
invertible
disconcerting
\(C^1\) continuous
4. The gradient vector \(\nabla L(\theta)\) is a vector which, at any given point \(\theta\), will:
have \(L_2\) norm 1
point in the direction of steepest descent
be equal to \(\theta\)
be zero
point towards the global minimum of \(L(\theta)\)
5. Finite differences is not an effective approach for applying first-order optimisation because of:
the curse of dimensionality
none of the above
the effect of measurement noise
all of the above
numerical roundoff issues.
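A small sketch of why finite differences is fragile (illustrative; the noisy objective and step sizes are assumptions): the forward-difference estimate divides a tiny, possibly noise-corrupted difference by a tiny \(h\), so measurement noise and floating-point roundoff are amplified, and each dimension needs its own extra function evaluation:

```python
import random

random.seed(0)

def forward_difference(f, x, h):
    """Estimate f'(x) as (f(x + h) - f(x)) / h."""
    return (f(x + h) - f(x)) / h

def noisy_square(x, noise=1e-6):
    # f(x) = x^2 plus a small simulated measurement error
    return x * x + random.gauss(0, noise)

# The true derivative of x^2 at x=1 is 2. With a very small h, the
# noise term is divided by h and can swamp the estimate entirely.
moderate_h = forward_difference(noisy_square, 1.0, h=1e-3)
tiny_h = forward_difference(noisy_square, 1.0, h=1e-9)
```

With `h=1e-3` the noise contributes an error on the order of the noise divided by `h` (here roughly \(10^{-3}\)); with `h=1e-9` that same division inflates the error by a factor of a million.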
6. Ant colony optimisation applies which two metaheuristics to improve random local search?
gradient descent and crossover
thants
temperature and memory
random restart and hyperdynamics
memory and population
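The two metaheuristics in this question can be sketched together in a toy pheromone model choosing between two paths. The evaporation rate and deposit rule are illustrative assumptions, not the full ant colony optimisation algorithm:

```python
import random

random.seed(1)

# Memory: shared pheromone levels persist across iterations.
pheromone = {"short": 1.0, "long": 1.0}
lengths = {"short": 1.0, "long": 2.0}
evaporation = 0.1

for _ in range(50):
    # Population: many ants make independent choices each iteration.
    for _ in range(20):
        total = pheromone["short"] + pheromone["long"]
        path = "short" if random.random() < pheromone["short"] / total else "long"
        # Deposit pheromone inversely proportional to path length,
        # reinforcing good solutions for later ants (memory).
        pheromone[path] += 1.0 / lengths[path]
    for p in pheromone:
        pheromone[p] *= 1 - evaporation   # old memory gradually decays

# Over time, pheromone concentrates on the shorter path.
```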