Data Fundamentals (H) - Week 07 Quiz
1. Simulated annealing uses what metaheuristic to help avoid getting trapped in local minima?
Hill climbing.
Randomised restart.
A temperature schedule.
Crossover rules.
A population of solutions.
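To illustrate the temperature-schedule idea behind this question, here is a minimal simulated-annealing sketch (the function names, step size, and cooling rate are illustrative assumptions, not part of the quiz):

```python
import math
import random

def simulated_annealing(loss, x0, step=0.5, t0=1.0, cooling=0.99, iters=2000):
    """Minimise `loss` using a geometric temperature schedule (illustrative sketch)."""
    x, best = x0, x0
    t = t0
    for _ in range(iters):
        candidate = x + random.gauss(0.0, step)
        delta = loss(candidate) - loss(x)
        # Always accept improvements; accept worse moves with probability
        # exp(-delta / t). Early on (high t) this lets the search jump out of
        # local minima; as t cools, the search settles into a basin.
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = candidate
        if loss(x) < loss(best):
            best = x
        t *= cooling  # the temperature schedule: gradually reduce t
    return best

# Example: a 1-D objective with many local minima (hypothetical choice).
result = simulated_annealing(lambda x: x**2 + 3 * math.sin(5 * x), x0=4.0)
```

The key contrast with plain hill climbing is the acceptance rule: worse moves are sometimes accepted while the temperature is high.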
2. A hyperparameter of an optimisation algorithm is:
A value that affects how a solution is searched for.
The determinant of the Hessian.
A value that is used to impose constraints on the solution.
A measure of how good a solution is.
A direction in hyperspace.
3. First-order optimisation requires that objective functions be:
monotonic
\(C^1\) continuous
disconcerting
one-dimensional
invertible
4. The gradient vector \(\nabla L(\theta)\) is a vector which, at any given point \(\theta\), will:
have \(L_2\) norm 1
be equal to \(\theta\)
point towards the global minimum of \(L(\theta)\)
be zero
point in the direction of steepest ascent
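Since the gradient points uphill, gradient descent steps along the *negative* gradient. A minimal sketch (the learning rate and example objective are illustrative assumptions):

```python
import numpy as np

def gradient_descent(grad, theta0, lr=0.1, iters=100):
    """Step along -grad(theta), the direction of steepest descent."""
    theta = np.asarray(theta0, dtype=float)
    for _ in range(iters):
        theta = theta - lr * grad(theta)
    return theta

# L(theta) = ||theta||^2 has gradient 2*theta and its minimum at the origin.
theta_min = gradient_descent(lambda th: 2 * th, theta0=[3.0, -4.0])
```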
5. Finite differences is not an effective way to apply first-order optimisation because of:
measurement noise
numerical roundoff issues
none of the above
the curse of dimensionality
all of the above
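For context on this question, a central-difference gradient estimate can be sketched as follows (the objective and step size `h` are illustrative assumptions). Note it needs two evaluations of \(L\) per parameter, which scales badly in high dimensions, and the choice of `h` trades truncation error against roundoff and noise:

```python
import numpy as np

def finite_difference_grad(f, theta, h=1e-6):
    """Central-difference estimate of the gradient of f at theta.

    Costs 2 * len(theta) evaluations of f; small h amplifies roundoff
    and measurement noise, while large h introduces truncation error.
    """
    theta = np.asarray(theta, dtype=float)
    g = np.zeros_like(theta)
    for i in range(len(theta)):
        e = np.zeros_like(theta)
        e[i] = h
        g[i] = (f(theta + e) - f(theta - e)) / (2 * h)
    return g

# Example: L(theta) = sum(theta**2) has true gradient 2*theta.
g = finite_difference_grad(lambda th: np.sum(th**2), [1.0, -2.0, 3.0])
```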
6. Ant colony optimisation applies which two metaheuristics to improve random local search?
temperature and memory
thants
gradient descent and crossover
memory and population
random restart and hyperdynamics
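A tiny sketch of how ant colony optimisation combines its two metaheuristics, here on a toy travelling-salesman instance (all parameter values and the distance matrix are illustrative assumptions): a *population* of ants builds tours in parallel, and the shared pheromone table acts as *memory* that biases later ants toward edges that appeared in short tours.

```python
import random

def ant_colony_tsp(dist, n_ants=20, n_iters=50, evaporation=0.5, q=1.0):
    """Minimal ACO sketch: population of ants + pheromone memory."""
    n = len(dist)
    pher = [[1.0] * n for _ in range(n)]  # shared memory across the population
    best_tour, best_len = None, float("inf")
    for _ in range(n_iters):
        tours = []
        for _ in range(n_ants):
            tour, unvisited = [0], set(range(1, n))
            while unvisited:
                cur = tour[-1]
                cities = list(unvisited)
                # Choice probability ∝ pheromone / distance.
                weights = [pher[cur][c] / dist[cur][c] for c in cities]
                nxt = random.choices(cities, weights=weights)[0]
                tour.append(nxt)
                unvisited.remove(nxt)
            length = sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        # Evaporate old pheromone, then deposit along each tour,
        # weighting short tours more heavily.
        for i in range(n):
            for j in range(n):
                pher[i][j] *= evaporation
        for tour, length in tours:
            for i in range(n):
                a, b = tour[i], tour[(i + 1) % n]
                pher[a][b] += q / length
                pher[b][a] += q / length
    return best_tour, best_len

# 4-city symmetric example (hypothetical distances).
dist = [[0, 1, 4, 2],
        [1, 0, 2, 5],
        [4, 2, 0, 1],
        [2, 5, 1, 0]]
tour, length = ant_colony_tsp(dist)
```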